Reading and writing a file using AutoResetEvent in C#

I have written a simple program to demonstrate thread synchronization. But when I run this program I get the error "The process cannot access the file 'D:\Vivek.txt' because it is being used by another process." Why am I getting this error?
class Program
{
    const string Filepath = "D:\\Vivek.txt";
    static AutoResetEvent writerwaithandle = new AutoResetEvent(true); // Signaled state
    static AutoResetEvent readerwaithandle = new AutoResetEvent(false);

    static void Main()
    {
        if (File.Exists(Filepath))
        {
            File.Delete(Filepath);
        }
        File.CreateText(Filepath);
        CreateWriterThread();
        CreateReaderThread();
        Console.ReadKey();
    }

    private static void CreateWriterThread()
    {
        for (int i = 1; i <= 10; i++)
        {
            var thread = new Thread(WriteFile);
            thread.Name = "Writer " + i;
            thread.Start();
            Thread.Sleep(250);
        }
    }

    private static void CreateReaderThread()
    {
        for (int i = 1; i <= 10; i++)
        {
            var thread = new Thread(ReadFile);
            thread.Name = "Reader " + i;
            thread.Start();
        }
    }

    private static void WriteFile()
    {
        writerwaithandle.WaitOne();
        var stream = new FileStream(Filepath, FileMode.Append);
        var streamwriter = new StreamWriter(stream);
        streamwriter.WriteLine("written by " + Thread.CurrentThread.Name + " " + DateTime.Now);
        streamwriter.Flush();
        streamwriter.Close();
        readerwaithandle.Set();
    }

    private static void ReadFile()
    {
        readerwaithandle.WaitOne();
        if (File.Exists(Filepath))
        {
            var stream = new FileStream(Filepath, FileMode.Open);
            var streamreader = new StreamReader(stream);
            var text = streamreader.ReadToEnd();
            streamreader.Close();
            Console.WriteLine("Read by thread {0} \n", Thread.CurrentThread.Name);
            Console.WriteLine(text);
        }
        writerwaithandle.Set();
    }
}
When I replace this code:
if (File.Exists(Filepath))
{
    File.Delete(Filepath);
}
File.CreateText(Filepath);
with this:
if (!File.Exists(Filepath))
{
    File.CreateText(Filepath);
}
the program shows the same error the first time; after that it never gives any error.
Please, can anyone tell me where the bug is, the reason for it, and what the best solution would be?

When you use a FileStream, always wrap it in a using statement, like this:
using (var fileStream = new FileStream(Filepath, FileMode.Open))
{
    // Your code...
}
This ensures that the stream gets disposed properly.
You can also wrap the StreamReader in a using block:
using (var fileStream = new FileStream(Filepath, FileMode.Open))
{
    using (var reader = new StreamReader(fileStream))
    {
        // Your code...
    }
}

Be vigilant and look at the documentation of File.CreateText:
Creates or opens a file for writing UTF-8 encoded text.
Return value type: System.IO.StreamWriter. A StreamWriter that writes to the specified file using UTF-8 encoding.
This means that File.CreateText already creates the FileStream and returns a StreamWriter around it, and that writer keeps the file open until it is disposed. The File.CreateText(Filepath) call in Main never disposes the writer it returns, so the file stays locked and the FileStream you create later in WriteFile cannot open it. You should use the stream that File.CreateText created (and dispose it) instead of opening a second one.
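In other words, the StreamWriter returned by File.CreateText keeps the file open until it is disposed. A minimal sketch of handling it correctly (reusing the Filepath constant from the question):
using (StreamWriter writer = File.CreateText(Filepath))
{
    writer.WriteLine("file created at " + DateTime.Now);
} // disposing the writer closes the underlying FileStream and releases the lock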
Here is the fixed version of your code:
class Program
{
    const string Filepath = "D:\\Vivek.txt";
    static AutoResetEvent writerwaithandle = new AutoResetEvent(true); // Signaled state
    static AutoResetEvent readerwaithandle = new AutoResetEvent(false);

    static void Main()
    {
        if (File.Exists(Filepath))
        {
            File.Delete(Filepath);
        }
        //File.CreateText(Filepath);
        CreateWriterThread();
        CreateReaderThread();
        Console.ReadKey();
    }

    private static void CreateWriterThread()
    {
        for (int i = 1; i <= 10; i++)
        {
            var thread = new Thread(WriteFile);
            thread.Name = "Writer " + i;
            thread.Start();
            Thread.Sleep(250);
        }
    }

    private static void CreateReaderThread()
    {
        for (int i = 1; i <= 10; i++)
        {
            var thread = new Thread(ReadFile);
            thread.Name = "Reader " + i;
            thread.Start();
        }
    }

    private static void WriteFile()
    {
        writerwaithandle.WaitOne();
        var streamwriter = File.CreateText(Filepath);
        //var stream = new FileStream(Filepath, FileMode.Append);
        //var streamwriter = new StreamWriter(stream);
        streamwriter.WriteLine("written by" + Thread.CurrentThread.Name + DateTime.Now);
        streamwriter.Flush();
        streamwriter.Close();
        readerwaithandle.Set();
    }

    private static void ReadFile()
    {
        readerwaithandle.WaitOne();
        if (File.Exists(Filepath))
        {
            var stream = new FileStream(Filepath, FileMode.Open);
            var streamreader = new StreamReader(stream);
            var text = streamreader.ReadToEnd();
            streamreader.Close();
            Console.WriteLine("Read by thread {0} \n", Thread.CurrentThread.Name);
            Console.WriteLine(text);
        }
        writerwaithandle.Set();
    }
}
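Combining this with the first answer, the two worker methods could also wrap their streams in using blocks so the file handles are released even if an exception is thrown. A sketch of that variant; File.AppendText is used here to keep the appending behaviour of the original question:
private static void WriteFile()
{
    writerwaithandle.WaitOne();
    // File.AppendText opens (or creates) the file and returns a StreamWriter for appending
    using (StreamWriter streamwriter = File.AppendText(Filepath))
    {
        streamwriter.WriteLine("written by " + Thread.CurrentThread.Name + " " + DateTime.Now);
    } // writer and its underlying stream are disposed here
    readerwaithandle.Set();
}

private static void ReadFile()
{
    readerwaithandle.WaitOne();
    if (File.Exists(Filepath))
    {
        using (var stream = new FileStream(Filepath, FileMode.Open))
        using (var streamreader = new StreamReader(stream))
        {
            Console.WriteLine("Read by thread {0} \n", Thread.CurrentThread.Name);
            Console.WriteLine(streamreader.ReadToEnd());
        }
    }
    writerwaithandle.Set();
}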

Related

RestSharp, Forge API - Getting error: overlapping ranges on file upload

I am trying to upload a file to a bucket using the Forge .NET SDK. It works most of the time but occasionally fails with {error: overlapping ranges}. Here is the code snippet.
private string uploadFileToBucket(Configuration configuration, string bucketKey, string filePath)
{
    ObjectsApi objectsApi = new ObjectsApi(configuration);
    string fileName = Path.GetFileName(filePath);
    string base64EncodedUrn, objectKey;
    using (FileStream fileStream = File.Open(filePath, FileMode.Open))
    {
        long contentLength = fileStream.Length;
        string content_range = "bytes 0-" + (contentLength - 1) + "/" + contentLength;
        dynamic result = objectsApi.UploadChunk(bucketKey, fileName, (int)fileStream.Length, content_range,
            "12313", fileStream);
        DynamicJsonResponse dynamicJsonResponse = (DynamicJsonResponse)result;
        JObject json = dynamicJsonResponse.ToJson();
        JToken urn = json.GetValue("objectId");
        string urnStr = urn.ToString();
        base64EncodedUrn = ApiClient.encodeToSafeBase64(urnStr);
        objectKey = fileName;
    }
    return base64EncodedUrn;
}
Before uploading, the file content has to be read into memory first; otherwise the FileStream object in your code snippet is empty.
However, I would advise you to use PUT buckets/:bucketKey/objects/:objectName instead if you only want to upload the whole file in a single chunk. Here is my test code. Hope it helps~
private static TwoLeggedApi oauth2TwoLegged;
private static dynamic twoLeggedCredentials;
private static Random random = new Random();

public static string RandomString(int length)
{
    const string chars = "ABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789";
    return new string(Enumerable.Repeat(chars, length)
        .Select(s => s[random.Next(s.Length)]).ToArray());
}

// Initialize the 2-legged OAuth 2.0 client, and optionally set specific scopes.
private static void initializeOAuth()
{
    // You must provide at least one valid scope
    Scope[] scopes = new Scope[] { Scope.DataRead, Scope.DataWrite, Scope.BucketCreate, Scope.BucketRead };
    oauth2TwoLegged = new TwoLeggedApi();
    twoLeggedCredentials = oauth2TwoLegged.Authenticate(FORGE_CLIENT_ID, FORGE_CLIENT_SECRET, oAuthConstants.CLIENT_CREDENTIALS, scopes);
    objectsApi.Configuration.AccessToken = twoLeggedCredentials.access_token;
}

private static void uploadFileToBucket(string bucketKey, string filePath)
{
    Console.WriteLine("*****Start uploading file to the OSS");
    string path = filePath;
    // File total size
    var info = new System.IO.FileInfo(path);
    long fileSize = info.Length;
    using (FileStream fileStream = File.Open(filePath, FileMode.Open))
    {
        string sessionId = RandomString(12);
        Console.WriteLine(string.Format("sessionId: {0}", sessionId));
        long contentLength = fileSize;
        string content_range = "bytes 0-" + (contentLength - 1) + "/" + contentLength;
        Console.WriteLine("Uploading range: " + content_range);
        byte[] buffer = new byte[contentLength];
        MemoryStream memoryStream = new MemoryStream(buffer);
        int nb = fileStream.Read(buffer, 0, (int)contentLength);
        memoryStream.Write(buffer, 0, nb);
        memoryStream.Position = 0;
        dynamic response = objectsApi.UploadChunk(bucketKey, info.Name, (int)contentLength, content_range,
            sessionId, memoryStream);
        Console.WriteLine(response);
    }
}

static void Main(string[] args)
{
    initializeOAuth();
    uploadFileToBucket(BUCKET_KEY, FILE_PATH);
}

I can't invoke an SQL command on multiple threads C#

So I have a method that does an Invoke on a DataGridView, and it works fine for the first thread that runs it. However, when a second thread uses the method, the download part of it still works, but the Invoke statement stops working for the first thread and starts to change both.
public void ByteTransferResume(int indexResume)
{
    HttpWebRequest req;
    HttpWebResponse resp;
    req = (HttpWebRequest)HttpWebRequest.Create(FileLocationName);
    req.AddRange((int)fileInfoDestination.Length);
    resp = (HttpWebResponse)(req.GetResponse());
    long fileLength = resp.ContentLength;
    FileLocationLength = fileLength;
    using (Stream responseStream = resp.GetResponseStream())
    {
        int iBufferSize = 1024;
        iBufferSize *= 1000;
        using (FileStream saveFileStream = new FileStream(FileDestination, FileMode.Append, FileAccess.Write, FileShare.ReadWrite))
        {
            int iByteSize;
            byte[] downBuffer = new byte[iBufferSize];
            while ((iByteSize = responseStream.Read(downBuffer, 0, downBuffer.Length)) > 0)
            {
                saveFileStream.Write(downBuffer, 0, iByteSize);
                FileInfo fileInfoDestinations = new FileInfo(FileDestination);
                FileDestinationLength = (int)fileInfoDestinations.Length;
                double downloadProgress = ((double)FileDestinationLength / FileLocationLength) * 100;
                // MessageBox.Show(downloadProgress.ToString());
                dgvDownloadInfo.Invoke(new dgvCommandDelegate(DGVCommand), new object[] { $"UPDATE Download_Info SET [Progress] = '{downloadProgress:F2}%' WHERE [File Name] = '{thread1[indexResume].Name}'" });
                // MessageBox.Show(thread1[indexResume].Name);
                // MessageBox.Show(indexResume.ToString());
                // dgvDownloadInfo.Invoke(new dgvConnectionDelegate(DGVConnection));
                Thread.Sleep(10);
            }
        }
    }
}
Maybe this will help you:
public object _lock = new object();

public void ByteTransferResume(int indexResume)
{
    lock (_lock)
    {
        HttpWebRequest req;
        // rest of your method
    }
}
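Note that this only serializes the downloads if every thread sees the same lock object. If each download runs on a separate instance of the class, a static field would be one way to guarantee that (a hypothetical variant):
// One lock shared by all instances, so concurrent downloads take turns
private static readonly object _lock = new object();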

What is the fastest method to merge a number of files into a file in c#?

I work with big files (at least 500 MB each) that I split and merge in C#.
I have to split a file into thousands of files, sort those files into groups, and merge each group.
The minimum number of files is 10,000.
I implemented the merge function using Stream.CopyTo(). Here is the main part of it.
using (Stream writer = File.OpenWrite(outputFilePath))
{
    int fileNum = filePaths.Count();
    for (int i = 0; i < fileNum; i++)
    {
        using (Stream reader = File.OpenRead(filePaths.ElementAt(i)))
        {
            reader.CopyTo(writer);
        }
    }
}
I've tested my program by splitting a 500 MB file into 17,000 files in 2 groups and merging each group of 8,500 files into one file.
The merging part takes about 80 seconds. I think that is pretty slow compared to splitting the same file, which takes about 15-20 seconds.
Is there any method that is faster than my code?
Your code looks fine, but ElementAt is a code smell. Convert the sequence to an array and use [i] instead, as in the sketch below. With 10K elements, I'm positive you're wasting a lot of time.
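A minimal sketch of that change, keeping the merge loop from the question otherwise unchanged:
// Materialize the paths once instead of calling ElementAt(i) on every iteration
string[] paths = filePaths.ToArray();
using (Stream writer = File.OpenWrite(outputFilePath))
{
    for (int i = 0; i < paths.Length; i++)
    {
        using (Stream reader = File.OpenRead(paths[i]))
        {
            reader.CopyTo(writer);
        }
    }
}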
Why not just use the Stream.CopyTo() method?
private static void CombineMultipleFilesIntoSingleFile(string inputDirectoryPath, string inputFileNamePattern, string outputFilePath)
{
    string[] inputFilePaths = Directory.GetFiles(inputDirectoryPath, inputFileNamePattern);
    Console.WriteLine("Number of files: {0}.", inputFilePaths.Length);
    using (var outputStream = File.Create(outputFilePath))
    {
        foreach (var inputFilePath in inputFilePaths)
        {
            using (var inputStream = File.OpenRead(inputFilePath))
            {
                // Buffer size can be passed as the second argument.
                inputStream.CopyTo(outputStream);
            }
            Console.WriteLine("The file {0} has been processed.", inputFilePath);
        }
    }
}
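As the comment says, CopyTo has an overload that takes a buffer size, so if a larger copy buffer helps in your measurements it could be used like this (80 KB is just an illustrative value):
// Copy with an explicit 80 KB buffer instead of the default
inputStream.CopyTo(outputStream, 81920);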
OR
Do it in chunks:
const int chunkSize = 2 * 1024; // 2KB
var inputFiles = new[] { "file1.dat", "file2.dat" }; // placeholder file names
using (var output = File.Create("output.dat"))
{
    foreach (var file in inputFiles)
    {
        using (var input = File.OpenRead(file))
        {
            var buffer = new byte[chunkSize];
            int bytesRead;
            while ((bytesRead = input.Read(buffer, 0, buffer.Length)) > 0)
            {
                output.Write(buffer, 0, bytesRead);
            }
        }
    }
}
Maybe try compressing the files?
using System;
using System.Collections.Generic;
using System.Text;
using System.IO;
using System.IO.Compression;

class Program {
    static void SaveCompressedFile(string filename, string data) {
        FileStream fileStream = new FileStream(filename, FileMode.Create, FileAccess.Write);
        GZipStream compressionStream = new GZipStream(fileStream, CompressionMode.Compress);
        StreamWriter writer = new StreamWriter(compressionStream);
        writer.Write(data);
        writer.Close();
    }

    static string LoadCompressedFile(string filename) {
        FileStream fileStream = new FileStream(filename, FileMode.Open, FileAccess.Read);
        GZipStream compressionStream = new GZipStream(fileStream, CompressionMode.Decompress);
        StreamReader reader = new StreamReader(compressionStream);
        string data = reader.ReadToEnd();
        reader.Close();
        return data;
    }

    static void Main(string[] args) {
        try {
            string filename = "compressedFile.txt";
            string sourceString = "Source String";
            SaveCompressedFile(filename, sourceString);
            FileInfo compressedFileData = new FileInfo(filename);
            string recoveredString = LoadCompressedFile(filename);
        } catch (IOException ex) {
            Console.WriteLine(ex.ToString());
        }
    }
}
Source
Also check out the example of compressing a directory.
using System;
using System.Text;
using System.IO;
using System.IO.Compression;

namespace CmprDir
{
    class Program
    {
        delegate void ProgressDelegate(string sMessage);

        static void CompressFile(string sDir, string sRelativePath, GZipStream zipStream)
        {
            // Compress file name
            char[] chars = sRelativePath.ToCharArray();
            zipStream.Write(BitConverter.GetBytes(chars.Length), 0, sizeof(int));
            foreach (char c in chars)
                zipStream.Write(BitConverter.GetBytes(c), 0, sizeof(char));
            // Compress file content
            byte[] bytes = File.ReadAllBytes(Path.Combine(sDir, sRelativePath));
            zipStream.Write(BitConverter.GetBytes(bytes.Length), 0, sizeof(int));
            zipStream.Write(bytes, 0, bytes.Length);
        }

        static bool DecompressFile(string sDir, GZipStream zipStream, ProgressDelegate progress)
        {
            // Decompress file name
            byte[] bytes = new byte[sizeof(int)];
            int Readed = zipStream.Read(bytes, 0, sizeof(int));
            if (Readed < sizeof(int))
                return false;
            int iNameLen = BitConverter.ToInt32(bytes, 0);
            bytes = new byte[sizeof(char)];
            StringBuilder sb = new StringBuilder();
            for (int i = 0; i < iNameLen; i++)
            {
                zipStream.Read(bytes, 0, sizeof(char));
                char c = BitConverter.ToChar(bytes, 0);
                sb.Append(c);
            }
            string sFileName = sb.ToString();
            if (progress != null)
                progress(sFileName);
            // Decompress file content
            bytes = new byte[sizeof(int)];
            zipStream.Read(bytes, 0, sizeof(int));
            int iFileLen = BitConverter.ToInt32(bytes, 0);
            bytes = new byte[iFileLen];
            zipStream.Read(bytes, 0, bytes.Length);
            string sFilePath = Path.Combine(sDir, sFileName);
            string sFinalDir = Path.GetDirectoryName(sFilePath);
            if (!Directory.Exists(sFinalDir))
                Directory.CreateDirectory(sFinalDir);
            using (FileStream outFile = new FileStream(sFilePath, FileMode.Create, FileAccess.Write, FileShare.None))
                outFile.Write(bytes, 0, iFileLen);
            return true;
        }

        static void CompressDirectory(string sInDir, string sOutFile, ProgressDelegate progress)
        {
            string[] sFiles = Directory.GetFiles(sInDir, "*.*", SearchOption.AllDirectories);
            int iDirLen = sInDir[sInDir.Length - 1] == Path.DirectorySeparatorChar ? sInDir.Length : sInDir.Length + 1;
            using (FileStream outFile = new FileStream(sOutFile, FileMode.Create, FileAccess.Write, FileShare.None))
            using (GZipStream str = new GZipStream(outFile, CompressionMode.Compress))
                foreach (string sFilePath in sFiles)
                {
                    string sRelativePath = sFilePath.Substring(iDirLen);
                    if (progress != null)
                        progress(sRelativePath);
                    CompressFile(sInDir, sRelativePath, str);
                }
        }

        static void DecompressToDirectory(string sCompressedFile, string sDir, ProgressDelegate progress)
        {
            using (FileStream inFile = new FileStream(sCompressedFile, FileMode.Open, FileAccess.Read, FileShare.None))
            using (GZipStream zipStream = new GZipStream(inFile, CompressionMode.Decompress, true))
                while (DecompressFile(sDir, zipStream, progress)) ;
        }

        public static int Main(string[] argv)
        {
            if (argv.Length != 2)
            {
                Console.WriteLine("Usage: CmprDir.exe <in_dir compressed_file> | <compressed_file out_dir>");
                return 1;
            }
            string sDir;
            string sCompressedFile;
            bool bCompress = false;
            try
            {
                if (Directory.Exists(argv[0]))
                {
                    sDir = argv[0];
                    sCompressedFile = argv[1];
                    bCompress = true;
                }
                else if (File.Exists(argv[0]))
                {
                    sCompressedFile = argv[0];
                    sDir = argv[1];
                    bCompress = false;
                }
                else
                {
                    Console.Error.WriteLine("Wrong arguments");
                    return 1;
                }
                if (bCompress)
                    CompressDirectory(sDir, sCompressedFile, (fileName) => { Console.WriteLine("Compressing {0}...", fileName); });
                else
                    DecompressToDirectory(sCompressedFile, sDir, (fileName) => { Console.WriteLine("Decompressing {0}...", fileName); });
                return 0;
            }
            catch (Exception ex)
            {
                Console.Error.WriteLine(ex.Message);
                return 1;
            }
        }
    }
}
Source

Write from a stream to a string

I'm trying to use a StreamWriter to write into a file that is created temporarily (i.e. _logFileName) and, at the same time, read the data written to the file into a string using a StreamReader. The current code shows no errors, but at runtime it says that it cannot read from _logFileName as it is already in use.
How do I do this?
using (StreamWriter _logFile = File.CreateText(_logFileName))
{
    //string s = "";
    //using (StreamReader fill_log = new StreamReader(s))
    using (StreamReader fill_log = new StreamReader(_logFileName))
    {
        _logFile.WriteLine("Logfile name is: " + _logFileName);
        content += fill_log.ReadLine();
        _logFile.WriteLine("LOG FILE STARTED AT: " + _startDateTime.ToString());
        content += fill_log.ReadLine();
        _logFile.WriteLine("============================================");
        content += fill_log.ReadLine();
        _logFile.Write(_message);
        content += fill_log.ReadLine();
        _logFile.WriteLine();
        content += fill_log.ReadLine();
    }
    _logFile.Close();
}
So based on the suggestion I changed the code to this:
using (var fsWrite = new FileStream(_logFileName, FileMode.Create, FileAccess.Write, FileShare.ReadWrite))
using (var _logFile = new StreamWriter(fsWrite))
using (var fsRead = new FileStream(_logFileName, FileMode.Open, FileAccess.Read, FileShare.ReadWrite))
using (var fill_log = new StreamReader(fsRead))
{
    _logFile.WriteLine();
    content += fill_log.ReadLine();
    _logFile.WriteLine("TIME OF LOG ENTRY: " + DateTime.Now);
    content += fill_log.ReadLine();
    // Arbitrary objects can also be written to the file.
    _logFile.WriteLine(_message);
    content += fill_log.ReadLine();
    _logFile.Flush();
    _logFile.Close();
}
On doing so, I am able to read and write simultaneously; that gave no problem. Thanks. But the content string variable seems to end after every write. Any ideas why this would happen?
In order to be able to simultaneously read and write from the same file, you have to create the FileStream objects manually, using one of the constructors that take a FileShare parameter, for example:
using (var fsWrite = new FileStream(name, FileMode.Create, FileAccess.Write, FileShare.ReadWrite))
using (var _logFile = new StreamWriter(fsWrite))
using (var fsRead = new FileStream(name, FileMode.Open, FileAccess.Read, FileShare.ReadWrite))
using (var fill_log = new StreamReader(fsRead))
{
    ...
}
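One caveat with this approach, which is probably also the reason the follow-up snippet above sees the content string ending early: StreamWriter buffers its output, so the reader sees nothing new until the writer is flushed. A small self-contained sketch, using a hypothetical "log.txt" path:
using (var fsWrite = new FileStream("log.txt", FileMode.Create, FileAccess.Write, FileShare.ReadWrite))
using (var writer = new StreamWriter(fsWrite))
using (var fsRead = new FileStream("log.txt", FileMode.Open, FileAccess.Read, FileShare.ReadWrite))
using (var reader = new StreamReader(fsRead))
{
    writer.WriteLine("first entry");
    writer.Flush();                       // push the buffered text into the file
    Console.WriteLine(reader.ReadLine()); // prints "first entry"; without the Flush, ReadLine would return null
}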
Another way to achieve what you want is using a specialized TextWriter that writes to both the StreamWriter and a StringBuilder:
using (StreamWriter _logFile = File.CreateText(_logFileName))
{
    using (var builder = new StringBuildingStreamWriter(_logFile))
    {
        builder.WriteLine("Logfile name is: " + _logFileName);
        builder.WriteLine("LOG FILE STARTED AT: " + _startDateTime.ToString());
        builder.WriteLine("============================================");
        builder.Write(_message);
        builder.WriteLine();
        content += builder.ToString();
    }
    _logFile.Close();
}
public class StringBuildingStreamWriter : TextWriter
{
    StringBuilder sb = new StringBuilder();
    private StreamWriter sw;

    public StringBuildingStreamWriter(StreamWriter sw)
    {
        this.sw = sw;
    }

    public override void WriteLine(string value)
    {
        sb.AppendLine(value);
        sw.WriteLine(value);
    }

    public override void WriteLine()
    {
        sw.WriteLine();
        sb.AppendLine();
    }

    public override void Write(string value)
    {
        sb.Append(value);
        sw.Write(value);
    }

    public override string ToString()
    {
        return sb.ToString();
    }

    public override Encoding Encoding
    {
        get { return UTF8Encoding.UTF8; }
    }
}

Writing a file adding random characters to start of each line

I'm overwriting a file using C# on Windows Phone 7. When I do this, a seemingly random character is added to the start of each line.
Why is this happening?
Code:
public static bool overwriteFile(string filename, string[] inputArray)
{
    try
    {
        IsolatedStorageFile store = IsolatedStorageFile.GetUserStoreForApplication();
        FileStream stream = store.OpenFile(filename, FileMode.Create);
        BinaryWriter writer = new BinaryWriter(stream);
        foreach (string input in inputArray)
        {
            writer.Write(input + "\n");
        }
        writer.Close();
        return true;
    }
    catch (IOException ex)
    {
        return false;
    }
}
Loading code:
public static Idea[] getFile(string filename)
{
    try
    {
        IsolatedStorageFile store = IsolatedStorageFile.GetUserStoreForApplication();
        string fileContents = null;
        if (store.FileExists(filename)) // Check if file exists
        {
            IsolatedStorageFileStream save = new IsolatedStorageFileStream(filename, FileMode.Open, store);
            StreamReader streamReader = new StreamReader(save);
            fileContents = streamReader.ReadToEnd();
            save.Close();
        }
        string[] lines = null;
        if (fileContents != null)
        {
            lines = fileContents.Split('\n');
        }
        Idea[] ideaList = null;
        if (lines != null)
        {
            ideaList = new Idea[lines.Length];
            for (int i = 0; i < lines.Length; i++)
            {
                ideaList[i] = new Idea(lines[i].TrimEnd('\r'));
            }
        }
        return ideaList;
    }
    catch (IOException ex)
    {
        return null;
    }
}
The random character is a length prefix; see http://msdn.microsoft.com/en-us/library/yzxa6408.aspx.
You should be using some type of TextWriter to write strings to the file; NOT a BinaryWriter.
A StreamWriter would probably be best; then you could use the WriteLine method.
Instead of using '\n', try using Environment.NewLine
You are using a BinaryWriter to write, and a TextReader to read. Change your write code to use a StreamWriter (which is a TextWriter) instead of a BinaryWriter. This will also get you the WriteLine method that Naveed recommends.
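A minimal sketch of the question's write method using a StreamWriter instead, as suggested (same signature; the loading code needs no change, since it already splits on '\n' and trims '\r'):
public static bool overwriteFile(string filename, string[] inputArray)
{
    try
    {
        IsolatedStorageFile store = IsolatedStorageFile.GetUserStoreForApplication();
        using (IsolatedStorageFileStream stream = store.OpenFile(filename, FileMode.Create))
        using (StreamWriter writer = new StreamWriter(stream))
        {
            foreach (string input in inputArray)
            {
                writer.WriteLine(input); // no length prefix, just the text plus a newline
            }
        }
        return true;
    }
    catch (IOException)
    {
        return false;
    }
}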
try changing this
writer.Write(input + "\n");
to
writer.WriteLine(input);
