I have the following code to read a file:
StreamReader str = new StreamReader(File.Open(fileName, FileMode.Open, FileAccess.Read));
string fichier = str.ReadToEnd();
str.Close();
This is part of an ASP.NET web service and has been working fine in production for a year. Now, with increasing load on the server, the customer has started getting a "File already in use" error. The file is only read by this code and is never written to from the application.
One problem that I clearly see is that we are not caching the contents of the file for future use. We will do that. But I need to understand why and how we are getting this issue.
Is it because of multiple threads trying to read the file? I read that StreamReader is not thread-safe, but why should that be a problem when I am opening the file in read mode?
You need to open the file with read sharing allowed. The overload of File.Open you are calling defaults to FileShare.None, so a second concurrent request cannot open the file. Use the overload of File.Open that takes a FileShare argument and pass FileShare.Read to allow other readers.
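For example, a minimal sketch of the read with sharing enabled (same fileName as in your code; the extra FileShare.Read argument is the only real change, and the using blocks also take care of closing the handle):
string fichier;
using (FileStream stream = File.Open(fileName, FileMode.Open, FileAccess.Read, FileShare.Read))
using (StreamReader str = new StreamReader(stream))
{
    fichier = str.ReadToEnd();
}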
Another possible solution is to load this file into memory once, in the static constructor of a class, and store the contents in a static read-only field. Since a static constructor is guaranteed to run only once and is thread-safe, you don't have to do anything special to make it work.
If you never change the contents in memory, you won't even need to lock when you access the data. If you do change the contents, clone the data first each time you are about to change it; you still don't need a lock for the clone operation, since the original data never changes.
For example:
public static class FileData
{
    private static readonly string s_sFileData;

    static FileData()
    {
        s_sFileData = ...; // read file data here using your code
    }

    public static string Contents
    {
        get
        {
            return string.Copy(s_sFileData);
        }
    }
}
This encapsulates your data and gives you read-only access to it.
You only need String.Copy() if your code may modify the file contents; it is just a precaution to force creating a new string instance that protects the original one. Since string is immutable, this is only necessary if your code uses string pointers - I only added it because I ran into an issue with a similar cached variable in my own code just last week, where I used pointers to the cached data. :)
FileMode and FileAccess only control what your own handle does (open/create, read/write).
Shared access to files is handled at the operating-system level, and you request sharing behaviour with the FileShare parameter of the File.Open overload mentioned above - see the docs.
I need to write a big file in my project.
What I learned:
I should NOT write the big file directly to the destination path,
because this may leave an incomplete file if the app crashes while writing it.
Instead, I should write to a temporary file and then move (rename) it into place (a so-called atomic file operation).
My code snippet:
[NotNull]
public static async Task WriteAllTextAsync([NotNull] string path, [NotNull] string content)
{
    string temporaryFilePath = null;
    try
    {
        temporaryFilePath = Path.GetTempFileName();
        using (var stream = new StreamWriter(temporaryFilePath, true))
        {
            await stream.WriteAsync(content).ConfigureAwait(false);
        }
        File.Delete(path);
        File.Move(temporaryFilePath, path);
    }
    finally
    {
        if (temporaryFilePath != null) File.Delete(temporaryFilePath);
    }
}
My Question:
The file will be missing if the app crashes between File.Delete and File.Move. Can I avoid this?
Is there any other best practice for writing big files?
Is there any suggestion on my code?
The file will be missing if the app crashes between File.Delete and File.Move. Can I avoid this?
Not that I'm aware of, but you can detect it - and if you use a more predictable filename, you can recover from that. It helps if you tweak the process somewhat to use three file names: the target, a "new" file and an "old" file. The process becomes:
Write to "new" file (e.g. foo.txt.new)
Rename the target file to the "old" file (e.g. foo.txt.old)
Rename the "new" file to the target file
Delete the "old" file
You then have three files, each of which may be present or absent. That can help you to detect the situation when you come to read the new file:
No files: Nothing's written data yet
Just target: All is well
Target and new: App crashed while writing new file
Target and old: App failed to delete old file
New and old: App failed after the first rename, but before the second
All three, or just old, or just new: Something very odd is going on! User may have interfered
Note: I was unaware of File.Replace before, but I suspect it's effectively just a simpler and possibly more efficient way of doing what your code is already doing. (That's great - use it!) The recovery process would still be the same, though.
You can use File.Replace instead of deleting and moving the files. In the case of a hard fault (a power cut or something like that) you can always lose data; you have to accept that.
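For example, something like this could replace the Delete/Move pair in the question's snippet (a sketch only; File.Replace requires the destination file to already exist, and both files should be on the same volume, which Path.GetTempFileName does not guarantee):
// Atomically swaps the files and keeps the previous contents as a backup.
File.Replace(temporaryFilePath, path, path + ".bak");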
I have a Windows Forms Application that uses 2 forms, with both writing to separate files (file paths given by inclusion of strings in textboxes on the form).
For form1, I have a number of functions that write data to the file on various button clicks. This being the case, I used the StreamWriter consoleFile = new StreamWriter(File.OpenWrite(fileName)); method for the first write to the file and StreamWriter consoleFile = File.AppendText(fileName); for any subsequent ones. This has worked fine.
When it came to implementing the same feature for Form2, the main difference is that all the text is written at once (one function containing four sub-functions to try and keep the code tidy). I went about it like this...
public void writeChecklistToFile()
{
    //open new file for writing
    StreamWriter checklistFileStart = new StreamWriter(File.OpenWrite(getChecklistFile()));
    checklistFileStart.WriteLine("Pre-Anaesthetic Checklist\n");

    //sub-functions (one for each section of list)
    //append tool used in separate functions
    //StreamWriter checklistFile = File.AppendText(getChecklistFile());
    writeAnimalDetails();
    writeAnimalHistory();
    writeAnimalExamination();
    writeDrugsCheck();
}
Each of the sub-functions then contains the appendText variable shown above:
public void writeAnimalDetails()
{
    StreamWriter checklistFile = File.AppendText(getChecklistFile());
    //...
}
Whenever I click the button that calls the main function, it throws an exception on the first File.AppendText() method. It states that the destination file cannot be accessed because it is already being used by another process.
Presumably this has to be the OpenWrite() as it is not used anywhere before that, but I don't understand why this error would occur in my form2 when it doesn't in form1!
If anyone could help me get around this, or can point me in the direction of an easier way to do it, I'd really appreciate that.
Thanks
Mark
Read the error as "File cannot be accessed because [the file is still open for use by this] process".
The problem is that the file resource - from File.OpenWrite - is not being Disposed correctly and an unmanaged file handle, with an exclusive lock, is kept open. This in turn results in exceptions when trying to open the still-open file for writing. Use the using statement to help with lifetime management, as discussed here.
In this particular case I recommend supplying the StreamWriter - created once - as an argument to the functions that need to write to it and then Dispose the entire open file resource once at the end when complete. This ensures a more visible resource lifetime and avoids several open-close operations.
public void writeChecklistToFile()
{
    // Open file for writing once..
    using (var checklistWriter = new StreamWriter(File.OpenWrite(getChecklistFile())))
    {
        // .. write everything to it, using the same Stream
        checklistWriter.WriteLine("Pre-Anaesthetic Checklist\n");
        writeAnimalDetails(checklistWriter);
        writeAnimalHistory(checklistWriter);
        writeAnimalExamination(checklistWriter);
        writeDrugsCheck(checklistWriter);
    }
    // And the file is really closed here, thanks to using/Dispose
}
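Each helper then takes the writer as a parameter instead of opening the file itself; a sketch of one of them (the body shown is assumed, since the original isn't posted):
public void writeAnimalDetails(StreamWriter writer)
{
    // Write to the stream passed in; do not call File.AppendText() here.
    writer.WriteLine("Animal details:"); // example content, assumed
    // ...
}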
Also see
File cannot be accessed because it is being used by another program
I think the reason it works in your first form is that you only ever have one StreamWriter existing at a time. You click a button, a StreamWriter is created, the function ends, and the StreamWriter is automatically closed before the next button click calls a function.
With your second form, however, you're calling your sub functions with their StreamWriters within a main function that also has a StreamWriter. What that amounts to is you have more than one StreamWriter trying to open the file at the same time, and thus the error.
To fix this, you can add the following after your call to WriteLine in your writeChecklistToFile function:
checklistFileStart.Close();
This will close your first FileStream, and allow your subsequent ones to open up the file.
I am using the StreamWriter to create a file and to write some text to that file. In some cases I have no text to write via StreamWriter, but the file was already created when StreamWriter was initialized.
using (StreamWriter sw = new StreamWriter(@"C:\FileCreated.txt"))
{
}
Currently I am using the following code, when StreamWriter is closed, to check if the FileCreated.txt content is empty, if it is delete it. I am wondering if there is a more elegant approach than this (an option within StreamWriter perhaps)?
if (File.Exists(@"C:\FileCreated.txt"))
{
    if (new FileInfo(@"C:\FileCreated.txt").Length == 0)
    {
        File.Delete(@"C:\FileCreated.txt");
    }
}
By the way, I must open a stream to write before I can check if there is any text because of some other logic in the code.
If you want to take input from the user bit by bit, you can make your source a StringBuilder, and then just commit it to disk when you're done:
StringBuilder SB = new StringBuilder();
...
SB.AppendLine("text");
...
if (SB.Length > 0)
    File.WriteAllText(@"C:\FileCreated.txt", SB.ToString());
Delaying opening the file until the first output would solve this problem, but it might create a new one (if there's a permission error creating the file, you won't find out until later, maybe when the operator is no longer at the computer).
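If you do want to try that route, the idea is simply to create the writer on the first write instead of up front; a rough sketch (the linesToWrite source is assumed for illustration):
StreamWriter sw = null;
try
{
    foreach (string line in linesToWrite) // assumed source of output
    {
        if (sw == null)
            sw = new StreamWriter(@"C:\FileCreated.txt"); // created only when there is something to write
        sw.WriteLine(line);
    }
}
finally
{
    if (sw != null)
        sw.Dispose(); // if nothing was written, no file was created and nothing needs deleting
}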
Your current approach is decent. I don't see the need to test File.Exists, though, if you just closed a stream to it. Also consider the race condition:
You find that the file is zero-length
Another process writes to the file
You delete the file
Also consider that you might have permission to create a file, and not to delete it afterwards!
Doing this correctly requires using the raw Win32 API, as I described in a previous answer. Do note that a .NET stream could be used for the first file handle, as long as you specify the equivalent of FILE_SHARE_WRITE.
Revisit your assumptions, i.e. that you must open the stream before checking for content. Simply reorganize your logic.
I am trying to delete/open/edit some files in my C# .NET application. Sometimes I get an exception stating that the file/directory is being accessed by another process. Is there a way to check whether a file/directory is being accessed by a process, and to try to release the file from that process?
No. The only way to do this is to try to access the file, and handle the IOException.
Realistically this is the only safe way anyway. Suppose there was an IsFileInUse() method, and you called it, and it returned "nope, nobody's using that file," and you went ahead and accessed the file. The problem is that in the meantime some other process might have locked or deleted the file. So you'd need to put exception handling around your attempt to access the file anyway. The "test by acquiring" model is the only one that is 100% reliable.
If a file is in use by another process, .NET doesn't provide a way of determining which other process that might be. I believe this would require some pretty low-level unmanaged code though I could be wrong. It is a very low-level operation, if it is possible at all, to "release the file from that process" because that would violate the other process' expectations -- e.g. it thinks it is allowed to write to the file but you have deleted the file and garbaged the handle. I believe you would need to terminate the other process if it's not willing to give up its lock voluntarily.
First, I suppose there are 2 things that may help you:
consider using FileAccess and FileShare flags when opening files
if data from the file is needed only within the scope of the function, use the construction
using (FileStream stream = File.Open(...)) { <file operations> }
This will ensure that the file is closed immediately after exiting the using block, and not when the FileStream object is collected by the GC.
Second, there is an unsafe way to get the processes that use the file. It is based on debugging features provided by Windows. The main idea is to enumerate all system handles and iterate through them to find which are file handles and get additional information about them. This is done using functions that I'm not sure are documented. If you are interested, use Google to find more information, but I don't think it is a good way to go.
public bool IsInUse(string path)
{
    bool inUse = false;
    try
    {
        // Just try opening the file; this throws an IOException if another
        // process already has it open without sharing.
        using (FileStream fs = new FileStream(path, FileMode.OpenOrCreate))
        {
            // If we get here, the file could be opened; fs.CanRead / fs.CanWrite
            // tell you what this handle allows.
        }
    }
    catch (IOException)
    {
        inUse = true;
    }
    return inUse;
}

string path = "D:\\test.doc";
bool isFileInUse = IsInUse(path);
My application uses FileSystemWatcher to raise an event when a TXT file is created by an "X" application, and then reads its content.
The "X" application creates the file (my application detects it successfully), but it takes some time to fill in the data, so the TXT file cannot be read at creation time. I'm
looking for a way to wait until the file becomes available for reading - not a static delay, but something tied to that file.
Any help? Thanks.
Create the file like this:
myfile.tmp
Then when it's finished, rename it to
myfile.txt
and have your filewatcher watch for the .txt extension
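If you control the writer (or can ask the "X" application's authors to do this), the idea looks roughly like the sketch below; the folder and file names are placeholders:
// Writer side: produce the file under a temporary name, then rename when complete.
File.WriteAllText(@"C:\watched\myfile.tmp", data);              // data assumed
File.Move(@"C:\watched\myfile.tmp", @"C:\watched\myfile.txt");  // rename is effectively instantaneous

// Watcher side: only react to finished .txt files.
var watcher = new FileSystemWatcher(@"C:\watched", "*.txt");
watcher.Created += (s, e) => Console.WriteLine("Ready to read: " + e.FullPath);
watcher.EnableRaisingEvents = true;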
The only way I have found to do this is to put the attempt to read the file in a loop, and exit the loop when I don't get an exception. Hopefully someone else will come up with a better way...
bool FileRead = false;
while (!FileRead)
{
    try
    {
        // code to read file, which you already know
        FileRead = true;
    }
    catch (Exception)
    {
        // do nothing or optionally cause the code to sleep for a second or two
    }
}
You could track the file's Changed event, and see if it's available for opening on change. If the file is still locked, just watch for the next change event.
You can open and read a locked file like this
using (var stream = new FileStream(@"c:\temp\file.txt", FileMode.Open, FileAccess.Read, FileShare.ReadWrite)) {
    using (var file = new StreamReader(stream)) {
        while (!file.EndOfStream) {
            var line = file.ReadLine();
            Console.WriteLine(line);
        }
    }
}
However, make sure your file writer flushes otherwise you may not see any changes.
The application X should lock the file until it closes it. Is application X also a .NET application and can you modify it? In that case you can simply use the FileInfo class with the proper value for FileShare (in this case FileShare.Read).
If you have no control over application X, the situation becomes a little more complex, but you can still attempt to open the file exclusively via the same FileInfo.Open method, passing FileShare.None. The open will fail as long as the file is still in use, so you can retry inside a loop until application X has closed the file and it is ready to be read.
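A rough sketch of that loop (the path variable and the retry delay are just for illustration):
var info = new FileInfo(txtFilePath); // path reported by the FileSystemWatcher event (assumed)
while (true)
{
    try
    {
        // Fails with an IOException while application X still has the file open.
        using (var stream = info.Open(FileMode.Open, FileAccess.Read, FileShare.None))
        using (var reader = new StreamReader(stream))
        {
            string content = reader.ReadToEnd();
            // process content here
            break;
        }
    }
    catch (IOException)
    {
        System.Threading.Thread.Sleep(200); // still locked; wait and retry
    }
}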
We have a virtual printer for creating pdf documents, and I do something like this to access that document after it's sent to the printer:
using (FileSystemWatcher watcher = new FileSystemWatcher(folder))
{
    if (!File.Exists(docname))
        for (int i = 0; i < 3; i++)
            watcher.WaitForChanged(WatcherChangeTypes.Created, (i + 1) * 1000);
}
So I wait for a total of 6 seconds (some documents can take a while to print but most come very fast, hence the increasing wait time) before deciding that something has gone awry.
After this, I also read in a for loop, in just the same way that I wait for it to be created. I do this just in case the document has been created, but not released by the printer yet, which happens nearly every time.
You can use the same class to be notified when the file changes.
The Changed event is raised when changes are made to the size, system attributes, last write time, last access time, or security permissions of a file or directory in the directory being monitored.
So I think you can use that event to check whether the file is readable, and open it if it is.
If you have a DB at your disposal, I would recommend using a DB table as a queue with the file names and then monitoring that instead. Nice and transactional.
You can check whether the file's size has changed, although this will require you to poll its value with some frequency.
Also, if you want to get the data faster, you can .Flush() while writing, and make sure to .Close() the stream as soon as you finish writing to it.