This question already has answers here:
Detecting whether a file is locked by another process (or indeed the same process) [duplicate]
(4 answers)
Closed 9 years ago.
In my program I pass the path to a file which may still be being written to by another process (an independent, external, non-C# one).
How can I check that the file is fully readable and accessible before I actually send the path back to the client? FileIOPermission with Demand for reading did not work for me, so I am wondering if there is some other way of doing this without attempting to read the whole file upfront.
Thank you.
The trouble with checking whether a file is accessible and fully readable, and then opening it for reading, is that the state could potentially change between the check and the open, where you actually want to be doing something with it.
My suggestion is to go ahead and open the file for reading, being sure to catch the appropriate exception if a problem occurs. You can then wait a little and try again, or do something else.
Remember that just because your program has a sequence of events that is logical to you, many things are going on in the operating system, and the state can easily change between two lines of code that look watertight to you but have an eternity between them at the multitasking level.
Moo-Juice is right. You can't check whether the file can be read successfully and then open it for reading... things can change and you might get an exception anyway. Best to just read it and catch the exception.
public bool TryReadFile(String path, out String contentsOfFile)
{
    try
    {
        // Try reading the file
        contentsOfFile = File.ReadAllText(path);
        // Success!
        return true;
    }
    catch (IOException)
    {
        // Can't read that file right now.
        // Return a default value and let the caller know we failed.
        contentsOfFile = String.Empty;
        return false;
    }
}
The best way would be to read the file in a normal try/catch block, and then decide whether to continue based on whether an exception was thrown.
You can also check that the file is not zero bytes, but purely as a secondary check.
Create a loop and try to open the file. If an exception occurs, make your thread sleep for a few seconds and repeat the process.
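Put together, that retry loop might look like the following sketch. The attempt count and delay are arbitrary choices, not values from the question:

```csharp
using System;
using System.IO;
using System.Threading;

public static class RetryOpen
{
    // Try to open the file exclusively, sleeping between attempts.
    // Gives up after maxAttempts so the loop cannot spin forever.
    public static FileStream OpenWithRetry(string path, int maxAttempts = 10, int delayMs = 2000)
    {
        for (int attempt = 1; ; attempt++)
        {
            try
            {
                // FileShare.None fails while any other process still holds the file open
                return new FileStream(path, FileMode.Open, FileAccess.Read, FileShare.None);
            }
            catch (IOException)
            {
                if (attempt == maxAttempts)
                    throw; // give up and let the caller decide what to do
                Thread.Sleep(delayMs);
            }
        }
    }
}
```

The caller gets either an open stream it can read immediately, or the final IOException once the attempts are exhausted.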
When the external process is complete, does it close? If so, you could modify the solution found here. It would look something like the following:
// While the process is still running
while (System.Diagnostics.Process.GetProcessesByName(process).Length != 0)
{
    // Sleep for three seconds
    System.Threading.Thread.Sleep(3000);
}
// open the file for reading
If the process doesn't close when complete, however, the above won't work. You could try placing the following in a loop that tries to open the file exclusively until it's successful:
System.IO.File.Open(PathToFile, FileMode.Open, FileAccess.Read, FileShare.None);
Both of these methods should also have some kind of count added to ensure that they eventually stop trying and error out, or else you could end up with an infinite loop.
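A bounded version of the process-wait loop above might look like this sketch. The process name and the limits are placeholders, not values from the question:

```csharp
using System;
using System.Diagnostics;
using System.Threading;

public static class BoundedWait
{
    // Poll for the named process, but stop after maxChecks attempts
    // instead of spinning forever if the process never exits.
    public static bool WaitForProcessExit(string processName, int maxChecks = 20, int delayMs = 3000)
    {
        for (int i = 0; i < maxChecks; i++)
        {
            if (Process.GetProcessesByName(processName).Length == 0)
                return true; // process is gone; safe to try opening the file
            Thread.Sleep(delayMs);
        }
        return false; // gave up; the caller should report an error
    }
}
```

A false return is the "error out" case: the caller can log the failure instead of looping indefinitely.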
Hello and thanks for your help.
This time I am having a curious problem with a C# program I am writing, and I would like to hear your advice.
I am writing a normal (not multithreaded) program, but then added a timer (System.Timers.Timer).
Also, I am using a StreamWriter to write to a file. I open it like this:
StreamWriter logStream = new StreamWriter(filename, true);
meaning that if the file exists it appends, and if not it creates it.
Later I write to the file like this:
logStream.WriteLine(message);
However, I write to the stream from both the main function and from the function that is called by the timer.
the problem symptoms
My program sometimes throws an error when I flush or write the stream, saying "Cannot access a closed file", and at other times "Cannot access a closed TextWriter" (what is a "TextWriter"?).
Curiously, however, the file keeps being written without problems. (Even the "cannot access a closed file" message is written to the supposedly closed file.)
I am not familiar with the inner workings of a Timer. (I suppose it runs on a separate thread?)
My question is
Is it possible to use a StreamWriter from several threads? (in this case the main one and the Timer one)
Is it possible that a race condition or some similar problem is occurring?
One more thing: I made a logic mistake and close and reopen the file every time I want to write to it. Yes, it is a mistake and I should correct it. But if I simply correct it, the error I described above may disappear while masking a more serious flaw.
My suspicion is that since I am closing and opening the file every time I write to it, the two threads may be trying to access it at the wrong time.
Any help will be greatly appreciated
Closing and opening your file under this scenario will create a race condition, as you suspect. You cannot simply keep the stream open and pass the object to the thread either, because you might end up with a similar issue when you call it from different threads. Your best solution remains a thread-safe method that writes whatever you send to it.
The methods are static because the lock has to be accessible from all instances of the class.
private static ReaderWriterLockSlim readerWriterLockSlim = new ReaderWriterLockSlim();

public static void AppendToFile(string path, string text)
{
    // Acquire the write lock (other threads will block here until it is released)
    readerWriterLockSlim.EnterWriteLock();
    try
    {
        // Append the text to the file
        using (StreamWriter sw = File.AppendText(path))
        {
            sw.WriteLine(text);
        } // the using block closes and disposes the writer
    }
    finally
    {
        // Release the lock
        readerWriterLockSlim.ExitWriteLock();
    }
}
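A minimal, self-contained variant of the same idea, using a plain lock statement instead of ReaderWriterLockSlim (there are only writers here, so a simple monitor is enough):

```csharp
using System;
using System.IO;

public static class SafeLog
{
    // One shared lock object; both the main thread and the timer
    // callback must go through AppendToFile, so writes serialize
    // and can never hit a stream the other thread has closed.
    private static readonly object _sync = new object();

    public static void AppendToFile(string path, string text)
    {
        lock (_sync)
        {
            File.AppendAllText(path, text + Environment.NewLine);
        }
    }
}
```

Both the main code and the timer's Elapsed handler would then call SafeLog.AppendToFile instead of touching a shared StreamWriter directly.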
We currently have one application that monitors a folder for new files. To make it fault tolerant and be able to process more files at once, we want to be able to run multiple instances of this application on different machines. We use File.Move to "lock" a file and make sure that only one thread can process a file at a time.
To test that only one application and/or thread can perform a File.Move on a given file, I created a simple application (based on the original application's code) which creates 10 threads per instance and monitors a folder; when a thread detects a new file, it performs File.Move on it and changes the file's extension to try to stop other threads from doing the same.
I have seen an issue when running multiple copies of this application (and even when running it on its own) whereby two threads (either in the same application or in different ones) both successfully perform File.Move with no exception thrown, but only the thread that performed it last (I change the file's extension to include DateTime.Now.ToFileTime()) actually renamed the file.
I have looked at what File.Move does: it checks whether the file exists before it performs the operation, then calls out to Win32Native.MoveFile to perform the move.
All the other threads/applications throw an exception, as I would expect.
The reasons why this is an issue are:
I thought only 1 thread can perform a File.Move on a file at a time.
I need to reliably have only one application/thread be able to process a file at a time.
Here is the code that performs the File.Move:
public bool TryLock(string originalFile, out string changedFileName)
{
    FileInfo fileInfo = new FileInfo(originalFile);
    changedFileName = Path.ChangeExtension(originalFile, ".original." + DateTime.Now.ToFileTime());
    try
    {
        File.Move(originalFile, changedFileName);
    }
    catch (IOException)
    {
        Console.WriteLine("{3} - Thread {1}-{2} File {0} is already in use", fileInfo.Name, Thread.CurrentThread.ManagedThreadId, id, DateTime.Now.ToLongTimeString());
        return false;
    }
    catch (Exception ex)
    {
        Console.WriteLine("{3} - Thread {1}-{2} File {0} error {4}", fileInfo.Name, Thread.CurrentThread.ManagedThreadId, id, DateTime.Now.ToLongTimeString(), ex);
        return false;
    }
    return true;
}
Note - id is just a sequential number I assigned to each thread for logging.
I am running Windows 7 Enterprise SP1 on a SSD with NTFS.
From the MSDN description I assume that File.Move does not open the file in exclusive mode:
If you try to move a file across disk volumes and that file is in use,
the file is copied to the destination, but it is not deleted from the
source.
Anyway, I think you are better off creating your own move mechanism and having it open the file in exclusive mode prior to copying it (and then deleting it):
File.Open(pathToYourFile, FileMode.Open, FileAccess.Read, FileShare.None);
Other threads won't be able to open it while the move operation is in progress. You might still have race condition issues between the moment the copy is finalized (you need to dispose of the file handle) and the deletion of the source.
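A sketch of such a hand-rolled move, under the assumption that holding the source handle with FileShare.None is enough to keep other processes out until the copy finishes (the race window between disposing the handle and the delete still exists):

```csharp
using System.IO;

public static class ExclusiveMover
{
    // "Move" = exclusive open + copy + delete. No other process can
    // open the source while our FileShare.None handle is alive.
    public static void Move(string source, string destination)
    {
        using (var src = new FileStream(source, FileMode.Open, FileAccess.Read, FileShare.None))
        using (var dst = new FileStream(destination, FileMode.CreateNew, FileAccess.Write, FileShare.None))
        {
            src.CopyTo(dst);
        }
        // Race window: another process could grab the source between
        // the handles being disposed above and this delete.
        File.Delete(source);
    }
}
```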
Using File.Move as a lock isn't going to work. As stated in #marceln's answer, it won't delete the source file if it is already in use elsewhere, and it doesn't have any "locking" behavior, so you can't rely on it.
What I would suggest is to use a BlockingCollection<T> to manage the processing of your files:
// Assuming this BlockingCollection is already filled with all the file paths
private BlockingCollection<string> _blockingFileCollection = new BlockingCollection<string>();

public bool TryProcessFile(string originalFile, out string changedFileName)
{
    FileInfo fileInfo = new FileInfo(originalFile);
    changedFileName = Path.ChangeExtension(originalFile, ".original." + DateTime.Now.ToFileTime());
    string itemToProcess;
    if (!_blockingFileCollection.TryTake(out itemToProcess))
    {
        // Another thread already took the item; nothing to process
        return false;
    }
    // Only the thread that successfully took the item moves the file;
    // all the others returned false above.
    File.Move(originalFile, changedFileName);
    return true;
}
Are you moving across volumes or within a volume? In the latter case no copying is necessary.
#usr In production, once a thread has "locked" a file, we will be moving it across network shares
I'm not sure whether that is a true move or a copy operation. In any case, you could:
open the file exclusively
copy the data
delete the source by handle (Deleting or Renaming a file using an open handle)
That allows you to lock other processes out of the file for the duration of the move. It is more of a workaround than a real solution, but I hope it helps.
Note that for the duration of the move the file is unavailable, and other processes will receive an error when accessing it. You might need a retry loop with a time delay between operations.
Here's an alternative:
Copy the file to the target folder with a different extension that is being ignored by readers
Atomically rename the file to remove the extension
Renaming within the same volume is always atomic. Readers might receive a sharing-violation error for a very short period of time. Again, you need a retry loop, or to tolerate a very small window of unavailability.
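That two-step publish could be sketched as follows. The .tmp suffix is an arbitrary convention, with the assumption that readers skip files carrying it:

```csharp
using System.IO;

public static class AtomicPublish
{
    // Copy under a temporary name that readers ignore, then rename.
    // The rename happens within the target volume, so readers see
    // either no file or the complete file, never a partial one.
    public static void Publish(string source, string targetPath)
    {
        string tempPath = targetPath + ".tmp"; // readers skip *.tmp by convention
        File.Copy(source, tempPath, overwrite: true);
        File.Move(tempPath, targetPath);
    }
}
```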
Based on #marceln's and #YuvalItzchakov's answers/comments, I tried the following, which seems to give more reliable results:
using (var readFileStream = File.Open(originalFile, FileMode.Open, FileAccess.Read, FileShare.Delete))
{
    readFileStream.Lock(0, readFileStream.Length);
    File.Move(originalFile, changedFileName);
    readFileStream.Unlock(0, readFileStream.Length);
}
I want to use Windows's own file copying, since it should be more efficient than copying the stream, and in production we will be moving the files from one network share to another.
Before I go into too much detail: my program is written in Visual Studio 2010 using C# .NET 4.0.
I wrote a program that generates a separate log file for each run. The log file is named after the time, accurate to the millisecond (for example, 20130726103042375.log). The program also generates a master log file for the day if it does not already exist (for example, 20130726_Master.log).
At the end of each run, I want to append the log file to the master log file. Is there a way to check whether the append will succeed, and retry after a Sleep of a second or so if not?
Basically, I have 1 executable, and multiple users (let's say there are 5 users).
All 5 users will access and run this executable at the same time. Since it's nearly impossible for all users to start at the exact same time (down to the millisecond), there is no problem generating the individual log files.
However, the issue comes in when I attempt to merge those log files into the master log file. Though it is unlikely, I think the program will crash if multiple users append to the same master log file at once.
The method I use is
File.AppendAllText(masterLogFile, File.ReadAllText(individualLogFile));
I have looked into the lock object, but I think it doesn't work in my case, as there are multiple instances running instead of multiple threads in one instance.
Another way I looked into is try/catch, something like this:
try
{
    stream = file.Open(FileMode.Open, FileAccess.ReadWrite, FileShare.None);
}
catch {}
But I don't think this solves the problem, because the status of the masterLogFile can change in that brief millisecond.
So my overall question is: is there a way to append to the masterLogFile if it's not in use, and retry after a short timeout if it is? Or is there an alternative way to create the masterLogFile?
Thank you in advance, and sorry for the long message. I want to make sure I get my message across and explain what I've tried or looked into so we are not wasting anyone's time.
Please let me know if there's anymore information I can provide to help you help me.
Your try/catch is the way to do things. If the call to File.Open succeeds, then you can write to the file. The idea is to keep the file open. I would suggest something like:
bool openSuccessful = false;
while (!openSuccessful)
{
    try
    {
        using (var writer = new StreamWriter(masterlog, true)) // append
        {
            // successfully opened the file
            openSuccessful = true;
            try
            {
                foreach (var line in File.ReadLines(individualLogFile))
                {
                    writer.WriteLine(line);
                }
            }
            catch (IOException)
            {
                // something unexpected happened while writing.
                // handle the error and exit the loop.
                break;
            }
        }
    }
    catch (IOException)
    {
        // couldn't open the file.
        // If the exception is because it's opened in another process,
        // then delay and retry. Otherwise exit.
        Thread.Sleep(1000);
    }
}
if (!openSuccessful)
{
// notify of error
}
So if you fail to open the file, you sleep and try again.
See my blog post, File.Exists is only a snapshot, for a little more detail.
I would do something along the lines of this, as I think it incurs the least overhead. Try/catch is going to generate a stack trace (which could take a whole second) if an exception is thrown. There has to be a better way to do this atomically; if I find one I'll post it.
This question already has answers here:
Closed 11 years ago.
Possible Duplicate:
Wait until file is unlocked in .NET
I have an open file, like a .doc or .txt, and I have to wait until the user closes it.
I already tried this, following Wait until file is unlocked in .NET:
while (true)
{
    try
    {
        using (FileStream Fs = new FileStream(fileName, FileMode.Open, FileAccess.ReadWrite, FileShare.None, 100))
        {
            // the file is closed
            break;
        }
    }
    catch (IOException)
    {
        // wait and retry
        Thread.Sleep(1000);
    }
}
This works well, but is it possible to find a solution without a try/catch to handle the exception?
Unfortunately no, there is no other way.
The API doesn't have an event that fires when a file is unlocked, or anything else that is convenient.
Retrying with waits is the best solution with the current API.
For one, though, don't use the loop you have right now, which breaks when there's no exception; perform your actual file access inside that using block.
Next, if the file is open in a known process, you could get its Process object, set EnableRaisingEvents to true, and handle its Exited event to try again. It's not failsafe, though, so you would still handle exceptions and use a loop.
You can make P/Invoke calls to the native CreateFile function and then analyze the error code. However, try/catch will still be necessary.
I have a method which writes a file to the local disk. Another method takes that file and uploads it to an SFTP server. The problem is that the file on the SFTP server is empty. This is a small piece of my code:
WriteToLocalFolder(content, filename);
WriteToSftpServer(filename, server logon params);
Could it be that WriteToSftpServer gets called before WriteToLocalFolder finishes writing? If so, how can I make sure that WriteToSftpServer starts only after WriteToLocalFolder has finished writing the file?
WriteToLocalFolder looks like this in the real code:
public static void WriteToFolder(StringBuilder content, string whereTo)
{
    File.WriteAllText(whereTo, content.ToString());
}
So the stream is closed I think...
Thanks :-)
The code in WriteToSftpServer shouldn't ever run before WriteToLocalFolder is done (because it does not appear to be async). However, it could be that the file stream is not properly closed, so WriteToSftpServer can't access it.
Try putting a breakpoint inside WriteToSftpServer where the file gets loaded to see what it loads. If the file loads correctly, you can step through the method to see where it breaks.
If WriteToLocalFolder doesn't spawn a separate thread, this is NOT possible. Maybe you are missing a Stream.Flush, Stream.Close, or Stream.Dispose in one of the methods?
It depends on the code in WriteToLocalFolder, but it sounds more likely that you're writing content to the file and forgetting to Flush the write buffer. If this is not the case, please edit your question and add the code for WriteToLocalFolder.
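For comparison, if WriteToLocalFolder wrote through a stream directly, wrapping the writer in a using block would guarantee the buffer is flushed and the file closed before the method returns. A sketch of the same method written with an explicit StreamWriter:

```csharp
using System.IO;
using System.Text;

public static class FolderWriter
{
    public static void WriteToFolder(StringBuilder content, string whereTo)
    {
        using (var writer = new StreamWriter(whereTo, append: false))
        {
            writer.Write(content.ToString());
        } // Dispose flushes the buffer and closes the underlying stream here
    }
}
```

File.WriteAllText already does this internally, which is why the posted WriteToFolder is not the culprit.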
do
{
    WriteToLocalFolder(content, filename);
} while (!System.IO.File.Exists(filename));

WriteToSftpServer(filename, server logon params);
This loop will ensure that the file exists with its content before getting to the next line.