C# FileStream in Mono - File sharing violation

I have a C# WinForms application running on Raspbian with Mono. It has a timer. When the OnTimedEvent fires, I check whether I have exclusive access to a file that I want to upload (to make sure it has finished being written to disk), then attempt the upload. If the upload succeeds, I move the file to an archive folder; otherwise I leave it there and wait for the next timer event. I have no problems when connected to the Internet, but when I test without a connection and my upload fails, the second OnTimedEvent gets an exception when checking whether the same file is ready (again). I am getting:
Error message: Sharing violation on path 'path'
HResult: -2147024864
Method to check if file is ready:
public static bool IsFileReady(string filename)
{
    // If the file can be opened for exclusive access it means that the file
    // is no longer locked by another process.
    try
    {
        var inputStream = File.Open(filename, FileMode.Open, FileAccess.Read, FileShare.None);
        bool test = inputStream.Length > 0;
        inputStream.Close();
        inputStream.Dispose();
        return test;
    }
    catch (Exception e)
    {
        // log e here
        throw; // rethrow without resetting the stack trace (throw e; would reset it)
    }
}
This is what executes on the OnTimedEvent:
var csvFiles = from f in di.GetFiles()
               where f.Extension == ".csv"
               select f; //get csv files in upload folder
foreach (var file in csvFiles)
{
    if (IsFileReady(file.FullName)) //check that file is done writing before trying to move.
    {
        bool IsUploadSuccess = await WritingCSVFileToS3Async(file); //.Wait(); //upload file to S3
        if (IsUploadSuccess)
        {
            //move to completed folder if upload successful. else, leave there for next upload attempt
            File.Move(file.FullName, archivePath + file.Name);
        }
    }
}
From what I can understand, it looks like my first FileStream (File.Open) still has the file locked when the 2nd event fires. However, I've added .Close() and .Dispose() to the IsFileReady method but that doesn't seem to be working.
Any help would be appreciated!
EDIT: Below is the WritingCSVFileToS3Async method.
static async Task<bool> WritingCSVFileToS3Async(FileInfo file)
{
    try
    {
        client = new AmazonS3Client(bucketRegion);
        // Put the object: set ContentType and add metadata.
        var putRequest = new PutObjectRequest
        {
            BucketName = bucketName,
            Key = file.Name,
            FilePath = file.FullName,
            ContentType = "text/csv"
        };
        //putRequest.Metadata.Add("x-amz-meta-title", "someTitle"); //don't need metadata at this time
        PutObjectResponse response = await client.PutObjectAsync(putRequest);
        if (response.HttpStatusCode == System.Net.HttpStatusCode.OK)
            return true;
        else
            return false;
    }
    catch (AmazonS3Exception e)
    {
        ErrorLogging.LogErrorToFile(e);
        return false;
    }
    catch (Exception e)
    {
        ErrorLogging.LogErrorToFile(e);
        return false;
    }
}
Also, I ran the same application on Windows, and am getting a similar exception:
The process cannot access the file 'path' because it is being used by another process.

I believe I've found the problem. I noticed that I was not catching the client timeout exception for the PUT request (not connected to the Internet). My timer interval was 20 seconds, which is shorter than the S3 client timeout (30 seconds). So the client still had the file tied up by the time the second timer event fired, hence the sharing violation. I increased the timer interval to 60 seconds, and I now catch the client timeout exception and can handle it before the next timer event.
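For illustration, a minimal sketch of that fix under the values mentioned above (uploadTimer is a hypothetical System.Timers.Timer; AmazonS3Config is the AWS SDK for .NET client configuration type):

uploadTimer.Interval = 60000; // fire every 60 s, longer than the S3 client timeout

var s3Config = new AmazonS3Config
{
    RegionEndpoint = bucketRegion,      // region field from the code above
    Timeout = TimeSpan.FromSeconds(30)  // client gives up before the next tick
};
client = new AmazonS3Client(s3Config);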
Thanks for your help.

Related

How to wait until a file is successfully copied from a network drive before reading from it?

I am currently facing a problem with an application which consists of multiple components.
One component of the application periodically checks for new files on a network drive and copies them into a local folder. Another component of the application uses a FileSystemWatcher to watch for any new files in the local folder. If a new file is copied, the Created event of the FileSystemWatcher gets called and the application will then read the file contents and import the file into a database.
To prevent the application from trying to read the file before it is fully copied into the local folder, it calls the following function periodically until it returns false:
private bool isFileLocked(string filePath)
{
    try
    {
        if (!File.Exists(filePath))
        {
            return false;
        }
        using (FileStream fs = File.OpenRead(filePath))
        {
        }
        return false;
    }
    catch (IOException)
    {
        return true;
    }
}
Unfortunately this does not seem to work in all cases. Sometimes, I noticed that the file is being read before it is completely written into the local folder. When this happens, the component which tries to copy the file gets the following error:
System.IO.IOException: The process cannot access the file '...' because it is being used by another process.
The component which copies the file is written in PowerShell and uses the following Cmdlet for copying:
Copy-Item $currentfile.FullName -Destination "$destfolder" -Force -ErrorAction Stop
The component which uses the FileSystemWatcher and imports the file is a C# based windows service.
How can I prevent it from reading the file before it is fully copied into the local folder?
If you don't mind a little delay, it may solve your trouble:
static void Main(string[] args)
{
    FileSystemWatcher fsw = new FileSystemWatcher("SomePathToFolder");
    fsw.EnableRaisingEvents = true;
    fsw.Created += async (s, a) =>
    {
        while (FileIsLocked(a.FullPath))
        {
            Console.WriteLine($"File {a.Name} is locked!");
            await Task.Delay(TimeSpan.FromSeconds(5)); // 5 seconds delay between checks
        }
        Console.WriteLine($"File {a.Name} available!");
        // You can put here another delay to be 102% sure that file is free,
        // but I suppose this is too much.
        using (FileStream fs = File.OpenRead(a.FullPath))
        {
            Console.WriteLine($"File {a.Name} opened for reading.");
            // Do what you need
            await Task.Run(() => ImportFileToDatabase(fs));
        }
        Console.WriteLine($"File {a.Name} closed.");
    };
    Console.ReadKey();
}
static bool FileIsLocked(string filePath)
{
    if (!File.Exists(filePath))
        return false;
    try
    {
        using (FileStream fs = File.OpenRead(filePath)) { }
        return false;
    }
    catch { }
    return true;
}
Some solutions are suggested here. I've had a similar problem using FileSystemWatcher. This is what I use (simplified):
async Task<FileStream> OpenWaitAsync(string path, TimeSpan interval, CancellationToken cancellationToken = default)
{
    const int ERROR_SHARING_VIOLATION = unchecked((int)0x80070020);
    while (true)
    {
        try
        {
            return File.OpenRead(path);
        }
        catch (IOException ioe) when (ioe.HResult == ERROR_SHARING_VIOLATION)
        {
            await Task.Delay(interval, cancellationToken);
        }
    }
}
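For example, a hypothetical hookup to a FileSystemWatcher (the one-second retry interval is an assumption):

fsw.Created += async (s, e) =>
{
    // Retries until the sharing violation clears, then yields an open stream.
    using (FileStream fs = await OpenWaitAsync(e.FullPath, TimeSpan.FromSeconds(1)))
    {
        // read and import the file here
    }
};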

Monitor folder and copy new files once their creation is complete

I built a console app that monitors a set of folders on a Windows 2019 Server and copies any newly-created .txt files to another folder, using the same file name. So far it's working for the basic functionality. Now I have to handle the fact that most of the time, these files are large and take several minutes to complete creation. I have gone through several SO posts and pieced together the following code trying to accomplish this:
using System;
using System.IO;

namespace Folderwatch
{
    class Program
    {
        static void Main(string[] args)
        {
            string sourcePath = @"C:\Users\me\Documents\SomeFolder";
            FileSystemWatcher watcher = new FileSystemWatcher(sourcePath);
            watcher.EnableRaisingEvents = true;
            watcher.IncludeSubdirectories = true;
            watcher.Filter = "*.txt";

            // Add event handlers.
            watcher.Created += new FileSystemEventHandler(OnCreated);
        }

        // Define the event handlers.
        private static void OnCreated(object source, FileSystemEventArgs e)
        {
            // Specify what is done when a file is created.
            FileInfo file = new FileInfo(e.FullPath);
            string wctPath = e.FullPath;
            string wctName = e.Name;
            string createdFile = Path.GetFileName(wctName);
            string destPath = @"C:\Users\SomeOtherFolder";
            string sourceFile = wctPath;
            string destFile = Path.Combine(destPath, createdFile);
            WaitForFile(file);
            File.Copy(sourceFile, destFile, true);
        }

        public static bool IsFileLocked(FileInfo file)
        {
            try
            {
                using (FileStream stream = file.Open(FileMode.Open, FileAccess.Read, FileShare.None))
                {
                    stream.Close();
                }
            }
            catch (IOException)
            {
                //the file is unavailable because it is:
                //still being written to
                //or being processed by another thread
                //or does not exist (has already been processed)
                return true;
            }
            //file is not locked
            return false;
        }

        public static void WaitForFile(FileInfo filename)
        {
            //This will lock the execution until the file is ready
            //TODO: Add some logic to make it async and cancelable
            while (!IsFileLocked(filename)) { }
        }
    }
}
What I'm attempting to do in the OnCreated method is to check and wait until the file is done being created, and then copy the file to another destination. I don't seem to know what I'm doing with the WaitForFile(file) line - if I comment out that line and the file creation is instant, the file copies as intended. If I use the WaitForFile line, nothing ever happens. I took the IsFileLocked and WaitForFile methods from other posts on SO, but I'm clearly not implementing them correctly.
I've noted this PowerShell version Copy File On Creation (once complete) and I'm not sure if the answer there could be pointing me in the right direction, because I'm even less versed in PS than I am in C#.
EDIT #1: I should have tested for longer before accepting the answer - I think we're close but after about a minute of the program running, I got the following error before the program crashed:
Unhandled exception. System.IO.IOException: The process cannot access the file 'C:\Users\me\Dropbox\test1.log' because it is being used by another process.
   at System.IO.FileSystem.CopyFile(String sourceFullPath, String destFullPath, Boolean overwrite)
   at Folderwatch.Program.OnCreated(Object source, FileSystemEventArgs e) in C:\Users\me\OneDrive - Development\Source\repos\FolderWatchCG\FolderWatchCG\Program.cs:line 61
   at System.Threading.Tasks.Task.<>c.b__139_1(Object state)
   at System.Threading.QueueUserWorkItemCallbackDefaultContext.Execute()
   at System.Threading.ThreadPoolWorkQueue.Dispatch()
   at System.Threading._ThreadPoolWaitCallback.PerformWaitCallback()
Any advice on this would be appreciated. As I further analyze the files in these folders, some of them are log files being written in real time, so the file could be written to for hours before it's actually completed. I am wondering if one of the NotifyFilter values somehow comes into play here?
There's a bug in the WaitForFile() method, that is, it currently waits while the file is not locked (not the other way around). In addition to that, you need a way to confirm that the file actually exists. A simple way to achieve that would be to change the WaitForFile() method into something like this:
public static bool WaitForFile(FileInfo file)
{
    while (IsFileLocked(file))
    {
        // The file is inaccessible. Let's check if it exists.
        file.Refresh(); // FileInfo caches Exists; refresh before re-checking
        if (!file.Exists) return false;
    }
    // The file is accessible now.
    return true;
}
This will keep waiting as long as the file exists and is inaccessible.
Then, you can use it as follows:
bool fileAvailable = WaitForFile(file);
if (fileAvailable)
{
    File.Copy(sourceFile, destFile, true);
}
The problem with this approach though is that the while loop keeps the thread busy, which a) consumes a considerable amount of the CPU resources, and b) prevents the program from processing other files until it finishes waiting for that one file. So, it's probably better to use an asynchronous wait between each check.
Change the WaitForFile method to:
public static async Task<bool> WaitForFile(FileInfo file)
{
    while (IsFileLocked(file))
    {
        // The file is inaccessible. Let's check if it exists.
        file.Refresh(); // refresh the cached state before re-checking
        if (!file.Exists) return false;
        await Task.Delay(100);
    }
    // The file is accessible now.
    return true;
}
Then, await it inside OnCreated like this:
private static async void OnCreated(object source, FileSystemEventArgs e)
{
    // ...
    bool fileAvailable = await WaitForFile(file);
    if (fileAvailable)
    {
        File.Copy(sourceFile, destFile, true);
    }
}

How to process files in directory concurrently in .net

I'm having issues processing files in parallel within a directory. I've read several similar questions and examples but I can't seem to find why my code causes exception.
My directory gets populated by other processes and will contain thousands of files at any one time. Each file has to be parsed and validated, which takes time (filesystem/network I/O, etc.). I need this step to be done in parallel; the rest has to be done serially.
Here's my code:
public void run()
{
    XmlMessageFactory factory = new XmlMessageFactory();
    DirectoryInfo dir = new DirectoryInfo(m_sourceDir);
    Dictionary<string, int> retryList = new Dictionary<string, int>();
    ConcurrentQueue<Tuple<XmlMsg, FileInfo>> MsgQueue =
        new ConcurrentQueue<Tuple<XmlMsg, FileInfo>>();

    //start worker to handle messages
    System.Threading.ThreadPool.QueueUserWorkItem(o =>
    {
        XmlMsg msg;
        Tuple<XmlMsg, FileInfo> item;
        while (true)
        {
            if (!MsgQueue.TryDequeue(out item))
            {
                System.Threading.Thread.Sleep(5000);
                continue;
            }
            try
            {
                msg = item.Item1;
                /* processing on msg happens here */
                handleMessageProcessed(item.Item2, ref retryList);
            }
            catch (Exception e)
            {
                //if this method is called it gives the
                //exception below
                handleMessageFailed(item.Item2, e.ToString());
            }
        }
    });

    while (true)
    {
        try
        {
            FileInfo[] files = dir.GetFiles(m_fileTypes);
            Partitioner<FileInfo> partitioner = Partitioner.Create(files, true);
            Parallel.ForEach(partitioner, f =>
            {
                try
                {
                    XmlMsg msg = factory.getMessage(messageType);
                    try
                    {
                        msg.loadFile(f.FullName);
                        MsgQueue.Enqueue(new Tuple<XmlMsg, FileInfo>(msg, f));
                    }
                    catch (Exception e)
                    {
                        handleMessageFailed(f, e.ToString());
                    }
                }
                catch (Exception) { /* catch block elided in the question */ }
            });
        }
        catch (Exception) { /* catch block elided in the question */ }
    }
}
static void handleMessageFailed(FileInfo f, string message)
{
    //Error here:
    f.MoveTo(m_failedDir + f.Name);
    //"The process cannot access the file because it is
    //being used by another process." System.IO.IOException
}
Using a ConcurrentQueue, how can it end up attempting to access a file twice at the same time?
I have a test setup currently with 5000 files and this will happen at least once per run and on a different file each time. When I inspect the directory, the source file causing exception will have already been processed and is in the "processed" directory.
After a fair bit of head scratching the problem turned out to be annoyingly simple! What was happening was the parallel processing of the files in the directory was completing before the serial activity on the file, so the loop was restarting and re-adding some of the files to the Queue that were already in there.
For completeness here's the modified section of code:
while (true)
{
    try
    {
        FileInfo[] files = dir.GetFiles(m_fileTypes);
        Partitioner<FileInfo> partitioner = Partitioner.Create(files, true);
        Parallel.ForEach(partitioner, f =>
        {
            try
            {
                XmlMsg msg = factory.getMessage(messageType);
                try
                {
                    msg.loadFile(f.FullName);
                    MsgQueue.Enqueue(new Tuple<XmlMsg, FileInfo>(msg, f));
                }
                catch (Exception e)
                {
                    handleMessageFailed(f, e.ToString());
                }
            }
            catch (Exception) { /* catch block elided in the question */ }
        });

        //Added check to wait for queue to deplete before
        //re-scanning the directory
        while (MsgQueue.Count > 0)
        {
            System.Threading.Thread.Sleep(5000);
        }
    }
    catch (Exception) { /* catch block elided in the question */ }
}
I suspect a problem in XmlMsg.loadFile()
I think that you may have code like this in it:
public void loadFile(string filename)
{
    FileStream file = File.OpenRead(filename);
    // Do something with file
    file.Close();
}
If an exception occurs in the "do something with file" part, the file won't be closed because file.Close() will never be executed. Then you'll get the "file in use" exception inside handleMessageFailed().
If so, the solution is to access the file in a using block as follows; then it will be closed even if an exception occurs:
public void loadFile(string filename)
{
    using (FileStream file = File.OpenRead(filename))
    {
        // Do something with file
    }
}
But assuming that this does turn out to be the problem, when you start using real files produced by external processes, you may have another issue if the external processes still have the files open when your worker threads try to process them.

System.IO.File.Move--How to wait for move completion?

I am writing a WPF application in C# and I need to move some files--the rub is that I really REALLY need to know if the files make it. To do this, I wrote a check that makes sure the file gets to the target directory after the move--the problem is that sometimes I get to the check before the file finishes moving:
try
{
    System.IO.File.Move(file.FullName, endLocationWithFile);
    System.IO.FileInfo[] filesInDirectory = endLocation.GetFiles();
    foreach (System.IO.FileInfo temp in filesInDirectory)
    {
        if (temp.Name == shortFileName)
        {
            return true;
        }
    }
    // The file we sent over has not gotten to the correct directory... something went wrong!
    throw new IOException("File did not reach destination");
}
catch (Exception e)
{
    //Something went wrong, return a fail;
    logger.writeErrorLog(e);
    return false;
}
Could somebody tell me how to make sure that the file actually gets to the destination?--The files that I will be moving could be VERY large--(Full HD mp4 files of up to 2 hours)
Thanks!
You could use streams with async/await to ensure the file is completely copied.
Something like this should work:
private void Button_Click(object sender, RoutedEventArgs e)
{
    string sourceFile = @"\\HOMESERVER\Development Backup\Software\Microsoft\en_expression_studio_4_premium_x86_dvd_537029.iso";
    string destinationFile = "G:\\en_expression_studio_4_premium_x86_dvd_537029.iso";
    MoveFile(sourceFile, destinationFile);
}

private async void MoveFile(string sourceFile, string destinationFile)
{
    try
    {
        using (FileStream sourceStream = File.Open(sourceFile, FileMode.Open))
        {
            using (FileStream destinationStream = File.Create(destinationFile))
            {
                await sourceStream.CopyToAsync(destinationStream);
                if (MessageBox.Show("I made it in one piece :), would you like to delete me from the original file?", "Done", MessageBoxButton.YesNo) == MessageBoxResult.Yes)
                {
                    sourceStream.Close();
                    File.Delete(sourceFile);
                }
            }
        }
    }
    catch (IOException ioex)
    {
        MessageBox.Show("An IOException occurred during move, " + ioex.Message);
    }
    catch (Exception ex)
    {
        MessageBox.Show("An Exception occurred during move, " + ex.Message);
    }
}
If you're using VS2010 you will have to install the Async CTP to use the new async/await syntax.
You could watch for the files to disappear from the original directory, and then confirm that they indeed appeared in the target directory.
I have not had great experience with file watchers. I would probably have the thread doing the move wait for an AutoResetEvent while a separate thread or timer runs to periodically check for the files to disappear from the original location, check that they are in the new location, and perhaps (depending on your environment and needs) perform a consistency check (e.g. MD5 check) of the files. Once those conditions are satisfied, the "checker" thread/timer would trigger the AutoResetEvent so that the original thread can progress.
Include some "this is taking way too long" logic in the "checker".
Why not manage the copy yourself by copying streams?
//http://www.dotnetthoughts.net/writing_file_with_non_cache_mode_in_c/
const FileOptions FILE_FLAG_NO_BUFFERING = (FileOptions)0x20000000;

//experiment with different buffer sizes for optimal speed
var bufLength = 4096;

using (var outFile =
    new FileStream(
        destPath,
        FileMode.Create,
        FileAccess.Write,
        FileShare.None,
        bufLength,
        FileOptions.WriteThrough | FILE_FLAG_NO_BUFFERING))
using (var inFile = File.OpenRead(srcPath))
{
    //either
    //inFile.CopyTo(outFile);
    //or
    var fileSizeInBytes = inFile.Length;
    var buf = new byte[bufLength];
    long totalCopied = 0L;
    int amtRead;
    while ((amtRead = inFile.Read(buf, 0, bufLength)) > 0)
    {
        outFile.Write(buf, 0, amtRead);
        totalCopied += amtRead;
        double progressPct =
            Convert.ToDouble(totalCopied) * 100d / fileSizeInBytes;
        progressPct.Dump(); // LINQPad's Dump(); replace with your own progress reporting
    }
}
//file is written
You most likely want the move to happen in a separate thread so that you aren't stopping the execution of your application for hours.
If the program cannot continue without the move being completed, then you could open a dialog and check in on the move thread periodically to update a progress tracker. This provides the user with feedback and will prevent them from feeling as if the program has frozen.
There's info and an example on this here:
http://hintdesk.com/c-wpf-copy-files-with-progress-bar-by-copyfileex-api/
Try checking periodically in a background task whether the copied file size has reached the size of the original file (you can also compare hashes of the files).
I ran into a similar problem recently.
// before: synchronous call
OnBackupStarts();
//.. do stuff

// after: run the work in a task and raise an event when it finishes
new TaskFactory().StartNew(() =>
{
    OnBackupStarts();
    //.. do stuff
    OnBackupEnds();
});

void OnBackupEnds()
{
    if (BackupChanged != null)
    {
        BackupChanged(this, new BackupChangedEventArgs(BackupState.Done));
    }
}
Do not wait; react to the event.
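Hypothetically, a consumer would then subscribe rather than block (the State property name on BackupChangedEventArgs is an assumption):

backup.BackupChanged += (sender, args) =>
{
    if (args.State == BackupState.Done)
    {
        // the files are in place; continue here
    }
};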
In the first place, consider that moving files in an operating system does not “recreate” the file in the new directory; it only changes the file's location data in the file allocation table, since physically copying all bytes just to delete the old ones would be a waste of time.
For that reason, moving files is a very fast process, no matter the file size.
EDIT: As Mike Christiansen states in his comment, this "speedy" process only happens when files are moved inside the same volume (you know, C:\... to C:\...)
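As an illustrative check (an assumption, not part of the original answer), you can compare volume roots to predict which case applies:

// A move is only a fast "rename" when both paths share the same volume root.
bool sameVolume = string.Equals(
    Path.GetPathRoot(Path.GetFullPath(source)),
    Path.GetPathRoot(Path.GetFullPath(destination)),
    StringComparison.OrdinalIgnoreCase);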
Thus, the copy/delete behavior proposed by “sa_ddam213” in his response will work, but it is not the optimal solution (it takes longer to finish, and it will not work if, for example, you don't have enough free disk space to hold the copy while the old file still exists, ...).
The MSDN documentation for the File.Move(source, destination) method does not specify whether it waits for completion, but the code given as an example makes a simple File.Exists(...) check, saying that finding the original file still there “is unexpected”:
// Move the file.
File.Move(path, path2);
Console.WriteLine("{0} was moved to {1}.", path, path2);

// See if the original exists now.
if (File.Exists(path))
{
    Console.WriteLine("The original file still exists, which is unexpected.");
}
else
{
    Console.WriteLine("The original file no longer exists, which is expected.");
}
Perhaps you could use a similar approach: check in a while loop for the existence of the new file and the non-existence of the old one, with a “timer” exit from the loop just in case something unexpected happens at the operating-system level and the files get lost:
// We perform the movement of the file
File.Move(source, destination);

// Set an "exit" datetime after which the loop will end, for example 15 seconds.
// The moving process should always be quicker than that if files are on the same
// volume (almost immediate), but not if they are on different ones.
DateTime exitDateTime = DateTime.Now.AddSeconds(15);
bool exitLoopByExpiration = false;

// We stop here until the copy is finished (by checking file existence) or the time limit is exceeded
while (File.Exists(source) && !File.Exists(destination) && !exitLoopByExpiration)
{
    // Compare the current datetime with the exit one; if we reached the exit time,
    // set the flag to leave the loop by expiration, not because the file moved
    if (DateTime.Now.CompareTo(exitDateTime) > 0) { exitLoopByExpiration = true; }
}

if (exitLoopByExpiration)
{
    // We can perform extra work here, like logging problems or throwing an exception,
    // if the loop exited because of time expiration
}
I have checked this solution and seems to work without problems.

FileSystemWatcher and FileCopy issue, after copied delete it [duplicate]

When a file is created (FileSystemWatcher_Created) in one directory I copy it to another. But when I create a big (>10MB) file it fails to copy, because it starts copying while the file is not yet finished being created...
This causes a Cannot copy the file, because it's used by another process error to be raised. ;(
Any help?
class Program
{
    static void Main(string[] args)
    {
        string path = @"D:\levan\FolderListenerTest\ListenedFolder";
        FileSystemWatcher listener;
        listener = new FileSystemWatcher(path);
        listener.Created += new FileSystemEventHandler(listener_Created);
        listener.EnableRaisingEvents = true;
        while (Console.ReadLine() != "exit") ;
    }

    public static void listener_Created(object sender, FileSystemEventArgs e)
    {
        Console.WriteLine
        (
            "File Created:\n"
            + "ChangeType: " + e.ChangeType
            + "\nName: " + e.Name
            + "\nFullPath: " + e.FullPath
        );
        File.Copy(e.FullPath, @"D:\levan\FolderListenerTest\CopiedFilesFolder\" + e.Name);
        Console.Read();
    }
}
There is only a workaround for the issue you are facing.
Check whether the file is in use before starting the copy process. You can call the following function until it returns false.
1st Method, copied directly from this answer:
private bool IsFileLocked(FileInfo file)
{
    FileStream stream = null;
    try
    {
        stream = file.Open(FileMode.Open, FileAccess.ReadWrite, FileShare.None);
    }
    catch (IOException)
    {
        //the file is unavailable because it is:
        //still being written to
        //or being processed by another thread
        //or does not exist (has already been processed)
        return true;
    }
    finally
    {
        if (stream != null)
            stream.Close();
    }
    //file is not locked
    return false;
}
2nd Method:
const int ERROR_SHARING_VIOLATION = 32;
const int ERROR_LOCK_VIOLATION = 33;

//requires: using System.Runtime.InteropServices; (for Marshal)
private bool IsFileLocked(string file)
{
    //check that problem is not in destination file
    if (File.Exists(file) == true)
    {
        FileStream stream = null;
        try
        {
            stream = File.Open(file, FileMode.Open, FileAccess.ReadWrite, FileShare.None);
        }
        catch (Exception ex2)
        {
            //_log.WriteLog(ex2, "Error in checking whether file is locked " + file);
            int errorCode = Marshal.GetHRForException(ex2) & ((1 << 16) - 1);
            if ((ex2 is IOException) && (errorCode == ERROR_SHARING_VIOLATION || errorCode == ERROR_LOCK_VIOLATION))
            {
                return true;
            }
        }
        finally
        {
            if (stream != null)
                stream.Close();
        }
    }
    return false;
}
From the documentation for FileSystemWatcher:
The OnCreated event is raised as soon as a file is created. If a file is being copied or transferred into a watched directory, the OnCreated event will be raised immediately, followed by one or more OnChanged events.
So, if the copy fails, (catch the exception), add it to a list of files that still need to be moved, and attempt the copy during the OnChanged event. Eventually, it should work.
Something like (incomplete; catch specific exceptions, initialize variables, etc):
public static void listener_Created(object sender, FileSystemEventArgs e)
{
    Console.WriteLine
    (
        "File Created:\n"
        + "ChangeType: " + e.ChangeType
        + "\nName: " + e.Name
        + "\nFullPath: " + e.FullPath
    );
    try
    {
        File.Copy(e.FullPath, @"D:\levani\FolderListenerTest\CopiedFilesFolder\" + e.Name);
    }
    catch
    {
        _waitingForClose.Add(e.FullPath);
    }
    Console.Read();
}

public static void listener_Changed(object sender, FileSystemEventArgs e)
{
    if (_waitingForClose.Contains(e.FullPath))
    {
        try
        {
            File.Copy(...);
            _waitingForClose.Remove(e.FullPath);
        }
        catch {}
    }
}
It's an old thread, but I'll add some info for other people.
I experienced a similar issue with a program that writes PDF files; sometimes they take 30 seconds to render, which is the same period that my watcher_FileCreated class waits before copying the file.
The files were not locked.
In this case I checked the size of the PDF and then waited 2 seconds before comparing the new size; if they were unequal, the thread would sleep for 30 seconds and try again.
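A minimal sketch of that size-stability check, using the 2-second sample gap and 30-second back-off described above (the method name is illustrative):

static void WaitForStableSize(string path)
{
    var fi = new FileInfo(path);
    while (true)
    {
        long size = fi.Length;
        System.Threading.Thread.Sleep(TimeSpan.FromSeconds(2));  // wait before sampling again
        fi.Refresh();                                            // re-read the cached length
        if (fi.Length == size) return;                           // unchanged: rendering is done
        System.Threading.Thread.Sleep(TimeSpan.FromSeconds(30)); // still growing: back off, retry
    }
}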
You're actually in luck - the program writing the file locks it, so you can't open it. If it hadn't locked it, you would have copied a partial file, without having any idea there's a problem.
When you can't access a file, you can assume it's still in use (better yet, try to open it in exclusive mode and see whether someone else currently has it open, instead of guessing from the failure of File.Copy). If the file is locked, you'll have to copy it at some other time. If it's not locked, you can copy it (there's slight potential for a race condition here).
When is that 'other time'? I don't remember when FileSystemWatcher sends multiple events per file - check it out, it might be enough for you to simply ignore the event and wait for another one. If not, you can always set up a timer and recheck the file in 5 seconds.
Well, you already gave the answer yourself: you have to wait for the creation of the file to finish. One way to do this is by checking whether the file is still in use. An example of this can be found here: Is there a way to check if a file is in use?
Note that you will have to modify this code for it to work in your situation. You might want to have something like (pseudocode):
public static void listener_Created()
{
    while CheckFileInUse()
        wait 1000 milliseconds
    CopyFile()
}
Obviously you should protect yourself from an infinite while just in case the owner application never releases the lock. Also, it might be worth checking out the other events from FileSystemWatcher you can subscribe to. There might be an event which you can use to circumvent this whole problem.
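For illustration, a concrete version of the pseudocode above with a retry cap (destinationFolder is a hypothetical field; IsFileLocked is any of the helpers shown elsewhere in this thread):

public static void listener_Created(object sender, FileSystemEventArgs e)
{
    const int maxAttempts = 60; // guard against a writer that never releases the lock
    for (int attempt = 0; attempt < maxAttempts; attempt++)
    {
        if (!IsFileLocked(new FileInfo(e.FullPath)))
        {
            File.Copy(e.FullPath, Path.Combine(destinationFolder, e.Name));
            return;
        }
        System.Threading.Thread.Sleep(1000); // wait 1000 milliseconds between checks
    }
    // still locked after maxAttempts: log it and give up (or reschedule)
}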
When a file is being written in binary (byte by byte), creating a FileStream with the above solutions does not work, because the file looks ready at every byte even though writing continues; in this situation you need a different workaround, like this:
Do this when the file is created, or when you want to start processing it:
long fileSize = 0;
currentFile = new FileInfo(path);
while (fileSize < currentFile.Length) //check size is stable or increased
{
    fileSize = currentFile.Length; //get current size
    System.Threading.Thread.Sleep(500); //wait a moment for processing copy
    currentFile.Refresh(); //refresh length value
}
//Now file is ready for any process!
So, having glanced quickly through some of these and other similar questions I went on a merry goose chase this afternoon trying to solve a problem with two separate programs using a file as a synchronization (and also file save) method. A bit of an unusual situation, but it definitely highlighted for me the problems with the 'check if the file is locked, then open it if it's not' approach.
The problem is this: the file can become locked between the time that you check it and the time you actually open it. It's really hard to track down the sporadic Cannot copy the file, because it's used by another process error if you aren't looking for it too.
The basic resolution is to just try to open the file inside a try block so that if it's locked, you can try again. That way there is no elapsed time between the check and the opening; the OS does them at the same time.
The code here uses File.Copy, but it works just as well with any of the static methods of the File class: File.Open, File.ReadAllText, File.WriteAllText, etc.
/// <param name="timeout">how long to keep trying in milliseconds</param>
static void safeCopy(string src, string dst, int timeout)
{
    while (timeout > 0)
    {
        try
        {
            File.Copy(src, dst);

            //don't forget to either return from the function or break out of the while loop
            break;
        }
        catch (IOException)
        {
            //you could do the sleep in here, but it's probably a good idea to exit the error handler as soon as possible
        }
        Thread.Sleep(100);

        //if it's a very long wait this will accumulate very small errors.
        //For most things it's probably fine, but if you need precision over a long time span, consider
        //using some sort of timer or DateTime.Now as a better alternative
        timeout -= 100;
    }
}
Another small note on parallelism:
This is a synchronous method, which will block its thread both while waiting and while working. This is the simplest approach, but if the file remains locked for a long time your program may become unresponsive. Parallelism is too big a topic to go into in depth here (and the number of ways you could set up asynchronous read/write is kind of preposterous), but here is one way it could be parallelized.
public class FileEx
{
    // Note: these methods return Task (not async void) so callers can await them,
    // and doWhenDone is optional so the three-argument calls below compile.
    public static async Task CopyWaitAsync(string src, string dst, int timeout, Action doWhenDone = null)
    {
        while (timeout > 0)
        {
            try
            {
                File.Copy(src, dst);
                doWhenDone?.Invoke();
                break;
            }
            catch (IOException) { }
            await Task.Delay(100);
            timeout -= 100;
        }
    }

    public static async Task<string> ReadAllTextWaitAsync(string filePath, int timeout)
    {
        while (timeout > 0)
        {
            try
            {
                return File.ReadAllText(filePath);
            }
            catch (IOException) { }
            await Task.Delay(100);
            timeout -= 100;
        }
        return "";
    }

    public static async Task WriteAllTextWaitAsync(string filePath, string contents, int timeout)
    {
        while (timeout > 0)
        {
            try
            {
                File.WriteAllText(filePath, contents);
                return;
            }
            catch (IOException) { }
            await Task.Delay(100);
            timeout -= 100;
        }
    }
}
And here is how it could be used:
public static void Main()
{
    test_FileEx();
    Console.WriteLine("Me First!");
}

public static async void test_FileEx()
{
    await Task.Delay(1);

    //you can do this, but it gives a compiler warning because it can potentially return immediately without finishing the copy
    //As a side note, if the file is not locked this will not return until the copy operation completes. Async functions run synchronously
    //until the first 'await'. See the documentation for async: https://msdn.microsoft.com/en-us/library/hh156513.aspx
    FileEx.CopyWaitAsync("file1.txt", "file1.bat", 1000);

    //this is the normal way of using this kind of async function. Execution of the following lines will always occur AFTER the copy finishes
    await FileEx.CopyWaitAsync("file1.txt", "file1.readme", 1000);
    Console.WriteLine("file1.txt copied to file1.readme");

    //The following line doesn't cause a compiler error, but it doesn't make any sense either.
    FileEx.ReadAllTextWaitAsync("file1.readme", 1000);

    //To get the return value of the function, you have to use this function with the await keyword
    string text = await FileEx.ReadAllTextWaitAsync("file1.readme", 1000);
    Console.WriteLine("file1.readme says: " + text);
}

//Output:
//Me First!
//file1.txt copied to file1.readme
//file1.readme says: Text to be duplicated!
You can use the following code to check if the file can be opened with exclusive access (that is, it is not opened by another application). If the file isn't closed, you could wait a few moments and check again until the file is closed and you can safely copy it.
You should still check if File.Copy fails, because another application may open the file between the moment you check the file and the moment you copy it.
public static bool IsFileClosed(string filename)
{
    try
    {
        using (var inputStream = File.Open(filename, FileMode.Open, FileAccess.Read, FileShare.None))
        {
            return true;
        }
    }
    catch (IOException)
    {
        return false;
    }
}
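A hypothetical polling loop built on IsFileClosed, following the answer's advice to wait and still guard against the race between the check and the copy (the paths and delay are illustrative):

while (!IsFileClosed(sourcePath))
{
    System.Threading.Thread.Sleep(500); // wait a few moments, then check again
}
try
{
    File.Copy(sourcePath, destinationPath, true);
}
catch (IOException)
{
    // another application opened the file between the check and the copy; retry or log
}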
I would like to add an answer here, because this worked for me. I used time delays, while loops, everything I could think of.
I had the Windows Explorer window of the output folder open. I closed it, and everything worked like a charm.
I hope this helps someone.
