How to dispose/release a file being used by another process? - C#

I'm using a FileSystemWatcher. In the Changed event I use FileInfo to get the file, then copy it to a new directory, overwriting on each copy until the file stops changing:
private void Watcher_Changes(object sender, FileSystemEventArgs e)
{
    try
    {
        var info = new FileInfo(e.FullPath);
        var newSize = info.Length;
        string FileN1 = "File Name : ";
        string FileN2 = info.Name;
        string FileN3 = " Size Changed : From ";
        string FileN5 = "To";
        string FileN6 = newSize.ToString();
        Println(FileN1 + FileN2 + FileN3 + FileN5 + FileN6);
        CopyFileOnChanged(System.IO.Path.GetDirectoryName(e.FullPath), e.FullPath);
    }
    catch (Exception ex)
    {
        PrintErr(ex);
    }
}
And the copy method:
bool makeonce = false;
string NewFileName = "";

private void CopyFileOnChanged(string Folder, string FileName)
{
    if (makeonce == false)
    {
        string t = "";
        string fn = "";
        string locationToCreateFolder = Folder;
        string folderName;
        string date = DateTime.Now.ToString("ddd MM.dd.yyyy");
        string time = DateTime.Now.ToString("HH.mm tt");
        string format = "Save Game {0} {1}";
        folderName = string.Format(format, date, time);
        Directory.CreateDirectory(locationToCreateFolder + "\\" + folderName);
        t = locationToCreateFolder + "\\" + folderName;
        fn = System.IO.Path.GetFileName(FileName);
        NewFileName = System.IO.Path.Combine(t, fn);
        makeonce = true;
    }
    File.Copy(FileName, NewFileName, true);
}
The problem is that when File.Copy runs again, it throws an exception saying the file is being used by another process:
[+] File Name : New Text Document (2).txt Size Changed : From To662 At : 6/3/2022 3:56:14 PM
[+] File Name : New Text Document (2).txt Size Changed : From To662 At : 6/3/2022 3:56:14 PM
[-] System.IO.IOException: The process cannot access the file 'C:\Program Files (x86)\Win\Save Game Fri 06.03.2022 15.56 PM\New Text Document (2).txt' because it is being used by another process.
   at System.IO.__Error.WinIOError(Int32 errorCode, String maybeFullPath)
   at System.IO.File.InternalCopy(String sourceFileName, String destFileName, Boolean overwrite, Boolean checkHost)
   at System.IO.File.Copy(String sourceFileName, String destFileName, Boolean overwrite)
   at Watcher_WPF.MainWindow.CopyFileOnChanged(String Folder, String FileName) in C:\Users\Chocolade1972\Downloads\Watcher_WPF-master\Watcher_WPF-master\Watcher_WPF\MainWindow.xaml.cs:line 356
   at Watcher_WPF.MainWindow.Watcher_Changes(Object sender, FileSystemEventArgs e) in C:\Users\Chocolade1972\Downloads\Watcher_WPF-master\Watcher_WPF-master\Watcher_WPF\MainWindow.xaml.cs:line 258
Line 258 is:
CopyFileOnChanged(System.IO.Path.GetDirectoryName(e.FullPath), e.FullPath);

For brevity I will only outline the solution I created in a professional setting for invoice processing instead of giving you the complete solution (which I also cannot do, because the code is copyrighted).
So that out of the way, here we go:
What I had first was an "Inbox" folder that a FileSystemWatcher watched. I reacted to new files, but that works much the same for changed files. For each event, I enqueued an item:
private ConcurrentQueue<string> _queue = new ();

private void Watcher_Changes(object sender, FileSystemEventArgs e)
{
    _queue.Enqueue(e.FullPath);
}
That's all the event handler did. The objective here is to handle events from the FSW as quickly as possible; otherwise you may exhaust its internal buffer and the FSW will discard events! (Yes, I learned that the hard way, through bug reports and a lot of sweat. :D)
The actual work was done in a separate thread that consumed the queue.
// Just a brief display of the concept.
// This method would be run on a worker thread, triggered by a Timer
// every x time units if it is not still running from the last trigger.
private void MyWorkerRun()
{
    // My input came in mostly in batches, so I ran until the queue was
    // empty:  while (!_queue.IsEmpty) { ... }
    // You may need to adapt and only dequeue N items per run ...
    // whatever does the trick. Here: only process the number of items
    // the queue holds at the start of the current run.
    var itemsToProcess = _queue.Count;
    if (itemsToProcess <= 0) return;

    for (int i = 0; i < itemsToProcess; i++)
    {
        // ConcurrentQueue is thread-safe; TryDequeue returns false
        // if the queue is empty.
        if (!_queue.TryDequeue(out string sourcePath)) break;

        // No file there anymore? Drop it.
        if (!File.Exists(sourcePath)) continue;

        // TODO: construct the target path.
        string targetPath = GetTargetPath(sourcePath); // just a dummy for this example

        // Try to copy, requeue if it failed.
        if (!TryCopy(sourcePath, targetPath))
        {
            // Requeue for later. It will be picked up in the _next_ run,
            // so there should be enough time between tries.
            _queue.Enqueue(sourcePath);
        }
    }
}
private bool TryCopy(string source, string target){ /* TODO for OP */ }
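TryCopy was left as an exercise; here is a minimal sketch of one way to fill it in. This is my assumption, not the original code: open the source with FileShare.ReadWrite so a writer that still holds the file open doesn't block the read, and report any IOException as a failed attempt so the caller requeues the item.

private bool TryCopy(string source, string target)
{
    try
    {
        // Open the source explicitly with FileShare.ReadWrite so it can be
        // read even while the writing process still has it open.
        using (var src = new FileStream(source, FileMode.Open, FileAccess.Read, FileShare.ReadWrite))
        using (var dst = new FileStream(target, FileMode.Create, FileAccess.Write, FileShare.None))
        {
            src.CopyTo(dst);
        }
        return true;
    }
    catch (IOException)
    {
        // Still locked (or a transient I/O problem): report failure so the
        // caller requeues the path and retries on a later run.
        return false;
    }
}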
I have to add that I did this years ago. Today I would probably consider TPL Dataflow to handle the queueing and requeuing for me.
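For illustration, a minimal sketch of what that could look like with an ActionBlock from the System.Threading.Tasks.Dataflow NuGet package; GetTargetPath and TryCopy are the same placeholders as above, and the 5-second back-off before requeuing is an arbitrary choice:

using System;
using System.IO;
using System.Threading.Tasks;
using System.Threading.Tasks.Dataflow;

// Declared first and assigned after, so the lambda can requeue into itself.
ActionBlock<string> copyBlock = null;
copyBlock = new ActionBlock<string>(async sourcePath =>
{
    if (!File.Exists(sourcePath)) return;          // file gone? drop it
    string targetPath = GetTargetPath(sourcePath); // same dummy as above
    if (!TryCopy(sourcePath, targetPath))
    {
        await Task.Delay(TimeSpan.FromSeconds(5)); // back off, then requeue
        copyBlock.Post(sourcePath);
    }
}, new ExecutionDataflowBlockOptions { MaxDegreeOfParallelism = 1 });

// The FileSystemWatcher handler then shrinks to a single call:
// copyBlock.Post(e.FullPath);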
And of course, you can always spice this up. I tried to keep it as simple as possible, while showing the concept clearly.
I later had more requirements: for example, the program should be able to be exited and pick up where it stopped when started again; it should only retry X times and then write the file to a "dead letter box"; then more processing steps were added; then it should send an email to a certain address if the queue exceeded N entries ... you get it. You can always make it more complicated if you need to.

t = locationToCreateFolder + "\\" + folderName;
your "locationToCreateFolder" is a directory name and not a path. be cause it comes from here :
CopyFileOnChanged(System.IO.Path.GetDirectoryName(e.FullPath), e.FullPath);
so when you Combine them, the resulting path is not valid:
NewFileName = System.IO.Path.Combine(t, fn);

Related

C# System.IO.File.Copy issue

So I needed to make a quick Windows Form app to process a list of files with their directories and copy them to a new dir. Normally I would use a batch file to do this e.g.
#echo off
mkdir "C:\Users\%username%\Desktop\inst"
:A
ping 127.0.0.1 -n 1 > nul
xcopy "C:\Users\%username%\Desktop\M14.0.1512.400-enu-x64.exe" "C:\Users\%username%\Desktop\inst" /y
xcopy "C:\Users\%username%\AppData\Local\Temp\vcredist.exe" "C:\Users\%username%\Desktop\inst" /y
GOTO A
I know I'm not using best practices etc., but it's a quick script I came up with to help speed up my work. Now, doing this for a couple of files is fine, but some of the applications I work on have 40+ files that I need to copy over, and it's a bit of a pain having to write a batch file each time.
So I slapped together a simple WinForms app with an input field and a button to start the process.
The user pastes a list of files (with path and directory, e.g. C:\foo\bar\hello.txt) into the input field, and the app takes each line from the text box and shoves it into a list after doing some basic filtering, e.g. removing \n, \t, \r and encapsulating the strings in double quotes if they aren't already.
new Thread(() =>
{
    Thread.CurrentThread.IsBackground = true;
    while (run)
    {
        foreach (string path in paths)
        {
            Thread.Sleep(100);
            try
            {
                File.Copy(path, dir, true);
            }
            catch (Exception e)
            {
                g.log.WriteToLog("Failed to copy asset: " + e.ToString());
            }
        }
    }
}).Start();
When I run that this is what I get in the logs:
//LOGS
21/03/2019 11:25:56 - Failed to copy asset: System.ArgumentException: Illegal characters in path.
   at System.IO.LongPathHelper.Normalize(String path, UInt32 maxPathLength, Boolean checkInvalidCharacters, Boolean expandShortPaths)
   at System.IO.Path.NormalizePath(String path, Boolean fullCheck, Int32 maxPathLength, Boolean expandShortPaths)
   at System.IO.Path.GetFullPathInternal(String path)
   at System.IO.File.InternalCopy(String sourceFileName, String destFileName, Boolean overwrite, Boolean checkHost)
   at S2_PackagingTool.Application_Extractor.ExtractAssets() in C:\Users\xxxx\Desktop\GitHub\S2-EnvPrepTool\Application_Extractor.cs:line 98
21/03/2019 11:25:56 - Path: "C:\Program Files\Freedom Scientific\Runtime JAWS\18.0\jrt.exe" Dir: "C:\Users\xxxx\Desktop\GitHub\S2-EnvPrepTool\bin\Debug\Extracted Assets"
The second line in the logs is a value dump from the path and dir variables.
When I run the code without the while loop and add the path and dir in manually e.g.
File.Copy(@"C:\foo\bar\hello.txt", @"C:\hello\world", true);
or
File.Copy("C:\\foo\\bar\\hello.txt", "C:\\hello\\world", true);
It works fine.
I will also attach the filter method in case you guys want to see it. Keep in mind this is quick and dirty, so yeah:
public string QuoteEncapsulationFilter(string s)
{
    s = s.Replace("\n", String.Empty);
    s = s.Replace("\r", String.Empty);
    s = s.Replace("\t", String.Empty);
    s = s.Replace("\\", "\\\\");
    if (!s.Contains("\""))
    {
        s = "\"" + s + "\"";
    }
    return s;
}
I have tried looking for an answer everywhere with no luck. Can someone please shed some light on what I am doing wrong here? If you need me to provide any more information, please let me know.
Thanks!
You're missing the file name in the File.Copy(string sourceFileName, string destFileName, bool overwrite) call. Your dir path needs the file name.
https://learn.microsoft.com/en-us/dotnet/api/system.io.file.copy?view=netframework-4.7.2
says the following:
destFileName
String. The name of the destination file. This cannot be a directory.
Edit:
To answer your second question in the comments:
new Thread(() =>
{
    Thread.CurrentThread.IsBackground = true;
    while (run)
    {
        foreach (string path in paths)
        {
            Thread.Sleep(100);
            try
            {
                var fileName = Path.GetFileName(path);             // get the file name
                var fullDestination = Path.Combine(dir, fileName); // complete the destination path
                File.Copy(path, fullDestination, true);
            }
            catch (Exception e)
            {
                g.log.WriteToLog("Failed to copy asset: " + e.ToString());
            }
        }
    }
}).Start();

SSIS Script Task for moving files based on their file extension

I have the following code contained within a Script Task in SSIS 2012.
public void Main()
{
    string inputDir = (string)Dts.Variables["User::InputDirectory"].Value;
    string CSVFolder = (string)Dts.Variables["User::CSVFolder"].Value;
    string XMLFolder = (string)Dts.Variables["User::XMLFolder"].Value;
    string XLSXFolder = (string)Dts.Variables["User::XLSXFolder"].Value;
    bool isXMLFolderEmpty = (bool)Dts.Variables["User::isXMLFolderEmpty"].Value;
    bool isCSVFolderEmpty = (bool)Dts.Variables["User::isCSVFolderEmpty"].Value;
    bool isXLSXFolderEmpty = (bool)Dts.Variables["User::isXLSXFolderEmpty"].Value;

    string[] fileNames = Directory.GetFiles(inputDir);
    if (fileNames.Length > 0)
    {
        foreach (string inputFile in fileNames)
        {
            string FileExtension = Path.GetExtension(inputFile);
            if (FileExtension == ".csv")
            {
                File.Move(inputDir + "\\" + inputFile, CSVFolder + "\\" + inputFile);
                isCSVFolderEmpty = false;
            }
            else if (FileExtension == ".xlsx")
            {
                File.Move(inputDir + "\\" + inputFile, XLSXFolder + "\\" + inputFile);
                isXLSXFolderEmpty = false;
            }
            else if (FileExtension == ".xml")
            {
                File.Move(inputDir + "\\" + inputFile, XMLFolder + "\\" + inputFile);
                isXMLFolderEmpty = false;
            }
        }
    }
    Dts.TaskResult = (int)ScriptResults.Success;
}
However, when I execute the Script Task I get the following error:
DTS Script Task has encountered an exception in User code: Exception has been thrown by the target of invocation.
Can anybody point out what is going wrong? All of the variable names are correct.
Your File.Move call is incorrect.
You might also be interested in Path.Combine, as it handles building paths more gracefully than blind string concatenation.
The documentation for Directory.GetFiles indicates:
Returns the names of files (including their paths) in the specified directory
so inputFile will already be in the form C:\ssisdata\so_35605920\Foo.csv, and your first parameter to File.Move is simply inputFile.
So really the challenge becomes changing the folder path in the inputFile variable to the targeted one (CSV, XML, or XLSX). A lazy hack would be to call string.Replace, specifying the inputDir and the new directory; I'd go with something a bit more elegant in case there's weirdness with UNC or relative paths.
Path.GetFileName gives me the file name and extension, so using that plus Path.Combine yields the correct final path for our Move operation:
Path.Combine(CSVFolder, Path.GetFileName(inputFile));
Thus
File.Move(inputFile, Path.Combine(CSVFolder, Path.GetFileName(inputFile)));
Visual Studio is rather unhappy at the moment, so pardon the lack of precision in the suggested code; if there are typos, you have the references to Books Online and the logic behind them.
Also, try/catch blocks are exceptionally helpful in defensive coding as will be testing to ensure the folders exist, there's no process with a lock on the file, etc.
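Putting those pieces together, the corrected loop might look like this; a sketch along the lines described above, not tested inside an actual Script Task:

foreach (string inputFile in Directory.GetFiles(inputDir))
{
    // inputFile already carries the full path, so only the name is needed.
    string fileName = Path.GetFileName(inputFile);
    switch (Path.GetExtension(inputFile).ToLowerInvariant())
    {
        case ".csv":
            File.Move(inputFile, Path.Combine(CSVFolder, fileName));
            isCSVFolderEmpty = false;
            break;
        case ".xlsx":
            File.Move(inputFile, Path.Combine(XLSXFolder, fileName));
            isXLSXFolderEmpty = false;
            break;
        case ".xml":
            File.Move(inputFile, Path.Combine(XMLFolder, fileName));
            isXMLFolderEmpty = false;
            break;
    }
}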

System.IO.File.Move error - Could not find a part of the path

I have sync software which loads CSV files from an "Incoming" folder, processes them, and then moves them to the "Archive" folder.
Today, I saw the following error with this sync software:
[23/06/2014 00:06:04 AM] : Failed to move file from
D:\IBI_ORDER_IMPORTER_FTP_SERVER\Template3\Fifty & Dean\Incoming\5A040K___d6f1ca45937b4ceb98d29d0db4601bf4.csv to
D:\IBI_ORDER_IMPORTER_FTP_SERVER\Template3\Fifty & Dean\Archive\5A040K___d6f1ca45937b4ceb98d29d0db4601bf4.csv
- Could not find a part of the path.
Here's a snippet taken out of the sync software, where the file is processed and moved:
public static void ProcessSingleUserFile(Int32 TemplateId, String ImportedBy, String FilePath)
{
    // Always rename file to avoid conflict
    string FileName = Path.GetFileNameWithoutExtension(FilePath);
    String NewFilePath = FilePath.Replace(FileName, Utils.RandomString() + "___" + FileName);
    File.Move(FilePath, NewFilePath);
    FilePath = NewFilePath;

    // Log
    SyncUtils.ConsoleLog(String.Format("Processing [ {0} as {1} ] By [ {2} ] On Template [ #{3} ]",
        FileName + ".csv",
        Path.GetFileName(FilePath),
        ImportedBy,
        TemplateId));

    // Init
    List<OrderDraft> myOrderDrafts = new List<OrderDraft>();

    // Parse based on template id
    if (TemplateId == Settings.Default.Multi_Order_Template_Id)
    {
        // Try parse file
        myOrderDrafts = Utils.ParseMultiImportFile(TemplateId, ImportedBy, FilePath, true);
    }
    else
    {
        // Try parse file
        myOrderDrafts.Add(Utils.ParseImportFile(TemplateId, ImportedBy, FilePath, true));
    }

    // Process orders
    foreach (OrderDraft myOrderDraft in myOrderDrafts)
    {
        /* code snipped */
    }

    // Archive file
    File.Move(FilePath, FilePath.Replace("Incoming", "Archive"));
}
Any idea what this error means and how to circumvent it?
I wrote a cut-down version of the above to test this in a controlled environment, and I am not getting the error with this code:
static void Main(string[] args)
{
    try
    {
        string baseDir = @"C:\Users\Administrator\Desktop\FTP_SERVER\Template3\Fifty & Dean\Incoming\";
        string[] filePaths = Directory.GetFiles(baseDir, "*.csv");
        foreach (string filePath in filePaths)
        {
            // do some work here ...

            // move file
            string newFilePath = filePath.Replace("Incoming", "Archive");
            File.Move(filePath, newFilePath);
            Console.WriteLine("File successfully moved");
        }
    }
    catch (Exception ex)
    {
        Console.WriteLine("Error: " + ex.Message);
    }
    Console.ReadKey();
}
You need to include the checks to make sure that the paths exist at runtime and check the output, something very simple like:
if (!Directory.Exists(Path.GetDirectoryName(filePath)))
{
    Console.WriteLine("filePath does not exist: " + filePath);
}
if (!Directory.Exists(Path.GetDirectoryName(newFilePath)))
{
    Console.WriteLine("newFilePath does not exist: " + newFilePath);
}
File.Move(filePath, newFilePath);
The reason I am suggesting this method is that under a multitasking OS the paths can momentarily become available or unavailable depending on a multitude of factors: network connectivity, permissions (pushed down by GPO at any time), firewall rules, AV exclusions getting blown away, etc. Even running low on CPU or RAM may create issues. In short, you never know what exactly occurred while your code was running if you only check path availability after the fact.
Or, if your issue is intermittent, you can catch the error and write information to some sort of log, similar to below:
try
{
    File.Move(filePath, newFilePath);
}
catch (Exception ex)
{
    if (!Directory.Exists(Path.GetDirectoryName(filePath)))
    {
        Console.WriteLine("filePath does not exist: " + filePath);
    }
    if (!Directory.Exists(Path.GetDirectoryName(newFilePath)))
    {
        Console.WriteLine("newFilePath does not exist: " + newFilePath);
    }
}
"Could not find a part of the path" exception could also thrown from File.Move if argument used was longer than MAX_PATH (260) in .NET Framework.
So I prepend the path I used with long path syntax before passing to File.Move and it worked.
// Prepend long file path support
if (!packageFile.StartsWith(@"\\?\"))
    packageFile = @"\\?\" + packageFile;
See:
How to deal with files with a name longer than 259 characters?
I had this "Could not find a part of the path" error happen to me when I called File.Move(mapfile_path, Path.Combine(newPath, mapFile)). After some testing, I found out that our server administrator had blocked any user application from writing to that directory [newPath]! So right-click the directory and check the rights matrix on the Security tab for anything that would block you.
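If you would rather detect this from code than by inspection, a small probe like the following can tell you whether the process can actually write to the target directory before you call File.Move. This helper is my own sketch, not part of the original answer:

static bool CanWriteTo(string directory)
{
    try
    {
        // Create and immediately delete a uniquely named probe file.
        string probe = Path.Combine(directory, Guid.NewGuid().ToString("N") + ".probe");
        using (File.Create(probe)) { }
        File.Delete(probe);
        return true;
    }
    catch (UnauthorizedAccessException) { return false; }
    catch (IOException) { return false; }
}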
Another cause of the DirectoryNotFoundException "could not find a part of the path" thrown by File.Move can be spaces at the end of the directory name. Consider the following code:
string destinationPath = "C:\\Folder1 "; // note the space on the end
string destinationFileNameAndPath = Path.Combine(destinationPath, "file.txt");
if (!Directory.Exists(destinationPath))
    Directory.CreateDirectory(destinationPath);
File.Move(sourceFileNameAndPath, destinationFileNameAndPath);
You may expect this code to succeed, since it creates the destination directory if it doesn't already exist, but Directory.CreateDirectory seems to trim the extra space on the end of the directory name, whereas File.Move does not and gives the above exception.
In this circumstance, you can work around it by trimming the extra space yourself, e.g. (depending on how you load your path variable; the below is obviously overkill for a hard-coded string):
string destinationPath = "C:\\Folder1 ".Trim();

Assistance with Threads and waiting for completion

I have a question about dealing with threads. I am copying files from one folder to another, then zipping them up. The problem is that the WinForm appears to be attempting to zip the files before they have finished copying, which in turn causes the zip function to not complete. I did some looking around on here and, to be honest, I am having trouble wrapping my head around how it works. MSDN has a nice little snippet:
// Wait on a single task with no timeout specified.
Task taskA = Task.Factory.StartNew(() => DoSomeWork(10000000));
taskA.Wait();
Console.WriteLine("taskA has completed.");

static void DoSomeWork(int val)
{
    // Pretend to do something.
    Thread.SpinWait(val);
}
Again, it's a bit over my head. Here is the code whose completion I'm trying to wait for:
error_handling("Daily Backup Started", "BackupLog.txt");
string fileName = "";
string Source = @"C:\folder\Program";
string target = @"C:\folder\day_backup";
string datestamp = DateTime.Now.ToString("MMddyy-HHmm");
string[] files = System.IO.Directory.GetFiles(Source, "*.mdb");
foreach (string file in files)
{
    fileName = System.IO.Path.GetFileName(file);
    string destfile = System.IO.Path.Combine(target, fileName);
    System.IO.File.Copy(file, destfile);
    sub_error_handling(fileName + " has been copied", "DailyBackupLog.txt");
}
compression(@"C:\backupfolder\day_backup", @"\day_backup" + datestamp + ".zip");
sub_error_handling("Files were packaged for transmission", "DailyBackupLog.txt");
And here is my zip code:
private void compression(string zipdir, string zipfilename)
{
    try
    {
        using (ZipFile zip = new ZipFile())
        {
            zip.AddDirectory(zipdir);
            zip.Comment = "This backup was created at " + System.DateTime.Now.ToString("G");
            zip.Save(zipdir + zipfilename);
        }
    }
    catch (Exception error)
    {
        error_handling("Incremental Backup Failed Compression was unsuccessful", "Incbackuplog.txt");
        sub_error_handling(error + "", "Incbackuplog.txt");
        error_handling("End Of Error Report", "Incbackuplog.txt");
    }
}
It won't let me use a void method as the new task, so I'm not sure what else to try. Any suggestions?
I'd say this is causing the hang:
zip.AddDirectory(zipdir);
zip.Comment = "This backup was created at " + System.DateTime.Now.ToString("G");
zip.Save(zipdir + zipfilename);
You are compressing all files in zipdir to a new file in the same location, so that the new file will be included in the zip file. So you are trying to include the new file inside itself, which is obviously impossible and explains, I think, why the zip process never ends.
By the way, you've set zipdir to a directory that's not the same as the one to which your code copies the files.
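One way to fix it, sketched against the same DotNetZip-style ZipFile API used above (saving to the parent directory is my assumption; any location outside zipdir works):

// Inside compression(...), replace zip.Save(zipdir + zipfilename) with:
string parent = System.IO.Path.GetDirectoryName(zipdir.TrimEnd('\\'));
// Save the archive OUTSIDE the directory being zipped, so the zip
// file cannot end up trying to include itself.
zip.Save(System.IO.Path.Combine(parent, zipfilename.TrimStart('\\')));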

File Watcher Class With Drop Box

I'm trying to write a small utility service that will detect when items have been added to a synced Dropbox folder, wait (to allow a full sync), and then move the items into a date-stamped staging folder for further processing. Simple enough ...
Here is my code:
static void Main(string[] args)
{
    var watcher = new FileSystemWatcher();
    string _path = @"E:\IMPORT\Dropbox\";
    watcher.Path = _path;
    watcher.EnableRaisingEvents = true;
    watcher.Created += new FileSystemEventHandler(watcher_Created);
    Console.WriteLine("FileSystemWatcher ready and listening to changes in :\n\n" + _path);
    Console.ReadLine();
}
static void watcher_Created(object sender, FileSystemEventArgs e)
{
    Thread.Sleep(3000);
    Console.WriteLine(e.Name + " file has been created.");
    string filename = Path.GetFileName(e.FullPath);
    string path = @"E:\IMPORT\Staging\" + DateTime.Now.ToFileTime().ToString() + @"\";
    try
    {
        Directory.CreateDirectory(path);
    }
    catch (Exception ex)
    {
        Console.WriteLine("Error: " + ex.ToString());
    }
    try
    {
        File.Move(e.FullPath, path + filename);
    }
    catch (Exception ex)
    {
        Console.WriteLine("Error: " + ex.ToString());
    }
}
This code works fine if a single item is added to the synced directory, but multiple items will be added, and there needs to be a delay so items can finish syncing into the Dropbox folder. Any ideas on how I can accomplish this?
I think your best option here is to remove the FileSystemWatcher and replace it with your own periodic monitoring of the directory (for example, a loop in a BackgroundWorker thread or a Timer-triggered event).
In this design, you can delay processing of the file for as long as you need by comparing the timestamp on the file with the current time and only processing the file when you think enough time has passed.
This design will also support restarting the application with files already present in the directory which the FileSystemWatcher approach probably will not.
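A minimal sketch of that polling approach, reusing the paths from the question; the 3-second settle window and 5-second scan interval are assumptions to adjust to taste:

using System;
using System.IO;
using System.Threading;

class DropboxPoller
{
    const string Inbox = @"E:\IMPORT\Dropbox\";
    const string Staging = @"E:\IMPORT\Staging\";
    static readonly TimeSpan SettleTime = TimeSpan.FromSeconds(3);

    static void Main()
    {
        // Scan the inbox every 5 seconds until Enter is pressed.
        using (var timer = new Timer(_ => Scan(), null, TimeSpan.Zero, TimeSpan.FromSeconds(5)))
        {
            Console.ReadLine();
        }
    }

    static void Scan()
    {
        string batchFolder = Path.Combine(Staging, DateTime.Now.ToFileTime().ToString());
        foreach (string file in Directory.GetFiles(Inbox))
        {
            // Only move files that haven't been written to recently,
            // i.e. that Dropbox has (probably) finished syncing.
            if (DateTime.Now - File.GetLastWriteTime(file) < SettleTime) continue;

            Directory.CreateDirectory(batchFolder);
            File.Move(file, Path.Combine(batchFolder, Path.GetFileName(file)));
        }
    }
}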
