Assistance with threads and waiting for completion - C#

I have a question about dealing with threads. I am copying files from one folder to another, then zipping them up. The problem is the WinForm appears to be attempting to zip the files before they have finished copying, which in turn is causing the zip function to not complete. I did some looking around on here and, to be honest, I am having issues wrapping my head around how it works. MSDN has a nice little snippet:
// Wait on a single task with no timeout specified.
Task taskA = Task.Factory.StartNew(() => DoSomeWork(10000000));
taskA.Wait();
Console.WriteLine("taskA has completed.");
static void DoSomeWork(int val)
{
// Pretend to do something.
Thread.SpinWait(val);
}
Again, it's a bit over my head. Here is the code that I am trying to wait on to complete:
error_handling("Daily Backup Started", "BackupLog.txt");
string fileName = "";
string Source = @"C:\folder\Program";
string target = @"C:\folder\day_backup";
string datestamp = DateTime.Now.ToString("MMddyy-HHmm");
string[] files = System.IO.Directory.GetFiles(Source, "*.mdb");
foreach (string file in files)
{
fileName = System.IO.Path.GetFileName(file);
string destfile = System.IO.Path.Combine(target, fileName);
System.IO.File.Copy(file, destfile);
sub_error_handling(fileName+" has been copied", "DailyBackupLog.txt");
}
compression(@"C:\backupfolder\day_backup", @"\day_backup"+datestamp+".zip");
sub_error_handling("Files were packaged for transmission", "DailyBackupLog.txt");
Also, here is my zip code:
private void compression(string zipdir, string zipfilename)
{
try
{
using (ZipFile zip = new ZipFile())
{
zip.AddDirectory(zipdir);
zip.Comment = "This backup was created at " + System.DateTime.Now.ToString("G");
zip.Save(zipdir + zipfilename);
}
}
catch (Exception error)
{
error_handling("Incremental Backup Failed Compression was unsuccessful", "Incbackuplog.txt");
sub_error_handling(error + "", "Incbackuplog.txt");
error_handling("End Of Error Report", "Incbackuplog.txt");
}
}
It won't let me use a void method to start the new task, so I'm not sure what else to try. Any suggestions?

I'd say this is causing the hang:
zip.AddDirectory(zipdir);
zip.Comment = "This backup was created at " + System.DateTime.Now.ToString("G");
zip.Save(zipdir + zipfilename);
You are compressing all files in zipdir to a new file in the same location, so that the new file will be included in the zip file. So you are trying to include the new file inside itself, which is obviously impossible and explains, I think, why the zip process never ends.
By the way, you've set zipdir to a directory that's not the same as the one to which your code copies the files.
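The straightforward fix is to save the archive somewhere outside the directory being zipped. A minimal sketch reusing the question's DotNetZip code (the C:\folder\backups target is hypothetical; any folder outside zipdir works):
// Call it with a zip path that is NOT inside the folder being zipped, e.g.:
// compression(@"C:\folder\day_backup", @"C:\folder\backups\day_backup" + datestamp + ".zip");
private void compression(string zipdir, string zipfile)
{
    using (ZipFile zip = new ZipFile())
    {
        zip.AddDirectory(zipdir);
        zip.Comment = "This backup was created at " + System.DateTime.Now.ToString("G");
        zip.Save(zipfile); // full path outside zipdir, so the archive can't include itself
    }
}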

How to dispose/release a file being used by another process?

Using FileSystemWatcher, in the Changed event I'm using FileInfo to get the file, then copying it to a new directory, overwriting on each copy until the file stops changing:
private void Watcher_Changes(object sender, FileSystemEventArgs e)
{
try
{
var info = new FileInfo(e.FullPath);
var newSize = info.Length;
string FileN1 = "File Name : ";
string FileN2 = info.Name;
string FileN3 = " Size Changed : From ";
string FileN5 = "To";
string FileN6 = newSize.ToString();
Println(FileN1 + FileN2 + FileN3 + FileN5 + FileN6);
CopyFileOnChanged(System.IO.Path.GetDirectoryName(e.FullPath), e.FullPath);
}
catch (Exception ex)
{
PrintErr(ex);
}
}
And the copy file method :
bool makeonce = false;
string NewFileName = "";
private void CopyFileOnChanged(string Folder, string FileName)
{
if (makeonce == false)
{
string t = "";
string fn = "";
string locationToCreateFolder = Folder;
string folderName;
string date = DateTime.Now.ToString("ddd MM.dd.yyyy");
string time = DateTime.Now.ToString("HH.mm tt");
string format = "Save Game {0} {1}";
folderName = string.Format(format, date, time);
Directory.CreateDirectory(locationToCreateFolder + "\\" + folderName);
t = locationToCreateFolder + "\\" + folderName;
fn = System.IO.Path.GetFileName(FileName);
NewFileName = System.IO.Path.Combine(t, fn);
makeonce = true;
}
File.Copy(FileName, NewFileName, true);
}
The problem is that when File.Copy runs again, it throws an exception saying the file is being used by another process.
[+] File Name : New Text Document (2).txt Size Changed : From To662 At : 6/3/2022 3:56:14 PM
[+] File Name : New Text Document (2).txt Size Changed : From To662 At : 6/3/2022 3:56:14 PM
[-] System.IO.IOException: The process cannot access the file 'C:\Program Files (x86)\Win\Save Game Fri 06.03.2022 15.56 PM\New Text Document (2).txt' because it is being used by another process.
at System.IO.__Error.WinIOError(Int32 errorCode, String maybeFullPath)
at System.IO.File.InternalCopy(String sourceFileName, String destFileName, Boolean overwrite, Boolean checkHost)
at System.IO.File.Copy(String sourceFileName, String destFileName, Boolean overwrite)
at Watcher_WPF.MainWindow.CopyFileOnChanged(String Folder, String FileName) in C:\Users\Chocolade1972\Downloads\Watcher_WPF-master\Watcher_WPF-master\Watcher_WPF\MainWindow.xaml.cs:line 356
at Watcher_WPF.MainWindow.Watcher_Changes(Object sender, FileSystemEventArgs e) in C:\Users\Chocolade1972\Downloads\Watcher_WPF-master\Watcher_WPF-master\Watcher_WPF\MainWindow.xaml.cs:line 258
Line 258 is :
CopyFileOnChanged(System.IO.Path.GetDirectoryName(e.FullPath), e.FullPath);
For brevity, I will only outline the solution I created in a professional setting for invoice processing instead of giving you the complete solution (I also cannot, because the code is copyrighted).
So, with that out of the way, here we go:
What I had first was an "Inbox" folder watched by a FileSystemWatcher. I reacted to new files, but this works much the same for file changes. For each event, I enqueued an item:
private ConcurrentQueue<string> _queue = new ();
private void Watcher_Changes(object sender, FileSystemEventArgs e)
{
_queue.Enqueue(e.FullPath);
}
That's all the EventHandler did. The objective here is to handle events from the FSW as quickly as at all possible; otherwise you may run into buffer exhaustion, and the FSW will discard events! (Yes, I learned that the hard way, through bug reports and a lot of sweat :D)
The actual work was done in a separate thread, that consumed the Queue.
// Just brief display of the concept.
// This function would be used as Thread run every
// x Time, triggered by a Timer if the Thread is not still running.
private void MyWorkerRun()
{
// My Input came in mostly in batches, so I ran until the queue was empty.
// You may need to adapt to maybe only dequeue N Items for each run ...
// Whatever does the trick.
// while( _queue.Any() )
//
// Maybe only process the N amount of Items the Queue has at the
// start of the current run?
var itemsToProcess = _queue.Count;
if( itemsToProcess <= 0 ) return;
for( int i = 0; i < itemsToProcess; i++)
{
// ConcurrentQueue has no Dequeue(); TryDequeue is its thread-safe equivalent.
if (!_queue.TryDequeue(out string sourcePath)) break;
// No file there anymore? Drop it.
if(!File.Exists(sourcePath)) continue;
// TODO Construct Target-Path
string targetPath = GetTargetPath(sourcePath); // Just a dummy for this example...
// Try to copy, requeue if failed.
if(!TryCopy(sourcePath, targetPath))
{
// Requeue for later
// It will be picked up in _next_ run,
// so there should be enough time in between tries.
_queue.Enqueue(sourcePath);
}
}
}
private bool TryCopy(string source, string target){ /* TODO for OP */ }
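A minimal sketch of what that TryCopy TODO might look like (my assumption of the shape; real code would also log the failure):
private bool TryCopy(string source, string target)
{
    try
    {
        // Overwrite, so repeated change events for the same file succeed.
        File.Copy(source, target, true);
        return true;
    }
    catch (IOException)
    {
        // Typically "file is being used by another process": the writer
        // still has the file open. Signal failure so the caller requeues it.
        return false;
    }
}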
I have to add that I did this years ago. Today I would probably consider TPL DataFlow to handle the queueing and requeuing for me.
And of course, you can always spice this up. I tried to keep it as simple as possible, while showing the concept clearly.
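For illustration, the "triggered by a Timer if the Thread is not still running" plumbing mentioned in the code comments could be sketched like this (the 5-second period, the field names, and StartWorker are my assumptions, not the original code):
private System.Threading.Timer _timer;
private int _running; // 0 = idle, 1 = running

private void StartWorker()
{
    _timer = new System.Threading.Timer(_ =>
    {
        // Skip this tick entirely if the previous run is still going.
        if (Interlocked.CompareExchange(ref _running, 1, 0) != 0) return;
        try { MyWorkerRun(); }
        finally { Interlocked.Exchange(ref _running, 0); }
    }, null, TimeSpan.Zero, TimeSpan.FromSeconds(5));
}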
I later had more requirements: for example, the program should be able to be exited and pick up where it stopped when started again. It should only retry X times and then write the file into a "deadletterbox"; then more processing steps were added; then it should send an email to a certain address if the queue exceeded N entries ... you get it. You can always make it more complicated if you need to.
t = locationToCreateFolder + "\\" + folderName;
your "locationToCreateFolder" is a directory name and not a path. be cause it comes from here :
CopyFileOnChanged(System.IO.Path.GetDirectoryName(e.FullPath), e.FullPath);
So when you Combine them, the resulting path is not valid:
NewFileName = System.IO.Path.Combine(t, fn);

lock ExtractToDirectory only runs once

This function takes a few seconds to run locally. It unpacks a zip file onto the local server and is called before serving individual files in the zip. It should only run once if the extracted folder does not exist.
internal static void Init(string releaseName)
{
var unpackedFolderDirectory = Settings.Editor.FilesRootFolder + releaseName + "\\";
var dirInfo = new DirectoryInfo(unpackedFolderDirectory);
if (dirInfo.Exists) return;
lock ("UnpackLock")
{
dirInfo.Refresh();
if (dirInfo.Exists) return;
// Unpack all
var bytes = getBytesFromAzure();
using var ms = new MemoryStream(bytes);
using (var archive = new ZipArchive(ms))
{
archive.ExtractToDirectory(unpackedFolderDirectory);
}
}
}
If I make multiple requests to this function, some of them return the error:
The file 'C:\SomeFolder\SomeSubFolder\SomeFile.png' already exists.
On line -> archive.ExtractToDirectory(unpackedFolderDirectory);
I am expecting the archive.ExtractToDirectory(unpackedFolderDirectory); to only execute once, but it appears to be running multiple times.
What am I doing wrong? Is there some race condition here I'm not spotting?
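As a side note on the pattern itself (not necessarily the cause of the error): locking on a string literal is discouraged in C#, because literals are interned and shared process-wide, so any unrelated code that locks "UnpackLock" contends for the same lock; and no in-process lock helps at all if several processes (e.g. multiple IIS worker processes) run this code. A sketch of the same double-checked pattern with a dedicated lock object:
private static readonly object _unpackLock = new object();

internal static void Init(string releaseName)
{
    var unpackedFolderDirectory = Settings.Editor.FilesRootFolder + releaseName + "\\";
    if (Directory.Exists(unpackedFolderDirectory)) return;
    lock (_unpackLock)
    {
        // Re-check inside the lock: another thread may have unpacked already.
        if (Directory.Exists(unpackedFolderDirectory)) return;
        var bytes = getBytesFromAzure();
        using var ms = new MemoryStream(bytes);
        using var archive = new ZipArchive(ms);
        archive.ExtractToDirectory(unpackedFolderDirectory);
    }
}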

Why is my download from Azure storage empty?

I can connect to the Azure Storage account and can even upload a file, but when I go to download the file using DownloadToFileAsync() I get a 0kb file as a result.
I have checked and the "CloudFileDirectory" and the "CloudFile" fields are all correct, which means the connection with Azure is solid. I can even write the output from the file to the console, but I cannot seem to save it as a file.
public static string PullFromAzureStorage(string azureFileConn, string remoteFileName, string clientID)
{
var localDirectory = @"C:\cod\clients\" + clientID + @"\ftp\";
var localFileName = clientID + "_xxx_" + remoteFileName;
//Retrieve storage account from connection string
var storageAccount = CloudStorageAccount.Parse(azureFileConn);
var client = storageAccount.CreateCloudFileClient();
var share = client.GetShareReference("testing");
// Get a reference to the root directory for the share
CloudFileDirectory rootDir = share.GetRootDirectoryReference();
//Get a ref to client folder
CloudFileDirectory cloudFileDirectory = rootDir.GetDirectoryReference(clientID);
// Get a reference to the directory we created previously
CloudFileDirectory unprocessed = cloudFileDirectory.GetDirectoryReference("Unprocessed");
// Get a reference to the file
CloudFile sourceFile = unprocessed.GetFileReference(remoteFileName);
//write to console and log
Console.WriteLine("Downloading file: " + remoteFileName);
LogWriter.LogWrite("Downloading file: " + remoteFileName);
//Console.WriteLine(sourceFile.DownloadTextAsync().Result);
sourceFile.DownloadToFileAsync(Path.Combine(localDirectory, localFileName), FileMode.Create);
//write to console and log
Console.WriteLine("Download Successful!");
LogWriter.LogWrite("Download Successful!");
//delete remote file after download
//sftp.DeleteFile(remoteDirectory + remoteFileName);
return localFileName;
}
In the commented-out line of code where you write the output to the Console, you explicitly use .Result because you're calling an async method from a synchronous one. You should block on the download call in the same way, or make the entire method around it async.
The first solution would look something like this (note that DownloadToFileAsync returns a plain Task with no result, so Wait() is the blocking call):
sourceFile.DownloadToFileAsync(Path.Combine(localDirectory, localFileName), FileMode.Create).Wait();
EDIT:
As far as the difference with the comment that uses GetAwaiter().GetResult() goes: .Result and Wait() wrap any exception that might occur in an AggregateException, while GetAwaiter().GetResult() won't. Anyhow: if there's any possibility you can refactor the method to be async so you can use await, please do so.
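And for the async route, a minimal sketch of the refactor (same CloudFile API as in the question; callers would await this):
public static async Task<string> PullFromAzureStorageAsync(string azureFileConn, string remoteFileName, string clientID)
{
    var localDirectory = @"C:\cod\clients\" + clientID + @"\ftp\";
    var localFileName = clientID + "_xxx_" + remoteFileName;
    var storageAccount = CloudStorageAccount.Parse(azureFileConn);
    var share = storageAccount.CreateCloudFileClient().GetShareReference("testing");
    CloudFile sourceFile = share.GetRootDirectoryReference()
        .GetDirectoryReference(clientID)
        .GetDirectoryReference("Unprocessed")
        .GetFileReference(remoteFileName);
    // Awaiting ensures the download has finished before the method returns,
    // so the file on disk is no longer 0 KB.
    await sourceFile.DownloadToFileAsync(Path.Combine(localDirectory, localFileName), FileMode.Create);
    return localFileName;
}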

C# Windows Service Not Compressing Folder Correctly

I'm currently building a Windows service that will be used to create backups of logs. Currently, the logs are stored at the path E:\Logs, and the intent is to copy the contents, timestamp the new folder, and compress it. After this, you should have E:\Logs and E:\Logs_[Timestamp].zip. The zip will be moved to C:\Backups\ for later processing. Currently, I am using the following to try to zip the log folder:
var logDirectory = "E://Logs";
var timeStamp = DateTime.Now.ToString("yyyyMMddHHmm");
var zippedFolder = logDirectory + "_" + timeStamp + ".zip";
System.IO.Compression.ZipFile.CreateFromDirectory(logDirectory, zippedFolder);
While this appears to create a zip folder, I get the error "Windows cannot open the folder. The Compressed (zipped) Folder E:\Logs_201805161035.zip is invalid."
To address any troubleshooting issues, the service is running with an AD account that has a sufficient permission level to perform administrative tasks. Another thing to consider is that the service kicks off when its FileSystemWatcher detects a new zip folder in the path C:\Aggregate. Since there are many zip folders that are added to C:\Aggregate at once, the FileSystemWatcher creates a new Task for each zip found. You can see how this works in the following:
private void FileFoundInDrops(object sender, FileSystemEventArgs e)
{
var aggregatePath = new DirectoryInfo("C://Aggregate");
if (e.FullPath.Contains(".zip"))
{
Task task = Task.Factory.StartNew(() =>
{
try
{
var logDirectory = "E://Logs";
var timeStamp = DateTime.Now.ToString("yyyyMMddHHmm");
var zippedFolder = logDirectory + "_" + timeStamp + ".zip";
ZipFile.CreateFromDirectory(logDirectory, zippedFolder);
}
catch (Exception ex)
{
Log.WriteLine(System.DateTime.Now.ToString() + " - ERROR: " + ex);
}
});
task.Dispose();
}
}
How can I get around the error I am receiving? Any help would be appreciated!

System.IO.File.Move error - Could not find a part of the path

I have a sync software, which loads CSV files from "Incoming" folder, processes them and then moves them to the "Archive" folder.
Today, I saw the following error with this sync software:
[23/06/2014 00:06:04 AM] : Failed to move file from D:\IBI_ORDER_IMPORTER_FTP_SERVER\Template3\Fifty & Dean\Incoming\5A040K___d6f1ca45937b4ceb98d29d0db4601bf4.csv to D:\IBI_ORDER_IMPORTER_FTP_SERVER\Template3\Fifty & Dean\Archive\5A040K___d6f1ca45937b4ceb98d29d0db4601bf4.csv - Could not find a part of the path.
Here's a snippet taken out of the sync software, where the file is processed and moved:
public static void ProcessSingleUserFile(Int32 TemplateId, String ImportedBy, String FilePath)
{
// Always Rename File To Avoid Conflict
string FileName = Path.GetFileNameWithoutExtension(FilePath);
String NewFilePath = FilePath.Replace(FileName, Utils.RandomString() + "___" + FileName);
File.Move(FilePath, NewFilePath);
FilePath = NewFilePath;
// Log
SyncUtils.ConsoleLog(String.Format("Processing [ {0} as {1} ] By [ {2} ] On Template [ #{3} ]",
FileName + ".csv",
Path.GetFileName(FilePath),
ImportedBy,
TemplateId));
// Init
List<OrderDraft> myOrderDrafts = new List<OrderDraft>();
// Parsed Based On Template Id
if (TemplateId == Settings.Default.Multi_Order_Template_Id)
{
// Try Parse File
myOrderDrafts = Utils.ParseMultiImportFile(TemplateId, ImportedBy, FilePath, true);
}
else
{
// Try Parse File
myOrderDrafts.Add(Utils.ParseImportFile(TemplateId, ImportedBy, FilePath, true));
}
// Process Orders
foreach (OrderDraft myOrderDraft in myOrderDrafts)
{
/* code snipped */
}
// Archive File
File.Move(FilePath, FilePath.Replace("Incoming", "Archive"));
}
Any idea what this error means and how to circumvent it?
I wrote a cut-down version of the above to test this in a controlled environment, and I am not getting the error with this code:
static void Main(string[] args)
{
try
{
string baseDir = @"C:\Users\Administrator\Desktop\FTP_SERVER\Template3\Fifty & Dean\Incoming\";
string[] filePaths = Directory.GetFiles(baseDir, "*.csv");
foreach (string filePath in filePaths)
{
// do some work here ...
// move file
string newFilePath = filePath.Replace("Incoming", "Archive");
File.Move(filePath, newFilePath);
Console.WriteLine("File successfully moved");
}
}
catch (Exception ex)
{
Console.WriteLine("Error: " + ex.Message);
}
Console.ReadKey();
}
You need to include the checks to make sure that the paths exist at runtime and check the output, something very simple like:
if(!Directory.Exists(Path.GetDirectoryName(filePath)))
{
Console.WriteLine("filePath does not exist: " + filePath);
}
if(!Directory.Exists(Path.GetDirectoryName(newFilePath)))
{
Console.WriteLine("newFilePath does not exist: " + newFilePath);
}
File.Move(filePath, newFilePath);
The reason I am suggesting this is that under a multitasking OS, paths can momentarily become available or unavailable depending on a multitude of factors: network connectivity, permissions (pushed down by GPO at any time), firewall rules, AV exclusions getting blown away, etc. Even running low on CPU or RAM may create issues. In short, you never know exactly what occurred while your code was running if you only check path availability after the fact.
Or if your issue is intermittent, you can try and catch the error and write information to some sort of a log similarly to below:
try
{
File.Move(filePath, newFilePath);
}
catch(Exception ex)
{
if(!Directory.Exists(Path.GetDirectoryName(filePath)))
{
Console.WriteLine("filePath does not exist: " + filePath);
}
if(!Directory.Exists(Path.GetDirectoryName(newFilePath)))
{
Console.WriteLine("newFilePath does not exist: " + newFilePath);
}
}
"Could not find a part of the path" exception could also thrown from File.Move if argument used was longer than MAX_PATH (260) in .NET Framework.
So I prepend the path I used with long path syntax before passing to File.Move and it worked.
// Prepend long file path support
if( !packageFile.StartsWith( @"\\?\" ) )
    packageFile = @"\\?\" + packageFile;
See:
How to deal with files with a name longer than 259 characters?
I had this "Could not find a part of the path" error happen to me when I called:
File.Move(mapfile_path, Path.Combine(newPath, mapFile));
After some testing, I found out our server administrator had blocked user applications from writing to that directory [newPath]!
So, right-click on that directory and check the rights matrix on the Security tab to see anything that would block you.
Another cause of DirectoryNotFoundException "could not find a part of the path" thrown by File.Move can be spaces at the end of the directory name. Consider the following code:
string destinationPath = "C:\\Folder1 "; //note the space on the end
string destinationFileNameAndPath = Path.Combine(destinationPath, "file.txt");
if (!Directory.Exists(destinationPath))
Directory.CreateDirectory(destinationPath);
File.Move(sourceFileNameAndPath, destinationFileNameAndPath);
You may expect this code to succeed, since it creates the destination directory if it doesn't already exist, but Directory.CreateDirectory seems to trim the extra space on the end of the directory name, whereas File.Move does not and gives the above exception.
In this circumstance, you can work around this by trimming the extra space yourself, e.g. (depending on how you load your path variable; the below is obviously overkill for a hard-coded string):
string destinationPath = "C:\\Folder1 ".Trim();
