I'm trying to write a small utility service that will detect when items have been added to a synced Dropbox folder, wait (to allow the sync to finish), and then move the items into a date-stamped staging folder for further processing. Simple enough ...
Here is my code:
static void Main(string[] args)
{
    var watcher = new FileSystemWatcher();
    string _path = @"E:\IMPORT\Dropbox\";
    watcher.Path = _path;
    watcher.Created += new FileSystemEventHandler(watcher_Created);
    watcher.EnableRaisingEvents = true;
    Console.WriteLine("FileSystemWatcher ready and listening to changes in :\n\n" + _path);
    Console.ReadLine();
}
static void watcher_Created(object sender, FileSystemEventArgs e)
{
    Thread.Sleep(3000);
    Console.WriteLine(e.Name + " file has been created.");
    string filename = Path.GetFileName(e.FullPath);
    string path = @"E:\IMPORT\Staging\"
        + DateTime.Now.ToFileTime().ToString()
        + @"\";
    try
    {
        Directory.CreateDirectory(path);
    }
    catch (Exception ex)
    {
        Console.WriteLine("Error: " + ex.ToString());
    }
    try
    {
        File.Move(e.FullPath, path + filename);
    }
    catch (Exception ex)
    {
        Console.WriteLine("Error: " + ex.ToString());
    }
}
This code would work fine if only one item were added to the synced directory. However, multiple items will be added, and there needs to be a delay while items finish syncing to the Dropbox folder. Any ideas on how I can accomplish this?
I think that your best option here is to remove FileSystemWatcher and replace it with your own periodic monitoring of the directory (for example, a loop in a BackgroundWorker thread or a Timer-triggered event).
In this design, you can delay processing of the file for as long as you need by comparing the timestamp on the file with current time and only processing the file when you think that enough time has passed.
This design will also support restarting the application with files already present in the directory which the FileSystemWatcher approach probably will not.
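A minimal sketch of that polling approach (the paths and the "settle" threshold are placeholders, not values from the question; tune the threshold to your sync times):

```csharp
using System;
using System.IO;

static class DropboxPoller
{
    // Move every file in watchDir that hasn't been written to for at least
    // 'settle' time into a new timestamped folder under stagingRoot.
    public static void MoveSettledFiles(string watchDir, string stagingRoot, TimeSpan settle)
    {
        string staging = Path.Combine(stagingRoot, DateTime.Now.ToString("yyyyMMddHHmmss"));
        foreach (string file in Directory.GetFiles(watchDir))
        {
            if (DateTime.Now - File.GetLastWriteTime(file) < settle)
                continue; // possibly still syncing; pick it up on the next pass

            Directory.CreateDirectory(staging); // created lazily, only when needed
            File.Move(file, Path.Combine(staging, Path.GetFileName(file)));
        }
    }
}
```

Calling this from a `System.Timers.Timer` callback every few seconds also handles files that were already present at startup: anything not yet settled simply waits for a later pass.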
Using FileSystemWatcher, in the Changed event I'm using FileInfo to get the file and then copy it to a new directory, overwriting on each copy until the file stops changing:
private void Watcher_Changes(object sender, FileSystemEventArgs e)
{
try
{
var info = new FileInfo(e.FullPath);
var newSize = info.Length;
string FileN1 = "File Name : ";
string FileN2 = info.Name;
string FileN3 = " Size Changed : From ";
string FileN5 = "To";
string FileN6 = newSize.ToString();
Println(FileN1 + FileN2 + FileN3 + FileN5 + FileN6);
CopyFileOnChanged(System.IO.Path.GetDirectoryName(e.FullPath), e.FullPath);
}
catch (Exception ex)
{
PrintErr(ex);
}
}
And the copy file method :
bool makeonce = false;
string NewFileName = "";
private void CopyFileOnChanged(string Folder, string FileName)
{
if (makeonce == false)
{
string t = "";
string fn = "";
string locationToCreateFolder = Folder;
string folderName;
string date = DateTime.Now.ToString("ddd MM.dd.yyyy");
string time = DateTime.Now.ToString("HH.mm tt");
string format = "Save Game {0} {1}";
folderName = string.Format(format, date, time);
Directory.CreateDirectory(locationToCreateFolder + "\\" + folderName);
t = locationToCreateFolder + "\\" + folderName;
fn = System.IO.Path.GetFileName(FileName);
NewFileName = System.IO.Path.Combine(t, fn);
makeonce = true;
}
File.Copy(FileName, NewFileName, true);
}
The problem is that when File.Copy runs again, it throws an exception saying the file is being used by another process.
[+] File Name : New Text Document (2).txt Size Changed : From To662 At : 6/3/2022 3:56:14 PM
[+] File Name : New Text Document (2).txt Size Changed : From To662 At : 6/3/2022 3:56:14 PM
[-] System.IO.IOException: The process cannot access the file 'C:\Program Files (x86)\Win\Save Game Fri 06.03.2022 15.56 PM\New Text Document (2).txt' because it is being used by another process.
   at System.IO.__Error.WinIOError(Int32 errorCode, String maybeFullPath)
   at System.IO.File.InternalCopy(String sourceFileName, String destFileName, Boolean overwrite, Boolean checkHost)
   at System.IO.File.Copy(String sourceFileName, String destFileName, Boolean overwrite)
   at Watcher_WPF.MainWindow.CopyFileOnChanged(String Folder, String FileName) in C:\Users\Chocolade1972\Downloads\Watcher_WPF-master\Watcher_WPF-master\Watcher_WPF\MainWindow.xaml.cs:line 356
   at Watcher_WPF.MainWindow.Watcher_Changes(Object sender, FileSystemEventArgs e) in C:\Users\Chocolade1972\Downloads\Watcher_WPF-master\Watcher_WPF-master\Watcher_WPF\MainWindow.xaml.cs:line 258
Line 258 is :
CopyFileOnChanged(System.IO.Path.GetDirectoryName(e.FullPath), e.FullPath);
For brevity, I will only outline the solution I created in a professional setting for invoice processing instead of giving you the complete solution (I also cannot, because the code is copyrighted).
With that out of the way, here we go:
What I had first was an "Inbox" folder watched by a FileSystemWatcher. I reacted to new files, but the same approach works for file changes. For each event, I enqueued an item:
private ConcurrentQueue<string> _queue = new ();
private void Watcher_Changes(object sender, FileSystemEventArgs e)
{
_queue.Enqueue(e.FullPath);
}
That's all the event handler did. The objective here is to handle events from the FSW as quickly as possible. Otherwise you may run into buffer exhaustion and the FSW will discard events! (Yes, I learned that the hard way: through bug reports and a lot of sweat :D)
The actual work was done in a separate thread, that consumed the Queue.
// Just a brief display of the concept.
// This method would run on a worker thread every
// x time, triggered by a Timer, if the thread is not still running.
private void MyWorkerRun()
{
    // My input came in mostly in batches, so I ran until the queue was empty.
    // You may need to adapt and maybe only dequeue N items for each run ...
    // Whatever does the trick.
    // while( _queue.Any() )
    //
    // Maybe only process the number of items the queue held at the
    // start of the current run?
    var itemsToProcess = _queue.Count;
    if (itemsToProcess <= 0) return;
    for (int i = 0; i < itemsToProcess; i++)
    {
        // ConcurrentQueue has no Dequeue; TryDequeue is the thread-safe equivalent.
        if (!_queue.TryDequeue(out string sourcePath)) break;
        // No file there anymore? Drop it.
        if (!File.Exists(sourcePath)) continue;
        // TODO Construct target path
        string targetPath = GetTargetPath(sourcePath); // Just a dummy for this example...
        // Try to copy, requeue if it failed.
        if (!TryCopy(sourcePath, targetPath))
        {
            // Requeue for later.
            // It will be picked up in the _next_ run,
            // so there should be enough time between tries.
            _queue.Enqueue(sourcePath);
        }
    }
}
private bool TryCopy(string source, string target){ /* TODO for OP */ }
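The TryCopy left as a TODO above could, for instance, swallow the sharing violation and report failure so the caller requeues the item (a sketch under that assumption, not the original code):

```csharp
using System;
using System.IO;

static class CopyHelper
{
    // Returns true on success; false if the file is locked or otherwise
    // unavailable, so the caller can requeue it for the next run.
    public static bool TryCopy(string source, string target)
    {
        try
        {
            string dir = Path.GetDirectoryName(target);
            if (!string.IsNullOrEmpty(dir)) Directory.CreateDirectory(dir);
            File.Copy(source, target, overwrite: true);
            return true;
        }
        catch (IOException)
        {
            // Typically "file is being used by another process" while it is
            // still being written; just report failure and try again later.
            return false;
        }
    }
}
```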
I have to add that I did this years ago. Today I would probably consider TPL DataFlow to handle the queueing and requeuing for me.
And of course, you can always spice this up. I tried to keep it as simple as possible, while showing the concept clearly.
I later had more requirements: for example, the program should be able to be exited and pick up from where it stopped when started again. It should only retry X times, then write the file into a "deadletterbox"; then more processing steps were added; then it should send an email to a certain address if the queue exceeded N entries ... you get it. You can always make it more complicated if you need to.
t = locationToCreateFolder + "\\" + folderName;
Your "locationToCreateFolder" is a directory name and not a path, because it comes from here:
CopyFileOnChanged(System.IO.Path.GetDirectoryName(e.FullPath), e.FullPath);
so when you call Path.Combine, the resulting path is not valid:
NewFileName = System.IO.Path.Combine(t, fn);
I'm currently building a Windows service that will be used to create backups of logs. Currently the logs are stored at the path E:\Logs, and the intent is to copy the contents, timestamp the new folder, and compress it. Afterwards, you should have E:\Logs and E:\Logs_[Timestamp].zip. The zip will be moved to C:\Backups\ for later processing. Currently, I am using the following to try to zip the log folder:
var logDirectory = "E://Logs";
var timeStamp = DateTime.Now.ToString("yyyyMMddHHmm");
var zippedFolder = logDirectory + "_" + timeStamp + ".zip";
System.IO.Compression.ZipFile.CreateFromDirectory(logDirectory, zippedFolder);
While this appears to create a zip folder, I get the error Windows cannot open the folder. The Compressed (zipped) Folder E:\Logs_201805161035.zip is invalid.
To preempt any troubleshooting questions: the service runs under an AD account that has sufficient permissions to perform administrative tasks. Another thing to consider is that the service kicks off when its FileSystemWatcher detects a new zip file in the path C:\Aggregate. Since many zip files are added to C:\Aggregate at once, the FileSystemWatcher creates a new Task for each zip found. You can see how this works in the following:
private void FileFoundInDrops(object sender, FileSystemEventArgs e)
{
var aggregatePath = new DirectoryInfo("C://Aggregate");
if (e.FullPath.Contains(".zip"))
{
Task task = Task.Factory.StartNew(() =>
{
try
{
var logDirectory = "E://Logs";
var timeStamp = DateTime.Now.ToString("yyyyMMddHHmm");
var zippedFolder = logDirectory + "_" + timeStamp + ".zip";
ZipFile.CreateFromDirectory(logDirectory, zippedFolder);
}
catch (Exception ex)
{
Log.WriteLine(System.DateTime.Now.ToString() + " - ERROR: " + ex);
}
});
task.Dispose();
}
}
How can I get around the error I am receiving? Any help would be appreciated!
By default, a script on the machine saves files to both a local and a server path folder, but due to network issues the two folders are not in sync. I have written a C# Windows service using FileSystemWatcher, DiffEngine, System.Timers, and a PingService, as in the code below, to handle this.
It monitors a local folder's OnChange event and pings the server IP to check for success/failure before comparing/copying to the server path. When the ping fails, the file goes to a logtemp folder; the system timer handles this and pings again before re-dumping the logtemp files.
I do not know how to use threading for this. Where should my system timer code go for when the ping fails?
protected override void OnStart(string[] args)
{
    // all watcher config here
    watcher.Path = "path";
    watcher.NotifyFilter = NotifyFilters.LastWrite;
    watcher.Filter = "filename_company-Pg1_Product*";
    watcher.Changed += new FileSystemEventHandler(LogFileSystemChanges);
    watcher.EnableRaisingEvents = true;
}
private void LogFileSystemChanges(object sender, FileSystemEventArgs e)
{
    FileInfo sourcepath = new FileInfo(e.FullPath);
    FileInfo destPath = new FileInfo(Path.Combine(dFile, e.Name));
    FileInfo _tempPath = new FileInfo(Path.Combine(tempPath, e.Name));
    if (PingService())
    // PingService (bool): pings a specific IP before compare/copy
    {
        if (!destPath.Exists)
        {
            LogEvent(destPath + " DOES NOT EXIST!! ");
            CopyFunction.CopyFile(sourcepath, destPath, true, true);
        }
        else
        {
            if (BinaryDiff(sFile, Path.Combine(dFile, e.Name)))
            // DiffEngine: true if source and destination differ.
            {
                CopyFunction.CopyFile(sourcepath, destPath, true, true);
            }
        }
        string msg = string.Format("Filename {0} are {1} now at {2} ",
            e.Name, e.ChangeType, DateTime.Now.ToString());
        LogEvent(msg);
    }
    else
    {
        CopyFunction.CopyFile(sourcepath, _tempPath, true, true);
    }
}
Use NServiceBus (with what they call a Saga).
It's an open source project that lets you foolproof your code against cases where the network is down.
Or are you just asking how to make a thread?
If so, see MSDN for examples of creating threads.
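For completeness, here is a minimal sketch of pushing work onto a dedicated thread with the BCL (nothing NServiceBus-specific; the helper name is made up for this example):

```csharp
using System;
using System.Threading;

static class Worker
{
    // Run a computation on a dedicated thread and hand the result back.
    public static int RunOnThread(Func<int> work)
    {
        int result = 0;
        var thread = new Thread(() => result = work());
        thread.Start();
        thread.Join(); // block until done; a real service would signal completion instead
        return result;
    }
}
```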
I want to monitor a log file of our PBX for changes. I made a small program that does just that with a FileSystemWatcher.
Now it's getting strange: The FileSystemWatcher never fires the Changed-Event when I simply start the program. Despite the fact that the log file really has changed. But when I open the directory in the Windows Explorer where the log file is located, the program works as expected. But only as long as the Explorer Window stays open... what the..?
Operating System: Windows Server 2008 R2
EDIT: Sorry, here is the code:
class Program
{
static void Main(string[] args)
{
new LogFileWatcher(@"C:\PBX\Dial.log");
System.Console.Read();
}
}
public class LogFileWatcher
{
public string LogFilePath { get; private set; }
private DateTime _lastLogFileWriteTime;
public LogFileWatcher(string path)
{
LogFilePath = path;
var directoryName = Path.GetDirectoryName(LogFilePath);
var fileName = Path.GetFileName(LogFilePath);
var fsw = new FileSystemWatcher { Path = directoryName, Filter = fileName };
fsw.Changed += fsw_Changed;
fsw.EnableRaisingEvents = true;
}
private void fsw_Changed(object sender, FileSystemEventArgs e)
{
// Get and fix the last write time of the log file
var fixLastWriteTime = File.GetLastWriteTime(LogFilePath);
// Don't do anything when file didn't change since last time
if (fixLastWriteTime == _lastLogFileWriteTime) return;
Console.WriteLine("File changed on: {0} - ID:{1}", DateTime.Now.ToLongTimeString(), Guid.NewGuid());
// Save last write time of the log file
_lastLogFileWriteTime = fixLastWriteTime;
}
}
EDIT2: Maybe this is important: The log file is in use by the PBX Windows-Service! I can open it with Notepad though.
For optimization reasons, the FileStream.Flush() method no longer flushes file metadata (on Vista and later Windows versions). As a result, the FileSystemWatcher gets no file notification and won't raise the Changed event.
http://connect.microsoft.com/VisualStudio/feedback/details/94772/filestream-flush-does-not-flush-the-file-in-the-correct-way-to-work-with-filesystemwatcher-or-native-readdirectorychangesw
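Given that, one workaround is not to rely on Changed at all but to poll the file's size yourself, since a fresh FileInfo reads current metadata (a sketch; the polling interval and class name are my own):

```csharp
using System.IO;

class LogFilePoller
{
    private long _lastLength = -1;

    // Returns true when the file's length differs from the last observed
    // length (including the very first observation).
    public bool HasChanged(string path)
    {
        var info = new FileInfo(path); // fresh instance, so no cached metadata
        if (!info.Exists) return false;
        bool changed = info.Length != _lastLength;
        _lastLength = info.Length;
        return changed;
    }
}
```

Calling `HasChanged` from a timer every second or so sidesteps the missing notifications entirely, at the cost of a little polling overhead.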
Consider this code:
string dir = Environment.CurrentDirectory + @"\a";
Directory.CreateDirectory(dir);
FileSystemWatcher watcher = new FileSystemWatcher(dir);
watcher.IncludeSubdirectories = false;
watcher.EnableRaisingEvents = true;
Console.WriteLine("Deleting " + dir);
Directory.Delete(dir, true);
if (Directory.Exists(dir))
{
Console.WriteLine("Getting dirs of " + dir);
Directory.GetDirectories(dir);
}
Console.ReadLine();
Interestingly this throws an UnauthorizedAccessException on Directory.GetDirectories(dir).
Deleting the watched directory returns without error, but Directory.Exists() still returns true and the directory is still listed. Furthermore accessing the directory yields "Access denied" for any program. Once the .NET application with the FileSystemWatcher exits the directory vanishes.
How can I watch a directory while still allowing it to be properly deleted?
You did in fact delete the directory. But the directory won't be physically removed from the file system until the last handle that references it is closed. Any attempt to open it in between (like you did with GetDirectories) will fail with an access denied error.
The same mechanism exists for files. Review FileShare.Delete
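That file-level behavior is easy to demonstrate: opening with FileShare.Delete lets the delete succeed while the handle is still open, and the file only disappears once the handle closes (a small sketch, not from the original answer):

```csharp
using System.IO;

static class DeletePendingDemo
{
    // Opens a file with FileShare.Delete, deletes it while the handle is
    // still open, and reports whether it still exists after the handle closes.
    public static bool ExistsAfterDeleteWhileOpen(string path)
    {
        File.WriteAllText(path, "data");
        using (var fs = new FileStream(path, FileMode.Open, FileAccess.Read, FileShare.Delete))
        {
            File.Delete(path); // allowed: the open handle permits deletion
            // On Windows the file is now in a "delete pending" state,
            // just like the watched directory above.
        }
        return File.Exists(path); // false once the last handle is gone
    }
}
```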
Try this line:
if (new DirectoryInfo(dir).Exists)
instead of:
if (Directory.Exists(dir))
You should use FileSystemInfo.Refresh. After calling .Refresh(), .Exists shows the correct result:
var dir = new DirectoryInfo(path);
// delete dir in explorer
System.Diagnostics.Debug.Assert(dir.Exists); // true
dir.Refresh();
System.Diagnostics.Debug.Assert(!dir.Exists); // false
Unfortunately, FileSystemWatcher has taken a handle to the directory. This means that when the directory is deleted, there is still a handle to it marked as PENDING DELETE. I've tried some experiments, and it seems that you can use the Error event handler of FileSystemWatcher to identify when this happens.
public myClass(String dir)
{
    mDir = dir;
    Directory.CreateDirectory(mDir);
    InitFileSystemWatcher();
    Console.WriteLine("Deleting " + mDir);
    Directory.Delete(mDir, true);
}

private FileSystemWatcher mWatcher;
private string mDir;

private void MyErrorHandler(object sender, ErrorEventArgs args)
{
    // You can try to recreate the FileSystemWatcher here
    try
    {
        mWatcher.Error -= MyErrorHandler;
        mWatcher.Dispose();
        InitFileSystemWatcher();
    }
    catch (Exception)
    {
        // a bit nasty catching Exception, but you can't do much;
        // the handle should be released by now, though
    }
    // you might not be able to check immediately, as your old FileSystemWatcher
    // is in your current call stack, but it's a start.
}

private void InitFileSystemWatcher()
{
    mWatcher = new FileSystemWatcher(mDir);
    mWatcher.IncludeSubdirectories = false;
    mWatcher.EnableRaisingEvents = true;
    mWatcher.Error += MyErrorHandler;
}