I'm using a FileSystemWatcher to detect that a text file is created in directory A and subsequently created in directory B.
The issue I'm having is that the process which moves the file from directory A to directory B also zips the file up, changing the filename from, say, "999_XXX_001.txt" to "999_XXX_001.txt.zip".
This causes three problems:
1) I can no longer open and read the file to analyse its contents.
2) The filename has changed.
3) The FileSystemWatcher appears to support only a single extension.
Solution
Using two watchers, one for "*.zip" and one for "*.txt", I strip the ".zip" and compare filenames, because moved files no longer exist to be compared byte-for-byte. I guess the real question here was: how can I use the watcher to detect ".txt.zip" as an extension?
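For what it's worth, the watcher's Filter property is a wildcard pattern matched against the whole file name, not a registered "extension", so a single watcher with a "*.txt.zip" filter may be enough. A minimal sketch (the directory path and ZipArrivalWatcher name are placeholders of my own):

```csharp
using System;
using System.IO;

static class ZipArrivalWatcher
{
    // Filter is a plain wildcard match on the file name, so a
    // "double extension" like .txt.zip works fine.
    public static FileSystemWatcher Watch(string directoryB)
    {
        var watcher = new FileSystemWatcher(directoryB) { Filter = "*.txt.zip" };

        watcher.Created += (sender, e) =>
        {
            // e.Name is e.g. "999_XXX_001.txt.zip"; stripping the final
            // ".zip" recovers the original text file name for comparison.
            string originalName = Path.GetFileNameWithoutExtension(e.Name);
            Console.WriteLine($"Zipped copy of {originalName} arrived.");
        };

        watcher.EnableRaisingEvents = true;
        return watcher;
    }
}
```

Keep a reference to the returned watcher alive for as long as you want events; if it is collected or disposed, the events stop.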
Why? You would have to wait until the process has finished its zipping magic; afterwards you can open the zip file with the framework of your choice.
Why is it a problem itself that the filename has changed?
No, the file watcher will detect changes to all files within the given directory.
But maybe it would be better to describe what you are actually trying to achieve here. There is probably a better solution for what you actually need.
Related
I am writing a backup program that requires predefined multiple folders and single files to be added to a single zip archive. I have had no issues adding a single folder using ZipFile.CreateFromDirectory(string, string, CompressionLevel, bool(false)).
However, I am having a hard time adding multiple folders, as there does not seem to be a way to update an archive or target two folders using the CreateFromDirectory method.
- Would be nice if there was an UpdateFromDirectory method!
I have been trying to stay away from third-party libraries for no particular reason; however, as far as I have found, none deal with multiple non-recursive folders.
I have tried just about everything short of writing my own code to recurse and add files individually, which I don't really want to do.
The program has several inputs that define the folders/files to be zipped and, for each input that is not null, should add it to a single zip file regardless of whether it is a folder or a file.
I guess my question is whether this is possible at all using the boxed libraries without custom recursing, or even with a third-party library without heavy modifications. Not sure if I have made my question clear; I'm sure you will all let me know if I have not.
From what I can tell, using the ZipFile class's static convenience methods you can only create and extract; to update an existing archive you would need to open it with ZipFile.Open in ZipArchiveMode.Update and add entries individually, or recreate the whole zip. [Source: ZipFile methods]
To target more than one folder, you could arrange all the files and folders into one folder and then zip the entire source without including the source folder itself. In most cases moving those files/folders isn't possible, so I'd recommend looking into symlinks within Windows; see [Issue with creating symbolic link to directory].
You can create a "myFolder" folder and put into it all the folders you want to appear in the zipped archive. Then call ZipFile.CreateFromDirectory("myFolder", "name of zip file to create", CompressionLevel.Fastest, false, Encoding.UTF8). Setting the includeBaseDirectory parameter to false is what makes this work.
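If staging everything into one folder isn't practical, another option (a sketch, assuming .NET 4.5+'s System.IO.Compression; the AddToZip name and entry layout are my own) is to open the archive in ZipArchiveMode.Update and add each source yourself, non-recursively:

```csharp
using System.IO;
using System.IO.Compression;

static class MultiFolderZip
{
    // Adds each source to one archive: a folder contributes its
    // immediate (top-level) files only, a plain file is added as-is.
    public static void AddToZip(string zipPath, params string[] sources)
    {
        // Update mode opens an existing archive, or creates a new one.
        using (var archive = ZipFile.Open(zipPath, ZipArchiveMode.Update))
        {
            foreach (string source in sources)
            {
                if (Directory.Exists(source))
                {
                    // Non-recursive: only the folder's top-level files.
                    foreach (string file in Directory.GetFiles(source))
                        archive.CreateEntryFromFile(
                            file,
                            Path.GetFileName(source) + "/" + Path.GetFileName(file));
                }
                else if (File.Exists(source))
                {
                    archive.CreateEntryFromFile(source, Path.GetFileName(source));
                }
            }
        }
    }
}
```

Usage would be something like MultiFolderZip.AddToZip(@"C:\backup.zip", @"C:\FolderA", @"C:\FolderB", @"C:\SomeFile.txt"); null inputs could simply be filtered out before the call.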
We have some files and directories with their absolute paths.
These files or directories will be renamed by a process. We can't get the new names from the process at all, but we do have the full absolute path of the root directory.
Now I want to find those items again.
Is there a unique key or something similar for directories or files, so they can be found without knowing the exact name?
No, there is nothing directly available on regular NTFS from C# code to do so.
You can:
compute some sort of hash of each file yourself, plus a size check, to find them again after the rename (if they were simply renamed)
use events from FileSystemWatcher to track file movements
add alternate data streams to the files, to use as custom markers, if they will not be stripped by "the process".
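The hash-plus-size idea in the first bullet might be sketched like this (Fingerprint and Find are hypothetical helper names, and SHA-256 is just one possible hash; size is checked first because it is cheap):

```csharp
using System;
using System.IO;
using System.Linq;
using System.Security.Cryptography;

static class FileRelocator
{
    // Records (size, SHA-256) for a file so it can be found again later.
    public static (long Size, string Hash) Fingerprint(string path)
    {
        using (var sha = SHA256.Create())
        using (var stream = File.OpenRead(path))
            return (new FileInfo(path).Length,
                    Convert.ToBase64String(sha.ComputeHash(stream)));
    }

    // Searches root recursively for a file with the same fingerprint;
    // only files whose size matches are actually hashed.
    public static string Find(string root, (long Size, string Hash) fp)
    {
        return Directory
            .EnumerateFiles(root, "*", SearchOption.AllDirectories)
            .Where(f => new FileInfo(f).Length == fp.Size)
            .FirstOrDefault(f => Fingerprint(f).Hash == fp.Hash);
    }
}
```

Note this only works if the rename left the contents untouched; if "the process" also rewrites the files, no content-based key will survive.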
Currently I am trying to improve the design of two Windows services (C#).
Service A produces data exports (csv files) and writes them to a temporary directory.
So the file is written to a temporary directory that is a subdirectory of the main output directory.
Then the file is moved (via File.Move) to the output directory (after a successful write).
This export may be performed by multiple threads.
Another service B tries to fetch the files from this output directory at a defined interval.
How do I ensure that Directory.GetFiles() excludes locked files?
Should I try to check every file by creating a new FileStream (using (Stream stream = new FileStream("MyFilename.txt", FileMode.Open))), as described here?
Or should the producer service (A) use temporary file names (*.csv.tmp) that are automatically excluded by the consumer service (B) with appropriate search patterns, and rename each file after the move has finished?
Are there better ways to handle such file-listing operations?
Don't bother checking!
Huh? How can that be?
If the files are on the same drive, a Move operation is atomic! The operation is effectively a rename: it erases the directory entry from the previous directory and inserts one into the new directory, pointing to the same sectors (or whatevers) where the data really are, without rewriting them. The file system's internal locking mechanism has to lock and block directory reads during this process to prevent a directory scan from returning corrupt results.
That means, by the time it ever shows up in a directory, it won't be locked; in fact, the file won't have been opened/modified since the close operation that wrote it to the previous directory.
Caveats: (1) this definitely won't work between drives, partitions, or other media mounted as a subdirectory; the OS does a copy + delete behind the scenes instead of a directory-entry edit. (2) This behaviour is a convention, not a rule. Though I've never seen it, file systems are free to break it, and even to break it inconsistently!
So this will probably work. If it doesn't, I'd recommend your own idea of temp extensions (I've done it before for this exact purpose, between a client and server that could only talk by communicating via a shared drive); it's not that hard, and it worked flawlessly.
If your own idea is too low-tech, and both processes are on the same machine (it sounds like they are), you can use a named mutex (google that) with the filename embedded: the writer process holds it while the file is being written, and the reader does a blocking wait on it when opening each file. If you want the second process to respond ASAP, combine this with the FileSystemWatcher. Then pat yourself on the back for spending ten times the effort of the temp-filename idea, with no extra gain >:-}
good luck!
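The named-mutex idea could be sketched as follows (a sketch, not a definitive implementation; the "Local\filelock_" naming convention is an assumption that both processes would have to share):

```csharp
using System.IO;
using System.Threading;

static class MutexGuardedWrite
{
    // Both sides derive the mutex name from the file name.
    static string MutexName(string path) =>
        "Local\\filelock_" + Path.GetFileName(path);

    // Writer side: hold the named mutex while the file is being written.
    public static void WriteFile(string path, byte[] data)
    {
        using (var mutex = new Mutex(false, MutexName(path)))
        {
            mutex.WaitOne();
            try { File.WriteAllBytes(path, data); }
            finally { mutex.ReleaseMutex(); }
        }
    }

    // Reader side: block until no writer holds the mutex for this file.
    public static byte[] ReadFile(string path)
    {
        using (var mutex = new Mutex(false, MutexName(path)))
        {
            mutex.WaitOne();
            try { return File.ReadAllBytes(path); }
            finally { mutex.ReleaseMutex(); }
        }
    }
}
```

Note the mutex only guards cooperating processes that use the same convention; it does nothing against a third process that opens the file directly.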
One way would be to mark the files as temporary from the writing app whilst they're in use, and only unmark them once they are written to and closed, e.g.
FileStream f = File.Create(filename);
FileAttributes attr = File.GetAttributes(filename);
File.SetAttributes(filename, attr | FileAttributes.Temporary);
// write to file.
f.Close();
File.SetAttributes(filename, attr);
From the consuming app, you just want to skip any temporary files.
foreach (var file in Directory.GetFiles(Path.GetDirectoryName(filename)))
{
    if ((File.GetAttributes(file) & FileAttributes.Temporary) != 0) continue;
    // do normal stuff.
}
I am writing an application that uses FileSystemWatcher, and the main aim is to monitor files that are being copied. Looking at the FileSystemWatcher class, I found four events I can use: Changed, Deleted, Renamed and Created, but I didn't find a Copied event. I'd like to watch specific files and, when a user tries to copy them, prevent him from doing so using FileSystemWatcher. So, is there any method in FileSystemWatcher for monitoring the copying of files that I can use? Sorry, my question may look stupid, but this is the first time I've used FileSystemWatcher; I've read about it a lot, and almost everyone agrees on the four events that can be used in that class.
There is no operation called copying a file.
Rather, copying is a combination of reading one file and writing another one.
You cannot reliably do this.
You cannot do this with FileSystemWatcher.
Maybe monitor the Clipboard, and if you find a list of files on it, check their location.
I think you can at least get a notification of who copied the file by using WMI notifications. Another option is restricting user access to the folders.
Try this link.
http://technet.microsoft.com/en-us/library/ee176985.aspx#EHAA
Why don't you check, at the start of your software, that the executable was launched from the USB key? That way, a user who copied the files won't be able to launch the program even if he copied the exe.
When I run the code below, it fills my array with a list of files in the specified directory.
This is good.
However, it also grabs files that are 'in flight' - meaning files that are currently being copied to that directory.
This is bad.
How do I go about ignoring those 'in-flight' files? Is there a way to check each file to make sure it's 'fully there' before I process it?
string[] files = Directory.GetFiles(ConfigurationSettings.AppSettings.Get("sourcePath"));
if (files.Length > 0)
{
    foreach (string filename in files)
    {
        string filenameonly = Path.GetFileName(filename);
        AMPFileEntity afe = new AMPFileEntity(filenameonly);
        afe.processFile();
    }
}
Unfortunately there is no way to achieve what you are looking for. Robert's suggestion of opening the file for writing solves a subset of the problem, but it does not solve the bigger issue, which is:
The file system is best viewed as a multi-threaded object over which you have no synchronization capabilities
No matter what synchronization construct you try to use to put the file system into a "known state", there is a way for the user to beat it.
The best way to approach this problem is to process the files as normal and catch the exceptions that result from using files that are "in flight". This is the only sane way to deal with the file system.
Yes, you can try to open the file for writing. If you are able to open for writing without an exception then it's likely not "in-flight" anymore. Unfortunately, I have encountered this problem several times before and have not come across a better solution.
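The try-to-open check can be wrapped in a small helper (a sketch; IsFileReady is a made-up name, and FileShare.None is the strictest variant, failing if anyone else has the file open at all). As noted above, treat it as a hint, not a guarantee: the file can still be opened by someone else between your check and your use of it.

```csharp
using System.IO;

static class FileReadiness
{
    // Returns true if the file can be opened exclusively, i.e. no other
    // process still has it open for writing ("in flight").
    public static bool IsFileReady(string path)
    {
        try
        {
            using (new FileStream(path, FileMode.Open,
                                  FileAccess.ReadWrite, FileShare.None))
                return true;
        }
        catch (IOException)
        {
            return false; // still locked by the copying process
        }
    }
}
```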
Get a list of files in the directory.
Get a list of open handles (see below).
Remove the latter from the former.
You can get a list of open handles by p/invoking NtQuerySystemInformation. There's a project on CodeProject that shows how to do this. Alternatively, you can call Handle.exe, from Sysinternals, and parse its output.
Try to rename/move the file. If you can rename it, it's no longer in use.
Carra's answer gave me an idea.
If you have access to the program that copies the files to this directory, modify it so that it:
Writes files to a temporary directory on the same disk.
Moves the files to the appropriate folder after they have finished writing to disk.
On the same filesystem, a move operation just updates the directory entries rather than changing the file's physical location on disk. Which means that it's extremely fast.
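The write-to-temp-then-move pattern described above might look like this on the producer side (a sketch; the AtomicPublish name and the "tmp" subdirectory are placeholders, and the temp directory must sit on the same volume as the output directory for the move to be a cheap rename):

```csharp
using System.IO;

static class AtomicPublish
{
    // Writes to a temp subdirectory, then moves into the output
    // directory; on the same volume the move just rewrites directory
    // entries, so the consumer never sees a partially written file.
    public static void Publish(string outputDir, string fileName, byte[] data)
    {
        string tempDir = Path.Combine(outputDir, "tmp");
        Directory.CreateDirectory(tempDir);

        string tempPath = Path.Combine(tempDir, fileName);
        File.WriteAllBytes(tempPath, data);            // the slow part, hidden

        File.Move(tempPath, Path.Combine(outputDir, fileName)); // fast rename
    }
}
```

The consumer would additionally skip the "tmp" subdirectory (Directory.GetFiles without SearchOption.AllDirectories already does).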