I want to provide my users with the ability to post a log file onto my server rather than requiring them to find it and email it to me.
I am using this library http://ftplib.codeplex.com/ to open an ftp session on the server and send the file. There is a bit of renaming involved but that is it.
Unfortunately the log file to be sent is open while my app is running, so I got a 'file is being used by another process' exception, which makes sense on reflection. I closed the log before uploading but, of course, uploading is a long process, so I moved the upload code into a background thread to let the user continue working. However, the log cannot be re-opened until the upload completes, and in the meantime there could be some event that should be written to the log.
So I am looking for a way to copy the log and then upload it. What would be the best way to do that? The log is a binary file BTW.
If you don't own the code that has the log file open (i.e., it's another app or a closed-source DLL), you can try doing a File.Copy(<log>, <tempdest>), send that file, and delete it when you're done. Note this only works if the process holding the file open allowed read sharing; otherwise the copy itself fails with the same sharing violation.
If you do own the code that is accessing the file in the first place, you want to open it with an explicit share mode, e.g.:
File.Open(path, FileMode.Open, FileAccess.ReadWrite, FileShare.Read)
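A minimal sketch combining both suggestions, assuming you control the logging code; the names `OpenLog` and `SnapshotLog` are mine, not from the FTP library:

```csharp
using System.IO;

class LogUploader
{
    // Writer side: keep the log open for writing, but allow other handles to read it.
    static FileStream OpenLog(string path) =>
        new FileStream(path, FileMode.OpenOrCreate, FileAccess.ReadWrite, FileShare.Read);

    // Uploader side: snapshot the current contents to a temp file, then upload
    // that copy in the background and delete it afterwards. The reader must
    // share Write access, because the writer still holds a write handle.
    static string SnapshotLog(string logPath)
    {
        string temp = Path.Combine(Path.GetTempPath(), Path.GetRandomFileName());
        using (var src = new FileStream(logPath, FileMode.Open, FileAccess.Read, FileShare.ReadWrite))
        using (var dst = File.Create(temp))
        {
            src.CopyTo(dst); // binary-safe copy, works for a binary log
        }
        return temp;
    }
}
```

This way the live log is never closed: the upload works from the snapshot while new events keep being written to the original.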
Related
Searched a lot, but without luck - so here goes
My C# winforms application creates temp files which are opened using the default registered application (let's call them viewer apps). Once the user is done viewing those files, I want to delete them.
Currently, I register for an Application.ApplicationExit event, to delete the file. This approach covers most of the situations but not all. Sometimes the user still has the viewing application open while exiting my app, so the success of my File.Delete depends on whether the viewer has opened the file with FileShare.Delete or not - which is out of my control.
This is what I have found so far, but fall short of what I want
FileOptions.DeleteOnClose does not help, since in some cases my app will already be closed while the temp file is still needed. Also, when I create the file like this: new FileStream(fn, FileMode.CreateNew, FileAccess.ReadWrite, FileShare.ReadWrite | FileShare.Delete, 4096, FileOptions.DeleteOnClose), viewer apps such as Adobe Reader and Notepad still complain that the file is in use by my application: "The process cannot access the file because it is being used by another process"
MoveFileEx with the MOVEFILE_DELAY_UNTIL_REBOOT flag works, but it waits until a reboot to delete the file. I would rather have it deleted as soon as its use is done, since reboots can be few and far between, and forcing a reboot is IMO not the most user-friendly approach. On a side note, does Windows automatically clear the %temp% folder on restart? Or is there any temp folder that Windows clears automatically on restart?
I could write another background process that keeps trying to delete the temp files until it succeeds, but I would like to avoid deploying one more piece of software. It could be done with a Windows service, a scheduled task, or command-line switches that make my existing app run in a background "delete mode", but all of these reduce my ease of deployment and use while increasing my footprint on the client's computer.
In a nutshell: is there any Win32 API or .NET Framework API that will delete a file as soon as no process holds an open handle to it?
EDIT:
The information in the temp files is reasonably private (think downloaded bank account statements), hence the need for immediate deletion after viewing rather than waiting for a reboot or app restart
Summary of all Answers and Comments
After doing some more experiments based on Scott Chamberlain's answer and the other comments on this question, the best path seems to be to force end users to close the viewer app before closing my application whenever the viewer disallows deletion (FileShare.Delete) of the temp file. The following factors played a role in the decision
The best option is FileOptions.DeleteOnClose, but it only works if every handle opened before or after that call uses the FileShare.Delete option.
Viewer apps can frequently open files without FileShare.Delete option.
Some viewers close the handle immediately after reading/displaying the file contents (like Notepad), whereas other apps (like Adobe Reader) retain the handle until the file is closed in the viewer.
Keeping sensitive files on disk for any longer than required is definitely not a good way to proceed. So waiting till reboot should only be used as a fail-safe and not as the main strategy.
The costs of maintaining another process to do the temp file cleanup far exceed the slight inconvenience of forcing users to "close" the viewer before proceeding further.
This answer is based on my comments in the question.
Try writing the file without the delete option, close the file, let the viewer open it, then open a new FileStream for read with DeleteOnClose and an empty body in the using section.
If that second open does not fail, it behaves exactly as you wanted: the file is deleted as soon as no process holds an open handle to it. If the second open does fail, you can use MoveFileEx as a fallback fail-safe.
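A sketch of the two-step trick described above; the method name, file path handling, and `content` parameter are illustrative, not from the question:

```csharp
using System.Diagnostics;
using System.IO;

static void ShowAndAutoDelete(string path, byte[] content)
{
    // 1. Write the temp file normally (no DeleteOnClose yet) and close it.
    File.WriteAllBytes(path, content);

    // 2. Launch the registered viewer for the file type.
    Process.Start(new ProcessStartInfo(path) { UseShellExecute = true });

    // 3. Re-open with DeleteOnClose. If this succeeds, the OS deletes the
    //    file once the last open handle (ours or the viewer's) is closed.
    try
    {
        using (new FileStream(path, FileMode.Open, FileAccess.Read,
                              FileShare.ReadWrite | FileShare.Delete, 4096,
                              FileOptions.DeleteOnClose))
        { /* empty: holding the handle is all that matters */ }
    }
    catch (IOException)
    {
        // The viewer opened the file without FileShare.Delete, so the
        // re-open failed with a sharing violation. Fall back to
        // MoveFileEx(path, null, MOVEFILE_DELAY_UNTIL_REBOOT) via P/Invoke.
    }
}
```

The catch block is exactly the "2nd opening fails" case: the sharing violation tells you the viewer will not tolerate deletion, so the reboot-time fallback is the only remaining option.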
On web server, I have an ASP.NET website with error.txt inside it.
This file always gets written to. And it is in the bin folder.
Should I be concerned if:
I go to web server and open the bin folder
Go to web server and open the error.txt
My C# code always writes to it with share mode FileShare.ReadWrite
Is copying the file to another folder using command prompt safer to avoid locking?
It depends on the application you are opening it with. For instance, if you open it with Microsoft Word, it will lock the file and other processes will only be able to read it. However, if you open it with Notepad, the file is not locked and another process can read, write, delete, move, or rename it.
You can test simply by opening a file with the application, and then try to delete, rename, write to, or move the file. If you can perform any of those actions then you know that your application did not lock it.
Assuming you are just copying it with Explorer, copying the file will lock it only for the time it takes to read it. If it is a small file, this will be just a few milliseconds.
To ensure that your application doesn't crash when it tries to write while the file is locked, put the write operation(s) in a loop that retries on exception, with a 100 ms wait between tries.
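The retry loop might look like the following sketch; the method name and attempt limit are my own choices:

```csharp
using System.IO;
using System.Threading;

static void AppendWithRetry(string path, string line, int maxAttempts = 50)
{
    for (int attempt = 1; ; attempt++)
    {
        try
        {
            // FileShare.ReadWrite lets other readers/writers coexist with us.
            using (var fs = new FileStream(path, FileMode.Append, FileAccess.Write, FileShare.ReadWrite))
            using (var writer = new StreamWriter(fs))
            {
                writer.WriteLine(line);
            }
            return; // success
        }
        catch (IOException) when (attempt < maxAttempts)
        {
            Thread.Sleep(100); // file is locked right now; wait and try again
        }
    }
}
```

The exception filter re-throws on the final attempt, so a permanently locked file still surfaces as an error instead of looping forever.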
We have a Windows server, and one day we will receive some text files in a folder via SFTP. I don't have more information, but maybe this is enough. Now I should write a function that moves these files into another folder. That should not be that hard, I thought... but then I realized that I'm able to move a file before it's finished. So I was searching for some solutions and I'm really confused.
My solution would be to check the file and the processes around it. If the file is not finished yet, there is a copy process in progress that I can detect. To make this easy, I just have to try to lock the file, and if no other process holds it, the file is ready to move?
using (File.Open("myFile", FileMode.Open, FileAccess.Read, FileShare.None))
{ /*rdy!*/ }
But now I see people writing about checksum tests, or testing whether the file size stops changing to tell the file is ready. Isn't that a little complicated? Please tell me my solution could also work... I'm not able to test it with any server-to-server SFTP setup. I just know that it works if I copy a file to another folder (via Explorer). Does it work with SFTP transfers as well? Any ideas? Thank you
File-size checks are dangerous - what if the upload is suspended and later resumed? How much time should go by until you accept the current file size as the final file size? => Not a good solution.
I'd go for the locking; however, this only works if the process writing the file also opens it in a way that locks it exclusively. If the process doesn't do that, you'll be stuck with your problem again.
Another solution would be to upload the files with temporary names, like ".sftptmp". And to have the uploader rename it after it is done. That way you can be sure the file has been uploaded - just ignore all files that end with ".sftptmp". This, however, assumes that you actually have control over the process of uploading files.
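A sketch of the mover's side of that temporary-name convention; the folder parameters and the ".sftptmp" suffix follow the suggestion above, the method name is mine:

```csharp
using System.IO;

static void MoveCompletedFiles(string inbox, string destination)
{
    foreach (string file in Directory.GetFiles(inbox))
    {
        // Skip files still being uploaded under their temporary name.
        if (file.EndsWith(".sftptmp")) continue;

        // Anything with its final name was renamed by the uploader,
        // so the transfer is known to be complete.
        string target = Path.Combine(destination, Path.GetFileName(file));
        File.Move(file, target);
    }
}
```

Because the uploader's rename happens only after the transfer finishes, the mover never needs to guess whether a file is complete.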
Another option is to have the sender put a control file after the data file. For example, put uploadfile-20220714.txt, then put uploadfile-20220714.ctl. The control file can contain file information such as the name and size of the data file. This option requires the sender to modify their process, but it shouldn't require too much effort.
When I write to an XML file, an exception occurs: "Cannot access file because it used by another process". How can I fix that problem?
You can use things like "Process Explorer" (easy to find) on the machine in question to double-check which process is locking a file. If you don't own the competing process, the best you can do is ask the operator to kindly close the file and/or app that is blocking you.
Assuming you do manage the other process that is locking the file: the most common cause of unexpected locks is files not being closed cleanly. Check that you are religiously closing all file handles after use, ideally with a using block so they are closed even in error conditions - for example:
using (Stream dest = File.Create(path))
{
    // write to dest
}
Most likely this means another program has the file locked. Try saving to another location, and make sure you properly dispose the objects used to write the file when you're done writing. Also double-check that you have permission to write to this folder (try creating a basic text file there).
Keep in mind that your program may be running with different permissions than what you are logged in with.
The XML file that you are trying to write to is currently open in another process and is therefore in a locked state. You cannot modify a file that has been locked.
Close any file handles that are currently using the resource.
I want to replace existing files on an IIS website with updated versions. Say these files are large pdf documents, which can be accessed via hyperlinks. The site is up 24x7, so I'm concerned about locking issues when a file is being updated at exactly the same time that someone is trying to read the file.
The files are updated using C# code run on the server.
I can think of two options for opening the file for writing.
Option 1) Open the file for writing, using FileShare.Read :
using (FileStream stream = new FileStream(path, FileMode.Create, FileAccess.Write, FileShare.Read))
While this file is open, and a user requests the same file for reading in a web browser via a hyperlink, the document opens up as a blank page.
Option 2) Open the file for writing using FileShare.None :
using (FileStream stream = new FileStream(path, FileMode.Create, FileAccess.Write, FileShare.None))
While this file is open, and a user requests the same file for reading in a web browser via a hyperlink, the browser shows an error. In IE 8, you get HTTP 500, "The website cannot display the page", and in Firefox 3.5, you get : "The process cannot access the file because it is being used by another process."
The browser behaviour kind of makes sense and seems reasonable. I guess it's highly unlikely that a user will attempt to read a file at exactly the same time you are updating it. It would be nice if the file update were somehow atomic, like a database update wrapped in a SQL transaction.
I'm wondering if you guys worry about this sort of thing, and prefer either of the above options, or even have other options of your own for updating files.
How about writing the new version of the file under a different name and then doing a File.Move() with the overwrite argument set to true? While you're writing it you won't interfere with the web server, and moving the file(s) will be quick.
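A sketch of that write-then-swap idea. One caveat worth noting: the File.Move overload with an overwrite flag exists only from .NET Core 3.0 on; on older frameworks File.Replace (or Delete followed by Move) achieves the same effect. The method name and ".new" suffix are mine:

```csharp
using System.IO;

static void PublishFile(string finalPath, byte[] newContent)
{
    // Write the full new version under a name the web server never serves.
    string temp = finalPath + ".new";
    File.WriteAllBytes(temp, newContent);

    // Swap it into place. On the same volume this is a rename, so the
    // window in which a reader could see a partial file is tiny.
    File.Move(temp, finalPath, overwrite: true);
}
```

This narrows the race from "the whole duration of the write" down to a single rename, though it is still not a true transaction.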
I usually don't worry about this kind of problem, but you could work it around this way:
Create a link to an ASPX page which downloads referenced file, like "Download.aspx?Name=Document1.pdf"
In that page, before download that file, look for a folder named "Updated"
If you find it, get your file from it
If not, go get it from "Current" folder
To update your files:
Create a folder name "Updating"
Copy your new files into it
Rename it to "Updated" (so new downloads use it as source)
Update your "Current" folder
Delete your "Updated" folder
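The folder-swap steps above could be sketched like this, with assumed folder names matching the list ("Updating", "Updated", "Current"):

```csharp
using System.IO;

static void UpdateFiles(string root, string[] newFiles)
{
    string updating = Path.Combine(root, "Updating");
    string updated  = Path.Combine(root, "Updated");
    string current  = Path.Combine(root, "Current");

    // 1-2. Create "Updating" and copy the new files into it.
    Directory.CreateDirectory(updating);
    foreach (string f in newFiles)
        File.Copy(f, Path.Combine(updating, Path.GetFileName(f)));

    // 3. Rename to "Updated": new downloads now use it as their source.
    Directory.Move(updating, updated);

    // 4. Refresh "Current" from "Updated".
    foreach (string f in Directory.GetFiles(updated))
        File.Copy(f, Path.Combine(current, Path.GetFileName(f)), overwrite: true);

    // 5. Remove the "Updated" folder; downloads fall back to "Current".
    Directory.Delete(updated, recursive: true);
}
```

The directory rename in step 3 is the atomic moment: readers either see the old "Current" files or the complete "Updated" set, never a half-written file.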
One option is to build an extra layer between the hyperlink and the file. Instead of linking directly to the file have the hyperlink point to another page (or similar resource). This resource/page can then determine which is the latest file that needs to be sent to the browser and then send it down to the browser. This way the link will always be the same, but the file can change.
This is a threading issue at heart, and those can be tricky to solve in a clean way. Conceptually, you want to synchronize read and write access to the file from multiple threads.
To achieve this, I would store the file outside of IIS' website root so that IIS does not serve it directly. I would create a separate HttpHandler to serve it instead. The HttpHandler would lock on an object that the write code would also lock on. You should use a ReaderWriterLockSlim so that you can have multiple concurrent reads (EnterReadLock) while also ensuring only a single write thread can execute (EnterWriteLock) and that reads are blocked while writing.
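A minimal sketch of the handler-plus-lock idea; the class name, file path, and `ReplaceFile` entry point are assumptions for illustration:

```csharp
using System.IO;
using System.Threading;
using System.Web;

public class PdfHandler : IHttpHandler
{
    static readonly ReaderWriterLockSlim Lock = new ReaderWriterLockSlim();
    const string FilePath = @"C:\data\document.pdf"; // outside the web root

    public void ProcessRequest(HttpContext context)
    {
        Lock.EnterReadLock(); // many requests may read concurrently
        try
        {
            context.Response.ContentType = "application/pdf";
            context.Response.WriteFile(FilePath);
        }
        finally { Lock.ExitReadLock(); }
    }

    // Called by the update code; blocks until all in-flight reads finish,
    // and blocks new reads until the write completes.
    public static void ReplaceFile(byte[] newContent)
    {
        Lock.EnterWriteLock();
        try { File.WriteAllBytes(FilePath, newContent); }
        finally { Lock.ExitWriteLock(); }
    }

    public bool IsReusable => true;
}
```

Note this only synchronizes within a single process; if the site runs in a web garden or farm, the lock would need to be replaced with a cross-process mechanism.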
Having a similar issue, I developed my own solution (basically having multiple versions of a file, serving them through an ASHX handler).
Please see my CodeProject article discussing the solution (and possible caveats)