Error creating an XmlWriter when the file already exists - C#

In my code, I have these lines:
XmlWriterSettings writerSettings = new XmlWriterSettings();
writerSettings.Indent = true;
XmlWriter writer = XmlWriter.Create(filename, writerSettings);
document.Save(writer);
This works fine when filename does not exist. But when it does, I get this error (on the 3rd line, not the 4th):
System.IO.IOException: Sharing violation on path [the file path]
I want to overwrite the file if it already exists. How do I do this?

If you look carefully at the IOException, it says it's a "sharing violation". This means that another program is using the file while you are trying to access it. That's usually not much of a problem for reading, but it happens quite a lot when writing to files. You should:
Try to find out whether some other program is using this file, what that program is, and why. A program (especially one written in a language without good garbage collection) may have accessed the file and never closed its IO stream, leaving the file locked. Utilities such as Sysinternals Process Explorer or handle.exe can show you which processes have a handle open on a given file.
It's also possible that while you were debugging your program you killed the process or something (I do that sometimes), so the IO stream was never closed. The easiest fix for this, as far as I know, is a reboot.
Alternatively, the issue may be coming from your own code. C#'s garbage collection and IO capabilities usually prevent such problems, but you might have forgotten to close a file stream somewhere. I do this sometimes, and finding the location of the bug can take quite a while even though the fix is nearly instant. If you step through your program and use watches to keep track of your IO operations, it should be relatively simple to find such a bug.
Good luck!

The problem isn't that the file exists, but that it is in use by a different program (or your own program). If it were simply that the file existed, it would be overwritten without any exception.
If it's your own program that created the already-existing file, it's likely that you haven't properly disposed the object that created it, so the file is still open.

Try using the overload of XmlWriter.Create that accepts a Stream, and pass in a FileStream from File.Create(filename)...
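For example (a minimal sketch based on the code in the question): File.Create truncates the file if it already exists and returns a FileStream that the XmlWriter.Create(Stream, XmlWriterSettings) overload accepts, and the using blocks guarantee both handles are released even if Save throws:
XmlWriterSettings writerSettings = new XmlWriterSettings { Indent = true };
using (FileStream stream = File.Create(filename))
using (XmlWriter writer = XmlWriter.Create(stream, writerSettings))
{
    document.Save(writer);
}
Note that File.Create will still throw if another process holds the file open, so the underlying sharing violation itself needs to be resolved as described in the other answers.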

Related

File.Open hangs and freezes the thread when accessing a local file

I'm currently using file streams to copy files from one location to another.
It all functioned as intended until now, when I suddenly have the problem that File.Open freezes the thread it is running in.
FileStream sourceStream = File.Open(filePath, FileMode.Open);
It only happens for one specific file (3 GB in size). The interesting thing is that it still functioned normally for this file one day prior, so it can't be the file size. The next thing I checked was whether some sort of exception was thrown that I don't catch.
I put a try/catch block around the whole thing (normally I use the calling method to catch the exceptions), with the same effect:
try
{
    FileStream sourceStream = File.Open(filePath, FileMode.Open);
    sourceStream.Close();
}
catch (Exception e)
{
    Console.Write("A");
}
I also checked what happens if the file is already being accessed: an exception is thrown then. (I tested that with other files, since, as I said, for this specific file the thread now always hangs when I try to open it.)
The file is located on the local hard drive, and other (smaller) files in the same folder don't show this problem.
As I'm now running out of ideas about what the possible reason could be, my question is:
What could be possible reasons for this unexpected behaviour, and how can they be averted?
EDIT:
It now functions again (just when I tried to use Process Monitor, it started functioning again).
So in total, I have no clue what could have caused the phenomenon. If anyone has an idea of a possible reason, it would be good to know, to avoid a repeat of the problem in the future.
Also of note, as one comment brought it up: before the File.Open I have a using block:
using (var stream = new BufferedStream(File.OpenRead(filePath), 1024 * 1024))
{
    // ..do calculations
}
I use this to make some hash calculations on the file. THIS one had no issues at all with opening the file (only the later File.Open had the issues).
Edit:
I've just received some info from the sysadmins here that shines a new light on the problem:
The system is set up in a way that the whole system is backed up from time to time, file by file, without the OS having any knowledge of it. This means that in the case of a file being backed up, the OS thinks it is there and that nobody is accessing it, when in reality it is currently being backed up (and thus being accessed and, according to how they described the backup process, unreachable from within the OS; as the OS doesn't know about the backup, nothing was shown in the resource monitor's hard drive access nor in the task manager).
With that information, it could be that, as the OS didn't know the file was being accessed, it tried to access it (through the Open call) and waited and waited for the file, which in reality was not accessible.
It would thus have had to run into a timeout, which File.Open doesn't have (at least that's my guess with the new info, if I understood the sysadmins accurately).
Thanks
A couple of possible reasons:
Your antivirus. That thing hooks into the OS and replaces the I/O functions with its own. When you open a file, it can actually perform a virus check before returning control to your application. You could have had a bad signature update which forced the AV to perform the check on your 3 GB file, and a subsequent update could have fixed the problem.
A bad sector on your drive. This usually makes I/O perform very poorly, but your system could have relocated the bad sector into another one, so the performance went back to normal. You can run a chkdsk /R to see if you have bad sectors.
Another app that locks the file, though I'd rather expect an exception in this case.
The problem stemmed not from C# or Windows, but from the architecture of how the PC itself was set up.
In this case it was set up so that the files I tried to read could be inaccessible (because they were being backed up) WITHOUT the OS of the local PC knowing it.
Thus the OS thought the file was accessible, and C# received that answer from the OS when it tried to open the file. And as file operations in C# use their Windows equivalents, and those have no timeouts, the whole operation hung/froze until the file backup was finished.
In retrospect I would say: Lucas Trzesniewski's answer should cover most situations where the freeze happens. My own problem wasn't answered by it only because I had such a special situation that caused the problem in the end.
Are you absolutely sure that the freezing always occurs in File.Open()?
Given the absence of exceptions, it appears that the problem may be at a lower level. When you experienced it, did you try to open the file with a hex editor or some other tool to check that it is actually entirely readable? It could be a problem accessing a certain area of the hard drive.
Try to specify the access mode with FileAccess if you need read-only, write-only, etc.
See also this post for the actual usefulness of BufferedStream.
Have you checked the File.Open() overload that takes FileAccess and FileShare values?
I think it's a file locking issue.
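For example (a minimal sketch): opening the file read-only and allowing other readers avoids conflicting with processes that merely hold a read lock on it:
using (FileStream sourceStream = File.Open(filePath, FileMode.Open, FileAccess.Read, FileShare.Read))
{
    // ... copy / read from sourceStream here ...
}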
I had a similar issue where sometimes File.Open would hang when trying to check if a file was locked.
I solved it like this:
public async Task<bool> IsLocked(FileInfo file)
{
    // Probe the file with an exclusive open; success means it is not locked.
    var checkTask = Task.Run(() =>
    {
        try
        {
            using (file.Open(FileMode.Open, FileAccess.Read, FileShare.None)) { }
            return false;
        }
        catch (Exception)
        {
            return true;
        }
    });

    // If the probe itself hangs for more than a second, treat the file as locked.
    var delayTask = Task.Delay(1000);
    var firstTask = await Task.WhenAny(checkTask, delayTask);
    if (firstTask == delayTask)
    {
        return true;
    }
    return await checkTask;
}
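A possible call site (assuming the caller is already async and filePath is the path you are about to open):
bool locked = await IsLocked(new FileInfo(filePath));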

Allow tail when using MemoryMappedFile

I'm trying to create a logfile using a MemoryMappedFile. The code looks something like this:
_currentFile = MemoryMappedFile.CreateFromFile(filename, FileMode.Create, filename, MaxSize, MemoryMappedFileAccess.ReadWrite);
_currentFile.GetAccessControl().SetAccessRule(new AccessRule<MemoryMappedFileRights>("everyone", MemoryMappedFileRights.FullControl, AccessControlType.Allow));
_accessor = _currentFile.CreateViewAccessor();
The logging works just fine but when I try to put a tail on the file at the same time, I get a Permission denied.
I've tried to find an answer as to how you can allow reads on a MemoryMappedFile, but I was unable to find a straight answer. So here it goes: is it possible to allow readers to access a MemoryMappedFile?
In other words, would it be possible to "tail" a MemoryMappedFile that is being actively written to?
If using a MemoryMappedFile as a log file is a bad idea to begin with, then I'd also like to hear that. And if this is a stupid question to ask, then I apologize.
Hans Passant gave the correct answer to this question so let me quote that here:
"The exact moments in time when the operating system flushes memory updates to the file, and the order in which they occur, are completely unpredictable. This makes MMFs efficient. It therefore puts a lock on the file that prevents any process from reading it. Since such a process could not see anything but stale junk. Also the reason why common logging libraries, like log4net, do not offer MMFs as one of their many possible log targets. So, yeah, bad idea."

Error: The process cannot access the file '...' because it is being used by another process

I have a function that always creates a directory and puts some files (images) in it.
The first time the code runs, no problem. The second time (always), it gets an error when I have to delete the directory (because I want to recreate it to put the images in it). The error is "The process cannot access the file '...' because it is being used by another process". The only process that accesses these files is this function.
It's like the function "doesn't let go" of the files.
How can I resolve this with a clean solution?
Here a part of the code:
String strPath = Environment.CurrentDirectory.ToString() + "\\sessionPDF";
if (Directory.Exists(strPath))
    Directory.Delete(strPath, true); // Here I get the error
Directory.CreateDirectory(strPath);
//Then I put the files in the directory
If your code or another process is serving up the images, they will be locked for an indefinite amount of time. If it's IIS, they're locked for a short time while being served. I'm not sure about this, but if Explorer is creating thumbs for the images, it may lock the files while it does that. It may be for a split second, but if your code and that process collide, it's a race condition.
Be sure you release your locks when you're done. If the class implements IDisposable, wrap a using statement around it if you're not doing extensive work on that object:
using (var bitmap = ...) { ... }   // likewise for a Stream, a file handle, etc.
...which will close the object afterwards and the file will not be locked.
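For instance (a minimal sketch with a hypothetical imagePaths collection): copying each image inside using blocks releases the OS file handles immediately, so the next run's Directory.Delete doesn't hit a lock:
foreach (string sourcePath in imagePaths)
{
    string destPath = Path.Combine(strPath, Path.GetFileName(sourcePath));
    using (FileStream source = File.OpenRead(sourcePath))
    using (FileStream dest = File.Create(destPath))
    {
        source.CopyTo(dest);
    } // both handles are closed here, even if CopyTo throws
}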
Just going out on a limb here without seeing the code that dumps the files, but if you're using FileStreams or Bitmap objects, I would double check to ensure you are properly disposing of all of those objects before running the second method.
The only clean solution in this case is to keep track of what is holding access to the directory and fix the bug by releasing that access.
If the object/resource holding access is third-party, or for any reason can't be changed or accessed, it's time to revise the architecture and handle IO access in a different way.
Hope this helps.
Sounds like you are not releasing the file handle when the file is created. Try doing all of your IO within a using statement; that way the file will be released automatically when you are finished with it.
http://msdn.microsoft.com/en-us/library/yh598w02%28v=vs.80%29.aspx
I have seen cases where a virus scanner will scan the new file and prevent the file from being deleted, though that is highly unlikely.
Be sure to Dispose() of all IDisposable objects, and make sure that nothing has changed your Environment.CurrentDirectory to the directory you want to delete.

Lock a file while retaining the ability to read/append/write/truncate in the same thread?

I have a file containing, roughly speaking, the state of the application.
I want to implement the following behaviour:
When the application is started, lock the file so that no other application (or the user) will be able to modify it;
Read the previous application state from the file;
... do work ...
Update the file with a new state (which, given the format of the file, involves rewriting the entire file; the length of the file may decrease after the operation);
... do work ...
Update the file again
... do work ...
If the work fails (the application crashes), the lock is released, and the content of the file is left as it was after the previous unit of work.
It seems that, to rewrite the file, one should open it with the Truncate option; that means opening a new FileStream each time one wants to rewrite the file. So it seems the behavior I want could only be achieved in such a dirty way:
When the application is started, read the file, then open a FileStream with FileShare.Read;
When some work is done, close the handle opened previously, open another FileStream with FileMode.Truncate and FileShare.Read, write the data and flush the FileStream;
Repeat the previous step after each subsequent unit of work;
On Dispose, close the handle opened previously.
This approach has some disadvantages: extra FileStreams are opened; file integrity is not guaranteed between closing one FileStream and opening the next; and the code is much more complicated.
Is there any other way, lacking these disadvantages?
Don't close and reopen the file. Instead, use FileStream.SetLength(0) to truncate the file to zero length when you want to rewrite it.
You might (or might not) also need to set FileStream.Position to zero. The documentation doesn't make it clear whether SetLength moves the file pointer or not.
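A minimal sketch of that approach (path and newState are stand-ins for your own state handling):
// Hold one exclusive handle for the lifetime of the application.
using (var state = new FileStream(path, FileMode.OpenOrCreate, FileAccess.ReadWrite, FileShare.Read))
{
    // ... read the previous state, do work ...

    // Rewrite in place without reopening:
    state.SetLength(0);   // truncate
    state.Position = 0;   // be explicit about the file pointer
    state.Write(newState, 0, newState.Length);
    state.Flush();        // commit this unit of work to disk
}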
Why don't you take exclusive access to the file when the application starts, and create an in-memory cache of the file that can be shared across all threads in the process while the actual file remains locked at the OS level? You can use lock(memoryStream) to avoid concurrency issues. When you are done updating the local in-memory version of the file, just update the file on disk and release the lock on it.
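Sketched out, that might look like this (hypothetical names; stateFile is opened with FileShare.None at startup):
private static readonly MemoryStream memoryStream = new MemoryStream(); // shared in-memory copy
private static FileStream stateFile; // opened exclusively at startup

static void UpdateState(byte[] newState)
{
    lock (memoryStream) // serialize writers across threads
    {
        memoryStream.SetLength(0);
        memoryStream.Write(newState, 0, newState.Length);

        stateFile.SetLength(0);
        stateFile.Position = 0;
        stateFile.Write(newState, 0, newState.Length);
        stateFile.Flush();
    }
}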
Regards.

How can I reliably guarantee access to a file/folder?

-What is the most foolproof way of ensuring the folder or file I want to manipulate is accessible (not read-only)?
-I know I can use ACL to add/set entries (make the file/folder non-readonly), but how would I know if I need to use security permissions to ensure file access? Or can I just add this in as an extra measure and handle the exception/negative scenario?
-How do I know when to close or just flush a stream? For example, should I try to use the streams once in a method and then flush/close/dispose at the end? If I use Dispose(), do I still need to call Flush() and Close() explicitly?
I ask this question because constantly ensuring a file is available is a core requirement but it is difficult to guarantee this, so some tips in the design of my code would be good.
Thanks
There is no way to guarantee access to a file. I know this isn't a popular response but it's 100% true. You can never guarantee access to a file even if you have an exclusive non-sharing open on a Win32 machine.
There are too many ways this can fail that you simply cannot control. The classic example is a file opened over the network. Open it any way you'd like with any account, I'll simply walk over and yank the network cable. This will kill your access to the file.
I'm not saying this to be mean or arrogant. I'm saying this to make sure that people understand that operating on the file system is a very dangerous operation. You must accept that the operation can and will fail. It's imperative that you have a fallback scenario for any operation that touches disk.
-What is the most foolproof way of ensuring the folder or file I want to manipulate is accessible (not read-only)?
Opening them in write-mode?
Try to write a new file into the folder and catch any exceptions. Along with that, do the normal sanity checks, like whether the folder/file exists, etc.
You should never change the folder security in code, as the environment could drastically change and cause major headaches. Rather, ensure that the security is well documented and configured beforehand. Alternatively, use impersonation in your own code to ensure you are always running the required code as a user with full permissions to the folder/file.
Never call Dispose() unless you have no other choice. You always flush before closing the file or when you want to commit the content of the stream to the file/disk. The choice of when to do it depends on the amount of data that needs to be written and the time involved in writing the data.
100% foolproof way to ensure a folder is writable - create a file, close it, verify it is there, then delete it. A little tedious, but you asked for foolproof =)
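That probe might look like this (a minimal sketch; the method name is made up):
static bool IsFolderWritable(string folder)
{
    // Write a uniquely named temp file, verify it exists, then clean up.
    string probe = Path.Combine(folder, Path.GetRandomFileName());
    try
    {
        using (File.Create(probe)) { }
        bool exists = File.Exists(probe);
        File.Delete(probe);
        return exists;
    }
    catch (IOException) { return false; }
    catch (UnauthorizedAccessException) { return false; }
}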
Your better bet, which covers your question about ACL, is to handle the various exceptions if you cannot write to a file.
Also, I always call Close explicitly unless I need to read from a file before I'm done writing it (in which case I call Flush, then Close).
Flush() - Synchronizes the in-memory buffer with the disk. Call when you want to write the buffer to the disk but keep the file open for further use.
Dispose(bool) - Releases the unmanaged resource (i.e. the OS file handle) and, if passed true, also releases the managed resources.
Close() - Calls Dispose(true) on the object.
Also, Dispose flushes the data before closing the handle, so there is no need to call Flush explicitly (although it might be a good idea to flush frequently anyway, depending on the amount and type of data you're handling).
If you're doing relatively atomic operations to files and don't need a long-running handle, the "using" paradigm is useful to ensure you are handling files properly, e.g.:
using (StreamReader reader = new StreamReader("filepath"))
{
// Do some stuff
} // CLR automagically handles flushing and releasing resources
