File.Open hangs and freezes the thread when accessing a local file - C#

I'm currently using file streams to copy files from one location to another.
It all worked as intended until now, when I suddenly ran into the problem that File.Open freezes the thread it is running in:
FileStream sourceStream = File.Open(filePath, FileMode.Open);
It only happens for one specific file (3 GB in size). Interestingly, just one day earlier it worked fine for this very file, so it can't be the file size alone. Next I checked whether some sort of exception was thrown that I wasn't catching.
I put a try/catch block around the whole thing (normally I let the calling method catch the exceptions), with the same effect:
try
{
    FileStream sourceStream = File.Open(filePath, FileMode.Open);
    sourceStream.Close();
}
catch (Exception e)
{
    Console.Write("A");
}
I also checked what happens when the file is already being accessed: then an exception is thrown. (I tested this with other files, since, as I said, for this specific file the thread now always hangs when I try to open it.)
The file is located on the local hard drive, and other (smaller) files in the same folder don't show this problem.
As I'm now running out of ideas as to what the cause could be, my question is:
What could possible reasons for this unexpected behaviour be, and how can they be averted?
EDIT:
It now works again (it started working again just as I tried to watch it with Process Monitor).
So, in total, I have no clue what caused the phenomenon. If anyone has an idea what a possible reason could be, it would be good to know, so that a repeat of the problem can be avoided in the future.
Also of note, as one answer brought it up: before the File.Open I have a using block with:
using (var stream = new BufferedStream(File.OpenRead(filePath), 1024 * 1024))
{
    // ...do calculations
}
I use this to do some hash calculations on the file. This one had no issues at all opening the file; only the later File.Open did.
Edit:
I've just received information from the sysadmins here that shines a new light on the problem:
The system is set up so that the whole machine is backed up from time to time, file by file, without the OS having any knowledge of it. This means that, for a file currently being backed up, the OS thinks the file is there and that nobody is accessing it, when in reality it is being read by the backup process and cannot be accessed from within the OS (according to how they described the backup process; and since the OS doesn't know about the backup, nothing showed up in the disk-activity view or in Task Manager).
So with that information, it could be that, since the OS didn't know the file was in use, it tried to access it (through the open call) and waited and waited for a read that never happened, because the file was not actually accessible.
It would then have had to run into a timeout, which File.Open doesn't have (at least that's my guess, given the new information, if I understood the sysadmins correctly).
Thanks

A couple of possible reasons:
Your antivirus. It hooks into the OS and replaces the I/O functions with its own. When you open a file, it can actually perform a virus scan before returning control to your application. A bad signature update could have forced the AV to scan your 3 GB file, and a subsequent update could have fixed the problem.
A bad sector on your drive. This usually makes I/O perform very poorly. Your system could have since remapped the bad sector, so performance went back to normal. You can run chkdsk /R to check for bad sectors.
Another app that locks the file, though I'd rather expect an exception in that case.

The problem stemmed not from C# or Windows, but from how the PC itself was set up.
In this case it was set up so that the files I tried to read could be inaccessible (because they were being backed up) WITHOUT the OS of the local PC knowing it.
So the OS thought the file was accessible, and C# received that answer from the OS when it tried to open the file. And since file operations in C# use their Windows equivalents, and those have no timeouts, the whole operation hung until the file backup was finished.
In retrospect I would say: Lucas Trzesniewski's answer should cover most situations where the freeze happens; my own problem wasn't covered by it only because of the special situation that caused it in the end.

Are you absolutely sure that the freeze always occurs in File.Open()?
Given the absence of exceptions, the problem may be at a lower level. When you experienced it, did you try to open the file with a hex editor or some other tool, to check that it is actually entirely readable? It could be a problem accessing a certain area of the hard drive.
Try specifying the access mode with FileAccess if you need read-only, write-only, etc.
See also this post on the actual usefulness of BufferedStream.

Have you checked File.Open() with explicit FileAccess and FileShare values?
I think it's a file-locking issue.
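For example, a read-only open that tolerates other readers could look like this (a sketch; filePath stands for the path being opened):
// Open read-only and allow concurrent readers, so a process that
// merely reads the file does not conflict with this open.
using (FileStream sourceStream = File.Open(filePath, FileMode.Open, FileAccess.Read, FileShare.Read))
{
    // ... read or copy the file here
}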

I had a similar issue where File.Open would sometimes hang when checking whether a file is locked.
I solved it like this:
public async Task<bool> IsLocked(FileInfo file)
{
    // Run the open attempt on the thread pool so it can be timed out.
    var checkTask = Task.Run(() =>
    {
        try
        {
            // Exclusive open: throws if any other handle is open on the file.
            using (file.Open(FileMode.Open, FileAccess.Read, FileShare.None)) { }
            return false;
        }
        catch (Exception)
        {
            return true;
        }
    });
    var delayTask = Task.Delay(1000);
    var firstTask = await Task.WhenAny(checkTask, delayTask);
    if (firstTask == delayTask)
    {
        // The open attempt itself hung: treat the file as locked.
        return true;
    }
    else
    {
        return await checkTask;
    }
}
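Calling it is then straightforward (a sketch; filePath is illustrative):
bool locked = await IsLocked(new FileInfo(filePath));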

Related

C# Directory.Move access denied error

This is a bit of a tricky one, and hopefully I can gain some insight into how the C# built-in Directory.Move function works (or should work). I've written a program that puts a list of folder names older than a specific date into a DirectoryInfo list, which it iterates over to move each folder elsewhere.
foreach (DirectoryInfo temp in toBeDeleted)
{
    filecheck.WriteLine(temp.Name);
    Directory.Move(temp.FullName, @"T:\Transactiondeletions\" + counter + "\\" + temp.Name);
}
Here temp.FullName is something like T:\UK\DATA\386\trans\12345678.16.
However when I run the program I hit an access denied error.
T: in this case is a mapped drive, something like \\10.11.12.13\Data2$.
I have another mapped drive, U:, which points to \\10.11.12.13\Data3$ on the same IP and has the exact same directory structure.
The kicker is that my program works just fine on the U: drive but not on the T: drive. I've tried both the drive letter and the actual full path with the IP in my code, and it still works fine on the U: drive but not on the T: drive.
On the T: drive, whenever my program tries to move a folder, it hits "access denied".
However it works fine when:
I move the folder manually by hand
I use a directory copy + Directory.Delete instead of Directory.Move
Any ideas? I can't figure out why it won't work here even though I can move the folders manually. I've tried running the .exe manually, as admin, and as a colleague as well, but the result is the same.
I thought it might be related to a StreamWriter (filecheck) still being open, but I've already tried moving this part of the code to after I close the StreamWriter; it hits the same error, so I've excluded that possibility.
Any advice would be greatly appreciated, and I'll be happy to provide any further information if necessary.
I still have no solution for the Directory.Move operation not working. However, I've been able to work around the problem by going into the directory, using File.Move to move all the files elsewhere, and then using Directory.Delete to delete the original directory. For some reason it works this way. But it will do! A sketch of the workaround follows.
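Roughly (reusing temp from the question; destinationDir is a hypothetical target path):
// Move the files one by one, then remove the now-empty directory.
foreach (string file in Directory.GetFiles(temp.FullName))
{
    File.Move(file, Path.Combine(destinationDir, Path.GetFileName(file)));
}
Directory.Delete(temp.FullName, true);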
There may be two reasons for this exception. First: the file is locked by a different process, e.g. Windows Explorer. That is a legitimate exception and you have to deal with it accordingly. Second: the file is locked by the same process, and by "the same process" I mean any thread of it. In my opinion it is a bug that this throws the same exception as the first case. Looking deeper, the second case branches further: the same process may have another stream open on it in a different thread, or the lock may be held by the very thread calling Move. In the first branch I would still want a more precise exception; in the second, the issue is rooted in the Windows kernel. Long story short: the OS does not always seem to have enough time to release I/O locks, even ones held by the same thread, following a previous file/folder operation.
To verify my claim, have a look at the System.IO.Directory.InternalMove method in the .NET reference source. Near the end of that method there is a call to Win32Native.MoveFile, which is the source of the exception. Right there is the comment // This check was originally put in for Win9x.. That shows how long such legacy checks survive, and that there is no feasible in-framework fix for this issue.
I had a few workarounds: 1. Don't use Move; use Copy plus delete of the source. 2. Wrap the Move call in an I/O utility method containing a do/while loop around a try/catch block containing the Move call. Remember, we are only addressing the bug where the same thread (or the same process) holds the lock, so specify a timeout exit condition after some number of Thread.Sleep(x) calls, in case the file really is held by another process. A sketch of such a wrapper follows.
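A retry wrapper along the lines of workaround 2 might look like this (a sketch; the retry count and delay are arbitrary):
public static void MoveWithRetry(string source, string destination)
{
    const int maxAttempts = 10;
    for (int attempt = 1; ; attempt++)
    {
        try
        {
            Directory.Move(source, destination);
            return; // success
        }
        catch (IOException) when (attempt < maxAttempts)
        {
            // The lock may be held transiently (possibly by our own process);
            // wait briefly and retry. After maxAttempts the exception propagates.
            Thread.Sleep(100);
        }
    }
}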

Can logging to a text file in an ASP.NET web server cause locks

This is my code:
var outStream = new FileStream(Server.MapPath("~\\bin\\error.txt"), FileMode.Append, FileAccess.Write, FileShare.ReadWrite);
As you can see, FileShare allows both read and write.
So far I have not seen any issues. Are you aware of possible issues with using this method? (I do not want to log errors to the Windows event log.)
Not 100% sure I understand what you are asking, but any time something writes to a file, the file is locked by that application for however long the write takes to complete. If another application writes to the same file at times, there is potential for conflict. If both applications are coded to handle conflicts of this nature, all will be fine. So if you are coding to handle this situation, you would put the write method in a try block inside a while loop: if an error occurs while writing, stay in the loop, pause for a second or so, and try again; once the write is successful, break out of the loop. You may also want a counter to limit the number of tries. A sketch of that pattern is below.
Just be aware that if you do this, the thread will sit there until the write is successful, so if you don't want this to hold up your application, it needs to be done on another thread.
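A sketch of that pattern (the retry limit, delay, path, and message are illustrative):
int attempts = 0;
while (true)
{
    try
    {
        using (var outStream = new FileStream(logPath, FileMode.Append, FileAccess.Write, FileShare.ReadWrite))
        using (var writer = new StreamWriter(outStream))
        {
            writer.WriteLine(message);
        }
        break; // the write succeeded
    }
    catch (IOException)
    {
        if (++attempts >= 5) throw; // give up after a few tries
        Thread.Sleep(1000);         // pause, then try again
    }
}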
I think you would just want to take a lock on a static object that wraps the file I/O. The static object ensures the file updates are thread-safe within the process.
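A minimal sketch of that idea (the member and method names are illustrative):
private static readonly object FileLock = new object();

public static void LogError(string path, string line)
{
    lock (FileLock) // serializes all writers within this process
    {
        File.AppendAllText(path, line + Environment.NewLine);
    }
}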
With respect to Mr. Hinkle, I had always heard that one should not have multiple try/catch failures inside a while loop as a matter of design, unless I'm missing something. I can see that it would work in practice, but I always thought exceptions should not become part of the "main flow" in this way.

Error: The process cannot access the file '...' because it is being used by another process

I have a function that always creates a directory and puts some files (images) into it.
The first time the code runs, no problem. The second time (always), it gets an error when it has to delete the directory (because I want to recreate it to put the images in). The error is "The process cannot access the file '...' because it is being used by another process". The only process that accesses these files is this function.
It's as if the function "doesn't let go" of the files.
How can I resolve this cleanly?
Here a part of the code:
string strPath = Environment.CurrentDirectory + "\\sessionPDF";
if (Directory.Exists(strPath))
    Directory.Delete(strPath, true); // Here I get the error
Directory.CreateDirectory(strPath);
// Then I put the files in the directory
If your code or another process is serving up the images, they will be locked for an indefinite amount of time. If it's IIS, they're locked for a short time while being served. I'm not sure about this, but if Explorer is creating thumbnails for the images, it may lock the files while it does that. It may be for a split second, but if your code and that process collide, it's a race condition.
Be sure you release your locks when you're done. If the class implements IDisposable (e.g. Bitmap, Stream, FileStream), wrap a using statement around it if you're not doing extensive work on that object:
using (var bitmap = new Bitmap(imagePath))
{
    // ... work with the bitmap
}
...which will close the object afterwards, so the file is not left locked.
Just going out on a limb here without seeing the code that dumps the files, but if you're using FileStream or Bitmap objects, I would double-check that you are properly disposing of all of those objects before running the method a second time.
The only clean solution in this case is to keep track of what is holding access to the directory and fix the bug by releasing that access.
If the object/resource holding access is third-party, or otherwise impossible to change or access, it's time to revise the architecture and handle the I/O access in a different way.
Hope this helps.
Sounds like you are not releasing the file handle when the file is created. Try doing all of your I/O within a using statement; that way the file will be released automatically when you are finished with it.
http://msdn.microsoft.com/en-us/library/yh598w02%28v=vs.80%29.aspx
I have seen cases where a virus scanner scans a new file and prevents the file from being deleted, though that is highly unlikely.
Be sure to Dispose of all IDisposable objects, and make sure nothing has changed your Environment.CurrentDirectory to the directory you want to delete.

Should I reuse a FileStream/BinaryWriter object?

Update: After looking in the event log at around the time this occurred, I get the message: "The server was unable to allocate from the system nonpaged pool because the pool was empty." repeated continually throughout the log, until it was rebooted.
I am writing a class that writes debugging information to a file. Up until now the class has worked fine; however, I am now starting to stress-test my application (by running it at 1000x normal speed), and this has caused an unusual error.
The problem I am seeing is that after a long period of time (4+ hours) my application crashes and seems to take Windows out with it; I can no longer open Windows Explorer or any other application. A system reboot solves the issue, but afterwards the file I was writing to is blank.
This makes me think the issue is related to open file handles; perhaps Windows is somehow reaching its limit of open file handles?
So here comes the related question. Below is the main function that writes data to the file. As you can see, FileStream and BinaryWriter objects are created on each call to this function, wrapped in using statements to ensure they are properly closed/disposed.
/// <summary>
/// This is called after changing any
/// stats data, or on initial startup.
/// It saves the current stats to file.
/// </summary>
public void UpdateStatsData()
{
    lock (this.lockObject)
    {
        using (FileStream fileStream = new FileStream(Constants.StatsFile, FileMode.Create, FileAccess.Write, FileShare.None, 128, FileOptions.WriteThrough))
        {
            using (BinaryWriter binWriter = new BinaryWriter(fileStream))
            {
                binWriter.Write(this.serverStats.APM);
                binWriter.Write(this.serverStats.AverageJackpotWin);
                binWriter.Write(this.serverStats.AverageWinnings);
                binWriter.Write(this.serverStats.NumberOfGamesPlayed);
                binWriter.Write(this.serverStats.NumberOfJackpots);
                binWriter.Write(this.serverStats.RunningPercentage);
                binWriter.Write(this.serverStats.SiteID);
                binWriter.Write(this.serverStats.TotalJackpotsValue);
                binWriter.Write(this.serverStats.TotalStaked);
                binWriter.Write(this.serverStats.TotalWinnings);
            }
        }
    }
}
Is it possible that this function, when called very rapidly, could cause file handles to slowly build up and eventually exceed Windows' maximum?
A possible alternative involves making the FileStream and BinaryWriter objects private member variables of the class, creating them in the constructor, and then overwriting the data on each call:
/// <summary>
/// This should be called after changing any
/// stats data, or on initial startup.
/// It saves the current stats to a serialized file.
/// </summary>
public void UpdateStatsData()
{
    lock (this.lockObject)
    {
        // Seek to the beginning of the file.
        this.binWriter.BaseStream.Seek(0, SeekOrigin.Begin);
        // Write the stats data over the existing data.
        this.binWriter.Write(this.serverStats.APM);
        this.binWriter.Write(this.serverStats.AverageJackpotWin);
        this.binWriter.Write(this.serverStats.AverageWinnings);
        this.binWriter.Write(this.serverStats.NumberOfGamesPlayed);
        this.binWriter.Write(this.serverStats.NumberOfJackpots);
        this.binWriter.Write(this.serverStats.RunningPercentage);
        this.binWriter.Write(this.serverStats.SiteID);
        this.binWriter.Write(this.serverStats.TotalJackpotsValue);
        this.binWriter.Write(this.serverStats.TotalStaked);
        this.binWriter.Write(this.serverStats.TotalWinnings);
    }
}
However, while this may be quicker and mean using only one FileStream, how do I ensure that the FileStream and BinaryWriter are closed/disposed properly on application shutdown?
The combination of parameters to the FileStream constructor looks suspect to me (assuming that all threads log to the same file, Constants.StatsFile):
FileMode.Create = always create the file, overwriting it if it exists. You are deleting all previous logs on each entry into this method (try OpenOrCreate or Append instead).
FileOptions.WriteThrough = no caching; this forces the disk to spin and the thread to wait for the disk. Slow.
My guess: you are calling this method much more quickly than it can complete. Each call backs up on the lock statement, waiting for the previous call to delete the file, write to it, and completely flush it to disk. After a while you simply run out of resources.
Assuming you didn't intend to delete the log file each time, try this combination and see if things get better; at a minimum, get rid of WriteThrough, as that will make this method much faster:
using (FileStream fileStream = new FileStream(Constants.StatsFile, FileMode.Append,
    FileAccess.Write, FileShare.None, 128, FileOptions.SequentialScan))
Running out of non-paged pool memory is a very serious mishap in Windows. Nothing good happens after that: drivers will fail to do their job, and a reboot is required to recover.
Of course, it isn't normal for a user-mode program (a managed one at that) to cause this. Windows protects itself by giving a process a limited quota of the available system resources. There are many such limits; a limit of 10,000 handles is an obvious one that strikes pretty often when a program leaks handles.
Memory from the non-paged pool is allocated exclusively by drivers. They need that kind of precious memory because they use it at device-interrupt time, a critical time when it isn't possible to map memory from the paging file. The pool is small; it needs to be, because it permanently occupies RAM. Its size depends on the amount of RAM in your machine, typically 256 MB max for a machine with 1 GB of RAM. You can see its current size in TaskMgr.exe, on the Performance tab. I'm giving my machine a decent workout right now, and it is currently showing 61 MB.
Clearly your program is making a driver on your machine consume too much non-paged pool memory. Or the driver is leaking, possibly induced by the heavy workout you are giving it. Windows is powerless to prevent this: quotas are associated with processes, not drivers. You'll have to find the driver that misbehaves. It will be one associated with the file system or the disk. A very common one that causes trouble like this is, as you've probably guessed by now, your virus scanner.
Most of this code looks fine to me -- you should have no problem re-creating the FileStreams like you are.
The only thing that jumps out at me is that your lockObject is not static. That's potentially a big problem: multiple instances of the class will not block each other, which means you might be running into a strange condition caused by multiple threads running the same code at the same time. Who knows, under load you could be creating thousands of open file handles all at the same time.
I see nothing wrong with the first version in terms of handle closure. I do with the second: specifically, the very issues you ask about. You could make your class disposable and then ideally close it during a controlled shutdown, while depending on the file object's finaliser to take care of matters during an exceptional shutdown, but I'm not sure you're fixing the right issue.
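A sketch of the disposable approach, reusing the writer from the second version (the class name is hypothetical):
public sealed class StatsWriter : IDisposable
{
    private readonly BinaryWriter binWriter;

    public StatsWriter(string path)
    {
        // BinaryWriter takes ownership of the stream and closes it on Dispose.
        binWriter = new BinaryWriter(
            new FileStream(path, FileMode.Create, FileAccess.Write, FileShare.None));
    }

    public void Dispose()
    {
        // Call this during controlled shutdown; the underlying SafeFileHandle's
        // finalizer covers an exceptional shutdown.
        binWriter.Dispose();
    }
}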
What measurements of open file handles confirm your suspicion that this is the issue? It's reasonable to suspect open file handles when you are indeed opening lots of files, but it's foolish to "fix" that unless either (a) examining the code shows it will obviously have this problem (not the case here), or (b) you've shown that the handle count really is too high.
Does the app leave an exception in the Event Viewer when it crashes?

Error creating an XmlWriter when file already exists

In my code, I have these lines:
XmlWriterSettings writerSettings = new XmlWriterSettings();
writerSettings.Indent = true;
XmlWriter writer = XmlWriter.Create(filename, writerSettings);
document.Save(writer);
This works fine when filename does not exist. But when it does exist, I get this error (on the third line, not the fourth):
System.IO.IOException: Sharing violation on path [the file path]
I want to overwrite the file if it already exists. How do I do this?
If you look carefully at the IOException, it says it's a "sharing violation". This means that while you are trying to access this file, another program is using it. Usually this is not much of a problem when reading, but with writing to files it can happen quite a lot. You should:
Try to find out whether some other program is using this file, what the program is, and why it's doing so. It's possible that some program (especially one written in a language without good resource-management facilities) accessed the file and then did not close the I/O stream, thus locking up the file. There are also some utilities (if my memory serves me correctly) that let you see which processes are using a certain file; just google it.
There's a possibility that while debugging your program you killed the process (I do that sometimes), and an I/O stream may not have been closed. For this, the easiest fix (as far as I know) is just a reboot.
Alternatively, the issue may be coming from your own code. As you're writing in C#, garbage collection along with the I/O classes usually prevents such problems, but you might have forgotten to close a file stream somewhere. I do this sometimes, and it can take quite a while to find the location of the bug, even though the fix is nearly instant. If you step through your program and use watches to keep track of your I/O operations, it should be relatively simple to find such a bug.
Good luck!
The problem isn't that the file exists, but that it is in use by a different program (or by your own program). If it were simply that the file existed, it would be overwritten and cause no exception.
If it's your own program that created the already-existing file, it's likely that you haven't properly disposed of the object that created it, so the file is still open.
Try using the overload of XmlWriter.Create that accepts a Stream, and pass in a FileStream from File.Create(filename):
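For example (a sketch; File.Create truncates an existing file, which gives the overwrite behaviour you want):
XmlWriterSettings writerSettings = new XmlWriterSettings { Indent = true };
using (FileStream stream = File.Create(filename))
using (XmlWriter writer = XmlWriter.Create(stream, writerSettings))
{
    document.Save(writer);
}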
