How to avoid File Blocking - c#

We are monitoring the progress of a customized app (whose source is not under our control) which writes to an XML manifest. At times, the application gets stuck because it is unable to write to the manifest file. We are covering our traces by explicitly closing the file handle with File.Close and by creating the file variables in using blocks, but somehow it keeps happening. (Our application is multithreaded, and at most three threads might be accessing the file.)
Another interesting thing is that their app updates this manifest on three different events (adding items, deleting items, completion of items), but we only have trouble with one event (completion of items). My code is listed here:
using (var st = new FileStream(MenifestPath, FileMode.Open, FileAccess.Read))
{
    using (TextReader r = new StreamReader(st))
    {
        var xml = r.ReadToEnd();
        r.Close();
        st.Close();
        //................ Rest of our operations
    }
}

If you are only reading from the file, then you should be able to pass a flag to specify the sharing mode. I don't know how you specify this in .NET, but in WinAPI you'd pass FILE_SHARE_READ | FILE_SHARE_WRITE to CreateFile().
I suggest you check your file API documentation to see where it mentions sharing modes.
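In .NET, if I recall correctly, the equivalent is the FileShare parameter on the FileStream constructor; a minimal sketch, reusing the asker's MenifestPath variable:
// FileShare.ReadWrite roughly corresponds to FILE_SHARE_READ | FILE_SHARE_WRITE in CreateFile()
using (var st = new FileStream(MenifestPath, FileMode.Open, FileAccess.Read, FileShare.ReadWrite))
{
    // read as before
}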

Two things:
1) Do the rest of your operations outside the scope of the using statements. This way, you won't risk using the closed stream and reader. Also, you needn't call the Close methods, because when you exit the scope of a using statement, Dispose is called, which is equivalent.
2) Use the FileStream overload that takes a FileShare enumeration value. Locking is paranoid in nature, so the file may be locked automatically to protect you from yourself. :)
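Putting both points together, the snippet might look like this (a sketch; FileShare.ReadWrite assumes the third-party application only ever reads and writes the manifest and does not need exclusive access):
string xml;
using (var st = new FileStream(MenifestPath, FileMode.Open, FileAccess.Read, FileShare.ReadWrite))
using (TextReader r = new StreamReader(st))
{
    xml = r.ReadToEnd();   // no explicit Close calls; Dispose takes care of the handles
}
//................ Rest of the operations, now outside the using blocks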
HTH.

The problem is different because that person has full control over the file access for all processes, while, as I mentioned, ONE PROCESS IS THIRD PARTY WITH NO SOURCE ACCESS. Our applications are working fine; however, their application seems to get stuck if it can't get hold of the file. So I am looking for a method of file access that does not disturb their operation.

This could happen if one thread was attempting to read from the file while another was writing. To avoid this type of situation where you want multiple readers but only one writer at a time, make use of the ReaderWriterLock class or, in .NET 3.5 and later, the ReaderWriterLockSlim class in the System.Threading namespace.
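A minimal sketch of that pattern, assuming a shared lock field and hypothetical helper methods (note this coordinates only the threads inside your own process, not the third-party app):
using System.IO;
using System.Threading;

static readonly ReaderWriterLockSlim ManifestLock = new ReaderWriterLockSlim();

static string ReadManifest(string path)
{
    ManifestLock.EnterReadLock();            // multiple readers may hold the lock at once
    try { return File.ReadAllText(path); }
    finally { ManifestLock.ExitReadLock(); }
}

static void WriteManifest(string path, string xml)
{
    ManifestLock.EnterWriteLock();           // a writer waits until all readers have left
    try { File.WriteAllText(path, xml); }
    finally { ManifestLock.ExitWriteLock(); }
}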

Also, if you're using .NET 2.0+, you can simplify your code to just:
string xmlText = File.ReadAllText(ManifestFile);
See also: File.ReadAllText on MSDN.

Related

NLog - Allow other processes to read log file

Starting to use NLog. The main process (a Windows service) writes to the log file every few seconds. I need to allow another process (a desktop app) to read this file at arbitrary times (the desktop app doesn't require write access).
The problem, however, is that NLog probably creates an exclusive lock when it opens the file for writing. So if the desktop process tries to read while the file is locked, an exception is thrown.
How can I configure NLog to allow other processes to have read-only access to the log file contents even while the main process has it open for writing? The desktop process will call File.ReadAllText(), which I hope is safe for concurrent operations.
(I read through the docs and found that NLog even allows concurrent writing to a log file from different processes, so read-only access should be easier in theory. I can't see any solution, though.)
Instead of using File.ReadAllText() or File.ReadAllTextAsync(), which open the file without allowing shared write access and therefore fail with:
System.IO.IOException: The process cannot access the file '...' because it is being used by another process.
I suggest using FileShare.ReadWrite to avoid the failure while NLog is actively writing to the log file:
using (var f = new FileStream(path, FileMode.Open, FileAccess.Read, FileShare.ReadWrite))
using (var s = new StreamReader(f))
{
    fileContent = s.ReadToEnd();
}
This also avoids problems for the application that uses NLog to write the log file: reading with an exclusive lock would cause log writes to fail, log events to be lost, and performance to suffer.
Problem however is that NLog probably creates an exclusive lock when it opens the file for writing
No, it doesn't lock by default. There are two important settings:
ConcurrentWrites, this setting is default true:
concurrentWrites - Enables support for optimized concurrent writes to same log file from multiple processes on the same machine-host, when using keepFileOpen = true. By using a special technique that lets it keep the files open from multiple processes. If only single process (and single AppDomain) application is logging, then it is faster to set to concurrentWrites = False. Boolean Default: True. Note: in UWP this setting should be false
Also, there is a keepFileOpen setting, which defaults to false:
keepFileOpen - Indicates whether to keep log file open instead of opening and closing it on each logging event. Changing this property to true will improve performance a lot, but will also keep the file handle locked. Consider setting openFileCacheTimeout = 30 when enabling this, as it will allow archive operations and react to log file being deleted. Boolean Default: False
See the docs; there are also more settings like concurrentWriteAttemptDelay, concurrentWriteAttempts, etc.
Last but not least, if you're locking the file for too long, maybe copy it first and then read the copy from your application?
Since the release of version 5.0, keepFileOpen is by default set to true. Reference
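For reference, a rough sketch of setting these options from code; the target name and file name are placeholders, and the property names follow the NLog FileTarget documentation:
var config = new NLog.Config.LoggingConfiguration();
var fileTarget = new NLog.Targets.FileTarget("logfile")
{
    FileName = "service.log",
    KeepFileOpen = true,       // the default since NLog 5.0; keeps the handle open for performance
    ConcurrentWrites = false   // only one process writes this file, so the multi-process mode isn't needed
};
config.AddRule(NLog.LogLevel.Info, NLog.LogLevel.Fatal, fileTarget);
NLog.LogManager.Configuration = config;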

Can logging to a text file in an ASP.NET web server cause locks

This is my code:
var outStream = new FileStream(Server.MapPath("~\\bin\\error.txt"), FileMode.Append, FileAccess.Write, FileShare.ReadWrite);
As you see, FileShare is both read and write.
So far I have not seen any issues. Are you aware of possible issues using this method? (I do not want to log errors to the Windows event log.)
Not 100% sure I understand what you are asking, but any time something writes to a file, the file is locked by that application for the period of time it takes to complete the write. If another application writes to the same file at times, there is potential for conflict. If both applications are coded to handle conflicts of this nature, then all will be fine.
So if you are coding to handle this situation, put the write call in a try block inside a while loop. If an error occurs while writing, stay in the while loop, pause for a second or so, and try again. Once the write succeeds, break out of the loop. You may also want to add a counter and limit the number of tries.
Just be aware that if you do this, the thread will sit there until the write succeeds, so if you don't want this to hold up your application, it needs to be done on another thread.
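A sketch of that retry pattern; the path, message, delay, and attempt limit are all placeholders:
int attempts = 0;
while (true)
{
    try
    {
        File.AppendAllText(logPath, message + Environment.NewLine);
        break;                      // the write succeeded, leave the loop
    }
    catch (IOException)             // the file is busy: wait and try again
    {
        if (++attempts >= 5) throw; // give up after a limited number of tries
        Thread.Sleep(1000);
    }
}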
I think you would just want to take a lock on a static object that wraps the file I/O. This static object would ensure the file updates are thread-safe.
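Something along these lines (a sketch; note that a lock like this only serializes writes within a single process/AppDomain):
static readonly object LogFileLock = new object();

static void WriteError(string path, string text)
{
    lock (LogFileLock)   // only one thread writes at a time
    {
        File.AppendAllText(path, text + Environment.NewLine);
    }
}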
With respect to Mr. Hinkle, I had always heard one would not want multiple try/catch failures inside a while loop as a matter of design -- unless I'm missing something. I can see where this would work practically, but I always thought exceptions should not become part of the "main flow" in this way.

Reading a file without causing access denial to other processes

I've been thinking about writing a small specialized backup app, similar to newly introduced file history in Windows 8. The basic idea is to scan some directories every N hours for changed files and copy them to another volume. The problem is, some other apps may request access to these files while they are being backed up and get an access denial, potentially causing all kinds of nasty problems.
As far as I can tell, there are several approaches to this problem:
1) Using Volume Shadow Copy service
From my point of view, the future of this thing is uncertain, and its overhead during heavy IO loads may cripple the system.
2) Using Sharing Mode when opening files
Something like this mostly works...
using (var stream = new FileStream("test.txt", FileMode.Open, FileAccess.Read,
FileShare.Delete | FileShare.ReadWrite | FileShare.Read | FileShare.Write))
{
[Copy data]
}
... until some other process requests access to the same file without FileShare.Read, at which point an IOException will be thrown.
3) Using Opportunistic Lock that may be "broken" by other (write?) requests.
This behaviour of FileIO.ReadTextAsync looks exactly like what I want, but it also looks very implementation-specific and may change in the future. Does anyone know how to explicitly oplock a file locally via C# or C++?
Maybe there is some simple C# method like File.TryReadBytes that provides such "polite" reading? I'm interested in solutions that will work on Windows 7 and above.
My vote's on VSS. The main reason is that it doesn't interfere with other processes modifying your files, thus it provides consistency. A possible inconsistency pretty much defeats the purpose of a backup. The API is stable and I wouldn't worry about its future.

Lucene.Net writing/reading synchronization

Can I write new documents into the index (with IndexWriter) while it is open for reading (with IndexReader)? Or must I close reading before writing?
Can I read/search documents in the index (with IndexReader) while it is open for writing (with IndexWriter)? Or must I close writing before reading?
Is Lucene.Net thread-safe or not? Or must I write my own synchronization?
You may have any number of readers/searchers open at any time, but only one writer. This is enforced by a directory-specific lock, usually involving a file named "write.lock".
Readers open snapshots, and writers add more data to the index. Readers need to be opened or reopened (IndexReader.Reopen) after your writer has committed (IndexWriter.Commit) the data for it to be seen, unless you're working with near-real-time searches. Those involve a special reader returned from IndexWriter.GetReader which is able to see content up to the time the call to GetReader was executed. It also means that the reader may see data that will never be committed because application logic calls IndexWriter.Rollback.
Searchers use readers, so the same limitations apply to them. (You can have an unlimited number of them, and they can only see what's already committed, unless they're based on a near-real-time reader.)
Lucene is thread-safe, and best practice is to share readers and searchers between several threads, while checking that IndexReader.IsCurrent() == true. You could have a background thread that reopens the reader once it detects changes, creates a new searcher, and then lets the main threads use it. This would also allow you to prewarm any FieldCache you use to increase search speed once the new searcher is in place.
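A rough sketch of that reopen pattern, assuming the Lucene.Net 2.9/3.x API (IndexReader.Open, Reopen, IsCurrent); directory is a placeholder, and the swap is left unsynchronized for brevity:
IndexReader reader = IndexReader.Open(directory, true);   // read-only snapshot of the index
IndexSearcher searcher = new IndexSearcher(reader);

// Call periodically from a background thread.
void RefreshSearcherIfStale()
{
    if (!reader.IsCurrent())                   // the writer has committed since our snapshot
    {
        var newReader = reader.Reopen();       // cheap: unchanged segments are reused
        if (newReader != reader)
        {
            var oldSearcher = searcher;
            searcher = new IndexSearcher(newReader);
            reader = newReader;
            oldSearcher.Dispose();             // in real code, let in-flight searches finish first
        }
    }
}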
As I found in this mailing list:
Lucene.NET is thread-safe. So you can share the same instance of IndexWriter or IndexSearcher among threads. Using a write lock, it also prevents a second IndexWriter instance from being opened on the same index.
As I see it, I can write and read separately; I'll check it ;)

Multiple Threads reading from the same file

I have an XML file that needs to be read from many, many times. I am trying to use Parallel.ForEach to speed this process up, since the order in which the data is read doesn't matter. The data is just being used to populate objects. My problem is that even though I open the file in each thread as read-only, it complains that the file is open by another program. (I don't have it open in a text editor or anything. :))
How can I accomplish multi reads from the same file?
EDIT: The file is ~18 KB, pretty small. It is read about 1,800 times.
Thanks
If you want multiple threads to read from the same file, you need to specify FileShare.Read:
using (var stream = File.Open("theFile.xml", FileMode.Open, FileAccess.Read, FileShare.Read))
{
    ...
}
However, you will not achieve any speedup from this, for multiple reasons:
Your hard disk can only read one thing at a time. Although you have multiple threads running at the same time, these threads will all end up waiting for each other.
You cannot easily parse a part of an XML file. You will usually have to parse the entire XML file every time. Since you have multiple threads reading it all the time, it seems that you are not expecting the file to change. If that is the case, then why do you need to read it multiple times?
Depending on the size of the file and the type of reads you are doing it might be faster to load the file into memory first, and then provide access to it directly to your threads.
You didn't provide any specifics on the file, the reads, etc., so I can't say for sure whether it would address your specific needs.
The general premise would be to load the file once in a single thread, and then provide access to it, either directly (via the XML structure) or indirectly (via XmlNodes, etc.), to each of your threads. I envision something similar to:
Load the file
For each XPath query, dispatch the matching nodes to your threads.
If the threads don't modify the XML directly, this might be a viable alternative.
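Roughly, as a sketch; the XPath query and the per-node processing are made up for illustration:
using System.Linq;
using System.Threading.Tasks;
using System.Xml;

var doc = new XmlDocument();
doc.Load("theFile.xml");                   // read the file from disk once
var nodes = doc.SelectNodes("//item");     // hypothetical XPath query

Parallel.ForEach(nodes.Cast<XmlNode>(), node =>
{
    // Read-only work per node; do not modify the shared document from here.
    var value = node.InnerText;
    // populate your objects from value
});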
When you open the file, you need to specify FileShare.Read:
using (var stream = new FileStream("theFile.xml", FileMode.Open, FileAccess.Read, FileShare.Read))
{
    ...
}
That way, the file can be opened multiple times for reading.
While this is an old post, it seems to be a popular one, so I thought I would add a solution I have used to good effect in multi-threaded environments that need read access to a file. The file must, however, be small enough to hold in memory, at least for the duration of your processing, and it must only be read, not written to, during the period of shared access.
string FileName = "TextFile.txt";
string[] FileContents = File.ReadAllLines(FileName);

foreach (string strOneLine in FileContents)
{
    // Do work on each line of the file here
}
So long as the file is only being read, multiple threads or programs can access and process it at the same time without treading on one another's toes.
