I have a log file for my application, and a side application does some manipulation on that log file.
At the end of the manipulation I want to delete the file, which is not possible because the file is in use, so I just want to empty it instead and delete its contents.
I tried:
using (FileStream stream = File.Open(query.FileName, FileMode.Open, FileAccess.ReadWrite, FileShare.ReadWrite))
{
using (StreamWriter writer = new StreamWriter(stream))
{
writer.Write("");
writer.Close();
}
}
and:
using (FileStream stream = File.Open(query.FileName, FileMode.Open, FileAccess.ReadWrite, FileShare.ReadWrite))
{
stream.SetLength(0);
stream.Close();
}
and:
System.IO.File.WriteAllText(@"path", string.Empty);
Nothing works.
How can I overwrite the file's contents?
Not at all. It will crash, because the file is being used by another process.
Well, it probably isn't another process, it probably is yours. You'll have to do this after closing the log file. But there's a more general fix for that. Go back to the code that creates the log file and add the FileShare.Delete option.
This option allows deleting files that are in use. You can now simply use File.Delete() in your code, even if the log file is still opened. This will put the file in a "delete pending" state, anybody that tries to open it will be slapped with an access denied error. The file on the disk will automatically disappear when the last handle to the file is closed.
Yet another useful option is FileOptions.DeleteOnClose. Now it is completely automatic: the file is deleted without you having to do anything at all. I can't tell which one is best in your case; you probably want to avoid deleting the log file when your program crashes, so FileShare.Delete is best.
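For illustration, here is a minimal sketch of both options (the file names and the trivial log-writing code are made up, since the question doesn't show how the log file is created):
using System;
using System.IO;

// Option 1: open the log with FileShare.Delete so File.Delete() succeeds while it is open.
using (var log = new FileStream("app.log", FileMode.Create, FileAccess.Write,
    FileShare.Read | FileShare.Delete))
using (var writer = new StreamWriter(log))
{
    writer.WriteLine("log entry");
    writer.Flush();

    // This now succeeds even though the stream is still open; the file goes into the
    // "delete pending" state and disappears when the last handle is closed.
    File.Delete("app.log");
}

// Option 2: FileOptions.DeleteOnClose - the file is removed automatically when the
// stream is closed, with no explicit File.Delete() call.
using (var temp = new FileStream("temp.log", FileMode.Create, FileAccess.Write,
    FileShare.Read, 4096, FileOptions.DeleteOnClose))
using (var writer = new StreamWriter(temp))
{
    writer.WriteLine("temporary log entry");
}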
You can use the code below too:
System.IO.File.WriteAllText(@"Path of your file", string.Empty);
I have a log file that is open and in use by another program, and I read its contents with the following:
Dim strContents As String
Dim x As New FileStream(FullPath, FileMode.Open, FileAccess.Read, FileShare.ReadWrite)
Dim objReader As StreamReader
objReader = New StreamReader(x)
strContents = objReader.ReadToEnd()
objReader.Close()
This works for reading the text from the file while it is still in use by the other program, however, immediately after this I need to truncate the text file (without deleting it) so that it is blank again. But the file will still be in use by the other program.
I tried
Dim sWrite As StreamWriter
sWrite = New System.IO.StreamWriter(FullPath, False)
sWrite.Write("")
sWrite.Close()
But I get the "in use by another application" exception. I've tried searching Stack Overflow and Googling, but I can't seem to find an answer, and I can't find a way to do this with file streams either, or I would try to use
Dim fs As New FileStream(FullPath, FileMode.Open, FileAccess.Write, FileShare.ReadWrite)
Thanks in advance.
There are a fair few other posts on this topic.
Detecting whether a file is locked by another process (or indeed the same process)
How to check for file lock?
Can I simply 'read' a file that is in use?
The solution appears to be:
Try
'Code to read the file if it's not locked by another app
Catch ex As System.IO.IOException
'The file is locked; handle the error or retry here
End Try
Also, just an FYI: your log file is an unmanaged resource, so deterministic finalization will help with this resource contention issue. Use the Using statement for deterministic finalization, e.g.:
Using fs = New FileStream(path, FileMode.Open, FileAccess.ReadWrite, FileShare.ReadWrite)
fs.SetLength(0)
file.Save(fs) ' "file" here stands for whatever object you are persisting (e.g. an XML document)
fs.Flush()
End Using
That's because the other program has a lock on the file; the operating system won't let you do what you want to do unless you change the other program so it doesn't lock the file for writing.
Unfortunately, the solution to the problem is not as simple as adding or modifying a few lines of code, regardless of the language being used. You are describing a classic example of resource contention.
This type of problem is best solved using an access manager, i.e. another process that arbitrates writes such that your program doesn't overwrite what the other program did and vice versa.
I'm not sure if you can do that until the other process closes the stream. However, you can kill the process that is writing to the file.
Here's an example to do that.
Dim pProcess() As Process = System.Diagnostics.Process.GetProcessesByName("notepad")
For Each p As Process In pProcess
p.Kill()
Next
Check this out: http://vbnetsample.blogspot.com/2007/08/start-and-kill-process.html
I have a logfile that is written by a 3rd party application and I'd like my application to "read" that log file in real or near-real time, parse the new log entries and act upon certain events.
My thought was that I could achieve this with a combination of FileSystemWatcher (to signal file changes) and MemoryMappedFile (to continue reading from a certain offset).
However, since this is the first time I'm using MemoryMappedFiles I do run into some issues which probably arise from not understanding the concept correctly (e.g. I'm unable to open the existing File as it's in use by the other process).
I was wondering if someone has an example of how to use MemoryMappedFiles to read a file that is locked by another process?
Thanks,
Tom
EDIT:
From the comments, it looks like memory-mapped files won't help me access files that have an exclusive lock. However, "tail" tools such as Baretail (http://www.baremetalsoft.com/baretail/index.php) are able to do just that: Baretail has no problem reading, at 1-second intervals, a file on which another application holds an exclusive lock. So there has to be some way to do this?
To answer my own question, the trick in reading a locked file is creating the FileStream with the following access flags:
FileStream fileStream = new System.IO.FileStream(fileName, FileMode.Open, FileAccess.Read, FileShare.Delete | FileShare.ReadWrite);
Now it's just a matter of either doing interval-based polling or watching for FileSystemWatcher change events to detect file changes.
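For example, a rough polling sketch along those lines (the file name, the 1-second interval and the offset bookkeeping are just illustrative):
using System;
using System.IO;
using System.Threading;

string fileName = "thirdparty.log";   // hypothetical path to the 3rd-party log
long offset = 0;                      // how far we have read so far

while (true)
{
    using (var fileStream = new FileStream(fileName, FileMode.Open, FileAccess.Read,
        FileShare.Delete | FileShare.ReadWrite))
    {
        if (fileStream.Length > offset)
        {
            fileStream.Seek(offset, SeekOrigin.Begin);
            using (var reader = new StreamReader(fileStream))
            {
                Console.Write(reader.ReadToEnd());   // parse/act on the new entries here
                offset = fileStream.Position;        // remember where we stopped
            }
        }
        else if (fileStream.Length < offset)
        {
            offset = 0;                              // the log was truncated or rotated; start over
        }
    }
    Thread.Sleep(1000);
}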
I'm not sure if MemoryMappedFiles are going to help you. Take a look at FileStream:
var stream = new FileStream(path, FileMode.Open, FileAccess.Read, FileShare.ReadWrite);
stream.Seek(offset, SeekOrigin.Begin);
Though if the 3rd-party application has the file locked exclusively, there's not much that you can do about it...
[Begin 2nd EDIT]
One more idea...
If the third party app happens to use a logging framework like NLog, log4net, or System.Diagnostics, you could still write your own Target/Appender/TraceListener and route the messages somewhere that you could look at them (such as a file that is not opened exclusively, to another process, etc).
If your third party app is using a logging framework, we probably would have heard about it by now ;-)
[End 2nd EDIT]
[Begin EDIT]
I think I misread the question. It sounded at first like you were using a third party library that had logging implemented and you wanted to do this parsing from within the program that was generating the logging. Having reread your question, it sounds like you want to "listen" to the log file from outside of the application. If that is the case, my answer probably won't help you. Sorry.
[End EDIT]
I don't have anything to offer about MemoryMappedFiles, but I wonder if you could achieve what you are after by writing a custom listener/target/appender for the 3rd party logging system?
For example, if you are using NLog, you could write a custom Target and direct all of your logging messages there (while also directing them to the "real" Target(s)). This way you get a crack at each log message as it is logged (so it is actually real time, not near real time). You could do the same thing with log4net and System.Diagnostics.
Note that NLog even has a "MethodCall" target. To use that one you only have to write a static method with the correct signature. I don't know if log4net has a similar concept to this.
This seems like it would be easier to get working reliably than trying to read and parse the log file as it is being written by the third party software.
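To illustrate the MethodCall idea mentioned above, here is a rough sketch; it only applies if you control the NLog configuration, and the class and method names are made up:
using System;
using NLog;
using NLog.Config;
using NLog.Targets;

public static class LogTap
{
    // The signature matches the parameters configured on the target below (all strings).
    public static void OnLogMessage(string level, string message)
    {
        // React to each entry here instead of parsing the log file afterwards.
        Console.WriteLine($"[{level}] {message}");
    }

    public static void Install()
    {
        var target = new MethodCallTarget
        {
            ClassName = typeof(LogTap).AssemblyQualifiedName,
            MethodName = nameof(OnLogMessage)
        };
        target.Parameters.Add(new MethodCallParameter("${level}"));
        target.Parameters.Add(new MethodCallParameter("${message}"));

        // In a real app you would add this target to your existing LoggingConfiguration;
        // SimpleConfigurator is just the shortest way to wire it up for a demo.
        SimpleConfigurator.ConfigureForTargetLogging(target, LogLevel.Trace);
    }
}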
If the file is "in use", there isn't anything that can be done about that. It truly is "in use". MemoryMappedFiles are for either reading large amounts of data off the drive or sharing data with other programs. It will not help getting around the "in use" limitation.
Memory-mapped files are under the same restrictions as the FileStream you initialize them with, so be sure that you initialize your memory-mapped file like this:
var readerStream = new FileStream(path, FileMode.Open, FileAccess.Read, FileShare.ReadWrite);
var mmf = MemoryMappedFile.CreateFromFile(readerStream, null, 0, MemoryMappedFileAccess.Read, null, HandleInheritability.None, false);
If some other process has the file completely locked, not even allowing shared reads, you're out of luck; I'm not sure there's a way around that. Perhaps use a timer to detect when the process has stopped writing to it.
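Continuing the snippet above (same mmf variable, plus the usual System and System.IO.MemoryMappedFiles usings), reading from the mapping might look like this; mapping the whole file from offset 0 is just for illustration:
// A size of 0 maps from the offset to the end of the file; the access must match the
// MemoryMappedFileAccess.Read used when the mapping was created above.
using (var view = mmf.CreateViewStream(0, 0, MemoryMappedFileAccess.Read))
using (var reader = new StreamReader(view))
{
    string contents = reader.ReadToEnd();
    Console.WriteLine(contents);
}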
I've done something similar just for monitoring log files on a console (as opposed to processing), but the principles are the same. Like you, I use a FileSystemWatcher, and the important logic is in my OnChanged event handler:
case WatcherChangeTypes.Changed:
{
System.IO.FileInfo fi = new FileInfo(e.FullPath);
long prevLength;
if (lastPositions.TryGetValue(e.FullPath, out prevLength))
{
using (System.IO.FileStream fs = new FileStream(
e.FullPath, FileMode.Open, FileAccess.Read, FileShare.ReadWrite))
{
fs.Seek(prevLength, SeekOrigin.Begin);
DumpNewData(fs, (int)(fi.Length - prevLength));
lastPositions[e.FullPath] = fs.Position;
}
}
else
lastPositions.Add(e.FullPath, fi.Length);
break;
}
where lastPositions is
Dictionary<string, long> lastPositions = new Dictionary<string, long>();
and DumpNewData is simply
private static void DumpNewData(FileStream fs, int bytesToRead)
{
byte[] bytesRead = new byte[bytesToRead];
fs.Read(bytesRead, 0, bytesToRead);
string s = System.Text.ASCIIEncoding.ASCII.GetString(bytesRead);
Console.Write(s);
}
In the past I've always used a FileStream object to write or rewrite an entire file after which I would immediately close the stream. However, now I'm working on a program in which I want to keep a FileStream open in order to allow the user to retain access to the file while they are working in between saves. ( See my previous question).
I'm using XmlSerializer to serialize my classes to and from an XML file. But now I'm keeping the FileStream open so I can use it to save (re-serialize) my class instance later. Are there any special considerations I need to make if I'm reusing the same FileStream over and over again, versus using a new file stream? Do I need to reset the stream to the beginning between saves? If a later save is smaller than the previous save, will the FileStream leave the remaining bytes from the old file, and thus create a corrupted file? Do I need to do something to clear the file so it behaves as if I'm writing an entirely new file each time?
Your suspicion is correct - if you reset the position of an open file stream and write content that's smaller than what's already in the file, it will leave trailing data and result in a corrupt file (depending on your definition of "corrupt", of course).
If you want to overwrite the file, you really should close the stream when you're finished with it and create a new stream when you're ready to re-save.
I notice from your linked question that you are holding the file open in order to prevent other users from writing to it at the same time. This probably wouldn't be my choice, but if you are going to do that, then I think you can "clear" the file by invoking stream.SetLength(0) between successive saves.
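As a rough sketch of that pattern (the MyDocument type, the file name and the Save helper below are made up for illustration):
using System.IO;
using System.Xml.Serialization;

var serializer = new XmlSerializer(typeof(MyDocument));

// Keep the stream open between saves so other users can't write to the file.
var stream = new FileStream("data.xml", FileMode.OpenOrCreate,
    FileAccess.ReadWrite, FileShare.Read);

void Save(MyDocument instance)
{
    stream.SetLength(0);                    // discard leftovers from a larger previous save
    serializer.Serialize(stream, instance); // SetLength(0) also moves Position back to 0
    stream.Flush();                         // make sure the new contents reach the disk
}

Save(new MyDocument { Name = "example" });

public class MyDocument
{
    public string Name { get; set; }
}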
There are various ways to do this; if you are re-opening the file, perhaps set it to truncate:
using(var file = new FileStream(path, FileMode.Truncate)) {
// write
}
If you are overwriting the file while already open, then just trim it after writing:
file.SetLength(file.Position); // assumes we're at the new end
I would try to avoid delete/recreate, since this loses any ACLs etc.
Another option might be to use SetLength(0) to truncate the file before you start rewriting it.
Recently ran into the same requirement. In fact, previously, I used to create a new FileStream within a using statement and overwrite the previous file. Seems like the simple and effective thing to do.
using (var stream = new FileStream(path, FileMode.Create, FileAccess.Write))
{
ProtoBuf.Serializer.Serialize(stream, value);
}
However, I ran into locking issues where some other process is locking the target file. In my attempt to thwart this I retried the write several times before pushing the error up the stack.
int attempt = 0;
while (true)
{
try
{
using (var stream = new FileStream(path, FileMode.Create, FileAccess.Write))
{
ProtoBuf.Serializer.Serialize(stream, value);
}
break;
}
catch (IOException)
{
// could be locked by another process
// make up to X attempts to write the file
attempt++;
if (attempt >= X)
{
throw;
}
Thread.Sleep(100);
}
}
That seemed to work for almost everyone. Then that problem machine came along and forced me down the path of maintaining a lock on the file the entire time. So in lieu of retrying to write the file in the case it's already locked, I'm now making sure I get and hold the stream open so there are no locking issues with later writes.
int attempt = 0;
while (true)
{
try
{
_stream = new FileStream(path, FileMode.Open, FileAccess.ReadWrite, FileShare.Read);
break;
}
catch (IOException)
{
// could be locked by another process
// make up to X attempts to open the file
attempt++;
if (attempt >= X)
{
throw;
}
Thread.Sleep(100);
}
}
Now when I write the file the FileStream position must be reset to zero, as Aaronaught said. I opted to "clear" the file by calling _stream.SetLength(0). Seemed like the simplest choice. Then using our serializer of choice, Marc Gravell's protobuf-net, serialize the value to the stream.
_stream.SetLength(0);
ProtoBuf.Serializer.Serialize(_stream, value);
This works just fine most of the time and the file is completely written to the disk. However, on a few occasions I've observed the file not being immediately written to the disk. To ensure the stream is flushed and the file is completely written to disk I also needed to call _stream.Flush(true).
_stream.SetLength(0);
ProtoBuf.Serializer.Serialize(_stream, value);
_stream.Flush(true);
Based on your question I think you'd be better served closing/re-opening the underlying file. You don't seem to be doing anything other than writing the whole file. The value you can add by re-writing Open/Close/Flush/Seek will be next to 0. Concentrate on your business problem.
With the following file reading code:
using (FileStream fileStream = new FileStream(fileName, FileMode.Open, FileAccess.Read, FileShare.None))
{
using (TextReader tr = new StreamReader(fileStream))
{
string fileContents = tr.ReadToEnd();
}
}
And the following file write code:
using (TextWriter tw = new StreamWriter(fileName))
{
tw.Write(fileContents);
tw.Close();
}
The following exception details are seen:
The process cannot access the file
'c:\temp\myfile.txt' because it is
being used by another process.
What is the best way of avoiding this? Does the reader need to retry upon receipt of the exception or is there some better way?
Note that the reader process is using a FileSystemWatcher to know when the file has changed.
Also note that, in this instance, I'm not looking for alternatives ways of sharing strings between the 2 processes.
You can open a file for writing and only lock write access, thereby allowing others to still read the file.
For example,
using (FileStream stream = new FileStream(#"C:\Myfile.txt", FileMode.Open, FileAccess.ReadWrite, FileShare.Read))
{
// Do your writing here.
}
The other file access just opens the file for reading, not writing, and allows read/write sharing.
using (FileStream stream = new FileStream(#"C:\Myfile.txt", FileMode.Open, FileAccess.Read, FileShare.ReadWrite))
{
// Does reading here.
}
If you want to ensure that readers always read an up-to-date file, you will either need a lock file that indicates someone is writing to the file (though you may get a race condition if it's not carefully implemented), or you need to block write sharing when opening the file to read and handle the exception so you can try again until you get exclusive access.
If you create a named Mutex you can define the mutex in the writing application, and have the reading application wait until the mutex is released.
So in the notification process that is currently working with the FileSystemWatcher, simply check to see if you need to wait for the mutex; if you do, wait, then process.
Here is a VB example of a Mutex like this that I found; it should be easy enough to convert to C#.
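As a rough C# sketch of the same idea (the mutex name and path are placeholders; both processes must use the same mutex name):
using System.IO;
using System.Threading;

// Both the writer and the reader wrap their file access in this same pattern.
using (var mutex = new Mutex(false, "MyAppFileMutex"))
{
    mutex.WaitOne();                    // blocks until the other process releases the mutex
    try
    {
        File.WriteAllText(@"c:\temp\myfile.txt", "new contents");
    }
    finally
    {
        mutex.ReleaseMutex();           // always release, even if the write throws
    }
}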
Have your process check the status of the file to see whether it is being written to. You can do this with a lock file (i.e. the presence of this other file, which can be empty, prevents writing to the main file).
Even this is not failsafe, however, as the two processes may create the lock file at the same time - but you can check for this before you commit the write.
If your process encounters a lock file, have it simply sleep/wait and try again at a predefined interval.
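A rough sketch of that idea (file names are illustrative); using FileMode.CreateNew for the lock file avoids the race, because the create fails if the lock file already exists:
using System;
using System.IO;
using System.Threading;

string lockPath = @"c:\temp\myfile.txt.lock";

while (true)
{
    try
    {
        // CreateNew throws if the lock file already exists, so only one process "wins".
        using (File.Open(lockPath, FileMode.CreateNew, FileAccess.Write, FileShare.None))
        {
            File.WriteAllText(@"c:\temp\myfile.txt", "new contents");
        }
        File.Delete(lockPath);          // release the lock for the other process
        break;
    }
    catch (IOException)
    {
        Thread.Sleep(250);              // lock file exists; wait and try again
    }
}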
Is there any particular reason for opening the file with FileShare.None? That'll prevent the file from being opened by any other process.
FileShare.Write or FileShare.ReadWrite should allow the other process (subject to permissions) to open and write to the file while you are reading it, however you'll have to watch for the file changing underneath you while you read it - simply buffering the contents upon opening may help here.
All of these answers, however, are equally valid - the best solution depends on exactly what you're trying to do with the file: if it's important to read it while guaranteeing it doesn't change, then lock it and handle the subsequent exception in your writing code; if it's important to read and write to it at the same time, then change the FileShare constant.
You can use a Mutex object for this.
The reader and writer both need retry mechanisms. Also, FileShare should be set to FileShare.Read for the readers and FileShare.None for the writer. This should ensure that the readers don't read the file while writing is in progress.
The reader (excluding retry) becomes
using (FileStream fileStream = new FileStream(fileName, FileMode.Open, FileAccess.Read, FileShare.Read))
{
using (TextReader tr = new StreamReader(fileStream))
{
string fileContents = tr.ReadToEnd();
}
}
The writer (excluding retry) becomes:
FileStream fileStream = new FileStream(fileName, FileMode.Create, FileAccess.Write, FileShare.None);
using (TextWriter tw = new StreamWriter(fileStream))
{
tw.Write(fileContents);
tw.Close();
}
Write to a temp file; when you've finished writing, rename/move the file to the location and/or name that the reader is looking for.
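Something like this, for example (paths and contents are illustrative; note the swap still requires that no one is holding the destination file open without delete sharing at that moment):
using System.IO;

string target = @"c:\temp\myfile.txt";
string temp = target + ".tmp";

// Write the new contents to a temp file first, so readers never see a half-written file.
File.WriteAllText(temp, "new contents");

if (File.Exists(target))
    File.Replace(temp, target, target + ".bak");  // swap the temp file into place, keeping a backup
else
    File.Move(temp, target);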
The best thing to do is to put an application protocol on top of a file transfer/ownership transfer mechanism. The "lock-file" mechanism is an old UNIX hack that has been around for ages. The better approach is to just "hand" the file over to the reader. There are lots of ways to do this. You can create the file with a random file name and then "give" that name to the reader. That allows the writer to asynchronously write another file. Think of how a web page works. A web page has "links" to more information in it: images, scripts, external content, etc. The server hands you that page because it's a coherent view of the "resource" you want. Your browser then goes and gets the appropriate content based on the page description (the HTML file or other returned content), and then transfers what it needs.
This is the most resilient type of "sharing" mechanism to use. Write the file, share the name, move to the next file. The "sharing the name" part is the atomic hand-off that makes sure that both parties (the reader and the writer) agree that the content is "complete."