Check if file is in use - additional question - c#

I'm trying to find an alternative to using the Restart Manager for checking if a file is locked. I found this accepted answer to the same question. However, the accepted answer contains the following comment that I do not understand: "this solution will not work if the file doesn't have a Write or Read lock on it, i.e. it has been opened (for reading or writing) with FileShare.Read or FileShare.Write access."
I tested this using the following code (omitted using blocks and Close() for brevity):
static void Main(string[] args)
{
const string fileName = "test.txt";
// This should open the file as described, shouldn't it?
var fi1 = new FileInfo(fileName);
// Test with FileShare.Read / FileShare.Write / FileShare.ReadWrite gives the same result
var fs1 = fi1.Open(FileMode.OpenOrCreate, FileAccess.Write, FileShare.Write);
var fi2 = new FileInfo(fileName);
Console.WriteLine($"Is file locked? {IsFileLocked(fi2)}");
// Always displays: "Is file locked? True"
Console.ReadLine();
}
This always displays "Is file locked? True", whereas according to the comment it should display "False".
I also tested the code of this answer, which has a similar comment, with no luck. I also tested with two separate processes - as expected, no difference.
Looking at the docs, my test results seem reasonable - but I'm puzzled by the comments mentioned above.
How else would I open a file e.g. for reading without creating a lock?

The part of the answer that you quoted is incorrect. The mechanism that prevents you from opening an already open file is the share mode, not the desired access type.
When you attempt to open a file that is already in use, the share mode requested is compared against the share mode that the file was opened with. If they don't match up, your call fails.
EDIT: Just to cover all of my bases, the above only holds true on Windows. It is possible to open a file without any sort of mutual exclusion on POSIX-based systems. However, .NET was exclusive to Windows at the time of the quoted answer.
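The share-mode comparison can be demonstrated with a short sketch. The file name and helper method below are illustrative, and (per the edit above) the denial in the write case reflects Windows semantics:

```csharp
using System;
using System.IO;

public static class ShareModeDemo
{
    // Returns true if a second handle with the given access/share can be
    // opened while another handle is still open. Illustrative helper only.
    public static bool CanOpenWhileHeld(string path, FileAccess access, FileShare share)
    {
        try
        {
            using (new FileStream(path, FileMode.Open, access, share)) { }
            return true;
        }
        catch (IOException)
        {
            return false;
        }
    }

    public static void Demo()
    {
        string path = Path.Combine(Path.GetTempPath(), "share-demo.txt");

        // First handle: write access, but only Read is shared with others.
        using (var first = new FileStream(path, FileMode.Create, FileAccess.Write, FileShare.Read))
        {
            // Allowed: Read access fits within the first handle's FileShare.Read.
            Console.WriteLine("Read: " + CanOpenWhileHeld(path, FileAccess.Read, FileShare.ReadWrite));
            // Denied on Windows: Write access conflicts with FileShare.Read.
            Console.WriteLine("Write: " + CanOpenWhileHeld(path, FileAccess.Write, FileShare.ReadWrite));
        }
        File.Delete(path);
    }
}
```

Note that the second open must itself share Write, because the first handle already holds write access; the check works in both directions.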

SonarQube issue: Make sure that decompressing this archive file is safe

I have code that creates 3 files from 3 strings and zips them into an archive, like:
private static async Task<byte[]> CreateArchive(CertificateCredentials certificate)
{
using (var ms = new MemoryStream())
{
using (var archive = new ZipArchive(ms, ZipArchiveMode.Create, false))
{
await Add(archive, "certificate.der", certificate.CertificateContent);
await Add(archive, "issuing_ca.der", certificate.IssuingCACertificateContent);
await Add(archive, "private_key.der", certificate.PrivateKeyContent);
}
return ms.ToArray();
async Task Add(ZipArchive zipArchive, string filename, string content)
{
ZipArchiveEntry zipEntry = zipArchive.CreateEntry(filename);
using (var originalFileStream = new MemoryStream(Convert.FromBase64String(content)))
using (Stream zipEntryStream = zipEntry.Open())
{
await originalFileStream.CopyToAsync(zipEntryStream);
}
}
}
}
In the SonarQube report I got a Critical Security Hotspot at this line:
using (Stream zipEntryStream = zipEntry.Open())
with message:
Make sure that decompressing this archive file is safe
https://rules.sonarsource.com/csharp?search=Expanding%20archive%20files%20is%20security-sensitive
How can I fix that? It looks secure to me.
Thank you in advance.
Regarding security-related rules, you can find the current documentation here. The section "What to expect from security-related rules" explains that the chance of false positives is greater and that a human should have a look whenever an issue is raised. So, given the example, it is well possible that this issue can be identified as a false positive and no code change is needed.
In the user guide you can find how to handle reported issues. The "Automatic Issue Assignment" section (Technical Review) describes how to mark an issue as a false positive using the SonarQube UI (this requires the Administer Issues permission on the project). This prevents future reporting of the issue on this code.
When many issues are reported based on this rule, you could decide to disable the rule or lower its priority. Another possibility is to narrow the focus, for example by ignoring issues in a few blocks. It all depends on the type of project and the project/security requirements.
EDIT:
The rule warns for the following issues:
CVE-2018-1263: It is possible to construct an (external) zip whose files, when extracted, are placed outside the extraction directory ("zip slip"). When a file is added to a zip, it is identified inside the zip by a filename, which may include a path. The CVE database does not mention an example yet.
CVE-2018-16131: It is possible to create an (external) zip that, when extracted, consumes all available memory and crashes the host (a "zip bomb"). The CVE database points to this issue, where someone succeeded in exploiting it.
SonarQube doesn't know how ZipArchive is implemented. It is conceivable that, when adding a new item to a zip, the original zip is first extracted, exposing the issues above. You are creating and using an (in-memory, but that is not very relevant) zip archive in your own code, not consuming any externally provided zip file, so neither issue applies here if you trust the .NET implementation of the methods used.
The SonarQube documentation also points to this Java example containing a security-compliant solution. It is quite possible that even then SonarQube will warn you.
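For cases where an archive does come from an external source, a defensive extraction routine can guard against both issues above. This is a sketch only; the method name, the 100 MB budget, and the path handling are illustrative assumptions:

```csharp
using System;
using System.IO;
using System.IO.Compression;

public static class SafeExtract
{
    const long MaxTotalBytes = 100 * 1024 * 1024; // assumed budget: 100 MB

    public static void ExtractSafely(string zipPath, string destinationDir)
    {
        string destRoot = Path.GetFullPath(destinationDir + Path.DirectorySeparatorChar);
        long totalWritten = 0;

        using var archive = new ZipArchive(File.OpenRead(zipPath), ZipArchiveMode.Read);
        foreach (ZipArchiveEntry entry in archive.Entries)
        {
            // Guard against "zip slip": the resolved path must stay inside destRoot.
            string destPath = Path.GetFullPath(Path.Combine(destRoot, entry.FullName));
            if (!destPath.StartsWith(destRoot, StringComparison.Ordinal))
                throw new IOException($"Entry '{entry.FullName}' escapes the extraction directory.");

            if (entry.FullName.EndsWith("/")) // directory entry
            {
                Directory.CreateDirectory(destPath);
                continue;
            }

            Directory.CreateDirectory(Path.GetDirectoryName(destPath)!);

            // Guard against "zip bombs": copy with a running byte budget
            // instead of trusting the entry's declared length.
            using var source = entry.Open();
            using var target = File.Create(destPath);
            var buffer = new byte[81920];
            int read;
            while ((read = source.Read(buffer, 0, buffer.Length)) > 0)
            {
                totalWritten += read;
                if (totalWritten > MaxTotalBytes)
                    throw new IOException("Archive expands beyond the allowed size budget.");
                target.Write(buffer, 0, read);
            }
        }
    }
}
```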

Best practice for writing big files

I need to write a big file in my project.
What I learned:
I should NOT write the big file directly to the destination path,
because this may leave an incomplete file behind if the app crashes while writing it.
Instead, I should write to a temporary file and then move (rename) it into place (a so-called atomic file operation).
My code snippet:
[NotNull]
public static async Task WriteAllTextAsync([NotNull] string path, [NotNull] string content)
{
string temporaryFilePath = null;
try {
temporaryFilePath = Path.GetTempFileName();
using (var stream = new StreamWriter(temporaryFilePath, true)) {
await stream.WriteAsync(content).ConfigureAwait(false);
}
File.Delete(path);
File.Move(temporaryFilePath, path);
}
finally {
if (temporaryFilePath != null) File.Delete(temporaryFilePath);
}
}
My Question:
The file will be missing if the app crashes between File.Delete and File.Move. Can I avoid this?
Is there any other best practice for writing big files?
Is there any suggestion on my code?
The file will be missing if the app crashes between File.Delete and File.Move. Can I avoid this?
Not that I'm aware of, but you can detect it - and if you use a more predictable filename, you can recover from that. It helps if you tweak the process somewhat to use three file names: the target, a "new" file and an "old" file. The process becomes:
Write to "new" file (e.g. foo.txt.new)
Rename the target file to the "old" file (e.g. foo.txt.old)
Rename the "new" file to the target file
Delete the "old" file
You then have three files, each of which may be present or absent. That can help you to detect the situation when you come to read the new file:
No files: Nothing's written data yet
Just target: All is well
Target and new: App crashed while writing new file
Target and old: App failed to delete old file
New and old: App failed after the first rename, but before the second
All three, or just old, or just new: Something very odd is going on! User may have interfered
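The write-and-recover process above can be sketched as follows. The ".new"/".old" suffixes and the helper name are illustrative, not a standard convention:

```csharp
using System;
using System.IO;

public static class AtomicWrite
{
    public static void WriteAllText(string path, string content)
    {
        string newPath = path + ".new";
        string oldPath = path + ".old";

        // Recovery from a previous crash: a leftover "old" file can be
        // discarded, since the target (or "new" file) supersedes it.
        if (File.Exists(oldPath)) File.Delete(oldPath);

        File.WriteAllText(newPath, content);   // 1. write the "new" file
        if (File.Exists(path))
            File.Move(path, oldPath);          // 2. rename target -> "old"
        File.Move(newPath, path);              // 3. rename "new" -> target
        if (File.Exists(oldPath))
            File.Delete(oldPath);              // 4. delete "old"
    }
}
```

A reader would apply the detection table above on startup to decide which of the three files holds the good data before calling this again.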
Note: I was unaware of File.Replace before, but I suspect it's effectively just a simpler and possibly more efficient way of doing what your code already does. (That's great - use it!) The recovery process would still be the same, though.
You can use File.Replace instead of deleting and moving files. In the case of a hard fault (a power cut or something like that) you can still lose data; you have to accept that.
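A minimal sketch of the File.Replace approach (paths are illustrative). Note that File.Replace requires the destination file to already exist, so the very first write needs a plain File.Move instead:

```csharp
using System;
using System.IO;

string target = Path.Combine(Path.GetTempPath(), "replace-demo.txt");
string temp = target + ".new";
string backup = target + ".old";

File.WriteAllText(target, "old content");  // pre-existing target
File.WriteAllText(temp, "new content");    // freshly written data

// File.Replace swaps the temp file into place and keeps the previous
// target contents as a backup, all in a single call.
File.Replace(temp, target, backup);

Console.WriteLine(File.ReadAllText(target));  // new content
Console.WriteLine(File.ReadAllText(backup));  // old content
```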

Clear content of a locked logfile in C#

There are many explanations out there how to empty a log file.
Like:
File.WriteAllText(activeTab.FileName, string.Empty);
But this example, and the other examples I found, all have the same problem: they do not work if the logfile is currently locked by another process.
In Ruby there is a task I can use, rake log:clear, which does not remove the log files but only empties them.
I also found that I can do this with PowerShell using clc <filename>.
The sources are available here:
https://github.com/PowerShell/PowerShell/blob/c1faf1e6e10fc1ce45e84ef6f49ae7136c67a111/src/Microsoft.PowerShell.Commands.Management/commands/management/ClearContentCommand.cs
But honestly, I do not understand how this code works, especially since it inherits from other classes.
Is there a C# implementation available that I can use in any common program/class?
Turns out that the file can be cleared in my case using this snippet:
using (var stream = new FileStream(FileName, FileMode.Truncate, FileAccess.ReadWrite, FileShare.ReadWrite | FileShare.Delete))
{
}

If no text to write via StreamWriter, then discard (do not create) the file

I am using the StreamWriter to create a file and to write some text to that file. In some cases I have no text to write via StreamWriter, but the file was already created when StreamWriter was initialized.
using (StreamWriter sw = new StreamWriter(@"C:\FileCreated.txt"))
{
}
Currently I am using the following code, after the StreamWriter is closed, to check whether FileCreated.txt is empty, and if it is, to delete it. I am wondering if there is a more elegant approach than this (an option within StreamWriter, perhaps)?
if (File.Exists(@"C:\FileCreated.txt"))
{
if (new FileInfo(@"C:\FileCreated.txt").Length == 0)
{
File.Delete(@"C:\FileCreated.txt");
}
}
By the way, I must open a stream to write before I can check if there is any text because of some other logic in the code.
If you want to take input from the user bit by bit, you can make your sink a StringBuilder, and then just commit it to disk when you're done:
StringBuilder SB = new StringBuilder();
...
SB.AppendLine("text");
...
if (SB.Length > 0)
File.WriteAllText(@"C:\FileCreated.txt", SB.ToString());
Delaying opening the file until the first output would solve this problem, but it might create a new one (if there's a permission error creating the file, you won't find out until later, maybe when the operator is no longer at the computer).
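That delayed-open idea can be sketched as a small wrapper (the class name is illustrative): the underlying file is only created on the first write, at the cost of surfacing permission errors late, as described above.

```csharp
using System;
using System.IO;

// Defers creating the file until the first write, so that no empty
// file is left behind if nothing is ever written.
public sealed class LazyFileWriter : IDisposable
{
    private readonly string _path;
    private StreamWriter _writer;   // null until the first write

    public LazyFileWriter(string path) => _path = path;

    public void WriteLine(string text)
    {
        _writer ??= new StreamWriter(_path);  // the file is created here
        _writer.WriteLine(text);
    }

    public void Dispose() => _writer?.Dispose();
}
```

If Dispose runs without any WriteLine call, no file ever appears on disk, so the check-and-delete step becomes unnecessary.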
Your current approach is decent. I don't see the need to test File.Exists, though, if you just closed a stream to it. Also consider the race condition:
You find that the file is zero-length
Another process writes to the file
You delete the file
Also consider that you might have permission to create a file, and not to delete it afterwards!
Doing this correctly requires using the raw Win32 API, as I described in a previous answer. Do note that a .NET stream could be used for the first file handle, as long as you specify the equivalent of FILE_SHARE_WRITE.
Revisit your assumptions, i.e. that you must open the stream before checking for content. Simply reorganize your logic.

how to check if a file is being accessed by any other processs and release it?

I am trying to delete/open/edit some files in my C# .NET application. Sometimes I get an exception stating that the file/directory is being accessed by another process. Is there a way to check whether a file/directory is being accessed by a process, and to release the file from that process?
No. The only way to do this is to try to access the file, and handle the IOException.
Realistically this is the only safe way anyway. Suppose there were an IsFileInUse() method, and you called it, and it returned "nope, nobody's using that file," and you went ahead and accessed the file. The problem is that in the meantime some other process might have locked or deleted the file. So you'd need to put exception handling around your attempt to access the file anyway. The "test by acquiring" model is the only one that is 100% reliable.
If a file is in use by another process, .NET doesn't provide a way of determining which process that might be. I believe this would require some pretty low-level unmanaged code, though I could be wrong. "Releasing the file from that process", if it is possible at all, is a very low-level operation, because it would violate the other process's expectations - e.g. it thinks it is allowed to write to the file, but you have deleted the file and invalidated its handle. I believe you would need to terminate the other process if it's not willing to give up its lock voluntarily.
First, there are two things that may help you:
consider using the FileAccess and FileShare flags when opening files
if data from the file is needed only within the scope of a function, use the construct
using (FileStream stream = File.Open(...)) { /* file operations */ }
This ensures that the file is closed immediately after exiting the using block, and not whenever the FileStream object is collected by the GC.
Second, there is an unsafe way to get the processes that use a file. It is based on debugging features provided by Windows. The main idea is to enumerate all system handles and iterate through them to find the file handles and additional information. This is done using functions that I'm not sure are documented. If you are interested, use Google to find more information, but I do not think it is a good way.
public bool IsFileFree(string path)
{
bool isFree = true;
try
{
// Just opening the file as open/create
using (FileStream fs = new FileStream(path, FileMode.OpenOrCreate))
{
// If needed, we can also inspect fs.CanRead or fs.CanWrite here.
}
}
catch (IOException)
{
isFree = false;
}
return isFree;
}
string path = "D:\\test.doc";
bool isFileFree = IsFileFree(path);
