I have a program that does different things; my question is related to accessing files on a network mapped drive or a shared folder:
the program can run an msi/exe file from the network (network mapped drive or shared folder)
the program can copy a file from the network (network mapped drive or shared folder)
How can I check that the files are accessible before I try to run or copy them (in case of a network disconnection, or any other network problem)? Is File.Exists() enough?
Here is an example of my code:
public static bool FileIsOk(string path)
{
    try
    {
        FileInfo finfo = new FileInfo(path);
        if (finfo.Exists)
        {
            return true;
        }
        MessageBox.Show("file does not exist, or there is a problem with the network preventing access to the file!");
        return false;
    }
    catch (Exception e)
    {
        MessageBox.Show(e.Message);
    }
    return false;
}
thanks
File.Exists() should be fine, but if you start a large copy operation, there's not a lot you can do if the connection goes down during that process, so you'll want to make sure you code for that.
You should trap the IOException and handle it as you see fit.
EDIT: code to trap IOException:
try
{
    File.Copy(myLocalFile, myNetworkFile);
}
catch (IOException ioEx)
{
    Debug.Write(myLocalFile + " failed to copy! Try again or copy later?");
}
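If the connection does drop mid-copy, a bounded retry is one way to code for it. A minimal sketch; the attempt count and delay are arbitrary assumptions, not recommendations:

// Sketch: retry a network copy a few times before giving up.
// Requires System.IO and System.Threading.
const int maxAttempts = 3;
for (int attempt = 1; attempt <= maxAttempts; attempt++)
{
    try
    {
        File.Copy(myLocalFile, myNetworkFile, overwrite: true);
        break; // success
    }
    catch (IOException) when (attempt < maxAttempts)
    {
        Thread.Sleep(TimeSpan.FromSeconds(5)); // wait before retrying
    }
}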
Don't. Just attempt the operation. It will fail just as fast, and you won't be introducing a timing window problem. You have to cope with that failure anyway, so why code it twice?
The best idea, of course, would be to create a local cache of the setup. You cannot trust network connections: they may slow down or break during an operation. If everything is run from the network, I would say it's definitely not a safe idea.
But as far as the technical question is concerned, File.Exists should be fine. A more descriptive way to check for a file's existence has already been discussed. Read here.
FileInfo fi = new FileInfo(@"\\server\share\file.txt");
bool exists = fi.Exists;
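As for the local-cache suggestion, a minimal sketch (the paths and installer name are hypothetical):

// Sketch: copy the installer from the share to local disk, then run the local copy.
// Requires System.IO and System.Diagnostics.
string networkMsi = @"\\server\share\setup.msi"; // hypothetical share path
string localMsi = Path.Combine(Path.GetTempPath(), "setup.msi");

File.Copy(networkMsi, localMsi, overwrite: true);   // cache locally first
Process.Start("msiexec.exe", $"/i \"{localMsi}\""); // run the local copy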
Situation:
a file share
one app writes files to it
the files are big and the write process takes a long time
another app copies these files for "processing"
it is necessary to be sure a file has already been fully written by the first app
I use a File.Open approach, but it does not work as expected: a file that is still being written can be freely copied/moved.
public bool FileIsFreeToUse(Guid iterationGuid, FileInfo file)
{
    try
    {
        using (File.Open(file.FullName, FileMode.Open, FileAccess.Write, FileShare.ReadWrite))
        {
            // log skipped
            return true;
        }
    }
    catch (IOException ex)
    {
        // log skipped
        return false;
    }
}
So, my questions:
Is there a way to check that a file on a network share is not in use, using File.Open?
Is checking the file size the only possible way?
What is best practice?
Thank you.
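One likely culprit in the snippet above: FileShare.ReadWrite tells the OS you are willing to share the file with other readers and writers, so the probe succeeds even while the first app still holds a write handle. A stricter probe, sketched below under that assumption, requests exclusive access with FileShare.None, which throws an IOException while any other handle is open:

// Sketch: the open succeeds only if no other process currently holds a handle to the file.
public bool FileIsFreeToUse(FileInfo file)
{
    try
    {
        using (file.Open(FileMode.Open, FileAccess.Read, FileShare.None))
        {
            return true;
        }
    }
    catch (IOException)
    {
        return false; // still being written, or otherwise in use
    }
}

Note this is still only a point-in-time check; another process can open the file again the moment the probe returns, which is why an earlier answer suggests just attempting the real operation and handling the failure.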
Hi, I am new to C# and I am reviewing code I didn't write. The code copies a compressed file from a network location to a local location before it is extracted and then parsed for data. The copy function is:
public static void CopySourceFileToDestinationFile(string sourcePath, string destinationPath)
{
    try
    {
        // copy log from source to local and uncompress it
        File.Copy(sourcePath, destinationPath, true);
        File.SetAttributes(destinationPath, FileAttributes.Normal);
    }
    catch (Exception ex)
    {
        var errmsg = $"ERROR in function 'CopySourceFileToDestinationFile()': {ex.Message.ToString()}";
        Logger.Error(errmsg);
        throw new InvalidOperationException(errmsg);
    }
}
I see the File.SetAttributes function and looked it up, but the documentation at https://learn.microsoft.com/en-us/dotnet/api/system.io.file.setattributes?view=net-5.0 only tells me what the function does, not why you'd need it.
So I humbly ask: why would I ever need to use this function? Had I written this code from scratch I wouldn't have known of this function's existence, nor its purpose.
If you were reviewing that code, what I'd be more curious about is why the original developer put a lying comment above the copy action (it doesn't decompress anything), and why they catch an exception only to log and rethrow its message (why not keep the exception type and its stack trace?).
But about the attributes: a file, when written, gets the Archive flag, indicating to backup software that the file should be backed up. Clearing this bit by setting the file's attributes to Normal (i.e. no attributes) will... probably not be very relevant, unless this file is copied into a directory that actually gets backed up.
Even then it's not very relevant, as any decent backup program won't just look at the Archive bit to decide whether to include a file in its backup. Clearing this attribute, like the rest of the code, reeks of code smell, cargo cult, voodoo programming and whatnot.
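To see the Archive bit in action, here is a minimal sketch (the path and file are made up for illustration):

// Sketch: written files get the Archive attribute; SetAttributes(Normal) clears it.
string path = Path.Combine(Path.GetTempPath(), "demo.txt"); // hypothetical file

File.WriteAllText(path, "hello");                // writing sets the Archive bit on NTFS
Console.WriteLine(File.GetAttributes(path));     // typically prints "Archive"

File.SetAttributes(path, FileAttributes.Normal); // clears all attribute flags
Console.WriteLine(File.GetAttributes(path));     // prints "Normal"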
I have the following method to delete a file with a provided path
private void DestroyFile(string path)
{
    try
    {
        if (File.Exists(path))
        {
            File.Delete(path);
        }
        if (File.Exists(path))
        {
            throw new IOException(string.Format("Failed to delete file: '{0}'.", path));
        }
    }
    catch (Exception ex)
    {
        throw ex;
    }
}
I am getting the IOException that is thrown if the file still exists after the File.Delete call. Specifically:
System.IO.IOException: Failed to delete file: 'C:\Windows\TEMP\[FILE NAME]'.
I have also confirmed that the file does not exist at the location in the path variable after execution is complete. I am wondering if I am running up against a race condition between the file system updating after File.Delete and checking it again with File.Exists. Is there a better way to delete smoothly? I know that File.Delete won't return an error if the file doesn't exist, so maybe these checks are a bit redundant. Should I check whether the file is in use rather than whether it exists at all?
Some important additional information:
The program can and does run successfully most of the time, but this particular error has been seen frequently recently.
File.Delete will mark the file for deletion. The file is really deleted only when all handles to it are closed (if there are no such handles, it is always deleted by the time File.Delete returns). As documented for the DeleteFile winapi function (which is what C#'s File.Delete uses):
The DeleteFile function marks a file for deletion on close. Therefore, the file deletion does not occur until the last handle to the file is closed.
Usually there are no open handles to the files you delete. Or, if there are open handles, they usually don't have the "delete" share (this share allows another process to mark the file for deletion), so when you try to delete such a file, it either gets deleted (no open handles) or an access-denied or similar exception is thrown (some handles, but without the delete share).
However, sometimes some software, such as an antivirus or a search indexer, might open arbitrary files with the "delete" share and hold them for some time. If you try to delete such a file, the call succeeds without errors and the file really is deleted when that software closes its handle. However, File.Exists will return true for such a "pending delete" file.
You can reproduce this issue with this simple program:
public class Program {
    public static void Main() {
        string path = @"G:\tmp\so\tmp.file";
        // create the file with the delete share and don't close the handle
        var file = new FileStream(path, FileMode.Create, FileAccess.ReadWrite, FileShare.Delete);
        DestroyFile(path);
        GC.KeepAlive(file);
    }

    private static void DestroyFile(string path) {
        try {
            if (File.Exists(path)) {
                // no error
                File.Delete(path);
            }
            // but it still exists
            if (File.Exists(path)) {
                throw new IOException(string.Format("Failed to delete file: '{0}'.", path));
            }
        }
        catch (Exception ex) {
            throw ex;
        }
    }
}
You can retry the File.Exists check forever in the program above; the file will exist until you close the handle.
So that's what happens in your case: some program has an open handle to this file with FileShare.Delete.
You should expect this situation. For example, just remove that File.Exists check: you marked the file for deletion, and it will be deleted anyway.
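Following that advice, the method can shrink to something like this sketch; File.Delete does not throw when the file is absent, and it throws on genuine failures (access denied, sharing violation, and so on):

// Sketch: let File.Delete do the work. The file is marked for deletion and
// disappears once the last open handle to it is closed.
private void DestroyFile(string path)
{
    File.Delete(path);
}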
While it's not documented in the API, File.Delete WILL return before the file is completely deleted.
This is why you are running into this situation. The Delete call checks for all the things that would make the delete fail (existing handle, lock, permission, etc.) and returns after initiating the delete request.
So it's relatively safe to just put a while loop right after it to wait until the file is gone, or to use a FileSystemWatcher to watch for the Deleted event.
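A minimal sketch of the polling variant (the timeout and poll interval are arbitrary assumptions):

// Sketch: wait for a pending delete to complete, with a simple timeout.
// Requires System.IO and System.Threading.
File.Delete(path);

var deadline = DateTime.UtcNow + TimeSpan.FromSeconds(10); // arbitrary timeout
while (File.Exists(path) && DateTime.UtcNow < deadline)
{
    Thread.Sleep(100); // the file vanishes once the last open handle closes
}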
File.Delete, and generally most methods from System.IO, depend on the filesystem/streams/etc., which to some extent live their own lives and are not managed resources; hence File.Delete can return before the file is physically deleted, but after it has been marked for deletion.
After File.Delete returns, you can be sure the file will be deleted; if not, the method throws an exception itself, so the second check with File.Exists and throwing IOException is unnecessary.
If you want a custom exception, catch the exceptions from File.Delete.
And in the attached code, remember that throw ex; is different from throw; and resets the stack trace to the current line.
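For clarity, the difference in a small sketch:

try
{
    File.Delete(path);
}
catch (IOException)
{
    // "throw;" rethrows and preserves the original stack trace.
    // "throw ex;" (with "catch (IOException ex)") would reset the trace to this line.
    throw;
}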
Hey guys, I'm working on a program that deletes the files in certain directories, mostly temp files, except I get an error even though I added a catch block: System.UnauthorizedAccessException. The error occurs inside the IOException catch block:
private void DeleteInternetFiles(string internetDirectory)
{
    DirectoryInfo internetTempStorage = new DirectoryInfo(internetDirectory);
    try
    {
        // this will delete files
        foreach (FileInfo getNetFileInfo in internetTempStorage.GetFiles())
        {
            getNetFileInfo.Delete();
        }
        // this will loop through and delete folders
        foreach (DirectoryInfo tempDirectoryInformation in internetTempStorage.GetDirectories())
        {
            tempDirectoryInformation.Delete();
        }
    }
    // catch IO exception and try to delete again
    catch (IOException)
    {
        // delete the file at this path
        File.Delete(internetDirectory);
        // delete the folder at this path
        Directory.Delete(internetDirectory);
    }
    // catch access exception and try to delete again
    catch (UnauthorizedAccessException)
    {
        // delete the file at this path
        File.Delete(internetDirectory);
        // delete the folder at this path
        Directory.Delete(internetDirectory);
    }
}
And this is how I call the method:
if (checkBox1.Checked)
{
    DeleteInternetFiles(@"C:\Users\" + Environment.UserName + @"\AppData\Local\Microsoft\Windows\Temporary Internet Files");
}
Your second call to File.Delete(internetDirectory), inside the catch block, is likely the problem. The program has already encountered an error while trying to delete the file, and then you tried again. Two things could be happening:
The user account executing the program doesn't have permission to delete files in another user's directory.
Some file is still in use and therefore can't be deleted (e.g. currently open in Internet Explorer).
You might want to study the responses in C# - How to Delete temporary internet files. Note the comments about possibly having to "kill IE".
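Rather than retrying the whole delete inside a catch block, a common pattern is to try each file individually and skip the ones that are locked or inaccessible. A sketch, reusing the internetTempStorage variable from the question:

// Sketch: delete what we can, skip what we can't, instead of failing the whole run.
foreach (FileInfo file in internetTempStorage.GetFiles())
{
    try
    {
        file.Delete();
    }
    catch (Exception ex) when (ex is IOException || ex is UnauthorizedAccessException)
    {
        // file is in use or access is denied; skip it and move on
    }
}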
The problem I see here is that the delete action you're performing requires Administrator privileges.
What you can do is try right-clicking the application, choosing Run as Administrator, and then performing the action.
If you want to prompt the user to elevate your application, you can do this:
Force application to Run as Administrator [Winforms only]
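If you go the elevation route, one sketch (WinForms assumed, as in the link above) is to relaunch the current executable with the "runas" verb, which triggers the UAC prompt:

// Sketch: restart the current process elevated; UAC prompts the user.
// Process.Start throws a Win32Exception if the user declines the prompt.
var psi = new ProcessStartInfo
{
    FileName = Application.ExecutablePath, // System.Windows.Forms
    UseShellExecute = true,                // required for the "runas" verb
    Verb = "runas"
};
Process.Start(psi);
Application.Exit();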
You get this error because you do not have the required access rights to the file or folder you are attempting to delete.
It may happen in your case because some file is currently in use while you perform the delete operation.
Files are more likely to be in use here because you are deleting from a folder that the Windows OS uses for temporary files.
I created a service that moves certain file types from one directory to another. This works fine locally and is pretty fast over my network. On a different network, though, it works incredibly slowly (a 500 MB file takes 6.5 minutes), yet the same file copied and pasted into the folder via Explorer completes in about 30-40 seconds.
Snippet where the file move happens:
currentlyProcessing.Add(currentFile.FullName);
try
{
    eventMsg("Attempting to move file", "DEBUG");
    File.Move(oldFilePath, newFilePath);
    eventMsg("File moved successfully", "DEBUG");
}
catch (Exception ex)
{
    eventMsg("Cannot move file, another resource is using it", "DEBUG");
    eventMsg("Move file exception: " + ex, "DEBUG");
}
finally
{
    if (File.Exists(currentFile.FullName + ".RLK"))
    {
        try
        {
            File.Delete(currentFile.FullName + ".RLK");
        }
        catch (IOException e)
        {
            eventMsg("File exception: " + e, "DEBUG");
        }
    }
    currentlyProcessing.Remove(oldFilePath);
}
I fear the code is fine (as it works as expected on the other network), so the problem is probably the network in some shape or form. Does anyone have any common things to check? The service runs as Local System (or Network Service), and there doesn't seem to be an access problem. What other factors could affect this (other than network/hardware)?
What I would like is a transfer speed similar to what I've witnessed in Explorer. Any pointers greatly appreciated.
First of all, ping each machine to check latency, so you can see whether the problem is in the network generally or in your software:
ping 192.168.1.12 -n 10
If it's a problem with the network, follow this:
Restart your hub/router.
Does one of the PCs use WiFi with a low signal?
Is there any antivirus that is turned on and monitoring network activity?
If none of the above solves your problem, then try using Wireshark to investigate the issue further.
For files of such huge size, I would suggest zipping them up if possible, especially since network latency is always a big factor in uploading/downloading files over a network.
You can zip your files with System.IO.Packaging. Have a look at Using System.IO.Packaging to generate a ZIP file, specifically:
using (Package zip = System.IO.Packaging.Package.Open(zipFilename, FileMode.OpenOrCreate))
{
    string destFilename = ".\\" + Path.GetFileName(fileToAdd);
    Uri uri = PackUriHelper.CreatePartUri(new Uri(destFilename, UriKind.Relative));
    if (zip.PartExists(uri))
    {
        zip.DeletePart(uri);
    }
    PackagePart part = zip.CreatePart(uri, "", CompressionOption.Normal);
    using (FileStream fileStream = new FileStream(fileToAdd, FileMode.Open, FileAccess.Read))
    {
        using (Stream dest = part.GetStream())
        {
            // CopyStream is a small helper from the linked answer;
            // fileStream.CopyTo(dest) does the same job on .NET 4+.
            CopyStream(fileStream, dest);
        }
    }
}
Also, you could use FTP (or SFTP), as another user mentioned, if possible. Have a look at the Renci SSH.NET library on CodePlex.