We have a process where people scan documents with photocopiers and drop them into a certain directory on our file server. An hourly service within a .NET Core app then scans the directory, grabs the files, and moves them, according to their file names, to a certain directory. Here is the problem.
The code looks something like this:
private string MoveFile(string file, string commNumber)
{
    var fileName = Path.GetFileName(file);
    var baseFileName = Path.GetFileNameWithoutExtension(fileName).Split("-v")[0];

    // 1. Check if the file already exists at the destination
    var existingFileList = luxWebSamContext.Documents.Where(x => EF.Functions.Like(x.DocumentName, "%" + Path.GetFileNameWithoutExtension(baseFileName) + "%")).ToList();

    // If the file exists, check for the current version of the file
    if (existingFileList.Count > 0)
    {
        var nextVersion = existingFileList.Max(x => x.UploadVersion) + 1;
        var extension = Path.GetExtension(fileName);
        fileName = baseFileName + "-v" + nextVersion.ToString() + extension;
    }

    var from = @file;
    var to = Path.Combine(@destinationPath, commNumber, fileName);

    try
    {
        log.Info($"------ Moving File! ------ {fileName}");
        Directory.CreateDirectory(Path.Combine(@destinationPath, commNumber));
        File.Move(from, to, true);
        return to;
    }
    catch (Exception ex)
    {
        log.Error($"----- Couldn't MOVE FILE: {file} ----- commission number: {commNumber}", ex);
        return null;
    }
}
The interesting part is in the try block, where the file move takes place. Sometimes we have the problem that the program throws the following exception:
2021-11-23 17:15:37,960 [60] ERROR App ----- Couldn't MOVE FILE:
\PATH\PATH\PATH\Filename_423489120.pdf ----- commission number:
05847894
System.IO.IOException: The process cannot access the file because it is being used by another process.
at System.IO.FileSystem.MoveFile(String sourceFullPath, String destFullPath, Boolean overwrite)
at System.IO.File.Move(String sourceFileName, String destFileName, Boolean overwrite)
So far so good. I would expect that when the file cannot be moved, it remains in the directory from which it was supposed to be moved. But that's not the case. We had this issue yesterday afternoon, and when I looked for the file afterwards, it was gone from the directory.
Is this the normal behaviour of the File.Move() method?
First to your question:
Is this the normal behaviour of the File.Move() method?
No, that's not the expected behaviour. The documentation says:
Moving the file across disk volumes is equivalent to copying the file
and deleting it from the source if the copying was successful.
If you try to move a file across disk volumes and that file is in use,
the file is copied to the destination, but it is not deleted from the
source.
Your exception says that another process is using the file at that very moment. So you should check whether other parts of your application may perform a delete, or whether someone (if this scenario is plausible) is deleting files manually from the file system.
Typically, File.Move() only removes the source file once the destination file has been successfully put in place. So the answer to your question is no, it cannot be purely File.Move(). The interesting question is why this file is locked. Probably because some file stream is still open and blocking access to the file. Also, do you have multiple instances of the copy service running? That could cause several services to access the file simultaneously, producing the exception you posted.
There must be a different cause making the files disappear, because File.Move() will certainly not remove the file when the copy did not succeed.
For debugging purposes, you may try to open the file with an exclusive lock. This will fail when a different process has the file locked, giving you a little more information.
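For example, a small probe like the following can tell you whether something still holds a handle to the file. This is only a sketch, and IsFileLocked is an illustrative name, not an existing API:

static bool IsFileLocked(string path)
{
    try
    {
        // FileShare.None requests exclusive access, so this open fails
        // with an IOException while any other handle to the file is open.
        using (File.Open(path, FileMode.Open, FileAccess.Read, FileShare.None))
        {
            return false;
        }
    }
    catch (IOException)
    {
        return true;
    }
}

You could call this right before the File.Move() and log the result to narrow down when the lock appears.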
Related
I'm trying to read a folder from a remote machine; inside this folder there are many txt files.
The machine is continuously writing new data into the folder.
string str = "";
try
{
    DirectoryInfo d = new DirectoryInfo(@"\\192.168.1.209\user\HST\");
    FileInfo[] Files = d.GetFiles("*.txt");
    foreach (FileInfo file in Files)
    {
        str = str + ", " + file.Name;
    }
}
catch (Exception pp)
{
    System.IO.File.WriteAllText(GlobalVariables.errorFolderLocation + "erroreLettura.1.209.txt", pp.ToString());
}
This is my code. I don't understand how to get this data, because I get this error: "The system call level is not correct".
For example, if I try to delete the folder or a file, I get an error because it's already in use by another process.
So, is there a solution to "bypass" this error?
EDIT 1:
I need to read every row of every file; I get the error on DirectoryInfo.
If I access the folders and files manually, it works fine.
I need to read these folders/files on 3 different machines, but it fails only on this one (192.168.1.209), and it's the only machine where I get an error when I try to delete a file.
Unfortunately, you cannot delete a file that is in use by another process, delete a folder that contains a file in use, or perform any other operation that the owning process has not permitted. To make this possible, the process which has the file in use must open it with shared access, using FileShare from the System.IO .NET library.
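For illustration only, if you controlled the writing process on the machine, it could open the file with shared access roughly like this (a sketch; the file name is made up):

// The writer allows concurrent readers via FileShare.Read.
using (var stream = new FileStream(@"\\192.168.1.209\user\HST\data.txt",
                                   FileMode.Append, FileAccess.Write, FileShare.Read))
using (var writer = new StreamWriter(stream))
{
    writer.WriteLine("new data");
}

If you do not control that process, you can only read the files at moments when the writer has released them.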
I am currently experiencing a serious problem. I am scanning files in a directory, then processing them (reading the contents with File.ReadAllText(fi.FullName)), and then deleting the file. The thread sleeps for 200 ms and starts again (scan, process, delete). The issue is that sometimes a file that has already been deleted appears in the next scan; this does not happen always, only occasionally.
List<FileInfo> files = GetFiles();
if (files != null)
{
    foreach (FileInfo fi in files)
    {
        if (ProcessFile(fi))
        {
            fi.Delete();
            log.Info("Report file: " + fi.FullName + " has been deleted");
        }
    }
}
And here is the GetFiles method
internal List<FileInfo> GetFiles()
{
    try
    {
        DirectoryInfo info = new DirectoryInfo(scanDir);
        List<FileInfo> files = info.GetFiles().OrderBy(p => p.CreationTime).Take(10).ToList(); // oldest files first
        return files;
    }
    catch (Exception ex)
    {
        log.Error("Getting files from directory: " + scanDir + ". Error: " + ex.ToString());
        return null;
    }
}
I have read in other posts that FileInfo.Delete() takes some time, but the Microsoft documentation does not say anything about this. So I am not sure what is happening. Can anybody spot anything wrong with the code? Is there any official documentation as to whether FileInfo.Delete() is a blocking call, or does it simply mark a file for deletion?
EDIT
And here is the only reference to the FileInfo in ProcessFile:
string message = File.ReadAllText(fi.FullName);
I believe that File.ReadAllText closes the file and that no handles should be left open (please correct me if I'm wrong). Also, this only happens occasionally, and not to all files (I am processing 10 files, and it happens to just 1).
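(As I understand it, File.ReadAllText is roughly equivalent to the following, with the underlying handle released as soon as the using block ends; this is just my mental model, not the actual framework source:)

string message;
using (var reader = new StreamReader(fi.FullName))
{
    // The handle is closed when the reader is disposed at the end of the block.
    message = reader.ReadToEnd();
}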
From the Microsoft Reference Source:
public override void Delete()
{
#if FEATURE_CORECLR
    FileSecurityState state = new FileSecurityState(FileSecurityStateAccess.Write, DisplayPath, FullPath);
    state.EnsureState();
#else
    // For security check, path should be resolved to an absolute path.
    new FileIOPermission(FileIOPermissionAccess.Write, new String[] { FullPath }, false, false).Demand();
#endif
    bool r = Win32Native.DeleteFile(FullPath);
    if (!r) {
        int hr = Marshal.GetLastWin32Error();
        if (hr == Win32Native.ERROR_FILE_NOT_FOUND)
            return;
        else
            __Error.WinIOError(hr, DisplayPath);
    }
}
This shows that the Delete method uses the Win32Native.DeleteFile method from kernel32.dll. Further details on kernel32.dll can be found in this 500+ page reference.
On page 79, you can find the reference for DeleteFile:
1.49 DeleteFile
The DeleteFile function deletes an existing file.
DeleteFile: procedure
(
    lpFileName: string
);
stdcall;
returns( "eax" );
external( "__imp__DeleteFileA@4" );
Parameters
lpFileName
[in] Pointer to a null-terminated string that specifies the file to be deleted.
Windows NT/2000: In the ANSI version of this function, the name is limited to MAX_PATH characters. To extend this limit to nearly 32,000 wide characters, call the Unicode version of the function and prepend "\\?\" to the path. For more information, see File Name Conventions.
Windows 95/98: This string must not exceed MAX_PATH characters.
Return Values
If the function succeeds, the return value is nonzero.
If the function fails, the return value is zero. To get extended error information, call GetLastError.
Remarks
If an application attempts to delete a file that does not exist, the DeleteFile function fails.
To delete or rename a file, you must have either delete permission on the file or delete child permission in the parent directory.
If you set up a directory with all access except delete and delete child and the ACLs of new files are inherited, then you should be able to create a file without being able to delete it. However, you can then create a file, and you will get all the access you request on the handle returned to you at the time you create the file.
If you requested delete permission at the time you created the file, you could delete or rename the file with that handle but not with any other.
Windows 95: The DeleteFile function deletes a file even if it is open for normal I/O or as a memory-mapped file. To prevent loss of data, close files before attempting to delete them.
Windows NT/2000: The DeleteFile function fails if an application attempts to delete a file that is open for normal I/O or as a memory-mapped file.
To close an open file, use the CloseHandle function.
MAPI: For more information, see Syntax and Limitations for Win32 Functions Useful in MAPI Development.
Since it is using kernel32.dll, it shares the same mechanism we use when deleting a file from the Windows UI.
internal const String KERNEL32 = "kernel32.dll";
...
[DllImport(KERNEL32, SetLastError=true, CharSet=CharSet.Auto, BestFitMapping=false)]
[ResourceExposure(ResourceScope.Machine)]
internal static extern bool DeleteFile(String path);
Thus, this shows that it is not a "blocking" function as you might have suspected. Provided that the file is deletable (there is no access permission or I/O error), it simply takes time to delete.
One workaround for your case would be to collect every file you want to delete first and then delete them together, say, using an async task or something like BackgroundWorker.
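A rough sketch of that idea, reusing the names from your code (files, ProcessFile, log) and assuming .NET 4.5+ for Task.Run:

// Collect first instead of deleting inline.
var toDelete = new List<FileInfo>();
foreach (FileInfo fi in files)
{
    if (ProcessFile(fi))
        toDelete.Add(fi);
}

// Delete the whole batch off the processing thread.
Task.Run(() =>
{
    foreach (FileInfo fi in toDelete)
    {
        try
        {
            fi.Delete();
            log.Info("Report file: " + fi.FullName + " has been deleted");
        }
        catch (IOException ex)
        {
            log.Error("Could not delete " + fi.FullName + ". Error: " + ex.ToString());
        }
    }
});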
It is clearly stated that File.Move is an atomic operation here: Atomicity of File.Move.
But the following code snippet appears to show the same file being moved multiple times.
Does anyone know what is wrong with this code?
using System;
using System.Collections.Generic;
using System.IO;
using System.Threading.Tasks;

namespace FileMoveTest
{
    class Program
    {
        static void Main(string[] args)
        {
            string path = "test/" + Guid.NewGuid().ToString();
            CreateFile(path, new string('a', 10 * 1024 * 1024));

            var tasks = new List<Task>();
            for (int i = 0; i < 10; i++)
            {
                var task = Task.Factory.StartNew(() =>
                {
                    try
                    {
                        string newPath = path + "." + Guid.NewGuid();
                        File.Move(path, newPath);
                        // this line does NOT solve the issue
                        if (File.Exists(newPath))
                            Console.WriteLine(string.Format("Moved {0} -> {1}", path, newPath));
                    }
                    catch (Exception e)
                    {
                        Console.WriteLine(string.Format("  {0}: {1}", e.GetType(), e.Message));
                    }
                });
                tasks.Add(task);
            }
            Task.WaitAll(tasks.ToArray());
        }

        static void CreateFile(string path, string content)
        {
            string dir = Path.GetDirectoryName(path);
            if (!Directory.Exists(dir))
            {
                Directory.CreateDirectory(dir);
            }
            using (FileStream f = new FileStream(path, FileMode.OpenOrCreate))
            {
                using (StreamWriter w = new StreamWriter(f))
                {
                    w.Write(content);
                }
            }
        }
    }
}
The paradoxical output is below. It seems that the file was moved multiple times to different locations. On disk, only one of them is present. Any thoughts?
Moved test/eb85560d-8c13-41c1-926a-6871be030742 -> test/eb85560d-8c13-41c1-926a-6871be030742.0018d317-ed7c-4732-92ac-3bb974d29017
Moved test/eb85560d-8c13-41c1-926a-6871be030742 -> test/eb85560d-8c13-41c1-926a-6871be030742.3965dc15-7ef9-4f36-bdb7-94a5939b17db
Moved test/eb85560d-8c13-41c1-926a-6871be030742 -> test/eb85560d-8c13-41c1-926a-6871be030742.fb66306a-5a13-4f26-ade2-acff3fb896be
Moved test/eb85560d-8c13-41c1-926a-6871be030742 -> test/eb85560d-8c13-41c1-926a-6871be030742.c6de8827-aa46-48c1-b036-ad4bf79eb8a9
System.IO.FileNotFoundException: Could not find file 'C:\file-move-test\test\eb85560d-8c13-41c1-926a-6871be030742'.
System.IO.FileNotFoundException: Could not find file 'C:\file-move-test\test\eb85560d-8c13-41c1-926a-6871be030742'.
System.IO.FileNotFoundException: Could not find file 'C:\file-move-test\test\eb85560d-8c13-41c1-926a-6871be030742'.
System.IO.FileNotFoundException: Could not find file 'C:\file-move-test\test\eb85560d-8c13-41c1-926a-6871be030742'.
System.IO.FileNotFoundException: Could not find file 'C:\file-move-test\test\eb85560d-8c13-41c1-926a-6871be030742'.
System.IO.FileNotFoundException: Could not find file 'C:\file-move-test\test\eb85560d-8c13-41c1-926a-6871be030742'.
The resulting file is here: eb85560d-8c13-41c1-926a-6871be030742.fb66306a-5a13-4f26-ade2-acff3fb896be
UPDATE. I can confirm that checking File.Exists also does NOT solve the issue: it can report that a single file was really moved into several different locations.
SOLUTION. The solution I ended up with is the following: prior to any operation on the source file, create a special "lock" file; if that succeeds, we can be sure that only this thread has exclusive access to the file, and we are safe to do anything we want with it. Below is the right set of parameters to create such a "lock" file.
File.Open(lockPath, FileMode.CreateNew, FileAccess.Write);
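Put together, the pattern might look roughly like the sketch below; lockPath is an illustrative naming convention for a companion file next to the source file. The key point is that FileMode.CreateNew fails atomically when the file already exists, so exactly one thread or process wins:

string lockPath = path + ".lock";
try
{
    // Succeeds for exactly one caller; throws IOException for the rest.
    using (File.Open(lockPath, FileMode.CreateNew, FileAccess.Write))
    {
        string newPath = path + "." + Guid.NewGuid();
        File.Move(path, newPath);
    }
    File.Delete(lockPath);
}
catch (IOException)
{
    // Someone else holds the lock (or the file was already moved); skip it.
}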
Does anyone know what is wrong with this code?
I guess that depends on what you mean by "wrong".
The behavior you're seeing is not IMHO unexpected, at least if you're using NTFS (other file systems may or may not behave similarly).
The documentation for the underlying OS API (MoveFile() and MoveFileEx() functions) is not specific, but in general the APIs are thread-safe, in that they guarantee the file system will not be corrupted by concurrent operations (of course, your own data could be corrupted, but it will be done in a file-system-coherent way).
Most likely what is occurring is that as the move-file operation proceeds, it does so by first getting the actual file handle from the given directory link to it (in NTFS, all "file names" that you see are actually hard links to an underlying file object). Having obtained that file handle, the API then creates a new file name for the underlying file object (i.e. as a hard link), and then deletes the previous hard link.
Of course, as this progresses, there is a window between a thread obtaining the underlying file handle and the original hard link being deleted. This allows some, but not all, of the other concurrent move operations to appear to succeed. I.e. eventually the original hard link doesn't exist and further attempts to move it won't succeed.
No doubt the above is an oversimplification. File system behaviors can be complex. In particular, your stated observation is that you only wind up with a single instance of the file when all is said and done. This suggests that the API does also somehow coordinate the various operations, such that only one of the newly-created hard links survives, probably by virtue of the API actually just renaming the associated hard link after retrieving the file object handle, as opposed to creating a new one and deleting the old one (implementation detail).
At the end of the day, what's "wrong" with the code is that it is intentionally attempting to perform concurrent operations on a single file. While the file system itself will ensure that it remains coherent, it's up to your own code to ensure that such operations are coordinated so that the results are predictable and reliable.
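In practice, such coordination can be as simple as agreeing that whoever completes File.Move() owns the file, and treating the exceptions from your output as "another thread won the race". A minimal sketch:

try
{
    File.Move(path, newPath);
    // Only the thread that reaches this line owns newPath.
}
catch (FileNotFoundException)
{
    // Another thread moved the source file first; nothing left to do here.
}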
In the code below, I am trying to move all my .doc files from one folder to another. In production, this will move files generated into the folder C:\Temp\ to a network folder; a database job then moves the files from the network folder into our document imaging archive system every 5 minutes.
When trying my below test code, I receive "The process cannot access the file because it is being used by another process."
CODE:
public void moveLocalToCommitFYI()
{
    // MOVE DOCS FROM TEMP FOLDER TO COMMITFYI FOLDER
    string dirSource = @"C:\Users\NAME\Desktop\CommitTest\MoveTest\";
    string dirDest = @"C:\Users\NAME\Desktop\CommitTest\MoveTest\DestTest\";
    try
    {
        Directory.Move(dirSource, dirDest);
    }
    catch (Exception ex)
    {
        MessageBox.Show("Source:\t" + ex.Source + "\nMessage: \t" + ex.Message + "\nData:\t" + ex.Data);
    }
}
What appears strange to me is that when I specify a single source file outright along with its destination, the code functions fine:
public void moveLocalToCommitFYI()
{
    // MOVE DOCS FROM TEMP FOLDER TO COMMITFYI FOLDER
    string dirSource = @"C:\Users\NAME\Desktop\CommitTest\MoveTestFile.doc";
    string dirDest = @"C:\Users\NAME\Desktop\CommitTest\MoveTest\MoveTestFile.doc";
    try
    {
        Directory.Move(dirSource, dirDest);
    }
    catch (Exception ex)
    {
        MessageBox.Show("Source:\t" + ex.Source + "\nMessage: \t" + ex.Message + "\nData:\t" + ex.Data);
    }
}
Surely there is a way to move all my .doc files from one directory to the other without needing to loop through and specify each individual file name?
EDIT:
I modified the code so that my destination folder is a different folder on my desktop instead of a sub-folder. dirSource contained 5 Word documents, and the destination directory (\MoveTest\ on my Desktop) contained no files:
string dirSource = @"C:\Users\NAME\Desktop\CommitTest\MoveTest\";
string dirDest = @"C:\Users\NAME\Desktop\MoveTest\";
This generated "Cannot create a file when that file already exists".
Based on this, I assumed that the code is actually MOVING the specified directory folder and all its contents from one location to another, so I modified my code to the following:
string dirSource = @"C:\Users\NAME\Desktop\CommitTest\MoveTest\";
string dirDest = @"C:\Users\NAME\Desktop\MoveTest2\";
This got rid of my sub-folder \MoveTest\ within \Desktop\CommitTest\ and created folder \Desktop\MoveTest2.
Is there a way for me to move the contents of the folder into an already created destination without getting rid of the source folder?
Your test code is trying to move the entire directory into a directory inside itself. This is not a valid operation. Your actual code is probably trying to move the whole Temp directory to another location. This is not the same as moving all files inside it to another directory.
You'll have to loop through all file names and move each file individually.
var files = Directory.EnumerateFiles(dirSource, "*.doc")
                     .Select(path => new FileInfo(path));

foreach (var file in files)
    file.MoveTo(Path.Combine(dirDest, file.Name));
If you find that the individual moves add too much overhead to the network traffic (especially important if this is a slow Internet connection and not a fast local connection), you might want to zip up all of your doc files (e.g. with ZipFile) before transmitting them to the server, and then have the server unzip them. This compression will probably lower the overall size, and transmitting one file instead of many will reduce the network overhead.
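A minimal sketch of that zip idea, assuming a reference to System.IO.Compression.FileSystem and an illustrative archive name; note that CreateFromDirectory packs everything in the folder, not just the .doc files:

using System.IO.Compression;

// Pack the whole source directory into one archive at the destination.
string archivePath = Path.Combine(dirDest, "docs.zip");
ZipFile.CreateFromDirectory(dirSource, archivePath);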
You cannot move a folder below itself. You need to find all its contents and move them individually. Or do this: rename MoveTest to MoveTest2, create a new MoveTest directory, and now you can move MoveTest2 under it.
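As a sketch, that rename-first approach could look like this, using the paths from the question:

string original = @"C:\Users\NAME\Desktop\CommitTest\MoveTest";
string renamed  = @"C:\Users\NAME\Desktop\CommitTest\MoveTest2";

Directory.Move(original, renamed);                            // rename MoveTest to MoveTest2
Directory.CreateDirectory(original);                          // recreate an empty MoveTest
Directory.Move(renamed, Path.Combine(original, "MoveTest2")); // move MoveTest2 below it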
I have an XML file like:
<CurrentProject>
// Elements like
// last opened project file to reopen it when app starts
// and more global project independend settings
</CurrentProject>
Now I asked myself whether I should deliver this XML file, with the empty elements above, together with my app's installer, or whether I should create the file on the fly at application start if it does not exist and otherwise read the values from it.
Consider also that the user could delete this file, and that should not stop my application from working.
What is better and why?
UPDATE:
What I did felt OK to me, so I post my code here :) It just creates the XML and its structure on the fly, with some safety checks...
public ProjectService(IProjectDataProvider provider)
{
    _provider = provider;
    string applicationPath = Environment.GetFolderPath(Environment.SpecialFolder.ApplicationData);
    _projectPath = Path.Combine(applicationPath, @"TBM\Settings.XML");
    if (!File.Exists(_projectPath))
    {
        string dirPath = Path.Combine(applicationPath, @"TBM");
        if (!Directory.Exists(dirPath))
            Directory.CreateDirectory(dirPath);

        using (var stream = File.Create(_projectPath))
        {
            XElement projectElement = new XElement("Project");
            projectElement.Add(new XElement("DatabasePath"));
            projectElement.Save(stream, SaveOptions.DisableFormatting);
        }
    }
}
In a similar scenario, I recently went for creating the initial file on the fly. The main reason I chose this was the fact that I wasn't depending on this file being there and being valid. As this was a file that's often read from/written to, there's a chance that it could get corrupted (e.g. if the power is lost while the file is being written).
In my code I attempted to open this file for reading and then read the data. If anywhere during these steps I encountered an error, I simply recreated the file with default values and displayed a corresponding message to the user.
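Applied to the Settings.XML scenario above, that read-or-recreate approach might look roughly like this (a sketch; the broad catch stands in for "missing, unreadable, or corrupt", and the element names are taken from the question's code):

XElement settings;
try
{
    settings = XElement.Load(_projectPath);
}
catch (Exception) // file missing, locked, or corrupted
{
    // Recreate the file with default values and inform the user.
    settings = new XElement("Project", new XElement("DatabasePath"));
    settings.Save(_projectPath);
}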