We have a C# WinForms application that runs on the client. The application downloads a file from an FTP server, saves it to a shared drive (hosted on the server), and the server then runs some code to decrypt the file and writes the decrypted file back to the shared drive, in the same location as the encrypted file, under a different file name. When this completes, execution passes back to the client, which checks whether the decrypted file exists. It then throws an exception because the file cannot be found.
Client code:
protected void DecryptFile(string aEncryptedFilePath, string aDecryptedFilePath)
{
    AppController.Task.SystemTaskManager.ExecuteTask(
        new TaskRequest(TaskPgpFileDecrypt.TaskId, aEncryptedFilePath, aDecryptedFilePath));

    if (!File.Exists(aDecryptedFilePath))
        throw new FileNotFoundException("File does not exist"); // Exception thrown here
}
Server code:
public TaskResponse Execute(string aSourceFilePath, string aDestinationFilePath)
{
    // Decryption Code

    if (!File.Exists(aDestinationFilePath))
        throw new ApplicationException($"Could not {ActionDescription} file {aSourceFilePath}. Expected output file {aDestinationFilePath} is missing.");

    using (var fileStream = new FileStream(aDestinationFilePath, FileMode.Open))
        fileStream.Flush(true);

    return new TaskResponse();
}
I've simplified it as best I can, but you can see that the client passes in aEncryptedFilePath and what it expects to be aDecryptedFilePath, and the server decrypts the encrypted file and stores it at the path in aDestinationFilePath.
Now in the server code, you can also see that we check if the file exists and, if it doesn't, the server throws an exception. But here's the kicker: the server's file-exists check returns true and execution continues. It's only when we reach the client-side code that the File.Exists check fails! We've tried flushing the buffer to ensure the file is written to disk, but this doesn't help at all.
Another bit of useful information: I can verify that the file exists, because if I watch the folder where the file is created, I can see it appear. However, if I click the file immediately after it shows up, I get a file-in-use warning from Windows.
If I close the warning and wait a second or two, I am then able to open the file.
What could be causing this behaviour?
The task executes asynchronously. Based on the code here, you are firing the decrypt command and then immediately checking whether the file exists, right after you have issued the command to decrypt it. Because the decryption takes some server cycles to complete, you need to wait for the task to finish before you access the file.
There is a really good MSDN doc that shows how to wait for a task to complete before continuing the code execution here: https://learn.microsoft.com/en-us/dotnet/api/system.threading.tasks.task?view=netframework-4.8#WaitingForOne
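The pattern from that doc can be sketched as follows. `ExecuteTask`, `TaskRequest`, and the decrypt call are the poster's own types, so this uses a plain `Task` as a hypothetical stand-in to show the wait-then-check ordering:

```csharp
using System;
using System.IO;
using System.Threading.Tasks;

class DecryptWaitSketch
{
    // Hypothetical stand-in for the server-side decrypt call.
    static Task DecryptAsync(string encryptedPath, string decryptedPath) =>
        Task.Run(() =>
        {
            // Simulate the server writing the decrypted file.
            File.WriteAllText(decryptedPath, "decrypted");
        });

    static void Main()
    {
        string decryptedPath = Path.Combine(Path.GetTempPath(), "decrypted.txt");

        // Wait() blocks until the task has finished, so the
        // File.Exists check below runs only after the file is written.
        DecryptAsync("input.pgp", decryptedPath).Wait();

        if (!File.Exists(decryptedPath))
            throw new FileNotFoundException("File does not exist", decryptedPath);

        Console.WriteLine("Decrypted file found.");
    }
}
```

In async code you would `await` the task instead of calling `Wait()`, but the principle is the same: the existence check must happen after the operation has completed, not after it has merely been started.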
Related
There is a project assigned to me that creates some files and sends them to a server using SFTP. Another program continuously checks the server for new files, reads them, and sends them on to another place. The problem is that the second program sometimes reads files that have not been completely uploaded to the server yet, which causes the system to crash. I was told to change the first program so that it renames the files before sending, and renames the uploaded files on the server back again once the upload has finished. Is this possible, or is there a better way to do it? If anyone has ideas, please share them with me.
That's a good synchronization method: use a temporary name during the transfer and rename it once at the end.
The implementation depends on which approach you've used in the program.
It should be something like this:
// Rename the file or directory:
success = sftp.RenameFileOrDir("oldFilename.txt", "newFilename.txt");
if (!success)
{
    Console.WriteLine(sftp.LastErrorText);
    return;
}
Basically:
Catch the transfer-completed event
Issue the RENAME command
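The same temp-name-then-rename idea, shown with local file APIs as a minimal sketch (the actual SFTP calls depend on which library you use; the names here are illustrative). A watcher polling for the final name never sees a half-written file:

```csharp
using System;
using System.IO;

class TempNameUploadSketch
{
    // Write to a temporary ".part" name, then rename once the write
    // has fully completed, so a watcher never sees a partial file.
    static void WriteAtomically(string finalPath, byte[] data)
    {
        string tempPath = finalPath + ".part";
        File.WriteAllBytes(tempPath, data); // "transfer" in progress
        File.Move(tempPath, finalPath);     // rename at the very end
    }

    static void Main()
    {
        string target = Path.Combine(Path.GetTempPath(), "report.csv");
        if (File.Exists(target)) File.Delete(target);

        WriteAtomically(target, new byte[] { 1, 2, 3 });

        Console.WriteLine(File.Exists(target));           // True
        Console.WriteLine(File.Exists(target + ".part")); // False
    }
}
```

The reading side then only picks up files with the final name (or extension) and skips anything still carrying the temporary suffix.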
I have a page that uploads .txt files using the PLUpload library (PLUpload). It worked when I tested on a client computer in every browser: IE, Chrome, FF... But when I test on the Windows Server hosting this website, it throws this error:
The process cannot access the file 'SystemPath\Test.txt' because it is
being used by another process.
The website is written in ASP.NET, and I think the root cause is related to the security of Windows Server. The error code is Error #-200: HTTP Error.
Here is the upload code:
using System.IO;

MemoryStream uploadStream = new MemoryStream();
using (FileStream source = File.Open(tempFile, FileMode.Open))
{
    source.CopyTo(uploadStream);
}
Question: Why does IE throw that error only on Windows Server, and how do I fix it?
There are a variety of processes that could lock up the file.
Eric Lippert suggested that it could be the antivirus: C# file is being used by another process
João Sousa recommends checking your code to make sure it disposes of all connections to the file when it is done: Process cannot access the file because it is being used by another process
Because the error is coming from the file system, anything that interacts with the file system could be locking the file. These may not be the cause of your error, but they are good places to start looking.
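If the lock turns out to be transient (an antivirus scan, an upload that hasn't quite finished), a bounded retry is a common workaround. This is a generic sketch, not specific to PLUpload; the attempt count and delay are arbitrary values you would tune:

```csharp
using System;
using System.IO;
using System.Threading;

class RetryOpenSketch
{
    // Try to open the file for reading, retrying while it is locked.
    static FileStream OpenWithRetry(string path, int attempts = 5, int delayMs = 200)
    {
        for (int i = 0; ; i++)
        {
            try
            {
                return new FileStream(path, FileMode.Open, FileAccess.Read, FileShare.Read);
            }
            catch (IOException) when (i < attempts - 1)
            {
                Thread.Sleep(delayMs); // still locked; wait and try again
            }
        }
    }

    static void Main()
    {
        string path = Path.Combine(Path.GetTempPath(), "upload-sample.txt");
        File.WriteAllText(path, "hello");

        using (FileStream source = OpenWithRetry(path))
        using (var uploadStream = new MemoryStream())
        {
            source.CopyTo(uploadStream);
            Console.WriteLine(uploadStream.Length); // 5
        }
    }
}
```

Opening with `FileShare.Read` also keeps your own handle from blocking other readers while the copy is in progress.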
I'm experiencing an issue with an FTP watcher service and the File.Move method.
The FTP server is a simple IIS 8.5 FTP site and the FTP client is FileZilla FTP Client
The windows service will poll a directory where the files are to be dropped.
The first task is to rename the file, using the static File.Move method.
The second, is to copy the file to another directory using the static File.Copy method.
The issue is that while the file is being transferred, the File.Copy will [correctly] throw an IO Exception if it is used, with the message "The file is being used by another process".
However, File.Move performs its task without throwing any exception while the file is still being transferred. Is this the correct behavior for this method? I've not been able to find any information on why this occurs. My impression was that File.Move would throw an exception if it's used on a file that's being used by another process [the FTP transfer], but it doesn't seem to.
Has anyone experienced this and/or have an explanation for the behavior of the File.Move method?
Copying a file requires opening it for read access. The FTP server currently has the file open such that you cannot open it for reading.
Moving a file does not require opening it for read access unless the file is on a different volume than the destination.
Since moving a file to the same volume requires only delete access and not read access, the FTP server must lock the files for read and write, but not delete.
This code shows that File.Move will indeed throw an exception if the file is in use when you try to move it, so I think your premise is incorrect.
var filePath = @"d:\public\temp\temp.txt";
var moveToPath = @"d:\public\temp\temp2.txt";

// Create a stream reader so the file is 'in use'
using (var fileStream = new StreamReader(filePath))
{
    // This will fail with an IOException
    File.Move(filePath, moveToPath);
}
Exception:
The process cannot access the file because it is being used by another process.
Moving a file is effectively implemented as a mere rename and only requires write permission on the target and source directory. For a real copy you need read permissions on the file itself. As there is an exclusive lock on the source file, the copy will fail, however, the move will succeed.
I am trying to write to a file in an ASP.NET form (.aspx). I can create the file fine using:
if (!File.Exists(Settings.Default.FileLocation))
{
    File.Create(Settings.Default.FileLocation);
}
But when I go to write to the file using this code:
File.WriteAllBytes(Settings.Default.FileLocation, someByteArray);
I get an exception:
System.IO.IOException: The process cannot access the file 'C:\inetpub\wwwroot\mysite.com\captured\captured.xml' because it is being used by another process.
(in this case 'C:\inetpub\wwwroot\mysite.com\captured\captured.xml' == Settings.Default.FileLocation)
I can't delete or copy the file in Windows Explorer during this time either. However, if I stop or restart the application pool the WebForm is running in, the error goes away. What is the cause of this, and how do I prevent it?
This is running on Windows Server 2012 R2 and IIS 7.5.
Read the documentation on MSDN. You should be wrapping the file in a using statement so that any open handles to it are closed:
using (FileStream fs = File.Create(Settings.Default.FileLocation))
{
    // manipulate the file here
}
Also, File.WriteAllBytes will create the file if it doesn't exist, so there's no need to separately create it.
File.Create opens the file for read/write by default and needs to be closed before it can be used again. From the docs:
By default, full read/write access to new files is granted to all users. The file is opened with read/write access and must be closed before it can be opened by another application.
File.WriteAllBytes will create the file if it doesn't exist, so I think your code to check for file existence and creating it is probably overkill.
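So the existence check and the File.Create call can be dropped entirely. A minimal sketch (the path here is an illustrative temp-file stand-in for Settings.Default.FileLocation):

```csharp
using System;
using System.IO;

class WriteAllBytesSketch
{
    static void Main()
    {
        string path = Path.Combine(Path.GetTempPath(), "captured-sample.xml");
        if (File.Exists(path)) File.Delete(path);

        byte[] someByteArray = { 0x3C, 0x78, 0x2F, 0x3E }; // "<x/>"

        // Creates the file if it doesn't exist, overwrites it otherwise,
        // and closes its handle before returning - no lingering lock.
        File.WriteAllBytes(path, someByteArray);

        Console.WriteLine(File.ReadAllBytes(path).Length); // 4
    }
}
```

Because WriteAllBytes opens, writes, and closes the file in one call, a subsequent write or delete won't hit the "being used by another process" error that a dangling File.Create handle causes.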
I'm using the WinSCP .NET assembly to upload a file. OperationResultBase.Check() is throwing the following error:
WinSCP.SessionRemoteException: Transfer was successfully finished, but temporary transfer file 'testfile.zip.filepart' could not be renamed to target file name 'testfile.zip'. If the problem persists, you may want to turn off transfer resume support.
It seems that this happens with any zip file that I try to send. If it makes a difference, these are zip files that were created using the DotNetZip library.
Code that I'm using, taken pretty much directly from the example in the WinSCP documentation:
public void uploadFile(string filePath, string remotePath)
{
    TransferOptions transferOptions = new TransferOptions();
    transferOptions.TransferMode = TransferMode.Binary;

    TransferOperationResult transferResult;
    transferResult = currentSession.PutFiles(filePath, remotePath, false, transferOptions);
    transferResult.Check();

    foreach (TransferEventArgs transfer in transferResult.Transfers)
    {
        Console.WriteLine("Upload of {0} succeeded", transfer.FileName);
    }
}
Discussion over at the WinSCP forum indicates that the assembly doesn't yet allow programmatic control of transfer resume support. Is there a workaround for this?
It sounds as if the filesystem on the destination server does not allow file renames. That would cause the rename at the end of the upload to fail even though the complete file was uploaded and written to the filesystem under the temporary file name used while the transfer was in progress. If you don't have administrative access to the destination server, you can test this by trying to rename a file that is already there. If that also fails, you will either need to have the proper permissions on the destination server changed, or follow the advice in the error message and turn off resume support, so the file is opened for writing under the desired filename from the start instead of the temporary name (with the .filepart extension).
Turn off resume support:
put *.txt -nopreservetime -nopermissions -resumesupport=off
It would help if you included the full error message, including the root cause as returned by the server.
My guess is that there's an antivirus application (or similar) running on the server side. The antivirus application checks each file once its upload finishes. That conflicts with WinSCP's attempt to rename the file once the upload is finished. The problem may tend to occur more frequently for .ZIP archives, either because they tend to be larger or simply because they need to be extracted before the check (which takes time).
Anyway, you can disable the transfer to a temporary file name using TransferOptions.ResumeSupport.
See also the documentation for the error message "Transfer was successfully finished, but temporary transfer file ... could not be renamed to target file name ..."
All you have to do is disable resume support using the code below:
transferOptions.ResumeSupport = new TransferResumeSupport { State = TransferResumeSupportState.Off };