I have some code and when it executes, it throws an IOException, saying that
The process cannot access the file 'filename' because it is being used by
another process
What does this mean, and what can I do about it?
What is the cause?
The error message is pretty clear: you're trying to access a file, and it's not accessible because another process (or even the same process) is doing something with it (and it didn't allow any sharing).
Debugging
It may be pretty easy to solve (or pretty hard to understand), depending on your specific scenario. Let's see some.
Your process is the only one to access that file
You're sure the other process is your own process. If you know you open that file in another part of your program, then first of all you have to check that you properly close the file handle after each use. Here is an example of code with this bug:
var stream = new FileStream(path, FileMode.Open, FileAccess.Read);
var reader = new StreamReader(stream);
// Read data from this file, when I'm done I don't need it any more
File.Delete(path); // IOException: file is in use
Fortunately FileStream implements IDisposable, so it's easy to wrap all your code inside a using statement:
using (var stream = File.Open("myfile.txt", FileMode.Open)) {
// Use stream
}
// Here the stream is not accessible and it has been closed (even if
// an exception was thrown and the stack was unwound).
This pattern will also ensure that the file won't be left open in case of exceptions (it may be the reason the file is in use: something went wrong, and no one closed it; see this post for an example).
If everything seems fine (you're sure you always close every file you open, even in case of exceptions) and you have multiple working threads, then you have two options: rework your code to serialize file access (not always doable and not always wanted) or apply a retry pattern. It's a pretty common pattern for I/O operations: you try to do something and in case of error you wait and try again (did you ask yourself why, for example, Windows Shell takes some time to inform you that a file is in use and cannot be deleted?). In C# it's pretty easy to implement (see also better examples about disk I/O, networking and database access).
private const int NumberOfRetries = 3;
private const int DelayOnRetry = 1000;
for (int i=1; i <= NumberOfRetries; ++i) {
try {
// Do stuff with file
break; // When done we can break loop
}
catch (IOException e) when (i < NumberOfRetries) {
// You may check the error code to filter some exceptions; not every error
// can be recovered. On the last attempt the exception propagates to the caller.
Thread.Sleep(DelayOnRetry);
}
}
Please note a common error we see very often on StackOverflow:
var stream = File.Open(path, FileMode.Open);
var content = File.ReadAllText(path);
In this case ReadAllText() will fail because the file is in use (File.Open() in the line before). To open the file beforehand is not only unnecessary but also wrong. The same applies to all File functions that don't return a handle to the file you're working with: File.ReadAllText(), File.WriteAllText(), File.ReadAllLines(), File.WriteAllLines() and others (like File.AppendAllXyz() functions) will all open and close the file by themselves.
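A minimal corrected version of that snippet, then, is just the single call (the File.Open() line is dropped entirely):
var content = File.ReadAllText(path); // opens and closes the file by itself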
Your process is not the only one to access that file
If your process is not the only one to access that file, then interaction can be harder. A retry pattern will help (if the file shouldn't be open by anyone else but it is, then you need a utility like Process Explorer to check who is doing what).
Ways to avoid
When applicable, always use using statements to open files. As said in the previous paragraph, it'll actively help you to avoid many common errors (see this post for an example on how not to use it).
If possible, try to decide who owns access to a specific file and centralize access through a few well-known methods. If, for example, you have a data file where your program reads and writes, then you should box all I/O code inside a single class. It'll make debugging easier (because you can always put a breakpoint there and see who is doing what) and it'll also be a synchronization point (if required) for multiple accesses.
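As a rough sketch of that idea (the class and member names here are purely illustrative, not from this answer, and assume the usual using System.IO;):
public class DataFileStore
{
    private readonly string _path;
    private readonly object _syncRoot = new object();

    public DataFileStore(string path) { _path = path; }

    // Every read goes through here, so a single breakpoint sees all readers
    public string ReadAll()
    {
        lock (_syncRoot)
            return File.ReadAllText(_path);
    }

    // Every write goes through here too, serialized by the same lock
    public void WriteAll(string content)
    {
        lock (_syncRoot)
            File.WriteAllText(_path, content);
    }
}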
Don't forget that I/O operations can always fail; a common example is this:
if (File.Exists(path))
File.Delete(path);
If someone deletes the file after File.Exists() but before File.Delete(), then it'll throw an IOException in a place where you may wrongly feel safe.
Whenever it's possible, apply a retry pattern, and if you're using FileSystemWatcher, consider postponing action (because you'll get notified, but an application may still be working exclusively with that file).
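For the File.Exists()/File.Delete() race above, a small sketch of a safer shape is to skip the existence check, attempt the delete, and handle the failure (retrying or logging as appropriate):
try
{
    File.Delete(path); // does not throw if the file doesn't exist
}
catch (IOException)
{
    // The file is still in use by some process: wait and retry, or report it
}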
Advanced scenarios
It's not always so easy, so you may need to share access with someone else. If, for example, you're reading from the beginning and writing to the end, you have at least two options.
1) share the same FileStream with proper synchronization functions (because it is not thread-safe). See this and this post for examples.
2) use the FileShare enumeration to instruct the OS to allow other processes (or other parts of your own process) to access the same file concurrently.
using (var stream = File.Open(path, FileMode.Open, FileAccess.Write, FileShare.Read))
{
}
In this example I showed how to open a file for writing and share for reading; please note that when reading and writing overlaps, it results in undefined or invalid data. It's a situation that must be handled when reading. Also note that this doesn't make access to the stream thread-safe, so this object can't be shared with multiple threads unless access is synchronized somehow (see previous links). Other sharing options are available, and they open up more complex scenarios. Please refer to MSDN for more details.
In general, N processes can read from the same file at the same time while only one writes; in a controlled scenario you may even enable concurrent writes, but this can't be generalized in a few paragraphs inside this answer.
Is it possible to unlock a file used by another process? It's not always safe and not so easy but yes, it's possible.
Using FileShare fixed my issue of opening a file even while it is opened by another process.
using (var stream = File.Open(path, FileMode.Open, FileAccess.Write, FileShare.ReadWrite))
{
}
Problem
You are trying to open a file with System.IO.File.Open(path, FileMode) and want shared access to it, but
if you read the documentation of System.IO.File.Open(path, FileMode), it explicitly says it does not allow sharing.
Solution
You have to use the other overload that takes a FileShare value:
using FileStream fs = System.IO.File.Open(filePath, FileMode.Open, FileAccess.Read, FileShare.Read);
with FileShare.Read.
I had an issue where an image couldn't be deleted after uploading it, and found a solution.
//C# .NET
var image = Image.FromFile(filePath);
image.Dispose(); // this releases the file handle and other resources
//later...
File.Delete(filePath); //now works
As other answers in this thread have pointed out, to resolve this error you need to carefully inspect the code, to understand where the file is getting locked.
In my case, I was sending out the file as an email attachment before performing the move operation.
So the file was locked for a couple of seconds until the SMTP client finished sending the email.
The solution I adopted was to move the file first, and then send the email. This solved the problem for me.
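A rough sketch of that ordering (attachmentPath, destinationPath and smtpClient are placeholders, not names from the original code):
// Move the file first, while nothing else is holding it open...
File.Move(attachmentPath, destinationPath);

// ...then attach it from its new location and send the email
using (var message = new MailMessage())
using (var attachment = new Attachment(destinationPath))
{
    message.Attachments.Add(attachment);
    smtpClient.Send(message); // hypothetical SmtpClient instance
}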
Another possible solution, as pointed out earlier by Hudson, would've been to dispose the object after use.
public static void SendEmail()
{
MailMessage mMailMessage = new MailMessage();
//setup other email stuff
if (File.Exists(attachmentPath))
{
Attachment attachment = new Attachment(attachmentPath);
mMailMessage.Attachments.Add(attachment);
attachment.Dispose(); //disposing the Attachment object
}
}
I got this error because I was doing a File.Move to a file path without a file name; you need to specify the full path, including the file name, in the destination.
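In other words (the paths here are only illustrative):
// Wrong: the destination is a directory, so the move fails
// File.Move(@"C:\temp\report.txt", @"C:\archive");

// Right: the destination includes the file name
File.Move(@"C:\temp\report.txt", @"C:\archive\report.txt");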
The error indicates another process is trying to access the file. Maybe you or someone else has it open while you are attempting to write to it. "Read" or "Copy" usually doesn't cause this, but writing to it or calling delete on it would.
There are some basic things to avoid this, as other answers have mentioned:
For FileStream operations, place the stream in a using block and open it with FileShare.ReadWrite.
For example:
using (FileStream stream = File.Open(path, FileMode.Open, FileAccess.Write, FileShare.ReadWrite))
{
}
Note that FileAccess.ReadWrite is not possible if you use FileMode.Append.
I ran across this issue when I was using an input stream to do a File.SaveAs while the file was in use. In my case I found I didn't actually need to save it back to the file system at all, so I ended up just removing that, but I probably could've tried creating a FileStream in a using statement with FileAccess.ReadWrite, much like the code above.
Saving your data to a different file, then deleting the old one once it is no longer in use, and finally renaming the successfully saved file to the original name, is an option. Testing whether the file is in use is done through the
List<Process> lstProcs = ProcessHandler.WhoIsLocking(file);
line in my code below. This could be done in a Windows service, in a loop, if you have a particular file you want to watch and delete regularly whenever you replace it. If it isn't always the same file, a text file or database table could hold the file names for the service to check; it would then look up the locking processes, kill them, and delete the file, as I describe in the next option. Note that, of course, you'll need an account with Admin privileges on the given computer to end processes and perform the deletion.
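A sketch of that save-and-swap idea, without the process-killing part, could use File.Replace instead of a separate delete and rename (targetPath and newContent are only illustrative):
var tempPath = targetPath + ".tmp";
File.WriteAllText(tempPath, newContent);   // write the new data to a different file first

try
{
    // Swap the new file in place of the old one, keeping a backup copy;
    // this throws IOException while the old file is still in use, so it can be retried later
    File.Replace(tempPath, targetPath, targetPath + ".bak");
}
catch (IOException)
{
    // The old file is still locked: leave the .tmp in place and try again on the next pass
}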
When you don't know if a file will be in use when you are trying to save it, you can close all processes that could be using it, like Word, if it's a Word document, ahead of the save.
If it is local, you can do this:
ProcessHandler.localProcessKill("winword.exe");
If it is remote, you can do this:
ProcessHandler.remoteProcessKill(computerName, txtUserName, txtPassword, "winword.exe");
where txtUserName is in the form of DOMAIN\user.
Let's say you don't know the process name that is locking the file. Then, you can do this:
List<Process> lstProcs = new List<Process>();
lstProcs = ProcessHandler.WhoIsLocking(file);
foreach (Process p in lstProcs)
{
if (p.MachineName == ".")
ProcessHandler.localProcessKill(p.ProcessName);
else
ProcessHandler.remoteProcessKill(p.MachineName, txtUserName, txtPassword, p.ProcessName);
}
Note that file must be the UNC path: \\computer\share\yourdoc.docx in order for the Process to figure out what computer it's on and p.MachineName to be valid.
Below is the class these functions use, which requires adding a reference to System.Management. The code was originally written by Eric J.:
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Threading.Tasks;
using System.Runtime.InteropServices;
using System.Diagnostics;
using System.Management;
namespace MyProject
{
public static class ProcessHandler
{
[StructLayout(LayoutKind.Sequential)]
struct RM_UNIQUE_PROCESS
{
public int dwProcessId;
public System.Runtime.InteropServices.ComTypes.FILETIME ProcessStartTime;
}
const int RmRebootReasonNone = 0;
const int CCH_RM_MAX_APP_NAME = 255;
const int CCH_RM_MAX_SVC_NAME = 63;
enum RM_APP_TYPE
{
RmUnknownApp = 0,
RmMainWindow = 1,
RmOtherWindow = 2,
RmService = 3,
RmExplorer = 4,
RmConsole = 5,
RmCritical = 1000
}
[StructLayout(LayoutKind.Sequential, CharSet = CharSet.Unicode)]
struct RM_PROCESS_INFO
{
public RM_UNIQUE_PROCESS Process;
[MarshalAs(UnmanagedType.ByValTStr, SizeConst = CCH_RM_MAX_APP_NAME + 1)]
public string strAppName;
[MarshalAs(UnmanagedType.ByValTStr, SizeConst = CCH_RM_MAX_SVC_NAME + 1)]
public string strServiceShortName;
public RM_APP_TYPE ApplicationType;
public uint AppStatus;
public uint TSSessionId;
[MarshalAs(UnmanagedType.Bool)]
public bool bRestartable;
}
[DllImport("rstrtmgr.dll", CharSet = CharSet.Unicode)]
static extern int RmRegisterResources(uint pSessionHandle,
UInt32 nFiles,
string[] rgsFilenames,
UInt32 nApplications,
[In] RM_UNIQUE_PROCESS[] rgApplications,
UInt32 nServices,
string[] rgsServiceNames);
[DllImport("rstrtmgr.dll", CharSet = CharSet.Auto)]
static extern int RmStartSession(out uint pSessionHandle, int dwSessionFlags, string strSessionKey);
[DllImport("rstrtmgr.dll")]
static extern int RmEndSession(uint pSessionHandle);
[DllImport("rstrtmgr.dll")]
static extern int RmGetList(uint dwSessionHandle,
out uint pnProcInfoNeeded,
ref uint pnProcInfo,
[In, Out] RM_PROCESS_INFO[] rgAffectedApps,
ref uint lpdwRebootReasons);
/// <summary>
/// Find out what process(es) have a lock on the specified file.
/// </summary>
/// <param name="path">Path of the file.</param>
/// <returns>Processes locking the file</returns>
/// <remarks>See also:
/// http://msdn.microsoft.com/en-us/library/windows/desktop/aa373661(v=vs.85).aspx
/// http://wyupdate.googlecode.com/svn-history/r401/trunk/frmFilesInUse.cs (no copyright in code at time of viewing)
///
/// </remarks>
static public List<Process> WhoIsLocking(string path)
{
uint handle;
string key = Guid.NewGuid().ToString();
List<Process> processes = new List<Process>();
int res = RmStartSession(out handle, 0, key);
if (res != 0) throw new Exception("Could not begin restart session. Unable to determine file locker.");
try
{
const int ERROR_MORE_DATA = 234;
uint pnProcInfoNeeded = 0,
pnProcInfo = 0,
lpdwRebootReasons = RmRebootReasonNone;
string[] resources = new string[] { path }; // Just checking on one resource.
res = RmRegisterResources(handle, (uint)resources.Length, resources, 0, null, 0, null);
if (res != 0) throw new Exception("Could not register resource.");
//Note: there's a race condition here -- the first call to RmGetList() returns
// the total number of processes. However, when we call RmGetList() again to get
// the actual processes this number may have increased.
res = RmGetList(handle, out pnProcInfoNeeded, ref pnProcInfo, null, ref lpdwRebootReasons);
if (res == ERROR_MORE_DATA)
{
// Create an array to store the process results
RM_PROCESS_INFO[] processInfo = new RM_PROCESS_INFO[pnProcInfoNeeded];
pnProcInfo = pnProcInfoNeeded;
// Get the list
res = RmGetList(handle, out pnProcInfoNeeded, ref pnProcInfo, processInfo, ref lpdwRebootReasons);
if (res == 0)
{
processes = new List<Process>((int)pnProcInfo);
// Enumerate all of the results and add them to the
// list to be returned
for (int i = 0; i < pnProcInfo; i++)
{
try
{
processes.Add(Process.GetProcessById(processInfo[i].Process.dwProcessId));
}
// catch the error -- in case the process is no longer running
catch (ArgumentException) { }
}
}
else throw new Exception("Could not list processes locking resource.");
}
else if (res != 0) throw new Exception("Could not list processes locking resource. Failed to get size of result.");
}
finally
{
RmEndSession(handle);
}
return processes;
}
public static void remoteProcessKill(string computerName, string userName, string pword, string processName)
{
var connectoptions = new ConnectionOptions();
connectoptions.Username = userName;
connectoptions.Password = pword;
ManagementScope scope = new ManagementScope(@"\\" + computerName + @"\root\cimv2", connectoptions);
// WMI query
var query = new SelectQuery("select * from Win32_process where name = '" + processName + "'");
using (var searcher = new ManagementObjectSearcher(scope, query))
{
foreach (ManagementObject process in searcher.Get())
{
process.InvokeMethod("Terminate", null);
process.Dispose();
}
}
}
public static void localProcessKill(string processName)
{
foreach (Process p in Process.GetProcessesByName(processName))
{
p.Kill();
}
}
[DllImport("kernel32.dll")]
public static extern bool MoveFileEx(string lpExistingFileName, string lpNewFileName, int dwFlags);
public const int MOVEFILE_DELAY_UNTIL_REBOOT = 0x4;
}
}
I had this problem and it was solved by the following code:
var _path=MyFile.FileName;
using (var stream = new FileStream
(_path, FileMode.Open, FileAccess.Read, FileShare.ReadWrite))
{
// Your Code! ;
}
I had a very specific situation where I was getting an "IOException: The process cannot access the file 'file path'" on the line
File.Delete(fileName);
Inside an NUnit test that looked like:
Assert.Throws<IOException>(() =>
{
using (var sr = File.OpenText(fileName)) {
var line = sr.ReadLine();
}
});
File.Delete(fileName);
It turns out NUnit 3 uses something they call "isolated context" for exception assertions. This probably runs on a separate thread.
My fix was to put the File.Delete in the same context.
Assert.Throws<IOException>(() =>
{
try
{
using (var sr = File.OpenText(fileName)) {
var line = sr.ReadLine();
}
}
catch
{
File.Delete(fileName);
throw;
}
});
I had the following scenario that was causing the same error:
Upload files to the server
Then get rid of the old files after they have been uploaded
Most files were small in size; however, a few were large, and attempting to delete those resulted in the "cannot access the file" error.
It was not easy to find, but the solution was as simple as waiting for the task to complete execution:
using (var wc = new WebClient())
{
var tskResult = wc.UploadFileTaskAsync(_address, _fileName);
tskResult.Wait();
}
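If the calling method can be made async, a variant of the same fix awaits the upload instead of blocking on Wait(), which also avoids having the failure wrapped in an AggregateException:
using (var wc = new WebClient())
{
    await wc.UploadFileTaskAsync(_address, _fileName);
}
File.Delete(_fileName); // safe now: the upload has completed and released the file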
In my case this problem was solved by opening the file for shared writing/reading. Below are sample snippets for shared writing and reading:
Stream Writer
using(FileStream fs = new FileStream("D:\\test.txt",
FileMode.Append, FileAccess.Write, FileShare.ReadWrite))
using (StreamWriter sw = new StreamWriter(fs))
{
sw.WriteLine("any thing which you want to write");
}
Stream Reader
using (FileStream fs = new FileStream("D:\\test.txt", FileMode.Open, FileAccess.Read, FileShare.ReadWrite))
using (StreamReader rr = new StreamReader(fs))
{
rr.ReadLine();
}
My code below works around this issue, but I have a suggestion:
First of all, you need to understand what is causing the issue and try to fix that by changing your code.
I can offer another way around the problem, but the better solution is to check your code structure and analyse what makes this happen; if you do not find a solution, then you can fall back on the code below.
// The label must sit outside the try block: a goto cannot jump into a try.
Start:
try
{
    // Put your file access code here
}
catch (Exception ex)
{
    // You still need to handle this error; matching on the message text is fragile,
    // so prefer catching IOException and inspecting ex.HResult where possible.
    if (ex.Message.StartsWith("The process cannot access the file"))
    {
        // Wait 5 seconds for the file to be freed, then start execution again
        Thread.Sleep(5000);
        goto Start;
    }
    throw; // anything else should not be swallowed
}
I have a class that uses a FileStream internally with the FileOptions.DeleteOnClose flag. The normal behavior is: when I allocate the class with a filename, I do not use DeleteOnClose; otherwise I use it.
The only problem is that it sometimes happens that I need to undo the DeleteOnClose.
It would be too lengthy to explain the deeper details here. Of course I could create a copy and copy the contents of the FileStream that was opened with DeleteOnClose into another FileStream, but the file size is too large (>= 30 GB), so this approach is impractical.
Deleting the file manually does not work, since the classes are more or less used as memory containers that need to be handled by the GC. Also, when something goes wrong, it's not helpful to have dead files lying around.
So I was hoping there is a way to undo the DeleteOnClose attribute, similar to how, e.g., SetFileAttributes lets the Temporary flag be set and unset.
With the comments of TheGeneral and TonPlooij I created a small example to test FileDispositionInfo, but somehow this also doesn't work (copied from http://source.roslyn.codeplex.com/#Roslyn.Test.Utilities/TempFiles/DisposableFile.cs,4d5c94058d1b4cd3):
using Microsoft.Win32.SafeHandles;
using System;
using System.IO;
using System.Runtime.InteropServices;
namespace ConsoleApp11
{
class Program
{
[DllImport("kernel32.dll", PreserveSig = false)]
private static extern void SetFileInformationByHandle(SafeFileHandle handle, int fileInformationClass, ref uint fileDispositionInfoDeleteFile, int bufferSize);
private const int FileDispositionInfo = 4;
internal static void PrepareDeleteOnCloseStreamForDisposal(FileStream stream)
{
// tomat: Set disposition to "delete" on the stream, so to avoid ForeFront EndPoint
// Protection driver scanning the file. Note that after calling this on a file that's open with DeleteOnClose,
// the file can't be opened again, not even by the same process.
uint trueValue = 1;
SetFileInformationByHandle(stream.SafeFileHandle, FileDispositionInfo, ref trueValue, sizeof(uint));
}
/// <summary>
/// Marks given file for automatic deletion when all its handles are closed.
/// Note that after doing this the file can't be opened again, not even by the same process.
/// </summary>
internal static void DeleteFileOnClose(string fullPath)
{
using (var stream = new FileStream(fullPath, FileMode.Create, FileAccess.ReadWrite, FileShare.Delete| FileShare.ReadWrite, 8))
{
PrepareDeleteOnCloseStreamForDisposal(stream);
}
}
static void Main(string[] args)
{
DeleteFileOnClose("D:\\test.dat");
Console.WriteLine("Done.");
Console.ReadKey();
}
}
}
To answer your question directly: no, you can't change the option midway through. The file stream uses the Win32 flag FILE_FLAG_DELETE_ON_CLOSE, and this can't be changed midway through; it's essentially associated with the file handle, and when the handle is closed the operating system cleans up the file. That's it, and there is no workaround.
If you want different behavior, you will have to implement it yourself after the fact, i.e. delete (or keep) the file after it's closed.
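A minimal sketch of "implementing it yourself after the fact" is to open the file without DeleteOnClose and decide only at close time whether to delete it (keepFile is just an illustrative flag):
bool keepFile = false; // can still be flipped to true at any time before closing

var stream = new FileStream(path, FileMode.Create, FileAccess.ReadWrite, FileShare.Read);
try
{
    // work with the stream as usual; set keepFile = true if the file should survive
}
finally
{
    stream.Dispose();
    if (!keepFile)
        File.Delete(path); // emulates DeleteOnClose only when the file wasn't kept
}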
The best alternative I came up with was to create a hard link to the temporary file if I want to persist it. DeleteOnClose apparently does not actually delete the file, but only the link to it, which means that if we create another hard link to it before closing the file stream, it will be persisted (with a new name).
This of course requires a file system that supports hard links, and I'm not sure if this will work on MacOS/Linux, but on Windows NTFS it seems to work as advertised.
So something like the following appears to work, if it is acceptable to use a temporary file name until we decide to persist it:
public class TemporaryFile : FileStream
{
[DllImport("Kernel32.dll", CharSet = CharSet.Unicode, SetLastError = true)]
private static extern bool CreateHardLink(string lpFileName, string lpExistingFileName, IntPtr lpSecurityAttributes);
public TemporaryFile(string path, FileMode fileMode = FileMode.Create, FileAccess fileAccess = FileAccess.ReadWrite, FileShare fileShare = FileShare.ReadWrite | FileShare.Delete, int bufferSize = 4096, FileOptions options = FileOptions.None)
: base(path, fileMode, fileAccess, fileShare, bufferSize, GetFileOptions(options))
{
}
public void PersistTo(string path)
{
if (!CreateHardLink(path, Name, IntPtr.Zero))
{
Marshal.ThrowExceptionForHR(Marshal.GetHRForLastWin32Error());
}
}
private static FileOptions GetFileOptions(FileOptions options)
{
return options | FileOptions.DeleteOnClose;
}
}
Which could then be used like:
using var tempFile = new TemporaryFile(@"c:\tempFile.tmp");
// Write data to TempFile
if (shouldFileBePersisted)
tempFile.PersistTo(@"c:\permanentFileName.txt");
It isn't a perfect solution, but it worked for my purposes, so I thought I'd share it.
I need to checksum every single file on a given USB disk in a C# application. I suspect the bottleneck here is the actual read off the disk so I'm looking to make this as fast as possible.
I suspect this would be much quicker if I could read the files on the disk sequentially, in the actual order they appear on the disk (assuming the drive is not fragmented).
How can I find this information for each file from its standard path? I.e. given a file at "F:\MyFile.txt", how can I find the start location of this file on the disk?
I'm running a C# application in Windows.
Now... I don't really know if it will be useful for you:
[StructLayout(LayoutKind.Sequential)]
public struct StartingVcnInputBuffer
{
public long StartingVcn;
}
public static readonly int StartingVcnInputBufferSizeOf = Marshal.SizeOf(typeof(StartingVcnInputBuffer));
[StructLayout(LayoutKind.Sequential)]
public struct RetrievalPointersBuffer
{
public uint ExtentCount;
public long StartingVcn;
public long NextVcn;
public long Lcn;
}
public static readonly int RetrievalPointersBufferSizeOf = Marshal.SizeOf(typeof(RetrievalPointersBuffer));
[DllImport("kernel32.dll", CharSet = CharSet.Unicode, SetLastError = true)]
public static extern SafeFileHandle CreateFileW(
[MarshalAs(UnmanagedType.LPWStr)] string filename,
[MarshalAs(UnmanagedType.U4)] FileAccess access,
[MarshalAs(UnmanagedType.U4)] FileShare share,
IntPtr securityAttributes,
[MarshalAs(UnmanagedType.U4)] FileMode creationDisposition,
[MarshalAs(UnmanagedType.U4)] FileAttributes flagsAndAttributes,
IntPtr templateFile);
[DllImport("kernel32.dll", ExactSpelling = true, SetLastError = true, CharSet = CharSet.Auto)]
static extern bool DeviceIoControl(IntPtr hDevice, uint dwIoControlCode,
ref StartingVcnInputBuffer lpInBuffer, int nInBufferSize,
out RetrievalPointersBuffer lpOutBuffer, int nOutBufferSize,
out int lpBytesReturned, IntPtr lpOverlapped);
// Returns a FileStream that can only Read
public static void GetStartLogicalClusterNumber(string fileName, out FileStream file, out long startLogicalClusterNumber)
{
SafeFileHandle handle = CreateFileW(fileName, FileAccess.Read | (FileAccess)0x80 /* FILE_READ_ATTRIBUTES */, FileShare.Read, IntPtr.Zero, FileMode.Open, 0, IntPtr.Zero);
if (handle.IsInvalid)
{
throw new Win32Exception();
}
file = new FileStream(handle, FileAccess.Read);
var svib = new StartingVcnInputBuffer();
int error;
RetrievalPointersBuffer rpb;
int bytesReturned;
DeviceIoControl(handle.DangerousGetHandle(), (uint)589939 /* FSCTL_GET_RETRIEVAL_POINTERS */, ref svib, StartingVcnInputBufferSizeOf, out rpb, RetrievalPointersBufferSizeOf, out bytesReturned, IntPtr.Zero);
error = Marshal.GetLastWin32Error();
switch (error)
{
case 38: /* ERROR_HANDLE_EOF */
startLogicalClusterNumber = -1; // empty file. Choose how to handle
break;
case 0: /* NO_ERROR */
case 234: /* ERROR_MORE_DATA */
startLogicalClusterNumber = rpb.Lcn;
break;
default:
throw new Win32Exception();
}
}
Note that the method will return a FileStream that you can keep open and use to read the file, or you can easily modify it to not return it (and not create it) and then reopen the file when you want to hash it.
To use:
string[] fileNames = Directory.GetFiles(@"D:\");
foreach (string fileName in fileNames)
{
try
{
long startLogicalClusterNumber;
FileStream file;
GetStartLogicalClusterNumber(fileName, out file, out startLogicalClusterNumber);
}
catch (Exception e)
{
Console.WriteLine("Skipping: {0} for {1}", fileName, e.Message);
}
}
I'm using the API described here: https://web.archive.org/web/20160130161216/http://www.wd-3.com/archive/luserland.htm . The program is much easier because you only need the initial Logical Cluster Number (the first version of the code could extract all the LCN extents, but it would be useless, because you have to hash a file from first to last byte). Note that empty files (files with length 0) don't have any cluster allocated. The function returns -1 for the cluster (ERROR_HANDLE_EOF). You can choose how to handle it.
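Putting the pieces together, a rough sketch of reading in on-disk order could sort the paths by their starting LCN before hashing (ComputeChecksum is a placeholder for whatever hash routine you use):
var filesByLcn = new List<KeyValuePair<long, string>>();
foreach (string fileName in Directory.GetFiles(@"D:\"))
{
    long startLcn;
    FileStream file;
    GetStartLogicalClusterNumber(fileName, out file, out startLcn);
    file.Dispose(); // only the start LCN is needed here; reopen when hashing
    filesByLcn.Add(new KeyValuePair<long, string>(startLcn, fileName));
}

filesByLcn.Sort((a, b) => a.Key.CompareTo(b.Key)); // ascending by start LCN, i.e. roughly disk order
foreach (var entry in filesByLcn)
{
    // ComputeChecksum(entry.Value); // placeholder: hash the file at entry.Value
}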
If your drives are SSD or based on memory stick technology - forget it.
Memory sticks and other similar devices are generally based on SSD (or similar) technology, where random read/write access is not actually a problem. So you can just enumerate the files and run your checksum.
You can try running this in several threads, but I am not sure that would speed up the process; it's something you may need to test, and it may also vary from device to device.
Bonus
@xanatos mentioned an interesting point: "I always noticed that copying thousands of files on a memory stick is much slower than copying a single big file"
It is indeed much faster to copy one big file rather than a pile of small files, and the reason is (usually) not that the files are located close to each other, making it easier for the hardware to read them sequentially. The problem comes down to the OS having to keep track of each file.
If you ever run Procmon on Windows, you will observe a huge number of FileCreates, FileReads and FileWrites. In order to copy 100 files, the OS opens each file, reads its content, writes it to another file, and closes both files, plus lots of update operations sent to the file system, such as updating attributes for both files, updating security descriptors for both files, updating directory information, etc. So one copy operation has many satellite operations.
I have some code and when it executes, it throws a IOException, saying that
The process cannot access the file 'filename' because it is being used by
another process
What does this mean, and what can I do about it?
What is the cause?
The error message is pretty clear: you're trying to access a file, and it's not accessible because another process (or even the same process) is doing something with it (and it didn't allow any sharing).
Debugging
It may be pretty easy to solve (or pretty hard to understand), depending on your specific scenario. Let's see some.
Your process is the only one to access that file
You're sure the other process is your own process. If you know you open that file in another part of your program, then first of all you have to check that you properly close the file handle after each use. Here is an example of code with this bug:
var stream = new FileStream(path, FileAccess.Read);
var reader = new StreamReader(stream);
// Read data from this file, when I'm done I don't need it any more
File.Delete(path); // IOException: file is in use
Fortunately FileStream implements IDisposable, so it's easy to wrap all your code inside a using statement:
using (var stream = File.Open("myfile.txt", FileMode.Open)) {
// Use stream
}
// Here stream is not accessible and it has been closed (also if
// an exception is thrown and stack unrolled
This pattern will also ensure that the file won't be left open in case of exceptions (it may be the reason the file is in use: something went wrong, and no one closed it; see this post for an example).
If everything seems fine (you're sure you always close every file you open, even in case of exceptions) and you have multiple working threads, then you have two options: rework your code to serialize file access (not always doable and not always wanted) or apply a retry pattern. It's a pretty common pattern for I/O operations: you try to do something and in case of error you wait and try again (did you ask yourself why, for example, Windows Shell takes some time to inform you that a file is in use and cannot be deleted?). In C# it's pretty easy to implement (see also better examples about disk I/O, networking and database access).
private const int NumberOfRetries = 3;
private const int DelayOnRetry = 1000;
for (int i=1; i <= NumberOfRetries; ++i) {
try {
// Do stuff with file
break; // When done we can break loop
}
catch (IOException e) when (i <= NumberOfRetries) {
// You may check error code to filter some exceptions, not every error
// can be recovered.
Thread.Sleep(DelayOnRetry);
}
}
Please note a common error we see very often on StackOverflow:
var stream = File.Open(path, FileOpen.Read);
var content = File.ReadAllText(path);
In this case ReadAllText() will fail because the file is in use (File.Open() in the line before). To open the file beforehand is not only unnecessary but also wrong. The same applies to all File functions that don't return a handle to the file you're working with: File.ReadAllText(), File.WriteAllText(), File.ReadAllLines(), File.WriteAllLines() and others (like File.AppendAllXyz() functions) will all open and close the file by themselves.
Your process is not the only one to access that file
If your process is not the only one to access that file, then interaction can be harder. A retry pattern will help (if the file shouldn't be open by anyone else but it is, then you need a utility like Process Explorer to check who is doing what).
Ways to avoid
When applicable, always use using statements to open files. As said in previous paragraph, it'll actively help you to avoid many common errors (see this post for an example on how not to use it).
If possible, try to decide who owns access to a specific file and centralize access through a few well-known methods. If, for example, you have a data file where your program reads and writes, then you should box all I/O code inside a single class. It'll make debug easier (because you can always put a breakpoint there and see who is doing what) and also it'll be a synchronization point (if required) for multiple access.
Don't forget I/O operations can always fail, a common example is this:
if (File.Exists(path))
File.Delete(path);
If someone deletes the file after File.Exists() but before File.Delete(), then it'll throw an IOException in a place where you may wrongly feel safe.
Whenever it's possible, apply a retry pattern, and if you're using FileSystemWatcher, consider postponing action (because you'll get notified, but an application may still be working exclusively with that file).
Advanced scenarios
It's not always so easy, so you may need to share access with someone else. If, for example, you're reading from the beginning and writing to the end, you have at least two options.
1) share the same FileStream with proper synchronization functions (because it is not thread-safe). See this and this posts for an example.
2) use FileShare enumeration to instruct OS to allow other processes (or other parts of your own process) to access same file concurrently.
using (var stream = File.Open(path, FileMode.Open, FileAccess.Write, FileShare.Read))
{
}
In this example I showed how to open a file for writing and share for reading; please note that when reading and writing overlaps, it results in undefined or invalid data. It's a situation that must be handled when reading. Also note that this doesn't make access to the stream thread-safe, so this object can't be shared with multiple threads unless access is synchronized somehow (see previous links). Other sharing options are available, and they open up more complex scenarios. Please refer to MSDN for more details.
In general N processes can read from same file all together but only one should write, in a controlled scenario you may even enable concurrent writings but this can't be generalized in few text paragraphs inside this answer.
Is it possible to unlock a file used by another process? It's not always safe and not so easy but yes, it's possible.
Using FileShare fixed my issue of opening file even if it is opened by another process.
using (var stream = File.Open(path, FileMode.Open, FileAccess.Write, FileShare.ReadWrite))
{
}
Problem
one is tying to open file System.IO.File.Open(path, FileMode) with this method and want a shared access on file but
if u read documentation of System.IO.File.Open(path, FileMode) it is explicitly saying its does not allow sharing
Solution
use you have to use other override with FileShare
using FileStream fs = System.IO.File.Open(filePath, FileMode.Open, FileAccess.Read, FileShare.Read);
with FileShare.Read
Had an issue while uploading an image and couldn't delete it and found a solution. gl hf
//C# .NET
var image = Image.FromFile(filePath);
image.Dispose(); // this removes all resources
//later...
File.Delete(filePath); //now works
As other answers in this thread have pointed out, to resolve this error you need to carefully inspect the code, to understand where the file is getting locked.
In my case, I was sending out the file as an email attachment before performing the move operation.
So the file got locked for couple of seconds until SMTP client finished sending the email.
The solution I adopted was to move the file first, and then send the email. This solved the problem for me.
Another possible solution, as pointed out earlier by Hudson, would've been to dispose the object after use.
public static SendEmail()
{
MailMessage mMailMessage = new MailMessage();
//setup other email stuff
if (File.Exists(attachmentPath))
{
Attachment attachment = new Attachment(attachmentPath);
mMailMessage.Attachments.Add(attachment);
attachment.Dispose(); //disposing the Attachment object
}
}
I got this error because I was doing File.Move to a file path without a file name, need to specify the full path in the destination.
The error indicates another process is trying to access the file. Maybe you or someone else has it open while you are attempting to write to it. "Read" or "Copy" usually doesn't cause this, but writing to it or calling delete on it would.
There are some basic things to avoid this, as other answers have mentioned:
In FileStream operations, place it in a using block with a FileShare.ReadWrite mode of access.
For example:
using (FileStream stream = File.Open(path, FileMode.Open, FileAccess.Write, FileShare.ReadWrite))
{
}
Note that FileAccess.ReadWrite is not possible if you use FileMode.Append.
I ran across this issue when I was using an input stream to do a File.SaveAs when the file was in use. In my case I found, I didn't actually need to save it back to the file system at all, so I ended up just removing that, but I probably could've tried creating a FileStream in a using statement with FileAccess.ReadWrite, much like the code above.
Saving your data as a different file and going back to delete the old one when it is found to be no longer in use, then renaming the one that saved successfully to the name of the original one is an option. How you test for the file being in use is accomplished through the
List<Process> lstProcs = ProcessHandler.WhoIsLocking(file);
line in my code below, and could be done in a Windows service, on a loop, if you have a particular file you want to watch and delete regularly when you want to replace it. If you don't always have the same file, a text file or database table could be updated that the service always checks for file names, and then performs that check for processes & subsequently performs the process kills and deletion on it, as I describe in the next option. Note that you'll need an account user name and password that has Admin privileges on the given computer, of course, to perform the deletion and ending of processes.
When you don't know if a file will be in use when you are trying to save it, you can close all processes that could be using it, like Word, if it's a Word document, ahead of the save.
If it is local, you can do this:
ProcessHandler.localProcessKill("winword.exe");
If it is remote, you can do this:
ProcessHandler.remoteProcessKill(computerName, txtUserName, txtPassword, "winword.exe");
where txtUserName is in the form of DOMAIN\user.
Let's say you don't know the process name that is locking the file. Then, you can do this:
List<Process> lstProcs = ProcessHandler.WhoIsLocking(file);
foreach (Process p in lstProcs)
{
if (p.MachineName == ".")
ProcessHandler.localProcessKill(p.ProcessName);
else
ProcessHandler.remoteProcessKill(p.MachineName, txtUserName, txtPassword, p.ProcessName);
}
Note that file must be the UNC path: \\computer\share\yourdoc.docx in order for the Process to figure out what computer it's on and p.MachineName to be valid.
Below is the class these functions use, which requires adding a reference to System.Management. The code was originally written by Eric J.:
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Threading.Tasks;
using System.Runtime.InteropServices;
using System.Diagnostics;
using System.Management;
namespace MyProject
{
public static class ProcessHandler
{
[StructLayout(LayoutKind.Sequential)]
struct RM_UNIQUE_PROCESS
{
public int dwProcessId;
public System.Runtime.InteropServices.ComTypes.FILETIME ProcessStartTime;
}
const int RmRebootReasonNone = 0;
const int CCH_RM_MAX_APP_NAME = 255;
const int CCH_RM_MAX_SVC_NAME = 63;
enum RM_APP_TYPE
{
RmUnknownApp = 0,
RmMainWindow = 1,
RmOtherWindow = 2,
RmService = 3,
RmExplorer = 4,
RmConsole = 5,
RmCritical = 1000
}
[StructLayout(LayoutKind.Sequential, CharSet = CharSet.Unicode)]
struct RM_PROCESS_INFO
{
public RM_UNIQUE_PROCESS Process;
[MarshalAs(UnmanagedType.ByValTStr, SizeConst = CCH_RM_MAX_APP_NAME + 1)]
public string strAppName;
[MarshalAs(UnmanagedType.ByValTStr, SizeConst = CCH_RM_MAX_SVC_NAME + 1)]
public string strServiceShortName;
public RM_APP_TYPE ApplicationType;
public uint AppStatus;
public uint TSSessionId;
[MarshalAs(UnmanagedType.Bool)]
public bool bRestartable;
}
[DllImport("rstrtmgr.dll", CharSet = CharSet.Unicode)]
static extern int RmRegisterResources(uint pSessionHandle,
UInt32 nFiles,
string[] rgsFilenames,
UInt32 nApplications,
[In] RM_UNIQUE_PROCESS[] rgApplications,
UInt32 nServices,
string[] rgsServiceNames);
[DllImport("rstrtmgr.dll", CharSet = CharSet.Auto)]
static extern int RmStartSession(out uint pSessionHandle, int dwSessionFlags, string strSessionKey);
[DllImport("rstrtmgr.dll")]
static extern int RmEndSession(uint pSessionHandle);
[DllImport("rstrtmgr.dll")]
static extern int RmGetList(uint dwSessionHandle,
out uint pnProcInfoNeeded,
ref uint pnProcInfo,
[In, Out] RM_PROCESS_INFO[] rgAffectedApps,
ref uint lpdwRebootReasons);
/// <summary>
/// Find out what process(es) have a lock on the specified file.
/// </summary>
/// <param name="path">Path of the file.</param>
/// <returns>Processes locking the file</returns>
/// <remarks>See also:
/// http://msdn.microsoft.com/en-us/library/windows/desktop/aa373661(v=vs.85).aspx
/// http://wyupdate.googlecode.com/svn-history/r401/trunk/frmFilesInUse.cs (no copyright in code at time of viewing)
///
/// </remarks>
static public List<Process> WhoIsLocking(string path)
{
uint handle;
string key = Guid.NewGuid().ToString();
List<Process> processes = new List<Process>();
int res = RmStartSession(out handle, 0, key);
if (res != 0) throw new Exception("Could not begin restart session. Unable to determine file locker.");
try
{
const int ERROR_MORE_DATA = 234;
uint pnProcInfoNeeded = 0,
pnProcInfo = 0,
lpdwRebootReasons = RmRebootReasonNone;
string[] resources = new string[] { path }; // Just checking on one resource.
res = RmRegisterResources(handle, (uint)resources.Length, resources, 0, null, 0, null);
if (res != 0) throw new Exception("Could not register resource.");
//Note: there's a race condition here -- the first call to RmGetList() returns
// the total number of process. However, when we call RmGetList() again to get
// the actual processes this number may have increased.
res = RmGetList(handle, out pnProcInfoNeeded, ref pnProcInfo, null, ref lpdwRebootReasons);
if (res == ERROR_MORE_DATA)
{
// Create an array to store the process results
RM_PROCESS_INFO[] processInfo = new RM_PROCESS_INFO[pnProcInfoNeeded];
pnProcInfo = pnProcInfoNeeded;
// Get the list
res = RmGetList(handle, out pnProcInfoNeeded, ref pnProcInfo, processInfo, ref lpdwRebootReasons);
if (res == 0)
{
processes = new List<Process>((int)pnProcInfo);
// Enumerate all of the results and add them to the
// list to be returned
for (int i = 0; i < pnProcInfo; i++)
{
try
{
processes.Add(Process.GetProcessById(processInfo[i].Process.dwProcessId));
}
// catch the error -- in case the process is no longer running
catch (ArgumentException) { }
}
}
else throw new Exception("Could not list processes locking resource.");
}
else if (res != 0) throw new Exception("Could not list processes locking resource. Failed to get size of result.");
}
finally
{
RmEndSession(handle);
}
return processes;
}
public static void remoteProcessKill(string computerName, string userName, string pword, string processName)
{
var connectoptions = new ConnectionOptions();
connectoptions.Username = userName;
connectoptions.Password = pword;
ManagementScope scope = new ManagementScope(@"\\" + computerName + @"\root\cimv2", connectoptions);
// WMI query
var query = new SelectQuery("select * from Win32_process where name = '" + processName + "'");
using (var searcher = new ManagementObjectSearcher(scope, query))
{
foreach (ManagementObject process in searcher.Get())
{
process.InvokeMethod("Terminate", null);
process.Dispose();
}
}
}
public static void localProcessKill(string processName)
{
foreach (Process p in Process.GetProcessesByName(processName))
{
p.Kill();
}
}
[DllImport("kernel32.dll")]
public static extern bool MoveFileEx(string lpExistingFileName, string lpNewFileName, int dwFlags);
public const int MOVEFILE_DELAY_UNTIL_REBOOT = 0x4;
}
}
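The MoveFileEx import at the bottom of the class isn't exercised above. As a last resort (a sketch, reusing the example UNC path), passing null as the new name together with MOVEFILE_DELAY_UNTIL_REBOOT schedules a stubbornly locked file for deletion at the next reboot:
// Requires administrative rights; the file disappears only after the machine restarts.
ProcessHandler.MoveFileEx(@"\\computer\share\yourdoc.docx", null, ProcessHandler.MOVEFILE_DELAY_UNTIL_REBOOT);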
I had this problem, and it was solved by the following code:
var _path = MyFile.FileName;
using (var stream = new FileStream(_path, FileMode.Open, FileAccess.Read, FileShare.ReadWrite))
{
    // Your code here
}
I had a very specific situation where I was getting an "IOException: The process cannot access the file 'file path'" on the line
File.Delete(fileName);
Inside an NUnit test that looked like:
Assert.Throws<IOException>(() =>
{
using (var sr = File.OpenText(fileName)) {
var line = sr.ReadLine();
}
});
File.Delete(fileName);
It turns out NUnit 3 uses something they call "isolated context" for exception assertions. This probably runs on a separate thread.
My fix was to put the File.Delete in the same context.
Assert.Throws<IOException>(() =>
{
try
{
using (var sr = File.OpenText(fileName)) {
var line = sr.ReadLine();
}
}
catch
{
File.Delete(fileName);
throw;
}
});
I had the following scenario that was causing the same error:
Upload files to the server
Then get rid of the old files after they have been uploaded
Most files were small, but a few were large, and attempting to delete those resulted in the "cannot access file" error.
It was not easy to track down, but the solution was as simple as waiting for the task to complete execution:
using (var wc = new WebClient())
{
var tskResult = wc.UploadFileTaskAsync(_address, _fileName);
tskResult.Wait();
}
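If the calling method can be made async, an equivalent sketch awaits the upload instead of blocking on Wait() (same _address and _fileName as above):
using (var wc = new WebClient())
{
    await wc.UploadFileTaskAsync(_address, _fileName); // the handle is released once the task completes
}
File.Delete(_fileName); // safe to delete now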
In my case, this problem was solved by opening the file for shared writing/reading. Below are samples for shared writing and reading:
Stream Writer
using(FileStream fs = new FileStream("D:\\test.txt",
FileMode.Append, FileAccess.Write, FileShare.ReadWrite))
using (StreamWriter sw = new StreamWriter(fs))
{
sw.WriteLine("any thing which you want to write");
}
Stream Reader
using (FileStream fs = new FileStream("D:\\test.txt", FileMode.Open, FileAccess.Read, FileShare.ReadWrite))
using (StreamReader rr=new StreamReader(fs))
{
rr.ReadLine();
}
The code below solved the issue for me, but first I suggest
that you work out what is causing it and try to fix that by changing your code.
I can offer another way around it, but the better solution is to review your code structure and analyse what makes this happen; if you cannot find a fix, you can fall back on the code below. (Keep in mind that matching on the exception message, as done here, is fragile: the text is localized and may change between framework versions.)
Start:
try
{
    // Put your file access code here
}
catch (IOException ex)
{
    // You need to handle this error in some way; here we simply retry
    if (ex.Message.StartsWith("The process cannot access the file"))
    {
        // Wait 5 seconds for the file to be freed, then start execution again
        Thread.Sleep(5000);
        goto Start;
    }
}
I'm trying to make use of the ImageMagick COM object (ImageMagickObject) in a .NET library. This library is intended to be called from IronRuby, but that isn't all that important. I want to take this approach because it fits with my existing calls, which currently invoke the ImageMagick binaries as external processes. The COM object takes the same arguments as the binaries, but it saves the process creation and is about 5x faster overall.
My only hurdle is that the "Compare" method for the COM object returns its result to STDERR. This is also a problem with the binary, but it's easy to pipe that back into STDOUT, where I was expecting it. With the COM object, I'm getting my results from function return values.
How can I redirect the result from "Compare" to a string buffer or even a file instead of STDERR?
I have tried the following, which does stop the output from reaching STDERR, but it doesn't write to the file as expected:
using System;
using System.IO;
using System.Runtime.InteropServices;
using ImageMagickObject;
...
public class ImageMagickCOM
{
[DllImport("Kernel32.dll", SetLastError = true)]
public static extern int SetStdHandle(int device, IntPtr handle);
private const int STDOUT_HANDLE = -11;
private const int STDERR_HANDLE = -12;
private ImageMagickObject.MagickImage magickImage = null;
private FileStream filestream = null;
private StreamWriter streamwriter = null;
public ImageMagickCOM()
{
IntPtr handle;
int status;
filestream = new FileStream("output.txt", FileMode.Create);
streamwriter = new StreamWriter(filestream);
streamwriter.AutoFlush = true;
//handle = filestream.Handle; // deprecated
handle = filestream.SafeFileHandle.DangerousGetHandle(); // replaces filestream.handle
status = SetStdHandle(STDOUT_HANDLE, handle);
status = SetStdHandle(STDERR_HANDLE, handle);
Console.SetOut(streamwriter);
Console.SetError(streamwriter);
magickImage = new ImageMagickObject.MagickImage();
}
public string Compare()
{
object[] args = new object[] { "-metric", "AE", "-fuzz", "10%", "imageA.jpg", "imageB.jpg", "diff.png" };
return (string)this.magickImage.Compare(ref args);
}
public void Close()
{
if (this.magickImage != null)
{
Marshal.ReleaseComObject(magickImage);
this.magickImage = null;
}
if (this.streamwriter != null)
{
this.streamwriter.Flush();
this.streamwriter.Close();
this.streamwriter = null;
this.filestream = null;
}
}
}
Only the "Compare" action seems to use STDERR to send a result (it uses the return value as a success indicator). All of the other methods (Identify, Convert, Mogrify, etc) work as you would expect.
For reference, it gets called something like this (from IronRuby):
require 'ImagingLib.dll'
im = ImagingLib::ImageMagickCOM.new
im.compare # returns nil
im.close
And output.txt is created, but empty. Nothing gets printed to STDOUT or STDERR.
EDITS: For clarity regarding streamwriter flush/close and how the sample is used from IronRuby.
Did you try disposing (or flushing) the writer and stream? The output could have gotten stuck in the buffer. A using block might help there.
using (filestream = new FileStream("output.txt", FileMode.Create))
using (streamwriter = new StreamWriter(filestream))
{
    ...
}
After adding the debug option, I finally got some text in the file.
Is that the result you are expecting? If so, see the debug option of the compare command-line tool.
Please note that setting Console.Error or Console.Out to streamwriter will cause an exception if anything is logged, directly or indirectly, through those properties.
If you really need to set those properties, set them to another instance of StreamWriter. If you do not need them, omit the calls to Console.SetOut and Console.SetError.
I have also noticed that once an ImageMagickObject.MagickImage instance has been created, it is no longer possible to redirect the standard error. If the standard error redirection needs to be undone, or performed multiple times, try executing the code in another application domain.
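A sketch of that suggestion (the log file name is hypothetical): give Console its own writer, so later Console logging never touches the redirected native handles.
var consoleWriter = new StreamWriter("console.log") { AutoFlush = true }; // separate from the redirected stream
Console.SetOut(consoleWriter);
Console.SetError(consoleWriter);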
It looks like .Handle is deprecated; have you tried .SafeFileHandle.DangerousGetHandle() instead?
Ref: http://msdn.microsoft.com/en-us/library/system.runtime.interopservices.safehandle.dangerousgethandle.aspx
(Additionally: can you confirm that you close the StreamWriter after the data has been written? For instance, try moving the close to just before the app exits, to see whether the data makes its way out.)
According to the documentation, "FileShare.Read is the default for those FileStream constructors without a FileShare parameter"... Perhaps it would help to set FileAccess.Read, FileAccess.Write, FileShare.Read, and FileShare.Write in the FileStream constructor?
I don't have the ImageMagick executable to play with; if you can post a link, that would be great.
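Something like this is what I had in mind (just a sketch; any flags beyond those in the question are guesses, and the file name comes from your example):
using (var fs = new FileStream("output.txt", FileMode.Create, FileAccess.ReadWrite, FileShare.ReadWrite))
using (var sw = new StreamWriter(fs) { AutoFlush = true })
{
    // hand fs.SafeFileHandle to SetStdHandle and sw to Console.SetOut/SetError, as in the question's code
}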