File remains open after process killed - c#

My code, with some non-essential bits stripped out, looks like:
using (Process p = new Process())
{
    p.StartInfo.FileName = batfile;
    p.StartInfo.Arguments = @"""" + filename + @"""";
    p.Start();
    if (!p.WaitForExit(timeout))
    {
        p.Kill();
        if (!p.WaitForExit(timeout))
        {
            return failure;
        }
    }
}
File.Delete(filename);
return success;
So, I create a process to execute a .bat file, pass the .bat file a filename to process, then start the process, wait for it to terminate, then delete the file it just processed. Normally, this works just fine.
Sometimes though, what's going on in the .bat file hangs, and I have to kill the process to recover. When that happens, the File.Delete() throws an exception "The process cannot access the file 'filename' because it is being used by another process".
Thinking that maybe I just needed to wait a bit, I put a Thread.Sleep() for several seconds before the File.Delete() - I have increased the sleep time to 30 seconds and I'm still seeing the exception. Surely the open handle doesn't hang around that long after a process has died?
How do I either fix or workaround this?
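(For what it's worth, a common first workaround is to retry the delete with a short back-off instead of a single fixed sleep. A minimal sketch only, assuming the stale handle is eventually released; as the related questions below show, if a child process spawned by the .bat still has the file open, you may need to kill the whole process tree instead.)
// Sketch of a retry-with-backoff delete (name and timings are illustrative).
// Requires: using System.IO; using System.Threading;
static bool TryDeleteWithRetry(string path, int attempts = 10, int delayMs = 1000)
{
    for (int i = 0; i < attempts; i++)
    {
        try
        {
            File.Delete(path);
            return true;
        }
        catch (IOException)
        {
            // Still locked, e.g. by a child process that outlived the Kill().
            Thread.Sleep(delayMs);
        }
    }
    return false;
}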

Related

Batch script calling robocopy in Process won't terminate

If process.Kill() is called from another thread, or even another program, and the batch script uses robocopy.exe, the process never comes out of WaitForExit() until robocopy is finished, as if it hadn't been killed.
Robocopy.exe is called from the batch script. Every other script or program ends as you'd expect.
ProcessStartInfo startInfo = new ProcessStartInfo();
startInfo.FileName = "batch.bat";
startInfo.UseShellExecute = false;
startInfo.CreateNoWindow = true;
startInfo.RedirectStandardOutput = true;
startInfo.RedirectStandardError = true;

Process process = new Process();
process.StartInfo = startInfo;
process.OutputDataReceived += CaptureHandler;
process.ErrorDataReceived += CaptureHandler;
process.Start();
process.BeginOutputReadLine();
process.BeginErrorReadLine();
process.WaitForExit();
The batch script looks like:
@echo off
call "robocopy.exe" "somedir" "somedest" /mir /fp /ndl /njh /njs /ns
I have a feeling it has to do with the output handlers.
I tried using process.CancelErrorRead() and process.CancelOutputRead() as well, both after the Kill() call and before it - no luck.
Oddly, if you use the process.WaitForExit(timeout) overload, it will return true immediately after Kill() from the other thread. However, it's lying. The process is still running! If you then call process.WaitForExit() again, as per the MSDN doc, it will still wait for the process to finish, despite HasExited saying true.
To ensure that asynchronous event handling has been completed, call the WaitForExit() overload that takes no parameter after receiving a true from this overload.
https://msdn.microsoft.com/en-us/library/ty0d8k56(v=vs.110).aspx
You are successfully killing the batch processor (cmd.exe) but doing so won't kill robocopy, which is a separate process.
It doesn't seem to be documented, but when we look at the .NET source code it turns out that the Process.WaitForExit() method doesn't just wait for the process to exit, it also waits for end-of-file on the standard output and standard error streams. In this scenario, that means that it waits for robocopy to finish even after the batch processor has been killed.
(The overload of Process.WaitForExit with a timeout does not have this extra logic.)
I think this constitutes a bug in the .NET framework. At the very least, it should be documented.
As a workaround, you can use .HasExited and/or the version of WaitForExit with a timeout to determine whether the process has exited or not. Of course, in your scenario you might prefer to wait for grandchild processes, in which case your code is already behaving as desired.
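In code, that workaround might look roughly like this (a sketch, not a guaranteed fix):
process.Kill();

// The parameterless WaitForExit() would also wait for end-of-file on the
// redirected output/error streams, i.e. for robocopy to finish. The timeout
// overload (and HasExited) only reflect the state of cmd.exe itself.
if (process.WaitForExit(5000))
{
    // cmd.exe is gone; robocopy may still be running as an orphaned child.
}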
I ran into the same problem. In my case, dropping the /mt switch from the RoboCopy argument list seemed to fix the issue.
Having followed up on Harry Johnston's helpful answer, I found that the process completes normally when you avoid RedirectStandardOutput = true. If this isn't an acceptable solution I found that using robocopy's /LOG:"C:\logs\robocopy.txt" switch to send its standard output to an external log file also works (although you lose the ability to get the file/directory log output from the process object itself).
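A sketch of the /LOG variant described above (the ProcessStartInfo values mirror the question; the log path is the one passed to the /LOG switch):
// No output redirection: robocopy writes its output to the /LOG file instead.
var psi = new ProcessStartInfo
{
    FileName = "batch.bat",   // the batch calls robocopy ... /LOG:"C:\logs\robocopy.txt"
    UseShellExecute = false,
    CreateNoWindow = true
};

using (var proc = Process.Start(psi))
{
    proc.WaitForExit();
}

// Read robocopy's file/directory log from the log file afterwards.
string robocopyLog = File.ReadAllText(@"C:\logs\robocopy.txt");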
It looks like, right now, the only way to do this without the application having to know to terminate Robocopy.exe specifically is to kill the children of the script process before killing the script itself:
Kill process tree programmatically in C#
/// <summary>
/// Kill a process, and all of its children, grandchildren, etc.
/// Requires a reference to System.Management ("using System.Management;").
/// </summary>
/// <param name="pid">Process ID.</param>
private static void KillProcessAndChildren(int pid)
{
    ManagementObjectSearcher searcher = new ManagementObjectSearcher(
        "Select * From Win32_Process Where ParentProcessID=" + pid);
    ManagementObjectCollection moc = searcher.Get();
    foreach (ManagementObject mo in moc)
    {
        // Recurse into children before killing the parent.
        KillProcessAndChildren(Convert.ToInt32(mo["ProcessID"]));
    }
    try
    {
        Process proc = Process.GetProcessById(pid);
        proc.Kill();
    }
    catch (ArgumentException)
    {
        // Process already exited.
    }
}
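Applied to the code from the first question, the idea would be to replace the bare p.Kill() with a call that takes the children down too (sketch):
if (!p.WaitForExit(timeout))
{
    // Kill cmd.exe and anything it started (robocopy.exe, etc.),
    // so nothing is left holding the file open.
    KillProcessAndChildren(p.Id);
}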

C# process.Kill does not immediately stop a process that is running a batch file

The issue: I am having an issue where I cannot immediately stop a batch file running inside a C# app (process) by using processName.Kill().
In my project, the batch file will run several python scripts, taking around 30 minutes to complete in total. It will sporadically dump several lines of output to console, which I am capturing using the associated process' standard output.
I am setting up and starting the process like so:
var process = new Process
{
    StartInfo = new ProcessStartInfo
    {
        UseShellExecute = false,
        FileName = @"C:\Test.bat",
        WorkingDirectory = @"C:\",
        RedirectStandardOutput = true,
    },
};
process.Start();
The batch file that I am testing with is extremely simple:
echo starting
timeout /t 4
echo it worked
I am using a cancellation token from a shared cancellation token source and cancelling after 2000 seconds. I have attached the following event to the token:
ct.Register(() =>
{
process.Kill();
});
I am sending the standardOutput to a blocking collection of strings line by line like so:
Task.Factory.StartNew(() =>
{
while (!streamReader.EndOfStream)
{
var line = streamReader.ReadLine();
if (line == null) break;
_blockingCollection.Add(line);
}
_blockingCollection.CompleteAdding();
});
What I expect to happen: because the process has been killed by the event raised by the cancellation token, I would expect the streamReader.ReadLine() method to either immediately return null or throw an exception. Instead, it waits for the timeout in the batch file to complete and then returns the full line that the timeout prints. In my real application, it could be waiting 10-15 minutes for the next output from the scripts.
Am I misunderstanding how process.Kill() works? Is it that timeout behaves oddly? In an ideal situation, I would like the stream reader to immediately stop reading and break out of the while loop.
I tried doing this by having an event raised every time the process output received new data; however, I could not synchronize these to run in the right order, as the scripts had a tendency to dump several hundred lines of output at the same time.
Thanks
(Sorry I can't comment.)
I guess it is because the timeout command is the troublesome part here. I once tried to kill a batch that was sitting in a timeout from the Command Prompt, and the same thing happened: the batch couldn't be terminated until the timeout had finished.
Maybe you can try killing timeout.exe after killing the batch.
Sorry if I am misunderstanding your question. I'm just a newbie here.
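(Process.Kill does not take a process name, so a sketch of what this answer suggests would look the child up by name - assuming no unrelated timeout.exe instances are running:)
process.Kill();

// Kill any timeout.exe the batch file left behind.
foreach (var t in Process.GetProcessesByName("timeout"))
{
    try { t.Kill(); }
    catch (InvalidOperationException) { /* it already exited */ }
}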

How to stop a process after you run it?

This is the code I used to run the following exe programs. How can I end these 3 processes after I have run them?
Process.Start(@"C:\Bot-shortcut\DIE1.exe");
Process.Start(@"C:\Bot-shortcut\DIE2.exe");
Process.Start(@"C:\Bot-shortcut\DIE3.exe");
First, store the process object returned when you start the process.
If you want it to close normally, as though someone had clicked the close icon, then use CloseMainWindow. This simulates clicking the close icon so that the process can shut down normally. This is always preferable to killing the process, which can corrupt data.
If you want it to die instantly then use Kill. Note that this can corrupt data; the process might have been writing to a file when you killed it.
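A common pattern (a sketch) is to ask the process to close first and only kill it if it does not exit in time:
// Ask nicely, then fall back to Kill() if there is no main window to close
// or the process ignores the request. (A small race is possible if the
// process exits between these calls.)
if (!process.CloseMainWindow() || !process.WaitForExit(5000))
{
    process.Kill();
    process.WaitForExit();
}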
You have to get the process by Name and then stop it.
Here is the code snippet from MSDN:
Process[] myProcesses;
// Returns array containing all instances of Notepad.
myProcesses = Process.GetProcessesByName("Notepad");
foreach (Process myProcess in myProcesses)
{
myProcess.CloseMainWindow();
}
Process.Kill() will also stop the process, but without any prompt.
Find the details in this article.
You can end your process using Kill:
Process myProcess = new Process();
myProcess.StartInfo.FileName = @"C:\Bot-shortcut\DIE1.exe";
myProcess.Start();
// After some code...
myProcess.Kill();
Process.Start returns the process instance which you have started.
Store that in a variable and use the Process.Kill method to kill that process once you are done with it.
Process process = Process.Start(@"C:\Bot-shortcut\DIE1.exe");
process.Kill();
Store the process objects as variables:
var proc1 = Process.Start(@"C:\Bot-shortcut\DIE1.exe");
var proc2 = Process.Start(@"C:\Bot-shortcut\DIE2.exe");
var proc3 = Process.Start(@"C:\Bot-shortcut\DIE3.exe");
And wait for them to exit:
proc1.WaitForExit();
proc2.WaitForExit();
proc3.WaitForExit();
Or kill them:
proc1.Kill();
proc2.Kill();
proc3.Kill();

How to wait until a process terminates before continuing to execute - C#

So I launch a bunch of processes to convert some audio files, and I want my main program to wait until all of those processes complete before continuing.
System.Diagnostics.Process process = new System.Diagnostics.Process();
System.Diagnostics.ProcessStartInfo startInfo = new System.Diagnostics.ProcessStartInfo();
startInfo.WindowStyle = System.Diagnostics.ProcessWindowStyle.Hidden;
DirectoryInfo di = new DirectoryInfo(dir);
foreach (FileInfo fi in di.GetFiles())
{
    startInfo.FileName = "C:\\AudioExtract.exe";
    startInfo.Arguments = "-a \"" + dir + "\\" + fi.Name + "\"";
    process.StartInfo = startInfo;
    process.Start();
}
foreach (Process clsProcess in Process.GetProcesses())
{
    // Note: ProcessName does not include the ".exe" extension.
    if (clsProcess.ProcessName.Contains("AudioExtract"))
    {
        StatusLbl.Text = "Found!";
    }
}
That's what I have to see if it is running, but I need it to keep re-querying GetProcesses() and checking whether the processes are still running, and I'm not quite sure how.
My app launches many instances of the same process for different audio files, almost simultaneously. I looked at the link in the comment and that will set up an event handler; how would I handle many of the same event every time one of the processes exits?
Inside the if, try:
clsProcess.WaitForExit();
break;
http://msdn.microsoft.com/en-us/library/fb4aw7b8.aspx
That code will wait until the process exits, then break from your foreach allowing your main program to continue running.
There are some issues that will make your code practically unusable:
Firing up as many processes as you have files - big no-no. You will congest your CPU and won't get any benefit from your super multicore machine after all. Rule of thumb: up to 2 processes per core. That will warm up the processor just fine.
Disk fragmentation. Writing to 100 files at once will leave your hard drive so fragmented, you'll have it choking in no time.
Reusing the Process object: again, a bad thing. If you want it like that: create one Process instance per loop iteration and store it in some kind of List. If you really stick with the idea of 'run all at once' - run them, store them in a list, then iterate the list and wait for each one to complete!
Creating processes and then asking the system for the process list and searching it by name - why, when you created them in the first place?
EDIT:
How you could do it (a sketch follows below):
investigate how many CPU cores you have
create an array twice as big
in the foreach loop, do this:
determine whether there is a free place in your array (any of the processes is null)
if so - create a new process and put it into the array
if not, loop: check (non-blocking) whether any of your processes has completed; set the first one that has back to null (inside the array) - if none are done, Sleep a little (my magic number is 350 ms - choose your own)
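A sketch of that approach, reusing di and dir from the question (the array size and the 350 ms sleep are the answer's suggestions, not magic constants):
// using System.Diagnostics;
int slots = Environment.ProcessorCount * 2;
var running = new Process[slots];

foreach (FileInfo fi in di.GetFiles())
{
    int free;
    // Block until a slot is free: either never used (null) or its process has exited.
    while ((free = Array.FindIndex(running, p => p == null || p.HasExited)) < 0)
        Thread.Sleep(350);

    running[free] = Process.Start(new ProcessStartInfo
    {
        FileName = "C:\\AudioExtract.exe",
        Arguments = "-a \"" + dir + "\\" + fi.Name + "\"",
        WindowStyle = ProcessWindowStyle.Hidden
    });
}

// Everything has been launched; wait for the stragglers to finish.
foreach (Process p in running)
    if (p != null) p.WaitForExit();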

Process.HasExited returns true even though process is running?

I have been observing that Process.HasExited sometimes returns true even though the process is still running.
My code below starts a process with the name "testprogram.exe" and then waits for it to exit. The problem is that I sometimes get the exception thrown below; it seems that even though HasExited returns true, the process itself is still alive in the system - how can this be?
My program writes to a log file just before it terminates, and thus I need to be absolutely sure that this log file exists (i.e. the process has terminated/finished) before reading it. Continuously checking for its existence is not an option.
// Create new process object
process = new Process();
// Setup event handlers
process.EnableRaisingEvents = true;
process.OutputDataReceived += OutputDataReceivedEvent;
process.ErrorDataReceived += ErrorDataReceivedEvent;
process.Exited += ProgramExitedEvent;
// Setup start info
ProcessStartInfo psi = new ProcessStartInfo
{
FileName = ExePath,
// Must be false to redirect IO
UseShellExecute = false,
RedirectStandardOutput = true,
RedirectStandardError = true,
Arguments = arguments
};
process.StartInfo = psi;
// Start the program
process.Start();
while (!process.HasExited)
Thread.Sleep( 500 );
Process[] p = Process.GetProcessesByName( "testprogram" );
if ( p.Length != 0 )
throw new Exception("Oh oh");
UPDATE: I just tried waiting with process.WaitForExit() instead of the polling loop and the result is the exact same.
Addition: The above code was only meant to demonstrate a similar, 'clearer' problem. To make it clear: my problem is NOT that I can still get hold of the process via Process.GetProcessesByName( "testprogram" ); after it has set HasExited to true.
The real problem is that the program I am running externally writes a file just before it terminates (gracefully). I use HasExited to check when the process has finished, and thus I know I can read the file (because the process exited!), but it seems that HasExited sometimes returns true even when the program has NOT yet written the file to disk. Here's example code that illustrates the exact problem:
// Start the program
process.Start();
while (!process.HasExited)
Thread.Sleep( 500 );
// Could also be process.WaitForExit(), makes no difference to the result
// Now the process has quit, I can read the file it has exported
if ( !File.Exists( xmlFile ) )
{
// But this exception is thrown occasionally, why?
throw new Exception("xml file not found");
}
I realize this is an old post, but in my quest to find out why my app was raising the Exited event before the launched app had even opened, I found out something that I thought might be useful to people experiencing this problem in the future.
When a process is started, it is assigned a PID. If the User is then prompted with the User Account Control dialog and selects 'Yes', the process is re-started and assigned a new PID.
I sat with this for a few hours, hopefully this can save someone time.
I would suggest you try it this way:
process.Start();
while (!process.HasExited)
{
// Discard cached information about the process.
process.Refresh();
// Just a little check!
Console.WriteLine("Physical Memory Usage: " + process.WorkingSet64.ToString());
Thread.Sleep(500);
}
foreach (Process current in Process.GetProcessesByName("testprogram"))
{
if ((current.Id == process.Id) && !current.HasExited)
throw new Exception("Oh oh!");
}
Anyway... on the MSDN page for HasExited I'm reading the following highlighted note:
When standard output has been redirected to asynchronous event handlers, it is possible that output processing will not have completed when this property returns true. To ensure that asynchronous event handling has been completed, call the WaitForExit() overload that takes no parameter before checking HasExited.
That could be somehow linked to your problem as you are redirecting everything.
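In code, following that note means draining the redirected output before trusting HasExited - a sketch (assuming you also start the asynchronous reads, which the snippet above does not show):
process.Start();
process.BeginOutputReadLine();
process.BeginErrorReadLine();

// The parameterless overload also waits for the asynchronous output/error
// handlers to see end-of-stream; HasExited and WaitForExit(timeout) do not.
process.WaitForExit();

// Only read the exported file after this point.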
I know this is an old post, but maybe I can help someone. The Process class may behave unexpectedly: HasExited will return true if the process has exited, or if the process runs with administrator privileges and your program only has user privileges.
I posted a question regarding this a while back here, but did not receive a satisfactory answer.
First off, are you sure testprogram does not spawn a process of its own and exit without waiting for that process to finish? We're dealing with some kind of race condition here, and testprogram can be significant.
The second point I'd like to make is about this - "I need to be absolutely sure that this log file exists". Well, there is no such thing. You can make your check, and then the file is gone. The common way to address this is not to check, but rather to just do what you want to do with the file: go ahead, read it, catch exceptions, and retry if the thing seems unstable and you don't want to change anything else. Check-then-act does not work well if you have more than one actor (thread or whatever) in the system.
A bunch of random ideas follow.
Have you tried using FileSystemWatcher and not depending on process completion? (A minimal sketch follows below.)
Does it get any better if you try reading the file (not checking if it exists, but acting instead) in the process.Exited event? [it shouldn't]
Is the system healthy? Anything suspicious in the Event Log?
Can some really aggressive antivirus policy be involved?
(Can't tell much without seeing all the code and looking into testprogram.)
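For the FileSystemWatcher idea, a minimal sketch (xmlFile is the variable from the question; the timeout is arbitrary):
// using System.IO; using System.Threading;
using (var watcher = new FileSystemWatcher(Path.GetDirectoryName(xmlFile), Path.GetFileName(xmlFile)))
{
    var created = new ManualResetEventSlim(false);
    watcher.Created += (s, e) => created.Set();
    watcher.Changed += (s, e) => created.Set();
    watcher.EnableRaisingEvents = true;

    // Re-check once in case the file appeared before the watcher was armed.
    if (!File.Exists(xmlFile) && !created.Wait(TimeSpan.FromSeconds(30)))
        throw new TimeoutException("xml file never appeared");
}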
So, just for further investigation into the root cause of the problem, you should maybe check out what's really happening by using Process Monitor. Simply start it, include the external program and your own tool in the filter, and let it record what happens.
Within the log you should see how the external tool writes to the output file and how you open that file - and, importantly, in which order all these accesses happen.
The first thing that came to my mind is that the Process class doesn't lie: the process really is gone when it says so. So the problem is that, at this point in time, the file is still not fully available. I think this is an OS issue, because it still holds some parts of the file within a cache that has not been fully written to disk, while the tool simply exited without flushing its file handles.
With this in mind, you should see within the log that the external tool created the file, exited, and only AFTER that was the file flushed/closed by the OS (maybe remove any filters once you have found this point within the log).
So if my assumptions are correct, the root cause is the bad behavior of your external tool, which you can't change; that leaves waiting a little while after the process has exited and hoping that the timeout is long enough for the OS to flush/close the file (maybe try to open the file in a loop with a timeout until it succeeds).
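The open-in-a-loop idea might look like this (a sketch; the helper name and timings are made up):
// using System.IO; using System.Threading;
private static FileStream OpenWhenReady(string path, int attempts = 20, int delayMs = 500)
{
    for (int i = 0; i < attempts; i++)
    {
        try
        {
            // Fails while the file does not exist yet or is still held open elsewhere.
            return new FileStream(path, FileMode.Open, FileAccess.Read, FileShare.None);
        }
        catch (IOException)
        {
            Thread.Sleep(delayMs);
        }
    }
    throw new TimeoutException("File never became readable: " + path);
}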
There are two possibilities: either the process object continues to hold a reference to the process, so it has exited but hasn't yet been deleted, or you have a second instance of the process running. You should also compare the process Id to make sure. Try this:
....
// Start the program
process.Start();
while (!process.HasExited)
Thread.Sleep( 500 );
Process[] p = Process.GetProcessesByName( "testprogram" );
if ( p.Length != 0 && p[0].Id == process.Id && !p[0].HasExited )
    throw new Exception("Oh oh");
For a start, is there an issue with using Process.WaitForExit rather than polling it?
Anyway, it is technically possible for the process to have exited from a usable point of view but still be around briefly while it does things like flush the disk cache. Is the log file especially large (or is any operation it performs heavy on disk writes)?
As per the MSDN documentation for HasExited:
If a handle is open to the process, the operating system releases the process memory when the process has exited, but retains administrative information about the process, such as the handle, exit code, and exit time.
Probably not related, but it's worth noting.
If it's only a problem 1/10 of the time, and the process disappears after a second anyway, then depending on your usage of HasExited, try just adding another delay after the HasExited check succeeds, like:
while (!process.HasExited)
    DoStuff();
Thread.Sleep(500);
Cleanup();
and see if the problem persists.
Personally, I've always just used the Exited event handler instead of any kind of polling, and a simplistic custom wrapper around System.Diagnostics.Process to handle things like thread safety, wrapping a call to CloseMainWindow() followed by WaitForExit(timeout) and finally Kill(), logging, et cetera, and never encountered a problem.
Maybe the problem is in the testprogram? Does that code flush/close everything nicely? It seems to me that if testprogram writes a file to disk, the file should at least be available (empty or not).
If you have a web application and your external program/process is generating files (writing to disk), check whether your IIS identity has rights to write to that folder; if not, add permissions for your IIS user in the folder's security properties. That was the reason in my case: I was getting process.HasExited == true, but the files produced by the process were not complete. After struggling for a while, I added full permissions to the folder the process was writing to, called process.Refresh() as Zarathos described above, and everything worked as expected.
Use process_name.Refresh() before checking whether the process has exited. Refresh() clears all the cached information related to the process.
