How can I stop async Process by CancellationToken? - c#

I found the code below for running a process without freezing the UI. It runs when the 'Start Work' button is pressed, and I want users to be able to stop the work with a 'Stop' button. I found this MSDN article on cancellation: https://msdn.microsoft.com/en-us/library/jj155759.aspx, but I'm having trouble applying a CancellationToken to this code. Can anyone help?
I only call the public static async Task<int> RunProcessAsync(string fileName, string args) method.
Code (From https://stackoverflow.com/a/31492250):
public static async Task<int> RunProcessAsync(string fileName, string args)
{
    using (var process = new Process
    {
        StartInfo =
        {
            FileName = fileName, Arguments = args,
            UseShellExecute = false, CreateNoWindow = true,
            RedirectStandardOutput = true, RedirectStandardError = true
        },
        EnableRaisingEvents = true
    })
    {
        return await RunProcessAsync(process).ConfigureAwait(false);
    }
}
// This method is only used internally.
private static Task<int> RunProcessAsync(Process process)
{
    var tcs = new TaskCompletionSource<int>();

    process.Exited += (s, ea) => tcs.SetResult(process.ExitCode);
    process.OutputDataReceived += (s, ea) => Console.WriteLine(ea.Data);
    process.ErrorDataReceived += (s, ea) => Console.WriteLine("ERR: " + ea.Data);

    bool started = process.Start();
    if (!started)
    {
        // You may allow for the process to be re-used (started = false),
        // but I'm not sure about the guarantees of the Exited event in such a case.
        throw new InvalidOperationException("Could not start process: " + process);
    }

    process.BeginOutputReadLine();
    process.BeginErrorReadLine();

    return tcs.Task;
}
Usage :
var cancelToken = new CancellationTokenSource();
int returnCode = await RunProcessAsync("python.exe", "foo.py", cancelToken.Token);
if (cancelToken.IsCancellationRequested) { /* something */ }
When the Start button is clicked, it starts a Python script. When the script is running and the user wants to stop it, they press the Stop button, and the program then executes the code below.
cancelToken.Cancel();
Thank you very much for reading this question.

The simple answer is that you can just call process.Kill() when the token is canceled:
cancellationToken.Register(() => process.Kill());
But there are two problems with this:
If you attempt to kill a process that doesn't exist yet or that has already terminated, you get an InvalidOperationException.
If you don't Dispose() the CancellationTokenRegistration returned from Register(), and the CancellationTokenSource is long-lived, you have a memory leak, since the registrations will stay in memory as long as the CancellationTokenSource.
Depending on your requirements and your desire for clean code (even at the cost of complexity), it may be okay to ignore problem #2 and work around problem #1 by swallowing the exception in a catch block.
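Put together, a minimal sketch of the OP's RunProcessAsync with a CancellationToken overload; the overload itself and the exception handling are assumptions layered on the question's code, not part of the original answer:
public static async Task<int> RunProcessAsync(string fileName, string args, CancellationToken token)
{
    using (var process = new Process
    {
        StartInfo =
        {
            FileName = fileName, Arguments = args,
            UseShellExecute = false, CreateNoWindow = true,
            RedirectStandardOutput = true, RedirectStandardError = true
        },
        EnableRaisingEvents = true
    })
    // Dispose the registration together with the process so it cannot outlive this call (problem #2).
    using (token.Register(() =>
    {
        try
        {
            process.Kill();
        }
        catch (InvalidOperationException)
        {
            // The process has not started yet or has already exited (problem #1).
        }
    }))
    {
        return await RunProcessAsync(process).ConfigureAwait(false);
    }
}
After the task completes, the caller can check cancelToken.IsCancellationRequested, as in the question's usage snippet, to tell a cancelled run apart from a normal exit (a killed process typically reports a non-zero exit code).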

It's quite simple now (Process.WaitForExitAsync was added in .NET 5):
await process.WaitForExitAsync(token);
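A possible usage sketch for the OP's scenario on .NET 5 or later, inside an async method; killing the process on cancellation is an assumption about the desired behavior, since WaitForExitAsync only stops waiting and does not terminate the process:
var cancelToken = new CancellationTokenSource();

using var process = new Process
{
    StartInfo =
    {
        FileName = "python.exe",
        Arguments = "foo.py",
        UseShellExecute = false,
        CreateNoWindow = true
    }
};

process.Start();

try
{
    // Completes when the process exits, or throws OperationCanceledException
    // when cancelToken.Cancel() is called.
    await process.WaitForExitAsync(cancelToken.Token);
}
catch (OperationCanceledException)
{
    // Cancellation was requested, so terminate the process ourselves.
    if (!process.HasExited)
    {
        process.Kill();
    }
}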

Related

Process sometimes hangs while waiting for Exit

What could be the reason my process hangs while waiting for exit?
This code starts a PowerShell script that performs many actions, e.g. recompiling code via MSBuild. The problem is probably that it generates too much output, and the code gets stuck waiting for exit even after the PowerShell script has completed correctly.
It's a bit odd, because sometimes this code works fine and sometimes it just gets stuck.
Code hangs at:
process.WaitForExit(ProcessTimeOutMiliseconds);
The PowerShell script finishes in about 1-2 seconds, while the timeout is 19 seconds.
public static (bool Success, string Logs) ExecuteScript(string path, int ProcessTimeOutMiliseconds, params string[] args)
{
    StringBuilder output = new StringBuilder();
    StringBuilder error = new StringBuilder();

    using (var outputWaitHandle = new AutoResetEvent(false))
    using (var errorWaitHandle = new AutoResetEvent(false))
    {
        try
        {
            using (var process = new Process())
            {
                process.StartInfo = new ProcessStartInfo
                {
                    WindowStyle = ProcessWindowStyle.Hidden,
                    FileName = "powershell.exe",
                    RedirectStandardOutput = true,
                    RedirectStandardError = true,
                    UseShellExecute = false,
                    Arguments = $"-ExecutionPolicy Bypass -File \"{path}\"",
                    WorkingDirectory = Path.GetDirectoryName(path)
                };

                if (args.Length > 0)
                {
                    var arguments = string.Join(" ", args.Select(x => $"\"{x}\""));
                    process.StartInfo.Arguments += $" {arguments}";
                }

                output.AppendLine($"args:'{process.StartInfo.Arguments}'");

                process.OutputDataReceived += (sender, e) =>
                {
                    if (e.Data == null)
                    {
                        outputWaitHandle.Set();
                    }
                    else
                    {
                        output.AppendLine(e.Data);
                    }
                };
                process.ErrorDataReceived += (sender, e) =>
                {
                    if (e.Data == null)
                    {
                        errorWaitHandle.Set();
                    }
                    else
                    {
                        error.AppendLine(e.Data);
                    }
                };

                process.Start();
                process.BeginOutputReadLine();
                process.BeginErrorReadLine();
                process.WaitForExit(ProcessTimeOutMiliseconds);

                var logs = output + Environment.NewLine + error;
                return process.ExitCode == 0 ? (true, logs) : (false, logs);
            }
        }
        finally
        {
            outputWaitHandle.WaitOne(ProcessTimeOutMiliseconds);
            errorWaitHandle.WaitOne(ProcessTimeOutMiliseconds);
        }
    }
}
Script:
start-process $args[0] App.csproj -Wait -NoNewWindow

[string]$sourceDirectory = "\bin\Debug\*"
[int]$count = (dir $sourceDirectory | measure).Count;

If ($count -eq 0)
{
    exit 1;
}
Else
{
    exit 0;
}
where
$args[0] = "C:\Program Files (x86)\Microsoft Visual Studio\2019\Professional\MSBuild\Current\Bin\MSBuild.exe"
Edit
To @ingen's solution I added a small wrapper that retries the script when the MSBuild process hangs:
public static void ExecuteScriptRx(string path, int processTimeOutMilliseconds, out string logs, out bool success, params string[] args)
{
    var current = 0;
    int attempts_count = 5;
    bool _local_success = false;
    string _local_logs = "";

    while (attempts_count > 0 && _local_success == false)
    {
        Console.WriteLine($"Attempt: {++current}");
        InternalExecuteScript(path, processTimeOutMilliseconds, out _local_logs, out _local_success, args);
        attempts_count--;
    }

    success = _local_success;
    logs = _local_logs;
}
where InternalExecuteScript is @ingen's code from the answer below.
Let's start with a recap of the accepted answer in a related post.
The problem is that if you redirect StandardOutput and/or StandardError the internal buffer can become full. Whatever order you use, there can be a problem:
If you wait for the process to exit before reading StandardOutput the process can block trying to write to it, so the process never ends.
If you read from StandardOutput using ReadToEnd then your process can block if the process never closes StandardOutput (for example if it never terminates, or if it is blocked writing to StandardError).
Even the accepted answer, however, struggles with the order of execution in certain cases.
EDIT: See the answers below for how to avoid an ObjectDisposedException if the timeout occurs.
It's in these kind of situations, where you want to orchestrate several events, that Rx really shines.
Note the .NET implementation of Rx is available as the System.Reactive NuGet package.
Let's dive in to see how Rx facilitates working with events.
// Subscribe to OutputData
Observable.FromEventPattern<DataReceivedEventArgs>(process, nameof(Process.OutputDataReceived))
    .Subscribe(
        eventPattern => output.AppendLine(eventPattern.EventArgs.Data),
        exception => error.AppendLine(exception.Message))
    .DisposeWith(disposables);
FromEventPattern allows us to map distinct occurrences of an event to a unified stream (aka observable). This allows us to handle the events in a pipeline (with LINQ-like semantics). The Subscribe overload used here is provided with an Action<EventPattern<...>> and an Action<Exception>. Whenever the observed event is raised, its sender and args will be wrapped by EventPattern and pushed through the Action<EventPattern<...>>. When an exception is raised in the pipeline, Action<Exception> is used.
One of the drawbacks of the Event pattern, clearly illustrated in this use case (and by all the workarounds in the referenced post), is that it's not apparent when / where to unsubscribe the event handlers.
With Rx we get back an IDisposable when we make a subscription. When we dispose of it, we effectively end the subscription. With the addition of the DisposeWith extension method (borrowed from RxUI), we can add multiple IDisposables to a CompositeDisposable (named disposables in the code samples). When we're all done, we can end all subscriptions with one call to disposables.Dispose().
To be sure, there's nothing we can do with Rx, that we wouldn't be able to do with vanilla .NET. The resulting code is just a lot easier to reason about, once you've adapted to the functional way of thinking.
public static void ExecuteScriptRx(string path, int processTimeOutMilliseconds, out string logs, out bool success, params string[] args)
{
    StringBuilder output = new StringBuilder();
    StringBuilder error = new StringBuilder();

    using (var process = new Process())
    using (var disposables = new CompositeDisposable())
    {
        process.StartInfo = new ProcessStartInfo
        {
            WindowStyle = ProcessWindowStyle.Hidden,
            FileName = "powershell.exe",
            RedirectStandardOutput = true,
            RedirectStandardError = true,
            UseShellExecute = false,
            Arguments = $"-ExecutionPolicy Bypass -File \"{path}\"",
            WorkingDirectory = Path.GetDirectoryName(path)
        };

        if (args.Length > 0)
        {
            var arguments = string.Join(" ", args.Select(x => $"\"{x}\""));
            process.StartInfo.Arguments += $" {arguments}";
        }

        output.AppendLine($"args:'{process.StartInfo.Arguments}'");

        // Raise the Process.Exited event when the process terminates.
        process.EnableRaisingEvents = true;

        // Subscribe to OutputData
        Observable.FromEventPattern<DataReceivedEventArgs>(process, nameof(Process.OutputDataReceived))
            .Subscribe(
                eventPattern => output.AppendLine(eventPattern.EventArgs.Data),
                exception => error.AppendLine(exception.Message))
            .DisposeWith(disposables);

        // Subscribe to ErrorData
        Observable.FromEventPattern<DataReceivedEventArgs>(process, nameof(Process.ErrorDataReceived))
            .Subscribe(
                eventPattern => error.AppendLine(eventPattern.EventArgs.Data),
                exception => error.AppendLine(exception.Message))
            .DisposeWith(disposables);

        var processExited =
            // Observable will tick when the process has gracefully exited.
            Observable.FromEventPattern<EventArgs>(process, nameof(Process.Exited))
                // First two lines to tick true when the process has gracefully exited and false when it has timed out.
                .Select(_ => true)
                .Timeout(TimeSpan.FromMilliseconds(processTimeOutMilliseconds), Observable.Return(false))
                // Force termination when the process timed out
                .Do(exitedSuccessfully => { if (!exitedSuccessfully) { try { process.Kill(); } catch {} } });

        // Subscribe to the Process.Exited event.
        processExited
            .Subscribe()
            .DisposeWith(disposables);

        // Start process(ing)
        process.Start();
        process.BeginOutputReadLine();
        process.BeginErrorReadLine();

        // Wait for the process to terminate (gracefully or forced)
        processExited.Take(1).Wait();

        logs = output + Environment.NewLine + error;
        success = process.ExitCode == 0;
    }
}
We already discussed the first part, where we map our events to observables, so we can jump straight to the meaty part. Here we assign our observable to the processExited variable, because we want to use it more than once.
First, when we activate it, by calling Subscribe. And later on when we want to 'await' its first value.
var processExited =
    // Observable will tick when the process has gracefully exited.
    Observable.FromEventPattern<EventArgs>(process, nameof(Process.Exited))
        // First two lines to tick true when the process has gracefully exited and false when it has timed out.
        .Select(_ => true)
        .Timeout(TimeSpan.FromMilliseconds(processTimeOutMilliseconds), Observable.Return(false))
        // Force termination when the process timed out
        .Do(exitedSuccessfully => { if (!exitedSuccessfully) { try { process.Kill(); } catch {} } });

// Subscribe to the Process.Exited event.
processExited
    .Subscribe()
    .DisposeWith(disposables);

// Start process(ing)
...

// Wait for the process to terminate (gracefully or forced)
processExited.Take(1).Wait();
One of the problems with the OP's code is that it assumes process.WaitForExit(ProcessTimeOutMiliseconds) will terminate the process when it times out. From MSDN:
Instructs the Process component to wait the specified number of milliseconds for the associated process to exit.
Instead, when it times out, it merely returns control to the current thread (i.e. it stops blocking). You need to manually force termination when the process times out. To know when time out has occurred, we can map the Process.Exited event to a processExited observable for processing. This way we can prepare the input for the Do operator.
The code is pretty self-explanatory. If exitedSuccessfully the process will have terminated gracefully. If not exitedSuccessfully, termination will need to be forced. Note that process.Kill() is executed asynchronously, ref remarks. However, calling process.WaitForExit() right after will open up the possibility for deadlocks again. So even in the case of forced termination, it's better to let all disposables be cleaned up when the using scope ends, as the output can be considered interrupted / corrupted anyway.
The try catch construct is reserved for the exceptional case (no pun intended) where you've aligned processTimeOutMilliseconds with the actual time needed by the process to complete. In other words, a race condition occurs between the Process.Exited event and the timer. The possibility of this happening is again magnified by the asynchronous nature of process.Kill(). I've encountered it once during testing.
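For readers who prefer to stay with the vanilla Process API, the same "kill on timeout" idea can be expressed without Rx; this snippet is only an illustration, not part of the original answer:
// WaitForExit(milliseconds) only stops blocking when the timeout elapses;
// it does not terminate the process, so kill it explicitly.
if (!process.WaitForExit(processTimeOutMilliseconds))
{
    try
    {
        process.Kill();
    }
    catch (InvalidOperationException)
    {
        // The process exited in the window between the timed-out wait and Kill().
    }
}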
For completeness, the DisposeWith extension method.
/// <summary>
/// Extension methods associated with the IDisposable interface.
/// </summary>
public static class DisposableExtensions
{
    /// <summary>
    /// Ensures the provided disposable is disposed with the specified <see cref="CompositeDisposable"/>.
    /// </summary>
    public static T DisposeWith<T>(this T item, CompositeDisposable compositeDisposable)
        where T : IDisposable
    {
        if (compositeDisposable == null)
        {
            throw new ArgumentNullException(nameof(compositeDisposable));
        }

        compositeDisposable.Add(item);
        return item;
    }
}
The problem is that if you redirect StandardOutput and/or StandardError the internal buffer can become full.
To solve the aforementioned issues, you can run the process on a separate thread. I do not use WaitForExit; I use the process Exited event, which returns the ExitCode of the process asynchronously, ensuring it has completed.
public async Task<int> RunProcessAsync(params string[] args)
{
    try
    {
        var tcs = new TaskCompletionSource<int>();

        var process = new Process
        {
            StartInfo =
            {
                FileName = "file path",
                RedirectStandardOutput = true,
                RedirectStandardError = true,
                Arguments = "shell command",
                UseShellExecute = false,
                CreateNoWindow = true
            },
            EnableRaisingEvents = true
        };

        process.Exited += (sender, e) =>
        {
            tcs.SetResult(process.ExitCode);
            process.Dispose();
        };

        process.Start();

        // Use asynchronous read operations on at least one of the streams.
        // Reading both streams synchronously would generate another deadlock.
        process.BeginOutputReadLine();
        string tmpErrorOut = await process.StandardError.ReadToEndAsync();
        //process.WaitForExit();

        return await tcs.Task;
    }
    catch (Exception ee)
    {
        Console.WriteLine(ee.Message);
    }

    return -1;
}
The above code is battle-tested calling FFMPEG.exe with command-line arguments. I was converting MP4 files to MP3 files, processing over 1000 videos at a time without failure. Unfortunately I do not have direct PowerShell experience, but I hope this helps.
For the benefit of readers, I'm going to divide this into two sections:
Section A: Problem & how to handle similar scenarios
Section B: Problem recreation & Solution
Section A: Problem
When this problem happens, the process appears in Task Manager, then disappears after 2-3 seconds (that part is fine), then the code waits for the timeout, and then an exception is thrown: System.InvalidOperationException: Process must exit before requested information can be determined. (See Scenario 4 below.)
In your code:
process.WaitForExit(ProcessTimeOutMiliseconds): waits for the process to exit or time out, whichever happens first.
outputWaitHandle.WaitOne(ProcessTimeOutMiliseconds) and errorWaitHandle.WaitOne(ProcessTimeOutMiliseconds): wait for the OutputData and ErrorData read operations to signal that they are complete.
process.ExitCode == 0: gets the status of the process after it has exited.
Different settings & their caveats:
Scenario 1 (happy path): the process completes before the timeout, so stdout and stderr also finish before it, and all is well.
Scenario 2: the process, outputWaitHandle and errorWaitHandle time out, but stdout and stderr are still being read and only complete after the wait handles have timed out. This leads to another exception, ObjectDisposedException.
Scenario 3: the process times out first (19 sec) while stdout and stderr are still active, so you then wait for the wait handles to time out as well (19 sec), adding an extra delay of up to 19 seconds.
Scenario 4: the process times out and the code prematurely queries Process.ExitCode, resulting in System.InvalidOperationException: Process must exit before requested information can be determined.
I have tested this over a dozen times and it works fine; the following settings were used while testing:
Output stream sizes ranging from 5 KB to 198 KB, produced by building about 2-15 projects
Premature timeouts & process exits within the timeout window
Updated Code
                ...
                string logs;

                process.BeginOutputReadLine();
                process.BeginErrorReadLine();

                // First wait for the read operations to time out, then check whether the process timed out.
                if (!outputWaitHandle.WaitOne(ProcessTimeOutMiliseconds) && !errorWaitHandle.WaitOne(ProcessTimeOutMiliseconds)
                    && !process.WaitForExit(ProcessTimeOutMiliseconds))
                {
                    // Cancel the read operations if the process is still reading after the timeout;
                    // this prevents an ObjectDisposedException.
                    process.CancelOutputRead();
                    process.CancelErrorRead();

                    Console.ForegroundColor = ConsoleColor.Red;
                    Console.WriteLine("Timed Out");

                    logs = output + Environment.NewLine + error;

                    // Release the resources allocated to the process.
                    process.Close();
                    return (false, logs);
                }

                Console.ForegroundColor = ConsoleColor.Green;
                Console.WriteLine("Completed On Time");

                logs = output + Environment.NewLine + error;

                // Read the exit code before Close frees the memory allocated to the exited process.
                var exitCode = process.ExitCode;
                process.Close();

                return exitCode == 0 ? (true, logs) : (false, logs);
            }
        }
        finally {}
EDIT:
After hours of playing around with MSBuild, I was finally able to reproduce the issue on my system.
Section B: Problem Recreation & Solution
MSBuild has an -m[:number] switch, which is used to specify the maximum number of concurrent processes to use when building.
When this is enabled, MSBuild spawns a number of nodes that live on even after the build is complete. As a result, Process.WaitForExit(milliseconds) waits for a process that never exits and eventually times out.
I was able to solve this in a couple of ways.
Spawn the MSBuild process indirectly through CMD:
$path1 = """C:\Program Files (x86)\Microsoft Visual Studio\2017\Community\MSBuild\15.0\Bin\MSBuild.exe"" ""C:\Users\John\source\repos\Test\Test.sln"" -maxcpucount:3"
$cmdOutput = cmd.exe /c $path1 '2>&1'
$cmdOutput
Continue to use MSBuild, but be sure to set nodeReuse to false:
$filepath = "C:\Program Files (x86)\Microsoft Visual Studio\2017\Community\MSBuild\15.0\Bin\MSBuild.exe"
$arg1 = "C:\Users\John\source\repos\Test\Test.sln"
$arg2 = "-m:3"
$arg3 = "-nr:False"
Start-Process -FilePath $filepath -ArgumentList $arg1,$arg2,$arg3 -Wait -NoNewWindow
Even if parallel build is not enabled, you can still prevent your process from hanging at WaitForExit by launching the build via CMD, so that you do not create a direct dependency on the build process:
$path1 = """C:\....\15.0\Bin\MSBuild.exe"" ""C:\Users\John\source\Test.sln"""
$cmdOutput = cmd.exe /c $path1 '2>&1'
$cmdOutput
The 2nd approach is preferred since you do not want too many MSBuild nodes to be lying around.
Not sure if this is your issue, but looking at MSDN there seems to be some subtlety with the timeout overload of WaitForExit when you are redirecting output asynchronously. The MSDN article recommends calling the parameterless WaitForExit() after the timeout overload returns true.
Docs page located here. Relevant text:
When standard output has been redirected to asynchronous event handlers, it is possible that output processing will not have completed when this method returns. To ensure that asynchronous event handling has been completed, call the WaitForExit() overload that takes no parameter after receiving a true from this overload. To help ensure that the Exited event is handled correctly in Windows Forms applications, set the SynchronizingObject property.
Code modification might look something like this:
if (process.WaitForExit(ProcessTimeOutMiliseconds))
{
    process.WaitForExit();
}

Call a process asynchronously via WebAPI

I have a scenario where I need to launch an exe via an API endpoint (fire and forget). The exe is long-running, and I need to wait for it to finish in order to run further code, but this should happen in the background and the client shouldn't have to wait for it. With the code below, the call waits for the exe to finish. I've also tried making the method async, but I get the same result.
[HttpPost]
public async Task<bool> ToggleCarWeaverService(string command)
{
    try
    {
        var fileName = @"SomeExe.exe";
        await _service.RunProcessAsync(command, fileName);
        return true;
    }
    catch (Exception ex)
    {
        throw new HttpResponseException(HttpStatusCode.InternalServerError);
    }
}
public Task<int> RunProcessAsync(string command, string fileName)
{
    // Use ProcessStartInfo class
    ProcessStartInfo startInfo = new ProcessStartInfo
    {
        CreateNoWindow = false,
        UseShellExecute = true,
        FileName = fileName,
        WindowStyle = ProcessWindowStyle.Normal,
        Arguments = command
    };

    try
    {
        var tcs = new TaskCompletionSource<int>();

        var process = new Process
        {
            StartInfo = startInfo,
            EnableRaisingEvents = true
        };

        process.Exited += (sender, args) =>
        {
            tcs.SetResult(process.ExitCode);
            //Do more code
            process.Dispose();
        };

        process.Start();
        return tcs.Task;
    }
    catch (Exception ex)
    {
        // Log error.
        throw ex;
    }
}
Do not await the async task.
Use Task.Run to wrap and continue
[HttpPost]
public IHttpActionResult ToggleCarWeaverService(string command)
{
    Task.Run(async () =>
    {
        var fileName = @"SomeExe.exe";
        await _service.RunProcessAsync(command, fileName);
        //...other code after process finish
    });
    return Ok();
}
The answers given here correctly answer your question: by not awaiting the result, your call returns immediately. However, there is no guarantee that the process will continue to run. Once you've left the function, you effectively have a rogue process that may or may not complete.
Most Cloud providers have a solution to this (e.g. Azure WebJobs), or you could roll your own using something like HangFire.
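As a rough illustration of the "roll your own" option mentioned above, here is a minimal Hangfire sketch; it assumes Hangfire is already installed and configured with a storage provider, and ProcessJobs.RunProcess is a hypothetical wrapper, not an API from this thread:
using System.Diagnostics;
using Hangfire;

public class ProcessJobs
{
    // Hangfire jobs are plain public methods that it can serialize, persist and invoke later.
    public void RunProcess(string command, string fileName)
    {
        using (var process = Process.Start(fileName, command))
        {
            process.WaitForExit();
            // ...code to run after the process finishes...
        }
    }
}

public class EnqueueExample
{
    public void Schedule(string command, string fileName)
    {
        // Enqueue and return immediately; Hangfire persists the job,
        // so it survives the web request (and app-pool recycles).
        BackgroundJob.Enqueue<ProcessJobs>(x => x.RunProcess(command, fileName));
    }
}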
Then you shouldn't await the call at all; change the line where you await RunProcessAsync to:
_service.RunProcessAsync(command, fileName);
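If you go this route, note that any exception thrown by RunProcessAsync goes unobserved. A possible variation (my assumption, not part of the original answer) is to discard the task explicitly and log faults via a continuation:
// Discard the task explicitly and attach a continuation that only runs on failure.
_ = _service.RunProcessAsync(command, fileName)
    .ContinueWith(
        t => Console.WriteLine($"Process task failed: {t.Exception}"),
        TaskContinuationOptions.OnlyOnFaulted);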

Task return a StreamReader in c#

I have this task in C# that should return the standard output of DISM, so I can use it where I need it:
public async Task<StreamReader> DISM(string Args)
{
    StreamReader DISMstdout = null;

    await Task.Run(() =>
    {
        Process DISMcmd = new Process();

        if (Environment.Is64BitOperatingSystem)
        {
            DISMcmd.StartInfo.FileName = System.IO.Path.Combine(Environment.GetFolderPath(Environment.SpecialFolder.Windows), "SysWOW64", "dism.exe");
        }
        else
        {
            DISMcmd.StartInfo.FileName = System.IO.Path.Combine(Environment.GetFolderPath(Environment.SpecialFolder.Windows), "System32", "dism.exe");
        }

        DISMcmd.StartInfo.Verb = "runas";
        DISMcmd.StartInfo.Arguments = Args;
        DISMcmd.StartInfo.WindowStyle = ProcessWindowStyle.Hidden;
        DISMcmd.StartInfo.CreateNoWindow = true;
        DISMcmd.StartInfo.UseShellExecute = false;
        DISMcmd.StartInfo.RedirectStandardOutput = true;
        DISMcmd.EnableRaisingEvents = true;

        DISMcmd.Start();
        DISMstdout = DISMcmd.StandardOutput;
        DISMcmd.WaitForExit();
    });

    return DISMstdout;
}
But it doesn't really work.
If I try to read the standard output from another task I can't (because it is empty), so there must be a problem with my task.
public async Task Test()
{
    await Task.Run(() =>
    {
        StreamReader DISM = await new DISM("/Get-ImageInfo /ImageFile:" + ImagePath + @" /Index:1");
        string data = string.Empty;

        MessageBox.Show(DISM.ReadToEnd()); // this should display a msgbox with the standardoutput of dism

        while ((data = DISM.ReadLine()) != null)
        {
            if (data.Contains("Version : "))
            {
                // do something
            }
        }
    });
}
What is wrong with this piece of code?
The way I'd write your method to exploit async..await as opposed to the legacy asynchronous approaches is like this:
public async Task<TResult> WithDism<TResult>(string args, Func<StreamReader, Task<TResult>> func)
{
    return await Task.Run(async () =>
    {
        var proc = new Process();
        var windowsDir = Environment.GetFolderPath(Environment.SpecialFolder.Windows);
        var systemDir = Environment.Is64BitOperatingSystem ? "SysWOW64" : "System32";
        proc.StartInfo.FileName = Path.Combine(windowsDir, systemDir, "dism.exe");
        proc.StartInfo.Verb = "runas";
        proc.StartInfo.Arguments = args;
        proc.StartInfo.WindowStyle = ProcessWindowStyle.Hidden;
        proc.StartInfo.CreateNoWindow = true;
        proc.StartInfo.UseShellExecute = false;
        proc.StartInfo.RedirectStandardOutput = true;

        proc.Start();
        Console.Error.WriteLine("dism started");

        var result = await func(proc.StandardOutput);
        Console.Error.WriteLine("func finished");

        // discard rest of stdout
        await proc.StandardOutput.ReadToEndAsync();
        proc.WaitForExit();

        return result;
    });
}
Realistically, the only place where significant blocking can occur when spawning a process is while you handle the output it produces. Used like this:
var task = WithDism("/?", async sr => await sr.ReadToEndAsync()); // or process line-by-line
Console.WriteLine("dism task running");
Console.WriteLine(await task);
it produces the following output
dism task running
dism started
func finished
Error: 740
Elevated permissions are required to run DISM.
Use an elevated command prompt to complete these tasks.
Do note that when using subprocesses, it's your job to make sure they correctly exit or are shut down to avoid leaving zombie processes around. That's why I've added the possibly redundant ReadToEndAsync() - in case func still leaves some output unconsumed, this should allow the process to reach its natural end.
However, this means the calling function will only proceed once that happens. If you leave behind a lot of unconsumed output you're not interested in, this will cause an unwanted delay. You could work around this by spawning off this cleanup to a different background task and returning the result immediately using something like:
Task.Run(async () =>
{
    // discard rest of stdout and clean up process:
    await proc.StandardOutput.ReadToEndAsync();
    proc.WaitForExit();
});
but I admit I'm going a bit out on a limb there, I'm not entirely sure about the robustness of just letting a task "run wild" like that. What the appropriate way to clean up the process is will, of course, depend on what it's actually doing after you get the output you want to return from func.
I'm using synchronous calls to Console there because they only serve to illustrate the timing of events, I want to know that as execution reaches that point. Normally you would use async in a "viral" way to make sure control passes back to top-level as soon as possible.
After playing around with this using Benchmark.NET, it seems that starting a process (I tried DISM and Atom to have something hefty) - from setup to Start() - takes about 50 milliseconds. This seems pretty negligible to me for this use. After all, 50ms is good enough latency for say playing League of Legends, and you're not going to start these in a tight loop.
I'd like to provide an alternative answer of "don't bother with Task.Run() and just use async I/O in a straightforward way" unless you absolutely need to get rid of that delay and believe spawning off a background thread will help:
static string GetDismPath()
{
    var windowsDir = Environment.GetFolderPath(Environment.SpecialFolder.Windows);
    var systemDir = Environment.Is64BitOperatingSystem ? "SysWOW64" : "System32";
    var dismExePath = Path.Combine(windowsDir, systemDir, "dism.exe");
    return dismExePath;
}

static Process StartDism(string args)
{
    var proc = new Process
    {
        StartInfo =
        {
            FileName = GetDismPath(),
            Verb = "runas",
            Arguments = args,
            WindowStyle = ProcessWindowStyle.Hidden,
            CreateNoWindow = true,
            UseShellExecute = false,
            // Redirect all three streams so Cleanup can close stdin and drain stdout/stderr.
            RedirectStandardInput = true,
            RedirectStandardOutput = true,
            RedirectStandardError = true
        }
    };
    proc.Start();
    return proc;
}

static void Cleanup(Process proc)
{
    Task.Run(async () =>
    {
        proc.StandardInput.Close();
        var buf = new char[0x1000];
        while (await proc.StandardOutput.ReadBlockAsync(buf, 0, buf.Length).ConfigureAwait(false) != 0) { }
        while (await proc.StandardError.ReadBlockAsync(buf, 0, buf.Length).ConfigureAwait(false) != 0) { }
        if (!proc.WaitForExit(5000))
        {
            proc.Kill();
        }
        proc.Dispose();
    });
}

static async Task Main(string[] args)
{
    var dismProc = StartDism("/?");

    // do what you want with the output
    var dismOutput = await dismProc.StandardOutput.ReadToEndAsync().ConfigureAwait(false);
    await Console.Out.WriteAsync(dismOutput).ConfigureAwait(false);

    Cleanup(dismProc);
}
I'm only using Task.Run() to keep the cleanup off the main thread in case you need to do something else while DISM keeps producing output you're not interested in that you do not wish to kill outright.

Wrap EAP in Task

I was trying to wrap an EAP (Event-based Asynchronous Pattern) call in a Task with the following code.
public static async Task<string> Caller()
{
    var ret = await RunProgram();
    return ret;
}

public static async Task<string> RunProgram()
{
    TaskCompletionSource<string> source = new TaskCompletionSource<string>();

    var process = new Process();
    process.StartInfo.UseShellExecute = true;
    process.StartInfo.FileName = "cmd";

    process.Exited += (sender, args) =>
    {
        source.SetResult("hello");
    };

    process.Start();
    return await source.Task;
}
However, the Exited event never gets fired. Could someone guide me on what I am doing wrong here?
Please note that the above code is a prototype; the event-not-firing issue happens in the real scenario as well.
You need to enable the event-raising property of the Process, like this:
var process = new Process
{
    EnableRaisingEvents = true,
    StartInfo = new ProcessStartInfo(processPath)
    {
        RedirectStandardError = true,
        UseShellExecute = false
    }
};
Without addressing any other issue.
Process.EnableRaisingEvents Property
Gets or sets whether the Exited event should be raised when the process terminates.
Remarks
The EnableRaisingEvents property indicates whether the component should be notified when the operating system has shut down a process. The EnableRaisingEvents property is used in asynchronous processing to notify your application that a process has exited. To force your application to synchronously wait for an exit event (which interrupts processing of the application until the exit event has occurred), use the WaitForExit method.
Example
var p = Process.Start(startInfo);
p.EnableRaisingEvents = true;
p.Exited += new EventHandler(ProcessExited);
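Applied to the question's RunProgram, a minimal sketch of the fix is simply to set EnableRaisingEvents before subscribing and starting; everything else stays as in the question:
public static async Task<string> RunProgram()
{
    TaskCompletionSource<string> source = new TaskCompletionSource<string>();

    var process = new Process();
    process.StartInfo.UseShellExecute = true;
    process.StartInfo.FileName = "cmd";

    // Without this, the Exited event is never raised.
    process.EnableRaisingEvents = true;

    process.Exited += (sender, args) =>
    {
        source.SetResult("hello");
    };

    process.Start();
    return await source.Task;
}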

Async process start and wait for it to finish

I am new to the thread model in .NET. What would you use to:
Start a process that handles a file (process.StartInfo.FileName = fileName;).
Wait for the user to close the process OR abandon the thread after some time.
If the user closed the process, delete the file.
Starting the process and waiting should be done on a different thread than the main thread, because this operation should not affect the application.
Example:
My application produces an HTML report. The user can right-click somewhere and choose "View Report"; I then write the report contents to a temporary file and launch the process that handles HTML files, i.e. the default browser. The problem is that I cannot clean up, i.e. delete the temp file.
"and waiting must be async" - I'm not trying to be funny, but isn't that a contradiction in terms? However, since you are starting a Process, the Exited event may help:
ProcessStartInfo startInfo = null;
Process process = Process.Start(startInfo);
process.EnableRaisingEvents = true;
process.Exited += delegate {/* clean up*/};
If you want to actually wait (timeout etc), then:
if (process.WaitForExit(timeout)) {
    // user exited
} else {
    // timeout (perhaps process.Kill();)
}
For waiting async, perhaps just use a different thread?
ThreadPool.QueueUserWorkItem(delegate {
    Process process = Process.Start(startInfo);
    if (process.WaitForExit(timeout)) {
        // user exited
    } else {
        // timeout
    }
});
Adding an advanced alternative to this old question. If you want to wait for a process to exit without blocking any thread and still support timeouts, try the following:
public static Task<bool> WaitForExitAsync(this Process process, TimeSpan timeout)
{
    ManualResetEvent processWaitObject = new ManualResetEvent(false);
    processWaitObject.SafeWaitHandle = new SafeWaitHandle(process.Handle, false);

    TaskCompletionSource<bool> tcs = new TaskCompletionSource<bool>();

    RegisteredWaitHandle registeredProcessWaitHandle = null;
    registeredProcessWaitHandle = ThreadPool.RegisterWaitForSingleObject(
        processWaitObject,
        delegate(object state, bool timedOut)
        {
            if (!timedOut)
            {
                registeredProcessWaitHandle.Unregister(null);
            }

            processWaitObject.Dispose();
            tcs.SetResult(!timedOut);
        },
        null /* state */,
        timeout,
        true /* executeOnlyOnce */);

    return tcs.Task;
}
Again, the advantage to this approach compared to the accepted answer is that you're not blocking any threads, which reduces the overhead of your app.
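A possible usage sketch of this extension method, inside an async method; the 30-second timeout and the Kill call on timeout are illustrative assumptions:
var process = Process.Start("notepad.exe");

// Wait up to 30 seconds without blocking a thread.
bool exited = await process.WaitForExitAsync(TimeSpan.FromSeconds(30));

if (!exited)
{
    // The wait timed out and the process is still running; decide what to do with it.
    process.Kill();
}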
Try the following code.
public void KickOffProcess(string filePath) {
    var proc = Process.Start(filePath);
    ThreadPool.QueueUserWorkItem(new WaitCallback(WaitForProc), proc);
}

private void WaitForProc(object obj) {
    var proc = (Process)obj;
    proc.WaitForExit();
    // Do the file deletion here
}
.NET 5 introduced the new API Process.WaitForExitAsync, which allows waiting asynchronously for the completion of a process. It offers the same functionality as the existing Process.WaitForExit, the only difference being that the waiting is asynchronous, so it does not block the calling thread.
Usage example:
private async void button1_Click(object sender, EventArgs e)
{
    string filePath = Path.Combine
    (
        Environment.GetFolderPath(Environment.SpecialFolder.LocalApplicationData),
        Guid.NewGuid().ToString() + ".txt"
    );
    File.WriteAllText(filePath, "Hello World!");

    try
    {
        using Process process = new();
        process.StartInfo.FileName = "Notepad.exe";
        process.StartInfo.Arguments = filePath;
        process.Start();
        await process.WaitForExitAsync();
    }
    finally
    {
        File.Delete(filePath);
    }

    MessageBox.Show("Done!");
}
In the above example the UI remains responsive while the user interacts with the opened file. The UI thread would be blocked if the WaitForExit had been used instead.
I would probably not use a separate process for opening a file. Instead, I'd probably utilize a background thread (if I thought the operation was going to take a long time and possible block the UI thread).
private delegate void FileOpenDelegate(string filename);

public void OpenFile(string filename)
{
    FileOpenDelegate fileOpenDelegate = OpenFileAsync;
    AsyncCallback callback = AsyncCompleteMethod;
    object state = null; // placeholder for any async state you want to pass along
    fileOpenDelegate.BeginInvoke(filename, callback, state);
}

private void OpenFileAsync(string filename)
{
    // file opening code here, and then do whatever with the file
}
Of course, this is not a good working example (it returns nothing) and I haven't shown how the UI gets updated (you have to use BeginInvoke at the UI level because a background thread cannot update the UI thread). But this approach is generally how I go about handling asynchronous operations in .Net.
You can use the Exited event of the Process class:
ProcessStartInfo info = new ProcessStartInfo();
info.FileName = "notepad.exe";

Process process = Process.Start(info);
process.EnableRaisingEvents = true; // required for the Exited event to be raised
process.Exited += new EventHandler(process_Exited);

Console.Read();
and in that event handler you can perform the operations you mentioned.
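For example, a minimal sketch of such a handler that deletes the temporary file once the user closes the viewer; the tempFilePath field is a hypothetical stand-in for wherever you stored the report:
private static string tempFilePath; // set when the temporary report file is created

private static void process_Exited(object sender, EventArgs e)
{
    // The user closed the viewer, so the temporary report file can be removed.
    if (File.Exists(tempFilePath))
    {
        File.Delete(tempFilePath);
    }
}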
