Some background: I am using Pulumi (an IaC tool for provisioning resources) via C#. In my code (examples below) there is a callback, which is asynchronous. I'd like the callback to behave synchronously, so that it waits for the console process to return before proceeding and then runs again, in a loop.
This is the callback code (in the future I will have the same line of code, passing different variables to the same method; I am assuming the call below, with async and await, is the correct way for it to await and block until completion?):
clientApp.ClientId.Apply<string>(async clientId => await MyStack.Persist(env, clientId, projectName, "Test"));
And in my callback code, I invoke a process, like so:
string arguments = string.Join(" ", "xyz.Utilities.Pusher.exe", variableName, value, projectName, env);
string exePath = @"xyz.Utilities.Pusher.exe";
ProcessStartInfo startinfo = new ProcessStartInfo(exePath);
startinfo.WorkingDirectory = @"C:\Repos xyz.Utilities.Pusher\xyz.Utilities.Pusher\bin\Debug\net6.0\";
startinfo.Arguments = arguments;
startinfo.CreateNoWindow = false;
startinfo.UseShellExecute = false;
Process p = Process.Start(startinfo);
p.WaitForExit();
The callback method has this signature:
public static Task<string> PersistToOctopus(string env, string value, string projectName, string variableName)
I hope I have included all of the details needed. What I am seeing is that the output is strange: I get null errors etc., and it seems like the method does not execute synchronously. Note that this calls another exe that I developed, so I am wondering whether I need to make changes in the consumed exe?
Many thanks
Related
What may be the reason for my process hanging while waiting for exit?
This code has to start a PowerShell script which internally performs many actions, e.g. starting a rebuild via MSBuild, but the problem is probably that it generates too much output, and this code gets stuck waiting for exit even after the PowerShell script has executed correctly.
It's kind of weird, because sometimes this code works fine and sometimes it just gets stuck.
Code hangs at:
process.WaitForExit(ProcessTimeOutMiliseconds);
The PowerShell script executes in about 1-2 seconds, while the timeout is 19 seconds.
public static (bool Success, string Logs) ExecuteScript(string path, int ProcessTimeOutMiliseconds, params string[] args)
{
StringBuilder output = new StringBuilder();
StringBuilder error = new StringBuilder();
using (var outputWaitHandle = new AutoResetEvent(false))
using (var errorWaitHandle = new AutoResetEvent(false))
{
try
{
using (var process = new Process())
{
process.StartInfo = new ProcessStartInfo
{
WindowStyle = ProcessWindowStyle.Hidden,
FileName = "powershell.exe",
RedirectStandardOutput = true,
RedirectStandardError = true,
UseShellExecute = false,
Arguments = $"-ExecutionPolicy Bypass -File \"{path}\"",
WorkingDirectory = Path.GetDirectoryName(path)
};
if (args.Length > 0)
{
var arguments = string.Join(" ", args.Select(x => $"\"{x}\""));
process.StartInfo.Arguments += $" {arguments}";
}
output.AppendLine($"args:'{process.StartInfo.Arguments}'");
process.OutputDataReceived += (sender, e) =>
{
if (e.Data == null)
{
outputWaitHandle.Set();
}
else
{
output.AppendLine(e.Data);
}
};
process.ErrorDataReceived += (sender, e) =>
{
if (e.Data == null)
{
errorWaitHandle.Set();
}
else
{
error.AppendLine(e.Data);
}
};
process.Start();
process.BeginOutputReadLine();
process.BeginErrorReadLine();
process.WaitForExit(ProcessTimeOutMiliseconds);
var logs = output + Environment.NewLine + error;
return process.ExitCode == 0 ? (true, logs) : (false, logs);
}
}
finally
{
outputWaitHandle.WaitOne(ProcessTimeOutMiliseconds);
errorWaitHandle.WaitOne(ProcessTimeOutMiliseconds);
}
}
}
Script:
start-process $args[0] App.csproj -Wait -NoNewWindow
[string]$sourceDirectory = "\bin\Debug\*"
[int]$count = (dir $sourceDirectory | measure).Count;
If ($count -eq 0)
{
exit 1;
}
Else
{
exit 0;
}
where
$args[0] = "C:\Program Files (x86)\Microsoft Visual Studio\2019\Professional\MSBuild\Current\Bin\MSBuild.exe"
Edit
To @ingen's solution I added a small wrapper which retries executing the hung MSBuild:
public static void ExecuteScriptRx(string path, int processTimeOutMilliseconds, out string logs, out bool success, params string[] args)
{
var current = 0;
int attempts_count = 5;
bool _local_success = false;
string _local_logs = "";
while (attempts_count > 0 && _local_success == false)
{
Console.WriteLine($"Attempt: {++current}");
InternalExecuteScript(path, processTimeOutMilliseconds, out _local_logs, out _local_success, args);
attempts_count--;
}
success = _local_success;
logs = _local_logs;
}
Where InternalExecuteScript is ingen's code
Let's start with a recap of the accepted answer in a related post.
The problem is that if you redirect StandardOutput and/or StandardError the internal buffer can become full. Whatever order you use, there can be a problem:
If you wait for the process to exit before reading StandardOutput the process can block trying to write to it, so the process never ends.
If you read from StandardOutput using ReadToEnd then your process can block if the process never closes StandardOutput (for example if it never terminates, or if it is blocked writing to StandardError).
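For reference, a minimal deadlock-free shape of that advice, as a sketch only (the executable path and arguments are placeholders passed in by the caller, not from the original post):

static async Task<(string Output, string Error)> RunAndCaptureAsync(string exePath, string arguments)
{
    var psi = new ProcessStartInfo(exePath, arguments)
    {
        RedirectStandardOutput = true,
        RedirectStandardError = true,
        UseShellExecute = false,
        CreateNoWindow = true
    };
    using (var process = Process.Start(psi))
    {
        // Begin draining both pipes before blocking on exit, so the child can never fill a buffer.
        Task<string> stdOut = process.StandardOutput.ReadToEndAsync();
        Task<string> stdErr = process.StandardError.ReadToEndAsync();
        process.WaitForExit();
        return (await stdOut, await stdErr);
    }
}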
Even the accepted answer, however, struggles with the order of execution in certain cases.
EDIT: See the answers below for how to avoid an ObjectDisposedException if the timeout occurs.
It's in these kind of situations, where you want to orchestrate several events, that Rx really shines.
Note the .NET implementation of Rx is available as the System.Reactive NuGet package.
Let's dive in to see how Rx facilitates working with events.
// Subscribe to OutputData
Observable.FromEventPattern<DataReceivedEventArgs>(process, nameof(Process.OutputDataReceived))
.Subscribe(
eventPattern => output.AppendLine(eventPattern.EventArgs.Data),
exception => error.AppendLine(exception.Message)
).DisposeWith(disposables);
FromEventPattern allows us to map distinct occurrences of an event to a unified stream (aka observable). This allows us to handle the events in a pipeline (with LINQ-like semantics). The Subscribe overload used here is provided with an Action<EventPattern<...>> and an Action<Exception>. Whenever the observed event is raised, its sender and args will be wrapped by EventPattern and pushed through the Action<EventPattern<...>>. When an exception is raised in the pipeline, Action<Exception> is used.
One of the drawbacks of the Event pattern, clearly illustrated in this use case (and by all the workarounds in the referenced post), is that it's not apparent when / where to unsubscribe the event handlers.
With Rx we get back an IDisposable when we make a subscription. When we dispose of it, we effectively end the subscription. With the addition of the DisposeWith extension method (borrowed from RxUI), we can add multiple IDisposables to a CompositeDisposable (named disposables in the code samples). When we're all done, we can end all subscriptions with one call to disposables.Dispose().
To be sure, there's nothing we can do with Rx, that we wouldn't be able to do with vanilla .NET. The resulting code is just a lot easier to reason about, once you've adapted to the functional way of thinking.
public static void ExecuteScriptRx(string path, int processTimeOutMilliseconds, out string logs, out bool success, params string[] args)
{
StringBuilder output = new StringBuilder();
StringBuilder error = new StringBuilder();
using (var process = new Process())
using (var disposables = new CompositeDisposable())
{
process.StartInfo = new ProcessStartInfo
{
WindowStyle = ProcessWindowStyle.Hidden,
FileName = "powershell.exe",
RedirectStandardOutput = true,
RedirectStandardError = true,
UseShellExecute = false,
Arguments = $"-ExecutionPolicy Bypass -File \"{path}\"",
WorkingDirectory = Path.GetDirectoryName(path)
};
if (args.Length > 0)
{
var arguments = string.Join(" ", args.Select(x => $"\"{x}\""));
process.StartInfo.Arguments += $" {arguments}";
}
output.AppendLine($"args:'{process.StartInfo.Arguments}'");
// Raise the Process.Exited event when the process terminates.
process.EnableRaisingEvents = true;
// Subscribe to OutputData
Observable.FromEventPattern<DataReceivedEventArgs>(process, nameof(Process.OutputDataReceived))
.Subscribe(
eventPattern => output.AppendLine(eventPattern.EventArgs.Data),
exception => error.AppendLine(exception.Message)
).DisposeWith(disposables);
// Subscribe to ErrorData
Observable.FromEventPattern<DataReceivedEventArgs>(process, nameof(Process.ErrorDataReceived))
.Subscribe(
eventPattern => error.AppendLine(eventPattern.EventArgs.Data),
exception => error.AppendLine(exception.Message)
).DisposeWith(disposables);
var processExited =
// Observable will tick when the process has gracefully exited.
Observable.FromEventPattern<EventArgs>(process, nameof(Process.Exited))
// First two lines to tick true when the process has gracefully exited and false when it has timed out.
.Select(_ => true)
.Timeout(TimeSpan.FromMilliseconds(processTimeOutMilliseconds), Observable.Return(false))
// Force termination when the process timed out
.Do(exitedSuccessfully => { if (!exitedSuccessfully) { try { process.Kill(); } catch {} } } );
// Subscribe to the Process.Exited event.
processExited
.Subscribe()
.DisposeWith(disposables);
// Start process(ing)
process.Start();
process.BeginOutputReadLine();
process.BeginErrorReadLine();
// Wait for the process to terminate (gracefully or forced)
processExited.Take(1).Wait();
logs = output + Environment.NewLine + error;
success = process.ExitCode == 0;
}
}
We already discussed the first part, where we map our events to observables, so we can jump straight to the meaty part. Here we assign our observable to the processExited variable, because we want to use it more than once.
First, when we activate it, by calling Subscribe. And later on when we want to 'await' its first value.
var processExited =
// Observable will tick when the process has gracefully exited.
Observable.FromEventPattern<EventArgs>(process, nameof(Process.Exited))
// First two lines to tick true when the process has gracefully exited and false when it has timed out.
.Select(_ => true)
.Timeout(TimeSpan.FromMilliseconds(processTimeOutMilliseconds), Observable.Return(false))
// Force termination when the process timed out
.Do(exitedSuccessfully => { if (!exitedSuccessfully) { try { process.Kill(); } catch {} } } );
// Subscribe to the Process.Exited event.
processExited
.Subscribe()
.DisposeWith(disposables);
// Start process(ing)
...
// Wait for the process to terminate (gracefully or forced)
processExited.Take(1).Wait();
One of the problems with the OP's code is that it assumes process.WaitForExit(ProcessTimeOutMiliseconds) will terminate the process when it times out. From MSDN:
Instructs the Process component to wait the specified number of milliseconds for the associated process to exit.
Instead, when it times out, it merely returns control to the current thread (i.e. it stops blocking). You need to force termination manually when the process times out. To know when a timeout has occurred, we can map the Process.Exited event to a processExited observable for processing. This way we can prepare the input for the Do operator.
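In plain, non-Rx terms, the point about the timeout boils down to this sketch:

// WaitForExit(timeout) only stops blocking; the caller has to kill the process itself.
if (!process.WaitForExit(processTimeOutMilliseconds))
{
    try { process.Kill(); } catch (InvalidOperationException) { /* it exited in the meantime */ }
}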
The Rx code above is pretty self-explanatory. If exitedSuccessfully is true, the process has terminated gracefully. If not, termination needs to be forced. Note that process.Kill() executes asynchronously (see the Remarks in the docs). However, calling process.WaitForExit() right after it would open up the possibility of deadlocks again. So even in the case of forced termination, it's better to let all disposables be cleaned up when the using scope ends, as the output can be considered interrupted / corrupted anyway.
The try catch construct is reserved for the exceptional case (no pun intended) where you've aligned processTimeOutMilliseconds with the actual time needed by the process to complete. In other words, a race condition occurs between the Process.Exited event and the timer. The possibility of this happening is again magnified by the asynchronous nature of process.Kill(). I've encountered it once during testing.
For completeness, the DisposeWith extension method.
/// <summary>
/// Extension methods associated with the IDisposable interface.
/// </summary>
public static class DisposableExtensions
{
/// <summary>
/// Ensures the provided disposable is disposed with the specified <see cref="CompositeDisposable"/>.
/// </summary>
public static T DisposeWith<T>(this T item, CompositeDisposable compositeDisposable)
where T : IDisposable
{
if (compositeDisposable == null)
{
throw new ArgumentNullException(nameof(compositeDisposable));
}
compositeDisposable.Add(item);
return item;
}
}
The problem is that if you redirect StandardOutput and/or StandardError the internal buffer can become full.
To solve the aforementioned issues you can run the process on a separate thread. I don't use WaitForExit; I use the process Exited event, which returns the ExitCode of the process asynchronously, ensuring it has completed.
public async Task<int> RunProcessAsync(params string[] args)
{
try
{
var tcs = new TaskCompletionSource<int>();
var process = new Process
{
StartInfo = {
FileName = "file path", // placeholder for the executable to run
RedirectStandardOutput = true,
RedirectStandardError = true,
Arguments = "shell command",
UseShellExecute = false,
CreateNoWindow = true
},
EnableRaisingEvents = true
};
process.Exited += (sender, args) =>
{
tcs.SetResult(process.ExitCode);
process.Dispose();
};
process.Start();
// Use asynchronous read operations on at least one of the streams.
// Reading both streams synchronously would generate another deadlock.
process.BeginOutputReadLine();
string tmpErrorOut = await process.StandardError.ReadToEndAsync();
//process.WaitForExit();
return await tcs.Task;
}
catch (Exception ee) {
Console.WriteLine(ee.Message);
}
return -1;
}
The above code is battle-tested calling FFMPEG.exe with command-line arguments. I was converting mp4 files to mp3 files, doing over 1000 videos at a time without failure. Unfortunately I do not have direct PowerShell experience, but I hope this helps.
For the benefit of readers I'm going to divide this into 2 sections:
Section A: Problem & how to handle similar scenarios
Section B: Problem recreation & Solution
Section A: Problem
When this problem happens, the process appears in Task Manager, then after 2-3 seconds disappears (which is fine), then the code waits for the timeout, and then the exception System.InvalidOperationException: Process must exit before requested information can be determined is thrown.
See Scenario 4 below.
In your code:
process.WaitForExit(ProcessTimeOutMiliseconds): with this you're waiting for the process to time out or exit, whichever happens first.
outputWaitHandle.WaitOne(ProcessTimeOutMiliseconds) and errorWaitHandle.WaitOne(ProcessTimeOutMiliseconds): with this you're waiting for the OutputData and ErrorData stream read operations to signal that they are complete.
process.ExitCode == 0: gets the status of the process when it exited.
Different settings & their caveats:
Scenario 1 (happy path): the process completes before the timeout, so the stdout and stderr reads also finish before it, and all is well.
Scenario 2: the process, outputWaitHandle and errorWaitHandle time out, but stdout and stderr are still being read and only complete after the wait handles time out. This leads to another exception, ObjectDisposedException.
Scenario 3: the process times out first (19 sec), but stdout and stderr reading is still in progress, so you then wait for the wait handles to time out as well, causing an added delay of up to 19 seconds.
Scenario 4: the process times out and the code prematurely queries Process.ExitCode, resulting in the error System.InvalidOperationException: Process must exit before requested information can be determined.
I have tested this scenario over a dozen times and it works fine; the following settings were used while testing:
Output stream sizes ranging from 5 KB to 198 KB, by initiating builds of about 2-15 projects
Premature timeouts & process exits within the timeout window
Updated Code
.
.
.
process.BeginOutputReadLine();
process.BeginErrorReadLine();
//First waiting for ReadOperations to Timeout and then check Process to Timeout
if (!outputWaitHandle.WaitOne(ProcessTimeOutMiliseconds) && !errorWaitHandle.WaitOne(ProcessTimeOutMiliseconds)
&& !process.WaitForExit(ProcessTimeOutMiliseconds) )
{
//To cancel the read operation if the process is still reading after the timeout; this will prevent ObjectDisposedException
process.CancelOutputRead();
process.CancelErrorRead();
Console.ForegroundColor = ConsoleColor.Red;
Console.WriteLine("Timed Out");
logs = output + Environment.NewLine + error;
//To release allocated resource for the Process
process.Close();
return (false, logs);
}
Console.ForegroundColor = ConsoleColor.Green;
Console.WriteLine("Completed On Time");
logs = output + Environment.NewLine + error;
var exitCode = process.ExitCode;
// Close frees the memory allocated to the exited process
process.Close();
//ExitCode was captured before Close, so it is still accessible
return exitCode == 0 ? (true, logs) : (false, logs);
}
}
finally{}
EDIT:
After hours of playing around with MSBuild I was finally able to reproduce the issue on my system
Section B: Problem Recreation & Solution
MSBuild has -m[:number] switch which
is used to specify the maximum number of concurrent processes to use
when building.
When this is enabled, MSBuild spawns a number of nodes that live on even after the build is complete. As a result, Process.WaitForExit(milliseconds) never sees the process exit and eventually times out.
I was able to solve this in a couple of ways
Spawn MSBuild process indirectly through CMD
$path1 = """C:\Program Files (x86)\Microsoft Visual Studio\2017\Community\MSBuild\15.0\Bin\MSBuild.exe"" ""C:\Users\John\source\repos\Test\Test.sln"" -maxcpucount:3"
$cmdOutput = cmd.exe /c $path1 '2>&1'
$cmdOutput
Continue to use MSBuild directly, but be sure to set nodeReuse to False
$filepath = "C:\Program Files (x86)\Microsoft Visual Studio\2017\Community\MSBuild\15.0\Bin\MSBuild.exe"
$arg1 = "C:\Users\John\source\repos\Test\Test.sln"
$arg2 = "-m:3"
$arg3 = "-nr:False"
Start-Process -FilePath $filepath -ArgumentList $arg1,$arg2,$arg3 -Wait -NoNewWindow
Even if parallel build is not enabled, you can still prevent your process from hanging at WaitForExit by launching the build via CMD, so that you do not create a direct dependency on the build process
$path1 = """C:\....\15.0\Bin\MSBuild.exe"" ""C:\Users\John\source\Test.sln"""
$cmdOutput = cmd.exe /c $path1 '2>&1'
$cmdOutput
The 2nd approach is preferred since you do not want too many MSBuild nodes to be lying around.
Not sure if this is your issue, but looking at MSDN there seems to be some weirdness with the overloaded WaitForExit when you are redirecting output asynchronously. The MSDN article recommends calling the WaitForExit overload that takes no arguments after the overload with a timeout returns true.
Docs page located here. Relevant text:
When standard output has been redirected to asynchronous event handlers, it is possible that output processing will not have completed when this method returns. To ensure that asynchronous event handling has been completed, call the WaitForExit() overload that takes no parameter after receiving a true from this overload. To help ensure that the Exited event is handled correctly in Windows Forms applications, set the SynchronizingObject property.
Code modification might look something like this:
if (process.WaitForExit(ProcessTimeOutMiliseconds))
{
process.WaitForExit();
}
I have a dotnet core 2.2 console app.
I hosted it as windows service.
Service starts up another dotnet core WebAPI.
The problem is: how do I gracefully shut down the WebAPI process when the service is stopped?
Note: I don't want to use Kill() method.
Sample code:
public class MyService : IHostedService, IDisposable
{
private Timer _timer;
static Process webAPI;
public Task StartAsync(CancellationToken cancellationToken)
{
_timer = new Timer(
(e) => StartChildProcess(),
null,
TimeSpan.Zero,
TimeSpan.FromMinutes(1));
return Task.CompletedTask;
}
public void StartChildProcess()
{
try
{
webAPI = new Process();
webAPI.StartInfo.UseShellExecute = false;
webAPI.StartInfo.FileName = @"C:\Project\bin\Debug\netcoreapp2.2\publish\WebAPI.exe";
webAPI.Start();
}
catch (Exception e)
{
// Handle exception
}
}
public Task StopAsync(CancellationToken cancellationToken)
{
// TODO: Add code to stop child process safely
_timer?.Change(Timeout.Infinite, 0);
return Task.CompletedTask;
}
public void Dispose()
{
_timer?.Dispose();
}
}
Technically you could simply call Process.Kill() to immediately shut down the process. However, a lot of the time that is not the way to go, because the WebAPI might be in the middle of important operations, you can't really tell when those operations are happening, and Process.Kill() is not really considered "graceful".
What would be most prudent is to tell the process that you would like it to shut down at its earliest convenience, and then allow the WebAPI to clean things up before it exits itself. If you are designing the WebAPI yourself, that is even better, because you can decide how to do this, calling Kill() only when it is absolutely necessary.
You can do that in multiple ways, of course. Some that come to mind are sockets that are periodically checked, or sending a CTRL+C input to the WebAPI.
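For instance, sending CTRL+C to a console-hosted child can be done with a few kernel32 calls. This is only a sketch of that idea; it assumes the WebAPI runs as a console application and that the host (e.g. the Windows service) has no console of its own that would prevent attaching:

using System;
using System.Diagnostics;
using System.Runtime.InteropServices;

static class GracefulStop
{
    private const uint CTRL_C_EVENT = 0;

    [DllImport("kernel32.dll", SetLastError = true)]
    private static extern bool AttachConsole(uint dwProcessId);

    [DllImport("kernel32.dll", SetLastError = true)]
    private static extern bool FreeConsole();

    [DllImport("kernel32.dll")]
    private static extern bool GenerateConsoleCtrlEvent(uint dwCtrlEvent, uint dwProcessGroupId);

    [DllImport("kernel32.dll")]
    private static extern bool SetConsoleCtrlHandler(IntPtr handlerRoutine, bool add);

    // Ask the child console process to shut down as if the user had pressed CTRL+C.
    public static bool TryCtrlC(Process child, int waitMilliseconds)
    {
        if (!AttachConsole((uint)child.Id)) return false;
        try
        {
            SetConsoleCtrlHandler(IntPtr.Zero, true);   // ignore CTRL+C in the host itself
            GenerateConsoleCtrlEvent(CTRL_C_EVENT, 0);  // signal every process on that console
            return child.WaitForExit(waitMilliseconds);
        }
        finally
        {
            SetConsoleCtrlHandler(IntPtr.Zero, false);
            FreeConsole();
        }
    }
}

With a shutdown signal like that in place, the StopAsync shown below then just waits for the child to exit and releases it: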
public Task StopAsync(CancellationToken cancellationToken)
{
// send request to shut down
// wait for process to exit and free its resources
process.WaitForExit();
process.Close();
process.Dispose();
_timer?.Change(Timeout.Infinite, 0);
return Task.CompletedTask;
}
Of course, if this is async then it wouldn't make sense to block waiting for the exit inside the method, so you would instead wait for, or check, the exit outside of this method.
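If you want StopAsync itself to stay asynchronous rather than block in WaitForExit, a rough sketch (reusing the webAPI field from the question) could look like this:

public async Task StopAsync(CancellationToken cancellationToken)
{
    _timer?.Change(Timeout.Infinite, 0);
    // Signal the WebAPI to shut down first (socket, HTTP endpoint, CTRL+C, ...),
    // then wait for it off the caller's thread instead of blocking on WaitForExit().
    await Task.Run(() => webAPI.WaitForExit(), cancellationToken);
    webAPI.Dispose();
}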
There were a lot of threads regarding this issue on GitHub; consider post 7426.
The solution is found here: StartAndStopDotNetCoreApp; the sample code of Program.cs is:
using System;
using System.IO;
using System.Diagnostics;
namespace StartAndStopDotNetCoreApp
{
class Program
{
static void Main(string[] args)
{
string projectPath = @"C:\source\repos\StartAndStopDotNetCoreApp\WebApplication";
string outputPath = @"C:\Temp\WebApplication";
Console.WriteLine("Starting the app...");
var process = new Process();
process.StartInfo.WorkingDirectory = projectPath;
process.StartInfo.FileName = "dotnet";
process.StartInfo.Arguments = $"publish -o {outputPath}";
process.StartInfo.UseShellExecute = false;
process.StartInfo.CreateNoWindow = true;
process.Start();
process.WaitForExit();
process.Close();
process.Dispose();
process = new Process();
process.StartInfo.WorkingDirectory = outputPath;
process.StartInfo.FileName = "dotnet";
process.StartInfo.Arguments = $"{projectPath.Split(#"\")[projectPath.Split(#"\").Length - 1]}.dll";
process.StartInfo.UseShellExecute = false;
process.StartInfo.CreateNoWindow = false;
process.StartInfo.RedirectStandardOutput = true;
process.Start();
Console.WriteLine("Press anything to stop...");
Console.Read();
process.Kill();
}
}
}
If this is not what you are looking for, search the mentioned thread for more ways, it offers plenty.
Perhaps it's possible to use the CloseMainWindow method of the System.Diagnostics.Process class? I stumbled upon it when I was looking for a way to send a WM_CLOSE message to another process from C#.
Edit:
CloseMainWindow does not seem to work if the main window is "blocked", for example by an open modal window or a dialog.
You might investigate a strategy for sending a WM_CLOSE or WM_DESTROY message to your app process explicitly. But I cannot guarantee if it will work as you want/expect. I haven't worked with that kind of Windows Messages functionality for a very long time.
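If you do try it, a sketch of the usual pattern is to request a graceful close first and only kill as a last resort:

// Sketch: ask the main window to close; fall back to Kill if it refuses or takes too long.
if (!process.CloseMainWindow() || !process.WaitForExit(5000))
{
    process.Kill();
}
process.WaitForExit();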
I have this task in C# that should return the standard output of DISM, so I can use it where I need it:
public async Task<StreamReader> DISM(string Args)
{
StreamReader DISMstdout = null;
await Task.Run(() =>
{
Process DISMcmd = new Process();
if (Environment.Is64BitOperatingSystem)
{
DISMcmd.StartInfo.FileName = System.IO.Path.Combine(Environment.GetFolderPath(Environment.SpecialFolder.Windows), "SysWOW64", "dism.exe");
}
else
{
DISMcmd.StartInfo.FileName = System.IO.Path.Combine(Environment.GetFolderPath(Environment.SpecialFolder.Windows), "System32", "dism.exe");
}
DISMcmd.StartInfo.Verb = "runas";
DISMcmd.StartInfo.Arguments = Args;
DISMcmd.StartInfo.WindowStyle = ProcessWindowStyle.Hidden;
DISMcmd.StartInfo.CreateNoWindow = true;
DISMcmd.StartInfo.UseShellExecute = false;
DISMcmd.StartInfo.RedirectStandardOutput = true;
DISMcmd.EnableRaisingEvents = true;
DISMcmd.Start();
DISMstdout = DISMcmd.StandardOutput;
DISMcmd.WaitForExit();
});
return DISMstdout;
}
But it doesn't really work.
If I try to read the standard output from another task I can't (because it is empty), so there must be a problem with my task?
public async Task Test()
{
await Task.Run(async () =>
{
StreamReader DISM = await this.DISM("/Get-ImageInfo /ImageFile:" + ImagePath + @" /Index:1");
string data = string.Empty;
MessageBox.Show(DISM.ReadToEnd()); // this should display a msgbox with the standardoutput of dism
while ((data = DISM.ReadLine()) != null)
{
if (data.Contains("Version : "))
{
// do something
}
}
});
}
What is wrong with this piece of code?
The way I'd write your method to exploit async..await as opposed to the legacy asynchronous approaches is like this:
public async Task<TResult> WithDism<TResult>(string args, Func<StreamReader, Task<TResult>> func)
{
return await Task.Run(async () =>
{
var proc = new Process();
var windowsDir = Environment.GetFolderPath(Environment.SpecialFolder.Windows);
var systemDir = Environment.Is64BitOperatingSystem ? "SysWOW64" : "System32";
proc.StartInfo.FileName = Path.Combine(windowsDir, systemDir, "dism.exe");
proc.StartInfo.Verb = "runas";
proc.StartInfo.Arguments = args;
proc.StartInfo.WindowStyle = ProcessWindowStyle.Hidden;
proc.StartInfo.CreateNoWindow = true;
proc.StartInfo.UseShellExecute = false;
proc.StartInfo.RedirectStandardOutput = true;
proc.Start();
Console.Error.WriteLine("dism started");
var result = await func(proc.StandardOutput);
Console.Error.WriteLine("func finished");
// discard rest of stdout
await proc.StandardOutput.ReadToEndAsync();
proc.WaitForExit();
return result;
});
}
Realistically, the only part where significant blocking can occur when spawning a process is when you handle the output it produces. Used like this:
var task = WithDism("/?", async sr => await sr.ReadToEndAsync()); // or process line-by-line
Console.WriteLine("dism task running");
Console.WriteLine(await task);
it produces the following output
dism task running
dism started
func finished
Error: 740
Elevated permissions are required to run DISM.
Use an elevated command prompt to complete these tasks.
Do note that when using subprocesses, it's your job to make sure they correctly exit or are shut down to avoid leaving zombie processes around. That's why I've added the possibly redundant ReadToEndAsync() - in case func still leaves some output unconsumed, this should allow the process to reach its natural end.
However, this means the calling function will only proceed once that happens. If you leave behind a lot of unconsumed output you're not interested in, this will cause an unwanted delay. You could work around this by spawning off this cleanup to a different background task and returning the result immediately using something like:
Task.Run(async () => {
// discard rest of stdout and clean up process:
await proc.StandardOutput.ReadToEndAsync();
proc.WaitForExit();
});
but I admit I'm going a bit out on a limb there, I'm not entirely sure about the robustness of just letting a task "run wild" like that. What the appropriate way to clean up the process is will, of course, depend on what it's actually doing after you get the output you want to return from func.
I'm using synchronous calls to Console there because they only serve to illustrate the timing of events; I want to see them the moment execution reaches that point. Normally you would use async in a "viral" way to make sure control passes back to the top level as soon as possible.
After playing around with this using Benchmark.NET, it seems that starting a process (I tried DISM and Atom to have something hefty) - from setup to Start() - takes about 50 milliseconds. This seems pretty negligible to me for this use. After all, 50ms is good enough latency for say playing League of Legends, and you're not going to start these in a tight loop.
I'd like to provide an alternative answer of "don't bother with Task.Run() and just use async I/O in a straightforward way" unless you absolutely need to get rid of that delay and believe spawning off a background thread will help:
static string GetDismPath()
{
var windowsDir = Environment.GetFolderPath(Environment.SpecialFolder.Windows);
var systemDir = Environment.Is64BitOperatingSystem ? "SysWOW64" : "System32";
var dismExePath = Path.Combine(windowsDir, systemDir, "dism.exe");
return dismExePath;
}
static Process StartDism(string args)
{
var proc = new Process
{
StartInfo =
{
FileName = GetDismPath(),
Verb = "runas",
Arguments = args,
WindowStyle = ProcessWindowStyle.Hidden,
CreateNoWindow = true,
UseShellExecute = false,
// Redirect all three streams so Cleanup below can close stdin and drain stdout and stderr.
RedirectStandardInput = true,
RedirectStandardOutput = true,
RedirectStandardError = true
}
};
proc.Start();
return proc;
}
static void Cleanup(Process proc)
{
Task.Run(async () =>
{
proc.StandardInput.Close();
var buf = new char[0x1000];
while (await proc.StandardOutput.ReadBlockAsync(buf, 0, buf.Length).ConfigureAwait(false) != 0) { }
while (await proc.StandardError.ReadBlockAsync(buf, 0, buf.Length).ConfigureAwait(false) != 0) { }
if (!proc.WaitForExit(5000))
{
proc.Kill();
}
proc.Dispose();
});
}
static async Task Main(string[] args)
{
var dismProc = StartDism("/?");
// do what you want with the output
var dismOutput = await dismProc.StandardOutput.ReadToEndAsync().ConfigureAwait(false);
await Console.Out.WriteAsync(dismOutput).ConfigureAwait(false);
Cleanup(dismProc);
}
I'm only using Task.Run() to keep the cleanup off the main thread in case you need to do something else while DISM keeps producing output you're not interested in that you do not wish to kill outright.
I am trying to read from a CMD process repeatedly and print out the output repeatedly as well. The while loop I am using is inside a Task.Run so that it doesn't lock the program up, but there is some problem with mixing async and non-async methods that I don't quite understand. I get that I need to not have pProcess.StandardOutput.ReadToEnd in a Task.Run, but how would I implement this in my code? Thanks!
Task.Run(() =>
{
while (true)
{
string output = pProcess.StandardOutput.ReadToEnd();
System.Console.Write(output);
}
});
Could you please try the following code snippet? OutputDataReceived is the event used to read process output asynchronously.
var startInfo = new ProcessStartInfo {
FileName = <Your Application>,
UseShellExecute = false, // Required to use RedirectStandardOutput
RedirectStandardOutput = true, //Required to be able to read StandardOutput
Arguments = <Args> // Skip this if you don't use Arguments
};
using(var process = new Process { StartInfo = startInfo })
{
process.Start();
process.OutputDataReceived += (sender, line) =>
{
if (line.Data != null)
Console.WriteLine(line.Data);
};
process.BeginOutputReadLine();
process.WaitForExit();
}
I have solved this problem.
You are using StandardOutput. You must use only one of these per Process object:
string output = pProcess.StandardOutput.ReadToEnd();
OR
pProcess.BeginOutputReadLine();
You cannot use both at the same time on a single object. So, if you want to read the output and also wait for the process to complete, use
string output = pProcess.StandardOutput.ReadToEnd();
If you do not want to read the output synchronously, you can simply use
pProcess.BeginOutputReadLine();
You can use either according to your need, but not both at the same time.
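For example, the read-to-end-only variant (no BeginOutputReadLine on the same process) might look like this sketch, assuming pProcess was started with RedirectStandardOutput = true and UseShellExecute = false:

static async Task DumpOutputAsync(Process pProcess)
{
    // ReadToEndAsync completes when the process closes its stdout, so no while (true) loop is needed.
    string output = await pProcess.StandardOutput.ReadToEndAsync();
    Console.Write(output);
    pProcess.WaitForExit();
}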
Assuming pProcess is an instance of Process, pProcess.StandardOutput is read asynchronously, but ReadToEnd() keeps reading until the end of the stream. I don't think you need Task.Run or the while loop. Check out the example at https://msdn.microsoft.com/en-us/library/system.diagnostics.process.beginoutputreadline%28v=vs.110%29.aspx
There's no problem with executing this code in a separate task with Task.Run. I think you're calling some asynchronous method like BeginErrorReadLine or BeginOutputReadLine somewhere earlier in your code. This SO answer may help: I am trying to read the output of a process in c# but I get this message "Cannot mix synchronous and asynchronous operation on process stream."
I am new to the thread model in .NET. What would you use to:
Start a process that handles a file (process.StartInfo.FileName = fileName;).
Wait for the user to close the process OR abandon the thread after some time.
If the user closed the process, delete the file.
Starting the process and waiting should be done on a different thread than the main thread, because this operation should not affect the application.
Example:
My application produces an html report. The user can right click somewhere and say "View Report" - now I retrieve the report contents in a temporary file and launch the process that handles html files i.e. the default browser. The problem is that I cannot cleanup, i.e. delete the temp file.
"and waiting must be async" - I'm not trying to be funny, but isn't that a contradiction in terms? However, since you are starting a Process, the Exited event may help:
ProcessStartInfo startInfo = null;
Process process = Process.Start(startInfo);
process.EnableRaisingEvents = true;
process.Exited += delegate {/* clean up*/};
If you want to actually wait (timeout etc), then:
if(process.WaitForExit(timeout)) {
// user exited
} else {
// timeout (perhaps process.Kill();)
}
For waiting async, perhaps just use a different thread?
ThreadPool.QueueUserWorkItem(delegate {
Process process = Process.Start(startInfo);
if(process.WaitForExit(timeout)) {
// user exited
} else {
// timeout
}
});
Adding an advanced alternative to this old question. If you want to wait for a process to exit without blocking any thread and still support timeouts, try the following:
public static Task<bool> WaitForExitAsync(this Process process, TimeSpan timeout)
{
ManualResetEvent processWaitObject = new ManualResetEvent(false);
processWaitObject.SafeWaitHandle = new SafeWaitHandle(process.Handle, false);
TaskCompletionSource<bool> tcs = new TaskCompletionSource<bool>();
RegisteredWaitHandle registeredProcessWaitHandle = null;
registeredProcessWaitHandle = ThreadPool.RegisterWaitForSingleObject(
processWaitObject,
delegate(object state, bool timedOut)
{
if (!timedOut)
{
registeredProcessWaitHandle.Unregister(null);
}
processWaitObject.Dispose();
tcs.SetResult(!timedOut);
},
null /* state */,
timeout,
true /* executeOnlyOnce */);
return tcs.Task;
}
Again, the advantage to this approach compared to the accepted answer is that you're not blocking any threads, which reduces the overhead of your app.
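Usage, inside an async method, could look roughly like this (the 30-second timeout is just an example):

bool exited = await process.WaitForExitAsync(TimeSpan.FromSeconds(30));
if (!exited)
{
    process.Kill(); // timed out; force termination if that is the policy you want
}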
Try the following code.
public void KickOffProcess(string filePath) {
var proc = Process.Start(filePath);
ThreadPool.QueueUserWorkItem(new WaitCallback(WaitForProc), proc);
}
private void WaitForProc(object obj) {
var proc = (Process)obj;
proc.WaitForExit();
// Do the file deletion here
}
.NET 5 introduced the new API Process.WaitForExitAsync, which allows waiting asynchronously for the completion of a process. It offers the same functionality as the existing Process.WaitForExit, with the only difference being that the waiting is asynchronous, so it does not block the calling thread.
Usage example:
private async void button1_Click(object sender, EventArgs e)
{
string filePath = Path.Combine
(
Environment.GetFolderPath(Environment.SpecialFolder.LocalApplicationData),
Guid.NewGuid().ToString() + ".txt"
);
File.WriteAllText(filePath, "Hello World!");
try
{
using Process process = new();
process.StartInfo.FileName = "Notepad.exe";
process.StartInfo.Arguments = filePath;
process.Start();
await process.WaitForExitAsync();
}
finally
{
File.Delete(filePath);
}
MessageBox.Show("Done!");
}
In the above example the UI remains responsive while the user interacts with the opened file. The UI thread would be blocked if the WaitForExit had been used instead.
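If a timeout is also needed, WaitForExitAsync can be combined with a CancellationTokenSource; roughly:

using var cts = new CancellationTokenSource(TimeSpan.FromSeconds(30));
try
{
    await process.WaitForExitAsync(cts.Token);
}
catch (OperationCanceledException)
{
    process.Kill(); // the process did not exit within 30 seconds
}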
I would probably not use a separate process for opening a file. Instead, I'd probably utilize a background thread (if I thought the operation was going to take a long time and possibly block the UI thread).
private delegate void FileOpenDelegate(string filename);
public void OpenFile(string filename)
{
FileOpenDelegate fileOpenDelegate = OpenFileAsync;
AsyncCallback callback = AsyncCompleteMethod;
fileOpenDelegate.BeginInvoke(filename, callback, null);
}
private void OpenFileAsync(string filename)
{
// file opening code here, and then do whatever with the file
}
Of course, this is not a good working example (it returns nothing) and I haven't shown how the UI gets updated (you have to use BeginInvoke at the UI level because a background thread cannot update the UI thread). But this approach is generally how I go about handling asynchronous operations in .Net.
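For example, marshaling a status update back to the UI thread from the callback might look like this one-liner (statusLabel is just an illustrative control name):

// WinForms: hop back onto the UI thread before touching any control.
statusLabel.BeginInvoke((Action)(() => statusLabel.Text = "Report opened"));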
You can use the Exited event of the Process class:
ProcessStartInfo info = new ProcessStartInfo();
info.FileName = "notepad.exe";
Process process = Process.Start(info);
process.EnableRaisingEvents = true; // required for the Exited event to be raised
process.Exited += new EventHandler(process_Exited);
Console.Read();
and in that event handler you can perform the operations you mentioned.
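For example (the temp-file variable here is a placeholder for whatever file you handed to the process):

static void process_Exited(object sender, EventArgs e)
{
    // The user closed the viewer, so the temporary report file can be removed.
    File.Delete(tempFilePath); // tempFilePath: placeholder for the temp file created earlier
}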