I am writing an application to manage processes and handle failovers. This program is written in C# for .NET Core and will run on Ubuntu Server 16.04 x64.
I have this code to create processes and track them, wiring up exit events and so on:
ProcessStartInfo psi = new ProcessStartInfo
{
WorkingDirectory = "/home/xyzserver/someprocess",
FileName = "mono",
Arguments = "someprocess.exe",
RedirectStandardOutput = true
};
_proc = Process.Start(psi);
_proc.EnableRaisingEvents = true;
_proc.Exited += ProcOnExited;
I understand from the docs here that calls to Console.WriteLine in the child will block if the _proc.StandardOutput stream's buffer is full. I want to prevent this behavior and discard all the output from the managed application, since it also writes to a physical log on its own.
In addition, I would like to avoid storing any of the output in stream buffers, since it will never be read. A preferred solution would not use UseShellExecute.
I have considered adding these 2 lines in the hope that any received data will be discarded, but am unsure about correctness.
_proc.OutputDataReceived += (sender, eventArgs) => {};
_proc.BeginOutputReadLine();
Is there a better way to accomplish this? Thoughts or comments are appreciated.
I manually ran the test on .NET Core using 3 programs:
An HTTP server to track the TextOutputter program.
A TextOutputter program that prints 1000 characters and makes an HTTP request every second.
A ProgramRunner that runs one instance of TextOutputter.
Without the 2 lines, the buffer fills up to 64 KB and the child stalls. With the 2 lines, there is no stalling.
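For reference, here is the whole pattern in one self-contained sketch (ProgramRunner and the paths are just placeholders from my test setup above):

using System;
using System.Diagnostics;

class ProgramRunner
{
    static void Main()
    {
        var psi = new ProcessStartInfo
        {
            WorkingDirectory = "/home/xyzserver/someprocess",
            FileName = "mono",
            Arguments = "someprocess.exe",
            UseShellExecute = false,
            RedirectStandardOutput = true
        };

        var proc = new Process { StartInfo = psi, EnableRaisingEvents = true };
        proc.Exited += (s, e) => Console.WriteLine("child exited");
        proc.Start();

        // Drain stdout so the child never blocks on a full pipe;
        // the empty handler simply throws each line away.
        proc.OutputDataReceived += (s, e) => { };
        proc.BeginOutputReadLine();

        proc.WaitForExit();
    }
}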
Related
My testing seems to suggest that Process.Start() at its bare bones costs ~15-27 ms (depending on whether you use CreateNoWindow, which ADDS ~10 ms). In this question I'm not trying to say the code I'm running is slow; it's the actual act of starting a process that is, even if the process does nothing.
I got these numbers by putting a stopwatch in a console app that ran another console app which literally just returns from its Main method.
using System.Diagnostics;

namespace RunNothing
{
class Program
{
static void Main(string[] args)
{
var startInfo = new ProcessStartInfo(@"C:\Users\Noggog\Documents\visual studio 2017\Projects\DoNothing\DoNothing\bin\Release\DoNothing.exe")
{
CreateNoWindow = true,
UseShellExecute = false,
};
var sw = new Stopwatch();
var proc = new Process()
{
StartInfo = startInfo,
EnableRaisingEvents = true
};
proc.Exited += (sender, a) =>
{
sw.Stop();
System.Console.WriteLine(sw.ElapsedMilliseconds);
};
sw.Start();
proc.Start();
System.Console.ReadLine();
}
}
}
So my question is whether anyone knows a good way to improve that startup time per Process.Start() call?
For background, I have an app that will be starting various .exes, most of which should do a quick bool check and short circuit out, asap. Every so often one of the calls will do a longer operation, but usually it will do nothing. Right now a ~15-27ms call per go is getting a bit heavy for my use case.
Edit:
Some people were asking for more details of the end usage. This is the project that drove the question. The end usage is an extension of git hooks to provide more hookable commands, and to provide convenience features such as calling an exe in response to hooks being fired. One of the modes is that a single exe can handle multiple hooks and decide which ones it responds to. In that scenario, every git command "checks in" with the exe to see if it wants to do any logic. This is where the Process.Start() time adds up. If you have 6 repos and your git client initializes things by running several commands, a pre/post hook combo can be 27 ms (proc start time) * 2 (pre/post) * X (commands) * 6 (repos) = ~2+ seconds. Potentially none of these commands needs a response, but it has already added several seconds of sluggishness to the system.
I have a third party DOS process which writes data about its progress to the command line.
I want to react on the progress. Normally I would use a Process with RedirectStandardOutput = true and RedirectStandardError = true and then
.OutputDataReceived +=xyzOutputDataReceived;
.ErrorDataReceived += xyzErrorDataReceived;
.Start();
.BeginOutputReadLine();
.BeginErrorReadLine();
Normally this works, and I get what I need as DataReceivedEventArgs.
In this case the process seems to update the same line it has written (how is that possible?), so it writes 15%, then the 15% changes to 18%, and so on. Only at the end of execution does the data seem to be flushed to StandardOutput.
Also, if I just try to pipe the output to a text file (e.g. odb.exe >> output.txt) it shows nothing.
Is there any way to get the temporary data?
The question is not about getting the standard output; that works fine (synchronously and asynchronously). It is about how to get output from a process which I cannot change, and which does not seem to flush its output to the standard stream.
Like juharr says, you need to use Win32 to screen scrape the console.
Fortunately you don't need to write that code yourself. You can use the buffer-reader from this post: https://stackoverflow.com/a/12366307/5581231
The BufferReader reads from standard out. I suppose you are writing a WPF or WinForms application, so we'll also have to get a reference to the console window of the DOS application. For this, we will use the Win32 API call AttachConsole.
[System.Runtime.InteropServices.DllImport("kernel32.dll")]
private static extern bool AttachConsole(int pid);
I wrote a small example program that demonstrates the usage. It starts the exe and attaches to its console. It then scrapes the entire window once a second, and dumps the output to the debugger output window. You should be able to modify this to search the console content for any keywords etc. that you can use to track the progress of the program. Or you could dump it to a textfield or something in your UI, possibly after comparing it for changes?
var process = Process.Start(#"..path to your exe....");
//Wait for the DOS exe to start, and create its console window
while (process.MainWindowHandle == IntPtr.Zero)
{
Thread.Sleep(500);
}
//Attach to the console of our DOS exe
if (!AttachConsole(process.Id))
throw new Exception("Couldn't attach to console");
while (true)
{
var strings = ConsoleReader.ReadFromBuffer(0, 0,
                  (short)Console.BufferWidth,
                  (short)Console.BufferHeight);
foreach (var str in strings.
Select(s => s?.Trim()).
Where(s => !String.IsNullOrEmpty(s)))
{
Debug.WriteLine(str);
}
Thread.Sleep(1000);
}
Good Luck!
I'm trying to put together a wrapper around a console application using StandardInput and StandardOutput. I'm getting stuck where the console application would prompt for input such as a password.
I'd like to read from StandardOutput, prompt the user using the read text, and write the user's input back to the console application using its StandardInput. Seems easy enough, here's what I have currently:
Process process = new Process()
{
StartInfo =
{
FileName = "bin\\vpnc.exe",
Arguments = "--no-detach --debug 0",
CreateNoWindow = true,
UseShellExecute = false,
RedirectStandardInput = true,
RedirectStandardOutput = true,
}
};
process.OutputDataReceived += (s, args) =>
{
textBlock1.Dispatcher.Invoke(new Action(() =>
{
textBlock1.Text += args.Data;
}));
};
process.Start();
process.BeginOutputReadLine();
The problem is that BeginOutputReadLine() is doing just that: waiting for a line ending. In this case it just sits, and sits, and sits, because there is no line to read; the console application has written out text with no line ending and is waiting for input. Incidentally, when I manually kill the process the event fires and I get the text.
Is there a way to tell that the process is waiting for StandardInput? Or am I missing a completely obvious way to accomplish the goal?
Unless you need something asynchronous you probably want ReadToEnd.
Here is a list of all StreamReader Methods
process.StandardOutput.BaseStream.BeginRead(...) is a potential substitute for your ReadLine, and it will not wait for a line ending. However, you'd need to know what terminates the output to figure out when not to start waiting for the next chunk of data.
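As a rough sketch of that idea, here is a synchronous variant that reads raw characters instead of whole lines, so a prompt like "Password: " (no trailing newline) is still delivered; the buffer size is arbitrary:

var buffer = new char[256];
while (!process.HasExited)
{
    // Returns whatever characters are currently available; it does not wait for a newline.
    int read = process.StandardOutput.Read(buffer, 0, buffer.Length);
    if (read > 0)
    {
        string chunk = new string(buffer, 0, read);
        Console.Write(chunk); // or inspect the chunk for a prompt and react
    }
}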
As Rune said, you can access the output stream of the process directly (process.StandardOutput) and read from there (if you don't want to wait until a line break is written by the console app), but this means that you need to check periodically for new data.
To interact with the application, you can just write to the StandardInput of the process (process.StandardInput is already a StreamWriter you can write to).
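For the write side, a tiny sketch (passwordFromUser is a hypothetical variable holding whatever your UI collected):

// process.StandardInput is already a StreamWriter over the child's stdin.
process.StandardInput.AutoFlush = true;             // push each line to the child immediately
process.StandardInput.WriteLine(passwordFromUser);  // hypothetical: the text your UI collected for the prompt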
A nice sample of writing to it is on the MSDN documentation (http://msdn.microsoft.com/en-us/library/system.diagnostics.process.beginoutputreadline.aspx).
Hope this helps
You need to use the synchronous read method and handle any necessary threading yourself. The below code won't tell you that input is expected, but you will be able to detect that a prompt is displayed.
char[] b = new char[1024];
while (!process.HasExited) {
int c = process.StandardOutput.Read(b, 0, b.Length);
context.Response.Write(b, 0, c);
Thread.Sleep(100);
}
I have been observing that Process.HasExited sometimes returns true even though the process is still running.
My code below starts a process with name "testprogram.exe" and then waits for it to exit. The problem is that sometimes I get thrown the exception; it seems that even though HasExited returns true the process itself is still alive in the system - how can this be??
My program writes to a log file just before it terminates, so I need to be absolutely sure that this log file exists (i.e. the process has terminated/finished) before reading it. Continuously checking for its existence is not an option.
// Create new process object
process = new Process();
// Setup event handlers
process.EnableRaisingEvents = true;
process.OutputDataReceived += OutputDataReceivedEvent;
process.ErrorDataReceived += ErrorDataReceivedEvent;
process.Exited += ProgramExitedEvent;
// Setup start info
ProcessStartInfo psi = new ProcessStartInfo
{
FileName = ExePath,
// Must be false to redirect IO
UseShellExecute = false,
RedirectStandardOutput = true,
RedirectStandardError = true,
Arguments = arguments
};
process.StartInfo = psi;
// Start the program
process.Start();
while (!process.HasExited)
Thread.Sleep( 500 );
Process[] p = Process.GetProcessesByName( "testprogram" );
if ( p.Length != 0 )
throw new Exception("Oh oh");
UPDATE: I just tried waiting with process.WaitForExit() instead of the polling loop and the result is the exact same.
Addition: The above code was only meant to demonstrate the problem more clearly. To make it clear: my problem is NOT that I can still get hold of the process via Process.GetProcessesByName( "testprogram" ) after it has set HasExited to true.
The real problem is that the program I am running externally writes a file just before it terminates (gracefully). I use HasExited to check when the process has finished, and thus I know I can read the file (because the process exited!), but it seems that HasExited sometimes returns true even when the program has NOT written the file to disk yet. Here's example code that illustrates the exact problem:
// Start the program
process.Start();
while (!process.HasExited)
Thread.Sleep( 500 );
// Could also be process.WaitForExit(), makes no difference to the result
// Now the process has quit, I can read the file it has exported
if ( !File.Exists( xmlFile ) )
{
// But this exception is thrown occasionally, why?
throw new Exception("xml file not found");
}
I realize this is an old post, but in my quest to find out why my app was raising the Exited event before the app had even opened, I found out something that I thought might be useful to people experiencing this problem in the future.
When a process is started, it is assigned a PID. If the User is then prompted with the User Account Control dialog and selects 'Yes', the process is re-started and assigned a new PID.
I sat with this for a few hours, hopefully this can save someone time.
I would suggest you try it this way:
process.Start();
while (!process.HasExited)
{
// Discard cached information about the process.
process.Refresh();
// Just a little check!
Console.WriteLine("Physical Memory Usage: " + process.WorkingSet64.ToString());
Thread.Sleep(500);
}
foreach (Process current in Process.GetProcessesByName("testprogram"))
{
if ((current.Id == process.Id) && !current.HasExited)
throw new Exception("Oh oh!");
}
Anyway... on the MSDN page for HasExited I'm reading the following highlighted note:
When standard output has been redirected to asynchronous event handlers, it is possible that output processing will not have completed when this property returns true. To ensure that asynchronous event handling has been completed, call the WaitForExit() overload that takes no parameter before checking HasExited.
That could be somehow linked to your problem as you are redirecting everything.
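In code, that note boils down to something like this (a sketch using the same process variable):

process.WaitForExit();   // the parameterless overload also waits for the redirected
                         // output/error handlers to finish processing
// only now is it safe to assume all OutputDataReceived/ErrorDataReceived calls are done
Console.WriteLine("exit code: " + process.ExitCode);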
I know this is an old post, but maybe I can help someone. The Process class may behave unexpectedly: HasExited will return true if the process has exited, or if the process runs with administrator privileges and your program only has user privileges.
I posted a question regarding this a while back here, but did not receive a satisfactory answer.
First off, are you sure testprogram does not spawn a process of its own and exit without waiting for that process to finish? We're dealing with some kind of race condition here, and testprogram can be significant.
Second point I'd like to make is about this: "I need to be absolutely sure that this logfile exists". Well, there is no such thing. You can make your check, and then the file is gone. The common way to address this is not to check, but rather to do what you want to do with the file: go ahead, read it, catch exceptions, and retry if the thing seems unstable and you don't want to change anything. A check-then-act sequence does not work well if you have more than one actor (thread or whatever) in the system.
A bunch of random ideas follows.
Have you tried using FileSystemWatcher and not depending on process completion?
Does it get any better if you try reading the file (not checking if it exists, but acting instead) in the process.Exited event? [it shouldn't]
Is the system healthy? Anything suspicious in the Event Log?
Can some really aggressive antivirus policy be involved?
(Can't tell much without seeing all the code and looking into testprogram.)
To investigate the root cause further, you should maybe check what's really happening by using Process Monitor. Simply start it, include the external program and your own tool in the filter, and let it record what happens.
In the log you should see how the external tool writes to the output file and how you open that file, and, importantly, in which order all these accesses happen.
The first thing that comes to my mind is that the Process class doesn't lie: the process really is gone when it says so. So the problem is that at this point in time the file is still not fully available. I think this is an OS problem, because it still holds parts of the file in a cache that hasn't been fully written to disk, and the tool simply exited without flushing its file handles.
With this in mind, you should see in the log that the external tool created the file, exited, and only AFTER that the file is flushed/closed by the OS (maybe remove any filters once you have found this point in the log).
So if my assumptions are correct, the root cause is the bad behavior of your external tool, which you can't change. That leaves simply waiting a little after the process has exited and hoping that the timeout is long enough for the OS to flush/close the file (maybe try to open the file in a loop, with a timeout, until it succeeds).
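A sketch of that retry idea (the attempt count and delay are arbitrary):

using System.IO;
using System.Threading;

static FileStream OpenWhenReady(string path, int attempts = 20, int delayMs = 250)
{
    for (int i = 0; i < attempts; i++)
    {
        try
        {
            // Fails (and we retry) while the file is missing or still locked/being flushed.
            return new FileStream(path, FileMode.Open, FileAccess.Read, FileShare.None);
        }
        catch (IOException)
        {
            Thread.Sleep(delayMs);
        }
    }
    throw new IOException(path + " did not become readable in time.");
}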
There are two possibilities: either the process object continues to hold a reference to the process, so it has exited but hasn't yet been cleaned up, or you have a second instance of the process running. You should also compare the process Id to make sure. Try this:
....
// Start the program
process.Start();
while (!process.HasExited)
Thread.Sleep( 500 );
Process[] p = Process.GetProcessesByName( "testprogram" );
if ( p.Length != 0 && p[0].Id == process.Id && !p[0].HasExited)
throw new Exception("Oh oh");
For a start, is there an issue with using Process.WaitForExit rather than polling it?
Anyway, it is technically possible for the process to have exited from a usable point of view but still be around briefly while it does things like flushing the disk cache. Is the log file especially large (or is any operation it performs heavy on disk writes)?
As per the MSDN documentation for HasExited:
If a handle is open to the process, the operating system releases the process memory when the process has exited, but retains administrative information about the process, such as the handle, exit code, and exit time.
Probably not related, but it's worth noting.
If it's only a problem 1/10 of the time, and the process disappears after a second anyway, then depending on your usage of HasExited, try just adding another delay after the HasExited check succeeds, like this:
while (!process.HasExited)
    DoStuff();
Thread.Sleep(500);
Cleanup();
and see if the problem persists.
Personally, I've always just used the Exited event handler instead of any kind of polling, and a simplistic custom wrapper around System.Diagnostics.Process to handle things like thread safety, wrapping a call to CloseMainWindow() followed by WaitForExit(timeout) and finally Kill(), logging, et cetera, and never encountered a problem.
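For illustration, the stop sequence in such a wrapper can be as simple as this (a sketch, not the exact wrapper I use):

// Ask the process to close nicely, then wait, then kill as a last resort.
static void StopProcess(Process proc, int timeoutMs = 5000)
{
    if (proc.HasExited)
        return;

    proc.CloseMainWindow();             // graceful request; only effective for processes with a main window
    if (!proc.WaitForExit(timeoutMs))   // give it a chance to exit on its own
        proc.Kill();                    // force termination if it ignores the request

    proc.WaitForExit();                 // parameterless wait drains redirected output and lets Exited fire
}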
Maybe the problem is in the testprogram? Does that code flush/close everything nicely? It seems to me that if testprogram writes a file to disk, the file should at least exist (empty or not).
If you have a web application and your external program/process generates files (writes to disk), check whether your IIS user has rights to write to that folder; if not, add write permission for the IIS user in the folder's security properties. That was the reason in my case: I was getting process.HasExited == true, but the files produced by the process were not complete. After struggling for a while I added full permissions to the folder the process was writing to, and process.Refresh() as Zarathos described above, and everything worked as expected.
Use process_name.Refresh() before checking whether the process has exited or not. Refresh() clears all the cached information related to the process.
I'm currently writing myself a little C# backup program. I'm using a standard Windows Forms interface, calling cmd.exe as a new process, and then using XCOPY from within this new process. Everything's working great, except for this last feature I want to add: the ability to break the operation.
From a native command prompt, I can do this cleanly with Ctrl+C, but try as I might, I can't replicate this functionality using the WinForms-and-process approach. I've tried redirecting StandardInput and using that to send ConsoleSpecialKey.ControlC to the process; I've also tried sending 0x03 and "\x03", both of which I've read on other forum posts are the hex code for Ctrl+C. Nothing I'm sending is registered, though, and exiting the process kills the user interface but leaves xcopy.exe working in the background. Killing xcopy.exe manually results in it leaving the file it was copying half copied and corrupted, which is not something that happens using Ctrl+C in a command prompt.
Am I missing something blindingly obvious? I'm new-ish to C#, so I'll hold my hands up and admit this is most likely me being slow, or misunderstanding how the process is working with cmd.exe. However, since processes support standard input redirection, it seems like something that should work... to me at least. I've put the basic outline of my code below, in case it helps identify where I'm messing up.
string XCopyArguments = "\"" + dir.FullName + "\" \"" + destination + "\" /D /S /I /E";
Process XCopyProcess = new Process();
ProcessStartInfo XCopyStartInfo = new ProcessStartInfo();
XCopyStartInfo.FileName = "CMD.exe ";
XCopyStartInfo.RedirectStandardError = true;
XCopyStartInfo.RedirectStandardOutput = true;
XCopyStartInfo.RedirectStandardInput = true;
XCopyStartInfo.UseShellExecute = false;
XCopyStartInfo.CreateNoWindow = true;
XCopyStartInfo.Arguments = " /D /c XCOPY " + XCopyArguments;
XCopyProcess.EnableRaisingEvents = true;
XCopyProcess.StartInfo = XCopyStartInfo;
XCopyProcess.Start();
XCopyProcess.WaitForExit(15000);
int ExitCode = XCopyProcess.ExitCode;
if (ExitCode > 0 & !XCopyProcess.HasExited)
{
XCopyProcess.Kill();
}
XCopyProcess.Dispose();
Many thanks in advance for any help anyone can offer.
I don't want to be a besserwisser, but I think you'd be much better off doing the copying inside your program. Using File, Directory and the other classes in the System.IO namespace, it's really simple, and leaves you in full control to report progress, cancel operations etc.
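For example, a minimal recursive copy along those lines (no progress reporting or cancellation yet; that's easy to add once the loop is in your own code):

using System.IO;

static void CopyDirectory(string sourceDir, string destDir)
{
    Directory.CreateDirectory(destDir);

    // copy the files in this directory
    foreach (string file in Directory.GetFiles(sourceDir))
        File.Copy(file, Path.Combine(destDir, Path.GetFileName(file)), overwrite: true);

    // then recurse into the subdirectories
    foreach (string dir in Directory.GetDirectories(sourceDir))
        CopyDirectory(dir, Path.Combine(destDir, Path.GetFileName(dir)));
}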
Yes, doing the operation in .NET would be easier. BUT, I need to send ctrl-c to a process also and I don't have that option.
So can we please get an answer to this question?
EDIT: Do I have to post a duplicate to get an answer? And no, #j0rd4n didn't answer the question.
Like the others said, there are better ways to accomplish that particular task. However, that doesn't answer your question. I use a similar technique to what you have shown here for automating various tasks and find it quite useful. Sometimes things go very badly though and you want the process to bail out before things get worse. ;p
Here is the problem with your example:
XCopyStartInfo.CreateNoWindow = true;
Set it to false and the process will then respond to XCopyProcess.CloseMainWindow() and XCopyProcess.Close(). Much cleaner than using Kill().
It would require fewer lines of code to just loop over the subdirectories and files and copy them one by one, and then you wouldn't have to worry about controlling another process...
Sorry it's in VB.NET.
Declare Function GenerateConsoleCtrlEvent Lib "kernel32" ( _
    ByVal dwCtrlEvent As Integer, _
    ByVal dwProcessGroupId As Integer _
) As Integer

Private Const CTRL_C_EVENT As Integer = 0

Private Sub SendCtrlC()
    ' send a Ctrl-C to the whole console process group (0 = all processes sharing this console)
    GenerateConsoleCtrlEvent(CTRL_C_EVENT, 0)
    ' send a Ctrl-C to this process
    GenerateConsoleCtrlEvent(CTRL_C_EVENT, currentpid)
    ' send a Ctrl-C to the cmd process
    GenerateConsoleCtrlEvent(CTRL_C_EVENT, cmdpid)
End Sub
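For anyone wanting the same thing in C#, here is a rough equivalent (a sketch: GenerateConsoleCtrlEvent only reaches processes sharing your console, so it attaches to the target's console first, and the CancelKeyPress handler stops the Ctrl-C from killing your own process too):

using System;
using System.Runtime.InteropServices;

static class CtrlC
{
    const uint CTRL_C_EVENT = 0;

    [DllImport("kernel32.dll", SetLastError = true)]
    static extern bool GenerateConsoleCtrlEvent(uint dwCtrlEvent, uint dwProcessGroupId);

    [DllImport("kernel32.dll", SetLastError = true)]
    static extern bool AttachConsole(uint dwProcessId);

    [DllImport("kernel32.dll", SetLastError = true)]
    static extern bool FreeConsole();

    // Send Ctrl-C to the console the given process is attached to.
    public static void SendTo(int processId)
    {
        FreeConsole();                             // detach from our own console, if we have one
        if (!AttachConsole((uint)processId))       // attach to the target's console
            throw new InvalidOperationException("Could not attach to the target console.");

        Console.CancelKeyPress += (s, e) => e.Cancel = true; // ignore the Ctrl-C in this process
        GenerateConsoleCtrlEvent(CTRL_C_EVENT, 0); // 0 = all processes sharing that console
    }
}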
I've successfully sent a CTRL-C combination to a cmd.exe process created with SW_HIDE, i.e. a hidden cmd.exe window.
The technique was to use EnumWindows to identify the process and get its window handle (it still has a handle to process messages even if it's not visible).
Next, I used PostMessage to post a ctrl-c combination to the process. This had the same effect as if a user had hit 'ctrl-c' while the window was active.
To do this from C#, you would want to probably visit http://pinvoke.net/ - a lifesaver when it comes to writing Win32 API function prototypes in C#.
You'll have to manually kill the process if you want to handle the copying this way. In your code above, you are calling XCopyProcess.WaitForExit(...). This is a blocking call so the parent C# process will halt at that point until the child process has finished or the time-interval has elapsed.
What you could do, instead of blocking, is sleep in a loop and routinely check whether the user has requested to kill the process via your C# UI. If you receive that request, you explicitly kill the process; otherwise, you wait for another interval until the process has finished.
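Something along these lines (cancelRequested is a hypothetical flag that your UI's Cancel button would set):

// Poll in short intervals instead of blocking for the full 15 seconds.
while (!XCopyProcess.WaitForExit(250))   // returns true as soon as the process exits
{
    if (cancelRequested)                 // hypothetical flag set from the UI thread
    {
        XCopyProcess.Kill();
        break;
    }
}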
EDIT: I do agree with the other comments, though. Copy directly from the .NET framework instead of using xcopy.