I have a small wrapper application to give a GUI to an existing console application. I'm using the ProcessStartInfo and Process classes to bind to the .exe, and then using BeginErrorReadLine() and BeginOutputReadLine() to redirect any messages into the new GUI. Everything works fine except when the console application calls Console.Write() instead of Console.WriteLine(), in which case the text passed to Write is not displayed at all. I think the problem is that WriteLine appends a line break after the text and Write does not. Is there any way to work around this? I can't change Write to WriteLine in the original command line program, because Write is used to prompt for input.
Relevant Code:
var startInfo = new ProcessStartInfo(ServerFile);
startInfo.UseShellExecute = false; // required for stream redirection
startInfo.RedirectStandardInput = true;
startInfo.RedirectStandardError = true;
startInfo.RedirectStandardOutput = true;
ServerProc = new Process();
ServerProc.StartInfo = startInfo;
ServerProc.EnableRaisingEvents = true;
ServerProc.ErrorDataReceived += new DataReceivedEventHandler(ServerProc_ErrorDataReceived);
ServerProc.OutputDataReceived += new DataReceivedEventHandler(ServerProc_OutputDataReceived);
ServerProc.Start();
ServerProc.BeginErrorReadLine();
ServerProc.BeginOutputReadLine();
private void ServerProc_ErrorDataReceived(object sender, DataReceivedEventArgs e)
{
Dispatcher.Invoke(new Action(() =>
{
ConsoleTextBlock.Text += e.Data + "\r\n";
ConsoleScroll.ScrollToEnd();
}));
}
private void ServerProc_OutputDataReceived(object sender, DataReceivedEventArgs e)
{
Dispatcher.Invoke(new Action(() =>
{
ConsoleTextBlock.Text += e.Data + "\r\n";
ConsoleScroll.ScrollToEnd();
}));
}
The problem you are experiencing is that the Process class's event-based facilities are set up for convenient line-oriented processing of the process's output. You cannot use them if you need to read partial lines as they are being produced.
Nevertheless, the Process class does give you the tools for finer-grained control over the output than the line-oriented facilities provide. If you redirect the output, then Process.StandardOutput is a StreamReader, and you have the whole StreamReader API available instead of being forced to read entire lines.
For example, here is a character-by-character read of the standard output:
// Read the redirected output one character at a time and log when each character arrived
var start = DateTime.Now;
int n;
while ((n = ServerProc.StandardOutput.Read()) != -1)
{
var c = (char)n;
var delta = (DateTime.Now - start).TotalMilliseconds;
Console.WriteLine("c = {0} (0x{1:X}) delta = {2}",
char.IsWhiteSpace(c) ? '*' : c, n, delta);
}
If we run it against another console program that does this:
Console.Write("abc");
Thread.Sleep(1000);
Console.WriteLine("def");
It produces this output:
c = a (0x61) delta = 44.0025
c = b (0x62) delta = 44.0025
c = c (0x63) delta = 44.0025
c = d (0x64) delta = 1109.0634
c = e (0x65) delta = 1110.0635
c = f (0x66) delta = 1110.0635
c = * (0xD) delta = 1110.0635
c = * (0xA) delta = 1110.0635
which shows that the "abc" was read one second before the rest of the line.
However, this is not convenient if you like the event-oriented I/O already provided by Process. You can either:
- use the async Stream API, or
- use threads that do blocking reads,
and perhaps even roll your own event-based I/O on top of that to meet your needs.
You are writing a program that parses another program's character-oriented output, so without full lines you will need some sort of timeout that means "I am now satisfied that the program is done producing output for the time being." This can be easy or hard depending on how predictable the output is. For example, you might be able to recognize a prompt that ends with a "?". It depends on your situation.
The point is that you'll have to use the StandardOutput and StandardError StreamReader properties if you want something other than line-oriented I/O.
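For example, here is a rough sketch that combines the thread-based blocking read with such a timeout: a background thread reads ServerProc.StandardOutput character by character (so you must not also call BeginOutputReadLine() on that stream) and flushes whatever has accumulated once the stream has been idle for a short moment, so a prompt without a trailing newline still shows up. OnPartialOutput and the 200 ms idle window are assumptions, not part of the Process API; in the asker's WPF code, OnPartialOutput would append to ConsoleTextBlock via Dispatcher.Invoke.

var pending = new StringBuilder();
// Idle timer: when no new character has arrived for 200 ms, flush what we have.
var idle = new System.Timers.Timer(200) { AutoReset = false };
idle.Elapsed += (s, e) =>
{
    string chunk;
    lock (pending) { chunk = pending.ToString(); pending.Clear(); }
    if (chunk.Length > 0)
        OnPartialOutput(chunk);   // hypothetical: marshal to the UI thread and append
};

new Thread(() =>
{
    int n;
    while ((n = ServerProc.StandardOutput.Read()) != -1)   // blocking, character by character
    {
        lock (pending) { pending.Append((char)n); }
        idle.Stop();
        idle.Start();   // restart the idle window on every character
    }
    idle.Stop();
    string rest;
    lock (pending) { rest = pending.ToString(); pending.Clear(); }
    if (rest.Length > 0)
        OnPartialOutput(rest);    // flush whatever was left when the stream closed
}) { IsBackground = true }.Start();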
Related
I need to write a wrapper for an interactive command line program.
That means I need to be able to send commands to the other program via its standard input and receive the response via its standard output.
The problem is that the standard output stream seems to be blocked while the input stream is still open. As soon as I close the input stream, I get the response, but then I cannot send more commands.
This is what I am using at the moment (mostly from here):
void Main() {
Process process;
process = new Process();
process.StartInfo.FileName = "atprogram.exe";
process.StartInfo.Arguments = "interactive";
// Set UseShellExecute to false for redirection.
process.StartInfo.UseShellExecute = false;
process.StartInfo.CreateNoWindow = true;
// Redirect the standard output of the command.
// This stream is read asynchronously using an event handler.
process.StartInfo.RedirectStandardOutput = true;
// Set our event handler to asynchronously read the output.
process.OutputDataReceived += (s, e) => Console.WriteLine(e.Data);
// Redirect standard input as well. This stream is used synchronously.
process.StartInfo.RedirectStandardInput = true;
process.Start();
// Start the asynchronous read of the output stream.
process.BeginOutputReadLine();
String inputText;
do
{
inputText = Console.ReadLine();
if (inputText == "q")
{
process.StandardInput.Close(); // After this line the output stream unblocks
Console.ReadLine();
return;
}
else if (!String.IsNullOrEmpty(inputText))
{
process.StandardInput.WriteLine(inputText);
}
} while (true); // the loop exits via the return above when "q" is entered
}
I also tried reading the standard output stream synchronously, but with the same result. Any method call on the output stream blocks indefinitely until the input stream is closed, even Peek() and EndOfStream.
Is there any way to communicate with the other process in a full duplex kind of way?
I tried to reproduce your problem with a small test suite of my own.
Instead of using event handlers, I do it in the most trivial way I could conceive: synchronously. This way no extra complexity is added to the problem.
Here is my little "echoApp", which I wrote in Rust just for the giggles and also to have a chance to run into the eternal line-termination-wars problem (\n vs \r vs \r\n). Depending on how your command line application is written, this could indeed be one of your problems.
use std::io;
fn main() {
let mut counter = 0;
loop {
let mut input = String::new();
let _ = io::stdin().read_line(&mut input);
match &input.trim() as &str {
"quit" => break,
_ => {
println!("{}: {}", counter, input);
counter += 1;
}
}
}
}
And, being a lazy bone who does not like creating a whole solution for such a small test, I used F# instead of C# for the controlling side - it is easy enough to read, I think:
open System.Diagnostics;
let echoPath = @"E:\R\rustic\echo\echoApp\target\debug\echoApp.exe"
let createControlledProcess path =
let p = new Process()
p.StartInfo.UseShellExecute <- false
p.StartInfo.RedirectStandardInput <- true
p.StartInfo.RedirectStandardOutput <- true
p.StartInfo.Arguments <- ""
p.StartInfo.FileName <- path
p.StartInfo.CreateNoWindow <- true
p
let startupControlledProcess (p : Process) =
if p.Start()
then
p.StandardInput.NewLine <- "\r\n"
else ()
let shutdownControlledProcess (p : Process) =
p.StandardInput.WriteLine("quit");
p.WaitForExit()
p.Close()
let interact (p : Process) (arg : string) : string =
p.StandardInput.WriteLine(arg);
let o = p.StandardOutput.ReadLine()
// we get funny empty lines every other time...
// probably some line termination problem ( unix \n vs \r\n etc -
// who can tell what rust std::io does...?)
if o = "" then p.StandardOutput.ReadLine()
else o
let p = createControlledProcess echoPath
startupControlledProcess p
let results =
[
interact p "Hello"
interact p "World"
interact p "Whatever"
interact p "floats"
interact p "your"
interact p "boat"
]
shutdownControlledProcess p
Executing this in F# Interactive (Ctrl-A, Alt-Enter in Visual Studio) yields:
val echoPath : string = "E:\R\rustic\echo\echoApp\target\debug\echoApp.exe"
val createControlledProcess : path:string -> Process
val startupControlledProcess : p:Process -> unit
val shutdownControlledProcess : p:Process -> unit
val interact : p:Process -> arg:string -> string
val p : Process = System.Diagnostics.Process
val results : string list =
["0: Hello"; "1: World"; "2: Whatever"; "3: floats"; "4: your"; "5: boat"]
val it : unit = ()
I could not reproduce any blocking or deadlocks etc.
So, in your case I would try to investigate whether your NewLine property needs some tweaking (see the startupControlledProcess function). If the controlled application does not recognize an input as a complete line, it might not respond, still waiting for the rest of the input line, and you would get exactly the effect you are seeing.
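In C#, assuming a Process variable named process with RedirectStandardInput enabled, the equivalent tweak would look roughly like this (the exact NewLine value is an assumption and depends on what the controlled program expects):

process.Start();
// Make sure the line ending we send matches what the child program expects,
// and flush after every WriteLine so the child actually sees the line.
process.StandardInput.NewLine = "\r\n";   // or "\n", depending on the child program
process.StandardInput.AutoFlush = true;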
process.BeginOutputReadLine();
This does not work as expected, because it waits until the output stream is closed, which happens when the process ends, and the process ends when its input stream is closed.
As a workaround, just use process.StandardOutput.ReadLine() combined with asynchronous handling you write yourself.
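A minimal sketch of that workaround, assuming the process variable from the question (the Console.WriteLine is only a placeholder for whatever handling you need, and Task.Run needs System.Threading.Tasks):

process.Start();
// Pump stdout line by line on a background task instead of calling BeginOutputReadLine().
var pumpTask = Task.Run(() =>
{
    string line;
    while ((line = process.StandardOutput.ReadLine()) != null)
        Console.WriteLine(line);
});
// stdin stays open, so further commands can still be sent:
process.StandardInput.WriteLine("some command");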
I'm currently rendering the output of a command line process into a text box. The problem is that in a normal command prompt window, one of the lines that is written has a load-bar kind of thing, where every few seconds it outputs a "." to the screen. After a few dots, it starts a new line and then continues loading until it has completed its process.
With the following code, instead of getting these "." characters one by one, my OutputDataReceived handler waits for the whole line to be written out, so the load bar is useless; i.e., it waits for "............." and then acts upon it.
Is there a way to keep track of every character being output to the screen rather than what seems to be per-line output?
//Create process
System.Diagnostics.Process process = new System.Diagnostics.Process();
// arguments.ProcessStartInfo contains the following declaration:
// ProcessStartInfo = new ProcessStartInfo( "Cmd.exe" )
// {
// WorkingDirectory = executableDirectoryName,
// UseShellExecute = false,
// RedirectStandardInput = true,
// RedirectStandardOutput = true,
// CreateNoWindow = true,
// }
process.StartInfo = arguments.ProcessStartInfo;
//Start the process
StringBuilder sb = new StringBuilder();
bool alreadyThrownExit = false;
// The following event only seems to fire once per line of output rather than per character, which makes the load bar useless
process.OutputDataReceived += ( sender, e ) =>
{
sb.AppendLine( e.Data );
CommandLineHelper.commandLineOutput = sb.ToString();
arguments.DelegateUpdateTextMethod();
if( !alreadyThrownExit )
{
if( process.HasExited )
{
alreadyThrownExit = true;
arguments.DelegateFinishMethod();
process.Close();
}
}
};
process.Start();
process.StandardInput.WriteLine( arguments.Command );
process.StandardInput.WriteLine( "exit" );
process.BeginOutputReadLine();
If you want asynchronous processing of the stdout of the given process on a per-character basis, you can use the TextReader.ReadAsync() method. Instead of the code you have to handle the OutputDataReceived event, just do something like this:
process.Start();
// Ignore Task object, but make the compiler happy
var _ = ConsumeReader(process.StandardOutput);
process.StandardInput.WriteLine( arguments.Command );
process.StandardInput.WriteLine( "exit" );
where:
async Task ConsumeReader(TextReader reader)
{
char[] buffer = new char[1];
while ((await reader.ReadAsync(buffer, 0, 1)) > 0)
{
// process character...for example:
Console.Write(buffer[0]);
}
}
Alternatively, you could just create a dedicated thread and use that to call TextReader.Read() in a loop:
process.Start();
new Thread(() =>
{
int ch;
while ((ch = process.StandardOutput.Read()) >= 0)
{
// process character...for example:
Console.Write((char)ch);
}
}).Start();
process.StandardInput.WriteLine( arguments.Command );
process.StandardInput.WriteLine( "exit" );
IMHO the latter is more efficient, as it doesn't require as much cross-thread synchronization. But the former is more similar to the event-driven approach you would have had with the OutputDataReceived event.
I've built a Forms app that I've been using for some time. Now I want to catch the StandardError of my process as well as its StandardOutput.
I've looked at answers on SO and MSDN and yet I can't get it right.
My code :
public void RunProcess(string FileName, string Arguments, bool EventWhenExit = false, bool IsPrintOutput = true)
{
process = new Process();
process.ErrorDataReceived += new DataReceivedEventHandler(OnDataReceivedEvent);
if (IsPrintOutput) process.OutputDataReceived += new DataReceivedEventHandler(OnDataReceivedEvent);
process.StartInfo.RedirectStandardOutput = true;
process.StartInfo.RedirectStandardError = true;
process.StartInfo.CreateNoWindow = true;
process.StartInfo.UseShellExecute = false;
process.StartInfo.FileName = FileName;
process.StartInfo.Arguments = Arguments;
if (EventWhenExit)
{
process.EnableRaisingEvents = true;
process.Exited += new EventHandler(myprocess_Exited);
}
process.Start();
process.BeginOutputReadLine();
//run polling on stored logs to print them to screen
PollingService();
}
I've checked it with iperf and I see that when I run it with correct arguments I get the correct output,
but when I just run it without any arguments, in cmd I get:
C:\>iperf.exe
Usage: iperf [-s|-c host] [options]
Try `iperf --help' for more information.
And in my app I get nothing!
What am I missing here?
Thanks
You can stop reading here! If you want to see the details of the inner methods, continue below:
private void OnDataReceivedEvent(object sender, DataReceivedEventArgs e)
{
string ProcessOutput = e.Data;
ProcessLog.Add(e.Data);
}
private void PollingService()
{
var T = new Thread (()=>
{
while (true /* ProcessRunning*/)
{
if (ProcessLogIndex < ProcessLog.Count)
{
lock (this)
{
var tempList = ProcessLog.GetRange(ProcessLogIndex, ProcessLog.Count - ProcessLogIndex);
ProcessLogIndex = ProcessLog.Count;
foreach (var ToSend in tempList)
{
onDataOutputFromProcess(this, ToSend, sProcessNameID.ToString());
}
}
}
Thread.Sleep(400);
}
});
T.IsBackground = true;
T.Start();
}
I don't see a call to BeginErrorReadLine() anywhere in the code you posted. If you don't call that method, then the Process class won't actually redirect the stderr to your event handler.
I believe the above is the issue, but if you are actually calling that somewhere (and just didn't show it), then it is worth considering that there are some strange console programs out there that don't actually use stderr (or stdout) for error output. Instead, they write directly to the console window or some other non-standard mechanism. In those cases, you won't be able to receive the error output by redirecting stderr.
You can identify those programs by redirecting their output at the command line with e.g. iperf.exe 2> foo.txt. The stderr file handle is 2, so that syntax redirects that handle to a file named foo.txt. If the file is empty and you still see errors on the screen, then the program is one of those strange programs.
But really, I think you probably just forgot to call BeginErrorReadLine(). :)
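For reference, this is roughly what the start-up portion of RunProcess should look like; everything else can stay as in the question:

process.Start();
process.BeginOutputReadLine();
process.BeginErrorReadLine();   // without this, ErrorDataReceived is never raised
//run polling on stored logs to print them to screen
PollingService();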
Note: This question has changed significantly since the first version, so some comments or answers could seem weird. Please, check the edit history if something seems weird.
I am launching a child process from a C# class library.
I am using Process.BeginOutputReadLine() to read the output/error in an asynchronous way. I thought it didn't work with very long lines, but the problem seems to be that it's not scalable. In my computer, a 128 kb line is processed instantly, a 512 kb line seems to take around one minute, 1 mb seems to take several minutes, and I've been around two hours waiting for a 10 mb line to be processed, and it was still working when I cancelled it.
It seems easy to fix by reading directly from the StandardOutput and StandardError streams, but the data from those streams is buffered. If I get enough data from stdout to fill the buffer, and then some more data from stderr, I can't find a way to check whether there's data pending in one of them, and if I try to read from stderr, it hangs forever.
Why is this happening, what am I doing wrong, and what's the right way to do this?
Some code samples to illustrate what I'm trying to achieve.
Program1:
// Writes a lot of data to stdout and stderr
static void Main(string[] args)
{
int numChars = 512 * 1024;
StringBuilder sb = new StringBuilder(numChars);
String s = "1234567890";
for (int i = 0; i < numChars; i++)
sb.Append(s[i % 10]);
int len = sb.Length;
Console.WriteLine(sb.ToString());
Console.Error.WriteLine(sb.ToString());
}
Program2:
// Calls Program1 and tries to read its output.
static void Main(string[] args)
{
StringBuilder sb = new StringBuilder();
StringBuilder sbErr = new StringBuilder();
Process proc = new Process();
proc.StartInfo.CreateNoWindow = true;
proc.StartInfo.WindowStyle = ProcessWindowStyle.Hidden;
proc.StartInfo.UseShellExecute = false;
proc.StartInfo.RedirectStandardError = true;
proc.StartInfo.RedirectStandardOutput = true;
proc.StartInfo.RedirectStandardInput = false;
proc.StartInfo.Arguments = "";
proc.StartInfo.FileName = "program1.exe";
proc.ErrorDataReceived += (s, ee) => { if (ee.Data != null) sbErr.AppendLine(ee.Data); };
proc.OutputDataReceived += (s, ee) => { if (ee.Data != null) sb.AppendLine(ee.Data); };
proc.Start();
proc.BeginOutputReadLine();
proc.BeginErrorReadLine();
proc.WaitForExit();
}
Program1 has a constant that allows to set the size of data to generate, and Program2 launches Program1 and tries to read the data. I should expect the time to grow linearly with size, but it seems much worse than that.
I hope I understand your problem. The application hangs on Process.WaitForExit() because that is what Process.WaitForExit() does: it waits for the process to exit.
You might want to call it on a new thread.
In your method that creates the process:
Thread trd = new Thread(new ParameterizedThreadStart(Start));
trd.Start(proc); // pass the Process instance so the thread can wait on it
and add this method:
private void Start(object o)
{
((Process)o).WaitForExit();
// your code after process ended
}
I need to run an external program for every PDF file in specified directory.
The problem is: how do I limit the number of external program processes to a user-specified value? I run them in a loop, like this:
foreach(string file in Directory.GetFiles(sourcePath))
{
Process p = new Process();
p.StartInfo.FileName = @"path\program.exe";
p.StartInfo.Arguments = previouslySetArguments;
p.Start();
}
Now the problem is that there is sometimes a really huge number of files, and with that code all the processes would be run at the same time. It really slows the machine down.
The other idea is to put p.WaitForExit(); after p.Start();, but then it would run only one process at a time, which, on the other hand, slows down the whole job :)
What is the easiest way to limit the number of processes so that exactly that many run at the same time? I want to let the user decide.
Let's say I want to run a maximum of 5 processes at once. So:
- the first 5 processes start at more or less the same time, for the first 5 files in the directory
- when one of them (doesn't matter which) finishes, the next one starts, for the next file
If I were you I would look into the producer-consumer model of queuing. It's intended to do pretty much exactly this, and there are lots of good examples that you can modify to suit your needs.
Here's an example:
C# producer/consumer
And another example:
http://msdn.microsoft.com/en-us/library/hh228601%28v=vs.110%29.aspx
(this last one is for 4.5, but still valid IMO)
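To make the idea concrete, here is a minimal sketch of the producer-consumer approach with plain threads and a BlockingCollection. It is not a drop-in solution; the directory path and the way the file name is passed as an argument are assumptions based on the question.

using System;
using System.Collections.Concurrent;
using System.Diagnostics;
using System.IO;
using System.Threading;

class ProcessLimiter
{
    static void Main()
    {
        string sourcePath = @"c:\pdfs";   // hypothetical path to the directory with the PDF files
        int maxProcesses = 5;             // the user-specified limit

        // Producer: queue up every file that needs processing, then mark the queue complete.
        var queue = new BlockingCollection<string>();
        foreach (string file in Directory.GetFiles(sourcePath))
            queue.Add(file);
        queue.CompleteAdding();

        // Consumers: exactly maxProcesses workers; each runs one external process at a
        // time and blocks on WaitForExit, so at most maxProcesses processes run at once.
        var workers = new Thread[maxProcesses];
        for (int i = 0; i < maxProcesses; i++)
        {
            workers[i] = new Thread(() =>
            {
                foreach (string file in queue.GetConsumingEnumerable())
                {
                    using (var p = new Process())
                    {
                        p.StartInfo.FileName = @"path\program.exe";
                        p.StartInfo.Arguments = "\"" + file + "\"";   // assumption: the tool takes the file as its argument
                        p.Start();
                        p.WaitForExit();   // blocks this worker only; the other workers keep going
                    }
                }
            });
            workers[i].Start();
        }

        foreach (var w in workers)
            w.Join();
    }
}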
OK, so I tried the answers from this question, but, strange thing, I couldn't get them to work. I admit I was in a hurry and could have made some mistakes...
For now, the quickest and simplest (and ugliest) method I've found is just the loop in the code below. It works with a test program that just calls Thread.Sleep() with the given command line argument as milliseconds.
Now, please explain to me why this is not a good solution. I assume it is not the correct way (and not only because the code is ugly), even if it works with this test example.
class Program
{
// hardcoded, it's just a test:
static int activeProcesses = 0;
static int finishedProcesses = 0;
static int maxProcesses = 5;
static int runProcesses = 20;
static string file = #"c:\tmp\dummyDelay.exe";
static void Main(string[] args)
{
Random rnd = new Random();
while (activeProcesses + finishedProcesses < runProcesses)
{
if (activeProcesses < maxProcesses)
{
Process p = new Process();
p.EnableRaisingEvents = true;
p.Exited += new EventHandler(pExited);
p.StartInfo.WindowStyle = ProcessWindowStyle.Hidden;
p.StartInfo.FileName = file;
p.StartInfo.Arguments = rnd.Next(2000, 5000).ToString();
p.Start();
Console.WriteLine("Started: {0}", p.Id.ToString());
activeProcesses++;
}
}
}
static void pExited(object sender, EventArgs e)
{
Console.WriteLine("Finished: {0}", ((Process)sender).Id.ToString());
((Process)sender).Dispose();
activeProcesses--;
finishedProcesses++;
}
}