Process often exiting prematurely - C#

I have this small piece of code (easy to try):
class Program
{
static void Main(string[] args)
{
List<string> paths = new List<string>() { "1.docx", "2.docx", "3.docx", "4.docx" };
foreach (string path in paths)
{
string path2 = @"Path\to\those\files" + path;
Process process = new Process();
process.StartInfo = new ProcessStartInfo(path2);
System.Diagnostics.Debug.WriteLine("~>" + path2);
process.Exited += (s1, e1) => process_Exited(path2);
process.EnableRaisingEvents = true;
process.Start();
}
while (true) { }
}
static void process_Exited(string path)
{//<~ breakpoint here
}
}
What is happening is that sometimes the breakpoint is hit moments after I start the application, even though it should have to wait for the processes to exit one after another. It is always the last file of all, be it 2 files, 3 files, 4 files or more. The only time the breakpoint is never hit prematurely is when paths contains only one file. (By the way, I might not care much about this strange behaviour, but when I actually close the .docx file, the last one from the paths list, the breakpoint is not hit.)
Why is this (process sometimes exiting prematurely) happening, and how can I prevent it?
UPDATE: I just noticed it is not necessarily the last of the files in paths; sometimes it is a random one.

Eventually, unable to solve the primary issue, I ended up using NetOffice:
NetOffice.WordApi.Application wordApp = new NetOffice.WordApi.Application();
wordApp.Visible = true;
NetOffice.WordApi.Document doc = wordApp.Documents.Open(path);
which solved the premature closing.
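A likely explanation (not stated in the thread, so treat it as an assumption): opening a .docx through the shell starts a short-lived launcher process that hands the file to an already-running WINWORD instance and then exits, so the Exited event fires long before the document is actually closed. Under that assumption, a minimal sketch that waits on Word itself instead of on the per-file processes would look like this:
// Open the documents via the shell, then wait until every WINWORD process has exited.
// "WINWORD" is the usual process name for Microsoft Word; adjust if yours differs.
foreach (string path in paths)
{
    Process.Start(new ProcessStartInfo(@"Path\to\those\files" + path) { UseShellExecute = true });
}

Process[] word;
while ((word = Process.GetProcessesByName("WINWORD")).Length > 0)
{
    foreach (Process w in word) w.Dispose();
    System.Threading.Thread.Sleep(1000);
}
System.Diagnostics.Debug.WriteLine("All Word instances have exited.");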

Related

How to report custom error strings from child process while preserving debugger functionality

I have a program that creates a subprocess and tries to handle any error messages as per below:
study.StartInfo.FileName = studyWorkingDir +"\\"+ clargParts[0];
study.StartInfo.Arguments = clargs;
study.StartInfo.UseShellExecute = false;
study.StartInfo.RedirectStandardError = true;
study.ErrorDataReceived += (sender, args) =>
{
Process process = (Process)sender;
if (!process.HasExited)
{
MyErrorBroadcaster.BroadcastMessage("Instance error: " + args.Data.ToString());
process.Kill();
}
};
study.Start(); // Start the Study!
study.PriorityClass = ProcessPriorityClass.BelowNormal;
study.BeginErrorReadLine();
The child process is an application (given below). In order to actually get a non-empty error in the master process above, I've needed to wrap the whole sub-program in a try-catch that makes a custom Console.Error.WriteLine() call:
static class Program
{
static void Main()
{
try
{
Console.Title = "MyProgram";
Application.Run(new GUI());
}
catch (Exception e)
{
Console.Error.WriteLine(e.ToString().Replace("\r\n","\t"));
}
}
}
The problem is, this try-catch in the child program prevents the debugger from stopping on the actual error locations in Visual Studio, which I think will annoy/be problematic for me and my coworkers as we develop. (Seems kludgy too!)
Is there a better way to have this sub-application report errors to the master process, without affecting normal debugger operation?
Revisiting this, the problem was that the master process was only getting the errors one line at a time, and that first line happened to be "", after which it immediately killed the process (ignoring the following error lines). The e.ToString().Replace("\r\n","\t") bypassed this issue by guaranteeing the error was a single line, but with unwanted side effects.
A better solution for me was the following:
study.StartInfo.UseShellExecute = false;
study.StartInfo.RedirectStandardError = true;
study.Start(); // Start the Study!
//This thread collects errors until the process ends, then reports them
(new Thread(() =>
{
//Accumulate errors, then send
String err = study.StandardError.ReadToEnd();
if (!string.IsNullOrEmpty(err))
MyErrorBroadcaster.BroadcastMessage("Instance Error", "\tInstance error: " + err);
})).Start();
This approach uses study.StandardError.ReadToEnd(). That call is blocking, which would be unsuitable for my purposes, so it is run inside a lambda on a separate thread (so the main program can move on!).
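On a newer runtime, roughly the same idea can be written with a Task and ReadToEndAsync instead of a raw Thread (a sketch, not from the original answer; MyErrorBroadcaster is the poster's own type):
study.StartInfo.UseShellExecute = false;
study.StartInfo.RedirectStandardError = true;
study.Start();

// Drain stderr in the background; report it once the stream closes.
Task.Run(async () =>
{
    string err = await study.StandardError.ReadToEndAsync();
    if (!string.IsNullOrEmpty(err))
        MyErrorBroadcaster.BroadcastMessage("Instance Error", "\tInstance error: " + err);
});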

Can't get process error output using process.ErrorDataReceived - C#

I've built a Forms app that I have been using for some time. Now I want to capture the StandardError of my process as well as its StandardOutput.
I've looked at answers on SO and MSDN and yet I can't get it right.
My code:
public void RunProcess(string FileName, string Arguments, bool IsPrintOutput = true)
{
process = new Process();
process.ErrorDataReceived += new DataReceivedEventHandler(OnDataReceivedEvent);
if (IsPrintOutput) process.OutputDataReceived += new DataReceivedEventHandler(OnDataReceivedEvent);
process.StartInfo.RedirectStandardOutput = true;
process.StartInfo.RedirectStandardError = true;
process.StartInfo.CreateNoWindow = true;
process.StartInfo.UseShellExecute = false;
process.StartInfo.FileName = FileName;
process.StartInfo.Arguments = Arguments;
if (EventWhenExit)
{
process.EnableRaisingEvents = true;
process.Exited += new EventHandler(myprocess_Exited);
}
process.Start();
process.BeginOutputReadLine();
//run polling on stored logs to print them to screen
PollingService();
}
I've checked it with iperf, and when I run it with the correct arguments I get the correct output.
But when I run it without any argument, from cmd I get:
C:\>iperf.exe
Usage: iperf [-s|-c host] [options]
Try `iperf --help' for more information.
while from my app I get nothing!
What am I missing here?
Thanks
You can stop reading here! If you want to see the details of the inner methods, continue below:
private void OnDataReceivedEvent(object sender, DataReceivedEventArgs e)
{
string ProcessOutput = e.Data;
ProcessLog.Add(e.Data);
}
private void PollingService()
{
var T = new Thread (()=>
{
while (true /* ProcessRunning*/)
{
if (ProcessLogIndex < ProcessLog.Count)
{
lock (this)
{
var tempList = ProcessLog.GetRange(ProcessLogIndex, ProcessLog.Count - ProcessLogIndex);
ProcessLogIndex = ProcessLog.Count;
foreach (var ToSend in tempList)
{
onDataOutputFromProcess(this, ToSend, sProcessNameID.ToString());
}
}
}
Thread.Sleep(400);
}
});
T.IsBackground = true;
T.Start();
}
I don't see a call to BeginErrorReadLine() anywhere in the code you posted. If you don't call that method, the Process class won't actually redirect stderr to your event handler.
I believe the above is the issue, but if you are actually calling that somewhere (and just didn't show it), then it is worth considering that there are some strange console programs out there that don't actually use stderr (or stdout) for error output. Instead, they write directly to the console window or use some other non-standard mechanism. In those cases, you won't be able to receive the error output by redirecting stderr.
You can identify those programs by redirecting their output at the command line with e.g. iperf.exe 2> foo.txt. The stderr file handle is 2, so that syntax redirects that file handle to a file named foo.txt. If the file is empty and you see errors on the screen, then the program is one of those strange programs.
But really, I think you probably just forgot to call BeginErrorReadLine(). :)
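Concretely, the fix would be a one-line addition right after BeginOutputReadLine() in the RunProcess method from the question (a sketch; the surrounding lines are the poster's own):
process.Start();
process.BeginOutputReadLine();
process.BeginErrorReadLine(); // without this, ErrorDataReceived never fires
//run polling on stored logs to print them to screen
PollingService();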

How to set maximum number of external processes the program can start at the same time?

I need to run an external program for every PDF file in specified directory.
The problem is - how to limit the number of external program processes to user-specified value? I run it in the loop, like this:
foreach(string file in Directory.GetFiles(sourcePath))
{
Process p = new Process();
p.StartInfo.FileName = @"path\program.exe";
p.StartInfo.Arguments = previouslySetArguments;
p.Start();
}
Now the problem is that there is sometimes a really huge number of files, and with that code all the processes would run at the same time. It really slows the machine down.
The other idea is to put p.WaitForExit(); after the p.Start();, but then only one process would run at a time, which on the other hand slows down the whole job :)
What is the easiest way to limit the number of processes so that exactly that many run at the same time? I want to let the user decide.
Let's say I want to run a maximum of 5 processes at once. So:
- the first 5 processes start at more or less the same time, for the first 5 files in the directory
- when one of them finishes (it doesn't matter which), the next one starts - for the next file
If I were you I would look into the producer-consumer model of queuing. It's intended to do pretty much exactly this, and there are lots of good examples that you can modify to suit your needs.
Here's an example:
C# producer/consumer
And another example:
http://msdn.microsoft.com/en-us/library/hh228601%28v=vs.110%29.aspx
(this last one is for 4.5, but still valid IMO)
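For comparison, here is a minimal sketch (my own, not from the answers above) that caps concurrency without a full producer/consumer queue by letting the TPL limit the degree of parallelism; sourcePath and previouslySetArguments are the variables from the question:
// At most 5 copies of program.exe run at once; each lambda blocks until its process exits.
Parallel.ForEach(
    Directory.GetFiles(sourcePath),
    new ParallelOptions { MaxDegreeOfParallelism = 5 }, // user-specified limit
    file =>
    {
        using (Process p = Process.Start(@"path\program.exe", previouslySetArguments))
        {
            p.WaitForExit();
        }
    });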
OK, so I tried the answers from this question, but - strange thing - I couldn't get them to work. I admit, I was in a hurry; I could have made some mistakes...
For now, the quickest and simplest (and ugliest) method I've found is just a loop, like the code below. It works with a test program that just calls Thread.Sleep() with the given command line argument as milliseconds.
Now, please explain to me why this is not a good solution - I assume it is not the correct way (not only because the code is ugly), even if it works with this test example.
class Program
{
// hardcoded, it's just a test:
static int activeProcesses = 0;
static int finishedProcesses = 0;
static int maxProcesses = 5;
static int runProcesses = 20;
static string file = @"c:\tmp\dummyDelay.exe";
static void Main(string[] args)
{
Random rnd = new Random();
while (activeProcesses + finishedProcesses < runProcesses)
{
if (activeProcesses < maxProcesses)
{
Process p = new Process();
p.EnableRaisingEvents = true;
p.Exited += new EventHandler(pExited);
p.StartInfo.WindowStyle = ProcessWindowStyle.Hidden;
p.StartInfo.FileName = file;
p.StartInfo.Arguments = rnd.Next(2000, 5000).ToString();
p.Start();
Console.WriteLine("Started: {0}", p.Id.ToString());
activeProcesses++;
}
}
}
static void pExited(object sender, EventArgs e)
{
Console.WriteLine("Finished: {0}", ((Process)sender).Id.ToString());
((Process)sender).Dispose();
activeProcesses--;
finishedProcesses++;
}
}
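Two things stand out in the loop above (my observations, not from the thread): the while loop spins on a CPU core because it never sleeps, and activeProcesses/finishedProcesses are touched from both the main thread and the Exited callbacks without any synchronization. A rough sketch of the same idea with those two issues addressed:
// Same throttling idea, but sleeping between checks and using Interlocked so the
// counter is safe to update from the Exited event handlers.
static int activeProcesses = 0;
static int maxProcesses = 5;
static int runProcesses = 20;
static string file = @"c:\tmp\dummyDelay.exe";

static void Main(string[] args)
{
    Random rnd = new Random();
    int started = 0;
    while (started < runProcesses)
    {
        if (Thread.VolatileRead(ref activeProcesses) < maxProcesses)
        {
            Process p = new Process();
            p.EnableRaisingEvents = true;
            p.Exited += (s, e) =>
            {
                Interlocked.Decrement(ref activeProcesses);
                ((Process)s).Dispose();
            };
            p.StartInfo.WindowStyle = ProcessWindowStyle.Hidden;
            p.StartInfo.FileName = file;
            p.StartInfo.Arguments = rnd.Next(2000, 5000).ToString();
            Interlocked.Increment(ref activeProcesses);
            p.Start();
            started++;
        }
        else
        {
            Thread.Sleep(100); // wait for a free slot instead of busy-waiting
        }
    }
    // Wait for the remaining processes before letting Main return.
    while (Thread.VolatileRead(ref activeProcesses) > 0)
        Thread.Sleep(100);
}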

C# FileSystemWatcher WaitForChanged Method only detects one file change

I got a problem similar to this one: FileSystemWatcher - only the change event once firing once?
But since that thread is two years old and my code is a bit different, I decided to open a new question.
Well, here's my code:
while (true)
{
FileSystemWatcher fw = new FileSystemWatcher();
fw.Path = @"Z:\";
fw.Filter = "*.ini";
fw.WaitForChanged(WatcherChangeTypes.All);
Console.WriteLine("File changed, starting script...");
//if cleanup
try
{
if (File.ReadAllLines(@"Z:\file.ini")[2] == "cleanup")
{
Console.WriteLine("Cleaning up...");
Process c = new Process();
c.StartInfo.FileName = System.Environment.GetFolderPath(Environment.SpecialFolder.Desktop).Trim('\\') + @"\clean.exe";
c.StartInfo.WorkingDirectory = System.Environment.SpecialFolder.DesktopDirectory.ToString();
c.Start();
c.WaitForExit();
Console.WriteLine("Done with cleaning up, now starting script...");
}
}
catch
{
Console.WriteLine("No cleanup parameter found.");
}
Process p = new Process();
p.StartInfo.FileName = System.Environment.GetFolderPath(Environment.SpecialFolder.Desktop).Trim('\\') + @"\go.exe";
p.StartInfo.WorkingDirectory = System.Environment.SpecialFolder.DesktopDirectory.ToString();
p.Start();
Console.WriteLine("Script running...");
p.WaitForExit();
fw = null;
Console.WriteLine("Done. Waiting for next filechange...");
}
Problem: this program should detect a change to the file "Z:\file.ini". If it has changed, a script should be fired. When the script is done, the program should return to the start and watch for changes again (that's why I used the while loop).
Well, the first change is detected and everything seems to be working just fine, but any changes AFTER the first one are not detected. I tried to set the FileSystemWatcher object to null, as you can see, but it didn't help.
So, I hope for good answers. Thanks.
I would change your design so you don't rely on the FileSystemWatcher alone for detecting changes. Instead, poll the directory or the file that you're watching for changes. You can then use the FileSystemWatcher in conjunction with this to wake the loop up as soon as possible when we know there are changes. This way, if you miss an event, you'd still recover from it based on your poll time-out.
e.g.
static void Main(string[] args)
{
FileSystemWatcher watcher = new FileSystemWatcher(@"f:\");
ManualResetEvent workToDo = new ManualResetEvent(false);
watcher.NotifyFilter = NotifyFilters.LastWrite;
watcher.Changed += (source, e) => { workToDo.Set(); };
watcher.Created += (source, e) => { workToDo.Set(); };
// begin watching
watcher.EnableRaisingEvents = true;
while (true)
{
if (workToDo.WaitOne(TimeSpan.FromSeconds(30))) // poll time-out; example value
{
workToDo.Reset();
Console.WriteLine("Woken up, something has changed.");
}
else
Console.WriteLine("Timed-out, check if there is any file changed anyway, in case we missed a signal");
foreach (var file in Directory.EnumerateFiles(@"f:\"))
Console.WriteLine("Do your work here");
}
}

Hanging process when run with .NET Process.Start -- what's wrong?

I wrote a quick and dirty wrapper around svn.exe to retrieve some content and do something with it, but for certain inputs it occasionally and reproducibly hangs and won't finish. For example, one call is to svn list:
svn list "http://myserver:84/svn/Documents/Instruments/" --xml --no-auth-cache --username myuser --password mypassword
This command line runs fine when I just do it from a command shell, but it hangs in my app. My C# code to run this is:
string cmd = "svn.exe";
string arguments = "list \"http://myserver:84/svn/Documents/Instruments/\" --xml --no-auth-cache --username myuser --password mypassword";
int ms = 5000;
ProcessStartInfo psi = new ProcessStartInfo(cmd);
psi.Arguments = arguments;
psi.RedirectStandardOutput = true;
psi.WindowStyle = ProcessWindowStyle.Normal;
psi.UseShellExecute = false;
Process proc = Process.Start(psi);
StreamReader output = new StreamReader(proc.StandardOutput.BaseStream, Encoding.UTF8);
proc.WaitForExit(ms);
if (proc.HasExited)
{
return output.ReadToEnd();
}
This takes the full 5000 ms and never finishes. Extending the time doesn't help. In a separate command prompt, it runs instantly, so I'm pretty sure it's unrelated to an insufficient waiting time. For other inputs, however, this seems to work fine.
I also tried running a separate cmd.exe here (where exe is svn.exe and args is the original arg string), but the hang still occurred:
string cmd = "cmd";
string arguments = "/S /C \"" + exe + " " + args + "\"";
What could I be screwing up here, and how can I debug this external process stuff?
EDIT:
I'm just now getting around to addressing this. Mucho thanks to Jon Skeet for his suggestion, which indeed works great. I have another question about my method of handling this, though, since I'm a multi-threaded novice. I'd like suggestions on improving any glaring deficiencies or anything otherwise dumb. I ended up creating a small class that contains the stdout stream, a StringBuilder to hold the output, and a flag to tell when it's finished. Then I used ThreadPool.QueueUserWorkItem and passed in an instance of my class:
ProcessBufferHandler bufferHandler = new ProcessBufferHandler(proc.StandardOutput.BaseStream,
Encoding.UTF8);
ThreadPool.QueueUserWorkItem(ProcessStream, bufferHandler);
proc.WaitForExit(ms);
if (proc.HasExited)
{
bufferHandler.Stop();
return bufferHandler.ReadToEnd();
}
... and ...
private class ProcessBufferHandler
{
public Stream stream;
public StringBuilder sb;
public Encoding encoding;
public State state;
public enum State
{
Running,
Stopped
}
public ProcessBufferHandler(Stream stream, Encoding encoding)
{
this.stream = stream;
this.sb = new StringBuilder();
this.encoding = encoding;
state = State.Running;
}
public void ProcessBuffer()
{
sb.Append(new StreamReader(stream, encoding).ReadToEnd());
}
public string ReadToEnd()
{
return sb.ToString();
}
public void Stop()
{
state = State.Stopped;
}
}
This seems to work, but I'm doubtful that this is the best way. Is this reasonable? And what can I do to improve it?
One standard issue: the process could be waiting for you to read its output. Create a separate thread to read from its standard output while you're waiting for it to exit. It's a bit of a pain, but that may well be the problem.
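A minimal sketch of that separate reader thread, applied to the code from the question (proc and ms are the question's variables; everything else here is illustrative):
StringBuilder output = new StringBuilder();
Thread reader = new Thread(() =>
{
    // Drain stdout continuously so the child never blocks on a full pipe.
    output.Append(new StreamReader(proc.StandardOutput.BaseStream, Encoding.UTF8).ReadToEnd());
});
reader.Start();

if (proc.WaitForExit(ms))
{
    reader.Join(); // make sure the reader has finished draining the stream
    return output.ToString();
}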
Jon Skeet is right on the money!
If you don't mind polling after you launch your svn command, try this:
Process command = new Process();
command.EnableRaisingEvents = false;
command.StartInfo.FileName = "svn.exe";
command.StartInfo.Arguments = "your svn arguments here";
command.StartInfo.UseShellExecute = false;
command.StartInfo.RedirectStandardOutput = true;
command.Start();
while (!command.StandardOutput.EndOfStream)
{
Console.WriteLine(command.StandardOutput.ReadLine());
}
I had to drop an exe on a client's machine and use Process.Start to launch it.
The calling application would hang - the issue ended up being their machine assuming the exe was dangerous and preventing other applications from starting it.
Right click the exe and go to properties. Hit "Unblock" toward the bottom next to the security warning.
Based on Jon Skeet's answer, this is how I do it in modern-day (2021) .NET 5:
var process = Process.Start(processStartInfo);
var stdErr = process.StandardError;
var stdOut = process.StandardOutput;
var resultAwaiter = stdOut.ReadToEndAsync();
var errResultAwaiter = stdErr.ReadToEndAsync();
await process.WaitForExitAsync();
await Task.WhenAll(resultAwaiter, errResultAwaiter);
var result = resultAwaiter.Result;
var errResult = errResultAwaiter.Result;
Note that you can't await the standard output before the error, because the wait will hang in case the standard error buffer gets full first (same for trying it the other way around).
The only way is to start reading them asynchronously, wait for the process to exit, and then complete the awaits using Task.WhenAll.
I know this is an old post but maybe this will assist someone. I used this to execute some AWS (Amazon Web Services) CLI commands using .NET TPL tasks.
I did something like this in my command execution. It runs within a .NET TPL Task, which is created inside my WinForm background worker's bgwRun_DoWork method; that method holds a loop with while(!bgwRun.CancellationPending). The standard output of the Process is read on a separate thread obtained via the .NET ThreadPool class.
private void bgwRun_DoWork(object sender, DoWorkEventArgs e)
{
while (!bgwRun.CancellationPending)
{
//build TPL Tasks
var tasks = new List<Task>();
//work to add tasks here
tasks.Add(new Task(()=>{
//build .Net ProcessInfo, Process and start Process here
ThreadPool.QueueUserWorkItem(state =>
{
while (!process.StandardOutput.EndOfStream)
{
var output = process.StandardOutput.ReadLine();
if (!string.IsNullOrEmpty(output))
{
bgwRun_ProgressChanged(this, new ProgressChangedEventArgs(0, new ExecutionInfo
{
Type = "ExecutionInfo",
Text = output,
Configuration = s3SyncConfiguration
}));
}
if (cancellationToken.GetValueOrDefault().IsCancellationRequested)
{
break;
}
}
});
});//work Task
//loop through and start tasks here and handle completed tasks
} //end while
}
I know my SVN repos can run slowly sometimes, so maybe 5 seconds isn't long enough? Have you copied the string you are passing to the process from a breakpoint so you are positive it's not prompting you for anything?
