Best way to repeat a process and continue even if it fails - C#

I have a program that restores databases. Basically the program reads from a file and restores one database at a time; it runs once per database. I need a way to repeat this until all databases are restored. Even if a restore fails, I need the process to keep going until every database has been attempted. I am thinking of having another program call the program I just mentioned. So something like this:
public const string databaseAttemptedRestore = "C:\\Logs\\AttemptedDatabaseRestore.log";
public const string databasesToRestore = "C:\\Logs\\DatabasesToRestore.log";

static void Main(string[] args)
{
    // Repeats the restore, reading from databasesToRestore. Needs to be called by Task Scheduler.
    Action toRepeat = () =>
    {
        var proc = new Process();
        // database restore program
        proc.StartInfo.FileName = @"C:\Projects\DbRestoreProdConsole\bin\Debug\DbRestoreProdConsole.exe";
        // set up output redirection
        proc.StartInfo.RedirectStandardOutput = true;
        proc.StartInfo.RedirectStandardError = true;
        proc.EnableRaisingEvents = true;
        proc.StartInfo.CreateNoWindow = true;
        proc.StartInfo.UseShellExecute = false;
        proc.StartInfo.WorkingDirectory = @"C:\Windows\SysWOW64";
        // see below for output handler
        proc.Start();
        proc.WaitForExit();
    };

    double lineCount = File.ReadLines(databasesToRestore).Count();
    double repeat = Math.Ceiling(lineCount / 2);
    Enumerable.Range(0, (int)repeat).ToList().ForEach(i => toRepeat());
}
I am trying to figure out the best solution for this repeating process, where I can track errors and run the program smoothly. The program runs after hours, so it doesn't need to be fast. The most databases I will restore per night is around 8, but it will usually be 1 or 2. Any suggestions are greatly appreciated. Thank you.
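One way to approach this (a sketch only, not a drop-in replacement): instead of counting lines and repeating blindly, iterate over the lines of DatabasesToRestore.log, start one restore process per database, and log every attempt so a failure never stops the remaining restores. This assumes one database per line and that the console app reports failure through its exit code; the command-line argument is hypothetical, since the original exe reads its work from a file. It reuses the two log-path constants from the snippet above.

// Sketch only: "one database per line" and "non-zero exit code means failure" are assumptions.
foreach (string database in File.ReadLines(databasesToRestore))
{
    try
    {
        using (var proc = new Process())
        {
            proc.StartInfo.FileName = @"C:\Projects\DbRestoreProdConsole\bin\Debug\DbRestoreProdConsole.exe";
            proc.StartInfo.Arguments = database;    // hypothetical: pass the database name to the restore exe
            proc.StartInfo.UseShellExecute = false;
            proc.StartInfo.CreateNoWindow = true;
            proc.Start();
            proc.WaitForExit();

            // Record what happened so a failed restore can be retried or investigated later.
            File.AppendAllText(databaseAttemptedRestore,
                $"{DateTime.Now:u} {database} exit code {proc.ExitCode}{Environment.NewLine}");
        }
    }
    catch (Exception ex)
    {
        // Log and keep going so one bad database doesn't stop the rest.
        File.AppendAllText(databaseAttemptedRestore,
            $"{DateTime.Now:u} {database} failed to start: {ex.Message}{Environment.NewLine}");
    }
}

Exit codes and exceptions both end up in AttemptedDatabaseRestore.log, which gives you the error tracking you mention without a second wrapper program.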

Related

Suppress output of child process

Let's say we start a console application with:
public static void StartProcess()
{
using var next = new Process();
next.StartInfo.UseShellExecute = false;
next.StartInfo.FileName = "dotnet";
next.StartInfo.Arguments = "/opt/ConsoleApp1/ConsoleApp1.dll";
next.Start();
}
This code leads to doubled StandardOutput and StandardError, because the parent and child processes write to the same terminal. How can I suppress the child process output and/or detach the child console?
Of course I can do something like this:
public static void StartProcess()
{
using var next = new Process();
next.StartInfo.UseShellExecute = false;
next.StartInfo.FileName = "dotnet";
next.StartInfo.Arguments = "/opt/ConsoleApp1/ConsoleApp1.dll";
next.StartInfo.RedirectStandardOutput = true;
next.StartInfo.RedirectStandardError = true;
next.Start();
next.StandardOutput.BaseStream.CopyToAsync(Stream.Null);
next.StandardError.BaseStream.CopyToAsync(Stream.Null);
}
As far as I understand, this will only work while the parent process is alive, but what if the child process outlives the parent? I need a stable, cross-platform solution.
Adding StartInfo.CreateNoWindow = true fixed it for me.
Well, almost... Starting the process still spits out a newline onto the Linux console, but no other output.
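For reference, a sketch of the first snippet with that flag added (an assumption that nothing else about the launcher changes):

public static void StartProcess()
{
    using var next = new Process();
    next.StartInfo.UseShellExecute = false;
    next.StartInfo.CreateNoWindow = true;   // suppress the child's console output, per the answer above
    next.StartInfo.FileName = "dotnet";
    next.StartInfo.Arguments = "/opt/ConsoleApp1/ConsoleApp1.dll";
    next.Start();
}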

Communicate with process in C#

I need to communicate with an external executable (ampl.exe) using standard input and standard output. This exe runs calculations for several minutes with some display in the console. It has a prompt, so I can successively launch calculations through its standard input as soon as the previous calculation is finished.
The external exe is launched as :
var myProcess = new Process();
myProcess.StartInfo = new ProcessStartInfo("ampl.exe");
myProcess.StartInfo.CreateNoWindow = true;
myProcess.StartInfo.UseShellExecute = false;
myProcess.StartInfo.RedirectStandardOutput = true;
myProcess.StartInfo.RedirectStandardError = true;
myProcess.StartInfo.RedirectStandardInput = true;
myProcess.Start();
I communicate with it using myProcess.StandardInput and myProcess.StandardOutput (synchronously).
I use standard input to launch a calculation, for example:
myProcess.StandardInput.WriteLine("solve;");
I want to wait for the end of the solve statement, get the results from files, prepare new calculation input files, and then launch a second solve.
My problem is that I do not know when the first calculation is finished, that is, when the exe is waiting for a new command on its standard input.
The only way I have found is to add a specific display command and wait until it shows up in the standard output:
myProcess.StandardInput.WriteLine("solve;");
myProcess.StandardInput.WriteLine("print 'calculDone';");
string output = myProcess.StandardOutput.ReadLine();
while (!output.Contains("calculDone"))
{
output = myProcess.StandardOutput.ReadLine();
}
Is there another way to do this that avoids the extra display command?
Edit: following the advice, I tried the asynchronous way, but I still need to print 'CalculDone' to know when the solve statement has ended. I do not get the ampl.exe prompt (which is 'ampl : ') in the standard output of the process.
AutoResetEvent eventEnd = new AutoResetEvent(false);
var myProcess = new Process();
myProcess.StartInfo = new ProcessStartInfo("ampl.exe");
myProcess.StartInfo.CreateNoWindow = true;
myProcess.StartInfo.UseShellExecute = false;
myProcess.StartInfo.RedirectStandardOutput = true;
myProcess.StartInfo.RedirectStandardError = true;
myProcess.StartInfo.RedirectStandardInput = true;
myProcess.EnableRaisingEvents = true;
myProcess.OutputDataReceived += (sender, e) =>
{
if (e.Data == "commandDone")
{
eventEnd.Set();
}
else if (e.Data != null)
{
Console.WriteLine("ampl: {0}", e.Data);
}
};
myProcess.Start();
myProcess.BeginOutputReadLine();
myProcess.StandardInput.WriteLine("solve;");
myProcess.StandardInput.WriteLine("print 'commandDone';");
eventEnd.WaitOne();
The best option would be to use the Process.OutputDataReceived event instead of a tight while loop. It's like the event-based async pattern: you launch an asynchronous task and wait for an event callback telling you it's done. The continuation of the asynchronous task goes in the event handler. Remember to unsubscribe the event handler the first time it fires, otherwise it will keep firing when you don't want it to.
Another option, which I have never tried, is the Process.WaitForInputIdle() method, but I'm not sure if it will work in your particular case. If it does, you wouldn't need to write anything to the input stream.
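A minimal sketch of the self-unsubscribing handler described above, reusing myProcess and the eventEnd AutoResetEvent from the edit in the question; the 'commandDone' marker is still the question's own workaround, the handler wiring is the only new part:

// Same StartInfo setup as in the edit above, then:
DataReceivedEventHandler handler = null;
handler = (sender, e) =>
{
    if (e.Data == "commandDone")
    {
        myProcess.OutputDataReceived -= handler;   // unsubscribe the first time it fires
        eventEnd.Set();                            // release the waiter; the continuation goes after WaitOne()
    }
};
myProcess.OutputDataReceived += handler;

myProcess.Start();
myProcess.BeginOutputReadLine();
myProcess.StandardInput.WriteLine("solve;");
myProcess.StandardInput.WriteLine("print 'commandDone';");
eventEnd.WaitOne();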

How to read long lines from a child process' stdout in an efficient way?

Note: This question has changed significantly since the first version, so some comments or answers may seem out of place. Please check the edit history if something looks odd.
I am launching a child process from a C# class library.
I am using Process.BeginOutputReadLine() to read the output/error asynchronously. I thought it didn't work with very long lines, but the problem seems to be that it's not scalable. On my computer, a 128 KB line is processed instantly, a 512 KB line takes around one minute, a 1 MB line takes several minutes, and I waited around two hours for a 10 MB line to be processed; it was still working when I cancelled it.
It seems easy to fix by reading directly from the StandardOutput and StandardError streams, but the data from those streams is buffered. If I get enough data from stdout to fill the buffer, and then some more data from stderr, I can't find a way to check whether there's data pending in one of them, and if I try to read from stderr, it hangs forever.
Why is this happening, what am I doing wrong, and what's the right way to do this?
Some code samples to illustrate what I'm trying to achieve.
Program1:
// Writes a lot of data to stdout and stderr
static void Main(string[] args)
{
int numChars = 512 * 1024;
StringBuilder sb = new StringBuilder(numChars);
String s = "1234567890";
for (int i = 0; i < numChars; i++)
sb.Append(s[i % 10]);
int len = sb.Length;
Console.WriteLine(sb.ToString());
Console.Error.WriteLine(sb.ToString());
}
Program2:
// Calls Program1 and tries to read its output.
static void Main(string[] args)
{
StringBuilder sb = new StringBuilder();
StringBuilder sbErr = new StringBuilder();
var proc = new Process();   // process that will run program1.exe
proc.StartInfo.CreateNoWindow = true;
proc.StartInfo.WindowStyle = ProcessWindowStyle.Hidden;
proc.StartInfo.UseShellExecute = false;
proc.StartInfo.RedirectStandardError = true;
proc.StartInfo.RedirectStandardOutput = true;
proc.StartInfo.RedirectStandardInput = false;
proc.StartInfo.Arguments = "";
proc.StartInfo.FileName = "program1.exe";
proc.ErrorDataReceived += (s, ee) => { if (ee.Data != null) sbErr.AppendLine(ee.Data); };
proc.OutputDataReceived += (s, ee) => { if (ee.Data != null) sb.AppendLine(ee.Data); };
proc.Start();
proc.BeginOutputReadLine();
proc.BeginErrorReadLine();
proc.WaitForExit();
}
Program1 has a constant that sets the size of the data to generate, and Program2 launches Program1 and tries to read the data. I would expect the time to grow linearly with the size, but it seems much worse than that.
I hope I understand your problem. The application hangs on Process.WaitForExit() because that is what Process.WaitForExit() does: it waits for the process to exit.
You might want to call it on a new thread.
In the method that creates the process:
Thread trd = new Thread(new ParameterizedThreadStart(Start));
trd.Start(proc);   // pass the Process instance to the thread
and add this method:
private void Start(object o)
{
((Process)o).WaitForExit();
// your code after process ended
}
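Building on the question's own idea of reading directly from the streams: it can work if each redirected stream is drained on its own thread with a raw character buffer, so a single huge line is never accumulated as a "line" and neither pipe can fill up and block the child. A sketch, assuming only the captured text matters:

var proc = new Process();
proc.StartInfo.FileName = "program1.exe";
proc.StartInfo.UseShellExecute = false;
proc.StartInfo.RedirectStandardOutput = true;
proc.StartInfo.RedirectStandardError = true;
proc.StartInfo.CreateNoWindow = true;
proc.Start();

var sbOut = new StringBuilder();
var sbErr = new StringBuilder();

// One thread per stream: read raw chunks instead of lines, so a very long
// "line" is just copied through the buffer piece by piece.
Thread outThread = new Thread(() =>
{
    char[] buffer = new char[64 * 1024];
    int read;
    while ((read = proc.StandardOutput.Read(buffer, 0, buffer.Length)) > 0)
        sbOut.Append(buffer, 0, read);
});
Thread errThread = new Thread(() =>
{
    char[] buffer = new char[64 * 1024];
    int read;
    while ((read = proc.StandardError.Read(buffer, 0, buffer.Length)) > 0)
        sbErr.Append(buffer, 0, read);
});
outThread.Start();
errThread.Start();

proc.WaitForExit();
outThread.Join();   // both readers finish when the child closes its streams
errThread.Join();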

How to set the maximum number of external processes the program can start at the same time?

I need to run an external program for every PDF file in specified directory.
The problem is: how do I limit the number of external program processes to a user-specified value? I run it in a loop, like this:
foreach(string file in Directory.GetFiles(sourcePath))
{
Process p = new Process();
p.StartInfo.FileName = #"path\program.exe";
p.StartInfo.Arguments = previouslySetArguments;
p.Start();
}
Now the problem is that there is sometimes a really huge number of files, and with that code all the processes would run at the same time. It really slows the machine down.
The other idea is to put p.WaitForExit(); after the p.Start();, but then only one process runs at a time, which, on the other hand, slows down the whole job :)
What is the easiest way to limit the number of processes so that exactly that many run at the same time? I want to let the user decide.
Let's say I want to run maximum 5 processes at once. So:
- first 5 processes starts in more-or-less the same time for first 5 files in the directory
- when one of them (doesn't matter, which) ends work, the next one starts - for the next file
If I were you I would look into the producer-consumer model of queuing. It's intended to do pretty much exactly this, and there are lots of good examples that you can modify to suit your needs.
Here's an example:
C# producer/consumer
And another example:
http://msdn.microsoft.com/en-us/library/hh228601%28v=vs.110%29.aspx
(this last one is for 4.5, but still valid IMO)
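A minimal producer-consumer sketch along those lines (an illustration, not a tested solution), reusing sourcePath, previouslySetArguments, and the program path from the question, with the limit hard-coded to 5 where the user's choice would go:

// Requires System.Collections.Concurrent and System.Threading.Tasks.
int maxProcesses = 5;   // user-specified limit

using (var queue = new BlockingCollection<string>())
{
    // Producer: queue every PDF file in the source directory.
    foreach (string file in Directory.GetFiles(sourcePath))
        queue.Add(file);
    queue.CompleteAdding();

    // Consumers: maxProcesses workers, each running one external process at a time.
    var consumers = new List<Task>();
    for (int i = 0; i < maxProcesses; i++)
    {
        consumers.Add(Task.Run(() =>
        {
            foreach (string file in queue.GetConsumingEnumerable())
            {
                using (var p = new Process())
                {
                    p.StartInfo.FileName = @"path\program.exe";
                    p.StartInfo.Arguments = previouslySetArguments;
                    p.StartInfo.UseShellExecute = false;
                    p.Start();
                    p.WaitForExit();   // blocks this worker only; the other workers keep going
                }
            }
        }));
    }
    Task.WaitAll(consumers.ToArray());
}

Each worker blocks in WaitForExit(), so at most maxProcesses external processes exist at any moment, and the next file is picked up as soon as any worker finishes.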
OK, so I tried the answers from this question, but - strange thing - I couldn't get them to work. I admit I was in a hurry, so I could have made some mistakes...
For now, the quickest and simplest (and ugliest) method I've found is just a loop, like the code below. It works with a test program that just calls Thread.Sleep() with the given command-line argument as milliseconds.
Now please explain to me why this is not a good solution - I assume it is not the correct way (not only because the code is ugly), even if it works with this test example.
class Program
{
// hardcoded, it's just a test:
static int activeProcesses = 0;
static int finishedProcesses = 0;
static int maxProcesses = 5;
static int runProcesses = 20;
static string file = #"c:\tmp\dummyDelay.exe";
static void Main(string[] args)
{
Random rnd = new Random();
while (activeProcesses + finishedProcesses < runProcesses)
{
if (activeProcesses < maxProcesses)
{
Process p = new Process();
p.EnableRaisingEvents = true;
p.Exited += new EventHandler(pExited);
p.StartInfo.WindowStyle = ProcessWindowStyle.Hidden;
p.StartInfo.FileName = file;
p.StartInfo.Arguments = rnd.Next(2000, 5000).ToString();
p.Start();
Console.WriteLine("Started: {0}", p.Id.ToString());
activeProcesses++;
}
}
}
static void pExited(object sender, EventArgs e)
{
Console.WriteLine("Finished: {0}", ((Process)sender).Id.ToString());
((Process)sender).Dispose();
activeProcesses--;
finishedProcesses++;
}
}

Hanging process when run with .NET Process.Start -- what's wrong?

I wrote a quick and dirty wrapper around svn.exe to retrieve some content and do something with it, but for certain inputs it occasionally and reproducibly hangs and won't finish. For example, one call is to svn list:
svn list "http://myserver:84/svn/Documents/Instruments/" --xml --no-auth-cache --username myuser --password mypassword
This command line runs fine when I just do it from a command shell, but it hangs in my app. My C# code to run this is:
string cmd = "svn.exe";
string arguments = "list \"http://myserver:84/svn/Documents/Instruments/\" --xml --no-auth-cache --username myuser --password mypassword";
int ms = 5000;
ProcessStartInfo psi = new ProcessStartInfo(cmd);
psi.Arguments = arguments;
psi.RedirectStandardOutput = true;
psi.WindowStyle = ProcessWindowStyle.Normal;
psi.UseShellExecute = false;
Process proc = Process.Start(psi);
StreamReader output = new StreamReader(proc.StandardOutput.BaseStream, Encoding.UTF8);
proc.WaitForExit(ms);
if (proc.HasExited)
{
return output.ReadToEnd();
}
This takes the full 5000 ms and never finishes. Extending the time doesn't help. In a separate command prompt, it runs instantly, so I'm pretty sure it's unrelated to an insufficient waiting time. For other inputs, however, this seems to work fine.
I also tried running a separate cmd.exe here (where exe is svn.exe and args is the original arg string), but the hang still occurred:
string cmd = "cmd";
string arguments = "/S /C \"" + exe + " " + args + "\"";
What could I be screwing up here, and how can I debug this external process stuff?
EDIT:
I'm just now getting around to addressing this. Many thanks to Jon Skeet for his suggestion, which indeed works great. I have another question about my way of handling this, though, since I'm a multi-threading novice. I'd like suggestions on improving any glaring deficiencies or anything otherwise dumb. I ended up creating a small class that contains the stdout stream, a StringBuilder to hold the output, and a flag to tell when it's finished. Then I used ThreadPool.QueueUserWorkItem and passed in an instance of my class:
ProcessBufferHandler bufferHandler = new ProcessBufferHandler(proc.StandardOutput.BaseStream,
Encoding.UTF8);
ThreadPool.QueueUserWorkItem(ProcessStream, bufferHandler);
proc.WaitForExit(ms);
if (proc.HasExited)
{
bufferHandler.Stop();
return bufferHandler.ReadToEnd();
}
... and ...
private class ProcessBufferHandler
{
public Stream stream;
public StringBuilder sb;
public Encoding encoding;
public State state;
public enum State
{
Running,
Stopped
}
public ProcessBufferHandler(Stream stream, Encoding encoding)
{
this.stream = stream;
this.sb = new StringBuilder();
this.encoding = encoding;
state = State.Running;
}
public void ProcessBuffer()
{
sb.Append(new StreamReader(stream, encoding).ReadToEnd());
}
public string ReadToEnd()
{
return sb.ToString();
}
public void Stop()
{
state = State.Stopped;
}
}
This seems to work, but I'm doubtful that this is the best way. Is this reasonable? And what can I do to improve it?
One standard issue: the process could be waiting for you to read its output. Create a separate thread to read from its standard output while you're waiting for it to exit. It's a bit of a pain, but that may well be the problem.
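A minimal sketch of that suggestion applied to the snippet in the question (same psi and ms variables; the extra thread just drains stdout so svn can never block on a full pipe):

Process proc = Process.Start(psi);

string output = null;
Thread reader = new Thread(() =>
{
    // Drain stdout on a separate thread so the child is never stalled by a full pipe.
    output = new StreamReader(proc.StandardOutput.BaseStream, Encoding.UTF8).ReadToEnd();
});
reader.Start();

proc.WaitForExit(ms);
if (proc.HasExited)
{
    reader.Join();   // the reader finishes once svn closes its stdout
    return output;
}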
Jon Skeet is right on the money!
If you don't mind polling after you launch your svn command try this:
Process command = new Process();
command.EnableRaisingEvents = false;
command.StartInfo.FileName = "svn.exe";
command.StartInfo.Arguments = "your svn arguments here";
command.StartInfo.UseShellExecute = false;
command.StartInfo.RedirectStandardOutput = true;
command.Start();
while (!command.StandardOutput.EndOfStream)
{
Console.WriteLine(command.StandardOutput.ReadLine());
}
I had to drop an exe on a client's machine and use Process.Start to launch it.
The calling application would hang - the issue ended up being their machine assuming the exe was dangerous and preventing other applications from starting it.
Right click the exe and go to properties. Hit "Unblock" toward the bottom next to the security warning.
Based on Jon Skeet's answer, this is how I do it in modern-day (2021) .NET 5:
var process = Process.Start(processStartInfo);
var stdErr = process.StandardError;
var stdOut = process.StandardOutput;
var resultAwaiter = stdOut.ReadToEndAsync();
var errResultAwaiter = stdErr.ReadToEndAsync();
await process.WaitForExitAsync();
await Task.WhenAll(resultAwaiter, errResultAwaiter);
var result = resultAwaiter.Result;
var errResult = errResultAwaiter.Result;
Note that you can't simply await the standard output before the standard error, because that await will hang if the standard error buffer fills up first (and the same happens the other way around).
The only way is to start reading both asynchronously, wait for the process to exit, and then complete the reads with Task.WhenAll.
I know this is an old post, but maybe this will assist someone. I used this to execute some AWS (Amazon Web Services) CLI commands using .NET TPL tasks.
The command execution runs inside a .NET TPL Task, created within my WinForm background worker's bgwRun_DoWork method, which holds a loop with while(!bgwRun.CancellationPending). The standard output of the Process is read on a new thread via the .NET ThreadPool class.
private void bgwRun_DoWork(object sender, DoWorkEventArgs e)
{
while (!bgwRun.CancellationPending)
{
//build TPL Tasks
var tasks = new List<Task>();
//work to add tasks here
tasks.Add(new Task(()=>{
//build .Net ProcessInfo, Process and start Process here
ThreadPool.QueueUserWorkItem(state =>
{
while (!process.StandardOutput.EndOfStream)
{
var output = process.StandardOutput.ReadLine();
if (!string.IsNullOrEmpty(output))
{
bgwRun_ProgressChanged(this, new ProgressChangedEventArgs(0, new ExecutionInfo
{
Type = "ExecutionInfo",
Text = output,
Configuration = s3SyncConfiguration
}));
}
if (cancellationToken.GetValueOrDefault().IsCancellationRequested)
{
break;
}
}
});
});//work Task
//loop through and start tasks here and handle completed tasks
} //end while
}
I know my SVN repos can run slow sometimes, so maybe 5 seconds isn't long enough? Have you copied the string you are passing to the process from a break point so you are positive it's not prompting you for anything?
