Trouble executing large batch file from C#

I'm executing a batch file from my C# application using the System.Diagnostics classes, updating my GUI with the output along the way, but as soon as my batch file is in excess of a certain number of lines, the process just hangs. The exact amount of lines seems to vary, but I have been able to reproduce it with a simple batch file that prints "Hello Kitty" 316 times:
@echo off
echo Hello Kitty
echo Hello Kitty
etc.
If I remove the 316th line, the batch file executes fine and the forms application behaves as expected, but any more lines cause the process to hang indefinitely, without producing even one of the first 300 Hello Kitties.
Here is my code for executing the batch file:
process = new System.Diagnostics.Process();
process.StartInfo.FileName = batchName;
process.StartInfo.Arguments = " < nul";
process.StartInfo.CreateNoWindow = true;
process.StartInfo.RedirectStandardOutput = true;
process.StartInfo.RedirectStandardError = true;
process.StartInfo.UseShellExecute = false;
process.Start();
with these declared elsewhere:
protected Process process;
protected StreamReader output;
My main form does something like this (simplified a little):
string output;
while (!proc.Process.HasExited)
{
    proc.Process.WaitForExit(200);
    if (proc.Process.HasExited)
    {
        output = proc.Output.ReadToEnd();
        rtbStatus.AppendText(output);
    }
    Application.DoEvents();
}
I don't understand why it does this, and no examples that I find on the net make any mention of a size limit on batch files. Please advise.

You need to read the output continuously - you cannot wait until the end.
The output is buffered in a pipe, so only a small amount (usually about 4KB) can be written before the buffer fills and the batch file blocks on its next write.
The alternative is to redirect the output to a file and then read the file.
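A minimal sketch of the continuous approach, assuming a batch file path of bigbatch.cmd (a placeholder; Collect is a hypothetical helper that just accumulates non-null lines - in a WinForms app you would marshal each line to the UI thread with BeginInvoke instead of writing to the console):

```csharp
using System;
using System.Diagnostics;
using System.Text;

class ContinuousReader
{
    // e.Data is null once the stream closes, so only append real lines.
    public static void Collect(StringBuilder sb, string data)
    {
        if (data != null)
            sb.AppendLine(data);
    }

    static void Main()
    {
        var sb = new StringBuilder();
        var process = new Process();
        process.StartInfo.FileName = "bigbatch.cmd"; // placeholder path
        process.StartInfo.UseShellExecute = false;
        process.StartInfo.CreateNoWindow = true;
        process.StartInfo.RedirectStandardOutput = true;
        process.StartInfo.RedirectStandardError = true;

        // Drain the pipe as data arrives, so the ~4KB buffer never fills.
        process.OutputDataReceived += (s, e) => Collect(sb, e.Data);
        process.ErrorDataReceived += (s, e) => Collect(sb, e.Data);

        process.Start();
        process.BeginOutputReadLine();
        process.BeginErrorReadLine();
        process.WaitForExit();

        Console.WriteLine(sb.ToString());
    }
}
```

Because the handlers fire on a thread-pool thread, the GUI loop with Application.DoEvents() is no longer needed.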


When writing multiple text files, some of them are not appearing in the Windows folder until I close my winform

I have a 3rd party exe that accepts the path to a single text file via user input and creates new text files after doing some calculations. It is normally run via the console and accepts inputs one at a time.
I wrote a Winform application to allow us to mass generate these input files based on some parameters, and then pass those input files in to the exe one at a time.
It seems to be working fairly well so far: if I have it generate 2 or 3 input files, the output files shortly appear in the correct folder. But if I generate and pass through 10 files (one at a time), the first 3 or so files appear correctly, but the later ones do not.
I thought that perhaps my program was outpacing the exe's calculations, so I put some pauses in between the console inputs. That didn't change anything, but I found out that once I close my winform, all of the late files appear in my folder all at once.
Any thoughts on what is happening here?
This is my first time using Process, so it's probably not ideal. I was originally working on not sending the next input until the exe was at the correct line, but I was struggling with reading the output. Then I deleted what I had and wrote the simple bit below, not expecting that to be enough. Apparently it was, and I actually started getting results.
Thanks!
Process p = new Process();
ProcessStartInfo start = new ProcessStartInfo();
start.CreateNoWindow = true;
start.UseShellExecute = false;
start.FileName = @"pathtoexe";
start.WindowStyle = ProcessWindowStyle.Hidden;
start.WorkingDirectory = Path.GetDirectoryName(start.FileName);
start.RedirectStandardOutput = true;
start.RedirectStandardInput = true;
start.RedirectStandardError = true;
p.StartInfo = start;
p.Start();
StreamWriter writer = p.StandardInput;
writer.WriteLine("9"); // exe asks where the dll is stored, 9 indicates the path we use
Thread.Sleep(1000);
writer.WriteLine("1"); // exe is asking which type of output files we want, 1 indicates the basic text output calculations
Thread.Sleep(1000);
for (int j = 1; j <= _mun.Count; j++)
{
    writer.WriteLine(@"..\Inputs\GeneratedInputs_" + j + ".in"); // The input that is read by one run of the exe. We create these files before getting to this step
    Thread.Sleep(1000);
    writer.WriteLine(""); // exe asks us to confirm by entering a character. Just hitting enter works
    Thread.Sleep(1000);
    writer.WriteLine("2"); // the exe asks us which calculations we are running. 2 is the result we want
    Thread.Sleep(5000);
    if (j == _mun.Count) // The last input of the exe is whether we want to run again or exit. On the last item of our list, choose "exit" with 0 instead of "run again" with 1.
    {
        writer.WriteLine("0");
        // p.WaitForExit(); I'm not really sure where in all of this I should be putting a WaitForExit, if at all.
    }
    else
    {
        writer.WriteLine("1");
    }
    Thread.Sleep(1000);
}
You can call
writer.Flush()
to flush the buffer and ensure all data is written.
You should also consider wrapping your writer instance in a using block to ensure proper disposal:
using (StreamWriter writer = p.StandardInput)
{
writer.WriteLine("test");
}

Progress Bar for each process

I'm currently busy with a small C# application that reads a folder and then dynamically lists all .rar files on a Windows form, with a dynamic progress bar next to each .rar file. So basically, with the press of a button, the .rar files need to be extracted (via the WinRAR command line), showing the progress for each process.
Below is my process snippet
Process proc = new Process();
proc.StartInfo.FileName = @"unrar.exe";
proc.StartInfo.Arguments = FileName + "whatever attributes/switches";
proc.StartInfo.RedirectStandardOutput = true;
proc.StartInfo.RedirectStandardInput = true;
proc.StartInfo.RedirectStandardError = true;
proc.StartInfo.UseShellExecute = false;
proc.StartInfo.CreateNoWindow = true;
proc.OutputDataReceived += new DataReceivedEventHandler(proc_OutputDataReceived);
proc.Start();
proc.BeginOutputReadLine();
proc.WaitForExit();
Having trouble getting this right.
Help will be much appreciated.
If unrar.exe outputs the progress to the standard output, you could try and parse it to update the progressbar.
Instead of using unrar.exe to uncompress the archives from within your program, you could try using a library, like SevenZipLib http://sevenziplib.codeplex.com/.
The problem may be that UNRAR doesn't output a NewLine, and just keeps writing to the same line, so the event handler never gets called. It only gets called once a new line is written.
I would go with Simon's solution and try to use 7-Zip instead. It's friendlier, has a great C# library, and works with almost all formats.
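If you do parse the standard output, a helper along these lines may work. Note this is a guess at the format: it simply looks for an "NN%" token anywhere in the line, and unrar's actual progress text varies between versions (and, as noted above, the event only fires when a full line is written):

```csharp
using System;
using System.Text.RegularExpressions;

static class UnrarProgress
{
    // A guess at unrar's progress format: look for a "NN%" token anywhere
    // in the line. The real format varies between unrar versions.
    public static bool TryParsePercent(string line, out int percent)
    {
        percent = 0;
        if (string.IsNullOrEmpty(line))
            return false;
        Match m = Regex.Match(line, @"(\d{1,3})%");
        if (!m.Success)
            return false;
        percent = int.Parse(m.Groups[1].Value);
        return percent <= 100;
    }

    static void Main()
    {
        int pct;
        if (TryParsePercent("Extracting  readme.txt     42%", out pct))
            Console.WriteLine(pct); // prints 42
    }
}
```

Inside proc_OutputDataReceived you would call TryParsePercent(e.Data, out int pct) and, when it succeeds, marshal the value to the UI thread, e.g. progressBar.BeginInvoke((Action)(() => progressBar.Value = pct)), since the event is raised on a background thread.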

File copy using robo copy and process

I am creating a file-copy program which will copy a large number of files (~100,000), each ~50 KB in size, using the ROBOCOPY command.
For each file, I am creating a new process and passing the ROBOCOPY command and arguments as follow:
using (Process p = new Process())
{
    p.StartInfo.Arguments = string.Format("/C ROBOCOPY {0} {1} {2}",
        sourceDir, destinationDir, fileName);
    p.StartInfo.FileName = "CMD.EXE";
    p.StartInfo.CreateNoWindow = true;
    p.StartInfo.UseShellExecute = false;
    p.Start();
    p.WaitForExit();
}
Instead of creating a process for each file, I am looking for a better approach, which will be good in terms of performance and design. Can someone suggest a better method?
This question is a bit old, but I thought I would answer to help anyone who still lands on it. I wrote a library called RoboSharp (https://github.com/tjscience/RoboSharp) that brings all of the goodness of Robocopy to C#. Take a look if you require the power of Robocopy in C#.
Process p = new Process();
p.StartInfo.Arguments = string.Format("/C Robocopy /S {0} {1}", "C:\\source", "C:\\destination");
p.StartInfo.FileName = "CMD.EXE";
p.StartInfo.CreateNoWindow = true;
p.StartInfo.UseShellExecute = false;
p.Start();
p.WaitForExit();
/C Robocopy -> /C tells CMD.EXE to run the robocopy command and then terminate
/S -> copies subdirectories (and the files in them), excluding empty ones
I would just use System.IO. Should be plenty fast enough, and your filename could be a wildcard.
using System.IO;

// snip your code... providing fileName, sourceDir, destinationDir
DirectoryInfo dirInfo = new DirectoryInfo(sourceDir);
FileInfo[] fileInfos = dirInfo.GetFiles(fileName);
foreach (FileInfo file in fileInfos)
{
    File.Copy(file.FullName, Path.Combine(destinationDir, file.Name), true); // overwrites existing
}
You should call File.Copy in a loop.
Robocopy can use up to 128 threads by itself. It makes a huge difference. By default it uses 8.
See https://pureinfotech.com/robocopy-multithreaded-file-copy-windows-10/
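To illustrate, a single multithreaded robocopy invocation can replace the process-per-file loop from the question. The paths here are placeholders, and /MT:32 asks for 32 copy threads (default 8, maximum 128):

```csharp
using System.Diagnostics;

class MultiThreadedCopy
{
    // Builds the argument string; /MT:32 asks robocopy for 32 copy threads.
    public static string BuildArgs(string sourceDir, string destinationDir, int threads)
    {
        return string.Format("\"{0}\" \"{1}\" /MT:{2}", sourceDir, destinationDir, threads);
    }

    static void Main()
    {
        using (Process p = new Process())
        {
            // Run robocopy directly; no cmd.exe /C wrapper is needed.
            p.StartInfo.FileName = "ROBOCOPY.EXE";
            p.StartInfo.Arguments = BuildArgs(@"C:\source", @"C:\destination", 32);
            p.StartInfo.CreateNoWindow = true;
            p.StartInfo.UseShellExecute = false;
            p.Start();
            p.WaitForExit();
        }
    }
}
```

Note that /MT is documented as incompatible with the /IPG and /EFSRAW options.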
My .cmd file has the following lines:
Start ROBOCOPY src dest a* b* c* /z /w:1 /r:1
Start ROBOCOPY src dest d* e* f* g* /z /w:1 /r:1
Start ROBOCOPY src dest h* K* P* Y* /z /w:1 /r:1
Start ROBOCOPY src dest xry* srp* /z /w:1 /r:1
When I run sample.cmd, it starts four windows copying files simultaneously as per the above commands. Each one waits and retries if a file is being used by another process, because of the wait and retry switches. It is faster because it does the jobs simultaneously.
Now I am developing a GUI using C# WinForms to run the process instead of going to the command console, roughly:
main()
{
    process.Start("path of sample.cmd");
    process.WaitForExit();
    label.Text = "Successful copy";
}
However, this only takes control of one process, i.e. cmd.exe, while there are four robocopy processes in Task Manager. When cmd.exe completes, control returns and label.Text is set to "Successfully completed" while the robocopy processes are still running; you can see the robocopy windows still doing the copying.
Here is the question: I want to take control of all the processes (cmd.exe and robocopy.exe) programmatically in C#, so that label.Text displays "successfully completed" only when all commands have completed successfully; if one fails, then there is no point in the GUI.
Option 2 (similar to what Biju has written above): would it be better to remove the robocopy commands from sample.cmd and write code to run the four robocopy lines in C#? But how do I run the robocopy lines written in the .cmd file, as they have arguments as well? If the code runs each robocopy process in turn, each will return to the next line of code, and if one fails we can catch the error and display it in a message box.
Hope this will help... However, I am looking for a better way, if somebody can improve on the same.
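For option 2, here is a rough sketch that starts the four robocopy commands directly from C# and reports success only after every process has exited. The masks and switches are copied from the sample.cmd above, and the success check follows robocopy's exit-code convention (codes below 8 mean success, 8 and above mean failure):

```csharp
using System;
using System.Collections.Generic;
using System.Diagnostics;

class ParallelRobocopy
{
    // Robocopy exit codes 0-7 indicate success (possibly with extra or
    // skipped files); 8 and above indicate that something failed.
    public static bool Succeeded(int exitCode)
    {
        return exitCode < 8;
    }

    static void Main()
    {
        // One entry per line of the original sample.cmd.
        string[] argSets =
        {
            @"src dest a* b* c* /z /w:1 /r:1",
            @"src dest d* e* f* g* /z /w:1 /r:1",
            @"src dest h* K* P* Y* /z /w:1 /r:1",
            @"src dest xry* srp* /z /w:1 /r:1",
        };

        var procs = new List<Process>();
        foreach (string args in argSets)
            procs.Add(Process.Start(new ProcessStartInfo("ROBOCOPY.EXE", args)
            {
                UseShellExecute = false,
                CreateNoWindow = true,
            }));

        bool allSucceeded = true;
        foreach (Process p in procs)
        {
            p.WaitForExit(); // block until this copy finishes
            if (!Succeeded(p.ExitCode))
                allSucceeded = false;
        }

        // In the WinForm you would set label.Text here instead.
        Console.WriteLine(allSucceeded ? "Successfully completed" : "A copy failed");
    }
}
```

Because each Process object is tracked individually, the "successfully completed" message only appears after all four copies are done, and a failed one can be reported on its own.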

Is there a way to have a batch file run within a windows form application in C#?

So anyways, I've been working on a batch IDE, and I was wondering if there was a good way to effectively embed the file into the form.
It would function sort of like a debug mode, where at any time, the user can click a button, and the batch file would load into the actual form.
Like the black cmd window would be embedded into the form...
Is there any way to do that?
ProcessStartInfo psi = new ProcessStartInfo();
psi.RedirectStandardOutput = true;
psi.UseShellExecute = false;
psi.FileName = "C:\\echo.cmd";
var p = Process.Start(psi);
Console.WriteLine(p.StandardOutput.ReadToEnd());
And in C:\echo.cmd I have just a basic echo hello!. When this code is executed, you'll see hello! received from the batch file's output stream.
Note that if the executed command waits for some input, ReadToEnd() can't return. In this case you should use the Process.OutputDataReceived event.
Look at the process object and the StandardInput, StandardOutput and StandardError streams.
That is essentially all the command window is showing with some special handling of control characters.
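As a sketch of that idea, you can host cmd.exe with all three streams redirected and feed it commands as if they were typed. The echo hello! script is just an example, and a real IDE would append the output to a TextBox rather than the console:

```csharp
using System;
using System.Diagnostics;

class EmbeddedShell
{
    // The commands fed to the embedded cmd.exe; "exit" must come last so
    // that WaitForExit() can return.
    public static string[] Script()
    {
        return new[] { "echo hello!", "exit" };
    }

    static void Main()
    {
        var psi = new ProcessStartInfo("cmd.exe")
        {
            UseShellExecute = false,
            CreateNoWindow = true,
            RedirectStandardInput = true,
            RedirectStandardOutput = true,
            RedirectStandardError = true,
        };
        Process p = Process.Start(psi);

        // Show output as it arrives; in the form you would append each
        // line to a TextBox (marshalled to the UI thread).
        p.OutputDataReceived += (s, e) =>
        {
            if (e.Data != null)
                Console.WriteLine(e.Data);
        };
        p.BeginOutputReadLine();

        // "Type" the commands into the embedded console programmatically.
        foreach (string command in Script())
            p.StandardInput.WriteLine(command);
        p.WaitForExit();
    }
}
```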

C# System.Diagnostics.Process redirecting Standard Out for large amounts of data

I'm running an exe from a .NET app and trying to redirect standard out to a StreamReader. The problem is that when I do
myprocess.exe >> out.txt
out.txt is close to 14mb.
When I run the command-line version it is very fast, but when I run the process from my C# app it is excruciatingly slow, because I believe the default StreamReader flushes every 4096 bytes.
Is there a way to change the default stream reader for the Process object?
I haven't tried, but it looks like the asynchronous methods may offer better performance. Instead of using process.StandardOutput, try this method instead:
Process process = Process.Start(
    new ProcessStartInfo("a.exe") { RedirectStandardOutput = true, UseShellExecute = false });
if (process != null)
{
    process.OutputDataReceived += (sender, e) =>
    {
        string consoleLine = e.Data;
        // handle data
    };
    process.BeginOutputReadLine();
}
Edit: Just realized I'm answering the wrong question. In my case the stdout buffer was full and WaitForExit() was blocking forever, because nothing was reading from the buffer yet. So if you have THAT problem, then here's a solution. ;)
This is my first day with C#, so please understand that this might not be the best solution and might not always work. But it has worked both times I've tested it. ;) This is synchronous: just start writing the redirected stdout/stderr to the file before you call WaitForExit(). This way WaitForExit() won't block waiting for the stdout buffer to be emptied.
string str_MyProg = "my.exe";
string str_CommandArgs = " arg1 arg2";
System.Diagnostics.ProcessStartInfo procStartInfo = new System.Diagnostics.ProcessStartInfo(str_MyProg, str_CommandArgs);
procStartInfo.RedirectStandardError = true;
procStartInfo.RedirectStandardOutput = true; // Set true to redirect the process stdout to the Process.StandardOutput StreamReader
procStartInfo.UseShellExecute = false;
procStartInfo.CreateNoWindow = true; // Do not create the black window
// Create a process, assign its ProcessStartInfo and start it
System.Diagnostics.Process myProcess = new System.Diagnostics.Process();
myProcess.StartInfo = procStartInfo;
myProcess.Start();
// Dump the output to the log file
string stdOut = myProcess.StandardOutput.ReadToEnd();
StreamWriter logFile = new StreamWriter("output.txt" );
logFile.Write(stdOut);
logFile.Close();
myProcess.WaitForExit();
Yes, that's about right. There is a buffer that stores the process output, usually between 1 and 4KB in the common CRT implementations. One small detail: that buffer is located in the process you start, not the .NET program.
Nothing very special needs to happen when you redirect to a file, the CRT directly writes it. But if you redirect to your .NET program then output goes from the buffer into a pipe. Which then takes a thread switch to your program so you can empty the pipe. Back and forth a good 700 times.
Yes, not fast. Easily fixed though, call setvbuf() in the program you are running to increase the stdout and stderr output buffer sizes. Then again, that takes having the source code of that program.
Anticipating a problem with that: maybe you ought to use cmd.exe /c to get the redirection to a file, then read the file.
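A sketch of that workaround, using the myprocess.exe and out.txt names from the question (BuildCmdArgs is a hypothetical helper that quotes the redirection for cmd.exe):

```csharp
using System;
using System.Diagnostics;
using System.IO;

class RedirectToFile
{
    // Builds a cmd.exe argument string that redirects the child's stdout
    // to a file; /C runs the command and then exits.
    public static string BuildCmdArgs(string exe, string outFile)
    {
        return string.Format("/C {0} > \"{1}\"", exe, outFile);
    }

    static void Main()
    {
        var psi = new ProcessStartInfo("cmd.exe", BuildCmdArgs("myprocess.exe", "out.txt"))
        {
            UseShellExecute = false,
            CreateNoWindow = true,
        };
        using (Process p = Process.Start(psi))
        {
            p.WaitForExit();
        }
        // The CRT writes straight to the file, so no pipe round-trips occur.
        string output = File.ReadAllText("out.txt");
        Console.WriteLine(output.Length);
    }
}
```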
The Process class exposes the stdout stream directly, so you should be able to read it at whatever pace you like. It's probably best to read it in small chunks and avoid calling ReadToEnd.
For example:
using (StreamReader sr = myProcess.StandardOutput)
{
    string line;
    while ((line = sr.ReadLine()) != null)
    {
        // do something with line
    }
}
This worked out for me:
var sb = new StringBuilder();
while (!proc.StandardOutput.EndOfStream)
{
    sb.Append(proc.StandardOutput.ReadToEnd());
    proc.StandardOutput.DiscardBufferedData();
}
