Multiple instances - C#

I am trying to perform parallel processing by launching a console application (program2) that does the actual work. It is launched by program1, which knows how many instances to start.
At some point the program can't launch any more instances. Even if I increase the instance amount, it only launches up to a limit; in this case only 92. If I set the limit to 100 or 200, it still only launches 92 on the server.
I am writing the program in C# and it runs on Windows Server 2008.
Here is the code:
for (int instanceCount = 0; instanceCount < InstancesAmount; instanceCount++)
{
    using (System.Diagnostics.Process myProcess = new System.Diagnostics.Process())
    {
        if (hideConsoleWindow)
        {
            myProcess.StartInfo.WindowStyle = System.Diagnostics.ProcessWindowStyle.Hidden;
        }
        myProcess.StartInfo.FileName = ExecutablePathProgram2;
        System.Security.SecureString password = new System.Security.SecureString();
        foreach (char c in ConfigurationSettingsManager.ProcessStartPassword.ToCharArray())
        {
            password.AppendChar(c);
        }
        myProcess.StartInfo.UserName = ConfigurationSettingsManager.ProcessStartUserName;
        myProcess.StartInfo.Password = password;
        myProcess.StartInfo.Domain = ConfigurationSettingsManager.ProcessStartDomain;
        myProcess.StartInfo.UseShellExecute = false;
        myProcess.Start();
    }
}
I have been looking for a documented maximum number of instances that can be launched, but everything I find says it is as many as the OS supports.
I also checked whether there is a maximum number of instances per session or per user, but I couldn't find anything that describes such a limit, or I missed it.

To quote Raymond Chen's blog: "If you have to ask about various operating system limits, you're probably doing something wrong".
There's a limit to how much work the computer can actually get done even with that many processes. You will be better served by determining the number of processors in the system and running that many concurrent tasks. Your program1 could then launch each process and watch for when it ends (capturing any error output in the meantime by redirecting the output and error streams and logging them as needed). Once a process completes, launch the next one in the queue.
When you launch that many processes, the system will be thrashing trying to switch context between 100 processes and won't get much of anything done.
You may be running into memory limits depending on how much memory your child processes allocate. You'll have a bunch of processes starting up and taking up chunks of memory but doing nothing until their turn at the processor comes around. If it can't allocate the memory, it will probably choke and kill the process (depending on how error handling is done).
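A minimal sketch of that throttling pattern, assuming a hypothetical exePath and using Environment.ProcessorCount as the concurrency cap:

using System;
using System.Collections.Generic;
using System.Diagnostics;
using System.Threading;

class ThrottledLauncher
{
    static void Main()
    {
        string exePath = @"C:\path\to\program2.exe"; // hypothetical path
        int totalInstances = 200;
        int maxConcurrent = Environment.ProcessorCount;

        var running = new List<Process>();
        int launched = 0;

        while (launched < totalInstances || running.Count > 0)
        {
            // Top up to the concurrency limit.
            while (running.Count < maxConcurrent && launched < totalInstances)
            {
                var p = new Process();
                p.StartInfo.FileName = exePath;
                p.StartInfo.UseShellExecute = false;
                p.Start();
                running.Add(p);
                launched++;
            }

            // Reap finished children and free their slots.
            running.RemoveAll(p =>
            {
                if (!p.HasExited) return false;
                p.Dispose();
                return true;
            });

            Thread.Sleep(100); // avoid a busy loop
        }
    }
}

This keeps at most one worker per core alive, so the system never has to juggle 100 idle processes.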

That is strange, as there is no hard limit by default. But of course, it depends on what the launched process is doing (memory consumption, handle allocation, files, etc.). For example, I have tested this with notepad.exe on my machine and I get 150 instances of notepad.exe running if I specify 150.
You can check here for a very interesting discussion on process limits: Pushing the Limits of Windows: Processes and Threads.

First, I definitely agree with @Garo Yeriazarian. But to be thorough, I'd recommend checking out this blog post:
http://xentelworker.blogspot.com/2005/10/i-open-100-explorer-windows-and-my.html

Related

Strategies for streamlining a workflow of multiple apps

We have a bit of a complicated scenario in the office where we have multiple standalone applications that can also be combined into a single workflow. I'm now looking into strategies to avoid running half a dozen apps for this one workflow and I'm fairly confident that the most appropriate solution is to write an over-arching app that runs these smaller apps in sequence.
The apps don't rely on each other's results as such, but they must be run in a specific order: you can't run step 2 if step 1 fails, and so on. Roll-back isn't necessary. Some of the apps are used in standalone scenarios as well as in this workflow, so a controlling application would let me re-use those apps rather than duplicate code.
A controlling app also allows the workflow to be extensible; I can "plug in" a new step between step 1 and step 2 if the workflow needs amending. Further, it should let me do things like build a queue system so that the workflow can run continuously.
Am I on the right track with my thoughts? Are there limitations to this approach?
1) If you have the source code of these smaller apps, the best thing to do would be to recreate them as a single application that acts as a "workspace", with the various steps of the work included directly in this bigger app.
The benefits of this approach:
- faster execution (instead of loading a new process/application every time, you use only one)
- simpler deployment (one application is simpler than X)
- a better, customized GUI
2) If, on the other hand, you don't have the source code of these apps, so recreating them is impossible (excluding reverse-engineering), your approach seems to be the only one possible for your scenario.
In this case, if these apps don't expose an API, the simplest working approach would be to use the System.Diagnostics.Process class to start a process for every app involved whenever necessary.
Here an example of this approach:
Process process = new Process();
string path = @"C:\path\to\the\app1.exe";
ProcessStartInfo processStartInfo = new ProcessStartInfo(path);
processStartInfo.UseShellExecute = false;
process.StartInfo = processStartInfo;
process.Start();
and so on, every time you want to launch an application.
For killing these applications you have two possibilities: kill the process manually by calling process.Kill(), or let the application close itself.
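A rough sketch of that controller loop; the step paths are hypothetical, and it assumes each app signals failure with a nonzero exit code:

using System;
using System.Diagnostics;

class WorkflowRunner
{
    static void Main()
    {
        // Hypothetical step executables; replace with the real apps.
        string[] steps =
        {
            @"C:\apps\step1.exe",
            @"C:\apps\step2.exe",
            @"C:\apps\step3.exe"
        };

        foreach (string step in steps)
        {
            var psi = new ProcessStartInfo(step) { UseShellExecute = false };
            using (var process = Process.Start(psi))
            {
                process.WaitForExit();

                // Assumption: each app reports failure with a nonzero exit code.
                if (process.ExitCode != 0)
                {
                    Console.WriteLine("Step failed, aborting workflow: " + step);
                    return;
                }
            }
        }
        Console.WriteLine("Workflow completed.");
    }
}

Because the sequence is just an array, inserting a new step between two existing ones is a one-line change, which covers the extensibility requirement.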

Semaphore implementation for counting the number of instances currently running

I am working on a multi-instance application. Is there any way in C# to know how many instances are currently running? I used a piece of code that counts the processes with my application's name, but that is not a good way.
string fileName = Process.GetCurrentProcess().MainModule.FileName;
int count = 0;
foreach (Process p in Process.GetProcesses())
{
    try
    {
        if (p.MainModule.FileName == fileName)
        {
            count++;
        }
    }
    catch { }
}
MessageBox.Show("Total Instances Running are " + count);
Can it be done using a semaphore, or some counter that is incremented by one when a new instance is created and decremented by one when an instance closes?
A semaphore helps you count down, blocking you when you get to 0. You could use a global semaphore, but you'd have to initialize it at a high enough value and count down from there.
I think all in all, your own solution is probably the cleanest.
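For completeness, here is a rough sketch of the named-semaphore route. The semaphore name and the maximum of 100 are assumptions, and note that Semaphore only exposes its count as the return value of Release(), which is what makes this approach awkward (and slightly racy):

using System;
using System.Threading;

class InstanceCountViaSemaphore
{
    const int MaxInstances = 100; // assumed upper bound

    static void Main()
    {
        // The "Global\" prefix makes the semaphore visible across
        // sessions; the name itself is hypothetical.
        using (var sem = new Semaphore(MaxInstances, MaxInstances,
                                       @"Global\MyAppInstances"))
        {
            sem.WaitOne(); // claim a slot for this instance
            try
            {
                // Release() returns the count *before* the release,
                // which is the only way to peek at it. Another instance
                // can sneak in between these two calls, so treat the
                // result as approximate.
                int free = sem.Release();
                sem.WaitOne();
                Console.WriteLine("Instances running: " + (MaxInstances - free));

                Console.ReadLine(); // ... application work ...
            }
            finally
            {
                sem.Release(); // give the slot back on exit
            }
        }
    }
}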
Why don't you use shared memory? Of course you have to protect it with a mutex visible to all processes, but that way you can store commonly used data among all the processes.
I suppose you have to P/Invoke a platform routine in order to create the shared memory, but it is really straightforward.
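As a side note, on .NET 4 or later you can skip the P/Invoke: System.IO.MemoryMappedFiles.MemoryMappedFile creates named shared memory directly. A rough sketch of a mutex-protected instance counter (the map and mutex names are made up):

using System;
using System.IO.MemoryMappedFiles;
using System.Threading;

class InstanceCounter
{
    // Hypothetical names; pick ones unique to your application.
    const string MapName = "MyAppInstanceCount";
    const string MutexName = "MyAppInstanceCountMutex";

    static void Main()
    {
        // 4 bytes of shared memory holding the current instance count.
        using (var map = MemoryMappedFile.CreateOrOpen(MapName, 4))
        using (var mutex = new Mutex(false, MutexName))
        using (var view = map.CreateViewAccessor())
        {
            Adjust(view, mutex, +1); // register this instance
            try
            {
                Console.WriteLine("Instances running: " + Read(view, mutex));
                Console.ReadLine(); // ... application work ...
            }
            finally
            {
                Adjust(view, mutex, -1); // deregister on clean exit
            }
        }
    }

    static void Adjust(MemoryMappedViewAccessor view, Mutex mutex, int delta)
    {
        mutex.WaitOne();
        try { view.Write(0, view.ReadInt32(0) + delta); }
        finally { mutex.ReleaseMutex(); }
    }

    static int Read(MemoryMappedViewAccessor view, Mutex mutex)
    {
        mutex.WaitOne();
        try { return view.ReadInt32(0); }
        finally { mutex.ReleaseMutex(); }
    }
}

One caveat: if an instance is killed without reaching the finally block, the counter will drift.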
Managing multiple processes is always a problem. The OS puts up a big wall between them that makes just about anything hard and expensive. The code you are using is no exception; iterating the running processes is expensive. Always consider using threads first, and look at AppDomains in a .NET program, a feature expressly invented to provide the kind of isolation a process can provide, but without the interop cost.
If you are committed to a multi-process solution, then you almost always need a separate process that acts as an arbiter, responsible for ensuring the worker processes get started and for doing something meaningful when one of them dies on an unhandled exception. That is itself very hard to deal with, since there is no good way to get any information about the reason it died, and an enormous amount of state is lost. The typical outcome is a malfunction of the entire app, unless you give these processes very simple jobs that you could easily do without.
Such an arbiter process has no problem counting instances cheaply; it can use the Process.Exited event to maintain a counter. An event, instead of having to poll.
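A rough sketch of that arbiter idea (the worker path and count are made up); note that EnableRaisingEvents must be set for Exited to fire:

using System;
using System.Diagnostics;
using System.Threading;

class Arbiter
{
    static int _running; // live worker count, maintained by events

    static void Main()
    {
        // Hypothetical worker executable and instance count.
        for (int i = 0; i < 4; i++)
            StartWorker(@"C:\apps\worker.exe");

        Console.ReadLine();
    }

    static void StartWorker(string path)
    {
        var p = new Process();
        p.StartInfo.FileName = path;
        p.StartInfo.UseShellExecute = false;
        p.EnableRaisingEvents = true; // required for the Exited event
        p.Exited += (s, e) =>
        {
            int left = Interlocked.Decrement(ref _running);
            Console.WriteLine("Worker exited with code {0}, {1} left",
                              p.ExitCode, left);
            p.Dispose();
            // Restart the worker or log the failure here, as appropriate.
        };
        p.Start();
        Interlocked.Increment(ref _running);
    }
}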

What does it mean when Process.Start fails because "Insufficient system resources exist"

I have a C# application which launches another executable using Process.Start().
99% of the time this call works perfectly fine. After the application has run for quite some time though, Process.Start() will fail with the error message:
Insufficient system resources exist to complete the requested service
Initially I thought this must be due to a memory leak in my program. I've profiled it fairly extensively, and it doesn't appear there's a leak; the memory footprint is still reasonable even when this call fails.
Immediately after a failure like this, if I print some of the system statistics it appears that I have over 600MB of RAM free, plenty of space on disk, and the CPU usage is effectively at 0%.
Is there some other system resource I haven't thought of? Am I running into a memory limit within the .NET VM?
Edit2:
I opened up the application in SysInternals Process Explorer and it looks like I'm leaking Handles left and right:
Handles Used: 11,950,352 (!)
GDI Handles: 26
USER Handles: 22
What's strange here is that the Win32 handle counts look very reasonable, but somehow my raw handle count has exploded way out of control. Any ideas what could cause a handle leak like this? I was originally convinced it was Process.Start(), but those would be USER handles, wouldn't they?
Edit:
Here's an example of how I'm creating the process:
var pInfo = new ProcessStartInfo(path, ClientStartArguments)
{
    UseShellExecute = false,
    WorkingDirectory = workingDirectory
};
ClientProcess = Process.Start(pInfo);
Here's an example of how I kill the same process (later in the program after I have interacted with the process):
Process[] clientProcesses = Process.GetProcessesByName(ClientProcessName);
if (clientProcesses.Length > 0)
{
    foreach (var clientProcess in clientProcesses.Where(
        clientProcess => clientProcess.HasExited == false))
    {
        clientProcess.Kill();
    }
}
The problem here is with retained process handles. As we can see from your later edits you are keeping a reference to the Process object returned by Process.Start(). As mentioned in the documentation of Process:
Like many Windows resources, a process is also identified by its handle, which might not be unique on the computer. A handle is the generic term for an identifier of a resource. The operating system persists the process handle, which is accessed through the Handle property of the Process component, even when the process has exited. Thus, you can get the process's administrative information, such as the ExitCode (usually either zero for success or a nonzero error code) and the ExitTime. Handles are an extremely valuable resource, so leaking handles is more virulent than leaking memory.
I especially like the use of the word virulent. You need to dispose and release the reference to Process.
Also check out this excellent question and its corresponding answer: Not enough memory or not enough handles?
Since the Process class implements IDisposable, it is good practice to properly dispose of it when you are done. In this case, it will prevent handle leaks.
using (var p = new Process())
{
    p.StartInfo = new ProcessStartInfo(@"C:\windows\notepad.exe");
    p.Start();
    p.WaitForExit();
}
If you are calling Process.Kill() and the process has already exited, you will get an InvalidOperationException.
That's not an uncommon problem to have with little programs like this. The problem is that you are using a large amount of system resources but very little memory. You don't put enough pressure on the garbage collected heap so the collector never runs. So finalizable objects, the wrappers for system handles like Process and Thread, never get finalized.
Simply disposing the Process object after the process has exited will go a long way toward solving the problem. But it might not solve it completely: any threads that the Process class uses, or that you use yourself, consume five operating system handles each. The Thread class doesn't have a Dispose() method. It should, but it doesn't, because it is next to impossible to call it correctly.
The solution is to trigger a garbage collection yourself. Count the number of times you start a process and, every hundredth time or so, call GC.Collect(). Keep an eye on the handle count with Taskmgr.exe (use View + Select Columns to add it) and fine-tune the GC.Collect calls so that the count doesn't grow beyond, say, 500.
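A small sketch of that workaround; the interval of 100 is just the suggested starting point to tune:

using System;
using System.Diagnostics;

class ProcessLauncher
{
    private int _startCount;

    public Process Launch(string path)
    {
        // Every hundredth start, nudge the GC so the finalizable
        // wrappers (and the OS handles they hold) actually get
        // released. Tune the interval against Taskmgr's Handles column.
        if (++_startCount % 100 == 0)
        {
            GC.Collect();
            GC.WaitForPendingFinalizers();
        }
        return Process.Start(path);
    }
}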

Process.WaitForExit inconsistent across different machines

This code runs as expected on a large number of machines. However, on one particular machine the call to WaitForExit() is seemingly ignored, and the process is immediately marked as exited.
static void Main(string[] args)
{
    Process proc = Process.Start("notepad.exe");
    Console.WriteLine(proc.HasExited); // Always false
    proc.WaitForExit();                // Blocks on all but one machine
    Console.WriteLine(proc.HasExited); // **See comment below
    Console.ReadLine();
}
Note that unlike a similar question on SO, the process being called is notepad.exe (for testing reasons), so it is unlikely the fault lies with it - i.e. it is not spawning a second sub-process and closing. Even so, it would not explain why it works on all the other machines.
On the problem machine, the second call to Console.WriteLine(proc.HasExited)) returns true even though notepad is still clearly open, both on the screen and in the task manager.
The machine is running Windows 7 and .NET 4.0.
My question is; what conditions on that particular machine could be causing this? What should I be checking?
Edit - Things I've tried so far / Updates / Possibly relevant info:
Reinstalled .NET.
Closed any processes I don't know in task manager.
Windows has not yet been activated on this machine.
Following advice in the comments, I tried getting the 'existing' process Id using GetProcessesByName but that simply returns an empty array on the problem machine. Therefore, it's hard to say the problem is even with WaitForExit, as the process is not returned by calling GetProcessesByName even before calling WaitForExit.
On the problem machine, the resulting notepad process's ParentID is the ID of the notepad process the code manually starts, or in other words, notepad is spawning a child process and terminating itself.
The problem is that by default Process.StartInfo.UseShellExecute is set to true. With this set to true, rather than starting the process yourself, you are asking the shell to start it for you. That can be quite useful: it allows you to do things like "execute" an HTML file (the shell will use the appropriate default application).
It's not so good when you want to track the application after launching it (as you found), because the launching application can sometimes get confused about which instance it should be tracking.
The inner details of why this happens are probably beyond my ability to answer. I do know that when UseShellExecute == true the framework uses the ShellExecuteEx Windows API, and when UseShellExecute == false it uses CreateProcessWithLogonW, but why one leads to trackable processes and the other doesn't, I don't know, as they both seem to return the process ID.
EDIT: After a little digging:
This question pointed me to the SEE_MASK_NOCLOSEPROCESS flag, which does indeed seem to be set when using ShellExecute. The documentation for the mask value states:
In some cases, such as when execution is satisfied through a DDE conversation, no handle will be returned. The calling application is responsible for closing the handle when it is no longer needed.
So it does suggest that returning the process handle is unreliable. I still have not gotten deep enough to know which particular edge case you might be hitting here though.
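In the meantime, a minimal sketch of the usual workaround: start the process with UseShellExecute = false so the returned Process object refers to the process you actually created:

using System.Diagnostics;

class Program
{
    static void Main()
    {
        // With UseShellExecute = false the framework creates the
        // process directly, so HasExited/WaitForExit track it reliably.
        var psi = new ProcessStartInfo(@"C:\Windows\notepad.exe")
        {
            UseShellExecute = false
        };
        using (var proc = Process.Start(psi))
        {
            proc.WaitForExit(); // now blocks as expected
        }
    }
}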
A cause could be a virus that has replaced notepad.exe in order to hide itself. If executed, it spawns the real notepad and exits (just a guess).
Try this code:
var process = Process.Start("notepad.exe");
var process2 = Process.GetProcessById(process.Id);
while (!process2.HasExited)
{
    Thread.Sleep(1000);
    try
    {
        process2 = Process.GetProcessById(process.Id);
    }
    catch (ArgumentException)
    {
        break;
    }
}
MessageBox.Show("done");
After Process.Start(), check the process ID of notepad.exe in Task Manager and verify that it is the same as process.Id.
Oh, and you really should use the full path to notepad.exe:
var notepad = Path.Combine(Environment.GetFolderPath(
Environment.SpecialFolder.Windows), "notepad.exe");
Process.Start(notepad);

Can a C# program measure its own CPU usage somehow?

I am working on a background program that will be running for a long time, and I have an external logging program (SmartInspect) that I want to feed with some values periodically, to monitor it in real time when debugging.
I know I can simply fire up multiple programs, like Task Manager or IARSN TaskInfo, but I'd like to keep everything in my own program for this, as I also want to add some simple rules like: if the program uses more than X% CPU, flag this in the log.
I have a background thread that periodically feeds some statistics to SmartInspect, like memory consumption, working set, etc.
Is it possible for this thread to get a reasonably accurate measure of how much of the computer's CPU resources it consumes? The main program is a single-threaded application (apart from the watchdog thread that logs statistics), so if a technique is limited to measuring a single thread, that would be fine too.
I found some entries related to something called rusage for Linux and C. Is there something similar I can use for this?
Edit: OK, I tried the performance counter route, but it added quite a lot of GC data on each call, so the graphs for memory usage and garbage collection skyrocketed. I guess I'll just leave this part out for now.
You can also use the System.Diagnostics.Process.TotalProcessorTime and System.Diagnostics.ProcessThread.TotalProcessorTime properties to calculate your processor usage, as this article describes.
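A rough sketch of that approach for the current process: sample TotalProcessorTime periodically and divide the CPU-time delta by the wall-clock delta, normalized by the core count:

using System;
using System.Diagnostics;
using System.Threading;

class CpuSampler
{
    static void Main()
    {
        Process proc = Process.GetCurrentProcess();
        TimeSpan lastCpu = proc.TotalProcessorTime;
        DateTime lastSample = DateTime.UtcNow;

        while (true)
        {
            Thread.Sleep(1000); // sampling interval (assumed)

            proc.Refresh(); // discard cached process information
            TimeSpan cpu = proc.TotalProcessorTime;
            DateTime now = DateTime.UtcNow;

            // CPU time used / wall time elapsed, spread over all cores,
            // gives a 0-100% figure comparable to Task Manager's.
            double percent = (cpu - lastCpu).TotalMilliseconds
                             / (now - lastSample).TotalMilliseconds
                             / Environment.ProcessorCount * 100.0;

            Console.WriteLine("CPU: {0:F1}%", percent);
            lastCpu = cpu;
            lastSample = now;
        }
    }
}

The same delta technique works per thread with ProcessThread.TotalProcessorTime, which covers the single-thread case in the question.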
Have a look at System.Diagnostics.PerformanceCounter. If you run up perfmon.exe, you'll see the range of performance counters available to you (set the 'performance object' to 'Process'), one of which is '% Processor Time'.
You can do this through the System.Diagnostics.PerformanceCounter class. Here's an example of somebody monitoring CPU usage:
http://blogs.msdn.com/dotnetinterop/archive/2007/02/02/system-diagnostics-performancecounter-and-processor-time-on-multi-core-or-multi-cpu.aspx
Note that this does require elevated privileges. And there may be a performance hit using it.
It is good that you are logging to monitors like SmartInspect, but Windows itself gathers data for each resource, in this case your program (or process). WMI is the standard for application monitoring, and the data it captures can be viewed by many application management and health monitoring tools out of the box.
So I would not recommend logging your CPU usage to a log file from within the application.
If you think availability and performance are critical, then go for a solution like Microsoft Operations Manager.
To get an idea about WMI and to get the list of processes, see below:
- Win32_PerfFormattedData_PerfProc_Process gives CPU time; filter by process ID (see this article).
- You can get the process ID from the Win32_Process class.
From WMI Made Easy for C# by Kevin Matthew Goss:
ConnectionOptions oConn = new ConnectionOptions();
oConn.Username = "JohnDoe";
oConn.Password = "JohnsPass";
System.Management.ManagementScope oMs = new System.Management.ManagementScope(@"\\MachineX", oConn);
//get fixed disk stats
System.Management.ObjectQuery oQuery = new System.Management.ObjectQuery("select FreeSpace,Size,Name from Win32_LogicalDisk where DriveType=3");
//Execute the query
ManagementObjectSearcher oSearcher = new ManagementObjectSearcher(oMs, oQuery);
//Get the results
ManagementObjectCollection oReturnCollection = oSearcher.Get();
//loop through found drives and write out info
foreach (ManagementObject oReturn in oReturnCollection)
{
    // Disk name
    Console.WriteLine("Name : " + oReturn["Name"].ToString());
    // Free space in bytes
    Console.WriteLine("FreeSpace: " + oReturn["FreeSpace"].ToString());
    // Size in bytes
    Console.WriteLine("Size: " + oReturn["Size"].ToString());
}
You can monitor the process from Remote system as well.
This CodeProject article describes how to use the high-performance timer:
http://www.codeproject.com/KB/cs/highperformancetimercshar.aspx
You can use it to time the execution of your code.
Here you can find a number of open source C# profilers:
http://csharp-source.net/open-source/profile
