Process.Handle is not constant? - C#

Process.Handle is returning different values every time, is this correct?
From msdn, "The handle that the operating system assigned to the associated process when the process was started. The system uses this handle to keep track of process attributes."
https://learn.microsoft.com/en-us/dotnet/api/system.diagnostics.process.handle?view=netcore-3.1
Isn't it supposed to be constant, if not unique?
I've tried different ways of getting the process, but the Handle is different every time.
Ex:
Process.GetProcesses().FirstOrDefault(...)
OR
Process.GetProcessById(123)
OR
Process.GetProcessesByName("xyz")
I'm trying to hold the process id of a process that is launched by my application. This id will be used "later" to get the process from the running processes and stop it, if it is still running in a particular case. I don't want to match by name, since the same application can be launched externally.
I'm trying to add another layer of validation to make sure no other process is running with the same id later (if my expected process has already stopped and the same id was reused for a restart of the same application, or for any other application).
Is the Process.Handle property expected to vary, or is there another way to do this?

It's "a handle for a process" not "the handle for a process". It is a quantity that allows a program to operate on a process, not the sole identifier of a process.
The same is true for any other Windows kernel handle. For example, if you open the same file twice by 'the usual methods', you'll have two different handles for the same thing.
As long as the process has not terminated or there is still one handle open that refers to the process, its process id is the actual unique identification. The id can be reused later however.
I can't answer why the documentation says what it does.

I'm trying to hold a process id of a process that is launched by my application and this id will be used "later" to get the process from the running processes to stop it,
Just hold onto the Process instance, then you can skip the "get the process from the running processes".
You need to do this anyway to prevent the process ID from being reused. The process ID is constant for the lifetime of the process-tracking kernel data object, which really means "as long as either the process is alive or someone keeps an open handle to it". By keeping a Process object instance, you keep a handle open, and keep the process ID valid.
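A minimal sketch of that approach (the class and member names here are illustrative, and the executable path is whatever your application launches):

```csharp
using System;
using System.Diagnostics;

class ProcessTracker
{
    // Keeping this reference (undisposed) keeps a handle open,
    // so the OS cannot recycle the process id while we hold it.
    private Process _child;

    public bool IsRunning => _child != null && !_child.HasExited;

    public void Launch(string exePath)
    {
        _child = Process.Start(exePath);
    }

    public void StopIfStillRunning()
    {
        if (_child != null && !_child.HasExited)
        {
            _child.Kill();
            _child.WaitForExit();
        }
        _child?.Dispose();
        _child = null;
    }
}
```

Because the Process instance stays alive for the child's lifetime, the kernel object survives and the id stays valid until StopIfStillRunning releases it.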

Related

Need to communicate with child process to process a file

I have several 3rd party DLLs which are super flaky. They can sometimes hang and never return or sometimes throw weird exceptions which can bring down the whole process.
I want to move these DLLs out and load them in a separate child process. That way, instead of having to do a nasty Thread.Abort, I can just bring the process down cleanly and later re-spawn it when required.
So my parent application receives a list of files that need to be processed by certain third-party DLLs. I essentially need to pass these to the new child process, let it process the file, and then communicate back to the parent that it was successful. I must also be able to bring down the process if sh*t hits the fan and re-spawn it. These files come as a constant stream, so spawning a process every time I get a file is not possible; I'd want it to hang around and just accept requests.
Right now I'm spawning the child process from the parent and then attempting to use memory-mapped files to share the files/work. Would it be easier just passing the location of said file and somehow getting a response when it's processed?
What would be a good strategy here...
I would....
Create a WCF service, using PerCall instancing, that hosts the DLLs and does the file processing - this would spawn a new instance for each call, and if one goes down it should not affect the others. You could host it as part of your main app, but maybe better as a separate Windows service; as it's probably going to be on the same machine, use the named pipes transport.
Fire each request at it from your main app.
If you don't get a successful response (as long as it's not a WCF exception - i.e. endpoint not found), just retry the request for x number of times.
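The retry in the last step could be wrapped generically. Retry, Execute, and maxAttempts below are made-up names, and the operation you pass in would be whatever service-proxy call your WCF contract defines:

```csharp
using System;

static class Retry
{
    // Runs the operation up to maxAttempts times; rethrows the last
    // failure so the caller still sees what went wrong.
    public static T Execute<T>(Func<T> operation, int maxAttempts)
    {
        if (maxAttempts < 1)
            throw new ArgumentOutOfRangeException(nameof(maxAttempts));

        Exception last = null;
        for (int attempt = 1; attempt <= maxAttempts; attempt++)
        {
            try { return operation(); }
            catch (Exception ex) { last = ex; }
        }
        throw last;
    }
}
```

Usage would then look something like `bool ok = Retry.Execute(() => proxy.ProcessFile(path), 3);` where `proxy.ProcessFile` stands in for your actual service operation. In a real system you would likely exclude endpoint-not-found (and other non-transient) exceptions from the retry, as the answer notes.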

Passing information between separate console and Windows applications

I have two separate programs: one is a console application, and the other is a Windows application.
My Windows application:
Has a graphical interface, buttons, and other functions.
One of the buttons, named "research": when I click on it, I launch the console application with this line of code:
string strResult = ProcessHelper.LaunchProcessWaitForPipedResult("MyExecFile.exe", strArguments, 10 * 60 * 1000, true); // 10 mins max
My console application:
Queries all existing files in a directory.
My problem:
I want to create a progress-bar on the windows application to show the progress of the console application. The problem is I don't know how to pass this information between the two processes. The only restriction is to not use a database or file.
Given two processes in the same user session, and wanting to avoid any communication outside that session I would look at three options:
1. Using named pipes.
The parent process creates a named pipe using a random name (and confirms that name is not in use by opening it). It passes that name to the child process. A simple protocol is used that allows the child to send updates.
There are a number of challenges to overcome:
Getting the logic that ensures the name is unique right (named pipe names are global).
Ensuring no other process can connect (the default named pipe ACL limits connections to the session: this might be enough).
Handling the case where a different parent process does not support progress updates.
Handling the child or parent crashing.
Avoiding getting too clever with the communication protocol, but allowing room for growth (what happens when more than a simple progress bar is wanted?)
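A stripped-down sketch of option 1 using System.IO.Pipes. The child here runs as a task in the same process purely to keep the example self-contained, and the pipe name would in practice be the randomly generated one the parent passes to the child; the one-byte-per-update protocol is about as simple as the answer recommends starting with:

```csharp
using System;
using System.Collections.Generic;
using System.IO.Pipes;
using System.Threading.Tasks;

class PipeProgressDemo
{
    // Parent creates the pipe and reads progress updates; the "child"
    // is simulated with a task so the sketch stays self-contained.
    public static List<int> Run(string pipeName)
    {
        var received = new List<int>();
        using var server = new NamedPipeServerStream(pipeName, PipeDirection.In);

        var child = Task.Run(() =>
        {
            using var client = new NamedPipeClientStream(".", pipeName, PipeDirection.Out);
            client.Connect(5000);
            for (int pct = 0; pct <= 100; pct += 25)
                client.WriteByte((byte)pct);   // one byte per update: 0..100
        });

        server.WaitForConnection();
        int b;
        while ((b = server.ReadByte()) != -1)  // -1 once the child closes its end
            received.Add(b);
        child.Wait();
        return received;
    }
}
```

A real protocol would want framing and a version byte so there is "room for growth", per the last challenge above.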
2. Using Shared Memory
In this case, object names are local to the session by default, which also makes this option more secure by default.
The parent process creates a sufficiently large amount of shared memory (for a simple progress update: not much), a mutex and an event.
The parent process then, concurrently with the GUI, waits for the event to be signalled; when it is, it enters the mutex and reads the content of the shared memory. It then unsets the event and leaves the mutex.
Meanwhile, to send an update, the child enters the mutex, updates the memory, and sets the event before leaving the mutex.
The challenges here include:
Defining the layout of the shared memory. Without a shared assembly this is likely to be error prone.
Avoiding other processes using the shared memory and synchronisation objects. .NET makes things harder here: in Win32 I would make the handles inheritable, thus not needing to name the objects (except for debugging), and pass them to the child directly.
Getting the sequencing of shared memory, mutex and event correct is critical. Memory corruption and more subtle bugs await any errors.
It is harder to handle variable-length data with shared memory; not an issue for a simple progress count, but customers always want more.
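Option 2 in miniature, with the caveat that this sketch uses an anonymous map and unnamed synchronisation objects, so it only demonstrates the mutex/event/memory sequencing within one process; the cross-process .NET version on Windows would use named equivalents (or inherited handles, as described above):

```csharp
using System.IO.MemoryMappedFiles;
using System.Threading;

class SharedProgress
{
    private readonly MemoryMappedFile _map;
    private readonly MemoryMappedViewAccessor _view;
    private readonly Mutex _mutex = new Mutex();
    private readonly EventWaitHandle _updated =
        new EventWaitHandle(false, EventResetMode.AutoReset);

    public SharedProgress()
    {
        // 4 bytes is plenty for a simple percentage counter.
        _map = MemoryMappedFile.CreateNew(null, 4);
        _view = _map.CreateViewAccessor();
    }

    // Child side: enter the mutex, update the memory, set the event.
    public void Report(int percent)
    {
        _mutex.WaitOne();
        try { _view.Write(0, percent); }
        finally { _mutex.ReleaseMutex(); }
        _updated.Set();
    }

    // Parent side: wait for the signal, then read under the mutex.
    // The AutoReset event handles the "unset the event" step.
    public int WaitForUpdate()
    {
        _updated.WaitOne();
        _mutex.WaitOne();
        try { return _view.ReadInt32(0); }
        finally { _mutex.ReleaseMutex(); }
    }
}
```

Getting exactly this sequencing wrong is the source of the subtle bugs mentioned above, which is why it is worth keeping the layout to a single fixed-size value for as long as possible.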
Summary
I would probably look at named pipes in the first place (or perhaps custom WMI types if I wanted greater flexibility). BUT I would do that only after trying everything to avoid needing multiple processes in the first place. A shared library plus a console wrapper for others, while I use the library directly, would be a far easier option.

Synchronize multiple processes to begin working at the same time?

I'm working in .NET 4 in C#. I have LauncherProgram.exe that will create multiple instances of WorkerProgram.exe, have them do some work on the arguments supplied when the process is created, and then LauncherProgram.exe will launch a new set of WorkerProgram.exe instances to do some different work.
Each WorkerProgram.exe is launched with some parameters that tell it what to work on, and there can be one or more WorkerProgram.exe launched at the same time. The WorkerProgram.exe reads the supplied parameters, performs some initialization, and then is ready to do the work.
What I'm trying to figure out is how to make each set of WorkerProgram.exe launched at the same time "tell" or "signal" or "I can't figure out the proper term" the LauncherProgram.exe that EACH process has completed the initialization step and is ready to begin. I want to synchronize the start of the "do your work" in the WorkerProgram.exe instances launched in a set.
I'm setting up my LauncherProgram.exe to work something like this (ignoring types for now):
while (there are sets of work to do)
{
    for each set of work
    {
        for each group of data in the set
            create and launch a WorkerProgram.exe for a single set of data
        wait for all created WorkerProgram.exe instances to indicate init is complete
        send the signal to start processing
    }
}
I actually have a small test program where I use named events to signal multiple spawned processes to START something at the same time.
(Hopefully all the above makes sense)
I just can't figure out the "wait for N processes to tell me their initialization is ready" bit.
I've searched for "process synchronization" and "rendezvous" and looked at using named events and named semaphores. I can find a bunch of things about threads, but less about separate processes.
LauncherProgram.exe creates the WorkerProgram.exe processes using the System.Diagnostics.Process class, if that helps.
If you can give me better terms to help narrow my search, or point me to a design pattern or mechanism, or a library or class that helps, I'd be very appreciative.
Thanks.
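The "wait for N ready signals, then release everyone at once" rendezvous can be sketched with WaitHandles. Threads stand in for the worker processes here so the example is self-contained; across real processes on Windows, the same pattern works with named EventWaitHandles (one "ready" event per worker plus one shared "go" event, with the names passed as startup parameters):

```csharp
using System;
using System.Threading;

class RendezvousDemo
{
    public static int Run(int workerCount)
    {
        // One "init done" event per worker, one shared "start" event.
        var ready = new ManualResetEvent[workerCount];
        var go = new ManualResetEvent(false);
        int started = 0;

        for (int i = 0; i < workerCount; i++)
        {
            ready[i] = new ManualResetEvent(false);
            var mine = ready[i];
            new Thread(() =>
            {
                // ... per-worker initialization would happen here ...
                mine.Set();       // tell the launcher this worker is ready
                go.WaitOne();     // block until the launcher releases everyone
                Interlocked.Increment(ref started);
            }).Start();
        }

        WaitHandle.WaitAll(ready);  // launcher: wait for N "ready" signals
        go.Set();                   // launcher: start the whole set at once

        while (Volatile.Read(ref started) < workerCount)
            Thread.Sleep(10);
        return started;
    }
}
```

Note that WaitHandle.WaitAll is limited to 64 handles per call, which matters if a set can contain more workers than that.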
You can use the System.Threading.Mutex class for interprocess communication. See http://msdn.microsoft.com/en-us/library/system.threading.mutex(v=vs.110).aspx. It is probably easiest to name each Mutex, giving the process id of WorkerProgram.exe or some other distinguishing characteristic as the name.
You can use some form of interprocess communication, but the simple way to do it is with temp files: for instance, each WorkerProgram writes DONE to its own file, and the Launcher reads periodically until all of the WorkerProgram instances have written DONE. Or you could even create a FileMapping in Windows to share memory between processes with file backing.
Other ways to do it include remote procedure calls, sockets, and simple file mappings.

How to start a process with extra data and then search for this process?

I have an agent program that will launch multiple instances of the same executable. Each of those instances needs to have an individual ID associated with it.
The agent retains a reference to the Process object that was used to load the instance, but I have to consider that the agent may be shut down and restarted without affecting the started instances.
Once the agent starts again, I need it to search the existing processes and rebind a reference to the processes.
How can I assign data to a process and retrieve it afterwards?
Right now, I am starting the process like this:
this.AttachedProcess = new Process()
{
StartInfo = new ProcessStartInfo(filename)
};
And, later, I need to search for that process by calling Process.GetProcesses().
While I could use a command line argument to start the process (something like -instance XX) and read that command line using this answer, I'd like to know if there is another way to assign extra data to a process and retrieve it later.
You could save the Process.Id of the processes you create in a file.
Upon startup you read that file and check whether those processes are still running and whether the executable name matches (if the system has been restarted, some other processes might have been given those ids).
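That revalidation step might look like the following sketch; TryRebind is a made-up name, and matching on ProcessName is only the heuristic described above, not a guarantee:

```csharp
using System;
using System.Diagnostics;

static class ProcessRebinder
{
    // Returns the process only if a process with that id still exists
    // AND its executable name matches what we recorded earlier.
    public static Process TryRebind(int savedId, string expectedName)
    {
        try
        {
            var p = Process.GetProcessById(savedId);
            return string.Equals(p.ProcessName, expectedName,
                                 StringComparison.OrdinalIgnoreCase) ? p : null;
        }
        catch (ArgumentException)
        {
            return null; // no process with that id is running any more
        }
    }
}
```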
You could create a serializable class ProcessInfo that stores the process ID and any other information you want to associate with the process. When the agent shuts down (such as if the service stops, or when it gets disposed, or a closing event gets fired, etc) have it serialize the process info to a file. When it starts up again, it should check for and read in the process info file, which will essentially restore the agent to the state it was in just before it was shut down.
The main idea here is that the agent should be maintaining this information, not Windows or the individual processes that are running. Requesting auxiliary data from a process requires COM, WCF, or some other messaging service, and this is overkill for the kind of interaction you're talking about.
See the System.Runtime.Serialization namespace, particularly the DataContractSerializer class.
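A round trip with DataContractSerializer might look like this; the ProcessInfo fields shown are illustrative, and a real agent would store whatever it needs to rebind:

```csharp
using System;
using System.IO;
using System.Runtime.Serialization;

[DataContract]
public class ProcessInfo
{
    [DataMember] public int ProcessId { get; set; }
    [DataMember] public string InstanceId { get; set; }
}

public static class ProcessInfoStore
{
    // Writes the agent's state to disk on shutdown...
    public static void Save(ProcessInfo info, string path)
    {
        var serializer = new DataContractSerializer(typeof(ProcessInfo));
        using var stream = File.Create(path);
        serializer.WriteObject(stream, info);
    }

    // ...and restores it on the next startup.
    public static ProcessInfo Load(string path)
    {
        var serializer = new DataContractSerializer(typeof(ProcessInfo));
        using var stream = File.OpenRead(path);
        return (ProcessInfo)serializer.ReadObject(stream);
    }
}
```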

Listing and terminating processes in .NET

How can I list and terminate existing processes in .Net application? Target applications are 1) .Net applications, 2) they are instances of the same executable 3) they have unique Ids but I don't know how to get this information from them, 4) they are spawned externally (i.e. I do not have handles to them as I don't create them).
I want to list all processes, get unique ids from them and restart some of them. I assume that all of them are responsive.
You can grab a list of running processes with Process.GetProcesses static method. You can easily query the return value (possibly with LINQ) to get the ones you want.
System.Diagnostics.Process.GetProcesses()
someProcess.Kill(); // note that Kill is an instance method on a Process object, not a static method
Check this out for killing processes:
http://www.dreamincode.net/code/snippet1543.htm
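Putting those pieces together, a sketch of listing and terminating by name (the method name is made up, and calling Kill on processes you don't own requires sufficient rights):

```csharp
using System;
using System.Diagnostics;

static class ProcessTerminator
{
    // Kills every running process whose executable name matches,
    // and returns how many were terminated.
    public static int KillAllByName(string name)
    {
        int killed = 0;
        foreach (var p in Process.GetProcessesByName(name))
        {
            Console.WriteLine($"killing {p.ProcessName} (id {p.Id})");
            p.Kill();
            p.WaitForExit();
            killed++;
        }
        return killed;
    }
}
```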
The process id is available as the Id property on a Process instance. E.g.:
someProcess.Id
All of the methods available on the process are listed here:
http://msdn.microsoft.com/en-us/library/system.diagnostics.process_methods.aspx
I'd like to refer you to this question on interprocess communication, and also this tutorial. You can use WCF to query a process and request a shutdown. Each process will need its own named pipe. You can generate a unique name at startup, based on the process ID (Process.GetCurrentProcess().Id).
All this may be a little heavyweight for some simple communication, though. Using the Windows message queue might be an option as well. You can use process.MainWindowHandle to get a process's window handle and send custom messages to instances of your application. See Messages and Message Queues. If you choose to go that way, pinvoke could be of help.