For reasons of memory isolation and stability I need to execute some methods/classes as sub-processes, not threads.
I found the following library that allows me to do just that: https://github.com/tmds/Tmds.ExecFunction.
However, I can't seem to find a way to get the process ID of the child process.
ExecFunction.Run(() => new NewSubProcessClass());
The code above will generate a new process and will even produce console output in a Linux Docker container. The problem is that I have no process ID.
public static Process Start(Action action, Action<ExecFunctionOptions> configure = null);
The method quoted above looked like it should do the trick; however, when I replace
ExecFunction.Run(() => new NewSubProcessClass());
with
Process process = ExecFunction.Start(() => new NewSubProcessClass());
I no longer get any console output.
UPDATE 0:
The suggestion from @bazzilic
process.WaitForExit();
is the solution to the problem.
If you examine the source code for Tmds.ExecFunction, both methods .Run(...) (here) and .Start(...) (here) under the hood call the same private method .Start(...) (source here). The two differences are:
the public .Start(...) method returns the process object from the tuple returned by the private .Start(...) method whereas .Run(...) does not;
the .Run(...) method also supplies waitForExit: true in the parameters.
In the code of the private .Start(...) method, there is this portion:
if (waitForExit)
{
process.WaitForExit();
}
Other than that, the two public methods are identical. The difference in your code behavior is most likely due to not waiting for the child process to finish in your parent process. Try this:
Process process = ExecFunction.Start(() => new NewSubProcessClass());
process.WaitForExit();
PS: Keep in mind that Process is IDisposable, so it might be better to do:
using (Process process = ExecFunction.Start(() => new NewSubProcessClass()))
{
    process.WaitForExit();
}
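Putting the pieces together, a minimal sketch (reusing your NewSubProcessClass) that captures the child's process ID and still waits for the child to exit:

using (Process process = ExecFunction.Start(() => new NewSubProcessClass()))
{
    // The Process object exposes the child's PID as soon as Start returns
    Console.WriteLine("child PID: " + process.Id);

    // Without this, the parent may finish before the child has produced its output
    process.WaitForExit();
}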
I want to inject an environment variable into a newly created thread so that the processes started inside the thread can see this variable.
So far it does not work. I tried it like this:
var startInfo = new ProcessStartInfo();
startInfo.EnvironmentVariables["foo"] = "bar";
applicationThread = new Thread(new ThreadStart(scanner.start));
applicationThread.Start();
But the processes started inside my thread do not see the environment variable "foo".
Thanks
Thread.Start has an overload taking an Object parameter. You can pass your variable this way.
From MSDN (Thread.Start(Object) method documentation):
Causes the operating system to change the state of the current instance to ThreadState.Running, and optionally supplies an object containing data to be used by the method the thread executes.
You should be able to use it like this:
applicationThread.Start(startInfo.EnvironmentVariables["foo"]);
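Fleshing that out into a minimal sketch (Scanner, its Start method, and someTool.exe are assumptions about your code): the thread is created with ParameterizedThreadStart so the entry point can accept the object, and the value is applied to the ProcessStartInfo of the process launched inside the thread. Note that UseShellExecute must be false for EnvironmentVariables to be honored.

using System.Diagnostics;
using System.Threading;

class Scanner
{
    // Receives the value passed to Thread.Start(object)
    public void Start(object data)
    {
        var startInfo = new ProcessStartInfo("someTool.exe")
        {
            UseShellExecute = false // required for EnvironmentVariables to take effect
        };
        startInfo.EnvironmentVariables["foo"] = (string)data;

        using (var process = Process.Start(startInfo))
        {
            process.WaitForExit();
        }
    }
}

class Program
{
    static void Main()
    {
        var scanner = new Scanner();
        var applicationThread = new Thread(new ParameterizedThreadStart(scanner.Start));
        applicationThread.Start("bar"); // ends up as the data parameter of Scanner.Start
        applicationThread.Join();
    }
}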
Below is my code to start the process (the link is for demo purposes only). I want this process to run in the background without opening a browser. Also, the second line throws the exception:
Object reference not set to an instance of an object.
var process=Process.Start("http://www.google.com");
process.WaitForExit();
Because when you start a process indirectly you won't get a Process object (in your case process is always null, so the second line throws an exception).
Let me explain what I mean by indirectly: if you don't specify an executable but a document (or a resource), it will be executed through a shell verb. In this case an existing process may be (re)used, and Process.Start() will return null.
Try this:
Create an empty Word document 'c:\test.docx'.
Close all Word instances.
Execute Process.Start(@"c:\test.docx"); // Returns a Process instance
Execute Process.Start(@"c:\test.docx"); // Returns null
Can you simply solve this? AFAIK you can't, because Process uses ShellExecuteEx with a SHELLEXECUTEINFO structure to start the process. Reading the SHELLEXECUTEINFO documentation for the hProcess field, you'll see that:
A handle to the newly started application. This member is set on return and is always NULL unless fMask is set to SEE_MASK_NOCLOSEPROCESS. Even if fMask is set to SEE_MASK_NOCLOSEPROCESS, hProcess will be NULL if no process was launched. For example, if a document to be launched is a URL and an instance of Internet Explorer is already running, it will display the document. No new process is launched, and hProcess will be NULL.
Note ShellExecuteEx does not always return an hProcess, even if a process is launched as the result of the call. For example, an hProcess does not return when you use SEE_MASK_INVOKEIDLIST to invoke IContextMenu.
Note: if you're running a new process just to open a URL and fetch a server-side generated file, then you should follow Damien's suggestion and use WebClient.DownloadFile().
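A rough sketch of that alternative (the URL and target path are placeholders): the file is downloaded directly, so no browser process is ever started and there is no Process object to worry about.

using System.Net;

class Downloader
{
    static void Main()
    {
        using (var client = new WebClient())
        {
            // Downloads the server-generated file directly; no browser is launched
            client.DownloadFile("http://www.example.com/report", @"C:\temp\report.pdf");
        }
    }
}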
Process.Start() can return a null reference:
Return Value
Type: System.Diagnostics.Process
A new Process component that is associated with the process resource, or null, if no process resource is started (for example, if an existing process is reused).
(Emphasis mine)
When that happens, you'll get a NullReferenceException when trying to call WaitForExit().
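The defensive version of the original two lines therefore simply checks the return value before waiting (a quick sketch):

var process = Process.Start("http://www.google.com");
if (process != null)
{
    process.WaitForExit();
}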
In a loop, I'm dynamically defining the function that will run on the Click event of a control.
The function is the following:
public static void TryOpenFile(string filename, EventHandler callback)
{
Process proc;
proc = Process.Start(filename);
if (callback != null)
{
proc.EnableRaisingEvents = true;
proc.Exited += (a, b) =>
{
callback(a, b);
};
}
}
And then:
for(int i = 0; i < numberOfControls; i++)
{
controlImg.SetFileToOpen(file,
delegate
{
//exited!
});
}
Looks like your filename points to an already running process. As per the MSDN documentation:
Return Value
Type: System.Diagnostics.Process
A new Process component that is associated with the process resource, or null, if no process resource is started (for example, if an existing process is reused).
Update: If your filename is a bad filename, it will obviously throw an Exception.
The documentation for this overload of Process.Start explains what is happening (emphasis mine):
Use this overload to start a process resource by specifying its file name. The overload associates the resource with a new Process component. If the process is already running, no additional process resource is started. Instead, the existing process resource is reused and no new Process component is created. In such a case, instead of returning a new Process component, Start returns null to the calling procedure.
A new process may not be started if you are using ShellExecute to open a file via its association, rather than running an executable. For instance, if filename is (for example) "C:\Test.xls", it might start Excel. But if Excel is already running, it might open the file in the existing instance rather than starting a new process. In that case, the value of proc would be null.
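One way to make TryOpenFile tolerate that case (a sketch, keeping your signature): treat a null return as "nothing to wait for" and invoke the callback right away.

public static void TryOpenFile(string filename, EventHandler callback)
{
    Process proc = Process.Start(filename);

    if (proc == null)
    {
        // An existing process was reused, so there is no Exited event to subscribe to
        if (callback != null)
            callback(null, EventArgs.Empty);
        return;
    }

    if (callback != null)
    {
        proc.EnableRaisingEvents = true;
        proc.Exited += (a, b) => callback(a, b);
    }
}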
Background
I have a Windows service that uses various third-party DLLs to perform work on PDF files. These operations can use quite a bit of system resources, and occasionally seem to suffer from memory leaks when errors occur. The DLLs are managed wrappers around other unmanaged DLLs.
Current Solution
I'm already mitigating this issue in one case by wrapping a call to one of the DLLs in a dedicated console app and calling that app via Process.Start(). If the operation fails and there are memory leaks or unreleased file handles, it doesn't really matter. The process will end and the OS will recover the handles.
I'd like to apply this same logic to the other places in my app that use these DLLs. However, I'm not terribly excited about adding more console projects to my solution and writing even more boilerplate code that calls Process.Start() and parses the output of the console apps.
New Solution
An elegant alternative to dedicated console apps and Process.Start() seems to be the use of AppDomains, like this: http://blogs.geekdojo.net/richard/archive/2003/12/10/428.aspx
I've implemented similar code in my application, but the unit tests have not been promising. I create a FileStream to a test file in a separate AppDomain, but don't dispose it. I then attempt to create another FileStream in the main domain, and it fails due to the unreleased file lock.
Interestingly, adding an empty DomainUnload event to the worker domain makes the unit test pass. Regardless, I'm concerned that maybe creating "worker" AppDomains won't solve my problem.
Thoughts?
The Code
/// <summary>
/// Executes a method in a separate AppDomain. This should serve as a simple replacement
/// of running code in a separate process via a console app.
/// </summary>
public T RunInAppDomain<T>( Func<T> func )
{
AppDomain domain = AppDomain.CreateDomain ( "Delegate Executor " + func.GetHashCode (), null,
new AppDomainSetup { ApplicationBase = Environment.CurrentDirectory } );
domain.DomainUnload += ( sender, e ) =>
{
// this empty event handler fixes the unit test, but I don't know why
};
try
{
domain.DoCallBack ( new AppDomainDelegateWrapper ( domain, func ).Invoke );
return (T)domain.GetData ( "result" );
}
finally
{
AppDomain.Unload ( domain );
}
}
public void RunInAppDomain( Action func )
{
RunInAppDomain ( () => { func (); return 0; } );
}
/// <summary>
/// Provides a serializable wrapper around a delegate.
/// </summary>
[Serializable]
private class AppDomainDelegateWrapper : MarshalByRefObject
{
private readonly AppDomain _domain;
private readonly Delegate _delegate;
public AppDomainDelegateWrapper( AppDomain domain, Delegate func )
{
_domain = domain;
_delegate = func;
}
public void Invoke()
{
_domain.SetData ( "result", _delegate.DynamicInvoke () );
}
}
The unit test
[Test]
public void RunInAppDomainCleanupCheck()
{
const string path = @"../../Output/appdomain-hanging-file.txt";
using( var file = File.CreateText ( path ) )
{
file.WriteLine( "test" );
}
// verify that file handles that aren't closed in an AppDomain-wrapped call are cleaned up after the call returns
Portal.ProcessService.RunInAppDomain ( () =>
{
// open a test file, but don't release it. The handle should be released when the AppDomain is unloaded
new FileStream ( path, FileMode.Open, FileAccess.ReadWrite, FileShare.None );
} );
// sleeping for a while doesn't make a difference
//Thread.Sleep ( 10000 );
// creating a new FileStream will fail if the DomainUnload event is not bound
using( var file = new FileStream ( path, FileMode.Open, FileAccess.ReadWrite, FileShare.None ) )
{
}
}
Application domains and cross-domain interaction are a very subtle matter, so one should make sure he really understands how things work before doing anything... Mmm... Let's say, "non-standard" :-)
First of all, your stream-creating method actually executes on your "default" domain (surprise-surprise!). Why? Simple: the method that you pass into AppDomain.DoCallBack is defined on an AppDomainDelegateWrapper object, and that object exists on your default domain, so that is where its method gets executed. MSDN doesn't mention this little "feature", but it's easy enough to check: just set a breakpoint in AppDomainDelegateWrapper.Invoke.
So, basically, you have to make do without a "wrapper" object. Use a static method for DoCallBack's argument.
But how do you pass your "func" argument into the other domain so that your static method can pick it up and execute?
The most evident way is to use AppDomain.SetData, or you can roll your own, but regardless of how exactly you do it, there is another problem: if "func" is a non-static method, then the object it's defined on must somehow be passed into the other appdomain. It may be passed either by value (in which case it gets copied, field by field) or by reference (creating a cross-domain object reference with all the beauty of Remoting). To do the former, the class has to be marked with the [Serializable] attribute. To do the latter, it has to inherit from MarshalByRefObject. If the class is neither, an exception will be thrown upon the attempt to pass the object to the other domain. Keep in mind, though, that passing by reference pretty much kills the whole idea, because your method will still be called on the same domain the object exists on - that is, the default one.
Concluding the above paragraph, you are left with two options: either pass a method defined on a class marked with a [Serializable] attribute (and keep in mind that the object will be copied), or pass a static method. I suspect that, for your purposes, you will need the former.
And just in case it has escaped your attention, I would like to point out that your second overload of RunInAppDomain (the one that takes Action) passes a method defined on a class that isn't marked [Serializable]. Don't see any class there? You don't have to: with anonymous delegates containing bound variables, the compiler will create one for you. And it just so happens that the compiler doesn't bother to mark that autogenerated class [Serializable]. Unfortunate, but this is life :-)
Having said all that (a lot of words, isn't it? :-), and assuming your vow not to pass any non-static and non-[Serializable] methods, here are your new RunInAppDomain methods:
/// <summary>
/// Executes a method in a separate AppDomain. This should serve as a simple replacement
/// of running code in a separate process via a console app.
/// </summary>
public static T RunInAppDomain<T>(Func<T> func)
{
AppDomain domain = AppDomain.CreateDomain("Delegate Executor " + func.GetHashCode(), null,
new AppDomainSetup { ApplicationBase = Environment.CurrentDirectory });
try
{
domain.SetData("toInvoke", func);
domain.DoCallBack(() =>
{
var f = AppDomain.CurrentDomain.GetData("toInvoke") as Func<T>;
AppDomain.CurrentDomain.SetData("result", f());
});
return (T)domain.GetData("result");
}
finally
{
AppDomain.Unload(domain);
}
}
[Serializable]
private class ActionDelegateWrapper
{
public Action Func;
public int Invoke()
{
Func();
return 0;
}
}
public static void RunInAppDomain(Action func)
{
RunInAppDomain<int>( new ActionDelegateWrapper { Func = func }.Invoke );
}
If you're still with me, I appreciate it :-)
Now, after spending so much time on fixing that mechanism, I am going to tell you that it was purposeless anyway.
The thing is, AppDomains won't help you for your purposes. They only take care of managed objects, while unmanaged code can leak and crash all it wants. Unmanaged code doesn't even know there are such things as appdomains. It only knows about processes.
So, in the end, your best option remains your current solution: just spawn another process and be happy about it. And, as the previous answers point out, you don't have to write another console app for each case. Just pass the fully qualified name of a static method, and have the console app load your assembly, load your type, and invoke the method. You can actually package it pretty neatly in very much the same way as you tried with AppDomains. You can create a method called something like "RunInAnotherProcess" which will examine the argument, get the full type name and method name out of it (while making sure the method is static), and spawn the console app, which will do the rest.
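A rough sketch of that "RunInAnotherProcess" idea (the runner executable name and error handling are assumptions): extract the assembly, type and method names from the delegate, verify the method is static, and hand them to a single generic console app.

public static void RunInAnotherProcess(Action action)
{
    var method = action.Method;
    if (!method.IsStatic)
        throw new ArgumentException("Only static methods can be run in another process.", "action");

    // Assembly path, fully qualified type name and method name for the generic runner
    string args = string.Format("\"{0}\" \"{1}\" \"{2}\"",
        method.DeclaringType.Assembly.Location,
        method.DeclaringType.FullName,
        method.Name);

    using (var process = Process.Start("GenericRunner.exe", args)) // hypothetical runner app
    {
        process.WaitForExit();
        if (process.ExitCode != 0)
            throw new InvalidOperationException("Child process failed with exit code " + process.ExitCode);
    }
}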
You don't have to create many console applications; you can create a single application that receives the fully qualified type name as a parameter. The application will load that type and execute it.
Separating everything into small processes is the best way to truly release all resources. An application domain cannot guarantee full resource cleanup, but a process can.
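A minimal sketch of such a single runner application (the argument layout is just an assumption): it loads the assembly, resolves the type via reflection and invokes a parameterless static method, so any unmanaged leak dies together with the process.

using System;
using System.Reflection;

static class GenericRunner
{
    static int Main(string[] args)
    {
        // args[0] = assembly path, args[1] = fully qualified type name, args[2] = static method name
        var assembly = Assembly.LoadFrom(args[0]);
        var type = assembly.GetType(args[1], throwOnError: true);
        var method = type.GetMethod(args[2], BindingFlags.Public | BindingFlags.Static);

        method.Invoke(null, null);
        return 0;
    }
}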
Have you considered opening a pipe between the main application and the sub-processes? This way you could pass more structured information between the two applications without parsing standard output.
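A rough sketch of that idea using a named pipe (the pipe name and worker executable are placeholders): the parent reads structured lines from the pipe instead of scraping the worker's standard output.

using System;
using System.Diagnostics;
using System.IO;
using System.IO.Pipes;

class Parent
{
    static void Main()
    {
        using (var server = new NamedPipeServerStream("pdf-worker-pipe", PipeDirection.In))
        {
            var worker = Process.Start("PdfWorker.exe", "pdf-worker-pipe"); // hypothetical worker app
            server.WaitForConnection();

            using (var reader = new StreamReader(server))
            {
                string line;
                while ((line = reader.ReadLine()) != null)
                {
                    // Each line could carry a serialized status or result record
                    Console.WriteLine("worker: " + line);
                }
            }

            worker.WaitForExit();
        }
    }
}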