Hosting using AddInProcess - C#

Our application now requires that one of its components be started in its own dedicated process.
I have just come across the AddInProcess class (from System.AddIn.dll).
Unfortunately I couldn't find any useful code examples or projects that use this infrastructure.
I am wondering what its pros/cons are compared with rolling our own out-of-process infrastructure?
Our application uses .NET 3.5 (WinForms).
The component that should be loaded out of process is an execution engine that loads arbitrary user code and executes it.
One thing to consider is that this component needs to pass a Results object back to the calling application.

I would say it depends on what sort of interface into the component you need.
If it is simple, i.e. the functionality needed is in a single function or two, you could just start a process to do it, passing in an argument if needed.
If it is more complex, you could create a WCF host process and expose a service interface.
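To make the WCF route concrete, here is a minimal sketch of hosting the execution engine in its own process. The names (IExecutionEngine, ExecutionResults) and the named-pipe address are assumptions for illustration, not part of System.AddIn or any existing API; the key point is that the Results object must be serializable so it can cross the process boundary.

```csharp
// Hypothetical sketch: hosting the execution engine out of process via WCF.
// Named pipes are a good fit for same-machine IPC on .NET 3.5.
using System;
using System.ServiceModel;

[ServiceContract]
public interface IExecutionEngine
{
    // The results type must be serializable to cross the process boundary.
    [OperationContract]
    ExecutionResults Execute(string userCode);
}

[Serializable]
public class ExecutionResults
{
    public bool Succeeded;
    public string Output;
}

public class ExecutionEngine : IExecutionEngine
{
    public ExecutionResults Execute(string userCode)
    {
        // ... load and run the arbitrary user code here ...
        return new ExecutionResults { Succeeded = true, Output = "done" };
    }
}

class HostProgram
{
    static void Main()
    {
        using (var host = new ServiceHost(typeof(ExecutionEngine),
            new Uri("net.pipe://localhost/ExecutionEngine")))
        {
            host.AddServiceEndpoint(typeof(IExecutionEngine),
                new NetNamedPipeBinding(), "");
            host.Open();
            Console.ReadLine(); // keep the host process alive
        }
    }
}
```

The main application would then start this exe with Process.Start and call the service through a ChannelFactory&lt;IExecutionEngine&gt; using the same binding and address.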


COM export method from object in .exe application [duplicate]

I currently have a .NET class library written in C# that exposes its functionality via COM to a C++ program (pre-.NET).
We now want to move the library out-of-process to free up address space in the main application (it is an image-processing application, and large images eat up address space). I remember from my VB6 days that one could create an "OLE automation server". The OS would automatically start and stop the server .exe as objects were created/destroyed. This looks like the perfect fit for us: as far as I can see nothing would change in the client except it would call CoCreateInstance with CLSCTX_LOCAL_SERVER instead of CLSCTX_INPROC_SERVER.
How would I create such an out-of-process server in C#? Either there is no information online about it, or my terminology is off/out of date!
You can actually do this in .NET (I've done it before as a proof-of-concept), but it's a bit of work to get everything working right (process lifetime, registration, etc).
Create a new Windows application. In the Main method, call RegistrationServices.RegisterTypeForComClients (a managed wrapper around CoRegisterClassObject that takes care of the class factory for you). Pass it the Type of the managed ComVisible class (the one you actually want to create; .NET supplies the class factory automatically) along with RegistrationClassContext.LocalServer and RegistrationConnectionType.SingleUse. Now you have a very basic exe that can be registered as a LocalServer32 for COM activation. You'll still have to work out the lifetime of the process (implement refcounts on the managed objects with constructors/finalizers; when you hit zero, call UnregisterTypeForComClients and exit). You can't let Main exit until all your objects are dead.
The registration isn't too bad: create a ComRegisterFunction-attributed method that adds a LocalServer32 key under HKCR\CLSID\{your-clsid-here}, whose default value is the path to your exe. Run regasm yourexe.exe /codebase /tlb, and you're good to go.
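A minimal sketch of those steps might look like the following. The CLSID and class name are placeholders, registration requires administrative rights, and real code would replace the Console.ReadLine with refcount-based lifetime management as described above.

```csharp
// Sketch of a COM local server in C# (Windows only; placeholder CLSID).
using System;
using System.Runtime.InteropServices;
using Microsoft.Win32;

[ComVisible(true)]
[Guid("11111111-2222-3333-4444-555555555555")] // placeholder CLSID
public class MyComServer
{
    public string Ping() { return "pong"; }

    // Called by regasm: add the LocalServer32 key pointing at this exe.
    [ComRegisterFunction]
    public static void Register(Type t)
    {
        using (var key = Registry.ClassesRoot.CreateSubKey(
            @"CLSID\{" + t.GUID + @"}\LocalServer32"))
        {
            key.SetValue(null,
                System.Reflection.Assembly.GetExecutingAssembly().Location);
        }
    }

    [ComUnregisterFunction]
    public static void Unregister(Type t)
    {
        Registry.ClassesRoot.DeleteSubKeyTree(
            @"CLSID\{" + t.GUID + @"}\LocalServer32");
    }
}

class ServerProgram
{
    static void Main()
    {
        var services = new RegistrationServices();
        // Managed wrapper around CoRegisterClassObject; .NET supplies
        // the class factory for the registered type automatically.
        int cookie = services.RegisterTypeForComClients(
            typeof(MyComServer),
            RegistrationClassContext.LocalServer,
            RegistrationConnectionType.SingleUse);

        Console.ReadLine(); // real code: wait until all object refcounts hit zero
        services.UnregisterTypeForComClients(cookie);
    }
}
```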
You could always expose your .NET class as COM classes using InteropServices and then configure the library as a COM+ application. The .NET library would run out-of-process and be hosted by a DLLHOST.EXE instance.
Here is an MSDN article that covers all aspects of how to create a COM local server in C# (.NET): link
Your post started a while ago and I had the same problem. The following link is absolute gold and tells you everything
http://www.andymcm.com/blog/2009/10/managed-dcom-server.html

What is the architecturally correct and secure way to expose business logic to a task scheduling mechanism

I have a three tier asp.net web app backed by SQL server express, with business logic in C#, and a web UI. I have a small collection of actions that exist as methods on objects in my business logic layer that need to run on a configurable, periodic basis. These actions rely on many other objects in my current app along with needing my data access layer to talk to SQL. Currently I manually log in to the admin site and kick off the actions via my UI as there are only two at the moment but that will grow.
A couple options I've considered but wanted thoughts on before I proceed...
I could create some scheduled tasks in Windows Server to kick these actions off periodically, but I want to know how to expose them in the best way. I thought of creating a web service that exposes them and building a tiny exe to call that web service, but I would have to make sure the web service was locked down with security. Another option, which I know a little less about, would be exposing those actions via export and then building an app that uses them by referencing the DLL. It seems that app would get kind of large if it has to pull everything in, unless I could componentize my app binaries more so it would only need a small binary or two.
Any thoughts on how I should approach this or pointers on content discussing this type of issue?
I had gone the way of a tiny EXE that calls a web service from the main app, and it seems to work well, but that was before I discovered Quartz.NET.
Now I'd suggest using Quartz.NET as a scheduler. From the site:
Jobs can be any .NET class that implements the simple IJob interface,
leaving infinite possibilities for the work Jobs can perform.
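For illustration, a job and schedule using the Quartz.NET 1.x API (the version that targeted .NET 3.5) could look roughly like this; the job, trigger, and class names are made up:

```csharp
// Sketch of scheduling a business-logic action with Quartz.NET 1.x.
using System;
using Quartz;
using Quartz.Impl;

public class RunPeriodicActionsJob : IJob
{
    public void Execute(JobExecutionContext context)
    {
        // Call into the business logic layer here; it runs in-process,
        // so the data access layer is available directly.
        Console.WriteLine("Running periodic actions at " + DateTime.Now);
    }
}

class SchedulerProgram
{
    static void Main()
    {
        ISchedulerFactory factory = new StdSchedulerFactory();
        IScheduler scheduler = factory.GetScheduler();
        scheduler.Start();

        var job = new JobDetail("periodicActions", null,
            typeof(RunPeriodicActionsJob));
        // Fire every hour; cron expressions are also supported via CronTrigger.
        var trigger = TriggerUtils.MakeHourlyTrigger();
        trigger.Name = "hourly";
        scheduler.ScheduleJob(job, trigger);
    }
}
```

Because the jobs run inside your own process (for example a Windows service hosting the scheduler), there is no web service surface to lock down, which sidesteps the security concern in the question.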

Creating a Silverlight library with dependencies composed via MEF

I have a Silverlight 4 library L which has a dependency that is to be provided at run-time via a plugin P.
I am using a DeploymentCatalog along the lines of the example provided by MEF documentation and all is well: the XAP of the plugin P is correctly downloaded asynchronously and the import is satisfied.
However, I cannot control the details of the Silverlight application A that will be using library L, and I cannot exclude that A itself might want to use MEF: therefore it's possible that at some point A might issue a CompositionHost.SatisfyImports(...) or CompositionHost.Initialize(catalog) call for its own purposes, which I understand can only be invoked once.
Am I missing something here or partitioning the application across multiple XAPs can only be achieved if one has complete control of the Silverlight application and libraries?
Stefano
CompositionHost.SatisfyImports can be called many times. CompositionHost.Initialize can only be called once. As a library, it is not a good idea to call that method because the application may do so. Since you need to create and use a DeploymentCatalog, it's probably better if you don't use CompositionHost at all in your library, since you want to avoid calling the Initialize method, which would be the way to hook the CompositionHost to the DeploymentCatalog.
You can create your own CompositionContainer hooked up to the DeploymentCatalog and call GetExports or SatisfyImports on the container you created. CompositionHost is pretty much just a wrapper around a static CompositionContainer.
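A sketch of that approach, using the Silverlight 4 MEF types; IPlugin and the member names are placeholders invented for the example:

```csharp
// Sketch: the library keeps its own private CompositionContainer instead of
// touching the static CompositionHost, leaving the host application free to
// call CompositionHost.Initialize for its own purposes.
using System;
using System.ComponentModel.Composition;
using System.ComponentModel.Composition.Hosting;

public interface IPlugin { } // placeholder contract supplied by the plugin XAP

public class LibraryComposition
{
    [Import]
    public IPlugin Plugin { get; set; }

    public void Compose(Uri pluginXapUri)
    {
        var catalog = new DeploymentCatalog(pluginXapUri);
        var container = new CompositionContainer(catalog);

        // Satisfy this object's imports once the plugin XAP has downloaded.
        catalog.DownloadCompleted += (s, e) => container.SatisfyImportsOnce(this);
        catalog.DownloadAsync();
    }
}
```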
It's not usually a good idea to tie yourself to a single dependency injection container in a library, instead you'd usually want to abstract that away using something like the CommonServiceLocator, which leaves the choice of IoC container a preference of whoever is consuming your library.
I only started with MEF in Silverlight a month ago, so I'm definitely not an authority.
The first thing I noticed is that CompositionHost.SatisfyImports has been replaced with CompositionInitializer.SatisfyImports.
Second, I could not find any reference to "SatisfyImports can only be invoked once".
My scenario is the following:
I have a BL xap which I use/link to from my application
The BL has some Imports that will be satisfied by calling SatisfyImports from the Application
The BL also has some imports that cannot/will not be resolved until a certain custom (third-party) module/XAP is loaded (loaded on demand, that is). When the custom module becomes available (is loaded), I resolve the missing imports with an extra call to CompositionInitializer.SatisfyImports:
E.g.:
If DomainSpecificModuleLogic Is Nothing Then
    'this is required to trigger recomposition and resolve imports to the ThirdPartyModule
    System.ComponentModel.Composition.CompositionInitializer.SatisfyImports(Me)
End If
So I have multiple calls to SatisfyImports (at different moments in time) and no problems due to this: you do not need control over the whole application; just make sure that when someone accesses an object from your library that uses MEF, you have a call to SatisfyImports.
Note: my BL is a singleton, so for sure I am calling SatisfyImports on the same object multiple times.

Share an instance between multiple projects

I am working in VS 2008 C# and need to share an instance of an object created in one project with another project. I tried creating a static class in project1 and adding it as a link to project2, but the information wasn't saved. The static class was written in project1.
// object o = new object();
// project1.staticObject = o;
// project2.object = project1.staticObject;
When I tried something like above, project2.object would be null. By adding a class as a link, is it creating a new instance of the static class in project2 or is it referencing the same class? If it is referencing the same class, shouldn't any information saved into the static class from project1 be accessible by project2? I know this isn't the most elegant manner of sharing data, but if anyone would help with this problem or provide a better manner of doing it, I would greatly appreciate it.
Thanks in advance.
Projects run in separate processes, so they can't share data in this manner. You'll need to persist the data in another type of store. I recommend using a database (hey, 20 gazillion websites, stock trading apps, airlines, etc can't be wrong).
If you don't want to use a database, you could open an IP connection between instances of the app and have a thread send packets of data to sync back and forth between the applications. Or, in your "server" app, add a web service that each process would call to update and retrieve information.
If you need really high-speed communication between the processes, sockets with a peer or star topology is a good way to go. If you're okay with some latency, having a web service (which works fine even if these aren't web apps) or a database could be a good solution. The right approach depends on your application.
WCF could also solve this problem. It effectively wraps the IP/socket layer and provides some nice object persistence/remote control capabilities, but I find it overly complex for most applications.
To share a single instance of an object among different processes (which is what I think you are intending to do), you need something that will maintain that object's state. You can look at WCF and how to set up its behaviour to act as a singleton, so essentially every requester gets the same instance across the board.
http://msdn.microsoft.com/en-us/magazine/cc163590.aspx
Adding the class as a link only applies to the source code. When you compile, each project has that single class definition available, but this does nothing to share instances at runtime.
You can look at WCF or .NET Remoting, although .NET Remoting has now officially been replaced by WCF.
If you are talking about sharing the same object between two processes, you can do that; the concept is called memory-mapped files. Here are some starter docs from MSDN.
Though the docs and API use the term "FileMapping" quite a bit, you can use it just for sharing memory between two processes.
In .NET 4.0, you can use the System.IO.MemoryMappedFiles namespace. In your case, which looks like .NET 3.5, you'll have to use some sort of interop to call the Win32 API.
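For reference, a sketch using the .NET 4.0 API (on .NET 3.5 you would P/Invoke CreateFileMapping/MapViewOfFile instead). The map name and payload are made up for the example, and named maps like this are Windows-specific:

```csharp
// Sketch of sharing data between two processes via a memory-mapped file.
using System;
using System.IO.MemoryMappedFiles;
using System.Text;

class SharedMemoryDemo
{
    const string MapName = "MyApp.SharedObject"; // both processes must agree on this
    const int Capacity = 1024;

    static void Main()
    {
        // Process A: create the mapping and write a length-prefixed payload.
        using (var mmf = MemoryMappedFile.CreateOrOpen(MapName, Capacity))
        using (var writer = mmf.CreateViewAccessor())
        {
            byte[] payload = Encoding.UTF8.GetBytes("hello from process A");
            writer.Write(0, payload.Length);
            writer.WriteArray(4, payload, 0, payload.Length);

            // Process B would call MemoryMappedFile.OpenExisting(MapName)
            // and read the length + bytes back out:
            using (var reader = mmf.CreateViewAccessor())
            {
                int length = reader.ReadInt32(0);
                byte[] buffer = new byte[length];
                reader.ReadArray(4, buffer, 0, length);
                Console.WriteLine(Encoding.UTF8.GetString(buffer));
            }
        }
    }
}
```

Note that a memory-mapped file only shares raw bytes; you still have to serialize the object into the region and coordinate access (e.g. with a named Mutex), which is why WCF or a database is often simpler.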

C# Web Service and using a variable

I need to create a project for multiple web services using WCF in C#. The web services will call other assemblies to perform the core processing. The assemblies will access data from SQL Server. One of the parameters that will be part of every web service method is the database to use. My problem is how to pass the database parameter to the assemblies. I can't change all the signatures of all the satellite assemblies. I want to reference some kind of variable that the satellite assemblies can read. These same satellite assemblies are used with a Windows Forms app and an ASP.NET app, so I need something that all types of applications could use. Static fields are no good, since for one web service call the database could be "X" and for another it would be "Y". Any ideas?
This is the sort of thing that might play nicely with an IoC or DI framework - having some interface that includes the database information, and have it pushed into all the callers for you. Even without IoC, hiding the implementation in an interface sounds like a solid plan.
With your static concept, a [ThreadStatic] might work, but it is a little hacky (and you need to be religious about cleaning the data between callers). Another option is to squirrel some information away on the Principal, as this is relatively easy to configure from both WCF (per-call) and WinForms (typically per-process). In either case, be careful about any thread switching (async, etc.). In particular, note that ASP.NET can change threads in the middle of a single page pipeline.
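A minimal sketch of the [ThreadStatic] idea (all names here are invented): an ambient "current database" that satellite assemblies read without any signature change, wrapped in an IDisposable scope so the cleanup between callers is harder to forget.

```csharp
// Sketch: ambient per-thread "current database" context.
using System;

public static class DatabaseContext
{
    [ThreadStatic]
    private static string current;

    // Satellite assemblies read this instead of taking a database parameter.
    public static string Current
    {
        get { return current; }
    }

    // Set the database for the duration of one call; Dispose restores
    // the previous value, cleaning up between callers.
    public static IDisposable Use(string database)
    {
        string previous = current;
        current = database;
        return new Scope(previous);
    }

    private sealed class Scope : IDisposable
    {
        private readonly string previous;
        public Scope(string previous) { this.previous = previous; }
        public void Dispose() { current = previous; }
    }
}

// Usage inside a WCF operation (per-call, so each request sets its own):
// using (DatabaseContext.Use(request.Database))
// {
//     // satellite assemblies read DatabaseContext.Current here
// }
```

The per-thread caveats from the answer still apply: this breaks if work hops threads mid-call, as ASP.NET pipelines and async callbacks can do.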
