Managed Extensibility Framework - C#

I'm creating a service for my organization that will be installed on hundreds of computers. Its implementation may need to change over time. After watching and reading a bit about MEF, I'm still a little lost. Is MEF a good solution if, say, I wanted to drop a DLL into the service folder and have that service pick up the changes?

I have done quite a bit with MEF. Yes, MEF will do what you're looking for, with a caveat...
You can discover and load in a new DLL at runtime
However, it loads into the same app domain as your main application, so
you can't unload or change the DLL without restarting your application.
If that last point is a problem, consider MAF (although it's much heavier); MAF loads your extensions into a separate app domain.
Your other option is just to spawn off another process to handle the request, and pass command line parameters to it.
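For reference, a minimal sketch of the discovery side with MEF's DirectoryCatalog (the IService contract and the "Extensions" folder name are just placeholders):
// Requires a reference to System.ComponentModel.Composition; IService is an assumed export contract.
var catalog = new DirectoryCatalog(@".\Extensions");
var container = new CompositionContainer(catalog);
foreach (var service in container.GetExportedValues<IService>())
    service.Run();   // Run() stands in for whatever your contract exposes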

If you are using VS2010, I recommend you first try ClickOnce. It gives you the setup.exe but also an HTML file (and other files), which you can upload to an IIS server, an FTP site, or even a shared folder on your local network. The thing is, when you configure your deployment you can tell the installer to auto-update all clients directly from the site (the HTML file, FTP site or LAN shared folder). When a user starts the application, it will connect to the site and check whether an update exists, and if it does, the application will self-update. If you want to deploy an update, you only need to upload the HTML file and all the other files in the folder again.
Check out these links:
http://msdn.microsoft.com/en-us/library/ms953320.aspx
http://weblogs.asp.net/shahar/archive/2008/01/29/how-to-use-clickonce-to-deploy-your-applications.aspx
Happy coding ;)

I am no expert on MEF. The documentation says MEF is all about building plug-in based systems: your system expects some interfaces, and MEF makes it easy to import classes that implement those interfaces. Your case seems like a good fit. MEF has a directory catalog, and by using catalogs you can import types/DLLs at runtime.
How do I get a MEF Directory catalog looking at the same directory for both the Servicelayer and DAL?

I think you can do this if you keep your service logic in a separate class library and put the required contract interfaces in another class library. Then you can create a directory catalog and import your service implementation with MEF (you only need to reference the contract interfaces assembly). That lets you pick up the new DLL/service implementation when it changes. You only need a directory watcher (FileSystemWatcher); call your catalog's Refresh method when the watcher fires. In theory it should work, but it's just an idea. :) Anyway, hope this helps.
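A rough sketch of that idea, assuming a pluginPath variable pointing at the watched folder:
// DirectoryCatalog.Refresh() re-scans the folder; newly added exports become available.
var catalog = new DirectoryCatalog(pluginPath);
var container = new CompositionContainer(catalog);
var watcher = new FileSystemWatcher(pluginPath, "*.dll");
watcher.Created += (s, e) => catalog.Refresh();
watcher.EnableRaisingEvents = true;
Keep in mind the caveat mentioned earlier: new assemblies can be picked up, but ones already loaded into the app domain cannot be unloaded.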

You could also look into Prism. From what I understand, they are two different frameworks to do pretty much the same thing. Prism allows you to create modules by having a class in a dll that implements the IModule interface. You can just drag and drop dlls into a folder this way just like in MEF. You can statically or dynamically load modules, and do a bunch of other things I don't even know about.
Prism also has other features bundled with it, e.g. the Unity dependency injection container (which I like to call "The Magical Black Box"), a neat event and command system, etc. I'm sure MEF has all these things too.
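For a rough idea of the shape of a Prism module (this uses the Prism 4-era IModule contract; ChatModule and what you register are assumptions):
public class ChatModule : IModule
{
    public void Initialize()
    {
        // register this module's views and types with the container here
    }
}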

Related

Compiling forms in C#

I want to compile each individual form in my application to be used sort of as a DLL on its own. I looked into this and found very confusing explanations of assemblies, which may or may not be what I wanted.
Is it possible to compile form1.cs, form1.designer.cs and form1.resx into a single file that can then be used as a DLL? I say "dll" as an example because that is the functionality I need from each of these forms when compiled to a single file: I need to be able to call it and use it from a shell application.
I know it is possible in VS to create a separate project which will compile into a DLL, but with something on the verge of 80 forms to compile, that would be a messy thing to maintain. So basically, is there an easier way?
This is the closest I could get, but it is a command-line example for a plain library, so it will be impractical if there are easier ways; also, I am not sure whether it will actually compile form1.cs, form1.designer.cs and form1.resx and still work as a DLL:
csc /target:library /out:MathLibrary.DLL Add.cs Mult.cs
Thanks for the help
Possible? Yes. Advisable? Umm, not sure.
You'll need to study the csc options to use it in such a massive way.
Partial classes are simply each listed among the sources. See here
The RESX file must be compiled by ResGen.exe into a .resources file, see here
You will use the /reference parameter to include other DLLs.
The real challenge will probably come when you try to get cross references to work, depending on the layout of your application. Is there a main hub that will control all forms? Is it a plug-in architecture?
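To give a feel for it, a rough command sequence for one form might look like this (the file names are placeholders, and the embedded resource name has to match the form's full type name, here assumed to be MyApp.Form1):
resgen Form1.resx MyApp.Form1.resources
csc /target:library /out:Form1Lib.dll /resource:MyApp.Form1.resources /reference:System.Windows.Forms.dll /reference:System.Drawing.dll Form1.cs Form1.Designer.cs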
Good luck
Basically, you are working with a solution, which can contain multiple projects. For each DLL you must have one project. So create 80 projects, add a single form to each of them, edit it, and add some logic.
Then there will be a main project, which produces the exe. You can reference all the DLLs in that project, but it's better not to: if you do, updating any of the DLLs will require recompiling the exe too. Instead, you can load them dynamically or use a sort of plugin system (to enumerate the DLLs, understand their purpose, etc.). You then obtain a Type from the loaded assembly, create an instance (which calls the constructor, which calls InitializeComponent, which loads the form's resources) and display the form.
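A rough sketch of that dynamic loading step (the file name and the type-selection logic are assumptions):
// Requires a reference to System.Windows.Forms; uses System.Linq and System.Reflection.
Assembly asm = Assembly.LoadFrom(@"Plugins\Form1Lib.dll");
Type formType = asm.GetTypes().First(t => typeof(Form).IsAssignableFrom(t) && !t.IsAbstract);
Form form = (Form)Activator.CreateInstance(formType);   // runs the constructor and InitializeComponent
form.Show();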
Regarding abstraction, you surely need some. For example, a login window: you can create a generic form with the focus, user interface and user interaction logic, but it has to communicate with the main project (which encapsulates encryption, the password storage model, user rights, etc.). One easy way to do this is to provide two interfaces:
interface ILoginImplementation
{
    void SetInitialUserName(string name);
}

interface ILoginLogic
{
    bool TryAuthenticate(string name, string password);
}
ILoginImplementation is what your form must implement, and ILoginLogic is what the main project implements and supplies when instantiating the login form.
I realize this is probably not ideal, but I still think your best bet is to use Visual Studio and create a separate project for each .dll to be created.
By right clicking the Solution node and selecting Add > New Solution Folder, you can at least organize your projects into a somewhat more orderly hierarchy. That alone might go a long way to make your project more manageable.
PS: If you haven't already, you should definitely try to create an interface, or a base class (or both!) that each of your Form-classes can derive from or implement. If you're able to abstract away and generalize some of the logic, it is quite likely to save you a lot of work down the road.

Plugin Architecture With Forward & Backward Compatibility

I'm currently working on a C# product that will use a plugin-type system. This isn't anything new, and I have seen plenty of info around about how to use an interface to implement this functionality quite easily.
I've also seen methods to implement backwards compatibility by updating the interface name, e.g.: Interface change between versions - how to manage?
There are multiple scenarios which I can foresee with our product in regards to version mismatches between the main exe and the plugin.
Main Program same plugin version as plugin
Main Program newer than plugin
Main Program older than plugin
From the info I've been able to gather 1 & 2 work just fine. But I haven't been able to figure out how to correctly implement "forward" compatibility (3) properly.
It is our intention to only ADD methods to the plugin API.
Any ideas would be a great help.
Isolated PluginAPI DLL
First, your PluginAPI (containing the interfaces) should be a separate DLL from your main application. Your main application will reference the PluginAPI, and each plugin will reference the PluginAPI. You're most likely already doing this.
Interface Versioning
Second, structurally, you should create a new interface each time you add a new property or method.
For example:
Version 1: Plugins.IPerson
Version 2: Plugins.V2.IPerson : Plugins.IPerson
Version 3: Plugins.V3.IPerson : Plugins.V2.IPerson
In rare cases where you decide to remove or completely redesign your API, example:
Version 4: Plugins.V4.IPerson //Without any Interface inheritance
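In code, that versioning scheme might look roughly like this (the member names are examples only):
namespace Plugins { public interface IPerson { string Name { get; } } }
namespace Plugins.V2 { public interface IPerson : Plugins.IPerson { int Age { get; } } }
namespace Plugins.V3 { public interface IPerson : Plugins.V2.IPerson { string Email { get; } } }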
Isolated PluginAPI DLL Versioning
Finally, I am not 100% sure how versioning of the PluginAPI .dll itself will go, even with this structural architecture of interface versioning. It may just work, or you may need matching DLLs for each version (each referencing the previous version(s)). We will assume that this is the case.
Solution for case 3
So let's now take your case [3], main program older than plugin:
The Person plugin implements Plugins.V2.IPerson and references the V3 .dll (just to make it interesting).
Main Program references the V1 .dll
The plugin folder will contain the V2 and V3 plugin .dlls
The main app folder will only contain the V1 plugin .dll (among other files)
The main app will find and load the Person plugin and reference it through the V1 definition of the IPerson interface
Of course, only V1 methods and properties will be accessible from the plugin to the Main App
(Additional methods will be accessible through reflection - not that you would want to)
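In other words, something like this on the host side (pluginType is whatever type your loader discovered; the names are illustrative):
object instance = Activator.CreateInstance(pluginType);
var person = instance as Plugins.IPerson;   // the V1 contract the old host knows about
if (person != null)
    Console.WriteLine(person.Name);         // only V1 members are visible, even if the plugin implements V3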
Bonus Update
When you might use plugins
Third-parties extending your system. Source code would be better if that's an option, or if it's web-based, redirect to their URL. This is a dream for many software projects, but you should wait until you have an interested third-party partner before doing the extra work to build the plugin framework.
User-editable "scripts". You should not build your own scripting language; instead you should compile the user's C# code against a restrictive interface, in an app domain that is very locked down (disabling reflection and other risky capabilities).
Security grouping - Your core software might use trusted platform calls. Riskier modules can be separated into another library and optionally excluded by end-users.
When not to use Plugins
I am an advocate for less-is-more. Don't overengineer. If you are building modular software that's great, use classes and namespaces (don't get carried away with interfaces). "Modular" means you are striving to adhere to SOLID principles, but that doesn't mean you need Plugin architecture. Even inversion of control is overkill in many situations.
If you plan to open to third-parties in the future, don't make it a plugin architecture to start with. You can build out a plugin framework later in stages: i) derive interfaces; ii) define your plugins with interfaces within the same project; iii) load your internal plugins with a plugin loader class; iv) finally, you can implement an external library loader. Each of these 4 steps leave you with a working system on their own and move you toward a finished plugin system.
Hot Swappable Plugins
When designing a plugin architecture, you may be interested to know that you can make plugins hot swappable:
Without Freeing Memory - Just keep loading the new plugin. This is usually fine, unless it's something like server software which you expect i) to run for a very long time without restarting, AND ii) to see many plugin changes and upgrades during that time. When you load a plugin at runtime, it loads the assembly into memory and cannot be unloaded. See [2] for why.
With Freeing Memory - You can unload an AppDomain. An AppDomain runs in the same process but is reference-isolated: you can't reference or call its objects directly. Instead, calls must be marshalled and data must be serialised between app domains. The added complexity is not worth it if you're not going to change plugins often; there is i) a performance penalty due to marshalling/serialisation, ii) much more coding complexity (you can't simply use events, delegates and methods as normal), and iii) all of this leads to more bugs and makes debugging harder.
So if option [2] entices you, please try [1] first, and use that architecture until you have the problems necessary for [2]. Never over-architect. Trust me, I have built a [2] architecture before during University, it's fun, but in most cases overkill and will likely kill your project (spending too much time on non-business functions).
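For completeness, the AppDomain approach looks roughly like this (PluginHost is an assumed MarshalByRefObject wrapper defined in the plugin assembly, and the assembly/type names are made up):
AppDomain domain = AppDomain.CreateDomain("PluginDomain");
var host = (PluginHost)domain.CreateInstanceAndUnwrap(
    "MyPlugins",               // assembly display name (assumed)
    "MyPlugins.PluginHost");   // full type name (assumed)
host.Execute();                // the call is marshalled across the domain boundary
AppDomain.Unload(domain);      // frees the plugin assembly's memory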
You need to assume that your plugins only implement the interface(s) exposed. If you release a new version of your main program with a new interface, you will check to see whether your plugins support that interface. Therefore, if a new plugin is presented to an old version of the main program, it will either support the requested interface or it will not, in which case it fails the test as a valid plugin.

What is an easily maintainable way to share a common .net class library over many corporate asp.net mvc 3 web applications?

I've been struggling to do this in a way that fulfills all of my requirements.
Here is what we have in our library:
Base classes for controllers and services
Business objects (stores, departments, etc)
Common Partial Views (Login, Error, etc)
Base class for HttpApplication
General common code (read an INI file, create a db conn, etc)
The one requirement that has been giving me trouble is as follows:
Lives in one place on a server. (i.e. copy local = false)
This breaks because:
The DLL containing the HttpApplication class must be in the same directory as the web app's DLL to launch. I haven't found a way around that. I'm OK with duplicating this code in every app, but would rather not.
The shared views don't like to work if I use Assembly.LoadFrom() to load the dll from the shared location. (I've been using this method to precompile my views)
Any namespace shortcuts in web.config break at runtime with compilation errors because the web.config is parsed before the assembly is loaded.
My question to you folks is how do you handle your common code in a similar environment?
The GAC seems to be more trouble than it's worth, and we want all of our apps to be using the same code, rather than having multiple apps on multiple versions and having to maintain all of that. Are there design patterns/best practices that can guide us in this regard?
Also, as a bonus, if you can solve any of the problems above, that would be great, too.
Thanks!
Edit: I guess a question that follows is whether or not we should even have a directory with the common dll(s) on the server, or if they should only be deployed as projects are deployed/updated?
Firstly, you will want to separate out what you're trying to achieve. Don't create 1 library that does everything or you will have a Big Ball of Mud. Don't be afraid to create several maintainable libraries to achieve what you're after. Is there a specific reason it needs to be stored in one location?
For example, several of the items you mention are MVC or web specific. If you have items that can be reused by MVC, create a class library that contains MVC base classes you inherit and reference them in your project. Use the single responsibility principle as much as possible.
Regarding the other items you mentioned, like database connectivity: if it's reusable, abstract it out into a data access class library and reference it. For other simple operations, like reading an INI file or creating a file, create another library and abstract them into easy-to-use methods.
I prefer to copy the library dlls locally. You never know when you will need to make changes to the library, but you don't want all of your projects to stop compiling. When you're ready to implement a new version of the library, copy the dll in and recompile.
Not sure why all the hate towards the GAC. It was designed to handle this specific problem. Install your common DLLs to the GAC and all apps can see them. Need to deploy a new one? Just re-install it in one place.
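For reference, installing an assembly into the GAC is a one-liner (CommonLibrary.dll is a placeholder, and the assembly must be signed with a strong name first):
gacutil /i CommonLibrary.dll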

Plugin system with MVC3, Razor and C#

I'm fairly decent with MVC3 and enjoy creating my sites with it; however, I have yet to think up and implement a decent method for a "plugin" system.
Basically, I aim to have a generic "blog-type" CMS which I can distribute across my sites, but with the option to have certain things as plugins.
For example:
Generic build:
User area
Basic blog/news editing
Plugins: (May be needed for one or two sites, but not all)
Chatroom plugin
Stats
and so on...
Currently I would just make it all and disable things through a config file, however it would be nice if I could just drop a folder onto my FTP site and have an MVC page which automatically picks it up!
I assume I would have to start with scanning the directory "/plugins" and picking up a "plugin.config" (Or similar) file which would contain the basic details.
But how would I get my main system to pick these things up and actually use them?!
You may be able to do this using MVC Areas, here are some links about them:
ASP.NET MVC 2 Areas
ASP.NET MVC Areas: Are they important to a large application?
https://stackoverflow.com/questions/462458/asp-net-mvc-areas-are-they-important-to-a-large-application
Try assembly scanning with StructureMap dependency injection.
Read this great tutorial: ASP.NET MVC2 Plugin Architecture Tutorial
It helped me create a plugin architecture with MVC3.
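The core of that approach is roughly this (IPlugin is an assumed contract, and this uses the StructureMap 2.x-era ObjectFactory API):
ObjectFactory.Initialize(x => x.Scan(scan =>
{
    scan.AssembliesFromPath(HttpRuntime.BinDirectory);   // or a dedicated plugins folder
    scan.AddAllTypesOf<IPlugin>();
}));
var plugins = ObjectFactory.GetAllInstances<IPlugin>();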
Areas solve the problem for you providing you have everything in the original project/assembly. You could write your plugin system to allow the plugins to register their own areas, or alternatively you could register some new view search paths in a custom Razor view engine.
I chose the latter for a recent open-source project I wrote called Spruce, which uses a whole plugin architecture you might find useful as a reference.
You can scan all the assemblies in the bin directory on startup to check for plugins, via reflection. You usually check for types that implement an interface or inherit from a class, and use these alongside an IoC container such as TinyIoC, Ninject, StructureMap or Unity. I'd recommend TinyIoC, which is used by NancyFX.
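A rough sketch of that bin-directory scan (binPath and IPlugin are assumptions; how you register the results depends on your container):
// Uses System.IO, System.Linq and System.Reflection.
var pluginTypes = Directory.GetFiles(binPath, "*.dll")
    .Select(Assembly.LoadFrom)
    .SelectMany(a => a.GetTypes())
    .Where(t => typeof(IPlugin).IsAssignableFrom(t) && t.IsClass && !t.IsAbstract);
// register each discovered type with TinyIoC/Ninject/StructureMap/Unity here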

Is MEF an all-or-nothing affair?

I've had a few questions about MEF recently, but here's the big one -- is it really all-or-nothing, as it appears to be?
My basic application structure is simply an app, several shared libraries that are intended to be singletons, and several different plugins (which may implement different interfaces). The app loads the plugins, and both the app and all plugins need to access the shared libraries.
My first go at MEF was fairly successful, although I made some stupid mistakes along the way because I was trying so many different things, I just got confused at times. But in the end, last night I got my smallish test app running with MEF, some shared libraries, and one plugin.
Now I'm moving on to the target app, which I already described. And it's the multiple plugins part that has me a bit worried.
My existing application already supports multiple plugins with different interfaces by using Reflection. I need to be able to uniquely identify each plugin so that the user can select one and get the expected behavior exposed by that plugin. The problem is that I don't know how to do this yet... but that's the topic of a different question.
Ideally, I'd be able to take my existing plugin loader and use it as-is, while relying on MEF to do the shared library resolution. The problem is, I can't seem to get MEF to load them (i.e. I get a CompositionException when calling ComposeParts()) unless I also use MEF to load the plugin. And if I do this, well... then I need to know how to keep track of them as they get loaded so the user can select one from a list of plugins.
What have your experiences been with trying to mix and match these approaches?
MEF is designed to let you easily load plugin assemblies. If you have control over the plugins (by which I mean that you can add MEF export attributes) then there is no need to keep your own plugin loader which uses reflection. MEF does all that for you.
That being said, "mixing and matching" MEF with other technologies is certainly possible. It sounds like your problem is that if you use your own plugin loader, you don't add those plug-ins to the MEF container. As a result, you get a CompositionException for parts which try to import the selected plug-in.
To add a plugin that you loaded with your own code to the MEF container, you can use the ComposeExportedValue like this:
container.ComposeExportedValue<IPlugin>(selectedPlugin);
edit: I see what you mean now by "all or nothing". Your problem is that in order to be able to import parts with MEF, you also need to construct the object with MEF. This problem then cascades to the object which normally created that object, etc. all the way to the application root.
To avoid this "all or nothing" effect, you can compromise by exposing the MEF container as a global variable (i.e. static field). That way, classes can access the MEF container and pull exports from it, e.g. by calling Program.Container.GetExportedValue<MyDependency>() in the constructor.
edit2: If you have an object that was not constructed by MEF, then there are two ways to add it to the container.
The first is to call container.ComposeExportedValue<IMyContractType>(myObject);.
The second is to return the object in a property getter, and then mark the property itself with an [Export(typeof(SomeType))] attribute.
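The second approach might look roughly like this (PluginProvider and IPlugin are illustrative names; the wrapper itself still has to be composed or added to the container so MEF sees the export):
public class PluginProvider
{
    private readonly IPlugin selectedPlugin;

    public PluginProvider(IPlugin plugin)   // created by your own reflection-based loader
    {
        selectedPlugin = plugin;
    }

    [Export(typeof(IPlugin))]
    public IPlugin SelectedPlugin
    {
        get { return selectedPlugin; }
    }
}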
