I have a large collection of custom C# classes that roughly follow the MVC pattern that I reuse over and over to build GUI and command line PC applications with different groupings of functionality. For example there are classes for importing and exporting files, classes for manipulating data, classes for monitoring, utility classes, etc.
These classes are currently grouped into a set of assemblies. The higher-level assemblies reference lower-level assemblies, which in turn reference even lower-level assemblies.
All the classes are marked as public so when I build a PC application I can reference the appropriate assemblies and use the classes in a modular fashion.
When I release a PC application I use a .NET obfuscator to merge the assemblies and the EXE into a single EXE so end users cannot see the APIs of all the classes and assemblies.
Now I would like to create a .NET assembly from my collection of classes that only exposes a specific, limited API and give it to end-users. I guess this is the facade design pattern.
For example I would like to take assemblies A and B and use classes from those, say A.Foo, A.Bar, A.Baz, B.Foobar, B.Bazbar, B.Bazfoo and create a new assembly, C, that has a single public class C.MyClass. Assembly C references assemblies A and B. End users can only see C.MyClass and they can't see or access A.xxx or B.xxx.
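To make the intent concrete, here is roughly what I imagine C.MyClass looking like; the Process and Export methods are invented purely for illustration:

// Assembly C references A.dll and B.dll, but exposes only this one class.
namespace C
{
    public class MyClass
    {
        // Composed internally; never exposed through C's public surface.
        private readonly A.Foo _foo = new A.Foo();
        private readonly B.Foobar _foobar = new B.Foobar();

        // Hypothetical facade method: the only API end users see.
        public void DoWork(string input)
        {
            var result = _foo.Process(input);   // assumed A.Foo member
            _foobar.Export(result);             // assumed B.Foobar member
        }
    }
}

On its own this of course doesn't stop anyone who also has A.dll and B.dll from referencing them directly, which is why I suspect the merging/internalizing step still matters.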
Is there a clean way of doing this that doesn't require drastic changes to my current collection of classes and assemblies (as they are already used in multiple other projects)?
Or is this something that is typically handled in the merging/obfuscation stage instead?
Thanks!
I am currently working with a piece of software known as Kofax TotalAgility or KTA for short.
This is Business Process Automation Software, which I have the "pleasure" of expanding with custom .net libraries.
I have been creating a MS Graph library to perform actions with the MS Graph API. The API works great and I am quite pleased with how it turned out.
However, due to the way KTA accesses methods in classes, I have used "data classes" (I don't know if that is the right term) as input parameters for my methods. To be clear, these classes have no functionality other than to store data for the methods to use. The reason I am doing this is because of the way it is structured in the KTA class inspector (I am assuming that KTA uses the IL code from my library to build its list of classes and methods).
This is what I expect the user to be shown when they are using my methods. As you can see, by using classes as input parameters I get a nice hierarchical structure.
However, by using classes as input parameters another issue occurs: my "data classes" are shown in the list of classes, which produces a lot of unnecessary clutter.
Is there a way to hide these classes from the inspector? I get that it might be an internal KTA issue, which of course would mean I am not asking in the right place and it is something internal to Kofax.
However, if there is some C# or .NET way of doing this, that would be preferable.
There are a number of different terms for the data/parameter classes that you mention, such as DTO (data transfer objects), POCO (plain old C# objects), or the one that you can see in the KTA product dlls: model classes.
There is not a direct way to hide public classes from KTA. However, when you use the KTA API via the TotalAgility.Sdk.dll, you notice that you don’t see all of the parameter classes mixed in with the list of the classes that hold the SDK functions. The reason is just that these objects are in a separate referenced assembly: Agility.Sdk.Model.dll. When you are configuring a .NET activity/action in KTA, it will only list the classes directly in the assembly that you specify, not referenced assemblies.
If you are using local assembly references in KTA, then this should work because you can just have your referenced assembly in the same folder as your main dll. However, if you are ILMerging into a single dll so you can add it to the .NET assembly store, then this approach won't work.
When ILMerged together, the best you can do is to have your parameter classes grouped in a namespace that helps make it clear. What I do is have a main project with just one class that acts as a wrapper for any functions I want to expose. Then use ILMerge with the internalize option, which changes visibility to internal for any types not in the primary assembly. To allow the model classes to still be public, I keep them in a specific namespace and add that namespace to the exclude list for the internalize command. See Internalizing Assemblies with ILMerge for more detail.
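As a rough sketch of that layout (the assembly names, action class, and model types below are invented; the switch is ILMerge's documented /internalize:<excludeFile> option):

// Primary assembly: the single public class KTA should list.
namespace MyKtaActions
{
    public class GraphActions
    {
        public Model.SendMailResult SendMail(Model.SendMailRequest request)
        {
            // ...delegate to the referenced implementation assemblies here...
            return new Model.SendMailResult { Success = true };
        }
    }
}

// Parameter/model classes kept in their own namespace so they can be
// excluded from internalization and stay public after the merge.
namespace MyKtaActions.Model
{
    public class SendMailRequest
    {
        public string To;
        public string Subject;
        public string Body;
    }

    public class SendMailResult
    {
        public bool Success;
    }
}

// Example merge step; excludes.txt contains a regex such as
//   MyKtaActions\.Model\..*
// so those types are not internalized:
//   ILMerge.exe /out:Merged\MyKtaActions.dll /internalize:excludes.txt MyKtaActions.dll GraphImpl.dll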
Keep in mind that anyone seeing this list is configuring a function call with your dll. Even if they are not a skilled developer, they should at least have some competence for this type of task (hopefully). So even if the list shows a bunch of model classes, it shouldn’t be too hard to follow instructions if you tell them which class is to be used.
I'm developing an application that heavily relies on a plugin architecture (*).
However I'm not sure what design pattern to use for dependencies between plugins, e.g. when plugin A depends on plugin B, possibly with some constraints (plugin B version between v1.05 and v1.30 or so)
My thoughts so far:
I could specify an interface for plugin B that never changes, and have plugin A reference this interface project only. Plugin B is then free to implement this in whatever way with versioning, and the latest available implementation will just be dependency-injected into the requested interfaces.
This could work, but it seems as though defining an interface which is very much tailored to the specific plugin's functions is a bit unnecessary; plus, I suppose I'd then have to stick to that interface: I could easily enhance the plugin's implementation in future versions, but not the interface.
I could ignore interfaces and just develop the plugins' implementations. Plugin A's project could then directly reference Plugin B's .dll. But as far as I know, this would cause errors when replacing Plugin B's .dll with a newer version, unless I add explicit binding redirects in my application's config, wouldn't it?
Are there any best practices? I suppose this issue is very similar to NuGet packages' dependencies - does anyone happen to know how they have solved it?
Thanks
(*) in case it matters, my plugin architecture works as follows: I have all my plugins implement an interface IPlugin.
My main app then scans the plugin directory for all .dlls, picks out the classes that implement IPlugin, and uses Ninject to add a binding from IPlugin to each implementation (in the end there will be several bindings for IPlugin, e.g. IPlugin -> Plugin1, IPlugin -> Plugin2, etc.). I'm then using Ninject to request/create a singleton instance of each plugin and register it in my main app. That way, my plugins can "request" dependencies via constructor arguments and Ninject/DI takes care of providing them.
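In code, that scanning/binding step looks roughly like this (a simplified sketch; the folder handling and the Initialize method are made up):

using System;
using System.IO;
using System.Linq;
using System.Reflection;
using Ninject;

public interface IPlugin
{
    void Initialize();
}

public static class PluginLoader
{
    public static IKernel LoadPlugins(string pluginDirectory)
    {
        var kernel = new StandardKernel();

        foreach (var dllPath in Directory.GetFiles(pluginDirectory, "*.dll"))
        {
            var assembly = Assembly.LoadFrom(dllPath);

            // Find concrete classes implementing IPlugin in this assembly.
            var pluginTypes = assembly.GetTypes()
                .Where(t => typeof(IPlugin).IsAssignableFrom(t) && !t.IsAbstract && !t.IsInterface);

            // One binding per plugin; each resolves as a singleton.
            foreach (var type in pluginTypes)
                kernel.Bind<IPlugin>().To(type).InSingletonScope();
        }

        return kernel;
    }
}

// Usage: resolve every registered plugin and initialize it.
// foreach (var plugin in PluginLoader.LoadPlugins("plugins").GetAll<IPlugin>())
//     plugin.Initialize();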
As far as I am aware, Nuget tracks library dependencies using the metadata stored in the nuget package file. If I were you I'd avoid implementing arbitrary restrictions. What if one of your plugin developers wants to create a shared support library of useful classes, for example?
To my mind, a plugin should be a black box of functionality. If a plugin needs another plugin, then they should communicate via a standardized messaging platform rather than directly.
That said, you could always scrape all interface implementations from the library you load and hook those up as well as your plugins. That way the plugin developer can "request" implementations of those interfaces as well as plugins.
You'll need to cope with massive class libraries (I recommend only hooking up in Ninject interfaces that are referenced in plugin constructors) and with potential conflicts (two plugins might expect separate implementations of the same interface - which is the main reason I believe that a plugin should take care of itself internally, rather than hoping its design time expectations are fulfilled by the external plugin manager).
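A rough sketch of that "only bind what plugin constructors reference" idea, meant to sit inside the same per-assembly loop as the loader sketch in the question (same using directives assumed; the helper name is invented):

// Bind only the interfaces that the discovered plugins actually ask for in
// their constructors, using the first matching implementation found in the
// scanned assembly.
static void BindConstructorDependencies(IKernel kernel, Assembly assembly, IEnumerable<Type> pluginTypes)
{
    var wantedInterfaces = pluginTypes
        .SelectMany(t => t.GetConstructors())
        .SelectMany(c => c.GetParameters())
        .Select(p => p.ParameterType)
        .Where(t => t.IsInterface)
        .Distinct();

    foreach (var iface in wantedInterfaces)
    {
        var impl = assembly.GetTypes()
            .FirstOrDefault(t => iface.IsAssignableFrom(t) && !t.IsAbstract && !t.IsInterface);

        // Skip interfaces that already have a binding to avoid the conflict case mentioned above.
        if (impl != null && !kernel.GetBindings(iface).Any())
            kernel.Bind(iface).To(impl).InSingletonScope();
    }
}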
And in answer to (2), as long as the methods and properties you reference don't change name or signature, you shouldn't have any problems using a newer version of DLL B with DLL A. If you change a return type, change from a public field (which shouldn't exist in the first place) to a public property, change the parameters on a method or anything of that nature on a class that you're using from DLL B in DLL A, a recompile of A would be required.
I'm a C++ guy who has to work with some C# projects, hence I have a question. Having two projects placed on different svn servers, I need them to share interface classes. How should this be solved in C#?
For example, I have a cs file which has an interface and a class used to pass data to the interface, i.e.
public class data
{
    public int a;
    public int b;
}

public interface Ifoo
{
    int foo(data value);
}
This interface is implemented in ProjectA and used by ProjectB.
I want to be able to choose the implementation of the interface, so that in the tests of ProjectB I can use a special implementation of the Ifoo interface, choosing a different dll using:
Assembly assembly = Assembly.LoadFrom(asm_name);
fooer = assembly.CreateInstance(class_name) as Ifoo;
Where should I place the Ifoo interface?
I thought it should be placed in the ProjectA svn repo (as ProjectA is the owner of the interface) and then checked out as an external together with the checkout of ProjectB.
Can you tell me what the rule of thumb is in such a case?
BR
Krzysztof
First of all, wherever you decide to put your interface and its associated data class (project A's svn, project B's, or a new one), the first (and quite obvious) recommendation is to put them together in their own library (DLL), without any dependency on other objects, so that it becomes easy to share them across different projects.
To use it in a different project (no matter whether that project is in another svn repository or not), you will have to give that project physical access to the interface/data class. Being in its own dll, without the constraint of requiring other objects, it's a simple matter of adding a reference to the library in the project.
With local copies of both projects, you don't need to copy the library itself into the other project.
In any case, you have to think your interface and data through well, so that you do not constantly make changes to them, in order to avoid compatibility problems between the projects. If you need to "add" something to the interface because of new features, create a new interface instead (and put it in another DLL). This way you will maintain compatibility with the projects that do not implement the new features.
If the data associated with the interface is so specific that any class implementing this interface will be used ONLY BY project A, then the obvious place to put the DLL is with project A. This is usually the case when a piece of software has the ability to use plugins: the interfaces live in a dll that can be provided "publicly" to plugin developers who do not have access to the main project itself, which can be as simple as making the DLL available for download. Since the SAME dll is used by both the main project and the plugins, there will be no problems (beyond the reasons already given not to change it).
But if your interface is more "generic" and is used to create something like a framework, where different (unrelated, non-dependent) projects can use it on their own, then the suggestion to separate it into a third project (with its own svn) is more interesting. With good policies regarding the development of this interface, the framework will be less problematic to maintain.
In the comments you said you can relate the "interface" to project A, but if you can use it in project B without project A being involved, you can relate the interface to project B as well, and so the option of moving the interface/associated data to a separate project is preferable.
In any case, the underlying implementation is irrelevant, as the main reason we use interfaces in C# is exactly to be able to use an object in a "generic" way without (necessarily) having to care about how it is implemented.
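As a hedged illustration of why the shared-contract dll pays off in tests: with Ifoo living in its own assembly, ProjectB's tests can supply their own fake while production code still loads the real implementation dynamically, as in the snippet from the question (the class and factory names below are invented):

using System.Reflection;

// Ifoo and data live in the shared contracts dll, referenced by both projects.

// A test double inside ProjectB's test project; no ProjectA reference needed.
public class FakeFoo : Ifoo
{
    public int foo(data value)
    {
        return value.a + value.b;   // deterministic behaviour for tests
    }
}

public static class FooFactory
{
    // Production: load the real implementation by name, as in the question.
    // Tests: simply construct FakeFoo instead.
    public static Ifoo Create(string asm_name, string class_name)
    {
        Assembly assembly = Assembly.LoadFrom(asm_name);
        return assembly.CreateInstance(class_name) as Ifoo;
    }
}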
Well, I have a project, and at the moment I am using .NET 4.0, because I would like this application to be compatible with Windows XP; EF 5.0 is only for Windows 7 and above.
However, I would like to implement some parts of the application with features of .NET 4.5, such as EF 5.0.
So for my database access I have a repository class that currently uses EF 4.0. This is an independent dll, so I can create another repository dll that uses EF 5.0, import both dlls in my project, and then instantiate the correct repository according to the version of EF that I can use. This is a parameter in the config file. Is this the best way?
I ask this because I don't know where I should declare my interface. My repository classes need to implement this interface, but then this ties my dlls to my application, and I need to use these repositories in two different applications, so I want to implement once and use in many applications. I want independent dlls, because right now there are two applications, but in the future there could be more.
The reason for wanting to use an interface in the application that uses the repositories is that I would like to instantiate the correct repository at runtime, according to the config file settings. That way, in the future I can implement new repositories and there is no need to change the code.
EDIT1: I read about multi-targeting, but if in my project I use features of, for example, .NET 4.0 and I want to compile for 3.5, I get an error because that feature does not exist in 3.5. That's correct. Is the only way then to maintain two different projects? That would be double the work.
Thanks.
Daimroc.
So for my database access I have a repository class that currently uses EF 4.0. This is an independent dll, so I can create another repository dll that uses EF 5.0, import both dlls in my project, and then instantiate the correct repository according to the version of EF that I can use. This is a parameter in the config file. Is this the best way?
You can go this route and I don't really see an issue with it unless you think that this could cause maintenance/development headaches in the future. There are a couple of other things that you can look into doing. I think both are completely valid and probably just personal opinion/preference.
Modules You can go a modular route where your repository DLLs are potentially loaded dynamically. Look into Microsoft's Unity library. This should allow you to create an IModule in each of your repository DLLs that will set up your application as needed. Then just create a UnityBootstrapper class to tell it how to find your modules (manually add them, look in a directory, etc.). This should allow you to hot swap your repository DLLs and not have to worry about setting a config file if you don't want to.
Preprocessor Directives With preprocessor directives you get to define how your code will compile. Depending on how you have your classes structured this may be something fairly simple to set up or a complete nightmare that makes you want to abstract and refactor your classes. This question: Detect target framework version at compile time has an answer for handling different compile results depending on the target framework. Personally though, I like the modular route.
I ask this because I don't know where I should declare my interface. My repository classes need to implement this interface, but then this ties my dlls to my application, and I need to use these repositories in two different applications, so I want to implement once and use in many applications. I want independent dlls, because right now there are two applications, but in the future there could be more.
The reason for wanting to use an interface in the application that uses the repositories is that I would like to instantiate the correct repository at runtime, according to the config file settings. That way, in the future I can implement new repositories and there is no need to change the code.
Sounds like you need to create another library that is used to communicate between your UI and your Repository libraries. This can be a little tricky and overwhelming to set up just right. Basically you want your gateway DLL to house the interfaces and business objects. Your Application would reference this DLL and this DLL would reference your repositories.
Depending on your needs you may actually need to set up another intermediary DLL that would actually just house your interfaces and most basic utility classes. This would allow you to have your EF objects implement the same interface that your application is using without the need for your gateway DLL having to map your business objects and EF objects back and forth.
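A hedged sketch of that layering, with invented names (a contracts dll holds the interface, each EF-specific dll implements it, and the application picks one at runtime from an appSettings entry):

using System;
using System.Configuration;

// Contracts.dll - referenced by the application and by both repository dlls.
public interface ICustomerRepository
{
    Customer GetById(int id);
}

public class Customer
{
    public int Id { get; set; }
    public string Name { get; set; }
}

// Repositories.Ef4.dll and Repositories.Ef5.dll each contain a class that
// implements ICustomerRepository against their own EF version.

// Application side: resolve the implementation named in the config file, e.g.
//   <add key="RepositoryType"
//        value="Repositories.Ef5.CustomerRepository, Repositories.Ef5" />
public static class RepositoryFactory
{
    public static ICustomerRepository Create()
    {
        string typeName = ConfigurationManager.AppSettings["RepositoryType"];
        Type type = Type.GetType(typeName, true);
        return (ICustomerRepository)Activator.CreateInstance(type);
    }
}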
EDIT1: I read about multi-targeting, but if in my project I use features of, for example, .NET 4.0 and I want to compile for 3.5, I get an error because that feature does not exist in 3.5. That's correct. Is the only way then to maintain two different projects? That would be double the work.
I believe you can get around this by using the preprocessor directives I mentioned above. Below is just an example of making a method behave differently depending on whether the framework is .NET 2.0; it's just an example and not tested. The DefineConstants will need to be set up, but this should allow you to keep one project for multiple framework targets while also being able to use newer .NET features as they are released.
// DOTNET_20 must be added to DefineConstants in the project's .NET 2.0 build configuration.
public Person FindPersonByName(List<Person> people, string name)
{
#if DOTNET_20
    // Older target: LINQ is not available, so search the list manually.
    foreach (Person person in people)
    {
        if (person.Name == name)
            return person;
    }
    return null;
#else
    // Newer target: LINQ is available.
    return people.FirstOrDefault(p => p.Name == name);
#endif
}
I hope this was helpful and the best of luck in finding the right solution.
I have a project where some business logic is separated into a DLL project; this DLL contains the business logic of the software for a specific customer.
Now I have a problem: another client with different rules wants to implement the software, so I need some way for the application to load the appropriate dll according to the client using the software, considering that this dll contains the same function names but different bodies.
I'm using C# on .NET 3.5. Is there a way to do this?
Yes, you certainly can. You can branch the project, alter the implementation of the classes, keep the signatures of all the classes and class members the same, recompile, and your business logic will behave as you wish.
But, this is not good. You will have two different branches, with different implementations, for which you will have to keep the signatures in synch forever. And then you'll have another client, and another. This will be a nightmare that never ends.
Is it possible that the differing functionality can be separated out? You can:
put configuration in the database or configuration files (probably XML). A lot of your app should work based on tables or config files, for this reason.
you can implement plug-ins and providers for places where the code needs to be different.
kind of old school, but you can implement plug-and-play functionality using the part of CodeDom that compiles code (ignore the part about graphing out code). You can then put functionality in easily edited text files.
take a look at the Managed Extensibility Framework, built for just this type of thing.
Code the business logic against an interface - IBusinessLogic.
You can keep both business logics in the same assembly, and use config based dependency injection to specify which business logic is used during the deployment to the customer.
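For example (a minimal sketch; the config key, namespaces, and method are invented):

using System;
using System.Configuration;

// Shared assembly: the contract that every customer-specific dll implements.
public interface IBusinessLogic
{
    decimal CalculateDiscount(decimal orderTotal);
}

public static class BusinessLogicResolver
{
    // app.config decides which implementation ships to which customer, e.g.
    //   <add key="BusinessLogicType"
    //        value="Rules.CustomerA.BusinessLogic, Rules.CustomerA" />
    public static IBusinessLogic Resolve()
    {
        string typeName = ConfigurationManager.AppSettings["BusinessLogicType"];
        return (IBusinessLogic)Activator.CreateInstance(Type.GetType(typeName, true));
    }
}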
If I understood your problem correctly, then you are looking for business logic customization. You can achieve it in several ways; I am describing one of them here.
Create a folder in your application directory for customization DLLs. Create all your business objects through a wrapper, which first checks the customization dlls (using reflection) for an appropriate class before creating any business object; otherwise it creates the business logic from the regular class. Hope this helps.
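A hedged sketch of such a wrapper, assuming the customization dlls sit in a "Customizations" subfolder and that both the default and the customized classes implement a shared interface (all names below are invented):

using System;
using System.IO;
using System.Linq;
using System.Reflection;

public static class BusinessObjectFactory
{
    // Look for an override of T in the Customizations folder first;
    // fall back to the regular implementation if none is found.
    public static T Create<T>(Func<T> createDefault) where T : class
    {
        string folder = Path.Combine(AppDomain.CurrentDomain.BaseDirectory, "Customizations");

        if (Directory.Exists(folder))
        {
            foreach (string dll in Directory.GetFiles(folder, "*.dll"))
            {
                Type match = Assembly.LoadFrom(dll).GetTypes()
                    .FirstOrDefault(t => typeof(T).IsAssignableFrom(t) && !t.IsAbstract);

                if (match != null)
                    return (T)Activator.CreateInstance(match);
            }
        }

        return createDefault();
    }
}

// Usage: var invoiceLogic = BusinessObjectFactory.Create<IInvoiceLogic>(() => new DefaultInvoiceLogic());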