How to include 3rd-party code in separate versions in one project - C#

I've got an interesting problem on my hands and I can't quite figure out the right way of handling it. This is specific to Sitecore, but I imagine the fix would apply to anyone running multiple websites on different versions of a framework.
Right now I have 3 separate websites running Sitecore as the framework and CMS. One website is running code from Sitecore 6.5, another is on 7.0, and a third is on 7.0 but will move to 7.2 soon enough.
One of the core principles of programming is don't repeat yourself. I want to set up a separate C# project to handle Sitecore-specific logic and classes. It would mostly contain utility-like classes with simple functions that make my life easier when checking many kinds of things. These base features are included in every version of Sitecore I am using.
Basically there is a ton of shared functionality between the Sitecore DLLs despite the differences, and I want to be able to write version agnostic code in one place.
I don't care if it needs to build out 3 separate DLLs for each set of Sitecore DLLs I need to compile with, as long as I can keep one base source. Is this sort of thing possible?

How I would handle it:
Set up an independent project and make use of configurations/symbols. A lot of the simple .NET code can probably be universally shared; however, given you're working with different versions of SC, you would most likely deal with deprecated functionality, API changes, etc. One example I can think of is UIFilterHelpers.ParseDatasourceString (which is deprecated in 7.2 in favor of SearchStringModel.ParseDatasourceString). There are a lot of ways to approach this, but for example:
Inline Versions
#if SC7
IEnumerable<SearchStringModel> searchStringModel = UIFilterHelpers.ParseDatasourceString(Attributes["sc_datasource"]);
#else //SC72
IEnumerable<SearchStringModel> searchStringModel = SearchStringModel.ParseDatasourceString(Attributes["sc_datasource"]);
#endif
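Those SC7/SC72 symbols aren't built in; each project (or build configuration) has to define them. A hypothetical fragment from a Common.SC72.csproj might look like this (the symbol names are just the ones used above):

```xml
<!-- Define the SC72 conditional-compilation symbol for this project,
     so the #if SC72 branches are the ones that get compiled. -->
<PropertyGroup Condition=" '$(Configuration)' == 'Release' ">
  <DefineConstants>TRACE;SC72</DefineConstants>
</PropertyGroup>
```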
Another approach is to use partial classes and define version-specific implementations (then only include those in the correct project). Maybe you have:
Common.sln
  Common.SC65.csproj
    MyClass.cs [shared]
    MyClass.SC65.cs
  Common.SC7.csproj
    MyClass.cs [shared]
    MyClass.SC7.cs
  Common.SC72.csproj
    MyClass.cs [shared]
    MyClass.SC72.cs
In the above, MyClass.cs resides in the root and is included in every project. However, the .SC#.cs files are only included in the project targeting the specific Sitecore version.
This pattern is used a lot by libraries that target different .NET platforms or various configurations. To use an MVC example, you'd have MyProject.csproj, MyProject.MVC3.csproj, MyProject.MVC4.csproj, MyProject.MVC5.csproj (each with different references and possibly framework versions).
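The partial-class layout above can be sketched like this (the class and method names are illustrative, not from any real Sitecore API; the shared part calls a method that each version-specific file supplies):

```csharp
// MyClass.cs -- included in every project
public partial class MyClass
{
    public string Describe()
    {
        // VersionName() is supplied by whichever MyClass.SC*.cs file
        // the current project includes.
        return "Common logic, compiled for " + VersionName();
    }
}

// MyClass.SC72.cs -- only included in Common.SC72.csproj
public partial class MyClass
{
    private string VersionName() { return "Sitecore 7.2"; }
}
```

Because both parts must end up in the same compilation, a project that forgets to include its version-specific file fails at build time rather than at runtime, which is exactly what you want here.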

Related

How to look at a .NET project architecture at a glance

We are working on an enterprise-level .NET project with a huge code base. We have our own small frameworks implemented within the project.
While working, I often want to see a particular module's (or its framework's) class hierarchy at a glance, which seems difficult. I have to drill down into different class files to see the relationships, which is somewhat tedious and takes time.
One option is to create a dummy class-diagram file and drag-and-drop particular class files onto it to check the relationships, but that doesn't work all that well.
Is there some other practice being used which I am not aware of?
One thing I know of is that Visual Studio 2013 Ultimate edition has an Architecture tab (2010 and 2012 also have it, but I never used it in those), which can be used to generate a dependency graph of all the projects in a particular solution.
I have seen that it shows dependencies down to the class level.
You can refer to this Channel9 link to learn more about it.
I've used .NET Reflector (with various plugins) in the same context.
There's a huge amount of plugins available at https://reflectoraddins.codeplex.com/ that ease the task even further.
On the "free" side, you'll probably want to give the following combo a try:
ILSpy (http://ilspy.net/) + AssemblyVisualizer (http://denismarkelov.github.io/AssemblyVisualizer/)
Note: all of the above view the hierarchy from the assembly's point of view, not the source's.

Is it possible to set up a base project for use across multiple ASP.NET MVC projects?

My team lead handed this one to me, and I'm a bit stumped. We have just started using ASP.NET MVC for web development in our shop, and there are common design and functionality that we would like to be able to use across multiple sites.
So far, I have looked at creating a custom template with the common elements, but the downside to that is that updates to the template (as far as I can tell) do not automatically get pushed to projects created using that template. As having changes automatically update to the consuming projects is a requirement, custom templates won't work for me.
My question is: is it possible to set up a base project for use across multiple ASP.NET MVC projects, where updates to the base get propagated to the consuming projects? If you have any experience in this area, I would certainly appreciate some direction. My apologies if this question seems elementary, but this is my first real foray into ASP.NET MVC.
I've found that the best method for sharing resources between disparate projects is to create your own NuGet packages. These can contain anything from a class library with reusable classes, enums, extension methods, etc. to entire web applications complete with controllers, views, JavaScript, CSS, etc. The scope is entirely up to how much commonality you can abstract from your projects. You can then set up your own private NuGet repository to hold these so you don't have to publish them to the whole world. (Although, if you do create something that would benefit others as well, by all means do share on the official NuGet repo.)
Setting everything up is pretty trivial. I learned how to create NuGet packages and set up a private repo in a day. Here are some resources to get you started:
Official NuGet documentation for creating and deploying packages
Using the Package Explorer application to create packages via a GUI
Official nuspec (the package manifest file) reference
Hosting your own NuGet feeds
Alternate method for creating your own repository with SymbolSource integration
SymbolSource also offers private repos, remotely hosted on their servers, gratis. Some enterprise environments may not like having their code "in the cloud", but if you can get by with it, this is by far the easiest way to get going.
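As a taste of what's involved, a minimal package manifest (.nuspec) looks roughly like this (the id, paths, and target framework here are placeholders, not from the original answer):

```xml
<?xml version="1.0"?>
<package>
  <metadata>
    <id>MyCompany.Web.Common</id>
    <version>1.0.0</version>
    <authors>MyCompany</authors>
    <description>Shared MVC base classes, helpers and assets.</description>
  </metadata>
  <files>
    <!-- the compiled shared library -->
    <file src="bin\Release\MyCompany.Web.Common.dll" target="lib\net45" />
    <!-- shared content (views, scripts, CSS) copied into consuming projects -->
    <file src="Content\Shared\**" target="content\Shared" />
  </files>
</package>
```

Running `nuget pack MyCompany.Web.Common.nuspec` then produces the .nupkg you drop into your private feed; consuming projects pick up changes by updating the package.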
From experience, the company I work for has found that whilst there are common design and functionality elements across our projects, the uncommon elements can be too broad, which outweighs the need for some form of base project. Custom project templates also become a maintenance nightmare, so avoid those.
Instead, we've opted to document how a project should be set up for particular designs, and it's up to the Team Lead to decide which bits are needed for the particular project they are working on.
Where there are functional overlaps, we've considered (but not actually done yet) creating common libraries that have their own development lifecycle, then setting up our own NuGet server to distribute them to other projects. We haven't done this yet, mainly because the differences between the projects we've worked on tend to be large enough that it isn't warranted.
But from the sound of what you're describing, NuGet packages or something similar could be the way to go in your case.
While I don't think there's a way to set up a base project that everything else inherits from, you could quite easily set up a common library project that all others reference. It could include base classes for all the common things you'll be using (e.g. a ControllerBase).
This way, updating the library project will allow new functionality to be added to all other projects. You could configure templates so that the common base classes are used by default when adding new elements.
Depending on how you reference the common library (compiled dll/linked project reference) you either get a stable link to a specific version or instant updates across all projects. I'd personally prefer to reference the common dll rather than the project, since this allows project A to be using an older version than project B. Updating A to the new version is trivial, but it gives you a level of separation so that if B requires breaking changes, you don't have to waste resources to keep A working.
Another added bonus is that checking out an old version from source control would still be guaranteed to work as it would be tied to the version of the library in use at the time it was created.
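The shape of such a shared library could look like the sketch below. To keep it self-contained, the base class here is plain C# (in a real MVC project it would derive from System.Web.Mvc.Controller); all the names are illustrative:

```csharp
// In the shared class library (e.g. MyCompany.Web.Common.dll)
namespace MyCompany.Common
{
    public abstract class SiteControllerBase
    {
        // Shared behaviour every consuming project inherits; updating the
        // library and shipping a new dll updates all sites at once.
        public string Greeting(string user)
        {
            return "Welcome to " + SiteName + ", " + user;
        }

        // Each site supplies only what actually differs.
        protected abstract string SiteName { get; }
    }
}

// In one consuming project
namespace SiteA
{
    public class HomeController : MyCompany.Common.SiteControllerBase
    {
        protected override string SiteName { get { return "Site A"; } }
    }
}
```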

Converting multiple VFP9 classes to C#

Let me start by saying that I have not programmed in a heavily OO language since Java in college (I graduated in December 2005). Since graduating I have been programming with FoxPro 2.5 through VFP9. Now, the company I work for is pushing to convert all of our FoxPro applications to C#.
My part of this project is converting our report parsing application. In VFP9 it consists of 5-6 forms (none of which will be carried over as we have created a new C# front-end to replace it), a single Base class that contains all of our standard methods, and approximately 575 individual parser classes (some of which do nothing but set a few parser specific variables/properties and call the needed base classes). Some of the parsers contain their own custom methods which still use and interact with the base methods and global properties.
Now for my question...
From a design standpoint, we would like for our new C# front-end to spawn multiple executables (3-5 EXEs) that will call our new C# base/parser class libraries (DLLs). My original thought was that I would have one solution/project with a Base_Code.cs and the other 575 parser.cs files (H1.cs, H2.cs, H3.cs, etc). However, we need the capability to build each .cs file independently of the others as I may be updating the Base_Code.cs while my co-worker is updating the H1.cs.
How do I best structure this? Do I keep one solution but create 576 projects, or do I create 576 solutions all sharing one namespace, as another team is currently attempting?
There are several global variables/properties that we use throughout the base code and each parser (these will be passed in from the front-end application) like file paths, file names, etc. that will be static so this needs to be taken into consideration as well when thinking of the design.
EDIT FOR EXAMPLE:
The C# front-end is basically a queueing system and file/status viewer. This front-end "queues" the reports we pick up throughout the day. The report at the top of the list determines what DLL will be needed. The front-end application and the DLLs are completely separate.
Example: H00001_2342318.MSG - this will call the H00001 DLL
H00002_3422551.MSG - this will call the H00002 DLL
Each H00001, H00002, etc (575 DLLs in total) will use methods that are in the BASE DLL.
If I have to update the H00001 DLL, I need to do so without having to rebuild all 575 DLLs.
It sounds like what you want is a "plugin" kind of architecture. This lets you drop in / update dlls (assemblies) without recompiling the main app.
E.g. http://code.msdn.microsoft.com/windowsdesktop/Creating-a-simple-plugin-b6174b62
and related.
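A minimal version of that plugin idea, applied to the file naming in the question, could look like this. The IParser interface, directory layout, and type-name convention are assumptions for illustration, not from the original code:

```csharp
using System;
using System.IO;
using System.Reflection;

// Contract every parser dll implements.
public interface IParser
{
    void Parse(string messageFile);
}

public static class ParserLoader
{
    // "H00001_2342318.MSG" -> "H00001"
    public static string GetParserName(string messageFile)
    {
        string name = Path.GetFileNameWithoutExtension(messageFile);
        int underscore = name.IndexOf('_');
        return underscore < 0 ? name : name.Substring(0, underscore);
    }

    public static IParser Load(string pluginDir, string messageFile)
    {
        string parserName = GetParserName(messageFile);
        // Each parser lives in its own dll, e.g. plugins\H00001.dll, so a
        // single parser can be rebuilt and dropped in without touching the
        // other 574 or the front-end.
        var asm = Assembly.LoadFrom(Path.Combine(pluginDir, parserName + ".dll"));
        var type = asm.GetType(parserName + ".Parser"); // convention-based type name
        return (IParser)Activator.CreateInstance(type);
    }
}
```

The BASE dll would then just be an ordinary assembly that every parser dll references; only parsers whose code changed need to be recompiled and redeployed.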
576 projects and/or solutions is a maintenance nightmare. I have far fewer (75 or so) projects across less than a dozen solutions for an entire suite of products. This includes tests, framework components, etc. and the only way I keep on top of it is via strict naming/path conventions, source control, and scripts for automation.
Speaking of tests, you should be planning for unit tests (greenfield development presents an excellent opportunity for this; don't miss the opportunity). Tests go in a separate project; this is another reason why each class shouldn't be given its own assembly. You could theoretically double your project count.
I would start with a single solution with logically separated projects (e.g. break things out by function/dependency, and add a test project for each library).
Source control eliminates any concerns about conflicts between team members. If you have internal challenges getting source control up and running, look into Team Foundation Services Online: http://tfs.visualstudio.com/. Setup is incredibly easy and it's free for up to 5 users.
If you do end up needing greater disconnection between projects, you may want to consider using NuGet packages with a local repository to isolate/version different, discrete components. This wouldn't be my first step, but it is a worthwhile option to keep in mind.
From the comments, it sounds like you are currently performing daily deployments of autonomous units.
This seems risky. Can this be driven by configuration/data rather than code changes?
576 completely autonomous units seems like a problem with application design (e.g. reuse?)
Assuming that new code must be deployed every day, perhaps a scripting language + DLR could make this easier. Dynamic compilation of C# is also a possibility.

Easiest way to refactor package in C# or Java?

I'm very annoyed by C# and Java refactoring of namespaces or packages. If many classes reference a class in a common package used across many independent projects, and you decide to move that package so it becomes a child of its current parent package, you have to modify all clients, just because you cannot use generic imports like this:
import mypackage.*
which would allow refactoring without impacting clients.
So how do you manage refactoring when the impact can be so big for such a small change?
And if it's client code not under my control, am I stuck?
Use an IDE with support for refactoring. If you move a java file in Eclipse, all references are updated. Same for rename, package name changes, etc. Very handy.
It sounds like you're asking about packages that are compiled and deployed to other projects as, for instance, a JAR file. This is one reason why getting your API as correct as possible is so important.
How to Design a Good API and Why it Matters
I think that you could deprecate the existing structure and modify each class to be a wrapper or facade to the new refactored class. This might give you flexibility to continue improving the new structure while slowing migrating projects that use the old code.
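In C#, that wrapper/facade approach might look like the sketch below; the class names are made up for illustration. The old type stays where clients expect it, marked [Obsolete], and just delegates to the relocated class:

```csharp
using System;

// New home for the class after the refactor.
namespace MyCompany.Core.Text
{
    public class Slugifier
    {
        public string Slugify(string s)
        {
            return s.Trim().ToLowerInvariant().Replace(' ', '-');
        }
    }
}

// Old namespace: kept as a thin, deprecated facade so existing client
// code keeps compiling (with a warning) while it migrates.
namespace MyCompany.Utils
{
    [Obsolete("Moved to MyCompany.Core.Text.Slugifier")]
    public class Slugifier
    {
        private readonly MyCompany.Core.Text.Slugifier _inner =
            new MyCompany.Core.Text.Slugifier();

        public string Slugify(string s) { return _inner.Slugify(s); }
    }
}
```

Clients see a compiler warning pointing them at the new location, and you can delete the facade once everyone has moved over.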
Imagine someone doing an import like import com.*. If it worked the way you want it to, it would load anything and everything in a com package, which means zillions of classes would be imported, and then you would complain about why it is so slow and why it requires so much memory...
In your case, if you use an IDE, it will take care of most of the work and will be very easy, but you will still need to deploy new executables to your clients if your application architecture requires it.

How to program three editions Light, Pro, Ultimate in one solution

I'd like to know how best to program three different editions of my C# ASP.NET 3.5 application in VS2008 Professional (which includes a web deployment project).
I have a Light, Pro and Ultimate edition (or version) of my application.
At the moment I've put everything in one solution with three build configurations in the configuration manager, and I use preprocessor directives all over the code (there are around 20 such constructs in some ten thousand lines of code, so it's manageable):
#if light
//light code
#endif
#if pro
//pro code
#endif //etc...
I've read Stack Overflow for hours and hoped to find out how, e.g., Microsoft does this with its different Windows editions, but did not find what I expected.
Somewhere there is a heavy discussion about whether preprocessor directives are evil.
What I like about those #if directives is:
the side-by-side code of the differences, so I will still understand the code for the different editions after six months,
and the special benefit of NOT giving out compiled code of other versions to the customer.
OK, long explanation; repeated question:
What's the best way to go?
I'd be tempted to manage the differences at runtime with different licences, and enable/disable features using that configuration. Why?
you only have to build one deployable.
you can unit test this much more easily, rather than build 3 versions and test this.
users can upgrade and simply be sent a new licence. They won't have to upgrade/reinstall.
You have to weigh this up against your concern for distributing a solution that your customers haven't actually paid for (and can simply enable via an appropriately secure licence key).
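A bare-bones sketch of that licence-driven gating (the feature names and the plain constructor are illustrative; a real licence key would of course be cryptographically signed and validated):

```csharp
using System;

// One flag per gated feature; editions are just combinations of flags.
[Flags]
public enum Features
{
    None      = 0,
    Reporting = 1,
    Export    = 2,
    Api       = 4
}

public class Licence
{
    public Features Enabled { get; private set; }

    public Licence(Features enabled) { Enabled = enabled; }

    public bool Has(Features f) { return (Enabled & f) == f; }
}

// Usage: one deployable, behaviour switched by the licence at runtime, e.g.
//   var pro = new Licence(Features.Reporting | Features.Export);
//   if (licence.Has(Features.Api)) { /* Ultimate-only code path */ }
```

An upgrade is then just a new licence value, with no reinstall, which is the third bullet point above.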
My first thought is to split your software into various modules (projects/assemblies), and then create three different setup projects in your solution, one for each version. In the setup, you only include the modules you need.
You will lose the "side-by-side" code, but IMHO that just creates complicated methods instead of maintainable code. Use extension methods if you want to provide more functionality for a type, or derive classes.
I would suggest creating the basic classes and features as normal, but allowing overrides for the methods that are edition-specific.
Then you create a Light/Pro/Ultimate edition assembly that overrides those methods.
Then you need a factory that instantiates the correct overriding types depending on the edition.
Here you could work with internal accessors and make the core assembly's internals visible to the edition assemblies (InternalsVisibleTo).
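Condensed into one file, the factory idea could be sketched like this (the type names and the string-keyed factory are illustrative; in practice the base and each edition override would live in separate assemblies):

```csharp
using System;

// Base assembly: default (Light) behaviour, with edition-specific
// members marked virtual.
public class ReportService
{
    public virtual int MaxRows { get { return 100; } }
}

// Pro edition assembly: overrides only what differs.
public class ProReportService : ReportService
{
    public override int MaxRows { get { return 10000; } }
}

// Factory that instantiates the correct overriding type per edition.
public static class EditionFactory
{
    public static ReportService Create(string edition)
    {
        switch (edition)
        {
            case "Pro":  return new ProReportService();
            case "Light":
            default:     return new ReportService();
        }
    }
}
```

Since only the assemblies for the purchased edition ship, this also preserves the asker's requirement of not handing the customer compiled code for the other editions.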
