Mapping Objects - C#

What is the best solution for mapping a class object to a lightweight class object? For example: Customer to CustomerDTO, where both have the same property names. I was looking for the most optimized solution for mapping between them. I know reflection slows me down badly, and writing a method for every mapping is time-consuming, so any ideas?
Thanks in advance.

AutoMapper. There's also ValueInjecter and Emit Mapper.
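A minimal AutoMapper sketch (the API of recent versions; older releases used the static Mapper.CreateMap instead). Given a Customer instance named customer, same-named properties are matched by convention:

using AutoMapper;

var config = new MapperConfiguration(cfg => cfg.CreateMap<Customer, CustomerDTO>());
var mapper = config.CreateMapper();
CustomerDTO dto = mapper.Map<CustomerDTO>(customer); // copies same-named properties

The mapping plan is built once at configuration time, so the per-call cost avoids the reflection penalty you're worried about.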

If reflection is slowing you down too much, try Fasterflect: http://www.codeproject.com/KB/library/fasterflect_.aspx
If you use the caching mechanism, it is not much slower than hand-written code.
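If you'd rather avoid the dependency, the caching idea is easy to reproduce yourself. The following is not Fasterflect's API, just a sketch of the same principle using expression trees: build the property-copy code once per type pair, cache the compiled delegate, and every call after that is close to hand-written speed (assumes same-named, same-typed properties):

using System;
using System.Linq;
using System.Linq.Expressions;

public static class CachedMapper<TSource, TTarget> where TTarget : new()
{
    // compiled once per (TSource, TTarget) pair, cached in the static field
    private static readonly Func<TSource, TTarget> map = Build();

    private static Func<TSource, TTarget> Build()
    {
        var src = Expression.Parameter(typeof(TSource), "src");
        var bindings = typeof(TTarget).GetProperties()
            .Where(p => p.CanWrite && typeof(TSource).GetProperty(p.Name) != null)
            .Select(p => (MemberBinding)Expression.Bind(
                p, Expression.Property(src, typeof(TSource).GetProperty(p.Name))));
        var body = Expression.MemberInit(Expression.New(typeof(TTarget)), bindings);
        return Expression.Lambda<Func<TSource, TTarget>>(body, src).Compile();
    }

    public static TTarget Map(TSource source) { return map(source); }
}

// usage: CustomerDTO dto = CachedMapper<Customer, CustomerDTO>.Map(customer);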

I've been playing about with this and have the following observations. Should Customer inherit from CustomerDTO, or read/write to a CustomerDTO? I've found that some DTO generators only generate dumb fixed-size array collections for vectors of data items within the DTO, while others will allow you to specify a List<> or some such collection. The high-level collection does not need to appear in the serialised DTO, but it affects which approach you take: if your solution adds high-level collections, you can inherit; if it doesn't, you probably want to read/write to an intermediate DTO.
I've used Protocol Buffers and XSDObjectGenerator for my DTO Generation (at different times!).

A new alternative is UltraMapper.
It is faster than anything I tried up to Feb 2017 (2x faster than AutoMapper in any scenario).
It is more reliable than AutoMapper (no StackOverflows, no depth limits, no self-reference limits).
UltraMapper is just 1300 lines of code instead of AutoMapper's 4500+, and it is easier to understand, maintain and contribute to.
It is actively developed, but at this moment it needs community review.
Give it a try and leave feedback on the project page!

Related

Which strategy can be used when converting back and forth between a domain class and several versions of the same target class?

For my specific context I control the target classes. They were auto-generated based on XSDs and have huge overlaps because they represent different versions of the same class.
Each version is a huge C# class of over 5,000 lines.
Support can't be dropped for old versions. This means we always need to be able to map the domain class to several different versions and back again. There are always small but breaking changes from version to version. More than 90% of the target class is always the same, even if the code is duplicated for each version.
Currently there is one big mapping for each format, which is a horror. There is so. much. duplicated. code. Furthermore, developers tend to make updates where they need it, and skip everything else, which means individual versions often go out of sync, meaning that one version will be updated to do something that other versions don't. This is also not ideal.
So my question to you is: What strategy can you use for this kind of mapping?
Given the size of your classes, and having to maintain multiple versions, I'd suggest serializing and deserializing. Assuming that the classes otherwise approximate one another, Newtonsoft's JsonConvert doing JsonConvert.DeserializeObject<TargetClass>(JsonConvert.SerializeObject(sourceClass)) should solve it, though I've not worked with such large models to have any idea of how performant it is.
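For the record, that's the whole mapper (the class names are placeholders):

using Newtonsoft.Json;

TargetClass target = JsonConvert.DeserializeObject<TargetClass>(
    JsonConvert.SerializeObject(sourceClass)); // copies everything with matching names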
Alternatively, you could use a T4 template (if you're not on .NET Core, anyway) to generate the mapping using reflection into a common method or whatever.
As far as preventing the developer problem... interfaces and base classes that define as much of this centrally as possible, plus code reviews to ensure that developers are making changes at the lowest layer they possibly can.
You can do some tricky things with inheritance and using-alias directives, I'm pretty sure.
Something dumb like
using OldVersion = path.to.the.class.CantRenameThis;
class CantRenameThis : OldVersion { }
We ended up with a solution that achieved the main targets:
Decent compile-time safety to spot mapping errors
De-duplication of code
No messing with the auto-generated code
We did this by exploiting that the auto-generated classes are generated as partial. That means we can extend them.
We ended up creating hierarchies of interfaces/classes looking like this:
ClassV1 implements IClassVerySpecificV1
ClassV2 implements IClassVerySpecificV2
IClassVerySpecificV1 implements SpecificA, SpecificB, SpecificC and IClassBasic
IClassVerySpecificV2 implements SpecificB, SpecificC, SpecificD and IClassBasic
A mapper would then look like:
ClassV1Mapper requires a SpecificAMapper, SpecificBMapper, SpecificCMapper and ClassBasicMapper
ClassV2Mapper requires a SpecificBMapper, SpecificCMapper, SpecificDMapper and ClassBasicMapper
This way we could map 90% of everything by just throwing everything that belongs to IClassBasic into a ClassBasicMapper.
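As a sketch with made-up member names, the whole trick looks like this (the generated file is never edited; the interface is attached in a separate partial file):

// stands in for the auto-generated half of the class
public partial class ClassV1 { public string Name { get; set; } }

public interface IClassBasic { string Name { get; set; } }
public interface IClassVerySpecificV1 : IClassBasic { /* plus SpecificA/B/C */ }

// our own file: attach the interface without touching the generated code
public partial class ClassV1 : IClassVerySpecificV1 { }

public class ClassBasicMapper
{
    // the shared ~90%, written once and reused by ClassV1Mapper, ClassV2Mapper, ...
    public void Map(IClassBasic source, IClassBasic target)
    {
        target.Name = source.Name;
    }
}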
We did run into some issues however:
As you can already guess, we end up with a LOT of interfaces. More than you want.
Sometimes a field exists across versions but has different (enum) values. Our domain model would have the superset, with an attribute specifying which values were valid for which versions.

How to test a data mapper of very large objects

I have a very large class (500+ properties and nested complex objects) and we are mapping it to another class with the same properties, i.e. it is a one-to-one mapping.
Please no comments about why we are doing this (a long story - but this is a legacy system that is in the process of being re-architected, and this is a stepping stone to the next stage of refactoring out services) - and why not AutoMapper etc. The data mapping is hand-coded in C#.
I could create a test object, map it and compare the mapped object; however, there are SO many properties to populate that this in itself is a major task, which we hope to avoid.
Any thoughts on whether I could use reflection or serialize/deserialize or some test libraries or maybe use automapper in some way to fill object, map and compare?
We need to ensure a) all properties are mapped and b) each property is mapped to the correct property (the properties on each object are named the same).
I suspect a manual code review is probably the only feasible solution but I'm reaching out...
UPDATE
OK, not sure why people have down-voted this. It is a valid question with some potentially complex technical solutions. Thanks to those of you who have responded with useful suggestions!
Any thoughts on whether I could use reflection or serialize/deserialize or some test libraries or maybe use automapper in some way to fill object, map and compare?
You could just use a serializer and serialize one object and deserialize the other. Could be a three-to-five-liner if your objects are plain data classes that don't do exotic stuff.
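A sketch of that idea combined with a test-data generator, so nobody has to hand-populate 500+ properties. AutoFixture fills the whole object graph with random non-default values, and Newtonsoft.Json does the comparison; LegacyOrder, OrderMapper and the NUnit attributes are stand-ins for whatever you use:

using AutoFixture;
using Newtonsoft.Json;
using NUnit.Framework;

[TestFixture]
public class OrderMapperTests
{
    [Test]
    public void Map_copies_every_property()
    {
        var source = new Fixture().Create<LegacyOrder>(); // fills the whole graph
        var mapped = OrderMapper.Map(source);

        // same property names in the same shape => identical JSON
        Assert.AreEqual(
            JsonConvert.SerializeObject(source),
            JsonConvert.SerializeObject(mapped));
    }
}

Because AutoFixture assigns distinct values to every property, a property that is skipped or mapped to the wrong target shows up as a JSON diff.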

Database lookup table to enum or similar

I am using lookup tables for reference data, e.g. registration types: admin, moderator. A factory then determines the type of registration. What is the easiest way to create a strongly typed way of comparing registrations, with behaviour similar to an enum? For example, in pseudo code:
class RegFactory
{
    public Registration Create(RegType regType)
    {
        if (regType == RegType.Admin) return new AdminReg();
        throw new ArgumentException("Unknown registration type");
    }
}
The only way I can think of is a dictionary of magic strings generated from the database.
I believe the only way to accomplish strongly typed enums for your situation would be code generation. Anything not generated before compile time would not serve for strong typing.
Robert Koritnik posted a very slick way to do this: T4 template to Generate Enums
Another way to generate more readable enum names (in case you need them) is the Humanizer project at https://github.com/MehdiK/Humanizer.
From a practical point of view it may seem a bit error-prone, and you may feel like you're breaking some good-practice rule by not centralizing access to that data, leaving it at risk of going out of sync during maintenance. From an architectural point of view, however, considering that we're talking about look-up data, it's fine to hard-code it: it's just part of your "static data contract", if you will.
If you do have lots of those, then maybe there could be a case for putting those constants in a format where a build or database patch script could update them when those values are changed, but 9 times out of 10 just stuffing them in an enum works fine.
It's worth noting that some ORMs do have good support for enums, including EF, which will keep those values in sync if you adopt a code-first approach. However, we're talking about adding a whole new layer to your software, so you need more reasons than just wanting to keep your static look-up data in sync to implement that.
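For completeness, the hard-coded flavour is just this (the values are assumed to mirror the lookup table's primary keys):

// must be kept in sync with the RegistrationType lookup table
public enum RegistrationType
{
    Admin = 1,
    Moderator = 2
}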
You can use reflection in C#. There is an excellent example in this answer of listing the declared classes in a given namespace.
Then you would compare the name of your registration type with the available classes' names to decide which class to instantiate.
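A sketch of that lookup; the namespace and the name-plus-"Reg" convention are assumptions for illustration:

using System;
using System.Linq;
using System.Reflection;

static object CreateRegistration(string dbTypeName)
{
    // find the class whose name matches the lookup-table value
    Type regType = Assembly.GetExecutingAssembly().GetTypes()
        .First(t => t.Namespace == "MyApp.Registrations"
                 && t.Name.Equals(dbTypeName + "Reg", StringComparison.OrdinalIgnoreCase));
    return Activator.CreateInstance(regType);
}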

How to handle and organize DTOs for different context?

When using simple DTOs in various scenarios I have frequently run into the same kind of problem and I always wondered whether there's a better way to deal with it.
The thing is, I have a business object, e.g. Asset, which has a bunch of properties, child objects and calculated fields, some of them expensive to calculate in terms of time, some of them huge in terms of data amount. I need to use a different flavor of this object in various screens in the UI, e.g.
in a tree where there is a hierarchy displayed and I don't need much more than the display name
in a grid where I'm showing just a couple of properties
in a detail pane where there's a big subset of available information, but still some of it (like mapped objects) is shown only on demand
To be able to achieve optimal performance with this scenario, I have always created different DTOs for each context, only containing the subset of information which is actually used in that context. While being a resource-optimal solution, this leads to a couple of problems:
I have a class explosion with a huge number of DTO classes
I have quite a hard time coming up with different names for the same thing like AssetDtoForGridInTheOverviewScreenInTheUpperPaneAboveTheSplitter, not to mention maintaining them later
I am frequently repeating myself in the transformation methods, because there are properties that are used by most of the DTOs but not by all of them (therefore I can't put them into any superclass and reuse the transformation logic)
The technology I'm using is ASP.NET SOAP web services and C# on .NET 3.5, but I think this could be a language-agnostic problem. Any ideas are welcome.
This is a known problem with DTOs. It's described in this otherwise mediocre article on MSDN. To paraphrase: the DTO is the most versatile n-tier data access pattern, but it also requires the most work.
You can address some of your issues with mapping by using convention-based mapping, such as AutoMapper.
When it comes to class explosion, could it be that you are using data structures that are too flat?
This can be difficult to tell because DTOs naturally include a great deal of semantic repetition that turns out to not be logical repetition at all. For example, even if you have semantically similar types, if one is a ViewModel and the other is a Domain Object, they may share semantic structure, but have vastly different responsibilities.
If, on the other hand, you have a lot of repetition in the same application layer (e.g. UI), you may be violating the DRY principle. In this case, it may often help to encapsulate related data in what starts out as a flat data structure into a separate class. In most UI frameworks I'm aware of, you can still databind a flat display to a hierarchically structured class.
The problem of class explosion is inherent to the DTO approach, there probably isn't much you can do about that. Be careful not to mix your view-model with your DTO model. Your DTO's should only be used to get the data from your data tier to your front end and not for presentation.
With the advent of .NET 3.5 you can choose to implement some basic, more coarse-grained DTOs and replace your view model with an anonymous type which you can dynamically create off your DTOs. I found this to be a very flexible solution.
Regarding your naming conventions, it is probably useful to group your DTO's into scenarios and put them in a corresponding namespace. For example Solution.AssetManagement.Asset and Solution.AssetReporting.Asset
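A sketch of that layout (the members are invented for illustration):

namespace Solution.AssetManagement
{
    public class Asset { public int Id; public string DisplayName; }
}

namespace Solution.AssetReporting
{
    public class Asset { public int Id; public decimal BookValue; }
}

The class name stays short and the namespace carries the scenario, which sidesteps names like AssetDtoForGridInTheOverviewScreen.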

AOP, DataMappers, and Factories, can they work together?

Platform: C# 2.0
Using: Castle.DynamicProxy2
I have been struggling for about a week now trying to find a good strategy to rewrite my DAL. I tried NHibernate and, unfortunately, it was not a good fit for my project. So, I have come up with this interaction thus far:
I first start with registering my DTO's and my data mappers:
MetaDataMapper.RegisterTable(typeof(User));
MapperLocator.RegisterMapper(typeof(User), typeof(UserMapper));
This maps each DTO as it is registered using custom attributes on the properties of the DTO essentially:
[Column(Name = "UserName")]
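The attribute itself can be as small as this (a guess at its shape, kept C# 2.0-friendly since auto-properties arrived later):

using System;

[AttributeUsage(AttributeTargets.Property)]
public sealed class ColumnAttribute : Attribute
{
    private string name;

    // the database column this property maps to
    public string Name
    {
        get { return name; }
        set { name = value; }
    }
}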
I then have a Mapper that belongs to each DTO, so for this type it would be UserMapper. This data mapper handles calling my ADO.NET wrapper and then mapping the result to the DTO. I am now in the process of enabling deep loading, and subsequently lazy loading, and that is where I am stuck. Basically my User DTO may have an Address object (FK) which requires another mapper to populate that object, but I have to determine to use the AddressMapper at run time.
My problem is handling the types without having to explicitly go through a list of them (not to mention the headache of always having to keep that list updated) each time I need to determine which mapper to return. So my solution was having a MapperLocator class that I register with (as above) and return an IDataMapper interface that all of my data mappers implement. Then I can just cast it to type UserMapper if I am dealing with User objects. This however is not so easy when I am trying to determine the type of Data Mapper to return during run time. Since generics have to know what they are at compile time, using AOP and then passing in the type at run time is not an option without using reflection. I am already doing a fair bit of reflection when I am mapping the DTO's to the table, reading attributes and such. Plus my MapperFactory uses reflection to instantiate the proper data mapper. So I am trying to do this without reflection to keep those expensive calls down as much as possible.
I thought the solution could be found in passing around an interface, but I have yet to be able to implement that idea. So then I thought the solution would possibly be in using delegates, but I have no idea how to implement that idea either. So...frankly...I am lost, can you help, please?
I will suggest a couple of things.
1) Don't prematurely optimize. If you need to use reflection to instantiate your *Mappers, do it with reflection. Look at the headache you're causing yourself by not doing it that way. If you have problems later, then profile them to see if there are faster ways of doing it.
2) My question to you would be: why are you trying to implement your own DAL framework? You say that NHibernate isn't a good fit, but you don't elaborate on that. Have you tried any of the dozens of other ORMs? What are your criteria? Your posted code looks remarkably like Linq2Sql's mapping.
Lightspeed and SubSonic are both great lightweight ORM packages. Linq2Sql is an easy-to-use mapper, and of course there's Microsoft's Entity Framework, which has a whole team at Microsoft dealing with the problems you're describing.
You might save yourself a lot of time, especially maintenance wise, by looking at these rather than implementing it yourself. I would highly recommend any of those that I mentioned.
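3) That said, if you do end up wanting the delegate route you mentioned, here is a sketch that stays inside C# 2.0 (IDataMapper and UserMapper are your types; the MapperFactory delegate is the new piece). Registration happens once at startup, so each lookup afterwards is a dictionary hit with no reflection:

using System;
using System.Collections.Generic;

public delegate IDataMapper MapperFactory();

public static class MapperLocator
{
    private static readonly Dictionary<Type, MapperFactory> factories =
        new Dictionary<Type, MapperFactory>();

    public static void RegisterMapper(Type dtoType, MapperFactory factory)
    {
        factories[dtoType] = factory;
    }

    public static IDataMapper GetMapper(Type dtoType)
    {
        return factories[dtoType](); // no Activator.CreateInstance at map time
    }
}

// at startup:
// MapperLocator.RegisterMapper(typeof(User), delegate { return new UserMapper(); });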
