I have a very large class (500+ properties and nested complex objects) and we are mapping to another class with the same properties i.e. it is a one-to-one mapping.
Please, no comments about why we are doing this (long story: this is a legacy system in the process of being re-architected, and this is a stepping stone to the next stage of refactoring out services) or why we are not using AutoMapper etc. The data mapping is hand-coded in C#.
I could create a test object, map it, and compare the mapped object; however, there are SO many properties to populate that this in itself is a major task, which we hope to avoid.
Any thoughts on whether I could use reflection or serialize/deserialize or some test libraries or maybe use automapper in some way to fill object, map and compare?
We need to ensure that a) all properties are mapped and b) each property is mapped to the correct property (properties on each object are named the same).
I suspect a manual code review is probably the only feasible solution but I'm reaching out...
UPDATE
OK, I'm not sure why people have down-voted this. It is a valid question with some potentially complex technical solutions. Thanks to those of you who have responded with useful suggestions!
Any thoughts on whether I could use reflection or serialize/deserialize or some test libraries or maybe use automapper in some way to fill object, map and compare?
You could just use a serializer and serialize one object and deserialize the other. Could be a three-to-five-liner if your objects are plain data classes that don't do exotic stuff.
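For instance, here is a minimal sketch of that round-trip idea using Json.NET; Source and Dest are hypothetical stand-ins for the real 500-property classes:

using Newtonsoft.Json;

public class Source { public int Id { get; set; } public string Name { get; set; } }
public class Dest { public int Id { get; set; } public string Name { get; set; } }

public static class MappingVerifier
{
    // "Map" by serializing one type and deserializing as the other;
    // identical property names make this a straight copy.
    public static Dest MapViaSerializer(Source source)
    {
        return JsonConvert.DeserializeObject<Dest>(JsonConvert.SerializeObject(source));
    }

    // Compare your hand-coded mapping against the round-trip result:
    // any missed or crossed-over property shows up as a string diff.
    public static bool Verify(Source source, Dest handMapped)
    {
        return JsonConvert.SerializeObject(handMapped)
            == JsonConvert.SerializeObject(MapViaSerializer(source));
    }
}

You would still need to populate one test instance of the source, but the property-by-property comparison is no longer hand-written.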
Related
I have to apply the [Serializable()] attribute to all my classes, but I want to know: is there any way to make classes serializable globally, instead of applying this attribute individually to every class?
No, there isn't a way of applying this globally - you'd have to visit each type and add the attribute.
However: applying this globally is a really, really bad idea. Knowing exactly what you're serializing, when, and why is really important - whether this is for session-state, primary persistence, cache, or any other use-case. Statements like
I have to apply [Serializable()] attribute for all classes
tell me that you are not currently in control of what you are storing.
Additionally, since [Serializable] (usually) maps to BinaryFormatter, it is important to know that there are a lot of ways (when using BinaryFormatter) in which it is possible to accidentally drag unexpected parts of your model into the serialized data. The most notorious of these is "events", but there are others.
When I see this type of question, what I envisage is that you're using types from your main data model as the thing you put into session state, but frankly this is a mistake, and it leads to questions like this. Instead, the far more manageable approach is to create a separate model that exists purely for this purpose (a sketch follows the list below):
- it only has the data that you need to have available in session
- it is marked [Serializable] if your provider needs that - or whatever other metadata is needed for the sole purpose for which it exists
- it does not have any events
- it doesn't involve any tooling like ORM contexts, database connections etc
- ideally it is immutable (to avoid confusion over what happens if you make changes locally, which can otherwise sometimes behave differently for in-memory vs persisted storage)
- just plain simple basic objects - very easy to reason about
- it can be iterated separately from your main domain objects, so you don't get any unexpected breaks because you changed something innocent-looking in your domain model and it broke the serializer
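For illustration, here is a minimal sketch of such a pair of types. The Customer and CustomerSessionModel names are hypothetical, and the event on the domain type is there only to show the kind of member you want to keep out of the serialized model:

// Domain type: may carry events, ORM state, lazy-loaded members, etc.
public class Customer
{
    public int Id { get; set; }
    public string Name { get; set; }
    public event EventHandler Changed; // the sort of member that drags unexpected object graphs into BinaryFormatter output
}

// Session type: immutable, eventless, and it exists purely to be serialized.
[Serializable]
public sealed class CustomerSessionModel
{
    private readonly int id;
    private readonly string name;

    public CustomerSessionModel(int id, string name)
    {
        this.id = id;
        this.name = name;
    }

    public int Id { get { return id; } }
    public string Name { get { return name; } }

    // The only coupling to the domain model: a one-way copy.
    public static CustomerSessionModel From(Customer c)
    {
        return new CustomerSessionModel(c.Id, c.Name);
    }
}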
I am considering using NoSQL databases such as MongoDB, RavenDB, or any others you would recommend.
Can someone give me some advice, tutorials, and useful links regarding the following question?
The system I want to write must be dynamic, i.e. the model may change a lot and should not be hard-coded in C#.
For example, if I had a saved JSON document holding ID, NAME, FULL NAME and then added a property called PHONENUMBER, I would not want to rebuild the C# code or redeploy.
Is it possible to build C# models from dynamic JSON and then manipulate them?
If so, which libraries are most recommended for this type of system? Which work best with .NET?
This question is a first step toward starting my university project.
Thanks for any help and advice.
Yes, you can do that quite easily with RavenDB.
You can do it in one of two ways.
Either you will use a fully dynamic model, utilizing the C# dynamic keyword. That will let you do pretty much whatever you want, including adding properties at runtime, querying on runtime properties, etc.
However, a more common setup is that you'll use a lot of common properties (a customer has to have a name, for example). So you'll have a model that looks something like this:
public class Customer
{
    public string Id { get; set; }
    public string Name { get; set; }
    public dynamic Props { get; set; }
}
The fixed properties are coded in C#, which helps you get into an easier, more consistent model and work with all the usual compiled tooling.
The dynamic stuff is in the Props property (which is usually initialized to ExpandoObject).
Note that you cannot do LINQ queries using dynamic. This is a limitation of C#, not RavenDB. You can still query dynamically using RavenDB, but you'll have to use the string-based query API.
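For example, here is a minimal sketch of storing such a document, assuming an already-configured document store (the exact client namespace varies by RavenDB version):

using System.Dynamic;
using Raven.Client.Documents; // older RavenDB clients expose IDocumentStore from Raven.Client

public static class Example
{
    public static void Save(IDocumentStore store)
    {
        dynamic props = new ExpandoObject();
        props.PhoneNumber = "555-0100"; // added at run time; no recompile or redeploy needed

        var customer = new Customer { Id = "customers/1", Name = "Jane", Props = props };

        using (var session = store.OpenSession())
        {
            session.Store(customer);
            session.SaveChanges();
        }
    }
}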
I implemented a Json.NET serializer wrapper that may help you:
https://github.com/welegan/RedisSessionProvider/blob/master/RedisSessionProvider/Serialization/RedisJSONSerializer.cs
I use it in my library, which stores the contents of ASP.NET's Session object inside Redis, a NoSQL option you did not mention. Still, given the typeless nature of JSON, I imagine it will be applicable to your needs regardless of which NoSQL db you choose. The basic steps (sketched in code after the list) are:
Serialize:
- Decide on a delimiter (I probably could have chosen a better one)
- Store the type info (you can cache it for performance gains)
- Store the object data after the delimiter

Deserialize:
- Find the type info up to the delimiter
- Deserialize the Type object
- Pass the Type as well as the object data to the library of your choosing. At the very least, Json.NET and ServiceStack.Json both expose serializers that will do the trick.
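As a rough illustration, here is a hedged sketch of that scheme using Json.NET (the delimiter choice is arbitrary, as noted above):

using System;
using Newtonsoft.Json;

public static class TypedJsonSerializer
{
    private const char Delimiter = '|';

    public static string Serialize(object obj)
    {
        // Type info first, then the object data after the delimiter.
        return obj.GetType().AssemblyQualifiedName + Delimiter + JsonConvert.SerializeObject(obj);
    }

    public static object Deserialize(string payload)
    {
        // Read the type info up to the first delimiter (assembly-qualified names never contain '|')...
        int split = payload.IndexOf(Delimiter);
        Type type = Type.GetType(payload.Substring(0, split));

        // ...then hand the type plus the object data to the JSON library.
        return JsonConvert.DeserializeObject(payload.Substring(split + 1), type);
    }
}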
Edit
It seems I misunderstood part of your question. You want to be able to support adding JSON properties without redeploying your C#, and my example would strip out the extra properties during the serialize step back to the NoSQL db. You can use either a Dictionary<string, string> or an ExpandoObject as ayende or mxmissile suggest, but keep in mind that you will then have very few guarantees about the types of the properties of the objects you get out.
In other words, you can freely add property names, but as soon as you change the type of a property from int to long, your code will break unexpectedly. Depending on your use case, that may or may not matter; just something to keep in mind.
Yes, using a Dictionary. However, I am not sure how those database systems handle dictionaries. Gracefully or not.
No. C# is compiled, so once that is done there is no changing it without changing the source and compiling again. I think you should add some JavaScript for that, as this is a JS strong point.
I want to deserialize an object graph in C#. The objects in the graph will have object and collection properties, and some of the properties may be private, but I do not need to worry about cyclic object references. My intent is to use the deserialized object graph as test data while an application is being built; for this reason the objects need to be deserializable from the XML prior to any serialization. I would like it to be as easy as possible to freely edit the XML to vary the objects that are constructed, and I want the deserialization process not to require nested loops or nested LINQ to XML statements for each tier in the object graph.
I found the DataContractSerializer lacking. It can indeed deserialize to private fields and properties with a private setter, but it appears to be incredibly brittle with regard to the processing of the XML input. All it takes is for an element in the XML to be not quite in the right order and it fails. What's more, the order in which it expects the data to be declared does not necessarily match the order in which the object members are declared in the class, making it impossible to determine what XML will work without having the data in the objects to start with so that you can serialize it and check what it expects.
The XmlSerializer does not appear to be able to deserialize to non-public data of any kind.
Since the purpose is to generate test input data for what might be quite simple applications during development, I'd rather not resort to heavyweight ORM technologies like Entity Framework or NHibernate.
Is there a simple solution?
[Update]
@Chuck Savage
Thanks very much for your reply. I'm responding in this edit due to the comment character limit.
In the technique you suggested, the logic to deserialize each tier of the object hierarchy is maintained in each class, so in a sense you do have nested LINQ to XML, just spread out across the various classes involved. This technique also keeps, in each class, a reference to the XElement from which the object gets its values, so in that sense it isn't so much deserialized as wrapped around the XML. In the scenario I have in mind, I'd ideally like to be deserializing the actual business objects the application will use, so an XML wrapper object like this wouldn't work very well, since it would require a distinctly different implementation for test usage compared to production usage.
What I'm really after is something that can do something akin to what the XmlSerializer does, but which can also deserialize private fields (or at least properties with no setter). The reason is that the XmlSerializer does what it does with minimal impact on the 'normal' production use of the classes involved (and hence no impact on their implementation).
How about something like this: https://stackoverflow.com/a/10158569/353147
You will have to create your own boilerplate code to go back and forth to XML, but with the included extensions that can be minimized (see the sketch at the end of this answer).
Here is another example: https://stackoverflow.com/a/9035905/353147
You can also search my answers on the topic with: user:353147 XElement in the StackOverflow search.
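To give a flavour of that boilerplate, here is a hedged sketch of one way to combine LINQ to XML with reflection so that private fields can be populated; the Widget type and its fields are hypothetical:

using System;
using System.Reflection;
using System.Runtime.Serialization;
using System.Xml.Linq;

public class Widget
{
    private int id;      // private fields: XmlSerializer could not touch these
    private string name;

    public override string ToString() { return id + ": " + name; }
}

public static class XmlLoader
{
    public static T Load<T>(XElement element)
    {
        // Skip constructors entirely, so no public ctor is required.
        T obj = (T)FormatterServices.GetUninitializedObject(typeof(T));

        // Element order doesn't matter: each element is matched to a field by name.
        foreach (XElement child in element.Elements())
        {
            FieldInfo field = typeof(T).GetField(child.Name.LocalName,
                BindingFlags.Instance | BindingFlags.NonPublic | BindingFlags.Public);
            if (field != null)
                field.SetValue(obj, Convert.ChangeType(child.Value, field.FieldType));
        }
        return obj;
    }
}

// Usage: var w = XmlLoader.Load<Widget>(XElement.Parse("<Widget><id>1</id><name>A</name></Widget>"));

Nested objects and collections would need recursion on top of this, but the per-class boilerplate largely disappears.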
What is the best solution for mapping a class object to a lightweight class object? For example:
Customer to CustomerDTO: both have the same property names. I was looking for the most optimized solution for mapping between them. I know reflection slows me down badly, and writing methods for every mapping is time-consuming, so any ideas?
Thanks in advance.
AutoMapper. There's also ValueInjecter and Emit Mapper.
If reflection is slowing you down too much, try Fasterflect: http://www.codeproject.com/KB/library/fasterflect_.aspx
If you use the caching mechanism, it is not much slower than hand-written code.
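This is not Fasterflect's API, just a sketch of the same caching idea: reflect once per type pair, compile a delegate with expression trees, and reuse it so the per-call cost approaches hand-written code. It assumes same-named properties with identical types:

using System;
using System.Collections.Generic;
using System.Linq.Expressions;

public static class CachedMapper<TSource, TDest> where TDest : new()
{
    // Built once per closed generic type, then reused for every call.
    private static readonly Func<TSource, TDest> Map = Build();

    private static Func<TSource, TDest> Build()
    {
        ParameterExpression src = Expression.Parameter(typeof(TSource), "src");
        var bindings = new List<MemberBinding>();

        // Bind every writable destination property to the same-named source property.
        foreach (var destProp in typeof(TDest).GetProperties())
        {
            var srcProp = typeof(TSource).GetProperty(destProp.Name);
            if (srcProp != null && srcProp.CanRead && destProp.CanWrite)
                bindings.Add(Expression.Bind(destProp, Expression.Property(src, srcProp)));
        }

        var body = Expression.MemberInit(Expression.New(typeof(TDest)), bindings);
        return Expression.Lambda<Func<TSource, TDest>>(body, src).Compile();
    }

    public static TDest MapFrom(TSource source)
    {
        return Map(source);
    }
}

// Usage: CustomerDTO dto = CachedMapper<Customer, CustomerDTO>.MapFrom(customer);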
I've been playing about with this, and I have the following observations. Should Customer inherit from CustomerDTO, or read/write to a CustomerDTO? I've found that some DTO generators only generate dumb fixed-size array collections for vectors of data items within the DTO, while others will allow you to specify a List<> or some such collection. The high-level collection does not need to appear in the serialized DTO, but it affects which approach you take. If your solution adds high-level collections, then you can inherit; if it doesn't, then you probably want to read/write to an intermediate DTO.
I've used Protocol Buffers and XSDObjectGenerator for my DTO Generation (at different times!).
A new alternative is UltraMapper.
It is faster than anything I tried up to Feb 2017 (2x faster than AutoMapper in every scenario).
It is more reliable than AutoMapper (no StackOverflowExceptions, no depth limits, no self-reference limits).
UltraMapper is just ~1,300 lines of code instead of AutoMapper's 4,500+, and it is easier to understand, maintain, and contribute to.
It is actively developed, but at this moment it needs community review.
Give it a try and leave feedback on the project page!
Platform: C# 2.0
Using: Castle.DynamicProxy2
I have been struggling for about a week now trying to find a good strategy to rewrite my DAL. I tried NHibernate and, unfortunately, it was not a good fit for my project. So, I have come up with this interaction thus far:
I first start with registering my DTO's and my data mappers:
MetaDataMapper.RegisterTable(typeof(User));
MapperLocator.RegisterMapper(typeof(User), typeof(UserMapper));
This maps each DTO as it is registered, using custom attributes on the properties of the DTO, essentially:
[Column(Name = "UserName")]
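For context, here is a hedged sketch of what such an attribute class might look like (the question's actual implementation isn't shown, so this is an assumption):

using System;

[AttributeUsage(AttributeTargets.Property)]
public sealed class ColumnAttribute : Attribute
{
    private string name;

    // Database column name that the decorated property maps to.
    public string Name
    {
        get { return name; }
        set { name = value; }
    }
}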
I then have a Mapper that belongs to each DTO, so for this type it would be UserMapper. This data mapper handles calling my ADO.NET wrapper and then mapping the result to the DTO. However, I am now in the process of enabling deep loading, and subsequently lazy loading, and this is where I am stuck. Basically, my User DTO may have an Address object (FK) which requires another mapper to populate that object, but I have to determine to use the AddressMapper at run time.
My problem is handling the types without having to explicitly go through a list of them (not to mention the headache of always having to keep that list updated) each time I need to determine which mapper to return. So my solution was a MapperLocator class that I register with (as above) and that returns an IDataMapper interface which all of my data mappers implement; I can then just cast it to UserMapper if I am dealing with User objects. This, however, is not so easy when I am trying to determine the type of data mapper to return at run time. Since generics have to know what they are at compile time, using AOP and then passing in the type at run time is not an option without using reflection. I am already doing a fair bit of reflection when I am mapping the DTOs to the table, reading attributes and such, and my MapperFactory uses reflection to instantiate the proper data mapper. So I am trying to do this without reflection, to keep those expensive calls down as much as possible.
I thought the solution could be found in passing around an interface, but I have yet to be able to implement that idea. Then I thought the solution might lie in using delegates, but I have no idea how to implement that either. So... frankly... I am lost. Can you help, please?
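For what it's worth, here is a hedged sketch of the dictionary-based locator the question hints at, reusing its own names (MapperLocator, IDataMapper, UserMapper); the exact member signatures are assumptions. It stays within C# 2.0 and needs no reflection after registration:

using System;
using System.Collections.Generic;

public interface IDataMapper { }

public interface IDataMapper<T> : IDataMapper
{
    T Load(int id);
}

public static class MapperLocator
{
    // One dictionary lookup per request; mappers are registered once at startup.
    private static readonly Dictionary<Type, IDataMapper> Mappers = new Dictionary<Type, IDataMapper>();

    public static void RegisterMapper<T>(IDataMapper<T> mapper)
    {
        Mappers[typeof(T)] = mapper;
    }

    // The cast is safe because RegisterMapper keyed each mapper by its DTO type.
    public static IDataMapper<T> GetMapper<T>()
    {
        return (IDataMapper<T>)Mappers[typeof(T)];
    }
}

// At startup:  MapperLocator.RegisterMapper<User>(new UserMapper());
// At run time: User user = MapperLocator.GetMapper<User>().Load(42);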
I will suggest a couple of things.
1) Don't prematurely optimize. If you need to use reflection to instantiate your *Mappers, do it with reflection. Look at the headache you're causing yourself by not doing it that way. If you have problems later, then profile them to see if there are faster ways of doing it.
2) My question to you would be: why are you trying to implement your own DAL framework? You say that NHibernate isn't a good fit, but you don't elaborate on why. Have you tried any of the dozens of other ORMs? What are your criteria? Your posted code looks remarkably like Linq2Sql's mapping.
Lightspeed and SubSonic are both great lightweight ORM packages. Linq2Sql is an easy-to-use mapper, and of course there's Microsoft's Entity Framework, which has a whole team at Microsoft dealing with the problems you're describing.
You might save yourself a lot of time, especially maintenance-wise, by looking at these rather than implementing your own. I would highly recommend any of those I mentioned.