I have a web application and a service layer running in different places, and both have their own business entities; that is, both have their own classes to represent an employee, an order, etc. (e.g. Emp in the service layer and Employee in the web app). When the web application invokes the service layer to get a list of employees, I want to transform the list returned by the service into a list of the web application's employee type.
I'm looking for a way to do this easily. Any ideas would be great. By the way, I'm using ASP.NET and WCF.
Use AutoMapper.
AutoMapper is a simple little library built to solve a deceptively
complex problem - getting rid of code that mapped one object to
another. This type of code is rather dreary and boring to write, so
why not invent a tool to do it for us?
General Features
Flattening
Projection
Configuration Validation
Lists and Arrays
Nested Mappings
Custom Type Converters
Custom Value Resolvers
Custom Value Formatters
Null Substitution
Here's a sample from Elegant Code:
Before AutoMapper:
var meeting = _repository.GetMeetingById(meetingId);
var dto = new MeetingDto();
dto.Begins = meeting.Begins;
dto.End = meeting.End;
dto.Attendees = meeting.Attendees;
dto.AttendeesCount = meeting.Attendees.Count;
//do something meaningful
and using AutoMapper:
var meeting = _repository.GetMeetingById(meetingId);
var dto = Mapper.Map<Meeting, MeetingDto>(meeting);
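For the question's Emp/Employee scenario, the same pattern would look something like this (Emp, Employee, the property names, and the `_employeeService` call are placeholder assumptions; the static Mapper API matches the sample above):

```csharp
// Placeholder types standing in for the service-layer and web-app entities.
public class Emp      { public int Id { get; set; } public string Name { get; set; } }
public class Employee { public int Id { get; set; } public string Name { get; set; } }

// One-time configuration, e.g. in Application_Start:
Mapper.CreateMap<Emp, Employee>();

// At the call site, map the whole list returned by the WCF service in one go:
List<Emp> serviceEmployees = _employeeService.GetEmployees(); // hypothetical service call
List<Employee> webEmployees = Mapper.Map<List<Emp>, List<Employee>>(serviceEmployees);
```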
You could use AutoMapper:
https://github.com/AutoMapper/AutoMapper
It helps you map one type to another. Your input objects (WCF) will be transformed into objects of another type (web application). AutoMapper is able, for the most part, to figure this out automatically, so little configuration is needed.
To map two types:
Mapper.CreateMap<WcfEmployee, WebAppEmployee>();
To convert one type to another:
WebAppEmployee employee = Mapper.Map<WcfEmployee, WebAppEmployee>(wcfEmployee);
For the most part AutoMapper uses a name-based convention to map two types, but IIRC you can certainly tweak this. To do so, you need to inform AutoMapper of your convention rules, in other words, the rules for how it should map your types.
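For example, if a property name differs between the two types, the convention can be overridden per member; a sketch with assumed property names:

```csharp
Mapper.CreateMap<WcfEmployee, WebAppEmployee>()
    // Hypothetical mismatch: the service calls it "EmpName", the web app "Name".
    .ForMember(dest => dest.Name, opt => opt.MapFrom(src => src.EmpName))
    // Skip members the service does not supply at all.
    .ForMember(dest => dest.LastViewed, opt => opt.Ignore());
```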
Personally, I wouldn't recommend doing this in the simplest possible way but rather in a very conscious way. Only map the things from the service that the application actually needs, and only expose what is absolutely necessary to expose. In other words, keep as much of the data the service exposes (preferably all of it) internal to the app object.
Usually data from a service is used as a basis for functionality. Expose the functionality instead of the data. That will make it possible for you to change the data structure completely (as long as it supports the same mental model/functional requirements) without having to rewrite anything based on the application-side object. You would, of course, still need to rewrite the application-side class itself.
If the class properties share the same names and types, the simplest way to do this is via the JsonSerializer:
using System.Text.Json;
public class MappingService
{
/// <summary>
/// Converts model from class F to class T
/// </summary>
/// <typeparam name="T">To Class</typeparam>
/// <typeparam name="F">From Class</typeparam>
/// <returns>model of type class T</returns>
public T Map<F, T>(F from)
{
var json = JsonSerializer.Serialize(from);
var to = JsonSerializer.Deserialize<T>(json);
return to;
}
}
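Usage would then look something like this (Emp and Employee are placeholder types; note that properties that exist only on the target simply keep their defaults, and mismatched names are silently dropped):

```csharp
var mapper = new MappingService();
var emp = new Emp { Id = 1, Name = "Alice" };

// Serializes Emp to JSON, then deserializes that JSON as Employee.
Employee employee = mapper.Map<Emp, Employee>(emp);
```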
receivedEmployeesArray.Select(x => new MyWinFormsEmployeeType(x)).ToList(); // if the initialization happens in the constructor
receivedEmployeesArray.Select(x => new MyWinFormsEmployeeType() {
Name = x.Name,
Position = x.Position
}).ToList(); // transferring property to property
Or the most convenient way: use AutoMapper.
I'm trying to figure out if there is any way to create a RedisClient that has the functionality of a RedisTypedClient but is able to define the URN key with a simple string instead of passing in a type. For example, at the moment we're creating the IRedisTypedClient like so:
var redisProjectClient = _redisClientsManager.GetClient().As<Project>()
We can then store/retrieve related entities based on types that we know about:
var files = redisProjectClient.GetRelatedEntities<File>(projectId);
However, we want to manage simple JSON payloads of objects from external services (fetched via HTTP API endpoints). These payload objects will be types that we won't know about in C#, but they will provide schemas/type names that we want to use to manage the relationships the way the Redis typed client does. I can't see how this is currently possible without manually managing all of the extra bookkeeping that makes the typed clients so good:
Id listing for entities of that type
The ability to retrieve/update entities related to an object
GetNextSequence for next Id
These aren't available in a flat IRedisClient so I want to do something like this:
var file = "{...}"; // JSON object provided by external service
// We will know it's a "Project" type with a "projectID" from the JSON payload data:
var redisProjectClient = _redisClientsManager.GetClient().As("Project");
redisProjectClient.StoreRelatedEntities("File", projectId, file);
// ...
var files = redisProjectClient.GetRelatedEntities("File", projectId);
Any idea if this is possible or how to create this type of client?
RedisTypedClient - Can you use strings to define the type?
No, the typed clients only work with types, not unknown data structures; you'll need to store any arbitrary JSON payloads using the string IRedisClient.
To maintain the relationships, you can store them in Redis sets, as described in this previous answer. That is what the StoreRelatedEntities() API does behind the scenes: it stores the entities with StoreAll() as normal, but also maintains the related entities in an index set, using the urn (e.g. urn:File:x) as the identifier each entity is stored against.
GetRelatedEntities() then reads all ids maintained in the "relationship set index" and uses GetValues() to fetch all entities by urns.
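A rough sketch of what that bookkeeping could look like by hand with the string client (the key shapes and variable names are made up for illustration; SetValue, AddItemToSet, GetAllItemsFromSet and GetValues are the general-purpose IRedisClient APIs, not the typed client's internals):

```csharp
using (var redis = _redisClientsManager.GetClient())
{
    // Store the raw JSON payload under a urn built from the schema name and id.
    var fileUrn = "urn:File:" + fileId;
    redis.SetValue(fileUrn, fileJson);

    // Maintain the relationship in an index set, mirroring StoreRelatedEntities().
    var indexSet = "urn:Project:" + projectId + ":File";
    redis.AddItemToSet(indexSet, fileUrn);

    // Later, mirror GetRelatedEntities(): read the index set, then fetch the payloads.
    var urns = redis.GetAllItemsFromSet(indexSet);
    List<string> files = redis.GetValues(urns.ToList());
}
```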
How do you create an AutoMapper configuration when the source and destination classes are totally different? I want to create a mapping between some external classes (which cannot be changed) and my own classes, which I'm going to persist in the DB. I could persist the entire external class, but I don't want to, to save space. I'm using .NET Core 2.0.
For example, I have an external class like below:
A
{
B {
b1;b2;b3;
}
C {
c1;c2;c3;
}
}
The above needs to be mapped to my class, defined like below:
A
{
Optmized_BC{
b1;
b2;
c1;
}
c2;
}
What's the best way to create the AutoMapper configuration in the above case? Should I call CreateMap for every pair of source/destination members? Is there a way to map all the variables inside one CreateMap call (using some clever LINQ, maybe)?
You could persist your data as JSON in the database; then it would be easy to deserialize it to the other class using the Newtonsoft JSON library. You have to decide whether that's easier than writing a mapper function for each case. The same structure/naming would be deserialized automatically; otherwise, you could use dynamic.
Just to give you an idea:
var result = JsonConvert.DeserializeObject<dynamic>(json);
A a = new A();
a.Optmized_BC.b1 = result.B.ToObject<B>().b1;
I would suggest using an explicit mapper and covering it with unit tests.
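If you stay with AutoMapper instead, a single CreateMap per destination type with explicit ForMember calls is usually enough. A sketch against the pseudo-classes above (the destination is named MyA here to avoid the name clash with the external A; the concrete shapes are assumed):

```csharp
// Assumed concrete shape for the destination pseudo-class.
public class MyA
{
    public Optmized_BC Optmized_BC { get; set; }
    public int c2 { get; set; }
}

Mapper.Initialize(cfg =>
    cfg.CreateMap<A, MyA>()
        // Build the flattened Optmized_BC from the nested B and C in one go.
        .ForMember(d => d.Optmized_BC, o => o.MapFrom(s => new Optmized_BC
        {
            b1 = s.B.b1,
            b2 = s.B.b2,
            c1 = s.C.c1
        }))
        .ForMember(d => d.c2, o => o.MapFrom(s => s.C.c2)));
```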
I have a function that returns an entity obtained from a WCF web service. What should I return this entity as? I don't think I can return the original object (from the web service), because that would force the function's caller (in another assembly) to have a service reference to this web service (since the class is defined in the service reference), and I want to avoid that. And I don't think I can use an interface either, since I can't modify the WCF entity to implement my interface.
On the other hand, I need to return precisely the properties that the original entity has, i.e. all the properties need to be there, and no conversion/adjustment is needed for any value or any property name or type.
Is it better to create a new class that duplicates the same properties as the original WCF class? And how should I implement it: is it better to create a new object that copies all the values from the original object, e.g.
return new Foo() { Id = original.Id, Name = original.Name, ... etc. }?
or just wrap it with get/set properties, like:
public int Id
{
get { return _original.Id; }
set { _original.Id = value; }
}
And any idea how to name the new class to avoid ambiguity with the original class name from the WCF reference?
As you have figured out, it is not a good idea to force the client to use the same types as the server. This would unnecessarily expose the server application's architecture to the client. The best option is to use Data Transfer Objects (DTOs).
You may have a DTO for each entity you wish to expose to the client, and the DTO will have properties exposing all the required fields of the entity. There are libraries such as ValueInjecter (valueinjecter.codeplex.com) or AutoMapper, as suggested by @stephenl, to help you copy the values from one object to another.
Place the DTOs in a separate namespace and assembly for the best physical decoupling. You could use YourCompany.YourProduct.Data.Entities as the namespace for the entities and YourCompany.YourProduct.Data.DTO for the DTOs.
Actually, it depends on whether you are the consumer. If you are, reusing the type assembly is OK. However, if you are not in control of the consuming services, it is better to use DTO objects with [DataContract] attributes.
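For the DTO route, a minimal sketch (the Foo name comes from the question's example; the attributes are the standard WCF data contract ones):

```csharp
using System.Runtime.Serialization;

[DataContract]
public class FooDto
{
    [DataMember]
    public int Id { get; set; }

    [DataMember]
    public string Name { get; set; }
}

// Conversion from the service-reference type, as in the question's first option:
// return new FooDto { Id = original.Id, Name = original.Name };
```

The Dto suffix also answers the naming question: it keeps the class unambiguous next to the generated WCF type.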
I'm currently developing an application based on NoSQL (using RavenDB). The core aspect of this application is a tree-like data structure with many nodes, subnodes, and so on.
Currently, each node or subnode is represented by a C# object. A parent-child relationship is made with a collection of subnodes on the parent node, a forward-only relationship.
The whole thing is handled by ad hoc forms in an MVC application, with proper GETs and POSTs for each data type. The whole graph is stored as JSON in RavenDB.
Now the goal is to modify the UI part using KnockoutJS. Since KO works with JSON data structures as well, I was wondering if there is a way to make the RavenDB JSON structure "Knockout compatible", meaning I can use it directly without having to build a KO-specific structure (to implement observables, etc.) and then create a mapping between the two.
A sample of the object graph:
public class NodeA
{
public string Name {get;set;}
public List<SubNode> Childs {get;set;}
}
public class SubNode
{
public string Name {get;set;}
public bool SomeBool {get;set;}
}
public class NodeB
{
public string Name {get;set;}
public int SomeInt {get;set;}
}
public class GraphToStore
{
public List<NodeA> NodeAList {get;set;}
public List<NodeB> NodeBList {get;set;}
}
The read/write part would still be handled server side, with AJAX calls after stuff gets updated on the UI. Validation would be server side and returned to the client via AJAX calls too. My problem is, as I said, making the RavenDB JSON work with KnockoutJS; otherwise I have to reconstruct the whole thing and map it, and the graph is huge (50+ classes).
Take a look at the Knockout mapping plugin. It will "automagically" generate a Knockout-compatible view model in one call.
You would do something like
viewModel.myRavenDbData = ko.mapping.fromJSON(jsonData);
var unwrappedData = viewModel.myRavenDbData(); // need only if viewModel.myRavenDbData is an observable
After you have this working, set a breakpoint after the call to the mapping and explore the data structure. Generally, it will look like your data structure, with ko.observables for the actual values. All the nodes needed for navigation will be plain JavaScript objects.
Yes, you can use Knockout's mapping capabilities to create ViewModels directly from the model objects. But I have two points:
1) I think the fact that the objects are stored in RavenDB does not matter. Your MVC application retrieves the objects from RavenDB, so they are deserialized from JSON, and then they are served to your JS page via a REST interface, so they are serialized into JSON again. So you are not working directly with RavenDB's JSON structure; it is a standard CLR object serialized to JSON.
If you wanted to work directly with Raven, you would have to plug your application directly into Raven's interface, and that is not a good idea (although in terms of performance it should work great).
2) I don't think it is a good idea to use your model objects as ViewModels just by applying the Knockout mapping plugin.
Soon you will need to add some logic to the view model: either for computing values to be shown in the view, or for action logic (save/edit, etc.).
For the former, you can define your view models on the server side and use the mapping plugin.
For the latter, you will have to write the view models in JavaScript anyway. I would recommend starting to write the view models directly in JavaScript.
That works best for me.
The current system that I am working on makes use of Castle ActiveRecord to provide ORM (object-relational mapping) between the domain objects and the database. This is all well and good, and most of the time it actually works well!
The problem comes with Castle ActiveRecord's support for asynchronous execution, or more specifically the SessionScope that manages the session that objects belong to. Long story short, bad stuff happens!
We are therefore looking for a way to easily convert (think automagically) from the domain objects (who know that a DB exists and care) to the DTO objects (who know nothing about the DB and care not for sessions, mapping attributes or all things ORM).
Does anyone have suggestions on how to do this? To start with, I am looking for a basic one-to-one mapping of objects: the domain object Person will be mapped to, say, PersonDTO. I do not want to do this manually, since it is a waste of time.
Obviously reflection comes to mind, but I am hoping that, with some of the better IT knowledge floating around this site, something "cooler" will be suggested.
Oh, I am working in C#, and the ORM objects, as said before, are mapped with Castle ActiveRecord.
Example code:
At @ajmastrean's request, I have linked to an example that I have (badly) mocked together. The example has a capture form, a capture form controller, domain objects, an ActiveRecord repository, and an async helper. It is slightly big (3 MB) because I included the ActiveRecord DLLs needed to get it running. You will need to create a database called ActiveRecordAsync on your local machine, or just change the .config file.
Basic details of example:
The Capture Form
The capture form has a reference to the controller:
private CompanyCaptureController MyController { get; set; }
On initialization of the form, it calls MyController.Load():
private void InitForm ()
{
MyController = new CompanyCaptureController(this);
MyController.Load();
}
This will return to a method called LoadCompleted():
public void LoadCompleted (Company loadCompany)
{
_context.Post(delegate
{
CurrentItem = loadCompany;
bindingSource.DataSource = CurrentItem;
bindingSource.ResetCurrentItem();
//TODO: This line will throw the exception, since the session scope used to fetch loadCompany is now gone.
grdEmployees.DataSource = loadCompany.Employees;
}, null);
}
This is where the "bad stuff" occurs, since we are using the child list of Company, which is set to lazy load.
The Controller
The controller has a Load method that is called from the form; it then calls the async helper to asynchronously call the LoadCompany method and return to the capture form's LoadCompleted method.
public void Load ()
{
new AsyncListLoad<Company>().BeginLoad(LoadCompany, Form.LoadCompleted);
}
The LoadCompany() method simply makes use of the repository to find a known company.
public Company LoadCompany()
{
return ActiveRecordRepository<Company>.Find(Setup.company.Identifier);
}
The rest of the example is rather generic: it has two domain classes which inherit from a base class, a setup file to insert some data, and the repository to provide the ActiveRecordMediator abilities.
I solved a problem very similar to this where I copied the data out of a lot of older web service contracts into WCF data contracts. I created a number of methods that had signatures like this:
public static T ChangeType<S, T>(this S source) where T : class, new()
The first time this method (or any of the other overloads) executes for two types, it looks at the properties of each type and decides which ones exist in both, based on name and type. It takes this 'member intersection' and uses the DynamicMethod class to emit the IL to copy the source type to the target type, then it caches the resulting delegate in a thread-safe static dictionary.
Once the delegate is created, it's obscenely fast and I have provided other overloads to pass in a delegate to copy over properties that don't match the intersection criteria:
public static T ChangeType<S, T>(this S source, Action<S, T> additionalOperations) where T : class, new()
... so you could do this for your Person to PersonDTO example:
Person p = new Person( /* set whatever */);
PersonDTO dto = p.ChangeType<Person, PersonDTO>();
And any properties that exist on both Person and PersonDTO (again, with the same name and type) would be copied by a runtime-emitted method, and any subsequent calls would not have to be emitted but would reuse the same emitted code for those types in that order (i.e. copying PersonDTO to Person would incur a separate hit to emit the code).
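As a much-simplified illustration of that member-intersection idea, here is a plain-reflection version (no DynamicMethod emission or delegate caching, so it is far slower than the approach described above; a sketch, not the answer's actual code):

```csharp
using System.Linq;

public static class ObjectMapper
{
    public static T ChangeType<S, T>(this S source) where T : class, new()
    {
        var target = new T();
        foreach (var tp in typeof(T).GetProperties().Where(p => p.CanWrite))
        {
            var sp = typeof(S).GetProperty(tp.Name);
            // Copy only members present on both types with the same name and type.
            if (sp != null && sp.CanRead && sp.PropertyType == tp.PropertyType)
                tp.SetValue(target, sp.GetValue(source, null), null);
        }
        return target;
    }
}
```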
It's too much code to post, but if you are interested I will make the effort to upload a sample to SkyDrive and post the link here.
Richard
Use ValueInjecter; with it you can map anything to anything, e.g.:
object <-> object
object <-> Form/WebForm
DataReader -> object
and it has cool features like flattening and unflattening.
The download contains lots of samples.
You should use AutoMapper, which I've blogged about here:
http://januszstabik.blogspot.com/2010/04/automatically-map-your-heavyweight-orm.html#links
As long as the properties are named the same on both your objects, AutoMapper will handle it.
My apologies for not really putting the details in here, but a basic OO approach would be to make the DTO a member of the ActiveRecord class and have the ActiveRecord class delegate the accessors and mutators to the DTO. You could use code generation or refactoring tools to build the DTO classes pretty quickly from the ActiveRecord classes.
Actually, I am totally confused now.
Because you are saying: "We are therefore looking for a way to easily convert (think automagically) from the Domain objects (who know that a DB exists and care) to the DTO object (who know nothing about the DB and care not for sessions, mapping attributes or all thing ORM)."
Domain objects know and care about the DB? Isn't the whole point of domain objects to contain business logic ONLY and be totally unaware of the DB and ORM? ... You HAVE to have these objects? You just need to FIX them if they contain all that stuff. That's why I am a bit confused about how DTOs come into the picture.
Could you provide more details on what problems you're facing with lazy loading?