Should AutoMapper be used to take data from ViewModel and save back into a database model?
I know the opposite is considered good software practice: using AutoMapper to map database models into ViewModels for front-end web applications.
I was reading this general answer here:
"best way to project ViewModel back into Model"
The answer was not specifically about AutoMapper, but I want to validate whether the opinion is true.
"I believe best way to map from ViewModel to Entity is not to use AutoMapper for this. AutoMapper is a great tool to use for mapping objects without using any other classes other than static. Otherwise, code gets messier and messier with each added service, and at some point you won't be able to track what caused your field update, collection update, etc."
Related
This question is to verify whether the current implementation is the right way to go in terms of best practices and performance. So far, in all my previous companies, I have been using AutoMapper to map relational objects to domain model entities, and domain model entities to DTOs. The ORM tool has been Entity Framework.
In my current company they use Dapper as the ORM tool and do not use AutoMapper, as they say Dapper does the mapping for you internally. The way they have structured the project is to create a separate class library project containing the DTOs and reference the DTOs in the DataAccess and Business layers.
The query results returned by Dapper are internally mapped to the DTOs. These DTOs are returned to the Business layer, and so on.
For example
In the code below, ParticipantFunction is the DTO.
Repository file in DataAccess layer
public List<ParticipantFunction> GetParticipantFunctions(int workflowId)
{
    // Fetch the participant functions for the workflow's department
    string selectSql = @"SELECT [WFPatFunc_ID] AS WFPatFuncID
                               ,[WFFunction]
                               ,[SubIndustryID]
                               ,[DepartmentID]
                         FROM [dbo].[WF_ParticipantsFunctions]
                         WHERE [DepartmentID] = (SELECT TOP 1 [DepartmentID] FROM [dbo].[WF] WHERE [WF_ID] = @workflowId)";

    return _unitOfWork.GetConnection()
        .Query<ParticipantFunction>(selectSql, new { workflowId })
        .ToList();
}
The reason the developers gave me is that AutoMapper would just be overhead and reduce speed, and since Dapper does the mapping internally, there is no need for it.
I would like to know whether the practice they are following is fine and has no issues.
There is no right or wrong here. If the current system works and solves all their requirements, then great: use that! If you have an actual need for something where auto-mapper would be useful, then great: use that!
But: if you don't have a need for the thing that auto-mapper does (and it appears that they do not), then... don't use that?
Perhaps one key question is: what is your ability to refactor the code if your requirements change later? For many people, the answer is "sure, we can change stuff" - so in that case I would say: defer adding an additional layer until you actually have a requirement for one.
If you absolutely will not be able to change the code later, perhaps because of lots of public-facing APIs (software as a product), then it makes sense to de-couple everything now so there is no coupling / dependency in the public API. But most people don't have that. Besides which, Dapper makes no demands on your type model whatsoever other than: it must look kinda like the tables. If it does that, then again: why add an additional layer if you don't need it?
This is more of an architecture problem and there is no good or bad.
Pros of DTOs:
Layering - You are not directly exposing your data objects, so you can use attributes for mapping and other things not needed in your UI. That way you can keep the DTOs in a library that has no dependencies on your data access code. (Note: you could achieve the same with fluent mapping, but this way you can use whichever you like.)
Modification - If your domain model changes, your public interface stays the same. Say you add a property to the model: all the clients you have already built won't start receiving the new field in their JSON for no reason.
Security - This is why Microsoft started pushing DTOs, if I remember correctly. I think it was CodePlex (not 100% sure) that was using EF entities directly to add records to the database. Someone figured this out and simply expanded the JSON with fields he was not allowed to access; for example, if a post has a reference to a user, you could change the role of that user by adding a new post, because of change tracking. There are ways to protect yourself from this, but security should always be opt-out, not opt-in.
I like to use DTOs when I need to expose the business layer through a public interface, for example an API with system operations like api/Users/AllUsers or api/Users/GetFilteredUsers.
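The over-posting risk described under "Security" can be sketched like this (the User and UserCreateDto types are illustrative, not from the original post):

```csharp
// Exposing the entity directly: every property is bindable, including
// Role, so a crafted JSON body can elevate privileges via model binding.
public class User
{
    public int Id { get; set; }
    public string Name { get; set; }
    public string Role { get; set; }   // an attacker can post "Admin" here
}

// Exposing a DTO instead: only the fields the client is allowed to set
// exist, so extra JSON properties are simply ignored by the binder.
public class UserCreateDto
{
    public string Name { get; set; }
}
```

Mapping UserCreateDto to User on the server (by hand or with AutoMapper) then forces Role to be assigned explicitly in trusted code.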
No DTOs
Performance - Generally, not using DTOs runs faster: there is no extra mapping step. Projections help with this, but you can really optimize once you know exactly what you need.
Speed of development and a smaller code base - Sometimes a large architecture is overkill and you just want to get things done, rather than spending most of your time copy-pasting properties.
More flexibility - This is the flip side of the security point: sometimes you want the same API to do more than one thing, for example letting the UI decide what it wants to see from a big object, like select and expand. (Note: this can be done with DTOs, but if you have ever tried to implement expand with DTOs you know how tricky it can get.)
I use the no-DTO approach when I need to expose the data access level to the client, e.g. if you need to use Breeze or JayData and/or OData, with APIs like api/Users and api/Users?filter=(a => a.Value > 5).
I'm trying to map DTOs to entities. When I searched online, I noticed lots of references to AutoMapper and just as much feedback about how it is not a good way to do this.
Also, I couldn't find any recently dated sources; one question complaining about the lack of "new" sources is itself four years old.
One of the sources I found that looked really promising was this:
https://rogerjohansson.blog/2013/12/01/why-mapping-dtos-to-entities-using-automapper-and-entityframework-is-horrible/
but I couldn't get it working either.
So, basically, the situation is like this.
I'm trying to build an integration for orders using WCF (a whole other case).
I have an Order DTO, and the related DTOs are OrderLine, Customer, CustomerAddress and OrderAddress. More will follow later.
Since these essentially mirror database tables, the main "table" is Order. It acts as the header; OrderLine and the others are self-explanatory. I'm sure everyone has come across something like this before.
I created the DTOs according to their counterpart entities.
What I'm told to do is:
a) Convert (or, as the terminology goes, map) these DTOs to entities.
b) Add the entity to the DbContext and call SaveChanges.
So, can anyone point me in a good direction on solving this situation?
We have a project similar to yours. Instead of WCF we use Model classes from MVC, but ultimately it's the same idea: converting one object into another. I cannot disagree more with that advice about AutoMapper. At first we had the same doubts about its efficiency, but we finally decided to give it a try. Then we faced some of the problems the article points out (especially the collections of elements). Luckily, AutoMapper gives you enough flexibility to handle those special mapping conditions.
For collections we use custom mappings, which allow us to detect when we have new elements, elements to update, and elements to remove.
For references, we follow the rules of Entity Framework: set the FK_Id value rather than the real object.
If, for some reason, you need some logic in the mapping based on reference entities, we use a DependencyResolver (only in extreme cases, as we don't like the idea of a DependencyResolver).
I think AutoMapper is easy enough to learn the basics of, so you can map your objects in a matter of minutes. Plus, it gives you all the tools for the special cases.
The article you posted explains how "Entity Framework does not like AutoMapper", but it's really about how you follow the rules of EF and AutoMapper. Entity Framework is a huge ORM and, as such, requires you to follow some rules (very strict rules in some cases). Of course, using AutoMapper with only the basic examples will break some of them, but once you get used to it, it's really easy to follow the rules.
To sum up: AutoMapper saves you a lot of time that you can invest in customizing some configurations. Otherwise, you will have to use LINQ projections, which in most cases will take much more time. For example, the collection problem is solved by detecting adds/edits/deletes based on Ids, which can be handled with AutoMapper through custom mappings.
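The add/edit/delete detection by Id that this answer describes might be sketched like this inside an AutoMapper Profile (OrderDto, Order and OrderLine are illustrative names; the real mappings from the project are not shown in the post):

```csharp
// Reconcile a collection of line DTOs against the tracked entity
// collection, keyed by Id, instead of letting AutoMapper replace
// the whole collection (which confuses EF change tracking).
CreateMap<OrderDto, Order>()
    .ForMember(dest => dest.Lines, opt => opt.Ignore())
    .AfterMap((dto, entity) =>
    {
        // Delete: lines the DTO no longer contains.
        var toRemove = entity.Lines
            .Where(l => dto.Lines.All(dl => dl.Id != l.Id))
            .ToList();
        foreach (var line in toRemove)
            entity.Lines.Remove(line);

        foreach (var dl in dto.Lines)
        {
            var existing = entity.Lines.FirstOrDefault(l => l.Id == dl.Id);
            if (existing == null)
                // Add: a line with an Id EF does not know yet.
                entity.Lines.Add(new OrderLine { Id = dl.Id, Quantity = dl.Quantity });
            else
                // Edit: update the tracked instance in place.
                existing.Quantity = dl.Quantity;
        }
    });
```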
I have a colleague who insists I develop ASP.NET MVC websites using an n-tier data access layer and MVVM.
He has a background in Silverlight and WPF and I have attempted to create a solution but it is causing problems.
I created DAL and logic layers (using generics):
DAL - contains a repository pattern that wraps up Entity Framework.
Logic - a pass-through layer; no generic logic to apply currently.
I am still using the MVC pattern and am passing the Entity Framework model to the View unless it needs additional properties or methods, in which case I create a view model and an interface to map between the two.
The n-tier data access layer has locked the Entity Framework context at the bottom of the stack, and the main problem I have is that entities can't be tracked on more than one context ({"An entity object cannot be referenced by multiple instances of IEntityChangeTracker."}).
I recently ran into this error again despite my attempts to use models, interfaces, and deep copying.
The issue is that this was the approach I was going to take: put a model in the DAL layer and map the data between the Entity Framework entity and the model.
I understand the underlying issue: even though you dispose of a context, it doesn't release the entities that were attached to it. Is there any way to get this n-tier data access approach to work with MVC? Or am I right that this will never work, and should I stick to using the Entity Framework context in the controller method (or an underlying class wired up with dependency injection)?
There's a lot to unpack here.
If you are looking for an MVVM implementation, then you are going to want to look at something like knockout.js, or some other framework that does declarative data binding on the client side. There are a number of articles you can read to understand this.
https://learn.microsoft.com/en-us/aspnet/core/client-side/knockout
But let's start with the assumption that you aren't going down that road and want to stick with MVC.
I would stop passing entities to your views. You are going to cause yourself endless grief. Create a view model for each corresponding view; your use of an interface seems like overkill. You can project LINQ queries from your entities/repos right into the view models, or you could use something like AutoMapper to map from entity to view model. Appending some of your code to the question might help. My view models tend to be pretty flat, and I don't see why you would need to perform deep copies of entity trees out to the view. Those can be fetched later using partial views or some other method.
Now that you aren't sending entities out to the view, when you post information back for updating you are going to have to instantiate a new object or fetch the object you want to update from EF again, make your changes, and save.
To me, the simplest implementation is the best. I would pass your context to the controller (preferably using an IoC container), query the context directly in the controller action, and project directly into the view model. If you've got a lot of shared code between controller actions, maybe move it into a service, then inject the service into the controller and the context into the service. But I'd start with just injecting the context into the controller.
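Projecting directly from the context into a view model, as suggested above, could look roughly like this (the Order entity and OrderViewModel are made-up names for illustration):

```csharp
// Controller action: query the context and project straight into a flat
// view model, so no tracked entity ever reaches the view.
public ActionResult Index()
{
    var model = _context.Orders
        .Where(o => o.IsOpen)
        .Select(o => new OrderViewModel
        {
            Id = o.Id,
            CustomerName = o.Customer.Name,
            LineCount = o.Lines.Count()
        })
        .ToList();

    return View(model);
}
```

Because the Select runs inside the LINQ-to-Entities query, EF translates it to SQL and only the projected columns are fetched.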
First of all, I use EF in the DAL layer (a separate project from MVC, same solution). Are the models generated from EF's EDMX file the actual models of the Model layer? If so, how do I access these models from MVC's View layer? I think it's wrong to access the data layer directly from the view to work with these models, but if I make a Model layer with "my models" and translate the DAL's models into mine... it'll be duplicated code.
I'm probably getting something wrong, but most examples use the code-first approach, and I can't figure this out.
You are right in your idea of not accessing the models in your DAL directly from your presentation layer.
To avoid duplicating code when translating your DAL objects into the models used by your views, you could use something like AutoMapper, which is designed to do the heavy lifting for you in exactly that scenario.
I think it's wrong accessing data layer directly from view to work with these models...
That's right; the appropriate approach is using a view model.
When you have dozens of distinct values to pass to a view, the same flexibility that allows you to quickly add a new entry, or rename an existing one, becomes your worst enemy. You are left on your own to track item names and values; you get no help from Microsoft IntelliSense and compilers.
The only proven way to deal with complexity in software is through appropriate design. So defining an object model for each view helps you track what that view really needs. I suggest you define a view-model class for each view you add to the application.
-- "Programming Microsoft ASP.NET MVC" by Dino Esposito
A ViewModel provides all the information your view requires to render itself. To transfer data between a ViewModel and a business entity, you can use AutoMapper.
Don't worry about duplication; those are two different concepts and should be separated from each other. It makes your application easier to maintain.
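A minimal sketch of such a mapping with AutoMapper's instance-based API (Product and ProductViewModel are illustrative names):

```csharp
// One-time configuration, typically at application startup.
// Properties with matching names are mapped by convention.
var config = new MapperConfiguration(cfg =>
    cfg.CreateMap<Product, ProductViewModel>());
var mapper = config.CreateMapper();

// Per request: hand the view a view model, never the entity itself.
ProductViewModel vm = mapper.Map<ProductViewModel>(productEntity);
```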
I may be mistaken, but to me, generating from the EDMX provides you with the DbContext, which could be considered the DAL, and the entities, which could be considered the Model.
So you might directly manipulate entity instances as your business objects. Manipulation of the database through the DbContext should live in the BLL layer. Additionally, you might implement DTOs where needed.
This, of course, assumes you want to use Entity Framework code generation. Other options, like hand-written POCOs, might be more relevant depending on your overall architecture.
I use a view model in my local project and the models in the other project, then put references to the models I'm going to use on the page into my view model, and reference the view model on my page. Let me know if that sounds like something you want to do and I can edit in some code.
I want to go the route of doing all input validation in a ViewModel.
When the database is accessed, the ViewModel's data must be copied over to the model that is sent to the database.
My concern is that data could be lost because:
AutoMapper is wrongly set up for certain scenarios, so properties are not copied over to the model
Or AutoMapper is just not suited for every scenario, e.g. ViewModels that are too complex
Are my concerns justified, or should I not worry about that?
AutoMapper is totally fine in this scenario; I use it extensively for exactly this purpose. If you are worried about data loss, should you not be making use of data-annotation validation on the model itself, to make sure you have the required data before persisting or calling some other service?
Also, the only real way to make completely sure that you don't miss anything, whether you use AutoMapper or manual mapping code, is a good set of unit tests.
My concern is that data could be lost because: AutoMapper is wrongly set up for certain scenarios, thus properties are not copied over to the model. Or AutoMapper is just not suited for every scenario, maybe too-complex ViewModels.
I think it is unlikely that you will lose data. For complex ViewModels you can choose to set manually any specific properties that are not copied over by AutoMapper. As suggested by @feanz, a good set of unit tests will provide the assurance you need.
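As a starting point for those unit tests, AutoMapper ships a built-in configuration check that fails if any destination property is left unmapped (the profile name here is illustrative, and the test is written for xUnit):

```csharp
[Fact]
public void Mapping_configuration_is_valid()
{
    var config = new MapperConfiguration(cfg =>
        cfg.AddProfile<ViewModelMappingProfile>());

    // Throws AutoMapperConfigurationException if any destination
    // property on a mapped type has no matching source member and
    // no explicit ForMember/Ignore rule.
    config.AssertConfigurationIsValid();
}
```

This catches the "wrongly set up" case at test time; per-scenario round-trip tests are still worthwhile for the custom mappings.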