Architecture for ASP.NET MVC 5 - C#

I am working on a project using ASP.NET MVC 5 along with Entity Framework. I have a few questions about architecting the application in the best possible way. I have an existing database, so I will use the code-first-from-existing-database approach (code or designer?), and I will possibly use stored procedures as well.
Now, if I use the code-first existing-database designer approach, should I have separate models for each business concern or one design (ADO.NET Entity Model)? I have just realized that some of my models will be shared among different business functions; for example, the ASP.NET Identity table "Role" is used by my dashboard controller, where it decides who can use which functions.
Can I mix the code-first existing-database designer and code approaches together?
If I use the code-first existing-database designer approach, can I go in and modify the model?
Should I have one DbContext for reading the database, or several? I ask because code running under one DbContext brings in all the data - do I really need that? Does it affect performance? Security? Or should I have multiple DbContexts, one per business concern?
I know these are very open questions. I am more interested to see how other experts approach architecting a complex application in the best possible way using ASP.NET technologies.
Many Thanks

For small projects, it's probably ok to just use your entity objects directly. However, if you want to perform complex validation and apply business rules, you're much better off creating separate business objects. Entity objects are, by nature, data objects only.
You can use utilities like AutoMapper to make translating between entity and business objects much easier and to eliminate errors and inconsistencies, while maintaining a clear separation between your business objects and your entity objects.
In the long run, this is a more sustainable architecture.
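For illustration, here is a minimal sketch of that translation with AutoMapper; the Product entity, ProductModel business object, and validation rule are hypothetical names, not from the question:

    // EF entity: data only.
    public class Product
    {
        public int Id { get; set; }
        public string Name { get; set; }
        public decimal Price { get; set; }
    }

    // Business object: carries rules and validation.
    public class ProductModel
    {
        public int Id { get; set; }
        public string Name { get; set; }
        public decimal Price { get; set; }

        public bool IsValid()
        {
            return !string.IsNullOrWhiteSpace(Name) && Price >= 0;
        }
    }

    // Configure the mapping once at application startup:
    var config = new AutoMapper.MapperConfiguration(cfg =>
        cfg.CreateMap<Product, ProductModel>().ReverseMap());
    var mapper = config.CreateMapper();

    // Translate at the layer boundary:
    Product entity = LoadProductFromDb();                  // hypothetical data call
    ProductModel model = mapper.Map<ProductModel>(entity);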

Related

Data Layer Architecture with multiple datasources

I am trying to create a system that allows you to switch between multiple data sources, e.g. switching from Entity Framework to Dapper, and I am trying to find the best approach to do this.
At the moment I have different projects for different data layers, e.g. Data.EF for Entity Framework and Data.Dapper for Dapper. I have used a database-first approach, but the models it generates are tightly coupled and not easy to refactor, e.g. to separate the models.
I have a project called Models; this holds domain and view models. I was thinking of creating Data.Core and following the repository pattern. But doing this will add an extra layer, so I would have Presentation / Business / Repository / Data.
I would like to know the best structure for this approach. Should I also do a code-first approach to create my database? This helps separate concerns and improve abstraction. This is quite a big application so getting the structure right is essential.
I'd suggest factoring your data interfaces either to the model through repository interfaces for your entities or to an infrastructure project. (I think the latter was your rationale behind creating a Data.Core project.)
Each data source will then implement the very same set of interfaces, and you can easily switch between them, even dynamically using dependency injection.
For instance, using repositories:
Model
  \_ Entities
       Entity
  \_ Repositories
       IEntityRepository
Data.EF
  EntityRepository : Model.IEntityRepository
Data.Dapper
  EntityRepository : Model.IEntityRepository
Then in your business you won't need to even reference Data.EF or Data.Dapper: you can work with IEntityRepository and have that reference injected dynamically.
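As a rough sketch of what those interfaces might look like in code (the Customer entity, MyDbContext, and the container registration are illustrative assumptions, not part of the layout above):

    // Model project: the entity and its repository interface.
    public class Customer
    {
        public int Id { get; set; }
        public string Name { get; set; }
    }

    public interface ICustomerRepository
    {
        Customer GetById(int id);
        void Add(Customer customer);
    }

    // Data.EF project: one implementation of the same interface.
    public class EfCustomerRepository : ICustomerRepository
    {
        private readonly MyDbContext _context;   // hypothetical DbContext

        public EfCustomerRepository(MyDbContext context) { _context = context; }

        public Customer GetById(int id) { return _context.Customers.Find(id); }
        public void Add(Customer customer) { _context.Customers.Add(customer); }
    }

    // Data.Dapper project would provide a second implementation behind the
    // same interface. The business layer only ever sees ICustomerRepository,
    // so swapping implementations is just a DI registration change, e.g.:
    //   container.Register<ICustomerRepository, EfCustomerRepository>();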
I think your approach is correct. I'd say Presentation / Business / Repository / Data is pretty standard these days.
I'd say the code-first approach using POCOs is the preferred option in the industry today. I would advise starting with a project containing your POCO data structures, with any logic in them, and taking it from there. The advantage of this is that your objects model the domain more naturally. If you start with a db-centric approach, the problem is that, if you are not careful, you may end up with objects more akin to SQL relational tables than to the real domain model. This was painfully evident in the first versions of .NET, where it was encouraged to use DataSets tightly coupled with the db, which often caused problems when working with them in the business layer.
If needed you can do any complex mapping between the business objects and the db objects in the repository layer. You can use a proxy and/or a unit of work if you need to.
I would suggest you create your domain objects, use the code-first approach and also apply the repository pattern
Yes the repository pattern does bring in an extra layer. Have a look at this post for more detail information Difference between Repository and Service Layer?
RE: code-first approach to create my database
It doesn't matter how big your application is; it is a question of what else you intend to use the database for. If this database is simply a repository for this application, then using code-first is fine, as you are simply storing your code objects. However, if you are using this database as an integration point between applications, then you may wish to design the database separately from the application models.

Using EF and Passing Data Through Application Layers

All,
We are using EF as our primary data access technology. Like many apps out there, we have a business objects/domain layer. This layer talks to our repository, which, in turn, talks to EF.
My question is: What is the best mechanism for passing the data back and forth to/from EF? Should we use the EF-generated entity classes (we did DB-first development, so we have entity classes that EF generated), create our own DTOs, use JSON or something else?
Of course, I could make an argument for each of these, as well as a counter-argument against them. I'm looking for opinions based on experience building a non-trivial application using a layered architecture and EF.
Thanks,
John
I would use POCOs and use them with EF. You can still do that with the DB first approach.
The main benefit is that your business objects will not be tied to any data access technology.
Your underlying storage mechanism can, and will, change but your POCOs remain. All that business logic is easily re-used and tested.
As you're looking for cons, I would say it might take longer. However, that cost is well worth it.
With T4 templates I put the actual EF-generated entities in a common project that is referenced by all other projects. I use the EF database-first generated models throughout the entire application (including as view models). If I need to add additional properties to an entity that are not in the database, I just extend the partial class of the entity in the common project. I have written dozens of large n-tier applications using this model and it has worked great.
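A sketch of that partial-class extension (the Order entity and the computed property are hypothetical examples):

    // Generated by the T4 template into the common project - do not edit:
    public partial class Order
    {
        public int Id { get; set; }
        public decimal Subtotal { get; set; }
        public decimal Tax { get; set; }
    }

    // Hand-written file in the same project and namespace:
    public partial class Order
    {
        // Not backed by a database column; exists only for display.
        public decimal DisplayTotal
        {
            get { return Subtotal + Tax; }
        }
    }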

Strategies for replacing legacy data layer with Entity framework and POCO classes

We are using .net C# 4.0, VS 2010, EF 4.1 and legacy code in this project we are working on.
I'm working on a win forms project where I have made the decision to start using Entity Framework 4.1 for accessing an MS SQL db. The code base is quite old, and we have an existing data layer that uses data adapters. These data adapters are used all over the place (in web apps and win form apps). My plan is to replace the old db access code with EF over time and get rid of the tight coupling between the UI layers and the data layer.
So my idea is to more or less combine EF with the legacy data access layer and slowly replace the legacy data layer with a more modern take on things using EF. So for now we need to use both EF and the legacy db access code.
What I have done so far is add a project containing the edmx file and context. The edmx was generated using the database-first approach. I have also added another project that contains the POCO classes (generated by the ADO.NET POCO Entity Generator). I have more or less followed Julia Lerman's approach in her book "Programming Entity Framework" for splitting the model and the generated POCO classes. The database model has been set for years and it's not an option to change the tables, relationships, triggers, stored procedures, etc., so I'm basically stuck with the db model as it is.
I have read about the repository pattern and unit of work and I kind of like the patterns, but I struggle to implement them when I have both EF and the legacy db access code to deal with - especially since I don't have the time to replace all of the legacy db access code with a pure EF implementation. In a perfect world I would start all over again with a fresh take on the data model, but that is not an option here.
Are the repository and unit of work patterns the way to go here? In order to use the POCO classes in my business layer, I sometimes need to use both EF and the legacy db code to populate them. In other words, I can sometimes use EF to retrieve part of the data I need and then use the old db access layer to retrieve the rest, mapping all of it onto my POCO classes. When I want to update some data, I need to pick the data from the POCO classes and use the legacy data access code to store it in the database. So I need to map data retrieved from the legacy data access layer onto my POCO classes when I want to display it in the UI, and vice versa when I want to save it to the database.
To complicate things we store some data in tables that we don't know the name of before runtime (Please don't ask me why:-) ). So in the old db access layer, we had to create sql statements on the fly where we inserted the table and column names based on information from other tables.
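For context, a sketch of the kind of on-the-fly SQL such a layer ends up building; the metadata lookup, table name, and column names here are invented for illustration:

    // The table name cannot be a SqlParameter, so it must be resolved (and
    // validated) against the metadata table before being concatenated in.
    string tableName = LookupTableName(connection, seriesId);   // hypothetical metadata query
    string sql = string.Format(
        "SELECT * FROM [{0}] WHERE [RecordedAt] >= @since", tableName);

    using (var command = new SqlCommand(sql, connection))
    {
        command.Parameters.AddWithValue("@since", DateTime.Today);
        var adapter = new SqlDataAdapter(command);
        var table = new DataTable();
        adapter.Fill(table);   // classic data-adapter style retrieval
    }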
I also find that the relationships between the POCO classes are somewhat too database-centric. In other words, I feel that I need a more simplified domain model to work with. Perhaps I should create a domain model that fits the bill and then use the POCO classes as "DAOs" to populate the domain model classes?
How would you implement this using the Repository pattern and Unit of Work pattern? (if that is the way to go)
Alarm bells are ringing for me! We tried to do something similar a while ago (only with NHibernate, not EF4). We had several problems running ADO.NET alongside an ORM - database concurrency being a big one.
"The database model has been set for years and it's not an option to change the tables, relationships, triggers, stored procedures, etc., so I'm basically stuck with the db model as it is."
Yep. Same thing! The problem was that our stored procs contained a lot of business logic and weren't simple CRUD procs, so keeping the ORM updated with the various updates performed by a stored procedure was not easy at all - Single Responsibility Principle - not a good one to break!
"My plan is to replace the old db access code with EF over time and get rid of the tight coupling between the UI layers and the data layer."
Maybe you could decouple without the need for an ORM - how about putting a service/facade layer in front of your UI layer to coordinate all interactions with the underlying domain and hide it from the UI?
If your database is 'king' and your app is highly data driven I think you will always be fighting an uphill battle implementing the patterns you mention.
Embrace ADO.NET for this project - use EF4 and DDD patterns on your next greenfield proj :)
The EDMX + POCO class generator produces EFv4 code, not EFv4.1 code, but you don't have to bother with these details. EFv4.1 just offers a different API which does exactly the same thing (it is only a wrapper around the EFv4 API).
Depending on how you use datasets, you can run into some very hard problems. Datasets are a representation of the change set pattern: they know what changes were made to the data and are able to store just those changes. EF entities know this only if they are attached to the context which loaded them from the database. Once you work with detached entities, you must make a big effort to tell EF what has changed - especially when modifying relations (detached entities are a common scenario in web applications and web services). For those purposes EF offers another template called self-tracking entities, but they have other problems and limitations (for example, no lazy loading; you cannot apply changes when an entity with the same key is already attached to the context; etc.).
EF also doesn't support several features used in datasets - for example, unique keys and batch updates. It's funny that newer MS APIs usually solve some pains of previous APIs but at the same time provide far fewer features than those previous APIs, which introduces new pains.
Another problem can be performance - EF is slower than direct data access with datasets and has higher memory consumption (and yes, there are some memory leaks reported).
You can forget about using EF to access tables which you don't know at design time. EF doesn't allow any dynamic behavior: table names and the type of database server are fixed in the mapping. Other problems can come from the way you use triggers - ORM tools don't like triggers, and EF has limited support for database-computed values (a value can be filled either in the database or in the application, but not both).
Filling POCOs from EF + datasets sounds like something that will not be possible using EF alone. EF supports certain mapping patterns, but the possibilities for mapping several tables to a single POCO class are extremely limited and constrained (if you want those tables to be editable). If you mean just loading one entity from EF and another from a data adapter and making a reference between them, you should be OK - in this scenario the repository sounds like a reasonable pattern, because the purpose of the repository is exactly this: to load or persist data. Unit of work can also be useful, because you will most probably want to reuse a single database connection between EF and the data adapters, to avoid a distributed transaction when saving changes. The UoW will be the place responsible for handling this connection.
EF mapping is tied to the database design - you can introduce some object-oriented modifications, but EF is still closely dependent on the database. If you want to use a more advanced domain model, you will probably need separate domain classes filled from EF and the datasets. Again, it will be the responsibility of the repository to hide these details.
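A minimal sketch of such a unit of work sharing one open connection between EF and the legacy data adapters; MyDbContext is assumed to expose the DbContext (DbConnection, bool contextOwnsConnection) constructor pass-through, and the class name is invented:

    using System;
    using System.Data.SqlClient;

    // Shares a single open SqlConnection between the EF context and the
    // legacy data adapters, so saves stay in one local transaction instead
    // of escalating to a distributed one.
    public class LegacyBridgeUnitOfWork : IDisposable
    {
        private readonly SqlConnection _connection;

        public MyDbContext Context { get; private set; }

        public LegacyBridgeUnitOfWork(string connectionString)
        {
            _connection = new SqlConnection(connectionString);
            _connection.Open();
            // contextOwnsConnection: false - the UoW controls the lifetime.
            Context = new MyDbContext(_connection, contextOwnsConnection: false);
        }

        // Legacy code asks the unit of work for its adapters so they reuse
        // the same connection as the EF context.
        public SqlDataAdapter CreateAdapter(string selectSql)
        {
            return new SqlDataAdapter(selectSql, _connection);
        }

        public void Dispose()
        {
            Context.Dispose();
            _connection.Dispose();
        }
    }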
From what we have implemented so far, I have learned the following things.
POCO and self-tracking objects are difficult to deal with: if you don't have a solid understanding of what goes on inside them, you will hit a number of unexpected behaviors, even in code that worked well in your previous projects.
Changing patterns is not easy. So far we had been managing simple CRUD without the unit of work and identity map patterns; a lot of the legacy code we wrote in the past does not account for these new patterns, and its logic will not work correctly.
In our previous code, we were simply using transactions and single insert/update/delete statements sent directly to the database, assuming that transactions on the server side would take care of all operations.
Under those conditions, we were dealing directly with IDs all the time; newly generated IDs were immediately available after a single insert statement. However, this is not the case with EF.
In EF we are not dealing with IDs, we are dealing with navigation properties, which is a huge change from earlier ADO.NET programming methods.
From our experience we found that merely swapping EF in for the earlier data access code will result in chaos. But EF + RIA Services offer a completely new solution where you will probably get everything you need, and your UI will bind to it very easily. So if you are thinking about a complete rewrite using UI + RIA Services + EF, it is worth it, because a lot of the dependency on query management disappears automatically. You will be focusing only on business logic. But this is a big decision, and the number of man-hours required for a complete rewrite versus just swapping in EF is almost the same.
So we went the UI + RIA Services + EF way, and we started replacing one module at a time. Mostly EF will easily co-exist with your existing infrastructure, so there is no harm.

nHibernate, an n-Tier solution + request for advice

I'm getting the chance to develop a mildly complex project and have been investigating the various approaches I could use to tackle it. Typically I would have run with the traditional 3-tier approach, but after spending some time looking around at the options, I've got an inkling that some kind of ORM might be a better fit, and I'm considering nHibernate. However, I'm looking for some guidance on implementing nHibernate, and more specifically on how I would structure my BL and DAL in conjunction with it.
With nHibernate I would create my objects (or DTOs?) and use nHibernate methods for my CRUD interactions, all in my DAL. But what I can't get my head around is that the objects defined in the DAL would probably be better situated within the BL, i.e. where validation and other stuff can be performed easily, and I would just use the DAL from the various ObjectFactories / ObjectRepositories. Unfortunately, it seems the many articles I've read either don't mention this or skirt over it, and I'm a tad confused.
What is the more accepted or easier method of implementation when using nHibernate in a 3 Tier system? Alternatively, what is the conventional method of exposing objects through the business layer from the data layer to the presentation?
My personal experience with nHibernate has led me to conclude that the data access layer becomes so thin that it has never made sense to me to separate it from the business logic. Much of your data access code is already separated into XML files (or various other distinctive methods like Fluent nHibernate), and since joins are handled almost transparently, your queries using criteria objects are rarely more than a few lines.
I suspect you're overthinking this. nHibernate is a pretty simple tool: what it does, at its core, is manage the serialization of records in your database to and from similarly structured objects in your data model. That's basically it. Nothing says you can't encapsulate your Hibernate objects in business layer objects for validation; that's perfectly fine. But understand that the operations of validation and serialization are fundamentally different; Hibernate manages the serialization component, and does it quite nicely. You can consider the Hibernate-serializable objects as effectively "atomic".
Basically, what you want is this: nHibernate IS your Data Access Layer. (You can, of course, have other methods of Data Access in your Data Access Layer, but if you're going to use Hibernate, you should keep to the basic Hibernate data design, i.e. simple objects that perform a relatively straightforward mapping of record to object.) If your design requires that you use a different design (deeply composited objects dependent upon multiple overlapping tables) that doesn't map well into Hibernate, you might have to abandon using Hibernate; otherwise, just go with a simple POCO approach as implied by nHibernate.
I'm a fan of letting the architecture emerge, but this is what my starting architecture would look like on a typical n-tier ASP.NET MVC project if I were starting it today using NHibernate.
First off, I would try to keep as much domain code out of the controller as possible. Therefore, I would create a service layer / facade over the business layer that the controller (or code-behind) makes calls to. I would split my objects into two types: 1) objects with business behavior that are used on the write side, and 2) ViewModel / DTO objects that are used for displaying data and taking the initial data entry. These DTOs would carry all of the view-specific concerns, like simple validation attributes, etc. The DTOs could have their own NHibernate mappings, or they could be projected using NHibernate's AliasToBean feature. They would be mapped to business objects once they get past the controller in operations.
As far as the data access layer goes, I would probably use NHibernate directly in the service layer. I would not use the repository pattern unless I knew I had to be able to swap out the ORM. NHibernate is already a persistence abstraction; putting a repository over it makes you give up a lot of features.
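To make the AliasToBean projection concrete, here is a small sketch; the Product entity, ProductDto, and the open ISession variable are assumed names:

    using System.Collections.Generic;
    using NHibernate;
    using NHibernate.Criterion;
    using NHibernate.Transform;

    public class ProductDto
    {
        public int Id { get; set; }
        public string Name { get; set; }
    }

    // In the service layer, with an open ISession, project straight to the
    // DTO instead of loading the full entity:
    IList<ProductDto> dtos = session.CreateCriteria<Product>()
        .SetProjection(Projections.ProjectionList()
            .Add(Projections.Property("Id"), "Id")
            .Add(Projections.Property("Name"), "Name"))
        .SetResultTransformer(Transformers.AliasToBean<ProductDto>())
        .List<ProductDto>();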
My 2 cents (since there is no one-answer-fits-all):
The DAL should ONLY be responsible for data retrieval and persistence.
You can have a library containing your Model (objects) which are filled by the DAL but loosely coupled to it (in theory, you should be able to write a new DAL using another technology and plug it in instead of the NHibernate one, even if you are not going to).
For client <-> BL talk, I would seriously advise views/DTOs to avoid coupling the model to the client (trust me, I'm cleaning up an application like this and it's hell).
Anyway, I'm talking about the setup we use here, which follows this structure:
client (winforms and web) <-> View/Presenter <-> WCF services using messages <-> BL <-> DAL
I have NHibernate-based apps in production, and while it's better than most DALs, I'm at the point where I could no longer recommend that anyone use NHibernate. The sophistication required to work with the session in any advanced application is just absurd. For simple apps NHibernate is trivial; for enterprise applications the complexity is off the charts.
At this point I've made the decision to solve data access with 3 different choices depending on scope: a document database (specifically Raven, currently) for full-scale applications, LinqToSql for medium amounts of data access, and raw ADO.NET connections for trivial access - with great success.
For reference, these statements come after I've spent 2+ years of development time using NHibernate, and every time I felt like I understood it fully, I would run into some new limitation or giant monkey wrench I had to deal with. It also led me to realize I had started designing applications around NHibernate, which defeats one of my biggest reasons for using an ORM in the first place: not having my application's design dictated by the database.
Not having to deal with session management of NHibernate's complexity has been one of the largest boons of moving to RavenDB. With Raven you have very little need to manage the session, except when you're doing extreme performance optimization or working with batch actions.
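For comparison, a bare-bones Raven session sketch (the documentStore setup is omitted and the Product document class is illustrative):

    using Raven.Client;

    // One session per unit of work; no entity-attachment ceremony.
    using (IDocumentSession session = documentStore.OpenSession())
    {
        var product = new Product { Name = "Widget" };
        session.Store(product);   // id assigned by convention
        session.SaveChanges();    // flushes the whole unit of work
    }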

What's the best way to set up data access for an ASP.NET MVC project?

I am starting a new ASP.NET MVC project to learn with, and am wondering what's the optimal way to set up the project(s) to connect to a SQL Server for the data. For example, let's pretend we have a Product table and a Product object I want to use to populate data in my view.
I know somewhere in here I should have an interface that gets implemented, etc but I can't wrap my mind around it today :-(
EDIT: Right now (i.e. in the current, poorly coded version of this app) I am just using plain old SQL Server (2000, even) with only stored procedures for data access, but I would not be averse to adding an extra layer of flexibility for using LINQ to SQL or something.
EDIT #2: One thing I wanted to add: I will be writing this against V1 of the database, and I will need to be able to let our DBA rework the database and give me a V2 later. So it would be nice to only have to change a few small things that are not provided by the database now but will be later, rather than having to rewrite a whole new DAL.
It really depends on which data access technology you're using. If you're using Linq To Sql, you might want to abstract away the data access behind some sort of "repository" interface, such as an IProductRepository. The main appeal for this is that you can change out the specific data access implementation at any time (such as when writing unit tests).
I've tried to cover some of this here:
I would check out Rob Conery's videos on his creation of an MVC store front. The series can be found here: MVC Store Front Series
This series dives into all sorts of design-related subjects as well as coding/testing practices to use with MVC and other projects.
In my site's solution, I have the MVC web application project and a "common" project that contains my POCOs (plain ol' C# objects), business managers and data access layers.
The DAL classes are tied to SQL Server (I didn't abstract them out) and return POCOs to the business managers that I call from my controllers in the MVC project.
I think that Billy McCafferty's S#arp Architecture is quite a nice example of using ASP.NET MVC with a data access layer (using NHibernate by default), dependency injection (Ninject atm, but there are plans to support the CommonServiceLocator) and test-driven development. The framework is still in development, but I consider it quite good and stable. As of the current release, there should be few breaking changes before the final release, so coding against it should be okay.
I have done a few MVC applications and I have found a structure that works very nicely for me. It is based upon Rob Conery's MVC Storefront Series that JPrescottSanders mentioned (although the link he posted is wrong).
So here goes - I usually try to restrict my controllers to view logic only. This includes retrieving data to pass on to the views and mapping the data passed back from the view onto the domain model. The key is to keep business logic out of this layer.
To this end I usually end up with 3 layers in my application. The first is the presentation layer - the controllers. The second is the service layer - this layer is responsible for executing complex queries as well as things like validation. The third layer is the repository layer - this layer is responsible for all access to the database.
So in your products example, this would mean that you would have a ProductRepository with methods such as GetProducts() and SaveProduct(Product product). You would also have a ProductService (which depends on the ProductRepository) with methods such as GetProductsForUser(User user), GetProductsWithCategory(Category category) and SaveProduct(Product product). Things like validation would also happen here. Finally your controller would depend on your service layer for retrieving and storing products.
You can get away with skipping the service layer but you will usually find that your controllers get very fat and tend to do too much. I have tried this architecture quite a few times and it tends to work quite nicely, especially since it supports TDD and automated testing very well.
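A condensed sketch of those three layers, using the method names above; the Product and Category types, the validation detail, and the LINQ filtering are assumptions:

    using System;
    using System.Collections.Generic;
    using System.Linq;

    // Repository layer: all database access goes through here.
    public interface IProductRepository
    {
        IEnumerable<Product> GetProducts();
        void SaveProduct(Product product);
    }

    // Service layer: complex queries and validation.
    public class ProductService
    {
        private readonly IProductRepository _repository;

        public ProductService(IProductRepository repository)
        {
            _repository = repository;
        }

        public IEnumerable<Product> GetProductsWithCategory(Category category)
        {
            return _repository.GetProducts()
                              .Where(p => p.Category == category);
        }

        public void SaveProduct(Product product)
        {
            // Validation lives here, not in the controller.
            if (string.IsNullOrWhiteSpace(product.Name))
                throw new ArgumentException("A product needs a name.");
            _repository.SaveProduct(product);
        }
    }

    // The controller (presentation layer) depends only on ProductService
    // and stays thin.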
For our application I plan on using LINQ to Entities, but as it's new to me there is the possibility that I will want to replace it in the future if it doesn't perform as I would like, and use something else like LINQ to SQL or NHibernate. So I'll be abstracting the data access objects into an abstract factory so that the implementation is hidden from the application.
How you do it is up to you; as long as you choose a proven and well-known design pattern for the implementation, I think your final product will be well supported and robust.
Use LINQ. Create a LINQ to SQL file and drag and drop all the tables and views you need. Then when you call your model all of your CRUD level stuff is created for you automagically.
LINQ is the best thing I have seen in a long long time. Here are some simple samples for grabbing data from Scott Gu's blog.
LINQ Tutorial
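A tiny usage sketch once the .dbml designer has generated the context; NorthwindDataContext and its Products table are assumptions for illustration:

    using System;
    using System.Linq;

    // Query the generated DataContext; the designer created the Products
    // mapping and the Product class from the drag-and-drop.
    using (var db = new NorthwindDataContext())
    {
        var cheapProducts = from p in db.Products
                            where p.UnitPrice < 10m
                            orderby p.ProductName
                            select p;

        foreach (var p in cheapProducts)
            Console.WriteLine("{0}: {1:C}", p.ProductName, p.UnitPrice);
    }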
I just did my first MVC project and I used a service-repository design pattern. There is a good bit of information about it on the net right now. It made my transition from LINQ to SQL to Entity Framework effortless. If you think you're going to be changing things a lot, put in the little extra effort to use interfaces.
I recommend Entity Framework for your DAL/Repository.
Check out the Code Camp Server for a good reference application that does this very thing, and as @haacked stated, abstract that goo away to keep them separated.
I think you need an ORM - for example, Entity Framework (code first).
You can create classes for your model, use those models for your logic and views, and map them to db (V1).
When the DBA gives you the new db (V2), you only need to change the mapping config, provided V1 and V2 are both relational databases (SQL Server, MySQL, Oracle...). If db (V1) is relational and db (V2) is NoSQL (Mongo, Redis, Couchbase...), that won't work, and you may need to do some find-and-replace.
