I'm getting the chance to develop a mildly complex project and have been investigating the various approaches I can use to tackle it. Typically I would have run with the traditional 3-tier approach, but after spending some time looking around at various options I've got an inkling that some kind of ORM might be a better fit, and I'm considering NHibernate. However, I'm looking for some guidance on implementing NHibernate and, more specifically, how I would structure my BL and DAL in conjunction with it.
With NHibernate I would create my objects (or DTOs?) and use NHibernate methods for my CRUD interactions, all in my DAL. But what I can't get my head around is that the objects defined in the DAL would probably be better situated within the BL, i.e. where validation and other stuff can be performed easily, with the DAL used only from the various ObjectFactories / ObjectRepositories. Unfortunately, in the many articles I've read this either isn't mentioned or is skirted over, and I'm a tad confused.
What is the more accepted or easier method of implementation when using NHibernate in a 3-tier system? Alternatively, what is the conventional method of exposing objects from the data layer, through the business layer, to the presentation?
My personal experience with NHibernate has led me to decide that the data access layer becomes so thin it has never made sense to me to separate it from the business logic. Much of your data access code is already separated into XML files (or expressed through other distinct mechanisms like Fluent NHibernate), and since joins are handled almost transparently, your queries using criteria objects are rarely more than a few lines.
I suspect you're overthinking this. NHibernate is a pretty simple tool: what it does is manage the serialization of records in your database to and from similarly structured objects in your data model. That's it. Nothing says you can't encapsulate your NHibernate objects in business layer objects for validation; that's perfectly fine. But understand that the operations of validation and serialization are fundamentally different: NHibernate manages the serialization component, and does it quite nicely. You can consider the NHibernate-serializable objects as effectively "atomic".
Basically, what you want is this: NHibernate IS your data access layer. (You can, of course, have other methods of data access in your DAL, but if you're going to use NHibernate, you should keep to the basic NHibernate data design, i.e. simple objects that perform a relatively straightforward mapping of record to object.) If your design requires something different (deeply composited objects dependent upon multiple overlapping tables) that doesn't map well into NHibernate, you might have to abandon it; otherwise, just go with the simple POCO approach that NHibernate implies.
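For illustration, here is a minimal sketch of that "simple POCO plus straightforward mapping" style, using Fluent NHibernate (the Product class and its table are assumptions, not something from the question):

```csharp
using FluentNHibernate.Mapping;

// A plain object that maps one-to-one onto a table row.
public class Product
{
    public virtual int Id { get; set; }        // NHibernate needs virtual members for lazy-loading proxies
    public virtual string Name { get; set; }
    public virtual decimal Price { get; set; }
}

// The entire "data access layer" for this entity.
public class ProductMap : ClassMap<Product>
{
    public ProductMap()
    {
        Table("Product");                      // straightforward record-to-object mapping
        Id(x => x.Id);
        Map(x => x.Name);
        Map(x => x.Price);
    }
}
```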
I'm a fan of letting the architecture emerge, but this is what my starting architecture would look like on a typical n-tier ASP.NET MVC project if I were starting it today using NHibernate.
First off, I would try to keep as much domain code out of the controller as possible. Therefore, I would create a service layer / facade over the business layer that the controller (or code-behind) makes calls to. I would split my objects into two types: 1) objects with business behavior that are used on the write side, and 2) ViewModel / DTO objects that are used for displaying data and taking the initial data entry. These DTOs would have all of the view-specific concerns like simple validation attributes and so on. The DTOs could have their own NHibernate mappings, or they could be projected using NHibernate's AliasToBean feature. They would be mapped to business objects once they get past the controller in operations.
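As a rough sketch of the AliasToBean projection mentioned above (ProductDto and its properties are hypothetical, and the Product entity is assumed to be mapped already):

```csharp
using System.Collections.Generic;
using NHibernate;
using NHibernate.Criterion;
using NHibernate.Transform;

public class ProductDto
{
    public string Name { get; set; }
    public decimal Price { get; set; }
}

public static class ProductQueries
{
    // Project only the columns the view needs, straight into the DTO.
    public static IList<ProductDto> GetProductList(ISession session)
    {
        return session.CreateCriteria<Product>()
            .SetProjection(Projections.ProjectionList()
                .Add(Projections.Property("Name"), "Name")
                .Add(Projections.Property("Price"), "Price"))
            .SetResultTransformer(Transformers.AliasToBean<ProductDto>())
            .List<ProductDto>();
    }
}
```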
As far as the data access layer goes, I would probably use NHibernate directly in the service layer. I would not use the repository pattern unless I knew I had to be able to swap out the ORM; NHibernate is already a persistence abstraction, and putting a repository over it makes you give up a lot of features.
My $0.02 (since there is no one-answer-fits-all):
The DAL should ONLY be responsible for data retrieval and persistence.
You can have a library containing your model (objects) which are filled by the DAL but are loosely coupled (in theory, you should be able to write a new DAL using another technology and plug it in instead of the NHibernate one, even if you are never going to).
For client <-> BL talk, I would seriously advise views/DTOs to avoid coupling the model to the client (trust me, I'm cleaning up an application like this and it's hell).
Anyway, I'm describing the situation we use here, which follows this structure:
client (winforms and web) <-> View/Presenter <-> WCF Services using messages <-> BL <-> DAL
I have NHibernate-based apps in production, and while it's better than most DALs, I'm at the point where I could never recommend anyone use NHibernate any longer. The sophistication required to work with the session in any advanced application is just absurd. For simple apps NHibernate is trivial; for enterprise applications the complexity is off the charts.
At this point I've made the decision to solve data access with three different choices depending on scope: a document database (specifically Raven, currently) for full-scale applications, LinqToSql for medium amounts of data access, and raw ADO.NET connections for trivial access, with great success.
For reference, these statements come after I've spent 2+ years of development time using NHibernate, and every time I felt I understood NHibernate fully, I would run into some new limitation or giant monkey wrench I had to deal with. It also led me to realize I had started designing applications around NHibernate, when not having my applications' design dictated by the database was one of my number one reasons for using an ORM in the first place.
Not having to deal with the complexity of NHibernate's session management has been one of the largest boons of moving to RavenDB. With Raven you have very little need to manage the session except when you're doing extreme performance optimization or working with batch actions.
Related
I am working on a project using ASP.NET MVC 5 along with Entity Framework. I have a few questions about how best to architect the application. I have an existing database, so I will use the Code First from existing database approach (code or design?), and possibly stored procedures as well.
Now, if I use the Code First from existing database approach, should I have separate models for each business concern or one design (ADO.NET Entity Model)? I have just realized some of my models will be shared among different business functions; for example, the ASP.NET Identity table "Role" is used by my dashboard controller to see who can use which functions.
Can I mix the Code First from existing database design and code approaches together?
If I use the Code First from existing database design approach, can I go in and modify the model?
Should I have one DbContext for reading the database, or separate ones? I ask because code running under one DbContext brings in all the data; do I really need that? Does it affect performance or security? Or should I have multiple DbContexts, one for each business concern?
I know these are very open questions. I am more interested in seeing how other experts approach architecting a complex application in the best possible way using ASP.NET technologies.
Many Thanks
For small projects, it's probably ok to just use your entity objects directly. However, if you want to perform complex validation and apply business rules, you're much better off creating separate business objects. Entity objects are, by nature, data objects only.
You can use utilities like AutoMapper to make translating between entity and business objects much easier, eliminating errors and inconsistencies while maintaining clear separation between your business objects and your entity objects.
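A minimal sketch of that translation with AutoMapper (the ProductEntity / ProductModel pair is hypothetical; AutoMapper matches properties by name):

```csharp
using AutoMapper;

public class ProductEntity   // generated by EF: data only
{
    public int Id { get; set; }
    public string Name { get; set; }
}

public class ProductModel    // business object: rules and validation live here
{
    public int Id { get; set; }
    public string Name { get; set; }
}

public static class MappingExample
{
    public static ProductModel Translate()
    {
        // Configure once at startup, then map in both directions as needed.
        var config = new MapperConfiguration(cfg =>
        {
            cfg.CreateMap<ProductEntity, ProductModel>();
            cfg.CreateMap<ProductModel, ProductEntity>();
        });
        var mapper = config.CreateMapper();

        return mapper.Map<ProductModel>(new ProductEntity { Id = 1, Name = "Widget" });
    }
}
```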
In the long run, this is a more sustainable architecture.
All,
We are using EF as our primary data access technology. Like many apps out there, we have a business objects/domain layer. This layer talks to our repository, which, in turn, talks to EF.
My question is: What is the best mechanism for passing the data back and forth to/from EF? Should we use the EF-generated entity classes (we did DB-first development, so we have entity classes that EF generated), create our own DTOs, use JSON or something else?
Of course, I could make an argument for each of these, as well as a counter-argument against them. I'm looking for opinions based on experience building a non-trivial application using a layered architecture and EF.
Thanks,
John
I would use POCOs and use them with EF. You can still do that with the DB first approach.
The main benefit is that your business objects will not be tied to any data access technology.
Your underlying storage mechanism can, and will, change but your POCOs remain. All that business logic is easily re-used and tested.
As you're looking for cons, I would say it might take longer. However, that cost is well worth it.
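As a minimal sketch of the idea (Product and ShopContext are illustrative; with DB-first, the EF POCO T4 templates generate plain classes of this same shape):

```csharp
using System.Data.Entity;

// Plain business object: no EF base class, no attributes, trivially testable.
public class Product
{
    public int Id { get; set; }
    public string Name { get; set; }
    public decimal Price { get; set; }

    // Business logic travels with the object, not with the storage layer.
    public bool IsOnSale()
    {
        return Price < 10m;
    }
}

// Only this class knows about Entity Framework.
public class ShopContext : DbContext
{
    public DbSet<Product> Products { get; set; }
}
```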
With T4 templates I put the actual EF-generated entities in a common project that is referenced by all other projects. I use the EF database-first models throughout the entire application (including as view models). If I need to add additional properties to an entity that are not in the database, I just extend the partial class of the entity in the common project. I have written dozens of large n-tier applications using this model and it's worked great.
As everybody knows, object-oriented languages provide reusability features.
I have a simple three-tier application:
presentation layer
business layer is designed to reap the benefits of reusability
data layer is a dumb ADO.NET library (by dumb I mean that it has no business logic in place)
I have been struggling to enforce code reusability in this data layer. Below is a pseudo-pattern from one of my methods in the data layer:
create connection object
open connection to database
create a transaction object
begin transaction
create command object
execute it
...
create nth command object
execute it
commit transaction
close connection
In reality this code swells to around 300 to 400 lines and becomes impossible to read.
In this series of command executions we are running select / insert / update queries against different tables. If it did not have to be in a transaction, I would have separated this code into its respective classes.
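For concreteness, here is roughly what one slice of that pattern looks like in plain ADO.NET (a minimal sketch; the connection string, table and column names are assumptions, and the real method repeats the command block many times):

```csharp
using System.Data.SqlClient;

public static class OrderWriter
{
    public static void SaveOrderGraph(string connectionString)
    {
        using (var connection = new SqlConnection(connectionString))
        {
            connection.Open();
            using (var transaction = connection.BeginTransaction())
            {
                var command = connection.CreateCommand();
                command.Transaction = transaction;   // every command must enlist explicitly
                command.CommandText = "INSERT INTO AuditLog (Message) VALUES (@msg)";
                command.Parameters.AddWithValue("@msg", "order created");
                command.ExecuteNonQuery();

                // ... the 2nd through nth command objects follow the same shape ...

                transaction.Commit();
            }
        }
    }
}
```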
There is also a spaghetti pattern which I encountered recently:
business layer method1 calls datalayer method to update column1
business layer method2 calls datalayer method to update column2
business layer method3 calls datalayer method to save the entire result in table by updating it.
This pattern emerged when I was trying to reap the benefits of reusability: these methods are called from different locations, so they were reused. However, if I had written a simple SQL query without keeping reusability in mind, there would have been a single call to the database.
So, is there any pattern or technique by which reusability can be achieved in the data layer?
Note:
I don't want to use any stored procedures, despite the fact that they offer precompilation benefits etc., as they tend to tie the data layer more specifically to a particular database.
I am also not currently considering any ORM solutions here, only plain ADO.NET.
My excuses for not considering any ORMs:
Learning curve
Avoiding tight coupling to a specific ORM, which I think can be avoided anyway by restricting the ORM code to the data layer itself.
I checked the internet some 6 months ago and there were only two popular or widely used ORM solutions available then: Entity Framework and NHibernate. I chose Entity Framework to start learning (for some reasons I will link later: link1, link2; besides that, I had a feeling that working with EF would be easy, as it is provided by Microsoft).
I used this Microsoft-recommended book; in that book there were three techniques as I understood them: TPT, TPH and TPC (TPC I never tried).
When I checked the SQL generated by Entity Framework, it was very ugly and created some extra columns (IDs, some ugly CASE statements, etc.); it seemed that an ORM solution cannot be applied to a highly transactional system. By highly transactional I mean thousands of insertions happening every single minute, with the database continuing to swell in size and reaching somewhere near 500 to 600 GB at some point in the future.
I agree with the comments to your question; you should really avoid re-inventing the wheel here and go with an ORM, if at all possible. Speaking from experience, you're going to end up writing code and solving problems that have long ago been solved and it will probably take you more time in the long run. However, I understand that sometimes there are constraints that don't permit the use of an ORM.
Here are some articles that I have found helpful:
This first article is an old one, but it explains the different options you have for data access design patterns. It covers a few different patterns, and only you can really decide which one will be best for you, but it sounds like you might want to look at the Repository pattern:
http://msdn.microsoft.com/en-us/magazine/dd569757.aspx
This next article is the first in a series that talks about how to implement a repository pattern with a data mapper, which, based on your example above, will probably help reduce some of your redundant code.
http://blogsprajeesh.blogspot.com/2010/02/data-access-layer-in-c-using-repository.html
Finally, depending on how you implement your data access pattern, you may find the template pattern and generics helpful. The following article talks about that a little bit and you can glean some helpful information from it:
http://www.c-sharpcorner.com/UploadFile/rmcochran/elegant_dal05212006130957PM/elegant_dal.aspx
Without knowing more about your project, it's hard to say, exactly, which pattern will best suit your needs. However, using a combination of the Unit of Work pattern with repositories and data mappers will probably help you to reuse some code and manage your data access.
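To make that concrete with plain ADO.NET, here is a minimal sketch of a Unit of Work shared by small, table-specific repositories (UnitOfWork, ProductRepository and the SQL are hypothetical names, not from any particular library):

```csharp
using System;
using System.Data.SqlClient;

// Owns the connection and transaction so repositories can share them
// without knowing who started the transaction.
public sealed class UnitOfWork : IDisposable
{
    private readonly SqlConnection _connection;
    private readonly SqlTransaction _transaction;

    public UnitOfWork(string connectionString)
    {
        _connection = new SqlConnection(connectionString);
        _connection.Open();
        _transaction = _connection.BeginTransaction();
    }

    // Repositories ask for commands that are already enlisted in the transaction.
    public SqlCommand CreateCommand()
    {
        var command = _connection.CreateCommand();
        command.Transaction = _transaction;
        return command;
    }

    public void Commit()
    {
        _transaction.Commit();
    }

    public void Dispose()
    {
        _transaction.Dispose();
        _connection.Dispose();
    }
}

// Table-specific code moves into small, reusable repository classes.
public sealed class ProductRepository
{
    private readonly UnitOfWork _unitOfWork;

    public ProductRepository(UnitOfWork unitOfWork)
    {
        _unitOfWork = unitOfWork;
    }

    public void UpdatePrice(int productId, decimal price)
    {
        using (var command = _unitOfWork.CreateCommand())
        {
            command.CommandText = "UPDATE Product SET Price = @price WHERE Id = @id";
            command.Parameters.AddWithValue("@price", price);
            command.Parameters.AddWithValue("@id", productId);
            command.ExecuteNonQuery();
        }
    }
}
```

The 300-line method then shrinks to an orchestration: open a UnitOfWork, call methods on several repositories that all reuse the same transaction, and Commit() at the end.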
What I'm not seeing is your model layer.
You have a business layer, and a DAO layer, but no model.
business layer method1 calls datalayer method to update column1
business layer method2 calls datalayer method to update column2
business layer method3 calls datalayer method to save the entire result in table by updating it.
Why isn't this:
business layer updates model/domain object A
business layer updates model/domain object A in a different way
business layer persists model/domain to database through data layer.
This way you get re-use, and avoid repeated loops back and forth to database.
Ultimately it sounds like your business layer knows FAR too much about the database's data model. You need business objects, not just business methods.
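A minimal sketch of the difference (the Order object and the data layer interface are hypothetical):

```csharp
// The model object accumulates changes in memory...
public class Order
{
    public decimal Discount { get; private set; }
    public string Status { get; private set; }

    public void ApplyDiscount(decimal discount) { Discount = discount; }  // business method 1
    public void MarkShipped() { Status = "Shipped"; }                     // business method 2
}

public interface IOrderDataLayer
{
    Order Load(int orderId);
    void Save(Order order);
}

public class OrderWorkflow
{
    private readonly IOrderDataLayer _dataLayer;

    public OrderWorkflow(IOrderDataLayer dataLayer) { _dataLayer = dataLayer; }

    public void Process(int orderId)
    {
        // Both updates happen in memory; one round trip persists everything.
        var order = _dataLayer.Load(orderId);
        order.ApplyDiscount(0.10m);
        order.MarkShipped();
        _dataLayer.Save(order);
    }
}
```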
As I began writing web applications with ASP.NET, I started with small projects that used a Linq-To-SQL mapper for database access to an MSSQL Server.
After gaining some experience, I switched to a classic three-tier approach with a graphical layer, business layer, and data layer. The only function of the data layer was to provide insert/update/delete methods (without any logic) and selection methods.
Over time I realized that it would be better not to expose the database classes all the way up to the GUI (that realization took some time, unfortunately). I switched to business classes in the BL that are used for all operations performed by the BL and displayed by the GUI, which gets lists of these business objects from the business layer.
A great advantage is that I can provide additional properties that are not represented in the database itself. However, I did that mapping inside the business layer myself, with methods that mapped the corresponding business layer class to the database class.
I guess that's where O/R mappers come in handy? Until now I hadn't realized their purpose, but I think I just found it. I've recently tried out the new Entity Framework with .NET Framework 4, but I'm only using it like the Linq-To-SQL DataContext.
Is there a way to achieve the mapping automatically? If yes, is that something the new Entity Framework provides, or do I need to look for an O/R mapper like NHibernate?
I use NHibernate exclusively in my projects. I like the control and flexibility it gives me. There is a 'shortcut' called Active Record that uses NHibernate under the covers but provides a really nice and simple interface to it.
NHibernate has a steep learning curve, but when you get past that - it is really smooth sailing. When (and if) you venture the way of NHibernate, check out Ayende for cool tips.
(Entity Framework is an O/R Mapper.)
If you're serious about getting your hands dirty with ORM (but relatively new to that area), I highly recommend something like TekPub's videos on these topics. You'll be able to see these tools in use starting from scratch. It is a graceful introduction to some simple, but real-world issues like the ones you mention.
LinqToSql is an ORM, so you are already using one. Taking LinqToSql out and replacing it with EntityFramework or NHibernate won't solve the problems you appear to be having right now.
Here are some things you should learn more about to help give you additional context:
AutoMapper
Data Transfer Objects (DTOs)
Plain Old CLR Object (POCO)
I've had a great time using Entity Framework 4.0 (+ the CTP). I think you'd have a much easier time dealing with an ORM like that. EF4 provides everything you need to interoperate with MSSQL from C#/.NET. You won't have to write a single line of SQL, and it has full support for LINQ (through ObjectQuery).
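For a flavor of what that looks like (a minimal sketch; ShopEntities and its Products set stand in for whatever the generated EF model provides):

```csharp
using System.Linq;

public static class ProductReports
{
    public static void PrintCheapProducts()
    {
        using (var context = new ShopEntities())
        {
            // The LINQ query below is translated to SQL by EF; no SQL is written by hand.
            var cheapProducts = context.Products
                .Where(p => p.Price < 10m)
                .OrderBy(p => p.Name)
                .ToList();   // the SQL is generated and executed here, not before

            foreach (var product in cheapProducts)
                System.Console.WriteLine(product.Name);
        }
    }
}
```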
I am starting a new ASP.NET MVC project to learn with, and am wondering what the optimal way is to set up the project(s) to connect to a SQL Server for the data. For example, let's pretend we have a Product table and a Product object I want to use to populate data in my view.
I know somewhere in here I should have an interface that gets implemented, etc but I can't wrap my mind around it today :-(
EDIT: Right now (i.e. the current, poorly coded version of this app) I am just using plain old SQL Server (2000, even) with only stored procedures for data access, but I would not be averse to adding in an extra layer of flexibility for using LINQ to SQL or something.
EDIT #2: One thing I wanted to add: I will be writing this against a V1 of the database, and I will need to let our DBA re-work the database and give me a V2 later. It would be nice to only have to change a few small things that are not provided by the database now but will be later, rather than having to re-write a whole new DAL.
It really depends on which data access technology you're using. If you're using LINQ to SQL, you might want to abstract away the data access behind some sort of "repository" interface, such as an IProductRepository. The main appeal of this is that you can change out the specific data access implementation at any time (such as when writing unit tests).
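A minimal sketch of that interface and a LINQ to SQL implementation behind it (MyDataContext and the Product members are assumptions):

```csharp
using System.Collections.Generic;
using System.Linq;

public interface IProductRepository
{
    Product GetById(int id);
    IList<Product> GetAll();
}

// The LINQ to SQL implementation; a test double can replace it at any time.
public class LinqToSqlProductRepository : IProductRepository
{
    private readonly MyDataContext _db;

    public LinqToSqlProductRepository(MyDataContext db)
    {
        _db = db;
    }

    public Product GetById(int id)
    {
        return _db.Products.Single(p => p.Id == id);
    }

    public IList<Product> GetAll()
    {
        return _db.Products.ToList();
    }
}
```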
I've tried to cover some of this here:
I would check out Rob Conery's videos on his creation of an MVC store front. The series can be found here: MVC Store Front Series
This series dives into all sorts of design-related subjects as well as coding/testing practices to use with MVC and other projects.
In my site's solution, I have the MVC web application project and a "common" project that contains my POCOs (plain ol' C# objects), business managers and data access layers.
The DAL classes are tied to SQL Server (I didn't abstract them out) and return POCOs to the business managers that I call from my controllers in the MVC project.
I think that Billy McCafferty's S#arp Architecture is a quite nice example of using ASP.NET MVC with a data access layer (using NHibernate as default), dependency injection (Ninject atm, but there are plans to support the CommonServiceLocator) and test-driven development. The framework is still in development, but I consider it quite good and stable. As of the current release, there should be few breaking changes until there is a final release, so coding against it should be okay.
I have done a few MVC applications and I have found a structure that works very nicely for me. It is based upon Rob Conery's MVC Storefront Series that JPrescottSanders mentioned (although the link he posted is wrong).
So here goes - I usually try to restrict my controllers to only contain view logic. This includes retrieving data to pass on to the views and mapping from data passed back from the view to the domain model. The key is to try and keep business logic out of this layer.
To this end I usually end up with 3 layers in my application. The first is the presentation layer - the controllers. The second is the service layer - this layer is responsible for executing complex queries as well as things like validation. The third layer is the repository layer - this layer is responsible for all access to the database.
So in your products example, this would mean that you would have a ProductRepository with methods such as GetProducts() and SaveProduct(Product product). You would also have a ProductService (which depends on the ProductRepository) with methods such as GetProductsForUser(User user), GetProductsWithCategory(Category category) and SaveProduct(Product product). Things like validation would also happen here. Finally your controller would depend on your service layer for retrieving and storing products.
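Sketched in code, using the names from the paragraph above (the repository internals and the validation rule are illustrative):

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

public class ProductService
{
    private readonly ProductRepository _repository;

    public ProductService(ProductRepository repository)
    {
        _repository = repository;
    }

    public IList<Product> GetProductsWithCategory(Category category)
    {
        // Complex querying lives in the service layer...
        return _repository.GetProducts()
                          .Where(p => p.Category == category)
                          .ToList();
    }

    public void SaveProduct(Product product)
    {
        // ...as does validation, before delegating persistence to the repository.
        if (string.IsNullOrEmpty(product.Name))
            throw new ArgumentException("A product must have a name.");
        _repository.SaveProduct(product);
    }
}
```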
You can get away with skipping the service layer but you will usually find that your controllers get very fat and tend to do too much. I have tried this architecture quite a few times and it tends to work quite nicely, especially since it supports TDD and automated testing very well.
For our application I plan on using LINQ to Entities, but as it's new to me there is the possibility that I will want to replace it in the future if it doesn't perform as I would like, and use something else such as LINQ to SQL or NHibernate. So I'll be abstracting the data access objects into an abstract factory so that the implementation is hidden from the application.
How you do it is up to you; as long as you choose a proven and well-known design pattern for the implementation, I think your final product will be well supported and robust.
Use LINQ. Create a LINQ to SQL file and drag and drop all the tables and views you need. Then when you call your model, all of your CRUD-level stuff is created for you automagically.
LINQ is the best thing I have seen in a long long time. Here are some simple samples for grabbing data from Scott Gu's blog.
LINQ Tutorial
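For example, a typical read/update cycle against the generated DataContext looks something like this (a minimal sketch; MyDataContext and Product stand in for whatever the designer generated):

```csharp
using System.Linq;

public static class ProductUpdater
{
    public static void DiscountProduct(int productId)
    {
        using (var db = new MyDataContext())
        {
            // Read: the generated classes expose each table as a LINQ-queryable property.
            var product = db.Products.Single(p => p.Id == productId);

            // Update: modify the object and let the DataContext generate the SQL.
            product.Price = product.Price * 0.9m;
            db.SubmitChanges();
        }
    }
}
```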
I just did my first MVC project and I used a Service-Repository design pattern. There is a good bit of information about it on the net right now. It made my transition from Linq->Sql to Entity Framework effortless. If you think you're going to be changing things a lot, put in the little extra effort to use interfaces.
I recommend Entity Framework for your DAL/Repository.
Check out the Code Camp Server for a good reference application that does this very thing, and as @haacked stated, abstract that goo away to keep them separated.
I think you need an ORM.
For example, Entity Framework (Code First).
You can create some classes for your model, use those models for your logic and views, and map them to the database (V1).
When the DBA gives you the new database (V2), you only change the mapping configuration. (V1 and V2 both need to be relational databases: SQL Server, MySQL, Oracle, and so on. If V1 is a relational database and V2 is a NoSQL store such as Mongo, Redis or Couchbase, that won't work.)
You may also need to do some find-and-replace.
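A minimal sketch of what "only change the mapping config" can look like with EF Code First (the renamed table and column are hypothetical):

```csharp
using System.Data.Entity;

public class Product
{
    public int Id { get; set; }
    public string Name { get; set; }
}

public class ShopContext : DbContext
{
    public DbSet<Product> Products { get; set; }

    protected override void OnModelCreating(DbModelBuilder modelBuilder)
    {
        // V2 renamed the table and a column; the Product class stays untouched.
        modelBuilder.Entity<Product>().ToTable("tblProduct");
        modelBuilder.Entity<Product>()
                    .Property(p => p.Name)
                    .HasColumnName("ProductName");
    }
}
```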