Having created a functional test project using NHibernate with all the typical trimmings, I've got a very good handle on how I can leverage NHibernate, or at least as much as I can so far. However, I'm wondering how other developers handle scenarios where the application needs to retrieve data from the backend database using complex queries with joins/grouping, say for reporting purposes or other tasks that fall outside the OO paradigm.
Should I take the pure route of using NHibernate to execute these queries and fill the appropriate repository objects regardless of performance, OR should I simply go straight to the database and pass a dataset back through the business layer?
Ultimately I'm comfortable using NHibernate for interacting with simple and complex business objects, but there are situations where I feel that simply delving into the database makes more sense. Working as the only .NET developer at the moment, I'm very interested in how other devs have handled this situation...
Thanks in advance
NHibernate offers much more than simple querying. The answer depends on what you need. Some examples:
Automatic mapping of data to classes;
Automatic change tracking;
Database independence;
Multiple choices in querying the database (currently 7) with support for easy refactoring;
Integrated caching capabilities;
Much much more.
If any of these properties fit your project, then yes, choose NHibernate.
Be pragmatic about it.
Many "complex" queries can be easily expressed in HQL, but if SQL is a better fit for some, by all means use it.
The reporting scenarios in particular are usually better suited for reporting tools that go directly against the DB.
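As a rough illustration of mixing HQL and hand-written SQL in the same session (the entities, tables and columns here are hypothetical, not anything from your project):

```csharp
using System.Collections.Generic;
using NHibernate;

public class InvoiceReports
{
    private readonly ISession session;

    public InvoiceReports(ISession session)
    {
        this.session = session;
    }

    // A join/grouping query expressed in HQL against hypothetical Invoice/Customer entities.
    public IList<object[]> TotalsPerCustomer()
    {
        return session.CreateQuery(
                "select i.Customer.Name, sum(i.Total) from Invoice i group by i.Customer.Name")
            .List<object[]>();
    }

    // The same report with hand-written SQL when that reads (or performs) better.
    public IList<object[]> TotalsPerCustomerSql()
    {
        return session.CreateSQLQuery(
                "SELECT c.Name, SUM(i.Total) FROM Invoices i " +
                "JOIN Customers c ON c.Id = i.CustomerId GROUP BY c.Name")
            .List<object[]>();
    }
}
```

Both queries come back through the same ISession, so switching between them is a per-query decision rather than an architectural one.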
If your reports have relatively simple joins/grouping, then NHibernate will be OK.
If you have complex queries for reporting purposes, NHibernate may not be the best solution. I would recommend SSRS (SQL Server Reporting Services). It will be easier for grouping and other advanced functionality such as exporting to Word/Excel.
It really depends on the complexity of your reports.
I have been learning this ORM because I think it is a good technology for most projects. But most employers require ADO.NET and SQL skills instead.
Is this ORM not used in high-load systems (like popular web sites)? In which types of projects will this ORM be useful? Do highly loaded projects use ORMs?
If you want the best possible performance, don't use an ORM. That said, not all parts of an application need the best possible performance and good ORMs (custom built or off the shelf) significantly increase development speed.
I'm not a big fan of the ORMBattle website, but searching for questions including that term on StackOverflow will give you additional information to read about .NET ORM performance:
http://www.google.com/search?q=site:stackoverflow.com+ormbattle
For instance:
Testing custom ORM solution performance overhead - how to?
ORM (esp. NHibernate) performance for complex queries
Good ORMs result in very little overhead (on top of ADO.NET) and the performance will be just fine in the large majority of cases.
A good ORM will allow you to easily "drop to the metal" (i.e. get closer to raw SQL performance) when you need extra performance.
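As one example of "dropping to the metal", NHibernate exposes the ADO.NET connection it manages, so a hot path can bypass the mapping layer entirely; the table and column below are made up for illustration:

```csharp
using System;
using NHibernate;

public static class HotPath
{
    // One performance-critical query written as raw ADO.NET,
    // reusing the connection that NHibernate already manages.
    public static int CountActiveUsers(ISession session)
    {
        using (var cmd = session.Connection.CreateCommand())
        {
            cmd.CommandText = "SELECT COUNT(*) FROM Users WHERE IsActive = 1";
            return Convert.ToInt32(cmd.ExecuteScalar());
        }
    }
}
```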
ORMs can certainly be performance killers. I've measured the performance of Entity Framework (v1) and LINQ to SQL against ADO.NET (both DataSets and DataReaders). EF performance was simply unacceptable, especially in web apps where data contexts would be created and discarded very frequently. LINQ to SQL wasn't too bad, but wouldn't qualify for a high-performance application. The difference between the two is that EF is a two-layer model, without generated code to optimize the mapping between layers. LINQ to SQL is a single layer and doesn't offer nearly as much customization; the trade-off is that LINQ queries map more closely to the relational model, so there is much less overhead.
The concept of an ORM is certainly valuable; it results in much cleaner code at the application level. But there's no such thing as a free lunch. I'm currently writing a custom ORM that maps a single data model onto both SQL Server and Oracle, with the ability to switch between servers with a simple app config setting. However, there's no LINQ IQueryable provider, and all queries are written or generated as dynamic SQL and run as ADO.NET queries. All database interaction is interfaced to the application as method calls for specific operations, and IEnumerables of entity classes returned as results. Performance is within 10% of straight ADO.NET coding, but that level of performance was a requirement from the start of the project.
While the core components of this ORM could be used in any project, the only way to get ORM and high performance is to avoid any on-the-fly mapping, either between layers (as EF does) or translating LINQ into SQL. (It's very painful to write that, because I dearly love LINQ, but the mapping cost is too much. I did make sure that LINQ to Objects works both within the ORM and in application usage with the results.)
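A stripped-down sketch of that kind of approach (illustrative only, not the poster's actual ORM): run SQL through ADO.NET and project each row into an entity via a delegate, so the only overhead over raw ADO.NET is the per-row delegate call.

```csharp
using System;
using System.Collections.Generic;
using System.Data;
using System.Data.SqlClient;

public class Customer
{
    public int Id { get; set; }
    public string Name { get; set; }
}

public static class Db
{
    // Execute a SQL query and map each row to an entity with the supplied delegate.
    // No expression-tree translation and no change tracking: just read and project.
    public static IEnumerable<T> Query<T>(
        string connectionString, string sql, Func<IDataRecord, T> map)
    {
        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand(sql, connection))
        {
            connection.Open();
            using (var reader = command.ExecuteReader())
            {
                while (reader.Read())
                    yield return map(reader);
            }
        }
    }
}

// Usage:
// var customers = Db.Query(connStr,
//     "SELECT Id, Name FROM Customers",
//     r => new Customer { Id = r.GetInt32(0), Name = r.GetString(1) });
```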
It all depends on your requirements and your architecture.
ORMs are evil for a reporting system and very good for simple logic. If you implement the Repository pattern, you may achieve good performance.
But, as I said, it all depends on your requirements and architecture.
Have a look at CQRS (Command-Query Responsibility Segregation); it is an interesting approach to system design.
Have a look at Foundations of Programming; this is where I started.
I like to use an ORM where I have a relational DB with a domain model (objects to be persisted). I find it saves development time and results in cleaner code.
Regarding your post about jobs, I noticed this too. I can only speculate that the answer is that many .NET developers are still learning ORMs, and therefore the frameworks haven't made it into their production systems.
I have noticed a number of consultancy companies which seem to be using ORMs (if you are one of these companies and you do not use an ORM, please correct this; I based it on your technical blogs):
IMeta (offers commercial support for NHibernate) UK
Engine Room Apps (they offer training and write apps using NHibernate) UK
EMC2 (they did the Who Can Help Me sample on CodePlex)
HTH
bones
Given that I'm very good with SQL and C#:
Should I learn another layer on top, like NHibernate?
Is NHibernate just a library (that stores to a database)? Or is it a service?
Should I use NHibernate or ADO.NET Entity Framework?
If you think I should learn/use an ORM, give me your top reason.
You should use an ORM as long as you need to convert database data to and from business objects, since it will save you a lot of work and will allow you to focus on your application logic.
NHibernate is a .NET library that does just that, mapping .NET objects to database tables according to how you configure it. In this sense it is the same as the Entity Framework, except that EF is already embedded in the .NET Framework while NHibernate is a separate assembly that you must reference in your project.
Last but not least, if you use SQL Server you should add LINQ to SQL to the list of possible ORM candidates; it is simpler than EF and for many scenarios it is more than adequate.
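To make the mapping idea concrete, here is a minimal LINQ to SQL sketch; the table, column names and connection string are assumptions for illustration:

```csharp
using System;
using System.Data.Linq;
using System.Data.Linq.Mapping;
using System.Linq;

// The attributes tell LINQ to SQL which table and columns this class maps to.
[Table(Name = "Customers")]
public class Customer
{
    [Column(IsPrimaryKey = true)]
    public int Id { get; set; }

    [Column]
    public string Name { get; set; }
}

class Program
{
    static void Main()
    {
        // DataContext performs the object/relational mapping configured above.
        using (var db = new DataContext("your connection string here"))
        {
            var customers = db.GetTable<Customer>()
                              .Where(c => c.Name.StartsWith("A"))
                              .ToList();
            Console.WriteLine(customers.Count);
        }
    }
}
```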
It depends on your applications.
NHibernate is a library. So it's a DLL.
Depends on what you want. NHibernate is based on Hibernate which is battle tested.
It doesn't matter how good anyone is with SQL or C#. There is a fundamental gap with the tools when dealing with SQL and C#. Aside from all the other productivity boosts that I've had when I learned to Stop Worrying and Just Use an ORM, I found only having to deal with C# most of the time has helped greatly. I have far fewer impedance mismatches in my work now and I do believe that contributes to fewer bugs.
The less code you have to write, the less code you have to maintain. ORMs allow you to worry less about certain details, so you are free to concentrate on higher-level tasks.
No, I tried Fluent NH and Castle Active Record and Spring Framework NH Extensions but they all obscure basic operations and make things less visible. Start using native NH, then add a layer after a year.
Yes, NH is a library, not a service. But the way you use it in your code makes it feel almost like a service (e.g. a data repository service)
I tried EF and found it nauseating so I would go with NH
For OLTP-like systems, ORM is the way of the future. For me, not using an ORM is like not using unit tests or programming in a non-OOP language.
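In the spirit of starting with native NHibernate, a bare-bones usage sketch; the hibernate.cfg.xml file, the Product mapping and the identifier value are assumed to exist:

```csharp
using NHibernate;
using NHibernate.Cfg;

public class Product
{
    public virtual int Id { get; protected set; }
    public virtual string Name { get; set; }
    public virtual decimal Price { get; set; }
}

class Program
{
    static void Main()
    {
        // Reads hibernate.cfg.xml: connection string, dialect and mapping files.
        ISessionFactory factory = new Configuration().Configure().BuildSessionFactory();

        using (ISession session = factory.OpenSession())
        using (ITransaction tx = session.BeginTransaction())
        {
            session.Save(new Product { Name = "Widget", Price = 9.99m });
            tx.Commit();
        }

        using (ISession session = factory.OpenSession())
        {
            // Load by identifier; HQL, Criteria or LINQ are available for richer queries.
            Product widget = session.Get<Product>(1);
        }
    }
}
```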
Probably, but it depends on what kind of applications you normally write.
NHibernate is primarily a DLL, but there is more to it than that.
NHibernate (Read this for more details: NHibernate, Entity Framework, active records or linq2sql)
My top reason would be so you can use Linq. Right now, you pretty much need an ORM to use Linq.
Unless it's a very small application, then the answer is 'yes'.
Library.
I hear people swear by the EF, but I'm very leery of it. I also don't like tying myself to all Microsoft technologies. NHibernate would be my suggestion.
First, you don't want to go through the time and headache of writing all the SQL and classes and such; it's just not worth it. Second, it allows for greater ability to switch from one RDBMS to another without having to change much code. Third, it'll give you more control in the future in terms of database abstraction and such.
I am really having a hard time here. I need to design a desktop app that will use WCF as the communications channel. It's a multi-tiered application (the DB and application server are the same; the client goes through the internet cloud).
The application is a little more complex (in terms of SQL and code logic) than the usual LOB application, but the concept is the same: read from the DB, update the DB, handle concurrency, etc. My problem is that now, with Entity Framework out in the open, I can't decide which way to proceed: should I use Entity Framework, DataSets or custom classes?
As I understand it, Entity Framework will create the object mapping of my DB tables ALONG WITH the CRUD scripts as well. That's all well and good for simple CRUD, but most of the time the "select" is complex and requires custom SQL. I understand I can use stored procedures in EF (I don't like SPs, by the way; I don't know why, but I like to code my SQL in the DAL by hand, I feel more secure and comfortable that way).
With DataSets, I would use my custom SQL and populate the data set. With custom classes (objects for DB tables) I would populate those custom classes (collections, lists, etc.) from my custom SQL. I want to use EF, but I don't feel confident deploying an application whose SQL I have not written and can't see in the code. Am I missing something here?
Any help in this regard would be greatly appreciated.
Xeshu
I would agree with Marc G. 100% - DataSets suck, especially in a WCF scenario (they add a lot of overhead for handling in-memory data manipulation) - don't use those. They're okay for beginners and two-tier desktop apps on a small scale maybe - but I wouldn't use them in a serious, professional app.
Basically, your question boils down to how do you transform your rows from the database into something you can remote across WCF. This means some form of mapping - either you do it yourself, using DataReaders and then shoving all the data into WCF [DataContract] classes - you can certainly do that, gives you the ultimate control, but it's also tedious, cumbersome, and error-prone.
Or you let some ready-made ORM handle this grunt work for you - take your pick amongst Linq-to-SQL (great, easy-to-use, flexible, but SQL Server only), EF v4 (out by March 2010 - looks very promising, very flexible) or any other ORM, really - whatever suits your needs best.
Other serious competitors in the ORM space might include Subsonic 3.0 and NHibernate (amongst many many others).
So to sum up:
forget about Datasets
either you take 100% control and do the mapping between SQL and your objects yourself (sketched below)
you let some capable ORM handle that (Linq-to-SQL, EF v4, Subsonic, NHibernate et al) - which one really doesn't matter all that much, i.e. it's also a matter of personal preference and coding style
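For completeness, the do-it-yourself mapping option from the list above might look like the following; the DTO, table and column names are illustrative:

```csharp
using System.Collections.Generic;
using System.Data.SqlClient;
using System.Runtime.Serialization;

// The WCF-serializable shape that crosses the service boundary.
[DataContract]
public class CustomerDto
{
    [DataMember] public int Id { get; set; }
    [DataMember] public string Name { get; set; }
}

public static class CustomerData
{
    // Hand-mapping a DataReader into DataContract classes: full control,
    // but every new column or type change has to be maintained by hand.
    public static List<CustomerDto> LoadAll(string connectionString)
    {
        var result = new List<CustomerDto>();
        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand("SELECT Id, Name FROM Customers", connection))
        {
            connection.Open();
            using (var reader = command.ExecuteReader())
            {
                while (reader.Read())
                {
                    result.Add(new CustomerDto
                    {
                        Id = reader.GetInt32(0),
                        Name = reader.GetString(1)
                    });
                }
            }
        }
        return result;
    }
}
```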
I can't advocate datasets, especially in an SOA environment like WCF - it'll work, but for mostly the wrong reasons. They simply aren't portable, and IMO don't really "work" over service boundaries. Of course, IMO they don't work in most other scenarios too ;-p
So then it comes down to how much plumbing you want to do. Most ORMs will create WCF-serializable types for you; personally I'd use LINQ-to-SQL at the moment; it is both simpler and more complete than EF, although EF 4.0 is meant to be much better than EF in 3.5sp1. You can use custom TSQL (via ExecuteQuery, which still does the mapping back to objects), but I tend to use either SPROC (for complex queries) or LINQ-generated queries (for simple requests).
Writing the types yourself is fine too, and will work with NHibernate etc. So many options.
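To illustrate the custom-TSQL route mentioned above, LINQ to SQL's ExecuteQuery runs hand-written SQL but still maps the rows back onto a class by column name; the query and result class here are made up:

```csharp
using System.Collections.Generic;
using System.Data.Linq;
using System.Linq;

public class OrderSummary
{
    public int CustomerId { get; set; }
    public decimal Total { get; set; }
}

public static class Reports
{
    // Hand-written T-SQL, but the DataContext still does the column-to-property mapping.
    public static List<OrderSummary> TotalsPerCustomer(DataContext db, decimal minimum)
    {
        return db.ExecuteQuery<OrderSummary>(
                @"SELECT CustomerId, SUM(Amount) AS Total
                  FROM Orders
                  GROUP BY CustomerId
                  HAVING SUM(Amount) >= {0}", minimum)
            .ToList();
    }
}
```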
While EF works with WCF and sounds very promising, you should consider the effort to get up to speed with it. Especially when doing some non-trivial stuff, the designer in VS2008 can't open the model anymore and you have to code your model in XML.
Also keep in mind that EF works at a very high abstraction level. Because of the law of leaky abstractions, it's not as shiny as it's supposed to be :)
Put the other way round, that means you have to deal with very crazy and hard-to-read SQL statements sent to your database when it comes to troubleshooting/performance issues.
When I last worked in programming, we were trying to move away from DataReaders and the traditional ADO.NET API toward Object Relational Mapping (ORM).
To do this, we generated a DataContext of our DB via sqlmetal. There was then a thin data layer that made the DataContext private, and any code needing to access the database would have to use a public method in this thin data layer. These methods were basically stored procedures; they would perform queries on the database via LINQ to SQL.
Is this a common approach today? I mean, is everyone who's using the .NET 3.5 framework really running sqlmetal in their build process, or what? It almost seemed like a hack at the time.
Basically, I'd like to know if LINQ to SQL and sqlmetal are what to expect if I'm going to write a DAL today at a .NET 3.5 shop that doesn't employ a third-party, open-source ORM.
It is still considered best practice to have some sort of data access layer. Whether this is best achieved with an ORM is a heavily debated issue. One faction generally argues that ORMs are the way to go; another argues that stored procedures and a database-centric approach are the best route.
Also, this may not be exactly the poster you meant, but it is similar (and also the one in my cubicle):
http://download.microsoft.com/download/4/a/3/4a3c7c55-84ab-4588-84a4-f96424a7d82d/NET35_Namespaces_Poster_LORES.pdf
Your approach is good. I currently use Astoria services (ADO.NET Data Services). There was a nice introduction to this in MSDN Magazine.
I also like the new PLINQO (requires CodeSmith Tools though). This is very slick in my opinion.
When I have such a DAL (service layer), I just consume this service from my client application (Silverlight or ASP.NET MVC).
I think it depends on your use, but I'd say a thin data layer such as you explained would be your DAL. Most projects will build another layer on top of that, mainly for edit/create logic and maybe some stitching logic for gets.
For most of my projects I design it like this.
Repository holds the instance of DataContext and exposes some basic add/delete methods
ProductRepository : Repository exposes general queries (IQueryable)
StoreService uses an instance of different repositories like ProductRepository, SalesRepository and handles all logic for creating something like a product.
So something like...
StoreService.CreateProduct(/* properties */)
This would return some sort of result class.
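A rough sketch of that layering, assuming LINQ to SQL underneath; the entity, repository and service names are only illustrative:

```csharp
using System.Data.Linq;
using System.Data.Linq.Mapping;
using System.Linq;

[Table(Name = "Products")]
public class Product
{
    [Column(IsPrimaryKey = true, IsDbGenerated = true)]
    public int Id { get; set; }
    [Column] public string Name { get; set; }
    [Column] public string Category { get; set; }
}

// Base repository: owns the DataContext and the generic add/delete plumbing.
public class Repository<T> where T : class
{
    protected readonly DataContext Context;

    public Repository(DataContext context) { Context = context; }

    public void Add(T entity) => Context.GetTable<T>().InsertOnSubmit(entity);
    public void Delete(T entity) => Context.GetTable<T>().DeleteOnSubmit(entity);
}

// Entity-specific repository: exposes general queries as IQueryable.
public class ProductRepository : Repository<Product>
{
    public ProductRepository(DataContext context) : base(context) { }

    public IQueryable<Product> ByCategory(string category) =>
        Context.GetTable<Product>().Where(p => p.Category == category);
}

// Service layer: coordinates repositories and owns the create/edit logic.
public class StoreService
{
    private readonly DataContext context;
    private readonly ProductRepository products;

    public StoreService(DataContext context)
    {
        this.context = context;
        products = new ProductRepository(context);
    }

    public Product CreateProduct(string name, string category)
    {
        var product = new Product { Name = name, Category = category };
        products.Add(product);
        context.SubmitChanges();
        return product;
    }
}
```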
The best data layer is the one that is plain and simple and gets the job done without any bells and whistles. I have used the technologies you mentioned and written about them here:
The Only Pattern for Data Access is - There Are No Patterns for Data Access
This very site uses LINQ to SQL, so take that as you will.
Officially, Microsoft is supporting Entity Framework over LINQ to SQL in terms of new development. However, there's a vocal group of people who think EF is the wrong way to go. LINQ to SQL will still be around for some time, and is a very decent ORM, if somewhat limiting in terms of which DB backend you can use.
I would recommend LINQ as a great starting point for your ORM. If you need better, look into EF and/or NHibernate.
"Is this a common approach today? I mean, is everyone whose using the .NET 3.5 framework really running sqlmetal in their build process, or what?"
The people I know using the 3.5 Framework (and that's just about everyone) - the vast majority - are still using NHibernate. Version 2.0 is a very nice OR/M. I started using it on a recent project and it cut my data access code down significantly, to the point where I really don't want to use anything else in the future. And the Fluent NHibernate API is making some headway for folks who don't like the XML mapping.
I've been taking a look at some different products for .NET which promise to speed up development time by providing a way for business objects to map seamlessly to an automatically generated database. I've never had a problem writing a data access layer, but I'm wondering if this type of product will really save the time it claims. I also worry that I will be giving up too much control over the database, making it harder to track down any data-level problems. Do these types of products make things better or worse in the already tough situation where both the database and the business object structure must change?
For example:
Object Relation Mapping from Dev Express
In essence, is it worth it? Will I save "THAT" much time, effort, and future bugs?
I have used SubSonic and EntitySpaces. Once you get the hang of them, I believe they can save you time, but as the complexity of your app and the volume of data grow, you may outgrow these tools. You start to lose time trying to figure out whether something like a performance issue is related to the ORM or to your code. So, to answer your question, I think it depends. I tend to agree with Eric on this: high-volume enterprise apps are not a good place for general-purpose ORMs, but in standard-fare smaller CRUD-type apps you might see some time saved.
I've found iBatis from the Apache group to be an excellent solution to this problem. My team is currently using iBatis to map all of our calls from Java to our MySQL backend. It's been a huge benefit as it's easy to manage all of our SQL queries and procedures because they're all located in XML files, not in our code. Separating SQL from your code, no matter what the language, is a great help.
Additionally, iBatis allows you to write your own data mappers to map data to and from your objects to the DB. We wanted this flexibility, as opposed to a Hibernate type solution that does everything for you, but also (IMO) limits your ability to perform complex queries.
There is a .NET version of iBatis as well.
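This isn't the iBATIS API itself, just a tiny hand-rolled sketch of the same idea: named SQL statements kept in an XML file, separate from the code, and looked up by id at runtime.

```csharp
using System.Collections.Generic;
using System.Data.SqlClient;
using System.Linq;
using System.Xml.Linq;

public class SqlMap
{
    private readonly Dictionary<string, string> statements;

    // queries.xml might contain:
    // <statements>
    //   <select id="GetActiveProducts">SELECT Id, Name FROM Products WHERE Active = 1</select>
    // </statements>
    public SqlMap(string xmlPath)
    {
        statements = XDocument.Load(xmlPath)
            .Descendants("select")
            .ToDictionary(e => (string)e.Attribute("id"), e => e.Value);
    }

    // Look up a named statement and wrap it in a command for the given connection.
    public SqlCommand CreateCommand(string statementId, SqlConnection connection) =>
        new SqlCommand(statements[statementId], connection);
}
```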
I've recently set up ActiveRecord from the Castle Project for an app. It was pretty easy to get going. After creating a new app with it, I even used MyGeneration to script out class files for a legacy app that ActiveRecord could use, in a pretty short time. It uses NHibernate to interact with the database, but takes away all the XML mapping that comes with NHibernate. The nice thing, though, is that since you already have NHibernate in your project anyway, you can use its full power if you have some special cases. I'd suggest taking a look at it.
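A rough sketch of the attribute style Castle ActiveRecord uses in place of the XML mappings; the table and property names are made up, and the exact attribute set may differ between versions:

```csharp
using Castle.ActiveRecord;

// The attributes replace the NHibernate XML mapping file.
[ActiveRecord("Products")]
public class Product : ActiveRecordBase<Product>
{
    [PrimaryKey]
    public virtual int Id { get; set; }

    [Property]
    public virtual string Name { get; set; }
}

// Usage (after ActiveRecordStarter has been initialized with the configuration):
// var product = new Product { Name = "Widget" };
// product.Save();
// var reloaded = Product.Find(product.Id);
```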
There are lots of ORMs to choose from: LINQ to SQL, NHibernate. For pure object databases there is db4o.
It depends on the application, but for a high volume enterprise application, I would not go this route. You need more control of your data.
I was discussing this with a friend over the weekend, and it seems like the gains you make on ease of storage are lost if you need to be able to query the database outside of the application. My understanding is that these databases work by storing your object data in a de-normalized fashion. This makes it fast to retrieve entire sets of objects, but if you need to select data from a perspective that doesn't match your object model, the ODBMS might have a hard time getting at the particular data you want.