Entity Framework 4 vs native ADO.NET - C#

I was wondering how Entity Framework 4 compares to native ADO.NET and stored procedures.
What would I be missing if I used plain ADO.NET?
Is it worth leaving EF4?

In a nutshell, EF is an object-relational mapper (ORM), and ADO.NET is raw power. An ORM allows you to trade some runtime performance for ease of maintenance. You gain the ability to write code in a more declarative manner, expressing what you want out of the database instead of exactly how to go about getting it. As a result, changes to the database structure can be accounted for in the mappings rather than in every single part of your application that needs to touch the particular table that changed.
What you would be missing if you use ADO.NET is developer productivity. Describing each database operation in detail to ADO.NET is time-consuming, error-prone, and not much fun.
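To make the contrast concrete, here is a minimal sketch of the same read done both ways. Everything here is hypothetical: the ShopEntities context, the Customer class, the Customers table, and connectionString are stand-ins, not code from the question.

```csharp
using System.Collections.Generic;
using System.Data.SqlClient;
using System.Linq;

// EF 4: declarative -- say what you want; the framework writes the SQL.
using (var context = new ShopEntities())            // hypothetical EF model
{
    List<Customer> active = context.Customers
                                   .Where(c => c.IsActive)
                                   .ToList();
}

// Raw ADO.NET: imperative -- open the connection, write the SQL, map by hand.
using (var conn = new SqlConnection(connectionString))
using (var cmd = new SqlCommand(
           "SELECT Id, Name FROM Customers WHERE IsActive = 1", conn))
{
    conn.Open();
    var active = new List<Customer>();
    using (var reader = cmd.ExecuteReader())
    {
        while (reader.Read())
        {
            active.Add(new Customer
            {
                Id = reader.GetInt32(0),
                Name = reader.GetString(1)
            });
        }
    }
}
```

Note how a schema change touches only the mapping in the first version, but every hand-written reader loop in the second.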
I don't think I would ever want to "leave" an ORM and go back to raw ADO.NET except in situations where extreme performance is required, such as importing large amounts of data, in which case you might be better off writing an SSIS package anyway.

EF is not suited for "crunching" large amounts of data: statistical or financial data with lots of abstract entities, for example. Otherwise it's fine. In any case, unless you're suffering from performance issues, it's fine too. Also, nothing stops you from using both approaches at the same time.

EF feels more natural, but if you are a hardcore SQL user, it might feel weak and odd at first. But I like doing everything on the C# side: fewer maintenance issues, fewer headaches, fewer magic strings.
Anyway, as for performance, unless you are doing mass inserts or updates, you won't see any difference.
If you use plain ADO.NET without some kind of O/RM wrapped around it, you would still be working with records, not classes with behaviour and methods on them. You would need an additional business layer tied to the records.

Related

Entity Framework vs pure ADO.NET

EF is such widely used stuff, but I don't understand how I should use it. I have met a lot of issues with EF on different projects with different approaches, so some questions have come together in my head, and the answers lead me toward pure ADO.NET with stored procedures.
So the questions are:
How do you deal with EF in an n-tier application?
For example, we have some DAL built on EF. I have seen a lot of articles and projects that use the repository and unit-of-work patterns as some kind of abstraction over EF. I think such an approach kills most of the benefits that increase development speed, and it leads to a few things:
remapping of EF query results into DTOs, which kills performance (call some select to get table data: a first loop; a second loop to map results to some composite type generated by EF; next, filter the mapped data using LINQ; and at last map it all to some DTO). This remapping to DTOs kills one of EF's biggest benefits;
or
strong coupling between EF (and its version) and the app. It will be something like a 2-tier app: DAL plus presentation with BLL, or DAL with BLL plus presentation. I guess that's not best practice. And we get the same loading process as in the previous point, minus the mapping, so again a performance issue comes up. We could try to use EF as the DAL without any abstraction on top of it, but then we run into similar issues in some other way.
Should I use one context per app\thread\atomic operation? The one-context-per-app\thread approach may slightly increase performance and the ability to use navigation properties, but it brings other problems: keeping that context up to date, the data loaded into the context growing without bound, and I'm not sure about concurrency with one DbContext per app\thread. Using a context per operation leads us back to remapping EF results into our DTOs, so we are pushed back to question no. 1.
Could we try to use EF + stored procedures only? Again, we have the issues from the previous questions. What is the reason to use EF if the biggest part of its functionality will not be used?
So yes, EF is great for starting a project. It's so convenient when we have a few screens and CRUD operations.
But what next?
All this text is just unsorted thoughts. I know that pure ADO.NET will bring its own kind of challenges.
So, what is your opinion about this topic?
Following the naming conventions, you will find it's called ADO.NET Entity Framework, which means that Entity Framework sits on top of ADO.NET, so it can't be faster. At best it performs in equal time, but look at what EF provides:
You will no longer get stuck writing queries without any clue about whether what you're writing is going to compile or not.
It lets you rely on C# (or your favorite .NET language) to write the data constraints you wish to enforce on target-user input, directly inside your model classes (see the sketch after this list).
Finally, EF and LINQ give you a lot of power in maintaining your applications later.
There are three different workflows with the Entity Framework: Model First, Database First, and Code First. Get to know each of them.
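As a small illustration of the constraints point above, here is a hedged Code First sketch; the Product class and its rules are invented for the example, and the attributes come from System.ComponentModel.DataAnnotations:

```csharp
using System.ComponentModel.DataAnnotations;

// Constraints live in the model class, in C#, not scattered across SQL scripts.
public class Product
{
    public int Id { get; set; }

    [Required]              // becomes NOT NULL in the generated column
    [StringLength(100)]     // nvarchar(100) rather than nvarchar(max)
    public string Name { get; set; }

    [Range(0, 10000)]       // validated before the entity is saved
    public decimal Price { get; set; }
}
```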
As for the point about remapping killing performance: on the first run, EF loads metadata into memory, and that takes time because it builds an in-memory representation of the model from the edmx file.
ADO.NET is an object-oriented framework that allows you to interact with database systems (SQL Server, Oracle, etc.).
Entity Framework is a technique for manipulating data in databases: it generates the usual queries for you (insert into a table, select * from it, and the like). It is used with LINQ.
Entity Framework is not the most efficient option in every case, as with most tools or toolboxes designed to achieve 'faster' results.
Access to the database should be viewed as a separate tier, with stored procedures as the interface. There is no reason for any application to have more than the absolutely required CRUD operations: less is more. Stored procedures are easy to write, secure, and maintainable, and they are the de facto fastest way. It's easy to write tools that generate the desired POCO and DbContext code from stored procedures.
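For illustration, a minimal hand-rolled version of that idea; the procedure name, its parameter, and the Order POCO are hypothetical:

```csharp
using System.Collections.Generic;
using System.Data;
using System.Data.SqlClient;

public class Order
{
    public int Id { get; set; }
    public decimal Total { get; set; }
}

public static class OrderGateway
{
    // The stored procedure is the only interface to the table.
    public static List<Order> GetOrdersByCustomer(string connStr, int customerId)
    {
        var orders = new List<Order>();
        using (var conn = new SqlConnection(connStr))
        using (var cmd = new SqlCommand("dbo.GetOrdersByCustomer", conn))
        {
            cmd.CommandType = CommandType.StoredProcedure;
            cmd.Parameters.AddWithValue("@CustomerId", customerId);
            conn.Open();
            using (var reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                {
                    orders.Add(new Order
                    {
                        Id = reader.GetInt32(0),
                        Total = reader.GetDecimal(1)
                    });
                }
            }
        }
        return orders;
    }
}
```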
A well-designed application should have a limited number of connection strings to the database, and none of them should be the almighty God account. Use schemas to scope connection rights.
Lazy loading is a false promise, added to solve a problem that should never exist and introduced with ORMs and their plug-and-play features. Data should only be read when needed, and developers should be responsible for implementing that logic based on the application context.
If your application logic has a problem maintaining state, no tool will help. It will in fact make things worse by covering up the real problem until it's too late.
Database first is the only solution for a well-designed application. Civilization realized long ago the importance of solid aqueducts and sewer systems. High-level code can and will be replaced at any time, but data stays. Rewriting an entire application is a matter of days if the database is well designed.
Applications are just glorified database access. That is still true in most cases.
This is my conclusion after many years of debugging business applications through code produced by many different tools and toolboxes. The faster results advertised do not come close to covering the amount of time and energy wasted later trying to clean up the mess. Performance issues are rarely, if ever, caused by high demand; they are the sum of all the 'features' added through unusable tools.
ADO.NET provides consistent access to data sources such as SQL Server and XML, and to data sources exposed through OLE DB and ODBC. Data-sharing consumer applications can use ADO.NET to connect to these data sources and retrieve, handle, and update the data that they contain.
Entity Framework 6 (EF6) is a tried and tested object-relational mapper (O/RM) for .NET with many years of feature development and stabilization. An ORM like EF has the following advantages:
It lets developers focus on the business logic of the application, hugely reducing the amount of code.
It eliminates the need for repetitive SQL code and greatly benefits development speed.
It frees you from writing manual SQL queries; and many more.
In an n-tier application, it depends on the amount of data your application is handling and your database is managing. To my knowledge, DTOs don't kill performance. They are data containers for moving data between layers; they are only used to pass data and contain no business logic. They are mostly used in service classes. See DTO.
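For what it's worth, a DTO in this sense is just a shape for moving data; a minimal sketch (the CustomerDto shape and the mapping are invented):

```csharp
// A plain data container: no behaviour, no business logic, only the
// fields the next layer actually needs.
public class CustomerDto
{
    public int Id { get; set; }
    public string Name { get; set; }
    public string Email { get; set; }
}

// Typical hand-written mapping from an EF entity inside a service class:
// var dto = new CustomerDto { Id = c.Id, Name = c.Name, Email = c.Email };
```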
Using one DbContext per unit of work is generally considered the best practice.
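A minimal sketch of what that looks like in practice; MyDbContext, its Orders set, and orderId are hypothetical (DbSet&lt;T&gt;.Find is part of the DbContext API that shipped alongside EF 4.1):

```csharp
// One short-lived context per unit of work: create, do the work, dispose.
using (var db = new MyDbContext())           // hypothetical DbContext subclass
{
    var order = db.Orders.Find(orderId);     // load by primary key
    order.Status = "Shipped";
    db.SaveChanges();                        // one atomic unit of work
}   // disposing releases the connection and all tracked entities
```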
To my knowledge there is no real combination of EF + SPs (stored procedures). If you wish to use an ORM like EF together with SPs, try a micro-ORM like Dapper or BLToolkit; they were built for that purpose and are a heck of a lot faster than EF. Here is a good article on the Dapper ORM.
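As a taste of the micro-ORM style, a hedged Dapper sketch; the stored procedure and the Product class are hypothetical, while Query&lt;T&gt; is Dapper's extension method over IDbConnection:

```csharp
using System.Data;
using System.Data.SqlClient;
using Dapper;

using (var conn = new SqlConnection(connectionString))
{
    // One call: execute the stored procedure and map the rows to POCOs.
    var products = conn.Query<Product>(
        "dbo.GetProductsByCategory",
        new { CategoryId = 5 },
        commandType: CommandType.StoredProcedure);
}
```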
Here is a related thread on a similar topic: What is the difference between an orm and ADO.net?

What is the best multi-user database C# app approach?

I would like to know the best method for developing a multi-user C# app using SQL Server 2005 as the database. This is what I have in mind:
using NHibernate or Telerik's OpenAccess ORM
LINQ
using wrappers: all data from the tables is loaded into corresponding objects at startup, and from that point on only delete and update transactions affect the database.
...
I've looked at ORM tools, but in my opinion they generate a lot of code and I do not know if it's necessary.
What is the best solution with future changes to the application in mind?
If I chose the 3rd option, how can I ensure that only one user modifies a given row in a table (how can I lock a table row that is being modified)?
Any suggestions or reading material will help!
Thanks!
There are hundreds of ways to solve this, but don't discount ORM. Microsoft's Entity Framework is getting better with every revision. The framework 4.0 bits are pretty good and play extremely well with LINQ.
As for generated code vs. your own, try something like EntitySpaces... You have complete control over how the code gets generated, and the data access layer is extremely powerful and flexible (not to mention very easy to use). It also plays nicely with LINQ.
I have written a lot of data access code over the years. In the beginning, the ORM tools were rough around the edges and left a lot to be desired. These tools have gone through many iterations since and have become indispensable in my opinion. I can't imagine writing routine after routine that does the same basic CRUD. I did that for years and spent lots of time correcting hardcoded SQL and vow to avoid it at all costs from here on out.
As for concurrency / locking issues, that's a question unto itself. There are many ways to provide locking (the major categories being optimistic and pessimistic), and each has its pros and cons.
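To make the optimistic flavour concrete, here is a hedged sketch using a SQL Server rowversion column; the Accounts table and the variable names are invented. The update succeeds only if nobody else has touched the row since it was read:

```csharp
using System.Data;
using System.Data.SqlClient;

// conn is an open SqlConnection; originalRowVersion was read with the row.
using (var cmd = new SqlCommand(
           @"UPDATE Accounts
             SET    Balance = @balance
             WHERE  Id = @id AND RowVersion = @originalRowVersion", conn))
{
    cmd.Parameters.AddWithValue("@balance", newBalance);
    cmd.Parameters.AddWithValue("@id", accountId);
    cmd.Parameters.AddWithValue("@originalRowVersion", originalRowVersion);

    if (cmd.ExecuteNonQuery() == 0)   // 0 rows hit: someone got there first
        throw new DBConcurrencyException(
            "Row was changed by another user; reload and retry.");
}
```

Pessimistic locking, by contrast, holds a database lock for the duration of the edit, which scales poorly for interactive multi-user apps.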
If it's multi-user, do NOT do #3. The purpose of a DBMS is to handle the multi-user aspects for you; everything from transactions to access rights is built right in. Going down the path of mimicking that in your code will be difficult to get right. In the past, some "engines" like Borland's BDE and MS Access did this, and the end result is that you end up dealing with things like data corruption and consistency errors.
Never mind that as your database grows, that startup load is going to take exponentially longer.
We typically stay away from ORM tools for a number of reasons, mostly feature / benefit / security concerns. Of course, we are extremely well versed in SQL and can take advantage of the specific features a given db server can offer, which most ORMs can't do. We also tend to tweak the queries based on performance metrics after product release, which would force a recompile of an app for most ORMs. By staying away from this, we can let production DBAs do their job. That may or may not be a concern of yours.
That said, a lot of dev teams both like and successfully use the ones you spoke about. I would say to skip LINQ to SQL in favor of Entity Framework if you're going that route; LINQ to SQL has all but been replaced by EF.
Save yourself a load of effort and time and use an ORM. In terms of helping you decide which one, there is loads of information and opinion on the web (and Stack Overflow!) about which one to use, but that will depend on your application requirements (which you haven't described).
I like LINQ to SQL for small/mid-sized apps. It's quick, easy, and reasonably efficient. For bigger apps it will depend on what types of data transformations and design you have in mind, but LINQ to Entities or NHibernate are probably the most appropriate.

Which one can have better performance - LINQ to EF or NHibernate?

I am about to start working on a big project, and I have been researching the performance of LINQ to EF versus NHibernate. I want to use one of them as the ORM in my project. Now my question is: which of these two ORMs will give me better performance? I will use SQL Server 2008 as the database and C# as the programming language.
Neither one will have "better performance."
When analyzing performance, you need to look at the limiting factor. The limiting factor in this case will not be the ORM you choose, but rather how you use that tool, how you write your queries, and how you optimize the database backend.
Therefore, the "fastest" ORM will be the one which you can use correctly, coupled with the database server you best understand.
The ORM itself does have a certain amount of overhead, so the "fastest" option, in terms of sheer performance, is to use none at all. However, this favors the computer's time over your development time, which is typically not a good trade-off. ORMs can save large amounts of your development time while imposing only a small overhead when used correctly.
Typically, when people experience performance problems while using an ORM, it is because they are using the ORM incorrectly, rather than because they picked the "wrong" ORM.
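The classic example of using an ORM incorrectly is the N+1 query problem. A hedged EF sketch; the context and the Orders/Customer model are hypothetical:

```csharp
// context is a hypothetical EF ObjectContext with an Orders set.

// N+1: one query for the orders, then one extra round trip per order.
foreach (var order in context.Orders.ToList())
    Console.WriteLine(order.Customer.Name);   // lazy-loads each Customer

// Fixed: eager-load the association so it becomes a single joined query.
foreach (var order in context.Orders.Include("Customer").ToList())
    Console.WriteLine(order.Customer.Name);
```

Either ORM will be "slow" if it is made to issue hundreds of queries where one would do.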
We're currently using Fluent NHibernate on one of our projects (with web services, which add additional lag), and as far as I can see, data access is pretty much instantaneous (from a human perspective).
Maybe someone can provide answer with concrete numbers though.
Since these two ORMs are somewhat different, it would be better to decide which one to use based on your specific needs rather than on performance (which, like I said, shouldn't be a big deal).
Here's a nice benchmark. As you can see, the results depend on whether you are doing SELECTs, UPDATEs, or DELETEs.

Should I Use Entity Framework, DataSet or Custom classes?

I am really having a hard time here. I need to design a desktop app that will use WCF as the communications channel. It's a multi-tiered application (the DB and application server are the same; the client connects through the internet cloud).
The application is a little more complex (in terms of SQL and code logic) than the usual LOB application, but the concept is the same: read from the DB, update the DB, handle concurrency, etc. My problem is that now, with Entity Framework out in the open, I can't decide which way to proceed: should I use Entity Framework, DataSets, or custom classes?
As I understand Entity Framework, it will create the object mapping of my DB tables along with the CRUD scripts as well. That's all well and good for simple CRUD, but most of the time the "select" is complex and requires custom SQL. I understand I can use stored procedures in EF (I don't like SPs, by the way; I don't know why, but I like to code my SQL in the DAL by hand, and I feel more secure and comfortable that way).
With DataSets, I would use my custom SQL to populate the data set. With custom classes (objects for DB tables), I would populate those custom classes (collections, lists, etc.) from my custom SQL. I want to use EF, but I don't feel confident deploying an application whose SQL I have not written and cannot see in the code. Am I missing something here?
Any help in this regard would be greatly appreciated.
Xeshu
I would agree with Marc G. 100%: DataSets suck, especially in a WCF scenario (they add a lot of overhead for handling in-memory data manipulation), so don't use those. They're okay for beginners and small-scale two-tier desktop apps, maybe, but I wouldn't use them in a serious, professional app.
Basically, your question boils down to how you transform your rows from the database into something you can remote across WCF. This means some form of mapping. Either you do it yourself, using DataReaders and then shoving all the data into WCF [DataContract] classes; you can certainly do that, and it gives you ultimate control, but it's also tedious, cumbersome, and error-prone.
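A minimal sketch of that hand-rolled path; the CustomerData contract and the query are invented:

```csharp
using System.Collections.Generic;
using System.Data.SqlClient;
using System.Runtime.Serialization;

[DataContract]
public class CustomerData
{
    [DataMember] public int Id { get; set; }
    [DataMember] public string Name { get; set; }
}

public static class CustomerReader
{
    // Map each row into the WCF-serializable contract by hand.
    public static List<CustomerData> Load(SqlConnection openConn)
    {
        var results = new List<CustomerData>();
        using (var cmd = new SqlCommand("SELECT Id, Name FROM Customers", openConn))
        using (var reader = cmd.ExecuteReader())
        {
            while (reader.Read())
            {
                results.Add(new CustomerData
                {
                    Id = reader.GetInt32(0),
                    Name = reader.GetString(1)
                });
            }
        }
        return results;
    }
}
```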
Or you let some ready-made ORM handle this grunt work for you: take your pick amongst LINQ to SQL (great, easy to use, flexible, but SQL Server only), EF v4 (out by March 2010; looks very promising and very flexible), or any other ORM, really - whatever suits your needs best.
Other serious competitors in the ORM space might include Subsonic 3.0 and NHibernate (amongst many many others).
So to sum up:
forget about Datasets
either you keep 100% control and do the mapping between SQL and your objects yourself,
or you let some capable ORM handle that (LINQ to SQL, EF v4, Subsonic, NHibernate et al.); which one really doesn't matter all that much, i.e. it's also a matter of personal preference and coding style
I can't advocate DataSets, especially in an SOA environment like WCF; they'll work, but mostly for the wrong reasons. They simply aren't portable, and IMO don't really "work" over service boundaries. Of course, IMO they don't work in most other scenarios too ;-p
So then it comes down to how much plumbing you want to do. Most ORMs will create WCF-serializable types for you; personally, I'd use LINQ to SQL at the moment, since it is both simpler and more complete than EF, although EF 4.0 is meant to be much better than the EF in 3.5 SP1. You can use custom T-SQL (via ExecuteQuery, which still does the mapping back to objects), but I tend to use either a sproc (for complex queries) or LINQ-generated queries (for simple requests).
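For reference, the custom-TSQL option looks roughly like this. ExecuteQuery is LINQ to SQL's DataContext method and takes numbered placeholders; MyDataContext and Customer are hypothetical:

```csharp
using (var db = new MyDataContext(connectionString))
{
    // Hand-written SQL, but the rows still come back as mapped objects.
    var customers = db.ExecuteQuery<Customer>(
        "SELECT Id, Name FROM Customers WHERE City = {0}", "London");
}
```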
Writing the types yourself is fine too, and will work with NHibernate etc. So many options.
While EF works with WCF and sounds very promising, you should consider the effort to get up to speed with it. Especially when doing non-trivial work, the designer in VS2008 can't open the model anymore and you have to edit your model as XML.
Also keep in mind that EF works at a very high abstraction level. Because of the law of leaky abstractions, it's not as shiny as it's supposed to be :)
The flip side of that is that you have to deal with very crazy, hard-to-read SQL statements sent to your database when it comes to troubleshooting performance issues.

Should I start using LINQ To SQL?

Currently I am using NetTiers to generate my data access layer and service layer. I have been using NetTiers for over 2 years and have found it to be very useful. At some point I need to look at LINQ, so my questions are...
Has anyone else gone from NetTiers to LINQ To SQL?
Was this switch over a good or bad thing?
Is there anything that I should be aware of?
Would you recommend this switch?
Basically I would welcome any thoughts.
No
See #1
You should beware of the standard abstraction overhead. Also, it's very SQL Server-based in its current state.
Are you using SQL Server? Then maybe. If you are using LINQ for other things right now, like over XML data (great), object data, or DataSets, then yes, you could switch to get a uniform data syntax for all of them. Like lagerdalek mentioned, if it ain't broke, don't fix it.
From a quick look at the .netTiers Application Framework, I'd say that if you already have an investment in that solution, it seems to give you much more than a simple data access layer, and you should stick with it.
From my experience, LINQ to SQL is a good solution for small-to-medium-sized projects. It is an ORM, which is a great way to enhance productivity. It also gives you another layer of abstraction that will allow you to change out the layer underneath for something else. The designer in Visual Studio (and, I believe, VS Express also) is very easy and simple to use, giving you the common drag-drop and property-based editing of the object mappings.
# Jason Jackson - The designer does let you add properties by hand; however, you need to specify the attributes for that property. You do this once, and it might take 3 minutes longer than the initial dragging of the table into the designer, but it is only necessary once per change in the database itself. This is not too different from other ORMs; however, you are correct that they could make this much easier, and find only those properties that have changed, or even implement some kind of refactoring tool for such needs.
Resources:
Why use LINQ to SQL?
Scott Guthrie on LINQ to SQL
10 Tips to Improve your LINQ to SQL Application Performance
LINQ To SQL and Visual Studio 2008 Performance Update
Performance Comparisons LINQ to SQL / ADO / C#
LINQ to SQL 5 Minute Overview
Note that Parallel LINQ is being developed to allow for much greater performance on multi-core machines.
I tried to use Linq to SQL on a small project, thinking that I wanted something I could generate quickly. I ran into a lot of problems in the designer. For example, anytime you need to add a column to a table you basically have to remove and re-add the table definition in the designer. If you have set any properties on the table then you have to re-set those properties. For me this really slowed down the development process.
LINQ to SQL itself is nice. I really like the extensibility. If they can improve the designer I might try it again. I think that the framework would benefit from a little more functionality aimed at a disconnected model like web development.
Check out Scott Guthrie's LINQ to SQL series of blog posts for some great examples of how to use it.
NetTiers is very good for generating a heavy and robust DAL, and we use it internally for core libraries and frameworks.
As I see it, LINQ (in all its incarnations, but specifically LINQ to SQL, which I think is what you're asking about) is fantastic for quick data access, and we generally use it for more agile cases.
Both technologies are quite inflexible to change without regenerating the code or the dbml layer.
That being said, used properly, LINQ 2 SQL is quite a robust solution, and you might even start using it for future development due to its ease of use, but I wouldn't throw away your current DAL for it - if it ain't broke...
My experience tells me that by using LINQ you can get things done faster; however, the actual calls to the database are slower.
So if you have a small database, I'd say go for it. If not, I would wait for some improvements before switching.
I'm using LINQ to SQL on a fairly large project right now (about 150 tables), and it is working out very well for me. The last ORM I used was iBATIS, and it worked well but took a lot of legwork to get the mappings done. LINQ to SQL performs very well for me and so far has proved to be very easy to use out of the box. There are definitely some differences to overcome in the transition, but I would recommend its use.
As a side note, I have never used or read about NetTiers, so I won't discount its effectiveness, but LINQ to SQL in general has proven to be an extremely viable ORM.
Our team used to use NetTiers and found it to be useful. BUT... the more we used it, the more we found headaches and pain points with it. For example, any time you make a change to the database, you need to re-generate the DAL with CodeSmith, which involved:
re-generating thousands of lines of code in 3 separate projects
re-generating hundreds of stored procedures
Maybe there are other ways of doing it, but this is what we had to do. The re-gen of the source code was OK; scary, but OK. The real issue came with the stored procedures. It didn't clean up any unused stored procedures, so if you removed a table from your schema and re-gened your DAL, the stored procedures for that table did not get removed. This also became quite a headache for database change scripts, where we had to compare the old database structure to the new one and create a change script to update client installations. That script could run into the tens of thousands of lines of SQL code, and if there was an issue executing it, which there invariably was, it was quite a pain to resolve.
Then the light came on: NHibernate as the ORM. It certainly has a ramp-up time to it, but it is well worth it. There is a ton of support for it, so if there's something you need done, more than likely it's been done before. It is extremely flexible and allows you to control every aspect of it and then some. It is also becoming easier and easier to use. Fluent NHibernate is up and coming as a great way to get rid of the XML mapping files that are needed, and NHibernate Profiler provides an excellent interface to see what's going on behind the scenes to increase efficiency and remove redundancy.
Moving from NetTiers to NHibernate has been painful, but in a good way. It has forced us to move to a better architecture and re-evaluate functional needs. NetTiers provided tons of data access code (get this entity by its id, get this other entity by its foreign key, get a TList and VList of this and that), but most of it was unnecessary and unused. NHibernate with a generic repository, and custom repositories only where needed, reduced tons of unused code and really increased readability and reliability.
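For flavour, the generic repository mentioned above can be as small as this sketch; session management and the entity types are left out, and ISession.Get/Save/Delete are standard NHibernate calls:

```csharp
using NHibernate;

// One generic repository covers the routine cases; custom repositories
// are added only where a query is genuinely specific.
public class Repository<T> where T : class
{
    private readonly ISession _session;

    public Repository(ISession session)
    {
        _session = session;
    }

    public T GetById(object id)
    {
        return _session.Get<T>(id);
    }

    public void Save(T entity)
    {
        _session.Save(entity);
    }

    public void Delete(T entity)
    {
        _session.Delete(entity);
    }
}
```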
