So we have this web service that uses a homemade data access framework, and I found that in its current state the web service cannot run more than one instance at a time, because the framework starts stepping on its own feet and whining about connections being closed/already opened and errors like that.
So I implemented a SQL lock/mutex that queues all requests, and since then it's been pretty smooth.
I recently worked on another project that uses the ADO.NET Entity Framework (which I had never played with until then) and found out it pretty much does what this homemade framework does.
My question is: is the ADO.NET Entity Framework robust enough on its own that I would not need this SQL mutex implementation anymore?
Thanks.
If you follow the rule "Do not share ObjectContext (DbContext in Code First) instances between threads", everything will be OK.
Entity Framework uses some static data to improve performance (the entity model cache), but most of its objects (entity connections, contexts, change trackers, etc.) are not thread-safe and shouldn't be shared between threads.
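A minimal sketch of that rule (assuming a hypothetical OrdersContext derived from DbContext with a DbSet<Order>): each operation creates and disposes its own context instead of sharing one instance.

    // Hypothetical OrdersContext : DbContext with a DbSet<Order> Orders.
    public Order LoadOrder(int orderId)
    {
        using (var context = new OrdersContext())   // one context per operation
        {
            return context.Orders.Find(orderId);    // nothing is shared between threads
        }
    }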
Yes, it is robust enough to do that, provided you don't share DbContexts between threads, which your homebrew layer must be doing. Not a way I'd have gone.
Related
EF is such widely used stuff, but I don't understand how I should use it. I've run into a lot of issues with EF on different projects with different approaches, so some questions have come together in my head, and the answers lead me toward using pure ADO.NET with stored procedures.
So the questions are:
How to deal with EF in n-tier application?
For example, we have some DAL with EF. I've seen a lot of articles and projects that use the repository and unit of work patterns as some kind of abstraction over EF. I think such an approach kills most of the benefits that increase development speed and leads to a few things:
remapping of EF query results into some DTO, which kills performance (call some select to get table data - first loop; second loop - map the results to some composite type generated by EF; next, filter the mapped data using LINQ; and at last map it to some DTO - see the sketch after this list). This remapping to DTOs kills one of EF's biggest benefits;
or
it leads to strong coupling between EF (and its version) and the app. It will be something like a 2-tier app: DAL and presentation-with-BLL, or DAL-with-BLL and presentation. I guess that's not best practice. And we have the same loading process as in the previous point, except the mapping, so the performance issue is raised again. We could try to use EF as the DAL without any abstraction over it, but we would get similar issues in some other way.
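A rough sketch of the remapping chain from the first point above (hypothetical MyEfContext, Order and OrderDto types, purely to illustrate the complaint):

    List<OrderDto> LoadOrderDtos(MyEfContext context)
    {
        // loop 1: materialize EF entities
        var entities = context.Orders.Where(o => o.IsActive).ToList();

        return entities
            .Where(o => o.Total > 100)                                 // extra in-memory filtering with LINQ
            .Select(o => new OrderDto { Id = o.Id, Total = o.Total })  // loop 2: copy everything into DTOs
            .ToList();
    }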
Should I use one context per app/thread/atomic operation? One context per app/thread may slightly increase performance and the ability to use navigation properties, but we meet another problem: updating this context and the ever-growing data loaded into it. I'm also not sure about concurrency with one DbContext per app/thread. Using a context per operation will lead us to remapping EF results to our DTOs. So you see that we're pushed back to question no. 1 again.
Could we try to use EF + stored procedures only? Again we have the issues from the previous questions. What is the reason to use EF if the biggest part of its functionality will not be used?
So yes, EF is great to start a project. It's so convenient when we have a few screens and CRUD operations.
But what next?
All this text is just unsorted thoughts. I know that pure ADO.NET will lead to another kind of challenge.
So, what is your opinion about this topic?
By following the naming convention, you will find it's called ADO.NET Entity Framework, which means that Entity Framework sits on top of ADO.NET, so it can't be faster. The two may perform in equal time, but let's look at what EF provides:
You will no longer get stuck writing queries without any clue whether what you're writing is going to compile or not.
It lets you rely on C# (or your favorite .NET language) to write the data constraints you wish to accept from the user directly inside your model classes.
Finally: EF and LINQ give you a lot of power for maintaining your applications later.
There are three different models with the Entity Framework: Model First, Database First, and Code First; get to know each of them.
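For instance, a minimal (hypothetical) Code First model shows how the constraints mentioned above live directly inside the model class:

    using System.ComponentModel.DataAnnotations;
    using System.Data.Entity;   // EF 6

    // Hypothetical Code First model: validation constraints sit on the class itself.
    public class Product
    {
        public int Id { get; set; }

        [Required]
        [StringLength(100)]
        public string Name { get; set; }

        [Range(0, 10000)]
        public decimal Price { get; set; }
    }

    public class ShopContext : DbContext
    {
        public DbSet<Product> Products { get; set; }
    }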
As for the point about remapping killing performance: on the first run, EF loads metadata into memory, and that takes time because it builds an in-memory representation of the model from the EDMX file.
ADO.NET is an object-oriented framework that allows you to interact with database systems (SQL Server, Oracle, etc.).
Entity Framework is a technique for manipulating data in databases, something like a collection of queries (insert into a table, select * from a table, and so on).
It is used with LINQ.
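For example (purely a sketch, assuming a hypothetical ShopContext with a Products set), a LINQ query that EF translates into SQL for you:

    using (var db = new ShopContext())
    {
        // Roughly equivalent to: SELECT ... FROM Products WHERE Price > 10 ORDER BY Name
        var cheapProducts = db.Products
            .Where(p => p.Price > 10)
            .OrderBy(p => p.Name)
            .ToList();
    }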
Entity Framework is not efficient in any case, as with most tools or toolboxes designed to achieve 'faster' results.
Access to the database should be viewed as a separate tier, using stored procedures as the interface. There is no reason for any application to have more than the absolutely required CRUD operations; less is more. Stored procedures are easy to write, secure, and maintain, and are de facto the fastest way. It's easy to write tools that generate the desired POCO and DbContext code from stored procedures.
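A minimal sketch of that idea, assuming a hypothetical dbo.GetCustomerOrders stored procedure and plain ADO.NET as the calling code:

    using System.Data;
    using System.Data.SqlClient;

    public static DataTable GetCustomerOrders(string connectionString, int customerId)
    {
        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand("dbo.GetCustomerOrders", connection))
        {
            command.CommandType = CommandType.StoredProcedure;      // the SP is the interface
            command.Parameters.AddWithValue("@CustomerId", customerId);

            var table = new DataTable();
            new SqlDataAdapter(command).Fill(table);                // opens/closes the connection itself
            return table;
        }
    }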
A well designed application should have a limited number of connection strings to the database, and none of them should be the almighty God account. Use schemas to support connection rights.
Lazy loading is a false promise added to solve a problem that should never exist, introduced with ORMs and their plug-and-play features. Data should only be read when needed. Developers should be responsible for implementing this logic based on the application context.
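In EF terms, that usually means turning lazy loading off and loading related data deliberately; a hedged sketch with a hypothetical OrdersContext and an Order/Lines relationship:

    using (var db = new OrdersContext())
    {
        db.Configuration.LazyLoadingEnabled = false;   // nothing is fetched behind your back

        int orderId = 42;                              // hypothetical key
        var order = db.Orders
            .Include("Lines")                          // the lines are loaded here, on purpose
            .Single(o => o.Id == orderId);
    }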
If your application logic has a problem maintaining state, no tool will help. It will, in fact, make it worse by covering up the real problem until it's too late.
Database first is the only solution for a well designed application. Civilization realized a long time ago the importance of solid aqueducts and sewer systems. High-level code can and will be replaced at any time, but data stays. Rewriting an entire application is a matter of days if the database is well designed.
Applications are just glorified database access. Still true in most cases.
This is my conclusion after many years in business applications, debugging through code produced by many different tools and toolboxes. The faster results advertised are not even close to covering the amount of time and energy wasted later trying to clean up the mess. Performance issues are rarely, if ever, caused by high demand, but rather by the sum of all the 'features' added through unusable tools.
ADO.NET provides consistent access to data sources such as SQL Server and XML, and to data sources exposed through OLE DB and ODBC. Data-sharing consumer applications can use ADO.NET to connect to these data sources and retrieve, handle, and update the data that they contain.
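In its most basic form that looks like the following sketch (hypothetical connection string and Products table):

    using System;
    using System.Data.SqlClient;

    using (var connection = new SqlConnection("Server=.;Database=Shop;Integrated Security=true"))
    using (var command = new SqlCommand("SELECT Id, Name FROM Products", connection))
    {
        connection.Open();
        using (var reader = command.ExecuteReader())
        {
            while (reader.Read())
            {
                Console.WriteLine("{0}: {1}", reader.GetInt32(0), reader.GetString(1));
            }
        }
    }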
Entity Framework 6 (EF6) is a tried and tested object-relational mapper (O/RM) for .NET with many years of feature development and stabilization. An ORM like EF has the following advantages:
An ORM lets developers focus on the business logic of the application, thereby facilitating a huge reduction in code.
It eliminates the need for repetitive SQL code and provides many benefits to development speed.
It saves you from writing manual SQL queries; and much more.
In an n-tier application, it depends on the amount of data your application is handling and your database is managing. To my knowledge, DTOs don't kill performance. They are data containers for moving data between layers, are only used to pass data, and do not contain any business logic. They are mostly used in service classes. See DTO.
One DbContext is always a best practice.
There is no such combination of EF + SPs (stored procedures) as far as I know. If you wish to use an ORM like EF and SPs at the same time, try micro-ORMs like Dapper, BLToolkit, etc. They were built for that purpose and are a heck of a lot faster than EF. Here is a good article on the Dapper ORM.
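As a rough sketch of what that looks like with Dapper (hypothetical connection string, Product class and Products table; the SQL could just as well be a stored procedure name with CommandType.StoredProcedure):

    using System.Data.SqlClient;
    using Dapper;

    using (var connection = new SqlConnection(connectionString))
    {
        var products = connection.Query<Product>(
            "SELECT Id, Name, Price FROM Products WHERE Price > @minPrice",
            new { minPrice = 10 });   // rows are mapped straight onto the POCO
    }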
Here is a related thread on a similar topic: What is the difference between an orm and ADO.net?
I'm looking for a reference implementation of the "unit of work" and "repository" patterns for MS SQL Server or plain old ADO.NET. But all the samples are built around an existing context like Linq2SQL or EF. As I understand it, these technologies themselves already almost implement these patterns.
But how do I deal with a "plain" SQL repository without any context and SaveChanges() method? Is the right way to use TransactionScope? For example, collect all SQL operations in a list of commands and then simply execute them one after another within a transaction scope... or is that too simple?
Why am I looking for this? I have the task of building a data layer that can deal with both an ancient Sybase database and SQL Server (maybe additionally in conjunction with a POCO-based EF4 component).
For this, my idea is to create an abstraction layer with the repository and unit of work patterns and create a different implementation for each technology.
Update:
I was on vacation last week; sorry for the delay. Today I built up a basic picture of my architecture for this: [link] (s7.directupload.net/file/d/2570/whb7ulbs_jpg.htm). My idea is to create a simple object context, like the EF ObjectContext, that exists in parallel to the EF context and is used by my repository. This context collects atomic SQL operations in a kind of stack and executes them within a transaction in the unit of work part. Good idea? Bad idea? Hard to do? I'm looking forward to your views on this.
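If it helps, here is one rough way that stack-of-operations idea could look in plain ADO.NET (all class names hypothetical, SQL Server flavour; a Sybase implementation would swap the provider types):

    using System;
    using System.Collections.Generic;
    using System.Data.SqlClient;

    // Queues atomic SQL operations and plays them back inside one transaction.
    public class SqlUnitOfWork
    {
        private readonly string _connectionString;
        private readonly List<Action<SqlConnection, SqlTransaction>> _work =
            new List<Action<SqlConnection, SqlTransaction>>();

        public SqlUnitOfWork(string connectionString)
        {
            _connectionString = connectionString;
        }

        public void Enqueue(string sql, params SqlParameter[] parameters)
        {
            _work.Add((connection, transaction) =>
            {
                using (var command = new SqlCommand(sql, connection, transaction))
                {
                    command.Parameters.AddRange(parameters);
                    command.ExecuteNonQuery();
                }
            });
        }

        public void SaveChanges()
        {
            using (var connection = new SqlConnection(_connectionString))
            {
                connection.Open();
                using (var transaction = connection.BeginTransaction())
                {
                    foreach (var step in _work)
                        step(connection, transaction);
                    transaction.Commit();   // an exception above leaves the transaction uncommitted
                }
            }
            _work.Clear();
        }
    }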
I don't envy your task; supporting multiple backend databases in your application is going to be tricky.
Here's an example of a Unit Of Work pattern using ASP.NET MVC and LightSpeed: link
Personally, I would use EF or NHibernate (prefer EF); SQL Anywhere supports ADO.NET and Entity Framework, so (ideally) you wouldn't need to do anything special to support that database.
Good luck!
If you are just worried about transaction scope, let me point you towards the System.Transactions library and its TransactionScope class. Great class. Any SQL or other transaction-managed system that is manipulated on the same thread that instantiated the transaction scope will be automatically enlisted in the transaction. That way, if any part of the code fails and throws an exception, you can simply not call scope.Complete(), and all the operations within the transaction scope are rolled back. Very nice class.
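A small sketch of that behaviour (hypothetical connection string and Accounts table):

    using System.Data.SqlClient;
    using System.Transactions;

    using (var scope = new TransactionScope())
    {
        using (var connection = new SqlConnection(connectionString))
        {
            connection.Open();   // enlists in the ambient transaction automatically

            new SqlCommand("UPDATE Accounts SET Balance = Balance - 10 WHERE Id = 1", connection)
                .ExecuteNonQuery();
            new SqlCommand("UPDATE Accounts SET Balance = Balance + 10 WHERE Id = 2", connection)
                .ExecuteNonQuery();
        }

        scope.Complete();   // skip this (e.g. on an exception) and everything rolls back
    }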
As I began writing web applications with ASP.NET I started with small projects that used a Linq-To-SQL mapper for database access to a MSSQL Server.
After gaining some experience, I switched to a classic three-tiered approach with a GUI layer, a business layer, and a data layer. The only function of the data layer was to provide insert/update/delete methods without any logic, plus selection methods.
Over time I realized that it would be better not to expose the database classes up to the GUI (it took some time, unfortunately). I switched to using business classes in the BL that are used for all operations performed by the BL and displayed by the GUI, which gets lists of them from the business layer.
A great advantage is that I can provide additional properties that are not represented by the database itself. However, I did that mapping inside the business layer myself with methods that mapped the corresponding business layer class to the database class.
I guess that's where O/R mappers come in handy? Until now, I hadn't realized their purpose, but I think I've just found it. I've recently tried out the new Entity Framework with .NET Framework 4, but I'm only using it like the Linq-To-SQL DataContext.
Is there a way to achieve the mapping automatically? If yes, is that something the new Entity Framework provides, or do I need to look for an O/R mapper like NHibernate?
I use NHibernate exclusively in my projects. I like the control and flexibility it gives me. There is a 'shortcut' called Active Record that uses NHibernate under the covers but provides a really nice and simple interface to it.
NHibernate has a steep learning curve, but when you get past that - it is really smooth sailing. When (and if) you venture the way of NHibernate, check out Ayende for cool tips.
(Entity Framework is an O/R Mapper.)
If you're serious about getting your hands dirty with ORM (but relatively new to that area), I highly recommend something like TekPub's videos on these topics. You'll be able to see these tools in use starting from scratch. It is a graceful introduction to some simple, but real-world issues like the ones you mention.
LinqToSql is an ORM, so you are already using one. Taking LinqToSql out and replacing it with EntityFramework or NHibernate won't solve the problems you appear to be having right now.
Here are some things you should learn more about to help give you additional context:
AutoMapper (a quick sketch follows this list)
Data Transfer Objects (DTOs)
Plain Old CLR Object (POCO)
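As a taste of the first item, a rough AutoMapper sketch (hypothetical ProductEntity/ProductDto types and productEntity instance; the exact API varies a little between AutoMapper versions):

    using AutoMapper;

    var config = new MapperConfiguration(cfg => cfg.CreateMap<ProductEntity, ProductDto>());
    var mapper = config.CreateMapper();

    ProductDto dto = mapper.Map<ProductDto>(productEntity);   // copies matching properties automatically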
I've had a great time using Entity Framework 4.0 (+ the CTP). I think you'd have a much easier time dealing with an ORM like that. EF4 provides everything you need to interoperate with MSSQL from C#/.NET. You won't have to write a single line of SQL, and it has full support for LINQ (through ObjectQuery).
I'd like to know if it's possible to have both Linq-to-SQL and Entity Framework running side-by-side. Our current configuration is Linq-to-SQL and we'd like to eventually move to EF. But there's just too much going on in the Linq-to-SQL side right now and we'd like to do it in phases.
So, is there any chance we can just start writing new stuff in Entity Framework but leave the older stuff running as-is? And is it worth it?
There's nothing stopping you from using the two technologies in the same project, but you can't share the contexts or the models between the two.
Assuming you are willing to pay the price for the differences, there is nothing to stop you from using both in your project.
When I last worked in programming, we were trying to move away from DataReaders and the traditional ADO.NET API toward Object Relational Mapping (ORM).
To do this, we generated a DataContext of our DB via sqlmetal. There was then a thin data layer that made the DataContext private, and any code needing to access the database would have to use a public method in this thin data layer. These methods were basically stored procedures; they would perform queries on the database via LINQ to SQL.
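Roughly, such a method looked like this (hypothetical ShopDataContext generated by sqlmetal and a hypothetical Products table):

    using System.Collections.Generic;
    using System.Linq;

    public static class ProductData
    {
        public static List<Product> GetActiveProducts()
        {
            using (var db = new ShopDataContext())   // the DataContext never leaves this layer
            {
                return db.Products
                         .Where(p => p.IsActive)
                         .ToList();
            }
        }
    }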
Is this a common approach today? I mean, is everyone who's using the .NET 3.5 framework really running sqlmetal in their build process, or what? It almost seemed like a hack at the time.
Basically, I'd like to know if LINQ to SQL and sqlmetal are what to expect if I'm going to write a DAL today at a .NET 3.5 shop that doesn't employ a third-party, open-source ORM.
It is still considered best practice to have some sort of data access layer. Whether this is best achieved with an ORM is a heavily debated issue. One faction generally argues that ORMs are the way to go; another argues that stored procedures and a database-centric approach are the best route.
Also, this may not be exactly the poster you meant, but it's similar (and also the one in my cubicle):
http://download.microsoft.com/download/4/a/3/4a3c7c55-84ab-4588-84a4-f96424a7d82d/NET35_Namespaces_Poster_LORES.pdf
Your approach is good. I currently use Astoria services (ADO.NET Data Services). There was a nice introduction to this in MSDN Magazine.
I also like the new PLINQO (requires CodeSmith Tools though). This is very slick in my opinion.
When I have such a DAL (service layer), I just consume this service from my client application (Silverlight or ASP.NET MVC).
I think it depends on your use, but I'd say that with such a thin data layer as you described, that would be your DAL. Most projects will build another layer on top of that, mainly for edit/create logic and maybe some stitching logic for gets.
For most of my projects I design it like this.
Repository holds the instance of DataContext and exposes some basic add/delete methods
ProductRepository : Repository exposes general queries (IQueryable)
StoreService uses instances of different repositories, like ProductRepository and SalesRepository, and handles all the logic for creating something like a product.
So something like...
StoreService.CreateProduct(/* properites */)
This would return some sort of result class.
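In rough, hedged code (all types hypothetical, LINQ to SQL mapping attributes omitted), the shape is something like:

    using System.Data.Linq;
    using System.Linq;

    public class Product { public int Id { get; set; } public string Name { get; set; } public decimal Price { get; set; } }
    public class CreateProductResult { public bool Succeeded { get; set; } public Product Product { get; set; } }

    public class Repository<T> where T : class
    {
        protected readonly DataContext Context;
        public Repository(DataContext context) { Context = context; }

        public void Add(T entity)    { Context.GetTable<T>().InsertOnSubmit(entity); }
        public void Delete(T entity) { Context.GetTable<T>().DeleteOnSubmit(entity); }
    }

    public class ProductRepository : Repository<Product>
    {
        public ProductRepository(DataContext context) : base(context) { }

        public IQueryable<Product> All() { return Context.GetTable<Product>(); }
    }

    public class StoreService
    {
        private readonly ProductRepository _products;
        public StoreService(ProductRepository products) { _products = products; }

        public CreateProductResult CreateProduct(string name, decimal price)
        {
            var product = new Product { Name = name, Price = price };
            _products.Add(product);
            // the commit (DataContext.SubmitChanges) happens in the surrounding unit of work
            return new CreateProductResult { Succeeded = true, Product = product };
        }
    }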
The best data layer is the one that is plain and simple and gets the job done without any bells and whistles. I have used the technologies you mentioned and written about them here:
The Only Pattern for Data Access is - There Are No Patterns for Data Access
This very site uses LINQ to SQL, so take that as you will.
Officially, Microsoft is supporting Entity Framework over LINQ to SQL in terms of new development. However, there's a vocal group of people who think EF is the wrong way to go. LINQ to SQL will still be around for some time, and is a very decent ORM, if somewhat limiting in terms of which DB backend you can use.
I would recommend LINQ as a great starting point for your ORM. If you need better, look into EF and/or NHibernate.
"Is this a common approach today? I mean, is everyone whose using the .NET 3.5 framework really running sqlmetal in their build process, or what?"
The people I know using the 3.5 Framework (and that's just about everyone) - the vast majority - are still using NHibernate. Version 2.0 is a very nice OR/M. I started using it on a recent project and it cut my data access code down significantly, to the point where I really don't want to use anything else in the future. And the Fluent NHibernate API is making some headway for folks who don't like the XML mapping.
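For the curious, a rough Fluent NHibernate sketch (hypothetical Product class and Products table) of what replaces the XML mapping file:

    using FluentNHibernate.Mapping;

    public class Product
    {
        public virtual int Id { get; set; }
        public virtual string Name { get; set; }
        public virtual decimal Price { get; set; }
    }

    public class ProductMap : ClassMap<Product>
    {
        public ProductMap()
        {
            Table("Products");
            Id(x => x.Id);
            Map(x => x.Name);
            Map(x => x.Price);
        }
    }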