I'm looking for a reference implementation of the "Unit of Work" and "Repository" patterns for MS SQL Server or plain old ADO.NET. But all the samples are built around an existing context like Linq2SQL or EF. According to my understanding, these technologies themselves almost implement these patterns.
But how do I deal with a "plain" SQL repository without any context and SaveChanges() method? Is the right way to use TransactionScope? For example, collect all SQL operations in a list of commands and then simply execute them one after another within a TransactionScope... or is this too simple?
Why am I looking for this? I have the task of building a data layer that can deal with both an ancient Sybase database and SQL Server (maybe additionally in conjunction with a POCO-based EF4 component).
For this, my idea is to create an abstraction layer with the Repository and Unit of Work patterns and create different implementations for each technology.
Update:
I was on vacation last week; sorry for the delay. Today I put together a basic picture of my architecture for this: [link] (s7.directupload.net/file/d/2570/whb7ulbs_jpg.htm). My idea is to create a simple object context, similar to the EF ObjectContext, that exists in parallel to the EF context and is used by my repository. This context collects atomic SQL operations in a kind of stack and executes them within the transaction in the Unit of Work part. Good idea? Bad idea? Hard to do? I'm looking forward to your views on this.
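Here is a rough sketch of what I mean in plain ADO.NET; the names (SqlUnitOfWork, Enqueue, Commit) are just placeholders for the idea, not a finished design:

```csharp
using System;
using System.Collections.Generic;
using System.Data.SqlClient;
using System.Transactions;

// Collects atomic SQL operations and plays them back inside one transaction,
// as a stand-in for an EF-style context with SaveChanges().
public class SqlUnitOfWork
{
    private readonly string _connectionString;
    private readonly Queue<Action<SqlConnection>> _work = new Queue<Action<SqlConnection>>();

    public SqlUnitOfWork(string connectionString)
    {
        _connectionString = connectionString;
    }

    // Repositories enqueue their operations here instead of executing them immediately.
    public void Enqueue(Action<SqlConnection> operation)
    {
        _work.Enqueue(operation);
    }

    // The "SaveChanges" equivalent: run every queued operation inside one TransactionScope.
    public void Commit()
    {
        using (var scope = new TransactionScope())
        using (var connection = new SqlConnection(_connectionString))
        {
            connection.Open(); // enlists in the ambient transaction
            while (_work.Count > 0)
                _work.Dequeue()(connection);
            scope.Complete(); // never reached if an operation throws, so everything rolls back
        }
    }
}
```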
I don't envy your task; supporting multiple backend databases in your application is going to be tricky.
Here's an example of a Unit Of Work pattern using ASP.NET MVC and LightSpeed: link
Personally, I would use EF or NHibernate (prefer EF); SQL Anywhere supports ADO.NET and Entity Framework, so (ideally) you wouldn't need to do anything special to support that database.
Good luck!
If you are just worried about transaction scope, let me point you towards the System.Transactions library and its TransactionScope class. Great class. Any SQL or other transaction-managed system that is manipulated on the same thread that instantiated the transaction scope will be automatically added to the transaction. That way, if any part of the code fails and throws an exception, you can just not call the scope.Complete() method and all the operations within the transaction scope are rolled back. Very nice class.
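For example (connectionString and the Accounts statements below are just placeholders):

```csharp
using System.Data.SqlClient;
using System.Transactions;

public static class TransferExample
{
    public static void TransferTenDollars(string connectionString)
    {
        using (var scope = new TransactionScope())
        using (var connection = new SqlConnection(connectionString))
        {
            connection.Open(); // the connection enlists in the ambient transaction automatically

            using (var debit = new SqlCommand(
                "UPDATE Accounts SET Balance = Balance - 10 WHERE Id = 1", connection))
            {
                debit.ExecuteNonQuery();
            }

            using (var credit = new SqlCommand(
                "UPDATE Accounts SET Balance = Balance + 10 WHERE Id = 2", connection))
            {
                credit.ExecuteNonQuery();
            }

            // If an exception is thrown before this line, Complete() is never called
            // and both updates are rolled back.
            scope.Complete();
        }
    }
}
```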
EF is such widely used stuff, but I can't figure out how I should use it. I've met a lot of issues with EF on different projects with different approaches. So some questions have come together in my head, and the answers lead me towards pure ADO.NET with stored procedures.
So the questions are:
How to deal with EF in n-tier application?
For example, we have some DAL with EF. I've seen a lot of articles and projects that use the repository and unit of work patterns as some kind of abstraction over EF. I think such an approach kills most of the benefits that increase development speed and leads to a few things:
remapping EF query results into DTOs, which kills performance (call some select to get table data: first loop; second loop, map the results to some composite type generated by EF; next, filter the mapped data using LINQ; and at last, map it to some DTO). It is exactly this remapping to DTOs that kills one of EF's biggest benefits;
or
strong coupling between EF (and its version) and the app. It will be something like a 2-tier app: DAL plus presentation-with-BLL, or DAL-with-BLL plus presentation. I guess that's not best practice. And the loading process is the same as in the previous point, minus the mapping, so the performance issue comes up again. We could try to use EF as the DAL without any abstraction over it, but then we get similar issues in some other way.
Should I use one context per app, per thread, or per atomic operation? One context per app or per thread may slightly increase performance and make navigation properties usable, but we meet other problems: keeping that context up to date and the ever-growing amount of data loaded into it, and I'm also not sure about concurrency with one DbContext per app or per thread. Using a context per operation leads us back to remapping EF results into our DTOs. So you see we are pushed back to question no. 1 again.
Could we try to use EF + stored procedures only? Again, we have the issues from the previous questions. What is the reason to use EF if the biggest part of its functionality will not be used?
So yes, EF is great to start a project with. It's so convenient when we have a few screens and CRUD operations.
But what next?
All this text is just unsorted thoughts. I know that pure ADO.NET will lead to other kinds of challenges.
So, what is your opinion about this topic?
If you follow the naming conventions, you will find it's called the ADO.NET Entity Framework, which means Entity Framework sits on top of ADO.NET, so it can't be faster; at best it performs in equal time. But look at what EF provides:
You will no longer get stuck writing queries without any clue as to whether what you're writing is going to compile or not.
It lets you rely on C#, or your favorite .NET language, to write the data constraints you wish to accept from the target user directly inside your model classes (see the sketch after this list).
Finally, EF and LINQ give you a lot of power for maintaining your applications later.
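To illustrate the point about constraints living in your model classes: with Code First you can express them as data annotation attributes (the Customer class here is invented for the example):

```csharp
using System.ComponentModel.DataAnnotations;

// Constraints sit on the model itself; EF validates them before saving,
// and they double as documentation of what the target user may enter.
public class Customer
{
    public int Id { get; set; }

    [Required, StringLength(100)]
    public string Name { get; set; }

    [Range(0, 120)]
    public int Age { get; set; }
}
```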
There are three different workflows with Entity Framework: Model First, Database First, and Code First. Get to know each of them.
The point about remapping killing performance: it happens because, on the first run, EF loads metadata into memory, and that takes time as it builds an in-memory representation of the model from the EDMX file.
ADO.NET is an object-oriented framework that allows you to interact with database systems (SQL Server, Oracle, etc.).
Entity Framework is a technique for manipulating data in databases without hand-writing the queries yourself (insert into a table, "select * from ...", and so on).
It is used with LINQ.
Entity Framework is not efficient in any case, as with most tools or toolboxes designed to achieve 'faster' results.
Access to the database should be viewed as a separate tier, with stored procedures as the interface. There is no reason for any application to have more than the absolutely required CRUD operations; less is more. Stored procedures are easy to write, secure, and maintain, and are the de facto fastest way. It's easy to write tools that generate the desired POCO and DbContext code from stored procedures.
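As a rough sketch of what "stored procedures as the interface" looks like in code (the procedure name, columns and Customer class are made up for the example):

```csharp
using System.Data;
using System.Data.SqlClient;

public class Customer
{
    public int Id { get; set; }
    public string Name { get; set; }
}

public static class CustomerData
{
    // The application never sees tables or SQL text, only the procedure.
    public static Customer GetById(string connectionString, int id)
    {
        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand("dbo.GetCustomerById", connection))
        {
            command.CommandType = CommandType.StoredProcedure;
            command.Parameters.AddWithValue("@Id", id);

            connection.Open();
            using (var reader = command.ExecuteReader())
            {
                if (!reader.Read()) return null;
                return new Customer
                {
                    Id = reader.GetInt32(reader.GetOrdinal("Id")),
                    Name = reader.GetString(reader.GetOrdinal("Name"))
                };
            }
        }
    }
}
```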
A well-designed application should have a limited number of connection strings to the database, and none of them should be the almighty God. Use schemas to scope connection rights.
Lazy loading is a false promise, added to solve a problem that should never exist and introduced with ORMs and their plug-and-play features. Data should only be read when needed. Developers should be responsible for implementing this logic based on the application context.
If your application logic has a problem maintaining state, no tool will help. It will in fact make it worse by covering up the real problem until it's too late.
Database first is the only solution for a well-designed application. Civilization realized long ago the importance of solid aqueducts and sewer systems. High-level code can and will be replaced at any time, but data stays. Rewriting an entire application is a matter of days if the database is well designed.
Applications are just glorified database access. Still true in most cases.
This is my conclusion after many years of debugging business applications through code produced by many different tools or toolboxes. The 'faster results' advertised don't come close to covering the amount of time and energy wasted later trying to clean up the mess. Performance issues are rarely, if ever, caused by high demand; they are the sum of all the 'features' added through unsuitable tools.
ADO.NET provides consistent access to data sources such as SQL Server and XML, and to data sources exposed through OLE DB and ODBC. Data-sharing consumer applications can use ADO.NET to connect to these data sources and retrieve, handle, and update the data that they contain.
Entity Framework 6 (EF6) is a tried and tested object-relational mapper (O/RM) for .NET with many years of feature development and stabilization. An ORM like EF has the following advantages:
It lets developers focus on the business logic of the application, facilitating a huge reduction in code.
It eliminates the need for repetitive SQL code and provides many benefits to development speed.
It saves you from writing manual SQL queries; and many more.
In an n-tier application, it depends on the amount of data your application is handling and your database is managing. To my knowledge, DTOs don't kill performance. They are data containers for moving data between layers, are only used to pass data, and do not contain any business logic. They are mostly used in service classes. See DTO.
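A small sketch of what is meant by a DTO here; the names (ProductDto, MyDbContext, the Products set) are assumptions for the example:

```csharp
using System.Collections.Generic;
using System.Linq;

// A plain data container with no business logic.
public class ProductDto
{
    public int Id { get; set; }
    public string Name { get; set; }
    public decimal Price { get; set; }
}

public class ProductService
{
    private readonly MyDbContext _context; // some EF context with a Products set

    public ProductService(MyDbContext context)
    {
        _context = context;
    }

    public List<ProductDto> GetActiveProducts()
    {
        // The projection happens in the query itself, so only the DTO's
        // columns are materialised and passed up to the next layer.
        return _context.Products
            .Where(p => p.IsActive)
            .Select(p => new ProductDto { Id = p.Id, Name = p.Name, Price = p.Price })
            .ToList();
    }
}
```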
One DBContext is always a best practice.
There is no such combination as EF + SPs (stored procedures), as far as I know. If you wish to use an ORM like EF and stored procedures at the same time, try micro-ORMs like Dapper or BLToolkit; they were built for that purpose and are a heck of a lot faster than EF. Here is a good article on the Dapper ORM.
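As a rough illustration of the micro-ORM plus stored procedure combination with Dapper (the GetProductsByCategory procedure and Product class are assumptions):

```csharp
using System.Collections.Generic;
using System.Data;
using System.Data.SqlClient;
using System.Linq;
using Dapper;

public class Product
{
    public int Id { get; set; }
    public string Name { get; set; }
}

public static class ProductData
{
    public static List<Product> GetByCategory(string connectionString, int categoryId)
    {
        using (var connection = new SqlConnection(connectionString))
        {
            // Dapper maps the procedure's result set straight onto the Product class.
            return connection.Query<Product>(
                "dbo.GetProductsByCategory",
                new { CategoryId = categoryId },
                commandType: CommandType.StoredProcedure).ToList();
        }
    }
}
```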
Here is a related thread on a similar topic: What is the difference between an orm and ADO.net?
So we have this web service that uses a homemade data access framework, and I found that in its current state the web service cannot run more than one instance at a time, because this framework starts stepping on its own feet and whining about connections being closed or already opened, and errors like that.
So I implemented an SQL lock/mutex that queues all requests and since then, it's been pretty smooth.
I recently worked on another project which uses the ADO.NET Entity Framework (which I had never played with until then) and found out it pretty much does what this homemade framework does.
My question is: is the ADO.NET Entity Framework robust enough on its own that I would not need this SQL mutex implementation anymore?
Thanks.
If you will follow the rule "Do not share ObjectContext (DbContext in code first) instances between threads", everything will be ok.
Entity Framework uses some static data to improve performance (the entity model cache), but most objects (entity connections, contexts, change trackers, etc.) are not thread-safe and shouldn't be shared between threads.
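A common way to honour that rule is to give each operation its own short-lived context instead of sharing one instance (OrdersContext and Order here stand for whatever DbContext-derived class and entity you have):

```csharp
public class OrderReader
{
    public Order GetOrder(int id)
    {
        // Each call creates and disposes its own context,
        // so nothing is ever shared across threads.
        using (var context = new OrdersContext())
        {
            return context.Orders.Find(id);
        }
    }
}
```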
Yes, it is robust enough to do that, given you don't share dbcontexts between threads, which your homebrew layer must be doing. Not a way I'd have gone.
I am currently working on a software project which uses SQL Server database to store data. Since there are some plans to move away from SQL Server to Oracle, one of the requirements is to build pluggable database layer.
So my question is what is the best way to do it?
You have lots of choices. One option is to go with one of the various Object Relational Mapper (ORM) frameworks out there. NHibernate is a popular one, but Microsoft's Entity Framework (in v4) is a reasonable possibility as well, and it integrates better with Linq, if that's the sort of thing you're interested in.
A second option (not necessarily exclusive of the above) is to implement something like the Repository pattern, and run all database access through a repository layer. Some folks see the ORM frameworks as a replacement for the Repository pattern; other see a repository layer adding value on top of the ORM framework. The real value that a repository gives you is the ability to swap out ORM layers. Only you know whether that's a reasonable likelihood, and even if it is, it may be more work to implement the additional repository level than to just re-bind everything to the new ORM.
I suggest using Entity Framework; it's quite a bit easier than NHibernate and is maturing very fast. For Oracle support in EF, you'll need the Oracle provider.
Along with Entity Framework, I suggest using a pattern such as the Repository pattern. That way, you can use Dependency Injection to change to the implementation of your choice. Your application becomes independent of the database or the ORM itself, so you could also use NHibernate if you wish.
I would recommend using NHibernate for your data layer. With it, you can easily swap out the configuration to work with almost any database driver you want.
NHibernate has good support for both MsSQL and Oracle.
You can write all your queries in Hibernate's query language (HQL), which is dialect-agnostic. Another option is to use the LINQ provider in NHibernate 3 to get strongly typed data access.
As some others have mentioned, I would also recommend using the Repository pattern and injecting a Unit of Work or a SessionFactory.
Edit: Oracle now has released a beta of their Entity Framework provider: http://thedatafarm.com/blog/data-access/oracle-entity-framework-beta-released-today/
NHibernate: http://nhforge.org/Default.aspx
If you want to go the ORM way, you can use NHibernate as alexn suggested, or Telerik's OpenAccess ORM. There are also EF providers for Oracle and other DBMSes, but they are not free.
If you want to build your whole DAL from scratch, then go with the Repository pattern. Create an interface for the repository and create a repository provider for each database. Here's a discussion about it.
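Roughly like this (all names here are just for illustration):

```csharp
using System.Collections.Generic;

// The contract the rest of the application codes against.
public interface ICustomerRepository
{
    Customer GetById(int id);
    IEnumerable<Customer> GetAll();
    void Add(Customer customer);
}

// One provider per database; only the composition root decides which one is used.
public class SqlServerCustomerRepository : ICustomerRepository
{
    public Customer GetById(int id) { /* ADO.NET or EF code against SQL Server */ return null; }
    public IEnumerable<Customer> GetAll() { return new List<Customer>(); }
    public void Add(Customer customer) { /* INSERT via the SQL Server provider */ }
}

public class OracleCustomerRepository : ICustomerRepository
{
    public Customer GetById(int id) { /* ODP.NET or EF code against Oracle */ return null; }
    public IEnumerable<Customer> GetAll() { return new List<Customer>(); }
    public void Add(Customer customer) { /* INSERT via the Oracle provider */ }
}
```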
I was avoiding writing what may seem like another thread on .net arch/n-tier architecture, but bear with me.
I am hopefully, like others, still not 100% satisfied or clear on the best approach to take, given today's trends and newly emerging technologies, when it comes to selecting an architecture for enterprise applications.
I suppose I am seeking mass community opinion on the direction and architectural implementation you would choose when building an enterprise application utilising most facets of today's .NET technology. For fear of this being too vague otherwise, I'd better make it about me and my question: I would like to improve my architecture and would really like to hear what you think, given the list of technologies I am about to write.
Any and all best practices and architectural patterns you would suggest are welcome, and if you have built a solution for a similar setup before, so are any pitfalls or caveats you may have hit or overcome.
Here is a list of technologies adopted in my latest project, yep pretty much everything except WPF :)
Smart Client (WinForms)
WCF
    Used by the Smart Client
ASP.NET MVC
    Admin tool
    Client tool
LINQ to SQL
    Used by WCF
    Used by ASP.NET MVC
Microsoft SQL Server 2008
Utilities and additional components to consider:
Dependency Injection - StructureMap
Exception Management - Custom
Unit Testing - MBUnit
I currently have this running in an n-tier architecture, adopting a service-based design pattern utilising Request/Response (not sure of its formal name) and the Repository pattern; most of my structure was adopted from Rob Conery's Storefront.
I suppose I am more or less happy with most of my tiers (it's really just the DAL that I am a bit uneasy about).
Before I finish, these are the real questions I have faced with my current architecture:
I have a big question mark over whether I should have a custom data access layer given the use of LINQ to SQL. Should I perform LINQ to SQL directly in my service/business layer, or in a DAL in a repository method? Should you create a new instance of your DB context in each repository method call (with using())? Or one in the class constructor, or through DI?
Do you believe we can truly use POCOs (plain old CLR objects) when using LINQ to SQL? I would love to see some examples, as I encountered problems, and it would have been particularly handy for the WCF work since I obviously can't carry L2S objects across the wire.
Creating an ASP.NET MVC project by itself quite clearly displays the design pattern you should adopt: view logic in the view, the controller calling service/business methods, and of course data access in the model. But would you drop the 'model' facet for larger projects, particularly where the data access is shared? What approach would you take to get your data?
Thanks for hearing me out; I would love to see sample code bases for such architectures and how they are split. As I said, I have seen Storefront; I am yet to really go through Oxite, but I thought this would benefit myself and everyone.
Added additional question in DAL bullet point. / 15:42 GMT+10
To answer your questions:
Should I perform LINQ to SQL directly in my service/business layer or in a DAL in a repository method? LINQ to SQL specifically only makes sense if your database maps one-to-one to your business objects. In most enterprise situations that's not the case, and Entity Framework is more appropriate.
That having been said, LINQ in general is highly appropriate to use directly in your business layer, because the LINQ provider (whether that is LINQ to SQL or something else) is your DAL. The benefit of LINQ is that it allows you to be much more flexible and expressive in your business layer than DAL.GetBusinessEntityById(id), while the close-to-the-metal code which makes both LINQ and traditional DAL code work is encapsulated away from you, to the same positive effect.
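To make the contrast concrete (MyDataContext and the Orders table are assumptions for the example):

```csharp
using System;
using System.Linq;

public class OrderQueries
{
    // With a traditional DAL you would need a dedicated method such as
    // DAL.GetBigRecentOrdersForCustomer(customerId); with LINQ the business
    // layer composes the question itself and the provider acts as the DAL.
    public object BigRecentOrders(MyDataContext db, int customerId, DateTime cutoff)
    {
        return db.Orders
            .Where(o => o.CustomerId == customerId && o.Total > 1000m && o.PlacedOn > cutoff)
            .OrderByDescending(o => o.PlacedOn)
            .Take(10)
            .ToList();
    }
}
```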
Do you believe we can truly use POCO (plain old CLR objects) when using LINQ to SQL? Without more specific info on your POCO problems regarding LINQ to SQL, it's difficult to say.
would you drop the 'model' facet for larger projects The MVC pattern in general is far more broad than a superficial look at ASP.NET MVC might imply. By definition, whatever you choose to use to connect to your data backing in your application becomes your model. If that is utilizing WCF or MQ to connect to an enterprise data cloud, so be it.
When I looked at Rob Conery's Storefront, it looked like he was using POCOs and LINQ to SQL; however, he was doing it by translating from the LINQ to SQL-generated entities to POCOs (and back), which seems a little silly to me. Essentially we have a DAL for a DAL.
However, this appears to be the only way to use POCOs with LINQ to SQL.
I would say you should use a Repository pattern and let it hide your LINQ to SQL layer (or whatever you end up using for data access). That doesn't mean you can't use LINQ in the other tiers; just make sure your repository returns IQueryable<T>.
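Something along these lines (MyDataContext and Product are placeholders):

```csharp
using System.Linq;

public class ProductRepository
{
    private readonly MyDataContext _context = new MyDataContext();

    // The LINQ to SQL plumbing stays in here, but callers can still
    // compose Where/OrderBy/paging on the IQueryable before it runs.
    public IQueryable<Product> GetProducts()
    {
        return _context.Products;
    }
}
```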
Whether or not you use LINQ to SQL, it is quite common to use a separate DTO object for things like WCF. I have cited a few thoughts on this subject here: Pragmatic LINQ. But for me, the biggest one is: don't expose IQueryable<T> / Expression<...> on the repository interface. If you do, your repository is no longer a black box and cannot be tested in isolation, since it is at the whim of the caller. Likewise, you can't profile/optimise the DAL in isolation.
A bigger problem is the fact that IQueryable<T> leaks (LOLA, the law of leaky abstractions). For example, Entity Framework doesn't like Single(), or Take() without an explicit OrderBy(), but L2S is fine with those. L2S should be an implementation detail of the DAL; it shouldn't define the repository.
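So the repository interface ends up with specific, testable methods that return materialised results rather than queryables; something like this sketch (names invented):

```csharp
using System.Collections.Generic;

// No IQueryable<T> or Expression<...> escapes: the DAL can be profiled,
// optimised and mocked behind this boundary.
public interface IProductRepository
{
    Product GetById(int id);
    IList<Product> ListByCategory(int categoryId, int pageIndex, int pageSize);
}
```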
For similar reasons, I mark the L2S association properties as internal - I can use them in the DAL to create interesting queries, but...
When I last worked in programming, we were trying to move away from DataReaders and the traditional ADO.NET API toward Object Relational Mapping (ORM).
To do this, we generated a DataContext of our DB via sqlmetal. There was then a thin data layer that made the DataContext private, and any code needing to access the database would have to use a public method in this thin data layer. These methods were basically stored procedures; they would perform queries on the database via LINQ to SQL.
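Roughly, that thin layer looked like this (reconstructed from memory, so the names are illustrative; OrdersDataContext stands for the sqlmetal-generated DataContext):

```csharp
using System.Collections.Generic;
using System.Linq;

public class OrderData
{
    // The generated DataContext is private; nothing outside this class sees it.
    private readonly OrdersDataContext _context = new OrdersDataContext();

    // Public methods play the role of stored procedures,
    // each one a fixed query expressed with LINQ to SQL.
    public List<Order> GetOpenOrdersForCustomer(int customerId)
    {
        return _context.Orders
            .Where(o => o.CustomerId == customerId && !o.IsClosed)
            .ToList();
    }
}
```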
Is this a common approach today? I mean, is everyone who's using the .NET 3.5 Framework really running sqlmetal in their build process, or what? It almost seemed like a hack at the time.
Basically, I'd like to know if LINQ to SQL and sqlmetal are what to expect if I'm going to write a DAL today at a .NET 3.5 shop that doesn't employ a third-party, open-source ORM.
It is still considered best practice to have some sort of data access layer. Whether this is best achieved with an ORM is a heavily debated issue. One faction generally argues that ORMs are the way to go; another argues that stored procedures and a database-centric design are the best route.
Also, this may not be exactly the poster you meant, but it's similar (and also the one in my cubicle):
http://download.microsoft.com/download/4/a/3/4a3c7c55-84ab-4588-84a4-f96424a7d82d/NET35_Namespaces_Poster_LORES.pdf
Your approach is good. I currently use Astoria services (ADO.NET Data Services). There was a nice introduction to this in MSDN Magazine.
I also like the new PLINQO (requires CodeSmith Tools though). This is very slick in my opinion.
When I have such a DAL (service layer), I just consume this service from my client application (Silverlight or ASP.NET MVC).
I think it depends on your use, but I'd say that with a data layer as thin as the one you describe, that would be your DAL. Most projects will build another layer on top of that, mainly for edit/create logic and maybe some stitching logic for gets.
For most of my projects I design it like this.
A Repository holds the instance of the DataContext and exposes some basic add/delete methods.
ProductRepository : Repository exposes general queries (IQueryable).
A StoreService uses instances of different repositories, like ProductRepository and SalesRepository, and handles all the logic for creating something like a product.
So something like...
StoreService.CreateProduct(/* properties */)
This would return some sort of result class.
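Compressed into code, the shape is something like this (every name and the LINQ to SQL details are illustrative only):

```csharp
using System.Linq;

public class Repository<T> where T : class
{
    protected readonly MyDataContext Context = new MyDataContext();

    public void Add(T entity) { Context.GetTable<T>().InsertOnSubmit(entity); }
    public void Save() { Context.SubmitChanges(); }
}

public class ProductRepository : Repository<Product>
{
    // General, composable queries live on the repository.
    public IQueryable<Product> Products
    {
        get { return Context.GetTable<Product>(); }
    }
}

public class CreateProductResult
{
    public bool Success { get; set; }
    public string Message { get; set; }
}

public class StoreService
{
    private readonly ProductRepository _products = new ProductRepository();

    // All creation logic sits in the service, not in the repositories.
    public CreateProductResult CreateProduct(string name, decimal price)
    {
        if (_products.Products.Any(p => p.Name == name))
            return new CreateProductResult { Success = false, Message = "Duplicate product name." };

        _products.Add(new Product { Name = name, Price = price });
        _products.Save();
        return new CreateProductResult { Success = true, Message = "Product created." };
    }
}
```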
The best data layer is the one that is plain and simple and gets the job done without any bells and whistles. I have used the technologies you mentioned and written about them here:
The Only Pattern for Data Access is - There Are No Patterns for Data Access
This very site uses LINQ to SQL, so take that as you will.
Officially, Microsoft is supporting Entity Framework over LINQ to SQL in terms of new development. However, there's a vocal group of people who think EF is the wrong way to go. LINQ to SQL will still be around for some time, and is a very decent ORM, if somewhat limiting in terms of which DB backend you can use.
I would recommend LINQ as a great starting point for your ORM. If you need better, look into EF and/or NHibernate.
"Is this a common approach today? I mean, is everyone whose using the .NET 3.5 framework really running sqlmetal in their build process, or what?"
The people I know using the 3.5 Framework (and that's just about everyone) - the vast majority - are still using NHibernate. Version 2.0 is a very nice OR/M. I started using it on a recent project and it cut my data access code down significantly, to the point where I really don't want to use anything else in the future. And the Fluent NHibernate API is making some headway for folks who don't like the XML mapping.