I am currently working on a software project which uses a SQL Server database to store data. Since there are plans to move away from SQL Server to Oracle, one of the requirements is to build a pluggable database layer.
So my question is: what is the best way to do this?
You have lots of choices. One option is to go with one of the various Object Relational Mapper (ORM) frameworks out there. NHibernate is a popular one, but Microsoft's Entity Framework (in v4) is a reasonable possibility as well, and it integrates better with LINQ, if that's the sort of thing you're interested in.
A second option (not necessarily exclusive of the above) is to implement something like the Repository pattern and run all database access through a repository layer. Some folks see the ORM frameworks as a replacement for the Repository pattern; others see a repository layer adding value on top of the ORM framework. The real value a repository gives you is the ability to swap out ORM layers. Only you know whether that's a reasonable likelihood, and even if it is, it may be more work to implement the additional repository level than to just re-bind everything to the new ORM.
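To make that concrete, here is a minimal sketch of such a repository layer. All names (Product, IProductRepository, the EF implementation) are hypothetical; the point is only that the business layer depends on the interface, so the ORM behind it can be swapped.

```csharp
using System.Collections.Generic;

// Hypothetical domain class.
public class Product
{
    public int Id { get; set; }
    public string Name { get; set; }
}

// The business layer depends only on this interface, never on a concrete ORM.
public interface IProductRepository
{
    Product GetById(int id);
    IList<Product> GetAll();
    void Add(Product product);
}

// One class like this per ORM. Swapping NHibernate for Entity Framework
// means writing a new implementation of the interface - callers don't change.
public class EntityFrameworkProductRepository : IProductRepository
{
    public Product GetById(int id) { /* EF query goes here */ throw new System.NotImplementedException(); }
    public IList<Product> GetAll() { /* EF query goes here */ throw new System.NotImplementedException(); }
    public void Add(Product product) { /* EF insert goes here */ throw new System.NotImplementedException(); }
}
```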
I suggest using Entity Framework; it is quite a bit easier to learn than NHibernate and is maturing very fast. For Oracle support in EF, you'll need the Oracle provider.
Along with Entity Framework, I suggest using a pattern such as the Repository pattern. That way you can use Dependency Injection to swap in the implementation of your choice, and your application becomes independent of the database and of the ORM itself. So you could also use NHibernate if you wish.
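A rough illustration of the Dependency Injection part, reusing the hypothetical names from the repository sketch above: the service receives the repository through its constructor, so switching from EF to NHibernate is a one-line change at the composition root.

```csharp
// The service depends on the abstraction, not a concrete ORM.
public class ProductService
{
    private readonly IProductRepository repository;

    public ProductService(IProductRepository repository)
    {
        this.repository = repository;
    }

    public Product LoadProduct(int id)
    {
        return repository.GetById(id);
    }
}

// Composition root - the only place that names a concrete implementation:
//   var service = new ProductService(new EntityFrameworkProductRepository());
// or, to switch ORMs without touching the service:
//   var service = new ProductService(new NHibernateProductRepository());
```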
I would recommend using NHibernate for your data layer. With it, you can easily swap out the configuration to work with almost any database driver you want.
NHibernate has good support for both MS SQL Server and Oracle.
You can write all your queries in NHibernate's query language (HQL), which is dialect agnostic. Another option is to use the LINQ provider in NHibernate 3 to get strongly typed data access.
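For example, a minimal sketch of both styles, assuming a hypothetical Product entity that is already mapped; the same code runs against SQL Server or Oracle once the dialect is switched in the NHibernate configuration:

```csharp
using System.Collections.Generic;
using System.Linq;
using NHibernate;
using NHibernate.Linq;

// Hypothetical mapped entity (members are virtual for NHibernate proxying).
public class Product
{
    public virtual int Id { get; set; }
    public virtual decimal Price { get; set; }
}

public class ProductQueries
{
    // HQL: written against the mapped classes, not against a SQL dialect.
    public IList<Product> ExpensiveProductsHql(ISession session)
    {
        return session
            .CreateQuery("from Product p where p.Price > :minPrice")
            .SetParameter("minPrice", 100m)
            .List<Product>();
    }

    // NHibernate 3 LINQ provider: strongly typed queries.
    public IList<Product> ExpensiveProductsLinq(ISession session)
    {
        return session.Query<Product>()
                      .Where(p => p.Price > 100m)
                      .ToList();
    }
}
```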
As some others have mentioned, I would also recommend using the repository pattern and injecting a Unit of Work or a SessionFactory.
Edit: Oracle now has released a beta of their Entity Framework provider: http://thedatafarm.com/blog/data-access/oracle-entity-framework-beta-released-today/
If you want to go the ORM way, you can use NHibernate as alexn suggested, or Telerik's OpenAccess ORM. There are also EF providers for Oracle and other DBMSes, but they are not free.
If you want to build your whole DAL from scratch, then go with the Repository pattern. Create a repository interface and a repository implementation for each database. Here's a discussion about it.
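A minimal sketch of the per-database approach, with hypothetical names throughout; the factory reads a configured provider name so the rest of the application never knows which database it is talking to:

```csharp
// One implementation of the repository interface per database.
public interface ICustomerRepository
{
    string GetCustomerName(int id);
}

public class SqlServerCustomerRepository : ICustomerRepository
{
    public string GetCustomerName(int id) { /* SQL Server-specific access */ throw new System.NotImplementedException(); }
}

public class OracleCustomerRepository : ICustomerRepository
{
    public string GetCustomerName(int id) { /* Oracle-specific access */ throw new System.NotImplementedException(); }
}

public static class RepositoryFactory
{
    // "provider" would typically come from app.config/web.config.
    public static ICustomerRepository CreateCustomerRepository(string provider)
    {
        switch (provider)
        {
            case "SqlServer": return new SqlServerCustomerRepository();
            case "Oracle":    return new OracleCustomerRepository();
            default: throw new System.ArgumentException("Unknown provider: " + provider);
        }
    }
}
```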
Related
Is there a way my DAL classes can be reused across different databases?
I know some technologies (LINQ and EF) support rapid development. I appreciate this feature, but I also want to keep my DAL code reusable across different databases.
A simple thing that comes to mind is using OLE DB with inline SQL queries. Is there a more elegant way? Please guide me. I am just considering two things:
Support for the four most commonly used databases (SQL Server, MySQL, Access, Oracle).
Rapid development support.
Thanks
You may consider using an ORM framework such as Entity Framework or NHibernate. This way your data access layer will be database agnostic.
If you stick to the interfaces (IDbConnection, IDbCommand, ...) and their factory methods (IDbConnection.CreateCommand), then the only code that needs to know which database you are using is the initial connection creation (which can be encapsulated).
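A minimal sketch of that encapsulation using DbProviderFactories; the provider name and connection string are placeholders that would come from configuration, making this the only place that knows the database:

```csharp
using System.Data;
using System.Data.Common;

public static class Db
{
    // Swap "System.Data.SqlClient" for an Oracle provider name to change
    // databases; in a real app both values come from config.
    static readonly DbProviderFactory factory =
        DbProviderFactories.GetFactory("System.Data.SqlClient");
    static readonly string connectionString = "..."; // placeholder

    public static IDbConnection CreateConnection()
    {
        var connection = factory.CreateConnection();
        connection.ConnectionString = connectionString;
        return connection;
    }
}

// Calling code only ever sees the System.Data interfaces:
// using (IDbConnection conn = Db.CreateConnection())
// using (IDbCommand cmd = conn.CreateCommand())
// {
//     cmd.CommandText = "SELECT COUNT(*) FROM Products";
//     conn.Open();
//     object count = cmd.ExecuteScalar();
// }
```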
Entity Framework does the work you want. The most commonly used databases are supported, with the exception of Oracle: for Oracle you have to use third-party components, as the official Oracle support for Entity Framework is still in beta.
When I began writing web applications with ASP.NET, I started with small projects that used a LINQ to SQL mapper for database access to an MS SQL Server.
After gaining some experience, I switched to a classic three-tiered approach with a GUI layer, a business layer, and a data layer. The only function of the data layer was to provide insert/update/delete methods and selection methods, all without any logic.
Over time I realized that it would be better not to expose the database classes all the way up to the GUI (unfortunately, that realization took some time). I switched to using business classes in the BL that are used for all operations performed by the BL and are handed to the GUI as lists from the business layer.
A great advantage is that I can provide additional properties that are not represented in the database itself. However, I did that mapping inside the business layer myself, with methods that mapped the corresponding business layer class to the database class.
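A minimal sketch of that hand-written mapping; CustomerRow (the generated database class) and Customer (the business class) are hypothetical names:

```csharp
// Stand-in for the generated database class.
public class CustomerRow
{
    public int Id { get; set; }
    public string FirstName { get; set; }
    public string LastName { get; set; }
}

public class Customer
{
    public int Id { get; set; }
    public string FirstName { get; set; }
    public string LastName { get; set; }

    // An additional property that is not stored in the database.
    public string FullName
    {
        get { return FirstName + " " + LastName; }
    }
}

public static class CustomerMapper
{
    public static Customer ToBusiness(CustomerRow row)
    {
        return new Customer { Id = row.Id, FirstName = row.FirstName, LastName = row.LastName };
    }

    public static CustomerRow ToDatabase(Customer customer)
    {
        return new CustomerRow { Id = customer.Id, FirstName = customer.FirstName, LastName = customer.LastName };
    }
}
```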
I guess that's where O/R mappers come in handy? Until now I hadn't realized their purpose, but I think I just found it. I've recently tried out the new Entity Framework with .NET Framework 4, but I'm only using it like the Linq-To-SQL DataContext.
Is there a way to achieve the mapping automatically? If yes, is that something the new Entity Framework provides, or do I need to look for an O/R mapper like NHibernate?
I use NHibernate exclusively in my projects. I like the control and flexibility it gives me. There is a 'shortcut' called Active Record that uses NHibernate under the covers but provides a really nice and simple interface to NHibernate.
NHibernate has a steep learning curve, but when you get past that - it is really smooth sailing. When (and if) you venture the way of NHibernate, check out Ayende for cool tips.
(Entity Framework is an O/R Mapper.)
If you're serious about getting your hands dirty with ORM (but relatively new to that area), I highly recommend something like TekPub's videos on these topics. You'll be able to see these tools in use starting from scratch. It is a graceful introduction to some simple, but real-world issues like the ones you mention.
LinqToSql is an ORM, so you are already using one. Taking LinqToSql out and replacing it with EntityFramework or NHibernate won't solve the problems you appear to be having right now.
Here are some things you should learn more about to help give you additional context:
AutoMapper (a minimal sketch follows this list)
Data Transfer Objects (DTOs)
Plain Old CLR Object (POCO)
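As promised, here is a minimal AutoMapper sketch, using the classic static API of the AutoMapper versions current in the .NET 3.5/4 era; Customer and CustomerDto are hypothetical classes with matching property names:

```csharp
using AutoMapper;

// Hypothetical domain entity and DTO.
public class Customer
{
    public int Id { get; set; }
    public string Name { get; set; }
}

public class CustomerDto
{
    public int Id { get; set; }
    public string Name { get; set; }
}

public static class MappingExample
{
    public static CustomerDto ToDto(Customer customer)
    {
        // Configure once (in a real app this belongs in one-time startup
        // configuration), then map by convention on matching names.
        Mapper.CreateMap<Customer, CustomerDto>();
        return Mapper.Map<Customer, CustomerDto>(customer);
    }
}
```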
I've had a great time using Entity Framework 4.0 (+ the CTP). I think you'd have a much easier time dealing with an ORM like that. EF4 provides everything you need to interoperate with MSSQL from C#/.NET. You won't have to write a single line of SQL, and it has full support for LINQ (through ObjectQuery).
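For a sense of what that looks like, here is a minimal LINQ to Entities sketch; MyEntities is a hypothetical ObjectContext generated from the EF4 model, with a Products entity set:

```csharp
using System;
using System.Linq;

class Example
{
    static void Main()
    {
        // MyEntities stands in for the generated EF4 ObjectContext.
        using (var context = new MyEntities())
        {
            // No SQL written by hand - the provider translates the LINQ query.
            var expensive = from p in context.Products
                            where p.Price > 100m
                            orderby p.Name
                            select p;

            foreach (var product in expensive)
            {
                Console.WriteLine(product.Name);
            }
        }
    }
}
```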
I have a small application in C#, and so far I have implemented it using data persistence classes that run SQL directly. I understand there are better alternatives, such as NHibernate and possibly Spring.NET.
Are there some others I'm forgetting? Pros and cons?
thanks!
Here are a few of the more common ones:
LINQ to SQL
ADO.NET Entity Framework
NHibernate
Plain old ADO.NET
SubSonic
Spring.NET isn't really a dedicated data persistence framework. It's more of an IoC (Inversion of Control) framework, which gives you dependency injection and a bunch of other stuff, including the ability to plug in a data persistence framework.
LINQ to SQL and SubSonic are probably the easiest to use and learn out of all of these. NHibernate and Entity Framework are more complex, but probably also more powerful and flexible than plain LINQ to SQL.
I was avoiding writing what may seem like yet another thread on .NET/n-tier architecture, but bear with me.
I, hopefully like others, am still not 100% satisfied with or clear on the best approach to take when selecting an architecture for enterprise applications, given today's trends and newly emerging technologies.
I suppose I am seeking mass community opinion on the direction and architectural implementation you would choose when building an enterprise application utilising most facets of today's .NET technology. For fear of this being too vague otherwise, I'd better make this about me and my question: I would like to improve my architecture, and would really like to hear what you guys think given the list of technologies I am about to write.
Any and all best practices and architectural patterns you would suggest are welcome and if you have created a solution before for a similar type setup, any pitfalls or caveats you may have hit or overcome.
Here is a list of technologies adopted in my latest project, yep pretty much everything except WPF :)
Smart Client (WinForms)
WCF
Used by Smart Client
ASP.NET MVC
Admin tool
Client tool
LINQ to SQL
Used by WCF
Used by ASP.NET MVC
Microsoft SQL Server 2008
Utilities and additional components to consider:
Dependency Injection - StructureMap
Exception Management - Custom
Unit Testing - MBUnit
I currently have this running in an n-tier architecture, adopting a service-based design pattern utilising Request/Response (not sure of its formal name) and the Repository pattern; most of my structure was adopted from Rob Conery's Storefront.
I suppose I am more or less happy with most of my tiers (it's really just the DAL which I am a bit uneasy about).
Before I finish, these are the real questions I have faced with my current architecture:
I have a big question mark over whether I should have a custom data access layer, given the use of LINQ to SQL. Should I perform LINQ to SQL directly in my service/business layer, or in a DAL in a repository method? Should you create a new instance of your DB context in each repository method call (with a using() block), or one in the class constructor/through DI?
Do you believe we can truly use POCOs (plain old CLR objects) when using LINQ to SQL? I would love to see some examples, as I encountered problems, and it would have been particularly handy for the WCF work since I obviously can't be carrying L2S objects across the wire.
Creating an ASP.NET MVC project by itself quite clearly displays the design pattern you should adopt: view logic stays in the view, the controller calls service/business methods, and of course your data access sits behind the model. But would you drop the 'model' facet for larger projects, particularly where the data access is shared? What approach would you take to get your data?
Thanks for hearing me out; I would love to see sample code bases for such architectures and how they are split. As said, I have seen Storefront; I have yet to really go through Oxite, but I thought this would benefit myself and everyone.
Added additional question in DAL bullet point. / 15:42 GMT+10
To answer your questions:
Should I perform LINQ to SQL directly in my service/business layer or in a DAL in a repository method? LINQ to SQL specifically only makes sense if your database maps 1-to-1 with your business objects. In most enterprise situations that's not the case, and Entity Framework is more appropriate.
That having been said, LINQ in general is highly appropriate to use directly in your business layer, because the LINQ provider (whether that is LINQ to SQL or something else) is your DAL. The benefit of LINQ is that it allows you to be much more flexible and expressive in your business layer than DAL.GetBusinessEntityById(id), while the close-to-the-metal code which makes both LINQ and the traditional DAL approach work is encapsulated away from you, with the same positive effect.
Do you believe we can truly use POCO (plain old CLR objects) when using LINQ to SQL? Without more specific info on your POCO problems regarding LINQ to SQL, it's difficult to say.
would you drop the 'model' facet for larger projects The MVC pattern in general is far more broad than a superficial look at ASP.NET MVC might imply. By definition, whatever you choose to use to connect to your data backing in your application becomes your model. If that is utilizing WCF or MQ to connect to an enterprise data cloud, so be it.
When I looked at Rob Conery's Storefront, it looked like he is using POCOs and LINQ to SQL; however, he is doing it by translating from the LINQ to SQL created entities to POCOs (and back), which seems a little silly to me - essentially we have a DAL for a DAL.
However, this appears to be the only way to use POCOs with Linq to SQL.
I would say you should use a Repository pattern: let it hide your LINQ to SQL layer (or whatever you end up using for data access). That doesn't mean you can't use LINQ in the other tiers; just make sure your repository returns IQueryable<T>.
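A minimal sketch of such a repository; MyDataContext and Product stand in for the hypothetical LINQ to SQL generated types:

```csharp
using System.Linq;

// The repository hides the LINQ to SQL layer but still returns IQueryable<T>,
// so upper tiers can compose queries with LINQ.
public interface IProductRepository
{
    IQueryable<Product> Products { get; }
}

public class LinqToSqlProductRepository : IProductRepository
{
    private readonly MyDataContext context;

    public LinqToSqlProductRepository(MyDataContext context)
    {
        this.context = context;
    }

    public IQueryable<Product> Products
    {
        get { return context.Products; }
    }
}

// Upper tiers compose queries without knowing about LINQ to SQL:
// var cheap = repository.Products.Where(p => p.Price < 10m).ToList();
```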
Whether or not you use LINQ to SQL, it is common to use a separate DTO object for things like WCF. I have cited a few thoughts on this subject here: Pragmatic LINQ - but for me, the biggest is: don't expose IQueryable<T> / Expression<...> on the repository interface. If you do, your repository is no longer a black box and cannot be tested in isolation, since it is at the whim of the caller. Likewise, you can't profile/optimise the DAL in isolation.
A bigger problem is the fact that IQueryable<T> leaks (LOLA - the Law of Leaky Abstractions). For example, Entity Framework doesn't like Single(), or Take() without an explicit OrderBy() - but L2S is fine with that. L2S should be an implementation detail of the DAL - it shouldn't define the repository.
For similar reasons, I mark the L2S association properties as internal - I can use them in the DAL to create interesting queries, but...
When I last worked in programming, we were trying to move away from DataReaders and the traditional ADO.NET API toward Object Relational Mapping (ORM).
To do this, we generated a DataContext for our DB via sqlmetal. There was then a thin data layer that kept the DataContext private; any code needing to access the database had to use a public method in this thin data layer. These methods were basically stored procedures: they performed queries on the database via LINQ to SQL.
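A minimal sketch of that thin layer, with hypothetical names; MyDataContext stands in for the sqlmetal-generated DataContext, which stays out of reach of callers:

```csharp
using System.Collections.Generic;
using System.Linq;

// Callers only ever see stored-procedure-like methods such as this one;
// the DataContext never escapes the class.
public static class ProductData
{
    public static List<Product> GetProductsByCategory(int categoryId)
    {
        using (var db = new MyDataContext())
        {
            return (from p in db.Products
                    where p.CategoryId == categoryId
                    select p).ToList();
        }
    }
}
```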
Is this a common approach today? I mean, is everyone who's using the .NET 3.5 framework really running sqlmetal in their build process, or what? It almost seemed like a hack at the time.
Basically, I'd like to know if LINQ to SQL and sqlmetal are what to expect if I'm going to write a DAL today at a .NET 3.5 shop that doesn't employ a third-party, open-source ORM.
It is still considered best practice to have some sort of data access layer. Whether this is best achieved with an ORM is a heavily debated issue. There is one faction that generally argues that ORMs are the way to go; another faction argues that stored procedures and a database-centric approach are the best route.
Also, this may not be exactly the poster you meant, but it's similar (and also the one in my cubicle):
http://download.microsoft.com/download/4/a/3/4a3c7c55-84ab-4588-84a4-f96424a7d82d/NET35_Namespaces_Poster_LORES.pdf
Your approach is good. I currently use Astoria services (ADO.NET Data Services). There was a nice introduction to this in MSDN Magazine.
I also like the new PLINQO (requires CodeSmith Tools though). This is very slick in my opinion.
When I have such a DAL (service layer), I just consume this service from my client application (Silverlight or ASP.NET MVC).
I think it depends on your use, but I'd say that with such a thin data layer as you describe, that would be your DAL. Most projects will build another layer on top of that, mainly for edit/create logic and maybe some stitching logic for gets.
For most of my projects I design it like this.
Repository holds the instance of DataContext and exposes some basic add/delete methods
ProductRepository : Repository exposes general queries (IQueryable)
StoreService uses an instance of different repositories like ProductRepository, SalesRepository and handles all logic for creating something like a product.
So something like...
StoreService.CreateProduct(/* properties */)
This would return some sort of result class.
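A minimal sketch of that layering, with hypothetical names throughout; MyDataContext and Product stand in for the LINQ to SQL generated types:

```csharp
using System.Linq;

// Base repository: holds the DataContext and exposes basic add/delete.
public abstract class Repository<T> where T : class
{
    protected readonly MyDataContext Context;

    protected Repository(MyDataContext context)
    {
        Context = context;
    }

    public void Add(T entity) { Context.GetTable<T>().InsertOnSubmit(entity); }
    public void Delete(T entity) { Context.GetTable<T>().DeleteOnSubmit(entity); }
}

// ProductRepository: exposes general queries as IQueryable.
public class ProductRepository : Repository<Product>
{
    public ProductRepository(MyDataContext context) : base(context) { }

    public IQueryable<Product> ByCategory(int categoryId)
    {
        return Context.Products.Where(p => p.CategoryId == categoryId);
    }
}

// Some sort of result class returned by the service.
public class CreateProductResult
{
    public bool Success { get; set; }
    public string Message { get; set; }
}

// StoreService: uses repositories and holds the creation logic.
public class StoreService
{
    private readonly ProductRepository products;

    public StoreService(ProductRepository products)
    {
        this.products = products;
    }

    public CreateProductResult CreateProduct(string name, decimal price)
    {
        // Validation and business rules live here, not in the repository.
        if (string.IsNullOrEmpty(name))
            return new CreateProductResult { Success = false, Message = "Name is required." };

        products.Add(new Product { Name = name, Price = price });
        // SubmitChanges (the unit of work) would typically be invoked here.
        return new CreateProductResult { Success = true };
    }
}
```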
The best data layer is the one that is plain and simple and gets the job done without any bells and whistles. I have used the technologies you mentioned and written about them here:
The Only Pattern for Data Access is - There Are No Patterns for Data Access
This very site uses LINQ to SQL, so take that as you will.
Officially, Microsoft is supporting Entity Framework over LINQ to SQL in terms of new development. However, there's a vocal group of people who think EF is the wrong way to go. LINQ to SQL will still be around for some time, and is a very decent ORM, if somewhat limiting in terms of which DB backend you can use.
I would recommend LINQ as a great starting point for your ORM. If you need better, look into EF and/or NHibernate.
"Is this a common approach today? I mean, is everyone whose using the .NET 3.5 framework really running sqlmetal in their build process, or what?"
The people I know using the 3.5 Framework (and that's just about everyone) - the vast majority - are still using NHibernate. Version 2.0 is a very nice OR/M. I started using it on a recent project and it cut my data access code down significantly, to the point where I really don't want to use anything else in the future. And the Fluent NHibernate API is making some headway for folks who don't like the XML mapping.