I'm very much a beginner in .NET, and I'm currently developing a small project (a web API) using the NancyFX framework. In my project I need to use a SQL database for some very basic tasks, like storing registered users' details or fetching user information. I'd like to know: what is the most popular, convenient, and modern way of using SQL in .NET for beginners? Should I use LINQ, plain SqlClient functionality, or are there good libraries for working with SQL in .NET? I tried to implement the LINQ to SQL pattern but ended up with huge chunks of unused auto-generated code and an even bigger mess in my head...
For a framework to communicate with your database I would recommend Entity Framework. It's very convenient and easy, and it has the Code First approach, which you should read about.
Moreover, I suggest you follow the repository pattern:
https://msdn.microsoft.com/en-us/library/ff649690.aspx
This basically means that each kind of object you save in the DB will have a repository which contains all the objects of its kind, and that repository will be your entry point for reading, inserting, updating, and deleting rows from the DB, while abstracting away all implementation details - in our case, I recommend Entity Framework, as mentioned above.
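A minimal sketch of what that might look like with EF Code First (the entity, context, and repository names here are just illustrative placeholders, not from any particular project):

    using System.Collections.Generic;
    using System.Data.Entity;   // Entity Framework 6, Code First
    using System.Linq;

    // A simple entity that EF will map to a table
    public class User
    {
        public int Id { get; set; }
        public string Email { get; set; }
        public string PasswordHash { get; set; }
    }

    // Code First context: EF builds the schema from these classes
    public class AppDbContext : DbContext
    {
        public DbSet<User> Users { get; set; }
    }

    // The repository is the single entry point for reading/writing users
    public class UserRepository
    {
        private readonly AppDbContext _db;

        public UserRepository(AppDbContext db)
        {
            _db = db;
        }

        public User GetById(int id)
        {
            return _db.Users.Find(id);
        }

        public IEnumerable<User> GetAll()
        {
            return _db.Users.ToList();
        }

        public void Add(User user)
        {
            _db.Users.Add(user);
            _db.SaveChanges();
        }

        public void Delete(User user)
        {
            _db.Users.Remove(user);
            _db.SaveChanges();
        }
    }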
Good luck
I've been looking into using an MVC C# frontend with a Caché database backend. After looking around for a while I haven't been able to find an effective way of connecting the two together (via .edmx model generation). I know I'll need a database driver for Visual Studio 2012 to do this, but I don't know where to find it.
I've been developing a few apps using MVC technology and want to keep following suit instead of resorting to using their .csp-based technology.
Hopefully somebody can help with this.
Take a look at the Cache Managed Provider for .NET documentation:
http://docs.intersystems.com/cache20121/csp/docbook/DocBook.UI.Page.cls?KEY=GBMP
The Managed Provider functionality specifically allows you to access Caché data from within .NET programs. It's not going to be as nice as using, say, the .NET Entity Framework for data access, and you'll have to take InterSystems' code examples with a grain of salt since they are pretty simplistic, but this should be what you need.
You can use an ORM framework like Entity Framework or NHibernate to access InterSystems Caché, so the database can be separated nicely in the data layer. I managed to make NHibernate work with InterSystems Caché. Have a look here if you are interested.
I'm writing a .NET application and the thought of implementing a data layer from scratch is icky to me. (By data layer I'm referring to the code that talks to the database, not the layer which abstracts the database access into domain objects [sometimes called the data access layer and used interchangeably with data layer].)
I'd like to find an existing generic data layer implementation which provides standard CRUD functionality, error handling, connection management - the works. I'll be talking to SQL Server only.
It doesn't matter to me if the library is in C# or VB.NET and I don't care if it's LINQ or ADO.NET. As long as it works.
I want to emphasize that I'm not looking for data access technologies or mechanisms (e.g. LINQ, ORM tools, etc.) but rather existing libraries.
If you are talking only to SQL Server, then LINQ to SQL is your best option. It is pretty easy to get up and running. You get both the data layer and the abstraction; all you have to do is provide a connection string to LINQ to SQL and it will handle the rest.
If you are going to connect to databases other than SQL Server, you would want to go with NHibernate.
NHibernate takes a little more work than LINQ to SQL to get up and running. Microsoft provides a nice tool in Visual Studio that can get you reading from a SQL database pretty quickly.
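For instance, a minimal sketch of LINQ to SQL usage (NorthwindDataContext and Customer stand in for whatever the designer or SqlMetal generates from your database):

    using System.Linq;

    // The DataContext and entity classes come from the LINQ to SQL designer/SqlMetal
    using (var db = new NorthwindDataContext("your connection string"))
    {
        // Query
        var londonCustomers = db.Customers
                                .Where(c => c.City == "London")
                                .ToList();

        // Insert
        db.Customers.InsertOnSubmit(new Customer { CompanyName = "Acme" });
        db.SubmitChanges();
    }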
Honestly, as much of a fan as I've always been of NHibernate, with the latest release of the Enterprise Library 5 Data Access Block adding dynamic mapping support natively, I would strongly consider not using NHibernate on a project and instead using a forward database generation tool from my domain objects to create my database (perhaps even using NHibernate solely for the schema export), or something like CodeSmith, and using EntLib.
You can use EasyObjects; it has a very small learning curve and is very extensible.
From their website:
EasyObjects.NET is a powerful data-access architecture for the .NET Framework. When used in combination with code generation, you can create, from scratch, a complete data layer for your application in minutes.
I'd like to find an existing generic data layer implementation which provides standard CRUD functionality, error handling, connection management - the works. I'll be talking to SQL Server only.
You might want to check out SubSonic. Though I personally find it quite limited, it's certainly not an ORM but a "query tool." It will make CRUD operations easy and straightforward, and it generates partial POCO classes for every table in your database, rather than trying to map from a database to a domain layer.
Microsoft's Entity Framework might be what you are looking for to relieve you from writing "the code that talks to the database".
The best things are that it already ships with Visual Studio and - depending on your requirements - you can use most of the functionality out of the box or manually adjust it to your custom business logic via T4 templates.
You can use it for forward and reverse engineering, and being a Microsoft technology it integrates well with other MS products like SQL Server.
I started using it 3 months ago in my current project at work, which is composed of several Windows services and WCF services that convert third-party data into our own database schema. Based on the experience we've had with it, we'll be using EF a lot more in future projects.
What would you expect this framework to do with your exceptions? If it can't connect to your database, what should it do - crash the application, show an error message (WinForms, WPF, or ASP.NET)... the questions are endless.
An ORM such as those suggested elsewhere in these answers is likely to be the closest you're going to get. Expecting a third party framework to provide all your exception handling isn't realistic - how would a third party know how your application is supposed to behave?
The direct answer to your question asking for "an existing generic data layer implementation which provides standard crud functionality, error handling, connection management - the works" is simple: use ADO.NET. The answers everyone else have provided actually go beyond that functionality, but your responses suggest that you think that there's something even further beyond - something that implements your data layer for you. My suggestion is that what you're looking for probably doesn't exist.
When I last worked in programming, we were trying to move away from DataReaders and the traditional ADO.NET API toward Object Relational Mapping (ORM).
To do this, we generated a DataContext of our DB via sqlmetal. There was then a thin data layer that made the DataContext private, and any code needing to access the database would have to use a public method in this thin data layer. These methods were basically stored procedures; they would perform queries on the database via LINQ to SQL.
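A rough sketch of the kind of wrapper I'm describing (the names are illustrative, not our actual code):

    using System.Collections.Generic;
    using System.Linq;

    // Thin data layer: the sqlmetal-generated DataContext stays private;
    // callers only see these stored-procedure-like methods.
    public static class OrderData
    {
        public static List<Order> GetOrdersForCustomer(int customerId)
        {
            using (var db = new MyDataContext("your connection string"))   // generated by sqlmetal
            {
                return db.Orders
                         .Where(o => o.CustomerId == customerId)
                         .ToList();
            }
        }

        public static void AddOrder(Order order)
        {
            using (var db = new MyDataContext("your connection string"))
            {
                db.Orders.InsertOnSubmit(order);
                db.SubmitChanges();
            }
        }
    }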
Is this a common approach today? I mean, is everyone who's using the .NET 3.5 Framework really running sqlmetal in their build process, or what? It almost seemed like a hack at the time.
Basically, I'd like to know if LINQ to SQL and sqlmetal are what to expect if I'm going to write a DAL today at a .NET 3.5 shop that doesn't employ a third-party, open-source ORM.
It is still considered best practice to have some sort of data access layer. Whether this is best achieved with an ORM is a heavily debated issue. One faction generally argues that ORMs are the way to go; another argues that stored procedures and a database-centric approach are the best route.
Also, this may not be exactly the poster you meant, but it is similar (and also the one in my cubicle):
http://download.microsoft.com/download/4/a/3/4a3c7c55-84ab-4588-84a4-f96424a7d82d/NET35_Namespaces_Poster_LORES.pdf
Your approach is good. I currently use Astoria services (ADO.NET Data Services). There was a nice introduction to this in MSDN Magazine.
I also like the new PLINQO (requires CodeSmith Tools though). This is very slick in my opinion.
When I have such a DAL (service layer), I just consume this service from my client application (Silverlight or ASP.NET MVC).
I think it depends on your use, but I'd say that with such a thin data layer as you explained, that would be your DAL. Most projects will build another layer on top of that, mainly for edit/create logic and maybe some stitching logic for gets.
For most of my projects I design it like this.
Repository holds the instance of DataContext and exposes some basic add/delete methods
ProductRepository : Repository exposes general queries (IQueryable)
StoreService uses an instance of different repositories like ProductRepository, SalesRepository and handles all logic for creating something like a product.
So something like...
StoreService.CreateProduct(/* properties */)
This would return some sort of result class.
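A stripped-down sketch of that layering (all of the names, and the result class, are made up for illustration; StoreDataContext and Product stand in for the LINQ to SQL generated classes):

    using System.Linq;

    // Base repository: holds the DataContext and exposes basic add/delete
    public abstract class Repository<T> where T : class
    {
        protected readonly StoreDataContext Db;

        protected Repository(StoreDataContext db)
        {
            Db = db;
        }

        public void Add(T entity)    { Db.GetTable<T>().InsertOnSubmit(entity); }
        public void Delete(T entity) { Db.GetTable<T>().DeleteOnSubmit(entity); }
    }

    // ProductRepository exposes general queries as IQueryable
    public class ProductRepository : Repository<Product>
    {
        public ProductRepository(StoreDataContext db) : base(db) { }

        public IQueryable<Product> All()
        {
            return Db.Products;
        }
    }

    // Simple result class returned by the service
    public class CreateProductResult
    {
        public bool Succeeded { get; set; }
        public string Error { get; set; }
    }

    // StoreService owns the creation logic and coordinates repositories
    public class StoreService
    {
        private readonly StoreDataContext _db;
        private readonly ProductRepository _products;

        public StoreService(StoreDataContext db)
        {
            _db = db;
            _products = new ProductRepository(db);
        }

        public CreateProductResult CreateProduct(string name, decimal price)
        {
            if (_products.All().Any(p => p.Name == name))
                return new CreateProductResult { Succeeded = false, Error = "Product already exists" };

            _products.Add(new Product { Name = name, Price = price });
            _db.SubmitChanges();
            return new CreateProductResult { Succeeded = true };
        }
    }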
The best data layer is the one that is plain and simple and gets the job done without any bells and whistles. I have used the technologies you mentioned and written about them here:
The Only Pattern for Data Access is - There Are No Patterns for Data Access
This very site uses LINQ to SQL, so take that as you will.
Officially, Microsoft is supporting Entity Framework over LINQ to SQL in terms of new development. However, there's a vocal group of people who think EF is the wrong way to go. LINQ to SQL will still be around for some time, and is a very decent ORM, if somewhat limiting in terms of which DB backend you can use.
I would recommend LINQ as a great starting point for your ORM. If you need better, look into EF and/or NHibernate.
"Is this a common approach today? I mean, is everyone whose using the .NET 3.5 framework really running sqlmetal in their build process, or what?"
The people I know using the 3.5 Framework (and that's just about everyone) - the vast majority - are still using NHibernate. Version 2.0 is a very nice OR/M. I started using it on a recent project and it cut my data access code down significantly, to the point where I really don't want to use anything else in the future. And the Fluent NHibernate API is making some headway for folks who don't like the XML mapping.
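For anyone who hasn't seen it, a Fluent NHibernate class map replaces the XML mapping file with a few lines of C# (the entity, property, and table names here are made up):

    using FluentNHibernate.Mapping;

    public class Product
    {
        public virtual int Id { get; set; }        // NHibernate needs virtual members for proxying
        public virtual string Name { get; set; }
        public virtual decimal Price { get; set; }
    }

    // Replaces the Product.hbm.xml mapping file
    public class ProductMap : ClassMap<Product>
    {
        public ProductMap()
        {
            Table("Products");
            Id(x => x.Id);
            Map(x => x.Name);
            Map(x => x.Price);
        }
    }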
(EDIT: I made it a community wiki as it is more suited to a collaborative format.)
There are a plethora of ways to access SQL Server and other databases from .NET. All have their pros and cons and it will never be a simple question of which is "best" - the answer will always be "it depends".
However, I am looking for a comparison at a high level of the different approaches and frameworks in the context of different levels of systems. For example, I would imagine that for a quick-and-dirty Web 2.0 application the answer would be very different from an in-house Enterprise-level CRUD application.
I am aware that there are numerous questions on Stack Overflow dealing with subsets of this question, but I think it would be useful to try to build a summary comparison. I will endeavour to update the question with corrections and clarifications as we go.
So far, this is my understanding at a high level - but I am sure it is wrong...
I am primarily focusing on the Microsoft approaches to keep this focused.
ADO.NET Entity Framework
Database agnostic
Good because it allows swapping backends in and out
Bad because it can hit performance and database vendors are not too happy about it
Seems to be MS's preferred route for the future
Complicated to learn (though, see 267357)
It is accessed through LINQ to Entities, so it provides an ORM, allowing abstraction in your code
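For example, a typical LINQ to Entities query looks something like this (the context and entity names are placeholders for whatever your model generates):

    using System;
    using System.Linq;

    // MyEntities is the context generated from the .edmx model
    using (var context = new MyEntities())
    {
        var recentOrders = (from o in context.Orders
                            where o.OrderDate >= new DateTime(2009, 1, 1)
                            orderby o.OrderDate descending
                            select o).ToList();
    }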
LINQ to SQL
Uncertain future (see Is LINQ to SQL truly dead?)
Easy to learn (?)
Only works with MS SQL Server
See also Pros and cons of LINQ
"Standard" ADO.NET
No ORM
No abstraction, so you are back to "rolling your own" and playing with dynamically generated SQL (a bare-bones snippet follows at the end of this section)
Direct access, allows potentially better performance
This ties in to the age-old debate of whether to focus on objects or relational data, to which the answer, of course, is "it depends on where the bulk of the work is", and since that is an unanswerable question hopefully we don't have to go into it too much. IMHO, if your application is primarily manipulating large amounts of data, it does not make sense to abstract it too much into objects in the front-end code; you are better off using stored procedures and dynamic SQL to do as much of the work as possible on the back-end. Whereas if you primarily have user interaction which causes database interaction at the level of tens or hundreds of rows, then ORM makes complete sense. So I guess my argument for good old-fashioned ADO.NET would be the case where you manipulate and modify large datasets, in which case you will benefit from direct access to the backend.
Another case, of course, is where you have to access a legacy database that is already guarded by stored procedures.
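To make the "rolling your own" point concrete, a bare ADO.NET call looks roughly like this (the connection string, procedure, and column names are placeholders):

    using System.Data;
    using System.Data.SqlClient;

    using (var conn = new SqlConnection("your connection string"))
    using (var cmd = new SqlCommand("dbo.GetOrdersForCustomer", conn))
    {
        cmd.CommandType = CommandType.StoredProcedure;
        cmd.Parameters.AddWithValue("@CustomerId", 42);

        conn.Open();
        using (SqlDataReader reader = cmd.ExecuteReader())
        {
            while (reader.Read())
            {
                // Column-to-object mapping is done by hand
                int orderId = reader.GetInt32(reader.GetOrdinal("OrderId"));
            }
        }
    }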
ASP.NET Data Source Controls
Are these something altogether different or just a layer over standard ADO.NET?
Would you really use these if you had a DAL, or if you implemented LINQ or Entities?
NHibernate
Seems to be a very powerful ORM?
Open source
Some other relevant links;
NHibernate or LINQ to SQL
Entity Framework vs LINQ to SQL
I think LINQ to SQL is good for projects targeting SQL Server.
ADO.NET Entity Framework is better if we are targeting different databases. Currently I think a lot of providers are available for the ADO.NET Entity Framework - providers for PostgreSQL, MySQL, esql, Oracle, and many others (check http://blogs.msdn.com/adonet/default.aspx).
I don't want to use standard ADO.NET anymore because it's a waste of time. I always go for an ORM.
Having worked on 20+ different C#/ASP.NET projects, I always end up using NHibernate. I often start with a completely different stack - ADO.NET, ActiveRecord, hand-rolled weirdness. There are numerous reasons why NHibernate can work in a wide range of situations, but the absolute standout for me is the saving in time, especially when linked to code generation. You can change the data model and the entities get rebuilt, but most/all of the other code doesn't need to change.
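To give a flavour of why the surrounding code survives data model changes, typical NHibernate calls never mention tables or columns directly - a minimal sketch (Product is a placeholder entity with a matching mapping):

    using NHibernate;
    using NHibernate.Cfg;

    // Built once at startup from hibernate.cfg.xml (or Fluent mappings)
    ISessionFactory factory = new Configuration().Configure().BuildSessionFactory();

    using (ISession session = factory.OpenSession())
    using (ITransaction tx = session.BeginTransaction())
    {
        Product product = session.Get<Product>(42);   // load by primary key
        product.Price = 9.99m;

        session.Save(new Product { Name = "New item", Price = 1.00m });
        tx.Commit();                                   // flush changes to the database
    }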
MS does have a nasty habit of pushing technologies in this area that parallel existing open source, and then dropping them when they don't take off. Does anyone remember ObjectSpaces?
Added for new technologies:
With Microsoft SQL Server for Linux out in beta right now, I think it's OK not to be database agnostic. The .NET Core and MS-SQL route allows you to run on Linux servers like Ubuntu entirely, with no Windows dependencies.
As such, IMO, a very good flow is to not use a full ORM framework or data controls, and instead leverage the power of SSDT Visual Studio projects (SQL Server Data Tools) and a micro-ORM.
In Visual Studio you can create a SQL Server project as a proper Visual Studio project. Doing so allows you to create the entire database via table designers or raw query editing right inside Visual Studio.
Secondly, you get SSDT's Schema Compare tool, which you can use to compare your database project to a live database in Microsoft SQL Server and update it. You can sync your Visual Studio project with the server, causing updates in your project to go out to the server, or you can sync the server with your project, causing your source code to update. Via this route you can easily pick up changes the DBA made in maintenance last night and push out your new development changes for a new feature with a simple tool.
Using that same tool you can compute the migration script without actually running it; if you need to pass that off to an operations department and submit a change order, it works for that flow too.
Now, for writing code against your MS-SQL database, I recommend PetaPoco.
PetaPoco works perfectly in line with the above SSDT solution. It comes with T4 text templates you can use to generate all your data entity classes, and it generates the bulk of the data layer classes for you.
The catch is that you have to write queries yourself, which isn't a bad thing.
So you end up with something like this:
var people = dbContext.Fetch<Person>("SELECT * FROM People WHERE Username LIKE @0", "%bob%");
PetaPoco automatically handles parameterizing @0 for you, and it also has a handy Sql class for building queries.
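As a rough example of that Sql builder (the connection string name, table, and class are assumptions on my part):

    // PetaPoco's fluent Sql builder keeps the SQL and its parameters together
    var db = new PetaPoco.Database("MyConnectionStringName");

    var sql = PetaPoco.Sql.Builder
        .Select("*")
        .From("People")
        .Where("Username LIKE @0", "%bob%")
        .OrderBy("Username");

    var people = db.Fetch<Person>(sql);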
Furthermore, PetaPoco is an order of magnitude faster than EF6 and 8+ times faster than EF7.
So in total, this solution uses SSDT for schema management and PetaPoco for code integration, and gives you high maintainability, customization, and very good performance.
The only downfall to this approach is that you're tying yourself hard to Microsoft SQL Server. However, IMO, Microsoft SQL Server is one of the best RDBMSs out there.
It's got DBMail, jobs, CLR object capabilities, and so on. Plus, the integration between Visual Studio and MS-SQL Server is phenomenal, and you don't get any of that if you choose a different RDBMS.
I must say that I never used NHibernate because of the immense time needed to get started... time wasted on the XML setup.
I recently did a web application in MVC 2, where I chose the ADO.NET Entity Framework, and I use LINQ all the time.
I must say I was impressed with the speed! Our site was getting around 35,000 unique visitors per day, at around 60 GB of bandwidth per day (I radically reduced that 60 GB figure by hosting all static files on Amazon S3 - great .NET wrapper they have, I must say).
I will always go this way. It's easy to start (just add a new data item, choose the tables, and that's it! For every change in the database we just need to refresh the model - done automatically in just two clicks) and it's fun to use - LINQ rules!
With LINQ to SQL and Entity Framework on the market, does it make sense to use the Enterprise Library Data Access Application Block to design a data access layer (DAL)?
Thanks.
That's like asking "Should I use a Dremel rotary tool or an Ingersoll Rand industrial sandblasting rig?"
Can you describe what your application does and where it'll be used?
It really depends on what you are doing.
A lot of what I am writing is against existing stored procedures and other similar items. I find that SqlHelper from the Application Blocks fits my needs quite well, and I haven't been compelled to change.
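For reference, the SqlHelper style of call I mean is roughly this (the connection string, stored procedure, and parameter are made up):

    using System.Data;
    using System.Data.SqlClient;
    using Microsoft.ApplicationBlocks.Data;   // the old Data Access Application Block

    // Calls an existing stored procedure and returns the results as a DataSet
    DataSet orders = SqlHelper.ExecuteDataset(
        "your connection string",
        CommandType.StoredProcedure,
        "dbo.GetOrdersForCustomer",
        new SqlParameter("@CustomerId", 42));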
I have been using LINQ to SQL and it is great. That said, it can tie you to SQL Server (although there are third-party implementations that enable LINQ to other database systems). Entity Framework is rather new, but it doesn't have the same restriction.
I recommend going with either of those.