Current method for accessing databases in .NET - C#

So in Java they have a generic JDBC connection layer for connecting to most databases (SQL ones, anyway).
What is the most current equivalent in .NET if I am targeting .NET 4.0 or .NET 4.5 (if it differs)? A link to a tutorial would be nice also.
I did try googling, but came up with quite a few results and different code segments all doing what appears to be the same thing, and I have no idea which ones are current. OLE DB and ADO.NET came up a lot, so I assume it will be one of these. There also appear to be multiple versions of ADO.NET.

ADO.NET is the standard data-access plumbing for most common databases; however, above that, it is pretty common to use tools such as Entity Framework, NHibernate, "dapper", etc. You shouldn't be using OLEDB or ODBC without good reason. There aren't "multiple versions of ADO.NET" - however, there are multiple .NET framework versions, and each of those will have incremental ADO.NET changes.
If you are using a more esoteric RDBMS, you may have to use a different access technology.
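For reference, a minimal sketch of what plain ADO.NET looks like against SQL Server (the connection string, table and column names below are placeholders):

using System;
using System.Data.SqlClient;

class AdoNetExample
{
    static void Main()
    {
        // Placeholder connection string and schema.
        var connectionString = "Server=.;Database=MyDb;Integrated Security=true";

        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand(
            "SELECT Id, Name FROM Customers WHERE Country = @country", connection))
        {
            command.Parameters.AddWithValue("@country", "UK");
            connection.Open();

            using (var reader = command.ExecuteReader())
            {
                while (reader.Read())
                {
                    Console.WriteLine("{0}: {1}", reader.GetInt32(0), reader.GetString(1));
                }
            }
        }
    }
}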

Related

How to connect to a MySQL database inside the .NET framework?

I'm opening a discussion here on a subject for which I couldn't find any answer good enough to be called a final answer: MySQL and .NET.
While I know there are a lot of ways to make this connection, I'm trying to find a list of pros and cons of each approach.
I've been using ADO.NET with the MySQL .NET connector since the beginning of my project, and everything was OK when the database was new and didn't have many records. But now I'm facing a situation where the number of records grows exponentially, and I found another way of querying the database, which is the ODBC connector. Using the ADO.NET + .NET connector solution I had my O/RM and didn't have to write my queries, while ODBC makes my code look awful now (since I didn't switch completely to ODBC, I have LINQ queries and plain SQL queries inside my code).
Is there any solution (free or not) where I can have both an O/RM, without the need to write SQL queries myself, and the speed of ODBC?
What you should be doing is using the MySQL ADO.NET connector and storing your queries in the database in the form of stored procedures. Version 6.0 of the MySQL connector also supports the Entity Framework. If you are interested in using the Entity Framework, check out this link, which describes how to set that up.
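As a rough illustration of the first part (the connection string, procedure name and parameter name below are made up), calling a stored procedure through the ADO.NET connector looks something like this:

using System;
using System.Data;
using MySql.Data.MySqlClient;   // MySQL Connector/NET

class MySqlStoredProcExample
{
    static void Main()
    {
        // Placeholder connection string, procedure and parameter names.
        var connectionString = "Server=localhost;Database=mydb;Uid=user;Pwd=secret;";

        using (var connection = new MySqlConnection(connectionString))
        using (var command = new MySqlCommand("get_orders_by_customer", connection))
        {
            command.CommandType = CommandType.StoredProcedure;
            command.Parameters.AddWithValue("p_customer_id", 42);
            connection.Open();

            using (var reader = command.ExecuteReader())
            {
                while (reader.Read())
                {
                    Console.WriteLine(reader["order_id"]);
                }
            }
        }
    }
}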
NHibernate
Update to Comments
NHibernate Proxy Generators
It is a byte code generator for your object model that allows NHibernate to perform lazy loading and other operations. The link provided explains the benefits.
Castle and LinFu are two different implementations of those Proxy Generators.
While NHibernate does not have consolidated documentation, all the information on how to use it is on the internet. This could be a barrier to usability for some people, though. I understand more about NHibernate because of my past experience with Hibernate.

Is there any way in .NET, other than OLE DB, to be database independent?

Is there a way my DAL classes can be reused across different databases?
I know some technologies (LINQ and EF) support rapid development. I appreciate this feature, but I also want to keep my DAL code reusable across different databases.
A simple thing that comes to mind is the use of OLE DB with inline SQL queries. Is there a more elegant way? Please guide me. I am just considering two things:
Support for the 4 most commonly used databases (SQL Server, MySQL, Access, Oracle).
Rapid development support.
Thanks
You may consider using an ORM framework such as Entity Framework or NHibernate. This way your data access layer will be database agnostic.
If you stick to the interfaces (IDbConnection, IDbCommand, ...) and their factory methods (IDbConnection.CreateCommand) then the only code that needs to know what database you are using is the initial connection creation (which can be encapsulated).
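A minimal sketch of that idea, assuming SQL Server as the concrete provider and a hypothetical Users table; only the factory method knows the concrete connection type:

using System;
using System.Data;
using System.Data.SqlClient;

static class ConnectionFactory
{
    // The only provider-specific line; swap in MySqlConnection, OracleConnection, etc.
    public static IDbConnection Create(string connectionString)
    {
        return new SqlConnection(connectionString);
    }
}

class InterfaceBasedDal
{
    public static int CountUsers(string connectionString)
    {
        using (IDbConnection connection = ConnectionFactory.Create(connectionString))
        {
            connection.Open();
            using (IDbCommand command = connection.CreateCommand())
            {
                command.CommandText = "SELECT COUNT(*) FROM Users"; // placeholder table
                return Convert.ToInt32(command.ExecuteScalar());
            }
        }
    }
}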
Entity Framework does what you want. The most commonly used databases support it, with the exception of Oracle; for Oracle you have to use third-party components, as the official Oracle support for Entity Framework is still in beta.

One library for multiple databases

I'm developing a Windows Forms application using C# 4.0, and that application is going to target different database engines like SQL Server, MySQL and Oracle. I was wondering if there is a library that can talk to all three engines instead of implementing my own layer for each one.
Thanks in advance.
You could use an ORM tool; I like NHibernate, but there are many more: see the list on Wikipedia.
The problem is that if you want to do anything remotely advanced (date arithmetic, generating primary keys, getting the ID of the last inserted record, pivoting a table, using the RANGE construct, etc.), the databases use completely different syntax.
The best solution (in the Java world at least) is either iBATIS or Hibernate. I know there is a .NET version of Hibernate; I am not sure about iBATIS.
These libraries insulate your program from the various SQL dialects and provide a common API independent of the underlying database.
If you use the classes in System.Data.Common you can make your code database independent:
Writing Provider Independent Code in ADO.NET
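A short sketch of that approach; the provider invariant name and connection string would normally come from configuration, and the table name here is a placeholder:

using System;
using System.Data.Common;

class ProviderIndependentExample
{
    static void Main()
    {
        // e.g. "MySql.Data.MySqlClient" or "Oracle.DataAccess.Client" for other providers.
        string providerName = "System.Data.SqlClient";
        string connectionString = "Server=.;Database=MyDb;Integrated Security=true";

        DbProviderFactory factory = DbProviderFactories.GetFactory(providerName);

        using (DbConnection connection = factory.CreateConnection())
        {
            connection.ConnectionString = connectionString;
            connection.Open();

            using (DbCommand command = connection.CreateCommand())
            {
                command.CommandText = "SELECT COUNT(*) FROM Products"; // placeholder table
                Console.WriteLine(command.ExecuteScalar());
            }
        }
    }
}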
I don't know C#, but I know it will have a library for ODBC.
It looks like MS has one here.
It's old, but actually it does the job just fine. Virtually every DB in existence provides an ODBC driver.
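For illustration, a minimal ODBC sketch using System.Data.Odbc (the DSN and table below are placeholders):

using System;
using System.Data.Odbc;

class OdbcExample
{
    static void Main()
    {
        // Placeholder DSN; any database with an ODBC driver works the same way.
        var connectionString = "DSN=MyDataSource;Uid=user;Pwd=secret;";

        using (var connection = new OdbcConnection(connectionString))
        using (var command = new OdbcCommand("SELECT Name FROM Customers WHERE Id = ?", connection))
        {
            command.Parameters.AddWithValue("@Id", 1); // ODBC uses positional '?' markers
            connection.Open();
            Console.WriteLine(command.ExecuteScalar());
        }
    }
}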
Checkout DbLinq.
DbLinq is THE LINQ provider that allows you to use common databases with
an API close to LINQ to SQL. It currently supports (in order of
appearance): MySQL, Oracle, PostgreSQL, SQLite, Ingres, Firebird...
And still SQL Server.

sqlite ado.net provider - which is the most popular to use? (e.g. phxsoftware, devart, mindscape)

Regarding starting to use SQLite with C# in Visual Studio 2008 for a WinForms application: it seems from the SQLite site that you have to download a SQLite ADO.NET provider, and there are numerous ones listed (in the .NET section of SQLite wrappers).
QUESTION - Which is the most popular/robust sqlite wrapper that people are using? Some from the list seem to include:
phxsoftware
devart
mindscape
I'm not sure yet whether I should want or need support for LINQ or EntityFramework. Most fundamental requirement is just one that allows me to work with a few tables worth of data from the sqlite database within VS2008 easily.
Thanks
I have been using phxsoftware's provider a lot with C# applications. It works very fast, even with large tables (tens of thousands of rows). It is also .NET 2.0 compliant, if that is an issue.
As for the free requirement, this package is free.
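A minimal sketch of using that provider, System.Data.SQLite (file name, table and column names are placeholders):

using System;
using System.Data.SQLite;   // the System.Data.SQLite ADO.NET provider

class SqliteExample
{
    static void Main()
    {
        // Placeholder database file and schema.
        using (var connection = new SQLiteConnection("Data Source=app.db"))
        {
            connection.Open();

            using (var command = new SQLiteCommand(
                "SELECT Title FROM Notes WHERE Id = @id", connection))
            {
                command.Parameters.AddWithValue("@id", 1);
                Console.WriteLine(command.ExecuteScalar());
            }
        }
    }
}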

Which is the "best" data access framework/approach for C# and .NET?

(EDIT: I made it a community wiki as it is more suited to a collaborative format.)
There are a plethora of ways to access SQL Server and other databases from .NET. All have their pros and cons and it will never be a simple question of which is "best" - the answer will always be "it depends".
However, I am looking for a comparison at a high level of the different approaches and frameworks in the context of different levels of systems. For example, I would imagine that for a quick-and-dirty Web 2.0 application the answer would be very different from an in-house Enterprise-level CRUD application.
I am aware that there are numerous questions on Stack Overflow dealing with subsets of this question, but I think it would be useful to try to build a summary comparison. I will endeavour to update the question with corrections and clarifications as we go.
So far, this is my understanding at a high level - but I am sure it is wrong...
I am primarily focusing on the Microsoft approaches to keep this focused.
ADO.NET Entity Framework
Database agnostic
Good because it allows swapping backends in and out
Bad because it can hit performance and database vendors are not too happy about it
Seems to be MS's preferred route for the future
Complicated to learn (though, see 267357)
It is accessed through LINQ to Entities so provides ORM, thus allowing abstraction in your code
LINQ to SQL
Uncertain future (see Is LINQ to SQL truly dead?)
Easy to learn (?)
Only works with MS SQL Server
See also Pros and cons of LINQ
"Standard" ADO.NET
No ORM
No abstraction so you are back to "roll your own" and play with dynamically generated SQL
Direct access, allows potentially better performance
This ties in to the age-old debate of whether to focus on objects or relational data, to which the answer of course is "it depends on where the bulk of the work is", and since that is an unanswerable question hopefully we don't have to go into that too much. IMHO, if your application is primarily manipulating large amounts of data, it does not make sense to abstract it too much into objects in the front-end code; you are better off using stored procedures and dynamic SQL to do as much of the work as possible on the back-end. Whereas, if you primarily have user interaction which causes database interaction at the level of tens or hundreds of rows, then ORM makes complete sense. So, I guess my argument for good old-fashioned ADO.NET would be in the case where you manipulate and modify large datasets, in which case you will benefit from the direct access to the backend.
Another case, of course, is where you have to access a legacy database that is already guarded by stored procedures.
ASP.NET Data Source Controls
Are these something altogether different or just a layer over standard ADO.NET?
- Would you really use these if you had a DAL or if you implemented LINQ or Entities?
NHibernate
Seems to be a very powerful ORM?
Open source
Some other relevant links;
NHibernate or LINQ to SQL
Entity Framework vs LINQ to SQL
I think LINQ to SQL is good for projects targeted for SQL Server.
ADO.NET Entity Framework is better if we are targeting different databases. Currently I think a lot of providers are available for the ADO.NET Entity Framework: providers for PostgreSQL, MySQL, esql, Oracle and many others (check http://blogs.msdn.com/adonet/default.aspx).
I don't want to use standard ADO.NET anymore because it's a waste of time. I always go for ORM.
Having worked on 20+ different C#/ASP.NET projects, I always end up using NHibernate. I often start with a completely different stack - ADO.NET, ActiveRecord, hand-rolled weirdness. There are numerous reasons why NHibernate can work in a wide range of situations, but the absolute standout for me is the saving in time, especially when linked to code generation. You can change the data model and the entities get rebuilt, but most/all of the other code doesn't need to be changed.
MS does have a nasty habit of pushing technologies in this area that parallel existing open source, and then dropping them when they don't take off. Does anyone remember ObjectSpaces?
Added for new technologies:
With Microsoft SQL Server for Linux out in beta right now, I think it's OK not to be database agnostic. The .NET Core plus MS-SQL route allows you to run on Linux servers like Ubuntu entirely, with no Windows dependencies.
As such, IMO, a very good flow is to not use a full ORM framework or data controls, and instead leverage the power of SSDT Visual Studio projects (SQL Server Data Tools) and a micro-ORM.
In Visual Studio you can create a Sql Server Project as a legit Visual Studio Project. Doing so allows you to create the entire database via table designers or raw query editing right inside visual studio.
Secondly, you get SSDT's Schema Compare tool, which you can use to compare your database project to a live database in Microsoft SQL Server and update it. You can sync your Visual Studio project with the server, causing updates in your project to go out to the server, or you can sync the server with your project, causing your source code to update. Via this route you can easily pick up changes the DBA made in maintenance last night, and just as easily push out your new development changes for a new feature with a simple tool.
Using that same tool you can compute the migration script without actually running it; if you need to pass that off to an operations department and submit a change order, it works for that flow too.
Now, for writing code against your MS-SQL database, I recommend PetaPoco, because it works perfectly in line with the above SSDT solution. PetaPoco comes with T4 text templates you can use to generate all your data entity classes, and it generates the bulk of the data-layer classes for you.
The catch is that you have to write the queries yourself, which isn't a bad thing.
So you end up with something like this:
var people = dbContext.Fetch<Person>("SELECT * FROM People WHERE Username LIKE @0", "%bob%");
PetaPoco automatically parameterizes @0 for you (note that the wildcards go in the argument value, not the SQL string), and it also has the handy Sql class for building queries.
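For longer queries, the Sql builder keeps the SQL text and its parameters together; a rough sketch using the same hypothetical Person class and dbContext instance as above:

// Sketch only: Person and dbContext are the same hypothetical names as above.
var sql = PetaPoco.Sql.Builder
    .Append("SELECT * FROM People")
    .Append("WHERE Username LIKE @0", "%bob%")
    .Append("ORDER BY Username");

var people = dbContext.Fetch<Person>(sql);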
Furthermore, PetaPoco is an order of magnitude faster than EF6 and 8+ times faster than EF7.
So, in total, this solution involves using SSDT for schema management and PetaPoco for code integration, with the gain of high maintainability, customization, and very good performance.
The only downfall to this approach is that you're tying yourself hard to Microsoft SQL Server. However, IMO, Microsoft SQL Server is one of the best RDBMSs out there.
It's got DBMail, jobs, CLR object capabilities, and so on. Plus, the integration between Visual Studio and MS-SQL Server is phenomenal, and you don't get any of that if you choose a different RDBMS.
I must say that I never used NHibernate, because of the immense time needed to get started... time wasted on the XML setup.
I recently did a web application in MVC 2, where I chose the ADO.NET Entity Framework and I use LINQ all the time.
I must say I was impressed with the speed! Our site was getting around 35,000 unique visitors per day, with around 60 GB of bandwidth per day (I radically reduced this 60 GB figure by hosting all static files in Amazon S3 - a great .NET wrapper they have, I must say).
I will always go this way. It's easy to start (just add a new data item, choose the tables and that's it! For every change in the database we just need to refresh the model - done automatically in just 2 clicks) and it's fun to use - LINQ rules!
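To illustrate that workflow, a rough sketch of querying a designer-generated model with LINQ (MyShopEntities, Orders and their properties are hypothetical names that the designer would generate):

using System;
using System.Linq;

class EfLinqExample
{
    static void Main()
    {
        // MyShopEntities is a hypothetical context generated by the EF designer.
        using (var context = new MyShopEntities())
        {
            var recentOrders = (from o in context.Orders
                                where o.CreatedOn >= DateTime.Today.AddDays(-7)
                                orderby o.CreatedOn descending
                                select o).ToList();

            foreach (var order in recentOrders)
            {
                Console.WriteLine("{0} - {1:C}", order.Id, order.Total);
            }
        }
    }
}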
