I have been using SQL since 1985, so I am very comfortable with DB servers.
I see (C#) Code First as yet another fad that comes and goes. It seems to suit people who have no DBA background. Equally, if you are using Code First and have no idea what DB you will be connecting to (e.g. it might be Mongo later), that abstraction is useful too. But Code First does not lend itself to Database Diagrams, so you cannot see what is going on.
I would like to know how you promote changes into a production SQL Server using Code First, where you have no desire to drop and recreate the DB (as opposed to using ALTER TABLE commands). I have used tools from Red Gate that make DB code promotions easy.
So why Code First?
How do you move DB Changes into production?
The code first approach is something that is really nice to have when you're prototyping applications and "don't care" about the "persistence layer" too much in the beginning. It helps you get quick results at a time when things are not at all well defined yet, because you can always very easily drop and re-create the database of your greenfield project.
Unfortunately, most tools that offer this approach do not really help with transitioning from the "greenfield project mode" to the more long-term "legacy project mode" where database migrations are essential. In fact, not only are database migrations difficult to achieve in "client model first" approaches, but more importantly, the likelihood of the database model not being well designed is quite high.
Developers pay a high price for the quick win - as always. It is really not a good idea, and a lot of more experienced developers who are used to working with legacy systems agree with you: go database first, and derive your client (e.g. ORM) models from it.
I have written about this topic here, including the advantage of using client model source code generation.
With domain-driven design it is best to take tiny steps (change the design or code, unit test, ...).
I think it is good to write the database scripts (i.e. the SQL code) in SQL Server Management Studio, but with DDD the database code is written at the end, after we have tested the design.
If you write the code in C# and then create the database with EF, you will change the C# code frequently, and that will implicitly change the database code a lot.
How best to proceed?
Assume you are working on a brownfield project. Then, for a given user story:
1) Design and unit test your domain model.
2) Then integration-test your infrastructure. This includes testing repository implementations against a database that is created dynamically for these tests (it can be in-memory or embedded). NHibernate can generate the schema for you automatically (see the sketch after this list); I am not sure about EF.
Being persistence-agnostic definitely helps here because you can test against SQLite but run against SQL Server for example.
3) Then manually write migration scripts for your production database. There is no black magic that will help you with this step. The script can later be executed by a framework like RoundhousE. More information here.
Rinse and repeat. For a greenfield project that is not deployed yet, you can skip step 3) and generate a 'baseline' script on first production deployment.
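For step 2), a minimal sketch of the dynamic schema creation with NHibernate (the hibernate.cfg.xml and mapping files are assumed to exist and to point at the throwaway test database):

using NHibernate.Cfg;
using NHibernate.Tool.hbm2ddl;

// Build the NHibernate configuration from hibernate.cfg.xml plus the
// mapping files (assumed to target the test database).
var configuration = new Configuration().Configure();

// Generate and run the CREATE statements against that database:
// don't echo the script to stdout, do execute it, don't just drop.
new SchemaExport(configuration).Execute(false, true, false);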
DDD preaches persistence ignorance, which states that your domain artifacts (classes for entities, value objects) should be unaware of how they're persisted. However, technical persistence concerns cannot always be easily avoided or delayed. As a result, the model in code will usually be affected by constraints of the persistence technology.
You've already foreshadowed the best approach: tiny steps. The question is what constitutes a step. Initial steps can consist of designing the model in code and then implementing persistence; a subsequent step repeats the process. The fact that the steps are small reduces the chance that you'll create a design in code which cannot be easily persisted, all while prioritizing focus on the model over the database.
Regarding the use of SQL Server Management Studio vs EF generators, this is a matter of preference. I prefer to hand-code SQL; others may enjoy the generation facilities of EF.
I would like to know the best method for developing a multi-user C# app using SQL Server 2005 as the database. This is what I have in mind:
using NHibernate or Telerik's OpenAccess ORM
LINQ
using wrappers: all data from the tables is loaded into corresponding objects at startup, and from that point only delete and update transactions affect the database
...
I've looked at ORM tools, but in my opinion they generate a lot of code and I do not know if it's necessary.
What is the best solution, bearing in mind future changes to the application?
If I choose the 3rd option, how can I ensure that only one user modifies a given row in a table (i.e. how can I lock a table row that is being modified)?
Any suggestions or reading material will help!
Thanks!
There are hundreds of ways to solve this, but don't discount ORM. Microsoft's Entity Framework is getting better with every revision. The framework 4.0 bits are pretty good and play extremely well with LINQ.
As for generated code vs your own, try something like EntitySpaces... You have complete control over how the code gets generated, and the data access layer is extremely powerful and flexible (not to mention very easy to use). It also plays nicely with LINQ.
I have written a lot of data access code over the years. In the beginning, the ORM tools were rough around the edges and left a lot to be desired. These tools have gone through many iterations since and have become, in my opinion, indispensable. I can't imagine writing routine after routine that does the same basic CRUD. I did that for years, spent lots of time correcting hardcoded SQL, and vow to avoid it at all costs from here on out.
As for concurrency / locking issues, that's a question unto itself. There are many ways to provide locking (the major categories being optimistic and pessimistic). Each has its pros and cons.
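As a rough sketch of the optimistic flavor against SQL Server (the People table and its RowVersion rowversion column are made up for illustration):

using System.Data;
using System.Data.SqlClient;

// Optimistic locking: only update the row if nobody changed it since we
// read it, by comparing the rowversion value we originally fetched.
public static void UpdatePersonName(
    SqlConnection conn, int id, string newName, byte[] originalVersion)
{
    const string sql =
        "UPDATE People SET Name = @name " +
        "WHERE Id = @id AND RowVersion = @version";

    using (var cmd = new SqlCommand(sql, conn))
    {
        cmd.Parameters.AddWithValue("@name", newName);
        cmd.Parameters.AddWithValue("@id", id);
        cmd.Parameters.AddWithValue("@version", originalVersion);

        // Zero rows affected means another user modified the row first -
        // surface a conflict instead of silently overwriting their change.
        if (cmd.ExecuteNonQuery() == 0)
            throw new DBConcurrencyException("Row was modified by another user.");
    }
}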
If it's multi-user, do NOT do #3. The purpose of a DBMS is to handle the multi-user aspects for you. Everything from transactions to access rights is built right in. Going down the path of mimicking that in your code will be difficult to get right. In the past, some "engines" like Borland's BDE and MS Access did this. The end result is that you end up dealing with little things like data corruption and consistency errors.
Never mind that as your database grows, the application is going to take exponentially longer to start.
We typically stay away from ORM tools for a number of reasons, mostly feature / benefit / security concerns. Of course, we are extremely well versed in SQL and can take advantage of the specific features a given db server can offer, which most ORMs can't do. We also tend to tweak the queries based on performance metrics after product release, which would force a recompile of an app for most ORMs. By staying away from this, we can let production DBAs do their job. That may or may not be a concern of yours.
That said, a lot of dev teams both like and successfully use the ones you mentioned. I would say to skip LINQ to SQL in favor of Entity Framework if you're going that route; LINQ to SQL has all but been replaced by EF.
Save yourself a load of effort and time and use an ORM. In terms of helping you decide which one, there is loads of information/opinion on the web (and StackOverflow!) about which one to use but that'll depend on what your application requirements are (which you haven't described).
I like LINQ to SQL for small/mid-sized apps. It's quick and easy and almost efficient. For bigger apps it'll depend on what types of data transformations and design you have in mind, but LINQ to Entities or NHibernate are probably the most appropriate.
(EDIT: I made it a community wiki as it is more suited to a collaborative format.)
There are a plethora of ways to access SQL Server and other databases from .NET. All have their pros and cons and it will never be a simple question of which is "best" - the answer will always be "it depends".
However, I am looking for a comparison at a high level of the different approaches and frameworks in the context of different levels of systems. For example, I would imagine that for a quick-and-dirty Web 2.0 application the answer would be very different from an in-house Enterprise-level CRUD application.
I am aware that there are numerous questions on Stack Overflow dealing with subsets of this question, but I think it would be useful to try to build a summary comparison. I will endeavour to update the question with corrections and clarifications as we go.
So far, this is my understanding at a high level - but I am sure it is wrong...
I am primarily focusing on the Microsoft approaches to keep this focused.
ADO.NET Entity Framework
Database agnostic
Good because it allows swapping backends in and out
Bad because it can hit performance and database vendors are not too happy about it
Seems to be MS's preferred route for the future
Complicated to learn (though, see 267357)
It is accessed through LINQ to Entities so provides ORM, thus allowing abstraction in your code
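A quick flavor of the LINQ to Entities style (a sketch only; the NorthwindEntities context and Customer entity are assumed to be generated from a model):

using System;
using System.Linq;

// Query through the generated context; the provider translates this LINQ
// expression into SQL for whichever backend is configured.
using (var context = new NorthwindEntities())
{
    var londonCustomers =
        from c in context.Customers
        where c.City == "London"
        orderby c.CompanyName
        select c;

    foreach (var customer in londonCustomers)
        Console.WriteLine(customer.CompanyName);
}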
LINQ to SQL
Uncertain future (see Is LINQ to SQL truly dead?)
Easy to learn (?)
Only works with MS SQL Server
See also Pros and cons of LINQ
"Standard" ADO.NET
No ORM
No abstraction so you are back to "roll your own" and play with dynamically generated SQL
Direct access, allows potentially better performance
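For reference, "rolling your own" looks something like this (a minimal sketch; the connection string and the Customers table are made up):

using System;
using System.Data.SqlClient;

// Plain ADO.NET: you write the SQL, manage the connection, and map the
// reader columns to your own types by hand.
using (var conn = new SqlConnection(connectionString))
using (var cmd = new SqlCommand(
    "SELECT Id, Name FROM Customers WHERE City = @city", conn))
{
    cmd.Parameters.AddWithValue("@city", "London");
    conn.Open();

    using (var reader = cmd.ExecuteReader())
    {
        while (reader.Read())
            Console.WriteLine("{0}: {1}", reader.GetInt32(0), reader.GetString(1));
    }
}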
This ties into the age-old debate of whether to focus on objects or relational data, to which the answer of course is "it depends on where the bulk of the work is"; since that is an unanswerable question, hopefully we don't have to go into it too much. IMHO, if your application is primarily manipulating large amounts of data, it does not make sense to abstract it too much into objects in the front-end code; you are better off using stored procedures and dynamic SQL to do as much of the work as possible on the back-end. Whereas if you primarily have user interaction which causes database interaction at the level of tens or hundreds of rows, then ORM makes complete sense. So I guess my argument for good old-fashioned ADO.NET would be the case where you manipulate and modify large datasets, in which case you will benefit from direct access to the backend.
Another case, of course, is where you have to access a legacy database that is already guarded by stored procedures.
ASP.NET Data Source Controls
Are these something altogether different or just a layer over standard ADO.NET?
Would you really use these if you had a DAL or if you implemented LINQ or Entities?
NHibernate
Seems to be a very powerful ORM?
Open source
Some other relevant links:
NHibernate or LINQ to SQL
Entity Framework vs LINQ to SQL
I think LINQ to SQL is good for projects targeted for SQL Server.
ADO.NET Entity Framework is better if we are targeting different databases. Currently, I think a lot of providers are available for the ADO.NET Entity Framework - providers for PostgreSQL, MySQL, esql, Oracle, and many others (check http://blogs.msdn.com/adonet/default.aspx).
I don't want to use standard ADO.NET anymore because it's a waste of time. I always go for ORM.
Having worked on 20+ different C#/ASP.NET projects, I always end up using NHibernate. I often start with a completely different stack - ADO.NET, ActiveRecord, hand-rolled weirdness. There are numerous reasons why NHibernate can work in a wide range of situations, but the absolute standout for me is the saving in time, especially when linked to code generation. You can change the data model and the entities get rebuilt, but most/all of the other code doesn't need to be changed.
MS does have a nasty habit of pushing technologies in this area that parallel existing open source, and then dropping them when they don't take off. Does anyone remember ObjectSpaces?
Added for new technologies:
With Microsoft SQL Server for Linux out in beta right now, I think it's OK not to be database agnostic. The .NET Core and MS-SQL route allows you to run entirely on Linux servers like Ubuntu, with no Windows dependencies.
As such, IMO, a very good flow is to not use a full ORM framework or data controls, and to leverage the power of SSDT Visual Studio projects (SQL Server Data Tools) and a micro-ORM.
In Visual Studio you can create a SQL Server project as a legit Visual Studio project. Doing so allows you to create the entire database via table designers or raw query editing, right inside Visual Studio.
Secondly, you get SSDT's Schema Compare tool, which you can use to compare your database project to a live database in Microsoft SQL Server and update it. You can sync your Visual Studio project with the server, causing updates in your project to go out to the server, or you can sync the server with your project, causing your source code to update. Via this route you can easily pick up changes the DBA made in maintenance last night, and push out your new development changes for a new feature with a simple tool.
Using that same tool, you can compute the migration script without actually running it. If you need to pass that off to an operations department and submit a change order, it works for that flow too.
Now, for writing code against your MS-SQL database, I recommend PetaPoco.
Because PetaPoco works perfectly in line with the above SSDT solution. PetaPoco comes with T4 text templates you can use to generate all your data entity classes, and it generates the bulk of the data layer classes for you.
The catch is, you have to write queries yourself, which isn't a bad thing.
So you end up with something like this:
var people = db.Fetch<Person>("SELECT * FROM People WHERE Username LIKE @0", "%bob%");
PetaPoco automatically handles parameterizing @0 for you, and it also has the handy Sql class for building queries.
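For instance, a composed query with the Sql builder might look like this (a sketch; Person is assumed to be one of the T4-generated POCOs and db a PetaPoco.Database instance):

// Compose the query in pieces; PetaPoco binds the numbered @0 argument.
var sql = PetaPoco.Sql.Builder
    .Append("SELECT * FROM People")
    .Append("WHERE IsActive = @0", true)
    .Append("ORDER BY Username");

var people = db.Fetch<Person>(sql);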
Furthermore, PetaPoco is an order of magnitude faster than EF6 and 8+ times faster than EF7.
So in total, this solution involves using SSDT for SCHEMA management, and PetaPoco for code integration at the gain of high maintainability, customization, and very good performance.
The only downfall to this approach is that you're tying yourself hard to Microsoft SQL Server. However, IMO, Microsoft SQL Server is one of the best RDBMSs out there.
It's got DBMail, Jobs, CLR object capabilities, and on and on. Plus the integration between Visual Studio and MS-SQL Server is phenomenal, and you don't get any of that if you choose a different RDBMS.
I must say that I never used NHibernate because of the immense time needed to get started... time wasted on the XML setup.
I recently did a web application in MVC2, where I chose the ADO.NET Entity Framework, and I use LINQ all the time.
I must say, I was impressed with the speed! Our site was getting around 35,000 unique visitors per day, with around 60 GB of bandwidth per day (I radically reduced that 60 GB figure by hosting all static files on Amazon S3 - great .NET wrapper they have, I must say).
I will always go this way. It's easy to start (just add a new data item, choose the tables, and that's it! For every change in the database we just need to refresh the model - done automatically in just 2 clicks) and it's fun to use - LINQ rules!
I've been taking a look at some different products for .NET which propose to speed up development time by providing a way for business objects to map seamlessly to an automatically generated database. I've never had a problem writing a data access layer, but I'm wondering if this type of product will really save the time it claims. I also worry that I will be giving up too much control over the database and make it harder to track down any data level problems. Do these type of products make it better or worse in the already tough case that the database and business object structure must change?
For example:
Object Relation Mapping from Dev Express
In essence, is it worth it? Will I save "THAT" much time, effort, and future bugs?
I have used SubSonic and EntitySpaces. Once you get the hang of them, I believe they can save you time, but as the complexity of your app and the volume of data grow, you may outgrow these tools. You start to lose time trying to figure out whether something like a performance issue is related to the ORM or to your code. So, to answer your question, I think it depends. I tend to agree with Eric on this: high-volume enterprise apps are not a good place for general-purpose ORMs, but in standard-fare, smaller CRUD-type apps you might see some time saved.
I've found iBatis from the Apache group to be an excellent solution to this problem. My team is currently using iBatis to map all of our calls from Java to our MySQL backend. It's been a huge benefit as it's easy to manage all of our SQL queries and procedures because they're all located in XML files, not in our code. Separating SQL from your code, no matter what the language, is a great help.
Additionally, iBatis allows you to write your own data mappers to map data to and from your objects to the DB. We wanted this flexibility, as opposed to a Hibernate type solution that does everything for you, but also (IMO) limits your ability to perform complex queries.
There is a .NET version of iBatis as well.
I've recently set up ActiveRecord from the Castle Project for an app. It was pretty easy to get going. After creating a new app with it, I even used MyGeneration to script out class files for a legacy app that ActiveRecord could use, in a pretty short time. It uses NHibernate to interact with the database, but takes away all the XML mapping that comes with NHibernate. The nice thing is, though, since you already have NHibernate in your project, you can use its full power if you have some special cases. I'd suggest taking a look at it.
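For a flavor of what that looks like (a sketch; the Users table and its columns are made up):

using Castle.ActiveRecord;

// Attribute-based mapping replaces the NHibernate XML mapping file.
[ActiveRecord("Users")]
public class User : ActiveRecordBase<User>
{
    [PrimaryKey(PrimaryKeyType.Identity)]
    public int Id { get; set; }

    [Property]
    public string Name { get; set; }
}

// Usage, once the framework is initialized:
// User[] all = User.FindAll();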
There are lots of ORM choices: LINQ to SQL, NHibernate. For pure object databases there is db4o.
It depends on the application, but for a high volume enterprise application, I would not go this route. You need more control of your data.
I was discussing this with a friend over the weekend, and it seems like the gains you make in ease of storage are lost if you need to be able to query the database outside of the application. My understanding is that these databases work by storing your object data in a denormalized fashion. This makes it fast to retrieve entire sets of objects, but if you need to select data from a perspective that doesn't match your object model, the ODBMS might have a hard time getting at the particular data you want.
Currently I am using NetTiers to generate my data access layer and service layer. I have been using NetTiers for over 2 years and have found it to be very useful. At some point I need to look at LINQ so my questions are...
1) Has anyone else gone from NetTiers to LINQ to SQL?
2) Was this switch-over a good or bad thing?
3) Is there anything that I should be aware of?
4) Would you recommend this switch?
Basically, I would welcome any thoughts.
1) No.
2) See #1.
3) You should beware of the standard abstraction overhead. Also, it's very SQL Server based in its current state.
4) Are you using SQL Server? Then maybe. If you are using LINQ for other things right now - like LINQ over XML data (great), object data, or DataSets - then yes, you could switch to have a uniform data syntax for all of them. Like lagerdalek mentioned, if it ain't broke, don't fix it.
From the quick look at the .netTiers Application Framework, I'd say if you already have an investment in that solution, it seems to give you much more than a simple data access layer, and you should stick with it.
From my experience, LINQ to SQL is a good solution for small-to-medium-sized projects. It is an ORM, which is a great way to enhance productivity. It also should give you another layer of abstraction that will allow you to change out the layer underneath for something else. The designer in Visual Studio (and I believe VS Express also) is very easy and simple to use. It gives you the common drag-and-drop and property-based editing of the object mappings.
@Jason Jackson - The designer does let you add properties by hand; however, you need to specify the attributes for that property. But you do this once, and it might take 3 minutes longer than the initial dragging of the table into the designer; it is only necessary once per change in the database itself. This is not too different from other ORMs. However, you are correct that they could make this much easier and find only those properties that have changed, or even implement some kind of refactoring tool for such needs.
Resources:
Why use LINQ to SQL?
Scott Guthrie on LINQ to SQL
10 Tips to Improve your LINQ to SQL Application Performance
LINQ To SQL and Visual Studio 2008 Performance Update
Performance Comparisons LINQ to SQL / ADO / C#
LINQ to SQL 5 Minute Overview
Note that Parallel LINQ is being developed to allow for much greater performance on multi-core machines.
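For a tiny illustration (a sketch; AsParallel() is the PLINQ entry point that partitions the work across cores):

using System.Linq;

// The query shape stays the same; only AsParallel() is added.
int evenCount = Enumerable.Range(0, 1000000)
    .AsParallel()
    .Count(n => n % 2 == 0);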
I tried to use Linq to SQL on a small project, thinking that I wanted something I could generate quickly. I ran into a lot of problems in the designer. For example, anytime you need to add a column to a table you basically have to remove and re-add the table definition in the designer. If you have set any properties on the table then you have to re-set those properties. For me this really slowed down the development process.
LINQ to SQL itself is nice. I really like the extensibility. If they can improve the designer I might try it again. I think that the framework would benefit from a little more functionality aimed at a disconnected model like web development.
Check out Scott Guthrie's LINQ to SQL series of blog posts for some great examples of how to use it.
NetTiers is very good for generating a heavy and robust DAL, and we use it internally for core libraries and frameworks.
As I see it, LINQ (in all its incarnations, but specifically, as I think you're asking, LINQ to SQL) is fantastic for quick data access, and we generally use it for more agile cases.
Both technologies are quite inflexible to change without regeneration of the code or dbml layer.
That being said, used properly, LINQ 2 SQL is quite a robust solution, and you might even start using it for future development due to its ease of use, but I wouldn't throw away your current DAL for it - if it ain't broke...
My experience tells me that by using LINQ you can get things done faster; however, the actual actions against the database are slower.
So... if you have a small database, I'd say go for it. If not, I would wait for some improvements before changing.
I'm using LINQ to SQL on a fairly large project right now (about 150 tables) and it is working out very well for me. The last ORM I used was iBatis, and it worked well but took a lot of legwork to get the mappings done. LINQ to SQL performs very well for me and so far has proved to be very easy to use out of the box. There are definitely some differences you have to overcome in the transition, but I would recommend its use.
Side note: I have never used or read about NetTiers, so I won't discount its effectiveness, but LINQ to SQL in general has proven to be an extremely viable ORM.
Our team used to use NetTiers and found it to be useful. BUT... the more we used it, the more we found headaches and pain points with it. For example, anytime we made a change to the database, we needed to re-generate the DAL with CodeSmith, which involved:
re-generating thousands of lines of code in 3 separate projects
re-generating hundreds of stored procedures
Maybe there are other ways of doing it, but this is what we had to do. The re-gen of the source code was ok - scary, but ok. The real issue came with the stored procedures. It didn't clean up any unused stored procedures, so if you removed a table from your schema and re-gened your DAL, the stored procedures for that table did not get removed. Also, this became quite a headache for database change scripts, where we had to compare the old database structure to the new one and create a change script to update client installations. This script could run into the tens of thousands of lines of SQL code, and if there was an issue executing it, which there invariably was, it was quite a pain to resolve.
Then the light came on: NHibernate as an ORM. It certainly has a ramp-up time, but it is well worth it. There is a ton of support for it, so if there's something you need done, more than likely it's been done before. It is extremely flexible and allows you to control every aspect of it, and then some. It is also becoming easier and easier to use: Fluent NHibernate is up and coming as a great way to get rid of the XML mapping files that are otherwise needed, and NHibernate Profiler provides an excellent interface to see what's going on behind the scenes to increase efficiency and remove redundancy.
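To give a feel for the Fluent NHibernate style (a sketch; the Person class and the People table are made up):

using FluentNHibernate.Mapping;

// A fluent class map that replaces the equivalent hbm.xml mapping file.
public class PersonMap : ClassMap<Person>
{
    public PersonMap()
    {
        Table("People");
        Id(x => x.Id);
        Map(x => x.Name);
        Map(x => x.Email);
    }
}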
Moving from NetTiers to NHibernate has been painful, but in a good way. It has forced us to move to a better architecture and re-evaluate functional needs. NetTiers provided tons of data access code (get this entity by its ID, get this other entity by its foreign key, get a TList and VList of this and that), but most of it was unnecessary and unused. NHibernate with a generic repository, plus custom repositories only where needed, eliminated tons of unused code and really increased readability and reliability.
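A generic repository along those lines can be very small (a sketch; session lifetime management is deliberately left out):

using NHibernate;

// One generic repository covers the routine CRUD; custom repositories add
// query methods only where a use case actually needs them.
public class Repository<T> where T : class
{
    private readonly ISession _session;

    public Repository(ISession session)
    {
        _session = session;
    }

    public T GetById(object id)
    {
        return _session.Get<T>(id);
    }

    public void Save(T entity)
    {
        _session.SaveOrUpdate(entity);
    }

    public void Delete(T entity)
    {
        _session.Delete(entity);
    }
}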