Dealing with Schema Updates in NHibernate/Fluent NHibernate after Deployment - C#

In writing an application that runs on Fluent NHibernate/NHibernate, something has me a bit concerned. I suppose this would be true of any ORM (and even without using an ORM), but what is the ... I guess the word is 'field of study' that covers the best practices and methods for updating a database after deployment?
In NHibernate, I establish a SessionFactory and have an initial run where it writes the database out based on the mappings. That's fine and good; I can even write the database out manually. But what about when my client comes back and wants something new added? Can I append to the database without losing my data? I am completely new to all of this, it has been troubling me since the start of this project, and I really do not know what direction to take to make sure I can manage the program after it is deployed.
I have looked at other Stack Overflow questions that I could find on this topic - one of which did not even have an accepted answer (though the question itself was kind of vague) - but I did discover the tool http://www.red-gate.com/products/sql-development/sql-compare/ from the question "Tool to upgrade SQL Express database after deployment", though I am wondering just how good a 'strategy' that is.

There are a couple of options. One is to use the AutoMapping feature in Fluent NHibernate to minimize the mapping code you write; if your schema changes comply with the AutoMap conventions, then you only need to make the corresponding domain object changes.
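A minimal sketch of that option, combining auto-mapping with NHibernate's SchemaUpdate tool (the MyApp.Domain namespace, Customer entity and connection string are placeholders, not from the question):

using System;
using FluentNHibernate.Automapping;
using FluentNHibernate.Cfg;
using FluentNHibernate.Cfg.Db;
using NHibernate;
using NHibernate.Tool.hbm2ddl;

namespace MyApp.Domain
{
    // Placeholder entity used to locate the assembly to auto-map.
    public class Customer
    {
        public virtual int Id { get; set; }
        public virtual string Name { get; set; }
    }
}

namespace MyApp
{
    // Restrict auto-mapping to the domain namespace so infrastructure classes are ignored.
    public class DomainAutomappingConfiguration : DefaultAutomappingConfiguration
    {
        public override bool ShouldMap(Type type)
        {
            return type.Namespace == "MyApp.Domain";
        }
    }

    public static class SessionFactoryBuilder
    {
        public static ISessionFactory Build(string connectionString)
        {
            return Fluently.Configure()
                .Database(MsSqlConfiguration.MsSql2008.ConnectionString(connectionString))
                .Mappings(m => m.AutoMappings.Add(
                    AutoMap.AssemblyOf<MyApp.Domain.Customer>(new DomainAutomappingConfiguration())))
                // SchemaUpdate issues additive CREATE/ALTER statements only; it will not
                // drop columns or data, so existing rows survive a new column or table.
                .ExposeConfiguration(cfg => new SchemaUpdate(cfg).Execute(false, true))
                .BuildSessionFactory();
        }
    }
}

SchemaUpdate is additive: it creates missing tables and columns to match the mappings but does not drop or alter existing ones, so simple additions keep their data; renames and type changes still need hand-written scripts.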
Another, less optimal, option is to take a database-first approach and have something like MyGeneration automatically generate the domain classes and NHibernate mapping files from the schema. This works if you have complete control of the database schema and it can be made to implement a good domain model design (two conditions that rarely both hold...).
With either approach, these tools can help handle the database scripting needed to "migrate" the schema changes to a new version.

From my experience, after deployment you have to keep your DB structure up to date manually.
That means that whenever you add or change your DB structure, you do so in a script of DDL commands.
When you're ready to deploy, you just run those DDL scripts against your production DB.
For example, if you add a 'bar' column to your 'foo' table, your script would contain something like:
ALTER TABLE foo ADD bar INT NOT NULL DEFAULT 0;

Related

Entity Framework approach for production

We are building a .NET application using Entity Framework as our DB connector. I know all about picking the right approach based on your circumstances, like "do you have an existing database?" and "do you prefer modelling instead of coding?". But after some reading I've found that this isn't the only thing to consider: the upgrade process for the database once it's already in production is really important, especially for us.
So which approach is best for production use with Entity Framework? At the moment we have an existing database. I prefer to use the model and update the database from it, but then we lack functionality such as default values for columns, and the model can be hard to work with in teams, so what we need is basically some best practice here.
For production use: Database First, Model First or Code First?
Someone else might chime in here and tell you to use model migrations with the code-first approach. That may be a solution; it's just not my preference.
We manage an in-motion database using EF code first; however, I would not be able to do it without one hugely beneficial Visual Studio feature: SQL Schema Compare. I believe this feature is only available in the Premium and Ultimate editions of the product.
Each time our model changes, I put 2 copies of the database schema on my local machine: the new version, and the current production version. If you run Schema Compare using the new version as the source and the production version as the target, it will generate a SQL script that you can run against your production db to bring its schema and data in line with the changes.
The SQL it generates often needs some editing before it can be run in production, but it will do a lot of the hard work for you -- disabling constraints, adding/dropping indexes, and moving data from an old table into a new version of it. It will also warn you of potential issues when changing the schema.

Programmatically find and apply schema differences on SQL Server

I have a product that I'm currently authoring that relies on SQL server for the backend. One issue I'm trying to resolve is to improve the 'upgrade' story. So v1 will have a particular schema and v2 may include some enhancements to this schema (new tables and new columns).
I'm aware of the SDKs from Red Gate and ApexSQL, but would like to avoid them.
I've had a read through the SMO docs, but I'm new to it and struggling to see whether it can be applied in this situation. Ideally, I'd like to make this programmatic (SMO or otherwise) - the base cases seem straightforward enough, but I really don't want to re-invent the wheel if I can help it. Does anyone have any experience of similar requirements, or ideas about how I could approach this?
You don't say what version of SQL Server you're using, but in (I think) 2005 and beyond there is the concept of database-level DDL triggers. These work like their table-level cousins but can be used to track any kind of DDL change that happens on the database. We didn't use them to actually generate DDL - more to track when the format of a table changed - although what you're after should be possible, I'd have thought.
Triggers are one of those things that divide developers. Some people think they're the best thing since sliced bread whilst others hate them with a passion. Perhaps because when data changes, these are the last thing you think of.
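As a rough sketch of the idea (the SchemaChangeLog table and trg_LogDdl trigger names are made up for illustration), a database-scoped DDL trigger can record every table change; here the T-SQL is installed from C#:

// Sketch only: creates a change-log table and a database-level DDL trigger that
// records every CREATE/ALTER/DROP TABLE event. All names are illustrative.
using System.Data.SqlClient;

public static class DdlAudit
{
    public static void Install(string connectionString)
    {
        const string createLogTable = @"
CREATE TABLE SchemaChangeLog (
    EventTime datetime NOT NULL DEFAULT GETDATE(),
    EventData xml NOT NULL);";

        const string createTrigger = @"
CREATE TRIGGER trg_LogDdl ON DATABASE
FOR CREATE_TABLE, ALTER_TABLE, DROP_TABLE
AS
    INSERT INTO SchemaChangeLog (EventData) VALUES (EVENTDATA());";

        using (var conn = new SqlConnection(connectionString))
        {
            conn.Open();
            new SqlCommand(createLogTable, conn).ExecuteNonQuery();
            new SqlCommand(createTrigger, conn).ExecuteNonQuery();
        }
    }
}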
Maybe not exactly what you're looking for (since it's not SMO), but having a look at Entity Framework Code First Migrations might help you:
http://msdn.microsoft.com/en-us/data/jj591621
Changes to the model classes can be versioned and either applied directly to a database or, if you do not have direct access to your database, you can generate the SQL for your new version and hand it to your database administrator.
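To make that concrete, a code-first migration generated by Add-Migration looks roughly like this (the class name and the Foo/Bar table and column are hypothetical); running Update-Database applies it, while Update-Database -Script emits the SQL for a DBA instead:

// Sketch of a generated EF code-first migration; the names are placeholders.
using System.Data.Entity.Migrations;

public partial class AddBarColumnToFoo : DbMigration
{
    public override void Up()
    {
        // Adds the new column with a default so existing rows remain valid.
        AddColumn("dbo.Foo", "Bar", c => c.Int(nullable: false, defaultValue: 0));
    }

    public override void Down()
    {
        DropColumn("dbo.Foo", "Bar");
    }
}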
I use Database Projects in Visual Studio to manage versioning of schemas. Once you create a baseline in a Database Project, you can make your changes in the project and then use Schema Compare to create SQL scripts to apply the changes in different environments.
I would recommend doing only additive changes, but it will generate change scripts for destructive changes. If you do not have your environments synced up, I strongly recommend generating a new script for each environment.
This blog post goes over how to create one in Visual Studio 2012: http://candordeveloper.com/2013/01/08/creating-a-sql-server-database-project-in-visual-studio-2012/
Red Gate has a schema compare product too, but I have not really used it.

Verify that target database schema complies with what's in Entity Framework?

We have a process where our database guys script changes (and version them using Juneau) to our application's database out-of-band with our code base. They're good at accounting for new columns being null, and not wiping existing data, but occasionally a column rename sneaks in that isn't fully communicated. So they will make some changes to the database schema on a testing server, we'll update Entity Framework to work with those changes, and then commit our code. This process works okay, except for when it's time to deploy.
We have TFS set up to deploy the successful build to the appropriate servers, but there's no guarantee that the database for that environment has been updated. We don't care if extra fields/tables/views/etc. exist in the target database, but we want to change the build to check that the database contains at least everything EF is aware of.
I looked at this question, but I don't need the schema to match exactly. Plus, we don't want it creating or modifying the database directly. And this question seems like it's trying to achieve a similar ideal, but it's still not quite what we're looking to achieve. We just want an integration test of sorts to verify that our version of EF will work with the target schema.
I wonder why you are trying to deploy your application without the corresponding changes to the database. Your application depends on the database, so the application should always be deployed after the database. It looks like you are going to invest a lot of time developing validation to work around an incorrect deployment process (when fixing the process itself is the correct solution).
Anyway, you can create some "validation" of the database, but it will take some time. If you are using an EDMX file, you can open it as XML and read its SSDL part, which describes all the expected tables, columns, relations, views (in the form of SELECT queries), stored procedures, and functions. You can parse this XML and use the system catalog views (sys.tables, sys.columns, ...) to check whether those objects exist in the database.
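A rough sketch of that idea, assuming an EF 5-era EDMX called Model.edmx (the SSDL namespace varies by EF version) and treating the store EntityType name as the table name, which may need adjusting if the EntitySet maps it to a different table:

// Sketch: parse the SSDL section of an .edmx and verify each expected
// table/column exists in the target database via the system catalog views.
// The file name, connection string and SSDL namespace are assumptions.
using System;
using System.Data.SqlClient;
using System.Xml.Linq;

class EdmxSchemaCheck
{
    static void Main()
    {
        XNamespace ssdl = "http://schemas.microsoft.com/ado/2009/11/edm/ssdl"; // EF 5; older versions use a different year
        var edmx = XDocument.Load("Model.edmx");

        using (var conn = new SqlConnection(@"Data Source=.;Initial Catalog=Target;Integrated Security=True"))
        {
            conn.Open();
            foreach (var entity in edmx.Descendants(ssdl + "EntityType"))
            {
                var table = (string)entity.Attribute("Name");
                foreach (var prop in entity.Elements(ssdl + "Property"))
                {
                    var column = (string)prop.Attribute("Name");
                    using (var cmd = new SqlCommand(
                        @"SELECT COUNT(*) FROM sys.columns c
                          JOIN sys.tables t ON t.object_id = c.object_id
                          WHERE t.name = @table AND c.name = @column", conn))
                    {
                        cmd.Parameters.AddWithValue("@table", table);
                        cmd.Parameters.AddWithValue("@column", column);
                        if ((int)cmd.ExecuteScalar() == 0)
                            Console.WriteLine("Missing in target DB: {0}.{1}", table, column);
                    }
                }
            }
        }
    }
}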
Another approach is to use a database diff tool to compare your current test database with the target one. This requires a tool that can be executed from the command line, and you will have to parse its output to find breaking changes.

Is it possible to update an old database from a dbml file? (C#, .NET 4, LINQ, SQL Server)

I recently began a new job on a very interesting project (C#, .NET 4, LINQ, VS 2010 and SQL Server), and I immediately got a very exciting challenge: I must implement either a new tool or logic that runs at program start (or whatever), but what must happen is the following: the customers have the previous application and database (full of their specific data). Now a new version is ready and the customer gets the update. In the meantime we made some modifications to the DB (a new table, new columns, maybe an old column deleted, or whatever). I'm pretty new to LINQ and to SQL databases, and my first idea was: I check the application/database version and apply all the changes step by step, comparing all tables, columns, keys, constraints, etc. (all the new information I have in my dbml; the old I query from the existing DB). And I'd do this each time the version changes. But somehow I feel this is NOT a smart solution, so I'm looking for a general solution to this problem.
Is there a way to update the customer's DB from the dbml file? Creating a new one is not a problem (CreateDatabase on the DataContext), but are there any update/alter-database methods? I guess I'm not the only one searching for such a solution (I found nothing on the internet - or I searched with the wrong keywords). How did you solve this problem? I'm also open to an external tool, but first I'm looking for a solution with C#, LINQ, or something similar.
Thank you in advance for any ideas!
Best regards,
Emil
What I always do is use Red Gate's SQL Compare to compare the schema of the new database to the schema of the old database. It will generate a change script for you and then you can run that script in code.
We have a table that has a single row in it for program setup information. One of the columns in this table is the database version number. This will instantly tell us what database version the customer has when we do an update. Then we run every script that will update them to the latest version they need to be running. Whenever we release a new version (with database changes), we run the SQL Compare and make a script to go from the previous version to the next. We don't do any scripts that will skip versions, just in case of strange conflicts that may arise from that.
This also gives us the opportunity to do any data massaging we may have to do in between versions by writing a custom script and inserting that into the update scripts. Every update script changes that database version field as well.
This allows us to do a lot of automated updating. Having that database version allows the client to take a peek at that version before the user has a chance to use the application. If it's different and the application needs an update, it will go out to our ftp site and download the update and run the setup automatically.
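A sketch of that pattern, assuming a single-row SetupInfo table with a SchemaVersion column and upgrade scripts named UpdateTo_2.sql, UpdateTo_3.sql, and so on (all of these names are illustrative):

// Sketch of the version-table approach described above: read the stored schema
// version, then run each upgrade script in order until the database is current.
using System.Data.SqlClient;
using System.IO;
using System.Text.RegularExpressions;

public static class DatabaseUpgrader
{
    public static void Upgrade(string connectionString, string scriptFolder, int latestVersion)
    {
        using (var conn = new SqlConnection(connectionString))
        {
            conn.Open();

            var current = (int)new SqlCommand(
                "SELECT SchemaVersion FROM SetupInfo", conn).ExecuteScalar();

            for (var version = current + 1; version <= latestVersion; version++)
            {
                var script = File.ReadAllText(
                    Path.Combine(scriptFolder, string.Format("UpdateTo_{0}.sql", version)));

                // SqlCommand cannot execute the GO batch separator, so split on it.
                foreach (var batch in Regex.Split(script, @"^\s*GO\s*$",
                    RegexOptions.Multiline | RegexOptions.IgnoreCase))
                {
                    if (string.IsNullOrWhiteSpace(batch)) continue;
                    new SqlCommand(batch, conn).ExecuteNonQuery();
                }

                // Record each completed step so a partially upgraded database
                // resumes from the right version on the next attempt.
                new SqlCommand(string.Format(
                    "UPDATE SetupInfo SET SchemaVersion = {0}", version), conn).ExecuteNonQuery();
            }
        }
    }
}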
Basically what you want to be able to do is script the changes - to be able to run "something" that updates one version of the database to the next and also makes any changes to the data required by that change in the schema.
The good news is that you can do this with SQL: you can write DDL statements to create and modify a database schema.
My solution is to put my database schema maintenance entirely in code; I think this is the best version of the write-up I've done so far:
How to create "embedded" SQL 2008 database file if it doesn't exist?
Why in code? Because it works. It may not be the best solution, but it's one I have had some success with, and the results are consistent and repeatable. Oh, and it's version controlled too.
The big problem you may have in this specific instance is that you need to establish a baseline - to make sure that the existing databases are consistent in terms of their schema. This is where more complex and clever tools may serve you better: being able to do a schema diff and then update has a lot of appeal as a concept, for example, but equally you're somewhat dependent on having your reference database perfect, and that raises other issues.

Which ORM will give me compile-tested queries?

Which ORM will give me compile-tested queries?
Is LINQ to SQL compile-time tested?
Edit:
Say I write a query that references a column named 'TotalSales'. I then rename the column in my database to TotalSales2 (and in any other config file, like Employee.cfg.xml in NHibernate).
When I compile the project, I want Visual Studio to tell me the column 'TotalSales' doesn't exist, and then I will go and change it.
There aren't any, as far as I'm aware. For example, they will often let you create a LINQ query that cannot be translated into SQL. Also, I am not aware of any compile-time checking that your mappings map to your database correctly.
You can, and should in my opinion, perform all these checks within tests. Most ORMs make this easy to do.
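For example, NHibernate ships a SchemaValidator that can be run from a unit test to check the mappings against a real database; the sketch below assumes NUnit and a BuildConfiguration helper standing in for however your project builds its NHibernate Configuration:

// Sketch of a mapping-vs-schema test; BuildConfiguration() is a placeholder for
// however your project creates its NHibernate Configuration object.
using NHibernate.Cfg;
using NHibernate.Tool.hbm2ddl;
using NUnit.Framework;

[TestFixture]
public class MappingTests
{
    [Test]
    public void Mappings_match_the_database_schema()
    {
        Configuration cfg = BuildConfiguration();

        // Throws if a mapped table or column is missing from the target database.
        new SchemaValidator(cfg).Validate();
    }

    private static Configuration BuildConfiguration()
    {
        // Placeholder: return the same Configuration the application uses,
        // e.g. via Fluently.Configure()...BuildConfiguration().
        throw new System.NotImplementedException();
    }
}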
I use LLBLGen but it has to be "refreshed" when data model changes are made. I don't think you'll get an ORM that will AT COMPILE TIME check for modifications against the database. You're asking for quite a bit there.
In DataObjects.Net, properties marked with the [Field] attribute are always bound to a field in the database, so you can be sure that the query will be translated. If you use a non-persistent field or another unsupported construct, the query translator will either fail at runtime or perform that operation on the fetched objects (on the client).
Generally, compile-time validation is impossible, or could in theory be performed by special post-build tasks that scan the compiled code, find all the queries, and validate them. But such checks would seriously slow down the compilation process.
Perhaps not exactly what you're looking for, but if you're using Entity Framework and select "Update Model From Database" in the designer, you will get messages saying that fields are no longer mapped if you change the names.
This doesn't happen automatically when you build the project.
Basically, you need two features together:
1. Compile-time-checked queries (i.e. an ORM with a LINQ implementation). This is normally not a problem - at least some tools support it.
2. A pre-build step that updates your entities based on the database schema. AFAIK this is rarely implemented this way - normally you must explicitly update the model from the schema. Note that this part is normally rather costly.
SubSonic can do that if you include the code generation step as a pre-build event.
I used a Java tool called DODS, which was developed ca. 2000 with the Enhydra application server. DODS is still around here: http://www.enhydra.org/tech/dods/
The way DODS works, and the way it meets your goal of compile-time validation, is that it's a code generation tool. It generates Java classes corresponding to the tables in your database. Object instances of these classes have getters and setters for each column in the table. Of course, if you change your database structure, you have to re-generate the Java code using DODS.
As long as you keep the generated code up to date with the structure of your database, it provides compile-time validation that any application code that uses these classes is querying valid tables and columns.
Anyway, I realize you tagged your question with C# and ASP.NET. A tool that generates Java code isn't going to be that helpful for you. But there could be another tool more specifically for .NET that works on the same principle of generating code that maps to database structure. So I'd suggest narrowing your search to .NET ORM tools that say something about code generation.
