Database validation in application - C#

I have a C# application add-on from which I need to validate the schema of a database. I can't use any of the obvious frameworks due to limitations of the application I'm extending, so I need an alternative method to confirm whether database procedures exist and whether they need to be updated (i.e. whether a procedure has changed from what is expected). Aside from writing individual queries for each procedure, are there any better solutions I might consider?

There is a database project type in Visual Studio (SSDT). You can import your database into such a project and try to build it. Please see the official documentation:
https://learn.microsoft.com/en-us/sql/ssdt/import-into-a-database-project?view=sql-server-2017

<twocents>
Export all of your objects to .sql files, commit them to source control, and write them out into a folder on disk.
Create a tracker table that maps each exported object to a hash of its DDL.
Inspect the objects in the database when your app initialises, or at some other appropriate time.
Compare the hash of the DDL of each database object with the hash of what is on disk. If there is a disparity, drop and recreate the object by executing the DDL from the file on disk (a sketch of the comparison follows below).
The hash acts as a fast way to check that what's in the database matches the structure of the object you expect to be there. How you deploy it is an open question, but the assumption I am making is that you are in control of the database objects you have exposed to the application.
</twocents>
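
A minimal sketch of the hash comparison, assuming the current DDL can be read from sys.sql_modules and that the exported files live one-per-object in a folder (the file naming convention here is hypothetical):

using System;
using System.IO;
using System.Security.Cryptography;
using System.Text;
using System.Data.SqlClient;

static bool DefinitionMatches(SqlConnection conn, string schema, string name, string scriptFolder)
{
    // Read the DDL currently in the database from sys.sql_modules.
    const string sql = @"
        SELECT m.definition
        FROM sys.sql_modules m
        JOIN sys.objects o ON o.object_id = m.object_id
        WHERE o.name = @name AND SCHEMA_NAME(o.schema_id) = @schema;";

    string dbDefinition;
    using (var cmd = new SqlCommand(sql, conn))
    {
        cmd.Parameters.AddWithValue("@name", name);
        cmd.Parameters.AddWithValue("@schema", schema);
        dbDefinition = cmd.ExecuteScalar() as string; // null when the object is missing
    }
    if (dbDefinition == null) return false;

    // Hypothetical layout: one .sql file per object, named schema.object.sql.
    string fileDefinition = File.ReadAllText(Path.Combine(scriptFolder, schema + "." + name + ".sql"));

    return Hash(dbDefinition) == Hash(fileDefinition);
}

static string Hash(string ddl)
{
    // Normalise line endings and surrounding whitespace before hashing,
    // or trivial formatting differences will force needless rebuilds.
    string normalised = ddl.Replace("\r\n", "\n").Trim();
    using (var sha = SHA256.Create())
        return Convert.ToBase64String(sha.ComputeHash(Encoding.UTF8.GetBytes(normalised)));
}

If the hashes differ (or the object is missing entirely), execute the file's DDL to drop and recreate the object.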

Related

Verify that target database schema complies with what's in Entity Framework?

We have a process where our database guys script changes (and version them using Juneau) to our application's database out-of-band with our code base. They're good at accounting for new columns being null, and not wiping existing data, but occasionally a column rename sneaks in that isn't fully communicated. So they will make some changes to the database schema on a testing server, we'll update Entity Framework to work with those changes, and then commit our code. This process works okay, except for when it's time to deploy.
We have TFS set up to deploy the successful build to the appropriate servers, but there's no guarantee that the database for that environment has been updated. We don't care if extra fields/tables/views/etc. exist in the target database, but we want to change the build to check that the database contains at least everything EF is aware of.
I looked at this question, but I don't need the schema to match exactly. Plus, we don't want it creating/modifying the database directly. And this question seems like it's trying to achieve a similar ideal, but it's still not quite what we're after. We just want an integration test of sorts to verify that our version of EF will work with the target schema.
I wonder why you deploy your application without the corresponding changes to the database. Your application depends on the database, so the deployment should always be done after the database has been updated. It looks like you are going to invest a lot of time developing validation to work around an incorrect deployment process (where fixing the process itself is the correct solution).
Anyway, you can create some "validation" of the database, but it will take some time. If you are using an EDMX file you can open it as XML and read its SSDL part, which describes all expected tables, columns, relations, views (in the form of SELECT SQL queries), stored procedures and functions. You can parse this XML and use the system catalog views (sys.tables, sys.columns, ...) to check whether these objects exist in the database (a sketch follows below).
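A minimal sketch of that idea, checking SSDL EntityType names against sys.tables. The SSDL XML namespace below is the EF4-era one and varies by EF version, so treat it as an assumption; a fuller check would also walk each Property element against sys.columns and honour the EntitySet's Table/Schema attributes:

using System;
using System.Data.SqlClient;
using System.Linq;
using System.Xml.Linq;

static void CheckExpectedTables(string edmxPath, SqlConnection conn)
{
    // SSDL namespace for EF4; older/newer EDMX files use different URIs.
    XNamespace ssdl = "http://schemas.microsoft.com/ado/2009/02/edm/ssdl";

    var edmx = XDocument.Load(edmxPath);

    // Every EntityType in the SSDL section corresponds to a store table (or view).
    var expectedTables = edmx.Descendants(ssdl + "EntityType")
                             .Select(e => (string)e.Attribute("Name"))
                             .ToList();

    foreach (var table in expectedTables)
    {
        using (var cmd = new SqlCommand(
            "SELECT COUNT(*) FROM sys.tables WHERE name = @name", conn))
        {
            cmd.Parameters.AddWithValue("@name", table);
            if ((int)cmd.ExecuteScalar() == 0)
                Console.WriteLine("Missing table: " + table);
        }
    }
}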
Another approach is to use a database diff tool to compare your current test database with the target one. This requires a tool that can be executed from the command line, and you will have to parse its output to find breaking changes.

How should I generate Database table on a fresh application install?

I've been working on a CMS based on ASP.NET MVC2, using a POCO-based LINQ to SQL repository connected to a SQL Server 2008 database.
I'm nearing completion on the CMS and now I'm thinking about deployment. Ideally, I would like the install process to be something like this:
Copy CMS solution to server location
Create empty database
Change db connection string in web.config to new database
Run script to create db tables (including relationships, constraints, and default data).
I'm not sure where to start with this kind of project. I have limited experience working with stored procedures. If it's possible to do this programmatically in C#, that would be preferable, because it would be easier for me to work with.
Any thoughts on where I should start with this?
Edit - New thoughts
I wonder how difficult it would be to just have a database file (.mdf?) that could be renamed and copied to the SQL server. Is that even possible?
I would suggest scripting out the creation of your new database and including it as a step in your application install.
You can let SQL Server Management Studio do a lot of the work of generating the script for you. Just right-click your database and select:
Tasks -> Generate Scripts
In that wizard, select your database and choose "Script all objects in the selected database". That will generate a create script for your entire DB.
You can then include that script file as a resource in your setup program and run it during setup to create your full DB (a sketch of this follows below).
A good thing to think about if you are going this route is type (lookup) data. You may want to include another script step to populate it, as the first script will only generate your DB, tables, and procs.
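A minimal sketch of running such a script at install time, assuming the generated .sql file is embedded as a resource (the resource name here is hypothetical). SqlCommand cannot execute the GO batch separator, so the script has to be split into batches first:

using System;
using System.Data.SqlClient;
using System.IO;
using System.Reflection;
using System.Text.RegularExpressions;

static void RunCreateScript(string connectionString)
{
    // Hypothetical embedded resource name for the generated script.
    var asm = Assembly.GetExecutingAssembly();
    string script;
    using (var stream = asm.GetManifestResourceStream("Setup.CreateDb.sql"))
    using (var reader = new StreamReader(stream))
        script = reader.ReadToEnd();

    // GO is understood by SSMS/sqlcmd, not by SqlCommand, so split
    // the script on lines that contain only GO.
    string[] batches = Regex.Split(script, @"^\s*GO\s*$",
                                   RegexOptions.Multiline | RegexOptions.IgnoreCase);

    using (var conn = new SqlConnection(connectionString))
    {
        conn.Open();
        foreach (var batch in batches)
        {
            if (string.IsNullOrWhiteSpace(batch)) continue;
            using (var cmd = new SqlCommand(batch, conn))
                cmd.ExecuteNonQuery();
        }
    }
}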
The two ways I can think of doing this off the top of my head are:
1) Web Setup projects - similar to this post here, with your own custom actions added
2) Make it so that the first time your application runs, it searches for a DB and, if one is not found, gives you the option to add the DB/connection string (see the sketch below).
Both of these can be done in C#. You won't be using stored procedures until you have a DB set up, because that is where the sprocs reside.
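A minimal sketch of the check in option 2, assuming a connection to the server's master database and a hypothetical database name:

using System;
using System.Data.SqlClient;

static bool DatabaseExists(string masterConnectionString, string dbName)
{
    using (var conn = new SqlConnection(masterConnectionString))
    {
        conn.Open();
        // db_id() returns NULL when the database does not exist.
        using (var cmd = new SqlCommand("SELECT db_id(@name)", conn))
        {
            cmd.Parameters.AddWithValue("@name", dbName);
            return cmd.ExecuteScalar() != DBNull.Value;
        }
    }
}

// At first run: if DatabaseExists(...) is false, prompt the user for
// connection details and run the creation script from the answer above.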
Embedding the database is probably your best bet. You can use an .mdb, SQLite, or similar. If you're using an .mdb, just create a template .mdb and make it a resource of the project. Then you can copy it to the destination and modify it from there.
I would use SQLite, as it's open source, and then you don't have to mess with changing the connection string because you'll just use the program's install directory to store it.

How do I use SMO to drop and recreate all views in a Database?

I recently copied a MSSQL2000 DB to a MSSQL2008 DB. Afterwards I ran a script to update all the text/varchar fields in the tables to the same collation as the model DB. However, all my copied views still use the same collation as the previous DB. The easiest way I found to fix this would be to have SQL Server Management Studio create the DROP/CREATE scripts for these views. However, certain views depend on other views, so you need to make sure to DROP/CREATE them in the proper order.
So my question is:
How would I create a script that traverses, depth first, the dependencies of each view and then on the return trip drops and creates each view?
I'm assuming that this will use SMO. I will be writing the script in C#.
Can you not just iterate through the views and execute sp_refreshview on each one dynamically? This will achieve the same result: updating the view for the new base-table definition (a sketch follows below).
You aren't using WITH SCHEMABINDING (otherwise you could not have changed the tables), so dependency order and depth do not matter.
In cases like this, just run the script a bunch of times (F5 in SSMS); eventually all the child views will exist by the time they are used by a parent. You can tell when everything is good, as there will be no errors.
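A minimal C# sketch of that loop, assuming you can connect to the database directly:

using System;
using System.Collections.Generic;
using System.Data;
using System.Data.SqlClient;

static void RefreshAllViews(string connectionString)
{
    using (var conn = new SqlConnection(connectionString))
    {
        conn.Open();

        // Collect every non-system view, schema-qualified.
        var views = new List<string>();
        using (var cmd = new SqlCommand(
            @"SELECT QUOTENAME(SCHEMA_NAME(schema_id)) + '.' + QUOTENAME(name)
              FROM sys.views WHERE is_ms_shipped = 0", conn))
        using (var reader = cmd.ExecuteReader())
            while (reader.Read())
                views.Add(reader.GetString(0));

        // sp_refreshview re-binds each view's metadata to the current
        // base-table definitions (including the new collation).
        foreach (var view in views)
        {
            using (var cmd = new SqlCommand("sys.sp_refreshview", conn))
            {
                cmd.CommandType = CommandType.StoredProcedure;
                cmd.Parameters.AddWithValue("@viewname", view);
                cmd.ExecuteNonQuery();
            }
        }
    }
}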
Try DBSourceTools. http://dbsourcetools.codeplex.com.
This utility uses SMO to script all database objects and data to disk as DROP / CREATE scripts.
It also automatically builds a dependency tree and will re-create database objects in the required order (a DependencyWalker sketch follows at the end of this answer).
Once you have scripted your 2000 DB to disk, create a "Deployment Target" database for your 2008 database.
You can then create a full set of patches to change / upgrade / rename all of your views.
DBSourceTools will re-build your database for you, and then apply all patches in order.
It's a reliable, repeatable method of version control for databases, and allows you to test and version control these patches.
When you are ready for release, simply send all of the patches to your DBA, and get him to run them in order.
Have fun.
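For reference, the dependency ordering can also be done directly from C# with SMO's DependencyWalker. A minimal sketch, assuming the SMO assemblies (Microsoft.SqlServer.Smo, Microsoft.SqlServer.ConnectionInfo, Microsoft.SqlServer.Management.Sdk.Sfc) are referenced:

using System;
using System.Collections.Generic;
using Microsoft.SqlServer.Management.Sdk.Sfc;
using Microsoft.SqlServer.Management.Smo;

static void PrintViewsInDependencyOrder(string serverName, string dbName)
{
    var server = new Server(serverName);
    var db = server.Databases[dbName];

    // Gather the Urns of all user views.
    var urns = new List<Urn>();
    foreach (View v in db.Views)
        if (!v.IsSystemObject)
            urns.Add(v.Urn);

    // Build the dependency tree and flatten it into dependency order
    // (parents first). Script DROPs in reverse order, CREATEs in this order.
    var walker = new DependencyWalker(server);
    DependencyTree tree = walker.DiscoverDependencies(urns.ToArray(), DependencyType.Parents);
    DependencyCollection ordered = walker.WalkDependencies(tree);

    foreach (DependencyCollectionNode node in ordered)
        Console.WriteLine(node.Urn);
}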

Converting project to SQL Server, design thoughts?

Currently, I'm sitting on an ugly business application written in Access that takes a spreadsheet on a bi-daily basis and imports it into an MDB. I am currently converting a major project that includes this into SQL Server and .NET, specifically C#.
To house this information there are two tables (alias names here) that I will call Master_Prod and Master_Sheet, joined on ProdID, an identity key on the parent Master_Prod table. There are also two more tables to store history, History_Prod and History_Sheet. More tables extend off of Master_Prod, but I'm keeping this limited to two tables for explanation purposes.
Since this was written in Access, the subroutine that handles this file is littered with manually coded triggers to deal with history, which have been a constant pain to keep up with - one reason why I'm glad this is moving to a database server rather than a RAD tool. I am writing database triggers to handle the history tracking.
My plan is/was to create an object modeling the spreadsheet, parse the data into it, and use LINQ to do some checks client-side before sending the data to the server... Basically I need to compare the data in the sheet to the matching record (unless none exists, in which case it's new). If any of the fields have been altered, I want to send the update.
Originally I was hoping to put this procedure into some sort of CLR assembly that accepts an IEnumerable list, since I'll have the spreadsheet in this form already, but I've recently learned this is going to be paired with a rather important database server that I am very concerned about bogging down.
Is this worth putting a CLR stored procedure in for? There are other points of entry where data enters and if I could build a procedure to handle them given the objects passed in then I could take a lot of business rule away from the application at the expense of potential database performance.
Basically I want to take the update checking away from the client and put it in the database, so the data system manages whether the table should be updated and the history trigger can fire.
Thoughts on a better way to implement this along the same direction?
Use SSIS. Use Excel Source to read the spreadsheets, perhaps use a Lookup Transformation to detect new items and finally use a SQL Server Destination to insert the stream of missing items into SQL.
SSIS is a far better fit for this kind of job than writing something from scratch, no matter how much fun LINQ is. SSIS packages are easier to debug, maintain, and refactor than some DLL with forgotten sources. Besides, you will not be able to match the refinements SSIS has in managing its buffers for high-throughput data flows.
"Originally I was hoping to put this procedure into some sort of CLR assembly that accepts an IEnumerable list since I'll have the spreadsheet in this form already but I've recently learned this is going to be paired with a rather important database server that I am very concerned with bogging down."
Does not work. Any input into a CLR procedure written in C# still has to follow normal SQL semantics. All that can change is the internal implementation. Any communication with the client has to be done in SQL, which means procedure executions / method calls. There is no way to directly pass in an enumerable of objects (see the sketch below).
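To illustrate, a minimal sketch of a SQLCLR stored procedure: the entry point can only take SQL-mappable types (SqlInt32, SqlString, ...), not an arbitrary .NET collection. The Master_Prod/ProdID names come from the question; the Description column is hypothetical:

using System.Data.SqlClient;
using System.Data.SqlTypes;
using Microsoft.SqlServer.Server;

public static class SpreadsheetImport
{
    // Parameters must be SQL types; you cannot declare an
    // IEnumerable<SpreadsheetRow> parameter on a CLR procedure.
    [SqlProcedure]
    public static void UpsertRow(SqlInt32 prodId, SqlString description)
    {
        // Inside the procedure you are back to plain SQL over the context connection.
        using (var conn = new SqlConnection("context connection=true"))
        {
            conn.Open();
            using (var cmd = new SqlCommand(
                "UPDATE dbo.Master_Prod SET Description = @d WHERE ProdID = @id", conn))
            {
                cmd.Parameters.AddWithValue("@id", prodId.Value);
                cmd.Parameters.AddWithValue("@d", description.Value);
                cmd.ExecuteNonQuery();
            }
        }
    }
}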
"My plan is/was to create an object modeling the spreadsheet, parse the data into it and use LINQ to do some checks client side before sending the data to the server... Basically I need to compare the data in the sheet to a matching record (Unless none exist, then its new). If any of the fields have been altered I want to send the update."
You probably need to pick a "centricity" for your approach - i.e. data-centric or object-centric.
I would probably model the data appropriately first. This is because relational databases (or even non-normalized models represented in relational databases) will often outlive client tools, libraries, and applications. I would probably start by trying to model in a normal form, and think at the same time about the triggers that maintain the audit/history you mention.
I would then typically think of the data coming in (not an object model or an entity, really). So I focus on the format and semantics of the inputs and see if there is a misfit with my data model - perhaps there were assumptions in my data model which were incorrect. So no, I'm not thinking of making an object model which validates the spreadsheet, even though spreadsheets are notoriously fickle input sources. Like Remus, I would simply use SSIS to bring it in - perhaps to a staging table, with some more validation in T-SQL before applying it to the production tables.
Then I would think about a client tool which had an object model based on my good solid data model.
Alternatively, the object approach would mean modeling the spreadsheet, but also an object model which needs to be persisted to the database - and perhaps you now have two object models (spreadsheet and full business domain) and a database model (storage persistence), if the spreadsheet object model is not as complete as the system's business domain object model.
I can think of an example where I had a throwaway external object model kind of like this. It read a "master file" which was a layout file describing an input file. This object model allowed the program to build SSIS packages (and BCP and SQL scripts) to import/export/do other operations on these files. Effectively it was a throwaway object model - it was not used as the actual model for the data in the rows or any kind of navigation between parent and child rows, etc., but simply an internal representation for internal purposes - it didn't necessarily correspond to a "domain" entity.

How to install and set up a database with .NET?

I am currently working on a project that includes the use of SQL Server. I would like to know what strategy I should use to build the database when I install the software. I need to set up the tables, the stored procedures, and the users.
Does my software need to check on startup whether the database exists and, if it doesn't, create it?
Is there any way I could automate this when I install SQL Server?
Thank you.
EDIT
OK, right now I have plenty of nice solutions, but I am really looking for a solution (free or open source would be awesome) that would allow me to deploy a new application that needs SQL Server to be freshly installed and set up to the needs of the software.
Red Gate offers SQL Packager, which gives you the option to produce a script/.exe that deploys the whole DB, including everything (stored procedures, tables, etc.), from one single .exe. If you can afford it and want an easy way to do it (without having to do it yourself), it's the way to go ;-)
Easy roll-out of database updates across your client base
Script and compress your schema and data accurately and quickly
Package any pre-existing SQL script as a .exe, or launch as a C# project
Simplify deployments and updates for SQL Server 2000, 2005 and 2008
You could use a migration framework like Migrator.Net; then you can run the migrations every time your application starts. The good thing about this approach is that you can update the database whenever you release a new version of your software (a sketch of a migration follows below).
Go take a look at their Getting Started page. It might clear up the concept.
I have successfully used this approach to solve the problem you are confronted with.
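A minimal sketch of what a Migrator.Net migration looks like. Treat the exact API surface here (the Migration base class, the [Migration] attribute, Database.AddTable) as an assumption based on the framework's documentation:

using System.Data;
using Migrator.Framework;

// The version number orders migrations; the framework records which
// versions have been applied and runs only the missing ones on startup.
[Migration(1)]
public class CreateUserTable : Migration
{
    public override void Up()
    {
        Database.AddTable("User",
            new Column("Id", DbType.Int32, ColumnProperty.PrimaryKeyWithIdentity),
            new Column("Name", DbType.String, 100));
    }

    public override void Down()
    {
        Database.RemoveTable("User");
    }
}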
You do all of that with SQL scripts, and then your installation program runs them against the customer's SQL Server.
You can write a T-SQL script that only creates objects when they do not exist. For a database:
if db_id('dbname') is null
create database dbname
For a stored procedure (see MSDN for a list of object types):
if object_id('spname', 'P') is null
exec ('create procedure dbo.spname as select 1')
go
alter procedure dbo.spname
as
<procedure definition>
The good thing about such scripts is that running them multiple times doesn't cause a problem: existing objects are left alone, and the procedure definitions are simply re-applied.
Setting up the server is pretty straightforward if you're using MS SQL Server. As for creating the database and tables, you generally only do this once. The whole point of a database is that the data is persistent, so if there's a chance that the database won't exist you've either got a) major stability problems, or b) no need for an actual database.
Designing the database, tables, and procedures is an entire component of the software development process. When I do this I usually keep all of my creation scripts in source control. After creation you will write the program in such a way that it assumes the database already exists - checking for connectivity is one thing, but the program should never think that there is no database at all.
You can generate a script of all the objects that exist in your DB, and then run that script from your code.
When you create your DB script with the Script Wizard in SQL Server, set "Include If Not Exists" to true in the "Choose Script Options" section. That way the script only creates objects that don't already exist.
