DbContext SaveChanges - C#

I'm relatively new to C# and very new to WPF. I've been trying to wrap my head around the concept of MVVM, and I've now thrown the ADO.NET Entity Framework into the mix.
The purpose of my sample application is to track CAD items. I'm pulling items out of a database and successfully populating my view; great. I added the information manually to test that the views are working as they should.
Now I'm trying to add a new item from my application, through a function launched from my ICommand. As I understand it, I'm creating a new DbContext object, adding an item to it and saving my changes. Executing SaveChanges() successfully tells me that 1 row was updated, but when I check, the data isn't there. In addition, if I call SaveChanges() again (within the same debug session) it throws an error indicating that there are multiple entries. Again, when viewing the data via "Show Table Data" I'm seeing nothing.
public void AddNewItem(object parameter)
{
    using (var dbq = new DBEntities())
    {
        var tempItem = dbq.OutstandingCAD.Create();
        tempItem.Id = 2;
        dbq.OutstandingCAD.Add(tempItem);
        dbq.SaveChanges();
    }
}
Could someone please look over that small block of code and suggest whether what I'm doing is correct and whether my issue lies somewhere else?
Much appreciated

Your issue is not with your code but with the way you are trying to evaluate your work.
There is a Microsoft article which discusses the situation you are experiencing. It references Visual Studio 2005, but it is still relevant.
Basically, your database is an .mdf file that is stored in your project, and your SQL connection string contains something like AttachDbFileName=|DataDirectory|\data.mdf.
One of the things to know when working with local database files is that they are treated like any other content file. For desktop projects, this means that by default the database file is copied to the output folder (aka bin) each time the project is built. After F5, here's what it would look like on disk:
MyProject\Data.mdf
MyProject\MyApp.vb
MyProject\Bin\Debug\Data.mdf
MyProject\Bin\Debug\MyApp.exe
At design-time, MyProject\Data.mdf is used by the data tools. At run-time, the app will be using the database under the output folder. As a result of the copy, many people have the impression that the app did not save the data to the database file. In fact, this is simply because there are two copies of the data file involved. Same applies when looking at the schema/data through the database explorer. The tools are using the copy in the project, not the one in the bin folder.
Essentially, you have two copies of your database: one for design time and one for run time. You are looking at the design-time database and not seeing the changes that were made to the run-time database. Moreover, every time you run the project, the run-time database gets overwritten, so it can appear that your changes disappear. This is really a debugging feature, to keep you from accumulating hundreds of rows in the database from repeated tests of the same feature. However, some people prefer that kind of persistence, and would use an external SQL Server instance instead of a project-contained .mdf.
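One quick way to confirm this is to check where |DataDirectory| actually points at run time. A minimal sketch (|DataDirectory| is an AppDomain data slot; when it hasn't been set explicitly, connection strings resolve it to the application's base directory, i.e. bin\Debug):

```csharp
using System;

class Program
{
    static void Main()
    {
        // |DataDirectory| is stored as an AppDomain data slot. For desktop
        // apps it is usually unset, in which case connection strings resolve
        // it to the base directory (bin\Debug for a debug build).
        object dataDir = AppDomain.CurrentDomain.GetData("DataDirectory");
        Console.WriteLine(dataDir ?? AppDomain.CurrentDomain.BaseDirectory);
    }
}
```

Whatever path this prints is where the run-time copy of data.mdf lives; attach to that file from Server Explorer to see the rows that SaveChanges() actually wrote.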

Related

Database validation in application

I have a C# application add-on from which I need to validate the schema of a database. I can't use any of the obvious frameworks due to limitations of the application I'm extending, so I need to find an alternative way to confirm whether database procedures exist and/or whether they need to be updated (if a procedure was changed from what is expected). Aside from writing individual queries for each procedure, are there any better solutions I might consider?
There is a database project type in Visual Studio. You can import your database into such a project and try to build it. Please see the official documentation:
https://learn.microsoft.com/en-us/sql/ssdt/import-into-a-database-project?view=sql-server-2017
<twocents>
Export all your objects to .sql files, commit them to source control, and write them out into a folder on disk.
Create a tracker table that maps each exported object to a hash of its DDL.
Inspect the objects in the database when your app initialises, or at some other appropriate time.
Compare the hash of the DDL of each object in the database with what is on disk. If there is a disparity, drop and recreate the object by executing the DDL of the file on disk.
The hash acts as a fast way to check whether what's in the database matches the structure of the object you expect to be there. How you deploy it is up to you, but the assumption I am making is that you are in control of the database objects you have exposed to the application.
</twocents>
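As a sketch of the hash step described above, assuming you fetch the live DDL with something like OBJECT_DEFINITION(OBJECT_ID(...)) and read the expected DDL from the exported file (the class name here is invented for illustration):

```csharp
using System;
using System.Security.Cryptography;
using System.Text;

static class DdlHash
{
    // Hash the DDL text so the tracker table only needs to store and
    // compare a short fingerprint rather than the full object source.
    public static string Compute(string ddl)
    {
        using (var sha = SHA256.Create())
        {
            byte[] hash = sha.ComputeHash(Encoding.UTF8.GetBytes(ddl));
            return BitConverter.ToString(hash).Replace("-", "");
        }
    }
}
```

At startup, compare Compute(liveDdl) against the hash recorded for the exported file; on a mismatch, drop the object and execute the file's DDL.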

Migration synch between developmental and production databases

I am using MVC 5 with .NET Framework 4.5.1 and Code First. I am also using Migrations with SQL Server 2012 and (localdb)\v11.0.
I am in the middle of developing a project using C# and MVC5. During development, I created a lot of new tables on my development computer and changed the "Name" field, which I believe the system creates an index for. I added and deleted it several times.
After that, I added a lot of new, unrelated tables, but for some reason my migrations started giving me foreign-key constraint errors due to the indexes for the "Name" field. These errors kept multiplying as I fixed them, so I decided to revert to an initial state in the migrations and reset, using the current position as a new starting point. I was hoping that the production database would look at this new starting point in the development db and resync itself to the development state. I thought I had read somewhere that the production db matches itself to the development db and updates itself. I believe there is a migration table in the production db which would match itself to the one in the development db; that table was clearly out of sync. I have considered deleting the data in it, but I am holding off till I get advice.
Anyway, I changed the name of the migrations directory on the dev computer and excluded it from the project. Then I reinitialized my tables (using a new db name in my LocalDB) on the dev computer and reloaded them with the initialization data. It all worked.
Now I had a new problem: my production db and my development db were different, and the migration on the dev computer was set up to create new tables, whereas the production side was expecting the older migration. Every time I tried to update the production db from the development computer, I kept getting an error that the objects already existed, which of course they did.
So, I commented out all the table-creation calls in my migration file and retried. Now the production db would start, but the app would not run, because the updated code referred to new fields which had not been created in the production db. So on my production db I started to get errors about all the missing fields. I tried setting automatic migrations to true as well; that did not work. I am guessing the only way to fix this is to go in manually and sync the fields one by one.
QUESTION 1: Is there an automatic way to sync (using migrations) the production db and the development db so that the production db becomes the same as the development db?
QUESTION 2: Keeping in view the above scenario, what would have been a better way to reset the migrations with a production db out there as well?
I found a solution. The folks at Red Gate have a great tool called SQL Compare. It compares the database structures and can even make them EXACTLY the same, at the click of a button.
But before you use it, be sure you ONLY compare "tables", as opposed to everything, which includes "users", "roles" and a lot more. That is because when you run the software it backs up, deletes and re-creates objects, and if the roles or users get deleted, the software can no longer access the database and everything gets deleted! Also... MAKE A BACKUP! (I lost all my test data on my first try.)
http://www.red-gate.com/products/sql-development/sql-compare/
(This is not a sales plug for the folks at Red Gate. I don't know them, but their tool helped me immensely. It's a good tool, easy to use, and FREE for 14 days! I list it here so that anyone else who may be stuck like me, and I am sure there are many, can be helped.)
April 24 2015
OK. There is more to it after you synch both databases so that they look exactly alike.
Create a backup of your production data.
Delete the Migrations folder in your development project.
Enable migrations again.
Add an initial migration.
Update the local database.
Now your local database is completely set up.
Go to the host database.
Find the table called "__MigrationHistory" in the host/production database.
Delete all the data in it (you want to purge "__MigrationHistory" on the host).
Now copy all the rows from the local "__MigrationHistory" to the hosted "__MigrationHistory".
(There will be just the one row, i.e. the initial migration you created above.)
Now everything is in sync and it will work.
You can begin development again.
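The reset steps above map onto the usual EF6 Package Manager Console commands (a sketch; run them in the development project after deleting the old Migrations folder):

```powershell
PM> Enable-Migrations
PM> Add-Migration Initial
PM> Update-Database
```

Add-Migration Initial scaffolds the whole current schema as one migration, and Update-Database records it as the single row in the local "__MigrationHistory" table described above.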

Generate Database from model creates empty edmx sql script

I am playing around with Code First, have created my entities and am now trying to generate a database from the model. I have gone through the wizard, successfully connected to the server, selected the name of the new database and exited. An edmx.sql script is automatically generated for me to run, except it's empty when I open it, and it massively slows down VS 2010 to the point that I have to kill the process.
When I look on the server, the database has been created (as expected, since the wizard asked me to create it) but there are no tables in it (obviously, because I haven't run the script).
Any ideas what is going wrong here?
I found out what caused this. Apparently the SP1 upgrade on VS2010 torches this sort of thing. The following files need to be re-installed manually from the DVD:
DACFramework_enu.msi
DACProjectSystemSetup_enu.msi
TSqlLanguageService_enu.msi
Now I can see the full script. I ran it, and everything's sorted!

How to change database design in a deployed application?

Situation
I'm creating a C#/WPF 4 application using a SQL Compact Edition database as a backend with the Entity Framework and deploying with ClickOnce.
I'm fairly new to applications using databases, though I don't suspect I'll have much problem designing and building the original database. However, I'm worried that in the future I'll need to add or change some functionality which will require me to change the database design after the database is already deployed and the user has data in the database.
Questions
Is it even possible to push an updated database design out to users via a ClickOnce update in the same way it is for code changes?
If I did, how would the user's data be affected?
How is this sort of thing done in real situations? What are some best-practices?
I figure that in the worst case, I'd need to build some kind of "version" number into the database or program settings and create some routine to migrate the user's current version of the database to the new one.
I appreciate any insight into my problem. Thanks a lot.
There are some 'tricks' that are employed when designing databases to allow for design changes.
Firstly, many database designers create views to code against, rather than coding directly to the tables. This allows tables to be altered (split or merged, etc) while only requiring that the views are updated. You may want to investigate database refactoring techniques for this.
Secondly, you can indeed add versioning information to the database (commonly done as a 'version' table with a single field). Updating the database can be done through code or through scripts. One system I worked on would automatically check the database version and then progressively update the schema through versions in code until it matched the required version for the runtime. This was quite an undertaking.
I think your "worst" case is actually a pretty good route to go in this situation. Maintain a database version in the DB and have your application check and update the DB as necessary. If you build your updater correctly, it should be able to maintain the user's data. Depending on the update this might involve creating temporary tables to hold the existing data and repopulating new versions of the tables from them. You might be able to include a new SDF file with the new schema in place in the update process and simply transfer the data. It might be slightly easier that way -- you could use file naming to differentiate versions and trigger the update code that way.
Unfortunately version control and change management for databases is desperately, desperately far from what you can do with the rest of your code.
If you have an internal-only environment there are a number of tools which will help you (DBGhost, Red Gate has a newish app, some deployment management apps) but all of them are less than full solutions imho, but they are mostly good enough.
For client-shipped solutions you really don't have anything better than your worst case I'm afraid. Just try and design with flexibility in mind - see Dr.Herbie's answer.
This is not a solved problem basically.
"Smart Client Deployment with ClickOnce" by Brian Noyes has an excellent chapter on this issue. (Chapter 5)
ISBN 978-0-32-119769-6
He suggests something like this:
if (ApplicationDeployment.CurrentDeployment.IsFirstRun)
{
    MigrateData();
}

private void MigrateData()
{
    string previousDb = Path.Combine(
        ApplicationDeployment.CurrentDeployment.DataDirectory, @".\pre\mydb.sdf");
    if (!File.Exists(previousDb))
        return;

    string oldConnString = @"Data Source=|DataDirectory|\.pre\mydb.sdf";
    string newConnString = @"Data Source=|DataDirectory|\mydb.sdf";

    // If you are using datasets, perform any migration here with the old and new table adapters.
    // Otherwise, use a .sql data migration script.
    // Store the version of the database in the database, check it at the beginning of your
    // update script, and GOTO the correct line in the SQL script.
}
A common solution is to include a version number somewhere in the database. If you have a table with miscellaneous system data, throw it in there, or create a table with one record just to hold the DB version number. Then whenever the program starts up, check if the database version is less than the expected version. If so, execute the required SQL CREATE, ALTER, etc, commands to bring it up to speed. Have a script or function for each version change. So if you see the database is currently at version 6 and the code expects version 8, execute the 6 to 7 update and the 7 to 8 update.
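A minimal sketch of that version-stepping loop (all names here are illustrative, not from the original answer; in a real app each step would execute its ALTER/CREATE batch against the database, and the new version number would be persisted in the version table):

```csharp
using System;
using System.Collections.Generic;

class SchemaUpgrader
{
    // One upgrade step per version bump; step N takes the schema from
    // version N to version N + 1.
    readonly SortedDictionary<int, Action> _steps = new SortedDictionary<int, Action>();

    public void AddStep(int fromVersion, Action apply) => _steps[fromVersion] = apply;

    // Bring the database from `current` up to `expected`, one step at a time.
    public int UpgradeTo(int current, int expected)
    {
        while (current < expected)
        {
            _steps[current]();   // e.g. run the 6 -> 7 ALTER/CREATE script
            current++;           // then persist the new version number
        }
        return current;
    }
}
```

So a database at version 6 with an expected version of 8 runs the 6-to-7 step and then the 7-to-8 step, exactly as described above.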
Another method we used on one project I worked on was to ship a schema-only, no-data database with the code. Every time you installed a new version, the installer would also install the latest copy of this blank database. Then when the program started up, it would compare the user's current database schema with the new database schema and determine what database changes were needed on the fly. For example, if the "reference schema" table Foo had a column named Bar, and there was no column Bar in the user's current database, we would generate an "alter table Foo add Bar ..." statement and execute it. While writing the first draft of the program to do this was a fair amount of work, once we'd done it there was pretty much zero maintenance to keep the DB schema up to date. The conversion was just done on the fly.
Note that this scheme doesn't handle DB changes that require changing data values, like adding a new column that must be initially populated by doing some computation on data from other tables. But if you can generate new data from old data, that must mean the new data is redundant and your database is not normalized. I don't think the situation ever came up for us.
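The missing-column part of that comparison can be sketched like this (the dictionaries stand in for whatever you load from each database's INFORMATION_SCHEMA.COLUMNS; the class and method names are illustrative):

```csharp
using System.Collections.Generic;
using System.Linq;

static class SchemaDiff
{
    // Key is (table, column); value is the SQL type, as read from
    // INFORMATION_SCHEMA.COLUMNS in the reference and current databases.
    public static List<string> MissingColumnDdl(
        Dictionary<(string Table, string Column), string> reference,
        Dictionary<(string Table, string Column), string> current)
    {
        // Emit an ALTER TABLE ... ADD for each column that exists in the
        // reference schema but not in the user's current database.
        return reference
            .Where(kv => !current.ContainsKey(kv.Key))
            .Select(kv => $"ALTER TABLE [{kv.Key.Table}] ADD [{kv.Key.Column}] {kv.Value}")
            .ToList();
    }
}
```

Each generated statement is then executed against the user's database, which is the on-the-fly conversion the answer describes.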
I had the same issue with an Android app with an SQLite database when adding a table. I changed the name of the database to include a version suffix, like theDataBaseV1, deleted the previous one, and the app works fine.
I just changed the name of the database, and the name in this line of code
private static final String DATABASE_NAME = "busesBogotaV2.db";
in the DBManager when it's going to open.
Does anybody know if this trivial solution has any unintended consequences?

How do I use SMO to drop and recreate all views in a Database?

I recently copied a MSSQL2000 DB to a MSSQL2008 DB. Afterwards I ran a script to update all the text/varchar fields in the tables to the same collation as the model db. However, all my copied views still are using the same collation as the previous db. The easiest way I found to fix this would be to have MS SQL Management Studio create the DROP/CREATE scripts for these views. However, certain views depend on other views so you need to make sure to DROP/CREATE these views in the proper order.
So my question is:
How would I create a script that traverses, depth first, the dependencies of each view and then on the return trip drops and creates each view?
I'm assuming that this will use SMO. I will be writing the script in C#.
Can you not just iterate through the views and execute sp_refreshview dynamically? This will achieve the same result: updating each view for the new base table definitions.
You aren't using WITH SCHEMABINDING (otherwise you could not have changed the tables), so dependency order and depth do not matter.
In cases like this, just run the script a bunch of times (F5 in SSMS); eventually all the child views will exist when they are used by a parent. You can tell when everything is good, as there will be no errors.
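A sketch of that iteration from C# with plain SqlCommand calls (sp_refreshview and sys.views are standard SQL Server; the connection string is whatever yours is). Re-running it until no call throws plays the same role as repeatedly pressing F5 in SSMS:

```csharp
using System.Collections.Generic;
using System.Data;
using System.Data.SqlClient;

static class ViewRefresher
{
    public static void RefreshAllViews(string connectionString)
    {
        using (var conn = new SqlConnection(connectionString))
        {
            conn.Open();

            // Enumerate every view in the database, schema-qualified.
            var views = new List<string>();
            using (var cmd = new SqlCommand(
                "SELECT QUOTENAME(SCHEMA_NAME(schema_id)) + '.' + QUOTENAME(name) FROM sys.views",
                conn))
            using (var reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                    views.Add(reader.GetString(0));
            }

            // Re-bind each view's metadata to the current base table definitions.
            foreach (var view in views)
            {
                using (var refresh = new SqlCommand("sp_refreshview", conn))
                {
                    refresh.CommandType = CommandType.StoredProcedure;
                    refresh.Parameters.AddWithValue("@viewname", view);
                    refresh.ExecuteNonQuery();
                }
            }
        }
    }
}
```

Note that sp_refreshview only works on non-schema-bound views, which matches the assumption in the answer above.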
Try DBSourceTools. http://dbsourcetools.codeplex.com.
This utility uses SMO to script all database objects and data to disk, using DROP/CREATE scripts.
It also automatically builds a dependency tree and will re-create database objects in the required order.
Once you have scripted your 2000 DB to disk, create a "Deployment Target" database for your 2008 database.
You can then create a full set of patches to change / upgrade / rename all of your views.
DBSourceTools will re-build your database for you, and then apply all patches in order.
It's a reliable, repeatable method of version control for databases, and allows you to test and version control these patches.
When you are ready for release, simply send all of the patches to your DBA, and get him to run them in order.
Have fun.
