I have a DataSet with multiple tables; the source document is an XML file.
mydataset.ReadXml(filename);
There are multiple tables with multiple relations. I have already created the database in SQL via code - tables, columns, and relations.
Now I would like to insert the data. (Yes, I want everything.)
EDIT: The SQL tables use auto-generated identity columns, because I am not sure the incoming data will never duplicate one of the values that I would otherwise like to assume is unique.
So what approach should I take to insert the data, considering I have foreign key constraints? How should I iterate the tables in the DataSet to make sure I don't try to insert into tables that require an existing id? Do I create a hard-coded map (I'd prefer not to), or do I walk the tables looking for a foreign key and verify the parent table first, etc.?
Any ideas? I am sure someone has done this before and has a simple answer for me.
You have a couple of options. Assuming the database is not generating the key values, this is pretty simple. Either
1) You discover the order to load the tables so that each table with a Foreign Key is loaded after the table to which it refers.
2) You turn off constraint checking in SqlBulkCopy, and then optionally check the constraints after loading.
To check the constraints after load, run a command like
ALTER TABLE SomeTable WITH CHECK CHECK CONSTRAINT fk_SomeTable_OtherTable
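For example, a rough sketch of the whole load in C# - the topological ordering of the DataTables and the constraint name are illustrative only, not something from your question:

using System;
using System.Collections.Generic;
using System.Data;
using System.Data.SqlClient;
using System.Linq;

static class DataSetLoader
{
    static void Load(DataSet ds, string connectionString)
    {
        using (var conn = new SqlConnection(connectionString))
        {
            conn.Open();

            // Parents first, so every foreign key target exists before its children are loaded.
            foreach (DataTable table in OrderParentsFirst(ds))
            {
                // SqlBulkCopy does not check foreign keys unless you pass SqlBulkCopyOptions.CheckConstraints.
                using (var bulk = new SqlBulkCopy(conn) { DestinationTableName = table.TableName })
                {
                    bulk.WriteToServer(table);
                }
            }

            // Re-validate a foreign key once everything is loaded (repeat per constraint).
            using (var cmd = new SqlCommand(
                "ALTER TABLE SomeTable WITH CHECK CHECK CONSTRAINT fk_SomeTable_OtherTable", conn))
            {
                cmd.ExecuteNonQuery();
            }
        }
    }

    // Simple topological sort: a table is emitted only after every table it references.
    static IEnumerable<DataTable> OrderParentsFirst(DataSet ds)
    {
        var remaining = ds.Tables.Cast<DataTable>().ToList();
        var done = new HashSet<DataTable>();
        while (remaining.Count > 0)
        {
            var ready = remaining
                .Where(t => t.ParentRelations.Cast<DataRelation>()
                    .All(r => r.ParentTable == t || done.Contains(r.ParentTable)))
                .ToList();
            if (ready.Count == 0)
                throw new InvalidOperationException("Circular references between the tables.");
            foreach (var t in ready)
            {
                done.Add(t);
                remaining.Remove(t);
                yield return t;
            }
        }
    }
}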
If you do have database-generated keys, this is all harder. But the best way to tackle that is to use SEQUENCE objects instead of IDENTITY columns and run sp_sequence_get_range to fetch the new key ranges to the client and apply the keys there first.
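For what it's worth, fetching a range of keys from a SEQUENCE on the client looks roughly like this; the sequence name dbo.OrderIdSequence is just a placeholder:

using System;
using System.Data;
using System.Data.SqlClient;

static class KeyRange
{
    // Reserves a contiguous block of values from the sequence and returns the first one.
    static long ReserveKeys(SqlConnection conn, int count)
    {
        using (var cmd = new SqlCommand("sys.sp_sequence_get_range", conn))
        {
            cmd.CommandType = CommandType.StoredProcedure;
            cmd.Parameters.AddWithValue("@sequence_name", "dbo.OrderIdSequence");
            cmd.Parameters.AddWithValue("@range_size", (long)count);
            var first = cmd.Parameters.Add("@range_first_value", SqlDbType.Variant);
            first.Direction = ParameterDirection.Output;
            cmd.ExecuteNonQuery();
            return Convert.ToInt64(first.Value);
        }
    }
}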
Related
I am working on a project where I may not alter the database in any way (unfortunately). I have started the project using Entity Framework and this has worked for the first few objects that I need. Now I have come across two scenarios that I am not sure how to accommodate.
Tables without a primary key defined.
Tables using the suffix of the table name as a field.
For the first, I get an error about reviewing my schema and uncommenting the proper area of the edmx file. The table has a field that acts as a primary key, but it is not designated as NOT NULL and no primary key has been created for it.
For the second, there are several tables with names like order1, order2, order3, etc., where the table that needs to be accessed would be a parameter of my access methods.
My thought is that it would be simplest to just write the SQL manually and bind the data to models. If I go that route, would I even use EF, or would I just create a new database connection? What would be the 'proper' way to go about that?
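For reference, this is the rough shape of the raw-SQL route I mean - the Order model, its columns, and the whitelist of order1/order2/order3 table names are all placeholders for whatever the real schema has:

using System;
using System.Collections.Generic;
using System.Data.SqlClient;

public class Order
{
    public int OrderNumber { get; set; }
    public DateTime CreatedOn { get; set; }
}

public static class OrderRepository
{
    // The table name cannot be a SQL parameter, so validate it against a fixed list.
    private static readonly HashSet<string> AllowedTables =
        new HashSet<string>(StringComparer.OrdinalIgnoreCase) { "order1", "order2", "order3" };

    public static List<Order> LoadOrders(string connectionString, string tableName)
    {
        if (!AllowedTables.Contains(tableName))
            throw new ArgumentException("Unknown order table.", nameof(tableName));

        var results = new List<Order>();
        using (var conn = new SqlConnection(connectionString))
        using (var cmd = new SqlCommand($"SELECT OrderNumber, CreatedOn FROM [{tableName}]", conn))
        {
            conn.Open();
            using (var reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                {
                    results.Add(new Order
                    {
                        OrderNumber = reader.GetInt32(0),
                        CreatedOn = reader.GetDateTime(1)
                    });
                }
            }
        }
        return results;
    }
}

The idea being that the table-name whitelist keeps the dynamic SQL safe while EF continues to handle the well-behaved tables.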
My scenario is that I have [mostly] matching tables in 2 different databases. The table schema will usually be identical, but sometimes will change (column additions, renames, removals). I need to copy data from Source to Destination under the following conditions:
Attempt to insert all rows from source to destination. No updates.
Unique constraints will cause inserts to fail for dups - this is OK
If columns don't quite match up between source and dest, the insert must still succeed. Yes, this is the tricky part. Wherever columns match, the insert needs to follow the matching schema. Any columns that are new/removed/renamed can be skipped and ignored, but overall the insert must succeed.
Source may be Access or MSSQL and Dest may be Access or MSSQL, so any sql must work for both.
This is not a one-time thing. It's part of a software/data upgrade process that will occur over and over for many customers, with different datasets and different tables. But again, most columns will always match (table names will always be same on both sides, with similar schema), with occasional column differences and unique constraints.
It's OK for an insert to fail due to a unique constraint violation (this means there's a duplicate record).
It's NOT OK for an insert to fail due to column mismatches -- I need to find a way for these to match up as best as possible. Unmatched columns can be skipped/ignored.
Unfortunately the table schemas weren't designed very well and thus tables do not have an int primary key. Many tables have multi-column keys, but these can't be used because ANY table in the source may need to be copied to ANY table in the destination. So reliance on keys won't work.
I'm using Visual Studio 2015 and the latest C#. SQL Server and Access.
This is a strange scenario, but I need a robust way to handle it. I don't care if it's an ugly hack, it just needs to work. Can anyone think of a good approach for this?
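To make the tricky part concrete, here is the rough shape of what I'm after - a sketch only, written against SQL Server; an Access source or destination would need the OleDb equivalents, where parameter placeholders are ? instead of names. Nothing here is specific to my real tables; the column lists come from the schema at runtime:

using System;
using System.Collections.Generic;
using System.Data;
using System.Data.SqlClient;
using System.Linq;

static class BestEffortCopy
{
    // Assumes both connections are already open and the table exists on both sides.
    public static void CopyTable(SqlConnection source, SqlConnection dest, string tableName)
    {
        // Copy only the columns that exist on both sides (matched by name; renames are skipped).
        var common = GetColumnNames(source, tableName)
            .Intersect(GetColumnNames(dest, tableName), StringComparer.OrdinalIgnoreCase)
            .ToList();

        string columnList = string.Join(", ", common.Select(c => "[" + c + "]"));
        string paramList = string.Join(", ", common.Select((c, i) => "@p" + i));

        using (var select = new SqlCommand($"SELECT {columnList} FROM [{tableName}]", source))
        using (var reader = select.ExecuteReader())
        {
            while (reader.Read())
            {
                using (var insert = new SqlCommand(
                    $"INSERT INTO [{tableName}] ({columnList}) VALUES ({paramList})", dest))
                {
                    for (int i = 0; i < common.Count; i++)
                        insert.Parameters.AddWithValue("@p" + i, reader.GetValue(i));
                    try { insert.ExecuteNonQuery(); }
                    catch (SqlException) { /* duplicate hit a unique constraint - OK, skip the row */ }
                }
            }
        }
    }

    static List<string> GetColumnNames(SqlConnection conn, string tableName)
    {
        // SELECT with an always-false filter returns just the column metadata.
        using (var cmd = new SqlCommand($"SELECT * FROM [{tableName}] WHERE 1 = 0", conn))
        using (var reader = cmd.ExecuteReader(CommandBehavior.SchemaOnly))
        {
            return Enumerable.Range(0, reader.FieldCount).Select(reader.GetName).ToList();
        }
    }
}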
I did a project a couple of years ago where I had the same challenge. In my case there were multiple remote stores that were offline during the month (no internet connection), and every two weeks they needed to connect to the internet and sync the data between their local database and the HQ database.
Basically, they first had to send their local data to HQ and then receive the HQ changes back.
If you don't have large data (in my case the database had more than 600 tables and around 60 million rows), you can first send the client data to the server and merge it with the existing data; then you can drop the client table and batch insert the fresh data from the server.
My team are using SqlMetal to generate database classes from a SqlServer database. It works perfectly for all of our existing classes and foreign key associations but it's refusing to generate the code for a particular new foreign key association that we want to add. This key is from an audit table to a global event table detailing the time the audit record was created and the user it was created by. Many similar foreign key associations between other audit tables and this global "event" table exist in the system and SqlMetal generates code for those associations.
I've tried resolving this problem by:
Dropping and recreating the table
Removing the primary key
Creating a new identical table with a different name
Removing all other fields from the table
Dumping the indexes
Performing a fresh database build
Renaming the foreign key
None of the above seem to resolve the problem. However, SqlMetal does correctly generate code for foreign key associations from this table to some (but not all) other tables in the system. The association between these two tables would only generate when I altered the original create-table script to include the foreign key association, rather than adding it (or a new equivalent table) in later. Unfortunately, we need to be able to deploy this change as a script to our existing production database, so this isn't an option. I've seen a couple of articles and forum posts mentioning similar problems, but none seem to give any solution or even an explanation.
During data transfer I want to disable/enable
all foreign keys on a table
all foreign keys on all tables
via a query in MS Access.
I will call the query from a C# module. There will be bulk insertion.
You could delete your relationship from MSysRelationships, do your stuff, make sure it is all valid, then recreate the record in MSysRelationships.
This seems difficult though. You're knowingly putting bad data into a table with constraints. Why not put your data into a temp table with the same design as your constrained table, then use an insert query to move the records into the canonical table according to the rules you've established in the relationships? That way you never drop relationships and risk ruining your table with bad data.
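A rough sketch of that staging-table idea via OleDb - the Orders, Orders_Staging, and Customers names are made up for illustration:

using System.Data.OleDb;

static class StagingCopy
{
    static void MoveStagedRows(string connectionString)
    {
        using (var conn = new OleDbConnection(connectionString))
        {
            conn.Open();

            // 1. The bulk insertion targets Orders_Staging (same columns as Orders, no relationships defined).

            // 2. Move only the rows that satisfy the relationship rules into the real table.
            using (var cmd = new OleDbCommand(
                @"INSERT INTO Orders (OrderId, CustomerId, OrderDate)
                  SELECT s.OrderId, s.CustomerId, s.OrderDate
                  FROM Orders_Staging AS s
                  INNER JOIN Customers AS c ON c.CustomerId = s.CustomerId", conn))
            {
                cmd.ExecuteNonQuery();
            }

            // 3. Clear the staging table for the next run.
            using (var cmd = new OleDbCommand("DELETE FROM Orders_Staging", conn))
            {
                cmd.ExecuteNonQuery();
            }
        }
    }
}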
I am trying to design a Windows-based application in C#.NET. I am reading a CSV file into a DataGridView and then inserting that data into a database. I am using SqlBulkCopy to insert the data from the CSV file into the database. My concern is that when I try to insert data into the database (which already contains data), I get a primary key constraint error. I want to know whether it is possible to compare values before inserting into the database using SqlBulkCopy; if a value already exists in the database, it should do an update instead.
Can anyone provide me with the logic for this?
Thanks.
If you really, really know that the dupes aren't needed, just set the "Ignore Dupes" option (IGNORE_DUP_KEY) on the index for the PK and you're done.
You need to read the data from the database into a list first.
Then, when you load the CSV, you discard any duplicates.
Also, when you insert, you should possibly ignore the primary key and let the SQL database generate the key.
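Something like this sketch, assuming a hypothetical Customers table keyed by an int Id column and a DataTable already loaded from the grid:

using System;
using System.Collections.Generic;
using System.Data;
using System.Data.SqlClient;

static class DuplicateFilter
{
    static DataTable RemoveExistingRows(SqlConnection conn, DataTable csvData)
    {
        // 1. Read the existing keys into a set.
        var existing = new HashSet<int>();
        using (var cmd = new SqlCommand("SELECT Id FROM dbo.Customers", conn))
        using (var reader = cmd.ExecuteReader())
        {
            while (reader.Read())
                existing.Add(reader.GetInt32(0));
        }

        // 2. Keep only the CSV rows whose key is not already in the database.
        var filtered = csvData.Clone();
        foreach (DataRow row in csvData.Rows)
        {
            if (!existing.Contains(Convert.ToInt32(row["Id"])))
                filtered.ImportRow(row);
        }
        return filtered;   // bulk copy this table instead of the original
    }
}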
There's no way to do any logic when you're firehosing data in via SqlBulkCopy. Even triggers (shudder) are turned off.
What you should do is bulkcopy your data into an empty staging table, then run a stored procedure that merges the data from the staging table into the real, non-empty table.
If you're using SQL 2008, then you can use the MERGE command, otherwise you'll just have to code around it with a separate update and insert.
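As a rough illustration - the Customers / Customers_Staging tables and their columns are assumed here, not taken from the question - the staging plus MERGE pattern looks something like this:

using System.Data;
using System.Data.SqlClient;

static class UpsertViaStaging
{
    static void Upsert(string connectionString, DataTable csvData)
    {
        using (var conn = new SqlConnection(connectionString))
        {
            conn.Open();

            // 1. Firehose the CSV rows into the empty staging table.
            using (var bulk = new SqlBulkCopy(conn) { DestinationTableName = "dbo.Customers_Staging" })
            {
                bulk.WriteToServer(csvData);
            }

            // 2. Merge staging into the real table: update existing keys, insert new ones (SQL 2008+),
            //    then empty the staging table for the next run.
            const string mergeSql = @"
                MERGE dbo.Customers AS target
                USING dbo.Customers_Staging AS source
                    ON target.Id = source.Id
                WHEN MATCHED THEN
                    UPDATE SET target.Name = source.Name, target.Email = source.Email
                WHEN NOT MATCHED THEN
                    INSERT (Id, Name, Email) VALUES (source.Id, source.Name, source.Email);

                TRUNCATE TABLE dbo.Customers_Staging;";

            using (var cmd = new SqlCommand(mergeSql, conn))
            {
                cmd.ExecuteNonQuery();
            }
        }
    }
}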