DataSet TableAdapter Fill Method - C#

I have a DataSet with two TableAdapters (a one-to-many relationship) that was created using Visual Studio 2010's Configuration Wizard.
I make a call to an external source and populate a Dictionary with the results. These results should be all of the entries in the database. To synchronize the DB, I don't want to simply clear all of the tables and repopulate them, as if I were dropping and recreating the tables with new data in SQL.
Is there a clean way, possibly using the TableAdapter.Fill() method, or do I have to loop through the two tables row by row, decide whether each row stays or gets deleted, and then add the new entries? What is the best approach to make the data in the dictionary the only data in my two tables in the DataSet?

First question: if it's the same DB, why do you have two tables with the same information?
To the question at hand: that largely depends on the sizes. If the tables are not big, then use a transaction, clear the table (DELETE FROM table or whatever) and write your data in there again.
If the tables are big, on the other hand, the question is: can you load all of this into your dictionary?
Of course you have to ask yourself what happens to inconsistent data (another user/app changed the data while you had it in your dictionary).
If this takes too long, you could remember what you did to the data - that is: flag the changed data, remember the deleted keys and the newly inserted rows, and make your updates based on that.
Both can be achieved by keeping the filled DataTable as a backing field or by implementing your own mechanism.
In any case, I would recommend thinking about the problem: do you really need the dictionary? Why not make queries against the database to get the data? Or only cache a part of the data for quick access?
PS: the Update method on your DataAdapter will do all the work (changing the changed, removing the deleted and inserting the new DataRows), but it will also update the DataTable/DataSet, so this will only work once.
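A minimal sketch of that pattern, assuming a wizard-generated typed DataSet; the adapter and table names here are placeholders, not from the question:
// Hypothetical names: MyDataSet / CustomersTableAdapter stand in for the
// wizard-generated types; the rest is the standard Fill/Update pattern.
var adapter = new MyDataSetTableAdapters.CustomersTableAdapter();
var table = new MyDataSet.CustomersDataTable();

adapter.Fill(table);            // load the current rows from the database

// ... add, change or delete rows in 'table' based on the dictionary here ...

adapter.Update(table);          // sends the INSERTs/UPDATEs/DELETEs and then
                                // accepts the changes, so a second Update call
                                // has nothing left to do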

It could be that it is quicker to repopulate the entire table than to iterate through it and decide which records go or stay. Could you not decide whether a record gets deleted via an SQL statement (DELETE FROM table WHERE active = false)? If you want them to stay in the database but not in the DataSet, use SELECT * FROM table WHERE active = true.
You could also have a date field and select all records that have been added since the date you last polled the database (SELECT * FROM table WHERE active = true AND date_added > #12:30#).
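A rough sketch of that approach with plain ADO.NET, assuming a SQL Server table with hypothetical active and date_added columns:
// Assumed names: MyTable, active, date_added and lastPolled are placeholders.
using (var connection = new SqlConnection(connectionString))
using (var command = new SqlCommand(
    "SELECT * FROM MyTable WHERE active = 1 AND date_added > @lastPolled", connection))
{
    command.Parameters.AddWithValue("@lastPolled", lastPolled);

    var adapter = new SqlDataAdapter(command);
    var newRows = new DataTable();
    adapter.Fill(newRows);      // only the rows added since the last poll
}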

Related

Migrating an Access multi-valued field column to C#

I am attempting to use the Microsoft.ACE.OLEDB.12.0 driver to read data from an Access database and came upon an odd situation: one of the columns in the Access database shows as a comma-delimited list of ids.
Wells
________
345,456,7
6,387
When I looked at the column definition in Access, I thought it would say string, but it does not - it says number. So I guess it is storing an array of integers in a single column?
I'm having a tough time getting a data reader to pick this up.
using
var w = DB_Reader.GetValue(DB_Reader.GetOrdinal("Wells"));
results in the error
The provider could not determine the Object value. For example, the
row was just created, the default for the Object column was not
available, and the consumer had not yet set a new Object value.
Well, at the end of the day, you can think of the multi-value column as in fact a child table.
So, if you are looking to migrate a master and child table, then in YOUR database you need a relational set of tables to re-create what Access is doing behind the scenes.
So, let's take a multi-value example and query.
Say we have this SQL query in Access:
SELECT ID, Person_Name, FavorateColors FROM tPerson;
But, "favorite colors" is one of those MV columns. (and I should point out with the HUGE movement towards no-sql databases - they also often work this way also - same for XML or JSON data for that matter. However, be it some XML, JSON or Access mutli-value features? Well, you need that child table if you going to adopt a relational data model to represent this data.
Ok, so we run the above query, and you get this output:
In fact, when I used the lookup wizard - I picked a child table called tblColors.
but, how can we explode the above query to dig out the data?
Change the above query to this:
SELECT ID, Person_Name, FavorateColors.Value FROM tPerson
Note how we added ".Value" after the MV column name. Now, when you run the query, you get the SAME result as if you had two tables and did a left join. The parent table rows will, like in any relational database, simply repeat for each child table value, and you get this:
Note how the PK value and the row now repeat for each child MV value.
So you are pretty much free to query as per the above - you get what amounts to a left-joined table, and of course the parent record repeats.
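For the data-reader part of the original question, a sketch along these lines should work once the query uses the .Value form; the query reuses the example names above, and the file path is just a placeholder:
// The query names (tPerson, FavorateColors) come from the example above;
// the database path is a placeholder.
using (var connection = new OleDbConnection(
    @"Provider=Microsoft.ACE.OLEDB.12.0;Data Source=C:\path\to\your.accdb"))
using (var command = new OleDbCommand(
    "SELECT ID, Person_Name, FavorateColors.Value FROM tPerson", connection))
{
    connection.Open();
    using (var reader = command.ExecuteReader())
    {
        while (reader.Read())
        {
            var id = reader.GetValue(0);      // parent PK, repeats per child value
            var name = reader.GetValue(1);
            var color = reader.GetValue(2);   // one multi-value entry per row
            // write (id, color) into your own child/junction table here
        }
    }
}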
So, just like with XML, JSON, or in fact any query or table of data with a repeating parent row and child rows, you are pretty much forced to write code to split out this data, or to re-normalize it. This of course is far more common when receiving, say, JSON/XML data, or data from an Excel sheet.
So, you have to process out the child record data and create a relation for that data.
And thus the question becomes: how can we import JSON/XML/Excel data that really should have used two relational database tables?
So, assuming we want to process this data, you process it the same way as any data you have that should have been two related tables in the first place.
It really depends on whether this is a one-time import or something you have to do all the time.
If it were a one-time deal, then I would use Access and a make-table query based on the above query. You would in fact have to pluck the PK ID from the child table. In the above there is a child table called tblColors - we are just missing the "junction" table in between that Access automatically created. The hidden tables are not exposed, so I would simply use a make-table query in Access and then add a FK column that holds the PK value from tblColors.

Insert/Update whole DataTable into database table C#

I am facing an issue that I hope to get solved here. I have 3 different tables in a DataSet and I want to insert them into the database tables.
I know I can do this using SqlBulkCopy, but there is a catch: if the data already exists in the database, I want it to be updated instead of inserted.
And if the data doesn't exist in the database table, I want to insert it. Any help on this would be appreciated.
I know I can iterate through each record and fire a procedure which checks for its existence - if it exists then update, otherwise insert. But the data size is huge and iterating through each record would be a time-consuming process, so I don't want to use that approach.
Regards
Disclaimer: I'm the owner of the project Bulk Operations
This project allows you to BulkInsert, BulkUpdate, BulkDelete, and BulkMerge (Upsert).
Under the hood, it does almost what #marc_s has suggested (use SqlBulkCopy into a temporary table and perform a MERGE statement to insert or update depending on the primary key).
var bulk = new BulkOperation(connection);
bulk.BulkMerge(dt);
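For reference, a rough sketch of the temp-table plus MERGE pattern mentioned above, with a made-up target table dbo.MyTable keyed on Id (all table and column names are placeholders):
using (var connection = new SqlConnection(connectionString))
{
    connection.Open();

    // 1. Stage the DataTable into a temp table with the same shape.
    using (var create = new SqlCommand(
        "SELECT TOP 0 * INTO #Staging FROM dbo.MyTable", connection))
        create.ExecuteNonQuery();

    using (var bulk = new SqlBulkCopy(connection))
    {
        bulk.DestinationTableName = "#Staging";
        bulk.WriteToServer(dt);             // dt is the DataTable to upsert
    }

    // 2. Insert or update based on the primary key.
    using (var merge = new SqlCommand(@"
        MERGE dbo.MyTable AS target
        USING #Staging AS source ON target.Id = source.Id
        WHEN MATCHED THEN UPDATE SET target.Name = source.Name
        WHEN NOT MATCHED THEN INSERT (Id, Name) VALUES (source.Id, source.Name);",
        connection))
        merge.ExecuteNonQuery();
}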

C# SQLCe Inserting new rows with DataSet without loading the whole table?

I have a SQL CE database with a table. In the application there is a method which should only insert new rows into this table.
I'm wondering what the best practice is for doing so. When working with a DataSet, one has to load the whole table into it.
To me this seems like big overkill, since I only want to insert new rows and therefore there is no need to fill the DataSet with the entire table.
On the other hand, it also seems very inefficient to manually insert every single row with an explicit INSERT statement.
So in order to do a "batch" INSERT one would go with a DataSet. Is there a possibility to work with a DataSet without filling the entire table, e.g. get only the schema of the table and then insert the rows into the DataSet?
Many thanks,
Juergen
Working with a DataSet against a SQL Server Compact database is very inefficient. You should use SqlCeResultSet, or my wrapper library, which allows you to do batch INSERTs very fast, based on a DataTable, a DataReader or even a List of objects: http://sqlcebulkcopy.codeplex.com
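A minimal sketch of the SqlCeResultSet route, with a made-up table and columns; this inserts rows directly without ever filling a DataSet:
// "MyTable", its column ordinals and itemsToInsert are placeholders.
using (var connection = new SqlCeConnection(connectionString))
{
    connection.Open();

    var command = new SqlCeCommand("MyTable", connection)
    {
        CommandType = CommandType.TableDirect   // open the table as an updatable cursor
    };

    using (var resultSet = command.ExecuteResultSet(ResultSetOptions.Updatable))
    {
        foreach (var item in itemsToInsert)
        {
            var record = resultSet.CreateRecord();
            record.SetInt32(0, item.Id);
            record.SetString(1, item.Name);
            resultSet.Insert(record);           // no existing rows are loaded
        }
    }
}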

How to refresh datagridview binding in vb.net or C#?

I created two related tables (FIRSTtable and SECONDtable) in the MySQL database.
The FIRST table has the columns (product_id (PK), product_name).
The SECOND table has the columns (machine_id, production_date, product_id (FK), product_quantity, operator_id).
The relation between the two tables uses the product_id column, with UpdateCascade and DeleteCascade. Both relationships work normally when I try them with an SQL script. Suppose I delete all product_id rows in the FIRST table: all existing data in the SECOND table will be deleted as well.
Both of these tables are displayed in DataGridViews. When I delete all the data in the FIRST table, all rows in the FIRST table's DataGridView are removed, and the data in the FIRST table in MySQL is deleted too.
When I open the MySQL database, the data in the SECOND table has also been deleted. The problem is that the view in the second DataGridView cannot be refreshed and still keeps the previous data. How do I refresh the DataGridView binding in VB.NET or C#? Thanks.
With Me.SECOND_DataGridView
    .DataSource = Nothing ' tried this, but failed.
    .DataSource = MyDataset.Tables("SECOND_table")
End With
I believe what you are running into is the fact that the MySQL engine is actually performing the cascading deletes for you.
When you query the MySQL data into a localized C# "DataTable" (a table within a DataSet), that data is now in memory and not directly linked to what is on disk. When you delete the rows in the "memory" version of the first data table, the deletions happen at the SERVER for the second-level table, but your in-memory version of data table two is NOT updated directly.
That being said, you will probably have to do one of two things: requery the entire DataSet (tables one and two) to get a full refresh of what is STILL in the actual database, OR, as you issue the delete from table one of the DataSet, perform the delete handling in the local DataTable TWO as well to keep it in sync.
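A small sketch of the first option in C#; the adapter and table names are assumed, not taken from your code:
// After the deletes on the FIRST table have reached MySQL, pull the SECOND
// table again so the cascaded deletes show up in the local DataSet.
myDataSet.Tables["SECOND_table"].Clear();
secondAdapter.Fill(myDataSet, "SECOND_table");     // e.g. a MySqlDataAdapter

SECOND_DataGridView.DataSource = null;             // force the grid to rebind
SECOND_DataGridView.DataSource = myDataSet.Tables["SECOND_table"];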

DataTable: is deleting old DataRows before inserting new safe?

I have a many-to-many relationship table in a typed DataSet.
For convenience, on an update I'm deleting the old relations before adding the new ones (which may be the same as before).
Now I wonder if this approach is failsafe, or whether I should make sure to delete only those relations which are really gone (for example with LINQ) and add only those which are really new.
In SQL Server a unique constraint is defined for the relation table; the two foreign keys form a composite primary key.
Is the order in which the DataAdapter updates the DataRows whose RowState is <> Unchanged predictable or not?
In other words: is it possible that DataAdapter.Update(DataTable) will result in an exception because the key already exists?
This is the datamodel:
This is part of the code(LbSymptomCodes is an ASP.Net ListBox):
Dim daTrelRmaSymptomCode As New ERPModel.dsRMATableAdapters.trelRMA_SymptomCodeTableAdapter
For Each oldTrelRmaSymptomCodeRow As ERPModel.dsRMA.trelRMA_SymptomCodeRow In thisRMA.GettrelRMA_SymptomCodeRows
    oldTrelRmaSymptomCodeRow.Delete()
Next
For Each item As ListItem In LbSymptomCodes.Items
    If item.Selected Then
        Dim newTrelRmaSymptomCodeRow As ERPModel.dsRMA.trelRMA_SymptomCodeRow = Services.dsRMA.trelRMA_SymptomCode.NewtrelRMA_SymptomCodeRow
        newTrelRmaSymptomCodeRow.fiRMA = Services.IdRma
        newTrelRmaSymptomCodeRow.fiSymptomCode = CInt(item.Value)
        Services.dsRMA.trelRMA_SymptomCode.AddtrelRMA_SymptomCodeRow(newTrelRmaSymptomCodeRow)
    End If
Next
daTrelRmaSymptomCode.Update(Services.dsRMA.trelRMA_SymptomCode)
Thank you in advance.
I think that the DataAdapter in ADO.NET is clever enough to perform the deletes/inserts in the correct order.
However, if you really want to ensure that updates are done in the correct order, you should do it manually by using the Select method to return an array of data rows for each particular row state. You can then call the Update method on each array of data rows:
DataTable tbl = ds.Tables["YourTable"];
// Process any Deleted rows first
adapter.Update(tbl.Select(null, null, DataViewRowState.Deleted));
// Process any Updated/Modified rows
adapter.Update(tbl.Select(null, null, DataViewRowState.ModifiedCurrent));
// Process the Inserts last
adapter.Update(tbl.Select(null, null, DataViewRowState.Added));
Not sure about the DA, but in theory DB transactions should be performed in the following order: deletes, inserts, updates.
Looking at MSDN, the exact wording for the Update method is:
"Attempts to save all changes in the DataTable to the database. (This includes removing any rows deleted from the table, adding rows inserted to the table, and updating any rows in the table that have changed.)"
With regard to your solution of deleting items and possibly re-inserting the same items: typically speaking, this should be avoided because it creates load on the DB. In high-volume applications you want to do everything you can to minimize calls to the DB, as they are very expensive; the computation time spent determining which row updates are spurious is cheap.
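If you want to avoid the delete-and-reinsert round trips altogether, one option is to diff the selected values against the existing relation rows (with LINQ, as the question suggests) and only touch what actually changed; a sketch in C#, reusing the question's row and column names but with otherwise made-up identifiers:
// Placeholders: existingRows is the result of GettrelRMA_SymptomCodeRows(),
// relationTable is the trelRMA_SymptomCode table, rmaId is the current RMA id
// and adapter is the table adapter; LbSymptomCodes is the ASP.NET ListBox.
var selectedIds = new HashSet<int>(
    LbSymptomCodes.Items.Cast<ListItem>()
        .Where(i => i.Selected)
        .Select(i => int.Parse(i.Value)));

var existingIds = new HashSet<int>(existingRows.Select(r => r.fiSymptomCode));

// Delete only the relations that were unchecked in the UI...
foreach (var row in existingRows.Where(r => !selectedIds.Contains(r.fiSymptomCode)))
    row.Delete();

// ...and add only the relations that are genuinely new.
foreach (var id in selectedIds.Where(id => !existingIds.Contains(id)))
{
    var newRow = relationTable.NewtrelRMA_SymptomCodeRow();
    newRow.fiRMA = rmaId;
    newRow.fiSymptomCode = id;
    relationTable.AddtrelRMA_SymptomCodeRow(newRow);
}

adapter.Update(relationTable);   // no key is deleted and re-inserted in one batch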
