Is there any BulkUpdate Command similar to BulkCopy in SQL Server 2008 - c#

I have used the BulkCopy command to transfer rows from one table to another, about 3 to 5 million rows at a time. Now I want to update these rows.
Is there any BulkUpdate command similar to the BulkCopy command? I'm using ASP.NET with C#.

No, there isn't.
This might help:
http://itknowledgeexchange.techtarget.com/itanswers/bulk-update-in-sql-server-2005/
Assuming that you have a column with distinct values to show you which rows match between the two tables, this can be done with a simple update statement.
UPDATE TableA
SET TableA.A1 = TableB.B1,
TableA.A2 = TableB.B2
FROM TableB
WHERE TableA.A3 = TableB.B3
If you are worried about creating one massive transaction you can
batch the operation into smaller chunks. This is done via the TOP
keyword.
UPDATE TOP (1000) TableA
SET TableA.A1 = TableB.B1,
TableA.A2 = TableB.B2
FROM TableB
WHERE TableA.A3 = TableB.B3
AND TableA.A1 <> TableB.B1
AND TableA.A2 <> TableB.B2
You can put that into a loop that repeats until no more rows are affected; a sketch follows.
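For illustration, here is a minimal C# sketch of that loop, assuming the TableA/TableB names and join columns from the statement above (the connection string is a placeholder):

using System.Data.SqlClient;

// Repeat the batched UPDATE until a pass affects zero rows.
using (var connection = new SqlConnection(connectionString))
{
    connection.Open();
    const string batchedUpdate = @"
        UPDATE TOP (1000) TableA
        SET TableA.A1 = TableB.B1,
            TableA.A2 = TableB.B2
        FROM TableB
        WHERE TableA.A3 = TableB.B3
        AND TableA.A1 <> TableB.B1
        AND TableA.A2 <> TableB.B2";

    int rowsAffected;
    do
    {
        using (var command = new SqlCommand(batchedUpdate, connection))
        {
            command.CommandTimeout = 0; // batches can be slow on millions of rows
            rowsAffected = command.ExecuteNonQuery();
        }
    } while (rowsAffected > 0);
}

The <> filters in the WHERE clause are what make the loop terminate: rows that have already been updated no longer qualify for the next batch.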
Here's another link (with basically the same solution):
http://www.sqlusa.com/bestpractices2005/hugeupdate/

A common approach here is:
bulk-load (SqlBulkCopy) into an empty staging table - meaning: a table with the same columns/types as the actual data, but not part of the main transactional system
now do an update joining the real data to the staging data, to update the values in the real data (a sketch of these two steps follows)
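A minimal sketch of that two-step approach, assuming a staging table dbo.TargetTable_Staging with the same schema as dbo.TargetTable and a key column Id (all names hypothetical):

using System.Data;
using System.Data.SqlClient;

using (var connection = new SqlConnection(connectionString))
{
    connection.Open();

    // Step 1: bulk-load the new values into the empty staging table.
    using (var bulkCopy = new SqlBulkCopy(connection))
    {
        bulkCopy.DestinationTableName = "dbo.TargetTable_Staging";
        bulkCopy.WriteToServer(sourceDataTable); // a DataTable holding the new values
    }

    // Step 2: one set-based update joining the real data to the staging data.
    const string updateSql = @"
        UPDATE t
        SET t.Value1 = s.Value1,
            t.Value2 = s.Value2
        FROM dbo.TargetTable AS t
        INNER JOIN dbo.TargetTable_Staging AS s ON t.Id = s.Id;
        TRUNCATE TABLE dbo.TargetTable_Staging;";
    using (var command = new SqlCommand(updateSql, connection))
    {
        command.ExecuteNonQuery();
    }
}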

Disclaimer: I'm the owner of the project Bulk Operations
The Bulk Operations library allows you to insert, delete, update, and merge millions of rows in a few seconds.
It's very easy to learn and use if you already know the SqlBulkCopy class.
var bulk = new BulkOperation(connection);
// ... Mappings ....
bulk.BulkUpdate(dt);

Related

Insert/Update whole DataTable into database table C#

I am facing an issue that I hope to get solved here. I have 3 different tables in a DataSet and I want to insert them into a database table.
I know I can do this using SqlBulkCopy, but there is a catch: if the data already exists in the database, I want it to be updated instead of inserted.
And if the data doesn't exist in the database table, I want to insert it. Any help on this would be appreciated.
I know I could iterate through each record and fire a procedure which checks for its existence and updates or inserts accordingly. But the data size is huge, and iterating through each record would be time-consuming, so I don't want to use this approach.
Regards
Disclaimer: I'm the owner of the project Bulk Operations
This project allows you to BulkInsert, BulkUpdate, BulkDelete, and BulkMerge (upsert).
Under the hood, it does almost what @marc_s suggested: use SqlBulkCopy into a temporary table and perform a MERGE statement to insert or update depending on the primary key.
var bulk = new BulkOperation(connection);
bulk.BulkMerge(dt);

Efficient insert statement

I'm looking for an efficient way of inserting records into SQL server for my C#/MVC application. Anyone know what the best method would be?
Normally I've just done a while loop and insert statement within, but then again I've not had quite so many records to deal with. I need to insert around half a million, and at 300 rows a minute with the while loop, I'll be here all day!
What I'm doing is looping through a large holding table and using its rows to create records in a different table. I've set up some functions for the lookup data which is necessary for the new table, and this is no doubt adding to the drain.
So here is the query I have. Extremely inefficient for large amounts of data!
DECLARE @HoldingID int
SET @HoldingID = (SELECT MIN(HoldingID) FROM Holding)
WHILE @HoldingID IS NOT NULL
BEGIN
    INSERT INTO Journeys (DepartureID, ArrivalID, ProviderID, JourneyNumber, Active)
    SELECT
        dbo.GetHubIDFromName(StartHubName),
        dbo.GetHubIDFromName(EndHubName),
        dbo.GetBusIDFromName(CompanyName),
        JourneyNo, 1
    FROM Holding
    WHERE HoldingID = @HoldingID
    SET @HoldingID = (SELECT MIN(HoldingID) FROM Holding WHERE HoldingID > @HoldingID)
END
I've heard about set-based approaches - is there anything that might work for the above problem?
If you want to insert a lot of data into SQL Server then you should use BULK INSERTs - there is a command-line tool called the bcp utility for this, and also a C# wrapper for performing bulk copy operations (SqlBulkCopy), but under the covers they all use BULK INSERT.
Depending on your application you may want to insert your data into a staging table first, and then either MERGE or INSERT INTO SELECT... to transfer those rows from the staging table(s) to the target table(s) - if you have a lot of data then this will take some time, however will be a lot quicker than performing the inserts individually.
If you want to speed this up then there are various things that you can do, such as changing the recovery model or tweaking/removing triggers and indexes (depending on whether or not this is a live database). If it's still really slow then you should look into doing this process in batches (e.g. 1000 rows at a time); see the sketch below.
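As a hedged illustration of the batching suggestion, SqlBulkCopy has a BatchSize property that commits rows in chunks (the connection string and table name below are placeholders):

using System.Data.SqlClient;

using (var bulkCopy = new SqlBulkCopy(connectionString, SqlBulkCopyOptions.TableLock))
{
    bulkCopy.DestinationTableName = "dbo.Journeys";
    bulkCopy.BatchSize = 1000;    // commit every 1000 rows instead of one huge batch
    bulkCopy.BulkCopyTimeout = 0; // no timeout for a long-running load
    bulkCopy.WriteToServer(sourceDataTable);
}

TableLock is optional, but it often speeds up large loads into a table that nothing else is writing to.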
This should do exactly what you are doing now, but set-based:
INSERT INTO Journeys (DepartureID, ArrivalID, ProviderID, JourneyNumber, Active)
SELECT
    dbo.GetHubIDFromName(StartHubName),
    dbo.GetHubIDFromName(EndHubName),
    dbo.GetBusIDFromName(CompanyName),
    JourneyNo, 1
FROM Holding
ORDER BY HoldingID ASC
You are (probably) able to do it in one statement of the form:
INSERT INTO JOURNEYS
SELECT * FROM HOLDING;
Without more information about your schema it is difficult to be absolutely sure.
SQL Server 2008 introduced table-valued parameters. These allow you to insert multiple rows in a single trip to the database (sent as one large blob), without using a temporary table. This article describes how it works (step four in the article):
http://www.altdevblogaday.com/2012/05/16/sql-server-high-performance-inserts/
It differs from bulk inserts in that you do not need special utilities and that all constraints and foreign keys are checked.
I quadrupled my throughput using this and by parallelizing the inserts: now 15,000 inserts/second sustained into the same table - a regular table with indexes and over a billion rows.
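For reference, a minimal sketch of passing a table-valued parameter from C#; the table type dbo.JourneyRowType and procedure dbo.InsertJourneys are assumptions for illustration, not names from the article:

using System.Data;
using System.Data.SqlClient;

// Build an in-memory table whose shape matches the user-defined table type.
var rows = new DataTable();
rows.Columns.Add("DepartureID", typeof(int));
rows.Columns.Add("ArrivalID", typeof(int));
rows.Rows.Add(1, 2);

using (var connection = new SqlConnection(connectionString))
using (var command = new SqlCommand("dbo.InsertJourneys", connection))
{
    command.CommandType = CommandType.StoredProcedure;
    var parameter = command.Parameters.AddWithValue("@Rows", rows);
    parameter.SqlDbType = SqlDbType.Structured; // marks the parameter as a TVP
    parameter.TypeName = "dbo.JourneyRowType";  // the table type defined on the server
    connection.Open();
    command.ExecuteNonQuery();
}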

DataSet TableAdapter Fill Method

I have a DataSet with two TableAdapters (1-to-many relationship) that was created using Visual Studio 2010's Configuration Wizard.
I make a call to an external source and populate a Dictionary with the results. These results should be all of the entries in the database. To synchronize the DB, I don't want to just clear all of the tables and then repopulate them (the equivalent of dropping and recreating the tables with the new data in SQL).
Is there a clean way, possibly using the TableAdapter.Fill() method, or do I have to loop through the two tables row by row and decide whether each row stays or gets deleted, and then add the new entries? What is the best approach to make the data in the dictionary the only data in my two tables of the DataSet?
First question: if it's the same DB, why do you have 2 tables with the same information?
To the question at hand: that largely depends on the sizes. If the tables are not big, then use a transaction, clear the table (DELETE FROM Table or whatever) and write your data in there again; a sketch follows.
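A minimal sketch of that clear-and-rewrite step inside a transaction (the table name and connection string are placeholders):

using System.Data.SqlClient;

using (var connection = new SqlConnection(connectionString))
{
    connection.Open();
    using (var transaction = connection.BeginTransaction())
    {
        // Clear the table, then rewrite it from the in-memory data.
        using (var delete = new SqlCommand("DELETE FROM dbo.MyTable", connection, transaction))
        {
            delete.ExecuteNonQuery();
        }
        using (var bulkCopy = new SqlBulkCopy(connection, SqlBulkCopyOptions.Default, transaction))
        {
            bulkCopy.DestinationTableName = "dbo.MyTable";
            bulkCopy.WriteToServer(dataTable); // the rebuilt rows
        }
        transaction.Commit();
    }
}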
If, on the other hand, the tables are big, the question is: can you even load all of this into your dictionary?
Of course you have to ask yourself what happens to inconsistent data (another user/app changed the data while you had it in your dictionary).
If this takes too long, you could remember what you did to the data - that means: flag the changed data, remember the deleted keys and newly inserted rows, and make your updates based on that.
Both can be achieved by remembering the filled DataTable and using it as a backing field, or by implementing your own mechanisms.
In any case I would recommend thinking about the problem: do you really need the dictionary? Why not make queries against the database to get the data? Or only cache a part of the data for quick access?
PS: the Update method on your DataAdapter will do all the work (changing the changed, removing the deleted, and inserting the new DataRows), but it will also update the DataTable/DataSet, so this will only work once.
It could be that it is quicker to repopulate the entire table than to iterate through and decide which records go or stay. Could you not decide whether a record is deleted via a SQL statement (DELETE FROM Table WHERE Active = false)? If you want them to stay in the database but not in the DataSet: (SELECT * FROM Table WHERE Active = true).
You could also have a date field and select all records that have been added since the date you last 'polled' the database (SELECT * FROM Table WHERE Active = true AND DateAdded > #12:30#).

Any way to SQLBulkCopy "insert or update if exists"?

I need to update a very large table periodically, and SQLBulkCopy is perfect for that, except that I have a 2-column index that prevents duplicates. Is there a way to use SQLBulkCopy as "insert or update if exists"?
If not, what is the most efficient way of doing so? Again, I am talking about a table with millions of records.
Thank you
I published a nuget package (SqlBulkTools) to solve this problem.
Here's a code example that would achieve a bulk upsert.
var bulk = new BulkOperations();
var books = GetBooks();
using (TransactionScope trans = new TransactionScope())
{
    using (SqlConnection conn = new SqlConnection(ConfigurationManager
        .ConnectionStrings["SqlBulkToolsTest"].ConnectionString))
    {
        bulk.Setup<Book>()
            .ForCollection(books)
            .WithTable("Books")
            .AddAllColumns()
            .BulkInsertOrUpdate()
            .MatchTargetOn(x => x.ISBN)
            .Commit(conn);
    }
    trans.Complete();
}
For very large tables, there are options to add table locks and temporarily disable non-clustered indexes. See SqlBulkTools Documentation for more examples.
I would bulk load data into a temporary staging table, then do an upsert into the final table. See here for an example of doing an upsert.
Not in one step, but in SQL Server 2008, you could:
bulk load into staging table
apply a MERGE statement to update/insert into your real table
Read more about the MERGE statement
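As a rough sketch of those two steps from C# (the staging/target table names and the Id/Value columns are assumptions):

using System.Data.SqlClient;

using (var connection = new SqlConnection(connectionString))
{
    connection.Open();

    // Step 1: bulk load into the staging table.
    using (var bulkCopy = new SqlBulkCopy(connection))
    {
        bulkCopy.DestinationTableName = "dbo.Target_Staging";
        bulkCopy.WriteToServer(dataTable);
    }

    // Step 2: MERGE staging into the real table (update if matched, insert if not).
    const string mergeSql = @"
        MERGE dbo.Target AS t
        USING dbo.Target_Staging AS s
            ON t.Id = s.Id
        WHEN MATCHED THEN
            UPDATE SET t.Value = s.Value
        WHEN NOT MATCHED BY TARGET THEN
            INSERT (Id, Value) VALUES (s.Id, s.Value);
        TRUNCATE TABLE dbo.Target_Staging;";
    using (var command = new SqlCommand(mergeSql, connection))
    {
        command.ExecuteNonQuery();
    }
}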
Instead of creating a new temporary table (which, by the way, consumes more space and memory),
I created a trigger with INSTEAD OF INSERT that uses a MERGE statement inside.
But don't forget to add the parameter SqlBulkCopyOptions.FireTriggers to the SqlBulkCopy (snippet below).
This is my two cents.
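For example (connection string and table name are placeholders), the option is passed in the SqlBulkCopy constructor:

using System.Data.SqlClient;

// FireTriggers makes the server fire the INSTEAD OF INSERT trigger,
// so the MERGE inside it handles the upsert.
using (var bulkCopy = new SqlBulkCopy(connectionString, SqlBulkCopyOptions.FireTriggers))
{
    bulkCopy.DestinationTableName = "dbo.YourTable";
    bulkCopy.WriteToServer(dataTable);
}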
Another alternative would be to not use a temporary table but instead use a stored procedure with a table-valued parameter. Pass a DataTable to the SP and do the merge there.
Got a hint from @Ivan. For those who might need it, here's what I did.
create trigger yourschema.Tr_your_trigger_name
on yourschema.yourtable
instead of INSERT
as
merge into yourschema.yourtable as target
using inserted as source
on (target.yourtableID = source.yourtableID)
when matched then
    update
    set target.ID = source.ID,
        target.some_column = source.some_column,
        target.Amount = source.Amount
when not matched by target then
    insert (some_column, Amount)
    values (source.some_column, source.Amount);
go

Bulkcopy the updated & newly inserted data between different databases using C#

This is my first post.. I have 2 SQL Server databases located on different servers..
Let's say SDT is the source data table in the source database SDB, and DDT is the destination data table in database DDB.
I'm using C# for bulk copying from SDT to DDT..
My code is something like this:
sqlcommand = "DELETE FROM DDT WHERE locID = @LocIDParam" // @LocIDParam is the parameter for a specific location //
then bulk copy "SELECT * FROM SDT WHERE locID = @LocIDParam" // the steps are well known..
I just don't want to go into useless details..
However, my SDT has a huge amount of data, so bulk copying the whole table causes high traffic.
Is there any way to bulk copy only the updated records from SDT to DDT, as well as inserting the new ones?
Do you think using a SQL trigger for updated and newly inserted data is the best idea for this kind of scenario? (a trigger that inserts the primary key value into a single-column table for new and updated rows, then deleting from and inserting into DDT based on it)
PS. I don't want to use SQL replication for that since it has a lot of problems..
Thank you in advance
From the date, I suppose you have already found your solution. In case not, here is how we deal with a somewhat similar situation.
On the source table we have a column that shows whether the data has to be sent to the destination. We use a boolean, but you could also have a datetime field holding the last update date.
Then our pull process does the following (a sketch in C# follows the list):
Pull all the flagged data into a temporary table on the destination server
Update the records that exist in both tables
Insert all records from the temporary table that don't exist in the destination table
Drop the temporary table
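As a hedged C# sketch of that pull process (all table and column names, including the ToSync flag column, are assumptions):

using System.Data;
using System.Data.SqlClient;

// 1. Pull the flagged rows from the source server.
var flagged = new DataTable();
using (var source = new SqlConnection(sourceConnectionString))
using (var adapter = new SqlDataAdapter("SELECT * FROM SDT WHERE ToSync = 1", source))
{
    adapter.Fill(flagged);
}

using (var destination = new SqlConnection(destinationConnectionString))
{
    destination.Open();

    // 2. Land the rows in a temporary table on the destination server.
    using (var create = new SqlCommand("SELECT TOP 0 * INTO #Pull FROM DDT", destination))
    {
        create.ExecuteNonQuery();
    }
    using (var bulkCopy = new SqlBulkCopy(destination))
    {
        bulkCopy.DestinationTableName = "#Pull";
        bulkCopy.WriteToServer(flagged);
    }

    // 3. Update rows that exist in both tables, then insert the new ones.
    const string syncSql = @"
        UPDATE d SET d.SomeColumn = p.SomeColumn
        FROM DDT AS d INNER JOIN #Pull AS p ON d.Id = p.Id;

        INSERT INTO DDT (Id, SomeColumn)
        SELECT p.Id, p.SomeColumn FROM #Pull AS p
        WHERE NOT EXISTS (SELECT 1 FROM DDT AS d WHERE d.Id = p.Id);";
    using (var sync = new SqlCommand(syncSql, destination))
    {
        sync.ExecuteNonQuery();
    }
    // 4. The temporary table #Pull is dropped automatically when the connection closes.
}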
If you use SQL 2008, there is also the MERGE statement, which I haven't used myself. Here is a link that explains it:
SQL 2008 MERGE command
Hope this helps if you still need it.
