I need to transfer around 2 million records from 4 tables in SQL Server 2008 Express to MySQL.
In C#, I can insert these records very quickly within a transaction using a table-valued parameter (around 50 seconds).
How can I do something similar for MySQL in C#?
Read the explanations in the MySQL Reference Manual. The best you can do is use LOAD DATA INFILE, disabling the indexes before the load and re-enabling them afterwards (so they are rebuilt in one batch). The manual has more advice if that doesn't work out for you.
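On the C# side, Connector/NET wraps LOAD DATA INFILE in the MySqlBulkLoader class. A minimal sketch, assuming the rows have already been exported to a local CSV file (table name, file path, and connection string are placeholders):

// Sketch: LOAD DATA INFILE from C# via MySqlBulkLoader (Connector/NET).
// Table name, file path, and connection string are placeholders.
using MySql.Data.MySqlClient;

using (var conn = new MySqlConnection("Server=localhost;Database=target;Uid=user;Pwd=pass;"))
{
    conn.Open();

    // Relax integrity checks for the duration of the load (InnoDB).
    // For MyISAM tables, ALTER TABLE ... DISABLE KEYS does the same job.
    new MySqlCommand("SET unique_checks=0; SET foreign_key_checks=0;", conn).ExecuteNonQuery();

    var loader = new MySqlBulkLoader(conn)
    {
        TableName = "customers",
        FileName = @"C:\export\customers.csv",
        FieldTerminator = ",",
        LineTerminator = "\n",
        NumberOfLinesToSkip = 1,  // skip the CSV header row
        Local = true              // the file lives on the client machine
    };
    int rows = loader.Load();     // issues LOAD DATA LOCAL INFILE

    new MySqlCommand("SET unique_checks=1; SET foreign_key_checks=1;", conn).ExecuteNonQuery();
}

With the checks relaxed and the indexes rebuilt once at the end, this tends to be the closest MySQL analogue to the table-valued-parameter bulk insert.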
Are you locked into using C# for the migration? If you're not, and you only want to transfer the data, you can use the MySQL Migration Toolkit to do the transfer for you.
Related
I have almost exactly the same issue as the scenario linked below, but unfortunately I'm unable to recreate the solutions successfully.
I have a C# application using SqlBulkCopy.WriteToServer with a data reader (either a SqlDataReader or an OracleDataReader), and I need to add columns to the result set.
I cannot do it in the source SQL statement.
I cannot load a DataTable first and modify it, as there are hundreds of gigabytes of data, almost a terabyte.
How to add columns to DataReader
Can anyone provide a working example and help "push" me over this problem?
I temporarily found a solution using SQL Server Integration Services (SSIS), but watching it run I saw that it downloads all the data into a DTS buffer, then does the column modifications, and only then pumps the data into SQL Server. Try that with a couple hundred gigabytes of data and it does not perform well, even if you can get your infrastructure to build you a 24-core VM with 128 GB of memory.
I finally have a small working example; the CodeProject article (jdweng) was helpful.
I will post a follow-up. I've tested with SQL Server (SqlDataReader); I still need to test with the Oracle data reader.
One of the cases I was trying was converting an Oracle unique ID (stored as a string) into a SQL Server uniqueidentifier. I want to convert it on the fly; there is no way to adjust the source Oracle statement (ADyson) to return a datatype compatible with SQL Server. Altering a 1 TB table afterwards from varchar(40) to uniqueidentifier is painful, but if I could make the change as part of the bulk insert, it would be quick.
And I think now I will be able to.
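For anyone hitting the same wall: the usual pattern is a thin IDataReader wrapper that appends a computed column to whatever the inner reader exposes, so SqlBulkCopy streams the new value row by row and nothing is buffered in a DataTable. A minimal sketch (the class name, column name, and Guid conversion are illustrative; SqlBulkCopy typically exercises only a handful of the interface's members, and the rest simply delegate to the wrapped reader):

// Sketch: wrap any IDataReader and append one computed column so that
// SqlBulkCopy.WriteToServer can stream the extra value row by row.
using System;
using System.Data;

public sealed class ExtraColumnReader : IDataReader
{
    private readonly IDataReader _inner;
    private readonly string _columnName;
    private readonly Func<IDataRecord, object> _compute;

    public ExtraColumnReader(IDataReader inner, string columnName,
                             Func<IDataRecord, object> compute)
    {
        _inner = inner; _columnName = columnName; _compute = compute;
    }

    // The members the bulk copy path actually leans on:
    public int FieldCount => _inner.FieldCount + 1;
    public bool Read() => _inner.Read();
    public object GetValue(int i) =>
        i < _inner.FieldCount ? _inner.GetValue(i) : _compute(_inner);
    public string GetName(int i) =>
        i < _inner.FieldCount ? _inner.GetName(i) : _columnName;
    public int GetOrdinal(string name) =>
        name == _columnName ? _inner.FieldCount : _inner.GetOrdinal(name);
    public bool IsDBNull(int i) =>
        i < _inner.FieldCount ? _inner.IsDBNull(i) : _compute(_inner) is DBNull;
    public Type GetFieldType(int i) =>
        i < _inner.FieldCount ? _inner.GetFieldType(i) : typeof(object);

    // Everything below just completes the interface by delegation.
    public object this[int i] => GetValue(i);
    public object this[string name] => GetValue(GetOrdinal(name));
    public string GetDataTypeName(int i) => GetFieldType(i).Name;
    public int GetValues(object[] values)
    {
        int n = Math.Min(values.Length, FieldCount);
        for (int i = 0; i < n; i++) values[i] = GetValue(i);
        return n;
    }
    public bool GetBoolean(int i) => (bool)GetValue(i);
    public byte GetByte(int i) => (byte)GetValue(i);
    public char GetChar(int i) => (char)GetValue(i);
    public Guid GetGuid(int i) => (Guid)GetValue(i);
    public short GetInt16(int i) => (short)GetValue(i);
    public int GetInt32(int i) => (int)GetValue(i);
    public long GetInt64(int i) => (long)GetValue(i);
    public float GetFloat(int i) => (float)GetValue(i);
    public double GetDouble(int i) => (double)GetValue(i);
    public decimal GetDecimal(int i) => (decimal)GetValue(i);
    public DateTime GetDateTime(int i) => (DateTime)GetValue(i);
    public string GetString(int i) => (string)GetValue(i);
    public long GetBytes(int i, long fo, byte[] b, int o, int len) => _inner.GetBytes(i, fo, b, o, len);
    public long GetChars(int i, long fo, char[] b, int o, int len) => _inner.GetChars(i, fo, b, o, len);
    public IDataReader GetData(int i) => _inner.GetData(i);
    public DataTable GetSchemaTable() => _inner.GetSchemaTable();
    public bool NextResult() => _inner.NextResult();
    public void Close() => _inner.Close();
    public void Dispose() => _inner.Dispose();
    public bool IsClosed => _inner.IsClosed;
    public int Depth => _inner.Depth;
    public int RecordsAffected => _inner.RecordsAffected;
}

Used against the uniqueidentifier case from the follow-up (names hypothetical; assumes an open Oracle reader and SQL Server connection in scope):

using (var wrapped = new ExtraColumnReader(oracleReader, "NewId",
           r => Guid.Parse(r.GetString(r.GetOrdinal("LegacyId")))))
using (var bulk = new SqlBulkCopy(sqlConnection) { DestinationTableName = "dbo.Target" })
{
    bulk.WriteToServer(wrapped);  // streams; the Guid is computed per row
}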
I have an Oracle database with a table Customers to which I only have read-only access. I would like to copy some columns from this table and insert them into a SQL Server 2008 R2 table called Customers, only when the ID does not exist in the destination table.
I'm new to C#. I can open and read from Oracle successfully using Oracle.DataAccess, and I can write to SQL Server, but I am lost on how to read from Oracle and then write into SQL Server. All the examples I could find are for databases of the same type.
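The two providers compose more directly than it might appear: an OracleDataReader implements IDataReader, and SqlBulkCopy.WriteToServer accepts one, so rows can stream from Oracle into SQL Server without a DataTable. A sketch under that assumption (connection strings and object names are placeholders); loading into a staging table first and then inserting only the missing IDs covers the "does not exist" requirement:

// Sketch: stream rows from Oracle straight into a SQL Server staging table,
// then insert only the IDs missing from the real destination table.
// Connection strings and object names are placeholders.
using System.Data.SqlClient;
using Oracle.DataAccess.Client;

using (var ora = new OracleConnection("Data Source=ORADB;User Id=reader;Password=secret;"))
using (var sql = new SqlConnection(@"Server=.\SQL2008R2;Database=Dest;Integrated Security=true;"))
{
    ora.Open();
    sql.Open();

    using (var cmd = new OracleCommand("SELECT Id, Name, Email FROM Customers", ora))
    using (var reader = cmd.ExecuteReader())
    using (var bulk = new SqlBulkCopy(sql) { DestinationTableName = "dbo.Customers_Staging" })
    {
        bulk.BatchSize = 10000;       // commit in chunks
        bulk.WriteToServer(reader);   // streams; no DataTable in memory
    }

    // Move only the rows whose ID is not already in the destination.
    const string fill = @"
        INSERT INTO dbo.Customers (Id, Name, Email)
        SELECT s.Id, s.Name, s.Email
        FROM dbo.Customers_Staging s
        WHERE NOT EXISTS (SELECT 1 FROM dbo.Customers c WHERE c.Id = s.Id);
        TRUNCATE TABLE dbo.Customers_Staging;";
    using (var cmd = new SqlCommand(fill, sql))
        cmd.ExecuteNonQuery();
}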
I wanted to follow up on this to share that, despite many creative solutions offered here and elsewhere, there was no efficient enough way to accomplish this task. In almost every case, I would have had to read the entire contents of two very large tables and then check for missing records without the performance advantage of a true join. I decided to go with a hybrid model in which the app obtains basic information from the main Oracle database and then reads and writes the supplemental information in and out of SQL Server.
I have a question, clearly since I'm here...
I have a source database (SQL Server 2008) and a destination database (also SQL Server 2008). Modifications need to be made, which means a lot of the data has to pass through C# (converting coordinates, triangulation, etc.). But how would you do it?
I'm looking for a few things here. Right now I'm using a SqlDataReader, pulling the data into a DataTable, and manipulating it there before I push it into the destination database. What would be a better (faster and more memory-efficient) way of doing it?
Also, for data which does not need to be manipulated I'm still pulling it through in the same way, I assume there is a way of avoiding that which would be quicker?
Technical info:
DB: 2 x SQL Server 2008 (source/dest) - On the same server
Language: C# / .NET 3.5
OS: Windows
I would insert the source data into one or more temp tables on the destination database.
Then I would use a SQL MERGE to update the destination.
MERGE (Transact-SQL)
I would minimize the role that C# plays.
You could embed the .NET code in SQL Server itself using SQLCLR, removing the round trips between the database and the client machine.
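A sketch of that staging-plus-MERGE shape from C# (object names are placeholders). Rows that need no manipulation can stream straight from a SqlDataReader into SqlBulkCopy with no DataTable at all; rows that do need C# processing can go through a wrapping IDataReader that transforms values on the fly:

// Sketch: bulk-load source rows into a staging table, then let the
// server reconcile staging with the destination via MERGE.
// Table names and columns are placeholders.
using System.Data.SqlClient;

using (var src = new SqlConnection("Server=.;Database=Source;Integrated Security=true;"))
using (var dst = new SqlConnection("Server=.;Database=Dest;Integrated Security=true;"))
{
    src.Open();
    dst.Open();

    // 1. Stream source rows into the staging table.
    using (var cmd = new SqlCommand("SELECT Id, X, Y, Z FROM dbo.Points", src))
    using (var reader = cmd.ExecuteReader())
    using (var bulk = new SqlBulkCopy(dst) { DestinationTableName = "dbo.Points_Staging" })
    {
        bulk.WriteToServer(reader);
    }

    // 2. Reconcile staging with the destination on the server.
    const string merge = @"
        MERGE dbo.Points AS t
        USING dbo.Points_Staging AS s ON t.Id = s.Id
        WHEN MATCHED THEN UPDATE SET t.X = s.X, t.Y = s.Y, t.Z = s.Z
        WHEN NOT MATCHED THEN INSERT (Id, X, Y, Z) VALUES (s.Id, s.X, s.Y, s.Z);";
    using (var cmd = new SqlCommand(merge, dst))
        cmd.ExecuteNonQuery();
}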
You should use Integration Services for ETL tasks.
I am trying to migrate data from an Informix database to SQL Server 2008. I've got quite a lot of data to move. I've been trying multiple methods to get the data over, and so far SqlBulkCopy in multiple chunks seems to be the fastest I can find. Does anyone know of a faster way? I'm trying to cut down the transfer time so that I don't run out of time on my cut-over date. Thanks.
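For reference, the chunked SqlBulkCopy transfer described above typically looks something like this sketch (the Informix ODBC DSN, query, and destination table are assumptions):

// Sketch of the chunked SqlBulkCopy transfer described above.
// The Informix ODBC DSN, query, and destination table are assumptions.
using System.Data.Odbc;
using System.Data.SqlClient;

using (var ifx = new OdbcConnection("DSN=InformixSrc;"))
using (var sql = new SqlConnection("Server=.;Database=Dest;Integrated Security=true;"))
{
    ifx.Open();
    sql.Open();

    using (var cmd = new OdbcCommand("SELECT * FROM orders", ifx))
    using (var reader = cmd.ExecuteReader())
    using (var bulk = new SqlBulkCopy(sql, SqlBulkCopyOptions.TableLock, null))
    {
        bulk.DestinationTableName = "dbo.Orders";
        bulk.BatchSize = 50000;        // commit every 50k rows
        bulk.BulkCopyTimeout = 0;      // no timeout for a long transfer
        bulk.WriteToServer(reader);    // streams without buffering in memory
    }
}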
As you mentioned, I think the bcp command is the fastest solution.
You can export your data to CSV files and then import them into your database with the bcp command.
There isn't much more you can do to get this work completed faster. One thing you might want to look at, though, is the recovery model of the SQL database. If it's currently set to Full, you're going to slow down quite a bit as the transaction log fills up; switching to Bulk-Logged or Simple for the duration of the load avoids that.
http://msdn.microsoft.com/en-us/library/ms189275.aspx
Hope that helps.
If you can use an OLE DB or ODBC connection to your Informix database, then SSIS may be the best option.
I need to copy several tables from one DB to another in SQL Server 2000, using C# (VS 2005). The call needs to be parameterized - I need to be able to pass in the name of the database to which I am going to be copying these tables.
I could use DTS with parameters, but I can't find any sample code that does this from C#.
Alternatively, I could just use
drop table TableName
select * into TableName from SourceDB..TableName
and then reconstruct the indexes etc - but that is really kludgy.
Any other ideas?
Thanks!
For SQL Server 7.0 and 2000 there is SQL-DMO for this; for SQL Server 2005 there is SMO. These let you do pretty much everything related to administering the database: scripting objects, enumerating databases, and much more. This is better, IMO, than a roll-your-own approach.
SQL 2000:
Developing SQL-DMO Applications
Transfer Object
SQL 2005:
Here is the SMO main page:
Microsoft SQL Server Management Objects (SMO)
Here is the Transfer functionality:
Transferring Data
How to: Transfer Schema and Data from One Database to Another in Visual Basic .NET
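For the SMO route, the Transfer object does the whole copy in a few lines. A rough sketch, assuming the SMO assemblies are referenced (server and database names are placeholders, and the destination name can be the parameter the question wants to pass in):

// Sketch: copy all tables (schema + data) between databases with SMO.
// Server/database names are placeholders; destDbName is the parameter
// the question wants to pass in.
using Microsoft.SqlServer.Management.Smo;

static void CopyTables(string destDbName)
{
    var server = new Server("localhost");
    Database source = server.Databases["SourceDb"];

    var transfer = new Transfer(source)
    {
        CopyAllTables = true,
        CopySchema = true,
        CopyData = true,
        DestinationServer = "localhost",
        DestinationDatabase = destDbName,
        DestinationLoginSecure = true   // use Windows authentication
    };
    transfer.TransferData();            // performs the copy
}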
If the destination table is being dropped every time then why not do SELECT INTO? Doesn't seem like a kludge at all.
If it works just fine and ticks all the requirement boxes, why create a day's worth of work writing code to do exactly the same thing?
Let SQL do all the heavy lifting for you.
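If you stay with that route, the C# side is mostly string composition, since identifiers such as the database name cannot be passed as SqlParameters. A sketch with placeholder names (validate the incoming database name against a whitelist before splicing it in):

// Sketch: DROP + SELECT INTO against a caller-specified database.
// Identifiers can't be SqlParameters, so the names are bracket-quoted
// by hand; validate destDb/table against a whitelist first.
using System.Data.SqlClient;

static void CopyTable(SqlConnection conn, string destDb, string table)
{
    string qDb = "[" + destDb.Replace("]", "]]") + "]";
    string qTable = "[" + table.Replace("]", "]]") + "]";
    string sql =
        "IF OBJECT_ID('" + destDb + ".." + table + "') IS NOT NULL " +
        "    DROP TABLE " + qDb + ".." + qTable + "; " +
        "SELECT * INTO " + qDb + ".." + qTable + " FROM SourceDB.." + qTable + ";";
    using (var cmd = new SqlCommand(sql, conn))
        cmd.ExecuteNonQuery();
}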
You could put the copy-database scripts found here:
http://www.codeproject.com/KB/database/CreateDatabaseScript.aspx
into an application; just replace the destination. To actually move the entire database, follow:
http://support.microsoft.com/kb/314546
But remember, the database has to be taken offline first.
Thanks