An Explanation of MySqlBulkLoader - C#

Can you tell me what MySqlBulkLoader is for, and where and how to use it?
Some examples would also be appreciated.

MySqlBulkLoader is a class in MySQL Connector/NET that wraps the MySQL statement LOAD DATA INFILE. This gives Connector/NET the ability to load a data file from a local or remote host into the server.
An example of how to use MySqlBulkLoader is also presented here.
To be clear:
MySqlBulkLoader is not equivalent to SqlBulkCopy. SqlBulkCopy (also called bulk insert) reads data from a DataTable, while MySqlBulkLoader (also called LOAD DATA INFILE) reads from a file. If you have a list of data to insert into your database, you can prepare and insert it directly with SqlBulkCopy, whereas with MySqlBulkLoader you first need to generate a file from your data before running the command.
There is no counterpart of SqlBulkCopy in MySQL Connector/NET at the time of writing; however, MySQL itself supports bulk inserts, so you can run the corresponding command through a MySqlCommand, as presented here.
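Since MySQL supports multi-row inserts, the MySqlCommand route mentioned above can be sketched roughly as follows. This is a minimal sketch, not a definitive implementation: it assumes the MySql.Data package, and the table name, columns, and connection string are hypothetical.

```csharp
using System;
using System.Linq;
using MySql.Data.MySqlClient;

// Sketch: bulk-style insert via one multi-row, parameterised INSERT.
// Table `people(id, name)` and the connection string are hypothetical.
var rows = new[] { (1, "Alice"), (2, "Bob"), (3, "Carol") };

using var conn = new MySqlConnection("server=localhost;database=test;uid=user;pwd=pass");
conn.Open();

// Build "(@id0, @name0),(@id1, @name1),..." so a single round trip inserts every row.
var values = string.Join(",", rows.Select((_, i) => $"(@id{i}, @name{i})"));
using var cmd = new MySqlCommand($"INSERT INTO people (id, name) VALUES {values}", conn);
for (int i = 0; i < rows.Length; i++)
{
    cmd.Parameters.AddWithValue($"@id{i}", rows[i].Item1);
    cmd.Parameters.AddWithValue($"@name{i}", rows[i].Item2);
}
cmd.ExecuteNonQuery();
```

For large volumes you would batch the rows (say, a few thousand per statement) and wrap the batches in a transaction.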

MySqlBulkLoader is a class provided by MySQL Connector/NET.
It provides an interface to MySQL that is similar in concept to the SqlBulkCopy class / BCP for SQL Server. Basically, it allows you to load data into MySQL in bulk. A decent-looking example can be found at dragthor.wordpress.com, and there's also an example in the MySQL documentation.
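In the spirit of those examples, basic usage looks roughly like the sketch below. The connection string, table name, and file path are hypothetical; it assumes the MySql.Data package and a CSV file with a header row.

```csharp
using MySql.Data.MySqlClient;

// Sketch: load a CSV file into a table with MySqlBulkLoader
// (which wraps LOAD DATA INFILE under the hood).
using var conn = new MySqlConnection("server=localhost;database=test;uid=user;pwd=pass");
conn.Open();

var loader = new MySqlBulkLoader(conn)
{
    TableName = "people",             // hypothetical destination table
    FileName = @"C:\data\people.csv", // hypothetical source file
    FieldTerminator = ",",
    LineTerminator = "\n",
    NumberOfLinesToSkip = 1,          // skip the CSV header row
    Local = true                      // read the file from the client, not the server
};
int rowsInserted = loader.Load();     // returns the number of rows loaded
```

Note that `Local = true` requires `local_infile` to be enabled on the server (and `AllowLoadLocalInfile=true` in newer connector versions).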


How to add columns to a datareader

I have almost exactly the same issue as the scenario linked below, but unfortunately I'm unable to recreate the solutions successfully.
I have a C# application using SQL bulk import with a data reader and WriteToServer, where the source is a SqlDataReader or an OracleDataReader, and I need to add columns to the result set.
I cannot do it in the source SQL statement.
I cannot load a DataTable first and modify it, as it's hundreds of GBs of data, almost a terabyte.
How to add columns to DataReader
Can anyone provide a working example and help push me over this problem?
I temporarily found a solution using SQL Server Integration Services (SSIS), but what I found while watching it run is that it downloads all the data to a DTS buffer, then does the column modifications, and then pumps the data into SQL Server. Try doing that with a couple hundred GBs of data and it does not perform well, even if you can get your infrastructure to build you a 24-core VM with 128 GB of memory.
I finally have a small working example; the CodeProject article (jdweng) was helpful.
I will post a follow-up. I've tested with SQL Server (SqlDataReader); I still need to test with an Oracle data reader.
One of the cases I was trying was converting an Oracle unique ID (stored as a string) into a SQL Server uniqueidentifier. I want to convert it on the fly; there is no way to adjust the source Oracle statement (ADyson) to return a datatype compatible with SQL Server. Altering a 1 TB table afterwards from varchar(40) to uniqueidentifier is painful, but if I could just change it as part of the bulk insert, it would be quick.
And I think now I will be able to.
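One way to approach this, sketched under the assumption that SqlBulkCopy only needs the streaming members of IDataReader, is a thin wrapper that delegates everything to the source reader and appends one computed column at the end. The class, column names, and converter below are my own illustration, not the code from any of the linked answers.

```csharp
using System;
using System.Data;

// Sketch: wrap an existing IDataReader and expose one extra computed column,
// so SqlBulkCopy.WriteToServer can stream it without buffering a DataTable.
// The converter sees the inner reader positioned on the current row.
public sealed class ExtraColumnReader : IDataReader
{
    private readonly IDataReader _inner;
    private readonly string _extraName;
    private readonly Type _extraType;
    private readonly Func<IDataReader, object> _compute;

    public ExtraColumnReader(IDataReader inner, string extraName, Type extraType,
                             Func<IDataReader, object> compute)
    {
        _inner = inner; _extraName = extraName; _extraType = extraType; _compute = compute;
    }

    public int FieldCount => _inner.FieldCount + 1;

    public object GetValue(int i) =>
        i < _inner.FieldCount ? _inner.GetValue(i) : _compute(_inner);

    public string GetName(int i) => i < _inner.FieldCount ? _inner.GetName(i) : _extraName;

    public int GetOrdinal(string name) =>
        string.Equals(name, _extraName, StringComparison.OrdinalIgnoreCase)
            ? _inner.FieldCount : _inner.GetOrdinal(name);

    public Type GetFieldType(int i) =>
        i < _inner.FieldCount ? _inner.GetFieldType(i) : _extraType;

    public bool IsDBNull(int i) => GetValue(i) == DBNull.Value;
    public bool Read() => _inner.Read();
    public object this[int i] => GetValue(i);
    public object this[string name] => GetValue(GetOrdinal(name));

    public int GetValues(object[] values)
    {
        int n = Math.Min(values.Length, FieldCount);
        for (int i = 0; i < n; i++) values[i] = GetValue(i);
        return n;
    }

    // Typed getters route through GetValue so they also work for the extra column.
    public bool GetBoolean(int i) => (bool)GetValue(i);
    public byte GetByte(int i) => (byte)GetValue(i);
    public char GetChar(int i) => (char)GetValue(i);
    public DateTime GetDateTime(int i) => (DateTime)GetValue(i);
    public decimal GetDecimal(int i) => (decimal)GetValue(i);
    public double GetDouble(int i) => (double)GetValue(i);
    public float GetFloat(int i) => (float)GetValue(i);
    public Guid GetGuid(int i) => (Guid)GetValue(i);
    public short GetInt16(int i) => (short)GetValue(i);
    public int GetInt32(int i) => (int)GetValue(i);
    public long GetInt64(int i) => (long)GetValue(i);
    public string GetString(int i) => (string)GetValue(i);
    public string GetDataTypeName(int i) => GetFieldType(i).Name;

    // Streams and nested readers only make sense for the inner columns.
    public long GetBytes(int i, long o, byte[] b, int bo, int len) => _inner.GetBytes(i, o, b, bo, len);
    public long GetChars(int i, long o, char[] b, int bo, int len) => _inner.GetChars(i, o, b, bo, len);
    public IDataReader GetData(int i) => _inner.GetData(i);

    public int Depth => _inner.Depth;
    public bool IsClosed => _inner.IsClosed;
    public int RecordsAffected => _inner.RecordsAffected;
    public bool NextResult() => _inner.NextResult();
    public DataTable GetSchemaTable() => _inner.GetSchemaTable(); // extra column not reflected
    public void Close() => _inner.Close();
    public void Dispose() => _inner.Dispose();
}
```

For the string-to-uniqueidentifier case, the wrapper would be constructed with something like `new ExtraColumnReader(oracleReader, "NewId", typeof(Guid), r => Guid.Parse(r.GetString(idOrdinal)))` (names hypothetical) and passed straight to `bulkCopy.WriteToServer`, with a column mapping pointing "NewId" at the uniqueidentifier destination column.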

Inserting missing records from Oracle DB into SQL Server database

I have an Oracle database with a table Customers to which I only have read-only access. I would like to copy some columns from this table and insert them into a SQL Server 2008 R2 table called Customers, only when the ID does not exist in the destination table.
I'm new to C#. I can open and read from Oracle successfully using Oracle.DataAccess, and I can write to SQL Server, but I am lost on how to read from Oracle and then write into SQL Server. All the examples I could find are for databases of the same type.
I wanted to follow up on this to share that despite many creative solutions offered here and elsewhere, there was not an efficient enough way to accomplish this task. In most every case, I would have to read the entire contents of two very large tables and then check for missing records without the performance advantage of a true join. I decided to go with a hybrid model where the app obtains basic information from the main Oracle database and then reads and writes the supplemental information in and out of SQL Server.
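For reference, the staging-table pattern that most of those suggestions boiled down to can be sketched as follows. This is only a sketch: the connection strings, table names, and columns are hypothetical, and it assumes ODP.NET (Oracle.DataAccess) and System.Data.SqlClient.

```csharp
using Oracle.DataAccess.Client;   // ODP.NET
using System.Data.SqlClient;

// Sketch: stream the Oracle rows into a SQL Server staging table with
// SqlBulkCopy, then insert only the missing IDs with one set-based query.
using var ora = new OracleConnection("Data Source=ORCL;User Id=reader;Password=secret;");
using var sql = new SqlConnection("Server=.;Database=Crm;Integrated Security=true;");
ora.Open();
sql.Open();

using (var cmd = new OracleCommand("SELECT ID, NAME, EMAIL FROM CUSTOMERS", ora))
using (var reader = cmd.ExecuteReader())
using (var bulk = new SqlBulkCopy(sql) { DestinationTableName = "dbo.Customers_Staging" })
{
    bulk.WriteToServer(reader);   // streams row by row; no DataTable buffering
}

// Anti-join on the server: insert only rows whose ID is not already present.
const string copyMissing = @"
    INSERT INTO dbo.Customers (ID, Name, Email)
    SELECT s.ID, s.Name, s.Email
    FROM dbo.Customers_Staging s
    WHERE NOT EXISTS (SELECT 1 FROM dbo.Customers c WHERE c.ID = s.ID);
    TRUNCATE TABLE dbo.Customers_Staging;";
using (var merge = new SqlCommand(copyMissing, sql))
    merge.ExecuteNonQuery();
```

The trade-off, as noted above, is that the full Oracle table still has to cross the wire into the staging table; only the missing-record check itself benefits from a true join.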

MySqlBulkLoader using a DataTable instead of a file

I'm providing MySql compatibility for my program that previously worked only with SQL Server. I used SqlBulkCopy and I would like to use it with MySql as well. I know there is MySqlBulkLoader that can be used to perform the same task. The difference however is that SqlBulkCopy worked with a DataTable so I prepared my DataTable and then performed the copy. MySqlBulkLoader, as far as I know, is used to copy an entire file into the database. But I am not dealing with a file here and I would prefer to skip extra steps of converting my DataTable into a temp file, performing the BulkCopy and then deleting the temp file.
Is there a way to make MySqlBulkLoader work with DataTables? Is there a trustworthy alternative to MySqlBulkLoader?
I assume that you're using MySQL Connector/NET, but which version of it?
Assuming that you're using the latest version (8.0 at the time of writing), a look at the MySQL Connector/NET 8.0 API reference shows that there is no option other than importing your data from an existing file.
It seems your proposed method is the only workaround.
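That temp-file workaround can be sketched roughly as below. This is an illustration, not official API usage beyond MySqlBulkLoader itself: the tab separator and `\N` null marker match LOAD DATA INFILE's defaults, and an open connection is assumed to be passed in.

```csharp
using System;
using System.Data;
using System.IO;
using System.Linq;
using MySql.Data.MySqlClient;

// Sketch: dump the DataTable to a temp file, point MySqlBulkLoader at it,
// then delete the file. Assumes `conn` is an open MySqlConnection and the
// DataTable's name and column order match the destination table.
static void BulkLoad(DataTable table, MySqlConnection conn)
{
    string path = Path.GetTempFileName();
    try
    {
        using (var writer = new StreamWriter(path))
            foreach (DataRow row in table.Rows)
                writer.WriteLine(string.Join("\t",
                    row.ItemArray.Select(v => v == DBNull.Value ? @"\N" : v.ToString())));

        var loader = new MySqlBulkLoader(conn)
        {
            TableName = table.TableName,
            FileName = path,
            FieldTerminator = "\t",
            LineTerminator = "\n",
            Local = true            // the temp file lives on the client
        };
        loader.Load();
    }
    finally
    {
        File.Delete(path);          // don't leave the temp file behind
    }
}
```

A real implementation would also escape embedded tabs and newlines in string values; the sketch skips that for brevity.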

Convert ODBC Datasource to .MDB file

I currently have a database connected to ODBC using the DBISAM 4 ODBC Driver.
I need a way to convert this database into an .MDB access database file using code.
I suggest doing it in 2 steps:
Convert the database schema. In this step, create an SQL file with CREATE TABLE commands built from your source database. Some data types may differ in your source and may be hard to convert to MS Access. Try running those SQL commands on MS Access and correct errors until your schema looks identical (the same names of tables and columns, and identical or very similar data types).
Copy the data. Now you have an identical or very similar schema on both sides, so export the source data into the destination tables. There are many ways of doing it. I prefer Jython with JDBC drivers and a PreparedStatement INSERT, with code that looks like:
insert_stmt.setObject(i, rs_in.getObject(i))
This works over ODBC because JDK 1.7 and earlier ship a JDBC-ODBC bridge (it disappeared in JDK 1.8). In the .NET environment the approach is very similar.
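The .NET equivalent of that copy loop can be sketched as follows. The DSN, .mdb path, and table name are hypothetical; the row-copy mirrors the `setObject(i, getObject(i))` idea above, and the Jet OLE DB provider is assumed for writing the Access file.

```csharp
using System.Data.Odbc;
using System.Data.OleDb;
using System.Linq;

// Sketch: read every row from the ODBC source and replay it into the
// Access .mdb through a parameterised INSERT (positional '?' markers).
using var src = new OdbcConnection("DSN=DbisamSource;");
using var dst = new OleDbConnection(
    @"Provider=Microsoft.Jet.OLEDB.4.0;Data Source=C:\out\copy.mdb;");
src.Open();
dst.Open();

using var reader = new OdbcCommand("SELECT * FROM Customers", src).ExecuteReader();
int cols = reader.FieldCount;
string placeholders = string.Join(",", Enumerable.Repeat("?", cols));
using var insert = new OleDbCommand($"INSERT INTO Customers VALUES ({placeholders})", dst);
for (int i = 0; i < cols; i++)
    insert.Parameters.Add(new OleDbParameter());

while (reader.Read())
{
    for (int i = 0; i < cols; i++)
        insert.Parameters[i].Value = reader.GetValue(i);   // like setObject(i, getObject(i))
    insert.ExecuteNonQuery();
}
```

For large tables you would wrap the loop in an OleDbTransaction and commit in batches rather than per row.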

Transfer data from SQL Server to MySQL

I need to transfer around 2m records from 4 tables in SQL Server 2008 Express to MySQL.
In C# I can insert these records very quickly within a transaction using a table-valued parameter (around 50 seconds).
How can I do something similar for MySQL in C#?
Read the explanations in the MySQL Reference Manual. The best you can do is use LOAD DATA INFILE while disabling indexes beforehand and recreating them (and thus batch-calculating them) afterwards. There is more interesting advice there if that doesn't work out for you.
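Put together in Connector/NET terms, that advice looks roughly like the sketch below. The table name, file path, and connection string are hypothetical; note that ALTER TABLE ... DISABLE KEYS affects non-unique indexes on MyISAM tables only (InnoDB ignores it with a warning).

```csharp
using MySql.Data.MySqlClient;

// Sketch: disable secondary indexes, bulk-load the file, then rebuild the
// indexes in one batch pass afterwards.
using var conn = new MySqlConnection("server=localhost;database=test;uid=user;pwd=pass");
conn.Open();

using (var disable = new MySqlCommand("ALTER TABLE orders DISABLE KEYS;", conn))
    disable.ExecuteNonQuery();

var loader = new MySqlBulkLoader(conn)
{
    TableName = "orders",
    FileName = @"C:\data\orders.csv",   // hypothetical export of the SQL Server data
    FieldTerminator = ",",
    LineTerminator = "\n",
    Local = true
};
loader.Load();

using (var enable = new MySqlCommand("ALTER TABLE orders ENABLE KEYS;", conn))
    enable.ExecuteNonQuery();           // rebuilds the disabled indexes in bulk
```

You would run one such load per table, exporting each of the four SQL Server tables to its own file first.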
Are you locked into using C# for the migration? If not, and you only want to transfer the data, you can use the MySQL Migration Toolkit to do the transfer for you.
