I have to copy more than 100k records into a table, and after every row is inserted into the first table, a second table must be updated with the first table's insert ID (primary key).
I tried a bulk insert, but then I would not get all of the inserted IDs that need to go into the second table.
I am using MySQL 5.5. When I run the following code, I get the following errors at random:
Lost connection to MySQL server during query.
The transaction associated with the current connection has completed
but has not been disposed. The transaction must be disposed before
the connection can be used to execute SQL statements.
Net packets out of order: received[x], expected[y].
How can I insert these records optimally?
Code:
foreach (var transaction in transactions)
{
    int transactionId;
    using (MySqlCommand cm = DM.ConnectionManager.Conn.CreateCommand())
    {
        cm.CommandType = System.Data.CommandType.Text;
        var commandText = @"INSERT INTO FirstTable SET
            column1=@column1;";
        cm.CommandText = commandText;
        cm.Parameters.AddWithValue("@column1", transaction.column1);
        cm.ExecuteNonQuery();
        // Last AUTO_INCREMENT id generated on this connection
        // (LastInsertedId in the MySql.Data connector).
        transactionId = (int)cm.LastInsertedId;
    }
    foreach (var detail in transaction.TransactionDetails)
    {
        using (MySqlCommand cm = DM.ConnectionManager.Conn.CreateCommand())
        {
            cm.CommandType = System.Data.CommandType.Text;
            var commandText = @"INSERT INTO SecondTable SET
                column1=@column1,
                column2=@column2;";
            cm.CommandText = commandText;
            cm.Parameters.AddWithValue("@column1", detail.column1);
            cm.Parameters.AddWithValue("@column2", detail.column2);
            cm.ExecuteNonQuery();
        }
    }
}
The fastest way to import into MySQL is LOAD DATA INFILE, so I would suggest writing a CSV file and then running the following as a SQL statement.
Please note that I don't have a full C# setup to test this with, but it is how I back up and restore MySQL when I want it done fast. So I'm assuming the following can be assigned to commandText and run after the CSV file has been created and written to disk.
LOAD DATA LOCAL INFILE 'import.csv' INTO TABLE MyTable FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
(Col1,Col2,Col3);
From https://dev.mysql.com/doc/refman/5.7/en/load-data.html "The LOAD DATA INFILE statement reads rows from a text file into a table at a very high speed..."
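In case it helps, here is a minimal, untested C# sketch of that approach using the MySql.Data connector. The connection string, file path, table, and column names are illustrative, and recent connector versions may also require AllowLoadLocalInfile=true in the connection string before LOCAL INFILE is permitted:
using System;
using System.IO;
using MySql.Data.MySqlClient;

class CsvBulkLoad
{
    static void Main()
    {
        // Illustrative connection string; AllowLoadLocalInfile=true may be
        // needed by newer versions of the connector for LOCAL INFILE.
        var connStr = "server=localhost;database=test;uid=me;pwd=secret;AllowLoadLocalInfile=true";

        // 1. Write the rows out as a CSV file first.
        using (var writer = new StreamWriter("import.csv"))
        {
            writer.WriteLine("\"1\",\"alpha\",\"beta\"");
            writer.WriteLine("\"2\",\"gamma\",\"delta\"");
        }

        // 2. Slurp the whole file into the table in one statement.
        using (var conn = new MySqlConnection(connStr))
        {
            conn.Open();
            var commandText = @"LOAD DATA LOCAL INFILE 'import.csv' INTO TABLE MyTable
                FIELDS TERMINATED BY ',' ENCLOSED BY '""'
                LINES TERMINATED BY '\n'
                (Col1,Col2,Col3);";
            using (var cmd = new MySqlCommand(commandText, conn))
            {
                var rows = cmd.ExecuteNonQuery();
                Console.WriteLine("Loaded {0} rows", rows);
            }
        }
    }
}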
Related
I am trying to add records to a SQL Server database. The connection works fine for every table except one.
This is my code:
private void saveCurrent()
{
    // Save the entry into the database.
    try
    {
        string query = "INSERT INTO entries VALUES (@Date, @Tags, @Entry, @User)";
        using (connection = new SqlConnection(connectionString))
        using (SqlCommand command = new SqlCommand(query, connection))
        {
            connection.Open();
            command.Parameters.AddWithValue("@Date", System.DateTime.Now.ToLongDateString());
            command.Parameters.AddWithValue("@Tags", txtTags.Text);
            command.Parameters.AddWithValue("@Entry", richEntryBox.Text);
            command.Parameters.AddWithValue("@User", Form1.userName);
            command.ExecuteNonQuery();
            isDirty = false;
        }
    }
    catch (Exception exception)
    {
        MessageBox.Show("There was an error saving this entry: " + exception.ToString());
    }
}
The error is:
System.Data.SqlClient.SqlException (0x80131904): Column name or number of supplied values does not match table definition.
All of the columns are of type nvarchar(50) or nvarchar(MAX). I am trying to enter only text information, no binaries. The dataset shows that the table has a "photos" column, but it is nullable and I'm not using it (for some reason, I cannot get VS2017 to delete that column). I have altered the dataset to not include the "photos" field, but I am still receiving the error. Any push toward the solution would be appreciated. A snap of the dataset is included here.
My dataset, in which I've removed the photos column:
--S
If your database still has the photos field, you'll need to specify the columns for insertion explicitly.
So change your insert to:
string query = "INSERT INTO entries (date, tags, entry, user) VALUES (@Date, @Tags, @Entry, @User)";
In general, you want to be explicit with your insertions. What would happen if someone added a column after tags and before entry in the database? This would break your code.
I have to retrieve data from a table on one SQL Server instance, filtered by data from a table on another SQL Server instance:
select * from servernameA.db_name.dbo.table_name
where column in (select column from servernameb.db_name.dbo.table_name)
So what's the problem? You can do it.
Just a little performance improvement in your query with DISTINCT.
select * from servernameA.db_name.dbo.table_name
where column in (select DISTINCT column from servernameb.db_name.dbo.table_name)
For your reference:
Query across multiple databases on same server
Selecting data from two different servers in SQL Server
------------Edit----------------
How about this? I haven't tried it myself, but give it a try.
// connectionString is the connection string for ServerA..DB1
using (var conn = new SqlConnection(connectionString))
{
    conn.Open();
    // commandText is the SQL query that uses the full four-part name on Server2..DB2
    var commandText = "SELECT column FROM servernameb.db_name.dbo.table_name";
    using (var command = new SqlCommand(commandText, conn))
    {
        using (var reader = command.ExecuteReader())
        {
            while (reader.Read())
            {
                // read the result row by row
            }
        }
    }
}
There is a Windows Forms application. I am using an MS Access database for some data manipulation. I want to copy data from one database to another. The table name, schema, and data types are the same in both tables.
I am using the query below to bulk insert data into the destination database by selecting data from the source database.
INSERT INTO [Table1] IN 'C:\Data\Users.mdf' SELECT * FROM [Table1]
After the data is inserted, I query the target table to fetch the inserted data. I am using OleDbConnection to perform the database operations.
The issue I am facing is that when I execute the SELECT statement to fetch the data after the INSERT query above has run, I do not get the data. However, when I check in debugging mode, I do get the data.
I noticed that if I wait for some time after the INSERT statement has executed, the data comes back correctly. So I assume the bulk insert operation needs some time (a delay?) to complete.
I tried adding Task.Delay(20000) after the INSERT query execution, but no luck. Could someone help me resolve this issue? Any help is highly appreciated.
I didn't find a good way to handle this, but I did work around it. After the data is inserted into the table, I fire another query to check whether there is any data in the target table yet. This happens in a do..while loop, as follows. The table is dropped every time the operation completes.
var insertQuery = @"INSERT INTO [Table1] IN 'C:\Data\Users.mdf' SELECT * FROM [Table1]";
ExecuteQuery(insertQuery, connProd);

// Poll the copied table until its first row shows up, for at most 10 tries.
var count = 10;
do
{
    var selectQuery = "SELECT TOP 1 * FROM " + tableProdCopy;
    var dtTopRowData = GetQueryData(selectQuery, connOther);
    if (dtTopRowData != null && dtTopRowData.Rows.Count > 0)
    {
        break;
    }
    System.Threading.Thread.Sleep(2000);
    count = count - 1;
} while (count > 0);
private DataTable GetQueryData(string query, OleDbConnection conn)
{
    using (OleDbCommand cmdOutput = new OleDbCommand(query, conn))
    using (OleDbDataAdapter adapterOutput = new OleDbDataAdapter(cmdOutput))
    {
        var dtOutput = new DataTable();
        adapterOutput.Fill(dtOutput);
        return dtOutput;
    }
}

private void ExecuteQuery(string query, OleDbConnection conn)
{
    using (OleDbCommand cmdInput = new OleDbCommand(query, conn))
    {
        cmdInput.ExecuteNonQuery();
    }
}
I have a stored procedure that looks roughly like this:
CREATE PROCEDURE InsertItem(IN itemId INT, IN name TEXT)
    INSERT INTO ... ;
Is there a way I could execute a batch of these at once?
That is, instead of executing something like this:
using (MySqlConnection connection = new MySqlConnection(_connectionString))
{
    connection.Open();
    foreach (Item item in GetItems())
    {
        using (MySqlCommand command = new MySqlCommand("InsertItem", connection))
        {
            command.CommandType = CommandType.StoredProcedure;
            command.Parameters.AddWithValue("@itemId", item.ItemId);
            command.Parameters.AddWithValue("@name", item.Name);
            command.ExecuteNonQuery();
        }
    }
}
I'm trying to achieve code like the following, without success:
using (MySqlConnection connection = new MySqlConnection(_connectionString))
{
    connection.Open();
    // MySqlCommandBulk and MySqlCommandBulkItem are imaginary types --
    // this is the API I wish existed.
    using (MySqlCommandBulk command = new MySqlCommandBulk("InsertItem", connection))
    {
        command.CommandType = CommandType.StoredProcedure;
        foreach (Item item in GetItems())
        {
            MySqlCommandBulkItem bulkItem = new MySqlCommandBulkItem();
            bulkItem["itemId"] = item.ItemId;
            bulkItem["name"] = item.Name;
            command.BulkItems.Add(bulkItem);
        }
        command.Execute();
    }
}
My point is that the command would send all of the data at once, rather than sending each query separately.
Any ideas?
The Oracle connector for the .NET Framework allows the use of arrays in place of scalars for parameters, but the MySQL connector doesn't.
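For contrast, here is a rough sketch of what that array binding looks like with Oracle's managed provider (Oracle.ManagedDataAccess); the procedure, parameter names, and data are illustrative, not taken from your code:
using System.Data;
using Oracle.ManagedDataAccess.Client;

using (var conn = new OracleConnection(_connectionString))
using (var cmd = new OracleCommand("InsertItem", conn))
{
    conn.Open();
    cmd.CommandType = CommandType.StoredProcedure;

    int[] ids = { 1, 2, 3 };
    string[] names = { "a", "b", "c" };

    // One round trip; the provider runs the procedure once per array element.
    cmd.ArrayBindCount = ids.Length;
    cmd.Parameters.Add("itemId", OracleDbType.Int32).Value = ids;
    cmd.Parameters.Add("name", OracleDbType.Varchar2).Value = names;
    cmd.ExecuteNonQuery();
}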
There are two ways to accelerate bulk loads in MySQL.
One of them applies to InnoDB tables but doesn't help with MyISAM tables. Start a transaction. Then, after every few hundred rows, COMMIT it and start another one. That will commit your table inserts in bunches, which is faster than autocommitting them individually.
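A minimal sketch of that batched-commit approach, reusing the InsertItem loop from the question (the batch size of 500 is just an illustration):
using (var connection = new MySqlConnection(_connectionString))
{
    connection.Open();
    MySqlTransaction transaction = connection.BeginTransaction();
    int rowsInBatch = 0;
    foreach (Item item in GetItems())
    {
        using (var command = new MySqlCommand("InsertItem", connection, transaction))
        {
            command.CommandType = CommandType.StoredProcedure;
            command.Parameters.AddWithValue("@itemId", item.ItemId);
            command.Parameters.AddWithValue("@name", item.Name);
            command.ExecuteNonQuery();
        }
        // Commit every few hundred rows instead of autocommitting each insert.
        if (++rowsInBatch == 500)
        {
            transaction.Commit();
            transaction = connection.BeginTransaction();
            rowsInBatch = 0;
        }
    }
    transaction.Commit(); // commit the final partial batch
}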
The other is to use MySQL's LOAD DATA INFILE command to slurp up a data file and bulk-insert it into the database. This is very fast, but you have to be diligent about formatting your file correctly.
The company that I work for has large databases, with millions of records in a single table. I have written a C# program that migrates tables between remote servers.
I first create all the tables using SMO without copying data, and then the data insertion is done after all the tables have been created.
During the record insertion, since there are so many records, the console window remains blank until all the rows have been inserted. Due to the sheer volume of data this takes a long time.
What I want now is a way to print "n rows updated" messages, like the MSSQL Import and Export Data wizard does.
The insert part is just a simple INSERT INTO ... SELECT * query.
It sounds like you might be using SqlCommand; if so, here is a sample:
using (SqlConnection connection = new SqlConnection(Connection.ConnectionString))
{
    using (SqlCommand command = new SqlCommand("insert into OldCustomers select * from customers", connection))
    {
        connection.Open();
        var numRows = command.ExecuteNonQuery();
        Console.WriteLine("Affected Rows: {0}", numRows);
    }
}
You definitely need to look at the OUTPUT clause. There are useful examples on MSDN.
using (SqlConnection conn = new SqlConnection(connectionStr))
{
    var sqlCmd = @"
        CREATE TABLE #tmp (
            InsertedId BIGINT
        );
        INSERT INTO TestTable
        OUTPUT Inserted.Id INTO #tmp
        VALUES ....
        SELECT COUNT(*) FROM #tmp;";
    using (SqlCommand cmd = new SqlCommand(sqlCmd, conn))
    {
        conn.Open();
        // ExecuteScalar returns the result of the final SELECT COUNT(*).
        var numRows = cmd.ExecuteScalar();
        Console.WriteLine("Affected Rows: {0}", numRows);
    }
}
I also suggest using a stored procedure for such purposes.