Dynamically print number of rows updated - C#

The company that I work for has large databases, with millions of records in a single table. I have written a C# program that migrates tables between remote servers.
I first create all the tables using SMO without copying data, and then insert the data after all the tables have been created.
During the record insertion, since there are so many records, the console window remains blank until all the rows have been inserted. Given the sheer volume of data, this takes a long time.
What I want now is a way to print "n rows updated" progress messages, like the SQL Server Import and Export Wizard does.
The insert part is just a simple INSERT INTO ... SELECT * query.

It sounds like you might be using SqlCommand; if so, here is a sample:
using (SqlConnection connection = new SqlConnection(Connection.ConnectionString))
{
    using (SqlCommand command = new SqlCommand("insert into OldCustomers select * from customers", connection))
    {
        connection.Open();
        var numRows = command.ExecuteNonQuery();
        Console.WriteLine("Affected Rows: {0}", numRows);
    }
}
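Note that ExecuteNonQuery only reports the total once the whole statement finishes. If you want rolling progress while the copy runs, one option is to stream the rows through SqlBulkCopy and use its NotifyAfter/SqlRowsCopied callback. A minimal sketch, with connection strings and table names assumed:
using (var src = new SqlConnection(sourceConnectionString))
using (var dst = new SqlConnection(destinationConnectionString))
{
    src.Open();
    dst.Open();
    using (var cmd = new SqlCommand("SELECT * FROM customers", src))
    using (var reader = cmd.ExecuteReader())
    using (var bulk = new SqlBulkCopy(dst))
    {
        bulk.DestinationTableName = "OldCustomers";
        bulk.NotifyAfter = 10000; //raise SqlRowsCopied every 10,000 rows
        bulk.SqlRowsCopied += (s, e) =>
            Console.WriteLine("{0} rows copied...", e.RowsCopied);
        bulk.WriteToServer(reader);
    }
}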

You should definitely look at the OUTPUT clause. There are useful examples on MSDN.
using (SqlConnection conn = new SqlConnection(connectionStr))
{
    var sqlCmd = @"
        CREATE TABLE #tmp (
            InsertedId BIGINT
        );
        INSERT INTO TestTable
        OUTPUT Inserted.Id INTO #tmp
        VALUES ....
        SELECT COUNT(*) FROM #tmp";
    using (SqlCommand cmd = new SqlCommand(sqlCmd, conn))
    {
        conn.Open();
        //ExecuteScalar returns the value of the final SELECT COUNT(*)
        var numRows = cmd.ExecuteScalar();
        Console.WriteLine("Affected Rows: {0}", numRows);
    }
}
I also suggest using a stored procedure for such purposes.
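For example, a minimal sketch of calling such a procedure; the procedure name is hypothetical and assumed to wrap the INSERT ... OUTPUT pattern above, ending with the SELECT COUNT(*):
using (var conn = new SqlConnection(connectionStr))
using (var cmd = new SqlCommand("dbo.InsertWithCount", conn)) //hypothetical proc
{
    cmd.CommandType = CommandType.StoredProcedure;
    conn.Open();
    var numRows = (int)cmd.ExecuteScalar(); //value of the proc's final SELECT COUNT(*)
    Console.WriteLine("Affected Rows: {0}", numRows);
}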

Related

Select after bulk insert doesn't work in MS Access database

There is a Windows Forms application. I am using an MS Access database for some data manipulation. I want to copy data from one database to another. The table name, schema, and data types are the same in both tables.
I am using the query below to bulk insert data into the destination database by selecting data from the source database.
INSERT INTO [Table1] IN 'C:\Data\Users.mdf' SELECT * FROM [Table1]
After the data is inserted, I query the target table to fetch the inserted data. I am using OleDbConnection to perform the database operations.
The issue I am facing is that after the above INSERT query executes, the SELECT statement returns no data. However, when I check in debugging mode, the data is there.
I noticed that if I wait for some time after the INSERT statement executes, the data comes back correctly. So I assume the bulk insert operation needs some time (a delay?) to complete.
I tried adding Task.Delay(20000) after the INSERT query execution, but no luck. Could someone help me resolve this issue? Any help is highly appreciated.

I didn't find a clean way to handle this, but I worked around it as follows. After the data is inserted into the table, I fire another query to check whether there is any data in the target table, inside a do..while loop. The table is dropped every time the operation completes.
var insertQuery = @"INSERT INTO [Table1] IN 'C:\Data\Users.mdf' SELECT * FROM [Table1]";
ExecuteQuery(insertQuery, connProd);

var count = 10;
do
{
    var selectQuery = "SELECT TOP 1 * FROM " + tableProdCopy;
    var dtTopRowData = GetQueryData(selectQuery, connOther);
    if (dtTopRowData != null && dtTopRowData.Rows.Count > 0)
    {
        count = 0;
        break;
    }
    System.Threading.Thread.Sleep(2000);
    count = count - 1;
} while (count > 0);
private DataTable GetQueryData(string query, OleDbConnection conn)
{
    using (OleDbCommand cmdOutput = new OleDbCommand(query, conn))
    {
        using (OleDbDataAdapter adapterOutput = new OleDbDataAdapter(cmdOutput))
        {
            var dtOutput = new DataTable();
            adapterOutput.Fill(dtOutput);
            return dtOutput;
        }
    }
}

private void ExecuteQuery(string query, OleDbConnection conn)
{
    using (OleDbCommand cmdInput = new OleDbCommand(query, conn))
    {
        cmdInput.ExecuteNonQuery();
    }
}

Copying data from one database to another - Out of Memory Exception

I have a task where I want to copy all data from one database to another, skipping 2 tables. There are more than 200 tables.
I have the table structure ready for my second database.
So as a solution, I created a page, and on a button click I have the code below:
DataSet ds = new DataSet();
string connectionString = "Data Source=COMP112\\MSSQLSERVER2014;Initial Catalog=HCMBL;Integrated Security=True;Persist Security Info=True";
SqlConnection con = new SqlConnection(connectionString);

//render table name from database
string sqlTable = "SELECT * FROM INFORMATION_SCHEMA.TABLES WHERE TABLE_TYPE='BASE TABLE' and TABLE_Schema='" + Session["SchemaName"].ToString() + "' and TABLE_NAME!='ENTRY' and TABLE_NAME!='OT' and TABLE_NAME!='BL_ENTRY' and TABLE_NAME!='BL_OT'";
con.Open();
SqlDataAdapter da = new SqlDataAdapter();
SqlCommand cmd = new SqlCommand(sqlTable, con);
cmd.CommandType = CommandType.Text;
da.SelectCommand = cmd;
da.Fill(ds);
con.Close();

//render connection string from WebConfig file
string strcon = ConfigurationManager.ConnectionStrings["SPSchema"].ConnectionString;
for (int i = 0; i < ds.Tables[0].Rows.Count; i++)
{
    if (!(ds.Tables[0].Rows[i]["TABLE_NAME"].ToString().Contains("Asp")))
    {
        string deleteQuery = "Truncate table " + Session["SchemaName"].ToString() + "." + ds.Tables[0].Rows[i]["TABLE_NAME"];
        con.Open();
        SqlCommand cmdDelete = new SqlCommand(deleteQuery, con);
        cmdDelete.ExecuteNonQuery();
        con.Close();

        DataSet dataSet = new DataSet();
        SqlConnection conn = new SqlConnection(strcon);
        conn.Open();
        string selectData = "select * from " + Session["SchemaName"].ToString() + "." + ds.Tables[0].Rows[i]["TABLE_NAME"];
        SqlCommand command = new SqlCommand(selectData, conn);
        DataTable dataTable = new DataTable();
        SqlDataAdapter dataAdapter = new SqlDataAdapter(selectData, conn);
        dataAdapter.FillSchema(dataSet, SchemaType.Mapped);
        dataAdapter.Fill(dataSet);
        dataTable = dataSet.Tables[0];
        conn.Close();

        if (dataSet.Tables[0].Rows.Count > 0)
        {
            //Connect to second Database and Insert row/rows.
            SqlConnection conn2 = new SqlConnection(connectionString);
            conn2.Open();
            SqlBulkCopy bulkCopy = new SqlBulkCopy(conn2);
            bulkCopy.DestinationTableName = Session["SchemaName"].ToString() + "." + ds.Tables[0].Rows[i]["TABLE_NAME"].ToString();
            bulkCopy.WriteToServer(dataTable);
            conn2.Close();
        }
    }
}
As I run the above code, after inserting data into fewer than 10 tables it throws an out-of-memory exception and the program crashes.
How do I handle this? I tried increasing the memory allocated to SQL Server, but I still get the same error.
Is there any other way to achieve this task?
What you are doing is very far from the best solution. You are using an ASP.NET MVC process to pull all the data of your entire database into memory and then output it to another database. If your database is anything more than small and trivial, that will most definitely fill your process's allotted memory.
This type of task should never be done through the memory of a process, but rather using some form of backup/restore pattern.
You should look into SSIS projects and create an extract, transform, and load (ETL) solution, which can be triggered from your ASP.NET MVC solution asynchronously.
An SSIS solution can be triggered from C# code in this way:
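// Assumes a reference to the SSIS runtime: the Application and Package types
// live in the Microsoft.SqlServer.Dts.Runtime namespace (Microsoft.SqlServer.ManagedDTS assembly).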
var app = new Application();
var package = app.LoadPackage("compiled-package.dtsx", null);
var results = package.Execute();
See this question for a little more information (not specifically about duplicating databases, but has information about triggering SSIS packages from code): How to execute an SSIS package from .NET?
Alternatively
You also have the option of running a query against both databases at once; however, this requires some additional plumbing. The user account of your ASP.NET MVC solution needs to have access to both databases. If your databases are hosted on different servers, you also need to link one server to the other: Create linked servers
To perform an insert directly from the output of a select, consider this:
string source = "NAME_OF_SOURCE_DATABASE";
string target = "NAME_OF_TARGET_DATABASE";
string schema = Session["SchemaName"].ToString();
string table = ds.Tables[0].Rows[i]["TABLE_NAME"].ToString();

// Uncomment this if you need to deal with autoincrement columns
/*string idInsQuery = $"SET IDENTITY_INSERT {target}.{schema}.{table} ON";
var idInsCommand = new SqlCommand(idInsQuery, conn);
idInsCommand.ExecuteNonQuery();*/

string insQuery = $"INSERT INTO {target}.{schema}.{table} SELECT * FROM {source}.{schema}.{table}";
var insCommand = new SqlCommand(insQuery, conn);
insCommand.ExecuteNonQuery();

// Uncomment this if you need to deal with autoincrement columns
/*string idInsQuery2 = $"SET IDENTITY_INSERT {target}.{schema}.{table} OFF";
var idInsCommand2 = new SqlCommand(idInsQuery2, conn);
idInsCommand2.ExecuteNonQuery();*/
This will only work if the table structures are identical. There might also be problems with autoincrement IDs or columns with default values.
This will copy data from a table in database 1 to a table in database 2:
INSERT INTO db2.dbo.table2 (col1, col2)
SELECT col1, col2 FROM db1.dbo.table1
Run this SQL statement and the data will be copied without a round trip to your app.
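If you want to fire that statement from your C# code instead, a minimal sketch (connection string and names are placeholders):
using (var conn = new SqlConnection(connectionString))
using (var cmd = new SqlCommand(
    "INSERT INTO db2.dbo.table2 (col1, col2) " +
    "SELECT col1, col2 FROM db1.dbo.table1", conn))
{
    conn.Open();
    int copied = cmd.ExecuteNonQuery(); //rows copied server-side, nothing buffered in the app
}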
Let me know if you find my approach useful.
First of all, why write a whole application to do this job when SQL Server has built-in ways to do it?
My approach would be to configure a linked server and specify which tables you want to copy and which you don't.
https://learn.microsoft.com/en-us/sql/relational-databases/linked-servers/create-linked-servers-sql-server-database-engine
Secondly, you can write a simple stored procedure and schedule it in SQL Server to push data into the other server's database on whatever schedule you like. This way you can control it in any number of ways, such as table-level or business-level dependencies.
To do this in T-SQL, you can use the following system stored procedures to schedule a daily job. This example schedules it daily at 1:00 AM. See the Microsoft documentation for details on the syntax of the individual stored procedures and the valid parameter ranges.
DECLARE @job_name NVARCHAR(128), @description NVARCHAR(512), @owner_login_name NVARCHAR(128), @database_name NVARCHAR(128);

SET @job_name = N'Some Title';
SET @description = N'Periodically do something';
SET @owner_login_name = N'login';
SET @database_name = N'Database_Name';

-- Delete job if it already exists:
IF EXISTS (SELECT job_id FROM msdb.dbo.sysjobs WHERE (name = @job_name))
BEGIN
    EXEC msdb.dbo.sp_delete_job
        @job_name = @job_name;
END

-- Create the job:
EXEC msdb.dbo.sp_add_job
    @job_name=@job_name,
    @enabled=1,
    @notify_level_eventlog=0,
    @notify_level_email=2,
    @notify_level_netsend=2,
    @notify_level_page=2,
    @delete_level=0,
    @description=@description,
    @category_name=N'[Uncategorized (Local)]',
    @owner_login_name=@owner_login_name;

-- Add server:
EXEC msdb.dbo.sp_add_jobserver @job_name=@job_name;

-- Add step to execute SQL:
EXEC msdb.dbo.sp_add_jobstep
    @job_name=@job_name,
    @step_name=N'Execute SQL',
    @step_id=1,
    @cmdexec_success_code=0,
    @on_success_action=1,
    @on_fail_action=2,
    @retry_attempts=0,
    @retry_interval=0,
    @os_run_priority=0,
    @subsystem=N'TSQL',
    @command=N'EXEC my_stored_procedure; -- OR ANY SQL STATEMENT',
    @database_name=@database_name,
    @flags=0;

-- Update job to set start step:
EXEC msdb.dbo.sp_update_job
    @job_name=@job_name,
    @enabled=1,
    @start_step_id=1,
    @notify_level_eventlog=0,
    @notify_level_email=2,
    @notify_level_netsend=2,
    @notify_level_page=2,
    @delete_level=0,
    @description=@description,
    @category_name=N'[Uncategorized (Local)]',
    @owner_login_name=@owner_login_name,
    @notify_email_operator_name=N'',
    @notify_netsend_operator_name=N'',
    @notify_page_operator_name=N'';

-- Schedule job:
EXEC msdb.dbo.sp_add_jobschedule
    @job_name=@job_name,
    @name=N'Daily',
    @enabled=1,
    @freq_type=4,
    @freq_interval=1,
    @freq_subday_type=1,
    @freq_subday_interval=0,
    @freq_relative_interval=0,
    @freq_recurrence_factor=1,
    @active_start_date=20170101, --YYYYMMDD
    @active_end_date=99991231, --YYYYMMDD (this represents no end date)
    @active_start_time=010000, --HHMMSS
    @active_end_time=235959; --HHMMSS
Let me know in case you need more details on this.
Thanks,
Ayan

How can I insert records in bulk in MySQL?

I have to copy more than 100k records into the same table, and after every row inserted into the first table, a second table must be updated with the first table's insert ID (primary key).
I tried a bulk insert, but then I would not get all the inserted IDs that need to go into the second table.
I am using MySQL 5.5. When I run the following code, I get the following random errors:
Lost connection to MySQL server during query.
The transaction associated with the current connection has completed
but has not been disposed. The transaction must be disposed before
the connection can be used to execute SQL statements.
Net packets out of order: received[x], expected[y].
How can I insert these records optimally?
Code:
foreach (var item in transactions)
{
    int transactionId;
    using (MySqlCommand cm = DM.ConnectionManager.Conn.CreateCommand())
    {
        cm.CommandType = System.Data.CommandType.Text;
        var commandText = @"INSERT INTO FirstTable SET
            column1=@column1;";
        cm.CommandText = commandText;
        cm.Parameters.AddWithValue("@column1", item.column1);
        cm.ExecuteNonQuery();
        transactionId = (int)cm.LastInsertedId;
    }
    foreach (var detail in item.TransactionDetails)
    {
        using (MySqlCommand cm = DM.ConnectionManager.Conn.CreateCommand())
        {
            cm.CommandType = System.Data.CommandType.Text;
            var commandText = @"INSERT INTO SecondTable SET
                column1=@column1,
                column2=@column2;";
            cm.CommandText = commandText;
            cm.Parameters.AddWithValue("@column1", detail.column1);
            cm.Parameters.AddWithValue("@column2", detail.column2);
            cm.ExecuteNonQuery();
        }
    }
}
The fastest way to import into MySQL is LOAD DATA INFILE, so I would suggest making a CSV file and then running the following as a SQL statement.
Please note I don't have a full C# setup and haven't tested this... but this is how I back up / restore MySQL when I want it done fast, so I'm assuming the following can be assigned to commandText and run after the CSV file has been created and written to disk.
LOAD DATA LOCAL INFILE 'import.csv' INTO TABLE MyTable
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
(Col1,Col2,Col3);
From https://dev.mysql.com/doc/refman/5.7/en/load-data.html "The LOAD DATA INFILE statement reads rows from a text file into a table at a very high speed..."
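A rough, untested sketch of the whole round trip from C# (file path, connection string, and table/column names are placeholders; newer versions of Connector/NET may also require AllowLoadLocalInfile=true in the connection string):
// Write the rows to a CSV file, then bulk-load it in one statement.
string csvPath = Path.Combine(Path.GetTempPath(), "import.csv");
using (var writer = new StreamWriter(csvPath))
{
    writer.WriteLine("\"a\",\"b\",\"c\""); //one quoted, comma-separated line per row
}
using (var conn = new MySqlConnection(connectionString))
using (var cmd = conn.CreateCommand())
{
    conn.Open();
    cmd.CommandText =
        "LOAD DATA LOCAL INFILE '" + csvPath.Replace('\\', '/') + "' " +
        "INTO TABLE MyTable " +
        "FIELDS TERMINATED BY ',' ENCLOSED BY '\"' " +
        "LINES TERMINATED BY '\\n' " +
        "(Col1,Col2,Col3);";
    Console.WriteLine("{0} rows loaded", cmd.ExecuteNonQuery());
}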

Reducing insert statement time when dealing with large amounts of data

I read about SqlBulkCopy and the way it can reduce the time taken to insert a large number of rows. My scenario is: I have an Excel file which I convert into a DataTable, then I send this DataTable to a stored procedure (whose code I can't change) that inserts all the rows in the DataTable into a SQL table in the database.
The problem is that I have around 10,000 to 50,000 rows to insert. Is there any workaround to reduce the time taken by the stored procedure?
The best way to do this would be to use SqlBulkCopy to add the data to a temporary table and then feed that data into the stored proc. You will need to write some SQL code to do the processing, but the performance benefits of doing it this way should be worth the effort.
If you create a new stored proc, you get the added benefit of running all of this code inside the database engine, so you will not be switching back and forth between your application and the DB engine.
Some Code:
var importData = new DataSet();
xmlData.Position = 0;
importData.ReadXml(xmlData);

using (var connection = new SqlConnection(myConnectionString))
{
    connection.Open();
    using (var trans = connection.BeginTransaction())
    {
        using (var sbc = new SqlBulkCopy(connection, SqlBulkCopyOptions.Default, trans) { DestinationTableName = myTableName })
        {
            foreach (DataColumn col in importData.Tables[0].Columns)
            {
                sbc.ColumnMappings.Add(col.ColumnName, col.ColumnName);
            }
            sbc.WriteToServer(importData.Tables[0]); //table 0 is the main table in this dataset

            // Now let's call the stored proc.
            var cmd = new SqlCommand("ProcessDataImport", connection)
            {
                CommandType = CommandType.StoredProcedure,
                Transaction = trans //the command must be enlisted in the pending transaction
            };
            cmd.CommandTimeout = 1200;
            cmd.ExecuteNonQuery();
            trans.Commit();
        }
        connection.Close();
        return null;
    }
}
Here xmlData is a stream with the XML data matching your bulk import, and myTableName contains the table you want to import into. Remember, when doing a bulk copy, the column names must match 100%. Case is important too.
The proc would look something like this:
CREATE PROCEDURE [ProcessDataImport]
AS
BEGIN
    DECLARE @IMPORTCOL INT

    WHILE EXISTS (SELECT 1 FROM TEMPTABLE)
    BEGIN
        SELECT @IMPORTCOL = (SELECT TOP 1 COLUMN1 FROM TEMPTABLE)
        EXEC DOTHEIMPORT @IMPORTCOL
        DELETE FROM TEMPTABLE WHERE COLUMN1 = @IMPORTCOL
    END
END

Basic start with Visual Studio C# and SQL Compact (connect, select, insert)?

I'm trying to learn C# with SQL CE so that my program can remember stuff.
I have created a database and can connect to it:
SqlCeConnection conn =
    new SqlCeConnection(@"Data Source=|DataDirectory|\dbJournal.sdf");
conn.Open();
And it connects fine, I guess, because if I rename dbJournal.sdf to something wrong, it doesn't run right.
Let's say I want to make a simple SELECT query:
(SELECT * FROM tblJournal)
How is that done?
What about a simple insert?
(INSERT INTO tblJournal (column1, column2, column3) VALUES
(value1, value2, value3))
I'm used to PHP and MySQL (as you can probably see :o))
@Chuck mentions Entity Framework, which simplifies things and does all the work of writing the SQL for you.
But there is a basic ADO.NET approach, which I will describe below.
The classes follow a standard pattern, so to insert into or read from SQL Server or other databases there are exact replica classes like SqlConnection, or OleDbConnection and OleDbCommand, etc.
This is the most barebones ADO.NET approach:
using (SqlCeConnection conn =
    new SqlCeConnection(@"Data Source=|DataDirectory|\dbJournal.sdf"))
using (SqlCeCommand cmd = conn.CreateCommand())
{
    conn.Open();
    //commands represent a query or a stored procedure
    cmd.CommandText = "SELECT * FROM tblJournal";
    using (SqlCeDataReader rd = cmd.ExecuteReader())
    {
        //...read
    }
    conn.Close();
}
Then, to read the data:
while (rd.Read())
{
    //loop through the records one by one
    rd.GetInt32(0);  //gets the first column's data for this record, as an int
    rd.GetString(1); //gets the second column, as a string
}
A nicer and quicker way to read data is like this:
using (SqlCeDataAdapter adap =
    new SqlCeDataAdapter("SELECT * FROM tblJournal", "your connection"))
{
    //the adapter will open and close the connection for you.
    DataTable dat = new DataTable();
    adap.Fill(dat);
}
This gets the entire data in one shot into a DataTable class.
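Once filled, you can loop over the DataTable's rows and access fields by column name; a quick illustrative sketch (the column name is assumed):
foreach (DataRow row in dat.Rows)
{
    Console.WriteLine(row["column1"]); //access a field by column name (or ordinal)
}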
To insert data:
SqlCeCommand cmdInsert = conn.CreateCommand();
cmdInsert.CommandText = "INSERT INTO tblJournal (column1, column2, column3) " +
                        "VALUES (value1, value2, value3)";
cmdInsert.ExecuteNonQuery();
If you are just starting to learn, I suggest you use LINQ to write these queries.
Here is an MSDN article showing the features of LINQ:
http://msdn.microsoft.com/en-us/library/bb425822.aspx
Using LINQ, every query becomes simple. For example, you can write your select query like this:
from journal in TblJournal select journal
or just
context.TblJournal
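For instance, a rough sketch of running that query, assuming a LINQ to SQL DataContext has been generated for the database (e.g. with SqlMetal); the context and column names here are hypothetical:
using (var context = new JournalDataContext(@"Data Source=|DataDirectory|\dbJournal.sdf"))
{
    var entries = from journal in context.TblJournal
                  select journal;
    foreach (var entry in entries)
    {
        Console.WriteLine(entry.Column1); //Column1 is illustrative
    }
}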
Also, to improve performance, you are better off keeping the connection open the whole time when working with SQL CE (as opposed to other standard SQL databases).
