I'm iterating over a (binary) file and inserting some values into a table.
After receiving the values I insert them row by row into the database.
Part of my code (example values):
string sColumn1 = Getcolumn1Value();
string sColumn2 = Getcolumn2Value();
string sqlIns = "INSERT INTO table (column1, column2) " +
                "VALUES (@column_1, @column_2)";
SqlCommand cmdInsPacket = new SqlCommand(sqlIns, connection);
cmdInsPacket.Parameters.AddWithValue("@column_1", sColumn1);
cmdInsPacket.Parameters.AddWithValue("@column_2", sColumn2);
cmdInsPacket.ExecuteNonQuery();
cmdInsPacket.Parameters.Clear();
Because there are millions of records, I was wondering if it's better to insert them in groups of, for example, 500 instead of one by one.
Any good suggestions to fix this (and is my suggestion of a grouped insert useful)?
(MS SQL Server 2008 R2 and C#)
You can use SqlBulkCopy to insert multiple records. You can put the records in a DataTable and save them to the database in a single call.
using (SqlBulkCopy bulkCopy = new SqlBulkCopy(connection))
{
    bulkCopy.DestinationTableName = "dbo.BulkCopyDemoMatchingColumns";
    try
    {
        bulkCopy.WriteToServer(newProducts);
    }
    catch (Exception ex)
    {
        Console.WriteLine(ex.Message);
    }
}
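The newProducts variable comes from the MSDN sample; in the asker's case it would be the buffered rows parsed from the file. A minimal sketch of building it, assuming the two columns from the question and a hypothetical MoreRecordsInFile() loop condition:

// Buffer the parsed rows in a DataTable before a single bulk copy.
// Column names are assumed to match the destination table's columns.
DataTable newProducts = new DataTable();
newProducts.Columns.Add("column1", typeof(string));
newProducts.Columns.Add("column2", typeof(string));

// Hypothetical loop standing in for the asker's file iteration.
while (MoreRecordsInFile())
{
    newProducts.Rows.Add(Getcolumn1Value(), Getcolumn2Value());
}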
If you have the data in DataTable format, use this simple function:
public static void InsertDataIntoSQLServerUsingSQLBulkCopy(DataTable bulkData)
{
    using (SqlConnection dbConnection = new SqlConnection("Data Source=ProductHost;Initial Catalog=yourDB;Integrated Security=SSPI;"))
    {
        dbConnection.Open();
        using (SqlBulkCopy s = new SqlBulkCopy(dbConnection))
        {
            s.DestinationTableName = "Your table name";
            foreach (DataColumn column in bulkData.Columns)
                s.ColumnMappings.Add(column.ColumnName, column.ColumnName);
            s.WriteToServer(bulkData);
        }
    }
}
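For the asker's scenario, the call would then be a single line (using the newProducts table sketched above):

InsertDataIntoSQLServerUsingSQLBulkCopy(newProducts);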
I have a Xamarin app that connects to a SQL Server (*), and I want to insert the data from the SQL Server's tables into a local SQLite database. My approach is to execute a query like
select * from Table1
retrieve the results of the query as a List, then insert that List into my SQLite table. That said, I'm very new to using SqlClient, so please share if there are better ways to do this. Thanks.
Edit: the smallest number of columns these tables have is 5. I don't know if that disqualifies Lists as the best option.
My code:
private void LoadData()
{
string cs = @"connection string here";
using (SqlConnection sconn = new SqlConnection(cs))
{
sconn.Open();
SqlDataReader reader = null;
SqlCommand aml = new SqlCommand("select * from Table1", sconn);
reader = aml.ExecuteReader();
while (reader.Read())
{
// get result of query as List somehow?
}
using (SQLiteConnection conn = new SQLiteConnection(App.DatabaseLocation))
{
conn.CreateTable<Table1>();
if (conn.Query<Table1>("select * from Table1").Count() <= 0)
{
// insert the list object
}
}
}
}
(*) The app does not use a web service as the app is intended for onsite use only and will not be distributed publicly.
An easier alternative is to use an ORM like Dapper.
With Dapper, all you need to do is:
using (var connection = new SqlConnection(_sqlConnectionString))
{
var results = connection.Query<YourTableModel>(query).ToList();
return results;
}
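Here YourTableModel and query are placeholders. As a sketch of the rest, assuming the model is the question's Table1 used with sqlite-net, with results coming from Query<Table1>(query).ToList() above (the property names are pure assumptions):

// Hypothetical model matching Table1's columns (property names are assumptions).
public class Table1
{
    public int Id { get; set; }
    public string Name { get; set; }
}

// Insert the Dapper results into the local SQLite database.
using (var conn = new SQLiteConnection(App.DatabaseLocation))
{
    conn.CreateTable<Table1>();
    conn.InsertAll(results); // sqlite-net wraps the list in a single transaction
}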
You can get the data from SQL Server as a DataTable or convert it to a list as you prefer.
public DataTable GetDataTable(string connectionString, string tableName)
{
    using (SqlConnection conn = new SqlConnection(connectionString))
    {
        conn.Open();
        string query = $"SELECT * FROM [{tableName}]";
        SqlCommand cmd = new SqlCommand(query, conn);
        DataTable t1 = new DataTable();
        using (SqlDataAdapter a = new SqlDataAdapter(cmd))
        {
            a.Fill(t1);
        }
        return t1;
    }
}
Then use this table or list returned from the above method to insert in the SQLite table.
string cs = #"Data Source=datasource;Initial Catalog=databasename;User ID=user;Password=password";
DataTable table = GetDataTable(cs, "Table1");
using (SQLiteConnection conn = new SQLiteConnection(App.DatabaseLocation))
{
conn.CreateTable<Table1>();
if (conn.Query<Table1>("select * from Table1").Count() <= 0)
{
foreach(DataRow row in table.Rows)
{
//Access values of each row column row["columnName"]
// insert the list object
}
}
}
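To fill in the commented loop body above, a minimal sketch, assuming a sqlite-net Table1 model whose properties match the column names (both assumptions):

foreach (DataRow row in table.Rows)
{
    // Map the row's columns onto the model and insert it (column names assumed).
    conn.Insert(new Table1
    {
        Id = Convert.ToInt32(row["Id"]),
        Name = row["Name"].ToString()
    });
}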
Refer to this one:
Inserting Data from SQL Server to Sqlite
I am migrating my program from Microsoft SQL Server to MySQL. Everything works well except one issue with bulk copy.
In the solution with MS SQL the code looks like this:
connection.Open();
SqlBulkCopy bulkCopy = new SqlBulkCopy(connection);
bulkCopy.DestinationTableName = "testTable";
bulkCopy.WriteToServer(rawData);
Now I try to do something similar for MySQL. Because I think there would be bad performance, I don't want to write the DataTable to a CSV file and do the insert from there with the MySqlBulkLoader class.
Any help would be highly appreciated.
Because I think there would be bad performance, I don't want to write the DataTable to a CSV file and do the insert from there with the MySqlBulkLoader class.
Don't rule out a possible solution based on unfounded assumptions. I just tested the insertion of 100,000 rows from a System.Data.DataTable into a MySQL table using a standard MySqlDataAdapter#Update() inside a Transaction. It consistently took about 30 seconds to run:
using (MySqlTransaction tran = conn.BeginTransaction(System.Data.IsolationLevel.Serializable))
{
    using (MySqlCommand cmd = new MySqlCommand())
    {
        cmd.Connection = conn;
        cmd.Transaction = tran;
        cmd.CommandText = "SELECT * FROM testtable";
        using (MySqlDataAdapter da = new MySqlDataAdapter(cmd))
        {
            da.UpdateBatchSize = 1000;
            using (MySqlCommandBuilder cb = new MySqlCommandBuilder(da))
            {
                da.Update(rawData);
                tran.Commit();
            }
        }
    }
}
(I tried a couple of different values for UpdateBatchSize but they didn't seem to have a significant impact on the elapsed time.)
By contrast, the following code using MySqlBulkLoader took only 5 or 6 seconds to run ...
string tempCsvFileSpec = @"C:\Users\Gord\Desktop\dump.csv";
using (StreamWriter writer = new StreamWriter(tempCsvFileSpec))
{
Rfc4180Writer.WriteDataTable(rawData, writer, false);
}
var msbl = new MySqlBulkLoader(conn);
msbl.TableName = "testtable";
msbl.FileName = tempCsvFileSpec;
msbl.FieldTerminator = ",";
msbl.FieldQuotationCharacter = '"';
msbl.Load();
System.IO.File.Delete(tempCsvFileSpec);
... including the time to dump the 100,000 rows from the DataTable to a temporary CSV file (using code similar to this), bulk-loading from that file, and deleting the file afterwards.
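The Rfc4180Writer helper is only linked, not shown. As a rough sketch of what such a DataTable-to-CSV dump might look like (this is an assumption, not the code the answer actually used; it assumes using System.Data, System.IO, and System.Linq):

public static class Rfc4180Writer
{
    public static void WriteDataTable(DataTable table, TextWriter writer, bool includeHeaders)
    {
        if (includeHeaders)
        {
            var headers = table.Columns.Cast<DataColumn>().Select(c => Quote(c.ColumnName));
            writer.WriteLine(string.Join(",", headers));
        }
        foreach (DataRow row in table.Rows)
        {
            writer.WriteLine(string.Join(",", row.ItemArray.Select(v => Quote(Convert.ToString(v)))));
        }
    }

    // Quote each field and double any embedded quotes, per RFC 4180.
    private static string Quote(string value)
    {
        return "\"" + (value ?? string.Empty).Replace("\"", "\"\"") + "\"";
    }
}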
Similar to SqlBulkCopy, we have MySqlBulkCopy for MySQL.
Here is an example of how to use it:
public async Task<bool> MySqlBulkCopyAsync(DataTable dataTable)
{
    using (var connection = new MySqlConnector.MySqlConnection(_connString + ";AllowLoadLocalInfile=True"))
    {
        await connection.OpenAsync();
        var bulkCopy = new MySqlBulkCopy(connection);
        bulkCopy.DestinationTableName = "yourtable";
        // The column mapping is required if you have an identity column in the table.
        bulkCopy.ColumnMappings.AddRange(GetMySqlColumnMapping(dataTable));
        await bulkCopy.WriteToServerAsync(dataTable);
        return true;
    }
}
private List<MySqlBulkCopyColumnMapping> GetMySqlColumnMapping(DataTable dataTable)
{
    List<MySqlBulkCopyColumnMapping> colMappings = new List<MySqlBulkCopyColumnMapping>();
    int i = 0;
    foreach (DataColumn col in dataTable.Columns)
    {
        colMappings.Add(new MySqlBulkCopyColumnMapping(i, col.ColumnName));
        i++;
    }
    return colMappings;
}
You can omit the column mapping if you don't have an identity column in your table.
If you do have an identity column, you have to use the column mapping; otherwise it won't insert any records in the table.
It will just give a message like "x rows were copied but only 0 rows were inserted".
This class is available in the following library:
Assembly MySqlConnector, Version=1.0.0.0
Using the Z.BulkOperations NuGet package, you can easily get this done.
Here is an example using the package from https://www.nuget.org/packages/Z.BulkOperations/2.14.3/:
MySqlConnection conn = DbConnection.OpenConnection();
DataTable dt = new DataTable("testtable");
MySqlDataAdapter da = new MySqlDataAdapter("SELECT * FROM testtable", conn);
MySqlCommandBuilder cb = new MySqlCommandBuilder(da);
da.Fill(dt);
Instead of using

......
da.UpdateBatchSize = 1000;
......
da.Update(dt);

just the following two lines

var bulk = new BulkOperation(conn);
bulk.BulkInsert(dt);

will take only 5 seconds to copy the whole DataTable into MySQL, without first dumping the 100,000 rows from the DataTable to a temporary CSV file.
I'm trying to use bulkCopy.WriteToServer() to copy data from my remote SQL Server Express to my local SQL Server Express, but no data is ever written to my local SQL Server Express database.
The code jumps over the WriteToServer() method instantly... I have no idea if it's failing internally and not showing an error message, though.
I have read How to duplicate a SQL Server 2000 table programatically using .NET 2.0? and I am using very similar code, although the remote is SQL Server 2008 Express and the local is SQL Server 2014 Express:
using (SqlConnection remoteConnection = new SqlConnection(remoteConnectionString))
{
var query = "SELECT * FROM information_schema.tables WHERE table_type = 'base table'";
SqlCommand commandGetTables = new SqlCommand(query, remoteConnection);
try
{
remoteConnection.Open();
SqlDataReader results = commandGetTables.ExecuteReader();
while (results.Read())
{
tables.Add(results.GetString(2));
}
results.Close();
}
catch (Exception ex)
{
//stuff
}
finally
{
remoteConnection.Close();
}
remoteConnection.Open();
foreach (var table in tables)
{
// Get data from the source table as a SqlDataReader.
var commandSourceData = new SqlCommand("SELECT * FROM " + table + ";", remoteConnection);
var reader = commandSourceData.ExecuteReader();
using (SqlConnection destinationConnection = new SqlConnection(destinationConnectionString))
{
destinationConnection.Open();
using (var bulkCopy = new SqlBulkCopy(destinationConnection))
{
bulkCopy.DestinationTableName = table;
try
{
// Write from the source to the destination.
bulkCopy.WriteToServer(reader);
}
catch (Exception ex)
{
//stuff
}
finally
{
//stuff removed for this post
}
}
}
}
remoteConnection.Close();
}
return true;
I know this could be subject to SQL injection etc, but this app is only used by me and not the issue here.
What am I doing wrong?
Edit
I checked the value of reader (var reader = commandSourceData.ExecuteReader();) and it has entries as I would expect, meaning the reading from the remote is fine.
bulkCopy.DestinationTableName = table;
bulkCopy.WriteToServer(reader);
These lines are wrong; it's supposed to look like this:
bulkCopy.DestinationTableName = "dbo." + dataTable.TableName;
bulkCopy.WriteToServer(dataTable);
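The answer doesn't show where dataTable comes from. A minimal sketch of the full sequence it implies, loading each remote reader into a DataTable before the bulk copy (the variable name is an assumption):

// Load the remote rows into a DataTable first, then bulk copy that table.
var dataTable = new DataTable(table);
using (var reader = commandSourceData.ExecuteReader())
{
    dataTable.Load(reader);
}
bulkCopy.DestinationTableName = "dbo." + dataTable.TableName;
bulkCopy.WriteToServer(dataTable);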
I currently use a method in C# to read a table from a SQLite database into a DataTable,
but I want to send all the tables to another object.
So I think I have to use a DataSet to combine all the DataTables
and send it to the object as a parameter.
Is there an easy way to read all the tables from a SQLite database into a DataSet?
Or do I have to read each table from the SQLite database into its own DataTable
and combine them into a DataSet by hand?
The SQL for listing all the tables is:
SELECT name FROM sqlite_master WHERE type = 'table' ORDER BY 1
You could then get all the tables as DataTables separately and then add them into a DataSet - an example here: http://www.dotnetperls.com/dataset
So I guess the code would be something like:
DataSet d = new DataSet();
foreach (string tableName in GetTables())
{
    DataTable table = GetDataTable("select * from " + tableName);
    table.TableName = tableName;
    d.Tables.Add(table);
}
Code for GetTables and GetDataTable (I'll leave piecing them together to you):
public ArrayList GetTables()
{
ArrayList list = new ArrayList();
// executes query that select names of all tables in master table of the database
String query = "SELECT name FROM sqlite_master " +
"WHERE type = 'table'" +
"ORDER BY 1";
try
{
DataTable table = GetDataTable(query);
// Return all table names in the ArrayList
foreach (DataRow row in table.Rows)
{
list.Add(row.ItemArray[0].ToString());
}
}
catch (Exception e)
{
Console.WriteLine(e.Message);
}
return list;
}
public DataTable GetDataTable(string sql)
{
try
{
DataTable dt = new DataTable();
using (var c = new SQLiteConnection(dbConnection))
{
c.Open();
using (SQLiteCommand cmd = new SQLiteCommand(sql, c))
{
using (SQLiteDataReader rdr = cmd.ExecuteReader())
{
dt.Load(rdr);
return dt;
}
}
}
}
catch (Exception e)
{
Console.WriteLine(e.Message);
return null;
}
}
I'm looking at the example here:
http://msdn.microsoft.com/en-US/library/y06xa2h1(v=vs.80).aspx
string s = "primaryKeyValue";
DataRow foundRow = dataSet1.Tables["AnyTable"].Rows.Find(s);
if (foundRow != null)
{
MessageBox.Show(foundRow[1].ToString());
}
else
{
MessageBox.Show("A row with the primary key of " + s + " could not be found");
}
They don't specify where dataSet1 comes from - does it represent some database?
I'm trying to use this example in my code to find unique rows, but I can't seem to implement this syntax. I'm only using a connection string to open a connection to SQL Server, and I use a SqlDataAdapter to perform functions...
EDIT:
SqlConnection myConnection = new SqlConnection("Data Source=server; Initial Catalog=Dashboard; Integrated Security=SSPI; Persist Security Info=false; Trusted_Connection=Yes");
SqlDataAdapter da = new SqlDataAdapter();
try
{
//Opens the connection to the specified database
myConnection.Open();
//Specifies where the Table in the database where the data will be entered and the columns used
da.InsertCommand = new SqlCommand("INSERT INTO DashboardLibAnswer(Id,Date,Time,Question,Details,Answer,Notes,EnteredBy,WhereReceived,QuestionType,AnswerMethod,TransactionDuration)"
+ "VALUES(#Id,#Date,#Time,#Question,#Details,#Answer,#Notes,#EnteredBy,#WhereReceived,#QuestionType,#AnswerMethod,#TransactionDuration)", myConnection);
//Specifies the columns and their variable type where the data will be entered
//Special note: Conversion from String > DateTime will cause exceptions that will only import some part of data and not everything
da.InsertCommand.Parameters.Add("#Id", SqlDbType.NVarChar);
da.InsertCommand.Parameters.Add("#Date", SqlDbType.Text);
da.InsertCommand.Parameters.Add("#Time", SqlDbType.Text);
da.InsertCommand.Parameters.Add("#Question", SqlDbType.Text);
da.InsertCommand.Parameters.Add("#Details", SqlDbType.Text);
da.InsertCommand.Parameters.Add("#Answer", SqlDbType.Text);
da.InsertCommand.Parameters.Add("#Notes", SqlDbType.Text);
da.InsertCommand.Parameters.Add("#EnteredBy", SqlDbType.NVarChar);
da.InsertCommand.Parameters.Add("#WhereReceived", SqlDbType.NVarChar);
da.InsertCommand.Parameters.Add("#QuestionType", SqlDbType.NVarChar);
da.InsertCommand.Parameters.Add("#AnswerMethod", SqlDbType.NVarChar);
da.InsertCommand.Parameters.Add("#TransactionDuration", SqlDbType.NVarChar);
//Using the global variable counter this loop will go through each valid entry and insert it into the specifed database/table
for (int i = 0; i < counter; i++)
{
//Iterates through the collection array starting at first index and going through until the end
//and inserting each element into our SQL Table
DataSet dashboardDS = new DataSet();
da.Fill(dashboardDS, "DashboardLibAnswer");
DataTable dt = dashboardDS.Tables["DashboardLibAnswer"];
foreach (DataColumn col in dt.Columns)
{
if (col.Unique)
{
da.InsertCommand.Parameters["#Id"].Value = collection.getIdItems(i);
da.InsertCommand.Parameters["#Date"].Value = collection.getDateItems(i);
da.InsertCommand.Parameters["#Time"].Value = collection.getTimeItems(i);
da.InsertCommand.Parameters["#Question"].Value = collection.getQuestionItems(i);
da.InsertCommand.Parameters["#Details"].Value = collection.getDetailsItems(i);
da.InsertCommand.Parameters["#Answer"].Value = collection.getAnswerItems(i);
da.InsertCommand.Parameters["#Notes"].Value = collection.getNotesItems(i);
da.InsertCommand.Parameters["#EnteredBy"].Value = collection.getEnteredByItems(i);
da.InsertCommand.Parameters["#WhereReceived"].Value = collection.getWhereItems(i);
da.InsertCommand.Parameters["#QuestionType"].Value = collection.getQuestionTypeItems(i);
da.InsertCommand.Parameters["#AnswerMethod"].Value = collection.getAnswerMethodItems(i);
da.InsertCommand.Parameters["#TransactionDuration"].Value = collection.getTransactionItems(i);
da.InsertCommand.ExecuteNonQuery();
}
}
//Updates the progress bar using the i in addition to 1
_worker.ReportProgress(i + 1);
} // end for
//Once the importing is done it will show the appropriate message
MessageBox.Show("Finished Importing");
} // end try
catch (Exception exceptionError)
{
//To show thrown exceptions, just uncomment the line below
//rtbOutput.AppendText(exceptionError.ToString());
} // end catch
//Closes the SQL connection after importing is done
myConnection.Close();
}
If you populate a DataSet from your data adapter, you'll be able to follow the same logic -
http://msdn.microsoft.com/en-us/library/bh8kx08z(v=vs.71).aspx
It might be worth showing what you actually have, to get more specific help.
EDIT
I think I'm understanding what you want - if you fill your DataTable from the already-populated table, just check that the item doesn't already exist before adding it - i.e.
if (dt.Rows.Find(collection.getIdItems(i)) == null)
{
// add your new row
}
(just to be sure I knocked together a quick test - hopefully this helps):
// MyContacts db has a table Person with primary key (ID) - 3 rows - IDs 4,5,6
SqlConnection myConnection = new SqlConnection("Data Source=.; Initial Catalog=MyContacts; Integrated Security=SSPI; Persist Security Info=false; Trusted_Connection=Yes");
SqlDataAdapter da = new SqlDataAdapter();
da.SelectCommand = new SqlCommand("select * from Person", myConnection);
myConnection.Open();
DataSet dashboardDS = new DataSet();
da.Fill(dashboardDS, "Person");
dashboardDS.Tables[0].PrimaryKey = new[] { dashboardDS.Tables[0].Columns["ID"]};
List<int> ids = new List<int> {4, 6, 7};
foreach (var id in ids)
{
if (dashboardDS.Tables[0].Rows.Find(id) == null)
{
Console.WriteLine("id not in database {0}", id); //i.e. 7
}
}
You will first need to open a connection to your database. This is an excellent source for connection strings: The Connection String Reference.
Then you will need to fill the dataset with data from some table. Since we are only interested in the schema information we are only selecting one row (SELECT TOP 1 ...).
Then we can go through the columns and check their Unique property (Boolean):
string connString =
"server=(local)\\SQLEXPRESS;database=MyDatabase;Integrated Security=SSPI";
string sql = #"SELECT TOP 1 * FROM AnyTable";
using (SqlConnection conn = new SqlConnection(connString)) {
conn.Open();
SqlDataAdapter da = new SqlDataAdapter(sql, conn);
using (DataSet ds = new DataSet()) {
da.Fill(ds, "AnyTable");
DataTable dt = ds.Tables["AnyTable"];
foreach (DataColumn col in dt.Columns) {
if (col.Unique) {
Console.WriteLine("Column {0} is unique.", col.ColumnName);
}
}
}
}
UPDATE #1
Sorry, I misunderstood your question. The above example returns unique columns, not unique rows. You can get unique (distinct) rows by using the DISTINCT keyword in SQL:
SELECT DISTINCT field1, field2, field3 FROM AnyTable
You can then fill the data table the same way as above.
Usually the word "unique" is used for unique constraints and unique indexes in database jargon. The term "distinct" is used for rows which are different.
UPDATE #2
Your updated question seems to suggest that you don't want to find unique rows, but that you want to insert unique rows (which is the exact opposite).
Usually you would select distinct items from a collection like this; however, it is difficult to answer your question accurately, since we don't know the type of your collection.
foreach (var item in collection.Distinct()) {
}
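Note that Distinct() only removes duplicates if the element type defines value equality. A minimal sketch with a hypothetical element type:

// Distinct() relies on Equals/GetHashCode, so override them
// (here: items with equal Ids count as duplicates).
// "Entry" is a hypothetical type, not one from the question.
public class Entry : IEquatable<Entry>
{
    public string Id { get; set; }

    public bool Equals(Entry other) { return other != null && Id == other.Id; }
    public override bool Equals(object obj) { return Equals(obj as Entry); }
    public override int GetHashCode() { return Id == null ? 0 : Id.GetHashCode(); }
}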
UPDATE #3
The easiest way to insert distinct values into the SQL Server table is to filter the rows at their origin, when reading them from the CSV file, even before splitting them.
string[] lines = File.ReadAllLines(@"C:\Data\MyData.csv");
string[][] splittedLines = lines
    .Distinct()
    .Select(s => s.Split(','))
    .ToArray();
Now you have distinct (unique) split lines that you can insert into the SQL Server table.
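From there, a minimal parameterized-insert sketch (the table and column names are placeholders, and conn is assumed to be an open SqlConnection):

using (var cmd = new SqlCommand(
    "INSERT INTO AnyTable (field1, field2) VALUES (@f1, @f2)", conn))
{
    cmd.Parameters.Add("@f1", SqlDbType.NVarChar);
    cmd.Parameters.Add("@f2", SqlDbType.NVarChar);
    foreach (string[] fields in splittedLines)
    {
        cmd.Parameters["@f1"].Value = fields[0];
        cmd.Parameters["@f2"].Value = fields[1];
        cmd.ExecuteNonQuery();
    }
}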