Import Text File data to PostgreSQL database using C#

private void GetTextFile()
{
    using (NpgsqlConnection npgsqlConnection = new NpgsqlConnection())
    {
        npgsqlConnection.ConnectionString = "Server=127.0.0.1;Port=5432;User Id=postgres;Password=rutuparna;Database=Employee";
        npgsqlConnection.Open();
        NpgsqlCommand command = new NpgsqlCommand("Select * from employee_details", npgsqlConnection);
        using (NpgsqlDataReader dataReader = command.ExecuteReader())
        using (System.IO.StreamWriter writer = new System.IO.StreamWriter(@"D:\Rutu\txtfile.txt", false, Encoding.UTF8))
        {
            while (dataReader.Read())
            {
                writer.WriteLine(dataReader[0] + ";" + dataReader[1] + ";" + dataReader[2] + ";" + dataReader[3]);
            }
        }
    }
    MessageBox.Show("Data fetched properly");
}
Here is what I have done to export the data to a text file.
Can somebody please give me the code for importing the text file data back into the SQL database using C#?

Why not something like this:
using (SqlConnection con = new SqlConnection(@"your connection string"))
{
    con.Open();
    using (StreamReader file = new StreamReader(@"D:\Rutu\txtfile.txt"))
    {
        string line;
        while ((line = file.ReadLine()) != null)
        {
            // The export above wrote the fields separated by ';', so split on that
            string[] fields = line.Split(';');
            using (SqlCommand cmd = new SqlCommand("INSERT INTO employee_details(column1, column2, column3, column4) VALUES (@value1, @value2, @value3, @value4)", con))
            {
                cmd.Parameters.AddWithValue("@value1", fields[0]);
                cmd.Parameters.AddWithValue("@value2", fields[1]);
                cmd.Parameters.AddWithValue("@value3", fields[2]);
                cmd.Parameters.AddWithValue("@value4", fields[3]);
                cmd.ExecuteNonQuery();
            }
        }
    }
}
I don't know the names of your columns, so you'll need to replace them.

If speed is a consideration (and IMHO it always should be) and you are using PostgreSQL (which you seem to be), then you should take a look at the COPY function. Single inserts are always slower than a bulk operation.
It is relatively easy to wrap the COPY function within C#. The following is a cut-down version of a method, to illustrate the point. My method loops through the rows of a DataTable, but it is easy to adapt for a file situation (ReadLine() etc.).
using (var pgConn = new NpgsqlConnection(myPgConnStr))
{
    pgConn.Open();
    using (var writer = pgConn.BeginBinaryImport("COPY " + destinationTableName + " (" + commaSepFieldNames + ") FROM STDIN (FORMAT BINARY)"))
    {
        //Loop through data
        for (int i = 0; i < endNo; i++)
        {
            writer.StartRow();
            //inner loop through fields
            for (int j = 0; j < fieldNo; j++)
            {
                //test for null
                if (value == null || value == DBNull.Value)
                {
                    writer.WriteNull();
                }
                else
                {
                    //Write data using column types
                    writer.Write(value, type);
                }
            }
        }
        writer.Complete();
    }
}
WriteNull() is a special method for correctly adding NULL to the stream; otherwise you use Write<T>(). The latter has three overloads: just the value, value with a type enumeration, or value with a type name. I prefer to use the enumeration. commaSepFieldNames is a comma-separated list of the field names. Other variables should (I hope) be self-explanatory.
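For the text-file case in the question, the same COPY approach might look something like this. This is a minimal sketch, assuming the file holds four semicolon-separated text columns and a matching employee_details table; the column names and types here are illustrative:
using (var pgConn = new NpgsqlConnection(myPgConnStr))
{
    pgConn.Open();
    using (var writer = pgConn.BeginBinaryImport("COPY employee_details (column1, column2, column3, column4) FROM STDIN (FORMAT BINARY)"))
    using (var file = new StreamReader(@"D:\Rutu\txtfile.txt"))
    {
        string line;
        while ((line = file.ReadLine()) != null)
        {
            string[] fields = line.Split(';');
            writer.StartRow();
            foreach (string field in fields)
            {
                //Assuming text columns here; use the enumeration overload for other types
                writer.Write(field, NpgsqlTypes.NpgsqlDbType.Text);
            }
        }
        writer.Complete();
    }
}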

Related

Export SQL Server Data into CSV file on desktop data contain commas

I am trying to export data into a CSV file from SQL Server. The code from this link (Export SQL Server Data into CSV file) works, with one exception: in rows that contain commas, the column arrangement is not correct. The code I have tried:
using (var connection = ConnectionToSqlServer.GetConnection())
{
    connection.Open();
    SqlCommand sqlCmd = new SqlCommand("Select * from dbo.Test", connection);
    SqlDataReader reader = sqlCmd.ExecuteReader();
    string fileName = "test.csv";
    StreamWriter sw = new StreamWriter(fileName);
    object[] output = new object[reader.FieldCount];
    for (int i = 0; i < reader.FieldCount; i++)
    {
        if (reader.GetName(i).Contains(","))
        {
            output[i] = "\"" + reader.GetName(i) + "\"";
        }
        else
        {
            output[i] = reader.GetName(i);
        }
    }
    sw.WriteLine(string.Join(",", output));
    while (reader.Read())
    {
        reader.GetValues(output);
        sw.WriteLine(string.Join(",", output));
    }
    sw.Close();
    reader.Close();
    connection.Close();
}
I am suggesting you consider the options below:
Quote the values, to get proper CSV generation. If the content has a , inside it, the generated CSV will otherwise be malformed.
while (reader.Read())
{
    reader.GetValues(output);
    // Quote each field individually; $"\"{output}\"" would quote the array's ToString(), not the values
    // (requires using System.Linq)
    sw.WriteLine(string.Join(",", output.Select(o => $"\"{o}\"")));
}
You can also think of using a library like CsvHelper.
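If you stay with hand-rolled CSV, the usual rule (RFC 4180) is to quote any field containing a comma, quote, or line break, and to double embedded quotes. A minimal sketch of such a helper; the Escape name is mine, not from the question:
static string Escape(object value)
{
    string s = value?.ToString() ?? "";
    // Quote the field if it contains a comma, quote, or line break; double any embedded quotes
    if (s.Contains(",") || s.Contains("\"") || s.Contains("\n") || s.Contains("\r"))
    {
        return "\"" + s.Replace("\"", "\"\"") + "\"";
    }
    return s;
}
With that in place, the data loop becomes sw.WriteLine(string.Join(",", output.Select(Escape)));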

MySQL -> Transaction Context -> Code Review

I am hoping someone can check how I am using transactions with MySQL. I believe this should work as outlined below. Can someone look at my code and tell me if I am doing it correctly? Thank you.
I believe this should:
1) Instantiate the db connection.
2) Iterate through the rows of the given DataTable.
3) Check whether the table exists and, if it does not, execute the CREATE TABLE.
4) Execute the INSERT command, with parameters, against the newly created or existing table.
5) Commit the transaction and then close the connection.
//Open the SQL Connection
var dbConnection = new MySqlConnection(GetConnectionString(WowDatabase));
dbConnection.Open();
//Instantiate the Command
using (var cmd = new MySqlCommand())
{
    cmd.Connection = dbConnection;
    //Create a new Transaction
    using (var transaction = dbConnection.BeginTransaction())
    {
        cmd.Transaction = transaction;
        uint lastId = 999999;
        for (int i = 0; i < dt.Rows.Count; i++)
        {
            //var identifier = dt.Rows[i].Field<int>("Identifier");
            var id = dt.Rows[i].Field<uint>("Entry");
            var name = dt.Rows[i].Field<string>("Name");
            var zone = dt.Rows[i].Field<uint>("ZoneID");
            var map = dt.Rows[i].Field<uint>("MapID");
            var state = dt.Rows[i].Field<Enums.ItemState>("State");
            var type = dt.Rows[i].Field<Enums.ObjectType>("Type");
            var faction = dt.Rows[i].Field<Enums.FactionType>("Faction");
            var x = dt.Rows[i].Field<float>("X");
            var y = dt.Rows[i].Field<float>("Y");
            var z = dt.Rows[i].Field<float>("Z");
            string dataTableName = "entry_" + id;
            //Create Table if it does not exist.
            if (id != lastId)
            {
                cmd.CommandText = $"CREATE TABLE IF NOT EXISTS `{dataTableName}` (" +
                    "`identifier` int NOT NULL AUTO_INCREMENT COMMENT 'Auto Increment Identifier' ," +
                    "`zone_id` int NULL COMMENT 'Zone Entry' ," +
                    "`x_axis` float NULL COMMENT 'X Axis on Map' ," +
                    "`y_axis` float NULL COMMENT 'Y Axis on Map' ," +
                    "`z_axis` float NULL COMMENT 'Z Axis on Map' ," +
                    "`situation` enum('') NULL COMMENT 'Location of the item (Underground, Indoors, Outdoors)' ," +
                    "`faction` enum('') NULL COMMENT 'Specifies the Faction which can safely access the item.' ," +
                    "PRIMARY KEY(`identifier`)" +
                    ")";
                cmd.ExecuteNonQuery();
                lastId = id;
            }
            //Create command to execute the insertion of Data into desired Table
            //(MySQL quotes identifiers with backticks, not square brackets)
            cmd.CommandText = $"INSERT INTO `{dataTableName}` " +
                "(`identifier`, `zone_id`, `x_axis`, `y_axis`, `z_axis`, `situation`, `faction`, `Create_Date`, `Update_Date`) " +
                "VALUES (@Identifier, @Zone_Id, @X_Axis, @Y_Axis, @Z_Axis, @Situation, @Faction, @Create_Date, @Update_Date)";
            //Add data values with Parameters (cleared each pass so the names are not added twice)
            cmd.CommandType = CommandType.Text;
            cmd.Parameters.Clear();
            //cmd.Parameters.AddWithValue("@Identifier", identifier);
            cmd.Parameters.AddWithValue("@Identifier", id);
            cmd.Parameters.AddWithValue("@Zone_Id", zone);
            cmd.Parameters.AddWithValue("@X_Axis", x);
            cmd.Parameters.AddWithValue("@Y_Axis", y);
            cmd.Parameters.AddWithValue("@Z_Axis", z);
            cmd.Parameters.AddWithValue("@Situation", state);
            cmd.Parameters.AddWithValue("@Faction", faction);
            cmd.Parameters.AddWithValue("@Create_Date", DateTime.Now.Date);
            cmd.Parameters.AddWithValue("@Update_Date", DateTime.Now.Date);
            cmd.ExecuteNonQuery();
        } //for (int i = 0; i < dt.Rows.Count; i++)
        //Commit the Transaction
        transaction.Commit();
    } //using (var transaction = dbConnection.BeginTransaction())
} //using (var cmd = new MySqlCommand())
//Close the Connection
dbConnection.Close();
I don't think this will work (as expected) with MySQL. There are a few statements that cause an implicit commit, and CREATE TABLE is one of them.
http://dev.mysql.com/doc/refman/5.7/en/implicit-commit.html
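One way to keep the inserts atomic despite that: run all the CREATE TABLE statements first, outside the transaction, and only then begin the transaction for the inserts. A rough outline reusing the question's variables; distinctTableNames is an assumed collection of the distinct entry ids:
//First pass: DDL outside any transaction, since CREATE TABLE commits implicitly
foreach (var tableName in distinctTableNames) //assumed: distinct "entry_" + id values
{
    cmd.CommandText = $"CREATE TABLE IF NOT EXISTS `{tableName}` ( /* columns as above */ )";
    cmd.ExecuteNonQuery();
}
//Second pass: all inserts inside one transaction, so they commit or roll back together
using (var transaction = dbConnection.BeginTransaction())
{
    cmd.Transaction = transaction;
    foreach (DataRow row in dt.Rows)
    {
        // ... build and execute the parameterized INSERT exactly as in the question ...
    }
    transaction.Commit();
}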
Consider a using statement
You can actually wrap your existing dbConnection within a using statement to ensure that it is safely disposed of (similar to how you are handling your transactions, commands, etc.):
//Open the SQL Connection
using (var dbConnection = new MySqlConnection(GetConnectionString(WowDatabase)))
{
// Other code omitted for brevity
}
Consistent String Interpolation
You have a few spots where you simply concatenate strings via +, but you are mostly taking advantage of C# 6's string interpolation feature. You might want to consider using it everywhere:
string dataTableName = $"entry_{id}";
No Need for Setting CommandType
Additionally, you could remove the setting of the CommandType property for your cmd object, as CommandType.Text is the default:
//cmd.CommandType = CommandType.Text;

Retrieving an element from a created list

I created a list named PTNList, and everything I needed was added to it just fine. Now I am attempting to write code that retrieves each element from that list and runs it against an SQL query. I have a feeling I'm not sure exactly how to go about this. The CompareNumbers.txt file is generated, but nothing is printed to it. Any help is greatly appreciated.
Below is the section of code I believe needs to be worked with.
using (FileStream fs = new FileStream("c:/temp/CompareNumbers.txt", FileMode.Append, FileAccess.Write))
using (StreamWriter sw = new StreamWriter(fs))
    foreach (var ptn in PTNList)
    {
        //create sql for getting the count using "ptn" as the variable thats changing
        //call DB with sql
        //Get count from query, write it out to a file;
        Console.WriteLine("Running Query");
        string query2 = @"SELECT COUNT(PRODUCT_TYPE_NO)
            AS NumberOfProducts
            FROM dbo.PRODUCT
            Where PRODUCT_TYPE_NO = " + ptn;
        SqlCommand cmd2 = new SqlCommand(query2);
        cmd2.Connection = con;
        rdr = cmd2.ExecuteReader();
        while (rdr.Read())
        {
            sw.WriteLine(rdr["NumberOfProducts"]);
        }
        rdr.Close();
    }
You haven't used apostrophes around the values, but you should use parameters anyway. You could also use one query instead of one for every type, for example with this approach:
string sql = @"SELECT COUNT(PRODUCT_TYPE_NO) AS NumberOfProducts
    FROM dbo.PRODUCT
    Where PRODUCT_TYPE_NO IN ({0});";
string[] paramNames = PTNList.Select(
    (s, i) => "@type" + i.ToString()
).ToArray();
string inClause = string.Join(",", paramNames);
using (SqlCommand cmd = new SqlCommand(string.Format(sql, inClause)))
{
    cmd.Connection = con; // the command needs the connection assigned before executing
    for (int i = 0; i < paramNames.Length; i++)
    {
        cmd.Parameters.AddWithValue(paramNames[i], PTNList[i]);
    }
    // con.Open(); // if not already open
    int numberOfProducts = (int)cmd.ExecuteScalar();
}
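Note that this returns one total across all the types. If you want a separate count per type, a GROUP BY variant of the same parameterized query would do it; a sketch reusing the inClause and paramNames from above:
string sqlPerType = @"SELECT PRODUCT_TYPE_NO, COUNT(*) AS NumberOfProducts
    FROM dbo.PRODUCT
    WHERE PRODUCT_TYPE_NO IN (" + inClause + @")
    GROUP BY PRODUCT_TYPE_NO;";
using (SqlCommand cmd = new SqlCommand(sqlPerType, con))
{
    for (int i = 0; i < paramNames.Length; i++)
    {
        cmd.Parameters.AddWithValue(paramNames[i], PTNList[i]);
    }
    using (SqlDataReader rdr = cmd.ExecuteReader())
    {
        while (rdr.Read())
        {
            // One line per product type
            sw.WriteLine(rdr["PRODUCT_TYPE_NO"] + ": " + rdr["NumberOfProducts"]);
        }
    }
}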
Update: maybe you really do just want to loop over them and get each count. Then you don't need this complex approach, but you should still use SQL parameters to prevent SQL injection and other issues, like missing apostrophes.
You'll want to convert the column value to a string, e.g.
sw.WriteLine(rdr["NumberOfProducts"].ToString());
(COUNT comes back as an int, so casting with as string would just yield null.)
Also, note that your query is prone to SQL injection attacks and should be parameterized, and that SqlCommand is disposable too. You can squeeze a bit more performance out by reusing the SqlCommand:
string query2 = @"SELECT COUNT(PRODUCT_TYPE_NO)
    AS NumberOfProducts
    FROM dbo.PRODUCT
    Where PRODUCT_TYPE_NO = @ptn";
using (var cmd2 = new SqlCommand(query2))
{
    cmd2.Connection = con;
    cmd2.Parameters.Add("@ptn", SqlDbType.VarChar);
    foreach (var ptn in PTNList)
    {
        cmd2.Parameters["@ptn"].Value = ptn;
        Console.WriteLine("Running Query");
        using (var rdr = cmd2.ExecuteReader())
        {
            if (rdr.Read())
            {
                sw.WriteLine(rdr["NumberOfProducts"].ToString());
            }
        }
    }
}
Are you sure your query gives a result and sw.WriteLine is executed? I would redesign your code like this, because if you have an error in your data query, you might get into trouble. I always like to use this pattern:
IDataReader reader = null;
try
{
    // create everything...
}
catch (Exception ex)
{
    // catch all exceptions
}
finally
{
    if (reader != null)
    {
        reader.Close();
    }
}
And use the same for your connection, so that you can be sure it is closed correctly.
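The same guarantee can be had more concisely with using blocks, which close the reader and connection even when an exception is thrown. A minimal sketch, assuming a connectionString and the ptn/sw variables from the question:
using (var con = new SqlConnection(connectionString)) // connectionString assumed
{
    con.Open();
    using (var cmd = new SqlCommand("SELECT COUNT(*) AS NumberOfProducts FROM dbo.PRODUCT WHERE PRODUCT_TYPE_NO = @ptn", con))
    {
        cmd.Parameters.AddWithValue("@ptn", ptn);
        using (var rdr = cmd.ExecuteReader())
        {
            while (rdr.Read())
            {
                sw.WriteLine(rdr["NumberOfProducts"].ToString());
            }
        } // reader closed here, even on error
    }
} // connection closed here, even on error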

Export SQL Database to WinForm DataSet and then to MDB Database using DataSet

My application is a WinForms app that relies on a database. At startup the application connects to a SQL database on the server and puts this information in a DataSet/DataTable.
If for some reason the database on the server is not accessible, the application has a built-in failover and will get its information from the local database.
If, in a normal scenario, I start the tool, it reads from the SQL database, and if the server copy has been updated (a separate snippet checks this), it should make sure the local database is up to date, and this is where the problem starts (see below).
This part works fine and is added as context - this is where we connect to the SQL Database
public static DataSet dtsTableContents;
public static DataTable CreateDatabaseSQLConnection()
{
    try
    {
        string strSqlConnectionString = "Data Source=MyLocation;Initial Catalog=MyCatalog;User=MyUser;Password=MyPassword;";
        SqlConnection scnConnectionToDatabase = new SqlConnection(strSqlConnectionString);
        string strQueryToTable = "SELECT * FROM " + strTableName;
        dtsTableContents = new DataSet();
        SqlCommand scmTableInformation = new SqlCommand(strQueryToTable, scnConnectionToDatabase);
        SqlDataAdapter sdaTableInformation = new SqlDataAdapter(scmTableInformation);
        scnConnectionToDatabase.Open();
        sdaTableInformation.Fill(dtsTableContents, strTableName);
        DataTable dttTableInformation = dtsTableContents.Tables[strTableName];
        scnConnectionToDatabase.Close();
        return dttTableInformation;
    }
    catch
    {
        return null;
    }
}
This snippet is part of the failover method that reads from my local database...
This part works fine and is added as context - this is where we connect to the MDB Database
public static DataTable CreateDatabaseConnection()
{
    try
    {
        string ConnectionString = @"Provider=Microsoft.Jet.OLEDB.4.0;Data Source=MyLocation;Persist Security Info=True;JET OLEDB:Database Password=MyPassword;";
        odcConnection = new OleDbConnection(ConnectionString);
        string strQueryToTable = "SELECT * FROM " + strTableName;
        DataSet dtsTableContents = new DataSet();
        OleDbCommand ocmTableInformation = new OleDbCommand(strQueryToTable, odcConnection);
        OleDbDataAdapter odaTableInformation = new OleDbDataAdapter(ocmTableInformation);
        odcConnection.Open();
        odaTableInformation.Fill(dtsTableContents, strTableName);
        DataTable dttTableInformation = dtsTableContents.Tables[strTableName];
        odcConnection.Close();
        return dttTableInformation;
    }
    catch
    {
        return null;
    }
}
From my CreateDatabaseSQLConnection() I have a DataSet. This DataSet is verified to contain all the information from the server database. Now I have been googling around and found myself trying to use this code to update the local database based on this article:
http://msdn.microsoft.com/en-us/library/system.data.common.dataadapter.update(v=vs.71).aspx
public static void UpdateLocalDatabase(string strTableName)
{
    try
    {
        if (CreateDatabaseConnection() != null)
        {
            string strQueryToTable = "SELECT * FROM " + strTableName;
            OleDbDataAdapter odaTableInformation = new OleDbDataAdapter();
            odaTableInformation.SelectCommand = new OleDbCommand(strQueryToTable, odcConnection);
            OleDbCommandBuilder ocbCommand = new OleDbCommandBuilder(odaTableInformation);
            odcConnection.Open();
            odaTableInformation.Update(dtsTableContents, strTableName);
            odcConnection.Close();
        }
    }
    catch { }
}
This snippet runs error-free, but it does not seem to change anything. Also, this step completes in milliseconds, and I would expect it to take longer.
I am using the DataSet I obtained from my SQL connection; this is the DataSet I am trying to write to my local database.
Might it be the fact that this is a DataSet from an SQL connection and that I can't write this to my mdb connection via my OleDbAdapter or am I just missing the obvious here?
Any help is appreciated.
Thanks,
Kevin
For a straight backup from the external database to the internal backup:
I've just been messing around with an .sdf and a server-based SQL database, and it has worked. It's not by any means a finished product, but for one column I've got the program to read from the external database and then write straight away to the local .sdf in around 15 lines of code.
SqlConnection sqlCon = new SqlConnection(ExternalDatabaseConnectionString);
SqlCeConnection sqlCECon = new SqlCeConnection(BackUpConnectionString);
using (sqlCon)
{
    using (sqlCECon)
    {
        sqlCon.Open();
        sqlCECon.Open();
        SqlCommand get = new SqlCommand("Select * from [TableToRead]", sqlCon);
        SqlCeCommand save = new SqlCeCommand("Update [BackUpTable] set InfoColumn = @info where ID = @id", sqlCECon);
        SqlDataReader reader = get.ExecuteReader();
        if (reader.HasRows)
        {
            reader.Read();
            save.Parameters.AddWithValue("@id", reader.GetString(0));
            save.Parameters.AddWithValue("@info", reader.GetString(1));
            save.ExecuteNonQuery();
        }
    }
}
For one row of a database, backing up one column, it works. I'm assuming you will have some sort of auto-incremented key like ID?
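If you need every row rather than just the first, the same pattern can loop over the reader. A sketch, assuming the same two-column layout as above:
SqlDataReader reader = get.ExecuteReader();
while (reader.Read())
{
    // Reset the parameters for each row, then re-run the update
    save.Parameters.Clear();
    save.Parameters.AddWithValue("@id", reader.GetString(0));
    save.Parameters.AddWithValue("@info", reader.GetString(1));
    save.ExecuteNonQuery();
}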
I think a first step would be to reduce your reliance on static methods and fields.
If you look at your UpdateLocalDatabase method, you'll see that you pass in strTableName, which that method uses, but a method that UpdateLocalDatabase calls (CreateDatabaseConnection) refers to a different, global static variable with the same name. Most likely the two strTableName variables contain different values and you are not seeing that they are not the same variable.
Also, you are trying to write out the global static DataSet dtsTableContents in UpdateLocalDatabase, but if you look at CreateDatabaseConnection, it actually creates a local version of that variable; again, you have two variables with the same name, one global and one local.
I suspect that the two dtsTableContents variables are the problem.
My suggestion, again, would be to not have any static methods or variables and, for what you're doing here, try not to use any global variables. Also, refactor and/or rename your methods to match more what they are actually doing.
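To make the shadowing concrete, this is the shape of the problem, with the names taken from the question:
public static DataSet dtsTableContents;        // global: filled from SQL Server, written out by UpdateLocalDatabase

public static DataTable CreateDatabaseConnection()
{
    DataSet dtsTableContents = new DataSet();  // local: shadows the global above
    // ...everything filled in here lands in the local copy and is discarded;
    // the global DataSet that UpdateLocalDatabase writes out is never touched here
}
Removing the local declaration, or better, passing the DataSet and table name in as parameters, removes the ambiguity.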
After endlessly trying to use the DataAdapter.Update() method, I gave up. I settled for using an .sdf file instead of an .mdb.
If I need to update my database, I remove the local database and create a new one using the same connection string. Then I loop over the tables in my DataSet, read the column names and types from it, and create tables based on that.
After this I loop over my DataSet and insert its contents, which I filled with the information from the server. Below is the code. This is just 'quick and dirty' as a proof of concept, but it works for my scenario.
public static void RemoveAndCreateLocalDb(string strLocalDbLocation)
{
    try
    {
        if (File.Exists(strLocalDbLocation))
        {
            File.Delete(strLocalDbLocation);
        }
        SqlCeEngine sceEngine = new SqlCeEngine(@"Data Source=" + strLocalDbLocation + ";Persist Security Info=True;Password=MyPass");
        sceEngine.CreateDatabase();
    }
    catch
    { }
}
public static void UpdateLocalDatabase(String strTableName, DataTable dttTable)
{
    try
    {
        // Opening the Connection
        sceConnection = CreateDatabaseSQLCEConnection();
        sceConnection.Open();
        // Creating tables in sdf file - checking headers and types and adding them to a query
        StringBuilder stbSqlGetHeaders = new StringBuilder();
        stbSqlGetHeaders.Append("create table " + strTableName + " (");
        int z = 0;
        foreach (DataColumn col in dttTable.Columns)
        {
            if (z != 0) stbSqlGetHeaders.Append(", ");
            String strName = col.ColumnName;
            String strType = col.DataType.ToString();
            if (strType.Equals("")) throw new ArgumentException("DataType Empty");
            if (strType.Equals("System.Int32")) strType = "int";
            if (strType.Equals("System.String")) strType = "nvarchar (100)";
            if (strType.Equals("System.Boolean")) strType = "nvarchar (15)";
            if (strType.Equals("System.DateTime")) strType = "datetime";
            if (strType.Equals("System.Byte[]")) strType = "nvarchar (100)";
            stbSqlGetHeaders.Append(strName + " " + strType);
            z++;
        }
        stbSqlGetHeaders.Append(" )");
        SqlCeCommand sceCreateTableCommand;
        string strCreateTableQuery = stbSqlGetHeaders.ToString();
        sceCreateTableCommand = new SqlCeCommand(strCreateTableQuery, sceConnection);
        sceCreateTableCommand.ExecuteNonQuery();
        StringBuilder stbSqlQuery = new StringBuilder();
        StringBuilder stbFields = new StringBuilder();
        StringBuilder stbParameters = new StringBuilder();
        stbSqlQuery.Append("insert into " + strTableName + " (");
        foreach (DataColumn col in dttTable.Columns)
        {
            stbFields.Append(col.ColumnName);
            stbParameters.Append("@" + col.ColumnName.ToLower());
            if (col.ColumnName != dttTable.Columns[dttTable.Columns.Count - 1].ColumnName)
            {
                stbFields.Append(", ");
                stbParameters.Append(", ");
            }
        }
        stbSqlQuery.Append(stbFields.ToString() + ") ");
        stbSqlQuery.Append("values (");
        stbSqlQuery.Append(stbParameters.ToString() + ") ");
        string strTotalRows = dttTable.Rows.Count.ToString();
        foreach (DataRow row in dttTable.Rows)
        {
            SqlCeCommand sceInsertCommand = new SqlCeCommand(stbSqlQuery.ToString(), sceConnection);
            foreach (DataColumn col in dttTable.Columns)
            {
                if (col.ColumnName.ToLower() == "ssma_timestamp")
                {
                    sceInsertCommand.Parameters.AddWithValue("@" + col.ColumnName.ToLower(), "");
                }
                else
                {
                    sceInsertCommand.Parameters.AddWithValue("@" + col.ColumnName.ToLower(), row[col.ColumnName]);
                }
            }
            sceInsertCommand.ExecuteNonQuery();
        }
    }
    catch { }
}

Load words in lines from file to sqlite table

I made a function in C# to read a file line by line and then load the lines into SQLite (s3db).
private void LoadFromDictionary()
{
    Encoding enc = Encoding.GetEncoding(1250);
    using (StreamReader r = new StreamReader("c:\\Temp2\\dictionary.txt", enc))
    {
        string line = "";
        while ((line = r.ReadLine()) != null)
        {
            line = line.Trim();
            AddWord(line);
        }
    }
    MessageBox.Show("Finally :P", "Info");
}
private void AddWord(string w)
{
    String insSQL = "insert into Words values(\"" + w + "\")";
    String strConn = @"Data Source=C:\Temp2\dictionary.s3db";
    SQLiteConnection conn = new SQLiteConnection(strConn);
    SQLiteDataAdapter da = new SQLiteDataAdapter(insSQL, strConn);
    da.Fill(dt);
    dataGridView1.DataSource = dt.DefaultView;
}
But is there any faster way? I created the table with the SQLite Administrator application.
Can SQLite load a file by itself and turn it into a table?
I am talking about 3+ million words (one word per line).
PS: please correct my topic if there is something wrong :)
Yes, there is a much, much faster method using the following techniques:
1) Only open a connection to the database one time
2) Use a parameterized command for better performance and lower overhead (don't have to use new strings on each pass).
3) Wrap the entire operation in a transaction. As a general rule, this will improve your performance.
Note that I do not show transaction rollback or closing the connection, which are also best practices that should be implemented.
private void LoadFromDictionary()
{
    Encoding enc = Encoding.GetEncoding(1250);
    string strConn = @"Data Source=C:\Temp2\dictionary.s3db";
    SQLiteConnection conn = new SQLiteConnection(strConn);
    conn.Open();
    string insSQL = "insert or ignore into wyrazy values(@Word)";
    DbCommand oCommand = conn.CreateCommand();
    oCommand.Connection = conn;
    oCommand.CommandText = insSQL;
    DbParameter oParameter = oCommand.CreateParameter();
    oParameter.ParameterName = "@Word";
    oParameter.DbType = DbType.String;
    oParameter.Size = 100;
    oCommand.Parameters.Add(oParameter);
    DbTransaction oTransaction = conn.BeginTransaction();
    using (StreamReader r = new StreamReader("c:\\Temp2\\dictionary.txt", enc))
    {
        string line = "";
        while ((line = r.ReadLine()) != null)
        {
            line = line.Trim();
            if (!string.IsNullOrEmpty(line))
            {
                oParameter.Value = line;
                oCommand.ExecuteNonQuery();
            }
        }
    }
    oTransaction.Commit();
    conn.Close();
    MessageBox.Show("Finally :P", "Info");
}
You could try a bulk insert. When reading this article, please pay special attention to the parameterized queries used there; you should use those instead of the string concatenation in the insSQL variable in your sample.
Using transactions usually speeds things up quite a bit, depending on your desired batch size. I'm not 100% familiar with DataAdapters and DataSources, but instead of creating a new connection every time to insert one row, modify your code to use one connection, call SQLiteConnection.BeginTransaction(), and when you are done call Transaction.Commit().
I just did this the other day: first use a transaction, and parameterized queries. I was able to load 16 million rows in about a minute this way.
internal static void FastInsertMany(DbConnection cnn)
{
    using (DbTransaction dbTrans = cnn.BeginTransaction())
    {
        using (DbCommand cmd = cnn.CreateCommand())
        {
            cmd.CommandText = "INSERT INTO TestCase(MyValue) VALUES(?)";
            DbParameter Field1 = cmd.CreateParameter();
            cmd.Parameters.Add(Field1);
            for (int n = 0; n < 100000; n++)
            {
                Field1.Value = n + 100000;
                cmd.ExecuteNonQuery();
            }
        }
        dbTrans.Commit();
    }
}
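Called from the dictionary-loading scenario above, it might be used like this; a sketch assuming the System.Data.SQLite provider and the database path from the question:
// Usage sketch: open one connection, then run the batched insert inside it
using (var cnn = new SQLiteConnection(@"Data Source=C:\Temp2\dictionary.s3db"))
{
    cnn.Open();
    FastInsertMany(cnn);
} // connection disposed (and closed) here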
