I want to create a table dynamically and fill it with random numbers, but when I execute the code, MySQL throws a syntax error.
for (int i = 0; i < 100; i++)
{
    // gameSession is an int; I tried Parameters.AddWithValue but still get the same problem.
    MySqlCommand dice = new MySqlCommand("INSERT INTO " + gameSession.ToString() + " (Dice1,Dice2) VALUES (@dice1,@dice2)", myConnection);
    Random dice1 = new Random();
    dice.Parameters.AddWithValue("@dice1", dice1.Next(1, 7));
    Random dice2 = new Random();
    dice.Parameters.AddWithValue("@dice2", dice2.Next(1, 7));
    if (myConnection.State == ConnectionState.Closed)
    {
        myConnection.Open();
    }
    dice.ExecuteNonQuery();
}
You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near 'INSERT INTO 7571877 ...'
One major reason for your syntax error is that the table name is not quoted. Check out the MySQL manual section on identifiers (which covers table names).
Just as you can't use a bare number as a variable name by default, you can't use one as a table name without escaping the identifier. A couple of solutions: prefix the table name with a letter, or use a single table.
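For example, backtick-quoting the numeric identifier would let the original statement parse (though the single-table design described below is the better fix):

// Backticks make MySQL accept an all-digit table name (still not recommended).
MySqlCommand dice = new MySqlCommand(
    "INSERT INTO `" + gameSession + "` (Dice1,Dice2) VALUES (@dice1,@dice2)",
    myConnection);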
Answering the question in your title:
You can create a table if it doesn't exist using the "IF NOT EXISTS" clause. The SQL would look like this:
CREATE TABLE IF NOT EXISTS table_name (
dice1 INTEGER,
dice2 INTEGER
);
If you insist on going down this route, you should create your table names with a standard prefix letter:
"s" + gameSession.ToString()
Be warned: having a separate table for each session complicates your database maintenance, particularly dealing with abandoned sessions. It's a lot easier to remove rows from a single table than to find every table that has been abandoned.
Another serious concern is database security. When building an application against a database, it is far better to grant a user only insert and update privileges than the ability to create and drop any table in your database. Since the table names are created dynamically, you can't easily maintain per-table privileges. If this is web facing, you are opening yourself up to serious liability for very little gain.
Answering the need:
A better design is to create a single table with a session id column. Otherwise, you will end up with any number of tables holding just a few records each. You would create the table once, like this:
CREATE TABLE IF NOT EXISTS DiceRolls (
session INTEGER,
dice1 INTEGER,
dice2 INTEGER
);
Your inserts would only need to change a little bit:
Random rand = new Random(); // one shared instance: new Random() in a tight loop repeats the same seed
for (int i = 0; i < 100; i++)
{
    MySqlCommand dice = new MySqlCommand("INSERT INTO DiceRolls (Session, Dice1, Dice2) VALUES (@session, @dice1, @dice2)", myConnection);
    dice.Parameters.AddWithValue("@session", gameSession);
    dice.Parameters.AddWithValue("@dice1", rand.Next(1, 7));
    dice.Parameters.AddWithValue("@dice2", rand.Next(1, 7));
    if (myConnection.State == ConnectionState.Closed)
    {
        myConnection.Open();
    }
    dice.ExecuteNonQuery();
}
This makes cleanup a lot easier: you can run a batch job that deletes everything that isn't a current session.
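As a hedged example, assuming live session ids are tracked in a hypothetical ActiveSessions table:

// ActiveSessions is an assumed table of live session ids, not from the question.
MySqlCommand cleanup = new MySqlCommand(
    "DELETE FROM DiceRolls WHERE session NOT IN (SELECT id FROM ActiveSessions)",
    myConnection);
cleanup.ExecuteNonQuery();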
The problem is with your table name. Make sure the table name is valid, then use a function like this:
void InsertDice(string connectionString, string tableName, int count)
{
    var commandText = String.Format("INSERT INTO `{0}` (Dice1, Dice2) VALUES (@dice1, @dice2)", tableName);
    var random = new Random(); // the default constructor is already time-seeded
    using (var connection = new MySqlConnection(connectionString))
    using (var command = connection.CreateCommand())
    {
        connection.Open();
        command.CommandText = commandText;
        for (int i = 0; i < count; i++)
        {
            command.Parameters.AddWithValue("@dice1", random.Next(1, 7));
            command.Parameters.AddWithValue("@dice2", random.Next(1, 7));
            command.ExecuteNonQuery();
            command.Parameters.Clear();
        }
    }
}
I am trying to insert some data into two MySQL tables.
The second table stores the first table's row Id as a foreign key.
I have code that works, but it is super slow. What is the best/fastest way to make it faster?
string ConnectionString = "server=localhost; password=1234; database=DB; user=Jack";
MySqlConnection mConnection = new MySqlConnection(ConnectionString);
mConnection.Open();
int index = 1;
for (int i = 0; i < 100000; i++)
{
    string insertPerson = "INSERT INTO myentities(Name) VALUES (@first_name);"
        + "INSERT INTO secondtable(Id, Name, myentities) VALUES (@ID, @city, LAST_INSERT_ID());";
    MySqlCommand command = new MySqlCommand(insertPerson, mConnection);
    command.Parameters.AddWithValue("@first_name", "Jack");
    command.Parameters.AddWithValue("@ID", i + 1);
    command.Parameters.AddWithValue("@city", "Frank");
    command.ExecuteNonQuery();
    command.Parameters.Clear();
}
I found the following code in one of the Stack Overflow questions, but it inserts data into a single table only, not into multiple tables connected through a foreign key.
This code is pretty fast, but I was not sure how to make it work with multiple tables.
public static void BulkToMySQL()
{
    string ConnectionString = "server=192.168.1xxx";
    StringBuilder sCommand = new StringBuilder("INSERT INTO User (FirstName, LastName) VALUES ");
    using (MySqlConnection mConnection = new MySqlConnection(ConnectionString))
    {
        List<string> Rows = new List<string>();
        for (int i = 0; i < 100000; i++)
        {
            Rows.Add(string.Format("('{0}','{1}')", MySqlHelper.EscapeString("test"), MySqlHelper.EscapeString("test")));
        }
        sCommand.Append(string.Join(",", Rows));
        sCommand.Append(";");
        mConnection.Open();
        using (MySqlCommand myCmd = new MySqlCommand(sCommand.ToString(), mConnection))
        {
            myCmd.CommandType = CommandType.Text;
            myCmd.ExecuteNonQuery();
        }
    }
}
The fastest way is a strategy that avoids calling MySQL in a loop via the .NET Connector at all, especially for i = 0 to 99999. You achieve this either through CASE A: direct db table manipulation, or CASE B: CSV-to-db imports with LOAD DATA INFILE.
For CASE B: it is often wise to bring the data into a staging table or tables first, where checks for data readiness can be made depending on the circumstances. You may be getting external data that needs scrubbing (ETL), and staging keeps unfit data out of your production tables, leaving an abort option open to you.
Now for performance anecdotes. With MySQL and .NET Connector version 6.9.9.0 in late 2016, I can achieve up to 40x performance gains by going this route. It may seem unnatural not to call an INSERT query, but I don't in loops; ok, sure, in small loops, but not for bulk data ingest, not even for 500 rows. You will see a noticeable UX improvement if you re-craft such routines.
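For illustration, here is a minimal CASE B sketch using Connector/NET's MySqlBulkLoader, which wraps LOAD DATA INFILE; the staging table name and CSV path are hypothetical:

using (var connection = new MySqlConnection(connectionString))
{
    connection.Open();
    var loader = new MySqlBulkLoader(connection)
    {
        TableName = "staging_myentities",     // assumed staging table
        FileName = @"C:\data\myentities.csv", // assumed client-side CSV
        FieldTerminator = ",",
        LineTerminator = "\n",
        NumberOfLinesToSkip = 1,              // skip the CSV header row
        Local = true                          // the file lives on the client machine
    };
    int inserted = loader.Load();             // issues LOAD DATA LOCAL INFILE once
}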
The above is for data that truly comes from external sources. For CASE A, data already in your db, it does not apply. There, you strive to craft your SQL so the data is massaged as much as possible (read: 100%) on the server side, without bringing it back to the client only to loop over Connector calls pushing it back into the server. This does not necessarily mandate stored procedures; client-issued statements that operate on the data in place, with no transfer to the client and back, are what you shoot for.
You can gain some improvement by moving unnecessary operations out of the loop, since anything you do there is repeated 100,000 times:
string insertPerson =
    "INSERT INTO myentities(Name) VALUES (@first_name);"
    + "INSERT INTO secondtable(Id, Name, myentities) VALUES (@ID, @city, LAST_INSERT_ID());";
string ConnectionString = "server=localhost; password=1234; database=DB; user=Jack";
using (var connection = new MySqlConnection(ConnectionString))
using (var command = new MySqlCommand(insertPerson, connection))
{
    // guessing at column types and lengths here
    command.Parameters.Add("@first_name", MySqlDbType.VarChar, 50).Value = "Jack";
    var id = command.Parameters.Add("@ID", MySqlDbType.Int32);
    command.Parameters.Add("@city", MySqlDbType.VarChar, 50).Value = "Frank";
    connection.Open();
    for (int i = 1; i <= 100000; i++)
    {
        id.Value = i;
        command.ExecuteNonQuery();
    }
}
But mostly, you try to avoid this scenario. Instead, you'd do something like use a numbers table to project the results for both tables in advance. There are also things you can do with foreign key constraints: locking (you need to lock the whole table to avoid bad keys if someone else inserts or tries to read partially inserted records), transaction logging (you can set it to log only the batch rather than each change), and foreign key enforcement (you can turn it off while you handle the insert).
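One more incremental gain worth noting, as a hedged sketch building on the code above: wrap the loop in an explicit transaction so MySQL commits once instead of once per row.

using (var connection = new MySqlConnection(ConnectionString))
using (var command = new MySqlCommand(insertPerson, connection))
{
    // parameters set up as above (including the 'id' parameter)...
    connection.Open();
    using (MySqlTransaction tx = connection.BeginTransaction())
    {
        command.Transaction = tx;   // attach the transaction to the command
        for (int i = 1; i <= 100000; i++)
        {
            id.Value = i;
            command.ExecuteNonQuery();
        }
        tx.Commit();                // one commit instead of 100,000
    }
}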
I have the following code and it works, but the insert took more than three hours.
How can I optimize the insert into the SQL table?
foreach (var sheetName in GetExcelSheetNames(connectionString))
{
    using (OleDbConnection con1 = new OleDbConnection(connectionString))
    {
        var dt = new DataTable();
        string query = string.Format("SELECT * FROM [{0}]", sheetName);
        con1.Open();
        OleDbDataAdapter adapter = new OleDbDataAdapter(query, con1);
        adapter.Fill(dt);
        using (SqlConnection con = new SqlConnection(consString))
        {
            con.Open();
            for (int i = 2; i < dt.Rows.Count; i++)
            {
                for (int j = 1; j < dt.Columns.Count; j += 3)
                {
                    try
                    {
                        var s = dt.Rows[i][0].ToString();
                        var dt1 = DateTime.Parse(s, CultureInfo.GetCultureInfo("fr-FR"));
                        var s1 = dt.Rows[i][j].ToString();
                        var s2 = dt.Rows[i][j + 1].ToString();
                        var s3 = sheetName.Remove(sheetName.Length - 1);
                        SqlCommand command = new SqlCommand("INSERT INTO [Obj CA MPX] ([CA TTC],[VAL MRG TTC],[CA HT],[VAL MRG HT],[Rayon],[Date],[Code Site]) VALUES (@ca, @val, @catHT, @valHT, @rayon, @date, @sheetName)", con);
                        command.Parameters.Add("@date", SqlDbType.Date).Value = dt1;
                        command.Parameters.AddWithValue("@ca", s1);
                        command.Parameters.AddWithValue("@val", s2);
                        command.Parameters.AddWithValue("@rayon", dt.Rows[0][j].ToString());
                        command.Parameters.AddWithValue("@sheetName", s3);
                        command.Parameters.AddWithValue("@catHT", DBNull.Value);
                        command.Parameters.AddWithValue("@valHT", DBNull.Value);
                        command.ExecuteNonQuery();
                    }
                    catch (Exception)
                    {
                        // rows that fail to parse are skipped (the original code was truncated here)
                    }
                }
            }
        }
    }
}
Maybe you should save it as a file and use BULK INSERT:
https://msdn.microsoft.com/de-de/library/ms188365%28v=sql.120%29.aspx
SQL Server has the option of using a bulk insert.
Here is a good article on importing a CSV.
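As a hedged sketch of that idea with SqlBulkCopy (no intermediate file needed): assuming you first reshape the Excel rows into a DataTable named rows whose columns match [Obj CA MPX], one bulk write replaces all the row-by-row inserts. The destination column names come from the question's INSERT statement; the source column names are assumptions.

using (SqlConnection con = new SqlConnection(consString))
{
    con.Open();
    using (SqlBulkCopy bulk = new SqlBulkCopy(con))
    {
        bulk.DestinationTableName = "[Obj CA MPX]";
        // Map the reshaped DataTable's columns to the destination columns.
        bulk.ColumnMappings.Add("Date", "Date");
        bulk.ColumnMappings.Add("CaTtc", "CA TTC");          // source names assumed
        bulk.ColumnMappings.Add("ValMrgTtc", "VAL MRG TTC");
        bulk.WriteToServer(rows);  // one bulk operation instead of one INSERT per row
    }
}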
You should first read this article from Eric Lippert: Which is faster?
Keep it in mind while trying to optimize your process.
The insert took 3 hours, but did you insert 10 items or 900,000,000,000 items?
If it's the latter, maybe 3 hours is pretty good.
What is your database? SQL Server 2005 Express? SQL Server 2014 Enterprise?
The advice could differ.
Without more details, we can only give you suggestions that may or may not apply to your configuration.
Here are some off the top of my head:
Is the bottleneck on the DB side? Check the execution plan, add indexes if needed
Beware of AddWithValue; it can prevent the use of indexes in your query
If you are loading a lot of data into a non-live database, you can use a lighter recovery model to avoid generating a lot of useless logs (a bulk load will automatically use BULK_LOGGED; or you can activate the SIMPLE recovery model with alter database [YourDB] set recovery SIMPLE, and don't forget to re-enable the FULL recovery model afterwards)
Are there alternatives to loading the data from an Excel file? Could you use another source, or convert the Excel file to a CSV first?
What does the performance monitor tell you? Maybe you need better hardware (more RAM, faster disks, RAID), or you could move heavily used files (mdf, ldf) onto separate disks
You could copy the Excel file several times and use parallelization, loading into different tables that will be partitions of your final table
This list could continue forever.
Here is an interesting article about optimizing data loading: We Loaded 1TB in 30 Minutes with SSIS, and So Can You.
The article focuses on SSIS, but some of the advice applies beyond it.
You can put several (e.g. 100) inserts into a single string using a StringBuilder, using an index in the parameter names. Note that a single query can have at most 2,100 parameters.
StringBuilder batch = new StringBuilder();
for (int i = 0; i < pageSize; i++)
{
    batch.AppendFormat(
        @"INSERT INTO [Obj CA MPX] ([CA TTC],[VAL MRG TTC], ...) VALUES (@ca{0}, @val{0}, ...)",
        i);
    batch.AppendLine();
}

SqlCommand command = new SqlCommand(batch.ToString(), con);

// append parameters, using the index
for (int i = 0; i < pageSize; i++)
{
    command.Parameters.Add("@date" + i, SqlDbType.Date).Value = dt1[i];
    command.Parameters.AddWithValue("@ca" + i, s1[i]);
    // ...
}
command.ExecuteNonQuery();
Of course this is not finished; you have to integrate the pages into your existing loops, which may not be entirely simple.
Alternatively, you can skip the parameters and put the (escaped) argument values directly into the query. That lets you create much larger batches (I would put 1,000 to 10,000 inserts into one batch) and is much easier to implement.
I have a list called ListTypes that holds 10 types of products. Below, the stored procedure loops over the types, fetches every record matching the current type, and stores the result in the list ListIds. This is killing my SQL box, since I have over 200 users executing this constantly all day.
I know it's not good architecture to loop a SQL statement, but this is the only way I got it to work. Any ideas how I can do this without looping? Maybe a LINQ statement? I have never used LINQ at this magnitude. Thank you.
protected void GetIds(string Type, string Sub)
{
    LinkedIds.Clear();
    using (SqlConnection cs = new SqlConnection(connstr))
    {
        for (int x = 0; x < ListTypes.Count; x++)
        {
            cs.Open();
            SqlCommand select = new SqlCommand("spUI_LinkedIds", cs);
            select.CommandType = System.Data.CommandType.StoredProcedure;
            select.Parameters.AddWithValue("@Type", Type);
            select.Parameters.AddWithValue("@Sub", Sub);
            select.Parameters.AddWithValue("@TransId", ListTypes[x]);
            SqlDataReader dr = select.ExecuteReader();
            while (dr.Read())
            {
                ListIds.Add(Convert.ToInt32(dr["LinkedId"]));
            }
            cs.Close();
        }
    }
}
Not a full answer, but this wouldn't fit in a comment. You can at least update your existing code to be more efficient like this:
protected List<int> GetIds(string Type, string Sub, IEnumerable<int> types)
{
    var result = new List<int>();
    using (SqlConnection cs = new SqlConnection(connstr))
    using (SqlCommand select = new SqlCommand("spUI_LinkedIds", cs))
    {
        select.CommandType = System.Data.CommandType.StoredProcedure;
        // Don't use AddWithValue! Be explicit about your DB types.
        // I had to guess here. Replace with the actual types from your database.
        select.Parameters.Add("@Type", SqlDbType.VarChar, 10).Value = Type;
        select.Parameters.Add("@Sub", SqlDbType.VarChar, 10).Value = Sub;
        var TransID = select.Parameters.Add("@TransId", SqlDbType.Int);
        cs.Open();
        foreach (int type in types)
        {
            TransID.Value = type;
            using (SqlDataReader dr = select.ExecuteReader())
            {
                while (dr.Read())
                {
                    result.Add((int)dr["LinkedId"]);
                }
            }
        }
    }
    return result;
}
Note that this way you only open and close the connection once. Normally in ADO.NET it's better to create a new connection and re-open it for each query; the exception is a tight loop like this. Also, the only thing that changes inside the loop is the one parameter value. Finally, it's better to design methods that don't rely on other class state: this method no longer needs to know about the ListTypes and ListIds class variables, which makes it possible to (among other things) write better unit tests for it.
Again, this isn't a full answer; it's just an incremental improvement. What you really need to do is write another stored procedure that accepts a table valued parameter, and build on the query from your existing stored procedure to JOIN with the table valued parameter, so that all of this will fit into a single SQL statement. But until you share your stored procedure code, this is about as much help as I can give you.
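For what it's worth, here is a hedged sketch of that approach; the table type dbo.IntList and the procedure name spUI_LinkedIds_Batch are hypothetical, and the procedure is assumed to JOIN against the table valued parameter:

// CREATE TYPE dbo.IntList AS TABLE (Id INT) must already exist in the database.
var transIds = new DataTable();
transIds.Columns.Add("Id", typeof(int));
foreach (int type in types)
    transIds.Rows.Add(type);

using (SqlConnection cs = new SqlConnection(connstr))
using (SqlCommand select = new SqlCommand("spUI_LinkedIds_Batch", cs))
{
    select.CommandType = System.Data.CommandType.StoredProcedure;
    select.Parameters.Add("@Type", SqlDbType.VarChar, 10).Value = Type;
    select.Parameters.Add("@Sub", SqlDbType.VarChar, 10).Value = Sub;
    var p = select.Parameters.Add("@TransIds", SqlDbType.Structured);
    p.TypeName = "dbo.IntList"; // must match the table type name
    p.Value = transIds;
    cs.Open();
    // one round trip returns all LinkedIds instead of one query per type
}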
Besides the improvements others have suggested: you could insert your IDs into a temp table and then run one query:
SELECT * FROM WhatEverTable WHERE transid IN (SELECT transid FROM #tempTable)
On MSSQL this works really fast.
If you're not using MSSQL, it may be that one big SELECT with joins beats a SELECT ... IN; you have to test such cases yourself on your DBMS. A sketch of the temp table route follows below.
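A hedged sketch of that route on SQL Server; SqlBulkCopy can target the session-scoped #tempTable as long as it uses the same open connection that created it:

using (SqlConnection cs = new SqlConnection(connstr))
{
    cs.Open();
    new SqlCommand("CREATE TABLE #tempTable (transid INT)", cs).ExecuteNonQuery();

    var ids = new DataTable();
    ids.Columns.Add("transid", typeof(int));
    foreach (int t in ListTypes)
        ids.Rows.Add(t);

    using (var bulk = new SqlBulkCopy(cs))
    {
        bulk.DestinationTableName = "#tempTable";
        bulk.WriteToServer(ids); // load all the ids in one shot
    }

    // WhatEverTable is the placeholder name from above.
    var query = new SqlCommand(
        "SELECT * FROM WhatEverTable WHERE transid IN (SELECT transid FROM #tempTable)", cs);
}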
According to your comment:
The idea is: let's say I have a table, and I have to get all records from the table that match these 10 types of products. How can I get all of those products? And this number is dynamic.
So... why use a stored procedure at all? Why not query the table?
// If [Type] and [Sub] are external inputs (e.g. they come from a user request),
// they should be sanitized first (remove or escape '\' and apostrophes).
// create connection
string queryTmpl = "SELECT LinkedId FROM [yourTable] WHERE [TYPE] = '{0}' AND [SUB] = '{1}' AND [TRANSID] IN ({2})";
string query = string.Format(queryTmpl, Type, Sub, string.Join(", ", ListTypes));
SqlCommand select = new SqlCommand(query, cs);
// and so forth
To use LINQ to SQL you would need to map the table to a class. That would make the query simpler to write.
I have an array of objects. Each object contains around 20 members. I need to loop through the array and insert the data from the object into my database. Is there a way of doing this that does not require me to put an INSERT statement within the body of my loop? I am using C# and SQL Server.
for (int i = 0; i < arr.Length; i++)
{
    strSQL = "INSERT INTO myTable (field1...field20) VALUES (" + arr[i].field1 + "..." + arr[i].field20 + ")";
    new SqlCommand(strSQL, sqlConn).ExecuteNonQuery();
}
Why don't you use LINQ?
Yes, there is a way.
First, +1 for lollancf37's answer.
And second: create the command and its parameters once, then just update the parameter values and execute inside the loop.
Enjoy
SqlCommand sqlCmd = null;
SqlParameter pField1 = new SqlParameter("@Field1", System.Data.SqlDbType.VarChar, 255);
...
SqlParameter pField20 = new SqlParameter("@Field20", System.Data.SqlDbType.VarChar, 255);
try
{
    sqlCmd = new SqlCommand("INSERT INTO myTable (field1...field20) VALUES (@Field1,...,@Field20)", sqlConn);
    sqlCmd.Parameters.Add(pField1);
    ...
    sqlCmd.Parameters.Add(pField20);
    for (int i = 0; i < arr.Length; i++)
    {
        pField1.Value = arr[i].field1;
        ...
        pField20.Value = arr[i].field20;
        sqlCmd.ExecuteNonQuery();
    }
}
catch (Exception ex)
{
    LogError(ex.Message);
}
finally
{
    if (sqlConn != null && sqlConn.State != System.Data.ConnectionState.Closed)
        sqlConn.Close();
    if (sqlCmd != null)
        sqlCmd.Dispose();
}
For simple commands, you could use a format string, something like this:
// assuming that the first param is a number, the second one a string, etc.
string insertFormat = @"INSERT INTO myTable (field1...field20)
    VALUES ({0},'{1}',...,{19})";
for (int i = 0; i < arr.Length; i++)
{
    // assuming that arr[i] is a string[] supplying the 20 values
    strSQL = string.Format(insertFormat, arr[i]);
    new SqlCommand(strSQL, sqlConn).ExecuteNonQuery();
}
You must be aware, however, that you are leaving yourself wide open to SQL injection (mandatory xkcd). To avoid that, and other issues, you could take a look at "advanced" solutions (using a stored procedure, using reflection to map the fields, using some ORM tool, using LINQ to SQL, etc.).
strSQL = "INSERT INTO MYTABLE (COL01, COL01...COL42)";
for(int i = 0; i < arr.length; i++)
{
strSQL += "SELECT (" +arr[0] +", " +arr[1] +"..." +arr[42] +")";
if(i < arr.length - 1) {
strSQL += " UNION ALL ";
}
}
sqlCommand.Execute(strSQL, conn); // I forget how this bit goes, I am not a C# programmer by trade...
I find that inserting more than 500 records at a time like this makes the database run pretty slow.
Parameterised stored procedures.
Write the insert statement in the db as a stored procedure, with parameters.
Create a method that populates the stored procedure's parameters, passing in the object as the argument.
Call the method in the loop, populating and passing in an object instance each time.
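A minimal sketch of those steps; the procedure name InsertItem and its parameter list are assumptions, not from the question:

// Assumes a stored procedure exists, e.g.:
// CREATE PROCEDURE InsertItem @Field1 VARCHAR(255), ... AS INSERT INTO myTable ...
using (var cmd = new SqlCommand("InsertItem", sqlConn))
{
    cmd.CommandType = CommandType.StoredProcedure;
    var pField1 = cmd.Parameters.Add("@Field1", SqlDbType.VarChar, 255);
    // ... one parameter per object member ...
    foreach (var item in arr)
    {
        pField1.Value = item.field1;
        // ... populate the remaining parameters from the object ...
        cmd.ExecuteNonQuery();
    }
}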
I had a similar problem at one point. The most effective way I found to deal with it was to create a 2-dimensional array for your data, storing the column name for each piece of data alongside the data itself.
From there, it's pretty trivial to loop through it and use a StringBuilder to assemble the query. Additionally, instead of simply concatenating the insert values into the SQL, give each value a parameter name (I just used the column name) and create a new SqlParameter for it in the same loop.
If you're using .NET 4, just create a SqlParameter array and add your created parameters to it in your loop. After the loop ends, use the "AddRange" method of your command's parameter collection to add the whole array to the command.
This way, you're able to build a fully dynamic query that sanitizes inputs.
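Here is a compressed sketch of that approach, using a single column for brevity; the table and column names are assumed:

var sql = new StringBuilder("INSERT INTO myTable (field1) VALUES ");
var parameters = new List<SqlParameter>();
for (int i = 0; i < arr.Length; i++)
{
    if (i > 0) sql.Append(",");
    sql.AppendFormat("(@field1_{0})", i);  // parameter named after the column plus a row index
    parameters.Add(new SqlParameter("@field1_" + i, arr[i].field1));
}
using (var cmd = new SqlCommand(sql.ToString(), sqlConn))
{
    cmd.Parameters.AddRange(parameters.ToArray()); // add the whole array at once
    cmd.ExecuteNonQuery();
}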
I want to update multiple rows like below:
update mytable set s_id = {0} where id = {1}
(Here s_id is evaluated based on some complex logic.)
For performance reasons, the updates should happen in batches. Is there any way to batch the update statements and execute the batch through a single execute call? I know that in Java this can be done through JDBC. Is there a similar way in C#?
Thanks in advance
Yes, you can use an SqlDataAdapter.
The SqlDataAdapter has InsertCommand and UpdateCommand properties, which let you specify a SqlCommand used to insert new rows into the database and a SqlCommand used to update existing rows, respectively.
You can then pass a DataTable to the Update method of the data adapter, and it will batch up the statements to the server: for rows in the DataTable that are new, it executes the INSERT command; for modified rows, it executes the UPDATE command.
You can define the batch size using the UpdateBatchSize property.
This approach lets you deal with large volumes of data and handle errors flexibly; e.g. if an error is encountered with a particular update, you can tell it NOT to throw an exception but to carry on with the remaining updates by setting the ContinueUpdateOnError property.
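A minimal sketch of batched updates with SqlDataAdapter; the table, column names, and the dataTable variable are assumptions. Note that UpdatedRowSource must be set to None for batching to take effect:

var update = new SqlCommand("UPDATE mytable SET s_id = @s_id WHERE id = @id", conn);
update.Parameters.Add("@s_id", SqlDbType.Int, 4, "s_id"); // bound to the DataTable column
update.Parameters.Add("@id", SqlDbType.Int, 4, "id");
update.UpdatedRowSource = UpdateRowSource.None;           // required for batching

var adapter = new SqlDataAdapter();
adapter.UpdateCommand = update;
adapter.UpdateBatchSize = 100;         // statements sent per round trip
adapter.ContinueUpdateOnError = true;  // carry on past individual failures
adapter.Update(dataTable);             // sends the batched UPDATEs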
Yes, you can build a plain-text SQL command (parameterized for security), like this:
SqlCommand command = new SqlCommand();
// Set connection, etc.
for (int i = 0; i < items.Length; i++)
{
    command.CommandText += string.Format("update mytable set s_id = @s_id{0} where id = @id{0};", i);
    command.Parameters.AddWithValue("@s_id" + i, items[i].SId);
    command.Parameters.AddWithValue("@id" + i, items[i].Id);
}
command.ExecuteNonQuery();
Use a StringBuilder (System.Text.StringBuilder) to build your SQL, such as:
StringBuilder sql = new StringBuilder();
int batchSize = 10;
int currentBatchCount = 0;
SqlCommand cmd = new SqlCommand(); // was null in the original; must be created before use
cmd.Connection = conn;             // 'conn' is assumed to be an open SqlConnection

for (int i = 0; i < numberOfUpdatesToMake; i++)
{
    int sid = 0; // Set the s_id here
    int id = 0;  // Set id here
    sql.AppendFormat("update mytable set s_id = {0} where id = {1}; ", sid, id);
    currentBatchCount++;
    if (currentBatchCount >= batchSize)
    {
        cmd.CommandText = sql.ToString();
        cmd.ExecuteNonQuery();
        sql.Clear();
        currentBatchCount = 0;
    }
}

// flush any remaining statements that didn't fill a whole batch
if (currentBatchCount > 0)
{
    cmd.CommandText = sql.ToString();
    cmd.ExecuteNonQuery();
}
Create a set of those updates (with the ids filled in), join them with semicolons into one string, assign that string to a SqlCommand's CommandText property, then call ExecuteNonQuery().