Add multiple SQL values with same parameterized query? - c#

I'm fairly new to SQL and trying to figure out the best way to add some predefined data. I figured out from searching around here that I should use a parameterized command to avoid a SQL injection attack. That isn't a huge concern in this case, but I would like to avoid the possibility and learn to do it right. Anyway, here is the code I have right now:
using (SqlTransaction trans = connection.BeginTransaction())
{
    foreach (IEnumerable<string> row in table.RowData)
    {
        using (SqlCommand sql = new SqlCommand("INSERT INTO " + table.Title
            + " (" + string.Join(", ", table.Headers)
            + ") VALUES (" + string.Join(", ", table.Headers.Select(x => "@" + x)) + ");", connection, trans))
        {
            for (int i = 0; i < table.Headers.Count(); i++)
            {
                if (string.IsNullOrEmpty(row.ElementAt(i)))
                { sql.Parameters.AddWithValue("@" + table.Headers.ElementAt(i), DBNull.Value); }
                else
                { sql.Parameters.AddWithValue("@" + table.Headers.ElementAt(i), row.ElementAt(i)); }
            }
            sql.ExecuteNonQuery();
        }
    }
    trans.Commit();
}
This seems to work and all the data gets in there but it 'feels' inefficient to me. I'm wrapping it in a transaction so there is only one commit, but it's creating the parameters every time and just setting different values for each row.
Is there a way to make this use the same parameters but just set different values per row? Or is this the best way to do this and I should not worry about it?
Thanks in advance for any help you can give.

We can do what you want by parsing the headers into parameters in a pre-processing step. I have also removed the explicit transaction because every single insert already gets an implicit transaction by default (why pay the performance penalty of two transactions?).
using (var command = new SqlCommand()) {
    command.CommandText =
        "INSERT INTO " + table.Title + " ("
        + string.Join(", ", table.Headers)
        + ") VALUES ("
        + string.Join(", ", table.Headers.Select(x => "@" + x))
        + ");";
    command.Connection = connection;
    foreach (var header in table.Headers) {
        /*
            Add all parameters as strings. One could choose to infer the
            data types by inspecting the first N rows or by using some sort
            of specification to map the types from A to B.
        */
        command.Parameters.Add("@" + header, SqlDbType.NVarChar);
    }
    foreach (var row in table.RowData) {
        for (var i = 0; i < table.Headers.Count(); i++) {
            if (!string.IsNullOrEmpty(row.ElementAt(i))) {
                command.Parameters["@" + table.Headers.ElementAt(i)].Value = row.ElementAt(i);
            }
            else {
                command.Parameters["@" + table.Headers.ElementAt(i)].Value = DBNull.Value;
            }
        }
        command.ExecuteNonQuery();
    }
}
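If you do still want all of the rows to commit or roll back together, as the original code does, the same parameter-reuse pattern also works inside an explicit transaction. A minimal sketch, where insertSql is a placeholder for the INSERT statement built above:
using (var transaction = connection.BeginTransaction())
using (var command = new SqlCommand(insertSql, connection, transaction))  // insertSql: the INSERT ... VALUES (@...) text built above
{
    // add the parameters once, exactly as above, then:
    foreach (var row in table.RowData)
    {
        // set each parameter's Value for the current row, then:
        command.ExecuteNonQuery();
    }
    transaction.Commit();
}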

This is an example of an insert that works for me:
private void insertWordCount(string songId, string wordId, int wordCount)
{
    string query = "insert into songs_words_conn values(@wordId, @songId, @wordCount)";
    SqlCommand cmd = new SqlCommand(query, conn);
    cmd.Parameters.AddWithValue("@wordId", wordId);
    cmd.Parameters.AddWithValue("@songId", songId);
    cmd.Parameters.AddWithValue("@wordCount", wordCount);
    cmd.ExecuteNonQuery();
}

Yes, you can be much more efficient by reusing SqlParameter objects. Here is some pseudo-code:
const string sql = "INSERT INTO table1 (column1) VALUES (@p0)";
using (var sqlCommand = new SqlCommand(sql, connection, transaction))
{
    var param1 = sqlCommand.Parameters.Add("@p0", SqlDbType.Int);
    foreach (var row in table)
    {
        param1.Value = row["value"];
        sqlCommand.ExecuteNonQuery();
    }
}

Related

Adding unknown count of values inside a SELECT statement

I have code that adds a value to a SQL table:
SqlCommand command;
SqlDataAdapter adapter = new SqlDataAdapter();
String strSQL = "";
strSQL = "INSERT INTO tblTest (value1) VALUES ('" + strPLCData + "')";
command = new SqlCommand(strSQL, cnn);
adapter.InsertCommand = new SqlCommand(strSQL, cnn);
try
{
    int rows = adapter.InsertCommand.ExecuteNonQuery();
    txtStatusLogging.Text += "Inserted " + rows + " row(s) in the database." + Environment.NewLine;
}
catch (Exception ex)
{
    MessageBox.Show("Failed to write to database : " + ex.Message);
}
command.Dispose();
But I'm a bit stuck when I want to add an unknown number of values to the database (according to a list of unknown size).
E.g. sometimes add only value1, other times add value1, value2 and value3... (depending on what's in a certain list).
How would I go about doing this?
There's absolutely no need for that SqlDataAdapter. If you want to add an arbitrary number of values to a table - use a straight INSERT SqlCommand and just loop over the list of values to insert.
Also: you should always use parametrized queries - no exceptions - and you should put your SqlConnection and SqlCommand objects in using () { ... } blocks. Something like this:
public void InsertValues(List<int> values)
{
    // define the insert query
    string qryInsert = "INSERT INTO dbo.tblTest (value1) VALUES (@singleValue);";
    using (SqlConnection conn = new SqlConnection(connectionString))
    using (SqlCommand cmdInsert = new SqlCommand(qryInsert, conn))
    {
        // define parameter
        cmdInsert.Parameters.Add("@singleValue", SqlDbType.Int);
        conn.Open();
        // loop over values
        foreach (int aValue in values)
        {
            // set the parameter value, execute query
            cmdInsert.Parameters["@singleValue"].Value = aValue;
            cmdInsert.ExecuteNonQuery();
        }
        conn.Close();
    }
}
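A hypothetical call site (the values here are made up):
InsertValues(new List<int> { 42, 17, 99 }); // hypothetical values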

Increase performance on OleDB insert into statement, BeginTransaction CommitTransaction

I have written two append functions that insert data from a custom C# list into MS Access.
The first simply sets up a new connection for each individual recordset:
public static void appenddatatotable(string connectionstring, string tablename, string[] values)
{
    var myconn = new OleDbConnection(connectionstring);
    var cmd = new OleDbCommand();
    cmd.CommandText = "INSERT INTO " + tablename + " ([RunDate],[ReportingGroup], [Tariff], [Year]) VALUES(@RunDate, @ReportingGroup, @Tariff, @Year)";
    cmd.Parameters.AddRange(new[] { new OleDbParameter("@RunDate", values[0]), new OleDbParameter("@ReportingGroup", values[1]), new OleDbParameter("@Tariff", values[2]), new OleDbParameter("@Year", values[3]) });
    cmd.Connection = myconn;
    myconn.Open();
    cmd.ExecuteNonQuery();
    myconn.Close();
}
I then simply loop over my list of values and call this function on each iteration. This works fine but is slow.
In the second function I tried to include the loop in the function and work with BeginTransaction and CommitTransaction:
public static void appenddatatotable2(string connectionstring, string tablename, string datstr, List<PowRes> values)
{
    var myconn = new OleDbConnection(connectionstring);
    int icounter = 0;
    var cmd = new OleDbCommand();
    OleDbTransaction trans = null;
    cmd.Connection = myconn;
    myconn.Open();
    foreach (var item in values)
    {
        if (icounter == 0)
        {
            trans = cmd.Connection.BeginTransaction();
            cmd.Transaction = trans;
        }
        cmd.CommandText = "INSERT INTO " + tablename + " ([RunDate],[ReportingGroup], [Tariff], [Year]) VALUES(@RunDate, @ReportingGroup, @Tariff, @Year)";
        if (string.IsNullOrEmpty(item.yr))
            item.yr = "";
        cmd.Parameters.AddRange(new[] { new OleDbParameter("@RunDate", datstr), new OleDbParameter("@ReportingGroup", item.RG), new OleDbParameter("@Tariff", item.tar), new OleDbParameter("@Year", item.yr) });
        cmd.ExecuteNonQuery();
        icounter++;
        if (icounter >= 500)
        {
            trans.Commit();
            icounter = 0;
        }
    }
    if (icounter > 0)
    {
        trans.Commit();
    }
    myconn.Close();
}
This also works fine but is EVEN slower.
Is my code wrong? How could I speed up the multiple inserts?
Thanks!
I did not test this; just my guess for your second function: you keep adding parameters to the same command across the loop - cmd.Parameters is never cleared before each use.
Normally, committing a large set of commands in one transaction is much faster than doing them one by one on a single connection.
Another way to speed up your inserts is to dump all your insert statements into one long text, separated by semicolons, and then fire them in one go (I am not sure whether MS Access supports it or not).
EDIT:
To combine the insert commands into one text:
var updates = values.Select(x => string.Format("INSERT INTO myTable ([RunDate],[ReportingGroup], [Tariff], [Year]) VALUES('{0}', '{1}', '{2}', '{3}')",
        datstr, x.RG, x.tar, x.yr))
    .Aggregate((m, n) => m + ";" + n);
cmd.CommandText = updates;
Though this could have SQL injection issues.
This should be significantly faster than all of your existing versions:
public static void appenddatatotable2(string connectionstring, string tablename, string datstr, List<PowRes> values)
{
    string commandText = "INSERT INTO " + tablename + " ([RunDate],[ReportingGroup], [Tariff], [Year]) VALUES(@RunDate, @ReportingGroup, @Tariff, @Year)";
    using (var myconn = new OleDbConnection(connectionstring))
    {
        myconn.Open();
        using (var cmd = new OleDbCommand())
        {
            foreach (var item in values)
            {
                cmd.CommandText = commandText;
                cmd.Parameters.Clear();
                cmd.Parameters.AddRange(new[] { new OleDbParameter("@RunDate", datstr), new OleDbParameter("@ReportingGroup", item.RG), new OleDbParameter("@Tariff", item.tar), new OleDbParameter("@Year", item.yr) });
                cmd.Connection = myconn;
                cmd.Prepare();
                cmd.ExecuteNonQuery();
            }
        }
    }
}
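If you want to take the parameter reuse one step further, you can also add the OleDbParameter objects once, outside the loop, and only change their values per row, combined with a single transaction. This is only a sketch along the lines of the SqlCommand answers above (the appenddatatotable3 name is just for illustration), not tested against Access:
public static void appenddatatotable3(string connectionstring, string tablename, string datstr, List<PowRes> values)
{
    string commandText = "INSERT INTO " + tablename + " ([RunDate],[ReportingGroup], [Tariff], [Year]) VALUES(@RunDate, @ReportingGroup, @Tariff, @Year)";
    using (var myconn = new OleDbConnection(connectionstring))
    using (var cmd = new OleDbCommand(commandText, myconn))
    {
        // OLE DB binds parameters by position, so add them in the same order as they appear in the statement.
        var pRunDate = cmd.Parameters.Add("@RunDate", OleDbType.VarWChar);
        var pGroup = cmd.Parameters.Add("@ReportingGroup", OleDbType.VarWChar);
        var pTariff = cmd.Parameters.Add("@Tariff", OleDbType.VarWChar);
        var pYear = cmd.Parameters.Add("@Year", OleDbType.VarWChar);

        myconn.Open();
        using (var trans = myconn.BeginTransaction())
        {
            cmd.Transaction = trans;
            foreach (var item in values)
            {
                // only the values change per row
                pRunDate.Value = datstr;
                pGroup.Value = item.RG;
                pTariff.Value = item.tar;
                pYear.Value = item.yr ?? "";
                cmd.ExecuteNonQuery();
            }
            trans.Commit();
        }
    }
}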

Problem in populating the GridView in ASP.NET(C#)

I'm trying to populate a GridView with results from a loop, but I'm only getting the last result of the loop.
I think the GridView is being overwritten every time the for loop executes.
Can you help me fix this problem, please?
for (int j = 0; j < i; j++)
{
    Label1.Text += fipath[j];
    Label1.Text += "-------------";
    SqlConnection conn = new SqlConnection("Server=ILLUMINATI;" + "Database=DB;Integrated Security= true");
    SqlCommand comm = new SqlCommand("Select * from FileUpload where UploadedBy='" + NAME + "' AND FilePath='" + fipath[j] + "'", conn);
    try
    {
        conn.Open();
        SqlDataReader rdr = comm.ExecuteReader();
        if (Role.Equals("admin"))
        {
            GridView1.DataSource = rdr;
            GridView1.DataBind();
        }
        rdr.Close();
    }
    catch
    {
        conn.Close();
    }
}
There is more than one problem with this code:
It seems that unless Role == "admin" you don't need to query the db at all.
The DataSource of the grid is overwritten on every loop iteration; this is why you see only the last value.
Use parameters for SqlCommand to prevent SQL injection.
Don't do string concatenation in the loop; use a StringBuilder instead.
Use using for your connection. The code is cleaner this way.
The fix could look like this:
if (Role != "admin")
return;
var dataTable = new DataTable();
var stringBuilder = new StringBuilder();
using (var connection = new SqlConnection("Server=ILLUMINATI;" + "Database=DB;Integrated Security= true"))
using (var command = connection.CreateCommand())
{
connection.Open();
command.CommandText = "Select * from FileUpload where UploadedBy = #UploadedBy AND FilePath = #FilePath";
command.Parameters.AddWithValue("UploadedBy", NAME);
var filPathParameter = command.Parameters.Add("FilePath", SqlDbType.VarChar);
for (int j = 0; j < i; j++)
{
stringBuilder.Append(fipath[j]);
stringBuilder.Append("-------------");
filPathParameter.Value = fipath[j];
dataTable.Load(command.ExecuteReader(), LoadOption.PreserveChanges);
}
}
Label1.Text += stringBuilder.ToString();
GridView1.DataSource = dataTable;
GridView1.DataBind();
Also, I don't know how many iterations your loop normally runs. If it is one or two and you have appropriate indexes on the FileUpload table, then it is OK to leave it as is. However, if the loop runs many times, you should consider switching to a single query instead.
For example:
var filePathes = string.Join(",", fipath.Select(arg => "'" + arg + "'"));
var command = "Select * from FileUpload where UploadedBy = @UploadedBy AND FilePath in (" + filePathes + ")";
This query is prone to SQL injection, and it has a 2100-element limit in MS SQL.
There is more than one way to approach this; it depends on your DBMS and requirements.
Use the IN clause in the SQL query and pass the list of IDs for FilePath:
SqlCommand comm = new SqlCommand("Select * from FileUpload where UploadedBy='" + NAME
+ "' AND FilePath in (" + listOfIDs + ")", conn);
Check out these URLs related to the use of the IN clause:
Techniques for In-Clause and SQL Server
Parameterizing a SQL IN clause?
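For illustration, one common way to keep the IN list parameterized is to generate one parameter per value. The @fp0, @fp1, ... names below are made up, and fipath is assumed to be a string[] of the paths to match (conn is the connection from the question):
var parameterNames = fipath.Select((p, index) => "@fp" + index).ToArray();
var sql = "Select * from FileUpload where UploadedBy = @UploadedBy AND FilePath in (" + string.Join(", ", parameterNames) + ")";
using (var comm = new SqlCommand(sql, conn))
{
    comm.Parameters.AddWithValue("@UploadedBy", NAME);
    for (int k = 0; k < fipath.Length; k++)
    {
        // one parameter per path value
        comm.Parameters.AddWithValue(parameterNames[k], fipath[k]);
    }
    // execute the reader once and bind the result, as in the other answers
}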
Create a list or BindingSource outside the loop, bind that to your GridView, and then add all the records to that list or source.
The problem with your current approach is that you are overwriting the records pulled from the database with a new datasource each time, so as you stated, only the last one is "set", and the older assignments are disposed of.

SCOPE_IDENTITY is not working in asp.net?

I am trying to insert a record and get its newly generated id by executing two queries one after another, but I don't know why it's giving me the following error:
Object cannot be cast from DBNull to other types
My code is below (I don't want to use SQL stored procedures):
SqlParameter sqlParam;
int lastInsertedVideoId = 0;
using (SqlConnection Conn = new SqlConnection(ObjUtils._ConnString))
{
    Conn.Open();
    using (SqlCommand sqlCmd = Conn.CreateCommand())
    {
        string sqlInsertValues = "@Name,@Slug";
        string sqlColumnNames = "[Name],[Slug]";
        string sqlQuery = "INSERT INTO videos(" + sqlColumnNames + ") VALUES(" + sqlInsertValues + ");";
        sqlCmd.CommandText = sqlQuery;
        sqlCmd.CommandType = CommandType.Text;
        sqlParam = sqlCmd.Parameters.Add("@Name", SqlDbType.VarChar);
        sqlParam.Value = txtName.Text.Trim();
        sqlParam = sqlCmd.Parameters.Add("@Slug", SqlDbType.VarChar);
        sqlParam.Value = txtSlug.Text.Trim();
        sqlCmd.ExecuteNonQuery();
        //getting last inserted video id
        sqlCmd.CommandText = "SELECT SCOPE_IDENTITY() AS [lastInsertedVideoId]";
        using (SqlDataReader sqlDr = sqlCmd.ExecuteReader())
        {
            sqlDr.Read();
            lastInsertedVideoId = Convert.ToInt32(sqlDr["lastInsertedVideoId"]);
        }
    }
}
//tags insertion into tag table
if (txtTags.Text.Trim().Length > 0 && lastInsertedVideoId > 0)
{
    string sqlBulkTagInsert = "";
    string[] tags = txtTags.Text.Split(new string[] { "," }, StringSplitOptions.RemoveEmptyEntries);
    foreach (string tag in tags)
    {
        sqlBulkTagInsert += "INSERT INTO tags(VideoId, Tag) VALUES(" + lastInsertedVideoId + ", " + tag.Trim().ToLowerInvariant() + "); ";
    }
    using (SqlConnection Conn = new SqlConnection(ObjUtils._ConnString))
    {
        Conn.Open();
        using (SqlCommand sqlCmd = Conn.CreateCommand())
        {
            string sqlQuery = sqlBulkTagInsert;
            sqlCmd.CommandText = sqlQuery;
            sqlCmd.CommandType = CommandType.Text;
            sqlCmd.ExecuteNonQuery();
        }
    }
}
Also, if possible, please check whether the above code is written well or whether it can be optimized further to improve performance.
Thanks
The call to SCOPE_IDENTITY() is not being treated as being in the same "scope" as the INSERT command that you're executing.
Essentially, what you need to do is change the line:
string sqlQuery = "INSERT INTO videos(" + sqlColumnNames + ") VALUES(" + sqlInsertValues + ");";
to:
string sqlQuery = "INSERT INTO videos(" + sqlColumnNames + ") VALUES(" + sqlInsertValues + "); SELECT SCOPE_IDENTITY() AS [lastInsertedVideoId]";
and then call
int lastVideoInsertedId = Convert.ToInt32(sqlCmd.ExecuteScalar());
instead of .ExecuteNonQuery(), and remove the code block following the "//getting last inserted video id" comment.
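Put together, the relevant part of the code would look roughly like this:
string sqlQuery = "INSERT INTO videos(" + sqlColumnNames + ") VALUES(" + sqlInsertValues + "); SELECT SCOPE_IDENTITY() AS [lastInsertedVideoId];";
sqlCmd.CommandText = sqlQuery;
sqlCmd.CommandType = CommandType.Text;
sqlCmd.Parameters.Add("@Name", SqlDbType.VarChar).Value = txtName.Text.Trim();
sqlCmd.Parameters.Add("@Slug", SqlDbType.VarChar).Value = txtSlug.Text.Trim();
// SCOPE_IDENTITY() comes back as a single scalar value (a decimal), so convert it.
int lastInsertedVideoId = Convert.ToInt32(sqlCmd.ExecuteScalar());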
The SCOPE_IDENTITY() should be extracted from the first command (SELECT, RETURN or OUT) and passed into the next command. By that, I mean that the SCOPE_IDENTITY() should be at the end of the first command. In SQL 2008 there is additional syntax for bringing values back as part of the INSERT, which makes this simpler.
Or more efficiently: combine the commands into one to avoid round-trips.
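The additional syntax mentioned above is presumably the OUTPUT clause; a sketch, assuming the identity column on videos is named Id:
sqlCmd.CommandText = "INSERT INTO videos ([Name], [Slug]) OUTPUT INSERTED.Id VALUES (@Name, @Slug);";
// ExecuteScalar returns the first column of the first row, i.e. the newly generated Id.
int lastInsertedVideoId = Convert.ToInt32(sqlCmd.ExecuteScalar());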

How do I get around the "'" problem in sqlite and c#?

I'm working in Microsoft Visual C# 2008 Express with Sqlite.
I understand that an apostrophe (') in my text causes problems in a query. My problem is that I thought I could replace it with \'. It doesn't seem to be working... Here's a pared-down example of my code:
string myString = "I can't believe it!";
cmd.CommandText = "Insert into myTable (myid,mytext) values (1,'" + myString.Replace("'","\\'") + "');";
The error I get is:
SQLite error:
near "t": syntax error
I've tried a couple other replacements... like the other slash. And I wrote my string and a replaced version of my string out to the console to make sure it was coming out right.
What stupid error am I making here?
Thanks!
-Adeena
The solution presented by Robert will work (i.e. replacing ' by '').
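For completeness, that escaping approach would look like this (the parameter approach below is still preferable):
cmd.CommandText = "Insert into myTable (myid,mytext) values (1,'" + myString.Replace("'", "''") + "');";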
Alternatively you can use parameters as in:
DbCommand cmd = new SQLiteCommand();
DbParameter param = cmd.CreateParameter();
// ...
// more code
// ...
cmd.CommandText = "Insert into myTable (field) values (@param)";
param.ParameterName = "@param";
param.DbType = DbType.String;
param.Value = @"This is a sample value with a single quote like this: '";
cmd.Parameters.Add(param);
cmd.ExecuteNonQuery();
Using parameters protects against SQL injection and makes the ' problems go away.
It is also much faster, because SQLite can reuse the execution plan of statements when you use parameters; it can't when you don't. In this example, using a parameter makes the bulk insert approximately 3 times faster.
private void TestInsertPerformance() {
    const int limit = 100000;
    using (SQLiteConnection conn = new SQLiteConnection(@"Data Source=c:\testperf.db")) {
        conn.Open();
        using (SQLiteCommand comm = new SQLiteCommand()) {
            comm.Connection = conn;
            comm.CommandText = " create table test (n integer) ";
            comm.ExecuteNonQuery();
            Stopwatch s = new Stopwatch();
            s.Start();
            using (SQLiteTransaction tran = conn.BeginTransaction()) {
                for (int i = 0; i < limit; i++) {
                    comm.CommandText = "insert into test values (" + i.ToString() + ")";
                    comm.ExecuteNonQuery();
                }
                tran.Commit();
            }
            s.Stop();
            MessageBox.Show("time without parm " + s.ElapsedMilliseconds.ToString());
            SQLiteParameter parm = comm.CreateParameter();
            comm.CommandText = "insert into test values (?)";
            comm.Parameters.Add(parm);
            s.Reset();
            s.Start();
            using (SQLiteTransaction tran = conn.BeginTransaction()) {
                for (int i = 0; i < limit; i++) {
                    parm.Value = i;
                    comm.ExecuteNonQuery();
                }
                tran.Commit();
            }
            s.Stop();
            MessageBox.Show("time with parm " + s.ElapsedMilliseconds.ToString());
        }
        conn.Close();
    }
}
SQLite behaves similarly to Oracle when it comes to the importance of using parameterised SQL statements.
