Efficient Way to Update a Lot of Rows from C#

I have a program where I open a SqlConnection, load up a list of objects, modify a value on each object, then update the rows in the SQL Server database. Because the modification requires string parsing, I wasn't able to do it with pure T-SQL.
Right now I am looping through the list of objects and running a SQL UPDATE in each iteration. This seems inefficient, and I'm wondering if there is a more efficient way to do it using LINQ.
The list is called UsageRecords. The value I'm updating is MthlyConsumption.
Here is my code:
foreach (var item in UsageRecords)
{
string UpdateQuery = #"UPDATE tbl810CTImport
SET MthlyConsumption = " + item.MthlyConsumption +
"WHERE ID = " + item.Id;
SqlCommand update = new SqlCommand(UpdateQuery, sourceConnection);
update.ExecuteNonQuery();
}

Try this instead:
string UpdateQuery = #"UPDATE tbl810CTImport SET MthlyConsumption = #consumption WHERE ID = #itemId";
var update = new SqlCommand(UpdateQuery, sourceConnection);
update.Parameters.Add("#consumption", SqlDbType.Int); // Specify the correct types here
update.Parameters.Add("#itemId", SqlDbType.Int); // Specify the correct types here
foreach (var item in UsageRecords)
{
update.Parameters[0].Value = item.MthlyConsumption;
update.Parameters[1].Value = item.Id;
update.ExecuteNonQuery();
}
It should be faster because:
You don't have to create the command each time.
You don't create a new string each time (concatenation)
The query is not parsed at every iteration (only the parameter values change).
And it will cache the execution plan. (Thanks to @JohnCarpenter from the comment)
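If it still needs to go faster, wrapping the loop in a single transaction usually helps as well, since each UPDATE otherwise commits on its own. A minimal sketch of that idea, assuming sourceConnection is already open:

// Sketch only: one parameterized command reused inside a single transaction,
// so there is one commit for the whole batch instead of one per row.
using (SqlTransaction tx = sourceConnection.BeginTransaction())
using (var update = new SqlCommand(
    "UPDATE tbl810CTImport SET MthlyConsumption = @consumption WHERE ID = @itemId",
    sourceConnection, tx))
{
    update.Parameters.Add("@consumption", SqlDbType.Int); // adjust types to your schema
    update.Parameters.Add("@itemId", SqlDbType.Int);

    foreach (var item in UsageRecords)
    {
        update.Parameters["@consumption"].Value = item.MthlyConsumption;
        update.Parameters["@itemId"].Value = item.Id;
        update.ExecuteNonQuery();
    }

    tx.Commit();
}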

You can either use
SqlDataAdapter - See How to perform batch update in Sql through C# code
or what I have previously done, which is one of the following:
Tear down the IDs in question, and re-bulk-insert
or
Bulk Insert the ID + new value into a staging table, and update the table on SQL server:
update u
set u.MthlyConsumption = s.MthlyConsumption
from tbl810CTImport u
inner join staging s on
u.id = s.id
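For the staging-table option, a minimal C# sketch of the steps above, assuming a staging table named staging(id, MthlyConsumption) already exists, both columns are int, and sourceConnection is open:

// Sketch only: bulk-load the new values into a staging table,
// then run one set-based UPDATE ... JOIN on the server.
var table = new DataTable();
table.Columns.Add("id", typeof(int));
table.Columns.Add("MthlyConsumption", typeof(int));
foreach (var item in UsageRecords)
{
    table.Rows.Add(item.Id, item.MthlyConsumption);
}

using (var clear = new SqlCommand("TRUNCATE TABLE staging", sourceConnection))
{
    clear.ExecuteNonQuery(); // start from an empty staging table
}

using (var bulk = new SqlBulkCopy(sourceConnection))
{
    bulk.DestinationTableName = "staging";
    bulk.WriteToServer(table); // one bulk round trip instead of one UPDATE per row
}

string updateSql = @"update u
set u.MthlyConsumption = s.MthlyConsumption
from tbl810CTImport u
inner join staging s on u.id = s.id";

using (var update = new SqlCommand(updateSql, sourceConnection))
{
    update.ExecuteNonQuery();
}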

In a situation like this, where you can't write a single update statement to cover all your bases, it's a good idea to batch up your statements and run more than one at a time.
var commandSB = new StringBuilder();
int batchCount = 0;
using (var updateCommand = sourceConnection.CreateCommand())
{
foreach (var item in UsageRecords)
{
commandSB.AppendFormat(@"
UPDATE tbl810CTImport
SET MthlyConsumption = @MthlyConsumption{0}
WHERE ID = @ID{0}",
batchCount
);
updateCommand.Parameters.AddWithValue(
"@MthlyConsumption" + batchCount,
item.MthlyConsumption
);
updateCommand.Parameters.AddWithValue(
"@ID" + batchCount,
item.Id
);
if (batchCount == 500) {
updateCommand.CommandText = commandSB.ToString();
updateCommand.ExecuteNonQuery();
commandSB.Clear();
updateCommand.Parameters.Clear();
batchCount = 0;
}
else {
batchCount++;
}
}
if (batchCount != 0) {
updateCommand.CommandText = commandSB.ToString();
updateCommand.ExecuteNonQuery();
}
}

It should be as simple as this . . .
private void button1_Click(object sender, EventArgs e)
{
SqlConnection con = new SqlConnection("Server=YourServerName;Database=YourDataBaseName;Trusted_Connection=True");
try
{
//cmd = new SqlCommand("UPDATE Stocks
//SET Name = @Name, City = @cit WHERE FirstName = @fn AND LastName = @add";
SqlCommand cmd = new SqlCommand("Update Stocks set Ask=@Ask, Bid=@Bid, PreviousClose=@PreviousClose, CurrentOpen=@CurrentOpen Where Name=@Name", con);
cmd.Parameters.AddWithValue("@Name", textBox1.Text);
cmd.Parameters.AddWithValue("@Ask", textBox2.Text);
cmd.Parameters.AddWithValue("@Bid", textBox3.Text);
cmd.Parameters.AddWithValue("@PreviousClose", textBox4.Text);
cmd.Parameters.AddWithValue("@CurrentOpen", textBox5.Text);
con.Open();
int a = cmd.ExecuteNonQuery();
if (a > 0)
{
MessageBox.Show("Data Updated");
}
}
catch (Exception ex)
{
MessageBox.Show(ex.Message);
}
finally
{
con.Close();
}
}
Change the code to suit your needs.

Related

How to get selected ID from SQL database using textBox and update information?

I am trying to update a database entry under a specific ID in my table when the user enters their ID number in a textBox.
At the moment it updates, but it updates all entries in my table except the entry containing the user's ID number.
This is the code I am currently using:
private void Button1_Click(object sender, EventArgs e)
{
SqlConnection con = new SqlConnection(#"Data Source=DEVELOPMENT\ACCESSCONTROL;Initial Catalog=ACCESSCONTROL;User ID=sa;Password=P#55w0rd123");
SqlCommand check_User_Name = new SqlCommand("SELECT Id FROM NewVisitor WHERE (IDNumber = #IDNumber)", con);
check_User_Name.Parameters.AddWithValue("#IDNumber", idNumber_TxtBox.Text);
con.Open();
int UserExist = (int)check_User_Name.ExecuteScalar();
if (UserExist > 0)
{
var connetionString = @"Data Source=DEVELOPMENT\ACCESSCONTROL;Initial Catalog=ACCESSCONTROL;User ID=sa;Password=P@55w0rd123";
var sql = "UPDATE NewVisitor SET PersonVisit = @PersonVisit, PurposeVisit = @PurposeVisit, Duration = @Duration, Disclaimer = @Disclaimer";
try
{
using (var connection = new SqlConnection(connetionString))
{
using (var command = new SqlCommand(sql, connection))
{
command.Parameters.Add("#PersonVisit", SqlDbType.NVarChar).Value = personVisiting_TxtBox.Text;
command.Parameters.Add("#PurposeVisit", SqlDbType.NVarChar).Value = purposeOfVisit_CMBox.SelectedItem;
command.Parameters.Add("#Duration", SqlDbType.Date).Value = duration_dateTimePicker1.Value.Date;
command.Parameters.Add("#Disclaimer", SqlDbType.NVarChar).Value = disclaimer_CHKBox.Checked;
connection.Open();
command.ExecuteNonQuery();
}
}
}
The whole table has many more fields, but I would like to update just the fields above for that specific ID.
Thanks
You forgot the WHERE clause on the UPDATE statement, telling it specifically which records to update. It sounds like you just want to add the exact same WHERE clause that you have on your SELECT:
var sql = "UPDATE NewVisitor SET PersonVisit = #PersonVisit, PurposeVisit = #PurposeVisit, Duration = #Duration, Disclaimer = #Disclaimer WHERE (IDNumber = #IDNumber)";
And don't forget to add the paramter for it:
command.Parameters.Add("#IDNumber", SqlDbType.Int).Value = idNumber_TxtBox.Text;
You may need to convert the input value to an integer first, I'm not 100% certain (it's been a while since I've had to use ADO.NET directly). Something like this:
if (!int.TryParse(idNumber_TxtBox.Text, out var idNumber))
{
// input wasn't an integer, handle the error
}
command.Parameters.Add("#IDNumber", SqlDbType.Int).Value = idNumber;

C# ASP.Net Processing Issue - Reading Table While Deleting Rows

Simply, I have an application that has one page that deletes and then re-adds/refreshes the records into a table every 30 seconds. I have another page that runs every 45 seconds that reads the table data and builds a chart.
The problem is, in the read/view page, every once in a while I get a 0 value (from a max count) and the chart shows nothing. I have a feeling that this is happening because the read is being done at the exact same time the delete page has deleted all the records in the table but has not yet refreshed/re-added them.
Is there a way in my application I can hold off on the read when the table is being refreshed?
Best Regards,
Andy
C#
ASP.Net 4.5
SQL Server 2012
My code below is run in an ASP.Net 4.5 built Windows service. It deletes all records in the ActualPlot table and then refreshes/adds new records from a text file every 30 seconds. I basically need to block (lock?) any user from reading the ActualPlot table while the records are being deleted and refreshed. Can you PLEASE help me change my code to do this?
private void timer1_Tick(object sender, ElapsedEventArgs e)
{
// Open the SAP text files, clear the data in the tables and repopulate the new SAP data into the tables.
var cnnString = ConfigurationManager.ConnectionStrings["TaktBoardsConnectionString"].ConnectionString;
SqlConnection conn = new SqlConnection(cnnString);
SqlConnection conndetail = new SqlConnection(cnnString);
SqlConnection connEdit = new SqlConnection(cnnString);
SqlCommand cmdGetProductFile = new SqlCommand();
SqlDataReader reader;
string sql;
// Delete all the records from the ActualPlot and the ActualPlotPreload tables. We are going to repopulate them with the data from the text file.
sql = "DELETE FROM ActualPlotPreload";
try
{
conn.Open();
SqlCommand cmd = new SqlCommand(sql, conn);
cmd.ExecuteNonQuery();
}
catch (System.Data.SqlClient.SqlException ex)
{
string msg = "Delete Error:";
msg += ex.Message;
Library.WriteErrorLog(msg);
}
finally
{
conn.Close();
}
sql = "DELETE FROM ActualPlot";
try
{
conn.Open();
SqlCommand cmd = new SqlCommand(sql, conn);
cmd.ExecuteNonQuery();
}
catch (System.Data.SqlClient.SqlException ex)
{
string msg = "Delete Error:";
msg += ex.Message;
Library.WriteErrorLog(msg);
}
finally
{
conn.Close();
}
// Read the SAP text file and load the data into the ActualPlotPreload table
sql = "SELECT DISTINCT [BoardName], [ProductFile], [ProductFileIdent] FROM [TaktBoards].[dbo].[TaktBoard] ";
sql = sql + "JOIN [TaktBoards].[dbo].[Product] ON [Product].[ProductID] = [TaktBoard].[ProductID]";
cmdGetProductFile.CommandText = sql;
cmdGetProductFile.CommandType = CommandType.Text;
cmdGetProductFile.Connection = conn;
conn.Open();
reader = cmdGetProductFile.ExecuteReader();
string DBProductFile = "";
string DBTischID = "";
string filepath = "";
string[] cellvalues;
DateTime dt, DateCheckNotMidnightShift;
DateTime ldSAPFileLastMod = DateTime.Now;
string MyDateString;
int FileRecordCount = 1;
while (reader.Read())
{
DBProductFile = (string)reader["ProductFile"];
DBTischID = (string)reader["ProductFileIdent"];
filepath = "c:\\inetpub\\wwwroot\\WebApps\\TaktBoard\\FilesFromSAP\\" + DBProductFile;
FileInfo fileInfo = new FileInfo(filepath); // Open file
ldSAPFileLastMod = fileInfo.LastWriteTime; // Get last time modified
try
{
StreamReader sr = new StreamReader(filepath);
FileRecordCount = 1;
// Populate the AcutalPlotPreload table from with the dates from the SAP text file.
sql = "INSERT into ActualPlotPreload (ActualDate, TischID) values (#ActualDate, #TischID)";
while (!sr.EndOfStream)
{
cellvalues = sr.ReadLine().Split(';');
if (FileRecordCount > 1 & cellvalues[7] != "")
{
MyDateString = cellvalues[7];
DateTime ldDateCheck = DateTime.ParseExact(MyDateString, "M/dd/yyyy", null);
DateTime dateNow = DateTime.Now;
string lsDateString = dateNow.Month + "/" + dateNow.Day.ToString("d2") + "/" + dateNow.Year;
DateTime ldCurrentDate = DateTime.ParseExact(lsDateString, "M/dd/yyyy", null);
string lsTischID = cellvalues[119];
if (ldDateCheck == ldCurrentDate)
{
try
{
conndetail.Open();
SqlCommand cmd = new SqlCommand(sql, conndetail);
cmd.Parameters.Add("#ActualDate", SqlDbType.DateTime);
cmd.Parameters.Add("#TischID", SqlDbType.VarChar);
cmd.Parameters["#TischID"].Value = cellvalues[119];
MyDateString = cellvalues[7] + " " + cellvalues[55];
dt = DateTime.ParseExact(MyDateString, "M/dd/yyyy H:mm:ss", null);
cmd.Parameters["#ActualDate"].Value = dt;
// Ignore any midnight shift (12am to 3/4am) units built.
DateCheckNotMidnightShift = DateTime.ParseExact(cellvalues[7] + " 6:00:00", "M/dd/yyyy H:mm:ss", null);
if (dt >= DateCheckNotMidnightShift)
{
cmd.ExecuteNonQuery();
}
}
catch (System.Data.SqlClient.SqlException ex)
{
string msg = "Insert Error:";
msg += ex.Message;
Library.WriteErrorLog(msg);
}
finally
{
conndetail.Close();
}
}
}
FileRecordCount++;
}
sr.Close();
}
catch
{ }
finally
{ }
}
conn.Close();
// Get the unique TischID's and ActualDate from the ActualPlotPreload table. Then loop through each one, adding the ActualUnits
// AcutalDate and TischID to the ActualPlot table. For each unique TischID we make sure that we reset the liTargetUnits to 1 and
// count up as we insert.
SqlCommand cmdGetTischID = new SqlCommand();
SqlDataReader readerTischID;
int liTargetUnits = 0;
string sqlInsert = "INSERT into ActualPlot (ActualUnits, ActualDate, TischID) values (#ActualUnits, #ActualDate, #TischID)";
sql = "SELECT DISTINCT [ActualDate], [TischID] FROM [TaktBoards].[dbo].[ActualPlotPreload] ORDER BY [TischID], [ActualDate] ASC ";
cmdGetTischID.CommandText = sql;
cmdGetTischID.CommandType = CommandType.Text;
cmdGetTischID.Connection = conn;
conn.Open();
readerTischID = cmdGetTischID.ExecuteReader();
DBTischID = "";
DateTime DBActualDate;
string DBTischIDInitial = "";
while (readerTischID.Read())
{
DBTischID = (string)readerTischID["TischID"];
DBActualDate = (DateTime)readerTischID["ActualDate"];
if (DBTischIDInitial != DBTischID)
{
liTargetUnits = 1;
DBTischIDInitial = DBTischID;
}
else
{
liTargetUnits++;
}
try
{
conndetail.Open();
SqlCommand cmd = new SqlCommand(sqlInsert, conndetail);
cmd.Parameters.Add("#ActualUnits", SqlDbType.Real);
cmd.Parameters.Add("#ActualDate", SqlDbType.DateTime);
cmd.Parameters.Add("#TischID", SqlDbType.VarChar);
cmd.Parameters["#TischID"].Value = DBTischID;
cmd.Parameters["#ActualDate"].Value = DBActualDate;
cmd.Parameters["#ActualUnits"].Value = liTargetUnits;
cmd.ExecuteNonQuery();
cmd.Parameters.Clear();
}
catch (System.Data.SqlClient.SqlException ex)
{
string msg = "Insert Error:";
msg += ex.Message;
Library.WriteErrorLog(msg);
}
finally
{
conndetail.Close();
}
}
conn.Close();
Library.WriteErrorLog("SAP text file data has been imported.");
}
If the data is being re-added right back after the delete (basically, you know what to re-add before emptying the table), you could put both operations inside the same SQL transaction, so that the data becomes visible to the other page only once it has been re-added.
I mean something like this:
public bool DeleteAndAddData(string connString)
{
using (OleDbConnection conn = new OleDbConnection(connString))
{
OleDbTransaction tran = null;
try
{
conn.Open();
tran = conn.BeginTransaction();
OleDbCommand deleteComm = new OleDbCommand("DELETE FROM Table", conn);
deleteComm.Transaction = tran;
deleteComm.ExecuteNonQuery();
OleDbCommand reAddComm = new OleDbCommand("INSERT INTO Table VALUES(1, 'blabla', 'etc.')", conn);
reAddComm.Transaction = tran;
reAddComm.ExecuteNonQuery();
tran.Commit();
}
catch (Exception ex)
{
if (tran != null) tran.Rollback();
return false;
}
}
}
return true;
}
If your queries don't take too long to execute, you can start the two with an offset of 7.5 seconds, as there is a collision every 90 seconds, when the read/write job finishes 3 cycles and the read/view job finishes 2 cycles.
That being said, it's not a fool-proof solution, just a trick based on assumptions. If you want to be completely sure that the read/view never happens while a read/write cycle is in progress, consider taking a read lock. I would recommend reading Understanding how SQL Server executes a query and Locking in the Database Engine.
Hope that helps.
I would try a couple of things:
Make sure your DELETE + INSERT operation is occurring within a single transaction:
BEGIN TRAN
DELETE FROM ...
INSERT INTO ...
COMMIT
If this isn't a busy table, try adding locking hints to your SELECT statement. For example:
SELECT ...
FROM Table
WITH (UPDLOCK, HOLDLOCK)
In the case where the update transaction starts while your SELECT statement is running, this will cause that transaction to wait until the SELECT is finished. Unfortunately it will block other SELECT statements too, but you don't risk reading dirty data.
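Applied to the service code in the question, a minimal SqlClient sketch of the single-transaction approach (the INSERT columns come from the question; newRows and its properties are placeholders for whatever you parse out of the SAP file):

// Sketch only: delete and repopulate inside one transaction so readers
// never observe the table in its emptied state.
using (var conn = new SqlConnection(cnnString))
{
    conn.Open();
    using (SqlTransaction tran = conn.BeginTransaction())
    {
        using (var delete = new SqlCommand("DELETE FROM ActualPlot", conn, tran))
        {
            delete.ExecuteNonQuery();
        }

        using (var insert = new SqlCommand(
            "INSERT INTO ActualPlot (ActualUnits, ActualDate, TischID) VALUES (@ActualUnits, @ActualDate, @TischID)",
            conn, tran))
        {
            insert.Parameters.Add("@ActualUnits", SqlDbType.Real);
            insert.Parameters.Add("@ActualDate", SqlDbType.DateTime);
            insert.Parameters.Add("@TischID", SqlDbType.VarChar);

            foreach (var row in newRows) // newRows: hypothetical collection parsed from the SAP file
            {
                insert.Parameters["@ActualUnits"].Value = row.Units;
                insert.Parameters["@ActualDate"].Value = row.Date;
                insert.Parameters["@TischID"].Value = row.TischId;
                insert.ExecuteNonQuery();
            }
        }

        tran.Commit();
    }
}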
I was not able to figure this out, but I changed my code so the program no longer deletes all the rows in the ActualPlot table; instead it checks whether each row is already there and, if not, adds the new row from the text file.
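For reference, a minimal sketch of that check-before-insert approach, reusing the variable names from the loop in the question and assuming conndetail is open at that point (the key columns used in the EXISTS check are an assumption):

// Sketch only: insert the row only when it is not already present,
// so the table never has to be emptied first.
const string upsertSql = @"
IF NOT EXISTS (SELECT 1 FROM ActualPlot
               WHERE TischID = @TischID AND ActualDate = @ActualDate)
    INSERT INTO ActualPlot (ActualUnits, ActualDate, TischID)
    VALUES (@ActualUnits, @ActualDate, @TischID);";

using (var cmd = new SqlCommand(upsertSql, conndetail))
{
    cmd.Parameters.AddWithValue("@TischID", DBTischID);
    cmd.Parameters.AddWithValue("@ActualDate", DBActualDate);
    cmd.Parameters.AddWithValue("@ActualUnits", liTargetUnits);
    cmd.ExecuteNonQuery();
}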

T-SQL Delete * from table then insert into table

Hello, I'm trying to do the following:
Delete everything from table X
Insert the desired values into table X
I thought a T-SQL transaction would be the way to achieve this, because otherwise, if something goes wrong in the INSERT command, everything will just have been deleted.
But this code does nothing; it doesn't insert or delete the data. Can someone help me fix this issue?
spojeni.Open();
SqlTransaction sqlTrans = spojeni.BeginTransaction();
try
{
string delCmdTxt = "TRUNCATE TABLE PLODINY";
SqlCommand cmdDel = spojeni.CreateCommand();
cmdDel.CommandText = delCmdTxt;
cmdDel.Transaction = sqlTrans;
cmdDel.ExecuteNonQuery();
string insert_sql =
"INSERT INTO PLODINY(PLODINA,CENAZAQ,MJ)VALUES(@PLODINA,@CENAZAQ,@MJ)";
SqlCommand sqlcom = spojeni.CreateCommand();
sqlcom.CommandText = insert_sql;
sqlcom.Transaction = sqlTrans;
foreach (DataGridViewRow row in dataGridView1.Rows)
{
sqlcom.Parameters.AddWithValue("#PLODINA", row.Cells["PLODINA"].Value);
sqlcom.Parameters.AddWithValue("#CENAZAQ", row.Cells["CENAZAQ"].Value);
sqlcom.Parameters.AddWithValue("#MJ", row.Cells["MJ"].Value);
sqlcom.ExecuteNonQuery();
sqlcom.Dispose();
}
sqlTrans.Commit();
}
catch (System.Data.SqlClient.SqlException)
{
sqlTrans.Rollback();
}
finally
{
spojeni.Close();
spojeni.Dispose();
}
this.Close();
Your problem is in your foreach loop. You need to define your parameters beforehand, and not dispose of the command object until you're all done with it. You can also use the Where extension method to filter out any invalid rows from your data source, since it's a UI element.
string insert_sql = "INSERT INTO PLODINY(PLODINA,CENAZAQ,MJ)VALUES(#PLODINA,#CENAZAQ,#MJ)";
SqlCommand sqlcom = spojeni.CreateCommand();
sqlcom.CommandText = insert_sql;
sqlcom.Transaction = sqlTrans;
sqlcom.Parameters.Add("#PLODINA");
sqlcom.Parameters.Add("#CENAZAQ");
sqlcom.Parameters.Add("#MJ");
// some validation - add what you need.
var validRows = dataGridView1.Rows.Cast<DataGridViewRow>()
.Where(row => row.Cells["PLODINA"].Value != null);
foreach (DataGridViewRow row in validRows)
{
sqlcom.Parameters[0].Value = row.Cells["PLODINA"].Value;
sqlcom.Parameters[1].Value = row.Cells["CENAZAQ"].Value;
sqlcom.Parameters[2].Value = row.Cells["MJ"].Value;
sqlcom.ExecuteNonQuery();
}
sqlTrans.Commit();
sqlcom.Dispose();
You are handling your parameters incorrectly, and because the only thing in your catch block is sqlTrans.Rollback(), you never see the errors you are getting. The first thing I would change is that catch:
catch (System.Data.SqlClient.SqlException)
{
sqlTrans.Rollback();
throw;
}
so you can now see the errors happen.
The next issue is that if the table has any foreign key constraints, your TRUNCATE TABLE will fail. If it is failing, you can simply replace it with:
string delCmdTxt = "delete from PLODINY";
SqlCommand cmdDel = spojeni.CreateCommand();
cmdDel.CommandText = delCmdTxt;
cmdDel.Transaction = sqlTrans;
cmdDel.ExecuteNonQuery();
As to why your inserts are not working: you are disposing of the command on every iteration of the loop, and you are also trying to re-add the parameters every time. Reformat that loop to the following:
string insert_sql = "INSERT INTO PLODINY(PLODINA,CENAZAQ,MJ)VALUES(#PLODINA,#CENAZAQ,#MJ)";
using(SqlCommand sqlcom = spojeni.CreateCommand())
{
sqlcom.CommandText = insert_sql;
sqlcom.Transaction = sqlTrans;
sqlcom.Parameters.Add("#PLODINA", SqlDbType.NVarChar); //Replace with whatever the correct datatypes are
sqlcom.Parameters.Add("#CENAZAQ", SqlDbType.NVarChar);
sqlcom.Parameters.Add("#MJ", SqlDbType.NVarChar);
foreach (DataGridViewRow row in dataGridView1.Rows)
{
sqlcom.Parameters["#PLODINA"] = row.Cells["PLODINA"].Value;
sqlcom.Parameters["#CENAZAQ"] = row.Cells["CENAZAQ"].Value;
sqlcom.Parameters["#MJ"] = row.Cells["MJ"].Value;
sqlcom.ExecuteNonQuery();
}
}
sqlTrans.Commit();
However, your code can be made even better. If your DataGridView were backed by a DataTable via data binding, you could use a SqlDataAdapter instead. Let's say you load the table from the database, display it on the grid, and then want to push the updated information back. With a DataTable it would be as simple as:
private string _getDataQuery = "select PLODINA, CENAZAQ, MJ from PLODINY";
public void GetData(DataTable data)
{
//You do not need to call open here as SqlDataAdapter does it for you internally.
using(var spojeni = new SqlConnection(GetConnectionString()))
using(var adapter = new SqlDataAdapter(_getDataQuery, spojeni))
{
data.Clear();
adapter.Fill(data);
}
}
public void UpdateData(DataTable data)
{
using(var spojeni = new SqlConnection(GetConnectionString()))
using(var adapter = new SqlDataAdapter(_getDataQuery, spojeni))
using(var commandBuilder = new SqlCommandBuilder(adapter))
{
//This may or may not be necessary for spojeni.BeginTransaction()
spojeni.Open();
using(var sqlTrans = spojeni.BeginTransaction())
{
adapter.SelectCommand.Transaction = sqlTrans;
adapter.UpdateCommand = commandBuilder.GetUpdateCommand();
adapter.UpdateCommand.Transaction = sqlTrans;
adapter.DeleteCommand = commandBuilder.GetDeleteCommand();
adapter.DeleteCommand.Transaction = sqlTrans;
adapter.InsertCommand = commandBuilder.GetInsertCommand();
adapter.InsertCommand.Transaction = sqlTrans;
try
{
adapter.Update(data);
sqlTrans.Commit();
}
catch
{
sqlTrans.Rollback();
throw;
}
}
}
}
TRUNCATE TABLE only works if the table has no foreign key constraints... it's probably failing there and then rolling back the transaction in the catch statement...
Instead of TRUNCATE, try DELETE FROM on the table and see if that fixes it...

Dealing with huge amount of data when inserting into sql database

In my code the user can upload an Excel document which contains their phone contact list. As the developer, I need to read that Excel file, turn it into a DataTable, and insert it into the database.
The problem is that some clients have a huge number of contacts, say 5000 or more, and when I try to insert this amount of data into the database it crashes and gives me a timeout exception.
What would be the best way to avoid this kind of exception, and is there any code that can reduce the time of the insert statement so the user doesn't wait too long?
The code:
public SqlConnection connection = new SqlConnection(System.Configuration.ConfigurationManager.ConnectionStrings["ConnectionString"].ConnectionString);
public void Insert(string InsertQuery)
{
SqlDataAdapter adp = new SqlDataAdapter();
adp.InsertCommand = new SqlCommand(InsertQuery, connection);
if (connection.State == System.Data.ConnectionState.Closed)
{
connection.Open();
}
adp.InsertCommand.ExecuteNonQuery();
connection.Close();
}
protected void submit_Click(object sender, EventArgs e)
{
string UploadFolder = "Savedfiles/";
if (Upload.HasFile) {
string fileName = Upload.PostedFile.FileName;
string path=Server.MapPath(UploadFolder+fileName);
Upload.SaveAs(path);
Msg.Text = "successfully uploaded";
DataTable ValuesDt = new DataTable();
ValuesDt = ConvertExcelFileToDataTable(path);
Session["valuesdt"] = ValuesDt;
Excel_grd.DataSource = ValuesDt;
Excel_grd.DataBind();
}
}
protected void SendToServer_Click(object sender, EventArgs e)
{
DataTable Values = Session["valuesdt"] as DataTable ;
if(Values.Rows.Count>0)
{
DataTable dv = Values.DefaultView.ToTable(true, "Mobile1", "Mobile2", "Tel", "Category");
double Mobile1,Mobile2,Tel;string Category="";
for (int i = 0; i < Values.Rows.Count; i++)
{
Mobile1 =Values.Rows[i]["Mobile1"].ToString()==""?0: double.Parse(Values.Rows[i]["Mobile1"].ToString());
Mobile2 = Values.Rows[i]["Mobile2"].ToString() == "" ? 0 : double.Parse(Values.Rows[i]["Mobile2"].ToString());
Tel = Values.Rows[i]["Tel"].ToString() == "" ? 0 : double.Parse(Values.Rows[i]["Tel"].ToString());
Category = Values.Rows[i]["Category"].ToString();
Insert("INSERT INTO client(Mobile1,Mobile2,Tel,Category) VALUES(" + Mobile1 + "," + Mobile2 + "," + Tel + ",'" + Category + "')");
Msg.Text = "Submitied successfully to the server ";
}
}
}
You can try SqlBulkCopy to insert the DataTable into the database table.
Something like this:
using (SqlBulkCopy bulkCopy = new SqlBulkCopy(sqlConnection, SqlBulkCopyOptions.KeepIdentity))
{
bulkCopy.DestinationTableName = DestTableName;
string[] DtColumnName = YourDataTableColumns; // column names of your DataTable
foreach (string dbcol in DbColumnName) // DbColumnName: column names of the database table, used to map DataTable columns to it
{
foreach (string dtcol in DtColumnName)
{
if (dbcol.ToLower() == dtcol.ToLower())
{
SqlBulkCopyColumnMapping mapID = new SqlBulkCopyColumnMapping(dtcol, dbcol);
bulkCopy.ColumnMappings.Add(mapID);
break;
}
}
}
bulkCopy.WriteToServer(YourDataTableName.CreateDataReader());
bulkCopy.Close();
}
For more Read http://msdn.microsoft.com/en-us/library/system.data.sqlclient.sqlbulkcopy.aspx
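For context, a hedged sketch of how the question's SendToServer_Click could hand the session DataTable straight to SqlBulkCopy (the destination table, column names and connection string name are taken from the question; everything else is an assumption):

// Sketch only: push the whole DataTable in one bulk operation
// instead of one INSERT per contact.
protected void SendToServer_Click(object sender, EventArgs e)
{
    var values = Session["valuesdt"] as DataTable;
    if (values == null || values.Rows.Count == 0) return;

    using (var connection = new SqlConnection(
        System.Configuration.ConfigurationManager.ConnectionStrings["ConnectionString"].ConnectionString))
    using (var bulkCopy = new SqlBulkCopy(connection))
    {
        connection.Open();
        bulkCopy.DestinationTableName = "client";
        bulkCopy.ColumnMappings.Add("Mobile1", "Mobile1");
        bulkCopy.ColumnMappings.Add("Mobile2", "Mobile2");
        bulkCopy.ColumnMappings.Add("Tel", "Tel");
        bulkCopy.ColumnMappings.Add("Category", "Category");
        bulkCopy.WriteToServer(values);
    }

    Msg.Text = "Submitted successfully to the server";
}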
You are inserting one row at a time, which is very expensive for this amount of data.
In those cases you should use a bulk insert, so there is only one round trip to the database, and if you need to roll back, it is all one transaction.
You can use SqlBulkCopy, which is more work, or you can use the batch update feature of the SqlDataAdapter. Instead of creating your own insert statement, building a SqlDataAdapter, and then manually executing it, create a DataSet, fill it, create one SqlDataAdapter, set the number of inserts in a batch, then execute the adapter once.
I could repeat the code, but this article shows exactly how to do it: http://msdn.microsoft.com/en-us/library/kbbwt18a%28v=vs.80%29.aspx
protected void SendToServer_Click(object sender, EventArgs e)
{
DataTable Values = Session["valuesdt"] as DataTable ;
if(Values.Rows.Count>0)
{
DataTable dv = Values.DefaultView.ToTable(true, "Mobile1", "Mobile2", "Tel", "Category");
//Fix up default values
for (int i = 0; i < Values.Rows.Count; i++)
{
Values.Rows[i]["Mobile1"] =Values.Rows[i]["Mobile1"].ToString()==""?0: double.Parse(Values.Rows[i]["Mobile1"].ToString());
Values.Rows[i]["Mobile2"] = Values.Rows[i]["Mobile2"].ToString() == "" ? 0 : double.Parse(Values.Rows[i]["Mobile2"].ToString());
Values.Rows[i]["Tel"] = Values.Rows[i]["Tel"].ToString() == "" ? 0 : double.Parse(Values.Rows[i]["Tel"].ToString());
Values.Rows[i]["Category"] = Values.Rows[i]["Category"].ToString();
}
BatchUpdate(dv,1000);
}
}
public static void BatchUpdate(DataTable dataTable,Int32 batchSize)
{
// Assumes GetConnectionString() returns a valid connection string.
string connectionString = GetConnectionString();
// Connect to the database.
using (SqlConnection connection = new SqlConnection(connectionString))
{
// Create a SqlDataAdapter.
SqlDataAdapter adapter = new SqlDataAdapter();
// Set the INSERT command and parameter.
adapter.InsertCommand = new SqlCommand(
"INSERT INTO client(Mobile1,Mobile2,Tel,Category) VALUES(#Mobile1,#Mobile2,#Tel,#Category);", connection);
adapter.InsertCommand.Parameters.Add("#Mobile1",
SqlDbType.Float);
adapter.InsertCommand.Parameters.Add("#Mobile2",
SqlDbType.Float);
adapter.InsertCommand.Parameters.Add("#Tel",
SqlDbType.Float);
adapter.InsertCommand.Parameters.Add("#Category",
SqlDbType.NVarchar, 50);
adapter.InsertCommand.UpdatedRowSource = UpdateRowSource.None;
// Set the batch size.
adapter.UpdateBatchSize = batchSize;
// Execute the update.
adapter.Update(dataTable);
}
}
I know this is a super old post, but you should not need to use the bulk operations explained in the existing answers for 5000 inserts. Your performance is suffering so much because you close and reopen the connection for each row insert. Here is some code I have used in the past that keeps one connection open and executes as many commands as needed to push all the data to the DB:
public static class DataWorker
{
public static Func<IEnumerable<T>, Task> GetStoredProcedureWorker<T>(Func<SqlConnection> connectionSource, string storedProcedureName, Func<T, IEnumerable<(string paramName, object paramValue)>> parameterizer)
{
if (connectionSource is null) throw new ArgumentNullException(nameof(connectionSource));
SqlConnection openConnection()
{
var conn = connectionSource() ?? throw new ArgumentNullException(nameof(connectionSource), $"Connection from {nameof(connectionSource)} cannot be null");
var connState = conn.State;
if (connState != ConnectionState.Open)
{
conn.Open();
}
return conn;
}
async Task DoStoredProcedureWork(IEnumerable<T> workData)
{
using (var connection = openConnection())
using (var command = connection.CreateCommand())
{
command.CommandType = CommandType.StoredProcedure;
command.CommandText = storedProcedureName;
command.Prepare();
foreach (var thing in workData)
{
command.Parameters.Clear();
foreach (var (paramName, paramValue) in parameterizer(thing))
{
command.Parameters.AddWithValue(paramName, paramValue ?? DBNull.Value);
}
await command.ExecuteNonQueryAsync().ConfigureAwait(false);
}
}
}
return DoStoredProcedureWork;
}
}
This was actually from a project where I was gathering emails for a restriction list, so here is a somewhat relevant example of what a parameterizer argument might look like and how to use the above code:
IEnumerable<(string,object)> RestrictionToParameter(EmailRestriction emailRestriction)
{
yield return ("#emailAddress", emailRestriction.Email);
yield return ("#reason", emailRestriction.Reason);
yield return ("#restrictionType", emailRestriction.RestrictionType);
yield return ("#dateTime", emailRestriction.Date);
}
var worker = DataWorker.GetStoredProcedureWorker<EmailRestriction>(ConnectionFactory, #"[emaildata].[AddRestrictedEmail]", RestrictionToParameter);
await worker(emailRestrictions).ConfigureAwait(false);

What's wrong with my IF statement?

I'm creating an auditing table, and I have the easy Insert and Delete auditing methods done. I'm a bit stuck on the Update method - I need to get the current values from the database and the new values from the query parameters, and compare the two so I can write the old and changed values into a table in the database.
Here is my code:
protected void SqlDataSource1_Updating(object sender, SqlDataSourceCommandEventArgs e)
{
string[] fields = null;
string fieldsstring = null;
string fieldID = e.Command.Parameters[5].Value.ToString();
System.Security.Principal.WindowsPrincipal p = System.Threading.Thread.CurrentPrincipal as System.Security.Principal.WindowsPrincipal;
string[] namearray = p.Identity.Name.Split('\\');
string name = namearray[1];
string queryStringupdatecheck = "SELECT VAXCode, Reference, CostCentre, Department, ReportingCategory FROM NominalCode WHERE ID = @ID";
string queryString = "INSERT INTO Audit (source, action, itemID, item, userid, timestamp) VALUES (@source, @action, @itemID, @item, @userid, @timestamp)";
using (SqlConnection connection = new SqlConnection("con string = deleted for privacy"))
{
SqlCommand commandCheck = new SqlCommand(queryStringupdatecheck, connection);
commandCheck.Parameters.AddWithValue("#ID", fieldID);
connection.Open();
SqlDataReader reader = commandCheck.ExecuteReader();
while (reader.Read())
{
for (int i = 0; i < reader.FieldCount - 1; i++)
{
if (reader[i].ToString() != e.Command.Parameters[i].Value.ToString())
{
fields[i] = e.Command.Parameters[i].Value.ToString() + "Old value: " + reader[i].ToString();
}
else
{
}
}
}
fieldsstring = String.Join(",", fields);
reader.Close();
SqlCommand command = new SqlCommand(queryString, connection);
command.Parameters.AddWithValue("#source", "Nominal");
command.Parameters.AddWithValue("#action", "Update");
command.Parameters.AddWithValue("#itemID", fieldID);
command.Parameters.AddWithValue("#item", fieldsstring);
command.Parameters.AddWithValue("#userid", name);
command.Parameters.AddWithValue("#timestamp", DateTime.Now);
try
{
command.ExecuteNonQuery();
}
catch (Exception x)
{
Response.Write(x);
}
finally
{
connection.Close();
}
}
}
The issue I'm having is that the fields[] array is ALWAYS null. Even though the VS debug window shows that e.Command.Parameters[i].Value and reader[i] are different, the fields array never seems to be written to.
Thanks
You never set your fields[] to anything other than null, so it is null when you try to access it. You need to create the array before you can assign values to it. Try:
SqlDataReader reader = commandCheck.ExecuteReader();
fields = new string[reader.FieldCount];
I don't really understand what you're doing here, but if you're auditing, why don't you just insert every change into your audit table along with a timestamp?
Do fields = new string[reader.FieldCount] so that you have an array to assign to. You're trying to write to null[0].
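If you would rather not size the array at all, a List<string> works just as well; a small sketch under the same assumptions as the handler in the question:

// Sketch only: collect the changed fields in a List<string> instead of a fixed array.
var changedFields = new List<string>();
while (reader.Read())
{
    for (int i = 0; i < reader.FieldCount - 1; i++)
    {
        if (reader[i].ToString() != e.Command.Parameters[i].Value.ToString())
        {
            changedFields.Add(e.Command.Parameters[i].Value.ToString()
                + " Old value: " + reader[i].ToString());
        }
    }
}
fieldsstring = String.Join(",", changedFields);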
