I'm using an MS SQL database in my application, and when I do a SELECT operation against the database it gets stuck for a few minutes.
This happens a few times per day; all other SELECTs take only a few seconds.
I also noticed that if I close the app while a SELECT operation is in progress, it will not work at all on any subsequent app start UNTIL I restart the database engine...
Why could that be?
Here is the code snippet:
using (SqlConnection myConnection = new SqlConnection(connString))
{
myConnection.Open();
SqlCommand command = myConnection.CreateCommand();
SqlTransaction transaction;
transaction = myConnection.BeginTransaction("LockTransaction");
command.Connection = myConnection;
command.Transaction = transaction;
try
{
int recordsAtOnce = maxToLock;
command.CommandText =
"SELECT TOP 1000 id, productName from Parts WHERE part_used is NULL;";
List<string> idList = new List<string>();
SqlDataReader myReader = command.ExecuteReader(CommandBehavior.Default);
while (myReader.Read())
{
string id = myReader.GetString(1);
string name = myReader.GetInt32(0).ToString();
idList.Add(id);
}
myReader.Close();
string idsStr = "";
for(int i = 0; i < idList.Count; i++)
{
if (i != 0)
{
idsStr += ", ";
}
idsStr += idList[i];
}
// lock record
command.CommandText = "UPDATE Parts SET part_used=\'rt\' WHERE id in (" + idsStr + ")";
command.Parameters.Clear();
command.CommandType = CommandType.Text;
command.ExecuteNonQuery();
transaction.Commit();
}
catch (Exception ex)
{
transaction.Rollback();
}
}
I think your reader values are being assigned to the wrong variables.
Try changing:
string id = myReader.GetString(1);
string name = myReader.GetInt32(0).ToString();
idList.Add(id);
To:
string id = myReader.GetInt32(0).ToString();
string name = myReader.GetString(1);
idList.Add(id);
This is most probably because of a lock. If another transaction is writing into the Parts table, your select needs to wait.
How big is the Parts table? That could cause performance problems on its own, or it could be another transaction taking the lock before yours.
Other things:
Dispose of your connection, command and reader when you're done with them, via using blocks.
Parameterize the second query rather than concatenating strings (see the sketch after this list). It's more secure and significantly more performant because of how SQL Server works internally.
Minor: if you're dealing with thousands of items, use a StringBuilder instead of concatenating strings.
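For the parameterization point, here is a rough sketch of what your UPDATE could look like with one parameter per id instead of concatenating the ids into the SQL text. The @id0, @id1, ... parameter names are generated purely for illustration; command and idList are the objects from your snippet:

// Sketch only: bind each id as its own parameter and build the IN list from the parameter names.
var paramNames = new List<string>();
command.Parameters.Clear();
for (int i = 0; i < idList.Count; i++)
{
    string paramName = "@id" + i;
    paramNames.Add(paramName);
    command.Parameters.AddWithValue(paramName, idList[i]);
}
command.CommandText =
    "UPDATE Parts SET part_used = 'rt' WHERE id IN (" + string.Join(", ", paramNames) + ")";
command.ExecuteNonQuery();

Keep in mind that SQL Server caps a single command at roughly 2,100 parameters, so this is fine for the 1,000 ids selected above but would need batching beyond that.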
Related
I am trying to convert a comma-separated list into multiple rows in my database. For example, I am using WinForms and in my textbox I will put "a,b,c".
I want my table to end up with one row per value (a, b, c), but so far I am only getting a single row containing "a,b,c".
Here is some sample code for what I am going for:
private void button2_Click(object sender, EventArgs e)
{
for (int i = 0; i < listBox1.Items.Count; i++)
{
listBox1.SetSelected(i, true);
}
string items = "";
StringBuilder sb = new StringBuilder();
foreach (var currentItem in listBox1.SelectedItems)
{
sb.Append(currentItem + ",");
}
items = sb.ToString();
textBox1.Text = items;
}
SQL
string strsql;
strsql = @"UPDATE [table].[dbo].[Departments]
SET [dep_Department] = @department
WHERE [dep_Username] = @username;
IF @@ROWCOUNT = 0
INSERT INTO [table].[dbo].[Departments] ([dep_Username],[dep_Department])
VALUES (@username, @department);";
SqlConnection conn = new SqlConnection(@"coneectionstring;");
var cmd = new SqlCommand(strsql, conn);
cmd.Parameters.Add("@department", SqlDbType.NVarChar, 40).Value = department;
cmd.Parameters.Add("@username", SqlDbType.NVarChar, 60).Value = username;
conn.Open();
cmd.ExecuteNonQuery();
You just need to split the department variable to extract the individual departments and then loop over them.
But first, if you need to write one record for each department/user pair, then you need to delete all records for that user and then re-add them, all inside a transaction.
Of course, I assume that department contains all the departments to be added for the specific user.
So we have two SQL command texts: the first one deletes the existing records, the second one adds a record. The SqlCommand and its first parameter are built immediately and the command is used to remove the records; then we change the command text and add the parameter for the department name outside the loop, and inside the loop we just update the @department parameter with the current value.
try
{
    string clear_recs = @"DELETE FROM [Departments]
                          WHERE [dep_Username] = @username";
    string strsql = @"INSERT INTO [table].[dbo].[Departments]
                      ([dep_Username],[dep_Department])
                      VALUES (@username, @department);";
    using (SqlConnection conn = new SqlConnection(@"coneectionstring;"))
    {
        conn.Open();
        using (SqlTransaction tr = conn.BeginTransaction())
        using (SqlCommand cmd = new SqlCommand(clear_recs, conn, tr))
        {
            cmd.Parameters.Add("@username", SqlDbType.NVarChar, 60).Value = username;
            cmd.ExecuteNonQuery();

            // Change the command text and create the parameter
            // but do not set the value for now.
            cmd.CommandText = strsql;
            cmd.Parameters.Add("@department", SqlDbType.NVarChar, 40);

            // Split, then loop and insert
            // using the same SqlCommand.
            string[] deps = department.Split(',');
            foreach (string dep in deps)
            {
                cmd.Parameters["@department"].Value = dep;
                cmd.ExecuteNonQuery();
            }

            // All the writes are effectively applied here when we commit the transaction.
            tr.Commit();
        }
    }
}
catch (Exception ex)
{
    // Log the errors. If we reach the exception handler, the
    // transaction is automatically rolled back and you can try again.
}
The code for the update SQL takes the value from the field in the database, adds it to the number of points in this transaction, and is then supposed to update the field. Any help on this would be appreciated.
string strCon = Properties.Settings.Default.ConString;
OleDbConnection conn = new OleDbConnection(strCon);
conn.Open();//database open
string strSQl = "SELECT points from customer WHERE customerID = " + txtCustID.Text + "";
OleDbCommand cmd = new OleDbCommand(strSQl, conn);
OleDbDataReader reader = cmd.ExecuteReader();//reader open
string strOutput = "";
while (reader.Read())
{
strOutput += reader["points"].ToString();
}//end while loop
int intcurrent = Convert.ToInt32(strOutput);
int intNewTrans = Convert.ToInt32(lblPoints.Text);
int intNewPoints = intcurrent + intNewTrans;
string strSqlUpdate = "UPDATE customer SET points = " + intNewPoints + " WHERE customerID = " + txtCustID.Text + "";
OleDbCommand cmdUpdate = new OleDbCommand(strSqlUpdate, conn);
string strUpdateOutput = "";
while (reader.Read())
{
strUpdateOutput += reader["points"].ToString();
}//end while loop
MessageBox.Show(strUpdateOutput);
reader.Close();
conn.Close();//database closed
That's because you are not executing your update command. Just call ExecuteNonQuery, like:
cmdUpdate.ExecuteNonQuery();
But more importantly, you should always use parameterized queries. This kind of string concatenation is open to SQL injection attacks. Also, use using statements to dispose of your connection, command, and reader automatically instead of calling Close or Dispose manually.
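As a rough sketch of both points, your update could look something like this. I'm assuming customerID is a numeric column (which is why the textbox value is converted below); intNewPoints is the total you computed above:

// Sketch only: parameterized UPDATE with connection and command disposed via using.
// OleDb parameters are positional, so add them in the same order as the ? placeholders.
using (OleDbConnection conn = new OleDbConnection(strCon))
using (OleDbCommand cmdUpdate = new OleDbCommand(
    "UPDATE customer SET points = ? WHERE customerID = ?", conn))
{
    cmdUpdate.Parameters.AddWithValue("@points", intNewPoints);
    cmdUpdate.Parameters.AddWithValue("@customerID", Convert.ToInt32(txtCustID.Text));
    conn.Open();
    cmdUpdate.ExecuteNonQuery();
}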
Can you help me with inserting rows into an Oracle database in C#?
I have a foreach loop in which I just build a SQL query. When I try to insert into the database (while debugging), the first row inserts in a couple of milliseconds, but the second insert/update takes about 5 minutes. All the magic happens at the line cmd.ExecuteNonQuery();
With the second row inserted/updated, the debugger returns focus to the application and only comes back after about 5 minutes. But it's a simple update, so it shouldn't need that much time.
using (var connection = new OracleConnection(_connectionTNS))
{
connection.Open();
var transaction = connection.BeginTransaction(IsolationLevel.ReadCommitted);
int rowsInserted = 0;
foreach (var item in _tableNameAndColumnsList)
{
if (item.Replace)
{
using (OracleCommand cmd = new OracleCommand())
{
cmd.Connection = connection;
cmd.Transaction = transaction;
cmd.CommandText = "UPDATE TABLE TEST WHERE id_test = "+id_test+" ";
rowsInserted += cmd.ExecuteNonQuery();
}
}
}
transaction.Commit();
MessageBox.Show("Changed " + rowsInserted + " database rows...");
Has anyone had the same problem?
Thanks, Lukas
You are creating a new command object on every iteration of the loop, which is not required and adds unnecessary overhead; create it once outside the loop and reuse it:
using (OracleCommand cmd = new OracleCommand())
{
cmd.Connection = connection;
cmd.Transaction = transaction;
foreach (var item in _tableNameAndColumnsList)
{
if (item.Replace)
{
cmd.CommandText = "UPDATE TABLE TEST WHERE id_test = "+id_test+" "; /*Assuming this is only test command*/
rowsInserted += cmd.ExecuteNonQuery();
}
}
}
P.S. Don't forget: please use parameterized queries to avoid SQL injection.
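On that note, here is a rough sketch of how the reused command could bind the id instead of concatenating it into the SQL text. Since the UPDATE in your question has no SET clause, the table name TEST and the some_column = 1 assignment below are placeholders only:

// Sketch only: one OracleParameter created up front and re-assigned on each iteration.
using (OracleCommand cmd = new OracleCommand())
{
    cmd.Connection = connection;
    cmd.Transaction = transaction;
    cmd.CommandText = "UPDATE TEST SET some_column = 1 WHERE id_test = :id_test";
    OracleParameter idParam = new OracleParameter("id_test", id_test);
    cmd.Parameters.Add(idParam);

    foreach (var item in _tableNameAndColumnsList)
    {
        if (item.Replace)
        {
            idParam.Value = id_test; // rebind the current id for this iteration
            rowsInserted += cmd.ExecuteNonQuery();
        }
    }
}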
I have a Visual FoxPro database with thousands of rows. I am using OleDb to fetch data from FoxPro and export it (after doing some calculation and formatting) to SQL Server. I have used a DataSet to populate 2-3 DataTables (related tables) at a time.
The first problem is that memory use is very high because the DataSets are huge. I want to reduce the memory footprint. Any suggestions for this?
So I have decided to fetch a few rows at a time. How do I fetch rows using an OleDb command so that I can fetch e.g. rows 1-20, then 21-40, etc.?
Based on all the comments and HatSoft's code, here is what I think you are looking for. This is pseudo-code, which means it will not compile, but it should give you a good idea of where to go from here.
int NumberOfRecordsToRetrieve = 10000;
int StartRecordNumber = 1;
bool EndOfFile = false;
string queryString = "SELECT OrderID, CustomerID FROM Orders WHERE RECNO() BETWEEN #StartRecordNumber AND #EndRecordNumber";
While (!EndOfFile)
{
using (OleDbConnection connection = new OleDbConnection(connectionString))
{
OleDbCommand command = new OleDbCommand(queryString, connection);
command.Parameters.Add(new OleDbParameter("#StartRecordNumber", StartRecordNumber));
command.Parameters.Add(new OleDbParameter("#EndRecordNumber", StartRecordNumber + NumberOfRecordsToRetrieve));
connection.Open();
OleDbDataReader reader = command.ExecuteReader();
EndOfFile = true;
while (reader.Read())
{
EndOfFile = false;
//Retrieve records here and do whatever process you wish to do
}
reader.Close();
StartRecordNumber += NumberOfRecordsToRetrieve;
}
}
string queryString = "SELECT OrderID, CustomerID FROM Orders WHERE ORDERID >= #StartOrderID AND ORDERID <= #EndOrderID";
using (OleDbConnection connection = new OleDbConnection(connectionString))
{
OleDbCommand command = new OleDbCommand(queryString, connection);
command.Parameters.Add(new OleDbParameter("#StartOrderID", "PASS THE VALUE HERE"));
command.Parameters.Add(new OleDbParameter("#EndOrderID", "PASS THE VALUE HERE"));
connection.Open();
OleDbDataReader reader = command.ExecuteReader();
while (reader.Read())
{
//Retrieve records here
}
reader.Close();
}
My SQL code inserts 10,000 records into a table from a list. If a record already exists, it updates a few fields.
Currently it is taking more than 10 minutes and timing out unless I restrict the number of records to process. Is there anything in my code I can do to solve this problem?
foreach(RMSResponse rmsObj in rmsList) {
try {
string connectionString = @"server=localhost\sqlexpress;" + "Trusted_Connection=yes;" + "database=MyDB; " + "connection timeout=30";
SqlConnection conn = new SqlConnection(connectionString);
conn.Open();
string str = "";
str += "SELECT * ";
str += "FROM RMSResponse ";
str += "WHERE LeadID = #RecruitID";
int LeadID;
Boolean IsPositive = true;
SqlCommand selectCommand = new SqlCommand(str, conn);
selectCommand.CommandType = CommandType.Text;
selectCommand.Parameters.Add(new SqlParameter("@RecruitID", rmsObj.RecruitID));
SqlDataReader sqlDataReader = selectCommand.ExecuteReader();
bool hasRows = sqlDataReader.HasRows;
while (sqlDataReader.Read()) {
LeadID = sqlDataReader.GetInt32(1);
IsPositive = sqlDataReader.GetBoolean(2);
IsPositive = (IsPositive == true) ? false : true;
Console.WriteLine("Lead ID: " + LeadID + " IsPositive: " + IsPositive);
}
sqlDataReader.Close();
if (hasRows) {
SqlCommand updateCommand = new SqlCommand("UPDATE RMSResponse set IsPositive=@IsPositive, OptOutDate=@OptOutDate where LeadID=@LeadID", conn);
updateCommand.Parameters.AddWithValue("@LeadID", rmsObj.RecruitID);
updateCommand.Parameters.AddWithValue("@IsPositive", IsPositive);
updateCommand.Parameters.AddWithValue("@OptOutDate", DateTime.Now);
updateCommand.ExecuteNonQuery();
sqlDataReader.Close();
}
if (!hasRows) {
SqlCommand insertCommand = new SqlCommand("INSERT INTO RMSResponse (LeadID, IsPositive, ReceivedDate) " + "VALUES(@LeadID, @IsPositive, @ReceivedDate)", conn);
insertCommand.Parameters.AddWithValue("@LeadID", rmsObj.RecruitID);
insertCommand.Parameters.AddWithValue("@IsPositive", true);
insertCommand.Parameters.AddWithValue("@ReceivedDate", DateTime.Now);
int rows = insertCommand.ExecuteNonQuery();
}
} catch (SqlException ex) {
Console.WriteLine(ex.Message);
}
}
You can move the update to SQL - all you are doing is setting the OptOutDate to be today. You can pass in the list of lead IDs to a batch update statement.
To insert records, you could bulk insert into a staging table, then execute SQL to insert the data for IDs that aren't already in the table.
There isn't much logic in your C#, so pulling the data out then putting it back in is making it unnecessarily slow.
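Here is a rough sketch of the staging-table idea. I'm assuming a staging table named RMSResponseStaging with LeadID, IsPositive and ReceivedDate columns already exists (that name is made up for illustration); rmsList and connectionString are from your code:

// Sketch only: build the rows in memory, bulk-load them into the staging table,
// then insert the ones whose LeadID is not already in RMSResponse with one statement.
DataTable staging = new DataTable();
staging.Columns.Add("LeadID", typeof(int));
staging.Columns.Add("IsPositive", typeof(bool));
staging.Columns.Add("ReceivedDate", typeof(DateTime));
foreach (RMSResponse rmsObj in rmsList)
    staging.Rows.Add(rmsObj.RecruitID, true, DateTime.Now);

using (SqlConnection conn = new SqlConnection(connectionString))
{
    conn.Open();
    using (SqlBulkCopy bulk = new SqlBulkCopy(conn))
    {
        bulk.DestinationTableName = "RMSResponseStaging";
        bulk.WriteToServer(staging);
    }
    string sql =
        "INSERT INTO RMSResponse (LeadID, IsPositive, ReceivedDate) " +
        "SELECT s.LeadID, s.IsPositive, s.ReceivedDate FROM RMSResponseStaging s " +
        "WHERE NOT EXISTS (SELECT 1 FROM RMSResponse r WHERE r.LeadID = s.LeadID);";
    using (SqlCommand cmd = new SqlCommand(sql, conn))
    {
        cmd.ExecuteNonQuery();
    }
}

The batch update for the rows that already exist can then be done the same way, with a single UPDATE joined to the staging table.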
If you don't want to go down this route, then other tips include:
Open one connection outside of the loop
Create one SqlCommand object outside of the loop and reuse it by resetting the parameters
Change your select SQL to only select the columns you need, not *
A simple update or insert statement should never take 10 mins.
SQL Server is a very good database.
Create an index on LeadID on your table RMSResponse.
If your foreach is looping over many records, definitely consider a stored procedure; stored procedures will reduce a lot of the time taken.
If you want to update or insert (UPSERT), look at the MERGE command added in SQL Server 2008.
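For illustration, here is a rough sketch of a MERGE run against a staging table like the RMSResponseStaging one assumed in the sketch above (that name is made up; load it with SqlBulkCopy as shown there). The update branch mirrors the flip of IsPositive and the OptOutDate assignment from your original loop:

// Sketch only: one MERGE statement upserts everything loaded into RMSResponseStaging (SQL Server 2008+).
string merge =
    "MERGE RMSResponse AS target " +
    "USING RMSResponseStaging AS source ON target.LeadID = source.LeadID " +
    "WHEN MATCHED THEN UPDATE SET " +
    "    IsPositive = CASE WHEN target.IsPositive = 1 THEN 0 ELSE 1 END, " +
    "    OptOutDate = GETDATE() " +
    "WHEN NOT MATCHED THEN INSERT (LeadID, IsPositive, ReceivedDate) " +
    "    VALUES (source.LeadID, source.IsPositive, source.ReceivedDate);";

using (SqlConnection conn = new SqlConnection(connectionString))
using (SqlCommand cmd = new SqlCommand(merge, conn))
{
    conn.Open();
    cmd.ExecuteNonQuery();
}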