I am having an issue with a school assignment.
For some reason, my SQL INSERT command doesn't work unless the database table is empty.
If it is empty, it works just fine, but if I use it while there are already entries there, it moves to my "catch" block.
My initial connection and reading from the database to load the items was successful.
The following code also succeeds IF I remove the first "for" block (i.e. if I start my connection by deleting all the data from the table and insert the new data after that; please read the long comment as to why I did two identical "for" blocks to begin with).
If I delete the first "for" block and also remove the "DELETE" command, it still doesn't work (i.e. the problem isn't the two identical "for" blocks, but something else which prevents the INSERT command from working unless the table is empty).
Using MessageBox.Show I have tested how far the "try" block goes before going to the "catch" block. I have inserted //--// into that place in the following code. As you can see, the last successful line is before the INSERT command.
private void Form1_FormClosing(object sender, FormClosingEventArgs e)
{
Favorite fav;
SqlCommand delfav = new SqlCommand("DELETE FROM tblFavorites", conn);
SqlCommand addfav = new SqlCommand("INSERT INTO tblFavorites (Title, url) VALUES (@title, @url)", conn);
addfav.Parameters.Add("@title", SqlDbType.NVarChar, 100);
addfav.Parameters.Add("@url", SqlDbType.NVarChar, 300);
try
{
conn.Open();
for (int i = 0; i < favorites.Items.Count; i++)
{
fav = (Favorite)favorites.Items[i];
addfav.Parameters["@title"].Value = fav.getsettitle;
addfav.Parameters["@url"].Value = fav.getseturl;
//--//
addfav.ExecuteNonQuery();
}
/*
At first I didn't have the above section. The result was that each time the browser closed,
it first deleted the database table and then if there was a problem uploading the new data,
the old list was already gone.
To correct for this, I first attempted to add the favorites to the list (despite the fact
that for a short time it will have double the amount of information). If this process is
unsuccessful, it will skip to the catch block and the original data will be saved.
Only if the INSERT command is successful, I delete the data, then repeat the INSERT command,
already knowing that it is successful.
*/
delfav.ExecuteNonQuery();
for (int i = 0; i < favorites.Items.Count; i++)
{
fav = (Favorite)favorites.Items[i];
addfav.Parameters["@title"].Value = fav.getsettitle;
addfav.Parameters["@url"].Value = fav.getseturl;
addfav.ExecuteNonQuery();
}
}
catch
{
DialogResult dialogResult = MessageBox.Show("There was a problem loading your favorites to the database. Do you still wish to quit? \n (All changes to the favorites list will be lost)", "", MessageBoxButtons.YesNo);
if (dialogResult != DialogResult.Yes)
{
e.Cancel = true;
}
}
finally
{
conn.Close();
}
}
Thanks a lot for your help, guys!
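The insert-first-then-delete-then-reinsert workaround described in the long comment is essentially a hand-rolled transaction. A sketch of the same save routine using SqlTransaction (assuming the conn, favorites list, and Favorite class from the question) would let the old rows survive automatically if any INSERT fails:

```csharp
// Sketch: wrap DELETE + INSERTs in one transaction so the old rows
// are only lost once every INSERT has succeeded.
conn.Open();
using (SqlTransaction tx = conn.BeginTransaction())
{
    try
    {
        SqlCommand del = new SqlCommand("DELETE FROM tblFavorites", conn, tx);
        del.ExecuteNonQuery();

        SqlCommand add = new SqlCommand(
            "INSERT INTO tblFavorites (Title, url) VALUES (@title, @url)", conn, tx);
        add.Parameters.Add("@title", SqlDbType.NVarChar, 100);
        add.Parameters.Add("@url", SqlDbType.NVarChar, 300);

        foreach (Favorite f in favorites.Items)
        {
            add.Parameters["@title"].Value = f.getsettitle;
            add.Parameters["@url"].Value = f.getseturl;
            add.ExecuteNonQuery();
        }
        tx.Commit();   // only now are the old rows really gone
    }
    catch
    {
        tx.Rollback(); // on any failure the original data is untouched
        throw;
    }
}
```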
Related
I have been tasked with creating an application that monitors any "INSERT" events on a specific table. I was going to go about this using SqlDependency to create a notification link between the DB and the C# app, but it turns out I am not able to do this due to security issues.
Due to this, I have modeled my application as follows:
This is well and good, but as it turns out, the SQL table I am querying is rather large: nearly 3.5 million rows and 55 columns. When loading into the C# DataTable object, I am getting an out of memory exception.
internal static DataTable ExecuteQuery(string query, Dictionary<string,string> parameters = null)
{
try
{
using (SqlConnection dbconn = new SqlConnection(SQLServer.Settings.ConnectionString))
using (SqlCommand cmd = new SqlCommand())
{
dbconn.Open(); // Open the connection
cmd.CommandText = query; // Set the query text
cmd.Connection = dbconn;
if (parameters != null)
{
foreach (var parameter in parameters) // Add filter parameters
cmd.Parameters.AddWithValue(parameter.Key, parameter.Value);
}
var dt = new DataTable();
using (SqlDataAdapter adpt = new SqlDataAdapter(cmd)){adpt.Fill(dt);} // MY ERROR OCCURS HERE!
dbconn.Close();
queryError = false;
return dt;
}
}
catch(Exception ex)
{
queryError = true;
EventLogger.WriteToLog("ExecuteQuery()", "Application", "Error: An error has occurred while performing a database query.\r\nException: " + ex.Message);
return null;
}
}
When running the code above, I get the following error at the line for SqlDataAdapter.Fill(dt)
Exception of type 'System.OutOfMemoryException' was thrown.
Is there a way that I can either restructure my application OR prevent this incredibly high memory consumption from the DataTable class? SQL server seems capable enough to do a select * from the table but when I fill a DataTable with the same data, I use up over 6GB of RAM! Why is there so much overhead when using DataTable?
Here is a link to my flowchart.
I was able to resolve this issue by making use of the SqlDataReader class. This class lets you "stream" the SQL result set row by row rather than bringing back the entire result set all at once and loading it into memory.
So now in step 5 from the flow chart, I can query for only the very first row. Then in step 6, I can query again at a later date and iterate through the new result set one row at a time until I find the original row I started at. All the while, I am filling a DataTable with the new results. This accomplishes two things.
I don't need to load all the data from the query all at once into local memory.
I can immediately get the "inverse" DataSet. AKA... I can get the newly inserted rows that didn't exist the first time I checked.
Which is exactly what I was after. Here is just a portion of the code:
private static SqlDataReader reader;
private static SqlConnection dbconn = new SqlConnection(SQLServer.Settings.ConnectionString);
private void GetNextRows(int numRows)
{
if (dbconn.State != ConnectionState.Open)
OpenConnection();
// Read rows one by one up to the specified limit, stopping
// early if the reader runs out of rows.
int rowCnt = 0;
while (rowCnt < numRows && reader.Read())
{
object[] row = new object[reader.FieldCount];
reader.GetValues(row);
resultsTable.LoadDataRow(row, LoadOption.PreserveChanges);
rowCnt++;
sessionRowPosition++;
}
}
The whole class would be too large for me to post here, but one of the caveats was that the interval between checks for me was long, on the order of days, so I needed to close the connection between checks. When closing the connection with a SqlDataReader, you lose your row position, so I needed to add a counter to keep track of that.
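An alternative to tracking a row counter across sessions: if the table has a monotonically increasing key (an IDENTITY ID column, say — an assumption, not something stated above), you can persist the last key you saw and resume with a filtered query instead of re-reading from the start. A sketch (table/column names and the persistence helpers are placeholders):

```csharp
// Sketch: resume a long-interval scan by remembering the last key seen,
// so closing the connection between checks costs nothing.
long lastId = LoadLastIdFromDisk(); // hypothetical persistence helper

using (var conn = new SqlConnection(SQLServer.Settings.ConnectionString))
using (var cmd = new SqlCommand(
    "SELECT * FROM MyTable WHERE ID > @lastId ORDER BY ID", conn))
{
    cmd.Parameters.AddWithValue("@lastId", lastId);
    conn.Open();
    using (SqlDataReader reader = cmd.ExecuteReader())
    {
        while (reader.Read()) // streams one row at a time
        {
            // ... process the newly inserted row ...
            lastId = reader.GetInt64(reader.GetOrdinal("ID"));
        }
    }
}

SaveLastIdToDisk(lastId); // hypothetical persistence helper
```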
Check your SELECT query. You are probably retrieving far too many rows from the database.
I've got an application that, when a button is clicked, reads an Access database for products and lists them in a listbox and dataGridView. The connection command text is northwind_command.CommandText = "SELECT ProductName, UnitPrice FROM Products WHERE UnitPrice > (@price_greater_than)";
On the first click the program will work, but when the button is clicked a second time an exception is thrown. As this thrown exception causes the data reader to "crash", the third click will work as if it was the first. The fourth click will throw the same exception. If I had to guess, I'd say that the data reader isn't closing properly, but it should be. Here is the code for that part of the program:
northwind_connection.ConnectionString = @"Provider=Microsoft.Jet.OLEDB.4.0;Data Source='U:\Programming\C#\Week 13\Exercises\ExerciseA1\bin\northwind.mdb';Persist Security Info=True";
northwind_command.Connection = northwind_connection; // connects the command to the connection.
northwind_command.CommandText = "SELECT ProductName, UnitPrice FROM Products WHERE UnitPrice > (@price_greater_than)"; // sets the query used by this command.
northwind_command.Parameters.AddWithValue("@price_greater_than", price_greater_than);
try
{
northwind_connection.Open(); // opens the connection.
northwind_reader = northwind_command.ExecuteReader(); // reads the data from the connection while executing the command.
dataGridView1.Columns.Add("ProductName", "Product Name");
dataGridView1.Columns.Add("UnitPrice", "Product Price");
while (northwind_reader.Read())
{
dataGridView1.Rows.Add(northwind_reader["ProductName"], northwind_reader["UnitPrice"]);
listBox1.Items.Add(northwind_reader["ProductName"] + "\t" + northwind_reader["UnitPrice"]);
}
}catch(Exception mistake)
{
MessageBox.Show(mistake.ToString());
}
northwind_connection.Close();
EDIT: I've solved the issue with some help, but would like to figure out why it was happening in the first place. The offending line was northwind_command.Parameters.AddWithValue("@price_greater_than", price_greater_than);. The line above that one was modified to: northwind_command.CommandText = "SELECT ProductName, UnitPrice FROM Products WHERE UnitPrice > " + price_greater_than; and the program now works correctly.
That method was causing the exception to be thrown, which can be seen below:
I checked the exception message and line 50 contains this code: northwind_reader = northwind_command.ExecuteReader();, which confirms that the AddWithValue method was causing the error.
My bet is that something is not getting disposed of properly. Try modifying your code to use using statements:
using(var northwindConnection = new OleDbConnection())
{
//Set your connection info
using(var northwindCommand = northwindConnection.CreateCommand())
{
//Set your command info
try
{
// Open your connection and any other things
// needed before executing your reader
using(var reader = northwindCommand.ExecuteReader()){
//Do what you need with your reader
}
}
catch(Exception mistake)
{
MessageBox.Show(mistake.ToString());
}
}
}
When a class implements IDisposable, you really should wrap that in a using statement. Doing so will make sure that all resources are properly disposed. In the case of database connections, this will make sure that your connection is closed, so no need to call myConn.Close().
Another thing that might be causing issues is that you are adding columns to dataGridView1 every time the button is clicked.
Edit:
Since you found the problem is with AddWithValue, let me add this:
In past experience, I have had issues using the @paramName syntax with OleDbCommand. Try using the ? placeholder syntax instead. I have also had issues when the name is too long, so try shortening it.
You should use Parameters.Add(string, OleDbType).Value = value instead of Parameters.AddWithValue(string, value). The reason is that AddWithValue has to infer the type of the parameter, and it can sometimes get it wrong.
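A minimal sketch of the explicitly typed version, applied to the command from the question (OleDbType.Currency is an assumption here — match it to the actual UnitPrice column type):

```csharp
// OLE DB parameters are positional: use ? in the SQL text and add
// parameters in the same order, with an explicit type instead of AddWithValue.
northwind_command.CommandText =
    "SELECT ProductName, UnitPrice FROM Products WHERE UnitPrice > ?";
northwind_command.Parameters.Add("price", OleDbType.Currency).Value = price_greater_than;
```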
Add northwind_reader.Close() after the while loop. You need to close the reader before using it again.
example:
while (reader.Read())
{
string value = reader.GetValue(0).ToString();
}
reader.Close();
In your case it is northwind_reader.Close().
I want to build a simple loop to check incoming data from SQL server, compare it to a textfield, and execute non query if there are no duplicates.
I wrote this code:
try
{
bool exists = false;
conn = new SqlConnection(DBConnectionString);
SqlCommand check_user = new SqlCommand("SELECT usrEmail FROM tblUsers", conn);
SqlCommand add_user = new SqlCommand("INSERT INTO tblUsers (usrEmail, usrPassword, usrRealname, usrIsowner) VALUES (@email, @pass, @name, @owner)", conn);
// (I have removed all the parameters from this code as they are working and irrelevant)
conn.Open();
SqlDataReader check = check_user.ExecuteReader();
while (check.Read())
{
if (Convert.ToString(check[0]) == UserEmail.Text)
{
MessageBox.Show("The email you entered already exists in the system.");
exists = true;
break;
}
}
if (exists == false)
{
add_user.ExecuteNonQuery();
}
else
{
return;
}
}
catch (Exception ex)
{
MessageBox.Show("There was a problem uploading data to the database. Please review the seller's details and try again. " + ex.Message);
return;
}
finally
{
conn.Close();
}
I used breakpoints and saw that the code runs the while loop fine, but when it reaches the ExecuteNonQuery command, it returns an error message:
there is already an open datareader associated with this command which
must be closed first
I tried to use a check.Close(); command, but when I do, it suddenly gets stuck with the duplicate email error message for reasons passing understanding.
Additionally, there was a fix I tried in which the data actually WAS sent to the database (I saw it in SQL Server Management Studio), but still gave an error message... That was even stranger, since the nonquery command is the LAST in this function. If it worked, why did it go to the catch?
I have searched the site for answers, but the most common answers are MARS (I have no idea what that is) or a dataset, which I do not want to use in this case.
Is there a simple solution here? Did I miss something in the code?
The simplest way out would be:
using(SqlDataReader check = check_user.ExecuteReader())
{
while (check.Read())
{
if (Convert.ToString(check[0]) == UserEmail.Text)
{
MessageBox.Show("The email you entered already exists in the system.");
exists = true;
break;
}
}
}
That said, there are some serious problems with this code.
First of all, you don't really want to read all users just to check that an email address is already taken. select count(*) from tblUsers where usrEmail = @email is fine...
...or not, because there's a possibility of a race condition. What you should do is add a unique constraint on a usrEmail column and just insert into tblUsers, catching violations. Or you can use merge if you feel like it.
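A sketch of that insert-and-catch approach, reusing add_user from the question (it assumes a UNIQUE constraint exists on tblUsers.usrEmail; 2627 and 2601 are SQL Server's error numbers for unique constraint and unique index violations):

```csharp
// Let the database enforce uniqueness: just insert, and treat a
// unique-violation error as "this email already exists".
try
{
    add_user.ExecuteNonQuery();
}
catch (SqlException ex) when (ex.Number == 2627 || ex.Number == 2601)
{
    // 2627: unique constraint violation, 2601: unique index violation
    MessageBox.Show("The email you entered already exists in the system.");
}
```

This closes the race-condition window entirely: two clients inserting the same email at once cannot both succeed, which a SELECT-then-INSERT check can never guarantee.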
Next, you don't really want to have your data access code all over the place. Factor it out into separate classes/methods at least.
I have a relatively simply routine that looks at database entries for media files, calculates the width, height and filesize, and writes them back into the database.
The database is SQLite, using the System.Data.SQLite library, processing ~4000 rows. I load all rows into an ADO table, update the rows/columns with the new values, then run adapter.Update(table); on it.
Loading the dataset from the db took half a second or so, and updating all the rows with image width/height and getting the file length from FileInfo took maybe 30 seconds. Fine.
The adapter.Update(table); command took somewhere in the vicinity of 5 to 7 minutes to run.
That seems awfully excessive. The ID is a PK INTEGER and thus - according to SQLite's docs, is inherently indexed, yet even so I can't help but think that if I were to run a separate update command for each individual update, this would have completed much faster.
I had considered ADO/adapters to be relatively low level (as opposed to ORMs anyway), and this terrible performance surprised me. Can anyone shed some light on why it would take 5-7 minutes to update a batch of ~4000 records against a locally placed SQLite database?
As a possible aside, is there some way to "peek into" how ADO is processing this? Internal library stepthroughs or...??
Thanks
public static int FillMediaSizes() {
// returns the count of records updated
int recordsAffected = 0;
DataTable table = new DataTable();
SQLiteDataAdapter adapter = new SQLiteDataAdapter();
using (SQLiteConnection conn = new SQLiteConnection(Globals.Config.dbAppNameConnectionString))
using (SQLiteCommand cmdSelect = new SQLiteCommand())
using (SQLiteCommand cmdUpdate = new SQLiteCommand()) {
cmdSelect.Connection = conn;
cmdSelect.CommandText =
"SELECT ID, MediaPathCurrent, MediaWidth, MediaHeight, MediaFilesizeBytes " +
"FROM Media " +
"WHERE MediaType = 1 AND (MediaWidth IS NULL OR MediaHeight IS NULL OR MediaFilesizeBytes IS NULL);";
cmdUpdate.Connection = conn;
cmdUpdate.CommandText =
"UPDATE Media SET MediaWidth = @w, MediaHeight = @h, MediaFilesizeBytes = @b WHERE ID = @id;";
cmdUpdate.Parameters.Add("@w", DbType.Int32, 4, "MediaWidth");
cmdUpdate.Parameters.Add("@h", DbType.Int32, 4, "MediaHeight");
cmdUpdate.Parameters.Add("@b", DbType.Int32, 4, "MediaFilesizeBytes");
SQLiteParameter param = cmdUpdate.Parameters.Add("@id", DbType.Int32);
param.SourceColumn = "ID";
param.SourceVersion = DataRowVersion.Original;
adapter.SelectCommand = cmdSelect;
adapter.UpdateCommand = cmdUpdate;
try {
conn.Open();
adapter.Fill(table);
conn.Close();
}
catch (Exception e) {
Core.ExceptionHandler.HandleException(e, true);
throw new DatabaseOperationException("", e);
}
foreach (DataRow row in table.Rows) {
try {
using (System.Drawing.Image img = System.Drawing.Image.FromFile(row["MediaPathCurrent"].ToString())) {
System.IO.FileInfo fi;
fi = new System.IO.FileInfo(row["MediaPathCurrent"].ToString());
if (img != null) {
int width = img.Width;
int height = img.Height;
long length = fi.Length;
row["MediaWidth"] = width;
row["MediaHeight"] = height;
row["MediaFilesizeBytes"] = (int)length;
}
}
}
catch (Exception e) {
Core.ExceptionHandler.HandleException(e);
DevUtil.Print(e);
continue;
}
}
try {
recordsAffected = adapter.Update(table);
}
catch (Exception e) {
Core.ExceptionHandler.HandleException(e);
throw new DatabaseOperationException("", e);
}
}
return recordsAffected;
}
Use Connection.BeginTransaction() to speed up the DataAdapter update.
conn.Open() 'open connection
Dim myTrans As SQLiteTransaction
myTrans = conn.BeginTransaction()
'Associate the transaction with the select command object of the DataAdapter
objDA.SelectCommand.Transaction = myTrans
objDA.Update(objDT)
Try
myTrans.Commit()
Catch ex As Exception
myTrans.Rollback()
End Try
conn.Close()
This vastly speeds up the update.
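In C#, the same pattern (a sketch, reusing the conn, adapter, and table variables from the question) looks like:

```csharp
// Sketch: run the whole adapter.Update inside one transaction.
// Without it, SQLite syncs to disk once per row (~4000 times);
// with it, all the per-row UPDATEs share a single commit.
conn.Open();
using (SQLiteTransaction tx = conn.BeginTransaction())
{
    adapter.SelectCommand.Transaction = tx;
    adapter.UpdateCommand.Transaction = tx;
    try
    {
        recordsAffected = adapter.Update(table);
        tx.Commit();
    }
    catch
    {
        tx.Rollback();
        throw;
    }
}
```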
Loading the dataset from the db took half a second or so
This is a single SQL statement (so it's fast): execute the SQL SELECT, populate the dataset, done.
updating all the rows with image width/height and getting the file
length from FileInfo took maybe 30 seconds. Fine.
This is updating the in-memory data (so that's fast too): it changes rows in the dataset and doesn't talk to SQL at all.
The adapter.Update(table); command took somewhere in the vicinity of 5
to 7 minutes to run.
This will run a SQL update for every updated row. Which is why it's slow.
yet even so I can't help but think that if I were to run a separate
update command for each individual update, this would have completed
much faster.
This is basically what it's doing anyway!
From MSDN
The update is performed on a by-row basis. For every inserted,
modified, and deleted row, the Update method determines the type of
change that has been performed on it (Insert, Update or Delete).
Depending on the type of change, the Insert, Update, or Delete command
template executes to propagate the modified row to the data source.
When an application calls the Update method, the DataAdapter examines
the RowState property, and executes the required INSERT, UPDATE, or
DELETE statements iteratively for each row, based on the order of the
indexes configured in the DataSet.
is there some way to "peek into" how ADO is processing this?
Yes: Debug .NET Framework Source Code in Visual Studio 2012?
I have two stored procedures which are exactly the same, the only difference being one commits the inserts/updates, whilst the other runs in rollback mode.
What I am trying to achieve.
I want the user to fill out the 3 variables and then click a button, this button should then set up and execute the ROLLBACK version of the stored procedure. The user will then be presented with an OK/Cancel dialog MessageBox. If the data looks okay the user will then select OK from the DialogResult, if not they select Cancel. If they do select OK this is when the COMMIT version of the stored procedure will execute.
My problem.
Currently in the code I have copied what I have done for the commit version of the stored proc, i.e. once the changes have been made the dataset is refreshed and the gridview updated. As the ROLLBACK version of the stored proc will not actually make any changes, the gridview is never going to show the user what the data WILL look like if they click OK.
In SSMS, if I exec the rollback stored proc it will display a SELECT statement before the ROLLBACK TRAN part, which essentially shows me what the data will look like. It's this SELECT statement that I want to update my dataset with, so that the user can inspect the changes before clicking OK (commit).
My Question
Is there any way of using the SELECT statement within the rollback stored proc to update my dataset/gridview? If not, is there any way of changing my SqlDataAdapter to update the gridview with how the data looks within the transaction of the rollback stored proc? I think I may need to use ExecuteReader, but I'm not sure where it will fit into my current code.
Code
//Only execute the update if there is an ID, OldProfileClass and NewProfileClass specified
if (recordID.Text != "" && oldProfileClass.Text != "" && newProfileClass.Text != "")
{
int ID = Convert.ToInt32(recordID.Text);
int oldPC = Convert.ToInt32(oldProfileClass.Text);
int NewPC = Convert.ToInt32(newProfileClass.Text);
string connstrroll = @"Initial Catalog=mytestdb;Data Source=localhost;Integrated Security=SSPI;";
SqlConnection connroll = new SqlConnection(connstrroll);
connroll.Open();
var cmdroll = new SqlCommand("dbo.myrollbacksp", connroll);
cmdroll.CommandType = CommandType.StoredProcedure;
cmdroll.Parameters.AddWithValue("@meter_id", ID);
cmdroll.Parameters.AddWithValue("@new_profile_num", NewPC);
cmdroll.Parameters.AddWithValue("@old_profile_num", oldPC);
//execute the command 'cmdroll' once; the profile class will now be updated at db level
int numberOfRecordsroll = cmdroll.ExecuteNonQuery();
//Once the Profile Class change has been committed, show the results in the gridview
using (SqlDataAdapter aroll = new SqlDataAdapter("SELECT cust_ref, (region+meter_num_1+meter_num_2) as Number, meter_id, site_name, profile_num FROM dbo.Meter WHERE meter_id = @filter", connroll))
{
int filter = ID;
aroll.SelectCommand.Parameters.AddWithValue("#filter", filter);
// Use DataAdapter to fill DataTable
DataTable t = new DataTable();
aroll.Fill(t);
// Render data onto the screen
gridSelectID.DataSource = t;
}
//close connections
cmdroll.Dispose();
connroll.Close();
connroll.Dispose();
//confirm update
DialogResult dialogResult = MessageBox.Show("Number of records affected: " + numberOfRecordsroll + ". Please check the data is correct before proceeding", "Please validate your changes", MessageBoxButtons.OKCancel);
if (dialogResult == DialogResult.OK)
{
// CODE TO FIRE THE COMMIT VERSION OF STORED PROC GOES HERE
}
else if (dialogResult == DialogResult.Cancel)
{
//DONT RUN THE COMMIT VERSION OF THE STORED PROC
}
//empty the values of the three text box's once the profile class is updated
recordID.Text = "";
oldProfileClass.Text = "";
newProfileClass.Text = "";
}
else
{
MessageBox.Show("Please provide details for all 3 boxes", "Warning");
}
This did the job...
//show the 'temporary' results in the gridview, taken from the rollback stored proc
using (SqlDataAdapter aroll = new SqlDataAdapter(cmdroll))
{
// Use DataAdapter to fill DataTable
DataTable troll = new DataTable();
aroll.Fill(troll);
// Render data onto the screen
gridSelectID.DataSource = troll;
}