I've been trying to get my query to work for some time; it runs but doesn't insert anything, nor does it return any errors.
The database connection is open and connects successfully.
The table is called errorlog and holds the following data:
- id (int, auto-increment, primary key, unique)
- exception (varchar)
- time (datetime)
exception = string (the error message)
time = DateTime.Now
Here's the code:
public void insertError(string error, DateTime time)
{
    SqlCeParameter[] sqlParams = new SqlCeParameter[]
    {
        new SqlCeParameter("@exception", error),
        new SqlCeParameter("@time", time)
    };

    try
    {
        cmd = new SqlCeCommand();
        cmd.Connection = connection;
        cmd.CommandType = CommandType.Text;
        cmd.CommandText = "INSERT INTO errorlog (exception, time) VALUES(@exception, @time)";
        cmd.Parameters.AddRange(sqlParams);
        cmd.ExecuteNonQuery();
    }
    catch (Exception e)
    {
        Console.WriteLine(e.Message);
    }
}
Any help would be appreciated. Thanks in advance.
EDIT
Removed the quotes around @exception.
Here's the connection:
protected DataController()
{
    try
    {
        string appPath = System.IO.Path.GetDirectoryName(Assembly.GetAssembly(typeof(DataController)).CodeBase).Replace(@"file:\", "") + @"\";
        string strCon = @"Data Source = " + appPath + @"Data\EasyShop.sdf";
        connection = new SqlCeConnection(strCon);
    }
    catch (Exception e)
    {
    }

    connection.Open();
}
Finally the way it gets called:
public bool log(string msg, bool timestamp = true)
{
    DataController dc = DataController.Instance();
    dc.insertError(msg, DateTime.Today);
    return true;
}
Debug your application and see if the connection points to exactly the database you want. Also check that you are looking for the inserted records in the same database.
If your connection belongs to a transaction, check whether it's committed. You will not see the inserted records until the transaction is committed.
It seems to me that your INSERT is wrong. Remove the quotes around @exception.
Open SQL Server Profiler, connect to your database and check whether your INSERT appears there.
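If it is none of those, it also helps to make the failure visible instead of swallowing it. Here is a rough diagnostic sketch (not the poster's final code; it assumes the same connection field and errorlog table as the question): check the rows-affected count from ExecuteNonQuery and let any exception propagate so a silent failure shows up.

public void insertError(string error, DateTime time)
{
    // Sketch only: same table and connection as the question, but the
    // exception is not swallowed and the rows-affected count is checked.
    using (var cmd = new SqlCeCommand(
        "INSERT INTO errorlog (exception, time) VALUES (@exception, @time)", connection))
    {
        cmd.Parameters.Add(new SqlCeParameter("@exception", error));
        cmd.Parameters.Add(new SqlCeParameter("@time", time));

        int rows = cmd.ExecuteNonQuery();
        if (rows != 1)
        {
            Console.WriteLine("Insert affected {0} row(s)", rows);
        }
    }
}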
Related
I had a connection string and a bunch of unit tests using it to test the logic of a class that applied some CRUD operations. I was passing it as a private constant field in the test class and sharing it across my tests. Everything worked perfectly fine!
But then I realized I had to do this as integration testing, so I decided to use a static helper class that creates a database per test session for the tests to work with and then drops it.
The class is the following:
public static class LocalDB
{
    public const string DB_DIRECTORY = "Data";

    public static string GetLocalDB(string dbName, bool deleteIfExists = false)
    {
        try
        {
            var outputFolder = Path.Combine(Path.GetDirectoryName(Assembly.GetExecutingAssembly().Location), DB_DIRECTORY);
            var mdfFilename = dbName + ".mdf";
            var dbFileName = Path.Combine(outputFolder, mdfFilename);
            var logFileName = Path.Combine(outputFolder, $"{dbName}_log.ldf");

            if (!Directory.Exists(outputFolder))
            {
                Directory.CreateDirectory(outputFolder);
            }

            if (File.Exists(dbFileName) && deleteIfExists)
            {
                if (File.Exists(logFileName)) File.Delete(logFileName);
                File.Delete(dbFileName);
                CreateDatabase(dbName, dbFileName);
            }
            else if (!File.Exists(dbFileName))
            {
                CreateDatabase(dbName, dbFileName);
            }

            var connectionString = string.Format(@"Data Source=(LocalDB)\v11.0;AttachDBFileName={1};Initial Catalog={0};Integrated Security=True;", dbName, dbFileName);
            CreateTable(connectionString, "Cpa", dbName);
            return connectionString;
        }
        catch (Exception ex)
        {
            throw;
        }
    }

    public static bool CreateDatabase(string dbName, string dbFileName)
    {
        try
        {
            var connectionString = @"Data Source=(LocalDB)\v11.0;Initial Catalog=master;Integrated Security=True";
            using (var connection = new SqlConnection(connectionString))
            {
                connection.Open();
                var cmd = connection.CreateCommand();
                DetachDatabase(dbName);
                cmd.CommandText = string.Format("CREATE DATABASE {0} ON (NAME = N'{0}', FILENAME = '{1}')", dbName, dbFileName);
                cmd.ExecuteNonQuery();
                cmd.Dispose();
            }
            return File.Exists(dbFileName);
        }
        catch
        {
            throw;
        }
    }

    public static bool DetachDatabase(string dbName)
    {
        try
        {
            var connectionString = @"Data Source=(LocalDB)\v11.0;Initial Catalog=master;Integrated Security=True";
            using (var connection = new SqlConnection(connectionString))
            {
                connection.Open();
                var cmd = connection.CreateCommand();
                cmd.CommandText = $"exec sp_detach_db '{dbName}'";
                cmd.ExecuteNonQuery();
                cmd.Dispose();
                return true;
            }
        }
        catch (Exception ex)
        {
            return false;
        }
    }

    public static bool CreateTable(string connectionString, string tableName, string dbName)
    {
        connectionString = connectionString.Replace("master", dbName);
        try
        {
            using (var connection = new SqlConnection(connectionString))
            {
                var createTableQuery = $@"CREATE TABLE {tableName}(
                    CrmId nvarchar(50) NOT NULL,
                    Service nvarchar(25) NOT NULL,
                    RecurringReference nvarchar(50),
                    ExpiryDate datetime,
                    CardNumber nvarchar(50),
                    Enabled bit,
                    Brand nvarchar(50),
                    CpaType nvarchar(50),
                    Channel nvarchar(50)
                );";
                var command = new SqlCommand(createTableQuery, connection);
                connection.Open();
                var reader = command.ExecuteReader();
                reader.Dispose();
                return true;
            }
        }
        catch (Exception ex)
        {
            return false;
        }
    }
}
I was calling its GetLocalDB method in my test class's constructor and initializing a field with the result.
After that I got the following error: "the process cannot access the file blah log.ldf because it is used by another process".
All the tests were using the same connection string, and I have no idea what went wrong. It can't be a failure in the unit tests themselves, because all I've changed is the connection string (it used to point to an already existing db, and I changed it to a temporary local one via the LocalDB class).
Thanks!
I also used LocalDB for testing in my web application and had a similar issue. It can happen if, while debugging, you stop part-way through, or if some test runs into an exception and the LocalDB database is not disposed of properly. If that happens, then the next time you start running the tests it won't create the db (in spite of the if-db-exists check) and you'll run into an exception like the one you mentioned. To check this, I gave the database a different name every time we ran a set of tests, so the db gets created at the start of the run and dropped after all the tests have executed. As it's not ideal to give a different name every time, try disposing of the LocalDB database in a finally block so the cleanup happens even in case of an exception.
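For instance, a cleanup along these lines (MSTest attribute names and the DbName value are just illustrative; the helper methods are the ones from the question) makes sure the database files are released even when a test or teardown step throws:

[TestClass]
public class CpaIntegrationTests
{
    private const string DbName = "CpaTests";      // illustrative name
    private static string _connectionString;

    [ClassInitialize]
    public static void Init(TestContext context)
    {
        _connectionString = LocalDB.GetLocalDB(DbName, deleteIfExists: true);
    }

    [ClassCleanup]
    public static void Cleanup()
    {
        try
        {
            // any final teardown of test data goes here
        }
        finally
        {
            // Detach even if teardown throws, so the .mdf/.ldf files
            // are not left locked for the next run.
            LocalDB.DetachDatabase(DbName);
        }
    }
}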
Simply, I have an application that has one page that deletes and then re-adds/refreshes the records into a table every 30 seconds. I have another page that runs every 45 seconds that reads the table data and builds a chart.
The problem is, in the read/view page, every once in a while I get a 0 value (from a max count) and the chart shows nothing. I have a feeling that this is happening because the read is being done at the exact same time the delete page has deleted all the records in the table but has not yet refreshed/re-added them.
Is there a way in my application I can hold off on the read when the table is being refreshed?
Best Regards,
Andy
C#
ASP.Net 4.5
SQL Server 2012
My code below runs in a Windows service built with ASP.NET 4.5. It deletes all records in the ActualPlot table and then refreshes/adds new records from a text file every 30 seconds. I basically need to block (lock?) any user from reading the ActualPlot table while the records are being deleted and refreshed. Can you PLEASE help me change my code to do this?
private void timer1_Tick(object sender, ElapsedEventArgs e)
{
    // Open the SAP text files, clear the data in the tables and repopulate the new SAP data into the tables.
    var cnnString = ConfigurationManager.ConnectionStrings["TaktBoardsConnectionString"].ConnectionString;
    SqlConnection conn = new SqlConnection(cnnString);
    SqlConnection conndetail = new SqlConnection(cnnString);
    SqlConnection connEdit = new SqlConnection(cnnString);
    SqlCommand cmdGetProductFile = new SqlCommand();
    SqlDataReader reader;
    string sql;

    // Delete all the records from the ActualPlot and the ActualPlotPreload tables. We are going to repopulate them with the data from the text file.
    sql = "DELETE FROM ActualPlotPreload";
    try
    {
        conn.Open();
        SqlCommand cmd = new SqlCommand(sql, conn);
        cmd.ExecuteNonQuery();
    }
    catch (System.Data.SqlClient.SqlException ex)
    {
        string msg = "Delete Error:";
        msg += ex.Message;
        Library.WriteErrorLog(msg);
    }
    finally
    {
        conn.Close();
    }

    sql = "DELETE FROM ActualPlot";
    try
    {
        conn.Open();
        SqlCommand cmd = new SqlCommand(sql, conn);
        cmd.ExecuteNonQuery();
    }
    catch (System.Data.SqlClient.SqlException ex)
    {
        string msg = "Delete Error:";
        msg += ex.Message;
        Library.WriteErrorLog(msg);
    }
    finally
    {
        conn.Close();
    }

    // Read the SAP text file and load the data into the ActualPlotPreload table
    sql = "SELECT DISTINCT [BoardName], [ProductFile], [ProductFileIdent] FROM [TaktBoards].[dbo].[TaktBoard] ";
    sql = sql + "JOIN [TaktBoards].[dbo].[Product] ON [Product].[ProductID] = [TaktBoard].[ProductID]";
    cmdGetProductFile.CommandText = sql;
    cmdGetProductFile.CommandType = CommandType.Text;
    cmdGetProductFile.Connection = conn;
    conn.Open();
    reader = cmdGetProductFile.ExecuteReader();

    string DBProductFile = "";
    string DBTischID = "";
    string filepath = "";
    string[] cellvalues;
    DateTime dt, DateCheckNotMidnightShift;
    DateTime ldSAPFileLastMod = DateTime.Now;
    string MyDateString;
    int FileRecordCount = 1;

    while (reader.Read())
    {
        DBProductFile = (string)reader["ProductFile"];
        DBTischID = (string)reader["ProductFileIdent"];
        filepath = "c:\\inetpub\\wwwroot\\WebApps\\TaktBoard\\FilesFromSAP\\" + DBProductFile;
        FileInfo fileInfo = new FileInfo(filepath);   // Open file
        ldSAPFileLastMod = fileInfo.LastWriteTime;    // Get last time modified
        try
        {
            StreamReader sr = new StreamReader(filepath);
            FileRecordCount = 1;

            // Populate the ActualPlotPreload table with the dates from the SAP text file.
            sql = "INSERT into ActualPlotPreload (ActualDate, TischID) values (@ActualDate, @TischID)";
            while (!sr.EndOfStream)
            {
                cellvalues = sr.ReadLine().Split(';');
                if (FileRecordCount > 1 & cellvalues[7] != "")
                {
                    MyDateString = cellvalues[7];
                    DateTime ldDateCheck = DateTime.ParseExact(MyDateString, "M/dd/yyyy", null);
                    DateTime dateNow = DateTime.Now;
                    string lsDateString = dateNow.Month + "/" + dateNow.Day.ToString("d2") + "/" + dateNow.Year;
                    DateTime ldCurrentDate = DateTime.ParseExact(lsDateString, "M/dd/yyyy", null);
                    string lsTischID = cellvalues[119];

                    if (ldDateCheck == ldCurrentDate)
                    {
                        try
                        {
                            conndetail.Open();
                            SqlCommand cmd = new SqlCommand(sql, conndetail);
                            cmd.Parameters.Add("@ActualDate", SqlDbType.DateTime);
                            cmd.Parameters.Add("@TischID", SqlDbType.VarChar);
                            cmd.Parameters["@TischID"].Value = cellvalues[119];
                            MyDateString = cellvalues[7] + " " + cellvalues[55];
                            dt = DateTime.ParseExact(MyDateString, "M/dd/yyyy H:mm:ss", null);
                            cmd.Parameters["@ActualDate"].Value = dt;

                            // Ignore any midnight shift (12am to 3/4am) units built.
                            DateCheckNotMidnightShift = DateTime.ParseExact(cellvalues[7] + " 6:00:00", "M/dd/yyyy H:mm:ss", null);
                            if (dt >= DateCheckNotMidnightShift)
                            {
                                cmd.ExecuteNonQuery();
                            }
                        }
                        catch (System.Data.SqlClient.SqlException ex)
                        {
                            string msg = "Insert Error:";
                            msg += ex.Message;
                            Library.WriteErrorLog(msg);
                        }
                        finally
                        {
                            conndetail.Close();
                        }
                    }
                }
                FileRecordCount++;
            }
            sr.Close();
        }
        catch
        { }
        finally
        { }
    }
    conn.Close();

    // Get the unique TischID's and ActualDate from the ActualPlotPreload table. Then loop through each one, adding the ActualUnits,
    // ActualDate and TischID to the ActualPlot table. For each unique TischID we make sure that we reset the liTargetUnits to 1 and
    // count up as we insert.
    SqlCommand cmdGetTischID = new SqlCommand();
    SqlDataReader readerTischID;
    int liTargetUnits = 0;
    string sqlInsert = "INSERT into ActualPlot (ActualUnits, ActualDate, TischID) values (@ActualUnits, @ActualDate, @TischID)";
    sql = "SELECT DISTINCT [ActualDate], [TischID] FROM [TaktBoards].[dbo].[ActualPlotPreload] ORDER BY [TischID], [ActualDate] ASC ";
    cmdGetTischID.CommandText = sql;
    cmdGetTischID.CommandType = CommandType.Text;
    cmdGetTischID.Connection = conn;
    conn.Open();
    readerTischID = cmdGetTischID.ExecuteReader();

    DBTischID = "";
    DateTime DBActualDate;
    string DBTischIDInitial = "";
    while (readerTischID.Read())
    {
        DBTischID = (string)readerTischID["TischID"];
        DBActualDate = (DateTime)readerTischID["ActualDate"];
        if (DBTischIDInitial != DBTischID)
        {
            liTargetUnits = 1;
            DBTischIDInitial = DBTischID;
        }
        else
        {
            liTargetUnits++;
        }

        try
        {
            conndetail.Open();
            SqlCommand cmd = new SqlCommand(sqlInsert, conndetail);
            cmd.Parameters.Add("@ActualUnits", SqlDbType.Real);
            cmd.Parameters.Add("@ActualDate", SqlDbType.DateTime);
            cmd.Parameters.Add("@TischID", SqlDbType.VarChar);
            cmd.Parameters["@TischID"].Value = DBTischID;
            cmd.Parameters["@ActualDate"].Value = DBActualDate;
            cmd.Parameters["@ActualUnits"].Value = liTargetUnits;
            cmd.ExecuteNonQuery();
            cmd.Parameters.Clear();
        }
        catch (System.Data.SqlClient.SqlException ex)
        {
            string msg = "Insert Error:";
            msg += ex.Message;
            Library.WriteErrorLog(msg);
        }
        finally
        {
            conndetail.Close();
        }
    }
    conn.Close();
    Library.WriteErrorLog("SAP text file data has been imported.");
}
If the data is being re-added right back after the delete (basically you know what to re-add before emptying the table), you could have both operations within the same SQL transaction, so that the data becomes visible to the other page only once it has been re-added.
I mean something like this:
public bool DeleteAndAddData(string connString)
{
    using (OleDbConnection conn = new OleDbConnection(connString))
    {
        OleDbTransaction tran = null;
        try
        {
            conn.Open();
            tran = conn.BeginTransaction();

            OleDbCommand deleteComm = new OleDbCommand("DELETE FROM Table", conn);
            deleteComm.Transaction = tran;
            deleteComm.ExecuteNonQuery();

            OleDbCommand reAddComm = new OleDbCommand("INSERT INTO Table VALUES(1, 'blabla', 'etc.')", conn);
            reAddComm.Transaction = tran;
            reAddComm.ExecuteNonQuery();

            tran.Commit();
        }
        catch (Exception ex)
        {
            if (tran != null) tran.Rollback();
            return false;
        }
    }
    return true;
}
If your queries don't take too long to execute, you can start the two with an offset of 7.5 seconds, since there is a collision every 90 seconds, when the read/write finishes 3 cycles and the read/view finishes 2 cycles.
That said, this is not a fool-proof solution, just a trick based on assumptions. If you want to be completely sure that the read/view never happens while the read/write cycle is running, consider taking a read lock. I would recommend reading Understanding how SQL Server executes a query and Locking in the Database Engine.
Hope that helps.
I would try a couple of things:
Make sure your DELETE + INSERT operation is occurring within a single transaction:
BEGIN TRAN
DELETE FROM ...
INSERT INTO ...
COMMIT
If this isn't a busy table, try adding locking hints to your SELECT statement. For example:
SELECT ...
FROM Table
WITH (UPDLOCK, HOLDLOCK)
In the case where the update transaction starts while your SELECT statement is running, this will cause that transaction to wait until the SELECT is finished. Unfortunately it will block other SELECT statements too, but you don't risk reading dirty data.
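From C#, the hinted read might look roughly like this (a sketch only; the table and column names are taken from the question and the aggregate is a placeholder for whatever the chart page actually selects):

string sql = "SELECT MAX(ActualUnits) FROM ActualPlot WITH (UPDLOCK, HOLDLOCK) WHERE TischID = @TischID";
using (var conn = new SqlConnection(cnnString))
using (var cmd = new SqlCommand(sql, conn))
{
    cmd.Parameters.Add("@TischID", SqlDbType.VarChar).Value = tischId;
    conn.Open();
    // While this statement runs, a concurrent DELETE/INSERT transaction on
    // ActualPlot has to wait, so the chart never reads a half-empty table.
    object result = cmd.ExecuteScalar();
    int maxUnits = (result == null || result == DBNull.Value) ? 0 : Convert.ToInt32(result);
}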
I was not able to figure this out, but I changed my code so the program no longer deletes all the rows in the ActualPlot table; instead it checks whether each row is already there and, if not, adds the new row from the text file.
I've written a method that tries to delete a row from a database table based on a primary key id. The problem I have is that the try block always returns "Success", even if the record has already been deleted or doesn't exist.
public string delete_visit(int id)
{
    string deleteResponse = null;
    string cnn = ConfigurationManager.ConnectionStrings[connname].ConnectionString;
    using (SqlConnection connection = new SqlConnection(cnn))
    {
        string SQL = string.Empty;
        SQL = "DELETE FROM [" + dbname + "].[dbo].[" + tbname + "] WHERE VisitorNumber = @IDNumber ";
        using (SqlCommand command = new SqlCommand(SQL, connection))
        {
            command.Parameters.Add("@IDNumber", SqlDbType.Int);
            command.Parameters["@IDNumber"].Value = id;
            try
            {
                connection.Open();
                command.ExecuteNonQuery();
                deleteResponse = "Success";
            }
            catch (Exception ex)
            {
                deleteResponse = "There was a problem deleting the visit from the database. Error message: " + ex.Message;
            }
        }
    }
    return deleteResponse;
}
I want to be able to tell if the row was affected. I can do this in SQL Server Management Studio like so:
DELETE FROM Visits
WHERE VisitorNumber=88;
IF @@ROWCOUNT = 0
PRINT 'Warning: No rows were updated';
So I want to know: how do I plug the @@ROWCOUNT bit into my C# so that I can tell whether the row was deleted?
Thanks
ExecuteNonQuery() returns an int, indicating how many rows were affected.
So:
int rowsAffected = command.ExecuteNonQuery();
if (rowsAffected == 0)
{
    deleteResponse = "No rows affected";
}
The problem is that this number can be influenced by what the query actually does. Executing triggers or calling stored procedures could mess with the output, changing the number of affected rows. If you really must, first execute a query that checks whether a record with the given ID exists.
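If you do want that explicit existence check, a sketch against the same table could look like this (the connection is assumed to be open already, and the message text is just an example):

string checkSql = "SELECT COUNT(1) FROM [" + dbname + "].[dbo].[" + tbname + "] WHERE VisitorNumber = @IDNumber";
using (SqlCommand checkCmd = new SqlCommand(checkSql, connection))
{
    checkCmd.Parameters.Add("@IDNumber", SqlDbType.Int).Value = id;
    int existing = (int)checkCmd.ExecuteScalar();
    if (existing == 0)
    {
        // Nothing to delete: report it instead of claiming success.
        return "No visit with that id exists.";
    }
}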
Well, I had a weird exception in my program, so I tried to replicate it to show you. What I did was create a table with id (int-11, primary key) and title (varchar-255), and generate 100k random titles 40 characters long. When I run my method that reads the count for each id, it throws an exception; check below for more.
What I found is that this was because of timeouts, so I tried the following for the timeouts:
set net_write_timeout=99999; set net_read_timeout=99999;
Tried pooling=true on connection
Tried cmd.timeout = 120;
I also tried adding MaxDegreeOfParallelism and played with multiple values, but the same error still appears after a while.
My exception:
Could not kill query, aborting connection. Exception was Unable to
read data from the transport connection: A connection attempt failed
because the connected party did not properly respond after a period of
time, or established connection failed because connected host has
failed to respond.
public static string db_main = "Server=" + server + ";Port=" + port + ";Database=" + database_main + ";Uid=" + user + ";Pwd=" + password + ";Pooling=true;";

private void button19_Click(object sender, EventArgs e)
{
    List<string> list = db.read_string_list("SELECT id from tablename", db.db_main);

    //new ParallelOptions { MaxDegreeOfParallelism = 3 },
    Task.Factory.StartNew(() =>
    {
        Parallel.ForEach(list, id =>
        {
            string sql = "SELECT COUNT(*) FROM tablename where id=" + id;
            var ti = db.read_int(sql, db.db_main);
            Console.WriteLine(ti);
        });
    }).ContinueWith(_ =>
    {
        Console.WriteLine("Finished");
    });
}

public static int? read_int(string sql, string sconn)
{
    var rdr = MySqlHelper.ExecuteReader(db.db_main, sql);
    if (rdr.HasRows)
    {
        rdr.Read();
        return rdr.GetInt32(0);
    }
    else
        return null;
}
An alternate method to read an int, with a timeout option:
public static int? read_int2(string sql, string sconn)
{
    using (var conn = new MySqlConnection(sconn))
    {
        using (var cmd = new MySqlCommand(sql, conn))
        {
            //cmd.CommandTimeout = 120;
            conn.Open();
            using (var rdr = cmd.ExecuteReader())
            {
                if (rdr.HasRows)
                {
                    rdr.Read();
                    return rdr.GetInt32(0);
                }
                else
                    return null;
            }
        }
    }
}
What could be causing this? Any clues?
So finally my solution to this was to increase the net_read_timeout variable (I'm also pointing out net_write_timeout because the same thing can happen when executing a long query).
Run these queries. Note: after you restart your PC, the default values will take effect again.
set @@global.net_read_timeout = 999;
set @@global.net_write_timeout = 999;
Or you can add this to the connection string:
default command timeout=999;
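For reference, the option just goes into the same kind of connection string the question builds; a minimal sketch with placeholder server name and credentials:

public static string db_main =
    "Server=myserver;Port=3306;Database=mydb;Uid=myuser;Pwd=mypassword;" +
    "Pooling=true;default command timeout=999;";   // applied to every command created from this connection string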
Finally, I used this method to read the values:
public static int? read_int(string sql, string sconn)
{
    try
    {
        using (MySqlDataReader reader = MySqlHelper.ExecuteReader(sconn, sql))
        {
            if (reader.HasRows)
            {
                reader.Read();
                return reader.GetInt32(0);
            }
            else
                return null;
        }
    }
    catch (MySqlException ex)
    {
        //Do your stuff here
        throw ex;
    }
}
So what I'm doing is reading a lot of data from a remote Netezza database and inserting it into another remote Oracle database. For that I'm using an ODBC driver. The problem is that there is a lot of data and it takes too much time. How can I speed it up?
Here is what I do:
First I create the connection and the command for inserting:
String connect = "Driver={Microsoft ODBC for Oracle};CONNECTSTRING=(DESCRIPTION=(ADDRESS=(PROTOCOL=TCP)(HOST=myhost)(PORT=myprt))(CONNECT_DATA=(SERVICE_NAME=myname)));Uid=uid;Pwd=pass";
connection = new OdbcConnection(connect);
connection.Open();

String q = @"INSERT INTO TEST (id)
             VALUES (?)";
myCommand = new OdbcCommand(q, connection);
Then I read the data from Netezza:
String connect = "Driver={NetezzaSQL};servername=server;port=5480;database=db; username=user;password=pass;";

string query = @"SELECT T2.LETO_MESEC, T1.*
                 FROM data T1
                 JOIN datga2 T2 ON T2.ID = T1.EFT_ID
                 WHERE T2.LETO_MESEC = '" + mesec + @"'";

using (OdbcConnection connection = new OdbcConnection(connect))
{
    try
    {
        OdbcCommand command = new OdbcCommand(query, connection);
        connection.Open();
        OdbcDataReader reader = command.ExecuteReader();
        int counter = 0;
        while (reader.Read())
        {
            int id_first = reader.GetInt32(5);
            insertOracle(id_first);
        }
    }
    catch (Exception e)
    {
        Console.WriteLine("ne dela" + e.ToString());
    }
}
And finally my insert:
public void insertOracle(int id_first)
{
    try
    {
        myCommand.Parameters.Clear();
        myCommand.Parameters.Add(new OdbcParameter("id", id_first));
        myCommand.ExecuteNonQuery();
    }
    catch (Exception e)
    {
        Console.WriteLine("ne dela" + e.ToString());
    }
}
I noticed that these inserts commit on every row, so how do I remove that and speed things up? Right now it takes about 10 minutes for 20,000 rows.
Single inserts are always going to be slow; start processing the data in arrays, selecting a batch of IDs from the source system and loading an array into the target.
Here is an article that might be helpful: http://www.oracle.com/technetwork/issue-archive/2009/09-sep/o59odpnet-085168.html
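As an illustration of the array approach, here is a rough sketch using ODP.NET array binding (note this swaps the Microsoft ODBC for Oracle driver for Oracle's own provider; the TEST(id) table is from the question, everything else is illustrative): the ids collected from the Netezza reader are buffered and sent to Oracle in one ExecuteNonQuery per batch, with one commit per batch instead of one per row.

// Requires the ODP.NET managed driver: using Oracle.ManagedDataAccess.Client;
public void InsertBatchOracle(OracleConnection oracleConn, List<int> ids)
{
    if (ids.Count == 0) return;

    using (OracleTransaction tx = oracleConn.BeginTransaction())
    using (OracleCommand cmd = new OracleCommand("INSERT INTO TEST (id) VALUES (:id)", oracleConn))
    {
        // Array binding: one statement execution inserts ids.Count rows.
        cmd.ArrayBindCount = ids.Count;
        cmd.Parameters.Add("id", OracleDbType.Int32);
        cmd.Parameters["id"].Value = ids.ToArray();

        cmd.ExecuteNonQuery();
        tx.Commit();   // one commit per batch instead of one per row
    }
}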