How to improve SQLite write performance in C# - c#

I'm using SQLite to save log messages and I'm running into a write performance issue.
string log = "INSERT INTO Log VALUES ('2019-12-12 13:43:06','Error','Client','This is log message')";

public int WriteLog(string log)
{
    return ExecuteNoQuery(log);
}

public int ExecuteNoQuery(string command)
{
    int nResult = -1;
    try
    {
        using (SQLiteConnection dbConnection = new SQLiteConnection(ConnectString))
        {
            dbConnection.Open();
            using (SQLiteCommand dbCommand = dbConnection.CreateCommand())
            {
                dbCommand.CommandText = command;
                nResult = dbCommand.ExecuteNonQuery();
            }
        }
    }
    catch (Exception e)
    {
        // Output error message
    }
    return nResult;
}
From searching Google, wrapping inserts in a transaction can improve write performance significantly, but unfortunately I don't know when a log message will arrive, so I can't batch the messages together. Is there any other way to improve my log write performance?
I tried adding a timer to my code to commit the transaction automatically, but I don't think it's a good way to speed up log writes.
public class DatabaseManager : IDisposable
{
    private static SQLiteTransaction transaction = null;
    private SQLiteConnection dbConnection = null;
    private static Timer transactionTimer;
    private long checkInterval = 500;

    private DatabaseManager(string connectionString)
    {
        dbConnection = new SQLiteConnection(connectionString);
        dbConnection.Open();
        StartTransactionTimer();
    }

    public void Dispose()
    {
        if (transaction != null)
        {
            transaction.Commit();
            transaction = null;
        }
        dbConnection.Close();
        dbConnection = null;
    }

    private void StartTransactionTimer()
    {
        transactionTimer = new Timer();
        transactionTimer.Interval = checkInterval;
        transactionTimer.Elapsed += TransactionTimer_Elapsed;
        transactionTimer.AutoReset = false;
        transactionTimer.Start();
    }

    private void TransactionTimer_Elapsed(object sender, ElapsedEventArgs e)
    {
        StartTransation();
        transactionTimer.Enabled = true;
    }

    public void StartTransation()
    {
        try
        {
            if (dbConnection == null || dbConnection.State == ConnectionState.Closed)
            {
                return;
            }
            if (transaction != null)
            {
                transaction.Commit();
                transaction = null;
            }
            transaction = dbConnection.BeginTransaction();
        }
        catch (Exception e)
        {
            LogError("Error occurs during commit transaction, error message: " + e.Message);
        }
    }

    public int ExecuteNoQuery(string command)
    {
        int nResult = -1;
        try
        {
            using (SQLiteCommand dbCommand = dbConnection.CreateCommand())
            {
                dbCommand.CommandText = command;
                nResult = dbCommand.ExecuteNonQuery();
            }
        }
        catch (Exception e)
        {
            LogError("Error occurs during execute sql no result query, error message: " + e.Message);
        }
        return nResult;
    }
}

This started out as a comment, but it's evolving into an answer.
Get rid of the GC.Collect(); code line.
It's not your job to handle garbage collection - and you're probably degrading performance by calling it.
No need to close the connection; you're disposing it on the next line anyway.
Why are you locking? Insert statements are usually thread safe - and this one doesn't seem to be an exception to that rule.
You are swallowing exceptions. That's a terrible habit.
Since you're only ever inserting a single record, you don't need to return an int - you can simply return a bool (true for success, false for failure).
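Putting those points together, a minimal sketch of the insert could look like this (the column layout and the ConnectString property are assumptions taken from the question):

public bool WriteLog(DateTime timestamp, string level, string source, string message)
{
    // Parameterized insert; the column order follows the example row in the question.
    const string sql = "INSERT INTO Log VALUES (@timestamp, @level, @source, @message)";
    using (var dbConnection = new SQLiteConnection(ConnectString))
    {
        dbConnection.Open();
        using (var dbCommand = dbConnection.CreateCommand())
        {
            dbCommand.CommandText = sql;
            dbCommand.Parameters.AddWithValue("@timestamp", timestamp.ToString("yyyy-MM-dd HH:mm:ss"));
            dbCommand.Parameters.AddWithValue("@level", level);
            dbCommand.Parameters.AddWithValue("@source", source);
            dbCommand.Parameters.AddWithValue("@message", message);
            // Exceptions are allowed to bubble up instead of being swallowed here.
            return dbCommand.ExecuteNonQuery() == 1;
        }
    }
}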

Why don't you use Entity Framework to handle the communication with the database?
For me it's the easiest way. It's a Microsoft library, so you can be fairly sure the performance is good.
I've done some work with Entity Framework and SQLite databases and everything worked very well.
Here's an example of its use:
var context = new MySqliteDatabase(new SQLiteConnection(@"DataSource=D:\Mydb.db;cache=shared"));
var author = new Author
{
    FirstName = "William",
    LastName = "Shakespeare",
    Books = new List<Book>
    {
        new Book { Title = "Hamlet" },
        new Book { Title = "Othello" },
        new Book { Title = "MacBeth" }
    }
};
context.Add(author);
context.SaveChanges();
The MySqliteDatabase type can be created automatically using the Database First approach, or with the Code First approach. There is a lot of information and plenty of examples on the internet.
Here is the link to the official documentation:
https://learn.microsoft.com/en-us/ef/ef6/
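For completeness, reading the data back with the same (hypothetical) MySqliteDatabase context might look like the sketch below; the Authors DbSet name is an assumption on my part:

using (var context = new MySqliteDatabase(new SQLiteConnection(@"DataSource=D:\Mydb.db;cache=shared")))
{
    // Load the author inserted above together with their books.
    // Include requires the appropriate Entity Framework using directive.
    var author = context.Authors
        .Include(a => a.Books)
        .FirstOrDefault(a => a.LastName == "Shakespeare");
}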

Related

SQL dependency event not being triggered

This is my first time working with SQL Dependency... but after having gone over several examples I feel as if I am doing everything correctly. I've checked that the Broker is enabled. I've further checked that my query is correct. I am not receiving any exceptions at all! All in all, everything seems as if it should work... but it is not, and I have no idea how to begin to troubleshoot it without any exceptions being thrown.
Any help would be VERY much appreciated!
Here is my class:
public class NotificationEvent
{
    private delegate void RateChangeNotification(DataTable table);
    private SqlDependency dependency;
    string ConnectionString = @"ConnectionString";
    string UserName = Environment.UserName;

    public async void StartNotification()
    {
        SqlDependency.Start(this.ConnectionString, "UserNotificationsQueue");
        SqlConnection connection = new SqlConnection(this.ConnectionString);
        await connection.OpenAsync();
        SqlCommand command = new SqlCommand();
        command.Connection = connection;
        command.CommandType = CommandType.Text;
        command.CommandText = string.Format("SELECT [NotificationID],[UserFrom],[UserTo],[DateTimeSent],[Notification] FROM [dbo].[PersonnellNotifications]", UserName);
        command.Notification = null;
        this.dependency = new SqlDependency(command, "Service=PostUserNotificationsQueue;", Int32.MaxValue);
        dependency.OnChange += new OnChangeEventHandler(this.SqlDependencyOnChange);
        await command.ExecuteReaderAsync();
    }

    private void SqlDependencyOnChange(object sender, SqlNotificationEventArgs eventArgs)
    {
        if (eventArgs.Info == SqlNotificationInfo.Invalid)
        {
            Console.WriteLine("The above notification query is not valid.");
        }
        else
        {
            Console.WriteLine("Notification Info: " + eventArgs.Info);
            Console.WriteLine("Notification source: " + eventArgs.Source);
            Console.WriteLine("Notification type: " + eventArgs.Type);
        }
    }

    public void StopNotification()
    {
        SqlDependency.Stop(this.ConnectionString, "QueueName");
    }
}
I am initializing this from another class's InitializeComponent(), as seen here:
private void InitializeComponent()
{
    // Initialize SQL Dependency
    ne.StartNotification();
}
I have just tested the following in my code and it's working well. I have simplified your code. Please check whether it works for you and whether you get a call to OnNotificationChange on a DB change.
public async void RegisterForNotification()
{
    var connectionString = @"ConnectionString";
    using (var connection = new SqlConnection(connectionString))
    {
        await connection.OpenAsync();
        var queryString = "Your Query String";
        using (var oCommand = new SqlCommand(queryString, connection))
        {
            // Starting the listener infrastructure...
            SqlDependency.Start(connectionString);
            var oDependency = new SqlDependency(oCommand);
            oDependency.OnChange += OnNotificationChange;
            // NOTE: You have to execute the command, or the notification will never fire.
            await oCommand.ExecuteReaderAsync();
        }
    }
}

private void OnNotificationChange(object sender, SqlNotificationEventArgs e)
{
    Console.WriteLine("Notification Info: " + e.Info);
    // Re-register the SqlDependency.
    RegisterForNotification();
}
Are you setting SqlClientPermission? See:
https://learn.microsoft.com/en-us/dotnet/framework/data/adonet/sql/enabling-query-notifications
// Code requires directives to
// System.Security.Permissions and
// System.Data.SqlClient
private bool CanRequestNotifications()
{
    SqlClientPermission permission =
        new SqlClientPermission(
            PermissionState.Unrestricted);
    try
    {
        permission.Demand();
        return true;
    }
    catch (System.Exception)
    {
        return false;
    }
}
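A possible way to wire that check in, assuming the NotificationEvent class from the question:

// Only start listening when the query-notification permission check passes.
if (CanRequestNotifications())
{
    var notificationEvent = new NotificationEvent();
    notificationEvent.StartNotification();
}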

MySql executing a MySqlScript even when the connection has been closed

Is it possible for MySQL to execute a script even when the connection has been closed?
I am using MySQL Community Server, through the .NET connector API.
I was using C# to test out the API.
I have the following static class
using System;
using System.Data;
using MySql.Data;
using MySql.Data.MySqlClient;

public static class DataBase
{
    static string connStr = "server=localhost;user=root;port=3306;password=*******;";
    static MySqlConnection conn;

    public static bool Connect()
    {
        conn = new MySqlConnection(connStr);
        try
        {
            conn.Open();
        }
        catch (Exception Ex)
        {
            ErrorHandler(Ex);
            return false;
        }
        return true;
    }

    public static int ExecuteScript(string scripttext) // returns the number of statements executed
    {
        MySqlCommand cmd = conn.CreateCommand();
        cmd.CommandText = scripttext;
        MySqlScript script;
        int count = 0;
        try
        {
            script = new MySqlScript(conn, cmd.CommandText);
            script.Error += new MySqlScriptErrorEventHandler(script_Error);
            script.ScriptCompleted += new EventHandler(script_ScriptCompleted);
            script.StatementExecuted += new MySqlStatementExecutedEventHandler(script_StatementExecuted);
            count = script.Execute();
        }
        catch (Exception Ex)
        {
            count = -1;
            ErrorHandler(Ex);
        }
        return count;
    }

    #region EventHandlers
    static void script_StatementExecuted(object sender, MySqlScriptEventArgs args)
    {
        string Message = "script_StatementExecuted";
    }

    static void script_ScriptCompleted(object sender, EventArgs e)
    {
        string Message = "script_ScriptCompleted!";
    }

    static void script_Error(Object sender, MySqlScriptErrorEventArgs args)
    {
        string Message = "script_Error: " + args.Exception.ToString();
    }
    #endregion

    public static bool Disconnect()
    {
        try
        {
            conn.Close();
        }
        catch (Exception Ex)
        {
            ErrorHandler(Ex);
            return false;
        }
        return true;
    }

    public static void ErrorHandler(Exception Ex)
    {
        Console.WriteLine(Ex.Source);
        Console.WriteLine(Ex.Message);
        Console.WriteLine(Ex.ToString());
    }
}
and I am using the following code to test out this class
using System;
using System.Data;

namespace Sample
{
    public class Sample
    {
        public static void Main()
        {
            if (DataBase.Connect() == true)
                Console.WriteLine("Connected");
            if (DataBase.Disconnect() == true)
                Console.WriteLine("Disconnected");
            int count = DataBase.ExecuteScript("drop database sample");
            if (count != -1)
            {
                Console.WriteLine(" Sample Script Executed");
                Console.WriteLine(count);
            }
            Console.ReadKey();
        }
    }
}
I noticed that even though I have closed my MySQL connection using Disconnect() - which I defined above - MySQL continues to execute the command I give next, and no error is generated.
I feel like I am doing something wrong, as an error should be generated when I try to execute a script on a closed connection.
Is it a problem in my code/logic, or some flaw in the MySQL connector?
I did check through MySQL Workbench whether the command was executed properly, and it was.
This is a decompilation of the MySqlScript.Execute code:
public unsafe int Execute()
{
    ......
    flag = 0;
    if (this.connection != null)
    {
        goto Label_0015;
    }
    throw new InvalidOperationException(Resources.ConnectionNotSet);
Label_0015:
    if (this.query == null)
    {
        goto Label_002A;
    }
    if (this.query.Length != null)
    {
        goto Label_002C;
    }
Label_002A:
    return 0;
Label_002C:
    if (this.connection.State == 1)
    {
        goto Label_0047;
    }
    flag = 1;
    this.connection.Open();
    ....
As you can see, when you build the MySqlScript, the connection you pass in is saved in an internal variable, and before executing the script, if that internal connection is closed, the code opens it. I haven't checked, but I suppose it also closes the connection before exiting (notice that flag = 1 before opening).
Apart from this, I suggest changing your code to avoid keeping a global MySqlConnection object. You gain nothing and you risk running into bugs that are very difficult to track down.
static string connStr = "server=localhost;user=root;port=3306;password=*******;";

public static MySqlConnection Connect()
{
    MySqlConnection conn = new MySqlConnection(connStr);
    conn.Open();
    return conn;
}
This approach allows you to write code that uses the using statement:
public static int ExecuteScript(string scripttext) // returns the number of statements executed
{
    using (MySqlConnection conn = DataBase.Connect())
    using (MySqlCommand cmd = conn.CreateCommand())
    {
        cmd.CommandText = scripttext;
        ....
    }
}
The using statement will close and dispose the connection and the command, freeing valuable resources, and even in case of an exception you can be sure the connection is closed and disposed.
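For illustration, a minimal sketch of ExecuteScript rewritten along those lines, reusing the event handlers from the original class (error handling left out for brevity):

public static int ExecuteScript(string scripttext) // returns the number of statements executed
{
    // The connection only lives for the duration of this call.
    using (MySqlConnection conn = DataBase.Connect())
    {
        MySqlScript script = new MySqlScript(conn, scripttext);
        script.Error += script_Error;
        script.ScriptCompleted += script_ScriptCompleted;
        script.StatementExecuted += script_StatementExecuted;
        return script.Execute();
    }
}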

Page_Load firing multiple times?

We have been dealing with an error for the last couple of days, so we created a small page (quick and dirty programming, my apologies in advance) that connects to the database, checks if a document exists, and displays some data related to the document. If there is an exception, an email is sent with the exception information and some log data.
Here's a simplified version of the code (short explanation below):
namespace My.Namespace
{
    public partial class myClass : System.Web.UI.Page
    {
        private static SqlConnection conn = null;
        private static SqlCommand command1 = null;
        private static SqlCommand command2 = null;
        private static string log = "";

        protected void Page_Load(object sender, EventArgs e)
        {
            if (!Page.IsPostBack)
            {
                try
                {
                    log += "START\n";
                    string docId = Request.QueryString["docId"];
                    if (!String.IsNullOrEmpty(docId))
                    {
                        bool docExists = doesDocExist(docId);
                        if (docExists == true)
                        {
                            string docMetadata = getMetadata(docId);
                            Response.Write(docMetadata);
                        }
                    }
                    else
                    {
                        // display error message
                    }
                }
                catch (SqlException sqlex)
                {
                    // process exception
                    sendErrorMessage(sqlex.Message);
                }
                catch (Exception ex)
                {
                    // process exception
                    sendErrorMessage(ex.Message);
                }
            }
        }

        private static bool doesDocExist(string docId)
        {
            log += "In doesDocExist\n";
            bool docExists = false;
            try
            {
                // open db connection (conn)
                string cmd = String.Format("SELECT COUNT(*) FROM docs WHERE id='{0}'", docId);
                command1 = new SqlCommand(cmd, conn);
                conn.Open();
                var val = command1.ExecuteScalar();
                int numberOfRows = int.Parse(val.ToString());
                if (numberOfRows > 0) { docExists = true; }
            }
            finally
            {
                // close db connection (conn)
            }
            return docExists;
        }

        protected string getMetadata(string docId)
        {
            log += "In getMetadata\n";
            string docMetadata = "";
            try
            {
                // open db connection (conn)
                string cmd = String.Format("SELECT metadata FROM docs WHERE id='{0}'", docId);
                command2 = new SqlCommand(cmd, conn);
                conn.Open();
                SqlDataReader rReader = command2.ExecuteReader();
                if (rReader.HasRows)
                {
                    while (rReader.Read())
                    {
                        // process metadata
                        docMetadata += DOCMETADATA;
                    }
                }
            }
            finally
            {
                // close db connection (conn)
            }
            return docMetadata;
        }

        public static void sendErrorMessage(string messageText)
        {
            HttpContext.Current.Response.Write(messageText);
            // Send string log via email
        }
    }
}
I know it's too long, so here is a quick description. We have a class with the Page_Load method and three other methods:
doesDocExist: returns a bool value indicating whether a document ID is in the database.
getMetadata: returns a string with metadata related to the document.
sendErrorMessage: sends an email with the log generated during the page request.
From Page_Load we call doesDocExist. If the value returned is true, we call getMetadata and display the value on the screen. If there's any error, it is caught in Page_Load and sent as an email.
The problem is that when there's an error, instead of getting an email with the log (i.e.: START - In Function1 - In Function2), the log appears 100 times in the email (i.e.: START - In Function1 - In Function2 - START - In Function1 - In Function2 - START... and so on), as if Page_Load had fired that many times.
We read online (http://www.craigwardman.com/blog/index.php/2009/01/asp-net-multiple-page-load-problem/) that it could be because of the PostBack. So we added the condition if (!Page.IsPostBack), but the result is still the same.
Is there any reason why Page_Load would be triggered multiple times? Or is it that we are doing something wrong with the log variable and/or the try/catch that causes this behavior?
The log may be long because you are declaring the string log as static. Does it need to be static?
private static SqlConnection conn = null;
private static SqlCommand command1 = null;
private static SqlCommand command2 = null;
private static string log = "";
The problem is that log is effectively a singleton (it's static), along with the other fields.
Whenever you access that page, you append text to the same log field, which ends up being START - In Function1 - In Function2 - START - In Function1 - In Function2 - START... and so on.
Based on your scenario, you do not need any static state inside myClass.
FYI: since I do not know the rest of your code, make sure you instantiate conn, command1, and command2.
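A minimal sketch of that change - the static methods that touch these fields (doesDocExist, sendErrorMessage) would have to lose their static modifier too, so that each request works on its own state:

public partial class myClass : System.Web.UI.Page
{
    // Instance fields: every request gets its own connection, commands, and log buffer,
    // so concurrent requests no longer append to the same shared string.
    private SqlConnection conn = null;
    private SqlCommand command1 = null;
    private SqlCommand command2 = null;
    private string log = "";

    // ... Page_Load, doesDocExist, getMetadata, and sendErrorMessage as before,
    // but declared without 'static'.
}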
If your Page_Load is executing twice, it may be because of a postback (which happens when you click a button or link), so you should check for it and run your code like this:
if (!IsPostBack)
{
    try
    {
        log += "START\n";
        string docId = Request.QueryString["docId"];
        if (!String.IsNullOrEmpty(docId))
        {
            bool docExists = doesDocExist(docId);
            if (docExists == true)
            {
                string docMetadata = getMetadata(docId);
                Response.Write(docMetadata);
            }
        }
        else
        {
            // display error message
        }
    }
    catch (SqlException sqlex)
    {
        // process exception
        sendErrorMessage(sqlex.Message);
    }
    catch (Exception ex)
    {
        // process exception
        sendErrorMessage(ex.Message);
    }
}

Weird error on transactions -> using NonQuery getting Reader must be closed

I'm developing a client in C# with PostgreSQL.
It needs to parse some text and insert the data into the tables correctly, so we have a parser which gives me a Dictionary for each table (4 at the moment).
We have a thread which pushes those dictionaries onto a ConcurrentQueue.
Now, we have two timers:
1) every 10 seconds, commit the open transaction and recreate it.
These are the methods:
void transactionTimer_Elapsed(object sender, System.Timers.ElapsedEventArgs e)
{
    try
    {
        Commit();
    }
    catch (Exception ex)
    {
        Logger.Log(Logger.LogType.ERROR, ex);
        Rollback();
    }
    Transaction();
}

public void Commit()
{
    if (trans != null)
    {
        trans.Commit();
        trans.Dispose();
        trans = null;
    }
}

public void Transaction()
{
    if (dbCon == null || (dbCon != null && dbCon.State != ConnectionState.Open))
        dbCon = Connection;
    if (trans == null)
        trans = dbCon.BeginTransaction();
}

public void Rollback()
{
    if (trans != null)
    {
        trans.Rollback();
        trans.Dispose();
        trans = null;
    }
}
2) every 3 seconds, pick a hundred items off the queue and do one huge insert (not 100 separate inserts, just one) using parameters, like this:
insert into Tabletest1_HandsData( handDataId, handData) values( @handDataId0, @handData0),( @handDataId1, @handData1),( @handDataId2, @handData2),( @handDataId3, @handData3) ....
using this helper:
public bool Insert(String tableName, Dictionary<String, object> data, bool usingTransaction = false)
{
    Boolean returnCode = true;
    var sql = GetSQL(tableName, data);
    try
    {
        int rowsUpdated = -1;
        var conn = (!usingTransaction) ? Connection : dbCon;
        DbCommand mycommand = GetCommand(conn, sql);
        GetCommandByDictionary(mycommand, data);
        rowsUpdated = mycommand.ExecuteNonQuery();
        if (!usingTransaction)
        {
            conn.Close();
            conn.Dispose();
        }
    }
    catch (Exception ex)
    {
        Logger.Log(Logger.LogType.ERROR, ex);
        returnCode = false;
    }
    return returnCode;
}

protected override void GetCommandByDictionary(DbCommand cmd, Dictionary<string, object> data)
{
    foreach (var val in data)
        (cmd as NpgsqlCommand).Parameters.AddWithValue(val.Key.ToString(), val.Value);
}
So, we parse and insert into a queue, then every 3 seconds we pick 100 items and insert them, and every 10 seconds the transaction is committed and recreated.
My error is that ExecuteNonQuery is giving me:
There is already an open DataReader associated with this Command which must be closed first.
Why is that happening?
I may take this opportunity to ask you: how can I do this better? Please feel free to criticise, I have tried a lot of things.
Thanks
Luca

Data rollback when one of the stored procedures fails in .NET C#

Does anyone know how I can roll back all the data in .NET C# if one of the stored procedures fails during the process?
Example:
protected void btnSave_Click(object sender, EventArgs e)
{
    int ixTest = SaveTest("123");
    CreateTest(ixTest);
}

protected int SaveTest(int ixTestID)
{
    SubSonic.StoredProcedure sp = Db.SPs.TestInsert(
        null,
        ixTestID);
    sp.Execute();
    int ixTest = (int)sp.OutputValues[0];
    return ixTest;
}

private long CreateTest(int ixTest)
{
    long ixTestCustomer = CreateTestCustomer();
    TestCustomer testCustomer = new TestCustomer();
    try
    {
        testCustomer.TestCustomerId = ixTest;
        testCustomer.InteractionId = ixTestCustomer;
        testCustomer.Save();
    }
    catch
    {
        Response.Redirect("pgCallSaveFailure.aspx");
    }
    m_saleDetail = new TestSaleDetail();
    try
    {
        m_saleDetail.SaleId = sale.SaleId;
        m_saleDetail.Save();
    }
    catch
    {
        Response.Redirect("pgCallSaveFailure.aspx");
    }
    return ixTestCustomer;
}
The code above calls btnSave_Click, which then calls two other functions, SaveTest() and CreateTest(), to save data into the database. How can I roll back the whole data transaction in this code if a problem happens only in CreateTest(), after SaveTest() has already run successfully? In other words, how can I roll back all the data for both SaveTest() and CreateTest()?
Use the TransactionScope class
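For example, a minimal sketch with TransactionScope could look like this (it assumes the SubSonic and TestCustomer/TestSaleDetail calls enlist in the ambient transaction; otherwise the scope has no effect on them):

protected void btnSave_Click(object sender, EventArgs e)
{
    // Requires a reference to System.Transactions.
    using (var scope = new TransactionScope())
    {
        int ixTest = SaveTest(123);
        CreateTest(ixTest);
        // Complete() is only reached if neither call threw; otherwise the scope
        // is disposed without completing and the work is rolled back.
        scope.Complete();
    }
}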
The code below performs the same kind of rollback using an explicit transaction. Hope you can get the idea.
public void SaveData()
{
    SqlConnection connDB = new SqlConnection();
    SqlCommand cmdExecuting = new SqlCommand();
    try
    {
        connDB = new SqlConnection(connection_string);
        cmdExecuting.Connection = connDB;
        connDB.Open();
        cmdExecuting.Transaction = connDB.BeginTransaction();
        int result = 0;
        result = Method1(cmdExecuting);
        if (result != 0)
        {
            cmdExecuting.Transaction.Rollback();
            return;
        }
        result = Method2(cmdExecuting);
        if (result != 0)
        {
            cmdExecuting.Transaction.Rollback();
            return;
        }
        cmdExecuting.Transaction.Commit();
    }
    catch (Exception ex)
    {
        cmdExecuting.Transaction.Rollback();
    }
    finally
    {
        cmdExecuting.Dispose();
        cmdExecuting = null;
        connDB.Close();
        connDB = null;
    }
}

public int Method1(SqlCommand cmdExecuting)
{
    cmdExecuting.Parameters.Clear();
    cmdExecuting.CommandText = "stored proc 01";
    cmdExecuting.CommandType = CommandType.StoredProcedure;
    cmdExecuting.Parameters.Add("@para1", SqlDbType.Int);
    cmdExecuting.Parameters["@para1"].Value = value;
    return (int)cmdExecuting.ExecuteScalar();
}

public int Method2(SqlCommand cmdExecuting)
{
    cmdExecuting.Parameters.Clear();
    cmdExecuting.CommandText = "stored proc 02";
    cmdExecuting.CommandType = CommandType.StoredProcedure;
    cmdExecuting.Parameters.Add("@para1", SqlDbType.Int);
    cmdExecuting.Parameters["@para1"].Value = value;
    return (int)cmdExecuting.ExecuteScalar();
}
