Multi-threading with SQLite in a Windows Store app - C#

I have a problem with my Windows Store app. A background task can pull data from a server and insert new rows into the local database while, at the same time, the user is navigating to a page that reads from that database, and the app crashes because the database is locked. This is a rare case, but it does happen.
So the question is: what are some ways I can combat this, so that one connection waits for the other to finish (or something like that) and they don't conflict with each other?
I am using sqlite-net as the SQL wrapper for all my data access. Any suggestions?

EkoostikMartin is correct. Using sqlite-net, you would want to do:
try
{
    // Switch to write-ahead logging so readers and writers stop blocking each other.
    db.Execute("PRAGMA journal_mode = WAL;");
}
catch (SQLiteException e)
{
    // The PRAGMA returns a row ("wal"); rethrow anything other than that expected result.
    if (e.Result != SQLite3.Result.Row)
        throw;
}
(Not sure if the catch is required on all platforms, but on WP8 it was necessary for me because the journal_mode PRAGMA returns a row, and the Execute call throws in that scenario.)

You should try enabling WAL (Write-Ahead Logging) mode. It allows for concurrent reading and writing.
http://www.sqlite.org/wal.html
SQLiteConnectionStringBuilder connBuilder = new SQLiteConnectionStringBuilder();
connBuilder.JournalMode = SQLiteJournalModeEnum.Wal;
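For completeness, a minimal sketch of using that builder with System.Data.SQLite; the app.db file name is a placeholder, not something from the question:
connBuilder.DataSource = "app.db"; // placeholder database file
using (var conn = new SQLiteConnection(connBuilder.ConnectionString))
{
    conn.Open();
    // With journal_mode = WAL, readers on other connections are no longer
    // blocked while a background task is writing.
}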

Related

App not processing events while trying to open database connection

I am writing a Windows Forms app in C# with Visual Studio 2022 on a Windows 10 machine. The app connects to an Azure database, which works fine. My issue is that sometimes it takes several seconds to connect (maybe 10 or so), or if there is an error it goes all the way to the timeout limit (usually 20 to 30 seconds) before coming back with whatever error message there is.
I am trying to provide some visual feedback to the user during this time, but the application does not appear to be processing any events, so whatever type of feedback I'm trying to send does not get done until the operation completes (at which point it is moot).
Any ideas on how to deal with this? Do I need to open the database on a different thread, and if so, will that be an issue throughout the rest of the app whenever I use the database object opened on a different thread?
I'm trying something simple, like gradually adding a row of dots, like so:
private void InitCloudDatabase()
{
    Boolean success = true;
    WorkingTimer.Enabled = true;
    WorkingTimer.Start();
    try
    {
        AzureAgDatabase db = new AzureAgDatabase();
        db.OpenConnection();
    }
    catch
    {
        success = false;
    }
    WorkingTimer.Stop();
    pbCloudResult.Image = (success) ? Properties.Resources.icons8_done_96
                                    : Properties.Resources.Red_X___Fail;
}

private void WorkingTimer_Tick(object sender, EventArgs e)
{
    lblCloud.Text += " .";
    if (lblCloud.Text.Contains(" . . . . . . . . . . ."))
    {
        lblCloud.Text = "Database Connection (Cloud)";
    }
}
I haven't really worked with Windows Forms before, but in most UI-based applications you should reserve the UI thread for UI operations only and move all time-consuming work (compute or I/O) to a different thread so that the UI stays responsive.
In the case of Windows Forms, it looks like you have a BackgroundWorker class that you can use to offload the DB operations. There is a walkthrough in the official docs that you can refer to.
Another approach would be to use the Task class to run your database code asynchronously, with less code than the first approach. You would simply wrap the slow statements in a Task.Run call and put the follow-up statements in a continuation, as sketched below.
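A minimal sketch of that second approach, using async/await (which compiles down to the continuation described above); AzureAgDatabase, WorkingTimer, pbCloudResult, and the resource names are taken from the question:
// requires: using System.Threading.Tasks;
private async void InitCloudDatabaseAsync()
{
    WorkingTimer.Enabled = true;
    WorkingTimer.Start();

    // The slow connection attempt runs on a thread-pool thread, so the UI
    // keeps pumping messages and the timer keeps adding dots.
    bool success = await Task.Run(() =>
    {
        try
        {
            AzureAgDatabase db = new AzureAgDatabase();
            db.OpenConnection();
            return true;
        }
        catch
        {
            return false;
        }
    });

    // Execution resumes here on the UI thread, so touching controls is safe.
    WorkingTimer.Stop();
    pbCloudResult.Image = success ? Properties.Resources.icons8_done_96
                                  : Properties.Resources.Red_X___Fail;
}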

WPF, Send Messagebox to inform user that DataBase is locked

I've created an app in C# .NET WPF. This app uses an SQLite database, but the database is shared with other programs.
So occasionally my app and the others run simultaneously, and more rarely they attempt to write to the database at the same time, and all the apps crash at that point.
So I want to patch my app to wait until the database is no longer locked before doing anything with it.
I've been thinking about a dirty try/catch loop with a limited number of attempts, but that seems too dirty to me (and a waste of resources).
The other program shows a visual indicator while it is using the database, so I thought a solution could involve user action: when the database is locked, a MessageBox opens telling the user to wait until the other program has finished before clicking OK to continue.
Is there a way to test whether the database is locked without a try/catch?
Following the comments, I tried a try/catch method. I'm still not convinced that this is the cleanest method, because I don't like the idea of waiting for an exception; an expected exception isn't really an exception, in my opinion.
Maybe someone will come up with a solution that doesn't use this subterfuge, but I haven't found a better match for my expectations.
Thank you for your comments.
using SQLite;
using System.Threading;
using System.Windows;

public bool IsDatabaseLocked(string dbPath)
{
    bool locked = true;
    using (SQLiteConnection connection = new SQLiteConnection(dbPath))
    {
        try
        {
            // An exclusive transaction can only begin if no other connection holds a lock.
            connection.Execute("BEGIN EXCLUSIVE");
            connection.Execute("COMMIT");
            locked = false;
        }
        catch (SQLiteException)
        {
            // "database is locked" error: leave locked = true.
        }
    }
    return locked;
}

public void WaitForDbToBeUnlocked(string dbPath)
{
    int i = 0;
    while (IsDatabaseLocked(dbPath))
    {
        Thread.Sleep(500); // short delay to avoid busy-waiting while polling
        i++;
        if (i > 10)
        {
            MessageBox.Show("Please release the database manually or wait");
            i = 0;
        }
    }
}
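Usage would then be a guard right before any write; the Settings table and the parameters in this sketch are placeholders, and note that another process could in principle re-acquire the lock between the check and the write:
WaitForDbToBeUnlocked(dbPath);
using (var connection = new SQLiteConnection(dbPath))
{
    connection.Execute("UPDATE Settings SET Value = ? WHERE Key = ?", newValue, key); // placeholder write
}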
My answer is inspired by this question:
C# - How to detect if SQLite DB is locked?

Database timeout

I have a program that accesses a database and executes different methods that each make a database call.
I used one connection for everything, but it caused a timeout while executing a long task:
I basically had to go through more than 6000 records and execute a stored procedure for each one. I think that caused the timeout, since I used only one database connection for everything.
Then I changed the code so that I open and close the connection for every method I call, with the "using" approach.
How should I handle a method that will be called a lot? Should I open/close the connection every time I access that method?
Or is there a different approach?
I do something like this:
foreach (var record in MyCollection) // 6000 records
{
    using (connection = new SqlConnection(conString))
    {
        singledata = GetSingleData(record);
    }
}
Here is a GetSingleData()
private byte[] GetSingleData(MyObject Data)
{
    byte[] singleData = null;
    SqlDataReader reader = null;
    using (SqlCommand command = ......)
    {
        try
        {
            .......
            // executing the stored proc to get just a single row
            reader = command.ExecuteReader();
            while (reader.Read())
            {
                singleData = (byte[])reader["ColumnName"];
            }
        }
        catch (SqlException ex)
        {
            if (reader != null && !reader.IsClosed)
                reader.Close();
        }
    }
    return singleData;
}
Is this efficient, or could I set up some kind of counter and, for every 500 records, check whether the connection is closed and reopen it if it is?
Thanks.
Try using a persistent connection. Here's a post that might help if you want to try to tune your system (for MySQL):
http://www.mysqlperformanceblog.com/2011/04/19/mysql-connection-timeouts/
Hope that helps.
There is no such thing as the one good way to do something; it all depends. In cases where agility is a must and you need to create ad-hoc solutions, opening and closing a connection in each method call might not be good in theory, but it is accepted in practice.
I urge you to read about these terms and concepts (a sketch of reusing one open connection for the whole loop follows below):
Connection pooling
Bulk operations (bulk update, bulk insert)
They might help you get more performance.
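As an illustration of the persistent-connection suggestion applied to the loop from the question, a minimal sketch; the overload of GetSingleData that accepts the open connection is hypothetical:
using (var connection = new SqlConnection(conString))
{
    connection.Open(); // one open/close for the whole batch; pooling handles the rest
    foreach (var record in MyCollection) // 6000 records, same connection reused
    {
        singledata = GetSingleData(connection, record); // hypothetical overload using the open connection
    }
}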

C# Data Connections Best Practice?

OK, so this is one of those opinionated topics, but based on your knowledge, opinion, and current practice, what is the best way to set up the following scenario?
I'm building an extensive data entry application, and by extensive I mean I've only got the basics set up, which covers around 15-25% of the overall program, and I have about 15 forms that are partially set up (they still need work). I'm using SQL Compact 4.0 as my backend database; I don't really need a more expansive database, as I'm not storing an MMO's worth of data, and for the moment this is only a local application.
I would love to be able to set it up to be displayed as a single window that just changes to various different pages based on a menu system, but I can't seem to find a good tutorial on how that would be accomplished, so if anyone knows of any, please enlighten me.
The scenario in question however, is how to connect to the databases. I'm using 2 SQLCE databases, one that stores constant data that is based on services and staff, and a second that stores the constantly changing data or new data that's entered based on the first database. I have seen many different methods on how to set this up and currently I am using one in which I have a BaseForm that all other forms inherit from. Within the BaseForm I have methods and variables that are common to many forms thus minimizing the amount of code that is being repeated.
This includes the connection strings to both databases, and 2 methods that open a connection to either of them. Like so:
internal SqlCeConnection dataConn = new SqlCeConnection(@"Data Source = |DataDirectory|\opi_data.sdf");
internal SqlCeConnection logConn = new SqlCeConnection(@"Data Source = |DataDirectory|\opi_logs.sdf");
internal SqlCeCommand command;

internal void openDataConnection() // Opens a connection to the data tables
{
    try
    {
        if (dataConn.State == ConnectionState.Closed)
            dataConn.Open();
    }
    catch (SqlCeException ex)
    {
        MessageBox.Show(ex.Message, "Error", MessageBoxButtons.OK, MessageBoxIcon.Error);
    }
}

internal void openLogConnection() // Opens a connection to the log tables
{
    try
    {
        if (logConn.State == ConnectionState.Closed)
            logConn.Open();
    }
    catch (SqlCeException ex)
    {
        MessageBox.Show(ex.Message, "Error", MessageBoxButtons.OK, MessageBoxIcon.Error);
    }
}
Then whenever I need an open connection I simply call the open connection method that corresponds to the database I need access to and then close it in a finally statement. In this way a connection is never open for very long, just when it's needed. Of course this means there are a lot of calls to the open connection methods. So is this the best way to implement this sort of scenario, or are there better ways?
Is it better to just open a connection as soon as a form loads and then close it when the form closes? I have instances where multiple forms are open at a time and each one would probably need an open connection to the databases so if one closes it then the others would be screwed right? Or should I open a connection to both databases upon the application launching? Any input would be appreciated. Thanks.
Connections are pooled by .NET, so re-creating them generally isn't an expensive operation. Keeping connections open for long periods of time, however, can cause issues.
Most "best practices" tell us to open connections as late as possible (right before executing any SQL) and closing them as soon as possible (right after the last bit of data has been extracted).
An effective way of doing this automatically is with using statements:
using (SqlConnection conn = new SqlConnection(...))
{
    using (SqlCommand cmd = new SqlCommand(..., conn))
    {
        conn.Open();
        using (SqlDataReader dr = cmd.ExecuteReader()) // or load a DataTable, ExecuteScalar, etc.
        {
            ...
        }
    }
}
That way, the resources are closed and disposed of even if an exception is thrown.
In short, opening a connection when the app opens or when each form opens is probably not the best approach.
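Applied to the SQL CE setup from the question, one way to follow that advice is a small helper on the BaseForm that creates, opens, and disposes a connection per operation. This is only a sketch: WithDataConnection and the Staff table in the usage comment are hypothetical names, not part of the original code.
internal T WithDataConnection<T>(Func<SqlCeConnection, T> work)
{
    using (var conn = new SqlCeConnection(@"Data Source = |DataDirectory|\opi_data.sdf"))
    {
        conn.Open();
        return work(conn); // the connection is closed and disposed even if work() throws
    }
}

// Usage from any form that inherits BaseForm:
// int staffCount = WithDataConnection(conn =>
// {
//     using (var cmd = new SqlCeCommand("SELECT COUNT(*) FROM Staff", conn))
//         return (int)cmd.ExecuteScalar();
// });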

How can I make NHibernate survive database downtime?

I have a C# console app that I would like to keep running, even when its database crashes. In that case it should poll the database to see when it comes back online, and then resume operation. I have this code for it, that I don't like:
public static T Robust<T>(Func<T> function)
{
    while (true)
    {
        try
        {
            return function();
        }
        catch (GenericADOException)
        {
            Console.WriteLine("SQL Exception. Retrying in 10 seconds");
            Thread.Sleep(10000);
        }
    }
}
[...]
N.Robust(() => Session.CreateCriteria(typeof(MyEntity)).List());
The problem is that I have to insert that pesky N.Robust construct everywhere, which clutters the code. Also, I run the risk of forgetting it somewhere. I have been looking into using NHibernate's EventListeners or Interceptors for it, but haven't been able to make it work. Do I really have to fork NHibernate to make this work?
Update
Alright, so I've been able to overcome one of my two issues. By injecting my own event listeners I can at least ensure that all calls to the database go through the above method.
_configuration.EventListeners.LoadEventListeners
    = new ILoadEventListener[] { new RobustEventListener() };

[...]

public class RobustEventListener : ILoadEventListener
{
    public void OnLoad(LoadEvent e, LoadType type)
    {
        if (!RobustMode)
            throw new ApplicationException("Not allowed");
    }
}
I am still left with a cluttered code base, but I think it's a reasonable price to pay for increasing service uptime.
One architectural approach to tolerating database downtime is to use a queue (client side and/or server side). For reads of static or largely static data, cache on the client side with an expiry window (say 15-30 minutes); a minimal sketch follows below.
This is non-trivial if you have complex database transactions.
Sleeping like you propose is rarely a good idea.
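As an illustration of the client-side cache with an expiry window, a minimal sketch using MemoryCache; the cache key, the 20-minute window, and the commented usage (which reuses Session and MyEntity from the question) are only examples:
using System;
using System.Runtime.Caching;

public static class StaticDataCache
{
    private static readonly MemoryCache Cache = MemoryCache.Default;

    public static T GetOrLoad<T>(string key, Func<T> load, TimeSpan expiry) where T : class
    {
        var cached = Cache.Get(key) as T;
        if (cached != null)
            return cached;

        var value = load(); // hits the database only on a cache miss
        if (value != null)  // MemoryCache does not accept null values
            Cache.Set(key, value, DateTimeOffset.Now.Add(expiry));
        return value;
    }
}

// Usage: once cached, reads of largely static data survive a short outage.
// var entities = StaticDataCache.GetOrLoad("MyEntity.All",
//     () => Session.CreateCriteria(typeof(MyEntity)).List(),
//     TimeSpan.FromMinutes(20));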
Another option (used mainly in occasionally connected applications) is to use database replication. Using an RDBMS with replication support (SQL Server, for example), have your application always talk to the local DB, and let the replication engine deal with synchronization with the remote database automatically when the connection is up. This will probably introduce the issue of conflict management/resolution.
