SqlDependency - Cleaning up STARTED_OUTBOUND conversations - C#

In our application, we need to update certain fields of a building (e.g. whether or not it's already occupied) in real time when any other user of the same application changes those fields for a specific building.
However, this is only necessary if both users are currently viewing the same building. A building is selected in a grid, so when the selected row changes, I create a new SqlDependency object with the correct SqlCommand.
RefreshDependency method:
private void RefreshDependency()
{
    if (/* a row is selected at all and it's a valid building */)
    {
        if (_dependency != null)
        {
            _dependency.OnChange -= OnDependencyChange;
        }
        using (SqlConnection connection = new SqlConnection(_connectionString))
        {
            connection.Open();
            // Query notifications require two-part table names ([dbo].[MyTable]) and an explicit column list.
            // The building ID is passed as a parameter rather than concatenated into the SQL.
            using (SqlCommand command = new SqlCommand(
                "SELECT [MyFields] FROM [dbo].[MyTable] WHERE [BuildingID] = @BuildingID", connection))
            {
                command.Parameters.AddWithValue("@BuildingID", selectedBuilding.ID);
                SqlDependency dependency = new SqlDependency(command);
                _dependency = dependency;
                dependency.OnChange += OnDependencyChange;
                // The command has to be executed for the subscription to become active.
                using (SqlDataReader reader = command.ExecuteReader())
                {
                    reader.Read();
                }
            }
        }
    }
}
OnDependencyChange event:
private void OnDependencyChange(object sender, SqlNotificationEventArgs e)
{
    RefreshDependency();
    //Code to update fields
}
Since the OnChange event is not always called, I detach the OnChange handler at the start of RefreshDependency, rather than inside the handler itself, before attaching it again.
The issue is this: the code works fine and the application updates correctly whenever a change happens in the database, but while looking into the memory leak issue caused by SqlDependency, I noticed that every new SqlDependency creates a new conversation in sys.conversation_endpoints. That's alright so far, but if a user keeps their overview open for a long time and selects, say, 100 buildings over the course of a few hours, then 100 new conversations are added to sys.conversation_endpoints. Only the ones that actually receive a change are ever set to CLOSED; the rest remain in STARTED_OUTBOUND with an extremely long lifetime.
Now, I can clean out the CLOSED ones without issue, but I don't think I can simply do the same for the STARTED_OUTBOUND ones, lest I remove conversations that other users still need open, right? Everyone naturally shares one database.
I'm not sure this is entirely an issue with my code either, since even if only a single dependency is ever created, if no one ever changes anything for that building and the user simply closes the app or the overview (causing SqlDependency.Stop() to be called), that one conversation is still left stuck in STARTED_OUTBOUND.
I've noticed that even if a field is changed much, much later, the database sets all the related conversations to CLOSED as well, no matter how long ago they were created. But since multiple buildings may never receive a change, I'm a bit worried about leaving these conversations unchecked. I know the CLOSED ones are considered a memory leak and have already implemented a fix for those.
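For context, a minimal sketch of the kind of cleanup I mean for the CLOSED ones (simplified; it assumes the app's login can read sys.conversation_endpoints and end conversations directly):
using (SqlConnection connection = new SqlConnection(_connectionString))
{
    connection.Open();
    // Collect the handles of conversations that are already CLOSED.
    var closedHandles = new List<Guid>();
    using (SqlCommand select = new SqlCommand(
        "SELECT conversation_handle FROM sys.conversation_endpoints WHERE state_desc = 'CLOSED'", connection))
    using (SqlDataReader reader = select.ExecuteReader())
    {
        while (reader.Read())
        {
            closedHandles.Add(reader.GetGuid(0));
        }
    }
    // End each one explicitly; WITH CLEANUP removes the endpoint without the normal handshake.
    foreach (Guid handle in closedHandles)
    {
        using (SqlCommand endConversation = new SqlCommand("END CONVERSATION @handle WITH CLEANUP", connection))
        {
            endConversation.Parameters.AddWithValue("@handle", handle);
            endConversation.ExecuteNonQuery();
        }
    }
}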
If this is by design for SqlDependency, should I look into using alternatives such as SqlDependencyEx or SqlTableDependency instead?

Related

Usage of mysqlconnection in C# [duplicate]

OK, so this is one of those opinionated topics, but based on your knowledge, opinion, and current practice, what is the best way to set up the following scenario?
I'm building an extensive data entry application; by extensive I mean I've only got the basics set up, which covers around 15-25% of the overall program, and I have about 15 forms that are partially set up (they still need work). I'm using SQL Server Compact 4.0 as my backend database. I don't really need a more expansive database, as I'm not storing an MMO's worth of data, and for the moment this is only a local application.
I would love to set it up as a single window that simply switches between various pages via a menu system, but I can't seem to find a good tutorial on how that would be accomplished, so if anyone knows of any, please enlighten me.
The scenario in question, however, is how to connect to the databases. I'm using two SQL CE databases: one that stores constant data about services and staff, and a second that stores the constantly changing data, or new data that's entered, based on the first. I have seen many different methods for setting this up; currently I'm using one in which I have a BaseForm that all other forms inherit from. Within the BaseForm I have methods and variables that are common to many forms, thus minimizing the amount of repeated code.
This includes the connection strings to both databases and two methods that open a connection to either of them, like so:
internal SqlCeConnection dataConn = new SqlCeConnection(@"Data Source = |DataDirectory|\opi_data.sdf");
internal SqlCeConnection logConn = new SqlCeConnection(@"Data Source = |DataDirectory|\opi_logs.sdf");
internal SqlCeCommand command;

internal void openDataConnection() // Opens a connection to the data tables
{
    try
    {
        if (dataConn.State == ConnectionState.Closed)
            dataConn.Open();
    }
    catch (SqlCeException ex)
    {
        MessageBox.Show(ex.Message, "Error", MessageBoxButtons.OK, MessageBoxIcon.Error);
    }
}

internal void openLogConnection() // Opens a connection to the log tables
{
    try
    {
        if (logConn.State == ConnectionState.Closed)
            logConn.Open();
    }
    catch (SqlCeException ex)
    {
        MessageBox.Show(ex.Message, "Error", MessageBoxButtons.OK, MessageBoxIcon.Error);
    }
}
Then whenever I need an open connection, I simply call the open-connection method that corresponds to the database I need and then close it in a finally block. That way a connection is never open for very long, just when it's needed. Of course, this means there are a lot of calls to the open-connection methods. So is this the best way to implement this sort of scenario, or are there better ways?
Is it better to just open a connection as soon as a form loads and then close it when the form closes? I have instances where multiple forms are open at a time, and each one would probably need an open connection to the databases, so if one form closes the connection the others would be in trouble, right? Or should I open a connection to both databases when the application launches? Any input would be appreciated. Thanks.
Connections are pooled by .NET, so re-creating them generally isn't an expensive operation. Keeping connections open for long periods of time, however, can cause issues.
Most "best practices" tell us to open connections as late as possible (right before executing any SQL) and closing them as soon as possible (right after the last bit of data has been extracted).
An effective way of doing this automatically is with using statements:
using (SqlConnection conn = new SqlConnection(...))
{
    using (SqlCommand cmd = new SqlCommand(..., conn))
    {
        conn.Open();
        using (SqlDataReader dr = cmd.ExecuteReader()) // or load a DataTable, ExecuteScalar, etc.
        {
            ...
        }
    }
}
That way, the resources are closed and disposed of even if an exception is thrown.
In short, opening a connection when the app opens or when each form opens is probably not the best approach.

best way to load large data from Mysql Database using multiple usercontrols

I am in this situation: (view image)
There are 5 user controls stacked in an empty panel.
I have a vertical menu with several buttons; when I click on a button, I use BringToFront() to display the corresponding control:
private void Button1_Click(object sender, EventArgs e)
{
    tabRandom1.BringToFront();
}
Each user control contains a DataGridView and other elements that have to be loaded with data coming from the database, but I would like that when I click on Button1, only the elements of UserControl1 are loaded.
When I tried:
private async void UserControl1_Load(object sender, EventArgs e)
{
    this.DataGridView1.DataSource = await GetDataAsync();
}
I get an exception:
"There is already an open DataReader associated with this connection which must be closed first."
I am looking for the best way to load the elements of the active user control.
The error you are getting clearly says you already have an open DataReader.
You cannot have more than one open DataReader on a single connection.
To make your database access code more robust, write it with using blocks so objects are disposed automatically, like this:
using (SqlConnection con = new SqlConnection("..."))
{
    con.Open();
    using (SqlCommand cmd = new SqlCommand("...", con))
    {
        using (SqlDataReader dr = cmd.ExecuteReader())
        {
            while (dr.Read())
            {
                // Do the things
            }
        }
    }
    // No need to call Close(); the connection will be disposed.
}
Or, if you opened one connection for, say, the whole class (as I think you did above), just do not wrap SqlConnection con = new Sql... inside a using block, but wrap everything else as shown and you will have no problem, except do not forget to call connection.Close().
I ALWAYS use using for every component of a SQL connection and it does not affect my performance.
When you reorganize the code this way you will get rid of that problem. One more piece of advice: do not load the data when someone opens the form, since you would load it 5 times while the user may only use one control. Instead, create a method like RefreshData() inside your user control, and before you call yourUserControl.BringToFront(), also call yourUserControl.RefreshData(). This way you load data only when it's needed, you always have fresh data, and you have an easy method for refreshing data anywhere, as sketched below.
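A minimal sketch of that pattern, assuming a hypothetical GetDataAsync() helper that opens its own connection per call (all names here are illustrative, not from the original post):
using System.Data;
using System.Data.SqlClient;
using System.Threading.Tasks;
using System.Windows.Forms;

public partial class UserControl1 : UserControl
{
    // Hypothetical refresh method; call it right before BringToFront().
    public async Task RefreshDataAsync()
    {
        dataGridView1.DataSource = await GetDataAsync();
    }

    // Each call opens and disposes its own connection, so no DataReader is ever shared.
    private async Task<DataTable> GetDataAsync()
    {
        var table = new DataTable();
        using (var con = new SqlConnection("...your connection string..."))
        using (var cmd = new SqlCommand("SELECT * FROM SomeTable", con))
        {
            await con.OpenAsync();
            using (var reader = await cmd.ExecuteReaderAsync())
            {
                table.Load(reader);
            }
        }
        return table;
    }
}

// In the button handler:
// await userControl1.RefreshDataAsync();
// userControl1.BringToFront();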

Rollback SQL without transaction

I have a Windows service that uploads data to a database and an MVC app that utilises said service. The way it works today is something like this:
Upload(someStuff);
WriteLog("Uploaded someStuff");
ReadData(someTable);
WriteLog("Reading someTable-data");
Drop(oldValues);
WriteLog("Dropping old values");
private void Upload(var someStuff)
{
    using (var conn = new connection(connectionstring))
    {
        //performQuery
    }
}

private void WriteLog(string message)
{
    using (var conn = etc..)
        //Insert into log-table
}

private string ReadData(var table)
{
    using etc..
    //Query
}

///You get the gist.
The client can then see the current status of the upload through a query to the log-table.
I want to be able to perform a rollback if something fails. My first thought was to use BeginTransaction() and then, at the very end, transaction.Commit(), but that would make my status messages behave badly: they would just go from "starting upload" and then fast-forward to the last step, where it would wait for a long time before "Done".
I want the user to be able to see if the process is stuck on some specific step, but I still want to be able to perform a full rollback if something unexpected happens.
How do I achieve this?
Edit:
I don't seem to have been clear in my question. If I use a separate connection for the logging, that would indeed more or less work. The problem is that the actual code executes super fast, so the status messages would flash past so quickly that the user wouldn't even be able to see them before the final "committing" message, which would take 99% of the upload time.
Design your table so that it has a (P)ending / (A)ctive / (D)eleted status flag. To perform an update, new records are created as Pending (status P). Your very final stage is then to change the current Active records to Deleted and the Pending records to Active (you could do that in a transaction). At your leisure, you can delete the status-D (Deleted) records at some later time.
In the event of an error, the Pending records can simply become Deleted instead. The final swap is sketched below.
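A minimal sketch of that final swap, assuming a hypothetical dbo.Uploads table with a Status column and a BatchID that identifies the rows just uploaded (names are illustrative):
private void PromotePendingBatch(string connectionString, Guid batchId)
{
    using (var conn = new SqlConnection(connectionString))
    {
        conn.Open();
        // Only this final status swap runs in a transaction, so the per-step
        // log messages written earlier stay visible while the upload runs.
        using (SqlTransaction tx = conn.BeginTransaction())
        {
            using (var cmd = new SqlCommand(
                "UPDATE dbo.Uploads SET Status = 'D' WHERE Status = 'A'; " +
                "UPDATE dbo.Uploads SET Status = 'A' WHERE Status = 'P' AND BatchID = @BatchID;",
                conn, tx))
            {
                cmd.Parameters.AddWithValue("@BatchID", batchId);
                cmd.ExecuteNonQuery();
            }
            tx.Commit();
        }
    }
}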

C# using SQL Connection object for timer function

I'm creating a WinForms application in which one form is made transparent. This form is used to show some popup message boxes; using a timer, the form queries the database every second.
Currently I'm creating the database connection inside a using block (the database is Postgres).
Method 1
namespace MyApplication
{
    public partial class frmCheckStatus : Form
    {
        private void timerCheckStatus_Tick(object sender, EventArgs e)
        {
            using (NpgsqlConnection conn = new NpgsqlConnection("My Connection String"))
            {
                conn.Open();
                //Database queries
                //Show popup message
                conn.Close(); //Forcing it to close
            }
        }
    }
}
So every second this connection object is created and disposed.
Note: I'm not using this object for any other purpose, or inside any other forms or methods.
Is it good to create and use a single connection object global to this class, use it inside the timer tick function, and dispose of it in the form close event?
Method 2
namespace MyApplication
{
    public partial class frmCheckStatus : Form
    {
        private NpgsqlConnection conn = new NpgsqlConnection("My Connection String");

        private void timerCheckStatus_Tick(object sender, EventArgs e)
        {
            //Here use the conn object for queries.
            conn.Open();
            //Database queries
            //Show popup message
            conn.Close(); //Forcing it to close
        }

        private void frmCheckStatus_FormClosing(object sender, FormClosingEventArgs e)
        {
            conn.Dispose();
        }
    }
}
Which is better, considering memory, resource usage, execution time, etc.? Please give a proper reason for your choice of method.
Looking at the documentation for your connection class (here), it appears that it supports connection pooling. This means that connections to the same endpoint (same connection string) will reuse existing physical connections rather than incur the overhead of creating new ones.
I'm not familiar with your particular connection, but if the behaviour is anything like the SqlConnection class in ADO.NET, repeatedly creating a new connection with the same connection string should not be particularly expensive (computationally).
As an aside, I would wrap your connection logic in try/finally to ensure the connection gets closed in the event of an application exception, for example as sketched below.
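A minimal sketch of that aside, mirroring Method 1's handler (the query and popup logic are placeholders):
private void timerCheckStatus_Tick(object sender, EventArgs e)
{
    NpgsqlConnection conn = new NpgsqlConnection("My Connection String");
    try
    {
        conn.Open();
        //Database queries
        //Show popup message
    }
    finally
    {
        // Runs even if a query throws, so the connection is always released.
        conn.Close();
    }
}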
I can't see any advantage to instantiating a new connection every time you run a new query. I know it's done often in code, but there is overhead associated with it, however small. If you're running multiple queries from the start of the program to the end of the program, I think you should re-use the existing connection object.
If your goal is to make the connection "disappear" from the server (which I wouldn't generally worry about if this program runs on one machine -- if it runs on dozens, that's another story -- look up PgBouncer), then that should be just as easily accomplished by turning connection pooling off, and then the Close() method would take care of it.
You kind of asked for pros and cons, and while it's not necessarily harmful to instantiate the connection within the loop, I can't imagine how it could be better.
For what it's worth, you may want to consider carrying the connection as a property (preferably outside of the form class, since you may want to eventually use it elsewhere). Something like this:
private NpgsqlConnection _PgConnection;
public NpgsqlConnection PgConnection
{
    get
    {
        if (_PgConnection == null)
        {
            NpgsqlConnectionStringBuilder sb = new NpgsqlConnectionStringBuilder();
            sb.Host = "hostname.whatever.com";
            sb.Port = 5432;
            sb.UserName = "scott";
            sb.Password = "tiger";
            sb.Database = "postgres";
            sb.Pooling = true;
            _PgConnection = new NpgsqlConnection(sb.ToString());
        }
        if (!_PgConnection.State.Equals(ConnectionState.Open))
            _PgConnection.Open();
        return _PgConnection;
    }
    set { _PgConnection = value; }
}
Then, within your form (or wherever you execute your SQL), you can just call the property:
NpgsqlCommand cmd = new NpgsqlCommand("select 1", Database.PgConnection);
...
Database.PgConnection.Close();
And you don't need to worry if the connection is open or closed, or if it's even been created yet.
The only open question would be whether you want that connection to actually disappear on the server, which would be changed by altering the Pooling property, for instance:
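A small sketch of that change, reusing the connection string builder from the property above:
// Disable pooling so that Close() tears down the physical server-side connection.
sb.Pooling = false;
_PgConnection = new NpgsqlConnection(sb.ToString());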

SqlDependency - Records added while processing

I have a question on SqlDependency. Let's assume my application receives a notification when the underlying query data changes, and I plan to select the data from the table, process it, and then resubscribe/start the dependency again. If the processing takes 1-2 minutes, some data may be added during that processing time. I'm not sure how that data will get notified; do I have to wait for the next change to occur, which could be minutes or even hours away?
Below is my sample code; let me know if I am missing something.
Code:
private void LoadNotifications()
{
    DataTable dt = new DataTable();
    using (SqlCommand command = new SqlCommand("SELECT ID FROM dbo.NOTIFICATIONS", m_sqlConn))
    {
        command.Notification = null;
        SqlDependency dependency = new SqlDependency(command);
        dependency.OnChange += new OnChangeEventHandler(OnDependencyChange);
        if (m_sqlConn.State == ConnectionState.Closed)
        {
            m_sqlConn.Open();
        }
        //using (SqlDataReader reader = command.ExecuteReader())
        //{
        //    if (reader.HasRows)
        //    {
        //        // Let's assume this takes 2-3 minutes
        //    }
        //}
    }
}

private void OnDependencyChange(object sender, SqlNotificationEventArgs e)
{
    SqlDependency dependency = sender as SqlDependency;
    dependency.OnChange -= OnDependencyChange;
    LoadNotifications();
}
What you're describing is the typical asynchronous nature of data changes versus app notification. In short, changes may be happening constantly and your app will not see them happen in real time. Furthermore, the changes may be going on whether or not your front-end app is open. Is there a requirement to see the data as it is being changed in order to make decisions in the front-end app, or do you need to review data changes made by other users?
One way you might achieve either is a queue of changes, in the form of a table populated by a trigger on the underlying table. Then code your front-end app to periodically read from that table and mark the rows as read. This decouples the data changes from the app views, and you may see some processing performance gains. A rough sketch of the polling side is below.
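A rough sketch of the polling side (all names are hypothetical): a trigger writes changed IDs into a dbo.NotificationQueue table, and the app reads unread rows on a timer and marks them as read:
private void PollNotificationQueue()
{
    var unread = new List<int>();
    using (var conn = new SqlConnection(_connectionString))
    {
        conn.Open();
        // Pick up everything that has been queued since the last poll.
        using (var select = new SqlCommand(
            "SELECT QueueID FROM dbo.NotificationQueue WHERE IsRead = 0", conn))
        using (var reader = select.ExecuteReader())
        {
            while (reader.Read())
            {
                unread.Add(reader.GetInt32(0));
            }
        }
        // Mark the rows we have seen so the next poll only returns new changes.
        foreach (int queueId in unread)
        {
            using (var update = new SqlCommand(
                "UPDATE dbo.NotificationQueue SET IsRead = 1 WHERE QueueID = @QueueID", conn))
            {
                update.Parameters.AddWithValue("@QueueID", queueId);
                update.ExecuteNonQuery();
            }
        }
    }
    // Process the collected IDs here; rows added while this runs are simply
    // picked up on the next poll, so nothing is missed.
}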
