C# - Postgres - Memory leak issue over time

Problem: memory usage accumulates over time and eventually reaches 99% of capacity.
I previously asked a question on the same topic and applied the suggestions from the answer, but I am still experiencing a memory leak, and I really do not understand where the memory is accumulating from. Watching Windows Task Manager, I can see that memory is periodically freed, but it accumulates faster than it is cleared, so usage eventually reaches 99%.
Here is my C# code:
var connString = "Host=x.x.x.x;Port=5432;Username=postgres;Password=password;Database=database";
@Info.Trace("PostGre ");
using (var conn = new Npgsql.NpgsqlConnection(connString))
{
    conn.Open();
    int ctr = 0;
    // Insert some data
    using (var cmd = new Npgsql.NpgsqlCommand())
    {
        cmd.Connection = conn;
        var par_1 = cmd.Parameters.Add("@r", NpgsqlTypes.NpgsqlDbType.Timestamp);
        var par_2 = cmd.Parameters.Add("@p", NpgsqlTypes.NpgsqlDbType.Double);
        while (@tag.TerminateTimeScaleLoop == 100)
        {
            @Info.Trace("Pushed Data: PostGre A " + ctr.ToString());
            cmd.CommandText = "INSERT INTO TORQX VALUES (@r,@p)";
            par_1.Value = System.DateTime.Now.ToUniversalTime();
            par_2.Value = @Tag.RigData.Time.TORQX;
            cmd.ExecuteNonQuery();
            ctr = ctr + 1;
        }
    }
    @Info.Trace("Pushed Data: PostGre A Terminated");
    conn.Close();
}
What is causing the memory to accumulate? Can I prevent it from accumulating? If not, can I free the memory manually, and what code would do that?
I have practically no experience with C#; I was assigned to hotfix this code because the person who wrote it is not available right now. I have lots of experience with Python but none with C#, so please spell out your suggestions explicitly, otherwise I will have no clue. Thanks!

Related

SQLite random database connection crash

I am writing a database export solution that converts a database, exporting and processing a few hundred gigabytes of data in a multithreaded environment. All threads work in their own environment with their own connections, but the export is orchestrated through a shared SQLite "lookup" database that all threads use to distribute work and map a specific field to a new id. Lookups happen quite rarely (about once every 50k exported rows), so they should not really slow the process even if locks are used.
For some reason, about every 10-15 minutes this lookup database throws a "could not open database" exception with error code 14. The exception occurs at random on any of the ExecuteReader() calls. I tried to lock all methods accessing this database, but it still crashes every 10-15 minutes. Why? Strangely, when I just press Resume in the debugger, everything works again, so it seems to be a temporary problem.
ExecuteLookup() is called repeatedly by the "main" exporting thread.
DBQueryLookupDb() is called by any of the worker threads (with something like INSERT INTO progress ...).
// using Microsoft.Data.Sqlite
private long ExecuteLookup(string value)
{
    lock (this)
    {
        // GetSelectLookupCmd() builds "SELECT id, original FROM lookup_id WHERE original = @original"
        // with the parameter already added; I tried re-using this command, but the same problem occurs.
        using (var cmd = GetSelectLookupCmd())
        {
            cmd.Parameters["@original"].Value = value;
            using (var res = cmd.ExecuteReader())
            {
                if (res.Read())
                {
                    return res.GetInt64(0);
                }
            }
        }
        using (var cmd = GetInsertLookupCmd()) // INSERT INTO ...; SELECT last_insert_rowid();
        {
            cmd.Parameters["@original"].Value = value;
            using (var res = cmd.ExecuteReader())
            {
                if (res.Read())
                {
                    return res.GetInt64(0);
                }
                else
                {
                    throw new Exception("Unexpected fail on lookup insert");
                }
            }
        }
    }
}

public void DBQueryLookupDb(string sql)
{
    lock (this)
    {
        using (SqliteCommand c = new SqliteCommand())
        {
            c.Connection = lookupDb;
            c.CommandType = System.Data.CommandType.Text;
            c.CommandText = sql;
            c.ExecuteNonQuery();
        }
    }
}
"PRAGMA journal_mode = 'PERSIST'" resolves the problem; TRUNCATE may work too. It appears to be a sync problem coming from the network drive: SQLite deletes the journal and immediately creates a new one before the network drive is ready again.

Sql Connection Pool timeout

[Disclaimer]: I think I have read every Stack Overflow post about this already.
I have been breaking my head over this for quite some time now. I am getting the following exception in my ASP.NET Web API:
Exception thrown: 'System.InvalidOperationException' in mscorlib.dll
Additional information: Timeout expired. The timeout period elapsed prior to obtaining a connection from the pool. This may have occurred because all pooled connections were in use and max pool size was reached.
Most people suggested that I look for leaked connections in my application. Here is my code; I am now sure that I am not leaking any connections:
public async Task<IEnumerable<string>> Get()
{
    var ds = new DataSet();
    var constring = "Data Source=xxx;Initial Catalog=xxx;User Id=xxx;Password=xxx;Max Pool Size=100";
    var asyncConnectionString = new SqlConnectionStringBuilder(constring)
    {
        AsynchronousProcessing = true
    }.ToString();
    using (var con = new SqlConnection(asyncConnectionString))
    using (var cmd = new SqlCommand("[dbo].[xxx]", con))
    {
        cmd.CommandType = CommandType.StoredProcedure;
        cmd.Parameters.AddWithValue("@x1", 1);
        cmd.Parameters.AddWithValue("@x2", "something");
        await con.OpenAsync();
        using (var rdr = await cmd.ExecuteReaderAsync())
        {
            if (rdr.HasRows)
            {
                ds.Load(rdr, LoadOption.OverwriteChanges, "MyTable");
            }
            rdr.Close();
            con.Close();
            ds.Dispose();
        }
    }
    // I know this looks wrong, just an empty api method to show the code
    return new string[] { "value1", "value2" };
}
The exception does not occur when I use my local SQL Server; it only happens when I connect to our 'test server'. Is there anything else I can look at to resolve this issue, such as SQL Server settings or network settings?
The stored procedure I call does not lock up the DB; I have checked for that as well. If it did, it should have failed on my local SQL instance too.
I am using JMeter to generate load, with 1500 threads (users). Surely I should be able to handle far more than that.
Thanks in advance.
You have not specified any Connection Timeout property, so it defaults to 15 seconds. Using Max Pool Size=100 is not a good idea unless you have the hardware resources to support it.
You started 1500 threads, so some of those threads keep waiting up to 15 seconds for their chance to open a connection, and when that time runs out you get the connection timeout error.
So I think increasing the 'Connection Timeout' property in the connection string may resolve your issue.
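For example, a sketch of the adjusted connection string (the other values remain placeholders; the timeout is in seconds and defaults to 15):
// "Connection Timeout" (also accepted as "Connect Timeout") is in seconds.
var constring = "Data Source=xxx;Initial Catalog=xxx;User Id=xxx;Password=xxx;" +
                "Max Pool Size=100;Connection Timeout=60";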

Memory Problems in C# using while(true)

I would like to write a client in C# that checks whether a user is logged in on different clients. The client should run 24/7 and refresh a database with some state information for each client.
My problem: the command line tool takes more and more memory, so I think I am allocating memory somewhere that never gets released.
I suspect it is the ManagementScope I am creating, but I cannot call a Dispose() method on it.
Here is my code:
static void Main(string[] args)
{
    Ping pingSender = new Ping();
    PingOptions options = new PingOptions();
    string sqlconnectionstring = "Data Source=(local)\\SQLEXPRESS;Initial Catalog=clientstat;User ID=...;Password=....;Integrated Security=SSPI";
    SqlConnection clientread = new SqlConnection(sqlconnectionstring);
    clientread.Open();

    // Use the default Ttl value which is 128,
    // but change the fragmentation behavior.
    options.DontFragment = true;
    string username = "";

    // Create a buffer of 32 bytes of data to be transmitted.
    string data = "aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa";
    byte[] buffer = Encoding.ASCII.GetBytes(data);
    int timeout = 120;

    while (true)
    {
        SqlCommand clientcommand = new SqlCommand("SELECT * FROM Client WHERE StateID = @stateid", clientread);
        clientcommand.Parameters.Add(new SqlParameter("@stateid", 1));
        SqlDataReader clientreader = clientcommand.ExecuteReader();
        while (clientreader.Read())
        {
            string ipadress = Convert.ToString(clientreader["IP"]);
            string clientid = Convert.ToString(clientreader["ID"]);
            if (ipadress != string.Empty && clientid != string.Empty)
            {
                // First try to ping the computer
                PingReply reply = pingSender.Send(ipadress, timeout, buffer, options);
                if (reply.Status == IPStatus.Success)
                {
                    try
                    {
                        ManagementScope managementScope = new ManagementScope(@"\\" + ipadress + @"\root\cimv2");
                        managementScope.Options.Username = "....";
                        managementScope.Options.Password = "...";
                        managementScope.Options.EnablePrivileges = true;
                        // ObjectQuery to check if a user is logged on
                        ObjectQuery objectQuery = new ObjectQuery("SELECT * FROM Win32_ComputerSystem");
                        ManagementObjectSearcher managementObjectSearcher = new ManagementObjectSearcher(managementScope, objectQuery);
                        ManagementObjectCollection querycollection = managementObjectSearcher.Get();
                        foreach (ManagementObject mo in querycollection)
                        {
                            // Check the user name here
                            username = Convert.ToString(mo["UserName"]);
                            if (username != "")
                            {
                                Console.WriteLine(ipadress + " " + username);
                            }
                        }
                        querycollection.Dispose();
                        managementObjectSearcher.Dispose();
                    }
                    catch (Exception x)
                    {
                        Console.WriteLine(x.Message);
                    }
                }
            }
            else
            {
                Console.WriteLine(clientid + " has no IP-Adress in Database");
            }
        }
        clientcommand.Dispose();
        clientreader.Close();
        clientreader.Dispose();
    }
}
Any ideas or suggestions for what I can improve here, or what exactly the problem could be?
Thanks in advance.
Idea 1:
You have to Dispose the ManagementObject to release its unmanaged COM resources.
Unfortunately, there is a bug in its Dispose implementation. Here are more details about it.
Credit goes to this answer, which provides a workaround using GC.Collect(). Unfortunately, it costs.
That is why it is better to use a counter and perform the GC.Collect() only every n loops, with an n value you tune manually until performance is acceptable.
In any case, I would try to invoke the ManagementObject's Dispose() using reflection.
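A sketch of both workarounds combined, assuming the question's loop (loopCounter and the threshold of 100 are placeholders to tune):
int loopCounter = 0; // hypothetical counter kept across outer-loop iterations
// ...
foreach (ManagementObject mo in querycollection)
{
    username = Convert.ToString(mo["UserName"]);
    // Invoke the base-class Dispose() via reflection to work around
    // the broken Dispose() on ManagementObject.
    typeof(System.Management.ManagementBaseObject)
        .GetMethod("Dispose", System.Type.EmptyTypes)
        .Invoke(mo, null);
}
// Force a collection only every n iterations to limit the cost.
loopCounter++;
if (loopCounter % 100 == 0)
{
    GC.Collect();
}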
Idea 2:
In general, re-using one opened connection for several queries is not good, since it prevents the connection pooling mechanism from working optimally. The SqlConnection may therefore retain resources when used this way.
Instead, please include the SqlConnection create/open and close/dispose in the loop, as in this related question.
You should use using (and not invoke Dispose() yourself; it's not needed). The "new" issue would be the nesting, which will look like this:
using (SqlConnection ...)
{
    using (SqlCommand ...)
    {
        using (SqlDataReader ...)
        {
            ...
        }
    }
}
Basically, if you are instantiating something that implements IDisposable, wrap it in a using and be assured that .NET will handle the memory for you (at least, it will try to).
Try adding a GC.Collect() call after each top-level iteration (just to diagnose the issue) and see if the memory behaves the same. If not, you don't have an issue; the GC might just be optimistic and delay collections.
Each iteration uses a non-trivial amount of space for the data reader buffers and the like, so if those simply haven't been collected yet, you will observe memory steadily increasing.
It is just a false alarm, though. If your system becomes memory constrained, or the app crosses some internal GC threshold, collection will happen just fine.
The reason you are seeing this is that you are using a while(true) statement without a break; anywhere in the loop to exit it explicitly, so the loop runs indefinitely and keeps consuming memory. I think you should use a Windows Service instead of while(true) to run it 24/7 and perform its work.
I'm not sure why you're creating a new SqlCommand on each loop iteration.
Just parameterize the SqlCommand once, and in each loop iteration set the parameter values rather than creating a new SqlCommand.
Do that, and see how the memory looks. Remember one more thing: the GC won't kick in until it kicks in (i.e. non-determinism rules are in effect), unless you really want to run GC.Collect() in your loop (sheer madness, IMHO). In other words, the constant creation and disposal of objects is probably what makes the memory grow; a naive Dispose isn't going to make the memory magically shrink. Keep the memory management model of .NET in mind and you should be all right.
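A minimal sketch of that pattern, reusing the question's connection and query (the point is only that the command is created once, outside the loop):
// Create and parameterize the command once.
SqlCommand clientcommand = new SqlCommand(
    "SELECT * FROM Client WHERE StateID = @stateid", clientread);
clientcommand.Parameters.Add("@stateid", SqlDbType.Int);

while (true)
{
    // Only the parameter value changes per iteration.
    clientcommand.Parameters["@stateid"].Value = 1;
    using (SqlDataReader clientreader = clientcommand.ExecuteReader())
    {
        while (clientreader.Read())
        {
            // ... process rows as before ...
        }
    }
}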

Move record insert to SQL Server CE database into another thread? C# Compact Framework 3.5

I had some trouble trying to move a piece of code into another thread to increase performance.
I have the following code (with the thread additions marked by comments), where I parse a large XML file (final goal: 100,000 rows) and then write it to a SQL Server CE 3.5 database file (.sdf) using a record and insert (SqlCeResultSet/SqlCeUpdatableRecord).
Two lines of code in the if statement inside the while loop,
xElem = (XElement)XNode.ReadFrom(xmlTextReader);
and
rs.Insert(record);
take about the same amount of time to execute. I was thinking of running rs.Insert(record); while I parse the next line of the XML file. However, I have been unable to do it using either Thread or ThreadPool.
I have to make sure that the record I pass to the thread is not changed until rs.Insert(record); has finished executing in the existing thread. I therefore tried to place thread.Join() before writing the new record (record.SetValue(i, values[i]);), but I still get a conflict when I run the program: it crashes with a bunch of errors caused by trying to write the identical row several times (especially for the index).
Can anyone help me with some advice? How can I move rs.Insert(record); into another thread to increase performance?
XmlTextReader xmlTextReader = new XmlTextReader(modFunctions.InFName);
XElement xElem = new XElement("item");
using (SqlCeConnection cn = new SqlCeConnection(connectionString))
{
    if (cn.State == ConnectionState.Closed)
        cn.Open();
    using (SqlCeCommand cmd = new SqlCeCommand())
    {
        cmd.Connection = cn;
        cmd.CommandText = "item";
        cmd.CommandType = CommandType.TableDirect;
        using (SqlCeResultSet rs = cmd.ExecuteResultSet(ResultSetOptions.Updatable))
        {
            SqlCeUpdatableRecord record = rs.CreateRecord();
            // Thread code addition
            Thread t = new Thread(new ThreadStart(() => rs.Insert(record)));
            while (xmlTextReader.Read())
            {
                if (xmlTextReader.NodeType == XmlNodeType.Element &&
                    xmlTextReader.LocalName == "item" &&
                    xmlTextReader.IsStartElement() == true)
                {
                    xElem = (XElement)XNode.ReadFrom(xmlTextReader);
                    values[0] = (string)xElem.Element("Index"); // 0
                    values[1] = (string)xElem.Element("Name");  // 1
                    ~~~
                    values[13] = (string)xElem.Element("Notes"); // 13
                    // Thread code addition -- wait until the previous thread finishes
                    if (ThreadStartedS == 1)
                    {
                        t.Join();
                    }
                    // SetValue for each field of the record
                    for (int i = 0; i < values.Length; i++)
                    {
                        record.SetValue(i, values[i]); // 0 to 13
                    }
                    // Thread code addition -- start thread to execute rs.Insert(record)
                    ThreadStartedS = 1;
                    t.Start();
                    // Original code without threads
                    // Insert record
                    //rs.Insert(record);
                }
            }
        }
    }
}
If all of your processing is done on the device (reading the XML file on the device, then parsing the data on the device), you will see no performance increase from threading this work.
These Windows Mobile devices have only a single processor, so multithreading on them means one piece of work runs for a while, then another runs for a while; two threads never truly run simultaneously.
On the other hand, if the data from your XML file were located on a remote server, you could fetch it in chunks. As each chunk arrives, you could process it in another thread while the main thread waits for the next chunk to arrive; a sketch of that pattern follows below.
If all of this work is being done on one device, you will not have good luck with multithreading.
You can still display a progress bar (from 0 to NumberOfRecords) with a cancel button, so the person waiting for the data collection to complete does not go insane with anticipation.
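For the remote, chunked scenario, a minimal producer/consumer sketch that stays within .NET Compact Framework 3.5 APIs (FetchNextChunk and ProcessChunk are hypothetical placeholders for the download and the parse/insert work):
// Shared queue guarded by a lock: the main thread downloads chunks,
// a single worker thread processes them.
Queue<byte[]> chunks = new Queue<byte[]>();
object gate = new object();
bool done = false;

Thread worker = new Thread(() =>
{
    while (true)
    {
        byte[] chunk;
        lock (gate)
        {
            while (chunks.Count == 0 && !done)
                Monitor.Wait(gate);      // sleep until a chunk arrives
            if (chunks.Count == 0)
                return;                  // done and queue drained
            chunk = chunks.Dequeue();
        }
        ProcessChunk(chunk);             // hypothetical: parse and insert the chunk
    }
});
worker.Start();

byte[] next;
while ((next = FetchNextChunk()) != null) // hypothetical: blocks on the network
{
    lock (gate)
    {
        chunks.Enqueue(next);
        Monitor.Pulse(gate);             // wake the worker
    }
}
lock (gate) { done = true; Monitor.Pulse(gate); }
worker.Join();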

SQLite DB Insert Very Slow

I am using an SQLite database and inserting records into it. This takes a hugely long time! I have seen people say they can process a couple of thousand records in a minute. I have around 2400 records, and each one takes 30 seconds to 2 minutes to complete. Recreating the database is not an option. I have tried to create one transaction in different ways. I need to use the timer, because I am using a ProgressBar to show that something is happening. Here is the code I am using:
string con;
con = string.Format(@"Data Source={0}", documentsFolder);
SQLiteConnection sqlconnection = new SQLiteConnection(con);
SQLiteCommand sqlComm = sqlconnection.CreateCommand();
sqlconnection.Open();
SQLiteTransaction transaction = sqlconnection.BeginTransaction();
Timer timer2 = new Timer();
timer2.Interval = 1000;
timer2.Tick += (source, e) =>
{
    URL u = firefox.URLs[count2];
    string newtitle = u.title;
    form.label1.Text = count2 + "/" + pBar.Maximum;
    string c_urls = "insert or ignore into " + table
        + " (id, url, title, visit_count, typed_count, last_visit_time, hidden) values ("
        + dbID + ",'" + u.url + "','" + newtitle + "',1,1, "
        + ToChromeTime(u.visited) + ", 0)";
    string c_visited = "insert or ignore into " + table2
        + " (id, url, visit_time, transition) values ("
        + dbID2 + "," + dbID + "," + ToChromeTime(u.visited) + ",805306368)";
    sqlComm = new SQLiteCommand(c_urls, sqlconnection);
    sqlComm.ExecuteNonQuery();
    sqlComm = new SQLiteCommand(c_visited, sqlconnection);
    sqlComm.ExecuteNonQuery();
    dbID++;
    dbID2++;
    pBar.Value = count2;
    if (pBar.Maximum == count2)
    {
        pBar.Value = 0;
        timer2.Stop();
        transaction.Commit();
        sqlComm.Dispose();
        sqlconnection.Close();
        sqlconnection.Dispose();
    }
    count2++;
};
timer2.Start();
What am I doing wrong?
This is what I would address, in order. It may or may not fix the problem, but it won't hurt to check (and it might just do some magic):
Ensure the database is not being contended by other updates (from another thread, process, or even another timer!). Writers acquire locks, and unclosed or over-long transactions can interact in bad ways. (For inserts that take "30 seconds to 2 minutes" I would suspect a problem obtaining locks. Also ensure the media the DB is on is adequate, e.g. a local drive.)
The transaction is not actually being used (??). Move the transaction inside the timer callback, attach it to the appropriate SQLiteCommands, and dispose of it before the callback ends (use using).
Not all SQLiteCommands are being disposed correctly. Dispose of each and every one. (The use of using simplifies this; do not let a command bleed past the callback.)
Placeholders are not being used. Not only are they simpler and easier to use, they are also slightly friendlier to SQLite and the adapter.
(Example only; there may be errors in the following code.)
// It's okay to keep long-running SQLite connections.
// In my applications I have a single application-wide connection.
// The more important thing is watching thread access and transactions.
// In any case, we can keep this here.
SQLiteConnection sqlconnection = new SQLiteConnection(con);
sqlconnection.Open();

// In the timer event - remember this is on the /UI/ thread.
// DO NOT ALLOW CROSS-THREAD ACCESS TO THE SAME SQLite CONNECTION.
// (You have been warned.)
URL u = firefox.URLs[count2];
string newtitle = u.title;
form.label1.Text = count2 + "/" + pBar.Maximum;

// This transaction is ONLY kept about for this timer callback.
// Great care must be taken with long-running transactions in SQLite.
// SQLite does not have good support for (long-running) concurrent writers,
// because it must obtain exclusive file locks.
// There are no table/row locks!
using (var transaction = sqlconnection.BeginTransaction())
{
    try
    {
        // using ensures cmd will be disposed as appropriate.
        using (var cmd = sqlconnection.CreateCommand())
        {
            cmd.Transaction = transaction;
            // Using placeholders is cleaner. It shouldn't be an issue to
            // re-create the SQLiteCommand, because it can be cached in the
            // adapter/driver (although I could be wrong on this; anyway,
            // it's not "this issue" here).
            cmd.CommandText = "insert or ignore into " + table
                + " (id, url, title, visit_count, typed_count, last_visit_time, hidden)"
                + " values (@dbID, @url, 'etc, add other parameters')";
            // Add each parameter; easy-peasy.
            cmd.Parameters.AddWithValue("@dbID", dbID);
            cmd.Parameters.AddWithValue("@url", u.url);
            // .. add other parameters
            cmd.ExecuteNonQuery();
        }
        // Do the same for the other command (it runs in the same TX),
        // then commit the TX.
        transaction.Commit();
    }
    catch
    {
        // Or fail the TX and propagate the exception ..
        transaction.Rollback();
        throw;
    }
}

if (pBar.Maximum == count2)
{
    pBar.Value = 0;
    timer2.Stop();
    // All the other SQLite resources are already cleaned up!
    sqlconnection.Close();
    sqlconnection.Dispose();
}
I'm not sure if this is your problem, but your general pattern of using ADO.NET is wrong: you shouldn't create new command(s) for each insert (and repeatedly pay for query preparation).
Instead, do the following (see the sketch below):
Before the loop:
Create the command(s) once.
Create the appropriate bound parameters.
In the loop:
Just assign appropriate values to the bound parameters.
And execute the command(s).
You could also consider using less fine-grained transactions: try putting several inserts in the same transaction to minimize the cost you pay for transaction durability.
You might also want to take a look at this post.
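A minimal sketch of that pattern with System.Data.SQLite, reusing the question's variables (the parameter names are illustrative):
using (var tx = sqlconnection.BeginTransaction())
using (var cmd = new SQLiteCommand(
    "insert or ignore into " + table
    + " (id, url, title, visit_count, typed_count, last_visit_time, hidden)"
    + " values (@id, @url, @title, 1, 1, @visited, 0)", sqlconnection, tx))
{
    // Created once: the statement is prepared a single time.
    cmd.Parameters.Add("@id", DbType.Int64);
    cmd.Parameters.Add("@url", DbType.String);
    cmd.Parameters.Add("@title", DbType.String);
    cmd.Parameters.Add("@visited", DbType.Int64);

    foreach (URL u in firefox.URLs)
    {
        // Re-bound per row: only the values change.
        cmd.Parameters["@id"].Value = dbID++;
        cmd.Parameters["@url"].Value = u.url;
        cmd.Parameters["@title"].Value = u.title;
        cmd.Parameters["@visited"].Value = ToChromeTime(u.visited);
        cmd.ExecuteNonQuery();
    }
    tx.Commit();
}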
You can try one of the following to improve performance (a sketch applying the last two follows below):
Wrap all the inserts in a transaction - this can help reduce the number of actual writes to the DB.
Use WAL - the write-ahead log is a journaling mode that speeds up writes and enables concurrency. (Not recommended if your DB is in a network location.)
Synchronous NORMAL - the synchronous mode dictates the frequency at which data is actually flushed to physical storage (fsync() calls). This flush can be time-consuming on some machines, so the frequency at which it occurs is critical. Make sure to explicitly open connections with "Synchronous=NORMAL", which is ideal for most scenarios. There is a huge difference between synchronous mode FULL and NORMAL (NORMAL is ~1000 times better).
Find more details in a similar post => What changed between System.Data.SQLite version 1.0.74 and the most recent 1.0.113?
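For reference, a sketch of applying both settings with System.Data.SQLite (the file name is a placeholder; connection-string keyword support depends on the adapter version):
// Journal mode and synchronous mode can be set in the connection string...
var con = "Data Source=places.db;Journal Mode=WAL;Synchronous=Normal";
using (var sqlconnection = new SQLiteConnection(con))
{
    sqlconnection.Open();
    // ...or issued as pragmas after opening.
    using (var cmd = new SQLiteCommand(
        "PRAGMA journal_mode=WAL; PRAGMA synchronous=NORMAL;", sqlconnection))
    {
        cmd.ExecuteNonQuery();
    }
}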
