"Too many connections" error in MySQL - C#

I am developing an app using C# and MySQL (stored procedures). After running the app for a certain time it shows a "Too many connections" error. Then I used commands like SHOW STATUS WHERE variable_name = 'Threads_connected'; and SHOW PROCESSLIST; to debug the problem. It seems that each time I run any action in my app, MySQL creates a new thread and the thread is marked as Sleep; moreover, the thread does not close in time. I found one suggested solution, i.e. setting the MySQL server variables as below.
interactive_timeout=180
wait_timeout=180
Does this solution have any impact on the app, since it automatically kills connections? What happens if fetching data from the database takes a bit longer?
I am expecting heavy traffic, about 1000 at a time. So what should the maximum connection number in MySQL be? Will that degrade MySQL's performance?
[Note: there is no problem in my app itself, as I have closed every MySQL connection.]
Thanks in advance.

I hope the article below will help you answer your second point:
http://dev.mysql.com/doc/refman/5.5/en/too-many-connections.html
In cases where an application doesn't close connections properly, wait_timeout is an important parameter to tune: it discards unused or idle connections, minimizing the number of active connections to your MySQL server – and this ultimately helps to avoid the "Too many connections" error.
Threads_running is a valuable metric to monitor because it doesn't count sleeping threads: it shows only the queries that are actively executing, while the Threads_connected status variable shows all connected threads, including idle connections.
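On the application side, the usual way to keep the Threads_connected count flat is to open the connection only for the duration of each call and dispose it immediately so it goes back to the connector's pool. A minimal sketch, assuming the MySql.Data connector; the get_orders procedure name is a hypothetical placeholder:

    using System.Data;
    using MySql.Data.MySqlClient;

    static class OrderRepository
    {
        // "get_orders" is a hypothetical stored procedure name, for illustration only.
        public static void LoadOrders(string connectionString)
        {
            using (var conn = new MySqlConnection(connectionString))
            using (var cmd = new MySqlCommand("get_orders", conn))
            {
                cmd.CommandType = CommandType.StoredProcedure;
                conn.Open();
                using (var reader = cmd.ExecuteReader())
                {
                    while (reader.Read())
                    {
                        // process each row here
                    }
                }
            } // Dispose() closes the connection and hands it back to the pool, so the
              // server is not left with an ever-growing list of sleeping threads.
        }
    }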

Related

Increasing performance with a service and multithreading in C#

I am doing a project that needs to communicate with 20 small computer boards. I will need to keep track of their connections, and they will also return some data to me. So my aim is to build a control/monitoring system for these boards.
I will be using Visual Studio 2010 and C# WPF.
My idea/plan is like this:
On the main thread:
There will be only one control window, so a main thread will be created mainly to update the data to be displayed. Data from each board will be displayed and refreshed at an interval of 1 s. The source of the data will be a database in which the main thread will look for the latest values (I have not decided which kind of database to use yet).
There will be control buttons on the control window too. I already have a .dll library, so I will only need to call the functions inside it to direct the boards to act (by starting another thread).
There will be two services:
(Timer service) One will be a scheduled timer to turn the boards on/off at specific times. Users would be able to change the on/off times. It would read the on/off times from the database.
(Connection service) The other will be responsible for requesting and receiving information/status from the boards every 30 s or less. The work includes connecting to each board over the Internet, asking for data, receiving the data and then writing the data to the database, and also logging the exceptions thrown if the Internet connection fails.
My questions:
1) For the connection service, I am wondering if I should start 20 threads to do this, one thread per board connection. If the connections were made by only one thread, each board connection would have to wait for the previous one to finish, which may add 1-2 minutes per board, so I would need around 20-40 minutes to get all the data back. But if I split the work across 20 threads, will it make a big difference in performance? The 20 threads never die; they keep asking for data every 30 s if possible. Besides, does that mean I will have to have 20 databases, since the database would clash if 20 threads are writing to it at the same time?
2) For updating the display of data on the main thread every 1 s, should I also start a service to do this? And since the connection service is accessing the same database, will this clash with the database?
There will be more than 100 boards to control and monitor in the future, so I would like to make the program as light as possible.
Thank you very much! Comments and ideas very much appreciated!
Starting 20 threads would be the best bet (or, as Ralf said, use a thread when needed; in your specific case it would probably be 20 at some point). Most databases are thread-safe, meaning you can write to them from separate threads. If you use a "real" database, this isn't an issue at all.
No, use a Timer on the main thread to update your UI. The UI can easily read from the DB. As long as the update action itself does not take a lot of time, it is OK to do it on the UI thread.
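A minimal WPF sketch of that timer-on-the-UI-thread refresh, assuming a 1 s interval; RefreshFromDatabase is a hypothetical method standing in for the actual database read:

    using System;
    using System.Windows;
    using System.Windows.Threading;

    public partial class MainWindow : Window
    {
        private readonly DispatcherTimer _refreshTimer;

        public MainWindow()
        {
            InitializeComponent();

            // DispatcherTimer ticks on the UI thread, so the handler can update
            // bound controls directly without Dispatcher/Invoke calls.
            _refreshTimer = new DispatcherTimer { Interval = TimeSpan.FromSeconds(1) };
            _refreshTimer.Tick += (s, e) => RefreshFromDatabase();
            _refreshTimer.Start();
        }

        private void RefreshFromDatabase()
        {
            // Hypothetical: read the latest row for each board and update the
            // view model. Keep this quick so the UI thread stays responsive.
        }
    }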
1) Why not use threads when needed? You can use one DBMS; they are built to process large amounts of information.
2) Not sure what you mean by starting a service for the UI thread. As with 1), database management systems are built to process data.

Halt LINQ query if it will take 'too' long

Currently I need to create a reporting program that runs reports on many different tables within a SQL database. Multiple clients require this functionality, but some clients have larger databases than others. What I would like to know is whether it is possible to halt a query after a period of time if it has been taking 'too' long.
To give some context, some clients have tables in excess of 2 million rows, although a different client may have only 50k rows in the same table. I want to be able to run the query for, say, 20 seconds and, if it has not finished by then, return a message to the user saying that the result set will be too large and the report needs to be generated outside of hours, as we do not want to run resource-intensive operations during the day.
Set the timeout on either your connection string or on the DataContext via the CommandTimeout property. When the timeout expires, you will get a TimeoutException and your query will be cancelled.
You cannot be sure that the query is cancelled on the server the very instant the timeout occurs, but in most cases it will cancel rather quickly. For details read the excellent article "There's no such thing as a query timeout...". The important part from there is:
A client signals a query timeout to the server using an attention event. An attention event is simply a distinct type of TDS packet a SQL Server client can send to it. In addition to connect/disconnect, T-SQL batch, and RPC events, a client can signal an attention to the server. An attention tells the server to cancel the connection's currently executing query (if there is one) as soon as possible. An attention doesn't rollback open transactions, and it doesn't stop the currently executing query on a dime -- the server aborts whatever it was doing for the connection at the next available opportunity. Usually, this happens pretty quickly, but not always.
But remember, it will differ from provider to provider and it might even be subject to change between server versions.
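A minimal sketch of the CommandTimeout approach, assuming LINQ to SQL; ReportDataContext and its Orders table are hypothetical stand-ins for the real model, and on SqlClient a timed-out command typically surfaces as a SqlException:

    using System;
    using System.Data.SqlClient;
    using System.Linq;

    static class ReportRunner
    {
        public static void RunReport(string connectionString, DateTime reportStart)
        {
            using (var db = new ReportDataContext(connectionString)) // hypothetical LINQ to SQL context
            {
                db.CommandTimeout = 20; // seconds; applies to commands issued by this context

                try
                {
                    int rowCount = db.Orders.Count(o => o.CreatedOn >= reportStart);
                    Console.WriteLine("Rows in range: " + rowCount);
                    // ... build the rest of the report ...
                }
                catch (SqlException ex)
                {
                    // Timed-out commands land here; tell the user to generate the
                    // report outside of hours instead.
                    Console.WriteLine("Report query exceeded 20 seconds: " + ex.Message);
                }
            }
        }
    }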
You can do that easily if you run the query on a background thread. Make the main thread start a timer and spawn a background thread that runs the query. If the background thread hasn't returned a result once the 20 seconds are over, the main thread can cancel it.
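A rough sketch of that background-thread approach using a plain ADO.NET command, so the main thread can cancel it after 20 seconds via the attention event described above; the query text is a hypothetical placeholder:

    using System;
    using System.Data.SqlClient;
    using System.Threading;

    static class TimedQuery
    {
        public static void Run(string connectionString)
        {
            using (var conn = new SqlConnection(connectionString))
            using (var cmd = new SqlCommand("SELECT COUNT(*) FROM dbo.LargeTable", conn)) // hypothetical query
            {
                conn.Open();

                var finished = new ManualResetEvent(false);
                object result = null;

                var worker = new Thread(() =>
                {
                    try { result = cmd.ExecuteScalar(); }
                    catch (SqlException) { /* raised when the command is cancelled */ }
                    finally { finished.Set(); }
                });
                worker.Start();

                if (!finished.WaitOne(TimeSpan.FromSeconds(20)))
                {
                    cmd.Cancel();       // sends the attention event; the server aborts the query shortly after
                    finished.WaitOne(); // let the worker observe the cancellation before disposing
                    Console.WriteLine("The result set is too large; please run this report outside of hours.");
                }
                else
                {
                    Console.WriteLine("Row count: " + result);
                }
            }
        }
    }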

SQL CONNECTION best Practices

Currently there is a discussion about the pros and cons of a single SQL connection architecture.
To elaborate, what we are discussing is: open a SQL connection when the application starts, close it when the application closes (or on error), and never create another connection, using just that one to talk to the DB.
We are wondering what the community thinks.
Close the connection as soon as you no longer need it for an undefined amount of time. By doing so, the connection returns to the connection pool (if connection pooling is enabled) and can be (re)used by someone else.
(Connections are expensive resources, and are sometimes limited).
If you keep hold of a connection for the entire lifetime of an application, and you have multiple users of that application (thus multiple instances of the app, and multiple connections), and if your DB server is limited to only x concurrent connections, then you could have a problem ....
See also best practices for ADO.NET.
Follow this simple rule... Open the connection as late as possible and close it as soon as possible.
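A minimal sketch of that rule with ADO.NET; the table and column names are hypothetical. Because pooling is on by default in SqlClient, disposing the connection just returns it to the pool, so opening one per operation is cheap:

    using System.Data.SqlClient;

    static class TicketQueries
    {
        public static int GetOpenTicketCount(string connectionString)
        {
            // Open as late as possible...
            using (var conn = new SqlConnection(connectionString))
            using (var cmd = new SqlCommand(
                "SELECT COUNT(*) FROM dbo.Tickets WHERE Status = 'Open'", conn)) // hypothetical table
            {
                conn.Open();
                return (int)cmd.ExecuteScalar();
            } // ...and close as soon as possible: Dispose() returns the connection
              // to the pool so the next caller can reuse it.
        }
    }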
I think it's a bad idea, for several reasons.
If you have 10,000 users using your application, that's 10,000 connections open constantly
If you have to restart your Sql Server, all those 10,000 connections are invalidated and your application will suddenly - assuming you've included reconnect logic - be making 10,000 near-simultaneous reconnect requests.
To expand on point 1, you should close connections as soon as you can because otherwise you're using up a finite resource for, potentially, an infinite period of time. If you had Sql Server configured to allow a maximum of 10,001 simultaneous connections, then you could only have 10,001 users running your application at any one time. If you open/close connections on demand then your application will scale much further, as the likelihood of all the active users making use of the database simultaneously is, realistically, low.
Under the covers, ADO.NET uses connection pooling to manage the connections to the database. I would suggest leaving it up to the connection pool to take care of your connection needs. Keeping a connection open for the duration of your application is a bad idea.
I use a helpdesk system called Richmond Systems that uses one connection for the life of the application, and as a laptop user, it is a royal pain in the behind. Even when I carry my laptop around open, the jumps between wireless access points are enough to drop the DB connection. The software then complains about the DB connection, gets into an error state and won't close. It has to be killed manually from Task Manager.
In short, DON'T HOLD OPEN A DATABASE CONNECTION FOR LONGER THAN NECESSARY.
But on the flip side, I'd be cautious about opening and closing connections too often. This is a lot cheaper with connection pooling than without, but even with pooling, the pool manager may decide to grow or shrink the pool, turning it back into an expensive operation.
My general rule is to open a connection when the user initiates some action, do the work, then close the connection before waiting for the next user input. For any given "Update" button click or whatever, I'll generally have only one connection. But you definitely do not want to keep connections open while waiting for user input if you can at all help it, for all the reasons others have mentioned. You could literally wait for days before the user presses another key or touches another button -- what if he leaves his computer on and goes on vacation? Tying up a resource for unpredictable amounts of time like that is bad news. In most cases, the elapsed time waiting for user input will far exceed the time doing actual work.

Transaction commit executes successfully but doesn't get done

I've encountered a strange problem in Sql Server.
I have a Pocket PC application which connects to a web service, which in turn connects to a database and inserts lots of data. The web service opens a transaction for each Pocket PC that connects to it. Every day at 12 P.M., 15 to 20 people with different Pocket PCs connect to the web service simultaneously and finish the transfer successfully.
But after that, there remains one open transaction (visible in Activity Monitor) associated with 4000 exclusive locks. After a few hours, they vanish (probably something times out) and some of the transferred data is deleted. Is there a way I can prevent these locks from happening? Or recognize them programmatically and wait for an unlock?
Thanks a lot.
You could run sp_lock and check to see if there are any exclusive locks held on tables you're interested in. That will tell you the SPID of the offending connection, and you can use sp_who or sp_who2 to find more information about that SPID.
Alternatively, the Activity Monitor in Management Studio will give you graphical versions of this information, and will also allow you to kill any offending processes (the kill command will allow you to do the same in a query editor).
You can use SQL Server Profiler to monitor the statements that occur, including the beginning and end of transactions. There are also some tools from Microsoft Support which are great, since they run Profiler and blocking scripts. I'm looking to see if I can find these and will update if I do.
If you have an open transaction you should be able to see this in the activity monitor, so you can check if there are any open transactions before you restart the server.
Edit
It sounds like this problem happens at roughly the same time every day. You will want to turn it on before the problem happens.
I suspect you are doing something wrong in code: do you have command timeouts set to a value large enough for the work to finish, or is an error possibly skipping a COMMIT?
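On the "error skipping a COMMIT" point, a minimal sketch of an insert loop that always either commits or rolls back, so a failure doesn't leave an open transaction holding exclusive locks; the Reading type and the dbo.Readings table are hypothetical:

    using System.Collections.Generic;
    using System.Data.SqlClient;

    // Hypothetical data carrier for one device reading.
    class Reading { public int DeviceId; public double Value; }

    static class ReadingWriter
    {
        public static void SaveReadings(string connectionString, IEnumerable<Reading> readings)
        {
            using (var conn = new SqlConnection(connectionString))
            {
                conn.Open();
                using (var tx = conn.BeginTransaction())
                {
                    try
                    {
                        foreach (var r in readings)
                        {
                            using (var cmd = new SqlCommand(
                                "INSERT INTO dbo.Readings (DeviceId, Value) VALUES (@id, @value)",
                                conn, tx))
                            {
                                cmd.Parameters.AddWithValue("@id", r.DeviceId);
                                cmd.Parameters.AddWithValue("@value", r.Value);
                                cmd.ExecuteNonQuery();
                            }
                        }
                        tx.Commit();   // reached on every successful batch
                    }
                    catch
                    {
                        tx.Rollback(); // releases the locks instead of leaving the transaction open
                        throw;
                    }
                }
            }
        }
    }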
You can inspect what transactions are open by running:
DBCC OPENTRAN
The timeout on your select indicates that the transaction is still open with a lock on at least part of the table.
How are you doing transactions over web services? How/where in your code are you committing the transaction?
Doing lots of tests, I found out a deadlock is happening. But I couldn't find the reason, as I'm just inserting so many records in some independent tables.
These links helped a bit, but to no luck:
http://support.microsoft.com/kb/323630
http://support.microsoft.com/kb/162361
I even broke my transactions into smaller ones, but I still got the deadlock. I finally removed the transactions and changed the code not to delete the records from the source database, and I didn't get the deadlocks anymore.
As a lesson, I now know that if you have several large transactions executing on the same database at the same time, you will surely have problems in SQL Server. I don't know about Oracle.

Maximum number of connections to SQL Server database

What is the maximum number of connections to SQL Server 2005 for one user? I'm having a problem with C# code trying to make multiple connections to the database in different threads. After about five threads, the connections in the other threads start timing out. If I knew the exact number of connections for one user, even if it was one, it would help me know how many threads I can have loading at one time.
Five connections and you start to time out? That smells like connections not being closed and/or concurrency issues (locks/deadlocks). I have services that spawn threads and generate upwards of 100 connections without any problems.
A long shot, 5 connections sounds like you might have licensing issues. Is the SQL instance you're using limited to the number of concurrent connections? (This is not something I've ever had to deal with. I know there are CAL licensing plans, and that there may be limits if you are using SQL Server Express edition.)
More than the number of connections allowed per user, it may serve you better to make sure you are closing your connections as soon as you are done with them, so that the connection pool does not get used up too quickly.
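If you want to see exactly how many connections your login is holding while the threads run, you can query the SQL Server 2005 DMVs from the app; a rough sketch (note that without VIEW SERVER STATE permission the server only shows your current session):

    using System.Data.SqlClient;

    static class ConnectionDiagnostics
    {
        // Counts the sessions currently held by the connecting login.
        public static int CountMySessions(string connectionString)
        {
            const string sql =
                "SELECT COUNT(*) FROM sys.dm_exec_sessions " +
                "WHERE login_name = SUSER_SNAME() AND is_user_process = 1";

            using (var conn = new SqlConnection(connectionString))
            using (var cmd = new SqlCommand(sql, conn))
            {
                conn.Open();
                return (int)cmd.ExecuteScalar();
            }
        }
    }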
