I have a server and 'x' number of clients.
When the server accepts an inbound connection, it creates a client handler instance (a class that manages communication with that client), which is spun off onto a separate thread.
Depending on the command the client sends to the server, the server may need to access a SQL database to store information about that client.
The client handler instance will 'handle' this request. The only problem is that if multiple client handlers want to access the SQL database to do the exact same thing, there is potential for read/write issues.
I was thinking about exposing a static method on the server, calling it from the client handler instances, and locking the function which accesses the SQL database (for either reads or writes).
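Roughly what I have in mind (the class, method, and parameter names here are just placeholders):

```csharp
public static class ServerDb
{
    private static readonly object DbLock = new object();

    // Every client handler calls this; the lock serializes all database access.
    public static void SaveClientStatus(int clientId, string status)
    {
        lock (DbLock)
        {
            // open a SqlConnection here, run the INSERT/UPDATE, then dispose it
        }
    }
}
```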
Is this a good approach or are there better approaches?
Thanks.
Well, you DO know that SQL Server has locks and a ton of internal mechanisms to serialize access? That this is part of the ACID properties that have been ingrained in relational databases since SQL was created back in the 1970s? The locking in SQL Server is very fine-grained, and you are basically trying to solve a problem that was solved decades ago... it sounds like you need to read a book about the basics of SQL.
Under normal circumstances (a standard connection string) SQL Server serializes conflicting access itself with its own locks at its default isolation level (READ COMMITTED), and the isolation level can be raised per transaction if you need stricter guarantees. I really suggest learning some SQL fundamentals.
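For example, if one of your handlers ever does need stricter guarantees, you raise the isolation level on that one transaction instead of putting a lock around your own code. A minimal sketch (the table and column names are made up):

```csharp
using System.Data;
using System.Data.SqlClient;

public static class ClientRepository
{
    public static void SaveClientStatus(string connectionString, int clientId, string status)
    {
        using (var conn = new SqlConnection(connectionString))
        {
            conn.Open();
            // The default is ReadCommitted; Serializable is only needed for strict guarantees.
            using (var tx = conn.BeginTransaction(IsolationLevel.Serializable))
            using (var cmd = new SqlCommand(
                "UPDATE dbo.Clients SET Status = @status WHERE Id = @id", conn, tx))
            {
                cmd.Parameters.AddWithValue("@status", status);
                cmd.Parameters.AddWithValue("@id", clientId);
                cmd.ExecuteNonQuery();
                tx.Commit();
            }
        }
    }
}
```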
Say I have an application that wants to return information from an online database, it's a public application which anyone has free access to. What is a safe approach for these clients to communicate with the database?
I am interested in how this is done in professional environments. When I tried to do research, all I could find were examples of clients connecting directly to the database themselves, leaving their connection strings in the code; I'm sure those examples were for server-side/private applications. In my scenario I assume there has to be a proxy in between.
Do I set up a web server with PHP scripts that returns results, or do I write a server that listens for TCP/IP connections? I feel there is a better way but can't put my finger on it.
What my question boils down to is essentially:
What technique is used for this 'proxy' and client to communicate? What does the setup look like? Even a diagram would be great. Assume I am working in C#.
At the moment I am putting together an asynchronous TCP server. Everything seems to be coming together, but I'm now at the stage where I need to figure out what to do with the data once it is received (I should mention that the server will primarily receive data and will possibly never send anything to the clients).
As it is written asynchronously, I don't particularly want to do any processing of the data in the server application itself (in the handler in which the data is received), to ensure that it performs as optimally as possible, though eventually the data needs to be processed and submitted to various SQL tables to be of some use.
As part of a previously asked question here on SO, Asynchronous Processing of Data, Stephen Cleary pointed out that to ensure no messages are lost due to power failure, system failure, etc., I should look into some kind of message queue.
In doing so I have seen various ways of doing this, one of which is using SQL Server as the host for the queue.
What I'm wondering is: will using SQL Service Broker and a queue be any quicker than doing a normal insert into a table which contains only a UID, the data (a byte array no bigger than 1024 bytes), and a processed flag? And if not, what is the fastest insert to use from C#?
The processing of said data will probably take place in another application on the same server, which will also receive the data and host SQL Server, if that makes any difference.
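To be concrete, the "normal insert" I'm comparing against would look something like this (dbo.IncomingData is just a placeholder name for the table with the UID, data, and processed flag):

```csharp
using System;
using System.Data;
using System.Data.SqlClient;

static class DataWriter
{
    public static void QueueRow(string connectionString, Guid id, byte[] data)
    {
        using (var conn = new SqlConnection(connectionString))
        using (var cmd = new SqlCommand(
            "INSERT INTO dbo.IncomingData (Id, Data, Processed) VALUES (@id, @data, 0)", conn))
        {
            cmd.Parameters.Add("@id", SqlDbType.UniqueIdentifier).Value = id;
            cmd.Parameters.Add("@data", SqlDbType.VarBinary, 1024).Value = data;
            conn.Open();
            cmd.ExecuteNonQuery();
        }
    }
}
```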
Any advice or thoughts will be much appreciated!
If you can afford the license fee I would recommend NServiceBus. If you can't afford it, consider MassTransit. Both will manage the message queue for you, and both support multiple queue transports, such as:
MSMQ
RabbitMQ
ActiveMQ
Azure
Implementing your own queuing system in SQL Server is a poor choice in the long run. Been there, got the t-shirt.
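To give a feel for the receive side with MassTransit over RabbitMQ, a rough sketch might look like this. The message type, queue name, and credentials are made up, and the exact configuration calls vary between MassTransit versions (this is written against a v5/v6-era API):

```csharp
using System;
using System.Threading.Tasks;
using MassTransit;

public class DataReceived              // made-up message contract
{
    public Guid Id { get; set; }
    public byte[] Payload { get; set; }
}

class Program
{
    static void Main()
    {
        var busControl = Bus.Factory.CreateUsingRabbitMq(cfg =>
        {
            cfg.Host(new Uri("rabbitmq://localhost/"), h =>
            {
                h.Username("guest");
                h.Password("guest");
            });

            cfg.ReceiveEndpoint("data_received", e =>
            {
                // The bus takes care of durability and retries; you just handle the message.
                e.Handler<DataReceived>(async context =>
                {
                    // write context.Message to your SQL tables here
                    await Task.CompletedTask;
                });
            });
        });

        busControl.Start();
        Console.ReadLine();
        busControl.Stop();
    }
}
```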
My existing scenario is below:
I have a SQL Server instance which resides in the main company office. The company has another branch 60 miles away.
I have a WPF application installed in computers in the main office and the branch which connects to the SQL server in the main office for printing records etc.
I am specifying the connection parameters in the app.config file as below:
<add name="CompanyEntities" connectionString="metadata=res://*/LabModel.csdl|res://*/LabModel.ssdl|res://*/LabModel.msl;provider=System.Data.SqlClient;provider connection string="Data Source=publicIPaddress of remote SQL here;Initial Catalog=databasename;Persist Security Info=True;User ID=sa;Password=password;MultipleActiveResultSets=True"" providerName="System.Data.EntityClient" />
My problem is that the application in the branch hangs forever at certain times. So my question is whether it is best practice to use WCF for connecting to the remote SQL Server.
Which is the best way to go about it? Are there any links that describe this well?
In your current scenario, WCF is useless to you. If you write a server-side application that manages the database connection, you can use WCF to send the data to your clients. However, that requires your client-side software to be adapted to use WCF as well, and in that case your clients wouldn't access the database directly (you would change from a two-layer architecture to a three-layer architecture). It might solve your problems, and it might introduce some other problems.
The hanging problem you describe could be caused by many things. For example, you can run out of database connections, get into a deadlock (although such a transaction would normally be terminated by the server), or simply have a lock on data being edited while the employee editing it goes for a break.
It is also possible, that the problem is not with the database connection, but something in the client side code. Since I have no details, I can not tell you anything more specific.
Picking on the wording you chose, I don't think you'll be "connecting to the remote SQL server" using WCF. However, you can certainly use WCF to host a service that provides your client (WPF app) with secure access to your data, though it will do so indirectly. This approach will be quite different from the approach you currently have: accessing the server directly from the WPF app.
Although I haven't used it myself, I believe one approach may be WCF Data Services. If you only need a few operations in your service (e.g. GetRecord) for printing, you may also be able to roll your own WCF service that simply provides "Records" to your client app. Any introductory book or tutorial on WCF will probably get you on your way.
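As a very rough illustration of the "roll your own" option, the service side could look something like this (IRecordService and Record are made-up names, not anything from your project):

```csharp
using System.Runtime.Serialization;
using System.ServiceModel;

[DataContract]
public class Record
{
    [DataMember] public int Id { get; set; }
    [DataMember] public string Description { get; set; }
}

[ServiceContract]
public interface IRecordService
{
    [OperationContract]
    Record GetRecord(int id);
}

public class RecordService : IRecordService
{
    public Record GetRecord(int id)
    {
        // Only this service talks to SQL Server; the WPF client never sees a
        // connection string, it just calls GetRecord over the WCF endpoint.
        return new Record { Id = id, Description = "loaded from the database" };
    }
}
```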
The above answers the questions you seemed to be asking. However, WCF doesn't solve connection issues for you: if you say "the application [...] hangs forever at certain times" you should investigate that no matter what. But that's a different question.
Rather than polling some tables, I'd like to signal a waiting C# app that there are new rows to be processed in a table, maybe via a trigger. Is there some way for the database to signal a console app, or am I stuck polling the table looking for new rows?
Take a look at Query Notifications (SQL Server 2005+).
Microsoft SQL Server 2005 introduces query notifications, new functionality that allows an application to request a notification from SQL Server when the results of a query change. Query notifications allow programmers to design applications that query the database only when there is a change to information that the application has previously retrieved.
There is an example here of how to write a simple form app to register a query for notification: http://msdn.microsoft.com/en-us/library/a52dhwx7(VS.80).aspx.
This does require the Service Broker to be enabled on the database.
You should take a look at the notes in the Remarks section of the MSDN SqlDependency documentation to make sure it is the right choice for your scenario.
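A bare-bones console sketch of the registration might look like this. The connection string and the dbo.IncomingData table are placeholders; note that the query must use a two-part table name and an explicit column list, and that a notification fires only once, so you re-register after each change:

```csharp
using System;
using System.Data.SqlClient;

class Program
{
    const string ConnectionString =
        "Data Source=.;Initial Catalog=MyDb;Integrated Security=True";

    static void Main()
    {
        SqlDependency.Start(ConnectionString);
        RegisterAndQuery();
        Console.ReadLine();                     // keep the console app alive
        SqlDependency.Stop(ConnectionString);
    }

    static void RegisterAndQuery()
    {
        using (var conn = new SqlConnection(ConnectionString))
        using (var cmd = new SqlCommand(
            "SELECT Id, Data FROM dbo.IncomingData WHERE Processed = 0", conn))
        {
            var dependency = new SqlDependency(cmd);
            dependency.OnChange += (sender, e) =>
            {
                // Rows changed: process them, then re-register for the next change.
                RegisterAndQuery();
            };

            conn.Open();
            using (var reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                {
                    // process the currently unprocessed rows here
                }
            }
        }
    }
}
```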
Check if SqlCacheDependency can be of any use...
http://www.asp.net/data-access/tutorials/using-sql-cache-dependencies-cs
If it is SQL Server 2008, you can use event-based activation using Service Broker as well.
I have used WCF with SQLCLR to get data from another process into SQL; it worked pretty well apart from some minor quirks setting it up. So you can call other processes from SQL this way. It's usually quite hard to deploy to customers though; DBAs don't like this sort of thing.
GJ
Check out SQL Service Broker. Using a queue and the WAITFOR syntax, I have changed a custom polling service into a blocking/signal service. You could also look at event-based activation. Either way, this gives you a transactional, non-polling, asynchronous way to call your external program that won't slow down triggers or hold database locks.
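A rough sketch of the blocking call from C# (the queue name is made up, and the command timeout has to be longer than the WAITFOR timeout):

```csharp
using System.Data.SqlClient;

static class BrokerListener
{
    public static void WaitForNextMessage(string connectionString)
    {
        using (var conn = new SqlConnection(connectionString))
        using (var cmd = new SqlCommand(
            @"WAITFOR (
                  RECEIVE TOP (1) message_type_name, message_body
                  FROM dbo.NotificationQueue
              ), TIMEOUT 60000;", conn))
        {
            cmd.CommandTimeout = 120;   // seconds; must exceed the 60s WAITFOR timeout
            conn.Open();
            using (var reader = cmd.ExecuteReader())
            {
                if (reader.Read())
                {
                    // a message arrived: kick off the external processing here
                }
                // no row means the WAITFOR timed out; just loop and wait again
            }
        }
    }
}
```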
I have a web site which uses one SQL database, but the hosting company is very slow at times and I get database timeout, login, and similar errors. Can I implement my code to use two databases simultaneously? I have stored procedures, and the data is updated periodically.
EDIT:
Simply: when dbDefault is down or inaccessible I need to use dbSecondary so the web app keeps running. And these two databases must always be the same.
EDIT:
Some errors:
A transport-level error has occurred when sending the request to the server. (provider: TCP Provider, error: 0 - An existing connection was forcibly closed by the remote host.)
A network-related or instance-specific error occurred while establishing a connection to SQL Server. The server was not found or was not accessible. Verify that the instance name is correct and that SQL Server is configured to allow remote connections. (provider: Named Pipes Provider, error: 40 - Could not open a connection to SQL Server)
Timeout expired. The timeout period elapsed prior to obtaining a connection from the pool. This may have occurred because all pooled connections were in use and max pool size was reached.
Cannot open database "db" requested by the login. The login failed. Login failed for user 'root'.
Load balancing and/or fail-over clustering database servers typically involves a lot of work.
You will need to make sure ALL data is merge replicated between the two database servers. Hosting providers rarely provide this option unless you have a dedicated server.
Allowing for merge replication might involve redesigning parts of your database; which may not be feasible.
Unless you are willing to invest a lot of time and money, you are much better off just switching hosting providers to one that has better db support. Considering there are literally thousands upon thousands of such companies out there this is an easy fix.
UPDATE
Almost all of the errors you identified in your edit are generally attributable to failing to properly dispose of connections, commands, and readers. You might want to go through your code to make sure you are accessing SQL Server correctly. Every connection, command, and reader should be wrapped in a using statement to make sure they are properly released back to the connection pool.
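For example, a typical read should look roughly like this (the table and column names are made up):

```csharp
using System;
using System.Data.SqlClient;

static class CustomerReader
{
    public static void PrintCustomers(string connectionString)
    {
        using (var conn = new SqlConnection(connectionString))
        using (var cmd = new SqlCommand("SELECT Id, Name FROM dbo.Customers", conn))
        {
            conn.Open();
            using (var reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                {
                    Console.WriteLine("{0}: {1}", reader.GetInt32(0), reader.GetString(1));
                }
            }
        }   // the connection goes back to the pool here, even if an exception was thrown
    }
}
```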
If you provide a data access code sample (new question please) we can help you rewrite it.
Not really.
Data consistency and integrity:
How do you decide which data, or which database, to call at what time?
What happens on a write?
Firewalls, remote servers, etc.:
If you use another hosting company, how will you connect?
Misconception:
Two databases on one server = absolutely no advantage.
The server is probably overloaded already; a second database will make it worse.
A database timeout could of course be code-related, and it may not help you to have two databases with the same poor code or design...
Not a nice answer, but if your host is providing poor service then your options are limited.
First of all, find the reason for the timeout; if it is in your code, then fix the code by optimizing the queries, etc.
I think what you need is a failover server, to which you can switch if one server is down.
Alternatively,
you can maintain two connection strings in web.config and switch to the other server if one is down.
In both methods, you need to devise a strategy to keep the servers in sync.
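A crude sketch of the switching part (the connection-string names dbDefault and dbSecondary are just whatever you have in web.config):

```csharp
using System.Configuration;
using System.Data.SqlClient;

static class Db
{
    public static SqlConnection OpenWithFallback()
    {
        var primary = ConfigurationManager.ConnectionStrings["dbDefault"].ConnectionString;
        var secondary = ConfigurationManager.ConnectionStrings["dbSecondary"].ConnectionString;

        try
        {
            var conn = new SqlConnection(primary);
            conn.Open();
            return conn;
        }
        catch (SqlException)
        {
            // primary is down or unreachable: fall back to the secondary server
            var conn = new SqlConnection(secondary);
            conn.Open();
            return conn;
        }
    }
}
```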
If both your databases are in sync (which is an obvious requirement for what you are trying to do), the best solution is to rely on a load balancer. But if you can't, I guess your goal is to run the query against both databases at the same time and return the first result; otherwise you will have to wait for the timeout before running the request against the second server.
So what you need is an asynchronous SQL command, right?
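Something along these lines, assuming both servers hold the same data (the query and connection strings here are placeholders):

```csharp
using System.Data.SqlClient;
using System.Threading.Tasks;

static class RacingQuery
{
    public static async Task<object> QueryFirstToAnswerAsync(
        string primaryConnectionString, string secondaryConnectionString, string sql)
    {
        var first = RunScalarAsync(primaryConnectionString, sql);
        var second = RunScalarAsync(secondaryConnectionString, sql);

        // Take whichever server answers first; if that task failed, the second
        // await rethrows and you can fall back to the other task.
        var winner = await Task.WhenAny(first, second);
        return await winner;
    }

    static async Task<object> RunScalarAsync(string connectionString, string sql)
    {
        using (var conn = new SqlConnection(connectionString))
        using (var cmd = new SqlCommand(sql, conn))
        {
            await conn.OpenAsync();
            return await cmd.ExecuteScalarAsync();
        }
    }
}
```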