Given a typical 3-tier app deployed at two remote sites. The database behind the two installations contains exactly the same structure and the same data. There needs to be a replication mechanism between the two backend databases to keep them in sync. The native replication feature of SQL Server would do the job. However, the business layer of the app keeps a lot of data cached. If I use DB-level replication, the cache in the business layer gets out of sync.
Is there a way to create SQL triggers that notify the business layer about changes?
DB: SQL Server 2005 or 2008; business layer: C#
Complex Option A: If you have static IPs you can host WCF (TCP/IP or SOAP) in the app and call it using SQL CLR; otherwise you could use WCF Peer-to-Peer, although that may be hard within the limits of SQL CLR. To go one step further, you could have the SQL CLR talk to the local program, which then tells the others to update.
Simpler Option B: Site A clears the cache on its side and sets a flag in the DB; SQL replication picks the flag up and informs site B, which clears its cache and resets the flag. The cache is cleared by either site whenever a disparity is found.
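As a rough sketch of Option B (the CacheSync table and its columns are made up for illustration), each site could periodically poll a replicated flag table and clear its own cache when the other site has marked a change:

    // Minimal sketch of Option B, assuming a hypothetical CacheSync table
    // (SiteName, IsDirty) that both sites replicate.
    using System;
    using System.Data.SqlClient;

    class CacheSyncPoller
    {
        private readonly string _connectionString;
        private readonly Action _clearCache;

        public CacheSyncPoller(string connectionString, Action clearCache)
        {
            _connectionString = connectionString;
            _clearCache = clearCache;
        }

        // Call this periodically, e.g. from a timer.
        public void CheckFlag(string localSite)
        {
            using (var conn = new SqlConnection(_connectionString))
            {
                conn.Open();
                // If the other site flagged a change for us, clear our cache
                // and reset the flag in one atomic statement.
                using (var cmd = new SqlCommand(
                    "UPDATE CacheSync SET IsDirty = 0 WHERE SiteName = @site AND IsDirty = 1",
                    conn))
                {
                    cmd.Parameters.AddWithValue("@site", localSite);
                    if (cmd.ExecuteNonQuery() > 0)
                        _clearCache();
                }
            }
        }
    }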
Take a look at Query Notifications; here are a couple of articles on the subject:
SQL Server 2005 Query Notifications Tell .NET 2.0 Apps When Critical Data Changes
Using Query Notifications in .NET 2.0 to handle ad-hoc data refreshes
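For a flavor of how this looks from C#, here is a minimal sketch using SqlDependency, the .NET wrapper over query notifications. The table and column names are placeholders; the query must use two-part table names and an explicit column list, and Service Broker must be enabled on the database:

    using System;
    using System.Data.SqlClient;

    class CacheWatcher
    {
        private readonly string _connStr;

        public CacheWatcher(string connStr)
        {
            _connStr = connStr;
            SqlDependency.Start(_connStr);  // requires Service Broker on the DB
            Subscribe();
        }

        private void Subscribe()
        {
            using (var conn = new SqlConnection(_connStr))
            using (var cmd = new SqlCommand("SELECT Id, Name FROM dbo.Products", conn))
            {
                var dependency = new SqlDependency(cmd);
                dependency.OnChange += (sender, e) =>
                {
                    InvalidateCache();  // notifications fire only once per subscription,
                    Subscribe();        // so re-subscribe for the next change
                };
                conn.Open();
                cmd.ExecuteReader().Dispose();  // executing the command registers the subscription
            }
        }

        private void InvalidateCache()
        {
            // Clear the business layer's cached data here.
        }
    }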
SQL cache invalidation might help.
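If the business layer is an ASP.NET application, a sketch of how SQL cache invalidation can tie a cache entry to a query (names are placeholders; this builds on the same query-notification machinery, so SqlDependency.Start must have been called at application startup):

    using System.Data;
    using System.Data.SqlClient;
    using System.Web;
    using System.Web.Caching;

    public static class ProductCache
    {
        public static DataTable GetProducts(string connStr)
        {
            var cached = (DataTable)HttpRuntime.Cache["products"];
            if (cached != null) return cached;

            using (var conn = new SqlConnection(connStr))
            using (var cmd = new SqlCommand("SELECT Id, Name FROM dbo.Products", conn))
            {
                // The cache entry is evicted automatically when the rows change.
                var dependency = new SqlCacheDependency(cmd);
                var table = new DataTable();
                conn.Open();
                new SqlDataAdapter(cmd).Fill(table);
                HttpRuntime.Cache.Insert("products", table, dependency);
                return table;
            }
        }
    }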
I've used Message Queuing (MSMQ) for this functionality. This solution also copes with downtime between the apps. Just make sure you create the queue as a transactional queue so that messages are saved in case of a power failure.
See http://www.codeproject.com/KB/database/SqlMSMQ.aspx for an example of how to use it on SQL Server 2005.
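A minimal sketch of the sending side (the queue path and message payload are illustrative); the receiving site would read from its local queue in a similar transaction and clear its cache:

    using System.Messaging;

    class ChangeNotifier
    {
        private const string QueuePath = @".\private$\cacheInvalidation";

        public static void Send(string changedEntity)
        {
            if (!MessageQueue.Exists(QueuePath))
                MessageQueue.Create(QueuePath, true);  // true = transactional

            using (var queue = new MessageQueue(QueuePath))
            using (var tx = new MessageQueueTransaction())
            {
                tx.Begin();
                queue.Send(changedEntity, tx);  // recoverable: persisted to disk
                tx.Commit();
            }
        }
    }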
In a multiuser environment:
How do you make sure that all clients see each other's changes? What is the best way to do this?
In the past I created a C# application and installed it on two PCs. It connected to a central SQL Express server (the client application used Entity Framework Code First as its ORM).
When client1 added a record to the database, this was not immediately visible to client2. The change only became visible when client2 fetched all the data again (a hard refresh).
Now I am looking for a solution for how this 'sync'(?) can or should be done. I like working with Entity Framework Code First, so it would be nice if a solution could keep this.
Also, the application is still at a very early stage.
I thought of having a central database and multiple clients connecting to it, but I'm not sure if this is a good solution. If your suggestions/solutions require a central server application which the clients connect to (and where the server application does the database handling), that would be no problem.
If possible, a basic sample solution or some basic code that shows how to always work with the latest data would be very helpful!
Similar questions:
Entity Framework - Underlying data (in database) change notification
Entity Framework data updates in a multi-user environment with central database
Entity framework data context not in sync with database?
Thanks in advance
It depends on your environment, the data you are managing, and the architecture you want.
If it's acceptable to let clients have copies of the data which they can work with, e.g. when they need to work with the data while not connected to the central server, then you could use the Sync Framework.
You'd have your central SQL Server as usual, and use the Sync Framework to sync with clients.
You would write a "Provider" which would decide how to resolve changes made to the same data by different clients, etc.
You would have to put SQL Express (or possibly SQL Server Compact or LocalDB) onto the client machines.
Then point your Entity Framework model/code at the local database instead of the central one.
http://blogs.msdn.com/b/sync/archive/2008/06/24/sample-sql-express-client-synchronization-using-sync-services-for-ado-net.aspx
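As a hedged sketch of what a sync run looks like with the Sync Framework's database providers (this assumes both databases have already been provisioned with a sync scope; the scope name "AppScope" and connection strings are illustrative):

    using System.Data.SqlClient;
    using Microsoft.Synchronization;
    using Microsoft.Synchronization.Data.SqlServer;

    class DbSync
    {
        public static void Sync(string clientConnStr, string serverConnStr)
        {
            using (var clientConn = new SqlConnection(clientConnStr))
            using (var serverConn = new SqlConnection(serverConnStr))
            {
                var orchestrator = new SyncOrchestrator
                {
                    LocalProvider = new SqlSyncProvider("AppScope", clientConn),
                    RemoteProvider = new SqlSyncProvider("AppScope", serverConn),
                    Direction = SyncDirectionOrder.DownloadAndUpload
                };
                // Pushes local changes up and pulls remote changes down.
                var stats = orchestrator.Synchronize();
                System.Console.WriteLine("{0} changes downloaded", stats.DownloadChangesTotal);
            }
        }
    }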
Otherwise it comes down to designing and implementing some "tiers" and following a distributed internet/database architecture or SOA.
Some nice free resources:
http://msdn.microsoft.com/en-us/library/ff650706.aspx
http://mtechsoa2011.blogspot.co.uk/2011/04/soa-vs-distributed-internet_27.html
Some useful books:
http://www.amazon.co.uk/Service-Oriented-Architecture-Concepts-Technology-Computing/dp/0131858580/ref=sr_1_1?s=books&ie=UTF8&qid=1343295432&sr=1-1
Another solution is to create an "interface" for your database so that each data-write operation from one client can notify the other clients. You can implement such an interface with WCF and its callbacks.
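A minimal sketch of this idea with a WCF duplex (callback) contract; all contract and method names here are illustrative, and a duplex-capable binding such as netTcpBinding or wsDualHttpBinding is assumed:

    using System.Collections.Generic;
    using System.ServiceModel;

    public interface IDataChangedCallback
    {
        [OperationContract(IsOneWay = true)]
        void OnDataChanged(string entityName);
    }

    [ServiceContract(CallbackContract = typeof(IDataChangedCallback))]
    public interface IDataService
    {
        [OperationContract]
        void SaveRecord(string entityName, string payload);
    }

    [ServiceBehavior(InstanceContextMode = InstanceContextMode.Single)]
    public class DataService : IDataService
    {
        private readonly List<IDataChangedCallback> _clients =
            new List<IDataChangedCallback>();

        public void SaveRecord(string entityName, string payload)
        {
            var caller = OperationContext.Current
                .GetCallbackChannel<IDataChangedCallback>();
            if (!_clients.Contains(caller)) _clients.Add(caller);

            // ... persist the record with EF here ...

            // Notify every other connected client so they can refresh.
            foreach (var client in _clients)
                if (client != caller) client.OnDataChanged(entityName);
        }
    }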
I have no simple code for a whole architecture solution, though... If you ask a more concrete question about building an n-tier application with WCF, I'll try to help.
I'm writing an application which will involve interaction with a database. The application will be used by anywhere from 100 to 1,000 users, and the database should be able to store up to 100,000 rows of data.
Previously I have used Microsoft Access, but my users were required to install the Microsoft Access Database Engine for my application to function as intended. I want my application to be as lightweight/portable as possible, so are there any better alternatives where users will not be required to install any third-party components to run my application?
Look at MongoDB; it is an open-source, non-relational database that has picked up popularity. It is very fast too.
Depends on whether the DB will be server side or client side.
If it's server side it's up to you really; I would personally go with SQL Server as I know it best, or even a MySQL/phpMyAdmin combo. But if it's to reside on the client machine, try SQLite (just a warning though: it is exactly as the name suggests, LITE, so a lot of the more complex SQL might not be supported). SQLite may be exactly what you're looking for depending on the complexity involved in your project.
ALSO: SQLite is supported on iPhone, BlackBerry and Android as well. So if you wanna go mobile, no problem.
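For illustration, a minimal sketch using the System.Data.SQLite ADO.NET provider (the file and table names are made up); the provider ships as a DLL alongside your app, so there is nothing for the user to install:

    using System.Data.SQLite;

    class LocalStore
    {
        public static void Demo()
        {
            // The database file is created on first open if it doesn't exist.
            using (var conn = new SQLiteConnection("Data Source=app.db"))
            {
                conn.Open();
                using (var cmd = new SQLiteCommand(
                    "CREATE TABLE IF NOT EXISTS Notes (Id INTEGER PRIMARY KEY, Text TEXT)",
                    conn))
                    cmd.ExecuteNonQuery();

                using (var cmd = new SQLiteCommand(
                    "INSERT INTO Notes (Text) VALUES (@t)", conn))
                {
                    cmd.Parameters.AddWithValue("@t", "hello");
                    cmd.ExecuteNonQuery();
                }
            }
        }
    }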
Your application could connect to a cloud database (like SQL Azure). That will not require any third party components and it will be accessible from anywhere/any device.
Do you need only one database for all the users, or does every user have their own database? If it will be on the server side, I would prefer a SQL server (e.g. MSSQL, MySQL). But for the client side, SQLite would do the work.
You should consider whether a SaaS model is relevant, in other words whether you could benefit from hosting all your users' databases in the cloud and letting clients access them remotely. In your scenario I would consider a SaaS model if:
1. your users need to scale their individual databases beyond the capacity of their local machines;
2. there is a need to share data between the databases of different users, or doing so could enable good features you don't have in your system today; or
3. users find it a drag to have to run the database on their local machine, because of the resources it uses up, the uptime needed, backups, etc.
If any of these are sometimes true, hosting all the databases in the cloud with some sort of multi-tenancy arrangement might be a good solution, and it will make your application as lightweight as possible because you don't need to ship a database at all.
I am writing an application in C# that needs to do the following:
Without connecting to the database, I need to check whether there are new logs in it. If there are, then I am allowed to open the connection and retrieve them.
So I just need to know whether there are new logs (elements) in the database WITHOUT opening a connection to it.
The server can send mail to an administrator and I could monitor the mailbox for changes, but that solution is unacceptable.
Could the server, on inserting new rows, create a *.txt file on disk indicating the new rows, which I could check and then delete/edit after downloading the changes?
(The database is SQL Server 2008 R2.)
Is this even possible? Any other options for doing this are welcome.
Thank you all very much in advance.
Based on the following clarifying comments from the OP under the question:
There is a web application which checks for changes every 30 sec and shows the latest authorizations. The database tracks employee authorization and has frequent updates. Now I'm building a desktop application, which has a local connection to the server and can update more frequently, but the client does not want the application to open a connection every second, although the connection is only open for several ms.
I think that the appropriate solution is a business layer.
If you build a business layer hosted in IIS that performs the database access on behalf of the users using a single database user for access (the application pool user or an impersonated user within the web application), then connection pooling will reduce the number of connections made to the database significantly.
Here is an MSDN article that describes the mechanics and benefits of connection pooling in great detail.
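The key point is simply that every data-access call in the business layer opens and closes a connection with the same connection string, so ADO.NET serves most opens from the pool. A sketch (the connection string and query are illustrative):

    using System.Data.SqlClient;

    class LogRepository
    {
        // Pooling is on by default; identical strings share one pool.
        private const string ConnStr =
            "Server=dbserver;Database=AppDb;Integrated Security=true;Max Pool Size=50";

        public int CountNewLogs()
        {
            // "Open" here typically grabs an existing pooled connection.
            using (var conn = new SqlConnection(ConnStr))
            using (var cmd = new SqlCommand(
                "SELECT COUNT(*) FROM dbo.Logs WHERE IsNew = 1", conn))
            {
                conn.Open();
                return (int)cmd.ExecuteScalar();
            }   // Dispose returns the connection to the pool, not the server.
        }
    }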
All of the clients, including the web layer, would connect to the business layer using WCF or .Net Remoting (depending on your .Net version), and the business layer would be the only application performing database access.
The added benefit of this approach is that you can move all database access (including from the web client) inside the DMZ so that there is no direct database access from the DMZ outward. This could be a good selling point for your customer.
We use this mechanism extensively for very large, very security and performance conscious customers.
Update
As an alternative, you could have the business layer query the database every 30 seconds, extract the necessary information, and store it locally in the business layer in a data store of some sort (Access, SQL Server Express, etc.). When requests from clients are received, they will be served from the local data store instead of the main database.
You could do this by kicking off a background thread in global.asax's Application_Start event or by adding a cache entry that expires every 30 seconds and performing the work in the cache timeout event.
This will reduce the number of connections to one (or two, if the web layer isn't modified) every 30 seconds (or whatever interval you choose).
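A sketch of the cache-timeout variant using the ASP.NET cache (the names and the 30-second window are illustrative):

    using System;
    using System.Web;
    using System.Web.Caching;

    public static class LogCache
    {
        public static void Start()  // e.g. from Application_Start in global.asax
        {
            Refresh(null, null, CacheItemRemovedReason.Expired);
        }

        private static void Refresh(string key, object value, CacheItemRemovedReason reason)
        {
            object latest = LoadLatestLogsFromDatabase();  // the single periodic DB hit
            HttpRuntime.Cache.Insert(
                "latestLogs", latest, null,
                DateTime.UtcNow.AddSeconds(30), Cache.NoSlidingExpiration,
                CacheItemPriority.NotRemovable,
                Refresh);  // the callback re-queries when the entry expires
        }

        public static object GetLatestLogs()
        {
            // Client requests are served from here, not from the database.
            return HttpRuntime.Cache["latestLogs"];
        }

        private static object LoadLatestLogsFromDatabase()
        {
            return new object();  // query the real database here
        }
    }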
Try monitoring the change dates of the files inside the DB folder.
If the client desktop application is not going to be deployed massively you could use SqlDependency. Then you wouldn't have to poll the database on a frequent basis, instead the database will notify you if something changes.
You could also deploy a service on the server which uses SqlDependency and then connect to this service from your desktop applications.
If that's not an option, this document mentions some alternatives.
These two could be applied to your situation:
Create an AFTER UPDATE trigger on the table being monitored, whose action uses SQL Server Service Broker to send a message to the entity needing the notification.
Use Windows Server App Fabric Cache, which supports a change notifications mechanism, based on an in-memory object cache and callback functions you register with the objects.
I am building a system which is made up of 2 applications:
An ASP.NET website where jobs will be created for engineers to visit, with the data held in a central database.
A Windows Forms application running on a laptop, where jobs will be synced from the central database; then a form is completed and the data is sent back to the central database.
What would be the best way to transfer the data? The laptops will be remote and may be on a slow connection.
I will be using SQL Server 2008, Entity Framework, .Net 4
I have looked at the Microsoft Sync Framework but am unsure if it will do what's required. I have also thought about having a web service that the Windows Forms client can pull data from and push data to.
I have had bad experiences with MSMQ, so I want to avoid it.
How would you approach this?
Take a look at merge replication for SQL Server. This would allow your laptop users to make changes to the data they've received while offline and later sync back to the central database.
I believe the MS Sync Framework is designed to handle just this problem. While you could use MSMQ, you'd be writing most of the syncing logic yourself. The Sync Framework should give you a better abstraction and take care of most of the details.
I am designing a Windows application in C# which, when run for the first time, creates a local database (SQLite); the data (around 200 MB or even more) for this is fed as a data stream from a remote server, based on criteria specified by the user.
Currently I have planned to access the database server directly from the application.
Questions:
Is it a good idea to use the database server directly from the application, given that the server manages connections automatically and I would save the time of developing a TCP/IP interface?
What could be the second option? Providing a TCP/IP server or interface (isn't that time-consuming to write)?
As the data is large, should I use compression?
With WCF you can avoid the cost of writing TCP/IP code and have either a standalone server or a web service hosted in IIS. Moreover, compression can be added without code changes.
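For example, a minimal self-hosted WCF service over TCP (the contract, address, and types are illustrative placeholders):

    using System;
    using System.ServiceModel;

    [ServiceContract]
    public interface IDataFeed
    {
        [OperationContract]
        byte[] GetData(string criteria);  // consider streaming/chunking for very large payloads
    }

    public class DataFeed : IDataFeed
    {
        public byte[] GetData(string criteria)
        {
            return new byte[0];  // query the database and serialize results here
        }
    }

    class Program
    {
        static void Main()
        {
            using (var host = new ServiceHost(typeof(DataFeed),
                new Uri("net.tcp://localhost:9000/datafeed")))
            {
                host.AddServiceEndpoint(typeof(IDataFeed), new NetTcpBinding(), "");
                host.Open();
                Console.WriteLine("Service running. Press Enter to stop.");
                Console.ReadLine();
            }
        }
    }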
You have to try it with and without compression. The compression ratio depends highly on the data, and compression time can also be an issue.
Without going into great detail: you can use ASP.NET with C# (or any .NET language) and send and receive data using POST.
Why would you need compression? Are you only sending results? If the data is big you can use an existing library; I have used SevenZipSharp in the past without much issue.
Edit: There may be an option on the server to gzip the output data, so you may not need to use anything.
Assuming that your intention is to pull down a subset of the server's data for local storage, dependent on client queries, then for reasons of security and control you probably ought to be looking at using web services rather than exposing your database directly to the internet.
There are a large number of options for creating services; WCF is the principal method for new .NET applications and is straightforward to implement at both the server and client ends. In this case I'd probably also take a look at ADO.NET Data Services as a shortcut to a rich set of services.
It is usually best to use ADO.NET or LINQ to SQL / Entity Framework to connect to your database directly, unless the user is going to be disconnected while using the application.
If the user is going to disconnect, then continue using SQLite, or you can use ADO.NET, which can save the data to an XML file and access it like a table on the user's machine without the additional dependency of SQLite.
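A sketch of that ADO.NET option: fill a DataSet, persist it to XML, and reload it later with no SQLite dependency (the query, table, and file names are illustrative):

    using System.Data;
    using System.Data.SqlClient;

    class OfflineData
    {
        public static void SaveLocal(string connStr, string path)
        {
            var ds = new DataSet();
            using (var adapter = new SqlDataAdapter("SELECT * FROM dbo.Items", connStr))
                adapter.Fill(ds, "Items");
            ds.WriteXml(path, XmlWriteMode.WriteSchema);  // schema + data in one file
        }

        public static DataSet LoadLocal(string path)
        {
            var ds = new DataSet();
            ds.ReadXml(path);  // then query it like a table via ds.Tables["Items"]
            return ds;
        }
    }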
I would be wary of compression: beyond the basic GZipStream built into System.IO.Compression, stronger compressors would require an additional dependency.
Try to just use the .NET Framework without additional DLLs and you will have a more flexible application that is easier to install and maintain.
ADO/Entity Framework - http://msdn.microsoft.com/en-us/library/h43ks021.aspx