.NET queue system - C#

I am building a system made up of two applications:
An ASP.NET website where jobs are created for engineers to visit, with the data held in a central database.
A Windows Forms application running on a laptop, where jobs are synced down from the central database; a form is then completed and the data sent back to the central database.
What would be the best way to transfer the data? The laptops will be remote and may be on a slow connection.
I will be using SQL Server 2008, Entity Framework and .NET 4.
I have looked at the Microsoft Sync Framework but am unsure whether it will do what's required. I have also thought about having a web service that the Windows Forms client can pull data from and push data to.
I have had bad experiences with MSMQ, so I want to avoid it.
How would you approach this?

Take a look at merge replication for SQL Server. This would allow your laptop users to make changes to the data they've received while offline and later sync back to the central database.

I believe the MS Sync Framework is designed to handle just this problem. While you could use MSMQ, you'd be writing most of the syncing logic yourself. The Sync Framework should give you a better abstraction and take care of most of the details.
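For a rough idea of the shape this takes, here is a minimal sketch using the Sync Framework 2.1 database providers. It assumes a sync scope named "JobsScope" has already been provisioned on both databases (e.g. with SqlSyncScopeProvisioning); the scope name and connection strings are illustrative placeholders:

```csharp
using System;
using System.Data.SqlClient;
using Microsoft.Synchronization;
using Microsoft.Synchronization.Data.SqlServer;

// Assumes "JobsScope" has already been provisioned on both the server
// and the laptop databases; connection strings are placeholders.
var serverConn = new SqlConnection(serverConnectionString);
var clientConn = new SqlConnection(laptopConnectionString);

var orchestrator = new SyncOrchestrator
{
    LocalProvider  = new SqlSyncProvider("JobsScope", clientConn),
    RemoteProvider = new SqlSyncProvider("JobsScope", serverConn),
    Direction      = SyncDirectionOrder.DownloadAndUpload // pull jobs down, push completed forms back
};

SyncOperationStatistics stats = orchestrator.Synchronize();
Console.WriteLine("{0} uploaded, {1} downloaded",
    stats.UploadChangesTotal, stats.DownloadChangesTotal);
```

Because only changes since the last sync are exchanged, this also suits the slow connections you mention.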

Related

Technique(s) in a C# multiuser application where all clients have their data up-to-date from a central database

In a multiuser environment:
How do you make sure that all clients see each other's changes? What is the best way to do this?
In the past I created a C# application and installed it on two PCs. It connected to a central SQL Express server (the client application used Entity Framework Code First as its ORM).
When client1 added a record to the database, it was not immediately visible to client2. Only if client2 fetched all the data again (a hard refresh) was the change visible.
Now I am looking for a solution for how this 'sync'(?) can or should be done. I like working with Entity Framework Code First, so it would be nice if a solution could keep this.
The application is also still at a very early stage.
I thought of having a central database with multiple clients connecting to it, but I'm not sure if this is a good solution. If your suggestions/solutions require a central server application that the clients connect to (and where the server application does the database handling), that would be no problem.
If possible, a basic sample solution or some basic code that shows how to always work with the latest data would be very helpful!
Similar questions:
Entity Framework - Underlying data (in database) change notification
Entity Framework data updates in a multi-user environment with central database
Entity framework data context not in sync with database?
Thanks in advance
It depends on your environment, the data you are managing and the architecture you want.
If it's acceptable for clients to have copies of the data to work with, and they need to work with that data while not connected to the central server, then you could use the Sync Framework.
You'd have your central SQL Server as usual, and use the Sync Framework to sync with the clients.
You would write a "provider" which would decide how to resolve changes made to the same data by different clients, etc.
You would have to put SQL Server Express (or possibly SQL Server Compact, or the newer LocalDB) onto the client machines.
Then point your Entity Framework model/code at the local database instead of the central one.
http://blogs.msdn.com/b/sync/archive/2008/06/24/sample-sql-express-client-synchronization-using-sync-services-for-ado-net.aspx
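To make the conflict-resolution idea concrete, here is a self-contained sketch of a last-writer-wins rule. The JobRecord type and its fields are illustrative, not part of the Sync Framework API (which exposes conflicts through its own events); the point is only that you decide which side's change survives:

```csharp
using System;

// Hypothetical record type; names are illustrative.
class JobRecord
{
    public int Id;
    public string Status;
    public DateTime LastModifiedUtc;
}

static class ConflictResolver
{
    // Last-writer-wins: keep whichever side changed the row most recently.
    public static JobRecord Resolve(JobRecord server, JobRecord client)
    {
        return client.LastModifiedUtc > server.LastModifiedUtc ? client : server;
    }
}
```

Other policies (server-always-wins, merge field by field, queue for manual review) plug into the same spot.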
Otherwise it comes down to designing and implementing some "tiers" and following a distributed internet/database architecture or SOA.
Some nice free resources:
http://msdn.microsoft.com/en-us/library/ff650706.aspx
http://mtechsoa2011.blogspot.co.uk/2011/04/soa-vs-distributed-internet_27.html
A useful book:
http://www.amazon.co.uk/Service-Oriented-Architecture-Concepts-Technology-Computing/dp/0131858580/ref=sr_1_1?s=books&ie=UTF8&qid=1343295432&sr=1-1
Another solution is to create an "interface" for your database, so that each write operation from one client can notify the other clients. You can implement such an interface with WCF and its callbacks.
I have no simple code for a whole architecture solution... If you ask a more concrete question about building an n-tier application with WCF, I'll try to help.
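As a sketch of the WCF-with-callbacks idea (all names here are illustrative), a duplex service contract lets the server push change notifications back to connected clients:

```csharp
using System.ServiceModel;

// Clients push changes through IDataSync; the service notifies all
// other connected clients via the callback contract.
[ServiceContract(CallbackContract = typeof(IDataSyncCallback))]
public interface IDataSync
{
    [OperationContract]
    void PutRecord(int recordId, string payload);
}

public interface IDataSyncCallback
{
    [OperationContract(IsOneWay = true)]
    void RecordChanged(int recordId, string payload);
}
```

Inside the PutRecord implementation you would capture each caller's channel via OperationContext.Current.GetCallbackChannel&lt;IDataSyncCallback&gt;() and invoke RecordChanged on every other registered client. A duplex-capable binding such as netTcpBinding or wsDualHttpBinding is required.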

.NET Compact Framework - SQL Server Compact or Flat File as backup

I've got a question regarding the pros and cons of using a database vs. a flat file in a Windows Mobile application.
We're developing a mobile application running on Windows Mobile 6.5, using C# and the .NET Compact Framework 3.5.
The mobile component is used in an inventory system for receiving deliveries. The data is then sent to the application server via a web service over Wi-Fi.
Now we need to implement a backup plan. What we have agreed so far is that when the mobile device is unable to send the data to the server, it has to persist it locally and send it at a later time. It can either be sent from the mobile device itself, or the device can be docked/connected to a PC via USB, which pulls the data off so the PC can send it to the server.
My question: which is better to use in this scenario - deploying SQL Server Compact, or writing the data to a flat file (XML, binary, etc.)?
I'd like to get opinions on the pros and cons of each method, keeping in mind not only the technical aspects but also the development work involved.
Thanks!
My personal experience with this scenario is to use SQL CE to do merge replication instead of a web service, for some very good reasons:
Your application becomes easier to develop because you read/write directly to a database and let the CE engine do the merging (LINQ to SQL etc.).
You can control merge conflicts manually at the server or the client - and this is already built, so you don't have to build it yourself.
No backup is necessary because the data on the CE device is already in a database.
Data can be cached locally as needed and filtered by device or user (meaning only the data needed by that device/user is replicated to the device).
Although this does require more SQL knowledge, it is solid technology that works and doesn't have to be re-written, tested and debugged continuously.
If you are locked into using a web service, then SQL CE seems like overkill to me. I would simply duplicate what a database does and write transactions to a file (XML/JSON/binary), then use those as needed: running them against the web service when in Wi-Fi range, or having a service on a PC that pulls the files off the mobile device and runs those transactions against the server.
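A minimal self-contained sketch of that file-based approach; the class, the one-line-per-transaction format and the sample payloads are illustrative, and a real implementation would also need to guard against partial writes and duplicate sends:

```csharp
using System;
using System.Collections.Generic;
using System.IO;

// Each pending operation is appended as one line, then drained
// once the web service is reachable again.
class TransactionFileQueue
{
    private readonly string _path;
    public TransactionFileQueue(string path) { _path = path; }

    public void Enqueue(string transaction)
    {
        File.AppendAllText(_path, transaction + Environment.NewLine);
    }

    // Returns all pending transactions and clears the file, e.g. after
    // they have been posted to the web service successfully.
    public IList<string> Drain()
    {
        if (!File.Exists(_path)) return new List<string>();
        var pending = new List<string>(File.ReadAllLines(_path));
        File.Delete(_path);
        return pending;
    }
}
```

On drain, each line is replayed against the web service in order, which preserves the sequence in which deliveries were received on the device.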

From simple desktop application to client server application

I have developed a simple desktop application with a SQL Server database for a single PC, and now the client wants it to work on multiple PCs. I want to know what is better: for the moment I have moved the database to a remote SQL Server instance, and all the applications just connect to it directly. Is this a good idea, or do I have to make some modifications to improve how the application runs?
The database has a lot of information to be imported into the application.
I don't know much about WCF - would it help to read about it?
You could have a dedicated server with the database hosted on it, and all the client applications could connect to it. One thing you have to take care of, though, is transaction management: while one user is updating a piece of information, no other user should be able to change that same data and leave it inconsistent. You could take a look at this post describing SQL Server transactions.
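One common way to handle this is optimistic concurrency: each row carries a version that an update must match. The sketch below simulates the idea in plain C# (the type and names are illustrative); with SQL Server you would typically use a rowversion column, which Entity Framework can check automatically via a concurrency token:

```csharp
using System;

// Each row carries a version number; an update only succeeds if the
// version is unchanged since the caller originally read the row.
class VersionedRow
{
    public string Value;
    public int Version;

    // Returns false if someone else updated the row first.
    public bool TryUpdate(string newValue, int expectedVersion)
    {
        if (Version != expectedVersion) return false; // conflict detected
        Value = newValue;
        Version++;
        return true;
    }
}
```

On a failed update the client re-reads the row, shows the user the fresh data, and lets them retry, rather than silently overwriting another user's change.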
Depending on the requirements, I'd recommend keeping the local database as a cache for a speedy application start, and implementing a synchronisation process where the local and remote databases are compared from time to time, or the comparison is triggered manually by the user.
This would be very similar to how, for example, IMAP email clients or Evernote work.

partially connected application using asp.net 3.5 (not mobile apps)

We had a requirement to build an ASP.NET 3.5 web application using Web Forms, WCF, ADO.NET and SQL Server. The users would connect via the Internet.
Recently we learned that users may often remain disconnected, with only intermittent Internet access.
I need to understand whether we can create an occasionally connected web application using ASP.NET 3.5 - which technologies/features would we need to use? Is the MS Sync Framework the answer to the problem - is it a viable option to use with a web application?
Is a Windows application the right approach instead of a web application - where the business logic runs on the client itself against a local SQL Express edition, with the data then synced up to the enterprise SQL Server at the server end when a connection is established, using replication and/or the MS Sync Framework? In that case, is there a need to use WCF?
Do Silverlight applications help in this context of building partially connected web apps?
I'd really appreciate pointers on how to go about creating partially connected .NET apps (not mobile apps).
It looks to me as if you'll need to store your client data locally when not connected.
If you use WCF you can choose the transport protocol according to connectivity without affecting your main code: e.g. TCP/IP for the LAN, HTTP for the Internet, and MSMQ for storing up data while disconnected.
If data for transfer is stored up using MSMQ, then as soon as a connection is re-established the data will be passed to your main server.
If you write your WCF/communications code to run as a service (assuming Windows functionality here), then it is up to you whether to retain the ASP code or write a new Windows app.
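As a sketch, the same WCF contract can be exposed over several bindings purely in configuration, and the client picks an endpoint at runtime based on connectivity. Service, contract and address names here are illustrative; note that the MSMQ binding requires one-way operations:

```xml
<!-- Sketch: one service, three endpoints; the client selects
     "Lan", "Internet" or "Queued" depending on connectivity. -->
<system.serviceModel>
  <services>
    <service name="MyApp.DataSyncService">
      <endpoint name="Lan"      binding="netTcpBinding"    address="net.tcp://server:8731/sync"     contract="MyApp.IDataSync" />
      <endpoint name="Internet" binding="basicHttpBinding" address="http://server/sync"             contract="MyApp.IDataSync" />
      <endpoint name="Queued"   binding="netMsmqBinding"   address="net.msmq://server/private/sync" contract="MyApp.IDataSync" />
    </service>
  </services>
</system.serviceModel>
```

With the Queued endpoint, messages sent while disconnected sit in the local MSMQ outgoing queue and are delivered automatically when the server is reachable again.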
edit
Set up MSMQ at both ends. It's part of the Windows setup and can be installed on a client machine just as IIS is - it's on the installation disk but not installed by default.
I wouldn't use it to fetch web pages - have those available on the local machine - but instead use it to queue up data that MUST get back to the server. Your data access layer should be separated from your GUI layer anyway; I assume you're using the MVC pattern or similar.
I don't know what your application is required to do, but here is an example I've worked on.
A mobile user visits clients. He has a replicated copy of the company product database on his laptop. When he visits client sites he may not be able to connect to his company server, but he still wants to place client orders. He does this using his laptop-based application and database, and the order data is queued up in MSMQ on the laptop.
As soon as he is able to connect to his company server, MSMQ automatically sends the order data. The server has queued up MSMQ messages for changes to pricing, stock, etc. that took place while he was disconnected. These are now received and the local database is updated.
The choice of TCP/IP, HTTP or MSMQ happens seamlessly as far as the main application is concerned; the WCF code copes with the choice.
From what I know, you have two options:
Use Gears (now abandoned) or Web Storage to store and sync local data, combined with heavily JavaScripted web pages that can detect loss of connection and work against the local data store.
Use the Sync Framework with a rich client (WinForms, WPF, or possibly Silverlight OOB if it gets supported). The Sync Framework does not require a local installation of a database server; instead it can use SQL Server Compact, which is simply a local file.
At this stage, using the Sync Framework with a rich client seems the better option. Thanks a lot, guys, for taking the time to answer my queries. I will let you know which technologies we used once I manage to deploy the app!

Notifying app from SqlServer

Given a typical three-tier app deployed at two remote sites: the databases behind the two installations contain exactly the same structure and the same data, and there needs to be a replication mechanism between the two back-end databases to keep them in sync. The native replication feature of SQL Server would do the job. However, the business layer of the app caches a lot of data, so if I use database-level replication, the cache in the business layer gets out of sync.
Is there a way to create SQL triggers that notify the business layer about changes?
DB: SQL Server 2005 or 2008; business layer: C#.
Complex option A: if you have static IPs you can host WCF (TCP/IP or SOAP) in the app and call it from SQL CLR; otherwise you could use WCF peer-to-peer, although that may be hard within the limits of SQL CLR. To go one step further, the SQL CLR code could talk to the local program, which then tells the others to update.
Simpler option B: site A clears the cache on its side and sets a flag in the DB; SQL replication picks the flag up and informs site B, which clears its cache and the flag. The cache is cleared by either site whenever a disparity is found.
Take a look at Query Notifications, here are a couple of articles about the subject:
SQL Server 2005 Query Notifications Tell .NET 2.0 Apps When Critical Data Changes
Using Query Notifications in .NET 2.0 to handle ad-hoc data refreshes
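A minimal SqlDependency sketch of Query Notifications; the connection string and query are illustrative, the database needs Service Broker enabled, and the query must follow the notification rules (explicit column list, two-part table names, no SELECT *):

```csharp
using System.Data.SqlClient;

SqlDependency.Start(connectionString);

using (var conn = new SqlConnection(connectionString))
using (var cmd = new SqlCommand("SELECT CustomerId, Name FROM dbo.Customers", conn))
{
    var dependency = new SqlDependency(cmd);
    dependency.OnChange += (sender, e) =>
    {
        // e.Info describes what changed; refresh the business-layer cache here,
        // then re-subscribe, because a SqlDependency fires only once.
    };
    conn.Open();
    cmd.ExecuteReader().Dispose(); // executing the command registers the subscription
}
```

This pushes the notification from SQL Server to the business layer without polling, which is exactly the cache-invalidation trigger the question asks for.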
Sql Cache Invalidation might help.
I've used Message Queuing for this functionality. This solution also copes with downtime between the apps. Just make sure you create the queue as a transactional queue to ensure messages are saved in case of power failure.
See http://www.codeproject.com/KB/database/SqlMSMQ.aspx for an example of how to use it on SQL Server 2005.
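For reference, a minimal System.Messaging sketch of sending to a transactional queue, so that a committed message survives a power failure; the queue path and message body are illustrative:

```csharp
using System.Messaging;

const string path = @".\private$\cacheInvalidation";

// Create the queue as transactional so committed messages are durable.
if (!MessageQueue.Exists(path))
    MessageQueue.Create(path, transactional: true);

using (var queue = new MessageQueue(path))
using (var tx = new MessageQueueTransaction())
{
    tx.Begin();
    queue.Send("Invalidate:Customers", tx); // durable once committed
    tx.Commit();
}
```

The receiving app reads from the queue inside its own MessageQueueTransaction, so a message is only removed once the cache refresh has actually been applied.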
