MySQL synchronization between 2 databases - C#

I have a desktop application that needs an online database to sync between different devices. The problem is that the connection to the server (in my case the JawsDB free plan on Heroku) is very slow, and loading the initial data takes a long time.
What I thought of doing is keeping an offline replica of the database and syncing it from time to time, whenever the user wants to sync with the online data. I did some research and I think master-slave replication is the solution I'm looking for, but in most of the examples I found the master was the online database.
I don't know much about databases, but I think the master in my case would be the offline database.
I'd appreciate it if someone could tell me whether this is the right solution for my problem and how to start implementing it (I'd be thankful if you could tell me how to implement a sync job that runs from code; I'm using C#), or of course if there's another solution that fits my case better.
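For what it's worth, here is a minimal sketch of what such a sync job could look like in C# with Connector/NET (MySql.Data), assuming each table carries a last_modified column and rows are matched on their primary key; the customers table and its columns are made up for illustration:

```csharp
// Minimal sketch: push rows changed locally since the last sync to the online
// database, using last_modified timestamps. Assumes a hypothetical `customers`
// table; pulling remote changes would mirror the same logic in reverse.
using System;
using MySql.Data.MySqlClient;

class SyncJob
{
    public static void PushLocalChanges(string localCs, string remoteCs, DateTime lastSync)
    {
        using (var local = new MySqlConnection(localCs))
        using (var remote = new MySqlConnection(remoteCs))
        {
            local.Open();
            remote.Open();

            // Read rows modified locally since the last successful sync.
            var read = new MySqlCommand(
                "SELECT id, name, email, last_modified FROM customers WHERE last_modified > @since",
                local);
            read.Parameters.AddWithValue("@since", lastSync);

            using (var reader = read.ExecuteReader())
            {
                while (reader.Read())
                {
                    // Upsert each row on the online database.
                    var upsert = new MySqlCommand(
                        @"INSERT INTO customers (id, name, email, last_modified)
                          VALUES (@id, @name, @email, @mod)
                          ON DUPLICATE KEY UPDATE name = @name, email = @email, last_modified = @mod",
                        remote);
                    upsert.Parameters.AddWithValue("@id", reader["id"]);
                    upsert.Parameters.AddWithValue("@name", reader["name"]);
                    upsert.Parameters.AddWithValue("@email", reader["email"]);
                    upsert.Parameters.AddWithValue("@mod", reader["last_modified"]);
                    upsert.ExecuteNonQuery();
                }
            }
            // Persist DateTime.UtcNow as the new lastSync value once both directions succeed.
        }
    }
}
```

Running something like this on a timer or behind a "Sync now" button keeps the UI working against the fast local copy, while the slow Heroku connection is only hit during the sync itself. Note that deletes and conflicting edits need extra handling (for example tombstone rows or a last-write-wins rule).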

Related

Concurrent database access on shared network drive

I'm part of a small team that currently uses an Access database for scheduling a larger team's availability. This has presented some issues with corruption of the Access database. I also want to add more functionality over time.
I've set out to create an application for the 4-5 of us to use that will solve the concurrent database issue, as well as give the team more functionality.
Since this is a shared network drive, I'm guessing I won't have access to SQL Server. I thought maybe a web service would be the way to go, but I don't really want to foot the bill for it. Also, when I eventually leave the team, I don't want to have to keep maintaining it.
One idea I've come up with is an application written in C# that acts as the front end, with SQLite embedded as the back end. However, I've spent days trying to get Entity Framework to work with SQLite and I'm at the point of giving up.
I'm trying to decide what else I can do to solve this issue. Is there another technology I can use?
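On the Entity Framework with SQLite point, here is a minimal sketch of the wiring with EF Core and the Microsoft.EntityFrameworkCore.Sqlite provider; the Shift entity and file path are made up for illustration:

```csharp
// Minimal sketch, assuming the Microsoft.EntityFrameworkCore.Sqlite package.
// The Shift entity and the database path are hypothetical.
using System;
using System.Linq;
using Microsoft.EntityFrameworkCore;

public class Shift
{
    public int Id { get; set; }
    public string Person { get; set; }
    public DateTime Start { get; set; }
    public DateTime End { get; set; }
}

public class ScheduleContext : DbContext
{
    public DbSet<Shift> Shifts { get; set; }

    protected override void OnConfiguring(DbContextOptionsBuilder options)
        // The .db file could live on the shared network drive, though
        // concurrent writers remain a concern (see the answer below).
        => options.UseSqlite(@"Data Source=\\share\team\schedule.db");
}

class Program
{
    static void Main()
    {
        using (var db = new ScheduleContext())
        {
            db.Database.EnsureCreated();   // create the file and schema if missing

            db.Shifts.Add(new Shift
            {
                Person = "Alice",
                Start = DateTime.Today.AddHours(9),
                End = DateTime.Today.AddHours(17)
            });
            db.SaveChanges();

            foreach (var s in db.Shifts.Where(s => s.Start >= DateTime.Today))
                Console.WriteLine($"{s.Person}: {s.Start:t}-{s.End:t}");
        }
    }
}
```

Bear in mind that a SQLite file on a network share still has the multi-writer limitations the answer below describes, so this only really solves the ORM part of the problem.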
As was said, it sounds like you're trying to reinvent the DBMS wheel.
If you have a database that multiple clients use at the same time, sharing an Access file on a network share will simply not cut it. You need a proper DBMS. You have simply outgrown the scale Access was designed for, probably even the scale it was intended for.
You said cost might be an issue, but it really isn't: there are dozens of DBMSs out there, a number of them free. MySQL is a shining example of a free DBMS. Convert that whole Access thing into a MySQL database, write a front end for it, and you're done.
If you already have a computer providing the share across the network, that same computer can host the MySQL server. Setting up a DBMS instance can be a bit more involved than just enabling a share, but not much more than programming a web service.
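As a rough illustration of the front-end side in C#, a minimal sketch using the MySql.Data (Connector/NET) package; the server name, credentials and availability table are made up:

```csharp
// Minimal sketch: the C# front end talks to the MySQL server instead of a shared
// Access file. Server, credentials and the `availability` table are hypothetical.
using System;
using MySql.Data.MySqlClient;

class ScheduleClient
{
    static void Main()
    {
        var cs = "Server=fileserver01;Database=schedule;Uid=team;Pwd=secret;";
        using (var conn = new MySqlConnection(cs))
        {
            conn.Open();   // the DBMS, not a shared file, now handles concurrent access

            var cmd = new MySqlCommand(
                "SELECT person, day, status FROM availability WHERE day >= @today", conn);
            cmd.Parameters.AddWithValue("@today", DateTime.Today);

            using (var reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                    Console.WriteLine($"{reader["person"]}  {reader["day"]}  {reader["status"]}");
            }
        }
    }
}
```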

Best way to make a SQL Server instance available remotely? Linked servers or replication? Other?

A co-worker and I are working on some pharmacy software (in C#) which deals with the management of patient profiles, patient drug prescriptions, etc. All of these different sets of data are stored in a SQL Server database (we're using 2008 Standard, but future versions are fine too). Each store has its own SQL Server instance on a local machine.
Our Goal:
We want "Store A" to be able to access "Store B's" databases if need be - basically in the event that a pharmacy customer is out of town and visits one of the other pharmacy branches.
Things I've thought of:
My initial thought was to keep an online instance of SQL Server that could be accessed through a DNS name (or perhaps an IP address). While figuring out the best way to keep these in sync I came across SQL Server replication. The problem is that I was going to use transactional replication with updating subscribers, but since it's deprecated it isn't really a long-term option anymore. Microsoft suggests peer-to-peer replication instead, but that requires Enterprise edition, which we're really trying to avoid if we can. I wanted a transactional type of replication because it does a much better job of keeping records consistent (no waiting for something like a merge agent job to run every hour).
Something I've thought about more recently is having an internet-facing SQL Server instance that contains nothing but linked servers back to each store's local machine. I wouldn't have to worry about sync problems if the stores just worked directly off each other's local machines. But I've read a lot of people saying this is a horrible security vulnerability, so I'm not sure it's even a plausible idea; maybe there's some way to make it work?
Anyway, this is the basic gist of what we're trying to do. I don't know whether replication or linked servers would be the better route to take.
Edit:
What about bi-directional replication? I've been reading a little about it, but I'm unsure whether it's what I need. I don't want to have to stagger primary keys between servers or anything, since they are pretty important for identifying prescription numbers and the like. But if bi-directional replication works, that could be good too.
Not really an answer but I have more space...
SQL Azure is the 'cloud' version of SQL Server. A VPN is a way of creating your own private network over the internet. Do some research on these terms. Many applications are going cloud nowadays. You should also seriously consider how likely it is that there will be no internet access.
With regard to replication, you can 'roll your own' if you own this application and are happy to support it.
The basic premise is:
Create a trigger on every table which writes the PK of every changed row to a log table
Create a process that copies and merges only the changed rows (based on the log table), along publisher/subscriber lines
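A rough C# sketch of what that second step could look like, assuming the triggers populate a change_log(pk_value, changed_at) table and taking a simple last-write-wins approach; all table and column names here are hypothetical:

```csharp
// Rough sketch of a roll-your-own copy process: triggers fill change_log, and this
// job pushes the affected `prescriptions` rows to another store. Names are hypothetical
// and conflict handling is reduced to "last write wins".
using System;
using System.Data.SqlClient;

class ReplicationJob
{
    public static void PushChanges(string sourceCs, string targetCs, DateTime since)
    {
        using (var source = new SqlConnection(sourceCs))
        using (var target = new SqlConnection(targetCs))
        {
            source.Open();
            target.Open();

            // Find rows that changed since the last run, via the trigger-fed log table.
            var changed = new SqlCommand(
                @"SELECT DISTINCT p.prescription_id, p.patient_id, p.drug, p.updated_at
                  FROM change_log c
                  JOIN prescriptions p ON p.prescription_id = c.pk_value
                  WHERE c.changed_at > @since", source);
            changed.Parameters.AddWithValue("@since", since);

            using (var reader = changed.ExecuteReader())
            {
                while (reader.Read())
                {
                    // Upsert into the other store; MERGE keeps it to one round trip.
                    var merge = new SqlCommand(
                        @"MERGE prescriptions AS t
                          USING (SELECT @id AS prescription_id) AS s
                          ON t.prescription_id = s.prescription_id
                          WHEN MATCHED AND t.updated_at < @upd THEN
                              UPDATE SET patient_id = @pat, drug = @drug, updated_at = @upd
                          WHEN NOT MATCHED THEN
                              INSERT (prescription_id, patient_id, drug, updated_at)
                              VALUES (@id, @pat, @drug, @upd);", target);
                    merge.Parameters.AddWithValue("@id", reader["prescription_id"]);
                    merge.Parameters.AddWithValue("@pat", reader["patient_id"]);
                    merge.Parameters.AddWithValue("@drug", reader["drug"]);
                    merge.Parameters.AddWithValue("@upd", reader["updated_at"]);
                    merge.ExecuteNonQuery();
                }
            }
        }
    }
}
```

The change_log table also gives you a natural place to record which rows each subscriber has already received, so repeated runs only move new changes.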

Push data from SQL Server to desktop application

I have a pretty simple .NET 4 desktop application written in C# which needs to display some data inserted into a table on a SQL Server (2005). The data itself is quite simple: just one row of about 10 columns (mostly counts of other data).
I could just poll the SQL Server from the application at some fixed interval, but my preference is to have SQL Server push the data out to the application if possible, as the timing of the "new data" is often irregular.
In short, I'd like to know if this is possible. Doing some research before posting this question, I found a few possibilities.
1) SignalR: I found this question, which seemed promising, but it appears to be in the context of a web application rather than a desktop one. Reviewing the SignalR wiki, it seemed to me that it requires some kind of web service or other HTTP connection, which I'd prefer to avoid.
2) SQL Server change tracking, from this question. Firstly, I'm not on SQL Server 2008, so I assume I'd have to install or configure it (which isn't a problem), but I'm also not sure whether it will provide what I need.
I'll also mention that this client application could run on 100+ different PCs, all of which would need to be notified of the data change.
So, is such a thing possible? I apologize if the question is a little vague - and thanks in advance for your help!
The SqlDependency class is supposed to cater to the very scenario that you are referring to.
While I do not have any personal experience using it, this article seems to be in line with your scenario.
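For reference, a minimal sketch of how SqlDependency is typically wired up; it requires Service Broker to be enabled on the database, and the connection string and Counts table here are hypothetical:

```csharp
// Minimal sketch of SqlDependency usage. Assumes Service Broker is enabled on the
// database (ALTER DATABASE ... SET ENABLE_BROKER) and that the account has query
// notification permissions. The `dbo.Counts` table and columns are hypothetical.
using System;
using System.Data.SqlClient;

class CountsWatcher
{
    const string ConnectionString =
        "Data Source=myServer;Initial Catalog=myDb;Integrated Security=True";

    static void Main()
    {
        SqlDependency.Start(ConnectionString);   // open the notification listener once per app
        Subscribe();
        Console.WriteLine("Waiting for changes; press Enter to quit...");
        Console.ReadLine();
        SqlDependency.Stop(ConnectionString);
    }

    static void Subscribe()
    {
        using (var conn = new SqlConnection(ConnectionString))
        // Query notifications require explicit column lists and two-part table names.
        using (var cmd = new SqlCommand("SELECT OrdersToday, ErrorsToday FROM dbo.Counts", conn))
        {
            var dependency = new SqlDependency(cmd);
            dependency.OnChange += OnCountsChanged;

            conn.Open();
            using (var reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                    Console.WriteLine($"Orders: {reader["OrdersToday"]}, Errors: {reader["ErrorsToday"]}");
            }
        }
    }

    static void OnCountsChanged(object sender, SqlNotificationEventArgs e)
    {
        // Notifications fire once only; unhook, re-subscribe, and refresh the display.
        ((SqlDependency)sender).OnChange -= OnCountsChanged;
        Subscribe();
    }
}
```

With 100+ client PCs each holding its own subscription, the Service Broker queue load is worth testing, but mechanically each client looks like the sketch above.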

Architecture Question - One Central Database and Many Different Programs Accessing It

I am designing a program that will build and maintain a database, and act as a central server. This is the 'first stage' of a grander plan. Coming later will be 3-5 remote programs built around the information put into this database.
The requirements are:
The remote programs must be able to access the information in the database.
The remote programs must be able to set alerts when information in the database changes.
The remote programs must be able to request the central server to go out and fetch new / different data.
So, the question is this: how do I expose this data and events to the outside world? My two choices are:
1) Have them communicate directly with my 'server' application. This seems easier for doing event notifications (although I suppose I'm probably missing something in SQL). It also seems more 'upgradeable': I don't need to worry about a database change crashing all my remote programs, because I can account for the change and transform the data into a version the child programs will understand.
2) Just let them connect directly to the database. The nice thing about this is that it's already solved; I can use LINQ to SQL. The only thing the main server application needs to do is tell the remote programs where the database is.
I'm unsure how to trigger and relay 'events' for field changes in the database across different programs that may or may not be on the same computer.
Forgive my ignorance on this question. I feel woefully unprepared to ask it, but I'm having a hard time figuring out where to get started with this. It is my first real DB project :-/
Thanks!
If the other programs are going to need to know about updates to the database, then the best solution is to manage all db updates through your server application so it can alert clients of the changes. Otherwise it will be tough for the clients to be aware of changes to the db. This also has the advantage of hiding the implementation details of your storage solution from the clients, so you are free to change databases, etc...
My suggestion would be to go with option 1. Build out a web service that can provide the information they all need. This will be the most flexible approach and lets you avoid the duplicate backend code you'd end up with if each program talked to the database directly.
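To make option 1 concrete, here is a minimal sketch of such a service, self-hosted with WCF; the ICentralServer contract, the Item DTO and the address are all made up for illustration:

```csharp
// Minimal sketch of option 1: the central server application exposes a small service
// that the remote programs call instead of touching the database directly.
using System;
using System.Collections.Generic;
using System.Runtime.Serialization;
using System.ServiceModel;

[DataContract]
public class Item
{
    [DataMember] public int Id { get; set; }
    [DataMember] public string Name { get; set; }
    [DataMember] public DateTime UpdatedAt { get; set; }
}

[ServiceContract]
public interface ICentralServer
{
    [OperationContract]
    List<Item> GetItems(DateTime changedSince);   // remote programs read through the server

    [OperationContract]
    void RequestRefresh(string sourceName);       // ask the server to go fetch new/different data
}

public class CentralServer : ICentralServer
{
    public List<Item> GetItems(DateTime changedSince)
    {
        // The server reads its own database here (e.g. via LINQ to SQL) and maps rows
        // onto the Item DTO, shielding clients from schema changes.
        return new List<Item>();
    }

    public void RequestRefresh(string sourceName)
    {
        // Kick off whatever data-fetching job the central server owns.
    }
}

class Program
{
    static void Main()
    {
        using (var host = new ServiceHost(typeof(CentralServer),
                                          new Uri("http://localhost:8733/central")))
        {
            host.AddServiceEndpoint(typeof(ICentralServer), new BasicHttpBinding(), "");
            host.Open();
            Console.WriteLine("Central server listening; press Enter to stop.");
            Console.ReadLine();
        }
    }
}
```

The change-alert requirement could then be handled either by clients polling GetItems with a timestamp or by adding a WCF duplex (callback) contract, rather than exposing database events directly.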
I would recommend looking at some data source design patterns first. These types of patterns will help you come up with solutions for how to manage the state of your data. Otherwise, I'd need some more information about your clients' requirements to make any further useful suggestions.
I recommend you learn about SQL Server and/or databases first. You don't appear to realize that most of what you want from your "central server" can be done by SQL Server itself.
A central database is the simplest option and the cheapest to both build and maintain.
There are however a few scenarios where a central database could cause problems:
High load on one of the systems: a heavy load from one system could reduce performance for the others. For example, someone running an internal report stops you from being able to take orders on your eCommerce site.
With several systems writing to the same database there is a greater chance of locking.
With several systems dependent on the same database schema, how do you upgrade? All systems at the same time?
If you need to take down the database, all systems stop.

Sync data between a windows desktop app and windows mobile client app

I need to knock up a very quick prototype/proof-of-concept application to demo to someone within the next couple of days, so I have minimal time to research this as fully as I normally would. The set-up is a very simple database application running on a laptop; there will only ever be a single user updating a couple of tables, so I was thinking of knocking up a basic WinForms app against SQL Compact. Visual Studio's auto-generated data grid edit screens will be fine with a little customisation. The second aspect is to add a Windows Mobile client application that can pull data from both tables stored on the laptop, edit some data and insert some extra rows, then send the changes back to the laptop copy of the database.
I've not done any Windows Mobile development, so what's the best approach for me to look at? Is it easy enough to sync data between the two databases when the device is connected to the laptop over USB?
Most of the samples I've looked at so far seem to sync SQL Compact with SQL Server Standard using IIS, which seems a bit of overkill. The volume of data to be synced is so small that I can easily write some manual sync code, if it's easy for me to query/update the Compact DB from the laptop application when the device is connected.
Edit:
I've seen mention that a quick and easy solution is to use RAPI: when the device is connected, copy the DB to the laptop, connect to it, do the necessary magic, and then copy it back to the device. Any problems with this approach? This is a single laptop user with a single mobile device user to sync up, so it's pretty basic stuff. In any single sync the volume of updates is likely to be fewer than 10 records.
Take a look at Microsoft's Sync Framework. They have examples of synchronizing SQL CE as well as contact data. The Sync Developer Center page has loads of info as well.
Trying to do this manually is not fun. It sounds easy, but once you get into things like collision detection, precedence, transactions, guaranteed delivery and loads of other stuff you'll find it really isn't as straightforward as you might think.
EDIT
If your scenario really is as easy as you say (i.e. it's not really a sync, but a data copy) then yes, RAPI is probably the easiest mechanism, provided ActiveSync exists and is acceptable as part of the solution. It's nice because you don't have to write the transport infrastructure, and if what's in the box isn't enough, you can always write custom RAPI extensions.
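A rough sketch of that copy-merge-copy-back flow, with the merge done through SQL Server Compact (System.Data.SqlServerCe); the RAPI file-transfer calls are left as comments because the exact wrapper API varies, and the paths and notes table are hypothetical:

```csharp
// Rough sketch of "copy, merge, copy back". The RAPI transfer is only indicated in
// comments (the managed wrapper you use, e.g. OpenNETCF's, determines the exact calls);
// the .sdf paths and the `notes` table are hypothetical.
using System;
using System.Data.SqlServerCe;

class ManualSync
{
    static void Main()
    {
        const string laptopDb = @"C:\data\laptop.sdf";
        const string deviceDbCopy = @"C:\data\device_copy.sdf";

        // 1) Pull the device database over ActiveSync/RAPI onto the laptop as deviceDbCopy.

        // 2) Merge: copy rows the device added since the last sync into the laptop DB.
        using (var device = new SqlCeConnection("Data Source=" + deviceDbCopy))
        using (var laptop = new SqlCeConnection("Data Source=" + laptopDb))
        {
            device.Open();
            laptop.Open();

            var newRows = new SqlCeCommand(
                "SELECT id, body, created_at FROM notes WHERE synced = 0", device);
            using (var reader = newRows.ExecuteReader())
            {
                while (reader.Read())
                {
                    var insert = new SqlCeCommand(
                        "INSERT INTO notes (id, body, created_at) VALUES (@id, @body, @created)",
                        laptop);
                    insert.Parameters.AddWithValue("@id", reader["id"]);
                    insert.Parameters.AddWithValue("@body", reader["body"]);
                    insert.Parameters.AddWithValue("@created", reader["created_at"]);
                    insert.ExecuteNonQuery();
                }
            }
            // ...repeat in the other direction for laptop-side changes, then mark rows as synced.
        }

        // 3) Push the merged copy back to the device over RAPI.
    }
}
```

For fewer than 10 records per sync this is manageable, but as noted above, anything beyond a straight copy quickly runs into the collision/precedence edge cases.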
