From simple desktop application to client server application - c#

I have developed a simple desktop application with a SQL Server database for a single PC, and now the client wants it to work on multiple PCs. I want to know what is better: for the moment I have made the database remote (managed through SQL Server Management Studio) and all the applications just connect to it. Is this a good idea, or do I have to make some modifications to improve how the application runs?
The database has a lot of information to be imported to the application.
I don't know much about WCF; would it help to read about it?

You could have a dedicated server with the database hosted on it, and all the client applications could connect to it. One thing you have to take care of is transaction management: while a user is updating some piece of information, no other user should be able to change that piece of data and make it inconsistent. You could take a look at this post describing SQL Server transactions.
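As a hedged sketch of what guarding an update with a transaction might look like in T-SQL (the Stock table and its ProductId/Quantity columns are invented for illustration):

```sql
BEGIN TRANSACTION;

-- UPDLOCK takes an update lock on the row for the duration of the
-- transaction, so a second user cannot change it underneath us.
SELECT Quantity
FROM   Stock WITH (UPDLOCK)
WHERE  ProductId = 42;

UPDATE Stock
SET    Quantity = Quantity - 1
WHERE  ProductId = 42;

COMMIT TRANSACTION;
```

A second connection attempting the same locked read will block until the first transaction commits or rolls back, which is exactly the "no other user can make this inconsistent" guarantee.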

Depending on the requirements, I'd recommend keeping the local database as a cache for speedy application start, and implementing a synchronisation process where the local and remote databases are compared from time to time, or when triggered manually by the user.
This would be very similar to how, for example, IMAP email clients or Evernote work.
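A minimal sketch of the comparison step in such a synchronisation, assuming each row carries a last-modified timestamp. The `Item` type and the last-write-wins policy are illustrative, not what IMAP clients or Evernote actually do:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

class SyncSketch
{
    // Hypothetical row shape: an Id plus a last-modified timestamp.
    public record Item(int Id, string Value, DateTime ModifiedUtc);

    // Last-write-wins merge: for every Id present in either store,
    // keep whichever copy was modified most recently.
    public static Dictionary<int, Item> Merge(IEnumerable<Item> local, IEnumerable<Item> remote)
    {
        var merged = local.ToDictionary(i => i.Id);
        foreach (var r in remote)
            if (!merged.TryGetValue(r.Id, out var l) || r.ModifiedUtc > l.ModifiedUtc)
                merged[r.Id] = r;   // remote copy is new or newer: take it
        return merged;
    }

    static void Main()
    {
        var t0 = new DateTime(2023, 1, 1);
        var local  = new[] { new Item(1, "local-old", t0), new Item(2, "local-only", t0) };
        var remote = new[] { new Item(1, "remote-new", t0.AddHours(1)), new Item(3, "remote-only", t0) };
        var merged = Merge(local, remote);
        Console.WriteLine(merged[1].Value);   // the newer remote copy wins
        Console.WriteLine(merged.Count);      // union of both stores
    }
}
```

Deletions and true conflicts (both sides edited since the last sync) need more bookkeeping, e.g. tombstone records or per-row version counters.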

Related

How to connect a C# project to an Access database placed on a server (NOT on a local network) to edit and view?

I have some databases built in Access and I want to be able to view and edit them when I'm not connected to a local server/network.
How can I use C# to view and edit Access databases placed on a server, without a VPN or SMB (i.e. without creating a local network)? Is it possible to edit them in real time on the server?
They need to be accessed by more than one person, so I also want, for example, to block a table if someone is already editing it (this is also where the FTP idea comes in: download, edit on the PC, and re-upload).
I hope I was clear enough and provided enough info, thanks to all helpers! Enlighten me please :)
We would first have to ask how end users are going to run and use the C# program.
Desktop: users would need a network connection to the server (most likely a VPN).
Web based: users would need a network connection to the web server. This could also be a VPN, or it could be a public-facing web server, which would then require logons for security.
If users don't have a network connection, then it's not going to matter whether this is Oracle, MySQL, SQL Server, or Access. And if this is web based, users still need to be able to connect to that web server.
So, without some kind of network connection to the server or computer where the data resides, and having eliminated a VPN, your options are limited.
You can build a web site and place it on a server. However, if users don't have any kind of network connection at all, then I fail to see how FTP, let alone any other kind of connection, could work.
This needs to be accessed by more than one person,
OK, you need multi-user. However, locking a whole table on SQL Server so that only one user can use it at a time is actually quite difficult.
But let's accept that you want one user in a given table at a time (although both Access and a web site would in fact allow multiple users, even editing the same table).
All in all?
This suggests the most obvious solution: run a web server. Any user can then connect to the web site, and the web site can read and use the Access database residing on that server. This also means you don't need any client software installed.
FTP is not a practical solution, since it only works on a whole file.
So, users will require some means to connect to some server. That being the case, write your C# application as web based; no client software will be required, and the only software that interacts with the Access file on that server will be the web site.
So we are heading towards a web solution: the software stays and runs 100% on the server side, and the only client software required is a browser.
I developed a simple Python web server to work with the Access DB via HTTP:
https://github.com/vikilpet/MS-Access-HTTP-Server
Probably this is not an ideal solution for your case but it may be a good starting point.

What do we need to run software with a MySQL database on the user's computer?

I created an application with the C# programming language and a MySQL database.
When I want to install the software on the user's computer, what software do I need?
Does the MySQL server have to be fully installed on the user's system, or is there another way?
Is this a stand-alone database, or are you creating a shared database that the client application needs to connect to?
If it's a stand-alone program then don't use MySQL. Use an embeddable database library like SQLite instead. These are far more durable and resilient and can handle abrupt shutdowns and restarts. MySQL needs a lot more care and attention, plus has a huge footprint in terms of memory and CPU consumption. Managing that automatically is not easy. Walking a client through how to repair a damaged MySQL database is not something you want to do.
If it's a shared database then you technically only need the client libraries, but remember, exposing MySQL to the general internet is extremely dangerous and should be avoided whenever possible. If you must, ensure that your users are using SSH or a VPN of some sort for access to restrict who can connect to your database server.
The best plan for a remote application is to build out an API that intermediates between client and database, giving you the ability to layer in access control at every level necessary to protect the data. MySQL has very broad access control, locking down individual records can be hard to do and easy to get wrong. Unless you can trust all users, it's best to not trust any.
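A minimal sketch of what "layer in access control" can mean at the API level: the server decides which rows a caller may see before anything reaches the client. The `Order` type and the owner/admin rule are hypothetical:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

class ApiSketch
{
    // Hypothetical row shape for the data behind the API.
    public record Order(int Id, string OwnerLogin, decimal Amount);

    // The API layer, not MySQL, enforces the rule: a caller sees only
    // their own orders unless they are an admin.
    public static List<Order> VisibleOrders(IEnumerable<Order> all, string login, bool isAdmin)
        => all.Where(o => isAdmin || o.OwnerLogin == login).ToList();

    static void Main()
    {
        var orders = new[]
        {
            new Order(1, "alice", 10m),
            new Order(2, "bob",   20m),
        };
        Console.WriteLine(VisibleOrders(orders, "alice", isAdmin: false).Count); // only her own order
        Console.WriteLine(VisibleOrders(orders, "alice", isAdmin: true).Count);  // admin sees all
    }
}
```

Because the database credentials live only on the API server, no client ever holds a direct MySQL connection, which is the point of the intermediation.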
You don't need the server installation on the client/remote machine. A client tool such as MySQL Workbench is enough to connect to the server and perform any database operations.

Reaching a file in a server through C# application

I wrote an application in C# and SQLite for storing data about all employees in a company which has around 500 employees. I want to put the database and the application on a file server/shared folder (a Microsoft server). All employees will then have a shortcut to the application on their desktops. I want to enable/disable some input fields (text boxes) based on the permissions of the user running the application. What's the best practice for doing that?
I want users to be able to read/write the database through my application only (the application is located in the same folder as the database). I don't want users to reach the database without my application. How can I do that?
I don't want the user to reach the database without my application
If your application accesses the SQLite database directly via a Windows file share, this is impossible. Sure, you can make it inconvenient, but you can't truly prevent it.
The only way to achieve this really is by introducing some middleware.
This would typically be a service (WCF perhaps) that listens for connections from your client application, authenticates them, and manages all access to the underlying database. The database would be stored in a location that is visible to the server only, and not visible through a Windows share to your users.
Also, SQLite isn't exactly a great choice for a multi-user system. You can kill two birds with one stone here: switch to a DBMS that accepts client connections over a network (MS SQL Server Express is free; MySQL and PostgreSQL are also common free choices) and build your application to connect directly to the database server. Using integrated Windows authentication may also be possible this way, so you can avoid an explicit logon. In a simple scenario this may be adequate and save you from building an explicit service layer.
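For illustration, the connection strings for such a direct connection might look like this (the server, instance, and database names are placeholders):

```
-- SQL Server Express over the network, integrated Windows authentication:
Server=FILESERVER\SQLEXPRESS;Database=EmployeeDb;Integrated Security=true;

-- The same with an explicit SQL logon instead:
Server=FILESERVER\SQLEXPRESS;Database=EmployeeDb;User Id=appUser;Password=...;
```

With integrated security, the user's Windows identity travels with the connection, so the database can enforce per-user permissions without a separate login screen.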
In a more complex scenario, it can still make sense to have a middleware layer between the application and the database - this way, you can change the database design without changing the application design and deploying to all of your client machines - instead, just change the middleware layer in one place and your application won't know the difference.
If you don't want users to reach your database directly, you should create a client-server architecture.
You can run your service on the same machine as the file server (running as a Windows Service) and use WCF for communication between your server and your client. You access the database from your server, and let the server authenticate your users and validate that they have access to the application.
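A minimal sketch of the authenticate-and-validate step such a service would perform before touching the database. The in-memory user store and plaintext passwords are purely illustrative; a real service would check hashed credentials against a directory or a users table:

```csharp
using System;
using System.Collections.Generic;

class ServiceGate
{
    // Hypothetical user store: login -> (password, may use this app).
    // Plaintext only for the sketch; store hashes in practice.
    static readonly Dictionary<string, (string Password, bool MayUseApp)> Users = new()
    {
        ["alice"] = ("s3cret",  true),
        ["bob"]   = ("hunter2", false),
    };

    // The service, not the client, decides whether a call may reach the database.
    public static bool Authorize(string login, string password)
        => Users.TryGetValue(login, out var u) && u.Password == password && u.MayUseApp;

    static void Main()
    {
        Console.WriteLine(Authorize("alice", "s3cret"));  // valid user with app access
        Console.WriteLine(Authorize("bob", "hunter2"));   // valid password, no app access
        Console.WriteLine(Authorize("eve", "x"));         // unknown user
    }
}
```

Every service operation would call a check like this first, so the database itself is never exposed to the client machines.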
You can cheat and try to "hide" database credentials inside your client application, but that is security by obscurity: anyone with some programming skills can extract the credentials and connect directly to the database.

Organization of a multi-user application

I have a WPF C# multi-user application which interacts with a SQL Server Express database. Currently I am facing the following issue:
How do I organize the application and the database so that several users on different stations can work with it? Maybe I should put the database file on a server and make the application on all the other stations refer to that server when interacting with the database? If so, how can I secure the database file?
Is there a scenario in which I could install my application on a server, designate that installation as the server, and point the installations on the other machines at it?
Any advice on general strategies in such cases would be appreciated.
Thanks in advance!
If all the users are concurrent, then you're going to need to place the SQL Server instance on a server that they all have access to.
You're also going to need to look at quite a few related concerns, such as how you're going to manage your transactions and how your persistence layer is going to function in general.
Each of those topics will probably breed many more SO questions :)
This could provide some inspiration on how to structure the persistence layer:
http://msdn.microsoft.com/en-us/magazine/dd569757.aspx
For a multi-user application, you should definitely put the database on a server. And because the application is for multiple users, the first screen shown when a user opens the application should be a login screen (just as with a web application).
Security isn't a big concern here: once a database file is placed in the filesystem, only users of that computer can access it, and the computer holding the databases should have only administrators as users. One other point: Windows may have IIS running; don't put the database files under the public root of IIS, so that outsiders can't download them over HTTP.
Say the users work in the same office. You can designate any computer in the LAN as the server and install the database on it. Every computer in the LAN has a LAN IP (e.g. 192.168.1.100), and your application can connect to that IP for database operations.
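For illustration, a connection string pointing at such a LAN server might look like this (the database name and credentials are placeholders; the IP is the example from above):

```
Server=192.168.1.100;Database=MyAppDb;User Id=appUser;Password=...;
```

The application on every station uses the same string, so moving the database to another machine only requires changing the server address in one configuration value.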

What is the best way to port the data back and forth from our client’s local database to/from our webserver database?

My question is: What is the best way to port the data back and forth from our client’s local database to/from our webserver database?
An explanation of the problem:
Our clients run our software against their local copy of our SQL Server 2008 R2 database. We routinely (once a day, in the middle of the night) need to combine fields stored in multiple tables for each of these clients (i.e. a view) and send that information over the internet to a SQL Server 2008 R2 database which we will host on our webserver. Each of our clients may have tens of thousands of records which we will need to port to our webserver database.
This information will allow our client’s customers to make payments and place orders. We will store these transactions in one or more tables in our webserver database. At regular intervals we need to push these transaction records back to our client’s local database. The client will then process these transactions and update the records which we push up to our webserver database at night.
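A hedged sketch of the "combine fields stored in multiple tables" step as a view, on the client side of the nightly export (all table and column names are invented):

```sql
CREATE VIEW dbo.CustomerExport
AS
SELECT c.CustomerId,
       c.Name,
       b.BalanceDue,
       b.LastPaymentDate
FROM   dbo.Customers c
JOIN   dbo.Balances  b ON b.CustomerId = c.CustomerId;
```

Whatever transport ends up moving the data, exporting from a view like this keeps the webserver schema decoupled from the client's table layout.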
I am a C# programmer in a very small shop. My first thought was to write a windows service to control the porting of data back and forth. This would require installing the service on each of our client’s server. I am concerned with our ability to easily maintain and extend that service. In particular, when we decide to port more data back and forth this would require updating the service at each client site. Given the size of our shop, that would become a serious challenge.
I would prefer to manage this process through SQL Server, preferably at the SQL Server instance on our webserver. However, we have no one with extensive knowledge of SQL Server. (I am the SQL Server guru here, and I know just enough to be dangerous.) Some of our clients are very small companies and only have SQL Server Express installed on their server. We did some experiments with replication, but never found a way to make it work reliably over the internet, especially with SQL Server Express.
I have read some about SSIS, Linked Servers, Service Broker, and 3rd party tools such as RedGate’s SQL Compare. I am uncertain which, if any, of these options would best suit our needs and have not found clear examples showing how to make use of each.
Any guidance on this issue would be very much appreciated. It would be particularly helpful if you can point me to relevant examples showing how to do what I have described above.
Just quickly: one option is to use the MS Sync Framework. As far as I can see it does exactly what you need, though I'm not sure of the specifics in your case.
Hope this helps.
