C# and a Network Share Database

I'm sure this has been asked before, but I am developing an application which needs to use a database stored on a network share. I don't have the option of setting up a dedicated SQL server, so my only option is a file-based database.
There will initially be 5-10 users, who will primarily read from the database and only write to it a couple of times per hour at most.
I've read on here that people recommend I stay away from Access, but what other options are there to achieve what I am after?

I would suggest either using SQL Server Compact Edition or SQLite.
Neither requires a server to run (they're file-based), and both are more full-featured and reliable than an Access database.
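For a light, mostly-read workload like the one described (5-10 users, a couple of writes per hour), a minimal sketch of opening a SQLite file sitting on the share might look like the following. This assumes the System.Data.SQLite library and a made-up UNC path; SQL Server Compact usage would look very similar with SqlCeConnection. Keep in mind that SQLite's own documentation warns about file locking on network shares, so test against your actual share.

    using System;
    using System.Data.SQLite;

    class NetworkShareDbDemo
    {
        static void Main()
        {
            // Hypothetical UNC path to the single database file on the share.
            const string connString =
                @"Data Source=\\fileserver\apps\inventory.db;Version=3;";

            using (var conn = new SQLiteConnection(connString))
            {
                conn.Open();

                // A typical read - the common case here, with only occasional writes.
                using (var cmd = new SQLiteCommand("SELECT COUNT(*) FROM Items", conn))
                {
                    long itemCount = (long)cmd.ExecuteScalar();
                    Console.WriteLine("Items: " + itemCount);
                }
            }
        }
    }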

Related

.NET Windows application: store data offline and sync to the database when the network is available

I am developing a Windows application for agricultural purposes. This application will be used by multiple users to maintain the data. The main issue is that there won't be network connectivity at the work location, but by the end of the day the users can go somewhere with a connection and synchronize, if there is an option to do so.
I just want to know how we can store all the data locally and push it to the central database when a network is available.
The option I thought of is to have SQL Server on every machine that runs this application and store the data in the local database when there is no network,
with a separate button to export the local data to the centralized database when a network is available.
This looks complicated. Is there a better and easier option?
I prefer using C# and Visual Studio.
Thanks.
You can use SQLite for storing data locally. It's fast, lightweight, and public domain.
You can use whatever database you prefer for the centralized server.
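A rough sketch of that local-store / push-later idea, assuming System.Data.SQLite for the local file and System.Data.SqlClient for the central server; the table, columns and connection strings are made up for illustration:

    using System.Collections.Generic;
    using System.Data.SqlClient;
    using System.Data.SQLite;

    class FieldDataSync
    {
        const string LocalConn = "Data Source=fielddata.sqlite;Version=3;";
        const string CentralConn = "Server=central;Database=AgriData;Integrated Security=true;";

        // Called from an "Export" button when the network is available.
        public static void PushPendingRows()
        {
            var pending = new List<object[]>();

            using (var local = new SQLiteConnection(LocalConn))
            {
                local.Open();

                // Collect everything written while offline (Synced flag still 0).
                var select = new SQLiteCommand(
                    "SELECT Id, FieldName, Yield FROM Harvest WHERE Synced = 0", local);
                using (var reader = select.ExecuteReader())
                {
                    while (reader.Read())
                        pending.Add(new[] { reader["Id"], reader["FieldName"], reader["Yield"] });
                }

                using (var central = new SqlConnection(CentralConn))
                {
                    central.Open();
                    foreach (var row in pending)
                    {
                        var insert = new SqlCommand(
                            "INSERT INTO Harvest (FieldName, Yield) VALUES (@f, @y)", central);
                        insert.Parameters.AddWithValue("@f", row[1]);
                        insert.Parameters.AddWithValue("@y", row[2]);
                        insert.ExecuteNonQuery();

                        // Mark the local row only after the central insert succeeds.
                        var mark = new SQLiteCommand(
                            "UPDATE Harvest SET Synced = 1 WHERE Id = @id", local);
                        mark.Parameters.AddWithValue("@id", row[0]);
                        mark.ExecuteNonQuery();
                    }
                }
            }
        }
    }

Conflict handling and duplicate detection are deliberately left out of this sketch.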
Well, this is quite a broad question, as there are many options and scenarios. The questions you should ask yourself are:
Does each user handle only new information, or also information from other users since the previous sync?
Do you have to handle update conflicts?
Do you handle text information only, or do you have complex types and binary files?
As for the solution, the easiest way, from my point of view, would be to use SQLite on the portable devices; it is a lightweight SQL engine that will let you handle the information easily. On the server you can use whatever you want: SQL Server, MySQL or any other SQL flavor you like. Just make sure there is a connector for your portable device's OS.
If you are still thinking of using SQL Server on the portable device (it's a battery hog!), you might want to check the Microsoft Sync Framework, as it covers almost all the common scenarios for handling data syncing, managing conflicts, etc.
Thanks for the answers. Here is the solution we implemented:
1) Installed SQL Server Express on all the local machines
2) Used Microsoft Sync framework to sync the data. The sync is configured on demand.
Issues faced:
1) We were using the geometry datatype on a few tables, and this was not supported by the Sync Framework.
2) Any change in the database schema is not reflected on the client machines. We have to delete all the system-generated procedures used to track table changes and regenerate them. I am sure there is a much better way to do this.
Cheers,
Jebli

Which one is the best method to replicate a database in SQL Server?

I was wondering what the best way is to replicate some data from one database to another.
I have a database on one computer that receives transactions. I need to send this data to another server (on the same local network), but with a modified value (I need to add 11 years to a timestamp value).
So I was looking at some options for my case. I could develop a Windows service to do this, but I don't know if SQL Server replication can do it for me, or if there is another option, like some kind of magical trigger.
I'm using SQL Server 2005 on Windows Server 2003 R2.
This link should help you:
Selecting the Appropriate Type of Replication
Quoted summary from link:
Microsoft SQL Server offers three types of replication. Each type of replication is suited to different application requirements. Depending on the needs of your application, you can use one or more types of replication in a topology:
Snapshot replication
Transactional replication
Merge replication
I personally would replicate the database (transactional) and then use log shipping to update the replicated database (on your second server) with the latest data changes (from the primary server), and then use a stored procedure running as a SQL Agent job to update the fields you need.
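If you end up doing the adjustment yourself (for instance from the Windows service mentioned in the question, or from the scheduled job suggested here), a minimal sketch of the 11-year shift might look like this; the server, table and column names are placeholders:

    using System.Data.SqlClient;

    class AdjustTimestamps
    {
        static void Main()
        {
            const string connString =
                "Server=SECONDSERVER;Database=ReplicaDb;Integrated Security=true;";

            using (var conn = new SqlConnection(connString))
            {
                conn.Open();

                // DATEADD shifts the copied value by 11 years; writing it to a
                // separate column keeps the original intact so the job can re-run safely.
                var cmd = new SqlCommand(
                    @"UPDATE dbo.Transactions
                      SET AdjustedAt = DATEADD(year, 11, OccurredAt)
                      WHERE AdjustedAt IS NULL", conn);
                cmd.ExecuteNonQuery();
            }
        }
    }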
I personally am not a fan of triggers, as you can end up with triggers activating other triggers; something that takes milliseconds to run can then take seconds, and with large volumes of data that can be painful. (I manage a system that has exactly this issue - soon to be replaced, thankfully.)
Hope this helps, and if you have any follow-up questions I'll be happy to help.

How can I store a lot of data locally for a program

I am currently building (in C#) a fairly basic point-of-sale program for a local community in Uganda to use in tracking business at their sunflower seed press. I was thinking that I would need some sort of database (like a SQL database), but I've never set up a database before, so I'm wondering what the best way to do this is. Maybe a database isn't the best way. The program will not have internet access, so everything will have to be done locally on the machine.
I think your first step should be designing the data you need to store. Build an entity-relationship model and decide what your domain model is going to be. There are many different database engines out there that you can use, with different features, installation requirements, etc. A database engine can be installed locally, or on a remote machine to connect to. If you're writing a C# app, you'll probably want to use the System.Data namespace. You can use plain ADO.NET, or use something like LINQ to Entities to help create proxy classes for your data tables.
You can access a SQL database using the same API for queries and record extraction regardless of the DB engine used. In some cases, you may need to use a separate library that provides an implementation (or a better one), as in the case of an Oracle database and the Oracle Data Access Components. Right out of the gate, .NET works very well with Microsoft SQL Server, but other options work too.
The details of which database engine you choose are not as important as defining a good set of data tables to represent your data.
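As a purely hypothetical first pass at that kind of domain model for the seed-press scenario (the entities and fields are only illustrative):

    using System;
    using System.Collections.Generic;

    public class Customer
    {
        public int Id { get; set; }
        public string Name { get; set; }
        public List<Sale> Sales { get; set; }   // one customer, many sales
    }

    public class Sale
    {
        public int Id { get; set; }
        public int CustomerId { get; set; }     // foreign key back to Customer
        public DateTime Date { get; set; }
        public decimal KilogramsPressed { get; set; }
        public decimal AmountPaid { get; set; }
    }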
Yes. If it has a lot of data, you should consider using a database. Whether you have internet or not, as long as you have a local network you can easily use a database:
Set up a database server (maybe SQL Server).
Create your database and install it on the database server.
Build your application and connect to your database through a connection string.
You are on the right track to use a database to store data. It is pretty easy to accomplish. Your computer does not need to be connected to the internet.
SQL Server Express Edition is free with a limit of 10 gigs of data. This will probably be much, much more space than you will need.
From C#, use ADO.NET. It is very simple if you know some SQL. Code samples here.
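For example, a minimal ADO.NET sketch against a local SQL Server Express instance (the instance, database and table names here are placeholders):

    using System;
    using System.Data.SqlClient;

    class Program
    {
        static void Main()
        {
            // Default instance name for a local SQL Server Express install.
            const string connString =
                @"Server=.\SQLEXPRESS;Database=SeedPress;Integrated Security=true;";

            using (var conn = new SqlConnection(connString))
            {
                conn.Open();
                using (var cmd = new SqlCommand("SELECT COUNT(*) FROM Sales", conn))
                {
                    int saleCount = (int)cmd.ExecuteScalar();
                    Console.WriteLine("Sales recorded: " + saleCount);
                }
            }
        }
    }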

Database on a server without installation?

Right now I have a customer who is working with several businesses. He works with their data but is not allowed to access their databases directly. We thought of using SQLite or SQL CE and storing a copy/part of the original database as a file on a network share. Now the problem is that SQL CE does not support this, and SQLite strongly recommends against it.
First of all, performance is a huge problem, since our customer is working with a lot of data (up to several GB). The second problem is that SQLite has problems with concurrent use of the database when it is stored on a network share (actually, the underlying OS functionality for file locking is the problem). I did a lot of research on the topic, and many people say it is just a matter of time before the database gets corrupted.
Does anyone know a better solution to this problem, or a workaround that lets me use SQLite? It does not need to be a file-based database, as long as nothing needs to be installed or run on the server.
Thanks, David.
If you are going to store data on a network share and have concurrent users accessing it, you are going to need a DB that can handle concurrent access. MS Access will quickly die under concurrent access, as will SQLite.
SQL Server Express is free and works very well. PostgreSQL, as suggested by Maxim, is an open-source, full-featured DB that will do the job very well, but may be overkill.
You could also look at Redis: a fast, lightweight, in-memory NoSQL DB that can also persist to a file.
You can try PostgreSQL. It is very easy to configure and rather reliable. It also supports server export/import options.
Any of this only makes sense, of course, if your client is able to get his hands on an exported copy of the database somehow.

Any ORMs that work with MS-Access (for prototyping)?

I'm in the early stages of a project, and it's not clear yet whether we'll need a "real" database (i.e. SQL Server et al). So I've been doing some prototyping using MS-Access, which is working fine so far. (developing in C#/VS2008/.Net 3.5/MS-Access 2000).
However, the object-relational impedance mismatch is already becoming annoying, and will only get worse as the project evolves.
I have not been able to find an ORM that will work with MS-Access. Any suggestions?
Edit - Follow Up
We ended up using Fluent NHibernate, mainly because it Automaps our object model to a relational database, which has been a huge win for us. Most of the FNH code samples we found used SQLite, and this worked so well that we intend to use it for our production database. (The app is a desktop scientific data collection and analysis package).
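For reference, a rough sketch of what such a Fluent NHibernate automapping setup can look like with SQLite; the entity type and file name below are placeholders, not the actual project code:

    using FluentNHibernate.Automapping;
    using FluentNHibernate.Cfg;
    using FluentNHibernate.Cfg.Db;
    using NHibernate;

    // Automapped entities just need an Id and virtual members.
    public class SampleMeasurement
    {
        public virtual int Id { get; protected set; }
        public virtual string Name { get; set; }
        public virtual double Value { get; set; }
    }

    public static class SessionFactoryBuilder
    {
        public static ISessionFactory Build()
        {
            return Fluently.Configure()
                // SQLite file as the backing store, as in the follow-up above.
                .Database(SQLiteConfiguration.Standard.UsingFile("data.db"))
                // Automap entities instead of writing mappings by hand.
                .Mappings(m => m.AutoMappings.Add(
                    AutoMap.AssemblyOf<SampleMeasurement>()
                           .Where(t => t == typeof(SampleMeasurement))))
                .BuildSessionFactory();
        }
    }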
MS Access files can be set up as an ODBC source on Windows machines. Almost any ORM will allow you to use ODBC. Here is a quick tutorial on how to set that up; it's outlined for Win2k, but the process is the same for XP and later. You also need to have MDAC installed on your box.
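A minimal sketch of hitting the .mdb through such an ODBC source from C# (the DSN name and table are made up):

    using System;
    using System.Data.Odbc;

    class OdbcDemo
    {
        static void Main()
        {
            // Assumes a System DSN named "PrototypeDb" that points at the .mdb file,
            // created through the ODBC Data Source Administrator as described above.
            using (var conn = new OdbcConnection("DSN=PrototypeDb"))
            {
                conn.Open();
                using (var cmd = new OdbcCommand("SELECT COUNT(*) FROM Samples", conn))
                {
                    Console.WriteLine("Rows: " + cmd.ExecuteScalar());
                }
            }
        }
    }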
NHibernate seems to have native support for MS Access as well (see here), though I've never used it. It also has an ODBC driver. Many other ORMs support ODBC as well.
And again, as others are saying: MS Access does not scale, period. Installing a real database server is fairly easy, so I'd recommend SQL Server Express as others have, or even MySQL or PostgreSQL, whichever is easier to set up.
If this is an application that you intend to deploy to clients, with each client having their own unique database, I would recommend another solution entirely: SQLite. SQLite gives you database power on an app-by-app basis. If you have a central database server, one of the previously mentioned solutions would be best.
There's only one scenario in which choosing the Access database engine is a good choice: when building a self-contained Access application using Access Forms (though choosing to use Access in the first place is a questionable choice ;)
The database engine that VS2008 plays nicest with is SQL Server and you will have no problem finding an ORM that plays nice with SQL Server.
Can't give you an answer to your question, but instead of Access you might want to consider one of the following options:
SQL Server Express: free and compatible with the full SQL Server.
SQL Server Compact: also free, does not require any deployment/installation, does not support all features (e.g. no stored procedures).
At this stage, if you are unsure whether you need a "real" database or not, I'd skip MS Access and go straight to SQL Server Express. It's free and still allows you to do everything you need to.
Plus, if you later decide you need to scale up, then you can without any pain.
I recommend using something like Microsoft SQL Server or PostgreSQL for prototyping. If you don't want to learn specific SQL syntax and install special tools for designing the database schema, you can use an ORM that automatically generates the database schema from your persistent class declarations. In any case, this approach is very effective for prototyping.
LLBLGen works with Access
Access is just a bad, bad idea. I believe MS only includes Access in Office to keep legacy users happy.
Even if you find an ORM that will work with an Access database, with few exceptions you're locking yourself into a niche tool that likely will not work out of the box with a real database engine. If you decide to switch to a real database engine later on, you'll not only have to deal with migrating the database, but also with switching to a different ORM.
See this comparison between SQL Server Express and SQL Server Compact. The comparison document also mentions some problems with other data stores, including Access.
If you are REALLY concerned about being able to install SQL Server Express, consider SQL Server Compact:
It can be linked into your redistributable app, so there is no need to install a service (which may require admin rights during installation of your application); everything is taken care of when you install your app. This makes the most sense if you need the data to reside on the user's machine instead of a server, and is most analogous to using Access (see the sketch after this list).
It's less powerful than Express (doesn't support views, triggers, stored procedures, which I consider a requirement)
Can be scaled up to Express or other SQL Server versions very easily
Suitable for small-footprint installs like tablets, mobile devices, etc.
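A hedged sketch of that embedded, no-install usage, assuming the System.Data.SqlServerCe assembly that ships with SQL Server Compact; the file and table names are placeholders:

    using System.Data.SqlServerCe;
    using System.IO;

    class CompactDemo
    {
        static void Main()
        {
            const string dbFile = "app.sdf";
            const string connString = "Data Source=" + dbFile;

            // On first run, create the database file next to the app -
            // no service to install and no admin rights required.
            if (!File.Exists(dbFile))
            {
                using (var engine = new SqlCeEngine(connString))
                {
                    engine.CreateDatabase();
                }

                using (var conn = new SqlCeConnection(connString))
                {
                    conn.Open();
                    new SqlCeCommand(
                        "CREATE TABLE Notes (Id INT IDENTITY(1,1) PRIMARY KEY, Body NVARCHAR(200))",
                        conn).ExecuteNonQuery();
                }
            }
        }
    }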
Always keep scalability in mind when designing any application. You don't want to wind up having to write a PHP->C++ compiler if/when your app becomes successful just because you picked the wrong tool up front.
While we're at it:
The big issue with Access (or, in this case, the Jet engine, which is the part you'd really be using when integrating an Access database with a .NET app) is that there is no "server" that handles database requests. The engine, hosted in your app, must read and write directly to a file on disk that contains the database. Whenever this happens, the file must be locked to prevent concurrent writes. Dirty reads become more common as the number of users grows, as does the potential for database corruption.
Imagine having every customer at a large restaurant trying to simultaneously enter the kitchen to write down their orders or retrieve their food. Chaos would result. There'd be a lot of broken dishes, the kitchen would be a mess, you'd be lucky to get what you ordered in any sort of edible condition. With one customer, this probably works fine. With 5, eh, maybe. With 20,50,1000? Not so much.
So, the restaurant industry introduced waiters and managers that buffer IO to the kitchen. The database server application does something roughly analogous to this by restricting access to the files on disk. Everyone gets what they want, faster and in a much more reliable way, and the data store is protected.
