What options are available for local state persistence in .Net - c#

I'm looking for good options for persisting the local state of an application (created in .NET / C#).
I've considered rolling my own solution, or using a simple local database such as SQLite, but I thought I'd ask the SO community what options are available.
The application I'm working on exists on a Kiosk, and I will need to maintain local state in case of accidental shutdown and other similar exceptions.
What do you recommend?
Update
Well, after further review I decided to use SQLite with DbLinq. The main alternative contender was SQL Server Compact Edition, but there are several factors I found important which placed SQLite above SQL CE.
Notably:
The API provided by SQL CE (IMO) wasn't as rich as the ones provided by either SQLite.net or DbLinq.
SQL CE runtime performance wasn't as good as SQLite's.
SQLite being open source is a useful factor.
Why didn't I use Properties.Settings or app.config?
For my use case I needed something like an (easily) queryable event log (I need to allow auditing of credit transactions). The primary state value I was tracking here was the available credit on a kiosk. So to restore my state variables at application start-up, I can query the log for the last entry and get the required values from there.
When you only need to store a value and retrieve it, Properties.Settings or app.config are perfectly adequate places to do this.
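Purely as illustration of the "restore from the last log entry" idea above, here is a minimal sketch assuming the System.Data.SQLite provider and a hypothetical TransactionLog table (Id, Timestamp, AvailableCredit):

using System;
using System.Data.SQLite;

public static class KioskState
{
    // Reads the most recent audit-log entry to restore available credit at start-up.
    public static decimal? LoadLastKnownCredit(string dbPath)
    {
        using (var connection = new SQLiteConnection("Data Source=" + dbPath))
        {
            connection.Open();
            using (var command = new SQLiteCommand(
                "SELECT AvailableCredit FROM TransactionLog ORDER BY Id DESC LIMIT 1",
                connection))
            {
                object result = command.ExecuteScalar();
                return result == null ? (decimal?)null : Convert.ToDecimal(result);
            }
        }
    }
}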

I would recommend XmlSerializer or BinaryFormatter; you can look them both up on SO or Google.
Your choice will depend on a few factors, such as the size of the dataset or how quickly it needs to be serialized.
Both are a really quick way to go IF YOU NEED the whole app state saved in one go.
However, if you need to save app state part by part, as changes occur, you'll be better off with some kind of database, http://www.sqlite.org/ for example.
http://www.jonasjohn.de/snippets/csharp/xmlserializer-example.htm
http://www.jpreece.com/csharp/serialization-tutorial/
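To make the "whole state in one go" option concrete, here is a minimal XmlSerializer sketch; the AppState class and its properties are made-up placeholders:

using System.IO;
using System.Xml.Serialization;

public class AppState
{
    public decimal AvailableCredit { get; set; }
    public string LastScreen { get; set; }
}

public static class StatePersistence
{
    // Serializes the entire state object to a single XML file.
    public static void Save(AppState state, string path)
    {
        var serializer = new XmlSerializer(typeof(AppState));
        using (var stream = File.Create(path))
        {
            serializer.Serialize(stream, state);
        }
    }

    // Loads it back, falling back to a fresh state if no file exists yet.
    public static AppState Load(string path)
    {
        if (!File.Exists(path))
        {
            return new AppState();
        }
        var serializer = new XmlSerializer(typeof(AppState));
        using (var stream = File.OpenRead(path))
        {
            return (AppState)serializer.Deserialize(stream);
        }
    }
}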

SQLite is definitely a good option; it's lighter than a full-blown database, and it's ACID compliant in case something goes terribly wrong.
The only requirement here is that you have just one concurrent user of the database.

Can you use the app.config or Properties.Settings?

You can use Settings in VS. For your case, you need user-scoped settings, which store user preferences.
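A minimal sketch of that approach, assuming a user-scoped setting named AvailableCredit has been added in the project's Settings designer (Properties > Settings.settings):

// Read the value at start-up.
decimal credit = Properties.Settings.Default.AvailableCredit;

// Write it back and persist to user.config whenever it changes.
Properties.Settings.Default.AvailableCredit = credit;
Properties.Settings.Default.Save();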

You can use SQL Server CE 4.0 instead of SQLite. It stores data in a single file, and you can use Entity Framework Code First with it.
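As a rough sketch of what that looks like (assuming the EntityFramework and SQL Server Compact 4.0 packages; the KioskEvent entity is hypothetical):

using System;
using System.Data.Entity;
using System.Data.Entity.Infrastructure;

public class KioskEvent
{
    public int Id { get; set; }
    public DateTime OccurredAt { get; set; }
    public string Description { get; set; }
}

public class KioskContext : DbContext
{
    public DbSet<KioskEvent> Events { get; set; }
}

public static class Bootstrap
{
    public static void ConfigureSqlCe()
    {
        // Point Code First at SQL Server Compact; the database lives in a
        // single .sdf file alongside the application.
        Database.DefaultConnectionFactory =
            new SqlCeConnectionFactory("System.Data.SqlServerCe.4.0");
    }
}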

Related

.net windows application store data offline and store to db when there is network

I am developing a Windows application for agricultural purposes. This application will be used by multiple users to maintain the data. The main issue is that there won't be network connectivity at the work location. However, by the end of the day they can go and synchronize, if there is an option for that.
I just want to know how we can import and store all the data locally and push it to the database when there is a network connection.
The option that I thought of is to have SQL on every machine that runs this application and store the data in the local database when there is no network,
with a separate button to export the local data to the centralized database when there is a network.
This looks complicated. Is there any better and easier option?
I prefer using C# and Visual Studio.
Thanks.
You can use SQLite for storing data locally. It's fast, lightweight, and public domain.
You can use whatever database you choose for the centralized server.
Well, this is quite a broad question, as it has many options and scenarios. The questions you should ask yourself are:
Does a user handle only new information, or also information from other users since the previous sync?
Do you have to handle update conflicts?
Do you handle text information only, or do you have complex types and binary files?
As for the solution, the easiest way, from my point of view, would be using SQLite on the portable devices; it is a lightweight SQL engine that will let you handle the information easily. On the server you can use whatever you want: SQL Server, MySQL or any other SQL flavor you like. Just make sure there is a connector for your portable device's OS.
If you are still thinking of using SQL Server on the portable device (it's a battery hog!), you might want to check out the Microsoft Sync Framework, as it covers almost all the common scenarios for handling data syncing, managing conflicts, etc.
Thanks for the answers. Please find below the solution that we implemented.
1) Installed SQL express on all the local machines
2) Used Microsoft Sync framework to sync the data. The sync is configured on demand.
Issues faced:
1) We were using the geometry datatype on a few tables, and this was not supported by the Sync Framework.
2) Any change in the database schema will not be reflected on the client machine. We had to delete all the system-generated procedures used to track table changes and regenerate them. I am sure there is a much better way to do this.
Cheers,
Jebli

Best approach to incremently update application data

I have been working on an application for a couple of years that I update using a back-end database. The whole key is that everything is cached on the client, so that it never requires a network connection to operate, but when it does have a connection it will always pick up the latest updates. Every application update is shipped with the latest version of the database, and I want it to download only the minimum amount of data when the database has been updated.
I currently use a table with a timestamp to check for updates. It looks something like this:
ID - Name - Description- Severity - LastUpdated
0 - test.exe - KnownVirus - Critical - 2009-09-11 13:38
1 - test2.exe - Firewall - None - 2009-09-12 14:38
This approach was fine for what I previously needed, but I am looking to expand more functions of the application to use this type of dynamic approach. All the data is currently stored as XML, but I do not want to store complete XML files in the database; I only want to transmit changed data.
So how would you go about allowing a fairly simple approach to storing dynamic content (text/xml/json/xaml) in a database, and having the client only download new updates? I was thinking of having logic that can handle XML inserted directly:
ID - Data - Revision
15 - XXX - 15
XXX would be something like <Content><File>Test.dll</File><Description>New DLL to load.</Description></Content> and would be inserted into the cache, but this would obviously be complicated, as I would need to load them in sequence.
Another approach that has been mentioned was to base it on something similar to Source Control, storing the version in the root of the file and calculating the delta to figure out the minimal amount of data that need to be sent to the client.
Anyone got any suggestions on how to approach this with no risk of data corruption? I would also like to add features that allow me to revert possibly bad revisions and replace them with new working ones.
It really depends on the tools you are using and the architecture you already have. Is there already a server with some logic and a data access layer?
Dynamic approaches might get complicated, slow and limit the number of solutions. Why do you need a dynamic structure? Would it be feasible to just add data by using a name-value pair approach in a relational database? Static and uniform data structures are much easier to handle.
Before going into detail, you should consider the different scenarios.
Items can be added
Items can be changed
Items can be removed (I assume)
Adding is not a big problem. The client needs to remember the last revision number it got from the server, and you write a query which gets everything since then.
Changing is basically the same. You should take care with the identification of items: you need an unchangeable surrogate key, which seems to be the ID you already have. (Guids may be useful here.)
Removing is tricky. You need to either flag items as deleted instead of actually removing them, or keep a list of removed IDs along with the revision number at which they were removed.
Storing the data in the client: Consider using a relational database like SQLite in the client. (It doesn't need installation, it is just storing in a file. Firefox for instance stores quite a lot in SQLite databases.) When using the same in the server, you can probably reuse some code. It is also transaction based, which helps to keep it consistent (rollback in case of error during synchronization).
XML - if you really need it - can be stored just as a string in the database.
When using an abstraction layer or ORM that supports SQLite (eg. NHibernate), you may also reuse some code even when there is another database used by the server. Note that the learning curve for such an ORM might be rather steep. If you don't know anything like this, it could be too much.
You don't need to force reuse of code in the client and server.
Synchronization itself shouldn't be very complicated. You have a revision number in the client and a last revision on the server. You get all new/changed and deleted items since then in the client and apply them to the local store. Update the local revision number. Commit. Done.
I would never update only part of a revision, because then you can't really know what changed since the last synchronization. Because you do differential updates, it is essential to have a well-defined state on the client.
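A minimal sketch of the differential pull described above, assuming a local SQLite cache (System.Data.SQLite) and hypothetical Items / DeletedItems / SyncState tables plus a hypothetical server gateway:

using System;
using System.Collections.Generic;
using System.Data.SQLite;

public class Item { public Guid Id; public string Data; public long Revision; }

// Hypothetical abstraction over the server-side queries, just to keep the sketch small.
public interface IServerGateway
{
    IEnumerable<Item> GetItemsSince(long revision);
    IEnumerable<Guid> GetDeletedIdsSince(long revision);
    long GetCurrentRevision();
}

public static class Synchronizer
{
    public static void Pull(SQLiteConnection local, IServerGateway server, long lastRevision)
    {
        using (var tx = local.BeginTransaction())
        {
            // 1. Upsert every item added or changed since the last sync.
            foreach (Item item in server.GetItemsSince(lastRevision))
            {
                using (var cmd = new SQLiteCommand(
                    "INSERT OR REPLACE INTO Items (Id, Data, Revision) VALUES (@id, @data, @rev)",
                    local, tx))
                {
                    cmd.Parameters.AddWithValue("@id", item.Id.ToString());
                    cmd.Parameters.AddWithValue("@data", item.Data);
                    cmd.Parameters.AddWithValue("@rev", item.Revision);
                    cmd.ExecuteNonQuery();
                }
            }

            // 2. Apply tombstones for items deleted since the last sync.
            foreach (Guid id in server.GetDeletedIdsSince(lastRevision))
            {
                using (var cmd = new SQLiteCommand("DELETE FROM Items WHERE Id = @id", local, tx))
                {
                    cmd.Parameters.AddWithValue("@id", id.ToString());
                    cmd.ExecuteNonQuery();
                }
            }

            // 3. Remember the new high-water mark, then commit the whole batch so an
            //    interrupted sync rolls back to the previous consistent state.
            using (var cmd = new SQLiteCommand("UPDATE SyncState SET LastRevision = @rev", local, tx))
            {
                cmd.Parameters.AddWithValue("@rev", server.GetCurrentRevision());
                cmd.ExecuteNonQuery();
            }

            tx.Commit();
        }
    }
}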
I would go with a solution using Sync Framework.
Quote from Microsoft:
Microsoft Sync Framework is a comprehensive synchronization platform enabling collaboration and offline access for applications, services and devices. Developers can build synchronization ecosystems that integrate any application, any data from any store using any protocol over any network. Sync Framework features technologies and tools that enable roaming, sharing, and taking data offline.
A key aspect of Sync Framework is the ability to create custom providers. Providers enable any data sources to participate in the Sync Framework synchronization process, allowing peer-to-peer synchronization to occur.
I have just built an application pretty much exactly as you described. I built it on top of the Microsoft Sync Framework that DjSol mentioned.
I use a C# front-end application with a SQL CE database, and a SQL Server 2005 server at the other end.
The following articles were extremely useful for me:
Tutorial: Synchronizing SQL Server and SQL Server Compact
Walkthrough: Creating a Sync service
Step by step N-tier configuration of Sync services for ADO.NET 2.0
How to Sync schema changed database using sync framework?
You don't say what your back-end database is, but if it's SQL Server you can use SQL CE (SQL Server Compact Edition) as the client DB and then use RDA or merge replication to update the client DB as desired. This will handle all your requirements for sure; there is no need to reinvent the wheel for such a common requirement.

How to persist / save program information

I wrote a reminder program that runs automatically on startup. I want to know if there is a way, other than SQL Server, to store event, date, and time data. I do not want to use SQL Server for this job, because I think SQL Server is far too big for such a simple task. I think that I can use a file to store the data. What do you think about this?
Some common ways to store information:
As a file. You have many options for where to store the file, for instance the user directory or the program directory. Further explanation here and here. I prefer using a serializer (XML or JSON).
As a registry entry. You store your information as key-value pairs (see the sketch further below).
In a light-weight database:
RavenDB: it's document-oriented and stores data in JSON format
SQLite: relational; I recommend this SQLite Admin tool for management purposes
Registry entries are safer with regard to user actions; files, on the other hand, can easily be deleted.
You always have the option to encrypt your information.
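For the key-value registry option, a small sketch using Microsoft.Win32 (the subkey name is made up):

using Microsoft.Win32;

public static class RegistryStore
{
    const string SubKey = @"Software\MyReminderApp";

    // Writes a single named value under HKEY_CURRENT_USER.
    public static void Save(string name, string value)
    {
        using (RegistryKey key = Registry.CurrentUser.CreateSubKey(SubKey))
        {
            key.SetValue(name, value);
        }
    }

    // Reads it back, returning a fallback when the key or value is missing.
    public static string Load(string name, string fallback)
    {
        using (RegistryKey key = Registry.CurrentUser.OpenSubKey(SubKey))
        {
            return key == null ? fallback : (string)key.GetValue(name, fallback);
        }
    }
}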
As a side note, you can also use PostSharp to declare variables to be stored in your registry. The code becomes something like this:
[RegistryBacking]
private bool _boolean;
I can provide code later if you need it... when I'm home again.
For the part about where to persist:
From this document (Managing User Data Deployment Guide, download):
Windows uses the Local and LocalLow folders for application data
that does not roam with the user. Usually this data is either machine
specific or too large to roam.
Windows uses the Roaming folder for application specific data, such
as custom dictionaries, which are machine independent and should roam
with the user profile.
So, I suggest using AppData\Roaming and persisting to a file, since I consider a 'reminder app' to be user specific. Domain users, for example, would consider that valuable (it syncs to the server).
Local and LocalLow (the latter is used for low integrity mode, for applications with reduced privileges) would be more appropriate for some machine/installation specific data which can be calculated on-the-fly.
Registry seems great for some low amount of keys, but doesn't seem to be the best option for such use.
There is another option - IsolatedStorage, which should be used when the options mentioned above are not applicable, for example when using ClickOnce deployments.
For the part about how to persist your data to a file ... well, pick your favorite. You could use a SQLite database, which is really lightweight, if you want more control and power, or just use XML serialization to a file if you consider SQLite overkill. Or any of the other viable options.
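Resolving the Roaming folder is a one-liner; here is a small sketch (the application folder and file names are made up for illustration):

using System;
using System.IO;

public static class DataPaths
{
    public static string ReminderFile()
    {
        // SpecialFolder.ApplicationData maps to AppData\Roaming for the current user
        // (use SpecialFolder.LocalApplicationData for AppData\Local instead).
        string dir = Path.Combine(
            Environment.GetFolderPath(Environment.SpecialFolder.ApplicationData),
            "MyReminderApp");
        Directory.CreateDirectory(dir);
        return Path.Combine(dir, "reminders.xml");
    }
}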
XML. .NET has classes that make handling XML files easy. If you're saving structured data, then XML might be your best bet.
For very similar reasons I have tried some easy-to-deploy databases that still let me use the knowledge I already have.
VistaDB 3.x and 4 are my first choice because they are very much SQL Server compatible and allow me to switch to SQL Server anytime I like. VistaDB supports EF too!
Next is db4o by Versant, which is very handy. I use it mostly for quick prototyping, but I have deployed it in several small solutions, and it is perfect for your kind of application.
I hope that helps!

On-Disk database storage, best practices

If this question seems common to you, I apologise; I did a quick search around this site and a few Google searches and could not find a satisfying answer.
My question is this:
I have only been a software developer for 3-4 years now. This may seem like long enough to answer this question myself, however in all my time I have never had to develop software where the main body of data storage was not required to be in an online database. This time, however, my latest development requires its data to be stored only to disk.
The actual data itself is lightweight. In code, the main asset will be a class with only a few string-based properties on it which must be persisted. My initial thoughts are on simple serialisation: on application close, new assets are simply serialised and stored on disk as a file. I also thought that maybe for backup purposes (or if it is somehow a better option than a serialised class) an XML file would be appropriate.
I cannot think of any distinct disadvantages of either of these approaches, and it is this fact which causes me to ask this question publicly. In my experience, there is rarely a solution to a problem which does not have its downsides.
Serialization (binary or XML) is appropriate for a small amount of data. The problem with this approach is when you get large amounts of data (that you may need to query).
If you are on a Windows platform and in need of a proper database, you can use the embedded database engine that comes with Windows - ESENT. It is the backing store of Exchange and RavenDB.
Here are the .NET wrapper libraries for it.
ManagedEsent provides managed access to ESENT, the embeddable database engine native to Windows. ManagedEsent uses the esent.dll that is part of Microsoft Windows so there are no extra unmanaged binaries to download and install.
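The simplest way into ESENT from managed code is ManagedEsent's PersistentDictionary; a small sketch (the directory name and keys are arbitrary):

using Microsoft.Isam.Esent.Collections.Generic;

public static class EsentExample
{
    public static void Run()
    {
        // The dictionary persists itself into the given directory via ESENT.
        using (var store = new PersistentDictionary<string, string>("StateStore"))
        {
            store["lastUser"] = "kiosk-01";

            string value;
            if (store.TryGetValue("lastUser", out value))
            {
                System.Console.WriteLine(value);
            }
        }
    }
}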
The most lightweight solution is of course to use XML and serialization. The main advantage of that is that it is very easy, requires little code, and the result is easily editable using a text editor. The other advantage is being able to have multiple files, which are easy to transfer from PC to PC.
Here is a nice tutorial on XML serialization.
However, if your application is going to be reading, writing, and changing the data a lot, and there is only one source of data, it would be better to use a light-weight database. Many people like SQLite, while I personally prefer Firebird.
See this question for using SQLite with C#, and see here for information for using Firebird with .net.
Another embedded database option is SQL Server Compact Edition. The latest version is v4, and it seems to be much improved over previous versions.
It's functionally equivalent to using an XML file, an Access database, or even a plain old text file, in that you don't need to have a SQL Server service running or install anything special on the machine your application runs on.
I've been using SQLite in a project and it works very well, and it's easy to use too. One thing to keep in mind when using SQLite, though, is that it's designed for a single-user environment, so if you use it as the database backend of a website, for instance, you're likely to find that it'll struggle under the slightest load.
Check out this link for the C# wrapper:
http://sqlite.phxsoftware.com/
I also use NHibernate and NHibernate.Linq to interact with the data; you can get compatible builds of both here: http://www.dennisdoomen.net/2009/07/nhibernate-210-ga-with-linq-and-fluent.html
NHibernate.Linq allows you to use that nice LINQ query syntax against your SQLite DB:
var onePiece = from s in session.Linq<Series>() where s.Name == "One Piece" select s; // Series stands in for the mapped entity type

Any ORMs that work with MS-Access (for prototyping)?

I'm in the early stages of a project, and it's not clear yet whether we'll need a "real" database (i.e. SQL Server et al). So I've been doing some prototyping using MS Access, which is working fine so far (developing in C#/VS2008/.NET 3.5/MS Access 2000).
However, the object-relational impedance mismatch is already becoming annoying, and will only get worse as the project evolves.
I have not been able to find an ORM that will work with MS-Access. Any suggestions?
Edit - Follow Up
We ended up using Fluent NHibernate, mainly because it automaps our object model to a relational database, which has been a huge win for us. Most of the FNH code samples we found used SQLite, and this worked so well that we intend to use it for our production database. (The app is a desktop scientific data collection and analysis package.)
MS Access files can be set up as an ODBC source on Windows machines. Almost any ORM will allow you to use ODBC. Here is a quick tutorial on how to set that up; it's outlined for Win2k, but the process is the same for XP and later. You also need to have MDAC installed on your box.
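Once the driver or DSN is in place, talking to the .mdb from C# is plain ADO.NET over ODBC; a small sketch (the file path and Customers table are made up):

using System;
using System.Data.Odbc;

public static class AccessOdbc
{
    public static int CountCustomers()
    {
        const string connectionString =
            @"Driver={Microsoft Access Driver (*.mdb)};Dbq=C:\data\prototype.mdb;";
        using (var connection = new OdbcConnection(connectionString))
        using (var command = new OdbcCommand("SELECT COUNT(*) FROM Customers", connection))
        {
            connection.Open();
            return Convert.ToInt32(command.ExecuteScalar());
        }
    }
}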
NHibernate seems to have native support for MS Access as well, see here. I've never used it though. It also has an ODBC driver. Many others support ODBC as well.
And again, as others are saying, MS Access does not scale... period. Installing a real database server is fairly easy, so I'd recommend SQL Server Express as others have, or even MySQL or Postgres, whatever is easier to set up.
If this is an application that you intend to deploy to clients, with each client having their own unique database, I would recommend another solution entirely, SQLite. SQLite gives you database power on an app by app basis. If you have a central database server, one of the previously mentioned solutions would be best.
There's only one scenario in which choosing the Access database engine is a good choice: when building a self-contained Access application using Access Forms (though choosing to use Access in the first place is a questionable choice ;))
The database engine that VS2008 plays nicest with is SQL Server and you will have no problem finding an ORM that plays nice with SQL Server.
Can't give you an answer to your question, but instead of Access you might want to consider one of the following options:
SQL Server Express: is free and compatible with the full SQL Server
SQL Server Compact: also free, does not require any deployment/installation, and does not support all features (e.g. no stored procedures).
At this stage, if you are unsure whether you need a "real" database or not, I'd skip MS Access and go straight to sql server express. It's free and still allows you to do everything you need to.
Plus, if you later decide you need to scale up, then you can without any pain.
I recommend using something like Microsoft SQL Server or PostgreSQL for prototyping. If you don't want to learn a specific SQL syntax and install special tools for designing the database schema, you can use an ORM that automatically generates the database schema from your persistent class declarations. Either way, this approach is very effective for prototyping.
LLBLGen works with Access
Access is just a bad, bad idea. I believe MS only includes Access in Office to keep legacy users happy.
Even if you find an ORM that will work with an Access database, with few exceptions you're locking yourself into a niche tool that likely will not work out-of-the box with a real database engine. If you decide to switch to a real database engine later on, you'll not only have to deal with migrating the database, but switching to a different ORM.
See this comparison between SQL Server Express and SQL Server Compact. The comparison document also mentions some problems with other data stores, including Access.
If you are REALLY concerned about being able to install SQL Server Express, consider SQL Server Compact:
it can be linked into your redistributable app. No need to install a service (which may require admin rights during install of your application); everything is taken care of when you install your app. This makes the most sense if you need the data to reside on the user's machine instead of a server, and is most analogous to using Access.
It's less powerful than Express (doesn't support views, triggers, stored procedures, which I consider a requirement)
Can be scaled up to Express or other SQL Server versions very easily
Suitable for small-footprint installs like tablets, mobile devices, etc.
Always keep scalability in mind when designing any application. You don't want to wind up having to write a PHP->C++ compiler if/when your app becomes successful just because you picked the wrong tool up front.
While we're at it:
The big issue with Access (or, in this case, the Jet engine, which is the part you'd really be using when integrating an Access database with a .NET app) is that there is no "server" that handles database requests. The engine, hosted in your app, must read and write directly to a file on disk that contains the database. Whenever this happens, the file must be locked to prevent concurrent writes. Dirty reads become more common as the number of users grows, as does the potential for database corruption.
Imagine having every customer at a large restaurant trying to simultaneously enter the kitchen to write down their orders or retrieve their food. Chaos would result. There'd be a lot of broken dishes, the kitchen would be a mess, you'd be lucky to get what you ordered in any sort of edible condition. With one customer, this probably works fine. With 5, eh, maybe. With 20,50,1000? Not so much.
So, the restaurant industry introduced waiters and managers that buffer IO to the kitchen. The database server application does something roughly analogous to this by restricting access to the files on disk. Everyone gets what they want, faster and in a much more reliable way, and the data store is protected.
