I am currently in the process of developing a program and am not sure where to go from here...
I am using Visual C# and the DotSpatial framework to handle the GIS/GPS side of things, but I am unsure of what back-end database to use.
I have had a look at PostgreSQL with PostGIS and also at MS SQL Server, as it now has geospatial capabilities.
So what I am trying to achieve is the following with the software:
- The software needs to be used both at the person's desk and remotely, using the GIS/GPS side of the system to track the user's travel (i.e. when locating where they need to go; this is custom data on remote sites). This is relatively easy to do with DotSpatial alone, and no DB is needed.
- They have custom forms that capture data (text, lats/longs, photos) while out on site.
- The data needs to be able to sync up with the main database when they are back in the office
- This data needs to be viewable by everyone connected to the system once the system is updated
Ultimately if this can be a type of DMS then that would be great. So I am keeping that in mind as well.
Should I use a separate DB for the data-capture side of things and something else for the main DB, or should I use the same for both? Which one is easiest to configure? I would prefer that when deploying the software the installation goes smoothly and I don't have to manually configure each machine.
The main server is Windows 2008 Server btw.
Any help or suggestions would be greatly appreciated.
I use PostgreSQL with PostGIS on a daily basis. Although it is open source, it provides very good functionality and performance.
Check out this cross-comparison between SQL Server 2008 Spatial, PostgreSQL/PostGIS 1.3-1.4, and MySQL 5-6. It should give you a good idea of how they stack up.
I second the recommendation for PostgreSQL/PostGIS. It works very well and is well supported by the community. I would note that OpenStreetMap uses PostGIS as well. Indeed, if you ever want to work with their data you'll be wanting PostgreSQL.
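Since the front end is C#, a spatial query against PostGIS can go through the Npgsql ADO.NET provider. Below is a minimal sketch, not the poster's actual setup: the connection string, the site_points table, and its columns are placeholder assumptions, and ST_DWithin does the proximity filtering on the server.

    // Hedged sketch: querying PostGIS from C# via Npgsql. Table/column names
    // and the connection string are placeholders, not part of the original post.
    using System;
    using Npgsql;

    class PostGisQueryExample
    {
        static void Main()
        {
            var connString = "Host=localhost;Database=gisdb;Username=gis;Password=secret";
            using (var conn = new NpgsqlConnection(connString))
            {
                conn.Open();

                // Find all sites within 5 km of a given lon/lat (SRID 4326).
                var sql = @"SELECT name, ST_AsText(geom)
                            FROM site_points
                            WHERE ST_DWithin(geom::geography,
                                             ST_SetSRID(ST_MakePoint(@lon, @lat), 4326)::geography,
                                             5000)";
                using (var cmd = new NpgsqlCommand(sql, conn))
                {
                    cmd.Parameters.AddWithValue("@lon", 18.42);
                    cmd.Parameters.AddWithValue("@lat", -33.92);
                    using (var reader = cmd.ExecuteReader())
                    {
                        while (reader.Read())
                            Console.WriteLine("{0}: {1}", reader.GetString(0), reader.GetString(1));
                    }
                }
            }
        }
    }

The same query runs unchanged from a desk machine or a laptop; only the connection string differs.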
I am developing a Windows application for agricultural purposes. This application will be used by multiple users to maintain the data. The main issue is that there won't be network connectivity at the work location. However, by the end of the day the users can come back and synchronize, if there is an option to do so.
I just want to know how we can store all the data locally and push it to the central database when there is a network connection.
The option I thought of is to have SQL Server on every machine that runs this application and store the data in a local database when there is no network.
There would be a separate button to export the local data to the centralized database when there is a network connection.
This looks complicated. Is there any better and easier option?
I prefer using C# and Visual Studio.
Thanks.
You can use SQLite for storing data locally. It's fast, lightweight, and public domain.
You can use whatever database you prefer for the centralized server.
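As a rough illustration of the local-storage half (not from the original answer), here is what capturing records into a SQLite file from C# might look like with the System.Data.SQLite provider; the table layout and file name are assumptions, and the synced flag is just one simple way to mark rows that still need to be pushed to the central database.

    // Hedged sketch: local capture into SQLite with System.Data.SQLite.
    // Table/column names and the file name are placeholders.
    using System;
    using System.Data.SQLite;

    class LocalCaptureExample
    {
        static void Main()
        {
            using (var conn = new SQLiteConnection("Data Source=fielddata.db"))
            {
                conn.Open();

                // A 'synced' flag makes it easy to find rows that still have to be
                // exported to the central database once a connection is available.
                var ddl = @"CREATE TABLE IF NOT EXISTS capture (
                                id INTEGER PRIMARY KEY AUTOINCREMENT,
                                collected_on TEXT NOT NULL,
                                notes TEXT,
                                synced INTEGER NOT NULL DEFAULT 0)";
                using (var cmd = new SQLiteCommand(ddl, conn))
                    cmd.ExecuteNonQuery();

                var insert = "INSERT INTO capture (collected_on, notes) VALUES (@when, @notes)";
                using (var cmd = new SQLiteCommand(insert, conn))
                {
                    cmd.Parameters.AddWithValue("@when", DateTime.UtcNow.ToString("o"));
                    cmd.Parameters.AddWithValue("@notes", "Soil sample taken at plot 12");
                    cmd.ExecuteNonQuery();
                }
            }
        }
    }

The export button would then simply select the rows where synced = 0, write them to the server, and flip the flag.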
Well, this is quite a broad question, as it has many options and scenarios. The questions you should ask yourself are:
Does each user handle only their own new information, or also information from other users since the previous sync?
Do you have to handle update conflicts?
Do you handle text information only, or do you have complex types and binary files?
As for the solution, the easiest way, from my point of view, would be to use SQLite on the portable devices; it is a lightweight embedded SQL engine that will let you handle the information easily. On the server you can use whatever you want: SQL Server, MySQL, or any other SQL flavor you like. Just make sure there is a connector for your portable device's OS.
If you are still thinking of using SQL Server on the portable device (it's a battery hog!), you might want to check out the Microsoft Sync Framework, as it covers almost all possible scenarios for handling data syncing, managing conflicts, etc.
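To give a feel for what an on-demand sync pass looks like with the Sync Framework's database providers, here is a hedged sketch. It assumes a SQL Express database on the laptop and SQL Server at the office, and that both have already been provisioned for a sync scope; the scope name and connection strings are placeholders.

    // Hedged sketch: one on-demand sync pass with the Microsoft Sync Framework
    // database providers. Scope name and connection strings are placeholders,
    // and both databases are assumed to be provisioned for the scope already.
    using System;
    using System.Data.SqlClient;
    using Microsoft.Synchronization;
    using Microsoft.Synchronization.Data.SqlServer;

    class SyncOnDemand
    {
        static void Run()
        {
            using (var localConn  = new SqlConnection(@"Data Source=.\SQLEXPRESS;Initial Catalog=FieldLocal;Integrated Security=True"))
            using (var serverConn = new SqlConnection(@"Data Source=HQSERVER;Initial Catalog=FieldCentral;Integrated Security=True"))
            {
                var orchestrator = new SyncOrchestrator
                {
                    LocalProvider  = new SqlSyncProvider("FieldScope", localConn),
                    RemoteProvider = new SqlSyncProvider("FieldScope", serverConn),
                    Direction      = SyncDirectionOrder.UploadAndDownload
                };

                var stats = orchestrator.Synchronize();
                Console.WriteLine("Uploaded {0} changes, downloaded {1} changes",
                    stats.UploadChangesTotal, stats.DownloadChangesTotal);
            }
        }
    }

Wire something like this to a "sync now" button and the framework handles change enumeration and conflict detection for you.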
Thanks for the answers. Please find below the solution that we implemented.
1) Installed SQL Express on all the local machines
2) Used the Microsoft Sync Framework to sync the data. The sync is configured on demand.
Issues faced:
1) We were using the geometry datatype on a few tables, and this was not supported by the Sync Framework.
2) Any change in the database schema is not reflected on the client machines. We had to delete all the system-generated procedures used to track the table changes and regenerate them. I am sure there is a much better way to do this.
Cheers,
Jebli
I have worked with Microsoft Access as the back end of my applications in the past, and Visual Studio offers me the choice of copying the database to the installation of the executable that I have created. However, I now want to move on to more complex databases, and I figured MySQL was a good start because it's free and popular. I know there may be better options, but I'm only in the learning stages right now, so I strictly want to stick with MySQL.
My problem is that I have my MySQL running on my localhost. I have connected to it, ran queries, etc. Now if I wanted to deploy this application to other computers while keeping the database (not web-based) how would I go about doing that? The reason I don't want to go web yet is because I just want to get an understanding without dealing with networking yet. I figured this would be the way to go.
Thank you.
MySQL is not primarily a web (or any other particular kind of) database. It's a full database server, as are Oracle, SQL Server, Postgres, etc., and it can be used for any application you feel it applies to.
In your case what you really want is SQLite for "embedded" database needs. The database is represented by a single file that can be opened and queried very similarly to MySQL.
http://www.sqlite.org/
To access the database from your C# code there are many libraries available to you. Here is one I used a while back as an example:
http://www.devart.com/dotconnect/sqlite/features.html
To play around with the data, as you would with MySQL Workbench, there are many front-ends. As an example, there's a pretty good Firefox add-on for this:
https://addons.mozilla.org/en-US/firefox/addon/sqlite-manager/
And don't worry, it's extremely easy to use and most of the query syntax will apply to MySQL as well!
I will be building an in-house Occasionally Connected App (OCA). What technologies would you suggest I employ?
Here are my parameters:
.NET shop (3.5 SP1)
C# for code-behind (WinForms, WPF, Silverlight)
SQL Server back end (2005, or possibly 2008 pending approval)
Solo developer
Solo SQL administrator
Low-tech end users
Low bandwidth to 5 branch offices
This is a LOB app but not a POS.
The majority of users have laptops that they take to Members' homes.
The data for this app is stored in 5 separate databases, though in one SQL instance.
I am looking for specific recommendations on which path to choose. Merge Replication or Sync Framework database synchronization providers? SQL Express or SQL CE at the Subscriber? Can I use LINQ to SQL for the DAL?
Is a Silverlight "offline/out-of-browser" app (example here) feasible?
This is my first LARGE business application so any experienced comments are welcome.
As requested, here is some additional info on the type of data. My users are nurses and social workers who go to Members' homes and create "Plans" or "Health Assessment Reviews" for them. These are things like a medication list, a list of their current providers, steps to achieve the Member's goals, or a list of their current/past diagnoses. Things like that.
Also the typical Member's name, address, phone number, etc. Mostly this is a data storage and retrieval app that facilitates reporting. Very little "processing" takes place, and nurses and social workers work in teams that are assigned Members, so I usually have very little crossover or potential for data conflicts. Nurses and SWs are also responsible for different areas of the MCP (Member-Centered Plan).
Additional question: is the Sync Framework really only a viable option if I can use SQL 2008? It seems that way due to the Change Tracking, etc. Thoughts?
Once you solve the problem of change detection and data movement, everything else is trivial. In other words, technologies like WPF, Silverlight, Forms, and even WCF are orthogonal to your main problem, and your choice should be based on your personal preferences and experience. The real hard nut to crack is working disconnected and synchronizing changes. Which leaves two out-of-the-box avenues: the Sync Framework or replication.
I would say, for your scenario, definitely the Sync Framework. Merge replication, like all forms of replication, is designed for systems that are connected continuously with intermittent disconnects. And most critically, replication can work only over static names. Laptops connecting from various hot-spots and ISPs have a nasty habit of changing fully qualified names with each connection. Replication can overcome this only if a VPN of some sort is used, and a VPN is usually a major support issue. Replication is just not designed for the high mobility of OCA systems.
The Sync Framework will pretty much force you to a SQL 2008 back end because of the need for Change Data Capture or Change Tracking, both of which are SQL 2008-only features.
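For reference, turning on SQL Server 2008 Change Tracking is a one-time T-SQL step. The sketch below uses placeholder database and table names and runs the statements through ADO.NET, although a DBA would normally just run them in Management Studio.

    // Hedged sketch: enabling SQL Server 2008 Change Tracking on a database
    // and one table. "FieldCentral" and "dbo.MemberPlan" are placeholders.
    using System.Data.SqlClient;

    class EnableChangeTracking
    {
        static void Run(string connectionString)
        {
            var statements = new[]
            {
                @"ALTER DATABASE FieldCentral
                      SET CHANGE_TRACKING = ON (CHANGE_RETENTION = 2 DAYS, AUTO_CLEANUP = ON)",
                @"ALTER TABLE dbo.MemberPlan
                      ENABLE CHANGE_TRACKING WITH (TRACK_COLUMNS_UPDATED = ON)"
            };

            using (var conn = new SqlConnection(connectionString))
            {
                conn.Open();
                foreach (var sql in statements)
                    using (var cmd = new SqlCommand(sql, conn))
                        cmd.ExecuteNonQuery();
            }
        }
    }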
You will still have plenty of hard problems to solve ahead (authentication, versioning and upgrades, data conflict resolution policies, securing data on the client against accidental media loss, etc.).
Personally, I would say:
.NET 3.5
WCF Data Services (for communication between the client app and your data)
SQL Server 2k5/2k8 (whichever you can use)
Silverlight w/ Out of Browser Functionality
VistaDB (to store data locally on the client until you can push to the server)
Use a uniqueidentifier for the key if you are creating records while offline and not connected, then update the database when you do connect.
This is going to be way easier than using an auto-increment key.
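A minimal illustration of that idea, assuming SQL Server Compact on the laptop (the table and column names below are placeholders, not from the original answer): the key is generated on the client with Guid.NewGuid(), so rows created offline can never collide with rows created elsewhere.

    // Hedged sketch: client-generated GUID keys for offline inserts, using
    // SQL Server Compact. Table/column names are placeholders.
    using System;
    using System.Data.SqlServerCe;

    class OfflineInsert
    {
        static void SaveVisit(SqlCeConnection conn, string memberName)
        {
            // Generated locally; no round trip to the server, no collisions.
            var id = Guid.NewGuid();

            const string sql =
                "INSERT INTO Visit (VisitId, MemberName, CreatedOn) VALUES (@id, @name, @when)";
            using (var cmd = new SqlCeCommand(sql, conn))
            {
                cmd.Parameters.AddWithValue("@id", id);
                cmd.Parameters.AddWithValue("@name", memberName);
                cmd.Parameters.AddWithValue("@when", DateTime.UtcNow);
                cmd.ExecuteNonQuery();
            }
        }
    }

On the server side, NEWSEQUENTIALID() as a column default (mentioned in a later answer) avoids the index fragmentation that purely random GUIDs cause.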
Having worked on an occasionally connected application, I'd encourage you to look into SQL Server CE for the client machines, with Sync Services to handle the connections. Here is a good tutorial.
You could create this stuff from the ground up, it seems.
However, this seems an awful lot like a CRM application, and it wouldn't surprise me if you could find an enterprise software package to do this without starting from scratch and instead modify one of the configurations to meet your business rules.
In a previous life, I was a configuration developer for this thing called Siebel that might be close to what you're looking for. They even have a built-in synchronization tool called Siebel Remote.
It might be a cheaper route to go than rolling your own from scratch.
I wrote an order taking program for wine sales reps. Here is the video. The client software is installed using click-once. That also installs SQL Server Express and loads the database. I used the Microsoft Sync Framework to sync the local database with the one on the server (see the last section of the video.)
With powerful clients these days, I don't see any reason not to use SQL Server Express; it is free, with a limit of 4 GB.
SQL CE had too many limitations - no stored procs being a major one.
You will need to use GUIDs everywhere as the primary key - see the new NewSequentialID().
I love click-once, it is a big time saver.
I'm looking forward to Silverlight, but I just haven't had time to look into it. I'm not sure whether I would have done it with Silverlight if I were doing it now.
Having said all this, this is not a project for anyone inexperienced. So I would also get some very experienced help.
I'm in the early stages of a project, and it's not clear yet whether we'll need a "real" database (i.e. SQL Server et al). So I've been doing some prototyping using MS-Access, which is working fine so far. (developing in C#/VS2008/.Net 3.5/MS-Access 2000).
However, the object-relational impedance mismatch is already becoming annoying, and will only get worse as the project evolves.
I have not been able to find an ORM that will work with MS-Access. Any suggestions?
Edit - Follow Up
We ended up using Fluent NHibernate, mainly because it Automaps our object model to a relational database, which has been a huge win for us. Most of the FNH code samples we found used SQLite, and this worked so well that we intend to use it for our production database. (The app is a desktop scientific data collection and analysis package).
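For anyone curious what that Fluent NHibernate automapping setup looks like, here is a rough sketch rather than the actual project code: the SampleReading entity, the file name, and the namespace filter are placeholder assumptions, and the defaults (an Id property, virtual members) follow FNH's out-of-the-box conventions.

    // Hedged sketch: Fluent NHibernate automapping over SQLite.
    // Entity, file name, and namespace filter are placeholders.
    using FluentNHibernate.Automapping;
    using FluentNHibernate.Cfg;
    using FluentNHibernate.Cfg.Db;
    using NHibernate;
    using NHibernate.Tool.hbm2ddl;

    namespace MyApp.Domain
    {
        public class SampleReading
        {
            public virtual int Id { get; set; }
            public virtual string InstrumentName { get; set; }
            public virtual double Value { get; set; }
        }
    }

    static class SessionFactoryBuilder
    {
        public static ISessionFactory Build()
        {
            return Fluently.Configure()
                .Database(SQLiteConfiguration.Standard.UsingFile("analysis.db"))
                .Mappings(m => m.AutoMappings.Add(
                    // Map only the domain classes, not every type in the assembly.
                    AutoMap.AssemblyOf<MyApp.Domain.SampleReading>()
                           .Where(t => t.Namespace == "MyApp.Domain")))
                // Generate the schema in the SQLite file on startup.
                .ExposeConfiguration(cfg => new SchemaExport(cfg).Create(false, true))
                .BuildSessionFactory();
        }
    }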
MS Access files can be set up as an ODBC source on Windows machines, and almost any ORM will allow you to use ODBC. Here is a quick tutorial on how to set that up; it's outlined for Win2k, but the process is the same for XP and later. You also need to have MDAC installed on your box.
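As a small, hedged example of what the ODBC route looks like from C# (the file path and table name are placeholders), the snippet below uses a DSN-less connection string, so nothing has to be configured by hand in the ODBC control panel.

    // Hedged sketch: reading an .mdb file through ODBC from C#.
    // File path and table name are placeholders.
    using System;
    using System.Data.Odbc;

    class AccessOdbcExample
    {
        static void Main()
        {
            var connString = @"Driver={Microsoft Access Driver (*.mdb)};Dbq=C:\data\prototype.mdb;";
            using (var conn = new OdbcConnection(connString))
            using (var cmd = new OdbcCommand("SELECT TOP 10 * FROM Samples", conn))
            {
                conn.Open();
                using (var reader = cmd.ExecuteReader())
                {
                    while (reader.Read())
                        Console.WriteLine(reader[0]);
                }
            }
        }
    }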
NHibernate seems to have native support for MS Access as well; see here. I've never used it, though. It also has an ODBC driver. Many other ORMs support ODBC as well.
And again, as others are saying: MS Access does not scale, period. Installing a real database server is fairly easy, so I'd recommend SQL Server Express as others have, or even MySQL or Postgres, whichever is easier to set up.
If this is an application that you intend to deploy to clients, with each client having their own unique database, I would recommend another solution entirely, SQLite. SQLite gives you database power on an app by app basis. If you have a central database server, one of the previously mentioned solutions would be best.
There's only one scenario when choosing the Access Database Engine is a good choice: when building a self-contained Access application using Access Forms (though choosing to use Access in the first place is a questionable choice ;)
The database engine that VS2008 plays nicest with is SQL Server and you will have no problem finding an ORM that plays nice with SQL Server.
Can't give you an answer to your question, but instead of Access you might want to consider one of the following options:
SQL Server Express: free and compatible with the full SQL Server
SQL Server Compact: also free, does not require any deployment/installation, does not support all features (e.g. no stored procedures).
At this stage, if you are unsure whether you need a "real" database or not, I'd skip MS Access and go straight to SQL Server Express. It's free and still allows you to do everything you need to.
Plus, if you later decide you need to scale up, then you can without any pain.
I recommend using something like Microsoft SQL Server or PostgreSQL for prototyping. If you don't want to learn specific SQL syntax and install special tools for designing the database schema, you can use an ORM that automatically generates the database schema from your persistent class declarations. In any case, this approach is very effective for prototyping.
LLBLGen works with Access
Access is just a bad, bad idea. I believe MS only includes Access in Office to keep legacy users happy.
Even if you find an ORM that will work with an Access database, with few exceptions you're locking yourself into a niche tool that likely will not work out of the box with a real database engine. If you decide to switch to a real database engine later on, you'll not only have to deal with migrating the database, but also with switching to a different ORM.
See this comparison between SQL Server Express and SQL Server Compact. The comparison document also mentions some problems with other data stores, including Access.
If you are REALLY concerned about being able to install SQL Server Express, consider SQL Server Compact:
It can be linked into your redistributable app, so there is no need to install a service (which may require admin rights during install of your application); everything is taken care of when you install your app. This makes the most sense if you need the data to reside on the user's machine instead of a server, and is most analogous to using Access. A small bootstrap sketch follows this list.
It's less powerful than Express (it doesn't support views, triggers, or stored procedures, which I consider a requirement)
Can be scaled up to Express or other SQL Server versions very easily
Suitable for small-footprint installs like tablets, mobile devices, etc.
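Here is what that in-process bootstrap might look like; a hedged sketch only, with the file name as a placeholder. The database file is created on first run, so the installer never has to touch SQL Server.

    // Hedged sketch: creating and opening a SQL Server Compact database file
    // from application code. The file name is a placeholder.
    using System.Data.SqlServerCe;
    using System.IO;

    class CompactBootstrap
    {
        static SqlCeConnection OpenLocalDatabase()
        {
            const string file = "appdata.sdf";
            var connString = "Data Source=" + file;

            // Create the database file on first run; no service, no admin rights.
            if (!File.Exists(file))
            {
                using (var engine = new SqlCeEngine(connString))
                    engine.CreateDatabase();
            }

            var conn = new SqlCeConnection(connString);
            conn.Open();
            return conn;
        }
    }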
Always keep scalability in mind when designing any application. You don't want to wind up having to write a PHP->C++ compiler if/when your app becomes successful just because you picked the wrong tool up front.
While we're at it:
The big issue with Access (or, in this case, the Jet engine, which is the part you'd really be using when integrating an Access database with a .NET app) is that there is no "server" that handles database requests. The engine, hosted in your app, must read and write directly to a file on disk that contains the database. Whenever this happens, the file must be locked to prevent concurrent writes. Dirty reads become more common as the number of users grows, as does the potential for database corruption.
Imagine having every customer at a large restaurant trying to simultaneously enter the kitchen to write down their orders or retrieve their food. Chaos would result. There'd be a lot of broken dishes, the kitchen would be a mess, you'd be lucky to get what you ordered in any sort of edible condition. With one customer, this probably works fine. With 5, eh, maybe. With 20,50,1000? Not so much.
So, the restaurant industry introduced waiters and managers that buffer IO to the kitchen. The database server application does something roughly analogous to this by restricting access to the files on disk. Everyone gets what they want, faster and in a much more reliable way, and the data store is protected.
I am going to need to create a small windows application that stores basic data about people.
The requirements are:
Recruiters go out into the field and gather demographic data about people they talk to.
Recruiters return 'home' and contribute their gathered data into the main database (SQL Server 2005).
This is the first time I have ever done anything like this. So I am just looking for suggestions. I know I want to use C# to build the windows app and don't have a problem using .net 3.5.
I think a good place to store their data would be in a SQL Server Compact Edition database.
That's as far as I am. I mainly build ASP.net webforms, so this is a bit of a departure for me (but that's a good thing).
How would you tackle this? I am especially interested in how I might get the local data into the big database. Do .NET or SQL Server have any tools that make this easy? Any suggestions and considerations are welcome.
I would start with the Sync Framework. There's even a sample using SQL Compact as an offline data store.