Data-Driven Websites for Very Small Businesses - C#

I have a client who has a product-based website with hundreds of static product pages that are generated by Microsoft Access reports and pushed up to the ISP via FTP (it is an old design). We are thinking about getting a little more sophisticated and creating a data-driven website, probably using ASP.NET MVC.
Here's my question. Since this is a very small business (a handful of employees), I'd like to avoid enterprise patterns like web services if I can. How does one push updated product information to the website, batch-style? In a SQL Server environment, you can't just push up a new copy of the database, can you?
Clarification: The client already has a system at his facility where he keeps all of his product information and specifications. I would like to refresh the database at the ISP with this information.

You don't mention what exactly the data source is, but the implication is that it's not already in SQL Server. If that's the case, have a look at SSIS.
If the source data is in SQL Server, then I think you'd want to be looking at either transactional replication or log shipping to sync the two databases.

If you are modernizing, and it is a handful of employees, why would you push the product info out batch style?
I don't know exactly what you mean by "data driven", but why not allow the ASP.NET app to query the SQL Server product catalog database directly? Why generate static pages at all?
UPDATE: OK, I see. The real question is how to update the SQL Server database running at the ISP.

You create an admin panel so the client can edit the data directly on the server. It is perfectly reasonable to have the client keep all their records on the server as long as the server is backed up nightly. Many cloud and virtual services offer easy ways to do replicated backups.
The additional benefit of this model is that more than one user can be adding or updating records at a time, making the workforce a lot more scalable. Likewise, the users can log in from anywhere they have a web browser to add new records, fix mistakes made in old records, etc.
EDIT: This approach assumes you can convince the client to abandon their current data entry system in favor of a centralized web-based management panel. Even if this isn't the case, the SQL database can be hosted on the server and the client's application could be made to talk to that so you're only ever using one database. From the sounds of it, it's a set of Access forms and macros which you should have source access to.

Assuming that there is no way to sync the data directly between your legacy system DB (is it in Access, or is Access just running the reports?) and the SQL Server DB on the website (I'm not aware of any):
The problem with "pushing" the data directly into SQL Server will be that "old" records (already in the DB) won't be updated, but instead removed and then recreated. This is a big problem with foreign keys. Plus, I really don't like the idea of giving the client any access to the DB at all.
So considering that, I find the best approach is to write a relatively simple page that takes an uploaded file and updates the database. The file will likely be CSV, possibly XML. After a few iterations of writing these pages over the years, here's what I've come up with (a rough controller sketch follows the list):
1. Show a file upload box.
2. On the next page load, save the file to a temp location.
3. Loop through each line (each element, for XML) and validate all the data. Foreign keys especially, but also business validations. You can also validate that the header row exists, etc. Don't update the database yet.
3a. If invalid data exists, save an error message to an array.
4. At the end of the loop, show the view.
4a. If there were errors, show the list of error messages and tell the user to re-upload the file.
4b. If there were no errors, create a link that carries the file location from #2 and a confirmation flag.
5. After the file location and confirm flag have been submitted, run the loop from #3 again, but with an if (confirmed) { } statement that actually writes the updates to the DB.
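Here's a rough ASP.NET MVC sketch of that flow. The ProductCsvValidator and ProductImporter helpers, the view names, and the CSV layout are all illustrative placeholders, not part of the original answer.

using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Web;
using System.Web.Mvc;

public class ProductUploadController : Controller
{
    [HttpGet]
    public ActionResult Upload()
    {
        return View(); // step 1: show the file upload box
    }

    [HttpPost]
    public ActionResult Upload(HttpPostedFileBase file)
    {
        // Step 2: save the uploaded file to a temp location.
        var tempPath = Path.Combine(Path.GetTempPath(), Guid.NewGuid() + ".csv");
        file.SaveAs(tempPath);

        // Step 3: validate every line without touching the database.
        List<string> errors = ProductCsvValidator.Validate(tempPath);

        if (errors.Any())
            return View("UploadErrors", errors);          // step 4a

        return View("ConfirmImport", model: tempPath);    // step 4b
    }

    [HttpPost]
    public ActionResult Confirm(string tempPath, bool confirmed)
    {
        // Step 5: same loop as the validation pass, but this time write to the DB.
        if (confirmed)
            ProductImporter.Import(tempPath);

        return RedirectToAction("Upload");
    }
}

// Hypothetical helpers -- replace with real validation and import logic.
public static class ProductCsvValidator
{
    public static List<string> Validate(string path)
    {
        var errors = new List<string>();
        var lines = File.ReadAllLines(path);
        if (lines.Length == 0 || !lines[0].StartsWith("Sku,"))
            errors.Add("Missing or invalid header row.");
        // ... per-line foreign key and business validations go here ...
        return errors;
    }
}

public static class ProductImporter
{
    public static void Import(string path)
    {
        // ... same loop as Validate, but actually updates the database ...
    }
}

The point of keeping Validate and Import as the same loop is that anything that passes the first pass should import cleanly in the second.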
EDIT: I saw your other post. One of the assumptions I made is that the databases won't be the same. I.e., the legacy app will have a table or two, maybe just products, but the new app will have orders, products, categories, etc. This will complicate "just uploading the file".

Why do you need to push anything?
You just need to create a product-management portion of the website and, secondly, a public-facing portion. Both portions would touch the same SQL Server database.

.NET has the ability to monitor a database and check for updates; then you can run a query to push the data elsewhere.
Or use SQL to push the data with a trigger on the table(s) in question.
Is this what you were looking for?
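If that refers to query notifications, here is a minimal sketch using SqlDependency. It assumes Service Broker is enabled on the database; the connection string, table, and column names are placeholders.

using System;
using System.Data.SqlClient;

class ProductWatcher
{
    // Placeholder connection string; Service Broker must be enabled on this DB.
    const string ConnStr = "Data Source=.;Initial Catalog=Products;Integrated Security=True";

    static void Main()
    {
        SqlDependency.Start(ConnStr);
        Subscribe();
        Console.ReadLine();            // keep the process alive while listening
        SqlDependency.Stop(ConnStr);
    }

    static void Subscribe()
    {
        using (var conn = new SqlConnection(ConnStr))
        using (var cmd = new SqlCommand(
            "SELECT ProductId, Name, Price FROM dbo.Products", conn))
        {
            var dependency = new SqlDependency(cmd);
            dependency.OnChange += (sender, e) =>
            {
                // A change was detected: re-query, push the data elsewhere,
                // then re-subscribe (each notification fires only once).
                Subscribe();
            };

            conn.Open();
            using (var reader = cmd.ExecuteReader())
            {
                while (reader.Read()) { /* read and push the current rows */ }
            }
        }
    }
}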

You can try an ASP.NET Dynamic Data Web Application.

You should have a service that regularly updates the data in the target DB. It will probably run on your source data machine (where the Access DB is).
The service can use SSIS or ADO.NET to write the data. You can do this over the web, since I assume you have TCP/IP access to the server.
Check when the updates run and how long they take. If you can do the updates during the night, you are fine. If not, you should check whether the site is still accessible during the import; that is sometimes not the case.
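As a rough sketch of the ADO.NET variant (the connection strings, table names, and the staging/merge procedure are assumptions, not part of the original answer): read the products from the Access file via OleDb and bulk-copy them into a staging table on the remote SQL Server.

using System.Data.OleDb;
using System.Data.SqlClient;

class ProductPushJob
{
    // Assumed connection strings -- adjust paths, credentials, and table names.
    const string AccessConnStr =
        @"Provider=Microsoft.ACE.OLEDB.12.0;Data Source=C:\data\products.accdb";
    const string SqlConnStr =
        "Data Source=webhost.example.com;Initial Catalog=Shop;User ID=app;Password=...";

    static void Main()
    {
        using (var source = new OleDbConnection(AccessConnStr))
        using (var target = new SqlConnection(SqlConnStr))
        {
            source.Open();
            target.Open();

            // Load into a staging table first, so the live Products table is only
            // touched by a single server-side merge at the end.
            new SqlCommand("TRUNCATE TABLE dbo.Products_Staging", target).ExecuteNonQuery();

            var select = new OleDbCommand("SELECT ProductId, Name, Price FROM Products", source);
            using (var reader = select.ExecuteReader())
            using (var bulk = new SqlBulkCopy(target) { DestinationTableName = "dbo.Products_Staging" })
            {
                bulk.WriteToServer(reader);
            }

            // Assumed stored procedure that merges staging into the live table.
            new SqlCommand("EXEC dbo.MergeProductsFromStaging", target).ExecuteNonQuery();
        }
    }
}

Run it from Windows Task Scheduler (or as a Windows service) at whatever interval suits the night-time window mentioned above.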

Use wget to push the new data file to the MVC app; once the data is received by the action, the MVC app invokes the processing/importing of the data (maybe in a worker process if you don't want long-running requests).

Related

Desktop application which can work offline when no connectivity with SQL Server

I am designing a WPF desktop application and using Entity Framework Code First to create and use a SQL Server database. My database will be hosted on one server machine and will be running 24/7.
I want to provide a feature where you can modify data offline (when you have no connectivity with the SQL Server DB) and save it somehow. Then, whenever the application finds a connection to SQL Server, all changes can be moved to the SQL Server DB.
Is there any way to achieve this using Entity Framework?
I want to emphasize that I am using Entity Framework. Is this type of functionality already implemented by EF? Or do I have to do it manually, e.g. write the changes to the file system and then merge them into the DB later?
You could figure out the specific exceptions that are generated when the SQL Server connection is lost, and embed your calls in try-catch blocks. If the server is offline, then in your catch block, pass the entity to a method that serializes the entity to JSON and saves it to the hard drive in a special directory or something. On your next successful query, check that directory to see if there are any saved entities that need to be saved.
Be specific with your catches - you don't want unrelated exceptions to trigger this code.
Some things to keep in mind - what if somebody else changed the data in the meantime? Are you intending to overwrite those changes? How did you get the data which needs to be saved in the first place if you are offline?
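A minimal sketch of that fallback, assuming EF 6 and Json.NET; the ShopContext, Customer, and directory names are made up for illustration.

using System;
using System.Data.Entity;
using System.Data.SqlClient;
using System.IO;
using Newtonsoft.Json;

public class Customer { public int Id { get; set; } public string Name { get; set; } }
public class ShopContext : DbContext { public DbSet<Customer> Customers { get; set; } }

public class OfflineQueue
{
    private readonly string _pendingDir;

    public OfflineQueue(string pendingDir)
    {
        _pendingDir = pendingDir;
        Directory.CreateDirectory(pendingDir);
    }

    // Try to save; if the server is unreachable, park the entity on disk as JSON.
    public void SaveOrQueue(ShopContext db, Customer customer)
    {
        try
        {
            db.Customers.Add(customer);
            db.SaveChanges();
        }
        catch (Exception ex) when (ex.GetBaseException() is SqlException)
        {
            // Be more specific in real code: only connection-level failures
            // belong here, not constraint violations and the like.
            db.Entry(customer).State = EntityState.Detached;   // stop tracking it
            var path = Path.Combine(_pendingDir, Guid.NewGuid() + ".json");
            File.WriteAllText(path, JsonConvert.SerializeObject(customer));
        }
    }

    // Call this after the next successful query to replay the queued saves.
    public void Replay(ShopContext db)
    {
        foreach (var file in Directory.GetFiles(_pendingDir, "*.json"))
        {
            var entity = JsonConvert.DeserializeObject<Customer>(File.ReadAllText(file));
            db.Customers.Add(entity);
            db.SaveChanges();
            File.Delete(file);
        }
    }
}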
As long as you have all the data loaded into the DbContext/ObjectContext, you're free to amend that data any way you want. The connection is only really needed when SaveChanges() is invoked.
However, if you're going to load everything into the context, you seem to be reimplementing DataSet functionality, which additionally allows for XML serialization/deserialization of the changes, so the changes can even be saved between sessions.
Not as trendy as EF, though :)
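For reference, the DataSet change tracking this answer alludes to can be persisted between sessions roughly like this (the file path and schema are up to you):

using System.Data;

static class OfflineChanges
{
    // Persist only the pending changes as a DiffGram so they survive a restart.
    public static void SavePending(DataSet data, string path)
    {
        DataSet changes = data.GetChanges();        // rows added/modified/deleted
        if (changes != null)
            changes.WriteXml(path, XmlWriteMode.DiffGram);
    }

    // Re-load the pending changes into a DataSet with the same schema.
    public static void LoadPending(DataSet data, string path)
    {
        var pending = data.Clone();                 // same schema, no rows
        pending.ReadXml(path, XmlReadMode.DiffGram);
        data.Merge(pending);                        // bring the pending rows back in
    }
}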
While I have never tried this with SQL-based data I have done it in the past with filesystem-based data and it's a major can of worms.
First, you have to have some means of indicating what data needs to be stored locally so that it will be available when you're offline. This will need to be updated either all the time or before you head out, and that can involve a lot of data transfer.
Second, once you're back online there's a lot of conflict resolution that must be done. If there's a realistic chance that someone else might have changed the data while you were out you need some way of detecting the conflict and prompting the user as to what to do in that situation. This almost certainly requires a system that keeps a detailed edit trail on every unit of data that could reasonably be updated.
In my situation I was very fortunate in that it was virtually certain that if the remote user edited file [x] that overwriting the system copy was the right thing to do. Remote users would only be carrying the files that pertained to their projects, conflicts should never happen. Thus the writeback was simply based on timestamps, nothing more. Data which people in the field would not normally need to modify was handled by not even looking at it, modified files were simply copied from the system to the laptop.
This leaves the middle step: saving the pending writes. I disagree with Elemental Pete's answer in this regard: simply serializing the changes and saving the result does not work, because what happens when you read that data back in again? You see the old copy, not the changed copy!
My approach to this was a local store of all relevant data that was accessed exactly like the main system data; all reads and writes worked normally.
Something a lot fancier might be needed if you have data that needs transactions involved.
Note that we also hit a nasty human problem: the update process took several minutes (note: >10y ago) simply analyzing what needed to be done, not counting any actual copy time. The result was people bypassing it when they thought they could. Sometimes they thought wrong, oops!

Best way to store remote database locally for offline usage (without running mysql server locally)?

Good morning fellow friends of paleness.
Please help me find a good approach to solve the following problem:
What am I trying to do?
I would like to create a c# software that does the following two things:
The first part is a database where the user can store contact information. Each record consists of around 10 fields; no more than a couple of hundred contacts will be stored, and fewer than twenty users will have access to the database.
Secondly, the user shall be able to select a subset of the data by applying filters. He should then be able to send personalized emails to the contacts in these subsets.
What are the constraints?
I would like to store the contact data mainly on a remote server that has MySQL capability
However, when a user is offline, I want him to still be able to access the data, so a way to store the data locally is needed. Selecting subsets and saving email jobs for later execution shall be possible in offline mode
I do not want the user to need to install or start any local server when using the program
When the user goes online the local and the remote data will be synchronized
What is the problem?
The solution I came up with was to store the data locally, either in an XML file or in an embedded database like SQLite. But XML will not be a good choice for the "selecting a subset" feature, and I figured both approaches will need conversion to MySQL anyway. I know the amount of data is not that big, so maybe either approach is fine. But maybe someone has a better idea altogether?
It would be great to hear your thoughts on how to store the data locally.
Cheers, Essi

Best approach to incrementally update application data

I have been working for a couple of years on an application that I update using a back-end database. The whole key is that everything is cached on the client, so that it never requires a network connection to operate, but when it does have a connection it will always pick up the latest updates. Every application update is shipped with the latest version of the database, and I want it to download only the minimum amount of data when the database has been updated.
I currently use a table with a timestamp to check for updates. It looks something like this.
ID - Name - Description - Severity - LastUpdated
0 - test.exe - KnownVirus - Critical - 2009-09-11 13:38
1 - test2.exe - Firewall - None - 2009-09-12 14:38
This approach was fine for what I previously needed, but I am looking to expand more functions of the application to use this type of dynamic approach. All the data is currently stored as XML, but I do not want to store complete XML files in the database; I only want to transmit changed data.
So how would you go about allowing a fairly simple approach to storing dynamic content (text/XML/JSON/XAML) in a database, and have the client only download new updates? I was thinking of having logic that can handle XML inserted directly:
ID - Data - Revision
15 - XXX - 15
XXX would be something like <Content><File>Test.dll</File><Description>New DLL to load.</Description></Content> and would be inserted into the cache, but this would obviously be complicated, as I would need to load them in sequence.
Another approach that has been mentioned was to base it on something similar to source control, storing the version in the root of the file and calculating the delta to figure out the minimal amount of data that needs to be sent to the client.
Has anyone got any suggestions on how to approach this with no risk of data corruption? I would also like to add features that allow me to revert possibly bad revisions and replace them with new, working ones.
It really depends on the tools you are using and the architecture you already have. Is there already a server with some logic and a data access layer?
Dynamic approaches might get complicated, slow and limit the number of solutions. Why do you need a dynamic structure? Would it be feasible to just add data by using a name-value pair approach in a relational database? Static and uniform data structures are much easier to handle.
Before going into detail, you should consider the different scenarios.
Items can be added
Items can be changed
Items can be removed (I assume)
Adding is not a big problem. The client needs to remember the last revision number it got from the server, and you write a query which gets everything since then.
Changing is basically the same. You should take care with identification of the items: you need an unchangeable surrogate key, which seems to be the ID you already have. (GUIDs may be useful here.)
Removing is tricky. You need to either flag items as deleted instead of actually removing them, or have a list of removed IDs with the revision number when they had been removed.
Storing the data in the client: Consider using a relational database like SQLite in the client. (It doesn't need installation, it is just storing in a file. Firefox for instance stores quite a lot in SQLite databases.) When using the same in the server, you can probably reuse some code. It is also transaction based, which helps to keep it consistent (rollback in case of error during synchronization).
XML - if you really need it - can be stored just as a string in the database.
When using an abstraction layer or ORM that supports SQLite (eg. NHibernate), you may also reuse some code even when there is another database used by the server. Note that the learning curve for such an ORM might be rather steep. If you don't know anything like this, it could be too much.
You don't need to force reuse of code in the client and server.
Synchronization itself shouldn't be very complicated. You have a revision number in the client and a last revision in the server. You get all new / changed and deleted items since then in the client and apply it to the local store. Update the local revision number. Commit. Done.
I would never update only a part of a revision, because then you can't really know what changed since the last synchronization. Because you do differential updates, it is essential to have a well defined state of the client.
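A bare-bones sketch of that pull, assuming a SQLite client store (via Microsoft.Data.Sqlite) with Items and SyncState tables, and a server-side Items table carrying Id, Data, IsDeleted, and Revision columns; all of these names are illustrative.

using System;
using Microsoft.Data.Sqlite;
using System.Data.SqlClient;

static class Synchronizer
{
    public static void Pull(string serverConnStr, string localDbPath)
    {
        using (var local = new SqliteConnection("Data Source=" + localDbPath))
        {
            local.Open();
            using (var tx = local.BeginTransaction())
            {
                var revCmd = new SqliteCommand("SELECT Revision FROM SyncState", local, tx);
                long lastRevision = (long)revCmd.ExecuteScalar();

                using (var server = new SqlConnection(serverConnStr))
                {
                    server.Open();
                    var cmd = new SqlCommand(
                        "SELECT Id, Data, IsDeleted, Revision FROM Items WHERE Revision > @rev",
                        server);
                    cmd.Parameters.AddWithValue("@rev", lastRevision);

                    using (var reader = cmd.ExecuteReader())
                    {
                        while (reader.Read())
                        {
                            Apply(local, tx, reader.GetInt32(0), reader.GetString(1),
                                  reader.GetBoolean(2), reader.GetInt64(3));
                            lastRevision = Math.Max(lastRevision, reader.GetInt64(3));
                        }
                    }
                }

                var update = new SqliteCommand("UPDATE SyncState SET Revision = @rev", local, tx);
                update.Parameters.AddWithValue("@rev", lastRevision);
                update.ExecuteNonQuery();

                tx.Commit();   // all-or-nothing: an error rolls back and keeps the store consistent
            }
        }
    }

    static void Apply(SqliteConnection local, SqliteTransaction tx,
                      int id, string data, bool isDeleted, long revision)
    {
        // Deleted rows are removed locally; added/changed rows are upserted.
        var sql = isDeleted
            ? "DELETE FROM Items WHERE Id = @id"
            : "INSERT INTO Items (Id, Data, Revision) VALUES (@id, @data, @rev) " +
              "ON CONFLICT(Id) DO UPDATE SET Data = @data, Revision = @rev";
        var cmd = new SqliteCommand(sql, local, tx);
        cmd.Parameters.AddWithValue("@id", id);
        if (!isDeleted)
        {
            cmd.Parameters.AddWithValue("@data", data);
            cmd.Parameters.AddWithValue("@rev", revision);
        }
        cmd.ExecuteNonQuery();
    }
}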
I would go with a solution using Sync Framework.
Quote from Microsoft:
Microsoft Sync Framework is a comprehensive synchronization platform enabling collaboration and offline access for applications, services and devices. Developers can build synchronization ecosystems that integrate any application, any data from any store using any protocol over any network. Sync Framework features technologies and tools that enable roaming, sharing, and taking data offline.
A key aspect of Sync Framework is the ability to create custom providers. Providers enable any data sources to participate in the Sync Framework synchronization process, allowing peer-to-peer synchronization to occur.
I have just built an application pretty much exactly as you described. I built it on top of the Microsoft Sync Framework that DjSol mentioned.
I use a C# front-end application with a SQL CE database and a SQL Server 2005 database at the other end.
The following articles were extremely useful for me:
Tutorial: Synchronizing SQL Server and SQL Server Compact
Walkthrough: Creating a Sync service
Step by step N-tier configuration of Sync services for ADO.NET 2.0
How to Sync schema changed database using sync framework?
You don't say what your back-end database is, but if it's SQL Server you can use SQL CE (SQL Server Compact Edition) as the client DB and then use RDA or merge replication to update the client DB as desired. This will handle all your requirements for sure; there is no need to reinvent the wheel for such a common requirement.

Select databases dynamically

I ran into a real brick wall trying to connect to dynamic databases, and I don't know how to achieve this.
Here is my process: I have an application that needs to be adaptable to changes in the work environment. Say the workplace's server crashes and they create a new database with the name db_new; the application should connect to that instead of the old database.
I already have a window that displays the databases from the server in a listbox, where the user can specify which database to use for the application. But the issue is: how can I save the selected database name so that the application keeps using it after the new database is selected?
As in, the administrator should be able to change the database the application uses if necessary, and the application should keep on using that selected database until the administrator changes it to a new one.
Please forgive me if the question is a bit vague; I just put it together the best way I could. Any help on this would be really great :)
EDIT:
And I cannot use text files or XML, as the database name the application uses should be stored in a secure manner. :)
First of all, you can very easily use a text or XML file: if you store the information in a file that can't be downloaded by the user (as I assume you would), this is as safe as it can be. If somebody manages to break into the server and read the file, it's game over anyway.
That said, I would recommend you use MySQL Proxy or a similar mechanism and point your web app at it: failing over to another database or changing the underlying database could then be handled at the proxy layer without the web app even knowing about it. The functionality need not be part of your application, and in my book it shouldn't be.
You haven't told us the language you are using. Therefore we cannot offer very good suggestions.
My first thoughts:
If this were PHP, you could have the general app use something along the lines of:
$db->execute('sql statement here');
and then just have the administrator change the current $db when needed. That way $db->execute() will always be executed on the "current" database.
Edit: This should still work in C#. If the functions that use the database go through a single variable holding the current DB connection, then you should be able to point that connection at the proper database whenever you need to, while the rest of the code keeps running, since it's just the same variable.
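A minimal C# sketch of that idea: route all data access through one provider that holds the current connection string, and let the admin screen swap the database name at runtime. Names are illustrative; persisting the choice securely (e.g. in an encrypted configuration section) is left to you.

using System.Data.SqlClient;

// All data access goes through this provider, so changing the current
// database here changes it everywhere without restarting the app.
public static class Db
{
    private static string _connectionString =
        "Data Source=appserver;Initial Catalog=db_main;Integrated Security=True";

    // Called from the admin screen after the user picks a database in the listbox.
    public static void UseDatabase(string databaseName)
    {
        var builder = new SqlConnectionStringBuilder(_connectionString)
        {
            InitialCatalog = databaseName   // e.g. "db_new"
        };
        _connectionString = builder.ConnectionString;
        // Persist the choice here (e.g. an encrypted config section)
        // so it survives an application restart.
    }

    public static SqlConnection Open()
    {
        var conn = new SqlConnection(_connectionString);
        conn.Open();
        return conn;
    }
}

// Usage elsewhere in the app:
// using (var conn = Db.Open())
// using (var cmd = new SqlCommand("SELECT ...", conn)) { ... }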

C#: query MS Access against SQL Server

I have been asked to set up a course leaflet system for a college. For whatever reason, their current system has never been linked to their actual course file; they wish to close this gap so that course leaflets are tied to actual course codes. Unfortunately, their course file is an MS Access database linked to many of their existing systems (it cannot easily be upgraded or moved). Since the course leaflets are going on the web, it is a requirement of their hosting to use a SQL Server database.
This means I need to query across the two internally so they can work out which courses have no leaflet. I would not like to add ad hoc queries to the Access database to do this.
What is the best way to do this in C#? I think LINQ can do it, but I have not learnt it yet. Should I learn it for this project, or is there an easier way?
I thought about a linked server to the MS Access DB, but this would require moving the DB to the SQL Server machine, another difficult task, as from what I can tell the links to the database are hard-coded.
Just how often does the course file change? Fifty times a day? Once a month?
What about creating the appropriate tables in the SQL Server database? Then every so often (as often as necessary to stay reasonably current), clear those tables out and repopulate them from the Access database. You could set this to run every morning at 3 a.m. or whatever. Or you could just do it periodically whenever the tables change significantly.
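A hedged sketch of that refresh job (the share path, connection strings, OLE DB provider, and column names are all assumptions): read the course file with OleDb and repopulate the SQL Server copy with SqlBulkCopy inside a transaction.

using System.Data.OleDb;
using System.Data.SqlClient;

class CourseRefreshJob
{
    const string AccessConnStr =
        @"Provider=Microsoft.ACE.OLEDB.12.0;Data Source=\\server\share\courses.mdb";
    const string SqlConnStr =
        "Data Source=sqlhost;Initial Catalog=Leaflets;Integrated Security=True";

    static void Main()
    {
        using (var access = new OleDbConnection(AccessConnStr))
        using (var sql = new SqlConnection(SqlConnStr))
        {
            access.Open();
            sql.Open();

            using (var tx = sql.BeginTransaction())
            {
                // Clear out the copy, then repopulate it from the course file.
                new SqlCommand("DELETE FROM dbo.Courses", sql, tx).ExecuteNonQuery();

                var select = new OleDbCommand("SELECT CourseCode, Title FROM Courses", access);
                using (var reader = select.ExecuteReader())
                using (var bulk = new SqlBulkCopy(sql, SqlBulkCopyOptions.Default, tx)
                    { DestinationTableName = "dbo.Courses" })
                {
                    bulk.WriteToServer(reader);
                }

                tx.Commit();
            }
        }
        // Finding courses without a leaflet is then a plain SQL join, e.g.:
        // SELECT c.CourseCode FROM dbo.Courses c
        // LEFT JOIN dbo.Leaflets l ON l.CourseCode = c.CourseCode
        // WHERE l.CourseCode IS NULL
    }
}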
Why do you need to move the Access file to the SQL Server machine to create a linked server? Just put it on a network share with appropriate security and create your linked server against that.
To add: LINQ itself has nothing to do with SQL or Access or anything else; it's for querying in-memory object collections. Some LINQ providers allow you to use it to access the DB in question, but they won't be much help in this situation, I think.
