How to handle an intranet application to work offline - C#

I developed a small expense report system used to replace an excel file that was very old.
The setup was pretty simple:
- UI is jQuery, ASP.NET, simple CSS
- Back end is SQL Server 2005
- There are some additional tiers for the business layer.
Typical tiered application. It's running on IIS 6 as an intranet site. It's a lot simpler to use than the Excel sheet, but of course there is the guy who doesn't seem to have an internet connection, so how can he submit his expense report?
I thought of an offline system that serializes objects, but that again seems a bit complex, as eventually those serialized objects need to get back to the database layer. I'm thinking more along the lines of just telling them to submit an Excel spreadsheet to an "admin" who does have a connection and have the admin enter the expense report. But I fear the "admins" will hate this.
Even if I whip up a desktop client application, I'd still have to hit the network to get to the data layer.
Any other ideas I haven't thought of for handling the guy who can't VPN or get on the network but once loved his Excel application?

We implemented a few solutions where the users don't have a connection. One was a WinForms app, and others were ASP.NET apps. For the ASP.NET apps, we modified them to run using IIS on the laptop or client PC, since you can install IIS on a client OS.
What we ended up using as a pattern was a DataSet held locally to hold/parse/manipulate data on the client PC, which is then serialized to XML using the built-in DataSet.WriteXml method. Then we created an "Upload" screen (ASP.NET page or WinForms form) that shows the pending uploads and sends the DataSet to an ASP.NET web service. Since the DataSet object is serializable, using it in a web service is the least amount of work. The web service handles actually inserting/updating the database.
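A minimal sketch of the client side of that pattern; the ExpenseService proxy and its SubmitExpenses method are illustrative names, not from the original apps:

```csharp
using System.Data;

// Save pending work locally, then push it up when a connection is available.
public static class OfflineQueue
{
    public static void SaveLocally(DataSet expenses, string path)
    {
        // Writing the schema keeps column types intact for the round trip.
        expenses.WriteXml(path, XmlWriteMode.WriteSchema);
    }

    public static void Upload(string path)
    {
        var pending = new DataSet();
        pending.ReadXml(path, XmlReadMode.ReadSchema);

        // Generated ASMX web service proxy (hypothetical name); the service
        // performs the actual inserts/updates against the database.
        using (var service = new ExpenseService())
        {
            service.SubmitExpenses(pending);
        }
    }
}
```

The "Upload" screen just lists the saved XML files and calls Upload for each one the user selects.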
It is actually a lot less work than it sounds like, and the pattern has been so successful for us that it's become our standard method for handling disconnected apps. The troubleshooting on these apps has been negligible, which to me speaks volumes for the simplicity and reliability, compared to other approaches we've tried.

You can implement SQL Server merge replication. There was a little bit of a learning curve, but I figured it out and implemented it in a couple of weeks, so it's definitely doable. You can have a desktop app that updates a SQL Server Express database and then synchronizes the changes using replication at a time of your choosing (maybe when the user is connected to the network); a rough sketch follows below the link.
How Merge Replication Works:
http://msdn.microsoft.com/en-us/library/ms151329.aspx
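As a rough sketch, the desktop app can kick off a sync on demand through the Replication Management Objects (RMO) API; the server, database, and publication names below are placeholders, and the exact security settings will depend on your topology:

```csharp
using System;
using Microsoft.SqlServer.Replication;  // RMO, installed with the SQL Server client tools

public static class ExpenseSync
{
    // Pull-style merge sync triggered from the desktop app, e.g. when the
    // user clicks "Synchronize" or a network connection is detected.
    public static void Run()
    {
        var agent = new MergeSynchronizationAgent
        {
            Publisher = "CENTRAL-SQL",                  // placeholder names throughout
            PublisherDatabase = "ExpenseReports",
            Publication = "ExpenseReportsPub",
            Distributor = "CENTRAL-SQL",
            Subscriber = Environment.MachineName,
            SubscriberDatabase = "ExpenseReportsLocal"  // the local SQL Server Express copy
        };
        agent.Synchronize();                            // blocks until the merge completes
    }
}
```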

You can do a variation of emailing the xls file to an admin. Instead of the admin re-entering the data, you could have them upload the xls file and have your system parse it and enter the data. This works quite well, especially if the xls format is standardized.
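For example, a rough server-side sketch of parsing the uploaded file with the Jet OLE DB provider; the sheet name, header layout, and provider version are assumptions about the workbook:

```csharp
using System.Data;
using System.Data.OleDb;

public static class XlsImporter
{
    // Reads the "Expenses" worksheet of an uploaded .xls into a DataTable.
    public static DataTable ReadExpenseSheet(string xlsPath)
    {
        string connStr =
            "Provider=Microsoft.Jet.OLEDB.4.0;" +
            "Data Source=" + xlsPath + ";" +
            "Extended Properties=\"Excel 8.0;HDR=Yes\"";   // first row = column headers

        using (var conn = new OleDbConnection(connStr))
        using (var adapter = new OleDbDataAdapter("SELECT * FROM [Expenses$]", conn))
        {
            var table = new DataTable();
            adapter.Fill(table);   // one row per expense line
            return table;
        }
    }
}
```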

Using the Office toolkit and macros built into the users' Excel application, add the functionality to export the data into an XML dataset and transmit it when the opportunity permits. On the server end, build an adapter that reads those XML files as they arrive and inserts the appropriate records as required.
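A sketch of what that server-side adapter could look like, assuming the exported XML is compatible with DataSet.ReadXml and lands in a drop folder; the folder path and insert routine are placeholders:

```csharp
using System;
using System.Data;
using System.IO;

class XmlDropAdapter
{
    static void Main()
    {
        var watcher = new FileSystemWatcher(@"C:\ExpenseDrop", "*.xml");
        watcher.Created += delegate(object sender, FileSystemEventArgs e)
        {
            var ds = new DataSet();
            ds.ReadXml(e.FullPath);    // parse the exported data set
            InsertExpenses(ds);        // push the rows into the database
            File.Delete(e.FullPath);   // remove the file once processed
        };
        watcher.EnableRaisingEvents = true;

        Console.WriteLine("Watching for exported expense files. Press Enter to exit.");
        Console.ReadLine();
    }

    static void InsertExpenses(DataSet ds)
    {
        // e.g. loop over ds.Tables[0].Rows and call a stored procedure,
        // or use SqlBulkCopy; omitted here.
    }
}
```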

Related

Web service for Laravel

Assume that I have a third-party database application with an SDK that can be used to retrieve data out of the database as XML.
On the other side, I have developed a website using Laravel framework of PHP. The website is supposed to display data from the database of the application.
With regard to the above, I have the following questions:
As far as I understand, I can either store the requested data in my website database or just show it without storing. What technique do you suggest?
How do I achieve XML data transfer from the database server to the website?
Taking into account that I have experience developing in C#, I assume that I have to develop a web service that would run on the database server, retrieve the required data, and send it to my website. So the web service has to receive requests from my Laravel website, retrieve data from the database server accordingly, and pass the XML response back to my website, which would finally display it. Am I on the right track? If so, could you please guide me on how to code and bind these parts?
Thank you in advance.
I have to agree with @Serge in the comments - there are many ways to do this because it is a very broad question.
My answer was mostly going to deal with how regularly the third-party database is updated, but judging from your comments, I'm assuming it will be fairly often? In that case, I would likely connect directly to the third-party database from your Laravel app using the Firebird driver found here: https://github.com/jacquestvanzuydam/laravel-firebird (please note, I have never used this, so I cannot comment on its quality) instead of writing a C# web service. I don't know much about Firebird itself, but you will likely want to connect using an SSH tunnel or VPN for security reasons.
Then I would either store the data in MySQL if you know it isn't likely to change very often (in this case you would use a Laravel command, run on a schedule, to pull data out of Firebird every [X] days/hours/minutes depending on the data) or, if the data is likely to change on each potential web request, use some form of caching system (Redis, Memcached, a file cache, etc.) to speed up the web requests.
Sorry if that isn't particularly helpful - if you can provide more information maybe I can help you out further :)
Good luck!

Fat Server + Thin Rich Client in .NET

I am working out the nitty-gritty of a potential server / many-client project, and it's in a realm I haven't been in before. Disregarding the scale of the project for a moment and assuming this ever goes ahead....
My current idea is that the server should be a fat server with a thin rich client on each workstation, built in C# .NET and probably using WinForms for the user interface, and distributed via ClickOnce for easy and compliant software updating.
Database <-> Server (business logic) <-> Rich Thin Client
Instead of a fat client:
Database <-> Client
I am looking into WCF for the server. Is this advisable for a client-server architecture with the following usage case?
Anywhere between 10 and 100 receptionists and practitioners using the client (company growth would increase this)
Windows 7 and up being primary workstation operating system
Minimal data traffic desirable
Potential for large data to be stored alongside the database in some manner (patient images, video and such)
Is it wise for the server to perform as much of the business logic as possible, sending only information and results to the client and doing just the basics (validation, etc.) there? It seems logical to me.
Does anyone have some good information on beginning such a big project?
I believe the current software being used is actually a fat client with a direct MSSQL connection.
Not only this, it is also non-distributed, and each clinic has its own separate database, causing many problems with data integrity and collation for reporting and such.
This is a perfectly valid plan for your architecture. WCF will do fine. If you need a tutorial on sharing authentication between the web application that authenticates users and the ClickOnce module run from within the application, I wrote one once:
http://www.wiktorzychla.com/2008/02/clickonce-webservice-and-shared-forms.html
That was written years ago, and while it shows how to share authentication between ClickOnce and an ASMX web service, some time later I wrote another tutorial on sharing authentication between a Silverlight module and a WCF service:
http://www.wiktorzychla.com/2010/04/aspnet-forms-authentication-sharing-for.html
Combining these two will give you ClickOnce + WCF authentication sharing.
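As a starting point, here is a minimal sketch of the fat-server side in WCF; the contract, type, and endpoint names are made up for illustration:

```csharp
using System;
using System.Runtime.Serialization;
using System.ServiceModel;

// Contract shared between the server and the ClickOnce-deployed thin client.
[ServiceContract]
public interface IPatientService
{
    [OperationContract]
    PatientSummary GetPatient(int patientId);
}

[DataContract]
public class PatientSummary
{
    [DataMember] public int Id { get; set; }
    [DataMember] public string Name { get; set; }
}

// All business logic stays on the server; the client only displays results.
public class PatientService : IPatientService
{
    public PatientSummary GetPatient(int patientId)
    {
        // ... call the business layer / database here ...
        return new PatientSummary { Id = patientId, Name = "Example" };
    }
}

class Program
{
    static void Main()
    {
        using (var host = new ServiceHost(typeof(PatientService),
                   new Uri("http://localhost:8080/patients")))
        {
            host.AddServiceEndpoint(typeof(IPatientService), new BasicHttpBinding(), "");
            host.Open();
            Console.WriteLine("Service running. Press Enter to stop.");
            Console.ReadLine();
        }
    }
}
```

In production you would host the service in IIS or a Windows Service and secure the binding, but the shape of contract + implementation + host stays the same.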

connecting to Oracle from a Windows.Forms Application

I'll be working on a Random Moment Sampling desktop app. I haven't worked with Windows Forms in a long time, and I have the following questions.
I need to query data from Oracle 11g. If I remember right, before my users can start using the client application they need to install the Oracle client. Am I right, or has this changed?
If this is a problem, I can use web services to retrieve the data. If someone has recommendations, I'm open to alternatives; I'll have approximately 3,000 users and I'm looking for the best option.
The application will run in the background, querying the database every minute to look for samples. The moment it finds one, a window comes up blocking the computer until the user fills in the sample.
Is a Windows Forms application the best option, or should I use a Windows Service? I've read a few threads, but I'm thinking about the installation process.
I currently have time, so I can try a few ideas.
Yes, the Oracle software needs to be installed. There is an "Instant Client" package that is a little more lightweight than the normal client and can provide connectivity.
Whether to use a service or not depends on the functionality of your system and how extensible you want it. You mentioned you will have 3,000 users querying the data. If they are querying the same data, it may result in more than one user responding to the same data. I don't know if this is what is desired.
Edit: To tie the two points together: if the Oracle software is a concern and you do create a service that serves up your data, the machine where the service runs is the only one that will require the Oracle client software.
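For reference, a rough sketch of the polling query using ODP.NET (Oracle.DataAccess), which can be deployed alongside the Instant Client; the connection string, table, and column names are placeholders:

```csharp
using System;
using Oracle.DataAccess.Client;   // ODP.NET provider

public static class SamplePoller
{
    // Called on a timer (e.g. every minute) to look for pending samples.
    public static void CheckForSamples()
    {
        string connStr = "Data Source=ORCL;User Id=app_user;Password=secret;";

        using (var conn = new OracleConnection(connStr))
        using (var cmd = new OracleCommand(
            "SELECT sample_id FROM pending_samples WHERE assigned_user = :usr", conn))
        {
            cmd.Parameters.Add(new OracleParameter("usr", Environment.UserName));
            conn.Open();
            using (OracleDataReader reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                {
                    // A sample was found: show the blocking window for
                    // the sample id in reader.GetDecimal(0) here.
                }
            }
        }
    }
}
```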

Windows application in C#?

I am designing a Windows application in C# which, when run for the first time, creates a local database (SQLite); the data (around 200 MB or even more) is fed to it as a data stream from a remote server based on criteria specified by the user.
Currently I have planned to access the database server directly from the application.
Questions:
Is it a good idea to access the database server directly from the application, since the server manages connections automatically and I save the time of developing a TCP/IP interface?
What would the second option be? Providing a TCP/IP server or interface (isn't that time-consuming to write)?
As the data is large, should I use compression?
With WCF you can avoid the cost of writing TCP/IP code and have a standalone server or a web service hosted on IIS. Moreover, compression can be added without code change.
You have to try with and without compression. The compression rate highly depends on the data and compression time can also be an issue.
Without going into much detail, I can say you can use ASP.NET with C# (you can choose any .NET language) and send and receive data using POST.
Why would you need compression? Aren't you only sending results? If the payload is big, you can use an existing library; I have used SevenZipSharp in the past without much issue.
Edit: There may be an option on the server to gzip output data so you may not need to use anything.
Assuming that your intention is to pull down a subset of the data on the server dependent on client queries for local storage then for reasons of security and control you probably ought to be looking at using web services rather than exposing your database directly to the internet.
There are a large number of options for creating services - WCF being the principal method for new .NET applications and straightforward to implement at both server and client ends - in this case I'd also probably take a look at ADO.NET Data Services as providing a shortcut to a rich set of services.
It is usually best to use ADO.NET or LINQ to SQL (Entity Framework) to connect to your Database directly unless the User is going to be disconnected while using the application.
If you are going to have the user disconnect, then continue using SQLite, or you can use ADO.NET, which can save the data as an XML file and access it like a table from the user's machine without the additional dependency on SQLite.
I would not use compression because C# does not have a built-in library for it and would require an additional dependency.
Try to just use the .NET Framework without additional DLL's and you will have a more flexible application that is easier to install and maintain.
ADO/Entity Framework - http://msdn.microsoft.com/en-us/library/h43ks021.aspx

Shaky connectivity - favor web or desktop app?

I'm a desktop application developer who is temporarily working in the web. I'm working with a client that wants me to build an app for use by locations all over the state; however, these locations have very shaky connectivity.
They really want a centralized web app and are suggesting I build a "lean" web app. I don't know what a "lean web app" means: small HTTP requests but lots of them, or large HTTP requests with few of them? I tend to favor chunky vs. chatty, but I've never had to worry about connectivity before.
Do I suggest a desktop app that replicates data when connectivity exists? If not, what's the best way to approach a web app when connectivity is shaky?
EDIT:
I must qualify my question with further information. Assuming the web option, they've disallowed the use of browser runtime technologies and anything that requires installation. Thus, Silverlight is out, Flash is out, Gears is out - only ASP.NET and JavaScript are available to me. Having stated this, part of my question was whether to use a desktop app; I suppose that can be extended to "thicker technologies".
EDIT #2: Network is homogeneous - every node is Windows. This won't be changing.
You should get a definition of what the client means by "lean" so that you don't have confusion surrounding it. Maybe present them with several options of lean that you think they might mean. One thing I've found is it's no good at all to guess about client requirements. Just get clarification before you waste a bunch of time.
Shaky connectivity definitely favors a desktop application. Web apps are great for users that have always-on Internet connections, and that might be using a variety of different browsers and operating systems.
Your client probably has locations that are all using Windows, so a desktop application is an appropriate choice. One other advantage of web applications is that they make the deployment issue easy to deal with. Auto-update technologies like ClickOnce make the deployment and update of desktop applications almost as easy.
And not to knock Google Gears, but it's relatively new and would have to be considered more risky than a tried-and-true desktop application.
Update: and if you're limited to just javascript on the client side, you definitely do not want to make this a web app. Your application simply will not be available whenever the Internet connection is down. There are ways to save stuff locally in javascript using cookies and user stores and whatnot, but you just don't want to do this.
If connectivity is so bad, I would suggest that you write a WinForm app that downloads information, locally edits it and then uploads it. This way, if your connection goes down, all you have to do is retry until it works.
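Something along these lines, where the uploadChanges callback stands in for whatever web service or WCF call pushes the locally edited data to the server:

```csharp
using System;
using System.Threading;

public static class Uploader
{
    // Keep retrying the upload until it succeeds, e.g. after the user
    // finishes editing locally and clicks "Submit".
    public static void UploadWithRetry(Action uploadChanges)
    {
        while (true)
        {
            try
            {
                uploadChanges();   // web service / WCF call that pushes the data
                return;            // success
            }
            catch (Exception)      // connection down, server unreachable, timeout...
            {
                Thread.Sleep(TimeSpan.FromSeconds(30));   // wait, then try again
            }
        }
    }
}
```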
They seem to be suggesting a plain vanilla web app that doesn't use AJAX or rely on .NET postbacks or do anything that might make it break down horribly if your connection goes away for a bit. Instead, it should be designed so that you can hit Refresh until it works. In other words, they seem to want the closest thing to a WinForm app, only uglier.
You may consider using a framework like Google Gears to help provide functionality during network down time. This allows users to connect to the web page once (with a functioning connection) and then be able to use the web app from then on, even without a connection.
When the network is restored, the framework can sync changes back with the central database.
There is even a tutorial for using Google Gears with the .Net Framework.
Gears with other languages
You mention that connectivity is shaky at these locations, but that the app needs to be centralized. One thing you might consider is using multiple decentralized read database servers and a single centralized write server. MySQL makes this possible and affordable if your app is small.
Have the main database server at the datacenter/central office. Put up small web/db servers at each location, with your app installed. You can even run them off a user computer if the remote location is not too big. Make the local database servers connect to the centralized database server as replication slaves. As changes come in to the centralized database, the slave servers will pull down the data and make it available locally. When the connection is unavailable, your app data is still at least available, if not up to date. When the connection is available, the database handles replicating all relevant data down.
Now all you have to do is make your app use two separate database handles: reading data it uses the local database, writing data it uses the central database.
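In code that just means two connection strings: reads against the local slave, writes against the central master. A minimal sketch, assuming MySQL Connector/NET and connection string names that are purely illustrative:

```csharp
using System.Configuration;
using System.Data;
using MySql.Data.MySqlClient;   // MySQL Connector/NET

public static class Db
{
    // Reads hit the local replication slave, so they keep working
    // (possibly with slightly stale data) when the link to the data center is down.
    public static IDbConnection OpenRead()
    {
        var conn = new MySqlConnection(
            ConfigurationManager.ConnectionStrings["LocalSlave"].ConnectionString);
        conn.Open();
        return conn;
    }

    // Writes always go to the central master at the data center.
    public static IDbConnection OpenWrite()
    {
        var conn = new MySqlConnection(
            ConfigurationManager.ConnectionStrings["CentralMaster"].ConnectionString);
        conn.Open();
        return conn;
    }
}
```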
