Fat Server + Thin Rich Client in .NET - c#

I am working out the nitty-gritty of a potential server/many-client project, and it's in a realm I haven't worked in before. Disregarding the scale of the project for a moment, and assuming it ever goes ahead...
My current idea is that the server should be a fat server with a thin rich client on each workstation, built in C# .NET and probably using WinForms for the user interface, and distributed via ClickOnce for easy and compliant software updating.
Database <-> Server (business logic) <-> Rich Thin Client
Instead of a fat client:
Database <-> Client
I am looking into WCF for the server. Is this advisable for a client-server architecture with the following usage case?
Anywhere between 10 and 100 receptionists and practitioners using the client (company growth would increase this)
Windows 7 and up being the primary workstation operating system
Minimal data traffic desirable
Potential for large data to be stored alongside the database in some manner (patient images, video and such)
Is it wise for the server to perform as much of the business logic as possible, only sending information and results, with the client doing the basics (validation etc.)? It seems logical to me.
Does anyone have some good information on beginning such a big project?
I believe the current software being used is actually a fat client with a direct MSSQL connection.
Not only this, it is also non-distributed, and each clinic has its own separate database, causing many problems with data integrity and collation for reporting and such.

This is a perfectly valid plan for your architecture. WCF will do fine. If you need a tutorial on sharing authentication between the web application that authenticates users and the ClickOnce module run from within it, I wrote one once:
http://www.wiktorzychla.com/2008/02/clickonce-webservice-and-shared-forms.html
This was written years ago; while it shows how to share authentication between ClickOnce and an ASMX web service, I later wrote another tutorial on sharing authentication between a Silverlight module and a WCF service.
http://www.wiktorzychla.com/2010/04/aspnet-forms-authentication-sharing-for.html
Combining these two will give you ClickOnce + WCF authentication sharing.
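For the service itself, here is a minimal sketch of what the fat-server contract might look like. The names and operations are hypothetical, purely to illustrate the business-logic-on-the-server shape:

    using System;
    using System.Runtime.Serialization;
    using System.ServiceModel;

    // The server owns the business logic; the thin client just calls
    // operations and renders the results it gets back.
    [ServiceContract]
    public interface IClinicService
    {
        [OperationContract]
        PatientSummary GetPatientSummary(int patientId);

        [OperationContract]
        void BookAppointment(int patientId, int practitionerId, DateTime start);
    }

    [DataContract]
    public class PatientSummary
    {
        [DataMember] public int PatientId { get; set; }
        [DataMember] public string FullName { get; set; }
    }

The client keeps only basic input validation; anything that touches the database goes through an operation like these, which also keeps the data traffic down to results rather than raw tables.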

Related

Share data between Windows Service and Cassini-dev running in the same process

I have a Windows Service written in C#. I have recently added CassiniDev to it to allow remote web administration and monitoring of the service. The integration went really well except for my inability to interact with the data layer of my Windows Service from the hosted ASP.NET pages.
I have tried putting everything of interest into a common assembly, but the debugger shows there are two loaded assemblies with the same name but from different paths. Cassini runs ASP.NET off some temp folder, so the assembly I am using is really "a different instance" in the address space of the same process.
I am not sure what is going on here. Probably some "application domain" separation stuff that I do not understand at this time.
So with the Windows Service and the web server running in the same process, how can I make them interact? Say I have some status in the Service part that I want to report in the ASP.NET part. Any ideas how I could make this happen? Shared memory or TCP comes to mind, but it sounds like overkill for purely intra-process communication.
If security isn't an immediate concern, i.e. the data isn't highly sensitive and the environment is controlled, then you could have success using named pipes. A managed API for pipes is part of the framework (System.IO.Pipes, since .NET 3.5), so you don't need to drop down to native calls.
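A minimal sketch of that managed API; the pipe name and the status payload are made up for illustration:

    using System.IO;
    using System.IO.Pipes;

    // Inside the Windows Service: answer one status request per connection.
    public static class StatusPipeServer
    {
        public static void ServeOnce(string status)
        {
            using (var server = new NamedPipeServerStream("ServiceStatusPipe"))
            {
                server.WaitForConnection();
                using (var writer = new StreamWriter(server))
                {
                    writer.WriteLine(status);
                    writer.Flush();
                }
            }
        }
    }

    // Inside a hosted ASP.NET page: fetch the status from the service side.
    public static class StatusPipeClient
    {
        public static string Fetch()
        {
            using (var client = new NamedPipeClientStream(".", "ServiceStatusPipe"))
            {
                client.Connect(1000); // wait up to one second for the server
                using (var reader = new StreamReader(client))
                {
                    return reader.ReadLine();
                }
            }
        }
    }

Since the pipe is an OS object, it happily crosses the AppDomain boundary that makes direct assembly sharing awkward here, even though both ends live in the same process.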

What .NET tools should I consider using to build an application to provide monitoring of our real-time systems?

I want to build some sort of interface that will monitor our real time routing/switching system. I would like to give a lot of visual feedback to be able to monitor its status visually. Our system and clients are not co-located so they would need to connect via TCP/IP.
I would like to be able to service any number of monitoring clients (although this will probably only ever be about 4-6). I thought of using Silverlight, but there appear to be one or two tricks involved in getting Silverlight to connect back to an application running on a different port.
I have also thought of using HTML5 canvas and WebSockets. Another alternative is to just create the clients using normal Windows Forms or perhaps WPF, but this means that to monitor the application the client has to be downloaded first. I would prefer something as easily accessible as a web app.
What are some of the more common application stacks to achieve this? What should I watch out for?
EDIT:
Just to add: This will be an internal tool only. But we have offices in a couple of locations.
Any choice in this direction is going to be subjective and arguable; somebody could suggest virtually any web framework or language...
Given your .NET and C# tags, however, I would consider ASP.NET MVC 3: basically a web-based, plugin-less (NO Silverlight) HTML5 solution.
Consider that Stack Overflow is built the same way (MVC, ASP.NET, SQL Server...) and, as we all know, performs extremely well.
How you grab the underlying events from TCP - that is, how you capture and provide the data coming off the wire - is a separate concern from the front end. I would probably write a Windows Service if the traffic is that high and you want to capture and store data regardless of any active client connection (a minimal sketch of such a service follows below).
There are plenty of real-time charting controls out there for MVC as well: MS Chart Controls, DevExpress, integrated ExtJS ones...
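To make the Windows Service idea concrete, here is a rough sketch of a collector that listens for the routing/switching feed and stores it whether or not any monitoring client is attached. The port and the persistence details are assumptions; a real service also needs a Main with ServiceBase.Run and an installer:

    using System.Net;
    using System.Net.Sockets;
    using System.ServiceProcess;
    using System.Threading;

    public class CollectorService : ServiceBase
    {
        private TcpListener _listener;
        private Thread _worker;
        private volatile bool _running;

        protected override void OnStart(string[] args)
        {
            _listener = new TcpListener(IPAddress.Any, 9000); // port is an assumption
            _listener.Start();
            _running = true;
            _worker = new Thread(AcceptLoop) { IsBackground = true };
            _worker.Start();
        }

        private void AcceptLoop()
        {
            while (_running)
            {
                try
                {
                    using (TcpClient client = _listener.AcceptTcpClient())
                    using (NetworkStream stream = client.GetStream())
                    {
                        // read the event stream here and persist it (database, log, ...)
                    }
                }
                catch (SocketException)
                {
                    // Stop() aborts the blocking Accept; the loop then exits cleanly
                }
            }
        }

        protected override void OnStop()
        {
            _running = false;
            _listener.Stop();
        }
    }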
"real time" and Browser is bothering me.
I would indeed go WPF or WinForms. Using ClickOnce deployment you can make this painless for the user, and you can roll out new versions just by redeploying and having the user restart the application.
In my company this works really well and we have had no problems whatsoever. The only wrinkle is that the app.config is somewhat hard to find and keep current/valid across redeploys, but in your case it won't change per client (or so I guess).
I agree with @Davide - I would go for a web service that obtains all the routing/switching data in real time. You will have a web application, and on the client side jQuery/AJAX fetching real-time data from the web service component.
I've seen cool demos of WebORB doing something similar to what you want. http://www.themidnightcoders.com/
If you are starting from scratch, it would be good to check out WCF (Windows Communication Foundation). It's great because it can expose your functionality in many ways, using nothing more than modifying a config file.
If you want a Windows client app, you can host the WCF service in a Windows Service, or simply include it as a side assembly. For web apps, you can choose between various formats (JSON, XML), channels (HTTP, TCP) and protocols (SOAP, ODP).
If I got it right, there will be a server-side application which will collect information from the devices and expose it to clients as a service. In that case, a WCF application might be hosted in a Windows Service or IIS on a server machine, and expose the data though one or more endpoints (HTTP, TCP).
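To make the "one service, many endpoints" point concrete, here is a minimal self-hosting sketch; in a real project the endpoints would normally live in the config file instead, and the contract and addresses here are made up:

    using System;
    using System.ServiceModel;

    [ServiceContract]
    public interface IMonitoringService
    {
        [OperationContract]
        string GetSystemStatus();
    }

    public class MonitoringService : IMonitoringService
    {
        public string GetSystemStatus() { return "OK"; } // placeholder result
    }

    class Program
    {
        static void Main()
        {
            var host = new ServiceHost(typeof(MonitoringService));
            // The same contract exposed over two transports.
            host.AddServiceEndpoint(typeof(IMonitoringService),
                new BasicHttpBinding(), "http://localhost:8080/monitor");
            host.AddServiceEndpoint(typeof(IMonitoringService),
                new NetTcpBinding(), "net.tcp://localhost:8081/monitor");
            host.Open();
            Console.WriteLine("Service running; press Enter to stop.");
            Console.ReadLine();
            host.Close();
        }
    }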
I am not aware of problems connecting a Silverlight app to a service, but I would rather go for an HTML5/JavaScript combo instead, for easier deployment and compatibility with a wider range of devices (no plugins needed). ASP.NET MVC should be the best choice for the web app.

Move from rich client (WPF) to web-based (Silverlight application) for multiuser application

We have a multi-user product prototype in WPF which works fine as a prototype. Now we want to build the complete product.
In our product we have scenarios where 2-3 users might have to use the same data. Say one is editing and another user is viewing the continuous edits. Also, whenever a user changes a common item, updates should go to all the other users on that same item, and they need to refresh their information. And this should happen without continuous polling.
Is there any advantage to going towards web-based product development, i.e. Silverlight?
There's absolutely no architectural advantage to Silverlight over WPF in this situation. Your main advantages for Silverlight would be 1) smaller deployment 2) cross platform and 3) more integrated in-browser experience.
From the developer's perspective, however, you might actually find WPF better for developing this kind of collaborative application because you have access to a wider array of networking options. Silverlight has limits on what TCP/UDP ports you can access and has no built-in peer to peer networking capabilities like WCF does on the .NET Framework.
In any case, a Silverlight application for all intents and purposes is a client application, not really any more of a web application than WPF except when it comes to deployment.
Peer to Peer (p2p) vs. Client / Server
Your question really comes down to how you want to design your data storage.
P2P: Do you want each copy of the application to keep a full copy of the data and to exchange updates to the data with the other clients? This works well in a LAN environment but gets challenging over the broad Internet. BitTorrent is a good example of an application that does this.
P2P generally has higher performance and fewer costs but is very tricky to pull off. You'll need a p2p network transport like PeerChannel, and most likely a synchronization engine like Sync Framework, and some form of structured local data store like SQLite.
Client / Server: Do you want one master computer (e.g. server) to host all the data and have each client load / update data to the server? This works well in a LAN or Internet environment. Web browsers / web servers are a good example of this.
Client/server has the overhead that a dedicated, always-on server computer has to be involved, and you have to program two applications: the client side and the server side. Silverlight or WPF work well for the client piece of this design.
If you're up for a challenge, developing a p2p application can be a lot of fun because there are many obstacles to overcome and the end result is generally more efficient. This will essentially require you use WPF to get the libraries/tools support you need. If you need to get something working quickly, you'll find that the tools you have support client / server much better because this is how most applications are written. Here, WPF and Silverlight will both work, but they are only part of the solution -- you'll need a server technology too like SQL Server or ASP.NET or Azure or ...
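If you take the client/server route on the full .NET Framework, a duplex WCF contract is one way to get the "push updates without polling" behaviour the question asks about. A minimal sketch, with hypothetical names:

    using System.Collections.Generic;
    using System.ServiceModel;

    // Callback contract: the server uses this to push changes to clients.
    public interface IItemCallback
    {
        [OperationContract(IsOneWay = true)]
        void ItemChanged(int itemId);
    }

    [ServiceContract(CallbackContract = typeof(IItemCallback))]
    public interface IItemService
    {
        [OperationContract]
        void Subscribe();

        [OperationContract]
        void UpdateItem(int itemId);
    }

    [ServiceBehavior(InstanceContextMode = InstanceContextMode.Single)]
    public class ItemService : IItemService
    {
        private readonly List<IItemCallback> _subscribers = new List<IItemCallback>();

        public void Subscribe()
        {
            _subscribers.Add(OperationContext.Current.GetCallbackChannel<IItemCallback>());
        }

        public void UpdateItem(int itemId)
        {
            // apply the edit, then notify every subscribed client
            foreach (IItemCallback callback in _subscribers)
                callback.ItemChanged(itemId);
        }
    }

This requires a duplex-capable binding such as NetTcpBinding, which is exactly the kind of networking option that is easier to reach from WPF than from Silverlight.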

Partially connected application using ASP.NET 3.5 (not mobile apps)

We had a requirement to build an ASP.NET 3.5 web application using Web Forms, WCF, ADO.NET and SQL Server. The users would connect via the Internet.
Recently we understood that it is possible that users would often remain disconnected and would have Internet access intermittently.
I need to understand whether we can create an occasionally connected web application using ASP.NET 3.5 - what technologies/features would we need to use? Is the MS Sync Framework the answer to the problem - is it a viable option to use with a web application?
Is a Windows application the right approach instead - where the business logic would run at the client itself, using local SQL Server Express editions, with data then synced up with the enterprise SQL Server at the server end once a connection is established, using replication and/or the MS Sync Framework? In that case, is there a need to use WCF?
Do Silverlight applications help in this context of building partially connected web apps?
I'd really appreciate pointers on how to go about this task of creating partially connected .NET apps (not mobile apps).
It looks to me as if you'll need to store your client data locally when not connected.
If you use WCF you can choose the transport according to connectivity without affecting your main code, e.g. TCP/IP for LAN, HTTP for the Internet, and MSMQ for storing up data while disconnected.
If data for transfer is queued up using MSMQ, then as soon as a connection is re-established the data will be passed on to your main server.
If you write your WCF/communications code to run as a service (assuming Windows functionality here), then it is up to you whether to retain the ASP.NET code or write a new Windows app.
EDIT:
Set up MSMQ at both ends; it's part of Windows setup and can be installed on a client machine just the same as IIS is - it's on the installation disk but not installed by default.
I wouldn't use it to fetch web pages; have those available on the local machine. Instead, use it to queue up data that MUST get back to the server. Your data access layer should be separated from your GUI layer anyway - I assume you're using the MVC pattern or similar.
I don't know what your application is required to do, but here is an example I've worked on.
A mobile user who visits clients. He has a replicated copy of a company product database on his laptop. When he visits client sites he may not be able to connect to his company server, but still wants to place client orders. This he does using his laptop based application and database. Order data is queued up in MSMQ on the laptop.
As soon as he is able to connect to his company server MSMQ automatically sends the order data. The server has queued up MSMQ messages of changes to pricing and stock etc. that took place whilst he was disconnected. These are now received and the local database is updated.
The choice of TCP/IP, HTTP or MSMQ happens seamlessly as far as the main application is concerned; the WCF code copes with the choice.
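A minimal sketch of the queued WCF piece (the queue path, contract and Order type are all made up). The MSMQ transport requires one-way operations, and the message simply waits in the local queue until the connection returns:

    using System.Runtime.Serialization;
    using System.ServiceModel;

    [DataContract]
    public class Order
    {
        [DataMember] public int ProductId { get; set; }
        [DataMember] public int Quantity { get; set; }
    }

    [ServiceContract]
    public interface IOrderSubmission
    {
        [OperationContract(IsOneWay = true)] // required for the MSMQ transport
        void SubmitOrder(Order order);
    }

    public static class OrderSender
    {
        public static void Send(Order order)
        {
            var binding = new NetMsmqBinding(NetMsmqSecurityMode.None);
            var factory = new ChannelFactory<IOrderSubmission>(binding,
                new EndpointAddress("net.msmq://server/private/orders"));
            IOrderSubmission channel = factory.CreateChannel();
            channel.SubmitOrder(order); // queued locally if offline, delivered later
            factory.Close();
        }
    }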
From what I know, you have two options:
Use Gears (abandoned) or Web Storage to store and sync local data, combined with heavily JavaScripted web pages that can detect loss of connection and work against the local data store.
Use the Sync Framework with a rich client (WinForms, WPF, or possibly Silverlight OOB if it gets supported). The Sync Framework does not require a local installation of a database server; instead it can use SQL Server Compact, which is simply a local file.
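For the second option, a rough sketch of what a Sync Framework 2.x session looks like, assuming both databases have already been provisioned for a sync scope named "OrdersScope"; the scope name, file and connection strings are placeholders:

    using System.Data.SqlClient;
    using System.Data.SqlServerCe;
    using Microsoft.Synchronization;
    using Microsoft.Synchronization.Data.SqlServer;
    using Microsoft.Synchronization.Data.SqlServerCe;

    public static class OfflineSync
    {
        public static void Run()
        {
            using (var localDb = new SqlCeConnection(@"Data Source=orders.sdf"))
            using (var serverDb = new SqlConnection(
                "Server=central;Database=Orders;Integrated Security=true"))
            {
                // Local SQL Server Compact file syncing against the central server.
                var orchestrator = new SyncOrchestrator
                {
                    LocalProvider  = new SqlCeSyncProvider("OrdersScope", localDb),
                    RemoteProvider = new SqlSyncProvider("OrdersScope", serverDb),
                    Direction      = SyncDirectionOrder.UploadAndDownload
                };
                orchestrator.Synchronize();
            }
        }
    }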
At this stage, using the Sync Framework with a rich client seems to be the better option. Thanks a lot, guys, for taking the time to answer my queries. I will let you know the technologies used after I manage to deploy the app!

Shaky connectivity - favor web or desktop app?

I'm a desktop application developer who is temporarily working in the web. I'm working with a client that wants me to build an app for use by locations all over the state; however, these locations have very shaky connectivity.
They really want a centralized web app and are suggesting I build a "lean" web app. I don't know what a "lean web app" means: small HTTP requests but lots of them, or large HTTP requests but few of them? I tend to favor chunky over chatty, but I've never had to worry about connectivity before.
Do I suggest a desktop app that replicates data when connectivity exists? If not, what's the best way to approach a web app when connectivity is shaky?
EDIT:
I must qualify my question with further information. Assuming the web option, they've disallowed the use of browser runtime technologies and anything that requires installation. Thus, Silverlight is out, Flash is out, Gears is out - only ASP.NET and JavaScript are available to me. Having stated this, part of my question was whether to use a desktop app; I suppose that can be extended to "thicker technologies".
EDIT #2: Network is homogeneous - every node is Windows. This won't be changing.
You should get a definition of what the client means by "lean" so that you don't have confusion surrounding it. Maybe present them with several options of lean that you think they might mean. One thing I've found is it's no good at all to guess about client requirements. Just get clarification before you waste a bunch of time.
Shaky connectivity definitely favors a desktop application. Web apps are great for users that have always-on Internet connections, and that might be using a variety of different browsers and operating systems.
Your client probably has locations that are all running Windows, so a desktop application is an appropriate choice. One real advantage of web applications is that they make deployment easy to deal with, but auto-update technologies like ClickOnce make the deployment and update of desktop applications almost as easy.
And not to knock Google Gears, but it's relatively new and would have to be considered more risky than a tried-and-true desktop application.
Update: and if you're limited to just JavaScript on the client side, you definitely do not want to make this a web app. Your application simply will not be available whenever the Internet connection is down. There are ways to save things locally in JavaScript using cookies and user stores and whatnot, but you just don't want to go there.
If connectivity is so bad, I would suggest that you write a WinForm app that downloads information, locally edits it and then uploads it. This way, if your connection goes down, all you have to do is retry until it works.
They seem to be suggesting a plain vanilla web app that doesn't use AJAX or rely on .NET postbacks or do anything that might make it break down horribly if your connection goes away for a bit. Instead, it should be designed so that you can hit Refresh until it works. In other words, they seem to want the closest thing to a WinForm app, only uglier.
You may consider using a framework like Google Gears to help provide functionality during network down time. This allows users to connect to the web page once (with a functioning connection) and then be able to use the web app from then on, even without a connection.
When the network is restored, the framework can sync changes back with the central database.
There is even a tutorial for using Google Gears with the .NET Framework.
Gears with other languages
You mention that connectivity is shaky at these locations, but that the app needs to be centralized. One thing you might consider is using multiple decentralized read database servers and a single centralized write server. MySQL makes this possible and affordable if your app is small.
Have the main database server at the datacenter/central office. Put up small web/db servers at each location, with your app installed. You can even run them off a user computer if the remote location is not too big. Make the local database servers connect to the centralized database server as replication slaves. As changes come in to the centralized database, the slave servers will pull down the data and make it available locally. When the connection is unavailable, your app data is still at least available, if not up to date. When the connection is available, the database handles replicating all relevant data down.
Now all you have to do is make your app use two separate database connections: reads go to the local database, writes go to the central database.
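A minimal sketch of that read/write split, using MySQL Connector/NET; the connection strings, table and columns are made up:

    using MySql.Data.MySqlClient; // MySQL Connector/NET

    public class RoutedData
    {
        // Reads hit the local replication slave; writes go to the central master.
        private const string LocalRead =
            "Server=localhost;Database=app;Uid=app;Pwd=secret;";
        private const string CentralWrite =
            "Server=central.example.com;Database=app;Uid=app;Pwd=secret;";

        public object ReadStatus(int id)
        {
            using (var conn = new MySqlConnection(LocalRead))
            using (var cmd = new MySqlCommand(
                "SELECT status FROM orders WHERE id = @id", conn))
            {
                cmd.Parameters.AddWithValue("@id", id);
                conn.Open();
                return cmd.ExecuteScalar();
            }
        }

        public void WriteStatus(int id, string status)
        {
            using (var conn = new MySqlConnection(CentralWrite))
            using (var cmd = new MySqlCommand(
                "UPDATE orders SET status = @s WHERE id = @id", conn))
            {
                cmd.Parameters.AddWithValue("@s", status);
                cmd.Parameters.AddWithValue("@id", id);
                conn.Open();
                cmd.ExecuteNonQuery();
            }
        }
    }

Keep in mind that writes take a moment to replicate back down, so a read immediately after a write may be slightly stale; that trade-off is inherent to this design.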
