I'm a desktop application developer who is temporarily doing web work. I'm working with a client that wants me to build an app for use by locations all over the state; however, these locations have very shaky connectivity.
They really want a centralized web app and are suggesting I build a "lean" web app. I don't know what a "lean web app" means: lots of small HTTP requests, or a few large ones? I tend to favor chunky over chatty... but I've never had to worry about connectivity before.
Do I suggest a desktop app that replicates data when connectivity exists? If not, what's the best way to approach a web app when connectivity is shaky?
EDIT:
I must qualify my question with further information. Assuming the web option, they've disallowed the use of browser runtime technologies and anything that requires installation. Thus, Silverlight is out, Flash is out, Gears is out - only ASP.NET and JavaScript are available to me. Having stated this, part of my question was whether to use a desktop app; I suppose that can be extended to "thicker technologies".
EDIT #2: Network is homogeneous - every node is Windows. This won't be changing.
You should get a definition of what the client means by "lean" so that you don't have confusion surrounding it. Maybe present them with several options of lean that you think they might mean. One thing I've found is it's no good at all to guess about client requirements. Just get clarification before you waste a bunch of time.
Shaky connectivity definitely favors a desktop application. Web apps are great for users that have always-on Internet connections, and that might be using a variety of different browsers and operating systems.
Your client probably has locations that are all using Windows, so a desktop application is an appropriate choice. The main remaining advantage of web applications is that they make deployment easy, but auto-update technologies like ClickOnce make deploying and updating desktop applications almost as easy.
And not to knock Google Gears, but it's relatively new and would have to be considered more risky than a tried-and-true desktop application.
Update: and if you're limited to just JavaScript on the client side, you definitely do not want to make this a web app. Your application simply will not be available whenever the Internet connection is down. There are ways to save data locally in JavaScript using cookies, user stores and whatnot, but you just don't want to do this.
If connectivity is so bad, I would suggest that you write a WinForms app that downloads information, edits it locally and then uploads it. This way, if your connection goes down, all you have to do is retry until it works.
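As a rough sketch of that retry-until-it-works idea (the URL and payload handling here are made up, not anything from your system):

```csharp
using System;
using System.Net;
using System.Threading;

// Hypothetical sketch: keep retrying the upload until the flaky link comes back.
static class Uploader
{
    public static void UploadWithRetry(string url, byte[] payload)
    {
        while (true)
        {
            try
            {
                using (var client = new WebClient())
                {
                    client.UploadData(url, payload);   // push the locally edited data
                    return;                            // success - stop retrying
                }
            }
            catch (WebException)
            {
                // Link is down or the server is unreachable; wait and try again.
                Thread.Sleep(TimeSpan.FromSeconds(30));
            }
        }
    }
}
```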
They seem to be suggesting a plain vanilla web app that doesn't use AJAX or rely on .NET postbacks or do anything that might make it break down horribly if your connection goes away for a bit. Instead, it should be designed so that you can hit Refresh until it works. In other words, they seem to want the closest thing to a WinForm app, only uglier.
You may consider using a framework like Google Gears to help provide functionality during network down time. This allows users to connect to the web page once (with a functioning connection) and then be able to use the web app from then on, even without a connection.
When the network is restored, the framework can sync changes back with the central database.
There is even a tutorial for using Google Gears with the .NET Framework, and documentation on using Gears with other languages.
You mention that connectivity is shaky at these locations, but that the app needs to be centralized. One thing you might consider is using multiple decentralized read database servers and a single centralized write server. MySQL makes this possible and affordable if your app is small.
Have the main database server at the datacenter/central office. Put up small web/db servers at each location, with your app installed. You can even run them off a user computer if the remote location is not too big. Make the local database servers connect to the centralized database server as replication slaves. As changes come in to the centralized database, the slave servers will pull down the data and make it available locally. When the connection is unavailable, your app data is still at least available, if not up to date. When the connection is available, the database handles replicating all relevant data down.
Now all you have to do is make your app use two separate database handles: one for reads against the local database, and one for writes against the central database.
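Here's a minimal sketch of the two-handle idea, assuming MySQL Connector/NET and an invented orders table; the connection strings are placeholders:

```csharp
using System.Data;
using MySql.Data.MySqlClient;   // MySQL Connector/NET

// Hypothetical connection strings: reads hit the local replication slave,
// writes go to the central master and replicate back out to the slaves.
static class Db
{
    const string LocalRead    = "Server=localhost;Database=app;Uid=app;Pwd=secret;";
    const string CentralWrite = "Server=central.example.com;Database=app;Uid=app;Pwd=secret;";

    public static DataTable GetOrders()
    {
        using (var conn = new MySqlConnection(LocalRead))
        using (var cmd = new MySqlCommand("SELECT * FROM orders", conn))
        {
            var table = new DataTable();
            new MySqlDataAdapter(cmd).Fill(table);   // read from the local slave
            return table;
        }
    }

    public static void SaveOrder(int id, string status)
    {
        using (var conn = new MySqlConnection(CentralWrite))
        using (var cmd = new MySqlCommand("UPDATE orders SET status=@s WHERE id=@id", conn))
        {
            cmd.Parameters.AddWithValue("@s", status);
            cmd.Parameters.AddWithValue("@id", id);
            conn.Open();
            cmd.ExecuteNonQuery();                   // write to the central master
        }
    }
}
```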
Related
I'm looking for a way to establish simple communication between a C# web application and the operating system.
Since I'm working in Silverlight, I have everything I need to create files in any folder on the C:\ drive. The problem is that we're going to migrate from Silverlight to HTML5 / C#.
So I need a way to create files from any browser on any OS: Windows, Mac, Linux...
I thought about using Microsoft ActiveX, but that's not cross-platform.
I'm simply looking for a technology/plugin/software or anything that would allow me to do that; the less client interaction required, the better.
I think your need is in conflict with any common sense about security. If there was a simple way to create any file on any computer that loads your web app, just imagine how quickly all sorts of malware would spread.
But going back to your question - I think it will not be simple (by the way, was it really simple in Silverlight?). What I can imagine is having some kind of service running on the client PC (the user would have to install it, or it could be corporate policy if your web app is targeted at corporate environments). The service would listen on some TCP port, and your web app could send requests to that port asking it to create a particular file with particular content. All the security concerns would then be implemented in that service so that it doesn't get abused by hostile web apps.
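A very rough sketch of what such a service could look like, assuming an HttpListener on an arbitrary local port; all path whitelisting, authentication and cross-origin handling are omitted and would be essential in practice:

```csharp
using System.IO;
using System.Net;
using System.Text;

// Hypothetical local "file agent": the web page calls http://localhost:9123/write?path=...
// and the service, installed on the client PC, writes the file on its behalf.
class FileAgent
{
    static void Main()
    {
        var listener = new HttpListener();
        listener.Prefixes.Add("http://localhost:9123/");
        listener.Start();

        while (true)
        {
            HttpListenerContext ctx = listener.GetContext();
            string path = ctx.Request.QueryString["path"];                  // target file
            string body = new StreamReader(ctx.Request.InputStream).ReadToEnd();

            // Real code must whitelist directories and authenticate the caller here.
            File.WriteAllText(path, body);

            byte[] ok = Encoding.UTF8.GetBytes("written");
            ctx.Response.OutputStream.Write(ok, 0, ok.Length);
            ctx.Response.Close();
        }
    }
}
```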
I'm working on a Random Moment Sampling desktop app. I haven't worked with Windows Forms in a long time, and I have the following questions.
I need to query data from Oracle 11g. If I remember right, before my users can start using the client application they need to install the Oracle client. Am I right, or has this changed?
If this is a problem, I can use web services to retrieve the data. If someone has recommendations, I'm open to alternatives; I'll have approximately 3000 users and I'm looking for the best option.
The application will run in the background, querying the database every minute. It will look for samples, and the moment it finds one, a window comes up blocking the computer until the user fills in the sample.
Is a Windows Forms application the best option, or should I use a Windows Service? I've read a few threads, but I'm thinking about the installation process.
I currently have time, so I can try a few ideas.
Yes, the Oracle software needs to be installed. There is an "instant client" package that is a little more lightweight than the normal client and still allows connectivity.
Whether to use a service or not depends on the functionality of your system and how extensible you want it. You mentioned you will have roughly 3000 users querying the data. If they are querying the same data, it may result in more than one user responding to the same data; I don't know if this is what is desired.
Edit: to tie things together a bit, if the Oracle software is a concern: if you do create a service that serves up your data, the machine where the service runs is the only one that will require the Oracle client software.
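To make that concrete, here's a rough sketch of the client side under that split, with an invented service URL; only the machine hosting the service would need the Oracle client:

```csharp
using System.Net;
using System.Windows.Forms;

// Hypothetical client side: poll the central service every minute and pop a
// modal window when a sample comes back; only the service host talks to Oracle.
public class SamplePoller
{
    private readonly Timer timer = new Timer { Interval = 60000 };  // 1 minute

    public void Start()
    {
        timer.Tick += delegate { CheckForSample(); };
        timer.Start();
    }

    private void CheckForSample()
    {
        using (var client = new WebClient())
        {
            // The URL and response format are assumptions; the real service would
            // run the Oracle query and return an empty body when there is no sample.
            string sample = client.DownloadString("http://server/samples/pending?user=me");
            if (!string.IsNullOrEmpty(sample))
            {
                // Stand-in for the real data-entry form shown modally to the user.
                MessageBox.Show(sample, "Random Moment Sample");
            }
        }
    }
}
```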
I want to build some sort of interface that will monitor our real time routing/switching system. I would like to give a lot of visual feedback to be able to monitor its status visually. Our system and clients are not co-located so they would need to connect via TCP/IP.
I would like to be able to service any number of monitoring clients (although this will probably only ever be about 4-6 clients). I thought of using Silverlight, but there appear to be one or two tricks involved in getting Silverlight to connect back to an application running on a different port.
I have also thought of using HTML5 canvas and WebSockets. Another alternative is to just create the clients using normal Windows Forms and perhaps WPF. But this means that to monitor the application, the client will have to be downloaded first. I would prefer something as easily accessible as a web app.
What are some of the more common application stacks to achieve this? What should I watch out for?
EDIT:
Just to add: This will be an internal tool only. But we have offices in a couple of locations.
Any choice in this direction is subjective and arguable; surely somebody could suggest almost any web framework or language...
Because of your .NET and C# tags, however, I would consider ASP.NET MVC 3 - basically a web-based, plugin-less (no Silverlight) HTML5 solution.
Consider that Stack Overflow is built the same way (MVC, ASP.NET, SQL Server...) and, as we all know, it performs extremely well.
How you grab the underlying events from TCP - that is, how you capture and provide the data - is a separate concern from the front end. I would probably write a Windows Service if the traffic is high and you want to capture and store data regardless of whether any client is connected.
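For example, a bare-bones capture service might look like this; the port and message format are assumptions, and storage is reduced to a stub:

```csharp
using System.IO;
using System.Net;
using System.Net.Sockets;
using System.ServiceProcess;
using System.Threading;

// Hypothetical Windows Service: listens for status messages from the routing
// system and stores them whether or not any monitoring client is connected.
public class RoutingCaptureService : ServiceBase
{
    private TcpListener listener;
    private Thread worker;

    protected override void OnStart(string[] args)
    {
        listener = new TcpListener(IPAddress.Any, 9500);   // port is an assumption
        listener.Start();
        worker = new Thread(Listen) { IsBackground = true };
        worker.Start();
    }

    private void Listen()
    {
        while (true)
        {
            using (TcpClient client = listener.AcceptTcpClient())
            using (var reader = new StreamReader(client.GetStream()))
            {
                string line;
                while ((line = reader.ReadLine()) != null)
                    SaveStatus(line);   // e.g. insert into a table the web front end reads
            }
        }
    }

    private void SaveStatus(string message) { /* persist to the database */ }

    protected override void OnStop()
    {
        listener.Stop();
    }
}
```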
There are also plenty of real-time charting controls out there for MVC: MS Chart Controls, DevExpress, ExtJS-integrated ones...
"real time" and Browser is bothering me.
I would indeed go WPF or WinForms. Using ClickOnce deployment you can make this painless for the user, and you can roll out new versions just by redeploying them and having the user restart the application.
In my company this works really well and we have no problems whatsoever. The only issue is that the app.config is somewhat hard to find and keep current/valid across redeploys, but in your case this won't change per client (or so I guess).
I agree with @Davide - I would go for a web service that exposes all routing/switching data in real time. You will have a web application, and on the client side you will have jQuery/AJAX fetching real-time data from the web service.
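On the ASP.NET side that can be as simple as a script-enabled web service the page polls with $.ajax; the names and sample data below are invented:

```csharp
using System.Web.Services;
using System.Web.Script.Services;

// Hypothetical ASMX service: returns the latest routing status as JSON so the
// page can poll it from jQuery (e.g. $.ajax to RoutingStatus.asmx/GetLatest).
[WebService(Namespace = "http://example.com/monitoring/")]
[ScriptService]
public class RoutingStatus : WebService
{
    [WebMethod]
    [ScriptMethod(ResponseFormat = ResponseFormat.Json)]
    public string[] GetLatest()
    {
        // In the real service this would read whatever the capture process stored.
        return new[] { "switch-01: OK", "switch-02: DEGRADED" };
    }
}
```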
I've seen cool demos of WebORB doing something similar to what you want: http://www.themidnightcoders.com/
If you are starting from scratch, it would be good to check out WCF (Windows Communication Foundation). It's great because it can expose your functionality in many ways, using nothing more than modifying a config file.
If you want a Windows client app, you can host it in a Windows Service, or simply include it as a side assembly. For web apps, you can choose between various formats (JSON, XML), channels (HTTP, TCP) and protocols (SOAP, ODP).
If I got it right, there will be a server-side application which will collect information from the devices and expose it to clients as a service. In that case, a WCF application might be hosted in a Windows Service or IIS on a server machine and expose the data through one or more endpoints (HTTP, TCP).
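A minimal sketch of that shape: one WCF contract hosted over both HTTP and TCP (the contract, addresses and data are invented here; in practice you would usually declare the endpoints in config rather than code):

```csharp
using System;
using System.ServiceModel;

[ServiceContract]
public interface IRoutingMonitor
{
    [OperationContract]
    string[] GetCurrentStatus();   // what the monitoring clients display
}

public class RoutingMonitor : IRoutingMonitor
{
    public string[] GetCurrentStatus()
    {
        return new[] { "switch-01: OK" };   // placeholder data
    }
}

class Program
{
    static void Main()
    {
        // Host the same contract over two channels; addresses are assumptions.
        var host = new ServiceHost(typeof(RoutingMonitor));
        host.AddServiceEndpoint(typeof(IRoutingMonitor),
            new BasicHttpBinding(), "http://localhost:8080/monitor");
        host.AddServiceEndpoint(typeof(IRoutingMonitor),
            new NetTcpBinding(), "net.tcp://localhost:8081/monitor");
        host.Open();

        Console.WriteLine("Service running. Press Enter to stop.");
        Console.ReadLine();
        host.Close();
    }
}
```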
I am not aware of problems in connecting a Silverlight app to a service, but I would rather go for an HTML5/JavaScript combo instead, for easier deployment and compatibility with a wider range of devices (no plugins needed). ASP.NET MVC should be the best choice for the web app.
We have a multi user product prototype in WPF which works fine as a prototype. Now we want to build the complete product.
In our product we have scenarios where 2-3 users might have to use the same data. Say one user is editing and the other user is viewing the continuous edits. Also, whenever a user changes a common item, updates should go to all the other users on that same item, and they need to refresh their information. And this should happen without continuous polling.
Is there any advantage to going towards web-based product development, i.e. Silverlight?
There's absolutely no architectural advantage to Silverlight over WPF in this situation. Your main advantages for Silverlight would be 1) smaller deployment, 2) cross-platform reach, and 3) a more integrated in-browser experience.
From the developer's perspective, however, you might actually find WPF better for developing this kind of collaborative application because you have access to a wider array of networking options. Silverlight has limits on what TCP/UDP ports you can access and has no built-in peer to peer networking capabilities like WCF does on the .NET Framework.
In any case, a Silverlight application for all intents and purposes is a client application, not really any more of a web application than WPF except when it comes to deployment.
Peer to Peer (p2p) vs. Client / Server
Your question really comes down to how you want to design your data storage.
P2P: Do you want each copy of the application to keep a full copy of the data and to exchange updates to the data with the other clients? This works well in a LAN environment but gets challenging over the broad Internet. BitTorrent is a good example of an application that does this.
P2P generally has higher performance and fewer costs but is very tricky to pull off. You'll need a p2p network transport like PeerChannel, and most likely a synchronization engine like Sync Framework, and some form of structured local data store like SQLite.
Client / Server: Do you want one master computer (e.g. server) to host all the data and have each client load / update data to the server? This works well in a LAN or Internet environment. Web browsers / web servers are a good example of this.
Client / Server has the overhead that a dedicated, always-on server computer has to be involved, and you have to program two applications: the client side and the server side. Silverlight or WPF work well for the client piece of this design.
If you're up for a challenge, developing a p2p application can be a lot of fun because there are many obstacles to overcome and the end result is generally more efficient. This will essentially require you use WPF to get the libraries/tools support you need. If you need to get something working quickly, you'll find that the tools you have support client / server much better because this is how most applications are written. Here, WPF and Silverlight will both work, but they are only part of the solution -- you'll need a server technology too like SQL Server or ASP.NET or Azure or ...
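If you go client/server on .NET, the "update everyone without polling" requirement is typically handled with duplex callbacks; here's a rough WCF sketch with invented contract names (you'd host it over something like NetTcpBinding):

```csharp
using System.Collections.Generic;
using System.ServiceModel;

// Hypothetical duplex contract: clients subscribe, and when one of them edits a
// shared item the server calls back into every subscriber - no polling needed.
[ServiceContract(CallbackContract = typeof(IItemChangedCallback))]
public interface IItemService
{
    [OperationContract] void Subscribe();
    [OperationContract] void UpdateItem(string itemId, string newValue);
}

public interface IItemChangedCallback
{
    [OperationContract(IsOneWay = true)]
    void ItemChanged(string itemId, string newValue);
}

[ServiceBehavior(InstanceContextMode = InstanceContextMode.Single,
                 ConcurrencyMode = ConcurrencyMode.Reentrant)]
public class ItemService : IItemService
{
    private readonly List<IItemChangedCallback> clients = new List<IItemChangedCallback>();

    public void Subscribe()
    {
        clients.Add(OperationContext.Current.GetCallbackChannel<IItemChangedCallback>());
    }

    public void UpdateItem(string itemId, string newValue)
    {
        // Persist the change here, then notify every subscriber.
        foreach (var client in clients)
            client.ItemChanged(itemId, newValue);
    }
}
```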
I'm working on a graduation project for one of my university courses, and I need to find some place to run several crawlers I wrote in C#. With no web hosting experience, I'm a bit lost. Is this something that any hosting site allows? Do I need a special host that gives more access to the server? The crawler is a simple app that does its work, then periodically writes information to a remote database.
A web crawler is a simulation of a normal user. It accesses sites like browsers do, getting the HTML code (JavaScript, etc.) returned from the server (so no internal access to server code). That being the case, any site can be crawled.
Be aware of some web crawler ethics guidelines. There are pages you shouldn't index or whose links you shouldn't follow. Web developers also publish files and instructions for crawlers (such as robots.txt) saying what you can index or follow.
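To give an idea of how little is involved, here's a toy fetch in C# against a made-up site, grabbing robots.txt before the page itself:

```csharp
using System;
using System.Net;

// Minimal sketch: fetch robots.txt first, then download a page like a browser would.
class CrawlerSketch
{
    static void Main()
    {
        var site = new Uri("http://example.com/");              // hypothetical target
        using (var client = new WebClient())
        {
            // A site without a robots.txt will throw a WebException here.
            string robots = client.DownloadString(new Uri(site, "/robots.txt"));
            Console.WriteLine("robots.txt says:\n" + robots);   // honour Disallow rules

            string html = client.DownloadString(site);          // same HTML a browser gets
            Console.WriteLine("Fetched {0} characters", html.Length);
            // ...parse links out of html and queue them, then write results to the DB
        }
    }
}
```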
If you can't run it off your desktop for some reason, you'll need a host that lets you execute arbitrary C# code. Most cheap web servers don't do this due to the potential security implications, since there will be several other people running on the same server.
This means you'll need to be on a server where you have your own OS. Either a VPS - Virtual Private Server, where virtualization is used to give you your own OS but share the hardware - or your own dedicated server, where you have both the hardware and software to yourself.
Note that if you're running on a server that's shared in any way, you'll need to make sure to throttle yourself so as to not cause problems for your neighbors; your primary issue will be not using too much CPU or bandwidth. This isn't just for politeness - most web hosts will suspend your hosting if you're causing problems on their network, such as denying the other users of the hardware you're on resources by consuming them all yourself. You can usually burst higher usage levels, but they'll cut you off if you sustain them for a significant period of time.
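The politeness part can be as crude as a fixed pause between requests; a sketch of the idea (the delay length is arbitrary):

```csharp
using System.Collections.Generic;
using System.Net;
using System.Threading;

// Simple throttle: fetch pages one at a time with a fixed pause so the crawler
// never saturates the host's CPU or bandwidth allowance.
static class PoliteCrawler
{
    public static void CrawlPolitely(IEnumerable<string> urls)
    {
        using (var client = new WebClient())
        {
            foreach (string url in urls)
            {
                string html = client.DownloadString(url);
                // ...process html...
                Thread.Sleep(2000);   // 2-second gap between requests (tune as needed)
            }
        }
    }
}
```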
This doesn't seem to have anything to do with web hosting. You just need a machine with an internet connection and a database server.
I'd check with your university if I were you. At least in my time, a lot was possible to arrange in-house when it came to graduation projects.
Failing that, you could look into a simple VPS (Virtual Private Server) account. Unless you are sure your app runs under Mono, you will need a Windows one. The resource limits are usually a lot lower than you'd get from a dedicated server, but they're relatively affordable. Some will offer a MS SQL Server database you can use next to the VPS account (on another machine). Installing SQL Server on the VPS itself can be a problem license wise.
Make sure you check the terms of usage before you open an account, as well as the (virtual) system specs though. Also check if there is some kind of minimum contract period. Sometimes this can be longer than a single month, especially if there is no setup fee.
If at all possible, find a host that's geographically close to you. A server on the other side of the world can get a little annoying to access remotely using Remote Desktop.
80legs lets you use their crawlers to process millions of web pages with your own program.
The rates are:
$2.00 per million pages
$0.03 per CPU-hour
They claim to crawl 2 billion web pages a day.
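At those rates, for example, a 10-million-page job would run roughly $20 in page fees plus $0.03 for each CPU-hour your processing uses, so compute-light crawls stay cheap.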
You will need a VPS (virtual private server) or a full-on dedicated server. Crawlers are nothing more than applications that "crawl" the internet. While you could set up a web site to be a crawler, it is not practical, because the web page would have to be accessed for your crawler to work. You will have to read the ToS (terms of service) for the host to see what the terms of usage are. Some of the lower-priced hosts will cut your connection with a reason of "negatively impacting the network" if you try to use too much bandwidth, even though they have given you plenty to use.
A VPS runs around $30-80 for a Linux server and $60+ for a Windows server.
Dedicated servers run $100+ for both Linux and Windows.
You don't need any web hosting to run your spider. Just ask for a PC with an internet connection that can act as a dedicated server, configure the database, and run the crawler from there.