Is Silverlight scalable?

Is Silverlight more scalable than HTML? I found out that Silverlight code runs on the client except when it has to update or fetch data from the server. Will my application be more responsive if I develop it in Silverlight? I am not worried about end users installing Silverlight on the clients; I am in a position to install Silverlight on the clients myself.
I just need to know whether developing a Silverlight application will make my application more scalable and/or responsive.

Silverlight applications are, for all intents and purposes, "fat client" applications delivered over the web. Their code is executed on the local machine, and communication with a data store is conducted over WCF web services, which are usually wrapped by RIA Services.
Silverlight applications are quite responsive once loaded. Building a well-performing UI in Silverlight may be a little more challenging than it would be in WPF, but not by much.
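For context, a minimal sketch of what a typical data call looks like from Silverlight; the proxy and control names (InspectionServiceClient, GetParts, partsList) are hypothetical, since Silverlight's generated WCF proxies only expose the event-based asynchronous pattern:

    // Minimal sketch: all Silverlight service calls are asynchronous, so the
    // UI thread is never blocked waiting on the network. Names are made up.
    var client = new InspectionServiceClient();
    client.GetPartsCompleted += (sender, e) =>
    {
        if (e.Error == null)
            partsList.ItemsSource = e.Result; // callback arrives on the UI thread
    };
    client.GetPartsAsync(); // returns immediately; the UI stays responsive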

The question doesn't make sense. HTML by itself doesn't do anything. There is no interactivity, nothing that can be responsive.
Of course, web apps typically rely on server-side logic (which requires a network round-trip, causing a delay) and JavaScript (which runs locally, and so is pretty snappy)... But HTML itself is just a language for describing documents. It doesn't do anything, and it isn't "responsive" or "scalable".
Ultimately, it's much the same either way: it won't make a noticeable difference in responsiveness whether you implement your logic in JavaScript on an HTML page or in Silverlight. And when you need to communicate with the server, it doesn't matter whether the browser or the Silverlight plugin makes the request; either way it's a network round-trip.

Scalable in terms of what? Bandwidth? Server CPU?
In theory, moving processing to the client helps server CPU, but your data requests will still need processing. Also, if your Silverlight app is bigger than the web page(s) it would replace, you may end up using more bandwidth (though you could serve it from a CDN).
In principle, though, if many pages are hit during one session, it is fair to think it could be more scalable.
Other issues such as market reach come into play, of course, but a client-side app is an approach I have used to help with speed and costs.


How to decide between developing a web application and a desktop application [closed]

I am a software engineer intern for a manufacturing company, and they want me to develop an application for the company. They are leaning towards a web application; however, I wish to know whether a desktop application would better fit the job. Therefore, I have been googling and looking through Stack Overflow to find out the pros and cons of desktop applications versus web applications. The following is essentially what I found:
Quick disclaimer: I have a background in C# and WPF, so I am a bit biased, as it would be easier for me to develop a desktop application. I have no web experience, so there is nothing I can really say in that area, which is why I wish to know whether this application is better suited as a web application or a desktop application. I am absolutely open to learning PHP and web development to expand my abilities. I have started (a bit) looking into developing the web application using PHP 7 with the Laravel framework.
Pros of desktop applications:
Typically faster than web applications (assuming the web application will perform complex queries, calculations, etc., and not just display markup)
Development of GUIs is faster
More secure, as desktop applications are private by default.
There are more available controls, allowing for a richer, more interactive user experience (or at least, these controls are easier/faster to implement in desktop applications than in a web application)
Can take advantage of user hardware.
Cons of desktop applications:
Use/deployment is limited by system (However, this should not be a problem because all our systems are Windows based.)
Updates and installation must be rolled out manually.
If every client desktop gets its own database connection, scaling suffers as the database comes under heavy load. (However, this probably will not be the case, since we won't have more than 500 users.)
Pros of web applications:
Cross-platform (no need to deal with different operating systems), so it is easily portable
Development is quick and easy
Deployment is easy, as updates are automatic and server-side.
Large community support and available frameworks.
Cons of web applications:
Larger overhead (applications tend to be slower due to the need to transmit data across the internet).
Need to deal with different browsers. JavaScript most likely needs to be tweaked to be perfect on one browser (Chrome, Firefox, etc.) and will not be perfect on the others. (However, this is not that big of a deal.)
Security is more of a concern, since the application and its data are exposed rather than private by default.
Please let me know if any of the above is outdated (most of the posts I found were from 2011 or earlier) or wrong. Also, let me know if there are any other pros/cons to consider.
Moving on to the application description....
Background on the company: We build and process dozens of different parts every day. For each type of part, after X amount of the part is processed, a sample needs to be taken for inspection. So for example, part Y has 3 samples taken every 120 minutes to be inspected (Because the machine typically finishes processing X amounts in 120 minutes). The inspection results (measurement data) are then stored in the database (MySQL database).
General summary of the application's purpose:
View the schematics of all the parts we design (We store all the schematics as pdfs on a network drive, so this is simply just pulling up the specific pdf requested from the drive and displaying it onto the application).
View/update the status of all the machines in the company (What parts are they working on, are they online/offline, etc). A certain user (Inspector) will use this application to update machine status/information. Then another user (Operator) will use the application to view the statuses.
Monitor part inspections. For every machine and part being processed, there will be a timer to let an Operator user know when a certain part needs to be submitted for inspection. Upon part submission, an Inspector will receive a notification to inspect the part, and after they tell the application that the inspection is complete, the timer will restart to let the Operator know when they next need to submit a part.
The application will calculate statistical data (For example, Cpk values) from the part measurements obtained from inspection results and display the statistical data along with a graph/chart.
I hope I explained all of this clearly enough. Some other things to note: from my understanding, the users will not need remote access; this application will pretty much only be used on the company site. Also, the original reason the company wanted a web application was that operators will be using tablets, and the tablets they acquired were originally Android-based. However, they have since decided to switch to Windows Surface tablets, so WPF applications are now a possibility.
With all of this being said, I am really looking for input on what route people with more experience would recommend. I am still in college so please forgive my lack of knowledge/experience. What else should I be thinking about when deciding between a web application and desktop application?
Here are some of the pages I have seen while pondering this topic:
Advantages of web applications over desktop applications
https://www.quora.com/How-much-different-is-it-to-build-a-web-application-vs-a-desktop-application
https://www.quora.com/What-are-the-advantages-and-disadvantages-of-web-based-application-development-vs-desktop-application-development
There were more Stack Overflow pages, but the ones listed above pretty much cover everything the other pages stated.
EDIT: It seems like the web application is winning so far (not that I mind at all; I am actually excited to develop a web application based on what I am hearing). Is there anyone who would rather do a desktop application? If so, why?
I'm inherently biased against web apps. They're difficult to get right because of browser differences, and they're typically insecure (by accident, though). The platform sucks (JavaScript and the bazillion libraries from random people/orgs), and "everything is a string". I could go on.
However it's undeniably the best platform for reaching a wide, public audience and allowing continual updates.
In a corporate environment those advantages do tend to go away, but not entirely. Updates, for example, can generally be handled by storing all your .exe and DLL files in a shared directory. As you say, you can build a much richer UI more quickly and cheaply on the Windows platform.
With regard to your architecture, something that has worked for me in a similar situation is to have a Windows front end, but keep the guts of the business logic, data access (connection pooling), and processing on a stateless web server (or two), accessed from the UI via web services (protocol of your choice; I prefer SOAP because of WCF and WSDL, but plenty of folks won't).
This allows for centralised data access and a place to put your one-off batch jobs or calculations that can then be shared. It also has the advantage that if you need to do something really intensive, not every client machine has to have that capability.
Your situation seems to fit this model, but without a lot of insider knowledge this is primarily opinion; still, it's possibly one to consider.
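To make the shape of that service layer concrete, here is a minimal sketch of what its WCF contract could look like; the interface, operations, and DTO are all hypothetical names based on the workflow described in the question, not a prescribed design:

    using System.Runtime.Serialization;
    using System.ServiceModel;

    [DataContract]
    public class MachineStatus   // made-up DTO for illustration
    {
        [DataMember] public int MachineId { get; set; }
        [DataMember] public bool Online { get; set; }
        [DataMember] public string CurrentPart { get; set; }
    }

    // Hypothetical contract for the stateless middle tier; the client UI
    // (WPF or web) calls these operations instead of touching the DB directly.
    [ServiceContract]
    public interface IInspectionService
    {
        [OperationContract]
        MachineStatus GetMachineStatus(int machineId);

        [OperationContract]
        void SubmitInspection(int partId, double[] measurements);
    }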
This sounds like monitoring of an assembly or similar manufacturing work process to me.
If I had to build this application, I would first research whether the functionality you want is feasible and easy to develop in the programming language you will use.
For example, if I chose to develop it as a web application, then:
Larger overhead (applications tend to be slower due to the need to transmit data across the internet).
You can run it on an intranet with a well-specced server.
Need to deal with different browsers. JavaScript most likely needs to be tweaked to be perfect on one browser (Chrome, Firefox, etc.) and will not be perfect on the others. (However, this is not that big of a deal.)
Then standardize on one browser across your workplace.
View the schematics of all the parts we design (We store all the schematics as pdfs on a network drive, so this is simply just pulling up the specific pdf requested from the drive and displaying it onto the application).
You can upload the PDFs to the server and view them in the browser using a PDF viewer library such as PDF.js.
View/update the status of all the machines in the company (What parts are they working on, are they online/offline, etc). A certain user (Inspector) will use this application to update machine status/information. Then another user (Operator) will use the application to view the statuses.
Does each machine have an IP address?
If so, can you ping the machine to determine whether it is online, to ease the task?
If not, what is the inspector's schedule for inspecting the machines?
Of course, the inspector can log in to the system and update the machine status manually using the web application.
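If the machines do have IP addresses, a minimal C# sketch of the ping check looks like this (the language is an assumption to match the asker's background; the same idea works from any server-side stack):

    using System.Net.NetworkInformation;

    // Minimal sketch: report a machine as online if it answers a ping
    // within one second. The IP address comes from wherever you register machines.
    static bool IsMachineOnline(string ipAddress)
    {
        using (var ping = new Ping())
        {
            PingReply reply = ping.Send(ipAddress, 1000); // 1000 ms timeout
            return reply.Status == IPStatus.Success;
        }
    }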
Monitor part inspections. For every machine and part being processed, there will be a timer to let an Operator user know when a certain part needs to be submitted for inspection. Upon part submission, an Inspector will receive a notification to inspect the part, and after they tell the application that the inspection is complete, the timer will restart to let the Operator know when they next need to submit a part.
This sounds like a scheduling mechanism to ensure quality. You can build a timer with jQuery and use Ajax to send the operator a notification with specific data about the part that needs to be inspected.
The application will calculate statistical data (For example, Cpk values) from the part measurements obtained from inspection results and display the statistical data along with a graph/chart.
This depends on your statistical formulas; for the display you can use a charting library such as Highcharts.
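For reference, the usual formula is Cpk = min((USL - mean) / (3 * sigma), (mean - LSL) / (3 * sigma)), where LSL/USL are the lower/upper specification limits. A minimal C# sketch (the language and method name are my own choices for illustration):

    using System;
    using System.Collections.Generic;
    using System.Linq;

    // Minimal sketch of a Cpk calculation over a batch of inspection measurements.
    static double Cpk(IReadOnlyList<double> samples, double lsl, double usl)
    {
        double mean = samples.Average();
        // sample standard deviation (n - 1 denominator)
        double sigma = Math.Sqrt(
            samples.Sum(x => (x - mean) * (x - mean)) / (samples.Count - 1));
        return Math.Min((usl - mean) / (3 * sigma), (mean - lsl) / (3 * sigma));
    }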
The second step, after you have made sure your chosen programming language can accomplish the tasks you want, is to design the database structure. A quote from Linus Torvalds:
"Bad programmers worry about the code. Good programmers worry about data structures and their relationships."
Have a nice day, and good luck with the decision; give it some good thought now to avoid development problems in the future.

Client App vs Windows Service vs?

I am currently looking into some re-architecture for a set of applications for our organization. We currently have a set of 10-15 odd stand-alone applications that communicate to each other and provide an intermediate level between client software and hardware.
The problem with the current model is the number of individual apps: they add memory overhead and communication latency, bloat the system, and make it difficult to recover when any one of them crashes.
I am thinking of combining the applications into 1-2 logical units that address some of these issues. The dilemma is how to do this well:
Windows Service
UI Application
Both?
The goal is to have an always-on system that handles all of the client-hardware comms but also has a rich admin-user configuration UI that can talk to all of the individual components of this system and provide config/etc. capabilities. Having a WinForms/WPF application will give admins easy access to the system config and provide real-time feedback (camera feed, etc.), but leaves it open to admins accidentally closing the window. Having a service do all of that work is great, but I am not sure how to provide a rich admin UI that interacts with and reconfigures that service.
Any ideas or links worth reading?
Thanks!
Just thought I'd update my own question for anyone else who might have a similar one.
What I went with is a centralized Windows Service that exposes a number of WCF endpoints for its child components. On top of that sits a UI application that communicates with the Windows Service through the WCF endpoints. To make building and debugging easier, the Windows Service is configured to run as a console application in debug builds.
This solution seems to work great so far!
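For anyone curious, a minimal sketch of that arrangement; the service and contract names are placeholders, and the WCF endpoints are assumed to be declared in app.config:

    using System;
    using System.ServiceModel;
    using System.ServiceProcess;

    [ServiceContract]
    public interface IComms { [OperationContract] string Status(); }

    public class CommsService : IComms   // made-up component service
    {
        public string Status() { return "ok"; }
    }

    public class CommsWindowsService : ServiceBase
    {
        private ServiceHost _host;

        protected override void OnStart(string[] args)
        {
            // Endpoints for CommsService come from app.config.
            _host = new ServiceHost(typeof(CommsService));
            _host.Open();
        }

        protected override void OnStop()
        {
            if (_host != null) _host.Close();
        }

        // Public wrappers so Main can drive the same lifecycle in debug builds.
        public void StartConsole(string[] args) { OnStart(args); }
        public void StopConsole() { OnStop(); }

        static void Main(string[] args)
        {
    #if DEBUG
            var svc = new CommsWindowsService();
            svc.StartConsole(args);
            Console.WriteLine("Running as console. Press Enter to stop.");
            Console.ReadLine();
            svc.StopConsole();
    #else
            ServiceBase.Run(new CommsWindowsService());
    #endif
        }
    }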

running timer from global.asax vs quartz.net

I am developing an ASP.NET site that needs to hit a few social media sites daily for blanket friend/follower data. I have chosen Arvixe business class as my hosting. In the future, if we grow, I'd love to get onto a dedicated server and run a Windows service; however, since that is not in the cards at this point, I need another reliable way of running scheduled tasks.

I am familiar with running a thread timer from App_Code (Global.asax), but app pool recycling will cause problems with the timer. I have never used task scheduling like Quartz but have read a lot about it on Stack Overflow.

I am looking for some advice on how to approach my goal. One big problem with either method is that I will need the crawler threads to sleep for up to an hour regularly due to API call limits. My first thought was to use the DB to record the start and end of each job. When the app pool recycles, I would clear out any parts not completed and only start parts that have no record of running that day. What do the experts here think? Any good links to sample architecture for this type of scheduling?
It doesn't really matter which method you use, whether you roll your own or use Quartz; you are at the mercy of ASP.NET/IIS because that's where you want to host it.
Do you have a spare computer lying around that could just run a scheduled task and upload data to a hosted database? To be honest, it's possibly safer (depending on your use case) to do it that way than to try to run a scheduler in ASP.NET.
Somewhat along the lines of Bryan's post:
Find a spare computer.
Instead of allowing DB access, have it call a web service on your site. This service call should be the initiator of the process you are trying to run. Don't try to put params into it; something like "StartProcess()" should work fine.
As far as going to sleep and resuming later, take a look at Workflow Foundation. There are some nice built-in features for persisting state.
Don't expose your DB to the outside world; instead expose that page or web service and wrap some security around it. WCF has some nice built-in security features for that.
The best part is that when you decide to move off, you can keep your web service and have it called from a Windows service in the same manner.
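A minimal sketch of such an initiator contract, assuming WCF as suggested above (the interface name is made up):

    using System.ServiceModel;

    // Parameterless trigger: the caller just kicks off the job; all state,
    // credentials, and DB access stay on the server side.
    [ServiceContract]
    public interface IJobTrigger
    {
        [OperationContract]
        void StartProcess();
    }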
As long as you use a persistent job store (like a database) and you write and schedule your jobs so that they can handle being killed halfway through, having IIS recycle your process is not that big a deal.
The bigger issue is that IIS shuts your site down if it doesn't get traffic. If you can keep your site up, set the misfire policy appropriately, and have your jobs store whatever state they need to pick up where they left off, you should be able to pull it off.
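A minimal sketch of that setup with the Quartz.NET 2.x-era API, e.g. in Application_Start; the connection string, table prefix, schedule, and job class are all placeholders:

    using System.Collections.Specialized;
    using Quartz;
    using Quartz.Impl;

    public class CrawlJob : IJob   // your job class
    {
        public void Execute(IJobExecutionContext context)
        {
            // hit the social media APIs here; persist progress so a
            // half-finished run can resume after a recycle
        }
    }

    // Persistent AdoJobStore so scheduled work survives an app-pool recycle.
    var props = new NameValueCollection();
    props.Add("quartz.jobStore.type", "Quartz.Impl.AdoJobStore.JobStoreTX, Quartz");
    props.Add("quartz.jobStore.driverDelegateType", "Quartz.Impl.AdoJobStore.SqlServerDelegate, Quartz");
    props.Add("quartz.jobStore.tablePrefix", "QRTZ_");
    props.Add("quartz.jobStore.dataSource", "default");
    props.Add("quartz.dataSource.default.provider", "SqlServer-20");
    props.Add("quartz.dataSource.default.connectionString", "Server=...;Database=quartz;..."); // placeholder

    IScheduler scheduler = new StdSchedulerFactory(props).GetScheduler();
    scheduler.Start();

    IJobDetail job = JobBuilder.Create<CrawlJob>()
        .WithIdentity("crawl", "social").Build();
    ITrigger trigger = TriggerBuilder.Create()
        .WithIdentity("crawl-daily", "social")
        .WithCronSchedule("0 0 3 * * ?", x =>                  // 03:00 every day
            x.WithMisfireHandlingInstructionFireAndProceed())  // run once ASAP after a recycle
        .Build();
    scheduler.ScheduleJob(job, trigger);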
If you are language-agnostic and don't mind writing your "job-activation-script" in your favourite, Linux-supported language...
One solution that has worked very well for me is:
Getting relatively cheap, stable Linux hosting (from reputable companies),
Creating a WCF service on your .NET hosted platform that will contain the logic you want to run regularly (RESTful, SOAP, or XML-RPC... whichever suits you),
Handling the calls through cron jobs on your Linux host, written in your language of choice (I use PHP).
It has worked very well, like I said: no VPS expense, configurable, and externally activated. I have one central place where my jobs are activated, with 99 to 100% uptime (never had any failures).

High availability & scalability for C#

I've got a C# service that currently runs single-instance on a PC. I'd like to split this component so that it runs on multiple PCs. Each PC should be assigned a certain part of the work. If one PC fails, its work should be moved to a backup machine.
Data synchronization can be done by the DB, so that should not be much of an issue. My current idea is to use some kind of load balancer that splits and sends the incoming requests to the array of PCs and makes sure the work is actually processed.
How would I implement such a functionality? I'm not sure if I'm asking the right question. If my understanding of how this goal should be achieved is wrong, please give me a hint.
Edit:
I wonder whether the idea given above (a load balancer splits work packages across PCs and checks for results) is feasible at all. If there is some kind of already-implemented solution to this seemingly common problem, I'd love to use that solution.
Availability is a critical requirement.
I'd recommend looking at a Pull model of load-sharing, rather than a Push model. When pushing work, the coordinating server(s)/load-balancer must be aware of all the servers that are currently running in your system so that it knows where to forward requests; this must either be set in config or dynamically set (such as in the Publisher-Subscriber model), then constantly checked to detect if any servers have gone offline. Whilst it's entirely feasible, it can complicate the scaling-out of your application.
With a Pull architecture, you have a central work queue (hosted in MSMQ, SQL Server Service Broker, or similar) and each processing service pulls work off that queue. Expose a WCF service to accept external requests and place work onto the queue, safe in the knowledge that some server will do the work, even though you don't know exactly which one. This has the added benefits that each server monitors its own workload and picks up work as and when it is ready, and you can easily add or remove servers to/from this model without any change in config.
This architecture is supported by NServiceBus and the communication between Windows Azure Web & Worker roles.
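A minimal sketch of the pull loop each worker would run, using MSMQ via System.Messaging; the queue path and the WorkItem body type are assumptions:

    using System;
    using System.Messaging;

    public class WorkItem { public int Id; public string Payload; }   // made-up message body

    class Worker
    {
        static void Main()
        {
            var queue = new MessageQueue(@".\private$\workitems")
            {
                Formatter = new XmlMessageFormatter(new[] { typeof(WorkItem) })
            };

            while (true)
            {
                // Receive blocks until a message is available. Every server runs
                // this same loop, so whichever is free takes the next item.
                Message message = queue.Receive();
                var item = (WorkItem)message.Body;
                Console.WriteLine("Processing work item {0}", item.Id);
                // ... do the actual work here ...
            }
        }
    }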
From what you said, each PC will require a full copy of your service:
Each PC should be assigned a certain part of the work. If one PC fails, its work should be moved to a backup machine.
Otherwise you won't be able to move its work to another PC.
I would be tempted to have a central server that farms out work to individual PCs. This means you would need some form of communication between each machine and the central server, which keeps a record of what work has been assigned where.
You'll also need each machine to measure its CPU load and reject work if it is too busy.
A multi-threaded approach to the service would make good use of the multiple processor cores that are ubiquitous nowadays.
How about using one server and multi-threading your processing? Or even multi-threading on a PC, since you can get many cores on a standard desktop now.
This obviously doesn't deal with the machine going down, but it could give you much more performance for less investment.
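A minimal sketch of that suggestion using the Task Parallel Library; workItems and Process are placeholders for your own collection and handler:

    using System.Threading.Tasks;

    // Spread CPU-bound work across all available cores; the TPL sizes the
    // worker pool to the machine automatically.
    Parallel.ForEach(workItems, item => Process(item));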
You can check Windows clustering, but you will have to handle a set of issues that depend on the behaviour of the service (if you add more details about the service itself, I can give a more specific answer).
This depends on how you want to split your workload. It is usually done in one of two ways:
Splitting the same workload across multiple services. This means the same service is installed on different servers and does the same job. Assume your service reads huge amounts of data from the DB servers, processes it to produce huge client-specific data files, and finally sends those files to the clients. In this approach, all the services installed on different servers do the same work, but they split the work between them to increase performance.
Splitting parts of the workload across multiple services. In this approach, each service is assigned an individual job and works towards a different goal. In the example above, one service is responsible for reading data from the DB and generating the huge data files, and another service is configured only to read the data files and send them to clients.
I have implemented the second approach in one of my projects, because it let me isolate and debug errors in case of failures.
The usual approach for a load balancer is to split service requests evenly between all service instances.
For each work item (request), you can store the relevant information in a database. Each service should also have at least one background thread checking the database for abandoned work items.
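A minimal sketch of that background check, assuming a hypothetical WorkItems table with Status, Owner, and LastHeartbeat columns:

    using System;
    using System.Data.SqlClient;
    using System.Threading;

    static void StartRecoveryThread(string connectionString)
    {
        var thread = new Thread(() =>
        {
            while (true)
            {
                using (var conn = new SqlConnection(connectionString))
                using (var cmd = new SqlCommand(
                    @"UPDATE WorkItems SET Status = 'Pending', Owner = NULL
                      WHERE Status = 'InProgress'
                        AND LastHeartbeat < DATEADD(minute, -5, GETUTCDATE())", conn))
                {
                    conn.Open();
                    cmd.ExecuteNonQuery(); // abandoned items become claimable again
                }
                Thread.Sleep(TimeSpan.FromMinutes(1));
            }
        });
        thread.IsBackground = true;
        thread.Start();
    }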
I would suggest that you publish your service through WCF (Windows Communication Foundation).
Then implement a "central" client application which can keep track of available providers of your service and dish out work. The central app will act as scheduler and load balancer of the tasks to be performed.
Check out Juval Löwy's book on WCF ("Programming WCF Services") for a good introduction to this topic.
You can have a look at NGrid : http://ngrid.sourceforge.net/
or Alchemi : http://www.gridbus.org/~alchemi/index.html
Both are grid computing frameworks with load balancers that will get you started in no time.
Cheers,
Florian

How to programmatically limit bandwidth usage of my c# application?

I've got a backup application here that connects to various web services and downloads/uploads files from FTP or HTTP servers. What is the easiest way to limit the bandwidth usage of my application?
I need to do that because once the application is installed and running, it will slow down internet access for everyone in the office, which will eventually get me into hell. So I'd like to implement a speed limit that is active during work hours and disabled at night.
What you are looking for is called bandwidth throttling, and here is a good example of how it is done; also review the comments to see how it is done from the client side.
You may also want to take a look at this other example, which puts things in the context of a real application.
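The core idea, in a minimal sketch (not the linked examples, just an illustration): meter how many bytes you have moved and sleep whenever you are ahead of the allowed rate. Names and buffer size are my own:

    using System;
    using System.Diagnostics;
    using System.IO;
    using System.Threading;

    static void CopyThrottled(Stream source, Stream destination, int maxBytesPerSecond)
    {
        var buffer = new byte[8192];
        var watch = Stopwatch.StartNew();
        long totalBytes = 0;
        int read;
        while ((read = source.Read(buffer, 0, buffer.Length)) > 0)
        {
            destination.Write(buffer, 0, read);
            totalBytes += read;
            // If we're ahead of the budget, sleep until the average rate is legal again.
            double earliestAllowedSeconds = (double)totalBytes / maxBytesPerSecond;
            double aheadBySeconds = earliestAllowedSeconds - watch.Elapsed.TotalSeconds;
            if (aheadBySeconds > 0)
                Thread.Sleep(TimeSpan.FromSeconds(aheadBySeconds));
        }
    }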
