Server-side vs Client-side web application Performance [closed] - c#

As it currently stands, this question is not a good fit for our Q&A format. We expect answers to be supported by facts, references, or expertise, but this question will likely solicit debate, arguments, polling, or extended discussion. If you feel that this question can be improved and possibly reopened, visit the help center for guidance.
Closed 9 years ago.
I am an entry level programmer with only a few months of experience.
Yesterday I was discussing with a colleague how we can improve the performance of a project we are working on together.
The project is built with C# + Ext.NET + JS
The plan was to move as many things as possible to client-side JavaScript instead of interacting with the server all the time.
I thought this was a good idea, but couldn't help but wonder if there is a point where bringing everything to the client side starts making the web application slower. I understand that interacting with the server and reloading unnecessary data all the time is a waste in most cases, but I've also seen websites loaded with so much JS that the browser actually lags, and browsing the web application is just a pain.
Is there a golden point? Are there certain 'rules'? How do you achieve maximum performance? Take Google's cloud apps, such as Docs, for example: they're pretty fast for what they do, and they're web applications. That is some very good performance.

JavaScript is incredibly fast on the client-side. I assume Ext.NET is like AJAX? If not, you can use AJAX to communicate with the server using JavaScript. It will be pretty fast configured like that. However, the style of coding will change drastically if you're currently using .NET controls on the DOM with click events.

My 2 cents: use lazy loading of xtypes whenever possible on the client (i.e., you can define an xtype, but it is only instantiated when it is needed). Especially if those xtypes make AJAX calls!
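The lazy pattern is easy to sketch outside Ext.NET as well. Here is a minimal, framework-free illustration in plain JavaScript; the registry functions and the 'usergrid' xtype are made up for the example, not Ext API:

```javascript
// Register a factory per xtype, but only build the (possibly expensive)
// component the first time it is actually requested.
const registry = new Map();

function defineXtype(xtype, factory) {
  registry.set(xtype, { factory, instance: null });
}

function createXtype(xtype) {
  const entry = registry.get(xtype);
  if (!entry) throw new Error('Unknown xtype: ' + xtype);
  if (entry.instance === null) {
    entry.instance = entry.factory(); // built only on first use
  }
  return entry.instance;
}

// The grid (and any AJAX call its factory would fire) is NOT created
// at page load time...
let built = 0;
defineXtype('usergrid', () => {
  built += 1;
  return { xtype: 'usergrid', title: 'Users' };
});

// ...only when the user actually navigates to it:
const grid = createXtype('usergrid');
```

Ext's component manager does roughly the equivalent when you give a container an `xtype` config object instead of an already-instantiated component.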

Related

Very simple web service: take inputs, email results [closed]

Closed 10 years ago.
I work at a small startup as a Data Scientist, and I'm looking for ways to make my analysis a bit more visible/useful to the organization. I'd like to be able to put up a simple web service which allows internal users to run my scripts remotely. They should be able to input a few parameters via a very simple UI, and they should have the option to have the results appear in the browser window (after a possibly long wait), or have them emailed. Results may be a few pdf figures, and they may be Excel spreadsheets (maybe more exotic in the future, but this is it for now).
The scripts are going to be all in Python, which will handle the analysis.
So, I'd like to know what the pros and cons are of using C#/WCF vs. something like Django or Python. I have significant experience in C# working in the client-side code base here, but I have much less experience with WCF. All of my analysis work is done in Python (and R, to a lesser extent). The main goal is to not take all of my time building a fancy web service/UI; the front end just has to be friendly enough to not intimidate the marketing people. I don't have to worry about encryption, since the server will be behind our firewall. I'm pretty platform agnostic, but I think the servers are all Windows based, if this helps.
Thanks in advance.
For extra credit, how does your answer change if some of my scripts are in F#?
You might consider using the Django web framework. You could set up a small app with your Python scripts as different views. https://www.djangoproject.com/
And if you don't want to put that much effort into creating a friendly UI, you could use Twitter Bootstrap. http://twitter.github.com/bootstrap/
Then just run the app internally to gather and display data, either via HTTP GETs or via e-mail.
Edit: I'm sorry, I did not read carefully: "pros and cons are of using C#/WCF vs. something like Django". I recently made a Django app and it was fairly straightforward.

Consuming a SOAP Web Service with the lowest overhead [closed]

Closed 10 years ago.
I'm implementing a SOAP web service for sending thousands of emails and storing thousands of XML response records in a local database (C#/.NET, Visual Studio 2012).
I would like to make my service consumer as fast and lightweight as possible.
I need to know some of the considerations. I always have a feeling that my code should run faster than it is.
E.g.
I've read that using DataSets increases overhead. So should I use lists of objects instead?
Does using an ORM introduce slowness into my code?
Is a console application faster than a WinForms app? The user needs no GUI to deal with; there are simply some parameters sent to the app that invoke some methods.
What are the most efficient ways to deal with a SOAP Web Service?
Make it work, then worry about making it fast. If you try to guess where the bottlenecks will be, you will probably guess wrong. The best way to optimize something is to measure real code before and after.
DataSets, ORMs, WinForms apps, and console apps can all run plenty fast. Use the technologies that suit you, then tune the speed if you actually need it.
Finally if you do have a performance problem, changing your choice of algorithms to better suit your problem will likely yield much greater performance impact than changing any of the technologies you mentioned.
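That "measure before and after" advice is mechanical to apply. Here is a rough sketch in JavaScript (in C# the equivalent tool is `System.Diagnostics.Stopwatch`); the two candidate implementations are made-up examples, not code from the question:

```javascript
// Time a function over many iterations and return elapsed milliseconds.
function timeMs(fn, iterations) {
  const start = process.hrtime.bigint();
  for (let i = 0; i < iterations; i++) fn();
  return Number(process.hrtime.bigint() - start) / 1e6;
}

const data = Array.from({ length: 1000 }, (_, i) => i);

// Two ways to copy an array; guesses about which is faster are worthless
// until you actually time them on realistic input.
const withConcat = () => data.reduce((acc, x) => acc.concat(x), []);
const withPush = () => {
  const acc = [];
  for (const x of data) acc.push(x);
  return acc;
};

const msConcat = timeMs(withConcat, 50);
const msPush = timeMs(withPush, 50);
// Compare msConcat vs msPush, then keep whichever is fast enough.
```

The point is not the specific numbers but the habit: change one thing, re-measure on realistic input, and only then decide.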
Considering my personal experience with SOAP, in this scenario I would say your main concern should be how you retrieve this information from your database (procedures, views, triggers, indexes, etc.).
The difference between a console app, a WinForms app, and a web app isn't that relevant.
After the app is done, you should run a serious stress test on it to see where your performance problem lies, if it exists.

Making an entire web application dynamic [closed]

Closed 10 years ago.
I know my way around the basics of KnockoutJS, and I can easily make a single page really dynamic. But I am about to build a new web application, and I'm looking for advice on how to make the entire web application dynamic, where every view model and HTML template is loaded dynamically with no full-page requests, but the URL should still indicate what page I am on, either with a hash followed by a path, or something better.
I'm a bit confused:
Is there some framework that plays nicely with KnockoutJS that helps achieve this?
Can I achieve this without worrying about KnockoutJS?
Is it just a matter of tweaking the view model to dynamically load and dispose of other view models and templates, in a smart way?
What is the best practice? What do I do?
Any pointers, links or tips on this is much appreciated, thanks!
Here is an example; notice how the URL changes and the new content animates in. How do they do it?
https://www.pokki.com/app/Little-Alchemy
By the way, I use ASP.NET MVC.
There are a few things you are asking for here.
A routing framework.
External templates.
There's a simple plugin to help you with the latter: Knockout.js External Template Engine
For the former, there are some routing frameworks available that play nicely with KO. You still generally need to do something with the fetching/creation/disposal of child ViewModels. The routing framework may help you with this, or it may just handle monitoring the events that would normally cause a navigation, and call functions that you supply.
I'd like to see a full drop-in routing framework that allows for more declarative definition of url → ViewModel mapping, but haven't found anything that is truly easy as yet.
The one I have started using is called Path.js, but you still have to do a fair bit of glue code.
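For a sense of how small that glue can be, here is a framework-free sketch of the URL-to-ViewModel mapping idea; the route table and view-model names are hypothetical, not Path.js API:

```javascript
// Map a location hash to a function that builds (or re-activates)
// the right view model for that "page".
const routes = new Map();

function route(hash, activate) {
  routes.set(hash, activate);
}

function navigate(hash) {
  const activate = routes.get(hash);
  if (!activate) throw new Error('No route registered for ' + hash);
  // In a real app you would dispose the previous view model here,
  // then bind the new one (e.g. with ko.applyBindings) to its template.
  return activate();
}

route('#/users', () => ({ name: 'UsersViewModel' }));
route('#/users/new', () => ({ name: 'NewUserViewModel' }));

const current = navigate('#/users');
// In the browser, wire this up with:
//   window.onhashchange = () => navigate(location.hash);
```

A real routing framework adds pattern matching (`#/users/:id`) and history integration on top of this, but the dispatch-to-a-view-model-factory core is the same.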

Is it OK to use a server-side API + AJAX instead of the classic approach in web dev? [closed]

Closed 10 years ago.
Recently I read an article which said something like: "stop using the classic approach in web development, where the server gets data from a datastore and renders the views for the browser; the server should only retrieve the needed data and respond with JSON (or whatever), so the client can render it however it wants". Obviously this approach decreases network traffic, and page loads would be faster. But on the other hand we have to write more JS code (e.g. with Knockout). What's your opinion? What problems can appear in this case?
The advice is valid, but should be less dogmatic. The reason to switch to a data-fetching approach is pretty simple in reality: it allows you to reuse the calls elsewhere, if you have multiple parts of an application requiring the same data. Depending on how you do it, though, you might run into increased bandwidth usage due to not being able to get exactly the data you want in one AJAX call (thus duplicating/splitting requests).
The other obvious advantage is it allows you to roll out an outside API pretty easily once that is done.
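In practice the split looks like this: the server returns plain data, and a small client-side function owns the presentation. A minimal sketch, where the field names and the endpoint are made up for illustration:

```javascript
// What the server would send back as JSON, instead of rendered HTML:
const payload = [
  { id: 1, title: 'First post' },
  { id: 2, title: 'Second post' },
];

// The client renders it however it wants; a pure function here,
// though in practice a library like Knockout would do the templating.
function renderList(items) {
  return '<ul>' +
    items.map(item => '<li data-id="' + item.id + '">' + item.title + '</li>').join('') +
    '</ul>';
}

const html = renderList(payload);
// In a real app, payload would come from an AJAX call, e.g.
//   fetch('/api/posts').then(r => r.json()).then(renderList)
// and the same /api/posts endpoint could later back a public API.
```

Note that real templating must also HTML-escape the data; this sketch skips that to keep the shape of the approach visible.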

How can I make my browser unique? [closed]

Closed 10 years ago.
I just made a browser using C# as one of my first programming projects, and it works pretty well, but it's pretty generic. I tried adding different tools to it, but it still ends up pretty bare, and I'm better off using Firefox. Is there a way to add useful apps and color to your browser? I want to make it more personal and add things that make it unique, not necessarily better than all the others. I did it more for practice than anything, but I want to add a little flair to it. If I'm not doing it for a job, I might as well go for style points.
Implement HTML5 and CSS3. All of it. That would be worth using.
What I'd really like in the short run would be a browser that combines the V8 JavaScript engine and WebKit as the rendering engine with the flexibility of XPCOM (but without its bloat). Basically, a cross-platform Google Chrome that accepts Firefox extensions.
I suspect you'll find most of the really useful ideas have already been implemented in the major browsers. If you think of something really neat which hasn't been done already, I think it would be a more useful contribution to the world if you'd implement it for/in the major browsers instead of in your own one.
I'm not trying to put a damper on your aim - it's just that quite a few people have already thought about browsers long and hard. It's great to do something as a route to learning, but it would be unrealistic to think you'll come up with a world-beating new idea at the same time.
Kittens, unicorns (mostly pink) and a sparkling logo would do it for me.
Most browsers these days have given up competing on features and now battle it out on the speed and stability front. Safari 4 (using the SquirrelFish / Nitro JavaScript engine) is the self-proclaimed "fastest browser in the world"; Chrome implemented the V8 JavaScript engine and also gave each tab its own process. Firefox 3.5 has apparently increased JavaScript performance to rival Safari 4.
There are umpteen Gecko and WebKit browsers out there trying to come up with the next big idea, but aside from Chrome, no browser has made a significant (any?) impact on the web in the past few years.
