Loading large data in jQuery - C#

I have a web service that returns a fairly large data set: it could be 600 rows by 20 columns.
What is the fastest, most efficient way to load this data into an HTML table with jQuery?
I tried building the table HTML by looping through the returned data and assembling the markup in a string, but the looping part is very slow. I have heard of jQuery Templates, but I am not sure that approach is fast enough for large data sets...
Thanks

Is it possible for you to alter the web service or have another service call it and parse the data server side and return HTML? Processing the JSON on the client-side is going to be your bottleneck. If you can have the service return the required HTML to you, then it's a simple element.html(data) on the client side.
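For example, assuming a hypothetical ASMX endpoint that returns the ready-made row markup (the URL and element IDs below are placeholders), the client code stays trivial:

    // Minimal sketch -- the endpoint and element IDs are placeholders.
    $.ajax({
        url: '/TableService.asmx/GetTableHtml',  // hypothetical service returning <tr>...</tr> markup
        type: 'GET',
        dataType: 'html',
        success: function (data) {
            // one DOM update instead of hundreds of appends
            $('#resultsTable tbody').html(data);
        },
        error: function () {
            $('#status').text('Failed to load data.');
        }
    });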
Edit: The question of returning JSON or HTML and the pros and cons of each have been discussed here quite a bit:
1, 2, 3, 4, 5

It seems this is really a design question. Loading 600 x 20 data items at once is not a good idea; clients with low system resources, such as pocket PCs or thin clients, would struggle to view such a page.
You need to cache the web service data and load it into the client browser in chunks, based on user actions. You can use Ajax controls to do so.

If your goal is to let the user interact with the data as quickly as possible, you may want to consider something like the infinite scroll (also called continuous scroll) pattern, so you build the grid as the user scrolls instead of spending all the time rendering it up front.
Some links:
http://www.infinite-scroll.com/
http://net.tutsplus.com/tutorials/javascript-ajax/how-to-create-an-infinite-scroll-web-gallery/
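If you roll the pattern yourself rather than using a plugin, the core is only a handful of lines. A rough sketch, where the endpoint, page size and field names are all assumptions:

    // Rough sketch of continuous scrolling -- endpoint, page size and field names are assumptions.
    var page = 0, pageSize = 50, loading = false;

    function loadNextPage() {
        loading = true;
        $.getJSON('/DataService.asmx/GetRows', { page: page, size: pageSize }, function (rows) {
            var html = $.map(rows, function (r) {
                return '<tr><td>' + r.Id + '</td><td>' + r.Name + '</td></tr>';
            }).join('');
            $('#grid tbody').append(html);
            page++;
            loading = false;
        });
    }

    $(window).on('scroll', function () {
        // fetch the next chunk when the user gets near the bottom of the page
        if (!loading && $(window).scrollTop() + $(window).height() > $(document).height() - 200) {
            loadNextPage();
        }
    });

    loadNextPage(); // first chunk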

I think this is where JSON DB might be most useful... you could write a server-side page that responds with JSON-formatted data for a few rows at a time, then write your own Ajax code to load the rows and process them in your choice of display model, such as your own <table> with "overflow:auto;", adding rows to that table in chunks... or use something like the 'infinite scroll' already suggested.
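A note on the original question: even when the markup is built client-side, the usual culprit is updating the DOM row by row rather than the loop itself. Building the markup in one pass and assigning it once tends to be much faster; a minimal sketch, with the field names as assumptions:

    // Minimal sketch: build every row in memory, touch the DOM once.
    // 'rows' is the parsed JSON array from the service; the field names are assumptions.
    function renderRows(rows) {
        var buf = [];
        for (var i = 0; i < rows.length; i++) {
            var r = rows[i];
            buf.push('<tr><td>', r.Id, '</td><td>', r.Name, '</td><td>', r.Value, '</td></tr>');
        }
        // one assignment instead of 600 separate append() calls
        $('#grid tbody').html(buf.join(''));
    }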

JSON for caching tables... is that a good idea?

Current state:
I am working on an ASP.NET project that reads data from the DB and maps it to a data grid on a web page. Everything is okay except the response time during pagination: each time I move to another page of the table (1 of 50 pages), the DB is queried again, which is wrong.
What I plan to do:
Improve the performance of reading the data, e.g. by hitting the DB once, caching all the data in a JSON file, and then reusing that for the data grid after mapping it.
Or, if there is a better idea:
Cache these tables for a while and use the cache for pagination; the lifetime of the cached data would end when the page is closed.
I'm facing a scenario where the data is loaded into a JS DataTable on the client side and, from there, users can interact with the grid (CRUD) before sending it back to the server.
Using JSON as a cache on the client side at first load and letting the user work with the data is my solution in this case, since it dramatically reduces the server load and the data binding work.
Please share your thoughts; it would be nice if you could spare some knowledge on this.
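Roughly what I have in mind on the client (just a sketch; the endpoint, field names and page size are placeholders):

    // Sketch of the plan: fetch once, keep the data in memory, page without further requests.
    var cache = [], pageSize = 20;

    $.getJSON('/api/orders', function (data) {   // placeholder endpoint
        cache = data;        // lives only as long as the page is open
        showPage(0);
    });

    function showPage(n) {
        var rows = cache.slice(n * pageSize, (n + 1) * pageSize);
        var html = $.map(rows, function (r) {
            return '<tr><td>' + r.Id + '</td><td>' + r.Name + '</td></tr>';
        }).join('');
        $('#grid tbody').html(html);
    }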
It depends on how often the data changes.
If it changes frequently, it would be better to improve the database's internal caching or query result caching.
Otherwise you can put a caching mechanism in front of it, for example keeping the data/query result in memory (*): a place where you can put anything you have cached.
(*)
External layer: Redis, memcached, etc.
Application layer: shared memory (which can be accessed by many requests/sessions at the same time).

Sorting in Array vs Sorting in SQL

I have around 1000 rows of data. On the ASPX page, whenever the user clicks the sort button, the result is sorted by a specific column.
I proposed sorting the result in the SQL query, which is much easier with just an ORDER BY clause.
However, my manager insists that I store the result in an array and sort the data there, because he thinks that calling the database every time the user clicks the sort button will hurt performance.
Just out of curiosity - Does it really matter?
Also, if we disregard the number of rows, which of these methods is actually more efficient, performance-wise?
Well, there are three options:
Sort in the SQL
Sort server-side, in your ASP code
Sort client-side, in your Javascript
There's little reason to go with (2), I'd say. It's meat and drink to a database to sort as it returns data: that's what a database is designed to do.
But there's a strong case for (3) if you want to have a button that the user can click. This means it's all done client-side, so you have no need to send anything to the web server. If you have only a few rows (and 1000 is really very few these days), it'll feel much faster, because you won't have to wait for sending the request and getting a response.
Realistically, if you've got so many things that Javascript is too slow as a sorting mechanism, you've got too many things to display them all anyway.
In short, if this is a one-off thing for displaying the initial page, and you don't want the user to have to interact with the page and sort on different columns etc., then go with (1). But if the user is going to want to sort things after the page has loaded, then (3) is your friend.
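A minimal sketch of (3), sorting the rows that are already in the DOM (the table selector, column index and button ID are assumptions):

    // Minimal sketch of option (3): reorder the existing <tr> elements by one column's text.
    function sortTableByColumn(tableSelector, colIndex, ascending) {
        var $tbody = $(tableSelector).find('tbody');
        var rows = $tbody.find('tr').get();

        rows.sort(function (a, b) {
            var x = $(a).children('td').eq(colIndex).text();
            var y = $(b).children('td').eq(colIndex).text();
            return ascending ? x.localeCompare(y) : y.localeCompare(x);
        });

        // re-appending the elements moves them into sorted order
        $tbody.append(rows);
    }

    // e.g. wire it to a sort button for the second column
    $('#sortBtn').on('click', function () {
        sortTableByColumn('#grid', 1, true);
    });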
Short Answer
Ah... screw it: there's no short answer to a question like this.
Longer Answer
The best solution depends on a lot of factors. The question is somewhat vague, but for the sake of simplicity let's assume that the 1000 rows are stored in the database and are being retrieved by the client.
Now, a few things to get out of the way:
Performance can mean a variety of things in a variety of situations.
Sorting is (relatively) expensive, no matter where you do it.
Sorting is least expensive when done in the database, as the database already has all the necessary data and is optimized for these operations.
Posting a question on SO to "prove your manager wrong" is a bad idea. (The question could easily have been asked without mentioning the manager.)
Your manager believes that you should upload all the data to the client and do all the processing there. This idea has some merit. With a reasonably sized dataset processing on the client will almost always be faster than making a round trip to the server. Here's the caveat: you have to get all of that data to the client first, and that can be a very expensive operation. 1000 rows is already a big payload to send to a client. If your data set grows much larger then you would be crazy to send all of it at once, particularly if the user really only needs a few rows. In that case you'll have to do some form of paging on the server side, sending chunks of data as the user requests it, usually 10 or 20 rows at a time. Once you start paging at the server your sorting decision is made for you: you have no choice but to do your sorting there. How else would you know which rows to send?
For most "line-of-business" apps your query processing belongs in the database. My generalized recommendation: by all means do your sorting and paging in the database, then return the requested data to the client as a JSON object. Please don't regenerate the entire web page just to update the data in the grid. (I've made this mistake and it's embarrassing.) There are several JavaScript libraries dedicated solely to rendering grids from AJAX data. If this method is executed properly your page will be incredibly responsive and your database will do what it does best.
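To make that concrete, here is a rough sketch of the client side of that split; the URL, parameter names and JSON shape are assumptions:

    // Rough sketch: the database sorts and pages, the client only renders the JSON it gets back.
    function loadGrid(page, sortColumn, sortDesc) {
        $.getJSON('/api/items', { page: page, sortBy: sortColumn, desc: sortDesc }, function (result) {
            // result.rows and its field names are assumptions about the service's JSON shape
            var html = $.map(result.rows, function (r) {
                return '<tr><td>' + r.Id + '</td><td>' + r.Name + '</td></tr>';
            }).join('');
            $('#grid tbody').html(html);   // update only the grid, not the whole page
        });
    }

    // clicking a column header asks the server for the same page, re-sorted
    // (assumes each <th> carries a data-column attribute)
    $('#grid thead th').on('click', function () {
        loadGrid(0, $(this).data('column'), false);
    });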
We had a problem similar to this at my last employer: we had to return large sets of data efficiently, quickly and consistently into a DataGridView object.
The solution they came up with was to have a set of filters the user could use to narrow down the query results, and to cap the number of rows returned at 500. Sorting was then done by the program on an array of those objects.
The reasons behind this were:
Most people will not process that many rows; they are usually looking for a specific item (hence the filters)
Sorting on the client side did save the server a bunch of time, especially when there was the potential for thousands of people to be querying the data at the same time.
Performance of the GUI object itself started to become an issue at some point (the reason for limiting the number of rows returned)
I hope that helps you a bit.
From both a data-modeling perspective and an application-architecture perspective, it's "best practice" to put sorting/filtering into the "controller" portion of the MVC pattern. That is directly opposed to the answer above that several people have already voted for.
The answer to the question is really: "It depends"
If the application stays at one table, no joins, and a low number of rows, then sorting in JavaScript on the client is likely going to win performance tests.
However, since it's already ASPX, you may be preparing for your data/model to expand. Once there are more tables and joins, and if the UI includes a data grid where the choice of sort column changes on a per-client basis, then maybe the middle tier should be handling the sorting for your application.
I suggest reviewing Tom Dykstra's classic Contoso University ASP.NET example, which has been updated with Entity Framework and MVC 5. It includes a section on sorting, filtering and paging. This example shows the value of proper MVC architecture and the ease of implementing sorting/filtering on multiple columns.
Remember, applications change (read: "grow") over time so plan for it using an architecture pattern such as MVC.

Adding form fields dynamically, then saving them away

I have an ASP.NET form page used for capturing data for a particular task; the task items will be displayed in a tabular fashion.
There may be a number of items included in this task, for which I wish to add the appropriate form fields dynamically.
I will then need to 'save' the data from the form.
I was wondering what would be the easiest way to approach this.
I had planned on generating a single 'starter' row and then using jQuery to clone it (whilst renaming the inputs slightly) as needed.
The only problem with this is that, depending on the types of task items, the form fields would be slightly different, which would mean having a very complex 'starter' row and a fair bit of JS code along with the cloning - or, I guess, holding each 'type' of starter row hidden somewhere on the page and cloning them into life as necessary.
Or maybe having a JS function for each task type that uses append(...html...).
I was trying to avoid using an UpdatePanel - in which I could generate all the new fields in the code-behind - but maybe that would be simpler?
Along with this, when I actually want to save the data away, I presume the only way to get at it all is to use the Request.Form collection?
I'm interested in how you would approach this, as I'm trying not to make this hard on myself.
thanks
nat
If you are building the UI with jQuery, why not send the final results back to the server via a web service, called with a jQuery $.ajax command? That way you wouldn't need to rebuild the UI from client-side code after every postback, which is the problem with building content client-side but posting back on every response.
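A rough sketch of the two halves - cloning a hidden starter row and posting everything back with $.ajax. The selectors, data-field attributes and service URL are all assumptions:

    // Rough sketch: clone a hidden starter row, rename its inputs, post the lot back as JSON.
    var rowCount = 0;

    $('#addRow').on('click', function () {
        rowCount++;
        var $row = $('#starterRow').clone().removeAttr('id').show();
        // give each input a unique, predictable name such as item_3_qty
        // (assumes each input carries a data-field attribute)
        $row.find('input, select').each(function () {
            this.name = 'item_' + rowCount + '_' + $(this).data('field');
        });
        $('#taskItems').append($row);
    });

    $('#saveBtn').on('click', function () {
        var items = $('#taskItems tr').map(function () {
            var item = {};
            $(this).find('input, select').each(function () {
                item[$(this).data('field')] = $(this).val();
            });
            return item;
        }).get();

        $.ajax({
            url: '/TaskService.asmx/SaveItems',   // hypothetical web service method
            type: 'POST',
            contentType: 'application/json; charset=utf-8',
            data: JSON.stringify({ items: items }),
            success: function () { $('#status').text('Saved.'); }
        });
    });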
It looks like your form is very dynamic in nature. Creating all these dynamic controls on the server side can become really messy, especially because you have to work with it both on the server and on the client.
An alternative would be to do everything HTML-related on the client and leave only the data processing to the server, using JSON as the data format.
I just started working with Angular and I am really impressed.

WCF performance

We have a table in our database which has around 2,500,000 rows (around 3GB). Is it technically possible to view the data from this table in a Silverlight application that queries it using WCF? Potentially, I see issues with the maximum buffer size and timeout errors. We may need the entire data set for visualization purposes.
Please guide me if there is a practical solution to this problem.
Moving 3GB to a client is not going to work.
"...for visualization purposes."
Better to prepare the visualization server-side. That will be slow enough as it is.
Generally, in this sort of situation, if you need to view individual records you would use a paging strategy: your call to WCF would return one page's worth of records, you would display those records, and the user would click a next/previous button or some such.
As for the visualisation, you should look at performing some transformation/reduction on the server, as 2.5 million records is akin to displaying one data point per pixel on your screen.
First of all, have a look here.
Transferring 3GB of data from disk to disk can take quite a few minutes, let alone sending it across the network. I think you have bigger fish to fry - the WCF limitation is irrelevant here.
So let's assume that after a few minutes/hours you get the data across the wire: where do you store it? Your Silverlight app, if running inside the browser, cannot grow to 3GB (even on a 64-bit machine), and even if it could, it would not make sense. Especially since that amount of data, when transformed into objects, will take up a lot more space.
Here is what I would do:
Get the server to provide snapshots/views of the data that is useful, e.g. providing summary, OLAP cubes, ...
For each record, provide the minimum data required.
If you need detail on a record, fetch it in a separate call.
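A rough sketch of that summary-plus-detail split on the client; the endpoints and field names are assumptions:

    // Rough sketch: load a lightweight summary list first, fetch full detail only on demand.
    $.getJSON('/api/records/summary', { page: 0, size: 100 }, function (rows) {
        var html = $.map(rows, function (r) {
            return '<tr data-id="' + r.Id + '"><td>' + r.Name + '</td><td>' + r.Total + '</td></tr>';
        }).join('');
        $('#grid tbody').html(html);
    });

    // a separate, small call only when the user actually opens a record
    $('#grid').on('click', 'tr', function () {
        $.getJSON('/api/records/' + $(this).data('id'), function (detail) {
            $('#detailPane').text(JSON.stringify(detail, null, 2));
        });
    });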
Well, I believe (and suggest) that you're not going to show 2.5 million rows in the same listing.
If you develop good paging and the way you query the data is optimal, I don't see a problem with WCF.
I agree that querying data through a WCF interface is less efficient than a standalone solution with direct access to the infrastructure, but if you need to host business logic and data for N clients in a SOA or client-server solution, you'll need to be sure that your queries are efficient.
Suggestions:
Use an OR/M. NHibernate will be your best choice, since it has a lot of ways of tweaking performance, and paging is made easy by its LINQ support through the QueryOver API in NHibernate 3.0. This product has a very interesting caching scheme and it'll let your application efficiently visualize your 2.5-million-row table.
Do caching. NHibernate may help you in this area, but think about it and, depending on the client technology (web, Windows...), you'll find good options for caching presentation views (ASP.NET output caching, for example).
Think about how you're going to serialize objects in WCF: SOAP or JSON? JSON may interest you because the serialized objects are small, which saves network traffic.
If you have questions, just comment!
OK - other answers have covered the technical side, so let's talk about the requirement itself, which someone clearly came up with without thinking it through.
2.5 million rows make no sense in a grid. Zero. Showing 80 rows per page (wide screen, tilted 90 degrees), that would be 31,250 pages worth of data. You cannot even scroll to a specific page. Ignoring load times - even IF (!) you manage to load it all - it just makes no sense to have this amount of data in a grid. Filter it down, then load what you need page by page. But the key here is to force the user to filter BEFORE even thinking about a grid. And once you have that many rows, let's not even get into the performance of the grid itself.
To show how bad this is, forget the grid: if you assign ONE PIXEL to every data item, you need more than three full screens of 1024x768 pixels to show the data. That is one pixel per item.
So, at the end of the day, even IF (which is impossible) you manage to get this working, you end up with a nonsensical, unusable application.

Read XML, or JSON file instead of SQL Server

I have ASP.NET 3.5 with C# and a SQL Server 2008 back end.
There is one table that I use the most; it has around 100 rows and doesn't change often. My code (a web service with a cache) is called from jQuery to look up a record by ID and return a JSON response to the client side.
Recently the server that hosts my site had a big problem, I had to migrate to a new server, and my site was down for 3 days. I started thinking about saving my data to an XML or JSON file and not touching the database anymore.
I need your input. I know how to work with the XML file (I will use LINQ), but I don't know how to read a JSON file from the client side with jQuery. Maybe I should read it on the server side with a StreamReader?
Which method do you like better (XML or JSON)? Any help would be greatly appreciated.
You can actually point an AJAX request straight at a JSON file and it'll work from jQuery. This is known as a RESTful API.
For instance:
http://www.myserver.com/api/customers/12
That could be a file named 12 in the folder api/customers, or it could be a script returning a JSON response. The idea is that the URL represents the resource you're looking for.
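For example, with jQuery the call is the same whether the URL points at a static file or a script (the field names here are assumptions about what the resource contains):

    // Minimal sketch: the client can't tell a static .json file from a script that writes JSON.
    $.getJSON('/api/customers/12', function (customer) {
        // the field names are assumptions
        $('#name').text(customer.Name);
        $('#city').text(customer.City);
    });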
However, I highly suggest you don't go down this route: even if you download the 100 rows and search them in JavaScript, it's a bad idea, as it puts load on the client and defeats the purpose of AJAX (retrieving only the relevant information).
If you're determined to do away with your database, I would suggest using an XML file, as you keep the processing on the server side and can ensure its behaviour. Any code that runs on the client is subject to:
Tampering
Version Issues
Taint from plugins
It's also harder to debug.
I suggest exposing a method on your web service that looks up the row by ID and then uses a simple LINQ to XML query to retrieve the data you want.
The ID search should be very simple, and you could even cache the results if you're getting hit hard (to reduce disk reads & XML parses), but that may be overkill :)
