Is there a way to cache user-requested data per user/connection? My situation is that I display data returned from a WCF service in an ASP.NET GridView using paging. The GridView displays only 10 items at a time, and whenever the next page is clicked, the service is called again (which takes time). After each call the WCF connection is closed. Is there a way to fix this? I have read about the WCF/ASP.NET caching mechanism that caches a function call's data and expires it after a certain time. What I'm really after is per-user/per-call caching: return 10 items at a time without calling the long-running function again for each set of 10. Is there a way to do this?
Basically: call the function once to get 100 items, then return them 10 at a time, without ever running the function (which fetches the 100 items) again?
I believe that the ASP.NET GridView always does a postback, which results in a function call every time a user changes page. But I may be proven wrong by someone.
That being said, this may not be the best solution for you, but you could avoid using the ASP.NET GridView in your scenario. Run the WCF call once and return the 100 items, keep them on the client side as JSON held in memory, then use jQuery or some other JavaScript to let users page through the results without making another service call.
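If you go that route, the server side only has to expose the full result set once. Here is a minimal sketch of such a page method, assuming a generated WCF proxy named ItemServiceClient with a GetItems operation and an ItemDto data contract (all three names are hypothetical); the page's script calls it a single time, caches the JSON, and slices out 10 items per "page":

using System.Collections.Generic;
using System.Web.Services;

public partial class ItemListPage : System.Web.UI.Page
{
    // One long-running WCF call; the client caches the JSON result and
    // pages through it with jQuery, 10 items at a time, with no further calls.
    [WebMethod]
    public static List<ItemDto> GetAllItems()
    {
        using (var client = new ItemServiceClient())      // assumed generated WCF proxy
        {
            return new List<ItemDto>(client.GetItems());  // assumed operation returning ~100 items
        }
    }
}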
Edited
This may prove useful in helping you do paging on the client side without making repeated calls to the server:
http://www.smallworkarounds.net/2009/02/jquery-aspnet-how-to-implement.html
Related
I want to process some records one by one, and I need to show on the .aspx page which record (the nth) is currently being processed.
Setting the value in a label does not work, because it is only reflected once the response comes back from the server; it does not refresh for every record, so the page looks frozen.
You can use different techniques:
Poll the server via AJAX requests at a regular interval (see the sketch after this list). A well-answered question on this topic: Server polling with JavaScript
Use SignalR to push changes back to the client when something changes on the server. You can read more about it here: How SignalR works internally?
Choose whichever is applicable in your case.
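For the polling option, this is roughly what the server side could look like. It is a sketch only: ProcessRecordsPage, LoadRecords, ProcessRecord and the JobIdHidden field are all assumed names, and the per-record work is pushed onto a background thread so the kick-off request returns immediately and does not block the polling calls. The client calls GetProgress via AJAX every second or so and writes the number into a label.

using System;
using System.Collections.Concurrent;
using System.Threading;
using System.Web.Services;

public partial class ProcessRecordsPage : System.Web.UI.Page
{
    // Progress per job id, shared across requests so the polling call can read it.
    private static readonly ConcurrentDictionary<string, int> Progress =
        new ConcurrentDictionary<string, int>();

    protected void StartButton_Click(object sender, EventArgs e)
    {
        var jobId = Guid.NewGuid().ToString();
        JobIdHidden.Value = jobId;                      // hand the id to the client script

        var records = LoadRecords();                    // assumed data access call
        ThreadPool.QueueUserWorkItem(_ =>
        {
            for (int i = 0; i < records.Count; i++)
            {
                ProcessRecord(records[i]);              // assumed per-record work
                Progress[jobId] = i + 1;                // the number the client polls for
            }
        });
    }

    // Called from the client via AJAX: "Processing record N".
    [WebMethod]
    public static int GetProgress(string jobId)
    {
        int n;
        return Progress.TryGetValue(jobId, out n) ? n : 0;
    }
}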
Here's my predicament. I have a page in an MVC app that's displaying a list of search results via a partial view using an ajax call. The model is a List<List<string>> representing a dynamic data set, i.e., the users choose which columns they want returned and what order they come back in. All the view is doing is a nested loop which builds out the results table.
One of the potential returned fields is an image of a barcode which is being rendered by another method returning a FileContentResult. Normally this works great, it's slick and performant, to the point where I don't really notice all of the barcodes being rendered and downloaded at all, even in a data set that's hundreds of rows long.
The problem arises when I set a session variable using HttpContext.Current.Session, even something as simple as Session["thingy"] = "thingy";. When that happens there is a drastic performance hit with the barcode images. Result sets that would take a second to load fully are now suffering from image "pop in" for up to 10 seconds after the search button is hit. A few times an image has failed to load, giving an error to the effect of "the server is too busy right now".
Does anyone out there in overflowland have any insight into what could be causing this behavior? I've found a kludgy workaround but it involves unnecessary ajax calls and extra trips to the database.
So the issue was that ASP.NET handles requests from the same session one at a time whenever anything is stored in (read-write) session state. All of my calls to the barcode action were queued up behind one another, each waiting for the previous one to finish, hence the pop-in.
The answer was in this link posted by Alexei. Oddly enough it was the most downvoted answer that provided the easiest solution. I created a new controller for my images and refactored the barcode rendering action into it, then decorated the controller with [SessionState(SessionStateBehavior.Disabled)], which removes the session lock for that controller and lets requests to its actions run concurrently.
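Roughly the shape of the refactored controller (the action name and the BarcodeRenderer helper are illustrative, not the poster's actual code). With session state disabled here, requests for barcode images are no longer serialized behind the session lock and can be served in parallel:

using System.Web.Mvc;
using System.Web.SessionState;

[SessionState(SessionStateBehavior.Disabled)]
public class BarcodeController : Controller
{
    public FileContentResult Render(string value)
    {
        byte[] png = BarcodeRenderer.ToPng(value);   // hypothetical image generation
        return File(png, "image/png");
    }
}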
I was having the same issues a while ago. Fixed it by setting EnableSessionState to ReadOnly in my web.config.
I thought it might have some negative side effects, but none so far. I even posted a question here on SO looking for comments.
See here: EnableSessionState = ReadOnly - possible side effects?
I would like to be able to load data into a DataGrid in Silverlight as it becomes available. Here is the scenario:
My silverlight client fires a WCF call to the server.
The server takes about 1 to 2 seconds to respond.
The response is between 1 MB and 4 MB (quite large).
This data is loaded into a DataGrid.
Although the server responds quickly, the data is not seen by the user until all 1 MB to 4 MB has been downloaded.
What is the best (or most efficient) way to load this data into the DataGrid as it is being downloaded by the client? As opposed to waiting for the download to complete?
One way of handling this is to implement custom virtualization (a sketch of the service contract follows the list):
Add a webservice method that returns only ids
Add a webservice method that returns a single object by id
Retrieve the ids and load only the visible objects (and perhaps a few more to allow scrolling)
Retrieve more objects when needed for scrolling.
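A rough shape for the two service operations (contract and type names are made up). Ids are cheap to fetch in bulk; the heavy objects are fetched one at a time, only when their rows become visible in the DataGrid:

using System.Collections.Generic;
using System.ServiceModel;

[ServiceContract]
public interface IItemService
{
    [OperationContract]
    List<int> GetItemIds();        // cheap: ids only

    [OperationContract]
    ItemDto GetItemById(int id);   // small: one object per call
}

In practice you would probably also add a batched operation (say, GetItemsByIds) so scrolling fetches a handful of rows per round trip instead of strictly one at a time.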
The problem is part of what I was trying to get at with my comment (you still didn't really specify the data type of the return), and what Erno gave you a workable solution for. The web service serializes whatever return type you are sending and will not give you partial results. It's not a question of how you interface with the grid; it's a question of when the web service call on the client says "OK, I received the data you need, now continue processing." For instance, if you are gathering up a data table on the server side with 4 MB worth of records, and in your service you do:
return MyMassiveDatatable;
Then you are going to have to wait for the entire data table to be serialized and pumped across the wire.
His solution was to break up the transfer into atomic units, i.e. query first for the ids of the records in one web service call, then iterate through those ids and request the record for each id, one at a time. As you receive each record back, add it to your client-side table so that your display shows each record as you get it.
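A sketch of the client side of that approach. Silverlight's generated proxies expose *Async methods and *Completed events, and every name below is assumed; each record is added to the bound collection the moment it arrives, so the DataGrid fills row by row instead of waiting for one multi-megabyte payload:

using System.Collections.ObjectModel;
using System.Linq;

public class ItemsViewModel
{
    private readonly ObservableCollection<ItemDto> _items = new ObservableCollection<ItemDto>();
    public ObservableCollection<ItemDto> Items { get { return _items; } }   // DataGrid.ItemsSource binds here

    public void Load()
    {
        var client = new ItemServiceClient();   // assumed generated Silverlight proxy

        client.GetItemIdsCompleted += (s, e) =>
        {
            // Only ask for what's (roughly) visible; fetch more while scrolling.
            foreach (int id in e.Result.Take(20))
                client.GetItemByIdAsync(id);
        };

        client.GetItemByIdCompleted += (s, e) =>
        {
            Items.Add(e.Result);   // one row appears in the grid per completed call
        };

        client.GetItemIdsAsync();
    }
}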
X seconds after the page loads, I need to execute a method in the code-behind. I cannot move this logic into JS.
Do I need to use delegates/events for this? Can anyone give me an example (preferably with a code snippet)?
Put a counter in JS that measures the X seconds. Once it's reached its mark, have it send a message via AJAX back to the server, and the server executes the method.
That's about the only way to ensure that the counting of those seconds is accurate to when the page finishes loading. If you don't care too much about accuracy, just have the server kick off the method x seconds after it sends the page.
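A minimal sketch of the receiving end, assuming the logic can live in a static page method (the page and method names are made up, and this presumes a ScriptManager with EnablePageMethods="true" on the page, or a plain AJAX POST to the page method URL):

using System.Web.Services;

public partial class SomePage : System.Web.UI.Page
{
    // The JS counter calls this once the X seconds are up, e.g.:
    //   setTimeout(function () { PageMethods.RunDelayedWork(); }, 5000);
    [WebMethod]
    public static void RunDelayedWork()
    {
        // ... the code-behind logic you wanted to defer until X seconds
        //     after the page finished loading
    }
}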
Your best solution is going to be to use javascript to either cause a postback, or to send an AJAX request to the server after the X seconds has elapsed.
Due to the page lifecycle of ASP.NET pages, you can't do it from the code-behind directly. You can see this article for more information on the ASP.NET Page Lifecycle.
I would put in a bit of JavaScript that uses setTimeout to trigger a JS method that either makes the AJAX request or forces the postback, depending on what you are doing.
Edit
Based on the additional information you put in the comments to the post, I would recommend a modified approach if all you are doing is launching another window and you want to delay that logic.
Instead of directly calling window.open (or however you are doing it), simply put that code inside the function that setTimeout invokes, as I described earlier. No need to involve the server side at all.
Question to die hard asp.net experts. I have spent much time trying to find an answer or to do it myself but no luck so far.
ASP.NET web application. I plan to improve page load time so that the user experience is better. I want to delay-load sections of the page using UpdatePanels. I can make one UpdatePanel update itself right after the page loads, using a timer with a minimum interval. That works just fine, but the trouble begins when trying to do it with multiple UpdatePanels. Basically what happens is that all panels are updated, but sequentially rather than all at the same time.
Now, I have read that this is due to the fact that each async postback result carries the full page ViewState, and to prevent ViewState inconsistencies, asynchronous postbacks are serialized. Actually, they say that only the last callback would be successful, so I am lucky to have them serialized, I guess.
And now the big question: has anyone found a way round it? In ASP.NET if possible. This would be a VERY valued answer probably not only for me.
Thanks, thanks, thanks (for working answer :-)
UpdatePanels are synchronous by design.
If you want to execute multiple requests concurrently, you'll need to use page methods, AJAX services, or raw AJAX. Any of those routes means giving up on ViewState.
If you want to render ASP.NET controls concurrently for multiple AJAX requests, you can make small independent ASPX files that contain the controls, send AJAX requests to them, and insert the rendered HTML into the DOM. In jQuery, you would do it like this: $('selector').load('something.aspx'). Note that neither postbacks nor ViewState would work in those partials.
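For that last option, each of those small ASPX files is just an ordinary page whose code-behind renders its own piece of the UI; something along these lines (all names invented), which several $('...').load('...') calls can then pull in concurrently:

using System;

// Code-behind of a hypothetical stand-alone partial, e.g. SalesPanel.aspx,
// fetched with $('#salesSection').load('SalesPanel.aspx'). It binds only its
// own controls and has no tie to the host page's ViewState, so several such
// requests can be served in parallel.
public partial class SalesPanel : System.Web.UI.Page
{
    protected void Page_Load(object sender, EventArgs e)
    {
        SalesRepeater.DataSource = GetSalesData();   // assumed control + data access call
        SalesRepeater.DataBind();
    }
}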