I have stored my dataset in ViewState (because I need to filter the data on different client clicks and show the results), but the page load is taking a long time. Even a checkbox checked event (with AutoPostBack) that has no code to execute takes almost 2-3 seconds.
Is this just because of the ViewState data? If so, are there alternatives that would still let me achieve this? I need the data to be shown quickly on client events, which is why I have been using ViewState. Any workaround would help.
As #Tushar mentioned above, ViewState is not the place you want to be storing large amounts of data. It's really only meant to preserve the state of controls between round trips, and misusing it can lead to poor app performance.
Instead you should look into the following server managed options:
Application State - Used for storing data that is shared between all users. Uses server memory.
Session State - Used for storing data specific to a user's session. Also uses server memory. Data can be persisted through app restarts, as well as across a web garden or server farm. More info from MSDN here: http://msdn.microsoft.com/en-us/library/z1hkazw7.aspx
The biggest con of these methods is memory management: both options consume server memory and keep data until there is either a restart of some sort or the session is dropped. Thus, these methods don't always scale well.
Also, here is an MSDN article discussing the various .NET methods of state management, with pros and cons for each method:
A third option is to implement a caching strategy, either by using the .NET caching libraries, building your own, and/or using third-party caching servers/libraries. The benefit of using a cache is that the data can automatically expire after a specified amount of time. However, complexities are introduced when working in a web-garden or server-farm environment.
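For example, here is a minimal sketch of the built-in option using .NET 4's System.Runtime.Caching (the "ReportData" key and the LoadDataSet helper are placeholders, not something from the question):

using System;
using System.Data;
using System.Runtime.Caching;

public static class ReportDataCache
{
    public static DataSet GetReportData()
    {
        ObjectCache cache = MemoryCache.Default;
        var data = cache.Get("ReportData") as DataSet;
        if (data == null)
        {
            data = LoadDataSet();                           // expensive query, only runs on a cache miss
            cache.Set("ReportData", data,
                      DateTimeOffset.UtcNow.AddMinutes(20)); // entry expires automatically after 20 minutes
        }
        return data;
    }

    private static DataSet LoadDataSet()
    {
        // Placeholder for the real database call.
        return new DataSet();
    }
}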
The biggest thing to remember is that any of the strategies mentioned above will require some planning and consideration with regard to managing and sharing the data.
If you're storing a large amount of data in ViewState, you'll notice performance issues. Although ViewState is really meant for "this page only" and Session is meant for "this session", you'll reach a point where the ViewState is so large that Session is ultimately much better for performance.
It's worth noting that you might be having some other type of issue, not just an issue with the ViewState (e.g. your database query may be taking a long time and could possibly be cached).
The ViewState makes the page slightly larger due to the extra data embedded in the page's HTML to hold the serialized ViewState. Whether that extra size will cause load problems depends on the connection speed, and on the size of the view state relative to the rest of the page.
The ViewState is sent back to the server with each HTTP request (so including your AutoPostback). Again, whether that causes a noticeable performance issue depends on the view state size and the connection speed.
On a broadband(ish) connection with the amount of ViewState data one would find in a typical page, you would not see 2-3 seconds additional processing time.
Diagnosing
Use the developer tools in your browser (in IE, press F12). You can monitor web requests including the exact header and body sent and received. You can also see the timing for each HTTP request. If the ViewState is not huge (not more than 1-2K perhaps) and your connection speed is not excessively slow, that is not your culprit.
Alternatives
You can hold state entirely server-side, or put any state items that are large entirely on the server. You can use Ajax requests to process page events that depend on that state.
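For example, a static page method can serve filtered results from state held in Session, so only the small result set travels over the wire on each client event. This is just a sketch under assumed names ("MyData", GetFilteredItems), not something from the question:

using System.Collections.Generic;
using System.Linq;
using System.Web;
using System.Web.Services;

public partial class ReportPage : System.Web.UI.Page
{
    // Called from the client via PageMethods or a plain AJAX POST;
    // the full dataset never leaves the server.
    [WebMethod(EnableSession = true)]
    public static List<string> GetFilteredItems(string filter)
    {
        var data = HttpContext.Current.Session["MyData"] as List<string>;
        if (data == null)
            return new List<string>();

        // Only the filtered slice is serialized back to the browser.
        return data.Where(item => item.Contains(filter)).ToList();
    }
}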
Instead of loading data from a data-source multiple times, only do it one time. The other answers talk about accessing the data. I have run into instances where I load the data every time I do a post-back.
public string MyString
{
    get
    {
        // Only hit the data source if the value is not already cached in ViewState
        if (this.ViewState["myData"] == null)
        {
            // Load the data one time and keep it in ViewState
            this.ViewState["myData"] = "Hello";
        }
        return this.ViewState["myData"] as string;
    }
}
How much ViewState slows down your page depends upon how much view state you have. I've inherited pages that generated over a megabyte of ViewState and seen the web server spend 10 seconds just processing it. If you don't want to rewrite your application and you really need that much view state, you should investigate alternate strategies for saving/restoring it. Saving ViewState to a database or even a plain file is much faster, since the ViewState no longer has to be streamed to/from the client on each request.
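The hook for an alternate persistence strategy is the page's PageStatePersister property. As a minimal sketch, the built-in SessionPageStatePersister keeps the bulk of the view state in Session on the server (a database- or file-backed store would need a custom persister, but it plugs into the same override point):

using System.Web.UI;

public partial class HeavyPage : Page
{
    // Keep view state server-side instead of emitting it all into the
    // __VIEWSTATE hidden field that travels with every request.
    protected override PageStatePersister PageStatePersister
    {
        get { return new SessionPageStatePersister(this); }
    }
}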
Best strategy is to avoid viewstate in the first place though.
Just thought I should add, some controls are simply ViewState pigs, some grids are just terrible for viewstate consumption.
You can view the source of your page, copy the ViewState value, and use the online ViewState decoder at the URL below to check how large the values stored in your pages' ViewState field are:
http://ignatu.co.uk/ViewStateDecoder.aspx
If you find that your ViewState holds large values, you should find an alternative place to store your DataSet.
In any case, you should avoid putting the DataSet into ViewState.
Related
I have been working with a fairly huge database. I want to populate the web controls (drop-down lists) of the page during the page load event to give the web page as much flexibility as possible. For example, I have a drop-down list, which the user can select from, that is populated from the unique rows of a specific column of a DataTable.
Now, I don't want to fire the Oracle query on every page load, because that slows down the web page significantly (about 1 minute each time). So I started to think of cookies as a solution. I soon found out that the cookie size limit (4 KB) is way too small for my purpose; I would need around 10-15 KB if I really wanted to store the data rows locally. So I searched for a way to increase the cookie size limit to accommodate my needs, and found that the likely solution is localStorage.
Is there really a way to increase the cookie size limitation?
What is the simplest alternative? Is it really localStorage, or is there anything else to look into?
Details: I am using C#/ASP.NET + Oracle.
Here are all storage options for state information in ASP.Net -
Cache - a memory pool stored on the server and shared across users
Session - stored on the server and unique for each user
Cookies - stored on the client and passed with each HTTP request to the server
QueryString - passed as part of the complete URL string
Context.Items - stored on the current HttpContext and lasts only for the lifetime of that request
Profile - stored in a database and maintains information across multiple sessions
Now, I don't want to fire the Oracle query on every page load, because that slows down the web page significantly (about 1 minute each time).
You have two options for your scenario:
If the data is shared across all users, use Cache
If the data is unique to each user, use Session
Ideally, you do not want to store it in ViewState; that will make your page very heavy, unless you configure ViewState to be stored in SQL Server (which is out of the scope of this question).
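As a minimal sketch of the two options (the "CountryLookup" key and the LoadLookupTable helper wrapping the Oracle query are made-up names):

// In the page's code-behind: the slow Oracle query only runs on a cache miss.
private DataTable GetLookupTable()
{
    var table = Cache["CountryLookup"] as DataTable;    // one copy shared by all users
    if (table == null)
    {
        table = LoadLookupTable();                      // hypothetical ~1 minute Oracle query
        Cache.Insert("CountryLookup", table, null,
                     DateTime.UtcNow.AddHours(1),       // refresh at most once an hour
                     System.Web.Caching.Cache.NoSlidingExpiration);
    }
    return table;
}

// If the rows differed per user, Session["CountryLookup"] would take the place of Cache above.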
Update:
localStorage - Not all browsers can handle localStorage, so make sure you check it first.
<script type="text/javascript">
    if (window.localStorage) {
        window.localStorage.setItem('keyName', 'valueToUse');
        // OR
        window.localStorage.keyName = 'valueToUse';
    }
</script>
FYI: ASP.NET does not offer specific methods for handling localStorage; you can only manipulate it on the client side via JavaScript.
You don't want to use cookies.
I had a similar requirement in an application I've been working on. I needed to let a user select a value from ~15,000 choices using an autocomplete input box. I did not want to hit the database each time the component needed to find a value. I decided to use session storage to store all possible values and then feed those values to the autocomplete component. It works great. You could also use local storage.
For my needs, session storage was a better choice because I didn't need the values to persist after the user closed the browser tab. Also, before I store the 15,000 items in session storage, I compress them using JavaScript compression to save space. I'm using lz-string for the compression.
You don't want to use cookies for this purpose. There's not only the size limitation, but that cookie is sent back and forth with every request to the server, which could potentially reduce responsiveness in your application depending on your server and user's bandwidth.
Another alternative is to use ViewState, which stores the data on the page itself. This doesn't have the same size limitation, but it does share the cookie problem of increasing the request size.
You can also use Session which will store the data in memory on the server for each session. This reduces bandwidth requirements, but does take up additional memory space on the server itself, which may not be desirable. Tons of data in memory + tons of sessions could = insufficient memory.
So figure out what your particular tradeoffs are. Viewstate is the most similar to your current solution.
To access ViewState or Session in your page, just use ViewState["KEY"] or Session["KEY"].
Another thing to note is that the Session data is persisted for the entire session and across page loads/redirects/etc. Viewstate is ONLY available for that page. If you go to another page, Viewstate is discarded.
If your data is the same for all users, then you could also use Cache, which stores the data at the application level instead of per session. This still takes up memory on the server, but it will not grow as sessions do (a mere 15 KB per session in your case!). You can also specify an expiration for the cache (either sliding or absolute), which lets the memory be freed automatically if the resource isn't accessed for a certain amount of time. Just make sure you always check whether the data is in the cache before using it, as .NET will discard the entry if it expires or if it needs to reclaim memory for some other purpose.
I'm writing a C#/ASP.NET page, and currently I pull a lot of data (say 100,000 DateTime/int pairs) out of a database on every page load; loading the same data on each page load seems a little silly. I've considered storing the data in the session, but multiple browser tabs seem to be causing an issue.
Is there a better way to store these values, and is using the session appropriate if I need to support multiple browser tabs?
Your session eats up a lot of server memory. You could store your session state in a database (which somewhat defeats the purpose, since avoiding the database round trip is why you wanted the data in the session in the first place), but reading between the lines (why would each visitor need 100,000 unique datetime pairs?) I think you should really look into storing these values in the cache (if they're the same for all users).
The problem with storing 100k int pairs in the session is that it might work for 1 or 2, hell, even a dozen users maybe. But when your website gets popular it's not scalable at all. Your server won't be able to store 500,000 user sessions of 100k int pairs each; you'll run out of memory pretty quickly.
If this data is unique per user, then the session might be a valid place to cache it. Although, if you do this, be aware that anything you add to the session stays there and will need removing. Also, since you are storing a lot of data per user session, you consume a lot of server memory. That's what Steve is getting at with his answer: you could easily find yourself running out of room with multiple users holding massive session data.
If the data does not vary per user, then the cache is the answer!
The session could become unwieldy. You may want to consider a master page to wrap your pages in. You could then load the data once in the master and keep it available across the pages, or place it into a hidden object on the page (such as a literal) to be able to access it.
Why do you need all of the pairs between pages?
What about using caching instead of session variables: http://msdn.microsoft.com/en-us/library/aa478965.aspx
It was suggested that the size of the keys used in ViewState could cause a performance issue: the view state would be larger, increasing the page size and therefore the rendering time.
While I can see that a larger key may result in an increase in the view state size, I am not so sure the impact is that significant.
As an example does ViewState["MySpecialProperty"] result in a larger ViewState than ViewState["x"]? And if it does, is the difference really significant enough to be a concern in a standard web app.
The length of the ViewState key does affect the size of the ViewState, but only minimally. From small tests I did comparing 'MySpecialProperty' vs 'x', the difference was about 20 characters per entry. Note that it could add up if you had many lengthy keys.
However, you should focus on the values stored in the ViewState rather than the keys, since they consume much more space.
Some references:
http://www.codeproject.com/Articles/101888/ViewState-Various-ways-to-reduce-performance-overh
As an example does ViewState["MySpecialProperty"] result in a larger ViewState than ViewState["x"]?
Yes, the resulting ViewState would be larger as key/value pairs get encoded.
Is the difference really significant enough to be a concern in a standard web app?
IMO, no. I'd worry more about how many pairs you put into ViewState and whether you actually need to pass that information on to the client.
Straight from the horse's mouth: MSDN ViewState
...The view state of a page is, by default, placed in a hidden form field named __VIEWSTATE. This hidden form field can easily get very large, on the order of tens of kilobytes. Not only does the __VIEWSTATE form field cause slower downloads, but, whenever the user posts back the Web page, the contents of this hidden form field must be posted back in the HTTP request, thereby lengthening the request time, as well...
The Cost of View State
Nothing comes for free, and view state is no exception. The ASP.NET view state imposes two performance hits whenever an ASP.NET Web page is requested:

On all page visits, during the save view state stage the Page class gathers the collective view state for all of the controls in its control hierarchy and serializes the state to a base-64 encoded string. (This is the string that is emitted in the hidden __VIEWSTATE form field.) Similarly, on postbacks, the load view state stage needs to deserialize the persisted view state data and update the pertinent controls in the control hierarchy.

The __VIEWSTATE hidden form field adds extra size to the Web page that the client must download. For some view state-heavy pages, this can be tens of kilobytes of data, which can require several extra seconds (or minutes!) for modem users to download. Also, when posting back, the __VIEWSTATE form field must be sent back to the Web server in the HTTP POST headers, thereby increasing the postback request time.

If you are designing a Web site that is commonly accessed by users coming over a modem connection, you should be particularly concerned with the bloat the view state might add to a page. Fortunately, there are a number of techniques that can be employed to reduce view state size. We'll first see how to selectively indicate whether or not a server control should save its view state. If a control's state does not need to be persisted across postbacks, we can turn off view state tracking for that control, thereby saving the extra bytes that would otherwise have been added by that control. Following that, we'll examine how to remove the view state from the page's hidden form fields altogether, storing the view state instead on the Web server's file system.
From here:
By default, view state data is stored in the page in a hidden field and is encoded using base64 encoding. In addition, a hash of the view state data is created from the data by using a machine authentication code (MAC) key. The hash value is added to the encoded view state data and the resulting string is stored in the page.
So, calculating byte by byte, yes, your view state would be longer when you use a longer key for an entry, because more characters have to be base64-encoded. But this is never a big concern, because the size of the key compared to the size of the data is (usually) very small.
Well, yes, it affects the size of the ViewState. As you mention, it is not significant if you only add one variable to the ViewState; on the other hand, if you start naming variables like x, reading the code will be a pain, so it's better to strike a balance in favor of code readability.
Before considering things like this to improve performance, create some load tests and base the decision on the results.
Performance of ASPX pages is really important; take a look at the 8-second rule.
However, you can also improve ViewState-related performance in other ways, basically by disabling ViewState for controls that do not need it.
For more information:
http://www.guidanceshare.com/wiki/ASP.NET_2.0_Performance_Guidelines_-_View_State
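For instance (a sketch with made-up control names), view state can be switched off per control, either with EnableViewState="false" in the markup or from code-behind:

protected void Page_Init(object sender, EventArgs e)
{
    // These controls are re-bound on every request anyway, so keeping
    // their contents in __VIEWSTATE buys nothing.
    ResultsGridView.EnableViewState = false;
    CountryDropDownList.EnableViewState = false;
}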
I was wondering if I should be caching the objects returned from my DAL in some way? I may have multiple UI controls calling for the same data in a single load of the page.
What would you guys recommend? Am I being a little too cautious? It's not a terrible amount of data. But if I should be caching in some way, what would be the recommended approach?
You could cache, AND if you really have multiple controls on the same page using the same data, you can fetch the data once in the parent page and pass a reference to it to each control via a setter on each control (rather than have each control pull the same data from the DAL itself), e.g.:
myControl.AllUsers = _allUsers;
....
myOtherControl.AllUsers = _allUsers;
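On the user control side, AllUsers is just a property the page assigns before the control binds. A sketch, assuming the data is a DataTable and the control contains a grid named UsersGridView:

// Inside the user control's code-behind.
public DataTable AllUsers { get; set; }

protected void Page_Load(object sender, EventArgs e)
{
    // The parent page has already assigned AllUsers, so no DAL call happens here.
    UsersGridView.DataSource = AllUsers;
    UsersGridView.DataBind();
}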
I also agree with #DanielHilgarth. Caching adds complexity (when to refresh the cache, for example). If the page loads quickly anyway, I wouldn't bother.
If the page is slow, database calls in loops are often the culprit in my experience.
It depends on whether it's safe and necessary to do so. If the data you are working with does not require real-time accuracy (e.g. your blog), then by all means cache if you feel it's necessary (meaning your site is running slow).
A problem with caching that people often forget to account for is being able to clear the cache on demand when something requires an immediate response (for example, you ban a user or update payment gateway information).
There are two main types of caching: sliding cache and fixed-time cache.
Sliding cache (cache whose expiration gets extended each time a valid retrieval is performed) is great for resources that are relatively cheap to compute but may suffer from database/network overhead. Cache for 1 hour (or whatever) on a sliding basis, and then manually invalidate (remove) the cache entry whenever an INSERT/UPDATE/DELETE occurs for the DAO. This way the user sees real-time results, yet the data is in fact served from cache whenever possible.
Fixed-time cache is great for resources that are expensive to produce (e.g. a very complex stored procedure) and do not require real-time accuracy. Cache for 1 hour (or whatever) the first time the resource is requested, and do not clear the cache until that hour is up. INSERT/UPDATE/DELETE is ignored by your cache mechanism (unless it's absolutely necessary).
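With the ASP.NET cache, the two flavors map directly onto the Cache.Insert parameters. A sketch with placeholder keys and data:

// Sliding: each read extends the lifetime by another hour.
Cache.Insert("Users", allUsers, null,
             System.Web.Caching.Cache.NoAbsoluteExpiration, TimeSpan.FromHours(1));
Cache.Remove("Users");   // call this from your INSERT/UPDATE/DELETE path to invalidate

// Fixed-time: expires exactly one hour after it was cached,
// no matter how often it is read in between.
Cache.Insert("SlowReport", reportData, null,
             DateTime.UtcNow.AddHours(1), System.Web.Caching.Cache.NoSlidingExpiration);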
To do so you can look at this library:
http://www.reactiveui.net/
It provides a neat and clean way to cache your objects.
I understand that storing a DataTable in a session variable in ASP.NET is bad since it uses a lot of server memory. What I don't understand is what you are supposed to do in this situation:
User comes to a page where it requires to load a DataTable object (from SQL Server).
User clicks on radio button for simple event (Ex. some controls get disabled).
If you don't save the DataTable object in the session, don't you have to load it from SQL Server again upon postback on the same page, instead of just fetching it from the session?
Thanks for the help.
DataTables are pretty heavy objects and are not recommended to be stored in ViewState, or in Session for that matter. The scenario you describe is about caching data. So, why not use ASP.NET's cache?
ViewState, while it does not use as much memory on the server as Session or Cache, still requires serialization/deserialization on the server, requiring some temporary memory usage, in addition to providing your users a large payload of data on each request to/from the server (just take a peek at View Source in any browser and you'll see a very large hidden input with base-64 encoded data). If you don't use encryption, anyone can decode that data being delivered in each request, causing a potential security problem if any of that data is sensitive. ViewState is also meant for small amounts of data and is usually best to stick to the primary data types like ints and strings.
Session generally isn't a good idea either as it also requires serialization/deserialization, which adds additional overhead in addition to the strain on memory per user. Session has memory limits that decrease per user as you increase concurrent users. Session data does not "expire" until the actual session expires for each user, which by default is 30 minutes. Session is great for user-specific data, but is recommended to keep very small and again stick to the primary data types like ints and strings.
Cache does not serialize any data and is limited in size only due to the bitness of the OS. On Windows 2003 32-bit, you have 800 MB total application pool size to work with (1.2 or 1.3 GB if you use the /3GB switch). Under 64-bit, there's much more freedom and limitations are realistically only what you configure up to the amount of available system memory. A benefit of cache is that as memory pressure increases, cache can be expired to free memory for more important things. You also have control as to when items get expired when memory pressure isn't a factor (expiry's are not guaranteed). Take an additional step and you can put a cache dependency on data in the database, if using SQL Server, allowing the data itself to decide when to expire your cache, ensuring fresh data.
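As a sketch of that last point (it assumes SQL Server cache dependencies have already been enabled via aspnet_regsql and web.config; the database entry and table names are placeholders):

// In page or control code-behind: the cached table is evicted automatically
// whenever rows in the Products table change.
var dependency = new System.Web.Caching.SqlCacheDependency("MyDatabase", "Products");
Cache.Insert("ProductTable", productTable, dependency);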
Lastly, the often forgotten about Application object can be used, but only for data that you know can be shared across users and does not need to change that often (hopefully not until an application restart).
Use Microsoft's documentation for ViewState, Session, Cache, and Application objects, to determine the wisest use of each for your particular scenario. A combination of using these correctly in addition to using AJAX (to avoid full page postbacks) and HTTP compression to reduce the payload delivered to the client can make for a very responsive site.
In more complex scenarios like Web farms and load balancing, there are additional issues to think about. Questions you will need to ask yourself will be things like: Should a new session be created if a user hits a different server than the originally requested one? Should cache work no matter what server a user hits? These questions will bring you to solutions that may change where you store data. InProc Session is more forgiving than using, say SQL Server, as a session server, as there are additional serialization restrictions.
Another way to store the DataTable, if you only need it at page level, is in ViewState: ViewState["dtbl"] = myDataTable;
And you can read it back from ViewState simply with: DataTable dtbl = (DataTable)ViewState["dtbl"];