I need to display a fairly large amount of data in a GridView (around 1,000 rows by 10-20 columns), and the first rendering is extremely slow in IE8 (even with compatibility mode enabled). The same page loads very quickly in Firefox and Chrome, but unfortunately I have to target IE for this project.
What can I do to improve IE's behavior?
You already know that rendering will be slow for a large data source. :)
You can try the answers on this post:
Why do my ASP.NET pages render slowly when placed on the server?
In particular, have a look at this answer: https://stackoverflow.com/a/730732/448407
But before all of that, why not use paging in the GridView?
Paging lets the page open faster because there is less data to render, but on its own it is not a performance boost at the database level. For that you need custom paging:
http://www.aspsnippets.com/Articles/Custom-Paging-in-ASP.Net-GridView-using-SQL-Server-Stored-Procedure.aspx
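To illustrate the simpler built-in variant first, here is a minimal sketch. It assumes a GridView named GridView1 with AllowPaging="True" and PageSize="50" set in the markup, and a hypothetical BindGrid() helper that runs the query and binds the result:

protected void Page_Load(object sender, EventArgs e)
{
    // Built-in GridView paging: only one page of rows is rendered per request,
    // though the full result set is still pulled from the database each time.
    if (!IsPostBack)
        BindGrid(); // hypothetical helper: queries the data and calls DataBind()
}

protected void GridView1_PageIndexChanging(object sender, GridViewPageEventArgs e)
{
    GridView1.PageIndex = e.NewPageIndex;
    BindGrid();
}

Custom paging, as in the linked article, goes one step further and fetches only the current page's rows from SQL Server via a stored procedure.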
Are you using JavaScript to render the page, or is the whole HTML coming from the server?
If JavaScript, then you need to switch to server-side rendering. Maybe use a DataGrid on the server.
If you have a large amount of CSS, especially CSS classes defined as .parentClass .childClass {....}, it performs worse in IE.
Another possibility is that your page is downloading a lot of scripts, CSS, and images. IE is usually slower than Firefox and Chrome at fetching many external resources.
So, the suggestions would be to:
Render the HTML directly from the server.
Set EnableViewState = false on the grid (a minimal sketch follows this list).
Clean up the CSS.
Reduce the number of scripts, stylesheets, and images.
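As a minimal sketch of the ViewState suggestion, using GridView1 from the question (and assuming you re-bind the grid on every request, which disabling ViewState requires):

protected void Page_Init(object sender, EventArgs e)
{
    // With ViewState off, the grid's rows are no longer serialized into the
    // page, which can shrink the rendered HTML considerably for 1000 rows.
    // Trade-off: you must re-bind the data on every request, even postbacks.
    GridView1.EnableViewState = false;
}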
Let me know if this helps. If not, please provide the HTML output from your page.
We have the following scenario: we've developed around 400 personal sites and we are currently trying to build our portfolio. For several reasons we would like to display each site's index page in the portfolio. Our first thought was to programmatically take screenshots of every site, but the heads of our company promptly rejected that because they want to show the sites live. Iframes are apparently not an alternative either. So we have to download each index page, ideally with only the styles and images needed to display it properly.
I am unsure how to start doing this.
Do you guys have any ideas?
The underlying technology of CodedUI (and Selenium) uses a web crawler to isolate specific useful parts of a web page. I recommend using that underlying library to crawl your webpages running live, and extract whatever images and divs make up your page structure.
You can then emit these as static HTML to make page snapshots suitable for a site index.
Doing it this way means you will be using the same technology as you use for test automation, but instead of running tests, you can extract the useful structure from your HTML and emit it as a page snapshot. You will have to mark the "useful" parts of your HTML to enable the crawler to extract just the items you think should be indexed (e.g. include a data- attribute if you're using HTML5). This might be a lot of work, so if you just need a screenshot of each of your pages, just use Selenium or CodedUI to crawl your sites and capture the screen image.
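If the screenshot route is acceptable after all, a minimal sketch using the Selenium WebDriver .NET bindings might look like the following (the site URLs and file names are placeholders, and the exact SaveAsFile overloads vary between Selenium versions):

using OpenQA.Selenium;
using OpenQA.Selenium.Chrome;

class SiteSnapshots
{
    static void Main()
    {
        // Placeholder list; in practice, load your 400 site URLs from a file.
        string[] sites = { "http://example-site-1.test", "http://example-site-2.test" };

        using (IWebDriver driver = new ChromeDriver())
        {
            for (int i = 0; i < sites.Length; i++)
            {
                driver.Navigate().GoToUrl(sites[i]);

                // Capture the rendered page and save it as a PNG.
                Screenshot shot = ((ITakesScreenshot)driver).GetScreenshot();
                shot.SaveAsFile("site" + i + ".png");
            }
        }
    }
}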
At our company, I need to keep a web page up with a list of open orders for our employees.
In the code behind, I filter the search based on what items are selected from a DropDown list ddlList1 and a TextBox txtSearch, sort of like this:
string sql;
if (!String.IsNullOrEmpty(ddlList1.SelectedValue) &&
    !String.IsNullOrEmpty(txtSearch.Text)) {
    sql = string.Format("{0}={1}", ddlList1.SelectedValue, txtSearch.Text);
} else {
    sql = null;
}
GridView1.DataSource = db.Select(sql);
GridView1.DataBind();
Management wants this data to be up to date, and never over 10 minutes old.
I am not sure how to do this. Most of my coding experience is with Windows Forms.
So far, I have found a way to refresh the page using the META tag:
<meta http-equiv="refresh" content="600;Summary.aspx" />
However, it appears that the W3C recommends against using the refresh value:
Note: The value "refresh" should be used carefully, as it takes the control of a page away from the user. Using "refresh" will cause a failure in W3C's Web Content Accessibility Guidelines.
Ref: HTML meta http-equiv
So, what is the recommended way to refresh my data?
If it helps, our Server is an older SQL 2000 machine.
[Note: I found this question on SO where someone suggested using an AJAX UpdatePanel. My project, at this time, has no AJAX controls in it. Should I avoid the complexity of AJAX (downloading the latest package, installing it into VS2010, adding it to my project's list of references, then referencing AJAX in every page that uses it), or just bite the bullet?]
If you don't really do much of anything else on this page, you might as well just leave the auto-refresh on. Yes, it's annoying, and I hate it and think it shouldn't exist, but I won't be the one using the site, and if that's really what the client wants, that's what the client gets.
You can also use JavaScript or ASP.NET controls to force a refresh rather than the HTTP meta tag, but they come with exactly the same problems.
If you go the route of using update panels, you get a few advantages. First, and possibly most importantly, if you have much content on the page other than the GridView being updated, you aren't rendering it repeatedly. That could be a significant reduction in server load, or a minimal one, depending on what your whole page looks like. Second, the visual impact of the update is more subtle (and customizable) for the user: if they want the site to just flicker and be up to date, an update panel will do that without a cursor change, spinners all over, etc. (You can also add such cues back in if you want it to be apparent to users when the update panel is posting back.) A rough sketch follows.
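Assuming the markup wraps GridView1 in an <asp:UpdatePanel> with an <asp:Timer> (interval 600000 ms, i.e. 10 minutes) registered as its trigger, and a hypothetical BindGrid() helper, the code-behind could be as small as:

protected void RefreshTimer_Tick(object sender, EventArgs e)
{
    // Fires every 10 minutes via the Timer trigger; only the UpdatePanel's
    // content is re-rendered, not the whole page.
    BindGrid(); // hypothetical helper: re-runs the query and rebinds GridView1
}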
You can use AJAX to get the data from the server and populate the HTML container with JavaScript on the client. I don't mean Microsoft's update panels, but raw XMLHttpRequest.
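The server half of that approach could be a lightweight handler. A hedged sketch, where Orders.ashx, the OrderId column, and the GetOpenOrders() helper are all hypothetical:

using System.Data;
using System.Web;

// Orders.ashx: returns just the order rows as an HTML fragment for a
// client-side XMLHttpRequest to drop into a container element.
public class OrdersHandler : IHttpHandler
{
    public void ProcessRequest(HttpContext context)
    {
        context.Response.ContentType = "text/html";
        DataTable orders = GetOpenOrders(); // hypothetical data-access helper
        foreach (DataRow row in orders.Rows)
        {
            context.Response.Write("<tr><td>"
                + HttpUtility.HtmlEncode(row["OrderId"].ToString())
                + "</td></tr>");
        }
    }

    public bool IsReusable { get { return true; } }

    private DataTable GetOpenOrders()
    {
        return new DataTable(); // placeholder: query the database for open orders
    }
}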
If you're looking for a simple workaround, try this:
<script>
//Refreshing the page every 5 minutes
setInterval("location.reload()", 300000);
</script>
Is the best approach to loading JavaScript files to put them in an iframe and then embed that in ASP.NET pages? I have read somewhere that this helps boost page-loading times.
Best is to put script references at the bottom of the page. This ensures that all content is loaded before the scripts. Don't use the iFrame unless needed.
When looking at page load times, it depends on where your code is stored. If it is in the HTML, it will be reloaded each time the page is loaded. If you move it out to a .js file, it will be loaded the first time and cached after that, so it shouldn't affect load times.
You can use jQuery's $(document).ready to make sure that all the elements are loaded before running any code.
Don't go for the iframe approach. Put the scripts as close to the bottom of your HTML as possible and your CSS as high as possible. Also try to write unobtrusive JavaScript if possible; jQuery is great at this, so you may well want to look into it. A further benefit of jQuery is that it's also hosted on CDN servers, which speeds up performance.
I found these guidelines to be a great help when I was upgrading the performance of one of my former projects at a client: Best Practices for Speeding Up Your Web Site.
Firebug in combination with YSlow is also a great tool.
Is it possible to compress JavaScript files, or anything else related to a web page, before sending it to the client?
I am using Telerik controls and found that they emit a lot of extra JavaScript, which makes the page size huge (around 500KB).
If you are using IIS7, it has built-in support for compression. Highlight the web application folder (or even the website) in the treeview of IIS Manager, select Compression in the IIS panel of the next pane, then select Open Feature in the right-hand pane. You then have two checkboxes to enable compression on static and dynamic content.
Be aware, though, that this may not be a silver bullet: it will increase load on the server, and it will increase load on the client as the browser unzips the content. 500KB is a moderately sized page, but it isn't big. Compression like this is usually only beneficial when the network pipe is the bottleneck, which it seldom is these days. Your issue may have more to do with lots of JavaScript running during the page's onload; if you see a noticeable difference in speed between IE7 and IE8, that can be an indication of this problem.
You can combine and minify your *.js and *.css files with http://github.com/jetheredge/SquishIt/
But I don't know if it can help you to compress telerik's scripts.
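For what it's worth, a minimal SquishIt sketch (the script paths are placeholders, and you should check the project's README in case the API has changed):

using SquishIt.Framework;

// Combine and minify several scripts into one cache-busted file; Render
// returns the <script> tag to emit (the '#' is replaced with a content hash).
string scriptTag = Bundle.JavaScript()
    .Add("~/scripts/jquery.js")
    .Add("~/scripts/site.js")
    .Render("~/scripts/combined_#.js");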
GZIP, minification, and packing, provided you have access to the .js files. You can do this one-off or programmatically before sending them to the client.
Check this out.
http://www.julienlecomte.net/blog/2007/08/13/
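As a hedged sketch of the programmatic GZIP option mentioned above (placed in Global.asax; assumes IIS isn't already compressing the response for you):

using System;
using System.IO.Compression;
using System.Web;

// Global.asax code-behind: gzip every response when the browser supports it.
protected void Application_BeginRequest(object sender, EventArgs e)
{
    HttpApplication app = (HttpApplication)sender;
    string acceptEncoding = app.Request.Headers["Accept-Encoding"] ?? "";
    if (acceptEncoding.Contains("gzip"))
    {
        app.Response.Filter =
            new GZipStream(app.Response.Filter, CompressionMode.Compress);
        app.Response.AppendHeader("Content-Encoding", "gzip");
    }
}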
One of my friends is working on a good solution for generating .aspx pages out of the HTML pages produced by a legacy ASP application.
The idea is to run the legacy app, capture the HTML output, clean the HTML using some tool (say, HTML Tidy), and parse/transform it into .aspx (using XSLT or a custom tool) so that the existing HTML elements, divs, images, styles, etc. get converted neatly to an .aspx page (too much? ;)).
Are there any existing tools/scripts/utilities to do the same?
Here's what you do.
Define what the legacy app is supposed to do. Write down the scenarios of getting pages, posting forms, navigating, etc.
Write unit test-like scripts for the various scenarios.
Use the Python HTTP client library to exercise the legacy app in your various scripts.
If your scripts work, you (a) actually understand the legacy app, (b) can make it do the various things it's supposed to do, and (c) you can reliably capture the HTML response pages.
Update your scripts to capture the HTML responses.
You have the pages. Now you can think about what you need for your ASPX pages.
Edit the HTML by hand to make it into ASPX.
Write something that uses Beautiful Soup to massage the HTML into a form suitable for ASPX. This might be some replacement of text or tags with <asp:... tags.
Create some other, more useful data structure out of the HTML -- one that reflects the structure and meaning of the pages, not just the HTML tags. Generate the ASPX pages from that more useful structure.
Just found the HTML Agility Pack to be useful enough, as they understand C# better than Python.
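For example, a minimal HTML Agility Pack sketch for loading a captured page and inspecting its structure (the file names and XPath are illustrative):

using System;
using HtmlAgilityPack;

class LegacyPageInspector
{
    static void Main()
    {
        // The Agility Pack tolerates the malformed HTML legacy ASP tends to emit.
        var doc = new HtmlDocument();
        doc.Load("captured-page.html"); // hypothetical captured output file

        // SelectNodes returns null when nothing matches, so guard against that.
        var divs = doc.DocumentNode.SelectNodes("//div");
        if (divs != null)
        {
            foreach (HtmlNode div in divs)
                Console.WriteLine(div.GetAttributeValue("class", "(no class)"));
        }

        doc.Save("converted-page.aspx"); // a starting point for the .aspx version
    }
}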
I know this is an old question, but in a similar situation (50k+ legacy ASP pages that need to display in a .NET framework), I did the following.
Created a rewrite engine (HttpModule) which catches all incoming requests and looks for anything that is from the old site.
(In a separate class: keep things organized!) Used WebClient or HttpWebRequest, etc., to open a connection to the old server and download the rendered HTML.
Used the HTML Agility Pack (very slick) to extract the content I'm interested in; in our case, this is always inside of a div with the class "bdy".
Threw this into a cache, a SQL table in this example.
Each hit checks the cache and either (a) retrieves the page from the old server and builds the cache entry, or (b) just gets the page from the cache.
An .aspx page built specifically for displaying legacy content receives the rewritten request and displays the relevant content from the legacy page inside an asp:Literal control. (A condensed sketch of the module follows.)
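A condensed, hedged sketch of that pipeline (the PageCache class, the legacy host name, and LegacyPage.aspx are hypothetical stand-ins for the real pieces):

using System;
using System.Net;
using System.Web;
using HtmlAgilityPack;

public class LegacyAspModule : IHttpModule
{
    public void Init(HttpApplication app)
    {
        app.BeginRequest += (sender, e) =>
        {
            HttpContext ctx = ((HttpApplication)sender).Context;
            if (!ctx.Request.Path.EndsWith(".asp", StringComparison.OrdinalIgnoreCase))
                return; // not a request for the old site

            // Check the SQL-backed cache first (hypothetical helper class).
            string content = PageCache.Get(ctx.Request.RawUrl);
            if (content == null)
            {
                // Download the rendered HTML from the old server.
                string html;
                using (var client = new WebClient())
                    html = client.DownloadString("http://legacy-server" + ctx.Request.RawUrl);

                // Extract just the div with class "bdy".
                var doc = new HtmlDocument();
                doc.LoadHtml(html);
                HtmlNode bdy = doc.DocumentNode.SelectSingleNode("//div[@class='bdy']");
                content = bdy != null ? bdy.InnerHtml : html;

                PageCache.Put(ctx.Request.RawUrl, content);
            }

            // Hand the content to the display page via the rewrite.
            ctx.Items["LegacyContent"] = content;
            ctx.RewritePath("~/LegacyPage.aspx");
        };
    }

    public void Dispose() { }
}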
The cache is there for performance: since the first request for a given page involves a minimum of two hits (one from the browser to the new server, one from the new server to the old server), I store cacheable data on the new server so that subsequent requests don't have to go back to the old server. We also cache images, CSS, scripts, etc.
It gets messy when you have to handle forms, cookies, etc, but these can all be stored in your cache and passed through to the old server with each request if necessary. I also store content expiration dates and other headers that I get back from the legacy server and am sure to pass those back to the browser when rendering the cached page. Just remember to take as content-agnostic an approach as possible. You're effectively building an in-page web proxy that lets IIS render old ASP the way it wants, and manipulating the output.
Works very well: I have all of the old pages working seamlessly within our ASP.NET app. This saved us a solid year of development time that would have been required if we had had to touch every legacy ASP page.
Good luck!