It's easy enough to record how long a page takes to load using Google Analytics Events, and I can find plenty of examples of how to do that by searching. The problem is that most of these methods record the start time as the moment the page starts to load, so in effect all they tell you is how long the page took to render in the browser. I want to time the full page lifecycle, i.e. from when the request begins until the browser has completely rendered the page to the user.
Anyone know how to do that with GA?
Is there any way to get the request start time from the browser, rather than having to record a timestamp in JavaScript?
EDIT: The prior answer was from before Google Analytics released its Site Speed feature; it's far preferable to use the built-in feature, which uses the HTML5 Navigation Timing API.
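For reference, you can also read the Navigation Timing numbers directly if you want the raw request-to-render measurement yourself. A minimal sketch using the legacy performance.timing object that the API originally defined:

window.addEventListener('load', function () {
  // loadEventEnd is only populated once the load event has finished,
  // hence the setTimeout(0).
  setTimeout(function () {
    var t = performance.timing;
    // navigationStart is when the request began, before any JS ran.
    var fullLifecycleMs = t.loadEventEnd - t.navigationStart;
    var networkMs = t.responseEnd - t.navigationStart;
    console.log('full lifecycle: ' + fullLifecycleMs + 'ms, network: ' + networkMs + 'ms');
  }, 0);
});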
By default, the Site Speed feature is enabled and samples 1% of traffic.
To increase the sample rate, just add this line before your _trackPageview call, setting the second argument to the percentage of your traffic you'd like to track (though Google will only record up to 10,000 visits):
_gaq.push(['_setSiteSpeedSampleRate', 50]); //50%
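For context, placement in the classic ga.js snippet looks roughly like this (UA-XXXXX-Y is a placeholder for your own property ID):

var _gaq = _gaq || [];
_gaq.push(['_setAccount', 'UA-XXXXX-Y']);   // your property ID
_gaq.push(['_setSiteSpeedSampleRate', 50]); // must come before _trackPageview
_gaq.push(['_trackPageview']);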
Related
I created a website in ASP.NET MVC (C#). Locally it behaves as expected, but once I upload it to Azure it starts loading slowly (the initial load), and each page takes its own time. I've enabled the Always On feature, but it didn't do much good. Now the question is whether there is a way to force the whole web application to be built on first access, instead of the page-by-page mode that is currently active.
If you mean the initial request takes long and then any subsequent requests are fast, that would be because you are hitting the .NET app's cold start. Many .NET apps are slow to JIT and load all their .NET dependencies, but once everything is loaded, they are fast.
As scheien said, please choose the closest region when choosing your App Service plan.
If pages are still loading slowly, I would suggest using the Windows Internet Explorer Developer Tools Network Capture to find the detailed issue: https://msdn.microsoft.com/en-us/library/gg130952(v=vs.85).aspx. If it is related to your application, please optimize the code.
Regards
I followed this guide below:
http://googleanalyticssdk.codeplex.com/wikipage?title=Getting%20Started&referringTitle=Overview
It teaches me the Google Analytics basics. Over in the Standard Reports I am able to see my real-time screens and the events that happened, but it is only real time; how can I see results from before that? The real-time view only shows the last 30 minutes. Is the link teaching me the right way to store data in Google Analytics? Where can I retrieve past events?
Check out this image:
http://i62.tinypic.com/dngl05.png
The real-time reports are just that: what's happening on your site right now, in real time. But you may have noticed that there isn't a lot of data there. This is because it takes time for Google to process and analyse the data that is sent from your site. That's why it can take up to 24 hours before Google is done processing the data for display in the standard reports.
Google Analytics generally updates your reports every 24 hours, so it can take at least that long for data to appear in your account after you first install the tracking code.
That comes directly from Google's help article, "Data delay after adding the tracking code".
Personal experience has shown that you can sometimes see data as soon as 4-8 hours later, but I think that depends a lot on the amount of data the site/app sends. It's not something I would depend on; it's best to just assume that you can first see today's data tomorrow.
I am doing some web scraping for a research project and am hitting some bandwidth limitations. Due to the nature of my work, this has to be done through a web browser control (GeckoFX for C#). Because of this, I cannot control which images get loaded.
My question is: in Windows, is there any way to force certain images not to load? I know whole hosts can be blocked via the hosts file, but that does not work for specific images on a page.
Ideally such a tool would support regex/wildcard patterns for specifying blocked image sets.
You can use Fiddler (or FiddlerCore) as a proxy; it lets you do pretty much anything with each request. In your case you may want to issue an additional HEAD request in the script for image requests and see if the size is acceptable; if not, fail the original request, as in the sketch below.
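FiddlerCore itself is a .NET library, but the idea is small enough to sketch. Here is the same approach as a minimal Node.js forward proxy (HTTP only; the size threshold and image pattern are assumptions):

var http = require('http');

var MAX_IMAGE_BYTES = 50 * 1024;                // assumed size threshold
var IMAGE_RE = /\.(png|jpe?g|gif|webp)(\?|$)/i; // assumed image filter

function forward(url, clientReq, clientRes) {
  var upstream = http.request(url, { method: clientReq.method, headers: clientReq.headers }, function (res) {
    clientRes.writeHead(res.statusCode, res.headers);
    res.pipe(clientRes);
  });
  upstream.on('error', function () { clientRes.writeHead(502); clientRes.end(); });
  clientReq.pipe(upstream);
}

http.createServer(function (clientReq, clientRes) {
  var url = new URL(clientReq.url); // proxied requests carry absolute URLs
  if (!IMAGE_RE.test(url.pathname)) return forward(url, clientReq, clientRes);
  // For images, issue a HEAD request first and block anything too large.
  var head = http.request(url, { method: 'HEAD' }, function (headRes) {
    var size = parseInt(headRes.headers['content-length'] || '0', 10);
    if (size > MAX_IMAGE_BYTES) { clientRes.writeHead(403); clientRes.end(); }
    else forward(url, clientReq, clientRes);
  });
  head.on('error', function () { clientRes.writeHead(502); clientRes.end(); });
  head.end();
}).listen(8888); // point the browser control's proxy settings here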
I am currently building an ASP.NET C# website for a client of mine to promote their band. On there they would like to have a web player which continues to play music as people browse the site. The player is located on the master page so it is included on every page, but it stops playing every time the site does a postback or refreshes in any way.
I think it is possible to achieve this using iframes or AJAX, although I don't have a lot of knowledge of either. The site is only about 6 pages with mainly static information, so the only postbacks/refreshes will be done by using the navigation menu to load each page.
My question to you is:
How can I achieve this?
What would be the easiest method, and what are the pros and cons?
Are there any other/better ways of achieving this than iframes or AJAX?
I can provide some code if needed.
Thanks,
Seb
The easiest way to achieve this would probably be to have your site put itself into a frame, where another (very small) frame on the page hosts the media player. That way when people switch pages, the frame with the media player is left alone. However, this will probably come out feeling pretty clunky at best.
The best way is probably to use AJAX. All of your page navigation would happen as AJAX requests, so the user never technically leaves the original page. An iframe will probably be necessary for tracking history so the user can click "back" and have the browser do what they expect, but you can find libraries that will take care of that aspect for you. The media player should probably use a different subdomain for its source than the rest of the site content, because most browsers max out at two simultaneous connections to the same domain--your site could feel sluggish if one of these two connections is being used for the music stream at all times.
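For illustration, a minimal sketch of AJAX navigation with fetch and the HTML5 History API (which has largely replaced the iframe trick for back-button support). It assumes each page keeps its swappable markup in a div with id "content" and that menu links carry the class "nav-link":

function loadPage(href, push) {
  fetch(href)
    .then(function (res) { return res.text(); })
    .then(function (html) {
      var doc = new DOMParser().parseFromString(html, 'text/html');
      // Swap only the content area; the player elsewhere on the page keeps playing.
      document.getElementById('content').innerHTML = doc.getElementById('content').innerHTML;
      if (push) history.pushState(null, '', href);
    });
}

document.addEventListener('click', function (e) {
  var link = e.target.closest('a.nav-link');
  if (!link) return;
  e.preventDefault(); // stop the full-page navigation
  loadPage(link.href, true);
});

// Re-fetch content (instead of reloading) when the user clicks "back".
window.addEventListener('popstate', function () {
  loadPage(location.href, false);
});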
I am trying to monitor genuine page hits. Here is what my site does: I have an article directory where people can post articles. When an article is posted, the author is paid depending on the number of unique users who visit their pages, so page hits are important. Here is the problem I am facing.
What I need:
I don't want to track page hits by minor search engines or robots.
I would like the four major search engines to crawl my site, because I can identify them by IP address and not count their visits as page hits. The same cannot be done for spam bots, because they do a good job of passing as a real human or a major search engine.
Problems:
There are spam bots on the internet that do not honor the robots.txt file.
There are bots that try to fake being a real human user by manipulating the user agent and other parts of the header.
Performance may suffer if I am always checking the database for known-good IP addresses.
A human being can pass the captcha and then let their robot view my pages.
Possible solutions:
Require a captcha on every page. If the captcha passes, log the IP address as good or set a cookie on the user's machine indicating they passed.
Allow the major search engines' IP addresses, so they will not be presented with a captcha (they can be verified without maintaining an IP list; see the sketch after this list).
Purchase a bot detection software
Require the viewer to pass a captcha every 7 days.
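On the search-engine point: the big crawlers document a reverse-DNS handshake, so you can verify them rather than tracking IPs by hand. A minimal sketch for Googlebot, assuming a Node backend:

var dns = require('dns').promises;

async function isVerifiedGooglebot(ip) {
  try {
    var hosts = await dns.reverse(ip);    // IP -> hostname
    var host = hosts[0] || '';
    if (!/\.(googlebot|google)\.com$/i.test(host)) return false;
    var addrs = await dns.resolve4(host); // hostname -> IPs
    return addrs.indexOf(ip) !== -1;      // must round-trip to the same IP
  } catch (err) {
    return false; // unresolvable addresses are treated as unverified
  }
}

// Usage: skip the captcha and don't count the hit for verified crawlers.
isVerifiedGooglebot('66.249.66.1').then(function (ok) { console.log(ok); });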
Getting accurate human page views is critical for this site to work properly. Do you guys have any other ideas?
You could just leave it to Google Analytics. It does a very good job of solving the kind of problem you're trying to solve, and it's free.
Do you have a reason not to use an existing service or solution?
If you just want to monitor page hits, set up Google Analytics or a similar service on your site, and they'll do a better job of filtering out the noise than a hand-rolled solution possibly could.