I would like to integrate Google Analytics Experiments into our website.
My situation: we have a solution for feature toggles that also allows A/B testing. The features are stored in the database and have a percentage that defines how many users will see the feature. We also store the features in a cookie, so that users see the same view when they refresh the page.
Now I want to use the JavaScript API to track the state of the feature (see the Google Experiments documentation). As I understand it, I have to send a request to Google whenever an experiment is used, and I also have to tell Google the ID of the chosen variation. This has to happen on the right page so that the experiment is correlated with the right page view. The problem is AJAX requests that might check the split-tested features; in that case it is hard to say which features were used for the current page.
I see 3 options:
Track all experiments, even if they are not used on this site (I don't think this makes much sense).
Make a tool/configuration to define the experiments and feature toggles that are used on each site (it is very easy to make mistakes here).
Track the experiments from server-side code, as sketched below (but I don't know how GA would connect these calls to a page view).
What is the official guideline for ajax requests and google analytics experiments?
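For what it's worth, option 3 could look roughly like the sketch below, which uses the Measurement Protocol to send a pageview hit that carries the experiment ID and the chosen variation (the xid/xvar parameters). The property ID, client ID, and experiment values are placeholders, and whether GA stitches such server-side hits together with the client-side page views the way you need is exactly the part I would test first.

    // Rough, untested sketch: report an experiment variation to Universal Analytics
    // via the Measurement Protocol. UA-XXXX-Y, the client ID and the experiment
    // values are placeholders, not real identifiers.
    using System.Collections.Generic;
    using System.Net.Http;
    using System.Threading.Tasks;

    public static class ExperimentTracker
    {
        private static readonly HttpClient Http = new HttpClient();

        public static Task ReportAsync(string clientId, string pagePath,
                                       string experimentId, string variationIndex)
        {
            var payload = new Dictionary<string, string>
            {
                ["v"]    = "1",              // protocol version
                ["tid"]  = "UA-XXXX-Y",      // placeholder property ID
                ["cid"]  = clientId,         // should match the visitor's _ga client ID
                ["t"]    = "pageview",       // hit type the experiment is attached to
                ["dp"]   = pagePath,         // page the experiment ran on
                ["xid"]  = experimentId,     // experiment ID from GA
                ["xvar"] = variationIndex    // chosen variation (0 = original, 1 = variant, ...)
            };

            return Http.PostAsync("https://www.google-analytics.com/collect",
                                  new FormUrlEncodedContent(payload));
        }
    }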
I am working on redesigning and streamlining the checkout process of an e-commerce website built with ASP.NET/C# into a single page. (Currently, the website doesn't use any ASP.NET controls.)
Just for a little background: the customer will be able to enter a shipping address, add a gift message, and select a shipping date per recipient, while the backend is updated with the correct information. I want this process to be elegant and painless for customers to use.
I have already created WebMethods, called through AJAX requests, that work correctly for previous features.
My question is:
Would it be an OK idea to submit all of these requests through WebMethod AJAX calls, have them update the backend, and render the correct information to the user?
I know it is doable, but I just want to be sure that this is an OK approach to take. I have been looking at other JavaScript frameworks that might help, but I am too unfamiliar with them to know whether they would work.
Please let me know what you think and if you have other suggestions.
Thank you!
Short answer: Yes.
You mentioned:
an Ecommerce website using ASP.NET/C# into 1 page
If that is true, then you are talking about a single page web app. There are a couple of references to single page web apps here:
https://msdn.microsoft.com/en-us/magazine/dn463786.aspx
https://en.wikipedia.org/wiki/Single-page_application
It's very common to do what you are talking about. Using AJAX to perform back-end processes and then update small pieces of the front end is key to making fluid, fast, and responsive web applications. That's mainly because smaller pieces of data are being sent through the network, and the UI is being updated in smaller segments. If architected correctly, single page web apps can provide a slick, intuitive web experience for clients and customers. Emphasis on "If architected correctly". So make sure to do your research.
There are several libraries out there that can help you, such as:
jQuery
jQueryUI
Bootstrap
AngularJS
Sencha Ext JS
Backbone
Knockout
You don't have to use them, but they can save you a lot of time. At the very least, I would recommend using jQuery. Do some research and find out which one(s) will work best to fit your specific application.
I don't see any problem with this approach. The good part about an AJAX WebMethod is that you transfer only the data needed to do the operation; it's better and faster than using an UpdatePanel.
We use a lot of AJAX WebMethods in our ASP.NET project and have never had problems with them :)
One more piece of advice: look into ASP.NET MVC; it's better than ASP.NET Web Forms.
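For illustration, one of the page methods for the checkout steps described above might look like the sketch below. The class and parameter names are made up; the point is just that the WebMethod receives and returns only the data the UI needs.

    // Minimal sketch (hypothetical names): a page method the checkout page can call via AJAX.
    // Only the shipping data travels over the wire, not the whole page or its ViewState.
    using System.Web.Services;

    public partial class Checkout : System.Web.UI.Page
    {
        [WebMethod(EnableSession = true)]
        public static object SaveShippingAddress(int recipientId, string street,
                                                 string city, string zip)
        {
            // Replace this with the real backend update (database call, order service, etc.);
            // here we only validate the input so the sketch stays self-contained.
            bool ok = !string.IsNullOrEmpty(street) && !string.IsNullOrEmpty(city);

            // ASP.NET serializes this anonymous object to JSON for the AJAX caller.
            return new { success = ok, recipientId };
        }
    }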
This is a general question, really. I read something somewhere; it may have been a Microsoft whitepaper or a blog post, I'm really not sure, as I don't have the link right now.
Basically, the author was describing how (in what he referred to as AJAX.Net 1.0), when using an UpdatePanel, the entire page is submitted with the request, even though you would expect only the associated controls and components to be updated/refreshed.
Obviously this defeats the purpose of AJAX, whose whole point is to minimize the traffic sent to and from the server. In this case you might as well do a full postback for the page. I guess from a cosmetic point of view AJAX.Net does the trick as intended, but behind the scenes it doesn't do what you would expect.
This could well have been resolved in later versions; I just can't confirm it. I have searched Google high and low for an answer.
What the author recommended was to use jQuery, as it offers truly optimized traffic when updating, which of course it does; so that is why I'm asking whether Microsoft has done the same in their later versions.
Just thought I'd ask you lot before attempting the impossible task of asking someone at Microsoft.
I'll have a look for the link when I get home, and if I find it I'll add it here, just so you don't think I'm off my rocker. :)
This post, ASP.Net AJAX Not Working (Full page postback), explains the problem and the solution pretty well.
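If part of the concern is how much of the page gets re-rendered, one thing that has been supported since ASP.NET AJAX was folded into the framework is switching the panel to conditional updates, for example from code-behind as in the sketch below (the control and class names are made up). Note that this only limits what is re-rendered in the response; the asynchronous postback itself still sends the full form fields and ViewState to the server.

    // Sketch with a made-up control ID: configure an UpdatePanel for conditional updates
    // so that only this panel's content is re-rendered in the response. The async postback
    // still posts the full form data and ViewState.
    using System;
    using System.Web.UI;

    public partial class Orders : Page
    {
        protected void Page_Load(object sender, EventArgs e)
        {
            // pnlOrders is assumed to be an <asp:UpdatePanel> declared in the markup.
            pnlOrders.UpdateMode = UpdatePanelUpdateMode.Conditional;
            pnlOrders.ChildrenAsTriggers = false;
        }

        protected void RefreshButton_Click(object sender, EventArgs e)
        {
            // Explicitly refresh just this panel when its content has actually changed.
            pnlOrders.Update();
        }
    }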
What is the very, very simplest method of getting the RSS (or JSONP) feed of a Twitter account's list or friends' timeline?
Here's some background: I have a simple server script that feeds a Twitter widget. I wrote the script about two months ago but have not had the chance to deploy it until now. The script fetches the friends_timeline of a dummy account whose sole purpose is to combine the "friends" tweets into a single RSS feed (rather than making one request for every "friend"). Simple, lightweight, easy to maintain, light on Twitter's servers; it seemed to be a good solution at the time. Well, friends_timeline requires authentication, and right now the server uses NetworkCredential to pull down these RSS feeds. As a lot of you know, in August Twitter will cease to support basic authentication and force-march everyone over to OAuth. I've looked through the OAuth documentation and I'm not very happy with Twitter right now.
I'm also hoping to avoid using the Twitterizer framework. That's a lot of code to check and it won't go near our production servers without a thorough code review. I know that reinventing the wheel is a bad thing, but in this case all I want is the wheel, not a race car.
This page provides some smallish C# OAuth code: http://oauth.net/code/
Don't hate Twitter; it was a good idea.
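In case it helps anyone else who only wants "the wheel": signing a single Twitter API GET request with OAuth 1.0a fits in a screenful of code. The sketch below is untested, the keys and URL are placeholders, and it assumes the URL carries no query string (query parameters would also have to be signed), but it follows the steps from the spec: percent-encode and sort the parameters, build the signature base string, HMAC-SHA1 it with the consumer secret plus token secret, and send everything in the Authorization header.

    // Rough, untested sketch of OAuth 1.0a signing for a single GET request.
    // Consumer key/secret, access token/secret and the URL are placeholders.
    // Assumes the URL has no query string; query parameters would also need signing.
    using System;
    using System.Collections.Generic;
    using System.Linq;
    using System.Net;
    using System.Security.Cryptography;
    using System.Text;

    public static class MinimalOAuth
    {
        public static string Get(string url, string consumerKey, string consumerSecret,
                                 string token, string tokenSecret)
        {
            long timestamp = (long)(DateTime.UtcNow -
                new DateTime(1970, 1, 1, 0, 0, 0, DateTimeKind.Utc)).TotalSeconds;

            var oauth = new SortedDictionary<string, string>
            {
                ["oauth_consumer_key"] = consumerKey,
                ["oauth_nonce"] = Guid.NewGuid().ToString("N"),
                ["oauth_signature_method"] = "HMAC-SHA1",
                ["oauth_timestamp"] = timestamp.ToString(),
                ["oauth_token"] = token,
                ["oauth_version"] = "1.0"
            };

            // 1. Parameter string: encoded key=value pairs, sorted, joined with '&'.
            string paramString = string.Join("&",
                oauth.Select(p => Enc(p.Key) + "=" + Enc(p.Value)));

            // 2. Signature base string: METHOD & encoded URL & encoded parameter string.
            string baseString = "GET&" + Enc(url) + "&" + Enc(paramString);

            // 3. Signing key: encoded consumer secret & encoded token secret.
            string signingKey = Enc(consumerSecret) + "&" + Enc(tokenSecret);
            using (var hmac = new HMACSHA1(Encoding.ASCII.GetBytes(signingKey)))
            {
                oauth["oauth_signature"] = Convert.ToBase64String(
                    hmac.ComputeHash(Encoding.ASCII.GetBytes(baseString)));
            }

            // 4. Authorization header carrying all oauth_* parameters.
            string header = "OAuth " + string.Join(", ",
                oauth.Select(p => Enc(p.Key) + "=\"" + Enc(p.Value) + "\""));

            var request = (HttpWebRequest)WebRequest.Create(url);
            request.Headers["Authorization"] = header;
            using (var response = request.GetResponse())
            using (var reader = new System.IO.StreamReader(response.GetResponseStream()))
            {
                return reader.ReadToEnd();
            }
        }

        // RFC 3986 percent-encoding; Uri.EscapeDataString is close but misses a few characters.
        private static string Enc(string value)
        {
            return new StringBuilder(Uri.EscapeDataString(value))
                .Replace("!", "%21").Replace("*", "%2A").Replace("'", "%27")
                .Replace("(", "%28").Replace(")", "%29")
                .ToString();
        }
    }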
I am trying to detect whether a visitor is human or not. I just had an idea, but I'm not sure whether it will work: I store a cookie in the person's browser and retrieve it as they browse my site. If I successfully retrieve the cookie, is this a good technique for detecting bots and spiders?
A well-designed bot or spider can certainly store -- and send you back -- whatever cookies you're sending. So, no, this technique won't help one bit.
Browsers are just code. Bots are just code. Code can do anything you program it to. That includes cookies.
Bots, spammers and the like work on the principle of low-hanging fruit. They're after as many sites or users as they can get with as little effort as possible. Thus they go after popular packages like phpBB and vBulletin because getting into those will get them into a lot of sites.
By the same token, they won't spend a lot of effort to get into your site if that effort only pays off for your site (unless your site happens to be Facebook or the like). So the best defense against malicious activity of this kind is simply to be different, in such a way that an automatic script already written won't work on your site.
But an "I am human" cookie isn't the answer.
No, as Alex says this won't work; the typical process is to use a robots.txt to get them to behave. Further to that, you start to investigate the user-agent string (but this can be spoofed). Any more work than this and you're into CAPTCHA territory.
What are you actually trying to avoid?
You should take a look at the information in the actual HTTP headers and at how .NET exposes it to you. The extra information you have about the person hitting your website is there. Take a look at what Firefox is doing by downloading the Live HTTP Headers plugin and visiting your own site. Basically, at the page level, the Request.Headers property exposes this information. I don't know whether it's the same in ASP.NET MVC, though. The important header for what you want is User-Agent. It can be altered, obviously, but the major crawlers will let you know who they are by sending a unique User-Agent that identifies them. Same thing with the major browsers.
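As a rough illustration (the list of substrings is only an example, not exhaustive), a User-Agent check in a Web Forms page or handler could look like this; ASP.NET also exposes Request.Browser.Crawler, though that relies on the browser definition files being up to date:

    // Sketch: flag well-known crawlers by User-Agent. The substrings are examples only,
    // and a malicious client can of course spoof this header.
    using System;
    using System.Linq;
    using System.Web;

    public static class CrawlerCheck
    {
        private static readonly string[] KnownBots =
            { "googlebot", "bingbot", "slurp", "baiduspider", "yandexbot" };

        public static bool LooksLikeCrawler(HttpRequest request)
        {
            string userAgent = request.UserAgent ?? string.Empty;
            return KnownBots.Any(bot =>
                userAgent.IndexOf(bot, StringComparison.OrdinalIgnoreCase) >= 0);
        }
    }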
I wrote a bot that works with cookies and JavaScript. The easiest way to prevent bots/spam is to use the NoBot component in the AJAX Control Toolkit.
http://www.asp.net/AJAX/AjaxControlToolkit/Samples/NoBot/NoBot.aspx
I have an ASP.NET 3.5 application hosted on IIS 7.0. I'm looking for a comprehensive system to monitor traffic, down to at least the page level. Does .NET have any specific tools, is it better to write my own, or what systems/software are freely available to use?
Thanks
Use Google Analytics. It's a small piece of JavaScript code that you insert into your pages. It's based on the Urchin analytics tracking software, which Google bought. They've been doing this for a long, long time.
As long as your site is referenced using a fully qualified domain name, Google Analytics can track what you need. It's got lots of flexibility with the filter mechanism as well (it lets you rewrite URLs based on query string parameters, etc.).
Lots of functionality, well thought out, and a pretty good API if you need to track things other than clicks.
If you have access to the IIS logs, you can use a log analyzer to interpret the data. An example is the free AWStats analyzer:
http://awstats.sourceforge.net/
An alternative (and one I recommend) is Google Analytics (http://www.google.com/analytics). This relies on you embedding a small chunk of Javascript in each page you want tracking, then Google does the grunt work for you, presenting the results in an attractive Flash-rich site.
I'd suggest trying both and seeing which suits your needs. I'd definitely recommend against rolling your own system, as the above solutions are very mature and capable. Best of luck!
You'll need a client-side/JavaScript tracking service (such as Google Analytics, but there are other good free alternatives out there) because it runs even when the user clicks the back button and the previous page (on your site) is loaded from the browser cache rather than from the server. IIS won't "see" the reload, since no request is made to it.