Track embedded charts via Google Analytics - C#

I've got a site which produces charts such as the one below.
I'd like to encourage visitors to embed the generated graphic on their own sites and blogs. Is it possible to include views of that chart in Google Analytics? I want to be able to see when a site embeds the image so that it's tracked in the Analytics reports.
I'd envision some API that I can call from the server-side method which generates the PNG, but haven't been able to find anything specific.
Thanks!

This is indeed possible: have a look at the Google Analytics for Mobile Websites documentation. It details how to build a request to Google Analytics on the server, with quite a few different code samples (C# included).
While this documentation revolves around tracking page views, the concept can be extended to other types of activity you can record in Google Analytics, such as Events. In your specific situation I would set up the view of the chart as an Event, as this will not 'mess up' your true traffic (though you could use an advanced segment to exclude the chart traffic if you chose to register them as page views).
The documentation for event tracking is available here. Looking through it should give you a good idea of how you could express viewing your charts. Once you have an idea of how you want to track the event in GA, write the JavaScript and then view the URL (beacon) it generates to send the information to Google. You'll be able to use that as a template to send event information from the server.
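To make that concrete, here is a minimal C# sketch of firing such a beacon from the server. The query-string parameters shown (utmac, utme, etc.) follow the classic __utm.gif format; treat them as assumptions and verify them against the beacon URL your own event-tracking JavaScript generates, as described above.

using System;
using System.Net;
using System.Web;

public static class ChartAnalytics
{
    private const string GaAccount = "UA-XXXXXXX-1"; // your GA web property ID

    public static void TrackChartView(string chartId, HttpRequest request)
    {
        // Classic __utm.gif event beacon; utme encodes the event as
        // 5(category*action*label). Copy the exact format from your captured beacon.
        string utme = Uri.EscapeDataString("5(Charts*Embed*" + chartId + ")");
        string referrer = request.UrlReferrer != null ? request.UrlReferrer.ToString() : "-";

        string beacon = "http://www.google-analytics.com/__utm.gif" +
            "?utmwv=4.4sh" +                              // tracker version (per the mobile docs)
            "&utmn=" + new Random().Next(0x7fffffff) +    // random number to defeat caching
            "&utmhn=" + request.Url.Host +
            "&utmt=event" +
            "&utme=" + utme +
            "&utmr=" + Uri.EscapeDataString(referrer) +   // the embedding page, if sent
            "&utmac=" + GaAccount;

        // Fire and forget: analytics failures must never break image serving.
        try
        {
            var gaRequest = (HttpWebRequest)WebRequest.Create(beacon);
            gaRequest.UserAgent = request.UserAgent;
            using (gaRequest.GetResponse()) { }
        }
        catch (WebException) { /* tracking is best-effort */ }
    }
}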
In regards to actually serving the image, you have a lot of options. If your app is written in MVC, look at the FileResult class (and the associated File() method available on the Controller class). If you're working in a Web Forms app, you will be using Response.WriteFile() or something to that effect. This Wrox article has an example of the idea behind this. The example is for creating a no-leeching / hotlinking image handler, but the concept of writing an image to the HttpResponse is the same.
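Tying the two together in MVC might look something like this; ChartRenderer is a hypothetical stand-in for whatever generates your PNG bytes.

using System.Web.Mvc;

public class ChartsController : Controller
{
    // GET /Charts/Render/{id} - renders the chart, records the GA event, returns the PNG
    public ActionResult Render(string id)
    {
        byte[] png = ChartRenderer.RenderPng(id); // hypothetical: your existing generator
        ChartAnalytics.TrackChartView(id, System.Web.HttpContext.Current.Request);
        return File(png, "image/png");
    }
}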

Related

How to prevent saving files in an ASP.NET MVC 4 application

I'm currently working on creating a web learning portal using ASP.NET MVC 4, where my client's requirement is that the end user should not be able to save the files. The save restriction applies to all video files, images, PDFs, Word docs and PowerPoint files.
I understand the following exceptions, and the client is fine with them:
User can make use of print screen
copy-paste text
I need your guidance on how best to accomplish this.
For video, I came across the following link http://www.strathweb.com/2013/01/asynchronously-streaming-video-with-asp-net-web-api/.
Thanks,
Hemant.
I came across this online tool http://www.docspal.com/ but they don't seem to provide any API for developers.
Am I the only one with this requirement :(
It is really hard to stop users from copying documents from a website.
However, to at least make it harder, you can disable right-click, which may decrease the number of users who will try to copy your content. Do something like the following in jQuery.
// Suppress the browser context menu site-wide (a deterrent only; easily bypassed)
$(document).bind('contextmenu', function (e) {
    e.preventDefault();
    alert('Right Click is not allowed');
});
Refer to this for more.

How to update content via Google Sites API?

My apologies if this has already been asked and answered before, but I'm having a hard time finding information on the web about this. I've been tasked with finding a way to directly update the content of a Google Sites page (such as a text box or image) using the API.
I've been looking around the web for examples of doing this using C#, and have not found much to go by. I have some experience with writing/calling REST and SOAP APIs, but I can't seem to work out what Google uses for its API.
I'm struggling with how to approach this, as the few examples I have found don't really walk through the code, so they leave me with little understanding of how to adapt them to what I want to achieve. One site I did find was Sites API Demo.
Does anybody have any examples, or able to provide a simple example of updating content on a Google Site?
I appreciate any help you can give.
Mark
I realise this is an old question, but I have successfully updated the content of a Google Sites page using the library provided here, following the example in the wiki.
I compiled the source as a .dll and added it as a reference to my project.
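If it helps to see what the library is doing under the hood, the classic Google Sites API is GData, i.e. Atom feeds over HTTP: you GET the site's content feed, modify an entry, and PUT it back to the entry's edit link. Below is a rough sketch of the first step only; the feed path segments ("site", "yourSiteName") and the OAuth token are placeholders to adapt, and the library above wraps all of this for you.

using System;
using System.IO;
using System.Net;

class SitesContentFeedDemo
{
    static void Main()
    {
        // Content feed of the classic Sites Data API: lists the site's pages as Atom entries
        var request = (HttpWebRequest)WebRequest.Create(
            "https://sites.google.com/feeds/content/site/yourSiteName");
        request.Headers["GData-Version"] = "1.4";
        request.Headers["Authorization"] = "Bearer YOUR_OAUTH_TOKEN"; // placeholder token

        using (var response = request.GetResponse())
        using (var reader = new StreamReader(response.GetResponseStream()))
        {
            Console.WriteLine(reader.ReadToEnd()); // raw Atom XML; each <entry> is a page
        }
    }
}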

How to scrape a Flash-based site?

We are using Html Agility Pack to scrape data from HTML-based sites; is there any DLL like Html Agility Pack for scraping Flash-based sites?
It really depends on the site you are trying to scrape. There are two types of sites in this regard:
If the site has the data inside the SWF file, then you'll have to decompile the SWF file and read the data inside. With enough work you can probably do it programmatically. However, if this is the case, it might be easier to just gather the data manually, since it probably isn't going to change much.
In most cases, however, especially with sites that have a lot of data, the Flash file is actually contacting an external API. In that case you can simply ignore the Flash altogether and go to the API directly. If you're not sure, just activate Firebug's Net panel and start browsing. If the site is using an external API it should become obvious.
Once you find that API, you could probably reverse engineer how to manipulate it to give you whatever data you need.
Also note that if it's a big enough site, there are probably non-Flash ways to get to the same data:
It might have a mobile site (with no Flash) - try accessing the site with an iPhone user-agent.
It might have a site for crawlers (like Googlebot) - try accessing the site with a Googlebot user-agent.
EDIT:
If you're talking about crawling (getting data from any random site) rather than scraping (getting structured data from a specific site), then there's not much you can do; even Googlebot isn't scraping Flash content. That's mostly because, unlike HTML, Flash doesn't have a standardized syntax from which you can immediately tell what is text, what is a link, etc.
You won't have much luck with the Html Agility Pack. One method would be to use something like FiddlerCore to proxy HTTP requests to/from a Flash site. You would start the FiddlerCore proxy, then use something like the C# WebBrowser control to go to the URL you want to scrape. As the page loads, all those HTTP requests will get proxied and you can inspect their contents. However, you won't get most text, since that's often static within the Flash. Instead, you'd get mostly larger content (videos, audio, and maybe images) that is usually stored separately. This will be slow compared to more traditional scraping/crawling because you'll actually have to execute/run the page in a browser.
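A minimal sketch of that FiddlerCore approach, assuming a console host: start the proxy, log every HTTP session the Flash page triggers, then browse the target URL. The MIME-type filter is just one heuristic for spotting data endpoints; adjust to taste.

using System;
using Fiddler;

class FlashSniffer
{
    static void Main()
    {
        // Inspect each completed request/response pair the Flash app makes
        FiddlerApplication.AfterSessionComplete += session =>
        {
            Console.WriteLine("{0} {1}", session.responseCode, session.fullUrl);
            string mime = session.oResponse.MIMEType ?? "";
            // Candidate data endpoints often return XML or JSON
            if (mime.Contains("xml") || mime.Contains("json"))
            {
                Console.WriteLine(session.GetResponseBodyAsString());
            }
        };

        // Register as the system proxy on port 8877 so browser traffic
        // (including the SWF's requests) flows through us.
        FiddlerApplication.Startup(8877, FiddlerCoreStartupFlags.Default);

        Console.WriteLine("Proxy running; browse the Flash site now. Press Enter to quit.");
        Console.ReadLine();
        FiddlerApplication.Shutdown();
    }
}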
If you're familiar with all of those YouTube Downloader type extensions, they work on this same principle, except that they intercept HTTP requests directly from Firefox (for example) rather than using a separate proxy.
I believe that Google and some of the big search engines have a special arrangement with Adobe/Flash and are provided with some software that lets their search engine crawlers see more of the text and things that Google relies on. Same goes for PDF content. I don't know if any of this software is publicly available.
Scraping Flash content would be quite involved, and the reliability of any component that claims to do so is questionable at best. However, if you wish to "crawl" or follow hyperlinks in a Flash animation on some web page, you might have some luck with Infant, a free Java library for web crawling that offers limited / best-effort Flash hyperlink-following abilities. Infant is not open source, but is free for personal and commercial use, with no registration required.
How about capturing the whole page as an image and running OCR on it to read the data?

Making Dynamically Created ASP.NET Pages SEO Friendly

I'm starting the pseudocode for a new site and want it to be as SEO friendly as possible.
The site I am creating is a booking agency site built with C# and ASP.NET. Essentially, bands will register on the site with their availability and other info, and fill out their profile information with images etc. This info will be stored in a DB.
Creating this is not a problem, but I want the site to be as SEO friendly as possible.
I know Google loves huge sites with great content, and all of these profile pages would be an excellent addition to my site for SEO purposes. I have also heard that Google cannot see dynamically generated content when crawling a site.
I want to find a method of coding these pages so Google can see the content when it crawls them.
I need a pointer in the right direction for a solution. Nothing is off limits - I will basically code my entire site around this principle; I just have no idea where to start looking for a solution. I'm not looking for a code solution, just what I should be researching to solve this issue.
Thanks in advance
I have also heard that Google cannot see dynamically generated content when crawling a site.
Google can see anything you can retrieve via an HTTP GET request (i.e., there's a specific URL for it) that someone either linked to or that is listed in a published XML sitemap file.
To make sure that your profile pages fit this, you will want to make sure that all profiles are rendered via a single ASP.NET *.aspx file that determines which page is shown via a URL parameter. Something that looks like this:
http://example.com/profiles.aspx?profile=SomeBandName
Now, you probably also want a friendly URL, that looks like this:
http://example.com/profiles/SomeBandName
To do that, you need to set up routing.
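A minimal sketch of that routing setup (ASP.NET 4 Web Forms, registered in Global.asax; "profiles.aspx" and the "profileName" parameter are the hypothetical names used above):

using System;
using System.Web;
using System.Web.Routing;

public class Global : HttpApplication
{
    void Application_Start(object sender, EventArgs e)
    {
        // Maps /profiles/SomeBandName onto profiles.aspx. Inside the page,
        // read the band name via Page.RouteData.Values["profileName"].
        RouteTable.Routes.MapPageRoute(
            "BandProfile",
            "profiles/{profileName}",
            "~/profiles.aspx");
    }
}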
For Google or other search engines to crawl and index your pages properly, follow these guidelines:
i: The page title must be precise and match the content available on the page.
ii: Page URLs should be user friendly.
iii: Content is king (useful content).
iv: Don't rely on Ajax or JavaScript to load content.
v: Avoid Flash and other media files; where they exist, they must have a description via an alt tag.
vi: Create a URL sitemap of all static and dynamically generated content.
vii: Submit the sitemap to Google and keep tracking how Google crawls and indexes your pages;
fix issues continuously as Google finds them while crawling.
This way most of your pages and content will be indexed properly and quickly.
I'd look into dynamic URL Rewriting.
Basically, instead of having one page, say http://localhost/Profile.aspx, you'll have a bunch of simulated URLs like
http://localhost/profiles/Band1
http://localhost/profiles/Band2
http://localhost/profiles/Band3
etc.
All of those will then map back to the original Profile.aspx page with a parameter, so internally in your code it would look like http://localhost/Profile.aspx?Name=Band1, http://localhost/Profile.aspx?Name=Band2, etc.
Basically your website appears to have a separate page for each band, but in reality they are all getting mapped back to the same ASP.NET page with different parameters.
This is an article I read about it some time back: http://weblogs.asp.net/scottgu/archive/2007/02/26/tip-trick-url-rewriting-with-asp-net.aspx
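The linked article covers several options; the simplest is rewriting the path in Global.asax before the request is handled. A hedged sketch, using the hypothetical Profile.aspx / Name names from above:

using System;

// In Global.asax
void Application_BeginRequest(object sender, EventArgs e)
{
    string path = Request.Path; // e.g. /profiles/Band1
    const string prefix = "/profiles/";
    if (path.StartsWith(prefix, StringComparison.OrdinalIgnoreCase))
    {
        string band = path.Substring(prefix.Length);
        // Serve Profile.aspx while the browser still sees the friendly URL
        Context.RewritePath("~/Profile.aspx?Name=" + Server.UrlEncode(band));
    }
}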
I have also heard that Google cannot see dynamically generated content when crawling a site.
You could create a sitemap.xml with the URLs pointing to the dynamic profile pages. Using Google Webmaster Tools you can submit it and monitor the crawling progress.
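A hedged sketch of generating that sitemap with LINQ to XML; GetAllBandNames() and example.com are hypothetical stand-ins for your data access and domain:

using System;
using System.Collections.Generic;
using System.Linq;
using System.Xml.Linq;

public static class SitemapBuilder
{
    public static void WriteSitemap(string outputPath, IEnumerable<string> bandNames)
    {
        // One <url> entry per dynamic profile page, in the standard sitemap namespace
        XNamespace ns = "http://www.sitemaps.org/schemas/sitemap/0.9";
        var sitemap = new XDocument(
            new XElement(ns + "urlset",
                from band in bandNames
                select new XElement(ns + "url",
                    new XElement(ns + "loc",
                        "http://example.com/profiles/" + Uri.EscapeDataString(band)))));
        sitemap.Save(outputPath); // e.g. Server.MapPath("~/sitemap.xml")
    }
}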
You may also create an index page or something similar ('browse by category' pages) that links to matching profile pages.
A reference for SEO I regularly use is http://www.seomoz.org/learn-seo

Building a Silverlight application upload utility?

This post is really more of a discussion of whether this is even possible.
There are numerous examples all over the web, but all of them use ASP.NET applications, and unfortunately I can't go that route. So my goal is to build an upload utility in Silverlight that can be deployed as a CRM 2011 web resource, without using anything ASP.NET related.
I have looked at the Telerik Silverlight upload control, but it appears to require a ServiceURL handler, and I'm not sure I can embed something like that within Silverlight and make it work.
I guess I'm looking for some direction here on what my options would be. I don't want to start down one path and run into a brick wall.
Thanks for reading!
Your Silverlight control can upload the bytes of the file as an attachment (in the Notes area) on the entity record. Here is a project on CodePlex that uses Silverlight to upload an image as an attachment:
http://crmattachmentimage.codeplex.com/
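For reference, the create call at the heart of that approach looks roughly like this. It's sketched with the standard Microsoft.Xrm.Sdk Entity API for brevity; from Silverlight you would issue the same create through the generated async SOAP proxy, as the project above demonstrates. The target entity ("account"), MIME type, and the fileName/fileBytes/recordId parameters are placeholders.

using System;
using Microsoft.Xrm.Sdk;

public static class NoteUploader
{
    public static Guid UploadAsNote(IOrganizationService service,
        string fileName, byte[] fileBytes, Guid recordId)
    {
        // An annotation (Note) carries its attachment as base64 in documentbody
        var note = new Entity("annotation");
        note["subject"] = "Uploaded file";
        note["filename"] = fileName;
        note["mimetype"] = "application/octet-stream"; // placeholder MIME type
        note["documentbody"] = Convert.ToBase64String(fileBytes);
        note["objectid"] = new EntityReference("account", recordId); // placeholder entity

        return service.Create(note);
    }
}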
Hopefully that will get you pointed in the right direction!
I haven't really looked into the creation of web resources from Silverlight, but considering the web resource utility uses the CRM SOAP endpoint, I imagine you could do the same from Silverlight.
From Silverlight, you would need to reference the SOAP endpoint URL, which can be found in the CRM client under Settings -> Customization -> Developer Resources.
Otherwise, you could use the open-source CrmSilverSoap library, which already has all the generated proxy classes as well as a few helper methods for connecting to the various CRM services.
For working with and creating the web resources, I'd have a look at this SDK article, which shows how to use the messages for creating web resources. You will need to make the required modifications to the code so these messages can be sent via the referenced SOAP endpoint in Silverlight.
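To illustrate, creating a web resource through the organization service is itself just a create of a "webresource" record. A sketch with the standard SDK API (treat the webresourcetype option-set code and the names as assumptions to check against the SDK article; from Silverlight, route the same create through the async SOAP proxy):

using System;
using Microsoft.Xrm.Sdk;

public static class WebResourceCreator
{
    public static Guid CreatePngWebResource(IOrganizationService service,
        string name, byte[] fileBytes)
    {
        var wr = new Entity("webresource");
        wr["name"] = "new_" + name;           // schema-prefixed unique name (placeholder prefix)
        wr["displayname"] = name;
        wr["webresourcetype"] = new OptionSetValue(5); // assumed: 5 = PNG; verify in the SDK
        wr["content"] = Convert.ToBase64String(fileBytes);
        return service.Create(wr);
    }
}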
