Use razor/asp.net mvc3 to generate static html pages? - c#

For one project, I have to generate static .html pages, which are going to be published on a remote server.
I have to automate the creation of those files from C# code, which takes data from a SQL Server database.
The data will not change often (every 4-5 months), and this website will be heavily visited.
Since I find the Razor syntax of ASP.NET MVC3 very effective, I was wondering if it's possible to use ASP.NET MVC3/Razor to generate those .html pages?
So:
Is this a good idea?
If yes, what is the good way?
If you can think of another good way of doing it, which one?
Thank you for the help
Edit
Regarding the answers, I need to make a clarification: I don't want/need to use web caching, for several reasons: load (millions of pages loaded every month), integration (we integrate our pages into an optimized Apache setup, alongside another part of a website), and number of pages (caching only helps if the same pages are requested many times, but I will have ~2500 pages, so by Murphy's law, unless I set a very high cache timeout, I will have to regenerate them often). So I am really looking for something to generate HTML pages.
Edit 2
I just got a new constraint :/ Those templates must be localized, meaning I should have something equivalent to the following Razor code: @MyLocalizationFile.My.MyValue
Edit 3
Currently, I'm thinking of building a dynamic website and issuing HTTP queries against it to store the generated HTML. BUT, is there a way to avoid the HTTP round-trip? Meaning: simulate an HTTP call, specifying the output stream and the URL called (GET calls only).
Our previous load numbers were badly underestimated; the site actually gets a little more than one million visitors each day, ~14 million page loads/day.
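One way to avoid the HTTP round-trip is to render the Razor views to strings in-process. A minimal sketch, assuming you already have a ControllerContext (obtaining one outside a real request means faking an HttpContext, which is more involved); the class and path names here are illustrative:

```csharp
using System.IO;
using System.Web.Mvc;

// Sketch: render a Razor view to a string without an HTTP round-trip.
public static class StaticPageRenderer
{
    public static string RenderViewToString(ControllerContext context,
                                            string viewName, object model)
    {
        context.Controller.ViewData.Model = model;
        using (var writer = new StringWriter())
        {
            // Locate the view through the registered view engines
            var found = ViewEngines.Engines.FindView(context, viewName, null);
            var viewContext = new ViewContext(context, found.View,
                context.Controller.ViewData, context.Controller.TempData, writer);
            found.View.Render(viewContext, writer);
            found.ViewEngine.ReleaseView(context, found.View);
            return writer.ToString();
        }
    }
}

// Usage, e.g. from a generation job with a ControllerContext in hand:
// File.WriteAllText(@"C:\out\index.html",
//     StaticPageRenderer.RenderViewToString(ctx, "Index", model));
```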

Yes, it is. Even when you can cache the results, static HTML pages will always be faster and use fewer server resources.
A good way is to render your Razor views to text and then save that text as an .html file.
Another way could be T4 templates, but I recommend Razor.

You can use the Razor Engine (NuGet link, their website). This way you can create templates from a console application without using ASP.NET MVC.
I use it as follows:
public string ParseFile<T>(string fileName, T model)
{
    var sb = new StringBuilder();
    using (var file = File.OpenText(fileName))
    {
        string line;
        while ((line = file.ReadLine()) != null)
        {
            // RazorEngine does not recognize the @model line, so remove it
            if (!line.StartsWith("@model ", StringComparison.OrdinalIgnoreCase))
                sb.AppendLine(line);
        }
    }

    // Configuration to make sure we get unescaped HTML back:
    var config = new FluentTemplateServiceConfiguration(
        c => c.WithEncoding(RazorEngine.Encoding.Raw));

    using (var service = new TemplateService(config))
    {
        return service.Parse<T>(sb.ToString(), model);
    }
}
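A call site for the helper might look like this; the model class, template path, and property names are purely illustrative:

```csharp
using System.IO;

// Hypothetical model for the ParseFile<T> helper above.
public class ProductModel
{
    public string Name { get; set; }
    public decimal Price { get; set; }
}

// Product.cshtml might contain: <h1>@Model.Name</h1> <p>@Model.Price</p>
// var html = ParseFile("Product.cshtml",
//     new ProductModel { Name = "Widget", Price = 9.99m });
// File.WriteAllText("Product.html", html);
```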

Rather than generating static HTML pages, I think it would be better to generate the pages dynamically each time, but use output caching to improve performance.
See this article on caching with ASP.NET MVC3 for more information:
http://www.asp.net/mvc/tutorials/older-versions/controllers-and-routing/improving-performance-with-output-caching-cs

I ended up creating a normal ASP.NET MVC website and then generating each page by requesting it with a WebClient.
This way I can preview the website, and I can enjoy the full power of Razor + MVC helpers.
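A minimal sketch of that snapshot loop; the base URL, page list, and output folder are placeholders for your own site:

```csharp
using System.IO;
using System.Net;
using System.Text;

// Sketch: snapshot a running MVC site into static .html files.
class SiteSnapshot
{
    static void Main()
    {
        var pages = new[] { "/", "/about", "/products/1" };
        using (var client = new WebClient())
        {
            client.Encoding = Encoding.UTF8;
            foreach (var page in pages)
            {
                string html = client.DownloadString("http://localhost:8080" + page);

                // Map "/" to index.html and "/products/1" to products_1.html
                string name = page == "/" ? "index"
                    : page.Trim('/').Replace('/', '_');
                File.WriteAllText(
                    Path.Combine(@"C:\static-site", name + ".html"), html);
            }
        }
    }
}
```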

Is there any performance reason you've run into that would merit the effort of pre-rendering the website? How many pages are we talking about? What kind of parameters do your controllers take? If vanilla caching does not satisfy your requirements, for me the best approach would be a disk-based caching provider...
http://www.juliencorioland.net/Archives/en-aspnet-mvc-custom-output-cache-provider

Look at T4 templates or a similar templating solution

I'm working on a similar solution. My website runs normally (ASP.NET + DB + CMS) in a staging environment, and then I use wget to crawl it and generate static HTML pages. Those static HTML pages, including assets, are then uploaded to an Amazon S3 bucket. That way the website becomes fully static, with no dependencies.
I'm planning to have a daily task that crawls specific pages on the website to make it speedier, e.g. only crawl /news every day.
I know you've already found a solution, but maybe this answer might be helpful to others.

Related

What's the canonical method for sharing localized ui strings between Microsoft's MVC and Javascript?

I'm designing an application with a dynamic, javascript based, client - which will require some localization.
The application contains significant ui components which aren't javascript generated, and are served through various MVC views.
I'd like to localize my strings only once (whether they are used in JavaScript or as part of a view). Right now, I'm using resource files to localize the strings for MVC, and I'm thinking of generating a dedicated view which will produce a localized JavaScript file, storing all of the strings in a namespace; that is, a view which serves something like this:
var localized = {
    WelcomeMessage: '@Resource.Welcome',
    Logout: '@Resource.Logout'
};
Is there a prescribed, or canonical, method for achieving this when combining an MVC served service, with a complex javascript client?
It depends on how big your client-side code is. If you have a kilobyte or more of localised text then it would be worth moving them into a dynamically-generated "JavaScriptStrings.js" file which would be loaded by a separate <script> element. You can take advantage of the HTTP Expires header to force client-caching and thus not harm performance at all.
Commentator @AntP has a point about using data- attributes, but this might become unmanageable quickly. The other option is to return it inline in the HTML response; it's simpler and would be faster for clients, unless you have a lot of strings to return and/or you're serving mobile clients.
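A sketch of the dynamically generated "JavaScriptStrings.js" approach. Here "Resource" stands in for your generated .resx class, and VaryByCustom = "culture" assumes a GetVaryByCustomString override in Global.asax that returns the current culture:

```csharp
using System.Text;
using System.Web;
using System.Web.Mvc;

// Sketch: an action that serves the localized strings as a
// cacheable JavaScript file.
public class LocalizationController : Controller
{
    [OutputCache(Duration = 86400, VaryByCustom = "culture")]
    public ContentResult JavaScriptStrings()
    {
        var js = new StringBuilder();
        js.AppendLine("var localized = {");
        js.AppendFormat("  WelcomeMessage: '{0}',\n",
            HttpUtility.JavaScriptStringEncode(Resource.Welcome));
        js.AppendFormat("  Logout: '{0}'\n",
            HttpUtility.JavaScriptStringEncode(Resource.Logout));
        js.AppendLine("};");
        return Content(js.ToString(), "application/javascript");
    }
}
```

Escaping each string with JavaScriptStringEncode avoids broken script output when a translation contains quotes.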

Recommended way to track unique pages view for each URL on my site that's behind a CDN

I have an ASP.NET MVC 3 site that is behind a CDN (combo of Azure CDN and CloudFlare). Each page view pulls data from SQL server (unless it's being cached by the CDN of course).
I'd like to be able to display a "N page views" on each page on my site (~10k pages) so when visitors view the pages, they know how popular (or unpopular) it is. Stackoverflow does this on each question page. I don't care WHO the users were, just the grand total per page.
I'm wondering what the best way to do this is. I've thought of using client code, but this is easily messed with by a malicious user to inflate the page view count. So it seems the best way is to implement this on the server. I did my best to search Stack Overflow for code samples and recommended approaches but couldn't find anything that applied to what I'm asking.
If I were implementing this, I would probably just stand on top of my existing analytics package. I like using Google Analytics, and it tracks all kinds of awesome data, including page hit count:
http://www.google.com/analytics/
In a little piece of async javascript I would just use their API to get the total page views:
http://code.google.com/apis/analytics/docs/
I know it's lazy, but there are far fewer things I can screw up by using their code :-)
Here's some example code I've written to do just this:
https://github.com/TomGullen/C-Unique-Pageview-Tracker/blob/master/Tracker.cs
Advantages
No reliance on database
Customisable memory usage
Disadvantages
Will double count unique views depending on cache expiry + max queue size
Example usage
var handler = new UniquePageviewHandler("BlogPost_" + blogPostID);
if (handler.ProcessPageView(visitorIPAddress))
{
    // This is a new page view, so process accordingly,
    // e.g. increment the `unique page views` count in the database by 1
}
You'll need to modify the code I've linked to slightly to get it working in your project, but it should work without too many issues.

Url rewriting asp.net 3.5

I have implemented URL rewriting using Intelligencia, and everything works perfectly.
Now if I have an anchor, can I do something like
<a href="../TestPage">Test</a>
with the SEO-friendly URL already in place, or do I have to do something like
<a href='<%= GetSeoUrl("../TestPage.aspx") %>'>Test</a>
public string GetSeoUrl(string url)
{
    if (url == "../TestPage.aspx") return "../TestPage";
    return url; // fall back to the original URL
}
This will allow me to manage from a central location all the URLs.
I am working on .net 3.5 Web Form
But what are the implications of both approaches? Is it going to be slower? Less efficient? Which is the right way to do it?
Thanks
I think the second way (managing URLs from a central location) is the right way to do it.
Also, I don't think there will be any SEO implication, as the final URL will be the same; the second one is just rendered on the server.
It might be a little slower, but not noticeably so: it runs some server code to generate the URL, while in the first scenario there is no processing at all.
I see you are using .NET; you can use an already-developed solution for this from the ASP.NET MVC Framework, called URL routing.
Read this http://weblogs.asp.net/scottgu/archive/2007/12/03/asp-net-mvc-framework-part-2-url-routing.aspx
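A sketch of what such route registration looks like (typically in Global.asax); the route, controller, and action names here are illustrative:

```csharp
using System.Web.Mvc;
using System.Web.Routing;

// Sketch: MVC-style route registration.
public static class RouteConfig
{
    public static void RegisterRoutes(RouteCollection routes)
    {
        routes.IgnoreRoute("{resource}.axd/{*pathInfo}");

        // /products/5 -> ProductsController.Details(id = 5)
        routes.MapRoute(
            "ProductDetails",
            "products/{id}",
            new { controller = "Products", action = "Details" });
    }
}
```

Note that System.Web.Routing also works with Web Forms on .NET 3.5 SP1, so you don't need to adopt all of MVC to get clean URLs.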
I would do the first one. The URL is the contract. Using the second approach may make you think that you can easily change the URL by updating the function. Changing the URL would cause seo issues.

ASP.NET URL remapping &redirection - Best Practice needed

This is the scenario: I have a list of about 5000 URLs which have already been published to various customers. Now, the locations of all of these URLs have changed on my server side. The server is still the same, though. This is an ASP.NET website with .NET 3.5/C#.
My requirement is: even though customers use the old source URL, they should be redirected to the new URL without any perceived change or intermediate redirection message.
I am trying to make sense of the whole scenario:
Where would I put the actual mapping of Old URL to New URL -- in a database or some config. file or is there a better option?
How would I actually implement the redirect?
Should I write a method with Server.Transfer or Response.Redirect?
And is there a best practice for it, like placing the actual re-routing in an HttpModule, or in Application_BeginRequest?
I am looking to achieve with a best-practice compliant methodology and very low performance degradation, if any.
If your application already uses a database then I'd use that. Make the old URL the primary key and lookups should be very fast. I'd personally wrap the whole thing in .NET classes that abstracts it and allow you to create a Dictionary<string,string> of all the URLs which can be loaded into memory from the DB and cached. This will be even faster.
Definitely DON'T use Server.Transfer. Instead you should do a 301 Moved Permanently redirect. This will let search engines know to use the new URL. If you were using .NET 4.0 you could use the HttpResponse.RedirectPermanent method. However, in earlier versions you have to set the headers yourself, but this is trivial.
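Setting the headers by hand on .NET 3.5 amounts to a few lines; a minimal sketch:

```csharp
using System.Web;

// Sketch: a permanent (301) redirect on .NET 3.5, where
// HttpResponse.RedirectPermanent does not exist yet.
public static class RedirectHelper
{
    public static void RedirectPermanent(HttpResponse response, string newUrl)
    {
        response.Clear();
        response.StatusCode = 301;
        response.Status = "301 Moved Permanently";
        response.AddHeader("Location", newUrl);
        response.End();
    }
}
```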
Keep the data in a database, but load into ASP.NET cache to reduce access time.
You definitely want to use HttpModules. It's the accepted practice, and having recently tried to do it inside Global.asax, I can tell you that unless you want to do only the simplest kind of stuff (i.e. "~/mypage.aspx/3" <-> "~/mypage.aspx?param1=3"), it's much more complicated and buggy than it seems.
In fact, I regret even trying to roll my own URL rewriting solution. It's just not worth it if you want something you can depend on. Scott Guthrie has a very good blog post on the subject, and he recommends UrlRewriter.net or UrlRewriting.net as a couple of free, open-source URL rewriting solutions.
Good luck.
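Combining the suggestions above (cached in-memory map, HttpModule, 301 status), a redirect module might be sketched like this; LoadMap() is a placeholder for your own database query:

```csharp
using System.Collections.Generic;
using System.Web;

// Sketch: an HttpModule that 301-redirects old URLs via an in-memory map.
public class RedirectModule : IHttpModule
{
    // Old path -> new path, loaded once per app domain.
    private static readonly Dictionary<string, string> Map = LoadMap();

    private static Dictionary<string, string> LoadMap()
    {
        // Placeholder data; replace with a lookup table from your database.
        return new Dictionary<string, string>
        {
            { "/old/page1.aspx", "/new/page1" }
        };
    }

    public void Init(HttpApplication app)
    {
        app.BeginRequest += (sender, e) =>
        {
            var ctx = ((HttpApplication)sender).Context;
            string newUrl;
            if (Map.TryGetValue(ctx.Request.Path.ToLowerInvariant(), out newUrl))
            {
                ctx.Response.StatusCode = 301;
                ctx.Response.AddHeader("Location", newUrl);
                ctx.Response.End();
            }
        };
    }

    public void Dispose() { }
}
```

The module must be registered in web.config (system.webServer/modules on IIS7+) before it runs.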

Using the *.html extension in dynamic URLs for SEO

My situation is: I have a project planned to be built on ASP.NET MVC 2, and one of the major requirements is SEO optimization. The customer wants this project to use static-like URLs that end with an .html extension, to make the URLs more SEO-friendly. E.g. "mysite.com/about.html" or "mysite.com/items/getitem/5.html", etc.
I wonder, is there any benefit from an SEO perspective to using the .html extension in dynamic URLs? Do Google and other search engines rank such URLs better?
I would use sitemaps instead, this enables you to have dynamic content (and to use MVC) but still be crawled completely.
See: http://www.google.com/support/webmasters/bin/answer.py?hl=en&answer=156184
and http://www.codinghorror.com/blog/2008/10/the-importance-of-sitemaps.html
No, the base URL doesn't matter. If you're serving dynamic content in .aspx or .html, it's all the same. If you do serve ASP.NET content with .html because of requirements (as dumb as they may be), then I suggest finding an alternative extension (e.g. .htm) for all static content. You don't want your static HTML files getting processed unnecessarily.
As Femaref said, you can use sitemaps to help.
Also, make sure your URL doesn't change (including variables) if the content is the same. This shouldn't be a problem with MVC.
Edit: In your example:
mysite.com/items/getitem/5.html
I'm guessing what you originally wanted is:
mysite.com/items/getitem/5
Having no extension doesn't make a difference either. Since that's not a problem, I would also argue that an extension makes the URL less "clean" and suggests that there is a file called 5.html at that path, which is obviously not true.
Search engines don't care at all what your webpage extensions look like.
If anything, all you're doing is indicating a file type for the page served up.
Can the page be reached?
Is there content?
Are there links pointing to that page?
That's what a search engine is worried about. Anyone can create a custom solution using custom file extensions for a website and have it work just fine.
