I am in the process of moving all of the images in my web application over to a CDN but I want to easily be able to switch the CDN on or off without having to hard code the path to the images.
My first thought was to add an HttpHandler for image extensions that, depending on a flag in web.config, would serve the image either from the server or from the CDN. But after giving this a little thought I think I've essentially ruled it out, as it would cause ASP.NET to handle the request for every single image, adding overhead, and it might actually negate the benefits of using a CDN entirely.
An alternative approach is, since all of my pages inherit from a base page class, I could create a function in the base class that determines the path to serve the files from based on the web.config variable. I would then do something like this in the markup:
<img src='<%= GetImagePath() %>/image.png' />
I think this is probably what I'll have to end up doing, but it seems a little clunky to me. I also envision problems with the old .NET error of not being able to modify the control collection because of the "<%=", though the "<%#" approach will probably work.
Any thoughts or ideas on how to implement this?
You've dismissed writing an HttpHandler based on an assumption; that's premature optimization. I would revisit this and definitely write a simple HttpHandler and test it out. You might find that your Page method solution is actually slower, especially if you get the ASP.NET preprocessor involved.
HttpHandlers are pretty close to the metal - it's a minuscule amount of overhead for IIS to hand the request to ASP.NET. It would be a more elegant solution than what you're proposing, probably more scalable, and, I'm willing to bet, faster.
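For illustration, a bare-bones handler along those lines might look like this (a sketch only: the "UseCDN" and "CDNHost" appSettings keys are made-up names, and you would still need to map the image extensions to ASP.NET and to this handler in IIS/web.config):

using System;
using System.Configuration;
using System.Web;

public class CdnImageHandler : IHttpHandler
{
    public bool IsReusable { get { return true; } }

    public void ProcessRequest(HttpContext context)
    {
        bool useCdn = string.Equals(
            ConfigurationManager.AppSettings["UseCDN"], "true",
            StringComparison.OrdinalIgnoreCase);

        if (useCdn)
        {
            // Send the browser to the same path on the CDN host
            string cdnHost = ConfigurationManager.AppSettings["CDNHost"];
            context.Response.Redirect(cdnHost + context.Request.Url.AbsolutePath, false);
        }
        else
        {
            // Serve the file straight from disk
            context.Response.ContentType = "image/png"; // derive from the extension in real code
            context.Response.TransmitFile(context.Request.PhysicalPath);
        }
    }
}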
Have you considered a slightly simpler approach?
If your pages all inherit from a base class, you could expose a property on it that contains the prepend URL to your CDN (or to your local server, if you want to switch the CDN off). It is then a trivial matter of storing the prepend URL in the web.config:
public string PrependURLPath
{
    get { return ConfigurationManager.AppSettings["ImagePrependURL"]; }
}
In your <appSettings/> element, you can simply choose what the prepend URL should be, e.g.:
http://my.cdn.com/user/
or:
http://my.own.server.com/images/
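In web.config that would look something like this (using the "ImagePrependURL" key from the property above):

<appSettings>
  <add key="ImagePrependURL" value="http://my.cdn.com/user/" />
</appSettings>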
Pretty simple!
You would then be able to code your image references as per your example, but using your base page property to expose the desired path:
<img src='<%= this.PrependURLPath + "YourImagePath.png" %>' />
I agree that setting the image source through the inline call is messy, but you could probably do as someone else has suggested and then iterate through the image controls on your page, changing the prepend URL as you go.
Even if your pages currently only inherit from System.Web.UI.Page, it's a simple matter to create your own base class which inherits from System.Web.UI.Page, then do a find/replace across your solution on all remaining pages.
Hope this helps.
Weighing in pretty late here, but I've been looking for a similar solution myself and searched Google to sanity-check what I had done. I didn't consider the HttpHandler approach; what I did was simply extend the ASP.NET Image control:
using System.Web.UI.WebControls;

public class Img : Image
{
    public Img()
    {
        RelativePath = false;
    }

    // When true, the URL is left untouched; otherwise the CDN host is prepended
    public bool RelativePath { get; set; }

    public override string ImageUrl
    {
        get
        {
            if (RelativePath)
                return base.ImageUrl;

            return "http://some.configurable-value.com" + base.ImageUrl;
        }
        set { base.ImageUrl = value; }
    }
}
It's rough and ready, but it works :) Obviously it should rely on some configurable value rather than a string literal, but that's not a big change.
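For instance, the getter could pull the host from appSettings rather than hard-coding it (a sketch; the "CdnBaseUrl" key name is made up, and it needs a using for System.Configuration):

// Inside the ImageUrl getter - "CdnBaseUrl" is a hypothetical appSettings key
string cdnBase = ConfigurationManager.AppSettings["CdnBaseUrl"];
return string.IsNullOrEmpty(cdnBase) ? base.ImageUrl : cdnBase + base.ImageUrl;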
If you display your images using <asp:Image> server controls, you could create a control adapter. Control adapters allow you to alter the way .NET controls render, universally. Something like this should do the trick:
using System;
using System.Web.UI;
using System.Web.UI.WebControls;
using System.Web.UI.WebControls.Adapters;

namespace ExampleCode
{
    public class ImageAdapter : WebControlAdapter
    {
        private bool UseCdn
        {
            get { return true; } // Get value from config or anywhere else
        }

        protected override void OnPreRender(EventArgs e)
        {
            base.OnPreRender(e);

            Image image = (Image)Control;

            if (UseCdn)
            {
                // If using relative URLs for images you may need to handle ~
                image.ImageUrl = String.Format("{0}/{1}", "CDN URL", image.ImageUrl);
            }
        }
    }
}
Then add a browser file to the App_Browsers folder in your web project like below:
<browsers>
<browser refID="Default">
<controlAdapters>
<adapter
controlType="System.Web.UI.WebControls.Image"
adapterType="ExampleCode.ImageAdapter"
/>
</controlAdapters>
</browser>
</browsers>
You could loop through all the controls and change the image URLs in the PreRender event of your base class...
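That might look something like the following in a shared base page (a rough sketch; the "ImagePrependURL" key and the check for absolute URLs are assumptions):

using System;
using System.Configuration;
using System.Web.UI;
using System.Web.UI.WebControls;

public class CdnBasePage : Page
{
    private static readonly string CdnPrefix =
        ConfigurationManager.AppSettings["ImagePrependURL"] ?? string.Empty;

    protected override void OnPreRender(EventArgs e)
    {
        base.OnPreRender(e);
        if (!string.IsNullOrEmpty(CdnPrefix))
            RewriteImageUrls(this);
    }

    // Recursively walk the control tree and prefix relative image URLs with the CDN host
    private void RewriteImageUrls(Control parent)
    {
        foreach (Control child in parent.Controls)
        {
            var image = child as Image;
            if (image != null &&
                !image.ImageUrl.StartsWith("http", StringComparison.OrdinalIgnoreCase))
            {
                // ResolveUrl turns "~/" paths into application-relative ones
                image.ImageUrl = CdnPrefix + ResolveUrl(image.ImageUrl);
            }
            RewriteImageUrls(child);
        }
    }
}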
The good thing about the HTTP Handler approach is that it's quite re-usable and configurable: you can identify img paths to handle based on location - assuming the structure they're in helps this.
The possible drawback is that requests for image file extensions (.jpg, .png, etc.) aren't automatically passed on to the ASP.NET pipeline; you can easily configure IIS to do so, but you need a certain level of control over IIS, so it might not be an option if you're on a shared hosting environment.
I would go with Rhys's approach for the image control.
Most of the time, though, I try to use CSS background images rather than image controls.
I then upload both the CSS and the images to the cloud together, and it works fine with relative paths.
It doesn't look like there has been an accepted answer yet, so here is my suggestion. I had similar problems dealing with modifying URLs transparently (to a different end, but I thought about using it for CDN support as well).
This is an old filter / module but it worked well for my needs with a little tuning: http://www.paraesthesia.com/archive/2007/12/14/urlabsolutifiermodule---convert-urls-in-asp.net-output-to-absolute.aspx
What you can do is write a response filter and hook it up with an HttpModule (as this absolutifier does). If you use this module + response filter approach, you could probably achieve what you need by modifying its source to replace the hostname / prefix all URLs with the CDN.
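The general shape of such a module/filter pair would be something like the following (a naive sketch: the host names and the string replacement are placeholders, and the linked module buffers the response far more carefully):

using System;
using System.IO;
using System.Text;
using System.Web;

// Hypothetical module that swaps in a rewriting filter for HTML responses
public class CdnRewriteModule : IHttpModule
{
    public void Init(HttpApplication context)
    {
        context.ReleaseRequestState += (sender, e) =>
        {
            var response = HttpContext.Current.Response;
            if (response.ContentType == "text/html")
                response.Filter = new CdnRewriteFilter(response.Filter);
        };
    }

    public void Dispose() { }
}

// Rewrites image paths in the outgoing HTML to point at the CDN host.
// Naive: assumes whole tags arrive in a single Write call, which is not guaranteed.
public class CdnRewriteFilter : MemoryStream
{
    private readonly Stream _output;

    public CdnRewriteFilter(Stream output)
    {
        _output = output;
    }

    public override void Write(byte[] buffer, int offset, int count)
    {
        string html = Encoding.UTF8.GetString(buffer, offset, count);
        html = html.Replace("src=\"/images/", "src=\"http://cdn.example.com/images/");

        byte[] rewritten = Encoding.UTF8.GetBytes(html);
        _output.Write(rewritten, 0, rewritten.Length);
    }
}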
I had to solve your problem and another one as well: I do not want to take resources from the CDN during development, but only when the website is deployed to the production server.
To solve this I developed an ExpressionBuilder that prepends the CDN URL only in production.
<asp:Image ImageUrl="<%$ CdnUrl:/images/myimage.png %>" runat="server" />
In previous code the CDN URL will be prepended only in production.
using System;
using System.CodeDom;
using System.ComponentModel;
using System.Web;
using System.Web.Compilation;
using System.Web.UI;

namespace IdeaR.Web.Compilation
{
    [ExpressionPrefix("CdnUrl")]
    public class CdnUrlExpressionBuilder : ExpressionBuilder
    {
        public static object GetCdnUrl(string expression, Type target, string entry)
        {
            var retvalue = expression;
            var productionUri = new Uri("http://www.myproductionurl.com",
                UriKind.Absolute);
            var currentUri = HttpContext.Current.Request.Url;
            var cdnUrl = "http://cdn.mycdn.com";

            // If this is a production website URL
            if (currentUri.Scheme == productionUri.Scheme &&
                currentUri.Host == productionUri.Host)
                retvalue = cdnUrl + expression;

            return retvalue;
        }

        public override CodeExpression GetCodeExpression(BoundPropertyEntry entry,
            object parsedData, ExpressionBuilderContext context)
        {
            var componentType = entry.DeclaringType;
            var expressionArray = new CodeExpression[3]
            {
                new CodePrimitiveExpression(entry.Expression.Trim()),
                new CodeTypeOfExpression(componentType),
                new CodePrimitiveExpression(entry.Name)
            };
            var descriptor = TypeDescriptor.GetProperties(componentType)
                [entry.PropertyInfo.Name];

            return new CodeCastExpression(descriptor.PropertyType,
                new CodeMethodInvokeExpression(
                    new CodeTypeReferenceExpression(GetType()),
                    "GetCdnUrl", expressionArray));
        }
    }
}
For more information I wrote an article on this
How to use a CDN in production but not during development
Related
I have to do paging for an OData endpoint built using Entity Framework. I know I can do it using
private ODataQuerySettings settings = new ODataQuerySettings();
settings.PageSize = myPageSize; // I keep this value in web.config of solution
and
options.ApplyTo(IQueryable, settings);
But I am constrained not to use ApplyTo (i.e. I don't want to use the settings above) and to take the page size from the web.config of my solution without modifying the URL presented by the Web API, i.e. no client-side paging.
So far I haven't found a way to do this. I can't put the page size in [EnableQuery] because that does not dynamically pick up the page size parameter from web.config.
I wonder whether what I want can be done, or whether I'm trying to do something too tricky.
You can extend the default behavior of the EnableQuery attribute to use web.config's value as you want. Maybe something like this:
public class EnablePagedQueryAttribute : EnableQueryAttribute
{
public EnablePagedQueryAttribute()
{
int myPageSizeFromWebConfig = 0;
// Get value from web.config as you want:
if (int.TryParse(ConfigurationManager.AppSettings["myPageSize"], out myPageSizeFromWebConfig))
{
this.PageSize = myPageSizeFromWebConfig;
}
}
}
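You would then use the new attribute in place of the stock [EnableQuery] on your OData actions, something like this (the controller, entity, and repository names are placeholders):

public class ProductsController : ODataController
{
    [EnablePagedQuery] // page size is now read from web.config
    public IQueryable<Product> Get()
    {
        return _repository.Products;
    }
}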
I want to show some XHTML documents that reference some resources (style sheets, scripts, images, etc). These resources are local, but they do not exist on the file system - instead, they are generated by my application.
Using Android.Webkit.WebView and Android.Webkit.WebViewClient, I can intercept requests and provide these resources flawlessly, using something like this:
internal class MyWebViewClient : WebViewClient
{
public override WebResourceResponse ShouldInterceptRequest (WebView view, string url)
{
/* logic to create a resource stream for the requested url */
return new WebResourceResponse (mimeType, encoding, generatedStream);
}
}
Can I achieve something similar using Xamarin.Forms.WebView and its related classes? If so, how? I haven't noticed in the API documentation any methods that look like they provide equivalent behavior.
The Xamarin.Forms WebView control is very basic at present. The class members show that you wouldn't be able to achieve what you are wanting to do.
You can load an HTML resource; the documentation here is quite useful in determining how to reference local files, if you do decide to go down that route.
Do note, however, that in Xamarin.Forms v1.2.2.6243 on Android the Source property is incorrectly set for URLs. For instance, if you navigate to www.yahoo.com and click around on that site, you will see some query string parameters etc. However, on Android the Source property always comes back as www.yahoo.com. Xamarin have created a temporary fix for this, but at present you have to include and implement your own custom renderer to overcome it.
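If you do go the custom renderer route on Android, the skeleton would be roughly as follows (a sketch against the classic renderer API of that era; the namespace and the reuse of MyWebViewClient from the question are assumptions):

using Xamarin.Forms;
using Xamarin.Forms.Platform.Android;

[assembly: ExportRenderer(typeof(WebView), typeof(MyApp.Droid.InterceptingWebViewRenderer))]
namespace MyApp.Droid
{
    public class InterceptingWebViewRenderer : WebViewRenderer
    {
        protected override void OnElementChanged(ElementChangedEventArgs<WebView> e)
        {
            base.OnElementChanged(e);

            if (Control != null)
            {
                // Control is the native Android.Webkit.WebView, so the
                // WebViewClient from the question can be attached here
                Control.SetWebViewClient(new MyWebViewClient());
            }
        }
    }
}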
Hello fellow seasoned developers!
I was wondering if it were possible to override the .NET Request.QueryString object somehow? It would be really nice if this could be done without having to create my own HTTP module as well.
I have been tasked with RSA (1024) encrypting all the query strings in my application. However, the application is already built and there are a LOT of places where query strings are being set, so ideally I would like to make a global change that would decrypt the query and place it in the normal Request.QueryString so as to not have to change my code everywhere, and maybe pass it along to other devs within my team so they also don't have to change their code.
Now, I have already built the encryption object and use the SessionID for salts to make the keys unique per session. I have also tried intercepting the HTTP request in Global.asax so as to rewrite the request path with the decrypted query; however, that was a bust, as any postbacks performed on those pages put the decrypted query string back into the POST, which I obviously don't want.
So now I'm at a stage where, instead of rewriting the path, I would like to intercept or override the Request.QueryString object on a global level and use my decryption methods there whenever a call to this[key] is made, and thus again not have to stop using Request.QueryString. However, after hours of searching the web, I could not find a single example of how to do this...
If anyone could help me out with this I would be most grateful!
I think the easiest way to accomplish this is to use an Extension Method. Perhaps something like this:
using System.Web;

class Program
{
    static void Main()
    {
        var decryptedValue = HttpContext.Current.Request.DecryptQueryStringParam("myParam");
    }
}

public static class HttpRequestExtensions
{
    public static string DecryptQueryStringParam(this HttpRequest extendee, string name)
    {
        // do stuff to decrypt
        return DecryptMethodStub(extendee.QueryString[name]);
    }

    private static string DecryptMethodStub(string queryString)
    {
        return "something decrypted the string";
    }
}
Please note that the Program class above is for illustrative purposes only... in reality you would call Request.{ExtensionMethod} within the body of an ASP.NET Web Forms page or an MVC controller, both of which already provide direct access to the HttpRequest object through the Request property.
Here is some more information about extension methods:
http://msdn.microsoft.com/en-us/library/bb383977.aspx
I'm working on a small school project, an ASP.NET C# website; we're working with a Web Application, using a Global.asax file to centralize request logic.
Anyway, my colleague and I are responsible for the coding in our group, and we both come in as reasonably experienced PHP developers. We both rather enjoy working with the architectural style used by the PHP framework Laravel, using routes (with associated callbacks) as the "controllers", and (despite it being a square peg, round hole issue) are trying to replicate that functionality for this project.
This is no easy task; I've implemented the IRouteHandler interface as a CallbackRouteHandler in an attempt to start replicating this functionality:
using System;
using System.Web;
using System.Web.Routing;

public class CallbackRouteHandler : IRouteHandler
{
    public Func<RequestContext, IHttpHandler> Callback { get; protected set; }

    public CallbackRouteHandler(Func<RequestContext, IHttpHandler> callback)
    {
        this.Callback = callback;
    }

    public IHttpHandler GetHttpHandler(RequestContext requestContext)
    {
        return this.Callback(requestContext);
    }
}
Unfortunately this is about as far as I've gotten. I'm reading through the ASP.NET Page Life Cycle Overview, attempting to understand better the entire process.
What we're stuck on is programmatically loading ASPX files (rather, instantiating as Page objects) in the scope of a given route callback. We were hoping there would be a reasonably easy way to accomplish, within the scope of the callback, something like:
// instantiate the target ASPX page object
OurSite.SomeNamespace.SomePage page = new OurSite.SomeNamespace.SomePage();
// manipulate the page object, setting public properties, etc.
page.SomeControl.Text = "Foobar!";
// eventually "render" the file to somehow; at this point, the
// page and it's associated code-behind events take control
page.Render();
I'm having trouble understanding both: 1) How to do this? 2) When (relative to the aforementioned page life-cycle) to do this.
How (if at all) can one accomplish this sort of functionality? I'm seeing that this process, hidden away by ASP.NET, is seemingly very complicated, but surely others have trodden this path before.
I went with MVC for this project, however I've since had the opportunity to dissect the ASP.NET request pipeline a bit, and have implemented custom routing solutions as warranted.
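For anyone heading down the same path: the CallbackRouteHandler from the question can be paired with BuildManager to instantiate an ASPX page inside the callback. A sketch only (the route pattern and virtual path are made up):

using System.Web;
using System.Web.Compilation;
using System.Web.Routing;
using System.Web.UI;

public static class RouteConfig
{
    public static void RegisterRoutes(RouteCollection routes)
    {
        routes.Add(new Route("products/{id}", new CallbackRouteHandler(context =>
        {
            // BuildManager compiles the ASPX (with its code-behind) and returns it
            // as an IHttpHandler, which the routing pipeline then executes normally
            return (IHttpHandler)BuildManager.CreateInstanceFromVirtualPath(
                "~/Pages/Product.aspx", typeof(Page));
        })));
    }
}

You would call RouteConfig.RegisterRoutes(RouteTable.Routes) from Application_Start in Global.asax.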
I'm creating a controller that will serve the combined/minified versions of my JavaScript and CSS. I need to somewhere along the line define which scripts/styles to be loaded.
When a request is made, for example for style.css?VersionNumberHere, it will check if the combined/minified data is already in the HttpContext.Cache, if so spit it out. Otherwise, I need to look up the definition that makes up style.css.
I created a Script/StyleBuilder (that inherits from ContentBuilder) that will store all the paths that need to be combined and then squished (so this would be the definition of style.css).
Where should I be storing these references to the "builders"? Should they be in a static class or a singleton that implements an interface so that it can be tested?
Here's the interface that the abstract class ContentBuilder implements (you can easily imagine the implementation):
public interface IContentBuilder : IEnumerable<string>
{
string Name { get; }
int Count { get; }
string[] ValidExtensions { get; }
void Add(string path);
bool ValidatePath(string path);
string GetHtmlReference(); // Spits out <script>, or <link> depending on implementation.
string Build(); // Minifies, combines etc.
}
And here is ideally what I'd like to be able to do with these:
ContentBuilderContainer.Current.Add("main.js", c => new ScriptBuilder()
{
"/path/to/test.js",
"/path/to/test2.js",
"/path/to/test3.js"
});
ContentBuilderContainer.Current.Add("style.css", c => new StyleBuilder()
{
"/path/to/style.css",
"/path/to/test.less"
});
Then to output all the HTML for all registered IContentBuilder:
ContentBuilderContainer.Current.BuildHtml();
Maybe you should check out SquishIt. Some more info on it in this blog post. We use it in production.
Attach caching attributes to your controller actions and cache by parameter like this:
[OutputCache(Duration = 7200, Location = OutputCacheLocation.Client, VaryByParam = "jsPath;ServerHost")]
[CompressFilter]
// Minifies, compresses JavaScript files and replaces tildas "~" with input serverHost address
// (for correct rewrite of paths inside JS files) and stores the response in client (browser) cache for a day
[ActionName("tildajs")]
public virtual JavaScriptResult ResolveTildasJavaScript(string jsPath, string serverHost)
...
I made the following interface:
public interface IContentBuilderContainer
{
int Count { get; }
bool Add(string name, Func<IContentBuilder> contentBuilder);
string RenderHtml();
}
And then in the implementation of ContentBuilderContainer:
public class ContentBuilderContainer : IContentBuilderContainer
{
// Other members removed for simplicity.
#region Static Properties
/// <summary>
/// Gets or sets the current content builder container.
/// </summary>
public static IContentBuilderContainer Current
{
get;
set;
}
#endregion
#region Static Constructors
static ContentBuilderContainer()
{
ContentBuilderContainer.Current = new ContentBuilderContainer();
}
#endregion
}
This way there's a single ContentBuilderContainer living at one time.
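Registration then mirrors the style from the question, since Current exposes the Add and RenderHtml members of IContentBuilderContainer (assuming ScriptBuilder implements the IContentBuilder interface shown earlier, so the collection initializer maps onto its Add(string)):

ContentBuilderContainer.Current.Add("main.js", () => new ScriptBuilder
{
    "/path/to/test.js",
    "/path/to/test2.js"
});

// Later, when rendering the page head:
string headHtml = ContentBuilderContainer.Current.RenderHtml();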
I helped write some code that did this recently. Here's a high level overview of the solution that was implemented. Hopefully it will give you some good ideas.
Configuration: We created custom configuration elements that define a key and a corresponding list of directories. So the key JS is linked to our /Content/Scripts folder, and CSS is linked to our /Content/Styles folder. I have seen other solutions where the configuration allowed for individual files to be listed.
Controller: The controller was set up to receive requests something along the lines of /Content/Get/JS and /Content/Get/CSS. The controller uses the configuration key and client request headers to come up with a cache key that identifies the content we want to serve: JS-MSIE-ZIP, CSS-FFX, etc. The controller then checks our cache service. If the content is not there, it gets concatenated, minified, compressed, cached and then served. Handy fallout is that the content is compressed before going into the cache instead of every time it's served.
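Stripped of the configuration and cache-service plumbing, the action looked roughly like this (the helper methods below are stand-ins, not the actual implementation):

using System.IO;
using System.IO.Compression;
using System.Web.Mvc;

public class ContentController : Controller
{
    // GET /Content/Get/JS or /Content/Get/CSS
    public ActionResult Get(string id)
    {
        string contentType = id == "JS" ? "application/x-javascript" : "text/css";
        string cacheKey = id + "-" + Request.Browser.Browser;   // e.g. "CSS-FFX"

        var payload = HttpContext.Cache[cacheKey] as byte[];
        if (payload == null)
        {
            string combined = ConcatenateConfiguredFiles(id);   // stand-in: reads folders from config
            payload = Compress(Minify(combined));
            HttpContext.Cache[cacheKey] = payload;              // compressed once, served many times
        }

        Response.AppendHeader("Content-Encoding", "gzip");
        return File(payload, contentType);
    }

    private string ConcatenateConfiguredFiles(string key) { return string.Empty; } // stand-in
    private string Minify(string content) { return content; }                      // stand-in

    private static byte[] Compress(string content)
    {
        using (var output = new MemoryStream())
        {
            using (var gzip = new GZipStream(output, CompressionMode.Compress))
            using (var writer = new StreamWriter(gzip))
            {
                writer.Write(content);
            }
            return output.ToArray();
        }
    }
}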
View: In the View, the links are set up like this:
<link href="<%: Url.Action("Get", "Content", new { key = "CSS" }) %>" rel="stylesheet" type="text/css" />
Cache Service: We're using an existing cache service we have that just wraps the application cache. At some point we'll probably move that to Velocity or something similar. If the amount of CSS and JS we cache keeps growing, we'll probably change the format of the key to a proper filename and move the content to the file system. But, memory's pretty cheap, we'll see what happens.
Reasoning: (if it matters)
We did this in order to keep the JavaScript for different features in separate files in source control without having to link to all of the files individually in the HTML. Because we configure our content by directory and not individual files, we can also run a full minification during production builds to speed up the whole run time process somewhat. Yet we still get the benefit of determining which content to serve based on the client browser, and cached compressed versions.
In development, the system can be set up with a quick configuration change so that every request rebuilds the JS. The files are concatenated with file names injected in comments for easy searching, but the content is not minified and nothing is cached anywhere. This allows us to change, test and debug the JS without recompiling the application.
Couldn't quite find all these features in a package out there so we spent a couple of days and built it. Admittedly some features were just for fun, but that's why we like doing what we do. =)