I'm creating a controller that will serve the combined/minified versions of my JavaScript and CSS. Somewhere along the line I need to define which scripts/styles should be loaded.
When a request is made, for example for style.css?VersionNumberHere, it will check if the combined/minified data is already in the HttpContext.Cache, if so spit it out. Otherwise, I need to look up the definition that makes up style.css.
I created a Script/StyleBuilder (that inherits from ContentBuilder) that will store all the paths that need to be combined and then squished (so this would be the definition of style.css).
Where should I be storing these references to the "builders"? Should they be in a static class or a singleton that implements an interface so that it can be tested?
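For illustration, here is a rough sketch of the request flow described above; ContentController, the cache key format and the LookupBuilder helper are all placeholder names, and the builder lookup is exactly the piece I'm unsure where to put:
public class ContentController : Controller
{
    // Handles e.g. /content/style.css?1.2.3 (the version number only busts browser caches).
    public ActionResult Get(string name)
    {
        string cacheKey = "content:" + name;
        string output = HttpContext.Cache[cacheKey] as string;

        if (output == null)
        {
            // Look up the definition (builder) registered for this name,
            // combine/minify its files and cache the result.
            IContentBuilder builder = LookupBuilder(name); // <-- where should this live?
            output = builder.Build();
            HttpContext.Cache.Insert(cacheKey, output);
        }

        return Content(output, name.EndsWith(".css") ? "text/css" : "application/javascript");
    }
}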
Here's the interface that the abstract class ContentBuilder implements (you can easily imagine the implementation):
public interface IContentBuilder : IEnumerable<string>
{
string Name { get; }
int Count { get; }
string[] ValidExtensions { get; }
void Add(string path);
bool ValidatePath(string path);
string GetHtmlReference(); // Spits out <script>, or <link> depending on implementation.
string Build(); // Minifies, combines etc.
}
And here is ideally what I'd like to be able to do with these:
ContentBuilderContainer.Current.Add("main.js", c => new ScriptBuilder()
{
"/path/to/test.js",
"/path/to/test2.js",
"/path/to/test3.js"
});
ContentBuilderContainer.Current.Add("style.css", c => new StyleBuilder()
{
"/path/to/style.css",
"/path/to/test.less"
});
Then to output all the HTML for all registered IContentBuilder:
ContentBuilderContainer.Current.BuildHtml();
Maybe you should check out SquishIt. Some more info on it in this blog post. We use it in production.
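From memory the SquishIt usage looks roughly like this (double-check the current docs for exact method names); the "#" placeholder is replaced with a content hash, which also covers the version-number-in-the-URL requirement:
<%= SquishIt.Framework.Bundle.Css()
        .Add("~/Content/style.css")
        .Add("~/Content/other.css")
        .Render("~/Content/combined_#.css") %>

<%= SquishIt.Framework.Bundle.JavaScript()
        .Add("~/Scripts/test.js")
        .Add("~/Scripts/test2.js")
        .Render("~/Scripts/combined_#.js") %>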
Attach caching attributes to your controller actions and cache by parameter like this:
[OutputCache(Duration = 7200, Location = OutputCacheLocation.Client, VaryByParam = "jsPath;ServerHost")]
[CompressFilter]
// Minifies, compresses JavaScript files and replaces tildes "~" with the input serverHost address
// (for correct rewrite of paths inside JS files) and stores the response in client (browser) cache for a day
[ActionName("tildajs")]
public virtual JavaScriptResult ResolveTildasJavaScript(string jsPath, string serverHost)
...
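For context, the body of such an action could look roughly like this; the file handling and tilde replacement below are a simplified sketch, not the exact code:
public virtual JavaScriptResult ResolveTildasJavaScript(string jsPath, string serverHost)
{
    // Read the requested script from disk.
    string script = System.IO.File.ReadAllText(Server.MapPath(jsPath));

    // Rewrite "~" so paths inside the JS resolve against the requesting host.
    script = script.Replace("~", serverHost);

    // Minification/compression would happen here before returning.
    return JavaScript(script);
}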
I made the following interface:
public interface IContentBuilderContainer
{
int Count { get; }
bool Add(string name, Func<IContentBuilder> contentBuilder);
string RenderHtml();
}
And then in the implementation of ContentBuilderContainer:
public class ContentBuilderContainer : IContentBuilderContainer
{
// Other members removed for simplicity.
#region Static Properties
/// <summary>
/// Gets or sets the current content builder container.
/// </summary>
public static IContentBuilderContainer Current
{
get;
set;
}
#endregion
#region Static Constructors
static ContentBuilderContainer()
{
ContentBuilderContainer.Current = new ContentBuilderContainer();
}
#endregion
}
This way there's a single ContentBuilderContainer living at one time.
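A minimal sketch of registration and rendering with this container (assuming ScriptBuilder/StyleBuilder implement IContentBuilder, so the collection initializer maps onto Add(string); note the Func<IContentBuilder> takes no parameter, so the lambda is () => rather than c =>):
// e.g. in Global.asax Application_Start:
ContentBuilderContainer.Current.Add("main.js", () => new ScriptBuilder
{
    "/path/to/test.js",
    "/path/to/test2.js"
});
ContentBuilderContainer.Current.Add("style.css", () => new StyleBuilder
{
    "/path/to/style.css"
});

// and in the master page / layout:
<%= ContentBuilderContainer.Current.RenderHtml() %>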
I helped write some code that did this recently. Here's a high level overview of the solution that was implemented. Hopefully it will give you some good ideas.
Configuration: We created custom configuration elements that define a key and a corresponding list of directories. So the key JS is linked to our /Content/Scripts folder, and CSS is linked to our /Content/Styles folder. I have seen other solutions where the configuration allowed for individual files to be listed.
Controller: The controller was set up to receive requests something along the lines of /Content/Get/JS and /Content/Get/CSS. The controller uses the configuration key and client request headers to come up with a cache key that identifies the content we want to serve: JS-MSIE-ZIP, CSS-FFX, etc. The controller then checks our cache service. If the content is not there, it gets concatenated, minified, compressed, cached and then served. Handy fallout is that the content is compressed before going into the cache instead of every time it's served.
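A rough sketch of that action; the cache service and the Concatenate/Minify/Compress helpers are placeholders for our internal code:
public ActionResult Get(string key)
{
    // Build a cache key from the config key plus the request details that matter.
    bool gzip = (Request.Headers["Accept-Encoding"] ?? "").Contains("gzip");
    string cacheKey = key + "-" + Request.Browser.Browser + (gzip ? "-ZIP" : "");

    byte[] content = _cacheService.Get(cacheKey);
    if (content == null)
    {
        // Concatenate the configured directory, minify, compress, then cache.
        content = Compress(Minify(Concatenate(key)), gzip);
        _cacheService.Add(cacheKey, content);
    }

    if (gzip)
        Response.AppendHeader("Content-Encoding", "gzip");

    return File(content, key == "JS" ? "application/javascript" : "text/css");
}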
View: In the View, the links are set up like this:
<link href="<%: Url.Action("Get", "Content", new { key = "CSS" }) %>" rel="stylesheet" type="text/css" />
Cache Service: We're using an existing cache service we have that just wraps the application cache. At some point we'll probably move that to Velocity or something similar. If the amount of CSS and JS we cache keeps growing, we'll probably change the format of the key to a proper filename and move the content to the file system. But, memory's pretty cheap, we'll see what happens.
Reasoning: (if it matters)
We did this in order to keep the JavaScript for different features in separate files in source control without having to link to all of the files individually in the HTML. Because we configure our content by directory and not individual files, we can also run a full minification during production builds to speed up the whole run time process somewhat. Yet we still get the benefit of determining which content to serve based on the client browser, and cached compressed versions.
In development, the system can be set up with a quick configuration change so that every request rebuilds the JS. The files are concatenated with file names injected in comments for easy searching, but the content is not minified and nothing is cached anywhere. This allows us to change, test and debug the JS without recompiling the application.
Couldn't quite find all these features in a package out there so we spent a couple of days and built it. Admittedly some features were just for fun, but that's why we like doing what we do. =)
My task is to import a feature from an existing application (classic MVC5) into NopCommerce 4.30. The goal is to have a button somewhere on the actual shop's page that redirects to the plugin's Index view providing the actual features to the users.
This is my first time working with NC. Having read the overall NC documentation, I figured the best approach would be to create a plugin containing the desired functionality. Following the instructions for creating a plugin in NC 4.30 (cf. here), I was able to create and install the plugin.
However, I cannot seem to open any (custom) plugin views from within the application. Taking a look at the plugins shipped with the NC base files, I noticed that most of the plugins had a file named RouteProvider.cs that contained routing information. Based on those files, I created my own simple version and added it to the project, e.g. (sample only)
public class RouteProvider : IRouteProvider
{
private const string ViewNameIndex = "Plugin.Misc.MyPlugin.Index";
private const string RoutePattern = "Plugins/Index";
/// <summary>
/// Register routes
/// </summary>
/// <param name="endpointRouteBuilder">Route builder</param>
public void RegisterRoutes(IEndpointRouteBuilder endpointRouteBuilder)
{
endpointRouteBuilder.MapControllerRoute(ViewNameIndex, RoutePattern,
new { controller = "MyController", action = "Index" });
}
/// <summary>
/// Gets a priority of route provider
/// </summary>
public int Priority => 0;
}
However, I still cannot use any plugin URL.
Moreover, I do not quite understand the purpose of the first and second parameter in MapControllerRoute and how they affect the URL being used. I could not infer their usage from the sample I used, cf. below. (I took the liberty of adding ImportContactRoute and UnsubscribeContactRoute from SendinBlueDefault.cs to this sample)
public class RouteProvider : IRouteProvider
{
private const string ImportContactsRoute = "Plugin.Misc.SendinBlue.ImportContacts";
private const string UnsubscribeContactRoute = "Plugin.Misc.SendinBlue.Unsubscribe";
/// <summary>
/// Register routes
/// </summary>
/// <param name="endpointRouteBuilder">Route builder</param>
public void RegisterRoutes(IEndpointRouteBuilder endpointRouteBuilder)
{
endpointRouteBuilder.MapControllerRoute(ImportContactsRoute, "Plugins/SendinBlue/ImportContacts",
new { controller = "SendinBlue", action = "ImportContacts" });
endpointRouteBuilder.MapControllerRoute(UnsubscribeContactRoute, "Plugins/SendinBlue/UnsubscribeWebHook",
new { controller = "SendinBlue", action = "UnsubscribeWebHook" });
}
/// <summary>
/// Gets a priority of route provider
/// </summary>
public int Priority => 0;
}
What is Plugin.Misc.SendinBlue.ImportContacts trying to match here? There is neither a corresponding folder in Plugins\Nop.Plugin.Misc.SendinBlue\ nor in Presentation\Nop.Web\Plugins\Misc.SendinBlue\.
I assume that Plugins/SendinBlue/ImportContacts refers to the actual route that might be entered as a URL, but again I am not sure about it at this stage.
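To make my current understanding concrete (please correct me if this is wrong), I read the two parameters like this:
endpointRouteBuilder.MapControllerRoute(
    name: "Plugin.Misc.SendinBlue.ImportContacts",  // route *name*: only an identifier, used for URL generation, never matched against a folder or URL
    pattern: "Plugins/SendinBlue/ImportContacts",   // URL *pattern*: what the incoming request path must match
    defaults: new { controller = "SendinBlue", action = "ImportContacts" });

// so a link can be generated by name, e.g. in a view:
// Url.RouteUrl("Plugin.Misc.SendinBlue.ImportContacts")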
Could somebody please explain how I might correctly create and access routes to my plugin's views from within the application, or else provide an alternative approach in case I was barking up the wrong tree?
Update
While researching ways to resolve the issue, I came across the following post in the NC forum. The post linked to this page, which describes three "Ways to display Views in Your NopCommerce Plugins":
Embedded Resources
Theme Override
Custom View Engine
As valuable as it is, you'll notice that the post dates back to 2013. The question is whether the approaches are still valid, given the fact that the technology has changed.
I've already tried the first approach of making my View's into embedded resources to no avail.
What would be the correct approach(es) for NC 4.30 in this case?
I gave it another shot and I would like to point out that approach 1, embedded resources (cf. above), actually works.
I haven't tried the other two approaches due to a lack of understanding regarding these topics.
Some issues remain, but in my opinion, this question has been answered.
I've read the Microsoft article about resource-based authorization with IAuthorizationService, but it only allows authorizing a single resource at a time. For example, I have a User class and a File class. A File has an owner and can be public or not, so a file can be viewed only if it is public or the user is its owner. I need to display a list of all files for different users, so each user will see all public files and all files they own.
In the authorization handler I have this:
protected override Task HandleRequirementAsync(AuthorizationHandlerContext context,
OwnerOrPublicRequirement requirement,
File resource)
{
if (resource.IsPublic || context.User.Identity?.Name == resource.Owner)
{
context.Succeed(requirement);
}
return Task.CompletedTask;
}
Then in the controller I had to do it like this:
List<File> authorizedFiles = new List<File>();
foreach (var file in _dbContext.Files)
{
var result = await _authorizationService
.AuthorizeAsync(User, file, new OwnerOrPublicRequirement());
if (result.Succeeded)
{
authorizedFiles.Add(file);
}
}
But it looks ugly because I have to load all the files from the DB and then filter them one by one. What if I have millions of files and most of them are neither public nor owned by the user? I won't be able to load and filter them like this without running out of memory. I could rewrite it as a LINQ query and let the DB do all the work:
var userName = User.Identity?.Name;
var authorizedFiles = _dbContext.Files
    .Where(f => f.IsPublic || f.Owner == userName)
    .ToList();
But then I will have two places with code that does the same thing, so whenever I need to change the authorization logic I have to fix two different parts of the code. So what is the proper way of doing this?
Don't use the custom authorization provider; it adds too much extra cost and complexity.
Have one place to get the list of files and let the database do the heavy work of filtering and sorting by filename.
Having to know dozens or hundreds of special features of the ASP.NET framework is death by a thousand cuts. Each piece of special knowledge costs minutes per year for you and future developers to support, and adds risk to the project.
Combined, hundreds of these small extra features and bits of specialized knowledge add man-days (months?) to the cost of keeping your production system alive and enhancing it. Microsoft seems to have forgotten "keep it simple" and keeps adding dozens of specialized-knowledge features with each new version of ASP.NET.
A developer should be able to read the application main program, then trace how each piece of code in the entire application code base is called without needing to know internals/extensibility hell micro-trivia of the ASP.NET framework.
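To make the "one place" suggestion concrete, here is a sketch (names are illustrative) of a single query-side filter that the listing code and anything else can reuse, so the rule lives in exactly one spot and the database does the filtering:
using System.Linq;

public static class FileQueries
{
    // Single source of truth for the owner-or-public rule;
    // EF translates the Where into SQL, so filtering happens in the database.
    public static IQueryable<File> VisibleTo(this IQueryable<File> files, string userName)
    {
        return files.Where(f => f.IsPublic || f.Owner == userName);
    }
}

// In the controller:
var authorizedFiles = _dbContext.Files
    .VisibleTo(User.Identity?.Name)
    .ToList();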
Here's the deal: I have a site where multiple people will be sharing the same account and should each be able to be on a page that uploads files and keeps a list of the files they've uploaded that session. The controller for the file uploading page looks like
public class FileUploadController : Controller
{
// ...
private List<ThreePartKey> uploadedFiles = new List<ThreePartKey> ();
public ActionResult Index ( )
{
// ...
}
[HttpPost]
public ActionResult Index (HttpPostedFileBase file)
{
// ...
if (!errorOccured)
{
uploadedFiles.Add(new ThreePartKey { orgname = selectedOrgName, catname = selectedCatName, filename = fileNameNoExtension, fullfilepath = newFileUrlPathAndName });
}
// ...
}
and the problem is that uploadedFiles keeps getting re-initialized whenever [HttpPost] public ActionResult Index (HttpPostedFileBase file) is called, meaning the user's list of uploaded files only shows the last uploaded one. So I instead tried
private static List<ThreePartKey> uploadedFiles = new List<ThreePartKey> ();
and that screwed up everything because all the signed-in users are sharing the same list.
Is there any easy way to do what I'm trying to do?
Controllers are instantiated and destroyed on every request. If you want to persist information in a webserver it is strongly advised to use a permanent backing store such as a database.
You can use static state in ASP.NET applications (WebForms, MVC, OWIN, etc) however this is only recommended for caching for performance. It cannot be relied upon because static state is only local to the current AppDomain in the current Application Pool (w3wp.exe instance) - if your website is run in multiple pools or appdomains, or if your application is restarted (or killed due to inactivity) then the stored state is lost.
One option is to provide a 'session' code/id with each request. When the user first connects to your site, they are given a session-code (I use 'code' to indicate it has nothing to do with what we would normally call 'session').
Every link has that session-code as part of the url and every post includes the session-code. Then your upload cache can be:
private static ILookup<int, ThreePartKey> uploadedFiles;
(or dictionary if you prefer)
private static IDictionary<int, IList<ThreePartKey>> uploadedFiles;
Whether this is workable or not depends on the size of the rest of your site - in most cases probably not as described... but it could be managed, e.g. use the IP address as the 'code', or if you're using AngularJS or a single-page application.
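A minimal sketch of the dictionary variant, swapping in a ConcurrentDictionary for thread safety and assuming the sessionCode arrives with every post (both names are illustrative):
using System.Collections.Concurrent;

private static readonly ConcurrentDictionary<string, List<ThreePartKey>> uploadsBySession =
    new ConcurrentDictionary<string, List<ThreePartKey>>();

[HttpPost]
public ActionResult Index(HttpPostedFileBase file, string sessionCode)
{
    // ... validation and saving as before ...
    var list = uploadsBySession.GetOrAdd(sessionCode, _ => new List<ThreePartKey>());
    lock (list)
    {
        list.Add(new ThreePartKey { /* orgname, catname, filename, fullfilepath */ });
    }
    return View(list);
}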
As pointed out, any static/singleton cache will still be lost if the app pool is reset, eg via the inactivity timeout setting in IIS.
Another option is to persist the files in subfolders based on the user's IP address.
You've only stipulated that they all use the same login, not how the files are stored etc, so maybe this would work for you.
I have a logging class that, well, logs things. I would like to add the ability to automatically have the current page be logged with the messages.
Is there a way to get the information I'm looking for?
Thanks,
From your class you can use the HttpContext.Current property (in System.Web.dll). From there, you can create a chain of properties:
Request
Url and RawUrl
The underlying handler (HttpContext.Current.Handler) is a Page object, so if you cast it to that, you can then use anything you would normally use from within a Page, such as the Request property.
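For example, from a class that is not a Page (a small sketch; adjust to taste):
// anywhere in the web app, e.g. inside the logging class
string rawUrl = System.Web.HttpContext.Current.Request.RawUrl;

// or, if you really need the Page instance itself:
var page = System.Web.HttpContext.Current.CurrentHandler as System.Web.UI.Page;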
It's brittle and hard to test but you can use System.Web.HttpContext.Current which will give you a Request property which in turn has the RawUrl property.
public static class MyClass
{
public static string GetURL()
{
HttpRequest request = HttpContext.Current.Request;
string url = request.Url.ToString();
return url;
}
}
I tried to break it down a little :)
In the past I've also rolled my own logging classes and used Console.WriteLine(), but really there are a number of good logging options that already exist, so why go there? I use NLog pretty much everywhere; it is extremely flexible with various log output destinations including console and file, lots of log format options, and is trivial to set up, with versions targeting the various .NET frameworks including Compact. Running the installer will add NLog config file options to the Visual Studio Add New Item dialog. Using it in your code is simple:
// declare in your class
private static Logger logger = LogManager.GetCurrentClassLogger();
...
// use in your code
logger.Debug(() => string.Format("Url: {0}", HttpContext.Current.Request.Url));
I am in the process of moving all of the images in my web application over to a CDN but I want to easily be able to switch the CDN on or off without having to hard code the path to the images.
My first thought was to add an HttpHandler for image extensions that, depending on a variable in the web.config (something like ), will serve the image from the server or from the CDN. But after giving this a little thought I think I've essentially ruled this out, as it will cause ASP.NET to handle the request for every single image, thus adding overhead, and it might actually completely negate the benefits of using a CDN.
An alternative approach is, since all of my pages inherit from a base page class, I could create a function in the base class that determines what path to serve the files from based off the web.config variable. I would then do something like this in the markup:
<img src='<%= GetImagePath() %>/image.png' />
I think this is probably what I'll have to end up doing, but it seems a little clunky to me. I also envision problems with the old .NET error of not being able to modify the control collection because of the "<%=" though the "<%#" solution will probably work.
Any thoughts or ideas on how to implement this?
You've dismissed writing an HttpHandler based on an assumption of premature optimization. I would revisit this and definitely write a simple HttpHandler and test it out. You might find that your Page method solution is even slower, especially if you get the ASP preprocessor involved.
HttpHandlers are pretty close to the metal - it's a minuscule amount of overhead for IIS to hand the request to ASP.NET. It would be a more elegant solution than what you're proposing, probably more scalable and - I'm willing to bet - faster.
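As a starting point, a stripped-down handler might look like the following; the "UseCdn"/"CdnRoot" settings and the content-type handling are placeholders you would flesh out:
using System;
using System.Configuration;
using System.Web;

public class ImageHandler : IHttpHandler
{
    public bool IsReusable { get { return true; } }

    public void ProcessRequest(HttpContext context)
    {
        bool useCdn = string.Equals(ConfigurationManager.AppSettings["UseCdn"], "true",
                                    StringComparison.OrdinalIgnoreCase);
        if (useCdn)
        {
            // Send the browser to the CDN copy of the same path.
            string cdnRoot = ConfigurationManager.AppSettings["CdnRoot"];
            context.Response.RedirectPermanent(cdnRoot + context.Request.Path);
        }
        else
        {
            // Serve the local file directly.
            context.Response.ContentType = "image/png"; // derive from the extension in real code
            context.Response.TransmitFile(context.Server.MapPath(context.Request.Path));
        }
    }
}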
Have you considered a slightly simpler approach?
If your pages all inherit from a base class, you could expose a property on that which contains the prepend URL to your CDN (or, to your local server if you want to switch the CDN off). It is then a trivial matter of storing the prepend URL in the web.config:
public string PrependURLPath
{
    get { return ConfigurationManager.AppSettings["ImagePrependURL"]; }
}
In your <appSettings/> element, you can simply choose what the prepend URL would be, eg:
http://my.cdn.com/user/
or:
http://my.own.server.com/images/
Pretty simple!
You would then be able to code your image references as per your example, but calling your base page property to expose the desired path:
<img src='<%= this.BasePage.PrependURLPath + [YourImagePath.png] %>'/>
I agree that setting the image source through the inline call is messy, but you could probably do as someone else has suggested and then iterate through the image controls on your page, changing the prepend URL as you go.
Even if your pages currently only inherit from System.Web.UI.Page, it's a simple matter to create your own base class which inherits System.Web.UI.Page, then do a find/replace in your solution on all remaining pages.
Hope this helps.
Weighing in pretty late here, but I've been looking for a similar solution myself and searched Google to sanity-check what I had done. I didn't consider the HttpHandler approach; what I did was simply extend the ASP.NET Image control:
using System.Web.UI.WebControls;

public class Img : Image
{
public Img()
{
RelativePath = false;
}
public bool RelativePath { get; set; }
public override string ImageUrl
{
get
{
if (RelativePath)
return base.ImageUrl;
return "http://some.configurable-value.com" + base.ImageUrl;
}
set { base.ImageUrl = value; }
}
}
It's rough and ready, but it works :) Obviously it should rely on some configurable value rather than a string literal, but that's not a big change.
If you display your images using Image controls, you could create a control adapter; these allow you to alter the way .NET controls render, or universally alter them. Something like this should do the trick:
using System;
using System.Web.UI.WebControls.Adapters;
using System.Web.UI;
using System.Web.UI.WebControls;
namespace ExampleCode
{
public class ImageAdapter : WebControlAdapter
{
private bool UseCdn
{
get { return true; } // Get value from config or anywhere else
}
protected override void OnPreRender(EventArgs e)
{
base.OnPreRender(e);
Image image = (Image)Control;
if (UseCdn)
{
// If using relative urls for images may need to handle ~
image.ImageUrl = String.Format("{0}/{1}", "CDN URL", image.ImageUrl);
}
}
}
}
Then add a browser file to the App_Browsers folder in your web project like below:
<browsers>
<browser refID="Default">
<controlAdapters>
<adapter
controlType="System.Web.UI.WebControls.Image"
adapterType="ExampleCode.ImageAdapter"
/>
</controlAdapters>
</browser>
</browsers>
You could loop through all the controls and change the image URLs in the PreRender event of your base class...
The good thing about the HTTP Handler approach is that it's quite re-usable and configurable: you can identify img paths to handle based on location - assuming the structure they're in helps this.
The possible drawback is that image file extensions (.jpg, .png, etc.) aren't automatically passed on to the ASP.NET pipeline; you can easily configure IIS to do so - but you need to have a certain level of control over IIS - so it might not be an option if you're on a shared hosting environment.
I would go for Rhys's approach above for the Image control.
Most of the time, though, I try to use CSS background images rather than Image controls.
I then upload both the CSS and the images together to the cloud, and the relative paths work fine.
Doesn't look like there has been an accepted answer yet so here is my suggestion. I had similar problems dealing with modifying URL's transparently (to a different end, but I thought about using it for CDN support as well).
This is an old filter / module but it worked well for my needs with a little tuning: http://www.paraesthesia.com/archive/2007/12/14/urlabsolutifiermodule---convert-urls-in-asp.net-output-to-absolute.aspx
What you can do is make a response filter and hook it with an httpmodule (as this absolutifier does). If you use this module + response filter you could probably achieve what you need by modifying the source for it to replace the hostname / prefix all urls to use the CDN.
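The core of that approach is a response filter that rewrites the HTML on the way out. A bare-bones version is below; the CDN host and the src match are illustrative only, and a real filter should buffer the whole response so a tag is never split across Write calls:
using System.IO;
using System.Text;

public class CdnRewriteFilter : MemoryStream
{
    private readonly Stream _output;

    public CdnRewriteFilter(Stream output)
    {
        _output = output;
    }

    public override void Write(byte[] buffer, int offset, int count)
    {
        // Rewrite local image paths to point at the CDN before passing the bytes on.
        string html = Encoding.UTF8.GetString(buffer, offset, count);
        html = html.Replace("src=\"/images/", "src=\"http://cdn.example.com/images/");
        byte[] rewritten = Encoding.UTF8.GetBytes(html);
        _output.Write(rewritten, 0, rewritten.Length);
    }
}

// Hooked up from an IHttpModule, e.g.:
// context.Response.Filter = new CdnRewriteFilter(context.Response.Filter);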
I had to solve your problem and another one: I do not want to take resources from the CDN during development, only when the website is deployed on the production server.
To solve this I developed an ExpressionBuilder that prepends the CDN URL only in production.
<asp:Image ImageUrl="<%$ CdnUrl:/images/myimage.png %>" runat="server" />
In the previous code the CDN URL will be prepended only in production.
using System;
using System.CodeDom;
using System.ComponentModel;
using System.Web;
using System.Web.Compilation;
using System.Web.UI;

namespace IdeaR.Web.Compilation
{
[ExpressionPrefix("CdnUrl")]
public class CdnUrlExpressionBuilder : ExpressionBuilder
{
public static object GetCdnUrl(string expression, Type target, string entry)
{
var retvalue = expression;
var productionUri = new Uri("http://www.myproductionurl.com",
UriKind.Absolute);
var currentUri = HttpContext.Current.Request.Url;
var cdnUrl = "http://cdn.mycdn.com";
// If this is a production website URL
if (currentUri.Scheme == productionUri.Scheme &&
currentUri.Host == productionUri.Host)
retvalue = cdnUrl + expression;
return retvalue;
}
public override CodeExpression GetCodeExpression(BoundPropertyEntry entry,
object parsedData, ExpressionBuilderContext context)
{
var componentType = entry.DeclaringType;
var expressionArray = new CodeExpression[3]
{
new CodePrimitiveExpression(entry.Expression.Trim()),
new CodeTypeOfExpression(componentType),
new CodePrimitiveExpression(entry.Name)
};
var descriptor = TypeDescriptor.GetProperties(componentType)
[entry.PropertyInfo.Name];
return new CodeCastExpression(descriptor.PropertyType,
new CodeMethodInvokeExpression(
new CodeTypeReferenceExpression(GetType()),
"GetCdnUrl", expressionArray));
}
}
}
For more information, I wrote an article on this:
How to use a CDN in production but not during development