ResolveUrl XSS alternative - c#

After the research published showing that .aspx routes are vulnerable to reflected XSS, what is the recommended alternative to using Page.ResolveUrl or Control.ResolveUrl? The linked article doesn't suggest any mitigations.
Summary of the linked research:
For .aspx pages (not MVC), even if you don't have cookieless sessions enabled, ASP.NET still parses "special" URL formats such as http://www.example.com/(S(lit3py55t21z5v55vlm25s55))/orderform.aspx
and includes them in the page output whenever you use ResolveUrl.
This creates an attack vector where a call like ResolveUrl( "~/Images/logo.png" ) will inject content of the attacker's choice into your page output, e.g.
/(S("onerror="alert`1`"))/Images/logo.png
I've posted one possible answer below but am looking for better ideas.
Note that ResolveClientUrl is not a direct replacement, since it generates a relative URL, e.g. ../Images/logo.png, unlike ResolveUrl, which generates a root-relative URL, e.g. /myapp/Images/logo.png

One approach is to use HttpRuntime.AppDomainAppVirtualPath instead of the special tilde syntax. So the example from above...
Instead of:
ResolveUrl( "~/Images/logo.png" )
We would have:
HttpRuntime.AppDomainAppVirtualPath.TrimEnd( '/' ) + "/Images/logo.png"
Slightly less concise but seems to accomplish the same thing without invoking the ancient "cookieless" route parsing.
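For example, that pattern could be wrapped in a small helper (SafeResolveUrl is a made-up name, just a sketch; requires System.Web):

// Hypothetical helper: resolves "~/..." app-relative paths against the app root
// without going through the tilde parsing that enables the cookieless-URL trick.
public static string SafeResolveUrl(string appRelativePath)
{
    string appRoot = HttpRuntime.AppDomainAppVirtualPath.TrimEnd('/');
    return appRoot + "/" + appRelativePath.TrimStart('~', '/');
}
// SafeResolveUrl("~/Images/logo.png") -> "/myapp/Images/logo.png"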

Use ResolveClientUrl instead of ResolveUrl.
ResolveClientUrl will not allow XSS.


Change razor syntax

How can I change razor syntax in RazorEngine?
I need to use a specific keyword instead of the "@" symbol.
For example: $$Model.someField instead of @Model.someField ("$$" instead of "@").
You can't. Razor is not really designed in a way that allows it. Basically, (Microsoft.AspNet.)Razor has some specially written parsers which handle "@" in a special manner (by switching parsers). This means the languages (C# and HTML in this case) themselves need to be compatible with this procedure as well!
If you want to replace "@" with something else you need to rewrite the Razor parsers. This is possible, but at that point you have already implemented the hardest part of Razor yourself...
The real question you should ask yourself (or answer here) is: why do you want to do it? It is not as trivial as one would think; I have been at this point before.
As freedomn-m suggested, you should use @Html.Raw("@") or @@ if you need to output a "@".
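For illustration, a minimal sketch of the escaping rule using RazorEngine's API (assuming RazorEngine 3.7+; the template string and model are made up):

using RazorEngine;
using RazorEngine.Templating;

// "@@" renders a literal "@"; @Html.Raw("@") achieves the same inside a view.
string template = "Contact @Model.Name at support@@example.com";
string result = Engine.Razor.RunCompile(template, "demoKey", null, new { Name = "Ann" });
// result: "Contact Ann at support@example.com"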
matthid
- a RazorEngine contributor

URL from user, XSS security? [duplicate]

We have a high security application and we want to allow users to enter URLs that other users will see.
This introduces a high risk of XSS hacks - a user could potentially enter javascript that another user ends up executing. Since we hold sensitive data it's essential that this never happens.
What are the best practices in dealing with this? Is any security whitelist or escape pattern alone good enough?
Any advice on dealing with redirections (for instance, a "this link goes outside our site" message on a warning page before following the link)?
Is there an argument for not supporting user entered links at all?
Clarification:
Basically our users want to input:
stackoverflow.com
And have it output to another user as a clickable link:
<a href="http://stackoverflow.com">stackoverflow.com</a>
What I really worry about is them using this in an XSS hack. I.e. they input:
javascript:alert('hacked!');
So other users get this link:
<a href="javascript:alert('hacked!');">stackoverflow.com</a>
My example is just to explain the risk - I'm well aware that javascript and URLs are different things, but by letting them input the latter they may be able to execute the former.
You'd be amazed how many sites you can break with this trick - HTML is even worse. If they know to deal with links do they also know to sanitise <iframe>, <img> and clever CSS references?
I'm working in a high security environment - a single XSS hack could result in very high losses for us. I'm happy that I could produce a Regex (or use one of the excellent suggestions so far) that could exclude everything that I could think of, but would that be enough?
If you think URLs can't contain code, think again!
https://owasp.org/www-community/xss-filter-evasion-cheatsheet
Read that, and weep.
Here's how we do it on Stack Overflow:
/// <summary>
/// returns "safe" URL, stripping anything outside normal charsets for URL
/// </summary>
public static string SanitizeUrl(string url)
{
    return Regex.Replace(url, @"[^-A-Za-z0-9+&@#/%?=~_|!:,.;\(\)]", "");
}
The process of rendering a link "safe" should go through three or four steps:
1. Unescape/re-encode the string you've been given (RSnake has documented a number of tricks at http://ha.ckers.org/xss.html that use escaping and UTF encodings).
2. Clean the link up. Regexes are a good start; make sure to truncate the string or throw it away if it contains a " (or whatever you use to close the attributes in your output). If you're doing the links only as references to other information, you can also force the protocol at the end of this process: if the portion before the first colon is not 'http' or 'https', append 'http://' to the start. This allows you to create usable links from incomplete input, as a user would type into a browser, and gives you a last shot at tripping up whatever mischief someone has tried to sneak in.
3. Check that the result is a well-formed URL (protocol://host.domain[:port][/path][/[file]][?queryField=queryValue][#anchor]).
4. Possibly check the result against a site blacklist or try to fetch it through some sort of malware checker.
If security is a priority I would hope that the users would forgive a bit of paranoia in this process, even if it does end up throwing away some safe links.
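As a rough C# sketch of steps 2 and 3 (the method name and details are illustrative, not a vetted implementation; requires System.Web):

public static string MakeSafeLink(string input)
{
    string url = HttpUtility.UrlDecode(input ?? "").Trim();  // step 1, simplified: unescape once
    if (url.Contains("\"")) return null;                     // step 2: reject attribute-closing quotes
    if (!url.StartsWith("http://", StringComparison.OrdinalIgnoreCase) &&
        !url.StartsWith("https://", StringComparison.OrdinalIgnoreCase))
    {
        url = "http://" + url;                               // step 2: force the protocol
    }
    Uri parsed;                                              // step 3: well-formed absolute URL?
    return Uri.TryCreate(url, UriKind.Absolute, out parsed) ? parsed.AbsoluteUri : null;
}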
Use a library, such as OWASP-ESAPI API:
PHP - http://code.google.com/p/owasp-esapi-php/
Java - http://code.google.com/p/owasp-esapi-java/
.NET - http://code.google.com/p/owasp-esapi-dotnet/
Python - http://code.google.com/p/owasp-esapi-python/
Read the following:
https://www.golemtechnologies.com/articles/prevent-xss#how-to-prevent-cross-site-scripting
https://www.owasp.org/
http://www.secbytes.com/blog/?p=253
For example:
$url = "http://stackoverflow.com"; // e.g., $_GET["user-homepage"];
$esapi = new ESAPI( "/etc/php5/esapi/ESAPI.xml" ); // Modified copy of ESAPI.xml
$sanitizer = ESAPI::getSanitizer();
$sanitized_url = $sanitizer->getSanitizedURL( "user-homepage", $url );
Another example is to use a built-in function. PHP's filter_var function is an example:
$url = "http://stackoverflow.com"; // e.g., $_GET["user-homepage"];
$sanitized_url = filter_var($url, FILTER_SANITIZE_URL);
Using filter_var allows javascript calls, and filters out schemes that are neither http nor https. Using the OWASP ESAPI Sanitizer is probably the best option.
Still another example is the code from WordPress:
http://core.trac.wordpress.org/browser/tags/3.5.1/wp-includes/formatting.php#L2561
Additionally, since there is no way of knowing where the URL links (i.e., it might be a valid URL, but the contents of the URL could be mischievous), Google has a safe browsing API you can call:
https://developers.google.com/safe-browsing/lookup_guide
Rolling your own regex for sanitation is problematic for several reasons:
Unless you are Jon Skeet, the code will have errors.
Existing APIs have many hours of review and testing behind them.
Existing URL-validation APIs consider internationalization.
Existing APIs will be kept up-to-date with emerging standards.
Other issues to consider:
What schemes do you permit (are file:/// and telnet:// acceptable)?
What restrictions do you want to place on the content of the URL (are malware URLs acceptable)?
Just HTMLEncode the links when you output them. Make sure you don't allow javascript: links. (It's best to have a whitelist of protocols that are accepted, e.g., http, https, and mailto.)
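A sketch of that whitelist-then-encode approach in C# (the class name, scheme list and plain-text fallback are my assumptions):

using System;
using System.Collections.Generic;
using System.Web;

public static class LinkRenderer
{
    static readonly HashSet<string> AllowedSchemes =
        new HashSet<string>(StringComparer.OrdinalIgnoreCase) { "http", "https", "mailto" };

    public static string Render(string url)
    {
        Uri parsed;
        // Anything that is not an absolute URL with a whitelisted scheme is shown as plain text.
        if (!Uri.TryCreate(url, UriKind.Absolute, out parsed) || !AllowedSchemes.Contains(parsed.Scheme))
            return HttpUtility.HtmlEncode(url);
        string encoded = HttpUtility.HtmlEncode(parsed.AbsoluteUri);
        return "<a href=\"" + encoded + "\">" + encoded + "</a>";
    }
}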
You don't specify the language of your application, so I will presume ASP.NET, for which you can use the Microsoft Anti-Cross Site Scripting Library.
It is very easy to use; all you need is an include and that's it :)
While you're on the topic, why not give Design Guidelines for Secure Web Applications a read?
If you're using any other language... if there is a library for ASP.NET, equivalents are likely available for other languages as well (PHP, Python, RoR, etc.).
For Pythonistas, try Scrapy's w3lib.
OWASP ESAPI pre-dates Python 2.7 and is archived on the now-defunct Google Code.
How about not displaying them as a link? Just use the text.
Combined with a warning to proceed at their own risk, that may be enough.
Addition: see also Should I sanitize HTML markup for a hosted CMS? for a discussion on sanitizing user input.
There is a library for javascript that solves this problem
https://github.com/braintree/sanitize-url
Try it =)
In my project written in JavaScript I use this regex as a whitelist:
url.match(/^((https?|ftp):\/\/|\.{0,2}\/)/)
The only limitation is that you need to put ./ in front of files in the same directory, but I think I can live with that.
Using regular expressions to prevent XSS is complicated and thus hard to maintain over time, and it can leave some vulnerabilities behind. URL validation with a regular expression is helpful in some scenarios, but it is better not mixed in with vulnerability checks.
The solution is probably to use a combination of an encoder like AntiXssEncoder.UrlEncode for encoding the query portion of the URL and UriBuilder for the rest:
public sealed class AntiXssUrlEncoder
{
    public string EncodeUri(Uri uri, bool isEncoded = false)
    {
        // Encode the query portion of the URL to prevent XSS if it is not already
        // encoded; otherwise leave it as is. UriBuilder takes care of the rest.
        var encodedQuery = isEncoded
            ? uri.Query.TrimStart('?')
            : AntiXssEncoder.UrlEncode(uri.Query.TrimStart('?'));
        var encodedUri = new UriBuilder
        {
            Scheme = uri.Scheme,
            Host = uri.Host,
            Path = uri.AbsolutePath,
            Query = encodedQuery.Trim(),
            // Trim the leading '#'; the UriBuilder setter adds its own delimiter.
            Fragment = uri.Fragment.TrimStart('#')
        };
        if (uri.Port != 80 && uri.Port != 443)
        {
            encodedUri.Port = uri.Port;
        }
        return encodedUri.ToString();
    }

    public static string Encode(string uri)
    {
        var baseUri = new Uri(uri);
        var antiXssUrlEncoder = new AntiXssUrlEncoder();
        return antiXssUrlEncoder.EncodeUri(baseUri);
    }
}
You may need to add whitelisting to exclude some characters from encoding; that can be helpful for particular sites.
HTML-encoding the page that renders the URL is another thing you may need to consider.
BTW, please note that encoding the URL may break web parameter tampering, so the encoded link may appear not to work as expected.
Also, you need to be careful about double encoding.
P.S. AntiXssEncoder.UrlEncode would have been better named AntiXssEncoder.EncodeForUrl, to be more descriptive. Basically, it encodes a string for use in a URL; it does not encode a given URL and return a usable URL.
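Usage sketch (the URL is made up; per the code above, only the query string is encoded):

string safe = AntiXssUrlEncoder.Encode("https://example.com/search?q=<b>hi</b>&page=1");
// Scheme, host and path pass through unchanged; "<b>hi</b>" in the query is percent-encoded.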
You could hex-encode the entire URL and send it to your server. That way the client would not understand the content at first glance. After reading the content, you could decode the content URL = ? and send it to the browser.
Allowing a URL and allowing JavaScript are 2 different things.

Specify Route only on URL without suffix

I'm using ASP.NET 4.5 and have the following routing rule in my Global.asax file:
RouteTable.Routes.MapPageRoute("defaultRoute", "{*value}", "~/default.aspx")
What I'm trying to accomplish is redirecting dynamically generated URLs that are formatted like this:
http://myurl.com/firstnamelastname
Here is what one might actually look like:
http://myurl.com/davemackey
My problem is that the above redirects all requests - e.g. to axd or jpg files. Now I could add exclusions for every other type of file like so:
RouteTable.Routes.Ignore("{resource}.axd/{*pathInfo}")
But this would be error prone and tedious (e.g., what happens if someone adds another file type to the project?).
So, what I'd like to do is something like this:
RouteTable.Routes.MapPageRoute("defaultRoute", "{*value}(where no suffix)", "~/default.aspx")
Or, put into my clear English:
If URL does not have a suffix, then redirect using defaultRoute to ~/default.aspx
Any thoughts on how to accomplish this?
==
Update:
I found this MSDN article. It seems that using Constraints might work to implement what I am speaking of above...but I'm not exactly sure how...
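Something along these lines might work (untested sketch: a regex constraint on the catch-all parameter rejects any path containing a dot, so requests for *.axd, *.jpg, etc. skip the route):

RouteTable.Routes.MapPageRoute(
    "defaultRoute",
    "{*value}",
    "~/default.aspx",
    false,  // checkPhysicalUrlAccess
    null,   // defaults
    new RouteValueDictionary { { "value", @"[^.]*" } })  // only match paths with no '.'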
==
Update 2:
I've got a passable solution for the moment. I added the following:
RouteTable.Routes.Ignore("{path}/{value}")
Since image and other files are kept in sub-directories, this forces them to be excluded. Still, I have two concerns with this:
What if the path is longer than a single sub-directory, e.g. images/people/person.jpg?
What if a file is placed into the main root (shouldn't be, but it could happen) that is a jpg or etc.?

populating textboxes with less than / greater than symbols (< and >)

So I've been running into some problems: in various parts of the website I'm developing, I display logs that contain < and > symbols in various spots. When I display the log it works fine, but of course any time I navigate away I get an error of:
A potentially dangerous Request.Form value was detected from the client ...
Now I understand it's because < and > are HTML special characters. But is there any way to disable that check or somehow allow the page to display/process those values? I know I could strip those characters out of any place they may appear, but I'd rather not if I don't have to.
Any suggestions?
You didn't post any code, so I will assume you want something along the lines of:
<textbox><</textbox>
It's simple really, HTML encode your content:
<textbox>&lt;</textbox>
You can use HttpUtility.HtmlEncode to do this.
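For example, if the log text ends up in a Literal (the control and variable names are made up; a Literal does not encode by itself):

litLog.Text = HttpUtility.HtmlEncode(logText);  // "<" becomes "&lt;", ">" becomes "&gt;"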
Replace ">" with ">" and "<" with "<"
Read this to see a list of HTML's special entities.
If you simply want your web application to allow form input to contain potentially dangerous characters there are a few ways to do this depending on framework. I mostly use MVC myself, where you use the [ValidateInput(false)] attribute on your controller actions.
For WebForms, I'll direct you here instead.. http://msdn.microsoft.com/en-us/library/ie/bt244wbb.aspx :)
To answer your question, put ValidateRequest="false" in the <%@ Page ... %> directive.
Be careful, as you are now responsible for preventing script attacks. (Note that on .NET 4.0 and later, ValidateRequest="false" only takes effect if you also set <httpRuntime requestValidationMode="2.0" /> in web.config.)

Replace asp:Image relative path (../../images) with absolute path on another server

I have a web application where all of the image paths are relative,
for example '../../images/logo.png'.
I need to change all of the images in the application to point to another domain,
for example: 'static.domain.com/images/logo.png'.
Is there a fast way to change all the data?
Of course, the long option is to iterate over all the images and change them manually.
Replace all occurrences manually
Use a custom image class, inherited from System.Web.UI.WebControls.Image, to make the behavior regarding the image path configurable
In general you should encapsulate relative paths to resources in a code block, like this (using an ASP.NET MVC example as my classic ASP.NET skills are getting rusty):
<%= Url.Content( "~/images/logo.png" ) %>
To map them to a different path than the default, you can either define a custom route that matches all *.png files (and any other used formats) or introduce your own helper extensions so that you can rewrite the above to something like this:
<%= Url.Static( "~/images/logo.png" ) %>
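A minimal sketch of such a helper extension (the Static name and the "StaticHost" appSettings key are assumptions, not an existing API; assumes the app runs at the site root):

using System.Configuration;
using System.Web;
using System.Web.Mvc;

public static class UrlHelperExtensions
{
    // Maps "~/images/logo.png" to "//static.domain.com/images/logo.png" when a
    // static host is configured; otherwise falls back to the normal app-relative URL.
    public static string Static(this UrlHelper url, string contentPath)
    {
        string host = ConfigurationManager.AppSettings["StaticHost"];
        if (string.IsNullOrEmpty(host))
            return url.Content(contentPath);
        return "//" + host + VirtualPathUtility.ToAbsolute(contentPath);
    }
}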
The easiest is like @TBohnen.jnr said:
Why don't you do a find and replace (Ctrl + H) and replace "../../images/" with 'static.domain.com/images/'? Risky, but should be the easiest?
You can write an HttpHandler to handle .png files and change their address there.
Another way is to press Ctrl+F and find-and-replace ../../ with your new address across the entire solution, which takes a minute.
