ASP.NET reCAPTCHA in Internet Explorer SSL Warning Message - c#

I have implemented the reCAPTCHA solution (latest version), into my ASP.NET web project. It works fine and dandy in my local environment, but on our SSL encrypted server I receive the warning message "Do you want to view only the webpage content that was delivered securely? This webpage contains content that will not be delivered using a secure HTTPS connection, which could compromise the security of the entire webpage." This only occurs in Internet Explorer.
I've found suggestions, on these forums and others, to change the deprecated server URL to Google's new one (old: https://api-secure.recaptcha.net, new: https://www.google.com/recaptcha/api), but I am not directly referencing the JavaScript files, just using the .NET library.
Any help would be greatly appreciated!

If the MVC Helper is using Context.Request.IsSecureConnection as Dan has pointed out above, and your application server is behind a load balancer that terminates HTTPS and forwards the request as plain HTTP, then IsSecureConnection will likely be false and rendering will take place insecurely.
If you are behind a load balancer, one way to find the originating protocol is to do something like this (provided you have access to the X-Forwarded-Proto header field):
bool isSecureConnection = String.Equals(
    filterContext.HttpContext.Request.Headers["X-Forwarded-Proto"],
    "https",
    StringComparison.OrdinalIgnoreCase);
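For context, a minimal sketch of where a check like that might live, assuming an ASP.NET MVC action filter (the attribute name and the HttpContext.Items key are illustrative, not part of any library):
using System;
using System.Web.Mvc;

// Illustrative filter: records whether the original browser request was HTTPS,
// even when the load balancer forwards it to the app server as plain HTTP.
public class DetectForwardedProtoAttribute : ActionFilterAttribute
{
    public override void OnActionExecuting(ActionExecutingContext filterContext)
    {
        bool isSecureConnection = String.Equals(
            filterContext.HttpContext.Request.Headers["X-Forwarded-Proto"],
            "https",
            StringComparison.OrdinalIgnoreCase)
            || filterContext.HttpContext.Request.IsSecureConnection;

        // Stash the result so views or helpers further down can use it.
        filterContext.HttpContext.Items["IsSecureConnection"] = isSecureConnection;

        base.OnActionExecuting(filterContext);
    }
}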

Browsing the control's source shows that it has a property called OverrideSecureMode that, when set to true, always causes the control to render via HTTPS.
The MVC Helper, on the other hand, doesn't seem to allow setting that property. It seems to be using Context.Request.IsSecureConnection to determine which hostname to use; discovering why that value is wrong for you is another way to attack the problem.
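If you are using the WebForms RecaptchaControl rather than the MVC helper, a sketch of forcing HTTPS rendering from code-behind might look like this (the control ID "recaptcha" is assumed; the property name comes from the control's source as mentioned above):
protected void Page_Load(object sender, EventArgs e)
{
    // "recaptcha" is assumed to be the RecaptchaControl declared in the markup.
    // Setting this makes the control emit https:// URLs even when
    // Request.IsSecureConnection is false behind a load balancer.
    recaptcha.OverrideSecureMode = true;
}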

Related

Redirecting from http to https on a secure connection loses the values in the POST request. How can I preserve them?

In my project, I have an .ashx page which accepts POST requests from outside, and collects a string variable by using this code:
string infoPost = httpRequest["infoPost"].ToString();
This code works perfectly on my local machine or on an IIS server.
The problem started when I published it to an IIS server that I don't have control over. Somehow the object was coming back empty and I was getting an "Object reference not set to an instance of an object." error on this code.
I did a bit of research and found out that the secure-connection setting is causing the problem. The IIS server redirects all "http" requests to "https", but it loses the infoPost variable while doing that. I tested this idea by calling the page with "https" directly, and this time the code worked perfectly and I grabbed the posted string.
But I don't want to hardcode it. I tried to work out whether the website is set up as a secure connection or not by using this code:
string strSecure = "http://";
if (HttpContext.Current.Request.IsSecureConnection)
    strSecure = "https://";
Again, this code worked well on my local machine, but it doesn't work on the IIS server I mentioned.
Sorry for the long explanation; here are my questions in short:
When IIS somehow redirects http requests to https, it is unable to pass along the parameters inside the POST. Is that true? How can I prevent it?
I want to determine whether a website is published over a secure connection or not, but it seems like HttpContext.Current.Request.IsSecureConnection doesn't work on that IIS server. Is my assumption correct? Is there any other way to work out which scheme I should be using (http or https)?
You don't want POSTed values to be carried across a redirect from http to https; that would be a security hole. The purpose of this behavior is to force you to confront whatever part of the application is POSTing values in clear text, because they are exposed before the redirect ever happens.
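As for deciding between http and https without hardcoding it, one approach is to combine IsSecureConnection with the forwarded-protocol header that many proxies add; a rough sketch (the X-Forwarded-Proto header is an assumption about what sits in front of that IIS server):
using System;
using System.Web;

public static class SchemeHelper
{
    // Returns "https://" when the connection itself is secure, or when the
    // proxy in front of IIS reports that the original request used HTTPS.
    public static string GetScheme(HttpContext context)
    {
        bool isSecure = context.Request.IsSecureConnection
            || String.Equals(
                   context.Request.Headers["X-Forwarded-Proto"],
                   "https",
                   StringComparison.OrdinalIgnoreCase);

        return isSecure ? "https://" : "http://";
    }
}

// Usage: string strSecure = SchemeHelper.GetScheme(HttpContext.Current);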

How to make Microsoft.Web.Helpers.ReCaptcha work with X-Forwarded-Proto

We are using the ASP.NET Web Helpers Library 3.2.2 (NuGet link) to develop a web app that will run on Amazon behind an Elastic Load Balancer (ELB).
It looks like all requests being sent between the ELB and web server are unencrypted. This causes context.Request.IsSecureConnection to be false.
As far as the browser is concerned the request is secure, it shows https:// and goes through port 443 as expected.
The following code in the Web Helpers Library causes browser warnings/errors when the browser expects a secure connection for all resources.
UrlBuilder urlBuilder =
    new UrlBuilder(context.Request.IsSecureConnection
        ? "https://www.google.com/recaptcha/api"
        : "http://www.google.com/recaptcha/api");
Question: What, if anything, can I do about this? Without replacing this ReCaptcha helper with a home-grown one, is there anything I can do to force context.Request.IsSecureConnection to be true when the connection between the browser and the ELB is secure?
We use context.Request.Headers["X-Forwarded-Proto"] to determine if it's on a secure connection in other places we have control over.
Just have it render the embed code to a string and then manipulate the string to change the URL when the forwarded-protocol header is present (or blanket-set it to HTTPS), as sketched below.
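A rough sketch of that workaround in a Razor view follows; the GetHtml call and its publicKey parameter are assumptions based on the Web Helpers documentation, so check the exact overload in your version of the library:
@using Microsoft.Web.Helpers
@{
    // Render the helper's embed markup to a plain string.
    string recaptchaHtml = ReCaptcha.GetHtml(publicKey: "your-public-key").ToString();

    // The ELB terminates SSL, so trust its forwarded-protocol header
    // (or simply rewrite to https unconditionally).
    bool isSecure = string.Equals(
        Request.Headers["X-Forwarded-Proto"], "https",
        StringComparison.OrdinalIgnoreCase);

    if (isSecure)
    {
        recaptchaHtml = recaptchaHtml.Replace(
            "http://www.google.com/recaptcha/api",
            "https://www.google.com/recaptcha/api");
    }
}
@Html.Raw(recaptchaHtml)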

'blocked' The page at 'X' was loaded over HTTPS, but ran insecure content from

How can I fix this block if everything happens inside my domain?
[blocked] The page at 'https://example.com/secure/CMS/Edit/Default.aspx' was loaded over HTTPS, but ran insecure content from 'http://example.com/en/?idkeep=True&DE_VM=4&DE_LNK=183_185790&DE_RND=536512159&id=183_185790': this content should also be loaded over HTTPS.
This happens in the CMS EPiServer 6.
I was getting this problem. I removed the "http:" prefix from that link.
old: <iframe src="http://example.com">
new: <iframe src="//example.com">
Now it works perfectly.
Thank you.
You cannot put insecure (http, not https) content (images, stylesheets, inline frames, etc.) on a secure (https) webpage.
The browser will block the insecure element while loading the page.
Search for where your page (https://example.com/secure/CMS/Edit/Default.aspx) is trying to access the insecure URL (http://example.com/en/?idkeep=True&DE_VM=4&DE_LNK=183_185790&DE_RND=536512159&id=183_185790) and just add an 's' after http.
In other words, your page should try to access https://example.com/en/, not http://example.com/en/.
But I know: if the requested page is not available over HTTPS, there is nothing more you can do :-S
Because I took over this project midway, with no other developer to ask, I only later learned that this server sits behind a reverse proxy. So the server changes the protocol without any code being involved.
But to fix or minimize this problem I made sure all script references use a protocol-relative URL, removing http[s]:// and leaving just //.
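If you need to do that rewriting in code rather than by hand, a small, purely illustrative helper could strip the scheme like this:
using System;

public static class ProtocolRelativeUrl
{
    // Turns "http://example.com/x" or "https://example.com/x"
    // into the protocol-relative form "//example.com/x".
    public static string Make(string url)
    {
        if (String.IsNullOrEmpty(url))
        {
            return url;
        }

        if (url.StartsWith("http://", StringComparison.OrdinalIgnoreCase))
        {
            return url.Substring("http:".Length);
        }

        if (url.StartsWith("https://", StringComparison.OrdinalIgnoreCase))
        {
            return url.Substring("https:".Length);
        }

        return url;
    }
}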
We resolved this problem by setting the link properties to open up in a new tab and not attempt to display the insecure page in the frame on the secure page.

Why does DotNetNuke log me out on post ajax requests?

Previously, when I tried to make an ajax call to an ashx as a non-superuser account (i.e. as a portal-specific user), my web server would return cookies that cleared my authorization. I posted a question about this and it seemed the answer was to make sure that portalid=xx was specified in my GET parameters.
However, I have just found out that if I add portalid=xx in a POST request, DotNetNuke seems to ignore it and log out any non-superuser account.
How can I keep authorization during DNN POST ajax requests?
I think I have a good handle on the whole situation, and unfortunately it appears that the only true solution is to make sure each child portal has its own subdomain rather than a sub-url (e.g. portal.domain.com rather than domain.com/portal).
The problem is that when your portal 0 is domain.com but portal 1 is domain.com/portal everything works correctly until you need to access an .ashx file via ajax. What happens then is the URL that's requested is instead domain.com/DesktopModules/MyModule/Handler.ashx, which does not contain the /portal/ in it, thus causing DNN to think you are doing a request on portal 0 and logging you out.
While GET requests can overcome this with a portal=1 parameter, this does not seem to work for POST requests.
Therefore, the best solution it seems is to have your portal on a distinct subdomain (portal.domain.com), and then you don't risk missing something like this.
I've found a few things for you to check out and see if any of them solve your problem.
Make sure you are using a ScriptManagerProxy. This allows ascx pages to use AJAX while the parent page is also using AJAX.
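For reference, a ScriptManagerProxy declaration inside a module's .ascx typically looks something like this (the ID is illustrative):
<%-- The parent DNN page already provides the ScriptManager; the proxy lets this
     user control register its own script and service references. --%>
<asp:ScriptManagerProxy ID="ScriptManagerProxy1" runat="server" />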
There have been many reports of people not being able to run AJAX with DNN if Page State Persistence is set to "Memory". Those who experience this have been able to fix it by switching Page State Persistence to "Page". The easiest way to do this is to run this query:
update HostSettings
set SettingValue='P'
where SettingName='PageStatePersister'
After you run that, you'll need to recycle the application. If you don't have access to the server, just add a space or carriage return to your web.config file (that will force the app to recycle).
Lastly, you might see if you have this line in your web.config. Sometimes removing it will help:
<system.web>
  <xhtmlConformance mode="Legacy" />
</system.web>

How to get started using DotNetOpenAuth

I created a simple page using the code provided by this page (the first sample):
http://www.dotnetopenauth.net/developers/code-snippets/programmatic-openid-relying-party/
But I can't seem to get it to work: I can redirect to the provider, but when the provider redirects back to my page, I get error 500, "The request was rejected by the HTTP filter".
I already checked the ISAPI filters, of which I have none.
I've never seen that error before. Is this page hosted by the Visual Studio Personal Web Server (Cassini) or IIS? I suspect you have an HTTP filter installed in IIS (or perhaps in your web.config file) that is rejecting the incoming message for some reason.
Note that you need to turn off ASP.NET's default page request validation on any page that can receive an OpenID authentication response, because those responses can include character sequences that look like HTML/JavaScript-injection attacks but are in fact harmless.
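Concretely, that usually means adding ValidateRequest="false" to the page that receives the authentication response, and on ASP.NET 4 also relaxing requestValidationMode in web.config so the page-level setting is honored (these are standard ASP.NET settings, not anything specific to DotNetOpenAuth):
<%@ Page Language="C#" ValidateRequest="false" %>  <%-- other Page attributes omitted --%>

<!-- web.config: needed on ASP.NET 4, where validation runs before the page-level setting applies -->
<system.web>
  <httpRuntime requestValidationMode="2.0" />
</system.web>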
I discovered that ISA Server is in use on the server, so I just followed these instructions to get it working:
http://blog.brianfarnhill.com/2009/02/19/sharepoint-gets-the-error-the-request-was-rejected-by-the-http-filter/
