Security warning: parent website is HTTP but iframe is HTTPS - C#

We are a third-party widget services company for payments (for example). A PARENT website can embed our widget, built in ASP.NET MVC 5, in an iframe. Our widget URL is completely secure (HTTPS), but the parent website where it gets embedded is not. That parent website is used by hundreds of people to make payments. The problem is that they see the whole website is not "SECURE"; they cannot see that the iframe where they are making the payment is secure. Is there any way I can solve this issue? How can I make the parent website detect that the iframe it is using is HTTPS, and hence mark the whole page as secure? Or if there is another way to handle it, please guide me.
Parent - HTTP
iframe - HTTPS
Users: scared of making payments

You can't, because it's in fact not secure. HTTPS only provides protection if it's HTTPS all the way: an attacker who can tamper with the plain-HTTP parent page can just as easily replace your secure iframe with one pointing at a server they control. If any part drops to HTTP, the whole channel is insecure.
Tell your clients to implement SSL to protect their users. If they choose not to, there's not really anything you can do about it. It might be worth updating your terms of service to indemnify your organization against damages should a client decline to use SSL and a breach occur because of that while using your widget. Basically, transfer the legal responsibility to the client.
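What you can do is detect the situation and warn the user inside the widget itself. Here is a minimal sketch in ASP.NET MVC, relying on the Referer header that browsers usually send with the iframe request; the header can be absent or stripped, so treat this as best-effort only, and the controller and view names are illustrative assumptions:

using System;
using System.Web.Mvc;

public class WidgetController : Controller
{
    public ActionResult Index()
    {
        // UrlReferrer is the embedding page's URL when the browser sent
        // a Referer header with the iframe request; null otherwise.
        Uri parent = Request.UrlReferrer;
        ViewBag.InsecureParent =
            parent != null && parent.Scheme == Uri.UriSchemeHttp;

        // The view can render a "the page embedding this payment form is
        // not secure" notice when InsecureParent is true.
        return View();
    }
}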

Related

ASP.NET embedding internal pages within page with expired certificate C#

I am very green to developing a web front-end.
I wanted to create an externally hosted site that will pull in a collection of resources.
This is the scenario:
I want to "embed" pages within the web app. My thoughts, make the site called look exactly like it does originally, but keep the navigation header above. I have googled quite a bit to try and get a good direction in where to take this. From what I have found, iframe is the way to go.
The issues:
We host Dell OpenManage Essentials on one of our servers. The only way to access this is through https://ome/. We currently do not have a CA in place, therefore the certificate that is currently on the server is expired. The browser error is accurate, due to the lack of a valid certificate.
My question:
1) Is an iframe the right approach to this situation?
2) How do I bypass the warning, or at least give the user the ability to continue to the embedded site? These sites are all internal.
You can't decide for a user to display content from a site with an expired certificate. The user has to accept that risk. That's why browsers now immediately flag pages with expired certificates and make it super-obvious. There are few cases where you'd actually want to bypass this - so few that the common browsers just don't make exceptions.
The VERY difficult way is to route your IFRAME URL through a proxy that doesn't care about expired certs.
The less difficult way is to spend $50 on a certificate. Or you can even get a free one (YMMV) at https://startssl.com (at your own risk; I am unaffiliated).
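For illustration, here is a minimal sketch of what that proxy route could look like in ASP.NET MVC, assuming the internal host https://ome/ from the question; the controller name and route are made up. Skipping certificate validation reopens the door to man-in-the-middle attacks, so confine anything like this to a trusted internal network.

using System.IO;
using System.Net;
using System.Web.Mvc;

public class OmeProxyController : Controller
{
    // Serves https://ome/ through our own (validly certified) site so the
    // iframe never triggers the expired-certificate warning.
    public ActionResult Index()
    {
        var request = (HttpWebRequest)WebRequest.Create("https://ome/");

        // Accept the expired certificate for this request only (requires
        // .NET 4.5+; earlier versions must use the global
        // ServicePointManager.ServerCertificateValidationCallback).
        request.ServerCertificateValidationCallback =
            (sender, cert, chain, errors) => true;

        using (var response = request.GetResponse())
        using (var reader = new StreamReader(response.GetResponseStream()))
        {
            return Content(reader.ReadToEnd(), "text/html");
        }
    }
}

Note that relative links and assets inside the proxied page will still point at the original host, so a real proxy would also have to rewrite them.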

Solution for mixed content blocking

Our product has a collection of sites, and the main page contains 3 iframes which load these different websites. We are going to enable SSL on all of the sites. We allow HTML user data to be displayed in our system. Currently we have put this on hold, since we experience mixed content issues for the following reasons:
Some of the elements in the user's data refer to HTTP content, e.g. img, js, etc.
Some of the third-party content which loads in our iframes (from different content providers) does the same.
We thought of developing our own web proxy, but we have concerns about the performance as well as the expense of this solution. Can anybody tell me what solutions are available for these mixed content issues, and whether there is a third-party web proxy we can buy?
The best solution would probably be to purchase remote servers from some service (Google will give you millions of hits) and then set up a CGI script to load the insecure content onto the remote server, cache it, and then serve that content. That way your users are protected from third parties knowing what they look at, and if you set up your SSL certificate on those servers you can easily get around the mixed content.
That being said, there will be a big hiccup when you start loading your users' content off the remote server, as it will have to start caching everything.
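As a rough sketch of that idea in the ASP.NET MVC stack this question is about (the controller name, cache duration, and content type are assumptions, not a definitive implementation):

using System.Net;
using System.Web.Mvc;

public class AssetProxyController : Controller
{
    // Fetches an http:// resource server-side and re-serves it from our
    // HTTPS origin; OutputCache avoids re-downloading on every hit.
    [OutputCache(Duration = 3600, VaryByParam = "url")]
    public ActionResult Get(string url)
    {
        // In real code, validate 'url' against a whitelist here, or this
        // action becomes an open proxy that anyone can abuse.
        using (var client = new WebClient())
        {
            byte[] data = client.DownloadData(url);
            return File(data, "image/png"); // content type is illustrative
        }
    }
}

User markup would then reference /AssetProxy/Get?url=http://... instead of the insecure URL directly.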
Using a web proxy is not a good solution, for the following reasons:
There are the performance problems and the expense of this solution, like you said.
More problematic is that we would still have a security vulnerability. The point of using https on a site is to protect it from sniffers and man-in-the-middle attacks. If you use a web proxy, the connection between your browser and your proxy is still vulnerable.
I'm not sure whether a web proxy would help in any way, because the browser always interprets these links as http even if your server is SSL enabled.
For more information about mixed content: https://developer.mozilla.org/en-US/docs/Security/MixedContent
The correct way to deal with this situation is to modify all of your links to load content over https. An even better way is to use protocol-relative URLs:
<script src="//example.com/scripts/main.js"></script>
(Note that the part after // is the host: a src of //scripts/main.js would be interpreted as a server named "scripts". The browser reuses the current page's scheme, http or https, when fetching the resource.)
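Since the question is about an ASP.NET product, here is a minimal sketch of rewriting stored user HTML that way before rendering; the class name and regex are illustrative assumptions and would need hardening for real-world markup. It also only helps when the target host actually serves HTTPS, otherwise the rewritten link breaks on secure pages.

using System.Text.RegularExpressions;

public static class LinkRewriter
{
    // Matches src/href attributes whose value starts with http://
    private static readonly Regex HttpUrl =
        new Regex(@"(src|href)=([""'])http://", RegexOptions.IgnoreCase);

    // Turns http:// into the protocol-relative // form so the browser
    // reuses the scheme of the page embedding the content.
    public static string ToProtocolRelative(string userHtml)
    {
        return HttpUrl.Replace(userHtml, "$1=$2//");
    }
}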
Mixed content warnings are built into browsers by design and indicate exactly what they say. The user can turn them off in settings or just click OK, so by triggering mixed content you're degrading the user experience, but not functionality.
A few things come to mind, since the providers can't change their content:
Write a back-end scraper for your app that scrapes the web page and serves the content locally over https.
Don't render the content immediately; make the user click on it to open the iframe, so that at least your page loads and you can warn the user (optional).
Enhance either solution by checking for https first; a lot of websites have both 80 and 443 open, but as you pointed out, not everybody (see the probe sketch after this list).
I'm not too familiar with this one, but you could maybe even have a server instance of Internet Explorer open the pages and cache them for you, simplifying the scrape.
If I were writing this, I would check for https when possible and allow the mixed content warnings, as all of that is by design.
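A small sketch of that https-first check, assuming a plain TCP probe of port 443 is good enough for your case; an open port does not guarantee a valid certificate, so treat a positive result as a hint, not proof:

using System.Net.Sockets;

public static class HttpsProbe
{
    // Returns true if the host accepts a TCP connection on port 443
    // within the timeout; used to decide whether to try the https:// URL.
    public static bool SupportsHttps(string host, int timeoutMs = 2000)
    {
        try
        {
            using (var client = new TcpClient())
            {
                var async = client.BeginConnect(host, 443, null, null);
                return async.AsyncWaitHandle.WaitOne(timeoutMs)
                       && client.Connected;
            }
        }
        catch (SocketException)
        {
            return false;
        }
    }
}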

Single Sign-On on Winforms Web Browser control

I have a Winforms app which hosts a web browser control. Within this control you can also navigate to pre-determined external websites.
I need to implement Single Sign-On so that the user doesn't need to authenticate in each of the known external websites. I have already some ideas but it would be nice to hear all your opinions.
What would be the best way to do this?
In fact, is there something already for this? (edit: how do browsers remember logins/passwords)?
Cheers
Generally it is the responsibility of the site to implement SSO, and the client will then automatically respond to the site. Since the web browser control is using IE it inherits the same capabilities as the web browser. For example if the site uses Windows Authentication then the control will authenticate following a challenge from the site without user intervention. Similarly the control will perform the necessary redirects if the site is using SAML 2.0.
Since these are external websites I have to assume that Windows Authentication is not going to work well because the server and the client are on different domains. Therefore something along the lines of SAML sounds like the most secure option.
It seems like implementing SAML is going to be a problem for you, and you will need to manually complete and submit the web forms which load inside the control. This is possible by accessing the DOM, but it quickly becomes a difficult-to-maintain solution.
The web browser control offers a Document property that gives you an HtmlDocument object, which allows you to find elements and execute JavaScript in pages. You need to use these mechanisms to perform the authentication automatically. The steps might look like this:
Capture the URL, or some cookie that will let you know that authentication is required, by inspecting the web browser control's properties. You might want to look into the Navigating/Navigated events.
Access the document and complete the form values.
Call a JavaScript submit function to submit the form, or inject some JavaScript to do this. In a lot of cases I find it easier to insert JavaScript into pages than to write more complicated C# code; it is easier to prototype in a regular browser.
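As a concrete sketch of those three steps, assuming a designer-created WebBrowser named webBrowser1; the element ids, form name, and login URL check are assumptions about the target site, not part of any real API beyond the WinForms control itself:

using System;
using System.Windows.Forms;

public partial class HostForm : Form
{
    public HostForm()
    {
        InitializeComponent();
        webBrowser1.DocumentCompleted += OnDocumentCompleted;
    }

    private void OnDocumentCompleted(object sender,
        WebBrowserDocumentCompletedEventArgs e)
    {
        // Step 1: decide whether the page we landed on needs a login.
        if (!e.Url.AbsolutePath.ToLowerInvariant().Contains("login"))
            return;

        // Step 2: complete the form values through the DOM.
        HtmlDocument doc = webBrowser1.Document;
        doc.GetElementById("username")?.SetAttribute("value", "jdoe");
        doc.GetElementById("password")?.SetAttribute("value", "secret");

        // Step 3: submit the form; InvokeScript would work instead for a
        // page that exposes its own JavaScript submit function.
        doc.Forms["loginForm"]?.InvokeMember("submit");
    }
}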
Unless the websites all share a common trusted authentication mechanism (like OpenID) you're stuck doing custom coding for each site.
Browsers remember passwords for single sites. I wouldn't call that "single sign-on", which is a method of using a trusted authority to authenticate across multiple disparate web sites which all rely on that authority to verify a user's identity.
As you asked for SSO packages that already do this, some examples are:
RSA ClearTrust
CA SiteMinder

Detecting SSL Browser support

How can you detect if the client browser has SSL support? I am not referring to the server variables HTTPS_*. I want to be able to determine if the browser has no SSL support.
P.S. I know this is possible because this company (http://www.cyscape.com) has a product that can even detect when you deselect SSL support in your browser options.
All browsers have SSL support (period). No one is going to release a browser that cannot be used. HTTPS is a security requirement and part of OWASP A3: Broken Authentication and Session Management.
While it is relatively easy to check if a server supports SSL connections, detecting browser support for the same is extremely difficult. The solution likely requires a client-side browser extension that implements the logic necessary to search through browser configuration or version information for SSL support. This problem becomes even more difficult, because the extension would need to work with multiple browsers.
If you do not want visitors that cannot connect to a particular page over SSL, there are usually server-side methods you can employ, such as redirecting them to a landing page where they are notified of the SSL requirement, or simply denying their web request.
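In ASP.NET MVC, one sketch of the landing-page approach is to subclass the framework's built-in RequireHttpsAttribute and override its non-HTTPS handler; the controller and action names here are assumptions:

using System.Web.Mvc;
using System.Web.Routing;

public class SslRequiredOrExplainAttribute : RequireHttpsAttribute
{
    // Instead of the default redirect to the https:// URL, send the
    // visitor to a page that explains the SSL requirement.
    protected override void HandleNonHttpsRequest(
        AuthorizationContext filterContext)
    {
        filterContext.Result = new RedirectToRouteResult(
            new RouteValueDictionary
            {
                { "controller", "Home" },
                { "action", "SslRequired" }
            });
    }
}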
As mentioned, there's no reason a modern web browser would have SSL disabled by default.
At the SSL level, does your server receive a connection when you give the browser an https link?
At the HTTP level, you could try various scenarios that assign a session cookie via HTTP, then update some session variables via links only accessible via HTTPS. Or you could set the "secure" attribute on a cookie and see how the browser handles it.
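A sketch of that session-variable idea in ASP.NET MVC (the controller and action names are made up): issue the session cookie normally, set a flag from an HTTPS-only action, and check for the flag from a plain-HTTP one.

using System.Web.Mvc;

public class SslProbeController : Controller
{
    // Only reachable over HTTPS, so the flag proves this browser
    // completed at least one SSL request during the session.
    [RequireHttps]
    public ActionResult Mark()
    {
        Session["sslOk"] = true;
        return new EmptyResult();
    }

    // Served over plain HTTP; the session cookie carries across schemes
    // as long as it is not marked Secure.
    public ActionResult Check()
    {
        bool sslWorked = Session["sslOk"] != null;
        return Content(sslWorked ? "ssl supported" : "ssl not confirmed");
    }
}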
You could try a JavaScript methodology and inspect the window.location property or just try setting it to an https link. (Or try some Java functions using LiveConnect or do something similar with Flash.)
Is there a particular motivation for the question? If you're trying to determine SSL support for browsers that for some bizarre reason don't have SSL enabled, then a cookie or JavaScript approach should be fine. If you're trying to determine SSL support for an adversarial browser (e.g. a bot that doesn't follow robots.txt) or you have more reason to not trust client-side checks like JavaScript, then checking SSL either might not be a useful solution or you might have to go deeper into seeing if the SSL handshake differs from common browsers.
Checks for whether or not a client supports SSL will be subject to Man-in-the-Middle attacks where an active network attacker gains control of the user's connection and makes it appear as if the client doesn't support SSL.
This question is most often asked when developing mobile sites targeted at older phones that may not support real SSL (they support WAP-TLS). The number of phones in this category continues to shrink and my suggestion is to ignore them, maybe even going so far as to blacklist their user agents.

Going from http to https, what issues do I have to handle?

My site has https sections (ssl), and others are regular http (not using ssl).
Are there any issues going from ssl to non-ssl pages?
Sometimes the user will click on a link, which will be SSL, then click on another link that leaves https for http-based URLs.
I understand that when on an SSL page, all images have to be served using https as well.
What other issues do I have to handle?
I recall that a popup sometimes displays, telling the user about a security issue, like some content not being secure. I am guessing that happens when you are under https and the page is loading images that are not under https.
Mixing is generally a bad idea, just because it tends to detract from the user experience, and coding around the differences makes the application that much harder to maintain. If you need SSL for even a little of the site, I'd recommend putting it all behind SSL. Some companies use a hybrid: plain HTTP for the public "low end" site and SSL for the actual customer experience.
As Miyagi mentioned, session sometimes gets goofy, but it's not impossible if you keep the session stored in an external location. This means all session objects must be serializable, compact, etc., and it also means you'll need to manage the session ID in a common browser element (a cookie is usually the safest).
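A minimal sketch of what that serialization requirement looks like in ASP.NET (the type and key names are illustrative): with the out-of-process StateServer or SQLServer session modes, anything placed in Session must be marked serializable, or the save fails at the end of the request.

using System;

[Serializable]
public class CheckoutState
{
    public string CartId { get; set; }
    public decimal Total { get; set; }
}

// Usage in a controller or page, identical across http and https pages
// as long as the session cookie is shared between them:
// Session["checkout"] = new CheckoutState { CartId = "c-42", Total = 19.99m };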
There is a good article on CodeProject about this topic. The author encapsulates the switching in code and configuration. Not so long ago I tried to go this way - and stopped. There were some handling problems, but the main reason for stopping was the bad user experience mentioned by Joel above.
If you are using sessions on your site, you can lose session information when switching between SSL pages and non-SSL pages (for example, if the session cookie is marked secure, it is never sent over plain http).
