Our product has a collection of sites, and the main page contains 3 iframes which load these different web sites. We are going to enable SSL on all of the sites. We allow HTML user data to be displayed in our systems. Currently we have put this on hold because we are running into mixed content issues, for the following reasons:
Some of the elements in the user's data refer to http content (e.g. img, js, etc.).
Some of the third-party content which loads in our iframes (from different content providers).
We thought of developing our own web proxy, but we are concerned about both the performance and the cost of that solution. Can anybody tell us what solutions are available for mixed content issues, and whether there are third-party web proxies we can buy?
The best solution would probably be to purchase remote servers from some service (Google will give you millions of hits) and then set up a CGI script to load the insecure content onto the remote server, cache it, and then serve that content. That way your users are protected from third parties knowing what they look at, and if you set up your SSL certificate on those servers you can easily get around the mixed content.
That being said, there will be a big hiccup when you start loading your users' content off the remote server, as it will have to start caching everything.
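As a rough illustration of that idea, a caching pass-through in classic ASP.NET might look like the sketch below; the handler name, the one-hour cache policy, and the missing host allow-list are assumptions you would need to tighten up:

// ProxyCache.ashx.cs - a sketch only, assuming classic ASP.NET (System.Web).
// Served from your own https host, so the browser never talks to the insecure origin.
using System;
using System.Net;
using System.Web;
using System.Web.Caching;

public class ProxyCache : IHttpHandler
{
    public void ProcessRequest(HttpContext context)
    {
        // e.g. /ProxyCache.ashx?url=http://thirdparty.example/image.png
        string url = context.Request.QueryString["url"];

        // IMPORTANT: restrict this to hosts you trust, or you have built an open proxy.
        byte[] data = context.Cache[url] as byte[];
        if (data == null)
        {
            using (var client = new WebClient())
                data = client.DownloadData(url);

            // Cache for an hour (illustrative policy) so repeat views are cheap.
            context.Cache.Insert(url, data, null,
                DateTime.UtcNow.AddHours(1), Cache.NoSlidingExpiration);
        }

        // For brevity the original Content-Type is not preserved here.
        context.Response.BinaryWrite(data);
    }

    public bool IsReusable { get { return true; } }
}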
Using a web proxy is not a good solution, for the following reasons:
It has the performance problems and the cost you mentioned.
The most problematic part of this solution is that we would still have a security vulnerability. The point of using https on a site is to protect it from sniffing and man-in-the-middle attacks. If you use a web proxy, the connection between your browser and your proxy is still vulnerable.
I'm not sure whether a web proxy would help in any way, because the browser always interprets these links as http even if your server is SSL enabled.
For more information about mixed content: https://developer.mozilla.org/en-US/docs/Security/MixedContent
The correct way to deal with this situation is to modify all your links to load content over https. An even better way is to use protocol-relative URLs:
<script src="//example.com/scripts/main.js"></script>
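If the http references live in user-supplied markup, you can also rewrite them when rendering. A minimal sketch, assuming a .NET back end and that a simple regex is good enough for your data (an HTML parser such as HtmlAgilityPack would be more robust):

using System.Text.RegularExpressions;

static class LinkRewriter
{
    // Turns src="http://host/path" (and href=...) into protocol-relative
    // src="//host/path" so the browser reuses the embedding page's scheme.
    public static string MakeProtocolRelative(string html)
    {
        return Regex.Replace(
            html,
            @"(src|href)\s*=\s*([""'])http://",
            "$1=$2//",
            RegexOptions.IgnoreCase);
    }
}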
Mixed content warnings are built into browsers by design, and they indicate exactly what they mean. Users can turn them off in settings or just click OK, so by triggering mixed content warnings you're degrading the UI, but not the functionality.
A few things come to mind, since the providers can't change their content:
Write a back-end scraper for your app that scrapes the web page and serves the content locally over https.
Don't render the content immediately; make the user click on it to open the iframe, so that at least your page loads and you can warn the user (optional).
Enhance either solution by checking for https first; a lot of websites have both 80 and 443 open, but as you pointed out, not all of them.
Not too familiar with this one, but you could maybe even have a server instance of Internet Explorer open the pages and cache them for you, simplifying the scrape.
If I were writing this, I would check for https when possible and allow the mixed content warnings, since all of that is by design.
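A rough .NET-flavoured sketch of that https check (the HEAD method and the five-second timeout are assumptions; some servers only answer GET):

using System.Net;

static class HttpsProbe
{
    // Returns true if the host answers a HEAD request over https with a valid
    // certificate and a success status; any WebException counts as "no https".
    public static bool SupportsHttps(string host)
    {
        try
        {
            var request = (HttpWebRequest)WebRequest.Create("https://" + host + "/");
            request.Method = "HEAD";
            request.Timeout = 5000;
            using (var response = (HttpWebResponse)request.GetResponse())
                return (int)response.StatusCode < 300;
        }
        catch (WebException)
        {
            return false;   // closed port, bad certificate, or error status
        }
    }
}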
I am very new to developing a web front-end.
I want to create an internally hosted site that pulls together a collection of resources.
This is the scenario:
I want to "embed" pages within the web app. My thought is to make the embedded site look exactly like it does originally, but keep the navigation header above it. I have googled quite a bit to try to get a good sense of where to take this, and from what I have found, an iframe is the way to go.
The issues:
We host Dell OpenManage Essentials on one of our servers. The only way to access it is through https://ome/. We currently do not have a CA in place, so the certificate on that server is expired, and the certificate error the browser shows is therefore accurate.
My question:
1) Is an iframe the right approach for this situation?
2) How do I bypass the warning, or at least give the user the ability to continue to the embedded site? These sites are all internal.
You can't decide for a user that content from a site with an expired certificate should be displayed; the user has to accept that risk. That's why browsers now immediately flag pages with expired certificates and make it super obvious. There are few cases where you'd actually want to bypass this - so few that the common browsers just don't make exceptions.
The VERY difficult way is to route your iframe URL through a proxy that doesn't care about expired certs.
The less difficult way is to spend $50 on a certificate. Or you can even get a free one (YMMV) at https://startssl.com (at your own risk; I am unaffiliated).
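For completeness, if you did go the proxy route, .NET lets the server-side request opt out of certificate validation. A sketch only - scope it to the one internal host ("ome" comes from your URL), never globally in production:

using System.Net;

// Run once at application start-up. Accepts the broken certificate only for
// the internal OME host; everything else keeps normal validation.
ServicePointManager.ServerCertificateValidationCallback =
    (sender, certificate, chain, sslPolicyErrors) =>
    {
        if (sslPolicyErrors == System.Net.Security.SslPolicyErrors.None)
            return true;                        // certificate is actually fine

        // For WebRequest-based calls the sender is the originating HttpWebRequest.
        var request = sender as HttpWebRequest;
        return request != null && request.RequestUri.Host == "ome";
    };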
I have a program that opens a web browser control and just displays a web page from our server. They can't navigate around or anything.
The users are not allowed to know the credentials required to log in, so after some googling on how to log into a server I found this:
http://user_name:password@URL
This is hard-coded into the web browser control's code, and it works fine.
HOWEVER: some smart ass managed to grab the credentials by using Wireshark, which captures all the packets sent from your machine.
Is there a way I can encrypt this so the users cannot find out?
I've tried other things, like using POST, but with the way the page was set up it was proving extremely difficult to get working. (It's an SSRS Report Manager web page.)
I forgot to include a link to this question: How to encrypt/decrypt the url in C#
^I cannot use this answer as I myself am not allowed to change any of the server setup!
Sorry if this is an awful question, I've tried searching around for the past few days but can't find anything that works.
Perhaps you could work around your issue with a layer of indirection - for example, you could create a simple MVC website that doesn't require any authentication (or indeed, requires some authentication that you fully control) and it is this site that actually makes the request to the SSRS page.
That way you can have full control over how you send authentication, and you need never worry about someone ever getting access to the actual SSRS system. Now if your solution requires the webpage to be interactive then I'm not sure this will work for you, but if it's just a static report, it might be the way to go.
i.e. your flow from the app would be
User logs into your app (or use Windows credentials, etc)
User clicks to request the SSRS page
Your app makes an HTTP request to your MVC application
Your MVC application makes the "real" HTTP request to SSRS (e.g. via HttpClient) and dumps the result back to the caller (for example, it could write the SSRS response via @Html.Raw in an MVC view). The credentials for SSRS are therefore never sent by your app, so you don't need to worry about that problem any more.
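A rough sketch of that relay action, assuming classic ASP.NET MVC; the controller name, report URL, and service account are placeholders, and a real version should prefer an async action and stream large reports:

using System.Net;
using System.Net.Http;
using System.Web.Mvc;

public class ReportController : Controller
{
    [Authorize]   // your own authentication, nothing to do with the SSRS account
    public ActionResult Show()
    {
        var handler = new HttpClientHandler
        {
            // Service account known only to the server; it never reaches the browser.
            Credentials = new NetworkCredential("ssrsServiceUser", "ssrsPassword", "DOMAIN")
        };

        using (var client = new HttpClient(handler))
        {
            string html = client
                .GetStringAsync("http://reportserver/ReportServer?/Sales/MonthlyReport&rs:Format=HTML4.0")
                .Result;   // fine for a sketch; use an async action in real code

            return Content(html, "text/html");
        }
    }
}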
Just a thought.
Incidentally, you could take a look here at the various options that SSRS allows for authentication; you may find a method that suits (e.g. custom authentication). I know you mentioned you can't change anything on the server, so I'm just including it for posterity.
Q: Is it possible to manipulate the HTTP request headers, or to use any other technique in C#, when making a request (to servers like yahoo.com or cnn.com), so that the size of the returned page (stream) can be greatly reduced - a simplified web page without all the extra scripts/images/CSS? Or, even better, can I request that only the sub-section of the web page I'm interested in be downloaded? I just need the response to be as small as possible so it can be downloaded as fast as possible, before the page is processed later.
It really depends on the site, the services it provides, and the configuration it has. Things that may help to look for (not a complete list):
An API that lets you access the data directly, e.g. an XML or JSON response.
Compression - your client has to request it via the appropriate HTTP headers, e.g. Accept-Encoding: gzip, deflate, and needless to say it must know how to process the response accordingly. There is an SO thread on doing this in C#.
Requesting the mobile version of the site, if the site supports such a thing. How a site exposes such a version really depends on the site: some prefix their URLs with m., some respond to the User-Agent string, some use other strategies...
Using the HTTP Range header. This also depends on whether the site supports it. There is an MSDN link for the .NET API.
Having a play with tweaking some of the browser capabilities in your HTTP request headers (see here). The response will vary from site to site, but this is how a client tells the server what it is capable of displaying and dealing with.
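A minimal C# sketch of the compression point; AutomaticDecompression both sends the Accept-Encoding: gzip, deflate header and unpacks the reply (the URL is just an example):

using System.Net;
using System.Net.Http;
using System.Threading.Tasks;

static class CompressedDownloader
{
    public static async Task<string> DownloadAsync(string url)
    {
        var handler = new HttpClientHandler
        {
            // Advertises gzip/deflate support and decompresses the response for you.
            AutomaticDecompression = DecompressionMethods.GZip | DecompressionMethods.Deflate
        };

        using (var client = new HttpClient(handler))
            return await client.GetStringAsync(url);   // e.g. "http://www.cnn.com/"
    }
}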
There is no way to ask a server to render a different amount of data beyond what the server supports, via C# or any other language. I.e. there is no generic mechanism to tell a server "don't render inline CSS/JS/images", "don't render ad content", or even "just give me the article text".
Many sites have "mobile" versions with potentially smaller page sizes, but they likely contain different or less information than the desktop version. You should be able to request the mobile version by picking a different URL or by specifying a User-Agent corresponding to a phone.
Some sites provide data as an RSS feed or by some other means of obtaining the data automatically - you may want to check with each site.
If you know the particular portion of the page to download, you may be able to use the Range header on the GET request, but it may not be supported by dynamic pages.
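A sketch of such a ranged GET, assuming the server advertises Accept-Ranges (many dynamically generated pages do not); the 4 KB cut-off is arbitrary:

using System.Net.Http;
using System.Net.Http.Headers;
using System.Threading.Tasks;

static class PartialDownloader
{
    // Asks for the first byteCount bytes only; the server replies 206 Partial
    // Content if it honours the Range header, otherwise it sends the full page.
    public static async Task<string> DownloadFirstBytesAsync(string url, int byteCount = 4096)
    {
        using (var client = new HttpClient())
        using (var request = new HttpRequestMessage(HttpMethod.Get, url))
        {
            request.Headers.Range = new RangeHeaderValue(0, byteCount - 1);
            using (var response = await client.SendAsync(request))
                return await response.Content.ReadAsStringAsync();
        }
    }
}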
Side notes:
- most sites will serve CSS/JS as separate files.
- make sure to check each site's license to see if there are any limitations.
I have a Windows application developed in C#/.NET which is used as a website blocker for a network. I have done this by modifying the hosts file. It works fine when URLs like "www.yahoo.com" are blocked. Now my requirement is to block URLs based on keywords, i.e. when the user just types "yahoo" in the browser, I should detect the keyword and block the corresponding website. How can I track the website the user types into the browser and allow or block that particular site based on the URL? I should not allow the user to view the page if the keyword is present. Can someone help me do this?
There are plenty of code samples out there that will act as proxies (e.g. http://code.cheesydesign.com/?p=393); however, I would strongly suggest following the advice of the comments you've been given and going with an existing application.
Building a proxy that will not interfere with the complicated web apps of today is not trivial. You also need to be careful about blocking based on keywords - web apps I've worked on have failed in spectacular ways because proxies did this and rejected requests for important JavaScript files (often due to minification or compression), rendering our app useless.
Also consider that your proxy won't be able to inspect SSL traffic (which is increasing all the time) without serving up your own certs and acting as a man-in-the-middle.
My site has https sections (ssl), and others are regular http (not using ssl).
Are there any issues going from ssl to non-ssl pages?
Sometimes a user will click on a link which is ssl, then click on another link that leaves https for http-based URLs.
I understand that when on an ssl page, all images also have to be served using https.
What other issues do I have to handle?
I recall that a popup sometimes appears telling the user about a security issue, such as some content not being secure; I am guessing that happens when you are on https and the page loads images that are not served over https.
Mixing is generally a bad idea, just because it tends to detract from the user experience and coding around the differences makes the application that much harder to maintain. If you need SSL for even a little of the site, I'd recommend putting it all behind SSL. Some companies use a hybrid: plain http for the public "low end" site and SSL for the actual customer experience.
As Miyagi mentioned, the session sometimes gets goofy, but it's not impossible if you keep the session stored in an external location. This means all session objects must be serializable, compact, etc., and it also means you'll need to manage the session ID in a common browser element (a cookie is usually the safest).
There is a good article on CodeProject about this theme. The author encapsulates the switching through code and configuration. Not so long ago I tried to go this way - and stopped. There were some handling problems, but the main reason for stopping was the bad user experience mentioned by Joel above.
If you are using sessions on your site you will lose any session information when switching between ssl pages and non-ssl pages.