I am very green when it comes to developing a web front-end.
I want to create an internally hosted site that pulls together a collection of resources.
This is the scenario:
I want to "embed" pages within the web app. My thought is to make the embedded site look exactly like it does originally, but keep my navigation header above it. I have googled quite a bit to try to get a good sense of direction on this, and from what I have found, an iframe is the way to go.
The issues:
We host Dell OpenManage Essentials on one of our servers. The only way to access it is through https://ome/. We currently do not have a CA in place, so the certificate currently on the server is expired; the certificate error the browser shows is expected given the lack of a valid certificate.
My questions:
1) Is an iframe the right approach to this situation?
2) How do I bypass the certificate warning, or at least give the user the ability to continue to the embedded site? These sites are all internal.
You can't decide for a user to display content from a site with an expired certificate. The user has to accept that risk. That's why browsers now immediately flag pages with expired certificates and make it super obvious. There are few cases where you'd actually want to bypass this check, so few that the common browsers just don't make exceptions.
The VERY difficult way is to route your iframe URL through a proxy that doesn't care about expired certs.
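To make the trade-off concrete, here is a minimal sketch of that proxy idea, assuming a hypothetical ASP.NET generic handler (OmeProxy.ashx) and the https://ome/ address from the question; the certificate callback is the part that ignores the expired certificate:

    using System.IO;
    using System.Net;
    using System.Web;

    // Hypothetical handler: the web app fetches the internal page server-side and
    // relays it into the iframe, so the user's browser never sees the bad certificate.
    public class OmeProxy : IHttpHandler
    {
        static OmeProxy()
        {
            // Accept any server certificate for outgoing requests from this app.
            // This is process-wide, so only do it in an app dedicated to proxying.
            ServicePointManager.ServerCertificateValidationCallback =
                (sender, cert, chain, errors) => true;
        }

        public void ProcessRequest(HttpContext context)
        {
            var request = (HttpWebRequest)WebRequest.Create("https://ome/");
            using (var response = (HttpWebResponse)request.GetResponse())
            using (var reader = new StreamReader(response.GetResponseStream()))
            {
                context.Response.ContentType = response.ContentType;
                context.Response.Write(reader.ReadToEnd());
            }
        }

        public bool IsReusable { get { return true; } }
    }

The difficulty is everything around this: relative links, scripts, and images inside the proxied page will still point at the original host, so you end up rewriting URLs as well, which is why this route is a last resort.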
The less difficult way is to spend $50 on a certificate. Or you can even get a free one (YMMV) at https://startssl.com (at your own risk. I am unaffiliated.)
I have a single solution with multiple C# ASP.NET Web Forms projects. I want a way to identify a given browser so that each website can identify that same browser. I need to do this from the C# code-behind (not with client code, like JavaScript). I also cannot use the Session, because it isn't shared across websites; I don't think cookies are either.
For example, if a user logs onto Website1 and then logs onto Website2 with the same browser on the same computer, I want to be able to identify that. But if a user logs onto Website1 with Chrome and then Website1 with Firefox (regardless of whether it's on the same computer or not), I want to detect that as well.
If it makes any difference, I am using Azure to publish my web projects, so all websites will have similar domains (e.g. website1.azurewebsites.net and website2.azurewebsites.net).
If you want to track someone using the same browser on the same computer then use a cookie. If the websites have different domains you'll need to be clever because modern browsers have a lot of protection against what they see as tracking cookies. One option is using a hidden interstitial page as described here.
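A minimal sketch of the cookie half, assuming a hypothetical cookie name of BrowserId and plain Web Forms code-behind; the same helper would need to exist in both sites, and because the sites live on different hosts the cookie is not shared automatically, which is where the interstitial trick comes in:

    using System;
    using System.Web;

    // Hypothetical helper: gives each browser a stable identifier via a persistent cookie.
    public static class BrowserIdentifier
    {
        private const string CookieName = "BrowserId"; // assumed name

        public static string GetOrCreate(HttpRequest request, HttpResponse response)
        {
            HttpCookie existing = request.Cookies[CookieName];
            if (existing != null && !string.IsNullOrEmpty(existing.Value))
                return existing.Value;

            string id = Guid.NewGuid().ToString("N");
            response.Cookies.Add(new HttpCookie(CookieName, id)
            {
                Expires = DateTime.UtcNow.AddYears(1),
                HttpOnly = true
            });
            return id;
        }
    }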
For your second scenario, a user accessing the same site with different browsers, I suggest storing the user agent string (one of the request headers) and adding it to a login audit so you can build up a collection of the different user agents used by a given user. There are libraries available for parsing user agent strings and extracting the browser name, version, engine, etc.
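As a sketch of that audit (the LoginAudit type and where you persist it are made up; only Request.UserAgent is the real API):

    using System;
    using System.Web;

    // Hypothetical audit record written on every successful login.
    public class LoginAudit
    {
        public string UserName { get; set; }
        public string UserAgent { get; set; }
        public DateTime LoggedInAtUtc { get; set; }
    }

    public static class LoginAuditor
    {
        // Call from the login code-behind after authentication succeeds.
        public static LoginAudit Record(string userName, HttpRequest request)
        {
            var audit = new LoginAudit
            {
                UserName = userName,
                UserAgent = request.UserAgent, // raw User-Agent request header
                LoggedInAtUtc = DateTime.UtcNow
            };
            // Persisting the record (database, table storage, etc.) is left out here;
            // comparing the distinct user agents per user reveals a second browser.
            return audit;
        }
    }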
Between these two techniques and a bit of business logic you should get what you need. If you would like me to clarify any of this, let me know and I'll provide more detail.
We are a third-party company providing payment widget services (for example). A parent website can embed our widget, built in ASP.NET MVC 5, in an iframe. Our widget URL is completely secure (HTTPS), but the parent website where it's embedded is not. That parent website is used by hundreds of people to make payments. The problem is that they see the whole website marked as not "secure"; they cannot see that the iframe where they are making the payment is secure. Is there any way I can solve this issue? How can I make the parent website detect that the iframe it's using is HTTPS and hence mark the whole page as secure? Or, if there is another way to handle it, please guide me.
Parent: HTTP
iframe: HTTPS
Users: scared of making payments
You can't, because it's in fact not secure. HTTPS only provides protection if it's HTTPS all the way. If any part drops to HTTP, the whole channel is insecure.
Tell your clients to implement SSL to protect their users. If they choose not to, there's not really anything you can do about it. It might be worth updating your terms of service to indemnify your organization against damages should a breach occur because a client did not use SSL with your widget. Basically, transfer the legal responsibility to the client.
Our operator has implemented a Round Robin load balancer on our web portal and it seems to be causing some problems I can't get to the bottom of.
I'm able to identify which server we're on, and as we navigate around the site we stay on server A. If I leave it for 5 minutes and try another page, I'll get pushed to server B, logged out, and shown the login page.
I've got them to make sure the MachineKey in machine.config is the same on both servers, and I've tested locally that the session isn't being used: I can turn the session off completely locally and it still works. I've verified that both servers create an ASPXAUTH cookie on the domain, so we should be classed as authenticated on both servers, but I keep losing my authentication every time I change server.
Any ideas on what could be causing the logging out? I'm guessing it's my misunderstanding about how ASPXAUTH works.
Sessions are handled separately from Forms Authentication. There is a good explanation of this here.
The most common reason for Forms Authentication failures on load-balanced environments is lack of synchronization of the MachineKey element. You've stated that you've got the server operators to ensure that the MachineKey is synchronized, but have you verified this yourself in some way? Is this the case on ALL the web servers? From previous dealings with a couple of commercial web hosts, I've found that it is (unfortunately) difficult to take their assurances at face value.
Another thing to check is if the FormsAuthentication configuration (timeout, path, name, etc.) is the same on all of the hosts.
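One way to check both of these yourself, rather than taking the host's word for it, is a small throwaway diagnostic (hypothetical class name; register it as a generic handler on each server) and compare the output side by side:

    using System.Web;
    using System.Web.Configuration;
    using System.Web.Security;

    // Hypothetical diagnostic: drop onto each load-balanced server, request it in a
    // browser, and diff the results. Remove it again once you're done.
    public class AuthConfigDump : IHttpHandler
    {
        public void ProcessRequest(HttpContext context)
        {
            var machineKey = (MachineKeySection)
                WebConfigurationManager.GetSection("system.web/machineKey");

            context.Response.ContentType = "text/plain";
            // Don't echo the actual keys; just confirm they aren't auto-generated.
            context.Response.Write("Explicit validationKey: "
                + !machineKey.ValidationKey.Contains("AutoGenerate") + "\n");
            context.Response.Write("Explicit decryptionKey: "
                + !machineKey.DecryptionKey.Contains("AutoGenerate") + "\n");
            context.Response.Write("Validation algorithm: " + machineKey.Validation + "\n");
            context.Response.Write("Forms cookie name: " + FormsAuthentication.FormsCookieName + "\n");
            context.Response.Write("Forms cookie path: " + FormsAuthentication.FormsCookiePath + "\n");
            context.Response.Write("Forms cookie domain: " + FormsAuthentication.CookieDomain + "\n");
            context.Response.Write("Sliding expiration: " + FormsAuthentication.SlidingExpiration + "\n");
        }

        public bool IsReusable { get { return true; } }
    }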
Are the patch levels the same on all of the hosts? You might want to see if the compatibility switch mentioned here is applicable in your situation.
Assuming that the hosting setup is correct, maybe you have initialization code on the page that logs you out if some condition is not fulfilled?
Try to take a look at the server logs and trace the sequence of HTTP requests involved during a failed page request. That might produce a clue.
Edit: This guide to troubleshooting Forms Authentication problems is detailed, and quite helpful: Troubleshooting Forms Authentication
Check for any other application functionality which depends on cookies. The web server on Server B will not recognize cookies that came from Server A. If any part of your authentication depends on cookies being populated, then that could cause your problem.
You have probably already ensured that the domain used for cookies is the same on all of the load balanced servers, but I thought I'd mention that. If the domains aren't compatible, then the browser will simply not send cookies to the server.
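If you do end up setting the cookie domain explicitly rather than relying on defaults, here is a sketch of issuing the forms auth cookie yourself (the domain value is a placeholder):

    using System.Web;
    using System.Web.Security;

    public static class AuthCookieHelper
    {
        // Issue the forms auth cookie manually so its Domain is explicit and identical
        // no matter which load-balanced server handled the login.
        public static void SignIn(string userName, HttpResponse response)
        {
            HttpCookie authCookie = FormsAuthentication.GetAuthCookie(userName, false /* not persistent */);
            authCookie.Domain = "portal.example.com"; // placeholder: your shared domain
            response.Cookies.Add(authCookie);
        }
    }

The same thing can be done declaratively with the domain attribute on the forms element in web.config, which keeps it in step with the rest of the Forms Authentication settings.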
I have a WinForms app which hosts a web browser control. Within this control you can also navigate to pre-determined external websites.
I need to implement Single Sign-On so that the user doesn't need to authenticate in each of the known external websites. I have already some ideas but it would be nice to hear all your opinions.
What would be the best way to do this?
In fact, is there something already for this? (edit: how do browsers remember logins/passwords)?
Cheers
Generally it is the responsibility of the site to implement SSO, and the client then responds to the site automatically. Since the web browser control uses IE under the hood, it inherits the same capabilities as the browser itself. For example, if the site uses Windows Authentication, the control will authenticate following a challenge from the site without user intervention. Similarly, the control will perform the necessary redirects if the site is using SAML 2.0.
Since these are external websites I have to assume that Windows Authentication is not going to work well because the server and the client are on different domains. Therefore something along the lines of SAML sounds like the most secure option.
It seems like implementing SAML is going to be a problem for you, so you will need to manually complete and submit the web forms that load inside the control. This is possible by accessing the DOM, but it quickly becomes a difficult-to-maintain solution.
The web browser control offers a Document property that gives you an HtmlDocument object, which lets you find elements and execute JavaScript in pages. You need to use these mechanisms to perform the authentication automatically. The steps might look like this (a sketch follows the list):
Capture the URL, or some cookie, that will let you know whether authentication is required by inspecting the web browser control's properties. You might want to look into the Navigating and Navigated events.
Access the document and complete the form values.
Call a JavaScript submit function to submit the form, or inject some JavaScript to do this. I find it easier to insert JavaScript into pages than to write more complicated C# code in a lot of cases. It is easier to prototype in a regular browser.
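A rough sketch of those three steps with the WinForms WebBrowser control; the login URL, the element ids (username, password) and the way the login page is detected are all placeholders for whatever the external site actually uses:

    using System;
    using System.Windows.Forms;

    public class EmbeddedBrowserForm : Form
    {
        private readonly WebBrowser browser = new WebBrowser { Dock = DockStyle.Fill };

        public EmbeddedBrowserForm()
        {
            Controls.Add(browser);
            browser.DocumentCompleted += OnDocumentCompleted;
            browser.Navigate("https://external.example.com/login"); // placeholder URL
        }

        private void OnDocumentCompleted(object sender, WebBrowserDocumentCompletedEventArgs e)
        {
            // Step 1: detect the login page (here by URL; a cookie check works too).
            if (browser.Document == null || !e.Url.AbsolutePath.Contains("login"))
                return;

            // Step 2: complete the form values (element ids are placeholders).
            HtmlElement user = browser.Document.GetElementById("username");
            HtmlElement pass = browser.Document.GetElementById("password");
            if (user == null || pass == null)
                return;
            user.SetAttribute("value", "serviceAccount");
            pass.SetAttribute("value", "secret"); // pull from secure storage in real code

            // Step 3: submit the form by invoking its DOM submit() method.
            if (browser.Document.Forms.Count > 0)
                browser.Document.Forms[0].InvokeMember("submit");
        }
    }

If the page wires extra behavior into its own JavaScript, Document.InvokeScript can call that named function instead, in line with the last step above.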
Unless the websites all share a common trusted authentication mechanism (like OpenID) you're stuck doing custom coding for each site.
Browsers remember passwords for single sites. I wouldn't call that "single sign-on", which is a method of using a trusted authority to authenticate across multiple disparate web sites which all rely on that authority to verify a user's identity.
As you asked for SSO packages that already do this, some examples are:
RSA ClearTrust
CA SiteMinder
My site has HTTPS sections (SSL), and others are regular HTTP (not using SSL).
Are there any issues going from ssl to non-ssl pages?
Sometimes the user will click on a link that is SSL, then click on another link that goes from HTTPS to HTTP-based URLs.
I understand that when on an SSL page, all images also have to be served over HTTPS.
What other issues do I have to handle?
I recall that a popup sometimes displays telling the user about a security issue, like some content not being secure; I am guessing that happens when you are on an HTTPS page and it is loading images that are not served over HTTPS.
Mixing is generally a bad idea just because it tends to detract from the user experience, and coding around the differences makes the application that much harder to maintain. If you need SSL for even a little of the site, I'd recommend putting it all behind SSL. Some companies use a hybrid: plain HTTP for the public "low end" site and SSL for the actual customer experience.
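If you do put everything behind SSL, here is a minimal sketch of enforcing it in ASP.NET (the module name is made up; an IIS URL-rewrite rule or an HSTS header does the same job):

    using System.Web;

    // Hypothetical module: redirect any plain-HTTP request to its HTTPS equivalent.
    public class ForceHttpsModule : IHttpModule
    {
        public void Init(HttpApplication app)
        {
            app.BeginRequest += (sender, e) =>
            {
                HttpContext ctx = ((HttpApplication)sender).Context;
                if (!ctx.Request.IsSecureConnection)
                {
                    string secureUrl = "https://" + ctx.Request.Url.Host + ctx.Request.RawUrl;
                    ctx.Response.Redirect(secureUrl, true); // true = end the response
                }
            };
        }

        public void Dispose() { }
    }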
As Miyagi mentioned, session state sometimes gets goofy, but it's not impossible if you keep the session stored in an external location. This means all session objects must be serializable, compact, etc., and it also means you'll need to manage the session ID in a common browser element (a cookie is usually the safest).
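For example, a minimal illustration of the serializable requirement for out-of-process session state (the class and the session key are made up):

    using System;

    // Anything placed in out-of-proc session state (StateServer, SQL Server, or a
    // distributed cache) must be serializable, or saving the session will fail at runtime.
    [Serializable]
    public class CheckoutState
    {
        public string CartId { get; set; }
        public decimal Total { get; set; }
    }

    // In a page or controller:
    //   Session["checkout"] = new CheckoutState { CartId = "abc123", Total = 19.99m };
    //   var state = (CheckoutState)Session["checkout"];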
There is a good article on CodeProject about this topic. The author encapsulates the switching in code and configuration. Not so long ago I tried to go this way, and stopped. There were some handling problems, but the main reason for stopping was the bad user experience Joel mentioned above.
If you are using sessions on your site, you can lose session information when switching between SSL and non-SSL pages (for example, if the session cookie is marked Secure it will not be sent over plain HTTP).