I use the *firefox and *iexplore launchers etc. within my Selenium tests to get around the issue of self-signed SSL certificates on my local machine. Unfortunately, now that I've moved from XP over to 7, this seems to have stopped working.
I'm running the selenium RC server process as administrator, since that was necessary to get an IE instance to launch properly.
I've tried adding permanent security exceptions for the certificate in question, and have confirmed that this works when I myself launch a browser session. But when a browser session is instantiated by the Selenium RC, I'm still receiving the security warnings.
I've also tried specifying the 32-bit version of IE, in case it was just the 64-bit version that wasn't working, but both exhibit the same behaviour.
I've also tried temporarily disabling UAC, in case I was falling foul of a permissions/elevation problem, but that also did not help.
Has anybody managed to get the heightened privilege browsers working properly on Windows 7? Alternatively, does anybody know a way that I can get around this issue? (short of not using https!)
Thanks!
Have you tried adding the certificate to your list of accepted certificates by hand?
As IE doesn't have profiles and each session uses the current user's profile, you can add the certificate by hand the first time Selenium hits the error; the next time the browser encounters the certificate, it will find it in the exception list and proceed to the page without warning.
For Firefox, the best way to get around this is to create a custom profile with all the certificates accepted, then specify that profile when you start your Selenium server. I use this same strategy for setting up browsers in different languages.
*chrome is normally the way to run Firefox with relaxed security.
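As a sketch of the Firefox profile approach on the command line (the profile path and jar filename here are placeholders for whatever your installation uses):

```shell
# Create a Firefox profile, browse to your https site once and permanently
# accept the self-signed certificate, then point Selenium RC at that profile
# as a template when starting the server:
java -jar selenium-server.jar -firefoxProfileTemplate /path/to/accepted-certs-profile
```

Each test session then gets a copy of that profile, certificate exceptions included.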
We're trying to use BrowserStack in local mode to hit our AUT on our internal network. I've got it configured so that I'm getting the remote browser session, but as soon as I try to direct it to our app's (or any) URL within the firewall via Chrome, the browser throws up a dialog complaining that the connection is not private and asking for a user name and password. This is not a normal pop-up and we can't automate it with Selenium, so it stops us dead.
When accessed via a desktop browser session the system knows who I am and opens for me with all appropriate permissions etc. It doesn't know who... or what... is coming through the remote session, thus the requirement to authenticate, I guess?
We see slightly different behavior with Edge: No pop-up, it just clocks until time-out. OTOH, if I just try to launch Edge in BrowserStack Live, I get a message saying "If you're behind a corporate firewall, disable SSL inspection for *.browserstack.com", and then I get thrown out.
I've been scouring everything I can find about BrowserStack Local to see if there's some way I can create the remote session with an 'identity' (I'm talking network identity, not B'stack user name/access key), and all I'm finding are things that pertain to proxy servers. As far as I know, there is not a proxy server at play here, so I'm really kinda up in the air.
Are there folks with experience with B'stack local mode who might be able to provide some insight?
If you don't have the Local binary downloaded and running on your machine, that could cause this behavior.
You can download it here:
https://www.browserstack.com/local-testing/automate#command-line
After you have it unzipped on your local machine, open a terminal and run:
./BrowserStackLocal --key yourkeyhere
Once it is running on your local machine, try to open up BrowserStackLive and run your manual tests.
Are you also having issues with automated tests not completing or just the manual testing of the website?
I am trying to get my Silverlight application running with elevated privileges in the browser. However, no matter what I do, it doesn't get elevated.
I have tried to add registry key AllowElevatedTrustAppsInBrowser (as DWORD with value 1), and signed the XAP file using VS 2012. I also came across a blog that mentioned the clientaccesspolicy.xml file, but I was not able to allow elevated privileges with this either. I put the xml file inside the web project hosting the html file that displays the XAP.
Has anyone actually managed to get this to run?
I also tried following this: http://mtaulty.com/CommunityServer/blogs/mike_taultys_blog/archive/2011/04/27/silverlight-5-beta-rough-notes-trusted-apps-in-the-browser.aspx but I'm unsure about where to run the commands he uses on Windows.
There is a good summary on how to enable in-browser elevated trust by Mister Goodcat here, where he also provides some troubleshooting tips:
One thing to keep in mind is that even if your application runs as a trusted in-browser app, it is still subject to the security restrictions the browser itself imposes, for example Internet Explorer's Protected Mode. That means it may be much more restricted than it would be running out of browser. In addition, the Silverlight runtime itself restricts certain features for in-browser trusted apps; for example, you cannot use the Window class or create additional windows while running in the browser.
If none of the above applies to you and you still run into problems, one thing to do is check whether your certificate(s) have been installed correctly. There's a snap-in for the management console for this. Here is an article that describes how to get there (note that you should add a snap-in for your user account, not the computer account as in this description).
You can also check whether your registry key is actually and successfully queried, for example by using a tool like Process Monitor from the Sysinternals Suite. Watch for operations of type "RegQueryValue" from your browser executable that access the key described above, and make sure the Result is "SUCCESS".
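For reference, the registry key mentioned in the question can be created from an elevated command prompt roughly like this (a sketch; the exact hive location depends on your OS and Silverlight bitness):

```shell
:: Enable elevated-trust in-browser apps for Silverlight 5.
:: On 64-bit Windows running 32-bit Silverlight, the key may instead live
:: under HKLM\Software\Wow6432Node\Microsoft\Silverlight.
reg add "HKLM\Software\Microsoft\Silverlight" /v AllowElevatedTrustAppsInBrowser /t REG_DWORD /d 1 /f
```

Remember that the XAP must also be signed with a certificate that is trusted on the machine for the elevation to take effect.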
Background: ClickOnce app that is bundled with the web server of an embedded device. Customers access the web interface, like you would access your router's configuration pages, and hit a button there to launch the ClickOnce app.
Problem: One customer claims he is getting signing error "Cannot continue. The application is improperly formatted. Contact the application publisher for assistance." And in the details, "Your Web browser settings do not allow you to run unsigned applications." This happens before ClickOnce even gets to the part where it downloads the app. Just click the "launch" button and the error is immediately thrown. Customer is running Windows XP and IE8.
The application and deployment manifests are both signed with our VeriSign Class 3 Code Signing certificate. I have checked and retested a dozen times with different IE settings, and every time it downloads the application, successfully verifies, and launches. We've had him reflash his device, and same thing. No one else has this problem, just him, so I can only conclude that there is nothing wrong with the manifests or their signatures.
His IT department is freakish about security so I strongly suspect that he has some IE settings interfering with ClickOnce, either trying to enforce stricter signing requirements, or blocking it altogether. What could be causing this? Are there some group policies somewhere that are maybe shutting him down?
Try clearing the browser cache.
Go to Internet Options -> Advanced, then click Reset...
What worked for me was checking Internet Options -> Advanced -> "Allow software to run or install even if the signature is invalid".
I am having an issue with .NET detecting the proxy settings configured through Internet Explorer.
I'm writing a client application that supports proxies, and to test I set up an array of 9 squid servers to support various authentication methods for HTTP and HTTPs. I have a script that updates IE to whichever configuration I choose (which proxy, detection via "Auto", PAC, or hardcode).
I have tried the 3 methods below to detect the IE configuration through .NET. On occasion I notice that .NET picks up the wrong set of proxy servers. IE has the correct settings, and if I browse the web with IE, I can see via Wireshark that I am hitting the correct servers.
WebRequest.GetSystemWebProxy().GetProxy(destination);
GlobalProxySelection.Select.GetProxy(destination);
WebRequest.DefaultWebProxy
Here are the observations I have:
My script sets a PAC file on a webserver, and updates the configuration in IE, then clears IE's cache
.NET seems to get "stuck" on a certain proxy configuration, and I have to set another configuration for .NET to realize there was a change. Occasionally it seems to pick some random set of servers (I'm sure they're not random, just a set of servers I used once that are sitting in some cached PAC file or similar). As in, I will check the proxy for the destination "https://www.secure.com", expecting "http://squidserver:18" based on how IE is configured, and instead it will return "http://squidserver:28" (port 18 runs NTLM, 28 runs without authentication). All the squid servers work.
This does not appear to be an issue on XP, only Vista, 2003, and Windows 7.
Hardcoding the proxy servers in IE ALWAYS works
Time always solves the issue - if I leave the computer for about 20 or 30 minutes and come back, .NET picks up the correct proxy settings, as if a cached PAC script expired.
I found the solution.
.NET uses the "WinHttp Web Proxy Auto Discovery Service" to perform PAC script execution, and probably caches the results. Simply stopping and restarting this service does the trick. The following command line does this for me.
NET STOP WinHttpAutoProxySvc
NET START WinHttpAutoProxySvc
http://wiki.blackviper.com/wiki/WinHTTP_Web_Proxy_Auto-Discovery_Service
I found this by following James Kovacs' suggestion of attaching the debugger. I had already reflected through the code and made a failed attempt to attach a debugger before I ever posted the question, but could not decipher exactly what was happening. Running out of options, I tried debugging again, and after several hours found the following comment in _AutoPWebProxyScriptEngine.cs on line 76 that led me to this discovery
// In Win2003 winhttp added a Windows Service handling the auto-proxy discovery. In XP using winhttp
// APIs will load, compile and execute the wpad file in-process. This will also load COM, since
// WinHttp requires COM to compile the file. For these reasons, we don't use WinHttp on XP, but
// only on newer OS versions where the "WinHTTP Web Proxy Auto-Discovery Service" exists.
I had the same issue, and I succeeded by getting/setting the proxy settings in the registry:
[HKEY_CURRENT_USER\Software\Microsoft\Windows\CurrentVersion\Internet Settings]
"ProxyServer"="<your proxy IP address>:8080"
"ProxyEnable"=dword:00000001
"ProxyOverride"="<local>"
I have code that uses process class to sign code with a verisign signature.
So basically it runs the command line via cmd. This all used to work, but recently (maybe due to malware) it fails at the timestamping stage. I'm pretty sure this is because it is no longer able to access the external VeriSign URL in order to timestamp.
I have tried the same command manually from the command line and the result is the same. I have a proxy configured in IE, which is necessary for external web access, but I assume that when I'm using cmd, for example, the settings aren't derived from IE?
To me it sounds like your IE proxy settings got messed up. Malware can do that. Go into IE, then Internet Options (it's different depending on version of IE and Windows). Once in there, hunt down proxy settings and turn them off. I'd guess they are currently enabled.
Also, timestamping will fail if the certificate expired. Did it expire?
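One more thing worth checking, though this is an assumption on my part about which HTTP stack your signing tool uses: command-line tools that go through WinHTTP (rather than WinInet, which IE uses) do not automatically pick up the IE proxy settings. On Vista and later you can import the IE configuration into WinHTTP:

```shell
:: Copy the current IE (WinInet) proxy settings into WinHTTP so that
:: WinHTTP-based command-line tools use the same proxy.
netsh winhttp import proxy source=ie

:: To inspect or undo the change:
netsh winhttp show proxy
netsh winhttp reset proxy
```

If the timestamping starts working after the import, that would confirm the proxy was the problem rather than the certificate.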