My PC is running two C# applications.
I want one of the C# applications to go through a Fiddler proxy running on my local PC,
and I want the other C# app to go through a Fiddler proxy running on a remote PC.
I have installed the two Fiddler root certificates generated by these two different proxies.
But I found that only the Fiddler instance running on the local PC can decrypt the HTTPS traffic.
Is it because my PC treats the two certificates as one single certificate?
Can I make one application trust one set of certificates and the other application trust another set?
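To make the setup concrete, each application can be pointed at its own proxy and, in principle, pin its own Fiddler root. The following is only a sketch: it assumes .NET Framework 4.7.2+ or .NET Core, where HttpClientHandler exposes Proxy and ServerCertificateCustomValidationCallback, and the addresses and thumbprint are placeholders.
using System;
using System.Net;
using System.Net.Http;

static HttpClient CreateClient(string proxyAddress, string expectedRootThumbprint)
{
    var handler = new HttpClientHandler
    {
        // e.g. "http://127.0.0.1:8888" for the local Fiddler, "http://remote-pc:8888" for the remote one
        Proxy = new WebProxy(proxyAddress),
        UseProxy = true,
        ServerCertificateCustomValidationCallback = (request, cert, chain, errors) =>
        {
            // Accept only chains that terminate in the expected Fiddler root certificate.
            var root = chain.ChainElements[chain.ChainElements.Count - 1].Certificate;
            return root.Thumbprint.Equals(expectedRootThumbprint, StringComparison.OrdinalIgnoreCase);
        }
    };
    return new HttpClient(handler);
}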
I have two applications: one is a Web API REST service written in .NET 4.5, and the other is its client application written in .NET Core 2. Windows authentication is enabled on the service side. The client makes a simple GET call using HttpClient with an HttpClientHandler loaded with credentials from configuration (a service account). Very basic stuff that can easily be googled in tons of examples; there's nothing fancy there at all.
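Roughly, the client call looks like this (a sketch; the account, domain and URL are placeholders for the values read from configuration):
using System.Net;
using System.Net.Http;
using System.Threading.Tasks;

static async Task<string> CallServiceAsync()
{
    var handler = new HttpClientHandler
    {
        // Explicit service-account credentials, not the interactive user's token.
        Credentials = new NetworkCredential("svc-account", "password", "MYDOMAIN")
    };
    using (var client = new HttpClient(handler))
    {
        var response = await client.GetAsync("http://devserver/api/values");
        response.EnsureSuccessStatusCode();
        return await response.Content.ReadAsStringAsync();
    }
}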
When this call is performed from my local dev laptop to the DEV server, everything works as expected: the user provided in the client's configuration gets authenticated successfully on the service and is able to perform whatever operations he is allowed to perform.
It also works fine if I perform this call from the browser (Chrome): it asks me for credentials and then returns a result that perfectly correlates with whatever credentials I put there. It works this way with both remote and local services.
However, if I run the service locally on my laptop (local IIS server, not an Express one) and the call is performed using a localhost address, then I observe strange behavior: Windows authentication seems to work as well, but the user the service authenticates is not the one from the client's configuration. Instead, it is my AD account, the one I logged on with interactively in Windows (btw, it's Win 10 Enterprise x64).
Of course, I've verified every single config setting and runtime value in my applications five times, all settings on the local and dev server's IIS, at the web application and IIS server level, and the app pools used for the apps. Everything I see looks good, except that it behaves differently on localhost. Also, I've googled every question I could imagine but still can't seem to find an explanation for this behavior.
So, why does HttpClient use my account on localhost instead of the explicitly provided credentials? What am I missing here?
I have an application that can be downloaded from our website and runs on the user's PC.
This application provides a connection between our hardware and our web application. It uses SignalR for communication.
Basically, I run a SignalR server inside a WinForms application and have a JavaScript client that tries to access it through http://localhost:8084/signalR.
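The self-host side is essentially the standard OWIN pattern, something like this (a sketch; it assumes SignalR 2.x with the Microsoft.AspNet.SignalR.SelfHost and Microsoft.Owin.Hosting packages, and HTTPS additionally requires a certificate to be bound to the port with netsh beforehand):
using System;
using Microsoft.Owin.Hosting;
using Owin;

class SignalRHost
{
    public static IDisposable Start()
    {
        // For HTTPS, a certificate must already be bound to the port, e.g.
        //   netsh http add sslcert ipport=0.0.0.0:8084 certhash=<thumbprint> appid={<some-guid>}
        const string url = "https://localhost:8084";

        // Hubs become reachable at https://localhost:8084/signalR
        return WebApp.Start(url, app => app.MapSignalR());
    }
}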
Everything works fine when I use the HTTP version of the web application, but it fails when I use HTTPS for the web application:
Most browsers don't allow unsecured connections from a secure page.
So, my question is: how can I ship a self-signed certificate with my software so that it is installed on the user's PC during installation, and how can I make this work in a way that browsers don't complain about an unsecured connection?
If you run the SignalR server on the user's PC, each user must obtain a certificate for the SignalR connection it exposes. A self-signed certificate would be a reasonable decision for an intranet or for development; it isn't safe enough for the internet. Another possible problem is retrieving the name of the machine where SignalR is running. A certificate is tied to a particular machine name (I could be wrong about this), and to connect to the SignalR server in the client's application you need to know the name of the machine they use. Migrating the SignalR server from the client to the web app server would solve the troubles mentioned above.
I have a C# Web API that uses a client certificate in order to connect to a third-party identification provider (BankID, https://www.bankid.com/en/).
When running in Visual Studio, the connection handshake works fine. But when I deploy it to IIS running on a virtual machine on Azure, for some reason it doesn't work as expected, and instead I get an exception:
"The request was aborted: Could not create SSL/TLS secure channel."
I've installed Visual Studio on the VM and verified that it works there, and I've installed IIS on my local machine and verified that it works there too. So it seems to be the combination of IIS and Azure VMs.
At first I thought the client certificate wasn't being picked up from the certificate store. But I added handling of "no certificate found", and this doesn't trigger.
I've also tried the alternative of picking up the certificate from a file. But this doesn't work either.
Would appreciate some help!
I managed to figure out what the problem was. The IIS user (IIS_IUSRS) didn't have permission to use the private key. So it wasn't actually related to Azure.
For anyone running into the same issue, look at the detailed answer here: https://stackoverflow.com/a/2647003/8721876.
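In code, the certificate is attached roughly like this (a sketch; it assumes HttpClientHandler exposes ClientCertificates, i.e. .NET Framework 4.7.1+ or .NET Core, and the thumbprint is a placeholder). The crucial part is that the app pool identity has read access to the certificate's private key:
using System;
using System.Net.Http;
using System.Security.Cryptography.X509Certificates;

static HttpClient CreateBankIdClient(string thumbprint)
{
    using (var store = new X509Store(StoreName.My, StoreLocation.LocalMachine))
    {
        store.Open(OpenFlags.ReadOnly);
        var matches = store.Certificates.Find(X509FindType.FindByThumbprint, thumbprint, false);
        if (matches.Count == 0)
            throw new InvalidOperationException("Client certificate not found in LocalMachine\\My.");

        // Without read access to this certificate's private key for the IIS app pool identity,
        // the handshake fails with "Could not create SSL/TLS secure channel".
        var handler = new HttpClientHandler();
        handler.ClientCertificates.Add(matches[0]);
        return new HttpClient(handler);
    }
}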
Try putting this in your constructor:
// Don't wait for a 100-Continue response before sending the request body.
ServicePointManager.Expect100Continue = true;
// Pin the TLS version used for outgoing connections (many endpoints now require Tls12 instead).
ServicePointManager.SecurityProtocol = SecurityProtocolType.Tls11;
I am investigating how to create a PKCS#10 certificate signing request (CSR), send it to a Certificate Authority (CA), and create an X.509 client certificate based on the response from the CA. I want to do this in a Windows 10 application (so, an app built on UWP). Can I do this without an extra server to run the "CertCli 1.0 Type Library"?
The only info I've found is this: MSDN Article
It's for Windows 8.1, and it says that I need to have a server "between" the application and the CA in order to send the CSR to the CA and create the actual certificate, which is then sent back to the application. Deploying and running an extra server simply to host an API seems wasteful, so I'm trying to find another way.
Is this still necessary in Windows 10?
Could I package the "CertCli 1.0 Type Library" as a DLL or something along with my application?
Are there open source alternatives (that will run in a UWP app) that can be used to handle the certificate and CSR instead?
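The pattern I'm currently looking at uses the WinRT enrollment API to build the PKCS#10 request on the device and install the CA's response, something like this (a sketch; the subject, the CA transport and the SubmitToCaAsync helper are hypothetical placeholders):
using System.Threading.Tasks;
using Windows.Security.Cryptography.Certificates;

public static async Task EnrollAsync()
{
    var request = new CertificateRequestProperties
    {
        Subject = "CN=MyUwpClient",               // placeholder subject
        KeyAlgorithmName = KeyAlgorithmNames.Rsa,
        KeySize = 2048
    };

    // Base64-encoded PKCS#10 CSR, created locally without any extra server.
    string csr = await CertificateEnrollmentManager.CreateRequestAsync(request);

    // Send the CSR to the CA over whatever protocol it exposes (hypothetical helper).
    string issuedCertBase64 = await SubmitToCaAsync(csr);

    // Install the CA's response into the app's certificate store.
    await CertificateEnrollmentManager.InstallCertificateAsync(issuedCertBase64, InstallOptions.None);
}

// Placeholder for however the CSR actually gets to the CA.
private static Task<string> SubmitToCaAsync(string pkcs10Base64) =>
    throw new System.NotImplementedException();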
I'm developing an HTTPS proxy server using Titanium proxy server. I will monitor some websites and modify their responses. Some of those websites use HTTPS, which is why I'm using an HTTPS proxy server.
Now here is the problem. As far as I know, to intercept an HTTPS site we must have a certificate, and Titanium proxy server has a dummy certificate. I think that whenever the server is started, Titanium proxy server tries to install that certificate as a root certificate so that browsers will trust it. This approach works perfectly for all browsers except Firefox.
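For reference, my setup follows the library's usual pattern, roughly like this (a sketch based on the documented usage; exact member names can vary between Titanium.Web.Proxy versions, and the body rewrite is a placeholder):
using System.Net;
using Titanium.Web.Proxy;
using Titanium.Web.Proxy.Models;

var proxyServer = new ProxyServer();

// Generate the dummy root certificate and install it as a trusted root,
// so that the per-host certificates the proxy re-signs are accepted.
proxyServer.CertificateManager.CreateRootCertificate();
proxyServer.CertificateManager.TrustRootCertificate(true);

proxyServer.BeforeResponse += async (sender, e) =>
{
    // In practice I filter by host and content type first; this is just the placeholder edit.
    string body = await e.GetResponseBodyAsString();
    e.SetResponseBodyString(body.Replace("foo", "bar"));
};

var endPoint = new ExplicitProxyEndPoint(IPAddress.Any, 8000, true);   // true = decrypt SSL
proxyServer.AddEndPoint(endPoint);
proxyServer.Start();
proxyServer.SetAsSystemHttpsProxy(endPoint);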
I can see the Titanium root certificate installed in all browsers, and I can successfully intercept HTTPS websites in Google Chrome or Safari. But whenever I try to open an HTTPS website in Firefox, it gives an error that the connection is untrusted.
I think the certificate is either not being installed in Firefox or is being rejected by Firefox because it is not signed by a trusted CA. But why are the other browsers not showing the same error?
My problem is that I need to intercept HTTPS websites too; it is a requirement of the project. What if I buy a certificate from a CA and use that certificate with the application, will it work or not? Can I use web server certificates with a desktop application? I really don't know much about SSL. Any help will be much appreciated. Looking forward to your replies.
PS: I'm attaching a screenshot of the Firefox error window for reference.
Firefox uses its own CA store, while Chrome, Safari and IE use the system's CA store on Windows and Mac. Thus you explicitly need to import the certificate into Firefox as trusted.
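If importing into every profile's store is awkward to automate, another option is to set the security.enterprise_roots.enabled preference so Firefox also trusts roots from the Windows certificate store. The snippet below is only a sketch: it assumes Firefox 49 or newer on Windows, and the profile handling is simplified to appending a line to each profile's user.js.
using System;
using System.IO;

// Append the pref to user.js in each Firefox profile of the current user.
string profilesRoot = Path.Combine(
    Environment.GetFolderPath(Environment.SpecialFolder.ApplicationData),
    @"Mozilla\Firefox\Profiles");

if (Directory.Exists(profilesRoot))
{
    foreach (string profileDir in Directory.GetDirectories(profilesRoot))
    {
        File.AppendAllText(
            Path.Combine(profileDir, "user.js"),
            "user_pref(\"security.enterprise_roots.enabled\", true);" + Environment.NewLine);
    }
}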