I am working on a university project in which I need to get some product information out of the outpan.com database into a string or an array of strings.
I am new to coding, so I still need quite a lot of help. Does anyone know how to send a request and get the answer from a C# environment (Windows Forms application)?
The description on Outpan itself (https://www.outpan.com/developers.php) says to send the call over HTTPS using curl, but what does that mean in practice? Do I need to install extra libraries?
I would be glad if someone could help me with this problem or point me to a tutorial on how to make these calls to the database from a C# environment.
If more information about my setup is needed, let me know.
The Outpan API uses Basic HTTP auth, so every request will need a header like:
Authorization: Basic QWxhZGRpbjpvcGVuIHNlc2FtZQ==
To do that with C#, you could do the following:
var request = (HttpWebRequest)WebRequest.Create("https://api.outpan.com/v1/products/0796435419035");
// The API key is the username half of the Basic auth pair; the password is left empty.
// Encoding.ASCII is safer here than Encoding.Default, which is machine-dependent.
var encodedString = Convert.ToBase64String(Encoding.ASCII.GetBytes("-your-api-key-here-:"));
request.Headers["Authorization"] = "Basic " + encodedString;
var response = request.GetResponse();
For a full description of the header, see the Wikipedia page http://en.wikipedia.org/wiki/Basic_access_authentication. Note that the base64-encoded string is in the form [username]:[password], but the Outpan API docs (https://www.outpan.com/developers.php) state that they do not use the password part.
Also see: Forcing Basic Authentication in WebRequest for a nice method wrapper for this logic.
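As a minimal sketch of such a wrapper (the class, method, and parameter names below are my own, not from any library):

```csharp
using System;
using System.Net;
using System.Text;

static class BasicAuth
{
    // Builds the value of the Authorization header for Basic auth.
    // For Outpan, pass the API key as the user name and "" as the password.
    public static string BuildHeader(string userName, string password)
    {
        string pair = userName + ":" + password;
        return "Basic " + Convert.ToBase64String(Encoding.ASCII.GetBytes(pair));
    }

    // Creates a request with the Authorization header already attached.
    public static HttpWebRequest CreateAuthorizedRequest(string url, string apiKey)
    {
        var request = (HttpWebRequest)WebRequest.Create(url);
        request.Headers["Authorization"] = BuildHeader(apiKey, "");
        return request;
    }
}
```

This keeps the encoding logic in one place instead of repeating it for every request.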
I have an app that uses a web service and acquires data via JSON. Everything worked fine for quite a long time, until the discovery that SSLv3 is vulnerable to man-in-the-middle attacks and server owners started turning SSLv3 off for good. My application started having connection problems and returned the error "Request was aborted: cannot establish secure SSL/TLS connection". I looked for a solution and found that I should add this code before creating the web request:
ServicePointManager.SecurityProtocol = SecurityProtocolType.Tls12;
ServicePointManager.ServerCertificateValidationCallback = delegate {
    return true;
};
Unfortunately, no luck: the app behaves the same as before, and I have no clue whether this code does nothing or there is still some problem with the server. The error message is pretty vague and I have trouble figuring out where things go wrong.
Here is my code:
...
HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url);
request.ContentType = GetRequestContentType();
request.Method = method.ToString();
request.Credentials = GetCredential(url);
request.PreAuthenticate = true;
CookieContainer cookieContainer = new CookieContainer();
request.CookieContainer = cookieContainer;
...
I want to ask how to set TLS 1.2 as the default and ensure that the requests I make from my end use the desired protocol.
If I confirm that my app works fine on my end, is there a way to get more detailed information from the server response and pinpoint the precise reason for the error?
Thanks for all answers and suggestions.
EDIT
The second part of the question is solved. I found this tool, http://www.telerik.com/download/fiddler, which pretty much lets you see what is going on with outgoing and incoming data. It can also decrypt SSL connections, and enabling that option makes my application start working. I assume Fiddler does something that makes communication between my app and the destination host possible, but I still have no idea what it could be, or how to make my app handle these connections properly by itself.
Desperation made me inspect the whole source code (the part responsible for getting data off the internet was third-party, and as long as it worked there was no reason to change it), and I discovered that the line
request.Credentials = GetCredential(url);
called a method whose body contained
ServicePointManager.SecurityProtocol = SecurityProtocolType.Ssl3;
So all my attempts to change that value before creating the HttpWebRequest were being overwritten. Changing the SecurityProtocolType to Tls12 makes it all work now.
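Condensed into a sketch (the class and method names are my own): setting the protocol once at startup, and re-asserting it right before the request is created, guards against a third-party helper silently switching it back to Ssl3.

```csharp
using System;
using System.Net;

static class SecureHttp
{
    // Pin the protocol to TLS 1.2.
    // SecurityProtocolType.Tls12 requires .NET Framework 4.5 or later.
    public static void EnforceTls12()
    {
        ServicePointManager.SecurityProtocol = SecurityProtocolType.Tls12;
    }

    public static HttpWebRequest CreateRequest(string url)
    {
        // Re-assert just before creating the request, in case some other
        // code path (like the GetCredential helper above) changed it meanwhile.
        EnforceTls12();
        return (HttpWebRequest)WebRequest.Create(url);
    }
}
```

Note that ServicePointManager.SecurityProtocol is process-global, so the last assignment before the handshake wins; that is exactly why the hidden Ssl3 assignment kept breaking the earlier fix.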
I'm using an HttpListener to create a very basic web server. I've set it up to use SSL, using the httpcfg tool to bind the appropriate port and certificate, and this seems to be working fine. I'd now like to use client certificate authentication, so I added a call to GetClientCertificate on the HttpListenerRequest object, but it always comes back null. My test client is very simple:
HttpWebRequest webReq = (HttpWebRequest) WebRequest.Create("https://127.0.0.1:8080/ssltest/");
webReq.ClientCertificates.Add(new X509Certificate2("ssltest.pfx", "ssltest"));
webReq.GetResponse();
I noticed that the httpcfg tool has a flag that indicates whether client certificates should be negotiated, so I tried specifying that flag (-f 2), but I'm still not getting the client cert. I also came across this Microsoft support issue, which seems pretty relevant, but I'm using the latest .NET 2.0 service pack and I've also tried the httpcfg flag, either of which should avoid the issue.
I am assuming I am missing something obvious here. Any ideas?
Edit: I just found this question, which seems very relevant (maybe even a duplicate?). Unfortunately, there is no accepted answer for that question either, and the suggested answer recommends something I already tried (the httpcfg tool with the appropriate flag).
According to http://support.microsoft.com/kb/895971/en-us, HttpWebRequest.ClientCertificates.Add already performs validation on the client side, so if the certificate fails that validation it never gets sent.
The above link contains code to relax the validation... NEVER use that in production!
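A safer way to diagnose this is to check locally whether the certificate chains to a trusted root before attaching it, since a cert that fails chain validation is the one that gets silently dropped. This is a hedged sketch (the helper name is mine, and it assumes chain validation is the reason the cert is withheld, per the KB article above):

```csharp
using System;
using System.Security.Cryptography.X509Certificates;

static class ClientCertCheck
{
    // Returns true if the certificate builds a valid chain to a trusted
    // root on this machine. If this returns false, the framework is likely
    // to drop the certificate instead of presenting it to the server.
    public static bool ChainsToTrustedRoot(X509Certificate2 cert)
    {
        using (var chain = new X509Chain())
        {
            // Skip revocation checks for a quick local diagnostic.
            chain.ChainPolicy.RevocationMode = X509RevocationMode.NoCheck;
            return chain.Build(cert);
        }
    }
}
```

If this returns false for your ssltest.pfx certificate, the fix is to install its issuing CA into the client machine's trusted root store rather than to relax validation.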
I have a website that requires digest authentication, and when I browse it with IE8 it gives me a 401 even though the password is correct. Firefox and Chrome work correctly. I checked the authorization headers with Fiddler, and everything seems fine. Can you give me any hints on the problem?
P.S. I also have the same problem implementing digest authentication in C#; I don't know whether the two are related.
I was facing this problem, and this was the only mention of it on the net. In Digest access authentication, the sequence of events is:
GET on /url
401 with a WWW-Authenticate header
This pops up the login dialog on your browser. After you enter your credentials.
GET on /url along with the Authorization header.
200 OK (If everything goes well).
This works fine in Firefox and Chrome but was not working fully in IE8.
By "fully" I mean that a GET on a virtual location on the server worked, but a GET on a static file did not: for a static file I was prompted for a login again and again.
After using a sniffer I found out that in the case of requesting a virtual location the sequence of events happened as mentioned above, but when I requested a static file the sequence was as follows:
GET on /url
401 with a WWW-Authenticate header
This pops up the login dialog on your browser. After you enter your credentials.
GET on /url (WITHOUT THE Authorization header)
401 Un-Authorized.
Basically when it was a static file, it took the username and password but never sent it across in the Authorization header. Server not getting this header responded with 401 which again prompted the login.
To make IE8 work properly you have to fool it into thinking that this is not a static file but a virtual location. For me that was easy, as I had access to the server's source code. I really don't know how to do it if you don't.
If you request a virtual location:
1. GET /virtual_location
2. 401 with WWW-Authenticate header which will look something like
WWW-Authenticate: Digest realm="validusers#robapi.abb", domain="127.0.0.1:80", qop="auth", nonce="9001cd8a528157344c6373810637d030", opaque="", algorithm="MD5", stale="FALSE"
Notice that the opaque parameter is an empty string.
On the other hand, if you request a static file:
1. GET /staticfile.txt
2. 401 with WWW-Authenticate header which will look something like
WWW-Authenticate: Digest realm="validusers#robapi.abb", domain="127.0.0.1:80", qop="auth", nonce="81bd1ca10ed6314570b7362484f0fd31", opaque="0-1c5-4f7f4c1e", algorithm="MD5", stale="FALSE"
Here the opaque parameter is a non-empty string.
Hence, if you can ensure that the opaque parameter is always an empty string, IE8 will treat the resource as a virtual location and the request will go through normally. Since I had access to the server's code, I was able to do this.
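To illustrate, a challenge with an empty opaque can be assembled like this (a hedged sketch; the helper name is mine, and where you would emit this header depends entirely on your server's code):

```csharp
using System;

static class DigestChallenge
{
    // Builds a WWW-Authenticate value with opaque="" so that IE8
    // treats the resource like the virtual-location case above.
    public static string Build(string realm, string nonce)
    {
        return string.Format(
            "Digest realm=\"{0}\", qop=\"auth\", nonce=\"{1}\", " +
            "opaque=\"\", algorithm=\"MD5\", stale=\"FALSE\"",
            realm, nonce);
    }
}
```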
Hope this is of any help.
Regards,
Satya Sidhu
I had the same problem. In my case, I was requiring digest authentication for my entire site, using directives in either "<Directory />" or "<Location />". Either way works for Firefox and Safari on Mac, PC, and iOS. Unfortunately, IE8 seems to have trouble with this. After trying several other changes, I finally found that if I only require authentication on a subdirectory (e.g. "<Location /private>"), and move my content into the protected directory, IE8 started working. I went back and forth a few times, changing only this attribute, to confirm that this is the critical difference.
Incidentally, it's worth noting that a tcpdump showed that IE8 wasn't even trying to send a digest authentication. It presented the auth dialog box, took my username and password, then sent a normal GET request with no authentication info.
Are (were) you protecting the entire content tree?
I'm not sure why IE8 (and only IE8) cares about this distinction, but this is what I found.
In searching for a solution to the problem, yours was the only mention that seemed relevant, and I could find no answer posted on the net. This leads me to believe that either no one tries to configure Digest authentication this way, or most people just give up and use Firefox (or some other non-MS browser).
Wow, I'm definitely having the same problem. I have two virtual hosts, both using digest authentication. On one site I am trying to protect the entire site (i.e. "<Location />") and it works in all browsers I have tried except IE8. On the other site, I'm only protecting a subdirectory, and that works fine in IE8.
I had the same problem and tried to use digest authentication for the whole vhost, but the following configuration did not work in IE.
<Location />
AuthType Digest
AuthName "Login"
AuthDigestDomain /
AuthUserFile /path/to/.htdigest
Require valid-user
</Location>
The workaround in http://lists.centos.org/pipermail/centos/2013-January/131225.html worked well:
ErrorDocument 401 "some random text"
A better solution is to exclude the Apache error pages, which are normally located at /error/.
e.g.
Alias /error/ "/usr/share/apache2/error/"
The following configuration worked well for me (see also https://bz.apache.org/bugzilla/show_bug.cgi?id=10932#c5):
<LocationMatch "^/(?!error/)">
AuthType Digest
AuthName "Login"
AuthDigestDomain /
AuthUserFile /path/to/.htdigest
Require valid-user
</LocationMatch>
I need to log in to a website and perform an action. The website is REST-based, so I can easily log in by doing this (the login info is included as a query string on the URL, so I don't need to set the credentials):
CookieContainer cookieJar = new CookieContainer();
HttpWebRequest firstRequest = (HttpWebRequest) WebRequest.Create(loginUrl);
firstRequest.CookieContainer = cookieJar;
firstRequest.KeepAlive = true;
firstRequest.Method = "POST";
HttpWebResponse firstResponse = (HttpWebResponse)firstRequest.GetResponse();
That works and logs me in. I get a cookie back to maintain the session, and it's stored in the cookieJar shown above. Then I make a second request like this:
HttpWebRequest secondRequest = (HttpWebRequest) WebRequest.Create(actionUrl);
secondRequest.Method = "POST";
secondRequest.KeepAlive = true;
secondRequest.CookieContainer = cookieJar;
WebResponse secondResponse = secondRequest.GetResponse();
And I make sure I assign the cookies to the new request. But for some reason this doesn't appear to work: I get back an error telling me my "session has timed out or expired", even though the requests are made one right after the other, so it's not a timing issue.
I've used Fiddler to examine the HTTP headers, but I'm finding that difficult since this is HTTPS. (I know I can decrypt it, but that doesn't seem to work well.)
I can take the URLs for this REST service and paste them into Firefox and it all works fine, so it must be something I'm doing wrong and not the other end of the connection.
I'm not very familiar with HTTPS. Do I need to do something else to maintain my session? I thought the cookie would be it, but perhaps there is something else I need to carry across the two requests?
Here are the headers returned when I send in the first request (except I changed the cookie to protect the innocent!):
X-DB-Content-length: 19
Keep-Alive: timeout=15, max=50
Connection: Keep-Alive
Transfer-Encoding: chunked
Content-Type: text/html; charset=WINDOWS-1252
Date: Mon, 16 Nov 2009 15:26:34 GMT
Set-Cookie: MyCookie stuff goes here
Server: Oracle-Application-Server-10g
Any help would be appreciated, I'm running out of ideas.
I finally got it working after decrypting the HTTP traffic from my program.
The cookie I'm getting back doesn't set the Path attribute, so .NET takes the current path, including the current page, and assigns that as the cookie's path. I.e., if the request was to http://mysite/somepath/somepage.htm, it sets the cookie path to /somepath/somepage.htm. This is a bug, as it should be assigned "/", which is what all web browsers do. (I hope they fix this.)
After noticing this, I grabbed the cookie, modified the Path property, and everything works fine now.
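A sketch of that fix (the helper name and the example domain are my own; this assumes the cookies you care about are the ones the login response set for the login URL):

```csharp
using System;
using System.Net;

static class CookieFix
{
    // Copies every cookie that applies to `uri` into a new container,
    // re-rooted at Path "/" so that later requests to other paths on the
    // same host will send it too.
    public static CookieContainer WithRootPaths(CookieContainer jar, Uri uri)
    {
        var fixedJar = new CookieContainer();
        foreach (Cookie c in jar.GetCookies(uri))
        {
            fixedJar.Add(new Cookie(c.Name, c.Value, "/", c.Domain));
        }
        return fixedJar;
    }
}
```

After the login response, run the container through this helper and assign the result to the second request's CookieContainer.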
Anyone else with a problem like this, check out Fiddler. .NET uses the Windows certificate store, so to decrypt HTTPS traffic from your program you will need to follow the instructions here: http://www.fiddler2.com/Fiddler/help/httpsdecryption.asp. You will also need to turn on decryption under the Options\HTTPS tab of Fiddler.
From MSDN:
When a user moves back and forth between secure and public areas, the ASP.NET-generated session cookie (or URL if you have enabled cookie-less session state) moves with them in plaintext, but the authentication cookie is never passed over unencrypted HTTP connections as long as the Secure cookie property is set.
So basically, the cookie can be passed over both HTTP and HTTPS if the 'Secure' property is set to 'false'.
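For illustration, here is how the Secure flag looks on a System.Net.Cookie (a minimal sketch; the cookie name matches ASP.NET's default session cookie, but the domain is a placeholder of mine, and in ASP.NET server code the equivalent switch is the HttpCookie.Secure property):

```csharp
using System;
using System.Net;

static class SecureFlagDemo
{
    // Builds a session cookie; `secureOnly` controls whether the cookie
    // may travel over plain HTTP (false) or over HTTPS only (true).
    public static Cookie MakeSessionCookie(string value, bool secureOnly)
    {
        var cookie = new Cookie("ASP.NET_SessionId", value, "/", "mysite.example");
        cookie.Secure = secureOnly;
        return cookie;
    }
}
```

With secureOnly left false, the same session cookie is presented on both the HTTP and HTTPS sides of the site, which is what makes the shared session described above possible.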
See also: how can I share an asp.net session between http and https