HttpClient / WebClient - Trouble Interacting With Websites via Proxy - c#

I'd like to use an HttpClient to interact with a website. This is in a corporate-type environment where all web access goes through a web proxy. By default, both the HttpClient and WebClient seem to 'just work' with the proxy here - but I've also specified the proxy details in code.
My problem is that some URLs will load correctly, others will not. The following code shows what I mean.
var webc = new WebClient();
var x1 = webc.DownloadString("http://www.google.com"); // Works
var x2 = webc.DownloadString("http://www.google.ie"); // Works
var x3 = webc.DownloadString("http://maps.google.com"); // Works
var x4 = webc.DownloadString("http://maps.google.ie"); // 403 Forbidden exception
I see the same behaviour with the HttpClient, but the code is more verbose. If I fetch the HTML returned in the 403 error it indicates that I have not authenticated and shows my username as empty.
Using Chrome/FF/IE - I can browse to all four of the sample URLs. The proxy doesn't prevent me, or show the same error message.
As crazy as it sounds, the code seems to fail only on sites that have a non-'www' subdomain combined with a non-.com TLD.
I've tried running Fiddler locally to see if anything was different between the requests - from what I can see - it looks identical, except for the URL:
GET http://maps.google.ie/ HTTP/1.1
Host: maps.google.ie
Proxy-Connection: Keep-Alive

GET http://www.google.com/ HTTP/1.1
Host: www.google.com
Proxy-Connection: Keep-Alive
In the 'Auth' tab fiddler shows:
No Proxy-Authorization Header is present.
No Authorization Header is present.
For both. But the .com example works and the .ie example fails. I tried pulling up the same maps.google.ie URL from within Chrome - which works great, and I can see that it has a Proxy-Authorization header in its GET:
GET http://maps.google.ie/ HTTP/1.1
Host: maps.google.ie
Proxy-Connection: keep-alive
Proxy-Authorization: NTLM T3RMTVNTUAAFAAACB4IBogQABAAzAAAACwALACgAAAAGAbFdAAAAD1BBVUxTT01xOTlEU1UTUR==
Can anyone tell me what's going on here? If that Proxy-Authorization is what I need, how do I get the HttpClient/WebClient to include it? I've tried creating a WebProxy and setting the Credentials on it - with the CredentialCache and with supplying the username/pass/domain (and every variation of the domain name I could think of). When I get it 'wrong' - all the sites seem to return 403. But when I get it right - the top 3 work and the 4th doesn't. In Fiddler, I'm never able to see that Proxy-Authorization in any of the requests I make - but it still works for the first three sites.
I'm sure I've missed something, but I'm at a loss. Any help would be much appreciated.

There are two ways:
var webc = new WebClient();
webc.UseDefaultCredentials = true;
var x4 = webc.DownloadString("http://maps.google.ie");
or, put this in your app.config
<?xml version="1.0" encoding="utf-8" ?>
<configuration>
<system.net>
<defaultProxy useDefaultCredentials="true" />
</system.net>
</configuration>
See tomfanning's answer here: Proxy Basic Authentication in C#: HTTP 407 error
I don't understand why "UseDefaultCredentials" does not default to true. If you work in a corporation that uses a proxy, any app that doesn't do this cannot get out of the LAN.

Moby Disk and Aron are both correct, in the sense that those are ways of specifying the proxy. But as mentioned in my question, using them didn't help.
For whatever reason, the web proxy required a User-Agent to be set. Once set, everything worked.
_client.DefaultRequestHeaders.TryAddWithoutValidation("User-Agent", "Mozilla/5.0 (Windows NT 6.1; WOW64; rv:24.0) Gecko/20100101 Firefox/24.0");
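For context, here is roughly how the working client ended up being wired together - a minimal sketch, assuming an authenticating corporate proxy and the logged-in user's credentials (the proxy URL is a placeholder, not a value from my environment):
using System.Net;
using System.Net.Http;

// Minimal sketch: HttpClient behind an authenticating corporate proxy, plus the
// User-Agent header this particular proxy insisted on. The proxy address is a placeholder.
var handler = new HttpClientHandler
{
    Proxy = new WebProxy("http://corporate-proxy:8080") { UseDefaultCredentials = true },
    UseProxy = true
};
var _client = new HttpClient(handler);
_client.DefaultRequestHeaders.TryAddWithoutValidation("User-Agent", "Mozilla/5.0 (Windows NT 6.1; WOW64; rv:24.0) Gecko/20100101 Firefox/24.0");
var x4 = _client.GetStringAsync("http://maps.google.ie").Result; // previously 403, works once the User-Agent is set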

var webc = new WebClient
{
    Proxy = new WebProxy
    {
        Credentials = new NetworkCredential(...),
    }
};
var x1 = webc.DownloadString("http://www.google.com"); // Works
var x2 = webc.DownloadString("http://www.google.ie"); // Works
var x3 = webc.DownloadString("http://maps.google.com"); // Works
var x4 = webc.DownloadString("http://maps.google.ie"); // 403 Forbidden exception
Unfortunately, .NET is really annoying when it comes to setting proxy credentials programmatically. You would expect to be able to do all of this in config, but it doesn't work out of the box: you can only set the proxy address in config, not the credentials.
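If you want to keep the address in config anyway, one workaround is to set only the credentials in code on the default proxy - a minimal sketch, assuming the proxy address comes from the defaultProxy element in app.config:
using System.Net;

// The address comes from app.config (defaultProxy/proxy proxyaddress="...");
// only the credentials are attached here, since they can't be expressed in config.
WebRequest.DefaultWebProxy.Credentials = CredentialCache.DefaultNetworkCredentials;
// or, with an explicit account (placeholder values):
// WebRequest.DefaultWebProxy.Credentials = new NetworkCredential("user", "pass", "DOMAIN");

var webc = new WebClient(); // picks up the config proxy plus the credentials set above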

Related

WebClient returning 403 error only for this website?

I am trying to download a file from these links using the C# WebClient, but I am getting a 403 error.
https://www.digikey.com/product-search/download.csv?FV=ffe00035&quantity=0&ColumnSort=0&page=5&pageSize=500
https://www.digikey.com/product-search/download.csv?FV=ffe00035&quantity=0&ColumnSort=0&page=4&pageSize=500
I tried using different user agents, accept encodings, etc.
I also tried replacing https with http in the URL, with no success.
When I paste these URLs into Chrome, Firefox, or IE, I am able to download the file. Sometimes the browser gives a 403 error too; replacing https with http in the URL then makes it download. But I have had no success with WebClient.
I tried inspecting with Fiddler, with no success.
Can someone try this on their system and help solve the problem?
Here is my code:
ServicePointManager.SecurityProtocol = SecurityProtocolType.Tls12;
WebClient client= new WebClient();
Uri request_url = new Uri("https://www.digikey.com/product-search/download.csv?FV=ffe00035&quantity=0&ColumnSort=0&page=5&pageSize=500");
//tried http also http://www.digikey.com/product-search/download.csv?FV=ffe00035&quantity=0&ColumnSort=0&page=5&pageSize=500
client.Headers.Add("user-agent", " Mozilla/5.0 (Windows NT 6.1; WOW64; rv:25.0) Gecko/20100101 Firefox/25.0");
client.DownloadFile(request_url, @"E:\123.csv");
I know there are many threads related to this topic; I tried all of them with no success, so please don't mark this as a duplicate. Please try it on your system - it's fewer than 10 lines of code.
Note: the same code works for other websites; only this website gives the error.
As I mentioned in my comment, the issue here is that the server expects a cookie (specifically 'i10c.bdddb') to be present and returns a 403 error when it's not. However, the cookie is sent with the 403 response, so you can make an initial junk request that will fail but give you the cookie. After this you can proceed as normal.
Through some trial and error I was able to get the CSV using the code below:
System.Net.ServicePointManager.SecurityProtocol = SecurityProtocolType.Tls12 | SecurityProtocolType.Tls11 | SecurityProtocolType.Tls;
CookieContainer cookieContainer = new CookieContainer();
Uri baseUri = new Uri("https://www.digikey.com");
using (HttpClientHandler handler = new HttpClientHandler() { CookieContainer = cookieContainer })
using (HttpClient client = new HttpClient(handler) { BaseAddress = baseUri })
{
    //The User-Agent is required (what values work would need to be tested)
    client.DefaultRequestHeaders.Add("User-Agent", "Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:67.0) Gecko/20100101 Firefox/67.0");

    //Make our initial junk request that will fail but get the cookie
    HttpResponseMessage getCookiesResponse = await client.GetAsync("/product-search/download.csv");

    //Check if we actually got cookies
    if (cookieContainer.GetCookies(baseUri).Count > 0)
    {
        //Try getting the data
        HttpResponseMessage dataResponse = await client.GetAsync("product-search/download.csv?FV=ffe00035&quantity=0&ColumnSort=0&page=4&pageSize=500");
        if (dataResponse.StatusCode == HttpStatusCode.OK)
        {
            Console.Write(await dataResponse.Content.ReadAsStringAsync());
        }
    }
    else
    {
        throw new Exception("Failed to get cookies!");
    }
}
Notes
Even with the right cookie, if you don't send a User-Agent header the server will return a 403. I'm not sure what the server expects in terms of a user agent; I just copied the value my browser sends.
In the check to see if cookies have been set, it would be a good idea to verify that you actually have the 'i10c.bdddb' cookie instead of just checking whether there are any cookies (see the sketch below).
This is just a quick bit of sample code so it's not the cleanest. You may want to look into FormUrlEncodedContent to send the page number and other parameters.
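On that second note, a minimal sketch of checking for the specific cookie by name, meant to slot into the sample above (the cookie name is the one observed in this answer and could change on the server side):
// Look for the specific cookie the server appears to require, rather than accepting any cookie.
bool hasSessionCookie = false;
foreach (Cookie cookie in cookieContainer.GetCookies(baseUri))
{
    if (cookie.Name == "i10c.bdddb")
    {
        hasSessionCookie = true;
        break;
    }
}
if (!hasSessionCookie)
{
    throw new Exception("Did not receive the expected 'i10c.bdddb' cookie.");
}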
I tested with your URL and I was able to reproduce your error. Any request that I try with the querystring parameter quantity=0 seems to fail with an HTTP 403 error.
I would suggest requesting a quantity greater than zero.
An HTTP 403 status code means Forbidden, so there is a problem with your credentials; it doesn't seem like you're sending any. If you add them to your header, this should work fine, like this:
client.Headers.Add("Authorization", "token");
or sending them like this:
client.UseDefaultCredentials = true;
client.Credentials = new NetworkCredential("username", "password");
Most likely the links work through web browsers because you have already authenticated and the browser is sending the credentials/token.
I have this issue with Digi-key too.
The solution for me is to turn off my VPN service.

WebPush {"Received unexpected response code"}

How do I send a notification using the WebPush library? I was trying, but it throws an error message like {"Received unexpected response code"}.
Now I have created a web API to send the notification and am calling it through Fiddler, but I didn't get an exception - it's stuck somewhere.
Here is my code sample:
public void Sendnotification()
{
    try
    {
        // Generate a VAPID key pair (VapidHelper is the WebPush helper class).
        WebPush.VapidDetails vapidKeys = VapidHelper.GenerateVapidKeys();
        string subject = @"mailto:xyz.com";
        string publicKey = Convert.ToString(vapidKeys.PublicKey);
        string privateKey = Convert.ToString(vapidKeys.PrivateKey);

        // pushEndpoint, p256dh and auth come from the browser's subscription.
        var subscription = new PushSubscription(pushEndpoint, p256dh, auth);
        var vapidDetails = new VapidDetails(subject, publicKey, privateKey);

        var client = new WebPushClient();
        client.SendNotification(subscription, "payload", vapidDetails);
    }
    catch (WebPushException e)
    {
        // swallowed - the response details on the exception are lost here
    }
}
I have configured HTTPS and am calling the API through Fiddler. Please have a look - it still throws an error and gets stuck somewhere.
Now I got the error - please have a look, it's showing HTTP/1.1 410 NotRegistered.
See the full screenshot of the Fiddler response error details.
If you are getting error 410 (use Fiddler to intercept the HTTPS call to check the error), you probably have an error in the user's subscription data: the keys stored in your database don't match the subscription in the browser. An easy fix could be to re-subscribe, re-save the subscription data, and try again.
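If you also want to handle the 410 on the API side, a rough sketch meant to slot into the Sendnotification method from the question - the subscriptionStore repository is hypothetical, and the assumption is that WebPushException exposes the response status code:
try
{
    client.SendNotification(subscription, "payload", vapidDetails);
}
catch (WebPushException ex) when (ex.StatusCode == HttpStatusCode.Gone) // 410 NotRegistered
{
    // The stored keys no longer match the browser's subscription, so drop it;
    // the client should re-subscribe and the fresh subscription data should be saved.
    subscriptionStore.Remove(subscription.Endpoint); // 'subscriptionStore' is a hypothetical repository
}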
To set up Fiddler, you have to use it as a proxy for Visual Studio to intercept the HTTPS calls, and you also have to enable HTTPS decryption.
EDIT
You can set up Fiddler just by adding this configuration to your web.config or app.config:
<system.net>
<defaultProxy
enabled = "true"
useDefaultCredentials = "true">
<proxy autoDetect="false" bypassonlocal="false" proxyaddress="http://127.0.0.1:8888" usesystemdefault="false" />
</defaultProxy>
</system.net>
If you get "Unauthorized Registration", check these questions:
Web Push API Chrome, returning "Unauthorized Registration"
WebPushError and UnauthorizedRegistration when try to send push notification to Chrome and Opera, FF is OK

Using DotNetOpenAuth for NetSuite SuiteSignOn (Outbound Single Sign-on)

I am trying to figure out how to use DotNetOpenAuth (DNOA) to interface with NetSuite's SuiteSignOn. I have a Java example whose behaviour I am trying to duplicate, but I am new to OAuth. Here is what I have to work with:
This is the high level of what NetSuite wants to happen:
User logs in to NetSuite, initiating a NetSuite session.
User clicks on one of the following in the NetSuite user interface:
o A subtab that provides SuiteSignOn access
o A page displaying a portlet that provides SuiteSignOn access
o A link for a Suitelet that provides SuiteSignOn access
o An action button that results in the execution of a user event script that provides SuiteSignOn access
NetSuite generates a token, and sends this token to the external application as the value for the oauth_token URL parameter. This outbound HTTP call also includes a dc and an env URL parameter. These values can be mapped to the URL to be used for NetSuite access (see Mappings of dc and env URL Parameter Values). If any data fields were previously defined as required context for the connection, NetSuite sends values for these fields at the same time.
The external application sends back to NetSuite the token, the consumer key, and its shared secret, along with other information such as the timestamp and nonce, in order to verify the user. The consumer key is a unique identifier for the application provider, generated by NetSuite when the application provider sets up a SuiteSignOn connection. The shared secret is a password defined by the application provider during this setup.
NetSuite responds to the verification, sending any user identification information that was previously defined as necessary for the connection, in XML format. This information may include standard fields like email address or name, or custom fields.
The external application sends the HTML for the landing page, and the page displays. Or, if there is a problem, an error is returned instead.
NetSuite HTTP Outbound Call (got this figured out).
When a user accesses a SuiteSignOn connection point, NetSuite issues an outbound call to start the handshake. The following is an example of this call:
GET /SSO/demoApp.php?oauth_token=01046c1211661d6c6b415040422f0daf09310e3ea4ba&dc=001&env=PRODUCTION HTTP/1.1
Host: externalsystem.com
User-Agent: Mozilla/5.0 (Windows NT 6.1; WOW64; rv:19.0) Gecko/20100101 Firefox/19.0
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
Accept-Language: en-US,en;q=0.5
Accept-Encoding: gzip, deflate
Connection: keep-alive
External Application HTTP Verify Call (trying to prepare this with DotNetOpenAuth).
Upon receipt of the NetSuite HTTP outbound call, the external application needs to issue an HTTP verify call. The following is an example of this call:
GET /app/common/integration/ssoapplistener.nl HTTP/1.0
Host: system.netsuite.com
Authorization: OAuth oauth_token="01046c1211661d6c6b415040422f0daf09310e3ea4ba", oauth_consumer_key="3moWE2ukbW4lohz7", oauth_signature_method="PLAINTEXT", oauth_signature="foobar1%26", oauth_timestamp="1364997730", oauth_nonce="392380036"
NetSuite HTTP Verify Call Response (I can code this).
Upon receipt of the verify call from the external application, NetSuite sends a response. The following is an example of this response:
HTTP/1.1 200 OK
Date: Tue, 16 Apr 2013 13:30:41 GMT
Server: Apache/2.2.17
Set-Cookie: lastUser=1326288_79_3; expires=Tuesday, 23-Apr-2013 13:30:42 GMT; path=/
Set-Cookie: NS_VER=2013.1.0; domain=system.netsuite.com; path=/
X-Powered-By: Servlet/2.5 JSP/2.1
P3P: CP="CAO PSAa OUR BUS PUR"
Vary: User-Agent
Connection: close
Content-Type: text/html; charset=utf-8
<?xml version="1.0" encoding="UTF-8"?>
<outboundSso>
<entityInfo>
<ENTITYLASTNAME>Smith</ENTITYLASTNAME>
<ENTITYINTERNALID>79</ENTITYINTERNALID>
<ENTITYACCOUNT>1326288</ENTITYACCOUNT>
<ENTITYFIRSTNAME>John</ENTITYFIRSTNAME>
<ENTITYEMAIL>jsmith@netsuite.com</ENTITYEMAIL>
</entityInfo>
</outboundSso>
Excerpts of the Java example, using OAuth 1.0a, that I'm trying to port to .NET/DotNetOpenAuth:
import net.oauth.OAuth;
import net.oauth.OAuthAccessor;
import net.oauth.OAuthConsumer;
import net.oauth.OAuthMessage;
import net.oauth.client.OAuthClient;
import net.oauth.http.HttpMessage;
<<snip>>
OAuthConsumer consumer = new OAuthConsumer(null, CONSUMER_KEY, SHARED_SECRET, null);
consumer.setProperty(OAuth.OAUTH_SIGNATURE_METHOD, "PLAINTEXT");
OAuthAccessor oauthAccessor = new OAuthAccessor(consumer);
//Get the token from NetSuite
oauthAccessor.accessToken = request.getParameter("oauth_token");
<<snip>>
OAuthMessage rqt = null;
rqt = oauthAccessor.newRequestMessage("POST", ssoVerifyUrl, null);
HttpMessage message =
rqt.toHttpRequest(OAuthClient.ParameterStyle.AUTHORIZATION_HEADER);
verifyConnection.setRequestProperty("Authorization",
message.getHeader("Authorization"));
Being new to OAuth and DotNetOpenAuth, I'm fumbling around.
What is the proper replacement for OAuthConsumer in DNOA in this situation? WebConsumer? DesktopConsumer?
Assuming I need such a consumer, how much of the ServiceProviderDescription do I need to provide? I only have one endpoint (/app/common/integration/ssoapplistener.nl), I'm not sure if that is a Request, Access, or other type of endpoint.
What is the proper replacement for OAuthAccessor in DNOA?
Thanks for any assistance,
Bo.
Ok, after a lot of digging and experimenting, I got DotNetOpenAuth to work with NetSuite's SuiteSignOn. It may not be perfect, but it does work!
I got my tokenmanager from this post:
https://developer.yahoo.com/forum/Fantasy-Sports-API/Authenticating-with-NET-using-DotNetOpenAuth/1279209867000-4eee22f1-25fd-3589-9115-1a835add3212
using DotNetOpenAuth.OAuth;
using DotNetOpenAuth.OAuth.ChannelElements;
using DotNetOpenAuth.OAuth.Messages;
using DotNetOpenAuth.Messaging;
using DotNetOpenAuth.OpenId.Extensions.OAuth;
// In my Page_Load method, I receive the GET request from NetSuite:
public partial class sso_page : System.Web.UI.Page
{
protected void Page_Load(object sender, EventArgs e)
{
// This is what the NetSuite SuiteSignOn ConnectionPoint sends:
// GET /administratorportal/SSO/sso_page.aspx?oauth_token=08046c1c166a7a6c47471857502d364b0d59415418156f15db22f76dcfe648&dc=001&env=SANDBOX
// see the NetSuite SuiteSignOn doc about dc & env processing to build endpoints
ServiceProviderDescription provider = GetServiceDescription();
// Set up OAuth with our keys and stuff
string token = Request.Params["oauth_token"];
string consumerKey = "yourconsumerkey"; // this has to match what is defined on our NetSuite account - ConnectionPoint to CRMLink
string sharedSecret = "yoursharedsecret"; // this has to match what is defined on our NetSuite account - ConnectionPoint to CRMLink - Careful - NO funny chars like '!'
// I got this InMemoryTokenManager from another DotNetOpenAuth post in SO
InMemoryTokenManager _tokenManager = new InMemoryTokenManager(consumerKey, sharedSecret);
AuthorizationApprovedResponse authApprovedResponse = new AuthorizationApprovedResponse();
authApprovedResponse.RequestToken = token;
_tokenManager.StoreOpenIdAuthorizedRequestToken(consumerKey, authApprovedResponse);
WebConsumer consumer = new WebConsumer(provider, _tokenManager);
// this is the SSO address in netsuite to use. Should be production or sandbox, based on the values of dc and env
string uri = "https://system.sandbox.netsuite.com/app/common/integration/ssoapplistener.nl";
// 'methods' was not shown in the original post; an Authorization-header GET is assumed here,
// matching the header style of NetSuite's verify-call example.
var methods = HttpDeliveryMethods.GetRequest | HttpDeliveryMethods.AuthorizationHeaderRequest;
MessageReceivingEndpoint endpoint = new MessageReceivingEndpoint(uri, methods);
WebRequest verifyRequest = consumer.PrepareAuthorizedRequest(endpoint, token );
HttpWebResponse responseData = verifyRequest.GetResponse() as HttpWebResponse;
XDocument responseXml;
responseXml = XDocument.Load(responseData.GetResponseStream());
// process the SSO values that come back from NetSuite in the XML They should look something
// like the following:
/* XML response should look like this:
<?xml version="1.0" encoding="UTF-8"?>
<outboundSso>
<entityInfo>
<ENTITYINTERNALID>987654</ENTITYINTERNALID>
<ENTITYNAME>Fred</ENTITYNAME>
<ENTITYEMAIL>fred@yourcompany.com</ENTITYEMAIL>
</entityInfo>
</outboundSso>
*/
// If that data looks good, you can mark the user as logged in, and redirect to whatever
// page (like SSOLandingPage.aspx) you want, which will be shown inside a frame on the NetSuite page.
Response.Redirect("~/SSOLandingPage.aspx", false);
// If that data looks bad (invalid user/login?), you could respond with an error or redirect to a login.aspx page or something.
}
}
There is some other error handling and different returns depending on what happens, but the above is the basics of receiving an SSO login from NetSuite SuiteSignOn.
This was a hardcoded ServiceProviderDescription I used. You need to read the NetSuite SuiteSignOn doc to understand how to dynamically build these endpoints based on values of dc and env, I did not do that here yet.
// I'm not completely sure why I need all these endpoints below, since I provide an endpoint directly:
// MessageReceivingEndpoint endpoint = new MessageReceivingEndpoint(uri, methods);
// so it doesn't seem like I need them. But I need a ServiceProviderDescription to create a consumer, so...
private ServiceProviderDescription GetServiceDescription()
{
return new ServiceProviderDescription
{
AccessTokenEndpoint = new MessageReceivingEndpoint("https://system.sandbox.netsuite.com/app/common/integration/ssoapplistener.nl", HttpDeliveryMethods.GetRequest),
RequestTokenEndpoint = new MessageReceivingEndpoint("https://system.sandbox.netsuite.com/app/common/integration/ssoapplistener.nl", HttpDeliveryMethods.GetRequest),
UserAuthorizationEndpoint = new MessageReceivingEndpoint("https://system.sandbox.netsuite.com/app/common/integration/ssoapplistener.nl", HttpDeliveryMethods.GetRequest),
ProtocolVersion = ProtocolVersion.V10a,
TamperProtectionElements = new ITamperProtectionChannelBindingElement[] { new PlaintextSigningBindingElement() }
};
}
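As a follow-up on the dc/env comment, here is a rough, hypothetical sketch of picking the listener URL from the env parameter alone; the dc data-center mapping from NetSuite's "Mappings of dc and env URL Parameter Values" documentation would still need to be layered on top:
// Hypothetical helper: chooses the SSO verify host from the 'env' query parameter only.
private static string GetSsoListenerUrl(string env)
{
    string host = string.Equals(env, "SANDBOX", StringComparison.OrdinalIgnoreCase)
        ? "system.sandbox.netsuite.com"
        : "system.netsuite.com";
    return string.Format("https://{0}/app/common/integration/ssoapplistener.nl", host);
}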

JsonServiceClient not using cookie placed in cookiecontainer

I've posted a couple of questions previously to get where I am now:
Reconnecting to Servicestack session in an asp.net MVC4 application
and
nullreference exception when adding session cookie to ServiceStack
Quick background on the app: This is an asp.net MVC application accessing data on a remote servicestack installation.
At this point I am successfully authenticating with SS, saving the session key in a cookie, and inserting that cookie into the CookieContainer of a new JsonServiceClient instance.
However when I try to grab some data via the new JsonServiceClient instance:
CallList = client.Get(new ServiceCallRequest()).Result;
The remote ServiceStack instance seems to be redirecting me to the default ASP login area (/Auth/login or something similar). This redirect in itself isn't a problem, but it does seem to indicate that the client isn't using the SS session already established on the remote machine.
This is the code that is actually inserting the cookie into the client cookie container and calling for a list of objects:
public List<ServiceCallModel> LoadList()
{
    try
    {
        var cookie = HttpContext.Request.Cookies.Get(SessionFeature.PermanentSessionId);
        var client = new JsonServiceClient([api address]);
        cookie.Domain = ".domain.com";
        var cookie1 = new Cookie(SessionFeature.PermanentSessionId, cookie.Value);
        cookie1.Domain = ".domain.com";
        client.CookieContainer.Add(cookie1);
        List<ServiceCallModel> CallList = new List<ServiceCallModel>();
        CallList = client.Get(new ServiceCallRequest()).Result;
        return CallList;
    }
    catch (Exception ex)
    {
        return new List<ServiceCallModel>();
    }
}
I can verify that this remote resource works with a monotouch Android application using the C# client. The only difference of course is that the Android client is persistent; the one in question here isn't.
The above example always returns a WebServiceException ("Not Found") (Which I assume is actually a 401/403 that has been redirected annoyingly by ASP).
Does this seem reasonable, or am I missing/misunderstanding some functionality of the JsonServiceClient/ServiceStack?
Thanks much
Update
Using Fiddler, I can confirm that the cookie is being stored in the browser and sent back to the MVC application in the request header:
GET / HTTP/1.1
Host: [web app address]
Connection: keep-alive
Cache-Control: max-age=0
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,*/*;q=0.8
User-Agent: Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko)
Accept-Encoding: gzip,deflate,sdch
Accept-Language: en-US,en;q=0.8
Cookie: ASP.NET_SessionId=3457C511FECBECCD3C055C21;.MONOAUTH=PzE6iBuLIbv1evgACfpUwpC1D7opCANroPDQN/mXHNvGAgkjqq04Tdd8EnnGTL7y3lWYWY4+GWaXGDT0Fm7+eJxRdpy LJMaUQ6BiYmb3dxRi1B3f/qkmPMbFIoC7vC9M; ss-pid=lzJ+o9vG/7F3YZ9JNN2F
At this point I am trying to find out if that same ss-pid value is then making it back to the API server in a request header.
Update 2
Using tcpdump I was able to see that the ss-pid value is in fact making it all the way back to the API server (The "remote servicestack instance"). So now I think I need to troubleshoot that, not the client. Any thoughts?
Snippet of tcpdump output:
0x0090: 6e65 740d 0a43 6f6f 6b69 653a 2073 732d net..Cookie:.ss-
0x00a0: 7069 643d 6c7a 4a2b 6f39 7647 2f37 4633 pid=lzJ+o9vG/7F3
0x00b0: 595a 394a 4e4e 3246 0d0a 4163 6365 7074 YZ9JNN2F..Accept
I know that the ss-pid values are different in each part of this post; they were obtained at different times.
Update 3
I've also changed the LogFormat in the vhost config file to spit out the value for the cookie called "ss-pid" (At the end of the log entry).
The resulting logs on the ServiceStack remote API server looks like this:
172.16.0.17 - - [08/Oct/2013:12:26:52 -0400] "GET /sc/0 HTTP/1.1" 500 3082 "-" "-" "HFMtFpPQkpE0Br6/fEFg"
172.16.0.17 - - [08/Oct/2013:12:27:06 -0400] "GET /sc/0 HTTP/1.1" 302 394 "-" "-" "HFMtFpPQkpE0Br6/fEFg"
172.16.0.17 - - [08/Oct/2013:12:27:07 -0400] "GET /login.aspx?ReturnUrl=%2fsc%2f0 HTTP/1.1" 404 451 "-" "-" "HFMtFpPQkpE0Br6/fEFg"
This "500" status on the first request sticks out. I will be investigating this now.
Update 4
The 500 status seems to be a case of Microsoft.Web.Infrastructure.dll being included in the bin directory. Deleting that resolved the 500 response, but it did not fix the overall problem.
ServiceStack places two Session cookies in the Request, 'ss-id' and 'ss-pid'. This line in your code...
var cookie = HttpContext.Request.Cookies.Get(SessionFeature.PermanentSessionId);
will grab the 'ss-pid' cookie. But you probably want to grab the 'ss-id' cookie, which is the cookie for your current authenticated session.
var cookie = HttpContext.Request.Cookies.Get(SessionFeature.SessionId);
Take a look here for more information on ServiceStack Sessions.
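For illustration, a minimal sketch of that change against the LoadList method in the question - the API address and domain are placeholders taken from the question, not values I can verify:
// Forward the current authenticated session cookie ('ss-id'), and optionally the
// permanent 'ss-pid' cookie, to the JsonServiceClient.
var ssId = HttpContext.Request.Cookies.Get(SessionFeature.SessionId);            // "ss-id"
var ssPid = HttpContext.Request.Cookies.Get(SessionFeature.PermanentSessionId);  // "ss-pid"

var client = new JsonServiceClient("https://api.domain.com"); // placeholder address
if (ssId != null)
    client.CookieContainer.Add(new Cookie(SessionFeature.SessionId, ssId.Value) { Domain = ".domain.com" });
if (ssPid != null)
    client.CookieContainer.Add(new Cookie(SessionFeature.PermanentSessionId, ssPid.Value) { Domain = ".domain.com" });

var callList = client.Get(new ServiceCallRequest()).Result;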

C# Network credentials not being passed to server?

Edit: Using:
byte[] authBytes = System.Text.Encoding.UTF8.GetBytes(user + ":" + password);
wr.Headers["Authorization"] = "Basic " + Convert.ToBase64String(authBytes);
Seems to work fine.
I have an application that communicates with a CMS. I'm currently trying to get this client to be able to upload text/xml data to the CMS using a "POST" method.
I can pass this through using curl perfectly fine:
curl -u user:password -H "Content-Type:text/xml" -d "<element>myXML</element>" serverURL
However, trying to use the HttpWebRequest in C# I can't get the server to return what I want it to. So I fired up Wireshark and had a look at what was actually being passed through, and it's pretty much identical except for the fact that when using curl I can see:
Authorization: Basic <a bunch of hex>=\r\n
Credentials: user:password
In the HTTP header fields, while in the output from my client, these header fields are simply not present. ("Credentials:" isn't actually there in plain text, it's a subtree of "Authorization:" - so I'm not sure where it's getting it from, but the username and password are correct.)
The C# code I'm trying to use to set the credentials for the webrequest is something like this:
NetworkCredential myCred = new NetworkCredential(
user, password, serverURL);
CredentialCache myCache = new CredentialCache();
myCache.Add(new Uri(serverURL), "Basic", myCred);
HttpWebRequest wr = (HttpWebRequest) HttpWebRequest.Create(serverURL);
wr.Credentials = myCache;
I've tried just setting the credentials like this too (and without specifying serverURL):
wr.Credentials = new NetworkCredential(user,password,serverURL);
But that still doesn't make it show up in wireshark. Does anyone have any idea if:
A) That authorization information should actually be in the HTTP header for this to work, and
B) If it is - then how do I make C# put it in? I only seem to be able to find decent examples using the default credentials, which doesn't apply to what I'm doing.
Thanks in advance.
.NET's WebRequest has an infuriating default behavior where it only sends credentials after receiving an HTTP 401 Unauthorized response.
Manually adding the credentials header (as you've done) seems to be the best solution available.
More details in this post
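For completeness, a sketch of the two usual workarounds - the manual Basic header from the edit above, which sends credentials on the very first request, and PreAuthenticate, which still needs one 401 round trip before credentials are attached proactively (serverURL, user and password are placeholders):
using System;
using System.Net;
using System.Text;

var wr = (HttpWebRequest)WebRequest.Create(serverURL);

// Option 1: build the Basic Authorization header yourself, so it goes out immediately.
byte[] authBytes = Encoding.UTF8.GetBytes(user + ":" + password);
wr.Headers["Authorization"] = "Basic " + Convert.ToBase64String(authBytes);

// Option 2: keep NetworkCredential but set PreAuthenticate; the first request still waits for
// the 401 challenge, and only subsequent requests send the header up front.
// wr.Credentials = new NetworkCredential(user, password);
// wr.PreAuthenticate = true;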
