How to get proxy - C#

I am trying to get the proxy for a web request (HttpWebRequest or WebClient).
In Control Panel -> Internet Options -> Connections -> LAN Settings
you will see three options:
Automatically detect settings
Use automatic configuration script
Use a proxy server for your LAN
I want to make sure that, whichever of these is checked, my web request picks up the same proxy as the browser does.
I am using the code below to achieve this; however, when option 1 (automatic detection) is checked and I try the same URL in the browser and in my code, my code is much slower. I guess the way I get the proxy in code may be inefficient or inappropriate.
Is there anything I can change in my code to match the speed of the browser?
var uri = new Uri(uriStr);
var client = (HttpWebRequest)WebRequest.Create(uri);
client.Headers["something"] = something;
client.Timeout = ConnectionTimeOut; // 1 min

// GetSystemWebProxy reads the IE/LAN settings (auto-detect, PAC script, or manual proxy).
var proxyURI = WebRequest.GetSystemWebProxy().GetProxy(uri);
var proxy = new WebProxy(proxyURI, true)
{
    Credentials = CredentialCache.DefaultNetworkCredentials
};

// If there is no proxy, GetProxy returns the request URI itself,
// so only assign client.Proxy when the addresses actually differ.
if (proxyURI != null && !string.IsNullOrEmpty(proxyURI.AbsoluteUri) && !proxy.Address.Equals(uri))
{
    client.Proxy = proxy;
}

Your approach is fine.
What may be causing the speed difference is that the browser may have cached the page you are requesting, or cached the proxy and proxy credentials, so it does not need to repeat the lookups your code performs on every request.
Have you tried subsequent requests within your application after acquiring the proxy/credentials?
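If the cost is the proxy lookup itself (WPAD auto-detection in particular can take several seconds the first time), resolving the system proxy once and reusing it should help. A minimal sketch, assuming all requests fall under the same proxy rule; the ProxyCache and CreateRequest names are hypothetical, not part of the original code:
using System;
using System.Net;

static class ProxyCache
{
    // Resolve the system proxy once; repeating WPAD auto-detection on every
    // request is what typically makes code slower than the browser.
    private static readonly IWebProxy SystemProxy = WebRequest.GetSystemWebProxy();

    public static HttpWebRequest CreateRequest(Uri uri, int timeoutMs)
    {
        var request = (HttpWebRequest)WebRequest.Create(uri);
        request.Timeout = timeoutMs;

        var proxyUri = SystemProxy.GetProxy(uri);
        // GetProxy returns the request URI itself when no proxy applies.
        if (proxyUri != null && !proxyUri.Equals(uri))
        {
            request.Proxy = new WebProxy(proxyUri, true)
            {
                Credentials = CredentialCache.DefaultNetworkCredentials
            };
        }
        return request;
    }
}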

Related

Does RestSharp perform WINS lookup?

I am trying to connect to an API deployed on a Windows server that does not belong to a domain.
When I initialize my RestClient this way, my requests fail. If I provide the machine's IP address instead, my request works properly.
var options = new RestClientOptions("https://MyMachine/") {
Timeout = 1000
};
var client = new RestClient(options);
My guess is that RestSharp only performs a DNS lookup and not a WINS lookup. Is my guess right?
Thank you very much for your input; it helped me understand that I was on the wrong track.
I solved my issue by adding the following line:
ServicePointManager.EnableDnsRoundRobin = true;
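If the name still cannot be resolved, another option (a hedged sketch; RestSharp simply hands name resolution to the underlying HTTP stack, so this is plain .NET name resolution) is to resolve or substitute the address yourself before building the client. The fallback IP 192.168.1.50 below is a placeholder:
using System;
using System.Net;
using System.Net.Sockets;
using RestSharp;

// Try to resolve the machine name via the OS resolver first; fall back to a
// known IP (placeholder) if that fails, since connecting by IP is known to work.
string host = "MyMachine";
try
{
    IPAddress[] addresses = Dns.GetHostAddresses(host);
    if (addresses.Length > 0)
        host = addresses[0].ToString();
}
catch (SocketException)
{
    host = "192.168.1.50"; // placeholder fallback address
}

var options = new RestClientOptions("https://" + host + "/");
var client = new RestClient(options);
// Note: connecting over HTTPS by IP requires the server certificate to cover that address.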

ASP.NET Web API 2: impersonation on a WebClient call

We have a Web API that runs under its own user account, let's say ApplicationPoolUser.
That account has access to the databases used by the API, etc., which works fine.
But I'm trying to send an HTTP GET for files on a remote server (SharePoint 2007) using WebClient.
Here's what I'm using:
WindowsImpersonationContext impersonationContext = null;
Uri uri = new Uri(Path.Combine(this.Document.path, Document.fileNameOriginal));
Stream stream = null;
// WindowsIdentity.GetCurrent().Name returns 'ApplicationPoolUser' here
try
{
    WindowsIdentity wi = System.Web.HttpContext.Current.Request.LogonUserIdentity;
    impersonationContext = WindowsIdentity.Impersonate(wi.Token);
    // WindowsIdentity.GetCurrent().Name now returns 'CurrentRequestingUser'
    WebClient client = new WebClient()
    {
        UseDefaultCredentials = true,
        CachePolicy = new System.Net.Cache.RequestCachePolicy(RequestCacheLevel.BypassCache)
    };
    stream = client.OpenRead(uri);
    // OpenRead still authenticates on the SharePoint server as ApplicationPoolUser
}
catch (WebException ex)
{
    HttpWebResponse webResp = (HttpWebResponse)ex.Response;
    if (webResp.StatusCode == HttpStatusCode.NotFound)
        throw new GeneralException(Common.Enums.ExceptionMessage.NotFound, webResp.StatusDescription);
    else
    {
        throw; // rethrow without resetting the stack trace
    }
}
Is there a way to force authentication on behalf of the user without turning ASP.NET impersonation on in the web.config / IIS site?
I don't want the whole request to execute as the impersonated user, just this small part.
I did try to use HttpClient instead, but I found that since HttpClient runs on a different thread, it always uses the application pool identity.
Can I create the Negotiate call myself and add it to the request?
Thank you.
EDIT:
I have tried removing all AuthenticationManager modules except Kerberos, and the request still uses NTLM for authentication. What am I doing wrong?
There are multiple factors which can make impersonation, or to be precise, delegation of the user's credentials, impossible.
1) If you are using asynchronous methods (directly or not) you might experience a problem with flowing the identity. You can check whether that might be the problem with the following call:
System.Security.SecurityContext.IsWindowsIdentityFlowSuppressed();
This should return false; if it doesn't, you can use this as a reference: Calling an async WCF Service while being impersonated
2) You have to enable constrained delegation for the executing account. You have a so-called Kerberos double-hop scenario: you have to allow the SharePoint user to act as another user, or else Impersonate() will not succeed as expected.
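As a quick way to check point 1 and keep the impersonation scoped to just the download, a minimal sketch (the SharePoint URL is a placeholder, and the GeneralException/error mapping from the question's code is omitted):
using System;
using System.IO;
using System.Net;
using System.Security.Principal;
using System.Web;

// Point 1: if Windows identity flow is suppressed, async continuations will not
// carry the impersonated identity; this is expected to return false.
bool suppressed = System.Security.SecurityContext.IsWindowsIdentityFlowSuppressed();

WindowsIdentity caller = HttpContext.Current.Request.LogonUserIdentity;
WindowsImpersonationContext ctx = null;
try
{
    // Impersonate only around the WebClient call, not the whole request.
    ctx = caller.Impersonate();
    using (var client = new WebClient { UseDefaultCredentials = true })
    using (Stream stream = client.OpenRead(new Uri("http://sharepoint2007/doc.docx"))) // placeholder URL
    {
        // consume the stream here
    }
}
finally
{
    if (ctx != null)
        ctx.Undo(); // always revert to the application pool identity
}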

Onvif SOAP request with SOAP level authentication and HTTP authentication

This question has been discussed in several topics here, but I could not find the answer for my case.
What I'm trying to do is use an IP camera through the Onvif interface. I've generated the web services from the WSDL files available on the Onvif homepage and added the custom SOAP authentication code as suggested here, and I am able to retrieve the device capabilities etc.
But for some services, e.g. PTZ control, HTTP authentication is also needed. My code removes the ClientCredentials behavior (so yes, I guess setting them does not make any sense, but I still left those lines in the hope that maybe the HTTP transport would try to use them):
HttpTransportBindingElement httpBindingElement = new HttpTransportBindingElement();
httpBindingElement.AuthenticationScheme = AuthenticationSchemes.Basic;
...
PTZClient ptzClient = new PTZClient(customBinding, endPointAddress);
ptzClient.Endpoint.Behaviors.Remove(typeof(System.ServiceModel.Description.ClientCredentials));
UsernameClientCredentials onvifCredentials = new UsernameClientCredentials(new UsernameInfo(_username, _password));
ptzClient.Endpoint.Behaviors.Add(onvifCredentials);
ptzClient.ClientCredentials.UserName.UserName = _username;
ptzClient.ClientCredentials.UserName.Password = _password;
Still, when I look at Wireshark, I see that the SOAP authentication is generated but no HTTP authentication header is set (well, I already expected that, since I have a custom behavior here). So the question is: if I am creating the binding this way, what are my best options for adding HTTP authentication headers? Can I just add a message inspector, and if so, are there any examples? Must I create a different transport binding? I've seen people advising others to use BasicHttpBinding and then setting its Security property, but where do the credentials go in that case, and how do I apply the BasicHttpBinding instance to my binding? Are there any callbacks in WCF triggered by the HTTP 401 code that I could hook into to provide the header? This is actually my first experience with WCF, and so far I've done everything from examples found on the internet, but for this particular issue I haven't been able to find anything.
If anyone is interested, this is how I got it working. I combined the BasicHttpBinding with the client credentials in the following way:
TransportSecurityBindingElement transportSecurity = new TransportSecurityBindingElement();
// UsernameCredentials is a class implementing WS-UsernameToken authentication
transportSecurity.EndpointSupportingTokenParameters.SignedEncrypted.Add(new UsernameTokenParameters());
transportSecurity.AllowInsecureTransport = true;
transportSecurity.IncludeTimestamp = false;
TextMessageEncodingBindingElement messageEncoding = new TextMessageEncodingBindingElement(MessageVersion.Soap12, Encoding.UTF8);
HttpClientCredentialType[] credentialTypes = new HttpClientCredentialType[3] { HttpClientCredentialType.None, HttpClientCredentialType.Basic, HttpClientCredentialType.Digest };
...
foreach (HttpClientCredentialType credentialType in credentialTypes)
{
BasicHttpBinding httpBinding = new BasicHttpBinding(BasicHttpSecurityMode.TransportCredentialOnly);
httpBinding.Security.Transport.ClientCredentialType = credentialType;
BindingElementCollection elements = new BindingElementCollection(new BindingElement[1]{messageEncoding});
foreach(BindingElement element in httpBinding.CreateBindingElements())
{
if (element is TextMessageEncodingBindingElement)
continue;
elements.Add(element);
}
CustomBinding customBinding = new CustomBinding(elements);
DeviceClient deviceClient = new DeviceClient(customBinding, endPointAddress);
if (credentialType == HttpClientCredentialType.Basic)
{
// Set all credentials, not sure from which one WCF actually takes the value
deviceClient.ClientCredentials.UserName.UserName = pair[0];
deviceClient.ClientCredentials.UserName.Password = pair[1];
}
else if (credentialType == HttpClientCredentialType.Digest)
{
deviceClient.ClientCredentials.HttpDigest.AllowedImpersonationLevel = System.Security.Principal.TokenImpersonationLevel.Delegation;
deviceClient.ClientCredentials.HttpDigest.ClientCredential.UserName = pair[0];
deviceClient.ClientCredentials.HttpDigest.ClientCredential.Password = pair[1];
}
}
This works with a device for which we do not know the authentication mode in advance, and it covers both authentication levels (HTTP and SOAP).
I detailed how HTTP digest works in another answer.
Remember that only functions of class PRE_AUTH, according to §5.12.1 of the Core spec, require authentication.
You should invoke a function of any class but PRE_AUTH without any form of authentication. If you get an HTTP 401, then you have to use HTTP digest; otherwise you'll have to go with WS-UsernameToken.
You can't use HTTP digest directly, because you first need the device to send you the challenge for HTTP digest.
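A minimal sketch of that probing logic, assuming a hypothetical CreateClient(credentialType) helper that builds the DeviceClient the way the answer above does, and using GetSystemDateAndTime (a non-PRE_AUTH call) as the probe:
using System.ServiceModel;
using System.ServiceModel.Security;

// Probe with a call that requires no authentication per the spec. If the device
// also enforces HTTP-level authentication, the call fails with a 401 and we
// retry with HTTP digest; otherwise WS-UsernameToken at the SOAP level suffices.
bool useHttpDigest;
var probe = CreateClient(HttpClientCredentialType.None); // hypothetical helper
try
{
    probe.GetSystemDateAndTime();
    useHttpDigest = false; // SOAP-level WS-UsernameToken is enough
}
catch (MessageSecurityException) // WCF typically surfaces the HTTP 401 this way
{
    useHttpDigest = true;
}
var client = CreateClient(useHttpDigest
    ? HttpClientCredentialType.Digest
    : HttpClientCredentialType.None);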

How do I use the cookie container with RestSharp and ASP.NET sessions?

I'd like to be able to call an authentication action on a controller and if it succeeds, store the authenticated user details in the session.
However, I'm not sure how to keep the requests inside the session, as I'm using RestSharp as a detached client. I need to somehow get a key back from the server on successful authorisation and then, for each future call, check that key against the one stored in the session.
How do I ensure the RestClient in RestSharp sends all future requests with the cookie set correctly so inside service calls, the session variable can be retrieved correctly?
I've been looking at the cookie container with HttpFactory but there doesn't seem to be any documentation on this anywhere.
If someone is having a similar problem, please note that the above is not strictly required for a simple "store my cookies after each request" problem.
Jaffa's approach above works, but you can simply attach a CookieContainer to your RestClient and have the cookies stored automatically.
I know this is not a solution for everyone, since you might want to store dedicated cookies only. On the other hand, it makes your life easier when testing a REST client!
(I used Jaffa's variables for ease):
CookieContainer _cookieJar = new CookieContainer();
var client = new RestClient("http://<test-server>/letteron"); //test URL
client.CookieContainer = _cookieJar;
I worked this out in the end.
Basically create a cookie container, then add the session cookie from the response into the cookie container. All future requests will then contain this cookie.
var sessionCookie = response.Cookies.SingleOrDefault(x => x.Name == "ASP.NET_SessionId");
if (sessionCookie != null)
{
_cookieJar.Add(new Cookie(sessionCookie.Name, sessionCookie.Value, sessionCookie.Path, sessionCookie.Domain));
}
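For completeness, a small sketch of how the shared container is then reused (written against the older RestSharp API used in the snippets above; the /api/auth and /api/profile resources are placeholders):
using System.Net;
using RestSharp;

// The container is attached once; every request/response through this client
// reads and writes cookies automatically, including ASP.NET_SessionId.
var _cookieJar = new CookieContainer();
var client = new RestClient("http://test-server/letteron") { CookieContainer = _cookieJar };

// 1. Authenticate; the server's Set-Cookie headers end up in _cookieJar.
var loginRequest = new RestRequest("api/auth", Method.POST);
loginRequest.AddParameter("username", "user");
loginRequest.AddParameter("password", "pass");
client.Execute(loginRequest);

// 2. Subsequent calls carry the session cookie, so session state resolves server side.
var profileRequest = new RestRequest("api/profile", Method.GET);
var response = client.Execute(profileRequest);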

Grabbing Cookies in Web Browser Control - WP7

In order to log into a certain part of a website, the users of my application require their cookie. To do this I need to grab it and pass it to a URL.
Does anyone know how to grab a certain website's cookie from the browser control?
I saw this method, but it wasn't quite clear.
Thanks, TP.
As of the WP 7.1 Mango "release", if one may call it that, please see the WebBrowser Control Overview for Windows Phone. It has recently been updated a little, and it turns out that they have actually added some support for cookie retrieval from the WebBrowser. At the bottom of the page you will find a tiny link, GetCookies(WebBrowser), pointing to the description of a new class, WebBrowserExtensions, with this very handy method. Yes, this class has only that one single member. It's an extension method; I suppose no explanation is needed on that.
I have not played with this method much, but it seems that it will let you access the very same thing as the JS trick: the cookie set for the current URL. It probably will not allow you to set anything, nor to peek at cookies for other URLs. Maybe if you dig hard into the CookieContainer you receive, but I doubt it.
On the 7.0 release, I struggled quite hard to achieve "cookie transparency" for my app. Long story short, my app was doing some background HTTP requests and also had a WebBrowser to show some online content, and "it would be great" if both sources of connections emitted the same cookies to the server. And guess what, my application had to make the first request and only then let the browser navigate. With such requirements there was virtually no way to achieve consistency of the cookies; even with the current new and glorious GetCookies method, I suppose it would be quite hard. So, to the point: it was possible, but it required a hidden API that is present publicly on the phone but is hidden in the SDK. The API is the (public) class System.Net.Browser.WebRequestCreator, freely available. The quirk is: in the SDK this class has a single public static property, "IWebRequestCreate ClientHttp", with a Create method that you can use as a factory for your "raw HTTP" connections, in case you don't want to use WebClient for some reason. On the phone, and on the emulator, there is a second public static property called "IWebRequestCreate BrowserHttp", easily obtained via reflection:
PropertyInfo brwhttp = typeof(System.Net.Browser.WebRequestCreator)
                           .GetProperty("BrowserHttp");
With this property you can obtain a "special" internal instance of IWebRequestCreate that is used internally by the WebBrowser. By opening your background HTTP requests with this class, your cookies are automatically set as if they had been created/sent by the WebBrowser control. In turn, you will NOT be able to modify HTTP headers, provide HTTP user authentication, or do a few other low-level things, because all those settings are synced with the WebBrowser's data stored for the current 'system user instance', if I'm allowed to call it that on the single-user phone device, heh. The interoperation between the connections and the WebBrowser works both ways: if your HTTP connection (created with the 'hidden property') receives any settings/cookies/etc., the WebBrowser will instantly notice them and update its own cache. No cookie/session loss on either side!
If you just need to passively get cookies for your subsequent connections after some initial WebBrowser navigation, please use GetCookies or the JS way.
But if you need your code to go first and then pass the authorization on to the WebBrowser, you will probably have to dig deeper and use the above. It's been hidden, so please resort to the other means first!
..and don't ask me how I found it or how much time it took :P
Have fun with it.
//edit: I've just found out that the BrowserHttp property is the normal Silverlight way to access the browser's connection factory, see BrowserHttp. It seems that it has only been hidden in the 'mini-Silverlight' written for the WP7 platform!
The approach being described in the post you linked is to use the WebBrowser control's InvokeScript method to run some javascript. However the post appears to use a "cookies" collection which doesn't actually exist.
string cookie = myWebBrowser.InvokeScript("document.cookie") as string;
Now for the hard part: the string you get contains all pertinent cookie name/value pairs for the page, with the values still URL encoded. You will need to parse the returned string for the value you need.
See document.cookie property documentation.
Edit:
Looking at it fresh instead of relying on the post: InvokeScript invokes a named function on the window of the host browser. Hence the page being displayed in the WebBrowser would itself need to include a function like:
function getCookie() { return document.cookie; }
Then the InvokeScript would look like:-
string cookie = myWebBrowser.InvokeScript("getCookie");
As #quetzalcoatl already suggested, you can use internal instance of WebRequestCreator to share cookies between browser instances and instances of WebRequest. You don't get to access the cookies directly though, I think that's just a security measure by Microsoft.
The code below creates a WebRequest object connected to the CookieContainer of the WebBrowser instance. It then posts to a URL to log the user in and store the cookies in the container.
After it's done, all browser instances within the app instance will have required set of cookies.
var browser = new WebBrowser();

// Grab the hidden BrowserHttp factory via reflection (see above).
var brwhttp = typeof(WebRequestCreator).GetProperty("BrowserHttp");
var requestFactory = brwhttp.GetValue(browser, null) as IWebRequestCreate;

var uri = new Uri("https://www.login.com/login-handler");
var req = requestFactory.Create(uri);
req.Method = "POST";

var postParams = new Dictionary<string, string> {
    { "username", "turtlepower" },
    { "password", "ZoMgPaSSw0Rd1" }
};

req.BeginGetRequestStream(aReq => {
    var webRequest = (HttpWebRequest)aReq.AsyncState;

    // Build the url-encoded POST body
    using (var postStream = webRequest.EndGetRequestStream(aReq)) {
        var postDataBuilder = new StringBuilder();
        foreach (var pair in postParams) {
            if (postDataBuilder.Length != 0) {
                postDataBuilder.Append("&");
            }
            postDataBuilder.AppendFormat("{0}={1}", pair.Key, HttpUtility.UrlEncode(pair.Value));
        }
        var bytes = Encoding.UTF8.GetBytes(postDataBuilder.ToString());
        postStream.Write(bytes, 0, bytes.Length);
    }

    // Receive the response; any Set-Cookie headers land in the shared browser cookie store
    webRequest.BeginGetResponse(aResp => {
        var innerRequest = (HttpWebRequest)aResp.AsyncState;
        string resp;
        using (var response = (HttpWebResponse)innerRequest.EndGetResponse(aResp)) {
            using (var streamResponse = response.GetResponseStream()) {
                using (var streamReader = new System.IO.StreamReader(streamResponse)) {
                    resp = streamReader.ReadToEnd();
                }
            }
        }
    }, webRequest);
}, req);
One issue I couldn't solve, though, was the exception thrown when the server returns a 302: it seems to throw a WebException with a "Not found" description.
// Ensure this is set to true BEFORE navigating to the page
webBrowser1.IsScriptEnabled = true;
// Once the page has loaded, you can read the cookie string
string cookieString = webBrowser1.InvokeScript("eval", new string[] { "document.cookie;" }) as string;
The cookieString variable will contain the full cookie for the document. You can then parse the string.
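A minimal sketch of that parsing, assuming the usual "name=value; name2=value2" shape of document.cookie (the ParseCookieString helper and the use of Uri.UnescapeDataString for decoding are my own choices, not part of the original answer):
using System;
using System.Collections.Generic;

// Split "name=value; name2=value2" into a dictionary; values may still be
// URL encoded, so unescape them.
static Dictionary<string, string> ParseCookieString(string cookieString)
{
    var cookies = new Dictionary<string, string>();
    if (string.IsNullOrEmpty(cookieString))
        return cookies;

    foreach (var part in cookieString.Split(';'))
    {
        var separator = part.IndexOf('=');
        if (separator < 0)
            continue; // skip malformed entries
        var name = part.Substring(0, separator).Trim();
        var value = Uri.UnescapeDataString(part.Substring(separator + 1).Trim());
        cookies[name] = value;
    }
    return cookies;
}

// Usage: var sessionId = ParseCookieString(cookieString)["ASP.NET_SessionId"];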
There is a WebBrowserExtensions class which was developed exactly for this:
CookieCollection tempCookies = Microsoft.Phone.Controls.WebBrowserExtensions.GetCookies(this.BrowserControl);
