I've been battling this issue for over two weeks now and have gotten nowhere.
First, the environment: Windows 2003 R2 SP2 / SharePoint 2007 SP1 / .NET 3.5.
Basically, we make web service calls to gather data from a remote API. The API has several endpoints for REST and several for SOAP, all of them HTTPS with digest authentication. Calls to the SOAP endpoints work just fine. But when we make a call using REST, the thread hangs and then dies a horrible death when IIS decides the thread isn't responding anymore and kills it. At first we thought this was an SSL issue (and it still might be), because we don't see any issues when using the HTTP endpoints (which route to the same API, just without SSL).
Below is the code we're using to make the REST call:
private void Process(HttpContext context, String url, String restParam)
{
    ServicePointManager.ServerCertificateValidationCallback +=
        new System.Net.Security.RemoteCertificateValidationCallback(validateCertificate);
    WriteLogMessage("Start Process");
    String pattern = "{0}{1}";
    String address = String.Format(pattern, url, restParam);
    WriteLogMessage("ADDRESS is" + address);
    LSWebClient client = new LSWebClient();
    client.Timeout = 600000;
    WriteLogMessage("TIMEOUT (client.Timeout) is " + client.Timeout.ToString());
    client.Credentials = new NetworkCredential(XYZConfigurationSettings.APIUserName, XYZConfigurationSettings.APIPassword);
    try
    {
        String result = client.DownloadString(address);
        WriteLogMessage("End Process. RESULT length is " + (result != null ? result.Length : 0));
        context.Response.Write(result);
    }
    catch (Exception ex)
    {
        WriteLogMessage("EXCEPTION!!! Message----" + ex.Message + "---- StackTrace ----" + ex.StackTrace + "");
    }
}

private bool validateCertificate(object sender, X509Certificate cert, X509Chain chain, System.Net.Security.SslPolicyErrors error)
{
    WriteLogMessage("bypassAllCertificateStuff");
    return true;
}
So, crappy code aside, we put a few things in to try to get around what we thought was an SSL certificate issue (setting the request timeout to 10 minutes, using custom certificate validation, etc.). However, none of this fixed the issue.
Here's the result of our logging:
2/28/2011 3:35:28 PM: Start
2/28/2011 3:35:28 PM: Start Process
2/28/2011 3:35:28 PM: ADDRESS ishttps://<host>/ws/rs/v1/taxonomy/TA/root/
2/28/2011 3:35:28 PM: TIMEOUT (client.Timeout) is 600000
2/28/2011 3:35:50 PM: CheckValidationResult
2/28/2011 3:35:50 PM: bypassAllCertificateStuff
2/28/2011 3:41:51 PM: EXCEPTION!!! Message ----Thread was being aborted.---- StackTrace ---- at System.Net.Connection.CompleteStartConnection(Boolean async, HttpWebRequest httpWebRequest)
at System.Net.Connection.CompleteStartRequest(Boolean onSubmitThread, HttpWebRequest request, TriState needReConnect)
at System.Net.Connection.SubmitRequest(HttpWebRequest request)
at System.Net.ServicePoint.SubmitRequest(HttpWebRequest request, String connName)
at System.Net.HttpWebRequest.SubmitRequest(ServicePoint servicePoint)
at System.Net.HttpWebRequest.GetResponse()
at System.Net.WebClient.GetWebResponse(WebRequest request)
at System.Net.WebClient.DownloadBits(WebRequest request, Stream writeStream, CompletionDelegate completionDelegate, AsyncOperation asyncOp)
at System.Net.WebClient.DownloadDataInternal(Uri address, WebRequest& request)
at System.Net.WebClient.DownloadString(Uri address)
at System.Net.WebClient.DownloadString(String address)
at XYZ.DAO.Handlers.RestServiceHandler.Process(HttpContext context, String url, String restParam)
at XYZ.DAO.Handlers.RestServiceHandler.ProcessRequest(HttpContext context)----
I have attempted to use my browser to view the returned data, but the browser is IE6, which doesn't support SSL. However, I can see (in Fiddler / Charles Proxy) that it does attempt to make the request and receives a 401, but since I cannot see the server's traffic with these tools, I cannot tell at exactly which step the error is happening.
To make matters worse, I can not reproduce this issue on any other server I have (note: they are all Windows 2008 servers).
So, in summary, here's what I've found:
SOAP - works
REST - doesn't work
Win2008 - works
Win2003 - doesn't work
HTTP - works
HTTPS - doesn't work
If anyone has any insight, or any other debugging / information-gathering steps I haven't tried, I would be extremely grateful.
You should be able to get a bunch more tracing information if you add the following to your client .config file.
<system.diagnostics>
<sources>
<source name="System.Net" switchValue="Information, ActivityTracing">
<listeners>
<add name="System.Net"
type="System.Diagnostics.TextWriterTraceListener"
initializeData="System.Net.trace.log" />
</listeners>
</source>
</sources>
</system.diagnostics>
I've found what was causing the web service call to hang - the issue was that the service we were calling was using replay-attack protection along with digest security:
1. Our server sends an initial request without a security header.
2. The request is answered with a standard 401 challenge providing a nonce to use. (That nonce expires 10 seconds after the challenge.)
3. Our server then takes 30 seconds to generate a second request using this nonce.
4. The remote server finds the expired nonce and issues another 401 challenge.
The cycle continues until the local server's thread is terminated. However, why our local server takes 30 $##%! seconds to generate a security header is beyond me. I inspected the logs produced by the diagnostics above, but none of it was much help. I'm going to chalk it up to the server being overloaded and not having enough memory to process its way out of a wet paper bag.
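As a side note for anyone hitting the same nonce-expiry loop: scoping the credentials to the Digest scheme and enabling PreAuthenticate can cut down the challenge round trips. This is only a sketch, under the assumption that the endpoint honors a reused digest Authorization header; the URL and credential values below are placeholders standing in for the question's XYZConfigurationSettings:

```csharp
using System;
using System.Net;

class DigestPreAuthSketch
{
    static void Main()
    {
        // Hypothetical values standing in for XYZConfigurationSettings.
        string address = "https://host/ws/rs/v1/taxonomy/TA/root/";
        string user = "apiUser", pass = "apiPass";

        // Scope the credentials to the Digest scheme only, rather than
        // letting the stack negotiate any scheme the server offers.
        var credCache = new CredentialCache
        {
            { new Uri("https://host/ws/rs/v1/"), "Digest", new NetworkCredential(user, pass) }
        };

        var request = (HttpWebRequest)WebRequest.Create(address);
        request.Credentials = credCache;
        // After the first 401 challenge, reuse the Authorization header on
        // subsequent requests instead of renegotiating a fresh nonce each time.
        request.PreAuthenticate = true;
    }
}
```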
I have a project that has to get hundreds of pages of data from a site each day. I use a paid-for proxy with login details, I wait 5 seconds between requests so I don't hammer their site, and I pass a referer and user-agent; it is a simple GET request.
However, I tried to make a little C# console script to test various ways of adding proxies, e.g. with or without credentials, and got a working IP:Port from the web > http://www.freeproxylists.net/ to test with, as my own details didn't work in this test. I am at a loss as to why this test script isn't working when my main project is.
I am accessing an old site I own anyway, so I am not blocking my own home IP, as I can access it (or any other page or site) in a browser easily.
Without using a proxy I just get a 30-second wait (the timeout length) and then a "Timeout Error"; with a proxy (a free one OR one I own with credentials) I get NO wait at all before the "Timeout Error". So whether I use a proxy or not, it fails to return a response.
I am probably just sleep-deprived, but I would like to know what I am doing wrong. I just copied my "MakeHTTPGetRequest" method from my main project's Scraper class, removed all the case statements in the try/catch that check for Connection/Timeout/404/Service/Server errors etc., and put it into one simple Main method here...
public static void Main(string[] args)
{
    string url = "https://www.strictly-software.com"; // a site I own
    //int port = ????; // working in main project crawler
    int port = 3128; // from a list of working free proxies
    string proxyUser = "????"; // working in main project crawler
    string proxyPassword = "????"; // working in main project crawler
    string proxyIP = "167.99.230.151"; // from a list of working proxies

    ShowDebug("Make a request to: " + url + " with proxy:" + proxyIP + ":" + port.ToString());

    // use basic IP and Port proxy with no login
    WebProxy proxy = new WebProxy(proxyIP, port);
    /*
    // use default port, username and password to login
    // get same error with correct personal proxy and login but not
    // in main project
    WebProxy proxy = new WebProxy(proxyIP, port)
    {
        Credentials = new NetworkCredential(proxyUser, proxyPassword)
    };
    */
    ShowDebug("Use Proxy: " + proxy.Address.ToString());

    HttpWebRequest client = (HttpWebRequest)WebRequest.Create(url);
    client.Referer = "https://www.strictly-software.com";
    client.Method = "GET";
    client.ContentLength = 0;
    client.ContentType = "application/x-www-form-urlencoded;charset=UTF-8";
    client.Proxy = proxy;
    client.UserAgent = "Mozilla/5.0 (Windows NT 6.3; Win64; x64; rv:79.0) Gecko/20100101 Firefox/79.0";
    client.AutomaticDecompression = DecompressionMethods.Deflate | DecompressionMethods.GZip;
    client.Headers.Add("Accept-Encoding", "gzip,deflate");
    client.KeepAlive = true;
    client.Timeout = 30;

    ShowDebug("make request with " + client.UserAgent.ToString());

    try
    {
        // tried adding this to see if it would help but didn't
        //ServicePointManager.SecurityProtocol = SecurityProtocolType.Tls12;

        // get the response
        HttpWebResponse response = (HttpWebResponse)client.GetResponse();
        ShowDebug("response.ContentEncoding = " + response.ContentEncoding.ToString());
        ShowDebug("response.ContentType = " + response.ContentType.ToString());
        ShowDebug("Status Desc: " + response.StatusDescription.ToString());
        ShowDebug("HTTP Status Code: " + response.StatusCode.ToString());
        ShowDebug("Now get the full response back");

        // old method not working with £ signs
        StreamReader ResponseStream = new StreamReader(response.GetResponseStream(), Encoding.UTF8);
        string ResponseContent = ResponseStream.ReadToEnd().Trim();
        ShowDebug("content from response == " + Environment.NewLine + ResponseContent);
        ResponseStream.Close();
        response.Close();
    }
    catch (WebException ex)
    {
        ShowDebug("An error occurred");
        ShowDebug("WebException " + ex.Message.ToString());
        ShowDebug(ex.Status.ToString());
    }
    catch (Exception ex)
    {
        ShowDebug("An error occurred");
        ShowDebug("Exception " + ex.Message.ToString());
    }
    finally
    {
        ShowDebug("At the end");
    }
}
The error messages from the console (ShowDebug is just a wrapper for the time + message)...
02/08/2020 00:00:00: Make a request to: https://www.strictly-software.com with proxy:167.99.230.151:3128
02/08/2020 00:00:00: Use Proxy: http://167.99.230.151:3128/
02/08/2020 00:00:00: make request with Mozilla/5.0 (Windows NT 6.3; Win64; x64; rv:79.0) Gecko/20100101 Firefox/79.0
02/08/2020 00:00:00: An error occurred
02/08/2020 00:00:00: WebException The operation has timed out
02/08/2020 00:00:00: Timeout
02/08/2020 00:00:00: At the end
I am sure it is just something I have missed, but I know this code was copied from my main project, which is currently crawling through hundreds of pages with the same code and a proxy with my credentials, and which is returning data at this very moment.
I can ping the proxy's IP address, but going to it in a browser returns a connection error; this despite my big project using the same proxy to ripple through tons of pages and return HTML all night long...
I just wanted to update my main project by adding new methods to pass in custom proxies, or to skip the proxy on the first attempt but use one on a final attempt, or to use a default proxy:port, etc.
You set your timeout to be 30 milliseconds: client.Timeout = 30;
The Timeout property is measured in milliseconds, so a value of 30 expires almost immediately. That could be causing your timeouts.
More info here.
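A minimal sketch of the fix (the URL is just the example site from the question):

```csharp
using System.Net;

class TimeoutFix
{
    static void Main()
    {
        var client = (HttpWebRequest)WebRequest.Create("https://www.strictly-software.com");
        // Timeout is measured in milliseconds: 30 means 30 ms, which expires
        // before the connection can even be established. Use 30 * 1000 for
        // a 30-second timeout.
        client.Timeout = 30 * 1000;
    }
}
```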
Not sure if this solves your problem, but:
The documentation for HttpWebRequest states the following:
The local computer or application config file may specify that a default proxy be used. If the Proxy property is specified, then the proxy settings from the Proxy property override the local computer or application config file and the HttpWebRequest instance will use the proxy settings specified. If no proxy is specified in a config file and the Proxy property is unspecified, the HttpWebRequest class uses the proxy settings inherited from Internet Explorer on the local computer. If there are no proxy settings in Internet Explorer, the request is sent directly to the server.
Maybe there is a proxy configured in the IE settings? This does not explain why the request fails with the custom proxy, but it may be worth a shot.
I also suggest that you test the free proxy using something like Postman. In my experience, these free proxies don't work at least half of the time.
My best guess is that when not using a proxy, the request fails because of the inherited IE settings, and when using the proxy, the proxy itself is simply not working.
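To rule out inherited system/IE proxy settings entirely, you can also explicitly disable the proxy on the request. A small sketch, reusing the same request setup as the question:

```csharp
using System.Net;

class NoProxySketch
{
    static void Main()
    {
        var client = (HttpWebRequest)WebRequest.Create("https://www.strictly-software.com");
        // Setting Proxy to null sends the request directly, bypassing any
        // proxy inherited from the machine or IE configuration.
        client.Proxy = null;
    }
}
```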
Hopefully this solves the problem...
How do I send a notification using the WebPush library? I was trying, but it throws an error like {"Received unexpected response code"}.
Now I have created a web API to send the notification and am calling it through Fiddler, but I didn't get an exception; it's stuck somewhere.
Here is my code sample:
public void Sendnotification()
{
    try
    {
        WebPush.VapidDetails vapidKeys = VapidHelper.GenerateVapidKeys();
        string subject = @"mailto:xyz.com";
        string publicKey = Convert.ToString(vapidKeys.PublicKey);
        string privateKey = Convert.ToString(vapidKeys.PrivateKey);
        var subscription = new PushSubscription(pushEndpoint, p256dh, auth);
        var vapidDetails = new VapidDetails(subject, publicKey, privateKey);
        client.SendNotification(subscription, "payload", vapidDetails);
    }
    catch (WebPushException e)
    {
    }
}
I have configured HTTPS so I can call the API using Fiddler. Please have a look; it throws an error and gets stuck somewhere.
Now I got the error; please have a look. It's showing the error HTTP/1.1 410 NotRegistered.
See the full screen of the Fiddler response error details.
If you are getting error 410 (use Fiddler to intercept the HTTPS call to check), you probably have an error in the user's subscription data: the keys stored in your database don't match the subscription in the browser. An easy fix could be to re-subscribe, re-save the subscription data, and try again.
To set up Fiddler, you have to use it as a proxy for Visual Studio to intercept the HTTPS calls, and you also have to enable HTTPS decryption.
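If it helps, here's a sketch of handling the 410 in code. It assumes the web-push-csharp library, where WebPushException exposes the response status code; the RemoveSubscription helper is hypothetical and stands in for whatever deletes the stale record from your database:

```csharp
using System.Net;
using WebPush;

class PushSender
{
    // Hypothetical helper: delete the stale subscription from your store.
    static void RemoveSubscription(PushSubscription s) { /* db delete */ }

    static void Send(WebPushClient client, PushSubscription subscription,
                     string payload, VapidDetails vapid)
    {
        try
        {
            client.SendNotification(subscription, payload, vapid);
        }
        catch (WebPushException ex) when (ex.StatusCode == HttpStatusCode.Gone)
        {
            // 410 Gone / NotRegistered: the browser-side subscription no
            // longer exists; drop it and have the client re-subscribe.
            RemoveSubscription(subscription);
        }
    }
}
```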
EDIT
You can set up Fiddler just by adding this configuration to your web.config or app.config:
<system.net>
<defaultProxy
enabled = "true"
useDefaultCredentials = "true">
<proxy autoDetect="false" bypassonlocal="false" proxyaddress="http://127.0.0.1:8888" usesystemdefault="false" />
</defaultProxy>
</system.net>
If you instead get "UnauthorizedRegistration" errors, check these questions:
Web Push API Chrome, returning "Unauthorized Registration"
WebPushError and UnauthorizedRegistration when try to send push notification to Chrome and Opera, FF is OK
I have an intranet MVC.NET website. Let's call it MySite. From MySite I'm trying to make a web request to another intranet website. Let's call the other website OtherSite. Both websites are in the same domain and are running under IIS. Both websites are using:
<authentication mode="Windows" />
<authorization>
<allow verbs="OPTIONS" users="*" />
<deny users="?" />
</authorization>
MySite is accessed by an authenticated user (same domain) with a web browser (Chrome, IE). Let's call that user Client. The credentials from Client should be used when MySite calls OtherSite.
I have tried the following:
With WebRequest:
var request = WebRequest.CreateHttp(uri);
request.Credentials = CredentialCache.DefaultCredentials;
request.ImpersonationLevel = System.Security.Principal.TokenImpersonationLevel.Impersonation;
var response = request.GetResponse();
return response;
With WebClient, as suggested here:
using (var client = new WebClient { UseDefaultCredentials = true })
{
client.Headers.Add(HttpRequestHeader.ContentType, "application/json; charset=utf-8");
var data = client.DownloadData(uri);
return data;
}
Both with and without this code around it:
var wi = (System.Security.Principal.WindowsIdentity)HttpContext.Current.User.Identity;
var wic = wi.Impersonate();
try
{
// Code for making request goes here...
}
catch (Exception exc)
{
// handle exception
}
finally
{
wic.Undo();
}
I've tried with and without <identity impersonate="true" /> in the web.config for MySite.
As soon as I try to impersonate the Client, I get a 401 from OtherSite. And when I check the IIS logs for OtherSite, it looks like the credentials aren't passed with the request at all.
If I don't impersonate the user, it all works great. But as soon as I try to impersonate, it fails and returns a 401. Do I have to do anything in Active Directory? I've seen this question, where the answer was delegation. Is that the issue? What can be the reason for the 401 I get when I impersonate?
The IIS-logs on OtherSite looks like this when I impersonate:
2016-10-19 07:33:26 2a01:9080:700:0:3fe7:b92a:552:1246 GET /odata/$metadata - 80 - 2a01:9080:700:0:8d90:4bc0:2ffd:d088 - - 401 0 0 0
2016-10-19 07:33:26 2a01:9080:700:0:3fe7:b92a:552:1246 GET /odata/$metadata - 80 - 2a01:9080:700:0:8d90:4bc0:2ffd:d088 - - 401 1 2148074252 0
They look like this when I don't impersonate:
2016-10-19 07:57:11 2a01:9080:700:0:3fe7:b92a:552:1246 GET /odata/$metadata - 80 MyDomain\SVC_ServiceAccount1 2a01:9080:700:0:8d90:4bc0:2ffd:d088 - - 200 0 0 0
2016-10-19 07:57:11 2a01:9080:700:0:3fe7:b92a:552:1246 GET /odata/$metadata - 80 MyDomain\SVC_ServiceAccount1 2a01:9080:700:0:8d90:4bc0:2ffd:d088 - - 200 0 0 0
I have a service account for the app pool, named MyDomain\SVC_ServiceAccount1 in the logs above. Real name is something else...
If your MVC site is running in IIS, what application pool is that running in?
You may need to set that application pool to run under a specific identity, otherwise it'll attempt to access the remote resource using a machine identity.
EDIT (in response to comments)
My next thought was that IIS was also configured for anonymous access and that this was what was being passed. In the IIS management console when you look at authentication for that site what does it say? The settings you're talking about here are about authorisation not authentication.
If you try disabling anonymous authentication in IIS (I'd do an iisreset after to be sure there are no lingering worker processes hanging around) and leaving only Windows authentication there and enabled what happens?
This is how I perform the setup in my code (note I'm using HttpClient, though):
protected HttpClient SetupHttpClient(string uri)
{
    HttpClient client = new HttpClient { BaseAddress = new Uri(uri) };
    client.DefaultRequestHeaders.Accept.Clear();
    client.DefaultRequestHeaders.Accept.Add(new MediaTypeWithQualityHeaderValue("application/json"));
    return client;
}
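For the scenario in the question, note that HttpClient only forwards Windows credentials if its handler is told to. A sketch of that variant, assuming the default-credentials route rather than explicit impersonation:

```csharp
using System;
using System.Net.Http;

class ClientFactory
{
    static HttpClient Create(string uri)
    {
        // UseDefaultCredentials makes the handler authenticate as the
        // identity the request runs under: the impersonated user if the
        // thread is impersonating, otherwise the app-pool account.
        var handler = new HttpClientHandler { UseDefaultCredentials = true };
        return new HttpClient(handler) { BaseAddress = new Uri(uri) };
    }
}
```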
So I'm using Grapevine.RESTClient to manage the client side of my REST interface. I'm using it to communicate between a service running as LocalSystem and a process run by the user on the same machine.
My problem is that when the service is not running, my client gets an exception with a message of 'Error: Value cannot be null. Parameter name: cookies'.
I'm trying to create some logic on the client that understands and accepts that the service is sometimes unavailable, such as when the service is auto-updating.
Or maybe I should just accept that the message of the exception is a little odd?
RESTClient client = new RESTClient(baseUrl);

RESTRequest request = new RESTRequest(resource);
request.Method = Grapevine.HttpMethod.GET;
request.ContentType = Grapevine.ContentType.JSON;
request.Timeout = 30000;

RESTResponse response = client.Execute(request);
The above throws a System.ArgumentNullException with e.Message = "Value cannot be null.\r\nParameter name: cookies"
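Until the library guards against this itself, one client-side workaround is to catch that specific exception and treat it as "service unavailable". A self-contained sketch of the pattern; the Execute stand-in simulates Grapevine's behavior, and in real code you would call client.Execute(request) instead:

```csharp
using System;

class ServiceUnavailableSketch
{
    // Stand-in for client.Execute(request): when the service endpoint is
    // down, Grapevine ends up throwing ArgumentNullException("cookies").
    static void Execute()
    {
        throw new ArgumentNullException("cookies");
    }

    static bool TryExecute()
    {
        try
        {
            Execute();
            return true;
        }
        catch (ArgumentNullException ex) when (ex.ParamName == "cookies")
        {
            // Treat as "service unavailable" (e.g. auto-updating); retry later.
            return false;
        }
    }

    static void Main()
    {
        Console.WriteLine(TryExecute() ? "ok" : "service unavailable");
    }
}
```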
Hmmm... Looking at the Grapevine code on GitHub, it seems the code tries to add a cookie collection to this.Cookies even when the response object was created from e.Response in the catch block of the GetResponse call, in which case it may or may not have a cookie collection. There should be a null check around the this.Cookies.Add(response.Cookies), right?
https://github.com/scottoffen/Grapevine/blob/master/Grapevine/Client/RESTClient.cs
Unable to create a grapevine tag as the developer of Grapevine suggested. Don't have enough points.
I've had the same problem. Unfortunately the error message is misleading. For me, the fix was to add a default proxy to the App.config file.
<?xml version="1.0" encoding="utf-8"?>
<configuration>
<system.net>
<defaultProxy useDefaultCredentials="true" />
</system.net>
</configuration>
What I am trying to accomplish is adding a GET method to my WCF REST-based service and accessing it via the WebRequest class from a Silverlight 3 client application.
I am getting the error The remote server returned an error: NotFound., which, as I understand it, can be just a generic error for any 500 error encountered on the server.
WCF operation contract:
[OperationContract, WebGet(UriTemplate = "path/{id}")]
Stream Get(string id);
Operation implementation:
public Stream Get(string id)
{
    WebOperationContext.Current.OutgoingResponse.ContentType = "application/xml; charset=utf-8";
    return new MemoryStream(Encoding.UTF8.GetBytes("<xml><id>1</id><name>Some Name</name></xml>"));
}
Client code that throws exception:
HttpWebRequest webRequest = WebRequest.CreateHttp("http://domain.com/my-service.svc/path/1");
webRequest.BeginGetResponse(
    x =>
    {
        try
        {
            // exception is thrown on the EndGetResponse line below
            using (WebResponse webResponse = webRequest.EndGetResponse(x))
            using (Stream stream = webResponse.GetResponseStream())
            {
                //do stuff here...eventually.
            }
        }
        catch (Exception ex)
        {
        }
    },
    null);
I suspect that it has something to do with the return type and have also tried returning XmlElement to no avail. I am really stumped here, any ideas what I might be doing wrong?
Note that I can successfully hit the method via Fiddler and a web browser.
Try putting the code below into your web.config file (change the file name in the initializeData attribute appropriately).
If you are using full IIS, and not Cassini or IIS Express (I use the latter), make sure to put the log file somewhere your web application has write permission. This will cause WCF to generate a fairly detailed log file. I've found the log to be pretty handy.
<system.diagnostics>
<sources>
<source name="System.ServiceModel"
switchValue="Information, ActivityTracing"
propagateActivity="true">
<listeners>
<add name="traceListener"
type="System.Diagnostics.XmlWriterTraceListener"
initializeData= "c:\temp\WEBTraces.log" />
</listeners>
</source>
</sources>
</system.diagnostics>
Here's another thing to check: is domain.com exactly the same domain your Silverlight app is running from? (For example, is your SL app starting as localhost/xx while your web service call goes to domain.com?)
For security reasons, Silverlight will not make cross-domain web service calls unless the called domain grants it permission (same as Flash). If this is the case, you will need a clientaccesspolicy.xml file.
You can read about it here: http://weblogs.asp.net/jgalloway/archive/2008/12/12/silverlight-crossdomain-access-workarounds.aspx
There is a video here: http://www.silverlight.net/learn/data-networking/introduction-to-data-and-networking/how-to-use-cross-domain-policy-files-with-silverlight
There are some helpers here: http://timheuer.com/blog/archive/2008/04/06/silverlight-cross-domain-policy-file-snippet-intellisense.aspx
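For reference, a permissive clientaccesspolicy.xml looks like the sketch below. It allows calls from any domain to any path, which you would normally tighten to just the origins you trust; place the file at the root of the site hosting the service:

```xml
<?xml version="1.0" encoding="utf-8"?>
<access-policy>
  <cross-domain-access>
    <policy>
      <allow-from http-request-headers="*">
        <domain uri="*"/>
      </allow-from>
      <grant-to>
        <resource path="/" include-subpaths="true"/>
      </grant-to>
    </policy>
  </cross-domain-access>
</access-policy>
```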
NotFound should mean 404 and not 500. A 404 error could be produced by a wrong URI.
Uri resturi = new Uri(String.Format("http://{0}:8080/MyService/", hostname)); // http
WebHttpBinding rest = new WebHttpBinding(WebHttpSecurityMode.TransportCredentialOnly); // WebHttpSecurityMode.Transport for ssl
host.AddServiceEndpoint(typeof(IMyService), rest, resturi);
In the code example above, your service would be available via http://host:8080/MyService/path/1