I built a set of APIs for one of our developers to consume in our web application. I did this in a .NET 4.0 class library project and wrote integration tests to ensure the API integrated with the backend service correctly. In the integration tests (a unit test project), as well as in a console application, the APIs work correctly and return all the expected results. However, when we execute the same APIs from an ASP.NET web page running under IIS, the API fails at the following line of code:
using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
The failure is a WebException with a status of SendFailure and a socket error of ConnectionReset (10054) in the inner exception. The error message is "The underlying connection was closed: An unexpected error occurred on a send." This is over HTTPS as well (hence the X509 certificate).
I already know that the failure actually occurs when the request is sent, but I'm trying to pinpoint what is different about an IIS environment that would prevent the stream from being able to write bytes over the network. I know that this is actually the web service's server closing the connection before we get a chance to send our data, but I want to stress, again, that this same API works fine from an integration test, a unit test, or a console application all day long.
I have already exhausted every related article and post I could find on the internet, including extensive MSDN documentation, checking things related to the certificate and modifying HTTP headers and service point properties. I'm truly at a loss because the code is not complicated; I've written web request code too many times to count, but here it is:
private string ExecuteServiceMessages(Uri serviceUrl, X509Certificate clientCertificate, string requestBody)
{
    HttpWebRequest request = (HttpWebRequest)HttpWebRequest.Create(serviceUrl);
    request.ClientCertificates.Add(clientCertificate);
    request.Date = DateTime.Now;
    request.Method = WebRequestMethods.Http.Post;
    request.ContentType = MediaTypeNames.Text.Xml;
    request.UserAgent = "******";
    request.KeepAlive = false;

    using (Stream requestStream = request.GetRequestStream())
    using (StreamWriter writer = new StreamWriter(requestStream))
    {
        writer.Write(requestBody);
        writer.Close();
    }

    using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
    using (Stream responseStream = response.GetResponseStream())
    using (StreamReader reader = new StreamReader(responseStream))
    {
        string data = reader.ReadToEnd();
        reader.Close();
        return data;
    }
}
The certificate is loaded via X509Certificate.CreateFromCertFile; our test certificate just sits in a directory of the website. We're not pulling it directly from the certificate store (unless we don't fully understand how certificates work when loaded from a file).
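For reference, the loading looks roughly like this; the folder and file name below are placeholders, not our actual values:
// Illustrative only: the test certificate sits in a folder under the web site.
// The path here is a placeholder.
string certPath = @"C:\inetpub\wwwroot\MyApp\certs\client.cer";
X509Certificate clientCertificate = X509Certificate.CreateFromCertFile(certPath);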
Is there a delay of any kind before you get the error (e.g. something that might indicate a timeout is occurring)?
A lot happens when the framework tries to validate that certificate. One of the things it may do (depending on the X509 policy you're using) is check the Certificate Revocation List, which requires a connection to the internet. This has bitten us a couple of times when we tried to run the code on a server in an environment with limited internet access: it spins for exactly 60 seconds and then returns the error you are seeing (not very helpful). If this is it, you can change your CRL policy to offline, or edit your hosts file and override DNS so that the CRL check hits the loopback address; it will still fail, but at least you won't sit through the timeout.
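If you want to rule this out quickly, and your configuration has revocation checking turned on, one option (a diagnostic sketch, not a recommended production setting) is to switch it off before any requests are made:
// Diagnostic sketch only: skip the CRL lookup for subsequent HTTPS requests in this process.
// Re-enable revocation checking once the connectivity problem is understood.
System.Net.ServicePointManager.CheckCertificateRevocationList = false;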
You may be connecting to the internet through a proxy; check your IE LAN settings.
From C#, you need to add the proxy settings:
var request = (HttpWebRequest)WebRequest.CreateHttp(url);

// Route the request through the proxy; the address, port, and credentials here are placeholders.
WebProxy proxy = new WebProxy("http://127.0.0.1:8888", true);
proxy.Credentials = new NetworkCredential("ID", "pwd", "Domain");
request.Proxy = proxy;

request.Credentials = System.Net.CredentialCache.DefaultNetworkCredentials;
request.Timeout = 1000 * 60 * 5; // 5 minutes
request.Method = method;
return request.GetResponse();
I am creating a small web API in C#/.NET which at some point has to request some data from an external API.
When I test locally, on my machine, it works perfectly.
But when it is deployed to remote web hosting (I am using the Ionos/1&1 Windows web hosting solution) and I test my app, I get this error message from my own API:
A connection attempt failed because the connected party did not properly respond after a period of time, or established connection failed because connected host has failed to respond {ip}
Here is the code calling the API and getting the response:
Uri myUri = new Uri("url", UriKind.Absolute);
HttpWebRequest request = (HttpWebRequest)WebRequest.Create(myUri);
using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
{
    using (Stream stream = response.GetResponseStream())
    using (StreamReader reader = new StreamReader(stream))
    {
        string result = reader.ReadToEnd();
        return JsonConvert.DeserializeObject<ModelOfResult>(result);
    }
}
Is it a proxy issue? A code issue? A hosting issue?
Sorry for my poor English, and thank you in advance.
It seems to be a proxy issue. Please try adding this to your web.config:
<system.net>
  <defaultProxy enabled="false">
  </defaultProxy>
</system.net>
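If it turns out the host requires you to go through an outbound proxy rather than bypassing one, the same config section can point at it explicitly; the address and port below are placeholders:
<system.net>
  <defaultProxy enabled="true">
    <proxy usesystemdefault="false" proxyaddress="http://proxyhost:8080" bypassonlocal="true" />
  </defaultProxy>
</system.net>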
Thanks to the support from Ionos, I managed to get a proxy to make it work, by adding this code:
request.Proxy = new WebProxy({proxy}, {proxy_port});
If you have this issue, please try TheGunners' solution, or ask your provider to give you the proxy details.
Still a shame I have to ask for it, but hey, it is working now.
I'm trying to write C# code to scrape HTML from a secure website (e.g. https://www.example.com).
I'm using Visual Studio 2017 with .Net Framework 4.5.2.
This is my code:
HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url);
using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
using (Stream stream = response.GetResponseStream())
using (StreamReader reader = new StreamReader(stream))
{
    string html = reader.ReadToEnd();
}
When the above code is run, the following error is raised at line #2, GetResponse():
The request was aborted: Could not create SSL/TLS secure channel
When I use a regular HTTP URL, the above code works fine. It only breaks on HTTPS.
PS:
This website is external, meaning outside of our network and works fine when I view it via any browser. I assume it's got all necessary legitimate and valid certificates.
And by the way, my code works fine when I run it at home on a separate computer that's not connected to my company's network.
Please help!
This could be a TLS 1.2 problem. You have to tell C# to use TLS 1.2:
System.Net.ServicePointManager.SecurityProtocol = SecurityProtocolType.Tls12;
Add this code before your requests.
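If you also need to reach older endpoints that don't speak TLS 1.2 yet, note that SecurityProtocolType is a flags enum, so you can allow several protocols at once and let the runtime negotiate (a sketch; restrict the set to whatever your environment actually permits):
// Sketch: allow TLS 1.0, 1.1 and 1.2 for outgoing connections in this process.
System.Net.ServicePointManager.SecurityProtocol =
    SecurityProtocolType.Tls12 | SecurityProtocolType.Tls11 | SecurityProtocolType.Tls;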
I found an answer to my question. All that needs to be done is to add the following line of code before creating a request:
ServicePointManager.SecurityProtocol = SecurityProtocolType.Tls12;
HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url);
... and the rest is the same as I had originally.
Now my app works great!
Thank you!
I have an application which uses HTTP to obtain data from my server, like this:
HttpWebRequest req = (HttpWebRequest)WebRequest.Create(requestString);
req.Timeout = 200 * 1000;
req.Headers.Add(String.Format("deleteme: {0}", content));
HttpWebResponse resp = (HttpWebResponse)req.GetResponse();
Stream resStream = resp.GetResponseStream();
StreamReader read = new StreamReader(resStream);
html = read.ReadToEnd();
Everything works fine, but how can I hide my requests from Fiddler (a tool similar to Wireshark)? I want to prevent users from seeing them.
Fiddler works by registering itself as the HTTP proxy in Windows.
You can stop your application from using the default proxy by setting a specific proxy (in the code below, an empty "no proxy" WebProxy) anywhere in your application before making web requests:
WebRequest.DefaultWebProxy = new WebProxy();
Note that this will also prevent your application from using a configured proxy when one is set up for legitimate reasons.
This will hide the requests from Fiddler or any other tool that traces web requests by registering itself as an HTTP proxy. It will not prevent tracing the requests with tools that operate at a different level in the stack (like Wireshark).
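If you only want to bypass the proxy for particular calls rather than the whole application, the same trick works per request (a sketch based on the question's snippet):
// Sketch: opt a single request out of any configured proxy instead of changing the global default.
HttpWebRequest req = (HttpWebRequest)WebRequest.Create(requestString);
req.Proxy = new WebProxy(); // an empty WebProxy means "do not use a proxy" for this request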
Security by obscurity does not really work. If you want to make it impossible to read the data transferred, use actual encryption.
I am trying to add functionality to my C# app, to test a connection to an OData service which is secured with only Windows Authentication. The following block of code is what I am using to perform this test:
HttpWebRequest request = (HttpWebRequest)WebRequest.Create(new Uri(SERVICE_NAME));
CredentialCache myCache = new CredentialCache();
myCache.Add(new Uri(SERVICE_NAME), "Negotiate", new NetworkCredential(user, password));
resolver.Credentials = myCache;
// Do a simple request
request.Credentials = myCache;
request.PreAuthenticate = true;
request.KeepAlive = true;
request.AllowAutoRedirect = true;
request.CookieContainer = new CookieContainer();
object response = request.GetResponse(); // This is where the exception is thrown
When I run the above code, I receive the 401 - Unauthorized error as previously stated. However, when I have Fiddler2 running, the code works fine. So I am using Wireshark instead. In addition, the service works perfect within my browser (Chrome), and if I use Wireshark to compare the HTTP requests/responses for the Authentication, I see that they are nearly identical, except that in Chrome I have: Accept, User-Agent, Accept-Encoding, and Accept-Language headers, while my C# app does not have these. The only other difference is that my C# app sets the "Negotiate Seal" flag in the NTLM header, while Chrome does not set this flag.
Despite these differences, the authentication phase seems to work fine in the C# app, up until the service returns a 302 - Redirection, at which point the app tries a GET on the newly redirected URI, which returns a 401 again (when Chrome does the analogous GET, it receives HTTP 200 - OK, and proceeds on its merry way).
So, any ideas what could cause this? Problem with the service? or my code?
Thanks a lot!
-Erik
OK, two whole days of research and I found the answer. Line 3 in the code above was using the full URI of the service (".../Northwind/Northwind.svc"); when the request was redirected, the credentials no longer applied to the new URI. The solution was to pass in only the beginning part ("...") of the URI. Stupid mistake on my part.
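In code, the fix amounts to registering the credentials against the URI prefix instead of the full service URL, so the same entry still matches after the redirect; the host below is illustrative:
// Illustrative: CredentialCache matches by URI prefix, so credentials registered for the
// prefix also apply to the URI the service redirects to.
CredentialCache myCache = new CredentialCache();
myCache.Add(new Uri("https://server/"), "Negotiate", new NetworkCredential(user, password));
request.Credentials = myCache;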
I've got a Browser Helper Object written in C# that makes an HttpWebRequest POST to my server when the user clicks a button on a Windows form, and in the normal case this works great. However, every so often a BHO seems to go crazy and continually sends me a huge number of HttpWebRequests (around 50 to 100 per minute). The only way I've been able to stop the behavior is to have the client reboot their PC; often the users have even closed IE, but the POSTs keep rolling in.
Has anyone ever seen similar behavior when using HttpWebRequest? It almost seems like some retry logic in the connection is going crazy, but I didn't think HttpWebRequest had any retry mechanism built in, so that seems very unlikely.
Am I setting up my connection incorrectly, and is there a good strategy for preventing something like this?
Here is how I'm setting up my connection:
ServicePointManager.SecurityProtocol = SecurityProtocolType.Ssl3;
HttpWebRequest webRequest = (HttpWebRequest)HttpWebRequest.Create(myUrl);
webRequest.Timeout = 30000;
webRequest.Method = "POST";
webRequest.KeepAlive = false;
System.Net.ServicePointManager.Expect100Continue = false;
webRequest.ContentType = "text/xml";
webRequest.ContentLength = myData.Length;
using (StreamWriter requestWriter = new StreamWriter(webRequest.GetRequestStream()))
{
    requestWriter.Write(this.myData);
    requestWriter.Close();
}
Can you have the client get a Wireshark trace from their PC when this happens?
Could there be some other object that is hosting the IE control, which in turn causes your BHO to be hosted and start triggering the requests?
Unfortunately, unless you can get a repro on your machine, it will just be a shot in the dark.