I'm attempting to use FogBugz's BugzScout to automatically submit unhandled application exceptions to my FogBugz On Demand account. I've written a wrapper class for it and everything appears to be just groovy - on my box. Testing the same code in the production environment, behind a proxy that requires authentication, I have had nothing but issues.
I went to work modifying the BugzScout code to get it to authenticate with the proxy, and after trying many different methods suggested via a Google search, I found one that works! But now I'm getting a "Connection actively refused" error from FogBugz itself, and I don't know what to do.
Here is the code where BugzScout connects via a .NET WebClient to submit a new case, with my modifications to deal with our proxy. What am I doing that would cause FogBugz to refuse my request? I've removed all non-WebClient-related code from the method for ease of reading.
public string Submit()
{
    WebClient client = new WebClient();
    WebProxy proxy = new WebProxy();
    proxy.UseDefaultCredentials = true;
    client.Proxy = proxy;

    Byte[] response = client.DownloadData(fogBugzUrl);
    string responseText = System.Text.Encoding.UTF8.GetString(response);
    return (responseText == "") ? this.defaultMsg : responseText;
}
The url is correct and the case is filled in properly- this has been verified.
EDIT: Additional info.
Using FogBugz On Demand.
Using the FogBugz.net code in its entirety, with only these additions:
WebProxy proxy = new WebProxy();
proxy.UseDefaultCredentials = true;
client.Proxy = proxy;
Error occurs when attempting to connect to both https://oursite.fogbugz.com/scoutsubmit.asp and http://oursite.fogbugz.com//scoutsubmit.asp (except one says port 443, and the other port 80, obviously)
I don't know anything about web authentication so I can't tell you what kind I'm using- if you tell me where to look I'd be happy to answer that for you.
Got the fix from FogBugz - this is the appropriate network code to get through the proxy authentication and not mis-authenticate with BugzScout:
// Authenticate against the proxy with the current Windows credentials
WebRequest.DefaultWebProxy.Credentials = CredentialCache.DefaultNetworkCredentials;

WebRequest request = WebRequest.Create(fogBugzUrl);
request.ContentType = "application/x-www-form-urlencoded";
request.Method = "POST";
request.Proxy.Credentials = CredentialCache.DefaultNetworkCredentials;

// 'bytes' is the URL-encoded case data built elsewhere in the wrapper
Stream requestStream = request.GetRequestStream();
requestStream.Write(bytes, 0, bytes.Length);
requestStream.Close();
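If you still want the response text back (as in the original Submit method), a sketch along these lines can follow the code above, reusing the same request object:

// Read the BugzScout reply so submission failures are not silently swallowed
using (WebResponse response = request.GetResponse())
using (StreamReader reader = new StreamReader(response.GetResponseStream()))
{
    string responseText = reader.ReadToEnd();
    return (responseText == "") ? this.defaultMsg : responseText;
}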
Is your fogbugzUrl using HTTP Basic Authentication? Is it SSL (hosted on On Demand)?
The connection actively refused message would be coming from the web server itself, not really FogBugz.
Can you post the HTTP Status Code?
One thing to note if you are using FogBugz On Demand is you HAVE to use the https:// url (not the http url).
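If you want to capture the status code when the call throws, a rough sketch (assuming the failure surfaces as a WebException from the WebClient call) looks like this:

try
{
    Byte[] response = client.DownloadData(fogBugzUrl);
}
catch (WebException ex)
{
    // If the server answered at all, the response carries the HTTP status code
    var httpResponse = ex.Response as HttpWebResponse;
    if (httpResponse != null)
    {
        Console.WriteLine((int)httpResponse.StatusCode);
    }
    else
    {
        // No response at all (e.g. connection refused) - only the failure status is available
        Console.WriteLine(ex.Status);
    }
}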
I'm having difficulty understanding how web requests and credentials work in .NET.
I have the following method that is executing a request to a SOAP endpoint.
public WebResponse Execute(NetworkCredential Credentials)
{
    HttpWebRequest webRequest = CreateWebRequest(_url, actionUrl);
    webRequest.AllowAutoRedirect = true;
    webRequest.PreAuthenticate = true;
    webRequest.Credentials = Credentials;

    // Add headers and content into the request stream, then start the request
    IAsyncResult asyncResult = webRequest.BeginGetResponse(null, null);
    asyncResult.AsyncWaitHandle.WaitOne();
    return webRequest.EndGetResponse(asyncResult);
}
It works well enough. However, users of my applications may have to execute dozens of these requests in short succession, and hundreds over the course of a day. My goal is to implement some of the recommendations I've read about, namely using an HttpClient that exists for the entire lifetime of the application, and using a CredentialCache to store users' credentials instead of passing them in to each request.
So I'm starting with the CredentialCache.
Following the example linked above, I instantiated a CredentialCache and added my network credentials to it. Note that this is the exact same NetworkCredential object that I was passing to the request earlier.
NetworkCredential credential = new NetworkCredential();
credential.UserName = Name;
credential.Password = PW;
Program.CredCache.Add(new Uri("https://blah.com/"), "Basic", credential);
Then, when I go to send my HTTP request, I get the credentials from the cache, instead of providing the credentials object directly.
public WebResponse Execute(NetworkCredential Credentials)
{
    HttpWebRequest webRequest = CreateWebRequest(_url, actionUrl);
    webRequest.AllowAutoRedirect = true;
    webRequest.PreAuthenticate = true;
    webRequest.Credentials = Program.CredCache;
    // more stuff down here
}
The request now fails with a 401 error.
I am failing to understand this on several levels. For starters, I can't seem to figure out whether or not the CredentialCache has indeed passed the proper credentials to the HTTP request.
I suspect part of the problem might be that I'm trying to use "Basic" authentication. I tried "Digest" as well just as a shot in the dark (which also failed), but I'm sure there must be a way to see what kind of authentication the server is expecting.
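(For reference, the WWW-Authenticate header on the 401 response should say which scheme the server expects; a rough sketch of checking it, using a synchronous GetResponse for brevity and assuming the failure surfaces as a WebException:)

try
{
    using (var response = webRequest.GetResponse()) { }
}
catch (WebException ex)
{
    var failed = ex.Response as HttpWebResponse;
    if (failed != null && failed.StatusCode == HttpStatusCode.Unauthorized)
    {
        // Typically something like "Basic realm=...", "Digest ...", "Negotiate" or "NTLM"
        Console.WriteLine(failed.Headers["WWW-Authenticate"]);
    }
}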
I have been combing StackOverflow and MDN trying to read up as much as possible about this, but I am having a difficult time separating the relevant information from the outdated and irrelevant information.
If anyone can help me solve the problem that would be most appreciated, but even links to proper educational resources would be helpful.
According to the documentation, the CredentialCache class is only for SMTP; it explicitly says that it is not for HTTP or FTP requests:
https://msdn.microsoft.com/en-us/library/system.net.credentialcache(v=vs.110).aspx
That directly contradicts the info in the later API docs; which one is right, I don't know.
You could try using the HttpClient class. The methods and return types are different, so you would need to tweak your other code a bit, but it would look a bit like this:
public class CommsClass
{
    private HttpClient _httpClient;

    public CommsClass(NetworkCredential credentials)
    {
        // The handler carries the credentials for every request this client sends
        var handler = new HttpClientHandler { Credentials = credentials };
        _httpClient = new HttpClient(handler);
    }

    public HttpResponseMessage Execute(HttpRequestMessage message)
    {
        var response = _httpClient.SendAsync(message).Result;
        return response;
    }
}
You can do all sorts of other things with the handler and the client, like setting request headers or a base address.
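Usage would then look something like this sketch; the endpoint URL, the soapXml payload and the SOAPAction value are placeholders, and the credentials come from wherever you build your NetworkCredential today:

// Build one client for the lifetime of the application
var client = new CommsClass(new NetworkCredential(Name, PW));

// Each call then only builds a message
var message = new HttpRequestMessage(HttpMethod.Post, "https://blah.com/soap-endpoint")
{
    Content = new StringContent(soapXml, Encoding.UTF8, "text/xml")
};
message.Headers.TryAddWithoutValidation("SOAPAction", actionUrl);

HttpResponseMessage response = client.Execute(message);
string body = response.Content.ReadAsStringAsync().Result;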
I am trying to patch a .NET web application that, after years of working, started failing to get UPS shipping quotes, which is impacting web business dramatically. After much trial and error, I found the following code, which works just fine in a console application:
static string FindUPSPlease()
{
    string post_data = "<xml data string>";
    string uri = "https://onlinetools.ups.com/ups.app/xml/Rate";

    HttpWebRequest request = (HttpWebRequest)WebRequest.Create(uri);
    request.Method = "POST";
    request.KeepAlive = false;
    request.ProtocolVersion = HttpVersion.Version10;

    byte[] postBytes = Encoding.ASCII.GetBytes(post_data);
    request.ContentType = "application/x-www-form-urlencoded";
    request.ContentLength = postBytes.Length;

    Stream requestStream = request.GetRequestStream();
    requestStream.Write(postBytes, 0, postBytes.Length);
    requestStream.Close();

    // get response and send to console
    HttpWebResponse response = (HttpWebResponse)request.GetResponse();
    Console.WriteLine(new StreamReader(response.GetResponseStream()).ReadToEnd());
    Console.WriteLine(response.StatusCode);

    return "done";
}
This runs in Visual Studio just fine and gets a nice little response from UPS that the XML is, of course, malformed.
But, if I paste this function into the web application without changing a single character, an exception is thrown on request.GetRequestStream():
Authentication failed because the remote party has closed the transport stream.
I tried it in a couple of different places in the application, with the same result.
What is there about the web application environment that would affect the request?
It turns out to be a TLS issue. I guess the console app uses a higher protocol version by default than the web application, although none was specified. So all you have to do is add the following line(s) of code sometime prior to making the request:
using System.Net;
...
System.Net.ServicePointManager.SecurityProtocol |= SecurityProtocolType.Tls11 | SecurityProtocolType.Tls12;
That was all it took, though I spent an enormous amount of time getting there.
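In a web application, one convenient place to put it (a sketch, assuming you have a Global.asax) is Application_Start, so it runs once before any UPS request is made:

// Global.asax.cs - runs once when the application starts
protected void Application_Start(object sender, EventArgs e)
{
    // Opt in to TLS 1.1/1.2 in addition to whatever the runtime enables by default
    System.Net.ServicePointManager.SecurityProtocol |=
        SecurityProtocolType.Tls11 | SecurityProtocolType.Tls12;
}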
Here is the response from UPS on the issue:
Effective January 18, 2018, UPS will only accept TLS 1.1 and TLS 1.2 security protocols... 100% of requests from customers who are on TLS 1.0 while using production URLS (onlinetools.ups.com/tool name) will be rejected.
Anyway, hope this helps someone.
Jim
Can you try setting the Credentials on your request object like the following?
request.Credentials = System.Net.CredentialCache.DefaultNetworkCredentials;
Try setting the default credentials, or check whether a proxy server is configured and pass it as in the example below.
The example is given for WebClient.
I was having a problem with setting the default credentials because a proxy was enabled on the server, so I passed the proxy URL and port along with credentials that can access it.
using (System.Net.WebClient web = new System.Net.WebClient())
{
    //IWebProxy defaultWebProxy = WebRequest.DefaultWebProxy;
    //defaultWebProxy.Credentials = CredentialCache.DefaultCredentials;
    //web.Proxy = defaultWebProxy;

    var proxyURI = new Uri(string.Format("{0}:{1}", proxyURL, proxyPort));

    //Set credentials
    System.Net.ICredentials credentials = new System.Net.NetworkCredential(proxyUserId, proxyPassword);

    //Set proxy
    web.Proxy = new System.Net.WebProxy(proxyURI, true, null, credentials);
    web.Headers.Add("Content-Type", "application/x-www-form-urlencoded");

    var result = web.UploadString(URL, "");
    return result;
}
I have the following code which connects to my PHP server and retrieves data from it. The only thing is, I need to send the username and password securely from this web request to the PHP server. Looking at the docs for the WebRequest class, there is a Credentials property as well as a PreAuthenticate property. I'm assuming these are for the network credentials (all my users are in AD).
Is it possible to secure this POST request with credentials, or is this just a bad idea? I've also found SetBasicAuthHeader; I'll read up on this and see if it might help. All traffic will be on SSL from the ASPX site to the PHP site.
// variables to store parameter values
string url = "https://myphpserver.php";

// creates the post data for the POST request
string postData = "Username=" + username + "&Password=" + password + "&UID=" + UniqueRecID;

// create the POST request
HttpWebRequest webRequest = (HttpWebRequest)WebRequest.Create(url);
webRequest.Method = "POST";
webRequest.ContentType = "application/x-www-form-urlencoded";
webRequest.ContentLength = postData.Length;

// POST the data
using (StreamWriter requestWriter2 = new StreamWriter(webRequest.GetRequestStream()))
{
    requestWriter2.Write(postData);
}

// This actually does the request and gets the response back
HttpWebResponse resp = (HttpWebResponse)webRequest.GetResponse();

string responseData = string.Empty;
using (StreamReader responseReader = new StreamReader(resp.GetResponseStream()))
{
    // dumps the HTML from the response into a string variable
    responseData = responseReader.ReadToEnd();
}
SetBasicAuthHeader is for HTTP Basic access authentication, so it won't help here, as you're handling authentication at the application level. Really, this is no more insecure than just going to the page in a browser. I see you're using SSL, so your request will be encrypted anyway and you have nothing to worry about.
If you're concerned for some other reason (although I can't think why), it sounds like you have control over the PHP end so you could just encrypt the password and add an extra POST parameter so the server knows to decrypt it.
When using HTTPS, the message body and headers are encrypted in transit, so nobody sniffing the packets can read them. I suggest you read the HTTPS Wiki article.
I have referred to several websites that had answers for this question, "The remote server returned an error: (407) Proxy Authentication Required.", but none were helpful.
I wrote some sample code to check the proxy authentication at the office. The code throws an exception.
My requirement: verify what the website returns. Outside the office the code works fine, but in the office it throws an exception because of the proxy.
When I hardcode the credentials using new NetworkCredential, it works fine.
int ResponseCode;
string responseFromServer;
string url = "http://www.msftncsi.com/ncsi.txt";

WebRequest request = WebRequest.Create(url);
request.Credentials = CredentialCache.DefaultCredentials;

using (WebResponse response = request.GetResponse())
{
    Stream dataStream = response.GetResponseStream();
    StreamReader reader = new StreamReader(dataStream);
    responseFromServer = reader.ReadToEnd();
    ResponseCode = (int)((HttpWebResponse)response).StatusCode;
    reader.Close();
}
I do not want to hardcode the credentials. I referred to the solution in http://social.msdn.microsoft.com/Forums/is/csharpgeneral/thread/c06d3032-dceb-4a1a-bb6a-778fd13a938a, but even that didn't help.
What am I missing?
I had the same issue; this did the trick for me:
request.Proxy.Credentials = CredentialCache.DefaultCredentials;
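In the context of the code in the question, that line goes right after creating the request; a minimal sketch:

string url = "http://www.msftncsi.com/ncsi.txt";
WebRequest request = WebRequest.Create(url);

// Let the default (IE/system) proxy authenticate as the logged-in Windows user
request.Proxy.Credentials = CredentialCache.DefaultCredentials;

using (WebResponse response = request.GetResponse())
using (StreamReader reader = new StreamReader(response.GetResponseStream()))
{
    string responseFromServer = reader.ReadToEnd();
}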
There are many things to check here. You can try setting the credentials explicitly:
request.Credentials = new NetworkCredential(username, password);
You might also need to specify the proxy. By default it uses your IE proxy settings, and you might not want that:
WebRequest webRequest = WebRequest.Create("http://stackoverflow.com/");
webRequest.Proxy = new WebProxy("http://proxyserver:80/",true);
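Putting both suggestions together, a sketch might look like this (the proxy address, username and password are placeholders for your environment):

WebRequest webRequest = WebRequest.Create("http://stackoverflow.com/");

// Credentials for the proxy hop and for the target site, respectively
webRequest.Proxy = new WebProxy("http://proxyserver:80/", true)
{
    Credentials = new NetworkCredential(username, password)
};
webRequest.Credentials = new NetworkCredential(username, password);

using (WebResponse response = webRequest.GetResponse())
using (StreamReader reader = new StreamReader(response.GetResponseStream()))
{
    string body = reader.ReadToEnd();
}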
I have a console application that I wrote for .NET/Windows that I suddenly had the need for on my unix system. Mono has, for the most part, been hugely successful at providing this for me.
There is however a small issue:
The application issues many HttpWebRequests as it runs, and for a small portion of these, Mono is returning an error:
Error getting response stream (Write: The authentication or decryption has failed.): SendFailure
This error message seems to indicate an SSL error. However, this application does not issue any requests to SSL-secured URIs (i.e. all URIs are http://).
The main code in question is as follows:
HttpWebRequest req = WebRequest.Create(url) as HttpWebRequest;
req.UserAgent = UserAgent;
req.AuthenticationLevel = AuthenticationLevel.None;
req.AllowAutoRedirect = true;
req.KeepAlive = true;
req.CachePolicy = new RequestCachePolicy(RequestCacheLevel.NoCacheNoStore);
req.Timeout = timeout;

if (useCompression)
{
    req.Headers.Add(HttpRequestHeader.AcceptEncoding, "gzip,deflate");
}
Edit: For the purposes of running this code, you can define dummy variables as follows:
string url = "http://example.com";
string UserAgent = "WhateverBot";
int timeout = 5000;
bool useCompression = true;
It should be noted that the code works without any problem on Windows/.NET.
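To actually exercise the request and hit the failure, you would then finish the call off roughly like this (a sketch; the GetResponse part is not in the snippet above):

// Send the request and read the body; on Mono the SendFailure surfaces here
using (HttpWebResponse resp = (HttpWebResponse)req.GetResponse())
using (var reader = new StreamReader(resp.GetResponseStream()))
{
    Console.WriteLine(resp.StatusCode);
    Console.WriteLine(reader.ReadToEnd());
}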
I also had this problem when running Mono on Windows, even when the URL is not HTTPS. Let me say this loudly: IT'S NOT REDIRECTED.
The solution has been to use the hack:
// WARNING: this accepts every certificate, effectively disabling SSL certificate validation
System.Net.ServicePointManager.ServerCertificateValidationCallback += delegate
{
    return true;
};
My guess is that Mono does not look in the Windows certificate store, so it does not find any certificates; HttpWebRequest is then not initialized correctly and fails in a misleading way.
My guess is also that anyone taking the trouble to reproduce your error will know what they are doing with Mono (unlike me, for example), and so will have their environment set up correctly and be unable to reproduce this real-world problem.