I want to consume a PHP web service from C# that is protected by .htaccess.
I added the service as a web reference in VS 2010.
mywsfromwsdl myws = new mywsfromwsdl();
System.Net.CredentialCache myCredentials = new System.Net.CredentialCache();
NetworkCredential netCred = new NetworkCredential("user", "pass");
myCredentials.Add(new Uri(myws.Url), "Basic", netCred);
myws.Credentials = myCredentials;
myws.PreAuthenticate = true;
tbxIN.Text = "<?xml version=\"1.0\" encoding=\"UTF-8\"?> "+
" <test> "+
" <parm1>4</parm1> "+
" <parm1>2</parm1> "+
" </test>";
tbxOUT.Text= myws.func1(tbxIN.Text.ToString());
VS shows a 400 Bad Request error on the last line.
If I delete the .htaccess file on the server, the program works fine, but I can't delete it because other PHP users use the service.
Can anybody tell me how to send the credentials correctly?
By jo
Sometimes C# and Apache clash a bit: in this case, it might be that your client is expecting a 100 Continue response due to authentication being active, but the server doesn't send it.
This kind of behavior is toggled by this line:
ServicePointManager.Expect100Continue = false;
Add it before executing the request.
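For illustration, a minimal sketch of where that line could sit, using the proxy and textbox variables from the question (nothing else here is new):
// Sketch only: turn off the Expect: 100-Continue handshake before the first call.
ServicePointManager.Expect100Continue = false;
// ... credential setup from the question ...
tbxOUT.Text = myws.func1(tbxIN.Text);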
It's also worth pointing out that when a 400 Bad Request happens, you might find some useful details in the server's logs.
According to MSDN you shouldn't need the credential cache:
// Set the client-side credentials using the Credentials property.
ICredentials credentials = new NetworkCredential("Joe",SecurelyStoredPassword,"mydomain");
math.Credentials = credentials;
Have you tried this method instead of the cache object? More info can be found here.
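Applied to the generated proxy from the question, that would be a short sketch like this (assuming the .htaccess protection is Basic auth; the proxy and textbox names are taken from the question's code):
// Sketch: assign a NetworkCredential directly instead of building a CredentialCache.
mywsfromwsdl myws = new mywsfromwsdl();
myws.Credentials = new NetworkCredential("user", "pass");
myws.PreAuthenticate = true; // ask the client to send the Authorization header up front
tbxOUT.Text = myws.func1(tbxIN.Text);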
I have a project that has to get hundreds of pages of data from a site each day. I use a paid proxy with login details, wait 5 seconds between requests so I don't hammer their site, pass a referer and user-agent, and it is a simple GET request.
However, I tried to make a small C# console script to test various ways of adding proxies, e.g. with or without credentials, and got a working IP:Port from http://www.freeproxylists.net/ to test with, as my own details didn't work in this test. I am at a loss as to why this test script isn't working when my main project is.
I am accessing an old site I own anyway, so I am not blocking my own home IP; I can access it (or any other page or site) easily in a browser.
Without a proxy I just get a 30-second wait (the timeout length) and then a "Timeout Error"; with a proxy (the free one OR one I own with credentials) I get no wait at all before the "Timeout Error". So whether I use a proxy or not, it fails to return a response.
I am probably just sleep-deprived, but I would like to know what I am doing wrong. I copied my "MakeHTTPGetRequest" method from my main project's Scraper class, removed all the case statements in the try/catch that check for Connection/Timeout/404/Service/Server errors etc., and put it into one simple Main method here...
public static void Main(string[] args)
{
string url = "https://www.strictly-software.com"; // a site I own
//int port = ????; // working in main project crawler
int port = 3128; // from a list of working free proxies
string proxyUser = "????"; // working in main project crawler
string proxyPassword = "????"; // working in main project crawler
string proxyIP = "167.99.230.151"; // from a list of working proxies
ShowDebug("Make a request to: " + url + " with proxy:" + proxyIP + ":" + port.ToString());
// user basic IP and Port proxy with no login
WebProxy proxy = new WebProxy(proxyIP, port);
/*
// use default port, username and password to login
// get same error with correct personal proxy and login but not
// in main project
WebProxy proxy = new WebProxy(proxyIP, port)
{
Credentials = new NetworkCredential(proxyUser, proxyPassword)
};
*/
ShowDebug("Use Proxy: " + proxy.Address.ToString());
HttpWebRequest client = (HttpWebRequest)WebRequest.Create(url);
client.Referer = "https://www.strictly-software.com";
client.Method = "GET";
client.ContentLength = 0;
client.ContentType = "application/x-www-form-urlencoded;charset=UTF-8";
client.Proxy = proxy;
client.UserAgent = "Mozilla/5.0 (Windows NT 6.3; Win64; x64; rv:79.0) Gecko/20100101 Firefox/79.0";
client.AutomaticDecompression = DecompressionMethods.Deflate | DecompressionMethods.GZip;
client.Headers.Add("Accept-Encoding", "gzip,deflate");
client.KeepAlive = true;
client.Timeout = 30;
ShowDebug("make request with " + client.UserAgent.ToString());
try
{
// tried adding this to see if it would help but didn't
//ServicePointManager.SecurityProtocol = SecurityProtocolType.Tls12;
// get the response
HttpWebResponse response = (HttpWebResponse)client.GetResponse();
ShowDebug("response.ContentEncoding = " + response.ContentEncoding.ToString());
ShowDebug("response.ContentType = " + response.ContentType.ToString());
ShowDebug("Status Desc: " + response.StatusDescription.ToString());
ShowDebug("HTTP Status Code: " + response.StatusCode.ToString());
ShowDebug("Now get the full response back");
// old method not working with £ signs
StreamReader ResponseStream = new StreamReader(response.GetResponseStream(), Encoding.UTF8);
string ResponseContent = ResponseStream.ReadToEnd().Trim();
ShowDebug("content from response == " + Environment.NewLine + ResponseContent);
ResponseStream.Close();
response.Close();
}
catch (WebException ex)
{
ShowDebug("An error occurred");
ShowDebug("WebException " + ex.Message.ToString());
ShowDebug(ex.Status.ToString());
}
catch(Exception ex)
{
ShowDebug("An error occurred");
ShowDebug("Exception " + ex.Message.ToString());
}
finally
{
ShowDebug("At the end");
}
}
The error messages from the console (ShowDebug is just a wrapper for the time + message)...
02/08/2020 00:00:00: Make a request to: https://www.strictly-software.com with proxy:167.99.230.151:3128
02/08/2020 00:00:00: Use Proxy: http://167.99.230.151:3128/
02/08/2020 00:00:00: make request with Mozilla/5.0 (Windows NT 6.3; Win64; x64; rv:79.0) Gecko/20100101 Firefox/79.0
02/08/2020 00:00:00: An error occurred
02/08/2020 00:00:00: WebException The operation has timed out
02/08/2020 00:00:00: Timeout
02/08/2020 00:00:00: At the end
I am sure it is just something I have missed, but this code was copied from my main project, which is currently crawling through hundreds of pages with the same code and a credentialed proxy that works; I am getting data back from the main project's code right now.
I can ping the proxy's IP address, but visiting it in a browser returns a connection error, despite my big project using the same proxy to work through tons of pages and return HTML all night long...
I just wanted to update my main project by adding new methods to pass in custom proxies, or to skip the proxy on the first attempt and fall back to one on a final attempt, or use a default proxy:port, etc.
You set your timeout to 30 milliseconds: client.Timeout = 30;
HttpWebRequest.Timeout is specified in milliseconds, so that is almost certainly causing your timeouts.
More info here.
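A small sketch using the client variable from the question (the 30000 value is an assumption for a 30-second limit):
// Timeout is in milliseconds; 30 means 30 ms, not 30 seconds.
client.Timeout = 30000;          // 30 seconds
client.ReadWriteTimeout = 30000; // optionally bound the response stream read time as well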
Not sure if this solves your problem, but:
The documentation for HttpWebRequest states the following:
The local computer or application config file may specify that a default proxy be used. If the Proxy property is specified, then the proxy settings from the Proxy property override the local computer or application config file and the HttpWebRequest instance will use the proxy settings specified. If no proxy is specified in a config file and the Proxy property is unspecified, the HttpWebRequest class uses the proxy settings inherited from Internet Explorer on the local computer. If there are no proxy settings in Internet Explorer, the request is sent directly to the server.
Maybe a proxy is configured in the IE settings? This does not explain why the request fails using the custom proxy, but it may be worth a shot.
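To rule that out for the no-proxy test, one option (a sketch, not a guaranteed fix) is to clear the proxy explicitly so nothing is inherited from IE or the config file; url here is the same variable as in the question:
// Sketch: force a direct connection for the no-proxy test case.
HttpWebRequest client = (HttpWebRequest)WebRequest.Create(url);
client.Proxy = null;    // do not use any default/system proxy
client.Timeout = 30000; // milliseconds
using (var response = (HttpWebResponse)client.GetResponse())
{
    Console.WriteLine(response.StatusCode);
}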
I also suggest that you test the free proxy using something like Postman. In my experience, these free proxies don't work at least half of the time.
My best guess is that when not using the proxy, the request fails because of the IE settings, and when using the proxy, the proxy itself is simply not working.
Hopefully this solves the problem...
I am trying to access a URL using .NET, but when I run my program I get the error "The remote server returned an error: (403) Forbidden." The odd thing is that if I open the link http://thisIsMyURL in a browser and enter the username and password from the code below, it works fine. I am not able to understand why this exception occurs. Please refer to the code below.
Side note: I am using this sample function to trigger a build of my project on a Jenkins server.
string url = "http://thisIsMyURL";
WebRequest webRequest = WebRequest.Create(url);
webRequest.Credentials = new NetworkCredential("admin", "pass");
WebResponse response = webRequest.GetResponse();
StreamReader reader = new StreamReader(response.GetResponseStream());
string responseText = reader.ReadToEnd();
Other than what was recommended by Bradley Uffner in the comments, you can try to provide an actual user agent during the request. I have seen certain servers that will not respond to requests without that header, for some odd perceived security reason.
EDIT: as requested, I'll update with some more information.
Some servers may choose to ignore requests, returning an error code (or closing the connection) when certain conditions are not met. A good way of checking whether that is the case is to send all the standard headers your average web browser sends with the request. The "User-Agent" is one of those headers, and this example adds it to your request:
string url = "http://thisIsMyURL";
HttpWebRequest webRequest = (HttpWebRequest)WebRequest.Create(url);
webRequest.UserAgent = "USER AGENT VALUE";
webRequest.Credentials = new NetworkCredential("admin", "pass");
WebResponse response = webRequest.GetResponse();
StreamReader reader = new StreamReader(response.GetResponseStream());
string responseText = reader.ReadToEnd();
The cast to HttpWebRequest is required so you have access to the UserAgent property. Other standard HTTP fields can be set this way.
You can check which headers are sent by your browser of choice by inspecting its requests with the browser developer tools (usually right-click on the webpage -> Inspect element -> Network).
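If the User-Agent alone doesn't help, a sketch of mimicking a few more common browser headers looks like this (the header values are only examples, not required ones):
// Sketch: set a handful of typical browser headers on the same request.
HttpWebRequest webRequest = (HttpWebRequest)WebRequest.Create(url);
webRequest.UserAgent = "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"; // example value
webRequest.Accept = "text/html,application/xhtml+xml,*/*";
webRequest.Headers.Add("Accept-Language", "en-US,en;q=0.9");
webRequest.Credentials = new NetworkCredential("admin", "pass");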
I figured it out with the help of some articles. Basic authentication needs to be added to the request. Below is a sample.
var userName = "admin";
var password = "password";
var encodedAuthentication = Convert.ToBase64String(Encoding.GetEncoding("ISO-8859-1").GetBytes(userName + ":" + password));
webRequest.Headers.Add("Authorization", "Basic " + encodedAuthentication);
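Put together with the request from the question, a complete sketch might look like this (the URL and credentials are placeholders, and System.Net, System.Text and System.IO are assumed to be imported):
// Sketch: pre-emptive Basic authentication against the Jenkins endpoint.
string url = "http://thisIsMyURL";
var userName = "admin";
var password = "password";
HttpWebRequest webRequest = (HttpWebRequest)WebRequest.Create(url);
// Build the Authorization header by hand so it is sent on the very first request.
var encodedAuthentication = Convert.ToBase64String(
    Encoding.GetEncoding("ISO-8859-1").GetBytes(userName + ":" + password));
webRequest.Headers.Add("Authorization", "Basic " + encodedAuthentication);
using (var response = webRequest.GetResponse())
using (var reader = new StreamReader(response.GetResponseStream()))
{
    string responseText = reader.ReadToEnd();
}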
I am trying to connect to a corporate FTP server via code (C#/VB). It has been set up to run in a browser using SolarWinds Serv-U software so that users can access it via a browser. The address is in the form:
https://ftp.example.com
From here they are presented with a login form (part of Serv-U) in which they enter their username/password and log in.
I have been trying to use the HttpWebRequest class to log in, but each time I get a '401 Unauthorized - not logged in' error. In the web request I set the credentials:
Dim loginUri = New Uri("https://ftp.example.com")
Dim loginRequest As HttpWebRequest = DirectCast(WebRequest.Create(loginUri), HttpWebRequest)
With loginRequest
.Accept = "*/*"
.ContentType = "application/x-www-form-urlencoded"
.CookieContainer = New CookieContainer()
.Credentials = New NetworkCredential("user", "pass")
.Method = WebRequestMethods.Http.Get
End With
Dim loginResponse As HttpWebResponse = loginRequest.GetResponse()
I'm not even sure if this approach is possible; there are quite a number of cookies set by the browser during the login process, which is not something I'd want to replicate in code.
I've done a fair bit of searching on the subject and haven't found any definitive answers. Should I just push back on the sysadmin to set up a proper FTP server over SSL? It is a requirement that we use port 443, as many firewalls block 21 (and 22).
Thanks - Z
I need to consume the Pingdom API. The address is https://api.pingdom.com
How can I do an HTTP GET in .NET when the endpoint is HTTPS? Google gives me nothing to work with :(
Best regards
UPDATE::
Thanks for the help. Trying with PowerShell:
$NC = New-Object System.Net.NetworkCredential("USER", "PASS")
$CC = New-Object System.Net.CredentialCache
$CC.Add("api.pingdom.com", 443, "Basic", $NC)
$webclient = [System.Net.WebRequest]::Create("https://api.pingdom.com")
$webclient.Credentials = $CC
$webclient.PreAuthenticate = $true
$webclient.Method = "POST"
$webclient.GetResponse()
I get the error: Exception calling "GetResponse" with "0" argument(s): "The remote server returned an error: (401) Unauthorized."
Any good advice?
Basically,
http://www.pingdom.com/services/api-documentation-rest/
The authentication method for user credentials is HTTP Basic Access Authentication (encrypted over HTTPS). This means you will provide your credentials every time you make a request. No sessions are used.
HTTP Basic Access Authentication is well documented both here and on MSDN.
This answer together with the API docs should get you started down the right path.
https://stackoverflow.com/a/1127295/64976
Assuming you use a WebRequest, you attach a CredentialCache to your request:
NetworkCredential nc = new NetworkCredential("user", "password");
CredentialCache cc = new CredentialCache();
cc.Add("www.site.com", 443, "Basic", nc);
The CredentialCache is what allows you to specify Basic authentication explicitly.
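A short sketch of attaching that cache to the actual request (the host and URL are placeholders; the Uri-based Add overload is used here because that is the one HttpWebRequest looks up credentials with):
// Sketch: wire the CredentialCache onto an HttpWebRequest for an HTTPS GET.
NetworkCredential nc = new NetworkCredential("user", "password");
CredentialCache cc = new CredentialCache();
cc.Add(new Uri("https://www.site.com/"), "Basic", nc);
HttpWebRequest request = (HttpWebRequest)WebRequest.Create("https://www.site.com/");
request.Credentials = cc;
request.PreAuthenticate = true; // re-send credentials on later requests without waiting for a 401
using (var response = (HttpWebResponse)request.GetResponse())
using (var reader = new StreamReader(response.GetResponseStream()))
{
    string body = reader.ReadToEnd();
}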
You should be able to set the credentials on the WebClient; then, any time a login is needed, it will supply what you gave it.
WebClient webClient = new WebClient();
webClient.Credentials = new System.Net.NetworkCredential("UserName", "Password", "Domain");
Edit: Using:
byte[] authBytes = System.Text.Encoding.UTF8.GetBytes(user + ":" + password);
wr.Headers["Authorization"] = "Basic " + Convert.ToBase64String(authBytes);
Seems to work fine.
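For completeness, a sketch of that header approach on a WebClient (USER, PASS and the URL are placeholders):
// Sketch: WebClient with a pre-emptive Basic Authorization header.
using (var webClient = new WebClient())
{
    byte[] authBytes = Encoding.UTF8.GetBytes("USER" + ":" + "PASS");
    webClient.Headers["Authorization"] = "Basic " + Convert.ToBase64String(authBytes);
    string result = webClient.DownloadString("https://api.pingdom.com");
}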
I have an application that communicates with a CMS. I'm currently trying to get this client to upload text/xml data to the CMS using a POST method.
I can pass this through using curl perfectly fine:
curl -u user:password -H "Content-Type:text/xml" -d "<element>myXML</element>" serverURL
However, when trying to use HttpWebRequest in C#, I can't get the server to return what I want. So I fired up Wireshark and had a look at what was actually being sent, and it's pretty much identical except that when using curl I can see:
Authorization: Basic <a bunch of hex>=\r\n
Credentials: user:password
in the HTTP header fields, while in the output from my client these header fields are simply not present. ("Credentials:" isn't actually there in plain text; it's a subtree of "Authorization:", so I'm not sure where Wireshark gets it from, but the username and password are correct.)
The C# code I'm trying to use to set the credentials for the webrequest is something like this:
NetworkCredential myCred = new NetworkCredential(
user, password, serverURL);
CredentialCache myCache = new CredentialCache();
myCache.Add(new Uri(serverURL), "Basic", myCred);
HttpWebRequest wr = (HttpWebRequest) HttpWebRequest.Create(serverURL);
wr.Credentials = myCache;
I've tried just setting the credentials like this too (and without specifying serverURL):
wr.Credentials = new NetworkCredential(user,password,serverURL);
But that still doesn't make it show up in Wireshark. Does anyone have any idea:
A) whether that authorization information should actually be in the HTTP header for this to work, and
B) if it is, how do I make C# put it in? I only seem to be able to find decent examples using the default credentials, which doesn't apply to what I'm doing.
Thanks in advance.
.NET's WebRequest has an infuriating default behavior where it only sends credentials after receiving an HTTP 401 Unauthorized response.
Manually adding the Authorization header yourself (as you've done) seems to be the best solution available.
More details in this post
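A sketch of the curl-equivalent POST with the header added by hand (serverURL, user and password are the variables from the question):
// Sketch: replicate curl -u user:password -H "Content-Type:text/xml" -d "<element>myXML</element>" serverURL
// by sending the Basic Authorization header on the very first request.
HttpWebRequest wr = (HttpWebRequest)WebRequest.Create(serverURL);
wr.Method = "POST";
wr.ContentType = "text/xml";
byte[] authBytes = Encoding.UTF8.GetBytes(user + ":" + password);
wr.Headers["Authorization"] = "Basic " + Convert.ToBase64String(authBytes);
byte[] body = Encoding.UTF8.GetBytes("<element>myXML</element>");
using (var requestStream = wr.GetRequestStream())
{
    requestStream.Write(body, 0, body.Length);
}
using (var response = (HttpWebResponse)wr.GetResponse())
using (var reader = new StreamReader(response.GetResponseStream()))
{
    string responseXml = reader.ReadToEnd();
}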