I'm currently writing an API that gets data from a Point of Sale system's web interface. So far, I haven't had any problems logging in and generating reports to pull data from, until this situation.
In general, I can use the following method to return an HttpWebRequest object that does the trick for most requests to the web server.
private HttpWebRequest DefaultRequestObject(string path)
{
    var request = (HttpWebRequest)WebRequest.Create(_baseUrl + path);
    request.Method = "GET";
    request.Host = _baseUrl.Substring(8); // cut off the "https://"
    request.UserAgent = "Mozilla/5.0 (Windows NT 6.1; WOW64; rv:30.0) Gecko/20100101 Firefox/30.0";
    request.Accept = "text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8";
    request.Headers.Add("Accept-Language", "en-US,en;q=0.5");
    request.Headers.Add("Accept-Encoding", "gzip, deflate");
    request.SendChunked = false;
    request.AllowAutoRedirect = true;
    request.ProtocolVersion = HttpVersion.Version11;

    var sp = request.ServicePoint;
    sp.Expect100Continue = false;
    // Reflection hack: reset the ServicePoint's non-public HttpBehaviour
    // so no special connection behaviour is applied.
    var prop = sp.GetType().GetProperty("HttpBehaviour", BindingFlags.Instance | BindingFlags.NonPublic);
    prop.SetValue(sp, (byte)0, null);

    request.CookieContainer = _cookieJar;
    if (!String.IsNullOrEmpty(_cookieString))
        request.Headers.Add(HttpRequestHeader.Cookie, _cookieString);

    return request;
}
When I send a GET request, I use the following method:
public MgrngResponse GetContent(string serverPath, Dictionary<string, string> args)
{
    if (!serverPath.StartsWith("/"))
        serverPath = "/~pos/mgrng/" + serverPath;

    var requestString = serverPath + "?" + HttpDataFormat(args);
    var request = DefaultRequestObject(requestString);
    try
    {
        var response = (HttpWebResponse)request.GetResponse();
        var mgrngResponse = new MgrngResponse(response);
        if (!String.IsNullOrEmpty(mgrngResponse.HttpResponse.GetResponseHeader("Set-Cookie")))
            SaveMgrngResponseCookies(mgrngResponse);
        _sessionExpiration = DateTime.Now.AddMinutes(15);
        UpdateStatus(mgrngResponse.Content);
        return mgrngResponse;
    }
    catch (WebException webException)
    {
        using (WebResponse response = webException.Response)
        {
            var httpResponse = (HttpWebResponse)response;
            Console.WriteLine("Error code: {0}", httpResponse.StatusCode);
            using (Stream data = response.GetResponseStream())
            using (var reader = new StreamReader(data))
            {
                string text = reader.ReadToEnd();
                Console.WriteLine(text);
            }
        }
        var eventArgs = new SessionUpdateEventArgs(SessionStatus.ConnectionError, "Unable to GET data");
        RaiseStatusChangeEvent(eventArgs);
        return null;
    }
}
This works well for all of the pages I've attempted so far, but now I'm running into a new problem: when I try to get the response for a particular page, the method throws a WebException with a 500 Internal Server Error.
I used Fiddler to match a browser's request exactly with mine (except for the order of the headers and the cookie values, obviously), but I'm still getting the Internal Server Error.
Below are the Raw Requests from Fiddler. The first one is from Firefox and the second one is from my program.
GET https://location.posprovider.com/~pos/mgrng/Report.php?boc_brand=7&csv_delimeter=csv_delimeter_comma&format=text&format1=csv&format1=txt&format1=pdf&format1=html HTTP/1.1
Host: location.posprovider.com
User-Agent: Mozilla/5.0 (Windows NT 6.1; WOW64; rv:30.0) Gecko/20100101 Firefox/30.0
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
Accept-Language: en-US,en;q=0.5
Accept-Encoding: gzip, deflate
Cookie: USER=af182fda473354eb3199522726ca61c9d5516c95f165cffd63b9522726ca61c9dc714cb52a46278e4399720706ea41e9dc714cb52a46278e4399720706ea41e9dc714cb52a46278e4399720706ea41e9dc714cb52a46278e4399720706ea41e9fc516c950a6607ae63b9522726ca61c9; PHPSESSID=9d7f54f9a1769a3e0572745fe0db3d97
Connection: keep-alive
GET https://location.posprovider.com/~pos/mgrng/Report.php?boc_brand=7&csv_delimeter=csv_delimeter_comma&format=text&format1=csv&format1=txt&format1=pdf&format1=html HTTP/1.1
User-Agent: Mozilla/5.0 (Windows NT 6.1; WOW64; rv:30.0) Gecko/20100101 Firefox/30.0
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
Accept-Language: en-US,en;q=0.5
Accept-Encoding: gzip, deflate
Host: location.posprovider.com
Cookie: PHPSESSID=16ad21c9d69fe85b3d185ab284f8620b; USER=30dca66f355e0ba89b6eda3c3e822ea24a95e5209e0f90bec94eda3c3e822ea243b5c500582b78cde96efa1c1ea20e8243b5c500582b78cde96efa1c1ea20e8243b5c500582b78cde96efa1c1ea20e8243b5c500582b78cde96efa1c1ea20e826395e520780b58edc94eda3c3e822ea2
Connection: Keep-Alive
I even tried logging into the web interface and then copy and pasting my generated request string into the browser and I got the desired data.
Any ideas?
Pull your program's request into Fiddler's Composer and gradually eliminate the remaining differences. What do you find doing that?
Great advice! Apparently there's something wrong with my cookie line.
I copied and pasted the cookie line from the browser's version into mine
and it worked. Now I just need to resolve why my cookies aren't
correct...
I agree with that diagnosis. Who knows what's going on inside the server; probably some fragile code that expects a very exact cookie string format.
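One way to rule out CookieContainer reordering or reformatting the cookies is to skip the container entirely and write the Cookie header by hand, in the same order the browser sends it. A minimal sketch (the helper name is made up, and the cookie values are shortened placeholders):

```csharp
using System;
using System.Collections.Generic;
using System.Linq;
using System.Net;

static class CookieHelper
{
    // Join name/value pairs into a Cookie header value in an explicit,
    // fixed order, instead of letting CookieContainer decide for us.
    public static string BuildCookieHeader(IEnumerable<KeyValuePair<string, string>> cookies)
    {
        return string.Join("; ", cookies.Select(c => c.Key + "=" + c.Value));
    }
}

class Demo
{
    static void Main()
    {
        var request = (HttpWebRequest)WebRequest.Create("https://location.posprovider.com/~pos/mgrng/Report.php");
        // No CookieContainer here: the header goes out exactly as written,
        // USER first, PHPSESSID second, matching the browser's order.
        request.Headers.Add(HttpRequestHeader.Cookie, CookieHelper.BuildCookieHeader(new[]
        {
            new KeyValuePair<string, string>("USER", "af182f"),      // placeholder value
            new KeyValuePair<string, string>("PHPSESSID", "9d7f54"), // placeholder value
        }));
        Console.WriteLine(request.Headers[HttpRequestHeader.Cookie]);
        // prints: USER=af182f; PHPSESSID=9d7f54
    }
}
```

If the server really is order-sensitive, this pins the order down deterministically, which the container does not guarantee.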
Related
When I try to access an API from the browser, it returns data properly.
Below is the output of Chrome Dev Tools - Network Tab
GET /xxxxxxx/api/xxxxxxx/xxxxxxxxxxxxx?referencenumber=AVXD13198802469/1 HTTP/1.1
Host: xxxx.xxxxxxx.xxx
Connection: keep-alive
Cache-Control: max-age=0
sec-ch-ua: " Not A;Brand";v="99", "Chromium";v="96", "Google Chrome";v="96"
sec-ch-ua-mobile: ?0
sec-ch-ua-platform: "Windows"
Upgrade-Insecure-Requests: 1
User-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.45 Safari/537.36
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,image/avif,image/webp,image/apng,*/*;q=0.8,application/signed-exchange;v=b3;q=0.9
Sec-Fetch-Site: none
Sec-Fetch-Mode: navigate
Sec-Fetch-User: ?1
Sec-Fetch-Dest: document
Accept-Encoding: gzip, deflate, br
Accept-Language: en-GB,en;q=0.9,en-US;q=0.8,hi;q=0.7,te;q=0.6
Cookie: _ga=GA1.2.324340773.1637687185; _hjSessionUser_1162220=eyJpZCI6IjVjN2Q4ZjZiLTE3NGYtNWRlOS1iN2ZjLWFhMzU3NGJjYmFjNSIsImNyZWF0ZWQiOjE2Mzc2ODcxODUzMzYsImV4aXN0aW5nIjpmYWxzZX0=; OptanonAlertBoxClosed=2021-11-23T17:06:55.324Z; OptanonConsent=isGpcEnabled=0&datestamp=Tue+Nov+23+2021+22%3A37%3A28+GMT%2B0530+(India+Standard+Time)&version=6.18.0&isIABGlobal=false&hosts=&consentId=e2272ab8-8e01-4859-902a-e8e84fbe8b35&interactionCount=1&landingPath=NotLandingPage&groups=C0001%3A1%2CC0003%3A1%2CC0002%3A1%2CC0004%3A1&geolocation=%3B&AwaitingReconsent=false; AWSALB=hSe9Dtqo8cPvWzIyv/lT0nhcCJ822BzrFDng1sT+fBBmde4CPOMbJJpCE3PESkURtsxxEGKsTwlnlN8ybLLed4pVYfE6tDiFKz9WD5fBYeydSBZw/k1tMkG+/2fa; AWSALBCORS=hSe9Dtqo8cPvWzIyv/lT0nhcCJ822BzrFDng1sT+fBBmde4CPOMbJJpCE3PESkURtsxxEGKsTwlnlN8ybLLed4pVYfE6tDiFKz9WD5fBYeydSBZw/k1tMkG+/2fa; dtCookie=v_4_srv_3_sn_2832183B98BD4E50DD4D6456885CECA3_perc_100000_ol_0_mul_1_app-3A86e062a5b6c28a86_1_rcs-3Acss_0
But when I try to execute the same URL from my .NET app, it gives a 403 error. My .NET code is as follows:
public HttpResponseMessage SendRequestPostNew(string destinationMethod, string destinationURL, string requestContent, string TimeoutValue)
{
    var httpClientHandler = new HttpClientHandler();
    // accept any server certificate (note: this disables TLS validation)
    httpClientHandler.ServerCertificateCustomValidationCallback = (message, cert, chain, errors) => true;
    HttpClient clientNew = new HttpClient(httpClientHandler);
    try
    {
        using (var newRequest = new HttpRequestMessage(new HttpMethod(destinationMethod), destinationURL))
        {
            newRequest.Headers.Accept.Clear();
            newRequest.Headers.Accept.Add(new MediaTypeWithQualityHeaderValue("application/json"));
            newRequest.Content = new StringContent(requestContent, Encoding.UTF8, "application/json");
            ServicePointManager.SecurityProtocol = SecurityProtocolType.Tls13 | SecurityProtocolType.Tls12 | SecurityProtocolType.Tls11 | SecurityProtocolType.Tls;
            ServicePointManager.ServerCertificateValidationCallback += (sender, cert, chain, sslPolicyErrors) => true;
            //System.Net.ServicePointManager.Expect100Continue = false;
            clientNew.Timeout = TimeSpan.FromSeconds(Convert.ToInt32(TimeoutValue));
            var response = clientNew.SendAsync(newRequest);
            return response.Result;
        }
    }
    catch (Exception)
    {
        throw;
    }
    finally
    {
        clientNew.Dispose();
    }
}
403 ERROR
The request could not be satisfied.
Bad request. We can't connect to the server for this app or website at this time. There might be too much traffic or a configuration error. Try again later, or contact the app or website owner.
If you provide content to customers through CloudFront, you can find steps to troubleshoot and help prevent this error by reviewing the CloudFront documentation.
Generated by cloudfront (CloudFront) Request ID: EXUpjNsCEJfyHq_q0PobrhVpOr1e3EfbH8grxVhVTsz036MSbIrkmg==
Where could be the issue?
CloudFront might block the request if the "User-Agent" header is absent or doesn't make sense.
Possibly a few more headers are missing. Try to compare the two sets of headers:
the ones sent by your browser
the ones sent by your application
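As a starting point for that comparison, a sketch of how one might mirror the browser's identifying headers with HttpClient (the values are copied from the Chrome capture above; which headers CloudFront actually requires is an assumption you verify by elimination):

```csharp
using System;
using System.Net.Http;

class Program
{
    static void Main()
    {
        var client = new HttpClient();
        // CloudFront is known to reject requests whose User-Agent is missing
        // or implausible, so mirror the browser's value first.
        client.DefaultRequestHeaders.TryAddWithoutValidation("User-Agent",
            "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.45 Safari/537.36");
        client.DefaultRequestHeaders.TryAddWithoutValidation("Accept",
            "text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8");
        client.DefaultRequestHeaders.TryAddWithoutValidation("Accept-Language",
            "en-GB,en;q=0.9,en-US;q=0.8");
        // var body = client.GetStringAsync(destinationURL).Result; // real endpoint elided in the question
        Console.WriteLine(client.DefaultRequestHeaders.UserAgent);
    }
}
```

Removing headers one at a time from this set narrows down which one the distribution is filtering on.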
I'm trying to receive data from a compressor, but its answer is always empty. Here is my code:
WebClient client = new WebClient();
client.Headers[HttpRequestHeader.ContentType]= "application/x-www-form-urlencoded";
string question = "online_pressure";
string URL = "http://10.0.163.51/getVar.cgi";
string answer = client.UploadString(URL, "POST", question);
Console.WriteLine(answer);
When I use this code for another compressor, which differs only in these two strings, it works great and I can see the answer in the console:
string question = "QUESTION=300201";
string URL = "http://10.0.163.50/cgi-bin/mkv.cgi";
The same code in VBS works great for both compressors; I can see the answer in a MsgBox from both the first and second compressor:
Dim objHTTP
strToSend = "online_pressure"
Set objHTTP = CreateObject("Microsoft.XMLHTTP")
Call objHTTP.Open("POST", "http://10.0.163.51/getVar.cgi", false)
objHTTP.setRequestHeader "Content-Type", "application/x-www-form-urlencoded"
objHTTP.Send strToSend
MsgBox(objHTTP.ResponseText)
HttpWebRequest code likewise works only for the second compressor:
HttpWebRequest request = (HttpWebRequest)WebRequest.Create(URL);
request.Method = "POST";
request.ContentType = "application/x-www-form-urlencoded";
request.ContentLength = data.Length;

StreamWriter requestWriter = new StreamWriter(request.GetRequestStream(), System.Text.Encoding.ASCII);
requestWriter.Write(data);
requestWriter.Close();

try
{
    // get the response
    WebResponse webResponse = request.GetResponse();
    Stream webStream = webResponse.GetResponseStream();
    StreamReader responseReader = new StreamReader(webStream);
    string response = responseReader.ReadToEnd();
    responseReader.Close();
    Console.WriteLine(response);
}
catch (WebException we)
{
    string webExceptionMessage = we.Message;
}
What I can try else to get data from first compressor in C#?
I compared the three requests in Fiddler 4 and realized that the only difference (apart from some other headers that won't affect behavior) between the VBS script and both WebClient and HttpWebRequest is that the managed APIs send the Expect: 100-continue header and the VBS script does not.
This can be the issue if the software running on the compressor device does not support it.
For the HttpWebRequest, you can simply prevent sending this header with:
request.ServicePoint.Expect100Continue = false;
Note: for the WebClient, the same assignment requires access to the HttpWebRequest object used internally; that member has the "protected" access modifier, but it can be worked around by subclassing.
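The usual workaround is a small WebClient subclass: GetWebRequest is the protected hook through which WebClient builds its internal HttpWebRequest, so overriding it lets you flip the flag there (a sketch; the class name is made up):

```csharp
using System;
using System.Net;

// A WebClient that never sends "Expect: 100-continue".
class NoExpectWebClient : WebClient
{
    protected override WebRequest GetWebRequest(Uri address)
    {
        var request = base.GetWebRequest(address);
        var http = request as HttpWebRequest;
        if (http != null)
            http.ServicePoint.Expect100Continue = false; // same flag as above, set on WebClient's internal request
        return request;
    }
}
```

It is then used exactly like the original: `new NoExpectWebClient().UploadString(URL, "POST", question)`, minus the Expect header on the wire.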
Before setting this value:
POST http://oguzozgul.com.tr/getVar.cgi HTTP/1.1
Content-Type: application/x-www-form-urlencoded
Host: oguzozgul.com.tr
Content-Length: 15
Expect: 100-continue
Connection: Keep-Alive
online_pressure
After setting this value:
POST http://oguzozgul.com.tr/getVar.cgi HTTP/1.1
Content-Type: application/x-www-form-urlencoded
Host: oguzozgul.com.tr
Content-Length: 15
Connection: Keep-Alive
online_pressure
I also want to put the VBS script's request here so you can see the other differences as well. Accept-Encoding and some other headers are also not sent by default:
POST http://oguzozgul.com.tr/getVar.cgi HTTP/1.1
Accept: */*
Content-Type: application/x-www-form-urlencoded
Accept-Language: tr,en-US;q=0.8,en-GB;q=0.7,en;q=0.5,zh-Hans-CN;q=0.3,zh-Hans;q=0.2
UA-CPU: AMD64
Accept-Encoding: gzip, deflate
User-Agent: Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 6.3; Win64; x64; Trident/7.0; .NET4.0E; .NET4.0C; Tablet PC 2.0; .NET CLR 3.5.30729; .NET CLR 2.0.50727; .NET CLR 3.0.30729; wbx 1.0.0; Zoom 3.6.0; wbxapp 1.0.0)
Host: oguzozgul.com.tr
Content-Length: 15
Connection: Keep-Alive
Pragma: no-cache
online_pressure
WebClient and HttpWebRequest still do not work with this compressor. I tried adding various headers, but no result.
I tried PowerShell and it returns an empty response too:
Invoke-WebRequest -Uri http://10.0.163.51/getVar.cgi -Method POST -Body "package_discharge_pressure" -ContentType "application/x-www-form-urlencoded"
curl doesn't work either; it hangs without errors:
curl -i -X POST -H "Content-Type:application/x-www-form-urlencoded" -d "sump_pressure" http://10.0.163.51/getVar.cgi
JavaScript and VBS work fine. And today I found C# code which works too:
WinHttpRequest req = new WinHttpRequest();
try
{
    req.Open("POST", "http://10.0.163.51/getVar.cgi", true);
    req.SetRequestHeader("Content-Type", "application/x-www-form-urlencoded");
    req.Send("sump_pressure");
    req.WaitForResponse();
}
catch (Exception ex)
{
    Console.WriteLine("Error : " + ex.Message.Trim());
}
Console.WriteLine(req.ResponseText);
In Visual Studio you need to add a reference to WinHttp.
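For completeness, the same no-Expect behaviour is available without the COM reference: HttpClient only sends Expect: 100-continue when asked to, and the flag can be set explicitly. A sketch, untested against this particular device:

```csharp
using System;
using System.Net.Http;
using System.Text;

class Program
{
    static void Main()
    {
        var client = new HttpClient();
        // Ensure no "Expect: 100-continue" is sent with POST bodies, so the
        // body goes out immediately, like the WinHttpRequest/VBS versions.
        client.DefaultRequestHeaders.ExpectContinue = false;
        var body = new StringContent("sump_pressure", Encoding.ASCII, "application/x-www-form-urlencoded");
        // var answer = client.PostAsync("http://10.0.163.51/getVar.cgi", body)
        //                    .Result.Content.ReadAsStringAsync().Result;
        Console.WriteLine(client.DefaultRequestHeaders.ExpectContinue); // prints: False
    }
}
```

Note that StringContent appends a charset to the Content-Type header; if the device is as picky as it appears, that small difference may also matter.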
I am using C# to make a request to my web server like this so I can save the data from the client into my model.
This is the route:
Route::get('client/register', 'ClientController@store');
C# request:
string response = Network.sendWebRequest("http://localhost/myproject/public/client/register?identity=" + getIdentity());
As you can see, I send a GET request with identity as key and a base64-encoded string as value.
Then in my store function I decode the base64-encoded string, explode it, and try to save it to my database. (Hint: the database table is empty, so don't worry, the record does not already exist.)
ClientController
public function store(Request $request)
{
    echo "store function";
    $clientIdentityB64 = request('identity');
    if (empty($clientIdentityB64)) { return "empty"; }

    $clientIdentity = base64_decode($clientIdentityB64);
    $identityArr = explode(";", $clientIdentity);

    $clientQuery = Client::select("id")
        ->where("base64Id", "=", $clientIdentityB64)
        ->get()
        ->first();

    if (is_null($clientQuery)) {
        $client = new Client();
        $client->ip = $request->ip();
        $client->base64Id = $clientIdentityB64;
        $client->userName = $identityArr[0];
        $client->userDomainName = $identityArr[1];
        $client->machineName = $identityArr[2];
        $client->osVersion = $identityArr[3];
        $client->userAgent = $request->userAgent();
        $client->save();
    }
}
However, it does not save the model to the database if I make the request. You can see that I output store function at the top of the store function, but I only get that response if I remove $client->save() from the code.
If $client->save() is in the code, then I get this response:
Exception: System.Net.WebException: Request canceled: disconnected
   at System.Net.HttpWebRequest.GetResponse()
   at Helper.Network.sendWebRequest(String URL) in C:\Users....
I already echoed out all the values successfully; only the save call does not work, even though the values are there.
However, it does work if I call the following URL from a normal web browser: http://localhost/myproject/public/client/register?identity=somebase64encodedhash.
How can I solve this?
This is my network class which I used above:
public static class Network
{
    public static string sendWebRequest(string URL)
    {
        System.Net.WebRequest request = System.Net.WebRequest.Create(URL);
        try
        {
            System.Net.WebResponse response = request.GetResponse();
            System.IO.StreamReader streamReader = new System.IO.StreamReader(response.GetResponseStream());
            return (string)streamReader.ReadToEnd();
        }
        catch (Exception e)
        {
            return "Exception: " + e;
        }
    }
}
UPDATE
Request from web browser:
GET /myproject/public/client/register?identity=LO0dzILO0E5ULO0IBXaW5kb3dzIEDUNLO0JMQUNLO01pY3Jvc29mdCBXaW5kb3dzIE5UIDYuMi45MjAwLjA= HTTP/1.1
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
Accept-Encoding: gzip, deflate
Accept-Language: de,en-US;q=0.7,en;q=0.3
Client-Ip: 147.153.195.175
Connection: keep-alive
Cookie: XSRF-TOKEN=eyJpdiI6IkhPNW9yeVhkNGFiQWFnWUtFQ2FRREE9PSIsInZhbHVlIjoiZm1yR1NxN0tvTEFJSTNoYnJWdmNRWWZEVTRDOWVYUHJNemhDWCs4TFY4VnNoU2J5ZXNha3dPNHAwNHlKc0lHMlRGQUdiUE9BdnJcL2NDb1l4RytYSFJRPT0iLCJtYWMiOiI2ODliODM1ZDU4NzFlMTcxNmUyNGMzYjNhYTBkNGU0NjI3NDQ4MmYxZTFiZTg5YWNjMmEyZWFiMDliYWExZjY4In0%3D; laravel_session=eyJpdiI6Ijl0Mnlid0xkdTRRQVh2UDZhXC95MzFBPT0iLCJ2YWx1ZSI6InR4eCtTRVg5XC81YTU3dHB1azluVHpYemw5a2ZkbnBEMGdsRllGSXFIajFJSUdWcitwSTFQS1pWXC9LUlR3Vzh0eExSY3Via2xwOTBhVG5kZXVzRXdWbGc9PSIsIm1hYyI6Ijg5MmZiNDMxOTFhMGQzNjMxNmM0NzI1NzAyOTk3ZTU4ODkwMzdkYzM3YTIxZTU3M2MzMWYzYjM5NTI1OWVhZDQifQ%3D%3D
Dnt: 1
Host: localhost
Upgrade-Insecure-Requests: 1
User-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:55.0) Gecko/20100101 Firefox/55.0
Via: 223.100.9.3
X-Forwarded-For: 36.2.247.179
Request from C# Application:
GET /myproject/public/client/register?identity=LO0dzILO0E5ULO0IBXaW5kb3dzIEDUNLO0JMQUNLO01pY3Jvc29mdCBXaW5kb3dzIE5UIDYuMi45MjAwLjA= HTTP/1.1
Host: localhost
Hint: I changed the base64 string to something else...
UPDATE
I changed my Network class to this, so I can attach an empty CookieContainer to the request. But it does not change the request at all.
public static class Network
{
    public static string sendWebRequest(string URL)
    {
        HttpWebRequest request = (HttpWebRequest)WebRequest.Create(URL);
        request.CookieContainer = new CookieContainer();
        try
        {
            HttpWebResponse response = (HttpWebResponse)request.GetResponse();
            System.IO.StreamReader streamReader = new System.IO.StreamReader(response.GetResponseStream());
            return (string)streamReader.ReadToEnd();
        }
        catch (Exception e)
        {
            return "Exception: " + e;
        }
    }
}
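Comparing the two captures above, the C# request carries nothing but the Host header, while the browser also sends Accept, User-Agent, cookies, and more. A sketch that narrows the gap (which header, if any, actually changes the server's behaviour here is an assumption to verify):

```csharp
using System;
using System.Net;

class Program
{
    static void Main()
    {
        var url = "http://localhost/myproject/public/client/register?identity=somebase64encodedhash";
        HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url);
        request.CookieContainer = new CookieContainer();
        // Headers the browser sends that the bare request does not:
        request.Accept = "text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8";
        request.UserAgent = "Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:55.0) Gecko/20100101 Firefox/55.0";
        request.KeepAlive = true;
        Console.WriteLine(request.UserAgent);
    }
}
```

Building the request does not send it, so this can be inspected in Fiddler before committing to a theory.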
I solved it by adding a webBrowser element to my C# Form and calling the URL from the browser like this:
webBrowser1.Navigate("http://localhost/myproject/public/client/register?identity=" + getIdentity());
I'm trying to automate configuring a wireless router's SSID and password via a C# WebClient. The router has no API that I know of; it's an unbranded Chinese router, and the web config seems to be the only option for configuration. It uses HTTP basic authentication (you browse to the IP address of the router and get a generic dialog asking for username and password).
I've used Wireshark to capture the headers and form fields that the HTTP POST requests use when I manually update the SSID and password (two separate forms). I then attempted to use WebClient to emulate those POST requests.
Here is a snippet of code that I am using to attempt to save a new SSID (NameValueCollection is defined elsewhere):
private const string FORM_SSID = "http://192.168.1.2/formWlanSetup.htm";
private const string REF_SSID = "http://192.168.1.2/formRedirect.htm?redirect-url=wlbasic.htm&wlan_id=0";
private NameValueCollection mFields = HttpUtility.ParseQueryString(string.Empty, Encoding.ASCII);

public string SaveConfigResponse()
{
    try
    {
        using (WebClient wc = new WebClient())
        {
            wc.Headers[HttpRequestHeader.Accept] = "text/html, application/xhtml+xml, */*";
            wc.Headers[HttpRequestHeader.Referer] = REF_SSID;
            wc.Headers[HttpRequestHeader.AcceptLanguage] = "en-US";
            wc.Headers[HttpRequestHeader.UserAgent] = "Mozilla/5.0 (Windows NT 6.1; WOW64; Trident/7.0; rv:11.0) like Gecko";
            wc.Headers[HttpRequestHeader.ContentType] = "application/x-www-form-urlencoded";
            wc.Headers[HttpRequestHeader.AcceptEncoding] = "gzip, deflate";
            wc.Headers[HttpRequestHeader.Host] = "192.168.1.2";
            wc.Headers[HttpRequestHeader.Connection] = "Keep-Alive";
            wc.Headers[HttpRequestHeader.ContentLength] = Encoding.ASCII.GetBytes(mFields.ToString()).Length.ToString();
            wc.Headers[HttpRequestHeader.CacheControl] = "no-cache";
            string credentials = Convert.ToBase64String(Encoding.ASCII.GetBytes(config_user + ":" + config_pass));
            wc.Headers[HttpRequestHeader.Authorization] = string.Format("Basic {0}", credentials);
            //wc.Credentials = new NetworkCredential("admin", "admin");
            return Encoding.ASCII.GetString(wc.UploadValues(FORM_SSID, "POST", mFields));
        }
    }
    catch (Exception ex)
    {
        return ex.Message;
    }
}
This results in an HTTP 401 Not Authorized response. Is what I'm trying to do just impossible?
UPDATE
Here are the HTTP headers of both the browser post/response and the WebClient post/response. Again, I tried to match what I saw the browser posting as well as I could with my WebClient post.
Browser:
POST /formWlanSetup.htm HTTP/1.1
Accept: text/html, application/xhtml+xml, */*
Referer: http://192.168.1.2/formRedirect.htm?redirect-url=wlbasic.htm&wlan_id=0
Accept-Language: en-US
User-Agent: Mozilla/5.0 (Windows NT 6.1; WOW64; Trident/7.0; rv:11.0) like Gecko
Content-Type: application/x-www-form-urlencoded
Accept-Encoding: gzip, deflate
Host: 192.168.1.2
Content-Length: 524
Connection: Keep-Alive
Cache-Control: no-cache
Authorization: Basic YWRtaW46YWRtaW4=
HTTP/1.1 302 Found
Location: wlbasic.htm
Content-Length: 183
Date: Thu, 23 Oct 2014 18:18:27 GMT
Server: eCos Embedded Web Server
Connection: close
Content-Type: text/html
Transfer-Encoding: chunked
Cache-Control: no-cache
WebClient:
POST /formWlanSetup.htm HTTP/1.1
Accept-Language: en-US
Accept-Encoding: gzip, deflate
Cache-Control: no-cache
Authorization: Basic YWRtaW46YWRtaW4=
Accept: text/html, application/xhtml+xml, */*
Content-Type: application/x-www-form-urlencoded
Referer: http://192.168.1.2/formRedirect.htm?redirect-url=wlbasic.htm&wlan_id=0
User-Agent: Mozilla/5.0 (Windows NT 6.1; WOW64; Trident/7.0; rv:11.0) like Gecko
Host: 192.168.1.2
Content-Length: 524
Connection: Keep-Alive
HTTP/1.1 401 Not Authorized
WWW-Authenticate: Basic realm="AP"
Date: Thu, 23 Oct 2014 18:18:41 GMT
Server: eCos Embedded Web Server
Connection: close
Content-Type: text/html
Transfer-Encoding: chunked
Cache-Control: no-cache
Again, that was all gleaned from Wireshark. I'm not very familiar with Wireshark, but I was able to get this far. If I knew how to properly extract the raw packet data and pastebin it, I would.
Important New Observations
The Wireshark captures of the post packets from both Browser and WebClient obviously differ in the order of the headers. I don't know how significant that might or might not be, though, as the data for each header is clearly the same.
One stark difference between the packets that I noticed is that Wireshark reports the Browser packet to be significantly larger than the WebClient packet. Looking at the itemized view, I couldn't find any obvious differences. I assume posting raw data for comparison would reveal a lot, but again, I don't really know how to do that.
I had a bewildering revelation: despite the response clearly stating '(401) Unauthorized', the POST is in fact being accepted by the router! Diving into the router's web config after my WebClient post shows that the settings were accepted and saved.
That last one is a biggie. I find myself in a situation where I can get my config to save with a WebClient post, but I have to ignore a 401 response in order to do so. Obviously, this is far from ideal. So close, yet so far!
FINAL UPDATE (RESOLUTION)
I've solved the issue of failing basic authentication, though not with WebClient. I used the suggestion from @caesay and went with HttpWebRequest (together with WebResponse). My form posts result in redirects, so I had to allow for that.
This is essentially what I went with:
private bool ConfigureRouter()
{
    bool passed = false;
    string response = "";
    HttpWebRequest WEBREQ = null;
    WebResponse WEBRESP = null;

    // Attempt to POST the form to the router that saves a new SSID.
    try
    {
        var uri = new Uri(FORM_SSID); // Create URI from URL string.
        WEBREQ = HttpWebRequest.Create(uri) as HttpWebRequest;

        // If the POST results in redirects, you won't see an "OK"
        // response unless you allow those redirects.
        WEBREQ.AllowAutoRedirect = true;

        // Basic authentication first sends the request without
        // creds. This is protocol standard.
        // When the server replies with 401, the HttpWebRequest will
        // automatically send the request again with the creds when
        // PreAuthenticate is set.
        WEBREQ.PreAuthenticate = true;
        WEBREQ.AuthenticationLevel = System.Net.Security.AuthenticationLevel.MutualAuthRequested;

        // Mimic all headers known to satisfy the request,
        // as discovered with a tool like Wireshark or Fiddler
        // when the form was submitted from a browser.
        WEBREQ.Method = "POST";
        WEBREQ.Accept = "text/html, application/xhtml+xml, */*";
        WEBREQ.Headers.Add("Accept-Language", "en-US"); // no Accept-Language property built in to HttpWebRequest
        WEBREQ.UserAgent = USER_AGENT;
        WEBREQ.Referer = REF_SSID;
        WEBREQ.AutomaticDecompression = DecompressionMethods.GZip | DecompressionMethods.Deflate;
        WEBREQ.KeepAlive = true;
        WEBREQ.Headers.Add("Pragma", "no-cache"); // no Pragma property built in to HttpWebRequest

        // Use a credential cache so that the creds are properly
        // submitted with subsequent redirect requests.
        CredentialCache creds = new CredentialCache();
        creds.Add(uri, "Basic", new NetworkCredential(config_user, config_pass));
        WEBREQ.Credentials = creds;

        // Submit the form.
        using (Stream stream = WEBREQ.GetRequestStream())
        {
            SSID ssid = new SSID(ssid_scanned); // gets predefined form fields with the new SSID inserted (NameValueCollection PostData)
            stream.Write(ssid.PostData, 0, ssid.PostData.Length);
        }

        // Get the response from the final redirect.
        WEBRESP = WEBREQ.GetResponse();
        response = ((HttpWebResponse)WEBRESP).StatusCode.ToString();
        if (response == "OK")
        {
            StatusUpdate("STATUS: SSID save was successful.");
            passed = true;
        }
        else
        {
            StatusUpdate("FAILED: SSID save was unsuccessful.");
            passed = false;
        }
        WEBRESP.Close();
    }
    catch (Exception ex)
    {
        StatusUpdate("ERROR: " + ex.Message);
        return false;
    }
    return passed;
}
Is what I'm trying to do just impossible?
No, it's not impossible. I have had many headaches with web scraping like this over the years because some web servers are picky, and your router interface is likely a custom web server implementation that isn't as forgiving as Apache or IIS.
I would do a Wireshark capture and get the raw packet data that Chrome sends (with payload etc.), and then do the same capture for your application. Make sure the packets are as similar as you can get them. If you still have issues, post the packet captures to pastebin or something so we can have a look.
EDIT:
Instead of using the limited WebClient API, try using some lower-level items. I wonder if the following code will work for you:
var uri = new Uri("http://192.168.1.2/formWlanSetup.htm");
var cookies = new CookieContainer();
HttpWebRequest request = (HttpWebRequest)WebRequest.Create(uri);
request.CookieContainer = cookies;
request.ServicePoint.Expect100Continue = false;
request.Method = "POST";
request.ContentType = "application/x-www-form-urlencoded";
request.UserAgent = "Mozilla/5.0 (Windows NT 6.1; WOW64; Trident/7.0; rv:11.0) like Gecko";
request.Referer = "http://192.168.1.2/formRedirect.htm?redirect-url=wlbasic.htm&wlan_id=0";
request.Credentials = new NetworkCredential(config_user, config_pass);
request.PreAuthenticate = true;
var response = request.GetResponse();
var reader = new StreamReader(response.GetResponseStream());
string htmlResponse = reader.ReadToEnd();
We have a tool which checks if a given URL is live. If a given URL is live, another part of our software can screen-scrape the content from it.
This is my code for checking if a URL is live:
public static bool IsLiveUrl(string url)
{
    HttpWebRequest webRequest = WebRequest.Create(url) as HttpWebRequest;
    webRequest.UserAgent = "Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.8.0.6) Gecko/20060728 Firefox/1.5";
    webRequest.CookieContainer = new CookieContainer();
    WebResponse webResponse;
    try
    {
        webResponse = webRequest.GetResponse();
    }
    catch (WebException e)
    {
        return false;
    }
    catch (Exception ex)
    {
        return false;
    }
    return true;
}
This code works perfectly, but for a particular site hosted on Apache I am getting a WebException with the following message: "The remote server returned an error: (403) Forbidden".
On further inspection I found the following details in the WebException object:
Status="ProtocolError"
StatusDescription="Bad Behaviour"
This is the request header:
User-Agent: Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.8.0.6) Gecko/20060728 Firefox/1.5
Host: scenicspares.co.uk
Connection: Keep-Alive
This is the response header:
Keep-Alive: timeout=4, max=512
Connection: Keep-Alive
Transfer-Encoding: chunked
Content-Type: text/html
Date: Thu, 13 Jan 2011 10:29:36 GMT
Server: Apache
I extracted these headers using a watch in VS 2008. The framework in use is 3.5.
It turned out that all I needed to do was the following:
webRequest.Accept = "*/*";
webResponse = webRequest.GetResponse();
and it was fixed.
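For reference, the checker with the Accept header folded back in looks like this (a sketch; the original's two separate catch blocks are collapsed into one, since both just return false):

```csharp
using System;
using System.Net;

static class UrlChecker
{
    // Original IsLiveUrl plus the Accept header that satisfied the picky Apache host.
    public static bool IsLiveUrl(string url)
    {
        HttpWebRequest webRequest = WebRequest.Create(url) as HttpWebRequest;
        webRequest.UserAgent = "Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.8.0.6) Gecko/20060728 Firefox/1.5";
        webRequest.Accept = "*/*"; // the one-line fix
        webRequest.CookieContainer = new CookieContainer();
        try
        {
            using (webRequest.GetResponse()) { } // any response at all means the URL is live
        }
        catch (Exception)
        {
            return false;
        }
        return true;
    }
}
```

Wrapping the response in a using block also disposes it, which the original leaked.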
I believe there are quite a lot of similar problems that depend on the server application. In my particular case, see: The remote server returned an error: (403) Forbidden
I fixed this for my web-scraping app after facing the issue all day long; hope it might help others:
public static string GetPageContent(string url)
{
    CookieContainer cookieContainer = new CookieContainer();
    HttpWebRequest request = (HttpWebRequest)HttpWebRequest.Create(url);
    request.CookieContainer = cookieContainer;        // after the Create() call
    request.AllowAutoRedirect = true;                 // should be true
    request.UserAgent = ".NET Framework Test Client"; // should not be null
    var responseStr = string.Empty;
    using (var response = request.GetResponse())
    {
        Stream dataStream = response.GetResponseStream();
        StreamReader reader = new StreamReader(dataStream);
        responseStr = reader.ReadToEnd();
        reader.Close();
        dataStream.Close();
    }
    return responseStr;
}