Why do I get an empty response? - C#

I am sending:
GET / HTTP/1.1
Host: example.com
User-Agent: Mozilla/5.0 (Windows NT 6.1; WOW64; rv28.0) Gecko/20100101 Firefox/28.0
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
Accept-Language: de,en-US;q=0.7,en;q=0.3
Accept-Encoding: gzip, deflate
Connection: close
My Code:
htmlClient.Connect(Dns.GetHostAddresses(url.Host)[0], 80); // Connect to server on port 80
htmlClient.Send(ASCIIEncoding.Default.GetBytes(htmlreq));
byte[] htmlReqBuff = new byte[10240];
htmlClient.Receive(htmlReqBuff);
htmlClient.Disconnect(false);
htmlClient.Shutdown(SocketShutdown.Both);
htmlClient.Dispose();
Log.info(ASCIIEncoding.Default.GetString(htmlReqBuff));
From some web servers I get a request timeout.

Your request must end with an empty line (<CR><LF>) to indicate the end of the HTTP request; otherwise the server keeps waiting for more headers and eventually times out.
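For illustration, a sketch of building the request string from the code above so the header block ends with a blank line (assuming htmlreq is the string being sent):

string htmlreq =
    "GET / HTTP/1.1\r\n" +
    "Host: example.com\r\n" +
    "Connection: close\r\n" +
    "\r\n"; // the final blank line terminates the headers

htmlClient.Send(Encoding.ASCII.GetBytes(htmlreq));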

Related

Requesting to github-api using .net HttpClient says Forbidden

I am trying to get data from this GitHub API link: https://api.github.com/users/arif2009 . It works fine with a Postman GET request.
But if I try to get the data using .NET HttpClient, it says Forbidden.
C# code:
using (var client = new HttpClient())
{
    client.BaseAddress = new Uri("https://api.github.com/");
    client.DefaultRequestHeaders.Accept.Clear();
    client.DefaultRequestHeaders.Accept.Add(new MediaTypeWithQualityHeaderValue("application/json"));
    var response = await client.GetAsync("users/arif2009");
    if (response.IsSuccessStatusCode)
    {
        var data = await response.Content.ReadAsAsync<GithubUser>();
    }
}
Response:
Can anybody tell me where I made the mistake?
I took a look at your requests using Telerik's Fiddler and found the following.
Request with your code:
GET https://api.github.com/users/arif2009 HTTP/1.1
Accept: application/json
Host: api.github.com
Request from Postman:
GET https://api.github.com/users/arif2009 HTTP/1.1
Host: api.github.com
Connection: keep-alive
Accept: application/json
Cache-Control: no-cache
User-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/75.0.3770.142 Safari/537.36
Postman-Token: d560ee28-b1e8-ece5-2612-87371ddcb295
Accept-Encoding: gzip, deflate, br
Accept-Language: en-GB,en;q=0.9,ja-JP;q=0.8,ja;q=0.7,en-US;q=0.6
The obvious missing header seemed to be "User-Agent", so I added this:
client.DefaultRequestHeaders.UserAgent.Add(new ProductInfoHeaderValue("product", "1")); // set your own values here
Which produced the following request:
GET https://api.github.com/users/arif2009 HTTP/1.1
Accept: application/json
User-Agent: product/1
Host: api.github.com
And returned the following response:
HTTP/1.1 200 OK
Date: Wed, 17 Jul 2019 15:19:35 GMT
Content-Type: application/json; charset=utf-8
Content-Length: 1249
Server: GitHub.com
Status: 200 OK
X-RateLimit-Limit: 60
X-RateLimit-Remaining: 52
X-RateLimit-Reset: 1563380375
Cache-Control: public, max-age=60, s-maxage=60
Vary: Accept
ETag: "1df3e0be6e824ca684f27963806533da"
Last-Modified: Tue, 16 Jul 2019 05:58:59 GMT
X-GitHub-Media-Type: github.v3
Access-Control-Expose-Headers: ETag, Link, Location, Retry-After, X-GitHub-OTP, X-RateLimit-Limit, X-RateLimit-Remaining, X-RateLimit-Reset, X-OAuth-Scopes, X-Accepted-OAuth-Scopes, X-Poll-Interval, X-GitHub-Media-Type
Access-Control-Allow-Origin: *
Strict-Transport-Security: max-age=31536000; includeSubdomains; preload
X-Frame-Options: deny
X-Content-Type-Options: nosniff
X-XSS-Protection: 1; mode=block
Referrer-Policy: origin-when-cross-origin, strict-origin-when-cross-origin
Content-Security-Policy: default-src 'none'
Vary: Accept-Encoding
X-GitHub-Request-Id: D5E0:8D63:8BE92C:A799AE:5D2F3C86
{"login":"arif2009","id":6396346,"node_id":"MDQ6VXNlcjYzOTYzNDY=","avatar_url":"https://avatars0.githubusercontent.com/u/6396346?v=4","gravatar_id":"","url":"https://api.github.com/users/arif2009","html_url":"https://github.com/arif2009","followers_url":"https://api.github.com/users/arif2009/followers","following_url":"https://api.github.com/users/arif2009/following{/other_user}","gists_url":"https://api.github.com/users/arif2009/gists{/gist_id}","starred_url":"https://api.github.com/users/arif2009/starred{/owner}{/repo}","subscriptions_url":"https://api.github.com/users/arif2009/subscriptions","organizations_url":"https://api.github.com/users/arif2009/orgs","repos_url":"https://api.github.com/users/arif2009/repos","events_url":"https://api.github.com/users/arif2009/events{/privacy}","received_events_url":"https://api.github.com/users/arif2009/received_events","type":"User","site_admin":false,"name":"Arif","company":"#BrainStation-23 ","blog":"https://arif2009.github.io/","location":"Bangladesh","email":null,"hireable":true,"bio":"Software Engineer | Full Stack | Web Developer | Technical Writer","public_repos":15,"public_gists":2,"followers":9,"following":7,"created_at":"2014-01-14T05:03:47Z","updated_at":"2019-07-16T05:58:59Z"}
This API reference from GitHub says that 'User-Agent' is a required header:
All API requests MUST include a valid User-Agent header. Requests with no User-Agent header will be rejected.
Postman automatically adds its own User-Agent to the call when one is not provided by the user (this is nicely demonstrated by John's answer).
Simply adding this header will resolve your issue:
client.DefaultRequestHeaders.UserAgent.Add(new ProductInfoHeaderValue("yourAppName", "yourVersionNumber"));
You may also try putting the complete URI into GetAsync instead of using BaseAddress, since you are currently giving GetAsync only a string, not a Uri.
When sending a HttpRequestMessage with a relative Uri, the message Uri will be added to the BaseAddress property to create an absolute Uri.
https://learn.microsoft.com/de-de/dotnet/api/system.net.http.httpclient.baseaddress?view=netframework-4.8
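A minimal sketch of that alternative, keeping the User-Agent fix from above (names like GithubUser are taken from the question):

using (var client = new HttpClient())
{
    client.DefaultRequestHeaders.UserAgent.Add(new ProductInfoHeaderValue("yourAppName", "yourVersionNumber"));
    client.DefaultRequestHeaders.Accept.Add(new MediaTypeWithQualityHeaderValue("application/json"));

    // Pass the absolute URI directly instead of combining a relative path with BaseAddress
    var response = await client.GetAsync("https://api.github.com/users/arif2009");
    if (response.IsSuccessStatusCode)
    {
        var data = await response.Content.ReadAsAsync<GithubUser>();
    }
}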

Bad Request returned only with IE 11 on remote machine for XHR that passes credentials

I've been trying to figure out what piece I'm missing when making an XHR to an MS Web API that requires windows auth.
This request works locally on both Chrome and IE 11 as well as Chrome on a remote box (not the server). The problem is IE 11 on the remote box.
According to the dev tools, IE makes 3 requests. The first two requests pass an Authorization: Negotiate header and return 401s (preflights for CORS?). However, the third returns a 400. It seems like it fails to authenticate in a way that I don't understand, especially since other browsers and local tests work.
The API is a self-hosted OWIN console app. Here's the startup:
public void Configuration(IAppBuilder appBuilder)
{
    appBuilder.UseCors(CorsOptions.AllowAll);
    var listener = (HttpListener)appBuilder.Properties["System.Net.HttpListener"];
    if (listener != null)
    {
        listener.AuthenticationSchemeSelectorDelegate = request =>
        {
            if (string.Compare(request.HttpMethod, "OPTIONS", StringComparison.OrdinalIgnoreCase) == 0)
            {
                return AuthenticationSchemes.Anonymous;
            }
            else
            {
                return AuthenticationSchemes.IntegratedWindowsAuthentication;
            }
        };
    }
    var config = new HttpConfiguration();
    config.Routes.MapHttpRoute("DefaultApi", "api/{controller}/{action}/{id}", new { id = RouteParameter.Optional });
    appBuilder.UseWebApi(config);
}
Here's the client-side XHR call:
var request = new XMLHttpRequest();
request.open('GET', 'http://xxxx:9000/api/test/something', true);
request.timeout = 10000;
request.withCredentials = true;
request.onload = function() {
    if (request.status >= 200 && request.status < 400) {
        console.log('done');
    } else {
        console.error('error');
    }
};
request.onerror = function() {
    // There was a connection error of some sort
};
request.send();
And the API Controller:
[Authorize]
[RoutePrefix("api/test")]
public class TestController : ApiController
{
    [HttpGet]
    [ActionName("something")]
    public IHttpActionResult Something()
    {
        return Ok();
    }
}
The two requests that return 401 and the one that returns a 400:
First 401:
Request URL: http://xxxx:9000/xxxx
Request Method: GET
Status Code: 401 / Unauthorized
Request Headers
Accept: */*
Accept-Encoding: gzip, deflate
Accept-Language: en-US
Authorization: Negotiate [token]
Connection: Keep-Alive
Host: xxxx:9000
Referer: http://xxxx/xxxx.html
User-Agent: Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 10.0; WOW64; Trident/7.0; .NET4.0C; .NET4.0E; .NET CLR 2.0.50727; .NET CLR 3.0.30729; .NET CLR 3.5.30729; InfoPath.3)
Response Headers
Content-Length: 0
Date: Fri, 22 Dec 2017 14:03:09 GMT
Server: Microsoft-HTTPAPI/2.0
WWW-Authenticate: Negotiate [token]
-------------
Second 401
Request URL: http://xxxx:9000/xxxx
Request Method: GET
Status Code: 401 / Unauthorized
Request Headers
Accept: */*
Accept-Encoding: gzip, deflate
Accept-Language: en-US
Authorization: Negotiate [token]
Connection: Keep-Alive
Host: xxxx:9000
Referer: http://xxxx/xxxx.html
User-Agent: Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 10.0; WOW64; Trident/7.0; .NET4.0C; .NET4.0E; .NET CLR 2.0.50727; .NET CLR 3.0.30729; .NET CLR 3.5.30729; InfoPath.3)
Response Headers
Content-Length: 0
Date: Fri, 22 Dec 2017 14:03:09 GMT
Server: Microsoft-HTTPAPI/2.0
WWW-Authenticate: Negotiate [token]
-----------
400
Request URL: http://xxxx:9000/xxxx
Request Method: GET
Status Code: 400 / Bad Request
Request Headers
Accept: */*
Accept-Encoding: gzip, deflate
Accept-Language: en-US
Authorization: Negotiate [token]
Connection: Keep-Alive
Host: xxxx:9000
Referer: http://xxxx/xxxx.html
User-Agent: Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 10.0; WOW64; Trident/7.0; .NET4.0C; .NET4.0E; .NET CLR 2.0.50727; .NET CLR 3.0.30729; .NET CLR 3.5.30729; InfoPath.3)
Response Headers
Content-Length: 0
Date: Fri, 22 Dec 2017 14:03:12 GMT
Server: Microsoft-HTTPAPI/2.0
Trusted site
My guess is that the remote machine is restricted by domain policies and your site is not seen as a trusted site on that machine.
Add your website to the appropriate security zone (Local intranet or Trusted sites), or ask IT to add it if policies are managed centrally, and the problem should be fixed.
Document mode
While you're at it, also check whether they're pushing an old document mode for IE11 to run in.
Enable CORS for WebAPI
I see you already call UseCors on OWIN's IAppBuilder, but additionally you can try to enable CORS on the HttpConfiguration instance:
config.EnableCors(new EnableCorsAttribute("*", "*", "GET, POST, OPTIONS, PUT, DELETE"));
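As a sketch only (assuming the Microsoft.AspNet.WebApi.Cors package is installed), the call would go on the HttpConfiguration in your Startup before UseWebApi:

var config = new HttpConfiguration();
// Allow any origin and header for the listed methods.
// Note: requests sent withCredentials generally need an explicit origin
// and SupportsCredentials = true rather than "*".
config.EnableCors(new EnableCorsAttribute("*", "*", "GET, POST, OPTIONS, PUT, DELETE"));
config.Routes.MapHttpRoute("DefaultApi", "api/{controller}/{action}/{id}", new { id = RouteParameter.Optional });
appBuilder.UseWebApi(config);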
I did the following to pass an HttpOnly cookie/token using XHR:
Use axios or similar library on the client and use a config containing withCredentials:
import axios from 'axios';

const config = {
    method: 'get',
    url: 'http://xxxx:9000/api/test/something',
    timeout: 5000,
    withCredentials: true
};
//if (setCookie === true) {
//    config.mode = 'no-cors'; // <= THIS IS IMPORTANT IF SETTING COOKIE!
//}
const response = await axios(config);
On the server set cors options:
const corsOptions = {
    origin: 'http://xxxx:9000', // Set explicit origin!
    methods: ['GET', 'POST', 'PUT', 'DELETE'], // Allowed methods!
    credentials: true // Enable server to see credentials.
};
source: xhr.spec.whatwg.org
I hope it works for you too; it took some time before it worked for me. If it fails, a window.postMessage workaround might be worth trying as a long shot. Finally, if you are using Windows, it is also possible that it is a local problem:
Whatever security zone your website is in (Internet Options >
Security) make sure you ENABLE the following setting in your zone:
Miscellaneous > Access data sources across domains.

HTTPS request/response in C#

I'm trying to view a captcha from a site, but I'm getting something wrong, because the answer is rejected when I submit it, even though I get the session and everything. So I decided to replay the traffic request by request, exactly the way the requests appear in Fiddler. But the site uses HTTPS, and I can't find a tutorial or explanation about HTTPS requests in C#. For example, the first request is:
CONNECT passport.abv.bg:443 HTTP/1.1
User-Agent: Mozilla/5.0 (Windows NT 6.1; rv:29.0) Gecko/20100101 Firefox/29.0
Connection: keep-alive
Host: passport.abv.bg
So I try to do it like this:
HttpWebRequest req0 = (HttpWebRequest)WebRequest.Create("https://passport.abv.bg:443/");
req0.Method = "CONNECT";
req0.KeepAlive = true;
req0.UserAgent = "Mozilla/5.0 (Windows NT 6.1; rv:29.0) Gecko/20100101 Firefox/29.0";
req0.Host = "passport.abv.bg";
HttpWebResponse resp0 = (HttpWebResponse)req0.GetResponse();
StreamReader Reader0 = new StreamReader(resp0.GetResponseStream());
string thePage0 = Reader0.ReadToEnd();
Reader0.Close();
And of course it doesn't work; I can't even see the result, as it's not a string, and the application freezes.
Can you give me some info please? I really can't find any explanation of how to make HTTPS requests in C#.
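For what it's worth, the CONNECT you see in Fiddler is the tunnel the browser opens through the proxy (Fiddler itself); you don't send it yourself. A minimal sketch of a plain HTTPS GET with HttpWebRequest, which sets up the TLS connection for you, assuming you just want the page:

HttpWebRequest req = (HttpWebRequest)WebRequest.Create("https://passport.abv.bg/");
req.Method = "GET";
req.UserAgent = "Mozilla/5.0 (Windows NT 6.1; rv:29.0) Gecko/20100101 Firefox/29.0";

using (HttpWebResponse resp = (HttpWebResponse)req.GetResponse())
using (StreamReader reader = new StreamReader(resp.GetResponseStream()))
{
    string page = reader.ReadToEnd(); // the response body as a string
}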

Model Binder not working on first request after App Pool recycle

I have a problem with the model binder. Each time after I recycle the App Pool, the first request to the server fails to perform model binding. Request.InputStream has the proper request parameters, but the Action parameters are null and Request.Params does not contain them either.
This appears to happen on the first POST request with JSON. The real first request is a GET to Index(), which returns the view. After that, the client makes an AJAX request that fails.
This is what my Action looks like:
public async Task<ActionResult> GetQueryCounters(string query) { ... }
This is the contents of Request.InputStream on the first request (it's the same on subsequent requests):
{"query":"SELECT count(*) FROM accounts"}
This is the raw request as seen in Fiddler:
POST https://[url]/GetQueryCounters HTTP/1.1
Host: [url]
Connection: keep-alive
Content-Length: 281
Origin: https://[url]
User-Agent: Mozilla/5.0 (Windows NT 6.2; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/32.0.1700.76 Safari/537.36
Content-Type: application/json; charset=UTF-8
Accept: application/json, text/javascript, */*; q=0.01
x-ajaxRequest: true
X-Requested-With: XMLHttpRequest
__RequestVerificationToken: [snip]
Referer: https://[url]
Accept-Encoding: gzip,deflate,sdch
Accept-Language: en-US,en;q=0.8,he;q=0.6
Cookie: [snip]
{"query":"SELECT count(*) FROM accounts"}
What could be causing this?

Which HTTP headers must be sent when calling WebRequest or WebClient?

I am creating a web robot. Usually the HTTP tools show quite a lot of header information, and some of these headers are read-only in .NET (e.g. Connection: keep-alive). How do I know which ones are required?
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
Accept-Charset: ISO-8859-9,utf-8;q=0.7,*;q=0.3
Accept-Encoding: gzip,deflate,sdch
Accept-Language: tr-TR,tr;q=0.8,en-US;q=0.6,en;q=0.4
Cache-Control: max-age=0
Content-Length: 269
Content-Type: application/x-www-form-urlencoded
Host: closure-compiler.appspot.com
Origin: null
Connection: keep-alive
User-Agent: Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/535.1 (KHTML, like Gecko) Chrome/14.0.794.0 Safari/535.1
Usually the code looks like the following. Someone pointed out that the following code missed setting Content-Type?
HttpWebRequest req = (HttpWebRequest)WebRequest.Create("http://closure-compiler.appspot.com/compile");
req.KeepAlive = true; // the Connection header is restricted; keep-alive is set via KeepAlive
req.Headers.Add("Cache-Control", "max-age=0");
req.Headers.Add("Origin", "null");
req.UserAgent = "Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/535.1 (KHTML, like Gecko) Chrome/14.0.794.0 Safari/535.1";
req.ContentType = "application/x-www-form-urlencoded";
req.Accept = "text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8";
req.Headers.Add("Accept-Encoding", "gzip,deflate,sdch");
req.Headers.Add("Accept-Language", "tr-TR,tr;q=0.8,en-US;q=0.6,en;q=0.4");
req.Headers.Add("Accept-Charset", "ISO-8859-9,utf-8;q=0.7,*;q=0.3");
req.Method = "POST";
Stream reqStr = req.GetRequestStream();
No headers are required for general requests. Particular resources may require different headers. The right way is to ask the owner of the resource what headers are needed. But if you want to cheat in some sort of game/forum, you will have to figure out the headers and other values yourself.
According to w3.org the simplest request should look somewhat similar to this:
GET <uri> CrLf
That's all.
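To illustrate, a minimal sketch with no hand-set headers at all (whether the server accepts it depends entirely on the resource; the URL is just the host from the question):

using (var client = new WebClient())
{
    // Only the headers .NET adds on its own (e.g. Host, Connection) are sent
    string html = client.DownloadString("http://closure-compiler.appspot.com/");
}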
