Why is WebClient (System.Net) getting from a URL twice? (C#)

I've got a method like this:
private string getFromURL(string url)
{
    // WebClient is IDisposable; dispose it so the connection is released.
    using (WebClient myClient = new WebClient())
    {
        return myClient.DownloadString(url);
    }
}
using WebClient from System.Net. It appears to be hitting the URL twice (I'm also watching the log of the web server in question, and it records two hits). Any idea why this might be?
EDIT: the answer was in fact programmer error. I no longer have any reason to think this is behaving strangely. Thanks for the answers.

If the URL is subtly different in the two cases, it could be responding to an HTTP redirect.

My guess is that it's doing a HEAD before the GET. Does your log show the HTTP method being used?

Check out tcpmon:
https://tcpmon.dev.java.net/
It's a Java tool, but it's easy to run without being a "Java" person.
Chances are there's a redirect or something pointing back to itself, so you should be able to see whether the two HTTP requests are identical or slightly different.
Also, check out curl (available through Cygwin): you can send the requests from there and see if there's a redirect involved.
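If you'd rather check from C# itself, here's a minimal sketch (the URL is a placeholder): it turns off automatic redirect handling so a 301/302 shows up as the status code, along with the Location header, instead of being followed silently.
using System;
using System.Net;

class RedirectCheck
{
    static void Main()
    {
        // Placeholder URL; substitute the one your WebClient is fetching.
        var request = (HttpWebRequest)WebRequest.Create("http://example.com/page");
        request.AllowAutoRedirect = false; // surface 3xx responses instead of following them

        using (var response = (HttpWebResponse)request.GetResponse())
        {
            Console.WriteLine((int)response.StatusCode);     // 301/302 means a redirect
            Console.WriteLine(response.Headers["Location"]); // where it points, if anywhere
        }
    }
}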

Related

New Server Security Causing Issues To API Response

One of my old projects/apps had been working fine for years; very recently the client reported that the app no longer works due to an API response issue.
It's just a GET request to an API with some parameters.
Until the issue occurred, it returned the following response:
,,3,1669179307,0,
But recently it shows the following response (note that nothing in the source PHP/code files has changed since the project started):
<html><title>You are being redirected...</title>
<noscript>Javascript is required. Please enable javascript before you are allowed to see this page.</noscript>
<script>var s={},u,c,U,r,i,l=0,a,e=eval,w=String.fromCharCode,sucuri_cloudproxy_js='',S='bT0nP2RUNCcuc3Vic3RyKDMsIDEpICsgJycgKyAKJz9iVGYnLnN1YnN0cigzLCAxKSArICcnICsgCidIcExjJy5zdWJzdHIoMywgMSkgK1N0cmluZy5mcm9tQ2hhckNvZGUoNTYpICsgJ3FAYycuY2hhckF0KDIpKyAnJyArIAonNycgKyAgICcnICsgClN0cmluZy5mcm9tQ2hhckNvZGUoMHg2MykgKyAgJycgKycnKyIyc3VjdXIiLmNoYXJBdCgwKSsiYyIgKyAiNHNlYyIuc3Vic3RyKDAsMSkgKyAiZW0iLmNoYXJBdCgwKSArICAnJyArIAoiM2MiLmNoYXJBdCgwKSArICIiICtTdHJpbmcuZnJvbUNoYXJDb2RlKDk3KSArICJlIi5zbGljZSgwLDEpICsgICcnICsnZCcgKyAgJ0RiJy5zbGljZSgxLDIpKyAnJyArJycrJ3hJMScuY2hhckF0KDIpK1N0cmluZy5mcm9tQ2hhckNvZGUoMHgzMSkgKyAncTAwJy5jaGFyQXQoMikrU3RyaW5nLmZyb21DaGFyQ29kZSgweDYzKSArICIiICsnSHVIZScuc3Vic3RyKDMsIDEpICsiN3N1Ii5zbGljZSgwLDEpICsgIjhzdSIuc2xpY2UoMCwxKSArICdjJyArICAiZHN1Y3VyIi5jaGFyQXQoMCkrJ2EnICsgICIiICsiY3N1Y3VyIi5jaGFyQXQoMCkrImRzZWMiLnN1YnN0cigwLDEpICsgU3RyaW5nLmZyb21DaGFyQ29kZSg0OSkgKyAgJycgKyAKU3RyaW5nLmZyb21DaGFyQ29kZSgweDMzKSArICAnJyArJycrJ2QnICsgICAnJyArIAonMScgKyAgJyc7ZG9jdW1lbnQuY29va2llPSdzdXMnLmNoYXJBdCgyKSsndXN1YycuY2hhckF0KDApKyAnYycrJ3UnLmNoYXJBdCgwKSsncnN1Y3VyaScuY2hhckF0KDApICsgJ2knKycnKydzdWN1cmlfJy5jaGFyQXQoNikrJ2MnKycnKydsc3VjdXJpJy5jaGFyQXQoMCkgKyAnb3N1Jy5jaGFyQXQoMCkgKyd1JysnZHMnLmNoYXJBdCgwKSsnc3AnLmNoYXJBdCgxKSsncnN1Y3UnLmNoYXJBdCgwKSAgKydvJysneHN1Y3VyJy5jaGFyQXQoMCkrICd5Jysnc3VjdXJfJy5jaGFyQXQoNSkgKyAnc3V1Jy5jaGFyQXQoMikrJ3UnKydpJysnJysnZHN1Y3VyJy5jaGFyQXQoMCkrICdfc3UnLmNoYXJBdCgwKSArJzQnKycnKydzdWN1cmMnLmNoYXJBdCg1KSArICc2c3VjJy5jaGFyQXQoMCkrICcwc3VjdXInLmNoYXJBdCgwKSsgJ3N1Y3VyaTUnLmNoYXJBdCg2KSsnc3U0Jy5jaGFyQXQoMikrJ3N1Y3VyNCcuY2hhckF0KDUpICsgJ2YnKycyc3VjdXJpJy5jaGFyQXQoMCkgKyAiPSIgKyBtICsgJztwYXRoPS87bWF4LWFnZT04NjQwMCc7IGxvY2F0aW9uLnJlbG9hZCgpOw==';L=S.length;U=0;r='';var A='ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/';for(u=0;u<64;u++){s[A.charAt(u)]=u;}for(i=0;i<L;i++){c=s[S.charAt(i)];U=(U<<6)+c;l+=6;while(l>=8){((a=(U>>>(l-=8))&0xff)||(i<(L-2)))&&(r+=w(a));}}e(r);</script></html>
Here is the curl screenshot:
And here is the Postman screenshot:
When I check the URL in a browser it shows the expected result, though in the devtools network tab it looks like the page is loaded twice: the first load returns the error (HTML/JS) response (marked red), and the second returns the expected response (marked green). So it seems that calling the URL directly from curl/Postman/C# fails, but because the browser can follow the redirect, it passes.
Here is the browser screenshot:
Sorry, I added several screenshots to give a better idea of what is happening.
And here is the URL in question:
https://simpleclienttracking.com/membershipmanager/remotelogvisit.php?locID=1&orgID=1&deptID=1&barcode=8346420
Now my question is: how can I use the API code/file to get the direct response I was getting earlier? Do I need to pass a header, update/modify the server's .htaccess file, or something else?
To test the error more deeply, I tried another URL from another hosting provider. In that case I am sending a POST request to a URL, and this server responds with something slightly different, but the core looks the same: a redirect!
Here is the response from the new/other server:
<script>document.cookie = "humans_21909=1"; document.location.reload(true)</script>
So it looks like the hosting providers have applied some kind of security against direct URL access?
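For the second server at least, the cookie in the script above (humans_21909=1) is a static value, so in principle a non-browser client could detect that response, set the cookie itself, and retry. A rough C# sketch of that idea (the endpoint is a placeholder; this cannot work for the Sucuri response above, whose cookie value is computed by the obfuscated JavaScript):
using System;
using System.IO;
using System.Net;

class ChallengeRetry
{
    static string Fetch(Uri uri, CookieContainer cookies)
    {
        var request = (HttpWebRequest)WebRequest.Create(uri);
        request.CookieContainer = cookies;
        using (var response = (HttpWebResponse)request.GetResponse())
        using (var reader = new StreamReader(response.GetResponseStream()))
        {
            return reader.ReadToEnd();
        }
    }

    static void Main()
    {
        var uri = new Uri("https://example.com/api.php?param=1"); // placeholder endpoint
        var cookies = new CookieContainer();

        string body = Fetch(uri, cookies);
        if (body.Contains("humans_21909"))
        {
            // The challenge asks for this exact static cookie: set it and retry.
            cookies.Add(uri, new Cookie("humans_21909", "1"));
            body = Fetch(uri, cookies);
        }
        Console.WriteLine(body);
    }
}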
Thanks in advance for any help.
Best regards

Poor HTTP request performance unless using Fiddler

Edit: after talking it over with a couple of IT guys, I've realized it's only the POLL requests that are having issues. I'm fetching the images via GET requests that go through quickly and as expected, whether or not the POLL messages are having issues.
I'm working on a client to interface with an IP camera in C#.
It's all working dandy except that I get really poor HTTP request performance when I'm not using Fiddler (a web traffic inspection proxy).
I'm using an HttpClient to send my requests; this is the code that actually initiates the poll request:
public async Task<bool> SetPoll(int whichpreset)
{
    string action = "set";
    string resource = presetnames[whichpreset];
    string value = presetvalues[whichpreset];
    int requestlen = 24 + action.Length + resource.Length + value.Length;

    var request = new HttpRequestMessage
    {
        RequestUri = new Uri("http://" + ipadd + "/res.php"),
        Method = HttpMethod.Post,
        Content = new FormUrlEncodedContent(new[]
        {
            new KeyValuePair<string, string>("action", action),
            new KeyValuePair<string, string>("resource", resource),
            new KeyValuePair<string, string>("value", value)
        }),
        Version = new System.Version("1.1"),
    };

    HttpResponseMessage mess = await client.SendAsync(request);
    return mess.IsSuccessStatusCode;
}
When Fiddler is up, all my http requests go through quickly, and without a hitch (I'm making about 20 post requests upon connecting). Without it, they only go through as expected ~1/5 of the time, and the rest of the time they're never completed, which is a big issue. Additionally, the initial connection request often takes 1+ minutes when not using Fiddler, and consistently only takes a few seconds when I am, so it doesn't seem to be a timing issue of sending requests too soon after connecting.
This leads me to think that the request, as written, is fairly poorly behaved, and perhaps Fiddler's requests behave better. I'm a newbie to HTTP, so I'm not sure exactly why this would be. My questions:
does Fiddler modify HTTP requests (e.g. different headers, etc.) as they are sent to the server?
even if it doesn't modify the requests, are Fiddler's requests in some way better behaved than what I'd be getting out of .NET 4.0 in C# in VS2013?
is there a way to improve the behavior of my requests to emulate whatever Fiddler is doing? Ideally while still working within the stock HTTP namespace, but I'm open to using others if necessary.
I'll happily furnish more code if helpful (though tomorrow).
Inserting
await Task.Delay(50);
between all requests fixed the problem (I haven't yet tested different delays). Because Fiddler smoothed the problem out, I suspect the camera has an issue with requests sent in too quick a succession, and Fiddler relayed them at a more tolerable rate. Because it's an async await, there is no noticeable performance impact, other than it taking a little while to get through all ~20 (30 now) requests on startup, which is not an issue for my app.
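For illustration, roughly how the delay slots into the startup sequence (assuming presetnames is an array, as the indexing in the code above suggests):
// Send the startup requests with a small pause between them so the
// camera never sees a rapid-fire burst.
public async Task SetAllPolls()
{
    for (int i = 0; i < presetnames.Length; i++) // presetnames/SetPoll from the code above
    {
        await SetPoll(i);
        await Task.Delay(50); // the delay that worked here; may need tuning
    }
}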
Fiddler installs itself as a system proxy. It is possible that the Fiddler process has better access to the network than your application's process.
Fiddler might be configured to bypass your normal system proxy (check the Gateway tab under Options), and perhaps the normal system proxy has issues.
Fiddler might be running as a different user with a different network profile, e.g. could be using a different user cert store or different proxy settings such as exclusion list.
Fiddler might be configured to override your hosts file and your hosts file may contain errors.
Your machine might be timing out trying to reach the servers necessary to check for certificate revocation. Fiddler has CRL checking disabled by default (check the HTTPS tab).
Fiddler has a ton of options and the above are just some guesses.
My recommendation would be to check and/or toggle the above options to see if any of them apply. If you can't get anywhere, you may have to forget Fiddler exists and troubleshoot your network problems independently, e.g. by using NSLOOKUP, PING, TRACERT, and possibly TELNET to isolate the problem.
There is nothing in your code sample that suggests a code flaw that could cause intermittent network failures of the kind you are describing. In fact it is hard to imagine any code flaw that would cause that sort of behavior.

Fiddler speeds up HttpClient requests even without the Reuse Connections option

My app uses the .NET 4.5 HttpClient, sending a Keep-Alive header like this:
Client.DefaultRequestHeaders.Add("Keep-Alive", "true");
So far the HttpClient has worked and the speed was OK, but I recently discovered in a test program (it sends as many requests as possible over multiple threads to an HTTPS server and outputs the requests-per-second rate, to test performance) that it is around three times faster when Fiddler is running, even without the reuse-connections option (no difference). I researched this topic, but there were only hints pointing to the Keep-Alive header and the reuse-connections option. So my question is: why does Fiddler speed up the app, and what do I have to change in my code to make the requests faster?
Any help will be greatly appreciated.
(Please add a comment if more information is needed.)
OK, I found the problem after looking at the similar WebClient question below: if you have problems like mine, just set ServicePointManager.DefaultConnectionLimit = 300; (or something similar) before you make requests in your code.
WebClient is very slow
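A minimal sketch of that fix in context (300 is the value from the answer above; the right limit depends on your workload, and it must be set before the first request goes out):
using System.Net;
using System.Net.Http;

class ApiClient
{
    static readonly HttpClient Client = CreateClient();

    static HttpClient CreateClient()
    {
        // Raise the per-host connection cap (the default is 2 in client apps)
        // before any request is made; once a connection pool exists for a host,
        // the old limit sticks.
        ServicePointManager.DefaultConnectionLimit = 300;

        var client = new HttpClient();
        client.DefaultRequestHeaders.Add("Keep-Alive", "true"); // as in the question
        return client;
    }
}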

Is there any way to check a rewritten URL in the browser?

In my project I don't want to show query string values to users, so I used URL rewriting in ASP.NET. My URL looks like this:
http://localhost/test/default.aspx?id=1
to
http://localhost/test/general.aspx
The first URL is rewritten to the second, but the server still executes the default.aspx page with that query string value. This is working fine.
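For context, a rough sketch of one common way such a rewrite is wired up in Global.asax (your project may use a rewrite module or routing instead; the paths mirror the example URLs above):
using System;
using System.Web;

public class Global : HttpApplication
{
    protected void Application_BeginRequest(object sender, EventArgs e)
    {
        // The user requests the friendly URL; the server quietly executes the
        // real page. The address bar never changes because no redirect is sent.
        if (Request.Path.EndsWith("/test/general.aspx", StringComparison.OrdinalIgnoreCase))
        {
            Context.RewritePath("/test/default.aspx?id=1");
        }
    }
}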
But my question is: is there any way the user can find the original URL in the browser?
The answer is no.
The browser can't tell what actual script ended up servicing the request - it only knows what it sent to the server (unless the server issued a redirect, but then the browser would make a new request to the redirect target).
Since URL rewriting takes an incoming request and routes it to a different resource, I believe the answer is yes. Somewhere in your web traffic you are requesting http://localhost/test/default.aspx?id=1 and it is being rewritten as the new request http://localhost/test/general.aspx.
While this may hide the original request from displaying in the browser, at some point it did send that original URL as an HTTP GET.
As suggested, use Firebug or Fiddler to sniff the traffic.
I figured out the answer to my question. The rewritten URLs can be found easily: if you view the source of the page in the browser, you can see the original URL with the query string values.

C# Link Analyzer getting Bad Request Errors?

I have a rather simple program which takes in a URL and spits out the first place it redirects to. Anyhow, I've been testing it on some links and noticed it gets 400 errors on some URLs. I tried testing such URLs by pasting them into my browser, and that worked fine.
static string getLoc(string curLoc, out string StatusDescription, int timeoutmillseconds)
{
    HttpWebRequest x = (HttpWebRequest)WebRequest.Create(curLoc);
    x.UserAgent = "Opera/9.52 (Windows NT 6.0; U; en)";
    x.Timeout = timeoutmillseconds;
    x.AllowAutoRedirect = false; // stop at the first redirect instead of following it
    HttpWebResponse y = null;
    try
    {
        y = (HttpWebResponse)x.GetResponse(); // at this point it throws a 400 Bad Request exception
        StatusDescription = y.StatusDescription;
        return y.Headers["Location"]; // the first redirect target, if any
    }
    finally
    {
        if (y != null) y.Close();
    }
}
I think something weird is happening with cookies. It turns out that, due to the way I was testing the link, the cookies necessary for it to work were in my browser but not in the program. Some of the links I was testing manually (when the other links failed) were generating cookies.
What happened is slightly convoluted, but the short answer is that my browser had cookies, the program did not, and maintaining the cookies between redirects did not solve the problem.
The underlying problem is that the link I am testing requires an extra parameter, a cookie, or both. I was avoiding both in my tests, since the parameter/cookie were for tracking and I didn't want to break the tracking.
In short, I know what the problem is but it's not a solvable problem.
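For reference, a sketch of what "maintaining the cookies between redirects" means here: following redirects by hand while sharing one CookieContainer across hops (as said above, this alone did not fix it in this case; the maxHops guard is added here to avoid redirect loops):
static string getFinalLoc(string startUrl, int maxHops)
{
    var cookies = new CookieContainer(); // shared across every hop
    string current = startUrl;
    for (int hop = 0; hop < maxHops; hop++)
    {
        var req = (HttpWebRequest)WebRequest.Create(current);
        req.CookieContainer = cookies;  // replay any Set-Cookie from earlier hops
        req.AllowAutoRedirect = false;
        using (var resp = (HttpWebResponse)req.GetResponse())
        {
            string next = resp.Headers["Location"];
            if (next == null)
                return current; // no more redirects
            current = new Uri(new Uri(current), next).ToString(); // resolve relative Location
        }
    }
    return current;
}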
