C# WebClient problem with looping download?

This is my implementation using WebClient. Supposedly this download should run continuously, but for some reason (debugging doesn't even help) I get one success on the first run, and then the rest fail. Does anyone know why?
for (int i = 1; i <= Count; i++)
{
    using (WebClient wc = new WebClient())
    {
        wc.Headers["Accept-Encoding"] = "gzip";
        wc.Headers["User-Agent"] = "Mozilla/4.0 (Compatible; Windows NT 5.1; MSIE 6.0) (compatible; MSIE 6.0; Windows NT 5.1; .NET CLR 1.1.4322; .NET CLR 2.0.50727)";
        byte[] arr = wc.DownloadData(url);
        if (arr.Length > 0)
            Console.WriteLine(i.ToString() + ": SUCCESS");
        else
            Console.WriteLine(i.ToString() + ": FAILED");
    }
}
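One thing worth checking: WebClient does not decompress gzip on its own, so setting Accept-Encoding: gzip by hand gives you the raw compressed bytes. A minimal sketch of a WebClient subclass (the class name is made up) that enables automatic decompression on the underlying HttpWebRequest instead:

// Hypothetical helper: a WebClient that handles gzip/deflate transparently.
class GzipWebClient : WebClient
{
    protected override WebRequest GetWebRequest(Uri address)
    {
        HttpWebRequest request = (HttpWebRequest)base.GetWebRequest(address);
        // The framework adds the Accept-Encoding header and inflates the body.
        request.AutomaticDecompression = DecompressionMethods.GZip | DecompressionMethods.Deflate;
        return request;
    }
}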

When I messed around and rewrote it as below, it worked, LOL! I don't know what to say anymore...
using (WebClient client = new WebClient())
{
    //manipulate request headers (optional)
    client.Headers.Add("user-agent", "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.2; .NET CLR 1.0.3705;)");
    //execute request and read response as string to console
    using (StreamReader reader = new StreamReader(client.OpenRead(url)))
    {
        string s = reader.ReadToEnd();
        //Console.WriteLine(s);
        Console.WriteLine("Vote " + i.ToString() + ": SUCCESS");
        i++; // i comes from an enclosing loop (elided)
    }
}
// *** Establish the request
HttpWebRequest loHttp = (HttpWebRequest)WebRequest.Create(url);
// *** Set properties
loHttp.Timeout = 10000; // 10 secs
loHttp.UserAgent = "Mozilla/4.0 (Compatible; Windows NT 5.1; MSIE 6.0) (compatible; MSIE 6.0; Windows NT 5.1; .NET CLR 1.1.4322; .NET CLR 2.0.50727)";
loHttp.Headers["Accept-Encoding"] = "gzip";
// *** Retrieve request info headers
HttpWebResponse loWebResponse = (HttpWebResponse)loHttp.GetResponse();
Encoding enc = Encoding.GetEncoding(1252); // Windows default code page
StreamReader loResponseStream = new StreamReader(loWebResponse.GetResponseStream(), enc);
string lcHtml = loResponseStream.ReadToEnd();
loWebResponse.Close();
loResponseStream.Close();
if (lcHtml.Length > 0)
{
    Console.WriteLine("Vote " + i.ToString() + ": SUCCESS");
    i++;
}
else
    Console.WriteLine("Vote " + i.ToString() + ": FAILED");
} // closes the enclosing loop (elided above)
Marked as community wiki, so if anyone knows why, please edit... :(

Use a debugging proxy like Fiddler, and you'll be able to see the HTTP transactions.
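If a proxy isn't an option, you can also surface the failure in code: DownloadData throws a WebException on a non-2xx status, and the error body often says why. A minimal sketch wrapping the call from the question (wc, url, and i are the question's variables):

try
{
    byte[] arr = wc.DownloadData(url);
    Console.WriteLine(i.ToString() + ": SUCCESS (" + arr.Length + " bytes)");
}
catch (WebException ex)
{
    HttpWebResponse resp = ex.Response as HttpWebResponse;
    if (resp != null)
    {
        // e.g. a 403 or 429 here would explain repeated failures after the first run
        Console.WriteLine(i.ToString() + ": FAILED, HTTP " + (int)resp.StatusCode);
        using (StreamReader reader = new StreamReader(resp.GetResponseStream()))
            Console.WriteLine(reader.ReadToEnd());
    }
    else
    {
        Console.WriteLine(i.ToString() + ": FAILED, " + ex.Status);
    }
}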

Related

C# download image from src without extension

I would like to download an image from this URL: http://squirlytraffic.com/surficon.php?ts=1491591235
I tried this code, but I don't see the image when I open the saved file.
using (WebClient client = new WebClient())
{
    client.DownloadFile("http://squirlytraffic.com/surficon.php?ts=1491591235", @"D:\image.jpg");
}
You need to set your credentials using the WebClient Credentials property. You can do this by assigning it an instance of NetworkCredential. See below:
using (WebClient client = new WebClient())
{
    client.Credentials = new NetworkCredential("user-name", "password");
    client.DownloadFile("url", @"file-location");
}
EDIT
If you don't want to hard-code a username and password, you can set the UseDefaultCredentials property of the WebClient to true. This will use the credentials of the currently logged-in user. From the documentation:
The Credentials property contains the authentication credentials used to access a resource on a host. In most client-side scenarios, you should use the DefaultCredentials, which are the credentials of the currently logged on user. To do this, set the UseDefaultCredentials property to true instead of setting this property.
Which would mean you could amend the above code to:
using (WebClient client = new WebClient())
{
    client.UseDefaultCredentials = true;
    client.DownloadFile("url", @"file-location");
}
Try it this way; these parameters are passed when you log in:
// Build the login form fields.
StringBuilder postData = new StringBuilder();
postData.Append("login=" + HttpUtility.UrlEncode("username") + "&");
postData.Append("password=" + HttpUtility.UrlEncode("password") + "&");
postData.Append("Submit=" + HttpUtility.UrlEncode("Login"));
ASCIIEncoding ascii = new ASCIIEncoding();
byte[] postBytes = ascii.GetBytes(postData.ToString());

// Shared cookie container so the session survives between requests.
CookieContainer cc = new CookieContainer();

// POST the login form.
HttpWebRequest webReq = (HttpWebRequest)WebRequest.Create("http://squirlytraffic.com/members.php");
webReq.Method = "POST";
webReq.ContentType = "application/x-www-form-urlencoded";
webReq.ContentLength = postBytes.Length;
webReq.UserAgent = "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1; SV1; .NET CLR 1.1.4322; .NET CLR 2.0.50727)";
webReq.CookieContainer = cc;
Stream postStream = webReq.GetRequestStream();
postStream.Write(postBytes, 0, postBytes.Length);
postStream.Flush();
postStream.Close();
HttpWebResponse res = (HttpWebResponse)webReq.GetResponse();

// Request the image with the same (now authenticated) cookies.
HttpWebRequest ImgReq = (HttpWebRequest)WebRequest.Create("http://squirlytraffic.com/surficon.php?ts=1491591235");
ImgReq.Method = "GET";
ImgReq.UserAgent = "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1; SV1; .NET CLR 1.1.4322; .NET CLR 2.0.50727)";
ImgReq.CookieContainer = cc;
HttpWebResponse ImgRes = (HttpWebResponse)ImgReq.GetResponse();
Stream Img = ImgRes.GetResponseStream();
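Note that the snippet stops at GetResponseStream; nothing is written to disk yet. A minimal sketch of the missing last step (the target path is a placeholder):

// Copy the authenticated image stream to a file.
using (FileStream file = File.Create(@"D:\image.jpg"))
{
    byte[] buffer = new byte[4096];
    int read;
    while ((read = Img.Read(buffer, 0, buffer.Length)) > 0)
        file.Write(buffer, 0, read);
}
Img.Close();
ImgRes.Close();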

How to query Mixpanel from C#

I am pretty new to web access from C# and completely new to Mixpanel. I am trying to run a query with this code:
using (WebClient wc = new WebClient())
{
    wc.Headers.Add("user-agent", "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.2; .NET CLR 1.0.3705;)");
    byte[] creds = UTF8Encoding.UTF8.GetBytes("<my API secret>:");
    wc.Headers.Add("Authorization", "Basic " + Convert.ToBase64String(creds));
    var reqparm = new System.Collections.Specialized.NameValueCollection();
    reqparm.Add("script", "function main() { return Events({\"from_date\":\"2016-10-01\",\"to_date\":\"2016-10-167\"}).reduce(mixpanel.reducer.count()); }");
    byte[] responsebytes = wc.UploadValues("https://mixpanel.com/api/2.0/jql", "POST", reqparm);
}
The query is taken directly from this Mixpanel sample:
function main()
{
    return Events({
        from_date: "2016-01-04",
        to_date: "2016-01-04"
    }).reduce(mixpanel.reducer.count());
}
I've tried lots of variations on the above, but UploadValues always returns 400 (Bad Request). What am I doing wrong?
TIA
There was an error in one of the dates that I didn't notice ("2016-10-167").
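For reference, the corrected parameter (assuming the end date was meant to be 2016-10-16):

reqparm.Add("script",
    "function main() { return Events({\"from_date\":\"2016-10-01\",\"to_date\":\"2016-10-16\"})"
    + ".reduce(mixpanel.reducer.count()); }");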

Downloading files from Web using basic URL Authorization

I can successfully download a file using the following link in a browser:
https://userbob:qwerty@assoc-datafeeds-na.amazon.com/datafeed/getReport?filename=feedreport-1265.xml.gz
Now I need to do this in my C# application.
Solution 1: using WebClient
string url = "https://assoc-datafeeds-na.amazon.com/datafeed/getReport?filename=feedreport-1265.xml.gz";
using (var webClient = new WebClient())
{
    string authInfo = Convert.ToBase64String(Encoding.Default.GetBytes("userbob:qwerty"));
    webClient.Headers[HttpRequestHeader.Authorization] = "Basic " + authInfo;
    webClient.Headers[HttpRequestHeader.ContentType] = "application/octet-stream";
    webClient.Headers[HttpRequestHeader.UserAgent] = "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.2; .NET CLR 1.0.3705;)";
    var data = webClient.DownloadData(url);
}
DownloadData throws an exception: The remote server returned an error: (500) Internal Server Error.
Solution 2: using WebRequest
var request = WebRequest.Create(url);
string authInfo = Convert.ToBase64String(Encoding.Default.GetBytes("userbob:qwerty"));
((HttpWebRequest)request).CookieContainer = new CookieContainer();
request.Headers["Authorization"] = "Basic " + authInfo;
((HttpWebRequest)request).UserAgent = "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.2; .NET CLR 1.0.3705;)";
((HttpWebRequest)request).ContentType = "application/octet-stream";
var response = request.GetResponse();
The same exception is thrown on the last line. I feel I am missing some header or setting needed to fully mimic the browser's behavior.
Interestingly, if I omit the Authorization header in either solution, I get a 401 Unauthorized exception instead. That makes me think authorization succeeds and the problem lies elsewhere. What am I missing?
You have to pass the token converted to Base64:
var username = "xxxxxxx";
var password = "yyyyyyy";
var token = Convert.ToBase64String(ASCIIEncoding.ASCII.GetBytes(username + ":" + password));
request.Headers.Add("Authorization", "Basic " + token);
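Putting that together with the WebRequest approach from the question (URL, path, and credentials are placeholders):

var username = "xxxxxxx";
var password = "yyyyyyy";
var token = Convert.ToBase64String(Encoding.ASCII.GetBytes(username + ":" + password));

var request = (HttpWebRequest)WebRequest.Create("https://assoc-datafeeds-na.amazon.com/datafeed/getReport?filename=feedreport-1265.xml.gz");
request.Headers.Add("Authorization", "Basic " + token);
using (var response = (HttpWebResponse)request.GetResponse())
using (var stream = response.GetResponseStream())
using (var file = File.Create(@"C:\feedreport-1265.xml.gz"))
{
    stream.CopyTo(file); // .NET 4+
}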

Receiving incomplete data when using HttpWebResponse::GetResponseStream method

Is HttpWebResponse::GetResponseStream() guaranteed to get all data contained in an HTTP response? Or do I need to create some kind of loop and wait to make sure all data is being sent from the server to which I'm connected? The code below successfully grabs a response about 50% of the time.
HttpWebRequest req = (HttpWebRequest)WebRequest.Create("http://JohnDoeServerSite.com");
req.Method = "POST";
req.ContentType = @"text/xml; charset=utf-8";
req.Host = "JohnDoeServerSite.com"; // Host takes a bare host name, not a URL
req.UserAgent = "Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 5.1; Trident/4.0; .NET CLR 1.1.4322; .NET CLR 2.0.50727; .NET CLR 3.0.4506.2152; .NET CLR 3.5.30729; InfoPath.1)";
using (StreamWriter sw = new StreamWriter(req.GetRequestStream()))
{
    sw.Write(xml);
}
string result;
using (HttpWebResponse res = (HttpWebResponse)req.GetResponse())
using (Stream st = res.GetResponseStream())
{
    Thread.Sleep(10000); // Added to see if additional data would be sent (perhaps?)
    using (StreamReader sr = new StreamReader(st, Encoding.UTF8))
    {
        result = sr.ReadToEnd();
    }
}
After looking into this more, it turns out the server was not sending the data in its entirety. It seems Michael Yoon is right in saying that everything should be returned.
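For what it's worth, StreamReader.ReadToEnd already loops internally until the server closes the stream, so a hand-rolled read loop behaves no differently; it just makes the boundary explicit. A minimal sketch of the equivalent manual loop, reusing res from the code above:

string result;
using (Stream st = res.GetResponseStream())
using (MemoryStream buffer = new MemoryStream())
{
    byte[] chunk = new byte[8192];
    int read;
    // Read returns 0 only when the server has finished sending.
    while ((read = st.Read(chunk, 0, chunk.Length)) > 0)
        buffer.Write(chunk, 0, read);
    result = Encoding.UTF8.GetString(buffer.ToArray());
}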

How can I login to ASP.Net Forms Authenticated site using C#?

I am trying to screen-scrape a website that uses ASP.NET Forms Authentication. I have the following code, which makes a GET request to obtain the cookies; I then want to POST the login info, together with the cookies I just got, to the login.aspx page.
When I watch what is submitted to the login.aspx page with Fiddler, I can see the posted data, but for cookies it says "This request did not send any cookie data."
If I log into the app using Internet Explorer, I can see that both the cookies and the posted data are submitted to the login.aspx page, and everything works fine.
Yet if I loop through the CookieContainer, I can print out the cookies that I assume should be sent along with the request, using this block:
foreach (System.Net.Cookie cook in getCookies.CookieContainer.GetCookies(new Uri("https://app.example.com")))
{
    Console.WriteLine(cook.Name + ": " + cook.Value);
}
What is it that I am doing wrong that results in the cookies not being sent with my request?
public void Login()
{
    HttpWebRequest getCookies = (HttpWebRequest)WebRequest.Create("https://app.example.com");
    CookieContainer cookieJar = new CookieContainer();
    getCookies.Accept = "image/gif, image/jpeg, image/pjpeg, image/pjpeg, application/vnd.ms-xpsdocument, application/x-ms-application, application/x-ms-xbap, application/xaml+xml, */*";
    getCookies.UserAgent = "Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 5.2; Trident/4.0; .NET CLR 1.1.4322; .NET CLR 2.0.50727; .NET CLR 3.0.4506.2152; .NET CLR 3.5.30729; .NET4.0E; .NET4.0C)";
    getCookies.Headers.Add("Accept-Encoding", "gzip, deflate");
    getCookies.AllowAutoRedirect = true;
    getCookies.CookieContainer = cookieJar;
    using (HttpWebResponse cookieResponse = getCookies.GetResponse() as HttpWebResponse)
    {
        StreamReader responseReader = new StreamReader(cookieResponse.GetResponseStream());
        string responseData = responseReader.ReadToEnd();
        string ViewState = this.GetViewStateFromHtml(responseData);

        getCookies = (HttpWebRequest)WebRequest.Create("https://app.example.com/Membership/Login.aspx?ReturnUrl=%2fHome%2fHomeSummary.aspx");
        getCookies.Method = "POST";
        getCookies.ContentType = "application/x-www-form-urlencoded";
        getCookies.Accept = "text/html, application/xhtml+xml, */*";
        getCookies.UserAgent = "Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 5.2; Trident/4.0; .NET CLR 1.1.4322; .NET CLR 2.0.50727; .NET CLR 3.0.4506.2152; .NET CLR 3.5.30729; .NET4.0E; .NET4.0C)";
        getCookies.Headers.Add("Accept-Encoding", "gzip, deflate");
        getCookies.AllowAutoRedirect = true;
        ASCIIEncoding encoding = new ASCIIEncoding();
        byte[] byte1 = encoding.GetBytes(this.GetPostData(Username, Password));
        getCookies.ContentLength = byte1.Length;
        StreamWriter requestWriter = new StreamWriter(getCookies.GetRequestStream());
        requestWriter.Write(this.GetPostData(Username, Password));
        requestWriter.Close();
        getCookies.CookieContainer = cookieJar;
        foreach (System.Net.Cookie cook in getCookies.CookieContainer.GetCookies(new Uri("https://app.example.com")))
        {
            Console.WriteLine(cook.Name + ": " + cook.Value);
        }
        using (HttpWebResponse postResponse = (HttpWebResponse)getCookies.GetResponse())
        {
            StreamReader postLoginResponseReader = new StreamReader(postResponse.GetResponseStream());
            string postLoginResponseData = postLoginResponseReader.ReadToEnd();
            File.WriteAllText(@"C:\postLoginResponse.txt", postLoginResponseData);
        }
        Console.WriteLine(@"Check C:\postLoginResponse.txt");
        Console.ReadLine();
    }
}
I don't see where you are adding your forms auth cookie to the HttpWebRequest cookie container.
In the past I have done this:
var request = (HttpWebRequest)WebRequest.Create(remoteFilename);
request.CookieContainer = new CookieContainer();
request.CookieContainer.Add(new Cookie(".ASPXAUTH", "71AE9C8F6CFDC86BD7FD3AD7B214C4E1", "/", "build.mercaridirect.com.au"));
Minor point: you should rename your getCookies variable to webRequest or similar; it improves readability.
This is a portion of a larger code snippet, so please let me know if you need more of it. Essentially, you can code directly against the MembershipProvider and the FormsAuthentication classes to simulate the login:
bool validated = Membership.ValidateUser(uname, pwd);
if (validated)
{
    if (Request.QueryString["ReturnUrl"] != null)
    {
        FormsAuthentication.RedirectFromLoginPage(uname, false);
    }
    else
    {
        FormsAuthentication.SetAuthCookie(uname, false);
    }
    return;
}
//Response.Write("Failed to authenticate, invalid credentials.");
Hope this helps.
