Downloading files from the web using basic URL authorization - C#

I can successfully download a file using the following link in a browser:
https://userbob:qwerty@assoc-datafeeds-na.amazon.com/datafeed/getReport?filename=feedreport-1265.xml.gz
Now I need to do this in my C# application.
Solution 1: using WebClient
string url = $"https://assoc-datafeeds-na.amazon.com/datafeed/getReport?filename=feedreport-1265.xml.gz";
using (var webClient = new WebClient())
{
string authInfo = Convert.ToBase64String(Encoding.Default.GetBytes("userbob:qwerty"));
webClient.Headers[HttpRequestHeader.Authorization] = "Basic " + authInfo;
webClient.Headers[HttpRequestHeader.ContentType] = "application/octet-stream";
webClient.Headers[HttpRequestHeader.UserAgent] = "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.2; .NET CLR 1.0.3705;)";
var data = webClient.DownloadData(url);
}
DownloadData throws an exception: The remote server returned an error: (500) Internal Server Error.
Solution 2: using WebRequest
var request = WebRequest.Create(url);
string authInfo = Convert.ToBase64String(Encoding.Default.GetBytes("userbob:qwerty"));
((HttpWebRequest)request).CookieContainer = new CookieContainer();
request.Headers["Authorization"] = "Basic " + authInfo;
((HttpWebRequest)request).UserAgent = "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.2; .NET CLR 1.0.3705;)";
((HttpWebRequest)request).ContentType = "application/octet-stream";
var response = request.GetResponse();
The same exception on the last line. I feel I am missing some header or setting to fully mimic browser behavior.
Interestingly, if I omit the Authorization header in either solution, I get a 401 Unauthorized exception instead. That makes me think authorization succeeds and the problem is something else. What am I missing?

You have to pass the token after converting it to Base64:
var username = "xxxxxxx";
var password = "yyyyyyy";
var token = Convert.ToBase64String(ASCIIEncoding.ASCII.GetBytes(username + ":" + password));
request.Headers.Add("Authorization", "Basic " + token);
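The question's code already builds exactly this token, so the 500 most likely comes from something else. One candidate worth checking: the original code sets a Content-Type header on a GET request, which describes a request body that doesn't exist, and some servers answer that with a 500 rather than ignore it. Below is a sketch (not tested against the real feed; the host and credentials are the question's placeholders) using HttpClient that sends only Authorization and Accept:

```csharp
using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;

class Downloader
{
    // Base64-encodes "user:password" for a Basic Authorization header.
    public static string BasicToken(string user, string password) =>
        Convert.ToBase64String(Encoding.ASCII.GetBytes(user + ":" + password));

    public static byte[] Download(string url, string user, string password)
    {
        using (var client = new HttpClient())
        {
            client.DefaultRequestHeaders.Authorization =
                new AuthenticationHeaderValue("Basic", BasicToken(user, password));
            // Accept anything; note there is no Content-Type header here,
            // since this GET request carries no body.
            client.DefaultRequestHeaders.Accept.Add(
                new MediaTypeWithQualityHeaderValue("*/*"));
            return client.GetByteArrayAsync(url).Result;
        }
    }

    static void Main()
    {
        // Print the token only; the real endpoint needs valid credentials.
        Console.WriteLine(BasicToken("userbob", "qwerty"));
    }
}
```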

Related

C# download image from src without extension

I would like to download an image from this url http://squirlytraffic.com/surficon.php?ts=1491591235
I tried this code, but I don't see the image when I open the downloaded file.
using (WebClient client = new WebClient())
{
client.DownloadFile("http://squirlytraffic.com/surficon.php?ts=1491591235", @"D:\image.jpg");
}
You need to set your credentials using the WebClient Credentials property. You can do this by assigning it an instance of NetworkCredential. See below:
using (WebClient client = new WebClient()){
client.Credentials = new NetworkCredential("user-name", "password");
client.DownloadFile("url", @"file-location");
}
EDIT
If you don't want to hard-code a username and password, you can set the UseDefaultCredentials property of the WebClient to true. This will use the credentials of the currently logged-in user. From the documentation:
The Credentials property contains the authentication credentials used to access a resource on a host. In most client-side scenarios, you should use the DefaultCredentials, which are the credentials of the currently logged on user. To do this, set the UseDefaultCredentials property to true instead of setting this property.
Which would mean you could amend the above code to:
using (WebClient client = new WebClient()){
client.UseDefaultCredentials = true;
client.DownloadFile("url", @"file-location");
}
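One caveat with WebClient.Credentials: by default the credentials are only sent after the server replies 401 to an anonymous first request. If you want to scope the credentials to a single site (so they are not offered to other hosts), a CredentialCache can hold them; here is a sketch using the placeholder account from the answer above:

```csharp
using System;
using System.Net;

class CredentialScope
{
    // Registers Basic credentials for one URI prefix only.
    public static CredentialCache BuildCache()
    {
        var cache = new CredentialCache();
        // Placeholder account; only requests under this prefix will use it.
        cache.Add(new Uri("http://squirlytraffic.com/"), "Basic",
                  new NetworkCredential("user-name", "password"));
        return cache;
    }

    static void Main()
    {
        using (var client = new WebClient())
        {
            client.Credentials = BuildCache();
            // client.DownloadFile("http://squirlytraffic.com/surficon.php?ts=1491591235",
            //                     @"D:\image.jpg");

            // Offline check: the cache resolves credentials by URI prefix.
            NetworkCredential c = BuildCache().GetCredential(
                new Uri("http://squirlytraffic.com/surficon.php"), "Basic");
            Console.WriteLine(c.UserName);
        }
    }
}
```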
Try it this way; these parameters are passed at login:
StringBuilder postData = new StringBuilder();
postData.Append("login=" + HttpUtility.UrlEncode("username") + "&");
postData.Append("password=" + HttpUtility.UrlEncode("password") + "&");
postData.Append("Submit=" + HttpUtility.UrlEncode("Login"));
ASCIIEncoding ascii = new ASCIIEncoding();
byte[] postBytes = ascii.GetBytes(postData.ToString());
CookieContainer cc = new CookieContainer();
HttpWebRequest webReq = (HttpWebRequest)WebRequest.Create("http://squirlytraffic.com/members.php");
webReq.Method = "POST";
webReq.ContentType = "application/x-www-form-urlencoded";
webReq.ContentLength = postBytes.Length;
webReq.UserAgent = "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1; SV1; .NET CLR 1.1.4322; .NET CLR 2.0.50727)";
webReq.CookieContainer = cc;
Stream postStream = webReq.GetRequestStream();
postStream.Write(postBytes, 0, postBytes.Length);
postStream.Flush();
postStream.Close();
HttpWebResponse res = (HttpWebResponse)webReq.GetResponse();
HttpWebRequest ImgReq = (HttpWebRequest)WebRequest.Create("http://squirlytraffic.com/surficon.php?ts=1491591235");
ImgReq.Method = "GET";
ImgReq.UserAgent = "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1; SV1; .NET CLR 1.1.4322; .NET CLR 2.0.50727)";
ImgReq.CookieContainer = cc;
HttpWebResponse ImgRes = (HttpWebResponse)ImgReq.GetResponse();
Stream Img = ImgRes.GetResponseStream();
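The same login-then-fetch flow can be sketched with HttpClient, where a single HttpClientHandler carries the cookie jar across both requests (the site, form field names, and credentials are the placeholders from the answer; the network calls are commented out):

```csharp
using System;
using System.Collections.Generic;
using System.Net;
using System.Net.Http;

class CookieLogin
{
    // Builds the login form exactly as the POST above does, but lets
    // FormUrlEncodedContent handle the URL encoding.
    public static FormUrlEncodedContent LoginForm() =>
        new FormUrlEncodedContent(new Dictionary<string, string>
        {
            ["login"] = "username",
            ["password"] = "password",
            ["Submit"] = "Login",
        });

    static void Main()
    {
        var handler = new HttpClientHandler { CookieContainer = new CookieContainer() };
        using (var client = new HttpClient(handler))
        {
            // 1. POST the login form; the handler stores the session cookies.
            // client.PostAsync("http://squirlytraffic.com/members.php", LoginForm()).Wait();

            // 2. GET the image; the same handler replays the stored cookies.
            // byte[] image = client.GetByteArrayAsync(
            //     "http://squirlytraffic.com/surficon.php?ts=1491591235").Result;

            // Offline check: the encoded body matches the hand-built one above.
            Console.WriteLine(LoginForm().ReadAsStringAsync().Result);
        }
    }
}
```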

How to query Mixpanel from C#

I am pretty new to web access from C# and completely new to Mixpanel. I am trying to run a query with this code:
using (WebClient wc = new WebClient())
{
wc.Headers.Add("user-agent", "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.2; .NET CLR 1.0.3705;)");
byte[] creds = UTF8Encoding.UTF8.GetBytes("<my API secret>:");
wc.Headers.Add("Authorization", "Basic " + Convert.ToBase64String(creds));
var reqparm = new System.Collections.Specialized.NameValueCollection();
reqparm.Add("script", "function main() { return Events({\"from_date\":\"2016-10-01\",\"to_date\":\"2016-10-167\"}).reduce(mixpanel.reducer.count()); }");
byte[] responsebytes = wc.UploadValues("https://mixpanel.com/api/2.0/jql", "POST", reqparm);
}
The query is taken directly from this Mixpanel sample:
function main()
{
return Events
({
from_date: "2016-01-04",
to_date: "2016-01-04"
}).reduce(mixpanel.reducer.count());
}
I've tried lots of variations on the above, but UploadValues always returns 400 (Bad Request). What am I doing wrong?
TIA
There was an error in one of the dates that I didn't notice ("2016-10-167").
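A date typo like that can be caught before the request is sent; this small defensive check (my own sketch, not part of the Mixpanel API) rejects "2016-10-167" up front:

```csharp
using System;
using System.Globalization;

class DateCheck
{
    // True only for well-formed yyyy-MM-dd dates, e.g. Mixpanel's from_date/to_date.
    public static bool IsValidDate(string s) =>
        DateTime.TryParseExact(s, "yyyy-MM-dd", CultureInfo.InvariantCulture,
                               DateTimeStyles.None, out _);

    static void Main()
    {
        Console.WriteLine(IsValidDate("2016-10-01"));   // True
        Console.WriteLine(IsValidDate("2016-10-167"));  // False
    }
}
```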

Sending SMS using clickatell in ASP.Net

I have tried this code from their site
using System.Net;
using System.IO;
WebClient client = new WebClient();
// Add a user agent header in case the requested URI contains a query.
client.Headers.Add("user-agent", "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.2; .NET CLR 1.0.3705;)");
client.QueryString.Add("user", "xxxx");
client.QueryString.Add("password", "xxxx");
client.QueryString.Add("api_id", "xxxx");
client.QueryString.Add("to", "xxxx");
client.QueryString.Add("text", "This is an example message");
string baseurl = "http://api.clickatell.com/http/sendmsg";
Stream data = client.OpenRead(baseurl);
StreamReader reader = new StreamReader(data);
string s = reader.ReadToEnd();
data.Close();
reader.Close();
return s;
I am getting a proxy authentication failure at
string baseurl = "http://api.clickatell.com/http/sendmsg";
Can anyone help me out?
I think you are behind a proxy and your web client needs to authenticate. Try the following code:
string userName = "<user name>";
string password = "<password>";
ICredentials creds = new NetworkCredential(userName, password);
client.Credentials = creds;
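Note that client.Credentials authenticates against the target server; if the failure really is proxy authentication, the credentials usually belong on the proxy object instead. A sketch with a placeholder proxy address and account:

```csharp
using System;
using System.Net;

class ProxyAuth
{
    // Placeholder proxy address and account -- substitute your network's values.
    public static WebProxy BuildProxy() =>
        new WebProxy("http://proxy.example.com:8080")
        {
            Credentials = new NetworkCredential("<user name>", "<password>")
        };

    static void Main()
    {
        using (var client = new WebClient())
        {
            client.Proxy = BuildProxy();
            // string reply = client.DownloadString("http://api.clickatell.com/http/sendmsg?...");

            // Offline check: the proxy carries its own credentials.
            Console.WriteLine(((NetworkCredential)client.Proxy.Credentials).UserName);
        }
    }
}
```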

How can I login to ASP.Net Forms Authenticated site using C#?

I am trying to screen-scrape a website that uses ASP.NET Forms Authentication. I have the following code, which makes a GET request to obtain the cookies; I then want to POST the login info, along with the cookies I just got, to the login.aspx page.
When I watch what is submitted to the login.aspx page with Fiddler, I see the posted data, but for the cookies it says "This request did not send any cookie data."
If I login into the app using Internet Explorer I can see that the cookies and the posted data are submitted to the login.aspx page and everything is working fine.
But if I loop through, I can print out the cookies that I assume should be sent along with the request, using this block:
foreach (System.Net.Cookie cook in getCookies.CookieContainer.GetCookies(new Uri("https://app.example.com")))
{
Console.WriteLine(cook.Name + ": " + cook.Value);
}
What is it that I am doing wrong that results in the cookies not being sent with my request?
public void Login()
{
HttpWebRequest getCookies = (HttpWebRequest)WebRequest.Create("https://app.example.com");
CookieContainer cookieJar = new CookieContainer();
getCookies.Accept = "image/gif, image/jpeg, image/pjpeg, image/pjpeg, application/vnd.ms-xpsdocument, application/x-ms-application, application/x-ms-xbap, application/xaml+xml, */*";
getCookies.UserAgent = "Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 5.2; Trident/4.0; .NET CLR 1.1.4322; .NET CLR 2.0.50727; .NET CLR 3.0.4506.2152; .NET CLR 3.5.30729; .NET4.0E; .NET4.0C)";
getCookies.Headers.Add("Accept-Encoding", "gzip, deflate");
getCookies.AllowAutoRedirect = true;
getCookies.CookieContainer = cookieJar;
using (HttpWebResponse cookieResponse = getCookies.GetResponse() as HttpWebResponse)
{
StreamReader responseReader = new StreamReader(cookieResponse.GetResponseStream());
string responseData = responseReader.ReadToEnd();
string ViewState = this.GetViewStateFromHtml(responseData);
getCookies = (HttpWebRequest)WebRequest.Create("https://app.example.com/Membership/Login.aspx?ReturnUrl=%2fHome%2fHomeSummary.aspx");
getCookies.Method = "Post";
getCookies.ContentType = "application/x-www-form-urlencoded";
getCookies.Accept = "text/html, application/xhtml+xml, */*";
getCookies.UserAgent = "Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 5.2; Trident/4.0; .NET CLR 1.1.4322; .NET CLR 2.0.50727; .NET CLR 3.0.4506.2152; .NET CLR 3.5.30729; .NET4.0E; .NET4.0C)";
getCookies.Headers.Add("Accept-Encoding", "gzip, deflate");
getCookies.AllowAutoRedirect = true;
ASCIIEncoding encoding = new ASCIIEncoding();
byte[] byte1 = encoding.GetBytes(this.GetPostData(Username, Password));
getCookies.ContentLength = byte1.Length;
StreamWriter requestWriter = new StreamWriter(getCookies.GetRequestStream());
requestWriter.Write(this.GetPostData(Username, Password));
requestWriter.Close();
getCookies.CookieContainer = cookieJar;
foreach (System.Net.Cookie cook in getCookies.CookieContainer.GetCookies(new Uri("https://app.example.com")))
{
Console.WriteLine(cook.Name + ": " + cook.Value);
}
using (HttpWebResponse postResponse = (HttpWebResponse)getCookies.GetResponse())
{
StreamReader postLoginResponseReader = new StreamReader(postResponse.GetResponseStream());
string postLoginResponseData = postLoginResponseReader.ReadToEnd();
File.WriteAllText(@"C:\postLoginResponse.txt", postLoginResponseData);
}
Console.WriteLine(@"Check C:\postLoginResponse.txt");
Console.ReadLine();
}
I don't see where you are adding your forms auth cookie to the HttpWebRequest cookie container.
In the past I have done this
var request = (HttpWebRequest) WebRequest.Create(remoteFilename);
request.CookieContainer = new CookieContainer();
request.CookieContainer.Add(new Cookie(".ASPXAUTH", "71AE9C8F6CFDC86BD7FD3AD7B214C4E1", "/", "build.mercaridirect.com.au"));
Minor point: you should rename your variable getCookies to webRequest or similar; it improves readability.
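A self-contained illustration of that idea (.ASPXAUTH is the standard forms-authentication cookie name; the value and domain here are placeholders): once a cookie is in a CookieContainer, any request whose URI matches its domain and path sends it automatically.

```csharp
using System;
using System.Net;

class CookieJarDemo
{
    // Builds a cookie jar holding a (placeholder) forms-auth ticket; in practice
    // the value comes from the Set-Cookie header of the login response.
    public static CookieContainer BuildJar()
    {
        var jar = new CookieContainer();
        jar.Add(new Cookie(".ASPXAUTH", "71AE9C8F6CFDC86BD7FD3AD7B214C4E1",
                           "/", "app.example.com"));
        return jar;
    }

    static void Main()
    {
        CookieContainer jar = BuildJar();
        // Attach the jar to a request; matching cookies go out with it.
        var request = (HttpWebRequest)WebRequest.Create("https://app.example.com/Home");
        request.CookieContainer = jar;

        // Offline check: the container matches cookies by domain and path.
        CookieCollection match = jar.GetCookies(new Uri("https://app.example.com/Home"));
        Console.WriteLine(match[".ASPXAUTH"].Value);
    }
}
```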
This is a portion of a larger code snippet, so please let me know if you need more of it. Essentially, you can code directly against the MembershipProvider and the FormsAuthentication classes to simulate the login:
bool validated = Membership.ValidateUser(uname, pwd);
if (validated)
{
if (Request.QueryString["ReturnUrl"] != null)
{
FormsAuthentication.RedirectFromLoginPage(uname, false);
}
else
{
FormsAuthentication.SetAuthCookie(uname, false);
}
return;
}
//Response.Write("Failed to authenticate, invalid credentials.");
Hope this helps.

Automate downloads from password protected website [duplicate]

This question already has answers here:
Login to the page with HttpWebRequest
(2 answers)
Closed 6 years ago.
I need some help with a work project I have been assigned. At the moment we manually go to the site, logon and then download 2 excel files from a supplier's website every month. The files are then loaded into SQL.
We want to automate this process. Now the loading of the files into SQL I can do, but I am not sure how I can automate logging onto the website entering my user details and collecting the files. I mostly deal with SQL and have very little .NET experience, so any code samples would be most appreciated.
Just to confirm: the logon form is on an .aspx page, just a basic form with a table containing the username and password fields, the forgotten-password link, and the logon button.
You can use either WebClient or HttpWebRequest.
Login to the page with HttpWebRequest
How do you login to a webpage and retrieve its content in C#?
HttpWebRequest example:
HttpWebRequest req = (HttpWebRequest)HttpWebRequest.Create("http://sso.bhmobile.ba/sso/login");
req.UserAgent = "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1; .NET CLR 1.0.3705;)";
req.Method = "POST";
req.Accept = "text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8";
req.Headers.Add("Accept-Language: en-us,en;q=0.5");
req.Headers.Add("Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.7");
req.KeepAlive = true;
req.Headers.Add("Keep-Alive: 300");
req.Referer ="http://sso.bhmobile.ba/sso/login";
req.ContentType = "application/x-www-form-urlencoded";
String Username = "username";
String Password = "Password";
StreamWriter sw = new StreamWriter(req.GetRequestStream());
sw.Write("application=portal&url=http%3A%2F%2Fwww.bhmobile.ba%2Fportal%2Fredirect%3Bjsessionid%3D1C568AAA1FB8B5C757CF5F68BE6ECE65%3Ftype%3Dssologin%26url%3D%2Fportal%2Fshow%3Bjsessionid%3D1C568AAA1FB8B5C757CF5F68BE6ECE65%3Fidc%3D1023278&realm=sso&userid=" + Username + "&password=" + Password + "&x=16&y=11");
sw.Close();
HttpWebResponse response = (HttpWebResponse)req.GetResponse();
StreamReader reader = new StreamReader(response.GetResponseStream());
string tmp = reader.ReadToEnd();
foreach (Cookie cook in response.Cookies)
{
tmp += "\n" + cook.Name + ": " + cook.Value;
}
Response.Write(tmp);
Response.End();
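Hand-concatenating a URL-encoded body like the one above is error-prone; Uri.EscapeDataString handles the escaping for each field. A sketch with the example's field names and placeholder values:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

class FormBody
{
    // Builds an application/x-www-form-urlencoded body from name/value pairs,
    // escaping each part so '&', '=', spaces, etc. cannot corrupt the body.
    public static string Encode(IEnumerable<KeyValuePair<string, string>> fields) =>
        string.Join("&", fields.Select(f =>
            Uri.EscapeDataString(f.Key) + "=" + Uri.EscapeDataString(f.Value)));

    static void Main()
    {
        string body = Encode(new[]
        {
            new KeyValuePair<string, string>("userid", "username"),
            new KeyValuePair<string, string>("password", "p@ss word"),
        });
        Console.WriteLine(body);
    }
}
```

For the placeholder values above this prints userid=username&password=p%40ss%20word, with the '@' and the space safely escaped.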
WebClient example:
WebClient wc = new WebClient();
wc.Credentials = new NetworkCredential("username", "password");
string url = "http://foo.com";
try
{
using (Stream stream = wc.OpenRead(new Uri(url)))
{
using (StreamReader reader = new StreamReader(stream))
{
return reader.ReadToEnd();
}
}
}
catch (WebException e)
{
// Error handling
}
