Okay, so here is the deal. As the question states, I'm trying to POST a file to a webserver and am having a few issues.
I've tried posting this same file to the same webserver using curl.exe and have had no issues. I've included the flags I used with curl, just in case they point out any potential reasons why I'm having trouble with the .NET classes.
curl.exe --user "myUser:myPass" --header "Content-Type: application/gzip"
--data-binary "@filename.txt.gz" --cookie "data=service; data-ver=2; date=20100212;
time=0900; location=1234" --output "out.txt" --dump-header "header.txt"
http://mysite/receive
I'm trying to use a .NET class like WebClient or HttpWebRequest to do the same thing. Here is a sample of the code I've tried. With the WebClient I get a 505 HTTP Version Not Supported error and with the HttpWebRequest I get a 501 Not Implemented.
When trying it with a WebClient:
public void sendFileClient(string path)
{
    string url = "http://mysite/receive";
    WebClient wc = new WebClient();
    string USERNAME = "myUser";
    string PSSWD = "myPass";
    NetworkCredential creds = new NetworkCredential(USERNAME, PSSWD);
    wc.Credentials = creds;
    wc.Headers.Set(HttpRequestHeader.ContentType, "application/gzip");
    wc.Headers.Set("Cookie", "location=1234; date=20100226; time=1630; data=service; data-ver=2");
    wc.UploadFile(url, "POST", path);
}
And when using an HttpWebRequest:
public Stream sendFile(string path)
{
    HttpWebRequest request = (HttpWebRequest)WebRequest.Create("http://myserver/receive");
    string USERNAME = "myUser";
    string PSSWD = "myPass";
    NetworkCredential creds = new NetworkCredential(USERNAME, PSSWD);
    request.Credentials = creds;
    request.Method = "POST";
    request.ContentType = "application/gzip";
    request.Headers.Set("Cookie", "location=1234; date=20100226; time=1630; data=service; data-ver=2");

    FileInfo fInfo = new FileInfo(path);
    long numBytes = fInfo.Length;
    FileStream fStream = new FileStream(path, FileMode.Open, FileAccess.Read);
    BinaryReader br = new BinaryReader(fStream);
    byte[] data = br.ReadBytes((int)numBytes);
    br.Close();
    fStream.Close();
    fStream.Dispose();

    Stream wrStream = request.GetRequestStream();
    BinaryWriter bw = new BinaryWriter(wrStream);
    bw.Write(data);
    bw.Close();

    HttpWebResponse response = (HttpWebResponse)request.GetResponse();
    return response.GetResponseStream();
}
First, use something like Fiddler to inspect the requests and responses and see what differs between curl and System.Net.WebClient.
Also, you can try the following (although inspecting with a debugging proxy should let you pinpoint the difference):
Use a credential cache to set your credentials for Basic authentication:
var cc = new CredentialCache();
cc.Add(new Uri(url), "Basic", new NetworkCredential("USERNAME", "PASSWORD"));
wc.Credentials = cc;
Set a user agent header:
string _UserAgent = "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.2; .NET CLR 1.0.3705;)";
wc.Headers.Add(HttpRequestHeader.UserAgent, _UserAgent);
Change the protocol version on the WebRequest:
request.KeepAlive = false;
request.ProtocolVersion = HttpVersion.Version10;
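Putting those three suggestions together on an HttpWebRequest gives roughly the sketch below. It reuses the URL, credentials and cookie values from the question; the PreAuthenticate line is an extra assumption of mine (it makes .NET send the Basic Authorization header with the first request, the way curl --user does, instead of waiting for a 401 challenge).
// Sketch only: combines the CredentialCache, user agent and HTTP/1.0 suggestions above.
public Stream sendFileWithWorkarounds(string path)
{
    HttpWebRequest request = (HttpWebRequest)WebRequest.Create("http://mysite/receive");

    // 1. Explicit Basic credentials via a CredentialCache.
    var cache = new CredentialCache();
    cache.Add(new Uri("http://mysite/receive"), "Basic", new NetworkCredential("myUser", "myPass"));
    request.Credentials = cache;
    request.PreAuthenticate = true; // assumption: send Authorization up front, like curl --user

    // 2. A browser-style user agent.
    request.UserAgent = "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.2; .NET CLR 1.0.3705;)";

    // 3. Plain HTTP/1.0 semantics.
    request.KeepAlive = false;
    request.ProtocolVersion = HttpVersion.Version10;

    request.Method = "POST";
    request.ContentType = "application/gzip";
    request.Headers.Set("Cookie", "data=service; data-ver=2; date=20100212; time=0900; location=1234");

    byte[] data = File.ReadAllBytes(path);
    request.ContentLength = data.Length;
    using (Stream body = request.GetRequestStream())
    {
        body.Write(data, 0, data.Length);
    }

    HttpWebResponse response = (HttpWebResponse)request.GetResponse();
    return response.GetResponseStream();
}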
There are another two situations that can produce a 501.
----------1---------
When the POST data contains Chinese characters (or some other non-ASCII characters), e.g.
postData = "type=user&username=计算机学院&password=123&Submit=+登录+"
then, in order to post the message correctly, you may be tempted to add the following two lines:
Request.SendChunked = true;
Request.TransferEncoding = "GB2312";
This also leads to a 501.
In that case, delete those two lines and percent-encode the non-ASCII values in postData instead, like so:
postData = "type=user&username=%BC%C6%CB%E3%BB%FA%D1%A7%D4%BA&password=123&Submit=+%C8%B7%C8%CF+"
Maybe this is also a way to build the modified postData, though I haven't tested it yet:
string str = Encoding.GetEncoding("gb2312").GetString(tmpBytes);
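Another untested sketch of the same idea is to percent-encode the non-ASCII values in the page's encoding up front with HttpUtility.UrlEncode (this needs a reference to System.Web), so the chunked/TransferEncoding workaround is not needed at all:
// Untested sketch: percent-encode the non-ASCII form values in GB2312.
Encoding gb2312 = Encoding.GetEncoding("gb2312");
string postData = "type=user"
    + "&username=" + HttpUtility.UrlEncode("计算机学院", gb2312)
    + "&password=123"
    + "&Submit=+" + HttpUtility.UrlEncode("登录", gb2312) + "+";
byte[] postBytes = Encoding.ASCII.GetBytes(postData); // the encoded string is pure ASCII now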
----------2---------
If Response.StatusCode == HttpStatusCode.Redirect (Redirect equals 302), the following line is a must:
Request.AllowAutoRedirect = false;
Related
In a C# application, when calling an API, it often takes 15+ seconds for the request to complete. The API is deployed in another network segment of the intranet and has to be accessed through a proxy. Someone said it was a DNS problem and suggested setting the hosts file, but that had no effect.
Environment: Windows Server 2012 R2, IIS v8.5
Code Script:
private string PostHttp(string url, string authHeader, string requestBody)
{
    var webRequest = System.Net.WebRequest.Create(url);
    ServicePointManager.ServerCertificateValidationCallback = new System.Net.Security.RemoteCertificateValidationCallback(CheckValidationResult);

    byte[] data = System.Text.Encoding.UTF8.GetBytes(requestBody);
    webRequest.Method = "POST";
    webRequest.Headers.Add("Accept-Language", "zh-cn,zh;q=0.8,en-us;q=0.5,en;q=0.3");
    webRequest.Headers.Add("Accept-Encoding", "gzip, deflate");
    webRequest.Headers.Add("Authorization", authHeader);
    webRequest.ContentLength = data.Length;
    webRequest.ContentType = "application/x-www-form-urlencoded";

    System.Net.WebProxy proxy = new System.Net.WebProxy("http://myHttpProxyAddress", false);
    proxy.Credentials = new System.Net.NetworkCredential("HttpProxyUser", "HttpProxyPassword");
    webRequest.Proxy = proxy;

    var writer = webRequest.GetRequestStream();
    writer.Write(data, 0, data.Length);
    writer.Close();

    using (WebResponse webResponse = webRequest.GetResponse())
    {
        System.IO.StreamReader reader = null;
        if (webResponse.Headers["Content-Encoding"] == "gzip")
            reader = new System.IO.StreamReader(new GZipStream(webResponse.GetResponseStream(), CompressionMode.Decompress), System.Text.Encoding.UTF8);
        else
            reader = new System.IO.StreamReader(webResponse.GetResponseStream(), System.Text.Encoding.UTF8);

        var result = reader.ReadToEnd();
        reader.Close();
        return result;
    }
}
It's very difficult to pinpoint the cause just from hearing someone name one possibility (e.g. a DNS error); try using an HTTP utility such as curl to check and measure it.
You can wrap the request above following the guide below and measure the timing in detail:
How do I measure request and response times at once using cURL?
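If you also want a rough number from inside the application itself (my own addition, not part of the linked guide), a Stopwatch around the DNS lookup and around the full call is enough to see where the 15 seconds go; the host name and arguments below are placeholders:
// Rough in-process timing sketch; "api.intranet.local" and the arguments are placeholders.
// Note: with an explicit proxy the target host is resolved by the proxy itself,
// so a slow lookup here mostly points at name resolution on the client side.
var sw = System.Diagnostics.Stopwatch.StartNew();
System.Net.Dns.GetHostAddresses("api.intranet.local");
Console.WriteLine("DNS lookup: " + sw.ElapsedMilliseconds + " ms");

sw.Restart();
string result = PostHttp("http://api.intranet.local/endpoint", "authHeaderValue", "requestBody");
Console.WriteLine("Full POST:  " + sw.ElapsedMilliseconds + " ms");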
I am trying to write some code to connect to an HTTPS site that uses Siteminder authentication.
I keep getting a 401. Any ideas?
I have read a few different things on here but none have really seemed all that helpful. I am also using Fiddler/Firefox Tamper to snoop on what's going on.
Here is what I've got so far in regards to code:
try
{
    Uri uri = new Uri("https://websiteaddresshere");
    HttpWebRequest request = (HttpWebRequest)WebRequest.Create(uri) as HttpWebRequest;
    request.Accept = "text/html, application/xhtml+xml, */*";
    request.UserAgent = "Mozilla/5.0 (Windows NT 6.1; WOW64; Trident/7.0; rv:11.0) like Gecko";
    // request.Connection = "Keep-Alive";
    // request.Method = "Get";
    // request.Accept = "text";
    request.AllowAutoRedirect = true;
    request.AutomaticDecompression = DecompressionMethods.GZip | DecompressionMethods.Deflate;

    Cookie emersoncookie = new Cookie("SMCHALLENGE", "YES");
    emersoncookie.Domain = "mydomain";
    emersoncookie.Path = "/";

    // authentication
    var cache = new CredentialCache();
    cache.Add(uri, "False", new NetworkCredential("myusername", "mypassword"));
    request.Credentials = cache;

    // response
    using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
    {
        using (Stream stream = response.GetResponseStream())
        {
            XmlTextReader reader = new XmlTextReader(stream);
            MessageBox.Show(stream.ToString());
        }
    }
}
catch (WebException exception)
{
    string responseText;
    using (var reader = new StreamReader(exception.Response.GetResponseStream()))
    {
        responseText = reader.ReadToEnd();
        MessageBox.Show(responseText.ToString());
    }
}
After doing some more reading on the MSDN website I decided to go a different route.
I ended up making this a service, since it will need to be one at the end of the day, and did the following:
CookieContainer emersoncookie = new CookieContainer();

HttpWebRequest request = (HttpWebRequest)WebRequest.Create("https://websiteaddress");
request.Credentials = new NetworkCredential("username", "password");
request.CookieContainer = emersoncookie;
request.Method = "GET";

HttpWebResponse response = (HttpWebResponse)request.GetResponse();
Stream resStream = response.GetResponseStream();

using (Stream output = File.OpenWrite(@"c:\somefolder\somefile.someextension"))
using (Stream input = resStream)
{
    input.CopyTo(output);
}
To anyone who might be running into SiteMinder authentication issues, this piece of code works pretty well.
I couldn't get Jasen's code to work. Maybe your SM setup is different from mine, but with SiteMinder it's generally a two-step authentication process. The code block below works for me:
//Make initial request
RestClient client = new RestClient("http://theResourceDomain/myApp");
client.CookieContainer = new CookieContainer();
IRestResponse response = client.Get(new RestRequest("someProduct/orders"));
//Now add credentials.
client.Authenticator = new HttpBasicAuthenticator("username", "password");
//Get resource from the SiteMinder URI which will redirect to the correct API URI upon authentication.
response = client.Get(new RestRequest(response.ResponseUri));
Although this uses RestSharp, it can be easily replicated using HttpClient or even HttpWebRequest.
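For reference, a rough sketch of the same two-step flow with HttpClient might look like the following; the URIs and credentials are the placeholders from the snippet above, and your SiteMinder setup may still differ:
// Sketch only (inside an async method; using System.Net, System.Net.Http,
// System.Net.Http.Headers, System.Text): shared cookie container, anonymous
// first request, Basic credentials on the retry.
var handler = new HttpClientHandler { CookieContainer = new CookieContainer(), AllowAutoRedirect = true };
using (var client = new HttpClient(handler))
{
    // Step 1: initial request; SiteMinder redirects to its challenge URI and sets its cookies.
    HttpResponseMessage first = await client.GetAsync("http://theResourceDomain/myApp/someProduct/orders");

    // Step 2: repeat the request against the URI we ended up at, now with Basic credentials.
    var retry = new HttpRequestMessage(HttpMethod.Get, first.RequestMessage.RequestUri);
    retry.Headers.Authorization = new AuthenticationHeaderValue(
        "Basic", Convert.ToBase64String(Encoding.ASCII.GetBytes("username:password")));

    HttpResponseMessage second = await client.SendAsync(retry);
    string body = await second.Content.ReadAsStringAsync();
}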
I have figured out (by inspecting both the website itself and the responses it sends to my C# application) that the website in question uses neither forms nor cookies to do the secure connection, so I am wondering if anyone knows of a way to use the SSL certificate or the website headers (which, by all accounts, I think do contain a cookie) to log in and then download a certain file.
What I am asking for is code examples, or links showing how to use the headers to log in to a site. I have all the necessary credentials, just not the knowledge to use them.
Thanks a bunch.
public void downloadStuff()
{
    string formUrl = "https://secure.website.co.nz/extension/Login.aspx";
    string formParams = string.Format("ctl00_main_Login1_UserName={0}&ctl00_main_Login1_Password={1}", "*********", "************");
    string cookieHeader;

    WebRequest req = WebRequest.Create(formUrl);
    req.ContentType = "application/x-www-form-urlencoded";
    req.Method = "POST";
    byte[] bytes = Encoding.ASCII.GetBytes(formParams);
    req.ContentLength = bytes.Length;
    using (Stream os = req.GetRequestStream())
    {
        os.Write(bytes, 0, bytes.Length);
    }
    WebResponse resp = req.GetResponse();
    cookieHeader = resp.Headers["Set-Cookie"];

    string pageSource;
    string getUrl = "https://secure.website.co.nz/extension/Downloader.axd?x=stock-all";
    WebRequest getRequest = WebRequest.Create(getUrl);
    getRequest.Headers.Add("Cookie", cookieHeader);
    WebResponse getResponse = getRequest.GetResponse();
    using (StreamReader sr = new StreamReader(getResponse.GetResponseStream()))
    {
        WebClient exeed = new WebClient();
        exeed.DownloadFileAsync(new Uri(@"https://secure.website.co.nz/extension/Downloader.axd?x=stock-all"), "test.txt");
        pageSource = sr.ReadToEnd();
    }
}
Note: I also put this question up because, even if what I have done is correct, I would not know how to use a cookie to download a file. Example code showing how to capture and use a cookie and/or a header would be the best option.
Thanks. (This was an edit.)
I need to log in to a website and download the source code from various pages while logged in. I can do this quite easily with the Windows Forms WebBrowser class, but that is not appropriate here, and I need to do it with WebRequest or one of the other classes. Unfortunately, the site doesn't like how I am handling the cookies.
I am using the following code and get the following response: {"w_id":"LoginForm2_6","message":"Please enable cookies in your browser so you can log in.","success":0,"showLink":false}
string url2 = "%2Fapp%2Futils%2Flogin_form%2Fredirect%2Fhome";
string login = "username";
string password = "password";
string w_id = "LoginForm2_6";
string loginurl = "http://loginurl.com";
string cookieHeader;

WebRequest req = WebRequest.Create(loginurl);
req.Proxy = WebRequest.DefaultWebProxy;
req.Credentials = System.Net.CredentialCache.DefaultNetworkCredentials;
req.Proxy.Credentials = System.Net.CredentialCache.DefaultNetworkCredentials;
req.ContentType = "application/x-www-form-urlencoded; charset=UTF-8";
req.Method = "POST";

string postData = string.Format("w_id={2}&login={0}&password={1}&url2={3}", login, password, w_id, url2);
byte[] bytes = Encoding.ASCII.GetBytes(postData);
req.ContentLength = bytes.Length;
using (Stream os = req.GetRequestStream())
{
    os.Write(bytes, 0, bytes.Length);
}

WebResponse resp = req.GetResponse();
cookieHeader = resp.Headers["Set-Cookie"];

string pageSource = "";
using (StreamReader sr = new StreamReader(resp.GetResponseStream()))
{
    pageSource = sr.ReadToEnd();
}
richTextBox1.Text = pageSource;
If anyone could tell me where I'm going wrong, it would be greatly appreciated.
Also, to let you know, if I use the following with the WebBrowser class, it works fine:
b.Navigate(fullurl, "", enc.GetBytes(postData), "Content-Type: application/x-www-form-urlencoded\r\n");
I know this is an old question, but the user Matthew Brindley answered a similar question with a completely working example. That question is about accessing the
source code of a website that requires the user to log in first, all done from a C# application using WebRequest and WebResponse.
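The gist of that answer, adapted to the field names from your snippet, is to make every request with one shared CookieContainer, so the session cookie the site sets before login is actually sent back with the login POST. A rough, untested sketch (the /home URL at the end is just a placeholder):
// Sketch only; using System.IO, System.Net, System.Text.
CookieContainer cookies = new CookieContainer();

// 1. GET the login page first so any pre-login session cookie is captured.
HttpWebRequest getLogin = (HttpWebRequest)WebRequest.Create("http://loginurl.com");
getLogin.CookieContainer = cookies;
getLogin.GetResponse().Close();

// 2. POST the credentials with the same cookie container.
HttpWebRequest postLogin = (HttpWebRequest)WebRequest.Create("http://loginurl.com");
postLogin.CookieContainer = cookies;
postLogin.Method = "POST";
postLogin.ContentType = "application/x-www-form-urlencoded; charset=UTF-8";
string postData = "w_id=LoginForm2_6&login=username&password=password&url2=%2Fapp%2Futils%2Flogin_form%2Fredirect%2Fhome";
byte[] bytes = Encoding.ASCII.GetBytes(postData);
postLogin.ContentLength = bytes.Length;
using (Stream os = postLogin.GetRequestStream())
{
    os.Write(bytes, 0, bytes.Length);
}
postLogin.GetResponse().Close();

// 3. Any further request made with the same container is now "logged in".
HttpWebRequest page = (HttpWebRequest)WebRequest.Create("http://loginurl.com/home"); // placeholder URL
page.CookieContainer = cookies;
using (StreamReader sr = new StreamReader(page.GetResponse().GetResponseStream()))
{
    string pageSource = sr.ReadToEnd();
}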
I'm trying to interface with the Google Reader (undocumented/unofficial) API using information from this page. My first step is to get a SID and token, which works fine, but I can't seem to POST anything without getting a 401 error.
Here is the code I'm using to get my SID and token:
static string getSid()
{
    HttpWebRequest req = (HttpWebRequest)WebRequest.Create("https://www.google.com/accounts/ClientLogin?service=reader&Email=username&Passwd=password");
    req.Method = "GET";
    string sid;
    HttpWebResponse response = (HttpWebResponse)req.GetResponse();
    using (var stream = response.GetResponseStream())
    {
        StreamReader r = new StreamReader(stream);
        string resp = r.ReadToEnd();
        int indexSid = resp.IndexOf("SID=") + 4;
        int indexLsid = resp.IndexOf("LSID=");
        sid = resp.Substring(indexSid, indexLsid - 5);
        return sid;
    }
}
And to generate a cookie and get the token:
static Cookie getCookie(string sid)
{
    Cookie cookie = new Cookie("SID", sid, "/", ".google.com");
    return cookie;
}

static string getToken(string sid, Cookie cookie)
{
    string token;
    string url = "http://www.google.com/reader/api/0/token";
    HttpWebRequest req = (HttpWebRequest)WebRequest.Create(url);
    req.CookieContainer = new CookieContainer();
    req.CookieContainer.Add(cookie);
    HttpWebResponse response = (HttpWebResponse)req.GetResponse();
    using (var stream = response.GetResponseStream())
    {
        StreamReader r = new StreamReader(stream);
        token = r.ReadToEnd();
        return token;
    }
}
Now if I try to do a POST (in this example, insert a new tag) using the following, I get the 401 error.
static void insertTag(string tag, Cookie cookie)
{
    string result = "";
    string data = Uri.EscapeDataString("a=" + tag + "&T=" + Program.token);
    byte[] buffer = Encoding.GetEncoding(1252).GetBytes(data);

    HttpWebRequest WebReq = (HttpWebRequest)WebRequest.Create("http://www.google.com/reader/api/0/edit-tag?client=-");
    WebReq.Method = "POST";
    WebReq.Credentials = new NetworkCredential("username", "password");
    WebReq.ContentType = "application/x-www-form-urlencoded";
    WebReq.ContentLength = buffer.Length;

    Stream PostData = WebReq.GetRequestStream();
    PostData.Write(buffer, 0, buffer.Length);
    PostData.Close();

    HttpWebResponse WebResp = (HttpWebResponse)WebReq.GetResponse();
    Stream Answer = WebResp.GetResponseStream();
    StreamReader _Answer = new StreamReader(Answer);
    result = _Answer.ReadToEnd();
    if (result.Length < 0)
        result = "";
}
The error occurs on the line Stream Answer = WebResp.GetResponseStream();
You need to double-check that you have a user agent set. I have run into this same problem before when I didn't have one set.
For example:
WebClient client = new WebClient();
client.Headers.Add("user-agent", "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.2; .NET CLR 1.0.3705;)");
Or: MSDN Link
myHttpWebRequest = (HttpWebRequest)WebRequest.Create("http://www.contoso.com");
myHttpWebRequest.UserAgent = "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.2; .NET CLR 1.0.3705;)";
It turns out I was using the wrong URL to access the Google Reader API, thanks to some outdated documentation. The correct URL for adding labels in Google Reader (as of August 2009) is
http://www.google.com/reader/api/0/subscription/edit?client=scroll
with POST arguments
a=user/-/label/[your label]&s=feed/[feed url]&ac=edit&T=[token]
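For completeness, here is a rough sketch of insertTag rewritten against that URL; the label, feed URL, token and cookie stand in for the values produced by the earlier helper methods, and each argument is escaped individually (escaping the whole string at once, as in the original snippet, also mangles the & and = separators):
// Sketch only: POST to the subscription/edit URL with the arguments listed above.
static void insertTag(string label, string feedUrl, string token, Cookie cookie)
{
    string data = "a=" + Uri.EscapeDataString("user/-/label/" + label)
                + "&s=" + Uri.EscapeDataString("feed/" + feedUrl)
                + "&ac=edit"
                + "&T=" + Uri.EscapeDataString(token);
    byte[] buffer = Encoding.UTF8.GetBytes(data);

    HttpWebRequest req = (HttpWebRequest)WebRequest.Create(
        "http://www.google.com/reader/api/0/subscription/edit?client=scroll");
    req.Method = "POST";
    req.ContentType = "application/x-www-form-urlencoded";
    req.ContentLength = buffer.Length;
    req.CookieContainer = new CookieContainer();
    req.CookieContainer.Add(cookie); // the SID cookie, as in getToken()

    using (Stream body = req.GetRequestStream())
    {
        body.Write(buffer, 0, buffer.Length);
    }
    using (HttpWebResponse resp = (HttpWebResponse)req.GetResponse())
    using (StreamReader reader = new StreamReader(resp.GetResponseStream()))
    {
        string result = reader.ReadToEnd();
    }
}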