I'm downloading a file from a webserver using the code below, and am receiving this error:
The Error:
Error saving file from URL:The server committed a protocol violation.
Section=ResponseHeader Detail='Content-Length' header value is invalid
From running Fiddler while this is running, it says:
Content-Length response header is not a valid unsigned integer
Content-Length: 13312583
The Code:
public static bool SaveFileFromURL(string url, string destinationFileName, int timeoutInSeconds)
{
//SetAllowUnsafeHeaderParsing20();
Configuration config = ConfigurationManager.OpenExeConfiguration(ConfigurationUserLevel.None);
SettingsSection section = (SettingsSection)config.GetSection("system.net/settings");
section.HttpWebRequest.UseUnsafeHeaderParsing = false;
config.Save();
// Create a web request to the URL
HttpWebRequest MyRequest = (HttpWebRequest)WebRequest.Create(url);
MyRequest.UseDefaultCredentials = true;
MyRequest.ContentLength = 0;
MyRequest.Timeout = timeoutInSeconds * 1000;
try
{
// Get the web response
HttpWebResponse MyResponse = (HttpWebResponse)MyRequest.GetResponse();
// Make sure the response is valid
if (HttpStatusCode.OK == MyResponse.StatusCode)
{
// Open the response stream
using (Stream MyResponseStream = MyResponse.GetResponseStream())
{
// Open the destination file
using (FileStream MyFileStream = new FileStream(destinationFileName, FileMode.OpenOrCreate, FileAccess.Write))
{
// Create a 4K buffer to chunk the file
byte[] MyBuffer = new byte[4096];
int BytesRead;
// Read the chunk of the web response into the buffer
while (0 < (BytesRead = MyResponseStream.Read(MyBuffer, 0, MyBuffer.Length)))
{
// Write the chunk from the buffer to the file
MyFileStream.Write(MyBuffer, 0, BytesRead);
}
}
}
}
}
catch (Exception err)
{
throw new Exception("Error saving file from URL:" + err.Message, err);
}
return true;
}
Update: If I pass the URL straight into a browser, the file is downloaded successfully, and the error is thrown on the GetResponse line.
Update 2: I get the same error with WebClient.DownloadFile:
public static bool DL_Webclient(string url, string destinationFileName)
{
WebClient myWebClient = new WebClient();
myWebClient.UseDefaultCredentials = true;
myWebClient.DownloadFile(url, destinationFileName);
return true;
}
Update 3: Having retrieved the other headers in the message (using Fiddler), they are:
HTTP/1.1 200 OK
Connection: close
Date: Wed, 03 Mar 2010 08:43:06 GMT
Server: Microsoft-IIS/6.0
X-Powered-By: ASP.NET
Content-Length: 13314320
Content-Type: application/x-evsaveset
Set-Cookie: ASPSESSIONIDQQCQSCRC=CFHHJHADOIBCFAFOHFJCDNEG; path=/
Cache-control: private
Are there other HTTP headers present?
It might have to do with the 'Transfer-Encoding' header or similar rules. More specific information about these headers and their effect on the Content-Length header can be found at the W3C website and the linked follow-up W3C page.
Hope this helps,
Could you not use WebClient.DownloadFile instead?
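For reference, the SetAllowUnsafeHeaderParsing20() call commented out at the top of the question is the usual workaround for this particular protocol-violation error. Below is a sketch of how that helper is commonly implemented; it pokes the non-public SettingsSectionInternal type through reflection, so treat it as a .NET Framework-only assumption:
using System;
using System.Net.Configuration;
using System.Reflection;
// Sketch only: flips the runtime's cached useUnsafeHeaderParsing flag via reflection.
// Relies on non-public .NET Framework internals, so it may not work on other runtimes.
public static bool SetAllowUnsafeHeaderParsing20()
{
    Assembly assembly = Assembly.GetAssembly(typeof(SettingsSection));
    if (assembly == null) return false;
    Type settingsSectionType = assembly.GetType("System.Net.Configuration.SettingsSectionInternal");
    if (settingsSectionType == null) return false;
    // Grab the singleton settings instance the runtime actually consults.
    object instance = settingsSectionType.InvokeMember("Section",
        BindingFlags.Static | BindingFlags.GetProperty | BindingFlags.NonPublic,
        null, null, new object[0]);
    if (instance == null) return false;
    FieldInfo useUnsafeHeaderParsing = settingsSectionType.GetField("useUnsafeHeaderParsing",
        BindingFlags.NonPublic | BindingFlags.Instance);
    if (useUnsafeHeaderParsing == null) return false;
    useUnsafeHeaderParsing.SetValue(instance, true);
    return true;
}
Calling it once before GetResponse (instead of writing UseUnsafeHeaderParsing = false back to the exe config) should let the response with the malformed Content-Length header through.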
Basically, I'm making an HTTP library, so I'm using TcpClient to create a socket connection and send the HTTP headers and so on.
Now I can get the HTML of google.com, but I can't for pastebin.com and stackoverflow.com, for example: when I try, I keep getting a "301 Moved Permanently" response no matter what. Also, the "Location" header is the same as the input URL, so I don't get it: why is it broken like that?
Here is the code that makes the request:
var parsedUrl = ParseUrl();
var array = parsedUrl.Split('/');
var host = array[0];
var subpages = array[1];
Socket = new TcpClient(host, 80)
{
ReceiveBufferSize = BufferSize
};
var buffer = new byte[BufferSize];
var headers = new StringBuilder();
headers.AppendLine("GET /" + subpages + " HTTP/1.1");
headers.AppendLine("Host: " + host);
foreach (var header in Headers) headers.AppendLine(header);
headers.AppendLine();
headers.AppendLine();
Socket.SendBufferSize = headers.Length;
var sent = Socket.Client.Send(Encoding.ASCII.GetBytes(headers.ToString()));
var received = Socket.Client.Receive(buffer);
var response = new RawResponseNET()
{
Url = parsedUrl,
BufferSize = BufferSize,
BytesSent = sent,
BytesReceived = received,
Request = this,
Buffer = buffer,
SocketStream = Socket.GetStream(),
BufferIndex = BinaryMatch(buffer, Encoding.ASCII.GetBytes("\r\n\r\n")) + 4
};
response.ResponseHeaders = Encoding.UTF8.GetString(buffer, 0, response.BufferIndex);
Socket.Close();
return response;
The RawResponseNET class just contains a ToString() method that parses the response into a string.
The headers that I add (excluding Host) are Accept: text/html, charset=utf-8 and Connection: close. I've tried adding some more headers (including ones my web browser sends when it makes a request), but no luck.
Here is what the response headers look like for pastebin.com:
HTTP/1.1 301 Moved Permanently
Date: Sat, 03 Oct 2020 12:50:09 GMT
Transfer-Encoding: chunked
Connection: close
Cache-Control: max-age=3600
Expires: Sat, 03 Oct 2020 13:50:09 GMT
Location: https://www.pastebin.com/
cf-request-id: 05901c48e80000ee2ffca20200000001
Server: cloudflare
CF-RAY: 5dc6c987ddcbee2f-CDG
So then, my final question is: how would I fix that problem?
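Since the Location header in the dump above points at https://www.pastebin.com/, the 301 is simply the site redirecting plain HTTP on port 80 to HTTPS, and a hand-rolled client has to follow that redirect itself, which means speaking TLS. A minimal sketch, assuming the redirect target is HTTPS and omitting error handling:
using System;
using System.IO;
using System.Net.Security;
using System.Net.Sockets;
using System.Text;
// Sketch: fetch an HTTPS URL the same way the TcpClient code above does for HTTP,
// but over an SslStream on port 443. host and path are placeholders.
static string GetOverTls(string host, string path)
{
    using (var tcp = new TcpClient(host, 443))
    using (var ssl = new SslStream(tcp.GetStream()))
    {
        ssl.AuthenticateAsClient(host); // TLS handshake, sends the host name for SNI
        string request =
            "GET " + path + " HTTP/1.1\r\n" +
            "Host: " + host + "\r\n" +
            "Connection: close\r\n\r\n";
        byte[] bytes = Encoding.ASCII.GetBytes(request);
        ssl.Write(bytes, 0, bytes.Length);
        // Connection: close lets us read until the server closes the stream.
        using (var reader = new StreamReader(ssl))
            return reader.ReadToEnd();
    }
}
In general: when the status line says 301/302, re-issue the request to the Location value (switching to port 443 and SslStream when the scheme is https) instead of returning the redirect body.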
I am trying to add a trailer header to an HttpWebRequest, but it is not appended after the end of the file data.
wreq.ContentType = "application/octet-stream";
wreq.AllowWriteStreamBuffering = false;
wreq.SendChunked = true;
//wreq.Headers.Add(HttpRequestHeader.Te, "trailers");
wreq.Headers.Add(HttpRequestHeader.Trailer, "Test");
wreq.Headers["Test"] = "the-value";
using (Stream POSTstream = wreq.GetRequestStream())
{
//dataByte is file-data in byte[]
POSTstream.Write(dataByte, 0, dataByte.Length);
POSTstream.Flush();
//hashValue is trailer in byte[]
POSTstream.Write(hashValue, 0, hashValue.Length);
POSTstream.Flush();
POSTstream.Close();
}
It should append this "Test" trailer at EOF, after the blank (zero-length) chunk, but it doesn't. When I tried to add the trailer programmatically, it was treated as file data rather than as a trailer.
Expected request:
POST <URL> HTTP/1.1
Content-Type: application/octet-stream
Trailer: Test
Transfer-Encoding: chunked
5d
File-data
0
Test: the-value
Actual request:
POST <URL> HTTP/1.1
Content-Type: application/octet-stream
Trailer: Test
Transfer-Encoding: chunked
5d
File-data
5A
Test: the-value
0
Why is this Test trailer not sent after the blank chunk? The trailer will be used on the server to identify the end of the file.
Please Help.
After more than a week of research I came to know that .NET does not allow adding a trailer to an HTTP request. To achieve the expected request above I used Node.js instead.
First, I create an app.js file which contains the code for creating the request with the trailer and getting the response from the server:
var http = require('http');
var fs = require('fs');
var options = {
    hostname: 'ABC', port: 6688, path: 'XYZ/101/-3/test.file', method: 'POST',
    headers: {'Content-type': 'application/octet-stream', 'Trailer': 'Test', 'Transfer-Encoding': 'chunked'}
};
var fileContent = fs.readFileSync('C:\\test.file');
var req = http.request(options, function (res) {
    res.setEncoding('utf8');
    res.on('data', function (chunk) { fs.writeFile('C:\\response.xml', chunk); });
});
req.on('error', function (e) { fs.writeFile('C:\\response.xml', 'Error: ' + e.message); });
var len = fileContent.length;
var bufSize = 4096;
for (var i = 0; i < len; i = i + bufSize) {
    if (i + bufSize < len) {
        req.write(fileContent.slice(i, i + bufSize));
    } else {
        req.write(fileContent.slice(i, len));
        req.addTrailers({'Test': 'TESTTEST'});
        req.end();
    }
}
Then I run this app.js file from .NET as an external process and read the response:
//build app.js file content
var template = GetAppJSString();
var appjsfile = "C:\\app.js";
using (var sw = new StreamWriter(appjsfile))
{
sw.Write(template);
sw.Close();
}
var psi = new ProcessStartInfo
{
CreateNoWindow = true,
FileName = "C:\\Node.exe",
Arguments = "\"" + appjsfile + "\"",
UseShellExecute = false,
WindowStyle = ProcessWindowStyle.Hidden
};
try
{
_nodeProcess.StartInfo = psi;
_nodeProcess.Start();
_nodeProcess.WaitForExit();
//read and process response
var responseText = string.Empty;
using (var sr = new StreamReader("C:\\response.xml"))
{
responseText = sr.ReadToEnd();
sr.Close();
}
File.Delete("C:\\response.xml");
// process response
}
catch (Exception ex) { }
Hope this will help others and save their time.
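As an alternative to shelling out to Node.js, the expected wire format shown above (chunked data, a zero-length chunk, then the trailer) is simple enough to write by hand over a raw socket from C#. A hedged sketch, with host, port and path as placeholders and no error handling:
using System;
using System.IO;
using System.Net.Sockets;
using System.Text;
// Sketch: hand-rolled chunked POST that ends with a zero-length chunk followed by the trailer.
static void PostWithTrailer(string host, int port, string path, byte[] fileData, string trailerValue)
{
    using (var client = new TcpClient(host, port))
    using (var stream = client.GetStream())
    {
        WriteAscii(stream,
            "POST " + path + " HTTP/1.1\r\n" +
            "Host: " + host + "\r\n" +
            "Content-Type: application/octet-stream\r\n" +
            "Trailer: Test\r\n" +
            "Transfer-Encoding: chunked\r\n" +
            "Connection: close\r\n\r\n");
        // One chunk carrying the whole payload: hex size, CRLF, data, CRLF.
        WriteAscii(stream, fileData.Length.ToString("X") + "\r\n");
        stream.Write(fileData, 0, fileData.Length);
        WriteAscii(stream, "\r\n");
        // Zero-length chunk, then the trailer, then the terminating blank line.
        WriteAscii(stream, "0\r\nTest: " + trailerValue + "\r\n\r\n");
        Console.WriteLine(new StreamReader(stream).ReadToEnd()); // dump the server's reply
    }
}
static void WriteAscii(Stream stream, string text)
{
    byte[] bytes = Encoding.ASCII.GetBytes(text);
    stream.Write(bytes, 0, bytes.Length);
}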
I'm having a bit of trouble calling a method from a generic handler we have. I've tried using two separate techniques to call a simple 'HelloWorld()' method but I get two different errors:
The first technique is as follows:
WebClient wc = new WebClient();
NameValueCollection formData = new NameValueCollection();
formData["method"] = "HelloWorld";
byte[] data;
try
{
data = wc.UploadValues(_domain, formData);
}
catch (WebException ex)
{
Label1.Text = ex.Message;
return;
}
string response = Encoding.UTF8.GetString(data);
Label1.Text = response;
wc.Dispose();
and I get the following error:
{"id":null,"error":{"name":"Found String where Object was expected."}}
and the second technique I've tried is:
var httpWebRequest = (HttpWebRequest)WebRequest.Create(_domain);
httpWebRequest.ContentType = "text/json";
httpWebRequest.Method = "POST";
using (var streamWriter = new StreamWriter(httpWebRequest.GetRequestStream()))
{
string json = "{\"method\":\"helloWorld\"}"; //," +
//"\"password\":\"bla\"}";
streamWriter.Write(json);
streamWriter.Flush();
streamWriter.Close();
try
{
var httpResponse = (HttpWebResponse)httpWebRequest.GetResponse();
using (var streamReader = new StreamReader(httpResponse.GetResponseStream()))
{
var result = streamReader.ReadToEnd();
}
}
catch (WebException wex)
{
Label2.Text = wex.Message;
}
catch (Exception ex)
{
Label2.Text = ex.Message;
}
}
and with this, I get the following error:
The remote server returned an error: (500) Internal Server Error.
When I test the call from the ".ashx?test" page the method runs and the details at the bottom of the screen are:
Pragma: no-cache
Date: Tue, 23 Jul 2013 13:46:19 GMT
Server: ASP.NET Development Server/11.0.0.0
X-AspNet-Version: 2.0.50727
Content-Type: application/json; charset=utf-8
Cache-Control: no-cache
Connection: Close
Content-Length: 32
Expires: -1
Any ideas as to why this wouldn't be working?
Thanks!
An ASHX handler is not a web service. You don't call methods within the ASHX handler. You just call the handler, and it delivers data directly, be it text or binary data; that's up to you.
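To illustrate: a generic handler implements IHttpHandler and writes whatever it wants to the response, so any "method" dispatch has to be done by hand inside ProcessRequest. A rough sketch (the handler name and JSON shape here are made up, not taken from the question):
using System.Web;
// Hypothetical HelloWorld.ashx code-behind: there is no remote method call,
// the handler inspects the request and writes the response itself.
public class HelloWorldHandler : IHttpHandler
{
    public void ProcessRequest(HttpContext context)
    {
        string method = context.Request["method"]; // form field or query-string value
        context.Response.ContentType = "application/json";
        if (method == "HelloWorld")
            context.Response.Write("{\"result\":\"Hello World\"}");
        else
            context.Response.Write("{\"error\":\"unknown method\"}");
    }
    public bool IsReusable
    {
        get { return false; }
    }
}
With a handler along those lines, the WebClient.UploadValues call from the first technique should work as-is, since the "method" value arrives as an ordinary form field.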
I put the URL into the browser's address bar and it downloads the zip file to disk. The size of the zipped file is 386 bytes, as shown in its properties.
When I use the UnZipFiles method to extract the file, it works.
But I want to download it programmatically and extract it in memory. I use the GetResultFromServer method to get the zipped content. As shown in the headers, the size of the content is the same as the size of the zipped file saved to disk:
content-disposition: attachment; filename=emaillog-xml.zip
Content-Length: 386
Cache-Control: private
Content-Type: application/zip
Date: Mon, 10 Sep 2012 08:28:28 GMT
Server: Microsoft-IIS/7.5
X-AspNet-Version: 4.0.30319
X-Powered-By: ASP.NET
My question is: how do I extract the content returned by GetResultFromServer?
I tried the following:
var ms = new MemoryStream(Encoding.Unicode.GetBytes(res));
var s = new ZipInputStream(ms);
but I get an "Unable to read from this stream" error.
UPDATED
I tried var zipStream = new System.IO.Compression.GZipStream(response.GetResponseStream(), CompressionMode.Decompress), but I get a "The magic number in GZip header is not correct" error.
Code
private string GetResultFromServer(ElasticLogParams elasticLogParams)
{
var webRequest = (HttpWebRequest)WebRequest.Create(url);
var response = webRequest.GetResponse();
using (var reader = new StreamReader(response.GetResponseStream()))
{
var res = reader.ReadToEnd();
var headers = response.Headers.ToString();
return res;
}
}
public static void UnZipFiles(string zippedFilePath, Stream stream = null)
{
var s = new ZipInputStream(stream ?? File.OpenRead(zippedFilePath));
ZipEntry theEntry;
while ((theEntry = s.GetNextEntry()) != null)
{
using (var streamWriter = File.Create(@"D:\extractedXML.xml"))
{
var size = 2048;
var data = new byte[size];
while (true)
{
size = s.Read(data, 0, size);
if (size > 0)
{
streamWriter.Write(data, 0, size);
}
else
{
break;
}
}
streamWriter.Close();
}
}
s.Close();
}
Give this a shot:
var response = webRequest.GetResponse() as HttpWebResponse;
var stream = response.GetResponseStream();
var s = new ZipInputStream(stream);
I believe you're very close and that you're using the right approach -- you can use this article to back that up -- their code is very similar.
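The root problem is that GetResultFromServer pushes the binary zip through a StreamReader and back out through Encoding.Unicode, which corrupts it (that is also why the GZipStream attempt fails: on top of the corruption, a .zip archive is not a gzip stream). A sketch that keeps the bytes intact before handing them to ZipInputStream, assuming SharpZipLib as in the question and .NET 4+ for Stream.CopyTo:
using System.IO;
using System.Net;
// Sketch: fetch the zip as raw bytes, with no string conversion anywhere.
private static byte[] GetBytesFromServer(string url)
{
    var webRequest = (HttpWebRequest)WebRequest.Create(url);
    using (var response = webRequest.GetResponse())
    using (var responseStream = response.GetResponseStream())
    using (var ms = new MemoryStream())
    {
        responseStream.CopyTo(ms); // copy in a buffered loop on pre-.NET 4 frameworks
        return ms.ToArray();
    }
}
// Usage: reuse the existing UnZipFiles overload that accepts a stream.
// UnZipFiles(null, new MemoryStream(GetBytesFromServer(url)));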
I'm using http://dotnetzip.codeplex.com/
I've not used it to download stuff, but to extract stuff that people upload to my server. I assume it should work perfectly the other way round too.
I've tried the built-in zip from Microsoft too, but also had issues, so I gave up on it and switched.
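If you go the DotNetZip route, reading the archive from a stream might look something like the sketch below; ZipFile.Read needs a seekable stream, so buffer the response into a MemoryStream first (as in the sketch above):
using System.IO;
using Ionic.Zip; // DotNetZip
// Sketch: read a zip archive from a seekable stream with DotNetZip.
static void ExtractWithDotNetZip(Stream zipBytes, string targetDirectory)
{
    using (ZipFile zip = ZipFile.Read(zipBytes))
    {
        foreach (ZipEntry entry in zip)
        {
            // Overwrite silently; adjust the ExtractExistingFileAction to taste.
            entry.Extract(targetDirectory, ExtractExistingFileAction.OverwriteSilently);
        }
    }
}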
I need to make a POST request with an xml data.
String xml = "";
byte[] data = System.Text.Encoding.UTF8.GetBytes(xml);
HttpClient.post(url, data, "text/xml");
Then I call the POST function:
public static String post(String url, byte[] data, String contentType){
String body = "";
body = getResponse("POST", url, data, contentType, 80);
return body;
}
Now I call this function to make the request / get the response:
public static String getResponse(String method, String url, byte[] data, String contentType, int serverPort)
{
String result = null;
HttpWebRequest request = sendRequest(method, url, data, contentType, serverPort);
HttpWebResponse response = null;
try
{
response = (HttpWebResponse)request.GetResponse();
if (response != null){
// Get the stream associated with the response
Stream receiveStream = response.GetResponseStream ();
// Pipes the stream to a higher level stream reader
StreamReader readStream = new StreamReader (receiveStream, System.Text.Encoding.UTF8);
result = readStream.ReadToEnd ();
}
}
catch(WebException ex)
{
if (ex.Status == WebExceptionStatus.ProtocolError){
throw new HttpClientException("HTTP response error. ", (int)(((HttpWebResponse)ex.Response).StatusCode), ((HttpWebResponse)ex.Response).StatusDescription);
}
else{
throw new HttpClientException("HTTP response error with status: " + ex.Status.ToString());
}
}
return result;
}
and
public static HttpWebRequest sendRequest(String method, String url, byte[] data, String contentType, int serverPort){
HttpWebRequest request = null;
try
{
UriBuilder requestUri = new UriBuilder(url);
requestUri.Port = serverPort;
request = (HttpWebRequest)WebRequest.Create(requestUri.Uri);
request.Method = method;
//
if ((method == "POST") && (data != null) && (data.Length > 0)){
request.ContentLength = data.Length;
request.ContentType = ((String.IsNullOrEmpty(contentType))?"application/x-www-form-urlencoded":contentType);
Stream dataStream = request.GetRequestStream ();
dataStream.Write (data, 0, data.Length);
// Close the Stream object.
dataStream.Close ();
}
}
catch(WebException ex)
{
if (ex.Status == WebExceptionStatus.ProtocolError){
throw new HttpClientException("HTTP request error. ", (int)(((HttpWebResponse)ex.Response).StatusCode), ((HttpWebResponse)ex.Response).StatusDescription);
}
else{
throw new HttpClientException("HTTP request error with status: " + ex.Status.ToString());
}
}
return request;
}
It always gives me an HttpClientException:
video_api.HttpClientException: HttpClient exception :HTTP response error. with `code: 400` and `status: Bad Request`
But when I tried it with the Firefox add-on HTTP Resource Test, it ran fine and got a 202 Accepted status with the same XML doc.
I logged the content type and data.Length just before the POST request was made; the content type was text/xml and data.Length was 143.
I have known some websites to be picky about request headers and to return different results based solely on those values. Compare the request headers of the HTTP request in Firefox with those of your request; if you mimic the headers from the Firefox Resource Test, it will most likely work (request.Headers.Add("name", "value")). The other difference to note might be the User-Agent, which web servers can again be picky about.
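For example, mirroring the headers Fiddler shows for the working Firefox request might look like the sketch below; every header value here is a placeholder, not a known-good value for this server:
// Sketch (requires using System.Net): copy the headers from the working browser request.
// Restricted headers (User-Agent, Accept, Content-Type, ...) have dedicated properties;
// anything else goes through Headers.Add.
HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url);
request.Method = "POST";
request.ContentType = "text/xml";
request.UserAgent = "Mozilla/5.0 (Windows NT 10.0; rv:80.0) Gecko/20100101 Firefox/80.0";
request.Accept = "text/xml, application/xml";
request.Headers.Add("Accept-Language", "en-US,en;q=0.5");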
Use Fiddler (http://www.fiddler2.com/fiddler2/) to see exactly which headers are sent by Firefox, then see what's different in the headers you are sending.
In my project I use some custom settings in the config area:
<defaultProxy useDefaultCredentials="true">
...
</defaultProxy>
Disabling the proxy helped me:
request.Proxy = null;
Another solution would be to check whether you have mixed up the SSL and non-SSL ports.
When you talk plain HTTP to an HTTPS port, Apache answers with 400 Bad Request.
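A hypothetical illustration of that mix-up (example.com is a placeholder):
// Sketch (requires using System.Net):
// Plain HTTP sent to the SSL port: Apache typically answers 400 Bad Request.
var wrong = (HttpWebRequest)WebRequest.Create("http://example.com:443/service");
// Correct: let the https scheme pick port 443 and perform the TLS handshake.
var right = (HttpWebRequest)WebRequest.Create("https://example.com/service");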