I am using WebClient with C#, and the following code works fine:
wc = new WebClient();
wc.Headers.Add("Content-Type", "application/x-www-form-urlencoded");
NameValueCollection nvc = new NameValueCollection();
nvc.Add("webdata", JsonConvert.SerializeObject(webdata));
response = wc.UploadValues("http://localhost/api/results", "PUT", nvc);
The application will most likely be used over a mobile data connection, so to minimize costs I would like to make sure the data is compressed, as it is all text. I have used JSON instead of XML to reduce the size (and could possibly alter the format to reduce overhead further).
Do I need to compress the data manually before adding it to the WebClient,
or is there some way I can tell WebClient that my web server can handle compression?
(Or does compression on the web server only work for downloads?)
I am running Apache/PHP on the web server.
Thanks in advance.
HTTP compression is normally only used for responses. It is possible to compress requests, but not all web servers will accept such requests and decompress them.
Have you tried adding a header of type "Content-Encoding" and value "gzip" to your request?
You'll still have to compress the contents manually with a GZipStream and write the compressed bytes out to the request stream though.
Don't forget to flush (or dispose) your writers and streams, or not all of the data will be sent over the wire :)
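Building on that, here is a minimal sketch of the sending side. It deviates from the UploadValues call in the question: instead of a form-encoded "webdata" field it PUTs the gzipped JSON directly with UploadData, so the PHP endpoint (the same http://localhost/api/results URL is assumed) would have to read php://input and run it through gzdecode() itself; Apache will not decompress request bodies for you. GzipBytes is just an illustrative helper name.

using System.IO;
using System.IO.Compression;
using System.Net;
using System.Text;
using Newtonsoft.Json;

byte[] GzipBytes(string text)
{
    byte[] raw = Encoding.UTF8.GetBytes(text);
    using (var ms = new MemoryStream())
    {
        // Disposing the GZipStream flushes the final compressed block into ms
        using (var gz = new GZipStream(ms, CompressionMode.Compress))
            gz.Write(raw, 0, raw.Length);
        return ms.ToArray();
    }
}

var wc = new WebClient();
wc.Headers.Add("Content-Type", "application/json");
wc.Headers.Add("Content-Encoding", "gzip");   // tells the PHP side the body is gzipped
byte[] body = GzipBytes(JsonConvert.SerializeObject(webdata));
byte[] response = wc.UploadData("http://localhost/api/results", "PUT", body);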
I'm writing an application that uses the WebClient class.
I'm adding a header like this:
ExC.Headers.Add("Accept-Encoding: gzip, deflate");
where ExC is:
class ExWebClient1 : WebClient
{
    protected override WebRequest GetWebRequest(Uri address)
    {
        HttpWebRequest request = (HttpWebRequest)base.GetWebRequest(address);
        request.AutomaticDecompression = DecompressionMethods.GZip | DecompressionMethods.Deflate;
        return request;
    }
}
Will there be a difference in speed when I use an encoded (compressed) response?
The short answer is: usually, yes.
The long answer: it depends on the following:
Whether the server is configured to compress responses or not.
Whether the request is for dynamic or static content (some servers do not compress dynamic content).
The bandwidth and latency between the server and the client.
The size of the response being returned; on small responses it won't make a big difference.
Also note that adding the "Accept-Encoding" header on the client side tells the server "I understand gzip/deflate"; it does not force the server to compress the response. A rough way to check is shown below.
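This sketch (illustrative only; the URL is a placeholder) fetches the same page twice, once without and once with an Accept-Encoding header, and compares the byte counts. Since AutomaticDecompression is not set on these WebClient instances, DownloadData returns the bytes exactly as the server sent them:

using System;
using System.Net;

var plain = new WebClient();
byte[] uncompressed = plain.DownloadData("http://example.com/page");   // placeholder URL

var gz = new WebClient();
gz.Headers.Add("Accept-Encoding", "gzip, deflate");
byte[] compressed = gz.DownloadData("http://example.com/page");        // raw gzip/deflate bytes, not decoded

Console.WriteLine("without: {0} bytes, with Accept-Encoding: {1} bytes",
                  uncompressed.Length, compressed.Length);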
It depends. By adding this header you are just letting the server know that the client making the request can accept zipped content. If the server is capable of sending a zipped response, it will zip the data after seeing this header. As for performance: if the data to be fetched is large, zipping may help; otherwise zipping adds a small overhead which is generally negligible.
I have the code below in a third-party function; it posts a file to a web server. I want to post the data in parts; what change should I make in the code? The code below is working, and the "request" object contains everything.
private static HttpWebResponse GetRawResponse(HttpWebRequest request)
{
    return (HttpWebResponse)request.GetResponse();
}
Also, is there a way to find out the full name (with path) of the file that is going to be uploaded from the HttpWebRequest object?
Thanks.
HttpWebRequest does not spawn multiple HTTP requests. You cannot upload a file in chunks unless you actually create multiple HttpWebRequests, and I don't think that would be useful unless you have a server-side process to stitch them back together; a rough illustration of that approach is sketched below.
If you still want true chunked uploading, you might want to look at raw TCP or some other mechanism.
Hope that helps. See this post as well.
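For illustration, here is a sketch of the multiple-request approach. The upload.php URL, the file path, and the X-Chunk-* headers are all hypothetical; this only works if you also write a server-side script that understands those headers and stitches the parts back together:

using System;
using System.IO;
using System.Net;

const int chunkSize = 512 * 1024;                 // 512 KB per request
byte[] buffer = new byte[chunkSize];
string path = @"C:\data\upload.bin";              // hypothetical file

using (FileStream file = File.OpenRead(path))
{
    int index = 0;
    int read;
    while ((read = file.Read(buffer, 0, buffer.Length)) > 0)
    {
        var request = (HttpWebRequest)WebRequest.Create("http://localhost/upload.php");
        request.Method = "POST";
        request.ContentType = "application/octet-stream";
        request.ContentLength = read;
        request.Headers.Add("X-Chunk-Index", index.ToString());       // custom headers the server must understand
        request.Headers.Add("X-File-Name", Path.GetFileName(path));

        using (Stream body = request.GetRequestStream())
            body.Write(buffer, 0, read);

        using (var response = (HttpWebResponse)request.GetResponse())
        {
            // check response.StatusCode for each part before sending the next one
        }

        index++;
    }
}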
I'm looking for a solution for uploading a file using an HTTP request. My idea is that I'll transfer my files via POST, and on the PHP side I'll put the files on the server.
How do I do this?
var client = new System.Net.WebClient();
client.UploadFile(address, filename);
See UploadFile on MSDN.
Well I'd try the simplest possible thing that might work to start with - WebClient.UploadFile:
WebClient client = new WebClient();
client.UploadFile(url, file);
Of course, you'll have to write appropriate PHP code to handle the upload...
One more approach is to upload the file via a browser and capture the upload request/response with Fiddler. After that, you can write the exact request using HttpWebRequest in C#; a rough sketch is shown below.
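If you go that route, a hand-built multipart/form-data POST with HttpWebRequest looks roughly like this sketch. The URL, file path, and the "userfile" field name are placeholders you would replace with whatever your Fiddler capture and PHP script actually use:

using System;
using System.IO;
using System.Net;
using System.Text;

string boundary = "----boundary" + DateTime.Now.Ticks.ToString("x");
string filePath = @"C:\data\photo.jpg";                              // placeholder file

var request = (HttpWebRequest)WebRequest.Create("http://localhost/upload.php");
request.Method = "POST";
request.ContentType = "multipart/form-data; boundary=" + boundary;

using (Stream body = request.GetRequestStream())
{
    // Part header for a form field named "userfile" (what PHP sees in $_FILES['userfile'])
    string header =
        "--" + boundary + "\r\n" +
        "Content-Disposition: form-data; name=\"userfile\"; filename=\"" + Path.GetFileName(filePath) + "\"\r\n" +
        "Content-Type: application/octet-stream\r\n\r\n";
    byte[] headerBytes = Encoding.UTF8.GetBytes(header);
    body.Write(headerBytes, 0, headerBytes.Length);

    byte[] fileBytes = File.ReadAllBytes(filePath);
    body.Write(fileBytes, 0, fileBytes.Length);

    byte[] trailer = Encoding.UTF8.GetBytes("\r\n--" + boundary + "--\r\n");
    body.Write(trailer, 0, trailer.Length);
}

using (var response = (HttpWebResponse)request.GetResponse())
using (var reader = new StreamReader(response.GetResponseStream()))
    Console.WriteLine(reader.ReadToEnd());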
My ASP.NET app returns a JSON object to the user; it contains binary-encoded data. Because of this, I decided to enable HTTP compression, and that is when the problem with Content-Length began.
If I enable compression, the Content-Length header is ignored while the response is sent, and the connection is not closed immediately; it stays open for about 15 seconds after all the data has been sent.
I would like to keep HTTP compression enabled, but I don't know how to solve the problem with the Content-Length header.
context.Response.AddHeader("Content-Length", jsonObject.ToString().Length.ToString());
context.Response.Write(jsonObject);
context.Response.Flush();
context.Response.Close();
Content-Length represents the length of the data actually being transferred; in your case that is the compressed bytes. See this SO question, whose answer links to the relevant RFC: content-length when using http compression
If the HTTP compression is being done by the web server (and not by your code), I would suggest not adding the Content-Length header yourself; the web server should add it correctly.
This can be verified using Chrome on http://httpd.apache.org/: if you look at the developer console, you will see that the Content-Length is much smaller than the actual uncompressed page size in bytes.
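If instead you keep compressing inside your own handler (as in the code in the question), the sketch below shows the idea: compress first, then set Content-Length from the compressed byte count rather than from jsonObject.ToString().Length, which is a character count rather than a byte count. It assumes a classic ASP.NET HttpContext and that IIS dynamic compression is turned off for this handler, so the body is not compressed twice:

using System.IO;
using System.IO.Compression;
using System.Text;
using System.Web;

void WriteCompressedJson(HttpContext context, string json)
{
    byte[] raw = Encoding.UTF8.GetBytes(json);
    byte[] compressed;
    using (var ms = new MemoryStream())
    {
        // Disposing the GZipStream flushes the final block before ToArray()
        using (var gz = new GZipStream(ms, CompressionMode.Compress))
            gz.Write(raw, 0, raw.Length);
        compressed = ms.ToArray();
    }

    context.Response.ContentType = "application/json";
    context.Response.AddHeader("Content-Encoding", "gzip");
    context.Response.AddHeader("Content-Length", compressed.Length.ToString());
    context.Response.BinaryWrite(compressed);
    context.Response.Flush();
}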
I am on a Linux server connecting to a web service via PHP/SOAP.
The problem is that one of the methods zips its response via SharpZipLib, so all I get in return is a garbled string.
Does anyone know of a way to unzip this with PHP or JS?
Thanks!
Update:
This is the compressed test data that gets returned:
UEsDBC0AAAAIAI5TWz3XB/zi//////////8EABQAZGF0YQEAEADWAgAAAAAAABYBAAAAAAAAfZLvToNAEMTnUXyDatXEDxcS/3zxizH6BBVESaESKFHf3t+cOWgtNhcuYXd2Zndug570qjdV6rVVpxV3pQ9tlCnohv+ab6Mc1J0G7kynZBb/5IKeYTDLAGOm28hVwtmpobqItfuYACpp1Ki42jobOGqO1eYRIXI2egHfofeOTqt7OE6o8QQdmbnpjMm01JXOdcG5ZKplVDpeEeBr6LCir2umKaJCj3ZSbGPEE3+Nsd/57fADtfYhoRtwZqmJ/c3Z+bmaHl9Kzq6CX20bWRJzjvMNbtjZ71Fvtdfz2RjPY/2ESy54ExJjC6P78U74XYudOaw2gPUOTSyfRDut9cjLmGma2//24TBTwj85573zDhziFkc29wdQSwECLQAtAAAACACOU1s91wf84v//////////BAAUAAAAAAAAAAAAAAAAAAAAZGF0YQEAEADWAgAAAAAAABYBAAAAAAAAUEsFBgAAAAABAAEARgAAAEwBAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA=
Chances are it's using gzip. You should look into PHP's Zlib extension and the gzdecode or gzinflate functions. You'll probably need to look at the Content-Type header or another response header.
You can also try setting an Accept-Encoding header in the web service request that tells the service you don't know how to deal with compression. If it's a proper service, it will honor the request.
EDIT: Looking at the .pdf, they're sending the data as a zip archive, so you need to find a PHP library that deals with in-memory zip archives. The C# code they use to decode it is pretty straightforward: they just read all the entries in the archive and expand them. You can try storing it in an in-memory buffer using a PHP stream wrapper along with the PHP Zip extension.
Did you try setting an Accept-Encoding header that asks for no compression?
You can unzip your string using a function like this:
function decode($data)
{
    // Write the base64-decoded zip archive to a temporary file
    $filename = tempnam('/tmp', 'tempfile');
    file_put_contents($filename, base64_decode($data));

    // Open the archive and read its first entry in full
    $zip = zip_open($filename);
    $entry = zip_read($zip);
    zip_entry_open($zip, $entry, 'r');
    $decoded = zip_entry_read($entry, zip_entry_filesize($entry));

    zip_entry_close($entry);
    zip_close($zip);
    unlink($filename);   // clean up the temporary file

    return $decoded;
}