How can I detect whether GZIP compression is enabled for a particular request in my HTTP Module? I apply a filter to the output, and when it acts on gzip-compressed content it corrupts the compression somehow, and the client browser throws an error saying it can't decode the content.
public void Init(HttpApplication context)
{
    // if(HttpContext.Current.IsCompressed) // Check for compressed content here
    // Set up the filter / replacement.
    context.PostReleaseRequestState += (_, __) =>
    {
        var filterStream = new ResponseFilterStream(HttpContext.Current.Response.Filter);
        filterStream.TransformString += FilterPage;
        HttpContext.Current.Response.Filter = filterStream;
    };
}
ResponseFilterStream is a custom stream which caches all stream writes and presents the contents as an event in order to allow a method to modify the contents of the stream. It works great for modifying HTML responses (which is what I want), but I don't want it to act on gzip-compressed responses. How can I detect a gzipped response and prevent the filter stream from being hooked up to the response?
For a response, you can check the Content-Encoding HTTP header for a value of gzip or deflate.
For a request, you need to check the Accept-Encoding HTTP header for a value of gzip or deflate.
HTTP Compression
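As a rough sketch, the check can be folded into the same PostReleaseRequestState handler from the question. This assumes the IIS7+ integrated pipeline (where Response.Headers is readable) and reuses your ResponseFilterStream and FilterPage names:
public void Init(HttpApplication context)
{
    context.PostReleaseRequestState += (_, __) =>
    {
        var response = HttpContext.Current.Response;

        // Skip the filter when the response has already been compressed.
        var contentEncoding = response.Headers["Content-Encoding"];
        if (!string.IsNullOrEmpty(contentEncoding) &&
            (contentEncoding.Contains("gzip") || contentEncoding.Contains("deflate")))
        {
            return;
        }

        var filterStream = new ResponseFilterStream(response.Filter);
        filterStream.TransformString += FilterPage;
        response.Filter = filterStream;
    };
}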
I am trying to add a Content-Type header to an HttpClient GET request; here is my code:
HttpClient client=new ....
bool added = client.DefaultRequestHeaders.TryAddWithoutValidation("Content-Type", "application/x-www-form-urlencoded");
var response = await client.GetAsync(...
but the added variable is false, i.e. it failed to add the header.
How can I add this header?
NOTE:
That post deals with a POST request; I asked about GET.
If you look at the HTTP/1.1 specification:
A sender that generates a message containing a payload body SHOULD generate a Content-Type header field in that message unless the intended media type of the enclosed representation is unknown to the sender. If a Content-Type header field is not present, the recipient MAY either assume a media type of "application/octet-stream" ([RFC2046], Section 4.5.1) or examine the data to determine its type.
Check also the MDN documentation on GET requests:
The HTTP GET method requests a representation of the specified resource. Requests using GET should only retrieve data.
Sending body/payload in a GET request may cause some existing implementations to reject the request — while not prohibited by the specification, the semantics are undefined. It is better to just avoid sending payloads in GET requests.
Effectively, that means that whether or not you send the header, it's going to be ignored and/or rejected.
When setting the content type, it's better to set it from the content itself: How do you set the Content-Type header for an HttpClient request?
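For completeness, a hedged sketch of the only place HttpClient accepts a Content-Type at all, namely on a content object attached to the request (the URL here is a placeholder, and keep the caveats above in mind: servers may ignore or reject a GET with a body, and some runtimes refuse to send one at all):
using System.Net.Http;
using System.Net.Http.Headers;

var client = new HttpClient();
var request = new HttpRequestMessage(HttpMethod.Get, "https://example.com/api/data");
request.Content = new StringContent(string.Empty);
request.Content.Headers.ContentType = new MediaTypeHeaderValue("application/x-www-form-urlencoded");
var response = await client.SendAsync(request);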
I'm currently working on a project where I call an API using a POST request.
This might help in your case; it's how it's done in the official Microsoft documentation.
// Requires: using System.Net.Http.Headers;
using (var content = new ByteArrayContent(byteData))
{
    // This example uses the "application/octet-stream" content type.
    // The other content types you can use are "application/json"
    // and "multipart/form-data".
    content.Headers.ContentType = new MediaTypeHeaderValue("application/octet-stream");
    response = await client.PostAsync(uriBase, content);
}
My application downloads a zipped XML file from the web and tries to create an XML reader:
var fullReportUrl = "http://..."; // valid url here
//client below is an instance of HttpClient
var fullReportResponse = client.GetAsync(fullReportUrl).Result;
var zippedXmlStream = fullReportResponse.Content.ReadAsStreamAsync().Result;
XmlReader xmlReader = null;
using (var gZipStream = new GZipStream(zippedXmlStream, CompressionMode.Decompress))
{
    try
    {
        xmlReader = XmlReader.Create(gZipStream, settings);
    }
    catch (Exception xmlEx)
    {
    }
}
When I try to create the XML reader I get an error:
"The magic number in GZip header is not correct. Make sure you are passing in a GZip stream."
When I use the URL in the browser I successfully download a zip file with a well-formatted XML document in it. My OS is able to unzip it without any issues. I examined the first two characters of the downloaded file and they appear to be 'PK', which is consistent with the ZIP format.
I might be missing a step in stream transformations. What am I doing wrong?
You don't need to use GZipStream to decompress an HTTP response when you're using HttpClient. You can use HttpClientHandler's AutomaticDecompression property to make HttpClient decompress the response automatically for you.
HttpClientHandler handler = new HttpClientHandler()
{
    // both gzip and deflate
    AutomaticDecompression = DecompressionMethods.GZip | DecompressionMethods.Deflate
};

using (var client = new HttpClient(handler))
{
    var fullReportResponse = client.GetAsync(fullReportUrl).Result;
}
Edit 1:
Web servers won't gzip every response. First they check the Accept-Encoding header; if it is set to something like Accept-Encoding: deflate, gzip;q=1.0, *;q=0.5, the web server understands that the client can accept gzip or deflate, so it might (depending on the app logic or server configuration) compress the output with gzip or deflate. In your scenario I don't think you have set the Accept-Encoding header, so the response will come back uncompressed. I still recommend trying the code above.
Read more about Accept-Encoding on MDN.
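For reference, this is roughly what advertising compression support looks like if you set the header yourself instead of relying on AutomaticDecompression (which adds it for you); in that case you are responsible for decompressing the body, as in your original code:
using System.Net.Http;
using System.Net.Http.Headers;

var client = new HttpClient();
client.DefaultRequestHeaders.AcceptEncoding.Add(new StringWithQualityHeaderValue("gzip", 1.0));
client.DefaultRequestHeaders.AcceptEncoding.Add(new StringWithQualityHeaderValue("deflate", 0.5));

var response = client.GetAsync(fullReportUrl).Result;
// The server reports what it actually did in the response headers, e.g. "gzip".
var contentEncoding = string.Join(", ", response.Content.Headers.ContentEncoding);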
I have been working on getting gzip/deflate compression working on Web API responses. I have been using the code from GitHub - MessageHandlers.Compression. However, it didn't appear to work: there was no Content-Encoding header appearing in the Google developer console or in Firebug in Firefox, and the Content-Length was consistently set to the uncompressed size of the data. So I kept stripping out the code until I ended up with the following:
protected override async Task<HttpResponseMessage> SendAsync(HttpRequestMessage request, CancellationToken cancellationToken)
{
    // Send the request to the web api controller
    var response = await base.SendAsync(request, cancellationToken).ConfigureAwait(false);

    // Compress uncompressed responses from the server
    if (response.Content != null && request.Headers.AcceptEncoding.IsNotNullOrEmpty())
    {
        var content = response.Content;
        var bytes = content.ReadAsByteArrayAsync().Result;
        if (bytes != null && bytes.Length > 1024)
        {
            // The data has already been serialised to JSON by this point
            var compressedBytes = Compress(bytes);
            response.Content = new ByteArrayContent(compressedBytes);

            var headers = response.Content.Headers;
            headers.Remove("Content-Type");
            headers.ContentLength = compressedBytes.Length;
            headers.ContentEncoding.Clear();
            headers.ContentEncoding.Add("gzip");
            headers.Add("Content-Type", "application/json");
        }
    }

    return response;
}
private static byte[] Compress(byte[] input)
{
    using (var compressStream = new MemoryStream())
    {
        using (var compressor = new GZipStream(compressStream, CompressionMode.Compress))
        {
            compressor.Write(input, 0, input.Length);
            compressor.Close();
            return compressStream.ToArray();
        }
    }
}
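For local sanity checking (not part of the original handler), a small helper along these lines can round-trip the output of Compress() and confirm it really is valid gzip before worrying about the browser:
private static byte[] Decompress(byte[] input)
{
    using (var inputStream = new MemoryStream(input))
    using (var decompressor = new GZipStream(inputStream, CompressionMode.Decompress))
    using (var outputStream = new MemoryStream())
    {
        // Throws if the input is not a valid gzip stream (wrong magic number, etc.)
        decompressor.CopyTo(outputStream);
        return outputStream.ToArray();
    }
}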
When I initially did this I made a mistake and set the content encoding in the header to 'gzip' when I was actually using a DeflateStream in the Compress method. As you would expect, I got an error in the browser, however the response headers were correct(!). That is, the Content-Encoding header was set and the Content-Length was correct. As well, looking at the raw data I could clearly see it was compressed. As soon as I corrected my error, though, the problem returned.
What I am wondering is: do the latest browser versions decompress the content behind the scenes, or is there actually something wrong with my code? Responses are sent in JSON format.
Any help much appreciated.
EDIT
I tried dumping the headers to a log file in the following methods in Global.asax (listed in the order they appeared in the log):
Application_PreSendRequestHeaders
Application_EndRequest
Application_PreSendRequestContent
In each case the required headers were there, even though they didn't appear in the Google developer console. I then took a look at the solution at Code Project. When run from the command line, everything worked as anticipated. However, when I called the web server from Google Chrome I got the exact same result: no Content-Encoding header, and no indication as to whether the content had been compressed or not. With the developer console open it's easy to see this header at other sites (Stack Overflow, for instance), so I have to assume this is something to do with compressed responses from Web API services. It's hard to know, though, whether this is actually working client side.
In case anyone doesn't want to read all the comments: the answer came from Jerry Hewett (jerhewet), namely that anti-virus software intercepts the response before it gets to the browser. The anti-virus software decompresses the data, no doubt as part of the scanning process. Huge thanks to Jerry for his help here.
I'm trying to send HTTP requests in C# that look like HTTP requests from a certain software. I wanted to use System.Net.HttpWebRequest but it doesn't give me the control I need over its headers: their letter-casing can't be changed (e.g. I want the Connection header to be keep-alive and not Keep-Alive), I don't have full control over the headers ordering, etc.
I tried using HttpClient from CodeScales library. Unfortunately, it doesn't decompress responses automatically (see HttpWebRequest.AutomaticDecompression). I decompressed it myself with System.IO.Compression.GZipStream and DeflateStream, but it didn't work when the response had the header Transfer-Encoding: chunked.
System.Net.Http.HttpRequestHeaders seems to give more control over headers than HttpWebRequest, but still not enough.
How can it be done?
Edit: I know that HTTP accepts those headers as valid anyway, but I'm working with a server that validates the headers and refuses to respond if they're not exactly what it expects.
To set some headers on the HttpWebRequest class, you have to either use a property of the class (for example HttpWebRequest.KeepAlive = true), or add the custom header to the request by calling the Add method on the request's Headers collection.
Something important is that if you try to add a header via Headers.Add while it is already exposed as a property of the request, it will throw an error.
objRequest.Headers.Add("Accept", "some data");
is incorrect. You would instead write:
objRequest.Accept = "some data";
In your case you can use:
objRequest.KeepAlive = true;
Don't worry too much about the letter-casing; it doesn't matter as long as you're sending the appropriate headers to the server.
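Putting that together, a brief sketch (the URL and header values are placeholders):
var objRequest = (HttpWebRequest)WebRequest.Create("https://example.com/");

// Restricted headers are exposed as properties and must be set through them.
objRequest.Accept = "text/html";
objRequest.KeepAlive = true;

// Custom, non-restricted headers go through the Headers collection.
objRequest.Headers.Add("X-Custom-Header", "some data");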
I am trying to stream dynamically generated data to a client over HTTP using IIS. The connection has to remain open for a long period of time, and the server will send periodic status updates to the client while it performs a time-consuming operation.
This MUST all be handled within ONE request, but I am using a WebClient.OpenRead() stream, which cannot be opened until the headers are sent.
How can I force IIS to send headers to the client, and later send a response body?
This behaviour is normally achievable by setting KeepAlive to true and setting the Expect header to "100-continue". By doing this, the server will send the headers with result code 100.
I am not sure if this is possible using WebClient.
Use HttpWebRequest instead to be able to set the values above. In fact, WebClient does nothing magical beyond using GET to fetch the data. Here is the code for OpenRead as seen in Reflector:
try
{
    request = this.m_WebRequest = this.GetWebRequest(this.GetUri(address));
    Stream responseStream = (this.m_WebResponse = this.GetWebResponse(request)).GetResponseStream();
    if (Logging.On)
    {
        Logging.Exit(Logging.Web, this, "OpenRead", responseStream);
    }
    stream2 = responseStream;
}
catch (Exception exception)
{
    //
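Separate from the Reflector excerpt above, here is a minimal sketch of the HttpWebRequest settings described earlier (the URL is a placeholder; the Expect header is restricted, so 100-continue is controlled through ServicePoint rather than Headers.Add, and it only applies to requests that carry a body):
var request = (HttpWebRequest)WebRequest.Create("http://example.com/long-running");
request.KeepAlive = true;
request.ServicePoint.Expect100Continue = true;

using (var response = (HttpWebResponse)request.GetResponse())
using (var stream = response.GetResponseStream())
{
    // Read the periodic status updates from the open stream here.
}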