I have an issue where, if I enable dynamic content compression in IIS 7.5, I get a different content length. I know this can happen since the data is being compressed, but the problem is that the compressed response is actually bigger.
(Before/after screenshots showed the Content-Length increasing once compression was enabled.)
I know there are related posts like this one, but the solutions often involve modules modifying the Content-Length. In this example, I ruled that out by using a simple demo WCF app, but I still get an incorrect content length. If you think I missed the correct question/answer, just let me know.
WCF service returns incorrect Content-Length when using gzip encoding
Here is the solution for the demo WCF service I am using: https://github.com/janmchan/WCFDemo.git
As it turns out, there was nothing wrong with the response. Using Fiddler, I could see the raw response was the compressed version, and the length corresponds to the length of those characters. So our conclusion is that the end system receiving this does not know how to handle the compressed response. I'll keep this answer open for debate until we have confirmed that this is the case.
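For anyone who hits the same symptom: the consuming side has to opt in to decompression. A minimal sketch with HttpClient (the URI is a placeholder, not the demo service's real address); without this opt-in, the client sees the raw gzip bytes and reports their compressed length:

using System;
using System.Net;
using System.Net.Http;
using System.Threading.Tasks;

class Client
{
    static async Task Main()
    {
        // Opt in to gzip/deflate so HttpClient decompresses transparently;
        // it also sends the matching Accept-Encoding header for us.
        var handler = new HttpClientHandler
        {
            AutomaticDecompression = DecompressionMethods.GZip | DecompressionMethods.Deflate
        };
        using (var client = new HttpClient(handler))
        {
            string body = await client.GetStringAsync("http://localhost/Service1.svc/data"); // placeholder URI
            Console.WriteLine(body.Length);
        }
    }
}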
I have tried the C# sample at https://learn.microsoft.com/en-us/azure/azure-monitor/platform/data-collector-api but I always get an error response back. Has anyone been able to get this sample to work? I have double- and triple-checked the workspace ID and the secret key but still cannot call the API successfully.
Yes, as Doris Lv mentioned, displaying the error would certainly help us understand the issue better.
In addition to that, what I could observe in general is that the C# example provided for the Azure Monitor Data Collector API uses ASCII encoding while other parts of the page reference UTF-8, so try tweaking your code to use UTF-8 and see if that resolves your error.
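For example, the signature helper in the published sample is the usual suspect. A sketch of that tweak, where the swap from ASCII to UTF-8 is the only substantive change (the class and variable names here are mine, not the sample's exact code):

using System;
using System.Security.Cryptography;
using System.Text;

static class SignatureHelper
{
    public static string BuildSignature(string message, string secret)
    {
        // The official sample uses ASCII here; UTF-8 keeps the signature
        // consistent with a UTF-8 encoded request body.
        byte[] keyBytes = Convert.FromBase64String(secret);
        byte[] messageBytes = Encoding.UTF8.GetBytes(message);
        using (var hmac = new HMACSHA256(keyBytes))
        {
            return Convert.ToBase64String(hmac.ComputeHash(messageBytes));
        }
    }
}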
I am using MonoTouch to call a remote web service from an iOS app. I use HttpWebRequest and it works great for me for GET, PUT, and POST requests. However, when I try to make a DELETE request, I get some odd behavior: the entity body that I send gets truncated and the server receives an empty body (Content-Length: 0).
The identical code works perfectly when run on a Windows Phone with the WP7.1 implementation of System.Net.HttpWebRequest.
I know that there is some debate on whether RFC 2616 allows an entity body in a DELETE request (e.g. Phil Haack's question). This question isn't about that - it is about why the body does not make it to the server.
Now to the question :-) Is this an issue in MonoTouch's implementation of HttpWebRequest (i.e. does Mono enforce a Content-Length of 0 for the body of a DELETE request)? Or does Mono implement HWR on top of an Apple framework that is responsible for this behavior? The reason for the question, of course, is to better understand whether I can work around the issue and/or implore Miguel to allow DELETE bodies, or whether I need to change my wire format.
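For reference, this is roughly the pattern in play; the URL and payload below are made up for illustration:

using System;
using System.IO;
using System.Net;
using System.Text;

class DeleteDemo
{
    static void Main()
    {
        var request = (HttpWebRequest)WebRequest.Create("https://example.com/api/items/42"); // illustrative URL
        request.Method = "DELETE";
        request.ContentType = "application/json";

        byte[] body = Encoding.UTF8.GetBytes("{\"reason\":\"cleanup\"}"); // illustrative payload
        request.ContentLength = body.Length; // this is what arrives as 0 on MonoTouch

        using (Stream s = request.GetRequestStream())
            s.Write(body, 0, body.Length);

        using (var response = (HttpWebResponse)request.GetResponse())
            Console.WriteLine(response.StatusCode);
    }
}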
This looks like a bug in Mono, after a (very) quick look in the source code I found this, which seems to be the culprit.
You should file a bug with a test case so it can be fixed (even better: provide a patch as well, in which case it shouldn't take long to get it fixed).
I've got some strings that I need to compress server-side in C#, then decompress client-side in JavaScript. What can I use to accomplish this?
Assuming you're fetching this data over HTTP, is there any reason you can't do this at the HTTP level? (See this article for information about HTTP compression.)
That way you shouldn't need to do anything on the client side, apart from making sure that the request includes the appropriate Accept-Encoding header. Depending on your server, you may be able to just tweak some server settings to get the compression automatically on that side too...
To be honest, it's worth breaking out Wireshark to check exactly what's going up and down the wire already. It's just possible you've already got compression without knowing it :)
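If it turns out you do need to nudge the server yourself, the classic ASP.NET approach is to chain a compression stream onto the response filter. A rough sketch, assuming an ASP.NET app with a Global.asax and a client that advertises gzip support:

using System.IO.Compression;
using System.Web;

public class Global : HttpApplication
{
    protected void Application_BeginRequest(object sender, System.EventArgs e)
    {
        string acceptEncoding = Request.Headers["Accept-Encoding"] ?? "";
        if (acceptEncoding.Contains("gzip"))
        {
            // Chain a GZipStream onto the response filter so everything
            // written to the response is compressed on the way out,
            // and tell the browser how to undo it.
            Response.Filter = new GZipStream(Response.Filter, CompressionMode.Compress);
            Response.AppendHeader("Content-Encoding", "gzip");
        }
    }
}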
I've been throwing a little bit of spare time at writing a BitTorrent client, mostly out of curiosity but partly out of a desire to improve my C# skills.
I've been using the theory wiki as my guide. I've built up a library of classes for handling BEncoding, which I'm quite confident in; basically because the sanity check is to regenerate the original .torrent file from my internal representation immediately after parsing, then hash and compare.
The next stage is to get tracker announces working. Here I hit a stumbling block, because trackers reject my requests without terribly useful error messages.
Take, for instance, the latest stack overflow database dump. My code generates the following announce URI:
http://208.106.250.207:8192/announce?info_hash=-%CA8%C1%C9rDb%ADL%ED%B4%2A%15i%80Z%B8%F%C&peer_id=01234567890123456789&port=6881&uploaded=0&downloaded=0&left=0&compact=0&no_peer_id=0&event=started
The tracker's response to my code:
d14:failure reason32:invalid info hash and/or peer ide
The tracker's response to that string dropped into Chrome's address bar:
d8:completei2e11:external ip13:168.7.249.11110:incompletei0e8:intervali600e5:peerslee
The peer_id is (valid) garbage, but changing it to something sensible (impersonating a widely used client) doesn't change anything.
Like I said, I'm pretty sure I'm pulling the info dictionary out properly and hashing (SHA1) like I should, and the peer id is well formed.
My guess is I'm doing some minor thing stupidly wrong, and would appreciate any help in spotting what it is exactly.
It's kind of hard to guess what code would be pertinent (and there's far too much to just post). However, I'll try to post anything asked for.
EDIT
I wasn't hex encoding the info_hash, which sort of helps.
This is the code that takes the generated URI and tries to fetch a response:
// uri is the announce URI generated above
WebRequest req = WebRequest.Create(uri);
WebResponse resp = req.GetResponse();   // the tracker replies with a bencoded dictionary
Stream stream = resp.GetResponseStream();
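The response stream then just gets drained into a byte array before bencode parsing; a sketch of that part, nothing exotic:

// Read the tracker's bencoded reply fully before handing it to the parser.
using (var buffer = new MemoryStream())
{
    stream.CopyTo(buffer);
    byte[] trackerResponse = buffer.ToArray();
}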
MonoTorrent is a BitTorrent implementation that comes with Mono.
In the HTTPTracker class there is a CreateAnnounceString method.
Maybe you can compare your implementation with how that method is doing it?
(You probably need to hunt down where the AnnounceParameters instance is created.)
This isn't an answer to your problem, but it may help for testing.
There are open-source PHP-based torrent trackers out there. They are incredibly inefficient (I know, I wrote a caching mechanism for one back in the day), but you could set up your own local tracker and modify the PHP code to help debug your client as it communicates with the tracker. Having a local client-server setup would make troubleshooting a lot easier.
What exactly are you hashing? You should only hash the info section, not the whole torrent file... So basically, decode the file, re-encode the info section, hash that.
i.e. For the torrent posted, all you should be hashing is:
d6:lengthi241671490e4:name20:so-export-2009-07.7z12:piece lengthi262144e6:pieces18440:<lots of binary data>e
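In C# terms, the computation looks roughly like this; BDecode/BEncode stand in for whatever bencoding library you've built, they're not a specific API:

using System.IO;
using System.Security.Cryptography;

byte[] torrentBytes = File.ReadAllBytes("so-export-2009-07.7z.torrent");
var torrent = BDecode.Parse(torrentBytes);            // illustrative bencode API (returns a dictionary-like object)
byte[] infoBytes = BEncode.ToBytes(torrent["info"]);  // re-encode ONLY the info dict
using (var sha1 = SHA1.Create())
{
    byte[] infoHash = sha1.ComputeHash(infoBytes);    // 20 bytes, goes in the announce URI
}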
There is an error in the URL %-encoding of the info_hash. The leading zeros in the two last bytes of the info_hash have been dropped.
It is: info_hash=-%CA8%C1%C9rDb%ADL%ED%B4%2A%15i%80Z%B8%F%C
Should be: info_hash=-%CA8%C1%C9rDb%ADL%ED%B4%2A%15i%80Z%B8%0F%0C
When the announce string is dropped into Chrome's address bar it's probably auto-corrected by the browser.
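A minimal sketch of an encoder that can't drop those leading zeros (hand-rolled here for clarity rather than taken from any particular library):

using System.Text;

static string UrlEncodeInfoHash(byte[] hash)
{
    var sb = new StringBuilder();
    foreach (byte b in hash)
    {
        bool unreserved = (b >= 'a' && b <= 'z') || (b >= 'A' && b <= 'Z') ||
                          (b >= '0' && b <= '9') || b == '-' || b == '_' ||
                          b == '.' || b == '~';
        if (unreserved)
            sb.Append((char)b);
        else
            sb.AppendFormat("%{0:X2}", b); // X2 always emits two hex digits, e.g. %0F
    }
    return sb.ToString();
}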
I have a "simple" task. I have an existing project with a web service written in C# which has a method that will send a huge XML file to the client. (This is a backup file of data stored on the server that needs to be sent somewhere else.) This service also had some additional authentication/authorization set up.
And I have an existing Delphi 2007 application for WIN32 which calls the web service to extract the XML data for further processing. It's a legacy system that runs without a .NET installation.
Only problem: the XML file is huge (at least 5 MB) and needs to be sent as a whole. Due to system requirements I cannot just split this up into multiple parts. And I'm not allowed to make major changes to either the C# or the Delphi code. (I can only change the method call on both client and server.) And I'm not allowed to spend more than 8 (work) hours to come up with a better solution or else things will just stay unchanged.
The modification I want to add is to compress the XML data (which reduces it to about 100 KB) and then send it to the client as a binary stream. The Delphi code should then accept this incoming stream and decompress the XML data again. Now, with a minimum of changes to the existing code, how should this be done?
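To make it concrete, this is roughly the shape of the server-side change I have in mind (the method and helper names below are made up, not the real code):

using System.IO;
using System.IO.Compression;
using System.Text;
using System.Web.Services;

public class BackupService : WebService
{
    [WebMethod]
    public byte[] GetBackupCompressed() // byte[] travels as base64 in the SOAP envelope
    {
        string xml = LoadBackupXml(); // stand-in for the existing XML-producing code
        byte[] raw = Encoding.UTF8.GetBytes(xml);
        using (var ms = new MemoryStream())
        {
            using (var gzip = new GZipStream(ms, CompressionMode.Compress))
                gzip.Write(raw, 0, raw.Length);
            return ms.ToArray(); // ~100 KB instead of 5+ MB
        }
    }

    private string LoadBackupXml() { /* existing logic elided */ return ""; }
}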
(And yes, I wrote the original client and server in the past and it was never meant to send that much data at once. Unfortunately, the developer who took it over from me had other ideas, made several dumb changes, did more damage and left the company before my steel-tipped boot could connect to his behind so now I need to fix a few things. Fixing this web service has a very low priority compared to the other damage that needs to be restored.)
The server code is based on legacy ASMX stuff, the client code is the result of the Delphi SOAP import with some additional modifications.
The XML is a daily update for the 3000+ users, which happens to be huge in its current design. We're working on this, but that takes time. There are more important items that need to be fixed first, but as I said, there's a small amount of time available to fix this problem quickly.
This sounds like a good candidate for an HttpHandler.
My good links are on my work computer (I'll add them when I get to work), but you can look to see if it will be a good fit.
-- edit --
Here are the links...
http://www.ddj.com/windows/184416694
http://visualstudiomagazine.com/articles/2006/08/01/create-dedicated-service-handlers.aspx
What is the problem with a 5 MB file in a SOAP message? I have written a document server that runs over SOAP, and it has no problem with large files.
If the size is a problem for you, I would just compress and decompress the XML data. This can easily be done with one of the many (free) available components for compression of a TStream descendant.
If you get that kind of compression, merely convert each byte to its hex equivalent, which will only double the size, and send this. Then do the opposite on the other end. Or am I missing something?
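A sketch of that idea on the C# side (the compressed bytes are assumed to come from whatever compressor you pick):

using System;

// Hex-encode compressed bytes so they can travel as a plain SOAP string;
// the Delphi side would decode the hex and then decompress.
static string ToHex(byte[] compressed)
{
    return BitConverter.ToString(compressed).Replace("-", ""); // 2 chars per byte
}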
I would agree with Brad Bruce: an HttpHandler would be fast, and with GZIP or Deflate compression, which (I might be wrong) browsers support natively, you can get great compression on text-based data for cheap CPU time.
// GZipStream wraps an existing Stream; what you write to it comes out compressed on yourXmlDocStream
var gzipStream = new System.IO.Compression.GZipStream(yourXmlDocStream, System.IO.Compression.CompressionMode.Compress);
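And the mirror image for completeness, a sketch of decompressing those bytes back into the XML string on a .NET consumer (the Delphi client would use its own TStream-based equivalent, as mentioned above):

using System.IO;
using System.IO.Compression;
using System.Text;

static string DecompressXml(byte[] compressed)
{
    using (var input = new MemoryStream(compressed))
    using (var gzip = new GZipStream(input, CompressionMode.Decompress))
    using (var output = new MemoryStream())
    {
        gzip.CopyTo(output);
        return Encoding.UTF8.GetString(output.ToArray());
    }
}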