I am using Silverlight 5, .NET 4.5.1 and OpenRIA.
A Silverlight client calls a long-running OpenRIA operation. The operation is asynchronous. From the client side I can see that the function is called just once from code. From the IIS server side, on the other hand, the WCF function is called multiple times.
Here is what I logged through Fiddler: the operation finished with an error. It was invoked once, but with the message "NOTE: This request was retried after a Receive operation failed."
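The service operation is conceptually along these lines. This is a hedged sketch reconstructed from the request URL in the capture below; the namespaces assume OpenRIA's WCF RIA Services-style API, and PayrollListDto is a hypothetical placeholder for the real return type:
using System;
using OpenRiaServices.DomainServices.Hosting;
using OpenRiaServices.DomainServices.Server;

// Hypothetical payload type standing in for the real result
public class PayrollListDto { }

[EnableClientAccess]
public class PayrollListService : DomainService
{
    // HasSideEffects = false lets the client call this with HTTP GET,
    // which matches the captured request below
    [Invoke(HasSideEffects = false)]
    public PayrollListDto GetPayrollList(Guid payrollListId)
    {
        // Long-running work that builds a large result goes here
        return new PayrollListDto();
    }
}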
Request:
GET http://localhost:11213/ClientBin/KEEP-Web-Services-PayrollListService.svc/binary/GetPayrollList?payrollListId=efb1df5d-993a-4c4b-9fe6-013561547632 HTTP/1.1
Accept: */*
Referer: http://localhost:11213/ClientBin/KEEP.xap
Accept-Language: pl
Accept-Encoding: gzip, deflate
User-Agent: Mozilla/5.0 (Windows NT 6.3; WOW64; Trident/7.0; rv:11.0) like Gecko
Host: localhost:11213
DNT: 1
Connection: Keep-Alive
Cookie: .KEEP_ASPXAUTH_iPersonel=8673CF25C5650AE86CE77A22B9C9A9D20E7588A077E5EADFFE8F5090F08B48639C9F309B1720BC4AD0D4DE342F149D52234DD8C5F15C0B0CCAD5A074C91E8F14B74FC27D7740A91614DECE034A9F99186375ACEB887E610B32CEA5786BF5EA02D35F144BC49D1E4C254478385EEB4D7E8811959E5494D9D6E9F17D698FCBDC93
Response:
HTTP/1.1 504 Fiddler - Receive Failure
Date: Wed, 28 Oct 2015 13:06:41 GMT
Content-Type: text/html; charset=UTF-8
Connection: close
Cache-Control: no-cache, must-revalidate
Timestamp: 14:06:41.238
[Fiddler] ReadResponse() failed: The server did not return a complete response for this request. Server returned 0 bytes.
The situation occurs in IIS Express and IIS 7.5, both locally and remotely.
UPDATE 1.
I have found the reason the operation is being repeated:
Failed to allocate a managed memory buffer of 134217728 bytes. The amount of available memory may be low.
What can I do to handle this with OpenRIA (formerly WCF RIA Services)? I don't see any custom binding that would have an effect on OpenRIA.
The problem was IIS running in 32-bit mode. Using the 64-bit version solves the problem.
P.S. Visual Studio has an option to use IIS Express in 64-bit. You can also add it to the registry:
reg add HKEY_CURRENT_USER\Software\Microsoft\VisualStudio\12.0\WebProjects /v Use64BitIISExpress /t REG_DWORD /d 1
(12.0 corresponds to Visual Studio 2013.)
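On full IIS, the equivalent check is to make sure the application pool is not forced into 32-bit mode. A sketch using appcmd (the pool name here is an example, not taken from the original post):
%windir%\system32\inetsrv\appcmd set apppool /apppool.name:"DefaultAppPool" /enable32BitAppOnWin64:false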
Related
I have an older C# MVC web app on .NET 4.7 that makes a specific $.post call. Running locally from Visual Studio 2019 in any browser, I have no problem at all: the call goes through and posts the partial page into the div as expected. Running from our test URL, it gives me an error in Edge and Chrome, but not in Firefox. It returns a 411 error, which I know means it's a content-length issue. The problem is that others accessing the site do not run into that issue in any browser. Since it works on other people's machines and in one particular browser on my local machine, I suspect it's a security setting or something along those lines for that particular site. I've cleared out the browser settings, reset to factory defaults, removed extensions, made sure everything is up to date, and tested with and without antivirus software interaction.
This fails:
$.post('/Controller/Method', function (data) {
    $('#container').html(data);
});
But this works:
$.post("/Controller/OtherMethod", { paramOne: varOne, paramTwo: varTwo }, function (data) {
$("#container").html(data);
});
Both functions work in Firefox when running from the test URL; only the second one works in Edge/Chrome from the test URL.
Any ideas on what I might need to check?
Here's the header from the failing call in Dev Tools:
General
Request URL: https://[url]/Controller/Method?param=123
Request Method: POST
Status Code: 411
Remote Address: [remote_ip]
Referrer Policy: strict-origin-when-cross-origin
Response Headers
content-type: text/html; charset=us-ascii
date: Thu, 19 Aug 2021 17:08:18 GMT
server: Microsoft-HTTPAPI/2.0
Request Headers
:authority: [url]
:method: POST
:path: /Controller/Method?param=123
:scheme: https
accept: */*
accept-encoding: gzip, deflate, br
accept-language: en-US,en;q=0.9
cache-control: no-cache
content-length: 0
cookie: [cookie info]
origin: https://[url]
pragma: no-cache
referer: https://[url]/Controller/Method?param=123
sec-ch-ua: "Chromium";v="92", " Not A;Brand";v="99", "Google Chrome";v="92"
sec-ch-ua-mobile: ?0
sec-fetch-dest: empty
sec-fetch-mode: cors
sec-fetch-site: same-origin
user-agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/92.0.4515.159 Safari/537.36
x-requested-with: XMLHttpRequest
After experiencing the same problem in both Chrome and Edge across two different websites, I think I have the answer:
The problem appears to be BitDefender antivirus.
If I disable BitDefender protection, the post requests succeed.
EDIT:
Bizarrely, if I then re-enable BitDefender protection, the issue doesn't come back.
After stumbling across the solution, I found a few references to people having the same problem:
https://community.bitdefender.com/en/discussion/88573/status-411-length-required-during-browsing
https://support.mozilla.org/en-US/questions/1344632
I have a basic HTTP file server with a simple API, written in WPF/C#, that's meant for users to access locally or from their local network. It uses System.Net.HttpListener. So the URL to load a jpg, for example:
http://localhost:8085/?getfile=d:/temp/Patern_test.jpg
works perfectly in every browser including Edge. No problems. But if I embed any link to my file server in an HTML file:
<html>
<body>
<object data="http://localhost:8085/?getfile=d:/temp/Patern_test.jpg">
</object>
</body>
</html>
Then it only works in Chrome and Firefox, not in Edge or IE 11. I've tried image, video, and iframe tags and the result is the same. I've also tried replacing localhost with the actual IP and calling it remotely, with the same results.
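For reference, the serving side is conceptually similar to this minimal HttpListener sketch (an illustration only, not the actual code; the getfile parsing and the lack of error handling are assumptions):
using System.IO;
using System.Net;

class FileServer
{
    static void Main()
    {
        var listener = new HttpListener();
        listener.Prefixes.Add("http://localhost:8085/");
        listener.Start();
        while (true)
        {
            HttpListenerContext ctx = listener.GetContext();
            // e.g. http://localhost:8085/?getfile=d:/temp/Patern_test.jpg
            string path = ctx.Request.QueryString["getfile"];
            byte[] bytes = File.ReadAllBytes(path);
            ctx.Response.ContentType = "image/jpg"; // as in the capture below; "image/jpeg" is the registered type
            ctx.Response.ContentLength64 = bytes.Length;
            ctx.Response.OutputStream.Write(bytes, 0, bytes.Length);
            ctx.Response.OutputStream.Close();
        }
    }
}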
However if I embed a link to another site instead of to my server, it works fine on all browsers:
<html>
<body>
<object data="https://upload.wikimedia.org/wikipedia/commons/d/db/Patern_test.jpg">
</object>
</body>
</html>
So it seems there's something about my server that Edge doesn't like. Can anyone help me understand what this could be? Here is a comparison of the headers:
My Local HTTP Server (Edge doesn't like this):
Request URL: http://localhost:8085/?getfile=d:/temp/Patern_test.jpg
Request Method: GET
Status Code: 200 OK
Remote Address: [::1]:8085
Referrer Policy: no-referrer-when-downgrade
Response Headers
HTTP/1.1 200 OK
Content-Length: 36623
Content-Type: image/jpg
Server: Microsoft-HTTPAPI/2.0
Date: Thu, 30 Aug 2018 10:50:03 GMT
Request Headers
GET /?getfile=d:/temp/Patern_test.jpg HTTP/1.1
Host: localhost:8085
Connection: keep-alive
Cache-Control: max-age=0
Upgrade-Insecure-Requests: 1
DNT: 1
User-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/68.0.3440.106 Safari/537.36
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,image/apng,*/*;q=0.8
Accept-Encoding: gzip, deflate, br
Accept-Language: en-US,en;q=0.9
Wikipedia Server (Edge likes this):
Request URL: https://upload.wikimedia.org/wikipedia/commons/d/db/Patern_test.jpg
Request Method: GET
Status Code: 304
Remote Address: 198.35.26.112:443
Referrer Policy: no-referrer-when-downgrade
Response Headers
accept-ranges: bytes
access-control-allow-origin: *
access-control-expose-headers: Age, Date, Content-Length, Content-Range, X-Content-Duration, X-Cache, X-Varnish
age: 39865
content-type: image/jpeg
date: Thu, 30 Aug 2018 11:12:24 GMT
etag: 77d2a6bf840622331df62963174df72d
last-modified: Mon, 07 Oct 2013 13:46:55 GMT
status: 304
strict-transport-security: max-age=106384710; includeSubDomains; preload
timing-allow-origin: *
via: 1.1 varnish (Varnish/5.1), 1.1 varnish (Varnish/5.1), 1.1 varnish (Varnish/5.1), 1.1 varnish (Varnish/5.1)
x-analytics: https=1
x-cache: cp1090 hit/1, cp2022 pass, cp4021 hit/3, cp4021 miss
x-cache-status: hit-local
x-client-ip: 47.154.205.235
x-object-meta-sha1base36: 8ktqmjes2nx4xcpy5h6v7o1sh6yao09
x-timestamp: 1381153614.85319
x-trans-id: tx76f1f3c2f9384f9eaff9b-005b871d52
x-varnish: 207150442 204346683, 73416305, 38868466 25417638, 135979082
Request Headers
:authority: upload.wikimedia.org
:method: GET
:path: /wikipedia/commons/d/db/Patern_test.jpg
:scheme: https
accept: text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,image/apng,*/*;q=0.8
accept-encoding: gzip, deflate, br
accept-language: en-US,en;q=0.9
cache-control: max-age=0
cookie: WMF-Last-Access-Global=26-Aug-2018; GeoIP=US:CA:Thousand_Oaks:34.21:-118.88:v4
dnt: 1
if-modified-since: Mon, 07 Oct 2013 13:46:55 GMT
if-none-match: 77d2a6bf840622331df62963174df72d
upgrade-insecure-requests: 1
user-agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/68.0.3440.106 Safari/537.36
Also I should mention that I very much need this to work in Edge because it will be used in conjunction with a WebView control in a UWP app.
UPDATE:
It turns out this issue only happens under two conditions, neither of which actually matters to me:
1. The link is embedded in a local HTML file that I load directly in Edge.
2. I use UWP's WebView.NavigateToString(stringOfHtml).
If the HTML is served directly from the local server using WebView.Navigate(Uri), then the embedded link works fine.
This leads me to suspect some kind of low-level security measure to prevent dynamically created HTML from running malicious code. Just a thought.
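For clarity, the two code paths being compared look roughly like this (a sketch; webView is a Windows.UI.Xaml.Controls.WebView, and the page URI is an illustrative assumption):
// Case 2 above: embedded http://localhost:8085/... links fail to load
webView.NavigateToString(stringOfHtml);

// Works: serve the same HTML from the local server and navigate to it
webView.Navigate(new Uri("http://localhost:8085/?getfile=d:/temp/page.html"));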
I have a C# .NET application that writes a zipped file back to the client for download. However, the browser either never receives the file or rejects it, and it shows no notification at all. I have tried both Firefox and Chrome.
I have captured the request and response from the client and server using Fiddler:
Request:
POST http://localhost:62526/Reports/_Report_RewardLetters HTTP/1.1
Host: localhost:62526
Connection: keep-alive
Content-Length: 228
Accept: */*
Origin: http://localhost:62526
X-Requested-With: XMLHttpRequest
User-Agent: Mozilla/5.0 (Windows NT 6.3; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/48.0.2564.109 Safari/537.36
Content-Type: application/x-www-form-urlencoded; charset=UTF-8
DNT: 1
Referer: http://localhost:62526/Reports
Accept-Encoding: gzip, deflate
Accept-Language: en-US,en;q=0.8
Response:
HTTP/1.1 200 OK
Cache-Control: private
Transfer-Encoding: chunked
Content-Type: application/octet-stream
Server: Microsoft-IIS/10.0
content-dispostion: filename=Letter.zip
X-AspNet-Version: 4.0.30319
X-SourceFiles: =?UTF-8?B?QzpcVXNlcnNcc2FpbmlfaFxTb3VyY2VcUmVwb3NcUmVzZWFyY2hPZmZpY2VEYXNoYm9hcmRcUmVzZWFyY2hPZmZpY2VEYXNoYm9hcmRcUmVwb3J0c1xfUmVwb3J0X1Jld2FyZExldHRlcnM=?=
X-Powered-By: ASP.NET
Date: Thu, 11 Feb 2016 23:13:22 GMT
....Truncated the file contents.....
My code:
Response.Clear();
Response.ClearHeaders();
Response.ClearContent();
// Note: the header name is misspelled ("content-dispostion") and the
// "attachment;" token is missing, so browsers ignore it as written.
// The correct form is: Response.AddHeader("Content-Disposition", "attachment; filename=MyFile.zip");
Response.AddHeader("content-dispostion", "filename=MyFile.zip");
Response.ContentType = "application/octet-stream";
Response.Flush();
Response.WriteFile(myfile);
Response.Flush();
Response.End();
I have tried numerous combinations of Response.Flush(), Response.Clear(), HttpContext.ApplicationInstance.CompleteRequest(), Response.BinaryWrite(), Response.TransmitFile(), etc., but none of them work. Additionally, my code has the necessary checks to confirm that the file exists.
From the Fiddler captures, I suspect something is wrong with the encoding or with the server's response, and the browser is silently rejecting the file.
Thanks for your help!
Just a thought: do you need the Response.Flush() calls? Setting your headers, writing your file and then calling Response.End() should be enough.
Also, set the ContentType to "application/zip, application/octet-stream".
I had made a mistake that was causing all the fuss: the view was using an Ajax form to make the request instead of a normal HTML form.
I fixed the problem by changing my controller and view according to the information here:
http://geekswithblogs.net/rgupta/archive/2014/06/23/downloading-file-using-ajax-and-jquery-after-submitting-form-data.aspx
Download Excel file via AJAX MVC
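In outline, the pattern from those links is to trigger the download with a regular form post and return a FileResult from the controller instead of writing to the Response manually. A minimal sketch (the controller name, action name, and file path are illustrative assumptions):
using System.Web.Mvc;

public class ReportsController : Controller
{
    [HttpPost]
    public ActionResult RewardLetters()
    {
        // Assumed file location, for illustration only
        string path = Server.MapPath("~/App_Data/Letter.zip");
        // File(...) emits a correct Content-Disposition: attachment; filename=Letter.zip header
        return File(path, "application/zip", "Letter.zip");
    }
}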
The request headers are missing Accept-Encoding: gzip, deflate, but the response headers include Content-Encoding: gzip. Does this cause decompression to fail? If yes, how can I avoid it?
Request URL: http://something.com/something.js
Request Method: GET
Status Code: 200 OK
Request Headers
Accept: */*
Referer: something.com/something.aspx
User-Agent: Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/38.0.2125.111 Safari/537.36
X-DevTools-Emulate-Network-Conditions-Client-Id: 2D3ED9B5-95BD-4984-9EEE-405C2889F11E
Response Headers
Accept-Ranges: bytes
Content-Encoding: gzip
Content-Length: 884
Content-Type: application/x-javascript
Date: Tue, 28 Oct 2014 11:09:13 GMT
ETag: "0ac99ce3e9fcf1:0"
Last-Modified: Mon, 14 Jul 2014 08:37:12 GMT
Server: Microsoft-IIS/8.0
Vary: Accept-Encoding
X-Powered-By: ASP.NET
From RFC 7231:
A request without an Accept-Encoding header field implies that the
user agent has no preferences regarding content-codings. Although
this allows the server to use any content-coding in a response, it
does not imply that the user agent will be able to correctly process
all encodings.
In short: if you specify no Accept-Encoding, it's legal (though ill-advised) for the server to send you compressed content. There doesn't appear to be a solid, reliable way to tell a web server that it should definitely not compress. You can try Accept-Encoding: *;q=0 or Accept-Encoding: identity, but support for this is not universal across web servers, and proxies can mess things up as well.
In the end you are probably better off simply handling compressed content if it comes back as such; there is no good reason for a client not to support compression, and libraries for this are freely available.
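For example, a C# client can opt in to transparent decompression. A minimal sketch (the URL reuses the placeholder from the question):
using System.Net;
using System.Net.Http;
using System.Threading.Tasks;

class Fetcher
{
    public static async Task<string> FetchAsync(string url)
    {
        var handler = new HttpClientHandler
        {
            // Decompresses gzip/deflate bodies automatically and strips Content-Encoding
            AutomaticDecompression = DecompressionMethods.GZip | DecompressionMethods.Deflate
        };
        using (var client = new HttpClient(handler))
        {
            return await client.GetStringAsync(url);
        }
    }
}

// Usage: string js = await Fetcher.FetchAsync("http://something.com/something.js");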
I am currently developing a small WebSocket server in C# to handle connections from browsers.
I mostly used the code from Mozilla and Microsoft (respectively here and there). Unfortunately, when I try to connect to my basic server from a browser (the script was taken from websocket.org), the GET request appears to be fragmented.
I would like to understand why the GET request is divided into two parts. Let me show you my code and the output I get from it.
Code:
while ((i = stream.Read(bytes, 0, bytes.Length)) != 0)
{
    // Decode only the bytes actually read on this pass
    data = Encoding.UTF8.GetString(bytes, 0, i);
    Console.WriteLine("Received: {0}", data);
}
Output:
Received: GET / HTTP/1.1
Host: 127.0.0.1
User-Agent: Mozilla/5.0 (Windows NT 6.3; WOW64; rv:30.0) Gecko/20100101 Firefox/30.0
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
Accept-Language: en-US,en;q=0.5
Accept-Encoding: gzip, deflate
Received:
Sec-WebSocket-Version: 13
Origin: null
Sec-WebSocket-Key: 2QYy54zGPPKAkNyPgFjkbw==
Connection: keep-alive, Upgrade
Pragma: no-cache
Cache-Control: no-cache
Upgrade: websocket
I also tried on Chrome to see how different it would be and by running the same code and the same script in the browser I get the following output:
Received: GET / HTTP/1.1
Upgrade: websocket
Connection: Upgrade
Host: 127.0.0.1
Origin: null
Pragma: no-cache
Cache-Control: no-cache
Sec-WebSocket-Key: mp5oLqe/YaQAxRksoVZWKg==
Sec-WebSocket-Version: 13
Sec-WebSocket-Extensions: permessage-deflate; client_
Received: max_window_bits, x-webkit-deflate-frame
User-Agent: Mozilla/5.0 (Windows NT 6.3; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/35.0.1916.153 Safari/537.36
I see nothing wrong in the output with Chrome, so I don't really understand why the request is divided when using Firefox.
I could merge the two parts of the request together, but if I understood RFC 6455 properly, any bad request should be dropped by the server.
Any suggestions?
Thank you very much.
That is totally fine. Firefox may simply be sending the information in a way (more slowly, or in separate TCP segments) that makes you read the network buffer twice, but this is OK. It seems to send the common headers that never change first, and the WebSocket-specific information (which has to be generated for each connection) last, probably to reduce the time to first byte.
As far as the HTTP protocol is concerned, the header section ends at an empty line, so as long as you have not seen one and the network buffer still contains data (read != 0), you should continue reading the header.
You will run into this situation often when doing network programming: keep reading from the stream until the message is complete, or until Read returns 0, which means the peer closed the connection.
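A minimal sketch of that approach, reusing the stream and Encoding.UTF8 pieces from the question (the buffer size and the StringBuilder accumulation, from System.Text, are my own choices):
var request = new StringBuilder();
var bytes = new byte[4096];
int read;
while ((read = stream.Read(bytes, 0, bytes.Length)) != 0)
{
    request.Append(Encoding.UTF8.GetString(bytes, 0, read));
    // An empty line (CRLF CRLF) terminates the HTTP request headers
    if (request.ToString().Contains("\r\n\r\n"))
        break;
}
Console.WriteLine("Received: {0}", request);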