I have an ASPX page which should retrieve some content (plain text data) asynchronously, and write something before/during/after the operation.
Currently, I can reach the "during" step, but the page content doesn't change after that.
The big issue is that I cannot do any kind of debugging, due to infrastructure (mis)configuration and not being allowed to run the Remote Debugging Tools; I have to rely on publishing and seeing what happens...
The code-behind looks like this (a .NET 3.5 project created under VS2008 and later upgraded to VS2010; changing the target framework is not an option):
void Page_Load()
{
    myLabel = "Preparing to fetch content ...";
    FetchContent();
}

void FetchContent()
{
    try
    {
        // "http://myUrl" returns text with header 'Content-disposition: inline;'
        // If called directly, the text can be seen in the browser alright.
        WebRequest request = WebRequest.Create("http://myUrl");
        myLabel = "Fetching ...";
        request.BeginGetResponse(new AsyncCallback((result) =>
        {
            // EXCEPTION HERE: 401 Unauthorized ??? The url works via a browser!
            WebResponse resp = request.EndGetResponse(result);
            StreamReader stream = new StreamReader(resp.GetResponseStream());
            myLabel = "Done";
        }), null);
    }
    catch { myLabel = "Request KO"; }
}
In the ASPX code, myLabel is simply shown:
<body>
<pre><%=myLabel %></pre>
</body>
The url responds fairly quickly if called from a browser, but in this code myLabel never shows "Done"; it stays on the "Fetching ..." text, as if the callback is never fired.
Am I missing something obvious here?
UPDATE
Closer inspection revealed that EndGetResponse returns a 401 Unauthorized status code, even though the exact same url works flawlessly when invoked via a browser! With that in hand, some more focused searching turned up other answers right here on SO, which solved my (as it turns out) trivial issue by adding this:
request.UseDefaultCredentials = true;
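For context, here is the fix in place. A minimal sketch, assuming the endpoint uses integrated Windows authentication, which the browser satisfies silently while WebRequest sends no credentials unless told to:

WebRequest request = WebRequest.Create("http://myUrl");
// Send the identity the worker process runs under along with the request,
// mirroring the implicit Windows authentication the browser performs.
request.UseDefaultCredentials = true;
// ... then BeginGetResponse as before.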
I have a problem loading a 3D model on an online server; the error is related to accessing the Forge API. Locally it works smoothly, but when mounted on the server it shows the following error: "Failed to load resource: the server responded with a status of 404 (Not Found)", then "onDocumentLoadFailure() - errorCode:7".
As I said in the comments, what I find strangest is that it works locally. Attached is the segment of the code where the error shows up:
function getAccessToken() {
    var xmlHttp = null;
    xmlHttp = new XMLHttpRequest();
    xmlHttp.open("GET", '/api/forge/toke', false); //Address not found
    xmlHttp.send(null);
    return xmlHttp.responseText;
}
Thank you very much in advance.
Are you sure the code you're running locally and the code you've deployed are really the same?
The getAccessToken function doesn't seem to be correct, for several reasons:
First of all, there seems to be a typo in the URL - shouldn't it be /api/forge/token instead of /api/forge/toke?
More importantly, the request is being made synchronously (the third argument to xmlHttp.open is false), which is deprecated on the main thread; HTTP requests should be asynchronous, in which case the response is not available immediately after calling xmlHttp.send(). You can find more details about the usage of XMLHttpRequest at https://developer.mozilla.org/en-US/docs/Web/API/XMLHttpRequest/Using_XMLHttpRequest.
And finally, assuming that the function is passed to Autodesk.Viewing.Initializer options, it should return the token using a callback parameter passed to it (as shown in https://forge.autodesk.com/en/docs/viewer/v7/developers_guide/viewer_basics/initialization/#example).
With that, your getAccessToken should probably look more like this (using the more modern fetch and async/await):
async function getAccessToken(callback) {
    const resp = await fetch('/api/forge/token');
    const json = await resp.json();
    callback(json.access_token, json.expires_in);
}
I've already found the issue. When I deploy, I have to change the url the request is made to so that it uses the public address or domain name. For example: mywebsite.com/aplication-name/api/forge/token.
I have chrome giving me the "Aw Snap" error if I do a subsequent (standard) request while my previous Ajax request is still processing. In my browser debugger I can see that the Ajax request has a status of (Cancelled). Can anyone explain this behaviour or help me to get around this?
The code which causes this problem is proprietary although I think the concept is fairly simple and I was hoping someone can shed any light given the description.
I have a method within a controller called GetItemCount
[NoCache]
public JsonResult GetItemCount(...)
{
    // other code removed for brevity
    Thread.Sleep(5000);
    // other code removed for brevity
    return new JsonResult
    {
        Data = new { count = 10 },
        JsonRequestBehavior = JsonRequestBehavior.AllowGet,
        ContentEncoding = Encoding.UTF8
    };
}
The above method gets called via Jquery Ajax library using
$.ajax(request);
Now, while that indicator badge is waiting to be shown, if I click on an href tag anywhere on the page... the tab crashes. If I wait for the indicator to show and then click, it's all okay. I have also attached the Chrome debugger, which marks the request as (Cancelled).
Some more debugging information; the below is what the Chrome developer tools tell me:
Synchronous XMLHttpRequest on the main thread is deprecated because of its detrimental effects to the end user's experience. For more help, check http://xhr.spec.whatwg.org/.
browserLink:37 Setting 'XMLHttpRequest.withCredentials' for synchronous requests is deprecated.
I recently came across a Chrome issue which I think is worth sharing with you.
I worked on a self-written API using an HttpHandler which primarily returns JSON data, but when an error occurs I wanted to display an HTML file instead. That worked pretty well in IE and FF, but not in Chrome.
Looking at the developer tools revealed this error: net::ERR_INCOMPLETE_CHUNKED_ENCODING
Google didn't have much to say about this issue, even though it seems to come up often. All I could learn was that it magically disappears after some time.
I found out it lies in these lines of code:
result.StoreResult(context);
context.Response.Flush();
context.Response.Close(); //<-- this causes the error
After removing the last line it worked well. I don't know why only Chrome had/has an issue with that, but it seemed as if I closed the response stream before Chrome finished reading it.
I hope it helps those of you coming across the same or a similar issue.
Now my question:
What is the best practice for closing/flushing the response stream? Are there any rules?
According to "ASP.NET sets the transfer encoding as chunked on premature flushing the Response":
ASP.NET transfers the data to the client in chunked encoding (Transfer-Encoding: chunked), if you prematurely flush the Response stream for the Http request and the Content-Length header for the Response is not explicitly set by you.
Solution: You need to explicitly set the Content-Length header for the Response to prevent ASP.NET from chunking the response on flushing.
Here's the C# code that I used for preventing ASP.NET from chunking the response by setting the required header:
protected void writeJsonData(string s)
{
    HttpResponse response = this.Context.Response;
    response.ContentType = "text/json";
    byte[] b = response.ContentEncoding.GetBytes(s);
    // Setting Content-Length explicitly keeps ASP.NET from switching to
    // chunked transfer encoding when the response is flushed below.
    response.AddHeader("Content-Length", b.Length.ToString());
    response.BinaryWrite(b);
    try
    {
        response.Flush();
        response.Close();
    }
    catch (Exception) { }
}
I was running into this error when generating a file and pushing it to the user for download, but only occasionally. When it didn't fail, the file was consistently 2 bytes short. Close() forcibly closes the connection, whether it's finished or not, and in my case it was not. Leaving it out, as suggested in the question, meant the resulting file contained both the generated content as well as the HTML for the entire page.
The solution here was replacing
context.Response.Flush();
context.Response.Close();
with
context.Response.End();
which does the same, but without cutting the transaction short.
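To illustrate, a minimal sketch of the file-download pattern described above; the handler shape and the GenerateFile helper are hypothetical:

// Hypothetical handler: stream generated bytes to the client, then end the
// response without cutting the connection short.
void WriteDownload(HttpContext context)
{
    byte[] fileBytes = GenerateFile(); // hypothetical generator
    context.Response.Clear();          // drop anything already buffered
    context.Response.ContentType = "application/octet-stream";
    context.Response.AddHeader("Content-Disposition", "attachment; filename=data.bin");
    context.Response.AddHeader("Content-Length", fileBytes.Length.ToString());
    context.Response.BinaryWrite(fileBytes);
    context.Response.End(); // flushes the buffered output, then ends the request
}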
In my case, the problem was cache-related and was happening when doing a CORS request.
Forcing the response header Cache-Control to no-cache resolved my issue:
[ using Symfony HttpFoundation component ]
<?php
$response->headers->add(array(
'Cache-Control' => 'no-cache'
));
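For those following this thread from the ASP.NET side, the equivalent no-cache header can be set like this (a sketch using System.Web):

// Emit a Cache-Control: no-cache header so the (CORS) response is never
// served from cache.
context.Response.Cache.SetCacheability(HttpCacheability.NoCache);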
I was also getting the same error. In my case the issue was with the web server user's permissions on the cache folder.
On the off chance that someone is landing here as a result of issues with their ASP.NET Core project, I was able to resolve it by adding the IIS middleware.
This is done by adding UseIISIntegration when instantiating your web host instance.
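As a sketch, this is where the call goes in a 1.x/2.x-style Program.cs using WebHostBuilder (newer templates wire this up automatically):

var host = new WebHostBuilder()
    .UseKestrel()
    .UseIISIntegration() // accept requests forwarded by the IIS reverse proxy
    .UseStartup<Startup>()
    .Build();

host.Run();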
I once had the same problem, and the main reason lay in my controller return type.
If you try to return a C# object just as-is, you will only get net::ERR_INCOMPLETE_CHUNKED_ENCODING, so don't forget to serialize your complex objects before sending them out to the JavaScript client (or View).
I.e., my controller return type was:
public async Task<List<ComplexModel>> GetComplexModelList()
{
    return new List<ComplexModel>();
}
That caused the INCOMPLETE_CHUNKED_ENCODING error, so I tried to fix my mistake with something like:
using Newtonsoft.Json;
...
public async Task<string> GetComplexModelList()
{
    return JsonConvert.SerializeObject(new List<ComplexModel>());
}
I want to use the Facebook Like button on my website. As most of my visitors are from Iran, where Facebook is filtered, the Like button gets blocked for them and the page gets very ugly. What I have thought of to prevent this is to use WebClient to try to connect to Facebook: if successful, I place the Like button, otherwise I don't:
string fb_result = "failed";
WebClient webclient = new WebClient();
try
{
    fb_result = webclient.DownloadString("http://www.facebook.com");
}
catch
{
    fb_result = "failed"; // if it's being filtered an exception is raised
}
Then in my HTML:
<%if(fb_result!="failed"){%>
<div id="fb-root"></div>
<script>(function(d, s, id) {
    var js, fjs = d.getElementsByTagName(s)[0];
    if (d.getElementById(id)) return;
    js = d.createElement(s); js.id = id;
    js.src = "//connect.facebook.net/en_US/all.js#xfbml=1";
    fjs.parentNode.insertBefore(js, fjs);
}(document, 'script', 'facebook-jssdk'));</script>
<div class="fb-like" data-href="http://www.mysite.com" data-send="true" data-width="450" data-show-faces="true"></div>
<%}%>
This works fine, but the problem is that when WebClient is unable to connect, it takes too much time to raise the error. Is there any way to make WebClient try for a shorter time and, if unable to connect, raise the error faster?
By the way, any other way to check whether a connection to Facebook is possible is appreciated, as this might not be the only way to check it.
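On the timeout question itself: WebClient has no timeout setting of its own, but a common workaround is to subclass it and shorten the timeout of the underlying request. A sketch; the three-second value is an arbitrary choice:

using System;
using System.Net;

// Overriding GetWebRequest lets us control how long the underlying
// request waits before raising an exception.
public class TimedWebClient : WebClient
{
    protected override WebRequest GetWebRequest(Uri address)
    {
        WebRequest request = base.GetWebRequest(address);
        request.Timeout = 3000; // milliseconds; the default is 100 seconds
        return request;
    }
}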
As I stated in my comments, the WebClient code you've written will not work because it is executed on the server, so it will attempt to connect to Facebook using your webserver's connection, which will always succeed (unless your webhost goes offline).
The best approach is to have an AJAX request made by your page when it loads in the browser. It should make a request for a known resource on Facebook (such as their homepage, or the "Like" image itself). Your AJAX response handler will then load the rest of the Facebook client scripts if, and only if, the response is as-expected. If you test it by requesting the Like image then you just need to check the response's content-type (i.e. ensure it's image/png); if the response has a non-200 status code or if the request times-out then you don't load the Facebook scripts.
What I want to be able to do: Edit HTTP Requests before they are sent off to the server
The user navigates to a webpage of their choice in their browser > they encounter a request they wish to edit > they edit the request, and that gets sent to the server instead of the original one.
What I have done so far: I have captured the request, now I need help finding the code to edit it. Here is my code for capturing the request so far:
Fiddler.FiddlerApplication.BeforeRequest += sess =>
{
    // Code to detect user-specified URL here
};
Is it possible for me to edit the request before it is actually sent? If it can be done using the FiddlerCore API only then I'd be grateful, although I am willing to download more binaries if required.
Additional notes: I have tried StreamWriters, binary writers, and copying the response into a memory stream, editing it, then copying it back; none of those methods work for me. Also, when I try some methods my app just hangs and doesn't respond to things like pressing the X.
Maybe I'm just bad at explaining what I'm trying to achieve; it seems the only good answer I have has been about responses :/
If the request reads the string "hello world", then I'd like the user to be able to change the REQUEST to say "hello there".
Such a noobish mistake I made; I thought that RequestBody was read-only! It turns out I could have simply edited the request like this:
session.RequestBody = myBytes;
Really annoyed at myself for this!
In the demo app, adding the delegate is shown as:
Fiddler.FiddlerApplication.BeforeResponse += delegate(Fiddler.Session oS) {
    // Console.WriteLine("{0}:HTTP {1} for {2}", oS.id, oS.responseCode, oS.fullUrl);

    // Uncomment the following two statements to decompress/unchunk the
    // HTTP response and subsequently modify any HTTP responses to replace
    // instances of the word "Microsoft" with "Bayden". You MUST also
    // set bBufferResponse = true inside the beforeREQUEST method above.
    //
    //oS.utilDecodeResponse(); oS.utilReplaceInResponse("Microsoft", "Bayden");
};
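For the request side specifically, here is a minimal sketch of editing a request body inside BeforeRequest; the URL filter and the replacement strings are assumptions taken from the "hello world" example above:

using System.Text;

Fiddler.FiddlerApplication.BeforeRequest += delegate(Fiddler.Session oS)
{
    // Hypothetical filter: only touch the request the user wants to edit.
    if (oS.uriContains("example.com/target"))
    {
        oS.utilDecodeRequest(); // undo any compression/chunking on the body first
        string body = Encoding.UTF8.GetString(oS.RequestBody);
        oS.RequestBody = Encoding.UTF8.GetBytes(body.Replace("hello world", "hello there"));
    }
};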