What I am trying to accomplish is adding a GET method to my WCF REST based service and accessing it via the WebRequest class from a Silverlight 3 client application.
I am getting the error "The remote server returned an error: NotFound.", which, as I understand it, can be just a generic error reported for any 500 error encountered on the server.
WCF operation contract:
[OperationContract, WebGet(UriTemplate = "path/{id}")]
Stream Get(string id);
Operation implementation:
public Stream Get(string id)
{
WebOperationContext.Current.OutgoingResponse.ContentType = "application/xml; charset=utf-8";
return new MemoryStream(Encoding.UTF8.GetBytes("<xml><id>1</id><name>Some Name</name></xml>"));
}
Client code that throws exception:
HttpWebRequest webRequest = WebRequest.CreateHttp("http://domain.com/my-service.svc/path/1");
webRequest.BeginGetResponse(
    x =>
    {
        try
        {
            using (WebResponse webResponse = webRequest.EndGetResponse(x)) // <-- Exception thrown here
            using (Stream stream = webResponse.GetResponseStream())
            {
                //do stuff here...eventually.
            }
        }
        catch (Exception ex)
        {
        }
    },
    null);
I suspect that it has something to do with the return type, and I have also tried returning XmlElement to no avail. I am really stumped here; any ideas what I might be doing wrong?
Note that I can successfully hit the method via Fiddler and a web browser.
Try putting the code below into your web.config file (change the file name in the initializeData attribute appropriately; if you are using full IIS rather than Cassini or IIS Express, which I use, make sure to put the log file somewhere your web application has write permission). This will cause WCF to generate a fairly detailed log file. I've found the log to be pretty handy.
<system.diagnostics>
<sources>
<source name="System.ServiceModel"
switchValue="Information, ActivityTracing"
propagateActivity="true">
<listeners>
<add name="traceListener"
type="System.Diagnostics.XmlWriterTraceListener"
initializeData= "c:\temp\WEBTraces.log" />
</listeners>
</source>
</sources>
</system.diagnostics>
Here's another thing to check: is domain.com exactly the same domain name your Silverlight app is running from? (For example, is your SL app starting as localhost/xx while your web service call goes to domain.com?)
For security reasons, Silverlight will not make cross-domain web service calls unless the called domain grants it permission (same as Flash). If this is the case, you will need a clientaccesspolicy.xml file.
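For reference, a minimal clientaccesspolicy.xml (served from the root of the web service's domain) that opens the service up to every caller looks roughly like the sketch below; tighten the domain uri and resource path before using it in production:
<?xml version="1.0" encoding="utf-8"?>
<access-policy>
  <cross-domain-access>
    <policy>
      <allow-from http-request-headers="*">
        <domain uri="*"/>
      </allow-from>
      <grant-to>
        <resource path="/" include-subpaths="true"/>
      </grant-to>
    </policy>
  </cross-domain-access>
</access-policy>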
You can read about it here: http://weblogs.asp.net/jgalloway/archive/2008/12/12/silverlight-crossdomain-access-workarounds.aspx
There is a video here: http://www.silverlight.net/learn/data-networking/introduction-to-data-and-networking/how-to-use-cross-domain-policy-files-with-silverlight
There are some helpers here: http://timheuer.com/blog/archive/2008/04/06/silverlight-cross-domain-policy-file-snippet-intellisense.aspx
NotFound should mean 404 and not 500. A 404 error could be produced by a wrong URI.
Uri resturi = new Uri(String.Format("http://{0}:8080/MyService/", hostname)); // http
WebHttpBinding rest = new WebHttpBinding(WebHttpSecurityMode.TransportCredentialOnly); // WebHttpSecurityMode.Transport for ssl
host.AddServiceEndpoint(typeof(IMyService), rest, resturi);
With the code example above, your service would be available via http://host:8080/MyService/path/1.
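One thing to double-check when self-hosting a WebGet/UriTemplate endpoint this way: the endpoint typically also needs the WebHttpBehavior attached, otherwise the URI-template dispatching will not kick in. A minimal sketch, assuming the same host, binding and contract as above:
// Sketch: capture the endpoint and attach WebHttpBehavior so WebGet/UriTemplate
// routing works (assumes 'host' is the ServiceHost used above; requires
// System.ServiceModel.Description for ServiceEndpoint and WebHttpBehavior).
ServiceEndpoint endpoint = host.AddServiceEndpoint(typeof(IMyService), rest, resturi);
endpoint.Behaviors.Add(new WebHttpBehavior());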
So I'm using Grapevine.RESTClient to manage the client side of my REST interface. I'm using it to communicate between a service running in LocalSystem and a process run by the user on the same machine.
My problem is that when the service is not running my client gets an exception with a message of 'Error: Value cannot be null. Parameter name: cookies'
I'm trying to create some logic on the client that is supposed to understand and accept that sometimes the service is unavailable, for example while the service is auto-updating.
Or maybe I should just accept that the message of the exception is a little odd?
RESTClient client = new RESTClient(baseUrl);
RESTRequest request = new RESTRequest(resource);
request.Method = Grapevine.HttpMethod.GET;
request.ContentType = Grapevine.ContentType.JSON;
request.Timeout = 30000;
RESTResponse response = client.Execute(request);
The above throws a System.ArgumentNullException with e.Message = "Value cannot be null.\r\nParameter name: cookies"
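A rough sketch of the kind of guard I have in mind on the client side, treating the current ArgumentNullException the same as an ordinary connection failure, since that is what it effectively signals today (client and request are the objects from the code above):
// Sketch: treat both connection failures and Grapevine's current
// ArgumentNullException as "service temporarily unavailable".
bool serviceAvailable = true;
RESTResponse response = null;
try
{
    response = client.Execute(request);
}
catch (ArgumentNullException) // thrown today when the service is down
{
    serviceAvailable = false;
}
catch (WebException)
{
    serviceAvailable = false;
}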
Hmmm... Looking at the Grapevine code on GitHub, it seems the code adds the response's cookie collection to this.Cookies even when the response object was created from e.Response in the catch block of the GetResponse call, in which case it may or may not have a cookie collection. There should have been a null check around this.Cookies.Add(response.Cookies), right?
https://github.com/scottoffen/Grapevine/blob/master/Grapevine/Client/RESTClient.cs
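In other words, the fix inside RESTClient would presumably be a guard along these lines (not the library's actual code, just the shape of the null check being suggested):
// Hypothetical guard: only copy cookies when the response, which may have
// been built from e.Response in the catch block, actually carries any.
if (response != null && response.Cookies != null)
{
    this.Cookies.Add(response.Cookies);
}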
I'm unable to create a grapevine tag as the developer of Grapevine suggested; I don't have enough reputation points.
I've had the same problem. Unfortunately the error message is misleading. For me, the fix was to add a default proxy to the App.config file:
<?xml version="1.0" encoding="utf-8"?>
<configuration>
<system.net>
<defaultProxy useDefaultCredentials="true" />
</system.net>
</configuration>
I have a webpage at localhost:63342 with a jQuery ajax call in that webpage, to my webservice server at localhost:55000. In the webservice I set the Access-Control headers.
In Chrome's developer tools, Network tab, I can see that the OPTIONS preflight thing is sent, and the response header has the following, which looks great.
Access-Control-Allow-Headers:x-requested-with, X-Auth-Token, Content-Type, Accept, Authorization
Access-Control-Allow-Methods:POST, OPTIONS, GET
Access-Control-Allow-Origin:*
Cache-Control:private
Content-Length:0
Date:Fri, 06 Jun 2014 13:30:58 GMT
Server:Microsoft-IIS/8.0
However, the response to the OPTIONS request hits the error function of my jQuery ajax call. Developer tools shows me that the browser prepares the POST, but fails it because it thinks the resource does not have the Access-Control-Allow-Origin header set. The browser does not try to send the POST. Here is the error message from the console of the browser:
XMLHttpRequest cannot load http://localhost:55000/webservice/ws.svc/CreateOuting. No 'Access-Control-Allow-Origin' header is present on the requested resource. Origin 'http://localhost:63342' is therefore not allowed access.
It's as though jQuery is interfering in the OPTIONS/POST process. Any ideas on what I should do to make this work?
Here is my ajax call
$.ajax({
    type: 'POST',
    data: JSON.stringify(obj),
    headers: { "Content-type": "application/json" },
    url: base_url + 'CreateOuting',
    crossDomain: true,
    success: function (an_outing) {
        $('#listviewOutings').listview('refresh', true);
        $('#boxOutingName')[0].value = '';
        myLib.OpenBasicPopup('Success', 'The outing name was saved.');
    },
    error: function (err) {
        alert(err.statusText); // response to OPTIONS request ends up here
    }
});
Here is how I set the headers in the method on the server (.NET C#):
public bh_Outing CreateOuting(string strOuting) {
    try
    {
        //for all cors requests
        WebOperationContext.Current.OutgoingResponse.Headers.Add("Access-Control-Allow-Origin", "*");
        //identify preflight request and add extra headers
        if (WebOperationContext.Current.IncomingRequest.Method == "OPTIONS")
        {
            WebOperationContext.Current.OutgoingResponse.Headers.Add("Access-Control-Allow-Methods", "POST, OPTIONS, GET");
            WebOperationContext.Current.OutgoingResponse.Headers.Add("Access-Control-Allow-Headers", "x-requested-with, X-Auth-Token, Content-Type, Accept, Authorization");
            return null;
        }
        // do stuff
Here is the interface for that method. I don't think it's perfect yet, but I don't think it's the problem either.
[WebInvoke(UriTemplate = "*", Method = "*", ResponseFormat = WebMessageFormat.Json)]
[OperationContract]
bh_Outing CreateOuting(string strOuting);
Thanks for taking a look at this. I am truly stuck.
Update, 6/17/14, 5:38 PM EST
I added an element to my web.config as in this post, and it made no change to my results.
This may not be the reason, but have you tried enabling CORS in jQuery? Before any CORS ajax request:
jQuery.support.cors = true;
The cause of the problem that made me post the above question seems to be that Chrome was misrepresenting what was really happening. As I wrote above, what appeared to be happening was:
... the response to the OPTIONS request hits the error function of my
jQuery ajax call. Developer tools shows me that the browser prepares
the POST, but fails it because it thinks the resource does not have
the Access-Control-Allow-Origin header set. The browser does not try
to send the POST.
To try to get more detail, I installed and used Microsoft Message Analyzer. It showed that the OPTIONS request was being sent; the OPTIONS response was being sent; the POST request was being sent back to the webservice (!!!) and a POST response was being sent.
This was a huge breakthrough because now, instead of trying to solve a CORS problem (the error message that Chrome developer tools showed) I started working to solve a "Not Found" and then a "Bad Request" error (the error messages that MS Message Analyzer showed).
The other technique that helped a lot was setting up tracing in my webservice. Tracing wrote its output to a log file (which opens in a trace viewer by default) and provided the details I needed to figure out the real issues. Here is what I put in web.config to enable tracing:
<system.diagnostics>
<trace autoflush="true" />
<sources>
<source name="System.ServiceModel"
switchValue="Critical, Error, Warning"
propagateActivity="true">
<listeners>
<add name="myListener"
type="System.Diagnostics.XmlWriterTraceListener"
initializeData= "c:\logs\ServiceModelExceptions.svcLog" />
</listeners>
</source>
</sources>
</system.diagnostics>
I made no changes to the CORS settings (jQuery and web.config), and it's running fine in Chrome now. Also, I now have the webservice out on the public web, so it is truly cross-domain, not just localhost:63342 to localhost:55000.
I have spent a good amount of time now configuring my proxy. At the moment I use a service called proxybonanza. They supply me with a proxy which I use to fetch webpages.
I'm using HtmlAgilityPack.
Now, if I run my code without a proxy, there's no problem, either locally or when uploaded to the webhost server.
If I use the proxy, it takes somewhat longer but it still works locally.
If I publish my solution to my webhost, I get a SocketException (0x274c):
"A connection attempt failed because the connected party did not properly respond
after a period of time, or established connection failed because connected host has
failed to respond 38.69.197.71:45623"
I have been debugging this for a long time.
My app.config has two entries that are relevant here:
<httpWebRequest useUnsafeHeaderParsing="true" />
<httpRuntime executionTimeout="180" />
That helped me through a couple of problems.
Now this is my C# code.
HtmlWeb htmlweb = new HtmlWeb();
htmlweb.PreRequest = new HtmlAgilityPack.HtmlWeb.PreRequestHandler(OnPreRequest);
HtmlDocument htmldoc = htmlweb.Load(@"http://www.websitetofetch.com",
    "IP", port, "username", "password");

//This is the preRequest config
static bool OnPreRequest(HttpWebRequest request)
{
    request.KeepAlive = false;
    request.Timeout = 100000;
    request.ReadWriteTimeout = 1000000;
    request.ProtocolVersion = HttpVersion.Version10;
    return true; // ok, go on
}
What am I doing wrong? I have enabled the tracer in app.config, but I don't get a log file on my webhost. Here is the logging section from app.config:
<system.diagnostics>
<sources>
<source name="System.ServiceModel.MessageLogging" switchValue="Warning, ActivityTracing" >
<listeners>
<add name="ServiceModelTraceListener"/>
</listeners>
</source>
<source name="System.ServiceModel" switchValue="Verbose,ActivityTracing">
<listeners>
<add name="ServiceModelTraceListener"/>
</listeners>
</source>
<source name="System.Runtime.Serialization" switchValue="Verbose,ActivityTracing">
<listeners>
<add name="ServiceModelTraceListener"/>
</listeners>
</source>
</sources>
<sharedListeners>
<add initializeData="App_tracelog.svclog"
type="System.Diagnostics.XmlWriterTraceListener, System, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089"
name="ServiceModelTraceListener" traceOutputOptions="Timestamp"/>
</sharedListeners>
</system.diagnostics>
Can anyone spot the problem? I have toggled these settings on and off what feels like a thousand times:
request.KeepAlive = false;
System.Net.ServicePointManager.Expect100Continue = false;
Carl
Try downloading the page as a string first, then passing it to HtmlAgilityPack. This will let you isolate errors that happen during the download process from those that happen during the html parsing process. If you have an issue with proxybonanza (see end of post) you will be able to isolate that issue from a HtmlAgilityPack configuration issue.
Download page using WebClient:
// Download page
System.Net.WebClient client = new System.Net.WebClient();
client.Proxy = new System.Net.WebProxy("{proxy address and port}");
string html = client.DownloadString("http://example.com");
// Process result
HtmlAgilityPack.HtmlDocument htmlDoc = new HtmlAgilityPack.HtmlDocument();
htmlDoc.LoadHtml(html);
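If your proxy requires a username and password (as in your HtmlWeb.Load call), you can attach credentials to the WebClient proxy as well; a small sketch with placeholder values:
// Attach proxy credentials (placeholders; substitute your real proxy login)
System.Net.WebProxy proxy = new System.Net.WebProxy("{proxy address and port}");
proxy.Credentials = new System.Net.NetworkCredential("username", "password");
client.Proxy = proxy;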
If you want more control over the request, use System.Net.HttpWebRequest:
// Create request
HttpWebRequest request = (HttpWebRequest)WebRequest.Create("http://example.com/");
// Apply settings (including proxy)
request.Proxy = new WebProxy("{proxy address and port}");
request.KeepAlive = false;
request.Timeout = 100000;
request.ReadWriteTimeout = 1000000;
request.ProtocolVersion = HttpVersion.Version10;
// Get response
string html = null;
try
{
    using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
    using (Stream stream = response.GetResponseStream())
    using (StreamReader reader = new StreamReader(stream))
    {
        html = reader.ReadToEnd();
    }
}
catch (WebException)
{
    // Handle web exceptions (connection/proxy failures, HTTP errors)
}
catch (Exception)
{
    // Handle other exceptions
}
// Process result
HtmlAgilityPack.HtmlDocument htmlDoc = new HtmlAgilityPack.HtmlDocument();
htmlDoc.LoadHtml(html);
Also, ensure that your proxy provider (proxybonanza) allows access from your production environment to your proxies. Most providers will limit access to the proxies to certain IP addresses. They may have allowed access to the external IP of the network where you are running locally but NOT the external IP address of your production environment.
It sounds like your web host has disabled outgoing connections from ASP.NET applications for security because it would allow other scripts/apps to perform malicious attacks from their servers.
You would have to ask them to unblock connections on your account, but don't be surprised if they say no.
I've been battling with this issue for over 2 weeks now and have gotten nowhere.
First the environment: Windows 2003 R2 SP2 / SharePoint 2007 SP1 / .NET 3.5.
Basically, we are making web service calls to gather data from a remote API. The API has several endpoints for REST and several for SOAP. The endpoints are HTTPS endpoints with digest authentication. When we make calls with the SOAP endpoints, everything seems to work just fine. But then we try to make a call using REST and the thread hangs then dies a horrible death when IIS decides that the thread isn't responding anymore and kills it. At first, we thought this was an SSL issue (and it still might be) because we don't see any issues when using the HTTP endpoints (route to the same API just not SSL).
Below is the code we're using to make the REST call:
private void Process(HttpContext context, String url, String restParam)
{
    ServicePointManager.ServerCertificateValidationCallback += new System.Net.Security.RemoteCertificateValidationCallback(validateCertificate);
    WriteLogMessage("Start Process");
    String pattern = "{0}{1}";
    String address = String.Format(pattern, url, restParam);
    WriteLogMessage("ADDRESS is" + address);
    LSWebClient client = new LSWebClient();
    client.Timeout = 600000;
    WriteLogMessage("TIMEOUT (client.Timeout) is " + client.Timeout.ToString());
    client.Credentials = new NetworkCredential(XYZConfigurationSettings.APIUserName, XYZConfigurationSettings.APIPassword);
    try
    {
        String result = client.DownloadString(address);
        WriteLogMessage("End Process. RESULT length is " + (result != null ? result.Length : 0));
        context.Response.Write(result);
    }
    catch (Exception ex)
    {
        WriteLogMessage("EXCEPTION!!! Message----" + ex.Message + "---- StackTrace ----" + ex.StackTrace + "");
    }
}

private bool validateCertificate(object sender, X509Certificate cert, X509Chain chain, System.Net.Security.SslPolicyErrors error)
{
    WriteLogMessage("bypassAllCertificateStuff");
    return true;
}
So, crappy code aside, we put in a few things here to try to get around what we thought was an SSL Certificate issue. (setting the request timeout to 10 minutes, using custom certificate validation, etc...) However, none of this seems to fix the issue.
Here's the result of our logging:
2/28/2011 3:35:28 PM: Start
2/28/2011 3:35:28 PM: Start Process
2/28/2011 3:35:28 PM: ADDRESS ishttps://<host>/ws/rs/v1/taxonomy/TA/root/
2/28/2011 3:35:28 PM: TIMEOUT (client.Timeout) is 600000
2/28/2011 3:35:50 PM: CheckValidationResult
2/28/2011 3:35:50 PM: bypassAllCertificateStuff
2/28/2011 3:41:51 PM: EXCEPTION!!! Message ----Thread was being aborted.---- StackTrace ---- at System.Net.Connection.CompleteStartConnection(Boolean async, HttpWebRequest httpWebRequest)
at System.Net.Connection.CompleteStartRequest(Boolean onSubmitThread, HttpWebRequest request, TriState needReConnect)
at System.Net.Connection.SubmitRequest(HttpWebRequest request)
at System.Net.ServicePoint.SubmitRequest(HttpWebRequest request, String connName)
at System.Net.HttpWebRequest.SubmitRequest(ServicePoint servicePoint)
at System.Net.HttpWebRequest.GetResponse()
at System.Net.WebClient.GetWebResponse(WebRequest request)
at System.Net.WebClient.DownloadBits(WebRequest request, Stream writeStream, CompletionDelegate completionDelegate, AsyncOperation asyncOp)
at System.Net.WebClient.DownloadDataInternal(Uri address, WebRequest& request)
at System.Net.WebClient.DownloadString(Uri address)
at System.Net.WebClient.DownloadString(String address)
at XYZ.DAO.Handlers.RestServiceHandler.Process(HttpContext context, String url, String restParam)
at XYZ.DAO.Handlers.RestServiceHandler.ProcessRequest(HttpContext context)----
I have attempted to use my browser to view the return data, but the browser is IE6, which doesn't support SSL. However, I can see (in Fiddler / Charles Proxy) that it does attempt to make the request and receives a 401 error, but since I cannot see the server-side traffic with these tools, I cannot tell at exactly which step the error is happening.
To make matters worse, I can not reproduce this issue on any other server I have (note: they are all Windows 2008 servers).
So, in summary, here's what I've found:
SOAP - work
REST - no work
Win2008 - work
Win2003 - no work
HTTP - work
HTTPS - no work
If anyone has any insight, or any other debugging / information-gathering approach that I haven't tried, I would be extremely grateful.
You should be able to get a bunch more tracing information if you add the following to your client .config file.
<system.diagnostics>
<sources>
<source name="System.Net" switchValue="Information, ActivityTracing">
<listeners>
<add name="System.Net"
type="System.Diagnostics.TextWriterTraceListener"
initializeData="System.Net.trace.log" />
</listeners>
</source>
</sources>
</system.diagnostics>
I've found what was causing the web service call to hang - the issue was that the service we were calling was using replay attack protection along with digest security:
1. Our server would send an initial request sans security header.
2. The request was responded to with a standard 401 challenge providing a nonce for use. (That nonce expires 10 seconds after the challenge.)
3. Our server then took 30 seconds to generate a second response using this nonce.
4. So the remote server would then find the expired nonce and again issue another 401 challenge.
The cycle would continue until the local server's thread was terminated. However, why our local server is taking 30 $##%! seconds to generate a security header is beyond me. I inspected the logs produced by the diagnostics above, but none of them were much help. I'm going to chalk it up to the server being overloaded and not having enough memory to process its way out of a wet paper bag.
I am currently writing a C# web service which has several methods, one of which has to receive HTTP POST requests. The first thing I did was alter the web.config file in the web service project as below:
<webServices>
<protocols>
<add name="HttpSoap"/>
<add name="HttpPost"/>
<add name="HttpPostLocalhost"/>
<add name="Documentation"/>
</protocols>
</webServices>
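The web method itself is shaped roughly like this (a simplified sketch; the real method name is different, but, as described below, it takes a single string parameter named args):
[WebMethod]
public string ProcessMessage(string args)
{
    // handle the posted data
    return "OK";
}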
I can run the web service locally, and when I click on the method in the browser I can see that it handles HTTP POST requests and accepts args=string, as the signature of my web method accepts one string parameter named args. I am then testing this via a test ASP.NET app, using the code below to fire the HTTP POST request.
HttpWebRequest request = (HttpWebRequest)HttpWebRequest.Create(ConfigurationManager.AppSettings["PaymentHubURL"].ToString());
request.KeepAlive = false;
request.ContentType = "application/x-www-form-urlencoded";
request.Method = "POST";
StringBuilder sb = new StringBuilder();
sb.Append("message_type=");
sb.Append(HttpUtility.UrlEncode("Txn_Response"));
byte[] bytes = UTF8Encoding.UTF8.GetBytes(sb.ToString());
request.ContentLength = bytes.Length;
using (Stream postStream = request.GetRequestStream())
{
    postStream.Write(bytes, 0, bytes.Length);
}
string test;
using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
{
    StreamReader reader = new StreamReader(response.GetResponseStream());
    test = reader.ReadToEnd();
}
But when I run this I get "The remote server returned an error: (500) Internal Server Error". If I remove the parameter, by removing the StringBuilder and byte code as well as having no parameter in the web service, it works. So it is obviously a problem with the parameters. I actually want to send more data, and was using a string[] parameter in the web service, but this also failed.
Can anyone help??
I would suggest that you reconsider your approach. Microsoft has written pretty awesome libraries for consuming web services, but there are two ways to do it - "add web reference" and "add service reference".
In your case, it seems you have an "asmx web service", so I would recommend that you add a "web reference" to your project in Visual Studio (assuming you are using Visual Studio).
After you add this web reference, you can create your client by "new"-ing it. You can then execute any web method on this client. This is the easiest way to consume web services; you do not have to deal with any HTTP complications.
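For example, after adding a web reference (named here, hypothetically, PaymentHubService), the whole request/response dance collapses to something like this sketch; the generated proxy takes care of the HTTP plumbing and parameter encoding:
// Sketch: class and method names are hypothetical placeholders for the
// proxy that "Add Web Reference" generates from your .asmx.
PaymentHubService.PaymentHub client = new PaymentHubService.PaymentHub();
string result = client.ProcessMessage("Txn_Response");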
Hope this helps.
My guess is that you are building the HTTP POST request incorrectly.
Try using the code shown at the link below to create your request:
http://msdn.microsoft.com/en-us/library/debx8sh9.aspx
Alternatively, the response may not contain Unicode character values; try copying and pasting this code to read the response:
System.Text.Encoding encode = System.Text.Encoding.GetEncoding("utf-8");
HttpWebResponse webResponse = (HttpWebResponse)request.GetResponse();
Stream objStream = webResponse.GetResponseStream();
StreamReader objSR = new StreamReader(objStream, encode, true);
string sResponse = objSR.ReadToEnd();