I am having two issues with ServiceStack. I am trying to build a service that imitates existing server software, and that requires a few things I am having trouble with.
This is a self-hosted ServiceStack instance on the latest version.
1. I need a service on "/" that takes no parameters.
2. All my services need to return results using a custom XML serializer, not the DataContract one, no matter what is in the Accept header. (Currently they return an HTML representation of the DTO.)
For issue 1 I have been using [FallbackRoute("/")], which works, but no matter what I do I can't get my custom serializer to be used.
For issue 2 I made a custom serializer using the .NET XML serializer that generates the output I need, and registered it with ContentTypeFilters. I then manually set the response type header, but this did not trigger my serializer. This is really starting to drive me nuts, as I need to implement about 20 services and I can't even get the simple root service working, let alone the rest of them.
Basically, my XML is in a format the DataContract serializer can't handle, and the URLs and content must exactly match the existing system.
It looks like issue 1 and issue 2 are really the same issue: your custom serialiser isn't getting called. This is a problem with registering your serialiser, with returning the content type, or both. Below shows how to set it up, using ServiceStack v4.
Register your custom serialiser:
In your AppHost Configure method you need to register your custom XML serialiser:
StreamSerializerDelegate serialize = (request, response, stream) => {
    // Replace with the appropriate call to your serializer and write its output to the stream
    var myCustomSerializer = new MyCustomSerializer(response);
    var bytes = Encoding.UTF8.GetBytes(myCustomSerializer.GetResult());
    stream.Write(bytes, 0, bytes.Length);
};

StreamDeserializerDelegate deserialize = (type, fromStream) => {
    // Implement this if you expect to receive request bodies in this content type
    throw new NotImplementedException();
};

// Register these methods to run when content type 'application/xml' is sent/received
ContentTypes.Register("application/xml", serialize, deserialize);
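For reference, here is a minimal sketch of what MyCustomSerializer might look like, assuming it simply wraps the .NET XmlSerializer (the class and its members are illustrative, not part of ServiceStack):

using System.IO;
using System.Xml.Serialization;

// Illustrative sketch: serialize the response DTO with System.Xml.Serialization.XmlSerializer,
// which gives finer control over element names and shape than the DataContract serializer
public class MyCustomSerializer
{
    private readonly object _response;

    public MyCustomSerializer(object response)
    {
        _response = response;
    }

    public string GetResult()
    {
        var serializer = new XmlSerializer(_response.GetType());
        using (var writer = new StringWriter())
        {
            serializer.Serialize(writer, _response);
            return writer.ToString();
        }
    }
}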
Set the return content type:
In your service you need to set the return content type so the serialiser knows to run. You can do this either by adding an attribute on each method that needs to use this type, or, if all your methods return this type, you can configure it as the default.
Per method basis:
You can use the AddHeader attribute with the ContentType parameter, e.g.:
public class TestService : Service
{
    [AddHeader(ContentType = "application/xml")]
    public TestResponse Get(RootRequest request)
    {
        return new TestResponse { Message = "Hello from root" };
    }
}
All methods return this type:
You can set the default content type in the AppHost Configure method, e.g.:
public override void Configure(Funq.Container container)
{
    SetConfig(new HostConfig {
        DebugMode = true,
        DefaultContentType = "application/xml"
    });
}
Fully working demo app
The demo is a self-hosted console app that takes a request to the root / or to /Test and returns a custom-serialised response; a sketch of its key pieces follows.
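A minimal sketch of such a demo (the DTO and host names are illustrative, and serialize/deserialize are the delegates registered above):

[FallbackRoute("/")]
public class RootRequest { }

public class TestResponse
{
    public string Message { get; set; }
}

public class TestService : Service
{
    [AddHeader(ContentType = "application/xml")]
    public TestResponse Get(RootRequest request)
    {
        return new TestResponse { Message = "Hello from root" };
    }
}

public class DemoAppHost : AppHostHttpListenerBase
{
    public DemoAppHost() : base("Demo", typeof(TestService).Assembly) { }

    public override void Configure(Funq.Container container)
    {
        // register the custom serialise/deserialise delegates shown earlier
        ContentTypes.Register("application/xml", serialize, deserialize);
    }
}

class Program
{
    static void Main()
    {
        var appHost = new DemoAppHost();
        appHost.Init();
        appHost.Start("http://localhost:8088/"); // illustrative port
        Console.ReadLine(); // keep the console host alive
    }
}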
Hope this helps.
Disclaimer: I have not worked with SOAP web services ever. At all. Not even a little bit. So the concept of channels and WCF scaffolding has got me a bit confused, hence I'm here.
I have been asked to integrate with a SOAP XML web service that uses basic authentication (an Authorization header of the form Basic xxxxxxxxxxxxxxxxxxxxxxxxxxxx, where the value is a Base64-encoded username:password). My project is in .NET Core using C#.
I have used Visual Studio WCF connected service discovery to produce scaffolding code, which has served me very well for instantiating the required objects etc. However, my issue is that I've been asked to use Basic authentication, and I have no idea where to inject this into the scaffolding that's been produced. I have worked with basic authentication before, so I understand 'how' to do it for things like REST APIs: just username:password, Base64 encode, and add to the Authorization header. However, I am unsure how to do this for this scaffolded SOAP web service.
The code that I believe can be injected into every request to add custom headers is:
using (OperationContextScope scope = new OperationContextScope(/* an IContextChannel or OperationContext */))
{
    OperationContext.Current.OutgoingMessageProperties[HttpRequestMessageProperty.Name] = new HttpRequestMessageProperty()
    {
        Headers =
        {
            { "MyCustomHeader", Environment.UserName },
            { HttpRequestHeader.UserAgent, "My Custom Agent" }
        }
    };
    // perform proxy operations...
}
The OperationContextScope expects either an IContextChannel or OperationContext. I am stuck as to what to add here. If I look at my scaffolded code, I can find the 'client' for the web service, here:
public partial class POSUserInformationManagerV1_2Client : System.ServiceModel.ClientBase<McaEndpointPosUserInformation.POSUserInformationManagerV1_2>, McaEndpointPosUserInformation.POSUserInformationManagerV1_2
And I can find the 'channel' here, but it's just another interface that doesn't have any contracts specified:
[System.CodeDom.Compiler.GeneratedCodeAttribute("Microsoft.Tools.ServiceModel.Svcutil", "2.0.2")]
public interface POSUserInformationManagerV1_2Channel : McaEndpointPosUserInformation.POSUserInformationManagerV1_2, System.ServiceModel.IClientChannel
{
}
I looked up ChannelBase, and it seems like it should accept a variety of objects that implement one or another channel interface (including IClientChannel, which the scaffolded POSUserInformationManagerV1_2Channel uses).
protected class ChannelBase<T> : IDisposable, IChannel, ICommunicationObject, IOutputChannel, IRequestChannel, IClientChannel, IContextChannel, IExtensibleObject<IContextChannel> where T : class
{
    protected ChannelBase(ClientBase<T> client);

    [SecuritySafeCritical]
    protected IAsyncResult BeginInvoke(string methodName, object[] args, AsyncCallback callback, object state);
    [SecuritySafeCritical]
    protected object EndInvoke(string methodName, object[] args, IAsyncResult result);
}
But I'm still stuck on what I can put into the OperationContextScope to connect it appropriately to the 'channel'. I've tried POSUserInformationManagerV1_2Client and the relevant Channel interface, but neither will convert to an IContextChannel. Does anyone have any ideas/thoughts?
EDIT: Here is where I am trying to inject the code to add the Auth HTTP header:
public System.Threading.Tasks.Task<McaEndpointPosUserInformation.requestUserInformationResponse> requestUserInformationAsync(McaEndpointPosUserInformation.requestUserInformation request)
{
    using (OperationContextScope scope = new OperationContextScope(request))
    {
        OperationContext.Current.OutgoingMessageProperties[HttpRequestMessageProperty.Name] = new HttpRequestMessageProperty()
        {
            Headers =
            {
                { "MyCustomHeader", Environment.UserName },
                { HttpRequestHeader.UserAgent, "My Custom Agent" }
            }
        };
        // perform proxy operations...
    }
    return base.Channel.requestUserInformationAsync(request);
}
The issue turned out to be that the transport security had not been set to 'Basic', which is done with:
// Set the binding. Without this, the WCF call will be made as anonymous
var binding = new BasicHttpsBinding();
binding.Security.Transport.ClientCredentialType = HttpClientCredentialType.Basic;
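For completeness, a minimal sketch of wiring this into the scaffolded client might look like the following (the endpoint URL and credentials are placeholders, and the (Binding, EndpointAddress) constructor is the one svcutil normally generates):

// Binding with Basic transport security, so WCF sends the
// Authorization: Basic header built from the supplied credentials
var binding = new BasicHttpsBinding();
binding.Security.Transport.ClientCredentialType = HttpClientCredentialType.Basic;

// Placeholder endpoint; use the real service URL here
var endpoint = new EndpointAddress("https://example.com/POSUserInformationManagerV1_2");

var client = new POSUserInformationManagerV1_2Client(binding, endpoint);
client.ClientCredentials.UserName.UserName = "myUsername"; // placeholder
client.ClientCredentials.UserName.Password = "myPassword"; // placeholder

var response = await client.requestUserInformationAsync(request); // inside an async method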
Is there a way to programmatically generate the /$metadata response returned from an ASP.NET Web API OData controller route, in a way that can be serialized to XML?
The reason I want to do this is that I'm using Breeze to access the Web API using the OData adapter, and I would like to pre-load the Breeze MetadataStore with the metadata, as in this example: http://breeze.github.io/doc-js/metadata-load-from-script.html.
But this example does not seem to work with the OData adapter as it uses different metadata.
If I understand your question, you are trying to simulate GET /$metadata on the server so you can store the results in a file. In ASP.NET OData, $metadata is represented by an object that implements IEdmModel (e.g., the result of calling ODataModelBuilder.GetEdmModel). The problem then becomes how to serialize that model to XML.
The following helper will write service metadata to the given stream. For the model and config parameters, you should pass the same objects you used for your service configuration.
public class MetadataHelper
{
    public static Task WriteMetadataAsync(Stream stream, IEdmModel model, HttpConfiguration config, string odataRouteName)
    {
        var request = new HttpRequestMessage(HttpMethod.Get, "/$metadata");
        request.ODataProperties().Model = model;
        request.ODataProperties().RouteName = odataRouteName;
        request.SetConfiguration(config);

        var payloadKinds = new List<ODataPayloadKind> { ODataPayloadKind.MetadataDocument };
        var xmlMediaType = new MediaTypeHeaderValue("application/xml");
        var formatter = new ODataMediaTypeFormatter(payloadKinds)
            .GetPerRequestFormatterInstance(model.GetType(), request, xmlMediaType);

        var content = new StringContent(String.Empty);
        content.Headers.ContentType = xmlMediaType;

        return formatter.WriteToStreamAsync(model.GetType(), model, stream, content, null);
    }
}
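A hypothetical call site, assuming builder is the ODataModelBuilder and "odata" is the route name you used when mapping the OData route:

// Write the $metadata document to a file, reusing the service's own
// model, HttpConfiguration, and OData route name
using (var stream = File.Create("metadata.xml"))
{
    MetadataHelper.WriteMetadataAsync(stream, builder.GetEdmModel(), config, "odata").Wait();
}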
True, the OData metadata is handled correctly by Breeze only when reading an OData response; the MetadataStore doesn't import/export it directly.
I think the easiest way to handle this is to create a separate bit of client-side code that will:
- Create an EntityManager
- Fetch the metadata from the OData server
- Export the metadata from the MetadataStore
- Log the metadata so you can capture it and store it in a file
Not elegant, but it gets the job done.
Some future version of breeze.server.net will do the OData-to-Breeze metadata conversion on the server, so we won't have this problem.
I have a large WSDL file that I need to generate a WCF web service from. I can generate a service using svcutil.exe; however, it's not generating what I need.
I need a service that accepts/returns XML, rather than serialized types. The reason for this is if there is an error in the incoming XML it will fail before it hits my code - we can't have this. We need to intercept the XML before any serialization happens to catch it.
Is this possible?
Or is there a way I can modify the generated services so I can work with the raw XML rather than the derived "Message" type?
Effectively I want something similar to:
XmlDocument PersonRevised(XmlDocument request);
Current code:
[ServiceContract(Namespace = "urn:hl7-org:v3")]
public interface IPRPA_AR101202
{
    [OperationContract(Name = "PersonRevised", Action = "urn:hl7-org:v3/PRPA_IN101204")]
    PersonRevisedResponse PersonRevised(Message request);
}

public class PRPA_AR101202 : WCFServiceBase, IPRPA_AR101202
{
    PersonRevisedResponse IPRPA_AR101202.PersonRevised(Message request)
    {
        PersonRevised pr = this.ParseMessage<PersonRevised>(request, HL7_XML_NAMESPACE);
        PersonRevisedResult result = new PersonRevisedResult();
        PersonRevisedResponse r = new PersonRevisedResponse(result);
        return r;
    }
}
Update:
Based on the answer I was able to create a WCF service that accepted a string. However, now I am getting null for the input parameter on the implemented services that are based off of WSDL contracts, regardless of whether it's a string or an XmlDocument/XmlNode.
Thoughts?
I need a service that accepts/returns XML, rather than serialized types
In that case you are better off using POX and not using SOAP/WSDL at all. There are some resources for this here and here.
The reason for this is if there is an error in the incoming XML it will fail before it hits my code - we can't have this.
I kind of know what you're saying here. It is annoying that any serialization exception will kill the channel rather than bubble back to the client. However, the whole point of exposing a service metadata endpoint is that clients will always correctly serialize the types exposed outside the service boundary, because that's what the WSDL is for.
Effectively I want something similar to: XmlDocument PersonRevised(XmlDocument request);
As you are no doubt aware, exposing an XmlDocument type is not equivalent to exposing XML. Exposing XmlDocument will not be pretty.
If you absolutely need full control over the deserialization, you will have to expose your operation as accepting a parameter of type string. Then you can do what you want with it.
public string PersonRevised(string request)
{
    // Deserialize here: e.g. load into an XmlDocument and validate it yourself
    var doc = new XmlDocument();
    doc.LoadXml(request);
    // ... process the request and build the response XML ...
    return doc.OuterXml;
}
I have been pulling my hair out for the past two days trying to figure out why XmlFormatter (via DataContractSerializer) does not serialize data I return from my Web API method. Web API decides to use JSON anyway, but I need the result to be in XML (as required by the application that will use this API). I have set up my browser to send Accept: application/xml so the resolver uses the XmlFormatter, but the result is always JSON.
Controller:
public class MyController : ApiController
{
    public MyDataResultList GetData(string someArgument)
    {
        // magic here that gets the data
        MyDataResultList items = MyDataResultList.GetData(someArgument);
        return items;
    }
}
MyDataResultList is contained in a dll and has this similar layout:
[DataContract]
[CollectionDataContract(Name = "MyDataList")]
[KnownType(typeof(List<MyDataItem>))]
public class MyDataResultList : List<MyDataItem>
{
    [DataMember]
    public string SomethingHere { get; set; }

    [DataMember]
    public TimeSpan StartTime { get; set; }

    [DataMember]
    public TimeSpan StopTime { get; set; }
}
I have tried setting UseXmlSerializer to true, but I need to use the DataContractSerializer on the client end to de-serialize the results back correctly.
So the final question is: is it possible to configure Web API to throw an exception if it is unable to serialize using whichever formatter comes first? I believe (in my opinion) it's very misleading and too abstracted to have it silently fall back to JSON without giving me any clue as to the cause.
Update: manually serializing MyDataResultList using DCS throws InvalidDataContractException: Type 'MyDataResultList' with CollectionDataContractAttribute attribute is an invalid collection type since it has DataContractAttribute attribute. But the underlying question remains: how to get the Web API to throw this to me instead of silently falling back to JSON? (and making debugging more difficult)
Update 2: the DataContract serializer seems to skip the SomethingHere/StartTime/StopTime properties entirely, even though they have [DataMember] on them.
Your way of diagnosis is correct, and yes, Web API's content negotiation process tries to find the best formatter based on a bunch of logic (e.g. the Accept header if present, otherwise the Content-Type header, asking each formatter whether it can serialize the type, etc.).
You can disable this default behavior (i.e. finding the first formatter in the list of formatters which can write/serialize a type) by doing the following, which results in a 406 Not Acceptable response being generated instead:
DefaultContentNegotiator negotiator = new DefaultContentNegotiator(excludeMatchOnTypeOnly: true);
config.Services.Replace(typeof(IContentNegotiator), negotiator);
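A sketch of where this typically goes, assuming the standard Web API template's WebApiConfig.Register setup (the class and route calls are template defaults, not from the question):

using System.Net.Http.Formatting;
using System.Web.Http;

public static class WebApiConfig
{
    public static void Register(HttpConfiguration config)
    {
        // Return 406 Not Acceptable instead of silently falling back
        // to the first formatter that can serialize the type
        var negotiator = new DefaultContentNegotiator(excludeMatchOnTypeOnly: true);
        config.Services.Replace(typeof(IContentNegotiator), negotiator);

        config.MapHttpAttributeRoutes();
        // ... the rest of your route configuration ...
    }
}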
We have an ASP.NET Web Application wired up with ServiceStack. I've never written functional tests before, but I have been tasked to write tests (NUnit) against our API and prove it's working all the way down to the database level.
Can someone help me get started writing these tests?
Here's an example of a post method on our Users service.
public object Post(UserRequest request)
{
    var response = new UserResponse { User = _userService.Save(request) };
    return new HttpResult(response)
    {
        StatusCode = HttpStatusCode.Created,
        Headers = { { HttpHeaders.Location, base.Request.AbsoluteUri.CombineWith(response.User.Id.ToString()) } }
    };
}
Now I know how to write a standard unit test, but I am confused on this part. Do I have to call the API via HTTP and initiate a Post? Do I just call the method like I would in a unit test? I suppose it's the "functional test" part that eludes me.
Testing the service contract
For an end-to-end functional test, I focus on verifying that the service can accept a request message and produce the expected response message for simple use cases.
A web service is a contract: given a message of a certain form, the service will produce a response message of a given form. Secondarily, the service will alter the state of its underlying system in a certain way. Note that to the end client, the message is not your DTO class but a specific example of a request in a given text format (JSON, XML, etc.), sent with a specific verb to a specific URL, with a given set of headers.
There are multiple layers to a ServiceStack web service:
client -> message -> web server -> ServiceStack host -> service class -> business logic
Simple unit testing and integration testing is best for the business logic layer. It's usually easy to write unit tests directly against your service classes too: it should be easy to construct a DTO object, call a Get/Post method on your service class, and validate the response object (see the sketch below). But these do not test anything that happens inside the ServiceStack host: routing, serialization/deserialization, execution of request filters, etc. Of course, you don't want to test the ServiceStack code itself, as that's framework code that has its own unit tests. But there is an opportunity to test the specific path that a specific request message takes going into the service and coming out of it. This is the part of the service contract that can't be fully verified by looking directly at the service class.
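As an aside, such a direct service-class test is just plain object construction; a minimal sketch with hypothetical DTO and service names:

// No ServiceStack host involved: construct the service class directly
// (MyService/MyRequest/MyResponse are hypothetical names)
[Test]
public void Get_ReturnsExpectedResponse()
{
    var service = new MyService(); // supply fakes for any dependencies here
    var response = (MyResponse)service.Get(new MyRequest { Id = 42 });
    Assert.That(response.Id, Is.EqualTo(42));
}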
Don't try for 100% coverage
I would not recommend trying to get 100% coverage of all business logic with these functional tests. I focus on covering the major use cases - one or two request examples per endpoint, usually. Detailed testing of specific business logic cases is much more efficiently done by writing traditional unit tests against your business logic classes. (Your business logic and data access are not implemented in your ServiceStack service classes, right?)
The implementation
We are going to run a ServiceStack service in-process and use an HTTP client to send requests to it and then verify the content of the responses. This implementation is specific to NUnit; a similar implementation should be possible in other frameworks.
First, you need an NUnit setup fixture that runs once before all of your tests, to set up the in-process ServiceStack host:
// this needs to be in the root namespace of your functional tests
[SetUpFixture]
public class ServiceStackTestHostContext
{
    // hold the host so the teardown can dispose it
    public static ServiceTestAppHost AppHost { get; private set; }

    [TestFixtureSetUp] // this method will run once before all other unit tests
    public void OnTestFixtureSetUp()
    {
        AppHost = new ServiceTestAppHost();
        AppHost.Init();
        AppHost.Start(ServiceTestAppHost.BaseUrl);

        // do any other setup. I have some code here to initialize a database context, etc.
    }

    [TestFixtureTearDown] // runs once after all other unit tests
    public void OnTestFixtureTearDown()
    {
        AppHost.Dispose();
    }
}
Your actual ServiceStack implementation probably has an AppHost class that's a subclass of AppHostBase (at least if it's running in IIS). We need to subclass a different base class to run this ServiceStack host in-process:
// the main detail is that this uses a different base class
public class ServiceTestAppHost : AppHostHttpListenerBase
{
public const string BaseUrl = "http://localhost:8082/";
public override void Configure(Container container)
{
// Add some request/response filters to set up the correct database
// connection for the integration test database (may not be necessary
// depending on your implementation)
RequestFilters.Add((httpRequest, httpResponse, requestDto) =>
{
var dbContext = MakeSomeDatabaseContext();
httpRequest.Items["DatabaseIntegrationTestContext"] = dbContext;
});
ResponseFilters.Add((httpRequest, httpResponse, responseDto) =>
{
var dbContext = httpRequest.Items["DatabaseIntegrationTestContext"] as DbContext;
if (dbContext != null) {
dbContext.Dispose();
httpRequest.Items.Remove("DatabaseIntegrationTestContext");
}
});
// now include any configuration you want to share between this
// and your regular AppHost, e.g. IoC setup, EndpointHostConfig,
// JsConfig setup, adding Plugins, etc.
SharedAppHost.Configure(container);
}
}
Now you should have an in-process ServiceStack service running for all of your tests. Sending requests to this service is pretty easy now:
[Test]
public void MyTest()
{
    // First do any necessary database setup. Or you could make the
    // test a whole end-to-end use case where you do Post/Put
    // requests to create a resource, Get requests to query the
    // resource, and a Delete request to delete it.

    // I use RestSharp as a way to test the request/response
    // a little more independently from the ServiceStack framework.
    // Alternatively you could use a ServiceStack client like JsonServiceClient.
    var client = new RestClient(ServiceTestAppHost.BaseUrl);
    client.Authenticator = new HttpBasicAuthenticator(NUnitTestLoginName, NUnitTestLoginPassword);
    var request = new RestRequest...
    var response = client.Execute<ResponseClass>(request);

    // do assertions on the response object now
}
Note that you may have to run Visual Studio in admin mode in order to get the service to successfully open that port; see comments below and this follow-up question.
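If you do prefer ServiceStack's own typed client, a hypothetical equivalent looks like this (the route and UserResponse DTO are invented for illustration):

// JsonServiceClient ships with ServiceStack itself
var client = new JsonServiceClient(ServiceTestAppHost.BaseUrl);
client.SetCredentials(NUnitTestLoginName, NUnitTestLoginPassword);

var response = client.Get<UserResponse>("/users/1"); // hypothetical route and DTO
Assert.That(response, Is.Not.Null);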
Going further: schema validation
I work on an API for an enterprise system, where clients pay a lot of money for custom solutions and expect a highly robust service. Thus we use schema validation to be absolutely sure we don't break the service contract at the lowest level. I don't think schema validation is necessary for most projects, but here's what you can do if you want to take your testing a step further.
One of the ways in which you can inadvertently break your service's contract is to change a DTO in a way that is not backward compatible: e.g. renaming an existing property or altering custom serialization code. This can break a client of your service by making data no longer available or parseable, and you typically can't detect this change by unit testing your business logic. The best way to prevent this from happening is to keep your request DTOs single-purpose and separate from your business/data access layer, but there's still a chance someone will accidentally apply a refactoring incorrectly.
To guard against this, you can add schema validation to your functional test. We do this only for specific use cases that we know a paying client is actually going to use in production. The idea is that if this test breaks, then we know that the code that broke the test would break this client's integration if it were to be deployed to production.
[Test(Description = "Ticket # where you implemented the use case the client is paying for")]
public void MySchemaValidationTest()
{
    // Send a raw request with a hard-coded URL and request body.
    // Use a non-ServiceStack client for this.
    var request = new RestRequest("/service/endpoint/url", Method.POST);
    request.RequestFormat = DataFormat.Json;
    request.AddBody(requestBodyObject);

    var response = Client.Execute(request);

    Assert.That(response.StatusCode, Is.EqualTo(HttpStatusCode.OK));
    RestSchemaValidator.ValidateResponse("ExpectedResponse.json", response.Content);
}
To validate the response, create a JSON Schema file that describes the expected format of the response: what fields are required to exist for this specific use case, what data types are expected, etc. This implementation uses the Json.NET schema parser.
using System.IO;
using System.Reflection;
using Newtonsoft.Json.Linq;
using Newtonsoft.Json.Schema;
using NUnit.Framework;

public static class RestSchemaValidator
{
    static readonly string ResourceLocation = typeof(RestSchemaValidator).Namespace;

    public static void ValidateResponse(string resourceFileName, string restResponseContent)
    {
        var resourceFullName = string.Format("{0}.{1}", ResourceLocation, resourceFileName);
        JsonSchema schema;

        // the json file name that is given to this method is stored as a
        // resource file inside the test project (BuildAction = Embedded Resource)
        using (var stream = Assembly.GetExecutingAssembly().GetManifestResourceStream(resourceFullName))
        using (var reader = new StreamReader(stream))
        {
            var schemaText = reader.ReadToEnd();
            schema = JsonSchema.Parse(schemaText);
        }

        var parsedResponse = JObject.Parse(restResponseContent);
        Assert.DoesNotThrow(() => parsedResponse.Validate(schema));
    }
}
Here's an example of a JSON schema file. Note that this is specific to this one use case and is not a generic description of the response DTO class. The properties are all marked as required, as these are the specific ones the client is expecting in this use case. The schema might leave out other unused properties that currently exist in the response DTO. Based on this schema, the call to RestSchemaValidator.ValidateResponse will fail if any of the expected fields are missing in the response JSON, have unexpected data types, etc.
{
    "description": "Description of the use case",
    "type": "object",
    "additionalProperties": false,
    "properties":
    {
        "SomeIntegerField": { "type": "integer", "required": true },
        "SomeArrayField": {
            "type": "array",
            "required": true,
            "items": {
                "type": "object",
                "additionalProperties": false,
                "properties": {
                    "Property1": { "type": "integer", "required": true },
                    "Property2": { "type": "string", "required": true }
                }
            }
        }
    }
}
This type of test should be written once and never modified unless the use case it's modeled on becomes obsolete. The idea is that these tests will represent actual usages of your API in production and ensure that the exact messages your API promises to return do not change in a way that breaks existing usages.
Other info
ServiceStack itself has some examples of running tests against an in-process host, on which the above implementation is based.