I have a Web API that others are using, and one of our developers changed the name of an action method or a method parameter. Now the clients have a problem. Is there any solution that, before deploying to the production server, checks whether the existing public API has changed (as opposed to new APIs being added)?
I would highly recommend creating a Swagger/OpenAPI specification for your API if you're going to be sharing it with other teams. A Swagger definition is a specification that describes what your API looks like in terms of actions, expected input, output, et cetera. You can use such a specification as a contract between the producer and all consumers of your API.
Then, you can use tools like Swagger Codegen to generate clients and server implementations, as a guarantee that an implementation conforms to the contract. In your case, this would prevent a developer from breaking the API (unless they change the contract as well).
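To illustrate (this is a hypothetical excerpt, not the output of any specific generator): a generated C# client surface might look like the snippet below. If the server renames an action or a parameter, the spec changes, the regenerated client changes, and consumers get a compile-time break rather than a runtime surprise.

    using System.Threading.Tasks;

    // Hypothetical shape of a generated client; all types are illustrative.
    public class User { public int Id { get; set; } public string Name { get; set; } }
    public class CreateUserRequest { public string Name { get; set; } }

    public interface IUsersApiClient
    {
        // Renaming the server action or its "id" parameter changes the
        // contract, and regeneration turns that into a compile-time error.
        Task<User> GetUserAsync(int id);
        Task<User> CreateUserAsync(CreateUserRequest body);
    }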
If you want to collaborate, I recommend placing (a copy of) your Swagger definitions in a separate Git repo (e.g. my-awesome-contracts), giving all the teams access to this repo, and making any changes through pull requests. This serves two purposes:
It gives all the consumers of your API early notice of any changes you're planning to make.
It allows any consumers to actually give feedback on the proposed changes.
We have a REST API based on ASP.NET Web API, and we distribute a .NET client library via NuGet to communicate with it.
The Web API controllers and the client library share the same model classes.
If I add a new optional field to a model, it's OK; but if in some new release we accidentally add a new enum value on the API side, a previous version of the client will throw an exception when it receives the new enum value.
So, is there any way to check the backward compatibility of the API and client, and to prevent breaking changes?
In this particular case: don't use enums in client libraries. This situation will happen every time you need to extend the enum, and I suppose you don't want to release a new version of the API for each such change. Inside your client, accept string values and then try to parse them: if a value parses, fine; if it doesn't, fall back to YourEnum.None.
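A minimal sketch of that idea (OrderStatus and the parser are made-up names):

    using System;

    public enum OrderStatus
    {
        None = 0,   // fallback for values this client version doesn't know yet
        Created,
        Shipped
    }

    public static class OrderStatusParser
    {
        // Accept the raw string from the API; fall back to None instead of
        // throwing when the server starts sending a newer value.
        public static OrderStatus Parse(string value)
        {
            return Enum.TryParse<OrderStatus>(value, ignoreCase: true, out var result)
                ? result
                : OrderStatus.None;
        }
    }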
In general: to prevent breaking changes, create integration tests and run them on your CI server. If you add a new rule to the API, the existing integration tests shouldn't fail. That said, there is no 100% guarantee; it depends on how meticulous you are when writing the integration tests. The same story as with unit testing, actually.
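As a rough xUnit sketch (the endpoint and staging URL are made up):

    using System.Net;
    using System.Net.Http;
    using System.Threading.Tasks;
    using Xunit;

    public class UsersApiCompatibilityTests
    {
        private static readonly HttpClient Client = new HttpClient
        {
            BaseAddress = new System.Uri("https://staging.example.com/")
        };

        // Pin down an existing consumer-facing call: if a rename breaks the
        // route or the response shape, this fails on CI before release.
        [Fact]
        public async Task GetUser_ExistingRoute_StillResponds()
        {
            HttpResponseMessage response = await Client.GetAsync("api/users/1");
            Assert.Equal(HttpStatusCode.OK, response.StatusCode);
        }
    }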
when in some new release we accidentally add a new enum value on the API, a previous version of the client will throw an exception if it receives the new enum value.
Based on mtkachenko's answer: write integration tests in which all supported versions of your client (including third-party clients, if any) are tested against the release candidate (or dev/staging/QA environment) of your API.
If the tests fail with any of these client versions, then your API change is incompatible.
Plan for extension: in JSON, for example, prefer objects over plain strings/numbers so you can add more fields in the future (see the sketch after this list).
Also document expected future extensions, so developers are aware that e.g.
a particular field may get additional values in a future API version, and unknown values can be treated e.g. like the existing value XY.
a particular array field which today always has one element may in the future have multiple elements.
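For the first point, a sketch of what preferring objects over plain values buys you (the DTO names are invented):

    // Instead of serializing status as a bare string ("status": "active"),
    // wrap it in an object: "status": { "code": "active" }. Optional fields
    // such as a "reason" can then be added later without breaking clients.
    public class StatusDto
    {
        public string Code { get; set; }
        // Future optional additions go here, e.g.:
        // public string Reason { get; set; }
    }

    public class UserDto
    {
        public int Id { get; set; }
        public StatusDto Status { get; set; }
    }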
add new enum value
Zalando's guidelines suggest never extending enums that are used in responses:
https://opensource.zalando.com/restful-api-guidelines/#107
You may also find more useful information there; I think their document contains lots of good principles and ideas:
https://opensource.zalando.com/restful-api-guidelines/#general-guidelines
https://opensource.zalando.com/restful-api-guidelines/#compatibility
https://opensource.zalando.com/restful-api-guidelines/#deprecation
This looks like a typical use case for REST API versioning.
You will need to support different versions in the API and then have your client specify which version it is requesting.
Different modes of versioning are:
URL Versioning
api.example.com/v1
Versioning using a header, if you want to keep a single URL for a resource
Accept-Version: v1
Refer to Microsoft's API versioning guidance and the ASP.NET API Versioning library. The library is pretty well documented.
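With that library, a versioned controller looks roughly like this (controller, routes, and payloads are illustrative; you also need services.AddApiVersioning() at startup):

    using Microsoft.AspNetCore.Mvc;

    [ApiController]
    [ApiVersion("1.0")]
    [ApiVersion("2.0")]
    [Route("api/v{version:apiVersion}/users")]   // URL versioning
    public class UsersController : ControllerBase
    {
        [HttpGet("{id}"), MapToApiVersion("1.0")]
        public IActionResult GetV1(int id) => Ok(new { id, name = "v1 shape" });

        // Served when the client requests version 2.0 (via the URL segment,
        // or a query string/header reader if configured at startup).
        [HttpGet("{id}"), MapToApiVersion("2.0")]
        public IActionResult GetV2(int id) => Ok(new { id, displayName = "v2 shape" });
    }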
I have a Windows UWP client application that needs to call a REST API hosted by my ASP.NET service. To generate my client proxy I use the following Visual Studio option...
Right click project -> Add -> REST API Client...
I provide the URL of the Swagger endpoint and it generates the expected client code. But the downside is that it generates all the classes, even though in my case I have a shared class library with all the server-side classes defined. This is a pain because the generated classes do not respect the inheritance of my class hierarchy and flatten everything into non-inherited classes.
Is it possible to get AutoRest to reuse an existing .NET library for classes instead of always generating new classes? This was an option when I used the WCF client proxy generator.
It seems "Add REST API Client" doesn't have an advanced setting for reusing existing classes. However, it offers two ways of loading the metadata file: a Swagger URL or an existing metadata file. From testing on my site, you should be able to edit an existing metadata file, removing or adjusting the nodes you don't want generated, and then load the updated metadata file when adding the REST API client.
The generated classes seem to be determined by the metadata JSON file and the host value. You could also try submitting a request to the Swagger team to see if they can keep the hierarchy when generating the metadata file, or you may need to create the proxy manually in order to reuse your libraries.
I think it would be fair to describe the "REST API Client" generation tool in Visual Studio as "spartan".
This answer may be too late to help you, or there may be reasons why you can't use a different tool, but in the hope of benefiting you and/or future readers I'll detail how I achieved generating a REST client in NSwagStudio which reuses my existing classes and enums. (NSwagStudio is free and open source, and I have no affiliation).
On the left-hand pane, we select our input. Besides the expected Swagger feeds there are some interesting options, such as "Web API via reflection", which "uses .NET reflection to analyze ASP.NET Web API or ASP.NET Core controllers" - this is the option I used.
On the right hand pane, click "CSharp Client" and switch to the "CSharp Client" tab.
The magic bullet is to untick "Generate DTO types". This causes it to generate only the client, so you can reuse your existing DTOs.
You'll want to specify a namespace for the client, and optionally one or more namespaces which will be added to the generated C# file as using directives. For example, if you want your client to be in the namespace MyNamespace and your model classes are in SomeOtherNamespace you'd enter the following:
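In the CSharp Client settings that comes down to two fields (field names as I recall them from NSwagStudio, so treat them as approximate):

    Namespace:                   MyNamespace
    Additional Namespace Usages: SomeOtherNamespace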
It's well worth having a play with the options. A few quick notes about some of the defaults and why I'm happy with them:
The HttpClient is injected and you control the lifecycle (which seems to me a good thing)
A BaseUrl property is defined. I haven't tested this yet, but from looking at the generated code I'm hopeful it will let me safely spin up multiple instances of the client class to talk to multiple servers that share the same API
The JsonSerializerSettings property is protected, but can be configured via the UpdateJsonSerializerSettings partial method
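For orientation, the relevant parts of the generated client look roughly like this (trimmed and renamed for illustration, so don't take the exact members as gospel):

    public partial class MyApiClient
    {
        private System.Net.Http.HttpClient _httpClient;

        // You inject the HttpClient, so you control its lifetime.
        public MyApiClient(System.Net.Http.HttpClient httpClient)
        {
            _httpClient = httpClient;
        }

        // Point separate instances at separate servers sharing the same API.
        public string BaseUrl { get; set; }

        // Hook for adjusting serialization without touching generated code.
        partial void UpdateJsonSerializerSettings(Newtonsoft.Json.JsonSerializerSettings settings);
    }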
I've saved my settings from the File menu and added the .nswag file to source control, so that I can easily regenerate the client in future if necessary.
I am working on an ASP.NET Web API where I have to return objects in JSON. To make it easy for the (Android/iOS) mobile developers to consume these Web APIs and parse the JSON objects, what is the best approach for keeping these object definitions shared between the Web API project and the mobile projects, so that if we change any property it is easily reflected in both projects? It would be great if someone could explain this in detail.
There is no sync mechanism of the kind you are asking for.
On your Web API server side you'll define the objects and then return them from your API methods. JSON serialization is handled automatically by the framework, using your serialization engine of choice (e.g. JSON.NET). Remember that with Web API you don't decide the output format server-side: you just return a response containing the object(s), and the framework reads the HTTP headers of the request to determine whether the client asked for JSON or XML, then returns what was asked for.
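For instance, a minimal controller (classic ASP.NET Web API; the names are invented):

    using System.Web.Http;

    public class Product
    {
        public int Id { get; set; }
        public string Name { get; set; }
    }

    public class ProductsController : ApiController
    {
        // Return the object; the framework serializes it as JSON or XML
        // depending on the request's Accept header.
        public Product Get(int id)
        {
            return new Product { Id = id, Name = "Sample" };
        }
    }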
The best thing you can do is define a clear API with nice conventions and keep it documented, and if you change anything have the documentation reflect the changes. Avoid making breaking changes, and if you really must, deprecate a property or an object for at least a couple of versions before removing it.
That's the way all public APIs work anyway.
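If the model classes do end up shared as a .NET library (e.g. with Xamarin clients), deprecation can at least be made visible at compile time; a minimal sketch, with an invented property:

    using System;

    public class UserDto
    {
        [Obsolete("Use Address instead; this will be removed in v3.")]
        public string Street { get; set; }

        public string Address { get; set; }
    }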
I'm working on a project that is our company's first foray into Domain-Driven Design.
Our Web API originally simply provided CRUD operations and the project exposed OData controllers, but I'm not sure if that is still a good idea.
Is OData a good way to expose non-CRUD APIs?
More info:
Initially our web api basically exposed CRUD functions. To create a new User you would simply create one and post it to the service. To change, for example, an address you would get a copy of the user entity, make changes, then perform an update operation. Basic OData stuff.
Beyond providing query support, OData also exposed the service in a readily consumable way, so it could be added to other projects as a service reference and accessed with a proxy.
Since we moved over to a DDD approach, things have changed significantly. Our Web API is now simply a gateway to a number of independent sub-domain services. We no longer provide CRUD operations or direct access to entities; instead we make service calls to manipulate entities. Instead of creating a User entity and sending it to the User service via a PUT request, a consumer must build a CreateUserBindingModel and send it to the User/Create service, letting the service generate the entity. Changing an address is done through the ChangeAddress(ChangeAddressBindingModel model) method, rather than by updating the whole object. Queries are much more targeted and rarely, if ever, return entire domain objects.
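To illustrate the shape of those calls (the binding model members are invented):

    using System.Web.Http;

    public class ChangeAddressBindingModel
    {
        public int UserId { get; set; }
        public string NewAddress { get; set; }
    }

    [RoutePrefix("api/User")]
    public class UserController : ApiController
    {
        // Task-based endpoint: it expresses the intent ("change address")
        // rather than a generic update of the whole entity.
        [HttpPost, Route("ChangeAddress")]
        public IHttpActionResult ChangeAddress(ChangeAddressBindingModel model)
        {
            // ...delegate to the user sub-domain service here...
            return Ok();
        }
    }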
Is it a bad idea to keep using OData as a basis for our Web API, when we no longer provide CRUD operations? Is there another way to expose the details of our service the way you can with OData? I know WCF services provide similar functionality, but I was under the impression it was even more tied to CRUD than OData.
OData is a data-oriented API spec; it's anti-DDD. Although it can satisfy all your requirements for implementing REST APIs, it's best suited to data-processing APIs. I guess you already know that using OData feels like operating on the database via HTTP. If you are using DDD, you should forget OData entirely.
In OData, actions and functions are a way to add server-side behaviors that are not easily defined as CRUD operations on entities:
https://learn.microsoft.com/en-us/aspnet/web-api/overview/odata-support-in-aspnet-web-api/odata-v4/odata-actions-and-functions
https://blogs.msdn.microsoft.com/alexj/2012/02/03/cqrs-with-odata-and-actions/
https://github.com/OData/ODataSamples/blob/master/WebApiCore/ODataActionSample/ODataActionSample/
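Following those samples, a rough sketch of declaring an intent-based OData action for the question's domain (names are invented; the builder namespaces vary with the OData package version):

    using Microsoft.OData.Edm;
    using System.Web.OData.Builder;   // Microsoft.AspNet.OData.Builder in newer packages

    public class User
    {
        public int Id { get; set; }
        public string Address { get; set; }
    }

    public static class EdmModelFactory
    {
        // Binds a ChangeAddress action to the User entity type, so the
        // operation is part of the OData metadata rather than a bare PATCH.
        public static IEdmModel Build()
        {
            var builder = new ODataConventionModelBuilder();
            builder.EntitySet<User>("Users");
            var changeAddress = builder.EntityType<User>().Action("ChangeAddress");
            changeAddress.Parameter<string>("NewAddress");
            return builder.GetEdmModel();
        }
    }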
I'm refactoring an existing C# .NET Web Service that is consumed by existing Delphi 2006 (non-.NET) clients. I don't want to rebuild/redeploy the clients. My goal is to keep the WSDL identical so that the proxy classes won't change.
I used a tool (Regionerate) to region and sort the methods/properties based on our current standards. This changed the tag ordering in the WSDL.
I can use an XML diff tool to compare the files and ignore ordering, but I'm not sure if this will affect the clients. Is order of web methods or (to-be-proxy) class properties relevant?
The order should be totally irrelevant, for the methods in the WSDL as well as for the properties in the classes.
The only way I can imagine this affecting the clients would be if they didn't use standard libraries to consume the service, but did it by way of some custom-coded weirdness - and even then the implementer would have had to go some extra miles to introduce a dependency on the order ;)