Change default JSON serializer in WCF [duplicate] - c#

This question already has answers here:
Replace WCF default JSON serialization
(2 answers)
Closed 6 years ago.
I'm building a WCF service that also has an endpoint for JSON.
The problem is that WCF's default JSON serializer (DataContractJsonSerializer) produces JSON in a format I don't want to use. I looked into changing this but found no good way to do it. All the threads and solutions I can find are pretty old (2012-2013). Is there a clean solution for this, or has Microsoft made changes to WCF that make it easier than writing your own DispatchMessageFormatter, etc.?
Note I'm not talking about the Web. This is a pure self-hosted WCF service.
I have already tried implementing a DispatchMessageFormatter (a rough sketch of what I mean is included below). It works, but it comes with some issues, for example that all WebContentFormat has to be Raw, etc.
This question specifically refers to the accepted answers in that earlier thread and asks whether, five years later, there is another way to do this without all of the negative side effects that approach brings.
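For reference, this is roughly the shape of the formatter I mean. It is only a minimal sketch, assuming Newtonsoft.Json as the replacement serializer and covering the reply path only; request deserialization and the endpoint behavior that wires the formatter in are omitted.

using System.Net;
using System.ServiceModel.Channels;
using System.ServiceModel.Dispatcher;
using System.Text;
using Newtonsoft.Json;

public class JsonNetReplyFormatter : IDispatchMessageFormatter
{
    public void DeserializeRequest(Message message, object[] parameters)
    {
        // Request deserialization omitted in this sketch.
    }

    public Message SerializeReply(MessageVersion messageVersion, object[] parameters, object result)
    {
        // Serialize the operation result with Json.NET instead of DataContractJsonSerializer.
        byte[] body = Encoding.UTF8.GetBytes(JsonConvert.SerializeObject(result));

        Message reply = Message.CreateMessage(messageVersion, null, new RawBodyWriter(body));

        // This is the reason everything ends up as WebContentFormat.Raw:
        // the JSON is written as raw bytes rather than through WCF's XML/JSON mapping.
        reply.Properties.Add(WebBodyFormatMessageProperty.Name,
            new WebBodyFormatMessageProperty(WebContentFormat.Raw));

        var http = new HttpResponseMessageProperty();
        http.Headers[HttpResponseHeader.ContentType] = "application/json";
        reply.Properties.Add(HttpResponseMessageProperty.Name, http);

        return reply;
    }

    // Writes the pre-serialized JSON bytes as the message body.
    private sealed class RawBodyWriter : BodyWriter
    {
        private readonly byte[] content;

        public RawBodyWriter(byte[] content) : base(true)
        {
            this.content = content;
        }

        protected override void OnWriteBodyContents(System.Xml.XmlDictionaryWriter writer)
        {
            writer.WriteStartElement("Binary");
            writer.WriteBase64(content, 0, content.Length);
            writer.WriteEndElement();
        }
    }
}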

Try intercepting the message with a message inspector (implement IClientMessageInspector on the client or IDispatchMessageInspector on the service). Then convert the XML to JSON and send that as the response.
I have not tried it; it is just a wild guess.


Is protobuf a good choice of the universal data model? [closed]

Closed. This question is opinion-based. It is not currently accepting answers.
Closed 11 months ago.
We are trying to find a universal data model, which can be applied everywhere.
I think there is no doubt that protobuf is good in a microservice architecture as a protocol between processes on different platforms (Windows, Linux, etc.) and in different languages (Java, C#, C++, Python, etc.).
But how about using protobuf within the process?
To make it simple and easy to illustrate the problems, let's say I am making a C# gRPC service and a GUI (gRPC client).
After some study, it does not seem suitable to me, for the following reasons:
Stability: protobuf still seems to be in a changing phase. For example, the optional keyword was removed in proto3, but added back again in protobuf release 3.15.
Data type: The data types in proto are not fully compatible with common data types, for example decimal. We need to define another complex data type (What's the best way to represent System.Decimal in Protocol Buffers?) and do the conversion ourselves.
Conversion: We need a conversion before we can use the value in the target language. You cannot, for example, add the self-defined proto decimal directly in C# (see the sketch after this list).
Repeated conversion: The conversion is not a one-off, but back and forth. Suppose a proto object passes through 5 functions and each one needs to do some calculation on the decimal field. That means in every function we have to convert the decimal field of the proto object to a C# decimal, do the calculation, convert the result back and assign it to the decimal field, and then pass the proto object on to the next function.
GUI control binding: We cannot bind proto fields that have no matching C# type. Even if we could somehow specify the conversion in a simple control like a textbox (and I am not sure we can, or that it is a good idea to), it would not be easy for a complicated control like a DataGridView, because there are different built-in editors for different native data types. If we use proto data types, we would need to write customized editors for them, and we may also need to define other behaviors, such as how to sort them.
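As an illustration of the conversion point above, here is a minimal sketch of the kind of helper we would need, based on the DecimalValue pattern from the linked question. The namespace and the generated Units/Nanos properties are assumptions; the proto message would look roughly like message DecimalValue { int64 units = 1; sfixed32 nanos = 2; }.

namespace MyService.Protos // assumption: namespace of the generated proto classes
{
    // Extends the generated DecimalValue class so the back-and-forth conversion
    // lives in one place instead of being repeated in every function.
    public partial class DecimalValue
    {
        private const decimal NanoFactor = 1_000_000_000m;

        public static implicit operator decimal(DecimalValue value) =>
            value.Units + value.Nanos / NanoFactor;

        public static implicit operator DecimalValue(decimal value)
        {
            long units = decimal.ToInt64(value);                        // integral part (truncated)
            int nanos = decimal.ToInt32((value - units) * NanoFactor);  // fractional part as nanos
            return new DecimalValue { Units = units, Nanos = nanos };
        }
    }
}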
For the reasons listed above, using the auto-generated proto classes within the process does not seem like a good idea to me. And this case only covers a C# service and client; when different languages are involved, things get even more complicated.
Although .NET SOAP services are slow and wordy (just look at the WSDL definition), one thing I appreciate very much is that both the service and the client use the same object with native data types, which can be used directly without any issue. The conversion happens in the communication layer, directly and automatically.
The approach I can think of at the moment is:
Using proto as a communication protocol only
Write a class (using native/built-in data types) for each proto message type, plus our own conversion logic, since I cannot find any framework/library that does this (a rough sketch is below this list).
After receiving a proto object, convert it straight into an object with native/built-in types before any further processing. (The benefit is that even if there is a major change in the proto spec, we only need to change the conversion logic, without affecting other layers.)
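A rough sketch of that approach, under the assumption of a hypothetical generated OrderProto class with Id, Amount (DecimalValue), and CreatedAt (google.protobuf.Timestamp) fields:

using System;
using Google.Protobuf.WellKnownTypes;

// Hand-written DTO with native/built-in types, used everywhere inside the process.
public class OrderDto
{
    public string Id { get; set; }
    public decimal Amount { get; set; }
    public DateTime CreatedAt { get; set; }
}

// Conversion logic kept at the communication boundary only.
public static class OrderMapper
{
    // Called once, right after the proto object is received.
    public static OrderDto ToDto(OrderProto proto) => new OrderDto
    {
        Id = proto.Id,
        Amount = proto.Amount,                    // uses the implicit DecimalValue conversion above
        CreatedAt = proto.CreatedAt.ToDateTime()
    };

    // Called once, right before a proto object is sent back out.
    public static OrderProto ToProto(OrderDto dto) => new OrderProto
    {
        Id = dto.Id,
        Amount = dto.Amount,
        CreatedAt = Timestamp.FromDateTime(dto.CreatedAt.ToUniversalTime())
    };
}

If the proto spec changes, only OrderMapper needs to be touched; the rest of the code keeps working against OrderDto.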
Am I on the right track? What is common/best practice to resolve the problems listed above?
Any help is highly appreciated!!!

How to pass json from C# app to Node.js app [duplicate]

This question already has answers here:
How to post JSON with HttpClient using C#?
(2 answers)
How to post JSON to a server using C#?
(15 answers)
Closed 3 years ago.
I have a web client working against a Node.js app.
Now I need to update and integrate an existing C# application into this scenario.
My C# generates a JSON object and I need to consume it in the Node.js app.
What's the simplest way to achieve this?
There are many options, and you can find plenty of examples on Stack Overflow and Google.
As @Alex said, you can do it via a REST API, a protocol such as RTP, or web sockets (for me, the easiest way).
There are also plenty of modules to help, for example:
Mikeal's request
npmjs request
...
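As a concrete illustration of the REST option and the linked duplicates, here is a minimal sketch of POSTing the JSON from the C# side with HttpClient (the URL and route of the Node.js app are assumptions):

using System;
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;

class Program
{
    static async Task Main()
    {
        // JSON produced by the existing C# code (hard-coded here for the sketch).
        string json = "{\"name\":\"example\",\"value\":42}";

        using var client = new HttpClient();
        var content = new StringContent(json, Encoding.UTF8, "application/json");

        // The Node.js app is assumed to expose a POST route at this address,
        // e.g. app.post('/data', ...) with express.json() on port 3000.
        HttpResponseMessage response = await client.PostAsync("http://localhost:3000/data", content);
        Console.WriteLine(response.StatusCode);
    }
}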

c# AspNetCore WebApi - how to tell if json property is set or not on PUT? [closed]

Closed. This question needs to be more focused. It is not currently accepting answers.
Closed 4 years ago.
I am having a hard time finding the best way to deal with default values when deserializing JSON into a class model in my AspNetCore WebApi.
Whenever a client makes a PUT request to the API, how should I figure out whether a property was set to null in the request, or not set at all?
At the moment I use the [FromBody] attribute for deserialization into a class type, along with model validation for required fields, etc. But once the JSON request has been deserialized, how can I tell whether e.g. a "string Name" property was explicitly set to null, or was not present in the JSON request at all and just defaulted to null?
In the case where it was not set at all, I don't want to change the state of that property on the actual model being saved in the DB.
The problem arises when a client uses PUT and a new field has been implemented that the client does not know about. I don't want clients overwriting a "new" value to null that they had no intention of setting in the first place.
Is there any standard or best practice for handling this? I can't imagine I'm the only one with this problem. Implementing my own JSON deserializer, or adding versioning to the endpoint just for the sake of one additional field, seems a bit over the top. And coordinating a simultaneous deploy of all the clients (so that each handles the new property/value) is not an option either.
All suggestions appreciated.
Regards Frederik
I think you're not using the proper HTTP method, and that is the source of your problem. HTTP PUT means that you want to overwrite the resource at the request URL with what is in the request body. And because C# doesn't have undefined, it cannot differentiate null from a property that simply wasn't provided.
If you need to do partial modification, you should use PATCH instead (a rough sketch follows below).
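A minimal sketch of the PATCH approach, using Microsoft.AspNetCore.JsonPatch. The CustomerDto type, the route, and the load/save helpers are assumptions; the ApplyTo overload with ModelState requires the Microsoft.AspNetCore.Mvc.NewtonsoftJson package and services.AddControllers().AddNewtonsoftJson().

using Microsoft.AspNetCore.JsonPatch;
using Microsoft.AspNetCore.Mvc;

public class CustomerDto
{
    public string Name { get; set; }
    public string Email { get; set; }  // a "new" field that older clients don't know about
}

[ApiController]
[Route("api/customers")]
public class CustomersController : ControllerBase
{
    [HttpPatch("{id}")]
    public IActionResult Patch(int id, [FromBody] JsonPatchDocument<CustomerDto> patch)
    {
        CustomerDto existing = LoadFromDb(id);   // hypothetical helper

        // Only the paths actually present in the request body are applied,
        // so fields the client doesn't know about are left untouched.
        patch.ApplyTo(existing, ModelState);
        if (!ModelState.IsValid)
            return BadRequest(ModelState);

        SaveToDb(id, existing);                  // hypothetical helper
        return NoContent();
    }

    private CustomerDto LoadFromDb(int id) =>
        new CustomerDto { Name = "existing name", Email = "existing@example.com" };

    private void SaveToDb(int id, CustomerDto dto)
    {
        // Persistence omitted in this sketch.
    }
}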

JSON field's names in the Web Service response [duplicate]

This question already has answers here:
Serializing F# Record type to JSON includes '#' character after each property
(2 answers)
Closed 7 years ago.
I use F#, but I believe the question is not F# specific.
I have the interface for Web Service:
[<ServiceContract>]
type IRestService =
    [<OperationContract>]
    [<WebGet(UriTemplate = "Maintenance", ResponseFormat = WebMessageFormat.Json)>]
    abstract GetMaintenancesRest: a:unit -> Maintenance[]
When I try to use this service I do get JSON, but all of the field names in the JSON have the symbol '#':
[{"Address#":"one","Assetid#":"","Assignmentdate#":"/Date(1434147917730-0700)/","Comment#":"" ...
Why and how can I fix it?
It's F#. I ran into this a long time ago, so I might be misremembering some details. As I recall, you can't use records as the return type from the service, or you get exactly this behavior. You need class types with properties. I don't think mutable records work either, and I don't think public fields work; it has to be public read/write properties.

How to share same object instance (singleton) between processes in C#? [duplicate]

This question already has answers here:
Closed 11 years ago.
Possible Duplicate:
How to share objects across processes in .Net?
I can do this for a single process (single .exe) but how can I do it between processes?
You can do this via remoting. Your class needs to inherit from MarshalByRefObject, which gives your clients a proxy to the real object (a rough sketch is below).
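A minimal sketch of that approach with legacy .NET Framework remoting (the class, port, and URI are assumptions):

using System;
using System.Runtime.Remoting;
using System.Runtime.Remoting.Channels;
using System.Runtime.Remoting.Channels.Tcp;

// The shared class inherits MarshalByRefObject so clients receive a proxy
// to the single server-side instance instead of a copy.
public class SharedCounter : MarshalByRefObject
{
    private int count;
    public int Increment() => ++count;
}

public static class ServerHost
{
    public static void Main()
    {
        // Expose one server-activated Singleton instance over TCP.
        ChannelServices.RegisterChannel(new TcpChannel(8085), ensureSecurity: false);
        RemotingConfiguration.RegisterWellKnownServiceType(
            typeof(SharedCounter), "SharedCounter", WellKnownObjectMode.Singleton);

        Console.WriteLine("Server running. Press Enter to exit.");
        Console.ReadLine();
    }
}

// In each client process:
// var counter = (SharedCounter)Activator.GetObject(
//     typeof(SharedCounter), "tcp://localhost:8085/SharedCounter");
// Console.WriteLine(counter.Increment()); // every process talks to the same instance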
You'd need to use some sort of distributed hash table or caching mechanism.
Try to avoid things like remoting if you can, because calls to a remote object can get expensive and really start to hurt performance. If you do go with .NET Remoting, then carefully consider the interface of the remote object. You should pass coarse-grained data across the process boundary, so avoid chatty interfaces with lots of calls carrying little bits of data.
What are the requirements of the class that you want to act as a singleton? There might be a totally different way of looking at it. Currently the thinking is that singletons are undesirable because they are difficult to unit test reliably, so avoiding the singleton concept could be the direction to take.
Use .NET Remoting (see the answers above, or this URL: http://msdn.microsoft.com/en-us/library/kwdt6w2k%28VS.71%29.aspx).
