I have an assembly with DataContracts, and I need to generate a .proto schema for it so I can exchange data with a Java system. The DataContract code can be changed, but I cannot add [ProtoContract] and [ProtoMember] attributes to it, because that would introduce a dependency on the protobuf-net assembly. We use WCF in the C# parts of the system, so we don't want most of the C# projects - the ones that never talk to the Java system - to depend on the protobuf-net assembly.
On the protobuf-net site, the Getting Started section says:
Don't Like Attributes?
In v2, everything that can be done with attributes can also be configured at runtime via RuntimeTypeModel.
However, I have no idea how to actually configure serialization without attributes, and I haven't seen any examples of it.
I'm trying the following:
[DataContract]
public class MyEntity
{
[DataMember(Order = 1)]
public String PropertyA { get; set; }
[DataMember(Order = 2)]
public int PropertyB { get; set; }
}
RuntimeTypeModel.Default.Add(typeof(MyEntity), false);
string proto = Serializer.GetProto<MyEntity>();
And I get the following as the value of proto:
package ProtobufTest;
message MyEntity {
}
Clarification: most of this answer relates to the pre-edit question, where false was passed to RuntimeTypeModel.Add(...)
I've used your exact code (I inferred that this was in namespace ProtobufTest, but the rest was copy/paste from the question) with r2.0.0.640 (the current NuGet deployment), and I get:
package ProtobufTest;
message MyEntity {
optional string PropertyA = 1;
optional int32 PropertyB = 2 [default = 0];
}
Further, you get the exact same result even if you remove the RuntimeTypeModel.Default.Add(...) line.
It is unclear to me why you are seeing something different - can you clarify:
which protobuf-net version you are using exactly
whether those [DataContract] / [DataMember] attributes are the ones from System.Runtime.Serialization.dll, or your own (sorry if that seems a bizarre question)
To answer the question fully: if you couldn't have any attributes (though the ones you have are just fine), you could also do:
RuntimeTypeModel.Default.Add(typeof(MyEntity), false)
.Add(1, "PropertyA")
.Add(2, "PropertyB");
which would configure PropertyA as key 1, and PropertyB as key 2.
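As a fuller sketch of the no-attributes approach (untested, and assuming protobuf-net v2's RuntimeTypeModel API), you can also keep the configuration in a dedicated model rather than mutating RuntimeTypeModel.Default, so only the Java-facing project ever references protobuf-net:

```csharp
using ProtoBuf.Meta;

// Build a standalone model instead of touching RuntimeTypeModel.Default;
// MyEntity itself stays attribute-free (no protobuf-net dependency).
var model = RuntimeTypeModel.Create();
model.Add(typeof(MyEntity), applyDefaultBehaviour: false)
     .Add(1, "PropertyA")   // field 1 -> PropertyA
     .Add(2, "PropertyB");  // field 2 -> PropertyB

// Serialize/deserialize through this model, e.g.:
// model.Serialize(stream, entity);

// Emit the .proto schema for the Java side:
string proto = model.GetSchema(typeof(MyEntity));
```

The fluent `.Add(fieldNumber, memberName)` calls here play the same role as [ProtoMember(n)] would, without the attribute.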
Related
I'm using something like this in ASP.NET Core 6:
<PackageReference Include="protobuf-net.Grpc.AspNetCore" Version="1.0.152" />
<PackageReference Include="protobuf-net.Grpc.AspNetCore.Reflection" Version="1.0.152" />
[DataContract]
public class TaskItem
{
//other properties omitted
[DataMember(Order = 5)]
public DateTime DueDate { get; set; }
}
Now, when I call the service with grpcurl, I get:
"DueDate": {
"value": "458398",
"scale": "HOURS"
}
And in the generated proto file
import "protobuf-net/bcl.proto"; // schema for protobuf-net's handling of core .NET types
message TaskItem {
//other properties omitted
.bcl.DateTime DueDate = 5;
Is there a way to specify a custom converter so that it serializes to an ISO 8601 string, for better cross-platform support? (I'll have some clients in JS, where a string is fine since I just need new Date(v) and d.toISOString().)
I know I could just declare DueDate as a string, but then the "problem" is that when I use the C# code-first client, I also have to convert back and forth between DateTime and string...
For example, I can do the following with JSON
.AddJsonOptions(x =>
{
x.JsonSerializerOptions.Converters.Add(new JsonStringEnumConverter());
});
What you ask for is very different from a JSON type converter. As the docs explain, the standard way of serializing dates is the google.protobuf.Timestamp type, which is defined in the proto file. When you use code-first, that file is generated by the open-source protobuf-net.Grpc tool.
To use the Timestamp type, you need to tell the tool to format that property as a well-known type, using the ProtoMember attribute:
[ProtoMember(1, DataFormat = DataFormat.WellKnown)]
public DateTime Time { get; set; }
This is shown in the tool's Getting Started document.
This isn't the default for legacy reasons:
(for legacy reasons, protobuf-net defaults to a different library-specific layout that pre-dates the introduction of .google.protobuf.Timestamp). It is recommended to use DataFormat.WellKnown on DateTime and TimeSpan values whenever possible.
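If you'd rather not add [ProtoMember] to a class that otherwise only carries [DataMember] attributes, the same setting can - I believe - be applied at runtime through protobuf-net's RuntimeTypeModel. A minimal, untested sketch (assuming field 5 corresponds to [DataMember(Order = 5)] on DueDate, and that ValueMember.DataFormat is settable before first serialization):

```csharp
using ProtoBuf;
using ProtoBuf.Meta;

// Fetch the metadata for field 5 of TaskItem on the default model
// and switch it to the well-known google.protobuf.Timestamp layout.
var member = RuntimeTypeModel.Default[typeof(TaskItem)][5];
member.DataFormat = DataFormat.WellKnown;
```

This must run during startup, before the model is first used for TaskItem.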
I hope someone can help me. This is the code I use to retrieve a bunch of data about a particular device via an API (in this case, a Twinkly light string).
Here's my code, which is partially functional.
HttpResponseMessage result = await httpClient.GetAsync(uri);
string response = await result.Content.ReadAsStringAsync();
JObject jObject = JObject.Parse(response);
Layout layout = new Layout();
layout = JsonConvert.DeserializeObject<Layout>(response);
I say it's "partially" functional because every property at the root level deserializes into the model just fine, but the JSON also returns a property called "coordinates", which consists of an array entry for each bulb, each entry holding three values for x, y, z.
I have tried a lot of things to get the data from the coordinates array, and I can see in break mode that the data is in there.
However, it doesn't deserialize properly: I get the correct number of elements in the coordinates array, but they are all x:0, y:0, z:0.
Here is my model schema. I hope someone can help me with this. This is my first foray into API work, and the first time I've had a nested model like this.
internal class Layout
{
public int aspectXY { get; set; }
public int aspectXZ { get; set; }
public LedPosition[] coordinates { get; set; }
public string source { get; set; } //linear, 2d, 3d
public bool synthesized { get; set; }
public string uuid { get; set; }
}
internal class LedPosition
{
double x { get; set; }
double y { get; set; }
double z { get; set; }
}
Note: I've tried assigning the properties manually like this:
JToken dataToken = jObject.GetValue("coordinates");
and that indeed retrieved the data, but it didn't help me; it merely moved the issue.
You don't need to both parse and deserialize; this would be enough:
var response = await result.Content.ReadAsStringAsync();
var layout = JsonConvert.DeserializeObject<Layout>(response);
To make the LedPosition properties visible to the deserializer, make them public too:
public class LedPosition
{
public double x { get; set; }
public double y { get; set; }
public double z { get; set; }
}
Since it is used by another class, this class should be public too:
public class Layout
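Put together, a minimal sketch of the fixed flow (the sample JSON here is invented for illustration, trimmed to two bulbs):

```csharp
using System;
using Newtonsoft.Json;

const string json = @"{
  ""aspectXY"": 1,
  ""source"": ""2d"",
  ""synthesized"": false,
  ""coordinates"": [
    { ""x"": 0.1, ""y"": 0.2, ""z"": 0.0 },
    { ""x"": 0.3, ""y"": 0.4, ""z"": 0.0 }
  ]
}";

// With the properties made public, Json.NET can now set x/y/z,
// so they no longer default to 0.
var layout = JsonConvert.DeserializeObject<Layout>(json);
Console.WriteLine(layout.coordinates.Length);  // number of bulbs
Console.WriteLine(layout.coordinates[0].x);    // first bulb's x value
```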
One thing I learned recently from the CTO at my work: you can copy the JSON you're expecting and go to Edit -> Paste Special -> Paste JSON As Classes in Visual Studio, and it'll paste it as the classes you need, with the proper names/properties. Really slick. Maybe try that and see if it comes out with a different model than what you have now.
This is my first foray into api work
Two things I want to point out, then..
Does the API you're using publish a Swagger/OpenAPI document?
No - see 2 below
Yes - take a look at tools like NSwag(Studio), AutoRest and others. You feed the swagger.json into them and they crank out a few thousand lines of code: a client that does all the HTTP calling and deserializing, the data classes, etc. It means your code would end up looking like:
var client = new TwinklyLightClient();
var spec = client.GetTwinklyLightSpec();
foreach(var coord in spec.Coords)
    Console.Write(coord.X);
This is how APIs are supposed to be: the tools that create them operate to rules, and the tools that describe them operate to rules, so consuming them can also be done by tools operating to rules. Writing boilerplate JSON and HTTP request bodies is a job for a computer, because it's repetitive and always follows the same pattern.
The API doesn't publish a spec we can use to get the computer to write the boring bits for us. Durn. Well, you can either write the spec yourself (not so hard) or go slightly more manual:
Take your json (view it raw and copy it)
Go to any one of a number of websites that turn JSON into code - I like http://QuickType.io because it supports a lot of languages, has a lot of customization, and gives advanced examples of custom type deserialization, but there are others - and paste that JSON in
Instantly it's transformed into eg C# and can be pasted into your project
It gives an example of how to use it in the comments - a one-liner something like:
var json = httpCallHereTo.GetTheResponseAsJsonString();
var twinklyLightSpec = TwinklyLightSpec.FromJson(json);
Yes, Visual Studio can make classes from JSON, but it's not very sophisticated - it does the job, but these JSON-to-C# sites go further, allowing you to choose arrays or lists, name the root object, and decorate every property with a JsonProperty attribute that specifies the JSON name while keeping the C# property to C# naming conventions (or letting you rename it to suit you)..
..and they work out of the box, which would resolve the problem you're having right now.
I'm facing an issue where I feel that the answer should be obvious, but I'm missing something simple and it just escapes me :)
I'm developing an app that must talk to two different versions of Dynamics CRM, due to a migration period overlap.
I chose to implement all business processes using simple data objects that are CRM version independent and I use two connectors as plugins (separate csproj projects) in which I implement connection/query specific code for each CRM version.
Then I have a mapper project, where I also define an interface that plugins inherit so I can switch them freely. In this mapper I use a factory to instantiate appropriate connector depending on which CRM I want to talk to (decided at runtime), then extract all data I need into data objects and pass them on to business process handlers.
Plugin connectors have to reference appropriate MS Xrm nugets to connect and query respective CRM.
(Microsoft.CrmSdk.CoreAssemblies and Microsoft.PowerPlatform.Dataverse.Client)
These libraries are different versions and target different .NET versions, but they contain the same namespaces and types (particularly the EntityCollection type, which is the result of every query).
Because the plugins reference different Xrm libraries, the EntityCollection, although the same type in the same namespace, originates from a different assembly version each time. I need to use this type as the return type in my plugin interface, and be able to retrieve and work with EntityCollection in the mapper regardless of its containing assembly version.
The issues here are the type conversion between identical types residing in different versions of the same package, the method definition in the interface and its return type, and also which library the mapper project must reference to be able to define that interface.
I found this thread
Type conversion of identical types from different versions of same assembly
that practically states that it's not doable, but I'm not entirely convinced...
What do you think ?
Interesting situation. One idea would be to create a custom class, say MyEntityCollection, that you can construct from either type of Microsoft EntityCollection.
You could immediately convert any EntityCollection you retrieve into this type and deal with only MyEntityCollection in the rest of your code.
For example (untested code):
public class MyEntityCollection : IExtensibleDataObject
{
public MyEntityCollection(Microsoft.Xrm.Sdk.EntityCollection c)
{
Entities = c.Entities;
MoreRecords = c.MoreRecords;
PagingCookie = c.PagingCookie;
MinActiveRowVersion = c.MinActiveRowVersion;
TotalRecordCount = c.TotalRecordCount;
TotalRecordCountLimitExceeded = c.TotalRecordCountLimitExceeded;
EntityName = c.EntityName;
ExtensionData = c.ExtensionData;
}
public MyEntityCollection(Microsoft.PowerPlatform.Dataverse.Client.EntityCollection c)
{
Entities = c.Entities;
MoreRecords = c.MoreRecords;
PagingCookie = c.PagingCookie;
MinActiveRowVersion = c.MinActiveRowVersion;
TotalRecordCount = c.TotalRecordCount;
TotalRecordCountLimitExceeded = c.TotalRecordCountLimitExceeded;
EntityName = c.EntityName;
ExtensionData = c.ExtensionData;
}
public DataCollection<Entity> Entities { get; }
public Entity this[int index] { get; set; }
public bool MoreRecords { get; set; }
public string PagingCookie { get; set; }
public string MinActiveRowVersion { get; set; }
public int TotalRecordCount { get; set; }
public bool TotalRecordCountLimitExceeded { get; set; }
public string EntityName { get; set; }
public ExtensionDataObject ExtensionData { get; set; }
}
I have now resolved the issue with what one might call a "terrible hack" :-P
The mapper project still targets .NET 5 and now directly references the new client library, thus having access to the Entity and EntityCollection types from the newer SDK.
The plugin that wraps the old Xrm library is no longer a plugin but a standalone service app with its own .NET Framework 4.5 runtime. After a successful request to the on-prem CRM, this service serializes the EntityCollection result to JSON and sends it to the mapper, where it's deserialized into the newer version of EntityCollection using DataContractJsonSerializer.
My comparisons have not shown any incompatibilities (yet), so reviving the EntityCollection this way works for now.
Now I will have to do the same for OrganizationRequest, and I hope it works too :=)
Thanks for your tips and suggestions
I'm updating the SDK for the Azure Cognitive Search service from v10 to v11. I have followed all the steps in the upgrade guide; however, I have noticed a strange behavior in the indexing (merge or upload) operations: UploadDocumentsAsync (but also the other methods used to index data) fails when a property of type Collection(Edm.ComplexType) is null, with the following error:
A node of type 'PrimitiveValue' was read from the JSON reader when trying to read the contents of the property. However, a 'StartArray' node was expected json.
IndexDocumentsResult response = await searchClient.UploadDocumentsAsync<T>(documents).ConfigureAwait (false);
With v10 this problem did not arise. A workaround I found is to set the collections to empty arrays rather than null, but I would like to find a better solution.
EDITED:
I upgraded from Microsoft.Azure.Search v10.1.0 to Azure.Search.Documents v11.1.1
Here is an example of a generic T class used to index data:
public class IndexEntity
{
[JsonProperty("#search.score")]
public double SearchScore { get; set; }
[JsonProperty("Key")]
public Guid Id { get; set; }
[JsonProperty("Code")]
public string Code { get; set; }
[JsonProperty("ComplexObj")]
public ComplexType[] CollectionOfComplexType { get; set; }
}
And here is the definition of the complex type:
public class ComplexType
{
[JsonProperty("Id")]
public string Id { get; set; }
[JsonProperty("Value")]
public string Value { get; set; }
}
Basically, when the CollectionOfComplexType property is null, I get the above error. If I set it to an empty array, the error does not occur, but as mentioned, I don't like this solution; furthermore, in the old version this was an allowed operation (the indexing completed successfully).
Our Azure.Search.Documents behavior seems to have changed in this regard. I've opened https://github.com/Azure/azure-sdk-for-net/issues/18169 to track resolution.
You can work around this issue, without initializing your collections to empty arrays, by passing in a JsonSerializerSettings similar to what we did in our older Microsoft.Azure.Search library. Since you're using JsonPropertyAttribute, it seems you're using Newtonsoft.Json (a.k.a. Json.NET) anyway:
Add a package reference to Microsoft.Azure.Core.NewtonsoftJson if you haven't already. It recently GA'd, so you no longer need a preview version if you were using one - and I presume you were using this package, since System.Text.Json (our default serializer) would not have honored your property renames.
Pass in a JsonSerializerSettings when creating your SearchClient, like so:
var settings = new JsonSerializerSettings
{
// Customize anything else you want here; otherwise, defaults are used.
NullValueHandling = NullValueHandling.Ignore,
};
var options = new SearchClientOptions
{
Serializer = new NewtonsoftJsonObjectSerializer(settings),
};
// SearchClient also needs its endpoint, index name, and credential:
var searchClient = new SearchClient(endpoint, indexName, credential, options);
We'll discuss how to resolve this by default, if we even can. One big change from the older library is the ability to customize the serializer used: by default we use System.Text.Json, but we support other serializers, including Newtonsoft.Json. If someone were to pass in their own settings - or even want the defaults - changing this behavior could be catastrophic. So I'm curious: if we at least documented this behavior change (perhaps in the SearchClient class remarks and/or UploadDocuments and related methods) and how to retain the previous behavior, would that have helped or otherwise been satisfactory?
I'm seeing C# Uri types serialized to JSON in an OData 3 controller in my Web API 2 project as an array of segments that does not include the domain. I've tried everything I can think of to change this (including fiddling with the serialization settings and even trying the contract serializer instead of Json.NET). Nothing changes the behavior. Note: I am not using .NET Core. Here is a code sample, condensed into a single snippet.
namespace WebApplication1.Controllers
{
public class MyObject
{
public Uri Url { get; set; }
public string Name { get; set; }
public string ID { get; set; }
}
public class MyObjectsController : ODataController
{
private static ODataValidationSettings _validationSettings = new ODataValidationSettings();
public IHttpActionResult GetMyObjects(ODataQueryOptions<MyObject> queryOptions)
{
try
{
queryOptions.Validate(_validationSettings);
return Ok<IEnumerable<MyObject>>(new List<MyObject>() { new MyObject() { ID="asdf", Name="123rwe", Url = new Uri("http://www.webapp.com/sites/page.html") } });
}
catch (ODataException ex)
{
return BadRequest(ex.Message);
}
}
}
}
This generates the following JSON in the browser:
{
"odata.metadata":"http://localhost:51607/odata/$metadata#MyObjects","value":[
{
"Url":{
"Segments":[
"/","sites/","page.html"
]
},"Name":"123rwe","ID":"asdf"
}
]
}
This is what I'd like (without changing the Url property to a string):
{
"odata.metadata":"http://localhost:51607/odata/$metadata#MyObjects","value":[
{
"Url":"http://www.webapp.com/sites/page.html","Name":"123rwe","ID":"asdf"
}
]
}
Any thoughts?
UPDATE:
Further research suggests that the serialization behavior for Uri types in Web API OData is controlled by the ODataEntityReferenceLink and ODataEntityReferenceLinkSerializer classes. Specifically, Uri types appear to be converted to ODataEntityReferenceLink instances, which are then serialized in the manner I posted above (as an array of segments not including the root domain). I still need to know how to change this behavior; the documentation for these two classes is not proving helpful. Last, I've confirmed this problem is not specific to the JSON output format: the serialization behavior for both XML/Atom and JSON is identical - URIs are broken down into an array of segments.
MS Premier support provided a final answer to this which I'll share below.
There is no option to directly JSON-serialize a Uri type; normally it is broken into an array of segments, as you are observing in your code
The domain name is eliminated as the normal behavior
The option you can go for is to create a custom Uri converter deriving from JsonConverter, which is part of the Newtonsoft.Json namespace
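A minimal sketch of such a converter (untested; it assumes Json.NET ends up handling the payload, and that writing the Uri via its string form is acceptable - the class and registration names are illustrative, not from the support answer):

```csharp
using System;
using Newtonsoft.Json;

// Serializes System.Uri as its full string form instead of an
// object containing a Segments array.
public class UriStringConverter : JsonConverter
{
    public override bool CanConvert(Type objectType)
        => objectType == typeof(Uri);

    public override void WriteJson(JsonWriter writer, object value, JsonSerializer serializer)
        => writer.WriteValue(value?.ToString());

    public override object ReadJson(JsonReader reader, Type objectType, object existingValue, JsonSerializer serializer)
        => reader.Value == null ? null : new Uri((string)reader.Value);
}

// Hypothetical registration in WebApiConfig.Register:
// config.Formatters.JsonFormatter.SerializerSettings.Converters
//       .Add(new UriStringConverter());
```

Note that the OData media-type formatter may bypass the regular JsonFormatter pipeline, so whether this registration point takes effect for OData responses would need to be verified.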