I am using an Avro schema to dynamically produce messages to a Kafka cluster from a C# application, using the Confluent Kafka client. The data types are not known at compile time, so I am using the GenericRecord class from the Avro.Generic namespace, as described here: https://www.confluent.io/blog/decoupling-systems-with-apache-kafka-schema-registry-and-avro/.
However, I have an issue: if the schema has a field that can contain a null value, the field must still be added to the GenericRecord by calling the Add method with null as the value. My application is not aware of which fields can be null, and I don't think it should be, because that would defeat the purpose of nullable fields in the schema.
Avro schema:
{
"namespace": "Test",
"type": "record",
"doc": "Test bool type",
"name": "BoolType",
"version": "1",
"fields": [
{
"name": "Data",
"type": [ "null", "boolean" ],
"default": null
},
{
"name": "Source",
"type": "string"
}
]
}
C# code:
var valueRecord = new GenericRecord( valueAvroSchema );
valueRecord.Add( "Data", null );
valueRecord.Add( "Source", "Test app .NET" );
var messageToSend = new Message<GenericRecord, GenericRecord>
{
Key = keyRecord,
Value = valueRecord
};
await _producer.ProduceAsync( _topicName, messageToSend );
If the line:
valueRecord.Add( "Data", null );
is not present, the ProduceAsync method throws a Confluent.Kafka.ProduceException.
Is there any way to automatically populate the nullable fields in the GenericRecord? The same question applies to populating fields with their default values.
Is there a standard way of doing this, or do I need to write my own code that reads the schema and, just before publishing, adds any nullable fields my application has not already set?
Thank you!
Avro defaults are only relevant to consumers. The producer must always set each field.
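So you do need a small piece of code that walks the schema. Here is a minimal sketch of that idea against the Apache Avro C# API (RecordSchema.Fields, UnionSchema.Schemas, GenericRecord.TryGetValue); FillNullableDefaults is a name invented for this sketch, not something the library provides:
using System.Linq;
using Avro;
using Avro.Generic;

// Adds an explicit null for every nullable field the application has not
// set, so ProduceAsync no longer throws for missing nullable fields.
static void FillNullableDefaults(GenericRecord record, RecordSchema schema)
{
    foreach (var field in schema.Fields)
    {
        // A field counts as nullable when its type is a union containing "null".
        var isNullable = field.Schema is UnionSchema union
            && union.Schemas.Any(s => s.Tag == Schema.Type.Null);

        if (isNullable && !record.TryGetValue(field.Name, out _))
            record.Add(field.Name, null);
    }
}
Call it just before building the message, for example FillNullableDefaults(valueRecord, (RecordSchema)valueAvroSchema). Extending the same loop to non-null defaults would mean reading each Field's DefaultValue instead of hard-coding null.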
Related
I am trying to integrate with Salesforce's new gRPC Change Data Capture event bus. Events are sent to clients via gRPC with an Avro-encoded message describing the changes to the records, so the client has to decode the message using an Avro schema that is provided and cannot be changed.
I am able to easily decode the Avro-encoded message for objects whose fields are unions of two types, but fields with three-type unions throw exceptions.
This is the Name field of the Account Avro schema:
{
"name": "Name",
"type": [
"null",
"string",
{
"type": "record",
"name": "Switchable_PersonName",
"fields": [
{
"name": "Salutation",
"type": [
"null",
"string"
],
"default": null
},
{
"name": "FirstName",
"type": [
"null",
"string"
],
"default": null
},
{
"name": "LastName",
"type": [
"null",
"string"
],
"default": null
},
{
"name": "MiddleName",
"type": [
"null",
"string"
],
"default": null
},
{
"name": "InformalName",
"type": [
"null",
"string"
],
"default": null
},
{
"name": "Suffix",
"type": [
"null",
"string"
],
"default": null
}
]
}
],
"doc": "Data:Switchable_PersonName",
"default": null
},
As you can see, the name can be null, a string, or a record called Switchable_PersonName.
Using the avrogen CLI tool, I was able to convert the Avro schema into concrete C# classes: AccountChangeEvent, ChangeEventHeader, ChangeType, Address, and Switchable_PersonName.
The Name field was created in the AccountChangeEvent class as:
private object _Name;
This is the method I wrote for decoding the Avro message:
public static void DeserializeAccountConcrete(byte[] payload)
{
var accSchema = Avro.Schema.Parse(File.ReadAllText("./avro/AccountGRPCSchema.avsc"));
var unionSchema = accSchema as Avro.UnionSchema;
var cache = new ClassCache();
cache.LoadClassCache(typeof(AccountChangeEvent), unionSchema);
cache.LoadClassCache(typeof(Switchable_PersonName), unionSchema);
cache.LoadClassCache(typeof(Address), unionSchema);
var reader = new ReflectReader<AccountChangeEvent>(accSchema, accSchema, cache);
using var accStream = new MemoryStream(payload);
accStream.Seek(0, SeekOrigin.Begin);
var accDecoder = new BinaryDecoder(accStream);
var accEvent = reader.Read(accDecoder);
Console.WriteLine(accEvent.Name);
Console.WriteLine("Event " + accEvent.ChangeEventHeader.changeType);
}
This sort of deserialization works for other schemas, but for the Account schema it fails with this exception being thrown:
Avro.AvroException: Class for union record type com.sforce.eventbus.Switchable_PersonName is not registered. Create a ClassCache object and call LoadClassCache
Judging by the Avro documentation, my implementation looks correct, but it seems it is not.
I have changed the field type to
private com.sforce.eventbus.Switchable_PersonName _Name;
and updated any other code that relies on this field, but the same error is still thrown.
I am new to Avro, so there may be many things that I do not know or am doing wrong.
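One possible cause worth checking (a guess, not a verified fix): Schema.Parse on the Account schema returns a RecordSchema, so "accSchema as Avro.UnionSchema" evaluates to null, and the LoadClassCache calls may then not register the nested types at all. Here is a sketch that digs the Switchable_PersonName record schema out of the Name field's union and registers it explicitly:
using System.IO;
using System.Linq;
using Avro;
using Avro.Reflect;

// Speculative: register the nested record schema itself rather than a
// (null) union cast of the top-level record schema.
var accSchema = (RecordSchema)Schema.Parse(File.ReadAllText("./avro/AccountGRPCSchema.avsc"));
var nameUnion = (UnionSchema)accSchema["Name"].Schema;
var personNameSchema = nameUnion.Schemas.First(s => s is RecordSchema);

var cache = new ClassCache();
cache.LoadClassCache(typeof(AccountChangeEvent), accSchema);
cache.LoadClassCache(typeof(Switchable_PersonName), personNameSchema);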
How do you create objects when you don't know all their properties beforehand?
I want to create an application that works with a source, a mapping schema and a destination:
The source is a dynamic object from a database.
The mapping schema is generated from JSON or in some other way.
The destination object is generated based on the source and the mapping schema.
Let's say you receive the following source data:
{
"ModuleName": "example name",
"type": "string"
}
And you receive another dynamic JSON mapping file which specifies how some properties should be mapped from the JSON:
"mappings": [
{
"source": {
"name": "ModuleName",
"type": "String"
},
"destination": {
"name": "Module",
"type": "Single Line string"
}
},
]
I want to convert to:
{
"Module": "example name",
"type": "Single Line string"
}
How can I do this using .NET?
Any help is welcome. Thanks!
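A minimal sketch of one way to do this with Newtonsoft.Json (Json.NET), assuming the mapping file has exactly the shape shown above (the "mappings"/"source"/"destination" names); ApplyMappings is a name invented for this sketch:
using Newtonsoft.Json.Linq;

// Builds the destination object by copying each mapped source property
// to the name given in the mapping's "destination" entry.
public static JObject ApplyMappings(JObject source, JObject mappingFile)
{
    var result = new JObject();
    foreach (var mapping in (JArray)mappingFile["mappings"])
    {
        var sourceName = (string)mapping["source"]["name"];
        var destName = (string)mapping["destination"]["name"];
        if (source.TryGetValue(sourceName, out var value))
            result[destName] = value;
    }
    return result;
}
Your expected output also carries the destination "type" ("Single Line string"); where that lands when a file contains several mappings is a design decision this sketch leaves open.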
I've created a Cosmos DB database with a single table, called MyTable. My idea is that I want to insert into this table from an Azure Function. Having had a look around, I've come up with this function:
public static HttpResponseMessage Run(HttpRequestMessage req, TraceWriter log, out object tableOutput)
{
log.Info("C# HTTP trigger function processed a request.");
// parse query parameter
string field1 = req.GetQueryNameValuePairs()
.FirstOrDefault(q => string.Compare(q.Key, "field1", true) == 0)
.Value;
string field2 = req.GetQueryNameValuePairs()
.FirstOrDefault(q => string.Compare(q.Key, "field2", true) == 0)
.Value;
var item = new
{
field1 = field1,
field2 = field2
};
tableOutput = item;
return req.CreateResponse(HttpStatusCode.OK);
}
The error that I get is this:
2017-12-07T15:52:44.066 Exception while executing function:
Functions.MyFunc. Microsoft.Azure.WebJobs.Host: Error while handling
parameter tableOutput after function returned:.
Microsoft.Azure.WebJobs.Extensions.DocumentDB: The collection
'tableOutput' (in database 'myCosmosDB') does not exist. To
automatically create the collection, set 'CreateIfNotExists' to
'true'. Microsoft.Azure.Documents.Client: Message: {"Errors":["Owner
resource does not exist"]}
I have set up the output parameter, and I have seen the checkbox mentioned here (CreateIfNotExists); however, I have an existing Cosmos table set up and would like to write to that. So my question is: how can I access that table from within an Azure Function?
The function.json is below:
{
"bindings": [
{
"authLevel": "function",
"name": "req",
"type": "httpTrigger",
"direction": "in"
},
{
"name": "$return",
"type": "http",
"direction": "out"
},
{
"type": "documentDB",
"name": "outputDocument",
"databaseName": "mycosmosdb",
"collectionName": "tableOutput",
"createIfNotExists": false,
"connection": "mycostmosdb_DOCUMENTDB",
"direction": "out"
}
],
"disabled": false
}
EDIT:
Two people have implied that, for the purposes of this question, the terms Table and Collection are synonymous. If I have understood correctly, this appears not to be the case: even when I change the collection name in the function.json to match the name of the table that I created, I get the same error.
To clarify the Cosmos DB table configuration: inside Data Explorer, I see a node entitled TablesDB, which has a sub-node for the table that I've created.
You need to update the function.json with the correct "collectionName" value for the Cosmos DB collection you're trying to save the new document to. Currently, you do NOT have a collection in your Cosmos DB database ("mycosmosdb") with the name "tableOutput".
Also, if you'd like the Azure Functions output binding to create the collection automatically when it doesn't exist, set the function.json property "createIfNotExists" to "true".
Either approach will get your code working without error, but you'll need to make sure "collectionName" is set to the correct name of the Cosmos DB collection you have set up.
For example, try changing your function.json to the following:
{
"bindings": [
{
"authLevel": "function",
"name": "req",
"type": "httpTrigger",
"direction": "in"
},
{
"name": "$return",
"type": "http",
"direction": "out"
},
{
"type": "documentDB",
"name": "outputDocument",
"databaseName": "mycosmosdb",
"collectionName": "MyTable",
"createIfNotExists": true,
"connection": "mycostmosdb_DOCUMENTDB",
"direction": "out"
}
],
"disabled": false
}
Make sure this is the value of your existing collection (you mention MyTable, but it says tableOutput):
"collectionName": "tableOutput",
Make sure the names are identical. Collection names are case sensitive: "mytable" and "myTable" are different collections.
I cannot find this in the Json.NET API reference or anywhere else.
I want to create an object from a JSON schema with the default values filled in.
Basically, something like this:
var jsonSchema = JsonSchema.ReadSchemaFromSomeWhere();
dynamic defaultObject = jsonSchema.GetDefaultObject();
An example of this can be seen in the json-schema-defaults package.
Example
var jsonSchema = JsonSchema.ReadSchemaFromString("
{
"title": "Album Options",
"type": "object",
"properties": {
"sort": {
"type": "string",
"default": "id"
},
"per_page": {
"default": 30,
"type": "integer"
}
}");
dynamic defaultObject = jsonSchema.GetDefaultObject();
// defaultObject dump is
{
sort: 'id',
per_page: 30
}
UPDATE
I want a library or an API in Json.NET that creates an object with default values from any given valid JSON schema at runtime.
Well, a simple case might be this:
[Test]
public void Test()
{
dynamic ob = new JsonObject();
ob["test"] = 3;
Assert.That(ob.test, Is.EqualTo(3));
}
I used the RestSharp library, which provides a good dynamic implementation that allows indexing with ["test"].
So then, what you're left to do is read the properties from the schema and assign the values. Of course, this will only work for simple, flat cases, but it might be a start:
dynamic ob = new JsonObject();
foreach (var prop in JsonSchema.Properties)
{
if (prop.Default != null)
ob[prop.Name] = prop.Default;
}
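For reading the defaults straight from a schema at runtime, a similar flat sketch is possible with the separate Newtonsoft.Json.Schema package (an assumption that you can take that dependency; it ships and is licensed independently of Json.NET):
using Newtonsoft.Json.Linq;
using Newtonsoft.Json.Schema;

// Builds a JObject from the "default" values of a flat object schema.
// Nested objects and arrays would need recursion on top of this.
public static JObject GetDefaultObject(JSchema schema)
{
    var obj = new JObject();
    foreach (var prop in schema.Properties)
    {
        if (prop.Value.Default != null)
            obj[prop.Key] = prop.Value.Default;
    }
    return obj;
}
Parsing the album schema from the question with JSchema.Parse(...) and running it through this method yields { "sort": "id", "per_page": 30 }.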
I'm using the Fhir-net-api with DSTU2 to parse JSON objects into C# models. Everything works well, except that I can't access the Reason element of the resource type Procedure. As an example, I parse the following JSON object into the Procedure model using the FhirParser:
{
"resourceType": "Procedure",
"identifier": [
{
"system": "https://mrd2.melanoma.org.au/fhir",
"value": "100200199664802"
}
],
"subject": { "reference": "Patient/10101000001733" },
"status": "completed",
"category": {
"coding": [
{
"system": "https://mrd2.melanoma.org.au/fhir/RootType",
"code": "3004"
}
],
"text": "Primary Surgery"
},
"bodySite": [
{
"coding": [
{
"system": "http://snomed.info/sct",
"code": "7771000"
}
],
"text": "Left Forearm, Anterior"
}
],
"reasonReference": { "reference": "/Condition/10106000001807" },
"performedDateTime": "1968-03-11",
"report": [ { "reference": "/DiagnosticReport/100200199664828" } ]
}
and the generated object has the following entries (excerpt):
[debugger view of the parsed Procedure object]
I can access Report[0].Reference just fine, but it doesn't work with Reason.Reference. Is the data in my JSON object wrong?
I've seen that Reason is of type Hl7.Fhir.Model.Element and Report of type Hl7.Fhir.Model.ResourceReference. Is there a way to change Reason to Hl7.Fhir.Model.ResourceReference and then access the Reference element?
Would be grateful for any hints. Thanks.
Regards,
Trammy
As you noticed, the type of reasonReference is Model.Element, while the type of report is ResourceReference. This difference has its origin in the definition of these elements in the FHIR specification for Procedure, where report is fixed to the type Reference, but reason (or rather reason[x]) can be either a CodeableConcept or a Reference.
When an element can be of multiple types (we call this a "choice element", and you can recognize one because its name ends with [x] in the specification), we have created a C# member of type Model.Element (the base class for both ResourceReference and CodeableConcept).
Now, depending on the instance you have just parsed or received, the contents of the Reason member can be either of the two types, so you'll have to check in your code:
// procedure is the parsed Hl7.Fhir.Model.Procedure instance
if (procedure.Reason is ResourceReference)
{
var reference = (ResourceReference)procedure.Reason;
// handle the case where this is a reference;
// reference.Reference will now work as expected
}
else if (procedure.Reason is CodeableConcept)
{
var concept = (CodeableConcept)procedure.Reason;
// handle the case where this is a codeable concept
}
else
{
// Oops! Should not happen unless the standard has changed
}
Of course, if you are sure you will only receive instances where Reason is a ResourceReference, you can cast directly:
var myReference = (ResourceReference)procedure.Reason;
// myReference.Reference and myReference.Display will now work