Protocol Buffer Field Fails to Serialize, Ends Up as Null - c#

I'm using Google Protobuf 3.17.3.
PS C:\Users\Name> protoc --version
libprotoc 3.17.3
I'm running into an issue where, upon serialization of a given object, a field that is non-null is not being serialized. I know this because when the proto message gets to the server side, its fields are null despite being non-null on the client side.
For example, in the code snippet below, while Google.Protobuf.ExchangeProto.Update's UpdateNewOrder field is non-null, after deserializing this field on the server side, it ends up as null. The UpdateType field serializes just fine.
public async Task SendNewOrderAsync(NewOrder newOrder)
{
    var networkStream = _tcpClient.GetStream();
    using var ms = new MemoryStream();
    {
        using var gms = new Google.Protobuf.CodedOutputStream(ms);
        var update = new Google.Protobuf.ExchangeProto.Update()
        {
            UpdateType = Google.Protobuf.ExchangeProto.Update.Types.UpdateType.NewOrder,
            UpdateNewOrder = ProtoAdapter.NewOrderToProto(newOrder), // <-- Problematic field!
        };
        update.WriteTo(gms);
    }
    await networkStream.WriteAsync(ms.ToArray());
}
It appears this isn't a server-side problem, though. I noticed, post-serialization on the client side, that the underlying MemoryStream object is particularly small (only 27 bytes).
Client Side Memory Stream Object Details
Server Side Object Post-Parse - Missing UpdateNewOrder Field
ProtoAdapter is where the to-and-from-proto conversion logic resides. This particular NewOrderToProto method looks like this:
public static Google.Protobuf.ExchangeProto.NewOrder NewOrderToProto(NewOrder newOrder)
{
    return new Google.Protobuf.ExchangeProto.NewOrder()
    {
        Symbol = newOrder.Symbol,
        IsBuy = newOrder.IsBuy,
        OrderId = newOrder.OrderID,
        Price = newOrder.Price,
        Quantity = newOrder.Quantity,
    };
}
The update message I'm attempting to serialize looks like the following:
message Update
{
    enum UpdateType
    {
        UpdateType_NewOrder = 0;
        UpdateType_CancelOrder = 1;
    }
    UpdateType update_type = 1;
    NewOrder update_new_order = 2;
    CancelOrder update_cancel_order = 3;
}
The message that isn't serializing properly in the Update message's update_new_order field looks like this:
message NewOrder
{
    string symbol = 1;
    int32 quantity = 2;
    int64 price = 3;
    bool is_buy = 4;
    uint64 order_id = 5;
}
Question: Why is the UpdateNewOrder field not being serialized?

So I looked into this and started playing around with the order of operations on the server side. The issue wasn't on the client side. I found that out by deserializing the message on the client side right after serializing it and noticing that the UpdateNewOrder field was non-null, which is exactly what I needed to rule out a client-side issue.
I then started playing with the server side and noticed that the Parser's ParseFrom method can take any object that inherits from Stream, but it seems to work best when passing in a CodedInputStream, as shown below.
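A minimal sketch of that server-side parse, assuming the same generated Update message and an already-connected NetworkStream (variable names here are illustrative, and this fragment won't compile without the generated ExchangeProto classes):

```csharp
using Google.Protobuf;
using Google.Protobuf.ExchangeProto;

// Wrap the raw network stream in a CodedInputStream and hand that to the
// generated parser instead of passing the Stream directly.
using var cis = new CodedInputStream(networkStream);
Update update = Update.Parser.ParseFrom(cis);

// The nested message should now survive the round trip.
Console.WriteLine(update.UpdateNewOrder?.Symbol);
```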


How to use AskAdjacentFacet function in NXOpen?

I am working with convergent facet bodies in NX, using C# in NXOpen. In the process, I am using the UFSession.Facet.AskAdjacentFacet function to get the adjacent facets of each facet. But on using this particular command, NXOpen throws an error stating "NXOpen.NXException: The facet object is not supported for this operation". I went through the example given in the NXOpen documentation (https://docs.plm.automation.siemens.com/data_services/resources/nx/10/nx_api/en_US/custom/ugopen_doc/uf_facet/uf_facet_eg2.c) and used a similar approach, but this error shows up anyway. Below is the script that I tried.
public static void Main(string[] args)
{
    NXOpen.UF.UFFacet myFacet = UFSession.Facet;
    int facetID;
    int edgeID;
    int adjFacID;
    int edgeIDinAdjFac;
    int null_facet_ID = UFConstants.UF_FACET_NULL_FACET_ID;
    facetID = null_facet_ID;
    foreach (NXOpen.Facet.FacetedBody facetBody in workPart.FacetedBodies)
    {
        myFacet.CycleFacets(facetBody.Tag, ref facetID); // initialise for cycling
        while (facetID != null_facet_ID)
        {
            List<int> Adj_fac_list = new List<int>();
            for (edgeID = 0; edgeID < 3; edgeID++)
            {
                myFacet.AskAdjacentFacet(facetBody.Tag, facetID, edgeID, out adjFacID, out edgeIDinAdjFac);
                if (adjFacID != UFConstants.UF_FACET_NULL_FACET_ID)
                {
                    Adj_fac_list.Add(adjFacID);
                }
            }
            myFacet.CycleFacets(facetBody.Tag, ref facetID); // advance to the next facet
        }
    }
}
Note: I can use the same model tag and facet id in the function UFSession.Facet.AskNumVertsInFacet and the script works fine. But I do not know why AskAdjacentFacet is not working. Can anyone help me with why there is an error and how to get this working?
At first glance, the problem I see is that the variable myFacet has not been initialized correctly and is null. And since it is null, you cannot call its members.
So change a single line of the code from
NXOpen.UF.UFFacet myFacet = UFSession.Facet;
to
NXOpen.UF.UFFacet myFacet = UFSession.GetUFSession().Facet;

How to write JSON to Event Hub correctly

I'm batching serialized records (in a JArray) to send to Event Hub. When I write the data to Event Hubs, it seems to insert extra quotation marks around the JSON, i.e. "{"myjson":"blah"}" is written instead of {"myjson":"blah"}, so downstream I'm having trouble reading it.
Based on this guidance, I must convert JSON to string and then use GetBytes to pass it into an EventData object. I suspect my attempt at following this guidance is where my issue is arising.
using System.Text; // for Encoding
using Newtonsoft.Json;
using Newtonsoft.Json.Linq;

public static class EventDataTransform
{
    public static EventData ToEventData(dynamic eventObject, out int payloadSize)
    {
        string json = eventObject.ToString(Formatting.None);
        payloadSize = Encoding.UTF8.GetByteCount(json);
        var payload = Encoding.UTF8.GetBytes(json);
        var eventData = new EventData(payload);
        return eventData;
    }
}
How should an item from a JArray containing serialized data be converted into the contents of an EventData message?
Code call location - used to batch up to 256 KB parcels:
public bool MoveNext()
{
    var batch = new List<EventData>(_allEvents.Count);
    var batchSize = 0;
    for (int i = _lastBatchedEventIndex; i < _allEvents.Count; i++)
    {
        dynamic evt = _allEvents[i];
        int payloadSize = 0;
        var eventData = EventDataTransform.ToEventData(evt, out payloadSize);
        var eventSize = payloadSize + EventDataOverheadBytes;
        if (batchSize + eventSize > MaxBatchSizeBytes)
        {
            break;
        }
        batch.Add(eventData);
        batchSize += eventSize;
    }
    _lastBatchedEventIndex += batch.Count;
    _currentBatch = batch;
    return _currentBatch.Count > 0;
}
It sounds like the JArray already contains serialized objects (strings). Calling .ToString(Formatting.None) on such an element serializes it a second time, wrapping it in quotes.
Interestingly enough, if you call .ToString() without passing in a Formatting, it does not serialize it again.
This fiddle demonstrates this: https://dotnetfiddle.net/H4p6KL
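The double-serialization effect is easy to reproduce. The sketch below uses System.Text.Json purely for illustration (the question itself uses Newtonsoft.Json, where .ToString(Formatting.None) on a string token has the same effect):

```csharp
using System;
using System.Text.Json;

public static class DoubleSerializationDemo
{
    public static void Main()
    {
        // A payload that is ALREADY a JSON string.
        string alreadySerialized = "{\"myjson\":\"blah\"}";

        // Serializing it again treats it as a plain string value:
        // the result is wrapped in quotes and the inner quotes are escaped.
        string doubleSerialized = JsonSerializer.Serialize(alreadySerialized);

        Console.WriteLine(doubleSerialized); // prints "{\"myjson\":\"blah\"}" (with surrounding quotes)
    }
}
```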

Concurrent execution of Cassandra prepared statement returns invalid JSON as result

When I use a prepared statement for async execution of multiple statements, I get JSON with broken data. Keys and values get totally corrupted.
I first encountered this issue while performing stress testing of our project using a custom script. We are using the DataStax C++ driver and execute statements from different fibers.
I then tried to isolate the problem and wrote a simple C# program which starts multiple Tasks in a loop. Each task uses the prepared statement (created once) to read data from the database. For some rows the result is a total mess, e.g.:
Expected (fetched by cqlsh)
516b00a2-01a7-11e6-8630-c04f49e62c6b |
lucid_lynx_value_45404 |
precise_pangolin_value_618429 |
saucy_salamander_value_302796 |
trusty_tahr_value_873 |
vivid_vervet_value_216045 |
wily_werewolf_value_271991
Actual
{
    "sa": "516b00a2-01a7-11e6-8630-c04f49e62c6b",
    "lucid_lynx": "wily_werewolflue_45404",
    "precise_pangolin": "precise_pangolin_value_618429",
    "saucy_salamander": "saucy_salamander_value_302796",
    "trusty_tahr": "trusty_tahr_value_873",
    "vivid_vervet": "vivid_vervet_value_216045",
    "wily_werewolf": "wily_werewolf_value_271991"
}
Here is the main part of C# code.
static void Main(string[] args)
{
    const int task_count = 300;
    using(var cluster = Cluster.Builder().AddContactPoints(/*contact points here*/).Build())
    {
        using(var session = cluster.Connect())
        {
            var prepared = session.Prepare("select json * from test_neptunao.ubuntu where id=?");
            var tasks = new Task[task_count];
            for(int i = 0; i < task_count; i++)
            {
                tasks[i] = Query(prepared, session);
            }
            Task.WaitAll(tasks);
        }
    }
    Console.ReadKey();
}

private static Task Query(PreparedStatement prepared, ISession session)
{
    string id = GetIdOfRandomRow();
    var stmt = prepared.Bind(id);
    stmt.SetConsistencyLevel(ConsistencyLevel.One);
    return session.ExecuteAsync(stmt).ContinueWith(tr =>
    {
        foreach(var row in tr.Result)
        {
            var value = row.GetValue<string>(0);
            //some kind of output
        }
    });
}
CQL script with test DB schema.
CREATE KEYSPACE IF NOT EXISTS test_neptunao
WITH replication = {
    'class' : 'SimpleStrategy',
    'replication_factor' : 3
};
use test_neptunao;

create table if not exists ubuntu (
    id timeuuid PRIMARY KEY,
    precise_pangolin text,
    trusty_tahr text,
    wily_werewolf text,
    vivid_vervet text,
    saucy_salamander text,
    lucid_lynx text
);
UPD
Expected JSON
{
    "id": "516b00a2-01a7-11e6-8630-c04f49e62c6b",
    "lucid_lynx": "lucid_lynx_value_45404",
    "precise_pangolin": "precise_pangolin_value_618429",
    "saucy_salamander": "saucy_salamander_value_302796",
    "trusty_tahr": "trusty_tahr_value_873",
    "vivid_vervet": "vivid_vervet_value_216045",
    "wily_werewolf": "wily_werewolf_value_271991"
}
UPD2
Here is the sample C# project mentioned above
UPD3
The issue was resolved after upgrading to Cassandra 3.5.
It sounds like you're seeing CASSANDRA-11048 (JSON Queries are not thread safe). Upgrading Cassandra to a version with the fix is the best way to resolve this.
The only error I see in the generated JSON is the name of the primary key, which should be "id" instead of "sa". Otherwise the other columns are correct.
{
    "sa": "516b00a2-01a7-11e6-8630-c04f49e62c6b",
    "lucid_lynx": "wily_werewolflue_45404",
    "precise_pangolin": "precise_pangolin_value_618429",
    "saucy_salamander": "saucy_salamander_value_302796",
    "trusty_tahr": "trusty_tahr_value_873",
    "vivid_vervet": "vivid_vervet_value_216045",
    "wily_werewolf": "wily_werewolf_value_271991"
}
What kind of JSON structure did you expect to get as a result?

Find Item in List<T> That Threw Exception

I have inherited a WCF web service application that needs much better error tracking. What we do is query data from one system (AcuODBC) and send that data to another system (Salesforce). This query can return tens of thousands of complex objects as a List<T>. We then process this List<T> in batches of 200 records at a time, mapping the fields to another object type, then sending that batch to Salesforce. After this is completed, the next batch starts. Here's a brief example:
int intStart = 0, intEnd = 200;
//done in a loop, snipped for brevity
var leases = from i in trleases.GetAllLeases(branch).Skip(intStart).Take(intEnd)
             select new sforceObject.SFDC_Lease()
             {
                 LeaseNumber = i.LeaseNumber.ToString(),
                 AccountNumber = i.LeaseCustomer,
                 Branch = i.Branch
                 (...) //about 150 properties
             };
//do stuff with list and increment to next batch
intStart += 200;
However, the problem is if one object has a bad field mapping (Invalid Cast Exception), I would like to print out the object that failed to a log.
Question
Is there any way I can decipher which object of the 200 threw the exception? I could forgo the batch concept that was given to me, but I'd rather avoid that if possible for performance reasons.
This should accomplish what you are looking for with very minor code changes:
int intStart = 0, intEnd = 200, count = 0;
List<SDFC_Lease> leases = new List<SDFC_Lease>();
//done in a loop, snipped for brevity
foreach (var i in trleases.GetAllLeases(branch).Skip(intStart).Take(intEnd))
{
    try
    {
        count++;
        leases.Add(new sforceObject.SFDC_Lease()
        {
            LeaseNumber = i.LeaseNumber.ToString(),
            AccountNumber = i.LeaseCustomer,
            Branch = i.Branch
            (...) //about 150 properties
        });
    }
    catch (Exception ex)
    {
        // you now have your culprit, either as 'i' or from the index 'count'
    }
}
//do stuff with 'leases' and increment to next batch
intStart += 200;
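To make the idea concrete, here is a self-contained sketch of the same pattern with a dummy row type (SourceRow, BatchMapper, and FindFailedIndexes are illustrative names, not from the original service):

```csharp
using System;
using System.Collections.Generic;

public class SourceRow
{
    // object on purpose: a null here simulates a row whose field fails to map
    public object LeaseNumber;
}

public static class BatchMapper
{
    // Returns the indexes of the rows whose mapping step threw.
    public static List<int> FindFailedIndexes(List<SourceRow> rows)
    {
        var failed = new List<int>();
        for (int i = 0; i < rows.Count; i++)
        {
            try
            {
                // The mapping step; a bad value throws here.
                string mapped = rows[i].LeaseNumber.ToString();
            }
            catch (Exception)
            {
                failed.Add(i); // 'i' pinpoints the offending source object
            }
        }
        return failed;
    }
}

public static class Demo
{
    public static void Main()
    {
        var rows = new List<SourceRow>
        {
            new SourceRow { LeaseNumber = 123 },
            new SourceRow { LeaseNumber = null }, // this row will throw
            new SourceRow { LeaseNumber = 456 },
        };
        Console.WriteLine(string.Join(",", BatchMapper.FindFailedIndexes(rows))); // prints "1"
    }
}
```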
I think you could set a flag in each property setter of the SFDC_Lease class, using a static property for this, like:
public class SFDC_Lease
{
    public static string LastPropertySetted;

    private string _leaseNumber;
    public string LeaseNumber
    {
        get { return _leaseNumber; }
        set
        {
            LastPropertySetted = "LeaseNumber";
            _leaseNumber = value; // assign the backing field, not the property, to avoid infinite recursion
        }
    }
}
Please feel free to improve this design.

Implementing IFormatter recursively

I'm trying to implement a custom formatter using the .NET IFormatter interface.
After a couple of hours of searching, I only found a very basic sample which unfortunately doesn't include recursion. I also tried using Reflector to look at BinaryFormatter and SoapFormatter, but they are rather complex.
My question is:
Should I implement recursion myself, or is there something I've missed in FormatterServices?
Following my code:
public void Serialize(Stream serializationStream, object graph)
{
    // Get fields that are to be serialized.
    MemberInfo[] members = FormatterServices.GetSerializableMembers(graph.GetType(), Context);

    // Get fields data.
    object[] data = FormatterServices.GetObjectData(graph, members);

    // Write class name and all fields & values to file
    StreamWriter sw = new StreamWriter(serializationStream);
    string accumulator = string.Empty;
    for (int i = 0; i < data.Length; ++i)
    {
        // Skip this field if it is marked NonSerialized.
        if (Attribute.IsDefined(members[i], typeof(NonSerializedAttribute)))
            continue;

        FieldInfo field = (FieldInfo)members[i];
        if (field.FieldType.IsPrimitive)
        {
        }
        else
        {
            //TODO: What should I do here?
        }
    }
    sw.Close();
}
If by recursion you mean traversing the object tree, then yes, it is up to you when you implement your own IFormatter.
Simply check whether the value of the property is null and whether it implements the IFormatter interface. If it does, just call it and use the value it returns.
If not, then it is up to you again: you may throw an exception saying that IFormatter must be implemented, or fall back to some sort of default formatter (an XML or binary one).
Recursion per se is tricky. When, let's say, an object references itself, you need to be smart enough to handle this situation and not end up in an infinite loop:
public class A
{
    public object SomeProperty { get; set; }
}

var a = new A();
a.SomeProperty = a;
There are a number of other tricky aspects to implementing formatters. For example, what if two properties actually reference the same object? Will you serialize/format it twice, or just once, keeping track of these references somehow?
You probably don't need this if you want just one-way serialization, but if you want to be able to restore the object it might be important...
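A minimal sketch of such reference tracking with a visited set, assuming .NET 5+ for ReferenceEqualityComparer (the Node type and CountDistinct helper are illustrative only, not part of IFormatter):

```csharp
using System;
using System.Collections.Generic;

public class Node
{
    public Node Next;
}

public static class GraphWalker
{
    // Walks a chain of Nodes; the visited set (compared by reference, not by
    // Equals) breaks cycles such as a node that references itself.
    public static int CountDistinct(Node start)
    {
        var seen = new HashSet<Node>(ReferenceEqualityComparer.Instance);
        var node = start;
        while (node != null && seen.Add(node))
        {
            node = node.Next;
        }
        return seen.Count;
    }
}

public static class Demo
{
    public static void Main()
    {
        var a = new Node();
        a.Next = a; // self-reference, like the A example above
        Console.WriteLine(GraphWalker.CountDistinct(a)); // prints "1"
    }
}
```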
