So I am trying to send a message to a queue that accepts strings of max length 482. The string I am sending has a length of 452. Here is the block of code where the request queue is accessed:
var openOptions = MQC.MQOO_OUTPUT + MQC.MQOO_FAIL_IF_QUIESCING;
requestQueue = queueManager.AccessQueue(requestQueueName, openOptions);
var messageObject = new MQMessage();
messageObject.WriteString(message);
openReplyQueue(replyQueueName);
messageObject.ReplyToQueueName = replyQueue.Name;
messageObject.Format = MQC.MQFMT_STRING;
messageObject.MessageType = MQC.MQMT_REQUEST;
messageObject.Report = MQC.MQRO_COPY_MSG_ID_TO_CORREL_ID;
messageObject.Expiry = 300;
var pmo = new MQPutMessageOptions();
pmo.Options = MQC.MQPMO_FAIL_IF_QUIESCING;
requestQueue.Put(messageObject, pmo);
The code fails on the last line with an MQException, reason code 2030. With some console output, I found that the message length in the MQMessage object is 904: exactly double the length of the string I am trying to send, and far longer than the queue's maximum message length.
How do I keep this doubling from happening and make sure the message length stays at 452?
IBM MQ classes for .NET default to CCSID 1200 (UTF-16), which is a double-byte character set (DBCS). Because each character is represented as two bytes, your 452-character string is represented as 904 bytes.
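The doubling is easy to reproduce outside MQ. A quick sketch in Python, with cp437 standing in for CCSID 437 and a dummy string standing in for the actual message:

```python
# Illustrative only: the same 452-character string occupies 904 bytes
# in UTF-16 (CCSID 1200) but 452 bytes in a single-byte code page.
text = "A" * 452  # stand-in for the actual 452-character message

utf16_bytes = text.encode("utf-16-le")  # UTF-16, as in CCSID 1200
cp437_bytes = text.encode("cp437")      # single-byte, as in CCSID 437

print(len(utf16_bytes))  # 904 - what the queue manager sees by default
print(len(cp437_bytes))  # 452 - after switching to a single-byte CCSID
```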
If the application getting the message from the queue expects 452 characters and uses Get with the Convert option, the message will be read correctly. If the reading application uses an ASCII character set, the message will be converted and read as 452 bytes. This also works if the getting application reads in CCSID 1200 or another DBCS: since the application expects 452 characters, even in a DBCS it still gets 452 characters. If this is how your getting application works, one option is to increase the MAXMSGL of the queue to accommodate messages encoded in a DBCS.
Another option is to have your putting application put the message in an ASCII character set such as CCSID 437.
To set the CCSID to 437 use the following:
messageObject.CharacterSet = 437;
Related
I am writing an Android app in C# (using Avalonia) to communicate with my embedded USB device. The device has an ATxmega microcontroller and uses the Atmel-provided CDC USB driver. I have a desktop version of the app using System.IO.Ports.SerialPort. I use SerialPort.Write() to write data and SerialPort.ReadByte() in a loop to read data; using SerialPort.Read() to read multiple bytes very often fails, while the looped byte-by-byte reading basically never does. It works on both Windows and Mac PCs.
The communication is very straightforward - the host sends commands and expects a known length of data to come in from the device.
In the Android app I am using the android.hardware.usb classes and adapted code from several of the serial-port wrappers for CDC devices. Here is my connect function:
public static bool ConnectToDevice()
{
    if (Connected) return true;
    UsbDevice myDevice;
    if ((myDevice = GetConnectedDevice()) == null) return false;
    if (!HasPermission(myDevice)) return false;
    var transferInterface = myDevice.GetInterface(1);
    var controlInterface = myDevice.GetInterface(0);
    writeEndpoint = transferInterface.GetEndpoint(1);
    readEndpoint = transferInterface.GetEndpoint(0);
    deviceConnection = manager.OpenDevice(myDevice);
    if (deviceConnection != null)
    {
        deviceConnection.ClaimInterface(transferInterface, true);
        deviceConnection.ClaimInterface(controlInterface, true);
        SendAcmControlMessage(0x20, 0, parameterMessage);
        SendAcmControlMessage(0x22, 0x03, null); // DTR + RTS true
        Connected = true;
        OnConnect?.Invoke();
        return true;
    }
    return false;
}
The two control transfers are what I borrowed from the serial port wrappers and I use them to:
Set the default parameters: 115200 baud, 1 stop bit, no parity, 8 data bits, the same as in my embedded USB config. This doesn't seem necessary, but I do it anyway. The response from the control transfer is 7, so I assume it works properly.
Set DTR and RTS to true. This doesn't seem to be necessary either; the desktop code works with both set to false as well as to true. The control transfer response is 0, so I assume it also works properly.
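For reference, the 7-byte payload of the 0x20 (SET_LINE_CODING) request follows the standard CDC ACM line-coding layout: a little-endian 32-bit baud rate, then one byte each for stop bits, parity, and data bits. A sketch of building it (Python here purely to illustrate the byte layout; the actual `parameterMessage` contents are assumed, not taken from the source):

```python
import struct

# CDC ACM line coding: dwDTERate (LE uint32), bCharFormat (0 = 1 stop bit),
# bParityType (0 = none), bDataBits (8) -> 7 bytes total, which matches
# the control-transfer response of 7 reported above.
line_coding = struct.pack("<IBBB", 115200, 0, 0, 8)

print(line_coding.hex())  # 00c20100000008
print(len(line_coding))   # 7
```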
I checked all endpoints and parameters after connection and they all seem to be correct: both the read and write endpoints are bulk endpoints with the correct data direction.
Writing to my device is not a problem. I can use either the bulkTransfer method or UsbRequest, and they both work perfectly.
Unfortunately everything falls apart when trying to read data. The initial exchange with the device is as follows:
Host sends a command and expects 4 bytes of data.
Host sends another command and expects 160 bytes of data.
The bulk transfer method looks like this:
public static byte[] ReadArrayInBulk(int length)
{
    byte[] result = new byte[length];
    int transferred = deviceConnection.BulkTransfer(readEndpoint, result, length, 500);
    if (transferred != length)
    {
        // retry once (note: this writes from offset 0 of the same buffer again)
        transferred = deviceConnection.BulkTransfer(readEndpoint, result, length, 500);
    }
    return result;
}
The first transfer almost always (99.9%) returns 0, and the second returns the proper data, up to a point. For the first exchange it does receive the 4 bytes correctly. However, for the second exchange, which expects 160 bytes, the second transfer returns 80, 77, 22 and other random lengths of data. I tried making a loop that performs bulk transfers until the number of bytes transferred equals the expected amount, but unfortunately, after receiving a random length of data for the first time, the bulk transfer starts always returning 0.
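One general way to handle partial transfers like these is to accumulate each chunk at the correct offset until the expected count arrives, rather than retrying into the start of the same buffer. A language-agnostic sketch in Python (`read_chunk` is a hypothetical stand-in for whatever transfer primitive is available, e.g. a bulk transfer; this is an illustration of the pattern, not a fix for the underlying USB issue):

```python
def read_exact(read_chunk, length):
    """Accumulate partial reads until exactly `length` bytes arrive.

    `read_chunk(n)` may return fewer than n bytes; each chunk is
    appended after the previous one instead of overwriting it.
    """
    buf = bytearray()
    while len(buf) < length:
        chunk = read_chunk(length - len(buf))
        if not chunk:
            raise TimeoutError("no data received before deadline")
        buf.extend(chunk)
    return bytes(buf)
```

Used with a source that delivers 2, 1, then 4 bytes, `read_exact(source, 7)` still returns the full 7-byte payload in order.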
So I tried the UsbRequest way to get the data:
public static byte[] GetArrayRequest(int length)
{
    byte[] result = new byte[length];
    UsbRequest req = new UsbRequest();
    req.Initialize(deviceConnection, readEndpoint);
    ByteBuffer buff = ByteBuffer.Wrap(result);
    req.Queue(buff, length);
    var resulting = deviceConnection.RequestWait();
    if (resulting != null) buff.Get(result);
    req.Close();
    return result;
}
The wrapped buffer does get filled with the expected amount of data (the position equals the length); unfortunately the data is all zeroes. It does work for a couple of transfers once in a while, but 99% of the time it just returns zeroes.
So, given that the desktop version also has trouble reading data in bulk using SerialPort.Read() but works perfectly when reading bytes one by one in a loop (SerialPort.ReadByte(), which underneath uses SerialPort.Read() for a single byte), I decided to try doing that in my app as well.
Unfortunately, when looping with the UsbRequest method, not only does it take ages, but it also always returns zeroes. When looping with the BulkTransfer method on a single-byte array, it returns 0 on the first transfer and -1 on the second, indicating a timeout.
I have spent so much time on this that I am at my wits' end. I checked multiple USB-serial-for-Android implementations (mik3y, kai-morich, felHR85), but all they do differently from me are the two control transfers, which I have now added, to no avail.
Can anyone tell me what could cause bulkTransfer to always return 0 on the first transfer (which is supposed to indicate success, but what does it actually do when no data is read?) and only return some data, or time out (in the case of a single-byte transfer), on the second? If it did this reliably I would just get used to it, but it only seems to work correctly for smaller transfers, and only up to a point: while the initial couple (or couple dozen) bytes are correct, it then stops receiving any more at random intervals. And why does the UsbRequest method fill the buffers, but with only zeroes 99% of the time?
I have inherited a C++ app (of which I'm no expert; I'm a .NET guy) which sends messages to an Azure queue in JSON form. This works fine; it's when I try to pick the message off the queue in my .NET console app that I get the following message:
"The input is not a valid Base-64 string as it contains a non-base 64 character, more than two padding characters, or a non-white space character among the padding characters."
The C++ code looks like this (note the commented-out dummy message, which gives an example of what it looks like):
void send(utility::string_t msg) {
    // Define the connection string with your values.
    const utility::string_t storage_connection_string(U("DefaultEndpointsProtocol=https;AccountName=bogus;AccountKey=YcG8FP9HdaB+r5jDTruTzZy8dXku+fLr4hvPcq+C6Uzhh7UOB6C7MemYluQMz28JlzwZIcn6Vw=="));
    // Retrieve the storage account from the connection string.
    azure::storage::cloud_storage_account storage_account = azure::storage::cloud_storage_account::parse(storage_connection_string);
    // Create a queue client.
    azure::storage::cloud_queue_client queue_client = storage_account.create_cloud_queue_client();
    // Retrieve a reference to a queue.
    azure::storage::cloud_queue queue = queue_client.get_queue_reference(U("beam-queue"));
    // Create the queue if it doesn't already exist.
    queue.create_if_not_exists();
    // Create a message and add it to the queue.
    // Dummy message:
    //azure::storage::cloud_queue_message message(U("[{\"url\":\"https://www.google.com.au\",\"app\":null,\"email\":\"jv#jv.ie\",\"dbId\":\"323e3098-cc87-4b37-8eb5-85a6d6ddba1c\",\"seconds\":147.0490574,\"date\":\"2016-11-17T00:00:00+11:00\"}]"));
    azure::storage::cloud_queue_message message(msg);
    queue.add_message(message);
    lastsendtime = GetTickCount();
}
I can even see the message in the storage explorer:
But it seems to be in the wrong format, as when I pick the message off the queue:
CloudStorageAccount storageAccount = CloudStorageAccount.Parse(GetConnectionString());
CloudQueueClient queueClient = storageAccount.CreateCloudQueueClient();
CloudQueueClient clnt = storageAccount.CreateCloudQueueClient();
CloudQueue queue = clnt.GetQueueReference("bogus");
queue.EncodeMessage = true;
List<Service> userServices = null;
CloudQueueMessage retrievedMessage = queue.GetMessage();
List<CloudAppItem> items = JsonConvert.DeserializeObject<List<CloudAppItem>>(queue.GetMessage().AsString);
It fails on the last line. And it's not because of the serializer; queue.GetMessage().AsString returns the error.
UPDATE (still not working)
I took out the EncodeMessage statement, which I had previously added to try to make it work. It still doesn't work. I also show the raw string, which is not accessible because it is private on the class:
I took out the EncodeMessage statement, which I had previously added to try to make it work. It still doesn't work.
In your screenshot, we can see that you only removed queue.EncodeMessage = true;, but the default value of the CloudQueue.EncodeMessage property is true. Please explicitly set CloudQueue.EncodeMessage to false:
CloudQueueClient clnt = storageAccount.CreateCloudQueueClient();
CloudQueue queue = clnt.GetQueueReference("bogus");
queue.EncodeMessage = false; //explicitly set CloudQueue.EncodeMessage to false
The reason you're getting this error is that you're instructing the SDK to decode the message from a base64-encoded string, but the message content is not base64 encoded (you're saving the message as plain text).
Please change the following line of code:
queue.EncodeMessage = true;
to
queue.EncodeMessage = false;
And that should take care of the problem. From the CloudQueue.EncodeMessage documentation:
Gets or sets a value indicating whether to apply base64 encoding when
adding or retrieving messages.
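The mismatch can be sketched outside the SDK. A Python illustration with a made-up payload (not the actual queue message): a plain-text JSON body contains characters outside the base64 alphabet, so any attempt to base64-decode it fails, which is the same class of error the .NET SDK raises when EncodeMessage is true.

```python
import base64

# Illustrative plain-text JSON body, as the C++ client stores it:
payload = '[{"url":"https://www.google.com.au","seconds":147.0490574}]'

# EncodeMessage = true makes the SDK base64-decode the body on read;
# characters like '[', '{' and '"' are not valid base64, so it fails:
try:
    base64.b64decode(payload, validate=True)
except Exception as exc:
    print("decode failed:", exc)

# EncodeMessage = false (or a putter that base64-encodes) round-trips fine:
encoded = base64.b64encode(payload.encode()).decode()
print(base64.b64decode(encoded).decode() == payload)  # True
```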
A Java client constructs a message according to this skeleton:
package tutorial;
option java_package = "com.example.sff.piki2";
option java_outer_classname = "MsgProtos";
message MSG {
  required string guid = 1;
  required int32 MsgCode = 2;
  required int32 From = 3; // sender
  ...
This message is sent to a C# program (server side).
The server knows how to read the bytes (the first byte is the number of bytes to read, which represents the size of the following message).
This is how MSG is constructed from a byte array:
MSG Msg = MSG.CreateBuilder().MergeFrom(buffer).Build();
Where buffer is the byte array which read from the socket.
But now I'm in a situation where a client needs to send a "Heartbeat" message (another message) in order to check whether the server is alive or not (the server should respond: "yes, I'm alive").
Sure, I can add another field to the MSG class, but I don't want to, because the MSG class has a lot of fields that are unnecessary for a heartbeat operation.
Question :
The server reads n bytes. Is there any way I can know whether this is a MSG message or a "Heartbeat" message?
Is there any way I can know whether this is a MSG message or a "Heartbeat" message?
No. Protocol buffer messages don't contain any such type information. Typically the way round this is to have a "wrapper" type with a field for each message you might want to send. You'd ideally want to express this as a oneof, but my port doesn't support that (yet).
The overhead is minimal - but it will be a protocol change, of course, so you'll need to think about any existing servers etc.
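The wrapper type described above could look something like this in proto2 syntax (to match the skeleton in the question; the names `Envelope` and `Heartbeat` are illustrative, not from the source). The server then checks which field is present to decide what it received:

```proto
// Hypothetical wrapper - exactly one of the fields is set per message.
message Envelope {
  optional MSG msg = 1;
  optional Heartbeat heartbeat = 2;
}

message Heartbeat {
  // Intentionally empty: its presence alone is the heartbeat.
}
```

On the C# side, after parsing an Envelope from the length-prefixed buffer, the server would test which of the two fields is populated and reply to the heartbeat accordingly.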
I am relatively new to programming, so please excuse me if the question is stupid. I am working on a project which involves Kinect. I am using C# to extract live joint information (such as position and orientation) and then sending the data to Processing via OSC messages over UDP. I sent an OSC message bundle from C#; the problem is I don't know how to dispatch the message to what I want in Processing. Or possibly I sent the data in the wrong format in C#. I would very much appreciate it if someone could tell me what might have gone wrong in the code and caused the errors.
I send the joint position from C# using this code:
if (joint0.JointType == JointType.ElbowRight)
{
    // distance in metres
    String temp = "ElbowRight " + joint0.Position.X * 1000 + " " + joint0.Position.Y * 1000 + " " + joint0.Position.Z * 1000;
    Console.WriteLine(temp);
    OscElement message = new OscElement("/joint/" + joint0.JointType.ToString(), joint0.Position.X * 1000, joint0.Position.Y * 1000, joint0.Position.Z, joint0.TrackingState.ToString());
    bundle.AddElement(message);
}
OscSender.Send(bundle); // send message bundle
The part "/joint/" is the address pattern of the message. The following data are the arguments of the message. According to http://opensoundcontrol.org/spec-1_0, an OSC type tag string should follow the address pattern: an OSC-string beginning with the character ',' (comma), followed by a sequence of characters corresponding exactly to the sequence of OSC arguments in the given message. However, when I tried this, a format exception was thrown and the error was reported as: Invalid character (\44). What I did was simply add ",s" to the OSC message:
OscElement message = new OscElement("/joint/" + ",s" + joint0.JointType.ToString(), joint0.Position.X * 1000, joint0.Position.Y * 1000, joint0.Position.Z, joint0.TrackingState.ToString());
How am I supposed to add the type tag? Could this be the reason for the following errors?
In my Processing code, I tried to get the joint position values using this code:
if (theOscMessage.checkAddrPattern("/joint") == true) {
  String firstValue = theOscMessage.get(0).stringValue(); // get the first osc argument
  float xosc = theOscMessage.get(1).floatValue();  // get the second osc argument
  float yosc = theOscMessage.get(2).floatValue();  // get the third osc argument
  float zosc = theOscMessage.get(3).floatValue();  // get the fourth osc argument
  String thirdValue = theOscMessage.get(4).stringValue(); // get the fifth osc argument
  println("### values: " + xosc + ", " + yosc + ", " + zosc);
  return;
}
However I got this error:
[2013/6/16 20:20:53] ERROR # UdpServer.run() ArrayIndexOutOfBoundsException: java.lang.ArrayIndexOutOfBoundsException
Then I tried receiving the message using an example given in Processing, which displays the address pattern and type tag of the message:
println("addrpattern\t"+theOscMessage.addrPattern());
println("typetag\t"+theOscMessage.typetag());
It printed out this:
addrpattern pundle
typetag u???N?N?$xlt???
I don't understand what's wrong with the code. Shouldn't the address pattern be "joint"? Or at least "bundle"? What is "pundle"...
P.S. I am using Visual C# 2010 Express and Processing 2.0b9 64-bit on a Win7 computer.
Thank you so much for the help!
Update:
Although I still can't figure out how to solve this problem, I found a way to receive messages in Processing. Instead of using an OSC bundle, I am sending OSC messages with different address patterns, then using a message plug (e.g. oscP5.plug(this, "leftFoot", "/joint/AnkleLeft");) in the draw method. Then I create a method called leftFoot:
public void leftFoot(float fx, float fy, float fz, String state) {
  println("Received: " + fx + ", " + fy + ", " + fz + ", " + state);
}
Then you can see the data being printed out. P.S. In C#, the OSC message was sent using:
OscElement message = new OscElement("/joint" + "/" + joint0.JointType.ToString(), joint0.Position.X * 1000, joint0.Position.Y * 1000, joint0.Position.Z, joint0.TrackingState.ToString());
OscSender.Send(message);
Hmmm... not sure exactly, but you could just use OSCeleton. It comes with a Processing example - I've used it before, and it works fine.
(the example may also help you understand how to use the OSC address patterns correctly... )
https://github.com/Sensebloom/OSCeleton
https://github.com/Sensebloom/OSCeleton-examples
OSC requires padding to the next 32-bit boundary, meaning that you need to add zero bytes to pad the message until its length is an even multiple of four bytes.
You need to do the same for the type tag. Pad it with zero bytes even if you only send one type tag: ,s followed by two zero bytes instead of just ,s.
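The padding rule can be sketched in a few lines (Python here just to illustrate the byte layout; an OSC string is NUL-terminated first, then padded, so even a two-character type tag like ",s" grows to four bytes):

```python
def osc_pad(data: bytes) -> bytes:
    """Pad with zero bytes to the next 32-bit (4-byte) boundary."""
    return data + b"\x00" * (-len(data) % 4)

# ",s" + NUL terminator = 3 bytes, padded to 4: the ",s00" described above.
typetag = osc_pad(b",s" + b"\x00")
print(typetag)       # b',s\x00\x00'

# Address patterns are padded the same way:
address = osc_pad(b"/joint/ElbowRight" + b"\x00")
print(len(address))  # 20 - a multiple of four
```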
I am sending 3 messages:
Message1 - correlation id: 5000
empty message (no body) - correlation id: 5001
Message2 - correlation id: 5002
My outbound queue processes them like this:
Message1 - correlation id: 5000
Message1 - correlation id: 5001 => the previous message's body appears in place of the empty message
Message2 - correlation id: 5002
The second line above should not contain Message1; it should just be empty. Any thoughts?
My get method
mqGetMsgOpts = new MQGetMessageOptions();
if (mqQueue != null)
{
    // Get options for the message
    mqGetMsgOpts.Options = MQC.MQGMO_BROWSE_FIRST | MQC.MQGMO_WAIT | MQC.MQOO_INQUIRE;
    mqGetMsgOpts.MatchOptions = MQC.MQMO_NONE;
    mqGetMsgOpts.WaitInterval = 5000; // 5-second limit for waiting
}
if (mqMsg.MessageLength > 0 && mqMsg.DataLength > 0)
{
    messageData = mqMsg.ReadString(mqMsg.MessageLength);
}
If I don't do the length check, I get a stream-reader-related exception.
My put method
if (mqQueue == null)
    mqQueue = mqQMgr.AccessQueue("Queue Name", MQC.MQOO_OUTPUT | MQC.MQOO_INPUT_SHARED | MQC.MQOO_INQUIRE);
mqMsg.WriteString(message);
I have not heard of messages getting overwritten in WMQ; I suspect this must be an issue with the application. Look at this line of code:
mqGetMsgOpts.Options = MQC.MQGMO_BROWSE_FIRST | MQC.MQGMO_WAIT | MQC.MQOO_INQUIRE;
The MQC.MQGMO_BROWSE_FIRST option makes WMQ always return the first message that satisfies the conditions specified in the MQMD structure. I can't tell from your code snippet whether this option is changed at a later point to MQGMO_BROWSE_NEXT to read the next message in the queue.
Instead of MQC.MQGMO_BROWSE_FIRST you can specify the MQGMO_BROWSE_NEXT option to read messages continuously.
Also, you have specified MQC.MQOO_INQUIRE, which is not a valid GMO option; you need to remove it.
More details on browse options are here.
I can imagine two possible issues that could cause this.
Your putting application did not send an empty message body for message two.
Your getting application is showing you the message buffer from message one. If no message buffer is delivered by MQ, your previous message buffer contents remain.
To determine which has happened, put all the messages, but before running your sample to get them, first run something like the supplied sample amqsget to rule out possibility 1.
Then you can focus on the get buffer in your application. Make sure you are not using it when MQ says the length of the returned message is zero.
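Possibility 2 can be sketched generically (Python; `fake_get` is a hypothetical stand-in for the MQGET call, which copies delivered bytes into a caller-supplied buffer and returns the delivered length): reusing a receive buffer without honouring the returned length replays the previous message.

```python
# Illustrative only: a reused buffer keeps its old contents when a
# zero-length message is delivered - the stale-buffer effect above.
buf = bytearray(16)

def fake_get(data, buffer):
    """Stand-in for a get call: copies data in, returns bytes delivered."""
    buffer[:len(data)] = data
    return len(data)

n = fake_get(b"Message1", buf)
print(bytes(buf[:n]))   # b'Message1'

n = fake_get(b"", buf)  # empty message: zero bytes delivered
print(bytes(buf[:8]))   # still b'Message1' - stale contents!
print(bytes(buf[:n]))   # b'' - correct when the length is honoured
```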