Please bear with me on this confusing question; it has been as hard to describe as it has been to debug. Read on and you'll see why.
I've been chasing this issue for over a month now without much progress. I'm using an STM32 (an STM32F103C8 mounted on a BluePill board) to communicate with a C# app through an FT232R serial-to-USB converter. The complete communication protocol is fairly complex, so what follows is a simplified version of the code that still reproduces my problem accurately.
STM32 does the following.
In the initial setup,
Serial.begin at 2,000,000 baud (yes, it's very high, but I've analyzed it with an oscilloscope and the signal is very healthy; the impedance matching is good and clock jitter is minimal).
Waits for a command from the C# end to enter the loop
In the loop, it does the following.
TX a byte buffer of length N on the serial port. Packet structure is 0xAA, N bytes, 1 byte checksum.
repeat the loop
And on the C# side (Pseudo code),
new Thread(() => { while (true) { IOTick(); Thread.Sleep(30); } }).Start(); // poll every 30 ms
IOTick() is defined as:
{
while(SerialPortObject.BytesToRead > 1)
{
header = read();
if (header != 0xAA) continue;
byte[] buffer = new byte[N + 1];
receivedBytes = readBytes(buffer, N + 1, Timeout = 500ms); // receivedBytes is never less than N + 1 for a timeout greater than ~120 ms
use the N = 16 payload bytes; check the final byte against the computed checksum. Doesn't take much CPU time.
Send a packet received software event.
}
}
readBytes is defined as
int readBytes(byte[] buffer, int count, int timeout)
{
    // Track an absolute deadline so time spent in earlier reads is
    // charged against the total timeout exactly once.
    var deadline = DateTime.Now.AddMilliseconds(timeout);
    for (int i = 0; i < count; i++)
    {
        int remaining = (int)(deadline - DateTime.Now).TotalMilliseconds;
        var b_ = read(Math.Max(remaining, 0));
        if (b_ == -1)
            return i;
        buffer[i] = (byte)b_;
    }
    return count;
}
int buffer2ReadIndex = 0;
byte[] buffer2= new byte[0];
int read(int timeout)
{
DateTime start = DateTime.Now;
if (buffer2.Length == 0)
{
while (SerialPortObject.BytesToRead <= 0)
{
if ((DateTime.Now - start).TotalMilliseconds > timeout)
return -1;
System.Threading.Thread.Sleep(30);
}
buffer2 = new byte[SerialPortObject.BytesToRead];
int bytesRead = SerialPortObject.Read(buffer2, 0, buffer2.Length);
if (bytesRead < buffer2.Length)
    Array.Resize(ref buffer2, bytesRead); // Read may return fewer bytes than requested.
}
if (buffer2.Length > 0)
{
var b = buffer2[buffer2ReadIndex];
buffer2ReadIndex++;
if (buffer2ReadIndex >= buffer2.Length)
{
buffer2ReadIndex = 0;
buffer2 = new byte[0];
}
return b;
}
return -1;
}
Now, everything works as expected: the packet-received software event is triggered no later than every ~30 ms (the Windows tick time). The problem starts when I have to wait between packet transmissions on the STM side. At first I suspected that the I2C work I do between packet transmissions was causing some hardware or software conflict that corrupted the serial data. But then I noticed the same thing happens if I merely introduce a 1 millisecond delay with Arduino's delay() between packet transmissions.

At that rate, almost 1,000 packets should be received every second. Instead, roughly 1 out of 10 packets following a successful header read is either not delivered completely or arrives with a corrupted checksum, causing the C# app to lose track of the packet header. Re-acquiring the header obviously requires flushing some bytes, losing a few packets in the process. Even that wouldn't be too bad for an app that can afford 5% packet loss. Strangely, though, when this anomaly occurs, the packet-received software event stalls for more than 1 second after every couple hundred consecutive events.
I'm completely blind here. I even tried 115200 baud; it shows the same loss, with a slightly lower loss ratio. It should be noted that at 9600 baud the issue doesn't happen. That's the only hint I have right now.
It looks like I've found an answer.
After digging deep into SerialPort and its underlying base stream, and after some documentation reading and benchmarking, here is what I've observed:
SerialPort.BytesToRead is not updated uniformly, and the DataReceived event seems to follow it. When bytes are arriving at ~200 kHz (baud = 2 Mbps), it is updated almost instantaneously (or within 30 ms in the worst case). When they arrive at ~20 kHz or slower (evenly spaced in time by the microcontroller), SerialPort.BytesToRead can take up to 400 ms to update. This happens only after a dozen or so 30 ms updates.
Observing this, I conclude that SerialPort.BytesToRead is updated on one of two conditions: either some amount of time has passed since the data arrived (and that time is not bounded by 30 ms), or the data is coming in fast enough.
This is strange behavior. No data is lost while the anomaly is occurring. Unsurprisingly, though, about 0.06% of bytes are lost when working at full bandwidth (200 kB/s at 2 Mbps).
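For anyone who wants to reproduce the measurement, here is a rough sketch of the kind of benchmark I mean (the logging details are mine; SerialPortObject is the same port as above):
var sw = System.Diagnostics.Stopwatch.StartNew();
int lastCount = 0;
long lastChangeMs = 0;
while (sw.ElapsedMilliseconds < 10000)
{
    // Log how long BytesToRead stays unchanged while the
    // microcontroller is known to be sending at a steady, even pace.
    int count = SerialPortObject.BytesToRead;
    if (count != lastCount)
    {
        Console.WriteLine($"+{count - lastCount} bytes after {sw.ElapsedMilliseconds - lastChangeMs} ms");
        lastChangeMs = sw.ElapsedMilliseconds;
        lastCount = count;
    }
}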
I'm working on a client/server relationship that is meant to push data back and forth for an indeterminate amount of time.
The problem I'm attempting to overcome is on the client side, being that I cannot manage to find a way to detect a disconnect.
I've taken a couple of passes at other people's solutions, ranging from just catching IOExceptions to polling the socket on all three SelectModes. I've also tried combining a poll with a check on the socket's Available property.
// Something like this
Boolean IsConnected()
{
    try
    {
        bool part1 = this.Connection.Client.Poll(1000, SelectMode.SelectRead);
        bool part2 = (this.Connection.Client.Available == 0);
        if (part1 && part2)
        {
            // Never occurs
            // connection is closed
            return false;
        }
        return true;
    }
    catch (IOException e)
    {
        // Never occurs either
        return false; // all code paths must return a value
    }
}
On the server side, an attempt to write an 'empty' character ( \0 ) to the client forces an IO Exception and the server can detect that the client has disconnected ( pretty easy gig ).
On the client side, the same operation yields no exception.
// Something like this
Boolean IsConnected( )
{
try
{
this.WriteHandle.WriteLine("\0");
this.WriteHandle.Flush();
return true;
}
catch( IOException e )
{
// Never occurs
this.OnClosed("Yo socket sux");
return false;
}
}
A problem that I believe I'm having with detecting a disconnect via a poll is that I can fairly easily get false back from a SelectRead poll if my server hasn't yet written anything back to the client since the last check. I'm not sure what to do here; I've chased down every option for making this detection that I can find, and nothing has been 100% reliable for me. Ultimately my goal is to detect a server (or connection) failure, inform the client, wait to reconnect, and so on, so I'm sure you can imagine this is an integral piece.
Appreciate anyone's suggestions.
Thanks ahead of time.
EDIT: Anyone viewing this question should note the answer below, and my FINAL Comments on it. I've elaborated on how I overcame this problem, but have yet to make a 'Q&A' style post.
One option is to use TCP keep-alive packets. You turn them on with a call to Socket.IOControl(). The only annoying bit is that it takes a byte array as input, so you have to convert your values to an array of bytes to pass in. Here's an example using a 10000 ms keep-alive with a 1000 ms retry:
Socket socket; //Make a good socket before calling the rest of the code.
int size = sizeof(UInt32);
UInt32 on = 1;
UInt32 keepAliveInterval = 10000; //Idle time before the first keep-alive probe is sent.
UInt32 retryInterval = 1000; //If no response, probe again every second.
byte[] inArray = new byte[size * 3];
Array.Copy(BitConverter.GetBytes(on), 0, inArray, 0, size);
Array.Copy(BitConverter.GetBytes(keepAliveInterval), 0, inArray, size, size);
Array.Copy(BitConverter.GetBytes(retryInterval), 0, inArray, size * 2, size);
socket.IOControl(IOControlCode.KeepAliveValues, inArray, null);
Keep alive packets are sent only when you aren't sending other data, so every time you send data, the 10000ms timer is reset.
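How the failure surfaces depends on what you then do with the socket: when the keep-alive probes go unanswered, the connection is reset and the next blocking call fails. A minimal sketch, assuming the socket configured above:
byte[] buf = new byte[4096];
try
{
    int n = socket.Receive(buf); // blocks; keep-alive probing runs underneath
    if (n == 0)
    {
        // The peer shut the connection down gracefully.
    }
}
catch (SocketException)
{
    // Keep-alive detected a dead peer, or the connection was reset.
}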
I've been working with Windows Store app programming in C# recently, and I've come across a problem with sockets.
I need to be able to read data of unknown length from a DataReader.
It sounds simple enough, but I've not been able to find a solution after a few days of searching.
Here's my current receiving code (a little sloppy; I need to clean it up after I find a solution to this problem, and yes, a bit of it is from the Microsoft example):
DataReader reader = new DataReader(args.Socket.InputStream);
try
{
while (true)
{
// Read first 4 bytes (length of the subsequent string).
uint sizeFieldCount = await reader.LoadAsync(sizeof(uint));
if (sizeFieldCount != sizeof(uint))
{
// The underlying socket was closed before we were able to read the whole data.
return;
}
// reader.InputStreamOptions (stray incomplete statement, left commented out so the block compiles)
// Read the string.
uint stringLength = reader.ReadUInt32();
uint actualStringLength = await reader.LoadAsync(stringLength);
if (stringLength != actualStringLength)
{
// The underlying socket was closed before we were able to read the whole data.
return;
}
// Display the string on the screen. The event is invoked on a non-UI thread, so we need to marshal
// the text back to the UI thread.
//MessageBox.Show("Received data: " + reader.ReadString(actualStringLength));
MessageBox.updateList(reader.ReadString(actualStringLength));
}
}
catch (Exception exception)
{
// If this is an unknown status it means that the error is fatal and retry will likely fail.
if (SocketError.GetStatus(exception.HResult) == SocketErrorStatus.Unknown)
{
throw;
}
MessageBox.Show("Read stream failed with error: " + exception.Message);
}
You are going along the right lines: read the first int to find out how many bytes are to be sent.
Franky Boyle is correct: without a signalling mechanism it is impossible to ever know the length of a stream. That's why it is called a stream!
No socket implementation (including WinSock) will ever be clever enough to know when a client has finished sending data. The client could be having a cup of tea halfway through sending it!
Your server and its sockets will never know. What are they going to do, wait forever? I suppose they could wait until the client has 'closed' the connection, but your client could have blue-screened, in which case the server will never get that TCP close packet; it will just sit there thinking more data might arrive one day.
I have never used a DataReader; I had never even heard of that class! Use NetworkStream instead.
From memory, I have written code like this in the past. I am just typing, without checking the syntax.
using (MemoryStream receivedData = new MemoryStream())
{
    using (NetworkStream networkStream = new NetworkStream(connectedSocket))
    {
        // This is your mechanism to find out how many bytes
        // the client wants to send: a one-byte length prefix.
        int totalBytesToRead = networkStream.ReadByte();
        byte[] readBuffer = new byte[1024]; // Up to you the length!
        int totalBytesRead = 0;
        int bytesReadInThisTcpWindow = 0;
        // The length of the TCP window of the client is usually
        // the number of bytes that will be pushed through
        // to your server in one Socket.Read method call.
        // For example, if the client's TCP window were 777 bytes, then
        //   int bytesRead = networkStream.Read(readBuffer, 0, readBuffer.Length);
        // would set bytesRead to 777.
        // If they were sending a large file, you would have to make
        // it up from the many 777s.
        // If it were a small file under 777 bytes, your bytesRead
        // would be the total small length of, say, 500.
        while ((bytesReadInThisTcpWindow = networkStream.Read(readBuffer, 0, readBuffer.Length)) > 0)
        {
            // If bytesReadInThisTcpWindow is 0 then the client has
            // disconnected or failed to send the promised number of
            // bytes within the timeout dictated by the Windows server
            // internals (important to stop here so lots of waiting
            // threads don't kill your server).
            receivedData.Write(readBuffer, 0, bytesReadInThisTcpWindow);
            totalBytesRead = totalBytesRead + bytesReadInThisTcpWindow;
        }
        if (totalBytesRead == totalBytesToRead)
        {
            // We have our data!
        }
    }
}
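For completeness, a hypothetical sender-side counterpart might look like the sketch below: write the one-byte length prefix first, then the payload, so the reading loop above knows how many bytes to expect (SendWithLengthPrefix is my name, not an existing API):
void SendWithLengthPrefix(Socket connectedSocket, byte[] payload)
{
    if (payload.Length > 255)
        throw new ArgumentException("A one-byte prefix limits the payload to 255 bytes.");
    using (NetworkStream networkStream = new NetworkStream(connectedSocket))
    {
        networkStream.WriteByte((byte)payload.Length); // the length prefix
        networkStream.Write(payload, 0, payload.Length);
        networkStream.Flush();
    }
}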
I have an RFID card reader connected to my PC on a serial port. It uses RS485, so I need to switch between send and receive states. The communication frames contain a header and a CRC (CRC16-CCITT, XMODEM variant).
After every write on the port I wait for the answer, then compute the CRC; if it fails, I request the frame again. If everything is correct, I process it.
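For context, CRC16-CCITT (XMODEM) uses polynomial 0x1021 with initial value 0x0000; my ComputeChecksumBytes does essentially the following (the returned byte order is an assumption and must match the frame's):
byte[] ComputeChecksumBytes(byte[] data, int length)
{
    ushort crc = 0x0000; // XMODEM initial value
    for (int i = 0; i < length; i++)
    {
        crc ^= (ushort)(data[i] << 8);
        for (int bit = 0; bit < 8; bit++)
            crc = (crc & 0x8000) != 0
                ? (ushort)((crc << 1) ^ 0x1021) // XMODEM polynomial
                : (ushort)(crc << 1);
    }
    return new byte[] { (byte)(crc & 0xFF), (byte)(crc >> 8) }; // low byte, high byte
}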
It works fine with the "simple" commands (request firmware version, enable/disable antenna, etc.).
With the important commands (logging into the reader's interface, configuring it, etc.) I see the following: rarely the answer comes correctly, with a maximum delay of 5 seconds, but in most cases I don't get anything in the buffer. I can wait for minutes, but nothing arrives.
Conclusion: if I get an answer it happens within the first few seconds; if I don't, I can wait forever and it won't come.
My question is: could it be the hardware's fault, or am I missing something in my software?
Here is the send & receive part of my code:
int size;
bool msg_ok = false;
do
{
int max_attempts = 50;
port.DtrEnable = true;
port.RtsEnable = false;
port.Write(fullMsg, 0, fullMsg.Length);
port.DtrEnable = false;
port.RtsEnable = true;
do
{
Thread.Sleep(200);
size = port.BytesToRead;
}while(size <= 3 && max_attempts-- > 0);
if(size > 3){
answer = new byte[size];
port.Read(answer, 0, size);
int end = answer.Length - 1; //Trim 0-s after end
while (answer[end] == 0)
--end;
int start = 0;
while (answer[start] == 0) //Trim 0-s before header
++start;
trimmed = new byte[(end - start) + 1];
Array.Copy(answer, start, trimmed, 0, (end - start) + 1);
checkSum = crc.ComputeChecksumBytes(trimmed, trimmed.Length); //Calculate CRC
if (checkSum[0] == trimmed[trimmed.Length - 1] && checkSum[1] == trimmed[trimmed.Length - 2])
{
msg_ok = true; //If it's still false on the end, restart this whole block and request again, if it's true, I can send the answer for processing
}
} else {
Console.WriteLine("Timed out.");
}
}while(!msg_ok);
When data is sent over a serial port, the operating system buffers the data as it arrives. If you query the data when only some of it has arrived, you will get a partial packet. You need to keep reading until you receive the full packet before you start trying to decode it. Otherwise your decode will fail on the first half of the packet, fail on the second half, and then sit waiting for another message that will never come.
The best approach for using a serial port is to subscribe to the DataReceived event, because this means you are called by the port if and when data arrives. This avoids having to sleep to try to get around the timing issues. You will still sometimes need to stitch several chunks of received data together to form a valid packet however, so you should write your code to keep reading and appending into a receive buffer until it recognises a valid, complete packet.
You also shouldn't need to flip the handshaking bits unless the device on the other end of the serial line is very unusual - just send your data and wait for the reply. By changing the low level states on the port manually you are likely to introduce transmission problems into the system.
Try starting with the example code on the DataReceived event page (above) and you should have more reliable results.
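A minimal sketch of that pattern, assuming a fixed-length frame with a known header byte (FRAME_LENGTH, HEADER, CrcIsValid and ProcessFrame are placeholders for your protocol, not existing APIs):
private readonly List<byte> rxBuffer = new List<byte>();

void port_DataReceived(object sender, SerialDataReceivedEventArgs e)
{
    SerialPort port = (SerialPort)sender;
    byte[] chunk = new byte[port.BytesToRead];
    int n = port.Read(chunk, 0, chunk.Length);
    for (int i = 0; i < n; i++) rxBuffer.Add(chunk[i]);

    // Try to peel complete frames off the front of the buffer.
    while (rxBuffer.Count >= FRAME_LENGTH)
    {
        if (rxBuffer[0] != HEADER) { rxBuffer.RemoveAt(0); continue; } // resync on header
        byte[] frame = rxBuffer.GetRange(0, FRAME_LENGTH).ToArray();
        if (CrcIsValid(frame))
        {
            rxBuffer.RemoveRange(0, FRAME_LENGTH);
            ProcessFrame(frame); // hand the complete packet to the application
        }
        else
        {
            rxBuffer.RemoveAt(0); // bad CRC: drop one byte and resync
        }
    }
}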
Hello, I am using the Read() method to read 10 characters, say "0123456789", from a serial port. The characters are sent by a PIC microcontroller.
Here is my code:
serialPort1.PortName = "com4";
serialPort1.BaudRate = 9600;
serialPort1.Open();
char[] result = new char[10];
serialPort1.Read(result, 0, result.Length);
string s = new string(result);
MessageBox.Show(s);
serialPort1.Close();
When I run the code, a message box shows up and displays only the first character. "0" alone is displayed in the message box.
Where have I gone wrong?
What you are doing wrong is not paying attention to the return value of Read(), which tells you how many bytes were actually read.
Serial ports are very slow devices, at a typical baudrate setting of 9600 it takes a millisecond to get one byte transferred. That's an enormous amount of time for a modern processor, it can easily execute several million instructions in a millisecond. The Read() method returns as soon as some bytes are available, you only get all 10 of them if you make your program artificially slow so the driver gets enough time to receive all of them.
A simple fix is to keep calling Read() until you got them all:
char[] result = new char[10];
for (int len = 0; len < result.Length; ) {
len += serialPort1.Read(result, len, result.Length - len);
}
Another common solution is to send a unique character to indicate the end of the data. A line feed ('\n') is a very good choice for that. Now it becomes much simpler:
string result = serialPort.ReadLine();
Which now also supports arbitrary response lengths. Just make sure that the data doesn't also contain a line feed.
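If you go the ReadLine() route, two properties are worth setting first (a small sketch; the '\n' terminator is an assumption about what the PIC sends):
serialPort1.NewLine = "\n";     // must match the terminator the PIC appends
serialPort1.ReadTimeout = 1000; // milliseconds; avoids blocking forever if the line never arrives
string result = serialPort1.ReadLine();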
I'm trying to stream sound samples from my microphone to my speakers using DirectSound and C#. It should be similar to 'listening to the microphone', but later I want to use this for something else. While testing my approach I've noticed quiet ticking, crackling noises in the background. I would guess this has something to do with the delay between writing and playing the buffer, which must be greater than the latency of writing the chunks.
If I set the delay between recording and playout to less than 50 ms, it mostly works, but sometimes I get really loud crackling noises. So I've settled on a delay of at least 50 ms. This works okay for me, but the delay of the system's "listen to this device" feature seems to be much shorter. I would guess it is about 15-30 ms and barely noticeable. At 50 ms I get at least a slight reverb effect.
In the following I'll show you my microphone code (partially):
The initialisation is done like this:
capture = new Capture(device);
// Creating the buffer
// Determining the buffer size
bufferSize = format.AverageBytesPerSecond * bufferLength / 1000;
while (bufferSize % format.BlockAlign != 0) bufferSize += 1;
chunkSize = Math.Max(bufferSize, 256);
bufferSize = chunkSize * BUFFER_CHUNKS;
this.bufferLength = chunkSize * 1000 / format.AverageBytesPerSecond; // Redetermining the buffer Length that will be used.
captureBufferDescription = new CaptureBufferDescription();
captureBufferDescription.BufferBytes = bufferSize;
captureBufferDescription.Format = format;
captureBuffer = new CaptureBuffer(captureBufferDescription, capture);
// Creating Buffer control
bufferARE = new AutoResetEvent(false);
// Adding notifier to buffer.
bufferNotify = new Notify(captureBuffer);
BufferPositionNotify[] bpns = new BufferPositionNotify[BUFFER_CHUNKS];
for (int i = 0; i < BUFFER_CHUNKS; i++)
{
    bpns[i] = new BufferPositionNotify()
    {
        Offset = chunkSize * (i + 1) - 1,
        EventNotifyHandle = bufferARE.SafeWaitHandle.DangerousGetHandle()
    };
}
bufferNotify.SetNotificationPositions(bpns);
Capturing runs like this in a separate thread:
// Initializing
MemoryStream tempBuffer = new MemoryStream();
// Capturing
while (isCapturing && captureBuffer.Capturing)
{
bufferARE.WaitOne();
if (isCapturing && captureBuffer.Capturing)
{
captureBuffer.Read(currentBufferPart * chunkSize, tempBuffer, chunkSize, LockFlag.None);
ReportChunk(applyVolume(tempBuffer.GetBuffer()));
currentBufferPart = (currentBufferPart + 1) % BUFFER_CHUNKS;
tempBuffer.Dispose();
tempBuffer = new MemoryStream(); // Reset Buffer;
}
}
// Finalizing
isCapturing = false;
tempBuffer.Dispose();
captureBuffer.Stop();
if (bufferARE.WaitOne(bufferLength + 1)) currentBufferPart = (currentBufferPart + 1) % BUFFER_CHUNKS; // So that on the next start the correct buffer part will be read.
stateControlARE.Set();
While capturing, ReportChunk passes the data to the speaker side via an event that can be subscribed to. The speaker part is initialized like this:
// Creating the dxdevice.
dxdevice = new Device(device);
dxdevice.SetCooperativeLevel(hWnd, CooperativeLevel.Normal);
// Creating the buffer
bufferDescription = new BufferDescription();
bufferDescription.BufferBytes = bufferSize;
bufferDescription.Format = input.Format;
bufferDescription.ControlVolume = true;
bufferDescription.GlobalFocus = true; // So the sound doesn't stop if the hWnd loses focus.
bufferDescription.StickyFocus = true; // - " -
buffer = new SecondaryBuffer(bufferDescription, dxdevice);
chunkQueue = new Queue<byte[]>();
// Creating buffer control
bufferARE = new AutoResetEvent(false);
// Register at input device
input.ChunkCaptured += new AInput.ReportBuffer(input_ChunkCaptured);
The event handler simply puts the data into the queue:
chunkQueue.Enqueue(buffer);
bufferARE.Set();
Filling the playback buffer and starting/stopping it is done by another thread:
// Initializing
int wp = 0;
bufferARE.WaitOne(); // wait for first chunk
// Playing / writing data to play buffer.
while (isPlaying)
{
Thread.Sleep(1);
bufferARE.WaitOne(BufferLength * 3); // If a chunk has been played and no new chunk arrives, continue anyway and possibly stop playing; otherwise the buffer might run dry.
// Note that this may fail if the sender was interrupted within one chunk.
if (isPlaying)
{
if (chunkQueue.Count > 0)
{
while (chunkQueue.Count > 0) wp = writeToBuffer(chunkQueue.Dequeue(), wp);
if (buffer.PlayPosition > wp - chunkSize * 3 / 2) buffer.SetCurrentPosition(((wp - chunkSize * 2 + bufferSize) % bufferSize));
if (!buffer.Status.Playing)
{
buffer.SetCurrentPosition(((wp - chunkSize * 2 + bufferSize) % bufferSize)); // We have 2 chunks buffered so we step back 2 chunks and play them while getting new chunks.
buffer.Play(0, BufferPlayFlags.Looping);
}
}
else
{
buffer.Stop();
bufferARE.WaitOne(); // wait for a filling chunk
}
}
}
// Finalizing
isPlaying = false;
buffer.Stop();
stateControlARE.Set();
writeToBuffer simply writes the enqueued chunk to the buffer via this.buffer.Write(wp, data, LockFlag.None), taking care of bufferSize, chunkSize and wp, which holds the last write position. I think this is everything that matters about my code. The definitions may be missing, and there is one more method that starts/stops (controls) the threads.
I've posted this code in case I've made a mistake in filling the buffer or my initialisation is wrong, but I would guess the problem occurs because the execution of the C# code is too slow or something like that. In the end my question is still open: how can I reduce the latency, and how can I avoid the noises that shouldn't be there?
I know the reason for your problem and a way to solve it, but I can't implement it in C# and .NET, so I will explain it in the hope that you can find your own way.
Audio is recorded by your microphone at a specified sample rate (for example 44100 Hz) and then played on the sound card at the same rate (again 44100 Hz). The problem is that the crystal that clocks the input device (the microphone) is not the same as the crystal that clocks playback on the sound card.
Even though the difference is tiny, the two are never identical (no two crystals in the world are exactly the same), so after a while a gap will open up in your playback routines.
The solution is to resample the data to match the sample rate of the output, but I don't know how to do that in C# and .NET.
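As a starting point, a naive linear-interpolation resampler in C# could look like the sketch below (16-bit mono samples assumed; the ratio would come from measuring the drift between the two clocks, e.g. inputRate / outputRate):
static short[] Resample(short[] input, double ratio)
{
    int outLen = (int)(input.Length / ratio);
    short[] output = new short[outLen];
    for (int i = 0; i < outLen; i++)
    {
        double srcPos = i * ratio;           // fractional position in the input
        int i0 = (int)srcPos;
        int i1 = Math.Min(i0 + 1, input.Length - 1);
        double frac = srcPos - i0;
        // Linear interpolation between the two neighbouring samples.
        output[i] = (short)(input[i0] * (1 - frac) + input[i1] * frac);
    }
    return output;
}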
A long time ago I figured out that this problem was caused by the Thread.Sleep(1) in combination with high CPU usage. Because the Windows timer resolution is 15.6 ms by default, this sleep doesn't mean "sleep for 1 ms" but "sleep until the next clock interrupt is reached". (For more, read this paper.) Combined with high CPU usage, the delays may stack up to the length of a chunk or even more.
For example: if my chunk size is 40 ms, the sleep could take about 46.8 ms (3 × 15.6 ms), and this causes the ticking. One solution is to set the timer resolution down to 1 ms. That can be done this way:
[DllImport("winmm.dll", EntryPoint="timeBeginPeriod", SetLastError=true)]
private static extern uint timeBeginPeriod(uint uiPeriod);
[DllImport("winmm.dll", EntryPoint="timeEndPeriod", SetLastError=true)]
private static extern uint timeEndPeriod(uint uiPeriod);
void routine()
{
    Thread.Sleep(1);    // May take about 15.6 ms or even longer.
    timeBeginPeriod(1); // Should be called once at application startup.
    Thread.Sleep(1);    // Now takes about 1, 2 or 3 ms depending on CPU usage.
    // ... time-dependent routines go here ...
    timeEndPeriod(1);   // Should be called once at application shutdown.
}
As far as I know, DirectX should already do this. But because it is a global setting, other parts of the application, or other applications, may change it. That shouldn't happen if every application sets and revokes the setting exactly once, but somehow it seems to happen anyway, caused by some sloppily programmed component or another running application.
One more thing to watch is whether you are still using the correct position in the DirectX buffer when you skip a chunk for any reason; in that case a resynchronization is required.
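A sketch of such a resynchronization, reusing the names from the question's code (wp, chunkSize, bufferSize and the two-chunk back-off are taken from there; treat this as an illustration, not tested code):
void Resync(SecondaryBuffer buffer, int wp)
{
    // Step the play cursor back two chunks behind the write position,
    // wrapping around the ring buffer, then resume looping playback.
    int playFrom = (wp - chunkSize * 2 + bufferSize) % bufferSize;
    buffer.SetCurrentPosition(playFrom);
    if (!buffer.Status.Playing)
        buffer.Play(0, BufferPlayFlags.Looping);
}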