I wrote this code in C#:
public class SerialClass
{
    SerialPort s;
    readonly object obj = new object();

    public SerialClass()
    {
        InitSerialPort();
        s.DataReceived += DataReceiver;
    }

    private void DataReceiver(object sender, SerialDataReceivedEventArgs e)
    {
        lock (obj)
        {
            while (s.BytesToRead > 0)
            {
                var line = s.ReadLine();
                if (line == "hello")
                {
                    Thread.Sleep(500);
                    s.WriteLine("hello to you friend");
                }
                else
                {
                    // ......
                }
            }
        }
    }
}
When I receive "hello" from the serial port, I want to answer "hello to you friend" after 500 milliseconds.
I've heard many times: don't use Sleep in your code.
What is the disadvantage of using Sleep here? If more data arrives on the serial port, a new DataReceived event will fire and DataReceiver will run again, because it is raised on a secondary thread.
So what is the disadvantage, and what is the better/best way to implement this without Sleep?
I use lock because I want only one thread to be doing this reading at a time.
If you've done it right, you shouldn't need the lock.
IMHO, you should avoid the DataReceived event altogether. Wrap SerialPort.BaseStream in a StreamReader, then loop in an async method to read. Regardless, I also would not put the delay, asynchronous or otherwise, in sequence with your reading. You should always be ready to read.
You didn't provide real code, so it's impossible to offer a real code solution, but here's how I'd have written the bit of code you posted:
public class Serial
{
    SerialPort s;

    public Serial()
    {
        InitSerialPort();
        // Ignore returned task...constructors shouldn't wait. You could store
        // the task in a class field, to provide a mechanism to observe the
        // receiving state.
        Task task = ReceiveLoopAsync();
    }

    private async Task ReceiveLoopAsync()
    {
        using (StreamWriter writer = new StreamWriter(s.BaseStream) { AutoFlush = true })
        using (StreamReader reader = new StreamReader(s.BaseStream))
        {
            string line;
            while ((line = await reader.ReadLineAsync()) != null)
            {
                if (line == "hello")
                {
                    // Ignore returned task...we don't really care when it finishes
                    Task task = RespondAsync(writer);
                }
            }
        }
    }

    private async Task RespondAsync(StreamWriter writer)
    {
        await Task.Delay(500);
        await writer.WriteLineAsync("hello to you friend");
    }
}
I've left out niceties like exception handling and more robust handling of the tasks. But the above is the basic idea. Note that all receiving is done in a single loop, with no need for cross-thread synchronization.
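For example, one way to keep an eye on the receive loop (just a sketch of the idea, not the only option) is to store the task in a field and attach a continuation that logs faults:

public Serial()
{
    InitSerialPort();
    _receiveTask = ReceiveLoopAsync();
    // Observe exceptions so a faulted receive loop doesn't go unnoticed.
    _receiveTask.ContinueWith(
        t => Console.Error.WriteLine(t.Exception),
        TaskContinuationOptions.OnlyOnFaulted);
}

private Task _receiveTask;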
I'm trying to write integration tests using xunit.
My test application communicates with the application under test through MQTT. For one of my tests I want to send a request, collect all messages received over a fixed amount of time, and then analyse the result to determine whether the outcome is what I expect.
I could successfully do it; however, not being very familiar with async programming in C#, I have some doubts about the way I did it.
Can someone tell me whether the core idea is right and how it can be improved, especially regarding some of the comments in the code?
First my test code:
[Fact]
public void HandlingMultipleSimultaneousRequests()
{
    _output.WriteLine("Starting test <HandlingMultipleSimultaneousRequests>");
    bool success = false;
    _com.SendRequestEvent(new RequestScanEvent());
    Task<IList<IScanEvent>> waitTask = WaitAndReturnAll(5); // get all results received over the next 5 sec. TODO: place before the send request, but not running asynchronously (problem for another time)
    IList<IScanEvent> results = waitTask.GetAwaiter().GetResult(); // wait synchronously on the results
    //TODO: test the validity of what was received
    Assert.True(results.Count > 0); //TODO: dummy test to replace
}
Then the method I use to wait for the results:
private async Task<IList<IScanEvent>> WaitAndReturnAll(int waitTimeSeconds)
{
    IList<IScanEvent> scanEvents = new List<IScanEvent>();
    try
    {
        _com.ReceivedScanResult += ReceivedScanEvent;
        _genericScanEventReceived = new TaskCompletionSource<IScanEvent>();
        DateTime startTime = DateTime.Now;
        while (DateTime.Now < startTime.AddSeconds(waitTimeSeconds))
        {
            if (_genericScanEventReceived.Task.Wait(5)) // small error in the timespan (ms), not really a problem
            {
                //TODO: should I put a lock here for the receive event...
                scanEvents.Add(_genericScanEventReceived.Task.Result);
                _genericScanEventReceived = new TaskCompletionSource<IScanEvent>(); // new event to be ready for the next message...
                //TODO: remove the lock here...
            }
        }
        _com.ReceivedScanResult -= ReceivedScanEvent; // remove callback before cancellation
        _genericScanEventReceived.SetCanceled();
        _genericScanEventReceived = null;
    }
    catch (Exception e)
    {
        _logger.LogError(e, "Error during awaiting of request answer, result should still be valid");
        return null;
    }
    return scanEvents;
}
And finally the event handler for when I receive a message from the application under test:
private void ReceivedScanEvent(object? sender, IScanEvent res)
{
    //TODO: should queue events with a lock, will it work?
    if (_genericScanEventReceived == null)
    {
        _logger.LogWarning("Received scan event but no test is currently waiting");
        return;
    }
    if (_genericScanEventReceived.Task.IsCompleted)
    {
        //TODO: code smell... this case should not happen
        _logger.LogError("A result is already returned, cannot handle the event now...");
        return;
    }
    _genericScanEventReceived.SetResult(res);
}
As you can see, I use a TaskCompletionSource to collect the data. Is it the right tool?
Also, I have some concerns about the risk of losing messages with my code.
I have a rather tricky problem to solve. I have multiple (up to a hundred or more) tasks, each of which produces a piece of data, say a string. These tasks can be spawned at any moment, and there can be a huge number of them at one time and none at another. Each task must receive a bool indicating whether it completed correctly or not (that's important).
I want to implement some kind of buffer to aggregate the data from the tasks and flush it to an external service, returning the operation state (ok or fail). The buffer must also be flushed on a timeout (to prevent waiting too long for new tasks to generate data).
So far I have tried a shared list of items. Tasks can add items to the list, and another task checks a timer or the item count and flushes them. But with this approach I can't report the status of the flush operation back to each task, which is very bad for me.
I'll be grateful for any approach to solving my problem.
As I understand it, you need to save the result of each task to a database/service, but you don't want to do it immediately.
There can be more than one solution to your problem, but it's difficult to come up with the best one, so I'll describe how I would have done it ... quickly.
A container for the data you need to save/send:
public class TaskResultEventArgs : EventArgs
{
    public bool Result { get; set; }
}
A notifier which also runs the task for you. I assumed you can delay execution of tasks.
public class NotifyingTaskRunner
{
    public event EventHandler<TaskResultEventArgs> TaskCompleted;

    public void RunAndNotify(Task<bool> task)
    {
        task.ContinueWith(t =>
        {
            OnTaskCompleted(this, new TaskResultEventArgs { Result = t.Result });
        }, TaskContinuationOptions.OnlyOnRanToCompletion);
        task.Start();
    }

    protected virtual void OnTaskCompleted(object sender, TaskResultEventArgs e)
    {
        var h = TaskCompleted;
        if (h != null)
        {
            h.Invoke(sender, e);
        }
    }
}
A listener which can buffer and/or flush results (or you might want to delegate this to another class).
public class Listener
{
    private ConcurrentQueue<bool> _queue = new ConcurrentQueue<bool>();

    public Listener(NotifyingTaskRunner runner)
    {
        runner.TaskCompleted += Flush;
    }

    public void Flush(object sender, TaskResultEventArgs e)
    {
        // Enqueue the status to flush everything later (or flush it immediately)
        _queue.Enqueue(e.Result);
    }
}
And this is how you can use everything together.
var runner = new NotifyingTaskRunner();
var listener = new Listener(runner);
var t1 = new Task<bool>(() => { return true; });
var t2 = new Task<bool>(() => { return false; });
runner.RunAndNotify(t1);
runner.RunAndNotify(t2);
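The question also asks for a flush on timeout. One way to add that (a sketch only; SendToService stands in for whatever external call you make) is to let the Listener drain its queue periodically with a timer:

public class Listener
{
    private readonly ConcurrentQueue<bool> _queue = new ConcurrentQueue<bool>();
    private readonly System.Threading.Timer _flushTimer;

    public Listener(NotifyingTaskRunner runner, TimeSpan flushInterval)
    {
        runner.TaskCompleted += (s, e) => _queue.Enqueue(e.Result);
        // Flush whatever has accumulated once per interval.
        _flushTimer = new System.Threading.Timer(_ => Flush(), null, flushInterval, flushInterval);
    }

    private void Flush()
    {
        var batch = new List<bool>();
        bool result;
        while (_queue.TryDequeue(out result))
        {
            batch.Add(result);
        }
        if (batch.Count > 0)
        {
            SendToService(batch); // placeholder for the external service call
        }
    }

    private void SendToService(IList<bool> batch) { /* ... */ }
}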
In .NET 4.0, do we have a way to wait for a response and then return it?
Currently I'm doing it like this, but it isn't really nice and I don't like it:
public partial class MainWindow : Window
{
    public MainWindow()
    {
        InitializeComponent();
        byte[] options = new byte[] { 1, 1, 0 };
        COMManager mgr = new COMManager("COM1");
        byte[] result = mgr.GetResponse(options);
    }
}
And my COMManager class (I have to do the operation in a separate class/DLL):
public class COMManager
{
    SerialPort sp = null;
    byte[] result = null;
    bool completed = false;

    public COMManager(string comport)
    {
        sp = new SerialPort(comport);
        sp.DataReceived += new SerialDataReceivedEventHandler(sp_DataReceived);
        sp.Open(); // the port has to be opened before it can be written to
    }

    public byte[] GetResponse(byte[] option)
    {
        sp.Write(option, 0, option.Length);
        // I don't like the way...
        while (!completed) { }
        completed = false;
        return result;
    }

    void sp_DataReceived(object sender, SerialDataReceivedEventArgs e)
    {
        result = new byte[sp.BytesToRead];
        sp.Read(result, 0, sp.BytesToRead);
        completed = true;
    }
}
In .NET 4.5 we would have the opportunity to use the await keyword, but for the current project we are only allowed to use .NET 4.0.
Any ideas?
There's no point in using the DataReceived event if you don't want to read asynchronously. Simply call the Read() method directly in GetResponse().
Beware that you cannot assume you will get a complete response, and you cannot ignore the return value of Read(). It usually returns only a couple of bytes; serial ports are pretty slow. So be sure to keep calling Read() until you have received the entire response.
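For example, if you know how long the response should be (expectedLength here is a hypothetical parameter), the loop could look roughly like this:

public byte[] GetResponse(byte[] option, int expectedLength)
{
    sp.Write(option, 0, option.Length);
    byte[] buffer = new byte[expectedLength];
    int offset = 0;
    while (offset < expectedLength)
    {
        // Read() returns as soon as at least one byte is available,
        // so keep looping until the whole response has arrived.
        offset += sp.Read(buffer, offset, expectedLength - offset);
    }
    return buffer;
}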
For your original question, to block the executing thread you can use a ManualResetEvent or AutoResetEvent which will get Set when your response has been obtained. There's a fairly good explanation on the page.
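Applied to your COMManager, that could look roughly like this sketch (keeping your existing fields; it still assumes the whole response arrives in one DataReceived call):

AutoResetEvent responseReceived = new AutoResetEvent(false);

public byte[] GetResponse(byte[] option)
{
    sp.Write(option, 0, option.Length);
    // Block until the DataReceived handler signals that the response is in.
    responseReceived.WaitOne();
    return result;
}

void sp_DataReceived(object sender, SerialDataReceivedEventArgs e)
{
    result = new byte[sp.BytesToRead];
    sp.Read(result, 0, sp.BytesToRead);
    responseReceived.Set();
}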
For threading, the rule of thumb is that if you're not extremely clear on what you're doing, don't do it.
Synchronous blocking when you have access to events seems like a waste. Considering that the data is a stream, this might end up being a hard to maintain abstraction.
There's a longer explanation of the above idea with an example over here.
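For illustration, the event-driven shape is roughly the following sketch; ResponseReceived and IsComplete are made-up names, and the completeness check depends entirely on your protocol:

public class EventDrivenComManager
{
    readonly SerialPort sp;
    readonly List<byte> buffer = new List<byte>();

    // Raised once a full response has been assembled from the stream.
    public event Action<byte[]> ResponseReceived;

    public EventDrivenComManager(string comport)
    {
        sp = new SerialPort(comport);
        sp.DataReceived += OnDataReceived;
        sp.Open();
    }

    public void SendRequest(byte[] option)
    {
        sp.Write(option, 0, option.Length);
    }

    void OnDataReceived(object sender, SerialDataReceivedEventArgs e)
    {
        byte[] chunk = new byte[sp.BytesToRead];
        sp.Read(chunk, 0, chunk.Length);
        buffer.AddRange(chunk);
        if (IsComplete(buffer) && ResponseReceived != null)
        {
            ResponseReceived(buffer.ToArray());
            buffer.Clear();
        }
    }

    bool IsComplete(List<byte> data)
    {
        // Protocol-specific: length prefix, terminator byte, etc.
        return true;
    }
}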
You can also do this asynchronously with a TaskCompletionSource. Instead of Set, you call SetResult, and you await the .Task, but the idea is pretty much the same.
The clean way would be to wait on an AutoResetEvent and have the receive callback signal it.
By creating a wrapper with this method, you can effectively await in every version of .NET.
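A sketch of such a TaskCompletionSource-based wrapper (the names are illustrative; on .NET 4.0 the caller can block on .Result or attach a ContinueWith instead of awaiting):

TaskCompletionSource<byte[]> responseSource;

public Task<byte[]> GetResponseAsync(byte[] option)
{
    responseSource = new TaskCompletionSource<byte[]>();
    sp.Write(option, 0, option.Length);
    return responseSource.Task; // completes when the response arrives
}

void sp_DataReceived(object sender, SerialDataReceivedEventArgs e)
{
    byte[] buffer = new byte[sp.BytesToRead];
    sp.Read(buffer, 0, buffer.Length);
    responseSource.TrySetResult(buffer); // completes the task returned above
}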
Let's say I have a list and I am streaming data from a named pipe into that list.
Hypothetical sample:
private void myStreamingThread()
{
    while (mypipe.isconnected)
    {
        if (mypipe.hasdata)
            myList.Add(mypipe.data);
    }
}
Then on another thread I need to read that list every 1000 ms, for example:
private void myListReadingThread()
{
    while (isStarted)
    {
        if (myList.Count > 0)
        {
            // do whatever I need to.
        }
        Thread.Sleep(1000);
    }
}
My priority here is to be able to read the list every 1000 ms and do whatever I need with it, but at the same time it is very important not to miss any new data that comes in from the pipe.
What is a good way to approach this?
I forgot to mention that I am tied to .NET 3.5.
I would recommend using a Queue with a lock.
Queue<string> myQueue = new Queue<string>();

private void myStreamingThread()
{
    while (mypipe.isconnected)
    {
        if (mypipe.hasdata)
        {
            lock (myQueue)
            {
                myQueue.Enqueue(mypipe.data);
            }
        }
    }
}
If you want to empty the queue every 1000 ms, do not use Thread.Sleep. Use a timer instead.
System.Threading.Timer t = new Timer(myListReadingProc, null, 1000, 1000);

private void myListReadingProc(object s)
{
    while (myQueue.Count > 0)
    {
        lock (myQueue)
        {
            string item = myQueue.Dequeue();
            // do whatever
        }
    }
}
Note that the above assumes that the queue is only being read by one thread. If multiple threads are reading, then there's a race condition. But the above will work with a single reader and one or more writers.
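If you do end up with multiple readers, one way to make the reader safe (a sketch) is to move the count check inside the lock, so the check and the Dequeue happen atomically:

private void myListReadingProc(object s)
{
    while (true)
    {
        string item = null;
        lock (myQueue)
        {
            if (myQueue.Count == 0)
                break;
            item = myQueue.Dequeue();
        }
        // do whatever with item (outside the lock, so writers aren't blocked)
    }
}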
I would suggest using a ConcurrentQueue (http://msdn.microsoft.com/en-us/library/dd267265.aspx). If you use a simple List<> then you will encounter a lot of threading issues.
The other practice would be to use an event (such as an AutoResetEvent) called outstandingWork and wait on it instead of calling Thread.Sleep(). Then, when you enqueue some work, you signal outstandingWork. This means you sleep when no work is available but start processing new work immediately, instead of sleeping for the entire second.
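With .NET 4.0 or later, that combination would look roughly like this sketch (Work stands in for whatever item type you stream from the pipe):

ConcurrentQueue<Work> queue = new ConcurrentQueue<Work>();
AutoResetEvent outstandingWork = new AutoResetEvent(false);

void Enqueue(Work work)
{
    queue.Enqueue(work);   // no lock needed, the queue is thread-safe
    outstandingWork.Set(); // wake the consumer immediately
}

void DoWork()
{
    while (true)
    {
        Work work;
        while (queue.TryDequeue(out work))
        {
            // Do the work.
        }
        outstandingWork.WaitOne(); // sleep only while there is nothing to do
    }
}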
Edit
As @Prix pointed out, you are using .NET 3.5, so you cannot use ConcurrentQueue. Use the Queue class with the following:
Queue<Work> queue;
AutoResetEvent outstandingWork = new AutoResetEvent(false);

void Enqueue(Work work)
{
    lock (queue)
    {
        queue.Enqueue(work);
        outstandingWork.Set();
    }
}

Work DequeMaybe()
{
    lock (queue)
    {
        if (queue.Count == 0) return null;
        return queue.Dequeue();
    }
}

void DoWork()
{
    while (true)
    {
        Work work = DequeMaybe();
        if (work == null)
        {
            outstandingWork.WaitOne();
            continue;
        }
        // Do the work.
    }
}
I have a scenario where a command comes in over a socket and requires a fair amount of work. Only one thread can process the data at a time, and the commands will come in faster than I can process them, so over time there will be quite a backlog.
The good part is that I can discard waiting threads and really only have to process the last one that is waiting (or process the first one in and discard all the other ones). I was thinking about using a semaphore to control the critical section of code and a boolean to see whether any threads are blocked; if there is a blocked thread, I would just discard the new one.
My mind is drawing a blank on how to implement this nicely; I would like to implement it without using an integer or boolean to see whether a thread is already waiting.
I am coding this in C#.
You can use Monitor.TryEnter to see whether a lock is already taken on an object:
readonly object lockObject = new object();

void ProcessConnection(TcpClient client)
{
    bool lockTaken = false;
    Monitor.TryEnter(lockObject, ref lockTaken);
    if (!lockTaken)
    {
        client.Close();
        return;
    }
    try
    {
        // long-running process here
    }
    finally
    {
        Monitor.Exit(lockObject);
        client.Close();
    }
}
Note that for this to work you'll still have to invoke the method in a thread, for example:
client = listener.AcceptTcpClient();
ThreadPool.QueueUserWorkItem(notused => ProcessConnection(client));
FYI, the lock statement is just sugar for:
Monitor.Enter(lockObject);
try
{
    // code within lock { }
}
finally
{
    Monitor.Exit(lockObject);
}
I believe you are looking for the lock statement.
private readonly object _lock = new object();

private void ProcessCommand(Command command)
{
    lock (_lock)
    {
        // ...
    }
}
It sounds like you just need to use the lock statement. Code inside a lock statement will allow only one thread to work inside the code block at once.
More info: lock Statement
From the sounds of what you've posted here, you might be able to avoid so many waiting threads. You could queue up the next command to execute and, rather than keeping threads waiting, just replace the command to execute next once the current command finishes. Lock when replacing and removing the "waiting" command.
Something like this:
class CommandHandler
{
    Action nextCommand;
    ManualResetEvent manualResetEvent = new ManualResetEvent(false);
    object lockObject = new object();

    public CommandHandler()
    {
        new Thread(ProcessCommands).Start();
    }

    public void AddCommand(Action nextCommandToProcess)
    {
        lock (lockObject)
        {
            nextCommand = nextCommandToProcess;
        }
        manualResetEvent.Set();
    }

    private void ProcessCommands()
    {
        while (true)
        {
            Action action = null;
            lock (lockObject)
            {
                action = nextCommand;
                nextCommand = null;
            }
            if (action != null)
            {
                action();
            }
            lock (lockObject)
            {
                if (nextCommand != null)
                    continue;
                manualResetEvent.Reset();
            }
            manualResetEvent.WaitOne();
        }
    }
}
check out: ManualResetEvent
It's a useful threading class.