Parallel operations with audio stream in C#

I record sound in C# (WPF), and when data from the sound card is available this event is called:
void myWaveIn_DataAvailable(object sender, WaveInEventArgs e)
{
    // Convert the raw stream into floating-point samples
    for (int index = 0; index < e.BytesRecorded; index += 2)
    {
        short sample = (short)((e.Buffer[index + 1] << 8) | e.Buffer[index + 0]);
        samples32Queue.Enqueue(sample / 32768f);
    }
    // *** Do some processing with the data inside the queue
}
As you can see, I push every sample from the recorded buffer into a queue, declared like this:
Queue<float> samples32Queue = new Queue<float>();
Inside the event, after the for loop, I want to do some processing on the queue. I worry that while I am processing the data, new samples will arrive from the sound card and my processing will be lost.
What is the right approach here?
Should the processing method I call from the event be static or non-static?

Given that you can buffer the samples and process them later, consider using BlockingCollection<T>. It is an excellent fit for the producer-consumer pattern, which, as I understand it, is your case: on one side the myWaveIn_DataAvailable() method is the producer adding samples to the collection, and on the other end another consuming thread (it doesn't have to be another thread, though) takes the samples and processes them. There are various ways the consumer can be implemented, and they are well documented on MSDN.
EDIT:
See the following example. I did not test it, but it should give you a starting point:
class ProducerConsumerExample
{
    BlockingCollection<float> samples32Collection;
    Thread consumer;

    public ProducerConsumerExample()
    {
        samples32Collection = new BlockingCollection<float>();
        consumer = new Thread(() => LaunchConsumer());
        consumer.Start(); // you don't have to launch the consumer here...
    }

    void Terminate() // Call this to terminate the consumer
    {
        consumer.Abort();
    }

    void myWaveIn_DataAvailable(object sender, WaveInEventArgs e)
    {
        // Convert the raw stream into floating-point samples
        for (int index = 0; index < e.BytesRecorded; index += 2)
        {
            short sample = (short)((e.Buffer[index + 1] << 8) | e.Buffer[index + 0]);
            samples32Collection.Add(sample / 32768f);
        }
    }

    void LaunchConsumer()
    {
        while (true /* insert your abort condition here */)
        {
            try
            {
                // This thread will wait here until the producer adds new item(s) to the collection.
                var sample = samples32Collection.Take();
                // In the meanwhile, more samples can be added to the collection,
                // but they will not be processed until this thread is done with Process(sample).
                Process(sample);
            }
            catch (InvalidOperationException) { }
        }
    }
}
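If you would rather not abort the consumer thread, a gentler shutdown is possible with CompleteAdding() and GetConsumingEnumerable(). A rough, untested sketch using the same fields as above:
// Alternative consumer loop: the foreach blocks while the collection is empty
// and exits once CompleteAdding() has been called and the collection is drained.
void LaunchConsumer()
{
    foreach (float sample in samples32Collection.GetConsumingEnumerable())
    {
        Process(sample);
    }
}

void Terminate()
{
    samples32Collection.CompleteAdding(); // signal "no more items"
    consumer.Join();                      // wait for the consumer to finish draining
}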

Related

Speed improvement using multiple threads

I have a CustomControl called PlaylistView. It displays the elements of a playlist with a name and a thumbnail. The method DisplayPlaylist starts a thread in which the individual elements are added one by one and the thumbnails (the 30th frame) are read out:
public void DisplayPlaylist(Playlist playlist)
{
    Thread thread = new Thread(() => DisplayElements(playlist));
    thread.Start();
}

private void DisplayElements(Playlist playlist)
{
    for (int i = 0; i < playlist.elements.Count; i++)
        DisplayElement(playlist.elements[i], i);
}

private void DisplayElement(IPlayable element, int index)
{
    VideoSelect videoSelect = null;
    if (element is Audio)
        //
    else if (element is Video)
        videoSelect = new VideoSelect(index, element.name, GetThumbnail(element.path, SystemData.thumbnailFrame));
    videoSelect.Location = GetElementsPosition(index);
    panel_List.BeginInvoke(new Action(() =>
    {
        panel_List.Controls.Add(videoSelect);
    }));
}

private Bitmap GetThumbnail(string path, int frame)
{
    VideoFileReader reader = new VideoFileReader();
    try
    {
        reader.Open(path);
        for (int i = 1; i < frame; i++)
            reader.ReadVideoFrame();
        return reader.ReadVideoFrame();
    }
    catch
    {
        return null;
    }
}
But there is a problem: it is much too slow (about 10 elements/sec). With a playlist length of 614, you would have to wait more than a minute until all elements are displayed. Each time the playlist changes, such as adding or deleting an item, the procedure starts over for the new state; adding 2 or more items makes it even more complicated.
My next approach was to use multiple threads, with the number of threads specified by the user (1 up to a maximum of 10). The implementation currently looks like this (only the parts changed compared to the code posted above):
public void DisplayPlaylist(Playlist playlist)
{
    for (int i = 0; i < SystemData.usedDisplayingThreads; i++)
    {
        Thread thread = new Thread(() => DisplayElements(playlist, i));
        thread.Start();
    }
}

private void DisplayElements(Playlist playlist, int startIndex)
{
    for (int i = startIndex; i < playlist.elements.Count; i += SystemData.usedDisplayingThreads)
        DisplayElement(playlist.elements[i], i);
}
The problem is that the GetThumbnail function now very often returns null, which causes an error. In addition, a System.AccessViolationException is often thrown.
In my opinion the reason is the presence of multiple, simultaneously active VideoFileReaders. However, I do not know what exactly triggers the problem, so I cannot present a solution. Maybe you know what the actual trigger is and how to fix it, or perhaps you know other, possibly more elegant, ways to improve the speed.
I would start by logging which exception is raised in the GetThumbnail method; your code hides it and returns null. Change the catch to catch (Exception exc), write the exception details to a log, or at least inspect them in the debugger. That can give a hint.
Also, I'm pretty sure your VideoFileReader instances are IDisposable, so you have to dispose them by invoking reader.Close(). Maybe previous instances were not disposed and you are trying to open the same file multiple times.
Update: the video frame has to be disposed as well. You will probably need to make a copy of the bitmap if it still references the reader and prevents its disposal.
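For illustration only, GetThumbnail could look roughly like this with disposal added, assuming VideoFileReader implements IDisposable as suggested above and that new Bitmap(frame) is enough to detach the thumbnail from the reader:
// Sketch only: assumes VideoFileReader is IDisposable (as suggested above)
// and that copying the Bitmap decouples it from the reader's buffers.
private Bitmap GetThumbnail(string path, int frame)
{
    try
    {
        using (VideoFileReader reader = new VideoFileReader())
        {
            reader.Open(path);
            Bitmap lastFrame = null;
            for (int i = 1; i <= frame; i++)
            {
                lastFrame?.Dispose();              // release every intermediate frame
                lastFrame = reader.ReadVideoFrame();
                if (lastFrame == null)
                    return null;                   // the file has fewer frames than expected
            }
            Bitmap copy = new Bitmap(lastFrame);   // detach the thumbnail from the reader
            lastFrame.Dispose();
            return copy;
        }
    }
    catch (Exception exc)
    {
        Console.WriteLine(exc);                    // log instead of silently swallowing
        return null;
    }
}
Note that disposal alone may not cure the AccessViolationException; if it persists with several readers running at once, the underlying native decoder may simply not be safe to use from multiple threads.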

Safely clear List (with concurrency)

Good day!
I have a List<byte> soundBuffer used to collect the audio signal from the microphone.
void _waveInStream_DataAvailable(object sender, WaveInEventArgs e)
{
    lock (_lockObject)
    {
        for (int i = 0; i < e.BytesRecorded; i++)
        {
            _soundBuffer.Add(e.Buffer[i]);
        }
    }
}
If the user records for a long time, the buffer becomes very big (about 2 MB per minute). So I created a timer:
_timerSoundCount = new System.Timers.Timer();
_timerSoundCount.Interval = 10000; // check every 10 second
_timerSoundCount.Enabled = true;
_timerSoundCount.Elapsed += _timerSoundCount_Elapsed;
_timerSoundCount.Start();
And:
void _timerSoundCount_Elapsed(object sender, ElapsedEventArgs e)
{
    if (_soundBuffer.Count > 2 * 1024 * 1024)
    {
        var energy = GetSignalEnergy(_soundBuffer);
        if (energy < 1) // if the energy of the signal is small, clear the buffer
        {
            lock (_lockObject)
                _soundBuffer.Clear();
        }
    }
    else if (_soundBuffer.Count >= 5 * 1024 * 1024)
    { ... the same operation }
    else if (_soundBuffer.Count >= 10 * 1024 * 1024)
    {
        _soundBuffer.Clear(); // very big buffer
    }
}
Every 10 seconds I check the buffer size. If it is too big, I just clear the buffer, because I can detect speech/silence and clear the buffer in that code.
So, the point is: if _soundBuffer.Clear() executes in the timer while _waveInStream_DataAvailable is adding new bytes to the buffer at the same time, can the write be corrupted?
Could it deadlock?
If so, can you help me clear the buffer safely?
Thank you!
If the two actions are being undertaken from the same thread, there is no chance of a deadlock occurring.
If there are multiple threads writing/reading the list at the same time, then lock should be used (https://msdn.microsoft.com/en-us/library/c5kehkcz.aspx) to prevent multiple threads from accessing the object simultaneously. See here (use the same lock object at two different code block?) for a simple example.
Alternatively, you can use a concurrent collection from the System.Collections.Concurrent namespace (https://msdn.microsoft.com/en-us/library/system.collections.concurrent(v=vs.110).aspx). A ConcurrentQueue might be appropriate if the data is not accessed randomly. You can also implement your own concurrent collection, although this is much more complex.
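For example, a rough sketch of the ConcurrentQueue variant for this case (the clearing logic is simplified and the names follow the question; treat it as a starting point, not a drop-in replacement):
// Requires: using System.Collections.Concurrent;
ConcurrentQueue<byte> _soundBuffer = new ConcurrentQueue<byte>();

void _waveInStream_DataAvailable(object sender, WaveInEventArgs e)
{
    // Thread-safe without an explicit lock
    for (int i = 0; i < e.BytesRecorded; i++)
        _soundBuffer.Enqueue(e.Buffer[i]);
}

void _timerSoundCount_Elapsed(object sender, ElapsedEventArgs e)
{
    if (_soundBuffer.Count > 2 * 1024 * 1024)
    {
        // Older frameworks have no ConcurrentQueue.Clear(), so drain it
        // item by item (or swap in a fresh instance instead).
        byte ignored;
        while (_soundBuffer.TryDequeue(out ignored)) { }
    }
}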

OutOfMemoryException: threads using too much virtual memory, about 12 GB

I have to create lots of threads to send a pcap file over UDP. When a thread has completely sent the pcap file, it sleeps for some time. When I let each thread sleep for 420 seconds, virtual memory fills up after more than 3100 threads have been created and the program throws an OutOfMemoryException.
I searched the internet and found that a thread takes only about 1 MB to create, and the pcap file is just 60 KB, yet my 3100 threads consume more than 12 GB (1.06 MB * 3100 is far less than 12 GB). On the other hand, physical memory usage stays below 200 MB. I have to create more than 5000 threads at the same time.
What am I doing wrong? Can anyone help me?
Thanks.
My code:
public static void send_pcap_file_with_single_port()
{
    string callID = Call_ID;
    try
    {
        // CREATING CONNECTION HERE
        using (FileStream stream = new FileStream("g711a.pcap", FileMode.Open, FileAccess.ReadWrite, FileShare.ReadWrite))
        {
            for (Pos = 0; Pos < (StreamBytes - ChunkSize); Pos += ChunkSize)
            {
                // creating RTP_header here
                stream.Read(RTP_payload, 0, ChunkSize);
                // combining both byte arrays
                System.Buffer.BlockCopy(RTP_header, 0, Bytes_to_send, 0, RTP_header.Length);
                System.Buffer.BlockCopy(RTP_payload, 0, Bytes_to_send, 16, RTP_payload.Length);
                RTPpacket_queue.Enqueue(Bytes_to_send);
                //RTP_handler.Send(Bytes_to_send, Bytes_to_send.Length, remote_EP);
            }
            // done processing here
            stream.Close();
            stream.Dispose();
            RTP_count++;
            GC.Collect();
        }
        System.Threading.Thread.Sleep(420000);
    }
    catch (Exception e)
    {
        //using (StreamWriter sw = new StreamWriter(stream_logFile))
        //{
        //    sw.WriteLine(e.ToString());
        //}
        //send_BYE_message_toSIPp(client, "BYE", 5060, 2, callID);
        Console.WriteLine(e.ToString());
    }
}
Creating the threads here:
Thread RTP_sender = new Thread(new ThreadStart(send_pcap_file_with_single_port));
RTP_sender.Start();
In simple terms, you exhaust your garbage collector by creating objects that end up on the long-lived heap (objects that survive longer than a few seconds). The fix would be to free a thread and recreate it only when it is needed.
In any case, an i5 by default has 2 cores; if you have 3 or more threads, some of them run on the same core. Running 3000+ of them means roughly 1500 per core, which is not a problem unless they try to write to the same place (in which case they start blocking like hell).
To demonstrate that you do not need 5000 permanent threads to accomplish something like this, I have created a sample program.
The program does not do much, really, but it does create 5000 objects, each of which creates a thread only when it needs to do its work. There isn't any actual work being done other than sleeping for a random interval, but still.
Just run the program, leave it running for a while and keep an eye on its memory use. You will see it is very much manageable while still actually doing work on 5000 objects.
You will probably need to be creative in applying this approach to your situation, but you could do something along the lines of what I am doing.
namespace Test
{
    using System;
    using System.Collections.Generic;
    using System.ComponentModel;
    using System.Diagnostics;
    using System.Threading;

    public static class MainClass
    {
        public static Random sleeper = new Random ();

        public static void Main (string[] args)
        {
            Stopwatch timer = new Stopwatch ();
            List<WorkerClass> workload = new List<WorkerClass> ();
            // Create a workload of 5000 objects
            for (int i = 0; i < 5000; i++) {
                workload.Add (new WorkerClass ());
            }
            int fires = 0;
            // Start processing the workload
            while (true) {
                // We'll measure the time it took to go through the entire workload
                // to illustrate that it does not take all that long.
                timer.Restart ();
                foreach (WorkerClass w in workload) {
                    // For each of the worker objects in the entire workload
                    // we decrease its internal counter by 1.
                    // Because we sleep for 1 second after the loop is done,
                    // that amounts to reducing the counter by 1 every second.
                    w.counter--;
                    if (w.counter == 0) {
                        fires++;
                        // Once the counter hits 0, do the work.
                        w.DoWork ();
                    }
                }
                timer.Stop ();
                Console.WriteLine ("Processing the entire workload of {0} objects took {1} milliseconds, {2} workers actually fired.", workload.Count, timer.ElapsedMilliseconds, fires);
                fires = 0;
                Thread.Sleep (1000);
            }
        }
    }

    public class WorkerClass
    {
        public int counter = 0;

        public WorkerClass ()
        {
            // When the worker is created, set its internal counter
            // to a random value between 5 and 9 (Next's upper bound is exclusive).
            // This is to mimic sleeping for a random interval.
            // Also see the primary loop in MainClass.Main.
            this.counter = MainClass.sleeper.Next (5, 10);
        }

        public void DoWork ()
        {
            // Whenever we do the work, we'll create a background worker thread
            // that actually does the work.
            BackgroundWorker work = new BackgroundWorker ();
            work.RunWorkerCompleted += (object sender, RunWorkerCompletedEventArgs e) => {
                // This simulates going back to sleep for a random interval, see
                // the main loop in MainClass.Main.
                this.counter = MainClass.sleeper.Next (5, 10);
            };
            work.DoWork += (object sender, DoWorkEventArgs e) => {
                // Simulate working by sleeping a random interval
                Thread.Sleep (MainClass.sleeper.Next (2000, 5000));
            };
            // And now we actually do the work.
            work.RunWorkerAsync ();
        }
    }
}
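It is also worth noting that each thread reserves stack address space (about 1 MB by default) even while it sleeps, so thousands of threads that spend most of their time in Thread.Sleep are largely wasted. As a rough, untested sketch under that assumption, the same pattern can be written with tasks, where a waiting sender holds no thread at all (this assumes the Thread.Sleep(420000) is moved out of send_pcap_file_with_single_port):
// Requires: using System.Collections.Generic; using System.Threading.Tasks;
static async Task SendLoopAsync()
{
    while (true)
    {
        send_pcap_file_with_single_port();               // the existing synchronous send logic
        await Task.Delay(TimeSpan.FromSeconds(420));     // waits without holding a thread
    }
}

static void StartSenders(int senderCount)
{
    var senders = new List<Task>();
    for (int i = 0; i < senderCount; i++)
        senders.Add(Task.Run(() => SendLoopAsync()));    // Task.Run(Func<Task>) unwraps the inner task
    Task.WaitAll(senders.ToArray());                     // keep the process alive
}
With this approach the thread pool only needs as many threads as there are senders actively inside send_pcap_file_with_single_port at any given moment, rather than one thread per logical sender.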

Collection was modified in foreach loop C#

I know there are MANY similar questions, but I can't seem to get to the bottom of this.
In my program I execute a verification method that compares two ASCII hex files with each other (one is local, the other is read from a USB device). Some code:
private void buttonVerify_Click(object sender, EventArgs e)
{
    onlyVerifying = true;
    Thread t = new Thread(verifyProgram);
    t.Start();
}
private void verifyProgram()
{
    verifying = true;
    externalFlashFile.Clear();
    // After this method is finished, the returned data will end up in
    // this.externalFlashFile since this listens to the USB's returned data
    hexFile.readExternalFlashForVerify(usbDongle, autoEvent);
    externalFlashFile.RemoveAt(0);
    //externalFlashFile.RemoveAt(externalFlashFile.Count - 1);
    hexFile.verifyProgram(externalFlashFile);
}

public void verifyProgram(List<string> externalProgram)
{
    byte[] originalFile = null; // Will be modified later with given size
    byte[] externalFile = new byte[4096];
    int k = 0, errors = 0;
    // Remove last line which contains USB command data
    externalProgram.RemoveAt(externalProgram.Count - 1);
    foreach (String currentLine in externalProgram)
    {
        for (int i = 0; i < 64; i += 2)
        {
            string currentDataByte = currentLine.Substring(i, 2);
            externalFile[k] = Convert.ToByte(currentDataByte, 16);
            k++;
        }
        progress += steps;
    }
    //... compare externalFile and originalFile
When readExternalFlashForVerify executes, the USB device responds with the requested data. The data is parsed and an event handler is called:
public void usbDongle_OnDataParsed(object sender, EventArgs e)
{
    if (verifying)
    {
        usbDongle.receivedBytesString.Trim();
        externalFlashFile.Add(usbDongle.receivedBytesString.Substring(2, 32 * 2));
        // Allow hexFile to continue its thread processing
        autoEvent.Set();
    }
}
The first run always completes correctly. On the following executions, at the third or fourth iteration of the foreach, I get an extra element in externalProgram. This is not a global variable (it is an argument of the function call) and the function is not called anywhere else. This of course throws an exception.
I tried adding .ToList() to externalProgram in the foreach, but that didn't make any difference. How can my externalProgram be modified during this execution?
EDIT: I never found the cause of this, but replacing the foreach with a hard-coded for loop solved the issue at hand. Not an optimal solution, but I don't have much time for this.
// The list should never be larger than 128 items
for (int j = 0; j < 0x7f; j++)
{
    string currentLine = externalProgram[j];
    // ...
Usually, when you receive an exception with a message like that, it is caused by multiple threads accessing the list at the same time.
What I suggest is to use a lock when you add and remove items from that list, so you can be sure the indexes of the collection are not changing underneath you. Think about what happens if you try to remove the last element (at index 3, for example) of a collection while another thread removes an earlier item (shrinking the length of the collection to 3...).
This example: Properly locking a List<T> in MultiThreaded Scenarios? describes better what I mean.
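A rough illustration of that locking idea, reusing the field names from the question (the _listLock object is hypothetical):
private readonly object _listLock = new object();

public void usbDongle_OnDataParsed(object sender, EventArgs e)
{
    if (verifying)
    {
        lock (_listLock)
            externalFlashFile.Add(usbDongle.receivedBytesString.Substring(2, 32 * 2));
        autoEvent.Set();
    }
}

public void verifyProgram(List<string> externalProgram)
{
    List<string> snapshot;
    lock (_listLock)
    {
        externalProgram.RemoveAt(externalProgram.Count - 1);  // remove the USB command line
        snapshot = new List<string>(externalProgram);         // private copy taken under the lock
    }
    foreach (string currentLine in snapshot)                  // enumerate the copy, not the shared list
    {
        // ... parse currentLine as before
    }
}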
Probably this line is the problem:
externalProgram.RemoveAt(externalProgram.Count - 1);
If verifyProgram is called multiple times, it will remove more and more lines from the externalProgram list, which is passed by reference.

Trouble with threads in WinForms

I need to calculate the sum of the digits in a textbox and the number of digits at the same time. So I decided to create two threads: one for the length of the number and one for the sum of its digits. When I start only one thread, it works correctly. But when I start the second thread, the form becomes slow or stops responding altogether.
I create two threads:
thrd = new Thread(GetLength);
thrd.Start();
thrd1 = new Thread(SetSum);
thrd1.Start();
These are the thread functions: one calculates the length of the number in the textbox, the other calculates the sum of its digits.
private void SetSum()
{
    while (true)
    {
        if (this.label3.InvokeRequired)
            this.Invoke(new Action(() => label3.Text = this.GetSum().ToString()));
    }
}

private int GetSum()
{
    string n = textBox1.Text;
    int sum = 0;
    for (int i = 0; i < n.Length; i++)
    {
        try
        {
            sum += int.Parse(n[i].ToString());
        }
        catch (FormatException) { }
    }
    return sum;
}

private void GetLength()
{
    while (true)
    {
        if (this.label2.InvokeRequired)
            this.Invoke(new Action(() => label2.Text = " | Length = " + textBox1.Text.Length.ToString()));
    }
}
Where is the problem? Synchronization?
I have found a workaround: I added Thread.Sleep(1) to the while loop in the GetLength method.
Several problems here.
The task at hand is much too small for a (full) Thread; threads are expensive to create.
By Invoking the main action, all the work is done on the main (UI) thread, so your solution is not really multi-threaded at all.
Counting is easily done as a by-product of summing (or vice versa), so 2 threads/tasks are overkill.
The while (true) ... loops will drag your process down, consuming CPU time for nothing.
The simple answer here is not to use any threads at all: just run the logic in textBox1.TextChanged (see the sketch below).
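A minimal sketch of that thread-free approach (assuming textBox1.TextChanged is wired to this handler and the same label2/label3 controls as in the question):
// Recompute both values whenever the text changes: no threads, no polling.
private void textBox1_TextChanged(object sender, EventArgs e)
{
    string n = textBox1.Text;
    int sum = 0;
    foreach (char c in n)
    {
        if (char.IsDigit(c))
            sum += c - '0';                      // digit value without Parse/exception handling
    }
    label3.Text = sum.ToString();
    label2.Text = " | Length = " + n.Length;
}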
Yes, the problem is in fact synchronization: there's too much of it.
You're spawning threads that do nothing but Invoke, which means the UI thread is doing all the work.
This part of your code is an infinite loop without any Thread.Sleep or other wait; it will drive the CPU to 100%. You should tie it to some event or other activity that triggers GetLength:
private void GetLength()
{
    while (true)
    {
        if (this.label2.InvokeRequired)
            this.Invoke(new Action(() => label2.Text = " | Length = " + textBox1.Text.Length.ToString()));
    }
}
