I'm currently working on a program that analyses your voice.
I have a thread running a WaveInEvent that records audio from the mic.
When data is available (the DataAvailable event fires with WaveInEventArgs), it encodes it, serialises it, and sends it to the analyser.
I'd like to be able to use an audio file instead, say an .mp3. I'm reading it like this:
byte[] audioFileBytes = File.ReadAllBytes(audioFilePath);
And then sending it to the analyser the same way.
I'm encountering a couple of problems. First, the audio file is not processed on the other end, while it works if I use the "live" audio input. I'm guessing it has to do with buffer length, but I can't find how long the WaveInEvent buffer is, nor how often the DataAvailable event fires.
Second, the whole file is processed in ~2 seconds. On the analyser side I used to timestamp the data as it arrived in real time, so right now the entire file gets a single timestamp. If I send it byte by byte, I'll get 1.2 million timestamps in about 2 seconds, which is still not plottable on a graph. Besides, the graph should start at 0 and end at the length of the audio file (70 seconds for my test file), not at 2 seconds.
So as a first step I thought about reading the audio file "in real time", and sending the audio from the file as if it came from the microphone, but I have no idea how to do that.
My question comes down to: how often do the WaveInEvent handlers fire?
It's to do with the length of the buffers. Every time a buffer is filled with audio, the event will fire. Look at the BufferMilliseconds and NumberOfBuffers properties; you can set these to the desired values before you start recording.
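To make the timing concrete: with 16-bit mono audio at 44.1 kHz and BufferMilliseconds = 100 (which I believe is NAudio's default, along with NumberOfBuffers = 3), each event delivers 8,820 bytes. To feed a file "as if it came from the microphone", you could split its decoded PCM into buffers of that size and send one every BufferMilliseconds. Note also that File.ReadAllBytes on an .mp3 gives you compressed MP3 frames, not the PCM that WaveInEvent delivers, which may well be why the analyser rejects it; NAudio's Mp3FileReader can decode to PCM first. A sketch of the arithmetic and chunking, with no NAudio dependency (names are mine):

```csharp
using System;
using System.Collections.Generic;

static class MicSimulator
{
    // Size of one WaveInEvent buffer: one buffer holds BufferMilliseconds of audio.
    public static int BytesPerBuffer(int sampleRate, int channels, int bytesPerSample, int bufferMs)
        => sampleRate * channels * bytesPerSample * bufferMs / 1000;

    // Split raw PCM bytes into mic-sized chunks, as WaveInEvent would deliver them.
    // In a real sender you'd pause BufferMilliseconds between chunks.
    public static List<byte[]> Chunk(byte[] data, int chunkSize)
    {
        var chunks = new List<byte[]>();
        for (int offset = 0; offset < data.Length; offset += chunkSize)
        {
            int len = Math.Min(chunkSize, data.Length - offset);
            var chunk = new byte[len];
            Array.Copy(data, offset, chunk, 0, len);
            chunks.Add(chunk);
        }
        return chunks;
    }
}
```

Sending one chunk per BufferMilliseconds (with a timer or Thread.Sleep) also fixes the timestamp problem: the analyser receives the file over its real 70-second duration.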
I am writing a simple app where an Arduino sends data to a C# program using simple Serial.write() calls. I send the data from the Arduino with a special syntax, and the C# program reads and interprets it and eventually graphs the data.
For example, the Arduino program sends
Serial.write("DATA 1 3 90;");
which means: the X and Y values in chart 1 are 3 and 90, respectively. The C# program reads it using
private async void Serialport_DataReceived(object sender, SerialDataReceivedEventArgs e)
{
    // Note: ReadAsync returns the number of bytes actually read;
    // only that many bytes of the buffer are valid on this pass.
    int bytesRead = await serialport.BaseStream.ReadAsync(buffer, 0, 1024);
    await Task.Run(() => Graph()); // the Graph function graphs the data
}
In the Graph() function, I convert the buffer to a string with Encoding.ASCII.GetString(buffer) and interpret the data according to the syntax. But for some reason, either the C# program doesn't read fast enough or the Arduino doesn't send fast enough, and the messages are sometimes cut off. For example, a packet I received is:
DATA 2 5 90;DATA 5 90 10;DATA 1 1 <|-------- I cannot get the last Y Value
And the next data module starts with
75;DATA 5 4 60;DATA 14 5 6;DATA
/\
|
+==================== It is here
BTW, all the packets are 32 bytes.
So I either need to get the data line by line, but I cannot do that because the Arduino sends it too fast:
Serial.write("DATA 1 3 90;");
Serial.write("DATA 2 4 40;");
arrives in C# as DATA 1 3 90;DATA 2 4 40; in one block. Or do I need to get it all at once?
(I'd prefer getting it line by line.)
UPDATE:
When delay(1000); is added between sends, the data is processed correctly. Without the delays, the Arduino sends the data too fast, and the data clumps together and gets cut off. How can I make sure there is no delay in the data, yet the data is reliable and uninterrupted?
UPDATE 2:
When the buffer size is increased to 100 * 1024 * 1024, as well as the count passed to the ReadAsync method, the read message is much larger, but still has interruptions, though.
I can give you any extra information.
UPDATE 3:
I'll try to answer the comments:
First of all, I am already doing that and forgot to mention it: I split the incoming data on the semicolons and interpret the pieces one by one with a foreach loop.
Second, I am using the buffer globally instead of passing it as an argument to the Graph function.
Third, I'll try Serial.println() and give you feedback on it.
UPDATE 4:
I was able to solve the problem by replacing ReadAsync with SerialPort.ReadLine() and Serial.write() with Serial.println(). The solution was suggested by Hans Passant.
UPDATE 5:
I still encounter the same problem, but differently. The Graph function is async, so when the data is written this fast, I miss some of it while I'm busy graphing and reading other data.
Also, because the Graph function is async, the chart control plots the points out of order and the graph gets messy.
This is easily solvable if I add delay(50) between the Serial.println() calls.
PS: I didn't post the whole code because it is a large block, but I can post it piece by piece if you tell me which part you want.
Any help is appreciated.
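Independent of the ReadLine vs ReadAsync question above, the underlying issue in the updates is that serial data is an unframed byte stream: write boundaries on the Arduino side don't survive the trip, so the reader has to buffer partial chunks and split on the ';' delimiter itself. A minimal, device-free sketch of that framing (the class name is mine):

```csharp
using System;
using System.Collections.Generic;
using System.Text;

// Accumulates partial serial reads and yields complete ';'-terminated messages,
// keeping any incomplete tail for the next read.
class MessageFramer
{
    private readonly StringBuilder _pending = new StringBuilder();

    public List<string> Feed(string chunk)
    {
        _pending.Append(chunk);
        string all = _pending.ToString();
        var complete = new List<string>();

        int last = all.LastIndexOf(';');
        if (last < 0) return complete;           // no complete message yet

        _pending.Clear();
        _pending.Append(all, last + 1, all.Length - last - 1); // keep the tail

        foreach (string msg in all.Substring(0, last).Split(';'))
            if (msg.Length > 0) complete.Add(msg);
        return complete;
    }
}
```

With this approach, it no longer matters whether a read delivers half a message or three messages at once; "DATA 1 1 " and "75;" from the example above reassemble into "DATA 1 1 75".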
So I have a kind of strange problem. I'm using LAN to communicate with a microcontroller, and everything was working perfectly, meaning I can send and receive data. To receive data I use a simple method: a for loop with Thread.Sleep(1) in which I keep checking client.GetStream().DataAvailable for true, where client is a TcpClient.
Now, for one process I have to send to and receive from the microcontroller at a higher baud rate. I was using 9600 for all other operations and everything was fine. Now, at 115200, client.GetStream().DataAvailable always seems to be false.
What could be the problem?
PS: Another way to communicate with the microcontroller (all chosen by user) is serial communication. This is still working fine with the higher Baud rate.
Here is a code snippet:
using (client = new TcpClient(IP_String, LAN_Port))
{
    client.SendTimeout = 200;
    client.ReceiveTimeout = 200;
    stream = client.GetStream();
    // ...
    bool OK = false;
    stream.Write(ToSend, 0, ToSend.Length);
    for (int j = 0; j < 1000; j++)
    {
        if (stream.DataAvailable)
        {
            OK = true;
            break;
        }
        Thread.Sleep(1);
    }
    // ...
}
EDIT:
While monitoring the communication with a listening device, I realized that the bits actually arrive and that the device actually answers. The one and only problem seems to be that the DataAvailable flag is not being raised. I should probably find another way to check data availability. Any ideas?
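One possible alternative to polling DataAvailable, sketched here under the assumption that the reply length is known: a blocking Read with a ReadTimeout, where Read simply waits for data and the timeout replaces the Sleep loop. This is a self-contained loopback demo (the local server stands in for the microcontroller), not tested against the actual device:

```csharp
using System;
using System.IO;
using System.Net;
using System.Net.Sockets;
using System.Text;
using System.Threading.Tasks;

class BlockingReadDemo
{
    // Reads exactly `count` bytes with a blocking Read loop; no DataAvailable polling.
    public static byte[] ReadExactly(NetworkStream stream, int count)
    {
        var buf = new byte[count];
        int got = 0;
        while (got < count)
        {
            int n = stream.Read(buf, got, count - got); // blocks until data or timeout
            if (n == 0) throw new IOException("connection closed");
            got += n;
        }
        return buf;
    }

    public static string Run()
    {
        var listener = new TcpListener(IPAddress.Loopback, 0);
        listener.Start();
        int port = ((IPEndPoint)listener.LocalEndpoint).Port;

        // Stand-in for the microcontroller: accept one client, answer with 5 bytes.
        var server = Task.Run(() =>
        {
            using (var s = listener.AcceptTcpClient())
                s.GetStream().Write(Encoding.ASCII.GetBytes("HELLO"), 0, 5);
        });

        using (var client = new TcpClient("127.0.0.1", port))
        {
            var stream = client.GetStream();
            stream.ReadTimeout = 1000;  // give up after 1 s instead of spinning
            var reply = ReadExactly(stream, 5);
            server.Wait();
            listener.Stop();
            return Encoding.ASCII.GetString(reply);
        }
    }
}
```

Read throws an IOException if the timeout expires, which plays the role of the `OK = false` branch in the polling loop.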
I've been trying to think of things I've seen that act this way...
I've seen serial chips that say they'll do 115,200, but actually won't. See what happens if you drop the baud rate one notch. Either way you'll learn something.
Some microcontrollers "bit-bang" the serial port by having the CPU raise and lower the data pin and essentially go through the bits, banging 1 or 0 onto the serial pin. When a byte comes in, they read it, and do the same thing.
This does save money (no serial chip) but it is an absolute hellish nightmare to actually get working reliably. 115,200 may push a bit-banger too hard.
This might be a subtle microcontroller problem. Say you have a receiving serial chip which asserts a pin when a byte has come in, usually something like DRQ* for "Data Request" (the * in DRQ* means a 0-volt is "we have a byte" condition) (c'mon, people, a * isn't always a pointer:-). Well, DRQ* requests an interrupt, the firmware & CPU interrupt, it reads the serial chip's byte, and stashes it into some handy memory buffer. Then it returns from interrupt.
A problem can emerge if you're getting data very fast. Let's assume data has come in, serial chip got a byte ("#1" in this example), asserted DRQ*, we interrupted, the firmware grabs and stashes byte #1, and returns from interrupt. All well and good. But think what happens if another byte comes winging in while that first interrupt is still running. The serial chip now has byte #2 in it, so it again asserts the already-asserted DRQ* pin. The interrupt of the first byte completes. What happens?
You hang.
This is because it's the -edge- of DRQ*, physically going from 5V to 0V, that actually causes the CPU interrupt. On the second byte, DRQ* started at 0 and was set to 0. So DRQ* is (still) asserted, but there's no -edge- to tell the interrupt hardware/CPU that another byte is waiting. And now, of course, all the rest of the incoming data is also dropped.
See why it gets worse at higher speeds? The interrupt routine is fielding data more and more quickly, and typically doing circular I/O buffer calculations within the interrupt handler, and it must be fast and efficient, because fast input can push the interrupt handler to where a full new byte comes in before the interrupt finishes.
This is why it's a good idea to check DRQ* during the interrupt handler to see if another byte (#2) is already waiting (if so, just read it in, to clear the serial chip's DRQ*, and stash the byte in memory right then), or use "level triggering" for interrupts, not "edge triggering". Edge triggering definitely has good uses, but you need to watch out for this.
I hope this is helpful. It sure took me long enough to figure it out the first time. Now I take great care on stuff like this.
Good luck, let me know how it goes.
thanks,
Dave Small
I have a one-on-one connection between a server and a client. The server is streaming real-time audio/video data.
My question may sound weird, but should I use multiple ports/sockets or only one? Is it faster to use multiple ports, or does a single one offer better performance? Should I have one port just for messages, one for video, and one for audio, or is it simpler to package the whole thing through a single port?
One of my current problem is that I need to first send the size of the current frame as the size - in bytes - may change from one frame to the next. I'm fairly new to Networking, but I haven't found any mechanism that would automatically detect the correct range for a specific object being transmitted. For example, if I send a 2934 bytes long packet, do I really need to tell the receiver the size of that packet?
I first tried to package the frames as fast as they were coming in, but I found that the receiving end would sometimes not get the appropriate number of bytes. Most of the time it would read faster than I send, getting only a partial frame. What's the best way to get exactly the right number of bytes as quickly as possible?
Or am I looking too low and there's a higher-level class/framework used to handle object transmission?
I think it is better to use an object mechanism and send the data in an interleaved fashion over one connection. This mechanism may well work faster than a multiple-port mechanism.
eg:
class Data {
    DataType;   // Audio/Video
    Size;       // size of the data buffer
    DataBuffer; // contents depend on the type
}
'DataType' and 'Size' are always of constant size. On the client side, read the 'DataType' and 'Size', and then read the specified number of bytes of the corresponding data (audio/video).
Just making something up off the top of my head. Shove "packets" like this down the wire:
1 byte  - packet type (audio or video)
2 bytes - data length
(whatever else you need)
n bytes - raw data
So whenever you get one of these packets on the other end, you know exactly how much data to read, and where the beginning of the next packet should start.
[430 byte audio L packet]
[430 byte audio R packet]
[1000 byte video packet]
[20 byte control packet]
[2000 byte video packet]
...
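The header layout above could be sketched as follows. A 2-byte length caps payloads at 65,535 bytes, which covers the frame sizes in the example; the field order, byte order (big-endian), and names are my own choices, not part of the answer:

```csharp
using System;

static class Packet
{
    // Wire format: [1 byte type][2 bytes big-endian length][payload].
    public static byte[] Encode(byte type, byte[] payload)
    {
        var packet = new byte[3 + payload.Length];
        packet[0] = type;
        packet[1] = (byte)(payload.Length >> 8);
        packet[2] = (byte)(payload.Length & 0xFF);
        Array.Copy(payload, 0, packet, 3, payload.Length);
        return packet;
    }

    // Returns (type, payload) and advances offset past this packet,
    // i.e. to where the next packet in the stream begins.
    public static (byte Type, byte[] Payload) Decode(byte[] buffer, ref int offset)
    {
        byte type = buffer[offset];
        int length = (buffer[offset + 1] << 8) | buffer[offset + 2];
        var payload = new byte[length];
        Array.Copy(buffer, offset + 3, payload, 0, length);
        offset += 3 + length;
        return (type, payload);
    }
}
```

The key property is the one the answer states: after reading the 3-byte header, the receiver knows exactly how many bytes to read and where the next packet starts.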
But why re-invent the wheel? There are protocols to do these things already.
I've been trying to figure out the mystical realm of MIDI parsing, and I'm having no luck. All I'm trying to do is get the note value (60 = C4, 72 = C5, etc), in order of when they occur.
My code is as follows. All it does is very simply open a file as a byte array and read everything out as hex:
byte[] MIDI = File.ReadAllBytes("TestMIDI.mid");
foreach (var element in MIDI) {
    string b = Convert.ToString(element, 16);
    Debug.WriteLine(b);
}
All TestMIDI.mid contains is one note on C5. Here's a hex dump of it. Using this info, I'm trying to find the simple hex value for Note On (0x9, or just 9 in the dump), but there aren't any. I can find a few 72s, but there are three of them, which doesn't make sense to me (note on, note off, then what?).
This is my first attempt at parsing MIDI as a file and using hex dumps (are they even called that?), so I'm sorry if I'm heading in completely the wrong direction. All I need is to get the notes that play, and in what order. I don't need timing or anything fancy at all. The reason behind this, if it matters, is to then generate code in a different language to be played out of a speaker, very similar to the beep command on *nix. Because of this, I don't want to use any frameworks that 1) I didn't write myself, so I wouldn't really learn anything, and 2) do far more than what I need, making the framework heavier than my actual code.
The accepted answer is not a solution to the problem; it will not work in the common case. I'll describe several cases where that code will either not work or will fail, ordered by probability, most probable first.
False positives. MIDI files contain a lot of data structures where you can find a byte with the value 144, and these structures are not Note On events. For real MIDI files you'll get a bunch of "notes" that are not notes at all, just random values within the file.
Channels other than 0. Most modern MIDI files contain several track chunks, each holding events for a specific MIDI channel (from 0 to 15). 144 (90 in hex) represents a Note On event for channel 0 only, so you are going to miss the Note On events for every other channel.
Running status. MIDI files actively use the concept of running status. This technique avoids storing the status bytes of consecutive events of the same type, which means the status byte 144 may be written only once, for the first Note On event, and you will not find it again anywhere in the file.
144 is the last byte in the file. A MIDI file can end with this value, for example if a custom chunk is the last chunk in the file, or if a track chunk doesn't end with an End of Track event (which is corruption according to the MIDI file specification, but a possible scenario in the real world). In this case you'll get an IndexOutOfRangeException on MIDI[i+1].
Thus, you should never search for a specific byte value to find a semantic data structure in a MIDI file. You must use one of the .NET libraries available on the Internet. For example, with DryWetMIDI you can use this code:
IEnumerable<Note> notes = MidiFile.Read(filePath)
.GetNotes();
To do this right, you'll need at least some semblance of a MIDI parser. Searching for 0x9 events is a good start, but a 0x9 event is effectively a Note-Off if its velocity field is 0, and the byte 0x9 can also appear inside other events (meta events, MPQN events, delta times, etc.), so you'll get false positives. You need something that actually knows the MIDI file format to do this accurately.
Look for a library, write your own, or port an open-source one. Mine is in Java if you want to look.
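If you do write your own, one of the first moving parts you'll need is a reader for MIDI's variable-length quantities: every event in a track chunk is preceded by a delta time stored that way, and you can't find event boundaries (or know where a status byte can legally occur) without reading them correctly. A sketch of just that piece, not a full parser:

```csharp
using System;

static class Midi
{
    // Reads a MIDI variable-length quantity starting at pos and advances pos.
    // Each byte contributes 7 bits; a set high bit means more bytes follow.
    public static int ReadVarLen(byte[] data, ref int pos)
    {
        int value = 0;
        byte b;
        do
        {
            b = data[pos++];
            value = (value << 7) | (b & 0x7F);
        } while ((b & 0x80) != 0);
        return value;
    }
}
```

For example, the two-byte sequence 0x81 0x48 decodes to 200, which matches the worked examples in the Standard MIDI Files specification.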
I am using C# to receive data from a serial port but there are some problems. I'm new to this so I need some help.
First of all, I want to know which functions are event driven:
ReadExisting()
Read()
ReadByte()
ReadChar()
ReadLine()
ReadTo()
How can I take the required data from the input stream of this port?
My messages have a static size. Can I use a special char to mark the limits of a message, and which char would be suitable for this?
How do I handle this exception:
C# SerialPort System.ObjectDisposedException, safe handle has been closed in System.DLL
None of these methods are "event driven"; you'd use them in the DataReceived event, which is called when the serial port has at least one byte of data available to read.
Not sure what "static sized" means. If the device sends a fixed number of bytes, then you'd use the Read() method to read them. Pay attention to the return value: you'll get only as many bytes as are available. Store them in a byte[] and append to it in the next DataReceived event until you've got them all.
If the device sends characters rather than bytes then you can usually take advantage of the NewLine property. Set it to the character or string that terminates the response. A linefeed ("\n") is by far the most typical choice. Read the response with ReadLine(). No buffering is required in that case.
You'll get the ObjectDisposedException when you close a form without ensuring that the device stops sending data. Be sure to use only BeginInvoke in the DataReceived event, not Invoke, and don't call BeginInvoke if the form's IsDisposed property is true.
I can't add much to Hans' answer, except to say that one of the biggest traps I have seen is that people tend to expect that when the DataReceived event fires, all of the bytes they would like to receive are present.
E.g. if your message protocol is 20 bytes long, the DataReceived event fires and you try to read 20 bytes. They may all be there, they may not; it's pretty likely they won't be, depending on your baud rate.
You need to check the BytesToRead property of the port you are reading from, and Read that amount into your buffer. If and when more bytes are available, the DataReceived event will fire again.
Note that the DataReceived event will fire when the number of bytes to receive is at least equal to the ReceivedBytesThreshold property of the serial port. By default I believe this is set to a value of 1.
If you set this to 10 for example, the event will fire when there are 10 or more bytes waiting to be received, but not fewer. This may or may not cause problems, and it is my personal preference to leave this property value set to 1, so that all data received will fire the event, even if only 1 byte is received.
Do not make the mistake that this will cause the event to fire for every single byte received - it won't do that.
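The append-until-complete pattern described in this answer can be sketched without a physical port. In a real DataReceived handler you would call Feed with the bytes and count that Read() returned; the class and names below are mine, not part of the SerialPort API:

```csharp
using System;
using System.Collections.Generic;

// Collects bytes across partial reads and emits complete fixed-size messages.
class FixedSizeAssembler
{
    private readonly int _messageSize;
    private readonly List<byte> _pending = new List<byte>();

    public FixedSizeAssembler(int messageSize) => _messageSize = messageSize;

    // chunk/count mimic what SerialPort.Read() hands back: a buffer and the
    // number of valid bytes in it, which may be any fraction of a message.
    public List<byte[]> Feed(byte[] chunk, int count)
    {
        for (int i = 0; i < count; i++) _pending.Add(chunk[i]);

        var complete = new List<byte[]>();
        while (_pending.Count >= _messageSize)
        {
            complete.Add(_pending.GetRange(0, _messageSize).ToArray());
            _pending.RemoveRange(0, _messageSize);
        }
        return complete;
    }
}
```

Because the assembler keeps any leftover bytes, it doesn't matter whether one event delivers 3 bytes of a 20-byte message or two and a half messages at once.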