Send array of bytes to System.Media.SoundPlayer in C#

I want to send a byte array to the speaker, something like this:
byte[] bt = {12,32,43,74,23,53,24,54,234,253,153};// example array
var ms = new MemoryStream(bt);
var sound = new System.Media.SoundPlayer();
sound.Stream = ms;
sound.Play();
but I get this exception (screenshot): http://8pic.ir/images/g699b52xe5ap9s8yf0pz.jpg

The first bytes of a WAV stream contain information about the length, format, and so on.
You have to send this WAV header as the first few bytes as well.
See http://de.wikipedia.org/wiki/RIFF_WAVE
As you'll see, it's perfectly possible to compose these few header bytes yourself and send them ahead of your raw audio data.
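For example, here is a minimal sketch that writes a 44-byte PCM WAV header in front of the raw samples before handing the stream to SoundPlayer. The 8 kHz, 8-bit, mono format used here is an assumption; adjust it to match what your bytes actually contain:
byte[] rawSamples = { 12, 32, 43, 74, 23, 53, 24, 54, 234, 253, 153 }; // example data
int sampleRate = 8000;    // assumed sample rate
short channels = 1;       // mono
short bitsPerSample = 8;  // unsigned 8-bit PCM
int dataSize = rawSamples.Length;
short blockAlign = (short)(channels * bitsPerSample / 8);
int byteRate = sampleRate * blockAlign;

var ms = new MemoryStream();
var w = new BinaryWriter(ms); // not disposed, so the MemoryStream stays open for SoundPlayer
w.Write(System.Text.Encoding.ASCII.GetBytes("RIFF"));
w.Write(36 + dataSize);                               // remaining chunk size
w.Write(System.Text.Encoding.ASCII.GetBytes("WAVE"));
w.Write(System.Text.Encoding.ASCII.GetBytes("fmt "));
w.Write(16);                                          // fmt chunk size
w.Write((short)1);                                    // 1 = PCM
w.Write(channels);
w.Write(sampleRate);
w.Write(byteRate);
w.Write(blockAlign);
w.Write(bitsPerSample);
w.Write(System.Text.Encoding.ASCII.GetBytes("data"));
w.Write(dataSize);
w.Write(rawSamples);                                  // the raw audio follows the header
w.Flush();

ms.Position = 0;
var sound = new System.Media.SoundPlayer();
sound.Stream = ms;
sound.Play();
SoundPlayer reads the header to learn the sample rate and bit depth, so the values written above must describe the bytes that follow.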

You can use a library for reading data from the microphone or playing it to the speakers.
I have worked successfully with:
NAudio - http://naudio.codeplex.com/
I would not recommend building a WAV file yourself; it may be more effort than it is worth.
Note that this library (and probably some others; Bass - http://www.un4seen.com is also widely used) also has built-in functionality for saving and reading WAV files.
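As a minimal sketch of the original question with NAudio (assuming NAudio is referenced and the bytes are raw 8 kHz, 8-bit, mono PCM; change the WaveFormat to match your data):
byte[] bt = { 12, 32, 43, 74, 23, 53, 24, 54, 234, 253, 153 }; // example array
var format = new WaveFormat(8000, 8, 1); // assumed format of the raw bytes
using (var raw = new RawSourceWaveStream(new MemoryStream(bt), format))
using (var output = new WaveOutEvent())
{
    output.Init(raw);
    output.Play();
    while (output.PlaybackState == PlaybackState.Playing)
        System.Threading.Thread.Sleep(50); // block until the short clip finishes
}
RawSourceWaveStream wraps the raw bytes with a WaveFormat, so no WAV header has to be built by hand.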

NAudio is the best fit for this functionality. Use the sample app provided with it; it may help.

Related

Save the received MJPEG data in a .mjpeg file with C#

I have a project. It uses a Raspberry Pi Camera V2. One PC is used for encoding the captured video in MJPEG format and sending it over the serial port.
My PC is used for receiving the data, saving it to a .mjpeg file, and playing it with an MJPEG-to-MP4 converter.
I am trying to save the data in these lines:
byte[] data= new byte[100];
serialPort.Read(data,0,100);
BinaryWriter videoFile = new BinaryWriter(File.Open("video.mjpeg",FileMode.Create));
string dataAscii;
dataAscii = System.Text.Encoding.UTF8.GetString(data); //bytearray to string
videoFile.Write(dataAscii); // which is received
It works, it creates a .mjpeg file. However, I couldn't make it play with the converter. Maybe I should save the data frame by frame or try to save in a different way. I have no idea about what I am doing wrong.
Any ideas, many thanks!
Kane
Why are you converting the byte array into a string before writing it? That's your problem. Just write the byte array directly to the file stream.
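A minimal sketch of that fix, using the buffer size and filename from the question (the serial port is assumed to be open, and the count returned by Read is used so only the bytes actually received are written):
byte[] data = new byte[100];
int bytesRead = serialPort.Read(data, 0, data.Length);
using (var videoFile = new BinaryWriter(File.Open("video.mjpeg", FileMode.Create)))
{
    videoFile.Write(data, 0, bytesRead); // raw bytes, no string conversion
}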

Receiving "Not a WAVE file - no RIFF header" when creating wave stream from mp3 conversion stream

I'm trying to convert an MP3 into a WAV stream using NAudio. Unfortunately I receive the error Not a WAVE file - no RIFF header on the third line, when creating the wave reader.
var mp3Reader = new Mp3FileReader(mp3FileLocation);
var pcmStream = WaveFormatConversionStream.CreatePcmStream(mp3Reader);
var waveReader = new WaveFileReader(pcmStream);
Shouldn't these streams work together properly? My goal is to combine several MP3s and WAVs into a single stream for both playing and saving to disk (as a WAV).
I'm going to preface this with a note that I've never used NAudio. Having said that, there's a guide to concatenating audio on their Github site.
Having looked at the API, you can't use Mp3FileReader directly as it doesn't implement ISampleProvider. However, you can use AudioFileReader instead.
Assuming you have an IEnumerable<string> (e.g. a List or array) named files containing the filenames you want to join:
var sampleList = new List<ISampleProvider>();
foreach (string file in files)
{
    sampleList.Add(new AudioFileReader(file));
}
WaveFileWriter.CreateWaveFile16("outfilenamegoeshere.wav", new ConcatenatingSampleProvider(sampleList));
You don't need the second and third lines. Mp3FileReader will convert to PCM for you, and you can play it directly with a player like WaveOutEvent. To actually produce a WAV file on disk, pass it into WaveFileWriter.CreateWaveFile:
using (var reader = new Mp3FileReader(mp3FileLocation))
{
    WaveFileWriter.CreateWaveFile("output.wav", reader); // CreateWaveFile also needs an output path
}

NAudio MediaFoundationRecorder EncodeToAAC Byte array instead of file

What I'm currently doing is: while recording, I get the bytes from the data-available event and encode them with a codec. The codecs I currently have set up are ALaw, G722, GSM, etc. (like in the NetworkChatDemo).
The code used to start recording
If waveIn Is Nothing Then
    waveIn = New WaveIn
    waveIn.DeviceNumber = 0
    waveIn.BufferMilliseconds = 100
    waveIn.WaveFormat = codec.RecordFormat
    waveIn.StartRecording()
End If
The code used to work with the recorded bytes
Private Sub OnDataAvailable(ByVal sender As Object, ByVal e As WaveInEventArgs) Handles waveIn.DataAvailable
    Dim encoded As Byte() = codec.Encode(e.Buffer, 0, e.BytesRecorded)
    ...
    ' Send encoded audio over UdpClient
End Sub
This is working well for me but I'd like to add support for the AAC format. So I'd like to achieve the same result as in the above DataReceived event, a byte array of encoded AAC audio which I can send over udp, from MediaFoundationEncoder.EncodeToAac(...) or another class/function I am not aware of within NAudio. As far as I can tell, MediaFoundationEncoder.EncodeToAac(...) only encodes audio to a file and not a byte array. It also doesn't accept a byte array as a parameter. Is there a way to achieve what I want or does NAudio only let me work with an AAC file after it has been fully recorded and saved?
Yes, as Corey says, this is a new addition to NAudio for 1.7, and adding support to encode to a Stream is something I would definitely like to add in the future.
If you don't mind working at a lower level, you could try to construct the AAC encoder as an MFT by creating a derived class from MediaFoundationTransform (see MediaFoundationResampler as an example of how to do this).

Record to a MemoryStream using NAudio

I'm trying to use NAudio to record from the microphone and save that data to a MemoryStream. There doesn't seem to be an obvious way to do it, and every time I try to read from my BufferedWaveProvider, the recording no longer plays when using WaveOut.Play. The recording plays fine if I remove my read attempt. Is there a way to record to a MemoryStream?
This is how I've tried to convert it to a MemoryStream:
Byte[] stream = new byte[bwp.BufferedBytes];
bwp.Read(stream, 0, bwp.BufferedBytes);
wo.Play(); //plays fine only if I comment out the Read line above
MemoryStream ms = new MemoryStream(stream);
A WaveProvider can only have one consumer of its Read method. I suggest that at the point you add bytes to your BufferedWaveProvider you also add them to your memory stream. The alternative is to inherit from IWaveProvider, and in the Read method, read from the BufferedWaveProvider and write what you read to the MemoryStream before returning. Then use that WaveProvider to give to WaveOut.
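A minimal sketch of the first suggestion, assuming bwp (the BufferedWaveProvider) and ms (the MemoryStream) are fields set up elsewhere and this handler is attached to waveIn.DataAvailable:
void OnDataAvailable(object sender, WaveInEventArgs e)
{
    bwp.AddSamples(e.Buffer, 0, e.BytesRecorded); // feeds WaveOut playback as before
    ms.Write(e.Buffer, 0, e.BytesRecorded);       // keeps a copy of the same bytes
}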
There is a project on CodeProject, with source code you can download, that can help you do what you are trying to do.
There is an example project on codeproject doing this:
http://www.codeproject.com/KB/cs/Streaming_wave_audio.aspx
I don't know how low the latency is.
As a codec I'd recommend Speex (at least for speech). It's free, open source, and offers low latency and low bandwidth.

Converting wav file to wav file (changing format)

Trying to find a workaround for my previous question, I'd like to convert a 16 kHz, 8-bit, mono WAV held in a byte[] (which includes the WAV header) to an 8 kHz, 8-bit, mono stream/byte[].
Is there any .Net library with samples available for this kind of conversion?
Thank you.
Thanks for the answers. I ended up using NAudio, and with the following snippet, voilà! Everything works like a charm:
WaveFormat target = new WaveFormat(8000, 8, 1);
WaveStream stream = new WaveFileReader("c:\\test.wav");
WaveFormatConversionStream str = new WaveFormatConversionStream(target, stream);
WaveFileWriter.CreateWaveFile("c:\\converted.wav", str);
Alvas seems to support conversion as well as the usual features:
http://alvas.net/alvas.audio.aspx
