What I'm currently doing is: while recording, I take the bytes from the DataAvailable event and encode them with a codec. The codecs I currently have set up are ALaw, G722, GSM, etc. (as in the NetworkChatDemo).
The code used to start recording:
If waveIn Is Nothing Then
    waveIn = New WaveIn()
    waveIn.DeviceNumber = 0
    waveIn.BufferMilliseconds = 100
    waveIn.WaveFormat = codec.RecordFormat
    waveIn.StartRecording()
End If
The code used to work with the recorded bytes:
Private Sub OnDataAvailable(ByVal sender As Object, ByVal e As WaveInEventArgs) Handles waveIn.DataAvailable
    Dim encoded As Byte() = codec.Encode(e.Buffer, 0, e.BytesRecorded)
    ...
    ' Send encoded audio over UdpClient
End Sub
This is working well for me, but I'd like to add support for the AAC format. So I'd like to achieve the same result as in the DataAvailable event above, namely a byte array of encoded AAC audio which I can send over UDP, from MediaFoundationEncoder.EncodeToAac(...) or another class/function within NAudio that I am not aware of. As far as I can tell, MediaFoundationEncoder.EncodeToAac(...) only encodes audio to a file, not to a byte array, and it doesn't accept a byte array as a parameter either. Is there a way to achieve what I want, or does NAudio only let me work with an AAC file after it has been fully recorded and saved?
Yes, as Corey says, this is a new addition to NAudio 1.7, and support for encoding to a Stream is something I would definitely like to add in the future.
If you don't mind working at a lower level, you could try to construct the AAC encoder as an MFT by creating a class derived from MediaFoundationTransform (see MediaFoundationResampler for an example of how to do this).
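In the meantime, one possible stopgap is to buffer the recorded PCM, encode it to a temporary file, and read the bytes back. This is only a sketch I haven't tested, assuming NAudio 1.7+ on Windows 7 or later (Media Foundation must be available); `EncodeChunkToAac` and the temp-file name are my own, not part of NAudio:

```csharp
using System.IO;
using NAudio.MediaFoundation;
using NAudio.Wave;

// Sketch only: encode one buffered PCM chunk to AAC via a temporary file.
// MediaFoundationApi.Startup() may need to be called once at application start.
byte[] EncodeChunkToAac(byte[] pcm, WaveFormat format, int bitRate)
{
    string tempFile = Path.Combine(Path.GetTempPath(), "chunk.aac");
    using (var source = new RawSourceWaveStream(new MemoryStream(pcm), format))
    {
        MediaFoundationEncoder.EncodeToAac(source, tempFile, bitRate);
    }
    return File.ReadAllBytes(tempFile);
}
```

Be aware that each chunk comes back wrapped in its own container rather than as a continuous AAC elementary stream, so for low-latency streaming over UDP the MediaFoundationTransform route is likely the better fit.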
I have a project with the Raspberry Pi Camera V2. One PC is used to encode the captured video in MJPEG format and send it over the serial port. My PC receives the data, saves it to a .mjpeg file, and plays it with an MJPEG-to-MP4 converter.
I am trying to save the data in these lines:
byte[] data = new byte[100];
serialPort.Read(data, 0, 100);
BinaryWriter videoFile = new BinaryWriter(File.Open("video.mjpeg", FileMode.Create));
string dataAscii = System.Text.Encoding.UTF8.GetString(data); // byte array to string
videoFile.Write(dataAscii); // write what was received
It works in the sense that it creates a .mjpeg file, but I couldn't get the converter to play it. Maybe I should save the data frame by frame, or save it in some other way. I have no idea what I am doing wrong.
Any ideas, many thanks!
Kane
Why are you converting the byte array into a string before writing it? That's your problem: UTF-8 decoding is lossy on arbitrary binary data (invalid byte sequences get replaced), so the round-trip corrupts the stream. Just write the byte array directly to the file stream.
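A minimal sketch of the direct-write approach, with two hard-coded stand-in chunks where your `serialPort.Read(...)` results would go:

```csharp
using System;
using System.IO;

// Two stand-in chunks in place of the serialPort.Read(...) results
byte[][] chunks = { new byte[] { 0xFF, 0xD8 }, new byte[] { 0xFF, 0xD9 } };

// Open the file once, then write each received chunk as raw bytes,
// with no string conversion anywhere in between
using (var videoFile = new BinaryWriter(File.Open("video.mjpeg", FileMode.Create)))
{
    foreach (var chunk in chunks)
        videoFile.Write(chunk, 0, chunk.Length);
}

Console.WriteLine(new FileInfo("video.mjpeg").Length);  // 4
```

In your loop, pass the actual count returned by `serialPort.Read` as the length argument, since a read may return fewer than 100 bytes.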
I am trying to create a WAV object from a WMA byte array. I've tried the following code, but the software exits, and in the dump file I found the following exception when creating fileReader:
"The thread tried to read from or write to a virtual address for which it does not have the appropriate access."
System.IO.File.WriteAllBytes("wmatemp.wma", data);
WMAFileReader fileReader = new WMAFileReader("wmatemp.wma");
WaveStream waveStream = WaveFormatConversionStream.CreatePcmStream(fileReader);
WAV wav = new WAV(AudioMemStream(waveStream).ToArray());
I know I should not save the .wma to the HDD, but I don't know how to proceed differently. Any help?
I want to send a byte array to the speaker, something like this:
byte[] bt = { 12, 32, 43, 74, 23, 53, 24, 54, 234, 253, 153 }; // example array
var ms = new MemoryStream(bt);
var sound = new System.Media.SoundPlayer();
sound.Stream = ms;
sound.Play();
but I get this exception:
my problem pic http://8pic.ir/images/g699b52xe5ap9s8yf0pz.jpg
The first bytes of a WAV stream contain info about the length, format, etc.
You have to send this "WAV header" as well in the first few bytes.
See http://de.wikipedia.org/wiki/RIFF_WAVE
As you'll see, it's perfectly possible to compose these few header bytes yourself and send them before your raw audio data.
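To illustrate composing that header, here is a minimal sketch. It assumes the raw bytes are 8 kHz, mono, 8-bit PCM (the question never says what format they actually are, so adjust the parameters); `WavHeader` is my own helper name, not a framework API:

```csharp
using System;
using System.IO;
using System.Text;

// Build the canonical 44-byte PCM WAV header for a given format and data length
byte[] WavHeader(int sampleRate, short bitsPerSample, short channels, int dataLength)
{
    short blockAlign = (short)(channels * bitsPerSample / 8);
    int byteRate = sampleRate * blockAlign;
    var ms = new MemoryStream();
    using (var w = new BinaryWriter(ms))
    {
        w.Write(Encoding.ASCII.GetBytes("RIFF"));
        w.Write(36 + dataLength);                // total file size minus 8
        w.Write(Encoding.ASCII.GetBytes("WAVE"));
        w.Write(Encoding.ASCII.GetBytes("fmt "));
        w.Write(16);                             // fmt chunk size for plain PCM
        w.Write((short)1);                       // format tag 1 = PCM
        w.Write(channels);
        w.Write(sampleRate);
        w.Write(byteRate);
        w.Write(blockAlign);
        w.Write(bitsPerSample);
        w.Write(Encoding.ASCII.GetBytes("data"));
        w.Write(dataLength);                     // bytes of audio that follow
        w.Flush();
    }
    return ms.ToArray();
}

byte[] raw = { 12, 32, 43, 74, 23, 53, 24, 54, 234, 253, 153 };
byte[] header = WavHeader(8000, 8, 1, raw.Length);
byte[] wav = new byte[header.Length + raw.Length];
header.CopyTo(wav, 0);
raw.CopyTo(wav, header.Length);
// new System.Media.SoundPlayer(new MemoryStream(wav)).Play();  // now a valid WAV stream
Console.WriteLine(header.Length);  // 44
```

BinaryWriter writes little-endian values, which is exactly what the RIFF format expects, so no byte swapping is needed.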
You can use a library for reading data from the microphone or playing it to the speakers.
I have worked successfully with:
NAudio - http://naudio.codeplex.com/
I would not recommend building a WAV file yourself; it may be more effort than it's worth.
Note that this library (and probably some others; Bass - http://www.un4seen.com is also widely used) also has built-in functionality for saving and reading WAV files.
NAudio is the best fit for this functionality; use the sample app provided. It may help.
I am using the Axis media parser SDK to get an H.264 stream. How can I write that stream directly into an MPEG-4 AVC (MP4) container in C#, without re-encoding and losing video quality?
Currently these two events are fired to write audio and video to a file:
// Event handler callback for video sample buffers
private void OnVideoSample(int cookieID, int sampleType,
    int sampleFlags, ulong startTime, ulong stopTime, object SampleArray)
{
    // Cast the buffer object to a byte array
    byte[] bufferBytes = (byte[])SampleArray;
    Console.WriteLine("OnVideoSample - {0} bytes", bufferBytes.Length);

    // Write the data to the output file
    lock (fileLock)
    {
        outFile.Write(sampleType);
        outFile.Write(sampleFlags);
        outFile.Write(startTime);
        outFile.Write(stopTime);
        outFile.Write(bufferBytes.Length);
        outFile.Write(bufferBytes, 0, bufferBytes.Length);
    }
}

// Event handler callback for audio sample buffers
private void OnAudioSample(int cookieID, int sampleType,
    int sampleFlags, ulong startTime, ulong stopTime, object SampleArray)
{
    // Cast the buffer object to a byte array
    byte[] bufferBytes = (byte[])SampleArray;
    Console.WriteLine("OnAudioSample - {0} bytes", bufferBytes.Length);

    // Write the data to the output file
    lock (fileLock)
    {
        outFile.Write(sampleType);
        outFile.Write(sampleFlags);
        outFile.Write(startTime);
        outFile.Write(stopTime);
        outFile.Write(bufferBytes.Length);
        outFile.Write(bufferBytes, 0, bufferBytes.Length);
    }
}
You can build a DirectShow pipeline with your own custom source filters (e.g. based on/derived from the `PushSource` SDK sample) which inject the data provided by the events quoted above. These two filters connect to a so-called MPEG-4 multiplexer, such as the free one from GDCL or a compatible one, and the multiplexer in turn connects to a file writer. This pipeline converts the data from the events into an MP4 file.
Writing filters assumes C++ development, though. I am not sure, but DirectShowLib might provide a sort of bridge into the managed environment, so you could end up avoiding C++ entirely.
You can also use the Intel IPP library (Intel Media SDK?) to produce MP4 files, though I think it also requires C++ rather than C#.
I am trying to load a *.wav file to a byte array using C# 3.0 and .NET 3.5 like this:
var fs = File.Open(filedialog.FileName, FileMode.Open, FileAccess.Read);
long numBytes = new FileInfo(filedialog.FileName).Length;
BinaryReader br = new BinaryReader(fs);
byte[] bytes = br.ReadBytes((int)numBytes);
From byte[58] to the end (~50k bytes), all values are 127 or 128 (I guess the first ~58 bytes are header stuff?).
The wave file is playing fine in Windows media player and other players, and I am sure it is nothing wrong with it (it's recorded with the sound recorder in WinXP).
Wave file info:
BitRate: 176kbps
Audio sample size: 8bit
Audio sample rate: 22kHz
Audio format: PCM
When I try to play the byte stream using the .NET SoundPlayer it sounds terrible :-)
Any ideas?
[SOLVED]
This was not the problem after all, so I'll have to continue my search for the real bug.
The code looks all right, as far as I can see.
You could try the simpler code:
byte[] bytes = File.ReadAllBytes(filedialog.FileName);
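On the "first ~58 bytes are header stuff" guess: the header is not always exactly 44 bytes, because recorders often insert extra chunks (e.g. LIST/INFO or fact) before the data chunk, which may be why the samples seemed to start around byte 58. A sketch that walks the RIFF chunks to find where the audio actually begins (`FindDataOffset` is my own helper name):

```csharp
using System;
using System.Text;

// Walk the RIFF chunks of a WAV byte array until the "data" chunk is found;
// returns the offset where the raw samples begin.
int FindDataOffset(byte[] wav)
{
    int pos = 12;                              // skip "RIFF" + size + "WAVE"
    while (pos + 8 <= wav.Length)
    {
        string id = Encoding.ASCII.GetString(wav, pos, 4);
        int size = BitConverter.ToInt32(wav, pos + 4);
        if (id == "data") return pos + 8;      // samples start after the chunk header
        pos += 8 + size + (size & 1);          // chunk bodies are word-aligned
    }
    throw new InvalidOperationException("no data chunk found");
}

// Minimal in-memory example: RIFF header, zeroed 16-byte fmt chunk, 4-byte data chunk
byte[] demo = new byte[48];
Encoding.ASCII.GetBytes("RIFF").CopyTo(demo, 0);
BitConverter.GetBytes(40).CopyTo(demo, 4);
Encoding.ASCII.GetBytes("WAVE").CopyTo(demo, 8);
Encoding.ASCII.GetBytes("fmt ").CopyTo(demo, 12);
BitConverter.GetBytes(16).CopyTo(demo, 16);
Encoding.ASCII.GetBytes("data").CopyTo(demo, 36);
BitConverter.GetBytes(4).CopyTo(demo, 40);
Console.WriteLine(FindDataOffset(demo));       // 44
```

If the samples really are all 127/128, that is near-silence for 8-bit PCM (which is unsigned, centred on 128), so the "terrible" playback was likely a separate bug, consistent with the [SOLVED] note above.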