I've already searched on Stack Overflow and Google, but haven't found what I'm looking for.
So far I have the raw audio data (a WAV file) and I want to visualize it.
private void Form1_Load(object sender, EventArgs e)
{
    FileStream fs = new FileStream("D:\\tada.wav", FileMode.Open);
    BinaryReader reader = new BinaryReader(fs);
    char[] data = new char[4];
    long fsize;
    long wfxSize;
    long dataSize;
    WaveFormatEx wfx;

    // "RIFF" chunk ID, then the overall file size
    reader.Read(data, 0, 4);
    fsize = reader.ReadInt32();

    // "WAVE" type
    reader.Read(data, 0, 4);

    // "fmt " chunk: size, then the format structure
    reader.Read(data, 0, 4);
    wfxSize = reader.ReadInt32();
    byte[] wfxBuffer = new byte[wfxSize];
    reader.Read(wfxBuffer, 0, (int)wfxSize);
    wfx = new WaveFormatEx(wfxBuffer);

    // "data" chunk: size, then the raw sample bytes
    reader.Read(data, 0, 4);
    dataSize = reader.ReadInt32();
    byte[] dataBuff = new byte[dataSize];
    reader.Read(dataBuff, 0, (int)dataSize);
    reader.Close();

    // Visualize the data...
}
I know I need to convert the raw data into samples and then find the peak for each block of samples and draw lines, but I really don't know how to do that (except for the drawing).
I see this is an old question, but in case someone is interested, here is a solution:
Use the NAudio library:
http://naudio.codeplex.com/
Here is a video tutorial on how to use NAudio to display waveforms:
http://www.youtube.com/watch?v=ZnFoVuOVrUQ
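If you want something to start from before watching, here is a rough sketch of the usual technique with NAudio: read the samples with AudioFileReader, keep one min/max pair per horizontal pixel, then draw a vertical line for each pair. The buffer sizing and the method name below are my own assumptions, not something taken from the tutorial:
using System;
using System.Collections.Generic;
using NAudio.Wave;

// Hedged sketch: one (min, max) peak pair per pixel column, ready for drawing.
public static List<(float Min, float Max)> GetWaveformPeaks(string path, int pixelWidth)
{
    var peaks = new List<(float Min, float Max)>();
    using (var reader = new AudioFileReader(path))   // delivers 32-bit float samples
    {
        long totalSamples = reader.Length / (reader.WaveFormat.BitsPerSample / 8);
        int samplesPerPixel = (int)Math.Max(1, totalSamples / pixelWidth);
        float[] buffer = new float[samplesPerPixel];
        int samplesRead;
        while ((samplesRead = reader.Read(buffer, 0, buffer.Length)) > 0)
        {
            float min = 0f, max = 0f;
            for (int i = 0; i < samplesRead; i++)
            {
                if (buffer[i] < min) min = buffer[i];
                if (buffer[i] > max) max = buffer[i];
            }
            peaks.Add((min, max));   // one vertical line from min to max per column
        }
    }
    return peaks;
}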
Visualize the data... Wow! You should check out the WAV file spec here and perhaps here and then re-think whether this is something you actually want to tackle. (The second link is actually a better, more streamlined overview. Take a look at the data section to see if it's something you want to work with.)
Don't get me wrong. Maybe this is exactly what you want to do, and it might be fun. You should just know what you're getting into!
Also, here's a Code Project component that you could use outright or look at for ideas.
Related
I am completely new to working with audio. I eventually want to stream an MP3 to a web page and allow the user to alter the tempo. I have an HTML5 audio element set up and it can stream an MP3 fine. I can load the MP3 into an NAudio AudioFileReader and stream that to the page, and that also works fine using the following code:
string fn = Server.MapPath("~/Uploads/Music/" + filename);
AudioFileReader reader = new AudioFileReader(fn);
MemoryStream outputStream = new MemoryStream();
using (NAudio.Wave.WaveFileWriter waveFileWriter = new WaveFileWriter(outputStream, reader.WaveFormat))
{
    byte[] bytes = new byte[reader.Length];
    reader.Position = 0;
    reader.Read(bytes, 0, (int)reader.Length);
    waveFileWriter.Write(bytes, 0, bytes.Length);
    waveFileWriter.Flush();
}
return File(outputStream.ToArray(), "audio/mp3");
I'm not even sure this is the proper way to do it, but I modified some code I found online and it does work. However, when I tried to incorporate the NAudio Varispeed demo, which integrates the SoundTouch library, it stopped working.
I modified my code like this:
string fn = Server.MapPath("~/Uploads/Music/" + filename);
AudioFileReader reader = new AudioFileReader(fn);
bool useTempo = true;
VarispeedSampleProvider speedControl = new VarispeedSampleProvider(reader, 100, new SoundTouchProfile(useTempo, false));
MemoryStream outputStream = new MemoryStream();
using (NAudio.Wave.WaveFileWriter waveFileWriter = new WaveFileWriter(outputStream, reader.WaveFormat))
{
    byte[] bytes = new byte[reader.Length];
    speedControl.Read(bytes.Select(b => (float)Convert.ToDouble(b)).ToArray(), 0, (int)reader.Length);
    waveFileWriter.Write(bytes, 0, bytes.Length);
    waveFileWriter.Flush();
}
return File(outputStream.ToArray(), "audio/mp3");
It builds and appears like it's working but when I hit play, I get no audio.
What am I doing wrong here? Is this not even a good way to accomplish what I want?
You are reading into a temporary array (created by ToArray), so the audio you read is lost.
Instead, declare a float[], read into that, and then write the contents of that into the waveFileWriter.
Also, it is very important to use the return value from Read which will indicate the number of samples actually written into the array.
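A minimal sketch of that fix, assuming the VarispeedSampleProvider from the NAudio demo project and serving the result as a WAV (untested against the original project, so treat the details as assumptions):
string fn = Server.MapPath("~/Uploads/Music/" + filename);
using (AudioFileReader reader = new AudioFileReader(fn))   // delivers 32-bit float samples
{
    var speedControl = new VarispeedSampleProvider(reader, 100, new SoundTouchProfile(true, false));
    MemoryStream outputStream = new MemoryStream();
    using (var waveFileWriter = new WaveFileWriter(outputStream, speedControl.WaveFormat))
    {
        // Roughly one second of samples per read
        float[] buffer = new float[speedControl.WaveFormat.SampleRate * speedControl.WaveFormat.Channels];
        int samplesRead;
        while ((samplesRead = speedControl.Read(buffer, 0, buffer.Length)) > 0)
        {
            // Write only the samples that were actually read
            waveFileWriter.WriteSamples(buffer, 0, samplesRead);
        }
        waveFileWriter.Flush();
    }
    // The stream now contains WAV data, so "audio/wav" is the honest content type here
    return File(outputStream.ToArray(), "audio/wav");
}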
My task is to decode an MP3 file, excluding its header, side information and the optional checksum. I just need the actual data of every frame of the MP3 file.
I have googled a lot but didn't find a way!
Can anyone tell me a direct way to do that? I am using NAudio to access frames via ReadNextFrame().
Any help will be appreciated.
As mentioned in http://mark-dot-net.blogspot.de/2010/11/merging-mp3-files-with-naudio-in-c-and.html you could change the code to something like this:
public static byte[] GetRawMp3Frames(string filename)
{
    using (MemoryStream output = new MemoryStream())
    using (Mp3FileReader reader = new Mp3FileReader(filename))
    {
        Mp3Frame frame;
        while ((frame = reader.ReadNextFrame()) != null)
        {
            output.Write(frame.RawData, 0, frame.RawData.Length);
        }
        return output.ToArray();
    }
}
Then you can process the frame-only bytes by doing this:
var btAllFrames = GetRawMp3Frames("MyMp3.mp3");
EDIT: Looks like this is a dupe question that was answered better here.
ORIGINAL:
Sounds like you want a full MP3 decoder that outputs the main_data chunks instead of decoding them.
Two options:
Build your own reader (complete with bit reservoir calculations for layer III), or
Remove the audio decode logic from an existing decoder and insert your output logic instead.
There are probably some tricks you can apply that would allow you to short-circuit decoding at least some of the header / side info, but that will take a thorough understanding of the specifications.
If you need a place to start for option #2, try searching for NLayer, JLayer, libmad, or "dist10 source".
Apparently NAudio uses NLayer.
This worked for me:
byte[] data = File.ReadAllBytes("path/to/file.mp3");
var memStream = new System.IO.MemoryStream(data);
var mpgFile = new NLayer.MpegFile(memStream);
var samples = new float[mpgFile.Length];
mpgFile.ReadSamples(samples, 0, (int)mpgFile.Length);
I'd like to resample audio: change its sample rate from 44.1 kHz to 11.025 kHz. The input I've got is raw audio in bytes. It really is raw, it has no headers - if I try loading it into a WaveFileReader, I get an exception saying "Not a WAVE file - no RIFF header".
How I'm currently trying to achieve it is something like this (just a really simplified piece of code):
WaveFormat ResampleInputFormat = new WaveFormat(44100, 1);
WaveFormat ResampleOutputFormat = new WaveFormat(11025, 1);

MemoryStream ResampleInputMemoryStream = new MemoryStream();
foreach (var b in InputListOfBytes)
{
    ResampleInputMemoryStream.Write(new byte[] { b }, 0, 1);
}

RawSourceWaveStream ResampleInputWaveStream =
    new RawSourceWaveStream(ResampleInputMemoryStream, ResampleInputFormat);
WaveFormatConversionStream ResampleOutputStream =
    new WaveFormatConversionStream(ResampleOutputFormat, ResampleInputWaveStream);

byte[] bytes = new byte[2];
while (ResampleOutputStream.Read(bytes, 0, 2) > 0)
{
    OutputListOfBytes.Add(bytes[0]);
    OutputListOfBytes.Add(bytes[1]);
}
My problem is that the last loop is an infinite loop: Read() always returns the same values and never advances in the stream. I even tried Seek()-ing to the right position after each Read(), but that doesn't seem to work either; I still always get the same values.
What am I doing wrong? And is this even the right way to resample raw audio? Thanks in advance!
First, you need to reset ResampleInputMemoryStream's position to the start. It may actually have been easier to create the memory stream based on the array:
new MemoryStream(InputListOfBytes)
Second, when reading out of the resampler, you need to read in larger chunks than two bytes at a time. Try at least a second's worth of audio (use ResampleOutputStream.WaveFormat.AverageBytesPerSecond).
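Putting both suggestions together, a hedged sketch (this assumes InputListOfBytes and OutputListOfBytes are List<byte>, as the question's code implies; untested):
byte[] inputBytes = InputListOfBytes.ToArray();

WaveFormat inFormat = new WaveFormat(44100, 1);
WaveFormat outFormat = new WaveFormat(11025, 1);

using (var input = new MemoryStream(inputBytes))                       // position starts at 0
using (var source = new RawSourceWaveStream(input, inFormat))
using (var resampler = new WaveFormatConversionStream(outFormat, source))
{
    // Read about one second per call instead of two bytes at a time
    byte[] buffer = new byte[resampler.WaveFormat.AverageBytesPerSecond];
    int bytesRead;
    while ((bytesRead = resampler.Read(buffer, 0, buffer.Length)) > 0)
    {
        for (int i = 0; i < bytesRead; i++)
        {
            OutputListOfBytes.Add(buffer[i]);
        }
    }
}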
I've been looking over the NAudio examples trying to work out how I can get ulaw samples suitable for packaging up as an RTP payload. I'm attempting to generate the samples from an mp3 file using the code below. Not surprisingly, since I don't really have a clue what I'm doing with NAudio, when I transmit the samples across the network to a softphone all I get is static.
Can anyone provide any direction on how I should be getting 160-byte (8 kHz @ 20 ms) mu-law samples from an MP3 file using NAudio?
private void GetAudioSamples()
{
    var pcmStream = WaveFormatConversionStream.CreatePcmStream(new Mp3FileReader("whitelight.mp3"));
    byte[] buffer = new byte[2];
    byte[] sampleBuffer = new byte[160];
    int sampleIndex = 0;
    int bytesRead = pcmStream.Read(buffer, 0, 2);
    while (bytesRead > 0)
    {
        var ulawByte = MuLawEncoder.LinearToMuLawSample(BitConverter.ToInt16(buffer, 0));
        sampleBuffer[sampleIndex++] = ulawByte;
        if (sampleIndex == 160)
        {
            m_rtpChannel.AddSample(sampleBuffer);
            sampleBuffer = new byte[160];
            sampleIndex = 0;
        }
        bytesRead = pcmStream.Read(buffer, 0, 2);
    }
    logger.Debug("Finished adding audio samples.");
}
Here's a few pointers. First of all, as long as you are using NAudio 1.5, no need for the additional WaveFormatConversionStream - Mp3FileReader's Read method returns PCM.
However, you will not be getting 8kHz out, so you need to resample it first. WaveFormatConversionStream can do this, although it uses the built-in Windows ACM sample rate conversion, which doesn't seem to filter the incoming audio well, so there could be aliasing artefacts.
Also, you usually read bigger blocks than just two bytes at a time as the MP3 decoder needs to decode frames one at a time (the resampler also will want to deal with bigger block sizes). I would try reading at least 20ms worth of bytes at a time.
Your use of BitConverter.ToInt16 is correct for getting the 16-bit sample value, but bear in mind that the MP3 is most likely stereo, with interleaved left and right samples. Are you sure your softphone expects stereo?
Finally, I recommend making a mu-law WAV file as a first step, using WaveFileWriter. Then you can easily listen to it in Windows Media Player and check that what you are sending to your softphone is what you intended.
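A rough sketch of that check (file names are placeholders; the conversion chain mirrors the accepted code that follows):
// Resample the MP3 to 8 kHz 16-bit mono PCM, convert to mu-law, and write a WAV
// file you can listen to in Windows Media Player before sending anything over RTP.
var pcmFormat = new WaveFormat(8000, 16, 1);
var muLawFormat = WaveFormat.CreateMuLawFormat(8000, 1);

using (var mp3 = new Mp3FileReader("whitelight.mp3"))
using (var pcm8k = new WaveFormatConversionStream(pcmFormat, mp3))
using (var muLaw = new WaveFormatConversionStream(muLawFormat, pcm8k))
{
    WaveFileWriter.CreateWaveFile("check.wav", muLaw);
}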
Below is the way I eventually got it working. I do lose one of the channels from the MP3, and I guess there's some way to combine the channels as part of a conversion, but that doesn't matter for my situation.
The 160-byte buffer size gives me 20 ms mu-law samples, which work perfectly with the SIP softphone I'm testing with.
var pcmFormat = new WaveFormat(8000, 16, 1);
var ulawFormat = WaveFormat.CreateMuLawFormat(8000, 1);

using (WaveFormatConversionStream pcmStm = new WaveFormatConversionStream(pcmFormat, new Mp3FileReader("whitelight.mp3")))
{
    using (WaveFormatConversionStream ulawStm = new WaveFormatConversionStream(ulawFormat, pcmStm))
    {
        byte[] buffer = new byte[160];
        int bytesRead = ulawStm.Read(buffer, 0, 160);
        while (bytesRead > 0)
        {
            byte[] sample = new byte[bytesRead];
            Array.Copy(buffer, sample, bytesRead);
            m_rtpChannel.AddSample(sample);
            bytesRead = ulawStm.Read(buffer, 0, 160);
        }
    }
}
I am trying to send a file over a NetworkStream and rebuild it on the client side. I can get the data across correctly (I think), but when I use either a BinaryWriter or a FileStream object to recreate the file, the file is cut off at the beginning, at the same point no matter which approach I use.
private void ReadandSaveFileFromServer(ref TcpClient clientATF, ref NetworkStream currentStream, string locationToSave)
{
    int fileSize = 0;
    string fileName = "";

    fileName = ReadStringFromServer(ref clientATF, ref currentStream);
    fileSize = ReadIntFromServer(ref clientATF, ref currentStream);

    byte[] fileSent = new byte[fileSize];

    if (currentStream.CanRead && clientATF.Connected)
    {
        currentStream.Read(fileSent, 0, fileSent.Length);
        WriteToConsole("Log Recieved");
    }
    else
    {
        WriteToConsole("Log Transfer Failed");
    }

    FileStream fileToCreate = new FileStream(locationToSave + "\\" + fileName, FileMode.Create);
    fileToCreate.Seek(0, SeekOrigin.Begin);
    fileToCreate.Write(fileSent, 0, fileSent.Length);
    fileToCreate.Close();

    //binWriter = new BinaryWriter(File.Open(locationToSave + "\\" + fileName, FileMode.Create));
    //binWriter.Write(fileSent);
    //binWriter.Close();
}
When I step through and check fileName and fileSize, they are correct. The byte[] is also fully populated. Any clue as to what I can do next?
Thanks in advance...
Sean
EDIT!!!:
So I figured out what is happening. When I read the string and then the int from the stream, the byte array used for the string read is 256 bytes long, so the string read is also consuming the int, which then clobbers the other fields. Need to figure this out...
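(A common way to handle that framing problem is to length-prefix each field and read exactly that many bytes; the helper below is a hypothetical sketch, not code from this project.)
// Read exactly 'count' bytes from the stream, looping until the buffer is full.
static byte[] ReadExactly(NetworkStream stream, int count)
{
    byte[] buffer = new byte[count];
    int readPos = 0;
    while (readPos < count)
    {
        int bytesRead = stream.Read(buffer, readPos, count - readPos);
        if (bytesRead == 0) throw new EndOfStreamException("Connection closed early");
        readPos += bytesRead;
    }
    return buffer;
}

// Usage: the sender writes a 4-byte length before each field; the receiver reads
// that length and then exactly that many bytes, so fields can never clobber each other.
int nameLength = BitConverter.ToInt32(ReadExactly(currentStream, 4), 0);
string fileName = Encoding.UTF8.GetString(ReadExactly(currentStream, nameLength));
int fileSize = BitConverter.ToInt32(ReadExactly(currentStream, 4), 0);
byte[] fileSent = ReadExactly(currentStream, fileSize);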
For one thing, you can use the convenience method File.WriteAllBytes to write the data more simply. But I doubt that that's the problem.
You're assuming you can read all the data in a single call to Read. You're ignoring the return value. Don't do that - instead, read multiple times until either you've read everything you expect to, or you've reached the end of the stream. See this article for more details. If you're using .NET 4, there's a new CopyTo method you may find useful.
(As an aside, your use of ref suggests that you don't understand what it really means. It's well worth making sure you understand how arguments are passed in C#.)
To add to Jon Skeet's answer, your reading code should be:
int bytesRead;
int readPos = 0;
do
{
    bytesRead = currentStream.Read(fileSent, readPos, fileSent.Length - readPos);
    readPos += bytesRead;
} while (bytesRead > 0 && readPos < fileSent.Length);
If you are looking for a general solution for sending and receiving files over a network, have you considered using a C# network library? It has probably solved most of the issues you will come across when trying to do this.
Disclaimer: I'm one of the developers of this library.