Decoding H.264 NALU Stream in C#

1. Live Bosch IP camera stream
2. Stream format is H.264 AVC
3. The NALUs are packed in the stream item as described in "ISO/IEC 14496-15 Advanced Video Coding (AVC) file format", i.e. single NALUs are separated by a 4-byte length field in network byte order.
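The two byte-array constants the snippet below relies on are not shown in the original; a plausible definition, assuming the standard 4-byte Annex B start code and a 6-byte access unit delimiter NALU (which matches the 6-byte headroom mentioned in the comments), would be:

// Assumed definitions (not part of the original snippet): the Annex B
// start code used as NALU separator, and an access unit delimiter NALU
// (type 9, "any picture type" payload) preceded by a start code.
private static readonly byte[] H264_NALU_SEPARATOR = { 0x00, 0x00, 0x00, 0x01 };
private static readonly byte[] H264_ACCESS_DELIMITER = { 0x00, 0x00, 0x00, 0x01, 0x09, 0xF0 };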
private unsafe byte[] PostProcessPayload(Guid codingType, byte[] dataBuffer)
{
    if ((codingType == VsdkGuids.CodingType_H264) && m_convertH264StreamItems)
    {
        //
        // "This stream item consists of one or multiple H.264 Network Abstraction Layer Units
        // (NALUs) ITU-T Rec. H.264 / ISO/IEC 14496-10 Advanced Video Coding belonging to the same timestamp.
        // It is guaranteed that each item including an IDR picture will also include the corresponding picture
        // parameter set and sequence parameter set NALUs.
        // The NALUs are packed in the stream item as described in
        // "ISO/IEC 14496-15 Advanced Video Coding (AVC) file format", i.e. single NALUs are separated
        // by a 4 byte length field in network byte order.
        // Since the frame boundary is implicitly communicated (each stream item represents a frame),
        // frames usually do not carry an explicit access unit delimiter at the end.
        // In case interlaced video is received, two fields are packed into a single stream item."
        //
        // The following code section replaces the 4-byte NALU lengths with NALU separators
        // (Annex B start codes) and appends an access unit delimiter at frame end.
        // The data buffer that contains the H.264 frame must be allocated with enough headroom
        // (6 bytes after the last NALU) for the access unit delimiter. When such a processed
        // payload is written into a file of type *.h264, a player such as VLC is able to play
        // that file (in this complete C# sample: connect a device proxy, set the property
        // ConvertH264StreamItems of a video datastream to "true", add the device proxy to the
        // end nodes, start the video stream, and stop it after a while. Then simply rename the
        // written DataStream_{unique identifier}_Video.dat file to *.h264 and play it with VLC).
        int remainingLength = dataBuffer.Length - H264_ACCESS_DELIMITER.Length;
        fixed (byte* pDataBuffer = &dataBuffer[0], pDelimiterBuffer = &H264_ACCESS_DELIMITER[0], pSeparatorBuffer = &H264_NALU_SEPARATOR[0])
        {
            byte* pData = pDataBuffer;
            while (remainingLength > H264_NALU_SEPARATOR.Length)
            {
                int naluLength = System.Net.IPAddress.NetworkToHostOrder(*(Int32*)pData);
                // Replace the NALU length field with the NALU separator (start code).
                for (int i = 0; i < H264_NALU_SEPARATOR.Length; i++)
                    pData[i] = pSeparatorBuffer[i];
                // Advance to the next NALU.
                int offsetToNextNalu = H264_NALU_SEPARATOR.Length + naluLength;
                pData += offsetToNextNalu;
                remainingLength -= offsetToNextNalu;
            }
            if (remainingLength != 0)
                Common.Log("Warning: Inconsistency in postprocessing of H.264 stream item, remaining length {0} is not zero", remainingLength);
            // Append the access unit delimiter after the last NALU; there will be no access
            // violation here because the buffer was allocated with enough headroom.
            for (int i = 0; i < H264_ACCESS_DELIMITER.Length; i++)
                pData[i] = pDelimiterBuffer[i];
        }
    }
    return dataBuffer;
}
How do I get a frame image from the dataBuffer in C#?

Related

How to decompress just one frame using NAudio?

I am simply trying to take the first frame (or any frame) from an MP3 file and then decompress it.
internal void Read3(string location) // takes in the file location
{
    Mp3FileReader mp3 = new Mp3FileReader(location); // make an Mp3FileReader
SECTIONA: // to jump back here when needed
    byte[] _passedBuffer = Decompress(mp3.ReadNextFrame()); // the decompressed byte array is passed back here
    int jump_size = mp3.WaveFormat.Channels * 2; // just to get how many bytes to skip per sample
    for (int i = 0; i < _passedBuffer.Length; i += jump_size)
    {
        short FinalSample = BitConverter.ToInt16(_passedBuffer, i); // converting the bytes to Int16, nothing special here
        if (jump_size == 4)
        {
            // stereo: average the left and right channels into one sample
            FinalSample = (short)(((BitConverter.ToInt16(_passedBuffer, i + 2)) + FinalSample) / 2);
        }
        Console.Write(FinalSample + "|"); // and writing it to the console
    }
    Console.WriteLine("Frames are written, continue to next frame?");
    if (Convert.ToChar(Console.Read()) == 'y') // asking whether to go on
    { goto SECTIONA; }
}

private byte[] Decompress(Mp3Frame fm)
{
    var buffer = new byte[16384 * 4]; // big enough buffer size
    WaveFormat wf = new Mp3WaveFormat(fm.SampleRate, fm.ChannelMode == ChannelMode.Mono ? 1 : 2, fm.FrameLength, fm.BitRate); // creating a new WaveFormat
    IMp3FrameDecompressor decompressor = new AcmMp3FrameDecompressor(wf); // passing it to AcmMp3FrameDecompressor
    decompressor.DecompressFrame(fm, buffer, 0); // running the DecompressFrame method and then passing back the buffer
    return buffer;
}
The Mp3FileReader is reading the frame correctly; I checked the frame's RawData. Now I am trying to decompress that frame and then convert its PCM data to Int16 in that one for loop, but every FinalSample value comes back 0.
I know that just using Mp3FileReader.Read(buffer, offset, length) will get the job done, but it does so for all the frames, so:
How do I do it for just one frame?
What is wrong with my code that makes it return all zeros?
I know the RawData is OK, so something must be wrong with the Decompress method. How do I set up a decompressor for an MP3 file?
You can use the AcmMp3FrameDecompressor or DmoMp3FrameDecompressor to decompress individual MP3 frames.
You need to check the return value of Decompress to see how much data was returned. It's possible for the first frame not to return anything.
You should also create a single frame decompressor and use it for all frames in the MP3 file, as it retains state between calls.
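Putting those three points together, a minimal sketch of the corrected flow might look like this (assuming NAudio; ReadOneFrame is a hypothetical rewrite of Read3):

using System;
using NAudio.Wave;

internal void ReadOneFrame(string location)
{
    using (var mp3 = new Mp3FileReader(location))
    {
        // Build ONE decompressor from the first frame and reuse it;
        // it retains state between calls.
        IMp3FrameDecompressor decompressor = null;
        var buffer = new byte[16384 * 4];
        Mp3Frame frame;
        while ((frame = mp3.ReadNextFrame()) != null)
        {
            if (decompressor == null)
            {
                var wf = new Mp3WaveFormat(frame.SampleRate,
                    frame.ChannelMode == ChannelMode.Mono ? 1 : 2,
                    frame.FrameLength, frame.BitRate);
                decompressor = new AcmMp3FrameDecompressor(wf);
            }
            // Use the RETURN VALUE: it says how many bytes were produced.
            int decompressed = decompressor.DecompressFrame(frame, buffer, 0);
            if (decompressed == 0)
                continue; // the first frame may produce no output
            // Print every 16-bit sample (channels interleaved).
            for (int i = 0; i + 1 < decompressed; i += 2)
                Console.Write(BitConverter.ToInt16(buffer, i) + "|");
            break; // one frame is enough for this example
        }
        decompressor?.Dispose();
    }
}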

Read a zip entry and save it in a string

Ack. I am trying to open a specific entry in a zip file archive and store the contents in a string, instead of saving it to a file. I cannot use disk space for this per the client.
Here's what I have:
string scontents = "";
byte[] abbuffer = null;
MemoryStream oms = new MemoryStream();
try
{
    // get the file contents
    ozipentry.Open().CopyTo(oms);
    int length = (int)oms.Length; // get file length
    abbuffer = new byte[length];  // create buffer
    int icount;                   // actual number of bytes read
    int isum = 0;                 // total number of bytes read
    // read until Read method returns 0 (end of the stream has been reached)
    while ((icount = oms.Read(abbuffer, isum, length - isum)) > 0)
    {
        isum += icount; // sum is a buffer offset for next reading
    }
    scontents = BytesToString(abbuffer); // <-- abbuffer is filled with ASCII 0
}
finally
{
    oms.Close();
}
The variable abbuffer is supposed to hold the contents of the stream, but all it holds is a bunch of ASCII zeros, which I guess means it didn't read (or copy) the stream! But I do not get any error messages or anything. Can someone tell me how to get this working?
I've looked everywhere on Stack Overflow and on the web, and nowhere does anyone answer this question specifically for the .NET 4.5 ZipArchive library. I cannot use any other library, so while an answer using one would be educational, it won't help me in this instance. Thanks so much for any help!
One more thing: 'ozipentry' is of type ZipArchiveEntry and is an element in a ZipArchive Entries array (i.e. ozipentry = oziparchive.Entries[i]).
Oops, one more thing! The function 'BytesToString' is not included because it is irrelevant; before the function is called, the abbuffer array is already filled with 0's.
OK, sorry for being so dense. I realized I was overthinking this. I changed the function to do this:
osr = new StreamReader(ozipentry.Open(), Encoding.Default);
scontents = osr.ReadToEnd();
And it worked fine! Didn't even have to worry about Encoding...
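For the record, the original loop returned zeros because CopyTo leaves the MemoryStream position at the end of the stream, so every subsequent Read returns 0 and abbuffer keeps its zero-initialized contents. A sketch of two ways to fix the byte-array approach, assuming ozipentry as above:

using (var oms = new MemoryStream())
{
    ozipentry.Open().CopyTo(oms);
    // Option 1: a MemoryStream can hand over its contents directly.
    byte[] abbuffer = oms.ToArray();
    // Option 2: rewind before reading so Read starts at the beginning.
    oms.Seek(0, SeekOrigin.Begin);
    // ... oms.Read(...) now returns actual data instead of 0.
}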

Why does SSL Stream use a byte array of 2048?

After reading the TCP SSL example from MSDN, I see they use a byte array to read data from the stream. Is there a reason the array is limited to 2048 bytes? What if TCP sends a message longer than 2048 bytes? Also, how does reading buffer.Length bytes keep consuming the stream as it changes? It doesn't make complete sense to me. Why read the length of the buffer; wouldn't you want to read the length of the incremental bytes coming into the stream?
static string ReadMessage(SslStream sslStream)
{
    // Read the message sent by the client.
    // The client signals the end of the message using the "<EOF>" marker.
    byte[] buffer = new byte[2048];
    StringBuilder messageData = new StringBuilder();
    int bytes = -1;
    do
    {
        // Read the client's test message.
        bytes = sslStream.Read(buffer, 0, buffer.Length);
        // Use the Decoder class to convert from bytes to UTF8
        // in case a character spans two buffers.
        Decoder decoder = Encoding.UTF8.GetDecoder();
        char[] chars = new char[decoder.GetCharCount(buffer, 0, bytes)];
        decoder.GetChars(buffer, 0, bytes, chars, 0);
        messageData.Append(chars);
        // Check for EOF or an empty message.
        if (messageData.ToString().IndexOf("<EOF>") != -1)
        {
            break;
        }
    } while (bytes != 0);
    return messageData.ToString();
}
When reading from any kind of Stream (not just SslStream), asking for X bytes does not mean you get X bytes: the call may return anywhere between 1 and X bytes. That is what the return value of Read is for, to see how many bytes you actually read.
If you give Read a giant buffer, it is still only going to fill in the first few thousand or so bytes per call, so a bigger buffer is just wasted space. If you need more data than the buffer holds, you make multiple requests for data.
An important thing to remember when dealing with network streams: one write on the sender does not mean one read on the receiver. Writes can be combined or split apart at arbitrary points, which is why it is important to have some form of message framing so you can tell when you have "all the data". In the example you posted, the marker <EOF> signals "end of message".
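A minimal sketch of one common framing scheme, a 4-byte length prefix, in case an end marker like <EOF> is not an option (the helper names here are made up for the example):

using System;
using System.IO;
using System.Net;
using System.Text;

// Sender: prefix each message with its length in network byte order.
static void WriteMessage(Stream stream, string message)
{
    byte[] payload = Encoding.UTF8.GetBytes(message);
    byte[] prefix = BitConverter.GetBytes(IPAddress.HostToNetworkOrder(payload.Length));
    stream.Write(prefix, 0, prefix.Length);
    stream.Write(payload, 0, payload.Length);
}

// Receiver: read the length, then loop until that many bytes have arrived.
static string ReadMessage(Stream stream)
{
    int length = IPAddress.NetworkToHostOrder(BitConverter.ToInt32(ReadExactly(stream, 4), 0));
    return Encoding.UTF8.GetString(ReadExactly(stream, length));
}

// Read may return anywhere between 1 and count bytes, so keep reading.
static byte[] ReadExactly(Stream stream, int count)
{
    byte[] buffer = new byte[count];
    int offset = 0;
    while (offset < count)
    {
        int read = stream.Read(buffer, offset, count - offset);
        if (read == 0) throw new EndOfStreamException("Connection closed mid-message.");
        offset += read;
    }
    return buffer;
}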

Problems with playing AAC Raw data on Silverlight (Windows Phone)

I need to play AAC LC audio that comes from a live stream.
To achieve that, I've implemented a MediaStreamSource.
When I receive the first packets of the stream, I set the MediaElement's source to my MediaStreamSource.
It seems that everything works fine: OpenMediaAsync is called -> reported with ReportOpenMediaCompleted, then GetSampleAsync is called -> reported with ReportGetSampleCompleted. BUT, on the 10th call of GetSampleAsync, ReportGetSampleCompleted throws a NullReferenceException.
Here is my CodecPrivateData:
var waveFormat = new AACWaveFormat();
waveFormat.FormatTag = 0xFF;
waveFormat.Channels = 2;       // my stream is always stereo
waveFormat.Frequency = 44100;  // my stream is always 44.1 kHz
waveFormat.BitsPerSample = 16; // my stream is always 16-bit
waveFormat.BlockAlign = waveFormat.Channels * waveFormat.BitsPerSample / 8; // is this the right formula?
waveFormat.AverageBytesPerSecond = waveFormat.Frequency * waveFormat.BlockAlign; // is this the right formula? This usually comes out to 176400 (~1411 kbps) - is that a realistic value for audio?
waveFormat.ExtraDataSize = 2;  // usually, but I read these values from the first packet of the stream
waveFormat.ExtraData = AudioSpecificConfig; // AudioSpecificConfig is usually 2 bytes long, read from the stream
The first packet of the stream is always an AACSequenceHeader, from which I read my CodecPrivateData and AudioSpecificConfig. All the rest are AACRaw.
My CodecPrivateData looks like
FF00020044AC000010B102000400100002001210.
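For what it's worth, that hex string decodes as the little-endian WAVEFORMATEX-style fields set above, so the values are at least self-consistent:

FF 00        FormatTag             = 0x00FF
02 00        Channels              = 2
44 AC 00 00  Frequency             = 44100
10 B1 02 00  AverageBytesPerSecond = 176400
04 00        BlockAlign            = 4
10 00        BitsPerSample         = 16
02 00        ExtraDataSize         = 2
12 10        ExtraData (AudioSpecificConfig, 2 bytes)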
My GetSampleAsync:
protected override void GetSampleAsync(MediaStreamType mediaStreamType)
{
    var audioStreamDescription = new MediaStreamDescription(MediaStreamType.Audio, AudioStreamAttibutes); // AudioStreamAttibutes is a field filled in the OpenMediaAsync step
    //using (var memoryStream = new MemoryStream(AudioPackets[0].Data))
    var memoryStream = new MemoryStream(AudioPackets[0].Data);
    // The next line throws NullReferenceException, although when the debugger stops
    // I can see that all passed params are not null!
    ReportGetSampleCompleted(new MediaStreamSample(audioStreamDescription, memoryStream, 0, AudioPackets[0].Data.Length, TimeSpan.FromMilliseconds(GetAudioSampleCalls++ * 32).Ticks, new Dictionary<MediaSampleAttributeKeys, String>()));
}
The problem here is that I don't know any timestamp, and I don't know whether this could be the problem.
And finally, what is the Data field? Data contains all the received raw AAC audio as a Byte[] that I extract from the AudioTag. (See E.4.2.2 AACAUDIODATA at http://download.macromedia.com/f4v/video_file_format_spec_v10_1.pdf)
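On the timestamp question: an AAC-LC frame carries 1024 PCM samples, so at 44.1 kHz each sample should advance by 1024/44100 s (about 23.2 ms) rather than the hard-coded 32 ms. Whether the wrong duration explains the failure on the 10th call is an assumption, not a confirmed diagnosis, but computing the timestamp from the sample rate is a safer sketch:

// Hypothetical timestamp for the Nth delivered sample, assuming
// 1024 samples per AAC-LC frame at a 44100 Hz sample rate.
const int SamplesPerFrame = 1024;
const int SampleRate = 44100;
long timestampTicks = GetAudioSampleCalls * SamplesPerFrame * TimeSpan.TicksPerSecond / SampleRate;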

How to save a Stream on the server without a dialog box in Silverlight 5?

I created an app which records the user's voice from the system mic and saves the file on the system hard disk using a dialog box, and that is working fine!
But my requirement is: I want to save the Stream on the server without a dialog box.
This is what I have tried so far:
private SaveFileDialog saveFileDialog = new SaveFileDialog()
{
    Filter = "Audio files(*.wav)|*.wav"
};

protected void SaveFile()
{
    if (saveFileDialog.ShowDialog() == false)
    {
        return;
    }
    StatusText = "Saving...";
    Stream stream = saveFileDialog.OpenFile();
    WavManager.SavePcmToWav(_sink.BackingStream, stream, _sink.CurrentFormat);
    stream.Close();
    MessageBox.Show("Your record is saved.");
    GoToStartState();
}
public class WavManager
{
    public static void SavePcmToWav(Stream rawData, Stream output, AudioFormat audioFormat)
    {
        if (audioFormat.WaveFormat != WaveFormatType.Pcm)
            throw new ArgumentException("Only PCM coding is supported.");
        BinaryWriter bwOutput = new BinaryWriter(output);
        // Write the WAV header.
        // Refer to http://technology.niagarac.on.ca/courses/ctec1631/WavFileFormat.html
        // for details on the format.
        // Note that we use ToCharArray() when writing fixed strings
        // to force the char[] overload, because
        // Write(string) writes the string prefixed by its length.
        // -- RIFF chunk
        bwOutput.Write("RIFF".ToCharArray());
        // Total length of package to follow.
        // Computed as data length plus the header length, minus the data
        // we have written so far and this field (44 - 4 ("RIFF") - 4 (this field)).
        bwOutput.Write((uint)(rawData.Length + 36));
        bwOutput.Write("WAVE".ToCharArray());
        // -- FORMAT chunk
        bwOutput.Write("fmt ".ToCharArray());
        // Length of FORMAT chunk (binary, always 0x10)
        bwOutput.Write((uint)0x10);
        // Always 0x01
        bwOutput.Write((ushort)0x01);
        // Channel count (always 0x01 = mono, 0x02 = stereo)
        bwOutput.Write((ushort)audioFormat.Channels);
        // Sample rate (binary, in Hz)
        bwOutput.Write((uint)audioFormat.SamplesPerSecond);
        // Bytes per second
        bwOutput.Write((uint)(audioFormat.BitsPerSample * audioFormat.SamplesPerSecond * audioFormat.Channels / 8));
        // Bytes per sample: 1 = 8-bit mono, 2 = 8-bit stereo or 16-bit mono, 4 = 16-bit stereo
        bwOutput.Write((ushort)(audioFormat.BitsPerSample * audioFormat.Channels / 8));
        // Bits per sample
        bwOutput.Write((ushort)audioFormat.BitsPerSample);
        // -- DATA chunk
        bwOutput.Write("data".ToCharArray());
        // Length of data to follow
        bwOutput.Write((uint)rawData.Length);
        // Raw PCM data follows...
        // Reset position in rawData and remember its original position
        // to restore at the end.
        long originalRawDataStreamPosition = rawData.Position;
        rawData.Seek(0, SeekOrigin.Begin);
        // Append all data from the rawData stream to the output stream.
        byte[] buffer = new byte[4096];
        int read; // number of bytes read in one iteration
        while ((read = rawData.Read(buffer, 0, 4096)) > 0)
        {
            bwOutput.Write(buffer, 0, read);
        }
        rawData.Seek(originalRawDataStreamPosition, SeekOrigin.Begin);
    }
}
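To actually get rid of the dialog, one option is to write the WAV into a MemoryStream and upload the bytes; a rough sketch, assuming Silverlight's WebClient and a server-side upload handler you would have to provide yourself (the URL and handler name are placeholders):

protected void SaveFileToServer()
{
    // Write the WAV into memory instead of a dialog-provided file stream.
    var wavStream = new MemoryStream();
    WavManager.SavePcmToWav(_sink.BackingStream, wavStream, _sink.CurrentFormat);
    byte[] wavBytes = wavStream.ToArray();

    var client = new WebClient();
    client.OpenWriteCompleted += (s, e) =>
    {
        using (Stream upload = e.Result)
        {
            upload.Write(wavBytes, 0, wavBytes.Length);
        }
        // Report "saved" to the UI here if needed.
    };
    // POST the bytes to a hypothetical server-side handler.
    client.OpenWriteAsync(new Uri("http://yourserver/UploadWav.ashx"), "POST");
}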
