I created an app that records the user's voice from the system mic and saves the file to the system hard disk using a dialog box, which is working fine!
But my requirement is, I want to save the stream on a server without a dialog box.
This is what I have tried so far:
private SaveFileDialog saveFileDialog = new SaveFileDialog()
{
Filter = "Audio files(*.wav)|*.wav"
};
protected void SaveFile()
{
if (saveFileDialog.ShowDialog() == false)
{
return;
}
StatusText = "Saving...";
Stream stream = saveFileDialog.OpenFile();
WavManager.SavePcmToWav(_sink.BackingStream, stream, _sink.CurrentFormat);
stream.Close();
MessageBox.Show("Your record is saved.");
GoToStartState();
}
public class WavManager
{
public static void SavePcmToWav(Stream rawData, Stream output, AudioFormat audioFormat)
{
if (audioFormat.WaveFormat != WaveFormatType.Pcm)
throw new ArgumentException("Only PCM coding is supported.");
BinaryWriter bwOutput = new BinaryWriter(output);
// Write down the WAV header.
// Refer to http://technology.niagarac.on.ca/courses/ctec1631/WavFileFormat.html
// for details on the format.
// Note that we use ToCharArray() when writing fixed strings
// to force using the char[] overload because
// Write(string) writes the string prefixed by its length.
// -- RIFF chunk
bwOutput.Write("RIFF".ToCharArray());
// Total Length Of Package To Follow
// Computed as data length plus the header length without the data
// we have written so far and this data (44 - 4 ("RIFF") - 4 (this data))
bwOutput.Write((uint)(rawData.Length + 36));
bwOutput.Write("WAVE".ToCharArray());
// -- FORMAT chunk
bwOutput.Write("fmt ".ToCharArray());
// Length Of FORMAT Chunk (Binary, always 0x10)
bwOutput.Write((uint)0x10);
// Always 0x01
bwOutput.Write((ushort)0x01);
// Channel Numbers (Always 0x01=Mono, 0x02=Stereo)
bwOutput.Write((ushort)audioFormat.Channels);
// Sample Rate (Binary, in Hz)
bwOutput.Write((uint)audioFormat.SamplesPerSecond);
// Bytes Per Second
bwOutput.Write((uint)(audioFormat.BitsPerSample * audioFormat.SamplesPerSecond * audioFormat.Channels / 8));
// Bytes Per Sample: 1=8 bit Mono, 2=8 bit Stereo or 16 bit Mono, 4=16 bit Stereo
bwOutput.Write((ushort)(audioFormat.BitsPerSample * audioFormat.Channels / 8));
// Bits Per Sample
bwOutput.Write((ushort)audioFormat.BitsPerSample);
// -- DATA chunk
bwOutput.Write("data".ToCharArray());
// Length Of Data To Follow
bwOutput.Write((uint)rawData.Length);
// Raw PCM data follows...
// Reset position in rawData and remember its origin position
// to restore at the end.
long originalRawDataStreamPosition = rawData.Position;
rawData.Seek(0, SeekOrigin.Begin);
// Append all data from rawData stream into output stream.
byte[] buffer = new byte[4096];
int read; // number of bytes read in one iteration
while ((read = rawData.Read(buffer, 0, 4096)) > 0)
{
bwOutput.Write(buffer, 0, read);
}
rawData.Seek(originalRawDataStreamPosition, SeekOrigin.Begin);
}
}
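To skip the dialog entirely, one option is to write the WAV into a MemoryStream and POST the bytes to the server. The sketch below is an assumption, not from the original post: the upload URL is hypothetical, and it uses the full .NET WebClient; a Silverlight app would need the asynchronous APIs (OpenWriteAsync or HttpWebRequest.BeginGetRequestStream) instead.

```csharp
using System.IO;
using System.Net;

public static class ServerSaver
{
    // Hypothetical endpoint -- replace with your real upload URL.
    private const string UploadUrl = "http://example.com/upload";

    public static void SaveToServer(Stream pcmStream, AudioFormat format)
    {
        byte[] wavBytes;
        using (var ms = new MemoryStream())
        {
            // Reuse the WavManager above, targeting a MemoryStream instead
            // of the stream returned by SaveFileDialog.OpenFile().
            WavManager.SavePcmToWav(pcmStream, ms, format);
            wavBytes = ms.ToArray();
        }
        using (var client = new WebClient())
        {
            client.Headers[HttpRequestHeader.ContentType] = "audio/wav";
            client.UploadData(UploadUrl, "POST", wavBytes);
        }
    }
}
```

With this in place, SaveFile() could call SaveToServer(_sink.BackingStream, _sink.CurrentFormat) instead of showing the dialog.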
The following code does:
Read all bytes from an input file
Keep only part of the file in outbytes
Write the extracted bytes in outputfile
byte[] outbytes = File.ReadAllBytes(sourcefile).Skip(offset).Take(size).ToArray();
File.WriteAllBytes(outfile, outbytes);
But each step is limited to ~2 GB of data.
Edit: The extracted byte range can also be greater than 2 GB.
How can I handle big files? What is the best way to proceed with good performance, regardless of size?
Thanks!
Example using FileStream to take the middle 3 GB out of a 5 GB file:
byte[] buffer = new byte[1024*1024];
using(var readFS = File.OpenRead(pathToBigFile))
using(var writeFS = File.OpenWrite(pathToNewFile))
{
readFS.Seek(1024L*1024*1024, SeekOrigin.Begin); //seek to 1 GB in
for(int i = 0; i < 3072; i++){ //3072 reads of one megabyte = 3 GB
int bytesRead = readFS.Read(buffer, 0, buffer.Length);
writeFS.Write(buffer, 0, bytesRead);
}
}
It's not production-grade code: Read might not return a full megabyte, so you could end up with less than 3 GB. It's more to demonstrate the concept of using two FileStreams, repeatedly reading from one and writing to the other. You can modify it to copy an exact number of bytes by keeping a running total of all the bytesRead values in the loop and stopping once you have read enough.
It is better to stream the data from one file to the other, only loading small parts of it into memory:
public static void CopyFileSection(string inFile, string outFile, long startPosition, long size)
{
// Open the files as streams
using (var inStream = File.OpenRead(inFile))
using (var outStream = File.OpenWrite(outFile))
{
// seek to the start position
inStream.Seek(startPosition, SeekOrigin.Begin);
// Create a variable to track how much more to copy
// and a buffer to temporarily store a section of the file
long remaining = size;
byte[] buffer = new byte[81920];
do
{
// Read the smaller of 81920 or remaining and break out of the loop if we've already reached the end of the file
int bytesRead = inStream.Read(buffer, 0, (int)Math.Min(buffer.Length, remaining));
if (bytesRead == 0) { break; }
// Write the buffered bytes to the output file
outStream.Write(buffer, 0, bytesRead);
remaining -= bytesRead;
}
while (remaining > 0);
}
}
Usage:
CopyFileSection(sourcefile, outfile, offset, size);
This should have equivalent functionality to your current method without the overhead of reading the entire file, regardless of its size, into memory.
Note: If you're doing this in code that uses async/await, you should change CopyFileSection to be public static async Task CopyFileSection and change inStream.Read and outStream.Write to await inStream.ReadAsync and await outStream.WriteAsync respectively.
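To make that note concrete, here is a sketch of the async variant: the same copy loop, just with ReadAsync/WriteAsync and a Task return type.

```csharp
using System;
using System.IO;
using System.Threading.Tasks;

public static class FileCopier
{
    public static async Task CopyFileSectionAsync(string inFile, string outFile, long startPosition, long size)
    {
        using (var inStream = File.OpenRead(inFile))
        using (var outStream = File.OpenWrite(outFile))
        {
            // Seek to the start of the section to copy
            inStream.Seek(startPosition, SeekOrigin.Begin);
            long remaining = size;
            byte[] buffer = new byte[81920];
            do
            {
                // Read at most the remaining number of bytes, without blocking a thread
                int bytesRead = await inStream.ReadAsync(buffer, 0, (int)Math.Min(buffer.Length, remaining));
                if (bytesRead == 0) { break; } // end of file reached early
                await outStream.WriteAsync(buffer, 0, bytesRead);
                remaining -= bytesRead;
            }
            while (remaining > 0);
        }
    }
}
```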
I am simply trying to take any (or just the first) frame from an MP3 file and then decompress it.
internal void Read3(string location) //take in file location
{
Mp3FileReader mp3 = new Mp3FileReader(location); //make a Mp3FileReader
SECTIONA: //to jump back here when needed.
byte[] _passedBuffer = Decompress(mp3.ReadNextFrame()); //passed the decompressed byte array here.
int jump_size = mp3.WaveFormat.Channels *2; //just to get how many bytes to skip
for (int i = 0; i < _passedBuffer.Length; i += jump_size)
{
short FinalSample = BitConverter.ToInt16(_passedBuffer, i);
if (jump_size == 4) //converting the bytes to Int16,nothing special here.
{
FinalSample = (short)(((BitConverter.ToInt16(_passedBuffer, i + 2)) + FinalSample) / 2);
}
Console.Write(FinalSample+"|"); //and writing it down to Console.
}
Console.WriteLine("Frames are Written,continue to next frame?");
if (Convert.ToChar(Console.Read()) == 'y') //asking to go on or not.
{ goto SECTIONA; }
}
private byte[] Decompress(Mp3Frame fm)
{
var buffer = new byte[16384 * 4]; //big enough buffer size
WaveFormat wf = new Mp3WaveFormat(fm.SampleRate, fm.ChannelMode == ChannelMode.Mono ? 1 : 2, fm.FrameLength, fm.BitRate); //creating a new WaveFormat
IMp3FrameDecompressor decompressor = new AcmMp3FrameDecompressor(wf); //passing in to AcmMp3FrameDecompressor.
decompressor.DecompressFrame(fm, buffer, 0); //running the DecompressFrame method and then passing back the buffer.
return buffer;
}
Now, Mp3FileReader is reading the frame correctly; I checked the frame's RawData. I am then trying to decompress that frame and convert its PCM data into Int16 in that for loop, but every Int16 FinalSample value comes back as 0.
I know that just using Mp3FileReader.Read(Buffer, Offset, Length) would get the job done, but for all the frames, so:
how do I do it for just one frame?
what is wrong with my code, because of which I am getting all zeros?
I know that RawData is OK, so something must be wrong with the Decompress method. How do I set up a decompressor for an MP3 file?
You can use the AcmMp3FrameDecompressor or DmoMp3FrameDecompressor to decompress individual MP3 frames.
You need to check the return value of Decompress to see how much data was returned. It's possible for the first frame not to return anything.
You should also create a single frame decompressor and use it for all frames in the MP3 file, as it retains state between calls.
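Putting those three points together, a sketch of the corrected loop might look like the following (assuming NAudio's Mp3FileReader, Mp3Frame, and AcmMp3FrameDecompressor, as used in the question; the method name is mine):

```csharp
using System;
using NAudio.Wave;

internal static class FrameReader
{
    internal static void ReadFrames(string location)
    {
        using (var mp3 = new Mp3FileReader(location))
        // One decompressor, reused for every frame, since it retains state between calls.
        using (IMp3FrameDecompressor decompressor = new AcmMp3FrameDecompressor(mp3.Mp3WaveFormat))
        {
            var buffer = new byte[16384 * 4]; // big enough for one decompressed frame
            Mp3Frame frame;
            while ((frame = mp3.ReadNextFrame()) != null)
            {
                // DecompressFrame returns how many PCM bytes were produced;
                // the first frame may legitimately produce 0 bytes.
                int decompressed = decompressor.DecompressFrame(frame, buffer, 0);
                if (decompressed == 0)
                    continue;
                // Process buffer[0 .. decompressed) as 16-bit PCM samples here.
            }
        }
    }
}
```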
I have created a small application using UnrealEngine 4.10 (UE4). Within that application, I grab the colorBuffer via ReadPixels, compress it to PNG, and finally send the compressed colorBuffer via TCP to another application. The UE4 application is written in C++, not that that should matter. The compressed colorBuffer is sent every "tick" of the application, roughly 30 times a second.
My client, the one receiving the streamed PNG is written in c# and does the following:
Connect to the server
If connected:
get the stream (memory stream)
read the memory stream into a byte array
convert the byte array to an image
Client implementation:
private void Timer_Tick(object sender, EventArgs e)
{
var connected = tcp.IsConnected();
if (connected)
{
var stream = tcp.GetStream(); //simply returns client.GetStream();
int BYTES_TO_READ = 16;
var buffer = new byte[BYTES_TO_READ];
var totalBytesRead = 0;
int bytesRead;
do {
// You have to do this in a loop because there's no
// guarantee that all the bytes you need will be ready when
// you call.
bytesRead = stream.Read(buffer, totalBytesRead,
BYTES_TO_READ - totalBytesRead);
totalBytesRead += bytesRead;
} while (totalBytesRead < BYTES_TO_READ);
Image x = byteArrayToImage(buffer);
}
}
public Image byteArrayToImage(byte[] byteArrayIn)
{
var converter = new ImageConverter();
Image img = (Image)converter.ConvertFrom(byteArrayIn);
return img;
}
The problem is that Image img = (Image)converter.ConvertFrom(byteArrayIn);
throws an ArgumentException, telling me "Parameter is not valid".
The data being sent looks like this:
My byteArrayIn and buffer look like this:
I have also tried both:
Image.FromStream(stream); and Image.FromStream(new MemoryStream(bytes));
Image.FromStream(stream); causes it to read forever... and Image.FromStream(new MemoryStream(bytes)); results in the same exception as mentioned above.
Some questions:
What size shall I set BYTES_TO_READ to be? I set it as 16 because, when I check the size of the byte array being sent in the UE4 application (dataSize in the first image), it says the length is 16... I'm not too sure what to set this to.
Is the process that I have followed correct?
What am I doing wrong?
UPDATE
@RonBeyer asked if I could verify that the data sent from the server matches what is received. I have tried to do that, and here is what I can say:
The data sent, as far as I can tell looks like this (sorry for formatting):
The data being received, looks like this:
var stream = tcp.GetStream();
int BYTES_TO_READ = 512;
var buffer = new byte[BYTES_TO_READ];
Int32 bytes = stream.Read(buffer, 0, buffer.Length);
var responseData = System.Text.Encoding.ASCII.GetString(buffer, 0,
bytes);
//responseData looks like this (has been formatted for easier reading)
//?PNG\r\n\u001a\n\0\0\0\rIHDR
//?PNG\r\n\u001a\n\0\0\0\rIHDR
//?PNG\r\n\u001a\n\0\0\0\rIHDR
//?PNG\r\n\u001a\n\0\0\0\rIHDR
//?PNG\r\n\u001a\n\0\0\0\rIHDR
//?PNG\r\n\u001a\n\0\0\0\rIHDR
//?PNG\r\n\u001a\n\0\0\0\rIHDR
//?PNG\r\n\u001a\n\0\0\0\rIHDR
//?PNG\r\n\u001a\n\0\0\0\rIHDR
//?PNG\r\n\u001a\n\0\0\0\rIHDR
If I try take a single line from the responseData and put that into an image:
var stringdata = "?PNG\r\n\u001a\n\0\0\0\rIHDR";
var data = System.Text.Encoding.ASCII.GetBytes(stringdata);
var ms = new MemoryStream(data);
Image img = Image.FromStream(ms);
data has a length of 16... the same length as the dataSize variable on the server. However, I again get the execption "Parameter is not valid".
UPDATE 2
@Darcara has helped by suggesting that what I was actually receiving was the header of the PNG file, and that I needed to first send the size of the image. I have now done that and made progress:
for (TArray<class FSocket*>::TIterator ClientIt(Clients); ClientIt;
++ClientIt)
{
FSocket *Client = *ClientIt;
int32 SendSize = 2 * x * y;
Client->SetNonBlocking(true);
Client->SetSendBufferSize(SendSize, SendSize);
Client->Send(data, SendSize, bytesSent);
}
With this, I am now getting the image on the first attempt; however, subsequent attempts fail with the same "Parameter is not valid". Upon inspection, I noticed that the stream now appears to be missing the header: "?PNG\r\n\u001a\n\0\0\0\rIHDR". I came to this conclusion when I converted the buffer to a string using Encoding.ASCII.GetString(buffer, 0, bytes);
Any idea why the header is now only being sent to first time and never again? What can I do to fix it?
First of all, thank you to @Darcara and @RonBeyer for your help.
I now have a solution:
Server:
for (TArray<class FSocket*>::TIterator ClientIt(Clients); ClientIt;
++ClientIt)
{
FSocket *Client = *ClientIt;
int32 SendSize = x * y; // Image is 512 x 512
Client->SetSendBufferSize(SendSize, SendSize);
Client->Send(data, SendSize, bytesSent);
}
The first issue was that the size of the image needed to be correct:
int32 SendSize = 2 * x * y;
The line above is wrong. The image is 512 by 512, so SendSize should be x * y, where x and y are both 512.
The other issue was how I was handling the stream client side.
Client:
var connected = tcp.IsConnected();
if (connected)
{
var stream = tcp.GetStream();
var BYTES_TO_READ = (512 * 512)^2;
var buffer = new byte[BYTES_TO_READ];
var bytes = stream.Read(buffer, 0, BYTES_TO_READ);
Image returnImage = Image.FromStream(new MemoryStream(buffer));
//Apply the image to a picture box. Note, this is running on a separate
//thread.
UpdateImageViewerBackgroundWorker.ReportProgress(0, returnImage);
}
The var BYTES_TO_READ = (512 * 512)^2; line now gives a large enough buffer. (Note that in C# ^ is XOR, not exponentiation, so this expression actually evaluates to 262146 bytes; it works here because the compressed 512 x 512 PNG fits in that buffer.)
I now have Unreal Engine 4 streaming its frames.
You are only reading the first 16 bytes of the stream. I'm guessing that is not intentional.
If the stream ends/connection closes after the image is transferred, use stream.CopyTo to copy it into a MemoryStream. Image.FromStream(stream) might also work
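For that first case, the read-everything step can be as simple as this sketch (the helper name is mine, not from the original):

```csharp
using System.Drawing;
using System.IO;

public static class ImageReceiver
{
    public static Image ReadImageUntilClose(Stream networkStream)
    {
        var ms = new MemoryStream();
        // CopyTo reads until the remote side closes the connection.
        networkStream.CopyTo(ms);
        ms.Position = 0;
        // The MemoryStream is intentionally left undisposed: Image.FromStream
        // requires the stream to stay open for the lifetime of the Image.
        return Image.FromStream(ms);
    }
}
```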
If the stream does not end, you need to know the size of the transferred object beforehand, so you can copy it read-by-read into another array / memory stream or directly to disk. In that case a much higher read buffer should be used (default is 8192 I think). This is a lot more complicated though.
To manually read from the stream, you need to prepend you data with the size. A simple Int32 should suffice. Your client code might look something like this:
var stream = tcp.GetStream();
//this is our temporary read buffer
int BYTES_TO_READ = 8192;
var buffer = new byte[BYTES_TO_READ];
int bytesRead;
//read size of data object
stream.Read(buffer, 0, 4); //read 4 bytes into the beginning of the empty buffer
//TODO: check that we actually received 4 bytes.
var totalBytesExpected = BitConverter.ToInt32(buffer, 0);
//this will be the stream we will save our received bytes to
//could also be a file stream
var imageStream = new MemoryStream(totalBytesExpected);
var totalBytesRead = 0;
do {
//read as much as the buffer can hold or the remaining bytes
bytesRead = stream.Read(buffer, 0, Math.Min(BYTES_TO_READ, totalBytesExpected - totalBytesRead));
totalBytesRead += bytesRead;
//write bytes to image stream
imageStream.Write(buffer, 0, bytesRead);
} while (totalBytesRead < totalBytesExpected);
I glossed over a lot of error handling here, but that should give you the general idea.
If you want to transfer more complex objects look into proper protocols like Google Protocol Buffers or MessagePack
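The receive loop above assumes the sender prepends the payload length. For symmetry, the sending side would look roughly like this sketch (shown in C#, although the original server is UE4 C++; the class and method names are mine):

```csharp
using System;
using System.IO;

public static class Framing
{
    // Writes a 4-byte little-endian length prefix followed by the payload,
    // matching what the receive loop parses with BitConverter.ToInt32.
    public static void SendLengthPrefixed(Stream stream, byte[] payload)
    {
        byte[] lengthPrefix = BitConverter.GetBytes(payload.Length);
        stream.Write(lengthPrefix, 0, lengthPrefix.Length);
        stream.Write(payload, 0, payload.Length);
    }
}
```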
I am currently testing several decompression libraries for a project I'm involved with, to decompress HTTP file streams on the fly. I have tried two very promising libraries and found an issue that appears in both of them.
This is what I am doing:
video.avi compressed to video.zip on HTTP server test.com/video.zip (~20MB)
HttpWebRequest to read stream from the server
Write HttpWebRequest ResponseStream data into MemoryStream
Let decompression library read from MemoryStream
Read decompressed file stream while it's being downloaded by HttpWebRequest
The whole idea works fine: I'm able to decompress and stream the compressed video directly into VLC's stdin, and it renders just fine. However, I have to use a read buffer of one byte on the decompression library. Any buffer larger than one byte causes the uncompressed data stream to be cut off. As a test, I wrote the decompressed stream to a file and compared it with the original video.avi: some data is simply skipped by the decompression. When streaming this broken data into VLC, it causes a lot of video artifacts, and the playback speed is also greatly reduced.
If I knew the size of what is available to read, I could trim my buffer accordingly, but no library makes this information public, so all I can do is read the data with a one-byte buffer. Maybe my approach is wrong? Or maybe I'm overlooking something?
Here's an example code (requires VLC):
ICSharpCode.SharpZLib (http://icsharpcode.github.io/SharpZipLib/)
static void Main(string[] args)
{
// Initialise VLC
Process vlc = new Process()
{
StartInfo =
{
FileName = @"C:\Program Files\VideoLAN\vlc.exe", // Adjust as required to test the code
RedirectStandardInput = true,
UseShellExecute = false,
Arguments = "-"
}
};
vlc.Start();
Stream outStream = vlc.StandardInput.BaseStream;
// Get source stream
HttpWebRequest stream = (HttpWebRequest)WebRequest.Create("http://codefreak.net/~daniel/apps/stream60s-large.zip");
Stream compressedVideoStream = stream.GetResponse().GetResponseStream();
// Create local decompression loop
MemoryStream compressedLoopback = new MemoryStream();
ZipInputStream zipStream = new ZipInputStream(compressedLoopback);
ZipEntry currentEntry = null;
byte[] videoStreamBuffer = new byte[8192]; // 8 KB read buffer
int read = 0;
long totalRead = 0;
while ((read = compressedVideoStream.Read(videoStreamBuffer, 0, videoStreamBuffer.Length)) > 0)
{
// Write compressed video stream into compressed loopback without affecting current read position
long previousPosition = compressedLoopback.Position; // Store current read position
compressedLoopback.Position = totalRead; // Jump to last write position
totalRead += read; // Increase last write position by current read size
compressedLoopback.Write(videoStreamBuffer, 0, read); // Write data into loopback
compressedLoopback.Position = previousPosition; // Restore reading position
// If not already, move to first entry
if (currentEntry == null)
currentEntry = zipStream.GetNextEntry();
byte[] outputBuffer = new byte[1]; // Decompression read buffer, this is the bad one!
int zipRead = 0;
while ((zipRead = zipStream.Read(outputBuffer, 0, outputBuffer.Length)) > 0)
outStream.Write(outputBuffer, 0, outputBuffer.Length); // Write directly to VLC stdin
}
}
SharpCompress (https://github.com/adamhathcock/sharpcompress)
static void Main(string[] args)
{
// Initialise VLC
Process vlc = new Process()
{
StartInfo =
{
FileName = @"C:\Program Files\VideoLAN\vlc.exe", // Adjust as required to test the code
RedirectStandardInput = true,
UseShellExecute = false,
Arguments = "-"
}
};
vlc.Start();
Stream outStream = vlc.StandardInput.BaseStream;
// Get source stream
HttpWebRequest stream = (HttpWebRequest)WebRequest.Create("http://codefreak.net/~daniel/apps/stream60s-large.zip");
Stream compressedVideoStream = stream.GetResponse().GetResponseStream();
// Create local decompression loop
MemoryStream compressedLoopback = new MemoryStream();
ZipReader zipStream = null;
EntryStream currentEntry = null;
byte[] videoStreamBuffer = new byte[8192]; // 8 KB read buffer
int read = 0;
long totalRead = 0;
while ((read = compressedVideoStream.Read(videoStreamBuffer, 0, videoStreamBuffer.Length)) > 0)
{
// Write compressed video stream into compressed loopback without affecting current read position
long previousPosition = compressedLoopback.Position; // Store current read position
compressedLoopback.Position = totalRead; // Jump to last write position
totalRead += read; // Increase last write position by current read size
compressedLoopback.Write(videoStreamBuffer, 0, read); // Write data into loopback
compressedLoopback.Position = previousPosition; // Restore reading position
// Open stream after writing to it because otherwise it will not be able to identify the compression type
if (zipStream == null)
zipStream = (ZipReader)ReaderFactory.Open(compressedLoopback); // Cast to ZipReader, as we know the type
// If not already, move to first entry
if (currentEntry == null)
{
zipStream.MoveToNextEntry();
currentEntry = zipStream.OpenEntryStream();
}
byte[] outputBuffer = new byte[1]; // Decompression read buffer, this is the bad one!
int zipRead = 0;
while ((zipRead = currentEntry.Read(outputBuffer, 0, outputBuffer.Length)) > 0)
outStream.Write(outputBuffer, 0, outputBuffer.Length); // Write directly to VLC stdin
}
}
To test this code, I recommend setting the output buffer for SharpZipLib to 2 bytes and for SharpCompress to 8 bytes. You will see the artifacts, and also that the playback speed of the video is wrong; the seek time should always be aligned with the number counting up in the video.
I haven't found a good explanation of why a larger outputBuffer on the decompression side causes these problems, or any way to solve it other than using the tiniest possible buffer.
So my question is: what am I doing wrong, or is this a general issue when reading compressed files from streams? How can I increase the outputBuffer while still reading the correct data?
Any help is greatly appreciated!
Regards,
Gachl
You need to write only as many bytes as you read. Writing the entire buffer size adds extra bytes (whatever happened to be in the buffer before). zipStream.Read is not required to return as many bytes as you request.
while ((zipRead = zipStream.Read(outputBuffer, 0, outputBuffer.Length)) > 0)
outStream.Write(outputBuffer, 0, zipRead); // Write directly to VLC stdin
Edit: Solution is at bottom of post
I am trying my luck with reading binary files. Since I don't want to rely on byte[] AllBytes = File.ReadAllBytes(myPath), because the binary file might be rather big, I want to read small portions of the same size (which fits nicely with the file format to be read) in a loop, using what I would call a "buffer".
public void ReadStream(MemoryStream ContentStream)
{
byte[] buffer = new byte[sizePerHour];
for (int hours = 0; hours < NumberHours; hours++)
{
int t = ContentStream.Read(buffer, 0, sizePerHour);
SecondsToAdd = BitConverter.ToUInt32(buffer, 0);
// further processing of my byte[] buffer
}
}
My stream contains all the bytes I want, which is a good thing. When I enter the loop, several things cease to work.
My int t is 0, although I would presume that ContentStream.Read() would copy data from the stream into my byte array, but that isn't the case.
I tried buffer = ContentStream.GetBuffer(), but that results in my buffer containing the entire stream, behaviour I wanted to avoid by reading into a buffer.
Resetting the stream to position 0 before reading did not help either, nor did specifying an offset for my Stream.Read(), which means I am lost.
Can anyone point me to reading small portions of a stream to a byte[]? Maybe with some code?
Thanks in advance
Edit:
Pointing me to the right direction was the answer, that .Read() returns 0 if the end of stream is reached. I modified my code to the following:
public void ReadStream(MemoryStream ContentStream)
{
byte[] buffer = new byte[sizePerHour];
ContentStream.Seek(0, SeekOrigin.Begin); //Added this line
for (int hours = 0; hours < NumberHours; hours++)
{
int t = ContentStream.Read(buffer, 0, sizePerHour);
SecondsToAdd = BitConverter.ToUInt32(buffer, 0);
// further processing of my byte[] buffer
}
}
And everything works like a charm. I had initially reset the stream to the beginning on every iteration over hours and given Read an offset. Moving the seek-to-beginning outside my loop and leaving the offset at 0 did the trick.
Read returns zero if the end of the stream is reached. Are you sure that your memory stream has the content you expect? I've tried the following and it works as expected:
// Create the source of the memory stream.
UInt32[] source = {42, 4711};
List<byte> sourceBuffer = new List<byte>();
Array.ForEach(source, v => sourceBuffer.AddRange(BitConverter.GetBytes(v)));
// Read the stream.
using (MemoryStream contentStream = new MemoryStream(sourceBuffer.ToArray()))
{
byte[] buffer = new byte[sizeof (UInt32)];
int t;
do
{
t = contentStream.Read(buffer, 0, buffer.Length);
if (t > 0)
{
UInt32 value = BitConverter.ToUInt32(buffer, 0);
}
} while (t > 0);
}