CloudBlob.OpenRead() is not reading all data - C#

With the Windows Azure Storage Client Library, the CloudBlob.OpenRead() method reads only 4 MB of data. How can I read the full stream using the OpenRead method?
CloudBlob blob = container.GetBlobReference(filename);
Stream stream = blob.OpenRead();
I must use the OpenRead method; I can't use DownloadToFile or DownloadToStream.
EDIT:
The consumer code, which is outside my scope, calls
stream.CopyTo(readIntoStream);
CopyTo is an extension method:
public static int CopyTo(this Stream source, Stream destination)
{
byte[] buffer = new byte[BUFFER_SIZE]; // BUFFER_SIZE is 65536 (see the full code in EDIT 3)
int bytesRead;
int bytesCopied = 0;
do
{
bytesRead = source.Read(buffer, 0, BUFFER_SIZE);
if (bytesRead > 0)
destination.Write(buffer, 0, bytesRead);
bytesCopied += bytesRead;
}
while (bytesRead > 0);
return bytesCopied;
}
EDIT 2:
I have observed that when the file is uploaded using the blob.UploadText() method, it works fine, but when the blob is uploaded using the OpenWrite method as in the following code sample, OpenRead reads only 4194304 bytes (4 MB).
using (var input = File.OpenRead(@"C:\testFile.txt")) // 5000000 bytes
using (var output = blob.OpenWrite())
{
input.CopyTo(output);
}
EDIT 3:
Here is the complete code that reproduces the issue at my end:
static void Main(string[] args)
{
var blobContainer = GetContainer("tier1");
blobContainer.CreateIfNotExist();
var containerPermissions = new BlobContainerPermissions();
containerPermissions.PublicAccess = BlobContainerPublicAccessType.Blob;
blobContainer.SetPermissions(containerPermissions);
var blob = blobContainer.GetBlobReference("testfileWithOpenWrite1.txt");
using (var input = File.OpenRead(@"C:\testFile.txt")) // 5000000 bytes
using (var output = blob.OpenWrite(new BlobRequestOptions()))
input.CopyTo(output);
Console.WriteLine("uploaded");
int bytesDownloaded = 0;
byte[] buffer = new byte[65536];
using (BlobStream bs = blob.OpenRead())
{
int chunkLength;
do
{
chunkLength = bs.Read(buffer, 0, buffer.Length);
bytesDownloaded += chunkLength;
} while (chunkLength > 0);
}
Console.WriteLine(bytesDownloaded);
}
public static int CopyTo(this Stream source, Stream destination)
{
int BUFFER_SIZE = 65536;
byte[] buffer = new byte[BUFFER_SIZE];
int bytesRead;
int bytesCopied = 0;
do
{
bytesRead = source.Read(buffer, 0, BUFFER_SIZE);
if (bytesRead > 0)
destination.Write(buffer, 0, bytesRead);
bytesCopied += bytesRead;
}
while (bytesRead > 0);
return bytesCopied;
}
EDIT 4 - Conclusion:
This was probably a bug in Microsoft.WindowsAzure.StorageClient.dll (assembly version 6.0.6002.17043) that comes with SDK v1.2. It works with the latest Microsoft.WindowsAzure.StorageClient.dll (assembly version 6.0.6002.18312, SDK v1.4).
Thanks smarx :)

I can't reproduce what you're seeing. Things seem to work as expected:
static void Main(string[] args)
{
// I also tried a real cloud storage account. Same result.
var container = CloudStorageAccount.DevelopmentStorageAccount.CreateCloudBlobClient().GetContainerReference("testcontainer");
container.CreateIfNotExist();
var blob = container.GetBlobReference("testblob.txt");
blob.UploadText(new String('x', 5000000));
var source = blob.OpenRead();
int BUFFER_SIZE = 4000000;
byte[] buffer = new byte[BUFFER_SIZE];
int bytesRead;
int bytesCopied = 0;
do
{
bytesRead = source.Read(buffer, 0, BUFFER_SIZE);
bytesCopied += bytesRead;
} while (bytesRead > 0);
Console.WriteLine(bytesCopied); // prints 5000000
}
EDIT:
I've also (in response to the edited question) now tried uploading the blob using OpenWrite, and the result is the same. (I get the full blob back.) I used this code in place of blob.UploadText(...):
using (var input = File.OpenRead(@"testblob.txt")) // 5000000 bytes
using (var output = blob.OpenWrite())
{
input.CopyTo(output);
}
The final WriteLine still prints 5000000.
EDIT 2:
This is getting a bit tiresome. Changing the BUFFER_SIZE to 65536 didn't change anything. The results still seem correct to me. Here's my full application for comparison:
static void Main(string[] args)
{
// I also tried a real cloud storage account. Same result.
var container = CloudStorageAccount.DevelopmentStorageAccount.CreateCloudBlobClient().GetContainerReference("testcontainer");
container.CreateIfNotExist();
var blob = container.GetBlobReference("testblob.txt");
using (var input = File.OpenRead(@"testblob.txt")) // 5000000 bytes
using (var output = blob.OpenWrite())
{
input.CopyTo(output);
}
var source = blob.OpenRead();
int BUFFER_SIZE = 65536;
byte[] buffer = new byte[BUFFER_SIZE];
int bytesRead;
int bytesCopied = 0;
do
{
bytesRead = source.Read(buffer, 0, BUFFER_SIZE);
bytesCopied += bytesRead;
} while (bytesRead > 0);
Console.WriteLine(bytesCopied); // prints 5000000
}

Related

How to stream an mp3 file from a URL without using much RAM?

I have a method to stream mp3 from URL sources. In this method, the file begins downloading, and at the same time the downloaded bytes are stored in a MemoryStream. But I realized that this method is not good, because the RAM usage is approximately 50 MB while the mp3 file is being played.
So I want to do this without storing the data in a MemoryStream. I tried storing the downloaded bytes in a temporary file, but it didn't work. How can I fix it to work with a FileStream?
It works well using a MemoryStream:
MemoryStream ms = new MemoryStream();
int bytesRead = 0;
long pos = 0;
int total = 0;
do
{
bytesRead = responseStream.Read(buffer, 0, buffer.Length);
Buffer.BlockCopy(buffer, 0, bigBuffer, total, bytesRead);
total += bytesRead;
pos = ms.Position;
ms = new MemoryStream(bigBuffer);
ms.Position = pos;
frame = Mp3Frame.LoadFromStream(ms);
//Other codes to play mp3...
}
while (bytesRead > 0 || waveOut.PlaybackState == PlaybackState.Playing);
This code throws an exception on the LoadFromStream line when a FileStream is used:
int bytesRead = 0;
long pos = 0;
int total = 0;
string path =Path.Combine( Environment.GetFolderPath(Environment.SpecialFolder.ApplicationData),"temp.mp3");
FileStream fs = new FileStream(path, FileMode.Create, FileAccess.ReadWrite);
do
{
bytesRead = responseStream.Read(buffer, 0, buffer.Length);
total += bytesRead;
pos = fs.Position;
fs.Write(buffer, 0, bytesRead);
fs.Position = pos;
frame = Mp3Frame.LoadFromStream(fs);
//Other codes to play mp3...
}
while (bytesRead > 0 || waveOut.PlaybackState == PlaybackState.Playing);
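One way to restructure the FileStream version is sketched below. It is only a sketch: responseStream, waveOut and the NAudio types (Mp3Frame, IWavePlayer) are assumed to be the same as in the question. The idea is to always append new data at the end of the file while parsing frames from a separately tracked read position, so the next write neither overwrites unparsed data nor resets the parser.
// requires: using System; using System.IO; using NAudio.Wave;
void DownloadToTempFile(Stream responseStream, IWavePlayer waveOut)
{
    string path = Path.Combine(
        Environment.GetFolderPath(Environment.SpecialFolder.ApplicationData), "temp.mp3");
    byte[] buffer = new byte[16384];
    long readPos = 0; // where the frame parser last stopped

    using (var fs = new FileStream(path, FileMode.Create, FileAccess.ReadWrite))
    {
        int bytesRead;
        do
        {
            bytesRead = responseStream.Read(buffer, 0, buffer.Length);
            fs.Seek(0, SeekOrigin.End);      // always append new data at the end
            fs.Write(buffer, 0, bytesRead);
            fs.Flush();

            fs.Position = readPos;           // resume parsing where the last frame ended
            Mp3Frame frame = Mp3Frame.LoadFromStream(fs);
            readPos = fs.Position;           // remember how far the parser got
            // ...other code to play the mp3, as in the question...
        }
        while (bytesRead > 0 || waveOut.PlaybackState == PlaybackState.Playing);
    }
}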

Sending binary files through TCP sockets in C

I made a client and server that establish a TCP connection through sockets. I'm trying to send binary data over the socket, but I could only send txt or pdf files; I had no luck with exe files. I use fread and fseek to read the file and split it into a buffer. When I read the whole exe file it gets sent successfully, but when I split it, it arrives corrupted!
I read some books about sockets, but I still don't know much and I have some questions.
Is it okay to send the whole file in one send(), or should I continue sending it in small chunks?
Also, why do exe files get corrupted when I send them in chunks?
Thank you!
Client code (in C):
int bufSize = 10000;
int sentBytes = 0;
FILE * pFile;
long remainingBytes;
char * buffer;
size_t result;
pFile = fopen ( "D:\\file.exe" , "rb" );
fseek (pFile , 0 , SEEK_END);
remainingBytes = ftell (pFile);
rewind (pFile);
int bufferSize = remainingBytes > bufSize ? bufSize : remainingBytes;
buffer = (char*) malloc (sizeof(char)*bufferSize);
send(Socket, (char*)&remainingBytes, 4, 0);
while(remainingBytes > 0)
{
fseek (pFile , sentBytes , SEEK_SET);
result = fread(buffer,1,bufferSize,pFile);
if(bufferSize < remainingBytes)
{
send(Socket, buffer, bufferSize, 0);
}
else
{
send(Socket, buffer, remainingBytes, 0);
bufferSize = remainingBytes;
}
remainingBytes -= bufferSize;
sentBytes += bufferSize;
}
Server code (in C#):
try
{
int bufferSize = 200;
int len = 0;
int receivedBytes = 0;
int remainingBytes = len;
byte[] length = new byte[4];
//byte[] imgBuf = new byte[bufferSize];
int current = 0;
List<byte[]> coming = new List<byte[]>();
sockets[number].Receive(length,4,0);
len = BitConverter.ToInt32(length, 0);
remainingBytes = len;
bufferSize = len < bufferSize ? len : bufferSize;
while(receivedBytes < len)
{
if (remainingBytes > bufferSize)
{
coming.Add(new byte[bufferSize]);
//imgBuf = new byte[bufferSize];
sockets[number].Receive(coming[current], bufferSize, 0);
}
else
{
coming.Add(new byte[remainingBytes]);
//imgBuf = new byte[remainingBytes];
sockets[number].Receive(coming[current], remainingBytes, 0);
bufferSize = remainingBytes;
}
remainingBytes -= bufferSize;
receivedBytes += bufferSize;
current++;
//Array.Clear(imgBuf, 0, imgBuf.Length);
}
using (var stream = new FileStream(@"C:\receivedFile.exe", FileMode.Create))
{
using (var binaryWriter = new BinaryWriter(stream))
{
foreach (byte[] buffer in coming)
{
binaryWriter.Write(buffer);
}
}
}
}
catch (Exception ex)
{ this.setText(ex.Message, textBox2); }
Edit: Thanks for the help, I got it working :)
try
{
int bufferSize = 1024 * 100;
int len = 0;
int receivedBytes = 0;
int remainingBytes = len;
int reached = 0;
byte[] length = new byte[4];
byte[] imgBuf = new byte[bufferSize];
int current = 0;
sockets[number].Receive(length,4,0);
len = BitConverter.ToInt32(length, 0);
remainingBytes = len;
bufferSize = len < bufferSize ? len : bufferSize;
imgBuf = new byte[len];
while (reached < len)
{
reached += sockets[number].Receive(imgBuf, receivedBytes, remainingBytes, 0);
remainingBytes = len - reached;
receivedBytes = reached;
current++;
//Array.Clear(imgBuf, 0, imgBuf.Length);
}
using (var stream = new FileStream(@"C:\putty.exe", FileMode.Create))
{
using (var binaryWriter = new BinaryWriter(stream))
{
binaryWriter.Write(imgBuf);
}
}
Array.Clear(imgBuf, 0, imgBuf.Length);
}
catch (Exception ex)
{ this.setText(ex.Message, textBox2); }
You don't check how many bytes you actually received in sockets[number].Receive(coming[current], bufferSize, 0). It doesn't have to be equal to the buffer size you declared.
Additionally, as said in the comments, keeping the whole file in memory is not a good idea.
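For example (just a sketch using the same sockets[number] and 4-byte length prefix as above; the output path is illustrative), each received chunk could be written straight to the destination file, so only one buffer's worth of data is ever held in memory:
// Read the 4-byte length prefix (strictly, this read should also be looped).
byte[] lengthPrefix = new byte[4];
sockets[number].Receive(lengthPrefix, 4, SocketFlags.None);
int len = BitConverter.ToInt32(lengthPrefix, 0);

byte[] buffer = new byte[64 * 1024];
int total = 0;
using (var output = new FileStream(@"C:\receivedFile.exe", FileMode.Create))
{
    while (total < len)
    {
        int wanted = Math.Min(buffer.Length, len - total);
        // Receive returns how many bytes actually arrived; it can be fewer than requested.
        int received = sockets[number].Receive(buffer, 0, wanted, SocketFlags.None);
        if (received == 0)
            throw new IOException("Connection closed before the full file arrived.");
        output.Write(buffer, 0, received); // write only what was actually received
        total += received;
    }
}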
In addition to checking the number of bytes received, you need to check the number of bytes sent by each send call: if your TCP transmit window fills up, a send call may send less data than you requested, in which case you'll need to resend the unsent data.
In general, you ALWAYS need to check the return value of your system calls for all the various odd corner cases that can occur. Read the man pages for send(2) and recv(2) for a full list of everything that can happen.
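The client in the question is C, but the same rule expressed in C# terms is a loop that keeps calling Send with the unsent remainder until everything has been handed to the socket. A minimal sketch:
using System.Net.Sockets;

static class SocketHelpers
{
    // Sketch only: Send may accept fewer bytes than requested, so keep
    // sending the remainder until the whole buffer has gone out.
    public static void SendAll(Socket socket, byte[] data, int count)
    {
        int sent = 0;
        while (sent < count)
        {
            sent += socket.Send(data, sent, count - sent, SocketFlags.None);
        }
    }
}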

How to split a large file into chunks in C#?

I'm making a simple file transfer sender and receiver app over the wire. What I have so far is that the sender converts the file into a byte array and sends chunks of that array to the receiver.
This works with files of up to 256 MB, but this line throws a "System out of memory" exception for anything above that:
byte[] buffer = StreamFile(fileName); //This is where I convert the file
I'm looking for a way to read the file in chunks and then write each chunk, instead of loading the whole file into a byte array. How can I do this with a FileStream?
EDIT:
Sorry, here's my crappy code so far:
private void btnSend(object sender, EventArgs e)
{
Socket clientSock = new Socket(AddressFamily.InterNetwork, SocketType.Stream, ProtocolType.Tcp);
byte[] fileName = Encoding.UTF8.GetBytes(fName); //file name
byte[] fileData = null;
try
{
fileData = StreamFile(textBox1.Text); //file
}
catch (OutOfMemoryException ex)
{
MessageBox.Show("Out of memory");
return;
}
byte[] fileNameLen = BitConverter.GetBytes(fileName.Length); //length of file name
clientData = new byte[4 + fileName.Length + fileData.Length];
fileNameLen.CopyTo(clientData, 0);
fileName.CopyTo(clientData, 4);
fileData.CopyTo(clientData, 4 + fileName.Length);
clientSock.Connect("172.16.12.91", 9050);
clientSock.Send(clientData, 0, 4 + fileName.Length, SocketFlags.None);
for (int i = 4 + fileName.Length; i < clientData.Length; i++)
{
clientSock.Send(clientData, i, 1 , SocketFlags.None);
}
clientSock.Close();
}
And here's how I receive (the code was from a tutorial)
public void ReadCallback(IAsyncResult ar)
{
int fileNameLen = 1;
String content = String.Empty;
StateObject state = (StateObject)ar.AsyncState;
Socket handler = state.workSocket;
int bytesRead = handler.EndReceive(ar);
if (bytesRead > 0)
{
if (flag == 0)
{
Thread.Sleep(1000);
fileNameLen = BitConverter.ToInt32(state.buffer, 0);
string fileName = Encoding.UTF8.GetString(state.buffer, 4, fileNameLen);
receivedPath = fileName;
flag++;
}
if (flag >= 1)
{
BinaryWriter writer = new BinaryWriter(File.Open(receivedPath, FileMode.Append));
if (flag == 1)
{
writer.Write(state.buffer, 4 + fileNameLen, bytesRead - (4 + fileNameLen));
flag++;
}
else
writer.Write(state.buffer, 0, bytesRead);
writer.Close();
handler.BeginReceive(state.buffer, 0, StateObject.BufferSize, 0,
new AsyncCallback(ReadCallback), state);
}
}
else
{
Invoke(new MyDelegate(LabelWriter));
}
}
I just really want to know how I can read the file in chunks so that I don't need to convert the whole thing to a byte array.
Thanks for the responses so far, I think I'm starting to get it :D
Just call Read repeatedly with a small buffer (I tend to use something like 16K). Note that the call to Read may end up reading a smaller amount than you request. If you're using a fixed chunk size and need the whole chunk in memory, you could just use an array of that size of course.
Without knowing how you're sending the file, it's hard to give much advice about how to structure your code, but it could be something like this:
byte[] chunk = new byte[MaxChunkSize];
while (true)
{
int index = 0;
// There are various different ways of structuring this bit of code.
// Fundamentally we're trying to keep reading in to our chunk until
// either we reach the end of the stream, or we've read everything we need.
while (index < chunk.Length)
{
int bytesRead = stream.Read(chunk, index, chunk.Length - index);
if (bytesRead == 0)
{
break;
}
index += bytesRead;
}
if (index != 0) // Our previous chunk may have been the last one
{
SendChunk(chunk, index); // index is the number of bytes in the chunk
}
if (index != chunk.Length) // We didn't read a full chunk: we're done
{
return;
}
}
If I was more awake I'd probably find a more readable way of writing this, but it'll do for now. One option is to extract another method from the middle section:
// Attempts to read an entire chunk into the given array; returns the size of
// chunk actually read.
int ReadChunk(Stream stream, byte[] chunk)
{
int index = 0;
while (index < chunk.Length)
{
int bytesRead = stream.Read(chunk, index, chunk.Length - index);
if (bytesRead == 0)
{
break;
}
index += bytesRead;
}
return index;
}
var b = new byte[1<<15]; // 32k
int count;
while((count = inStream.Read(b, 0, b.Length)) > 0)
{
outStream.Write(b, 0, count);
}
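Hooked up to the question's scenario (names like clientSock and textBox1 come from the question's code; this is only a sketch), inStream could be a FileStream over the file being sent and outStream a NetworkStream over the connected socket:
using (var inStream = File.OpenRead(textBox1.Text))   // the file chosen in the UI
using (var outStream = new NetworkStream(clientSock)) // wraps the connected socket
{
    var b = new byte[1 << 15]; // 32k
    int count;
    while ((count = inStream.Read(b, 0, b.Length)) > 0)
    {
        outStream.Write(b, 0, count);
    }
}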
public static IEnumerable<byte[]> SplitStreamIntoChunks(Stream stream, int chunkSize)
{
var bytesRemaining = stream.Length;
while (bytesRemaining > 0)
{
var size = Math.Min((int) bytesRemaining, chunkSize);
var buffer = new byte[size];
var bytesRead = stream.Read(buffer, 0, size);
if (bytesRead <= 0)
break;
// Read may return fewer bytes than requested; trim the chunk so it only
// contains the data that was actually read.
if (bytesRead < size)
Array.Resize(ref buffer, bytesRead);
yield return buffer;
bytesRemaining -= bytesRead;
}
}
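One way this iterator could be consumed, assuming a SendChunk helper along the lines of the one referenced in the earlier answer (its implementation is up to you) and fileName as the path of the file to send:
using (var input = File.OpenRead(fileName))
{
    foreach (byte[] chunk in SplitStreamIntoChunks(input, 64 * 1024))
    {
        SendChunk(chunk, chunk.Length); // chunk is already trimmed to the bytes read
    }
}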

Problem in splitting a file

int bufferlength = 12488;
int pointer = 1;
int offset = 0;
int length = 0;
FileStream fstwrite = new FileStream("D:\\Movie.wmv", FileMode.Create);
while (pointer != 0)
{
byte[] buff = new byte[bufferlength];
FileStream fst = new FileStream("E:\\Movie.wmv", FileMode.Open);
pointer = fst.Read(buff, 0, bufferlength);
fst.Close();
fstwrite.Write(buff, offset , pointer);
offset += pointer;
}
I used the above code to split a file and place it on another drive. I'm not able to set the correct offset and length for this routine; can anyone help me fix this?
Splitting in the sense that I split it into "x" KB pieces, pass them somewhere, and recreate the same file in some other location.
I found it at last; thanks to everyone who gave their valuable responses.
Currently you're always reading from the start of the file... and even if you weren't you'd just be copying the whole file.
Here's some code which will actually split a single file into multiple files:
public static void SplitFile(string inputFile,
string outputPrefix,
int chunkSize)
{
byte[] buffer = new byte[chunkSize];
using (Stream input = File.OpenRead(inputFile))
{
int index = 0;
while (input.Position < input.Length)
{
using (Stream output = File.Create(outputPrefix + index))
{
int chunkBytesRead = 0;
while (chunkBytesRead < chunkSize)
{
int bytesRead = input.Read(buffer,
chunkBytesRead,
chunkSize - chunkBytesRead);
// End of input
if (bytesRead == 0)
{
break;
}
chunkBytesRead += bytesRead;
}
output.Write(buffer, 0, chunkBytesRead);
}
index++;
}
}
}
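For example (paths and chunk size are illustrative), this call would split E:\Movie.wmv into D:\Movie.part0, D:\Movie.part1, ... of at most 1 MB each:
SplitFile(@"E:\Movie.wmv", @"D:\Movie.part", 1024 * 1024);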
You're reading bufferlength bytes. Shouldn't you set the offset like this then?
offset += bufferlength;
Don't open your source file inside the loop, or you'll always read the first chunk.
Open it before the loop, then make sure your offset is applied to the read.
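A sketch of that adjustment, keeping the variable names and paths from the question (the source is opened once, before the loop, and its own position tracks the offset):
int bufferlength = 12488;
int bytesRead;
using (var fstwrite = new FileStream(@"D:\Movie.wmv", FileMode.Create))
using (var fst = new FileStream(@"E:\Movie.wmv", FileMode.Open)) // opened once, before the loop
{
    byte[] buff = new byte[bufferlength];
    while ((bytesRead = fst.Read(buff, 0, bufferlength)) > 0)    // FileStream advances its own position
    {
        fstwrite.Write(buff, 0, bytesRead);                      // write only the bytes actually read
    }
}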

How do I convert a Stream into a byte[] in C#? [duplicate]

This question already has answers here:
Creating a byte array from a stream
(18 answers)
Closed 6 years ago.
Is there a simple way or method to convert a Stream into a byte[] in C#?
The shortest solution I know:
using(var memoryStream = new MemoryStream())
{
sourceStream.CopyTo(memoryStream);
return memoryStream.ToArray();
}
Call the following function like this:
byte[] m_Bytes = StreamHelper.ReadToEnd (mystream);
Function:
public static byte[] ReadToEnd(System.IO.Stream stream)
{
long originalPosition = 0;
if(stream.CanSeek)
{
originalPosition = stream.Position;
stream.Position = 0;
}
try
{
byte[] readBuffer = new byte[4096];
int totalBytesRead = 0;
int bytesRead;
while ((bytesRead = stream.Read(readBuffer, totalBytesRead, readBuffer.Length - totalBytesRead)) > 0)
{
totalBytesRead += bytesRead;
if (totalBytesRead == readBuffer.Length)
{
int nextByte = stream.ReadByte();
if (nextByte != -1)
{
byte[] temp = new byte[readBuffer.Length * 2];
Buffer.BlockCopy(readBuffer, 0, temp, 0, readBuffer.Length);
Buffer.SetByte(temp, totalBytesRead, (byte)nextByte);
readBuffer = temp;
totalBytesRead++;
}
}
}
byte[] buffer = readBuffer;
if (readBuffer.Length != totalBytesRead)
{
buffer = new byte[totalBytesRead];
Buffer.BlockCopy(readBuffer, 0, buffer, 0, totalBytesRead);
}
return buffer;
}
finally
{
if(stream.CanSeek)
{
stream.Position = originalPosition;
}
}
}
I use this extension class:
public static class StreamExtensions
{
public static byte[] ReadAllBytes(this Stream instream)
{
if (instream is MemoryStream)
return ((MemoryStream) instream).ToArray();
using (var memoryStream = new MemoryStream())
{
instream.CopyTo(memoryStream);
return memoryStream.ToArray();
}
}
}
Just copy the class to your solution and you can use it on every stream:
byte[] bytes = myStream.ReadAllBytes();
Works great for all my streams and saves a lot of code!
Of course you can modify this method to use some of the other approaches here to improve performance if needed, but I like to keep it simple.
In .NET Framework 4 and later, the Stream class has a built-in CopyTo method that you can use.
For earlier versions of the framework, the handy helper function to have is:
public static void CopyStream(Stream input, Stream output)
{
byte[] b = new byte[32768];
int r;
while ((r = input.Read(b, 0, b.Length)) > 0)
output.Write(b, 0, r);
}
Then use one of the above methods to copy to a MemoryStream and call GetBuffer on it:
var file = new FileStream("c:\\foo.txt", FileMode.Open);
var mem = new MemoryStream();
// If using .NET 4 or later:
file.CopyTo(mem);
// Otherwise:
CopyStream(file, mem);
// getting the internal buffer (no additional copying)
byte[] buffer = mem.GetBuffer();
long length = mem.Length; // the actual length of the data
// (the array may be longer)
// if you need the array to be exactly as long as the data
byte[] truncated = mem.ToArray(); // makes another copy
Edit: originally I suggested using Jason's answer for a Stream that supports the Length property. But it had a flaw because it assumed that the Stream would return all its contents in a single Read, which is not necessarily true (not for a Socket, for example.) I don't know if there is an example of a Stream implementation in the BCL that does support Length but might return the data in shorter chunks than you request, but as anyone can inherit Stream this could easily be the case.
It's probably simpler for most cases to use the above general solution, but supposing you did want to read directly into an array that is bigEnough:
byte[] b = new byte[bigEnough];
int r, offset = 0;
while ((r = input.Read(b, offset, b.Length - offset)) > 0)
offset += r;
That is, repeatedly call Read and move the position you will be storing the data at.
Byte[] Content = new BinaryReader(file.InputStream).ReadBytes(file.ContentLength);
byte[] buf; // byte array
Stream stream = Page.Request.InputStream; // the incoming request stream
buf = new byte[stream.Length]; // allocate a buffer the size of the stream
stream.Read(buf, 0, buf.Length); // read from the stream into the byte array
Ok, maybe I'm missing something here, but this is the way I do it:
public static Byte[] ToByteArray(this Stream stream) {
Int32 length = stream.Length > Int32.MaxValue ? Int32.MaxValue : Convert.ToInt32(stream.Length);
Byte[] buffer = new Byte[length];
stream.Read(buffer, 0, length);
return buffer;
}
If you post a file from a mobile device or elsewhere:
byte[] fileData = null;
using (var binaryReader = new BinaryReader(Request.Files[0].InputStream))
{
fileData = binaryReader.ReadBytes(Request.Files[0].ContentLength);
}
Stream s;
int len = (int)s.Length;
byte[] b = new byte[len];
int pos = 0;
int r;
while((r = s.Read(b, pos, len - pos)) > 0) {
pos += r;
}
A slightly more complicated solution is necessary if s.Length exceeds Int32.MaxValue. But if you need to read a stream that large into memory, you might want to think about a different approach to your problem.
Edit: If your stream does not support the Length property, modify using Earwicker's workaround.
public static class StreamExtensions {
// Credit to Earwicker
public static void CopyStream(this Stream input, Stream output) {
byte[] b = new byte[32768];
int r;
while ((r = input.Read(b, 0, b.Length)) > 0) {
output.Write(b, 0, r);
}
}
}
[...]
Stream s;
MemoryStream ms = new MemoryStream();
s.CopyStream(ms);
byte[] b = ms.GetBuffer();
"bigEnough" array is a bit of a stretch. Sure, buffer needs to be "big ebough" but proper design of an application should include transactions and delimiters. In this configuration each transaction would have a preset length thus your array would anticipate certain number of bytes and insert it into correctly sized buffer. Delimiters would ensure transaction integrity and would be supplied within each transaction. To make your application even better, you could use 2 channels (2 sockets). One would communicate fixed length control message transactions that would include information about size and sequence number of data transaction to be transferred using data channel. Receiver would acknowledge buffer creation and only then data would be sent.
If you have no control over stream sender than you need multidimensional array as a buffer. Component arrays would be small enough to be manageable and big enough to be practical based on your estimate of expected data. Process logic would seek known start delimiters and then ending delimiter in subsequent element arrays. Once ending delimiter is found, new buffer would be created to store relevant data between delimiters and initial buffer would have to be restructured to allow data disposal.
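As a rough sketch of the length-prefixed idea (all names and sizes here are illustrative, not from the original post), the receiver reads a fixed 4-byte header first and then loops until exactly that many payload bytes have arrived:
// Sketch only: read exactly `count` bytes, looping because Read may return
// fewer bytes than requested.
static byte[] ReadExactly(Stream stream, int count)
{
    byte[] data = new byte[count];
    int offset = 0;
    while (offset < count)
    {
        int read = stream.Read(data, offset, count - offset);
        if (read == 0)
            throw new EndOfStreamException("Stream ended before the full transaction arrived.");
        offset += read;
    }
    return data;
}

// A transaction framed as a 4-byte length prefix followed by the payload.
static byte[] ReadTransaction(Stream stream)
{
    byte[] header = ReadExactly(stream, 4);
    int payloadLength = BitConverter.ToInt32(header, 0);
    return ReadExactly(stream, payloadLength);
}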
As for the code to convert a stream into a byte array, one version is below:
Stream s = yourStream;
int streamEnd = Convert.ToInt32(s.Length);
byte[] buffer = new byte[streamEnd];
s.Read(buffer, 0, streamEnd);
Quick and dirty technique:
static byte[] StreamToByteArray(Stream inputStream)
{
if (!inputStream.CanRead)
{
throw new ArgumentException();
}
// This is optional
if (inputStream.CanSeek)
{
inputStream.Seek(0, SeekOrigin.Begin);
}
byte[] output = new byte[inputStream.Length];
int bytesRead = inputStream.Read(output, 0, output.Length);
Debug.Assert(bytesRead == output.Length, "Bytes read from stream matches stream length");
return output;
}
Test:
static void Main(string[] args)
{
byte[] data;
string path = @"C:\Windows\System32\notepad.exe";
using (FileStream fs = File.Open(path, FileMode.Open, FileAccess.Read))
{
data = StreamToByteArray(fs);
}
Debug.Assert(data.Length > 0);
Debug.Assert(new FileInfo(path).Length == data.Length);
}
I would ask why you want to read a stream into a byte[]. If you are wishing to copy the contents of a stream, may I suggest using a MemoryStream and writing your input stream into it.
You could also try just reading in parts at a time and expanding the byte array being returned:
public byte[] StreamToByteArray(string fileName)
{
byte[] total_stream = new byte[0];
using (Stream input = File.Open(fileName, FileMode.Open, FileAccess.Read))
{
byte[] stream_array = new byte[0];
// Setup whatever read size you want (small here for testing)
byte[] buffer = new byte[32];// * 1024];
int read = 0;
while ((read = input.Read(buffer, 0, buffer.Length)) > 0)
{
stream_array = new byte[total_stream.Length + read];
total_stream.CopyTo(stream_array, 0);
Array.Copy(buffer, 0, stream_array, total_stream.Length, read);
total_stream = stream_array;
}
}
return total_stream;
}
