How to rewrite file as byte array fast C#

Hello, I am trying to rewrite a file by replacing bytes, but it takes too much time on large files. For example, on a 700 MB file this code ran for about 6 minutes. Please help me make it run in less than 1 minute.
static private void _12_56(string fileName)
{
    byte[] byteArray = File.ReadAllBytes(fileName);
    for (int i = 0; i < byteArray.Length - 6; i += 6)
    {
        Swap(ref byteArray[i], ref byteArray[i + 4]);
        Swap(ref byteArray[i + 1], ref byteArray[i + 5]);
    }
    File.WriteAllBytes(fileName, byteArray);
}
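(The Swap helper is not shown in the question; presumably it is something like the following.)
// Hypothetical Swap helper, as referenced by the code above:
static void Swap(ref byte a, ref byte b)
{
    byte tmp = a;
    a = b;
    b = tmp;
}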

Read the file in chunks of bytes whose size is divisible by 6.
Replace the necessary bytes in each chunk and write each chunk to another file before reading the next chunk.
You can also try to perform the read of the next chunk in parallel with the write of the previous chunk:
using (var source = new FileStream(@"c:\temp\test.txt", FileMode.Open, FileAccess.Read, FileShare.ReadWrite))
{
    using (var target = new FileStream(@"c:\temp\test.txt", FileMode.Open, FileAccess.Write, FileShare.ReadWrite))
    {
        await RewriteFile(source, target);
    }
}
private async Task RewriteFile(FileStream source, FileStream target)
{
    // We're reading bufferSize bytes from the source stream into one half of the buffer
    // while the writeTask is writing the other half of the buffer to the target stream.
    // Define how many chunks of 6 bytes you want to read per read operation.
    int chunksPerBuffer = 1;
    int bufferSize = 6 * chunksPerBuffer;
    // Declare a byte array that contains both the bytes that are read
    // and the bytes that are being written in parallel.
    byte[] buffer = new byte[bufferSize * 2];
    // curoff is the start position of the bytes we're working with in the buffer.
    int curoff = 0;
    Task writeTask = Task.CompletedTask;
    int len;
    // Read the desired number of bytes from the file into the buffer.
    // The first read places the bytes in the first half of the buffer,
    // the next read places them in the second half, and so on.
    while ((len = await source.ReadAsync(buffer, curoff, bufferSize).ConfigureAwait(false)) != 0)
    {
        // Swap the bytes in the part of the buffer that was just read.
        // Every 1st byte is exchanged with the 4th byte; every 2nd byte with the 5th.
        // Only swap within the bytes actually read (len may be less than bufferSize).
        for (int i = curoff; i + 5 < curoff + len; i += 6)
        {
            Swap(ref buffer[i], ref buffer[i + 4]);
            Swap(ref buffer[i + 1], ref buffer[i + 5]);
        }
        // Wait until the previous write task has completed.
        await writeTask.ConfigureAwait(false);
        // Start writing the bytes that have just been processed.
        // Do not await the task here, so that the next bytes
        // can be read in parallel.
        writeTask = target.WriteAsync(buffer, curoff, len);
        // Move the pointer to the beginning of the other half of the buffer.
        curoff ^= bufferSize;
    }
    // Make sure that the last write also finishes before closing
    // the target stream.
    await writeTask.ConfigureAwait(false);
}
The code above reads the file, swaps the bytes, and rewrites them to the same file, overlapping each write with the next read.
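Note that with chunksPerBuffer = 1 each read and write is only 6 bytes, which will be very slow; in practice you would presumably raise it so that each I/O operation covers tens of kilobytes or more.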

As the other answer says, you have to read the file in chunks.
Since you are rewriting the same file, it's easiest to use the same stream for reading and writing.
using (var file = File.Open(path, FileMode.Open, FileAccess.ReadWrite))
{
    // Read buffer. Size must be divisible by 6.
    var buffer = new byte[6 * 1000];
    // Keep track of how much we've read in each iteration.
    var bytesRead = 0;
    // Fill the buffer and put the number of bytes read into 'bytesRead'.
    // Stop looping if we read fewer than 6 bytes.
    // EOF is signalled by Read returning 0.
    while ((bytesRead = file.Read(buffer, 0, buffer.Length)) >= 6)
    {
        // Swap the bytes in the current buffer.
        for (int i = 0; i + 5 < bytesRead; i += 6)
        {
            Swap(ref buffer[i], ref buffer[i + 4]);
            Swap(ref buffer[i + 1], ref buffer[i + 5]);
        }
        // Step back in the file, to where we filled the buffer from.
        file.Position -= bytesRead;
        // Overwrite with the swapped bytes.
        file.Write(buffer, 0, bytesRead);
    }
}

Related

Fastest way to split a Stream according to a pattern

What would be the most optimal/fastest way to split a Stream into chunks delimited by a byte pattern (e.g. new byte[] { 0, 0 })?
My current implementation, naive and slow, reads the stream byte by byte and decrements a counter each time it encounters the delimiter. When the counter hits zero, it yields a memory chunk.
const int NUMBER_CONSECUTIVE_DELIMITER = 2;
const int DELIMITER = 0;

public IEnumerable<ReadOnlyMemory<byte>> Chunk(Stream stream)
{
    var chunk = new MemoryStream();
    try
    {
        int b; // the byte being read
        int c = NUMBER_CONSECUTIVE_DELIMITER;
        while ((b = stream.ReadByte()) != -1) // read the stream byte by byte; -1 = end of the stream
        {
            chunk.WriteByte((byte)b); // write this byte to the current chunk
            if (b == DELIMITER)
                c--; // if we hit the delimiter (i.e. '0'), decrement the counter
            else
                c = NUMBER_CONSECUTIVE_DELIMITER; // else, reset the counter
            if (c <= 0 || stream.Position == stream.Length) // we hit two subsequent '0's, or the end of the stream
            {
                var r = chunk.ToArray().AsMemory(); // turn it into a Memory<T>
                chunk.Dispose();
                chunk = new MemoryStream();
                c = NUMBER_CONSECUTIVE_DELIMITER; // reset the counter for the next chunk
                yield return r;
            }
        }
    }
    finally
    {
        chunk.Dispose();
    }
}
Such a splitter is difficult to implement efficiently because a stream has to be read in fixed-size buffers, and a buffer can be too big or too small for the content being interpreted. The ReadOnlySequence<T> struct was added to solve exactly this problem.
By using System.IO.Pipelines (available as a NuGet package) this problem can be solved as follows:
public static async Task FillPipeAsync(Stream stream, PipeWriter writer, CancellationToken cancellationToken = default)
{
    // The minimum buffer size to request for each buffer segment.
    const int bufferSize = 65536;
    while (true)
    {
        // Request at least 65536 bytes from the PipeWriter.
        Memory<byte> memory = writer.GetMemory(bufferSize);
        // Read the content from the stream.
        int bytesRead = await stream.ReadAsync(memory, cancellationToken).ConfigureAwait(false);
        if (bytesRead == 0) break;
        // Tell the writer how many bytes were read.
        writer.Advance(bytesRead);
        // Flush the data to the PipeWriter.
        FlushResult result = await writer.FlushAsync(cancellationToken).ConfigureAwait(false);
        if (result.IsCompleted) break;
    }
    // This notifies the reading side that no more data is coming.
    await writer.CompleteAsync().ConfigureAwait(false);
}
This will read your stream asynchronously and write a buffer segment to the pipe. Next you have to implement a read logic to slice/merge the concatenated buffer segments into chunks:
public static async IAsyncEnumerable<ReadOnlySequence<byte>> ReadPipeAsync(PipeReader reader, ReadOnlyMemory<byte> delimiter,
    [EnumeratorCancellation] CancellationToken cancellationToken = default)
{
    while (true)
    {
        // Read from the PipeReader.
        ReadResult result = await reader.ReadAsync(cancellationToken).ConfigureAwait(false);
        ReadOnlySequence<byte> buffer = result.Buffer;
        while (TryReadChunk(ref buffer, delimiter.Span, out ReadOnlySequence<byte> chunk))
            yield return chunk;
        // Tell the PipeReader how many bytes were consumed and examined.
        // This is essential, because it lets the Pipe release buffer segments that are no longer in use.
        reader.AdvanceTo(buffer.Start, buffer.End);
        // Handle the completion notification and return the last buffer.
        if (result.IsCompleted)
        {
            yield return buffer;
            break;
        }
    }
    await reader.CompleteAsync().ConfigureAwait(false);
}
private static bool TryReadChunk(ref ReadOnlySequence<byte> buffer, ReadOnlySpan<byte> delimiter,
    out ReadOnlySequence<byte> chunk)
{
    // Search the buffer for the first byte of the delimiter.
    SequencePosition? position = buffer.PositionOf(delimiter[0]);
    // If no occurrence was found, or the following bytes in the buffer do not match the delimiter, return false.
    if (position is null || !buffer.Slice(position.Value, delimiter.Length).FirstSpan.StartsWith(delimiter))
    {
        chunk = default;
        return false;
    }
    // Return the chunk up to the delimiter, and cut the chunk plus delimiter from the start of the buffer.
    chunk = buffer.Slice(0, position.Value);
    buffer = buffer.Slice(buffer.GetPosition(delimiter.Length, position.Value));
    return true;
}
For this to work in this form, you have to use IAsyncEnumerable so that the chunks can be streamed into an await foreach loop. Merging and slicing is largely handled by the pipe, so a reliable algorithm can be built here with relatively little code, and it performs well.
Usage:
// Create a Pipe that manages the buffer.
Pipe pipe = new Pipe();
ConfiguredTaskAwaitable writing = FillPipeAsync(stream, pipe.Writer).ConfigureAwait(false);
// The delimiter to split on. This can be any data with length > 0.
ReadOnlyMemory<byte> delimiter = new ReadOnlyMemory<byte>(new byte[] { 0, 0 });
// 'await foreach' and 'await writing' execute asynchronously (in parallel).
await foreach (ReadOnlySequence<byte> chunk in ReadPipeAsync(pipe.Reader, delimiter))
{
    // Use "chunk" to retrieve your chunked content.
}
await writing;
Note that reading and chunking is done asynchronously and independently.
I eventually ended up with the below code, strongly inspired by Philipp's answer above and https://keestalkstech.com/2010/11/seek-position-of-a-string-in-a-file-or-filestream/.
public override IEnumerable<byte[]> Chunk(Stream stream)
{
    // bufferSize, MinChunkSize and Delimiter are fields of the containing class;
    // Concat concatenates two byte arrays.
    var buffer = new byte[bufferSize];
    var size = bufferSize;
    var offset = 0;
    var position = stream.Position;
    var nextChunk = Array.Empty<byte>();
    while (true)
    {
        var bytesRead = stream.Read(buffer, offset, size);
        // When no bytes are read, the delimiter could not be found.
        if (bytesRead <= 0)
            break;
        // When fewer than 'size' bytes are read, slice the buffer to prevent reading "previous" bytes.
        ReadOnlySpan<byte> ro = buffer;
        if (bytesRead < size)
            ro = ro.Slice(0, offset + bytesRead);
        // Check if we can find our delimiter in the buffer.
        var i = ro.IndexOf(Delimiter);
        if (i > -1 && // we found something
            i <= bytesRead && // we found it in the area that was read (at the end of the buffer, the last values are not overwritten; i == bytesRead if the delimiter is at the very end)
            nextChunk.Length + (i + Delimiter.Length - offset) >= MinChunkSize) // the chunk that will be produced is large enough
        {
            var chunk = buffer[offset..(i + Delimiter.Length)];
            yield return Concat(nextChunk, chunk);
            nextChunk = Array.Empty<byte>();
            offset = 0;
            size = bufferSize;
            position += i + Delimiter.Length;
            stream.Position = position;
            continue;
        }
        else if (stream.Position == stream.Length)
        {
            // We're at the end of the stream.
            var chunk = buffer[offset..(bytesRead + offset)]; // return the bytes read
            yield return Concat(nextChunk, chunk);
            break;
        }
        // The stream is not finished. Copy the last Delimiter.Length bytes to the
        // beginning of the buffer and set the offset to fill the buffer after them.
        nextChunk = Concat(nextChunk, buffer[offset..buffer.Length]);
        offset = Delimiter.Length;
        size = bufferSize - offset;
        Array.Copy(buffer, buffer.Length - offset, buffer, 0, offset);
        position += bufferSize - offset;
    }
}
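(Concat is not shown in that code; presumably it is a simple array-concatenation helper along these lines.)
// Hypothetical Concat helper, assumed by the code above:
private static byte[] Concat(byte[] first, byte[] second)
{
    var result = new byte[first.Length + second.Length];
    Buffer.BlockCopy(first, 0, result, 0, first.Length);
    Buffer.BlockCopy(second, 0, result, first.Length, second.Length);
    return result;
}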

Copy all but the last 16 bytes of a stream? Early detection of end-of-stream?

This is C# related. We have a case where we need to copy the entire source stream into a destination stream except for the last 16 bytes.
EDIT: The streams can range up to 40 GB, so we can't do a static byte[] allocation (e.g. .ToArray()).
Looking at the MSDN documentation, it seems that we can reliably determine the end of stream only when the return value is 0. Return values between 0 and the requested size can imply bytes are "not currently available" (what does that really mean?)
Currently it copies every single byte as follows. inStream and outStream are generic - can be memory, disk or network streams (actually some more too).
public static void StreamCopy(Stream inStream, Stream outStream)
{
    var buffer = new byte[8 * 1024];
    var last16Bytes = new byte[16];
    int bytesRead;
    while ((bytesRead = inStream.Read(buffer, 0, buffer.Length)) > 0)
    {
        outStream.Write(buffer, 0, bytesRead);
    }
    // Issues:
    // 1. We already wrote the last 16 bytes into
    //    outStream (possibly over the n/w)
    // 2. last16Bytes = ? (inStream may not necessarily support rewinding)
}
What is a reliable way to ensure all but the last 16 are copied? I can think of using Position and Length on the inStream but there is a gotcha on MSDN that says
If a class derived from Stream does not support seeking, calls to Length, SetLength, Position, and Seek throw a NotSupportedException.
1. Read between 1 and n bytes from the input stream. [1]
2. Append the bytes to a circular buffer. [2]
3. Write the first max(0, b - 16) bytes from the circular buffer to the output stream, where b is the number of bytes in the circular buffer.
4. Remove the bytes that you have just written from the circular buffer.
5. Go to step 1.
[1] This is what the Read method does: if you call int n = Read(buffer, 0, 500); it will read between 1 and 500 bytes into buffer and return the number of bytes read. If Read returns 0, you have reached the end of the stream.
[2] For maximum performance, you can read the bytes directly from the input stream into the circular buffer. This is a bit tricky, because you have to deal with the wraparound within the array underlying the buffer.
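A minimal sketch of this algorithm (my illustration; it replaces the circular buffer with a plain carry array that is compacted after each write, which is simpler but equivalent):
public static void CopyAllButLast16(Stream inStream, Stream outStream)
{
    const int tailSize = 16;
    var readBuffer = new byte[8 * 1024];
    var held = new byte[tailSize + readBuffer.Length]; // bytes read but not yet written
    int heldCount = 0;
    int n;
    while ((n = inStream.Read(readBuffer, 0, readBuffer.Length)) > 0)
    {
        // Append the new bytes to the held bytes (step 2).
        Array.Copy(readBuffer, 0, held, heldCount, n);
        heldCount += n;
        // Write everything except the last 16 held bytes (step 3).
        int writable = Math.Max(0, heldCount - tailSize);
        if (writable > 0)
        {
            outStream.Write(held, 0, writable);
            // Remove the written bytes (step 4) by shifting the tail to the front.
            Array.Copy(held, writable, held, 0, heldCount - writable);
            heldCount -= writable;
        }
    }
    // The last up-to-16 bytes remain in 'held' and are never written.
}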
The following solution is fast and tested. Hope it's useful. It uses the double-buffering idea you already had in mind. EDIT: simplified the loop by removing the conditional that separated the first iteration from the rest.
public static void StreamCopy(Stream inStream, Stream outStream) {
    // Define the size of the chunk to copy during each iteration (1 KiB)
    const int blockSize = 1024;
    const int bytesToOmit = 16;
    const int buffSize = blockSize + bytesToOmit;
    // Generate working buffers
    byte[] buffer1 = new byte[buffSize];
    byte[] buffer2 = new byte[buffSize];
    // Initialize first iteration
    byte[] curBuffer = buffer1;
    byte[] prevBuffer = null;
    int bytesRead;
    // Attempt to fully fill the buffer.
    // (Note: this assumes Read fills the buffer completely except at the end of
    // the stream, which holds for FileStream but not for streams in general.)
    bytesRead = inStream.Read(curBuffer, 0, buffSize);
    if( bytesRead == buffSize ) {
        // We successfully retrieved a whole buffer; we output
        // only [blockSize] bytes, to avoid writing the last
        // bytes in the buffer in case they happen to
        // be the final 16 bytes of the stream
        outStream.Write(curBuffer, 0, blockSize);
    } else {
        // We couldn't retrieve the whole buffer
        int bytesToWrite = bytesRead - bytesToOmit;
        if( bytesToWrite > 0 ) {
            outStream.Write(curBuffer, 0, bytesToWrite);
        }
        // There's no more data to process
        return;
    }
    curBuffer = buffer2;
    prevBuffer = buffer1;
    while( true ) {
        // Attempt again to fully fill the buffer
        bytesRead = inStream.Read(curBuffer, 0, buffSize);
        if( bytesRead == buffSize ) {
            // We retrieved a whole buffer: output the last 16
            // bytes of the previous buffer first, then just [blockSize]
            // bytes from the current buffer
            outStream.Write(prevBuffer, blockSize, bytesToOmit);
            outStream.Write(curBuffer, 0, blockSize);
        } else {
            // We could not retrieve a complete buffer
            if( bytesRead <= bytesToOmit ) {
                // The bytes to output come solely from the previous buffer
                outStream.Write(prevBuffer, blockSize, bytesRead);
            } else {
                // The bytes to output come from the previous buffer and
                // the current buffer
                outStream.Write(prevBuffer, blockSize, bytesToOmit);
                outStream.Write(curBuffer, 0, bytesRead - bytesToOmit);
            }
            break;
        }
        // Swap buffers for the next iteration
        byte[] swap = prevBuffer;
        prevBuffer = curBuffer;
        curBuffer = swap;
    }
}
static void Assert(Stream inStream, Stream outStream) {
    // Routine that tests that the copy worked as expected
    const int bytesToOmit = 16;
    inStream.Seek(0, SeekOrigin.Begin);
    outStream.Seek(0, SeekOrigin.Begin);
    Debug.Assert(outStream.Length == Math.Max(inStream.Length - bytesToOmit, 0));
    for( int i = 0; i < outStream.Length; i++ ) {
        int byte1 = inStream.ReadByte();
        int byte2 = outStream.ReadByte();
        Debug.Assert(byte1 == byte2);
    }
}
A much easier solution to code, though slower since it works at the byte level, is to put an intermediate queue between the input stream and the output stream. The process first reads and enqueues 16 bytes from the input stream. Then it iterates over the remaining input bytes: read a single byte from the input stream, enqueue it, then dequeue a byte and write it to the output stream, until all bytes from the input stream are processed. The unwanted 16 bytes are left behind in the queue. A sketch of this idea follows below.
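(A minimal sketch of that queue-based idea; the method name is my own, not from the original answer.)
// Byte-level queue approach: slow but very simple.
public static void StreamCopyQueue(Stream inStream, Stream outStream)
{
    var queue = new Queue<byte>();
    int b;
    while ((b = inStream.ReadByte()) != -1)
    {
        queue.Enqueue((byte)b);
        // Once more than 16 bytes are queued, the oldest byte is
        // guaranteed not to be among the last 16, so write it out.
        if (queue.Count > 16)
            outStream.WriteByte(queue.Dequeue());
    }
    // The final 16 (or fewer) bytes linger in the queue, unwritten.
}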
Hope this helps!
=)
Using a circular buffer sounds great, but there is no circular buffer class in .NET, which means additional code anyway. I ended up with the following algorithm, a sort of map-and-copy; I think it's simple. The variable names are longer than usual for the sake of being self-descriptive here.
This flows thru the buffers as
[outStream] <== [tailBuf] <== [mainBuf] <== [inStream]
public byte[] CopyStreamExtractLastBytes(Stream inStream, Stream outStream,
    int extractByteCount)
{
    //var mainBuf = new byte[1024*4]; // 4K buffer, ok for network too
    var mainBuf = new byte[4651]; // nearby prime, for testing
    int mainBufValidCount;
    var tailBuf = new byte[extractByteCount];
    int tailBufValidCount = 0;
    while ((mainBufValidCount = inStream.Read(mainBuf, 0, mainBuf.Length)) > 0)
    {
        // Map: how much of what (passthru/tail) lives where (mainBuf/tailBuf).
        // Everything beyond the tail is passthru.
        int totalPassthruCount = Math.Max(0, tailBufValidCount +
            mainBufValidCount - extractByteCount);
        int tailBufPassthruCount = Math.Min(tailBufValidCount, totalPassthruCount);
        int tailBufTailCount = tailBufValidCount - tailBufPassthruCount;
        int mainBufPassthruCount = totalPassthruCount - tailBufPassthruCount;
        int mainBufResidualCount = mainBufValidCount - mainBufPassthruCount;
        // Copy: passthru must be flushed in FIFO order (tailBuf, then mainBuf).
        outStream.Write(tailBuf, 0, tailBufPassthruCount);
        outStream.Write(mainBuf, 0, mainBufPassthruCount);
        // Copy: now reassemble/compact the tail into tailBuf.
        var tempResidualBuf = new byte[extractByteCount];
        Array.Copy(tailBuf, tailBufPassthruCount, tempResidualBuf, 0,
            tailBufTailCount);
        Array.Copy(mainBuf, mainBufPassthruCount, tempResidualBuf,
            tailBufTailCount, mainBufResidualCount);
        tailBufValidCount = tailBufTailCount + mainBufResidualCount;
        tailBuf = tempResidualBuf;
    }
    return tailBuf;
}
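Usage would look something like this (file names are placeholders of my own):
// Copy everything except the last 16 bytes, then inspect those 16 bytes.
using (var inStream = File.OpenRead("input.bin"))
using (var outStream = File.Create("output.bin"))
{
    byte[] last16 = CopyStreamExtractLastBytes(inStream, outStream, 16);
    Console.WriteLine("Extracted bytes: {0}", BitConverter.ToString(last16));
}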

C# split byte array from file

Hello, I'm writing an encryption algorithm which reads bytes from a file (of any type) and outputs them into another file. The problem is that my encryption program takes only blocks of 16 bytes, so if the file is bigger it has to be split into blocks of 16; alternatively, if there's a way to read 16 bytes from the file at a time, that's fine too.
The algorithm works fine with a hard-coded input of 16 bytes. The ciphered result has to be saved in a list or array, because it has to be deciphered the same way later. I can't post my whole program, but here's what I do in Main so far; I cannot get results.
static void Main(String[] args)
{
    byte[] bytes = File.ReadAllBytes("path to file");
    var stream = new StreamReader(new MemoryStream(bytes));
    byte[] cipherText = new byte[16];
    byte[] decipheredText = new byte[16];
    Console.WriteLine("\nThe message is: ");
    Console.WriteLine(stream.ReadToEnd());
    AES a = new AES(keyInput);
    var list1 = new List<byte[]>();
    for (int i = 0; i < bytes.Length; i += 16)
    {
        a.Cipher(bytes, cipherText);
        list1.Add(cipherText);
    }
    Console.WriteLine("\nThe resulting ciphertext is: ");
    foreach (byte[] b in list1)
    {
        ToBytes(b);
    }
}
I know that my loop always adds the first 16 bytes of the byte array, but I've tried many things and nothing works. It won't let me index into the bytes array or copy an item to a temp variable like temp = bytes[i]. The ToBytes method is irrelevant; it just prints the elements as bytes.
I would recommend changing the interface of your Cipher() method: instead of passing the entire array, it would be better to pass the source and destination arrays along with an offset, for block-by-block encryption.
Pseudo-code is below.
void Cipher(byte[] source, int srcOffset, byte[] dest, int destOffset)
{
    // Cipher the bytes from (source + srcOffset) to (source + srcOffset + 16),
    // writing the result to (dest + destOffset) through (dest + destOffset + 16).
    // I'd also recommend checking that source and dest are at least (offset + 16) bytes long!
}
Usage:
For small files (one memory allocation for destination buffer, block by block encryption):
// You can allocate the entire destination buffer before encryption!
byte[] sourceBuffer = File.ReadAllBytes("path to file");
byte[] destBuffer = new byte[sourceBuffer.Length];
// Encrypt each block.
for (int offset = 0; i < sourceBuffer.Length; offset += 16)
{
Cipher(sourceBuffer, offset, destBuffer, offset);
}
So, the main advantage of this approach is that it eliminates additional memory allocations: the destination array is allocated once, and there are no memory-copy operations.
For files of any size (streams, block by block encryption):
byte[] inputBlock = new byte[16];
byte[] outputBlock = new byte[16];
using (var inputStream = File.OpenRead("input path"))
using (var outputStream = File.Create("output path"))
{
    int bytesRead;
    while ((bytesRead = inputStream.Read(inputBlock, 0, inputBlock.Length)) > 0)
    {
        if (bytesRead < 16)
        {
            // Either throw, or fill the remaining bytes of the input block
            // with some bytes. That operation for the last block is called "padding".
            // See http://en.wikipedia.org/wiki/Block_cipher_modes_of_operation#Padding
            throw new InvalidOperationException("Read block size is not equal to 16 bytes");
        }
        Cipher(inputBlock, 0, outputBlock, 0);
        outputStream.Write(outputBlock, 0, outputBlock.Length);
    }
}
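If you go the padding route instead of throwing, a minimal PKCS#7-style sketch (my illustration, not part of the original answer) would be:
// Pad the final, short block up to 16 bytes, PKCS#7 style:
// each padding byte holds the number of padding bytes added.
// (Full PKCS#7 also appends an entire padding block when the input
// length is an exact multiple of 16; that case is omitted here.)
static void PadBlock(byte[] block, int bytesRead)
{
    byte padValue = (byte)(block.Length - bytesRead);
    for (int i = bytesRead; i < block.Length; i++)
        block[i] = padValue;
}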
No need to read the whole mess into memory if you can only process it a bit at a time...
var filename = @"c:\temp\foo.bin";
using (var fileStream = new FileStream(filename, FileMode.Open))
{
    var buffer = new byte[16];
    var bytesRead = 0;
    while ((bytesRead = fileStream.Read(buffer, 0, buffer.Length)) > 0)
    {
        // do whatever you need to with the next 16-byte block
        Console.WriteLine("Read {0} bytes: {1}",
            bytesRead,
            string.Join(",", buffer));
    }
}
You can use Array.Copy:
// Copy 16 bytes starting at offset i into a fresh temp array.
byte[] temp = new byte[16];
Array.Copy(bytes, i, temp, 0, 16);
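Putting that into the original loop might look like this (a sketch, assuming Cipher consumes its first argument as one 16-byte block; note that a fresh output array is allocated per block, since reusing cipherText would make every list entry reference the same array):
var list1 = new List<byte[]>();
// (Assumes the file length is a multiple of 16; otherwise the final
// partial block needs padding.)
for (int i = 0; i + 16 <= bytes.Length; i += 16)
{
    // Copy the next 16-byte block out of the file contents.
    byte[] temp = new byte[16];
    Array.Copy(bytes, i, temp, 0, 16);
    // Use a fresh output array for each block, so the list
    // doesn't end up holding many references to one array.
    byte[] cipherText = new byte[16];
    a.Cipher(temp, cipherText);
    list1.Add(cipherText);
}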

Processing C# filestream input in WHILE loop causing execution time error

I have a C# console app that I'm trying to create that processes all the files in a given directory and writes output to another given directory. I want to process the input files X bytes at a time.
namespace FileConverter
{
    class Program
    {
        static void Main(string[] args)
        {
            string srcFolder = args[0];
            string destFolder = args[1];
            string[] srcFiles = Directory.GetFiles(srcFolder);
            for (int s = 0; s < srcFiles.Length; s++)
            {
                byte[] fileBuffer;
                int numBytesRead = 0;
                int readBuffer = 10000;
                FileStream srcStream = new FileStream(srcFiles[s], FileMode.Open, FileAccess.Read);
                int fileLength = (int)srcStream.Length;
                string destFile = destFolder + "\\" + Path.GetFileName(srcFiles[s]) + "-processed";
                FileStream destStream = new FileStream(destFile, FileMode.OpenOrCreate, FileAccess.Write);
                // Read and process the source file by some chunk of bytes at a time
                while (numBytesRead < fileLength)
                {
                    fileBuffer = new byte[readBuffer];
                    // Read some bytes into the fileBuffer
                    // TODO: This doesn't work on subsequent blocks
                    int n = srcStream.Read(fileBuffer, numBytesRead, readBuffer);
                    // If we didn't read anything, there's no more to process
                    if (n == 0)
                        break;
                    // Process the fileBuffer
                    for (int i = 0; i < fileBuffer.Length; i++)
                    {
                        // Process each byte in the array here
                    }
                    // Write data
                    destStream.Write(fileBuffer, numBytesRead, readBuffer);
                    numBytesRead += readBuffer;
                }
                srcStream.Close();
                destStream.Close();
            }
        }
    }
}
I'm running into an error at execution time at:
//Read some bytes into the fileBuffer
//TODO: This doesn't work on subsequent blocks
int n = srcStream.Read(fileBuffer, numBytesRead, readBuffer);
I don't want to load the entire file into memory, as it could possibly be many gigabytes in size. I really want to be able to read some number of bytes, process them, write them out to a file, and then read in the next X bytes and repeat.
It gets through one iteration of the loop, and then dies on the second. The error I get is:
"Offset and length were out of bounds for the array or count is greater than the number of elements from index to the end of the source collection."
The sample file I'm working with is about 32k.
Can anyone tell me what I'm doing wrong here?
The second parameter to Read is not the offset into the file - it is the offset into the buffer at which to start writing data. So just pass 0.
Also, don't assume the buffer is filled each time: you should only process "n" bytes from the buffer. And the buffer should be reused between iterations.
If you need to read exactly a number of bytes:
static void ReadOrThrow(Stream source, byte[] buffer, int count) {
    int read, offset = 0;
    while (count > 0 && (read = source.Read(buffer, offset, count)) > 0) {
        offset += read;
        count -= read;
    }
    if (count != 0) throw new EndOfStreamException();
}
Note that Write works similarly, so you need to pass 0 as the offset and n as the count.
It should be
destStream.Write(fileBuffer, 0, n);
numBytesRead += n;
because n is the actual number of bytes that were read (and, per the answer above, 0 is the offset into the buffer).
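Putting both fixes together, the body of the while loop would look roughly like this (a sketch; the per-byte processing is left as in the question, and the buffer is allocated once and reused, as the first answer suggests):
// Reuse one buffer; pass 0 as the buffer offset and write only the n bytes read.
byte[] fileBuffer = new byte[readBuffer];
while (numBytesRead < fileLength)
{
    int n = srcStream.Read(fileBuffer, 0, readBuffer);
    if (n == 0)
        break;
    for (int i = 0; i < n; i++)
    {
        // Process each byte in the array here
    }
    destStream.Write(fileBuffer, 0, n);
    numBytesRead += n;
}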

Error reading file into array

I get the following error on the second iteration of my loop:
Offset and length were out of bounds for the array or count is greater than the number of elements from index to the end of the source collection.
and this is my loop
FileStream fs = new FileStream("D:\\06.Total Eclipse Of The Moon.mp3", FileMode.Open);
byte[] _FileName = new byte[1024];
long _FileLengh = fs.Length;
int position = 0;
for (int i = 1024; i < fs.Length; i += 1024)
{
    fs.Read(_FileName, position, Convert.ToInt32(i));
    sck.Client.Send(_FileName);
    Thread.Sleep(30);
    long unsend = _FileLengh - position;
    if (unsend < 1024)
    {
        position += (int)unsend;
    }
    else
    {
        position += i;
    }
}
fs.Close();
fs.Length = 5505214
On the first iteration, you're calling
fs.Read(_FileName, 0, 1024);
That's fine (although why you're calling Convert.ToInt32 on an int, I don't know.)
On the second iteration, you're going to call
fs.Read(_FileName, position, 2048);
which is trying to read into the _FileName byte array starting at position (which is non-zero) and fetching up to 2048 bytes. The byte array is only 1024 bytes long, so that can't possibly work.
Additional problems:
You haven't used a using statement, so on exceptions you'll leave the stream open
You're ignoring the return value from Read, which means you don't know how much of your buffer has actually been read
You're unconditionally sending the socket the complete buffer, regardless of how much has been read.
Your code should probably look more like this:
using (FileStream fs = File.OpenRead("D:\\06.Total Eclipse Of The Moon.mp3"))
{
byte[] buffer = new byte[1024];
int bytesRead;
while ((bytesRead = fs.Read(buffer, 0, buffer.Length)) > 0)
{
sck.Client.Send(buffer, 0, bytesRead);
// Do you really need this?
Thread.Sleep(30);
}
}
