Read from a StreamReader multiple times - C#

I have a little problem. Here is my code in Visual Studio:
file = new StreamReader("D:\\BaseList.txt");
string line;
while ((line = file.ReadLine()) != null)
{
    listBox1.Items.Add(line);
}
file.Close();                               // 1
file = new StreamReader("D:\\Baza3.txt");   // 2
I read all the lines in the file and would like to read it again from the beginning. Do I have to close the stream and reload the file into a new stream (the lines numbered 1 and 2)?
Is there a method that sets the stream back to the beginning of my file without using those numbered lines?

You can reset the position of the base stream like this:
streamReader.BaseStream.Position = 0;
You can only do that if the base stream is seekable (myStream.CanSeek == true). This is true in your case, since you create the StreamReader from a path string.
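One caveat worth adding: StreamReader keeps its own internal buffer, so after rewinding the base stream you should also call DiscardBufferedData(), otherwise leftover buffered characters may be returned before the re-read data. A minimal sketch of the two-pass read:

using System.IO;

using (var reader = new StreamReader("D:\\BaseList.txt"))
{
    string line;
    while ((line = reader.ReadLine()) != null)
    {
        // first pass over the file
    }

    // rewind the underlying stream and drop StreamReader's buffered data
    reader.BaseStream.Position = 0;
    reader.DiscardBufferedData();

    while ((line = reader.ReadLine()) != null)
    {
        // second pass, starting again from the first line
    }
}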

Try setting the BaseStream's Position to 0, or copy the contents to a MemoryStream before you actually start reading.
Check out this thread:
Return StreamReader to Beginning
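For the MemoryStream variant mentioned above, a rough sketch (assuming the file fits comfortably in memory):

using System.IO;

// Load the file into memory once; the in-memory copy can be rewound cheaply.
var memory = new MemoryStream(File.ReadAllBytes("D:\\BaseList.txt"));
using (var reader = new StreamReader(memory))
{
    string line;
    while ((line = reader.ReadLine()) != null) { /* first pass */ }

    memory.Position = 0;           // rewind the in-memory stream
    reader.DiscardBufferedData();  // clear the reader's buffer as well

    while ((line = reader.ReadLine()) != null) { /* second pass */ }
}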

Related

How to read a log file which is updated hourly in C#?

I'm trying to write a console app in C# which reads a log file. The problem I'm facing is that this log file is updated every hour, so if, for example, it had 10 lines at first and 12 lines later, on my second read attempt I only want to read the 2 newly added lines.
Can you suggest a way to do this efficiently (without reading all the lines again, since the log file usually has 5000+ lines)?
First of all, you can use FileSystemWatcher to get a notification whenever the file changes.
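A minimal sketch of that idea (the directory and file name here are placeholders, not taken from the question):

using System;
using System.IO;

// Raise an event whenever the log file is written to.
var watcher = new FileSystemWatcher(@"C:\logs", "app.log")
{
    NotifyFilter = NotifyFilters.LastWrite | NotifyFilters.Size
};
watcher.Changed += (sender, e) =>
{
    Console.WriteLine($"{e.FullPath} changed, read the newly appended lines here");
};
watcher.EnableRaisingEvents = true;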
Moreover, you can use FileStream and its Seek method to read only the newly added lines. At http://www.codeproject.com/Articles/7568/Tail-NET there is an example using Thread.Sleep:
using (StreamReader reader = new StreamReader(new FileStream(fileName,
           FileMode.Open, FileAccess.Read, FileShare.ReadWrite)))
{
    // start at the end of the file
    long lastMaxOffset = reader.BaseStream.Length;

    while (true)
    {
        System.Threading.Thread.Sleep(100);

        // if the file size has not changed, idle
        if (reader.BaseStream.Length == lastMaxOffset)
            continue;

        // seek to the last max offset
        reader.BaseStream.Seek(lastMaxOffset, SeekOrigin.Begin);

        // read out of the file until the EOF
        string line = "";
        while ((line = reader.ReadLine()) != null)
            Console.WriteLine(line);

        // update the last max offset
        lastMaxOffset = reader.BaseStream.Position;
    }
}

Read a large text file in chunks, line by line

Suppose the text file I have to read contains the following lines:
INFO 2014-03-31 00:26:57,829 332024549ms Service1 startmethod - FillPropertyColor end
INFO 2014-03-31 00:26:57,829 332024549ms Service1 getReports_Dataset - getReports_Dataset started
INFO 2014-03-31 00:26:57,829 332024549ms Service1 cheduledGeneration - SwitchScheduledGeneration start
INFO 2014-03-31 00:26:57,829 332024549ms Service1 cheduledGeneration - SwitchScheduledGeneration limitId, subscriptionId, limitPeriod, dtNextScheduledDate,shoplimittype0, 0, , 3/31/2014 12:26:57 AM,0
I use a FileStream because the text file is over 1 GB in size. I have to read the file in chunks: on the first run the program should read, say, the first two lines (i.e. up to "getReports_Dataset started" on the second line), and on the next run it should start from the 3rd line. I wrote the code below but cannot get the desired output. The first problem is that my code does not resume at the exact position where the previous run stopped. The second problem is that the lines it reads are not complete, i.e. part of each line is missing. Here is the code:
readPosition = getLastReadPosition();
using (FileStream fStream = new FileStream(logFilePath, FileMode.Open, FileAccess.Read, FileShare.ReadWrite))
using (System.IO.StreamReader rdr = new System.IO.StreamReader(fStream))
{
    rdr.BaseStream.Seek(readPosition, SeekOrigin.Begin);
    while (numCharCount > 0)
    {
        int numChars = rdr.ReadBlock(block, 0, block.Length);
        string blockString = new string(block);
        lines = blockString.Split(Convert.ToChar('\r'));
        lines[0] = fragment + lines[0];
        fragment = lines[lines.Length - 1];
        foreach (string line in lines)
        {
            lstTextLog.Add(line);
            if (lstTextLog.Contains(fragment))
            {
                lstTextLog.Remove(fragment);
            }
            numProcessedChar++;
        }
        numCharCount--;
    }
    SetLastPosition(numProcessedChar, logFilePath);
}
If you want to read a file line-by-line, do this:
foreach (string line in File.ReadLines("filename"))
{
    // process line here
}
If you really must read a line and save the position, you need to save the last line number read, rather than the stream position. For example:
int lastLineRead = getLastLineRead();
string nextLine = File.ReadLines("filename").Skip(lastLineRead).FirstOrDefault();
if (nextLine != null)
{
    lastLineRead++;
    SetLastPosition(lastLineRead, logFilePath);
}
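If more than one line may have been appended between runs, the same idea extends naturally. A sketch, reusing the getLastLineRead/SetLastPosition helpers from the snippets above:

using System.IO;
using System.Linq;

// Skip the lines that were already processed, handle the rest, then persist the new count.
int lastLineRead = getLastLineRead();
var newLines = File.ReadLines("filename").Skip(lastLineRead).ToList();
foreach (string line in newLines)
{
    // process line here
}
SetLastPosition(lastLineRead + newLines.Count, logFilePath);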
The reason you can't do this by saving the base stream position is that StreamReader reads a large buffer full of data from the base stream, which moves the file pointer forward by the buffer size. StreamReader then satisfies read requests from that buffer until it has to read the next buffer full. For example, say you open a StreamReader and ask for a single character. Assuming a buffer size of 4 kilobytes, StreamReader does essentially this:
if (buffer is empty)
{
    read buffer (4,096 bytes) from base stream
    buffer_position = 0;
}
char c = buffer[buffer_position];
buffer_position++; // increment position for next read
return c;
Now, if you ask for the base stream's position, it's going to report that the position is at 4096, even though you've only read one character from the StreamReader.
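A quick way to see this effect for yourself (just an illustrative sketch, assuming Sample.txt is larger than the reader's buffer):

using System;
using System.IO;

using (var fs = new FileStream("Sample.txt", FileMode.Open, FileAccess.Read))
using (var reader = new StreamReader(fs))
{
    reader.Read();                   // read a single character
    Console.WriteLine(fs.Position);  // typically prints the buffer size (e.g. 1024), not 1
}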

Why is this locking the file from writing?

A feature of this program needs to tail a log file and forward newly written lines. I believe I am doing this correctly by creating the FileStream with the FileShare.ReadWrite option as the stream for the StreamReader, as described in several other answers here and here.
But when I run the program, it prevents some processes from writing to the file. Using Process Monitor I can see that my program opens the file with R/W rights instead of just Read.
reader = new StreamReader(new FileStream(fileName, FileMode.Open, FileAccess.Read, FileShare.ReadWrite));
{
    // start at the end of the file
    long lastMaxOffset = reader.BaseStream.Length;
    while (true)
    {
        System.Threading.Thread.Sleep(Properties.Settings.Default.pauseInMilliseconds);

        // if the file size has not changed, keep idling
        if (reader.BaseStream.Length == lastMaxOffset)
            continue;

        // handle if the file contents have been cleared
        if (reader.BaseStream.Length < lastMaxOffset)
        {
            lastMaxOffset = 0;
            eventLogger.WriteEntry("LogChipper target file was reset, starting from beginning", EventLogEntryType.Information, 0);
        }

        // seek to the last max offset
        reader.BaseStream.Seek(lastMaxOffset, SeekOrigin.Begin);

        // read out of the file until the EOF
        string line = "";
        while ((line = reader.ReadLine()) != null)
            syslogForwarder.Send(line);

        // update the last max offset
        lastMaxOffset = reader.BaseStream.Position;

        // block if the service is paused or is shutting down
        pause.WaitOne();
    }
}
Is there something else I'm doing in that block which is holding the file open? I'm open to trying different approaches (e.g. FileSystemWatcher) if that would be better...

How to handle StreamReader?

I use a StreamReader to read my CSV file.
The problem is: I need to read the file twice, but the second time I use the StreamReader,
StreamReader.EndOfStream is true and nothing is read.
using (var csvReader = new StreamReader(file.InputStream))
{
    string inputLine = "";
    var values = new List<string>();
    while ((inputLine = csvReader.ReadLine()) != null)...
Can anybody help?
Try file.InputStream.Seek(0, SeekOrigin.Begin); before you open the second StreamReader to reset the Stream to the starting point.
A much better approach (if possible) would be to store the file contents in memory and reuse them from there.
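A sketch of that in-memory approach, reusing the question's file.InputStream and assuming the CSV fits comfortably in memory:

using System.Collections.Generic;
using System.IO;

// Read every line once, then iterate the cached list as many times as needed.
var allLines = new List<string>();
using (var csvReader = new StreamReader(file.InputStream))
{
    string inputLine;
    while ((inputLine = csvReader.ReadLine()) != null)
    {
        allLines.Add(inputLine);
    }
}

foreach (var line in allLines) { /* first pass */ }
foreach (var line in allLines) { /* second pass */ }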

Why does FileStream.Position increment in multiples of 1024?

I have a text file that I want to read line by line, recording my position in the file as I go. The program may exit after reading any line, and it needs to resume reading at the next line when it starts again.
Here is some sample code:
using (FileStream fileStream = new FileStream("Sample.txt", FileMode.Open, FileAccess.Read, FileShare.ReadWrite))
{
    fileStream.Seek(GetLastPositionInFile(), SeekOrigin.Begin);

    using (StreamReader streamReader = new StreamReader(fileStream))
    {
        while (!streamReader.EndOfStream)
        {
            string line = streamReader.ReadLine();
            DoSomethingInteresting(line);
            SaveLastPositionInFile(fileStream.Position);

            if (CheckSomeCondition())
            {
                break;
            }
        }
    }
}
When I run this code, the value of fileStream.Position does not change after each line; it only advances after several lines have been read. When it does change, it increases in multiples of 1024. I assume there is some buffering going on under the covers, but how can I record the exact position in the file?
It's not FileStream that's responsible - it's StreamReader. It's reading 1K at a time for efficiency.
Keeping track of the effective position of the stream as far as the StreamReader is concerned is tricky... particularly as ReadLine will discard the line ending, so you can't accurately reconstruct the original data (it could have ended with "\n" or "\r\n"). It would be nice if StreamReader exposed something to make this easier (I'm pretty sure it could do so without too much difficulty) but I don't think there's anything in the current API to help you :(
By the way, I would suggest that instead of using EndOfStream, you keep reading until ReadLine returns null. It just feels simpler to me:
string line;
while ((line = reader.ReadLine()) != null)
{
    // Process the line
}
I would agree with Stefan M.; it is probably the buffering that is causing the Position to be incorrect. If it is just the number of characters you have read that you want to track, then I suggest you do it yourself, as in:
using (FileStream fileStream = new FileStream("Sample.txt", FileMode.Open, FileAccess.Read, FileShare.ReadWrite))
{
    fileStream.Seek(GetLastPositionInFile(), SeekOrigin.Begin);
    Int32 position = 0;                        // added: track characters read yourself
    using (StreamReader streamReader = new StreamReader(fileStream))
    {
        while (!streamReader.EndOfStream)
        {
            string line = streamReader.ReadLine();
            position += line.Length;           // added: count the characters in this line
            DoSomethingInteresting(line);
            SaveLastPositionInFile(position);  // added: save the character count, not the stream position
            if (CheckSomeCondition())
            {
                break;
            }
        }
    }
}
Provided that your file is not too big, why not read the whole thing in big chunks and then manipulate the string? That is probably faster than the stop-and-go I/O.
For example,
// load entire file
StringBuilder sbFileContents = new StringBuilder();
char[] acBuffer = new char[32768];
int charsRead;
using (StreamReader srFile = new StreamReader(strFileName))
{
    while ((charsRead = srFile.ReadBlock(acBuffer, 0, acBuffer.Length)) > 0)
    {
        // append only the characters actually read, and reuse the buffer
        sbFileContents.Append(acBuffer, 0, charsRead);
    }
}
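Once the whole file is loaded, the string can be manipulated in memory, for example split into lines (a small illustrative follow-up):

string[] allLines = sbFileContents.ToString()
    .Split(new[] { "\r\n", "\n" }, StringSplitOptions.None);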
