I'm trying to get my program to read text from a .txt file and then read it back to me, but for some reason it crashes when I run it. Could someone let me know what I'm doing wrong? Thanks! :)
using System;
using System.IO;

public class Hello1
{
    public static void Main()
    {
        string winDir = System.Environment.GetEnvironmentVariable("windir");
        StreamReader reader = new StreamReader(winDir + "\\Name.txt");
        try
        {
            do
            {
                Console.WriteLine(reader.ReadLine());
            }
            while (reader.Peek() != -1);
        }
        catch
        {
            Console.WriteLine("File is empty");
        }
        finally
        {
            reader.Close();
        }
        Console.ReadLine();
    }
}
I don't like your solution, for two simple reasons:
1) I don't like the "gotta catch 'em all" approach (try/catch). To avoid it, check whether the file exists with System.IO.File.Exists("YourPath").
2) This code never disposes the StreamReader. To avoid that, it's better to use a using statement, like this: using (StreamReader sr = new StreamReader(path)) { // Your code }
Usage example:
string path = "filePath";
if (System.IO.File.Exists(path))
{
    using (System.IO.StreamReader sr = new System.IO.StreamReader(path))
    {
        while (sr.Peek() > -1)
            Console.WriteLine(sr.ReadLine());
    }
}
else
{
    Console.WriteLine("The file does not exist!");
}
If your file is located in the same folder as the .exe, all you need to do is StreamReader reader = new StreamReader("File.txt");
Otherwise, where File.txt is, put the full path to the file. Personally, I think it's easier if they are in the same location.
From there, it's as simple as Console.WriteLine(reader.ReadLine());
If you want to read all lines and display them all at once, you could use a for loop:
for (int i = 0; i < lineAmount; i++)
{
    Console.WriteLine(reader.ReadLine());
}
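Note that lineAmount is just a placeholder for however many lines you expect; it isn't defined anywhere. If you don't know the line count up front, a small sketch like this (reusing the reader variable from above) avoids needing it by reading until the end of the stream:

string line;
while ((line = reader.ReadLine()) != null)
{
    Console.WriteLine(line);
}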
Use the code below if you want the result as a string instead of an array.
File.ReadAllText(Path.Combine(winDir, "Name.txt"));
Why not use System.IO.File.ReadAllLines(winDir + "\\Name.txt")?
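For example, a quick sketch of consuming the returned array, assuming winDir is the same variable as in the question:

string[] lines = File.ReadAllLines(winDir + "\\Name.txt");
foreach (string line in lines)
    Console.WriteLine(line);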
If all you're trying to do is display this as output in the console, you could do that pretty compactly:
private static string winDir = Environment.GetEnvironmentVariable("windir");

static void Main(string[] args)
{
    Console.Write(File.ReadAllText(Path.Combine(winDir, "Name.txt")));
    Console.Read();
}
using (var fs = new FileStream(winDir + "\\Name.txt", FileMode.Open, FileAccess.Read))
{
    using (var reader = new StreamReader(fs))
    {
        // your code
    }
}
The .NET framework has a variety of ways to read a text file. Each has pros and cons... let's go through two.
The first is one that many of the other answers recommend:
String allTxt = File.ReadAllText(Path.Combine(winDir, "Name.txt"));
This will read the entire file into a single String. It will be quick and painless. It comes with a risk, though... if the file is large enough, you may run out of memory. Even if you can fit the entire thing in memory, it may be large enough to cause paging, which will make your software run quite slowly. The next option addresses this.
The second solution allows you to work with one line at a time and not load the entire file into memory:
foreach (String line in File.ReadLines(Path.Combine(winDir, "Name.txt")))
{
    // Do Work with the single line.
    Console.WriteLine(line);
}
This solution may take a little longer for large files, because it does work more often as it moves through the contents... however, it will prevent awkward out-of-memory errors.
I tend to go with the second solution, but only because I'm paranoid about loading huge Strings into memory.
Hello,
I've been working on a terminal-like application to get better at programming in C#, just something to help me learn. I've decided to add a feature that will copy a file exactly as it is to a new file... It seems to work almost perfectly. When opened in Notepad++ the files are only a few lines apart in length, and very, very close to the same actual file size. However, the duplicated copy of the file never runs. It says the file is corrupt. I have a feeling the problem is in the methods I created for reading and rewriting binary data to files. The code is as follows; thanks for the help. Sorry for the spaghetti code too, I get a bit sloppy when I'm messing around with new ideas.
Class that handles the file copying/writing
using System;
using System.IO;
//using System.Collections.Generic;

namespace ConsoleFileExplorer
{
    class FileTransfer
    {
        private BinaryWriter writer;
        private BinaryReader reader;
        private FileStream fsc; // file to be duplicated
        private FileStream fsn; // new location of file
        int[] fileData;
        private string _file;

        public FileTransfer(String file)
        {
            _file = file;
            fsc = new FileStream(file, FileMode.Open);
            reader = new BinaryReader(fsc);
        }

        // Reads all the original file's data to an array of bytes
        public byte[] ReadAllDataToArray()
        {
            byte[] bytes = reader.ReadBytes((int)fsc.Length); // reading bytes from the original file
            return bytes;
        }

        // Writes the array of original byte data to a new file
        public void WriteDataFromArray(byte[] fileData, string path) // got a feeling this is the problem :p
        {
            fsn = new FileStream(path, FileMode.Create);
            writer = new BinaryWriter(fsn);
            int i = 0;
            while (i < fileData.Length)
            {
                writer.Write(fileData[i]);
                i++;
            }
        }
    }
}
Code that interacts with this class.
(The Sleep(5000) calls are there because I was expecting an error on the first attempt...)
case '3':
    Console.Write("Enter source file: ");
    string sourceFile = Console.ReadLine();
    if (sourceFile == "")
    {
        Console.Clear();
        Console.ForegroundColor = ConsoleColor.DarkRed;
        Console.Error.WriteLine("Must input a proper file path.\n");
        Console.ForegroundColor = ConsoleColor.White;
        Menu();
    }
    else
    {
        Console.WriteLine("Copying Data"); System.Threading.Thread.Sleep(5000);
        FileTransfer trans = new FileTransfer(sourceFile);
        // copying the original file's data
        byte[] data = trans.ReadAllDataToArray();
        Console.Write("Enter Location to store data: ");
        string newPath = Console.ReadLine();
        // Just for me to make sure it doesnt exit if i forget
        if (newPath == "")
        {
            Console.Clear();
            Console.ForegroundColor = ConsoleColor.DarkRed;
            Console.Error.WriteLine("Cannot have empty path.");
            Console.ForegroundColor = ConsoleColor.White;
            Menu();
        }
        else
        {
            Console.WriteLine("Writing data to file"); System.Threading.Thread.Sleep(5000);
            trans.WriteDataFromArray(data, newPath);
            Console.WriteLine("File stored.");
            Console.ReadLine();
            Console.Clear();
            Menu();
        }
    }
    break;
[Screenshots comparing the original file and the new file in Notepad++ were attached here.]
You're not properly disposing the file streams and the binary writer. Both tend to buffer data (which is a good thing, especially when you're writing one byte at a time). Use using, and your problem should disappear. Unless somebody is editing the file while you're reading it, of course.
BinaryReader and BinaryWriter do not just write "raw data". They also add metadata as needed - they're designed for serialization and deserialization, rather than reading and writing bytes. Now, in the particular case of ReadBytes and Write(byte[]), those really are just raw bytes; but there's not much point in using these classes just for that. Reading and writing bytes is something every Stream gives you - and that includes FileStream. There's no reason to use BinaryReader/BinaryWriter here whatsoever - the file streams give you everything you need.
A better approach would be to simply use
using (var fsn = ...)
{
    fsn.Write(fileData, 0, fileData.Length);
}
or even just
File.WriteAllBytes(fileName, fileData);
Maybe you're thinking that writing a byte at a time is closer to "the metal", but that simply isn't the case. At no point during this does the CPU pass a byte at a time to the hard drive. Instead, the hard drive copies data directly from RAM, with no intervention from the CPU. And most hard drives still can't write (or read) arbitrary amounts of data from the physical media - instead, you're reading and writing whole sectors. If the system really did write a byte at a time, you'd just keep rewriting the same sector over and over again, just to write one more byte.
An even better approach would be to use the fact that you've got file streams open, and stream the files from source to destination rather than first reading everything into memory, and then writing it back to disk.
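A minimal sketch of that streaming approach, assuming sourcePath and destPath hold the paths the user entered (the names are illustrative, not from the original code):

// Stream the copy without ever holding the whole file in memory.
using (var source = new FileStream(sourcePath, FileMode.Open, FileAccess.Read))
using (var dest = new FileStream(destPath, FileMode.Create, FileAccess.Write))
{
    source.CopyTo(dest); // Stream.CopyTo copies in internal chunks (.NET 4+)
}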
There is a File.Copy() method in C#; you can see it here: https://msdn.microsoft.com/ru-ru/library/c6cfw35a(v=vs.110).aspx
If you want to implement it yourself, try placing a breakpoint inside your methods and stepping through with the debugger. It's like the story about the fisherman and the god who gave him a rod to catch fish with, rather than the fish itself.
Also, look at your int[] fileData field and the byte[] fileData parameter inside the last method; maybe that is the problem.
So I'm trying to close a file (transactions.txt) that I've had open to read into a textbox, and now I want to save back to the file, but the debugger says the file is in use, so I need to find a way to close it. Can anyone help me with this? Thanks!
    SearchID = textBox1.Text;
    string ID = SearchID.ToString();
    bool idFound = false;
    int count = 0;
    foreach (var line in File.ReadLines("transactions.txt"))
    {
        //listView1.Items.Add(line);
        if (line.Contains(ID))
        {
            idFound = true;
        }
        // Displays Transactions if the variable SearchID is found.
        if (idFound && count < 8)
        {
            textBox2.Text += line + "\r\n";
            count++;
        }
    }
}
private void SaveEditedTransaction()
{
    SearchID = textBox1.Text;
    string ID = SearchID.ToString();
    bool idFound = false;
    int count = 0;
    foreach (var lines in File.ReadLines("transactions.txt"))
    {
        //listView1.Items.Add(line);
        if (lines.Contains(ID))
        {
            idFound = true;
        }
        if (idFound)
        {
            string edited = File.ReadAllText("transactions.txt");
            edited = edited.Replace(lines, textBox2.Text);
            File.WriteAllText("Transactions.txt", edited);
        }
    }
}
The problem here is that File.ReadLines keeps the file open while you read it; since you've put the call that writes new text inside the loop, the file is still open when you try to write.
Instead I would simply break out of the loop when you find the id, and then put the if-statement that writes to the file outside the loop.
This, however, means that you will also need to keep track of which line to write the replacement into.
So actually, instead I would switch to using File.ReadAllLines. This reads the entire file into memory, and closes it, before the loop starts.
Now, pragmatic minds might argue that if you have a lot of text in that text file, File.ReadLines (which you're currently using) will use a lot less memory than File.ReadAllLines (which I am suggesting you use), but if that's the case you should switch to a database, which would be much better suited to your purpose anyway. It is, however, a bit of overkill for a toy project with 5 lines in that file. A sketch of the ReadAllLines approach follows.
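This is a rough sketch only, reusing ID and textBox2 from the question's code and assuming you want to replace the first matching line:

string[] lines = File.ReadAllLines("transactions.txt"); // file is closed once this returns
for (int i = 0; i < lines.Length; i++)
{
    if (lines[i].Contains(ID))
    {
        lines[i] = textBox2.Text; // replace the matched line
        break;
    }
}
File.WriteAllLines("transactions.txt", lines); // safe: no reader holds the file open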
Use StreamReader directly with the using statement, for example:
var lines = new List<string>();
using (StreamReader reader = new StreamReader(@"C:\test.txt"))
{
    var line = reader.ReadLine();
    while (line != null)
    {
        lines.Add(line);
        line = reader.ReadLine();
    }
}
By using the using statement, the StreamReader instance is automatically disposed of when the block exits.
You can try with this:
File.WriteAllLines(
    "transactions.txt",
    File.ReadAllLines("transactions.txt")
        .Select(x => x.Contains(ID) ? textBox2.Text : x));
It works fine, but if the file is big you will have to find another solution.
You can use the StreamReader class instead of the methods of the File class. That way you can call Stream.Close() and Stream.Dispose() yourself.
I need help figuring out the fastest way to read through about 80 files, each with over 500,000 lines, and write them to one master file where each input file's line becomes a column in the master. The master file must open in a plain text editor like Notepad, not a Microsoft Office product, because those can't handle the number of lines.
For example, the master file should look something like this:
File1_Row1,File2_Row1,File3_Row1,...
File1_Row2,File2_Row2,File3_Row2,...
File1_Row3,File2_Row3,File3_Row3,...
etc.
I've tried 2 solutions so far:
1) Create a jagged array to hold each file's contents, and once all lines in all files are read, write the master file. The issue with this solution is that the Windows OS throws an error that too much virtual memory is being used.
2) Dynamically create a reader thread for each of the 80 files that reads a specific line number, and once all threads finish reading a line, combine those values, write them to the file, and repeat for each line in all files. The issue with this solution is that it is very, very slow.
Does anybody have a better solution for reading so many large files in a fast way?
The best way is going to be to open the input files with a StreamReader for each one and a StreamWriter for the output file. Then you loop through each reader and read a single line and write it to the master file. This way you are only loading one line at a time so there should be minimal memory pressure. I was able to copy 80 ~500,000 line files in 37 seconds. An example:
using System;
using System.Collections.Generic;
using System.IO;
using System.Diagnostics;
using System.Linq; // needed for Enumerable.Range and Select

class MainClass
{
    static string[] fileNames = Enumerable.Range(1, 80).Select(i => string.Format("file{0}.txt", i)).ToArray();

    public static void Main(string[] args)
    {
        var stopwatch = Stopwatch.StartNew();
        List<StreamReader> readers = fileNames.Select(f => new StreamReader(f)).ToList();
        try
        {
            using (StreamWriter writer = new StreamWriter("master.txt"))
            {
                string line = null;
                do
                {
                    for (int i = 0; i < readers.Count; i++)
                    {
                        if ((line = readers[i].ReadLine()) != null)
                        {
                            writer.Write(line);
                        }
                        if (i < readers.Count - 1)
                            writer.Write(",");
                    }
                    writer.WriteLine();
                } while (line != null);
            }
        }
        finally
        {
            foreach (var reader in readers)
            {
                reader.Close();
            }
        }
        Console.WriteLine("Elapsed {0} ms", stopwatch.ElapsedMilliseconds);
    }
}
I've assumed that all the input files have the same number of lines; you should add logic to keep reading while at least one file still has data. A sketch of that logic follows.
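A rough sketch of that "keep going while any file has data" logic, replacing the do/while above (readers and writer are the same variables as in the example):

bool anyData;
do
{
    anyData = false;
    var fields = new string[readers.Count];
    for (int i = 0; i < readers.Count; i++)
    {
        string line = readers[i].ReadLine();
        fields[i] = line ?? string.Empty; // an exhausted file contributes an empty column
        if (line != null)
            anyData = true;
    }
    if (anyData)
        writer.WriteLine(string.Join(",", fields));
} while (anyData);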
Memory-mapped files seem like a good fit here: they avoid putting pressure on your app's memory while maintaining good performance in IO operations.
Complete documentation here: Memory-Mapped Files
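A minimal sketch of reading one file through a memory-mapped view (the file name is illustrative; this is one possible usage, not the only one):

using System;
using System.IO;
using System.IO.MemoryMappedFiles;

class MmfExample
{
    static void Main()
    {
        // Map the file into memory, then read the view as a stream of lines.
        using (var mmf = MemoryMappedFile.CreateFromFile("file1.txt", FileMode.Open))
        using (var stream = mmf.CreateViewStream())
        using (var reader = new StreamReader(stream))
        {
            string line;
            while ((line = reader.ReadLine()) != null)
                Console.WriteLine(line);
        }
    }
}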
If you have enough memory on the computer, I would use the Parallel.Invoke construct and read each file into a pre-allocated array such as:
string[] file1lines = new string[some value];
string[] file2lines = new string[some value];
string[] file3lines = new string[some value];

Parallel.Invoke(
    () =>
    {
        ReadMyFile(file1, file1lines);
    },
    () =>
    {
        ReadMyFile(file2, file2lines);
    },
    () =>
    {
        ReadMyFile(file3, file3lines);
    }
);
Each ReadMyFile method should just use the following sample code which, according to these benchmarks, is the fastest way to read a text file:
int x = 0;
using (StreamReader sr = File.OpenText(fileName))
{
    while ((file1lines[x] = sr.ReadLine()) != null)
    {
        x += 1;
    }
}
If you need to manipulate the data from each file before writing your final output, read this article on the fastest way to do that.
Then you just need one method to write the contents of each string[] to the output as you desire.
Have an array of open file handles. Loop through this array and read a line from each file into a string array, then combine that array into a line of the master file, appending a newline at the end.
This differs from your second approach in that it is single-threaded and doesn't read a specific line, but always the next one.
Of course you need to be robust against files that have fewer lines than others.
I'm trying to write 4 sets of 15 txt files into 4 large txt files in order to make it easier to import into another app.
Here's my code:
using System;
using System.IO;
using System.Collections.Generic;
using System.Linq;
using System.Text;

namespace AggregateMultipleFiles
{
    class AggMultiFilestoOneFile
    {
        /* This program can reduce multiple input files and grouping results
           into one file for easier app loading. */
        static void Main(string[] args)
        {
            TextWriter writer = new StreamWriter("G:/user/data/yr2009/fy09_filtered.txt");
            int linelen = 495;
            char[] buf = new char[linelen];
            int line_num = 1;
            for (int i = 1; i <= 15; i++)
            {
                TextReader reader = File.OpenText("G:/user/data/yr2009/fy09_filtered" + i + ".txt");
                while (true)
                {
                    int nin = reader.Read(buf, 0, buf.Length);
                    if (nin == 0)
                    {
                        Console.WriteLine("File ended");
                        break;
                    }
                    writer.Write(new String(buf));
                    line_num++;
                }
                reader.Close();
            }
            Console.WriteLine("done");
            Console.WriteLine(DateTime.Now);
            Console.ReadLine();
            writer.Close();
        }
    }
}
My problem is somewhere in detecting the end of a file. It doesn't finish writing the last line of a file, and then proceeds to start writing the first line of the next file halfway through the last line of the previous one.
This is throwing off all of my columns and data in the app it imports into.
Someone suggested that perhaps I need to pad the end of each line of each of the 15 files with a carriage return and line feed, \r\n.
Why doesn't what I have work?
Would padding work instead? How would I write that?
Thank you!
I strongly suspect this is the problem:
writer.Write(new String(buf));
You're always creating a string from all of buf, rather than just the first nin characters. If any of your files are short, you may end up with "null" Unicode characters (i.e. U+0000) which may be seen as string terminators in some apps.
There's no need even to create a string - just use:
writer.Write(buf, 0, nin);
(I would also strongly suggest using using statements instead of manually calling Close, by the way.)
It's also worth noting that there's nothing to guarantee that you're really reading a line at a time. You might as well increase your buffer size to something like 32K in order to read the files in potentially fewer chunks.
Additionally, if the files are small enough, you could read each one into memory completely, which would make your code simpler:
using (var writer = File.CreateText("G:/user/data/yr2009/fy09_filtered.txt"))
{
    for (int i = 1; i <= 15; i++)
    {
        string inputName = "G:/user/data/yr2009/fy09_filtered" + i + ".txt";
        writer.Write(File.ReadAllText(inputName));
    }
}
I'm having an issue where I'm writing the contents of several xml files to one file. When I run the program, the output is in the proper format, but the words are out of order. An example of this:
My string is "<s:AttributeType name=\"Shift\" number=\"34\" nullable=\"true\" writeunknown=\"true\">"
So it should print <s:AttributeType name="Shift" number="34" nullable="true" writeunknown="true">
But instead <s:AttributeType name="Shift" writeunknown="true" number="34" nullable="true">
is returned.
Some of the file is written using File.WriteAllText(@"C:\Users\status.xml", xsh);
Where 'xsh' is a variable containing a string.
The rest is written using this loop:
foreach (var i in Numbers.GetWSnumber())
{
    string contents = "";
    string curFile = @"\\production\public\Staus\TStatus\WS" + i.SetId + ".xml";
    if (File.Exists(curFile))
    {
        System.IO.StreamReader file = new System.IO.StreamReader(curFile);
        while ((contents = file.ReadLine()) != null)
        {
            using (StreamWriter sw = File.AppendText(@"C:\Users\status.xml"))
            {
                sw.WriteLine(contents);
            }
        }
        file.Close();
    }
}
Any help is appreciated
The order of XML attributes is not important so I wouldn't worry about it. However, if it's really bugging you I would suggest moving your using statement.
using (StreamWriter sw = File.AppendText(@"C:\Users\status.xml"))
{
    sw.WriteLine(contents);
}
Although the runtime may optimize this so that it performs as if it were better written, the way you currently have it a new StreamWriter is allocated and then disposed of on every iteration.
Your using statement should wrap the while loop, not the other way around. This could possibly solve the problem (though I think it's unlikely), as I don't know exactly how the compiler handles this line. Either way, it's worth changing, as in the sketch below.
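A sketch of the rearranged loop, reusing curFile and the output path from the question's code, with the reader wrapped in a using as well:

using (StreamWriter sw = File.AppendText(@"C:\Users\status.xml"))
using (StreamReader file = new StreamReader(curFile))
{
    string contents;
    while ((contents = file.ReadLine()) != null)
    {
        sw.WriteLine(contents); // one writer for the whole file, disposed once
    }
}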