I've got a problem that has apparently been encountered before, though the solutions given there don't help with what I'm seeing.
I'm trying to write data to a file, based on a base 64 encoding of the contents of a Dictionary<string, object>. It's supposed to go like this: if the file doesn't exist,
1. It gets created
2. Some default values get added to the Dictionary
3. The Dictionary is turned into base 64 data
4. The data is written to the file
Steps 1 through 3 are working fine. Step 4 appears to be working fine, until I open the file - when it's revealed that there's nothing in it. It gets created but not written to, and I end up with an empty file.
The code goes like this, where the top method, CreateDefaultConfigFile, is called if the file doesn't exist:
private static void CreateDefaultConfigFile()
{
    Console.WriteLine("AppConfig.CreateDefaultConfigFile");
    File.Create(CONFIG_FILE_PATH);
    Configuration.Clear();
    Configuration.Add("ClientId", "5577");
    Configuration.Add("RedirectUri", "https://stackexchange.com/oauth/login_success");
    Configuration.Add("RequestKey", "2WQ5ksCzcYLeeYJ0qM4kHw((");
    Save();
}
public static void Save()
{
    Console.WriteLine("AppConfig.Save");
    string data = "";
    foreach (KeyValuePair<string, object> pair in Configuration)
    {
        Console.WriteLine(" {0}: {1}", pair.Key, pair.Value.ToString());
        if (pair.Value.GetType() == typeof(string))
        {
            data += pair.Key + SC_SPLITTER + pair.Value + "\n";
        }
        else if (pair.Value.GetType() == typeof(Array))
        {
            data += pair.Key + SC_SPLITTER + "[" + string.Join(",", pair.Value) + "]\n";
        }
        else
        {
            Configuration.Remove(pair.Key);
        }
    }
    if (data.EndsWith("\n"))
    {
        data.Remove(data.Length - 2);
    }
    byte[] dataBytes = Encoding.UTF8.GetBytes(data);
    string encoded = Convert.ToBase64String(dataBytes);
    File.WriteAllText(CONFIG_FILE_PATH, encoded);
    Console.WriteLine(" Written to file.");
}
Important fact to note: the "Written to file." message never gets logged to the console (though if I put a log directly before the File.WriteAllText call, it does log). A breakpoint on the final Console.WriteLine call is never hit.
No exceptions are thrown, and it's not because data or encoded are empty - logging them just before the write reveals data in both.
CONFIG_FILE_PATH is a constant value of C:\Users\Me\Documents\longpath\bin\Debug\config\general.cgf.
I've also tried using a FileStream and FileStream.Flush(), as suggested in the question I linked at the top - this doesn't work.
The File.Create method doesn't do what you appear to think it does.
It does create a file, but it also leaves the file open and returns a FileStream object to handle the open file.
If you just call Create and ignore the returned FileStream object, then the file will be left open until the object is disposed by the garbage collector.
Depending on when the garbage collection runs, the File.WriteAllText call will either be able to write to the file, or you will get an IOException. The fact that you don't see anything written to the file, and that you don't see the log line that comes after the call, suggests that you are actually getting an exception that is caught at some other level and ignored.
If you want to use File.Create to create the file, you should get the FileStream object and dispose it to close the file:
using (FileStream stream = File.Create(CONFIG_FILE_PATH))
{
}
However, you don't have to create the file before calling File.WriteAllText; it will create the file if it doesn't exist.
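Concretely, here is a minimal sketch of the asker's CreateDefaultConfigFile with the stray File.Create call removed (names are taken from the question's code; Save() is unchanged and creates the file itself via File.WriteAllText):

private static void CreateDefaultConfigFile()
{
    Console.WriteLine("AppConfig.CreateDefaultConfigFile");
    // No File.Create here: File.WriteAllText in Save() creates the file,
    // writes it, and closes it, so nothing is left holding the file open.
    Configuration.Clear();
    Configuration.Add("ClientId", "5577");
    Configuration.Add("RedirectUri", "https://stackexchange.com/oauth/login_success");
    Configuration.Add("RequestKey", "2WQ5ksCzcYLeeYJ0qM4kHw((");
    Save();
}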
Related
I am trying to figure out how to work with files, and I got confused by the number of different methods.
I have been using the following way of writing to the file. So far it seems to work fine, but I wonder if I have to add a file-closing step for when the user exits the game.
Another question is: why does the file get created without using File.Create()?
if (!File.Exists(path))
{
    File.WriteAllText(path, "Date\t\t| ROM \t\t| Left Hand | Right Hand |\n");
}
The whole code is attached:
public class SavingData : MonoBehaviour
{
    public static string path = @"C:\users\Desktop\Game1.txt";

    void Start()
    {
        CreateFile();
    }

    void CreateFile()
    {
        if (!File.Exists(path))
        {
            File.WriteAllText(path, "Date\t\t| ROM \t\t| Left Hand | Right Hand |\n");
        }
        else
        {
            string date = "Login date: " + System.DateTime.Now + "\n";
            File.AppendAllText(path, date);
        }
    }

    public static void WriteToFile()
    {
        File.AppendAllText(path, "hellow\n");
    }

    public static void CloseFile()
    {
    }
}
The documentation of File.WriteAllText states:
Creates a new file, writes the specified string to the file, and then closes the file. If the target file already exists, it is overwritten.
So this single method does all the things that you thought you weren't doing: it creates the file and closes it.
If you would like to close the file manually, don't use File.WriteAllText and File.AppendAllText. Use another way of writing to a file, like a StreamWriter.
This is what I would write if I had to use StreamWriter.
public class SavingData : MonoBehaviour
{
    public string path = @"C:\users\Desktop\Game1.txt";
    private StreamWriter writer;

    void Start()
    {
        CreateFile();
    }

    void CreateFile()
    {
        // Check for the file before constructing the StreamWriter,
        // because opening the writer creates the file if it is missing.
        bool isNewFile = !File.Exists(path);
        writer = new StreamWriter(path, true);
        if (isNewFile)
        {
            writer.WriteLine("Date\t\t| ROM \t\t| Left Hand | Right Hand |\n");
        }
        else
        {
            string date = "Login date: " + System.DateTime.Now + "\n";
            writer.WriteLine(date);
        }
    }

    public void WriteToFile()
    {
        writer.WriteLine("hellow\n");
    }

    public void CloseFile()
    {
        writer.Close();
    }
}
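If you keep a long-lived writer like this, close it when the player exits. A minimal sketch, assuming this SavingData component stays alive for the whole session, using Unity's standard OnApplicationQuit callback:

// Unity calls this on active MonoBehaviours when the game exits.
void OnApplicationQuit()
{
    CloseFile(); // flushes any buffered lines and releases the file handle
}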
Actually, if you want more control, you should use FileStream. It gives you more control while writing to files: it lets you keep the file handle open and keep writing data without reopening the file each time.
But FileStream also has some disadvantages.
From the documentation:
When a FileStream object does not have an exclusive hold on its handle, another thread could access the file handle concurrently and change the position of the operating system's file pointer that is associated with the file handle. In this case, the cached position in the FileStream object and the cached data in the buffer could be compromised. The FileStream object routinely performs checks on methods that access the cached buffer to ensure that the operating system's handle position is the same as the cached position used by the FileStream object.
On the other hand:
System.IO.File contains wrappers around file operations for basic actions such as saving a file, reading a file into lines, etc. It's simply an abstraction over FileStream.
So WriteAllText is an abstraction over the create, write, and close steps: it does all of them automatically, and you don't need to know about each of the underlying implementations.
So the basic answer to your question is: NO, you don't need to manually close the file; it is done automatically.
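To make that concrete: each call below opens the file, appends the text, and closes it again, so nothing is left open when the game exits (path taken from the question's code):

File.AppendAllText(@"C:\users\Desktop\Game1.txt", "hellow\n"); // open, append, close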
I am having a problem with a console job that runs and creates a daily log file that I archive at midnight.
This creates a blank log file for the next day, plus an archived file with yesterday's date in the name and the contents of the old file, for debugging issues I may have had and not known about until the day after.
However, since I cranked up the bot's job, I have been hitting issues with System Out of Memory errors when I try to archive the file.
At first I was just not able to get an archived file at all; then I worked out a way to get at least the last 100,000 lines, which is not nearly enough.
I wrap everything in 3 try/catches
I/O
System out of memory
standard exception
However, it's always the OutOfMemoryException that I get, e.g.
System.OutOfMemoryException Error: Exception of type 'System.OutOfMemoryException' was thrown.;
To give you an example of the size: 100,000 lines of log is about an 11 MB file.
A standard full log file can be anything from half a GB to 2 GB.
What I need to know is this:
a) What size of a standard text file will throw an out of memory error when trying to use File.ReadAllText or a custom StreamReader function I call ReadFileString? E.g.:
public static string ReadFileString(string path)
{
    // Use StreamReader to consume the entire text file.
    using (StreamReader reader = new StreamReader(path))
    {
        return reader.ReadToEnd();
    }
}
b) Is it my computer's memory (I have 16 GB of RAM, with 8 GB in use at the time of copying), or the objects I am using in C#, that are failing with the opening and copying of files?
When archiving, I first try my custom ReadFileString function (see above); if that returns 0 bytes of content, I try File.ReadAllText, and then if that fails, I try a custom function to get the last 100,000 lines, which is really not enough for debugging errors earlier in the day.
The log file starts at midnight, when a new one is created, and logs all day. I never used to have out of memory errors, but since I have turned up the frequency of method calls, the logging has expanded, which means the file sizes have as well.
This is my custom function for getting the last 100,000 lines. I am wondering how many lines I could get without it throwing an out of memory error, versus not getting any contents of the last day's log file at all.
What do people suggest for the maximum file size for various methods / memory needed to hold X lines, and what is the best method for obtaining as much of the log file as possible?
E.g. some way of looping line by line until an exception is hit, and then saving what I have.
This is my GetHundredThousandLines method and it logs to a very small debug file so I can see what errors happened during the archive process.
private bool GetHundredThousandLines(string logpath, string archivepath)
{
    bool success = false;
    int numberOfLines = 100000;
    if (!File.Exists(logpath))
    {
        this.LogDebug("GetHundredThousandLines - Cannot find path " + logpath + " to archive " + numberOfLines.ToString() + " lines");
        return false;
    }
    var queue = new Queue<string>(numberOfLines);
    using (FileStream fs = File.Open(logpath, FileMode.Open, FileAccess.Read, FileShare.Read))
    using (BufferedStream bs = new BufferedStream(fs)) // May not make much difference.
    using (StreamReader sr = new StreamReader(bs))
    {
        while (!sr.EndOfStream)
        {
            if (queue.Count == numberOfLines)
            {
                queue.Dequeue();
            }
            queue.Enqueue(sr.ReadLine() + "\r\n");
        }
    }
    // The queue now has our set of lines. So print to console, save to another file, etc.
    try
    {
        do
        {
            File.AppendAllText(archivepath, queue.Dequeue(), Encoding.UTF8);
        } while (queue.Count > 0);
    }
    catch (IOException exception)
    {
        this.LogDebug("GetHundredThousandLines - I/O Error accessing daily log file with ReadFileString: " + exception.Message.ToString());
    }
    catch (System.OutOfMemoryException exception)
    {
        this.LogDebug("GetHundredThousandLines - Out of Memory Error accessing daily log file with ReadFileString: " + exception.Message.ToString());
    }
    catch (Exception exception)
    {
        this.LogDebug("GetHundredThousandLines - Exception accessing daily log file with ReadFileString: " + exception.Message.ToString());
    }
    if (File.Exists(archivepath))
    {
        this.LogDebug("GetHundredThousandLines - Log file exists at " + archivepath);
        success = true;
    }
    else
    {
        this.LogDebug("GetHundredThousandLines - Log file DOES NOT exist at " + archivepath);
    }
    return success;
}
Any help would be much appreciated.
Thanks
Try: keep the queue and the stream position in class scope; when you get an out-of-memory exception, call GC.Collect(), then call the function again, seeking the stream to the last position and continuing.
Or: use a database like SQLite and keep the newest 100,000 records in each table.
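As an alternative sketch (not part of the answer above): if the goal is to archive as much of the log as possible, the whole file can be copied without ever holding more than one line in memory by streaming line by line. Method and variable names here are illustrative, and the usings match the question's code (System.IO, System.Text):

// Sketch: stream the log to the archive line by line, so memory use stays
// roughly constant regardless of the file size.
private static void ArchiveByStreaming(string logPath, string archivePath)
{
    using (StreamReader reader = new StreamReader(logPath))
    using (StreamWriter writer = new StreamWriter(archivePath, false, Encoding.UTF8))
    {
        string line;
        while ((line = reader.ReadLine()) != null)
        {
            writer.WriteLine(line);
        }
    }
}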
Hello
I've been working on a terminal-like application to get better at programming in C#, just something to help me learn. I've decided to add a feature that will copy a file exactly as it is to a new file... It seems to work almost perfectly. When opened in Notepad++, the files are only a few lines apart in length, and very, very close to the same as far as actual file size goes. However, the duplicated copy of the file never runs. It says the file is corrupt. I have a feeling it's within the methods for reading and rewriting binary to files that I created. The code is as follows; thanks for the help. Sorry for the spaghetti code too, I get a bit sloppy when I'm messing around with new ideas.
Class that handles the file copying/writing
using System;
using System.IO;
//using System.Collections.Generic;

namespace ConsoleFileExplorer
{
    class FileTransfer
    {
        private BinaryWriter writer;
        private BinaryReader reader;
        private FileStream fsc; // file to be duplicated
        private FileStream fsn; // new location of file
        int[] fileData;
        private string _file;

        public FileTransfer(String file)
        {
            _file = file;
            fsc = new FileStream(file, FileMode.Open);
            reader = new BinaryReader(fsc);
        }

        // Reads all the original file's data to an array of bytes
        public byte[] ReadAllDataToArray()
        {
            byte[] bytes = reader.ReadBytes((int)fsc.Length); // reading bytes from the original file
            return bytes;
        }

        // writes the array of original byte data to a new file
        public void WriteDataFromArray(byte[] fileData, string path) // got a feeling this is the problem :p
        {
            fsn = new FileStream(path, FileMode.Create);
            writer = new BinaryWriter(fsn);
            int i = 0;
            while (i < fileData.Length)
            {
                writer.Write(fileData[i]);
                i++;
            }
        }
    }
}
Code that interacts with this class.
(Sleep(5000) is because I was expecting an error on the first attempt...)
case '3':
    Console.Write("Enter source file: ");
    string sourceFile = Console.ReadLine();
    if (sourceFile == "")
    {
        Console.Clear();
        Console.ForegroundColor = ConsoleColor.DarkRed;
        Console.Error.WriteLine("Must input a proper file path.\n");
        Console.ForegroundColor = ConsoleColor.White;
        Menu();
    }
    else
    {
        Console.WriteLine("Copying Data"); System.Threading.Thread.Sleep(5000);
        FileTransfer trans = new FileTransfer(sourceFile);
        // copying the original file's data
        byte[] data = trans.ReadAllDataToArray();
        Console.Write("Enter Location to store data: ");
        string newPath = Console.ReadLine();
        // Just for me to make sure it doesn't exit if I forget
        if (newPath == "")
        {
            Console.Clear();
            Console.ForegroundColor = ConsoleColor.DarkRed;
            Console.Error.WriteLine("Cannot have empty path.");
            Console.ForegroundColor = ConsoleColor.White;
            Menu();
        }
        else
        {
            Console.WriteLine("Writing data to file"); System.Threading.Thread.Sleep(5000);
            trans.WriteDataFromArray(data, newPath);
            Console.WriteLine("File stored.");
            Console.ReadLine();
            Console.Clear();
            Menu();
        }
    }
    break;
(Screenshots comparing the original file and the new file were attached here.)
You're not properly disposing the file streams and the binary writer. Both tend to buffer data (which is a good thing, especially when you're writing one byte at a time). Use using, and your problem should disappear. Unless somebody is editing the file while you're reading it, of course.
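For illustration, here is the question's WriteDataFromArray with that advice applied (a sketch keeping the question's names, using locals so disposal is scoped to the method):

// Sketch: wrap the stream and writer in using blocks so both are flushed
// and closed before the method returns.
public void WriteDataFromArray(byte[] fileData, string path)
{
    using (FileStream fsn = new FileStream(path, FileMode.Create))
    using (BinaryWriter writer = new BinaryWriter(fsn))
    {
        writer.Write(fileData); // writes the raw bytes in one call
    }
}

The reading side (fsc and reader, opened in the constructor) needs the same treatment.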
BinaryReader and BinaryWriter do not just write "raw data". They also add metadata as needed - they're designed for serialization and deserialization, rather than for reading and writing bytes. Now, in the particular case of ReadBytes and Write(byte[]), those really are just raw bytes; but there's not much point in using these classes just for that. Reading and writing bytes is the thing every Stream gives you - and that includes FileStreams. There's no reason to use BinaryReader/BinaryWriter here whatsoever - the file streams give you everything you need.
A better approach would be to simply use
using (var fsn = ...)
{
    fsn.Write(fileData, 0, fileData.Length);
}
or even just
File.WriteAllBytes(fileName, fileData);
Maybe you're thinking that writing a byte at a time is closer to "the metal", but that simply isn't the case. At no point during this does the CPU pass a byte at a time to the hard drive. Instead, the hard drive copies data directly from RAM, with no intervention from the CPU. And most hard drives still can't write (or read) arbitrary amounts of data from the physical media - instead, you're reading and writing whole sectors. If the system really did write a byte at a time, you'd just keep rewriting the same sector over and over again, just to write one more byte.
An even better approach would be to use the fact that you've got file streams open, and stream the files from source to destination rather than first reading everything into memory, and then writing it back to disk.
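For instance, a minimal sketch of that streaming approach using Stream.CopyTo (variable names are illustrative, not from the question's code):

// Sketch: copy from one stream to the other in buffered chunks,
// never materializing the whole file in memory.
using (FileStream source = new FileStream(sourcePath, FileMode.Open))
using (FileStream destination = new FileStream(destinationPath, FileMode.Create))
{
    source.CopyTo(destination);
}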
There is a File.Copy() method in C#; you can see it here: https://msdn.microsoft.com/ru-ru/library/c6cfw35a(v=vs.110).aspx
If you want to implement it yourself, try placing a breakpoint inside your methods and using the debugger. It's like the story about the fisherman and the god who gave him a rod so he could catch fish, rather than giving him the fish itself.
Also, look at your int[] fileData field and the byte[] fileData parameter inside the last method; maybe this is the problem.
I'm trying to convert a file's encoding and replace some text along the way. Unfortunately, I'm getting an OutOfMemory exception, and I'm not sure why. As I understand it, the code streams the original file line by line into a variable (str), performs a couple of string replacements, and then writes the converted line out through the StreamWriter.
Can someone tell me what I'm doing wrong here?
EDIT 1
- I'm currently testing a single file: 1 GB, 2.5 million rows.
- I combined the read and replace into a single line. Same results!
EDIT 2
(By the way, can anyone tell me why the question was downvoted? I'd like to know for future postings.)
The problem is with the file itself. It's output from SQL Server BCP, where I explicitly flag the row terminator with a specific string. By default, when the row terminator flag is omitted, BCP adds a newline at the end of each row, and the code below works perfectly.
What I still don't understand is: when I set the row terminator flag to a specific string, each record appears on a new line, so why doesn't the StreamReader see each record on a separate line? Instead, it appears to view the entire file as one long line. That still doesn't explain the OOM exception, since I have well over 100 GB of memory.
Unfortunately, explicitly setting the row terminator flag is a must. For now, I'll take this over to DBA Exchange.
Thanks
static void Main(string[] args)
{
    String msg = String.Empty;
    String str = String.Empty;
    DirectoryInfo dInfo = new DirectoryInfo(@"\\server\share");
    foreach (var f in dInfo.GetFiles())
    {
        using (StreamReader sr = new StreamReader(f.FullName, Encoding.Unicode, false))
        {
            using (StreamWriter sw = new StreamWriter(f.DirectoryName + "\\new\\" + f.Name, false, Encoding.UTF8))
            {
                try
                {
                    while (!sr.EndOfStream)
                    {
                        str = sr.ReadLine().Replace("this", "that");
                        sw.WriteLine(str);
                    }
                }
                catch (Exception e)
                {
                    msg += f.Name + ": " + e.Message;
                }
            }
        }
    }
    Console.WriteLine(msg);
    Console.ReadLine();
}
Well, your main reading and writing code holds just one line of data at a time. Your msg string, on the other hand, keeps getting larger and larger with each exception.
You'll need to have many millions of files in the folder to get an OutOfMemory exception this way, though.
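If the file really contains no newline characters at all (as with a custom BCP row terminator), one workaround is to copy in fixed-size chunks instead of lines. This is a sketch only, dropping into the question's foreach loop in place of the while loop, and it assumes the searched-for string never spans a chunk boundary (a real implementation would carry a small overlap between chunks):

// Sketch: convert the encoding in fixed-size chunks so memory stays bounded
// even when the file is one enormous "line".
using (StreamReader sr = new StreamReader(f.FullName, Encoding.Unicode, false))
using (StreamWriter sw = new StreamWriter(f.DirectoryName + "\\new\\" + f.Name, false, Encoding.UTF8))
{
    char[] buffer = new char[64 * 1024];
    int read;
    while ((read = sr.Read(buffer, 0, buffer.Length)) > 0)
    {
        // Caveat: a "this" straddling two chunks would be missed.
        sw.Write(new string(buffer, 0, read).Replace("this", "that"));
    }
}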
I want to check whether a list of files is in use or not writable before I start replacing files.
Sure, I know that between the file check and the file copy there is a chance that one or more files will be locked by someone else, but I handle those exceptions. I want to run this test before the file copy because the complete list of files has a better chance of succeeding than if a file fails to be replaced in the middle of the operation.
Does anyone have an example or a hint in the right direction?
There is no guarantee that the list you get, at any point in time, is going to stay the same the next second, as somebody else might take control of a file by the time you come back to it.
I see one way, though: "LOCK" the files that you want to replace by getting their corresponding FileStream objects. This way you are sure that you have locked all "available" files by opening them, and then you can replace them the way you want.
public void TestGivenFiles(List<string> listFiles)
{
    List<FileStream> replaceAbleFileStreams = GetFileStreams(listFiles);
    Console.WriteLine("files Received = " + replaceAbleFileStreams.Count);
    foreach (FileStream fileStream in replaceAbleFileStreams)
    {
        // Replace the files the way you want to.
        fileStream.Close();
    }
}

public List<FileStream> GetFileStreams(List<string> listFilesToReplace)
{
    List<FileStream> replaceableFiles = new List<FileStream>();
    foreach (string sFileLocation in listFilesToReplace)
    {
        FileAttributes fileAttributes = File.GetAttributes(sFileLocation);
        if ((fileAttributes & FileAttributes.ReadOnly) != FileAttributes.ReadOnly)
        { // Make sure that the file is NOT read-only
            try
            {
                FileStream currentWriteableFile = File.OpenWrite(sFileLocation);
                replaceableFiles.Add(currentWriteableFile);
            }
            catch
            {
                Console.WriteLine("Could not get Stream for '" + sFileLocation + "'. Possibly in use");
            }
        }
    }
    return replaceableFiles;
}
That said, you are better off trying to replace them one by one and ignoring the ones that you can't.
You must open each file for writing in order to test this.
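A common shape for such a check is a hypothetical helper like the one below. Note it is still subject to the race described above, and it assumes the read-only case has already been screened out via file attributes, as in the earlier answer:

// Sketch: try to open the file exclusively for writing; failure is taken
// to mean the file is locked or otherwise unavailable right now.
static bool IsWritable(string path)
{
    try
    {
        using (File.Open(path, FileMode.Open, FileAccess.Write, FileShare.None))
        {
            return true;
        }
    }
    catch (IOException)
    {
        return false; // in use by another process (or otherwise blocked)
    }
}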
Possible duplicates:
How to check For File Lock in C# ?
Can I simply ‘read’ a file that is in use?
Read one byte, write same byte?