C# "Collection was modified" error with garbled text when reading from file

I'm new to C# and have been asked to write a custom task in a plugin for our deployment software, but I just can't wrap my head around this. I'm simply trying to log some data that sits in a certain directory on my deployment server to the output log, but I only get the first file logged (and even that text is garbled; I think it's decoding the bytes wrong somehow) before getting a strange error: "Additional information: Collection was modified; enumeration operation may not execute."
Here is the code I have so far:
class Clense : AgentBasedActionBase
{
    public string dataPath { get; set; }

    protected override void Execute()
    {
        IFileOperationsExecuter agent = Context.Agent.GetService<IFileOperationsExecuter>();
        GetDirectoryEntryCommand get = new GetDirectoryEntryCommand() { Path = dataPath };
        GetDirectoryEntryResult result = agent.GetDirectoryEntry(get);
        DirectoryEntryInfo info = result.Entry; // info has directory information

        List<FileEntryInfo> myFiles = info.Files.ToList();
        foreach (FileEntryInfo file in myFiles)
        {
            Byte[] bytes = agent.ReadFileBytes(file.Path);
            String s = Encoding.Unicode.GetString(bytes);
            LogInformation(s);
            // myFiles.Remove(file);
        }
    }
}
Does anyone know what I can do to try to fix this?
Update
Removing the myFiles.Remove() fixed the error (I thought it would loop too many times but it doesn't) and it looks like I'm getting one log entry per file now, but the messages are still garbled. Does anyone have any idea why this is happening?

You can either iterate the myFiles collection in the reverse direction (so that you don't corrupt the collection when you remove each individual file), or you can simply clear the collection when you are done iterating it (which would accomplish the same thing).
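As a rough sketch against the code in the question (UTF-8 decoding is assumed here; the encoding issue is covered in another answer):

// A reverse for-loop is safe to remove from, because deleting the current
// element never shifts the indexes of the items still to be visited.
for (int i = myFiles.Count - 1; i >= 0; i--)
{
    FileEntryInfo file = myFiles[i];
    byte[] bytes = agent.ReadFileBytes(file.Path);
    LogInformation(Encoding.UTF8.GetString(bytes));
    myFiles.RemoveAt(i);
}

// Alternatively, keep the foreach as it is and just empty the list afterwards:
// myFiles.Clear();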

You are modifying the collection with
myFiles.Remove(file);
Delete that line (since it's the cause).

In his comment, Blorgbeard is almost certainly correct with regard to the encoding used to read the files on disk. Remember that Encoding.Unicode is actually UTF-16, which is somewhat confusing, and if I had to guess, it is probably not the encoding your files were created with.
For completeness, I will add the BuildMaster-idiomatic way to handle your scenario using the ReadAllText() extension method on IFileOperationsExecuter:
protected override void Execute()
{
    var agent = this.Context.Agent.GetService<IFileOperationsExecuter>();
    var entry = agent.GetDirectoryEntry(new GetDirectoryEntryCommand() { Path = dataPath }).Entry;
    foreach (var file in entry.Files)
    {
        string contents = agent.ReadAllText(file.Path);
        this.LogInformation(contents);
    }
}
The ReadAllText() method will internally assume UTF8 encoding, but there is an overload that accepts a different encoding if necessary.
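To see why a wrong encoding guess produces garbage rather than an error, here is a small self-contained demonstration (plain .NET, nothing BuildMaster-specific):

using System;
using System.Text;

class EncodingDemo
{
    static void Main()
    {
        // The bytes a UTF-8 file on disk would contain for this text.
        byte[] utf8Bytes = Encoding.UTF8.GetBytes("Hello, world");

        // Decoding with the wrong encoding silently produces mojibake:
        // Encoding.Unicode is UTF-16LE, so it pairs the bytes up incorrectly.
        Console.WriteLine(Encoding.Unicode.GetString(utf8Bytes)); // garbled, CJK-looking text

        // Decoding with the encoding the file was actually written in works.
        Console.WriteLine(Encoding.UTF8.GetString(utf8Bytes));    // Hello, world
    }
}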

Related

Check inside loop if *.txt file has been created

My code checks inside a loop whether a *.txt file has been created.
If the file has not been created after x time, I will throw an exception.
Here is my code:
var AnswerFile = @"C:\myFile.txt";
for (int i = 0; i <= 30; i++)
{
    if (File.Exists(AnswerFile))
        break;
    await Task.Delay(100);
}
if (File.Exists(AnswerFile))
{
    // file was created in time
}
else
{
    // timed out
}
After the loop I check whether my file has been created or not. The loop expires after 3 seconds (100 ms * 30 iterations).
My code works; I am just asking about its performance and quality. Is there a better approach than mine? For example, should I use the FileInfo class instead, like this?
var fi1 = new FileInfo(AnswerFile);
if (fi1.Exists)
{
}
Or should I use the FileSystemWatcher class?
You should perhaps use a FileSystemWatcher for this and decouple the process of creating the file from the process of reacting to its presence. If the file must be generated within a certain time because it has an expiry time, you could make the expiry datetime part of the file name, so that if it appears after that time you know it has expired. A note of caution with the FileSystemWatcher: it can sometimes miss events (the documentation says events can be missed if large numbers are generated in a short time).
In the past I've used this for watching for files being uploaded via FTP. As soon as the "created" notification arrives, I put the file into a list and check it periodically to see whether it is still growing; you can either look at the watcher's last-write-time events for this or directly compare the current size of the file with its size some time ago. In either approach it is probably easiest to use a dictionary to track each file and its previous size or most recent last-write event.
After a minute of no growth I consider the file uploaded completely and I process it. It might be wise for you to implement a similar delay if you use a FileSystemWatcher and the files arrive by some slow generating method.
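A minimal sketch of that approach (the path and filter are placeholders for your own values; you would still want the "has it stopped growing" check described above before processing the file):

using System;
using System.IO;

class AnswerFileWatcher
{
    static void Main()
    {
        // Watch the folder the answer file is expected to appear in.
        using (var watcher = new FileSystemWatcher(@"C:\", "myFile.txt"))
        {
            watcher.Created += (sender, e) =>
                Console.WriteLine($"File created: {e.FullPath}");

            // Changed events keep firing while the file is still being written,
            // which is one way to tell whether it has stopped growing.
            watcher.Changed += (sender, e) =>
                Console.WriteLine($"File changed: {e.FullPath} at {DateTime.Now}");

            watcher.EnableRaisingEvents = true;

            Console.WriteLine("Watching... press Enter to stop.");
            Console.ReadLine();
        }
    }
}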
Why don't you retrieve a list of file names and then search that list? You can use Directory.GetFiles to get the list of files inside a directory and then search within it.
This would be more flexible for you, since you create the list once and reuse it across the application, instead of calling File.Exists for each file.
Example :
var path = @"C:\folder\"; // set the folder path, which contains all answer files
var ext = "*.txt";        // set the file extension

// Get the file name list (bare names, without path or extension) and make them all lowercase.
var files = Directory.GetFiles(path, ext)
    .Select(x => x.Substring(path.Length, (x.Length - path.Length) - ext.Length + 1).Trim().ToLower())
    .ToList();

// The file name to search for
var search = "myFile";

// Check
if (files.Contains(search.ToLower()))
{
    Console.WriteLine($"File : {search} already exists.");
}
else
{
    Console.WriteLine($"File : {search} was not found.");
}

On some tiny percentage of reads across the user base, C# `File.ReadAllText` gives a string consisting solely of null characters (\u0000) on Windows

We're seeing a decent volume of errors in our analytics when certain users of our application try to read save files. 99.9%+ of reads across our entire user base go fine, but we get a handful of these errors in our error telemetry data.
From what we can tell, in some tiny % of reads, File.ReadAllText outputs a long string of null characters:
\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000 ...etc
The number of null characters seems to match the size of the user's data file, but it's hard to tell for sure, and we don't have a way of reproducing the issue.
Subsequent reads / parsings of the same exact files appear to happen fine (from what I can tell from the analytics surrounding the events).
Is this a common random occurrence on File.ReadAllText on Windows machines? What could possibly be causing this read (possibly writing?) of all-null-character files?
\u0000 is a null character, therefore there must be something wrong with your retrieval method or your file path in general. Assuming your code is possibly faulty in some manner (you have not shown any), maybe give this a try:
// Open the file
var stream = File.OpenText("json file.txt");
// Read the file
string st = stream.ReadToEnd();
var jsonArray = JsonArray.Parse(st);
foreach (var item in jsonArray)
{
    JsonObject ob = new JsonObject(item);
    foreach (var t in ob.Values)
    {
        JsonObject oo = new JsonObject(t);
        foreach (var x in oo)
        {
            textBox1.AppendText(x.Key + " : " + x.Value + "\n");
        }
    }
}
Code Reference: Code Project.

Saving & loading data on level selection screen, XML, XNA

I am making a basic platformer, my first ever game. I've run into a bit of a problem. So far the game only has one level, and it is loaded from a .txt file. However I'd like to have a sort of an Angry Birdish world/level selection screen.
The plan is to have an icon for each level visible, but only so far completed levels and the next one accessible. Also for the completed levels the score (stars, whatever) would be displayed under the icon.
I do not wish to load the levels themselves from XML, at least not yet; only the persistent world data that needs to be read AND written. I assume the easiest way is to load even the formatting of the level selection screen from XML, rather than the method I currently use (text files).
I could do this with text files, I suppose, but I really do not relish the idea of writing and sorting through the file. I then discovered that XML files should be a bit less problematic in this regard. However, an additional problem arises from the fact that I have never worked with XML files before.
Could someone point me in the direction of a tutorial for this sort of thing, or some sample you might have come across that accomplishes at least relatively similar results? I don't expect anyone to do the coding for me, but if you have pointers or the time and patience to provide a sample, I'd appreciate it a lot.
After some further digging and fumbling with tutorials for older XNA versions I managed to produce following save/load class:
namespace SaveLoadXML
{
    class SaveLoad
    {
        public LevelInfo Load(int id)
        {
            LevelInfo level;

            // Get the path of the save game
            string fullpath = "World.xml";

            // Open the file
            FileStream stream = File.Open(fullpath, FileMode.OpenOrCreate, FileAccess.Read);
            try
            {
                // Read the data from the file
                XmlSerializer serializer = new XmlSerializer(typeof(LevelInfo));
                level = (LevelInfo)serializer.Deserialize(stream);
            }
            finally
            {
                // Close the file
                stream.Close();
            }
            return (level);
        }

        public void Save(LevelInfo level, int id)
        {
            // Get the path of the save game
            string fullpath = "World.xml";

            // Open the file, creating it if necessary
            FileStream stream = File.Open(fullpath, FileMode.OpenOrCreate);
            try
            {
                // Convert the object to XML data and put it in the stream
                XmlSerializer serializer = new XmlSerializer(typeof(LevelInfo));
                serializer.Serialize(stream, level);
            }
            finally
            {
                // Close the file
                stream.Close();
            }
        }
    }
}
Now I started to wonder: is there a way to target a specific part of the XML file, or does writing always start from the beginning? Almost all of the examples I saw had a condition at the start: if the file exists, delete it and then write.
I assume I could (or even should?) make a list of LevelInfo objects and just load them all at once, as there is no real need to load a single LevelInfo anyway. On saving, however, do I need to load the previous state (the old list), manipulate the list at the indexes involved, and then delete the file and save it again?
This might open an easy way for the system to fail if something goes wrong during saving, or if the power fails, for example. The whole file would be lost or corrupt. I suppose this could be countered by using a backup file and then checking the integrity of the main file, but now it's starting to feel like quite a mountain to climb for a beginner like me.
Having tried this question on GameDev, I'll just clarify the main question here:
1) Can I save info about only one or two levels in the XML file containing info for all levels? I.e., can I use some indexing to point the write operation to a particular section that would then be overwritten/replaced?
2) If not, is there any way to safely load all the info from the file, delete the file, and save all the info again after modifying it where needed? (A rough sketch of what I have in mind follows below.)
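For what it's worth, my current sketch of option 2 writes the complete new state to a temporary file first and only then swaps it in with File.Replace, so a failed save should never corrupt the existing World.xml (the file names here are just placeholders):

public void SaveSafely(List<LevelInfo> levels)
{
    const string fullPath = "World.xml";
    const string tempPath = "World.tmp";
    const string backupPath = "World.bak";

    // Write the complete new state to a temporary file first.
    // (LevelInfo has to be a public class for XmlSerializer to accept it.)
    using (FileStream stream = File.Open(tempPath, FileMode.Create))
    {
        XmlSerializer serializer = new XmlSerializer(typeof(List<LevelInfo>));
        serializer.Serialize(stream, levels);
    }

    if (File.Exists(fullPath))
    {
        // Swap the new file in, keeping the old one as a backup.
        File.Replace(tempPath, fullPath, backupPath);
    }
    else
    {
        File.Move(tempPath, fullPath);
    }
}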
After some looking into this Json stuff, I've managed to successfully serialize test level information. However, de-serialization fails as I have a rectangle as a part of the object. Error is as follows:
Error converting value "{X:1 Y:1 Width:1 Height:1}" to type 'Microsoft.Xna.Framework.Rectangle'. Path '[0].Rectangle', line 6, position 46.
class LevelInfo
{
    public int ID { get; set; }
    public Vector2 Dimensions { get; set; }
    public Vector2 Position { get; set; }
    public Rectangle Rectangle { get; set; }
    public int Stars { get; set; }
    public string Text { get; set; }
}

class SaveLoadJSON
{
    public static List<LevelInfo> Load()
    {
        List<LevelInfo> levels = new List<LevelInfo>();
        using (StreamReader file = File.OpenText("World.json"))
        {
            JsonSerializer serializer = new JsonSerializer();
            levels = (List<LevelInfo>)serializer.Deserialize(file, typeof(List<LevelInfo>));
        }
        return levels;
    }

    public static void Save(List<LevelInfo> levels)
    {
        if (File.Exists("World.json"))
        {
            File.Delete("World.json");
        }

        using (FileStream fs = File.Open("World.json", FileMode.CreateNew))
        using (StreamWriter sw = new StreamWriter(fs))
        using (JsonWriter jw = new JsonTextWriter(sw))
        {
            jw.Formatting = Formatting.Indented;
            JsonSerializer serializer = new JsonSerializer();
            serializer.Serialize(jw, levels);
        }
    }
}
Is there a way to work around this? Preferably a relatively simple way for a simple beginner like me.
Or alternatively, is there a way to omit the rectangle information to begin with, and maybe add it later? If I put nothing into the rectangle, it is still added to the JSON file with 0 values. I do need the rectangle info for the drawing.
So here comes the promised answer.
Personally I'd prefer using JSON for storing data, since it's a lot easier to work with than XML and takes up less storage. What you're going to want to do is make data models of your player, enemies, items, scene objects, etc.
Then you'll want to JsonConvert.SerializeObject() a parent data model that contains all those things.
Save this in any file, deserialize it again on load, and reconstruct all objects from scratch.
Alternatively, just make all the properties in the classes you're already working with public; that way JsonConvert will be able to serialize the entire model. Keep in mind that if you do this at runtime, it will capture more of a complete snapshot of the level's current state: where the enemies are located, the health remaining, and whatever else you may have.
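A rough sketch of that parent-model idea (LevelData, WorldSave and WorldStorage are names I've made up for illustration; storing the rectangle as four plain ints and rebuilding the XNA Rectangle on load sidesteps the conversion error quoted in the question):

using System.Collections.Generic;
using System.IO;
using Newtonsoft.Json;

// A plain data model per level; only simple, serializable types.
public class LevelData
{
    public int ID { get; set; }
    public int Stars { get; set; }
    public int RectX { get; set; }
    public int RectY { get; set; }
    public int RectWidth { get; set; }
    public int RectHeight { get; set; }
    public string Text { get; set; }
}

// The parent model that gets serialized and deserialized in one go.
public class WorldSave
{
    public List<LevelData> Levels { get; set; } = new List<LevelData>();
}

public static class WorldStorage
{
    public static void Save(WorldSave world, string path)
    {
        // One call serializes the whole snapshot.
        File.WriteAllText(path, JsonConvert.SerializeObject(world, Formatting.Indented));
    }

    public static WorldSave Load(string path)
    {
        // Deserialize, then rebuild the game objects from this data.
        return JsonConvert.DeserializeObject<WorldSave>(File.ReadAllText(path));
    }
}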
I hope this answers your question.

How to save a text file to the hard disk using C#

I'm creating a logger for my app and I'm stuck with a problem. I need to save my log file on my C drive, but when I execute the code it gives me the error "Given Path Format Is Not Supported". My current code is given below:
string path = "C:\\Logger\\" + DateTime.Now.Date.ToString() + ".txt";

public void CreateDirectory()
{
    if (!File.Exists(path))
    {
        File.Create(path);
    }
}
Any solutions?
You're going to have to format the date:
string path="C:\\Logger\\"+DateTime.Now.Date.ToString("yyyy_MM_dd")+".txt";
because the operating system isn't going to accept something like this:
C:\Logger\07/27/2013.txt
Now, for future reference, consider using Path.Combine to build your paths:
var path = Path.Combine("C:\\Logger",
    DateTime.Now.Date.ToString("yyyy_MM_dd") + ".txt"); // keep the extension as part of the file name
You won't have to determine when to provide back slashes and when not to. If there isn't one, it will be appended for you.
Finally, you may experience problems if the directory doesn't exist. Something you can do to mitigate that is this:
var path = ...
var dir = Path.GetDirectoryName(path);
if (!Directory.Exists(dir))
{
    Directory.CreateDirectory(dir);
}
But even then, you can run into permissions issues during runtime.
Check that the result of this: DateTime.Now.Date.ToString() is accepted by the operating system.

Issues with StreamReader, ThreadSafety and Read Mode

I have the following code to read a file:
StreamReader str = new StreamReader(File.Open(fileName, FileMode.Open, FileAccess.Read));
string fichier = str.ReadToEnd();
str.Close();
This is part of an ASP.NET web service and has been working fine for a year now in production. Now, with increasing load on the server, the customer has started getting "File already in use" errors. That file is only read by this code and is never written to by the application.
One problem that I clearly see is that we are not caching the contents of the file for future use. We will do that. But I need to understand why and how we are getting this issue.
Is it because of multiple threads trying to read the file? I read that StreamReader is not thread safe, but why should that be a problem when I am opening the file in read mode?
You need to open the file with shared read access allowed. Use the overload of File.Open that accepts a FileShare parameter to specify a file sharing mode; FileShare.Read allows other readers to access the file at the same time.
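Applied to the code in the question, that looks something like this:

// The default sharing mode for File.Open(path, mode, access) is FileShare.None,
// so concurrent readers collide. Passing FileShare.Read lets them coexist.
using (FileStream fs = File.Open(fileName, FileMode.Open, FileAccess.Read, FileShare.Read))
using (StreamReader str = new StreamReader(fs))
{
    string fichier = str.ReadToEnd();
    // ... use the contents ...
}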
Another possible solution is to load the file once into memory in a static constructor of a class and then store the contents in a static read-only field. Since a static constructor is guaranteed to run only once and is thread-safe, you don't have to do anything special to make it work.
If you never change the contents in memory, you won't even need to lock when you access the data. If you do change the contents, you need to first clone this data every time when you're about to change it but then again, you don't need a lock for the clone operation since your actual original data never changes.
For example:
public static class FileData
{
    private static readonly string s_sFileData;

    static FileData()
    {
        s_sFileData = ...; // read file data here using your code
    }

    public static string Contents
    {
        get
        {
            return (string.Copy(s_sFileData));
        }
    }
}
This encapsulates your data and gives you read-only access to it.
You only need String.Copy() if your code may modify the file contents - this is just a precaution to force creating a new string instance to protect the original string. Since string is immutable, this is only necessary if your code uses string pointers - I only added this bit because I ran into an issue with a similar variable in my own code just last week where I used pointers to cached data. :)
FileMode controls how the file is opened or created; FileAccess controls what you can do with it (read/write).
Shared access to files is handled at the operating-system level, and you can request sharing behaviour with FileShare (the fourth parameter of that File.Open overload); see the documentation.
