I'm trying to make ildasm output more like JSON or XML so that it's reasonably easy to read programmatically.
The way I intended to do it was by reading the output line by line, adding the classes, methods, etc. to lists, then modifying and rewriting everything as XML and reading that.
Question: Are there any smarter or simpler ways to read the output?
There is a way to get a list of classes and methods by reading IL Code.
The solution I am suggesting might be a bit long, but it will work.
The IL is just your compiled .exe or .dll. First convert it to C# or VB using ILSpy: download the tool and open your DLL in it; it can decompile your IL code into C# or VB.
After converting, save the decompiled code into a txt file.
Then read the text file and find the classes and methods inside it.
To read method names:
MatchCollection mc = Regex.Matches(str, @"(\s)([A-Z]+[a-z]+[A-Z]*)+\(");
To read class names:
Iterate through the file line by line and check whether the line contains the keyword "class". If it does, split the line and store the token that comes after "class", which is the class name.
Complete code:
using System;
using System.Collections.Generic;
using System.IO;
using System.Text.RegularExpressions;

static void Main(string[] args)
{
    string line;
    List<string> classLst = new List<string>();
    List<string> methodLst = new List<string>();
    // The file is read twice: line by line for the class names, and as a single string for the method regex.
    System.IO.StreamReader file = new System.IO.StreamReader(@"C:\Users\******\Desktop\TreeView.txt");
    string str = File.ReadAllText(@"C:\Users\*******\Desktop\TreeView.txt");
    while ((line = file.ReadLine()) != null)
    {
        if (line.Contains("class") && !line.Contains("///"))
        {
            // For finding class names
            int si = line.IndexOf("class");
            string followString = line.Substring(si);
            if (!string.IsNullOrEmpty(followString))
            {
                string[] splits = followString.Split(' ');
                if (splits.Length > 1)
                {
                    classLst.Add(splits[1]);
                }
            }
        }
    }
    MatchCollection mc = Regex.Matches(str, @"(\s)([A-Z]+[a-z]+[A-Z]*)+\(");
    foreach (Match m in mc)
    {
        // Trim the leading whitespace and the trailing '(' captured by the pattern.
        methodLst.Add(m.ToString().Substring(1, m.ToString().Length - 2));
        //Console.WriteLine(m.ToString().Substring(1, m.ToString().Length - 2));
    }
    file.Close();
    Console.WriteLine("******** classes ***********");
    foreach (var item in classLst)
    {
        Console.WriteLine(item);
    }
    Console.WriteLine("******** end of classes ***********");
    Console.WriteLine("******** methods ***********");
    foreach (var item in methodLst)
    {
        Console.WriteLine(item);
    }
    Console.WriteLine("******** end of methods ***********");
    Console.ReadKey();
}
Here I am storing the class names and method names in lists; you can later store them in XML or JSON as you described.
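If you want to go a step further, here is a minimal sketch of writing those lists to XML with XmlSerializer (the CodeInfo wrapper type and the output path are just placeholders I made up):
using System.Collections.Generic;
using System.IO;
using System.Xml.Serialization;

// Simple container so XmlSerializer has a concrete type to work with.
public class CodeInfo
{
    public List<string> Classes { get; set; }
    public List<string> Methods { get; set; }
}

// ... after filling classLst and methodLst as in the code above:
CodeInfo info = new CodeInfo { Classes = classLst, Methods = methodLst };
XmlSerializer serializer = new XmlSerializer(typeof(CodeInfo));
using (StreamWriter writer = new StreamWriter(@"C:\temp\CodeInfo.xml")) // example path
{
    serializer.Serialize(writer, info);
}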
Ping us if you face any problem.
I'm reading numbers from a file, and when I try to convert the text to an int I get this error: System.FormatException: 'Input string was not in a correct format.' Reading the file works and I've tested all of that; it just seems to get stuck on the conversion no matter what I try. This is what I've done so far:
StreamReader share_1 = new StreamReader("Share_1_256.txt");
string data_1 = share_1.ReadToEnd();
int intData1 = Int16.Parse(data_1);
And with the Parse call in place, it doesn't print anything.
As we can see in your post, your input file contains not one number but several. So what you need to do is iterate through all the lines of your file and try the parsing for each line.
EDIT: The old code was using an external library. For raw C#, try:
using (StringReader reader = new StringReader(input))
{
string line;
while ((line = reader.ReadLine()) != null)
{
// Do something with the line
}
}
In addition, I encourage you to always parse strings to numbers using the TryParse method rather than Parse, as in the sketch below.
You can find some details and different implementations for that common problem in C#: C#: Looping through lines of multiline string
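A rough sketch of that TryParse approach (the variable names here are just illustrative):
using (StringReader reader = new StringReader(input))
{
    string line;
    while ((line = reader.ReadLine()) != null)
    {
        int value;
        if (int.TryParse(line, out value))
        {
            Console.WriteLine(value); // parsed successfully
        }
        else
        {
            Console.WriteLine("Skipping invalid line: " + line);
        }
    }
}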
Parse every single line:
while (!reader.EndOfStream)
{
string line = reader.ReadLine();
int intData1 = Int16.Parse(line);
}
You can simplify the code and get rid of StreamReader with a help of File class and Linq:
// Turn text file into IEnumerable<int>:
var data = File
.ReadLines("Share_1_256.txt")
.Select(line => int.Parse(line));
//TODO: add .OrderBy(item => item); if you want to sort items
// Loop over all numbers within file: 15, 1, 48, ..., 32
foreach (int item in data) {
//TODO: Put relevant code here, e.g. Console.WriteLine(item);
}
Code:
using System;
using System.IO;
namespace TimeTress
{
class Program
{
static void Main(string[] args)
{
GetString("../../../../timeline.csv");
GetString("../../../../people.csv");
}
static void GetString(string path)
{
if (File.Exists(path))
{
foreach (var line in File.ReadAllLines(path))
{
Console.WriteLine(line);
}
}
else
{
Console.WriteLine($"Файл не найден по пути {Path.GetFullPath(path)}");
}
}
}
}
I need the result not simply displayed but written into two different variables, preferably string[][] or string[] arrays, so that I can work with them later. File text: timeline: {event_date}; {event_description} people: {Name}; {Date of Birth}; {Date of death}
Change the method's return type to string[] or List<string> (if you don't know the difference, use Google)
then, instead of:
Console.WriteLine(line);
create a list and add lines to it:
List<string> result = new List<string>();
foreach (var line in File.ReadAllLines(path))
{
    result.Add(line);
}
return result; // or `result.ToArray()`, if your return type is `string[]`
Now you need to think about the return value when the file does not exist. One option would be returning null. But that's not good and forces the users (of the method, I mean, the programmers) to check for null result whenever the method is called. A better option in my opinion is not checking for the existence of the file. Yes, you heard me right. Let the FileNotFoundException be thrown.
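Putting it together, a minimal sketch (the method names GetLines and GetFields are just suggestions I made up) that returns the lines and additionally splits each one on ';' so you end up with a string[][]:
static string[] GetLines(string path)
{
    // Let FileNotFoundException propagate if the file does not exist.
    return File.ReadAllLines(path);
}

static string[][] GetFields(string path)
{
    string[] lines = GetLines(path);
    string[][] fields = new string[lines.Length][];
    for (int i = 0; i < lines.Length; i++)
    {
        fields[i] = lines[i].Split(';');
    }
    return fields;
}

// Usage with the files from the question:
string[][] timeline = GetFields("../../../../timeline.csv"); // {event_date};{event_description}
string[][] people = GetFields("../../../../people.csv");     // {Name};{Date of Birth};{Date of death}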
For more elaborate ways of processing the CSV file and parsing its fields, consult other posts on the internet, including the ones available on Stack Overflow.
First, the variable(s) need to be defined before a value can be assigned to the variable(s). This variable definition is done using a statement similar to the following:
string[] stringArray;
After defining a variable, it is possible to assign values to the intended variable(s), as shown below:
stringArray = new string[] { "message one", "message two" };
Building upon this, if one would like to read through the lines in a text file, assign the lines to an array of strings, and then return a variable containing that array of strings, the following function would be one way to accomplish this:
static string[] GetListOfStringsFromTextFile(string filePath)
{
string[] stringArray;
if (File.Exists(filePath))
{
// ReadAllLines() already returns a string array. No need to loop.
stringArray = File.ReadAllLines(filePath);
}
else
{
// If file not found, return empty string array as default.
stringArray = new string[0];
}
return stringArray;
}
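A quick usage example (the path here is just illustrative):
string[] lines = GetListOfStringsFromTextFile(@"C:\temp\people.csv");
foreach (string line in lines)
{
    Console.WriteLine(line);
}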
I am new to C# and am attempting to read in a .csv file and put each line of text into a separate list item so I can sort it later.
The .csv file is organised like so:
1;"final60";"United Kingdom";"2013-12-06 15:48:16";
2;"donnyr8";"Netherlands";"2013-12-06 15:54:32";
etc
This is my first attempt, which doesn't work. It shows no errors in Visual Studio 2010, but when I run the console program it displays the following exception instead of the list:
Exception of type 'System.OutOfMemoryException' was thrown. This is bizarre, because the .csv file only contains a small list.
try
{
// load csv file
using (StreamReader file = new StreamReader("file.csv"))
{
string line = file.ReadLine();
List<string> fileList = new List<string>();
// Do something with the lines from the file until the end of
// the file is reached.
while (line != null)
{
fileList.Add(line);
}
foreach (string fileListLine in fileList)
{
Console.WriteLine(fileListLine);
}
}
}
catch (Exception e)
{
// Let the user know what went wrong.
Console.WriteLine("The file could not be read:");
Console.WriteLine(e.Message);
}
So am I approaching this the correct way?
If the file you are loading isn't really big then you can use File.ReadAllLines:
List<string> list = File.ReadAllLines("file.csv").ToList();
As Servy pointed out in a comment, it would be better to use the File.ReadLines method.
File.ReadLines - MSDN
The ReadLines and ReadAllLines methods differ as follows: When you use
ReadLines, you can start enumerating the collection of strings before
the whole collection is returned; when you use ReadAllLines, you must
wait for the whole array of strings be returned before you can access
the array. Therefore, when you are working with very large files,
ReadLines can be more efficient.
If you need a List<string> then you can do:
List<string> list = File.ReadLines("file.csv").ToList();
You are not updating the line variable, so line always stays different from null: an infinite loop, which causes the OutOfMemoryException.
try
{
// load csv file
using (StreamReader file = new StreamReader("file.csv"))
{
string line = file.ReadLine();
List<string> fileList = new List<string>();
// Do something with the lines from the file until the end of
// the file is reached.
while (line != null)
{
fileList.Add(line);
line = file.ReadLine();
}
foreach (string fileListLine in fileList)
{
Console.WriteLine(fileListLine);
}
}
}
but the correct approach would be
List<string> list = File.ReadLines("file.csv").ToList();
which is better than File.ReadAllLines for the following reason
From MSDN:
When you use ReadLines, you can start enumerating the collection of strings before the whole collection is returned;
You should use File.ReadAllLines() and then parse the strings in the array.
For extremely large files this might not be feasible and you'll have to stream the single lines in and process them one by one.
But this is something you can only decide AFTER you have seen this quick approach failing miserably. Until then, stick to the quick and dirty.
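If it does fail, here is a rough sketch of streaming line by line without loading the whole file into memory (the processing is just a placeholder):
foreach (string line in File.ReadLines("file.csv"))
{
    // Each line is read on demand; the whole file is never held in memory at once.
    string[] fields = line.Split(';');
    // TODO: process fields here
}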
I need to write my array list to a text file and so far have come up with this code.
Now I'm confused as to how to write the 'line' to my text file using the TextWriter.
One procedure loads the list from a txt file, shown below.
public void LoadArrayList()
{
TextReader tr;
tr = File.OpenText("C:\\Users\\Mirro\\Documents\\Visual Studio 2010\\Projects\\Assessment2\\Assessment2\\act\\actors.txt");
string line = tr.ReadToEnd();
Console.WriteLine(line);
if (line != null)
{
ActorArrayList.Add(line);
}
else
tr.Close();
}
Then I have one that populates the combo box in my form.
public void PopulateActors()
{
cboActor.Items.Clear();
foreach (string line in ActorArrayList)
{
cboActor.Items.AddRange(File.ReadAllLines("C:\\Users\\Mirro\\Documents\\Visual Studio 2010\\Projects\\Assessment2\\Assessment2\\act\\actors.txt"));
}
}
And this procedure needs to write my whole "ActorArrayList" into the text file.
public void WriteArrayList()
{
}
I'm sorry for being confusing originally.
Try the following code:
// Example #1: Write an array of strings to a file.
// Create a string array that consists of three lines.
string[] lines = { "First line", "Second line", "Third line" };
// WriteAllLines creates a file, writes a collection of strings to the file,
// and then closes the file.
System.IO.File.WriteAllLines(@"C:\Users\Mirro\Documents\Visual Studio 2010\Projects\Assessment2\Assessment2\act\actors.txt", lines);
OUTPUT :
// First line
// Second line
// Third line
The best way is @Leez's way, but you can also use a TextWriter and a foreach loop to do this:
//your array
string[] yourArray = { "fisrt", "second", "third" };
string text = "C:\\Users\\Mirro\\Documents\\Visual Studio 2010\\Projects\\Assessment2\\Assessment2\\act\\actors.txt";
using (TextWriter writer = File.CreateText(text))
{
foreach (string i in yourArray)
{
writer.WriteLine(i);
}
}
System.IO.File.WriteAllText("FILE_PATH", line);
BTW, where is the ArrayList in your code? Also, consider using System.IO.File.ReadAllText("FILE_PATH") for everyday file reading.
If you were to actually write an ArrayList to a disk file, that would require you to first serialize the contents of the ArrayList to maybe XML or binary etc. Then you can use the above methods to write that serialized representation to a file. Also note that serializing collections involves a concept called deep and shallow copying. This question may help you better understand the concept.
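For illustration, a rough sketch of the XML route, assuming the items are plain strings and using a generic List<string> (which is easier to serialize than a non-generic ArrayList); the path is just an example:
using System.Collections.Generic;
using System.IO;
using System.Xml.Serialization;

List<string> actors = new List<string> { "Actor One", "Actor Two" };
XmlSerializer serializer = new XmlSerializer(typeof(List<string>));
using (StreamWriter writer = new StreamWriter(@"C:\temp\actors.xml")) // example path
{
    // Writes the list out as <ArrayOfString><string>...</string></ArrayOfString>
    serializer.Serialize(writer, actors);
}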
File.WriteAllLines(filePath, ActorArrayList.ToArray());
WriteAllLines ends each line with two characters: a carriage return and a line feed (\r\n). If you don't want both characters at the end of each line and only want a line feed (\n), you can use a StreamWriter instead:
using (StreamWriter sw = new StreamWriter(#"C:\mypath\file.txt"))
{
foreach (string s in linesArray)
sw.Write(s + "\n");
}
I am trying to read some text files, where each line needs to be processed. At the moment I am just using a StreamReader, and then reading each line individually.
I am wondering whether there is a more efficient way (in terms of LoC and readability) to do this using LINQ without compromising operational efficiency. The examples I have seen involve loading the whole file into memory, and then processing it. In this case however I don't believe that would be very efficient. In the first example the files can get up to about 50k, and in the second example, not all lines of the file need to be read (sizes are typically < 10k).
You could argue that nowadays it doesn't really matter for these small files, however I believe that sort of approach leads to inefficient code.
First example:
// Open file
using(var file = System.IO.File.OpenText(_LstFilename))
{
// Read file
while (!file.EndOfStream)
{
String line = file.ReadLine();
// Ignore empty lines
if (line.Length > 0)
{
// Create addon
T addon = new T();
addon.Load(line, _BaseDir);
// Add to collection
collection.Add(addon);
}
}
}
Second example:
// Open file
using (var file = System.IO.File.OpenText(datFile))
{
// Compile regexs
Regex nameRegex = new Regex("IDENTIFY (.*)");
while (!file.EndOfStream)
{
String line = file.ReadLine();
// Check name
Match m = nameRegex.Match(line);
if (m.Success)
{
_Name = m.Groups[1].Value;
// Remove me when other values are read
break;
}
}
}
You can write a LINQ-based line reader pretty easily using an iterator block:
static IEnumerable<SomeType> ReadFrom(string file) {
string line;
using(var reader = File.OpenText(file)) {
while((line = reader.ReadLine()) != null) {
SomeType newRecord = /* parse line */
yield return newRecord;
}
}
}
or to make Jon happy:
static IEnumerable<string> ReadFrom(string file) {
string line;
using(var reader = File.OpenText(file)) {
while((line = reader.ReadLine()) != null) {
yield return line;
}
}
}
...
var typedSequence = from line in ReadFrom(path)
let record = ParseLine(line)
where record.Active // for example
select record.Key;
then you have ReadFrom(...) as a lazily evaluated sequence without buffering, perfect for Where etc.
Note that if you use OrderBy or the standard GroupBy, it will have to buffer the data in memory; if you need grouping and aggregation, "PushLINQ" has some fancy code to allow you to perform aggregations on the data while discarding it (no buffering). Jon's explanation is here.
It's simpler to read a line and check whether or not it's null than to check for EndOfStream all the time.
However, I also have a LineReader class in MiscUtil which makes all of this a lot simpler - basically it exposes a file (or a Func<TextReader>) as an IEnumerable<string>, which lets you do LINQ stuff over it. So you can do things like:
var query = from file in Directory.GetFiles("*.log")
from line in new LineReader(file)
where line.Length > 0
select new AddOn(line); // or whatever
The heart of LineReader is this implementation of IEnumerable<string>.GetEnumerator:
public IEnumerator<string> GetEnumerator()
{
using (TextReader reader = dataSource())
{
string line;
while ((line = reader.ReadLine()) != null)
{
yield return line;
}
}
}
Almost all the rest of the source is just giving flexible ways of setting up dataSource (which is a Func<TextReader>).
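For illustration only, here is a rough sketch of what such a wrapper might look like (this is not the actual MiscUtil source; the constructor shapes are assumptions):
using System;
using System.Collections;
using System.Collections.Generic;
using System.IO;

public sealed class LineReader : IEnumerable<string>
{
    private readonly Func<TextReader> dataSource;

    // Convenience constructor: open a text file lazily each time the sequence is enumerated.
    public LineReader(string fileName)
        : this(() => File.OpenText(fileName))
    {
    }

    public LineReader(Func<TextReader> dataSource)
    {
        this.dataSource = dataSource;
    }

    public IEnumerator<string> GetEnumerator()
    {
        using (TextReader reader = dataSource())
        {
            string line;
            while ((line = reader.ReadLine()) != null)
            {
                yield return line;
            }
        }
    }

    IEnumerator IEnumerable.GetEnumerator()
    {
        return GetEnumerator();
    }
}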
Since .NET 4.0, the File.ReadLines() method is available.
int count = File.ReadLines(filepath).Count(line => line.StartsWith(">"));
NOTE: You need to watch out for the IEnumerable<T> solution, as it will result in the file being open for the duration of processing.
For example, with Marc Gravell's response:
foreach(var record in ReadFrom("myfile.csv")) {
DoLongProcessOn(record);
}
the file will remain open for the whole of the processing.
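If that matters for your scenario, one option (just a sketch) is to materialise the sequence first, so the file is read and closed before the long processing starts:
// ToList() reads the whole file eagerly and disposes the reader,
// at the cost of holding all records in memory at once.
var records = ReadFrom("myfile.csv").ToList();
foreach (var record in records)
{
    DoLongProcessOn(record);
}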
Thanks all for your answers! I decided to go with a mixture, mainly focusing on Marc's though, as I will only need to read lines from a file. I guess you could argue separation is needed everywhere, but heh, life is too short!
Regarding the keeping the file open, that isn't going to be an issue in this case, as the code is part of a desktop application.
Lastly, I noticed you all used lowercase string. I know in Java there is a difference between the capitalised and non-capitalised string, but I thought in C# the lowercase string was just an alias for the capitalised String?
public void Load(AddonCollection<T> collection)
{
// read from file
var query =
from line in LineReader(_LstFilename)
where line.Length > 0
select CreateAddon(line);
// add results to collection
collection.AddRange(query);
}
protected T CreateAddon(String line)
{
// create addon
T addon = new T();
addon.Load(line, _BaseDir);
return addon;
}
protected static IEnumerable<String> LineReader(String fileName)
{
String line;
using (var file = System.IO.File.OpenText(fileName))
{
// read each line, ensuring not null (EOF)
while ((line = file.ReadLine()) != null)
{
// return trimmed line
yield return line.Trim();
}
}
}