I would like to migrate my previously serialized objects in the database to a new schema.
My previous object:
public interface MyReport
{
    string Id { get; set; }
    string Name { get; set; }
    Dictionary<string, string> PropColl { get; set; }
}
But for some reasons, we had to make interface changes:
public interface IMarkme
{
}

public interface MyReport<T> where T : IMarkme
{
    string Id { get; set; }
    string Name { get; set; }
    T ExtendedProp { get; set; }
}

public class NewProp : IMarkme
{
    // some code here
}
So as you can see, my interface has been modified, and I would like to migrate my serialized objects, which were serialized based on MyReport, to MyReport<T>.
Can someone provide me some input as to what kind of utility I should aim to write to help migrate my serialized objects to the new, modified interface version?
Thanks,
AG
I have actually done something similar recently, where I created a simple console application to transform some serialized objects from one version to another. I simply used both versions of the DLLs and reflection to read and write the values of the different properties. You may find this helpful as inspiration ;)
static void Main(string[] args)
{
    object test;

    // Resolve old assembly versions from a local "old" folder (see handler below).
    AppDomain.CurrentDomain.AssemblyResolve += domain_AssemblyResolve;

    using (var con = new SqlConnection(connectionString))
    using (var cmd = new SqlCommand())
    {
        cmd.CommandText = "select top 1 Data_Blob from dbo.Serialized";
        cmd.CommandType = CommandType.Text;
        cmd.Connection = con;
        con.Open();

        var blob = (byte[])cmd.ExecuteScalar();
        var bf = new BinaryFormatter();
        var stream = new MemoryStream(blob);
        bf.AssemblyFormat = FormatterAssemblyStyle.Full;

        // Deserializes against the *old* assembly, loaded via the AssemblyResolve handler.
        test = bf.Deserialize(stream);
    }

    // Create an instance of the new version of the type by its assembly-qualified name.
    var objNewVersion = Activator.CreateInstance(Type.GetType(
        "ObjectGraphLibrary.Test, ObjectGraphLibrary, Version=1.0.0.10, Culture=neutral, PublicKeyToken=33c7c38cf0d65826"));

    var oldType = test.GetType();
    var newType = objNewVersion.GetType();

    // Copy values across via reflection, converting where the shape has changed.
    var oldName = (string)oldType.GetProperty("Name").GetValue(test, null);
    var oldAge = (int)oldType.GetProperty("Age").GetValue(test, null);
    newType.GetProperty("Name").SetValue(objNewVersion, oldName, null);
    newType.GetProperty("DateOfBirth").SetValue(objNewVersion, DateTime.Now.AddYears(-oldAge), null);

    Console.Read();
}
static Assembly domain_AssemblyResolve(object sender, ResolveEventArgs args)
{
    var assName = new AssemblyName(args.Name);

    // Locate the requested (old) assembly version under old\<version>\<name>.dll
    // next to the executing assembly.
    var uriBuilder = new UriBuilder(Assembly.GetExecutingAssembly().CodeBase);
    var assemblyPath = Uri.UnescapeDataString(uriBuilder.Path);
    var codeBase = Path.GetDirectoryName(assemblyPath);
    var assPath = Path.Combine(codeBase, string.Format("old\\{0}.{1}.{2}.{3}\\{4}.dll",
        assName.Version.Major, assName.Version.Minor, assName.Version.Build,
        assName.Version.Revision, assName.Name));
    return File.Exists(assPath) ? Assembly.LoadFile(assPath) : null;
}
1) Write a utility that reads the serialized objects in the old object definition.
2) The utility writes your objects into the DB in a non-serialized manner (i.e., with one piece of data in each field, etc.).
Don't get into the habit of serializing objects and storing them somewhere in persistent storage for retrieval (much) later. Serialization was not built for that.
You have run into the problem C programmers had in the old days: they would create a struct in memory and save that struct to a file. Then the struct's members would change, and they would wonder how to read it back, since the data was encoded differently.
Then along came database formats, INI files, and so on, specifically to address this need: saving data in one format and then being able to read it back without error.
So don't repeat the errors of the past. Serialization was created to facilitate short-term binary storage and the ability to, say, transmit an object over TCP/IP.
At worst, store your data as XML, not as a serialized binary stream. Also, as far as I know there is no assurance from MS that serialized data from one version of .NET can be read by another. Convert your data to a legible format while you can.
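For instance, a minimal sketch of round-tripping a report through XML with XmlSerializer; the Report class here is a hypothetical stand-in for your own type, not taken from the question:

using System.IO;
using System.Xml.Serialization;

// Hypothetical POCO standing in for your report type.
public class Report
{
    public string Id { get; set; }
    public string Name { get; set; }
}

public static class XmlStorage
{
    // Serialize to a legible, version-tolerant format.
    public static string ToXml(Report report)
    {
        var serializer = new XmlSerializer(typeof(Report));
        using (var writer = new StringWriter())
        {
            serializer.Serialize(writer, report);
            return writer.ToString();
        }
    }

    public static Report FromXml(string xml)
    {
        var serializer = new XmlSerializer(typeof(Report));
        using (var reader = new StringReader(xml))
        {
            return (Report)serializer.Deserialize(reader);
        }
    }
}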
Related
I'm trying to write out Avro files, and having some real trouble with the serialization. I'm using Microsoft.Avro.Core, and recently discovered that when I give it a schema containing a type with an associated logicalType, it will inexplicably extract the inner type and use that to replace it! This means that my DateTime declaration of "type": {"type": "long", "logicalType": "timestamp-micros"} is now a simple "type": "long", which the recipient is unable to interpret properly.
If it were simply doing this internally to understand what data types it's working with, that would be one thing. But this mangled schema is actually being written to the output file, which is completely incorrect behavior. Does anyone know a way to fix or work around this?
(And yes, the library hasn't been updated in 5 years and is probably completely unsupported. But it was the only .NET Avro serializer I could find that fulfills one crucial requirement: allowing me to work with arbitrary types not known at compile-time. Everything else seems to want to only use generic serializers of type T, but my use case can't provide that T. So I can't abandon this library for something better unless there actually is something better that I can use. But if there is, I'd be open to it.)
After a fair amount of searching and poking through poorly-documented source code, I found a solution. Even though the official Avro library from Apache does require a generic type parameter for all of its readers and writers, if you specify the type as GenericRecord, it will let you work with the GenericRecord as a runtime-defined structure. (This is not an arbitrary dynamic type, as the values you assign to its fields still have to match the provided schema.)
Meanwhile, this library's type system has much wider support for Avro's type set than the abandoned Microsoft one does. It correctly works with Avro's logical types and converts between them and CLR types the way you would expect, with one notable exception: serializing a DateTime will convert it from the system's local time to UTC. (There's probably a way to work around this, but I haven't found it yet.)
Just leaving this here in case anyone runs into similar problems in the future.
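For anyone wanting a concrete starting point, here is a minimal sketch of the GenericRecord approach with the Apache.Avro package; the schema and field names are illustrative, not taken from the original code:

using System;
using Avro;
using Avro.File;
using Avro.Generic;

// Illustrative schema with a logical type; Apache.Avro preserves it in the output file.
const string schemaJson = @"{
    ""type"": ""record"", ""name"": ""Order"",
    ""fields"": [
        { ""name"": ""Id"", ""type"": ""int"" },
        { ""name"": ""Submitted"", ""type"": { ""type"": ""long"", ""logicalType"": ""timestamp-micros"" } }
    ]
}";

var schema = (RecordSchema)Schema.Parse(schemaJson);

// GenericRecord is defined at runtime, but values must still match the schema.
var record = new GenericRecord(schema);
record.Add("Id", 1);
record.Add("Submitted", DateTime.UtcNow); // converted via the logical type

using (var writer = DataFileWriter<GenericRecord>.OpenWriter(
    new GenericDatumWriter<GenericRecord>(schema), "orders.avro"))
{
    writer.Append(record);
}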
I tried to reproduce the behavior and put together a small program that doesn't use generics.
Indeed, from what I can see the DateTime subtype is still omitted from the schema, which is really confusing and frustrating, because the receiver then needs to know up-front which long fields are DateTimes and which are not (possibly by field name?). By default it uses Ticks to parse DateTimes. I looked a bit over the Avro library on GitHub and saw that it uses runtimeType for DateTime and DateTimeOffset.
It probably doesn't help you too much, but maybe it helps someone out there with similar problems.
using System;
using System.Collections;
using System.IO;
using System.Runtime.Serialization;
using Microsoft.Hadoop.Avro;            // namespaces used by the Microsoft.Avro.Core package
using Microsoft.Hadoop.Avro.Container;

[DataContract]
public struct TestMsg
{
    [DataMember]
    public int Id;
    [DataMember]
    public double Amount;
    [DataMember]
    public DateTime OrderSubmitted;

    public TestMsg(int id, double amount, DateTime orderSubmitted)
    {
        Id = id;
        Amount = amount;
        OrderSubmitted = orderSubmitted;
    }
}

internal class Program
{
    static void Main(string[] args)
    {
        string fileName = "Messages.avro";
        string filePath;

        // build the output path with the platform-appropriate separator
        if (Environment.NewLine.Contains("\r"))
        {
            filePath = new DirectoryInfo(".") + @"\" + fileName;
        }
        else
        {
            filePath = new DirectoryInfo(".") + @"/" + fileName;
        }

        ArrayList al = new ArrayList
        {
            new TestMsg(1, 189.12, DateTime.Now),
            new TestMsg(2, 345.94, new DateTime(2000, 1, 10, 15, 20, 23, 103))
        };

        var schema = @"
        {
            ""type"" : ""record"",
            ""name"" : ""TestAvro.TestMsg"",
            ""fields"" : [
                {
                    ""name"" : ""Id"",
                    ""type"" : ""int""
                },
                {
                    ""name"" : ""Amount"",
                    ""type"" : ""double""
                },
                {
                    ""name"" : ""OrderSubmitted"",
                    ""type"" : ""long"",
                    ""runtimeType"" : ""DateTime""
                }
            ]
        }";

        using (var dataStream = new FileStream(filePath, FileMode.Create))
        {
            var serializer = AvroSerializer.CreateGeneric(schema);
            using (var avroWriter = AvroContainer.CreateGenericWriter(schema, dataStream, Codec.Null))
            using (var seqWriter = new SequentialWriter<object>(avroWriter, al.Count))
            {
                foreach (var item in al)
                {
                    dynamic record = new AvroRecord(serializer.WriterSchema);
                    record.Id = ((TestMsg)item).Id;
                    record.Amount = ((TestMsg)item).Amount;
                    // only a plain long goes over the wire, so using Ticks is a
                    // convention the receiver has to know about
                    record.OrderSubmitted = ((TestMsg)item).OrderSubmitted.Ticks;
                    seqWriter.Write(record);
                }
            }
        }

        Console.WriteLine("Now reading file.");

        using (var dataStream = new FileStream(filePath, FileMode.Open))
        using (var avroReader = AvroContainer.CreateGenericReader(schema, dataStream, true, new CodecFactory()))
        using (var seqReader = new SequentialReader<object>(avroReader))
        {
            foreach (var dynamicItem in seqReader.Objects)
            {
                dynamic obj = (dynamic)dynamicItem;
                Console.WriteLine($" {obj.Id} - {obj.Amount} - {new DateTime(obj.OrderSubmitted)}");
            }
        }

        Console.ReadLine();
    }
}
I am using ksqlDB with a table of the following form:
ksqlDB query
create table currency (id integer,name varchar) with (kafka_topic='currency',partitions=1,value_format='avro');
C# model
public class Currency
{
    public int Id { get; set; }
    public string Name { get; set; }
}
Now I want to know how I should write/read data from this topic in C# using the Confluent library:
Writing
IProducer<int, Currency> producer = ...;
Currency cur = new Currency();
Message<int, Currency> message = new Message<int, Currency>
{
    Key = cur.Id,
    Timestamp = new Timestamp(DateTime.UtcNow, TimestampType.CreateTime),
    Value = cur
};
DeliveryResult<int, Currency> delivery = await producer.ProduceAsync(topic, message);
Reading
IConsumer<int, Currency> consumer = new ConsumerBuilder<int, Currency>(config)
    .SetKeyDeserializer(Deserializers.Int32) // I assume I need to use the id from my DTO
    .SetValueDeserializer(...)               // what deserializer?
    .Build();
ConsumeResult<int, Currency> result = consumer.Consume();
Currency message = ... // what deserializer? JsonSerializer.Deserialize<Currency>(result.Message.Value);
I am not sure how to go about this, so I tried looking for a serializer. I found this library AvroSerializer, but I do not get where the author fetches the schema.
Any help on how to read/write to a specific topic that would match my ksqlDB models?
Update
After some research and some answers here, I have started using the schema registry:
var config = new ConsumerConfig
{
    GroupId = kafkaConfig.ConsumerGroup,
    BootstrapServers = kafkaConfig.ServerUrl,
    AutoOffsetReset = AutoOffsetReset.Earliest
};
var schemaRegistryConfig = new SchemaRegistryConfig
{
    Url = kafkaConfig.SchemaRegistryUrl
};
var schemaRegistry = new CachedSchemaRegistryClient(schemaRegistryConfig);
IConsumer<int, Currency> consumer = new ConsumerBuilder<int, Currency>(config)
    .SetKeyDeserializer(new AvroDeserializer<int>(schemaRegistry).AsSyncOverAsync())
    .SetValueDeserializer(new AvroDeserializer<Currency>(schemaRegistry).AsSyncOverAsync())
    .Build();
ConsumeResult<int, Currency> result = consumer.Consume();
Now I am getting another error:
Expecting data framing of length 5 bytes or more but total data size
is 4 bytes
As someone kindly pointed out, it seems I am retrieving only the id from the schema registry.
How can I just insert into currency (id, name) values (1,3) and retrieve it in C# as the POCO listed above?
Update 2
After I found this source program, it seems I am not able to publish messages to tables for some reason.
There is no error when sending the message, but it is not published to Kafka.
I found this library AvroSerializer, but I do not get where the author fetches the schema.
It is unclear why you need to use a library other than the Confluent one, but they get it from the Schema Registry. You can use CachedSchemaRegistryClient to get the schema string easily; however, you shouldn't need this in your code, as the deserializer will download it from the registry on its own.
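For completeness, a small sketch of fetching the registered schema by hand with CachedSchemaRegistryClient, assuming the default TopicNameStrategy (under which the value schema for the currency topic is registered under the subject currency-value) and a registry URL of your own:

using Confluent.SchemaRegistry;

var schemaRegistry = new CachedSchemaRegistryClient(new SchemaRegistryConfig
{
    Url = "http://localhost:8081" // assumption: your Schema Registry URL
});

// "currency-value" is the default subject for the topic's value schema.
var latest = await schemaRegistry.GetLatestSchemaAsync("currency-value");
Console.WriteLine(latest.SchemaString);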
If you refer to the examples/ folder in the confluent-kafka-dotnet repo for specific Avro consumption, you can see they generate the User class from a User.avsc file, which seems to be exactly what you want to do here for Currency, rather than writing it yourself.
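If you go the code-generation route, the avrogen tool from the Apache.Avro.Tools package can generate the class from a schema file (dotnet tool install --global Apache.Avro.Tools, then avrogen -s Currency.avsc .). A Currency.avsc matching the ksqlDB table might look like the following; this is illustrative only, since the authoritative schema is whatever ksqlDB actually registered (ksqlDB typically uppercases field names and makes value columns nullable):

{
  "type": "record",
  "name": "Currency",
  "fields": [
    { "name": "ID", "type": ["null", "int"], "default": null },
    { "name": "NAME", "type": ["null", "string"], "default": null }
  ]
}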
I have solved the problem by defining my own custom serializer, implementing the ISerializer<T> and IDeserializer<T> interfaces, which under the hood are just wrappers over System.Text.Json.JsonSerializer or Newtonsoft.Json.
Serializer
public class MySerializer<T> : ISerializer<T>
{
    public byte[] Serialize(T data, SerializationContext context)
    {
        var str = System.Text.Json.JsonSerializer.Serialize(data); // you can also use Newtonsoft here
        var bytes = Encoding.UTF8.GetBytes(str);
        return bytes;
    }
}
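And the IDeserializer<T> counterpart (my sketch of the other half the answer mentions; the Confluent interface passes the raw bytes as a ReadOnlySpan):

public class MyDeserializer<T> : IDeserializer<T>
{
    public T Deserialize(ReadOnlySpan<byte> data, bool isNull, SerializationContext context)
    {
        if (isNull) return default;
        var str = Encoding.UTF8.GetString(data);
        return System.Text.Json.JsonSerializer.Deserialize<T>(str);
    }
}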
Usage
var config = new ConsumerConfig
{
    GroupId = kafkaConfig.ConsumerGroup,
    BootstrapServers = kafkaConfig.ServerUrl,
    AutoOffsetReset = AutoOffsetReset.Earliest
};
IConsumer<int, Currency> consumer = new ConsumerBuilder<int, Currency>(config)
    .SetValueDeserializer(new MyDeserializer<Currency>())
    .Build();
ConsumeResult<int, Currency> result = consumer.Consume();
P.S.
I am not even using the schema registry here after I implemented the interfaces.
Saving Disk information from Azure:
var credentials = SdkContext.AzureCredentialsFactory.FromServicePrincipal("myclientId", "mytenant", "mysecretId", AzureEnvironment.AzureGlobalCloud);
var azure = Azure.Authenticate(credentials).WithSubscription("mySubscription");
var groupName = "myResourceGroup";
var vmName = "myVM";
var location = Region.USWest;
var vm = azure.Disks.List();
Console.WriteLine("Getting information about the virtual machine...");
MongoClient client = new MongoClient("mylocalhost");
IMongoDatabase database = client.GetDatabase("VM");
var collection = database.GetCollection<IDisk>("Disk ");
collection.InsertManyAsync(vm);
When I save them to MongoDB, I get an error:
maximum serialization depth exceeded (does the object being serialized have a circular reference?).
What am I doing wrong here?
It sounds like the IDisk that you're getting back from that API resolves to a circular graph that MongoDB isn't very happy with. The simplest fix, then, is: don't serialize the IDisk; after all, it isn't your type, and you can't control it. Instead, create a type of your own that has exactly what you want on it, and serialize that. For example:
sealed class MyDisk // TODO: rename me
{
public string Key {get;set;}
public string Name {get;set;}
public string RegionName {get;set;}
public int SizeGB {get;set;}
// etc, but *only* the information you actually want, and *only* using
// primitive types or types you control
}
...
var disks = (from disk in azure.Disks.List()
             select new MyDisk
             {
                 Key = disk.Key,
                 Name = disk.Name,
                 RegionName = disk.RegionName,
                 SizeGB = disk.SizeInGB,
                 // etc
             }).ToList();
And store disks, which we know doesn't have a circular reference, because we control it.
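Inserting the projected list then works with the regular typed collection API; a minimal sketch reusing the database and collection names from the question (the connection string is an assumption):

var client = new MongoClient("mongodb://localhost:27017"); // assumption: local server
var database = client.GetDatabase("VM");

// A flat POCO serializes cleanly: no circular references.
var collection = database.GetCollection<MyDisk>("Disk");
await collection.InsertManyAsync(disks);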
I'm a beginner to .NET; could you guide me in the right direction? My problem is based on the following code. Here I have 4 variations of the same method, and all 4 variations are working fine.
I just want to know what is the recommended or standard way of doing this?
Are all these forms of the method OK?
Code explanation:
From a Windows form I'm calling a viewAccount() method which is in the bankAccount class. Its purpose is to get the relevant bank account details of an employee from the database; those details should then be shown in the text boxes of the calling form.
Also please note that I have reduced the number of lines to make it more readable. I appreciate any help towards the right direction.
Thank you.
Example 01 - The method returns a bankAccount object with its fields populated with data from the database
class bankAccount
{
    // Member fields...
    string acNo;
    string acName;
    string bank;
    string acType;
    frmShowAccount form = new frmShowAccount();

    public bankAccount viewAccount(string acNo)
    {
        this.acNo = acNo;
        using (SqlConnection newCon = new SqlConnection(db.GetConnectionString))
        {
            SqlCommand newCmd = new SqlCommand("SELECT Employee.Name, BankAccount.ac_name, BankAccount.bank_name, BankAccount.ac_type FROM BankAccount INNER JOIN Employee ON BankAccount.emp_id = Employee.Emp_ID WHERE (BankAccount.ac_no = @bankAccount)", newCon);
            newCmd.Parameters.Add("@bankAccount", SqlDbType.Char).Value = acNo;
            newCon.Open();
            SqlDataReader rdr = newCmd.ExecuteReader();
            rdr.Read();
            form.txtEmpName.Text = rdr.GetString(0); // EmpName is not a member of the bankAccount class
            this.acName = rdr.GetString(1);
            this.bank = rdr.GetString(2);
            this.acType = rdr.GetString(3);
            return this;
        }
    }
}
// CALLING THE ABOVE METHOD...
bankAccount newBA = new bankAccount();
newBA = newBA.viewAccount(txtACNo.Text); // A reference is set to the instance returned
txtACName.Text = newBA.acName; // Get the value of the instance field into the text box
Example 02 - The method returns a data reader, which the form then uses to get the data
class bankAccount
{
    string acNo;
    string acName;
    string bank;
    string acType;

    public SqlDataReader viewAccount(string acNo)
    {
        this.acNo = acNo;
        using (SqlConnection newCon = new SqlConnection(db.GetConnectionString))
        {
            SqlCommand newCmd = new SqlCommand("Same SELECT …", newCon);
            newCmd.Parameters.Add()…
            newCon.Open();
            SqlDataReader rdr = newCmd.ExecuteReader();
            rdr.Read();
            return rdr; // note: the connection is disposed when the using block exits, so this reader will be unusable
        }
    }
}
//CALLING THE ABOVE METHOD...
bankAccount newBA = new bankAccount();
SqlDataReader rdr = newBA.viewAccount(txtACNo.Text); // A reference to hold the reader returned by the method call
txtACName.Text = rdr.GetString(1); // Get the value through the reader into the text box
Example 03: this method won't return values, but explicitly assigns values to the text boxes in the form
class bankAccount
{
    string acNo;
    string acName;
    string bank;
    string acType;
    frmShowAccount form = new frmShowAccount();

    public void viewAccount(string acNo)
    {
        this.acNo = acNo;
        using (SqlConnection newCon = new SqlConnection(db.GetConnectionString))
        {
            SqlCommand newCmd = new SqlCommand("Same SELECT …", newCon);
            newCmd.Parameters.Add()…
            newCon.Open();
            SqlDataReader rdr = newCmd.ExecuteReader();
            rdr.Read();
            // Setting values of the text boxes in the current instance of the form
            form.txtName.Text = rdr[0];
            form.txtACName.Text = rdr[1];
            form.txtBankName.Text = rdr[2];
            form.txtACType.Text = rdr[3];
        }
    }
}
//CALLING THE ABOVE METHOD
bankAccount newBA = new bankAccount();
newBA.form = this; // the 'form' reference in the bankAccount class is set to the current instance of the form
Example 04: this method won't return any value. It only initializes the instance fields of the class with the data
class bankAccount
{
    string acNo;
    string acName;
    string bank;
    string acType;
    frmShowAccount form = new frmShowAccount();

    public void viewAccount(string acNo)
    {
        this.acNo = acNo;
        using (SqlConnection newCon = new SqlConnection(db.GetConnectionString))
        {
            SqlCommand newCmd = new SqlCommand("Same SELECT …", newCon);
            newCmd.Parameters.Add()…
            newCon.Open();
            SqlDataReader rdr = newCmd.ExecuteReader();
            rdr.Read();
            form.txtName.Text = rdr[0];
            this.acName = rdr[1];
            this.bank = rdr[2];
            this.acType = rdr[3];
        }
    }
}
// CALLING THE ABOVE METHOD
bankAccount newBA = new bankAccount();
txtACName.Text = newBA.acName; // Text boxes get the data from the account object's instance fields (probably through a get property)
Option 1 is your best bet, but:-
a) The method accessing the database really should return a new BankAccount model object, rather than setting the properties of this:-
class BankAccountModel
{
    public string AccountNumber { get; set; }
    public string AccountName { get; set; }
    public string Bank { get; set; }
    public string AccountType { get; set; }

    public static BankAccountModel GetAccount(string accountNumber)
    {
        var account = new BankAccountModel()
        {
            AccountNumber = accountNumber,
        };
        using (SqlConnection newCon = new SqlConnection(db.GetConnectionString))
        {
            ...
            account.AccountName = rdr.GetString(1);
            account.Bank = rdr.GetString(2);
            account.AccountType = rdr.GetString(3);
        }
        return account;
    }
}
b) If you're making a WPF/Winforms/Webforms application, you'll want to investigate Data Binding rather than manually setting the values of your controls to the values of your model.
var account = BankAccountModel.GetAccount(accountNumber);
myControl.DataSource = account;
(Or if you're making an ASP.NET MVC application, you just pass your model to the view):-
var model = BankAccountModel.GetAccount(accountNumber);
return View(model);
c) It's usually helpful if your model doesn't have to concern itself with the way that it's being persisted. Eventually you'll want to pull the "GetAccount" method out of your Model class and into some dedicated data access code (e.g. a BankAccountRepository). This gives you better flexibility for situations where you need to control the lifecycle of your managed resources (e.g. the database connection), or if you need to obtain BankAccountModels from multiple sources:-
class BankAccountModel
{
    public string AccountNumber { get; set; }
    public string AccountName { get; set; }
    public string Bank { get; set; }
    public string AccountType { get; set; }
}

class BankAccountRepository
{
    public BankAccountModel GetAccount(string accountNumber)
    {
        ...
    }
}
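Calling code then deals only with the repository; a small sketch (hypothetical calling-form code, mirroring the question's text boxes):

var repository = new BankAccountRepository();
var account = repository.GetAccount(txtACNo.Text);
txtACName.Text = account.AccountName;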
I would say none of the above.
You are mixing the UI, data layer, and business layer.
For sure, bankAccount should not access the UI directly.
Option 2 is the closest, but you should pass values to the ctor, and the ctor should not access the database directly.
Have a separate method to create the class; this method accesses the DB:
public bankAccount GetbankAccount(int acNo)
{
    // access the db
    // create the bankAccount
}
Don't mix your data model with the UI; it will become a nightmare to maintain and evolve. Keep different and independent concerns (keeping and manipulating data in memory, fetching data from and writing data to the database, and presenting data to the user) separated. Therefore option three is bad.
Option two is not good either, because you really want to encapsulate your low-level database interaction code and not have to deal with data readers all over the place.
Options one and four are both okay (see the next paragraph), but both still have issues, and I would not recommend using them. This is probably not really helpful, but there are just so many things to say and consider (database context management, data modeling, view modeling, data binding, ...) that I cannot properly address everything in an answer. If you are really interested in the best solution (there is no single best solution), I suggest investing several weeks to read a lot of related articles, blogs, tutorials, and manuals to get an idea what tools, ideas, problems, and solutions are out there; then you can better judge and decide what you need for your case. But it will really take weeks just to get a rough overview of what is out there.
It is usually a good idea to also separate keeping data in memory from fetching data from the database. There is a design pattern called active record that puts the database access part on the data classes. It gives you code like
var user = User.GetById(42);
user.Name = "John Doe";
user.Save();
instead of
var database = new Database();
var user = database.GetById<User>(42);
user.Name = "John Doe";
database.Save(user);
and is usually more natural and easier to read. The main drawback is that it gets harder to separate concerns. You may want to have a look at design patterns for data persistence. After all, you usually just use an O/R mapper like Entity Framework, or NHibernate if you prefer the active record pattern. The only good reason for writing your own database access layer - besides fun and education - is that you need every last bit of performance.
I have a problem - previously I was after an algorithm to solve a part of it (see Combine LINQ queries) - and now I have come to a huge issue.
At around 540k directories, it's crashing with out of memory. :(
I am trying to process and store the company SAN file information. We need to do this because we have people who keep data for 25 years when they don't need to, but it's hard to track. It's a total of up to 70 TB of files. So, as you can imagine, it's a lot of files.
From what I've read, however, memory mapped files can't be dynamic? Is this true? I can't know beforehand how many files and directories there are.
If not (please say not), can someone give me a short example of how to make a dynamic mapped file (code provided in the Combine LINQ queries question)? In short, I create a directory structure in memory holding directory → directories + files (name, size, access date, modified date, and creation date).
Any clues would be appreciated as this would get around my problem if it's possible.
When you can't fit the whole thing into memory, you can stream your data with an IEnumerable. Below is an example of that. I've been playing around with MemoryMapped files as well, since I need the last drop of perf, but so far I've stuck with BinaryReader/Writer.
For the DB advocates: when you really need the last drop of perf, I do my own binary files as well. Going out of process to a DB really adds overhead, and the whole security, logging, ACID etc. does add up.
Here's an example that streams your f_results class.
EDIT
Updated the example to show how to write/read a tree of directory info. I keep one file that holds all the directories. This tree is loaded into memory in one go, and it then points to the files where all the f_results are.
You still have to create a separate file per directory that holds the f_results for all the files. How to do that depends on your code, but you should be able to figure it out (a sketch of such a helper appears after the code below).
Good luck!
public class f_results {
    public String name { get; set; }
    public DateTime cdate { get; set; }
    public DateTime mdate { get; set; }
    public DateTime adate { get; set; }
    public Int64 size { get; set; }

    // write one to a file
    public void WriteTo(BinaryWriter wrtr) {
        wrtr.Write(name);
        wrtr.Write(cdate.Ticks);
        wrtr.Write(mdate.Ticks);
        wrtr.Write(adate.Ticks);
        wrtr.Write(size);
    }

    // read one from a file
    public f_results(BinaryReader rdr) {
        name = rdr.ReadString();
        cdate = new DateTime(rdr.ReadInt64());
        mdate = new DateTime(rdr.ReadInt64());
        adate = new DateTime(rdr.ReadInt64());
        size = rdr.ReadInt64();
    }

    // stream a whole file as an IEnumerable (so very little memory is needed);
    // the using blocks ensure the file is closed even if enumeration stops early
    public static IEnumerable<f_results> FromFile(string dataFilePath) {
        using (var file = new FileStream(dataFilePath, FileMode.Open))
        using (var rdr = new BinaryReader(file)) {
            var eos = rdr.BaseStream.Length;
            while (rdr.BaseStream.Position < eos) yield return new f_results(rdr);
        }
    }
}
class Program {
    static void Main(string[] args) {
        var d1 = new DirTree(@"C:\",
            new DirTree(@"C:\Dir1",
                new DirTree(@"C:\Dir1\Dir2"),
                new DirTree(@"C:\Dir1\Dir3")
            ),
            new DirTree(@"C:\Dir4",
                new DirTree(@"C:\Dir4\Dir5"),
                new DirTree(@"C:\Dir4\Dir6")
            ));
        var path = @"D:\Dirs.dir";

        // write the directory tree to a file
        // (FileMode.Create creates the file, truncating it if it already exists)
        using (var file = new FileStream(path, FileMode.Create))
        using (var w = new BinaryWriter(file)) {
            d1.WriteTo(w);
        }

        // read it from the file
        using (var file2 = new FileStream(path, FileMode.Open))
        using (var rdr = new BinaryReader(file2)) {
            var d2 = new DirTree(rdr);
            // now inspect d2 in the debugger to see that it was read back into memory
        }

        // find files bigger than (roughly) 1GB
        var BigFiles = from f in f_results.FromFile(@"C:\SomeFile.dat")
                       where f.size > 1e9
                       select f;
    }
}
class DirTree {
    public string Path { get; private set; }

    private string FilesFile { get { return Path.Replace(':', '_').Replace('\\', '_') + ".dat"; } }

    public IEnumerable<f_results> Files() {
        return f_results.FromFile(this.FilesFile);
    }

    // you'll want to encapsulate this in real code but I didn't for brevity
    public DirTree[] _SubDirectories;

    public DirTree(BinaryReader rdr) {
        Path = rdr.ReadString();
        int count = rdr.ReadInt32();
        _SubDirectories = new DirTree[count];
        for (int i = 0; i < count; i++) _SubDirectories[i] = new DirTree(rdr);
    }

    public DirTree(string Path, params DirTree[] subDirs) {
        this.Path = Path;
        _SubDirectories = subDirs;
    }

    public void WriteTo(BinaryWriter w) {
        w.Write(Path);
        w.Write(_SubDirectories.Length);
        // depth first is the easiest way to do this
        foreach (var f in _SubDirectories) f.WriteTo(w);
    }
}
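To create the per-directory files mentioned above, here is a hypothetical helper you could add to DirTree (WriteFiles is my name, not part of the original answer); it writes a directory's f_results to the .dat file that Files() later streams back:

// Hypothetical addition to DirTree: persist this directory's file entries
// to its own .dat file (the one Files() reads from).
public void WriteFiles(IEnumerable<f_results> files) {
    using (var file = new FileStream(this.FilesFile, FileMode.Create))
    using (var w = new BinaryWriter(file)) {
        foreach (var f in files) f.WriteTo(w);
    }
}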