All, I want to create an object array Foo[], where the constructor for Foo is
public Foo(string name, string description) {}
I have a database object which has a structure (not including stored procedures, functions or views for simplicity) like
public class Database
{
public string name { get; set; }
public string filename { get; set; }
public List<Table> tables { get; set; }
public Database(string name, string filename)
{
this.name = name;
this.filename = filename;
}
}
protected internal class Table
{
public string name { get; set; }
public List<Column> columns { get; set;}
public Table(string name, List<Column> columns)
{
this.name = name;
this.columns = columns;
}
}
protected internal class Column
{
public string name { get; set; }
public string type { get; set; }
public int maxLength { get; set; }
public bool isNullable { get; set; }
public Column(string name, string type, int maxLength,
bool isNullable)
{
this.name = name;
this.type = type;
this.maxLength = maxLength;
this.isNullable = isNullable;
}
}
I would like to know the quickest way to add the Table and Column information to the Foo[] object array.
Clearly I can do
List<Foo> fooList = new List<Foo>();
foreach (Table t in database.tables)
{
fooList.Add(new Foo(t.name, "Some Description"));
foreach (Column c in t.columns)
fooList.Add(new Foo(c.name, "Some Description"));
}
Foo[] fooArr = fooList.ToArray();
But is there a quicker way? Clearly LINQ is likely to be slower for a query that does a similar operation, but I care a lot about speed here, so any advice would be appreciated. Perhaps the use of a HashSet would be the way to go, as there will not be duplicate entries...
Thanks for your time.
I would say change your foreach loops to for loops, as discussed here: In .NET, which loop runs faster, 'for' or 'foreach'?
Data-structure-wise, you only need a mutable structure if you don't know exactly how many records you will be inserting into fooList; if you do know, you can use an array instead of a list. According to the answer to that for-vs-foreach question, assuming it's correct, for loops over a List are a bit more than 2 times cheaper than foreach loops over a List, and looping over an array is around 2 times cheaper than looping over a List.
So the 2 improvements would be:
change foreach to for
use LINQ to compute the length for the array, as per @Tim Schmelter below, and change the List to an array (see the sketch below)
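A minimal sketch combining both suggestions, assuming the lowercase property names from the question's classes (Sum needs System.Linq):
// Sketch: pre-size the array via LINQ, then fill it with for loops.
int size = database.tables.Sum(t => t.columns.Count + 1);
var fooArr = new Foo[size];
int idx = 0;
for (int i = 0; i < database.tables.Count; i++)
{
    Table t = database.tables[i];
    fooArr[idx++] = new Foo(t.name, "Some Description");
    for (int j = 0; j < t.columns.Count; j++)
        fooArr[idx++] = new Foo(t.columns[j].name, "Some Description");
}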
You could initialize the array with the correct size and use it directly, without a backing list:
int size = db.tables.Sum(t => t.columns.Count + 1);
Foo[] fooArr = new Foo[size];
int currentSize = 0;
foreach (var tbl in db.tables)
{
fooArr[currentSize++] = new Foo(tbl.name, "Some Description");
foreach(var c in tbl.columns)
fooArr[currentSize++] = new Foo(c.name, "Some Description");
}
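For comparison, the same projection as a single LINQ query; as the question suspects, this is likely a bit slower than the pre-sized loop, but it is concise:
// Emits one Foo per table, followed by one Foo per column of that table.
Foo[] fooArr = db.tables
    .SelectMany(t => new[] { new Foo(t.name, "Some Description") }
        .Concat(t.columns.Select(c => new Foo(c.name, "Some Description"))))
    .ToArray();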
Is there a way to pass child or multiple objects to Dapper as query parameter, when they have properties with the same name?
For example, if I have these classes:
class Person
{
public int Id { get; set; }
public string Name { get; set; }
public City City { get; set; }
}
class City
{
public int Id { get; set; }
public string Name { get; set; }
}
and I want to execute this query:
connect.QueryFirst<Result>("select :Id Id, :Name Name, :City_Id City_Id, :City_Name City_Name from dual", personParams);
The only way I managed to do it without reflection was passing the first object and then adding the other properties one by one:
var personParams = new DynamicParameters(person);
personParams.AddDynamicParams(new { City_Id = person.City.Id, City_Name = person.City.Name });
But in the database there are hundreds of tables, some of them with more than a hundred columns, and I need to split them into multiple classes. So passing the parameters one by one would be unproductive.
I tried adding the parameters to a temporary bag and then adding a prefix to each one, something like this:
static DynamicParameters GetParameters(object mainEntity, object otherEntities)
{
var allParams = new DynamicParameters(mainEntity);
foreach (PropertyDescriptor descriptor in TypeDescriptor.GetProperties(otherEntities))
{
object obj = descriptor.GetValue(otherEntities);
var parameters = new DynamicParameters(obj);
foreach (var paramName in parameters.ParameterNames)
{
var value = parameters.Get<object>(paramName);
allParams.Add($"{descriptor.Name}{paramName}", value);
}
}
return allParams;
}
But it doesn't work, because Dapper only populates the parameters when the command is executing, not when the DynamicParameters instance is created.
I wanted to avoid reflection because Dapper is very good at it, and my code would probably perform much worse. Is there a way to do it or is reflection my only option?
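For what it's worth, here is a sketch that still uses reflection, but only to read the child objects' properties directly, skipping the second DynamicParameters round-trip that causes the problem above. The underscore separator (to match :City_Id in the query) is an assumption:
static DynamicParameters GetParameters(object mainEntity, object otherEntities)
{
    var allParams = new DynamicParameters(mainEntity);
    foreach (PropertyDescriptor descriptor in TypeDescriptor.GetProperties(otherEntities))
    {
        object child = descriptor.GetValue(otherEntities);
        if (child == null) continue;
        // Read the child's values directly, so they are available immediately
        // instead of only at command execution time.
        foreach (PropertyDescriptor childProp in TypeDescriptor.GetProperties(child))
        {
            allParams.Add($"{descriptor.Name}_{childProp.Name}", childProp.GetValue(child));
        }
    }
    return allParams;
}
Usage would then be something like GetParameters(person, new { City = person.City }).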
I need a clear example that shows me how to define a list that has n rows and 4 columns, and how to use it. I need a list to save my data shaped like the image below (columns for vocabulary, meaning, number and group); as you can see, this could be a dictionary.
You need to create a class with all the above properties
public class Sample
{
public string vocabulary { get; set; }
public string meaning { get; set; }
public int number { get; set; }
public int group { get; set; }
}
and then you can create a List of type Sample,
List<Sample> yourList = new List<Sample>();
You can add items to the list as below
yourList.Add(new Sample { vocabulary = "massive", meaning = "very big", number = 5, group = 15 });
You can access them later like this, if you want the first element,
var result = yourList[0];
This is the easiest and best way of doing it: create a new class, create new instances of that class, add them to the list, and then use LINQ to get the data out (see the sketch after the class below).
void Main()
{
var list = new List<myClass>();
list.Add(new myClass() {
Vocabulary = "Vocabulary",
Meaning = "meaning",
Number = 1,
Group = 2});
}
public class myClass
{
public string Vocabulary { get; set; }
public string Meaning { get; set; }
public int Number { get; set; }
public int Group { get; set; }
}
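The LINQ step mentioned above might then look like this (hypothetical filter; Group == 2 matches the item added in Main, and Where needs System.Linq):
// Pull all entries belonging to one group.
var groupTwo = list.Where(x => x.Group == 2).ToList();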
Yes... as Sajeetharan mentioned, with a custom class you can create a List of any dimension. But I don't think you need to think about dimensions in C#... it is a bit more high-level than that.
Just create a class and put everything you need in it...
public class CustomClass{
public string d1;
public int d2;
public string d3;
public string d4;
...
// you can easily create an N-dimension class
}
To access it and use it:
public void Main(){
List<CustomClass> list = new List<CustomClass>();
CustomClass cc = new CustomClass();
cc.d1 = "v1";
cc.d2 = 0; //v2
list.Add(cc);
//to access it
foreach(CustomClass tmpClass in list)
{
string d1Value = tmpClass.d1;
int d2Value = tmpClass.d2;
}
}
The scenario is the following: I receive a message containing a lot of variables, several hundred. I need to write this to Azure Table storage, where the partition key is the name of the individual variable and the value gets mapped to e.g. Value.
Let’s say the payload looks like the following:
public class Payload
{
public long DeviceId { get; set; }
public string Name { get; set; }
public double Foo { get; set; }
public double Rpm { get; set; }
public double Temp { get; set; }
public string Status { get; set; }
public DateTime Timestamp { get; set; }
}
And my TableEntry like this:
public class Table : TableEntity
{
public Table(string partitionKey, string rowKey)
{
this.PartitionKey = partitionKey;
this.RowKey = rowKey;
}
public Table() {}
public long DeviceId { get; set; }
public string Name { get; set; }
public double Value { get; set; }
public string Signal { get; set; }
public string Status { get; set; }
}
In order to write that to Table storage, I need to
var table = new Table(primaryKey, payload.Timestamp.ToString(TimestampFormat))
{
DeviceId = payload.DeviceId,
Name = payload.Name,
Status = payload.Status,
Value = value, // payload.Foo or payload.Rpm or payload.Temp
Signal = primaryKey, // name of the variable: "foo", "rpm" or "temp"
Timestamp = payload.Timestamp
};
var insertOperation = TableOperation.Insert(table);
await this.cloudTable.ExecuteAsync(insertOperation);
I don’t want to copy this 900 times (or however many variables there happen to be in the payload message; it is a fixed number).
I could make a method to create the table, but I will still have to call this 900 times.
I thought maybe AutoMapper could help out.
Are they always the same variables? A different approach could be to use DynamicTableEntity, in which you basically have a TableEntity where you can fill out all additional fields after the RowKey/PartitionKey pair:
var tableEntity = new DynamicTableEntity();
tableEntity.PartitionKey = "partitionkey";
tableEntity.RowKey = "rowkey";
// JObject and its properties come from Newtonsoft.Json.Linq
var json = JObject.Parse("{bunch:'of',stuff:'here'}");
foreach (var item in json.Properties())
{
tableEntity.Properties.Add(item.Name, EntityProperty.GeneratePropertyForString((string)item.Value));
}
// Save etc
The problem is mapping these properties, right?
Value = value, // payload.Foo or payload.Rpm or payload.Temp
Signal = primaryKey, // name of the variable: "foo", "rpm" or "temp"
This conditional mapping can be done via Reflection:
object payload = new A { Id = 1 };
object value = TryGetPropertyValue(payload, "Id", "Name"); //returns 1
payload = new B { Name = "foo" };
value = TryGetPropertyValue(payload, "Id", "Name"); //returns "foo"
where TryGetPropertyValue is defined as:
public object TryGetPropertyValue(object obj, params string[] propertyNames)
{
foreach (var name in propertyNames)
{
PropertyInfo propertyInfo = obj.GetType().GetProperty(name);
if (propertyInfo != null) return propertyInfo.GetValue(obj);
}
throw new ArgumentException();
}
You may map the rest of the properties (those which have equal names in source and destination) with an AutoMapper.Mapper.DynamicMap call instead of AutoMapper.Mapper.Map, to avoid creating hundreds of configuration maps. Or just cast your payload to dynamic and map it manually.
You can create a DynamicTableEntity from your Payload objects with 1-2 lines of code using the TableEntity.Flatten method in the SDK, or use the ObjectFlattenerRecomposer NuGet package if you are also worried about ICollection-type properties. Assign it a PK/RK and write the flattened Payload object into the table as a DynamicTableEntity. When you read it back, read it as a DynamicTableEntity and use the TableEntity.ConvertBack method to recreate the original object. You don't even need that intermediate Table class.
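A minimal sketch of that flow, assuming the Microsoft.WindowsAzure.Storage.Table SDK and placeholder PK/RK values:
// Flatten the Payload into entity properties, then write it.
IDictionary<string, EntityProperty> flattened = TableEntity.Flatten(payload, new OperationContext());
var entity = new DynamicTableEntity("pk", "rk") { Properties = flattened };
await cloudTable.ExecuteAsync(TableOperation.Insert(entity));

// Read it back and recompose the original Payload.
var result = await cloudTable.ExecuteAsync(TableOperation.Retrieve<DynamicTableEntity>("pk", "rk"));
var retrieved = (DynamicTableEntity)result.Result;
Payload roundTripped = TableEntity.ConvertBack<Payload>(retrieved.Properties, new OperationContext());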
I am making a program that automates data massaging.
I have an Excel sheet that has a Name column, a Type column, and a Value column.
It looks like this:
Name Type Value
Sam A 32
Ben B 65
Sam B 213
max B 23
max C 24
max C 12
Ben C 45
This data is not real, but it is similar to what I am working with.
Note: some of the names do not have certain types.
I have loaded the data into 3 arrays: arrName[], arrType[], and arrValue[].
I wish to make the data look like this with a 3D array arrPro[name, type, value]. All the values of the same type should belong to the same name, and all the values should be added up together to form a total value.
Loading the Excel data directly into a DataSet is the simplest method.
If you do want something like your 3D array, I suggest Tuple<string, string, int> as the data holder for each row, with List<Tuple<string, string, int>> holding all your data.
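For example, zipping the three arrays from the question into that list (assuming arrValue holds ints):
// One tuple per row: (name, type, value).
var rows = new List<Tuple<string, string, int>>();
for (int i = 0; i < arrName.Length; i++)
{
    rows.Add(Tuple.Create(arrName[i], arrType[i], arrValue[i]));
}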
This structure might hold your data
public class MyTypes
{
public string TypeName;
public List<NamesValues> Names;
}
public class NamesValues
{
public string Name;
public List<int> Values;
}
Wouldn't it be simpler and cleaner to store your data in some List<YourObject>? For example:
public class YourObject {
public string Name { get; set; }
public string Type{ get; set; }
public int Value { get; set; }
}
and then compute what you want with LINQ (more or less):
List<YourObject> yourObject = "your logic goes here (with whole data)";
List<YourObject> result = (from s in yourObject
group s by new { s.Name, s.Type } into r
select new YourObject { Name = r.Key.Name, Type = r.Key.Type, Value = r.Sum(s => s.Value) }).ToList();
I'm not sure it's exactly what you're looking for (grouping by type).
You should use a Dictionary.
public enum EType
{
A,
B,
C
}
public class PatientData
{
public EType Type { get; set; }
public int Value {get; set; }
}
public class PatientManager
{
private readonly Dictionary<String, List<PatientData>> _patients = new Dictionary<String, List<PatientData>>();
public void AddPatientData(string name, EType type, int value)
{
var patientData = new PatientData
{
Type = type,
Value = value
};
List<PatientData> patientDatas;
if (!_patients.TryGetValue(name, out patientDatas))
{
patientDatas = new List<PatientData>();
_patients.Add(name, patientDatas);
}
patientDatas.Add(patientData);
}
public void LoadData(string[] names, EType[] types, int[] values)
{
var iMax = Math.Min(names.Length, Math.Min(types.Length, values.Length));
for (var i = 0; i < iMax; i++)
{
AddPatientData(names[i], types[i], values[i]);
}
}
}
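Hypothetical usage with the three arrays from the question, assuming the type strings ("A", "B", "C") parse to EType values:
var manager = new PatientManager();
// Convert the string type codes into the enum (Select needs System.Linq).
EType[] types = arrType.Select(t => (EType)Enum.Parse(typeof(EType), t)).ToArray();
manager.LoadData(arrName, types, arrValue);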
First of all, I'm sorry if you feel this question has been raised before, but I can't seem to wrap my mind around this one, although it's really not the hardest thing to do...
Basically I have a query result from SQL which holds several rows, consisting of:
id, parentid, name, description, level
level is the depth of the item when the data is viewed as a tree structure, represented as a positive integer.
Now I would love to parse/convert this flat data into a List<Item> mySqlData, where Item looks like the following class definition:
public class Item
{
public string Id { get; set; }
public string ParentId { get; set; }
public string Name { get; set; }
public string Description { get; set; }
public List<Item> Items { get; set; }
}
Can anybody give me some example code? It's probably going to be something along the lines of recursively iterating through the list while adding the items in their place...
Thanks in advance.
Assuming you want to build the tree, and don't get the data out of order, you should be able to maintain a lookup as you go, i.e.
var idLookup = new Dictionary<string, Item>();
var roots = new List<Item>();
foreach([row]) {
Item newRow = [read basic row];
string parentId = [read parentid];
Item parent;
if(parentId != null && idLookup.TryGetValue(parentId, out parent)) {
parent.Items.Add(newRow);
} else {
roots.Add(newRow);
}
idLookup.Add(newRow.Id, newRow);
}
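A more concrete sketch of the same idea, assuming a SqlDataReader over the five columns in the question's order (id, parentid, name, description, level; column types are assumptions). Ordering the query by level guarantees a parent row is read before its children, which the lookup relies on:
var idLookup = new Dictionary<string, Item>();
var roots = new List<Item>();
while (reader.Read())
{
    var newRow = new Item
    {
        Id = reader.GetString(0),
        ParentId = reader.IsDBNull(1) ? null : reader.GetString(1),
        Name = reader.GetString(2),
        Description = reader.GetString(3),
        Items = new List<Item>()
    };
    Item parent;
    if (newRow.ParentId != null && idLookup.TryGetValue(newRow.ParentId, out parent))
        parent.Items.Add(newRow);
    else
        roots.Add(newRow);
    idLookup.Add(newRow.Id, newRow);
}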