Is it possible to initialize a nested property when setting the parent? - C#

I'd like to know if the following is possible and if it would have any performance benefits.
Given this structure:
public class X
{
[JsonIgnore]
public List<Y> Y { get; set; }
}
public class Y
{
[JsonIgnore]
public List<Z> Z { get; set; }
}
public class Z
{
...
}
Short version
Can I initialize X.Y.Z in one line?
Working (foreach) version:
X x = null;
var dbX = db.SingleOrDefault(...);
if (dbX != null)
{
x = new X
{
Y = dbX.dbY.Select(a =>
JsonConvert.DeserializeObject<Y>(a.json)).ToList()
};
x.Y.ForEach(a =>
a.Z = dbX.dbY.Single(b => b.id == a.id)
.dbZ.Select(q => JsonConvert.DeserializeObject<Z>(q.json)).ToList());
}
Experimental
Create Y and Z in one go, with no need for the .Single(b => b.id == a.id) lookup:
x = new X
{
Y = dbX.dbY.Select(a =>
JsonConvert.DeserializeObject<Y>(a.json).Z.AddRange(
a.dbZ.Select(b =>
JsonConvert.DeserializeObject<Z>(b.json)).ToList()))
};
1. Get the dbY items that are associated with the dbX item (FK).
2. Deserialize all dbY items to Y objects.
3. Object Y has a List<Z> property which I want to initialize at the level of Y. I want to do this because at that point I have a reference to all Z objects (by FK).
Step 3 is where I don't know what to do: how do I initialize those Z objects from that reference?
If the question is unclear or title doesn't reflect question please let me know.
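For reference, one possible way to populate Z at the same time as Y in a single expression (a sketch, not tested against the actual schema, assuming the same dbX/dbY/dbZ shape as above) is a statement-bodied lambda, since AddRange returns void and therefore cannot be used inside the Select:
x = new X
{
    Y = dbX.dbY.Select(a =>
    {
        // Deserialize Y, then fill its Z list from the dbZ rows already hanging off this dbY row.
        var y = JsonConvert.DeserializeObject<Y>(a.json);
        y.Z = a.dbZ.Select(b => JsonConvert.DeserializeObject<Z>(b.json)).ToList();
        return y;
    }).ToList()
};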

Related

Fastest way to sort an array in C#

Hi, this is my problem: I have an array of points P(x,y) and I need to sort these points from the furthest to the closest with respect to the barycenter of a polygon. This is what I have done (I know it is a bad solution); how can I come up with a better and, above all, faster solution?
List<C2DPoint> OrderedGripperPoints = new List<C2DPoint> { };
while(myGripperPoints.Count!=0)
{
double dist=-1;
int index=-1;
for(int k=0;k<myGripperPoints.Count;k++)
{
if(myGripperPoints[k].Distance(WorkScrap.GetCentroid())>=dist)
{
index = k;
dist = myGripperPoints[k].Distance(WorkScrap.GetCentroid());
}
}
OrderedGripperPoints.Add(myGripperPoints[index]);
myGripperPoints.RemoveAt(index);
}
Thanks for your answers...
Use LINQ to order the points. Since you want them from the furthest to the closest, order by distance descending, and compute the centroid once instead of once per comparison:
using System.Linq;
var centroid = WorkScrap.GetCentroid();
var sortedList = myGripperPoints.OrderByDescending(p => p.Distance(centroid)).ToList();
Consider the following code:
Point Class (assumed class definition)
class Point
{
public int X { get; set;}
public int Y { get; set;}
}
Point EqualityComparer
class PointEqualityComparer : IEqualityComparer<Point>
{
public bool Equals(Point p1, Point p2) { return p1.X == p2.X && p1.Y == p2.Y; }
public int GetHashCode(Point p) { return p.X.GetHashCode() *31 + p.Y.GetHashCode()*23; }
}
Create a Dictionary with Point as Key and Distance as value (assuming integer):
Dictionary<Point,int> pointDictionary =
new Dictionary<Point, int>(new PointEqualityComparer());
Add points as follows (assuming Point exposes a Distance method, like the question's C2DPoint):
Point p = new Point { X = <value>, Y = <value> };
pointDictionary.Add(p, p.Distance(WorkScrap.GetCentroid()));
Order by distance as follows:
var ordered = pointDictionary.OrderByDescending(x => x.Value).ToList();
Ordering is done by distance in descending order, as expected.
The result is a List<KeyValuePair<Point, int>>, with elements in descending order of distance.
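To get just the points back in that order, project out the keys (a small follow-up sketch):
var orderedPoints = pointDictionary
    .OrderByDescending(x => x.Value)   // furthest first
    .Select(x => x.Key)
    .ToList();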

Return list of items in order of how many related items there are

I want to return a list of items in order of how many related items there are.
Imagine the following classes, and imagine they all have DbSets (context.A, context.B, ...).
class A
{
public int ID { get; set; }
}
class B
{
public virtual A A { get; set; }
}
I am trying to get a list of A items in order of most related from B. The query might look like this:
IEnumerable<A> GetMostRelatedAs( int numberOfAsToReturn )
{
return this.context.A.SelectMany(
a => a.ID,
( whatever) => new
{
A = whatever,
RelatedBCount = this.context.B.Where( b => b.A.ID == whatever.ID)
}).OrderByDescending( x => x.RelatedBCount ).Take( numberOfAsToReturn );
}
Where am I going wrong in my query?
Due to this:
"I am trying to get a list of A items in order of most related to from B."
The "to from" makes this quite confusing, so on this basis I'm going to take a stab in the dark with this one:
IEnumerable<dynamic> GetMostRelatedAs( int numberOfAsToReturn )
{
var results = this.context.A
.GroupJoin(
this.context.B,
a => a.ID,
b => b.A.ID,
(singleA, multipleBs) => new {
// this is the projection, so take here what you want
numberOfBs = multipleBs.Count(),
name = singleA.Name,
singleA.ViewCount
}
)
.OrderByDescending(x => x.numberOfBs) // order by how many related Bs there are
.Take(numberOfAsToReturn)
.ToList();
// here you can use automapper to project to a type that you can use
// So you could add the following method calls after the ToList()
// .Project(this.mappingEngine)
// .To<ClassThatRepresentsStructure>()
// The reason you don't map before the ToList is that you are already doing a projection with that anonymous type.
return results;
}
Edit
To address the comments:
IEnumerable<A> GetMostRelatedAs( int numberOfAsToReturn )
{
var results = this.context.A
.GroupJoin(
this.context.B,
a => a.ID,
b => b.A.ID,
(singleA, multipleBs) => new {
// this is the projection, so take here what you want
numberOfBs = multipleBs.Count(),
name = singleA.Name,
singleA.ViewCount,
singleA
}
)
.OrderByDescending(x => x.numberOfBs) // order by how many related Bs there are
.Take(numberOfAsToReturn)
.ToList()
.Select(x => x.singleA);
return results;
}
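An alternative that skips the GroupJoin entirely is to order directly by a correlated count (a sketch; whether this translates efficiently depends on your LINQ provider):
IEnumerable<A> GetMostRelatedAs(int numberOfAsToReturn)
{
    return this.context.A
        .OrderByDescending(a => this.context.B.Count(b => b.A.ID == a.ID))
        .Take(numberOfAsToReturn)
        .ToList();
}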

How to edit element of List of Lists?

I have a:
var list = new List<List<MyClass>>();
MyClass:
public class MyClass
{
public double X { set; get; }
public double Y { set; get; }
public bool Unique{ set; get; }
}
Now I have a double x and a double y, and I want to find the element of the list with the same X and Y and set its Unique field to false.
I have already found examples for searching, like:
var element = (from sublist in list
from item in sublist
where item.X == x && item.Y == y
select item).FirstOrDefault();
But how do I edit this element?
Sure, I can do something like:
foreach (var myClsList in list)
{
foreach (var myCls in myClsList )
{
if (myCls.X == x && myCls.Y == y)
{
myCls.Unique= false;
}
}
}
But it doesn't look nice.
It looks like you really want to group the elements by their X and Y properties and set the Unique property appropriately within each group. Here are some initial thoughts:
var allItems = list.SelectMany(l => l);
var groups = allItems.GroupBy(element => new { X = element.X, Y = element.Y });
foreach (var group in groups) {
bool unique = group.Count() == 1;
foreach (var element in group) {
element.Unique = unique;
}
}
It is not very clear whether you want to edit one element, or all the elements matching a certain criterion.
For one element you do:
(from sublist in list
from item in sublist
where item.X == x && item.Y == y
select item).First().Unique = false; // First() throws if no matching item exists
This works, but it is quite ugly. Better:
var element = (from sublist in list
from item in sublist
where item.X == x && item.Y == y
select item).FirstOrDefault();
if (element != null)
    element.Unique = false;
To change them all, use a foreach. LINQ is not really for side effects; it should stay a side-effect-free construct. In your case I would do:
var query = list.SelectMany(sublist => sublist)
.Where(item => item.X == x && item.Y == y);
foreach(var myClass in query)
myClass.Unique = false;
If you really want side effects and want to do it in LINQ style, you could adopt a fluent style by tweaking your MyClass like this:
public class MyClass
{
public double X { set; get; }
public double Y { set; get; }
public bool Unique { set; get; }
public MyClass Duplicate()
{
Unique = false;
return this;
}
}
// So you call (note: the query is lazy, so nothing happens until you enumerate it, e.g. with ToList()):
var query = list.SelectMany(sublist => sublist)
.Where(item => item.X == x && item.Y == y)
.Select(item => item.Duplicate());
Some thoughts on the overall structure:
Look at the Dictionary<,> class. It is designed to store a collection of values that can be looked up via an arbitrary key type. To use it in this context:
Remove Unique from your class.
Make your class a struct instead so it is compared by value rather than reference (i.e. two distinct MyClasses with the same X/Y are considered the same)
Instead of a list, use Dictionary<MyClass, bool>
public struct MyClass
{
public double X { get; set; }
public double Y { get; set; }
}
var items = new Dictionary<MyClass, bool>();
items[ new MyClass{ X = 3, Y = 7 } ] = false;
items[ new MyClass{ X = -12, Y = -24 } ] = true;
Assert.IsFalse( items[ new MyClass{ X = 3, Y = 7 } ] );
Assert.IsTrue( items[ new MyClass{ X = -12, Y = -24 } ] );
items[ new MyClass{ X = 3, Y = 7 } ] = true;
Assert.IsTrue( items[ new MyClass{ X = 3, Y = 7 } ] );
To clarify how Dictionary<K, V> works:
K is the "key" type; this is the type used to label the items you're storing in the dictionary and is used both when putting items into it and taking items out.
V is the "value" type; this is the type you are storing.
You can read or write the dictionary using an indexer:
var value = dictionary[key];
dictionary[key] = value;
A Dictionary may well be faster than manually iterating through the list: lookups go through the key's hash (GetHashCode, then Equals on the candidates), which is roughly constant time rather than a linear scan, and that difference matters for large datasets.
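Note that reading a missing key through the indexer throws a KeyNotFoundException; TryGetValue avoids that (a minimal sketch using the items dictionary above):
bool isUnique;
if (items.TryGetValue(new MyClass { X = 3, Y = 7 }, out isUnique))
{
    // key found; isUnique holds the stored value
}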

Can we get access to the F# copy and update feature from c#?

For example in F# we can define
type MyRecord = {
X: int;
Y: int;
Z: int
}
let myRecord1 = { X = 1; Y = 2; Z = 3; }
and to update it I can do
let myRecord2 = { myRecord1 with Y = 100; Z = 2 }
That's brilliant, and the fact that records automatically implement IStructuralEquality with no extra effort makes me wish for this in C#. Perhaps, however, I can define my records in F# but still be able to perform some updates in C#. I imagine an API like
MyRecord myRecord2 = myRecord
.CopyAndUpdate(p=>p.Y, 10)
.CopyAndUpdate(p=>p.Z, 2)
Is there a way, and I don't mind dirty hacks, to implement CopyAndUpdate as above? The C# signature for CopyAndUpdate would be
static T CopyAndUpdate<T, P>
( this T source
, Expression<Func<T, P>> selector
, P value
)
It can be done, but doing that properly is going to be quite hard (and it definitely won't fit in my answer). The following simple implementation assumes that your object has only read-write properties and parameter-less constructor:
class Person
{
public string Name { get; set; }
public int Age { get; set; }
}
This slightly defeats the point, because you would probably want to use this on immutable types - but then you always have to call the constructor with all the arguments and it is not clear how to link the constructor parameters (when you create an instance) with the properties that you can read.
The With method creates a new instance, copies all property values and then sets the one that you want to change (using the PropertyInfo extracted from the expression tree - without any checking!)
public static T With<T, P>(this T self, Expression<Func<T, P>> selector, P newValue)
{
var me = (MemberExpression)selector.Body;
var changedProp = (System.Reflection.PropertyInfo)me.Member;
var clone = Activator.CreateInstance<T>();
foreach (var prop in typeof(T).GetProperties())
prop.SetValue(clone, prop.GetValue(self));
changedProp.SetValue(clone, newValue);
return clone;
}
The following demo behaves as expected, but as I said, it has lots of limitations:
var person = new Person() { Name = "Tomas", Age = 1 };
var newPerson = person.With(p => p.Age, 20);
In general, I think using a universal reflection-based method like With here might not be such a good idea unless you have lots of time to implement it properly. It might be easier to just implement one With method for every type that you use, taking optional parameters and copying each non-null value onto a hand-made clone. The signature would be something like:
public Person With(string name=null, int? age=null) { ... }
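For illustration, a hand-written With along those lines might look like this (a sketch of an immutable variant of the Person type, not the author's code):
public class Person
{
    public string Name { get; private set; }
    public int Age { get; private set; }

    public Person(string name, int age)
    {
        Name = name;
        Age = age;
    }

    // Copy-and-update: any argument left at its default keeps the current value.
    public Person With(string name = null, int? age = null)
    {
        return new Person(name ?? Name, age ?? Age);
    }
}

var person = new Person("Tomas", 1);
var older = person.With(age: 20); // Name stays "Tomas", Age becomes 20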
You could achieve something similar using optional arguments:
class MyRecord {
public readonly int X;
public readonly int Y;
public readonly int Z;
public MyRecord(int x, int y, int z) {
X = x; Y = y; Z = z;
}
public MyRecord(MyRecord prototype, int? x = null, int? y = null, int? z = null)
: this(x ?? prototype.X, y ?? prototype.Y, z ?? prototype.Z) { }
}
var rec1 = new MyRecord(1, 2, 3);
var rec2 = new MyRecord(rec1, y: 100, z: 2);
This is actually pretty close to the code that F# generates for records.
In case anyone else stumbles upon this: I recently needed to do the same thing and was able to take @Tomas Petricek's answer and expand it to work with immutable records:
public static T With<T, P>(this T self, Expression<Func<T, P>> selector, P newValue)
{
var me = (MemberExpression)selector.Body;
var changedProp = (System.Reflection.PropertyInfo)me.Member;
var constructor = typeof(T).GetConstructors()[0];
var parameters = constructor.GetParameters().Select(p => p.Name);
var properties = typeof(T).GetProperties();
var args = parameters
.Select(p => properties.FirstOrDefault(prop => String.Equals(prop.Name,p, StringComparison.CurrentCultureIgnoreCase)))
.Select(prop => prop == changedProp ? newValue : prop.GetValue(self))
.ToArray();
var clone = (T) constructor.Invoke(args);
return clone;
}
Usage
// F#
type Person =
{
Name : string
Age : int
}
// C#
var personRecord = new Person("John",1);
var newPerson = personRecord.With(p => p.Age, 20);

Given two collections A & B, I want to output their inner join, the elements in A that were not in B, and the elements in B that were not in A

Given two collections A & B, I want to output:
1. their inner join (say on a field called Id)
2. those elements in A that could not be found in B
3. those elements in B that could not be found in A
What is the most efficient way to do this?
When I say those elements in A that could not be found in B, I mean those elements that could not be "inner-joined" with B
For the inner join, have a look at the .Join() extension method: http://msdn.microsoft.com/en-us/library/bb344797.aspx
For the other two outputs, have a look at the .Except() extension method: http://msdn.microsoft.com/en-us/library/bb300779.aspx
For examples of most of the LINQ queries, have a look at this page: http://msdn.microsoft.com/en-us/vcsharp/aa336746
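For illustration, a sketch combining the two (assuming both element types expose an int Id property and the collections are named listA and listB):
// Inner join on Id.
var inner = listA.Join(listB, a => a.Id, b => b.Id,
                       (a, b) => new { A = a, B = b }).ToList();

// Elements of A whose Id has no match in B.
var idsOnlyInA = new HashSet<int>(listA.Select(a => a.Id)
                                       .Except(listB.Select(b => b.Id)));
var onlyInA = listA.Where(a => idsOnlyInA.Contains(a.Id)).ToList();

// Elements of B whose Id has no match in A.
var idsOnlyInB = new HashSet<int>(listB.Select(b => b.Id)
                                       .Except(listA.Select(a => a.Id)));
var onlyInB = listB.Where(b => idsOnlyInB.Contains(b.Id)).ToList();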
I guess I'd write this:
public class DeltaSet<T>
{
public ISet<T> FirstItems { get; private set; }
public ISet<T> SecondItems { get; private set; }
public ISet<Tuple<T, T>> IntersectedItems { get; private set; }
// T is the type of the objects, U is the key used to determine equality
public static DeltaSet<T> GetDeltaSet<T, U>(IDictionary<U, T> first,
IDictionary<U, T> second)
{
var firstUniques = new HashSet<T>(
first.Where(x => !second.ContainsKey(x.Key)).Select(x => x.Value));
var secondUniques = new HashSet<T>(
second.Where(x => !first.ContainsKey(x.Key)).Select(x => x.Value));
var intersection = new HashSet<Tuple<T, T>>(
second.Where(x => first.ContainsKey(x.Key)).Select(x =>
Tuple.Create(first[x.Key], x.Value)));
return new DeltaSet<T> { FirstItems = firstUniques,
SecondItems = secondUniques,
IntersectedItems = intersection };
}
public static DeltaSet<IDClass> GetDeltas(IEnumerable<IDClass> first,
IEnumerable<IDClass> second)
{
return GetDeltaSet(first.ToDictionary(x => x.ID),
second.ToDictionary(x => x.ID));
}
}
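A hypothetical call might look like this (assuming an IDClass with an ID property used as the join key, and collections named collectionA and collectionB):
var delta = DeltaSet<IDClass>.GetDeltas(collectionA, collectionB);
var innerJoin = delta.IntersectedItems; // pairs matched on ID
var onlyInA = delta.FirstItems;         // elements of A not found in B
var onlyInB = delta.SecondItems;        // elements of B not found in A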
Assuming you have a class A for elements in collection A and a class B for elements in collection B:
class AB {
    public A PartA;
    public B PartB;
    public AB(A a, B b) { PartA = a; PartB = b; }
};
public void ManyJoin (List<A> colA, List<B> colB)
{
List<AB> innerJoin = new List<AB>();
List<A> leftJoin = new List<A>();
List<B> rightJoin = new List<B>();
bool[] foundB = new bool[colB.Count];
foreach (A itemA in colA)
{
int i = colB.FindIndex(itemB => itemB.ID == itemA.ID);
if (i >= 0)
{
innerJoin.Add (new AB(itemA, colB[i]));
foundB[i] = true;
}
else
leftJoin.Add(itemA);
}
for (int j = 0; j < foundB.Length; j++)
{
if (!foundB[j])
rightJoin.Add(colB[j]);
}
}
This is one possible way. Whether it is optimal or not, I'm not sure, but it does the job.
