I have some data modeled as a simple set of nested C# objects that I am trying to create/retrieve from a Neo4j database using the .NET Neo4jClient.
My classes are in the form:
public class Foo {
    public int ID { get; set; }
    public List<Bar> bar { get; set; }
}

public class Bar {
    public int ID { get; set; }
    public List<Baz> baz { get; set; }
}

public class Baz {
    public int ID { get; set; }
}
Once the data has been stored in the correct form in the database:
(f:Foo)-[h:HASBAR]->(b:Bar)-[hb:HASBAZ]->(bz:Baz)
I am able to retrieve the data into my class structure using collect and OPTIONAL MATCH with the following query:
List<Foo> foolist = WebApiConfig.GraphClient.Cypher
    .Match("(f:Foo)-[h:HASBAR]->(b:Bar)")
    .OptionalMatch("(b)-[hb:HASBAZ]->(bz:Baz)")
    .With("f, { ID: b.ID, baz: collect(bz) } as Bar")
    .With("{ ID: f.ID, bar: collect(Bar) } as Foo")
    .Return<Foo>("Foo")
    .Results
    .ToList();
This all works perfectly and the data is correctly serialized into the proper classes.
My question is how should I perform the reverse?
That is, given a single Foo object containing nested bar and baz objects, can I create the above data structure in the database in a single query?
Or do I have to write a query per each level of nesting?
I know I will probably have to list properties when creating, since if I give the client a Foo object it will create a node with "bar" as a property.
My problem mostly comes from the third level of nesting: if I treat the second level (bar) as an array (passing in Foo.bar as a parameter), I can create multiple [:HASBAR] relationships, but within the same query I have not found a way to relate the correct Baz nodes to their Bar node.
Am I approaching this in the correct way?
Any responses are appreciated, thanks in advance...
Well, it is possible to do it in one query - unfortunately I don't think you can use the tasty UNWIND or FOREACH due to the secondary nesting, and you'll need to do some funky things with the classes, but well, here goes:
First, we need to define the classes so that we can deserialize the nested properties but not serialize them. To that end:
public class Foo
{
    public int ID { get; set; }

    [JsonIgnore]
    public List<Bar> bar { get; set; }

    [JsonProperty("bar")]
    private List<Bar> barSetter { set { bar = value; } }
}

public class Bar
{
    public int ID { get; set; }

    [JsonIgnore]
    public List<Baz> baz { get; set; }

    [JsonProperty("baz")]
    private List<Baz> bazSetter { set { baz = value; } }
}

public class Baz
{
    public int ID { get; set; }
}
What is this craziness??!?! Well, by using [JsonIgnore] we tell Json.NET not to serialize or deserialize a given property - but we still want to deserialize so that your retrieval query keeps working - and having the private setter marked with [JsonProperty] allows us to achieve this.
The added bonus of this approach is that you don't need to specify properties to serialize in the Cypher generation bit. And here it is in all its glory:
var query = gc.Cypher
    .Create("(f:Foo {fooParam})")
    .WithParam("fooParam", foo);

for (int barIndex = 0; barIndex < foo.bar.Count; barIndex++)
{
    var barIdentifier = $"bar{barIndex}";
    var barParam = $"{barIdentifier}Param";

    query = query
        .With("f")
        .Create($"(f)-[:HASBAR]->({barIdentifier}:Bar {{{barParam}}})")
        .WithParam(barParam, foo.bar[barIndex]);

    for (int bazIndex = 0; bazIndex < foo.bar[barIndex].baz.Count; bazIndex++)
    {
        var bazIdentifier = $"baz{barIndex}{bazIndex}";
        var bazParam = $"{bazIdentifier}Param";

        query = query
            .With($"f, {barIdentifier}")
            .Create($"({barIdentifier})-[:HASBAZ]->({bazIdentifier}:Baz {{{bazParam}}})")
            .WithParam(bazParam, foo.bar[barIndex].baz[bazIndex]);
    }
}
The f:Foo bit is as per normal; the subsequent for loops let you define each identifier and set its parameters.
I don't think this is an ideal solution, but it will work, and will execute in one query. Obviously, this could get unwieldy with lots of nested values.
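For completeness, a minimal sketch of actually running the built query (assuming gc is the same connected GraphClient used above and foo is the populated Foo object):

query.ExecuteWithoutResults();
// or, with the asynchronous API:
// await query.ExecuteWithoutResultsAsync();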
Say we have a class PersonInfo
public class PersonInfo
{
    public int age = 10;
    public int weight = 150;
}
and I uploaded this data to my MongoDB.
Now, say I don't need weight anymore, so my PersonInfo class looks like this:
public class PersonInfo
{
    public int age = 10;
}
But now, when I load from MongoDB it says:
"FormatException: Element 'weight' does not match any field or property of class PersonInfo."
How can I remove class variables without getting this error?
Thanks!
There are a few ways to solve this problem depending on what you're doing with your data.
By default, as you've seen, the driver will throw an exception if it can't deserialize all of the BSON document's properties; this is so that we don't end up with silent data loss.
The simplest approach is to add an ExtraElements property to the model you're deserializing into; this can be either a BsonDocument or an IDictionary<string, object>.
public class PersonInfo
{
    public ObjectId Id { get; set; }
    public int Age { get; set; }
    public BsonDocument ExtraElements { get; set; }
}
Alternatively, we can simply ignore any extra elements in the document, in a couple of ways.
The first is adding a [BsonIgnoreExtraElements] attribute to our class:
[BsonIgnoreExtraElements]
public class PersonInfo
{
    public ObjectId Id { get; set; }
    public int Age { get; set; }
}
Alternatively, we can create a class map for our PersonInfo in code by executing the following before running our query (this only needs to be executed once, ideally on application startup).
BsonClassMap.RegisterClassMap<PersonInfo>(map =>
{
    map.AutoMap();
    map.SetIgnoreExtraElements(true);
});
The next approach is useful if you want to ignore extra elements across all types, or across a range of types matching a predicate.
Here we're using a convention that we'll set up on application startup.
ConventionRegistry.Register("IgnoreExtraElements", new ConventionPack
{
new IgnoreExtraElementsConvention(true)
}, t => true);
The t => true predicate will apply the convention to all types.
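If you only want the convention applied to some of your types, change the predicate. A small sketch, where MyApp.Models is a hypothetical namespace containing your persisted classes:

ConventionRegistry.Register("IgnoreExtraElementsForModels", new ConventionPack
{
    new IgnoreExtraElementsConvention(true)
}, t => t.Namespace == "MyApp.Models"); // only applies to types in this namespace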
Recently found out about AutoMapper's ProjectTo<> method, so I've been playing around with it.
So far so good until I came upon a class that had multiple properties of the same type, such as:
public class RandomDto
{
    public int Id { get; set; }
    public ChildDto FirstChild { get; set; }
    public ChildDto SecondChild { get; set; }
}
It seems like it generates SQL for a single Child relationship, and not for both:
SELECT CASE
        WHEN [dtoRandom].[FirstChild_FK] IS NULL
        THEN CAST(1 AS BIT) ELSE CAST(0 AS BIT)
    END, [t0].[Child_Description], [dtoRandom].[Id]
FROM [Randoms] AS [dtoRandom]
INNER JOIN (
    SELECT [s].*
    FROM [Childs] AS [s]
) AS [t0] ON [dtoRandom].[FirstChild_FK] = [t0].[Id]
I've tried
.ProjectTo<RandomDto>(null, "FirstChild", "SecondChild")
.ProjectTo<RandomDto>(x=>x.FirstChild, x=>x.SecondChild)
and both return with the first one filled in and the second null.
Not sure if I need to set custom aliases or something for this to work?
I experienced the same behaviour and tried to work around the problem by using another class for the second navigation property that simply derives from the original one.
public class ChildDtoTmp : ChildDto { }

public class RandomDto
{
    public int Id { get; set; }
    public ChildDto FirstChild { get; set; }
    public ChildDtoTmp SecondChild { get; set; }
}
This worked fine, but because the type of the navigation property itself has another navigation property, the same strange behaviour shows up with the nested navigation property.
public class ChildDto
{
    public int Id { get; set; }
    public InnerChildDto InnerChild { get; set; }
}
This leads to the properties FirstChild and SecondChild being mapped, but only the InnerChild property of the FirstChild gets mapped.
Maybe this helps someone to figure out how to solve this.
Your issue might be related to a slightly confusing option called MaxDepth. I've managed to reproduce that exact problem by calling MaxDepth(1), which in my understanding should only affect self-referencing entities such as:
class Foo
{
    public Foo InnerFoo { get; set; }
}
In that case, MaxDepth(1) should only map the first Foo found in an object graph. Which is exactly what happens, but it also affects the following structure (quite wrongly, IMHO):
class Bar
{
    public Foo Foo1 { get; set; }
    public Foo Foo2 { get; set; }
}
A MaxDepth(1) in the above scenario will map only the Foo1 property, keeping Foo2 as null.
Oh, by the way: to set MaxDepth, you can either apply it to all of your mappings:
var mapperCfg = new MapperConfiguration(cfg =>
{
    cfg.AddProfile<SomeProfile>();
    cfg.ForAllMaps(SetMaxDepth);
});

private static void SetMaxDepth(TypeMap typeMap, IMappingExpression expression) => expression.MaxDepth(1);
Or to each map individually:
var mapperCfg = new MapperConfiguration(cfg =>
{
    cfg.CreateMap<SomeObject, SomeObjectDto>().MaxDepth(1);
});
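If MaxDepth does turn out to be the cause in your setup, a hedged sketch of the fix is simply to drop it, or to give the affected map enough depth for the nested DTOs (RandomEntity and ChildEntity are assumed names here, since the source entity classes aren't shown in the question):

var mapperCfg = new MapperConfiguration(cfg =>
{
    // Depth 2 allows RandomDto plus both of its ChildDto children to be projected
    cfg.CreateMap<RandomEntity, RandomDto>().MaxDepth(2);
    cfg.CreateMap<ChildEntity, ChildDto>();
});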
I have a repository for a DocumentDb database. My documents all have a set of common properties so all documents implement the IDocumentEntity interface.
public interface IDocumentEntity {
    [JsonProperty("id")]
    Guid Id { get; set; }

    [JsonProperty("documentClassification")]
    DocumentClassification DocumentClassification { get; set; }
}

public class KnownDocument : IDocumentEntity {
    [JsonProperty("id")]
    public Guid Id { get; set; }

    [JsonProperty("documentClassification")]
    public DocumentClassification DocumentClassification { get; set; }

    [JsonProperty("knownProperty")]
    public string KnownProperty { get; set; }
}
public class BaseDocumentRepository<T> where T : IDocumentEntity {
    public void Set(T entity) {
        // ... stuff
    }
}
This works fine with a KnownDocument where I know all of the properties. But, of course, what's great about a Document Db is that I don't need to know all of the properties (and in many cases I won't).
So my client submits something like this-
{unknownProperty1: 1, unknownProperty2: 2}
And I want to upsert this using my document repository.
public OtherDocumentService() {
    _otherDocumentRepository = new OtherDocumentRepository();
}

public void UpsertDocument(dynamic entity) {
    entity.id = new Guid();
    entity.documentClassification = DocumentClassification.Other;
    _otherDocumentRepository.Set(entity);
}
But I get an InvalidCastException from dynamic to IDocumentEntity. I assume it's because of the extra properties that exist on the dynamic object but not on the IDocumentEntity interface?
What I'm trying to do is leave my document entities open to be dynamic, but rely on a few properties being there to maintain them.
The entity parameter passed to UpsertDocument should explicitly implement IDocumentEntity in order to make the code work; it is not enough to just have an Id property.
Some options:
1) A proxy may be applied:
public class ProxyDocumentEntity : IDocumentEntity
{
    public dynamic Content { get; private set; }

    public ProxyDocumentEntity(dynamic @content)
    {
        Content = @content;
    }

    public Guid Id
    {
        get { return Content.Id; }
        set { Content.Id = value; }
    }

    // Forwarding the other interface member as well, so the class actually satisfies IDocumentEntity
    public DocumentClassification DocumentClassification
    {
        get { return Content.DocumentClassification; }
        set { Content.DocumentClassification = value; }
    }
}
...used like this:
public void UpsertDocument(dynamic entity)
{
    entity.Id = new Guid();
    repo.Set(new ProxyDocumentEntity(entity));
}
The stored document will have a nested Content property, which may not be acceptable.
2) There is a library, https://github.com/ekonbenefits/impromptu-interface, which creates a proxy dynamically and does not add an extra property like the solution above. The drawback will be performance.
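A rough sketch of that approach (a sketch only: it assumes the dynamic payload is something like an ExpandoObject, and the member names set on it must match the interface's member names for the duck typing to work):

using ImpromptuInterface;

public void UpsertDocument(dynamic entity)
{
    entity.Id = Guid.NewGuid();
    entity.DocumentClassification = DocumentClassification.Other;

    // Wraps the dynamic object in a generated proxy that implements IDocumentEntity
    IDocumentEntity proxied = Impromptu.ActLike<IDocumentEntity>(entity);
    repo.Set(proxied);
}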
Technically it could be two methods:
public void UpsertDocument(IDocumentEntity entity) {...}
public void UpsertDocument(dynamic entity) {...}
so the first (fast) would work for objects which implement IDocumentEntity and the second (slow) for the rest.
But this is somewhat speculative, as I don't know the details of the whole code base of your project.
If you have some flexibility as to how to name those dynamic properties, you could stuff them into a Dictionary property on your object:
public Dictionary<string, dynamic> extra { get; set; }
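If you would rather keep those unknown members at the document root instead of nesting them under extra, a possible variant (assuming the repository serializes with Json.NET, which the DocumentDB SDK uses by default) is to mark the dictionary with [JsonExtensionData]:

using System.Collections.Generic;
using Newtonsoft.Json;
using Newtonsoft.Json.Linq;

public class OtherDocument : IDocumentEntity
{
    [JsonProperty("id")]
    public Guid Id { get; set; }

    [JsonProperty("documentClassification")]
    public DocumentClassification DocumentClassification { get; set; }

    // Any JSON members with no matching property land in (and are written back from) this dictionary
    [JsonExtensionData]
    public IDictionary<string, JToken> Extra { get; set; }
}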
When using Entity Framework 6, what is the most efficient way to create an object or objects with additional data from other DbSet entities, when I have a DbContext or IQueryable<T>?
Here is some code:
If I have a Data class as follows:
public class Data
{
    [Key]
    public int id { get; set; }
    public string name { get; set; }
    public string data { get; set; }
    public int parentId { get; set; }
    public int otherDataId { get; set; }
}
And an OtherData class as follows:
public class OtherData
{
    [Key]
    public int id { get; set; }
    public string name { get; set; }
    public string data { get; set; }
}
In the Data class, the parentId is a foreign key reference to another Data object in the same DbSet, and the otherDataId is a foreign key reference to an OtherData object in a DbSet<OtherData>.
I would like to get all Data objects in the DbSet<Data>, together with the parent Data object's id and name and the OtherData object's id and name. I need this to be in one object to be sent from a web service GET.
I am not sure on how to do this.
Do I need some code along the lines of:
var result = DbContext.Data.Select(x=> x...).Join(y=> y...) .. new { id = x.id... y.name.. }
Can I please have some help with this code?
You can use a join and project the result. In the snippet below, CombinedData is another class with two string properties, Name and OtherName. You could also use a view, but I think the join is less work.
IQueryable<CombinedData> result = DbContext.Data.Join(
    DbContext.OtherData,
    outer => outer.otherDataId,
    inner => inner.id,
    (outer, inner) => new CombinedData { Name = outer.name, OtherName = inner.name }
);
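Since the question also asks for the parent Data row's id and name, here is a hedged sketch that joins Data to itself as well as to OtherData (property names are taken from the question; the anonymous type is only for illustration, a web service would normally project into a named DTO):

var result = (from d in DbContext.Data
              join p in DbContext.Data on d.parentId equals p.id
              join o in DbContext.OtherData on d.otherDataId equals o.id
              select new
              {
                  d.id,
                  d.name,
                  parentName = p.name,
                  otherName = o.name
              }).ToList();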
Depending on your overall architecture, this may be a good or a bad answer but often when faced with this in the past our teams would create a view in the database to combine the fields. Write your optimized query in the view and then treat it like any other table in your data layer.
You could accomplish the same end result using includes or joins or even writing out the expression in a cross-table query but in my opinion the view is the cleanest and most efficient method.
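To illustrate the view approach in EF6 terms, a sketch only (the view name vw_DataCombined and its columns are assumptions): create the view in the database with your optimized query, then map a read-only entity to it like any other table.

using System.ComponentModel.DataAnnotations;
using System.ComponentModel.DataAnnotations.Schema;

[Table("vw_DataCombined")]
public class DataCombined
{
    [Key]
    public int id { get; set; }
    public string name { get; set; }
    public string parentName { get; set; }
    public string otherDataName { get; set; }
}

// Exposed on the context like any other set and queried read-only:
// public DbSet<DataCombined> DataCombined { get; set; }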
I have a simple test solution which consists of two projects (a 'business' layer and a Data Access layer) using Catel to tie the two together - works fine, no problems.
However, I have been reading about how useful AutoMapper can be for moving data around such a setup by allowing easy population of DTOs, and decided to give it a look... that's when my problems started!
I'm using Entity Framework 6.1, VS 2013 Express for Desktop and accessing a SQL Server Express 14 db - no problems with data retrieval and data displays correctly in my views.
AutoMapper was added using NuGet.
In order to use AutoMapper I've set up the following in my App.xaml.cs
private void InitializeAutomapper()
{
    Mapper.CreateMap<Result, ResultDto>();
    Mapper.AssertConfigurationIsValid();
}
This code is the first item called inside my 'OnStartup'.
A service in my business layer makes a call to the Data Access layer and retrieves a list of Result entities.
Subsequently, I take a single entity from this list and use that in the AutoMapper mapping call.
I'm trying to populate a resultDTO from this single entity, using the following
Result res = ResultList.First();
ResultDto resultDTO = Mapper.Map<Result, ResultDto>(res);
'res' is correctly populated with data, but resultDTO is filled with the default values for the individual data types (int = 0, string = null, DateTime = {01/01/0001 00:00:00}), i.e. no values are mapped from the source to the destination.
There are References in both projects to AutoMapper and AutoMapper.Net and no errors are raised - it just doesn't work as advertised...
I'm not slagging off the software, just asking what I'm doing wrong!
I realise there isn't much code to work on here but, in truth, what is posted here is pretty much all I've added to try out AutoMapper. I can see, conceptually how useful it could be - I just need to figure out how to make it happen so any help/comments gratefully received...:)
EDIT
@Andrew, as requested -
Result Class:
public partial class Result
{
    public int Div { get; set; }
    public System.DateTime Date { get; set; }
    public string HomeTeam { get; set; }
    public string AwayTeam { get; set; }
    public int FTHG { get; set; }
    public int FTAG { get; set; }
    public string FTR { get; set; }
}
ResultDTO Class:
public class ResultDto
{
    int Div { get; set; }
    DateTime Date { get; set; }
    string HomeTeam { get; set; }
    string AwayTeam { get; set; }
    int FTHG { get; set; }
    int FTAG { get; set; }
    string FTR { get; set; }

    // Added tonight to try and get it to work
    public ResultDto()
    {
        Div = 0;
        Date = DateTime.Now;
        HomeTeam = null;
        AwayTeam = null;
        FTHG = 0;
        FTAG = 0;
        FTR = null;
    }
}
@stuartd, the following is used to retrieve the ResultList from which Result is obtained:
// Produce a list of DataLayer.Result entities.
var ResultList = (from x in dbContext.Results.Local
                  where x.HomeTeam == team.TeamName.ToString() || x.AwayTeam == team.TeamName.ToString()
                  orderby x.Date
                  select x).ToList();
Please note 'team.TeamName' is passed into the above from an external source - it seems to be working fine.
So to sum up -
I produce ResultList as a list of Result entities.
Fill Result with the first entity in the list.
Try to map this Result entity to ResultDTO
Fail :(
Hope this helps!
By default, class members are private unless otherwise specified, so the ResultDto properties aren't visible outside of the class.
public class ResultDto
{
int Div { get; set; }
....
}
needs to be
public class ResultDto
{
public int Div { get; set; }
....
}
AutoMapper can work out the type you are mapping from based on the arguments provided. Try this:
ResultDto resultDTO = Mapper.Map<ResultDto>(res);
UPDATE
This is wrong, or at least won't help. We need to see the source and destination classes as mentioned in the comments.