Deleting variables from a local script and loading existing database data - c#

Say we have a class PersonInfo
public class PersonInfo {
    public int age = 10;
    public int weight = 150;
}
and I uploaded this data to my Mongo DB.
Now, say I don't need weight anymore, so my PersonInfo class looks like this:
public class PersonInfo {
    public int age = 10;
}
But now, when I load from the Mongo DB it says,
"FormatException: Element 'weight' does not match any field or property of class PersonInfo."
How can I remove class variables without getting this error?
Thanks!

There are a few ways to solve this problem depending on what you're doing with your data.
By default, as you've seen, the driver throws an exception if it can't deserialize all of the BSON document's elements; this is so that you don't silently lose data.
The simplest approach is to add an ExtraElements property to the model you're deserializing into; it can be of type BsonDocument or IDictionary<string, object>.
public class PersonInfo
{
    public ObjectId Id { get; set; }
    public int Age { get; set; }
    public BsonDocument ExtraElements { get; set; }
}
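With this in place, any element that no longer matches a property (like the old weight value) is captured instead of causing an exception. A minimal usage sketch, assuming a database handle and a collection named "people" (both names are illustrative, not from the question):

var people = database.GetCollection<PersonInfo>("people");
var person = people.Find(FilterDefinition<PersonInfo>.Empty).First();

// The removed 'weight' field is no longer a class property, so it lands in ExtraElements.
if (person.ExtraElements != null && person.ExtraElements.Contains("weight"))
{
    Console.WriteLine(person.ExtraElements["weight"].AsInt32);
}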
Alternatively, we can simply ignore any extra elements in the document, in a couple of ways.
The first is to add the BsonIgnoreExtraElements attribute to the class:
[BsonIgnoreExtraElements]
public class PersonInfo
{
    public ObjectId Id { get; set; }
    public int Age { get; set; }
}
We can also register a class map for PersonInfo in code by executing the following before running our query (this only needs to happen once, ideally at application startup).
BsonClassMap.RegisterClassMap<PersonInfo>(map =>
{
    map.AutoMap();
    map.SetIgnoreExtraElements(true);
});
The last approach is useful if you want to ignore extra elements across all of your types, or across a range of types matching a predicate.
Here we're using a convention that we'll set up on application startup.
ConventionRegistry.Register("IgnoreExtraElements", new ConventionPack
{
    new IgnoreExtraElementsConvention(true)
}, t => true);
The t => true predicate will apply the convention to all types.
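If you only want the convention applied to your own models rather than every type, narrow the predicate. A hedged sketch, assuming your models live in a "MyApp.Models" namespace (the namespace is illustrative):

ConventionRegistry.Register(
    "IgnoreExtraElementsForMyModels",
    new ConventionPack { new IgnoreExtraElementsConvention(true) },
    // Only types from this (assumed) namespace get the convention.
    t => t.Namespace == "MyApp.Models");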

Related

Mongodb collection as dynamic

I have an application that has two similar but different objects and I want to store those objects in the same collection. What is the best way to do this? And how can I query this collection?
Today my collection is represented by:
public IMongoCollection<Post> Posts
{
    get
    {
        return _database.GetCollection<Post>("posts");
    }
}
And I have these classes:
public class Post
{
    public string Id { get; set; }
    public string Message { get; set; }
}

public class NewTypePost
{
    public string Id { get; set; }
    public string Image { get; set; }
}
So today I can only save and query using the Post class. Now I want to store and retrieve both classes, Post and NewTypePost.
I tried changing the class type from Post to dynamic, but when I did this I could not query the collection.
The MongoDB .NET driver offers a few possibilities in such cases:
Polymorphism
You can build a hierarchy of classes, and the MongoDB driver will be able to determine the type of an object as it is retrieved from the database:
[BsonKnownTypes(typeof(Post), typeof(NewTypePost))]
public abstract class PostBase
{
    [BsonId]
    public string Id { get; set; }
}

public class Post : PostBase
{
    public string Message { get; set; }
}

public class NewTypePost : PostBase
{
    public string Image { get; set; }
}
The MongoDB driver will add an additional field _t to every document, representing the corresponding class.
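With the hierarchy registered, both types can live in one collection and be filtered back out by concrete type. A minimal usage sketch (the db variable is assumed to be an IMongoDatabase):

// One collection typed to the base class holds both document types.
var posts = db.GetCollection<PostBase>("posts");
posts.InsertOne(new Post { Id = "1", Message = "m1" });
posts.InsertOne(new NewTypePost { Id = "2", Image = "im1" });

// The _t discriminator lets the driver filter and deserialize by concrete type.
var onlyNewTypePosts = posts.OfType<NewTypePost>().Find(_ => true).ToList();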
Single Class
You can keep a single Post class and use the BsonIgnoreIfNull attribute to avoid serialization exceptions. The MongoDB .NET driver will leave those properties null if they don't exist in your database.
public class Post
{
    [BsonId]
    public string Id { get; set; }

    [BsonIgnoreIfNull]
    public string Message { get; set; }

    [BsonIgnoreIfNull]
    public string Image { get; set; }
}
BsonDocument
You can also drop the strongly-typed approach and use the BsonDocument class, which is a dynamic, dictionary-like structure that represents your Mongo documents:
var collection = db.GetCollection<BsonDocument>("posts");
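A hedged sketch of working with the untyped documents; field access is by name, with no compile-time checking:

var collection = db.GetCollection<BsonDocument>("posts");
var docs = collection.Find(new BsonDocument()).ToList();

foreach (var doc in docs)
{
    // Each document exposes whichever fields it happens to contain.
    if (doc.Contains("Message"))
        Console.WriteLine(doc["Message"].AsString);
    else if (doc.Contains("Image"))
        Console.WriteLine(doc["Image"].AsString);
}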
dynamic
Specifying dynamic as the generic parameter of the collection, you get back a list of ExpandoObject instances that hold all the values you have in your database.
var collection = db.GetCollection<dynamic>("posts");
var data = collection.Find(Builders<dynamic>.Filter.Empty).ToList();
var firstMessage = data[0].Message; // dynamically typed code
Suppose I have the following connection to a test database:
var mongoClient = new MongoClient(new MongoClientSettings
{
    Server = new MongoServerAddress("localhost"),
});
var database = mongoClient.GetDatabase("TestDb");
Then I can do something like:
var col = database.GetCollection<Post>("posts");
var col2 = database.GetCollection<NewTypePost>("posts");
This gives two different IMongoCollection instances, both pointing to the same collection in the database. Further, I am able to save through each one in the usual way:
col.InsertOne(new Post { Message = "m1" });
col2.InsertOne(new NewTypePost { Image = "im1" });
Then I'm also able to query those collections based on their specific fields:
var p1 = col.Find(Builders<Post>.Filter.Eq(x => x.Message, "m1")).FirstOrDefault();
var p2 = col2.Find(Builders<NewTypePost>.Filter.Eq(x => x.Image, "im1")).FirstOrDefault();
Console.WriteLine(p1?.Message); // m1
Console.WriteLine(p2?.Image); // im1
I don't know if that's what you want, but it uses the same collection. By the way, decorate the Id properties with [BsonId, BsonRepresentation(BsonType.ObjectId)]. Hope it helps.
Use the BsonDocument data type. It can do all of that; converting back and forth between BsonDocument and dynamic is very convenient.
public class CustomObject {
    public long Id { get; set; }
    public string Name { get; set; }
    public List<(string, object)> CollectionDynamic { get; set; }
}

// Shape of the document as it was inserted into Mongo:
// public class CustomObject_in_Db {
//     public long Id { get; set; }
//     public string Name { get; set; }
//     public string field2 { get; set; }
//     public string field3 { get; set; }
//     public string field4 { get; set; }
//     public string field5 { get; set; }
// }

// Mapper configuration (run once at startup):
AutoMapper.Mapper.CreateMap<BsonDocument, CustomObject>()
    .ForMember(dest => dest.Id, opt => opt.MapFrom(s => s.GetValue(nameof(CustomObject.Id)).AsInt64))
    .ForMember(dest => dest.Name, opt => opt.MapFrom(s => s.GetValue(nameof(CustomObject.Name)).AsString))
    .ForMember(dest => dest.CollectionDynamic, opt => opt.MapFrom(s => _getList(s)));

// Collects every element that does not correspond to a known property into the tuple list.
private List<(string, object)> _getList(BsonDocument source) {
    var knownNames = typeof(CustomObject).GetProperties().Select(p => p.Name);
    return source.Elements
        .Where(e => !knownNames.Contains(e.Name))
        .Select(e => (e.Name, BsonTypeMapper.MapToDotNetValue(e.Value)))
        .ToList();
}
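Usage is then a straightforward map call on a document fetched as a BsonDocument; a hedged sketch using the same old static AutoMapper API as above (the collection name is illustrative):

var doc = database.GetCollection<BsonDocument>("posts").Find(new BsonDocument()).First();
CustomObject custom = AutoMapper.Mapper.Map<CustomObject>(doc);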

Is it possible to have extra (ignored) properties in C#?

I have a repository for a DocumentDb database. My documents all have a set of common properties so all documents implement the IDocumentEntity interface.
public interface IDocumentEntity {
    [JsonProperty("id")]
    Guid Id { get; set; }

    [JsonProperty("documentClassification")]
    DocumentClassification DocumentClassification { get; set; }
}

public class KnownDocument : IDocumentEntity {
    [JsonProperty("id")]
    public Guid Id { get; set; }

    [JsonProperty("documentClassification")]
    public DocumentClassification DocumentClassification { get; set; }

    [JsonProperty("knownProperty")]
    public string KnownProperty { get; set; }
}

public class BaseDocumentRepository<T> where T : IDocumentEntity {
    public void Set(T entity) {
        // ... stuff
    }
}
This works fine with a KnownDocument where I know all of the properties. But, of course, what's great about a Document Db is that I don't need to know all of the properties (and in many cases I won't).
So my client submits something like this-
{unknownProperty1: 1, unknownProperty2: 2}
And I want to upsert this using my document repository.
public OtherDocumentService() {
    _otherDocumentRepository = new OtherDocumentRepository();
}

public void UpsertDocument(dynamic entity) {
    entity.id = new Guid();
    entity.documentClassification = DocumentClassification.Other;
    _otherDocumentRepository.Set(entity);
}
But I get an InvalidCastException from dynamic to IDocumentEntity. I assume it's because of the extra properties that exist on the dynamic object but not on the IDocumentEntity interface?
What I'm trying to do is leave my document entities open to be dynamic, but rely on a few properties being there to maintain them.
The entity parameter passed to UpsertDocument must actually implement IDocumentEntity in order for the code to work; it is not enough to just have an Id property.
Some options:
1) A proxy may be applied:
public class ProxyDocumentEntity : IDocumentEntity
{
    public dynamic Content { get; private set; }

    public ProxyDocumentEntity(dynamic content)
    {
        Content = content;
    }

    public Guid Id
    {
        get { return Content.Id; }
        set { Content.Id = value; }
    }

    // The interface's second member also has to be forwarded for the class to compile.
    public DocumentClassification DocumentClassification
    {
        get { return Content.DocumentClassification; }
        set { Content.DocumentClassification = value; }
    }
}
Usage:
public void UpsertDocument(dynamic entity)
{
    entity.Id = new Guid();
    repo.Set(new ProxyDocumentEntity(entity));
}
The stored document will have a nested Content property, which may not be acceptable.
2) There is a library, https://github.com/ekonbenefits/impromptu-interface, which creates a proxy dynamically and does not add an extra property like the solution above.
The drawback will be performance.
Technically it could be 2 methods:
public void UpsertDocument(IDocumentEntity entity){...}
public void UpsertDocument(dynamic entity){...}
so the first (fast) overload will handle objects that implement IDocumentEntity, and the second (slow) one the rest of the objects.
But this is a bit of speculation, as I don't know the details of your project's code base.
If you have some flexibility as to how to name those dynamic properties, you could stuff them into a Dictionary property on your object:
public Dictionary<string, dynamic> extra { get; set; }
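If you would rather not rename or nest anything, Json.NET can also collect unmapped members automatically with [JsonExtensionData]. A hedged sketch, assuming Json.NET is the serializer in play (as the JsonProperty attributes suggest); the OtherDocument class name is illustrative:

public class OtherDocument : IDocumentEntity
{
    [JsonProperty("id")]
    public Guid Id { get; set; }

    [JsonProperty("documentClassification")]
    public DocumentClassification DocumentClassification { get; set; }

    // Any JSON members without a matching property land here on deserialization
    // and are written back out on serialization.
    [JsonExtensionData]
    public IDictionary<string, object> Extra { get; set; } = new Dictionary<string, object>();
}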

Creating Graph from nested objects with Neo4jClient and Cypher

I have some data modeled as a simple set of nested C# objects that I am trying to create/retrieve from a Neo4j database using the .NET Neo4jClient.
My classes are in the form:
public class Foo {
    public int ID { get; set; }
    public List<Bar> bar { get; set; }
}

public class Bar {
    public int ID { get; set; }
    public List<Baz> baz { get; set; }
}

public class Baz {
    public int ID { get; set; }
}
Once the data has been stored in the correct form in the database:
(f:Foo)-[h:HASBAR]->(b:Bar)-[hb:HASBAZ]->(bz:Baz)
I am able to retrieve the data into my class structure using collect and OPTIONAL MATCH with the following query:
List<Foo> foolist = WebApiConfig.GraphClient.Cypher
    .Match("(f:Foo)-[h:HASBAR]->(b:Bar)")
    .OptionalMatch("(b)-[hb:HASBAZ]->(bz:Baz)")
    .With("f, { ID: b.ID, baz: collect(bz) } as Bar")
    .With("{ ID: f.ID, bar: collect(Bar) } as Foo")
    .Return<Foo>("Foo")
    .Results
    .ToList();
This all works perfectly and the data is correctly serialized into the proper classes.
My question is how should I perform the reverse?
As in given a single Foo class, containing multiple bar and baz classes nested, can I create the above data structure in the database in a single query?
Or do I have to write a query per each level of nesting?
I know I will probably have to list properties when creating, as if I give the client a Foo class it will create a node with "bar" as a property.
My problem mostly comes from the third level of nesting, if I treat the second level (bar) as an array (passing in Foo.bar) as a variable, I can create multiple [:HASBAR] relationships. But within the same query I have not found a way to relate the correct Baz node with the Bar node.
Am I approaching this in the correct way?
Any responses are appreciated, thanks in advance...
Well, it is possible to do it in one query - unfortunately I don't think you can use the tasty UNWIND or FOREACH due to the secondary nesting, and you'll need to do some funky things with the classes, but well, here goes:
First, we need to define the classes so that we can deserialize the nested properties without serializing them. To that end:
public class Foo
{
    public int ID { get; set; }

    [JsonIgnore]
    public List<Bar> bar { get; set; }

    [JsonProperty("bar")]
    private List<Bar> barSetter { set { bar = value; } }
}

public class Bar
{
    public int ID { get; set; }

    [JsonIgnore]
    public List<Baz> baz { get; set; }

    [JsonProperty("baz")]
    private List<Baz> bazSetter { set { baz = value; } }
}

public class Baz
{
    public int ID { get; set; }
}
What is this craziness?! Well, by using [JsonIgnore] we tell Json.NET not to serialize or deserialize a given property; but we still want to deserialize so that your retrieval query keeps working, and the private setter-only property marked with [JsonProperty] lets us achieve exactly that.
The added bonus of this approach is that you don't need to specify properties to serialize in the Cypher generation bit. And here it is in all its glory:
var query = gc.Cypher
    .Create("(f:Foo {fooParam})")
    .WithParam("fooParam", foo);

for (int barIndex = 0; barIndex < foo.bar.Count; barIndex++)
{
    var barIdentifier = $"bar{barIndex}";
    var barParam = $"{barIdentifier}Param";

    query = query
        .With("f")
        .Create($"(f)-[:HASBAR]->({barIdentifier}:Bar {{{barParam}}})")
        .WithParam(barParam, foo.bar[barIndex]);

    for (int bazIndex = 0; bazIndex < foo.bar[barIndex].baz.Count; bazIndex++)
    {
        var bazIdentifier = $"baz{barIndex}{bazIndex}";
        var bazParam = $"{bazIdentifier}Param";

        query = query
            .With($"f, {barIdentifier}")
            .Create($"({barIdentifier})-[:HASBAZ]->({bazIdentifier}:Baz {{{bazParam}}})")
            .WithParam(bazParam, foo.bar[barIndex].baz[bazIndex]);
    }
}
The f:Foo bit is as per normal, the subsequent for loops allow you to define each identifier and set the parameters.
I don't think this is an ideal solution, but it will work and will execute in one query. Obviously, this could get unwieldy with lots of nested values.
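One last note: the fluent query built in the loops is only sent to the server once it is executed, so a usage sketch would end with something like:

// Nothing is sent to Neo4j until the query is executed.
query.ExecuteWithoutResults();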

How can I get data from different DbSet entities and combine into one object?

When using Entity Framework 6, what is the most efficient way to create an object or objects with additional data from other DbSet entities, when I have a DbContext or IQueryable<T>?
Here is some code:
If I have a Data class as follows:
public class Data
{
    [Key]
    public int id { get; set; }
    public string name { get; set; }
    public string data { get; set; }
    public int parentId { get; set; }
    public int otherDataId { get; set; }
}
And an OtherData class as follows:
public class OtherData
{
    [Key]
    public int id { get; set; }
    public string name { get; set; }
    public string data { get; set; }
}
In the Data class, the parentId is a foreign key reference to another Data object in the same DbSet, and the otherDataId is a foreign key reference to an OtherData object in a DbSet<OtherData>.
I would like to get all Data objects in the DbSet<Data>, with the additional DbSet data of the parent Data objects id and name and the OtherData object's id and name. I need this to be in one object to be sent from a webservice GET.
I am not sure on how to do this.
Do I need some code along the lines of:
var result = DbContext.Data.Select(x=> x...).Join(y=> y...) .. new { id = x.id... y.name.. }
Can I please have some help with this code?
You can use a join and project the result. In the snippet below, CombinedData is another class with two string properties, Name and OtherName. You could also use a view, but I think the join is less work.
IQueryable<CombinedData> result = DbContext.Data.Join(
    DbContext.OtherData,
    outer => outer.otherDataId,
    inner => inner.id,
    (outer, inner) => new CombinedData { Name = outer.name, OtherName = inner.name }
);
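For the exact shape the question asks for (the Data row plus its parent's id and name plus the OtherData id and name), a hedged sketch of the projection might look like this; ResultDto and its property names are illustrative assumptions, not part of the question:

// Hypothetical DTO returned by the web service.
public class ResultDto
{
    public int id { get; set; }
    public string name { get; set; }
    public int parentId { get; set; }
    public string parentName { get; set; }
    public int otherDataId { get; set; }
    public string otherDataName { get; set; }
}

// Self-join on Data for the parent, plus a join to OtherData,
// projected into the DTO so EF translates it to a single SQL query.
var result = from d in DbContext.Data
             join p in DbContext.Data on d.parentId equals p.id
             join o in DbContext.OtherData on d.otherDataId equals o.id
             select new ResultDto
             {
                 id = d.id,
                 name = d.name,
                 parentId = p.id,
                 parentName = p.name,
                 otherDataId = o.id,
                 otherDataName = o.name
             };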
Depending on your overall architecture, this may be a good or a bad answer but often when faced with this in the past our teams would create a view in the database to combine the fields. Write your optimized query in the view and then treat it like any other table in your data layer.
You could accomplish the same end result using includes or joins or even writing out the expression in a cross-table query but in my opinion the view is the cleanest and most efficient method.

Is this still the way to handle JSON serializer circular reference exception in asp.net MVC

I'm using MVC 5 and Visual Studio 2013 with Entity Framework. The basic problem is when we have a many-to-many relationship like this:
public class Person
{
    public int ID { get; set; }
    public string FirstName { get; set; }
    public string LastName { get; set; }
    public ICollection<Group> Groups { get; set; }

    public Person()
    {
        Groups = new HashSet<Group>();
    }
}

public class Group
{
    public Guid ID { get; set; }
    public string Name { get; set; }
    public ICollection<Person> People { get; set; }

    public Group()
    {
        People = new HashSet<Person>();
    }
}
and we want to get the JSON representation for a record like this
Person person = db.People.Include(x => x.Groups).Where(i => i.ID == id).Single();
string json = JsonConvert.SerializeObject(person);
the JsonConvert throws a circular reference exception.
This SO question has an easy solution: configure the serializer so that circular references no longer cause an exception:
var serializerSettings = new JsonSerializerSettings { PreserveReferencesHandling = PreserveReferencesHandling.Objects };
string json = JsonConvert.SerializeObject(person, serializerSettings);
My question is this: is this still the best way to deal with this situation? Those SO answers are now very old, and this seems like a very common situation. I've updated everything I can think of to update in my solution, but I still get the exception unless I do the extra configuration step.
Also, are there side effects to setting PreserveReferencesHandling = PreserveReferencesHandling.Objects? Is there any reason ever not do this?
Using data member attributes is a better solution, I think.
Here you can choose what to serialize and what not.
Not adding the attribute to the unwanted/circular reference properties solves the circular serialization problem.
http://www.newtonsoft.com/json/help/html/DataContractAndDataMember.htm
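As a hedged sketch of that approach (Json.NET honours the data-contract attributes, so serialization becomes opt-in): only members marked [DataMember] are written, and leaving the attribute off the circular navigation property breaks the loop. The attribute placement below is illustrative:

using System.Runtime.Serialization;

[DataContract]
public class Person
{
    [DataMember]
    public int ID { get; set; }

    [DataMember]
    public string FirstName { get; set; }

    [DataMember]
    public string LastName { get; set; }

    // No [DataMember]: Groups is skipped during serialization,
    // so the Person -> Group -> Person cycle is never walked.
    public ICollection<Group> Groups { get; set; }
}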
Another way is to avoid eager loading by removing the Include extension method from your statement. Or create another custom class for your dependent object and project into it, e.g. Select(p => new CustomClass { PropertyOne = p.PropertyOne }).
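A minimal sketch of that projection idea; PersonDto and its shape are assumptions for illustration, not from the question:

public class PersonDto
{
    public int ID { get; set; }
    public string FirstName { get; set; }
    public string LastName { get; set; }
    public List<string> GroupNames { get; set; }
}

var personDto = db.People
    .Where(p => p.ID == id)
    .Select(p => new PersonDto
    {
        ID = p.ID,
        FirstName = p.FirstName,
        LastName = p.LastName,
        // Only the group names are projected, so there is no Group -> People back-reference.
        GroupNames = p.Groups.Select(g => g.Name).ToList()
    })
    .Single();

string json = JsonConvert.SerializeObject(personDto); // nothing circular to trip over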
You just need to add
[ScriptIgnore(ApplyToOverrides = true)] into your text template (.tt) file (part of the EF data model).
Here is a portion of my text template before the change:
#>
<#=codeStringGenerator.NavigationProperty(navigationProperty)#>
<#
Once I inserted the attribute on the line above the codeStringGenerator call, my auto-generated classes looked like this:
[ScriptIgnore(ApplyToOverrides = true)]
public virtual ICollection<Currency> Currencies { get; set; }
I also needed to modify the UsingDirectives function to insert "using System.Web.Script.Serialization;"
