I'm a bit confused about implementing a voting system. I want to know the up-vote and down-vote counts for a Post, and also save which voters voted up or down. I'm using the model below for it.
public class Post : Entity
{
    public string Title { get; set; }
    public virtual User Owner { get; set; }
    public virtual List<User> UpVoters { get; set; }
    public virtual List<User> DownVoters { get; set; }
}
I sense a problem in this design: if I have to display the up-vote and down-vote counts of a post, I think I have to run two queries. Counting through the navigation properties would cause performance problems if I just need the counts. Am I right?
The second approach below confuses me too, because VoteUpCount and VoteDownCount always have to be maintained manually by the developer, and the values have to be recalculated whenever a voter changes his vote, which looks like a bit of a code smell.
public class Post : Entity
{
    public string Title { get; set; }
    public virtual User Owner { get; set; }
    public int VoteUpCount { get; set; }
    public int VoteDownCount { get; set; }
    public virtual List<User> UpVoters { get; set; }
    public virtual List<User> DownVoters { get; set; }
}
Can you suggest which is better, and why? Do I have other alternatives?
Lastly, a post holds three User navigation properties, and that makes me feel something could be wrong. Is there anything wrong with that relationship?
Instinctively, I would create a Vote entity, since you want to save information about votes:
public class Vote : Entity
{
    public User User { get; set; }
    public int VoteDirection { get; set; } // 1 or -1
    // any other info...
}
Then add a Votes collection to the Post class:
public class Post : Entity
{
    public string Title { get; set; }
    public virtual User Owner { get; set; }
    public virtual List<Vote> Votes { get; set; }
}
Then you derive the counts by summing or filtering VoteDirection over the post's Votes.
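For example, a minimal LINQ sketch of that (assuming the post's Votes collection is loaded):

int upVotes   = post.Votes.Count(v => v.VoteDirection == 1);
int downVotes = post.Votes.Count(v => v.VoteDirection == -1);
int score     = post.Votes.Sum(v => v.VoteDirection); // net score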
I don't think anything is wrong here.
Your first model looks pretty OK to me. I wouldn't go for the second one unless you can prove that there is a performance hit for getting a count of related entities. Premature optimisation is the root of all evil ;-).
If you need to store the count for performance reasons, then the second model is fine too. Just make sure that adding a vote updates the count field.
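As a minimal sketch of what keeping the counters in sync could look like, here is a hypothetical method on Post for the second model (the "switch your vote" rule is just illustrative, not part of the question):

public void AddUpVote(User voter)
{
    if (UpVoters.Contains(voter)) return;           // already counted
    if (DownVoters.Remove(voter)) VoteDownCount--;  // voter switched from down to up
    UpVoters.Add(voter);
    VoteUpCount++;
}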
Your first model is the right one; you should not compromise your model for the shortcomings of EF. The second version is also not very maintainable - there is too much thinking and manual effort involved.
There is actually a workaround for EF materializing all related entities when you only want the count. The syntax is not very intuitive, but it does result in the right SQL query:
var myPost = context.Posts.First();
int upvotersCount = context.Entry(myPost)
    .Collection(p => p.UpVoters)
    .Query()
    .Count();
This approach is detailed here.
Also, as a general rule of thumb, you should use ICollection<User> instead of the concrete List<User> in your model.
"Using navigation properties counts would cause performance problems if I just need counts. Am I right?" - what kind of performance problems do you have in mind?
Generally, adding the count fields will introduce redundancy in your database, but is an obvious performance optimization step. If I were you I would rather ask myself if I need to improve performance in this case. How many posts and votes are there going to be in the database? How many users are going to access this information simultaneously? Etc. In case your application needs to be really scalable, you can add the extra counts and make sure they are updated correctly. Otherwise, you can just keep this possibility in mind but not be in a hurry to implement it.
I'm working on a side project to create a forum built on top of RavenDB. I am currently trying to work out the relationship between the authors of topics and the "last reply" user on a topic. In a typical relational model I would simply store the FK to the user who posted the topic, and join to the replies table to get the most recent reply's author. That obviously isn't how Raven, or any document store for that matter, is meant to be used.
What would be the most "optimal" way of pulling this off? Currently I'm tossing around a couple ideas.
Idea 1:
Save the FK of the author on the Topic model, and add a [JsonIgnore]d User object that I populate when the Topic is loaded, using an Include in my session load (so one request from the client side so far; it just makes the load itself and the model a bit more complicated). Then possibly use a map-reduce index to get the most recent reply's author (or even the same method as for the topic author, so one or two queries depending).
Idea 2:
Save both the author and the most recent reply user on the model. The main "problem" here is the potential for stale data (say, if a username changes). However, that could potentially be alleviated with a background task (or simply by keeping it in mind when updating a user document and going back over all posts from that user).
Example of the models in question.
public class User
{
    public string Id { get; set; }
    public string UserName { get; set; }
    public string PasswordHash { get; set; }
}
public class Topic
{
    public string Id { get; set; }
    public string Title { get; set; }
    public string Body { get; set; }

    // Idea 1 Relationships
    public string AuthorId { get; set; }
    [JsonIgnore]
    public User Author { get; set; } // Would need to be populated on loads from an Include on AuthorId
    public string MostRecentReplyUserId { get; set; }
    [JsonIgnore]
    public User MostRecentReplyUser { get; set; } // Same as Author

    // Idea 2 Relationships (alternative to Idea 1 - only one set would exist on the real model)
    public User Author { get; set; }
    public User MostRecentReplyUser { get; set; }
}
Note: I would likely add a method to the User model that returns a "clean" version with things like the PasswordHash scrubbed out, and use that on the save for Idea 2.
Depending on your update and query performance needs, either way may be the better choice.
I would personally recommend the first idea, because you don't need to update existing documents when data on the user records changes. Using Include at query/load time is quite a nice feature of RavenDB that helps when retrieving nested records from the database.
Just make sure that you don't forget to include all nested documents - otherwise you may end up with many round trips.
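As a rough illustration of what that could look like for Idea 1 with the RavenDB session API (the document id and documentStore variable are placeholders, and the exact Include/Load overloads vary by client version):

using (var session = documentStore.OpenSession())
{
    var topic = session
        .Include<Topic>(t => t.AuthorId)
        .Include(t => t.MostRecentReplyUserId)
        .Load<Topic>("topics/1");

    // Both users arrive with the includes, so these loads are served
    // from the session cache and cause no extra server round trips.
    topic.Author = session.Load<User>(topic.AuthorId);
    topic.MostRecentReplyUser = session.Load<User>(topic.MostRecentReplyUserId);
}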
Embedding documents (like Idea 1, but with the user values stored on the topic) may be better if your data processing is separated from data retrieval and you don't have access to the database session when converting data to be handed out to the frontend. We're using a system that relies heavily on that (taking one input and mapping out a JSON counterpart of the value) - it separates the data-retrieval logic completely from the output logic (like mapping to JSON).
Downside here: you have to make sure the existing (embedded) data gets updated whenever a user changes, and more data is transferred over the wire than with Idea 1.
There are many related entities in the Domain assembly. For example, People has navigation properties (level 1) to FamilyRelations, Houses and Persons. Besides this, Houses has its own navigation property (level 2) to Address, and Address (level 3) has ones to City, Street, etc.
When I set LazyLoadingEnabled to true, I get JSON (on the left side of the screenshot) with all related entities.
How can I get only one level of nesting (as on the right side of the screenshot), or set the other levels to NULL (since I have set Newtonsoft.Json.NullValueHandling.Ignore)?
Can I implement this without using .Include for each entity?
My People class:
public class People : BaseEntity
{
    public int PersonID { get; set; }
    public int HouseID { get; set; }
    public int PeopleNumber { get; set; }
    public int? FamilyRelationID { get; set; }

    //FK to House
    public virtual House Houses { get; set; }
    //FK to Person
    public virtual Person Persons { get; set; }
    //FK to FamilyRelations
    public virtual FamilyRelations FamilyRelations { get; set; }
}
WebAPI config:
config.Formatters.JsonFormatter.SupportedMediaTypes.Add(new MediaTypeHeaderValue("text/html"));
config.Formatters.JsonFormatter.SerializerSettings.ReferenceLoopHandling
    = Newtonsoft.Json.ReferenceLoopHandling.Ignore;
config.Formatters.JsonFormatter.SerializerSettings.NullValueHandling
    = Newtonsoft.Json.NullValueHandling.Ignore;
I do not have a solution because I do not have enough experience with this, so I need your suggestions and advice. Sorry for my English; if I need to add more information, please let me know. Thanks.
UPDATE
I've tried adding [JsonIgnore] or ignoring those properties in the mapping class, but when I request get/House I need fields from Address without its navigation properties, and when I request get/People I do not need Address at all. As a result I can't simply ignore it.
Never return tracked objects to the controller. Your business logic code (which should not exist in the controller) should map your database aware objects to POCOs. This can be as simple as using
var poco = Mapper.Map<People>(livePerson);
And you set up your mapping profile to ignore those properties so they're not copied.
Note: my AutoMapper-fu is rusty, so that syntax is rough code.
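For what it's worth, a rough sketch of such a profile under that approach (mapping the tracked People proxy onto a plain People and ignoring the navigation properties; the constructor-based CreateMap syntax assumes a reasonably recent AutoMapper version):

using AutoMapper;

public class PeopleProfile : Profile
{
    public PeopleProfile()
    {
        // Copy scalar values, skip the lazy-loaded navigation properties.
        CreateMap<People, People>()
            .ForMember(d => d.Houses, opt => opt.Ignore())
            .ForMember(d => d.Persons, opt => opt.Ignore())
            .ForMember(d => d.FamilyRelations, opt => opt.Ignore());
    }
}

Register the profile with your AutoMapper configuration at startup and map each loaded entity before handing it to the controller.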
You want to be very careful with any blind mapping as it opens you up to the Mass Assignment vulnerability. This is equally true for going straight to your live tracked objects. If a user sees in their data IsAdmin: false, they might get crafty and post IsAdmin: true. This can be saved to your database with blind assignments.
If you are looking for a way to ignore navigation properties in JSON serialization, this answer can help; it shows how to skip them during serialization.
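The link isn't reproduced here, but one common technique along those lines (not necessarily the one that answer uses) is a custom Json.NET contract resolver that skips virtual properties, which is what EF navigation properties normally are. A rough, untested sketch:

using System.Reflection;
using Newtonsoft.Json;
using Newtonsoft.Json.Serialization;

public class IgnoreNavigationPropertiesResolver : DefaultContractResolver
{
    protected override JsonProperty CreateProperty(MemberInfo member, MemberSerialization memberSerialization)
    {
        var property = base.CreateProperty(member, memberSerialization);
        var propertyInfo = member as PropertyInfo;
        if (propertyInfo != null)
        {
            var getter = propertyInfo.GetGetMethod();
            if (getter != null && getter.IsVirtual && !getter.IsFinal)
            {
                // Virtual getter => almost certainly a lazy-loaded navigation property.
                property.ShouldSerialize = instance => false;
            }
        }
        return property;
    }
}

It would be registered next to the other formatter settings via config.Formatters.JsonFormatter.SerializerSettings.ContractResolver. As the update above notes, though, a blanket rule like this doesn't help when some endpoints do need one level of navigation data.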
Can anyone provide an easier, more automatic way of doing this?
I have the following save method for a FilterComboTemplate model. The data has been converted from JSON to a C# model entity by the Web API.
So that I don't create duplicate entries in the DeviceFilterProperty table, I have to go through each filter in turn, retrieve the assigned DeviceFilterProperty from the context, and overwrite the object in the filter. See the code below.
I have all the object Ids if they already exist, so it seems like this should be handled automatically, but perhaps that's just wishful thinking.
public void Save(FilterComboTemplate comboTemplate)
{
    // Set the DeviceFilterProperties so we don't create dupes
    foreach (var filter in comboTemplate.Filters)
    {
        filter.DeviceFilterProperty = context.DeviceFilterProperties.Find(filter.DeviceFilterProperty.DeviceFilterPropertyId);
    }

    context.FilterComboTemplates.Add(comboTemplate);
    context.SaveChanges();
}
From here I'm also going to have to check whether any of the filters already exist, and then manually update them if they differ from what's in the database, so as not to keep creating a whole new set after an edit of a FilterComboTemplate.
I'm finding myself writing a lot of this type of code. I've included the other model classes below for a bit of context.
public class FilterComboTemplate
{
    public FilterComboTemplate()
    {
        Filters = new Collection<Filter>();
    }

    [Key]
    public int FilterComboTemplateId { get; set; }

    [Required]
    public string Name { get; set; }

    [Required]
    public ICollection<Filter> Filters { get; set; }
}

public class Filter
{
    [Key]
    public int FilterId { get; set; }

    [Required]
    public DeviceFilterProperty DeviceFilterProperty { get; set; }

    [Required]
    public bool Exclude { get; set; }

    [Required]
    public string Data1 { get; set; }
}

public class DeviceFilterProperty
{
    [Key]
    public int DeviceFilterPropertyId { get; set; }

    [Required]
    public string Name { get; set; }
}
Judging from some similar questions on SO, it does not seem to be something EF does automatically...
It's probably not a massive cut in code, but you could do something like this: an extension method on DbContext (or on your particular data context):
public static bool Exists<TEntity>(this MyDataContext context, int id)
    where TEntity : Entity // a common base class exposing an int Id property
{
    // your code here, something similar to
    return context.Set<TEntity>().Any(x => x.Id == id);

    // or with reflection (evaluated in memory, not translated to SQL):
    var keyProperty = typeof(TEntity).GetProperties()
        .First(p => p.GetCustomAttributes(typeof(KeyAttribute), true).Length > 0);
    return context.Set<TEntity>()
        .AsEnumerable()
        .Any(x => Equals(keyProperty.GetValue(x, null), id));
}
This will check whether an object with that key exists in the DbContext. Naturally, a similar method can be created to actually return the entity as well.
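For instance, a companion sketch under the same assumptions (the Entity base class with an int Id, and the MyDataContext type from above):

public static TEntity GetById<TEntity>(this MyDataContext context, int id)
    where TEntity : Entity
{
    // Returns the entity, or null if no row has that key.
    return context.Set<TEntity>().FirstOrDefault(x => x.Id == id);
}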
There are two "returns" in the code; just use the one you prefer. The former forces all entities to inherit from an "Entity" object with an Id property (which is not necessarily a bad thing, but I can see the pain in this... you will also need to constrain the TEntity parameter: where TEntity : Entity or similar).
Take the "reflection" solution with a pinch of salt: first of all the performance may be a problem, and second I don't have VS running right now, so I don't even know if it compiles OK, let alone works!
Let me know if that works :)
It seems that you have some common operations to perform on parameters after they are bound from the request.
You may consider writing custom parameter bindings to reuse the code. HongMei's blog is a good starting point: http://blogs.msdn.com/b/hongmeig1/archive/2012/09/28/how-to-customize-parameter-binding.aspx
You may use the code in Scenario 2 to get the formatter binding to deserialize the model from the body, and perform the operations you want after that.
See the final step in the blog to specify the parameter type you want to customize.
Assume the following simple POCOs, Country and State:
public partial class Country
{
    public Country()
    {
        States = new List<State>();
    }

    public virtual int CountryId { get; set; }
    public virtual string Name { get; set; }
    public virtual string CountryCode { get; set; }
    public virtual ICollection<State> States { get; set; }
}

public partial class State
{
    public virtual int StateId { get; set; }
    public virtual int CountryId { get; set; }
    public virtual Country Country { get; set; }
    public virtual string Name { get; set; }
    public virtual string Abbreviation { get; set; }
}
Now assume I have a simple repository that looks something like this:
public partial class CountryRepository : IDisposable
{
    protected internal IDatabase _db;

    public CountryRepository()
    {
        _db = new Database(System.Configuration.ConfigurationManager.AppSettings["DbConnName"]);
    }

    public IEnumerable<Country> GetAll()
    {
        return _db.Query<Country>("SELECT * FROM Countries ORDER BY Name", null);
    }

    public Country Get(object id)
    {
        return _db.SingleById<Country>(id);
    }

    public void Add(Country c)
    {
        _db.Insert(c);
    }

    /* ...And So On... */
}
Typically in my UI I do not display all of the children (states), but I do display an aggregate count. So my country list view model might look like this:
public partial class CountryListVM
{
    [Key]
    public int CountryId { get; set; }
    public string Name { get; set; }
    public string CountryCode { get; set; }
    public int StateCount { get; set; }
}
When I'm using the underlying data provider (Entity Framework, NHibernate, PetaPoco, etc) directly in my UI layer, I can easily do something like this:
IList<CountryListVM> list = db.Countries
    .OrderBy(c => c.Name)
    .Select(c => new CountryListVM() {
        CountryId = c.CountryId,
        Name = c.Name,
        CountryCode = c.CountryCode,
        StateCount = c.States.Count
    })
    .ToList();
But when I'm using a repository or service pattern, I abstract away direct access to the data layer. It seems as though my options are to:
Return the Country with a populated States collection, then map over in the UI layer. The downside to this approach is that I'm returning a lot more data than is actually needed.
-or-
Put all my view models into my Common dll library (as opposed to having them in the Models directory in my MVC app) and expand my repository to return specific view models instead of just the domain pocos. The downside to this approach is that I'm leaking UI specific stuff (MVC data validation annotations) into my previously clean POCOs.
-or-
Are there other options?
How are you handling these types of things?
It really depends on the project's architecture. Usually, though, we have services above the repositories that handle this logic. The service decides which repositories to use to load what data. The flow is UI -> Controller -> Service -> Repositories -> DB. The UI and/or controllers have no knowledge of the repositories or their implementation.
Also, StateCount = c.States.Count would no doubt populate the States list anyway... wouldn't it? I'm pretty sure it will in NHibernate (with lazy loading causing an extra select to be sent to the DB).
One option is to separate your queries from your existing infrastructure entirely. This would be an implementation of a CQRS design. In this case, you can issue a query directly to the database using a "thin read layer", bypassing your domain objects. Your existing objects and ORM are actually getting in your way, and CQRS allows you to have a "command side" that is separate, and possibly a totally different set of tech, from your "query side", where each is designed to do its own job without being compromised by the requirements of the other.
Yes, I'm quite literally suggesting leaving your existing architecture alone, and perhaps using something like Dapper to do this (beware of untested code sample) directly from your MVC controllers, for example:
int count = connection.Query<int>(
    "select count(*) from State where CountryId = @countryId",
    new { countryId = 123 }).Single();
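Extending the same idea (equally untested, with assumed table and column names), the whole CountryListVM could be filled in a single round trip without touching the domain objects:

IList<CountryListVM> list = connection.Query<CountryListVM>(@"
    select c.CountryId, c.Name, c.CountryCode,
           (select count(*) from State s where s.CountryId = c.CountryId) as StateCount
    from Country c
    order by c.Name").ToList();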
Honestly, your question has given me food for thought for a couple of days. More and more I tend to think that denormalization is the correct solution.
Look, the main point of domain-driven design is to let the problem domain drive your modeling decisions. Consider the country entity in the real world. A country has a list of states. However, when you want to know how many states a certain country has, you don't go through the list of states in an encyclopedia and count them - you are more likely to look at the country's statistics and read the number of states there.
IMHO, the same behavior should be reflected in your domain model. You can keep this information in a property of the country, or introduce some kind of CountryStatistics object. Whichever approach you choose, it must be part of the country aggregate. Being inside the aggregate's consistency boundary ensures it holds consistent data when a state is added or removed.
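A minimal sketch of what that could look like inside the aggregate (member names and the protected setter are illustrative, not a prescribed design):

public partial class Country
{
    public virtual int StateCount { get; protected set; }

    public void AddState(State state)
    {
        States.Add(state);
        state.Country = this;
        StateCount++; // kept consistent inside the aggregate boundary
    }

    public void RemoveState(State state)
    {
        if (States.Remove(state))
            StateCount--;
    }
}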
Some other approaches:
- If the states collection is not expected to change a lot, you can allow a bit of denormalization: add a "NumberOfStates" property to the Country object. It will optimise the query, but you'll have to make sure the extra field holds the correct information.
- If you are using NHibernate, you can use ExtraLazyLoading - it will issue another select, but won't populate the whole collection when Count is called. More info here: nHibernate Collection Count (see also the mapping sketch after this list).
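If you go the extra-lazy route, a rough mapping sketch, assuming Fluent NHibernate is in use (in XML mappings the equivalent is lazy="extra"):

public class CountryMap : ClassMap<Country>
{
    public CountryMap()
    {
        Id(x => x.CountryId);
        Map(x => x.Name);
        Map(x => x.CountryCode);
        HasMany(x => x.States)
            .KeyColumn("CountryId")
            .Inverse()
            .ExtraLazyLoad(); // Count issues a SELECT COUNT(*) instead of loading the collection
    }
}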
I need to track the change history of some database objects in an MVC .NET application using the code-first approach.
Here is what is meant by history table:
http://database-programmer.blogspot.de/2008/07/history-tables.html
I would use a history table for this if I were writing the SQL queries myself. But with the code-first approach the SQL is generated... and I would like to stick to that paradigm.
The goal is a structure that holds all "old" revisions of changed/deleted entries together with some additional information (e.g. timestamp, user who changed it, ...)
Any ideas?
Regards,
Stefan
To be more specific, here is a code example:
public class Node {
    public int NodeID { get; set; }
    public string? data { get; set; } // sample data
}

public class NodeHistory {
    public int NodeID { get; set; }
    public string? data { get; set; }
    public int UserID { get; set; }
    public DateTime timestamp { get; set; }
}
What I need is some "framework" assistance to add an entry to NodeHistory whenever a change to the Node table is persisted.
That means just overriding the property setter isn't a solution, as it would also create an entry when the change to a "Node" is not persisted in the end (e.g. a rollback).
I think the best approach would be to use a repository pattern and insert into the NodeHistory table on every operation on the Node object that you want to keep a history of.
EDIT: Some code
public class NodeRepository {
    private readonly NodeContext NodeContext = new NodeContext();

    public Node EditNode(Node toEdit, int userId) {
        using (var scope = new TransactionScope())
        {
            // Edit the Node in NodeContext like you would anyway without a repository,
            // then record the change alongside it:
            NodeContext.NodeHistories.Add(new NodeHistory { NodeID = toEdit.NodeID, data = toEdit.data, UserID = userId, timestamp = DateTime.UtcNow });
            NodeContext.SaveChanges();
            scope.Complete(); // commit the edit and the history entry together
            return toEdit;
        }
    }
}
public class NodeContext : DbContext {
    public DbSet<Node> Nodes { get; set; }
    public DbSet<NodeHistory> NodeHistories { get; set; }
}
If you are looking for something simpler than this, then I have no idea what it might be.
This is really something you should do with a trigger. Yes, you have to write some SQL for it, but then the history is updated no matter how the update occurs, whether manually or through some other means.