I have my Unit of Measure, which users fill in and save; they can then save a list of Unit Sizes, which has its own table with a foreign key to the Unit of Measure. When I fetch all the data back, the Unit Size value comes back blank.
I have read a half dozen ways to do this and I am not comprehending them. The one that makes the most sense to me is using a Queryable extension so I am trying to go that route but my code still hasn't quite gotten there.
Here is where I am at - these are my entities:
namespace Mudman.Data.Entities
{
[Table("UnitOfMeasure")]
public class UnitOfMeasure : IEntityBase, IAuditBase
{
[Key]
[Column("UnitOfMeasureId")]
[DatabaseGenerated(DatabaseGeneratedOption.None)]
public string Id { get; set; }
[Required]
[ForeignKey("TenantId")]
public string TenantId { get; set; }
[JsonIgnore]
public virtual Tenant Tenant { get; set; }
public string Name { get; set; }
public virtual IEnumerable<UnitOfMeasureSize> UnitSize { get; set; }
[DatabaseGenerated(DatabaseGeneratedOption.None)]
public DateTime CreateDate { get; set; } = DateTime.UtcNow;
[StringLength(255)]
public string CreateUserId { get; set; }
[DatabaseGenerated(DatabaseGeneratedOption.None)]
public DateTime UpdateDate { get; set; }
[StringLength(255)]
public string UpdateUserId { get; set; }
}
}
Unit Of Measure size entity:
namespace Mudman.Data.Entities
{
[Table("UnitOfMeasureSize")]
public class UnitOfMeasureSize : IEntityBase, IAuditBase
{
[Key]
[Column("UnitOfMeasureSize")]
[DatabaseGenerated(DatabaseGeneratedOption.None)]
public string Id { get; set; }
[Required]
[ForeignKey("TenantId")]
public string TenantId { get; set; }
[JsonIgnore]
public virtual Tenant Tenant { get; set; }
[Required]
[ForeignKey("UnitOfMeasureId")]
public string UnitOfMeasureId { get; set; }
public virtual UnitOfMeasure UnitOfMeasure { get; set; }
[Required]
public int UnitSize { get; set; }
[DatabaseGenerated(DatabaseGeneratedOption.None)]
public DateTime CreateDate { get; set; } = DateTime.UtcNow;
[StringLength(255)]
public string CreateUserId { get; set; }
[DatabaseGenerated(DatabaseGeneratedOption.None)]
public DateTime UpdateDate { get; set; }
[StringLength(255)]
public string UpdateUserId { get; set; }
}
}
Unit Of Measure Repository including Unit Size:
namespace Mudman.Repository
{
public class UnitOfMeasureRepository : EntityBaseRepository<UnitOfMeasure>,
IUnitOfMeasureRepository
{
MudmanDbContext context;
public UnitOfMeasureRepository(MudmanDbContext context) : base(context)
{
{ this.context = context; };
}
public IEnumerable<UnitOfMeasure> GetAllUnitsOfMeasure(string TenantId)
{
var result = context.UnitOfMeasure
.Where( uom => uom.TenantId == TenantId)
.Include(uom => uom.UnitSize);
return result;
}
}
}
My GetAllAsync method in my service:
public Task<IEnumerable<UnitOfMeasureViewModel>> GetAllAsync()
{
var result = _unitOfMeasureRepo.GetAllUnitsOfMeasure(TenantId);
result.OrderBy(r => r.Name);
return _mapper.Map<List<UnitOfMeasure>, List<UnitOfMeasureViewModel>>(result.ToList());
}
AutoMapper Code:
CreateMap<UnitOfMeasure, UnitOfMeasureViewModel>().ReverseMap()
.ForMember(dest => dest.UnitSize, uos => uos.Ignore())
.ForMember(uom => uom.UnitSize, src => src.MapFrom(uom => uom.UnitSize));
There are a few issues with your attempts so far.
Firstly, your GetAllAsync looks like it wants to be an async method, but it makes entirely synchronous calls and hasn't been marked as async. I would avoid diving into asynchronous methods until you have the fundamentals of retrieving your data down.
What we cannot see from your example is the mapping between your unit of measure entity and the view model. The entity has a one-to-many relationship between unit of measure and unit sizes, so what gets populated depends on how the view model is laid out and configured for mapping. This is most likely the root of your problem: the mapping from the entity to the view model is probably relying on a convention that isn't pairing up with the data you expect.
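Since the view model itself isn't shown, here is a hypothetical shape (the UnitOfMeasureViewModel members and the nested UnitOfMeasureSizeViewModel are assumptions) together with explicit maps, just to illustrate what a convention-friendly pairing looks like:

// Hypothetical view models - the property names are assumptions.
public class UnitOfMeasureSizeViewModel
{
    public string Id { get; set; }
    public int UnitSize { get; set; }
}

public class UnitOfMeasureViewModel
{
    public string Id { get; set; }
    public string Name { get; set; }
    // Mirrors the entity's UnitSize collection so AutoMapper can pair them up.
    public List<UnitOfMeasureSizeViewModel> UnitSize { get; set; }
}

// Explicit maps so the collection isn't left entirely to convention.
CreateMap<UnitOfMeasureSize, UnitOfMeasureSizeViewModel>();
CreateMap<UnitOfMeasure, UnitOfMeasureViewModel>()
    .ForMember(dest => dest.UnitSize, opt => opt.MapFrom(src => src.UnitSize));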
Performance-wise, this approach will run into problems as your data model grows in terms of entities and rows. The fundamental problem with a repository like this is that a method with this signature:
IEnumerable<UnitOfMeasure> GetAllUnitsOfMeasure(string TenantId)
will load all data into memory, and you explicitly need to Include related entities whether the consumer wants them or not, which adds to the amount of work the queries need to do and the memory required. If TenantId exists because this is a multi-tenant database, such as a SaaS application with multiple tenants sharing a single data source, that is a good reason to adopt a Repository pattern, but I would not pass tenant ids around as parameters. Instead, have the repository accept a dependency that can validate and resolve the current TenantId from the session. That way the repository can always ensure that the current tenant rules are validated and applied for every query, without worrying about where the caller might have gotten a TenantId from. (For example, accepting a TenantId from a POST request would be bad, as that value could easily be tampered with.)
To address performance, and to touch on what you had read about IQueryable extensions: rather than returning IEnumerable<TEntity> from a repository, you can return IQueryable<TEntity>. The advantage is that the repository can still apply base filtering rules like the TenantId, while allowing the consumer to handle things like sorting and projection.
For example, the repository looks more like:
public class UnitOfMeasureRepository : IUnitOfMeasureRepository
{
private readonly MudmanDbContext _context;
private readonly ICurrentUserLocator _currentUserLocator;
public UnitOfMeasureRepository(MudmanDbContext context, ICurrentUserLocator currentUserLocator )
{
_context = context ?? throw new ArgumentNullException("context");
_currentUserLocator = currentUserLocator ?? throw new ArgumentNullException("currentUserLocator");
}
public IQueryable<UnitOfMeasure> GetUnitsOfMeasure()
{
var tenantId = _currentUserLocator.CurrentUserTenantId; // Checks session for current user and retrieves a tenant ID or throws an exception. (no session, etc.)
var query = _context.UnitOfMeasure
    .Where(uom => uom.TenantId == tenantId);

return query;
}
}
The changes to note here: we do away with the base generic repository class. It was confusing, as you were passing the context to a base class and then setting a local context instance as well. Generic repositories with EF are a bad code smell; they lead to either very complex code, very poor performing code, or both. There is a CurrentUserLocator which the container can inject - a simple class that verifies a user is currently authenticated and returns their Tenant ID. From there we return an IQueryable<UnitOfMeasure> with a base filter for the TenantId, which lets our consumers make up their own minds about how they want to consume it. Note that we do not need to use Include for related entities; again, the consumers can decide what they need.
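As a rough sketch of that locator dependency (the interface name matches the constructor above, but how the tenant is resolved - claims, session, etc. - is an assumption you would adapt):

// Contract the repository depends on; resolution details are up to the app.
public interface ICurrentUserLocator
{
    // Returns the current user's tenant id, or throws if there is no
    // authenticated session.
    string CurrentUserTenantId { get; }
}

// Example ASP.NET Core implementation reading a hypothetical "tenant_id" claim.
public class ClaimsCurrentUserLocator : ICurrentUserLocator
{
    private readonly IHttpContextAccessor _httpContextAccessor;

    public ClaimsCurrentUserLocator(IHttpContextAccessor httpContextAccessor)
    {
        _httpContextAccessor = httpContextAccessor ?? throw new ArgumentNullException(nameof(httpContextAccessor));
    }

    public string CurrentUserTenantId
    {
        get
        {
            var claim = _httpContextAccessor.HttpContext?.User?.FindFirst("tenant_id");
            if (claim == null)
                throw new InvalidOperationException("No authenticated user or tenant claim was found.");
            return claim.Value;
        }
    }
}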
Calling the new repository method and projecting your view models looks fairly similar to what you had. Since you are using AutoMapper, rather than using .Map() we can use .ProjectTo() with the IQueryable, and AutoMapper can essentially build a Select() expression to pull back only the data that the view model needs. To use the ProjectTo extension method we do need to provide it with the MapperConfiguration that was used to create your mapper, and that will tell it how to build the view model. (So rather than having a dependency of type Mapper, you will need one for the MapperConfiguration you set up for that mapper.)
public IEnumerable<UnitOfMeasureViewModel> GetAll()
{
    var models = _unitOfMeasureRepo.GetUnitsOfMeasure()
        .OrderBy(r => r.Name)
        .ProjectTo<UnitOfMeasureViewModel>(_mapperConfiguration)
        .ToList();

    return models;
}
What this does is call our repository method to get the IQueryable, append the ordering we desire, and call ProjectTo to let AutoMapper populate the view models before executing the query with ToList(). When using Select or ProjectTo we don't need to worry about using Include to eager load related data that might be mapped; these methods take care of loading the related entities if and when they're needed.
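For reference, a minimal sketch of how that MapperConfiguration dependency might be wired up, assuming ASP.NET Core's container and the hypothetical size view model from earlier:

// Build the configuration once; services that call ProjectTo take it as a dependency.
var mapperConfiguration = new MapperConfiguration(cfg =>
{
    cfg.CreateMap<UnitOfMeasureSize, UnitOfMeasureSizeViewModel>();
    cfg.CreateMap<UnitOfMeasure, UnitOfMeasureViewModel>();
});

services.AddSingleton(mapperConfiguration);                       // used by ProjectTo
services.AddSingleton(sp => mapperConfiguration.CreateMapper());  // IMapper, if .Map() is still needed elsewhere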
Even in cases where we want to use a method like this to update entities along with their related entities, the IQueryable approach works too:
public void IncrementUnitSize(string unitOfMeasureId)
{
    var unitOfMeasure = _unitOfMeasureRepo.GetUnitsOfMeasure()
        .Include(r => r.UnitSize)
        .Where(r => r.Id == unitOfMeasureId)
        .Single();

    foreach (var unitSize in unitOfMeasure.UnitSize)
        unitSize.UnitSize += 1;

    _context.SaveChanges();
}
Just as an example of fetching related entities as needed, versus having a method that returns IEnumerable and needs to eager load everything just in case some caller might need it.
These methods can very easily be translated into asynchronous ones without touching the repository:
public async Task<IEnumerable<UnitOfMeasureViewModel>> GetAllAsync()
{
    var models = await _unitOfMeasureRepo.GetUnitsOfMeasure()
        .OrderBy(r => r.Name)
        .ProjectTo<UnitOfMeasureViewModel>(_mapperConfiguration)
        .ToListAsync();

    return models;
}
... and that is all! Just remember that async doesn't make the call faster; if anything it makes it a touch slower. What it does is make the server more responsive by allowing it to move the request handling to a background thread and free the request thread to pick up a new server request. That is great for methods that take a bit of time, or that get called very frequently, because it avoids tying down all of the server request threads and causing timeouts for users waiting for a response. For methods that are very fast and aren't expected to get hammered by a lot of users, async doesn't add much value, and you need to ensure every async call is awaited or you can end up with wacky behaviour and exceptions.
Related
I'm getting the following content when I invoke my API. The response breaks off in the middle, at the point where the tenant entity the member is linked to starts listing its own member entities.
{
"id":"00000000-7357-000b-0001-000000000000",
"tenantId":"00000000-7357-000a-0001-000000000000",
"userName":"user1",
"tenant":{
"id":"00000000-7357-000a-0001-000000000000",
"name":"First Fake Org",
"members":[
I configured the lazy loading like this.
services.AddDbContext<Context>(config => config
.UseLazyLoadingProxies()
.UseSqlServer(Configuration.GetConnectionString("Register")));
How should I change the code so that the lazily loaded entities don't get served? I was hoping that it would simply return an empty list to the client. Should I use a DTO for that purpose and not return entities straight from the DB like this? There's talk about not using lazy loading for APIs at all here.
[HttpGet("test1/{username}"), AllowAnonymous]
public IActionResult GetStuff(string userName)
{
Member output;
output = Context.Members
.Include(e => e.Tenant)
.Single(e => e.UserName == userName);
return Ok(output);
}
I'm not sure what to google for, and all the hits I got were pointing to the UseLazyLoadingProxies() invocation.
This will probably be somewhat long-winded, but here goes.
It sounds like you have Entities which look something like:
public partial class Member
{
public virtual long Id { get; set; }
public virtual List<Tenant> Tenants { get; set; } //tables have fk relationship
}
public partial class Tenant
{
public virtual long Id { get; set; }
public virtual List<Member> Members{ get; set; } //tables have another fk relationship?
}
And then for this method:
[HttpGet("test1/{username}"), AllowAnonymous]
public IActionResult GetStuff(string userName)
{
Member output;
output = Context.Members
.Include(e => e.Tenant)
.Single(e => e.UserName == userName);
return Ok(output);
}
I see a few issues, but I'll try to keep it short:
I wouldn't have the controller do this directly. But it should work.
What I think you overlooked is exactly what the .Include statement does. When the object is materialized, it will pull in all of those related entities. Include essentially adds a left join to your query where the foreign keys match (EF calls these navigation properties).
If you don't want the Tenant property, then you can omit the .Include statement, unless this is meant to be more generic (in which case there's an even stronger reason to use a different pattern and AutoMapper).
Hopefully your database doesn't truly have a FK relationship both ways, if it does, fix that ASAP.
The next issue is that you might not want the list of child entities, but it is in the model so it will be "there", even if your Tenants list ends up null. That might be fine for you right now, but as a general rule, when I see an API returning a property I expect it either not to be there (this Member doesn't have tenants) or that something is wrong, like perhaps there is a second parameter I missed. This probably isn't a problem 93.284% of the time, but it is something to be mindful of.
This starts to get into why AutoMapper is great. Your database models, business models and views are likely different, and since you shouldn't return the database models directly, taking control of how the data is represented for each part of the application is a great idea.
You could reduce the code easily and remove the navigation properties:
[HttpGet("test1/{username}"), AllowAnonymous]
public IActionResult GetStuff(string userName)
{
return Ok(Context.Members
.Include(e => e.Tenant)
.Single(e => e.UserName == userName));
}
But again, a business layer would be better:
[HttpGet("test1/{username}"), AllowAnonymous]
public IActionResult GetStuff(string userName)
{
return Ok(MemberRepository.GetMember(userName));
}
The main point I'd stress though, is creating a view model.
For example, let's say a member detail view model:
public class MemberDetail
{
public string UserName {get; set;}
public long UserId { get; set; }
public string FullName { get; set; }
}
This way the view always receives exactly what you want it to see, and not the extra data. Add to this the fact that you know exactly how every use of Member maps to MemberDetail.
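A minimal mapping sketch for that view model, assuming AutoMapper and that Member exposes Id and UserName plus some name fields (FirstName/LastName here are assumptions):

// Hypothetical profile; adjust the source members to whatever Member actually exposes.
public class MemberProfile : Profile
{
    public MemberProfile()
    {
        CreateMap<Member, MemberDetail>()
            .ForMember(d => d.UserId, opt => opt.MapFrom(s => s.Id))
            .ForMember(d => d.FullName, opt => opt.MapFrom(s => s.FirstName + " " + s.LastName));
    }
}

The controller (or better, the layer above the repository) would then return _mapper.Map<MemberDetail>(member) instead of the entity itself, so the Tenant graph never reaches the serializer.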
I have the following method which is meant to build me up a single object instance, where its properties are built via recursively calling the same method:
public ChannelObjectModel GetChannelObject(Guid id, Guid crmId)
{
var result = (from channelObject in _channelObjectRepository.Get(x => x.Id == id)
select new ChannelObjectModel
{
Id = channelObject.Id,
Name = channelObject.Name,
ChannelId = channelObject.ChannelId,
ParentObjectId = channelObject.ParentObjectId,
TypeId = channelObject.TypeId,
ChannelObjectType = channelObject.ChannelObjectTypeId.HasValue ? GetChannelObject(channelObject.ChannelObjectTypeId.Value, crmId) : null,
ChannelObjectSearchType = channelObject.ChannelObjectSearchTypeId.HasValue ? GetChannelObject(channelObject.ChannelObjectSearchTypeId.Value, crmId) : null,
ChannelObjectSupportingObject = channelObject.ChannelObjectSupportingObjectId.HasValue ? GetChannelObject(channelObject.ChannelObjectSupportingObjectId.Value, crmId) : null,
Mapping = _channelObjectMappingRepository.Get().Where(mapping => mapping.ChannelObjectId == channelObject.Id && mapping.CrmId == crmId).Select(mapping => new ChannelObjectMappingModel
{
CrmObjectId = mapping.CrmObjectId
}).ToList(),
Fields = _channelObjectRepository.Get().Where(x => x.ParentObjectId == id).Select(field => GetChannelObject(field.Id, crmId)).ToList()
}
);
return result.First();
}
public class ChannelObjectModel
{
public ChannelObjectModel()
{
Mapping = new List<ChannelObjectMappingModel>();
Fields = new List<ChannelObjectModel>();
}
public Guid Id { get; set; }
public Guid ChannelId { get; set; }
public string Name { get; set; }
public List<ChannelObjectMappingModel> Mapping { get; set; }
public int TypeId { get; set; }
public Guid? ParentObjectId { get; set; }
public ChannelObjectModel ParentObject { get; set; }
public List<ChannelObjectModel> Fields { get; set; }
public Guid? ChannelObjectTypeId { get; set; }
public ChannelObjectModel ChannelObjectType { get; set; }
public Guid? ChannelObjectSearchTypeId { get; set; }
public ChannelObjectModel ChannelObjectSearchType { get; set; }
public Guid? ChannelObjectSupportingObjectId { get; set; }
public ChannelObjectModel ChannelObjectSupportingObject { get; set; }
}
this is connecting to a SQL database using Entity Framework Core 2.1.1
Whilst it technically works, it causes loads of database queries to be made - I realise it's because of the ToList() and First() etc. calls.
However, because of the nature of the object, I could build one huge IQueryable<anonymous> object with a from... select new {...} and call First on it, but that code was over 300 lines long going just 5 tiers deep in the hierarchy, so I am trying to replace it with something like the code above, which is much cleaner, albeit much slower.
ChannelObjectType, ChannelObjectSearchType and ChannelObjectSupportingObject are all ChannelObjectModel instances, and Fields is a list of ChannelObjectModel instances.
The query takes about 30 seconds to execute currently, which is far too slow and it is on a small localhost database too, so it will only get worse with a larger number of db records, and generates a lot of database calls when I run it.
The 300+ lines code generates a lot less queries and is reasonably quick, but is obviously horrible, horrible code (which I didn't write!)
Can anyone suggest a way I can recursively build up an object in a similar way to the above method, but drastically cut the number of database calls so it's quicker?
I work with EF6, not Core, but as far as I know, same things apply here.
First of all, move this function to your repository, so that all calls share the DbContext instance.
Secondly, use Include on your DbSet to eager load the navigation properties:
ctx.Set<ChannelObjectModel>()
.Include(x => x.Fields)
.Include(x => x.Mapping)
.Include(x => x.ParentObject)
...
Good practice is to make this a function of the context (or an extension method), called for example BuildChannelObject(), which returns the IQueryable with just the includes applied.
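A minimal sketch of that extension method, assuming ChannelObjectModel is mapped as an entity (as the snippets above imply) and that these are the navigation properties you want eagerly loaded:

// Returns the base IQueryable with the eager-loading includes applied.
// Uses Microsoft.EntityFrameworkCore (or System.Data.Entity for EF6).
public static class ChannelObjectQueryExtensions
{
    public static IQueryable<ChannelObjectModel> BuildChannelObject(this DbContext ctx)
    {
        return ctx.Set<ChannelObjectModel>()
            .Include(x => x.Fields)
            .Include(x => x.Mapping)
            .Include(x => x.ParentObject)
            .Include(x => x.ChannelObjectType)
            .Include(x => x.ChannelObjectSearchType)
            .Include(x => x.ChannelObjectSupportingObject);
    }
}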
Then you can start the recursive part:
public ChannelObjectModel GetChannelObjectModel(Guid id)
{
var set = ctx.BuildChannelObject(); // ctx is this
var channelModel = set.FirstOrDefault(x => x.Id == id); // this loads the first level
LoadRecursive(channelModel, set);
return channelModel;
}
private void LoadRecursive(ChannelObjectModel c, IQueryable<ChannelObjectModel> set)
{
if(c == null)
return; // recursion end condition
c.ParentObject = set.FirstOrDefault(x => x.Id == c.ParentObjectId);
// all other properties
LoadRecursive(c.ParentObject, set);
// all other properties
}
If all this code uses the same instance of DbContext, it should be quite fast. If not, you can use another trick:
ctx.BuildChannelObject().Load();
This loads all of the objects into the memory cache of your DbContext. Unfortunately, that cache dies with the context instance, but it makes those recursive calls much faster, since no database trip is made.
If this is still too slow, you can add AsNoTracking() as the last instruction of BuildChannelObject().
If this is still too slow, just implement an application-wide memory cache of those objects and use that instead of querying the database every time - this works great if your app is a service that can have a long startup but then needs to work fast.
A whole other approach is to enable lazy loading by marking the navigation properties as virtual - but remember that the returned type will be a derived proxy type, not your original ChannelObjectModel! Also, the properties will only load as long as you don't dispose the context - after that you get an exception. Loading all the properties while the context is still alive and then returning the complete object is also a little bit tricky - the easiest (but not the best!) way to do it is to serialize the object to JSON (remember about circular references) before returning it.
If that does not satisfy you, switch to NHibernate, which I hear has an application-wide cache by default.
I'm having some issues with understanding how to properly set up one-to-one relationships for a Web API using Entity Framework Core.
Consider the following objects:
public class Car
{
[JsonIgnore]
public Int32 CarId { get; set; }
public virtual Horn Horn { get; set; }
public virtual ICollection<Wheel> Wheels { get; set; }
}
public class Horn
{
[JsonIgnore]
public Int32 HornId { get; set; }
public Sound Sound { get; set; }
}
public class Sound
{
// some other props
}
When I perform a query in my repository, one-to-many properties are excluded by default unless I use .Include(); however, one-to-one properties are included by default when I serialize.
e.g. It gets really messy, because I query for a car and it returns all of its sub-components in the JSON response.
I can set car.Horn = null, etc, but that seems difficult to maintain for complex objects. I would like it to function like .Include(), but by default be excluded (in case I do want to query the full object).
Edit: Note, this issue is recursive, the car pulls in horn which pulls in sound. On a real world example like a user table, the automatically pulled in data when serialized is huge unless specifically nullifying the child properties.
EDIT2:
Here is an example of a repository call that by default brings back all one-to-one properties:
var group = _context.Car.Include(c =>
c.Wheels).SingleOrDefault(u => u.CarId == id);
Note the Include works as expected for many-to-one, but this query, even when the Include is removed, will recursively return all child objects that are one-to-one.
It does appear some type of lazy loading is being introduced in EF Core 2.1.
This article should give you a hint.
https://msdn.microsoft.com/en-us/library/jj574232(v=vs.113).aspx
Mainly:
Turn lazy loading off for serialization
Lazy loading and serialization don't mix well, and if you aren't careful you can end up querying for your entire database just because lazy loading is enabled. Most serializers work by accessing each property on an instance of a type. Property access triggers lazy loading, so more entities get serialized. On those entities properties are accessed, and even more entities are loaded. It's a good practice to turn lazy loading off before you serialize an entity. The following sections show how to do this.
EDIT:
Here is the way to disable lazy loading for all entities. But note that you can achieve this in several ways, so check the other options in the article...
public class YourContext : DbContext
{
public YourContext()
{
this.Configuration.LazyLoadingEnabled = false;
}
}
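Note that the snippet above uses the EF6-style Configuration property. Since the question targets EF Core, a rough equivalent (assuming EF Core 2.1 or later) is to disable lazy loading on the change tracker, or simply not call UseLazyLoadingProxies() in the first place:

public class YourContext : DbContext
{
    public YourContext(DbContextOptions<YourContext> options) : base(options)
    {
        // EF Core: disable lazy loading for this context instance.
        ChangeTracker.LazyLoadingEnabled = false;
    }
}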
Context mapping
modelBuilder.Entity<SessionFeedbackModel>(entity =>
{
entity.HasOne(s => s.Session).WithOne(p => p.Feedback)
.HasForeignKey<SessionFeedbackModel>(s => s.SessionId).OnDelete(DeleteBehavior.Restrict);
});
modelBuilder.Entity<SessionQuestionModel>(entity =>
{
entity.HasOne(e => e.SessionResult).WithOne(e => e.SessionQuestion)
.HasForeignKey<SessionQuestionResultModel>(e => e.SessionQuestionId)
.OnDelete(DeleteBehavior.Restrict);
});
Models
public class SessionQuestionResultModel
{
public int Id { get; set; }
public int SessionQuestionId { get; set; }
public SessionQuestionModel SessionQuestion { get; set; }
}
public class SessionFeedbackModel
{
public int Id { get; set; }
public int SessionId { get; set; }
public SessionModel Session { get; set; }
}
EF Core 1.x and 2.x do not support one-to-one relationships very well out of the box, but it can be done this way. This would be radically different for EF 6.x.
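As a quick usage sketch under that configuration (a DbContext instance called context, a SessionModel entity with an Id, and its Feedback navigation are assumptions based on the mapping above), the dependent side can then be eager loaded explicitly:

// Eager load the one-to-one Feedback for a given session.
var session = context.Set<SessionModel>()
    .Include(s => s.Feedback)
    .SingleOrDefault(s => s.Id == sessionId);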
I have implemented Unit of Work and Repository pattern with Entity Framework. A sample operation is shown below:
public T GetById(int id)
{
return _context.Set<T>().Find(id);
}
So if I wanted to bring back the record with id 12 and only the StudentName column, I cannot do that using the above method, as all columns will be pulled. I could have done it using LINQ like below:
var get = context.Students
    .Where(s => s.Id == 12)
    .Select(s => new { StudentName = s.Name })
    .SingleOrDefault();
Currently, I am returning an IQueryable from the repository like below to make above aforementioned scenario work:
public IQueryable<T> Query()
{
IQueryable<T> query = dbset;
return query;
}
which defeats the point of having a repository anyway, because I cannot restrict operations on the database now. I have to query like below:
var get = _uow.Students
    .Query()
    .Where(s => s.Id == 12)
    .Select(s => new { StudentName = s.Name })
    .SingleOrDefault();
Any suggestions on how to improve this situation, or other opinions, would be appreciated.
I would argue that you are worried about restricting operations at the wrong layer; the Repository. Assuming your Repositories are supposed to abstract away the details of your DB and Entity Framework, I think this is too low of a level to do what you want.
If you have a Repo per DB table, and a class per query result type (direct SQL/EF or DB View), it doesn't make sense to introduce another layer of abstraction here. It would be better to do this at the next layer up, or whatever is handling your transactional boundaries.
To demonstrate, here is a more concrete example:
Given a Student DB table:
TABLE Student
PK Id int
COLUMN Name string
COLUMN SecretData string
Your StudentRepo should always return instance(s) of a Student class:
public class Student
{
public int Id { get; set; }
public string Name { get; set; }
public string SecretData { get; set; }
}
Then the layer that utilizes your repo(s) should handle your transactions (potentially across multiple repos/operations) and map its results to a Domain entity. Your domain entity can include only the fields which you want to surface. You can create specialized domain entites for each purpose you need.
public class DomainStudent
{
public int Id { get; private set; } // prevent attempts to change Ids on domain entities
public string Name { get; set; }
}
public class DomainStudentWithSecret
{
public int Id { get; private set; }
public string Name { get; set; }
public string SecretData { get; set; }
}
And to expand on why you would want to handle this kind of mapping and transactional boundaries outside of your Repository code: these things are best left to code which can operate across many DB tables. Often you need to take the result of two separate SQL queries and map the result to a single domain entity. Or sometimes, you want to roll back a transaction (or not execute subsequent SQL) if an initial query fails. I find it best to keep the Repos/DAOs working on a single table/view/sproc (DB entity) to abstract away the details of the Db engine and have domain-layer classes handle the heavy lifting of how to make sense of the data. If you need complex SQL queries with many JOINs, consider creating a view so you can work with the data like any other table.
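To make that concrete, here is a minimal sketch of the layer above the repository. The StudentService and IStudentRepository names are assumptions, and a constructor on DomainStudent that sets the Id is assumed, since its Id setter is private:

// Hypothetical domain/service layer sitting above the repository.
public class StudentService
{
    private readonly IStudentRepository _studentRepo; // assumed abstraction over the Student table

    public StudentService(IStudentRepository studentRepo)
    {
        _studentRepo = studentRepo;
    }

    public DomainStudent GetStudent(int id)
    {
        // The repo returns the full row; this layer decides which
        // fields are surfaced to callers via the domain entity.
        Student student = _studentRepo.GetById(id);
        if (student == null)
            return null;

        return new DomainStudent(student.Id) { Name = student.Name };
    }
}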
Assume the following simple POCOs, Country and State:
public partial class Country
{
public Country()
{
States = new List<State>();
}
public virtual int CountryId { get; set; }
public virtual string Name { get; set; }
public virtual string CountryCode { get; set; }
public virtual ICollection<State> States { get; set; }
}
public partial class State
{
public virtual int StateId { get; set; }
public virtual int CountryId { get; set; }
public virtual Country Country { get; set; }
public virtual string Name { get; set; }
public virtual string Abbreviation { get; set; }
}
Now assume I have a simple respository that looks something like this:
public partial class CountryRepository : IDisposable
{
protected internal IDatabase _db;
public CountryRepository()
{
_db = new Database(System.Configuration.ConfigurationManager.AppSettings["DbConnName"]);
}
public IEnumerable<Country> GetAll()
{
return _db.Query<Country>("SELECT * FROM Countries ORDER BY Name", null);
}
public Country Get(object id)
{
return _db.SingleById(id);
}
public void Add(Country c)
{
_db.Insert(c);
}
/* ...And So On... */
}
Typically in my UI I do not display all of the children (states), but I do display an aggregate count. So my country list view model might look like this:
public partial class CountryListVM
{
[Key]
public int CountryId { get; set; }
public string Name { get; set; }
public string CountryCode { get; set; }
public int StateCount { get; set; }
}
When I'm using the underlying data provider (Entity Framework, NHibernate, PetaPoco, etc) directly in my UI layer, I can easily do something like this:
IList<CountryListVM> list = db.Countries
.OrderBy(c => c.Name)
.Select(c => new CountryListVM() {
CountryId = c.CountryId,
Name = c.Name,
CountryCode = c.CountryCode,
StateCount = c.States.Count
})
.ToList();
But when I'm using a repository or service pattern, I abstract away direct access to the data layer. It seems as though my options are to:
Return the Country with a populated States collection, then map over in the UI layer. The downside to this approach is that I'm returning a lot more data than is actually needed.
-or-
Put all my view models into my Common dll library (as opposed to having them in the Models directory in my MVC app) and expand my repository to return specific view models instead of just the domain pocos. The downside to this approach is that I'm leaking UI specific stuff (MVC data validation annotations) into my previously clean POCOs.
-or-
Are there other options?
How are you handling these types of things?
It really depends on the project's architecture, but for what we do: usually we have services above the repositories that handle this logic for you. The service decides which repositories to use to load which data. The flow is UI -> Controller -> Service -> Repositories -> DB. The UI and/or Controllers have no knowledge of the repositories or their implementation.
Also, StateCount = c.States.Count would no doubt populate the States list anyway... wouldn't it? I'm pretty sure it will in NHibernate (with lazy loading causing an extra select to be sent to the DB).
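A rough sketch of what such a service might look like (the CountryService and StateRepository.CountByCountryId names are assumptions, and a per-country count is just one way to avoid materializing the States collection):

// Hypothetical service composing the repositories and building the view model.
public class CountryService
{
    private readonly CountryRepository _countryRepo;
    private readonly StateRepository _stateRepo; // assumed to expose a count query

    public CountryService(CountryRepository countryRepo, StateRepository stateRepo)
    {
        _countryRepo = countryRepo;
        _stateRepo = stateRepo;
    }

    public IList<CountryListVM> GetCountryList()
    {
        return _countryRepo.GetAll()
            .Select(c => new CountryListVM
            {
                CountryId = c.CountryId,
                Name = c.Name,
                CountryCode = c.CountryCode,
                // One count query per country; fine for short lists, otherwise
                // push the aggregate down into a single grouped SQL query.
                StateCount = _stateRepo.CountByCountryId(c.CountryId)
            })
            .ToList();
    }
}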
One option is to separate your queries from your existing infrastructure entirely. This would be an implementation of a CQRS design. In this case, you can issue a query directly to the database using a "thin read layer", bypassing your domain objects. Your existing objects and ORM are actually getting in your way, and CQRS allows you to have a "command side" that is separate and possibly a totally different set of tech from your "query side", where each is designed to do its own job without being compromised by the requirements of the other.
Yes, I'm quite literally suggesting leaving your existing architecture alone, and perhaps using something like Dapper to do this (beware of untested code sample) directly from your MVC controllers, for example:
int count = connection.Query<int>(
    "select count(*) from state where countryid = @countryid",
    new { countryid = 123 }).Single();
Honestly, your question has given me food for thought for a couple of days. More and more I tend to think that denormalization is the correct solution.
Look, the main point of domain driven design is to let the problem domain drive your modeling decisions. Consider the country entity in the real world. A country has a list of states. However, when you want to know how many states a certain country has, you don't go through the list of states in an encyclopedia and count them. You are more likely to look at the country's statistics and check the number of states there.
IMHO, the same behavior should be reflected in your domain model. You can hold this information in a property on the country, or introduce a kind of CountryStatistics object. Whichever approach you choose, it must be part of the country aggregate. Being inside the consistency boundary of the aggregate ensures that it holds consistent data when states are added or removed.
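A minimal sketch of that idea, assuming a richer Country aggregate than the plain POCO shown earlier (the member names are illustrative):

// The aggregate root keeps the count consistent inside its own boundary;
// adding or removing states goes through the root, never around it.
public class Country
{
    private readonly List<State> _states = new List<State>();

    public int CountryId { get; private set; }
    public string Name { get; private set; }
    public int NumberOfStates { get; private set; }

    public IReadOnlyCollection<State> States => _states.AsReadOnly();

    public void AddState(State state)
    {
        _states.Add(state);
        NumberOfStates++;
    }

    public void RemoveState(State state)
    {
        if (_states.Remove(state))
            NumberOfStates--;
    }
}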
Some other approaches:
- If the states collection is not expected to change a lot, you can allow a bit of denormalization - add a "NumberOfStates" property to the Country object. It will optimise the query, but you'll have to make sure the extra field holds the correct information.
- If you are using NHibernate, you can use ExtraLazyLoading - it will issue another select, but won't populate the whole collection when Count is called. More info here: nHibernate Collection Count