Let's say we have Realm results taken with
RealmDb.All<Entry>();
Then I want to do some searching over those results using techniques that aren't supported yet, like StartsWith on a function's return value or on a property that isn't mapped in Realm, so I get a subset:
IEnumerable<Entry> subset = bgHaystack;
var results = subset.Where(entry => entry.Content.ToLower().StartsWith(needle));
To somehow get these back as part of RealmResults, I extract the entry IDs like this:
List<int> Ids = new List<int>();
foreach (Entry entry in entries)
{
Ids.Add(entry.Id);
}
return Ids;
Finally, I want to return a subset of RealmResults (not IEnumerable) containing only the Entries with those IDs. How can I do that? The IDE says the Contains method is not supported.
Can I use some kind of predicate or a comparer for that?
Entry is my model class
using System.ComponentModel.DataAnnotations.Schema;
using Realms;
using System.Collections.Generic;
using System.ComponentModel.DataAnnotations;
using System;
namespace Data.Models
{
[Table("entry")]
public class Entry : RealmObject
{
public class EntryType
{
public const byte Word = 1;
public const byte Phrase = 2;
public const byte Text = 3;
};
[Key]
[PrimaryKey]
[Column("entry_id")]
public int Id { get; set; }
[Column("user_id")]
public int UserId { get; set; }
[Column("source_id")]
public int SourceId { get; set; }
[Indexed]
[Column("type")]
public byte Type { get; set; }
[Column("rate")]
public int Rate { get; set; }
[Column("created_at")]
public string CreatedAt { get; set; }
[Column("updated_at")]
public string UpdatedAt { get; set; }
[NotMapped]
public Phrase Phrase { get; set; }
[NotMapped]
public Word Word { get; set; }
[NotMapped]
public Text Text { get; set; }
[NotMapped]
public IList<Translation> Translations { get; }
[NotMapped]
public string Content
{
get {
switch (Type)
{
case EntryType.Phrase:
return Phrase?.Content;
case EntryType.Word:
return Word?.Content;
case EntryType.Text:
return Text?.Content;
}
return "";
}
}
}
}
According to the documentation, Realm .NET supports LINQ, so that's promising. In your specific example, you indicate that StartsWith isn't supported, but I see it listed as supported on that page, specifically here.
Now, your example makes clear that Entry is a RealmObject, so it's not clear where you'd possibly get a RealmResult from (nor does their documentation on that page mention a RealmResult). Specifically, the home page indicates that you're really only going to ever work with Realm, RealmObject and Transaction, so I'm going to just assume that you meant that you'll need a resulting RealmObject per their examples.
The way you presently have your data object set up, you're rather stuck calling it like you are, though if I may make a recommendation to simplify it a little bit:
var entries = RealmDb.All<Entry>().ToList();
var results = entries.Where(entry => entry.Content.ToLower().StartsWith(needle));
var ids = results.Select(a => a.Id).ToList();
Now, the big issue with just combining the filter predicate on the second line with the query on the first: Content itself is marked with a [NotMapped] attribute. Per the documentation again:
As a general rule, you can only create predicates with conditions that
rely on data in Realm. Imagine a class
class Person : RealmObject
{
// Persisted properties
public string FirstName { get; set; }
public string LastName { get; set; }
// Non-persisted property
public string FullName => FirstName + " " + LastName;
}
Given this class, you can create queries with conditions that apply to
the FirstName and LastName properties but not to the FullName
property. Likewise, properties with the [Ignored] attribute cannot be
used.
Because you're using [NotMapped], I've got to believe that's going to behave similarly to [Ignored] and further, because it's just a computed value, it's not something that Realm is going to be able to process as part of the query - it simply doesn't know it because you didn't map it to the information Realm is storing. Rather, you'll have to compute the Content property when you've actually got the instances of your Entry objects to enumerate through.
Similarly, I expect you'll have issues pulling values from Phrase, Word and Text since they're also not mapped, and thus not stored in the record within Realm (unless you're populating those in code you didn't post before executing your Where filter).
As such, you might instead consider storing separate records as a PhraseEntry, WordEntry, and TextEntry so you can indeed perform exactly that filter and execute it on Realm. What if you instead used the following?
public class Entry : RealmObject
{
[Key]
[PrimaryKey]
[Column("entry_id")]
public int Id { get; set; }
[Column("user_id")]
public int UserId { get; set; }
[Column("source_id")]
public int SourceId { get; set; }
[Column("rate")]
public int Rate { get; set; }
[Column("created_at")]
public string CreatedAt { get; set; }
[Column("updated_at")]
public string UpdatedAt { get; set; }
[Column("content")]
public string Content { get; set; }
[NotMapped]
public IList<Translation> Translations { get; }
}
[Table("wordEntry")]
public class WordEntry : Entry
{
}
[Table("phraseEntry")]
public class PhraseEntry : Entry
{
}
[Table("textEntry")]
public class TextEntry : Entry
{
}
And now, you can offload the filtering to Realm:
var wordEntries = RealmDb.All<WordEntry>().Where(entry =>
entry.Content.StartsWith(needle, StringComparison.OrdinalIgnoreCase)).ToList();
var phraseEntries = RealmDb.All<PhraseEntry>().Where(entry => entry.Content.StartsWith(needle, StringComparison.OrdinalIgnoreCase)).ToList();
var textEntries = RealmDb.All<TextEntry>().Where(entry => entry.Content.StartsWith(needle, StringComparison.OrdinalIgnoreCase)).ToList();
var entries = new List<Entry>();
entries.AddRange(wordEntries);
entries.AddRange(phraseEntries);
entries.AddRange(textEntries);
var ids = entries.Select(entry => entry.Id).ToList();
It's not quite as brief as storing it all in one table, but I'm not immediately seeing any Realm documentation that indicates support for executing the same query against multiple tables simultaneously, so at least this would allow you to leave the filtering to the database and work against a more limited subset of values locally.
Finally, with all that covered, let's address the final question from the top of your post. You indicate that you want to return a subset of your entries based on some collection of IDs you create. In the logic you provide, you're retrieving the Id property of every result, so there's really no further subset to pull.
That said, let's assume you have a separate list of ids that for whatever complicated reason, you were only able to derive after retrieving the list of Entry types from above (themselves all PhraseEntry, WordEntry or TextEntry objects).
At this point, since you've already pulled all the values from Realm and have them locally, just execute another Where statement against them. Because a List implements IEnumerable, you can thus execute the LINQ locally without any of the Realm restrictions:
var myLimitedIdSet = new List<int>()
{
10, 15, 20, 25 //Really complicated logic to narrow these down locally
};
var resultingEntries = entries.Where(entry => myLimitedIdSet.Contains(entry.Id)).ToList();
And you're set. You'll have only those entries that match the IDs listed in myLimitedIdSet.
Edit to address comment
You see this error because of the detail provided at the top of this page in the documentation. Specifically (and adapting to your code):
The first statement gives you a new instance of a class that implements IQueryable<Entry>... This is standard LINQ implementation - you get an object representing the query. The query doesn't do anything until you make a further call that needs to iterate or count the results.
Your error comes from taking the result of RealmDb.All<Entry>() and trying to cast it to an IEnumerable<Entry> so you can operate against it as though you have local data. Until you call ToList() on RealmDb.All<Entry>(), you simply have a LINQ representation of what the call will be, not the data itself. As such, when you further refine your results with a Where statement, you're actually adding that to a narrowed version of the IQueryable statement, which will also fail because you lack the appropriate mapping in the Realm dataset.
To skip the optimization I provided above, the following should resolve your issue here:
var bgHaystack = RealmDb.All<Entry>().ToList(); // Now you have local data
var results = bgHaystack.Where(entry => entry.Content.ToLower().StartsWith(needle));
Unfortunately, given your provided code, I don't expect that you'll see any matches here unless needle is an empty string. Not only is your Content property not part of the Realm data (so you cannot filter on it within Realm), but your Phrase, Word and Text properties aren't mapped either. As a result, you will only ever see an empty string when getting your Content value.
You can further refine the results variable above to yield only those instances with a provided ID as you see fit with normal LINQ (as again, you'll have pulled the data from Realm in the first line).
var limitedIds = new List<int>{10, 20, 30};
var resultsLimitedById = results.Where(a => limitedIds.Contains(a.Id)).ToList();
I've updated my examples above to reflect the use of ToList() in the appropriate places as well.
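Putting the pieces together, here is a minimal end-to-end sketch of the local-filtering approach (this assumes the single-table Entry model from the question, and that Phrase, Word and Text are populated before filtering, since otherwise Content will always be empty):

// Materialize the Realm query once; everything after this is LINQ to Objects.
var bgHaystack = RealmDb.All<Entry>().ToList();

// Free-form filtering on the computed Content property now works locally.
var matches = bgHaystack
    .Where(entry => entry.Content.ToLower().StartsWith(needle))
    .ToList();

// Narrow further by whatever ID set you derive later.
var limitedIds = new List<int> { 10, 20, 30 };
var resultingEntries = matches
    .Where(entry => limitedIds.Contains(entry.Id))
    .ToList();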
Hi there to the good friends of SO!
This is more of a design question so I'll get into a detailed example.
Let me explain the way we're sending emails.
In various parts of the application, we create entries in our Notification table for different kinds of email we might have to send.
For example, the NotificationQueue table looks like this:
NotificationQueueID  OrderID  EmailType          Notes      SentDatetime
1                    461196   OrderUpdate        SomeNote1  2020-09-01 14:45:13.153
2                    461194   OrderCancellation  SomeNote2  2020-09-01 14:45:13.153
It's accessed using the property in the DbContext as:
public DbSet<NotificationQueue> NotificationQueues { get; set; }
The different types of email are modeled in an enum:
public enum TypeOfEmail
{
OrderCancellation,
OrderUpdate
}
We have an EmailModel class with a TicketsInNotificationQueue property that holds a list of any one of the email types we have. For example, at any given time it can hold a list of either UpdatedTickets or CancelledTickets. The EmailType says what type of tickets are in the TicketsInNotificationQueue property.
public class EmailModel
{
public EmailModel(TypeOfEmail emailType, TicketsInNotificationQueue ticketsInNotificationQueue)
{
EmailType = emailType;
TicketsInNotificationQueue = ticketsInNotificationQueue;
}
public TypeOfEmail EmailType { get; set; }
public TicketsInNotificationQueue TicketsInNotificationQueue { get; set; }
}
public class TicketsInNotificationQueue
{
public List<OrderCancellation> CancelledTickets { get; set; }
public List<OrderUpdate> UpdatedTickets { get; set; }
}
public class OrderCancellation : CommonOrderInformation
{
public string SomeOrderId { get; set; }
}
public class OrderUpdate: CommonOrderInformation
{
public string SomeUpdateRelatedProperty { get; set; }
}
public class CommonOrderInformation
{
public int NotificationQueueId { get; set; }
public string ReferenceNumber { get; set; }
}
There's a method that retrieves tickets from Notification table:
public async Task<TicketsInNotificationQueue> GetTicketsfromNotificationQueueAsync(TypeOfEmail emailType)
{
var ticketsInNotificationQueue = new TicketsInNotificationQueue();
using (var dbCon = GetSomeDbContext())
{
var notifications = dbCon.NotificationQueues.Where(x => x.EmailType == emailType.ToString()).ToList();
foreach (var ntf in notifications)
{
if (ntf.EmailType == TypeOfEmail.OrderCancellation.ToString())
{
if (ticketsInNotificationQueue.CancelledTickets == null)
{
ticketsInNotificationQueue.CancelledTickets = new List<OrderCancellation>();
}
ticketsInNotificationQueue.CancelledTickets.Add(new OrderCancellation()
{
NotificationQueueId = ntf.NotificationQueueID,
ReferenceNumber = ntf.OrderID,
SomeOrderId = "Something from a table."
});
}
else if (ntf.EmailType == TypeOfEmail.OrderUpdate.ToString())
{
if (ticketsInNotificationQueue.UpdatedTickets == null)
{
ticketsInNotificationQueue.UpdatedTickets = new List<OrderUpdate>();
}
var notes = dbCon.NotificationQueues.FirstOrDefault(x => x.NotificationQueueID == ntf.NotificationQueueID)?.Notes;
ticketsInNotificationQueue.UpdatedTickets.Add(new OrderUpdate()
{
NotificationQueueId = ntf.NotificationQueueID,
ReferenceNumber = ntf.OrderID,
SomeUpdateRelatedProperty = "Something from a table."
});
}
}
}
return ticketsInNotificationQueue;
}
Now I just take this list, filter out the notificationIds for the type of tickets I just received, and work on them down the line (I need those notificationIds to set the SentDatetime after the notification has been sent).
var ticketsReceived = false;
notificationIds = new List<int>();
if (ticketsInNotificationQueue.CancelledTickets != null && ticketsInNotificationQueue.CancelledTickets.Any())
{
ticketsReceived = true;
notificationIds = ticketsInNotificationQueue.CancelledTickets.Select(x => x.NotificationQueueId).ToList();
}
else if (ticketsInNotificationQueue.UpdatedTickets != null && ticketsInNotificationQueue.UpdatedTickets.Any())
{
ticketsReceived = true;
notificationIds = ticketsInNotificationQueue.UpdatedTickets.Select(x => x.NotificationQueueId).ToList();
}
if (ticketsReceived)
{
// Proceed with the process of sending the email, and setting the `SentDateTime`
}
The problem I see here is that as the number of email types grows, say to 10-20, the method to retrieve tickets and the later filtering will grow so large that readability and code manageability spin out of control, which I don't like at all. The worst part is having to check which emailType was requested in the fetch and which emailType was received (to get the corresponding notificationIds for the SentDateTime update).
So is there some other way to design this workflow (I'm even open to using reflection and such) to make it more manageable and concise?
Any help would be greatly appreciated!
There are significant improvements that you can make to the existing system and code. In the interest of a more complete answer, I'm going to recommend a not-too-expensive system overhaul and then proceed to your exact question.
A different and industry standard approach
You already have the data structure right; this is a perfect job for distributed, persistent queues, where you don't need to worry about querying the database as much - instead you just enqueue the messages and have a processor that deals with them. Since you're using C# and .NET, I strongly encourage you to check out Azure Service Bus. It is effectively a large queue where you can send messages (in your case, send-email requests), and you can enqueue your messages to different channels in the service bus depending on their type.
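As a rough illustration, here is a minimal sketch of what enqueuing an email request could look like with the Azure.Messaging.ServiceBus client; the connection string, queue name and payload shape are all assumptions for the sketch, not part of your design:

using System.Text.Json;
using Azure.Messaging.ServiceBus;

// Assumed connection string, and an assumed queue (or topic) per email type.
await using var client = new ServiceBusClient("<service-bus-connection-string>");
ServiceBusSender sender = client.CreateSender("order-cancellation-emails");

// The processor on the other end deserializes this and sends the actual email.
var payload = new { OrderId = 461194, Notes = "SomeNote2" };
var message = new ServiceBusMessage(JsonSerializer.Serialize(payload));

await sender.SendMessageAsync(message);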
You could also look into creating a queue processor; Azure Functions have a trigger for this out of the box. Once your email is sent, you can then write to your DB that the email was sent.
So, the good design looks like this:
Have distributed, persistent queues and channels; enqueue the email requests to them directly.
If you want to process them at a cadence, run your processor using cron - which most industry solutions support.
If you want to process them as they end up in the queue, use a trigger.
You can enrich your processor based on your scenario; it looks like it has something to do with orders, so you may need to handle cases like not sending an already-queued email after an order is cancelled, etc.
Improving what you have
Due to some circumstances, the solution above might not be available to you - so let's get to it.
See how to refactor switch statements (since you effectively have one, built with if / else ifs):
https://sourcemaking.com/refactoring/smells/switch-statements
Ways to eliminate switch in code
You could get this through polymorphism: create a base mail type and override the behavior in subclasses. This way you can associate the correct queue with the correct email type.
Example:
var results = await getSomeEmails(OrderMail);
// returns a separate processor inherited from the base one, implemented in different ways.
var processor = ProcessorFactory.Create(OrderMail);
await processor.Send(results);
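One way this could look in practice is sketched below; the IEmailProcessor interface, the concrete processor classes and the ProcessorFactory are hypothetical names used for illustration, not types from your code:

using System;
using System.Collections.Generic;
using System.Threading.Tasks;

public interface IEmailProcessor
{
    // Each processor knows how to build and send its own kind of email.
    Task SendAsync(IEnumerable<CommonOrderInformation> tickets);
}

public class OrderCancellationProcessor : IEmailProcessor
{
    public Task SendAsync(IEnumerable<CommonOrderInformation> tickets)
    {
        // Build and send the cancellation email here.
        return Task.CompletedTask;
    }
}

public class OrderUpdateProcessor : IEmailProcessor
{
    public Task SendAsync(IEnumerable<CommonOrderInformation> tickets)
    {
        // Build and send the update email here.
        return Task.CompletedTask;
    }
}

public static class ProcessorFactory
{
    public static IEmailProcessor Create(TypeOfEmail emailType) => emailType switch
    {
        TypeOfEmail.OrderCancellation => new OrderCancellationProcessor(),
        TypeOfEmail.OrderUpdate => new OrderUpdateProcessor(),
        _ => throw new ArgumentOutOfRangeException(nameof(emailType))
    };
}

Adding a new email type then means adding one subclass and one line in the factory, rather than growing the if / else chain.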
Some more improvements
foreach (var ntf in notifications)
{
if (ntf.EmailType == TypeOfEmail.OrderCancellation.ToString())
You are unnecessarily checking the email type over and over again in this loop; you should move those checks above the foreach and test the passed-in parameter instead, since you already know the type you're querying for.
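A rough sketch of what that hoisting could look like, reusing the shapes from your question (same method, same types; nothing new beyond the restructuring):

var notifications = dbCon.NotificationQueues
    .Where(x => x.EmailType == emailType.ToString())
    .ToList();

// The type check happens once, on the parameter, instead of once per row.
if (emailType == TypeOfEmail.OrderCancellation)
{
    ticketsInNotificationQueue.CancelledTickets = notifications
        .Select(ntf => new OrderCancellation
        {
            NotificationQueueId = ntf.NotificationQueueID,
            ReferenceNumber = ntf.OrderID,
            SomeOrderId = "Something from a table."
        })
        .ToList();
}
else if (emailType == TypeOfEmail.OrderUpdate)
{
    ticketsInNotificationQueue.UpdatedTickets = notifications
        .Select(ntf => new OrderUpdate
        {
            NotificationQueueId = ntf.NotificationQueueID,
            ReferenceNumber = ntf.OrderID,
            SomeUpdateRelatedProperty = "Something from a table."
        })
        .ToList();
}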
Thank you for the answer, @Mavi Domates.
But this is what I ended up doing:
I modified the EmailModel's TicketsInNotificationQueue property so that instead of having different classes for the different types of email, we just have one common class. This avoids the checks for what kind of email was requested in the fetch logic, and also makes it easy to retrieve the notification IDs down the line (to update SentDateTime after the email is sent), as indicated in the original question.
public class EmailModel
{
public EmailModel(TypeOfEmail emailType, IEnumerable<CommonEmailModel> ticketsInNotificationQueue)
{
EmailType = emailType;
TicketsInNotificationQueue = ticketsInNotificationQueue;
}
public TypeOfEmail EmailType { get; set; }
public IEnumerable<CommonEmailModel> TicketsInNotificationQueue { get; set; }
}
public enum TypeOfEmail
{
OrderCancellation,
OrderUpdate
}
I added a new class called CommonEmailModel and removed all those different email-type classes (OrderCancellation, OrderUpdate, etc.).
public class CommonEmailModel
{
// Common to all email types. A lot of email types only need these first 4 properties
public string EmailType { get; set; }
public int NotificationQueueId { get; set; }
public string OrderId { get; set; }
public string Notes { get; set; }
// Cancellation related
public string SomeOrderId { get; set; }
// Update related
public string SomeUpdateRelatedProperty { get; set; }
public static async Task<IEnumerable<CommonEmailModel>> GetEmailBodyRecordsAsync(TypeOfEmail emailType)
{
var emailModels = new List<CommonEmailModel>();
var emailEntries = await EmailNotificationQueue.GetEmailEntriesAsync(emailType);
var relevantOrdIds = emailEntries.Select(x => x.OrderID).Distinct().ToList();
using (var dbCon = GetSomeDbContext())
{
// Load the related orders once; the using block wraps the loop below so 'orders' stays in scope.
var orders = dbCon.Orders.Where(x => relevantOrdIds.Contains(x.OrdNumber)).ToList();
foreach (var record in emailEntries)
{
var emailModel = new CommonEmailModel
{
EmailType = emailType.ToString(),
NotificationQueueId = record.NotificationQueueID,
OrderId = record.OrderID,
Notes = record.Notes,
SomeOrderId = orders?.FirstOrDefault(o => o.OrdNumber == record.OrderID)?.SomeOrderIdINeed,
SomeUpdateRelatedProperty = orders?.FirstOrDefault(o => o.OrdNumber == record.OrderID)?.UpdateRelatedPropertyINeed
};
emailModels.Add(emailModel);
}
}
return emailModels;
}
}
I just get the records the following way:
var emailRecords = await CommonEmailModel.GetEmailBodyRecordsAsync(emailType);
And I simply pass this to the EmailModel constructor as the ticketsInNotificationQueue parameter. There's no need for all those extra checks to figure out whether records of a certain emailType were requested. The views for OrderCancellation and OrderUpdate will use the common properties, plus whichever of the relevant properties in the CommonEmailModel class apply to them.
if (emailRecords.Any())
{
var emailModel = new EmailModel(emailType, emailRecords);
}
Now all I have to do is pass the notification Ids to a method that marks the SentDateTime column with the current timestamp by simply calling:
if (emailWasSent)
{
await UpdateNotificationSentTimeAsync(emailRecords.Select(t => t.NotificationQueueId));
}
In the future, if we keep adding new email types (most probably they'll carry their information in those first 4 common properties of CommonEmailModel), we can simply add new properties to CommonEmailModel to accommodate them and just create a new view. This way I avoid code repetition and complexity in the fetch, and also at the end when updating SentDateTime.
I am very new to ASP.NET and C#, so bear with me. I am trying to return data from a database using the Entity Framework .Include() method so that I can get the foreign key information from another table. However, what is being returned is only part of the data. It seems to be cut off before everything is returned.
"[{"id":11,"name":"Mr. Not-so-Nice","heroType":3,"heroTypeNavigation":{"id":3,"type":"Villian","heroes":["
Which gives me the error: SyntaxError: Unexpected end of JSON input.
Please see below for the model classes and the GET section of the controller where this is being returned. If I remove the .Include() method, it returns all the heroes from the main table just fine.
public partial class Hero
{
public int Id { get; set; }
public string Name { get; set; }
public int? HeroType { get; set; }
public virtual HeroTypes HeroTypeNavigation { get; set; }
}
public partial class HeroTypes
{
public HeroTypes()
{
Heroes = new HashSet<Hero>();
}
public int Id { get; set; }
public string Type { get; set; }
public virtual ICollection<Hero> Heroes { get; set; }
}
// GET: api/Heroes
[HttpGet]
public async Task<ActionResult<IEnumerable<Hero>>> GetHeroesTable()
{
return await _context.HeroesTable.Include(hero => hero.HeroTypeNavigation).ToListAsync();
}
Serializer recursion rules will be tripping this up. Basically, as jonsca mentions, you have a circular reference between Hero and HeroTypes. The serializer will start with the hero, then go to serialize the hero type, where it will find the Heroes collection and expect to serialize it; each of those heroes would again reference a hero type, with its own collection of heroes, and so on. The serializer bails when it sees this.
I would recommend not passing entity classes back to your view, to avoid issues with EF and lazy loading. Serialization will iterate over properties, and this will trigger lazy loads. Instead, construct a view model for the details your view needs, flattening as necessary.
For example if you want to display a list of Heroes with their Type:
public class HeroViewModel
{
public int HeroId { get; set; }
public string Name { get; set; }
public string HeroType { get; set; }
}
to load:
var heroes = await _context.HeroesTable.Select(x => new HeroViewModel
{
HeroId = x.Id,
Name = x.Name,
HeroType = x.HeroTypeNavigation.Type
}).ToListAsync();
You can utilize AutoMapper, for example, to help translate entities to view models without that explicit code, using ProjectTo<TDestination>, which works with EF's IQueryable implementation.
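For illustration, here is a minimal sketch of how that ProjectTo call could be wired up for the HeroViewModel above; the member mappings are assumptions based on the models in the question:

using AutoMapper;
using AutoMapper.QueryableExtensions;
using Microsoft.EntityFrameworkCore;

var config = new MapperConfiguration(cfg =>
    cfg.CreateMap<Hero, HeroViewModel>()
       .ForMember(vm => vm.HeroId, opt => opt.MapFrom(h => h.Id))
       .ForMember(vm => vm.HeroType, opt => opt.MapFrom(h => h.HeroTypeNavigation.Type)));

// ProjectTo composes the mapping into the IQueryable, so EF translates it
// into a SELECT of just the projected columns instead of loading full entities.
var heroes = await _context.HeroesTable
    .ProjectTo<HeroViewModel>(config)
    .ToListAsync();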
With larger, realistic domains your client likely won't need everything in the object graph.
You won't expose more information than you need to (i.e. what's visible via debugging tools).
You'll get a performance boost from not loading the entire graph or triggering lazy-load calls, and it's less data across the wire.
The last point is a rather important one as with complex object graphs, SQL can do a lot of the lifting resulting in a much more efficient query than loading "everything". Lazy hits to the database can easily add several seconds to each and every call from a client, and loading large graphs has a memory implication on the servers as well.
I'm writing a system to track observation values from sensors (e.g. temperature, wind direction and speed) at different sites. I'm writing it in C# (within VS2015) using a code-first approach. Although I've got a reasonable amount of programming experience, I'm relatively new to C# and the code-first approach.
I've defined my classes as below. I've built a REST API to accept observation readings through POST, which has driven my desire to have Sensor keyed by a string rather than an integer - some sensors have their own unique identifier built in. Otherwise, I'm trying to follow the Microsoft Contoso University example (instructors - courses - enrolments).
What I am trying to achieve is a page for a specific site with a list of the sensors at the site, and their readings. Eventually this page will present the data in graphical form, but for now I'm just after the raw data.
public class Site
{
public int Id { get; set; }
public string Name { get; set; }
public ICollection<Sensor> Sensors { get; set; }
}
public class Sensor
{
[Key]
public string SensorName { get; set; }
public int SensorTypeId { get; set; }
public int SiteId { get; set; }
public ICollection<Observation> Observations { get; set; }
}
public class Observation
{
public int Id { get; set; }
public string SensorName { get; set; }
public float ObsValue { get; set; }
public DateTime ObsDateTime { get; set; }
}
and I've created a View Model for the page I'm going to use...
public class SiteDataViewModel
{
public Site Site { get; set; }
public IEnumerable<Sensor> Sensors { get; set;}
public IEnumerable<Observation> Observations { get; set; }
}
and then I try to join up the 3 classes into that view model in SiteController.cs...
public actionresult Details()
var viewModel.Site = _context.Sites
.Include(i => i.Sensors.select(c => c.Observations));
I used to get an error about "cannot convert lambda expression to type string", but then I included "using System.Data.Entity;" and the error changed to two errors... On the 'Include', I get "cannot resolve method 'include(lambda expression)'...", and on the 'select' I get "ICollection does not include a definition for select...".
There's probably all sorts of nastiness going on, but if someone could explain where the errors are (and more importantly why they are errors), then I'd be extremely grateful.
Simply, you can use it like this:
viewModel.Site = _context.Sites
.Include("Sensors").Include("Sensors.Observations");
Hope this helps.
The way your ViewModel is set up, you're going to have 3 unrelated sets of data: sites, sensors, and observations. Sites will have no inherent relation to sensors - you'll have to manually match them on the foreign key. Realistically, your ViewModel should just be a list of Sites. You want to do
@Model.Sites[0].Sensors[0].Observations[0]
not something convoluted like
var site = @Model.Sites[0]; var sensor = @Model.Sensors.Where(s => s.SiteId == site.Id).Single(); etc...
Try doing
viewModel.Sites = _context.Sites.Include("Sensors.Observations").ToList();
Eager-loading multiple levels of EF Relations can be accomplished in just one line.
One of the errors you reported receiving, by the way, is because you're using 'select' instead of 'Select'
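For reference, since the question includes "using System.Data.Entity;" (so EF6), the strongly-typed equivalent of the string-based Include would look something like the sketch below; siteId is an assumed parameter, and on EF Core you would use .Include(s => s.Sensors).ThenInclude(se => se.Observations) instead:

using System.Data.Entity; // brings the lambda Include extension into scope
using System.Linq;

// Eager-load Sensors and their Observations for one site (note the capital S in Select).
var site = _context.Sites
    .Include(s => s.Sensors.Select(c => c.Observations))
    .SingleOrDefault(s => s.Id == siteId);

viewModel.Site = site;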
And lastly, be aware that eager-loading like this can produce a huge amount of in-memory data. Consider splitting up your calls for each relation, such that you display a list of Sensors, and clicking, say, a dropdown will call an API that retrieves a list of Sites, etc. This is a bit more streamlined, and it prevents you from getting held up because your page is loading so much information.
Update
I've created a sample application for you that you can browse and look through. Data is populated in the Startup.Configure method, and retrieved in the About.cshtml.cs file and the About.cshtml page. This produces this page, which is what you're looking for, I believe.
Suppose I have a model with 20 fields, and on my index page I want to list all models that are stored in my database.
On the index page, instead of listing all fields of the model, I only want to list 3 fields.
So, I made two classes:
class CompleteModel {
public int Id { get; set; }
public string Field01 { get; set; }
public string Field02 { get; set; }
public string Field03 { get; set; }
public string Field04 { get; set; }
public string Field05 { get; set; }
...
public string Field20 { get; set; }
}
Now, in my controller, I can use:
await _context.CompleteModel.ToListAsync();
but that does not seem to be the right way to do it, because I'm retrieving all the fields while using only 3.
So, I made this code:
class ViewModel {
public string Field02 { get; set; }
public string Field04 { get; set; }
public string Field08 { get; set; }
}
var result = await _context.CompleteModel.Select(
x => new {
x.Field02,
x.Field04,
x.Field08
}).ToListAsync();
var listResults = new List<ViewModel>();
if (result != null)
{
listResults.AddRange(result.Select(x => new ViewModel
{
Field02 = x.Field02,
Field04 = x.Field04,
Field08 = x.Field08
}));
}
I think this is a lot of code to do this.
First, I selected all the fields that I want, then copied everything to another object.
Is there a "more direct" way to do the same thing?
Like:
_context.CompleteModel.Select(x => new ViewModel { Field02, Field04, Field08 });
You could use AutoMapper to reduce the boilerplate so you're not manually copying field values over.
If you include the AutoMapper NuGet package then you'd need to have the following in your startup somewhere to configure it for your classes:
Mapper.Initialize(cfg => cfg.CreateMap<CompleteModel, ViewModel>());
You could then do something like the following:
var results = await _context.CompleteModel.ToListAsync();
var viewModelResults = results.Select(Mapper.Map<ViewModel>).ToList();
There are a lot of configuration options for the package so do take a look at the documentation to see if it suits your needs and determine the best way to use it if it does.
In my view this is one of the weaknesses of over-abstraction and layering. The VM contains the data that is valuable to your application within the context of use (screen, process, etc.). The data model contains all the data that could be stored and might be relevant. At some point you need to match the two.
Use EF Projection to fetch only the data you need from the database into projected data model classes (using the EF POCO layer to define the query, but not to store the resultant data).
Map the projected classes onto your VM, if there is a naive mapping, using AutoMapper or similar. However, unless you are just writing CRUD screens, a simple field-by-field mapping is of little value; the data you fetch from your data store via EF is in its raw, probably relational form. The data required by your VM is probably not going to fit that form very neatly (again, unless you are doing a simple CRUD form), so you are going to need to add some value by coding the relationship between the data store and the view model.
I think concentrating on the count of lines of code would lead to the wrong approach. I think you can look at that code and ask "is it adding any value". If you can delegate the task to Automapper, then great; but your VM isn't really pulling its weight other than adding some validation annotation if you can consistently delegate the task of data model to VM data copying.
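To make the projection point above concrete, here is a minimal sketch of projecting straight into the view model in a single query (this assumes the ViewModel class from the question and an EF async context):

var results = await _context.CompleteModel
    .Select(x => new ViewModel
    {
        Field02 = x.Field02,
        Field04 = x.Field04,
        Field08 = x.Field08
    })
    .ToListAsync();

EF translates this into a SELECT of just those three columns, so nothing else is pulled from the database or copied afterwards.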
I'm using MongoDB to store some data. Documents have some mandatory fields and a set of optional ones. There can be any number of optional fields (this is metadata):
class DataItem {
public int id { get; set; }
public string Comment { get; set; }
[BsonExtraElementsAttribute]
public BsonDocument Metadata { get; set; }
}
Metadata field names might be different for different documents, so I do not know these names.
I need to query such documents where any field of Metadata contains a particular value.
I tried:
var query = "<some value>";
var res = collection.Find(di => di.Metadata.ContainsValue(BsonValue.Create(query))).ToListAsync();
But this code throws an exception because the ContainsValue() method is not supported there. When I try this:
var res = collection.Find(di => di.Metadata.Values.Contains(BsonValue.Create(query))).ToListAsync();
an empty result set is returned. I think the problem is in [BsonExtraElementsAttribute], but I cannot change it. Is there a way to run such a query?