In the parent window there is an ObservableCollection, PendingPayment, that holds all pending payments of sales, including an "amount paid" column.
The user can then select a particular sale and open it in a new child window.
What is going wrong: if the user just edits the paid-amount text box in the child window and closes the window without saving the new paid amount to the database, the ObservableCollection backing the "amount paid" column in the parent window still gets updated.
What I want is for the collection to be updated only when the values have been updated in the database.
This can be achieved by creating a copy of your sale object when the user selects it in the list, and then using this copy as the view model of your child view.
You will then be able to set the new values on the original object in your list only once the save button has been clicked and the database update has succeeded.
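A minimal sketch of that copy-and-commit approach, reusing the Sale class and the hypothetical SalesDB.Update call shown further down (the SaleCopyViewModel name is just illustrative):
public class SaleCopyViewModel
{
    private readonly Sale _original;

    // The child window binds to this detached copy, so edits never touch the list item
    public Sale WorkingCopy { get; }

    public SaleCopyViewModel(Sale original)
    {
        _original = original;
        WorkingCopy = new Sale
        {
            PaidAmount = original.PaidAmount,
            Some = original.Some,
            More = original.More,
            Properties = original.Properties
        };
    }

    public void Save()
    {
        // Push the edits back to the original (and thus to the parent's collection)
        // only once the database update has succeeded
        if (SalesDB.Update(WorkingCopy))
        {
            _original.PaidAmount = WorkingCopy.PaidAmount;
            _original.Some = WorkingCopy.Some;
            _original.More = WorkingCopy.More;
            _original.Properties = WorkingCopy.Properties;
        }
    }
}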
Another way to proceed, if you only need to edit a few of the object's properties, is to create an editor object and use it as the child window's view model.
Something like this:
public class Sale
{
public int PaidAmount { get; set; }
public int Some { get; set; }
public int More { get; set; }
public int Properties { get; set; }
}
public class SaleEditor
{
private Sale _sale;
public int PaidAmount { get; set; }
public SaleEditor(Sale sale)
{
_sale = sale;
PaidAmount = sale.PaidAmount;
}
public void Save()
{
// update your data here
_sale.PaidAmount = PaidAmount;
}
}
If you need your original object to update the database, then the Save method could first update the object and then revert the change if the DB update fails:
public void Save()
{
var oldAmount = _sale.PaidAmount;
_sale.PaidAmount = PaidAmount;
if (!SalesDB.Update(_sale))
_sale.PaidAmount = oldAmount;
// you could also read back the value from DB
}
Whenever possible (I've never seen a reason why it can't be done), use proxy or flattened objects for listing purposes; you can implement this with a projection query. Then, when the user selects an item from the list, the only thing you need to grab is a key, which you use to load the full object with whatever object graph the use case dictates.
Here is a sample implementation using Entity Framework and C# lambda expressions:
Using an anonymous object:
var anonymousListProjection = DbContext.PendingPayments.Select(pp =>
    new { pp.Order, pp.Amount });
Using a hardcoded proxy:
var hardcodedListProjection = DbContext.PendingPayments.Select(pp =>
    new PendingPaymentProxy { Order = pp.Order, Amount = pp.Amount });
// To return an observable collection:
var observableColl = new ObservableCollection<PendingPaymentProxy>(
    hardcodedListProjection.ToList());
public class PendingPaymentProxy
{
public string Order { get; set; }
public decimal Amount { get; set; }
}
Apart from avoiding possible performance problems caused by unintentionally loading the real objects, this way you only have to worry about your list when the user saves in the detail view.
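To sketch the second half of that idea, once the user picks an item from the list you only need its key to load the full object graph. The DbContext.Sales set, the "Payments" navigation property, the selectedIndex variable and the use of Order as the key are assumptions for illustration:
// The proxy row the user clicked in the list (selectedIndex comes from the list control)
PendingPaymentProxy selected = observableColl[selectedIndex];

// Use the key to load the complete sale with whatever graph the use case needs
var fullSale = DbContext.Sales
    .Include("Payments")
    .FirstOrDefault(s => s.Order == selected.Order);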
Related
There is a client app with a ListBox containing Records sorted by the Record's Time attribute. On application start the client loads Records from the server and starts listening for server updates. When the server informs the client about a new Record, the client adds the Record to the ListBox (possibly with a Time earlier than the last displayed Record's Time). When the server informs it about an update or delete, the client finds the Record by ID and updates or deletes it (and when the Time of a Record changes, the order of the Records in the ListBox must change as well).
I guess something like a sorted dictionary with a comparer on the Time value is required.
public partial class RecordsForm : Form
{
    private System.Windows.Forms.ListBox recordsListBox;
    private SortedDictionary<long, Record> recordsDictionary;

    public RecordsForm()
    {
        InitializeComponent();
        // this is not working because the comparer must compare keys
        recordsDictionary = new SortedDictionary<long, Record>(new RecordComparer());
        var recordsBinding = new BindingSource();
        recordsBinding.DataSource = recordsDictionary;
        recordsListBox.DataSource = recordsBinding;
    }

    public void HandleCreateUpdate(Record record)
    {
        recordsDictionary[record.Id] = record;
    }

    public void HandleDelete(Record record)
    {
        if (recordsDictionary.ContainsKey(record.Id))
        {
            recordsDictionary.Remove(record.Id);
        }
    }
}
class Record {
public long Id { get; set; }
public DateTime Time { get; set; }
public String Title { get; set; }
}
class RecordComparer : Comparer<Record>
{
public override int Compare(Record left, Record right)
{
return left.Time.CompareTo(right.Time);
}
}
Or is there another pattern used for something like this?
EDIT: The list of Records is always sorted by Time descending. I want to synchronize this list across clients: when one client adds, edits or deletes a record, the other clients should reflect the change without reloading the entire list or iterating over all items.
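One possible direction, sketched under the assumption that a plain sorted list plus an Id lookup is acceptable (the RecordStore name and its members are invented for illustration, not an established pattern):
using System;
using System.Collections.Generic;

public class RecordStore
{
    private readonly List<Record> _sorted = new List<Record>();            // what the ListBox displays
    private readonly Dictionary<long, Record> _byId = new Dictionary<long, Record>();

    // Newest first; fall back to Id so ordering is deterministic for equal times
    private static int Compare(Record left, Record right)
    {
        int byTime = right.Time.CompareTo(left.Time);
        return byTime != 0 ? byTime : left.Id.CompareTo(right.Id);
    }

    public void CreateOrUpdate(Record record)
    {
        Delete(record.Id);                                                 // drop the old position, if any
        int index = _sorted.BinarySearch(record, Comparer<Record>.Create(Compare));
        _sorted.Insert(index < 0 ? ~index : index, record);
        _byId[record.Id] = record;
    }

    public void Delete(long id)
    {
        Record existing;
        if (_byId.TryGetValue(id, out existing))
        {
            _sorted.Remove(existing);
            _byId.Remove(id);
        }
    }

    public IReadOnlyList<Record> Items { get { return _sorted; } }
}
After each change the ListBox would need to be refreshed (or a BindingList<Record> substituted for the inner list if change notifications are preferred), so this is only a starting point rather than a complete synchronization answer.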
Let's say we have Realm results obtained with
RealmDb.All<Entry>();
Then I want to do some searching over those results using not-yet-supported techniques, like StartsWith on a function's return value or on a property which is not mapped in Realm, so I get a subset:
IEnumerable<Entry> subset = bgHaystack;
var results = subset.Where(entry => entry.Content.ToLower().StartsWith(needle));
To somehow get these back as part of RealmResults, I extract the entry IDs like this:
List<int> Ids = new List<int>();
foreach (Entry entry in entries)
{
Ids.Add(entry.Id);
}
return Ids;
and finally I want to return a subset of RealmResults (not IEnumerable) containing only those Entries that have those IDs. How can I do that? The IDE says the Contains method is not supported.
Can I use some kind of predicate or a comparer for that?
Entry is my model class
using System.ComponentModel.DataAnnotations.Schema;
using Realms;
using System.Collections.Generic;
using System.ComponentModel.DataAnnotations;
using System;
namespace Data.Models
{
[Table("entry")]
public class Entry : RealmObject
{
public class EntryType
{
public const byte Word = 1;
public const byte Phrase = 2;
public const byte Text = 3;
};
[Key]
[PrimaryKey]
[Column("entry_id")]
public int Id { get; set; }
[Column("user_id")]
public int UserId { get; set; }
[Column("source_id")]
public int SourceId { get; set; }
[Indexed]
[Column("type")]
public byte Type { get; set; }
[Column("rate")]
public int Rate { get; set; }
[Column("created_at")]
public string CreatedAt { get; set; }
[Column("updated_at")]
public string UpdatedAt { get; set; }
[NotMapped]
public Phrase Phrase { get; set; }
[NotMapped]
public Word Word { get; set; }
[NotMapped]
public Text Text { get; set; }
[NotMapped]
public IList<Translation> Translations { get; }
[NotMapped]
public string Content
{
get {
switch (Type)
{
case EntryType.Phrase:
return Phrase?.Content;
case EntryType.Word:
return Word?.Content;
case EntryType.Text:
return Text?.Content;
}
return "";
}
}
}
}
According to the documentation, Realm .NET supports LINQ, so that's promising. In your specific example, you indicate that StartsWith isn't supported, but the documentation on that page does list it as supported.
Now, your example makes clear that Entry is a RealmObject, so it's not clear where you'd get a RealmResult from (nor does the documentation on that page mention a RealmResult). Specifically, the home page indicates that you're really only ever going to work with Realm, RealmObject and Transaction, so I'm going to assume you meant that you'll end up with a resulting RealmObject, per their examples.
The way you presently have your data object set up, you're rather stuck calling it like you are (though if I may make a recommendation, here is a slightly simplified version):
var entries = RealmDb.All<Entry>().ToList();
var results = entries.Where(entry => entry.Content.ToLower().StartsWith(needle));
var ids = results.Select(a => a.Id).ToList();
Now, here is your big issue with just combining the filter predicate in line 2 with the end of line 1: Content itself is marked with a [NotMapped] attribute. Per the documentation again:
As a general rule, you can only create predicates with conditions that
rely on data in Realm. Imagine a class
class Person : RealmObject
{
// Persisted properties
public string FirstName { get; set; }
public string LastName { get; set; }
// Non-persisted property
public string FullName => FirstName + " " + LastName;
}
Given this class, you can create queries with conditions that apply to
the FirstName and LastName properties but not to the FullName
property. Likewise, properties with the [Ignored] attribute cannot be
used.
Because you're using [NotMapped], I've got to believe that's going to behave similarly to [Ignored], and further, because it's just a computed value, it's not something that Realm is going to be able to process as part of the query - it simply doesn't know about it, because you didn't map it to the information Realm is storing. Rather, you'll have to compute the Content property once you've actually got the instances of your Entry objects to enumerate through.
Similarly, I expect you'll have issues pulling values from Phrase, Word and Text since they're also not mapped, and thus not stored in the record within Realm (unless you're populating those in code you didn't post before executing your Where filter).
As such, you might instead consider storing separate records as a PhraseEntry, WordEntry, and TextEntry so you can indeed perform exactly that filter and execute it on Realm. What if you instead used the following?
public class Entry : RealmObject
{
[Key]
[PrimaryKey]
[Column("entry_id")]
public int Id { get; set; }
[Column("user_id")]
public int UserId { get; set; }
[Column("source_id")]
public int SourceId { get; set; }
[Column("rate")]
public int Rate { get; set; }
[Column("created_at")]
public string CreatedAt { get; set; }
[Column("updated_at")]
public string UpdatedAt { get; set; }
[Column("content")]
public string Content { get; set; }
[NotMapped]
public IList<Translation> Translations { get; }
}
[Table("wordEntry")]
public class WordEntry : Entry
{
}
[Table("phraseEntry")]
public class PhraseEntry : Entry
{
}
[Table("textEntry")]
public class TextEntry : Entry
{
}
And now, you can offload the filtering to Realm:
var wordEntries = RealmDb.All<WordEntry>().Where(entry =>
    entry.Content.StartsWith(needle, StringComparison.OrdinalIgnoreCase)).ToList();
var phraseEntries = RealmDb.All<PhraseEntry>().Where(entry => entry.Content.StartsWith(needle, StringComparison.OrdinalIgnoreCase)).ToList();
var textEntries = RealmDb.All<TextEntry>().Where(entry => entry.Content.StartsWith(needle, StringComparison.OrdinalIgnoreCase)).ToList();
var entries = new List<Entry>();
entries.AddRange(wordEntries);
entries.AddRange(phraseEntries);
entries.AddRange(textEntries);
var ids = entries.Select(entry => entry.Id).ToList();
It's not quite as brief as storing it all in one table, but I'm not immediately seeing any Realm documentation that indicates support for executing the same query against multiple tables simultaneously, so at least this would allow you to leave the filtering to the database and work against a more limited subset of values locally.
Finally, with all of that in place, let's get to the final question at the top. You indicate that you want to return a subset of your entries based on some collection of IDs you create. In the logic you provide, you're retrieving the Id properties of all your results, so there's really no further subset to pull.
That said, let's assume you have a separate list of ids that for whatever complicated reason, you were only able to derive after retrieving the list of Entry types from above (themselves all PhraseEntry, WordEntry or TextEntry objects).
At this point, since you've already pulled all the values from Realm and have them locally, just execute another Where statement against them. Because a List implements IEnumerable, you can thus execute the LINQ locally without any of the Realm restrictions:
var myLimitedIdSet = new List<int>()
{
10, 15, 20, 25 //Really complicated logic to narrow these down locally
};
var resultingEntries = entries.Where(entry => myLimitedIdSet.Contains(entry.Id)).ToList();
And you're set. You'll have only those entries that match the IDs listed in myLimitedIdSet.
Edit to address comment
You see this error because of the detail provided at the top of this page in the documentation. Specifically (and adapting to your code):
The first statement gives you a new instance of Entry of a class that implements IQueryable... This is standard LINQ implementation - you get an object representing the query. The query doesn't do anything until you made a further call that needs to iterate or count the results.
Your error then comes from taking the result of RealmDb.All<Entry>() and trying to cast it to an IEnumerable<Entry> so you can operate against it as though you have local data. Until you call ToList() on RealmDb.All<Entry>(), you simply have a LINQ representation of what the call will be, not the data itself. As such, when you further refine your results with a Where statement, you're actually adding that to a narrowed version of the IQueryable statement, which will also fail because you lack the appropriate mapping in the Realm dataset.
To skip the optimization I provided above, the following should resolve your issue here:
var bgHaystack = RealmDb.All<Entry>().ToList(); // now you have local data
var results = bgHaystack.Where(entry => entry.Content.ToLower().StartsWith(needle));
Unfortunately, given your provided code, I don't expect that you'll see any matches here unless needle is an empty string. Not only is your Content property not part of the Realm data (so you cannot filter on it within Realm), but your Phrase, Word and Text properties aren't mapped either. As a result, you will only ever see an empty string when reading your Content value.
You can further refine the results variable above to yield only those instances with a provided ID as you see fit with normal LINQ (as again, you'll have pulled the data from Realm in the first line).
var limitedIds = new List<int> { 10, 20, 30 };
var resultsLimitedById = results.Where(a => limitedIds.Contains(a.Id)).ToList();
I've updated my examples above to reflect the use of ToList() in the appropriate places as well.
I am very new to ASP.NET and C#, so bear with me. I am trying to return data from a database using the Entity Framework .Include() method so that I can get the foreign key information from another table. However, only part of the data is being returned; it seems to be cut off before everything comes back.
"[{"id":11,"name":"Mr. Not-so-Nice","heroType":3,"heroTypeNavigation":{"id":3,"type":"Villian","heroes":["
Which gives me the error: SyntaxError: Unexpected end of JSON input.
Please see below for the model classes and the GET section of the controller where this is being returned. If I remove the .Include() method, it returns all the heroes from the main table just fine.
public partial class Hero
{
public int Id { get; set; }
public string Name { get; set; }
public int? HeroType { get; set; }
public virtual HeroTypes HeroTypeNavigation { get; set; }
}
public partial class HeroTypes
{
public HeroTypes()
{
Heroes = new HashSet<Hero>();
}
public int Id { get; set; }
public string Type { get; set; }
public virtual ICollection<Hero> Heroes { get; set; }
}
// GET: api/Heroes
[HttpGet]
public async Task<ActionResult<IEnumerable<Hero>>> GetHeroesTable()
{
return await _context.HeroesTable.Include(hero => hero.HeroTypeNavigation).ToListAsync();
}
Serializer recursion rules will be tripping this up. Basically, as jonsca mentions, you have a circular reference between hero and hero type. The serializer starts with the hero, then goes to serialize the hero type, where it finds the Heroes collection and expects to serialize each hero, each of which references a hero type with its own collection of heroes. The serializer bails when it sees this.
I would recommend not passing entity classes back to your view, to avoid issues with EF and lazy loading. Serialization will iterate over properties, and this will trigger lazy loads. To avoid this, construct a view model for the details your view needs, flattened as necessary.
For example if you want to display a list of Heroes with their Type:
public class HeroViewModel
{
public int HeroId { get; set; }
public string Name { get; set; }
public string HeroType { get; set; }
}
to load:
var heroes = await _context.HeroesTable.Select(x => new HeroViewModel
{
    HeroId = x.Id,
    Name = x.Name,
    HeroType = x.HeroTypeNavigation.Type
}).ToListAsync();
You can utilize AutoMapper, for example, to help translate entities to view models without that explicit code, using ProjectTo<TDestination>, which can work with EF's IQueryable implementation.
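As a rough sketch of what that can look like (assuming a Hero-to-HeroViewModel map has been registered and that an injected IMapper instance is available as _mapper):
using AutoMapper.QueryableExtensions;

// AutoMapper builds the Select() projection, so EF only queries the mapped columns
var heroes = await _context.HeroesTable
    .ProjectTo<HeroViewModel>(_mapper.ConfigurationProvider)
    .ToListAsync();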
With larger, realistic domains your client likely won't need everything in the object graph.
You won't expose more information than you need to (i.e. visible via debugging tools).
You'll get a performance boost from not loading the entire graph or triggering lazy load calls, and it's less data across the wire.
The last point is a rather important one: with complex object graphs, SQL can do a lot of the lifting, resulting in a much more efficient query than loading "everything". Lazy hits to the database can easily add several seconds to each and every call from a client, and loading large graphs has memory implications on the server as well.
Suppose I have a model with 20 fields, and on my index page I want to list all models that are stored in my database.
On the index page, instead of listing all fields of the model, I only want to list 3 fields.
So, I make two classes:
class CompleteModel {
public int Id { get; set; }
public string Field01 { get; set; }
public string Field02 { get; set; }
public string Field03 { get; set; }
public string Field04 { get; set; }
public string Field05 { get; set; }
...
public string Field20 { get; set; }
}
now, in my Controller, I can use:
await _context.CompleteModel.ToListAsync();
but I feel that this is not the right way to do it, because I'm fetching all fields and using only 3 of them.
So, I made this code:
class ViewModel {
public string Field02 { get; set; }
public string Field04 { get; set; }
public string Field08 { get; set; }
}
var result = await _context.CompleteModel.Select(
    x => new {
        x.Field02,
        x.Field04,
        x.Field08
    }).ToListAsync();

var listResults = new List<ViewModel>();
if (result != null)
{
    listResults.AddRange(result.Select(x => new ViewModel
    {
        Field02 = x.Field02,
        Field04 = x.Field04,
        Field08 = x.Field08
    }));
}
I think this is a lot of code to do this.
First I select all the fields that I want, and then copy everything to another object.
Is there a more direct way to do the same thing?
Like:
_context.CompleteModel.Select(x => new ViewModel { Field02, Field04, Field08 });
You could use AutoMapper to reduce the boilerplate so you're not manually copying field values over.
If you include the AutoMapper NuGet package then you'd need to have the following in your startup somewhere to configure it for your classes:
Mapper.Initialize(cfg => cfg.CreateMap<CompleteModel, ViewModel>());
You could then do something like the following:
var results = await _context.CompleteModel.ToListAsync();
var viewModelResults = results.Select(Mapper.Map<ViewModel>).ToList();
There are a lot of configuration options for the package so do take a look at the documentation to see if it suits your needs and determine the best way to use it if it does.
In my view this is one of the weaknesses of over-abstraction and layering. The VM contains the data that is valuable to your application within its context of use (screen, process, etc.). The data model contains all the data that could be stored and might be relevant. At some point you need to match the two.
Use EF Projection to fetch only the data you need from the database into projected data model classes (using the EF POCO layer to define the query, but not to store the resultant data).
Map the projected classes onto your VM, if there is a naive mapping, using AutoMapper or similar. However, unless you are just writing CRUD screens, a simple field-by-field mapping is of little value; the data you fetch from your data store via EF is in its raw, probably relational form. The data required by your VM is probably not going to fit that form very neatly (again, unless you are doing a simple CRUD form), so you are going to need to add some value by coding the relationship between the data store and the view model.
I think concentrating on the number of lines of code leads to the wrong approach. Look at that code and ask, "is it adding any value?" If you can delegate the task to AutoMapper, then great; but if the data-model-to-VM copying can always be delegated, your VM isn't really pulling its weight beyond adding some validation annotations.
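For the simple case in the question, a minimal sketch of projecting straight into the view model (so no intermediate anonymous type or copy loop is needed) might be:
// EF translates this Select into SQL that reads only the three columns
var listResults = await _context.CompleteModel
    .Select(x => new ViewModel
    {
        Field02 = x.Field02,
        Field04 = x.Field04,
        Field08 = x.Field08
    })
    .ToListAsync();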
I am trying to update an entity that has a custom getter and setter, but that property is not being saved when I also edit any other property on the entity. I have a DTO, which maps onto the retrieved entity.
DTO:
public class MyEntityDto
{
public string Id { get; set; }
public string Name { get; set; }
public bool Enabled { get; set; }
public Dictionary<string, string> Settings { get; set; }
}
Entity:
public class MyEntity
{
public virtual Guid Id { get; set; }
public virtual string Name { get; set; }
public virtual bool Enabled { get; set; }
public Dictionary<string, string> Settings { get; set; }
public virtual string SettingsJson
{
get { return JsonConvert.SerializeObject(Settings); }
set
{
Settings = JsonConvert.DeserializeObject<Dictionary<string, string>>(value);
}
}
}
Update Function:
public async Task UpdateAsync(MyEntityDto dto)
{
var existingEntity = await _myRepository.GetByIdAsync(dto.Id);
existingEntity.Name = dto.Name;
existingEntity.Enabled = dto.Enabled;
existingEntity.Settings = dto.Settings;
await _myRepository.Update(existingEntity);
// save changes happens at end of request
}
What happens is my SettingsJson does not get updated. I have stepped through it and existingEntity.SettingsJson and existingEntity.Settings both have the correct data. Name and Enabled do get updated.
However, if I remove the assignments for Name and Enabled, SettingsJson does get updated.
It does work on entity creation, however; just not on update. I would also like to point out that I do not have access to the DbContext.
I'm guessing Entity Framework doesn't know it should update SettingsJson in the database because you never modify it directly in your code. Yes, you modify the Settings object which is used in the getter of SettingsJson, but Entity Framework does not track objects that way.
Try this after modifying the Settings object, but before doing the actual update:
_myDbContext.Entry(existingEntity).Property(e => e.SettingsJson).IsModified = true;
This code however assumes your DbContext is exposed, which may not be the case.
EDIT:
If you don't have access to the DbContext, you must directly set the SettingsJson property in order to have Entity Framework update it in the database. I understand that in your current implementation you never need to worry about updating both the actual dictionary and the JSON representation of the dictionary, but that's just not possible with Entity Framework. You'll have to try something like this:
public virtual string SettingsJson { get; set; } // just an auto-implemented property
...
existingEntity.Settings = dto.Settings;
existingEntity.SettingsJson = JsonConvert.SerializeObject(dto.Settings);
// save changes
Another solution could be to directly set SettingsJson in the setter of Settings, like so:
private Dictionary<string, string> settings; // private backing field

public Dictionary<string, string> Settings
{
    get { return settings; }
    set
    {
        settings = value;
        SettingsJson = JsonConvert.SerializeObject(value); // EF tracks this
    }
}
However, if you add elements to the dictionary after setting it, you need to re-serialize the dictionary in order to update the JSON string. You could solve this dynamically by using some sort of observable dictionary, but make sure it only serializes the dictionary once, when needed (before updating the database).
This is not really a solution but it's food for thought. Databases don't understand dictionaries. You know that, that's why you're using JSON to help you out. However, why not simply add a Setting table to your database model?
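Purely as an illustration of that idea (the names and the one-to-many shape are assumptions, not part of the original design), the dictionary could become a child table that EF tracks row by row:
using System;
using System.Collections.Generic;

public class MyEntitySetting
{
    public virtual Guid Id { get; set; }
    public virtual Guid MyEntityId { get; set; } // FK back to MyEntity
    public virtual string Key { get; set; }
    public virtual string Value { get; set; }
}

public class MyEntity
{
    public virtual Guid Id { get; set; }
    public virtual string Name { get; set; }
    public virtual bool Enabled { get; set; }

    // EF tracks adds, removes and edits of individual setting rows,
    // so no JSON serialization step is needed at all
    public virtual ICollection<MyEntitySetting> Settings { get; set; }
        = new List<MyEntitySetting>();
}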