I have the following model:
public partial class location
{
    public int Id { get; set; }
    public double Lat { get; set; }
    public double Long { get; set; }
    public virtual ICollection<localserver> localserver { get; set; }
}
When, in a controller, I do:
List<location> l = db.location.ToList();
I also get the localserver objects. How, in LINQ, can I get only the properties of location (Id, Lat, and Long) without using Select?
The way to extract part of an object is to project it to a new form, which is what .Select() is for:
var result = db.location
    .Select(x => new
    {
        Id = x.Id,
        Lat = x.Lat,
        Long = x.Long
    })
    .ToList();
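If the anonymous type is awkward to pass around, the same projection can target a named DTO class instead. A minimal in-memory sketch (the `LocationDto` class is a hypothetical name, not part of the original model; the same `Select` shape works against a `DbSet`):

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

public class Location
{
    public int Id { get; set; }
    public double Lat { get; set; }
    public double Long { get; set; }
}

// Hypothetical DTO holding only the scalar properties.
public class LocationDto
{
    public int Id { get; set; }
    public double Lat { get; set; }
    public double Long { get; set; }
}

public static class Demo
{
    public static List<LocationDto> Project(IEnumerable<Location> source)
    {
        // Same projection shape as the EF query above.
        return source
            .Select(x => new LocationDto { Id = x.Id, Lat = x.Lat, Long = x.Long })
            .ToList();
    }
}
```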
Now, you've specifically asked how to do this without using .Select(), so... really, the answer is "you don't". You've ruled out the tool that's specifically designed for the scenario you're presenting, so no useful answer remains.
To address the intent of your question, however, I'm going to make a guess that you don't want to load the collection of localserver objects, perhaps because it's large and uses a lot of memory. To solve that problem, I would suggest the following:
If you're using Entity Framework Code First, check your cascade options when creating the database (if you're setting any).
Check that the collection is declared as virtual (you've shown it as such here, but check the actual source).
Check that you have lazy-loading enabled, by setting myContext.ContextOptions.LazyLoadingEnabled = true; at some point
This should allow the application to lazy-load that property, which means that the contents of that localserver property will only be retrieved from the database and loaded into memory when they're actually needed. If you don't ever access it, it won't be loaded and won't take up any memory.
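If you are on DbContext rather than ObjectContext, the equivalent switches live on the `Configuration` property. A configuration sketch, assuming EF5/EF6 (the `MyContext` name is hypothetical; both flags default to true in EF6):

```csharp
public class MyContext : DbContext
{
    public MyContext()
    {
        // Lazy loading requires both of these; they are on by default
        // in EF6 but can be set explicitly:
        this.Configuration.LazyLoadingEnabled = true;
        this.Configuration.ProxyCreationEnabled = true;
    }

    public DbSet<location> location { get; set; }
}
```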
When you get the Location list, Entity Framework does not pull the Localserver data immediately, because it has a feature called lazy loading. Lazy loading means you can get an associated object whenever you need it; there is no need to write a separate LINQ query to fetch it. Data for an associated object is only pulled when you actually access it in code, at which point Entity Framework makes one more call to the database to load it.
So just go with your code as it is; but if you want only selected columns from the Location object itself, then you have to write a Select and supply the column names.
While using Entity Framework Core with SQL Server I encountered an unexpected problem. I have an entity A that has a one-to-many relationship with entity B.
[Table("client")]
public class Client
{
    public long ID { get; set; }
    public string Name { get; set; }
    public ICollection<Configuration> Configurations { get; set; } = new LinkedList<Configuration>();
}
I get a list of instances of entity A from the database like this:
public ICollection<Client> GetAllClients()
{
    return _dbContext.Clients.ToList();
}
When I call this function I get a list of instances without the instances of entity B in the relationship. Why are the objects in the relationship not retrieved correctly?
I've also found out that if I add this line of code to the function the entities are retrieved as expected.
public ICollection<Client> GetAllClients()
{
    var test = _dbContext.Configurations.ToList();
    return _dbContext.Clients.ToList();
}
This makes no sense to me. What am I missing here?
You can use the Include method to specify related data to be included in query results (Eager loading).
Take a look in the following example:
public ICollection<Client> GetAllClients()
{
    return _dbContext.Clients.Include(x => x.Configurations).ToList();
}
You can check the MSDN reference.
Related reference/collection properties must be either eagerly or explicitly loaded. You generally want to eagerly load using Include:
var clients = await _context.Clients.Include(x => x.Configurations).ToListAsync();
Alternatively, you can lazy load, but that's generally a bad idea, as it can lead to N+1 query problems (i.e. you issue one query, and then a separate additional query for each item as you iterate through, which is obviously highly inefficient). Regardless, lazy loading requires two things:
The reference/collection property must have the virtual keyword. EF adds lazy loading functionality by creating a dynamic proxy of your entity class and overriding the property getter. Overriding, of course, can only be done on virtuals.
You have to explicitly add the lazy-loading services:
services.AddDbContext<MyContext>(o => o.UseLazyLoadingProxies()
    .UseSqlServer(connectionString));
It works when you query the configurations as well, because EF performs relationship fix-up. In other words, it will automatically fill related reference/collection properties if it has previously retrieved those objects and has them in its object cache. Otherwise, if you do not load the relations some other way, the property will remain null.
I am very new to MongoDB (I only spent a day learning). I have a relatively simple problem to solve, and I chose to take the opportunity to learn about this popular NoSQL database.
In C# I have the following classes:
public class Item
{
    [BsonId]
    public string ItemId { get; set; }
    public string Name { get; set; }
    public ICollection<Detail> Details { get; set; }
}
public class Detail
{
    //[BsonId]
    public int DetailId { get; set; }
    public DateTime StartDate { get; set; }
    public double Qty { get; set; }
}
I want to be able to add multiple objects (Details) to the Details collection. However, I know that some of the items I have (coming from a REST API) will already be stored in the database, and I want to avoid duplicates.
So far I can think of 2 ways of doing it, but I am not really happy with either:
Get all stored details (per item) from MongoDB and then filter in .NET to find the new items and add them to the db. This way I can be sure that there will be no duplicates. That is, however, far from an ideal solution.
I can add the [BsonId] attribute to DetailId (without this attribute this solution does not work) and then use AddToSetEach. This works, and my only problem with it is that I don't quite understand it. I mean, it is supposed to only add the new objects if they do not already exist in the database, but how does it know? How does it compare the objects? Do I have any control over that comparison process? Can I supply custom comparers? Also, I noticed that if I pass 2 objects with the same DetailId (this should never happen in the real app), it still adds both, so the BsonId attribute does not guarantee uniqueness?
Is there any elegant solution to this problem? Basically I just want to update the Details collection by passing another collection (which I know contains some objects already stored in the db, i.e. the first collection) and ignore all duplicates.
The AddToSetEach-based version is certainly the way to go, since it is the only one that scales properly.
I would, however, recommend dropping the entire DetailId field unless it is really required for some other part of your application. Judging from a distance, it would appear that any entry in your list of item details is uniquely identifiable by its StartDate field (plus potentially Qty, too). So why would you need the DetailId on top of that?
That leads directly to your question of why adding a [BsonId] attribute to the DetailId property does not result in guaranteed uniqueness inside your collection of Detail elements. One reason is that MongoDB simply cannot do it (see this link). The second reason is that the MongoDB C# driver does not create a unique index or attempt other magic in order to ensure uniqueness here - probably because of reason #1. ;) All the [BsonId] attribute does is tell the driver to serialize the attributed property as the "_id" field (and map it back the other way on deserialization).
On the topic of "how does MongoDB know which objects are already present", the documentation is pretty clear:
If the value is a document, MongoDB determines that the document is a duplicate if an existing document in the array matches the to-be-added document exactly; i.e. the existing document has the exact same fields and values and the fields are in the same order. As such, field order matters and you cannot specify that MongoDB compare only a subset of the fields in the document to determine whether the document is a duplicate of an existing array element.
And, no, there is no option to specify custom comparers.
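The quoted rule can be illustrated outside MongoDB: treating a document as an ordered list of field/value pairs, $addToSet-style equality requires the same fields, the same values, and the same order. This is a simulation sketch of that comparison semantics, not driver code:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

public static class AddToSetDemo
{
    // A "document" modeled as an ordered list of (field, value) pairs.
    public static bool ExactMatch(
        IList<KeyValuePair<string, object>> a,
        IList<KeyValuePair<string, object>> b)
    {
        // Field order matters, so compare pairwise in sequence.
        return a.Count == b.Count
            && a.Zip(b, (x, y) => x.Key == y.Key && Equals(x.Value, y.Value)).All(eq => eq);
    }

    public static void AddToSet(
        List<IList<KeyValuePair<string, object>>> array,
        IList<KeyValuePair<string, object>> doc)
    {
        // Only append when no existing element matches exactly.
        if (!array.Any(existing => ExactMatch(existing, doc)))
            array.Add(doc);
    }
}
```

Note that two documents with the same fields in a different order count as distinct, which matches the behavior described above.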
I'm consuming an api that has the following class for time.
public class CRTime
{
    public DateTime? datetime { get; set; }
    public string timezone { get; set; }
}
I would like to translate that into a single DateTime field.
Here is an example of the full object that is returned to me and how I am trying to calculate the DateTime field that will then be stored in the database.
public class ReturnObject
{
    [NotMapped]
    // This comes from the api, but is not synced to database
    public CRTime CRCreated { get; set; }

    // this is stored in our db, should be calculated from CRCreated
    public DateTime? CreatedDB
    {
        get
        {
            var val = (CRCreated != null) ? CRCreated.datetime : null;
            return val;
        }
        private set { }
    }

    // other fields go here
}
This works perfectly on record create, but when I try to update the record using Entity Framework Migrations' AddOrUpdate, the update overwrites the value and always sets it to NULL.
What is the best solution for creating a server-side computed column that then gets synced to the database?
Side Note: I am using NewtonSoft Json to deserialize the object into my entity framework object, then passing that to Entity Framework AddOrUpdate.
This is because of a rather confusing quirk of AddOrUpdate.
AddOrUpdate determines if an entity is new or existing, obviously. It does that by querying the database for the entity by its primary key, or some key you can specify. Now two things can happen:
It doesn't find the entity in the database; your entity object is marked as Added.
It does find an entity, but contrary to what you'd expect your entity object is still Detached! The found entity remains hidden and this is the object EF updates in the end. It copies the values from your own entity to it, which may mark the hidden entity as Modified.
Because of this empty setter, CreatedDB will always be null when the entity is fetched from the database. For EF, CRCreated doesn't exist. EF probably reads a CreatedDB value from your object, but again the empty setter prevents it from being copied to the hidden object, and it gets updated as null.
You have to decide how to fix this. Probably the best way is to make the setter actually modify (or create) the CRCreated instance.
Alternatively, you can do away with AddOrUpdate and try to find the existing object yourself. If you find it, it's not hidden anymore, so you can do with it whatever you like.
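A sketch of the setter approach, assuming the only thing the setter needs to restore is the datetime value (timezone handling is left out):

```csharp
using System;

public class CRTime
{
    public DateTime? datetime { get; set; }
    public string timezone { get; set; }
}

public class ReturnObject
{
    public CRTime CRCreated { get; set; }

    public DateTime? CreatedDB
    {
        get { return CRCreated != null ? CRCreated.datetime : null; }
        set
        {
            // Instead of discarding the value, write it back into
            // CRCreated so EF's copy step round-trips the column.
            if (CRCreated == null)
                CRCreated = new CRTime();
            CRCreated.datetime = value;
        }
    }
}
```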
I would expect it to be NULL after you read it from the Db. The private set; is of course weird and blocks the proper working of EF.
You could do yourself a great favour if you are able to use DateTimeOffset, but in the current situation:
Your property might be logically 'calculated' but towards the database it should be treated and implemented as a normal read/write property.
I have the following class generated by entity framework:
public partial class Branch
{
    public short Id { get; set; }
    public short CompanyId { get; set; }
    public string Code { get; set; }
    public string Title { get; set; }
    public virtual Company Ts_Companies { get; set; }
}
I have the following method which takes all of the branches out of the database:
public Branch[] LoadBranches(int companyId, int page, int limit, string search, string sort, string sortOrder)
{
    using (var dbContext = new TimeShedulerEntities())
    {
        var _branches = (from ct in dbContext.Branches
                         where ct.Title.Contains(search) || ct.Code.Contains(search)
                         select ct).OrderBy(c => c.Title).Skip((page - 1) * limit).Take(limit);
        return _branches.ToArray();
    }
}
In my model designer I see that the Lazy Loading is set to true, but when I iterate over the branches, the property Ts_Companies is null. Also I get the following exception:
An exception of type 'System.ObjectDisposedException' occurred in EntityFramework.dll but was not handled in user code. Additional information: The ObjectContext instance has been disposed and can no longer be used for operations that require a connection.
Am I forgetting something?
You created and disposed of the context during your function since it was inside the using statement. Each entity happens to know from which context it was created so that lazy loading is possible.
When you accessed the Ts_Companies property, the entity realized that it had not yet loaded that property, since it is probably a navigation property, and attempted to ask its ObjectContext (TimeShedulerEntities) to load it. However, the context had been disposed, and that is what caused the exception.
You need to modify your query as follows to 'pre-load' the Ts_Companies:
var _branches = (from ct in dbContext.Branches.Include("Ts_Companies")
                 where ct.Title.Contains(search) || ct.Code.Contains(search)
                 select ct).OrderBy(c => c.Title).Skip((page - 1) * limit).Take(limit);
It will take possibly quite a bit longer to load depending on the size of the Ts_Companies object and how many you end up bringing back at once, but the entity will stop asking its object context to load the Ts_Companies since you would have already loaded them.
A side note: I have found that creating and disposing of the object context on a per-method basis causes problems when the entities are passed outside the function. If you want to create and destroy the object context in every function, you probably want to have the function return something that is not an entity. In other words, have an object that can be constructed from an entity and has the properties you need, but don't have it reference the entity. In Java these are often called Data Transfer Objects (DTOs). You lose the read-write ability of Entity Framework, but you don't have unexpected ObjectDisposedExceptions flying all over the place.
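That DTO pattern might look like this for the Branch entity above; a minimal in-memory sketch (`BranchDto` and the `Company.Name` property are hypothetical names, and the entity classes are simplified stand-ins for the EF-generated ones):

```csharp
using System;

// Simplified stand-ins for the EF-generated entities.
public class Company
{
    public string Name { get; set; }
}

public class Branch
{
    public short Id { get; set; }
    public string Code { get; set; }
    public string Title { get; set; }
    public virtual Company Ts_Companies { get; set; }
}

// Plain object with no reference back to the entity or its context,
// so it stays valid after the context is disposed.
public class BranchDto
{
    public short Id { get; set; }
    public string Code { get; set; }
    public string Title { get; set; }
    public string CompanyName { get; set; }

    public static BranchDto FromEntity(Branch b)
    {
        return new BranchDto
        {
            Id = b.Id,
            Code = b.Code,
            Title = b.Title,
            // Read the navigation property while the context is still alive.
            CompanyName = b.Ts_Companies != null ? b.Ts_Companies.Name : null
        };
    }
}
```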
The problem comes when you ask an entity to be associated with another (for example, adding one entity to an ICollection property of another entity) when they come from different ObjectContexts. This will cause headaches for you, since you would have to manually attach the objects to the same context before performing that operation. Additionally, you lose the ability to save changes to those entities without manually attaching them to a different context.
My opinion on how I would do it:
I've found it easier to either have an object containing all of these database access functions control the lifetime of the context (i.e. have your containing object be IDisposable and during disposal, destroy the context) or simply not return entities and have the datastore be read-old, write-new essentially without any modification ability.
For example, I have my object (I will call it my data access object) with a bunch of methods for getting database objects. These methods return entities. The data access object also has a SaveChanges method which simply calls the context's SaveChanges method. The data access object contains the context in a protected property and keeps it around until the data access object itself is disposed. Nobody but the data access object is allowed to touch its context. At that point, the context is disposed by manually calling 'Dispose'. The data access object could then used inside a using statement if that is your use case.
In any case, it is probably best to avoid passing entities attached to a context outside the scope in which their context exists, since Entity Framework keeps references to that context all over the place in the individual entities.
But you didn't load your Ts_Companies; use eager loading instead:
var _branches = dbContext.Branches
    .Where(b => b.Title.Contains(search) || b.Code.Contains(search))
    .Include("Ts_Companies")
    .OrderBy(c => c.Title)
    .Skip((page - 1) * limit)
    .Take(limit);
I came across the same System.ObjectDisposedException issue before, in my MVC project. I didn't use using blocks; instead I defined my context at class level. If I need to return and use an array (in my View), I use that context. If I just need to update some information, then I use using blocks. I hope this helps.
I'm trying to auto project from SQL server with Automapper some data into my view models.
The view model I have is:
public sealed class WorkstationViewModel
{
    public int Id { get; set; }
    public string Name { get; set; }
    public string OccupiedByName { get; set; }
}
And the code I'm trying to use is:
Mapper.CreateMap<Workstation, WorkstationViewModel>()
    .ForMember(T => T.OccupiedByName, T => T.MapFrom(W =>
        W.Sessions.AsQueryable().Select(E => E.StaffMember.Name).SingleOrDefault()));
Two properties Id and Name are auto-projected as they have equal names in Workstation class.
The exception I get on code lines like this:
var model = WorkstationsRepository.GetAll().Project()
    .To<WorkstationViewModel>().SingleOrDefault();
is some weird "object reference is null" exception, and at the top of the stack trace there are some of AutoMapper's CreateExpression<> methods, which leads me to conclude that AutoMapper cannot generate a valid expression to translate into SQL code.
When I use simple maps, like .Name to .Category.Name or other one-item retrievals from the SQL table, it works perfectly. All I need is to get access to multiple items while projecting the sequence via Automapper.
The newer Project().To() API takes a totally different route than the "classic" Mapper.Map() API. I think that the latter would work in your case, but of course you won't have the benefit of the projection trickling through to the SQL.
During Project().To(), AutoMapper tries to get MemberInfos (reflection) from the involved types, which it uses to create lambda expressions. This means that the source properties in a mapping must be members of the source type. Evidently, W.Sessions.AsQueryable().Select(... is not a member of Workstation. So somewhere along the line AutoMapper bumps into a null MemberInfo.
So, Project().To() is a bit restricted. In this particular case a remedy may be to map a Session with its parents WorkStation and StaffMember to the model. Reference navigation properties will map OK with Project().To().
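As a workaround, the same shape can also be produced with a hand-written Select. A minimal in-memory sketch (the `Session`/`StaffMember` classes are reconstructed from the mapping expression in the question, not taken from actual source; the same `Select` works on an `IQueryable` from EF):

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

public class StaffMember
{
    public string Name { get; set; }
}

public class Session
{
    public StaffMember StaffMember { get; set; }
}

public class Workstation
{
    public int Id { get; set; }
    public string Name { get; set; }
    public ICollection<Session> Sessions { get; set; } = new List<Session>();
}

public sealed class WorkstationViewModel
{
    public int Id { get; set; }
    public string Name { get; set; }
    public string OccupiedByName { get; set; }
}

public static class Projection
{
    public static List<WorkstationViewModel> ToViewModels(IEnumerable<Workstation> source)
    {
        // Hand-written equivalent of the AutoMapper configuration:
        // OccupiedByName comes from the single session's staff member, if any.
        return source.Select(w => new WorkstationViewModel
        {
            Id = w.Id,
            Name = w.Name,
            OccupiedByName = w.Sessions
                .Select(s => s.StaffMember.Name)
                .SingleOrDefault()
        }).ToList();
    }
}
```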