How to write Linq method in data tier? - c#

I am thinking about how to use Linq in the classic 3-tier architecture of a .NET project. Apparently, Linq to SQL should live in the data tier. The reason I chose Linq is that it saves me a lot of code compared to stored procedures. I searched online for the insert/update/delete methods of Linq, but didn't find an appropriate way to update a record using entities. Usually, people do the update this way:
public void UpdateUser(String username, String password, int userId)
{
    using (var db = new UserDataContext())
    {
        var user = db.user.Single(p => p.Id == userId);
        user.Username = username;
        user.Password = password;
        db.SubmitChanges();
    }
}
Why don't we pass the record as an entity, like this:
public void Update(Application info)
{
    VettingDataContext dc = new VettingDataContext(_connString);
    var query = (from a in dc.Applications
                 where a.Id == info.Id
                 select a).First();
    query = info;
    try
    {
        dc.SubmitChanges();
    }
    catch (Exception e)
    {
        //...
    }
}
Unfortunately, the above code is wrong because of the "query = info" assignment, but if I assign each value from "info" to "query" one by one, it works fine, like:
query.firstName = info.firstName;
query.lastName = info.lastName;
So if the table has 40 fields, I have to write 40 lines of code. Is there an easier way to do the update? I hope I have described the issue clearly.

I'm adding another answer because a comment was not enough space to expand on my previous one.
Let's take a step back and look at what you want to do here from a logical perspective. You want to tell your data access layer how it should update the database, giving it all the new/changed values it needs to write.
One very common way of doing this is to pass an entity which contains those changes (which is what you're doing in your example). This can become tricky, as you have seen: if you simply overwrite the entity variable with the changed entity, Linq2Sql loses change tracking. Just because the new entity is assigned to the same variable doesn't mean that Linq2Sql automatically picks up changes from the new object; in fact Linq2Sql has no knowledge of the new object at all.
Example:
// In domain layer:
MyEntity entity = new MyEntity();
entity.PrimaryKey = 10;
entity.Name = "Toby Larone";
entity.Age = 27;
myDataRepository.Update(entity);

// In data layer:
void Update(MyEntity changedEntity)
{
    using (var db = new DataContext())
    {
        var entity = (from e in db.MyEntities
                      where e.PrimaryKey == changedEntity.PrimaryKey
                      select e).First();
        // Linq2Sql now has change tracking of "entity"... any changes made will be persisted when SubmitChanges is called...
        entity = changedEntity;
        // Linq2Sql does **not** have change tracking of changedEntity - the fact that it has been assigned to the
        // same variable that once stored a tracked entity does not mean that Linq2Sql will magically pick up the changes...
        db.SubmitChanges(); // Nothing happens - as far as Linq2Sql is concerned, the entity selected in the first query
                            // has not been changed (only the variable in this scope now references a different object).
    }
}
Now, you've already seen that assigning each field to the entity rather than replacing it works as intended. This is because the changes are made to the original entity, which is still inside the Linq2Sql change tracking system.
One possible solution to this problem would be to write a method that "applies" the changes of another entity to an existing one, i.e.:
partial class MyEntity
{
    public void ApplyChanges(MyEntity changedEntity)
    {
        this.PrimaryKey = changedEntity.PrimaryKey;
        this.Name = changedEntity.Name;
        this.Age = changedEntity.Age;
    }
}
and then your data access would look like this:
// In data layer:
void Update(MyEntity changedEntity)
{
    using (var db = new DataContext())
    {
        var entity = (from e in db.MyEntities
                      where e.PrimaryKey == changedEntity.PrimaryKey
                      select e).First();
        // Linq2Sql now has change tracking of "entity"... any changes made will be persisted when SubmitChanges is called...
        entity.ApplyChanges(changedEntity);
        db.SubmitChanges(); // Works OK...
    }
}
But I'm sure you don't like this solution, because all you have done is move the repetitive field assignments out of the repository and into the entity class itself.
Going back to the logical perspective: all you really need to tell the data access repository is two things - 1) which record you want to update and 2) what the changes are. Sending an entirely new entity to encapsulate those two requirements is not necessary to achieve that goal; in fact I think it's very inefficient.
In the following example, you send the data repository only the changes, not an entire entity. Because there is no detached entity, there are no change tracking issues to work around.
Example:
// In domain layer:
myDataRepository.Update(10, entity =>
{
    entity.Name = "Toby Larone";
    entity.Age = 27;
});

// In data layer:
void Update(int primaryKey, Action<MyEntity> callback)
{
    using (var db = new DataContext())
    {
        var entity = (from e in db.MyEntities
                      where e.PrimaryKey == primaryKey
                      select e).First();
        // Linq2Sql now has change tracking of "entity"... any changes made will be persisted when SubmitChanges is called...
        // The changes that were sent are applied directly to the Linq2Sql entity, which is already under change tracking...
        callback(entity);
        db.SubmitChanges();
    }
}
In the previous examples, the field assignments happened twice: once when you described the changes you wanted to make, and again in the data repository when you applied those changes to a Linq2Sql change-tracked entity.
Using the callback, the field assignments only happen once - the description of the change itself is what updates the tracked entity.
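If you have many entities to support, the same callback pattern can also be written once as a generic repository method. A minimal sketch, assuming the generated Linq2Sql context used above plus System.Linq and System.Linq.Expressions:
public void Update<TEntity>(Expression<Func<TEntity, bool>> match, Action<TEntity> apply)
    where TEntity : class
{
    using (var db = new DataContext())
    {
        // the entity comes straight from the context, so it is change tracked
        var entity = db.GetTable<TEntity>().Single(match);
        // the caller describes the changes; they are applied to the tracked entity
        apply(entity);
        db.SubmitChanges();
    }
}

// usage:
myDataRepository.Update<MyEntity>(e => e.PrimaryKey == 10, e =>
{
    e.Name = "Toby Larone";
    e.Age = 27;
});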
I hope I explained this well enough :)

Think about what the data repository actually requires in order to perform the update. It does not require an object that contains those changes, but a description of what changes need to be made. This can be encapsulated easily into a callback delegate...
public void UpdateUser(int userId, Action<User> callback)
{
    using (var db = new DataContext())
    {
        User entity = db.Users.Where(u => u.Id == userId).Single();
        callback(entity);
        db.SubmitChanges();
    }
}

myrepository.UpdateUser(userId, user =>
{
    user.Username = username;
    user.Password = password;
    // etc...
});

query is not the same type as info. They may have the same properties to you, but the code doesn't know that.
Now, if you want to avoid writing a bunch of unnecessary code, you can use a third-party library like AutoMapper, which can do that for you.
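For example, a minimal sketch of the Update method from the question using AutoMapper's instance API (class and field names follow the question; a self-map like this copies every matching property, so in practice you may want to Ignore the key and association members):
private static readonly IMapper Mapper =
    new MapperConfiguration(cfg => cfg.CreateMap<Application, Application>()).CreateMapper();

public void Update(Application info)
{
    using (var dc = new VettingDataContext(_connString))
    {
        var entity = dc.Applications.First(a => a.Id == info.Id);
        // copies matching properties from info onto the tracked entity,
        // so Linq2Sql change tracking still sees the modifications
        Mapper.Map(info, entity);
        dc.SubmitChanges();
    }
}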

Related

Entity Framework 6 update existing record with another record's values

I use the EF database-first model in my app. It's a WPF MVVM app, so I use a long-living DbContext, which is created when the app starts and disposed when it finishes.
There are two tables - clients and settings. settings stores all of a client's settings, with client_id as a foreign key and settings_id as the primary key.
In this settings table I have a 'default' record with settings_id = 1 and client_id = 1. I want my app to restore the 'default' settings for a client when a button is pressed.
In my viewmodel I have an ObservableCollection of type Client, which is my db entity model class, and a property SelectedClient of type Client, bound to the currently selected client (in some ListBox). I also have an entity class Settings, which has fields representing the different settings from the settings table. I want all the settings from the 'default' record to replace the currently selected client's settings.
So this is what I am doing:
public void OnResetClientSettingsCommandExecute()
{
    var defaultSettings = Global.DbContext.Settings.FirstOrDefault(c => c.client_id == 1);
    if (defaultSettings == null) return;
    var tmp = defaultSettings;
    tmp.client_id = SelectedClient.client_id; // doing this to change the only field which needs to remain untouched
    var selectedClientSettings = Global.DbContext.Settings.FirstOrDefault(c => c.client_id == SelectedClient.client_id);
    selectedClientSettings = tmp;
    Global.DbContext.SaveChanges();
}
This code doesn't work at all. The only thing I get here is that the client_id of my 'default' record in settings changes to SelectedClient's client_id. I don't know why that happens; I thought using tmp would be ok, but no.
I know there are practices of using the Attach() method or changing the entity's State to Modified - I tried all of them and none worked for me, I suppose because I use the long-living DbContext approach.
Honestly, I am very confused about updating records in my app in general - I just can't do it; the DbContext.SaveChanges() method does not save changes to the database, but rolls them back for some reason. So I have to use raw SQL queries, which is a bit stone age.
Please, someone help me figure out what I am doing wrong. Thanks.
You could create a class with a method like this:
public static void CopyValues<T>(T source, T destination)
{
    // copy every public property from source onto destination
    var props = typeof(T).GetProperties();
    foreach (var prop in props)
    {
        var value = prop.GetValue(source);
        prop.SetValue(destination, value);
    }
}
Then assign your keys to temporary variables, copy the rest of the properties, and reassign the keys back to their original values.
int id = selectedClientSettings.client_id;
ObjectCopier.CopyValues<Settings>(defaultSettings, selectedClientSettings);
selectedClientSettings.client_id = id;
The right way to do it, but it's exhausting!
public void OnResetClientSettingsCommandExecute()
{
    var defaultSettings = Global.DbContext.Settings.FirstOrDefault(c => c.client_id == 1);
    if (defaultSettings == null) return;
    var selectedClientSettings = Global.DbContext.Settings.FirstOrDefault(c => c.client_id == SelectedClient.client_id);
    selectedClientSettings.serviceName = defaultSettings.serviceName;
    selectedClientSettings.write_delay = defaultSettings.write_delay;
    // etc...
    Global.DbContext.SaveChanges();
}
You should also consider using AutoMapper; it could make this easier to write.
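For example, a rough sketch with AutoMapper, telling it to copy every Settings property except the keys (property names follow the question; this assumes the AutoMapper NuGet package):
private static readonly IMapper Mapper = new MapperConfiguration(cfg =>
    cfg.CreateMap<Settings, Settings>()
       .ForMember(d => d.settings_id, opt => opt.Ignore())  // keep the row's own primary key
       .ForMember(d => d.client_id, opt => opt.Ignore()))   // keep the selected client's id
    .CreateMapper();

public void OnResetClientSettingsCommandExecute()
{
    var defaultSettings = Global.DbContext.Settings.FirstOrDefault(c => c.client_id == 1);
    if (defaultSettings == null) return;
    var selectedClientSettings = Global.DbContext.Settings
        .FirstOrDefault(c => c.client_id == SelectedClient.client_id);
    if (selectedClientSettings == null) return;
    // copies all remaining properties onto the tracked entity, so EF picks up the changes
    Mapper.Map(defaultSettings, selectedClientSettings);
    Global.DbContext.SaveChanges();
}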

Dynamic LINQ - Entity Framework 6 - Update Records for Dynamic Select

I'm a C# rookie. Below is my code; I've been trying for hours to get this to update some fields in my DB and have tried many different implementations without luck.
// Select all fields to update
using (var db = new Entities())
{
    // dbFields are trusted values
    var query = db.tblRecords
        .Where("id == " + f.id)
        .Select("new(" + string.Join(",", dbFields.Keys) + ")");
    foreach (var item in query)
    {
        foreach (PropertyInfo property in query.ElementType.GetProperties())
        {
            if (dbFields.ContainsKey(property.Name))
            {
                // Set the value to view in debugger - should be dynamic cast eventually
                var value = Convert.ToInt16(dbFields[property.Name]);
                property.SetValue(item, value);
                // Something like this throws error 'Object does not match target type'
                // property.SetValue(query, item);
            }
        }
    }
    db.SaveChanges();
}
The above code, when run, does not result in any changes to the DB. Obviously this code needs a bit of cleanup, but I'm trying to get the basic functionality working. I believe what I might need to do is somehow reapply 'item' back into 'query', but I've had no luck getting that to work; no matter what implementation I try I always receive 'Object does not match target type'.
This semi-similar issue reaffirms that, but isn't very clear to me since I'm using a Dynamic LINQ query and cannot just reference the property names directly. https://stackoverflow.com/a/25898203/3333134
Entity Framework will perform updates for you on entities, not on custom results. Your tblRecords holds many entities, and this is what you want to manipulate if you want Entity Framework to help. Remove your projection (the call to Select) and the query will return the objects directly (with too many columns, yes, but we'll cover that later).
The dynamic update is performed the same way any other dynamic assignment in C# would be, since you got a normal object to work with. Entity Framework will track the changes you make and, upon calling SaveChanges, will generate and execute the corresponding SQL queries.
However, if you want to optimize and avoid selecting and creating all the values in memory in the first place, even those that aren't needed, you can also perform the update from memory. If you create an object of the right type yourself and assign the right ID, you can then use the Attach() method to add it to the current context. From that point on, any changes will be recorded by Entity Framework, and when you call SaveChanges, everything will be sent to the database:
// Select all fields to update
using (var db = new Entities())
{
    // Assuming the entity contained in tblRecords is named "ObjRecord"
    // Also assuming that the entity has a key named "id"
    var objToUpdate = new ObjRecord { id = f.id };
    // Any changes made to the object so far won't be considered by EF
    // Attach the object to the context
    db.tblRecords.Attach(objToUpdate);
    // EF now tracks the object, any new changes will be applied
    foreach (PropertyInfo property in typeof(ObjRecord).GetProperties())
    {
        if (dbFields.ContainsKey(property.Name))
        {
            // Set the value to view in debugger - should be dynamic cast eventually
            var value = Convert.ToInt16(dbFields[property.Name]);
            property.SetValue(objToUpdate, value);
        }
    }
    // Will only perform an UPDATE query, no SELECT at all
    db.SaveChanges();
}
When you do a Select new ..., it selects only specific fields and EF won't track updates for you. I think if you change your query to this, it will work:
var query = db.tblRecords.Where(x=>x.id == id);
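For completeness, a rough sketch of the whole update without the projection, based on the question's code (the entity class name tblRecord is an assumption; use whatever class EF generated for the tblRecords set):
using (var db = new Entities())
{
    // no Select() projection: the query now returns tracked entities
    var query = db.tblRecords.Where(x => x.id == f.id);
    foreach (var item in query)
    {
        foreach (PropertyInfo property in typeof(tblRecord).GetProperties())
        {
            if (dbFields.ContainsKey(property.Name))
            {
                // item is change tracked, so SetValue marks the property as modified
                property.SetValue(item, Convert.ToInt16(dbFields[property.Name]));
            }
        }
    }
    db.SaveChanges();
}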

How should I disable the Entity Framework table reference (foreign key) list on each object?

I'm using a Sqlite database and System.Data.SQLite 1.0.92.
There are two tables here:
Table Person:
PersonId
PersonName
Table Student:
StudentId
PersonId(reference table Person FK)
StudentNo
Now, every time I get the Persons collection in EF5:
using (var ctx = new myEntities())
{
    AllPersons = ctx.Persons.ToList();
}
The AllPersons.student collection is also included in the result;
but I don't need it. Of course that's just an example; there are a lot of big tables with many references, and this always causes performance problems because of that.
So I'm trying to keep it out of my result, and I changed the code to:
using (var ctx = new myEntities())
{
    ctx.Configuration.ProxyCreationEnabled = false;
    ctx.Configuration.LazyLoadingEnabled = false;
    AllPersons = ctx.Persons.ToList();
}
Now it's fine, because the AllPersons.student collection will always be null.
But now I found that if I get Person and Student together:
using (var ctx = new myEntities())
{
    ctx.Configuration.ProxyCreationEnabled = false;
    ctx.Configuration.LazyLoadingEnabled = false;
    AllPersons = ctx.Persons.ToList();
    AllStudents = ctx.Student.ToList();
}
the references are still included.
So is there any way to keep the references from being included in this situation?
Thank you.
Update
At some friends' request, I'll explain why I need this:
1: When I convert it to JSON it becomes an endless loop. Even though I already use Json.net's ReferenceLoopHandling, the JSON gets so big it crashes the server (with no references it's a very small JSON).
2: Every time I get the client data and need to save it, an exception about model state is thrown until I set the collection to null.
Example:
using (myEntities ctx = new myEntities())
{
    ctx.Configuration.LazyLoadingEnabled = false;
    ctx.Configuration.ProxyCreationEnabled = false;
    Person model = ThisIsAModel();
    model.students = null; // This is the key: I need to set the students collection reference to null, otherwise it throws an exception
    ctx.Entry(model).State = EntityState.Modified;
    ctx.SaveChanges();
}
3: This is the more important problem. I already get all the data and cache it on the server, but that makes the loading time very long when the server starts (because there are so many records and references; that is the main problem). I don't know what kind of problem I'll meet next...
public List<Person> PersonsCache;   // global cache
public List<Student> StudentsCache; // global cache

using (myEntities ctx = new myEntities())
{
    ctx.Configuration.LazyLoadingEnabled = false;
    ctx.Configuration.ProxyCreationEnabled = false;
    // There are so many references and so much data that filling the cache the first time is very slow,
    // even when I only get the Person model, just because some collections have reference problems.
    PersonsCache = ctx.Persons.ToList();
    StudentsCache = ctx.Student.ToList();
}
The Problem
As you said, when you load both the Parent and Child lists, even with LazyLoading disabled, and then look in parent.Childs, you see that the child items have been loaded too.
var db = new YourDbContext();
db.Configuration.LazyLoadingEnabled = false;
var parentList = db.YourParentSet.ToList();
var childList = db.YourChildSet.ToList();
What happened? Why are the childs included in a parent?
The childs under a parent entity are exactly the ones you loaded with db.YourChildSet.ToList(); - the very same objects. In fact, Entity Framework never loads the childs for a parent a second time, but because of the parent-child relation in the edmx they are listed there.
Does that affect performance?
Since the childs are only loaded once, there is no extra data-loading cost.
But for serialization's sake, or something else, how can I get rid of it?
You can use one of these solutions:
Solution 1:
Use 2 different instances of YourDbContext:
var db1 = new YourDbContext();
db1.Configuration.LazyLoadingEnabled = false;
var parentList = db1.YourParentSet.ToList();

var db2 = new YourDbContext();
db2.Configuration.LazyLoadingEnabled = false;
var childList = db2.YourChildSet.ToList();
Now when you look in parent.Childs there are no Child items in it.
Solution 2:
Use projection: shape your output as you see fit and use that.
var db1 = new YourDbContext();
db1.Configuration.LazyLoadingEnabled = false;
var parentList = db1.YourParentSet
    .Select(x => new /*Model()*/ {
        Property1 = x.Property1,
        Property2 = x.Property2, ...
    }).ToList();
This way there is nothing annoying left when serializing.
Using a custom Model class is optional and in some cases recommended.
Additional Resources
As a developer who uses Entity Framework, reading these resources is strongly recommended:
Performance Considerations for Entity Framework 4, 5, and 6
Connection Management
I'll focus on your third problem because it seems to be the most urgent one. Then I'll try to give some hints on the other two.
There are two Entity Framework features you should be aware of:
When you load data into a context, Entity Framework will try to connect the objects wherever they're associated. This is called relationship fixup. You can't stop EF from doing that. So if you load Persons and Students separately, a Person's Students collection will contain students, even though you didn't Include() them.
By default, a context caches all data it fetches from the database. Moreover, it stores meta data about the objects in its change tracker: copies of their individual properties and all associations. So by loading many objects the internal cache grows, but also the size of the meta data. And the ever-running relationship fixup process gets slower and slower (although it may help to postpone it by turning off automatic change detection). All in all, the context gets bloated and slow like a flabby rhino.
I understand you want to cache data in separate collections for each entity. Two simple modifications will make this much quicker:
Evade the inevitable relationship fixup by loading each collection with a separate context
Stop caching (in the context) and change tracking by getting the data with AsNoTracking.
Doing this, your code will look like this:
public List<Person> PersonsCache;
public List<Student> StudentsCache;

using (myEntities ctx = new myEntities())
{
    ctx.Configuration.ProxyCreationEnabled = false;
    PersonsCache = ctx.Persons
        .AsNoTracking()
        .ToList();
}

using (myEntities ctx = new myEntities())
{
    ctx.Configuration.ProxyCreationEnabled = false;
    StudentsCache = ctx.Student
        .AsNoTracking()
        .ToList();
}
The reason for turning off ProxyCreationEnabled is that you'll get lightweight objects and that you'll never inadvertently trigger lazy loading afterwards (which would throw an exception because the context is no longer available).
Now you'll have cached objects that are not inter-related and that get fetched as fast as it gets with EF. If this isn't fast enough you'll have to resort to other tools, like Dapper.
By the way, your very first code snippet and problem description...
using (var ctx = new myEntities())
{
    AllPersons = ctx.Persons.ToList();
}
The AllPersons.student collection is also included in the result;
...suggest that Entity Framework spontaneously performs eager loading (of students) without you Include-ing them. I have to assume that your code snippet is not complete. EF never, ever automatically executes eager loading. (Unless, maybe, you have some outlandish and buggy query provider).
As for the first problem, the serialization. You should be able to tackle that in a similar way as shown above. Just load the data you want to serialize in isolation and disable proxy creation. Or, as suggested by others, serialize view models or anonymous types exactly containing what you need there.
As for the second problem, the validation exception: I can only imagine this happening if you initialize a students collection with default, empty Student objects. These are bound to be invalid. If this is not the case, I suggest you ask a new question about this specific problem, showing ample detail about the involved classes and mappings; that shouldn't be dealt with in this question.
Explicitly select what you want to return from the Database.
Use select new. With the select new clause, you can create objects of an anonymous type as the result of a query, without pulling the references in. This syntax lets you construct anonymous data structures, which are created as they are evaluated (lazily). Like this:
using (var ctx = new myEntities())
{
    var AllPersons = ctx.People.Select(c => new { c.PersonId, c.PersonName }).ToList();
}
And you don't even need to disable lazy loading anymore.
The query above allocates an anonymous type using select new { }, which requires you to use var. If you want to allocate a known type, add it to your select clause:
private IEnumerable<MyClass> AllPersons; // global variable

using (var ctx = new myEntities())
{
    AllPersons = ctx.People
        .Select(c => new MyClass { PersonId = c.PersonId, PersonName = c.PersonName }).ToList();
}
And:
public class MyClass
{
    public string PersonId { get; set; }
    public string PersonName { get; set; }
}
If the entities are auto-generated, copy-paste them into your own code and remove the generated relations, such as the child collections and foreign keys. Or, if you don't need all of this functionality, you could use a lightweight framework like Dapper.
Normally your student collection isn't filled from the database; it is filled when you access the property. In addition, if you use the ToList() method, Entity Framework reads the data from the database to fill your collection.
Please check these:
https://msdn.microsoft.com/en-us/data/jj574232.aspx#lazy
https://msdn.microsoft.com/en-us/library/vstudio/dd456846(v=vs.100).aspx
Is there any way to keep the references from being included in this situation?
The solution to this seems to be very simple: don't map the association. Remove the Student collection. Not much more I can say about it.
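If you were mapping with Code First instead of a generated edmx (an assumption; the question uses database-first), leaving the association out could look like this sketch:
// the simplest option is to delete the Students navigation property from the entity class entirely;
// if it has to stay on the class for other reasons, it can be excluded from the model instead:
protected override void OnModelCreating(DbModelBuilder modelBuilder)
{
    modelBuilder.Entity<Person>().Ignore(p => p.Students);
}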
Decorate the properties with [IgnoreDataMember] if you are using .NET 4.5+:
https://msdn.microsoft.com/en-us/library/system.runtime.serialization.ignoredatamemberattribute(v=vs.110).aspx
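A sketch of what that might look like, assuming the entity classes are under your control (with database-first generated classes the attribute has to go on the generated property, e.g. via a customized T4 template, since a partial class cannot add attributes to existing members):
using System.Collections.Generic;
using System.Runtime.Serialization;

public class Person
{
    public int PersonId { get; set; }
    public string PersonName { get; set; }

    [IgnoreDataMember] // the navigation property is skipped during serialization
    public virtual ICollection<Student> Students { get; set; }
}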
It also sounds like you are trying to do table inheritance, which is a different problem with EF:
http://www.asp.net/mvc/overview/getting-started/getting-started-with-ef-using-mvc/implementing-inheritance-with-the-entity-framework-in-an-asp-net-mvc-application
http://www.entityframeworktutorial.net/code-first/inheritance-strategy-in-code-first.aspx
If I understand you correctly, you're just trying to make sure you only get what you specifically ask for, right?
This was mentioned a little above, but to do this correctly you just want to select an anonymous type.
var students = from s in _context.Students
               select new
               {
                   s.StudentId,
                   s.StudentNo
               };
Then, when you want to update this collection/object, I'd recommend using GraphDiff. GraphDiff really helps with the problems of disconnected entities and updates (https://github.com/refactorthis/GraphDiff).
So your method would look similar to this:
void UpdateStudent(Student student)
{
    _context.UpdateGraph(student, map => map
        .AssociatedEntity(c => c.Person));
    _context.SaveChanges();
}
This way, you're able to update whatever properties on an object, disconnected or not, and not worry about the association.
This is assuming that you correctly mapped your entities, and honestly, I find it easier to declare the object as a property, not just the ID, and use a mapping file to map it correctly.
So:
class Person
{
    public int Id { get; set; }
    public string Name { get; set; }
}

class Student
{
    public int Id { get; set; }
    public string StudentNo { get; set; }
    public int PersonId { get; set; }
    public Person Person { get; set; }
}

public class StudentMap : EntityTypeConfiguration<Student>
{
    public StudentMap()
    {
        // Primary Key
        HasKey(t => t.Id);

        // Table & Column Mappings
        ToTable("Students");
        Property(t => t.Id).HasColumnName("StudentId");

        // Relationships
        HasRequired(t => t.Person)
            .WithMany()
            .HasForeignKey(d => d.PersonId);
    }
}
Hopefully that makes sense. You don't need to create a view model, but you definitely can. This way does make it easier to map disconnected items back to the database though.
I had the exact same situation.
All I did to solve it was ask for Student.ToList() before I asked for Persons.ToList().
I didn't have to disable lazy loading. You just need to load the table that holds the reference to the other table first; after that you can load the other table, and the first table's results are already in memory and don't get "fixed up" with all the references.
They are automatically linked in the ObjectContext by their EntityKey. Depending on what you want to do with your Persons and Students, you can Detach them from the ObjectContext:
using (var ctx = new myEntities())
{
    ctx.Configuration.ProxyCreationEnabled = false;
    ctx.Configuration.LazyLoadingEnabled = false;

    AllPersons = ctx.Persons.ToList();
    foreach (var c in AllPersons)
    {
        // Detach lives on the underlying ObjectContext, not on the DbContext itself
        ((IObjectContextAdapter)ctx).ObjectContext.Detach(c);
    }

    AllStudents = ctx.Student.ToList();
    foreach (var c in AllStudents)
    {
        ((IObjectContextAdapter)ctx).ObjectContext.Detach(c);
    }
}

Entity Framework - retrieve ID before 'SaveChanges' inside a transaction

In Entity Framework, is there any way to retrieve a newly created ID (identity) inside a transaction, before calling 'SaveChanges'?
I need the ID for a second insert, however it is always returned as 0...
ObjectContext objectContext = ((IObjectContextAdapter)context).ObjectContext;
objectContext.Connection.Open();
using (var transaction = objectContext.Connection.BeginTransaction())
{
    foreach (tblTest entity in saveItems)
    {
        this.context.Entry(entity).State = System.Data.EntityState.Added;
        this.context.Set<tblTest>().Add(entity);
        int testId = entity.TestID;
        // .... Add another item using testId
    }
    try
    {
        context.SaveChanges();
        transaction.Commit();
    }
    catch (Exception ex)
    {
        transaction.Rollback();
        objectContext.Connection.Close();
        throw ex;
    }
}
objectContext.Connection.Close();
The ID is generated by the database after the row is inserted to the table. You can't ask the database what that value is going to be before the row is inserted.
You have two ways around this - the easiest would be to call SaveChanges. Since you are inside a transaction, you can roll back in case there's a problem after you get the ID.
The second way would be not to use the database's built-in IDENTITY fields, but rather implement them yourself. This can be very useful when you have a lot of bulk insert operations, but it comes with a price - it's not trivial to implement.
EDIT: SQL Server 2012 has a built-in SEQUENCE type that can be used instead of an IDENTITY column, no need to implement it yourself.
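For example, a rough sketch of reading the next value of such a sequence from EF6 before the insert (the sequence name dbo.TestSequence is made up, it has to exist in the database, and TestID must no longer be an IDENTITY column for this to work):
// CREATE SEQUENCE dbo.TestSequence AS INT START WITH 1 INCREMENT BY 1;  -- run once in the database

int testId = context.Database
    .SqlQuery<int>("SELECT NEXT VALUE FOR dbo.TestSequence")
    .Single();

entity.TestID = testId;   // known before SaveChanges
// ... add the second item using testId, then SaveChanges() and Commit() as before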
As others have already pointed out, you have no access to the identity value generated by the database before SaveChanges() is called. However, if you are only interested in the id as a means to link the row to another entity (e.g. in the same transaction), then you can also rely on the temporary ids assigned by EF Core:
Depending on the database provider being used, values may be generated client side by EF or in the database. If the value is generated by the database, then EF may assign a temporary value when you add the entity to the context. This temporary value will then be replaced by the database generated value during SaveChanges().
Here is an example to demonstrate how this works. Say MyEntity is referenced by MyOtherEntity via the property MyEntityId, which needs to be assigned before SaveChanges is called.
var x = new MyEntity();      // x.Id = 0
dbContext.Add(x);            // x.Id = -2147482624 <-- EF Core generated temporary id
var y = new MyOtherEntity(); // y.Id = 0
dbContext.Add(y);            // y.Id = -2147482623 <-- EF Core generated temporary id
y.MyEntityId = x.Id;         // y.MyEntityId = -2147482624
await dbContext.SaveChangesAsync();
Debug.WriteLine(x.Id);          // 1261 <-- EF Core replaced the temporary id with the "real" one
Debug.WriteLine(y.MyEntityId);  // 1261 <-- the reference was also adjusted by EF Core
The above also works when assigning references via navigational properties, i.e. y.MyEntity = x instead of y.MyEntityId = x.Id
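For completeness, a sketch of that variant (assuming MyOtherEntity exposes a MyEntity navigation property):
var x = new MyEntity();
var y = new MyOtherEntity { MyEntity = x };  // no id handling at all
dbContext.Add(x);
dbContext.Add(y);
await dbContext.SaveChangesAsync();          // EF Core fills in y.MyEntityId from the generated x.Id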
If your tblTest entity is connected to other entities that you want to attach, you don't need the Id to create the relation. Let's say tblTest is related to an anotherTest object, in such a way that anotherTest has both tblTest and tblTestId properties. In that case you can write this code:
using (var transaction = objectContext.Connection.BeginTransaction())
{
    foreach (tblTest entity in saveItems)
    {
        this.context.Entry(entity).State = System.Data.EntityState.Added;
        this.context.Set<tblTest>().Add(entity);
        anotherTest.tblTest = entity;
        // ....
    }
}
After submitting, the relation is created and you don't need to worry about the Ids at all.
You can retrieve an ID before calling .SaveChanges() by using the Hi/Lo algorithm. The id will be assigned to the object as soon as it is added to the DbContext.
Example configuration with fluent api:
protected override void OnModelCreating(ModelBuilder modelBuilder)
{
    modelBuilder.Entity<Entity>(e =>
    {
        e.Property(x => x.Id).UseHiLo();
    });
}
An excerpt from the relevant Microsoft article:
The Hi/Lo algorithm is useful when you need unique keys before committing changes. As a summary, the Hi-Lo algorithm assigns unique identifiers to table rows while not depending on storing the row in the database immediately. This lets you start using the identifiers right away, as happens with regular sequential database IDs.
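At runtime that plays out roughly like this (a sketch; entity and context names follow the configuration above, and UseHiLo also adds a database sequence through migrations):
var entity = new Entity();
dbContext.Add(entity);     // EF Core reserves a block from the sequence and assigns a real key here
var id = entity.Id;        // already a unique, final value - before SaveChanges
dbContext.SaveChanges();   // the row is inserted with that same id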
@zmbq is right, you can only get the id after calling SaveChanges.
My suggestion is that you should NOT rely on the database-generated IDs.
The database should be only a detail of your application, not an integral and unchangeable part.
If you can't get around that issue, use a GUID as an identifier, due to its uniqueness.
MSSQL supports GUID as a native column type and it's fast (though not faster than INT).
Cheers
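A sketch of the GUID approach (the class and property names here are made up for illustration):
public class TestItem
{
    // client-generated key: known before SaveChanges and unique across machines
    public Guid Id { get; set; } = Guid.NewGuid();
    public string Name { get; set; }
}

var item = new TestItem { Name = "first" };
var related = new RelatedItem { TestItemId = item.Id }; // the id is usable immediately
context.TestItems.Add(item);
context.RelatedItems.Add(related);
context.SaveChanges();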
A simple workaround for this would be:
var ParentRecord = new ParentTable()
{
    SomeProperty = "Some Value",
    AnotherProperty = "Another Property Value"
};
ParentRecord.ChildTable.Add(new ChildTable()
{
    ChildTableProperty = "Some Value",
    ChildTableAnotherProperty = "Some Another Value"
});
db.ParentTable.Add(ParentRecord);
db.SaveChanges();
where ParentTable and ChildTable are two tables connected with a foreign key.
You can look up the value in the ChangeTracker like this:
var newEntity = new MyEntity();
newEntity.Property = 123;
context.Add(newEntity);

// specify a logic to identify the right entity here:
var entity = context.ChangeTracker.Entries()
    .FirstOrDefault(e => e.Entity is MyEntity myEntity &&
                         myEntity.Property == newEntity.Property);

// In this case we look up the value of an autogenerated id of type int/long;
// it will be a negative temporary value like -21445363467
var value = entity?.Properties
    .FirstOrDefault(pe => pe.Metadata.GetColumnName() == nameof(MyEntity.Id))?.CurrentValue;

// if you set it on another entity, it will be replaced on SaveChanges()
My setup was MySQL 5.7, but this should work in other environments as well.

C#, Linq2SQL - tricks to fetch a ViewModel object with relation data?

I don't know Linq2Sql very well yet, and I was wondering if there is a trick for this probably quite common MVVM scenario. I have a Linq2Sql data context containing domain models, but I am fetching data for my customized ViewModel object from it.
var query = from ord in ctx.Table_Orders
            select new OrderViewModel()
            {
                OrderId = ord.OrderId,
                OrderSum = ord.OrderSum,
                OrderCurrencyId = ord.OrderCurrencyId,
                OrderCurrencyView = ord.Currency.CurrencyText
            };
So I want my ViewModel to include both the CurrencyId from the domain object and the CurrencyText from the related table, to show it nicely in the View.
This code works great. It generates one DB call with a join to fetch the CurrencyText. But the model is simplified; the real one has many more fields. I want to make the code reusable because I have many different queries that return the same ViewModel. Right now every minor change to OrderViewModel requires lots of maintenance.
So I moved the code to OrderViewModel itself as a constructor.
public OrderViewModel(Table_Order ord)
{
    OrderId = ord.OrderId;
    OrderSum = ord.OrderSum;
    OrderCurrencyId = ord.OrderCurrencyId;
    OrderCurrencyView = ord.Currency.CurrencyText;
}
And I call it like this:
var query = from ord in ctx.Table_Orders
            select new OrderViewModel(ord);
The problem: the join is gone and the DB query is no longer optimised. Now I get 1+N calls to the database, fetching the CurrencyText separately for every line.
Any comments are welcome. Maybe I have missed a different, better approach.
This is as far as I could get on my own towards code reusability. I created a function that does the job and takes multiple parameters. Then I need to explicitly pass it everything that crosses the boundary of the entity.
var query = ctx.Table_Orders.Select(m =>
    new OrderViewModel(m, m.Currency.CurrencyText));
The DB call is optimised again, but it still does not feel like I am there yet! What tricks do you know for this case?
EDIT : The final solution
Thanks to a hint by @Muhammad Adeel Zahid I arrived at this solution.
I created an extension method for IQueryable:
public static class Mappers
{
    public static IEnumerable<OrderViewModel> OrderViewModels(this IQueryable<Table_Order> q)
    {
        return from ord in q
               select new OrderViewModel()
               {
                   OrderId = ord.OrderId,
                   OrderSum = ord.OrderSum,
                   OrderCurrencyId = ord.OrderCurrencyId,
                   OrderCurrencyView = ord.Currency.CurrencyText
               };
    }
}
Now I can do this to get the whole list:
var orders = ctx.Table_Orders.OrderViewModels().ToList();
or this to get a single item, or anything in between with Where(x => ..):
var order = ctx.Table_Orders
    .Where(x => x.OrderId == id).OrderViewModels().SingleOrDefault();
And that completely solves this question. The generated SQL is perfect and the object-translation code is reusable. An approach like this should work with both LINQ to SQL and LINQ to Entities (not tested with the latter). Thank you again @Muhammad Adeel Zahid.
Whenever we query the database, we usually want either an enumeration of objects (more than one record in the db) or a single entity (one record in the db). You can write your mapping code in a method that returns an enumeration for the whole table, like:
public IEnumerable<OrderViewModel> GetAllOrders()
{
    return from ord in ctx.Table_Orders
           select new OrderViewModel()
           {
               OrderId = ord.OrderId,
               OrderSum = ord.OrderSum,
               OrderCurrencyId = ord.OrderCurrencyId,
               OrderCurrencyView = ord.Currency.CurrencyText
           };
}
Now you may want to filter these records and return another enumeration, for example by CurrencyID:
public IEnumerable<OrderViewModel> GetOrdersByCurrency(int CurrencyID)
{
    return GetAllOrders().Where(x => x.CurrencyId == CurrencyID);
}
You may also want to find a single record out of all these view models:
public OrderViewModel GetOrder(int OrderID)
{
    return GetAllOrders().SingleOrDefault(x => x.OrderId == OrderID);
}
The beauty of IEnumerable is that it keeps composing conditions onto the query and does not execute it until it is needed, so your whole table will not be loaded unless you really want it, and you have kept your mapping code in a single place. Now, if there are any changes to the ViewModel mapping or to the query itself, they only have to be made in the GetAllOrders() method; the rest of the code stays unchanged.
You can avoid the N+1 queries problem by having Linq2SQL eagerly load the referenced entities you need to construct your viewmodels. This way you can build one list of objects (and some referenced objects) and use it to construct everything. Have a look at this blog post.
One word of warning though: this technique (setting LoadOptions on the Linq2SQL data context) can only be done once per data context. If you need to perform a second query with a different eager-loading configuration, you must re-initialize your data context. I automated this with a simple wrapper class around my context.
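For reference, a rough sketch of the eager-loading setup with Linq2Sql's DataLoadOptions (entity names follow the question; the context type and connection-string names here are hypothetical):
// requires System.Data.Linq for DataLoadOptions
var ctx = new MyDataContext(connString);             // hypothetical generated context
var loadOptions = new DataLoadOptions();
loadOptions.LoadWith<Table_Order>(o => o.Currency);  // Currency is fetched together with each order
ctx.LoadOptions = loadOptions;                       // must be set before the first query on this context

var orders = (from ord in ctx.Table_Orders
              select new OrderViewModel(ord)).ToList(); // constructor mapping without the 1+N queries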
