How can I add an entity to the database through EF4 without attaching all of its referenced entities first?
var entity = new MyEntity() { FK_ID = 4 }; // associated entity key
using (var db = new MyEntities())
{
db.MyEntity.AddObject(entity);
db.SaveChanges();
db.AcceptAllChanges();
}
This code keeps trying to also insert a new default-valued FK_Entity.
I see some suggestions online that I need to attach the FK_Entity first and then set the MyEntity.FK property to the attached FK_Entity, but that seems awful. 1) I assume attaching requires loading the FK_Entity, which I don't need just to insert the entity; I already gave it the right key, and SQL will enforce referential integrity if I make a mistake. 2) As I add more references, do I have to attach each one?
I can't find any options or overloads for suppressing cascaded inserts. Am I thinking about this wrong?
Thanks.
What you can try is creating 'dummy' FK entities with only the ID property set for each one, and making sure they are EntityState.Unchanged in the ObjectStateManager, so that EF doesn't try to 'update' and rewrite all the other properties of the FK entities.
I don't have an opportunity to test this, but something along the lines of:
var fkEntity = new FK_Entity { ID = 4 };
var entity = new MyEntity { FK_Entity = fkEntity };
using (var db = new MyEntities())
{
db.MyEntity.AddObject(entity);
ObjectStateEntry fkEntry = db.ObjectStateManager.GetObjectStateEntry(fkEntity);
// You can check the state here while debugging, but it's probably `Added`
fkEntry.ChangeState(EntityState.Unchanged);
db.SaveChanges();
}
I found that the MVC model binder (or something) was creating an empty FK_Entity that I didn't want. So I set Exclude = "FK_Entity" in the [Bind] attribute on the Create() controller action, and continue to just set the FK_ID property.
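Roughly what that looks like in the controller (the action signature and names here are placeholders, not my exact code):
// Hypothetical Create action: exclude the navigation property from model binding
// so the binder doesn't build an empty FK_Entity, and keep setting the FK value.
[HttpPost]
public ActionResult Create([Bind(Exclude = "FK_Entity")] MyEntity entity)
{
    using (var db = new MyEntities())
    {
        db.MyEntity.AddObject(entity); // entity.FK_ID already carries the association
        db.SaveChanges();
    }
    return RedirectToAction("Index");
}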
I know I could use ViewModels for this, translate the objects in and out of EF types, but I want to avoid that overhead for now.
I'm a C# rookie. Below is my code; I've been trying for hours now to get it to update some fields in my DB, and I've tried many different implementations without luck.
// Select all fields to update
using (var db = new Entities())
{
// dbFields are trusted values
var query = db.tblRecords
.Where("id == " + f.id)
.Select("new(" + string.Join(",", dbFields.Keys) + ")");
foreach (var item in query)
{
foreach (PropertyInfo property in query.ElementType.GetProperties())
{
if (dbFields.ContainsKey(property.Name))
{
// Set the value to view in debugger - should be dynamic cast eventually
var value = Convert.ToInt16(dbFields[property.Name]);
property.SetValue(item, value);
// Something like this throws error 'Object does not match target type'
// property.SetValue(query, item);
}
}
}
db.SaveChanges();
}
The above code, when run, does not result in any changes to the DB. Obviously this code needs a bit of cleanup, but I'm trying to get the basic functionality working. I believe I might need to somehow reapply 'item' back into 'query', but I've had no luck getting that to work; no matter what implementation I try, I always receive 'Object does not match target type'.
This semi-similar issue reaffirms that, but it isn't very clear to me since I'm using a Dynamic LINQ query and cannot just reference the property names directly: https://stackoverflow.com/a/25898203/3333134
Entity Framework will perform updates for you on entities, not on custom results. Your tblRecords holds many entities, and this is what you want to manipulate if you want Entity Framework to help. Remove your projection (the call to Select) and the query will return the objects directly (with too many columns, yes, but we'll cover that later).
The dynamic update is performed the same way any other dynamic assignment in C# would be, since you've got a normal object to work with. Entity Framework will track the changes you make and, upon calling SaveChanges, will generate and execute the corresponding SQL queries.
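A rough sketch of that first approach, keeping your reflection loop but dropping the projection (untested; this assumes the element type of tblRecords is also named tblRecords):
// Select the entities themselves so EF can track them
using (var db = new Entities())
{
    var query = db.tblRecords.Where("id == " + f.id); // dbFields are trusted values
    foreach (var item in query)
    {
        foreach (PropertyInfo property in typeof(tblRecords).GetProperties())
        {
            if (dbFields.ContainsKey(property.Name))
            {
                var value = Convert.ToInt16(dbFields[property.Name]);
                property.SetValue(item, value);
            }
        }
    }
    // The assignments above were made on tracked entities, so this issues UPDATEs
    db.SaveChanges();
}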
However, if you want to optimize and avoid selecting and materializing all the values in memory in the first place, even those that aren't needed, you can also perform the update without the initial query. If you create an object of the right type yourself and assign the right ID, you can then use the Attach() method to add it to the current context. From that point on, any changes will be recorded by Entity Framework, and when you call SaveChanges, everything should be sent to the database:
// Select all fields to update
using (var db = new Entities())
{
// Assuming the entity contained in tblRecords is named "ObjRecord"
// Also assuming that the entity has a key named "id"
var objToUpdate = new ObjRecord { id = f.id };
// Any changes made to the object so far won't be considered by EF
// Attach the object to the context
db.tblRecords.Attach(objToUpdate);
// EF now tracks the object, any new changes will be applied
foreach (PropertyInfo property in typeof(ObjRecord).GetProperties())
{
if (dbFields.ContainsKey(property.Name))
{
// Set the value to view in debugger - should be dynamic cast eventually
var value = Convert.ToInt16(dbFields[property.Name]);
property.SetValue(objToUpdate, value);
}
}
// Will only perform an UPDATE query, no SELECT at all
db.SaveChanges();
}
When you do a SELECT NEW ... it selects only specific fields and won't track updates for you. I think if you change your query to be this it will work:
var query = db.tblRecords.Where(x => x.id == f.id);
I have already read many posts about the Entity Framework problem with many-to-many relationships, and it's being a pain in the neck again.
Colaborador has an ICollection of Time, and Time has an ICollection of Colaborador.
Some people say that it's necessary to attach the child entity before adding to the context (that didn't work for me: PK error).
I am using Simple Injector and my context is per request.
My associative table is mapped like this:
HasMany<Time>(c => c.Times)
.WithMany(t => t.Colaboradores)
.Map(ct =>
{
ct.MapLeftKey("ColaboradorId");
ct.MapRightKey("TimeId");
ct.ToTable("Receptor");
});
It creates the associative table in the database.
When I try to insert a Colaborador (entity), I add some Times (teams) to its list, add it to the DbContext and then call SaveChanges().
When I do this, it creates a new Colaborador and inserts correctly into the associative table (the IDs), but it also duplicates the Time.
var colaborador = Mapper.Map<ColaboradorViewModel, Colaborador>(colaboradorVm);
List<TimeViewModel> timesVm = new List<TimeViewModel>();
colaboradorVm.TimesSelecionados.ForEach(t => timesVm.Add(_serviceTime.BuscarPorId(t)));
colaborador.Times = Mapper.Map<ICollection<TimeViewModel>, ICollection<Time>>(timesVm);
The function BuscarPorId calls the Find method and returns a Time.
I have figured out that if I call the Add command, the entity will mark the child's state as Added as well, but if I attempt to attach the Time or change its state to Unchanged, I get a primary key error...
foreach (var item in colaborador.Times)
{
lpcContext.Set<Time>().Attach(item);
//lpcContext.Entry(item).State = EntityState.Unchanged;
}
Is there any way to tell Entity Framework not to insert a specific child, so that only the main and associative tables are populated?
Mapper.Map creates new Time objects which are not attached to your context, so you must attach them as Unchanged; but attaching them causes another error due to a duplicate PK, because your context is already tracking the original copies of the Time entities. Using the Find method will retrieve these tracked, locally cached entities.
Find and use the entity already attached to your context:
Instead of:
colaborador.Times = Mapper.Map<ICollection<TimeViewModel>, ICollection<Time>>(timesVm);
Use:
var times = new List<Time>();
var dbSet = lpcContext.Set<Time>();
foreach( var t in timesVm )
{
var time = dbSet.Find( t.Id );
if( null == time )
{
time = Mapper.Map<TimeViewModel, Time>( t );
}
times.Add( time );
}
colaborador.Times = times;
I'm using an SQLite database with System.Data.SQLite 1.0.92.
There are two tables here:
Table Person:
PersonId
PersonName
Table Student:
StudentId
PersonId (FK referencing table Person)
StudentNo
Now every time I get the Persons Collection in EF5:
using (var ctx = new myEntities())
{
AllPersons = ctx.Persons.ToList();
}
The AllPersons.Student collection is also included in the result,
but I don't need it. Of course that's just an example; there are a lot of big tables with many references, and this always causes performance problems.
So I'm trying to keep the references out of my result. I changed it to:
using (var ctx = new myEntities())
{
ctx.Configuration.ProxyCreationEnabled = false;
ctx.Configuration.LazyLoadingEnabled = false;
AllPersons= ctx.Persons.ToList();
}
Now it's fine, because the AllPersons.Student collection will always be null.
But then I found that if I get Person and Student together:
using (var ctx = new myEntities())
{
ctx.Configuration.ProxyCreationEnabled = false;
ctx.Configuration.LazyLoadingEnabled = false;
AllPersons= ctx.Persons.ToList();
AllStudents = ctx.Student.ToList();
}
Now the references are still included.
So is there any way to prevent the references from being included in this situation?
Thank you.
Update
At some commenters' request, I'll explain why I need this:
1: When I convert it to JSON, it becomes an infinite loop. Even though I already use Json.NET's ReferenceLoopHandling, the JSON is so big that it crashes the server (without the references it's just a very small JSON document).
2: Every time I receive data from the client and need to save it, it throws an exception about the model state until I set the collection to null.
Example:
using (myEntities ctx = new myEntities())
{
ctx.Configuration.LazyLoadingEnabled = false;
ctx.Configuration.ProxyCreationEnabled = false;
Person model= ThisIsAModel();
model.students = null; // This is a key, I need set the students collection references to null , otherwise it will throw exception
ctx.Entry(model).State = EntityState.Modified;
ctx.SaveChanges();
}
3: This is the more important problem. I already fetch all the data and cache it on the server, but it makes the loading time very long when the server starts (because there is so much data and so many references; that is the main problem). I don't know what kind of problem I'll run into next...
public List<Person> PersonsCache; // global cache
public List<Student> StudentsCache; // global cache
using (myEntities ctx = new myEntities())
{
ctx.Configuration.LazyLoadingEnabled = false;
ctx.Configuration.ProxyCreationEnabled = false;
// There is so much data and so many references that the first full cache load is very slow, even when I only fetch the Person model, just because some collections have reference problems. It is very slow...
PersonsCache = ctx.Persons.ToList();
StudentsCache= ctx.Student.ToList();
}
The Problem
As you said, when you load both the parent and the child lists, even with lazy loading disabled, and then look at parent.Childs, you see that the child items have been loaded too.
var db = new YourDbContext();
db.Configuration.LazyLoadingEnabled = false;
var parentList= db.YourParentSet.ToList();
var childList= db.YourChildSet.ToList();
What happened? Why are the children included in a parent?
The children under a parent entity are exactly those you loaded using db.YourChildSet.ToList(). In fact, Entity Framework never loads the children for a parent again; but because of the relation between parent and child in the EDMX, they are listed there.
Does that affect performance?
Since the children are only loaded once, loading the data has no extra performance impact.
But for the sake of serialization or something else, how can I get rid of it?
You can use these solutions:
Solution 1:
Use two different instances of YourDbContext:
var db1 = new YourDbContext();
db1.Configuration.LazyLoadingEnabled = false;
var parentList = db1.YourParentSet.ToList();
var db2 = new YourDbContext();
db2.Configuration.LazyLoadingEnabled = false;
var childList = db2.YourChildSet.ToList();
Now when you look at parent.Childs there are no children in it.
Solution 2:
Use projection to shape your output as you want, and use that.
var db1 = new YourDbContext();
db1.Configuration.LazyLoadingEnabled = false;
var parentList = db1.YourParentSet
    .Select(x => new /*Model()*/ {
        Property1 = x.Property1,
        Property2 = x.Property2, ...
    }).ToList();
This way there is nothing annoying there when it comes to serialization.
Using a custom Model class is optional and in some cases is recommended.
Additional Resources
As a developer who uses Entity Framework, reading these resources is strongly recommended:
Performance Considerations for Entity Framework 4, 5, and 6
Connection Management
I'll focus on your third problem because that seems to be your most urgent problem. Then I'll try to give some hints on the other two problems.
There are two Entity Framework features you should be aware of:
When you load data into a context, Entity Framework will try to connect the objects wherever they're associated. This is called relationship fixup. You can't stop EF from doing that. So if you load Persons and Students separately, a Person's Students collection will contain students, even though you didn't Include() them.
By default, a context caches all data it fetches from the database. Moreover, it stores meta data about the objects in its change tracker: copies of their individual properties and all associations. So by loading many objects the internal cache grows, but also the size of the meta data. And the ever-running relationship fixup process gets slower and slower (although it may help to postpone it by turning off automatic change detection). All in all, the context gets bloated and slow like a flabby rhino.
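(As an aside, postponing change detection looks roughly like this; just a sketch, and not something the caching approach below relies on:)
ctx.Configuration.AutoDetectChangesEnabled = false; // postpone automatic change detection
// ... add or modify many entities ...
ctx.ChangeTracker.DetectChanges(); // run detection once, manually
ctx.SaveChanges();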
I understand you want to cache data in separate collections for each entity. Two simple modifications will make this much quicker:
Evade the inevitable relationship fixup by loading each collection with a separate context
Stop caching (in the context) and change tracking by getting the data with AsNoTracking.
Doing this, your code will look like this:
public List<Person> PersonsCache;
public List<Student> StudentsCache;
using (myEntities ctx = new myEntities())
{
ctx.Configuration.ProxyCreationEnabled = false;
PersonsCache = ctx.Persons
.AsNoTracking()
.ToList();
}
using (myEntities ctx = new myEntities())
{
ctx.Configuration.ProxyCreationEnabled = false;
StudentsCache= ctx.Student
.AsNoTracking()
.ToList();
}
The reason for turning off ProxyCreationEnabled is that you'll get light objects and that you'll never inadvertently trigger lazy loading afterwards (throwing an exception that the context is no longer available).
Now you'll have cached objects that are not inter-related and that get fetched as fast as it gets with EF. If this isn't fast enough you'll have to resort to other tools, like Dapper.
By the way, your very first code snippet and problem description...
using (var ctx = new myEntities())
{
AllPersons = ctx.Persons.ToList();
}
The AllPersons.Student collection is also included in the result.
...suggest that Entity Framework spontaneously performs eager loading (of students) without you Include-ing them. I have to assume that your code snippet is not complete. EF never, ever automatically executes eager loading. (Unless, maybe, you have some outlandish and buggy query provider).
As for the first problem, the serialization. You should be able to tackle that in a similar way as shown above. Just load the data you want to serialize in isolation and disable proxy creation. Or, as suggested by others, serialize view models or anonymous types exactly containing what you need there.
As for the second problem, the validation exception. I can only imagine this happening if you initialize a students collection with default, empty Student objects. These are bound to be invalid. If this is not the case, I suggest you ask a new question about this specific problem, showing ample detail about the involved classes and mappings. That shouldn't be dealt with in this question.
Explicitly select what you want to return from the Database.
Use select new. With the select new clause, you can create new objects of an anonymous type as the result of a query and keep the references out. This syntax allows you to construct anonymous data structures. These are created as they are evaluated (lazily). Like this:
using (var ctx = new myEntities())
{
var AllPersons = ctx.People.Select(c => new {c.PersonId, c.PersonName}).ToList();
}
And you don't even need to disable lazy loading anymore.
The query above allocates an anonymous type using select new { }, which requires you to use var. If you want to use a known type instead, add it to your select clause:
private IEnumerable<MyClass> AllPersons;//global variable
using (var ctx = new myEntities())
{
AllPersons = ctx.People
.Select(c => new MyClass { PersonId = c.PersonId, PersonName = c.PersonName }).ToList();
}
And:
public class MyClass
{
public string PersonId { get; set; }
public string PersonName { get; set; }
}
If the entities are auto-generated, copy and paste them into your own code and remove the generated relations, such as the child collection and the foreign key. Or, if you don't need all of this functionality, you might be able to use a lightweight framework like Dapper.
Normally your Student collection isn't filled from the database; it's filled when you access the property. In addition, if you use the ToList() method, Entity Framework reads the data from the database to fill your collection.
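Roughly how that behaves with a proxy-enabled context and a virtual navigation property (the class shape is assumed):
public class Person
{
    public int PersonId { get; set; }
    public string PersonName { get; set; }
    public virtual ICollection<Student> Students { get; set; } // virtual enables lazy loading
}

using (var ctx = new myEntities())
{
    var person = ctx.Persons.First();
    // Students has not been loaded yet; this property access triggers a second query:
    var studentCount = person.Students.Count;
}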
Please check these:
https://msdn.microsoft.com/en-us/data/jj574232.aspx#lazy
https://msdn.microsoft.com/en-us/library/vstudio/dd456846(v=vs.100).aspx
Is there any way to prevent the references from being included in this situation?
The solution to this seems to be very simple: don't map the association. Remove the Student collection. Not much more I can say about it.
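If the model is Code First and you want to keep the CLR property but leave it out of the mapping, a sketch of that (with a database-first EDMX you would instead delete the navigation property in the designer):
protected override void OnModelCreating(DbModelBuilder modelBuilder)
{
    // Exclude the navigation property from the EF model entirely
    modelBuilder.Entity<Person>().Ignore(p => p.Students);
    base.OnModelCreating(modelBuilder);
}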
Decorate any properties with [IgnoreDataMember] if you are using 4.5+
https://msdn.microsoft.com/en-us/library/system.runtime.serialization.ignoredatamemberattribute(v=vs.110).aspx
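For example (the property names are placeholders):
using System.Runtime.Serialization;

public class Person
{
    public int PersonId { get; set; }
    public string PersonName { get; set; }

    [IgnoreDataMember] // serializers that honor data contracts will skip this property
    public virtual ICollection<Student> Students { get; set; }
}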
It also sounds like you are trying to do table inheritance, which is a different problem with EF:
http://www.asp.net/mvc/overview/getting-started/getting-started-with-ef-using-mvc/implementing-inheritance-with-the-entity-framework-in-an-asp-net-mvc-application
http://www.entityframeworktutorial.net/code-first/inheritance-strategy-in-code-first.aspx
If I understand you correctly, you're just trying to make sure you only get what you specifically ask for, right?
This was mentioned a little above, but to do this correctly you just want to select an anonymous type.
var students = from s in _context.Students
               select new
               {
                   s.StudentId,
                   s.StudentNo
               };
Then, when you want to update this collection/object, I'd recommend using GraphDiff. GraphDiff really helps with the problems of disconnected entities and updates (https://github.com/refactorthis/GraphDiff).
So your method would look similar to this:
void UpdateStudent(Student student){
_context.UpdateGraph(student, map =>
map
.AssociatedEntity(c => c.Person));
_context.SaveChanges();
}
This way, you're able to update whatever properties on an object, disconnected or not, and not worry about the association.
This is assuming that you correctly mapped your entities, and honestly, I find it easier to declare the object as a property, not just the ID, and use a mapping file to map it correctly.
So:
class Person
{
    public int Id { get; set; }
    public string Name { get; set; }
}
class Student
{
    public int Id { get; set; }
    public string StudentNo { get; set; }
    public int PersonId { get; set; } // FK column used in the mapping below
    public Person Person { get; set; }
}
public class StudentMap : EntityTypeConfiguration<Student>
{
public StudentMap()
{
// Primary Key
HasKey(t => t.Id);
// Table & Column Mappings
ToTable("Students");
Property(t => t.Id).HasColumnName("StudentId");
// Relationships
HasRequired(t => t.Person)
    .WithMany()
    .HasForeignKey(d => d.PersonId);
}
}
Hopefully that makes sense. You don't need to create a view model, but you definitely can. This way does make it easier to map disconnected items back to the database though.
I had the exact same situation.
All I did to solve it was ask for Student.ToList() before I asked for Persons.ToList().
I didn't have to disable lazy loading. You just need to load the table that has the reference to the other table first; after that you can load the other table, and the first table's results are already in memory and don't get "fixed up" with all the references.
They are automatically linked in the ObjectContext by their EntityKey. Depending on what you want to do with your Persons and Students, you can detach them from the context:
using (var ctx = new myEntities())
{
    ctx.Configuration.ProxyCreationEnabled = false;
    ctx.Configuration.LazyLoadingEnabled = false;
    AllPersons = ctx.Persons.ToList();
    foreach (var c in AllPersons)
    {
        // DbContext equivalent of ObjectContext.Detach
        ctx.Entry(c).State = EntityState.Detached;
    }
    AllStudents = ctx.Student.ToList();
    foreach (var c in AllStudents)
    {
        ctx.Entry(c).State = EntityState.Detached;
    }
}
In Entity Framework - Is there any way to retrieve a newly created ID (identity) inside a transaction before calling 'SaveChanges'?
I need the ID for a second insert, however it is always returned as 0...
ObjectContext objectContext = ((IObjectContextAdapter)context).ObjectContext;
objectContext.Connection.Open();
using (var transaction = objectContext.Connection.BeginTransaction())
{
foreach (tblTest entity in saveItems)
{
this.context.Entry(entity).State = System.Data.EntityState.Added;
this.context.Set<tblTest>().Add(entity);
int testId = entity.TestID;
.... Add another item using testId
}
try
{
context.SaveChanges();
transaction.Commit();
}
catch (Exception ex)
{
transaction.Rollback();
objectContext.Connection.Close();
throw ex;
}
}
objectContext.Connection.Close();
The ID is generated by the database after the row is inserted to the table. You can't ask the database what that value is going to be before the row is inserted.
You have two ways around this - the easiest would be to call SaveChanges. Since you are inside a transaction, you can roll back in case there's a problem after you get the ID.
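A sketch of that first option, adapted from your loop (same names as in your snippet, untested):
using (var transaction = objectContext.Connection.BeginTransaction())
{
    try
    {
        foreach (tblTest entity in saveItems)
        {
            context.Set<tblTest>().Add(entity);
            context.SaveChanges();      // the INSERT runs now, inside the transaction
            int testId = entity.TestID; // the identity value is populated at this point
            // ... add the dependent item using testId ...
        }
        transaction.Commit();
    }
    catch
    {
        transaction.Rollback();
        throw;
    }
}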
The second way would be not to use the database's built in IDENTITY fields, but rather implement them yourself. This can be very useful when you have a lot of bulk insert operations, but it comes with a price - it's not trivial to implement.
EDIT: SQL Server 2012 has a built-in SEQUENCE type that can be used instead of an IDENTITY column, no need to implement it yourself.
As others have already pointed out, you have no access to the increment value generated by the database before SaveChanges() is called. However, if you are only interested in the id as a means to make a connection to another entity (e.g. in the same transaction), then you can also rely on the temporary ids assigned by EF Core:
Depending on the database provider being used, values may be generated client side by EF or in the database. If the value is generated by the database, then EF may assign a temporary value when you add the entity to the context. This temporary value will then be replaced by the database generated value during SaveChanges().
Here is an example to demonstrate how this works. Say MyEntity is referenced by MyOtherEntity via the property MyEntityId, which needs to be assigned before SaveChanges is called.
var x = new MyEntity(); // x.Id = 0
dbContext.Add(x); // x.Id = -2147482624 <-- EF Core generated id
var y = new MyOtherEntity(); // y.Id = 0
dbContext.Add(y); // y.Id = -2147482623 <-- EF Core generated id
y.MyEntityId = x.Id; // y.MyEntityId = -2147482624
dbContext.SaveChanges();
Debug.WriteLine(x.Id); // 1261 <- EF Core replaced temp id with "real" id
Debug.WriteLine(y.MyEntityId); // 1261 <- reference also adjusted by EF Core
The above also works when assigning references via navigational properties, i.e. y.MyEntity = x instead of y.MyEntityId = x.Id
If your tblTest entity is connected to other entities that you want to attach, you don't need the Id to create the relation. Let's say tblTest is related to an anotherTest object, in the sense that anotherTest has both tblTest and tblTestId properties; in that case you can have this code:
using (var transaction = objectContext.Connection.BeginTransaction())
{
foreach (tblTest entity in saveItems)
{
this.context.Entry(entity).State = System.Data.EntityState.Added;
this.context.Set<tblTest>().Add(entity);
anotherTest.tblTest = entity;
....
}
}
After submitting, the relation will be created and you don't need to worry about the Ids.
You can retrieve an ID before calling .SaveChanges() by using the Hi/Lo algorithm. The id will be assigned to the object once it is added to the DbContext.
Example configuration with fluent api:
protected override void OnModelCreating(ModelBuilder modelBuilder)
{
modelBuilder.Entity<Entity>(e =>
{
e.Property(x => x.Id).UseHiLo();
});
}
An excerpt from the relevant Microsoft article:
The Hi/Lo algorithm is useful when you need unique keys before committing changes. As a summary, the Hi-Lo algorithm assigns unique identifiers to table rows while not depending on storing the row in the database immediately. This lets you start using the identifiers right away, as happens with regular sequential database IDs.
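With that configuration in place, usage looks roughly like this (a sketch):
var entity = new Entity();
dbContext.Add(entity);   // the Hi/Lo generator assigns a real, final Id here
var id = entity.Id;      // usable immediately, e.g. for a second insert
dbContext.SaveChanges(); // no identity round-trip is needed for the key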
@zmbq is right; you can only get the id after calling SaveChanges.
My suggestion is that you should NOT rely on the generated IDs of the database.
The database should be only a detail of your application, not an integral and unchangeable part.
If you can't get around that issue, use a GUID as an identifier because of its uniqueness.
MSSQL supports GUID as a native column type and it's fast (though not faster than INT).
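A rough sketch with a client-generated GUID key (the entity and set names are made up):
public class Order
{
    public Guid Id { get; set; } // maps to a uniqueidentifier column
    public string Description { get; set; }
}

var order = new Order { Id = Guid.NewGuid(), Description = "example" };
context.Orders.Add(order);
// order.Id is already known here and can be used for related rows before SaveChanges
context.SaveChanges();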
Cheers
A simple workaround for this would be:
var ParentRecord = new ParentTable () {
SomeProperty = "Some Value",
AnotherProperty = "Another Property Value"
};
ParentRecord.ChildTable.Add(new ChildTable () {
ChildTableProperty = "Some Value",
ChildTableAnotherProperty = "Some Another Value"
});
db.ParentTable.Add(ParentRecord);
db.SaveChanges();
Where ParentTable and ChildTable are two tables connected with a foreign key.
You can look up the value in the ChangeTracker like this:
var newEntity = new MyEntity();
newEntity.Property = 123;
context.Add(newEntity);
//specify a logic to identity the right entity here:
var entity = context.ChangeTracker.Entries()
.FirstOrDefault(e => e.Entity is MyEntity myEntity &&
myEntity.Property == newEntity.Property);
//In this case we look up the value for an autogenerated id of type int/long
//it will be a negative value like -21445363467
var value = entity.Properties?
.FirstOrDefault(pe => pe.Metadata.GetColumnName() == nameof(MyEntity.Id))?.CurrentValue;
//if you set it on another entity, it will be replaced on SaveChanges()
My setup was MySQL 5.7, but it should work in other environments as well.
I am thinking about how to use LINQ in the classic 3-tier architecture of a .NET project. Apparently, LINQ to SQL should appear in the data tier. The reason I chose LINQ is that it will save me much more time on code than using stored procedures. I did some searching online about the insert/update/delete methods of LINQ, but didn't find an appropriate method for record updates using entities. Usually, people do updates this way:
public void UpdateUser(String username, String password, int userId)
{
using (var db = new UserDataContext()){
var user = db.user.Single(p => p.Id == userId);
user.Username = username;
user.Password = password;
db.SubmitChanges();
}
}
Why don't we use an entity to pass the record, like this?
public void Update(Application info)
{
VettingDataContext dc = new VettingDataContext(_connString);
var query = (from a in dc.Applications
where a.Id==info.Id
select a).First();
query = info;
try{
dc.SubmitChanges();
}
catch(Exception e){
//...
}
}
But unfortunately, the above code is wrong because of "query = info"; if I instead assign each value from "info" to "query", it works fine, like:
query.firstName=info.firstName;
query.lastName=info.lastName;
So if this table has 40 fields, I have to write 40 lines of code. Is there an easier way to do the update? I hope I've described this issue clearly.
Adding another answer, as a comment was not sufficient to expand on my previous answer.
Let's take a step back and look at what you want to do here from a logical perspective. You want to tell your data access layer how it should update the database, with all the new/changed values it needs to write.
One very common way of doing this is to pass an entity which has those changes (which is what you're doing in your example). This can become tricky, as you have seen, because if you simply overwrite the entity variable with the changed entity, Linq2Sql will lose change tracking... just because the new entity is assigned to the same variable, doesn't mean that Linq2Sql automatically picks up changes from the new object... in fact Linq2Sql has no knowledge of the new object at all...
Example:
// In domain layer:
MyEntity entity = new MyEntity();
entity.PrimaryKey = 10;
entity.Name = "Toby Larone";
entity.Age = 27;
myDataRepository.Update(entity);
// In data layer:
void Update(MyEntity changedEntity)
{
using (var db = new DataContext())
{
var entity = (from e in db.MyEntities
where e.PrimaryKey == changedEntity.PrimaryKey
select e).First();
// Linq2Sql now has change tracking of "entity"... any changes made will be persisted when SubmitChanges is called...
entity = changedEntity;
// Linq2Sql does **not** have change tracking of changedEntity - the fact that it has been assigned to the same variable that once stored a tracked entity does not mean that Linq2Sql will magically pick up the changes...
db.SubmitChanges(); // Nothing happens - as far as Linq2Sql is concerned, the entity that was selected in the first query has not been changed (only the variable in this scope has been changed to reference a different entity).
}
}
Now you've already seen that assigning each field to the entity rather than replacing it works as intended - this is because the changes are being made to the original entity, which is still inside the Linq2Sql change tracking system..
One possible solution to this problem would be to write a method that "applies" the changes of another Entity to an existing one, ie:
partial class MyEntity
{
void ApplyChanges(MyEntity changedEntity)
{
this.PrimaryKey = changedEntity.PrimaryKey;
this.Name = changedEntity.Name;
this.Age = changedEntity.Age;
}
}
and then your data access would look like this:
// In data layer:
void Update(MyEntity changedEntity)
{
using (var db = new DataContext())
{
var entity = (from e in db.MyEntities
where e.PrimaryKey == changedEntity.PrimaryKey
select e).First();
// Linq2Sql now has change tracking of "entity"... any changes made will be persisted when SubmitChanges is called...
entity.ApplyChanges(changedEntity);
db.SubmitChanges(); // Works OK...
}
}
But I'm sure you don't like this solution, because all you have done is effectively move the repetitive field assignment out of the repository and into the entity class itself...
Going back to the logical perspective: all you really need to do is tell the data access repository two things: 1) which record you want to update, and 2) what the changes are. Sending an entirely new entity that encapsulates those two requirements is not necessary to achieve that goal; in fact, I think it's very inefficient.
In the following example, you are sending the data repository only the changes, not an entire entity. Because there is no entity, there are no change-tracking issues to work around.
Example:
// In domain layer:
myDataRepository.Update(10, entity =>
{
entity.Name = "Toby Larone";
entity.Age = 27;
});
// In data layer:
void Update(int primaryKey, Action<MyEntity> callback)
{
using (var db = new DataContext())
{
var entity = (from e in db.MyEntities
where e.PrimaryKey == primaryKey
select e).First();
// Linq2Sql now has change tracking of "entity"... any changes made will be persisted when SubmitChanges is called...
// The changes that were sent are being applied directly to the Linq2Sql entity, which is already under change tracking...
callback(entity);
db.SubmitChanges();
}
}
In the previous examples, the field assignments were happening twice - once when you described the changes you wanted to make, and again in the data repository when you needed to apply those changes to a Linq2Sql change tracked entity.
Using the callback, the field assignments only happen once - the description of the change itself is what updates the tracked entity.
I hope I explained this well enough :)
Think about what the data repository actually requires in order to perform the update. It does not require an object that contains those changes, but a description of what changes need to be made. This can be encapsulated easily into a callback delegate...
public void UpdateUser(int userId, Action<User> callback)
{
using (var db = new DataContext())
{
User entity = db.Users.Where(u => u.Id == userId).Single();
callback(entity);
db.SubmitChanges();
}
}
myrepository.UpdateUser(userId, user =>
{
user.Username = username;
user.Password = password;
// etc...
});
query is not the same type as info. They may have the same properties to you, but the code doesn't know that.
Now, if you want to avoid writing a bunch of unnecessary code, you can use a third-party library like AutoMapper, which can do that for you.
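A rough sketch with AutoMapper's older static API (the self-map and the Ignore on the key are assumptions to adjust to your model):
// Configure once at application startup
Mapper.CreateMap<Application, Application>()
    .ForMember(dest => dest.Id, opt => opt.Ignore()); // don't overwrite the key

public void Update(Application info)
{
    using (var dc = new VettingDataContext(_connString))
    {
        var entity = dc.Applications.First(a => a.Id == info.Id);
        Mapper.Map(info, entity); // copies matching properties onto the tracked entity
        dc.SubmitChanges();
    }
}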