I am not sure if I am doing this the right way or not, so I need advice.
I have an entity, this entity has a child collection, and each child entity has another child collection. Something like this (simplified example):
public class MyEntity {
public long Id { get; set; }
public ICollection<MyChild> Children { get; set; }
}
public class MyChild {
public long Id { get; set; }
public long MyEntityId { get; set; }
public MyEntity MyEntity { get; set; }
public ICollection<MyGrandChild> Children { get; set; }
}
public class MyGrandChild {
public long Id { get; set; }
public long MyChildId { get; set; }
public MyChild MyChild { get; set; }
public string Name { get; set; }
}
Now in our application, the user retrieves this entity from our Web API into an AngularJS application. The user then updates the entity (and sub-entities) and passes the entity back to the Web API. I am using models to pass the objects from my Web API to the AngularJS application, and they look something like this.
public class MyEntityModel {
public long Id { get; set; }
public ICollection<MyChildModel> Children { get; set; }
}
public class MyChildModel {
public long Id { get; set; }
public ICollection<MyGrandChildModel> Children { get; set; }
}
public class MyGrandChildModel {
public long Id { get; set; }
public string Name { get; set; }
}
Once the models are passed back to the Web API, I use AutoMapper to convert them back to entity objects.
Now the bit I am confused about: I pass the object to my service layer, and my method looks similar to this
public Task<int> UpdateAsync(MyEntity updated) {
_context.Entry(updated).State = EntityState.Modified;
return _context.SaveChangesAsync();
}
If I add a new MyChild or MyGrandChild object to MyEntity after MyEntity already exists, or update a MyChild or MyGrandChild object, the changes are not committed to the database. I changed my UpdateAsync method to the following, but is this really needed?
public Task<int> UpdateAsync(MyEntity updated) {
_context.Entry(updated).State = EntityState.Modified;
foreach (var child in updated.Children) {
if (child.Id == 0) {
_context.Entry(child).State = EntityState.Added;
} else {
_context.Entry(child).State = EntityState.Modified;
}
foreach (var grand in child.Children) {
if (grand.Id == 0) {
_context.Entry(grand).State = EntityState.Added;
} else {
_context.Entry(grand).State = EntityState.Modified;
}
}
}
return _context.SaveChangesAsync();
}
Do I really have to loop through each collection, and sub collection, check if the id equals 0 and set its state accordingly?
Yes, you have to do that.
When you do all the work inside a DbContext scope, it takes care of tracking all the changes happening in the entities, and you have to do nothing to let the DbContext know what has changed.
But in multi-layered applications, when you move the entities between layers they cannot be kept inside a DbContext scope, so you're responsible for tracking the changes.
Julie Lerman recommends implementing an interface to track the status of each entity. This interface has a property that keeps the entity status. This is modified on the client side and visited on the server to set each entity status: get the entities on the server side, attach them to the context, and modify its status according to the tracking interface property. (I can't find the reference, but it's covered in her Programming Entity Framework book, and in one of her Pluralsight courses).
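A minimal sketch of that self-tracking idea might look like the following. The enum, interface and method names here are illustrative, not taken from the book:

```csharp
// Hypothetical self-tracking contract: the client sets TrackingState,
// the server translates it into an EF entity state before saving.
public enum TrackingState { Unchanged, Added, Modified, Deleted }

public interface ITrackable
{
    TrackingState TrackingState { get; set; }
}

public static class TrackingExtensions
{
    public static void ApplyState(this DbContext context, ITrackable entity)
    {
        switch (entity.TrackingState)
        {
            case TrackingState.Added:
                context.Entry(entity).State = EntityState.Added;
                break;
            case TrackingState.Modified:
                context.Entry(entity).State = EntityState.Modified;
                break;
            case TrackingState.Deleted:
                context.Entry(entity).State = EntityState.Deleted;
                break;
            default:
                context.Entry(entity).State = EntityState.Unchanged;
                break;
        }
    }
}
```

On the server you would walk the graph (parent, children, grandchildren) and call ApplyState on each node instead of the Id == 0 checks.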
Trackable Entities may also be of interest.
If you want this to happen "automagically" you can use Breeze. This lets you easily expose an EF model on the client side, using JavaScript code. This code is able to track the changes (and do many other things, like validation) on the client side, and send them back to the server to update the database. It's quite easy to get started: basically you need to install the NuGet package for the server, which implements a Breeze controller in very few lines of code, and the NuGet package for the client, which implements the JavaScript code. It's advisable to use an MVVM JavaScript library, like Knockout or AngularJS, because the changes will be automatically tracked by subscribing to the observables created by these libraries.
Related
I have a problem: I am using lazy loading with virtual navigation properties, but it generates the following error
Violation of PRIMARY KEY constraint. Cannot insert duplicate key in object
How can I solve it?
These are my classes:
public class Producto
{
[Key]
public Guid ProductoId { get; set; }
public Guid InquilinoId { get; set; }
public string Nombre { get; set; }
public decimal Precio_Publico { get; set; }
public string Detalle_producto { get; set; }
public DateTime? Fecha_publicacion { get; set; }
public bool Activo { get; set; }
public string Img_Producto { get; set; }
public string CodigoBarras { get; set; }
public virtual Concepto VigenciaPrecio { get; set; }
public virtual ICollection<Precio> Precios { get; set; }
public bool Es_Almacenable { get; set; }
public int Dias_de_Garantia { get; set; }
public bool Es_Importado { get; set; }
public virtual List<Categoria> Categoria { get; set; } = new List<Categoria>();
public virtual Impuesto Impuesto { get; set; }
public virtual Precio Precio { get; set; }
}
public class Categoria
{
public Guid CategoriaId { get; set; }
public string Nombre { get; set; }
public virtual Producto Producto { get; set; }
}
[HttpPost]
public async Task<ActionResult<Guid>> Post(Producto producto)
{
var user = await userManager.GetUserAsync(HttpContext.User);
var usercontodo = context.Users.Where(x => x.Id == user.Id).Include(x => x.InquilinoActual).FirstOrDefault();
if (!string.IsNullOrWhiteSpace(producto.Img_Producto))
{
var imgProducto = Convert.FromBase64String(producto.Img_Producto);
producto.Img_Producto = await almacenadorDeArchivos.GuardarArchivo(imgProducto, "jpg", "productos");
}
producto.InquilinoId = usercontodo.InquilinoActual.ClienteId;
context.Add(producto);
await context.SaveChangesAsync();
return producto.ProductoId;
}
This is the table categorias:
This is the table productos:
This is the error:
Microsoft.EntityFrameworkCore.DbUpdateException: An error occurred while updating the entries. See the inner exception for details.
Microsoft.Data.SqlClient.SqlException (0x80131904): Violation of
PRIMARY KEY constraint 'PK_Categorias'. Cannot insert duplicate key in
object 'dbo.Categorias'. The duplicate key value is
(1737b24b-93a4-4ad9-4d8b-08d85a75ca8e)
This error is usually the result of passing detached entity graphs, or entity graphs that are constructed on a client and sent to a server to be added/updated.
Producto has a set of categories. If EF is trying to persist a category, this means that the Producto you sent and added to the context had at least one Categoria in it.
Let's say, for example, that in my client code (JavaScript or some other server code calling the API) I create a new Producto:
// C# Pseudo
var producto = new Producto
{
// set up product fields...
// I have a list of categories pre-loaded, so I want to assign one of them...
Categorias = (new [] { new Categoria { CategoriaId = 3, Nombre = "Three" }}).ToList()
}
Now I pass that Producto to my service. That request has its own scoped DbContext where I set up some info on the Producto and Add it to the context's Producto DbSet. We can assume that the database already has a Category with an ID of 3, but the context isn't aware of it because Producto.Categorias has a reference to a new entity that just happens to have the ID of 3. EF will treat that Category as a new entity and try to insert it alongside the Product. Hence the PK violation, as EF tries to insert another Category with ID=3.
The complex solution is to attach entities to the current DbContext. As a simple example, with the above Producto coming in:
foreach(var categoria in producto.Categorias)
context.Attach(categoria); // which will treat the category as unmodified, but expect it to be an existing record.
context.Producto.Add(producto);
context.SaveChanges();
This would have to be done for every existing entity associated with and referenced by the Product. This gets complicated because it assumes that each associated entity is unknown by the DbContext. If you have a scenario where you are dealing with multiple Producto objects, or the Category could be referenced by the Producto and another new row under the Producto, or it's possible that the categories could have been read by the DbContext prior to saving the Producto, attempting to Attach the category could fail if EF is already tracking one with the same ID. This can lead to intermittent errors.
To be safer, before attaching an entity, you should test that the Context isn't already tracking it. If it is already tracking a reference, then you need to replace the reference in Producto:
foreach(var categoria in producto.Categorias.ToList()) // ToList() so we can modify the collection while iterating
{
var existingCategoria = context.Categorias.Local.SingleOrDefault(x => x.CategoriaId == categoria.CategoriaId);
if (existingCategoria != null)
{ // Context is tracking one already, so replace the reference in Producto
producto.Categorias.Remove(categoria);
producto.Categorias.Add(existingCategoria);
}
else
context.Attach(categoria); // context isn't tracking it yet.
}
context.Producto.Add(producto);
context.SaveChanges();
Again, that needs to be done for EVERY reference to safely save a detached entity.
The better solution is to avoid passing entity structures between client and server, and instead pass View Models or DTOs which are POCO (Plain Old C# Objects) containing just the details needed to build an entity.
Given a Producto ViewModel like this:
[Serializable]
public class NewProductoViewModel
{
public string Nombre { get; set; }
public ICollection<Guid> CategoriaIds { get; set; } = new List<Guid>();
// ... Other details needed to create a new Producto
}
When we go to add this new Producto:
[HttpPost]
public async Task<ActionResult<Guid>> Post(NewProductoViewModel viewModel)
{
var user = await userManager.GetUserAsync(HttpContext.User);
var usercontodo = context.Users.Where(x => x.Id == user.Id).Include(x => x.InquilinoActual).FirstOrDefault();
var producto = new Producto(); // Here is our new, fresh entity..
// TODO: Here we would copy across all non-reference data, strings, values, etc. from the ViewModel to the Entity...
if (!string.IsNullOrWhiteSpace(viewModel.Img_Producto))
{
var imgProducto = Convert.FromBase64String(viewModel.Img_Producto);
producto.Img_Producto = await almacenadorDeArchivos.GuardarArchivo(imgProducto, "jpg", "productos"); // This is fine, we get our object from the DbContext.
}
producto.InquilinoId = usercontodo.InquilinoActual.ClienteId;
// Now handle the Categorias....
foreach(var categoriaId in viewModel.CategoriaIds)
{
var categoria = context.Categorias.Single(c => c.CategoriaId == categoriaId);
producto.Categorias.Add(categoria);
}
context.Add(producto);
await context.SaveChangesAsync();
return producto.ProductoId;
}
Much of your code is left pretty much as-is, but what it accepts is a deserialized view model rather than a deserialized block of data that merely looks like an entity. The method constructs a new entity, copies across any details from the view model, then performs a lookup against the context for any references to associate with the new entity. In the above example I use .Single(), which would throw an exception if we passed a CategoriaId that didn't exist. Alternatively you could use .SingleOrDefault() and ignore CategoriaIds that don't exist.
The added benefit of using view models is that you can minimize the amount of data being sent to just the fields needed. We don't send entire Categoria objects to the server, just the IDs that were associated. In most cases where I've seen people passing entities around, the reason is to avoid re-loading the entities more than once (once when the categorias were read to send to the client, and again when the Producto is saved). This rationale is flawed because what gets sent back to the server may "look" like the entity that the server would have sent the client, but it is not an entity. It is a deserialized block of JSON with the same signature as an entity. It is also "stale" in the sense that any data sent to the server may be many minutes old. When updating entities, the first thing you should check is whether the row version of the data coming from the client matches what is on the server. (Has the server data been updated since?) This means touching the database anyway. We also shouldn't trust anything coming from the client. It is tempting to Attach entities, set a Modified state and call SaveChanges, but this will overwrite every field on that entity. Clever people can intercept requests from their browsers and modify the data that is serialized into the request, which means that data you don't intend to allow to be updated can be replaced if you merely attach that entity.
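For the row-version check mentioned above, one common approach (a sketch, assuming SQL Server and EF Core) is a rowversion concurrency token on the entity:

```csharp
// Sketch: an optimistic concurrency token on the entity.
public class Producto
{
    public Guid ProductoId { get; set; }
    // ... other properties ...

    // Maps to a SQL Server rowversion column. EF Core includes it in the
    // UPDATE's WHERE clause and throws DbUpdateConcurrencyException when
    // the client's copy of the row is stale.
    [Timestamp]
    public byte[] RowVersion { get; set; }
}
```

The view model then carries RowVersion to the client and back, so the server can detect stale updates before overwriting anything.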
Your classes mean that, using the standard conventions, a Categoria can only belong to one Producto, and a Producto has many Categorias.
You didn't include the code that builds up a Producto (before it is posted), but the error means that you are trying to add an existing Categoria to a new Producto.
I think you want to use Categoria as a link table, this should more or less work:
public class Categoria
{
[Key]
public Guid CategoriaId { get; set; }
[Key]
public Guid ProductoId { get; set; }
public string Nombre { get; set; }
public virtual Producto Producto { get; set; }
}
but you get more control by mapping it in OnModelCreating of the DbContext class.
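Note that EF Core does not support defining a composite key with [Key] annotations alone; the fluent API is required (EF Core 7 later added a [PrimaryKey] attribute). A minimal sketch of the OnModelCreating mapping:

```csharp
// In your DbContext: composite primary key for the Categoria link table.
protected override void OnModelCreating(ModelBuilder modelBuilder)
{
    modelBuilder.Entity<Categoria>()
        .HasKey(c => new { c.CategoriaId, c.ProductoId });
}
```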
I have a website that is using EF Core 3.1 to access its data. The primary table it uses is [Story]. Each user can store some metadata about each story in [StoryUserMapping]. What I would like is that when I read in a Story object, EF automatically loads in the metadata (if it exists) for that story.
Classes:
public class Story
{
[Key]
public int StoryId { get; set; }
public long Words { get; set; }
...
}
public class StoryUserMapping
{
public string UserId { get; set; }
public int StoryId { get; set; }
public bool ToRead { get; set; }
public bool Read { get; set; }
public bool WontRead { get; set; }
public bool NotInterested { get; set; }
public byte Rating { get; set; }
}
public class User
{
[Key]
public string UserId { get; set; }
...
}
StoryUserMapping has composite key ([UserId], [StoryId]).
What I would like to see is:
public class Story
{
[Key]
public int StoryId { get; set; }
public bool ToRead { get; set; } //From user mapping table for currently logged in user
public bool Read { get; set; } //From user mapping table for currently logged in user
public bool WontRead { get; set; } //From user mapping table for currently logged in user
public bool NotInterested { get; set; } //From user mapping table for currently logged in user
public byte Rating { get; set; } //From user mapping table for currently logged in user
...
}
Is there a way to do this in EF Core? My current system is to load the StoryUserMapping object as a property of the Story object, then have Non-Mapped property accessors on the Story object that read into the StoryUserMapping object if it exists. This generally feels like something EF probably handles more elegantly.
Use Cases
Setup: I have 1 million stories, 1000 users, Worst-case scenario I have a StoryUserMapping for each: 1 billion records.
Use case 1: I want to see all of the stories that I (logged in user) have marked as "to read" with more than 100,000 words
Use case 2: I want to see all stories where I have NOT marked them NotInterested or WontRead
I am not concerned with querying multiple StoryUserMappings per story, e.g. I will not be asking the question: What stories have been marked as read by more than n users. I would rather not restrict against this if that changes in future, but if I need to that would be fine.
Create yourself an aggregate view model object that you can use to display the data in your view, similar to what you've ended up with under the Story entity at the moment:
public class UserStoryViewModel
{
public int StoryId { get; set; }
public bool ToRead { get; set; }
public bool Read { get; set; }
public bool WontRead { get; set; }
public bool NotInterested { get; set; }
public byte Rating { get; set; }
...
}
This view model is concerned only about aggregating the data to display in the view. This way, you don't need to skew your existing entities to fit how you would display the data elsewhere.
Your database entity models should be as close to "dumb" objects as possible (apart from navigation properties) - they look very sensible as they are the moment.
In this case, remove the unnecessary [NotMapped] properties from your existing Story that you'd added previously.
In your controller/service, you can then query your data as per your use cases you mentioned. Once you've got the results of the query, you can then map your result(s) to your aggregate view model to use in the view.
Here's an example for the use case of getting all Storys for the current user:
public class UserStoryService
{
private readonly YourDbContext _dbContext;
public UserStoryService(YourDbContext dbContext)
{
_dbContext = dbContext;
}
public Task<List<UserStoryViewModel>> GetAllForUser(string currentUserId)
{
// at this point you're not executing any queries, you're just creating a query to execute later
var allUserStoriesForUser = _dbContext.StoryUserMappings
.Where(mapping => mapping.UserId == currentUserId)
.Select(mapping => new
{
story = _dbContext.Stories.Single(story => story.StoryId == mapping.StoryId),
mapping
})
.Select(x => new UserStoryViewModel
{
// use the projected properties from previous to map to your UserStoryViewModel aggregate
...
});
// calling .ToList()/.ToListAsync() will then execute the query and return the results
return allUserStoriesForUser.ToListAsync();
}
}
You can then create a similar method to get only the current user's Storys that aren't marked NotInterested or WontRead.
It's virtually the same as before, but with the filter in the Where to ensure you don't retrieve the ones that are NotInterested or WontRead:
public Task<List<UserStoryViewModel>> GetForUserThatMightRead(string currentUserId)
{
var storiesUserMightRead = _dbContext.StoryUserMappings
.Where(mapping => mapping.UserId == currentUserId && !mapping.NotInterested && !mapping.WontRead)
.Select(mapping => new
{
story = _dbContext.Stories.Single(story => story.StoryId == mapping.StoryId),
mapping
})
.Select(x => new UserStoryViewModel
{
// use the projected properties from previous to map to your UserStoryViewModel aggregate
...
});
return storiesUserMightRead.ToListAsync();
}
Then all you will need to do is update your view's @model directive to use your new aggregate UserStoryViewModel instead of your entity.
It's always good practice to keep a good level of separation between what is "domain" or database code/entities and what will be used in your view.
I would recommend having a good read up on this and keep practicing so you can get into the right habits and thinking as you go forward.
NOTE:
Whilst the above suggestions should work absolutely fine (I haven't tested locally, so you may need to improvise/fix, but you get the general gist) - I would also recommend a couple of other things to supplement the approach above.
I would look at introducing a navigation property on the StoryUserMapping entity (unless you already have this; I can't tell from your question's code). This will eliminate the step above where we're .Selecting into an anonymous object and adding a second lookup to the query to get the Storys from the database by the mapping's StoryId. You'd be able to reference the story belonging to the mapping simply through its navigation property.
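For illustration, a minimal sketch of that navigation property and the resulting query (property names assumed from the question's code):

```csharp
public class StoryUserMapping
{
    public string UserId { get; set; }
    public int StoryId { get; set; }
    public bool ToRead { get; set; }
    public bool Read { get; set; }
    public bool WontRead { get; set; }
    public bool NotInterested { get; set; }
    public byte Rating { get; set; }

    // Navigation property; EF Core pairs it with StoryId by convention.
    public Story Story { get; set; }
}

// The projection then no longer needs the intermediate anonymous type:
var stories = _dbContext.StoryUserMappings
    .Where(m => m.UserId == currentUserId)
    .Select(m => new UserStoryViewModel
    {
        StoryId = m.Story.StoryId,
        ToRead = m.ToRead,
        Read = m.Read,
        WontRead = m.WontRead,
        NotInterested = m.NotInterested,
        Rating = m.Rating
    });
```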
Then, you should also be able to look into some kind of mapping library, rather than mapping each individual property yourself for every call. Something like AutoMapper will do the trick (I'm sure other mappers are available). You could set up the mappings to do all the heavy lifting between your database entities and view models. There's a nifty .ProjectTo<T>() which will project your queried results to the desired type using those mappings you've specified.
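With AutoMapper, for example, the setup and query might look roughly like this (a sketch assuming a Story navigation property on the mapping entity and an injected IMapper; it needs `using AutoMapper.QueryableExtensions;`):

```csharp
// Illustrative AutoMapper profile mapping the entity onto the view model.
public class UserStoryProfile : Profile
{
    public UserStoryProfile()
    {
        CreateMap<StoryUserMapping, UserStoryViewModel>();
    }
}

// ProjectTo folds the mapping into the SQL SELECT itself, so only the
// view model's columns are queried.
var results = await _dbContext.StoryUserMappings
    .Where(m => m.UserId == currentUserId)
    .ProjectTo<UserStoryViewModel>(_mapper.ConfigurationProvider)
    .ToListAsync();
```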
I'm using Entity Framework with Web API 2. I have a boat entity with properties like name, price etc. Those simple properties update fine when sent by Put to the web api controller. The boat entity also has a many to one relationship with an entity called BoatType. Boat types are "Yacht", "Motor Yacht" etc.
When a boat entity is updated in the controller the foreign key for boat type in the database doesn't get updated. Do I have to somehow manually update the child entity value or is there a way to get EF to do this automatically?
Here's an example PUT request sent to web API:
{
"$id":"1",
"Images":[],
"BoatType": {
"$id":"3",
"Boat":[],
"Id":1,
"DateCreated":"2015-09-15T13:14:39.077",
"Name":"Yacht"
},
"Id":2,
"Name":"Schooner",
"Description":"A harmless schooner",
"DateCreated":"2015-09-15T17:59:37.8",
"Price":65000
}
Here's the update function in web API:
[ResponseType(typeof(void))]
public async Task<IHttpActionResult> Put(int id, Boat boat)
{
if (id != boat.Id)
{
return BadRequest();
}
_db.Entry(boat).State = EntityState.Modified;
try
{
await _db.SaveChangesAsync();
}
catch (DbUpdateConcurrencyException)
{
if (!BoatExists(id))
{
return NotFound();
}
else
{
throw;
}
}
return StatusCode(HttpStatusCode.NoContent);
}
I've looked at similar questions like Entity Framework Code First Update Does Not Update Foreign Key, Entity Framework does not update Foreign Key object and Update foreign key using Entity Framework but none seem to have quite the same scenario (or the answers didn't help me understand my issue).
Here's the Boat and BoatType model classes (auto-generated by EF designer).
public partial class Boat
{
[System.Diagnostics.CodeAnalysis.SuppressMessage("Microsoft.Usage", "CA2214:DoNotCallOverridableMethodsInConstructors")]
public Boat()
{
this.Images = new HashSet<Image>();
}
public int Id { get; set; }
public string Name { get; set; }
public string Description { get; set; }
public System.DateTime DateCreated { get; set; }
public Nullable<double> Price { get; set; }
[System.Diagnostics.CodeAnalysis.SuppressMessage("Microsoft.Usage", "CA2227:CollectionPropertiesShouldBeReadOnly")]
public virtual ICollection<Image> Images { get; set; }
public virtual BoatType BoatType { get; set; }
}
public partial class BoatType
{
[System.Diagnostics.CodeAnalysis.SuppressMessage("Microsoft.Usage", "CA2214:DoNotCallOverridableMethodsInConstructors")]
public BoatType()
{
this.Boat = new HashSet<Boat>();
}
public int Id { get; set; }
public System.DateTime DateCreated { get; set; }
public string Name { get; set; }
[System.Diagnostics.CodeAnalysis.SuppressMessage("Microsoft.Usage", "CA2227:CollectionPropertiesShouldBeReadOnly")]
public virtual ICollection<Boat> Boat { get; set; }
}
Ok, I figured out what the problem was. Using SQL Server Profiler to look at the SQL Update statement I saw that the foreign key for Boat.BoatType wasn't even in there - so I figured my model must be screwed up somewhere. When I created the model in the designer, I mistakenly set the relationship between Boat and BoatType as one to one. I later realised the mistake and changed the association to one (BoatType) to many (Boats) but that must have been AFTER I generated the database. D'oh! Something about the way EF handles associations meant that simply changing the association type in the diagram wasn't enough - I should have dropped/recreated the database constraint at that time.
Since I only had test data in the database what worked for me was to recreate the database using the "Generate database from model..." option in the designer.
Once I got the PUT working correctly the other thing I had to solve (which is not really on topic for this question but it's been discussed above so just in case it's useful to someone) was that Web API gave the error "A referential integrity constraint violation occurred: The property value(s) of 'Boat.BoatTypeId' on one end of a relationship do not match the property value(s) of 'Boat.BoatType.Id' on the other end.". The select list that allows the user to change the boat type is bound on the client using AngularJS to Boat.BoatType. So in the PUT data, Boat.BoatType had been updated to new values but Boat.BoatTypeId hadn't changed - hence the "referential integrity" error. So I just manually set the value of Boat.BoatTypeId to Boat.BoatType.Id before sending the PUT and all works as expected now.
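For anyone hitting the same referential-integrity error, a server-side alternative (a sketch only, assuming the regenerated model exposes the BoatTypeId foreign key property) is to sync the FK from the posted navigation property before marking the entity modified:

```csharp
// Sketch: keep the FK in sync with the posted navigation property,
// then drop the navigation so EF only updates the Boat row itself.
if (boat.BoatType != null)
{
    boat.BoatTypeId = boat.BoatType.Id;
    boat.BoatType = null;
}
_db.Entry(boat).State = EntityState.Modified;
await _db.SaveChangesAsync();
```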
I'm having a bit of performance problem with an EF query.
We basically have this:
public class Article
{
public int ID { get; set; }
public virtual List<Visit> Visits { get; set; }
}
public class Visit
{
public int? ArticleID { get; set; }
public DateTime Date { get; set; }
}
Now, I would like to do:
Article a = ...;
vm.Count = a.Visits.Count;
The problem is that, from what I can gather, this first causes the entire list to be fetched, and only then counted. When done in a loop this creates a performance problem.
I assumed that it was due to the object being "too concrete", so I've tried to move the Visits.Count call as far back in repository as I can (so that we're sort of working directly with the DbContext). That didn't help.
Any suggestions?
Assuming your data context has a Visits property:
public class MyDbContext: DbContext
{
public IDbSet<Article> Articles { get; set; }
public IDbSet<Visit> Visits { get; set; }
}
you could do that:
using (var ctx = new MyDbContext())
{
var count = ctx.Visits.Where(x => x.ArticleID == 123).Count();
}
Also if the Visits collection is not always required when dealing with an article you could declare it as IEnumerable<T>:
public class Article
{
public int ID { get; set; }
public virtual IEnumerable<Visit> Visits { get; set; }
}
and then rely on the lazy loading.
I think the performance issue might be in the lazy loading (but I'd need to see more code to confirm).
Try an Include(a => a.Visits) at the moment you retrieve articles from the DbContext.
For more information on EF performance: http://www.asp.net/web-forms/tutorials/continuing-with-ef/maximizing-performance-with-the-entity-framework-in-an-asp-net-web-application
In the end I did it another way.
I found that this was hit over and over in different ways, and due to the way the rest of the domain model is set up, I made a bit of a hack:
In my VisitRepository I created a new function GetArticleIDsWithVisit(), which makes a direct SQL call via db.SqlQuery and returns a Dictionary. The dictionary is cached and used in all places where visit counts are needed.
Not very pretty, but I have wrapped it inside the repository so I think it's ok.
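The hack looks roughly like this (class names and SQL are illustrative, not the actual production code; assumes EF6's Database.SqlQuery):

```csharp
public class VisitRepository
{
    private readonly MyDbContext db;

    public VisitRepository(MyDbContext db)
    {
        this.db = db;
    }

    // ArticleID -> visit count, computed in a single round trip
    // in the database instead of lazy-loading each Visits collection.
    public Dictionary<int, int> GetArticleIDsWithVisit()
    {
        return db.Database
            .SqlQuery<ArticleVisitCount>(
                @"SELECT ArticleID, COUNT(*) AS Count
                  FROM Visits
                  WHERE ArticleID IS NOT NULL
                  GROUP BY ArticleID")
            .ToDictionary(x => x.ArticleID, x => x.Count);
    }

    // Simple DTO for SqlQuery materialization.
    public class ArticleVisitCount
    {
        public int ArticleID { get; set; }
        public int Count { get; set; }
    }
}
```

The returned dictionary can then be cached (for example in MemoryCache) and consulted wherever counts are displayed.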
I'm having real trouble with what I need to do.
Here's the thing:
I'm creating a Silverlight Business Application. I want users to be able to define their own "reminders" and "templates". It seems very simple: just 3 models, 2 one-to-many relations and that's all. But I have no idea how to connect the existing User model to the other models.
I tried to create my own "membership" provider - I created a db with all 3 models and it seemed to be OK, and I created the EntityModel, but now I have 2 different places where the User class is defined: in the first it inherits the UserBase class, and in the other, EntityObject (in the file Model.Designer.cs, which is generated automatically).
I'm totally confused - can I stick with the EntityObject solution and delete the other class definitions? If so, how can I still use all the features that come with the Silverlight Business Application? (Authentication/registration etc. is already provided.)
We have implemented this scenario in our LOB app.
Firstly add the appropriate properties to the user class like so.
public partial class User : UserBase
{
public Guid UserId { get; set; }
public int PeopleId { get; set; }
public int EpothecaryUserId { get; set; }
public string PersonFullName { get; set; }
public SearchGroups SearchGroups { get; set; }
public string SearchHistoryString { get; set; }
public int SearchRowsReturnedPerGroup { get; set; }
}
Then create a class derived from AuthenticationBase
public class AuthenticationService : AuthenticationBase<User>
{
protected override User GetAuthenticatedUser(IPrincipal principal)
{
return base.GetAuthenticatedUser(principal).WithProfile();
}
[Invoke]
public void SaveMyUser(User user)
{
if (user.UserId == Guid.Empty)
{
ClientLogger.Error("SaveMyUser failed because the UserId is invalid");
return;
}
using (var db = new Pharma360Model())
{
var userProfile = db.UserProfiles.Single(p => p.EpothecaryUserId == user.EpothecaryUserId);
userProfile.SearchGroups = (int)user.SearchGroups;
userProfile.SearchHistory = user.SearchHistoryString;
userProfile.SearchRowsReturnedPerGroup = user.SearchRowsReturnedPerGroup;
db.SaveChanges();
}
}
}
And this will take care of the loading and saving of the custom User class.