I'm consuming an API that has the following class for time.
public class CRTime
{
    public DateTime? datetime { get; set; }
    public string timezone { get; set; }
}
I would like to translate that into a single DateTime field.
Here is an example of the full object that is returned to me and how I am trying to calculate the DateTime field that will then be stored in the database.
public class ReturnObject
{
    // This comes from the API, but is not synced to the database
    [NotMapped]
    public CRTime CRCreated { get; set; }

    // This is stored in our DB; it should be calculated from CRCreated
    public DateTime? CreatedDB
    {
        get
        {
            var val = (CRCreated != null) ? CRCreated.datetime : null;
            return val;
        }
        private set { }
    }

    // other fields go here
}
This works perfectly on record create, but when I try to update the record using Entity Framework Migrations and AddOrUpdate, the update overwrites the value and always sets it to NULL.
What is the best solution for creating a server-side computed column that then gets synced to the database?
Side note: I am using Newtonsoft.Json to deserialize the object into my Entity Framework entity, then passing that to Entity Framework's AddOrUpdate.
This is because of a rather confusing quirk of AddOrUpdate.
AddOrUpdate determines whether an entity is new or existing, obviously. It does that by querying the database for the entity by its primary key, or by some key you can specify. Now two things can happen:
It doesn't find the entity in the database: your entity object is marked as Added.
It does find an entity, but contrary to what you'd expect, your entity object is still Detached! The found entity remains hidden, and it is this object that EF updates in the end. EF copies the values from your own entity onto it, which may mark the hidden entity as Modified.
Because of the empty setter, CreatedDB will always be null when the hidden entity is fetched from the database. As far as EF is concerned, CRCreated doesn't exist. EF probably does read a CreatedDB value from your object, but again the empty setter prevents it from being copied onto the hidden object, so it gets updated as null.
You have to decide how to fix this. Probably the best way is to make the setter actually modify (or create) the CRCreated instance.
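For example, a minimal sketch of that option (same members as in the question; the create-on-demand behaviour is an assumption about what you want):
public DateTime? CreatedDB
{
    get { return CRCreated != null ? CRCreated.datetime : null; }
    set
    {
        // When EF materializes the entity or copies values onto it,
        // recreate the API-side object so the value round-trips
        // instead of being swallowed by an empty setter.
        if (CRCreated == null)
            CRCreated = new CRTime();
        CRCreated.datetime = value;
    }
}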
Alternatively, you can do away with AddOrUpdate and try to find the existing object yourself. If you find it, it's not hidden anymore, so you can do with it whatever you like.
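A sketch of that alternative, assuming a context named db with a ReturnObjects set and an Id key (illustrative names; it still needs a working CreatedDB setter as sketched above, because SetValues goes through property setters):
var existing = db.ReturnObjects.Find(incoming.Id);
if (existing == null)
{
    db.ReturnObjects.Add(incoming);
}
else
{
    // The tracked entity is no longer hidden, so you control
    // exactly what gets copied onto it.
    db.Entry(existing).CurrentValues.SetValues(incoming);
}
db.SaveChanges();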
I would expect it to be NULL after you read it from the DB. The private empty setter is of course weird, and it blocks the proper working of EF.
You could do yourself a great favour if you were able to use DateTimeOffset, but in the current situation:
Your property might be logically 'calculated', but towards the database it should be treated and implemented as a normal read/write property.
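A sketch of what that could look like with Newtonsoft.Json's serialization callbacks (using an [OnDeserialized] hook is my assumption about where to do the copy):
using System;
using System.ComponentModel.DataAnnotations.Schema;
using System.Runtime.Serialization;

public class ReturnObject
{
    [NotMapped]
    public CRTime CRCreated { get; set; }

    // Plain read/write property: EF can both materialize and persist it.
    public DateTime? CreatedDB { get; set; }

    [OnDeserialized]
    internal void OnDeserialized(StreamingContext context)
    {
        // Copy the API value once, right after Json.NET has populated CRCreated.
        CreatedDB = CRCreated != null ? CRCreated.datetime : null;
    }
}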
I've been working on an application using ASP.NET Core using the Entity Framework Core connector for Cosmos DB. For the most part, it's been smooth sailing. I've reached a point where I'd like to add support for soft-deleting records from the database, using my application to query for a DeletedAt property to determine whether or not to include an entity in the results.
I'm using a base entity type to which I'm adding the aforementioned timestamp:
public abstract class Entity
{
    [Key]
    public long Id { get; internal set; }

    public DateTimeOffset CreatedAt { get; set; }
    public DateTimeOffset? UpdatedAt { get; set; }
    public DateTimeOffset? DeletedAt { get; set; }
}
This updated code runs just fine. However, I do have several thousands of entities in my database that don't have the DeletedAt property defined.
I'm currently performing my queries from a generic repository type that does something along the lines of:
return await Queryable.Where(x => x.DeletedAt == null).Where(predicate).ToListAsync();
This works fine for new entities that do have a DeletedAt property defined, but excludes all of my older entities that don't have the property set. I'd expect EF to assume default values for properties that aren't defined, but it appears to ignore the old entities altogether.
Normally, using SQL Server, I'd just apply a migration and retrofit all of the older entities with a null DeletedAt timestamp, and all would be dandy. However, in Cosmos DB, I'm not sure how to handle this case. Do I have to go through all the older entities and retrofit them with the deletion timestamp, or is there another way to deal with changing entities and values missing?
Thanks!
In Cosmos there's a difference between null and no value (undefined). In SQL query text, you would use the following to query on whether a property is defined (and you can additionally add a null check to the expression):
SELECT *
FROM c
WHERE NOT IS_DEFINED(c.example)
For the C# Cosmos SDK with LINQ, you could use Microsoft.Azure.Cosmos.Linq and do:
var qry = container.GetItemLinqQueryable<Item>()
    .Where(x => !x.Example.IsDefined())
    .ToFeedIterator();
For EF this unfortunately doesn't seem to be possible as of this moment. Links:
Git issue
Current available functions
If you want to set it to null, you would indeed need to iterate over all items and set it, to make such queries possible using EF.
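If you go that route, a minimal sketch of a one-off backfill using the Cosmos .NET SDK directly (the container instance and the choice of id as partition key are assumptions):
using Microsoft.Azure.Cosmos;
using Microsoft.Azure.Cosmos.Linq;

var iterator = container.GetItemLinqQueryable<Entity>()
    .Where(x => !x.DeletedAt.IsDefined())
    .ToFeedIterator();

while (iterator.HasMoreResults)
{
    foreach (var item in await iterator.ReadNextAsync())
    {
        item.DeletedAt = null; // written back as an explicit null
        await container.UpsertItemAsync(item, new PartitionKey(item.Id.ToString()));
    }
}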
If you are using the .NET SDK for Cosmos DB, and thus using its deserialisation (which uses Json.NET), then if your model type has a nullable property, you don't need a value for that property to be set in the Cosmos DB document.
It will still deserialise to a .NET instance, with null for the missing values.
Thus optional properties in the Cosmos DB document can easily be handled with nulls in your application code.
I set my serialisation options for Cosmos DB to not save null values, to avoid storing such properties altogether.
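A minimal sketch of that client configuration (the endpoint and key are placeholders):
using Microsoft.Azure.Cosmos;

var client = new CosmosClient(
    "https://<account>.documents.azure.com:443/",
    "<key>",
    new CosmosClientOptions
    {
        SerializerOptions = new CosmosSerializationOptions
        {
            IgnoreNullValues = true // null properties are omitted from stored documents
        }
    });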
I am wondering if it makes sense to create a migration for a scenario where there aren't necessarily model changes, but the values of an enum property on a model have been changed. I'm using .NET 4.6.2 and Code-First Entity Framework.
I have the following EF-tracked log model:
[Table("Logs")]
public class Log
{
public int Id { get; set; }
public DateTime Timestamp { get; set; }
public LogType Type { get; set; }
}
The LogType enum currently has around 40 values, of which, 13 have become obsolete or deprecated. I am going through the process of removing references to the obsolete/deprecated enum values.
As such, the values of the LogType enum are being changed. For example, LogType.ConnectionTimeout used to have the value 16, but now has the value 3.
In my database (MSSQL), the Type column is stored as an int. I have written SQL that deletes all entries with an obsolete/deprecated enum value, and SQL that updates the remaining values to match their new values (e.g. changing 16 to 3, using my previous LogType.ConnectionTimeout example).
My question is this: is it good practice to put that SQL in a migration that is able to be Up()'d and Down()'d? My other question is: is that even possible, or do there need to be actual model changes to create a migration? I'd like to have this SQL tracked and stored in version control, as well as the ability to Up() and Down() it in the future if need be.
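For illustration, here's roughly what I have in mind, a sketch assuming EF6 (I believe an empty migration can be scaffolded with Add-Migration -IgnoreChanges when there are no model changes):
using System.Data.Entity.Migrations;

public partial class RemapLogTypeValues : DbMigration
{
    public override void Up()
    {
        // Illustrative obsolete values; the real list has 13 entries.
        Sql("DELETE FROM Logs WHERE Type IN (5, 9, 12)");
        // Remap surviving values, e.g. LogType.ConnectionTimeout: 16 -> 3.
        Sql("UPDATE Logs SET Type = 3 WHERE Type = 16");
    }

    public override void Down()
    {
        Sql("UPDATE Logs SET Type = 16 WHERE Type = 3");
        // Rows deleted in Up() cannot be restored, so Down() is only a partial revert.
    }
}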
Thanks, and apologies in advance if this is a duplicate -- I wasn't able to find a similar question through my searches.
I am new to Entity Framework. I started with the database-first approach, which created classes corresponding to the tables I selected. I am using WPF. Unfortunately, a problem occurred while EF6 was mapping the classes: the type assigned to one field is byte, while in some cases the value exceeds the byte range. So, I want to replace it with either int or double. How do I change the model field types without making any changes to the underlying database?
namespace DataChrome
{
    public partial class REGISTRY_MEST
    {
        public byte MEST { get; set; } // Wrongly typed field
        public string MESTNAME { get; set; }
        public Nullable<byte> IDSTRAT { get; set; }
        public Nullable<int> MESTOLD { get; set; }
    }
}
So, after giving 7 hours to this problem, I found the right solution: all you need is to set custom type-mapping rules in the app.config file. For more details, visit this page.
The type change should be possible by editing the edmx model: click the MEST property of the class inside the edmx model, then set the Type accordingly in the Properties Window and save the model.
You are running a risk by doing this, as it might be possible to store a value too big for the column if you just change the type this way. You noted that you are using Oracle as the underlying DB, so it might very well be the case that EF generated a "wrong" type for that property.
If you are absolutely sure that the DB will accept the expanded type (int, double) then it should be safe to edit the property as I mentioned at the start. Otherwise you would have to change the DB and generate the class anew - you might need to delete the class from the model and add it again, because not all changes to the table are picked up by the automatic update process.
I have the following model:
public partial class location
{
    public int Id { get; set; }
    public double Lat { get; set; }
    public double Long { get; set; }
    public virtual ICollection<localserver> localserver { get; set; }
}
When, in a controller, I do:
List<location> l = db.location.ToList();
I also get the localserver objects. How, in LINQ, can I get only the properties of location (Id, Lat and Long) without using the Select operator?
The way to extract part of an object is to project it to a new form, which is what .Select() is for:
var result = db.location
    .Select(x => new
    {
        Id = x.Id,
        Lat = x.Lat,
        Long = x.Long
    })
    .ToList();
Now, you've specifically asked how to do this without using .Select(), so... really, the answer is "you don't". You've ruled out the tool that's specifically designed for the scenario you're presenting, so no useful answer remains.
To address the intent of your question, however, I'm going to make a guess that you don't want to load the collection of localserver objects, perhaps because it's large and uses a lot of memory. To solve that problem, I would suggest the following:
If you're using Entity Framework Code First, check your cascade options when creating the database (if you're setting any).
Check that the collection is declared as virtual (you've shown it as such here but check the actual source)
Check that you have lazy-loading enabled, by setting myContext.ContextOptions.LazyLoadingEnabled = true; at some point
This should allow the application to lazy-load that property, which means that the contents of that localserver property will only be retrieved from the database and loaded into memory when they're actually needed. If you don't ever access it, it won't be loaded and won't take up any memory.
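Note that myContext.ContextOptions is the ObjectContext API; on a DbContext the same switch lives on Configuration. A minimal sketch, with an illustrative context name:
using System.Data.Entity;

public class MyContext : DbContext
{
    public DbSet<location> location { get; set; }

    public MyContext()
    {
        // Both default to true in EF6; proxy creation is required
        // for lazy loading to work at all.
        Configuration.LazyLoadingEnabled = true;
        Configuration.ProxyCreationEnabled = true;
    }
}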
When you get the Location list, Entity Framework is not pulling the Localserver object data, because Entity Framework has a feature called lazy loading. Lazy loading means you can get the associated object at any time you need it; there is no need to write a separate LINQ query for it. The associated object's data is pulled only when you call on it in code, at which point Entity Framework makes one more call to the database to load it.
So just go with your code. But if you want only selected columns from the Location object itself, then you have to write a Select and supply the column names.
I have a table with 52 columns in my database and I want to write a function to create a row in that table.
In my case, I don't want to use all columns in that table, so I created my model like this.
[Table("CUST_MASTER")]
public class CustomerMaster
{
[Key]
[Column("CUSTOMER_ID")]
public string Id { get; set; }
[Column("CUSTOMER_NAME")]
public string Name { get; set; }
[Column("CUSTOMER_CITY")]
public string City { get; set; }
}
Is there any way to send only this data via Entity Framework, and to set all other non-nullable fields to some default data (for strings "", for decimals 0.0, etc.) without listing all those fields in my model and setting them manually?
When you do not incorporate a table column in your model, it won't be mapped, and it will be totally ignored by all generated SQL.
So the only option is to specify a default value in your database.
If you set the values in the constructor, you will have a default value through your code, but you could look into enabling migrations instead; that way you can set default values in the database. Look at this Stack Overflow question.
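For example, a hedged sketch of a migration that adds database defaults for columns that live outside the model (assuming SQL Server and EF6; the column and constraint names are made up for illustration):
using System.Data.Entity.Migrations;

public partial class AddCustMasterDefaults : DbMigration
{
    public override void Up()
    {
        // EF never writes these unmapped columns, so the database
        // fills them in from the defaults on INSERT.
        Sql("ALTER TABLE CUST_MASTER ADD CONSTRAINT DF_CUST_PHONE DEFAULT '' FOR CUSTOMER_PHONE");
        Sql("ALTER TABLE CUST_MASTER ADD CONSTRAINT DF_CUST_LIMIT DEFAULT 0.0 FOR CREDIT_LIMIT");
    }

    public override void Down()
    {
        Sql("ALTER TABLE CUST_MASTER DROP CONSTRAINT DF_CUST_PHONE");
        Sql("ALTER TABLE CUST_MASTER DROP CONSTRAINT DF_CUST_LIMIT");
    }
}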
I think this old suggestion is what you want. It explicitly mentions the lack of mapping between the conceptual model and the storage model. Not a very popular/understood idea, though.
Update: FWIW, this suggests that it is already possible in non-Code-First scenarios.