I have two fields with the datetime data type:
Added
Modified
When inserting a new record, both fields must be set to System.DateTime.Now,
but when updating, only Modified needs to change.
I can set StoreGeneratedPattern to Computed and handle the Modified field with GETDATE() in the database, but the problem is the Added field.
My guess is that I have to override SavingChanges() or something similar, but I don't know how.
EDIT: What I have tried so far.
I added another class to my project with the following code:
namespace Winpro
{
public partial class Customer
{
public Customer()
{
this.Added = DateTime.UtcNow;
}
}
}
but then the solution cannot build:
Type 'Winpro.Customer' already defines a member called 'Customer' with the same parameter types
One option is to define a constructor for the type that sets the field.
Big important note: unless you know exactly what you're doing, always store dates and times in a database in UTC. DateTime.Now is the computer's local time which can vary according to daylight savings, timezone changes (brought about by political/legislative reasons), and can end up rendering date information useless. Use DateTime.UtcNow.
public partial class MyEntity {
public MyEntity() {
this.Added = DateTime.UtcNow;
}
}
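Alternatively, since the question mentions SavingChanges(): with a database-first model you can subscribe to the generated ObjectContext's SavingChanges event from another partial class of the context. A minimal sketch, assuming an EF4-style generated context named WinproEntities (a hypothetical name):
using System;
using System.Data;
using System.Linq;
public partial class WinproEntities
{
    // OnContextCreated is a partial method the code generator calls
    // from every constructor of the generated context.
    partial void OnContextCreated()
    {
        SavingChanges += (sender, e) =>
        {
            var added = ObjectStateManager
                .GetObjectStateEntries(EntityState.Added)
                .Select(entry => entry.Entity)
                .OfType<Customer>();
            foreach (var customer in added)
            {
                customer.Added = DateTime.UtcNow;
                customer.Modified = DateTime.UtcNow; // unless the database computes it
            }
        };
    }
}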
We did something quite similar in the past.
We needed to store the date and time of creation together with the user responsible for creating the record. Also, on every change, regardless of whether an audit record exists, the base record should get the date and time and the user responsible for the change.
Here's what we have done:
Interfaces
To add some standard behavior and make things more extensible, we've created two interfaces, as follows:
public interface IAuditCreated
{
DateTime CreatedDateTime { get; set; }
string CreationUser { get; set; }
}
public interface IAuditChanged
{
DateTime LastChangeDateTime { get; set; }
string LastChangeUser { get; set; }
}
Override SaveChanges() to add some automatic control
public class WhateverContext : DbContext
{
// Some behavior and all...
public override int SaveChanges()
{
// Added ones...
var _entitiesAdded = ChangeTracker.Entries()
    .Where(_e => _e.State == EntityState.Added)
    .Where(_e => _e.Entity is IAuditCreated)
    .Select(_e => _e.Entity)
    .Cast<IAuditCreated>();
foreach (var _entity in _entitiesAdded)
{
    _entity.CreatedDateTime = DateTime.UtcNow;
    // _entity.CreationUser = ...; // application-specific: the current user
}
// Changed ones...
var _entitiesChanged = ChangeTracker.Entries()
    .Where(_e => _e.State == EntityState.Modified)
    .Where(_e => _e.Entity is IAuditChanged)
    .Select(_e => _e.Entity)
    .Cast<IAuditChanged>();
foreach (var _entity in _entitiesChanged)
{
    _entity.LastChangeDateTime = DateTime.UtcNow;
    // _entity.LastChangeUser = ...; // application-specific: the current user
}
// Save...
return base.SaveChanges();
}
}
Do not simply copy and paste!
This code was written a few years ago, in the age of Entity Framework 4. It assumes that you have already detected changes (the ChangeTracker is available), among other things.
Also, we have absolutely no idea how this code impacts performance. That's because this system is used much more for viewing than for updating, and because it's a desktop application, we have plenty of memory and processing time to spare.
You should take that into account, and you might find a better way to implement this. But the whole idea is the same: filter which entities are being added and which are being updated, and handle each group accordingly.
Another approach
There are many approaches to this. One that might perform better in some cases (but is also more complex) is to have some sort of proxy, similar to an EF proxy, handle this.
Again, even if it's an empty interface, it's good to have one to clearly distinguish auditable records from regular ones.
If it's possible to force all of them to have the same property names and types, do it.
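For illustration, a hypothetical entity carrying both audit behaviors would simply implement both interfaces:
public class Invoice : IAuditCreated, IAuditChanged
{
    public int Id { get; set; }

    // IAuditCreated
    public DateTime CreatedDateTime { get; set; }
    public string CreationUser { get; set; }

    // IAuditChanged
    public DateTime LastChangeDateTime { get; set; }
    public string LastChangeUser { get; set; }
}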
Related
I've got a Blazor Server App using the Entity Framework (EF Core).
I use a code first approach with two models, Entry and Target.
Each entry has a target. So a target can have more than one entry pointing to it.
The model for the Target looks like this:
public class Target
{
public string TargetId { get; set; }
[Required]
public string Name { get; set; }
[InverseProperty("Target")]
public List<Entry> Entries { get; set; }
[NotMapped]
public double AverageEntryRating => Entries != null ? Entries.Where(e => e.Rating > 0).Select(e => e.Rating).Average() : 0;
}
An entry can have a rating, the model Entry looks like this:
public class Entry
{
public string EntryId { get; set; }
public int Rating { get; set; }
[Required]
public string TargetId {get; set; }
[ForeignKey("TargetId")]
public Target Target { get; set; }
}
As you can see in my Target model, I would like to know, for each Target, what its average rating is, based on the average of all entries that point to it - that's why there is this [NotMapped] property in the target:
public double AverageEntryRating => Entries != null ? Entries.Where(e => e.Rating > 0).Select(e => e.Rating).Average() : 0;
But this does (of course) not always work, as the Entries of the target are not guaranteed to be loaded at the time the property is accessed.
I tried to solve it differently, for example with a method in my TargetService that takes a targetId and gives me the result:
public double GetTargetMedianEntryRating(string targetId) {
var median = _context.Entries
.Where(e => e.TargetId == targetId && e.Rating > 0)
.Select(e => e.Rating)
.DefaultIfEmpty()
.Average();
return median;
}
But when I list my targets in a table and want to display this value in a cell (passing in the current targetId of the foreach loop), I get a concurrency exception, as the database context is used by multiple threads (I guess one from looping through the rows/targets and another from getting the average value)... so this leads me into new trouble.
Personally I would prefer to work with the AverageEntryRating property on the Target model, as it seems natural to me and it would also be convenient to access the value just like this.
But how would I make sure that the entries are loaded when I access this property? Or is this not a good approach, since it would mean loading Entries for all the targets anyway, which would degrade performance? If so, what would be a good way to get the average/median value?
There are a couple of options I can think of, and it depends on your situation what to do. There might be more alternatives, but I hope this gives you some options you hadn't considered.
Have a BaseQuery extension method that always includes all Entries
You could make sure to apply .Include(x => x.Entries) whenever you are querying for Target. You can even create an extension method on the database context called something like TargetBaseQuery() that includes all the necessary relationships whenever you use it. Then you can be sure that the Entries list of each Target is loaded when you access the property AverageEntryRating.
The downside will be a performance hit, since every time you load a Target you will need to load all its entries... and that's for every Target you query.
However, if you need to get it working fast, this would be probably the easiest. The pragmatic approach would be to do this, measure the performance hit, and if it is too slow then try something else, instead of doing premature optimization. The risk of course would be that it might work fast now, but it might scale badly in the future. So it's up to you to decide.
Another thing to consider would be to not Include the Entries every single time, but only in those places where you know you need the average. It might however become a maintainability issue.
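A minimal sketch of the TargetBaseQuery() idea mentioned above (the context type name AppDbContext is an assumption):
using System.Linq;
using Microsoft.EntityFrameworkCore;

public static class TargetQueryExtensions
{
    // One central place for the eager loading that Target's computed
    // properties depend on.
    public static IQueryable<Target> TargetBaseQuery(this AppDbContext context)
    {
        return context.Target.Include(t => t.Entries);
    }
}

// Usage: var targets = _context.TargetBaseQuery().ToList();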
Have a different model and service method to calculate the TargetStats
You could create another model class that stores the related data of a Target but is not persisted in the database. For example:
public class TargetStats
{
public Target Target { get; set; }
public double AverageEntryRating { get; set; }
}
Then in your service you could have a method roughly like this (I haven't tested it, so it might not work as is, but you get the idea):
public List<TargetStats> GetTargetStats() {
    var targetStats = _context.Target
        .Select(x => new TargetStats
        {
            Target = x,
            AverageEntryRating = x.Entries
                .Where(e => e.Rating > 0)
                .Select(e => (double?)e.Rating)
                .Average() ?? 0, // nullable cast avoids an exception when there are no ratings
        })
        .ToList();
    return targetStats;
}
The advantage of this is that you don't have to degrade the performance of all Target-related queries, but only of those that require the average rating.
But this query in particular might still be slow. To tweak it further, you could write raw SQL instead of LINQ, or create a view in the database that you can query.
Store and update the Target's average rating as a column
Probably the best thing you can do to keep the code clean and get good read performance is to store the average as a column in the Target table. This moves the performance cost of the calculation to the saving/updating of a Target or its related Entries, but the reads will be super fast since the data is already available. If reads happen far more often than updates, it's probably worth doing.
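A hedged sketch of that idea, assuming a service method is called whenever an Entry is added or changed (RecalculateAverageAsync is a hypothetical helper, and AverageEntryRating is now a mapped property on Target instead of [NotMapped]):
public async Task RecalculateAverageAsync(string targetId)
{
    var target = await _context.Target.FindAsync(targetId);
    var ratings = _context.Entries.Where(e => e.TargetId == targetId && e.Rating > 0);
    // Persist the recomputed average on the Target row itself.
    target.AverageEntryRating = await ratings.AnyAsync()
        ? await ratings.AverageAsync(e => e.Rating)
        : 0;
    await _context.SaveChangesAsync();
}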
You could take a look at the EF Core docs on performance, since they talk a little about the different performance-tuning alternatives.
I have a C# Entity which is auto generated via database first:
public partial class Zone
{
public Guid LocationId {get; set;}
...
}
What I need to do is run a function, Process() whenever LocationId is changed. Now normally I would alter the setter and job done, however because this is auto generated via database first, any "manual changes to the file will be overwritten if the code is regenerated."
What would be the best approach here?
The current thinking is to create a new partial class to do something like this:
public partial class Zone
{
    private Guid _pendingLocationId;
    public Guid PendingLocationId
    {
        get { return _pendingLocationId; }
        set
        {
            Guid updatedLocation = Process(value) ?? value;
            _pendingLocationId = updatedLocation;
            LocationId = updatedLocation;
        }
    }
}
Just a note; the unfortunate reality is that there is probably zero chance of us integrating a new framework or library into the application at this stage.
In response to the possible duplicate flag: unless I have misread, that would require us to re-map/encapsulate all of our Zone references into a new class, not only pushing this out to all the views, but also editing many LINQ queries etc. If someone can explain why that would be preferred over my suggested solution, please let me know.
The least intrusive way to do this might be using AOP patterns, for instance with the PostSharp framework: less than 2 lines of code!
[NotifyPropertyChanged] //<-- Add this attribute to the class
public class Zone
{
public Guid LocationId {get; set;}
...
}
To hook the changed event and add your own handler:
//Zone instance;
((INotifyPropertyChanged) instance).PropertyChanged += ZoneOnPropertyChanged;
More details can be found here.
Update: the OP mentioned there is zero chance of integrating another library into the app. I am just curious: don't you use NuGet? And what is the reason for this zero chance? In my personal view, you should not reinvent the wheel when there is already a library that does what you need.
If licensing cost is the issue, or it is overkill or too heavy to introduce such a library just for this problem, I think Fody, a free open-source alternative to PostSharp, can be considered. More specifically, the PropertyChanged.Fody package, which is standalone, compact and lightweight.
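With PropertyChanged.Fody, implementing INotifyPropertyChanged on the partial class is typically enough; the weaver injects the notification calls into the compiled property setters at build time (a sketch, not tested against database-first generated code):
using System.ComponentModel;

public partial class Zone : INotifyPropertyChanged
{
    // PropertyChanged.Fody weaves raise calls into the setters of this
    // class's properties, including those declared in the generated partial file.
    public event PropertyChangedEventHandler PropertyChanged;
}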
I would suggest using AutoMapper.
You can write another class with the same name and properties (with INPC), but in a different namespace. Then, every time you fetch from the database, you use AutoMapper to map the data into your notifying class, and every time you save data to the database you map it back.
That way you only need to change namespaces in the code using your class and add code like this to your repository:
var dtos = args.Select(x => Mapper.Map<Zone>(x)).ToList();
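The one-time mapping configuration might look like this (classic static AutoMapper API; the namespaces here are placeholders):
// Map in both directions between the generated entity and the notifying copy.
Mapper.CreateMap<DataLayer.Zone, NotifyingModel.Zone>();
Mapper.CreateMap<NotifyingModel.Zone, DataLayer.Zone>();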
Have a business entity mapped to your database entity (via AutoMapper), and then have your business entity implement the INotifyPropertyChanged interface.
Pseudo code below. This will decouple your database entity from your business entity and allow independent changes.
namespace DBEntity {
    public class Customer {
        public int Id { get; set; } ...
    }
}
namespace BizEntity {
    public class Customer : INotifyPropertyChanged {
        public event PropertyChangedEventHandler PropertyChanged;

        private int id;
        public int Id {
            get { return this.id; }
            set {
                this.id = value;
                NotifyPropertyChanged(nameof(Id));
            }
        }

        private void NotifyPropertyChanged(string propertyName) {
            PropertyChanged?.Invoke(this, new PropertyChangedEventArgs(propertyName));
        }
    }
}
var dbCustomer = GetCustomerFromDB();
var cust = AutoMapper.Mapper.Map<DBEntity.Customer, BizEntity.Customer>(dbCustomer);
// Update the property as per biz requirement
cust.Id = 53656; // Fires the notification
Let me know if this helps.
Regarding AutoMapper as a new library: this is a minimal change, and there's no licensing cost or steep learning curve here, which should allow fast integration.
I have an Entity Framework project with several linked entities. Since it is used by multiple users at once, I've set up a RowVersion field for entities that are likely to be edited concurrently. Unfortunately, I now get an OptimisticConcurrencyException every time I try to save a new entity that is linked to an already existing entity.
Store update, insert, or delete statement affected an unexpected number of rows (0). Entities may have been modified or deleted since entities were loaded. See http://go.microsoft.com/fwlink/?LinkId=472540 for information on understanding and handling optimistic concurrency exceptions.
The problem is that this error doesn't really give any pointers as to where the fault lies. It could be that the underlying model was modified in the meantime, that there is a validation error on the new model, or something else.
The code I use to add the new entity is as follows:
using (ctx = new DbContext())
{
try
{
ctx.Samples.Add(model);
ctx.SaveChanges();
}
catch (DbUpdateConcurrencyException ex)
{
LogManager.HandleException(ex.InnerException);
}
}
model is the model I want to add to the database.
Edit: As seen above, I modified the code to ignore the update of an underlying model. Furthermore, I have verified through:
ctx.Database.Log = s => Debug.Write(s);
that only an insert statement is sent to the database and not an additional update statement:
INSERT [dbo].[Samples]([IDSample], [ModificationDate], [IDUser])
VALUES (@0, @1, @2)
SELECT [RowVersion]
FROM [dbo].[Samples]
WHERE @@ROWCOUNT > 0 AND [IDSample] = @0 AND [ModificationDate] = @1
I would understand the exception if I were updating an entity and the rowversion column didn't match, but in this case it's a completely new entity. Is there a way to see if one of the properties is malformed?
Edit2:
Instead of just trimming the milliseconds, I now use DateTime.Today instead of DateTime.Now, which works. Seemingly there is some problem with datetime2(4) on ModificationDate. I already made sure that ModificationDate is truncated to 4 fractional digits, so there should be no parse error.
Edit3:
After switching back to DateTime.Now and trimming the milliseconds, it stopped working and the entities are no longer inserted into the database. Could this be caused by SQL Server having problems matching the entities based on millisecond values? I executed the EF-generated SQL above with some fictional values and it went through, although on some occasions the query didn't return a rowversion value. Entity Framework interprets this as a return value of 0 rows and therefore raises a concurrency exception. (It should also be noted that ModificationDate together with IDSample is the primary key of the entity.)
Edit4:
I'm now using DateTime.Today and then adding the needed precision, which works for me. This can be flagged as solved. (Although I would have expected EF to take care of the datetime format conversion by itself :/)
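For reference, the truncation described in the edits can be done at the tick level; a small hypothetical helper (datetime2(4) keeps 4 fractional-second digits, i.e. 0.1 ms, and 1 tick = 100 ns, so 0.1 ms = 1,000 ticks):
public static class DateTimeExtensions
{
    // Drop everything below the given tick granularity.
    public static DateTime Truncate(this DateTime value, long ticksPerUnit)
    {
        return new DateTime(value.Ticks - (value.Ticks % ticksPerUnit), value.Kind);
    }
}

// Usage: var stamp = DateTime.Now.Truncate(1000); // datetime2(4) precision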
The question I have is: where are/were you adding the DateTime? You are creating too many steps to hammer out this problem - creating a datetime, modifying it, and so on.
If your entity inherits from a base class with mapped properties, do your concurrency add/update in the DbContext override of SaveChanges().
Here's an example: (written without optimized syntax)
public abstract class EntityBase
{
public int Id {get; set;}
public DateTime CreationDate {get; set;}
public DateTime? ModifyDate {get; set;}
public string VersionHash {get; set;}
}
public static class EntityBaseExtensions
{
public static void MyBaseEntityMapping<T>(this EntityTypeConfiguration<T> configuration) where T : EntityBase
{
configuration.HasKey(x => x.Id);
configuration.Property(x => x.CreationDate)
.IsRequired();
configuration.Property(x => x.ModifyDate)
.IsOptional();
configuration.Property(x => x.VersionHash).IsConcurrencyToken();
}
}
public class MyEntity : EntityBase
{
public string MyProperty {get; set;}
}
public class MyEntityMapping : EntityTypeConfiguration<MyEntity>
{
public MyEntityMapping()
{
this.MyBaseEntityMapping();
Property(x=>x.MyProperty).IsRequired();
}
}
public class MyContext : DbContext
{
....
public override int SaveChanges()
{
this.ChangeTracker.DetectChanges(); //this forces EF to compare changes to originals including references and one to many relationships, I'm in the habit of doing this.
var context = ((IObjectContextAdapter)this).ObjectContext; //grab the underlying context
var ostateEntries = context.ObjectStateManager.GetObjectStateEntries(EntityState.Modified | EntityState.Added); // grab the entity entries (add/remove, queried) in the current context
var stateEntries = ostateEntries.Where(x => x.IsRelationship == false && x.Entity is EntityBase); // don't care about relationships, but has to inherit from EntityBase
var time = DateTime.Now; //getting a date for our auditing dates
foreach (var entry in stateEntries)
{
var entity = entry.Entity as EntityBase;
if (entity != null) //redundant, but resharper still yells at you :)
{
if (entry.State == EntityState.Added) //could also look at Id field > 0, but this is safe enough
{
entity.CreationDate = time;
}
entity.ModifyDate = time;
entity.VersionHash = Guid.NewGuid().ToString().Replace("-", "").Substring(0, 10); //this an example of a simple random configuration of letters/numbers.. since the query on sql server is primarily using the primary key index, you can use whatever you want without worrying about query execution.. just don't query on the version itself!
}
}
return base.SaveChanges();
}
....
}
Our company ships a suite of applications that manipulate data in a database. Each application has its specific business logic, but all applications share a common subset of business rules. The common stuff is encapsulated in a bunch of legacy COM DLLs, written in C++, which use "classic ADO" (they usually call stored procedures; sometimes they use dynamic SQL). Most of these DLLs have XML-based methods (not to mention the proprietary-format-based methods!) to create, edit, delete and retrieve objects, plus extra actions such as methods which copy and transform many entities quickly.
The middleware DLLs are now very old, and our application developers want a new object-oriented (not XML-oriented) middleware that can be easily used by C# applications.
Many people in the company say that we should forget old paradigms and move to new cool stuff such as Entity Framework. They are intrigued by the simplicity of POCOs and they would like to use LINQ to retrieve data (the XML-based query methods of the DLLs are not so easy to use and will never be as flexible as LINQ).
So I'm trying to create a mock-up for a simplified scenario (the real scenario is much more complex, and here I'll post just a simplified subset of the simplified scenario!). I'm using Visual Studio 2010, Entity Framework 5 Code First, SQL Server 2008 R2.
Please have mercy if I make stupid mistakes; I'm a newbie to Entity Framework.
Since I have many different doubts, I'll post them in separate threads.
This is the first one. Legacy XML methods have a signature like this:
bool Edit(string xmlstring, out string errorMessage)
With a format like this:
<ORDER>
<ID>234</ID>
<NAME>SuperFastCar</NAME>
<QUANTITY>3</QUANTITY>
<LABEL>abc</LABEL>
</ORDER>
The Edit method implemented the following business logic: when a Quantity is changed, an "automatic scaling" must be applied to all Orders which have the same Label.
E.g. there are three orders: OrderA has quantity = 3, label = X. OrderB has quantity = 4, label = X. OrderC has quantity = 5, label = Y.
I call the Edit method supplying a new quantity = 6 for OrderA, i.e. I'm doubling OrderA's quantity. Then, according to the business logic, OrderB's quantity must be automatically doubled, and must become 8, because OrderB and OrderA have the same label. OrderC must not be changed because it has a different label.
How can I replicate this with POCO classes and Entity Framework? It's a problem because the old Edit method can change only one order at a time, while
Entity Framework can change a lot of Orders when SaveChanges is called. Furthermore, a single call to SaveChanges can also create new Orders.
Temporary assumptions, just for this test: 1) if several Order quantities are changed at the same time and the scaling factor is not the same for all of them, NO scaling occurs; 2) newly added Orders are not automatically scaled, even if they have the same label as a scaled order.
I tried to implement it by overriding SaveChanges.
POCO class:
using System;
namespace MockOrders
{
public class Order
{
public Int64 Id { get; set; }
public string Name { get; set; }
public string Label { get; set; }
public decimal Quantity { get; set; }
}
}
Migration file (to create indexes):
namespace MockOrders.Migrations
{
using System;
using System.Data.Entity.Migrations;
public partial class UniqueIndexes : DbMigration
{
public override void Up()
{
CreateIndex("dbo.Orders", "Name", true /* unique */, "myIndex1_Order_Name_Unique");
CreateIndex("dbo.Orders", "Label", false /* NOT unique */, "myIndex2_Order_Label");
}
public override void Down()
{
DropIndex("dbo.Orders", "myIndex2_Order_Label");
DropIndex("dbo.Orders", "myIndex1_Order_Name_Unique");
}
}
}
DbContext:
using System;
using System.Data.Entity;
using System.Data.Entity.ModelConfiguration;
using System.Linq;
namespace MockOrders
{
public class MyContext : DbContext
{
public MyContext() : base(GenerateConnection())
{
}
private static string GenerateConnection()
{
var sqlBuilder = new System.Data.SqlClient.SqlConnectionStringBuilder();
sqlBuilder.DataSource = @"localhost\aaaaaa";
sqlBuilder.InitialCatalog = "aaaaaa";
sqlBuilder.UserID = "aaaaa";
sqlBuilder.Password = "aaaaaaaaa!";
return sqlBuilder.ToString();
}
protected override void OnModelCreating(DbModelBuilder modelBuilder)
{
modelBuilder.Configurations.Add(new OrderConfig());
}
public override int SaveChanges()
{
ChangeTracker.DetectChanges();
var groupByLabel = from changedEntity in ChangeTracker.Entries<Order>()
where changedEntity.State == System.Data.EntityState.Modified
&& changedEntity.Property(o => o.Quantity).IsModified
&& changedEntity.Property(o => o.Quantity).OriginalValue != 0
&& !String.IsNullOrEmpty(changedEntity.Property(o => o.Label).CurrentValue)
group changedEntity by changedEntity.Property(o => o.Label).CurrentValue into x
select new { Label = x.Key, List = x};
foreach (var labeledGroup in groupByLabel)
{
var withScalingFactor = from changedEntity in labeledGroup.List
select new
{
ChangedEntity = changedEntity,
ScalingFactor = changedEntity.Property(o => o.Quantity).CurrentValue / changedEntity.Property(o => o.Quantity).OriginalValue
};
var groupByScalingFactor = from t in withScalingFactor
group t by t.ScalingFactor into g select g;
// if there are too many scaling factors for this label, skip automatic scaling
if (groupByScalingFactor.Count() == 1)
{
decimal scalingFactor = groupByScalingFactor.First().Key;
if (scalingFactor != 1)
{
var query = from oo in this.AllTheOrders where oo.Label == labeledGroup.Label select oo;
foreach (Order ord in query)
{
if (this.Entry(ord).State != System.Data.EntityState.Modified
&& this.Entry(ord).State != System.Data.EntityState.Added)
{
ord.Quantity = ord.Quantity * scalingFactor;
}
}
}
}
}
return base.SaveChanges();
}
public DbSet<Order> AllTheOrders { get; set; }
}
class OrderConfig : EntityTypeConfiguration<Order>
{
public OrderConfig()
{
Property(o => o.Name).HasMaxLength(200).IsRequired();
Property(o => o.Label).HasMaxLength(400);
}
}
}
It seems to work (barring bugs of course), but this was an example with just 1 class: a real production application may have hundreds of classes!
I'm afraid that in a real scenario, with a lot of constraints and business logic, the override of SaveChanges could quickly become long, cluttered and error-prone.
Some colleagues are also concerned about performance. In our legacy DLLs, a lot of business logic (such as "automatic" actions) lives in stored procedures, and some colleagues are worried that the SaveChanges-based approach may introduce too many round-trips and hinder performance.
In the override of SaveChanges we could also invoke stored procedures, but what about transactional integrity? What if I make changes to the database before I call base.SaveChanges(), and base.SaveChanges() fails?
Is there a different approach? Am I missing something?
Thank you very much!
Demetrio
p.s. By the way, is there a difference between overriding SaveChanges and registering to "SavingChanges" event? I read this document but it does not explain whether there's a difference:
http://msdn.microsoft.com/en-us/library/cc716714(v=vs.100).aspx
This post:
Entity Framework SaveChanges - Customize Behavior?
says that "when overriding SaveChanges you can put custom logic before and AFTER calling base.SaveChanges". But are there other caveats/advantages/drawbacks?
I'd say this logic belongs either in your MockOrders.Order class, in a class from a higher layer which uses your Order class (e.g. BusinessLogic.Order), or in a Label class. It sounds like your label acts as a joining attribute, so, without knowing the particulars, I'd pull it out and make it an entity of its own; this will give you navigation properties so you can more naturally access all Orders with the same label.
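Roughly, the normalized shape could look like this (a hypothetical reworking of the model):
public class Label
{
    public long Id { get; set; }
    public string Name { get; set; }
    public virtual ICollection<Order> Orders { get; set; } // all orders sharing this label
}

// Order then references Label instead of carrying a string column:
// public long? LabelId { get; set; }
// public virtual Label Label { get; set; }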
If modifying the DB to normalise out Labels is not a goer, build a view and bring that into your entity model for this purpose.
I've had to do something similar, but I've created an IPrepForSave interface, and implemented that interface for any entities that need to do some business logic before they're saved.
The interface (pardon the VB.NET):
Public Interface IPrepForSave
Sub PrepForSave()
End Interface
The dbContext.SaveChanges override:
Public Overloads Overrides Function SaveChanges() As Integer
ChangeTracker.DetectChanges()
'** Any entities that implement IPrepForSave should have their PrepForSave method called before saving.
Dim changedEntitiesToPrep = From br In ChangeTracker.Entries(Of IPrepForSave)()
Where br.State = EntityState.Added OrElse br.State = EntityState.Modified
Select br.Entity
For Each br In changedEntitiesToPrep
br.PrepForSave()
Next
Return MyBase.SaveChanges()
End Function
And then I can keep the business logic in the Entity itself, in the implemented PrepForSave() method:
Partial Public Class MyEntity
Implements IPrepForSave
Public Sub PrepForSave() Implements IPrepForSave.PrepForSave
'Do Stuff Here...
End Sub
End Class
Note that I place some restrictions on what can be done in the PrepForSave() method:
Any changes to the entity cannot make the entity validation logic fail, because this will be called after the validation logic is called.
Database access should be kept to a minimum, and should be read-only.
Any entities that don't need to do business logic before saving should not implement this interface.
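For C# readers, a rough equivalent of the interface and the SaveChanges override would be (a sketch under the same assumptions as the VB.NET version):
public interface IPrepForSave
{
    void PrepForSave();
}

public override int SaveChanges()
{
    ChangeTracker.DetectChanges();
    // Call PrepForSave on every added or modified entity that implements it.
    var changedEntitiesToPrep = ChangeTracker.Entries<IPrepForSave>()
        .Where(e => e.State == EntityState.Added || e.State == EntityState.Modified)
        .Select(e => e.Entity);
    foreach (var entity in changedEntitiesToPrep)
        entity.PrepForSave();
    return base.SaveChanges();
}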
I have the following repository. I have a mapping between LINQ to SQL generated classes and domain objects using a factory.
The following code works, but I see two potential issues:
1) It issues a SELECT query before the update statement.
2) It needs to update all the columns (not only the changed ones), because we don't know which columns were changed in the domain object.
How do I overcome these shortcomings?
Note: There can be scenarios (like triggers) which are executed based on specific column updates, so I cannot update a column unnecessarily.
REFERENCE:
LINQ to SQL: Updating without Refresh when “UpdateCheck = Never”
http://www.sqlteam.com/forums/topic.asp?TOPIC_ID=113917
CODE
namespace RepositoryLayer
{
public interface ILijosBankRepository
{
void SubmitChangesForEntity(DomainEntitiesForBank.IBankAccount iBankAcc);
}
public class LijosSimpleBankRepository : ILijosBankRepository
{
private IBankAccountFactory bankFactory = new MySimpleBankAccountFactory();
public System.Data.Linq.DataContext Context
{
get;
set;
}
public virtual void SubmitChangesForEntity(DomainEntitiesForBank.IBankAccount iBankAcc)
{
//Does not get help from automated change tracking (due to mapping)
//Selecting the required entity
DBML_Project.BankAccount tableEntity = Context.GetTable<DBML_Project.BankAccount>().SingleOrDefault(p => p.BankAccountID == iBankAcc.BankAccountID);
if (tableEntity != null)
{
//Setting all the values to updates (except primary key)
tableEntity.Status = iBankAcc.AccountStatus;
//Type Checking
if (iBankAcc is DomainEntitiesForBank.FixedBankAccount)
{
tableEntity.AccountType = "Fixed";
}
if (iBankAcc is DomainEntitiesForBank.SavingsBankAccount)
{
tableEntity.AccountType = "Savings";
}
Context.SubmitChanges();
}
}
}
}
namespace DomainEntitiesForBank
{
public interface IBankAccount
{
int BankAccountID { get; set; }
double Balance { get; set; }
string AccountStatus { get; set; }
void FreezeAccount();
}
public class FixedBankAccount : IBankAccount
{
public int BankAccountID { get; set; }
public string AccountStatus { get; set; }
public double Balance { get; set; }
public void FreezeAccount()
{
AccountStatus = "Frozen";
}
}
}
If I understand your question, you are being passed an entity that you need to save to the database without knowing what the original values were, or which of the columns have actually changed.
If that is the case, then you have four options:
1. Go back to the database to see the original values, i.e. perform the select, as your code is doing. This allows you to set all your entity values, and Linq2Sql will take care of which columns actually changed. So if none of your columns actually changed, no update statement is triggered.
2. Avoid the select and just update the columns. You already know how to do this (but for others, see this question and answer). Since you don't know which columns have changed, you have no option but to set them all. This will produce an update statement even if no columns actually changed, and that can fire database triggers. Apart from disabling the triggers, about the only thing you can do here is make sure the triggers are written to compare the old and new column values, to avoid any further unnecessary updates.
3. Change your requirements/program so that you receive both the old and new entity values, so you can determine which columns changed without going back to the database.
4. Don't use LINQ for your updates. LINQ stands for Language Integrated Query, and it is (IMHO) brilliant at querying, but I always looked on the updating/deleting features as an extra bonus, not something it was designed for. Also, if timing/performance is critical, there is no way LINQ will match properly hand-crafted SQL.
This isn't really a DDD question; from what I can tell you are asking:
Use linq to generate direct update without select
The accepted answer there was no, it's not possible, but there's a higher-voted answer suggesting you can attach an object to your context to initiate the data context's change tracking.
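For reference, that attach-based approach applied to the repository above might look like this (a sketch; it requires UpdateCheck = Never on the non-key columns, as in the linked reference, and note it still updates every column):
public virtual void SubmitChangesForEntity(DomainEntitiesForBank.IBankAccount iBankAcc)
{
    // Build a detached copy carrying the key and the new values, then
    // attach it as modified: no SELECT is issued before the UPDATE.
    var tableEntity = new DBML_Project.BankAccount
    {
        BankAccountID = iBankAcc.BankAccountID,
        Status = iBankAcc.AccountStatus
    };
    Context.GetTable<DBML_Project.BankAccount>().Attach(tableEntity, true);
    Context.SubmitChanges();
}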
Your second point about disabling triggers has been answered here and here. But as others have commented, do you really need the triggers? Should you not be controlling these updates in code?
In general I think you're looking at premature optimization. You're using an ORM, and as part of that you're trusting L2S to make the database plumbing decisions for you. But remember, where appropriate, you can use stored procedures to execute specific SQL.