I have a table Communication which has a reference to PersonCompany.
In the mapping for PersonCompany I have defined a cascade delete for this reference:
this.HasMany(x => x.Communications)
.AsSet()
.KeyColumn("PersonCompanyId")
.Fetch.Select()
.Inverse()
.Cascade.Delete();
But when I now execute the following HQL query:
var sql = "delete from PersonCompany where Person.Id in (:idList) or Company.Id in (:idList)";
with
var query = NHibernateHelper.CurrentSession.CreateQuery(sql);
query.SetParameterList("idList", contactIdList);
query.SetTimeout(0);
query.ExecuteUpdate();
I always get this SqlException:
The DELETE statement conflicted with the REFERENCE constraint "FK_PersonCompany_Communication". The conflict occurred in database "proconact", table "dbo.Communication", column 'PersonCompanyId'.
The statement has been terminated.
I think NHibernate should now cascade-delete the referenced records in Communication, shouldn't it?
I hope someone can tell me what I am doing wrong.
The syntax you've used is in fact part of the
13.3. DML-style operations,
which are used for BULK operations on the DB server. They use the HQL syntax, but they do not trigger cascades (because they are executed directly on the DB side, not in memory).
This way, we can load the objects to be deleted... and explicitly delete them. This will trigger the cascades:
//var sql = "delete from PersonCompany where Person.Id in (:idList) or Company.Id in (:idList)";
var sql = "from PersonCompany where Person.Id in (:idList) or Company.Id in (:idList)";
var session = NHibernateHelper.CurrentSession;
var query = session.CreateQuery(sql);
query.SetParameterList("idList", contactIdList);
query.SetTimeout(0);
//query.ExecuteUpdate();
var list = query.List<PersonCompany>();
foreach (var item in list)
{
    session.Delete(item);
}
session.Flush();
What happens here is that each item to be deleted is loaded and placed in the ISession. During Delete(), all the cascades are executed properly; see
9.9. Lifecycles and object graphs
The way to do it is:
IList<PersonCompany> _pCompanies = ...; // load the required person companies
foreach (var pc in _pCompanies)
{
    _session.Delete(pc);
}
because when you use bulk operations, NHibernate's mapped cascades are not applied. You could instead create a DB-level cascade constraint.
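For reference, such a DB-level cascade for the failing constraint from the question could look like this (a T-SQL sketch; the primary key column of PersonCompany is assumed to be Id):

ALTER TABLE dbo.Communication
    DROP CONSTRAINT FK_PersonCompany_Communication;

ALTER TABLE dbo.Communication
    ADD CONSTRAINT FK_PersonCompany_Communication
    FOREIGN KEY (PersonCompanyId) REFERENCES dbo.PersonCompany (Id)
    ON DELETE CASCADE;

With this in place, the server itself removes the dependent Communication rows, so even the DML-style HQL delete succeeds.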
I am working on a C# Entity Framework application.
I am migrating data from a couple of tables from one database to another database. The tables have the same names in both databases.
Here is my code
string ConnexionOrigen = ReadConnectionBiz.GetOrigen();
string ConnexionDestino = ReadConnectionBiz.GetDestino();
using (var db = new MigrateModel(ConnexionOrigen))
{
db.Configuration.LazyLoadingEnabled = false;
db.Configuration.ProxyCreationEnabled = false;
db.Configuration.AutoDetectChangesEnabled = false;
List<Promocion> promo = await (from p in db.Promocions
.Include(a =>a.PromocionAddresIds)
where (p.CAM_ID == 107936 || p.CAM_ID == 107937)
select p)
.AsNoTracking()
.ToListAsync();
var dbDestino = new MigrateModel(ConnexionDestino);
foreach (Promocion pp in promo)
{
try
{
dbDestino.Promocions.Add(pp);
await dbDestino.SaveChangesAsync();
}
catch (Exception ex)
{
string err = ex.GetBaseException().Message; // root cause, without chained InnerException lookups
}
}
}
What it does is:
It reads the records from the Promocions table in the Origen database and inserts them into the same table in the Destino database.
I include a child table called PromocionAddresIds that is migrated too.
Everything works fine...
The table PromocionAddresIds has a constraint with a table that it is not been migrated.
When I insert the Promocion with CAM_ID = 107936, it throws a constraint violation exception... which is correct...
But then it reads the next Promocion record, with CAM_ID = 107937. That record has no rows in PromocionAddresIds, so it should insert fine, but it throws the same exception...
It appears that await dbDestino.SaveChangesAsync(); still holds the previous record, or it persists somewhere...
I have tried adding
dbDestino.Promocions.Remove(pp); inside the catch block, but it did not work.
I do not have any clue how to solve it.
Any ideas?
Thanks
You created a single instance of the data context for all iterations of the loop. All objects added to the context will live there until the context is disposed or cleared. The situation you observe is:
First entity is added
An attempt to save changes fails due to the constraint violation
Second entity is added
Another attempt to save changes. Here both the first and the second entity will be inserted, because both still belong to the data context.
Please also note that calling SaveChanges after each entity is usually not efficient. It is better to add a portion of entities (let's say 100) and persist them in one shot.
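A minimal sketch of the per-record variant, reusing the names from the question's code: scoping a fresh context to each iteration means a failed insert dies with its context instead of being retried on the next SaveChangesAsync.

foreach (Promocion pp in promo)
{
    // Short-lived destination context: nothing lingers in a change tracker
    // between iterations.
    using (var dbDestino = new MigrateModel(ConnexionDestino))
    {
        try
        {
            dbDestino.Promocions.Add(pp);
            await dbDestino.SaveChangesAsync();
        }
        catch (Exception ex)
        {
            // Log the root cause and continue with the next record.
            string err = ex.GetBaseException().Message;
        }
    }
}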
I am working on Entity Framework with the database-first approach and I came across the issue below.
I have a Customer table with columns col1, col2, col3, ..., col8. I have created an entity for this table, and the table already has around 100 records. Out of the 8 columns, col4 is marked as non-null.
class Customer
{
    public string col1 { get; set; }
    public string col2 { get; set; }
    public string col3 { get; set; }
    public string col4 { get; set; } // non-null in the database
    // ...
    public string col8 { get; set; }
}
class Main
{
//main logic to read data from database using EF
Customer obj = ...; // a Customer with values assigned only to col1, col2 and col3
obj.col2 = someChangedValue;
DBContext.SaveChanges(); // <- throws an error stating it expects a value for col4
}
In my application, I am trying to read one of the records via EF using a stored procedure, and the stored procedure only returns col1, col2 and col3.
I am trying to save the modified value of col2 back to the database using the DbContext, but it throws an error stating that a value for the required field col4 is not provided.
FYI: I have gone through a couple of forums and questions, and the option of disabling validation on SaveChanges is not feasible for me.
Is there any other way I can achieve a partial update?
I guess EntityFramework.Utilities satisfies your conditions.
This code:
using (var db = new YourDbContext())
{
db.AttachAndModify(new BlogPost { ID = postId }).Set(x => x.Reads, 10);
db.SaveChanges();
}
will generate a single SQL command:
exec sp_executesql N'UPDATE [dbo].[BlogPosts]
SET [Reads] = @0
WHERE ([ID] = @1)
',N'@0 int,@1 int',@0=10,@1=1
disabling validation on SaveChanges is not feasible for me
Sure it is. You even have to disable validation on Save. But then you can't mark the whole entity as modified, which I think you did. You must mark individual properties as modified:
var mySmallCustomer = someService.GetCustomer(); // from sproc
mySmallCustomer.col2 = "updated";
// The stub must have its primary key set before it is attached
// (col1 is assumed to be the key column here):
var myLargeCustomer = new Customer { col1 = mySmallCustomer.col1 };
context.Customers.Attach(myLargeCustomer);
context.Entry(myLargeCustomer).CurrentValues.SetValues(mySmallCustomer);
// Here it comes:
context.Entry(myLargeCustomer).Property(c => c.col2).IsModified = true;
context.Configuration.ValidateOnSaveEnabled = false;
context.SaveChanges();
So you see it's enough to get the "small" customer. From this object you create a stub entity (myLargeCustomer) that is used for updating the one property.
During a migration, how do I insert into my table, then retrieve the generated ID, and then use it to insert related data into another table?
What I have now is a hardcoded ID to insert, but I don't know what it is going to be when I run the migration.
var contactId = 2;
var phoneNumberId = 2;
Insert.IntoTable("Contacts")
.WithIdentityInsert()
.Row(new
{
Id = contactId,
TimeZoneId = contact.TimeZoneId,
contact.CultureId,
Type = (byte)(int)contact.Type,
Email = contact.Email.ToString(),
EntityId = entityId
});
Insert.IntoTable("PhoneNumbers")
.WithIdentityInsert()
.Row(new
{
Id = phoneNumberId,
phone.Number,
Type = (byte)(int)phone.Type,
ContactId = contactId
});
I'd like to be able to retrieve the inserted ID and use it for the second insert instead of hardcoding it.
I'm using SQL Server if it's any help...
I thought this would be trivial, but it seems it's not, after googling for it and not seeing any answers here.
You are able to manually insert the Id by chaining .WithIdentityInsert() in your insert expression, as you are already doing.
This will let you keep it in memory for use within other objects as Foreign Keys. Unfortunately, FluentMigrator doesn't actually execute any SQL until after all code within the Up() or Down() methods finish executing.
I use Execute.Sql() with @@IDENTITY in the SQL query for such cases.
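For example, a sketch against the Contacts/PhoneNumbers tables from the question (the column values are placeholders, and SCOPE_IDENTITY() is used rather than @@IDENTITY because it is not affected by identities generated in triggers):

Execute.Sql(@"
    DECLARE @contactId int;

    INSERT INTO Contacts (TimeZoneId, CultureId, Type, Email, EntityId)
    VALUES (1, 1, 0, 'someone@example.com', 42);

    SET @contactId = SCOPE_IDENTITY();

    INSERT INTO PhoneNumbers (Number, Type, ContactId)
    VALUES ('555-0100', 0, @contactId);
");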
I'm using the EF 5 database-first approach in my MVC application. All of my tables use a field called Deleted, a boolean that marks a record as deleted.
I'm trying to get rid of having to check Deleted == false every time I query the database. The straightforward way of doing this is to use conditional mapping in the edmx file, so that EF only ever returns data that is not deleted. That all works.
But the problem with this conditional mapping is that when I want to let the user delete a record (e.g. an Address from their address book), I no longer have access to the Deleted field through EF, because it is used in the conditional mapping, so I have to find another way to let the user delete a record.
The way I thought of is to create a stored proc that handles the delete query and to call it when I want to delete a record.
Is there a better way of doing this? Is it possible to make the Deleted field accessible even though it is used in the conditional mapping?
I have a working solution for Soft Delete in Entity Framework Code First that may help.
The key is that you add a discriminator to every model that you want to be able to soft delete. In code first that is done like this:
modelBuilder.Entity<Foo>().Map(m => m.Requires("IsDeleted").HasValue(false));
This makes the IsDeleted column invisible to the context, and therefore you have to do the deletes using SQL.
If this is the equivalent of your "conditional mapping" in Database First, then one way to run the SQL is to override SaveChanges and execute it from there:
public override int SaveChanges()
{
foreach (var entry in ChangeTracker.Entries()
.Where(p => p.State == EntityState.Deleted
&& p.Entity is ModelBase))//I do have a base class for entities with a single
//"ID" property - all my entities derive from this,
//but you could use ISoftDelete here
SoftDelete(entry);
return base.SaveChanges();
}
private void SoftDelete(DbEntityEntry entry)
{
var e = entry.Entity as ModelBase;
string tableName = GetTableName(e.GetType());
Database.ExecuteSqlCommand(
String.Format("UPDATE {0} SET IsDeleted = 1 WHERE ID = #id", tableName)
, new SqlParameter("id", e.ID));
//Marking it Unchanged prevents the hard delete
//entry.State = EntityState.Unchanged;
//So does setting it to Detached:
//And that is what EF does when it deletes an item
//http://msdn.microsoft.com/en-us/data/jj592676.aspx
entry.State = EntityState.Detached;
}
Method used to Get Table Name explained here
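The linked method is not reproduced here, but as a rough sketch (an assumption, not necessarily what the link describes), the table name can be resolved from the [Table] attribute with a naive pluralization fallback:

// Requires System.Linq and System.ComponentModel.DataAnnotations.Schema.
private string GetTableName(Type type)
{
    // EF dynamic proxies derive from the real entity type.
    if (type.Namespace == "System.Data.Entity.DynamicProxies")
        type = type.BaseType;

    var tableAttribute = type.GetCustomAttributes(typeof(TableAttribute), false)
                             .Cast<TableAttribute>()
                             .FirstOrDefault();

    // Fall back to EF's default pluralization convention.
    return tableAttribute != null ? tableAttribute.Name : type.Name + "s";
}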
That is the way I used to do it. Probably irrelevant to your Database First approach in EF5, but I have now moved to doing it in stored procedures. EF6 Code First generates CreateStoredProcedure calls in Migration files. I replace these with this.CreateDeleteProcedure("dbo.Foo_Delete", "[dbo].[Foos]"); - which is a call to my own extension method:
public static class MigrationExtensions
{
internal static string DeleteSqlFormat
{
//I also hard delete anything deleted more than a day ago in the same table
get { return "DELETE FROM {0} WHERE IsDeleted = 1 AND DATEADD(DAY, 1, DeletedAt) < GETUTCDATE(); UPDATE {0} SET IsDeleted = 1, DeletedAt = GETUTCDATE() WHERE ID = #ID;"; }
}
internal static void CreateDeleteProcedure(this DbMigration migration, string procName, string tableName)
{
migration.CreateStoredProcedure(
procName,
p => new
{
ID = p.Int(),
},
body:
string.Format(MigrationExtensions.DeleteSqlFormat, tableName)
);
}
}
I'm doing an EF query with a list which contains many elements and I seem to be coming across the same issue listed here (where SQL Server throws an error because there are too many elements in the IN statement). I was wondering if there is another way to do it.
Would ExecuteStoreQuery or ExecuteStoreCommand work if the command set up a temporary table?
Thanks
Yep, the best way to select from a large list of keys is to use a temp table.
http://explainextended.com/2009/08/18/passing-parameters-in-mysql-in-list-vs-temporary-table/
If you're using MS SQL and C#, then SqlBulkCopy is the fastest way to get your list of keys into the database.
// Streams the keys in 'table' into the (temp) table tmpTableName over an
// already-open SqlConnection.
public void bulkCopy(String tmpTableName, DataTable table)
{
    using (SqlBulkCopy bulkCopy = new SqlBulkCopy((SqlConnection)connection))
    {
        bulkCopy.DestinationTableName = tmpTableName;
        bulkCopy.WriteToServer(table);
    }
}
Then have a stored procedure to match to the temp table by key.
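To answer the ExecuteStoreQuery/ExecuteStoreCommand part: yes, as long as all the commands share one open connection, because a local temp table only lives as long as the connection. A sketch using the DbContext.Database API (EF5+; MyDbContext, Order and the #Keys layout are assumptions):

using (var db = new MyDbContext())
{
    // Open the connection manually so the temp table survives between commands.
    db.Database.Connection.Open();

    // 1. Create the temp table on this connection.
    db.Database.ExecuteSqlCommand("CREATE TABLE #Keys (Id INT PRIMARY KEY);");

    // 2. Fill it, e.g. with the bulkCopy helper above writing to "#Keys".

    // 3. Join against it (a stored procedure would work the same way).
    var orders = db.Database.SqlQuery<Order>(
        "SELECT o.* FROM Orders o JOIN #Keys k ON k.Id = o.StatusId").ToList();
}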
If that list data comes from a database query, do not call .ToList() on the source query. Instead, pass the IQueryable into the second query, so EF composes everything into a single SQL statement with a subquery instead of a huge IN list.
var statuses = context.Statuses.Where(o => o.IsActive).Select(o => o.Id);
var data = context.Orders.Where(o => statuses.Contains(o.StatusId));
Alternatively you can use something like this
var sessionId = Guid.NewGuid();
foreach (var s in statusList)
insert into SearchValues(sessionId, s); // pseudo code
var statuses = context.SearchValues.Where(o => o.SessionId == sessionId).Select(o => o.Id);
var data = context.Orders.Where(o => statuses.Contains(o.StatusId));
delete from SearchValues where SessionId == @sessionId // pseudo code
You might want to do the insert and delete using SQL queries (and not EF context operations) for optimal performance.
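As a concrete sketch of that pseudocode (EF5+ on SQL Server; the SearchValues entity/table and its columns are assumptions):

var sessionId = Guid.NewGuid();

// Insert the keys with raw SQL instead of going through the change tracker.
foreach (var s in statusList)
    context.Database.ExecuteSqlCommand(
        "INSERT INTO SearchValues (SessionId, Id) VALUES (@p0, @p1)", sessionId, s);

var statuses = context.SearchValues
    .Where(o => o.SessionId == sessionId)
    .Select(o => o.Id);
var data = context.Orders
    .Where(o => statuses.Contains(o.StatusId))
    .ToList();

// Clean up this session's keys.
context.Database.ExecuteSqlCommand(
    "DELETE FROM SearchValues WHERE SessionId = @p0", sessionId);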