Update only a single item in LINQ - C#

Currently I update a single item in a database as follows:
var master = (from tmi in db._Masters where tmi.Id == Id select tmi).FirstOrDefault();
master.logPosition++;
db.SubmitChanges();
This strikes me as inefficient, as (I think) I am pulling out a full DB row to just update a single value. Is there a more efficient query I can use?

You can select only the field you want to modify by adjusting your original LINQ select:
var master = (from tmi in db._Masters
              where tmi.Id == Id
              select new { tmi.logPosition }).FirstOrDefault();
master.logPosition++;
db.SubmitChanges();
EDIT: By selecting the specific data into associated properties, the property logPosition should lose its read-only status and be fully updatable.
var master = (from tmi in db._Masters
              where tmi.Id == Id
              select new {
                  Id = tmi.Id,
                  logPosition = tmi.logPosition }).FirstOrDefault();
master.logPosition++;
db.SubmitChanges();

I assume this is LINQ to SQL; please correct if wrong.
You could create a stored procedure, pull it into your DataContext, and then invoke it instead of using the above query.
Alternatively, you could hand-write an UPDATE statement and use DataContext.ExecuteCommand to execute it.
However, none of this really matters until you profile and find out whether or not this is truly a bottleneck. If it is not, I would stick with the simplest thing that works, which is what you already have.
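For illustration, a minimal sketch of the hand-written approach, assuming a LINQ to SQL DataContext named db and the table and column names from the question:
// Single round trip: ExecuteCommand sends the UPDATE directly and
// returns the number of affected rows. No entity is materialized.
int rows = db.ExecuteCommand(
    "UPDATE _Masters SET logPosition = logPosition + 1 WHERE Id = {0}",
    Id);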

I can't get the accepted answer by Joel to work.
The select new { } creates an anonymous type, which is not updatable (as you stated in your comment; see also this question) and is also not part of the DataContext.
I described a couple of possible solutions on my blog:
- Stored procedure
- View
- ExecuteCommand
- dbml mappings

Related

Get SYSTEM_TIME information from a temporal table with the Entity Framework code-first method

I created my temporal table following the instructions in this link: "Cutting Edge - Soft Updates with Temporal Tables".
The procedure generally takes two steps:
1. Create the regular table; this is just the kind of work that Code First usually does.
2. Alter the table by adding SysStartTime and SysEndTime columns and turning on the SYSTEM_VERSIONING setting.
Everything looks good so far.
However, please note that the entity class doesn't have SysStartTime and SysEndTime properties, as they are added later. This gives me trouble, as I need to get the SysStartTime information from the table.
My question is: How can I get it with Entity Framework?
The link also says:
In EF 6, you can only leverage the SqlQuery method of the DbSet class:
using (var db = new EF6Context())
{
    var current = db.Bookings.Single(b => b.Id == 1);
    var time = DateTime.Now.AddMinutes(-5);
    var old = db.Bookings
        .SqlQuery("SELECT * FROM dbo.Bookings FOR SYSTEM_TIME AS OF {0} WHERE Id = 1", time)
        .SingleOrDefault();
}
Note that for EF 6, the column names returned in the query need to match the property names on the class. This is because SqlQuery doesn’t use mappings. If column and property names don’t match, then you’d need to alias the columns in the SELECT list, rather than just SELECT *.
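For instance, the aliasing might look like this (a hypothetical sketch; the Start property name is only illustrative):
// Hypothetical: if the entity's property is named Start rather than
// SysStartTime, alias the column so SqlQuery can map it by name.
var old = db.Bookings
    .SqlQuery("SELECT Id, SysStartTime AS Start FROM dbo.Bookings FOR SYSTEM_TIME AS OF {0} WHERE Id = 1", time)
    .SingleOrDefault();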
I have no clue how to do this, or whether it solves my problem. Does anyone have any experience with this?
One solution I can think of is to add an extra column, AppliedTime, to my table by adding an AppliedTime property to my entity class. It has (almost, but close enough for me) the same value as SysStartTime.
Another solution could be to use plain SQL to query the table directly.
Thanks.
I have been working on this one and found a solution. It is actually quite simple: just use the Database.SqlQuery method.
DateTime time = context.Database.SqlQuery<DateTime>(
    "SELECT SysStartTime FROM dbo.Calibration WHERE CalibrationID = 1")
    .SingleOrDefault();
See Also: Raw SQL Queries

Query with JOIN clause always returns an empty result set using Dapper

I am trying to do the following query using Dapper, but it always returns an empty result set. First I tried to remove the WHERE clause in order to isolate the problem, but that didn't work. After that I added an alias to the C.NAME column in the SELECT clause, but that didn't work either.
private const string SelectClaims =
    @"SELECT C.NAME FROM CLAIMS C
      INNER JOIN USERS_CLAIMS UC ON C.ID = UC.ID_CLAIM
      WHERE UC.ID_USER = @Id";

using (var conn = new FbConnection(connectionString))
{
    var claims = conn.Query<string>(SelectClaims, new { user.Id });
    return claims;
}
If I replace the query above with this one, everything works fine:
SELECT NAME FROM CLAIMS
To be honest, I am not sure you are using Dapper properly, since you are selecting a named column and mapping it to a plain string - I believe Dapper doesn't see a 'Name' property and fails silently. I would try either Query<T> with a strongly typed object, or Query<dynamic> to avoid unnecessary class creation.
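A minimal sketch of both suggestions (the Claim class here is hypothetical, introduced only for illustration):
// Option 1: map to a strongly typed object (hypothetical Claim class).
public class Claim
{
    public string Name { get; set; }
}

var typed = conn.Query<Claim>(
    "SELECT C.NAME AS Name FROM CLAIMS C " +
    "INNER JOIN USERS_CLAIMS UC ON C.ID = UC.ID_CLAIM " +
    "WHERE UC.ID_USER = @Id",
    new { user.Id });

// Option 2: dynamic rows, no class needed.
var names = conn.Query<dynamic>(SelectClaims, new { user.Id })
    .Select(row => (string)row.NAME);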
So, I put this aside and went to do something else, and when I came back to try to solve my problem, everything was working fine. I didn't change anything in my code, and surprisingly it's working right now.
I don't know if it's possible that a pending transaction in my MiTeC Interbase Query tool was preventing me from seeing the current records in the database. I tried to simulate this again, and now it always returns the records that I need (better that than nothing, hehe).
For clarification: it's perfectly fine to use a string as the return data type, to pass a simple anonymous object as the parameter to a Query method, and to use no alias for the returned column at all (as long as all columns match your C# property names, or you map a single column straight to a string like me).

How to boost Entity Framework Unit Of Work Performance

Please see the following situation:
I have a CSV file from which I import a couple of fields (not all) into SQL Server, using Entity Framework with the Unit of Work and Repository design patterns.
var newGenericArticle = new GenericArticle
{
    GlnCode = data[2],
    Description = data[5],
    VendorId = data[4],
    ItemNumber = data[1],
    ItemUOM = data[3],
    VendorName = data[12]
};

var unitOfWork = new UnitOfWork(new AppServerContext());
unitOfWork.GenericArticlesRepository.Insert(newGenericArticle);
unitOfWork.Commit();
Now, the only way to uniquely identify a record is by checking four fields: GlnCode, Description, VendorId and ItemNumber.
So, before I can insert a record, I need to check whether or not it exists:
var unitOfWork = new UnitOfWork(new AppServerContext());

// If the article already exists, update the vendor name.
if (unitOfWork.GenericArticlesRepository.GetAllByFilter(
        x => x.GlnCode.Equals(newGenericArticle.GlnCode) &&
             x.Description.Equals(newGenericArticle.Description) &&
             x.VendorId.Equals(newGenericArticle.VendorId) &&
             x.ItemNumber.Equals(newGenericArticle.ItemNumber)).Any())
{
    var foundArticle = unitOfWork.GenericArticlesRepository.GetByFilter(
        x => x.GlnCode.Equals(newGenericArticle.GlnCode) &&
             x.Description.Equals(newGenericArticle.Description) &&
             x.VendorId.Equals(newGenericArticle.VendorId) &&
             x.ItemNumber.Equals(newGenericArticle.ItemNumber));

    foundArticle.VendorName = newGenericArticle.VendorName;
    unitOfWork.GenericArticlesRepository.Update(foundArticle);
}
If it exists, I need to update it, as you can see in the code above.
Now, you need to know that I'm importing around 1,500,000 records, so quite a lot.
And it's the filter that causes the CPU to reach almost 100%.
The GetAllByFilter method is quite simple and does the following:
return !Entities.Any() ? null : !Entities.Where(predicate).Any() ? null : Entities.Where(predicate).AsQueryable();
where predicate is an Expression<Func<TEntity, bool>>.
Is there anything I can do to make sure the server's CPU doesn't reach 100%?
Note: I'm using SQL Server 2012
Kind regards
Wrong tool for the task. You should never process a million-plus records one at a time. Insert the records into a staging table using bulk insert, clean them (if need be), and then use a stored proc to do the processing in a set-based way - or use the tool designed for this, SSIS.
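A rough sketch of the bulk-insert step (the staging table name and the pre-filled DataTable are assumptions for illustration; requires System.Data and System.Data.SqlClient):
// Bulk-copy the parsed CSV rows into a staging table in one shot.
using (var bulkCopy = new SqlBulkCopy(connectionString))
{
    bulkCopy.DestinationTableName = "dbo.GenericArticles_Staging"; // illustrative name
    bulkCopy.BatchSize = 5000;
    bulkCopy.WriteToServer(stagingTable); // a DataTable filled from the CSV
}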
I've found another solution which wasn't proposed here, so I'll be answering my own question.
I will have a temp table into which I import all the data, and after the import I'll execute a stored procedure which runs a MERGE command to populate the destination table. I do believe that this is the most performant approach.
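Roughly, the MERGE could look like this (a sketch only; the staging table name is an assumption, and the four-field key comes from the question):
// In practice this would live in the stored procedure; shown here via
// EF 6's ExecuteSqlCommand for completeness.
context.Database.ExecuteSqlCommand(@"
    MERGE dbo.GenericArticles AS target
    USING dbo.GenericArticles_Staging AS source
        ON  target.GlnCode     = source.GlnCode
        AND target.Description = source.Description
        AND target.VendorId    = source.VendorId
        AND target.ItemNumber  = source.ItemNumber
    WHEN MATCHED THEN
        UPDATE SET target.VendorName = source.VendorName
    WHEN NOT MATCHED THEN
        INSERT (GlnCode, Description, VendorId, ItemNumber, ItemUOM, VendorName)
        VALUES (source.GlnCode, source.Description, source.VendorId,
                source.ItemNumber, source.ItemUOM, source.VendorName);");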
Have you indexed on those four fields in your database? That is the first thing that I would do.
Ok, I would recommend trying the following:
Improving bulk insert performance in Entity framework
To summarize,
Do not call SaveChanges() after every insert or update. Instead, call it once every 1,000-2,000 records, so that the inserts/updates are sent to the database in batches.
Also, optionally change the following parameters on your context:
yourContext.Configuration.AutoDetectChangesEnabled = false;
yourContext.Configuration.ValidateOnSaveEnabled = false;
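A minimal sketch of the batching idea (csvRows, MapToArticle and the GenericArticles DbSet are hypothetical names for illustration):
const int commitEvery = 1000; // batch size; tune by measuring
var context = new AppServerContext();
context.Configuration.AutoDetectChangesEnabled = false;
context.Configuration.ValidateOnSaveEnabled = false;

int count = 0;
foreach (var data in csvRows) // hypothetical CSV row source
{
    context.GenericArticles.Add(MapToArticle(data)); // hypothetical mapper
    if (++count % commitEvery == 0)
    {
        context.SaveChanges();
        // Recreate the context so the change tracker stays small.
        context.Dispose();
        context = new AppServerContext();
        context.Configuration.AutoDetectChangesEnabled = false;
        context.Configuration.ValidateOnSaveEnabled = false;
    }
}
context.SaveChanges();
context.Dispose();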

Cycle Through SQL Columns

I am working with a gridview in C# and I am wondering if there is an effective way to use one data source instead of three. Right now I have an if statement that selects the Data Source based off the value in a dropdownlist, ddlType.
if (ddlType.Text == "Confirmation")
    gvMailMergeExport.DataSourceID = "SqlDSConfirmation";
else if (ddlType.Text == "Cancellation")
    gvMailMergeExport.DataSourceID = "SqlDSCancellation";
else
    gvMailMergeExport.DataSourceID = "SqlDSPreArrival";
This is because each data source needs to look at a different column in the same table to decide which data to show.
The three columns being used are ConfirmationEmail, CancellationEmail, and PreArrivalEmail.
Each of these three columns is a bit value, and I only display the rows where the relevant column has a '0' for its value.
So, my question: is there anything like @ColumnName = 0 that would work for this?
Thank you.
I've never used SqlDataSources, but if you want to go the more custom route, you could build out your query (or use LINQ to SQL or Entity Framework) with a custom where clause based on the user's selection.
Which technology are you more familiar with? LINQ would be better, but I can give an answer in ADO.NET as well (SqlConnection, SqlCommand, etc.).
The LINQ would be relatively simple, after setting up your LINQ to Entities (EDMX) or LINQ to SQL (DBML) model (I'd go with the EDMX, because L2S is no longer in forward maintenance by MSFT). With an existing DB it's very easy: drag and drop.
The code would look like this:
using (var context = new LinqDbContext())
{
    // AsQueryable lets the variable be reassigned with added Where clauses.
    var results = context.Tablename.AsQueryable();
    if (limitByConfEmail)
    {
        results = results.Where(data => data.ConfirmationEmail == true);
    }
    // do your elses here
    gridview.DataSource = results;
    gridview.DataBind();
}

Update in LINQ with one trip to database?

I'm having a hard time wrapping my head around when LINQ accesses the database, and how it knows when the optimal time would be. Mainly, this UPDATE seems to make two trips to the database, but it shouldn't - there seems to be no "update" equivalent in LINQ short of querying for a row, updating it, and then saving that change. In plain SQL, I should be able to UPDATE ... WHERE in one fell swoop.
var query = from u in db.users where u.id == 1 select u;
foreach (user u in query)
{
    u.name = "JOE";
}
db.Save();
I get the magic of Save. It allows me to make multiple changes before committing them. But there still seems to be the initial query for the rows.
Thoughts? Ideas?
EDIT
Now that I've tested it, I can confirm that it does in fact go to the database twice. There must be a better way!
If id is the primary key, you can use:
var u = new user { id = 1 };
db.users.Attach(u);
u.name = "JOE";
db.SubmitChanges();
Take a look at one of the custom LINQ extensions, like this one: http://www.aneyfamily.com/terryandann/post/2008/04/Batch-Updates-and-Deletes-with-LINQ-to-SQL.aspx. It gives you DeleteBatch and UpdateBatch methods; in particular, UpdateBatch takes the condition and the updating expression and executes them as one SQL statement.
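Based on the post's description, a call might be shaped roughly like this (the exact UpdateBatch signature is an assumption, not verified against the linked code):
// Hypothetical shape: one UPDATE ... WHERE statement, no rows loaded.
db.users.UpdateBatch(
    db.users.Where(u => u.id == 1),   // the condition
    u => new user { name = "JOE" });  // the updating expression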
