It is second nature for me to whip up some elaborate SQL set processing code to solve various domain model questions. However, the trend is not to touch SQL anymore. Is there some pattern reference or conversion tool out there that helps convert the various SQL patterns to Linq syntax?
I would like to look up ways to code things like the following (this example has a subquery):
SELECT * FROM orders X WHERE
(SELECT COUNT(*) FROM orders Y
WHERE Y.totalOrder > X.totalOrder) < 6
(Grab the top five highest total orders with side effects)
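For reference, a fairly literal LINQ rendering of that pattern might look like the sketch below (untested; it assumes a data context with an Orders table exposing a TotalOrder column, and the simpler OrderByDescending/Take form ignores ties):
var topOrders = db.Orders
    .Where(x => db.Orders.Count(y => y.TotalOrder > x.TotalOrder) < 6);

// or, when ties don't matter:
var topFive = db.Orders
    .OrderByDescending(o => o.TotalOrder)
    .Take(5);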
Alternatively, how do you know whether LINQ executes as a single statement without using a debugger? I know you need to follow the enumeration, but I would assume you could just look the patterns up somewhere.
This is from the MSDN site, which gives their example of doing a SQL difference. I may be wrong, but I wouldn't think this uses set processing on the server (I suspect it pulls both sets locally and then takes the difference, which would be very inefficient). Then again, this could be one of the patterns on that reference.
SQL difference example:
var differenceQuery =
(from cust in db.Customers
select cust.Country)
.Except
(from emp in db.Employees
select emp.Country);
Thanks
-- Update:
-- Microsoft's 101 LINQ Samples in C# comes closer to a pattern reference for constructing LINQ that produces the SQL you want. I will post more as I find them. I am really looking for a methodology (patterns or a conversion tool) to convert SQL to LINQ.
-- Update (sql from Microsoft's difference pattern in Linq):
SELECT DISTINCT [t0].[field] AS [Field_Name]
FROM [left_table] AS [t0]
WHERE NOT (EXISTS(
SELECT NULL AS [EMPTY]
FROM [right_table] AS [t1]
WHERE [t0].[field] = [t1].[field]
))
That's what we wanted, even if not what I expected. So, that's one pattern to memorize.
If you have hand-written SQL, you can use ExecuteQuery, specifying the type of "row" class as a generic type argument:
var myList = DataContext.ExecuteQuery<MyRow>(
"select * from myview");
The "row" class exposes the columns as public properties. For example:
public class MyRow {
public int Id { get; set; }
public string Name { get; set; }
// ...
}
You can decorate the columns with more information:
public class MyRow {
// ...
[Column(Storage="NameColumn", DbType="VarChar(50)")]
public string Name { get; set; }
// ...
}
In my experience LINQ to SQL doesn't generate very good SQL code, and the code it does generate breaks down for large databases. What LINQ to SQL does very well is expose stored procedures to your client. For example:
var result = DataContext.MyProcedure(a,b,c);
This allows you to store SQL in the database, while having the benefits of an easy to use, automatically generated .NET wrapper.
To see the exact SQL that's being used, you can use the SQL Server Profiler tool:
http://msdn.microsoft.com/en-us/library/ms187929.aspx
The Linq-to-Sql Debug Visualizer:
http://weblogs.asp.net/scottgu/archive/2007/07/31/linq-to-sql-debug-visualizer.aspx
Or you can write custom code to log the queries:
http://goneale.wordpress.com/2008/12/31/log-linq-2-sql-query-execution-to-consoledebug-window/
This is why LINQPad was created in the first place. :) It lets you easily see the SQL that is generated and what the results of the query would be, etc. Best of all, it's free. Maybe not the answer to your question, but I am sure it could help you.
If you know exactly the sql you want, then you should use ExecuteQuery.
I can imagine a few ways to translate the query you've shown. If you're concerned that "Except" might not be translated:
Test it. If it works the way you want then great, otherwise:
Rewrite it with items you know will translate, for example:
db.Customers.Where(c => !db.Employees.Any(e => c.Country == e.Country) );
If you are concerned about the TSQL generated, then I would suggest formalising the queries into stored procedures or UDFs and accessing them via the data context. The UDF approach has slightly better metadata and composability (compared to a stored procedure) - for example, you can add additional Where/Skip/Take etc. to a UDF query and have it run at the database (but last time I checked, only LINQ-to-SQL, not Entity Framework, supported UDF usage).
You can also use ExecuteQuery, but there are advantages of letting the database own the fixed queries.
Re finding what TSQL was executed: with LINQ-to-SQL you can assign any TextWriter (for example, Console.Out) to DataContext.Log.
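For example (a minimal sketch; db is your DataContext instance):
db.Log = Console.Out;  // the generated TSQL is written to the console as each query executes
var ukCustomers = db.Customers.Where(c => c.Country == "UK").ToList();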
I believe the best way is to use stored procedures. In this case you have full control over the SQL.
I am new to Dapper, though I am aware of ORMs and DALs and have implemented a DAL with NHibernate before.
Example query:
string sql = "SELECT * FROM MyTable";
public int GetCount()
{
var result = Connection.Query<MyTablePoco>(sql).Count();
return result;
}
Will Dapper convert this query (internally) to SELECT COUNT(*) FROM MyTable looking at .Count() at the end?
Similarly, will it convert to SELECT TOP 1 * FROM MyTable in case of SingleOrDefault()?
I come from the NHibernate world, where it generates queries accordingly. I am not sure about Dapper, though. As I am working with MS Access, I do not see a way to check the generated query.
No, Dapper will not adjust your query. The immediate way to tell is simply: does the method return IEnumerable... or IQueryable...? If it is the first, then it can only use local in-memory mechanisms.
Specifically, by default, Query will actually return a fully populated List<>. LINQ's Count() method recognises that and just accesses the .Count property of the list. So all the data is fetched from the database.
If you want to ask the database for the count, ask the database for the count.
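For example (a sketch using Dapper's ExecuteScalar, against the same hypothetical MyTable as above):
public int GetCount()
{
    // let the database do the counting instead of materializing every row
    return Connection.ExecuteScalar<int>("SELECT COUNT(*) FROM MyTable");
}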
As for mechanisms to view what is actually sent to the database: we use mini-profiler for this. It works great.
Note: when you are querying exactly one row, QueryFirstOrDefault (and the other variants you would expect) exist and have internal optimizations (including hints to ADO.NET, although not all providers can act on them) to do things as efficiently as possible, but they do not adjust your query. In some cases the provider itself (not Dapper) can help, but ultimately: if you only want the first row, ask the database for the first row (TOP or similar).
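For example (a sketch; SELECT TOP works in both SQL Server and Access):
// fetch only the first row instead of the whole table
var row = Connection.QueryFirstOrDefault<MyTablePoco>("SELECT TOP 1 * FROM MyTable");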
I have often found that if I have too many joins in a Linq query (whether using Entity Framework or NHibernate) and/or the shape of the resulting anonymous class is too complex, Linq takes a very long time to materialize the result set into objects.
This is a generic question, but here's a specific example using NHibernate:
var libraryBookIdsWithShelfAndBookTagQuery = (from shelf in session.Query<Shelf>()
join sbttref in session.Query<ShelfBookTagTypeCrossReference>() on
shelf.ShelfId equals sbttref.ShelfId
join bookTag in session.Query<BookTag>() on
sbttref.BookTagTypeId equals (byte)bookTag.BookTagType
join btbref in session.Query<BookTagBookCrossReference>() on
bookTag.BookTagId equals btbref.BookTagId
join book in session.Query<Book>() on
btbref.BookId equals book.BookId
join libraryBook in session.Query<LibraryBook>() on
book.BookId equals libraryBook.BookId
join library in session.Query<LibraryCredential>() on
libraryBook.LibraryCredentialId equals library.LibraryCredentialId
join lcsg in session
.Query<LibraryCredentialSalesforceGroupCrossReference>()
on library.LibraryCredentialId equals lcsg.LibraryCredentialId
join userGroup in session.Query<UserGroup>() on
lcsg.UserGroupOrganizationId equals userGroup.UserGroupOrganizationId
where
shelf.ShelfId == shelfId &&
userGroup.UserGroupId == userGroupId &&
!book.IsDeleted &&
book.IsDrm != null &&
book.BookFormatTypeId != null
select new
{
Book = book,
LibraryBook = libraryBook,
BookTag = bookTag
});
// add a couple of where clauses, then...
var result = libraryBookIdsWithShelfAndBookTagQuery.ToList();
I know it's not the query execution, because I put a sniffer on the database and I can see that the query is taking 0ms, yet the code is taking about a second to execute that query and bring back all of 11 records.
So yeah, this is an overly complex query, having 8 joins between 9 tables, and I could probably restructure it into several smaller queries. Or I could turn it into a stored procedure - but would that help?
What I'm trying to understand is, where is that red line crossed between a query that is performant and one that starts to struggle with materialization? What's going on under the hood? And would it help if this were a SP whose flat results I subsequently manipulate in memory into the right shape?
EDIT: in response to a request in the comments, here's the SQL emitted:
SELECT DISTINCT book4_.bookid AS BookId12_0_,
libraryboo5_.librarybookid AS LibraryB1_35_1_,
booktag2_.booktagid AS BookTagId15_2_,
book4_.title AS Title12_0_,
book4_.isbn AS ISBN12_0_,
book4_.publicationdate AS Publicat4_12_0_,
book4_.classificationtypeid AS Classifi5_12_0_,
book4_.synopsis AS Synopsis12_0_,
book4_.thumbnailurl AS Thumbnai7_12_0_,
book4_.retinathumbnailurl AS RetinaTh8_12_0_,
book4_.totalpages AS TotalPages12_0_,
book4_.lastpage AS LastPage12_0_,
book4_.lastpagelocation AS LastPag11_12_0_,
book4_.lexilerating AS LexileR12_12_0_,
book4_.lastpageposition AS LastPag13_12_0_,
book4_.hidden AS Hidden12_0_,
book4_.teacherhidden AS Teacher15_12_0_,
book4_.modifieddatetime AS Modifie16_12_0_,
book4_.isdeleted AS IsDeleted12_0_,
book4_.importedwithlexile AS Importe18_12_0_,
book4_.bookformattypeid AS BookFor19_12_0_,
book4_.isdrm AS IsDrm12_0_,
book4_.lightsailready AS LightSa21_12_0_,
libraryboo5_.bookid AS BookId35_1_,
libraryboo5_.libraryid AS LibraryId35_1_,
libraryboo5_.externalid AS ExternalId35_1_,
libraryboo5_.totalcopies AS TotalCop5_35_1_,
libraryboo5_.availablecopies AS Availabl6_35_1_,
libraryboo5_.statuschangedate AS StatusCh7_35_1_,
booktag2_.booktagtypeid AS BookTagT2_15_2_,
booktag2_.booktagvalue AS BookTagV3_15_2_
FROM shelf shelf0_,
shelfbooktagtypecrossreference shelfbookt1_,
booktag booktag2_,
booktagbookcrossreference booktagboo3_,
book book4_,
librarybook libraryboo5_,
library librarycre6_,
librarycredentialsalesforcegroupcrossreference librarycre7_,
usergroup usergroup8_
WHERE shelfbookt1_.shelfid = shelf0_.shelfid
AND booktag2_.booktagtypeid = shelfbookt1_.booktagtypeid
AND booktagboo3_.booktagid = booktag2_.booktagid
AND book4_.bookid = booktagboo3_.bookid
AND libraryboo5_.bookid = book4_.bookid
AND librarycre6_.libraryid = libraryboo5_.libraryid
AND librarycre7_.librarycredentialid = librarycre6_.libraryid
AND usergroup8_.usergrouporganizationid =
librarycre7_.usergrouporganizationid
AND shelf0_.shelfid = @p0
AND usergroup8_.usergroupid = @p1
AND NOT ( book4_.isdeleted = 1 )
AND ( book4_.isdrm IS NOT NULL )
AND ( book4_.bookformattypeid IS NOT NULL )
AND book4_.lightsailready = 1
EDIT 2: Here's the performance analysis from ANTS Performance Profiler:
It is often good database practice to place lots of joins, or super common joins, into views. ORMs don't let you ignore these facts, nor do they replace the decades spent fine-tuning databases to do these kinds of things efficiently. Refactor those joins into a single view, or a couple of views if that makes more sense in the greater perspective of your application.
NHibernate should be optimizing the query down and reducing the data so that .NET only has to deal with the important parts. However, if those domain objects are just naturally large, that's still a lot of data. Also, if it's a really large result set in terms of rows returned, that's a lot of objects getting instantiated even if the DB is able to return the set quickly. Refactoring this query into a view that returns only the data you actually need would also reduce object instantiation overhead.
Another thought would be to not do a .ToList(). Return the enumerable and let your code lazily consume the data.
According to the profiling information, CreateQuery takes 45% of the total execution time. However, as you mentioned, the query took 0ms when you executed it directly. This alone is not enough to say there is a performance problem, because:
You are running the code with the profiler attached, which has a significant impact on execution time.
When you use a profiler, it affects all the code being profiled but not the SQL execution time (because that happens in the SQL server), so everything else appears slower compared to the SQL statement.
So the ideal scenario is to measure how long the entire code block takes to execute, measure the time for the SQL query, and compare; if you do that you will probably end up with different values.
However, I'm not saying the NHibernate LINQ provider is optimized for any query you come up with, but there are other ways in NHibernate to deal with these situations, such as the QueryOver API, Criteria queries, HQL and, finally, raw SQL.
Where is that red line crossed between a query that is performant and one that starts to struggle with materialization? What's going on under the hood?
This one is a pretty hard question, and without detailed knowledge of the NHibernate LINQ provider it's hard to give an accurate answer. You can always try the different mechanisms provided and see which one is best for a given scenario.
And would it help if this were a SP whose flat results I subsequently manipulate in memory into the right shape?
Yes, using an SP would help things run fast, but using SPs adds maintenance overhead to your code base.
You have a generic question, so I'll give you a generic answer :)
If you query data for reading (not for updating), try to use anonymous classes. The reason is that they are lighter to create and have no navigation properties, and you select only the data you need! That's a very important rule. So, try to replace your select with something like this:
select new
{
Book = new { book.Id, book.Name},
LibraryBook = new { libraryBook.Id, libraryBook.AnotherProperty},
BookTag = new { bookTag.Name}
}
Stored procedures are good when a query is complex and the LINQ provider generates ineffective code; then you can replace it with plain SQL or a stored procedure. That's not often the case and, I think, it's not your situation.
Run your SQL query. How many rows does it return? Is it the same number as your result? Sometimes the LINQ provider generates code that selects many more rows in order to select one entity. It happens when an entity has a one-to-many relationship with another selected entity. For example:
class Book
{
    public int Id { get; set; }
    public string Name { get; set; }
    public ICollection<Tag> Tags { get; set; }
}
class Tag
{
    public string Name { get; set; }
    public Book Book { get; set; }
}
...
dbContext.Books.Where(o => o.Id == 1).Select(o=>new {Book = o, Tags = o.Tags}).Single();
I select only one book with Id = 1, but the provider will generate code that returns a number of rows equal to the number of Tags (Entity Framework does this).
Split a complex query into a set of simple ones and join on the client side. Sometimes you have a complex query with many conditions and the resulting SQL becomes terrible. So you split your big query into simpler ones, get the results of each, and join/filter on the client side, as sketched below.
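A rough illustration of that idea, reusing the Book/Tag classes above (an untested sketch; the Tags set and the filter are made up): two simple queries hit the database, and the small results are combined in memory with LINQ to Objects.
var books = dbContext.Books
    .Where(b => b.Name.Contains("LINQ"))
    .Select(b => new { b.Id, b.Name })
    .ToList();

var bookIds = books.Select(b => b.Id).ToList();

var tags = dbContext.Tags
    .Where(t => bookIds.Contains(t.Book.Id))
    .Select(t => new { BookId = t.Book.Id, t.Name })
    .ToList();

// client-side join of the two small result sets
var booksWithTags = from b in books
                    join t in tags on b.Id equals t.BookId into bookTags
                    select new { b.Name, Tags = bookTags.Select(x => x.Name).ToList() };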
In the end, I advise you to use an anonymous class as the result of your select.
Don’t use Linq’s Join. Navigate!
In that post you can see:
As long as there are proper foreign key constraints in the database, the navigation properties will be created automatically. It is also possible to manually add them in the ORM designer. As with all LINQ to SQL usage I think that it is best to focus on getting the database right and have the code exactly reflect the database structure. With the relations properly specified as foreign keys the code can safely make assumptions about referential integrity between the tables.
I agree 100% with the sentiments expressed by everyone else (with regard to there being two parts to the optimisation here, and the SQL execution being a big unknown, and a likely cause of poor performance).
Another part of the solution that might help you get some speed is to pre-compile your LINQ statements. I remember this being a huge optimisation on a tiny (high-traffic) project I worked on ages and ages ago... it seems like it would address the client-side slowness you're seeing. Having said all that, I've not found a need to use them since... so heed everyone else's warnings first! :)
https://msdn.microsoft.com/en-us/library/vstudio/bb896297(v=vs.100).aspx
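As a sketch of what that looks like with LINQ to SQL's CompiledQuery (Entity Framework's ObjectContext has an equivalent); MyDataContext, Book and ShelfId are made-up names and this is untested:
// the expression tree is translated to SQL once and reused on every call
static readonly Func<MyDataContext, int, IQueryable<Book>> BooksForShelf =
    CompiledQuery.Compile((MyDataContext db, int shelfId) =>
        db.Books.Where(b => b.ShelfId == shelfId));

// usage:
// var books = BooksForShelf(dataContext, 42).ToList();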
I need to do a query on my database that might be something like this where there could realistically be 100 or more search terms.
public IQueryable<Address> GetAddressesWithTown(string[] towns)
{
IQueryable<Address> addressQuery = DbContext.Addresses;
addressQuery = addressQuery.Where(x => towns.Any(y => x.Town == y));
return addressQuery;
}
However, when it contains more than about 15 terms it throws an exception on execution because the generated SQL is too long.
Can this kind of query be done through Entity Framework?
What other options are available to complete a query like this?
Sorry, are we talking about THIS EXACT SQL?
In that case it is a very simple "open your eyes" thing.
There is a way (Contains) to map that array into an IN clause, which results in ONE SQL condition (town IN ('', '', '')).
Let me see whether I get this right:
addressQuery = addressQuery.Where(x => towns.Any(y => x.Town == y));
should be
addressQuery = addressQuery.Where(x => towns.Contains(x.Town));
The resulting SQL will be a LOT smaller. 100 items is still taxing it - I would dare say you may have a DB or app design issue here that requires business-side analysis; I have not met this requirement in the 20 years I have worked with databases.
This looks like a scenario where you'd want to use the PredicateBuilder, as this will help you create an Or-based predicate and construct your dynamic lambda expression.
This is part of a library called LinqKit by Joseph Albahari who created LinqPad.
public IQueryable<Address> GetAddressesWithTown(string[] towns)
{
var predicate = PredicateBuilder.False<Address>();
foreach (string town in towns)
{
string temp = town;
predicate = predicate.Or (p => p.Town.Equals(temp));
}
return DbContext.Addresses.Where (predicate);
}
You've broadly got two options:
You can replace .Any with a .Contains alternative.
You can use plain SQL with table-valued-parameters.
Using .Contains is easier to implement and will help performance because it translates to an inline SQL IN clause, so 100 towns shouldn't be a problem. However, it also means that the exact SQL depends on the exact number of towns: you're forcing SQL Server to recompile the query for each number of towns. These recompilations can be expensive when the query is complex, and they can evict other query plans from the cache as well.
Using table-valued-parameters is the more general solution, but it's more work to implement, particularly because it means you'll need to write the SQL query yourself and cannot rely on the entity framework. (Using ObjectContext.Translate you can still unpack the query results into strongly-typed objects, despite writing sql). Unfortunately, you cannot use the entity framework yet to pass a lot of data to sql server efficiently. The entity framework doesn't support table-valued-parameters, nor temporary tables (it's a commonly requested feature, however).
A bit of TVP SQL would look like this: select ... from ... join @townTableArg townArg on townArg.town = address.town or select ... from ... where address.town in (select town from @townTableArg).
You can probably work around the EF restriction, but it's not going to be fast and will probably be tricky. A workaround would be to insert your values into some intermediate table and then join with that - that's still 100 inserts, but they are separate statements. If a future version of EF supports batch CUD statements, this might actually work reasonably.
Almost equivalent to table-valued parameters would be to bulk-insert into a temporary table and join with that in your query. Mostly that just means your table name will start with '#' rather than '@' :-). The temp table has a little more overhead, but you can put indexes on it, and in some cases that means the subsequent query will be much faster (for really huge data quantities).
Unfortunately, using either temporary tables or bulk insert from C# is a hassle. The simplest solution here is to make a DataTable; this can be passed to either. However, DataTables are relatively slow; the overhead might be relevant once you start adding millions of rows. The fastest (general) solution is to implement a custom IDataReader; almost as fast is an IEnumerable<SqlDataRecord>.
By the way, to use a table-valued-parameter, the shape ("type") of the table parameter needs to be declared on the server; if you use a temporary table you'll need to create it too.
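For illustration, a rough ADO.NET sketch of the TVP route (untested; it assumes a table type dbo.TownTableType with a single Town column already exists on the server, and the Addresses table/column names are made up):
var townTable = new DataTable();
townTable.Columns.Add("Town", typeof(string));
foreach (var town in towns)
    townTable.Rows.Add(town);

using (var cmd = new SqlCommand(
    "SELECT a.* FROM dbo.Addresses a JOIN @townTableArg t ON t.Town = a.Town",
    connection))
{
    var p = cmd.Parameters.AddWithValue("@townTableArg", townTable);
    p.SqlDbType = SqlDbType.Structured;
    p.TypeName = "dbo.TownTableType";
    // execute the command and materialize the rows, e.g. with a SqlDataReader
    // or via ObjectContext.Translate<Address>(...) as mentioned above
}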
Some pointers to get you started:
http://lennilobel.wordpress.com/2009/07/29/sql-server-2008-table-valued-parameters-and-c-custom-iterators-a-match-made-in-heaven/
SqlBulkCopy from a List<>
I'm from the old school, where the DB had all data access encapsulated in views, procedures, etc. Now I'm forcing myself to use LINQ for most of the obvious queries.
What I'm wondering, though, is where to stop and what's practical. Today I needed to run a query like this:
SELECT D.DeviceKey, D.DeviceId, DR.DriverId, TR.TruckId, LP.Description
FROM dbo.MBLDevice D
LEFT OUTER JOIN dbo.DSPDriver DR ON D.DeviceKey = DR.DeviceKey
LEFT OUTER JOIN dbo.DSPTruck TR ON D.DeviceKey = TR.DeviceKey
LEFT OUTER JOIN
(
SELECT LastPositions.DeviceKey, P.Description, P.Latitude, P.Longitude, P.Speed, P.DeviceTime
FROM dbo.MBLPosition P
INNER JOIN
(
SELECT D.DeviceKey, MAX(P.PositionKey) LastPositionKey
FROM dbo.MBLPosition P
INNER JOIN dbo.MBLDevice D ON P.DeviceKey = D.DeviceKey
GROUP BY D.DeviceKey
) LastPositions ON P.PositionKey = LastPositions.LastPositionKey
) LP ON D.DeviceKey = LP.DeviceKey
WHERE D.IsActive = 1
Personally, I'm not able to write the corresponding LINQ. So, I found a tool online and got back a LINQ query two pages long. It works properly - I can see it in the profiler - but it's not maintainable, IMO. Another problem is that I'm doing a projection and getting an anonymous object back. Alternatively, I can manually create a class and project into that custom class.
At this point I wonder if it is better to create a view on SQL Server and add it to my model? It will break my "all SQL on the client side" mantra, but it will be easier to read and maintain. No?
I wonder where you stop with T-SQL vs. LINQ?
EDIT
Model description.
I have DSPTrucks, DSPDrivers and MBLDevices.
A device can be attached to a truck, to a driver, or to both.
I also have MBLPositions, which are basically pings from a device (timestamp and GPS position).
What this query does: in one shot it returns all device-truck-driver information, so I know what each device is attached to, and it also gets me the last GPS position for those devices. The response may look like this:
There is some redundant stuff but it's OK. I need to get it in one query.
In general, I would also default to LINQ for most simple queries.
However, when you get to a point where the corresponding LINQ query becomes harder to write and maintain, then what's the point, really? So I would simply leave that query in place. It works, after all. To make it easier to use, it's pretty straightforward to map a view or (cough) stored procedure in your EF model. Nothing wrong with that, really (IMO).
Firstly, you can store LINQ queries in variables, which may help make them not only more readable but also reusable.
An example might look like the following:
var redCars = from c in cars
where c.Colour == "red"
select c;
var redSportsCars = from c in redCars
where c.Type == "Sports"
select c;
Queries are executed lazily and are not composed into SQL until you enumerate them, so you'll notice in the profiler that this does produce an efficient query.
You will also benefit from defining relationships in the model and using navigation properties, rather than using the LINQ join syntax. This (again) makes those relationships reusable between queries, and more readable (because you don't specify the relationships in the query, like the SQL above does).
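For example, a sketch against the device/truck/driver model in the question, assuming the foreign keys are in place so the ORM has generated Driver, Truck and Positions navigation properties on the Device entity (names are illustrative and this is untested):
var devices = from d in context.Devices
              where d.IsActive
              select new
              {
                  d.DeviceKey,
                  d.DeviceId,
                  DriverId = (int?)d.Driver.DriverId,
                  TruckId = (int?)d.Truck.TruckId,
                  LastPosition = d.Positions
                                  .OrderByDescending(p => p.PositionKey)
                                  .FirstOrDefault()
              };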
Generally speaking your LINQ query will be shorter than the equivalent SQL, but I'd suggest trying to work it out by hand rather than using a conversion tool.
With the exception of CTEs (which I'm fairly sure you can't do in LINQ), I would write all queries in LINQ these days.
I find that when using LINQ it's best to ignore whatever SQL it generates, as long as it's retrieving the right thing and is performant; only when one of those doesn't hold do I actually look at what it's generating.
In terms of maintainability, you shouldn't really worry about the SQL being maintainable, but rather about the LINQ query that is generating the SQL.
In the end, if the SQL is not quite right, I believe there are various things you can do to make LINQ generate SQL more along the lines you want... to some extent.
AFAIK there isn't any inherent problem with getting anonymous objects back; however, if you are doing it in multiple places you may want to create a class to keep things neater.
I am new to LINQ and really need help with some coding.
At the moment, I have a string variable and a var variable.
string temp = "from product in myEntities.Products where product.Name.Contains(_Name) select product";
var _Products = temp;
LvProducts.DataSource = _Products;
LvProducts.DataBind();
Basically, what I want to do is to be able to create a custom/complicated LINQ query by composing it into a string beforehand. Once I'm done composing, I assign the string to the var variable. However, this obviously will not work. Can anyone assist me with this?
You have a few options:
Use the Dynamic Linq libraries to construct your queries on the fly. The best place to get started is by reading ScottGu's blog entry. However, I don't think these libraries support the Contains method in your example. Here is a blog post explaining how to add this support.
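For the Dynamic Linq route, the basic string-based syntax looks roughly like this (an untested sketch with made-up Price/CategoryID properties; as noted, the Contains call from your example needs the extra support described in that blog post):
var _Products = myEntities.Products
    .Where("Price > @0 && CategoryID == @1", 100, 5);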
Directly execute SQL statements. Check out the MSDN docs for Linq to Sql or Linq to Entities.
var _Products = myEntities.ExecuteStoreQuery<Product>
(#"SELECT * FROM Products WHERE [Name] In ('Item1', 'Item2')");
Use LINQ's composable behaviour. This might not be the most elegant solution, but it works really well if you do not have too many optional conditions. You can just construct your query in multiple parts.
var _Products = from product in myEntities.Products
                select product;

_Products = from product in _Products
            where product.Name.Contains(_Name)
            select product;

if (FilterByPrice)
{
    _Products = from product in _Products
                where product.Price > 100
                select product;
}
You can do this by compiling the LINQ within some C# using the CodeDomProvider (see "Adding scripting functionality to .NET applications"), but this is quite a heavyweight solution. If you want to see more about how to do this, take a look at LINQPad - http://www.linqpad.net - the author invites you to use a decompiler to see how it works!
If the requirement just comes down to simple where clauses, then an alternative might be to use Dynamic Linq - see Scott Gu's posts and the sample code from Microsoft - http://weblogs.asp.net/scottgu/archive/2008/01/07/dynamic-linq-part-1-using-the-linq-dynamic-query-library.aspx
Maybe this can help you
http://nlinq.codeplex.com/
BR.
Much of the reason you use LINQ in the first place is to get compiler-verified queries that will detect errors at compile time. This will defeat that purpose, since the string will be parsed at runtime.
For your needs you have two options:
1) Make an eSQL query and run it on the ObjectContext. Using this, you can still use your entities, like myEntities.Products, and get back a list of products (a rough sketch follows below).
2) Use a normal SQL query and have the ObjectContext run it directly against the underlying database.
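For option 1, a rough Entity SQL sketch (untested; it assumes the entity container is named MyEntities and that Product has a Name property, as in your snippet):
var query = myEntities.CreateQuery<Product>(
    "SELECT VALUE p FROM MyEntities.Products AS p WHERE p.Name LIKE @name",
    new ObjectParameter("name", "%" + _Name + "%"));

LvProducts.DataSource = query.ToList();
LvProducts.DataBind();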
My guess is that you will have to use dynamic code execution. For further details, have a look at Rick's post on West Wind.
You're thinking of it like dynamic SQL, where you create the statement as a string and execute it as a SQL statement.
Since you're already in code, why not build the statements right there? It would be a lot easier if you used lambda (method) syntax instead of the traditional query syntax. My 2 cents.