I've got a LINQ to SQL query here (the generated SQL is shown below) that runs on my local SQL Server 2008 in about 00:00:00 - 00:00:01, but on the remote server it takes around 00:02:10. There are about 56k rows in dbo.Movies and dbo.BoxArts, and 300k in dbo.OmdbEntries.
SELECT
    -- pull the distinct t_Meter out of the created object
    Distinct2.t_Meter AS t_Meter
    -- match all movie data on the same movie_ID
FROM ( SELECT DISTINCT
           Extent2.t_Meter AS t_Meter
       FROM dbo.Movies AS Extent1
       INNER JOIN dbo.OmdbEntries AS Extent2 ON Extent1.movie_ID = Extent2.movie_ID
       INNER JOIN dbo.BoxArts AS Extent3 ON Extent1.movie_ID = Extent3.movie_ID
       -- pull the genres matched on movie_IDs
       INNER JOIN (SELECT DISTINCT
                       Extent4.movie_ID AS movie_ID
                   FROM dbo.MovieToGenres AS Extent4
                   -- all genres matched on movie_IDs
                   INNER JOIN dbo.Genres AS Extent5 ON Extent4.genre_ID = Extent5.genre_ID ) AS Distinct1
               ON Distinct1.movie_ID = Extent1.movie_ID
       WHERE 1 = 1
       -- sort the t_Meters ascending
     ) AS Distinct2
ORDER BY Distinct2.t_Meter ASC
The inner query first joins all the related items in the tables and creates a new object; from that object it takes only the t_Meters that aren't null. From those t_Meters it then selects the distinct values and sorts them, returning a list of roughly 98 ints.
I don't know enough about SQL databases yet to intuitively know whether that's an extreme amount of work to put into a single query, but since it only takes a second or less on my local server, I thought it was alright.
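For reference, a rough sketch of the LINQ shape that would produce SQL like the above might look something like the following; the entity and property names are guessed from the generated SQL, and the real code is in the pastebin link in the edit below.

// Hypothetical reconstruction based only on the generated SQL above, not the actual code.
var tMeters = (from m in db.Movies
               join o in db.OmdbEntries on m.movie_ID equals o.movie_ID
               join b in db.BoxArts on m.movie_ID equals b.movie_ID
               join mg in db.MovieToGenres on m.movie_ID equals mg.movie_ID
               join g in db.Genres on mg.genre_ID equals g.genre_ID
               where o.t_Meter != null          // keep only non-null t_Meters
               select o.t_Meter)
              .Distinct()                        // SQL DISTINCT in the inner query
              .OrderBy(t => t)                   // ORDER BY on the outer query
              .ToArray();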
edit: Here's the LINQ code, which I haven't really cleaned up at all: http://pastebin.com/JUkdjHDJ It's messy, but it gets the job done... The fix I found was that calling ToArray after OrderBy, but before Distinct, helped out immensely. So instead of
var results = IQueryableWithDBDatasTMeter.Distinct().OrderBy(t => t).ToArray();
I did
var orderedResults = IQueryableWithDBDatasTMeter.OrderBy(t => t).ToArray();
var distinctOrderedResults = orderedResults.Distinct().ToArray();
I guess it works because the Distinct only runs against the array in memory, rather than against the entire DB's worth of entries? I'm not really sure though, since the original LINQ works flawlessly on my local server.
I'm sure that had I linked the LINQ code (and cleaned it up) rather than the auto-generated SQL query, you would have been able to solve this easily; sorry about that.
I am using the Telerik OpenAccess/Data Access ORM against an Oracle database.
Why do these two statements result in different SQL commands?
Statement #1
IQueryable<WITransmits> query = from wiTransmits in uow.DbContext.StatusMessages
                                select wiTransmits;

query = query.Where(e => e.MessageID == id);
Results in the following SQL
SELECT
a."MESSAGE_ID" COL1,
-- additional fields
FROM "XFE_REP"."WI_TRANSMITS" a
WHERE
a."MESSAGE_ID" = :p0
Statement #2
IQueryable<WITransmits> query = from wiTransmits in uow.DbContext.StatusMessages
                                select new WITransmits
                                {
                                    MessageID = wiTransmits.MessageID,
                                    Name = wiTransmits.Name
                                };

query = query.Where(e => e.MessageID == id);
Results in the following SQL
SELECT
a."MESSAGE_ID" COL1,
-- additional fields
FROM "XFE_REP"."WI_TRANSMITS" a
The query generated by the second statement (#2) obviously returns EVERY record in the table when I only want the one. Millions of records make this prohibitive.
Telerik Data Access will try to split each query into a database-side part and a client-side part (in-memory LINQ, if you prefer).
Having a projection with select new is a sure trigger that makes everything in your LINQ expression tree after the projection go to the client side.
That means that in your second case you have an inefficient LINQ query: any filtering is applied in memory, after you have already transported a lot of unnecessary data.
If you want to compose LINQ expressions the way you do in case 2, you can append the Select clause last, or explicitly convert the result to IEnumerable<T> to make it obvious that any further processing will be done in memory.
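A minimal sketch of the first option, assuming the WITransmits entity and uow.DbContext from the question: apply the Where before the projection so that it is still translated into the SQL statement.

// Sketch only: filter first (translated to SQL), then project.
IQueryable<WITransmits> query = uow.DbContext.StatusMessages
    .Where(e => e.MessageID == id)             // becomes a WHERE clause on the server
    .Select(wiTransmits => new WITransmits
    {
        MessageID = wiTransmits.MessageID,
        Name = wiTransmits.Name
    });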
The first query returns the full object defined, so any additional limitations (like Where) can be appended to it before it is actually run. Therefore the query can be combined as you showed.
The second one returns a new object, which can be of whatever type and contain whatever information. Therefore the query is sent to the database as "return everything", and after the objects have been created, all but the ones that match the Where clause are discarded.
Even if the type were the same in both cases, think of this situation:
var query = from wiTransmits in uow.DbContext.StatusMessages
            select new WITransmits
            {
                MessageID = wiTransmits.MessageID * 4 - 2,
                Name = wiTransmits.Name
            };
How would you combine the Where query now? Sure, you could go through the code inside the object initializer and try to move it outside, but since it can contain anything, that is not feasible in general. What if the projection calls some lookup function? What if it's not deterministic?
Therefore, if you create new objects based on the database objects, there will be a boundary where the objects are retrieved and any further querying is done in memory.
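A hedged sketch of making that boundary explicit, following the earlier suggestion of converting to IEnumerable<T> when the projection has to come first (note that this still transfers every row, so filter before the projection whenever you can):

// Everything before AsEnumerable() is translated to SQL; everything after runs in memory.
var results = uow.DbContext.StatusMessages
    .Select(w => new WITransmits { MessageID = w.MessageID, Name = w.Name })
    .AsEnumerable()                      // boundary: rows stream to the client from here on
    .Where(e => e.MessageID == id)       // in-memory filter over the fetched rows
    .ToList();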
I have often found that if I have too many joins in a Linq query (whether using Entity Framework or NHibernate) and/or the shape of the resulting anonymous class is too complex, Linq takes a very long time to materialize the result set into objects.
This is a generic question, but here's a specific example using NHibernate:
var libraryBookIdsWithShelfAndBookTagQuery = (from shelf in session.Query<Shelf>()
join sbttref in session.Query<ShelfBookTagTypeCrossReference>() on
shelf.ShelfId equals sbttref.ShelfId
join bookTag in session.Query<BookTag>() on
sbttref.BookTagTypeId equals (byte)bookTag.BookTagType
join btbref in session.Query<BookTagBookCrossReference>() on
bookTag.BookTagId equals btbref.BookTagId
join book in session.Query<Book>() on
btbref.BookId equals book.BookId
join libraryBook in session.Query<LibraryBook>() on
book.BookId equals libraryBook.BookId
join library in session.Query<LibraryCredential>() on
libraryBook.LibraryCredentialId equals library.LibraryCredentialId
join lcsg in session
.Query<LibraryCredentialSalesforceGroupCrossReference>()
on library.LibraryCredentialId equals lcsg.LibraryCredentialId
join userGroup in session.Query<UserGroup>() on
lcsg.UserGroupOrganizationId equals userGroup.UserGroupOrganizationId
where
shelf.ShelfId == shelfId &&
userGroup.UserGroupId == userGroupId &&
!book.IsDeleted &&
book.IsDrm != null &&
book.BookFormatTypeId != null
select new
{
Book = book,
LibraryBook = libraryBook,
BookTag = bookTag
});
// add a couple of where clauses, then...
var result = libraryBookIdsWithShelfAndBookTagQuery.ToList();
I know it's not the query execution, because I put a sniffer on the database and I can see that the query is taking 0ms, yet the code is taking about a second to execute that query and bring back all of 11 records.
So yeah, this is an overly complex query, having 8 joins between 9 tables, and I could probably restructure it into several smaller queries. Or I could turn it into a stored procedure - but would that help?
What I'm trying to understand is, where is that red line crossed between a query that is performant and one that starts to struggle with materialization? What's going on under the hood? And would it help if this were a SP whose flat results I subsequently manipulate in memory into the right shape?
EDIT: in response to a request in the comments, here's the SQL emitted:
SELECT DISTINCT book4_.bookid AS BookId12_0_,
libraryboo5_.librarybookid AS LibraryB1_35_1_,
booktag2_.booktagid AS BookTagId15_2_,
book4_.title AS Title12_0_,
book4_.isbn AS ISBN12_0_,
book4_.publicationdate AS Publicat4_12_0_,
book4_.classificationtypeid AS Classifi5_12_0_,
book4_.synopsis AS Synopsis12_0_,
book4_.thumbnailurl AS Thumbnai7_12_0_,
book4_.retinathumbnailurl AS RetinaTh8_12_0_,
book4_.totalpages AS TotalPages12_0_,
book4_.lastpage AS LastPage12_0_,
book4_.lastpagelocation AS LastPag11_12_0_,
book4_.lexilerating AS LexileR12_12_0_,
book4_.lastpageposition AS LastPag13_12_0_,
book4_.hidden AS Hidden12_0_,
book4_.teacherhidden AS Teacher15_12_0_,
book4_.modifieddatetime AS Modifie16_12_0_,
book4_.isdeleted AS IsDeleted12_0_,
book4_.importedwithlexile AS Importe18_12_0_,
book4_.bookformattypeid AS BookFor19_12_0_,
book4_.isdrm AS IsDrm12_0_,
book4_.lightsailready AS LightSa21_12_0_,
libraryboo5_.bookid AS BookId35_1_,
libraryboo5_.libraryid AS LibraryId35_1_,
libraryboo5_.externalid AS ExternalId35_1_,
libraryboo5_.totalcopies AS TotalCop5_35_1_,
libraryboo5_.availablecopies AS Availabl6_35_1_,
libraryboo5_.statuschangedate AS StatusCh7_35_1_,
booktag2_.booktagtypeid AS BookTagT2_15_2_,
booktag2_.booktagvalue AS BookTagV3_15_2_
FROM shelf shelf0_,
shelfbooktagtypecrossreference shelfbookt1_,
booktag booktag2_,
booktagbookcrossreference booktagboo3_,
book book4_,
librarybook libraryboo5_,
library librarycre6_,
librarycredentialsalesforcegroupcrossreference librarycre7_,
usergroup usergroup8_
WHERE shelfbookt1_.shelfid = shelf0_.shelfid
AND booktag2_.booktagtypeid = shelfbookt1_.booktagtypeid
AND booktagboo3_.booktagid = booktag2_.booktagid
AND book4_.bookid = booktagboo3_.bookid
AND libraryboo5_.bookid = book4_.bookid
AND librarycre6_.libraryid = libraryboo5_.libraryid
AND librarycre7_.librarycredentialid = librarycre6_.libraryid
AND usergroup8_.usergrouporganizationid =
librarycre7_.usergrouporganizationid
AND shelf0_.shelfid = @p0
AND usergroup8_.usergroupid = @p1
AND NOT ( book4_.isdeleted = 1 )
AND ( book4_.isdrm IS NOT NULL )
AND ( book4_.bookformattypeid IS NOT NULL )
AND book4_.lightsailready = 1
EDIT 2: Here's the performance analysis from ANTS Performance Profiler (screenshot omitted).
It is often considered good database practice to put lots of joins, or super-common joins, into views. ORMs don't let you ignore these facts, nor do they replace the decades spent fine-tuning databases to do these kinds of things efficiently. Refactor those joins into a single view, or a couple of views if that makes more sense in the greater perspective of your application.
NHibernate should be optimizing the query down and reducing the data so that .NET only has to deal with the important parts. However, if those domain objects are just naturally large, that's still a lot of data. Also, if the result set is really large in terms of rows returned, that's a lot of objects getting instantiated even if the DB can return the set quickly. Refactoring this query into a view that returns only the data you actually need would also reduce object-instantiation overhead.
Another thought would be to not do a .ToList(). Return the enumerable and let your code lazily consume the data.
According to the profiling information, CreateQuery takes 45% of the total execution time. However, as you mentioned, the query took 0ms when you executed it directly. That alone is not enough to say there is a performance problem, because:
You are running the query under the profiler, which has a significant impact on execution time.
When you use a profiler, it affects all the code being profiled but not the SQL execution time (because that happens on the SQL Server), so everything else looks slow compared to the SQL statement.
So the ideal scenario is to measure how long the entire code block takes, measure the time for the SQL query itself, and compare the two; if you do that you will probably end up with different values.
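For example, a simple way to time the whole block, using the query variable from the question:

// Measures query creation, SQL execution, and materialization together.
var stopwatch = System.Diagnostics.Stopwatch.StartNew();
var result = libraryBookIdsWithShelfAndBookTagQuery.ToList();
stopwatch.Stop();
Console.WriteLine("Total LINQ time including materialization: {0} ms", stopwatch.ElapsedMilliseconds);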
However, I'm not saying that the NHibernate LINQ provider is optimized for every query you come up with; there are other ways in NHibernate to deal with those situations, such as the QueryOver API, Criteria queries, HQL and, finally, raw SQL.
Where is that red line crossed between a query that is performant and
one that starts to struggle with materialization. What's going on under the hood?
This is a pretty hard question, and without detailed knowledge of NHibernate's LINQ provider it's hard to give an accurate answer. You can always try the different mechanisms provided and see which one is best for a given scenario.
And would it help if this were a SP whose flat results I subsequently
manipulate in memory into the right shape?
Yes, using a stored procedure would help things run pretty fast, but a stored procedure adds more maintenance overhead to your code base.
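If you do go that route, here is a hedged sketch of what it could look like in NHibernate; the GetShelfBooks procedure and the flat ShelfBookRow DTO are hypothetical, and the flat rows would then be shaped in memory afterwards.

using System.Collections.Generic;
using NHibernate;
using NHibernate.Transform;

// Hypothetical flat row shape returned by a (hypothetical) GetShelfBooks procedure.
public class ShelfBookRow
{
    public int BookId { get; set; }
    public int LibraryBookId { get; set; }
    public int BookTagId { get; set; }
}

public static class ShelfBookQueries
{
    public static IList<ShelfBookRow> GetShelfBooks(ISession session, int shelfId, int userGroupId)
    {
        // CreateSQLQuery bypasses the LINQ provider entirely; NHibernate only maps columns onto the DTO.
        return session.CreateSQLQuery("exec GetShelfBooks :shelfId, :userGroupId")
            .SetParameter("shelfId", shelfId)
            .SetParameter("userGroupId", userGroupId)
            .SetResultTransformer(Transformers.AliasToBean<ShelfBookRow>())
            .List<ShelfBookRow>();
    }
}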
You asked a generic question, so I'll give you a generic answer :)
If you query data for reading (not for updating), try to use anonymous classes. The reason: they are lighter to create and have no navigation properties, and you select only the data you need! That's a very important rule. So try to replace your select with something like this:
select new
{
    Book = new { book.Id, book.Name },
    LibraryBook = new { libraryBook.Id, libraryBook.AnotherProperty },
    BookTag = new { bookTag.Name }
}
Stored procedures are good when the query is complex and the LINQ provider generates inefficient code; then you can replace it with plain SQL or a stored procedure. That's not often the case and, I think, it's not your situation.
Run your SQL query. How many rows does it return? Is it the same number as your result? Sometimes the LINQ provider generates SQL that selects many more rows than the one entity you asked for. It happens when the entity has a one-to-many relationship with another entity being selected. For example:
class Book
{
    public int Id { get; set; }
    public string Name { get; set; }
    public ICollection<Tag> Tags { get; set; }
}

class Tag
{
    public string Name { get; set; }
    public Book Book { get; set; }
}

...

dbContext.Books.Where(o => o.Id == 1).Select(o => new { Book = o, Tags = o.Tags }).Single();
I select only one book with Id = 1, but the provider will generate SQL that returns as many rows as the book has tags (Entity Framework does this).
Split a complex query into a set of simple ones and join the results on the client side. Sometimes you have a complex query with many conditions and the resulting SQL becomes terrible. So split your big query into simpler ones, get the results of each, and join/filter them on the client side, as in the sketch below.
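A minimal sketch of that idea, reusing the Book/Tag classes above and assuming the context exposes Books and Tags sets:

// Two simple queries instead of one complex one; the combination happens in memory.
var book = dbContext.Books.Single(b => b.Id == 1);                 // one row
var tags = dbContext.Tags.Where(t => t.Book.Id == 1).ToList();     // only the matching tag rows
var result = new { Book = book, Tags = tags };                     // joined client-side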
In the end, I advise you to use an anonymous class as the result of your select.
Don’t use Linq’s Join. Navigate!
In that post you can see:
As long as there are proper foreign key constraints in the database, the navigation properties will be created automatically. It is also possible to manually add them in the ORM designer. As with all LINQ to SQL usage I think that it is best to focus on getting the database right and have the code exactly reflect the database structure. With the relations properly specified as foreign keys the code can safely make assumptions about referential integrity between the tables.
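As a hedged illustration against the question's entities (assuming the mappings expose, for instance, a Book.LibraryBooks collection), navigating the association replaces the explicit join:

// Sketch only: navigate the association instead of writing join ... on ... equals ...
var libraryBooks = session.Query<Book>()
    .Where(b => !b.IsDeleted && b.IsDrm != null && b.BookFormatTypeId != null)
    .SelectMany(b => b.LibraryBooks);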
I agree 100% with the sentiments expressed by everyone else (with regard to there being two parts to the optimisation here, and the SQL execution being a big unknown and a likely cause of poor performance).
Another part of the solution that might help you get some speed is to pre-compile your LINQ statements. I remember this being a huge optimisation on a tiny (high-traffic) project I worked on ages and ages ago... it seems like it would help with the client-side slowness you're seeing. Having said all that, though, I've not found a need to use them since... so heed everyone else's warnings first! :)
https://msdn.microsoft.com/en-us/library/vstudio/bb896297(v=vs.100).aspx
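For reference, a minimal sketch of a pre-compiled LINQ to SQL query using CompiledQuery.Compile; the MyDataContext and Book names here are hypothetical, and NHibernate itself has no equivalent of this API:

using System;
using System.Data.Linq;
using System.Linq;

public static class CompiledQueries
{
    // MyDataContext is assumed to be a DataContext subclass exposing a Books table.
    // The expression tree is translated to SQL once and the translation is reused on every call.
    public static readonly Func<MyDataContext, int, IQueryable<Book>> BooksOnShelf =
        CompiledQuery.Compile((MyDataContext db, int shelfId) =>
            db.Books.Where(b => b.ShelfId == shelfId));
}

// Usage (hypothetical): var books = CompiledQueries.BooksOnShelf(context, 42).ToList();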
I built a dynamic LINQ to Entities query to support optional search parameters. It was quite a bit of work to get this producing performant SQL, and I am NEARLY there, but I've stumbled over a big issue with OrderBy: it gets translated into a kind of projection/subquery containing the actual query, producing extremely slow SQL. I can't find a solution to get this right. Maybe someone can help me out :)
I'll spare you the complete query for now, as it is long and complex; I've translated it into a simple sample for better understanding:
I'm doing something like this:
// Start with the base query
var query = from a in db.Articles
            where a.UserId == 1
            select a;

// Apply some optional conditions
if (tagParam != null)
    query = query.Where(a => a.Tag == tagParam);

if (authorParam != null)
    query = query.Where(a => a.Author == authorParam);

// ... and so on ...

// I only want the 50 most recent articles, so I finally apply Take and OrderBy
query = query.OrderByDescending(a => a.Published);
query = query.Take(50);
The resulting SQL strangely wraps the OrderBy in a container query:
select top 50 Id, Published, Title, Content
from (select Id, Published, Title, Content
      from Articles
      where UserId = 1
      and Author = @paramAuthor) as inner_query
order by Published desc
Note also that the TOP 50 got moved to the outer query. If I only use Take(50), the TOP 50 is correctly applied to the inner query above (the outer query wouldn't even exist). Only when I use OrderBy does LINQ to Entities use this container-query approach.
This causes a very bad execution plan where the inner query reads all articles that match the parameters from disk and passes them to the outer query, and only there are the ORDER BY and TOP processed. In my case this can be hundreds of thousands of rows. I already tried moving the ORDER BY manually into the inner statement and executing that; it produces much better results, as the existing indexes let SQL Server easily find the top 50 rows in the right order without reading all rows from disk.
Is there any way I can get EF to append the order by clause to the inner query? Or any other trick to get this working right?
Any help would be greatly appreciated :)
Edit: As additional information, some tests with less complex queries showed that the optimizer normally handles such subquery scenarios well. In my scenario, unfortunately, the optimizer fails and moves hundreds of thousands of rows through the query plan. Moving the OrderBy to the inner query solves it and the optimizer gets it right.
Edit 2: After a couple more hours of testing, it seems the issue with the wrong execution plan is a SQL Server issue that is not caused by the generated container query. While moving the ORDER BY and TOP clauses into the inner query did fix the issue initially, I can't reproduce this anymore; SQL Server has started using the bad execution plan here as well (while the data in the DB remained unchanged). Moving the ORDER BY clause might have caused SQL Server to take other statistics into account, but it seems the improvement was not due to the better/cleaner query design. However, I still want to know why EF uses a container query here and whether I can influence this behavior. Even if it doesn't improve performance, it would at least make debugging easier if the generated EF queries were more straightforward and less convoluted.
Initially I had written this query using LINQ-to-SQL
var result = from w in PatternDataContext.Windows
join cf in PatternDataContext.ControlFocus on w.WindowId equals cf.WindowId
join p in PatternDataContext.Patterns on cf.CFId equals p.CFId
join r in ResultDataContext.Results on p.PatternId equals r.PatternId
join fi in ResultDataContext.IclFileInfos on r.IclFileId equals fi.IclFileId
join sp in sessionProfileDataContext.ServerProfiles on fi.ServerProfileId equals sp.ProfileId
join u in infrastructure.Users on sp.UserId equals u.Id
where w.Process.Equals(processName)
select u.DistributedAppId;
And when I executed it and looked at the result in QuickWatch, it showed this message:
the query contains references to items defined on a different data context
On googling, I found a topic on Stack Overflow itself where I learned about simulating cross-context joins, and as suggested there, I changed my query a bit, to this:
var result = from w in PatternDataContext.Windows
join cf in PatternDataContext.ControlFocus on w.WindowId equals cf.WindowId
join p in PatternDataContext.Patterns on cf.CFId equals p.CFId
join r in SimulateJoinResults() on p.PatternId equals r.PatternId
join fi in SimulateJoinIclFileInfos() on r.IclFileId equals fi.IclFileId
join sp in SimulateJoinServerProfiles() on fi.ServerProfileId equals sp.ProfileId
join u in SimulateJoinUsers() on sp.UserId equals u.Id
where w.Process.Equals(processName)
select u.DistributedAppId;
This query is using these SimulateXyz methods:
private static IQueryable<Result> SimulateJoinResults()
{
    return from r in SessionDataProvider.Instance.ResultDataContext.Results select r;
}

private static IQueryable<IclFileInfo> SimulateJoinIclFileInfos()
{
    return from f in SessionDataProvider.Instance.ResultDataContext.IclFileInfos select f;
}

private static IQueryable<ServerProfile> SimulateJoinServerProfiles()
{
    return from sp in sessionProfileDataContext.ServerProfiles select sp;
}

private static IQueryable<User> SimulateJoinUsers()
{
    return from u in infrastructureDataContext.Users select u;
}
But even this approach didn't solve the problem. I'm still getting this message in QuickWatch...:
the query contains references to items defined on a different data context
Any solution for this problem? Along with the solution, I would also like to know why the problem still exists and how exactly the new solution removes it, so that next time I can solve such problems myself. I'm new to LINQ, by the way.
I've had to do this before, and there are two ways to do it.
The first is to move all the servers into a single context. You do this by pointing LINQ to SQL at a single server and then, on that server, creating linked servers to all the other servers. Then you just create views for any tables you're interested in from the other servers, and add those views to your context.
The second is to manually do the joins yourself, by pulling in data from one context, and using just the properties you need to join into another context. For example,
int[] patternIds = SessionDataProvider.Instance.ResultDataContext.Results
    .Select(o => o.PatternId)
    .ToArray();

var results = from p in PatternDataContext.Patterns
              where patternIds.Contains(p.PatternId)
              select p;
Though the first is easier to work with, it does have its share of problems. The problem is that you're relying on SQL Server to be performant with linked servers, something it is notoriously bad at. For example, consider this query:
var results = from p in DataContext.Patterns
              join r in DataContext.LinkedServerResults on p.PatternId equals r.PatternId
              where r.UserId == 10
              select p;
When you enumerate this query, the following will occur (let's call the normal and linked servers MyServer and MyLinkedServer, respectively)
MyServer asks MyLinkedServer for the Results
MyLinkedServer sends the Results back to MyServer
MyServer takes those Results, joins them on the Patterns table, and returns only the ones with Results.userId = 10
So now the question is: When is the filtering done - on MyServer or MyLinkedServer? In my experience, for such a simple query, it will usually be done on MyLinkedServer. However, once the query gets more complicated, you'll suddenly find that MyServer is requesting the entire Results table from MyLinkedServer and doing the filtering after the join! This wastes bandwidth, and, if the Results tables is large enough, could turn a 50ms query into a 50 second query!
You could fix poorly performing cross-server joins using stored procedures, but if you do a lot of complex cross-server joins you may end up writing stored procedures for most of your queries, which is a lot of work and defeats part of the purpose of using L2SQL in the first place (not having to write a lot of SQL).
In comparison, the following code would always perform the filtering on the server containing the Results table:
int[] patternIds = (from r in SessionDataProvider.Instance.ResultDataContext.Results
                    where r.UserId == 10
                    select r.PatternId).ToArray();

var results = from p in PatternDataContext.Patterns
              where patternIds.Contains(p.PatternId)
              select p;
Which is best for your situation is up to your best judgement.
Note that there is a third potential solution which I did not mention, as it is not really a programmer-solution: you could ask your server admins to set up a replication task to copy the necessary data from MyLinkedServer to MyServer once a day/week/month. This is only an option if:
Your program can work with slightly stale data from MyLinkedServer
You only need to read, never write, to MyLinkedServer
The tables you need from MyLinkedServers are not exorbitantly huge
You have the space/bandwidth available
Your database admins are not stingy/lazy
Your SimulateJoin methods can't work because they return IQueryable. Your current solution is exactly the same as your former one, and that is the reason you get the same exception. If you check the linked question again, you will see that their helper methods return IEnumerable, which is the only way to make cross-context operations work. As you probably already know, that means the join will be performed in memory on the application server instead of on the database server: it will pull all the data from your partial queries and execute the join as LINQ to Objects.
A cross-context join at the database level is, in my opinion, not possible. You can have different connections, different connection strings pointing to different servers, etc. LINQ to SQL does not handle this.
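A hedged sketch of doing that explicitly with the entities from the question: run each context's part as its own SQL query, then join the materialized results in memory (only reasonable if the partial result sets are fairly small).

// Each ToList() runs one context's query as SQL; the join below is pure LINQ to Objects.
var patterns = (from w in PatternDataContext.Windows
                join cf in PatternDataContext.ControlFocus on w.WindowId equals cf.WindowId
                join p in PatternDataContext.Patterns on cf.CFId equals p.CFId
                where w.Process.Equals(processName)
                select p).ToList();

var resultRows = ResultDataContext.Results.ToList();

var joined = from p in patterns
             join r in resultRows on p.PatternId equals r.PatternId
             select new { p, r };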
You could work around it by "escaping from" LINQ to SQL on the second context, i.e. calling, for instance, .ToList() on ResultDataContext.Results and ResultDataContext.IclFileInfos, so that your query ends up looking like:
var result = from w in PatternDataContext.Windows
join cf in PatternDataContext.ControlFocus on w.WindowId equals cf.WindowId
join p in PatternDataContext.Patterns on cf.CFId equals p.CFId
join r in ResultDataContext.Results.ToList()
on p.PatternId equals r.PatternId
join fi in ResultDataContext.IclFileInfos.ToList()
on r.IclFileId equals fi.IclFileId
join sp in sessionProfileDataContext.ServerProfiles on
fi.ServerProfileId equals sp.ProfileId
join u in infrastructure.Users on sp.UserId equals u.Id
where w.Process.Equals(processName)
select u.DistributedAppId;
Or AsEnumerable() as long as you "get out" of Linq to SQL and into Linq to Objects for the "offending" context.
Old question, but as I happened to have the same problem, my solution was to pass a manually crafted T-SQL cross-server query (using linked servers) directly to the provider through the ExecuteQuery method of the first context:
db.ExecuteQuery(Of cTechSupportCall)(strSql).ToList
This just saves you from having to create a view on the server side, and LINQ to SQL still maps the results to the proper type. It is useful when there is that one query that is just impossible to formulate in LINQ.
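For reference, that call is VB.NET; a C# equivalent would look something like this (cTechSupportCall and strSql as in the answer above):

// DataContext.ExecuteQuery<T> runs the raw SQL and maps the result columns onto T.
var calls = db.ExecuteQuery<cTechSupportCall>(strSql).ToList();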
I am trying to do a join with a subquery and can't seem to get it. Here is what it looks like working in SQL. How do I get it to work in LINQ?
SELECT po.*, p.PermissionID
FROM PermissibleObjects po
INNER JOIN PermissibleObjects_Permissions po_p ON (po.PermissibleObjectID = po_p.PermissibleObjectID)
INNER JOIN Permissions p ON (po_p.PermissionID = p.PermissionID)
LEFT OUTER JOIN
(
SELECT u_po.PermissionID, u_po.PermissibleObjectID
FROM Users_PermissibleObjects u_po
WHERE u_po.UserID = '2F160457-7355-4B59-861F-9871A45FD166'
) used ON (p.PermissionID = used.PermissionID AND po.PermissibleObjectID = used.PermissibleObjectID)
WHERE used.PermissionID is null
Without seeing your database and data model, it's pretty impossible to offer any real help. But, probably the best way to go is:
download linqpad - http://www.linqpad.net/
create a connection to your database
start with the innermost piece - the subquery with the "where" clause
get each small query working, then join them up. Linqpad will show you the generated SQL, as well as the results, so build your small queries up until they are right
So, basically, split your problem up into smaller pieces. Linqpad is fantastic as it lets you test these things out, and check your results as you go
hope this helps, good luck
Toby
The LINQ translation for your query is surprisingly simple:
from pop in PermissibleObjectPermissions
where !pop.UserPermissibleObjects.Any (
upo => upo.UserID == new Guid ("2F160457-7355-4B59-861F-9871A45FD166"))
select new { pop.PermissibleObject, pop.PermissionID }
In words: "From all object permissions, retrieve those with at least one user-permission whose UserID is 2F160457-7355-4B59-861F-9871A45FD16".
You'll notice that this query uses association properties for navigating relationships - this avoids the need for "joining" and simplfies the query. As a result, the LINQ query is much closer to its description in English than the original SQL query.
The trick, when writing LINQ queries, is to get out of the habit of "transliterating" SQL into LINQ.