How to put two conditions in a lambda expression - C#

I have two Repeater controls and this LINQ query:
var list = from i in DA.obm_view_studentLists where i.FranchiseID == FranchiseID && i.ExamDate == DateTime.Parse(this.pallavi.DropDownSelectedValue) select i;
I want to get the results from the database in one go.
this._approvedStudentList.DataSource = list.Select(e => e.ApprovedByOBM == true);
this._pendingStudentList.DataSource = list.Select(e => e.ApprovedByOBM == false);
I have a Label field (UnPaidTotal) in the pendingStudentList Repeater where I want to display the total pending fee.
I tried this, but it fails:
UnPaidTotal = string.Format("{0:c}", list.Select(e => e.ApprovedByOBM == true).Sum(j => j.CourseFee));

Evaluate list to get all results in one shot -- just wrap from i in… select i in parentheses and add .ToList() at the end.
When setting the DataSource, replace list.Select(… with list.Where(…
Same thing when getting the unpaid total -- use Where instead of Select.
You don't need to return two collections from the database, because you're dividing the results based on a boolean value. You want all of the query results anyway, since each result will belong to one collection or the other, so you can split them in memory.
Select is used to perform a transformation on a collection, whereas Where is a filter. An example of Select would be manufacturerList = carCollection.Select(car => car.Manufacturer);. This takes a collection of cars and transforms it into an enumerable of Manufacturer by selecting out just that property from each member of the collection.
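Putting that together, a minimal sketch of the corrected code (reusing the original names; the pending total assumes ApprovedByOBM == false means unpaid) might look like this:
// Evaluate the query once so the database is hit a single time.
var list = (from i in DA.obm_view_studentLists
            where i.FranchiseID == FranchiseID
               && i.ExamDate == DateTime.Parse(this.pallavi.DropDownSelectedValue)
            select i).ToList();

// Where keeps matching rows; Select would only project the boolean comparison.
this._approvedStudentList.DataSource = list.Where(e => e.ApprovedByOBM == true);
this._pendingStudentList.DataSource = list.Where(e => e.ApprovedByOBM == false);

// Sum the course fees of the students that are still pending.
UnPaidTotal = string.Format("{0:c}", list.Where(e => e.ApprovedByOBM == false).Sum(j => j.CourseFee));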

The key here is that after your first line of code where you create the query for the list variable you haven't gone out to the database yet. All you did is create an IEnumerable/IQueryable that knows how to go out to the database and get your data. Nothing happens until you actually bind it to a control or otherwise try to iterate over the results.
Your problem is that when you finally bind the first control, the state of that IEnumerable changes: it now points to the end of the result set instead of waiting to query the database from the beginning. When you try to bind the second Repeater, there are no records left.
The easiest way to remedy this is to simply put the original query results in a list, though you should be aware that this will cause your web server to load the entire result set into RAM. Another option is to use one Repeater, order the query appropriately, and use code to output the desired HTML for each kind of row plus the extra HTML between sections. This will perform better but blurs the lines between separate tiers.
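For the single-Repeater option, a rough sketch (the _studentList control and the ItemDataBound wiring are assumptions, not code from the question) would be:
// One trip to the database, ordered so approved students come first, then pending ones.
var all = list.OrderByDescending(e => e.ApprovedByOBM).ToList();
this._studentList.DataSource = all;
this._studentList.DataBind();
// In the repeater's ItemDataBound handler, inspect the bound item's ApprovedByOBM flag
// and emit the section header/extra HTML whenever the flag changes from the previous row.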

Related

Faster way to get distinct values in LINQ?

I have a web part in SharePoint, and I am trying to populate a drop-down control with the unique/distinct values from a particular field in a list.
Unfortunately, due to the nature of the system, it is a text field, so there is no other definitive source for the data values (i.e., if it were a choice field, I could get the field definition and just read the values from there), and I am using the chosen value of the drop-down in a subsequent CAML query, so the values must exactly match what is present on the list items. Currently the list has approx. 4K items, but it is growing slowly (and will continue to).
Also, it's part of a sandbox solution, so it is restricted by the user code service time limit, and it's timing out more often than not. In my dev environment I stepped through the code in debug, and the line of LINQ where I actually get the distinct values seems to be the most time-consuming. When I commented out the call to this method entirely, the timeouts stopped, so I am fairly certain this is where the problem is.
Here's my code:
private void AddUniqueValues(SPList list, SPField filterField, DropDownList dropDownControl)
{
    SPQuery query = new SPQuery();
    query.ViewFields = string.Format("<FieldRef Name='{0}' />", filterField.InternalName);
    query.ViewFieldsOnly = true;
    SPListItemCollection results = list.GetItems(query); // retrieves ~4K items
    List<string> uniqueValues = results.Cast<SPListItem>().Select(item => item[filterField.Id].ToString()).Distinct().ToList(); // this takes too long with 4K items
    uniqueValues.Sort();
    dropDownControl.Items.AddRange(uniqueValues.Select(itm => new ListItem(itm)).ToArray());
}
As far as I am aware, there's no way to get "distinct" values directly in a CAML query, so how can I do this more quickly? Is there a way to restructure the LINQ to run faster?
Is there an easy/fast way to do this from the client side? (REST would be preferred, but I'd do JSOM if necessary).
Thought I'd add some extra information here since I did some further testing and found some interesting results.
First, to address the questions of whether the Cast() and Select() are needed: yes, they are.
SPListItemCollection is IEnumerable but not IEnumerable<T>, so we need to cast just to be able to get to use LINQ at all.
Then, after it's cast to IEnumerable<SPListItem>: SPListItem is a fairly complex object, and I am looking for distinct values from just one property of that object. Using Distinct() directly on the IEnumerable<SPListItem> yields... all of them. So I have to Select() just the single values I want to compare.
So yes, the Cast() and Select() are absolutely necessary.
As noted in the comments by M.kazem Akhgary, in my original line of code, calling ToString() every time (for 4K items) did add some time. But in testing some other variations:
// original
List<string> uniqueValues = results.Cast<SPListItem>().Select(item => item[filterField.Id].ToString()).Distinct().ToList();
// hash set alternative
HashSet<object> items = new HashSet<object>(results.Cast<SPListItem>().Select(itm => itm[filterField.Id]));
// don't call ToString(), just deal with base objects
List<object> obs = results.Cast<SPListItem>().Select(itm => itm[filterField.Id]).Distinct().ToList();
// alternate LINQ syntax from Pieter_Daems answer, seems to remove the Cast()
var things = (from SPListItem item in results select item[filterField.Id]).Distinct().ToList();
I found that all of those methods took multiple tens of seconds to complete. Strangely, the DataTable/DataView method from Pieter_Daems's answer, to which I added a bit to extract the values I wanted:
DataTable dt = results2.GetDataTable();
DataView vw = new DataView(dt);
DataTable udt = vw.ToTable(true, filterField.InternalName);
List<string> rowValues = new List<string>();
foreach (DataRow row in udt.Rows)
{
    rowValues.Add(row[filterField.InternalName].ToString());
}
rowValues.Sort();
took only 1-2 seconds!
In the end, I am going with Thriggle's answer, because it deals nicely with SharePoint's 5000 item list view threshold, which I will probably be dealing with some day, and it is only marginally slower (2-3 seconds) than the DataTable method. Still much, much faster than all the LINQ.
Interesting to note, though, that the fastest way to get distinct values from a particular field from a SPListItemCollection seems to be the DataTable/DataView conversion method.
You're potentially introducing a significant delay by retrieving all items first before checking for distinctness.
An alternative approach would be to perform multiple CAML queries against SharePoint; this would result in one query per unique value (plus one final query that returns no results).
1. Make sure your list has column indexing applied to the field whose values you want to enumerate.
2. In your initial CAML query, sort by that field and impose a row limit of one item.
3. Get the value of the field from the item returned by that query and add it to your collection of unique values.
4. Query the list again, sorting by the field and imposing a row limit of 1, but this time add a filter condition so that it only retrieves items where the field value is greater than the value you just detected.
5. Add the value of the field in the returned item to your collection of unique values.
6. Repeat steps 4 and 5 until the query returns an empty result set, at which point your collection of unique values should contain all current values of the field (assuming more haven't been added since you started).
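A rough sketch of that loop (untested; the Text value type and the lack of escaping on the CAML value are simplifying assumptions):
List<string> uniqueValues = new List<string>();
string lastValue = null;
while (true)
{
    SPQuery query = new SPQuery();
    query.RowLimit = 1;
    query.ViewFields = string.Format("<FieldRef Name='{0}' />", filterField.InternalName);
    query.ViewFieldsOnly = true;
    // After the first pass, only ask for values greater than the one we just found.
    string where = lastValue == null
        ? string.Empty
        : string.Format("<Where><Gt><FieldRef Name='{0}' /><Value Type='Text'>{1}</Value></Gt></Where>",
                        filterField.InternalName, lastValue);
    query.Query = where + string.Format("<OrderBy><FieldRef Name='{0}' /></OrderBy>", filterField.InternalName);
    SPListItemCollection batch = list.GetItems(query);
    if (batch.Count == 0)
        break; // nothing greater than the last value: we have them all
    lastValue = Convert.ToString(batch[0][filterField.Id]);
    uniqueValues.Add(lastValue);
}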
Will this be any faster? That depends on your data, and how frequently duplicate values occur.
If you have 4000 items and only 5 unique values, you'll be able to gather those 5 values in only 6 lightweight CAML queries, returning a total of 5 items. This makes a lot more sense than querying for all 4000 items and enumerating through them one at a time to look for unique values.
On the other hand, if you have 4000 items and 3000 unique values, you're looking at querying the list 3001 times. This might well be slower than retrieving all the items in a single query and using post-processing to find the unique values.
var distinctItems = (from SPListItem item in items select item["EmployeeName"]).Distinct().ToArray();
Or convert your results to DataView and do something like:
SPList oList = SPContext.Current.Web.Lists["ListName"];
SPQuery query = new SPQuery();
query.Query = "<OrderBy><FieldRef Name='Name' /></OrderBy>";
DataTable dtcamltest = oList.GetItems(query).GetDataTable();
DataView dtview = new DataView(dtcamltest);
DataTable dtdistinct = dtview.ToTable(true, "Name");
Source: https://sharepoint.stackexchange.com/questions/77988/caml-query-on-sharepoint-list-without-duplicates
Duplicate maybe?
.Distinct is an O(n) call.
You can't get any faster than that.
That being said, maybe you want to check whether you really need the Cast + Select just to get the unique values - I'd try a HashSet.
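For example, a hypothetical variant that skips Cast/Select/Distinct and fills a HashSet<string> directly (it still enumerates all items, so don't expect miracles):
HashSet<string> unique = new HashSet<string>();
foreach (SPListItem item in results)
{
    // The set ignores duplicates, so each value ends up in it exactly once.
    unique.Add(Convert.ToString(item[filterField.Id]));
}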

IQueryable being 3 times as slow with Contains method vs. Array

I have some records which I fetch from the database (normally about 100-200). I also need to get the corresponding Place for every record and fill in the Place's Description on the record. I would normally do that in my .Select function, but I need to check whether Place isn't null before trying to take the Description. My code goes like this:
var places = db.Places.Where(p => p.Active && p.CustomerID == cust_ID).ToArray();
foreach (var result in query)
    result.Description =
        places.Where(Place.Q.Contains(result.Latitude, result.Longitude).Compile())
              .FirstOrDefault()?.Description;
query is IQueryable.
If I take places as IQueryable or IEnumerable and remove the Compile() from my Expression, my code runs 3x (!!!) as slow as when I run the code as shown here.
Does anyone have an explanation for that? Does places get fetched from the database on every iteration of the foreach?
(Edit as my first question was answered)
Also, is there any way I could check whether Place is null in my Select function (without pulling the results into memory, keeping it IQueryable), so I don't have to loop over my results afterwards?

Is it possible to make LINQ's OrderBy operation maintain the original order of an IQueryable?

When attempting to page a search result set based on an IQueryable I get an error stating that I must call OrderBy before using Skip.
In most cases, this is fine, as I can just order the results by Id. However, I have an edge case wherein I am calling a Function Import via my entity class. This function import uses a FREETEXT search in SQL and returns a ranked result set.
I need to maintain--and be able to page through--that result set, but if I call OrderBy against any of the columns in my result set, I will no longer have a set of ranked search results.
So, as silly as it sounds, is there a way to fool LINQ into thinking that I have called an OrderBy operation that just maintains the original order?
You can use this:
var page = searchResultSet.OrderBy(x => 0).Skip(40).Take(20);
That will maintain the original order of your result set, because every record gets the same constant key of 0; sorting on equal keys won't change their relative order.
Check this answer

Limit Number of Results being returned in a List from Linq

I'm using LINQ/EF 4.1 to pull some results from a database, and I would like to limit the results to the X most recent, where X is a number set by the user.
Is there a way to do this?
I'm currently passing them back as a List, if that helps with limiting the result set. While I could limit this by looping until I hit X, I'd just as soon not pass the extra data around.
Just in case it is relevant...
C# MVC3 project running from a SQL Server database.
Use the Take function
int numberOfrecords=10; // read from user
listOfItems.OrderByDescending(x => x.CreatedDate).Take(numberOfrecords)
Assuming listOfItems is a List of your entity objects and CreatedDate is a field holding the date-created value (used here to order descending and get the most recent items).
The Take() function returns a specified number of contiguous elements from the start of a sequence.
http://msdn.microsoft.com/en-us/library/bb503062.aspx
results = results.OrderByDescending(x=>x.Date).Take(10);
The OrderByDescending(...) will sort items by your date/time property (or whatever logic you want to use to get the most recent), and Take(...) will limit the result to the first x items (the first being the most recent, thanks to the ordering).
Edit: To return some rows not starting at the first row, use Skip():
results = results.OrderByDescending(x=>x.Date).Skip(50).Take(10);
Use Take() before converting to a List. That way EF can optimize the query it creates and only return the data you need.
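As a hedged sketch (db, Items and CreatedDate are assumed names, not from the question), keeping Take before ToList lets EF translate it into a TOP clause instead of filtering in memory:
int x = 10; // number chosen by the user
var recent = db.Items
               .OrderByDescending(i => i.CreatedDate)
               .Take(x)   // applied before ToList, so only x rows come back from SQL
               .ToList();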

Determine which elements in a list are NOT in another list

I have two IList<CustomObject>, where CustomObject has a Name property that's a string. Call the first one set, and the second one subset. set contains a list of things that I just displayed to the user in a multiselect list box. The ones the user selected have been placed in subset (so subset is guaranteed to be a subset of set, hence the clever names ;) )
What is the most straightforward way to generate a third IList<CustomObject>, inverseSubset, containing all the CustomObjects the user DIDN'T select, from these two sets?
I've been trying LINQ things like this
IEnumerable<CustomObject> inverseSubset = set.Select<CustomObject,CustomObject>(
sp => !subset.ConvertAll<string>(p => p.Name).Contains(sp.Name));
...based on answers to vaguely similar questions, but so far nothing is even compiling, much less working :P
Use the LINQ Except for this:
Produces the set difference of two sequences.
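A minimal sketch, assuming the items in subset are the very same object references taken from set (so default equality is enough); otherwise pass an IEqualityComparer<CustomObject> that compares Name:
IList<CustomObject> inverseSubset = set.Except(subset).ToList();
// Or, comparing by Name explicitly:
IEnumerable<CustomObject> inverseByName = set.Where(o => !subset.Any(s => s.Name == o.Name));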
Aha, too much SQL recently - I didn't want Select, I wanted Where:
List<string> subsetNames = subset.ConvertAll<string>(p => p.Name);
IEnumerable<CustomObject> inverseSubset =
set.Where<CustomObject>(p => !subsetNames.Contains(p.Name));
