Using Dapper Extensions, can IDs be retrieved when inserting IEnumerable<T>?

If an object is inserted 1 at a time, then the Id can be fetched from the object:
foreach (var obj in objectList)
{
    conn.Insert(obj);
    int id = obj.Id; // Returns Id as expected
}
However, if an IEnumerable of objects is inserted, the Ids cannot be fetched properly:
conn.Insert(objectList);
foreach (var obj in objectList)
{
    int id = obj.Id; // Returns 0
}
Is there a way to insert the list of objects and still get the Ids back without inserting 1 at a time?

Doesn't look like that has been implemented. See the code here. I would assume this is for performance reasons.
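If you need the generated keys, one workaround is to fall back to the single-entity overload, which does populate the key. A minimal sketch reusing the behaviour already shown in the question (it gives up the batch insert's speed):

using System.Linq; // for Select

// Insert one at a time so Dapper Extensions assigns each key,
// then read the Ids back off the objects.
foreach (var obj in objectList)
{
    conn.Insert(obj);
}
var ids = objectList.Select(o => o.Id).ToList();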

Related

How to update values in a list, using a foreach loop

I have some values stored in a database that I am querying.
For each object, I want to save the value to my list.
Currently I'm doing it in a function that I call once per second, like this:
foreach (var obj in results)
{
    MyList.Add(ValueSavedOnDatabase);
}
But my issue is that if I have, for example, 20 objects in the database, this function will save the values in the first 20 indexes of the list.
Then each time the function gets called (once per second), it adds 20 new entries to the list, instead of just overwriting the existing 20, which is what I need.
So after 3 seconds I get 60 entries in the list, when I just wanted 20.
Does anyone have advice on how I can overwrite these entries in the list, instead of continuing to create new ones?
I'm using Unity and C#.
You need to empty your list before the loop:
MyList.Clear(); // Clear the list
foreach (var obj in results)
{
    // do whatever you need
    MyList.Add(ValueSavedOnDatabase);
}
Or if the size of the list is always the same per each call, you can do this:
int i = 0;
foreach (var obj in results)
{
    MyList[i] = ValueSavedOnDatabase;
    i++;
}
N.B. I only suggest this last approach in case you are using an ArrayList.
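If the list may not be filled yet (for instance on the very first call), a sketch combining the two ideas, assuming results always yields the same number of objects in the same order:

// Overwrite in place once the list has been filled;
// fall back to Add on the first pass.
int i = 0;
foreach (var obj in results)
{
    if (i < MyList.Count)
        MyList[i] = ValueSavedOnDatabase;
    else
        MyList.Add(ValueSavedOnDatabase);
    i++;
}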

sql nhibernate performance for loop

I have the following logic:
loop through a list of ids, get the associated entity, and for that entity, loop through another list of ids and get another entity. Code is below:
foreach (var docId in docIds)
{
    var doc = new EntityManager<Document>().GetById(docId);
    foreach (var tradeId in tradeIds)
    {
        var trade = new EntityManager<Trade>().GetById(tradeId);
        if (doc.Trade.TradeId != trade.TradeId)
        {
            Document newDoc = new Document(doc, trade, 0);
            new EntityManager<Document>().Add(newDoc);
        }
    }
}
My question is mainly about SQL performance. Obviously there will be a bunch of selects happening, as well as some adds. Is this a bad way to go about doing something like this?
Should I instead use a session, get a list of all entities that match the list of ids (with one select statement), and then loop afterwards?
This is based only on my experience, but you can test it yourself.
If the Trade entity isn't very big and the number of entities won't exceed 1000, reading all the entities first and looping afterwards is much preferable.
If the count is over 1k, it's better to call a stored procedure that joins against a temp table containing your ids.
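For the first case, a minimal sketch of the read-everything-then-loop approach, assuming an open NHibernate ISession and the entity/property names from the question (d.Id is a guess at the key property); NHibernate's LINQ provider translates Contains() into a SQL IN clause:

using System.Collections.Generic;
using System.Linq;
using NHibernate;
using NHibernate.Linq;

public void CreateMissingDocuments(ISession session, IList<int> docIds, IList<int> tradeIds)
{
    // One SELECT per entity type instead of one SELECT per id.
    var docs = session.Query<Document>()
        .Where(d => docIds.Contains(d.Id)) // d.Id is assumed; use your real key property
        .ToList();
    var trades = session.Query<Trade>()
        .Where(t => tradeIds.Contains(t.TradeId))
        .ToList();

    // All pairing now happens in memory.
    foreach (var doc in docs)
        foreach (var trade in trades)
            if (doc.Trade.TradeId != trade.TradeId)
                session.Save(new Document(doc, trade, 0));
}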

Update specific object in array

I have a DataTable and an array of objects that I loop through.
For each row in a data table, I search through my collection of objects with Linq, and if found, that object needs to be updated.
But how do I refresh my collection without reloading it from the database?
Car[] mycars = Cars.RetrieveCars(); // This is my collection of objects

// Iterate through the DataTable rows and find a match in mycars
using (DataTable dt = data.ExecuteDataSet(@"SELECT * FROM aTable").Tables[0])
{
    foreach (DataRow dr in dt.Rows)
    {
        var found = (from item in mycars
                     where item.colour == dr["colour"].ToString()
                        && item.updated == false
                     select item).FirstOrDefault(); // FirstOrDefault so 'found' can be null
        if (found == null)
        {
            // Do something
        }
        else
        {
            found.updated = true;
            Cars.SaveCar(found);
            // HERE: Now I would like to refresh my collection (mycars) so that
            // the LINQ searches on updated data.
            // Something like mycars[found].updated = true
            // But obviously mycars can only be indexed by an int, and preferably
            // I do not want to reload from the database for performance reasons.
        }
    }
}
How else can I search and update a single item in the array?
You don't need to update your collection - assuming Car is a class, you've already updated the object that the array refers to by setting found.updated to true.
Don't forget that the array only contains references - so the found reference is the same reference which is in the array; updating the object via either variable will result in the change being visible via the other one.
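A minimal demonstration of that point (Car here is a stand-in with just the two fields from the question):

using System;
using System.Linq;

var mycars = new[] { new Car { colour = "red" }, new Car { colour = "blue" } };

// 'found' and mycars[0] are two references to the same object.
var found = mycars.First(c => c.colour == "red" && !c.updated);
found.updated = true;

Console.WriteLine(mycars[0].updated); // True: the array sees the change

public class Car
{
    public string colour;
    public bool updated;
}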

Does Linq OrderBy Not Sort Original Collection?

I am using Linq To Sql as my database layer for the first time and I have run into an issue. Basically, I have a form that allows users to create predefined jobs. A predefined job can have many predefined job items. For example, a predefined job might be something like an oil change; the predefined job items would be oil, labor, etc. In my database PredefinedJobs is a table and PredefinedJobItems is another table with a foreign key back to PredefinedJobs.

I have a form for adding predefined jobs that has the Linq-to-Sql class backing the form as a class variable. There is a ListView on the form that displays all of the job items. A new feature has required me to track the position of an item in the ListView. For example, if my item ListView looks like below, note the order:
Qty  Name   Desc     ItemOrder
4    Oil    Penzoil  0
1    Labor           1
Because the items are added via a child form, I do not want to provide access to the ListView. So, I created the method below in an attempt to both create the ItemOrder and sort the collection on the PredefinedJob Linq to Sql object. It does not appear that the OrderBy function on the List actually sorts the collection on the PredefinedJob. What would be the best way to maintain order on the Linq to Sql collection (i.e. PredefinedJob.fkJobItems)? Or, would it be better to just pass a reference to my ListView into the child form that adds the items to the jobs, where I have access to the SelectedIndex?
private void SortAndOrderItems(List<PredefinedJobsItem> items)
{
    var nextItemOrderNumber = items.Max(max => max.ItemOrder) + 1;
    foreach (var item in items)
    {
        if (item.ItemOrder == null)
        {
            item.ItemOrder = nextItemOrderNumber;
            nextItemOrderNumber++;
        }
    }
    items.OrderBy(i => i.ItemOrder).ToList();
}
OrderBy creates a new query that, when executed, does not alter your original list.
Why not just use the Sort method of the List?
items.Sort((a, b) => a.ItemOrder.CompareTo(b.ItemOrder));
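If you would rather keep the LINQ call, the ordered result has to be captured. A sketch; note that because items is a method parameter, the reassignment stays local to SortAndOrderItems, which is another reason the in-place Sort is the better fit here:

// OrderBy returns a new, lazily evaluated sequence and leaves the
// source list untouched; ToList() materializes the sorted copy,
// which is discarded unless you assign it.
items = items.OrderBy(i => i.ItemOrder).ToList();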
I think you were looking for List<>.Sort
class Cmp : IComparer<PredefinedJobsItem>
{
    public int Compare(PredefinedJobsItem x, PredefinedJobsItem y)
    {
        return x.ItemOrder.CompareTo(y.ItemOrder);
    }
}

var comparison = new Cmp();
items.Sort(comparison);

Cache only parts of an object

I'm trying to achieve a super-fast search and decided to rely heavily on caching to get there. The order of events is as follows:
1) Cache what can be cached (from entire database, around 3000 items)
2) When a search is performed, pull the entire result set out of the cache
3) Filter that result set based on the search criteria. Give each search result a "relevance" score.
4) Send the filtered results down to the database via xml to get the bits that can't be cached (e.g. prices)
5) Display the final results
This is all working and going at lightning speed, but in order to achieve (3) I've given each result a "relevance" score. This is just an integer member on each search result object. I iterate through the entire result set, update this score accordingly, then order by it at the end.
The problem I am having is that the "relevance" member retains its value from search to search. I assume this is because what I am updating is a reference to the search results in the cache, rather than a new object, so updating it also updates the cached version. What I'm looking for is a tidy solution to get around this. What I've come up with so far is either:
a) Clone the cache when I get it.
b) Create a separate dictionary to store relevances in and match them up at the end.
Am I missing a really obvious and clean solution, or should I go down one of these routes? I'm using C# and .NET.
Hopefully it should be obvious from the description what I'm getting at, but here's some code anyway. This first block is the iteration through the cached results that does the filtering:
private List<QuickSearchResult> performFiltering(string keywords, string regions, List<QuickSearchResult> cachedSearchResults)
{
    List<QuickSearchResult> filteredItems = new List<QuickSearchResult>();
    string upperedKeywords = keywords.ToUpper();
    string[] keywordsArray = upperedKeywords.Split(' ');
    string[] regionsArray = regions.Split(',');

    foreach (var item in cachedSearchResults)
    {
        // Check for keywords
        if (keywordsArray != null)
        {
            if (!item.ContainsKeyword(upperedKeywords, keywordsArray))
                continue;
        }

        // Check for regions
        if (regionsArray != null)
        {
            if (!item.IsInRegion(regionsArray))
                continue;
        }

        filteredItems.Add(item);
    }

    return filteredItems.OrderBy(t => t.Relevance).Take(_maxSearchResults).ToList<QuickSearchResult>();
}
and here is an example of the "IsInRegion" method of the QuickSearchResult object:
public bool IsInRegion(string[] regions)
{
    int relevanceScore = 0;
    foreach (var region in regions)
    {
        int parsedRegion = 0;
        if (int.TryParse(region, out parsedRegion))
        {
            foreach (var thisItemsRegion in this.Regions)
            {
                if (thisItemsRegion.ID == parsedRegion)
                    relevanceScore += 10;
            }
        }
    }
    Relevance += relevanceScore;
    return relevanceScore > 0;
}
And basically, if I search for "london" I get a score of 10 the first time, 20 the second time...
If you use the NetDataContractSerializer to serialize your objects in the cache, you can use the [DataMember] attribute to control what gets serialized and what doesn't. For instance, you could store your temporary calculated relevance value in a field that is not serialized.
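A sketch of that idea (Name and the Region stub are stand-ins for whatever QuickSearchResult really holds): only the members marked [DataMember] survive the serialization round-trip into and out of the cache, so a deserialized copy always comes back with Relevance at its default of 0.

using System.Collections.Generic;
using System.Runtime.Serialization;

[DataContract]
public class QuickSearchResult
{
    [DataMember]
    public string Name { get; set; }

    [DataMember]
    public List<Region> Regions { get; set; }

    // Not marked [DataMember]: never serialized, so the per-search
    // relevance score cannot leak back into the cached copy.
    public int Relevance { get; set; }
}

[DataContract]
public class Region // stand-in for the asker's region type
{
    [DataMember]
    public int ID { get; set; }
}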
