I need some help with RavenDB.
On my web page I want to have a list like this:
item1 category1
item2 category2
...
and another one:
category1, number of items
category2, number of items
...
My data structures:
public class Item
{
public string Id { get; set; }
public string Name { get; set; }
public string CategoryId { get; set; }
}
public class Category
{
public string Id { get; set; }
public string Name { get; set; }
public bool IsActive { get; set; }
}
Index for the first list:
public class Item_WithCategory : AbstractIndexCreationTask<Item>
{
public class Result
{
public string Name { get; set; }
public string CategoryName { get; set; }
}
public Item_WithCategory()
{
Map = items => from item in items
select new
{
Name = item.Name,
CategoryName = LoadDocument<Category>(item.CategoryId).Name
};
}
}
Is this data structure suitable for my case, or would it be better to have a Category instead of a CategoryId in the Item structure?
Should I use my index, or is there a better way to get the category name?
If my index is fine, how do I write a correct query? My current attempt:
Item_WithCategory.Result[] all;
using (var session = DocumentStoreHolder.Store.OpenSession())
{
all = session.Query<Item_WithCategory.Result, Item_WithCategory>().ToArray();
}
but it throws an exception stating that the return type is Item, not Result. How do I fix it?
You have a couple of options here. You could store both the CategoryId and the CategoryName on the Item entity. This will of course lead to duplicated data (if you still need to store the Category entity), but "storage is cheap" is a popular saying these days. The downside is that you need to update each Item document of a given category if the category name changes, to keep things consistent. The benefit is that you need to do less work to get your desired result.
If you store the category name on the item as well, you don't need a special index to handle the first list; just query on the Items and return what you need. For the second list you need to create a Map/Reduce index on the Item entity that groups on the category, as sketched below.
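For example, if the Item document also carried the category name, a Map/Reduce index for the second list could look roughly like this (just a sketch: it assumes a denormalized Item.CategoryName property, and the index name is made up):
public class Items_ByCategoryName_Count : AbstractIndexCreationTask<Item, Items_ByCategoryName_Count.Result>
{
    public class Result
    {
        public string CategoryName { get; set; }
        public int Count { get; set; }
    }

    public Items_ByCategoryName_Count()
    {
        // Assumes Item has been denormalized to carry CategoryName.
        Map = items => from item in items
                       select new
                       {
                           item.CategoryName,
                           Count = 1
                       };

        Reduce = results => from result in results
                            group result by result.CategoryName into g
                            select new
                            {
                                CategoryName = g.Key,
                                Count = g.Sum(x => x.Count)
                            };
    }
}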
However, if you need to keep the data structure you've given, there are a couple of ways of solving this. First, it's really not recommended to use LoadDocument inside an index definition, especially not in a select statement, as it can affect indexing performance in a negative way.
Instead, just index the properties you need to query on (or use an auto index) and then use a Result Transformer to fetch information from related documents:
public class ItemCategoryTransformer : AbstractTransformerCreationTask<Item>
{
public ItemCategoryTransformer()
{
TransformResults = results => from item in results
let category = LoadDocument<Category>(item.CategoryId)
select new ItemCategoryViewModel
{
Name = item.Name,
CategoryName = category.Name
};
}
}
public class ItemCategoryViewModel
{
public string Name { get; set; }
public string CategoryName { get; set; }
}
You can use this Transformer with a Query on the Item entity:
using (var session = documentStore.OpenSession())
{
var items = session.Query<Item>()
.TransformWith<ItemCategoryTransformer, ItemCategoryViewModel>()
.ToList();
}
As for the second list, still using your data structure, you need a couple of things. First, a Map/Reduce index over the Items, grouped by CategoryId:
public class Category_Items_Count : AbstractIndexCreationTask<Item, Category_Items_Count.Result>
{
public class Result
{
public string CategoryId { get; set; }
public int Count { get; set; }
}
public Category_Items_Count()
{
Map = items => from item in items
select new Result
{
CategoryId = item.CategoryId,
Count = 1
};
Reduce = results => from result in results
group result by result.CategoryId
into c
select new Result
{
CategoryId = c.Key,
Count = c.Sum(x => x.Count)
};
}
}
But as you only have the CategoryId on the Item entity, you have to use a transformer similar to the one used for the first list:
public class CategoryItemsCountTransformer : AbstractTransformerCreationTask<Category_Items_Count.Result>
{
public CategoryItemsCountTransformer()
{
TransformResults = results => from result in results
let category = LoadDocument<Category>(result.CategoryId)
select new CategoryItemsCountViewModel
{
CategoryName = category.Name,
NumberOfItems = result.Count
};
}
}
public class CategoryItemsCountViewModel
{
public string CategoryName { get; set; }
public int NumberOfItems { get; set; }
}
And lastly, query for it like this:
using (var session = documentStore.OpenSession())
{
var items = session.Query<Category_Items_Count.Result, Category_Items_Count>()
.TransformWith<CategoryItemsCountTransformer, CategoryItemsCountViewModel>()
.ToList();
}
As you can see, there is quite a difference in the amount of work needed depending on which data structure you use. If you stored the category name on the Item entity directly, you wouldn't need any Result Transformers to achieve the results you're after (only a Map/Reduce index).
However, Result Transformers are executed server side and only on request, whereas a LoadDocument inside an index is evaluated every time indexing occurs. Also, and this may be why LoadDocument inside index definitions isn't recommended, every change to a document referenced with LoadDocument forces the documents that reference it to be re-indexed. This can mean a lot of work for the indexing engine.
Lastly, to answer your question about why you get an exception when querying: the actual return type of your index is the document being indexed (in this case Item). To return something else you need to project the result, which can be done with .As() in the query:
Item_WithCategory.Result[] all;
using (var session = DocumentStoreHolder.Store.OpenSession())
{
all = session.Query<Item_WithCategory.Result, Item_WithCategory>()
.As<Item_WithCategory.Result>()
.ToArray();
}
Long post, but hope it helps!
I am trying to extract a list of Categories with the corresponding Tickets for a specific user id, using a LINQ lambda expression.
Category:
public class Category
{
public int Id { get; set; }
public string CategoryName { get; set; }
public ICollection<Ticket> Tickets { get; set; }
}
Ticket:
public class Ticket
{
public int Id { get; set; }
public string Title { get; set; }
public User User { get; set; }
public Category Category { get; set; }
}
User:
public class User
{
public string Id { get; set; }
public string UserName { get; set; }
public ICollection<Ticket> Tickets { get; set; }
}
This is what I tried so far. But it is not even close.
public async Task<IEnumerable<Category>> GetCategories(string userId)
{
    var categories = _context.Categories
        .Include(c => c.Tickets)
        .AsQueryable();

    categories = categories
        .Where(c => c.Tickets.Any(t => t.User.Id.Equals(userId)));

    return await categories.ToListAsync();
}
How can I have a list of categories with corresponding tickets for a specific userId?
The issue is this line of code:
categories = categories.Where(c => c.Tickets.Any(t => t.User.Id.Equals(userId)));
This will return all categories that contain at least 1 ticket with the specified user id, but all tickets in the category will be included. My understanding is, you want to keep only the tickets that belong to the specified user id.
The fact that Tickets is declared as an ICollection<Ticket> rather than an IEnumerable<Ticket> makes this a bit more awkward, since the filtered LINQ result is an IEnumerable<Ticket>. That's the reason for the ToList() call below, which causes enumeration; something to be aware of is that the Tickets collection is no longer lazy past that point.
This code will do what you want:
IEnumerable<Category> selection = (from c in categories
                                   select new Category
                                   {
                                       Id = c.Id,
                                       CategoryName = c.CategoryName,
                                       Tickets = (from t in c.Tickets
                                                  where t.User.Id.Equals(userId)
                                                  select t).ToList()
                                   }).Where(c => c.Tickets.Count() > 0);
but, I think, both performance-wise and readability-wise, you might want a loop instead:
List<Category> selection = new List<Category>();
foreach (var category in categories)
{
category.Tickets = category.Tickets.Where(t => t.User.Id.Equals(userId)).ToList();
if (category.Tickets.Count > 0)
{
selection.Add(category);
}
}
Finally, since your Ticket class carries the Category info anyway, it may be worth flattening the list using SelectMany:
var selection = categories.SelectMany(c => c.Tickets.Where(t => t.User.Id.Equals(userId)));
This returns a flattened list of Tickets, but only the ones that match the specified user id, from all categories. This has the advantage of staying lazy (it's still an IEnumerable), being simple, and very likely performing better than the other options; but you no longer have the nested list.
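Because each Ticket still references its Category, you can read the category straight off the flattened results when you display them, for example (a minimal sketch):
foreach (var ticket in selection)
{
    // Each ticket carries its own category, so no nesting is needed for display.
    Console.WriteLine(ticket.Category.CategoryName + ": " + ticket.Title);
}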
Take your pick!
I have Places; each place can have many tags, and each tag can be assigned to many places.
public class Place {
public int Id { get; set; }
public string PlaceName { get; set; }
public IEnumerable<Tag> Tags { get; set; }
}
public class Tag {
public int Id { get; set; }
public string TagName { get; set; }
}
public class TagPlace {
    public int Id { get; set; }
    public int PlaceId { get; set; }
    public int TagId { get; set; }
}
The database has equivalent tables with foreign keys as appropriate.
I want to get a collection of Places, and I want each Place to have the appropriate collection of Tags. I guess using LINQ might be required.
I've found various articles on this, but they aren't quite the same / deal with a list of ints rather than two collections of objects.
eg
https://social.msdn.microsoft.com/Forums/en-US/fda19d75-b2ac-4fb1-801b-4402d4bd5255/how-to-do-in-linq-quotselect-from-employee-where-id-in-101112quot?forum=linqprojectgeneral
LINQ Where in collection clause
What's the best way of doing this?
The classical approach with Dapper is to use a Dictionary to store the main objects while the query enumerates the records
public IEnumerable<Place> SelectPlaces()
{
string query = @"SELECT p.id, p.PlaceName, t.id, t.tagname
FROM Place p INNER JOIN TagPlace tp ON tp.PlaceId = p.Id
INNER JOIN Tag t ON tp.TagId = t.Id";
var result = default(IEnumerable<Place>);
Dictionary<int, Place> lookup = new Dictionary<int, Place>();
using (IDbConnection connection = GetOpenedConnection())
{
// Each record is passed to the delegate where p is an instance of
// Place and t is an instance of Tag, delegate should return the Place instance.
result = connection.Query<Place, Tag, Place>(query, (p, t) =>
{
// Check if we have already stored the Place in the dictionary
if (!lookup.TryGetValue(p.Id, out Place placeFound))
{
    // The dictionary doesn't have that Place yet:
    // add it to the dictionary and set the variable
    // whose Tags list we will add each Tag to.
    lookup.Add(p.Id, p);
    placeFound = p;
    // It is probably better to initialize the collection
    // directly in the class.
    placeFound.Tags = new List<Tag>();
}
// Add the tag to the current Place. Tags is declared as
// IEnumerable<Tag>, so cast it back to the List<Tag> created above.
((List<Tag>)placeFound.Tags).Add(t);
return placeFound;
}, splitOn: "id")
.Distinct();
// splitOn tells Dapper how to split each record into the two
// instances required; it is not strictly needed here because "Id" is the default.
// Distinct removes the duplicate Place references (one per joined row); it works
// because the same Place instance is returned for every row of a given place.
}
return result;
}
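Usage would then be along these lines (a sketch; "repository" stands for whatever class hosts SelectPlaces()):
foreach (Place place in repository.SelectPlaces())
{
    Console.WriteLine(place.PlaceName);
    foreach (Tag tag in place.Tags)
    {
        Console.WriteLine("  " + tag.TagName);
    }
}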
I wonder if someone could spare me a few minutes to give me some advice please?
I've created an IEnumerable list:
public class EmailBlock
{
public int alertCategory { get; set; }
public string alertName { get; set; }
public string alertURL { get; set; }
public string alertSnippet { get; set; } //Need to work out the snippet
}
List<EmailBlock> myEmailData = new List<EmailBlock>();
I then loop through some data (Umbraco content, not that that's really relevant!) and add items to the list:
myEmailData.Add(new EmailBlock { alertCategory = category.Id, alertName = alert.GetPropertyValue("pageTitle"), alertURL = alert.NiceUrl });
Ultimately, I'd like to group the list by alertCategory and then load each 'group' (another loop occurs later to check which members have subscribed to which alert category) into a variable that I can then use as an email's content.
You could use Linq's GroupBy() to do this:
using System.Linq;
...
//Create a type to hold your grouped emails
public class GroupedEmail
{
public int AlertCategory { get; set; }
public IEnumerable<EmailBlock> EmailsInGroup {get; set; }
}
var grouped = myEmailData
.GroupBy(e => e.alertCategory)
.Select(g => new GroupedEmail
{
AlertCategory = g.Key,
EmailsInGroup = g
});
You can select to an anonymous type if required and project your sequence into whatever structure you require.
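For example, with an anonymous type instead of the GroupedEmail class (a sketch):
var grouped = myEmailData
    .GroupBy(e => e.alertCategory)
    .Select(g => new { AlertCategory = g.Key, Emails = g.ToList() });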
Linq has a nice group by statement:
var emailGroup = emailList.GroupBy(e => e.alertCategory);
Then you can loop through each grouping and do whatever you want:
foreach(var grouping in emailGroup)
{
//do whatever you want here.
//note grouping will access the list of grouped items, grouping.Key will show the grouped by field
}
Edit:
To retrieve a group after you have grouped them, just use Where for more than one or First for just one:
var group = emailGroup.First(g => g.Key == categoryYouAreLookingFor); // the key here is the int alertCategory
or
var groups = emailGroup.Where(g => listOfWantedKeys.Contains(g.Key));
This is a lot more efficient than looping through the whole collection every time you need to find something.
At the moment, I have multiple tables in my Database with slightly varying columns to define different "history" elements for an item.
So I have my Item table:
public int ItemId { get; set; }
public string Name { get; set; }
public Location Loc { get; set; }
public int Quantity { get; set; }
I can do a few things to these items, like Move, Increase Quantity, Decrease Quantity, Book to a Customer, "Pick" an item, and so on. So I have made multiple "history tables", as they have different values to save, e.g.:
public class MoveHistory
{
public int MoveHistoryId { get; set; }
public DateTime Date { get; set; }
public Item Item { get; set; }
public virtual Location Location1Id { get; set; }
public virtual Location Location2Id { get; set; }
}
public class PickingHistory
{
public int PickingHistoryId { get; set; }
public DateTime Date { get; set; }
public Item Item { get; set; }
public int WorksOrderCode { get; set; }
}
This is fine, apart from when I want to show a complete history for an item, displayed in a list:
Item 123 was moved on 23/02/2013 from Location1 to Location2
Item 123 was picked on 24/02/2013 from work order 421
I am using Entity Framework, .NET 4.5, WPF, and querying with LINQ, but I cannot figure out a way to take these lists of history elements and order them, one by one, by date.
I can think of messy ways, like a single history table with columns that are only used when required, or creating a third list containing the date and which list each entry came from, then cycling through that list and picking the corresponding contents from the corresponding source list. However, I feel there must be a better way!
Any help would be appreciated.
If you implement a GetDescription() method on your history items (even as an extension method), you can do this:
db.PickingHistory.Where(ph => ph.Item.ItemId == 123)
  .Select(ph => new { Time = ph.Date, Description = ph.GetDescription() })
  .Concat(db.MoveHistory.Where(mh => mh.Item.ItemId == 123)
            .Select(mh => new { Time = mh.Date, Description = mh.GetDescription() }))
  .OrderByDescending(e => e.Time)
  .Select(e => e.Description);
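A GetDescription() could be as simple as a pair of extension methods along these lines (just a sketch, with made-up wording; note that Entity Framework can't translate an arbitrary method like this into SQL, so you would need to switch to in-memory LINQ, e.g. via AsEnumerable(), before projecting with it):
public static class HistoryDescriptionExtensions
{
    public static string GetDescription(this PickingHistory ph)
    {
        return "was picked on " + ph.Date.ToShortDateString() +
               " from work order " + ph.WorksOrderCode;
    }

    public static string GetDescription(this MoveHistory mh)
    {
        // Assumes Location prints something meaningful (e.g. overrides ToString()).
        return "was moved on " + mh.Date.ToShortDateString() +
               " from " + mh.Location1Id + " to " + mh.Location2Id;
    }
}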
The problem you are facing is that you're trying to use your database model as a display model, and that obviously isn't working. You need to create a new class that represents your history grid and then populate it from your various queries. From your example output the display model might be:
public class HistoryRow{
public DateTime EventDate { get; set; }
public string ItemName { get; set; }
public string Action { get; set; }
public string Detail { get; set; }
}
You then load the data into this display model:
var historyRows = new List<HistoryRow>();

var pickingRows = _db.PickingHistory.Select(ph => new HistoryRow {
    EventDate = ph.Date,
    ItemName = ph.Item.Name,
    Action = "picked",
    Detail = "from works order " + ph.WorksOrderCode
});
historyRows.AddRange(pickingRows);

var movingRows = _db.MoveHistory.Select(mh => new HistoryRow {
    EventDate = mh.Date,
    ItemName = mh.Item.Name,
    Action = "moved",
    Detail = "from location " + mh.Location1Id + " to location " + mh.Location2Id
});
historyRows.AddRange(movingRows);
You can keep adding rows from the various tables to build up one big list of HistoryRow entries, then order that list and display the values as you wish.
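For example, to order the combined list by date (oldest first, matching your example output) before displaying it:
historyRows = historyRows.OrderBy(h => h.EventDate).ToList();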
foreach (var historyRow in historyRows)
{
    var rowAsString = historyRow.ItemName + " was " + historyRow.Action + " on " +
                      historyRow.EventDate.ToShortDateString() + " " + historyRow.Detail;
    Console.WriteLine(rowAsString);
}
If you are implementing this in order to provide some sort of undo/redo history, then I think you're going about it the wrong way. Normally you would have a single collection of ICommand objects with their associated parameter values, i.e. you store the operations that have occurred, and you would then be able to filter that collection for each item individually (see the sketch below).
If you're not trying to implement some sort of undo/redo history, then I have misunderstood your question and you can ignore this.
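For what it's worth, such a command log might look roughly like this (a sketch; all names are made up):
// One shared log of executed commands, filterable per item.
public class RecordedOperation
{
    public DateTime Date { get; set; }
    public int ItemId { get; set; }
    public ICommand Command { get; set; }   // the command that was executed
    public object Parameter { get; set; }   // the parameter it was executed with
}

public class OperationLog
{
    private readonly List<RecordedOperation> _operations = new List<RecordedOperation>();

    public void Record(RecordedOperation operation)
    {
        _operations.Add(operation);
    }

    // History for a single item, oldest first.
    public IEnumerable<RecordedOperation> ForItem(int itemId)
    {
        return _operations.Where(o => o.ItemId == itemId).OrderBy(o => o.Date);
    }
}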
I have the following entity collections in RavenDB:
public class EntityA
{
public string Id { get; set; }
public string Name { get; set; }
public string[] Tags { get; set; }
}
public class EntityB
{
public string Id { get; set; }
public string Name { get; set; }
public string[] Tags { get; set; }
}
The only thing shared is the Tags collection: a tag of EntityA may exist in EntityB, so that they may intersect.
How can I retrieve every EntityA that has intersecting tags with EntityB where the Name property of EntityB is equal to a given value?
Well, this is a difficult one. To do it right, you would need two levels of reducing - one by the tag which would expand out your results, and another by the id to collapse it back. Raven doesn't have an easy way to do this.
You can fake it out though using a Transform. The only problem is that you will have skipped items in your result set, so make sure you know how to deal with those.
public class TestIndex : AbstractMultiMapIndexCreationTask<TestIndex.Result>
{
public class Result
{
public string[] Ids { get; set; }
public string Name { get; set; }
public string Tag { get; set; }
}
public TestIndex()
{
AddMap<EntityA>(entities => from a in entities
from tag in a.Tags.DefaultIfEmpty("_")
select new
{
Ids = new[] { a.Id },
Name = (string) null,
Tag = tag
});
AddMap<EntityB>(entities => from b in entities
from tag in b.Tags
select new
{
Ids = new string[0],
b.Name,
Tag = tag
});
Reduce = results => from result in results
group result by result.Tag
into g
select new
{
Ids = g.SelectMany(x => x.Ids),
g.First(x => x.Name != null).Name,
Tag = g.Key
};
TransformResults = (database, results) =>
results.SelectMany(x => x.Ids)
.Distinct()
.Select(x => database.Load<EntityA>(x));
}
}
See also the full unit test here.
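For completeness, querying that index for a given EntityB name might then look something like this (a sketch, using the same client API as above; "some name" is a placeholder):
using (var session = documentStore.OpenSession())
{
    // The index transform loads the matching EntityA documents for us.
    var matches = session.Query<TestIndex.Result, TestIndex>()
                         .Where(r => r.Name == "some name")
                         .As<EntityA>()
                         .ToList();
}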
There is another approach, but I haven't tested it yet. That would be to use the Indexed Properties Bundle to do the first pass, and then map those results for the second pass. I am experimenting with this in general, and if it works, I will update this answer with the results.