Is it possible to query a NotMapped property? - c#

I'm working with EF6 Code First, and I used this answer to map a List<string> in my entity.
This is my class
[Key]
public string SubRubro { get; set; }

[Column]
private string SubrubrosAbarcados
{
    get
    {
        return ListaEspecifica == null || !ListaEspecifica.Any() ? null : JsonConvert.SerializeObject(ListaEspecifica);
    }
    set
    {
        if (string.IsNullOrWhiteSpace(value))
            ListaEspecifica.Clear();
        else
            ListaEspecifica = JsonConvert.DeserializeObject<List<string>>(value);
    }
}

[NotMapped]
public List<string> ListaEspecifica { get; set; } = new List<string>();
It works perfectly to store my list as JSON, but now I need to perform a LINQ query, and I'm trying this
var c = db.CategoriaAccesorios.Where(c => c.ListaEspecifica.Contains("Buc")).First();
And it's throwing
System.NotSupportedException: The specified type member
'ListaEspecifica' is not supported in LINQ to Entities. Only
initializers, entity members, and entity navigation properties are
supported.
which makes sense.
Is there any way to perform a query like this?

The problem here is that LINQ to Entities does not understand how to convert your query to the back-end (SQL) language. Because you don't materialize the results (i.e. convert them to .NET objects) before filtering, LINQ to Entities tries to translate the whole query, including ListaEspecifica, into SQL. Since it doesn't know how to do that, you get a NotSupportedException.
If you materialize the query first (i.e. call .ToList()) and then filter, things will work fine, e.g. db.CategoriaAccesorios.ToList().Where(c => c.ListaEspecifica.Contains("Buc")).First(). I suspect this isn't what you want, though, since it pulls the whole table into memory.
As this answer explains, your issue is the EF to SQL Conversion. Obviously you want some way to workaround it, though.
Because you are JSON serializing, there are actually a couple options here, most particularly using a LIKE:
var c = (from category in db.CategoriaAccesorios
         where SqlMethods.Like(category.SubrubrosAbarcados, "%\"Buc\"%")
         select category).First();
(Note that SubrubrosAbarcados is private in your class, so for LINQ to see it you would need to make it at least internal, or fall back to raw SQL as below.) If you're on EF Core, Microsoft.EntityFrameworkCore.EF.Functions.Like should replace SqlMethods.Like.
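For illustration, a hedged EF Core sketch of that LIKE approach; EF.Property<string>() is used here so the query can reference the private SubrubrosAbarcados property by name (this assumes the property is mapped in the EF Core model, and the DbSet name follows the question):
using Microsoft.EntityFrameworkCore;

// Sketch only: EF.Functions.Like translates to SQL LIKE, and EF.Property<string>()
// references the private backing property by name (it must be mapped in the model).
var c = db.CategoriaAccesorios
    .Where(x => EF.Functions.Like(
        EF.Property<string>(x, "SubrubrosAbarcados"), "%\"Buc\"%"))
    .First();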
If you have SQL Server 2016+ and SubrubrosAbarcados holds JSON (SQL Server stores JSON in NVARCHAR columns and queries it with built-in JSON functions rather than a dedicated type), it should be possible to use a raw query to query the JSON column directly.
If you're curious about said aspect, here's a sample of what it could look like in SQL Server 2016:
CREATE TABLE Test (JsonData NVARCHAR(MAX))
INSERT INTO Test (JsonData) VALUES ('["Test"]'), ('["Something"]')
SELECT * FROM Test CROSS APPLY OPENJSON(JsonData, '$') WITH (Value VARCHAR(100) '$') AS n WHERE n.Value = 'Test'
DROP TABLE Test
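For completeness, a hedged sketch of issuing such a raw query from EF6 (the table name dbo.CategoriaAccesorios is an assumption; DbSet<T>.SqlQuery must return all mapped columns of the entity, and the results are tracked by default):
using System.Data.SqlClient;

// Sketch only: assumes SQL Server 2016+ and that SubrubrosAbarcados stores a JSON array.
var term = new SqlParameter("@term", "Buc");
var match = db.CategoriaAccesorios.SqlQuery(
    @"SELECT c.*
      FROM dbo.CategoriaAccesorios c
      CROSS APPLY OPENJSON(c.SubrubrosAbarcados, '$') WITH (Value NVARCHAR(100) '$') AS j
      WHERE j.Value = @term",
    term).FirstOrDefault();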

I was able to do something like this via CompiledExpression.
using Microsoft.Linq.Translations;
// (...) namespace, class, etc
private static readonly CompiledExpression<MyClass, List<string>> _myExpression =
    DefaultTranslationOf<MyClass>
        .Property(x => x.MyProperty)
        .Is(x => new List<string>());

[NotMapped]
public List<string> MyProperty
{
    get { return _myExpression.Evaluate(this); }
}
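For context, a hedged sketch of how such a translated property is consumed with Microsoft.Linq.Translations: the query has to be wrapped with WithTranslations() so the registered expression is inlined before LINQ to Entities sees it, and whatever the .Is(...) expression contains must itself be translatable to SQL (db and the MyClass set below are placeholders):
using System.Linq;
using Microsoft.Linq.Translations;

// Sketch only: WithTranslations() rewrites the expression tree, replacing MyProperty
// with the expression registered in _myExpression before EF translates the query.
var values = db.Set<MyClass>()
    .Select(x => x.MyProperty)
    .WithTranslations()
    .ToList();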
I hope there are better / prettier solutions though ;)

Related

Retrieve list of class from DB using Entity Framework filtered by a computed property?

I have a DB class with the following properties:
public string AwsS3Key { get; set; }
public bool PendingS3Upload => !string.IsNullOrEmpty(AwsS3Key);
I'm trying to pull a list of data from my context by filtering based on the PendingS3Upload property. However, I get the following error:
The specified type member 'PendingS3Upload' is not supported in LINQ to Entities. Only initializers, entity members, and entity navigation properties are supported.
How can I pull data from my table with respect to the "computed" property PendingS3Upload ?
You can mark your property with the [NotMapped] attribute. You won't be able to query it at the database level, though, only in memory after calling ToList() or AsEnumerable(), and in some cases that is out of the question due to large datasets etc.
If you need to query this on the database then assuming you're using SQL Server you can use computed columns.
Computed Column Info
If you want this to be Code First then you'll have to execute some SQL when you initialise the database or in a migration.
ALTER TABLE dbo.TableName
ADD PendingS3Upload AS (CAST(CASE WHEN AwsS3Key IS NOT NULL AND AwsS3Key <> '' THEN 1 ELSE 0 END AS BIT))
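If you are using Code First migrations, a hedged sketch of running that statement from a migration (the migration class name and dbo.TableName are placeholders):
using System.Data.Entity.Migrations;

public partial class AddPendingS3UploadColumn : DbMigration
{
    public override void Up()
    {
        // Computed column mirroring !string.IsNullOrEmpty(AwsS3Key)
        Sql(@"ALTER TABLE dbo.TableName
              ADD PendingS3Upload AS (CAST(CASE WHEN AwsS3Key IS NOT NULL AND AwsS3Key <> '' THEN 1 ELSE 0 END AS BIT))");
    }

    public override void Down()
    {
        Sql("ALTER TABLE dbo.TableName DROP COLUMN PendingS3Upload");
    }
}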
Then change your Class Property to
[DatabaseGenerated(DatabaseGeneratedOption.Computed)]
public bool PendingS3Upload { get; private set; }
(I can't remember if the private set will work)
LINQ isn't able to translate computed properties to an SQL statement which is why you're getting that error. You have two options for this. You can pull the data locally using ToList() and then apply the filter like this:
var result = Context.YourClass.ToList().Where(c => c.PendingS3Upload).ToList();
or you can translate the logic to something LINQ can handle like this:
var result = Context.YourClass.Where(c => !string.IsNullOrEmpty(c.AwsS3Key)).ToList();

How to query against non-mapped property in Entity Framework?

I am converting a mapped property's type via encapsulation using the process found here (Convert value when mapping). Like #Mykroft said, this prevents you from writing LINQ queries against the DB.
Using the example properties below, how can I tell LINQ to Entities to use IsActiveBool when writing the statement db.Employee.Where(b => b.IsActive); ?
[Column("IsActive")]
protected string IsActiveBool { get; set; }

[System.ComponentModel.DataAnnotations.Schema.NotMapped]
public bool IsActive
{
    get { return IsActiveBool == "Y"; }
    set { IsActiveBool = value ? "Y" : "N"; }
}
Using your simplistic example of db.Employee.Where(b => b.IsActive), you could write it like this:
db.Employee.ToList().Where(b => b.IsActive);
Keep in mind that this is going to pull back all rows in the Employee table and then filter on IsActive. If you want SQL server to do the filtering for you (which you should because this isn't a good solution for a table of any real size) then you need to write it as:
db.Employee.Where(b => b.IsActiveBool == "Y");
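Since IsActiveBool is protected in the question's class, a hedged alternative to repeating the "Y" comparison everywhere is a static, EF-translatable expression defined inside the entity itself (sketch only; assumes Employee can be made partial and that the protected property is mapped, as in the linked technique):
using System;
using System.Linq.Expressions;

public partial class Employee
{
    // Sketch: a reusable predicate over the mapped column property, defined where
    // the protected IsActiveBool is accessible, so callers never see the "Y"/"N" detail.
    public static readonly Expression<Func<Employee, bool>> IsActiveFilter =
        e => e.IsActiveBool == "Y";
}

// Usage: var active = db.Employee.Where(Employee.IsActiveFilter).ToList();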

Fluent NHibernate References with constants

I have a table mapped in Fluent NHibernate. This table must join to another table on an ID, but must also filter the joined values on that table against a set of constant values. Consider the following SQL:
SELECT *
FROM
Table1
INNER JOIN
Table2 ON
Table1.Table2Id = Table2.Id
AND Table2.Category = 'A constant expression'
AND Table2.Language = 'A constant expression'
My fluent mapping for Table1 currently looks like this:
References(a => a.Table2).Nullable().Columns("Table2Id").ReadOnly();
How can I implement the constant expressions?
It sounds like you could use filters to do this.
First, you need to define the filter types
public class SpecificCategoryFilter : FilterDefinition
{
    public SpecificCategoryFilter()
    {
        WithName("SpecificCategory").WithCondition("Category = 'A constant expression'");
    }
}

public class SpecificLanguageFilter : FilterDefinition
{
    public SpecificLanguageFilter()
    {
        WithName("SpecificLanguage").WithCondition("Language = 'A constant expression'");
    }
}
EDITED: As per the comments, there is no .ApplyFilter<TFilter>() on References(), so I have updated this with what I believe is the way to do it with filters.
The filters need to be applied in the fluent mappings
public class Table2Map : ClassMap<Table2>
{
    public Table2Map()
    {
        // Other mappings here...
        ApplyFilter<SpecificCategoryFilter>();
        ApplyFilter<SpecificLanguageFilter>();
    }
}
Finally, when you open a session, you need to enable the filters
using (var session = sessionFactory.OpenSession())
{
    session.EnableFilter("SpecificCategory");
    session.EnableFilter("SpecificLanguage");
}
If you're using an implementation of ICurrentSessionContext and the filters should always apply then you can enable the filters in the session returned from the call to ICurrentSessionContext.CurrentSession().
Now, when querying Table1, in order to activate the filters for Table2, you need to indicate to NHibernate to join to the referenced Table2; you can do this using
Fetch(t => t.Table2).Eager
JoinQueryOver(t => t.Table2) (and similar join strategies)
Without indicating to NHibernate to make the join, the reference will be lazily-loaded by default and hence the filters will not be applied in the query. The downside is that Table2 will be eager fetched but I don't know of a way to have the filters applied otherwise. The following query
session.QueryOver<Table1>().Inner.JoinQueryOver(t => t.Table2).List();
results in SQL similar to
SELECT
this_.Id as Id0_1_,
this_.Table2Id as Table3_0_1_,
table2_.Id as Id1_0_,
table2_.Category as Category1_0_,
table2_.Language as Language1_0_
FROM
Table1 this_
inner join
Table2 table2_
on this_.Table2Id=table2_.Id
WHERE
table2_.Category = 'A constant expression'
and table2_.Language = 'A constant expression'
which is akin to the SQL that you have in your question.
I have noticed that your mapping specifies Nullable and no eager fetching (by default the reference will be lazy loaded). So if you did want to generate the SQL you have shown in your comment, you would not be able to do it with a simple session.Get<Table1>(). Even if you changed the mapping so that it was like this:
References(a => a.Table2).Nullable().Columns("Table2Id").ReadOnly().Fetch.Join();
you would most likely end up with a left outer join in the generated SQL. The only way to force a fetch with an inner join (without downloading any extra NHibernate add-ons) would be to use session.QueryOver (you may also be able to do this with session.Query and the NHibernate LINQ extensions). If that is the case, then you may as well specify your set of constants inside the QueryOver query.
public class GetTableQuery
{
    private readonly string _category;
    private readonly string _language;

    public GetTableQuery(string category, string language)
    {
        _category = category;
        _language = language;
    }

    public IEnumerable<Table1> Execute(ISession session)
    {
        var returnList = session.QueryOver<Table1>()
            .Inner.JoinQueryOver(t1 => t1.Table2)
            .Where(t2 => t2.Category == _category && t2.Language == _language)
            .List();
        return returnList;
    }
}
I think the ApplyFilter method shown by Russ does make the model retrieval much simpler, and it's really good for code re-usability (if you have other tables with categories and languages), but since your table reference is a nullable reference, you have to use a query anyway. Maybe a combination of QueryOver with the filter would be best (assuming you can get the later version of Fluent NHibernate).
You might want to have a look into Formula(string formula), where you can provide plain SQL. Whether it is a good idea to filter data at the mapping level is another question IMHO... As an example, have a look here.
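For illustration, a hedged sketch of what a Formula-based mapping can look like in Fluent NHibernate (the Table2Category property and the SQL here are placeholders, not taken from the linked example):
// Sketch only: Formula() maps a property to an arbitrary SQL expression evaluated
// when Table1 is loaded, instead of to a physical column.
public class Table1Map : ClassMap<Table1>
{
    public Table1Map()
    {
        Id(x => x.Id);
        References(a => a.Table2).Nullable().Columns("Table2Id").ReadOnly();
        Map(x => x.Table2Category)
            .Formula("(select t2.Category from Table2 t2 where t2.Id = Table2Id and t2.Language = 'A constant expression')")
            .ReadOnly();
    }
}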

LINQ translates when not part of a property

I have a query like this:
var q = db.GetTable<Person>().Where(x => x.Employer.CEO != null);
This made-up query will return the ID of the CEO of the company a given person works for. This works fine and dandy, but if I do something like this, I get the "failed to translate to SQL" error:
public class Person
{
    public bool HasCEO
    {
        get
        {
            return this.Employer.CEO != null;
        }
    }
}
I want to be able to do this and wrap the longer expression within a property so that I don't have to repeat a nested table get:
var q = db.GetTable<Person>().Where(x => x.HasCEO);
How do I create LINQ properties to achieve my desired result?
I'm using C# 4.0 if that matters.
You can't. LINQ to SQL works by examining the expression tree of the query, and does not delve into the implementation of properties to determine whether a row meets your criteria.
What you can do is create a view in SQL server that dynamically calculates this property, and query that instead of the table.
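A hedged sketch of that view approach with LINQ to SQL attribute mapping (the view name dbo.PersonsWithCeoFlag and its columns are assumptions; the view itself would compute HasCEO from the Employer/CEO join on the SQL side):
using System.Data.Linq.Mapping;

// Sketch only: map a class to the view so HasCEO is a real column LINQ to SQL can filter on.
[Table(Name = "dbo.PersonsWithCeoFlag")]
public class PersonWithCeoFlag
{
    [Column(IsPrimaryKey = true)]
    public int PersonId { get; set; }

    [Column]
    public bool HasCEO { get; set; }
}

// Usage: var q = db.GetTable<PersonWithCeoFlag>().Where(x => x.HasCEO);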

How do I write one to many query in Dapper.Net?

I've written this code to project a one-to-many relation, but it's not working:
using (var connection = new SqlConnection(connectionString))
{
    connection.Open();
    IEnumerable<Store> stores = connection.Query<Store, IEnumerable<Employee>, Store>
        (@"Select Stores.Id as StoreId, Stores.Name,
           Employees.Id as EmployeeId, Employees.FirstName,
           Employees.LastName, Employees.StoreId
           from Store Stores
           INNER JOIN Employee Employees ON Stores.Id = Employees.StoreId",
        (a, s) => { a.Employees = s; return a; },
        splitOn: "EmployeeId");

    foreach (var store in stores)
    {
        Console.WriteLine(store.Name);
    }
}
Can anybody spot the mistake?
EDIT:
These are my entities:
public class Product
{
    public int Id { get; set; }
    public string Name { get; set; }
    public double Price { get; set; }
    public IList<Store> Stores { get; set; }

    public Product()
    {
        Stores = new List<Store>();
    }
}

public class Store
{
    public int Id { get; set; }
    public string Name { get; set; }
    public IEnumerable<Product> Products { get; set; }
    public IEnumerable<Employee> Employees { get; set; }

    public Store()
    {
        Products = new List<Product>();
        Employees = new List<Employee>();
    }
}
EDIT:
I changed the query to:
IEnumerable<Store> stores = connection.Query<Store, List<Employee>, Store>
    (@"Select Stores.Id as StoreId, Stores.Name, Employees.Id as EmployeeId,
       Employees.FirstName, Employees.LastName, Employees.StoreId
       from Store Stores INNER JOIN Employee Employees
       ON Stores.Id = Employees.StoreId",
    (a, s) => { a.Employees = s; return a; }, splitOn: "EmployeeId");
and I got rid of the exceptions! However, Employees are not mapped at all. I am still not sure what problem it had with IEnumerable<Employee> in the first query.
This post shows how to query a highly normalised SQL database, and map the result into a set of highly nested C# POCO objects.
Ingredients:
8 lines of C#.
Some reasonably simple SQL that uses some joins.
Two awesome libraries.
The insight that allowed me to solve this problem is to separate the MicroORM from mapping the result back to the POCO Entities. Thus, we use two separate libraries:
Dapper as the MicroORM.
Slapper.Automapper for mapping.
Essentially, we use Dapper to query the database, then use Slapper.Automapper to map the result straight into our POCOs.
Advantages
Simplicity. It's less than 8 lines of code. I find this a lot easier to understand, debug, and change.
Less code. A few lines of code is all Slapper.Automapper needs to handle anything you throw at it, even if we have a complex nested POCO (i.e. POCO contains List<MyClass1> which in turn contains List<MySubClass2>, etc).
Speed. Both of these libraries have an extraordinary amount of optimization and caching to make them run almost as fast as hand tuned ADO.NET queries.
Separation of concerns. We can change the MicroORM for a different one, and the mapping still works, and vice-versa.
Flexibility. Slapper.Automapper handles arbitrarily nested hierarchies, it isn't limited to a couple of levels of nesting. We can easily make rapid changes, and everything will still work.
Debugging. We can first see that the SQL query is working properly, then we can check that the SQL query result is properly mapped back to the target POCO Entities.
Ease of development in SQL. I find that creating flattened queries with inner joins to return flat results is much easier than creating multiple select statements, with stitching on the client side.
Optimized queries in SQL. In a highly normalized database, creating a flat query allows the SQL engine to apply advanced optimizations to the whole which would not normally be possible if many small individual queries were constructed and run.
Trust. Dapper is the back end for StackOverflow, and, well, Randy Burden is a bit of a superstar. Need I say any more?
Speed of development. I was able to do some extraordinarily complex queries, with many levels of nesting, and the dev time was quite low.
Fewer bugs. I wrote it once, it just worked, and this technique is now helping to power a FTSE company. There was so little code that there was no unexpected behavior.
Disadvantages
Scaling beyond 1,000,000 rows returned. Works well when returning < 100,000 rows. However, if we are bringing back >1,000,000 rows, in order to reduce the traffic between us and SQL server, we should not flatten it out using inner join (which brings back duplicates), we should instead use multiple select statements and stitch everything back together on the client side (see the other answers on this page).
This technique is query oriented. I haven't used this technique to write to the database, but I'm sure that Dapper is more than capable of doing this with some more extra work, as StackOverflow itself uses Dapper as its Data Access Layer (DAL).
Performance Testing
In my tests, Slapper.Automapper added a small overhead to the results returned by Dapper, which meant that it was still 10x faster than Entity Framework, and the combination is still pretty darn close to the theoretical maximum speed SQL + C# is capable of.
In most practical cases, most of the overhead would be in a less-than-optimum SQL query, and not with some mapping of the results on the C# side.
Performance Testing Results
Total number of iterations: 1000
Dapper by itself: 1.889 milliseconds per query, using 3 lines of code to return the dynamic.
Dapper + Slapper.Automapper: 2.463 milliseconds per query, using an additional 3 lines of code for the query + mapping from dynamic to POCO Entities.
Worked Example
In this example, we have a list of Contacts, and each Contact can have one or more phone numbers.
POCO Entities
public class TestContact
{
    public int ContactID { get; set; }
    public string ContactName { get; set; }
    public List<TestPhone> TestPhones { get; set; }
}

public class TestPhone
{
    public int PhoneId { get; set; }
    public int ContactID { get; set; } // foreign key
    public string Number { get; set; }
}
SQL Table TestContact
SQL Table TestPhone
Note that this table has a foreign key ContactID which refers to the TestContact table (this corresponds to the List<TestPhone> in the POCO above).
SQL Which Produces Flat Result
In our SQL query, we use as many JOIN statements as we need to get all of the data we need, in a flat, denormalized form. Yes, this might produce duplicates in the output, but these duplicates will be eliminated automatically when we use Slapper.Automapper to automatically map the result of this query straight into our POCO object map.
USE [MyDatabase];
SELECT tc.[ContactID] as ContactID
,tc.[ContactName] as ContactName
,tp.[PhoneId] AS TestPhones_PhoneId
,tp.[ContactId] AS TestPhones_ContactId
,tp.[Number] AS TestPhones_Number
FROM TestContact tc
INNER JOIN TestPhone tp ON tc.ContactId = tp.ContactId
C# code
const string sql = @"SELECT tc.[ContactID] as ContactID
    ,tc.[ContactName] as ContactName
    ,tp.[PhoneId] AS TestPhones_PhoneId
    ,tp.[ContactId] AS TestPhones_ContactId
    ,tp.[Number] AS TestPhones_Number
FROM TestContact tc
INNER JOIN TestPhone tp ON tc.ContactId = tp.ContactId";

string connectionString = // -- Insert SQL connection string here.

using (var conn = new SqlConnection(connectionString))
{
    conn.Open();
    // Can set default database here with conn.ChangeDatabase(...)
    {
        // Step 1: Use Dapper to return the flat result as a Dynamic.
        dynamic test = conn.Query<dynamic>(sql);

        // Step 2: Use Slapper.Automapper for mapping to the POCO Entities.
        // - IMPORTANT: Let Slapper.Automapper know how to do the mapping;
        //   let it know the primary key for each POCO.
        // - Must also use underscore notation ("_") to name parameters in the SQL query;
        //   see Slapper.Automapper docs.
        Slapper.AutoMapper.Configuration.AddIdentifiers(typeof(TestContact), new List<string> { "ContactID" });
        Slapper.AutoMapper.Configuration.AddIdentifiers(typeof(TestPhone), new List<string> { "PhoneID" });

        var testContact = (Slapper.AutoMapper.MapDynamic<TestContact>(test) as IEnumerable<TestContact>).ToList();

        foreach (var c in testContact)
        {
            foreach (var p in c.TestPhones)
            {
                Console.Write("ContactName: {0}: Phone: {1}\n", c.ContactName, p.Number);
            }
        }
    }
}
Output
POCO Entity Hierarchy
Looking in Visual Studio, we can see that Slapper.Automapper has properly populated our POCO Entities, i.e. we have a List<TestContact>, and each TestContact has a List<TestPhone>.
Notes
Both Dapper and Slapper.Automapper cache everything internally for speed. If you run into memory issues (very unlikely), ensure that you occasionally clear the cache for both of them.
Ensure that you name the columns coming back, using the underscore (_) notation to give Slapper.Automapper clues on how to map the result into the POCO Entities.
Ensure that you give Slapper.Automapper clues on the primary key for each POCO Entity (see the lines Slapper.AutoMapper.Configuration.AddIdentifiers). You can also use Attributes on the POCO for this. If you skip this step, then it could go wrong (in theory), as Slapper.Automapper would not know how to do the mapping properly.
Update 2015-06-14
Successfully applied this technique to a huge production database with over 40 normalized tables. It worked perfectly to map an advanced SQL query with over 16 inner joins and left joins into the proper POCO hierarchy (with 4 levels of nesting). The queries are blindingly fast, almost as fast as hand coding it in ADO.NET (typically 52 milliseconds for the query, and 50 milliseconds for the mapping from the flat result into the POCO hierarchy). This is really nothing revolutionary, but it sure beats Entity Framework for speed and ease of use, especially if all we are doing is running queries.
Update 2016-02-19
Code has been running flawlessly in production for 9 months. The latest version of Slapper.Automapper has all of the changes that I applied to fix the issue related to nulls being returned in the SQL query.
Update 2017-02-20
Code has been running flawlessly in production for 21 months, and has handled continuous queries from hundreds of users in a FTSE 250 company.
Slapper.Automapper is also great for mapping a .csv file straight into a list of POCOs. Read the .csv file into a list of IDictionary, then map it straight into the target list of POCOs. The only trick is that you have to add a property int Id { get; set; } and make sure it's unique for every row (or else the automapper won't be able to distinguish between the rows).
Update 2019-01-29
Minor update to add more code comments.
See: https://github.com/SlapperAutoMapper/Slapper.AutoMapper
I wanted to keep it as simple as possible; my solution:
public List<ForumMessage> GetForumMessagesByParentId(int parentId)
{
    var sql = @"
        select d.id_data as Id, d.cd_group As GroupId, d.cd_user as UserId, d.tx_login As Login,
        d.tx_title As Title, d.tx_message As [Message], d.tx_signature As [Signature], d.nm_views As Views, d.nm_replies As Replies,
        d.dt_created As CreatedDate, d.dt_lastreply As LastReplyDate, d.dt_edited As EditedDate, d.tx_key As [Key]
        from t_data d
        where d.cd_data = @DataId order by id_data asc;

        select d.id_data As DataId, di.id_data_image As DataImageId, di.cd_image As ImageId, i.fl_local As IsLocal
        from t_data d
        inner join T_data_image di on d.id_data = di.cd_data
        inner join T_image i on di.cd_image = i.id_image
        where d.id_data = @DataId and di.fl_deleted = 0 order by d.id_data asc;";

    var mapper = _conn.QueryMultiple(sql, new { DataId = parentId });

    var messages = mapper.Read<ForumMessage>().ToDictionary(k => k.Id, v => v);
    var images = mapper.Read<ForumMessageImage>().ToList();

    foreach (var imageGroup in images.GroupBy(g => g.DataId))
    {
        messages[imageGroup.Key].Images = imageGroup.ToList();
    }

    return messages.Values.ToList();
}
I still make one call to the database, and while I now execute two queries instead of one, the second query uses an INNER JOIN instead of a less optimal LEFT JOIN.
A slight modification of Andrew's answer that utilizes a Func to select the parent key instead of GetHashCode.
public static IEnumerable<TParent> QueryParentChild<TParent, TChild, TParentKey>(
    this IDbConnection connection,
    string sql,
    Func<TParent, TParentKey> parentKeySelector,
    Func<TParent, IList<TChild>> childSelector,
    dynamic param = null, IDbTransaction transaction = null, bool buffered = true, string splitOn = "Id", int? commandTimeout = null, CommandType? commandType = null)
{
    Dictionary<TParentKey, TParent> cache = new Dictionary<TParentKey, TParent>();

    connection.Query<TParent, TChild, TParent>(
        sql,
        (parent, child) =>
        {
            if (!cache.ContainsKey(parentKeySelector(parent)))
            {
                cache.Add(parentKeySelector(parent), parent);
            }

            TParent cachedParent = cache[parentKeySelector(parent)];
            IList<TChild> children = childSelector(cachedParent);
            children.Add(child);
            return cachedParent;
        },
        param as object, transaction, buffered, splitOn, commandTimeout, commandType);

    return cache.Values;
}
Example usage
conn.QueryParentChild<Product, Store, int>("sql here", prod => prod.Id, prod => prod.Stores)
According to this answer there is no one to many mapping support built into Dapper.Net. Queries will always return one object per database row. There is an alternative solution included, though.
Here is another method:
Order (one) - OrderDetail (many)
using (var connection = new SqlCeConnection(connectionString))
{
    var orderDictionary = new Dictionary<int, Order>();

    var list = connection.Query<Order, OrderDetail, Order>(
        sql,
        (order, orderDetail) =>
        {
            Order orderEntry;

            if (!orderDictionary.TryGetValue(order.OrderID, out orderEntry))
            {
                orderEntry = order;
                orderEntry.OrderDetails = new List<OrderDetail>();
                orderDictionary.Add(orderEntry.OrderID, orderEntry);
            }

            orderEntry.OrderDetails.Add(orderDetail);
            return orderEntry;
        },
        splitOn: "OrderDetailID")
    .Distinct()
    .ToList();
}
Source: http://dapper-tutorial.net/result-multi-mapping#example---query-multi-mapping-one-to-many
Here is a crude workaround
public static IEnumerable<TOne> Query<TOne, TMany>(this IDbConnection cnn, string sql, Func<TOne, IList<TMany>> property, dynamic param = null, IDbTransaction transaction = null, bool buffered = true, string splitOn = "Id", int? commandTimeout = null, CommandType? commandType = null)
{
    var cache = new Dictionary<int, TOne>();

    cnn.Query<TOne, TMany, TOne>(sql, (one, many) =>
    {
        if (!cache.ContainsKey(one.GetHashCode()))
            cache.Add(one.GetHashCode(), one);

        var localOne = cache[one.GetHashCode()];
        var list = property(localOne);
        list.Add(many);

        return localOne;
    }, param as object, transaction, buffered, splitOn, commandTimeout, commandType);

    return cache.Values;
}
It's by no means the most efficient way, but it will get you up and running. I'll try and optimise this when I get a chance.
use it like this:
conn.Query<Product, Store>("sql here", prod => prod.Stores);
bear in mind your objects need to implement GetHashCode, perhaps like this:
public override int GetHashCode()
{
    return this.Id.GetHashCode();
}
