Okay, what do I need?
I'm looking for a class (an object created from a fixed schema) that has a maximal set of firmly defined fields, plus a dynamic way to "use" (create, read, update, delete) a "sub-object" of it.
Like this:
public class Books
{
    public int Id;
    public string Title;
    public string Isbn;
    public int Pages;
    public int Price;
    public string Author;
    public string DescriptionSmall;
    public string DescriptionLong;
    public string Publisher;

    // constructor:
    public Books(int id, string title, string isbn, ...) {
        Id = id;
        Title = title;
        Isbn = isbn;
        ...
        // only the fields that were set should be usable
    }

    // add fields (only pre-defined ones should be possible)
    public bool Add(Dictionary<string, object> fields) { // void or bool; or a List<string> overload for field names only (without values)
        // add fields
    }

    // get a sub-object back (only pre-defined fields should be possible)
    public object Get(Dictionary<string, object> fields) {
        // return as a sub-object
    }

    // delete fields
    public bool Delete(Dictionary<string, object> fields) { // void or bool
        // delete fields
    }

    // update fields
    public void Update(...) {
        ...
    }
}
// and then I can use it like an object, created as an instance or whatever. :/
var smallBooks = new Books(id: 1, title: "Lord of the Weed"); // can use all methods, but for the moment only on the fields that were set; more fields can be added via Add()...
I don't want to have hundreds of models/entities for all possible field combinations.
The problem is that I'm trying to update a database via a GraphQL server.
Is there a way to return a part of the object that is itself a "sub-object"?
Yeah, I know, I could also create a dynamic object with the help of ExpandoObject (see the sketch below), or create a Collection/Dictionary to send.
It's important that the unused fields are not just NULL, because some fields in my database are nullable and treat NULL as a real value.
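For example, an ExpandoObject only contains the members that were actually set, so "field not sent" and "field sent as NULL" stay distinguishable (a rough sketch; the member names are just examples):
using System.Collections.Generic;
using System.Dynamic;

dynamic patch = new ExpandoObject();
patch.Id = 1;
patch.DescriptionSmall = null;                      // deliberately sending NULL

var fields = (IDictionary<string, object>)patch;
bool titleIncluded = fields.ContainsKey("Title");   // false -> Title is simply absent from the update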
-------------------------[Addition: 2021-07-27]-------------------------
Okay, I obviously expressed myself ambiguously. I have a table in the DB and the fields in it are firmly defined. When updating data, however, I only need a few fields from the complete list of all fields. The problem: I would like to keep the selection of fields dynamic instead of having to create numerous entities or DTOs alongside the main model.
I'm looking for a simple way to create a dynamic sub-object in the code that contains the same methods, but only a freely selectable subset of the total fields.
Within my project it happens from time to time that data should be collected and processed before that data is finally sent to the server as a dynamic subset, e.g. as an update (GraphQL update mutation). An existing data set (main model) should, as it were, reflect the current status of the database. But only individual fields are required for an update.
I just want to avoid having to create countless classes for all combinations and choices.
Is there a way to derive a dynamic partial selection from the main class that contains, for example, only 1 to x fields, but possibly keeps a reference to the main class and its fields (kept in sync)? In other words: create an instance of the main class and dynamically derive further sub-objects from that object.
So far I have simply created a Dictionary<string, object> with field names and values and used mainDict.Keys or mainDict.Select(f => f.Key / f.Value) on that selection in a rather involved method to build a "small copy" that updates only selected fields (by uploading it to the server), roughly as sketched below. But I'd like to use OOP with classes, instances, and maybe dynamic objects (ExpandoObject). I just don't know how. ;) Sorry.
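Something like this (field names and values are only illustrative):
using System.Collections.Generic;
using System.Linq;

// The full record, held as field name -> value.
var mainDict = new Dictionary<string, object>
{
    ["Id"] = 1,
    ["Title"] = "Lord of the Weed",
    ["Pages"] = 123,
    ["DescriptionSmall"] = null          // NULL is a real value here
};

// The fields I actually want to send in the update.
string[] fieldsToUpdate = { "Title", "DescriptionSmall" };

// "Small copy" containing only the selected fields.
var smallCopy = mainDict
    .Where(kv => fieldsToUpdate.Contains(kv.Key))
    .ToDictionary(kv => kv.Key, kv => kv.Value);
// smallCopy is then turned into the GraphQL update mutation.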
I'm open to any suggestion. If possible, simple and with little code, but as dynamic as possible.
To expand on my comment, a more complex example would look like this:
In this example the Course has some fixed number of fields but expands on that using the CourseField (many-to-many) connection to connect several FormFields. The FieldValue entity holds the values for the fields, in combination with the UserFieldValue (many-to-many) for each user (in this case).
Now, maintaining all of this by hand is a lot of work, but you can let Entity Framework maintain the many-to-many connections for you. No need to do all the coding yourself; it will be generated automatically.
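A minimal sketch of that shape might look like this (all class and property names are illustrative; FieldValue/UserFieldValue would follow the same join-entity pattern and are only hinted at in a comment):
using System.Collections.Generic;

public class Course
{
    public int Id { get; set; }
    public string Title { get; set; }
    public ICollection<CourseField> CourseFields { get; set; } = new List<CourseField>();
}

public class FormField
{
    public int Id { get; set; }
    public string Name { get; set; }
    public ICollection<CourseField> CourseFields { get; set; } = new List<CourseField>();
}

// Join entity for the Course <-> FormField many-to-many relation;
// FieldValue and UserFieldValue would be modelled the same way to hold per-user values.
public class CourseField
{
    public int CourseId { get; set; }
    public Course Course { get; set; }
    public int FormFieldId { get; set; }
    public FormField FormField { get; set; }
}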
So I have an SQLite database with [Raw_Data] and [Names] tables. Raw_Data has 3 fields that hold a reference (foreign key?) to the Names table (requester, worker, and approver).
When I display data (dgvResults is a DataGridView in a Windows Forms application), I want to merge the names into the main table (Raw_Data).
I am using sqlite-net (by praeclarum on GitHub) and have this code (only relevant parts shown for brevity):
...
using (var db=new SQLiteConnection(DB_Interop_SQLite.DB_Path, true))
{
this.Users=db.Table<User>().ToList<User>();
this.Records=db.Table<Raw_Data>().ToList<Raw_Data>();
this.Roles=db.Table<Role>().ToList<Role>();
}
var items=
(
from r in this.Records
join requester in this.Users on r.Requester equals requester.ID into group1
from requester in group1.DefaultIfEmpty()
join worker in this.Users on r.Drafter equals worker.ID into group2
from worker in group2.DefaultIfEmpty()
join approver in this.Users on r.Approver equals approver.ID into group3
from approver in group3.DefaultIfEmpty()
select new {
r.ID, r.Project, Requester=requester, r.Task_Code,
r.DT_submitted, r.DT_required, // other fields
Worker=worker, r.DT_completed, // other related fields
Approver=approver, r.DT_approved, // more fields
}
).ToList();
this.dgvResults.DataSource=items;
...
I have read through "LINQ Join 2 List<T>s", "Create Items from 3 collections using Linq", and "Merge multiple Lists into one List with LINQ".
They have been a great help.
I also referred to "A generic list of anonymous class", which really helped me create the LINQ query, and it all works beautifully.
In my code, the User class has its ToString() method overridden so it displays the full name just as I want.
Suppose I wanted a List that holds fields from Raw_Data, but instead of the integer field that refers to the ID from Names, I want the User object as a field, which will hold the ID as well as the rest of the user information (Name, email, phone, etc), just like I have in my anonymous type in the select statement.
So my question is: is there a better way to implement a list (that is not of the anonymous type), without rewriting the entire Raw_Data class just to have a "user" field (from Names table) rather than just the user id?
Ideally, I want the exact same behaviour as my code, but preferably without the anonymous type.
Thanks.
Not sure if this is what you're looking for, but you could specify a class just to hold that data, then select into that instead of an anonymous type:
public class RawDataInfo
{
public int ID { get; set; }
public string Requester { get; set; }
...
}
then alter your select to use new RawDataInfo():
select new RawDataInfo(){
ID = r.ID,
Requester = requester,
...
}
You may want to look into something like AutoMapper. This is a library that allows you to easily and consistently map from one type of object to another. It's clean and well tested.
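As a rough sketch, assuming the RawDataInfo class from the answer above and a recent AutoMapper version (adjust the configuration style to the version you use):
using System.Collections.Generic;
using AutoMapper;

// One-time configuration: describe how a Raw_Data row maps onto RawDataInfo.
// Members that don't match by name need an explicit ForMember(...) rule,
// e.g. to turn the Requester id into the requester's name.
var config = new MapperConfiguration(cfg =>
{
    cfg.CreateMap<Raw_Data, RawDataInfo>();
});
IMapper mapper = config.CreateMapper();

RawDataInfo info = mapper.Map<RawDataInfo>(someRawDataRecord);            // someRawDataRecord: a single Raw_Data instance (hypothetical variable)
List<RawDataInfo> infos = mapper.Map<List<RawDataInfo>>(this.Records);    // or map a whole list at once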
I've written this code to project a one-to-many relation, but it's not working:
using (var connection = new SqlConnection(connectionString))
{
connection.Open();
IEnumerable<Store> stores = connection.Query<Store, IEnumerable<Employee>, Store>
(#"Select Stores.Id as StoreId, Stores.Name,
Employees.Id as EmployeeId, Employees.FirstName,
Employees.LastName, Employees.StoreId
from Store Stores
INNER JOIN Employee Employees ON Stores.Id = Employees.StoreId",
(a, s) => { a.Employees = s; return a; },
splitOn: "EmployeeId");
foreach (var store in stores)
{
Console.WriteLine(store.Name);
}
}
Can anybody spot the mistake?
EDIT:
These are my entities:
public class Product
{
public int Id { get; set; }
public string Name { get; set; }
public double Price { get; set; }
public IList<Store> Stores { get; set; }
public Product()
{
Stores = new List<Store>();
}
}
public class Store
{
public int Id { get; set; }
public string Name { get; set; }
public IEnumerable<Product> Products { get; set; }
public IEnumerable<Employee> Employees { get; set; }
public Store()
{
Products = new List<Product>();
Employees = new List<Employee>();
}
}
EDIT:
I changed the query to:
IEnumerable<Store> stores = connection.Query<Store, List<Employee>, Store>
(#"Select Stores.Id as StoreId ,Stores.Name,Employees.Id as EmployeeId,
Employees.FirstName,Employees.LastName,Employees.StoreId
from Store Stores INNER JOIN Employee Employees
ON Stores.Id = Employees.StoreId",
(a, s) => { a.Employees = s; return a; }, splitOn: "EmployeeId");
and the exceptions are gone! However, Employees are not mapped at all. I am still not sure what problem it had with IEnumerable<Employee> in the first query.
This post shows how to query a highly normalised SQL database, and map the result into a set of highly nested C# POCO objects.
Ingredients:
8 lines of C#.
Some reasonably simple SQL that uses some joins.
Two awesome libraries.
The insight that allowed me to solve this problem is to separate the MicroORM from mapping the result back to the POCO Entities. Thus, we use two separate libraries:
Dapper as the MicroORM.
Slapper.Automapper for mapping.
Essentially, we use Dapper to query the database, then use Slapper.Automapper to map the result straight into our POCOs.
Advantages
Simplicity. It's fewer than 8 lines of code. I find this a lot easier to understand, debug, and change.
Less code. A few lines of code is all Slapper.Automapper needs to handle anything you throw at it, even if we have a complex nested POCO (i.e. POCO contains List<MyClass1> which in turn contains List<MySubClass2>, etc).
Speed. Both of these libraries have an extraordinary amount of optimization and caching to make them run almost as fast as hand tuned ADO.NET queries.
Separation of concerns. We can change the MicroORM for a different one, and the mapping still works, and vice-versa.
Flexibility. Slapper.Automapper handles arbitrarily nested hierarchies; it isn't limited to a couple of levels of nesting. We can easily make rapid changes, and everything will still work.
Debugging. We can first see that the SQL query is working properly, then we can check that the SQL query result is properly mapped back to the target POCO Entities.
Ease of development in SQL. I find that creating flattened queries with inner joins to return flat results is much easier than creating multiple select statements, with stitching on the client side.
Optimized queries in SQL. In a highly normalized database, creating a flat query allows the SQL engine to apply advanced optimizations to the whole query, which would not normally be possible if many small individual queries were constructed and run.
Trust. Dapper is the back end for StackOverflow, and, well, Randy Burden is a bit of a superstar. Need I say any more?
Speed of development. I was able to do some extraordinarily complex queries, with many levels of nesting, and the dev time was quite low.
Fewer bugs. I wrote it once, it just worked, and this technique is now helping to power a FTSE company. There was so little code that there was no unexpected behavior.
Disadvantages
Scaling beyond 1,000,000 rows returned. Works well when returning < 100,000 rows. However, if we are bringing back >1,000,000 rows, in order to reduce the traffic between us and SQL server, we should not flatten it out using inner join (which brings back duplicates), we should instead use multiple select statements and stitch everything back together on the client side (see the other answers on this page).
This technique is query oriented. I haven't used this technique to write to the database, but I'm sure that Dapper is more than capable of doing this with some extra work, as StackOverflow itself uses Dapper as its Data Access Layer (DAL).
Performance Testing
In my tests, Slapper.Automapper added a small overhead to the results returned by Dapper, which meant that it was still 10x faster than Entity Framework, and the combination is still pretty darn close to the theoretical maximum speed SQL + C# is capable of.
In most practical cases, most of the overhead would be in a less-than-optimum SQL query, and not with some mapping of the results on the C# side.
Performance Testing Results
Total number of iterations: 1000
Dapper by itself: 1.889 milliseconds per query, using 3 lines of code to return the dynamic.
Dapper + Slapper.Automapper: 2.463 milliseconds per query, using an additional 3 lines of code for the query + mapping from dynamic to POCO Entities.
Worked Example
In this example, we have a list of Contacts, and each Contact can have one or more phone numbers.
POCO Entities
public class TestContact
{
public int ContactID { get; set; }
public string ContactName { get; set; }
public List<TestPhone> TestPhones { get; set; }
}
public class TestPhone
{
public int PhoneId { get; set; }
public int ContactID { get; set; } // foreign key
public string Number { get; set; }
}
SQL Table TestContact
SQL Table TestPhone
Note that this table has a foreign key ContactID which refers to the TestContact table (this corresponds to the List<TestPhone> in the POCO above).
SQL Which Produces Flat Result
In our SQL query, we use as many JOIN statements as we need to get all of the data we need, in a flat, denormalized form. Yes, this might produce duplicates in the output, but these duplicates will be eliminated automatically when we use Slapper.Automapper to automatically map the result of this query straight into our POCO object map.
USE [MyDatabase];
SELECT tc.[ContactID] as ContactID
,tc.[ContactName] as ContactName
,tp.[PhoneId] AS TestPhones_PhoneId
,tp.[ContactId] AS TestPhones_ContactId
,tp.[Number] AS TestPhones_Number
FROM TestContact tc
INNER JOIN TestPhone tp ON tc.ContactId = tp.ContactId
C# code
const string sql = #"SELECT tc.[ContactID] as ContactID
,tc.[ContactName] as ContactName
,tp.[PhoneId] AS TestPhones_PhoneId
,tp.[ContactId] AS TestPhones_ContactId
,tp.[Number] AS TestPhones_Number
FROM TestContact tc
INNER JOIN TestPhone tp ON tc.ContactId = tp.ContactId";
string connectionString = ""; // -- Insert SQL connection string here.
using (var conn = new SqlConnection(connectionString))
{
conn.Open();
// Can set default database here with conn.ChangeDatabase(...)
{
// Step 1: Use Dapper to return the flat result as a Dynamic.
dynamic test = conn.Query<dynamic>(sql);
// Step 2: Use Slapper.Automapper for mapping to the POCO Entities.
// - IMPORTANT: Let Slapper.Automapper know how to do the mapping;
// let it know the primary key for each POCO.
// - Must also use underscore notation ("_") to name parameters in the SQL query;
// see Slapper.Automapper docs.
Slapper.AutoMapper.Configuration.AddIdentifiers(typeof(TestContact), new List<string> { "ContactID" });
Slapper.AutoMapper.Configuration.AddIdentifiers(typeof(TestPhone), new List<string> { "PhoneID" });
var testContact = (Slapper.AutoMapper.MapDynamic<TestContact>(test) as IEnumerable<TestContact>).ToList();
foreach (var c in testContact)
{
foreach (var p in c.TestPhones)
{
Console.Write("ContactName: {0}: Phone: {1}\n", c.ContactName, p.Number);
}
}
}
}
Output
POCO Entity Hierarchy
Looking in Visual Studio, we can see that Slapper.Automapper has properly populated our POCO Entities, i.e. we have a List<TestContact>, and each TestContact has a List<TestPhone>.
Notes
Both Dapper and Slapper.Automapper cache everything internally for speed. If you run into memory issues (very unlikely), ensure that you occasionally clear the cache for both of them.
Ensure that you name the columns coming back, using the underscore (_) notation to give Slapper.Automapper clues on how to map the result into the POCO Entities.
Ensure that you give Slapper.Automapper clues on the primary key for each POCO Entity (see the lines Slapper.AutoMapper.Configuration.AddIdentifiers). You can also use Attributes on the POCO for this. If you skip this step, then it could go wrong (in theory), as Slapper.Automapper would not know how to do the mapping properly.
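For reference, the attribute-based alternative looks roughly like this (I believe the attribute is Slapper.AutoMapper.Id, but double-check the Slapper.Automapper docs for your version):
using System.Collections.Generic;

public class TestContact
{
    [Slapper.AutoMapper.Id]             // marks ContactID as the identifier Slapper uses to group rows
    public int ContactID { get; set; }
    public string ContactName { get; set; }
    public List<TestPhone> TestPhones { get; set; }
}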
Update 2015-06-14
Successfully applied this technique to a huge production database with over 40 normalized tables. It worked perfectly to map an advanced SQL query with over 16 inner joins and left joins into the proper POCO hierarchy (with 4 levels of nesting). The queries are blindingly fast, almost as fast as hand-coding it in ADO.NET (it was typically 52 milliseconds for the query, and 50 milliseconds for the mapping from the flat result into the POCO hierarchy). This is really nothing revolutionary, but it sure beats Entity Framework for speed and ease of use, especially if all we are doing is running queries.
Update 2016-02-19
Code has been running flawlessly in production for 9 months. The latest version of Slapper.Automapper has all of the changes that I applied to fix the issue related to nulls being returned in the SQL query.
Update 2017-02-20
Code has been running flawlessly in production for 21 months, and has handled continuous queries from hundreds of users in a FTSE 250 company.
Slapper.Automapper is also great for mapping a .csv file straight into a list of POCOs. Read the .csv file into a list of IDictionary, then map it straight into the target list of POCOs. The only trick is that you have to add a property int Id { get; set; } and make sure it's unique for every row (or else the automapper won't be able to distinguish between the rows), as sketched below.
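A sketch of that CSV idea, assuming each row has already been parsed into an IDictionary<string, object> keyed by the POCO property names (CsvRow and LoadCsvAsDictionaries are purely hypothetical, and the exact Map overloads should be checked against the Slapper.Automapper docs):
using System.Collections.Generic;
using System.Linq;

public class CsvRow
{
    public int Id { get; set; }              // must be unique per row, as noted above
    public string ContactName { get; set; }
    public string Number { get; set; }
}

// LoadCsvAsDictionaries is a hypothetical helper that parses the .csv into
// one IDictionary<string, object> per row, keyed by column/property name.
List<IDictionary<string, object>> rows = LoadCsvAsDictionaries("contacts.csv");

List<CsvRow> pocos = rows
    .Select(row => Slapper.AutoMapper.Map<CsvRow>(row))
    .ToList();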
Update 2019-01-29
Minor update to add more code comments.
See: https://github.com/SlapperAutoMapper/Slapper.AutoMapper
I wanted to keep it as simple as possible; my solution:
public List<ForumMessage> GetForumMessagesByParentId(int parentId)
{
var sql = #"
select d.id_data as Id, d.cd_group As GroupId, d.cd_user as UserId, d.tx_login As Login,
d.tx_title As Title, d.tx_message As [Message], d.tx_signature As [Signature], d.nm_views As Views, d.nm_replies As Replies,
d.dt_created As CreatedDate, d.dt_lastreply As LastReplyDate, d.dt_edited As EditedDate, d.tx_key As [Key]
from
t_data d
where d.cd_data = @DataId order by id_data asc;
select d.id_data As DataId, di.id_data_image As DataImageId, di.cd_image As ImageId, i.fl_local As IsLocal
from
t_data d
inner join T_data_image di on d.id_data = di.cd_data
inner join T_image i on di.cd_image = i.id_image
where d.id_data = @DataId and di.fl_deleted = 0 order by d.id_data asc;";
var mapper = _conn.QueryMultiple(sql, new { DataId = parentId });
var messages = mapper.Read<ForumMessage>().ToDictionary(k => k.Id, v => v);
var images = mapper.Read<ForumMessageImage>().ToList();
foreach(var imageGroup in images.GroupBy(g => g.DataId))
{
messages[imageGroup.Key].Images = imageGroup.ToList();
}
return messages.Values.ToList();
}
I still make one round trip to the database, and while I now execute 2 queries instead of one, the second query uses an INNER JOIN instead of a less optimal LEFT JOIN.
A slight modification of Andrew's answer that utilizes a Func to select the parent key instead of GetHashCode.
public static IEnumerable<TParent> QueryParentChild<TParent, TChild, TParentKey>(
this IDbConnection connection,
string sql,
Func<TParent, TParentKey> parentKeySelector,
Func<TParent, IList<TChild>> childSelector,
dynamic param = null, IDbTransaction transaction = null, bool buffered = true, string splitOn = "Id", int? commandTimeout = null, CommandType? commandType = null)
{
Dictionary<TParentKey, TParent> cache = new Dictionary<TParentKey, TParent>();
connection.Query<TParent, TChild, TParent>(
sql,
(parent, child) =>
{
if (!cache.ContainsKey(parentKeySelector(parent)))
{
cache.Add(parentKeySelector(parent), parent);
}
TParent cachedParent = cache[parentKeySelector(parent)];
IList<TChild> children = childSelector(cachedParent);
children.Add(child);
return cachedParent;
},
param as object, transaction, buffered, splitOn, commandTimeout, commandType);
return cache.Values;
}
Example usage
conn.QueryParentChild<Product, Store, int>("sql here", prod => prod.Id, prod => prod.Stores)
According to this answer there is no one-to-many mapping support built into Dapper.Net. Queries will always return one object per database row. There is an alternative solution included, though.
Here is another method:
Order (one) - OrderDetail (many)
using (var connection = new SqlCeConnection(connectionString))
{
var orderDictionary = new Dictionary<int, Order>();
var list = connection.Query<Order, OrderDetail, Order>(
sql,
(order, orderDetail) =>
{
Order orderEntry;
if (!orderDictionary.TryGetValue(order.OrderID, out orderEntry))
{
orderEntry = order;
orderEntry.OrderDetails = new List<OrderDetail>();
orderDictionary.Add(orderEntry.OrderID, orderEntry);
}
orderEntry.OrderDetails.Add(orderDetail);
return orderEntry;
},
splitOn: "OrderDetailID")
.Distinct()
.ToList();
}
Source: http://dapper-tutorial.net/result-multi-mapping#example---query-multi-mapping-one-to-many
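The POCO shapes assumed by that snippet would be roughly as follows (the OrderID/OrderDetailID names are taken from the code above; everything else is illustrative):
using System.Collections.Generic;

public class Order
{
    public int OrderID { get; set; }
    public List<OrderDetail> OrderDetails { get; set; }
}

public class OrderDetail
{
    public int OrderDetailID { get; set; }
    public int OrderID { get; set; }
    // ... plus whatever detail columns the query returns
}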
Here is a crude workaround:
public static IEnumerable<TOne> Query<TOne, TMany>(this IDbConnection cnn, string sql, Func<TOne, IList<TMany>> property, dynamic param = null, IDbTransaction transaction = null, bool buffered = true, string splitOn = "Id", int? commandTimeout = null, CommandType? commandType = null)
{
var cache = new Dictionary<int, TOne>();
cnn.Query<TOne, TMany, TOne>(sql, (one, many) =>
{
if (!cache.ContainsKey(one.GetHashCode()))
cache.Add(one.GetHashCode(), one);
var localOne = cache[one.GetHashCode()];
var list = property(localOne);
list.Add(many);
return localOne;
}, param as object, transaction, buffered, splitOn, commandTimeout, commandType);
return cache.Values;
}
It's by no means the most efficient way, but it will get you up and running. I'll try to optimise this when I get a chance.
Use it like this:
conn.Query<Product, Store>("sql here", prod => prod.Stores);
Bear in mind your objects need to override GetHashCode, perhaps like this:
public override int GetHashCode()
{
return this.Id.GetHashCode();
}
I'm running into a common need in my project to return collections of my model objects, plus a count of certain types of children within each, but I don't know if it is possible, or how to model a "TotalCount" property in a model class and populate it as part of a single Entity Framework query, preferably using LINQ. Is it possible to do this whilst still being able to use the Entity Framework .Include("Object"), .Skip(), and .Take()? I'm new to the Entity Framework, so I may be missing tons of obvious stuff that would allow this...
I would like to be able to paginate on the dynamically constructed count properties as well. I'm thinking that the most scalable approach would be to store the counts as separate database columns and then simply query those, but for the small row counts I'm dealing with, I'd rather compute the counts dynamically.
In a model like this:
Table: Class
Table: Professor
Table: Attendee
Table: ClassComment
I'd like to return a list of Class objects in the form of a List<Class>, but I would also like the counts of Attendees and ClassComments to be determined in a single query (LINQ preferred) and set in two Class properties called AttendeeCount and ClassCommentCount.
I have this thus far:
var query = from u in context.Classes
orderby u.Name
select u;
List<Class> topics = ((ObjectQuery<Class>)query)
.Include("ClassComments")
.Skip(startRecord).Take(recordsToReturn).ToList();
Any suggestions or alternative query approaches that can still allow the use of .Include() and pagination would be much much appreciated, in order to produce a single database query, if at all possible. Thank you for any suggestions!
Try this:
public class ClassViewModel {
public Class Class { get; set; }
public int AttendeeCount { get; set; }
public int ClassCommentCount { get; set; }
}
var viewModel = context.Classes.Select(clas =>
new ClassViewModel {
Class = clas,
AttendeeCount = clas.ClassAttendees.Count,
ClassCommentCount = clas.ClassComments.Count}
).OrderBy(model => model.ClassCommentCount).Skip(startRecord).Take(recordsToReturn).ToList();
You don't have to Include the comments to get their count.
It will not work the way you have it. The easiest approach is to use a projection into an anonymous (or custom) non-entity type. I would try something like this:
var query = context.Classes
.Include("ClassComments") // Only add this if you want eager loading of all realted comments
.OrderBy(c => c.Name)
.Skip(startRecord)
.Take(recordsToReturn)
.Select(c => new
{
Class = c,
AttendeeCount = c.Attendees.Count(),
ClassCommentCount = c.ClassComments.Count() // Not needed if you eager-load all ClassComments; you could call Count on the loaded collection instead
});
The problem in your requirement is the AttendeeCount and ClassCommentCount properties. You can't easily add them to your model because there is no corresponding column in the database (unless you define one, and in that case you don't need to count records manually). You can define them in a partial Class implementation, but in that case you can't use them in a LINQ to Entities query.
The only way to map this in EF is to use a DB view and create a special read-only entity to represent it in your application, or to use a DefiningQuery, which is a custom SQL command defined in SSDL instead of a DB table or view.
I've just started learning how to use the Entity Framework to write a very simple C# network monitoring program - this is a learning exercise to try and "drive home" what I've only read about to date. I'm also new to C# and LINQ (just to complicate things further.)
I believe I have the data model suitably normalised but I may be wrong. Visual Studio generates a conceptual model that looks OK. I've pluralised the associations and EntitySets where necessary, but I'm struggling to perform what I think is a fairly basic query/projection on the data.
The database contains 3 tables:
[Server] - A server defined by the user that should be pinged.
ServerID - primary key
HostAddress - IP or hostname
[Result] - A result containing data about the last server test
ResultID - primary key
ServerID - foreign key on [Server].[ServerID]
StateID - an integer used to lookup one of 3 possible Server states
TimeStamp - Time stamp of last ping
[State] - A lookup table containing an integer -> string mapping.
StateID - a unique key
StateLabel - human-readable string like "unreachable" or "OK" or "timeout"
I have manually populated the database using a few simple entries - just enough to give me something to work with.
For starters, I would like to present all of the Result data in a ListView on a WinForm. The ListView contains the following static columns:
State | Server Address | Last checked
In theory, the ListView's data needs to be generated by projecting(?) across each of the 3 tables:
The "State" column should display the human-readable [State].[StateLabel] linked from [Result].[StateID]
The "Server Address" column should display [Server].[HostAddress] linked from [Result].[ServerID]
The "Last Checked" column should display [Result].[TimeStamp]
Since I have no need for the object materialisation and/or change-tracking features of ObjectServices, am I correct in thinking it would be more efficient/correct to use Entity SQL/EntityClient and DbDataReader? If so, what would a suitable Entity SQL query look like?
For what it's worth, I tried using LINQ to Entities and anonymous types in a method but was thwarted by a lack of understanding of a suitable return type:
var results = from r in _context.Result
select new
{
State = (from s in _context.State
where s.StateId == r.StateId
select s.StateLabel),
r.ServerReference.Value.HostAddress,
r.TimeStamp
};
return results.ToList(); // <- No can do.
Thanks for your help!
Steve
Well, you won't be able to return a list of anonymous types unless you cast them to object and have the signature declare the return type as List<object> (or a suitable interface). Your other issue is that the subquery for State will actually return an IQueryable instead of a single entry (you can use the First extension method with EF to get the first matching item). However, if you have the foreign key relationship, the model should have set up a navigation property for the state as well, and you should be able to use that property instead of a subquery. So if you would like to have this as a method call that returns a list of objects, you will have to create a type that represents the transform, or downcast to object. Otherwise you could do it at the form level (this all depends on your needs) where you are attempting to bind the list.
public List<object> GetStuff()
{
var results = from r in _context.Result
select new
{
// Use exactly one of the following two State lines:
State = r.StateNavigationProperty.StateLabel, // if the FK navigation property exists
//State = _context.State.First(state => state.StateId == r.StateId).StateLabel, // if there is no FK navigation property
HostAddress = r.ServerReference.Value.HostAddress,
TimeStamp = r.TimeStamp
};
return results.Cast<object>().ToList();
}
...
myListView.DataSource = GetStuff();
And like I said, the other alternative is either to create a class for the transform or to bind the list directly to the query.
public class SimpleStuff
{
public string State { get; set; }
public string HostAddress { get; set; }
public DateTime TimeStamp { get; set; }
}
Then just add the class to the select new, i.e. select new SimpleStuff, change the method signature to reflect the class, and remove the cast in the return; roughly as below.
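Putting that together, the method would end up roughly like this (still assuming the same navigation property names as in the snippet above):
public List<SimpleStuff> GetStuff()
{
    var results = from r in _context.Result
                  select new SimpleStuff
                  {
                      State = r.StateNavigationProperty.StateLabel,       // FK navigation property, name assumed
                      HostAddress = r.ServerReference.Value.HostAddress,
                      TimeStamp = r.TimeStamp
                  };
    return results.ToList();
}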