For my project I use SqlCE 3.5, Linq2Sql and generate entity classes (dbml file) with SqlMetal.
If I use extensions of the entity classes as my business object classes, how shall I deal with child objects?
Can I use the existing EntitySet(Of ChildClass) that is already in the dbml file? Or should I create a new collection property in my partial class extension, like:
public partial class ParentClass
{
    public List<ChildClass> children { get; set; }
}
I have a vague notion that EntitySet(Of ChildClass) has some kind of direct connection to the complete database table, and that I should perhaps treat it purely as a data access object, keeping a separate collection in the business layer that can hold just a subset of the table.
But I'm not sure whether I have misunderstood the concept entirely. I'd really like to know how this is meant to be used properly.
EDIT1:
One possible performance issue for me is binding my object collections to a DataGridView. A few properties are updated quite frequently from a stock-exchange data feed, and I wonder whether using the EntitySets as the business-layer collections makes this slow. I do not call SubmitChanges on those updates, yet I still cannot scroll the DataGridView while they are coming in. Is that because of the EntitySets, or is the DataGridView itself simply refreshing so often that scrolling becomes impossible?
If you are using the entities generated by the designer, which are proxies rather than real POCO objects, you can also use their ObjectSet to get the child collections. These are IQueryable, so EF won't call the database until you enumerate the items or call something like ToList(); operators such as Where() only compose the query.
In any case, the best practice is to expose the children as an IList and use a repository to fill that list. The ObjectSet is not just a list; it also acts as a kind of repository for the entity.
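A minimal sketch of that shape (the repository class, context, and member names here are illustrative assumptions, not from the question):
using System.Collections.Generic;
using System.Linq;

public class ChildRepository
{
    private readonly MyEntities _context = new MyEntities();

    // Callers work against a plain IList and never touch the ObjectSet directly.
    public IList<ChildClass> GetChildrenFor(int parentId)
    {
        // ToList() is what actually executes the query against the database.
        return _context.ChildClasses
                       .Where(c => c.ParentID == parentId)
                       .ToList();
    }
}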
Related
I'm using Entity Framework 6.
I want to run a stored procedure that returns a non-entity result (3 columns per row):
using (var dbContext = new DBContextEntity())
{
    var queryProducts = dbContext.Database.SqlQuery<DataTable>("dbo.GetProductByDesc @q", query);
}
How can I get that data as a DataSet or an anonymous object that I can iterate over?
As far as I know, Entity Framework does not support materializing results into anonymous objects. The reason is probably that it generates IL code for each result type and caches it (or at least caches the PropertyInfo lookups).
The solution is to create a simple class whose properties match the names in the stored procedure's result set, and to use that class as the generic parameter for SqlQuery.
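For example, a sketch in which the class name and column names are assumptions about what dbo.GetProductByDesc returns:
using System.Data.SqlClient;

// POCO whose property names match the stored procedure's result columns.
public class ProductResult
{
    public int ProductId { get; set; }
    public string Name { get; set; }
    public string Description { get; set; }
}

// ...

using (var dbContext = new DBContextEntity())
{
    // Each row is materialized into a ProductResult by matching column
    // names to property names.
    var queryProducts = dbContext.Database.SqlQuery<ProductResult>(
        "dbo.GetProductByDesc @q",
        new SqlParameter("@q", query));
}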
Edit:
SqlQuery implements IEnumerable, and when you iterate over it the query executes automatically on the current thread. To iterate the result you can, for example:
foreach (var product in queryProducts)
{
    // do something with each product here
}
You can also pass a list of product class instances to a function expecting it:
ShowProducts(queryProducts.ToList());
You can also make the query run in the background and return a list of Product once it has finished; more information about asynchronous fetching can be found here: http://www.codeguru.com/csharp/.net/net_framework/performing-asynchronous-operations-using-entity-framework.htm
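In EF6 you can also await the result directly; here is a sketch reusing the illustrative ProductResult class from above (this must run inside an async method):
using System.Data.SqlClient;

// DbRawSqlQuery<T> exposes ToListAsync() in EF6, so the fetch does not
// block the calling (e.g. UI) thread.
using (var dbContext = new DBContextEntity())
{
    var products = await dbContext.Database
        .SqlQuery<ProductResult>(
            "dbo.GetProductByDesc @q",
            new SqlParameter("@q", query))
        .ToListAsync();
}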
Like @Too said, it is best to define a POCO class with properties corresponding to the field names and data types returned by the stored procedure or other SQL statement.
It is generally better to avoid DataSets in any new development work. They have their uses, but they carry a performance penalty in high-throughput scenarios which POCOs clearly avoid.
If the attraction of DataSets is the ability to easily serialize the data over the wire or to a file for later use, the various serialization frameworks will help you with that, e.g. DataContractSerializer, Newtonsoft.Json, etc.
This also allows for portability if the POCO is defined in a PCL (Portable Class Library).
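As a quick sketch of the serialization point, round-tripping a list of the POCOs above with Newtonsoft.Json takes one line in each direction (products is assumed to be the List<ProductResult> from the earlier example):
using System.Collections.Generic;
using Newtonsoft.Json;

// Serialize the query results to JSON and read them back later.
string json = JsonConvert.SerializeObject(products);
List<ProductResult> restored = JsonConvert.DeserializeObject<List<ProductResult>>(json);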
If you must use DataSets, I would rather use typed DataSets. The DataRows can then serve as the POCO in @Too's answer, since they have a default constructor. Just be careful with nulls and their special treatment in fields of any type other than String.
I am using EF 5.0 in a WinForms application, and I dispose of the DbContext quickly.
Nevertheless, I must maintain a static list of Customer entities, which I populate on startup using a DbContext that is, again, disposed quickly. I also make light use of multi-threading in some parts of the application.
The issue is that every now and then I receive this exception:
An entity object cannot be referenced by multiple instances of IEntityChangeTracker.
Should I detach every Customer in that static list before disposing the DbContext? Should I use some other design for a WinForms application? I appreciate your feedback.
Try using the AsNoTracking method when querying the instances: http://msdn.microsoft.com/en-us/library/gg679352(v=vs.103).aspx
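A minimal sketch, assuming a context class MyDbContext and a static list CustomerCache: entities loaded with AsNoTracking are never attached to a change tracker, so they cannot later end up referenced by a second IEntityChangeTracker:
using System.Linq;

using (var db = new MyDbContext())
{
    // AsNoTracking: the materialized Customers belong to no change tracker.
    CustomerCache = db.Customers.AsNoTracking().ToList();
}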
Consider building Data Transfer Objects to make a clean separation between EF and the static list.
The main benefit of this is limiting EF's influence on the rest of the application.
I would not use a list of Customer entity objects, but some mirror type, say CustomerListItem.
You can populate the list by projecting the Customers into the items:
db.Customers.Select(c => new CustomerListItem { Name = c.Name, ... })
This creates objects that are not tracked, it shields the parts of the application that depend on the customer list from changes in the entity model, and you will not run into potential lazy-loading exceptions (if Customer has lazy navigation properties).
As it is a static list accessible to the whole application, I would expose it as a ReadOnlyCollection.
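Putting the projection and the read-only wrapper together, a sketch (the type, property, and context names are illustrative):
using System.Collections.ObjectModel;
using System.Linq;

public class CustomerListItem
{
    public int Id { get; set; }
    public string Name { get; set; }
}

public static class CustomerList
{
    public static ReadOnlyCollection<CustomerListItem> Items { get; private set; }

    public static void Load()
    {
        using (var db = new MyDbContext())
        {
            // The projection means EF never tracks the CustomerListItem instances.
            Items = db.Customers
                      .Select(c => new CustomerListItem { Id = c.Id, Name = c.Name })
                      .ToList()
                      .AsReadOnly();
        }
    }
}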
I have a User entity that contains a collection of Survey entities. I would like the association to include a filter on the relationship, such as 'IsCompleted', so that whenever I eager load (or lazy load, for that matter) the collection, this filtering happens.
Is this something we have control over?
thanks!
If you are using a DB back-end that supports views, you might consider using the view as the source for the collection of survey entities. Leverage the power of the DB to do that filtering for you.
Loading the associations of an entity always gets all of them, whether you used Include during the initial query, called Load after the fact, or lazy loading kicked in. The concept of a navigation property rather assumes this behavior.
E.J. Brennan's answer would work well. If you're not concerned about loading all surveys behind the scenes (for performance/memory reasons or the like), you might also consider creating a separate property via a partial class definition on your entity that returns the filtered list:
public partial class User
{
    // Where() returns an IEnumerable<Survey>, so the property type must match;
    // the filter runs in memory over the already-loaded Surveys collection.
    public IEnumerable<Survey> CompletedSurveys
    {
        get { return Surveys.Where(s => s.IsCompleted); }
    }
}
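Usage is then like any other read-only property:
int completedCount = someUser.CompletedSurveys.Count();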
I have an existing SQL Server database where I store data from large, specific log files (often 100 MB and more), one per database. After some analysis, the database is deleted again.
From the database, I have created both an Entity Framework model and a DataSet model via the Visual Studio designers. The DataSet is only for bulk-importing the data with SqlBulkCopy after a quite complicated parsing process. All queries are then done using the Entity Framework model, whose CreateQuery method is exposed via an interface like this:
public IQueryable<TTarget> GetResults<TTarget>() where TTarget : EntityObject, new()
{
    return this.Context.CreateQuery<TTarget>(typeof(TTarget).Name);
}
Now, sometimes my files are very small, and in such cases I would like to skip the import into the database and instead have an in-memory representation of the data, accessible as entities. The idea is to create the DataSet, but instead of bulk importing it, to transfer it directly into an ObjectContext that is accessible via the interface.
Does this make sense?
Here is what I have done for this conversion so far: I traverse all tables in the DataSet, convert the individual rows into entities of the corresponding type, and add them to an instance of my typed entity context class, like so:
MyEntities context = new MyEntities(); // create a new in-memory context
// ...
// get the item in the navigations table
MyDataSet.NavigationResultRow dataRow = ds.NavigationResult.First(); // a foreach would be needed in a real-world scenario
NavigationResult entity = new NavigationResult
{
    Direction = dataRow.Direction,
    // ...
    NavigationResultID = dataRow.NavigationResultID
}; // convert to an entity
context.AddToNavigationResult(entity); // add it to the context
// ...
Very tedious work, as I would need to create a converter for each of my entity types and iterate over every table in the DataSet. And beware if I ever change my database model...
I have also found that I can only instantiate MyEntities if I provide a valid connection string to a SQL Server database. Since I do not want to actually write to my fully fledged database each time, this gets in the way of my intentions; I want only some in-memory proxy of the database.
Can I do this more simply? Is there some automated way of doing such a conversion, like generating an ObjectContext out of a DataSet object?
P.S.: I have seen a few questions about unit testing that seem somewhat related, but not quite on point.
There are tools that map between objects, such as AutoMapper, a very good open-source tool.
However, these tools sometimes have problems, for example generating duplicate entity keys, or struggling when the structures of the objects being mapped are very different.
If you are trying to automate it, I think there is a greater chance of it working if you use EF 4 and POCO objects.
If you end up writing the mapping code manually, I would move it into a separate procedure with automated unit tests on it.
The way we do this is to create a static class with "Map" methods:
From DTO to EF object
From EF to DTO
Then write a test for each method in which we check that the fields were mapped correctly.
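A sketch of that pattern; CustomerDto, the property names, and the NUnit test are illustrative assumptions:
using NUnit.Framework;

public static class CustomerMapper
{
    // EF entity -> DTO
    public static CustomerDto ToDto(Customer entity)
    {
        return new CustomerDto { Id = entity.Id, Name = entity.Name };
    }

    // DTO -> EF entity
    public static Customer ToEntity(CustomerDto dto)
    {
        return new Customer { Id = dto.Id, Name = dto.Name };
    }
}

[TestFixture]
public class CustomerMapperTests
{
    [Test]
    public void ToDto_CopiesEachField()
    {
        var dto = CustomerMapper.ToDto(new Customer { Id = 7, Name = "Acme" });
        Assert.AreEqual(7, dto.Id);
        Assert.AreEqual("Acme", dto.Name);
    }
}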
I have a Linq-To-Sql based repository class which I have been successfully using. I am adding some functionality to the solution, which will provide WCF based access to the database.
I have not exposed the generated Linq classes as DataContracts, I've instead created my own "ViewModel" as a POCO for each entity I am going to be returning.
My question is: in order to do updates and take advantage of some of the LINQ to SQL features like cyclic references from within my service, do I need to add a RowVersion/Timestamp field to each table in my database so I can use code like dc.Table.Attach(myDisconnectedObject)? The alternative seems ugly:
var updateModel = dc.Table.SingleOrDefault(t => t.ID == myDisconnectedObject.ID);
updateModel.PropertyA = myDisconnectedObject.PropertyA;
updateModel.PropertyB = myDisconnectedObject.PropertyB;
updateModel.PropertyC = myDisconnectedObject.PropertyC;
// and so on and so forth
dc.SubmitChanges();
I guess a RowVersion/Timestamp column on each table might be the best and least intrusive option: you basically check just that one value, and you know whether or not your data might have been modified in the meantime. All other columns can be set to Update Check=Never. This takes care of the possible concurrency issues when updating your database from "returning" objects.
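With the timestamp in place, a sketch of the disconnected update (the context and table names are assumed):
using (var dc = new MyDataContext())
{
    // Attach as modified: LINQ to SQL requires either a version member
    // (the RowVersion/Timestamp column) or Update Check=Never elsewhere.
    dc.Table.Attach(myDisconnectedObject, true);
    dc.SubmitChanges(); // throws ChangeConflictException on a concurrency clash
}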
However, the other thing you should definitely check out is AutoMapper - it's a great little component that eases those left-right assignment orgies you have to go through when using ViewModels / Data Transfer Objects, by making the mapping between two object types a snap. It's widely used, well tested, and very stable - a winner!