I have a number of static methods that perform simple operations like inserting or deleting a record. All of these methods follow this template:
public static UserDataModel FromEmail(string email)
{
    using (var db = new MyWebAppDataContext())
    {
        db.ObjectTrackingEnabled = false;
        return (from u in db.UserDataModels
                where u.Email == email
                select u).Single();
    }
}
I also have a few methods that need to perform multiple operations that use a DataContext:
public static UserPreferencesViewModel Preferences(string email)
{
    return UserDataModel.Preferences(UserDataModel.FromEmail(email));
}

private static UserPreferencesViewModel Preferences(UserDataModel user)
{
    using (var db = new MyWebAppDataContext())
    {
        var preferences = (from u in db.UserDataModels
                           where u == user
                           select u.Preferences).Single();
        return new UserPreferencesViewModel(preferences);
    }
}
I like that I can divide simple operations into faux stored procedures in my data models with static methods like FromEmail(), but I'm concerned about the cost of Preferences() opening two connections (right?) via the two using DataContext statements.
Do I need to be? Is what I'm doing less efficient than using a single using(var db = new MyWebAppDataContext()) statement?
If you examine those "two" operations, you might see that they could be performed in one database round trip. Minimizing database round trips is a major performance objective (second only to minimizing database I/O).
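For example, here is a minimal sketch, reusing the MyWebAppDataContext and types from the question (the PreferencesByEmail name is just illustrative), that fetches the preferences for an email address in a single round trip:

public static UserPreferencesViewModel PreferencesByEmail(string email)
{
    using (var db = new MyWebAppDataContext())
    {
        db.ObjectTrackingEnabled = false;
        // Go straight from the email to the preferences in one query.
        var preferences = (from u in db.UserDataModels
                           where u.Email == email
                           select u.Preferences).Single();
        return new UserPreferencesViewModel(preferences);
    }
}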
If you have multiple DataContexts, they view the same record differently. Normally, object tracking requires that the same instance is always used to represent a single record; if you have two DataContexts, each does its own object tracking on its own instances.
Suppose the record changes between DC1 observing it and DC2 observing it. In that case, the record will not only have two different instances, but those instances will have different values. It can be very challenging to express business logic against such a moving target.
You should definitely retire the DataContext after the unit of work, to protect yourself from stale instances of records.
Normally you should use one context for one logical unit of work, so have a look at the unit of work pattern, e.g. http://dotnet.dzone.com/news/using-unit-work-pattern-entity
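A minimal sketch of that idea against the question's context (the IUnitOfWork/LinqToSqlUnitOfWork names are hypothetical; the point is simply one DataContext per logical operation, committed once and then retired):

using System;

public interface IUnitOfWork : IDisposable
{
    MyWebAppDataContext Context { get; }
    void Commit();
}

public sealed class LinqToSqlUnitOfWork : IUnitOfWork
{
    private readonly MyWebAppDataContext _context = new MyWebAppDataContext();

    public MyWebAppDataContext Context
    {
        get { return _context; }
    }

    // Push all pending changes to the database in one call.
    public void Commit()
    {
        _context.SubmitChanges();
    }

    // Retire the context once the unit of work is finished.
    public void Dispose()
    {
        _context.Dispose();
    }
}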
Of course there is some overhead in creating a new DataContext each time, but it's good practice to do as Ludwig stated: one context per unit of work.
It uses connection pooling, so it's not too expensive an operation.
I also think creating a new DataContext each time is the correct way, but this link explains different approaches for handling the data context: Linq to SQL DataContext Lifetime Management
I developed a wrapper component that uses an interface like:
public interface IContextCacher
{
    DataContext GetFromCache();
    void SaveToCache(DataContext ctx);
}
I then use a wrapper to instantiate the context: if it exists in the cache, it's pulled from there; otherwise a new instance is created and passed to the save method, and all future calls get that instance from the getter.
The actual caching mechanism depends on the type of application. For instance, an ASP.NET web application could store the context in the Items collection, so it's alive for the request only. A Windows app could pull it from some singleton collection. It could be whatever you want under the covers.
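For example, here is a rough sketch of the ASP.NET flavor of that wrapper (the class name and cache key are hypothetical); it simply keeps the DataContext in HttpContext.Items so it lives only for the current request:

using System.Data.Linq;
using System.Web;

public class HttpRequestContextCacher : IContextCacher
{
    private const string Key = "Request.DataContext";

    public DataContext GetFromCache()
    {
        // Returns null when no context has been stored for this request yet.
        return HttpContext.Current.Items[Key] as DataContext;
    }

    public void SaveToCache(DataContext ctx)
    {
        HttpContext.Current.Items[Key] = ctx;
    }
}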
In the Business Logic Layer of an Entity Framework-based application, all methods acting on the DB should (as I've heard) be wrapped in:
using (FunkyContainer fc = new FunkyContainer())
{
    // do the thing
    fc.SaveChanges();
}
Of course, for my own convenience, those methods often use each other, for the sake of not repeating myself. The risk I see here is the following:
public void MainMethod()
{
    using (FunkyContainer fc = new FunkyContainer())
    {
        // perform some operations on fc
        // modify a few objects downloaded from DB
        int x = HelperMethod();
        // act on fc again
        fc.SaveChanges();
    }
}

public int HelperMethod()
{
    using (FunkyContainer fc2 = new FunkyContainer())
    {
        // act on fc2 and then:
        fc2.SaveChanges();
        return 42;
    }
}
It doesn't look good to me that the container fc2 is created while fc is still open and has not been saved yet. So this leads to my question number one:
Is having multiple containers open at the same time and acting on them carelessly an acceptable practice?
I came to the conclusion that I could write a simple guard-style object like this:
public sealed class FunkyContainerAccessGuard : IDisposable
{
    private static FunkyContainer GlobalContainer { get; set; }

    public FunkyContainer Container // simply a non-static adapter for syntactic convenience
    {
        get
        {
            return GlobalContainer;
        }
    }

    private bool IsRootOfHierarchy { get; set; }

    public FunkyContainerAccessGuard()
    {
        IsRootOfHierarchy = (GlobalContainer == null);
        if (IsRootOfHierarchy)
            GlobalContainer = new FunkyContainer();
    }

    public void Dispose()
    {
        if (IsRootOfHierarchy)
        {
            GlobalContainer.Dispose();
            GlobalContainer = null;
        }
    }
}
Now the usage would be as follows:
public void MainMethod()
{
    using (FunkyContainerAccessGuard guard = new FunkyContainerAccessGuard())
    {
        FunkyContainer fc = guard.Container;
        // do anything with fc
        int x = HelperMethod();
        fc.SaveChanges();
    }
}

public int HelperMethod()
{
    using (FunkyContainerAccessGuard guard = new FunkyContainerAccessGuard())
    {
        FunkyContainer fc2 = guard.Container;
        // do anything with fc2
        fc2.SaveChanges();
        return 42;
    }
}
When HelperMethod is called by MainMethod, the GlobalContainer is already created and is used by both methods, so there is no conflict. Moreover, HelperMethod can also be used separately, in which case it creates its own container.
However, this seems like massive overkill to me; so:
Has this problem already been solved in the form of some class (IoC?) or at least some nice design pattern?
Thank you.
Is having multiple containers open at the same time and acting on them carelessly an acceptable practice?
Generally this is perfectly acceptable, sometimes even necessary, but you have to be cautious with it. Having multiple containers at the same time is especially handy when doing multithreaded work: because of how database access works, each thread should generally have its own DbContext that is not shared with other threads.

One downside of using multiple DbContexts at the same time is that each of them uses a separate database connection, and connections are sometimes limited, which may occasionally leave the application unable to connect to the database. Another downside is that an entity loaded by one DbContext cannot be used together with an entity loaded by another DbContext. In your example HelperMethod returns a primitive type, so this is perfectly safe; but if it returned an entity object that MainMethod then assigned, for instance, to a navigation property of an entity created by MainMethod's DbContext, you would get an exception. To overcome this, MainMethod would have to use the Id of the entity returned by HelperMethod to retrieve that entity once more, this time with the fc context.

On the other hand, there is an advantage to using multiple contexts: if one context runs into trouble, for instance it tried to save something that violated an index constraint, then all subsequent attempts to save changes on it will throw the same exception, because the faulty change is still pending. With multiple DbContexts, if one fails, the other continues to operate independently; this is also why DbContexts should not live long. So generally I would say the best usage rules would be:
Each thread should use a separate DbContext
All methods that execute on the same thread should share the same DbContext
Of course the above applies only if the job to be done is short; a DbContext should not live long. The best example is a web application: each server request is handled by a separate thread, and the operations to generate the response generally do not take long. In such a case, all methods executed to generate one response should, for convenience, share the same DbContext, but each request should be served by a separate DbContext.
Has this problem already been solved in the form of some class (IoC?) or at least some nice design pattern?
What you need to ensure is that your DbContext class is a singleton per thread, with each thread having its own instance of that class. In my opinion the best way to ensure this is with IoC. For instance, in Autofac in web applications I register my DbContext with the following rule:
builder
.RegisterType<MyDbContext>()
.InstancePerHttpRequest();
This way the Autofac IoC container creates one DbContext per request and shares the existing instance within the request-serving thread. You do not need to worry about disposing your DbContext here; the IoC container will do that when the request ends.
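On the consuming side you then just take the context as a constructor dependency. A rough sketch, assuming an ASP.NET MVC controller and that MyDbContext exposes a Users set (both are illustrative here):

using System.Linq;
using System.Web.Mvc;

public class UsersController : Controller
{
    private readonly MyDbContext _db;

    // Autofac resolves the per-request MyDbContext registered above and injects it here.
    public UsersController(MyDbContext db)
    {
        _db = db;
    }

    public ActionResult Index()
    {
        // All queries in this request share the same context instance.
        return View(_db.Users.ToList());
    }
}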
Working with multiple connections at the same time is not the right approach most of the time because:
You can get distributed deadlocks that SQL Server cannot resolve.
You might not see data that was previously written but not yet committed.
You can't share entities across context boundaries (here: methods).
More resource usage.
No ability to transact across context boundaries (here: methods).
These are very severe disadvantages. Usually, the best model is to have one context, connection and transaction for the request that the app is processing (HTTP or WCF request). That's very simple to set up and avoids a lot of issues.
EF is supposed to be used as a live object model. Do not cripple it by reducing it to CRUD.
static FunkyContainer GlobalContainer
That does not work. You shouldn't share a context across requests. Super dangerous. Consider storing a context in HttpContext.Items or whatever is the per-request store in your app.
In my DALs I currently use a new DataContext instance for each method, i.e. create the context for each data call, then dispose it (with using). I remember I read that was sort of a best practice.
Now I think I would probably be better off using one common DataContext per DAL, which would require fewer lines of code and would allow me to push updates to the database without attaching the entities to a newly created context.
But I am not sure whether this will impact the performance of the application. Are there downsides to this new approach, like maybe "each context reserves a connection to the database" or "there are only a limited number of contexts available per application"?
From what I've read and my own conclusions, the basic rule is: use a single DataContext instance for each short-lived set of operations. This means:
Use a new (separate) instance of DataContext for each operation (transaction) in long-lived parent objects, such as DALs. For example, the main form has a DAL which uses a DataContext; the main form is the longest-lived object in a desktop application, so having a single DataContext instance serve all of the main form's data operations is not a good solution, due to the ever-growing cache and the risk of the data becoming stale.
Use a single (common) instance of DataContext for all operations in short-lived parent objects. For example, if we have a class which executes a set of data operations in a short amount of time, such as taking data from the database, working with it, updating it, saving the changes back and getting disposed, we are better off creating one single DataContext instance and using it in all the DAL methods. This applies to web applications and services as well, since they are stateless and executed per request.
An example of where I see a need for a common DataContext:
DAL:
// Common DAL DataContext field.
DataContext Context = new DataContext();

public IEnumerable<Record> GetRecords()
{
    var records = Context.Records;
    foreach (var record in records)
    {
        yield return record;
    }
}

public void UpdateData()
{
    Context.SaveChanges();
}
BLL:
public void ManageData()
{
    foreach (var record in DAL.GetRecords())
    {
        record.IsUpdated = true;
        DAL.UpdateData();
    }
}
With this approach you will end up with a lot of objects created in memory (potentially the whole DB) and, which can be even more important, those objects will not correspond to the current values in the DB (if the DB gets updated outside of your application/machine). So, in order to use memory efficiently and to have up-to-date values for your entities, it's really better to create a data context per transaction.
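A sketch of that per-transaction alternative for the DAL above, reusing the question's DataContext/Records/SaveChanges shorthand (the Attach call stands in for however you re-associate a detached entity with the new context):

public IEnumerable<Record> GetRecords()
{
    // Short-lived context: materialize the results, then let the context go.
    using (var context = new DataContext())
    {
        return context.Records.ToList();
    }
}

public void UpdateRecords(IEnumerable<Record> records)
{
    // A second short-lived context for the write; one save per unit of work.
    using (var context = new DataContext())
    {
        foreach (var record in records)
        {
            context.Records.Attach(record);
            record.IsUpdated = true;
        }
        context.SaveChanges();
    }
}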
I have an application that will generate reports from various databases, each with the same schema. The only difference between DB1, DB2, DB3, and DBx is the data. All tables, views, etc are the same in structure.
For each participating DB in my application I have created a distinct Linq to SQL DataContext.
I am creating a ReportHelper class that will generate reports from the underlying data. I want to be able to call a Method like "GetCustomerSales" and have it spit back the data for my report. The problem is that I want to pass or set the DataContext for the GetCustomerSales method before I call it (ideally when constructing the ReportHelper Class).
However, my GetCustomerSales method wants me to use a specific DataContext, and I do not want to create this method over and over for each potential DataContext in use in the app. What's the correct approach here?
Have a single data-context that matches the common schema. The difference is that instead of using just new SomeDataContext(), you should supply an appropriate connection-string (or connection) to the constructor:
var db = new SomeDataContext(connectionString);
or
var db = new SomeDataContext(connection);
Now all you need is multiple connection-strings, which is easier than multiple data-contexts. There are two choices there: you can store multiple strings, perhaps in config (this is especially useful if they each need different user accounts etc.), or you can use SqlConnectionStringBuilder to create the connection string at runtime, specifying the appropriate database.
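A rough sketch of the SqlConnectionStringBuilder option, assuming SQL Server and a base connection string stored in config under a hypothetical "Base" key:

using System.Configuration;
using System.Data.SqlClient;

public static string BuildConnectionString(string databaseName)
{
    var builder = new SqlConnectionStringBuilder(
        ConfigurationManager.ConnectionStrings["Base"].ConnectionString)
    {
        // Same server and credentials, different database.
        InitialCatalog = databaseName
    };
    return builder.ConnectionString;
}

// Usage: var db = new SomeDataContext(BuildConnectionString("DB2"));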
To illustrate; every site in the StackExchange network is ultimately a different database, but they all use the same data-context type. We simply tell it about the connection at runtime (based on which hostname you accessed).
You can get around it by passing a connection string to your data context.
public class DataRepository
{
    private MyDataContext ctx;

    public DataRepository(string connection)
    {
        ctx = new MyDataContext(connection);
    }

    // now you can use your context
}
I am building a small data intensive app with Windows Forms. In the main project I have a folder that holds my DBML as well as data classes to provide CRUD operations against the database. There are about 10 said data classes currently.
The code behind in the form instantiates business objects and makes calls against them to do all the work. These business objects are making calls against the static data access classes.
An example of a data class would be something like this
static class CustomerData
{
    public static IEnumerable<Customer> GetCustomersForRun(int runID)
    {
        var db = new FooDataContext("connectionString");
        return db.Customers.Where(ri => ri.RunID == runID);
    }
}
Now obviously there are a few problems with my initial design that I need to address.
1) It's not nice to have each static method need to create its own DataContext. This doesn't seem very DRY at all.
2) Because I'm relying on some lazy loading I'm not able to wrap my DataContext in a using statement.
A couple of different ideas I have to fix this problem are
1) Get rid of the static methods and instead create an abstract base data access class that can instantiate my DataContext.
2) Have each business object create its own DataContext and pass that into the static methods of the data access classes.
An example of the method signature would then be
public static IEnumerable<Customer> GetCustomerForRun(DataContext db, int runID)
My specific questions are
1) Am I over complicating this?
2) Do you typically dispose of your DataContext objects?
3) Which of my solutions makes most sense? If none of them what do you recommend?
1) Am I over complicating this?
It really depends. If your application is very small, shoehorning a pattern into the mix might make things more complicated; simply using the DataContext directly might be easier to understand than putting a layer of abstraction on top of LINQ to SQL.
2) Do you typically dispose of your DataContext objects?
It will depend on your implementation. If you plan on passing an IQueryable<T> around to do filtering, wrapping the context in a using(){} block will cause you grief: since LINQ to SQL only triggers a SQL query when something calls GetEnumerator(), your context might already be disposed by then and the call will fail.
Consider this example:
IQueryable<Table> GetStuff()
{
    using (var db = new DataContext())
    {
        return db.Tables.Where(i => i.Id == 1);
    }
}
If in another method you try to do GetStuff().Where(i => i.Name == "Jon").ToList(), the query will fail because the context has already been disposed.
Now, if you don't do that, you gain the power of IQueryable:
IQueryable<Table> GetStuff()
{
    return db.Tables.Where(i => i.Id == 1);
}
GetStuff().Where(i => i.Name == "Jon").ToList() will work and allows you to filter the query and defer execution of the SQL statement until the very last minute. More information can be found here.
3) Which of my solutions makes most sense? If none of them what do you recommend?
I usually try to stay away from static classes/methods, since they make unit testing very difficult. A good place to look is the Repository pattern, and this answer gives some quick information.
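For instance, here is a minimal repository sketch (the interface and class names are hypothetical) that wraps the data access from the question so it can be mocked in unit tests:

using System;
using System.Collections.Generic;
using System.Linq;

public interface ICustomerRepository
{
    IEnumerable<Customer> GetCustomersForRun(int runID);
}

public class CustomerRepository : ICustomerRepository, IDisposable
{
    private readonly FooDataContext _db;

    public CustomerRepository(string connectionString)
    {
        _db = new FooDataContext(connectionString);
    }

    public IEnumerable<Customer> GetCustomersForRun(int runID)
    {
        // Materialize here so callers never depend on the context's lifetime.
        return _db.Customers.Where(c => c.RunID == runID).ToList();
    }

    public void Dispose()
    {
        _db.Dispose();
    }
}

Business objects would then take an ICustomerRepository instead of calling a static class, which makes them easy to test against a fake implementation.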
Still extremely new to the whole MVC/LINQ thing. I'm in the process of building a blog, and I need to build a table for the posts, and within each post build a table for the comments of that post.
To build the tables, I'm doing something like:
postsTable = (new DataContext(connectionString)).GetTable<Post>();
Unfortunately, each comments table does the same thing. I see new DataContext(connectionString) and assume it's reconnecting every single time. I feel like I should be able to connect once at the start of the fetch and then close the connection when I'm done. Am I doing this wrong?
What you're looking for is a pattern called "Session/Context per Request". The most popular (cross-ORM, cross-WebForms/MVC) way of doing this is to new up a context at the start of a request, throw it into a per-request store (or session), and finally, at the end of the request, pull it down and dispose of it.
From: http://blogs.microsoft.co.il/blogs/gilf/archive/2010/05/18/how-to-manage-objectcontext-per-request-in-asp-net.aspx
public static class ContextHelper<T> where T : ObjectContext, new()
{
    #region Consts

    private const string ObjectContextKey = "ObjectContext";

    #endregion

    #region Methods

    public static T GetCurrentContext()
    {
        HttpContext httpContext = HttpContext.Current;
        if (httpContext != null)
        {
            string contextTypeKey = ObjectContextKey + typeof(T).Name;
            if (httpContext.Items[contextTypeKey] == null)
            {
                httpContext.Items.Add(contextTypeKey, new T());
            }
            return httpContext.Items[contextTypeKey] as T;
        }
        throw new ApplicationException("There is no Http Context available");
    }

    #endregion
}
You can also mess around with new()ing up the DataContext in your controller constructor as seen here:
http://www.stephenwalther.com/blog/archive/2008/08/20/asp-net-mvc-tip-34-dispose-of-your-datacontext-or-don-t.aspx
Now, this article says you don't have to worry about disposing of your context, but I disagree. With modern ORMs you really want to take advantage of the session-like way they track and persist changes. Without manually disposing of your context, all sorts of bad code and horrible unit of work patterns won't throw exceptions like they should. IMHO the session aspect of an ORM is the most important part. Ignore it at your peril.
If you're using SQL Server, the connection pooling feature negates a lot of the performance impact of opening and closing a connection. Unless you start doing 100,000 requests a second, I wouldn't worry about it.
Go to http://www.asp.net/MVC and check out the tutorials and starter kits; there are a whole bunch of good articles that will show you what you are looking for.
Since you're doing LINQ to SQL you can...
Define the blogs and comments tables on your database with the appropriate foreign key relationship.
Drag and drop both onto your dbml designer surface. Click the save button; that's when the code is generated.
When you populate your viewmodel (or however you're getting your data back to the result of a controller action) only query the information you need at the time.
For a single view of a blog entry with associated comments, the LINQ query might look like so...
YourDataContext dataContext = new YourDataContext();
var blogData = (from b in dataContext.Blogs
where b.BlogId == 1
select b).SingleOrDefault();
// you should now have a single blog instance with a property named Comments. Set the
// fetch mode to eager if you plan to always show the comments; leave it lazy to only do
// the lookup if necessary. Execute all of your queries/accesses before you pass
// data to the view
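If you do want eager loading, LINQ to SQL does it through DataLoadOptions; a sketch, assuming the generated Blog entity exposes a Comments association created from the foreign key:

using System.Data.Linq;

YourDataContext dataContext = new YourDataContext();

// Must be set before any query runs on this context.
var loadOptions = new DataLoadOptions();
loadOptions.LoadWith<Blog>(b => b.Comments);
dataContext.LoadOptions = loadOptions;

// The blog and its Comments now come back in a single round trip.
var blogData = dataContext.Blogs.SingleOrDefault(b => b.BlogId == 1);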