I have a Windows Service that runs every ten seconds, on different threads.
This service performs various CRUD operations on a SQL Server 2008 database on the same machine.
For each CRUD operation, I wrap the context in a using block, like this example:
public object InsertClient(clsClient c)
{
    using (ClientEntities e = new ClientEntities())
    {
        e.Clients.AddObject(c);
        e.SaveChanges(); // without this, the insert is never sent to the database
    }
    return c; // the method declares a return value, so return the inserted client
}
I'm concerned about the efficiency of these operations if another thread is already interacting with the same table. Is this the right way to do it?
Furthermore, is there any risk of a cross-thread exception with this approach?
Thanks for your help.
No, it's not wrong to have multiple object contexts, as long as you create and dispose of each one right away.
Here is the general recommendation from MSDN.
When working with a long-running object context, consider the following:

- As you load more objects and their references into memory, the object context may grow quickly in memory consumption. This may cause performance issues.
- Remember to dispose of the context when it is no longer required.
- If an exception causes the object context to be in an unrecoverable state, the whole application may terminate.
- The chances of running into concurrency-related issues increase as the gap between the time when the data is queried and updated grows.

When working with Web applications, use an object context instance per request. If you want to track changes in your objects between the tiers, use the self-tracking entities. For more information, see Working with Self-Tracking Entities and Building N-Tier Applications.

When working with Windows Presentation Foundation (WPF) or Windows Forms, use an object context instance per form. This lets you use the change-tracking functionality that the object context provides.
If you are worried about the cost of creating a connection for every new object context: EF relies on the underlying data provider, and if the provider is ADO.NET, connection pooling is enabled by default unless you disable it in the connection string.
Also, the metadata is cached globally per application domain, so every new object context simply copies the metadata from the global cache.
And since EF is not thread-safe, it's recommended to give each thread its own object context.
Like much of .NET, the Entity Framework is not thread-safe. This means
that to use the Entity Framework in multithreaded environments, you
need to either explicitly keep individual ObjectContexts in separate
threads, or be very conscientious about locking threads so that you
don't get collisions. - MSDN
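The context-per-thread advice above can be sketched as follows. This is an illustrative sketch, not the asker's exact code: ClientEntities and clsClient come from the question, and clientA/clientB are hypothetical instances.

```csharp
// Each operation creates and disposes its own short-lived context,
// so no ObjectContext is ever shared between threads.
public object InsertClient(clsClient c)
{
    using (var e = new ClientEntities()) // fresh context per operation
    {
        e.Clients.AddObject(c);
        e.SaveChanges(); // commits on this context's own pooled connection
    }
    return c;
}

// Safe to call from concurrent threads precisely because nothing is shared:
Parallel.Invoke(
    () => InsertClient(clientA),
    () => InsertClient(clientB));
```

Each context opens a connection from the ADO.NET pool and returns it on dispose, so the per-operation pattern costs far less than it looks.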
I understand that DbContext is not thread-safe; in addition, DbContext caches data, which may lead to data inconsistency when several transactions try to save/commit their own changes to the database. Thus, it is highly recommended to inject it per request (here). But I have a situation where only read operations exist (in a stand-alone class library) and there are no transactions or create/update/delete operations.
My question is: Is it safe to inject DbContext as singleton in this situation?
Entity Framework developers explicitly say that DbContext is not thread-safe for any operations performed on it, not just write operations (add, save changes, etc.), and you should just believe them on that if you don't want to spend days debugging mysterious failures one day.
Even on read operations, EF can perform in-memory write operations on its internal structures, which are not thread-safe, and you cannot be sure it doesn't do that in any given case. For example, from documentation talking about processing of the result set returned by a query:
If the query is a tracking query, EF checks if the data represents an
entity already in the change tracker for the context instance
So if the query is a tracking query, EF checks the change tracker of the current instance for an already existing entity of this type with the same key, which means that if no such entity exists, EF puts it into the change tracker. That is a write operation, so it is not safe.
You could say: well, I'll just use AsNoTracking() then. But there is another issue with concurrent AsNoTracking queries - EF won't even allow you to execute them anyway. A library maintainer says:
Concurrent use of the same DbContext is not possible - and not just
for tracked queries. Specifically, a DbContext has an underlying
DbConnection to the database, which cannot be used concurrently. There
are other various components at work under the hood which don't
support multithreading.
However, there's nothing wrong with instantiating several DbContexts
and executing queries on them - whether tracking or non-tracking. That
should get you the behavior you're looking for. If you run into any
further issues don't hesitate to post back.
So there are undocumented internal components in play which are not thread-safe, and you cannot be sure, while doing anything on a DbContext from multiple threads, that you won't hit such a component.
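Instead of a singleton, a context-per-operation registration keeps concurrent reads safe. Here is a sketch using EF Core's IDbContextFactory; AppDbContext, Product, and connString are hypothetical names for illustration.

```csharp
// Sketch: register a context *factory* as a singleton, never the context itself.
services.AddDbContextFactory<AppDbContext>(o => o.UseSqlServer(connString));

public class ProductReader
{
    private readonly IDbContextFactory<AppDbContext> _factory;

    public ProductReader(IDbContextFactory<AppDbContext> factory) => _factory = factory;

    public async Task<List<Product>> GetAllAsync()
    {
        // Each call gets its own short-lived context, so concurrent callers
        // never share a DbContext (or its underlying DbConnection).
        using var ctx = _factory.CreateDbContext();
        return await ctx.Products.AsNoTracking().ToListAsync();
    }
}
```

The factory itself is thread-safe and cheap to resolve, so a read-only class library gets singleton-like convenience without sharing any context state.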
I have a unique (or so I think) problem - we have an ASP.NET web app using MVC principles. The project will be at most single threaded (our business requires single point of control). We are using Entity Framework to connect to the database
Problem:
We want to query our database less frequently than every page load.
I have considered putting our database connection in a singleton but am worried about connecting too infrequently -- will a query still work if the connection was established a significant time ago? How would you recommend connecting to the database?
How would you recommend connecting to the database?
Do NOT use a shared connection. Connections are not thread-safe, and are pooled by .NET, so creating one generally isn't an expensive operation.
The best practice is to create a command and connection for every database request. If you are using Entity Framework, then this will be taken care of for you.
If you want to cache results using the built-in Session or Cache properties, then that's fine, but don't cache disposable resources like connections, EF contexts, etc.
If at some point you find you have a measurable performance problem directly related to creating connections or contexts, then you can try and deal with that, but don't try to optimize something that might not even be a problem.
If you want to get data without connecting to the database, you need to cache it - in memory, in a file, or in whatever means of storage you want - but you need to keep it in front of the DB somehow. There is no other way known to me.
If by connecting you mean building a completely new SqlConnection to your DB, then you can either rely on connection pooling (EF is smart enough to keep your connections alive for some minutes even after you finish your work) or you can create connections and keep them alive inside your application by not closing them immediately (i.e., by tracking them in a data structure).
But you should definitely consider if this is REALLY what you want. The way EF does it internally is most of the time exactly what you want.
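The "cache results, not connections" advice above can be sketched with System.Runtime.Caching. ClientEntities is borrowed from the earlier question; the cache key and expiry are illustrative assumptions.

```csharp
using System;
using System.Collections.Generic;
using System.Linq;
using System.Runtime.Caching;

// Cache the *results* of a query, never the DbContext or the connection.
public static List<Client> GetClients()
{
    var cache = MemoryCache.Default;
    if (cache.Get("clients") is List<Client> cached)
        return cached; // served from memory, no database round trip

    using (var db = new ClientEntities()) // fresh, disposable context per query
    {
        var clients = db.Clients.ToList();
        cache.Set("clients", clients,
            DateTimeOffset.Now.AddMinutes(5)); // hit the DB at most every 5 minutes
        return clients;
    }
}
```

This queries the database less often than every page load while keeping every disposable resource short-lived.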
Some further reading:
https://learn.microsoft.com/en-us/aspnet/mvc/overview/older-versions/getting-started-with-ef-5-using-mvc-4/implementing-the-repository-and-unit-of-work-patterns-in-an-asp-net-mvc-application
I need an ORM that is suitable for a stateful application. I'm going to keep entities between requests in a low-latency realtime game server with persistent client connections. There is only one server instance connected to the database, so no data can be changed from "outside" and the server can rely on its cache.
When a user remotely logs in to the server, his whole profile is loaded into server memory. Several higher-level services are also created for each user to operate on profile data and provide functionality. They can also have internal fields (state) to store temporary data. When the user wants to change his signature, he asks the corresponding service to do so. The service tracks how frequently the user changes his signature and allows it only once per ten minutes (for example) - such a short interval is not tracked in the db; this is temporary state. This change should be stored to the db by executing only one query: UPDATE users SET signature = ... WHERE user_id = .... When the user logs off, his profile is unloaded from server memory after minutes/hours of inactivity. The db here is only storage. This is what I call stateful.
Some entities are considered "static data" and loaded only once at application start. Those can be referenced from other "dynamic" entities. Loading "dynamic" entity should not require reloading referenced "static data" entity.
Update/Insert/Delete should set/insert/delete only changed properties/entities even with "detached" entity.
Write operations should not load data from the database (perform a Select) beforehand each time to detect changes. (State can be tracked in a dynamically generated inheritor.) I have the state locally; there is no sense in loading anything. I want to continue tracking changes even outside of the connection scope and "upload" changes when I want.
While performing operations references of persisted objects should not be changed.
DBConnection-per-user is not going to work. The expected online is thousands of users.
Entities from "static data" can be assigned to "dynamic" entity properties (which represent foreign keys) and Update should handle this correctly.
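The "update only changed properties on a detached entity, without a preliminary Select" requirement can be approximated in EF with Attach plus per-property modified flags. A hedged sketch; User and AppDbContext are hypothetical types standing in for the profile model.

```csharp
// Sketch: push a single changed column from a detached, in-memory entity
// without reading the row back first.
public void SaveSignature(User detachedUser)
{
    using (var db = new AppDbContext())
    {
        db.Users.Attach(detachedUser);  // attach as Unchanged: no database read
        db.Entry(detachedUser)
          .Property(u => u.Signature)
          .IsModified = true;           // mark only this column as dirty

        // Issues a single statement of the form:
        //   UPDATE users SET signature = @p0 WHERE user_id = @p1
        db.SaveChanges();
    }
}
```

This keeps the db as pure storage: the authoritative state lives in server memory, and only the delta is written out.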
Now I'm using NHibernate despite it's designed for stateless applications. It supports reattaching to session but that looks like very uncommon usage, requires me to use undocumented behavior and doesn't solve everything.
I'm not sure about Entity Framework - can I use it that way? Or can you suggest another ORM?
If the server recreated (or, especially, reloaded) user objects each time a user hits a button, it would eat CPU very fast. CPU scales vertically, expensively, and with small effect. Conversely, if you are out of RAM you can just go and buy more - like horizontal scaling, but easier to code for. If you think another approach should be used here, I'm ready to discuss it.
Yes, you can use EF for this kind of application. Please keep in mind that under heavy load you will get some db errors from time to time. And typically it's faster to recover after errors when your application tracks the changes, not EF. By the way, you can use NHibernate this way too.
I have used hibernate in a stateful desktop application with extremely long sessions: the session starts when the application launches, and remains open for as long as the application is running. I had no problems with that. I make absolutely no use of attaching, detaching, reattaching, etc. I know it is not standard practice, but that does not mean it is not doable, or that there are any pitfalls. (Edit: but of course read the discussion below for possible pitfalls suggested by others.)
I have even implemented my own change notification mechanism on top of that, (separate thread polling the DB directly, bypassing hibernate,) so it is even possible to have external agents modify the database while hibernate is running, and to have your application take notice of these changes.
If you have lots and lots of stuff already working with hibernate, it would probably not be a good idea to abandon what you already have and rewrite it unless you are sure that hibernate absolutely won't do what you want to accomplish.
I have always seen lots of questions about how to handle the life-cycle of an EF context, but have never found a concrete answer to this.
As stated everywhere, the context is intended to be used as a unit of work and disposed whenever you finish that work.
So, let's suppose that in a program we create a class to manage all the typical database tasks (create user, update user, delete user, etc.) and in each one we create a context wrapped in a using statement, as it is intended to be used (at least according to all the info I have found).
So now, in a function of our main program, we use, let's say, 3 or 4 of those functions. Does that mean we have opened and closed four connections to the database, or does EF use a pooling mechanism to reuse the same connection?
Connecting to the DB is a very expensive process (compared to executing simple queries), and when managing connections manually I tend to pool them for reuse, but with EF I am lost: I don't know if I should pool contexts, pool connections and create contexts using those connections, or do nothing because EF will take care of it.
If all your EF context instances share the same connection string, then by default they use a connection pool.
However, I would recommend you to read about the Unit of Work pattern
http://www.asp.net/mvc/tutorials/getting-started-with-ef-5-using-mvc-4/implementing-the-repository-and-unit-of-work-patterns-in-an-asp-net-mvc-application
http://www.codeproject.com/Articles/615499/Models-POCO-Entity-Framework-and-Data-Patterns
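A minimal unit-of-work sketch in the spirit of those tutorials: one context shared by several repositories for the duration of a single business operation, so the "3 or 4 functions" above reuse one pooled connection instead of four. AppDbContext, UserRepository, and OrderRepository are illustrative names, not the tutorials' exact types.

```csharp
// Sketch: one DbContext per unit of work, shared by the repositories.
public class UnitOfWork : IDisposable
{
    private readonly AppDbContext _context = new AppDbContext();

    public UserRepository Users => new UserRepository(_context);
    public OrderRepository Orders => new OrderRepository(_context);

    public void Commit() => _context.SaveChanges(); // one batch, one transaction
    public void Dispose() => _context.Dispose();    // connection returns to the pool
}

// Usage: several operations, one context, one pooled connection.
using (var uow = new UnitOfWork())
{
    uow.Users.Add(newUser);
    uow.Orders.Add(newOrder);
    uow.Commit();
}
```

Even without this pattern, four separate using blocks are cheap: each context borrows an already-open connection from the ADO.NET pool rather than performing a fresh TCP/login handshake.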
We have our first NHibernate project going on pretty well. However, I still have not grasped the complete picture how to manage the sessions and objects in our scenario.
So, we are configuring a system structure in a persistent object model, stored in a database with NHibernate.
The system consists of physical devices, which the application is monitoring in a service process. So at service startup, we instantiate Device objects in the service and update their status according to data read from the device interface. The object model stays alive during the lifetime of the service.
The service also serves Silverlight clients, which display object data and may also manipulate some objects. But they must access the same objects that the service is using for monitoring, for example because the objects hold in-memory data as well, which is not persisted. (Yes, we are using DTO objects to actually transfer the data to the clients.)
Since the service is a multithreaded system, the question is how the NHibernate sessions should be managed.
I am now considering an approach that we would just have a background thread that would take care of object persistence in the background and the other threads would just place "SaveRequests" to our Repository, instead of directly accessing the NHibernate sessions. By this means, I can use a single session for the service and manage the NHibernate layer completely separate from the service and clients that access the objects.
I have not found any documentation for such a setup, since everyone suggests a session-per-request model or some variation. But if I understand it right, if I instantiate an object in one session and save it in another one, it is not the same object - and it also seems that NHibernate will then create a new entry in the database.
I've also tried to figure out the role of IoC containers in this kind of context, but I have not found any useful examples that would show that they could really help me.
Am I on the right track, or how should I proceed?
Consider ISession a unit of work. You will want to define within the context of your application, what constitutes a unit of work. A unit of work is a boundary around a series of smaller operations which constitute a complete, functional task (complete and functional is defined by you, in the design of your application). Is it when your service responds to a Silverlight client request, or other external request? Is it when the service wakes up to do some work on a timer? All of the above?
You want the session to be created for that unit of work, and disposed when it completes. It is not recommended that you use long-running ISession instances, where operations lazily use whatever ambient ISession they can find.
The idea is generally described like this:

1. I need to do some work (because I'm responding to an event, whether it is an incoming request or a job on a timer; it doesn't matter).
2. Therefore, I begin a new unit of work (which helps me keep track of all the operations I need to do while performing this work).
3. The unit of work begins a new ISession to keep track of my work.
4. I do my work.
5. If I was able to do my job successfully, all my changes are flushed and committed; if not, all my changes are rolled back.
6. I clean up after myself (dispose the ISession, etc.).
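Those steps map onto NHibernate roughly as follows. A hedged sketch: sessionFactory is assumed to be a pre-built ISessionFactory, and DoWork stands in for whatever the unit of work actually does.

```csharp
// Sketch of the unit-of-work steps above with NHibernate.
using (ISession session = sessionFactory.OpenSession())  // step 3: new ISession
using (ITransaction tx = session.BeginTransaction())     // step 2: unit of work begins
{
    try
    {
        DoWork(session);   // step 4: the actual operations for this task
        tx.Commit();       // step 5 (success): flush and commit all changes
    }
    catch
    {
        tx.Rollback();     // step 5 (failure): roll everything back
        throw;
    }
}                          // step 6: Dispose cleans up the session
```

In your service, this block would run per Silverlight request and per timer wake-up, instead of one session living in a background persistence thread.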