MVC: EF6 Connection Pooling and SQL Server CONTEXT_INFO - c#

In an ASP.NET MVC application, I'm trying to use SQL Server's CONTEXT_INFO to pass the currently logged in user, so my audit triggers record not only the web server's SQL login, but also the user logged in to the site.
I'm having trouble being certain that the current user will always be fed into the database server context though.
On the backend I have everything set up: a sproc to set the context, a function to read it, and DML triggers to record it. No problem there.
The app end is a bit more involved. I subscribe to the Database.Connection.StateChange event so I can catch each newly opened connection and set this context accordingly.
Additionally, to be able to retrieve the current login ID of the MVC site in the data layer (which has no access to the web project), I supply a delegate to the EF constructor that will return the user ID. This also means that any other peripheral projects I have set up require this dependency as well, and it keeps most of the implementation detail out of my hair during the web dev:
public class CoreContext : DbContext
{
    Func<int> _uidObtainer;

    public CoreContext(Func<int> uidObtainer) : base(nameof(CoreContext)) { construct(uidObtainer); }
    public CoreContext(Func<int> uidObtainer, string connection) : base(connection) { construct(uidObtainer); }

    void construct(Func<int> uidObtainer)
    {
        // disallow updates of the db from our models
        Database.SetInitializer<CoreContext>(null);

        // catch the connection change so we can update for our userID
        _uidObtainer = uidObtainer;
        Database.Connection.StateChange += connectionStateChanged;
    }
    private void connectionStateChanged(object sender, System.Data.StateChangeEventArgs e)
    {
        // only act when a connection transitions into the Open state
        if (e.OriginalState == System.Data.ConnectionState.Open ||
            e.CurrentState != System.Data.ConnectionState.Open)
        {
            return;
        }

        // set our context info for logging
        int uid = _uidObtainer();
        var conn = ((System.Data.Entity.Core.EntityClient.EntityConnection)sender).StoreConnection;
        var cmd = conn.CreateCommand();
        cmd.CommandText = "audit.SetContext";
        cmd.CommandType = System.Data.CommandType.StoredProcedure;
        cmd.Parameters.Add(new System.Data.SqlClient.SqlParameter("@DomainUserID", uid));
        cmd.ExecuteNonQuery();
    }
// etc etc...
In my MVC project, I'll have code that looks like this:
context = new Data.CoreContext(() => AppService.UserID());
(making use of a readily accessible method to pass as delegate, which in turn reads from HttpContext.Current.User)
This is all shaping up nicely, except one unknown:
I know that it's possible for an EF context instance to span multiple logged-in users, since it lives as part of the IIS app pool and not per HttpContext.
What I don't know enough about is connection pooling and how connections are opened/re-opened, so I can't be sure that each time my StateChange handler runs I'll actually be retrieving the current UserID from the delegate.
Said differently: is it possible for a single connection to be open and used over the span of two separate HttpContext instances? I believe yes, seeing as how there's nothing to enforce otherwise (at least not that I'm aware of).
What can I do to ensure that each connection is getting the current HttpContext?
(possibly pertinent notes: There's no UoW/Repository pattern outside of EF itself, and data contexts are generally instantiated once per controller)

I see: one context per controller is generally incorrect. Instead I should be using one context per request, which (besides other advantages) ensures my scenario operates correctly as well.
I found this answer, which explains the reasoning behind it: One DbContext per web request... why?
And I found this answer, which explains quite succinctly how to implement via BeginRequest and EndRequest: One DbContext per request in ASP.NET MVC (without IOC container)
(code from second answer pasted below to prevent linkrot)
protected virtual void Application_BeginRequest()
{
    HttpContext.Current.Items["_EntityContext"] = new EntityContext();
}

protected virtual void Application_EndRequest()
{
    var entityContext = HttpContext.Current.Items["_EntityContext"] as EntityContext;
    if (entityContext != null)
        entityContext.Dispose();
}
And in your EntityContext class...
public class EntityContext
{
    public static EntityContext Current
    {
        get { return HttpContext.Current.Items["_EntityContext"] as EntityContext; }
    }
}
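Tying that back to the original problem, a minimal sketch (assuming the same Global.asax approach, that AppService.UserID() is reachable from there, and an illustrative "_CoreContext" key) would create the delegate-carrying CoreContext once per request:

protected virtual void Application_BeginRequest()
{
    // the delegate reads the user for *this* request, so CONTEXT_INFO is always current
    HttpContext.Current.Items["_CoreContext"] = new Data.CoreContext(() => AppService.UserID());
}

protected virtual void Application_EndRequest()
{
    var coreContext = HttpContext.Current.Items["_CoreContext"] as Data.CoreContext;
    if (coreContext != null)
        coreContext.Dispose();
}

Controllers then pull the context from HttpContext.Current.Items (or have it injected) instead of newing it up per controller, so no context instance, and therefore no connection it opens, ever straddles two users' requests.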

Related

Dynamics CRM Online Object caching not caching correctly

I have a requirement where we need a plugin to retrieve a session id from an external system and cache it for a certain time. I use a field on the entity to test if the session is actually being cached. When I refresh the CRM form a couple of times, from the output, it appears there are four versions (at any time consistently) of the same key. I have tried clearing the cache and testing again, but still the same results.
Any help appreciated, thanks in advance.
Output on each refresh of the page:
20170511_125342:1:55a4f7e6-a1d7-e611-8100-c4346bc582c0
20170511_125358:1:55a4f7e6-a1d7-e611-8100-c4346bc582c0
20170511_125410:1:55a4f7e6-a1d7-e611-8100-c4346bc582c0
20170511_125342:1:55a4f7e6-a1d7-e611-8100-c4346bc582c0
20170511_125437:1:55a4f7e6-a1d7-e611-8100-c4346bc582c0
20170511_125358:1:55a4f7e6-a1d7-e611-8100-c4346bc582c0
20170511_125358:1:55a4f7e6-a1d7-e611-8100-c4346bc582c0
20170511_125437:1:55a4f7e6-a1d7-e611-8100-c4346bc582c0
To accomplish this, I have implemented the following code:
public class SessionPlugin : IPlugin
{
    public static readonly ObjectCache Cache = MemoryCache.Default;
    private static readonly string _sessionField = "new_sessionid";

    public void Execute(IServiceProvider serviceProvider)
    {
        var context = (IPluginExecutionContext)serviceProvider.GetService(typeof(IPluginExecutionContext));
        try
        {
            // only run for post-operation (stage 40) Retrieve messages
            if (context.MessageName.ToLower() != "retrieve" || context.Stage != 40)
                return;

            var userId = context.InitiatingUserId.ToString();

            // Use the userid as key for the cache
            var sessionId = CacheSessionId(userId, GetSessionId(userId));
            sessionId = $"{sessionId}:{Cache.Count(kvp => kvp.Key == userId)}:{userId}";

            // Assign session id to entity
            var entity = (Entity)context.OutputParameters["BusinessEntity"];
            if (entity.Contains(_sessionField))
                entity[_sessionField] = sessionId;
            else
                entity.Attributes.Add(new KeyValuePair<string, object>(_sessionField, sessionId));
        }
        catch (Exception e)
        {
            throw new InvalidPluginExecutionException(e.Message);
        }
    }

    private string CacheSessionId(string key, string sessionId)
    {
        // If value is in cache, return it
        if (Cache.Contains(key))
            return Cache.Get(key).ToString();

        var cacheItemPolicy = new CacheItemPolicy()
        {
            AbsoluteExpiration = ObjectCache.InfiniteAbsoluteExpiration,
            Priority = CacheItemPriority.Default
        };
        Cache.Add(key, sessionId, cacheItemPolicy);
        return sessionId;
    }

    private string GetSessionId(string user)
    {
        // this will be replaced with the actual call to the external service for the session id
        return DateTime.Now.ToString("yyyyMMdd_hhmmss");
    }
}
This has been greatly explained by Daryl here: https://stackoverflow.com/a/35643860/7708157
Basically, you do not have one MemoryCache instance for the whole CRM system; your output simply proves that there are multiple app domains running your plugin, so even static variables stored in the plugin can hold multiple values, which you cannot rely on. There is no documentation on MSDN that explains how the sandboxing works (especially app domains in this case), but using static variables is certainly not a good idea. Of course, if you are dealing with CRM Online, you cannot be sure whether there is a single front-end server or many of them (which will also result in such behaviour).
Class level variables should be limited to configuration information. Using a class level variable as you are doing is not supported. In CRM Online, because of multiple web front ends, a specific request may be executed on a different server by a different instance of the plugin class than another request. Overall, assume CRM is stateless and that unless persisted and retrieved nothing should be assumed to be continuous between plugin executions.
Per the SDK:
The plug-in's Execute method should be written to be stateless because
the constructor is not called for every invocation of the plug-in.
Also, multiple system threads could execute the plug-in at the same
time. All per invocation state information is stored in the context,
so you should not use global variables or attempt to store any data in
member variables for use during the next plug-in invocation unless
that data was obtained from the configuration parameter provided to
the constructor.
Reference: https://msdn.microsoft.com/en-us/library/gg328263.aspx
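As a sketch of the constructor-configuration route the SDK describes (the unsecure configuration string, the _serviceUrl field, and the new_sessionid assignment are illustrative; the external call is still a stub), the plugin might look like this:

using System;
using Microsoft.Xrm.Sdk;

public class SessionPlugin : IPlugin
{
    // configuration passed at registration time is the only state the SDK sanctions
    private readonly string _serviceUrl;

    // CRM hands the registered unsecure/secure configuration strings to this constructor
    public SessionPlugin(string unsecureConfig, string secureConfig)
    {
        _serviceUrl = unsecureConfig;
    }

    public void Execute(IServiceProvider serviceProvider)
    {
        var context = (IPluginExecutionContext)serviceProvider.GetService(typeof(IPluginExecutionContext));

        // re-acquire per-invocation data every time instead of relying on statics
        var sessionId = GetSessionId(context.InitiatingUserId.ToString());

        if (context.OutputParameters.Contains("BusinessEntity"))
        {
            var entity = (Entity)context.OutputParameters["BusinessEntity"];
            entity["new_sessionid"] = sessionId;
        }
    }

    private string GetSessionId(string user)
    {
        // stub for the real call to the external system at _serviceUrl
        return DateTime.Now.ToString("yyyyMMdd_hhmmss");
    }
}

If the session id is expensive to fetch, the supported place to keep it is outside the sandbox (for example in the external system itself, or persisted in a CRM record), not in plugin statics.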

Entity framework object not updating after being passed around

I'm working on setting up a new MVC payment site with a dependency-injected database connection in a separate project, and experimenting with some new things as I do. Currently, I'm trying to load an existing transaction from the database, authorize the card payment, and then save the result back to the database. Simple and straightforward, but when I call SaveChanges(), nothing gets saved, and I've run out of things to try.
The database interaction for this is handled by a CheckoutDataProvider:
public class CheckoutDataProvider : ICheckoutDataProvider
{
    private readonly CheckoutEntities _context;

    public CheckoutDataProvider(CheckoutEntities _context)
    {
        this._context = _context;
    }

    public ITransaction GetTransactionDetails(Guid transactionId)
    {
        var trans = _context.Transactions.FirstOrDefault(x => x.CheckoutTransactionId == transactionId);
        return trans; // It's OK if trans == null, because the caller will expect that.
    }

    public void AddAuthorization(ITransaction transaction, IAuthorizationHistory history)
    {
        try
        {
            var trans = (Transaction)transaction;
            var hist = (AuthorizationHistory)history;
            trans.AuthorizationHistories.Add(hist);
            _context.SaveChanges();
        }
        catch (DbEntityValidationException ex)
        {
            throw new InvalidDataException(ex.EntityValidationErrors.First().ValidationErrors.First().ErrorMessage, ex);
        }
    }
}
Transaction and AuthorizationHistory are EF objects and correspond directly to the database tables. CheckoutEntities is the context, as injected by Ninject.
GetTransactionDetails() works flawlessly. I give it the transactionId, I get the object, and then I use that data to run the card and generate the AuthorizationHistory class. Then I call AddAuthorization() to attach it to the transaction and save it to the database. But both the new AuthorizationHistory object and any changes to the original Transaction fail to save.
I can tell from inspecting the _context object that it's not aware of any changes, and if I make changes within GetTransactionDetails() (before it gets returned as an interface) they will persist. So it looks like a problem with the casting (which makes me feel icky anyway, so I'd love to find out that that's the problem).
Am I missing something obvious? Is there something missing to get this to work? Is there a way to avoid the casting in AddAuthorization()?
Probably you are not sharing the same DbContext between GetTransactionDetails and AddAuthorization. Because of this, Entity Framework is not able to track the changes.
Scope the lifetime of the DbContext to the web request; you can do this with Ninject using .InRequestScope() (https://github.com/ninject/ninject/wiki/Object-Scopes). With this option the same DbContext will be used for the whole web request.
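As a sketch of that binding (assuming the Ninject.Web.Common package, which supplies InRequestScope(), and a module name chosen here purely for illustration):

using Ninject.Modules;
using Ninject.Web.Common; // provides InRequestScope()

public class DataModule : NinjectModule
{
    public override void Load()
    {
        // one CheckoutEntities, and therefore one change tracker, per web request
        Bind<CheckoutEntities>().ToSelf().InRequestScope();
        Bind<ICheckoutDataProvider>().To<CheckoutDataProvider>().InRequestScope();
    }
}

With this in place, the provider that loaded the Transaction and the one that calls SaveChanges() share the same context, so the change tracker actually sees the new AuthorizationHistory and the modified Transaction.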

Storing global variables in c#?

I have basically created a class which, when a user logs into the website, queries the database and stores some settings in a List (so I have key/value pairs).
The reason for this is because I want to always be able to access these settings without going to the database again.
I put these in a class and loop through the fields via a SQL query and add them to the list.
How can I then access these variables from another part of the application? or is there a better way to do this? I'm talking server side and not really client side.
Here is an example of what I had at the moment:
public static void createSystemMetaData()
{
    string constring = ConfigurationManager.ConnectionStrings["Test"].ConnectionString;
    using (SqlConnection sql = new SqlConnection(constring))
    {
        sql.Open();
        SqlCommand systemMetaData = new SqlCommand("SELECT * FROM SD_TABLES", sql);

        //Set Modules
        var Modules = new List<KeyValuePair<string, string>>();
        using (SqlDataReader systemMetaDataReader = systemMetaData.ExecuteReader())
        {
            while (systemMetaDataReader.Read())
            {
                // GetOrdinal only returns the column position; read the actual column values instead
                var name = systemMetaDataReader["Sequence"].ToString();
                var value = systemMetaDataReader["Property"].ToString();
                Modules.Add(new KeyValuePair<string, string>(name, value));
            }
        }
    }
}
Thanks
Any static properties of a class will be preserved for the lifetime of the application pool, assuming you're using ASP.NET under IIS.
So a very simple class might look like:
public static class MyConfigClass
{
public static Lazy<Something> MyConfig = new Lazy<Something>(() => GetSomethings());
public static Something GetSomethings()
{
// this will only be called once in your web application
}
}
You can then consume this by simply calling
MyConfigClass.MyConfig.Value
For fewer users you can go with SessionState as Bob suggested; however, with more users you might need to move to a state server or load it from the database each time.
As others have pointed out, the risk of holding these values in global memory is that the values might change. Also, global variables are a bad design decision as you can end up with various parts of your application reading and writing to these values, which makes debugging problems harder than it need be.
A commonly adopted solution is to wrap your database access inside a facade class. This class can then cache the values if you wish to avoid hitting the database for each request. In addition, as changes are routed through the facade too, it knows when the data has changed and can empty its cache (forcing a database re-read) when this occurs. As an added bonus, it becomes possible to mock the facade in order to test code without touching the database (database access is notoriously difficult to unit test).
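As a minimal sketch of such a facade (class and method names are illustrative, and the actual database calls are left as stubs):

using System.Collections.Generic;

public class SettingsFacade
{
    private readonly object _sync = new object();
    private Dictionary<string, string> _cache;

    public string Get(string key)
    {
        lock (_sync)
        {
            if (_cache == null)
                _cache = LoadFromDatabase();   // only the first read after a change hits the database
            return _cache[key];
        }
    }

    public void Set(string key, string value)
    {
        lock (_sync)
        {
            SaveToDatabase(key, value);   // writes go through the facade...
            _cache = null;                // ...so it knows to invalidate its cached copy
        }
    }

    private Dictionary<string, string> LoadFromDatabase()
    {
        // stub: SELECT Sequence, Property FROM SD_TABLES and build the dictionary
        return new Dictionary<string, string>();
    }

    private void SaveToDatabase(string key, string value)
    {
        // stub: UPDATE/INSERT the setting
    }
}

Tests can then substitute a fake facade (or an interface extracted from it) instead of touching a real database.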
From the looks of things you are using universal values irrespective of users so an SqlCacheDependency would be useful here:
Make sure you set up a database dependency in web.config for the name Test.
public static class CacheData {
    public static List<KeyValuePair<string,string>> GetData() {
        var cache = System.Web.HttpContext.Current.Cache;
        SqlCacheDependency SqlDep = null;
        var modules = cache["Modules"] as List<KeyValuePair<string,string>>;
        if (modules == null) {
            // Because of possible exceptions thrown when this
            // code runs, use Try...Catch...Finally syntax.
            try {
                // Instantiate SqlDep using the SqlCacheDependency constructor.
                SqlDep = new SqlCacheDependency("Test", "SD_TABLES");
            }
            // Handle the DatabaseNotEnabledForNotificationException with
            // a call to the SqlCacheDependencyAdmin.EnableNotifications method.
            catch (DatabaseNotEnabledForNotificationException) {
                SqlCacheDependencyAdmin.EnableNotifications("Test");
            }
            // Handle the TableNotEnabledForNotificationException with
            // a call to the SqlCacheDependencyAdmin.EnableTableForNotifications method.
            catch (TableNotEnabledForNotificationException) {
                SqlCacheDependencyAdmin.EnableTableForNotifications("Test", "SD_TABLES");
            }
            finally {
                // Assign a value to modules here before calling the next line
                cache.Insert("Modules", modules, SqlDep);
            }
        }
        return modules;
    }
}
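Callers then read the settings through the cache instead of querying SD_TABLES on every request, for example:

var modules = CacheData.GetData(); // served from cache until SD_TABLES changes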

Using mini-profiler and MVC3 - profiling connections from model class in separate dll

I've got an MVC3 project and one of the models is built as a separate class library project, for re-use in other applications.
I'm using mini-profiler and would like to find a way to profile the database connections and queries that are made from this class library and return the results to the MVC3 applciation.
Currently, in my MVC3 app, the existing models grab a connection using the following helper class:
public class SqlConnectionHelper
{
public static DbConnection GetConnection()
{
var dbconn = new SqlConnection(ConfigurationManager.ConnectionStrings["db"].ToString());
return new StackExchange.Profiling.Data.ProfiledDbConnection(dbconn, MiniProfiler.Current);
}
}
The external model can't call this function though, because it knows nothing of the MVC3 application, or of mini-profiler.
One way I thought of would be to have an IDbConnection Connection field on the external model and then pass in a ProfiledDbConnection object to this field before I call any of the model's methods. The model would then use whatever's in this field for database connections, and I should get some profiled results in the MVC3 frontend.
However, I'm not sure if this would work, or whether it's the best way of doing this. Is there a better way I'm missing?
ProfiledDbConnection isn't dapper: it is mini-profiler. We don't provide any magic that can take over all connection creation; the only thing I can suggest is to maybe expose an event in your library that can be subscribed externally - so the creation code in the library might look a bit like:
public static event SomeEventType ConnectionCreated;

static DbConnection CreateConnection()
{
    var conn = ExistingDbCreationCode();
    var handler = ConnectionCreated;
    if (handler != null)
    {
        var args = new SomeEventArgsType { Connection = conn };
        handler(typeof(YourType), args);
        conn = args.Connection;
    }
    return conn;
}
which could give external code the chance to do whatever they want, for example:
YourType.ConnectionCreated += (s,a) => {
a.Connection = new StackExchange.Profiling.Data.ProfiledDbConnection(
a.Connection, MiniProfiler.Current);
};

How Should I be wrapping my select statements in a transaction?

I am going through my site with NHibernate Profiler and I got this message:
Alert: Use of implicit transactions is
discouraged
http://nhprof.com/Learn/Alerts/DoNotUseImplicitTransactions
I see they are on every single select statement.
private readonly ISession session;

public OrderHistoryRepo(ISession session)
{
    this.session = session;
}

public void Save(OrderHistory orderHistory)
{
    session.Save(orderHistory);
}

public List<OrderHistory> GetOrderHistory(Guid Id)
{
    List<OrderHistory> orderHistories = session.Query<OrderHistory>().Where(x => x.Id == Id).ToList();
    return orderHistories;
}

public void Commit()
{
    using (ITransaction transaction = session.BeginTransaction())
    {
        transaction.Commit();
    }
}
Should I be wrapping my GetOrderHistory with a transaction like I have with my commit?
Edit
How would I wrap select statements in a transaction? Would it be like this? But then "transaction" is never used.
public List<OrderHistory> GetOrderHistory(Guid Id)
{
using (ITransaction transaction = session.BeginTransaction())
{
List<OrderHistory> orderHistories = session.Query<OrderHistory>().Where(x => x.Id == Id).ToList();
return orderHistories;
}
}
Edit
Ninject (maybe I can leverage it to help me out like I did with getting a session)
public class NhibernateSessionFactory
{
public ISessionFactory GetSessionFactory()
{
ISessionFactory fluentConfiguration = Fluently.Configure()
.Database(MsSqlConfiguration.MsSql2008.ConnectionString(c => c.FromConnectionStringWithKey("ConnectionString")))
.Mappings(m => m.FluentMappings.AddFromAssemblyOf<Map>().Conventions.Add(ForeignKey.EndsWith("Id")))
.ExposeConfiguration(cfg => cfg.SetProperty("adonet.batch_size", "20"))
//.ExposeConfiguration(BuidSchema)
.BuildSessionFactory();
return fluentConfiguration;
}
private static void BuidSchema(NHibernate.Cfg.Configuration config)
{
new NHibernate.Tool.hbm2ddl.SchemaExport(config).Create(false, true);
}
}
public class NhibernateSessionFactoryProvider : Provider<ISessionFactory>
{
protected override ISessionFactory CreateInstance(IContext context)
{
var sessionFactory = new NhibernateSessionFactory();
return sessionFactory.GetSessionFactory();
}
}
public class NhibernateModule : NinjectModule
{
public override void Load()
{
Bind<ISessionFactory>().ToProvider<NhibernateSessionFactoryProvider>().InSingletonScope();
Bind<ISession>().ToMethod(context => context.Kernel.Get<ISessionFactory>().OpenSession()).InRequestScope();
}
}
Edit 3
If I do this
public List<OrderHistory> GetOrderHistory(Guid Id)
{
using (ITransaction transaction = session.BeginTransaction())
{
List<OrderHistory> orderHistories = session.Query<OrderHistory>().Where(x => x.Id == Id).ToList();
return orderHistories;
}
}
I get this alert
If I do this
public List<OrderHistory> GetOrderHistory(Guid Id)
{
using (ITransaction transaction = session.BeginTransaction())
{
List<OrderHistory> orderHistories = session.Query<OrderHistory>().Where(x => x.Id == Id).ToList().ConvertToLocalTime(timezoneId);
transaction.Commit();
return orderHistories;
}
}
I can get rid of the errors this way, but I can get unexpected results.
For instance when I get orderHistories back I loop through all of them and convert the "purchase date" to the users local time. This is done through an extension method that I created for my list.
Once converted I set it to override the "purchase date" in the object. This way I don't have to create a new object for one change of a field.
Now if I do this conversion of dates before I call the commit, NHibernate thinks I have updated the object and needs to commit it.
So I am putting a bounty on this question.
How can I create my methods so I don't have to wrap each method in a transaction? I am already using Ninject for my sessions, so maybe I can leverage that; sometimes, though, I am forced to do multiple transactions in a single request.
So I don't know if having just one transaction per request is a solution.
How can I make sure that objects that I am changing for temporary use don't accidentally get committed?
How can I keep the lazy loading that I am using in my service layer? I don't want to surround my lazy loading in a transaction, since it is usually used in my service layer.
I am finding it very hard to find examples of how to do this when you're using the repository pattern. In the examples everything is always written in the same transaction, and I don't want transactions in my service layer (that is the job of the repo, not my business logic).
The NHibernate community recommends that you wrap everything in a transaction, regardless of what you're doing.
To answer your second question, generally, it depends. If this is a web application, you should look at the session-per-request pattern. In most basic scenarios, what this means is that you'll create a single session per HTTP request in which the session (and transaction) is created when the request is made and committed/disposed of at the end of the request. I'm not saying that this is the right way for you, but it's a common approach that works well for most people.
There are a lot of examples out there showing how this can be done. Definitely worth taking the time to do a search and read through things.
EDIT: Example of how I do the session/transaction per request:
I have a SessionModule that loads the session from my dependency resolver (this is an MVC3 feature):
namespace My.Web
{
    public class SessionModule : IHttpModule
    {
        public void Init(HttpApplication context)
        {
            context.BeginRequest += context_BeginRequest;
            context.EndRequest += context_EndRequest;
        }

        void context_BeginRequest(object sender, EventArgs e)
        {
            var session = DependencyResolver.Current.GetService<ISession>();
            session.Transaction.Begin();
        }

        void context_EndRequest(object sender, EventArgs e)
        {
            var session = DependencyResolver.Current.GetService<ISession>();
            session.Transaction.Commit();
            session.Dispose();
        }

        public void Dispose() { }
    }
}
This is how I register the session (using StructureMap):
new Container(x => {
x.Scan(a => {
a.AssembliesFromApplicationBaseDirectory();
a.WithDefaultConventions();
});
x.For<ISessionFactory>().Singleton().Use(...);
x.For<ISession>().HybridHttpOrThreadLocalScoped().Use(sf => sf.GetInstance<ISessionFactory>().OpenSession());
x.For<StringArrayType>().Use<StringArrayType>();
});
Keep in mind that this is something I've experimented with and have found to work well for the scenarios where I've used NHibernate. Other people may have different opinions (which are, of course, welcome).
Well, I guess you could set a transaction isolation level that's appropriate for the kind of reads you perform in your application, but the question is: should it really be required to do that within the application code? My guess is no, unless you have a use case that differs from the default transaction levels that (N)Hibernate will apply by configuration.
Maybe you can set transaction isolation levels in your NHibernate config.
Or maybe the settings for the profiler are a bit overzealous? Can't tell from here.
But: have you tried committing a read transaction? It should not do any harm.
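If you do want an explicit isolation level, ISession.BeginTransaction has an overload that accepts a System.Data.IsolationLevel, so a sketch of the repository method from the question might look like this (the level shown is only an example; NHibernate also exposes a connection.isolation configuration property if you prefer to set it once for all transactions):

public List<OrderHistory> GetOrderHistory(Guid Id)
{
    using (ITransaction transaction = session.BeginTransaction(System.Data.IsolationLevel.ReadCommitted))
    {
        List<OrderHistory> orderHistories = session.Query<OrderHistory>().Where(x => x.Id == Id).ToList();
        transaction.Commit(); // committing a read-only transaction is harmless, as noted above
        return orderHistories;
    }
}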
You're passing the ISession into the repository's constructor, which is good. But that's the only thing I like about this class.
Save just calls session.Save, so it's not needed.
GetOrderHistory appears to be retrieving a single entity by ID; you should use session.Get<OrderHistory>(id) for this. You can put the result into a collection if needed.
The Commit method shouldn't be in a repository.
To answer your questions directly...
How can I create my methods so I don't have to wrap each method in a
transaction? I am already using Ninject
for my sessions, so maybe I can
leverage that; sometimes, though,
I am forced to do multiple
transactions in a single request.
The pattern I recommend is below. This uses manual dependency injection but you could use Ninject to resolve your dependencies.
List<OrderHistory> orderHistories;
var session = GetSession(); // Gets the active session for the request
var repository = new OrderHistoryRepo(session);
// new up more repositories as needed; they will all participate in the same transaction

using (var txn = session.BeginTransaction())
{
    // use a try..catch block if desired
    orderHistories = repository.GetOrderHistories();
    txn.Commit();
}
So I don't know if having just one
transaction per request is a solution.
It's perfectly fine to have multiple transactions in a session. I don't like waiting until the request ends to commit because it's too late to provide good feedback to the user.
How can I make sure that objects that
I am changing for temporary use don't
accidentally get committed?
The only sure way is to use an IStatelessSession. Less sure ways are to Evict the object from the Session or Clear the session. But with NHibernate it's not recommended to modify persistent objects.
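As a sketch of those two approaches (assuming access to the session factory, and with NHibernate.Linq in scope for Query<T>):

// 1. Read with a stateless session: nothing is tracked, so local edits are never flushed back
using (IStatelessSession statelessSession = sessionFactory.OpenStatelessSession())
{
    var histories = statelessSession.Query<OrderHistory>()
                                    .Where(x => x.Id == id)
                                    .ToList();
    // safe to convert the purchase date here for display; NHibernate will never persist it
}

// 2. Or detach the entities from a regular session before mutating them
foreach (var orderHistory in orderHistories)
    session.Evict(orderHistory);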
How can I keep the lazy loading that I am using in my service layer? I don't
want to surround my lazy loading
in a transaction, since it is usually used
in my service layer.
If you're using session-per-request this shouldn't be a problem. But you're right that lazy-loading can happen outside of the transaction. I ignore these warnings. I suppose you could "touch" every child object needed so that lazy loads are in a transaction but I don't bother.
I don't want to have transactions in
my service layer(it is the job of the
repo not my business logic)
I disagree with this. The UI or business logic should manage the transaction. The UI is where the user expresses their intent (save or cancel my changes) and is the natural place to manage the transaction.
The recommended approach is a unit of work
(session + transaction) per request.
Sure, you can use Ninject to manage
the session lifecycle; I blogged
recently about a similar approach
using Castle Windsor.
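Building on the Ninject module already shown in the question, a sketch of that unit-of-work-per-request (assuming Ninject.Web.Common for InRequestScope, and using Ninject's activation/deactivation hooks) could look like:

Bind<ISession>()
    .ToMethod(context => context.Kernel.Get<ISessionFactory>().OpenSession())
    .InRequestScope()
    .OnActivation(session => session.BeginTransaction())
    .OnDeactivation(session =>
    {
        // commit when the request-scoped session is released; Ninject disposes it afterwards
        if (session.Transaction != null && session.Transaction.IsActive)
            session.Transaction.Commit();
    });

Repositories then just use the injected ISession; the transaction is opened when the session is first resolved in a request and committed when the request ends.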
Here are 4 options:
1. Don't change entities temporarily.
2. Use a stateless session in such cases.
3. Detach objects when you are going to make temporary changes.
4. Roll back the transaction.
I'd go with the first one.
You don't have to worry about lazy loading if you are using the session-per-request pattern: it will be executed in the same request and wrapped in the transaction automatically.
