ASP.NET Caching - Proper Usage - C#

I am creating a web application and am having an issue with my caching.
My application has a large amount of data that I want to avoid calling from the SQL database every time I need it.
I have tried to use caching in the following way:
public static List<DAL.EntityClasses.AccountsEntity> Accounts
{
    get
    {
        if (HttpContext.Current.Cache["Account"] == null)
        {
            HttpContext.Current.Cache.Insert("Account", LoadAccounts(), null,
                DateTime.Now.AddHours(4), System.Web.Caching.Cache.NoSlidingExpiration);
        }

        return (List<DAL.EntityClasses.AccountsEntity>)HttpContext.Current.Cache["Account"];
    }
}
The problem is that as I add items to the cache, the items I have already cached appear to get removed, so most calls end up going back to the database to repopulate the cache.
Where have I gone wrong?
Thanks

This is normal for an LRU-style cache: the least-recently-used items get pushed out as the cache reaches capacity.
Configure your cache to allow a larger amount of data.
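If memory-pressure scavenging is indeed what evicts the entries, one option is to insert the critical items with CacheItemPriority.NotRemovable so the scavenger skips them. This is only a sketch reusing the question's key and 4-hour expiration; whether the trade-off is acceptable depends on how much memory the data needs:

// Sketch: use the longer Cache.Insert overload to exempt this entry from scavenging.
HttpContext.Current.Cache.Insert(
    "Account",
    LoadAccounts(),
    null,                                            // no cache dependency
    DateTime.Now.AddHours(4),                        // absolute expiration, as in the question
    System.Web.Caching.Cache.NoSlidingExpiration,    // no sliding expiration
    System.Web.Caching.CacheItemPriority.NotRemovable,
    null);                                           // no removal callback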

Just FYI:
There's a problem with your implementation of the Accounts property that is not related to your original question, but may cause problems in the future:
What could happen is that between this line
if (HttpContext.Current.Cache["Account"] == null)
and this line:
return (List<DAL.EntityClasses.AccountsEntity>)HttpContext.Current.Cache["Account"];
your cache could be cleared, or the "Account" entry could be evicted from the cache.
A better implementation would be:
public static List<DAL.EntityClasses.AccountsEntity> Accounts
{
    get
    {
        List<DAL.EntityClasses.AccountsEntity> accounts =
            HttpContext.Current.Cache["Account"] as List<DAL.EntityClasses.AccountsEntity>;

        if (accounts == null)
        {
            accounts = LoadAccounts();
            HttpContext.Current.Cache.Insert("Account", accounts, null,
                DateTime.Now.AddHours(4), System.Web.Caching.Cache.NoSlidingExpiration);
        }

        return accounts;
    }
}
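If LoadAccounts is expensive, a further optional refinement is to guard the load with a static lock so several concurrent first requests don't all hit the database. This is just a sketch; the _accountsLock field is new to the example:

private static readonly object _accountsLock = new object();

public static List<DAL.EntityClasses.AccountsEntity> Accounts
{
    get
    {
        var accounts = HttpContext.Current.Cache["Account"] as List<DAL.EntityClasses.AccountsEntity>;
        if (accounts == null)
        {
            lock (_accountsLock)
            {
                // Re-check inside the lock: another thread may have loaded the data already.
                accounts = HttpContext.Current.Cache["Account"] as List<DAL.EntityClasses.AccountsEntity>;
                if (accounts == null)
                {
                    accounts = LoadAccounts();
                    HttpContext.Current.Cache.Insert("Account", accounts, null,
                        DateTime.Now.AddHours(4), System.Web.Caching.Cache.NoSlidingExpiration);
                }
            }
        }
        return accounts;
    }
}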

Related

Avoid fast post on webapi c#

I have a problem when users post data. Sometimes the post arrives twice in quick succession, and this causes a problem on my website.
A user wants to register a form for about $100 and has a $120 balance.
When the post (save) button is pressed, sometimes two posts reach the server almost at the same time, like:
2018-01-31 19:34:43.660 Register Form 5760$
2018-01-31 19:34:43.663 Register Form 5760$
Therefore my client's balance becomes negative.
I use an if statement in my code to check the balance, but the requests run so close together that I think both checks pass before either change is saved, so I miss them.
Therefore I made a lock-control class to avoid concurrency per user, but it does not work well.
I made a global action filter to control the users; this is my code:
public void OnActionExecuting(ActionExecutingContext context)
{
    try
    {
        var controller = (Controller)context.Controller;
        if (controller.User.Identity.IsAuthenticated)
        {
            bool jobDone = false;
            int delay = 0;
            int counter = 0;
            do
            {
                delay = LockControllers.IsRequested(controller.User.Identity.Name);
                if (delay == 0)
                {
                    LockControllers.AddUser(controller.User.Identity.Name);
                    jobDone = true;
                }
                else
                {
                    counter++;
                    System.Threading.Thread.Sleep(delay);
                }

                if (counter >= 10000)
                {
                    context.HttpContext.Response.StatusCode = 400;
                    jobDone = true;
                    context.Result = new ContentResult()
                    {
                        Content = "Attack Detected"
                    };
                }
            } while (!jobDone);
        }
    }
    catch (System.Exception)
    {
    }
}

public void OnActionExecuted(ActionExecutedContext context)
{
    try
    {
        var controller = (Controller)context.Controller;
        if (controller.User.Identity.IsAuthenticated)
        {
            LockControllers.RemoveUser(controller.User.Identity.Name);
        }
    }
    catch (System.Exception)
    {
    }
}
I keep a static list of users and sleep their threads until the previous request finishes.
Is there a better way to manage this problem?
Note: the original question has since been edited, so this answer is invalid.
So the issue isn't that the code runs too fast - fast is always good :) The issue is that the account is going into negative funds. If the client decides to post a form twice, that is the client's fault. It may be that you only want the client to pay once, which is a separate problem.
For the first problem, I would recommend using transactions (https://en.wikipedia.org/wiki/Database_transaction) to lock your table. That means you apply an update/add (or a set of changes) and force other calls to that table to wait until those operations have completed. You can begin your transaction and check that the account has sufficient funds before applying the change.
If they are only ever meant to pay once, then have a separate table that records whether the user has paid (again within a transaction) before processing the update/add.
http://www.entityframeworktutorial.net/entityframework6/transaction-in-entity-framework.aspx
(Edit: fixing link)
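A minimal sketch of that transaction approach; AppDbContext, Accounts, Registrations and the property names are illustrative assumptions, not the asker's actual model:

// Check the balance and apply the charge inside one transaction so concurrent
// registrations for the same account cannot both pass the balance check.
public bool TryRegister(string userName, decimal amount)
{
    using (var db = new AppDbContext())
    using (var tx = db.Database.BeginTransaction(System.Data.IsolationLevel.Serializable))
    {
        var account = db.Accounts.Single(a => a.UserName == userName);

        if (account.Balance < amount)
        {
            tx.Rollback();                   // not enough funds, abort
            return false;
        }

        account.Balance -= amount;
        db.Registrations.Add(new Registration { UserName = userName, Amount = amount });
        db.SaveChanges();

        tx.Commit();                         // competing requests wait on the locked rows until here
        return true;
    }
}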
You have a few options here.
You can implement ETag functionality in your app and use it for optimistic concurrency. This works well when you are working with records, i.e. you have a database record, return it to the user, and the user then changes it.
You could add a required GUID field to your view model, pass it to your app, add it to an in-memory cache, and check it on each request:
public class RegisterViewModel
{
    [Required]
    public Guid Id { get; set; }

    /* other properties here */
    ...
}
and then use IMemoryCache or IDistributedCache (see the ASP.NET Core docs) to put this Id into the memory cache and validate it on each request:
public IActionResult Register(RegisterViewModel register)
{
    if (!ModelState.IsValid)
        return BadRequest(ModelState);

    var userId = ...; /* get userId */

    // Reject the request if this command Id was already received from this user
    if (_cache.TryGetValue($"Registration-{userId}", out Guid cachedId) && cachedId == register.Id)
    {
        return BadRequest(new { ErrorMessage = "Command already received by this user" });
    }

    // Set cache options: keep in cache for 5 minutes, reset time if accessed.
    var cacheEntryOptions = new MemoryCacheEntryOptions()
        .SetSlidingExpiration(TimeSpan.FromMinutes(5));

    // When we're here, the command wasn't executed before, so we save the key in the cache
    _cache.Set($"Registration-{userId}", register.Id, cacheEntryOptions);

    // Call your service here to process it
    registrationService.Register(...);

    return Ok();
}
When the second request arrives, the value will already be in the (distributed) memory cache and the operation will fail.
If the caller does not set the Id, model validation will fail.
Of course everything Jonathan Hickey listed in his answer below applies too: you should always validate that there is enough balance, and use EF Core's optimistic or pessimistic concurrency.
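As a rough illustration of the optimistic-concurrency part, here is a sketch using an EF Core [Timestamp] concurrency token. The Account entity, the TryChargeAsync method and the property names are assumptions made for the example:

using System.ComponentModel.DataAnnotations;
using System.Threading.Tasks;
using Microsoft.EntityFrameworkCore;

// Illustrative entity with a rowversion concurrency token.
public class Account
{
    public int Id { get; set; }
    public decimal Balance { get; set; }

    [Timestamp]                          // EF Core checks this column on every UPDATE
    public byte[] RowVersion { get; set; }
}

public class RegistrationService
{
    private readonly DbContext _db;
    public RegistrationService(DbContext db) => _db = db;

    public async Task<bool> TryChargeAsync(int accountId, decimal amount)
    {
        var account = await _db.Set<Account>().SingleAsync(a => a.Id == accountId);

        if (account.Balance < amount)
            return false;                // insufficient funds

        account.Balance -= amount;
        try
        {
            await _db.SaveChangesAsync();   // throws if another request updated the row first
            return true;
        }
        catch (DbUpdateConcurrencyException)
        {
            return false;                   // a concurrent request won the race; reject or retry
        }
    }
}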

Dynamics CRM Online Object caching not caching correctly

I have a requirement where we need a plugin to retrieve a session id from an external system and cache it for a certain time. I use a field on the entity to test whether the session is actually being cached. When I refresh the CRM form a couple of times, the output consistently shows four versions of the same key at any given time. I have tried clearing the cache and testing again, but I still get the same results.
Any help appreciated, thanks in advance.
Output on each refresh of the page:
20170511_125342:1:55a4f7e6-a1d7-e611-8100-c4346bc582c0
20170511_125358:1:55a4f7e6-a1d7-e611-8100-c4346bc582c0
20170511_125410:1:55a4f7e6-a1d7-e611-8100-c4346bc582c0
20170511_125342:1:55a4f7e6-a1d7-e611-8100-c4346bc582c0
20170511_125437:1:55a4f7e6-a1d7-e611-8100-c4346bc582c0
20170511_125358:1:55a4f7e6-a1d7-e611-8100-c4346bc582c0
20170511_125358:1:55a4f7e6-a1d7-e611-8100-c4346bc582c0
20170511_125437:1:55a4f7e6-a1d7-e611-8100-c4346bc582c0
To accomplish this, I have implemented the following code:
public class SessionPlugin : IPlugin
{
    public static readonly ObjectCache Cache = MemoryCache.Default;
    private static readonly string _sessionField = "new_sessionid";

    public void Execute(IServiceProvider serviceProvider)
    {
        var context = (IPluginExecutionContext)serviceProvider.GetService(typeof(IPluginExecutionContext));
        try
        {
            if (context.MessageName.ToLower() != "retrieve" && context.Stage != 40)
                return;

            var userId = context.InitiatingUserId.ToString();

            // Use the user id as the key for the cache
            var sessionId = CacheSessionId(userId, GetSessionId(userId));
            sessionId = $"{sessionId}:{Cache.Count(kvp => kvp.Key == userId)}:{userId}";

            // Assign the session id to the entity
            var entity = (Entity)context.OutputParameters["BusinessEntity"];
            if (entity.Contains(_sessionField))
                entity[_sessionField] = sessionId;
            else
                entity.Attributes.Add(new KeyValuePair<string, object>(_sessionField, sessionId));
        }
        catch (Exception e)
        {
            throw new InvalidPluginExecutionException(e.Message);
        }
    }

    private string CacheSessionId(string key, string sessionId)
    {
        // If the value is already in the cache, return it
        if (Cache.Contains(key))
            return Cache.Get(key).ToString();

        var cacheItemPolicy = new CacheItemPolicy()
        {
            AbsoluteExpiration = ObjectCache.InfiniteAbsoluteExpiration,
            Priority = CacheItemPriority.Default
        };
        Cache.Add(key, sessionId, cacheItemPolicy);
        return sessionId;
    }

    private string GetSessionId(string user)
    {
        // This will be replaced with the actual call to the external service for the session id
        return DateTime.Now.ToString("yyyyMMdd_hhmmss");
    }
}
This has been greatly explained by Daryl here: https://stackoverflow.com/a/35643860/7708157
Basically, you do not have one MemoryCache instance for the whole CRM system. Your output simply proves that there are multiple app domains hosting the plugin, so even static variables stored in the plugin can hold multiple values, which you cannot rely on. There is no documentation on MSDN that explains how the sandboxing works (especially the app domains in this case), but using static variables is certainly not a good idea. And because you are dealing with CRM Online, you cannot be sure whether there is a single front-end server or many of them (which will also produce this behaviour).
Class level variables should be limited to configuration information. Using a class level variable as you are doing is not supported. In CRM Online, because of multiple web front ends, a specific request may be executed on a different server by a different instance of the plugin class than another request. Overall, assume CRM is stateless and that unless persisted and retrieved nothing should be assumed to be continuous between plugin executions.
Per the SDK:
The plug-in's Execute method should be written to be stateless because
the constructor is not called for every invocation of the plug-in.
Also, multiple system threads could execute the plug-in at the same
time. All per invocation state information is stored in the context,
so you should not use global variables or attempt to store any data in
member variables for use during the next plug-in invocation unless
that data was obtained from the configuration parameter provided to
the constructor.
Reference: https://msdn.microsoft.com/en-us/library/gg328263.aspx
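For completeness, the supported way to carry state into a plugin is the constructor configuration mentioned in that quote. A minimal sketch (the configuration content itself is just an illustration):

public class SessionPlugin : IPlugin
{
    private readonly string _unsecureConfig;
    private readonly string _secureConfig;

    // CRM passes the registration-time configuration strings to this constructor.
    public SessionPlugin(string unsecureConfig, string secureConfig)
    {
        _unsecureConfig = unsecureConfig;
        _secureConfig = secureConfig;
    }

    public void Execute(IServiceProvider serviceProvider)
    {
        // Per-invocation state comes from the context; configuration comes from the
        // constructor parameters. Nothing else should be kept between invocations.
        var context = (IPluginExecutionContext)serviceProvider.GetService(typeof(IPluginExecutionContext));
        // ... use _unsecureConfig / _secureConfig here ...
    }
}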

How to trigger the removal of old cached values using MemoryCache as it was intended

I am using the following code to ensure that I only go to the database once for my Agent data, and that the cached data is refreshed when the contractId being passed in changes.
public class AgentCacher
{
    private IAgentDal AgentDal;
    private readonly ObjectCache AgentObjectCache;
    private string LastContractId;

    public AgentCacher(IAgentDal agentDal)
    {
        this.AgentDal = agentDal;

        // Get the instance of the cache
        this.AgentObjectCache = MemoryCache.Default;
    }

    public List<Agent> GetAgentsForContract(int contractId)
    {
        // Set the key to be used for the cache
        var cacheKey = contractId.ToString(CultureInfo.InvariantCulture);

        // Has the contract ID changed?
        if (this.LastContractId != cacheKey)
        {
            // Remove the previous item from the cache
            if (this.LastContractId != null)
            {
                this.AgentObjectCache.Remove(this.LastContractId);
            }
        }
        // Are the agents for this contract ID already in the cache?
        else if (this.AgentObjectCache.Contains(cacheKey))
        {
            // Return the agents list from the cache
            return this.AgentObjectCache.Get(cacheKey) as List<Agent>;
        }

        // Go to the database and get the agents
        var agentsFromDatabase = this.AgentDal.GetAgentsForContract(contractId);

        // Add the values to the cache
        this.AgentObjectCache.Add(cacheKey, agentsFromDatabase, DateTimeOffset.MaxValue);

        // Set the contract ID for checking next time
        this.LastContractId = cacheKey;

        // Return the agents
        return agentsFromDatabase;
    }
}
This works OK, but I feel like I'm probably not using the MemoryCache in the way it was intended to be used.
How can I trigger the removal of the values I add to the cache so that old values are cleared out when the contractId changes? Do I have to use a ChangeMonitor or a CacheItemPolicy passed in when adding to the cache?
I've been struggling to find examples as to how it should be used properly.
Your logic looks right, but you are managing the cache lifetime yourself instead of relying on the built-in expiration mechanisms. Instead of checking whether there is a new contractId, removing the old entry and adding the new one, you could cache as many contractIds as needed and give each entry an absolute expiration of, say, one hour. For example, if a request asks for contractId == 1, you cache it under key "1"; if another request asks for contractId == 2, you go to the database, pull the contract information for id == 2, and store it in the cache under its own key with the same one-hour absolute expiration. This is more efficient than managing the cache (add, remove) yourself.
You also need to consider locking when you add and remove data from the cache, in order to eliminate race conditions.
You can find good example on how to do it:
Working With Caching in C#
Using MemoryCache in .NET 4.0
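A rough sketch of that per-contractId, absolute-expiration approach, reworking the question's GetAgentsForContract method (the one-hour window is only an example; AgentObjectCache and AgentDal are the fields from the question's class):

public List<Agent> GetAgentsForContract(int contractId)
{
    var cacheKey = contractId.ToString(CultureInfo.InvariantCulture);

    // Return the cached list for this contract if it is still alive.
    var cachedAgents = this.AgentObjectCache.Get(cacheKey) as List<Agent>;
    if (cachedAgents != null)
    {
        return cachedAgents;
    }

    // Not cached (or expired): load from the database.
    var agentsFromDatabase = this.AgentDal.GetAgentsForContract(contractId);

    // Cache per contract id with a one-hour absolute expiration; no manual removal needed.
    var policy = new CacheItemPolicy
    {
        AbsoluteExpiration = DateTimeOffset.Now.AddHours(1)
    };
    this.AgentObjectCache.Add(cacheKey, agentsFromDatabase, policy);

    return agentsFromDatabase;
}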

Items in .NET cache not expiring as expected

I am not very familiar with exactly how the cache works, so I hope I'm including enough information.
Basically, my problem is that we're caching items, and they appear to be added to the cache all right, but they don't seem to expire, or they expire whenever they feel like it. I am not using sliding expiration.
The item I am testing with currently has an expiration of 300 seconds (5 minutes), but when tested 30 minutes later, the item is still being returned from the cache. I have had items that were supposed to expire after a day still showing up 7 days later. As a result, our changes/edits are not showing on the site.
Following is the code we have tried.
public static void AddtoCache(object objToCache, CacheLevel cacheLevel, string cacheKey)
{
    if (objToCache == null)
    {
        return;
    }

    if (HttpContext.Current.Cache != null)
    {
        HttpContext.Current.Cache.Insert(cacheKey, objToCache, null,
            DateTime.Now.AddSeconds((double)cacheLevel), System.Web.Caching.Cache.NoSlidingExpiration);
    }
}
and also:
public static void AddtoCache(object objToCache, CacheLevel cacheLevel, string cacheKey)
{
    if (HttpContext.Current.Cache != null)
    {
        if (HttpContext.Current.Cache[cacheKey] == null)
        {
            HttpContext.Current.Cache.Add(cacheKey, objToCache, null,
                DateTime.UtcNow.AddSeconds((double)cacheLevel), TimeSpan.Zero,
                System.Web.Caching.CacheItemPriority.Default, null);
        }
    }
}
This happens both in development and production. We're using C# on Windows Server 2008 R2 SP1 with IIS 7.5.
UPDATE: This is the code we use to retrieve the item for display, in case that might be the problem.
public static T GetFromCache<T>(string cacheKey)
{
    object cachedObj = HttpContext.Current.Cache[cacheKey];
    if (cachedObj != null)
    {
        return (T)cachedObj;
    }

    return default(T);
}
UPDATE #2: The items that are being cached are all being displayed with .NET user controls (ascx) in case that matters. We are using Ektron as our CMS.
UPDATE 04/16 I've used the CacheItemRemovedCallback delegate and logged items getting removed from the cache on my dev site. When I refresh the page even 20 minutes after initial load (with cache expiry set to 5 minutes), the same timestamp appears and nothing is added to my log. There are a few other items in the log with "Reason: Removed". There is only one item that ever appears with "Reason: Expired" but it isn't the menu I'm testing. So it seems that at least one control is working properly, but not the others.
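For reference, the removal-callback logging described in that update looks roughly like this; LogCacheRemoval and the Trace output are illustrative, not the site's actual logging code:

// Sketch: pass a CacheItemRemovedCallback so every removal (and its reason) gets logged.
public static void AddtoCache(object objToCache, CacheLevel cacheLevel, string cacheKey)
{
    if (objToCache == null || HttpContext.Current.Cache == null)
    {
        return;
    }

    HttpContext.Current.Cache.Insert(
        cacheKey,
        objToCache,
        null,
        DateTime.UtcNow.AddSeconds((double)cacheLevel),
        System.Web.Caching.Cache.NoSlidingExpiration,
        System.Web.Caching.CacheItemPriority.Default,
        LogCacheRemoval);                    // called whenever the entry leaves the cache
}

private static void LogCacheRemoval(string key, object value, System.Web.Caching.CacheItemRemovedReason reason)
{
    // "Expired" means the expiration fired; "Removed" or "Underused" point to explicit
    // removal or memory-pressure scavenging instead.
    System.Diagnostics.Trace.WriteLine(
        string.Format("{0:u} cache item '{1}' removed, reason: {2}", DateTime.Now, key, reason));
}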

Update Cache with one item C#

I get users and their data from an external web service. I cache those items because I don't want to hit the web service every time. Now, if a user updates any of their information, I save it through the web service, but I don't want to fetch the latest data from the web service again, as that takes a lot of time. Instead, I want to update my cache. Can I do that? If so, what would be the best way? Here is my code:
List<User> users = appSecurity.SelectUsers();
var CacheKey = string.Format("GetUserList_{0}", currentUser);
CacheFactory.AddCacheItem(CacheKey, users, 300);
CacheFactory is a class where I handle adding, clearing and removing cache entries. Below is the code:
public static void RemoveCacheItem(string key)
{
    Cache.Remove(key);
}

public static void ClearCache()
{
    System.Collections.IDictionaryEnumerator enumerator = Cache.GetEnumerator();
    while (enumerator.MoveNext())
    {
        RemoveCacheItem(enumerator.Key.ToString());
    }
}

public static void AddCacheItem<T>(string key, T value, double timeOutInSeconds)
{
    var Item = GetCacheItem<T>(key);
    if (Item != null)
    {
        RemoveCacheItem(key);
        Item = value;
    }

    Cache.Insert(key, value, null, DateTime.Now.AddSeconds(timeOutInSeconds),
        System.Web.Caching.Cache.NoSlidingExpiration);
}
The answer is yes, it can be done, and it can be done in many different ways depending on what you want to solve. At the most basic level you can create a cache by using a List<T> or Dictionary<TKey, TValue> to store your data.
When you get information from the external web service, you push the data into your List or Dictionary. You can then use that data throughout your application. When you need to update that cache, you update the value in the List/Dictionary.
You can update your dictionary like so:
Dictionary<string, int> list = new Dictionary<string, int>();
and then you can set the value for the key "test" as follows:
list["test"] = list["test"] + 1;
When you are ready to push the updated data to the external source, all you need to do is parse the data into the format the source expects and send it away.
Like I said, there are many different ways to do this, but this is a basic example of accomplishing it. You can use this example to build on and go from there.
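Applied to the question's CacheFactory, updating the cached list in place could look roughly like this. GetCacheItem<T> is the factory's existing getter, and matching on a UserName property is an assumption about the User type:

// Sketch: update one user inside the cached list instead of reloading from the web service.
public static void UpdateCachedUser(string currentUser, User updatedUser)
{
    var cacheKey = string.Format("GetUserList_{0}", currentUser);
    var users = GetCacheItem<List<User>>(cacheKey);
    if (users == null)
    {
        return;    // nothing cached; the next read will repopulate from the web service
    }

    // Replace the stale entry (matching on UserName here is an assumption about User).
    var index = users.FindIndex(u => u.UserName == updatedUser.UserName);
    if (index >= 0)
    {
        users[index] = updatedUser;
    }

    // Re-insert so the entry gets a fresh 300-second expiration, as in the question.
    AddCacheItem(cacheKey, users, 300);
}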
