I am not very familiar with exactly how the cache works, so I hope I'm including enough information.
Basically, my problem is that we're caching items, and they appear to be added to the cache all right, but they don't seem to expire, or they expire whenever they feel like it. I am not using sliding expiration.
The item I am currently testing with has an expiration of 300 seconds (5 minutes), but when I test 30 minutes later the item is still being returned from the cache. I have had items that were supposed to expire after a day still showing up 7 days later. As a result, our changes/edits are not showing up on the site.
Following is the code we have tried.
public static void AddtoCache(object objToCache, CacheLevel cacheLevel, string cacheKey)
{
    if (objToCache == null)
    {
        return;
    }

    if (HttpContext.Current.Cache != null)
    {
        // Absolute expiration: cacheLevel seconds from now, no sliding expiration
        HttpContext.Current.Cache.Insert(
            cacheKey,
            objToCache,
            null,
            DateTime.Now.AddSeconds((double)cacheLevel),
            System.Web.Caching.Cache.NoSlidingExpiration);
    }
}
and also:
public static void AddtoCache(object objToCache, CacheLevel cacheLevel, string cacheKey)
{
    if (HttpContext.Current.Cache != null)
    {
        if (HttpContext.Current.Cache[cacheKey] == null)
        {
            // Cache.Add only inserts when the key is not already present.
            // TimeSpan.Zero is equivalent to Cache.NoSlidingExpiration.
            HttpContext.Current.Cache.Add(
                cacheKey,
                objToCache,
                null,
                DateTime.UtcNow.AddSeconds((double)cacheLevel),
                TimeSpan.Zero,
                System.Web.Caching.CacheItemPriority.Default,
                null);
        }
    }
}
This happens both in development and production. We're using C# on Windows Server 2008 R2 SP1 with IIS 7.5.
UPDATE: This is the code we use to retrieve the item for display, in case that might be the problem.
public static T GetFromCache<T>(string cacheKey)
{
    object cachedObj = HttpContext.Current.Cache[cacheKey];
    if (cachedObj != null)
    {
        return (T)cachedObj;
    }
    return default(T);
}
UPDATE #2: The items being cached are all displayed through .NET user controls (.ascx), in case that matters. We are using Ektron as our CMS.
UPDATE 04/16: I've used the CacheItemRemovedCallback delegate and logged items being removed from the cache on my dev site. When I refresh the page even 20 minutes after the initial load (with the cache expiry set to 5 minutes), the same timestamp appears and nothing is added to my log. There are a few other items in the log with "Reason: Removed". There is only one item that ever appears with "Reason: Expired", but it isn't the menu I'm testing. So it seems that at least one control is working properly, but not the others.
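For reference, this is roughly how I hooked the callback up for that test (a trimmed-down sketch; AddtoCacheWithLogging and LogRemovedItem are just placeholder names, and the Trace call stands in for whatever logging is actually in place):

public static void AddtoCacheWithLogging(object objToCache, CacheLevel cacheLevel, string cacheKey)
{
    if (objToCache == null || HttpContext.Current == null)
    {
        return;
    }

    HttpContext.Current.Cache.Insert(
        cacheKey,
        objToCache,
        null,
        DateTime.Now.AddSeconds((double)cacheLevel),
        System.Web.Caching.Cache.NoSlidingExpiration,
        System.Web.Caching.CacheItemPriority.Default,
        LogRemovedItem);
}

// Called by ASP.NET whenever the item leaves the cache; the reason says whether it
// expired, was removed explicitly, was underused, or a dependency changed.
private static void LogRemovedItem(string key, object value, System.Web.Caching.CacheItemRemovedReason reason)
{
    System.Diagnostics.Trace.WriteLine(
        string.Format("{0:u} cache item '{1}' removed, Reason: {2}", DateTime.Now, key, reason));
}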
One of our clients has a job application web site built with ASP.NET on .NET Framework 4.8.
Over the past few weeks, owing to some performance issues on the main database server, we have started optimizing certain critical features of the application. One such feature is the ability for applicants to search for and apply for jobs. There are two broad aspects to this:
Applicants log in and search for jobs, using a set of optional filters
Admins approve jobs (an approved job would immediately show up in the job search results for applicants)
To optimize this feature, we started using ObjectCache to store the jobs data; every search request is performed against this cache instead of running a query on the database. So far we have seen a good improvement in application performance when data is fetched from the cache and the filters are applied in C# code.
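To give an idea of what "filters applied in C# code" means, the in-memory filtering looks roughly like this (a simplified sketch; the SearchJobsParam property names District, JobType and Keyword are illustrative, not the real field names):

// Simplified sketch of filtering the cached list in memory (requires System.Linq).
// The param property names below are examples only.
private List<SearchApplyJobs> ApplyFilters(List<SearchApplyJobs> jobs, SearchJobsParam param)
{
    IEnumerable<SearchApplyJobs> query = jobs;

    if (!string.IsNullOrEmpty(param.District))
        query = query.Where(j => j.District == param.District);

    if (!string.IsNullOrEmpty(param.JobType))
        query = query.Where(j => j.JobType == param.JobType);

    if (!string.IsNullOrEmpty(param.Keyword))
        query = query.Where(j => j.Job_Description != null &&
                                 j.Job_Description.IndexOf(param.Keyword, StringComparison.OrdinalIgnoreCase) >= 0);

    return query.ToList();
}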
As of now, we have a singleton instance of ObjectCache, with a lock in place for thread safety:
using System.Threading;
using System.Runtime.Caching;

public class JobsDataCache
{
    private static ObjectCache jobsDataCache = null;
    private static readonly object _lock = new object();

    private JobsDataCache() { }

    public static ObjectCache GetInstance()
    {
        // Double-checked locking so the MemoryCache is only created once
        if (jobsDataCache == null)
        {
            lock (_lock)
            {
                if (jobsDataCache == null)
                {
                    jobsDataCache = new MemoryCache("JobsDataCache");
                }
            }
        }
        return jobsDataCache;
    }
}
These are the service class methods that provide the search results and also manage the cache instance:
public SearchJobsResponse SearchJobs(SearchJobsParam param, string user, bool isTestUser)
{
    try
    {
        // Evaluate and, if necessary, refresh the data cache
        EvaluateCache();
        // ... remaining logic (filtering, returning data to the controller, exception handling) omitted
    }
}

private void EvaluateCache()
{
    lock (_lock)
    {
        var searchJobsData = JobsDataCache.GetInstance().Get("SearchJobsData");

        // If there is data in the cache, assign it to the result set (a class-level field) and return
        if (searchJobsData != null)
        {
            result = (List<SearchApplyJobs>)searchJobsData;
        }
        else
        {
            // Refresh the cache - fetch the latest data from the DB
            RefreshCacheData();
        }
    }
}
private void RefreshCacheData()
{
    var GlobalQuery = ";with ROWCTE AS (" +
        "SELECT t.Ad_Number, t.JobType, c.CategoryID, t.Cert_Code, d.District, t.District_Name, t.End_Date, t.InstructionalShowing, " +
        "t.Job_Description, t.Long_Description_String, t.Job_Number, t.Post_Date, t.Region_Code, t.Region_Name, d.Short_Name, t.Start_Date, z.ZIP_Code, z.Latitude, z.Longitude " +
        "FROM ApplicationType c " +
        "JOIN Job_Ad t ON t.ApplicationType = c.ApplicationTypeID " +
        "JOIN District d ON t.District_Code = d.District " +
        "JOIN ZIPInfo z ON z.ZIP_Code = d.Zipcode" +
        " WHERE (CONVERT(DATE, t.Post_Date) <= CONVERT(DATE, GETDATE()) AND CONVERT(DATE, t.End_Date) >= CONVERT(DATE, GETDATE())))" +
        "SELECT Ad_Number, JobType, CategoryID, Cert_Code, District, District_Name, CAST(End_Date AS datetime) AS End_Date, " +
        "InstructionalShowing, Job_Description, Long_Description_String, Job_Number, CAST(Post_Date AS datetime) AS Post_Date, Region_Code, " +
        "Region_Name AS RegionCode, Short_Name, CAST(Start_Date AS datetime) AS Start_Date, ZIP_Code, Latitude, Longitude " +
        "FROM ROWCTE ORDER BY Job_Number";

    result = identityConnection.Database.SqlQuery<SearchJobs>(GlobalQuery).ToList();

    if (result.Count > 0)
    {
        // Cache the result set with a 2-hour absolute expiration
        CacheItemPolicy policy = new CacheItemPolicy { AbsoluteExpiration = DateTimeOffset.Now.AddHours(2) };
        JobsDataCache.GetInstance().Add("SearchJobsData", result, policy);
    }
}
// Method that will be used to refresh the cache when a job is approved
public void ClearCacheAndEvaluate()
{
    lock (_lock)
    {
        var data = JobsDataCache.GetInstance().Get("SearchJobsData");
        if (data != null)
        {
            // Drop the stale entry and reload the latest data from the DB
            JobsDataCache.GetInstance().Remove("SearchJobsData");
            RefreshCacheData();
        }
    }
}
As far as the job search goes, this approach is working really well. However, when it comes to admins approving jobs, we realized that the cache has to be refreshed (i.e. the latest data fetched from the DB) every time a job is approved.
Based on usage statistics, there could be anywhere between 15 and 35 jobs approved per day, with anywhere from a few minutes to a few hours between approvals, at the admin's discretion (it is a manual task and not automated yet).
From a load perspective, there could be a job search happening every minute (around 1,500 - 2,000 applicants are logged in during peak time), versus job approvals happening every few minutes to every few hours. However, we cannot get around the fact that the cache has to be refreshed after every job approval.
We have already tried to optimize the job search queries on the database side, but there are infrastructure issues which we are not able to investigate or troubleshoot because we do not have access to the server. The cache solution looks very promising, but there is the challenge of keeping it up to date at regular intervals, and that means a round trip to the database.
The only possible solution I have been able to think of is to refresh the cache after a certain number of approvals, say 5 - 7 (a rough sketch is below). But since approval is a manual task, there might be extended periods when that number has not been reached and the cache does not have the latest data. Given this situation, should we completely ditch the cache approach and keep focusing on creating optimized queries on the database side?
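To illustrate the counter idea, this is roughly what I have in mind (a rough sketch only; the threshold is arbitrary and OnJobApproved is a hypothetical hook called from the approval workflow):

// Rough sketch: only refresh the cache once every N approvals.
private static int _approvalsSinceRefresh = 0;
private const int ApprovalRefreshThreshold = 5;

public void OnJobApproved()
{
    lock (_lock)
    {
        _approvalsSinceRefresh++;
        if (_approvalsSinceRefresh >= ApprovalRefreshThreshold)
        {
            // Drop the stale entry and reload from the DB, as in ClearCacheAndEvaluate
            JobsDataCache.GetInstance().Remove("SearchJobsData");
            RefreshCacheData();
            _approvalsSinceRefresh = 0;
        }
    }
}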
The improved performance of the cached job search would keep the client and users very happy, but if there is a noticeable delay owing to cache refreshes after every job approval, we are not sure what kind of impression that would leave on the client and users.
Any ideas that would help us retain the cache approach and still provide a decent user experience would be much appreciated. Happy to share further information and code if necessary.
Thanks
I am trying to make my app expire after some days using the registry option. I have successfully written to and read from the registry; my issue is checking for expiration. Below is my code:
regKey = Registry.CurrentUser.OpenSubKey("Systemfiles"); // subkey name
if (regKey == null)
{
    // First run: store the (encrypted) start date
    regKey = Registry.CurrentUser.CreateSubKey("Systemfiles");
    regKey.SetValue("tere", Encrypt("4/16/2017"));
}
else
{
    regKey = Registry.CurrentUser.OpenSubKey("Systemfiles"); // subkey name
    string decryptDateValue = Decrypt(regKey.GetValue("tere").ToString()); // key name
    DateTime mainDate = Convert.ToDateTime(decryptDateValue);
    DateTime expiringDate = mainDate.AddDays(1);
    if (mainDate > expiringDate)
    {
        expire = true;
    }
    else
    {
        // Continue execution
    }
}
From my code, I assumed the user's first run is on 04/16/2017, and I want the user to be able to run the application for one day, so it should expire on 04/17/2017. That means that if the user tries to start the application after 04/17/2017, the if branch should execute. But I am not getting it right: the else branch always executes. I would appreciate a better way of doing this. Thanks
You've got this in your code:
DateTime expiringDate = mainDate.AddDays(1);
if (mainDate > expiringDate)
So, expiringDate will always be greater than mainDate (by exactly one day), and that condition can never be true.
What you want to compare against is the actual current date, so it should be:
if (DateTime.Now > expiringDate)
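Putting it together, the check could look something like this (a sketch that reuses your Decrypt call, the "Systemfiles"/"tere" key names and the expire flag; Microsoft.Win32 is assumed for the Registry classes):

RegistryKey regKey = Registry.CurrentUser.OpenSubKey("Systemfiles");
if (regKey != null)
{
    // Stored (encrypted) first-run date
    string decryptDateValue = Decrypt(regKey.GetValue("tere").ToString());
    DateTime firstRunDate = Convert.ToDateTime(decryptDateValue);

    // Allow one day of use from the first run
    DateTime expiringDate = firstRunDate.AddDays(1);

    if (DateTime.Now > expiringDate)
    {
        expire = true; // trial period is over
    }
    else
    {
        // Continue execution
    }
}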
I have a requirement where we need a plugin to retrieve a session id from an external system and cache it for a certain time. I use a field on the entity to test whether the session is actually being cached. When I refresh the CRM form a couple of times, the output consistently shows four versions of the same key at any given time. I have tried clearing the cache and testing again, but the results are the same.
Any help appreciated, thanks in advance.
Output on each refresh of the page:
20170511_125342:1:55a4f7e6-a1d7-e611-8100-c4346bc582c0
20170511_125358:1:55a4f7e6-a1d7-e611-8100-c4346bc582c0
20170511_125410:1:55a4f7e6-a1d7-e611-8100-c4346bc582c0
20170511_125342:1:55a4f7e6-a1d7-e611-8100-c4346bc582c0
20170511_125437:1:55a4f7e6-a1d7-e611-8100-c4346bc582c0
20170511_125358:1:55a4f7e6-a1d7-e611-8100-c4346bc582c0
20170511_125358:1:55a4f7e6-a1d7-e611-8100-c4346bc582c0
20170511_125437:1:55a4f7e6-a1d7-e611-8100-c4346bc582c0
To accomplish this, I have implemented the following code:
using System;
using System.Collections.Generic;
using System.Linq;
using System.Runtime.Caching;
using Microsoft.Xrm.Sdk;

public class SessionPlugin : IPlugin
{
    public static readonly ObjectCache Cache = MemoryCache.Default;
    private static readonly string _sessionField = "new_sessionid";

    public void Execute(IServiceProvider serviceProvider)
    {
        var context = (IPluginExecutionContext)serviceProvider.GetService(typeof(IPluginExecutionContext));
        try
        {
            if (context.MessageName.ToLower() != "retrieve" && context.Stage != 40)
                return;

            var userId = context.InitiatingUserId.ToString();

            // Use the user id as the key for the cache
            var sessionId = CacheSessionId(userId, GetSessionId(userId));

            // Append diagnostic info (a cache entry count and the user id) to verify caching
            sessionId = $"{sessionId}:{Cache.Select(kvp => kvp.Key == userId).ToList().Count}:{userId}";

            // Assign the session id to the entity
            var entity = (Entity)context.OutputParameters["BusinessEntity"];
            if (entity.Contains(_sessionField))
                entity[_sessionField] = sessionId;
            else
                entity.Attributes.Add(new KeyValuePair<string, object>(_sessionField, sessionId));
        }
        catch (Exception e)
        {
            throw new InvalidPluginExecutionException(e.Message);
        }
    }

    private string CacheSessionId(string key, string sessionId)
    {
        // If the value is already in the cache, return it
        if (Cache.Contains(key))
            return Cache.Get(key).ToString();

        var cacheItemPolicy = new CacheItemPolicy()
        {
            AbsoluteExpiration = ObjectCache.InfiniteAbsoluteExpiration,
            Priority = CacheItemPriority.Default
        };
        Cache.Add(key, sessionId, cacheItemPolicy);
        return sessionId;
    }

    private string GetSessionId(string user)
    {
        // This will be replaced with the actual call to the external service for the session id
        return DateTime.Now.ToString("yyyyMMdd_hhmmss");
    }
}
This has been explained well by Daryl here: https://stackoverflow.com/a/35643860/7708157
Basically, you do not have one MemoryCache instance for the whole CRM system. Your output simply proves that there are multiple app domains hosting the plugin, so even static variables in the plugin can hold multiple values, and you cannot rely on them. There is no documentation on MSDN that explains exactly how the sandboxing works (especially the app domains in this case), but using static variables is certainly not a good idea. Of course, if you are dealing with CRM Online, you cannot be sure whether there is a single front-end server or many of them (which will also produce this behaviour).
Class-level variables should be limited to configuration information. Using a class-level variable as you are doing is not supported. In CRM Online, because of the multiple web front ends, one request may be executed on a different server by a different instance of the plugin class than another request. Overall, assume CRM is stateless and that, unless data is persisted and retrieved, nothing is continuous between plugin executions.
Per the SDK:
The plug-in's Execute method should be written to be stateless because
the constructor is not called for every invocation of the plug-in.
Also, multiple system threads could execute the plug-in at the same
time. All per invocation state information is stored in the context,
so you should not use global variables or attempt to store any data in
member variables for use during the next plug-in invocation unless
that data was obtained from the configuration parameter provided to
the constructor.
Reference: https://msdn.microsoft.com/en-us/library/gg328263.aspx
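If you do need class-level data, the supported place for it is the configuration strings that CRM passes to the plugin constructor. Something along these lines (a minimal sketch, not your actual plugin; the endpoint URL field is purely illustrative):

// Minimal sketch: class-level state limited to configuration supplied to the constructor.
public class SessionPlugin : IPlugin
{
    private readonly string _endpointUrl;

    // CRM passes the unsecure and secure configuration registered on the plugin step.
    public SessionPlugin(string unsecureConfig, string secureConfig)
    {
        _endpointUrl = unsecureConfig; // e.g. the external service URL
    }

    public void Execute(IServiceProvider serviceProvider)
    {
        var context = (IPluginExecutionContext)serviceProvider.GetService(typeof(IPluginExecutionContext));

        // Fetch the session id on every invocation instead of caching it in a static field.
        var sessionId = GetSessionId(context.InitiatingUserId.ToString(), _endpointUrl);
        // ... use sessionId as needed
    }

    private string GetSessionId(string user, string endpointUrl)
    {
        // Placeholder for the real call to the external system.
        return DateTime.Now.ToString("yyyyMMdd_hhmmss");
    }
}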
I'm currently working on getting a test environment stood up (it is currently called DEV) and am experiencing some weird issues.
When you first come to the site, we have an agreement page. Hitting the "I Agree" button will force the user through an Action to check to see if they are a member of the site already or not. We do use a demo mode also, but that is not part of the issue.
The issue I'm currently experiencing is the following. Initially in the Action, we create a Cookie called "siteaccept". Once that is created, we determine if the site is in demo mode or not, then move on to getting the user (actual user or demo user). Once the user is found, we log their Id in a Cookie called "cntPOC", and also create a Session variable by the same name with the same data (original developers wrote much of this convoluted logic which I want to change before someone asks why keep a Session and Cookie). We then do a RedirectToAction to the Action to bring up the main page of the site.
Here is where the issue comes into play. The main page of the site's Action has a CustomAuthorizeAttribute decoration on it. In our CustomAuthorizeAttribute class, we override OnAuthorization and AuthorizeCore. OnAuthorization fires first; however, it calls base.OnAuthorization, which in turn calls AuthorizeCore. In AuthorizeCore, we check for the "siteaccept" Cookie, followed by a check on the "cntPOC" Session variable. If both are there, we return true; otherwise we return false if either check fails.
On not only my local environment but the DBA's, this works without a hitch. I see our Cookies and Session variable. However, on our DEV environment, both the Cookies and Session variable are missing. We have IE 11 configured to allow Cookies, yet we cannot get them once we leave the Action and proceed into the CustomAuthorizeAttribute.
I did find I can find the Cookie today if I check HttpContext.Current.Response instead of HttpContext.Current.Request, but that is the incorrect way to do it obviously.
Below is my code. I'm fairly certain that since the code works in my local environment, it should be fine in our DEV environment. Also, a quick note: our production environment does work, so the code obviously functions. The question now is why the DEV environment does not.
MainController.cs
[HttpPost]
public ActionResult Index(FormCollection frmCollection)
{
    try
    {
        Response.Cookies.Remove("bracmisaccept");
        HttpCookie cookie = new HttpCookie("bracmisaccept");
        cookie.Value = "true";
        Response.Cookies.Add(cookie);
        ...
        // Demo Mode
        var poc = new HttpCookie("cntPOC");
        cookie.Value = "7578";
        Response.Cookies.Add(poc);
        Session["cntPOC"] = 7578;
        return RedirectToAction("ApplicationSelection");
    }
    catch (Exception ex)
    {
        logger.LogError("Main Index", ex);
        return PartialView(@"../Error/ExceptionHandling");
    }
}

[CustomAuthorizeAttribute]
public ActionResult ApplicationSelection()
{
    return View();
}
CustomAuthorizeAttribute.cs
public string RedirectUrl = "~/Main/SessionTimeout";
public string CookieExpiredRedirectUrl = "~/Main/Index";
public string AjaxRedirectUrl = "~/Error/AjaxError";
private bool _isAuthorized;
private bool _isCookieExpired;

protected override bool AuthorizeCore(HttpContextBase httpContext)
{
    if (HttpContext.Current.Request.Cookies["siteaccept"] == null)
    {
        _isAuthorized = false;
        _isCookieExpired = true;
        return false;
    }

    if (HttpContext.Current.Session["cntPOC"] == null)
    {
        _isAuthorized = false;
        return false;
    }

    return true;
}

public override void OnAuthorization(AuthorizationContext filterContext)
{
    base.OnAuthorization(filterContext);

    if (!_isAuthorized)
    {
        if (filterContext.RequestContext.HttpContext.Request.IsAjaxRequest())
        {
            filterContext.HttpContext.Response.StatusCode = 401;
            filterContext.HttpContext.Response.End();
        }
        else
        {
            if (_isCookieExpired)
                filterContext.RequestContext.HttpContext.Response.Redirect(CookieExpiredRedirectUrl);
            else
                filterContext.RequestContext.HttpContext.Response.Redirect(RedirectUrl);
        }
    }
}
I'm fairly certain the code is fine, but I did read in a few articles that AuthorizeCore may or may not have the Cookies and Session variables at times. I just wanted to find out if I'm wasting my time with changing the code or if it's the box we have this site on. The server is super locked down, so yeah, kind of annoying...
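For illustration, a version of AuthorizeCore that reads the cookie and session from the httpContext parameter MVC passes in, rather than from HttpContext.Current, would look something like this (just a sketch, not necessarily a fix for the DEV issue):

protected override bool AuthorizeCore(HttpContextBase httpContext)
{
    // Read the request cookie and session from the context MVC supplies
    if (httpContext.Request.Cookies["siteaccept"] == null)
    {
        _isAuthorized = false;
        _isCookieExpired = true;
        return false;
    }

    if (httpContext.Session == null || httpContext.Session["cntPOC"] == null)
    {
        _isAuthorized = false;
        return false;
    }

    _isAuthorized = true;
    return true;
}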
Edit: I haven't figured out how to fix this yet; however, I did find that if I do a publish of this code, I can enter the site properly. I still cannot run localhost to inspect the site, but a publish at least resolves a few minor questions about whether things will work on this site.
I am creating a web application and am having an issue with my caching.
My application has a large amount of data, and I want to avoid calling the SQL database every time I need the info.
I have tried to use caching in the following way:
public static List<DAL.EntityClasses.AccountsEntity> Accounts
{
    get
    {
        if (HttpContext.Current.Cache["Account"] == null)
        {
            HttpContext.Current.Cache.Insert("Account", LoadAccounts(), null, DateTime.Now.AddHours(4), System.Web.Caching.Cache.NoSlidingExpiration);
        }
        return (List<DAL.EntityClasses.AccountsEntity>)HttpContext.Current.Cache["Account"];
    }
}
The problem is that, as I add items to the cache, the items I have already cached appear to get removed.
So most calls end up hitting the DB to get the data for the cache.
Where have I gone wrong?
Thanks
This is normal for an LRU cache: the least recently used items get pushed out as the cache fills up.
Configure your cache to allow a larger amount of data.
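For the built-in ASP.NET cache, those limits come from the <cache> element under <system.web><caching> in web.config (for example privateBytesLimit and percentagePhysicalMemoryUsedLimit). A quick way to see which limits are actually in effect (just a sketch, logging via Trace):

// Log the cache limits ASP.NET has computed from web.config and machine memory
long bytesLimit = HttpRuntime.Cache.EffectivePrivateBytesLimit;
int memoryPercentLimit = HttpRuntime.Cache.EffectivePercentagePhysicalMemoryLimit;

System.Diagnostics.Trace.WriteLine(
    string.Format("Cache limits: {0} bytes, {1}% of physical memory", bytesLimit, memoryPercentLimit));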
Just FYI:
There's a problem with your implementation of the Accounts property that is not related to your original question, but may cause problems in the future:
What could happen is that between this line
if (HttpContext.Current.Cache["Account"] == null)
and this line:
return (List<DAL.EntityClasses.AccountsEntity>)HttpContext.Current.Cache["Account"];
your cache could be cleared, or the "Account" entry could be removed from the cache.
A better implementation would be:
public static List<DAL.EntityClasses.AccountsEntity> Accounts
{
    get
    {
        // Read the cached value once, so the null check and the return use the same reference
        List<DAL.EntityClasses.AccountsEntity> accounts =
            HttpContext.Current.Cache["Account"] as List<DAL.EntityClasses.AccountsEntity>;

        if (accounts == null)
        {
            accounts = LoadAccounts();
            HttpContext.Current.Cache.Insert("Account", accounts, null, DateTime.Now.AddHours(4), System.Web.Caching.Cache.NoSlidingExpiration);
        }
        return accounts;
    }
}