I have a list of products stored in the ASP.NET cache, but I have a problem refreshing the cache. Per our requirements the cache must be refreshed every 15 minutes, but I want to know: if a user asks for the product list while the cache is being refreshed, will he get an error, the old list, or will he have to wait until the refresh completes?
The sample code is below.
public class Product
{
    public int Id { get; set; }
    public string Name { get; set; }
}
We have a function in the BLL which gives us the list of products:
public List<Product> Products()
{
    // ...some code
}
Cache.Insert("Products", Products(), null, DateTime.Now.AddMinutes(15), TimeSpan.Zero);
I want to add one more situation here: suppose I use a static object instead of the cache object. What will happen then, and which approach is best if we are on a standalone server rather than a cluster?
Sorry - this might be naive/obvious, but just have a facade-type class which does:
// Read the entry once into a local so it can't expire between
// the null check and the return.
var products = Cache["Products"] as List<Product>;
if (products == null)
{
    products = Products();
    Cache.Insert("Products", products, null, DateTime.Now.AddMinutes(15), TimeSpan.Zero);
}
return products;
There is also a CacheItemRemovedCallback delegate which you could use to repopulate the cache when an entry expires.
ALSO
Use the cache object rather than static objects. It is apparently more efficient (Asp.net - Caching vs Static Variable for storing a Dictionary) and you get all the cache management features (sliding expiration and so on).
EDIT
If there is a concern about update times, then consider two cache objects plus a controller, e.g.:
Active Cache
Backup Cache - this is the one that will be updated
Cache controller (another cache object?) this will indicate which object is active
So the update process will be:
Update the backup cache
On completion, check it is valid
The backup becomes active and vice versa: the controller now flags the backup cache as active
There needs to be a method which fires when the products cache object expires. I would probably use the CacheItemRemovedCallback delegate to initiate the cache repopulation, or do an async call in the facade-type class - you wouldn't want it blocking the current thread.
I'm sure there are many other variants of this
EDIT 2
Actually, thinking about this, I would make the controller class something like this:
public class CacheController
{
    public StateEnum Cache1State { get; set; }
    public StateEnum Cache2State { get; set; }
    public bool IsUpdating { get; set; }
}
The states would be Active, Backup, Updating and perhaps Inactive and Error. You would set the IsUpdating flag while the update is occurring, and back to false once complete, to stop multiple threads trying to update at once - i.e. a race condition. The class is just a general principle and could/should be amended as required.
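The active/backup swap described above can be sketched with plain fields rather than two Cache entries; all the names here (SwappingCache, Refresh) are my own, not from any framework. Readers always see a complete list, because the swap is a single reference write, and an Interlocked flag keeps a second thread from starting a refresh while one is in progress:

```csharp
using System;
using System.Collections.Generic;
using System.Threading;

// Minimal sketch of the active/backup cache swap. Illustrative only.
public class SwappingCache<T>
{
    private List<T> _active = new List<T>();
    private List<T> _backup = new List<T>();
    private int _updating; // 0 = idle, 1 = update in progress

    // Readers always see the active list and are never blocked by an update.
    public List<T> Read() => Volatile.Read(ref _active);

    // Only one thread at a time repopulates; others simply skip.
    public void Refresh(Func<List<T>> load)
    {
        if (Interlocked.CompareExchange(ref _updating, 1, 0) != 0)
            return; // another thread is already updating

        try
        {
            _backup = load();                      // populate the backup copy
            var old = _active;
            Volatile.Write(ref _active, _backup);  // swap: backup becomes active
            _backup = old;                         // old active becomes the next backup
        }
        finally
        {
            Interlocked.Exchange(ref _updating, 0);
        }
    }
}
```

During a refresh, callers of Read() keep getting the old list until the swap completes, which answers the original question: no error and no waiting, just the previous data.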
I use SQL Azure and have an application which syncs data with an external resource. The data is large, approximately 10K records, so I get it from the DB once, update what's necessary over the course of some minutes, and save the changes. It works, but there is a problem with simultaneous access to the data: if another service makes changes during those minutes, those changes will be overwritten.
In most cases this concerns fields which my application does not even touch!
So, for example, my Device table:
public partial class Device : BaseEntity
{
    public string Name { get; set; }
    public string IMEI { get; set; }
    public string SN { get; set; }
    public string ICCID { get; set; }
    public string MacAddress { get; set; }
    public DeviceStatus Status { get; set; }
}
The first service (an application with a long-running process) can modify SN, ICCID and MacAddress, but not Status; the second service, vice versa, can modify only Status.
Code to update in the first service:
_allLocalDevicesWithIMEI = _context.GetAllDevicesWithImei().ToList();
(it gets entities, not DTOs, because there really are many fields that can change)
and then:
_context.Devices.Update(localDevice);
for every device, which should be changed
and, eventually:
await _context.SaveChangesAsync();
How can I mark the Status field as excluded from change tracking?
One simple way to avoid updating the Status field from the first service is to create an update entity that does not include the Status field, and another update entity for the second service which does include it.
Another way to resolve this problem is to override the SaveChangesAsync method and control the update logic yourself, but I think that's complex, and the behavior is implicit; it will not be easy for others to understand your code.
To avoid overwrites, you can add a RowVersion to your entities. This is so-called optimistic concurrency: it throws an error if an overwrite would happen, and you can retry the operation if someone has already changed something. Alternatively, you can raise your transaction isolation level to something like RepeatableRead/Serializable to lock those rows for the entire operation (which of course has a huge performance impact and causes timeouts). The second option is simple and good enough for background jobs and distributed transactions; the first is more flexible and usually faster, but harder to implement across multiple endpoints/entities.
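Assuming EF Core (the question's _context.Devices.Update(...) suggests it), both ideas can be sketched together: un-mark the Status property so it is excluded from the generated UPDATE, and add a [Timestamp] token so concurrent overwrites throw instead of silently winning. The entity shape below is a trimmed stand-in for the real Device class:

```csharp
using System.ComponentModel.DataAnnotations;
using System.Threading.Tasks;
using Microsoft.EntityFrameworkCore;

// Trimmed stand-in for the real Device entity.
public class Device
{
    public int Id { get; set; }
    public string SN { get; set; }
    public int Status { get; set; }

    [Timestamp]               // maps to a rowversion column on SQL Server
    public byte[] RowVersion { get; set; }
}

public static class DeviceSync
{
    // Save a device from the first service without ever writing Status.
    public static async Task SaveWithoutStatusAsync(DbContext context, Device localDevice)
    {
        context.Update(localDevice);

        // Exclude Status from the generated UPDATE statement:
        context.Entry(localDevice).Property(nameof(Device.Status)).IsModified = false;

        try
        {
            await context.SaveChangesAsync();
        }
        catch (DbUpdateConcurrencyException)
        {
            // Another service changed the row in the meantime:
            // reload the entity and retry as appropriate.
            throw;
        }
    }
}
```

With the RowVersion token in place, SaveChangesAsync compares the token it read with the one currently in the database and throws DbUpdateConcurrencyException on a mismatch, which is the retry hook the answer describes.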
I'm using Entity Framework. I have a list of Requests. Each Request has a list of Approvals. When a user is logged in, I need to find the list of Requests that the user is involved in (i.e. the user is a member of a group whose GroupId is in an Approval on the Request). To figure out which groups a user belongs to, I call CheckGroups(groupIds), where groupIds is a list of strings I want to check, and it returns the list of strings the user belongs to. This method is relatively slow, as it has to make a network call (it's an Azure Active Directory Graph API call). Also, groupIds has a max size of 20.
public class MyDbContext : DbContext
{
    public virtual DbSet<Request> Requests { get; set; }
    public virtual DbSet<Approval> Approvals { get; set; }
}
public class Request
{
    public int RequestId { get; set; }
    // several irrelevant properties
    public virtual ICollection<Approval> Approvals { get; set; }
}
public class Approval
{
    public int ApprovalId { get; set; }
    public int RequestId { get; set; }
    // several irrelevant properties
    public string GroupId { get; set; }
}
This is what I'm thinking so far:
1. Go through MyDbContext.Approvals and get a list of all unique GroupIds.
2. Pass 20 of them to CheckGroups().
3. Store the returned strings in a list.
4. Repeat steps 2 and 3 until all unique groups have been sent.
5. Go through MyDbContext.Approvals and, if the GroupId matches the list from step 3, add the RequestId to a list.
6. Get a list of all Requests that have a RequestId in the list from step 5.
Seems really inefficient. Is there a better way to do this? Trying to minimize time (database calls for entity framework and calls to CheckGroups() are the bottlenecks). As the database grows larger (more Requests added with multiple Approvals per Request) this could get ugly.
Based on my understanding, the network requests have the biggest effect on performance, especially since you repeat the request until all the groups have been sent.
I suggest you first get all the groups the user belongs to, then compare the groups locally, and see whether performance improves.
You might also consider making your requests asynchronously (in parallel) to improve the network performance.
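The batching-plus-parallelism idea can be sketched as below. GroupFilter and the checkGroupsAsync delegate are my own names; the delegate stands in for the real Azure AD Graph CheckGroups call, whose signature will differ:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;
using System.Threading.Tasks;

public static class GroupFilter
{
    // Returns the subset of allGroupIds the user belongs to, batching
    // the membership check 20 ids at a time and running batches in parallel.
    public static async Task<HashSet<string>> GetUserGroupsAsync(
        IEnumerable<string> allGroupIds,
        Func<IReadOnlyList<string>, Task<IReadOnlyList<string>>> checkGroupsAsync)
    {
        var unique = allGroupIds.Distinct().ToList();

        // Split into batches of 20 (the API limit) and start all calls at once.
        var batchCalls = unique
            .Select((id, i) => new { id, i })
            .GroupBy(x => x.i / 20, x => x.id)
            .Select(g => checkGroupsAsync(g.ToList()));

        var results = await Task.WhenAll(batchCalls);
        return results.SelectMany(r => r).ToHashSet();
    }
}
```

Once you have the member-group set, the final filter can stay in the database, e.g. requests.Where(r => r.Approvals.Any(a => memberGroups.Contains(a.GroupId))), so only matching Requests are materialized.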
I'm a bit confused on the proper usage of MemoryCache.
Should/can it be used to load static information to save on repeated calls?
Should/can it be used to persist data on a view across several action methods?
I have an instance where I don't want to use the data store to populate and persist the data across my view. I started using the MemoryCache which works fine, however I'm starting to question if that was the correct approach.
My concern was: what happens if I have several users on the same page using the same MemoryCache?
First of all, MemoryCache is part of the System.Runtime.Caching namespace. It can be used by MVC applications, but it is not limited to MVC applications.
NOTE: There is also a System.Web.Caching namespace (much older) that can only be used with the ASP.NET framework (including MVC).
Should/can it be used to load static information to save on repeated calls?
Yes.
Should/can it be used to persist data on a view across several action methods?
Yes. If your views use the same data it can. Or if you have data that is on your _Layout.cshtml page that needs caching, it could be done in a global filter.
what happens if I have several users on the same page using the same MemoryCache?
Caching is shared between all users by default. It is specifically meant for keeping data in memory so it doesn't have to be fetched from the database on every request (for example, a list of state names on a checkout page for populating a dropdown for all users).
It is also a good idea to cache data that changes frequently for a second or two to prevent a flood of concurrent requests from being a denial-of-service attack on your database.
Caching depends on a unique key. It is possible to store individual user information in the cache by making the user's name or ID part of the key.
var key = "MyFavoriteItems-" + this.User.Identity.Name;
Warning: This method works only if you have a single web server. It won't scale to multiple web servers. Session state (which is meant for individual user memory storage) is a more scalable approach. However, session state is not always worth the tradeoffs.
Typical Caching Pattern
Note that although MemoryCache itself is thread-safe, combining it with a database call makes the overall check-then-load operation non-atomic. Without locking, several threads may each query the database to reload the cache when it expires.
So, you should use a double-checked locking pattern to ensure only one thread makes it through to reload the cache from the database.
Let's say you have a list that is wasteful to get on every request because every user will need the list when they get to a specific page.
public IEnumerable<StateOrProvinceDto> GetStateOrProvinceList(string country)
{
    // Query the database to get the data...
}
To cache the result of this query, you can add another method with a double-checked locking pattern and use it to call your original method.
NOTE: A common approach is to use the decorator pattern to make the caching seamless to your API.
private ObjectCache _cache = MemoryCache.Default;
private object _lock = new object();

// NOTE: The country parameter would typically be a database key type
// (normally int or Guid), but you could still use it to build a unique key via .ToString().
public IEnumerable<StateOrProvinceDto> GetCachedStateOrProvinceList(string country)
{
    // Key can be any string, but it must be both
    // unique across the cache and deterministic
    // for this function.
    var key = "GetCachedStateList" + country;

    // Try to get the object from the cache
    var stateOrProvinceList = _cache[key] as IEnumerable<StateOrProvinceDto>;

    // Check whether the value exists
    if (stateOrProvinceList == null)
    {
        lock (_lock)
        {
            // Try to get the object from the cache again
            stateOrProvinceList = _cache[key] as IEnumerable<StateOrProvinceDto>;

            // Double-check that another thread did
            // not call the DB already and load the cache
            if (stateOrProvinceList == null)
            {
                // Get the list from the DB
                stateOrProvinceList = GetStateOrProvinceList(country);

                // Add the list to the cache
                _cache.Set(key, stateOrProvinceList, DateTimeOffset.Now.AddMinutes(5));
            }
        }
    }

    // Return the cached list
    return stateOrProvinceList;
}
So you call GetCachedStateOrProvinceList and it will automatically get the list from the cache; if it is not cached, it will automatically load the list from the database into the cache. Only one thread is allowed to call the database; the rest wait until the cache is populated and then return the list from the cache once it is available.
Note also that the list of states or provinces for each country will be cached individually.
I'm relatively new to programming so here is my question.
I have a C#-Forms application and an access database.
In the database I have the data of about 200-300 cars (Name, Year of construction, ...)
In my Forms application I show all the cars in a list, and I have a filter where I can search for specific words, types and so on.
At the moment I react to every filter input and then execute a new SQL query to list all the cars that fit the filter.
I obviously don't think that's a good solution, because I have a database access on every KeyDown action.
Is it a viable way to create a car-class and create an instance of this class for every car and store them in a list?
What is the best way to handle all the 200 cars without reading them out of the database over and over again?
If your car data does not change frequently, you can store the data in memory and then filter on that.
When a new record is added, you need to update the in-memory data.
The following code may help, although you should fill in the blanks:
class CarDA
{
    public const string YOUR_SPECIFIC_CACHE_KEY = "YOUR_SPECIFIC_CACHE_KEY";

    public IList<Car> Search(string searchExpression)
    {
        var carList = ListAllCars();
        //AMK: do the math and do the filter, e.g.:
        return carList
            .Where(c => c.Name.IndexOf(searchExpression, StringComparison.OrdinalIgnoreCase) >= 0)
            .ToList();
    }

    private IList<Car> ListAllCars()
    {
        var expireTime = 10;
        if (!MemoryCache.Default.Contains(YOUR_SPECIFIC_CACHE_KEY))
        {
            MemoryCache.Default.Add(
                new CacheItem(YOUR_SPECIFIC_CACHE_KEY, PopulateCarList()),
                new CacheItemPolicy
                {
                    AbsoluteExpiration = DateTimeOffset.Now.AddMinutes(expireTime)
                });
        }
        return MemoryCache.Default.Get(YOUR_SPECIFIC_CACHE_KEY) as IList<Car>;
    }

    private IList<Car> PopulateCarList()
    {
        //AMK: fetch the car list from the db and create a list
        return new List<Car>();
    }
}

class Car
{
    public string Name { get; set; }
}
class Car
{
public string Name { get; set; }
}
This seems more a philosophy question than a technical one, so I'd like to give you my 2 cents on it.
If the cars database is modified only by your application, and the application is run by a single person, you can read the data once and add a mechanism in your UI to reload the in-memory data whenever you update the database through the UI.
If you have multiple users reading and updating the database, so that one user can add cars while another is reading them to perform some operation, you need a mechanism that tells all users when the data has been modified by someone else, so they can reload the car list. (It can be something as simple as a periodic query against a table storing the date and time of the last update to the cars table, or something more sophisticated in other cases.)
That said, when working with databases I usually prepare a DataProvider class that talks to the database (creating, updating, deleting and querying data) and a class that represents a table row of my data (in your case a Car class). The DataProvider returns a List to my user interface, which I can use as is. If needed, I can move to an ObservableCollection when using WPF UI objects whose data can change, or create something like a UICar object - a Car plus other properties related to its use inside the user interface - which can be more useful for providing actions and functionality to the user of your application.
Once you have your cars collection inside your application, searching the in-memory data with a simple LINQ query becomes rather simple to implement and much more effective than calling the database on every change in the search textbox.
Also, as a suggestion: to avoid querying (memory or db, it's the same) on every keystroke, set a short timer before starting the query and reset it in the key event, so that if the user is typing a word, the query starts only when he/she stops typing.
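The timer idea (usually called debouncing) can be sketched with System.Threading.Timer; the Debouncer name and the delay value are my own choices, not from any framework:

```csharp
using System;
using System.Threading;

// Minimal debounce helper: each keystroke calls Trigger(); the action
// runs only after the input has been quiet for the given delay.
public sealed class Debouncer : IDisposable
{
    private readonly Timer _timer;
    private readonly int _delayMs;

    public Debouncer(Action action, int delayMs = 300)
    {
        _delayMs = delayMs;
        // Created disabled; Trigger() arms it.
        _timer = new Timer(_ => action(), null, Timeout.Infinite, Timeout.Infinite);
    }

    // Call this from the TextBox KeyDown/TextChanged handler;
    // it restarts the countdown on every keystroke.
    public void Trigger() => _timer.Change(_delayMs, Timeout.Infinite);

    public void Dispose() => _timer.Dispose();
}
```

In a Forms app you would wire it up as, e.g., txtSearch.TextChanged += (s, e) => debouncer.Trigger(); note the callback fires on a thread-pool thread, so update controls via Control.BeginInvoke.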
HTH
I have a somewhat complex permission system that uses six database tables in total and in order to speed it up, I would like to cache these tables in memory instead of having to hit the database every page load.
However, I'll need to update this cache when a new user is added or a permission is changed. I'm not sure how to go about building this in-memory cache, or how to update it safely without causing problems if it's accessed at the same time as it's being updated.
Does anyone have an example of how to do something like this or can point me in the right direction for research?
Without knowing more about the structure of the application, there are lots of possible options. One such option might be to abstract the data access behind a repository interface and handle in-memory caching within that repository. Something as simple as a private IEnumerable<T> on the repository object.
So, for example, say you have a User object which contains information about the user (name, permissions, etc.). You'd have a UserRepository with some basic fetch/save methods on it. Inside that repository, you could maintain a private static HashSet<User> which holds User objects which have already been retrieved from the database.
When you fetch a User from the repository, it first checks the HashSet for an object to return; if it doesn't find one, it gets it from the database, adds it to the HashSet, then returns it. When you save a User, it updates both the HashSet and the database.
Again, without knowing the specifics of the codebase and overall design, it's hard to give a more specific answer. This should be a generic enough solution to work in any application, though.
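A minimal sketch of that repository follows. The answer suggests a HashSet&lt;User&gt;; here a Dictionary keyed by id is used instead, since lookup by key is what the fetch path needs. The User shape and the load/save delegates are stand-ins for your own types and data access:

```csharp
using System;
using System.Collections.Generic;

public class User
{
    public int Id { get; set; }
    public string Name { get; set; }
}

public class UserRepository
{
    // Shared across repository instances, guarded by a lock for safe
    // concurrent reads and updates.
    private static readonly Dictionary<int, User> _cache = new Dictionary<int, User>();
    private static readonly object _sync = new object();

    private readonly Func<int, User> _loadFromDatabase;
    private readonly Action<User> _saveToDatabase;

    public UserRepository(Func<int, User> load, Action<User> save)
    {
        _loadFromDatabase = load;
        _saveToDatabase = save;
    }

    public User Get(int id)
    {
        lock (_sync)
        {
            if (_cache.TryGetValue(id, out var cached))
                return cached;

            var user = _loadFromDatabase(id);
            _cache[id] = user;
            return user;
        }
    }

    public void Save(User user)
    {
        lock (_sync)
        {
            _saveToDatabase(user);   // write-through: DB first, then cache
            _cache[user.Id] = user;
        }
    }
}
```

Because Save updates the cache and the database under the same lock, a reader can never observe a permission change that has been cached but not persisted, or vice versa.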
I would cache items as you use them: in your data layer, when you fetch data, first check whether it is available in the cache; otherwise go to the database and cache the result afterwards.
public AccessModel GetAccess(string accessCode)
{
    // Read once into a local, with the correct type
    var cached = cache.Get<AccessModel>(accessCode);
    if (cached != null)
        return cached;

    // Cache miss: load from the database and cache the result
    var result = GetFromDatabase(accessCode);
    cache.Set(accessCode, result);
    return result;
}
Then I would think about my cache invalidation strategy. You can go one of two ways:
One is to set the expiration to one hour, so you only hit the database once an hour.
The other is to invalidate the cache whenever you update the data. That is for sure the best option, but it is a bit more complex.
Hope it helps.
Note: you can use either the ASP.NET Cache or another solution like memcached, depending on your infrastructure.
Is it hitting the database every page load that's the problem or is it joining six tables that's the problem?
If it's just that the join is slow, why not create a database table that summarizes the data in a way that is much easier and faster to query?
This way, you just have to update your summary table each time you add a user or update a permission. If you group all of this into a single transaction, you shouldn't have issues with out-of-sync data.
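The single-transaction idea can be sketched with plain ADO.NET. All table and column names here (UserPermissions, PermissionSummary) are invented for illustration; the point is only that both writes commit or roll back together:

```csharp
using System.Data.SqlClient;

public static class PermissionUpdater
{
    // Update a user's permission and the denormalized summary row
    // in one transaction, so readers never see them out of sync.
    public static void UpdatePermission(string connectionString, int userId, int permissionId)
    {
        using (var conn = new SqlConnection(connectionString))
        {
            conn.Open();
            using (var tx = conn.BeginTransaction())
            {
                using (var cmd = conn.CreateCommand())
                {
                    cmd.Transaction = tx;
                    cmd.CommandText =
                        @"UPDATE UserPermissions SET PermissionId = @p WHERE UserId = @u;
                          UPDATE PermissionSummary SET PermissionId = @p WHERE UserId = @u;";
                    cmd.Parameters.AddWithValue("@p", permissionId);
                    cmd.Parameters.AddWithValue("@u", userId);
                    cmd.ExecuteNonQuery();
                }
                tx.Commit(); // both tables change together
            }
        }
    }
}
```

If either UPDATE fails, disposing the uncommitted transaction rolls both back, which is what keeps the summary table trustworthy as a fast read path.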
You can take advantage of ASP.NET caching and the SqlCacheDependency class. There is an article on MSDN.
You can use the Cache object built into ASP.NET. Here is an article that explains how.
I suggest caching such data in the Application state object. For thread-safe usage, consider using the lock statement. Your code would look something like this:
// Lock on a private static object: locking on HttpContext.Current would not
// synchronize across requests, since each request has its own context.
private static readonly object _appLock = new object();

public void ClearTableCache(string tableName)
{
    lock (_appLock)
    {
        System.Web.HttpContext.Current.Application[tableName] = null;
    }
}

public SomeDataType GetTableData(string tableName)
{
    lock (_appLock)
    {
        if (System.Web.HttpContext.Current.Application[tableName] == null)
        {
            // get data from DB, then put it into application state
            var dataFromDb = LoadTableFromDb(tableName); // your own DB-access method
            System.Web.HttpContext.Current.Application[tableName] = dataFromDb;
            return dataFromDb;
        }
        return (SomeDataType)System.Web.HttpContext.Current.Application[tableName];
    }
}