How to use Azure App Service Local Cache? - C#

I have an MVC API project that uses a server-side cache. It is deployed to Azure App Service. I was wondering whether to make use of App Service Local Cache to overcome the challenge of keeping the cache in sync across nodes. Is this the right approach? If yes, how do I modify the cache from my code?
PS: I am not particularly interested in using Redis Cache.

I was wondering to make use of App Service Local Cache to overcome challenge of keeping the cache in sync across nodes.
What does sync across nodes mean? Do multiple webapp applications share cache data?
HttpRuntime.Cache can only be used within a single web app and cannot be accessed by other web apps. Each cache entry takes up server resources once created, so more caching is not automatically better. Cached entries have a time limit: after the expiration time set by the server, they are recycled. The cache can store any object.
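For reference, a minimal sketch of working with HttpRuntime.Cache inside a single web app; the key name, the 10-minute expiration, and the LoadProductsFromDatabase helper are all illustrative, not part of any real API:

```csharp
using System;
using System.Web;
using System.Web.Caching;

// Insert an item with a 10-minute absolute expiration; the key
// and timeout here are illustrative values, not requirements.
HttpRuntime.Cache.Insert(
    "productList",                      // cache key
    LoadProductsFromDatabase(),         // hypothetical data-loading helper
    null,                               // no cache dependency
    DateTime.UtcNow.AddMinutes(10),     // absolute expiration
    Cache.NoSlidingExpiration);

// Read it back; a null result means the entry expired or was evicted.
var products = HttpRuntime.Cache["productList"];
```

Note that this cache lives in the worker process, which is exactly why a second web app (or a second farm node) can never see it.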
So sharing cache data across sites this way is impossible, which is why distributed caches such as Redis exist.
If you really don't want to use Redis, you can store the required data in SQL Server instead. This is only an alternative, though, not the best option.

Related

How do I connect SQL Server to Redis in a C#/.NET application?

I have a web site for which I would like to use Redis as the default caching mechanism to load certain pages of our site. I've set up a local instance of Redis and want to access my SQL Server database, but I'm not quite sure how to have the files of the particular pages loaded via Redis, or how to connect Redis to SQL Server. I've searched extensively on Google and really only found how to set up Redis in a C#/.NET application (which I've done successfully), but not how to connect it to SQL Server and query the database. I hope this wasn't too broad, so let me know if you need more information, and thank you for your help!
Choose an appropriate point in your application life cycle to load the data you want cached.
Write C# code to access your SQL Server and pull the data you wish to cache.
Use the Redis API to store this data in Redis.
Make all access to this data go through the Redis API.
Handle cache misses, i.e. when the expected data isn't present.
Implement a cache refresh mechanism to ensure data is updated when required.
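The steps above can be sketched as a cache-aside pattern using StackExchange.Redis and System.Data.SqlClient; the connection strings, the cache key, the query, and the 5-minute expiry are all placeholders, not values from the question:

```csharp
using System;
using System.Data.SqlClient;
using StackExchange.Redis;

// Cache-aside sketch: try Redis first, fall back to SQL Server on a miss,
// then populate Redis. Connection strings and the query are placeholders.
var redis = ConnectionMultiplexer.Connect("localhost:6379");
IDatabase cache = redis.GetDatabase();

string cached = cache.StringGet("page:home");
if (cached == null)                      // cache miss
{
    using (var conn = new SqlConnection("Server=.;Database=MyDb;Integrated Security=true"))
    using (var cmd = new SqlCommand("SELECT Content FROM Pages WHERE Name = 'home'", conn))
    {
        conn.Open();
        cached = (string)cmd.ExecuteScalar();
    }
    // Refresh mechanism: expire after 5 minutes so the next miss re-reads SQL.
    cache.StringSet("page:home", cached, TimeSpan.FromMinutes(5));
}
```

The expiry handles the "refresh" step passively; invalidating the key explicitly when the underlying row changes is the more aggressive alternative.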

SharePoint 2010 and ASP.net Cache

We have a web farm of 4 SharePoint 2010 servers (web1, web2, web3, web4) whose AAM is http://xxx.sharepoint.com. I am building a web part in which some data is cached via HttpContext.Current.Cache. As we know, ASP.NET cache is limited to server memory and is not shared between servers in the web farm. So I created a solution where, in each server's web.config, I list the farm's servers separated by semicolons; whenever anything is added to the cache I send the data to the other servers to keep the caches in sync, and I do the same for cache items that are removed.
Now my problem is how I will call other server individually by its own name in the SharePoint farm.
For example (http://web1/)
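The broadcast approach described in the question could be sketched like this; the "FarmServers" config key, the /CacheSync.ashx endpoint, and the query-string protocol are all assumptions for illustration:

```csharp
using System;
using System.Configuration;
using System.Net;

// Broadcast a cache update to every peer listed in web.config, e.g.
// <add key="FarmServers" value="http://web1;http://web2;http://web3;http://web4" />
// The "FarmServers" key and the /CacheSync.ashx endpoint are illustrative.
string[] servers = ConfigurationManager.AppSettings["FarmServers"].Split(';');
foreach (string server in servers)
{
    using (var client = new WebClient())
    {
        // Each peer would host a handler that applies the update to its
        // own HttpContext.Current.Cache.
        client.UploadString(server + "/CacheSync.ashx?op=add&key=myKey", "serialized-value");
    }
}
```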
Not a direct answer, but you can use timer jobs in SharePoint to implement something on all servers. Note that the SharePoint Timer service must be running on all servers, and it does not happen in real time (it doesn't take long either).
I used Memcached, which is working well, as expected :)
Microsoft Velocity is a perfect fit for your scenario. Velocity provides a distributed, in-memory cache.
Link:
Build Better Data-Driven Apps With Distributed Caching

Detect when IIS is recycling in Azure

I'm developing a WCF service in Azure, in a web role. I build an index in memory and use it to serve WCF requests. My problem is that this index is gone from memory after IIS recycles. Is there any way to detect the recycling event and load the index into memory again?
Thanks for your help.
Yes you can detect it in the Application_End event (in your Global.asax.cs).
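A minimal sketch of hooking this in Global.asax.cs; the IndexBuilder.Rebuild method is a placeholder for however the index is actually constructed:

```csharp
using System;
using System.Web;

public class Global : HttpApplication
{
    protected void Application_End(object sender, EventArgs e)
    {
        // Fires when the application domain is torn down, e.g. on an
        // IIS recycle. Log it or persist state here; note there is only
        // a limited window before the process exits.
    }

    protected void Application_Start(object sender, EventArgs e)
    {
        // The new process gets a fresh Application_Start on the next
        // request, so rebuilding the index here covers recycles too.
        IndexBuilder.Rebuild(); // hypothetical rebuild method
    }
}
```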
The better solution would be not to put the index in memory. Since you're using a Web Role, you can use Windows Azure Caching (you won't need to pay extra for this). By using Windows Azure Caching you can store the index in memory, but not in the process of the application pool. It's still super-fast and it can survive application pool recycles.
Another advantage is that, when you store the index in Windows Azure Caching, all instances and roles in the same deployment will be able to use the same cache. This means they'll all use the same index you store in it.

azure cache on localhost

I'm developing an application to run in azure.
I'm making use of the Azure cache, but when I run the application locally I don't want to connect to Azure to use the cache, because it's a bit slow and tedious.
Can you run the cache locally?
[EDIT]
This is .Net C#
Unfortunately, you do need to connect to Azure to test the Windows Azure caching service. Read this for more info: http://msdn.microsoft.com/en-us/library/windowsazure/gg278342.aspx
You can use Windows Server AppFabric Cache for local debugging. It uses a very similar configuration and programming model, which means almost all you need to change is the cache server IP and access token.
But I'd recommend creating a separate cache layer to isolate the cache operations. For example, introduce an ICache interface with Add, Get, Remove, etc. methods. Then you can implement it for Azure Cache, Memcached, in-process cache, etc. as circumstances require.
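The ICache interface mentioned above might look like this; the method signatures and the in-process implementation are one possible shape, with an Azure Cache or Memcached version implementing the same interface behind a config switch:

```csharp
using System;
using System.Collections.Concurrent;

// A minimal cache abstraction; an Azure Cache or Memcached backend
// would implement this same interface.
public interface ICache
{
    void Add(string key, object value, TimeSpan expiry);
    object Get(string key);
    void Remove(string key);
}

// In-process backend for local development: values plus their expiry time.
public class InProcCache : ICache
{
    private readonly ConcurrentDictionary<string, Tuple<object, DateTime>> store =
        new ConcurrentDictionary<string, Tuple<object, DateTime>>();

    public void Add(string key, object value, TimeSpan expiry)
    {
        store[key] = Tuple.Create(value, DateTime.UtcNow.Add(expiry));
    }

    public object Get(string key)
    {
        Tuple<object, DateTime> entry;
        if (store.TryGetValue(key, out entry) && entry.Item2 > DateTime.UtcNow)
            return entry.Item1;
        Remove(key);                     // drop expired entries lazily
        return null;
    }

    public void Remove(string key)
    {
        Tuple<object, DateTime> ignored;
        store.TryRemove(key, out ignored);
    }
}
```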
There's a good cache layer you might be interested in; check the ServiceStack project on GitHub: https://github.com/ServiceStack/ServiceStack/tree/master/src/ServiceStack.Interfaces/CacheAccess
It's not possible. To use the Windows Azure caching service locally, you'll always have to route your requests to Azure, which adds a serious delay on top of each request.
To properly test your cache, you need to deploy your service to Azure.
As others said, you can use Windows Server AppFabric caching locally, but be warned: there are some differences between Windows Server AppFabric caching and the Windows Azure caching service. For example, notification-based invalidation of local cache items is not supported in Azure. Make sure not to use any of these features while developing locally, or you might get a surprise when deploying your service to the cloud.
Only timeout-based invalidation of the local cache is supported by the Windows Azure caching service. The Windows Azure caching service is designed for your cloud services, so it makes sense that it's a poor fit for an on-premises application.
Azure AppFabric caching uses a subset of the functionality of Windows Server AppFabric caching. If you're willing to set up a server in-house with the cache installed, you could probably get something comparable to using the Azure cache. I haven't tried this myself, so while I know the code you'd need to write is more or less the same between the two, I'm not sure how different the configs need to be.
Chances are though that it's going to be a lot less time and effort to just use the Azure cache.
This article specifically talks about what you are trying to do. Create a caching "infrastructure" that switches between local and distributed cache based on configuration(s):
http://msdn.microsoft.com/en-us/magazine/hh708748.aspx
Now you can use Azure In-Role Cache and test it locally using the compute emulator.

How would a static variable work in a web farm?

I have a static class with a static dictionary to keep track of some stats. Is this approach viable in a single and multi server environment?
Not in a multi-server environment, unless the dictionary isn't a .NET Dictionary but one that works against a database or some other outside storage. Also, even in a single-server environment, remember that the dictionary will be emptied on every IIS recycle.
No, a static variable in memory in one server won't be visible or synchronized with other servers.
Since you mention "keeping track of stats", what many multi-instance server farms end up doing is post-processing locally gathered data into an aggregate view of activity across all the servers. Each server accumulates stats in memory for a short while (a few minutes), then writes the data from memory to a log file on the local file system. Every few hours or daily a batch process copies the log files from each server in the farm to a central location and merges all the log records together to form one collection of data. One useful byproduct of this might be consolidating user activity across multiple servers if a user's web requests were served by different servers in the same farm (load balancing).
Depending on the volume of data, this batch processing of log data may itself require quite a bit of computational power, with multiple machines crunching different subsets of the server logs, and then those intermediates getting reduced again.
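The first stage described above can be sketched as a static dictionary that is periodically flushed to a local log file for later aggregation; the file path and the 5-minute flush interval are illustrative:

```csharp
using System;
using System.Collections.Concurrent;
using System.IO;
using System.Linq;
using System.Threading;

public static class ServerStats
{
    private static readonly ConcurrentDictionary<string, long> Counters =
        new ConcurrentDictionary<string, long>();

    // Flush to a per-server local log every 5 minutes (illustrative interval);
    // a nightly batch job would merge these files across the farm.
    private static readonly Timer FlushTimer =
        new Timer(_ => Flush(), null, TimeSpan.FromMinutes(5), TimeSpan.FromMinutes(5));

    public static void Increment(string key)
    {
        Counters.AddOrUpdate(key, 1, (k, v) => v + 1);
    }

    private static void Flush()
    {
        var lines = Counters.Select(kv =>
            string.Format("{0:o}\t{1}\t{2}", DateTime.UtcNow, kv.Key, kv.Value));
        File.AppendAllLines(@"C:\logs\stats.log", lines);  // path is illustrative
        Counters.Clear();  // increments landing between Select and Clear can be lost
    }
}
```

This keeps the stats best-effort, which is usually acceptable for activity counters; anything that must be exact belongs in a database, not a static variable.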
Google Sawzall is an example of a very large scale distributed data processing system. I believe Sawzall is (or was) used at Google to process server access logs to detect fraudulent ad click patterns.
No, it's not viable in a web farm multi-server environment.
Storing Session State in an ASP.Net Web Farm
Allowing Session in a Web Farm? Is StateServer Good Enough?
Fast, Scalable, and Secure Session State Management for Your Web Applications
ASP.NET Session State
Session-State Modes
The State Mode options are:
InProc mode, which stores session state in memory on the Web server. This is the default.
StateServer mode, which stores session state in a separate process called the ASP.NET state service. This ensures that session state is preserved if the Web application is restarted and also makes session state available to multiple Web servers in a Web farm.
SQLServer mode stores session state in a SQL Server database. This ensures that session state is preserved if the Web application is restarted and also makes session state available to multiple Web servers in a Web farm.
Custom mode, which enables you to specify a custom storage provider.
Off mode, which disables session state.
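For example, switching a farm to the out-of-process state service is a web.config change; the host below is a placeholder and 42424 is the state service's default port:

```xml
<configuration>
  <system.web>
    <!-- StateServer mode: session state survives app restarts and is
         shared by all servers pointing at the same state service. -->
    <sessionState mode="StateServer"
                  stateConnectionString="tcpip=stateserverhost:42424"
                  timeout="20" />
  </system.web>
</configuration>
```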
Normally, this is what you would use Application state for.