I wonder if there are any cache engines or if someone has a good solution that can handle these requirements:
It should store plain HTML fragments of a page, like the standard output cache in ASP.NET
The HTML may contain dynamic content from a database
When an object is updated in the database, all cached HTML fragments containing that particular object should be destroyed and re-cached the next time they are requested.
There is a separate admin tool that handles all data in the database, so I can easily store the IDs in a cache table when an object is invalidated. I can also make a request to a page that destroys all cached HTML fragments for that object.
But when I write the markup, how can I store and retrieve a particular segment from the cache? Of course I could do this in code-behind and keep the markup in a string, but I don't want that. I want to keep the markup as intact as possible.
I'm assuming this is all within an ASP.NET page. I think your situation is simple enough that you can write this caching mechanism yourself:
Make your SQL query as usual
Use the results to output your aspx page
Store the results in a static variable or a static "Cache" class; this will persist from request to request
Future requests would use your stored results, unless they are invalidated as you mention. (In my experience, a one-minute expiration can work well.)
If you want to get more complicated and preserve the cache even if IIS is restarted, you could easily use an XmlSerializer and write your results to a file instead of using static variables.
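The steps above could be sketched roughly like this minimal static cache with time-based expiration; the class and member names are illustrative assumptions, not from the original answer:

```csharp
using System;
using System.Collections.Concurrent;

// A minimal static in-process cache with time-based expiration.
// Static fields persist from request to request within one worker process.
public static class SimpleCache
{
    private class Entry
    {
        public object Value;
        public DateTime ExpiresAt;
    }

    private static readonly ConcurrentDictionary<string, Entry> _entries =
        new ConcurrentDictionary<string, Entry>();

    // Returns the cached value, or runs `fetch` (e.g. your SQL query) and
    // caches the result for `ttl` (the answer suggests about one minute).
    public static object GetOrAdd(string key, Func<object> fetch, TimeSpan ttl)
    {
        if (_entries.TryGetValue(key, out var e) && e.ExpiresAt > DateTime.UtcNow)
            return e.Value;

        var value = fetch();
        _entries[key] = new Entry { Value = value, ExpiresAt = DateTime.UtcNow.Add(ttl) };
        return value;
    }

    // Explicit invalidation, for when the underlying object changes.
    public static void Invalidate(string key) => _entries.TryRemove(key, out _);
}
```

Note that, as the answer implies, this only persists within a single worker process; an IIS recycle clears it unless you also write the data to a file.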
You have two problems to consider:
Number 1: Cache invalidation. This can be solved using cache dependencies (see http://msdn.microsoft.com/en-us/library/system.web.caching.cachedependency.aspx ), which allow things to be invalidated and re-cached as data changes (automatically!).
Number 2: Cache storage. For this, just use the standard ASP.NET cache API via HttpContext.Current.Cache (or a suitable abstraction). It supports the dependency system mentioned above.
The ASP.NET cache implementation is pluggable, so you can change the storage mechanism between in-memory, files, SQL databases, Memcached (via Enyim), or Microsoft Velocity, for example. Each cache store has different reliability characteristics that you can weigh against your requirements.
Some background - I am working on a project which requires a kind of handshake authentication. The external service will send a request with a Token, and I will answer with a Validator. Then it will send a second request containing the same Token and the data I should store in my database. The Token is also used to look up a couple of extra fields that are required to insert the data into the database. Due to several project constraints and requirements, this "API" is implemented serverless (Azure Functions).
Since there are only 100-something Token-Validator pairs, and they are not updated often (I will update them manually every month or so), I have decided not to query the database every time I get an incoming request. Normally I would simply use in-memory caching in C#, but since I am working with Functions, the code will be executed in multiple changing processes, which means no shared cache. I also think that using a cache service such as Redis or Azure Cache would be overkill.
My current solution - Currently, I am storing the data in a Hashtable that maps a Token to a ValidatorModel object containing the Validator and the extra fields I require. It works pretty well, but since it is a big C# object, it is a pain to update, the IDE lags when I open it, etc. I also don't know whether it is a good idea to have it hardcoded in C# like that.
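For illustration, the hardcoded map described above might look something like the sketch below; the type and field names are assumptions based on the question, not the asker's actual code:

```csharp
using System.Collections.Generic;

// Illustrative stand-in for the asker's ValidatorModel; field names assumed.
public class ValidatorModel
{
    public string Validator { get; set; }
    public string ExtraField1 { get; set; }
    public string ExtraField2 { get; set; }
}

public static class TokenStore
{
    // A static dictionary is initialized once per process. With only ~100
    // entries the lookup is cheap, but, as the question notes, hardcoding
    // the data in C# makes the file painful to maintain.
    public static readonly Dictionary<string, ValidatorModel> ByToken =
        new Dictionary<string, ValidatorModel>
        {
            ["token-001"] = new ValidatorModel { Validator = "validator-001", ExtraField1 = "a", ExtraField2 = "b" },
            ["token-002"] = new ValidatorModel { Validator = "validator-002", ExtraField1 = "c", ExtraField2 = "d" },
            // ... roughly a hundred more entries ...
        };
}
```

Loading the same data from a file (JSON, or the binary protobuf file the question considers) into this static field at startup would keep the lookup identical while moving the data out of source code.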
What I have been thinking about - I was thinking about storing a binary protobuf file that contains my Hashtable. I am unsure whether this would work or perform well.
My question - What is the best way to store such data and access it in a performant way?
I'm working with ASP.NET and I want to load a big object (specific to each user) once in my controller and then use it in my view.
I thought about a static property, but I found some problems with it.
See: Is it a bad idea to use a large static variable?
I'm not familiar with this language and I have no idea how to properly share the same object between the different methods for each user. Could you tell me how to do that? Do you know if a singleton could be a solution?
A singleton won't help you here if this big object is going to be different for every user.
Without knowing all the details, I'd say perhaps your best option is to store it in the Session object:
HttpContext.Current.Session["bigObject"] = bigObject;
Depending on the size and traffic, this can cause performance problems, so I'd suggest you read up on the pros and cons of using Session.
If you want something to be available only for the next request, use TempData, a bucket where you can hold data that is needed only for the following request. That is, anything you put into TempData is discarded after the next request completes.
If you want to persist information specific to a user, go for Session. Sessions have a timeout, so after a certain amount of time the data stored in the session is lost; you can configure this value to be much larger. Apart from that, when you move to a web farm of servers, maintaining session state becomes a problem; you would probably need to manage session state in SQL Server or some other store.
Alternatively, you can use the runtime Cache object in ASP.NET MVC to keep common data. Cached data can be accessed quickly, and you get other benefits such as cache expiration. You can share this Cache object across users, or maintain different cache entries for different users; that is purely dependent on your logic. In the case of web farms, there are distributed cache servers, such as Redis or the Azure Cache service, which you can use.
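The shared-versus-per-user distinction above comes down to how you build your cache keys. A small sketch (the key scheme is an assumption; in a real MVC app you would back this with HttpContext.Current.Cache or HttpRuntime.Cache rather than a plain dictionary):

```csharp
using System.Collections.Concurrent;

// Stand-in for the ASP.NET runtime cache, used here only to show the key scheme.
public static class AppCache
{
    private static readonly ConcurrentDictionary<string, object> _store =
        new ConcurrentDictionary<string, object>();

    public static void Set(string key, object value) => _store[key] = value;

    public static object Get(string key) =>
        _store.TryGetValue(key, out var v) ? v : null;

    // Shared data: one key for all users.
    public static string SharedKey(string name) => "shared:" + name;

    // Per-user data: the user's identity is baked into the key, so each
    // user gets a separate cache entry.
    public static string UserKey(string userId, string name) =>
        "user:" + userId + ":" + name;
}
```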
My MVC-based application calls some web services which send back lots of data!
I use that data to render my views. The web services are slow and out of my control.
So I would like to store this info per session, but I am afraid this will bring my web server to its knees. With a few hundred users, the web server would run out of memory.
Is there a way I can store this session data in a file per session? I am mostly looking for out-of-the-box open source solutions.
I welcome new suggestions as well!
You can store pretty much any object in Session storage, with a few exceptions that are generally related to running on a server farm. I'm going to ignore those cases here, however.
If you're dealing with only a few MB of data, storing it in the Session object (or a Cache, as @Rick suggests) isn't necessarily a major problem. Once the data has been returned from the web service and parsed into your own internal data structures, simply place the data structure's root object into the Session. I use this method fairly often to store the results of database queries that take a long time to run, especially when the query criteria are unlikely to change frequently.
For larger data sets you should probably use a database to store the information. Create tables that match the structure of the data you're returning and tag the data in some way to indicate how old it is and what criteria were used when fetching it. Update as required, and query the database for records on each client request.
There are plenty of other options, including creating temporary files to store the data using the SessionID to identify them, but I recommend investigating the database option first.
Caching is your friend. And since you use MS technology, you might want to take a look at the Cache class.
You could just serialize the result collection and save it to a file as XML (you can even query it directly from the XML using LINQ/XPath), or use any .NET native XML database to store and persist the data in a file.
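A minimal sketch of the serialize-to-file approach using XmlSerializer; the result type and file layout are assumptions, since the real collection would come from the web service:

```csharp
using System.Collections.Generic;
using System.IO;
using System.Xml.Serialization;

// Illustrative result type; in practice this mirrors the web service's data.
public class ServiceResult
{
    public int Id { get; set; }
    public string Name { get; set; }
}

public static class XmlResultStore
{
    // Serialize the result collection to an XML file on disk.
    public static void Save(string path, List<ServiceResult> results)
    {
        var serializer = new XmlSerializer(typeof(List<ServiceResult>));
        using (var stream = File.Create(path))
            serializer.Serialize(stream, results);
    }

    // Read the collection back; this survives process restarts, unlike
    // in-memory session or static variables.
    public static List<ServiceResult> Load(string path)
    {
        var serializer = new XmlSerializer(typeof(List<ServiceResult>));
        using (var stream = File.OpenRead(path))
            return (List<ServiceResult>)serializer.Deserialize(stream);
    }
}
```

Naming the file after the SessionID, as another answer suggests, would give you one file per session.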
I am doing my first ASP.NET MVC project. (In fact, for the record, this is my first production website of any kind).
This is a mobile web page targeting HTML 5.
This page looks up some "expensive" information. First it uses the HTML5 geolocation feature to get the customer's lat/long from their browser.
I pass that information to a controller. That controller fills in the city and state (into a location model) using the Google Maps API and then uses it in the view.
So this data is expensive in two ways: first, it takes time to look up; and second, in the case of the Google API, it is literally not free, in that Google limits the number of calls that can be made per day.
I understand that MVC is designed to "embrace" the web, including the fact that HTTP is a stateless protocol.
Nevertheless, I would like to avoid having to fetch this expensive data on every page load for every endpoint that needs it. Furthermore, one other piece of state I would like to keep track of is the fact that the data has already been fetched.
What is the best way / best practice for achieving this in an MVC 3 web application? Since my model for location has four data points (lat, long, city, state), plus a fifth (whether the data has been retrieved), I would really like to avoid passing all of those data points in query strings or a route.
I am sure this is a common need but I honestly don't know how to tackle it. Any help will be appreciated.
Seth
It seems to me that you would like to cache the API call to Google.
http://msdn.microsoft.com/en-us/library/18c1wd61(v=vs.71).aspx
You can store the object you got from Google in the cache and retrieve it in the next controller event. You could also create another object that holds the object from Google plus a bool that indicates whether you have fetched the data or not.
It seems to me that the Cache would be your best bet.
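The wrapper object and cache key mentioned above might be sketched like this; the names and the rounding precision are assumptions, and in the real app the entry would be stored via the ASP.NET Cache API:

```csharp
using System.Globalization;

// Wrapper holding the geocoded result plus the "data retrieved" flag the
// question asks about. Names are illustrative.
public class LocationCacheEntry
{
    public string City { get; set; }
    public string State { get; set; }
    public bool DataRetrieved { get; set; }
}

public static class LocationCacheKey
{
    // Build a cache key from the lat/long sent to Google. Rounding keeps
    // nearby coordinates on the same key (precision is an assumption);
    // invariant culture avoids comma-versus-dot formatting surprises.
    public static string For(double lat, double lng)
    {
        return string.Format(CultureInfo.InvariantCulture,
            "geo:{0:F3}:{1:F3}", lat, lng);
    }
}
```

The key is unique per distinct place, so each location's city/state pair is only fetched from Google once while cached, avoiding repeated calls against the daily quota.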
You can store it in session state, if it is available in your case, instead of passing it between pages.
Since this is "expensive" data, but still not to be persisted for a long time, you may:
use Session state
put the data in the Cache and either
set a cookie to enable the retrieval of the "expensive" data from cache
use a cache key which is unique to each query (lat long city state ?)
store the data ("data retrieved") on the client (since you do not seem to persist it on the server side)
My personal preference would be server side cache with a unique key.
Store the expensive data in the cache, and build the cache ID from the parameters you send to Google; the cache ID should be unique for every distinct place.
Another option would be HTML5 storage. You will want to check whether your target browsers support it, though. The advantage of this approach is that the server does not have to keep track of this data in session or a database; in fact, the server doesn't know about client storage at all.
Try
Session[xxx] = xxx;
or
Application[xxx] = xxx;
instead.
I'm trying to store an XML-serialized object in a cookie, but I get an error like this:
A potentially dangerous Request.Cookies value was detected from the client (KundeContextCookie="<?xml version="1.0" ...")
I know this problem from similar cases where you try to store something that looks like JavaScript code in a form input field.
What is the best practice here? Is there a way (as with the form problem I described) to suppress this warning from the ASP.NET framework? Or should I serialize to JSON instead, or perhaps serialize it to binary? What is common practice when storing serialized data in a cookie?
EDIT:
Thanks for the feedback. The reason I want to store more data in the cookie than just the ID is that the object I really need takes about 2 seconds to retrieve from a service I have no control over. I made a lightweight object, 'KundeContext', to hold a few of the properties from the full object, and these are used 90% of the time. This way I only have to call the slow service on 10% of my pages. If I only stored the ID, I would still have to call the service on almost all my pages.
I could store all the strings and ints separately, but the object contains other lightweight objects, like 'contactinformation' and 'address', that would be tedious to store manually property by property.
Storing serialized data in a cookie is a very, very bad idea. Since users have complete control over cookie data, it's just too easy for them to use this mechanism to feed you malicious data. In other words: any weakness in your deserialization code becomes instantly exploitable (or at least a way to crash something).
Instead, only keep the simplest identifier possible in your cookies, of a type of which the format can easily be validated (for example, a GUID). Then, store your serialized data server-side (in a database, XML file on the filesystem, or whatever) and retrieve it using that identifier.
Edit: also, in this scenario, make sure that your identifier is random enough to make it infeasible for users to guess each other's identifiers, and impersonate each other by simply changing their own identifier a bit. Again, GUIDs (or ASP.NET session identifiers) work very well for this purpose.
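A sketch of the identifier-in-cookie pattern described above; the store here is an in-memory dictionary standing in for whatever server-side storage (database, XML file, Session, Cache) you choose, and the names are assumptions:

```csharp
using System;
using System.Collections.Concurrent;

public static class ServerSideStore
{
    // Server-side storage keyed by a random GUID; a stand-in for a
    // database table or the ASP.NET cache in this sketch.
    private static readonly ConcurrentDictionary<Guid, object> _store =
        new ConcurrentDictionary<Guid, object>();

    // Store the object and return the identifier to place in the cookie.
    // A GUID is random enough that users cannot feasibly guess each
    // other's identifiers.
    public static Guid Put(object data)
    {
        var id = Guid.NewGuid();
        _store[id] = data;
        return id;
    }

    // The cookie value is untrusted client input: validate its format
    // strictly (Guid.TryParse) before using it, rather than deserializing
    // anything the client sent.
    public static object Get(string cookieValue)
    {
        if (!Guid.TryParse(cookieValue, out var id))
            return null; // malformed or tampered cookie
        return _store.TryGetValue(id, out var data) ? data : null;
    }
}
```

Since only an opaque GUID travels in the cookie, any weakness in deserialization code is no longer reachable from the client.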
Second edit, after scenario clarification by the question owner: why use your own cookies at all in this case? If you keep a reference to either the original object or your lightweight object in session state (the Session object), ASP.NET will take care of all the implementation details for you in a pretty efficient way.
I wouldn't store XML data in the cookie; for starters, there is a limit on cookie size (it used to be 4 KB for all headers, including the cookie). Pick a less verbose encoding strategy, such as delimiters (e.g. a|b|c) or separate cookie values. Delimited encoding makes it especially easy and fast to decode the values.
The error you see is ASP.NET complaining that the headers look like an XSS attack.
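The delimiter approach might look like this sketch. It assumes the values themselves never contain the `|` character; values that might would need escaping or URL-encoding first:

```csharp
using System;

public static class CookieCodec
{
    // Encode a few fields as a|b|c, as the answer suggests; far more
    // compact than XML and trivially fast to decode.
    public static string Encode(params string[] values)
    {
        foreach (var v in values)
            if (v.Contains("|"))
                throw new ArgumentException("values must not contain the delimiter");
        return string.Join("|", values);
    }

    public static string[] Decode(string cookieValue)
    {
        return cookieValue.Split('|');
    }
}
```

Because it is still client-controlled data, each decoded field should be validated on the server just like any other request input.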
Look into ViewState. Perhaps you'd like to persist the data across postbacks in the ViewState instead of using cookies. Otherwise, you should probably store the XML on the server and put a unique identifier to that data in the cookie instead.
You might look into using session state to store the value. You can configure it to use a cookie to store the session ID. This is also more secure, because the value is neither visible to nor changeable by the user.
Another alternative is to use a distributed caching mechanism to store the value. My current favorite is Memcached.