I've noticed that my projects often need something to manage a cache of our data (for data-access performance, for offline work, and so on).
So I was wondering whether something already exists that meets my needs, or whether I will have to create my own framework for this. It could be just a "core" that provides the logic, with the business part left for us to implement.
My needs are:
Data sources can be WCF/web services/... (this part would need to be implemented on every new project).
It has to manage a store of available data.
This store must be refreshed regularly by polling the service.
This store can be persistent (the cache is written to disk for the next start).
The framework must allow modifications, online and offline, asynchronous and synchronous (if online).
It has to run with C# 4.0.
If the local cache store can be accessed through LINQ, that would be great (like querying a list directly).
Concurrency has to be managed (or the framework should offer us a way to manage it).
Using/configuring this framework should take less time than implementing it myself every time.
So here we are: do you know of a tool which fits these requirements?
Somebody told me that the Microsoft Enterprise Library should have something like that, but I didn't find anything.
Thank you!
You could have a look at Windows Server AppFabric. It used to be called 'Velocity'.
It is a distributed in-memory application cache platform for
developing scalable, high-performance applications.
Otherwise, the Enterprise Library Caching Application Block you're talking about is here: The Caching Application Block. However, that page says:
Caching Application Block functionality is built into .NET Framework
4.0; therefore the Enterprise Library Caching Application Block will
be deprecated in releases after 5.0. You should consider using the
.NET 4.0 System.Runtime.Caching classes instead of the Caching
Application Block in future development.
And actually, the System.Runtime.Caching Namespace is a very good building block if you're going to write something yourself. I don't think it implements the notion of a distributed cache, though; that's why Windows Server AppFabric exists.
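To give an idea, here's a minimal sketch of using it that way, with a MemoryCache entry that expires so it gets re-polled (the "customers" key and LoadCustomersFromService are hypothetical placeholders, not from any real project):

using System;
using System.Collections.Generic;
using System.Runtime.Caching;

class CustomerCache
{
    // MemoryCache.Default is an in-process cache shared within one AppDomain.
    private static readonly ObjectCache Cache = MemoryCache.Default;

    public IList<string> GetCustomers()
    {
        var customers = Cache["customers"] as IList<string>;
        if (customers == null)
        {
            customers = LoadCustomersFromService(); // hypothetical data-source call

            // Expire the entry after 5 minutes so the service gets re-polled.
            var policy = new CacheItemPolicy
            {
                AbsoluteExpiration = DateTimeOffset.UtcNow.AddMinutes(5)
            };
            Cache.Set("customers", customers, policy);
        }
        return customers;
    }

    private IList<string> LoadCustomersFromService()
    {
        // Placeholder for the WCF/web service call.
        return new List<string> { "Alice", "Bob" };
    }
}

Note that MemoryCache.Default lives inside a single process/AppDomain, which is exactly why the distributed options like AppFabric exist.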
Now, there are also non-Microsoft technologies available in the .NET space. Have a look at memcached and its .NET implementations and usage:
Is there a port of memcache to .Net?
Memcached with Windows and .NET
You also have commercial packages available, like NCache (I'm not affiliated). It's probably worth a look just to be aware of what they provide, to ensure you don't miss any feature you'd need later on.
Have a look at SharedCache.
Related
I am writing a public .NET class library for our online REST service, and I can't decide which version of .NET to choose.
I would like to use .NET 4.0, but a class library compiled against it can't be used from .NET 2.0, can it?
Maybe there are statistics on how many developers use .NET 2.0?
There's little reason not to use the latest version of the framework. Not only do you get all the latest bells and whistles that speed development time, but you also get to take advantage of all the bug fixes and improvements that Microsoft has made under the hood.
The only advantage of targeting earlier versions of the framework is in a vain hope that the user won't have to download and install anything in order to use your app. But that's far from foolproof, and mostly in vain. Remember that Windows is not a .NET Framework delivery channel and you can't reliably assume that the user will have any version of the .NET Framework installed. Even if you insisted on counting on it being bundled with Windows (which you shouldn't), lots of users still haven't upgraded from Windows XP. Even if you counted on it being pushed out over Windows Update, there are significant numbers of users who either don't use Windows Update, don't use Windows Update very often, or who live out in remote areas with poor/slow Internet access and can't download all of those updates.
The moral of the story is that you're going to have to provide the appropriate version of the .NET Framework with your application anyway. And the .NET 4.0 runtime is actually significantly smaller than the previous versions, so there's little reason to target them. The team has worked really hard on that, and their efforts have really paid off. Even better, as atornblad notes, most apps can target the Client Profile version of the framework which trims out some infrequently used pieces and slims things down another ~16%.
Additionally, I strongly recommend using a setup application that handles installing the required framework for the user automatically and seamlessly. Visual Studio comes with built-in support for creating setup applications, or you could use a third-party installer utility like Inno Setup. That makes using the latest version a no-brainer.
Everyone else seems to be recommending the latest version, so I'll buck the trend and suggest 2.0 if you don't actually need any features from later versions... that is, if this really is a client library and you have no control over (and little idea about) who is going to use it.
It really does depend on who your users are likely to be, which in turn depends on what the REST service is. If it's some sort of social media thing, then I'd say it's more likely that your clients will be in environments where they can use .NET 4. If it's something which might well be used by financial institutions or other big businesses, they may well not have the option of using .NET 4, so you should consider earlier versions. This is the approach we've taken for Noda Time where we believe the library will be useful in a wide variety of situations, and we can't predict the client requirements.
Of course, if you know all your clients and know they will all be able to use .NET 4, then go with that.
The big downsides of sticking to .NET 2.0 are that you won't be able to use LINQ internally (unless you use LINQBridge or something similar, which adds another dependency for your library) and you won't be able to (cleanly) provide extension methods. If you can usefully expose more features to the client if you use a later version, you may want to provide multiple versions of the library - but obviously that's a maintenance headache.
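As an aside, the "not cleanly" part has a well-known workaround: the C# 3.0+ compiler only looks for an attribute with the right name and namespace, so a library targeting .NET 2.0 can declare it itself. A sketch (the library and method names here are made up):

// Declared in your .NET 2.0-targeted library; the C# 3.0+ compiler
// only needs an attribute with this exact name and namespace to exist.
namespace System.Runtime.CompilerServices
{
    [AttributeUsage(AttributeTargets.Assembly | AttributeTargets.Class | AttributeTargets.Method)]
    public sealed class ExtensionAttribute : Attribute { }
}

namespace MyClientLibrary
{
    public static class StringExtensions
    {
        // A normal extension method, now usable even when targeting .NET 2.0,
        // provided consumers compile with the C# 3.0 or later compiler.
        public static bool IsNullOrWhiteSpace(this string value)
        {
            return value == null || value.Trim().Length == 0;
        }
    }
}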
Another consideration is whether you ought to provide a Silverlight version - which again depends on what sort of service you're providing and what sort of users you're expecting.
If you are making a REST service you should probably use 4.0.
The only time you need to consider using a legacy version is if another project needs to reference your compiled DLL. The REST service is exposed over HTTP on the internet, and the client will not use the .dll directly. Or did I misunderstand the question?
It's almost always a good idea to use the latest version, because MS provides a lot of bug fixes and innovations in each release.
If your system has a constraint that requires 2.0, I'm afraid you need to use that one, because you need to "make the stuff work".
For an approximate distribution of versions in use, you can look at this SO answer (but it only goes up to 3.5).
Unless you are creating your library to fit into an existing legacy environment, you should always use the most up-to-date release.
If I understand you correctly, you're looking to create a .NET-based client library to work with REST service(s) that you also made.
Perhaps you want to provide a client library that can be consumed by 2.0, 3.5, and 4.0 applications. This is absolutely possible, even while using the best features of each framework version.
There may be more approaches, but I'd like to suggest three of them:
Conditional compilation-based approach. You can implement your classes using a common feature set found in both legacy and newer framework versions, while still taking advantage of useful features present in each version. This is possible using conditional compilation and compilation symbols, since you can define specific code to be compiled depending on the target framework version (check this question: Is it possible to conditionally compile to .NET Framework version?); see the sketch after this list.
Linked files in a Visual Studio 2010-based approach. You can choose to use a common feature set, keeping in mind that this is going to be the one found in the oldest version. That is, you can create a project which compiles against 2.0, and others for the newer versions, adding all compilable files and embedded resources as links in these Visual Studio projects. This produces an assembly for each supported framework version. You can mix the conditional compilation-based approach with this one, which gives you a great way of delivering your public assembly for various framework versions in a very reliable and easy-to-maintain way. Note that whenever you add a new compiled file or resource to a project, you need to create the corresponding links to it in your other projects. Check this MSDN article if you want to learn more about linked files: http://msdn.microsoft.com/en-us/library/9f4t9t92.aspx.
Version-specific, optimized assemblies. Maybe the most time-consuming approach. It requires more effort, but if your REST service isn't a giant one, you may have room to develop a specific assembly for each framework version and take advantage of the best features and approaches of each.
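To illustrate the first approach, here's a minimal conditional-compilation sketch (the NET40 symbol is not predefined; you'd define it yourself in the 4.0 build configuration):

public static class Serialization
{
#if NET40
    // .NET 4.0 build: take advantage of newer BCL overloads.
    public static string Join(System.Collections.Generic.IEnumerable<string> values)
    {
        return string.Join(",", values); // this overload was added in .NET 4.0
    }
#else
    // .NET 2.0 build: stick to the common feature set.
    public static string Join(System.Collections.Generic.IEnumerable<string> values)
    {
        System.Text.StringBuilder sb = new System.Text.StringBuilder();
        foreach (string value in values)
        {
            if (sb.Length > 0) sb.Append(',');
            sb.Append(value);
        }
        return sb.ToString();
    }
#endif
}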
My opinion
In my opinion, I'd take approach #2, because it has the best of #1 and #3. Once you get used to it, it's easy to maintain (it's all about discipline), and you'll give your client developers a good range of choices.
I'd compromise and use the oldest framework that provides you (the library's author) the most bang for your buck. It's the compromise that lets you develop the fastest while exposing your library to the most users. For me, that usually means 3.5, because I tend to use LINQ extensively.
It should be trivial to provide both 2.0 and 4.0 binaries, as long as you're not using any of the 4.0 specific dlls.
You can also publish your client library source code - .NET binaries are already so easy to decompile that you're not leaking out anything valuable this way.
I am trying to implement caching in .Net such that the cached data is accessible not just by an application that may run multiple times on the same machine but by other types of applications that may run on the machine. They could be windows services, web services, win forms etc.
I have looked at System.Runtime.Caching (because the Enterprise Library Caching Application Block is going to become obsolete) as a means to achieve this. The default MemoryCache is insufficient to achieve this, as I don't believe it works across app domains.
Is there a way I can implement the kind of caching I am looking for or is there a caching dll of some sort (must be free) that I can use to achieve my goal?
Is there a way to use System.Runtime.Caching with IsolatedStorage scoped to MachineLevel?
I've looked at memcached too and can't use it because we need this to run on Windows machines. I started looking at SharedCache (http://www.codeproject.com/KB/web-cache/AdvanceCaching.aspx) and am curious about the pitfalls it has as well.
Thanks.
-- Revision 1 --
I think the optimal solution for me would combine the caching object with a memory-mapped file (http://msdn.microsoft.com/en-us/library/dd997372.aspx). So the question I have now is whether anyone has done that with the System.Runtime.Caching object. There must be a way to extend it if necessary... examples of how to do so would also be much appreciated.
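For reference, this is the kind of memory-mapped-file plumbing I have in mind; a named map is visible machine-wide (the map name, size, and layout here are arbitrary, and hooking this into System.Runtime.Caching is exactly the open question):

using System;
using System.IO.MemoryMappedFiles;
using System.Text;

static class SharedSlot
{
    // A named, pagefile-backed map is visible to every process on the machine,
    // but it only lives while at least one process holds a handle to it.
    private static readonly MemoryMappedFile Map =
        MemoryMappedFile.CreateOrOpen("MyAppSharedCache", 4096);

    public static void Write(string text)
    {
        byte[] bytes = Encoding.UTF8.GetBytes(text);
        using (MemoryMappedViewAccessor view = Map.CreateViewAccessor())
        {
            view.Write(0, bytes.Length);                 // length prefix
            view.WriteArray(4, bytes, 0, bytes.Length);  // payload
        }
    }

    public static string Read()
    {
        using (MemoryMappedViewAccessor view = Map.CreateViewAccessor())
        {
            int length = view.ReadInt32(0);
            byte[] bytes = new byte[length];
            view.ReadArray(4, bytes, 0, length);
            return Encoding.UTF8.GetString(bytes);
        }
    }
    // A real version would also need cross-process locking (e.g. a named Mutex).
}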
You're looking for AppFabric Cache. It's a Windows Server technology from Microsoft. It's free.
I should also say that if you like memcached, you can use that on Windows as well, and in fact Microsoft Azure team members used to recommend it, before the AppFabric caching was available on Windows Azure.
Have you evaluated Microsoft Velocity? Take a look - I believe if you are not okay with using the AppFabric Cache, this should work out for you:
http://msdn.microsoft.com/en-us/magazine/dd861287.aspx#id0450004
For simple client based caching, you can look at file based caching.
I recently received a project that contains multiple web applications with no MVC structure. For starters, I've created a library (DLL) that will contain the main business logic. The problem is with caching: if I use the current web context's cache object, then I might end up with duplicate caching (as the web context will be different for every application).
I'm currently thinking about implementing a simple caching mechanism with a singleton pattern that will allow the different web sites (aka different application domains) to share their "caching wisdom".
I'd like to know what is the best way to solve this problem.
EDIT: I use only one server (with multiple applications).
Depending on the type and size of the data you want to cache, I'd suggest:
For small amounts of primitive data : nCacheD (codeplex) - memcached redux for .net
For heavyweight objects : MS Patterns and Practices Caching Block (msdn)
In general though, I would look at my requirements and make sure an all-encompassing cache is really needed, and that writing code to maintain its state (and tune its resource consumption) would not be more expensive than going straight to the database.
If most of the stuff you want to cache is static pages, or a combination of static & dynamic content, I would look into utilizing IIS/ASP.NET's page level cache.
I have two different suggestions, depending on your plans to be scalable. Regardless of the back-end cache you choose, I would suggest that you first implement an adapter-pattern layer that abstracts you from your cache; this limits your dependency on the cache and gives you the ability to swap it out later (a sketch follows below).
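Here's a minimal sketch of what I mean by that adapter layer (the interface and class names are my own invention, not from any library):

using System;
using System.Web;

// The rest of the application codes against this, never against a concrete cache.
public interface ICache
{
    object Get(string key);
    void Put(string key, object value, TimeSpan ttl);
}

// One adapter, backed by the ASP.NET cache. A Velocity/AppFabric adapter
// could later implement the same interface and be swapped in.
public class AspNetCacheAdapter : ICache
{
    public object Get(string key)
    {
        return HttpRuntime.Cache[key];
    }

    public void Put(string key, object value, TimeSpan ttl)
    {
        HttpRuntime.Cache.Insert(key, value, null,
            DateTime.UtcNow.Add(ttl),
            System.Web.Caching.Cache.NoSlidingExpiration);
    }
}

Because callers only ever see ICache, moving from the in-process ASP.NET cache to a distributed cache later means writing one new adapter, not touching every call site.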
If you want to scale out by adding a web farm (more than one application server), then look at Velocity. Microsoft will be packaging this with .NET 4.0, but it is currently a CTP3 and is very easy to use.
Documentation
Download
If you don't plan to move to a multiple-server system, then just use HttpContext.Current.Cache.
Sounds to me like you should take a look at Build Better Data-Driven Apps With Distributed Caching. The article describes a new, distributed cache from Microsoft (codenamed Velocity).
I've also been offered the option to use SharedCache, which looks exactly like the architecture I'm looking for: Single Instance Caching.
I am working on a .NET app that will also run on the iPhone via MonoTouch and on OS X/Linux via Mono. The app will hold profiles for various users, and the profile used for a particular session will be selected on startup, kind of like Skype.
To store per-user settings, I am considering using the Application Settings system that's part of .NET. However, this system seems to rely on reflection, which is not available on the iPhone. I am also not sure if this system will function on platforms other than Windows.
I could also use the app's sqlite database that stores the application data to store settings, and simply roll my own settings classes that would be serialized/deserialized to the sqlite database like all the other application data.
Finally I could roll my own file-based solution.
What are the tradeoffs for these approaches? Why does .NET have dedicated support for user settings? It seems like a quite simple thing that coders should do on their own, and the existence of dedicated support within the .NET framework makes me suspect that I'm missing some point of complexity.
Thanks!
First thought: don't use configuration settings; use the sqlite database, as that is available on the iPhone and is the best approach to take. Remember that MonoTouch compiles your .NET code ahead of time to a native binary, and you may run into snags if you use Windows/Mono-specific code that may not be present on the iPhone.
Avoid P/Invokes like the plague if you want your code to work across all platforms.
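For instance, a minimal sketch of a sqlite-backed settings store (this assumes Mono.Data.Sqlite is available on your targets; the table layout and names are just one way to do it):

using System;
using Mono.Data.Sqlite;

public class SettingsStore
{
    private readonly string connectionString;

    public SettingsStore(string dbPath)
    {
        connectionString = "Data Source=" + dbPath;
        using (var conn = new SqliteConnection(connectionString))
        {
            conn.Open();
            using (var cmd = conn.CreateCommand())
            {
                // One row per (profile, key) pair, matching the per-user profiles above.
                cmd.CommandText = "CREATE TABLE IF NOT EXISTS Settings " +
                                  "(Profile TEXT, Key TEXT, Value TEXT, PRIMARY KEY (Profile, Key))";
                cmd.ExecuteNonQuery();
            }
        }
    }

    public void Put(string profile, string key, string value)
    {
        using (var conn = new SqliteConnection(connectionString))
        {
            conn.Open();
            using (var cmd = conn.CreateCommand())
            {
                cmd.CommandText =
                    "INSERT OR REPLACE INTO Settings (Profile, Key, Value) VALUES (@p, @k, @v)";
                cmd.Parameters.AddWithValue("@p", profile);
                cmd.Parameters.AddWithValue("@k", key);
                cmd.Parameters.AddWithValue("@v", value);
                cmd.ExecuteNonQuery();
            }
        }
    }

    public string Get(string profile, string key)
    {
        using (var conn = new SqliteConnection(connectionString))
        {
            conn.Open();
            using (var cmd = conn.CreateCommand())
            {
                cmd.CommandText = "SELECT Value FROM Settings WHERE Profile = @p AND Key = @k";
                cmd.Parameters.AddWithValue("@p", profile);
                cmd.Parameters.AddWithValue("@k", key);
                return cmd.ExecuteScalar() as string; // null when the key is absent
            }
        }
    }
}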
.Net has support for user settings because Microsoft designed them that way.
Hope this helps,
Best regards,
Tom.
Is it possible to set a cache in one application and use it in another application?
The short answer is yes. Regardless of the language you are using, you can use a product such as memcached (Linux/Unix), memcached Win32 (Windows), or Velocity (Microsoft); such products are used for caching farms.
A caching farm is similar to a web farm in that it is a highly available and easily scalable solution... for caching. In this case the cache is totally separate from the application itself. So as long as you have a naming structure for your keys (assigned to the objects in the cache), you could technically span the cached content across not only applications but different platforms, languages, technologies, etc. (a trivial key-naming sketch follows below).
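For example, a trivial key-naming helper (purely illustrative) that keeps applications from colliding in a shared cache:

public static class CacheKeys
{
    // Produces keys like "OrdersApp:Customer:42" -
    // application name first, then entity type, then id.
    public static string For(string application, string entityType, string id)
    {
        return application + ":" + entityType + ":" + id;
    }
}

Any application that builds its keys the same way can then read the entries another one wrote.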
See more information regarding this here: System.Web.Caching vs. Enterprise Library Caching Block
You should really be much more specific in your questions. I have to assume you're talking about ASP.NET and the Cache property, but it's only a guess, since you didn't give any clue about what you're looking for (except that you said C#).
No, the Cache property is per-application.
Implement your caching functionality in one application and make it available through .NET Remoting.
Then access it from your other application. Remember that all the objects you want to cache this way will have to be serializable (you probably have to serialize/deserialize them on your end, not the cache app's end). A minimal sketch of this setup is below.
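Here is that sketch (the port, URI, and class names are hypothetical; both applications need a shared assembly containing CacheService):

using System;
using System.Collections.Generic;
using System.Runtime.Remoting;
using System.Runtime.Remoting.Channels;
using System.Runtime.Remoting.Channels.Tcp;

// Lives in the "cache" application; MarshalByRefObject keeps the data server-side.
public class CacheService : MarshalByRefObject
{
    private readonly Dictionary<string, object> store = new Dictionary<string, object>();

    public void Put(string key, object value) { lock (store) store[key] = value; }

    public object Get(string key)
    {
        lock (store)
        {
            object value;
            return store.TryGetValue(key, out value) ? value : null;
        }
    }

    // Return null so the singleton's remoting lease never expires.
    public override object InitializeLifetimeService() { return null; }
}

class CacheHost
{
    static void Main()
    {
        // Expose one shared CacheService instance over TCP.
        ChannelServices.RegisterChannel(new TcpChannel(9000), false);
        RemotingConfiguration.RegisterWellKnownServiceType(
            typeof(CacheService), "Cache", WellKnownObjectMode.Singleton);
        Console.ReadLine(); // keep the host alive
    }
}

// In the consuming application:
// var cache = (CacheService)Activator.GetObject(
//     typeof(CacheService), "tcp://localhost:9000/Cache");
// cache.Put("greeting", "hello"); // strings are serializable, so this works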