I want to refactor some existing code.
A controller with two actions.
Each action calls a repository which is layered on top of a strongly typed Enterprise Library cache manager.
The repository tries to get data from the cache. If that misses, it gets the data from the DB and saves it in the cache.
1) Would you combine the two caches into one (not) strongly typed cache,
with two consumers that will handle the casting to strong types?
2) What are the advantages and disadvantages of each framework?
The HTTP (System.Web) cache vs. the Enterprise Library cache
Any other suggestion?
The Enterprise Library cache will require a bit more work; it's meant to be more all-encompassing. The ASP.NET cache is specific to ASP.NET and has a lot of the functionality you need already built in (though, again, it is specific to ASP.NET). One specific example is that the ASP.NET cache will differentiate between sessions as-is; with the Enterprise Library cache you would have to implement the logic to differentiate between session caches yourself.
Now to your specific problem. The ASP.NET cache is not really designed for web farm scenarios out of the box. Are you on a web farm? I ask because it sounds from your question as though the providers they created may be accessing a caching server of some sort.
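To illustrate the session point: a minimal sketch of per-session key scoping over the Enterprise Library cache (assuming Enterprise Library 4+, where CacheFactory.GetCacheManager() returns an ICacheManager; the class and key scheme are illustrative, not from the original post):

using System.Web;
using Microsoft.Practices.EnterpriseLibrary.Caching;

public class SessionScopedCache
{
    private readonly ICacheManager _cache = CacheFactory.GetCacheManager();

    // Prefix every key with the current session ID so entries from
    // different sessions never collide. ASP.NET's own cache/session
    // stack gives you this separation for free; with Enterprise
    // Library you implement it yourself, as described above.
    private static string ScopedKey(string key)
    {
        return HttpContext.Current.Session.SessionID + ":" + key;
    }

    public void Add(string key, object value)
    {
        _cache.Add(ScopedKey(key), value);
    }

    public object GetData(string key)
    {
        return _cache.GetData(ScopedKey(key));
    }
}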
My question is: how do I implement caching in my domain project, which works like a normal stack with the repository pattern.
I have a setup that looks like the following:
ASP.NET MVC website
Web API
Domain project (using IoC, with Windsor)
My domain project, for instance, has:
IOrderRepository.cs
OrderRepository.cs
Order.cs
My ASP.NET MVC website calls the Web API and gets back some DTO classes. My Web API then maps these objects to business objects in my domain project, and makes the application work.
Nowhere in my application have I implemented caching.
Where should caching be implemented?
I thought about doing it inside the methods in the OrderRepository, so my Get, GetBySpecification and Update methods have to call some generic cache handler injected into the OrderRepository.
This obviously gives some very ugly code, and isn't very generic.
How to maintain the cache?
Let's say we have a cache key like "OrderRepository_123". When I call the Update method, should I call cacheHandler.Delete("OrderRepository_123")? That seems very ugly as well.
My own thoughts...
I can't really see a decent way to do it besides some of the messy methods I have described. Maybe I could make some cache layer, but I guess that would mean my WebAPI wouldn't call my OrderRepository anymore, but my CacheOrderRepository-something?
Personally, I am not a fan of including caching directly in repository classes. A class should have a single reason to change, and adding caching often adds a second reason. Given your starting point you have at least two likely reasonable options:
Create a new class that adds caching to the repository and exposes the same interface
Create a new service interface that uses one or more repositories and adds caching
In my experience #2 is often more valuable, since the objects you'd like to cache as a single unit may cross repositories. Of course, this depends on how you have scoped your repositories. A lot may depend on whether your repositories are based on aggregate roots (à la DDD), tables, or something else.
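For option #1, a minimal sketch of such a caching decorator, assuming MemoryCache from System.Runtime.Caching and a simplified version of the poster's IOrderRepository (the method shapes and key scheme are illustrative). Windsor can register it in place of the raw repository, so the Web API keeps depending on IOrderRepository and never knows caching exists:

using System;
using System.Runtime.Caching;

// Stand-in for the poster's Order.cs; only Id is assumed here.
public class Order
{
    public int Id { get; set; }
}

// Simplified stand-in for the poster's IOrderRepository.cs.
public interface IOrderRepository
{
    Order Get(int id);
    void Update(Order order);
}

// Decorator that adds caching while exposing the same interface.
public class CachedOrderRepository : IOrderRepository
{
    private readonly IOrderRepository _inner;
    private readonly MemoryCache _cache = MemoryCache.Default;

    public CachedOrderRepository(IOrderRepository inner)
    {
        _inner = inner;
    }

    public Order Get(int id)
    {
        string key = "Order_" + id;
        var cached = (Order)_cache.Get(key);
        if (cached != null)
            return cached;

        var order = _inner.Get(id);
        _cache.Set(key, order, new CacheItemPolicy
        {
            AbsoluteExpiration = DateTimeOffset.Now.AddMinutes(10)
        });
        return order;
    }

    public void Update(Order order)
    {
        _inner.Update(order);
        // Invalidate rather than hand-maintain the cached copy.
        _cache.Remove("Order_" + order.Id);
    }
}

Note how Update simply evicts the cached entry, which also addresses the "OrderRepository_123" maintenance question above: invalidate on write rather than trying to keep the cached copy in sync by hand.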
There are probably a million different ways to do this, but given that the intent of caching is to improve performance, it seems to me you could implement the cache along the lines of the repository pattern: the domain objects interact with the cache instead of the database, a background thread keeps the database and cache in sync, and the initial startup of the app pool fills the cache (assuming eager loading is desired). A whole raft of technical issues then starts to crop up, such as what to do if the cache is modified in a way that violates a database constraint. Code maintenance becomes a concern, because any data-structure-related change may need to be implemented in multiple places, and concurrency issues enter the fray. Just some thoughts...
Consider SqlCacheDependency with System.Web.Caching.Cache: http://weblogs.asp.net/andrewrea/archive/2008/07/13/sqlcachedependency-i-think-it-is-absolutely-brilliant.aspx . This gets you caching that is invalidated when other systems apply updates as well.
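A minimal sketch of the mechanism (assuming a <sqlCacheDependency> database entry named "MyDb" in web.config and an Orders table enabled for notifications via aspnet_regsql; the names are illustrative):

using System;
using System.Web;
using System.Web.Caching;

public static class OrderCache
{
    public static void CacheOrders(object orders)
    {
        // "MyDb" must match a database entry under <sqlCacheDependency>
        // in web.config, and the Orders table must be enabled for
        // notifications (e.g. aspnet_regsql -et -t Orders ...).
        var dependency = new SqlCacheDependency("MyDb", "Orders");

        // The entry is evicted automatically when the Orders table
        // changes, even if another system applied the update.
        HttpRuntime.Cache.Insert(
            "AllOrders",
            orders,
            dependency,
            DateTime.Now.AddHours(1),
            Cache.NoSlidingExpiration);
    }
}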
There are multiple levels of caching depending on the situation. However, if you are looking for generic, centralized caching with a low number of changes, I think you will be looking at EF second-level caching; for more details check the following: http://msdn.microsoft.com/en-us/magazine/hh394143.aspx
You can also cache at the Web API level.
If MVC and Web API are hosted in two different data centers, kindly consider the network traffic between them.
And for a portal with huge read access you might consider Redis: http://Redis.io
It sounds like you want to use a .NET caching mechanism rather than a distributed cache like Redis or Memcache. I would recommend using the System.Runtime.Caching.MemoryCache class instead of the traditional System.Web.Caching.Cache class. Doing this allows you to create your caching layer independent of your MVC/API layer because the MemoryCache has no dependencies on System.Web.
Caching your DTO objects would speed up your application greatly, because it saves you from having to assemble data from a cache that merely mirrors your data layer. For example, requesting Order123 would require a single cache read rather than several additional reads for any FK data. Your caching layer would of course need to contain the logic to invalidate the cache on the UPDATEs you perform. A recommended approach is to retrieve the cached order object, modify its properties directly, and then persist to the DB asynchronously.
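A minimal sketch of that update path (the DTO, store interface, and key scheme are illustrative; fire-and-forget persistence like this needs real error handling in practice):

using System.Runtime.Caching;
using System.Threading.Tasks;

// Illustrative DTO; Status stands in for whatever fields you cache.
public class OrderDto
{
    public int Id { get; set; }
    public string Status { get; set; }
}

// Illustrative persistence abstraction.
public interface IOrderStore
{
    void Update(OrderDto order);
}

public class OrderCacheService
{
    private readonly MemoryCache _cache = MemoryCache.Default;
    private readonly IOrderStore _store;

    public OrderCacheService(IOrderStore store)
    {
        _store = store;
    }

    public void UpdateStatus(int orderId, string newStatus)
    {
        var order = (OrderDto)_cache.Get("Order_" + orderId);
        if (order == null)
            return; // cache miss: fall back to the normal load/update path

        // Mutate the cached copy so readers see the change immediately...
        order.Status = newStatus;

        // ...then persist asynchronously. In real code this fire-and-forget
        // call needs error handling and retries, or a proper write-behind queue.
        Task.Run(() => _store.Update(order));
    }
}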
I have a Windows Forms application (C#) and an ASP.NET web application which both access a SQL Server database. I want to centralize the database access. Which methodologies should I follow? What is the common approach to this issue?
Writing a DAL and model libraries and using them in both applications?
Writing a WCF service that includes the DAL model and using this service from both applications?
None of the above?
Can you give me any idea?
Thank you.
I would go with the WCF approach. Keep in mind that when (not if, when) you have to make changes that pertain to one app, but not the other (yet), you will have to account for that in the common layer, so using interfaces may make your life a little easier.
The cleanest way is to wrap the DB with a WCF service.
If you don't write large amounts of data in one go you can use a WCF Data Service; this directly wraps an Entity Framework model and you can configure access to tables and methods in various ways.
What you want is to have one place where the DB is accessed, so that if there is an issue, you can fix it in one location, for instance.
Furthermore, if you want to log all calls to a particular table, for instance, the only way to make sure that will be done is by centralizing all calls to the DB this way and not allow anybody direct access to the DB.
Wrap the service, then keep the connection string secret.
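A minimal sketch of such a wrapper (the contract, DTO, and DAL calls are all illustrative; the point is that both clients go through one service, and the connection string never leaves it):

using System.Runtime.Serialization;
using System.ServiceModel;

// Illustrative DTO shared by both clients.
[DataContract]
public class OrderDto
{
    [DataMember] public int Id { get; set; }
    [DataMember] public string Status { get; set; }
}

// The single contract through which both the Windows Forms app
// and the ASP.NET app reach the database.
[ServiceContract]
public interface IOrderService
{
    [OperationContract]
    OrderDto GetOrder(int orderId);

    [OperationContract]
    void UpdateOrder(OrderDto order);
}

public class OrderService : IOrderService
{
    public OrderDto GetOrder(int orderId)
    {
        // Central point for logging, access control, and caching.
        // The connection string lives only in this service's config.
        return OrderDal.Load(orderId);
    }

    public void UpdateOrder(OrderDto order)
    {
        OrderDal.Save(order);
    }
}

// Stub standing in for the real DAL; in practice ADO.NET/EF lives here.
internal static class OrderDal
{
    public static OrderDto Load(int orderId)
    {
        /* ... query the database ... */
        return new OrderDto { Id = orderId };
    }

    public static void Save(OrderDto order)
    {
        /* ... update the database ... */
    }
}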
I think using the SOA approach is really better (WCF or web services with a DAL layer), because this way you don't need to ship your DAL DLL with the Windows Forms exe, and all changes to your data model automatically reach both UI clients.
Remember that this can cause its own problems:
Concern with security, so that your services cannot be accessed directly by URL, which would allow someone to run your methods.
Concern about maintenance, because changes in the data layer that need to affect only one interface will be more difficult to control and need to be planned better (with the creation of new methods specific to a certain interface).
Decrease in performance, because HTTP access is always more costly than direct communication with a DLL.
Risk of losing communication with the server, something that is expected and handled in ASP.NET but requires additional care in the Windows Forms client to behave properly in these cases.
Option 1 seems simpler and I would do the same.
Option 2 with WCF will add additional code to your product, and hence maintenance. It also means an additional layer.
Corporate programmers like the second option (WCF service including DAL).
Struggling with this one today.
Rewriting a web-based application; I would like to do this in such a way that:
All transactions go through a web services API (something like http://api.myapplication.com) so that customers can work with their data the same way that we do: everything they can do through our provided web interface, they can also do programmatically
A class library serves as a data layer (SQL + Entity Framework), for a couple of design reasons not related to this question
Problem is, if I choose not to expose the Entity Framework objects through the web service, it's a lot of work to re-create "API" versions of the Entity Framework objects and then write all the "proxy" code to copy properties back and forth.
What's the best practice here? Suck it up and create an API model class for each object, or just use the Entity Framework versions?
Any shortcuts here from those of you who have been down this road and dealt with versioning / backwards compatibility, other headaches?
Edit: After feedback, what makes more sense may be:
Data/Service Layer - DLL used by public web interface directly as well as the Web Services API
Web Services API - almost an exact replica of the Service Layer methods / objects, with API-specific objects and proxy code
I would NOT have the website post data through the web services interface for the API. That way lies potential performance issues for your main website. Never mind that as soon as you deploy a breaking API change you have to redeploy the main website at the same time. There are reasons why you wouldn't want to be forced to do this.
Instead, your website AND web services should both communicate directly to the underlying business/data layer(s).
Next, don't expose the EF objects themselves. The web service interface should be cleaner than that. In other words, it should try to simplify the act of working with your backend as much as possible. Will this require a fair amount of effort on your part? Yes. However, it will pay dividends when you have to change the model slightly without impacting currently connected clients.
It depends on project complexity and how long you expect it to live. For small, short-lived projects you can share domain objects across all layers. But if it's a big project, and you expect it to exist, work well, and be updated for the next 5 years...
In my current project (which is big), I started with shared entities across all layers; then I discovered that I needed separate entities for presentation, and now (six months later) I'm using separate classes for each layer (persistence, service, domain, presentation). That's not because I'm paranoid or following some rule; I just couldn't make everything work with a single set of classes across layers... Draw your own conclusions.
P.S. There are tools that can help you convert your objects, like AutoMapper and ValueInjecter.
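A minimal AutoMapper sketch (the entity and DTO are illustrative; this uses the MapperConfiguration API of newer AutoMapper versions, while older versions configure via the static Mapper class instead):

using AutoMapper;

// Illustrative EF entity.
public class Order
{
    public int Id { get; set; }
    public string CustomerName { get; set; }
}

// Illustrative API-facing DTO.
public class OrderDto
{
    public int Id { get; set; }
    public string CustomerName { get; set; }
}

public static class OrderMapping
{
    // Configure once at startup; properties with matching names are
    // mapped by convention, so no hand-written proxy code is needed.
    private static readonly IMapper Mapper =
        new MapperConfiguration(cfg => cfg.CreateMap<Order, OrderDto>())
            .CreateMapper();

    public static OrderDto ToDto(Order entity)
    {
        return Mapper.Map<OrderDto>(entity);
    }
}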
I would just buck up and create an API specifically aimed at the needs of the application. It doesn't make much sense to do what amounts to exposing the whole DB layer. Just expose what needs to be exposed in order to make the app work, and nothing else.
I'm writing a tool in C#.Net that will be used to generate Catalogs of content which users can browse. Initially I am creating a WinForms based interface, but in the future I'd like to be able to create a web based interface as well. So I've been careful to generalize the interface to a Catalog so that it does not depend on a specific UI.
My only experience with web development was creating my own HTML website back in the early '90s, and I've done a little classic ASP (not ASP.NET). Now with ASP.NET it seems that I should be able to leverage my existing C#.NET object model to create a web-based interface. But I really haven't done anything with ASP.NET beyond a simple hello-world example.
Are there any special considerations I should make in designing my object model so that later I can create a web interface to it?
Here are a few things to follow:
Package your object model in a separate project (which you need to do anyway to share it among different projects) and make sure you do not add platform-specific references to it (for example, don't add System.Web, WinForms, WPF, etc.); this automatically avoids any unwanted dependencies.
Try to keep your classes as lean as possible. Avoid classes that track change state and the like; in a web scenario, tracking state over multiple requests is expensive, so it's best to have your objects carry data only.
Consider the possibility that your objects may need to be serialized and/or passed over a wire: for example, a middleware service serving both Windows and web clients, or a web page storing the object in view state.
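A minimal sketch of such a lean, serialization-friendly class (the class and its members are illustrative; [DataContract]/[DataMember] cover the over-the-wire case, and [Serializable] covers view-state storage):

using System;
using System.Runtime.Serialization;

// Data-only class: no change tracking, no UI references, so it can be
// shared by WinForms and ASP.NET front ends and serialized freely.
[Serializable]
[DataContract]
public class CatalogItem
{
    [DataMember] public int Id { get; set; }
    [DataMember] public string Title { get; set; }
    [DataMember] public DateTime AddedOn { get; set; }
}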
There really shouldn't be that big a difference.
Be careful about placing too much “intelligence” in your entity classes. That’s a pattern I’ve seen often in Windows apps. Don't make references to controls that are specific to Windows Forms development in the parts of your project that you want to reuse for the web application.
Repository patterns work well with both Windows and Web applications, because you often want to optimize the web apps differently for performance with multiple users.
Your requirement can be handled with a multi-tier architecture:
http://en.wikipedia.org/wiki/Multitier_architecture
I have an ASP.net application that uses some common business object that we have created. These business objects are also used in a few other windows services or console applications.
The problem I am running into is that if I have a class "foo" and a class "bar" and each has a function loadClient(), if I call foo.loadClient() and bar.loadClient(), each request will hit the database. I figure implementing some sort of cache would reduce unnecessary round trips to the DB.
Here's the catch. I want the cache to be specific to each HTTP request that comes in on the ASP.net App. That is, a new request gets a brand new cache. The cache can exist for the lifetime of the other console applications since 90% of them are utilities.
I know I can use System.Web.Cache but I don't want my middleware tied to the System.Web libraries.
Hope that explains it. Can anyone point me in the right direction?
Thanks!
Are you reusing objects during the lifetime of a request? If not, then the model you have suggested means each postback will also create a new set of objects, in effect obviating the need for a cache. Typically a cache has value when objects are shared across requests.
As far as using a non web specific caching solution I've found the Microsoft Caching Application Block very robust and easy to use.
I think you can take a look at the Velocity project.
http://msdn.microsoft.com/en-us/data/cc655792.aspx - there is a brief article there.
If you are looking for interprocess caching, that's difficult.
But if you don't want your middleware tied to System.Web, you can write one interface library that serves as a bridge between your middleware and System.Web.
In the future, if you want to tie it to another cache manager, you can just rewrite your bridge interface library, keeping your middleware completely independent of the actual cache manager.
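A minimal sketch of such a bridge (names are illustrative): the middleware depends only on the interface, the web app plugs in an HttpContext.Items-backed implementation that is naturally scoped to a single request, and the console applications plug in a process-lifetime one.

using System.Collections.Generic;
using System.Web;

// The middleware references only this interface, never System.Web.
public interface ICacheProvider
{
    object Get(string key);
    void Set(string key, object value);
}

// Lives in the bridge library referenced only by the web app.
// HttpContext.Items is per-request, so every request starts with
// a brand new cache, exactly as the question asks.
public class PerRequestCacheProvider : ICacheProvider
{
    public object Get(string key)
    {
        return HttpContext.Current.Items[key];
    }

    public void Set(string key, object value)
    {
        HttpContext.Current.Items[key] = value;
    }
}

// Used by the console applications and services: lives for the
// lifetime of the process, which suits the utility scenarios.
public class ProcessCacheProvider : ICacheProvider
{
    private readonly Dictionary<string, object> _items =
        new Dictionary<string, object>();

    public object Get(string key)
    {
        object value;
        return _items.TryGetValue(key, out value) ? value : null;
    }

    public void Set(string key, object value)
    {
        _items[key] = value;
    }
}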
The System.Runtime.Caching.MemoryCache is recommended by Microsoft in lieu of System.Web.Caching. It can be used in the context of the MS Caching Application Block suggested by Abhijeet Patel.
See:
Is System.Web.Caching or System.Runtime.Caching preferable for a .NET 4 web application
http://msdn.microsoft.com/en-us/library/dd997357.aspx