How to handle BL cache for multiple web applications? - c#

I recently received a project that contains multiple web applications with no MVC structure. For starters I've created a library (DLL) that will contain the main business logic. The problem is with caching - if I use the current web context cache object then I might end up with duplicate caching (as the web context will be different for every application).
I'm currently thinking about implementing a simple caching mechanism with a singleton pattern that will allow the different web sites (aka different application domains) to share their "caching wisdom".
I'd like to know what is the best way to solve this problem.
EDIT: I use only one server (with multiple applications).

Depending on the type and size of the data you want to cache, I'd suggest:
For small amounts of primitive data: nCacheD (CodePlex) - a memcached redux for .NET
For heavyweight objects: the MS Patterns and Practices Caching Application Block (MSDN)
In general, though, I would look at my requirements and make very sure an all-encompassing cache is really needed, and that writing code to maintain its state (and tune its resource consumption) would not be more expensive than going straight to the database.
If most of the stuff you want to cache is static pages, or a combination of static and dynamic content, I would look into utilizing IIS/ASP.NET's page-level cache.
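As a rough sketch of the page-level option (the 60-second duration and the page involved are purely illustrative):

```csharp
// Page output caching can be declared in the .aspx itself:
//   <%@ OutputCache Duration="60" VaryByParam="none" %>
// or set programmatically from a page's code-behind:
using System;
using System.Web;
using System.Web.UI;

public partial class ProductList : Page
{
    protected void Page_Load(object sender, EventArgs e)
    {
        // Cache the rendered output on the server for 60 seconds.
        Response.Cache.SetCacheability(HttpCacheability.Server);
        Response.Cache.SetExpires(DateTime.UtcNow.AddSeconds(60));
        Response.Cache.SetValidUntilExpires(true);
    }
}
```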

I have two different suggestions, depending on your plans to be scalable. Regardless of the back-end cache you choose, I would suggest that you first implement an adapter-pattern layer that abstracts you from your cache; this limits your dependency on the cache and gives you the ability to swap it out later.
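A minimal sketch of that adapter layer; the names (`ICacheProvider`, `HttpRuntimeCacheProvider`) are hypothetical. The point is that the business logic only ever sees the interface, so the backing store (HttpRuntime.Cache, Velocity, memcached, ...) can be swapped out later:

```csharp
using System;

// The abstraction the rest of your business logic depends on.
public interface ICacheProvider
{
    T Get<T>(string key) where T : class;
    void Set(string key, object value, TimeSpan slidingExpiration);
    void Remove(string key);
}

// One possible implementation backed by the ASP.NET cache. Note that
// HttpRuntime.Cache also works outside a request context, unlike
// HttpContext.Current.Cache.
public class HttpRuntimeCacheProvider : ICacheProvider
{
    public T Get<T>(string key) where T : class
    {
        return System.Web.HttpRuntime.Cache.Get(key) as T;
    }

    public void Set(string key, object value, TimeSpan slidingExpiration)
    {
        System.Web.HttpRuntime.Cache.Insert(
            key, value, null,
            System.Web.Caching.Cache.NoAbsoluteExpiration,
            slidingExpiration);
    }

    public void Remove(string key)
    {
        System.Web.HttpRuntime.Cache.Remove(key);
    }
}
```

Swapping in a distributed cache later then only means writing another `ICacheProvider` implementation.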
If you want to scale out by adding a web farm (more than one application server), then look at Velocity. Microsoft will be packaging this in .NET 4.0, but it is currently a CTP3 and is very easy to use.
Documentation
Download
If you don't plan to move to a multiple-server system, then just use HttpContext.Current.Cache.

Sounds to me like you should take a look at Build Better Data-Driven Apps With Distributed Caching. The article describes a new, distributed cache from Microsoft (codenamed Velocity).

I've also been pointed at SharedCache, which looks exactly like the architecture I'm looking for: single-instance caching.

Related

How to make the framework and the dependent applications loosely coupled?

I have a specific case and I want to know the best practice way to handle it.
I make a specific .NET framework (web application). This web application acts like a platform or framework for many other web applications, through the following methodology:
We create our dependent web applications (classes for the project business logic, RDLC reports) in separate solutions, then build them.
After that we add references to the resulting DLLs in the framework.
Then we create a set of user controls (one for each dependent web application) and put them in a folder in the framework itself.
It works fine, but after any modification to a specific user control or to any one of the dependent web applications, we have to add the references again and publish the whole framework!
What I want to do is make these different web applications and the framework loosely coupled, so that I publish the framework once and only once, and for any modification to the user controls or the different web applications I publish only the updated part rather than the whole framework.
How to refactor my code so I can do this?
The most important thing is: never publish the whole framework when a dependent application changes; publish only the updated part that belongs to that application.
If loose coupling is what you are after, develop your "framework(web application)" to function as a WCF web service. Your client applications will pass requests to your web services and receive standard responses in the form of predefined objects.
If you take this route, I recommend that you implement an additional step: Do not use the objects passed to your client applications directly in your client code. Instead, create versions of these web service objects local to each client application and upon receiving your web service response objects, map them to their local counterparts. I tend to implement this with a facade project in my client solution. The facade handles all calls to my various web services, and does the mapping between client and service objects automatically with each call. It is very convenient.
The reason for this is that the day that you decide to modify the objects that your web service serves, you only have to change the mapping algorithms in your client applications... the internal code of each client solution remains unchanged. Do not underestimate how much work this can save you!
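A bare-bones illustration of the facade idea; `IOrderService`, `OrderDto` and `Order` are hypothetical stand-ins for your own service contract and local model:

```csharp
using System;

// Hypothetical stand-ins for the service contract and the client's own model.
public interface IOrderService { OrderDto GetOrder(int id); }
public class OrderDto { public int Id; public decimal Total; public DateTime PlacedOn; }
public class Order    { public int Id; public decimal Total; public DateTime PlacedOn; }

// The facade: the rest of the client only ever sees the local Order type.
public class OrderServiceFacade
{
    private readonly IOrderService _service; // the WCF client proxy

    public OrderServiceFacade(IOrderService service) { _service = service; }

    public Order GetOrder(int id)
    {
        OrderDto dto = _service.GetOrder(id);
        // If the service contract changes tomorrow, only this mapping
        // changes -- client code built on Order is untouched.
        return new Order { Id = dto.Id, Total = dto.Total, PlacedOn = dto.PlacedOn };
    }
}
```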
Developing WCF web services is quite a large subject. If you are interested, a book that I recommend is Programming WCF Services. It offers a pretty good introduction to WCF development for those who come from a .NET background.
I totally agree with levib, but I also have some tips:
As an alternative to WCF (with its crazy configuration needs), I would recommend ServiceStack. Like WCF it lets you receive requests and return responses in the form of predefined objects, but with NO code generation and minimal configuration. It supports all kinds of response formats, such as JSON, XML, JSV and CSV. This makes it much easier to consume from, e.g., JavaScript and even mobile apps. It even has binaries for MonoTouch and Mono for Android! It is also highly testable and blazing fast!
A great tool for the mapping part of your code is AutoMapper, it lets you set up all your mappings in a single place and map from one object type to another by calling a simple method.
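A minimal sketch using AutoMapper's classic static API (current in the era of this answer); the `OrderDto`/`Order` types here are hypothetical:

```csharp
// One-time configuration, e.g. at application start-up: register the mapping.
Mapper.CreateMap<OrderDto, Order>();

// Anywhere in the facade: map a service object to its local counterpart
// with a single call instead of hand-written property copying.
OrderDto dto = service.GetOrder(42);
Order local = Mapper.Map<Order>(dto);
```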
Check them out! :)
Decades of experience says: avoid the framework and you won't have a problem to solve.
Frameworks evolve like cancer. The road to hell is paved with good intentions, and a good portion of those good intentions are embodied in a colossal tumour of a framework all in the name of potential re-use that never really happens.
Get some experience and knowledge when it comes to OO and design, and you'll find endless solutions to your technical problem, such as facades, and mementos, and what have you, but they are not solutions to your real problem.
Another thing: if you are using MS technology, don't bother with anything beyond what .NET offers. Stick with what the MS gods offer, because as soon as you digress and become committed to some in-house framework, your days are numbered.

Dispatch database data to several consumers in different formats

I have a big database which contains a lot of data from a big enterprise.
We would like to be able to dispatch this data to different external applications (external meaning that they are not developed by us, but are accessible on our local network).
Consumers can be of very different kinds: accounting, reporting, tech(business), website, ...
With a big variety of formats: CSV, webservice, RSS, Excel, ...
The execution of these exports can be of two different types: scheduled (like every hour), or on demand.
There are mostly two kinds of exports: almost-real-time data (meaning we want current data), or statistical data (meaning we take a period of time into account).
I have yet to find a good approach to allow this access.
I thought about BizTalk, but I don't know the product very well, and I'm not sure it can make scheduled calls and contain business logic. Does anyone have enough knowledge of BizTalk to tell me whether it fits my needs?
If Biztalk isn't a good way, is there any libraries which can ease the development of a custom service?
BizTalk can be made to do what you want to do, i.e. extract data from your database, transform it into various formats, and send it to various systems on a scheduled basis or on demand by exposing this as a web service/WCF service (not entirely out of the box; you might need to purchase additional adapters, pipelines, etc.).
But the question here is: how database-intensive is this task? If it involves large volumes of data, BizTalk is clearly not a favorite candidate, as BizTalk struggles with large data. It is good at routing (without transforming/inspecting), though, even for large data files.
SSIS, on the other hand is good for data intensive tasks. If your existing databases are on SQL Server, then it fits even better for your data intensive exports/imports and transformations. But it falls short when it comes to the variety of ways you need to connect to external systems (protocols).
So, you are looking at a combination of a good ETL tool, like SSIS, as well as something good at routing like Biztalk. Neither of them clearly fit your needs on their own, in terms of scalability, volumes, connectivity, data formats, etc.
Your question can result in quite a broad implementation. You could consider using a service bus (pub/sub) along with some form of CQRS (if applicable).
My FOSS Shuttle ESB project is here: http://shuttle.codeplex.com/
It has a generic scheduler built in. You could, of course, go with any other service bus such as MassTransit, or NServiceBus.
I think you could use ASP.NET Web API. http://www.asp.net/web-api
I find it the easiest way to export different kinds of info and file formats.
It won't generate scheduled reports or files; you will need a client app or a Windows service to call it. It is similar to web services, but it can return different formats and also files.
As for creating Excel files etc., you have to create them manually. That's a bit of a letdown, but I like this approach because it can be easily hosted on IIS, all the functions your clients are going to call can be in the same place, and they can even be called from JavaScript. As I see it, it is a bit more work for you, but it creates services that are really easy to consume.
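A hypothetical Web API controller illustrating the point: the same action is serialized as JSON or XML depending on the client's Accept header, while a second action hand-builds a CSV file response. `SalesRow`, `GetCurrentSales` and `BuildCsv` are illustrative stand-ins:

```csharp
using System.Collections.Generic;
using System.Net;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using System.Web.Http;

public class SalesRow { public int Id { get; set; } public decimal Amount { get; set; } }

public class ExportController : ApiController
{
    // GET api/export -- content negotiation picks JSON or XML automatically.
    public IEnumerable<SalesRow> Get()
    {
        return GetCurrentSales();
    }

    // GET api/export/csv -- returns a downloadable CSV file.
    [HttpGet]
    public HttpResponseMessage Csv()
    {
        string csv = BuildCsv(GetCurrentSales());
        var response = new HttpResponseMessage(HttpStatusCode.OK)
        {
            Content = new StringContent(csv, Encoding.UTF8, "text/csv")
        };
        response.Content.Headers.ContentDisposition =
            new ContentDispositionHeaderValue("attachment") { FileName = "sales.csv" };
        return response;
    }

    // Stubs standing in for your real data access and CSV flattening.
    private IEnumerable<SalesRow> GetCurrentSales() { return new SalesRow[0]; }
    private string BuildCsv(IEnumerable<SalesRow> rows) { return "Id,Amount\n"; }
}
```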
By dispatch, I'm assuming you're looking for a pub/sub model. Take a hard look at NServiceBus's (NSB) pub/sub capabilities, http://nservicebus.com/docs/Samples/PublishSubscribe.aspx. Underneath the covers NSB makes heavy use of MSMQ, which has become a lot more stable over time.
If you want to venture outside of your .NET comfort zone, check out Apache Camel or Fuse's Enterprise Service Bus. Either of these tools will support what you need as well. I've used Camel in some extremely high throughput areas without any major issues.

Design patterns for session (tracking) IDs in service-based systems?

We have a service-based system. The system is tiered or layered so that a single service call from an outside entity might hit one, two or three other services, depending on the type of call and system state.
What we want to be able to do is to track the progress of a given call across these different services.
Ideally, as the external call comes in, a tracking number is generated and this follows all subsequent calls throughout our system.
Are there any specific design patterns or WCF features (implementations of the pattern) that we can use to track this progress?
This page gives an example of using session IDs, but it's not clear what the right thing to do is once there are several services involved.
This page may also have some relevance.
We are specifically interested in C# / WCF implementation, but references to any resources that are relevant are interesting (Java / PHP / whatever).
I would use aspect-oriented programming to handle this. I believe the right piece is a service behavior.
This will let you create an attribute that you stick on your service methods (or on the service itself) with BeforeAction/AfterAction/OnError hooks.
For the entry-point service I would have the action create your own session ID, stuff it into the WCF context, and then use the Before, After, and Error methods to post data to your data store, or however you plan to track progress.
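A sketch of that idea as a WCF message inspector (which a custom service behavior or attribute would install); the header name/namespace and the `Log` call are hypothetical:

```csharp
using System;
using System.ServiceModel;
using System.ServiceModel.Channels;
using System.ServiceModel.Dispatcher;

// Reads a tracking id from an incoming header, or mints one at the entry
// point, and logs before/after every operation.
public class TrackingInspector : IDispatchMessageInspector
{
    const string HeaderName = "TrackingId";
    const string HeaderNs = "urn:example:tracking"; // hypothetical namespace

    public object AfterReceiveRequest(ref Message request,
        IClientChannel channel, InstanceContext instanceContext)
    {
        int i = request.Headers.FindHeader(HeaderName, HeaderNs);
        string id = i >= 0
            ? request.Headers.GetHeader<string>(i)  // id from an upstream tier
            : Guid.NewGuid().ToString();            // entry point: new id
        Log("Before", id);
        return id; // flows to BeforeSendReply as correlationState
    }

    public void BeforeSendReply(ref Message reply, object correlationState)
    {
        Log("After", (string)correlationState);
    }

    static void Log(string stage, string id)
    {
        // Post (stage, id, timestamp) to your progress data store here.
    }
}
```

Downstream services copy the same header onto their outgoing calls, so one id follows the request across all tiers.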
Chris's response is a good one, but we found this information from Microsoft very, very useful -- and it's built into IIS/WCF support.
Enabling the logging described there really helped.

WCF Caching Strategy - Including Dependencies

I would like to ask people's opinions on methods and strategies for introducing caching to WCF. What I am most interested in, in particular, is injected SQL cache dependencies. Once a web application is copied to multiple application servers, I want to synchronize the cache efficiently.
I am currently looking at the WCF REST Starter Kit, which introduces a nice WebCache attribute for OperationContracts to which you can add SqlCacheDependencies.
I would be grateful for others' takes on, or experience in, tackling this problem.
Kind Regards,
Andrew
Not sure which problem you're addressing, but you mention cache consistency across multiple servers. Having run a SqlCacheDependency configuration previously, here's my take on it.
The SqlCacheDependency incorporated with the WebCache attribute is the same cache dependency implementation that's been available since .Net 2.0. Overall, I find the aggregate configuration/operation/monitoring for SqlCacheDependency onerous. The design is acceptable for a single cache, but multiple systems -- no thanks.
I like separation of concerns. When working with multiple servers, I find a distributed cache tier much easier to manage than the SqlCacheDependency operation. Plenty of open-source and commercial distributed cache providers available. I find Memcache to be the most effective and operationally sound.

Is it possible to set cache in one application and use it in another application?

Is it possible to set cache in one application and use it in another application ?
The short answer is yes. Regardless of the language you are using, you can use a product such as memcached (Linux/Unix), memcached Win32 (Windows), or Velocity (Microsoft); such products are used for caching farms.
A caching farm is similar to a web farm in that it is a high-availability and easily scalable solution... for caching. In this case the cache is totally separate from the application itself. So as long as you have a naming structure for your keys (assigned to the objects in the cache), you could technically span the cached content not only across applications but across different platforms, languages, technologies, etc.
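A sketch of one possible key-naming convention (the format itself is made up): prefix every key with an application and region name, so unrelated apps sharing one cache farm never collide, while apps that deliberately use the same prefix share entries:

```csharp
// Builds a namespaced cache key such as "billing:customers:42".
static string CacheKey(string app, string region, string id)
{
    return string.Format("{0}:{1}:{2}", app, region, id);
}

// CacheKey("billing", "customers", "42") -> "billing:customers:42"
// CacheKey("web", "customers", "42")     -> "web:customers:42"  (distinct entry)
// Two apps that both ask for CacheKey("shared", "customers", "42")
// read and write the same cached object.
```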
See more information regarding this here: System.Web.Caching vs. Enterprise Library Caching Block
You should really be much more specific in your questions. I have to assume you're talking about ASP.NET and the Cache property, but it's only a guess, since you didn't give any clue about what you're looking for (except that you said C#).
No, the Cache property is per-application.
Implement your caching functionality in one application and make it available through .NET Remoting.
Then access it from your other application. Remember that all the objects you want to cache this way will have to be serializable. (You will probably have to serialize/deserialize them on your end, not on the cache application's end.)
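A hypothetical sketch of that remoted cache: a MarshalByRefObject that one process hosts as a Remoting singleton and the other applications call. Values are stored pre-serialized, matching the advice above to serialize on the caller's end:

```csharp
using System;
using System.Collections.Generic;

public class RemoteCache : MarshalByRefObject
{
    private readonly Dictionary<string, byte[]> _store =
        new Dictionary<string, byte[]>();
    private readonly object _sync = new object();

    // Callers serialize the object themselves and hand over the bytes.
    public void Set(string key, byte[] serializedValue)
    {
        lock (_sync) { _store[key] = serializedValue; }
    }

    // Returns null on a cache miss; callers deserialize on their side.
    public byte[] Get(string key)
    {
        lock (_sync)
        {
            byte[] value;
            return _store.TryGetValue(key, out value) ? value : null;
        }
    }

    public override object InitializeLifetimeService()
    {
        return null; // keep the remoted singleton alive indefinitely
    }
}

// Host-side registration (illustrative; port and URI are made up):
// var channel = new TcpChannel(9000);
// ChannelServices.RegisterChannel(channel, false);
// RemotingConfiguration.RegisterWellKnownServiceType(
//     typeof(RemoteCache), "cache", WellKnownObjectMode.Singleton);
```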
