MVC 3 partial caching - C#

I know this question has been asked before but I'm confused as to the best approach so please forgive me asking again...
I have an MVC3 application that will be an extranet, allowing users to log in, via Forms Authentication. The users will be accessing confidential information so, in order to prevent somebody from hitting Back after they log out (and I SignOut of FormsAuthentication), I have disabled all caching, forcing the redirection to the logon page.
Everything works well from a security point of view, but my problem is that I'd like to cache the non-secure elements of the page, such as images, backgrounds, logos, etc.
At the moment, each page renders with an ugly flicker, because all my artwork is being downloaded each time.
Of course, this also has a negative impact on bandwidth too.
How can I control the caching such that the artwork, css, scripts, etc. get cached whilst preventing the dreaded Back button after FormsAuthentication SignOut problem?
Thank you in advance,
Simon.

Assuming the images are not dynamically generated you can either do it internally via MVC or using IIS.
Internally you'd need to serve all your images through MVC and set the expires header yourself.
If you're using IIS it becomes much, much simpler: you just edit the Expires header in the IIS custom headers section to a date in the future (a date in the past expires it immediately). If you wish to ensure an image is not cached, add a query string to it:
<img src="image.png?dummy=8sn7ahh2" />
That image then won't be cached, so essentially you cache all images and blacklist (via a query string) the ones you don't want cached.
Here's a nice example of how to switch it on/off for IIS7.
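To keep the secure pages themselves out of the browser cache while leaving static artwork to IIS, one option is a global action filter. The sketch below is only an illustration (the attribute name is mine, not from the original post):

<pre><code>// Sketch of a filter that applies no-cache headers only to MVC pages,
// leaving static files (images, CSS, scripts) to be cached via IIS expires headers.
using System;
using System.Web;
using System.Web.Mvc;

public class NoCacheOnSecurePagesAttribute : ActionFilterAttribute
{
    public override void OnResultExecuting(ResultExecutingContext filterContext)
    {
        var cache = filterContext.HttpContext.Response.Cache;

        // Prevent the browser from serving the page from history after sign-out.
        cache.SetCacheability(HttpCacheability.NoCache);
        cache.SetNoStore();
        cache.SetExpires(DateTime.UtcNow.AddYears(-1));

        base.OnResultExecuting(filterContext);
    }
}

// Registered globally in Global.asax.cs (MVC 3):
// GlobalFilters.Filters.Add(new NoCacheOnSecurePagesAttribute());</code></pre>

Because the images, CSS and scripts are served directly by IIS rather than through MVC, they keep whatever Expires header you configure there and remain cached.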

Related

The URL we type in the address bar must not be accepted

Hello all, I need your help badly. We have made a role/ID-based application in ASP.NET (C#) where the menu appears according to the user's roles. This works fine, but now users are trying to type links directly into the address bar and use them. I can't restrict them in page loads and sessions because this is a production site which is already slow. So my intention is to show the URL in an encrypted format which expires after a certain time, so the user can't copy and paste it. Is there any possible way...
I can't restrict them in page loads and sessions because this is a production site which is already slow.
Fix that crappy code and/or add more servers, because this is the ONLY way it makes sense to do it. Anything else is the type of security that gets broken into, and then you run around blaming the world for being unfair.
So my intention is to show the URL in an encrypted format which expires after a certain time, so the user can't copy and paste it.
So the menu has an encrypted URL that is only valid for, say, half a second? What if the user views the page source? He can see all the links there.
This is not security, it is hogwash. Sorry to be blunt, but this is not going to work and you are doing a bad job here.
Checking this in page load will take less than a millisecond (assuming you cache roles in the session). WAY less.
Check the user's role in the Page_Load event and, if the user does not have permission, redirect him to a page showing "permission denied".
Please provide code if you need further help.
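As an illustration only (the role name, session key and page names below are placeholders, not taken from the question), the Page_Load check could look roughly like this:

<pre><code>using System;
using System.Collections.Generic;
using System.Web.UI;

public partial class SecurePage : Page
{
    protected void Page_Load(object sender, EventArgs e)
    {
        // Roles were cached in Session at login, so this check hits no database.
        var roles = Session["UserRoles"] as List&lt;string&gt;;

        if (roles == null || !roles.Contains("CanViewThisPage"))
        {
            Response.Redirect("~/PermissionDenied.aspx");
        }
    }
}</code></pre>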
Yeah.... you've painted yourself into a corner here.
Short answer:
No, there's no clean way to do this, and whatever kind of "special" URL implementation you create will be open to abuse/spoofing and still require you to add code anyway.
Long answer:
I can't see any viable solution other than injecting some code into Page_Load.
I take it you're not using WIF/claims-based security, just some bespoke user login backed by a database? So your best approach (at this point) is to make a simple class in App_Code: when the user logs in, load their permissions into something like a DataTable and store that in a session variable. That way you avoid database requests every time the page loads or posts back, which will probably speed your site up a bit too.
Build a non-static method in that class to be called from Page_Load: it gets the URL (or page ID) being accessed and checks it against the session-stored DataTable. If that check fails, redirect the user to an access-denied page.
Building the class foundation is key; don't attempt to shortcut it by copy-pasting chunks of code into each page. With the "security" class you can standardise your code and reduce testing down to a few simple checks.
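A rough sketch of such a helper; the table/column names and session key are assumptions, not taken from the answer:

<pre><code>using System;
using System.Data;
using System.Web;

public class PageSecurity
{
    private const string SessionKey = "UserPermissions"; // placeholder key

    // Call once at login: cache the user's permitted URLs so later checks avoid the database.
    public static void CachePermissions(DataTable permissions)
    {
        HttpContext.Current.Session[SessionKey] = permissions;
    }

    // Call from Page_Load (or a common base page): returns false if the URL isn't permitted.
    public bool IsAllowed(string url)
    {
        var permissions = HttpContext.Current.Session[SessionKey] as DataTable;
        if (permissions == null)
        {
            return false;
        }

        foreach (DataRow row in permissions.Rows)
        {
            if (string.Equals((string)row["Url"], url, StringComparison.OrdinalIgnoreCase))
            {
                return true;
            }
        }
        return false;
    }
}</code></pre>

In Page_Load you would then call something like `if (!new PageSecurity().IsAllowed(Request.AppRelativeCurrentExecutionFilePath)) Response.Redirect("~/AccessDenied.aspx");`.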

How to create a decorator proxy page in ASP.NET 3.5

I am trying to build this feature, and I'm really stuck.
I have two applications that run on the same domain, and I need to have one application load pages from the other inside its own (the first application's) master page.
I have full control of the code of both sides, of course.
I have tried using HttpRequest and HttpResponse, and I have tried using WebBrowser. Both work great as long as the pages are static (plain HTML). However,
those pages are actually dynamic: the user needs to press server-side buttons (postbacks), and the pages generally use session, viewstate and/or cookies.
Because of that, HttpRequest and WebBrowser fail me, as they do not cause a postback, and therefore those server-side controls do not work. What's more, if I try to "fake" a postback by saving the ViewState after each response and then resending it on the next request, after a few (3-4) times the original page returns a "The state information is invalid for this page and might be corrupted" error, even if I use
EnableViewStateMac="false" EnableSessionState="True" EnableEventValidation="false" ValidateRequest="false" ViewStateEncryptionMode="Never"
So... any ideas how can I solve this issue?
Thanks in advance
What is the main desire here?
Wrap one site's content in another without any architecture changes?
ANSWER: Iframe
Have a single submit button submit from two sites?
ANSWER: Not a good idea. You might be able to kludge this by creating a scraper and parser, but it would only be cool as an "I can do it" trophy. Better to rearchitect the solution. But assuming you really want to do this, you will have to parse the result from the embedded site and redirect the submit to the new site. That site will then take the values and submit the form to the first site and wait for the result, which it will scrape to give a response to the user. It is actually quite a bit more complex, as you have to parse the HTML DOM (easier if all of the HTML is XHTML compliant, of course) to figure out what to intercept.
Caveat: Any changes to the embedded site can blow up your code, so the persons who maintain the first site must be aware of this artificially created dependency so they don't change anything that might cause problems. Ouch, that sounds brittle! ;-)
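If you do attempt that scrape-and-forward kludge, the fetch side might look roughly like the sketch below; it is only an outline under the assumptions noted in the comments (the original answer names no specific API):

<pre><code>// Very rough sketch of the "scrape the embedded site" step discussed above,
// using plain HttpWebRequest (available in .NET 3.5). Real form/postback handling,
// cookie forwarding and DOM parsing are far more involved, as the answer warns.
using System.IO;
using System.Net;

public static class EmbeddedSiteScraper
{
    public static string FetchPage(string url, CookieContainer cookies)
    {
        var request = (HttpWebRequest)WebRequest.Create(url);
        request.CookieContainer = cookies; // keep the remote site's session alive between calls

        using (var response = (HttpWebResponse)request.GetResponse())
        using (var reader = new StreamReader(response.GetResponseStream()))
        {
            // The caller would then parse this HTML, rewrite the &lt;form&gt; action to point
            // at the wrapping site, and re-emit it inside the local master page.
            return reader.ReadToEnd();
        }
    }
}</code></pre>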
Other?
If using an iFrame does not work, then I would look at the business problem and draw up an ideal architecture to solve it, which might mean making the functionality of the embedded site available via a web service for the second site.

Search engine optimization for database content loaded using jQuery

I am currently optimizing my site for search engines. It is mainly a database driven site. I am using C# on the back end but database content is loaded via jQuery ajax and a web service. Therefore, my database content is not in html at the point that the bots will crawl it. My site is kind of like an online supermarket format in that there are thousands of items in my database, users can load a single one of these or more onto the web page at a time and the page does not change significantly once items are loaded.
My question is: how (if at all) can I get my database contents indexed? I was thinking of having an anchor that links to an .aspx page (e.g. called mydatabase) which loads all of my database items as a big HTML list. Then, using jQuery, I would make the anchor invisible to users. The data would still be accessible to users, but not via this link; it would be accessed through the jQuery interface I have created.
The thing is, I don't really want users to see this big, messy list. Would Google show this page (e.g. www.mysite.com/mydatabase.aspx) as a search result? And would Google see it as a "keyword rich" spam page? I have done quite a lot of research but found nothing on this, only instructions for PHP. Please help; I'm not sure what to do and need to know the best way to go about this.
It's a shame you haven't taken the progressive enhancement approach, as it would mean you would have started with a standard HTML output that's crawlable, and then layered the behaviour (AJAX) on top for the user experience.
Providing a single file (e.g. mydatabase.aspx) that lists all of your products provides no real value, for the reason you gave: it would just be a big useless list, with no editorial content or relevance for each link.
You're much better off taking another look at your information architecture and trying to ensure that each product is accessible by its own unique URL, then classifying the products into groups (result pages), being careful to think about pagination.
You can still make this act like a single-page application using AJAX, but you'd want to look into HTML5's History API to achieve this in a search engine friendly way.
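To make the unique-URL idea concrete: if the site is on (or can adopt) ASP.NET MVC routing, each product could get a crawlable URL along these lines. The route, controller and parameter names below are assumptions for illustration only.

<pre><code>using System.Web.Mvc;
using System.Web.Routing;

public static class ProductRoutes
{
    // Hypothetical route registration (call from Application_Start).
    public static void Register(RouteCollection routes)
    {
        // Gives every product a crawlable URL such as /products/1234/blue-widget,
        // while the existing AJAX interface keeps loading the same data for interactive users.
        routes.MapRoute(
            "ProductDetails",
            "products/{id}/{slug}",
            new { controller = "Products", action = "Details", slug = UrlParameter.Optional });
    }
}</code></pre>

The Details action would render server-side the same product markup that the AJAX interface currently builds on the client, so crawlers and non-JavaScript users get real HTML.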

How to implement page counter for an existing ASP.NET web application

I am looking for a good way to implement page counter statistics for an internal web application (so maybe I cannot use Google Analytics to help me).
I want to find out which pages in my web application users no longer visit, so I can investigate why there are no hits to those pages: whether they have a bug or are simply no longer needed.
The easy way that comes to mind is to add a few lines of code to every page to update the page view count, but there are so many pages in my application that this would take a lot of time.
So, is there any other way to build a simple page statistic with minimal lines of code?
For more information:
- Every user has to log in before using this application.
- There is a session that stores the user's ID.
- I use .NET 1.1 as the environment and plan to migrate to .NET 2.0+ in the future.
- Page stats are not shown on the web; I just want the hit counts so I can analyze them.
Google Analytics is probably your best bet. Although your site is internal, Google Analytics will still work so long as you are able to hit their server from within your network. I've used it on intranet sites before without any issues.
You can use .ashx files and invoke them via markup (e.g. as a jQuery call or an image tag), or you can add an HTTP module. Both can be implemented without recompiling the site: just add one more .dll to the bin folder and edit the web.config. There is not enough space in this text box to do full justice to the steps necessary to write an HTTP module or handler and a hit counter.
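As a very rough sketch of the HTTP-module route (and assuming the planned move to .NET 2.0+, since .NET 1.1 lacks some of the overloads used), it could look something like this; the HitLog helper is hypothetical and stands in for whatever logging you choose.

<pre><code>// One class dropped into the bin folder and registered in web.config under
// &lt;system.web&gt;&lt;httpModules&gt; (or &lt;system.webServer&gt;&lt;modules&gt; on IIS7)
// counts every .aspx hit without touching individual pages.
using System;
using System.Web;

public class PageHitCounterModule : IHttpModule
{
    public void Init(HttpApplication context)
    {
        context.EndRequest += OnEndRequest;
    }

    private void OnEndRequest(object sender, EventArgs e)
    {
        var app = (HttpApplication)sender;
        string path = app.Request.AppRelativeCurrentExecutionFilePath;

        // Only count page views, not images/scripts.
        if (path != null && path.EndsWith(".aspx", StringComparison.OrdinalIgnoreCase))
        {
            string user = app.Context.User != null ? app.Context.User.Identity.Name : "anonymous";
            HitLog.Record(path, user, DateTime.UtcNow); // hypothetical logging helper
        }
    }

    public void Dispose() { }
}</code></pre>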

Easy way to replicate web page across machines?

I am trying to replicate a browser page to another browser on another machine. I basically want to reproduce a page exactly as it appears to a customer, for viewing by the website owner. I have done this before using some impersonation trickery, but found that it would throw the session state out of whack when the site owner switched customers. So I would like to stay away from cookie and authentication manipulation.
Anybody done anything like that? Is there a way to easily transfer the DOM to a webservice?
The tech/programming at my disposal are C#, javascript, WCF.
Is sending an image an option? If so, you can use the IECapt program to take a screenshot of the page and send it to the other machine:
http://iecapt.sourceforge.net/
If session state is getting messed up when the site owner changes customer roles, your implementation might be the problem. I'd probably try fixing how your session management works before solving something which is really a symptom of a deeper problem, IMO.
Since you mentioned transferring the DOM to a webservice, I assume you need to inspect the page's source and not just its appearance. I recommend checking this link:
http://www.eggheadcafe.com/community/aspnet/7/10041011/view-source-of-a-web-page.aspx
It has a few suggestions for grabbing a page's source programmatically / screen-scraping.
Of course, a few more details might yield better answers. Specifically, does the customer submit their page to the owner (I imagine a scenario where a user of your site says "Hey, I'm having a problem! Take a look at this...") or is the owner looking at how the page renders when logged-in as a specific customer?
The easiest way is to post the innerHTML of the body tag to your webservice, which your other page can poll (or use Comet, or something similar) to get it back. You'll have to be careful to load the right CSS in your clone page. Also, you'll need to think about how often you want it to update.
This is a bit of a hack, though. A better solution would have been to design the page from the start with this in mind (I'm assuming it's too late for that now?), so that anything that mutated the page would also send a message back to the server describing what changed; or, if the page is not very interactive, to store the canonical state of the page on the server and query that from both browsers with XHRs or similar.
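A bare-bones outline of the polling half of that hack, using a generic .ashx handler rather than WCF; all names and storage choices here are illustrative assumptions:

<pre><code>// One generic handler stores the latest snapshot posted by the customer's browser
// and returns it when the owner's browser polls. A real version would key the
// store per customer/session and expire old entries.
using System.Web;

public class PageSnapshotHandler : IHttpHandler
{
    private static string latestHtml = string.Empty;
    private static readonly object sync = new object();

    public void ProcessRequest(HttpContext context)
    {
        if (context.Request.HttpMethod == "POST")
        {
            // The customer's page posts document.body.innerHTML in an "html" form field.
            lock (sync) { latestHtml = context.Request.Form["html"] ?? string.Empty; }
            context.Response.StatusCode = 204;
        }
        else
        {
            // The owner's viewer polls with GET and injects the returned markup into its own body.
            context.Response.ContentType = "text/html";
            lock (sync) { context.Response.Write(latestHtml); }
        }
    }

    public bool IsReusable { get { return true; } }
}</code></pre>

The customer's page would POST its body innerHTML to this handler on a timer, and the owner's viewer would GET it on its own timer and inject the markup into its clone page.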
