Rewriting URL based on who is logged in - c#

I'm looking for the best way to change the URL of pages based on who is logged on. The limitation is that all the pages are pre-generated, so the actual HTML already exists and cannot be regenerated on a per-user basis.
Possible solutions
One possible solution might be to use JavaScript to append something like ?user=MyUserName to the end of every URL, but I'm unsure whether this will work with all spiders (by "all" I mean the major search engines). This solution feels a bit dirty to me...
There might also be some way, when the request comes in, to say that the response comes from Default.aspx?user=Username by doing a Response.Redirect?
It's also important to remember that I will be changing the cache settings based on this, e.g. saying that if you're not logged in, the page can be cached.

I'm not sure if you must use .html files or another specific extension, but you could always create your own handler and process what you want to do on every request that way. Your handler would determine who is accessing the page and then do a Response.Redirect (or whatever action is necessary).
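As a rough illustration, here is a minimal sketch of such a handler, assuming Forms Authentication and one pre-generated HTML file per page; the ?user query parameter and the file layout are my assumptions, not from the question:

using System.Web;

// Sketch: serve pre-generated HTML to anonymous users (cacheable),
// and redirect logged-in users to a per-user URL (not cacheable).
public class PreGeneratedPageHandler : IHttpHandler
{
    public bool IsReusable { get { return true; } }

    public void ProcessRequest(HttpContext context)
    {
        if (context.Request.IsAuthenticated &&
            context.Request.QueryString["user"] == null)
        {
            // Tag the URL with the user name and forbid shared caching.
            context.Response.Cache.SetCacheability(HttpCacheability.NoCache);
            context.Response.Redirect(context.Request.Path + "?user=" +
                HttpUtility.UrlEncode(context.User.Identity.Name));
            return;
        }

        // Anonymous (or already-tagged) requests get the static file,
        // which downstream caches may keep for anonymous visitors.
        if (!context.Request.IsAuthenticated)
            context.Response.Cache.SetCacheability(HttpCacheability.Public);

        context.Response.ContentType = "text/html";
        context.Response.WriteFile(context.Request.PhysicalPath);
    }
}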

Related

Get website's name from where the link was redirected to my application

I want to make an application that records not only the hits on my website but also the names of the websites from which clients are redirected to mine via a link. Since I know every external link I am going to place, I thought I could create an action method that takes a parameter on a GET request, so the link href could be like mywebapp.com/index/CameFromLink1, with the parameter differing for each link I place. From there I just increment a row in the database, and I hide the parameter from the URL when the view is returned. But maybe there is a better way to achieve this?
You can use the HTTP referrer header, which lets you learn where the user came from. Also consider that it can be modified by the client.
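A minimal ASP.NET MVC sketch of the asker's token approach; the controller/action names, the default {controller}/{action}/{id} route, and the IncrementVisit helper are hypothetical. (The referrer-header alternative maps to Request.UrlReferrer, if you prefer that route.)

using System.Web.Mvc;

public class HomeController : Controller
{
    // Handles e.g. mywebapp.com/home/index/CameFromLink1;
    // "id" carries the per-link token.
    public ActionResult Index(string id)
    {
        if (id != null)
        {
            IncrementVisit(id); // hypothetical helper: bump the row for this token
            // Redirect to the clean URL so the token is hidden from the address bar.
            return RedirectToAction("Index");
        }
        return View();
    }

    private void IncrementVisit(string source)
    {
        // e.g. UPDATE Visits SET Count = Count + 1 WHERE Source = @source
    }
}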

The Url where we type in address bar must not be accepted

Hello all, I need your help badly. We have made a role/ID-based application in ASP.NET (C#) where the menu appears according to the user's roles. That part is fine; now the users are trying to type links directly into the address bar and use them. We can't restrict them in page loads and sessions, as this is a production site which is already slow. So my intention is to show the URL in an encrypted format which expires after a certain time, so that the user can't copy and paste it. Is there any possible way...
Can't restrict them in page loads and sessions as this is a production site which is already slow.
Fix that crappy code and/or add more servers, because that is the ONLY approach that makes sense. Anything else is the type of security that gets broken into, and then you run around blaming the world for being unfair.
So my intention is to show the URL in an encrypted format which expires after a certain time, so that the user can't copy and paste it.
So the menu has an encrypted URL that is only valid for, say, half a second? What if the user browses the source code of the page? He can see all the URLs right there in the source.
This is not security, it is hogwash. Sorry to be blunt, but this is not going to work, and you are doing a bad job here.
Checking this in page load will take less than a millisecond (assuming you cache roles in the session). WAY less.
Check for the user's role in the Page_Load event, and if the user does not have permission, redirect him to a page showing "permission denied".
Please provide code if you need further help.
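A minimal sketch of that check, assuming the roles were cached in the session at login under a "Roles" key (the key and the "Admin" role name are illustrative):

using System;
using System.Collections.Generic;
using System.Web.UI;

public partial class SecurePage : Page
{
    protected void Page_Load(object sender, EventArgs e)
    {
        // Roles were loaded once at login; no database hit per request.
        var roles = Session["Roles"] as List<string>;
        if (roles == null || !roles.Contains("Admin"))
        {
            Response.Redirect("~/PermissionDenied.aspx");
            return;
        }
        // ...normal page logic...
    }
}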
Yeah.... you've painted yourself into a corner here.
Short answer:
No, there's no clean way to do this, and whatever kind of "special" URL implementation you create will be open to abuse/spoofing and will still require you to add code anyway.
Long answer:
I can't see any viable solution other than injecting some code into Page_Load.
I take it you're not using WIF/claims-based security, just some bespoke user-login code backed by a database? In that case your best approach (at this point) is to make a simple class in App_Code: when the user logs in, load their permissions into something like a DataTable and store it in a session variable. That way you avoid hitting the database on every page load/postback, which will probably speed your site up a bit too.
Build a non-static method in that class to be called from Page_Load: it gets the URL (or page ID) being accessed and checks it against the session-stored DataTable. If the check fails, redirect the user to an access-denied page.
Building the class foundation is key; don't attempt to shortcut it by copy-pasting chunks of code into each page. With the "security" class you can standardise your code and reduce testing down to a few simple checks.
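A sketch of such a class, under stated assumptions: the "Permissions" session key, the PageUrl column in the DataTable, and the redirect targets are all illustrative.

using System.Data;
using System.Web;

public class SecurityGate
{
    // Non-static on purpose, per the advice above. Call from Page_Load:
    //   new SecurityGate().EnsureAccess(Context);
    public void EnsureAccess(HttpContext context)
    {
        var permissions = context.Session["Permissions"] as DataTable;
        if (permissions == null)
        {
            context.Response.Redirect("~/Login.aspx");
            return;
        }

        // Check the requested page against the session-cached permissions.
        string page = context.Request.AppRelativeCurrentExecutionFilePath;
        string filter = "PageUrl = '" + page.Replace("'", "''") + "'";
        if (permissions.Select(filter).Length == 0)
        {
            context.Response.Redirect("~/AccessDenied.aspx");
        }
    }
}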

how to create a decorator proxy page in asp.net 3.5

I am trying to build this feature, and I'm really stuck.
I have two applications that run on the same domain, and I need one application to load pages from the other inside its own (the first one's) master page.
I have full control of the code on both sides, of course.
I have tried using HttpWebRequest/HttpWebResponse, and I have tried using WebBrowser. Both work great as long as the pages are static (plain HTML). However, those pages are actually dynamic: the user needs to press server-side buttons (postbacks), and the pages generally use session, viewstate, and/or cookies.
Because of that, HttpWebRequest and WebBrowser fail me, as they do not cause postbacks, and therefore those server-side controls do not work. More so, if I try to "fake" a postback by saving the ViewState after each response and then resending it on the next request, after a few (3-4) round trips the original page returns a "The state information is invalid for this page and might be corrupted" error, even if I use
EnableViewStateMac="false" EnableSessionState="True" EnableEventValidation="false" ValidateRequest="false" ViewStateEncryptionMode="Never"
So... any ideas how I can solve this issue?
Thanks in advance
What is the main desire here?
Wrap one site's content in another without any architecture changes?
ANSWER: Iframe
Have a single submit button submit from two sites?
ANSWER: Not a good idea. You might be able to kludge this by creating a scraper and parser, but it would only be cool as an "I can do it" trophy. Better to re-architect the solution. But assuming you really want to do this, you will have to parse the result from the embedded site and redirect the submit to the new site. That site will then take the values and submit the form to the first site, wait for the result, and scrape it to give a response to the user. It is actually quite a bit more complex than that, as you have to parse the HTML DOM (easier if all of the HTML is XHTML-compliant, of course) to figure out what to intercept.
Caveat: Any changes to the embedded site can blow up your code, so the persons who maintain the first site must be aware of this artificially created dependency so they don't change anything that might cause problems. Ouch, that sounds brittle! ;-)
Other?
If using an iFrame does not work, then I would look at the business problem and draw up an ideal architecture to solve it, which might mean making the functionality of the embedded site available via a web service for the second site.
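If the iframe route does fit, a minimal sketch of the hosting page follows; the /OtherApp/SomePage.aspx path is illustrative. Because both applications share a domain, cookies flow to the framed app, so its session and postbacks work normally:

<%-- Content page inside the first application's master page. --%>
<asp:Content ID="Main" ContentPlaceHolderID="MainContent" runat="server">
    <iframe src="/OtherApp/SomePage.aspx"
            style="width:100%; height:600px; border:none"
            title="Embedded application"></iframe>
</asp:Content>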

Post values to URL without redirecting

I'm setting up an SMS service where I have to post some values, like receiver, sender, and message, to a specific URL at the provider.
Pretty simple if I just add a button and, in the button's event handler, do a Response.Redirect("...url and url parameters with values...").
But I don't want the user to be redirected to another page when the button is clicked. I have tried posting the URL to a new window with JavaScript. That works, but I'm running into a lot of pop-up blocking issues with the browsers...
Are there any recommendations on how to accomplish this? I think there must be a pretty common way to post information to payment services and such.
Best regards.
When you need to load content into different pages without reloading, you can use local storage, which is now widely supported. It is similar in a sense to cookies, but much more flexible and up to date.
In-depth look into local storage here
Brief local storage demo here
I can't be sure of the exact method of using this with VB or C#, but I am sure that if you look around you will find it. It is a little hard to tell your exact use case, but ultimately GET variables are loaded into the page or script on load, so even if you manage to change or update the variable, it won't be accessed until the next reload.
With System.Net.WebClient you can call a website without doing a redirect.
Dim result As String = New System.Net.WebClient().DownloadString("http://...")
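In C#, the same idea with WebClient.UploadValues does an HTTP POST entirely server-side, so the user never leaves the page. The provider URL and field names below are illustrative, not from the question:

using System.Collections.Specialized;
using System.Net;
using System.Text;

public static class SmsGateway
{
    public static string Send(string receiver, string sender, string message)
    {
        using (var client = new WebClient())
        {
            var values = new NameValueCollection
            {
                { "receiver", receiver },
                { "sender", sender },
                { "message", message }
            };
            // POSTs the form fields and returns the provider's response body.
            byte[] response = client.UploadValues("https://sms.provider.example/send", values);
            return Encoding.UTF8.GetString(response);
        }
    }
}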

Easy way to replicate web page across machines?

I am trying to replicate a browser page to another browser on another machine. I basically want to reproduce a page exactly as it appears to a customer, for viewing by the website owner. I have done this before using some impersonation trickery, but found that it would throw the session state out of whack when the site owner switched customers. So I would like to stay away from cookie and authentication manipulation.
Anybody done anything like that? Is there a way to easily transfer the DOM to a webservice?
The tech/programming at my disposal is C#, JavaScript, and WCF.
Is sending an image an option? If it is, you can use the IECapt program to take a screenshot of the page and send it to the other machine:
http://iecapt.sourceforge.net/
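For example, you could shell out to it from C#; the --url/--out switches follow IECapt's documented usage, but verify them against the version you download:

using System.Diagnostics;

class CapturePage
{
    static void Main()
    {
        var psi = new ProcessStartInfo
        {
            FileName = "IECapt.exe", // assumes IECapt.exe is on the PATH
            Arguments = "--url=http://example.com/page --out=snapshot.png",
            UseShellExecute = false
        };
        using (Process p = Process.Start(psi))
        {
            p.WaitForExit(); // snapshot.png can now be sent to the other machine
        }
    }
}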
If session state is getting messed up when the site owner switches customers, your implementation might be the problem. I'd try fixing how your session management works before solving a problem which is really a symptom of a deeper problem, IMO.
Since you mentioned transferring the DOM to a webservice, I assume you need to inspect the page's source and not just its appearance. I recommend checking this link:
http://www.eggheadcafe.com/community/aspnet/7/10041011/view-source-of-a-web-page.aspx
It has a few suggestions for grabbing a page's source programmatically / screen-scraping.
Of course, a few more details might yield better answers. Specifically, does the customer submit their page to the owner (I imagine a scenario where a user of your site says "Hey, I'm having a problem! Take a look at this...") or is the owner looking at how the page renders when logged-in as a specific customer?
The easiest way is to post the innerHTML of the body tag to your web service, which your other page can poll (or use Comet, or something similar) to get it back. You'll have to be careful to load the right CSS in your clone page. Also, you'll need to think about how often you want it to update.
This is a bit of a hack, though. A better solution would have been to design the page from the start with this in mind (I'm assuming it's too late for that now?), so that anything that mutated the page would at the same time send a message back to the server describing what was changed; or, if the page is not very interactive, to store the canonical state of the page on the server and query that from both browsers with XHRs or similar.
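A sketch of the server side of the polling idea: one endpoint accepts a POST of the customer's body HTML, and the owner's page GETs the latest snapshot. The in-memory storage and the handler name are assumptions; real code would key snapshots per customer session.

using System.IO;
using System.Web;

public class SnapshotHandler : IHttpHandler
{
    private static string _lastSnapshot = "";
    private static readonly object _lock = new object();

    public bool IsReusable { get { return true; } }

    public void ProcessRequest(HttpContext context)
    {
        if (context.Request.HttpMethod == "POST")
        {
            // The customer's page posts document.body.innerHTML here.
            using (var reader = new StreamReader(context.Request.InputStream))
            {
                lock (_lock) { _lastSnapshot = reader.ReadToEnd(); }
            }
        }
        else
        {
            // The owner's page polls this to rebuild the clone.
            context.Response.ContentType = "text/html";
            lock (_lock) { context.Response.Write(_lastSnapshot); }
        }
    }
}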
