Post values to URL without redirecting - C#

I'm setting up an SMS service where I have to post some values,
like receiver, sender and message, to a specific URL at the provider.
Pretty simple if I just add a button and, in the button's click event,
do a Response.Redirect("...url and parameters with values...").
But I don't want the user to be redirected to another page when the
button is clicked. I have tried posting the URL to a new window with
JavaScript. That works, but I'm running into a lot of pop-up blocking issues in
the browser...
Are there any recommendations on how to accomplish this? I think it must
be a pretty common way to post information to payment services and the like.
Best regards.

When you need to load content into different pages without reloading, you can use local storage, which is now widely supported. It is similar in a sense to cookies, but much more flexible and up to date.
An in-depth look into local storage here
A brief local storage demo here
I can't be sure of the exact method of using this with VB or C#, but I am sure you will find it if you look around. It is a little hard to tell your exact use case, but ultimately GET variables are loaded into the page or script on load, so even if you manage to change or update the variable, the change won't be picked up until the next reload.

With System.Net.WebClient you can call a website without doing a redirect.
Dim result As String = New System.Net.WebClient().DownloadString("http://...")
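For the question as asked (posting receiver, sender and message), a C# sketch along the same lines might look like this; the provider URL, the field names and the button handler are placeholders, not the provider's real API:

    using System;
    using System.Collections.Specialized;
    using System.Net;
    using System.Text;

    public partial class SendSms : System.Web.UI.Page
    {
        // Hypothetical button handler; adjust URL and field names to the provider's spec.
        protected void btnSend_Click(object sender, EventArgs e)
        {
            using (var client = new WebClient())
            {
                var values = new NameValueCollection
                {
                    { "receiver", "4512345678" },
                    { "sender",   "MyCompany" },
                    { "message",  "Hello from the SMS service" }
                };

                // UploadValues issues an HTTP POST from the server, so the browser
                // never leaves the current page.
                byte[] responseBytes = client.UploadValues("https://provider.example/sms/send", values);
                string response = Encoding.UTF8.GetString(responseBytes);
                // Inspect 'response' here to confirm the provider accepted the message.
            }
        }
    }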

Related

The URL we type in the address bar must not be accepted

Hello all, I need your help badly. We have made a role/ID-based application in ASP.NET (C#) where the menu appears according to the user's roles. This is fine; now the users are trying to type the links directly into the address bar and use them. We can't restrict them in page loads and sessions, as this is a production site which is already slow. So my intention is to show the URL in an encrypted format which expires after a certain time, so that the user can't copy and paste it. Is there any possible way...
Can't restrict them in page loads and sessions as this is a production site which
is already slow.
Fix that crappy code and/or add more servers, because that is the ONLY approach that makes sense. Anything else is the type of security that gets broken into, and then you run around blaming the world for being unfair.
So my intention is to show the URL in an encrypted format which expires after a certain
time, so that the user can't copy and paste it.
So the menu has an encrypted URL that is only valid for, what, half a second? What if the user browses the source code of the page? He can see everything there.
This is not security, it is hogwash. Sorry to be blunt, but this is not going to work and you are doing a bad job here.
Checking this in Page_Load will take less than a millisecond (assuming you cache the roles in the session). WAY less.
Check for the user's role in the Page_Load event and, if the user does not have permission, redirect him to a page showing "permission denied".
Please provide code if you need further help.
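A minimal sketch of that check, assuming the roles were cached in the session at login; the session key, the "Admin" role name and the page names are placeholders:

    using System;
    using System.Web.UI;

    public partial class AdminReports : Page
    {
        protected void Page_Load(object sender, EventArgs e)
        {
            // Hypothetical session key holding the roles cached at login.
            var roles = Session["UserRoles"] as string[];

            // The check runs entirely in memory, so it costs far less than a millisecond.
            if (roles == null || Array.IndexOf(roles, "Admin") < 0)
            {
                Response.Redirect("~/PermissionDenied.aspx");
            }
        }
    }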
Yeah.... you've painted yourself into a corner here.
Short answer:
No, there's no clean way to do this, and whatever kind of 'special' URL implementation you create will be open to abuse/spoofing and will still require you to add code anyway.
Long answer:
I can't see any viable solution other than injecting some code into Page_Load.
I take it you're not using WIF/claims-based security, just some bespoke user-login, database-backed code? So your best approach (at this point) is to make a simple class in App_Code: when the user logs in, load their permissions into something like a DataTable and store that in a session variable. That way you avoid doing database requests every time the page loads or posts back, which will probably speed your site up a bit too.
Build a non-static method in that class to be called on Page_Load; it gets the URL (or page id) being accessed and checks it against the session-stored DataTable. If that check fails, redirect the user to an access-denied page (a sketch follows below).
Building the class foundation is key; don't attempt to shortcut it and copy-paste chunks of code into each page. With the 'security' class you can standardise your code and reduce testing down to a few simple checks.
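A rough sketch of that kind of App_Code class; the "Permissions" session key, the "PageUrl" column and the access-denied page are assumptions about how the login code stores permissions:

    using System;
    using System.Data;
    using System.Web;

    public class SecurityHelper
    {
        // Called from each page's Page_Load with the URL (or page id) being accessed.
        public void EnforceAccess(string pageUrl)
        {
            // The DataTable is assumed to be filled once at login and cached in session.
            var permissions = HttpContext.Current.Session["Permissions"] as DataTable;

            bool allowed = false;
            if (permissions != null)
            {
                foreach (DataRow row in permissions.Rows)
                {
                    if (string.Equals((string)row["PageUrl"], pageUrl,
                                      StringComparison.OrdinalIgnoreCase))
                    {
                        allowed = true;
                        break;
                    }
                }
            }

            if (!allowed)
                HttpContext.Current.Response.Redirect("~/AccessDenied.aspx");
        }
    }

Each Page_Load then needs only one line, e.g. new SecurityHelper().EnforceAccess(Request.Path);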

What technology allows page content to change without changing the URL?

I have seen this on some survey websites. What is the C# code they use to keep the URL the same, so that when clicking the "Next" button the same .aspx page is maintained:
without any query string;
without even a single character in the URL changing; and
while the grid, the data, the content and the questions keep changing?
Can anyone give a code-wise example of how to achieve this?
My main question is how this is done in code-behind: change the data on the page while keeping the same URL.
Nothing is simpler than a session, maintained on the server side. Store a "current question number" in the session, increment it on each successful postback, and you have what you are asking about.
Another possibility is a cookie which contains the "current question number".
Both the cookie and the session are, of course, invisible in the query string.
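A minimal sketch of the session approach, assuming a Web Forms code-behind; the "CurrentQuestion" key, the button handler and the BindQuestion helper are illustrative names:

    using System;
    using System.Web.UI;

    public partial class Survey : Page
    {
        protected void btnNext_Click(object sender, EventArgs e)
        {
            // Hypothetical session key; incremented on each successful postback.
            int current = (Session["CurrentQuestion"] as int?) ?? 0;
            Session["CurrentQuestion"] = current + 1;

            // Re-bind the questions for the new index; the URL never changes because
            // this is an ordinary postback to the same .aspx page.
            BindQuestion(current + 1);
        }

        private void BindQuestion(int index)
        {
            // Placeholder for whatever loads question 'index' into the page controls.
        }
    }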
"change data of page and maintain same url." Answer is Server.Transfer.
This method will preserve url.
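For example (the target page name and button handler are placeholders):

    protected void btnNext_Click(object sender, EventArgs e)
    {
        // Runs NextQuestion.aspx on the server; the browser's address bar keeps
        // showing the original URL because no redirect is sent to the client.
        Server.Transfer("NextQuestion.aspx");
    }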
The Next button may submit a form using the HTTP POST method. The form data may contain the session, question and response data. The site uses that to build a new response. Unlike a GET, a POST does not incorporate data into the URL.
Developers typically accomplish this with AJAX. The basic premise is that only a certain portion of the page (e.g. a grid or content area) makes a server call and retrieves the result (using JavaScript). The effect is that there has been no full postback, which is why you don't see the URL or parameters changing.
It is possible to do this using jQuery, plain JavaScript, or Microsoft's UpdatePanel.
oleksii's comment has some good links as well:
That's the AJAX magic. There are many jQuery plugins for this, for
example this one with a live demo. You can also program it easily
using jQuery get or post, or any other wrapper around the XmlHttpRequest
object.
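For the server-side half of such a call, a static C# page method is one option. A minimal sketch, where the class name, method name and payload are made up for illustration (the jQuery/XmlHttpRequest client would POST to SurveyPage.aspx/GetNextQuestion):

    using System.Web.Services;
    using System.Web.UI;

    public partial class SurveyPage : Page
    {
        // Only this method runs during the AJAX call and only its result travels
        // back, so the URL in the address bar never changes.
        [WebMethod]
        public static string GetNextQuestion(int currentIndex)
        {
            // Placeholder payload; a real page would look up the next question here.
            return "Question text for index " + (currentIndex + 1);
        }
    }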

Rewriting URL based on who is logged in

I'm looking for the best way to change the URL of pages based on who is logged on. The limitation is that all the pages are pre-generated, so the actual HTML has already been produced and cannot be generated again on a per-user basis.
Possible solutions
A possible solution might be to use JavaScript to basically append ?=MyUserName to the end of every URL, but I'm unsure whether this will work with all spiders (by "all" I mean the major search engines). This solution feels a bit dirty to me...
There might also be some way, when the request comes in, to say that the response comes from Default.aspx=?Username by doing a Response.Redirect?
It's also important to remember that I will be changing the cache settings based on this, e.g. saying that if you are not logged in, the page can be cached.
I'm not sure if you must use .html files or another specific extension, but you could always create your own handler and process what you want to do on every request that way. Your handler would determine who is accessing the page and then do a Response.Redirect (or whatever action is necessary).
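A rough sketch of such a handler, assuming the pre-generated pages are mapped to ASP.NET in web.config and that "logged in" can be detected from a cookie; the cookie name, query string key and caching choices are placeholders:

    using System.Web;

    public class StaticPageHandler : IHttpHandler
    {
        public bool IsReusable { get { return true; } }

        public void ProcessRequest(HttpContext context)
        {
            // Placeholder for however "who is logged in" is really determined.
            HttpCookie user = context.Request.Cookies["UserName"];

            if (user == null)
            {
                // Anonymous visitors (and crawlers) get the cacheable, pre-generated file.
                context.Response.Cache.SetCacheability(HttpCacheability.Public);
                context.Response.WriteFile(context.Request.PhysicalPath);
            }
            else
            {
                // Logged-in users are sent to the same page with their name appended,
                // as discussed above.
                context.Response.Redirect(context.Request.Path + "?user=" +
                                          context.Server.UrlEncode(user.Value));
            }
        }
    }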

Append query string to URL on event (C#)

My client has a website that currently makes requests on particular events (click, load, etc.). The website is working well, but he has a problem getting website statistics out of Google Analytics. The reason is that the website never redirects to a different page; it is all within the same page, and no matter what is loaded (be it a video, tables, etc.), everything is displayed under the same URL:
www.somewebsite.com/default.aspx
What I want to achieve is, on a particular event, to change the URL to
www.somewebsite.com/default.aspx?type=abc&id=999
How can I do this? What is the easiest method? Please advise. The whole website is built in C#.
Many thanks.
Is this event happening on the server or the client?
If it's on the server, you can call Response.Redirect and add your new query string parameter to the current url.
If it's on the client (Javascript), you can set the location property.
If you want to preserve your page's state, try adding your querystring parameter to the form's action parameter in Javascript.
Alternatively, as jeffamaphone suggested, you can change the hash (the part after the # sign) by setting location.hash without posting back to the server.
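On the server side that can be a one-liner in the event handler; the handler name below is hypothetical, and note that this does cause a full reload of default.aspx:

    protected void ShowVideo_Click(object sender, EventArgs e)
    {
        // Redirects back to the same page with the extra query string values,
        // so Google Analytics records a distinct URL for this event.
        Response.Redirect(Request.Path + "?type=abc&id=999");
    }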
Actually, you should probably move some of the elements to different pages... this is based on what you said:
"because basically all I am doing is hiding and showing elements based on events; if I do a Response.Redirect, it will reload the homepage."
If I understand correctly, the user is never redirected to a different page; you are just hiding/unhiding elements on default.aspx based on query strings... correct? The simplest solution would be to split that content into different .aspx pages.

Easy way to replicate web page across machines?

I am trying to replicate a browser page in another browser on another machine. I basically want to reproduce a page exactly as it appears to a customer, for viewing by the website owner. I have done this before using some impersonation trickery, but found that it would throw the session state out of whack when the site owner switched customers. So I would like to stay away from cookie and authentication manipulation.
Has anybody done anything like this? Is there a way to easily transfer the DOM to a web service?
The technologies at my disposal are C#, JavaScript and WCF.
Is sending an image an option? If so, you can use the IECapt program to take a screenshot of the page and send it to the other machine:
http://iecapt.sourceforge.net/
If session state is getting messed up when the site owner changes customer roles, your implementation might be the problem. I'd try fixing how your session management works before solving a problem which is really a symptom of a deeper issue, IMO.
Since you mentioned transferring the DOM to a webservice, I assume you need to inspect the page's source and not just its appearance. I recommend checking this link:
http://www.eggheadcafe.com/community/aspnet/7/10041011/view-source-of-a-web-page.aspx
It has a few suggestions for grabbing a page's source programmatically / screen-scraping.
Of course, a few more details might yield better answers. Specifically, does the customer submit their page to the owner (I imagine a scenario where a user of your site says "Hey, I'm having a problem! Take a look at this...") or is the owner looking at how the page renders when logged in as a specific customer?
The easiest way is to post the innerHTML of the body tag to your web service, which your other page can poll (or use Comet, or something similar) to get it back. You'll have to be careful to load the right CSS in your clone page. You'll also need to think about how often you want it to update.
This is a bit of a hack, though. A better solution would have been to design the page from the start with this in mind (I'm assuming it's too late for that now?), so that anything that mutated the page would also send a message back to the server describing what changed; or, if the page is not very interactive, to store the canonical state of the page on the server and query that from both browsers with XHRs or similar.
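A rough sketch of the receiving side, assuming an ASMX-style service; the class, methods and storage are made up for illustration (the customer's page would POST document.body.innerHTML to SaveSnapshot, and the owner's clone page would poll GetSnapshot):

    using System.Web.Services;

    [WebService(Namespace = "http://example.com/page-mirror")]
    public class PageMirrorService : WebService
    {
        // A single static field for illustration only; a real implementation
        // would key snapshots per customer and persist them properly.
        private static string latestSnapshot = string.Empty;

        // The customer's page posts its body innerHTML here on every change.
        [WebMethod]
        public void SaveSnapshot(string html)
        {
            latestSnapshot = html;
        }

        // The owner's "clone" page polls this and injects the result into its own body.
        [WebMethod]
        public string GetSnapshot()
        {
            return latestSnapshot;
        }
    }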
