I have quite the odd scenario I have not run into before.
I simply have a hyperlink with the URL hard-coded to www.mywebsite.com/StoreLoc.aspx
My application has two pages that we are concerned about.
Store.aspx
StoreLoc.aspx
Even though the hyperlink is hard-coded to go to StoreLoc.aspx, SOMETIMES (roughly one time out of ten) the hyperlink will direct to Store.aspx instead.
I am not the first person to work on this application, so the .aspx files may very well have been renamed. Could this cause the issue?
Thank you all for your time.
You might have something in the Session_OnStart handler in the Global.asax file that does a Response.Redirect to Store.aspx.
Your session might be getting killed after x minutes of inactivity. That would cause a new session to start, and the request would then get redirected to the start page.
You may want to search for Response.Redirect or Server.Transfer in your Visual Studio solution.
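For illustration, a minimal sketch of the kind of Global.asax code to look for; the redirect target here is only an assumption based on your page names:

using System;
using System.Web;

// Global.asax.cs -- hypothetical example of the pattern to search for.
public class Global : HttpApplication
{
    protected void Session_Start(object sender, EventArgs e)
    {
        // If every new session is redirected like this, a click made after
        // the old session times out will land on Store.aspx instead of the
        // page the hyperlink actually pointed to.
        Response.Redirect("~/Store.aspx");
    }
}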
Hope this helps.
I am trying to build this feature, and I'm really stuck.
I have two applications that run on the same domain, and I need to have one application load pages from the other one inside its own (the first application's) master page.
I have full control of the code of both sides, of course.
I have tried using HTTPRequest and HTTPResponse, and I have tried using WebBrowser. Both work great as long as I have static (plain HTML) pages. However, those pages are actually dynamic. The user needs to press server-side buttons (postbacks) and generally uses the session, viewstate, and/or cookies.
Because of that, HTTPRequest and WebBrowser fail me, as they do not cause a postback, and therefore those server-side controls do not work. What's more, if I try to "fake" a postback by saving the ViewState after each response and then resending it on the next request, after a few (3-4) round trips the original page returns a "The state information is invalid for this page and might be corrupted" error, even if I use
EnableViewStateMac="false" EnableSessionState="True" EnableEventValidation="false" ValidateRequest="false" ViewStateEncryptionMode="Never"
So... any ideas how can I solve this issue?
Thanks in advance
What is the main desire here?
Wrap one site's content in another without any architecture changes?
ANSWER: Iframe
Have a single submit button submit from two sites?
ANSWER: Not a good idea. You might be able to kludge this by creating a scraper and parser, but it would only be cool as an "I can do it trophy". Better to rearchitect the solution. But assuming you really want to do this, you will have to parse the result from the embedded site and redirect the submit to the new site. That site will then take the values and submit the form to the first site and wait for the result, which it will scrape to give a response to the user. It is actually quite a bit more complex, as you have to parse the HTML DOM (easier if all of the HTML is XHTML compliant, of course) to figure out what to intercept.
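To make the kludge concrete, here is a rough sketch of that scrape-and-repost round trip in C#. The URL, method names, and the idea of carrying __VIEWSTATE forward are my assumptions, not code from the question; a real WebForms page usually also needs __EVENTTARGET and __EVENTVALIDATION, and a shared CookieContainer so the embedded site sees one session.

using System;
using System.IO;
using System.Net;
using System.Text;
using System.Text.RegularExpressions;

// Rough sketch only: fetch a page from the embedded site, pull out its
// __VIEWSTATE, and post it back together with the intercepted form values.
class EmbeddedSiteProxy
{
    static readonly CookieContainer Cookies = new CookieContainer();

    public static string Get(string url)
    {
        var request = (HttpWebRequest)WebRequest.Create(url);
        request.CookieContainer = Cookies;
        using (var response = request.GetResponse())
        using (var reader = new StreamReader(response.GetResponseStream()))
            return reader.ReadToEnd();
    }

    public static string PostBack(string url, string scrapedHtml, string extraFields)
    {
        // Carry the page's current __VIEWSTATE forward so the repost
        // matches what the server handed out.
        string viewState = Regex.Match(scrapedHtml,
            "id=\"__VIEWSTATE\" value=\"([^\"]*)\"").Groups[1].Value;

        var request = (HttpWebRequest)WebRequest.Create(url);
        request.Method = "POST";
        request.ContentType = "application/x-www-form-urlencoded";
        request.CookieContainer = Cookies;

        byte[] body = Encoding.UTF8.GetBytes(
            "__VIEWSTATE=" + Uri.EscapeDataString(viewState) + "&" + extraFields);
        request.ContentLength = body.Length;
        using (Stream stream = request.GetRequestStream())
            stream.Write(body, 0, body.Length);

        using (var response = request.GetResponse())
        using (var reader = new StreamReader(response.GetResponseStream()))
            return reader.ReadToEnd();
    }
}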
Caveat: Any changes to the embedded site can blow up your code, so the persons who maintain the first site must be aware of this artificially created dependency so they don't change anything that might cause problems. Ouch, that sounds brittle! ;-)
Other?
If using an iFrame does not work, then I would look at the business problem and draw up an ideal architecture to solve it, which might mean making the functionality of the embedded site available via a web service for the second site.
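As a sketch of that last option, assuming WCF and invented names, the embedded site could expose its functionality as a contract the second site calls directly instead of scraping pages:

using System.ServiceModel;

// Hypothetical contract only; the operations stand in for whatever the
// embedded site's pages currently do on postback.
[ServiceContract]
public interface IEmbeddedSiteService
{
    [OperationContract]
    string RenderMenuFragment(string userName);

    [OperationContract]
    bool SubmitForm(string pageName, string serializedFieldValues);
}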
I'm looking for the best way to change the URLs of pages based on who is logged on. The limitation is that all the pages are PRE-generated, so the actual HTML will already be generated and cannot be generated again on a per-user basis.
Possible solutions
A possible solution might be to use JavaScript to basically add ?=MyUserName to the end of every URL, but I'm unsure if this will work with all spiders (by all I mean the major search engines). This solution feels a bit dirty to me...
There might also be some way, when the request comes in, to basically say that the response is from Default.aspx?=Username by doing a Response.Redirect?
It's also important to remember that I will be changing the cache settings based on this, e.g. saying that if you're not logged in, the page can be cached.
I'm not sure if you must use .html files or another specific extension, but you could always create your own handler and process what you want to do on every request that way. Your handler would determine who is accessing the page and then do a Response.Redirect (or whatever action is necessary).
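A minimal sketch of that idea, assuming the pre-generated pages are mapped to ASP.NET and using a hypothetical ?user= query-string convention (my invention, not the question's):

using System.Web;

// Hypothetical handler: logged-in users are redirected to a per-user URL,
// anonymous users get the cacheable pre-generated file as-is.
public class PreGeneratedPageHandler : IHttpHandler
{
    public bool IsReusable
    {
        get { return true; }
    }

    public void ProcessRequest(HttpContext context)
    {
        bool loggedIn = context.User != null && context.User.Identity.IsAuthenticated;
        bool alreadyTagged = context.Request.QueryString["user"] != null;

        if (loggedIn && !alreadyTagged)
        {
            // Tag the URL with the user name so the per-user variant is served.
            context.Response.Redirect(context.Request.Path +
                "?user=" + HttpUtility.UrlEncode(context.User.Identity.Name));
            return;
        }

        // Anonymous requests stay cacheable; per-user requests do not.
        context.Response.Cache.SetCacheability(
            loggedIn ? HttpCacheability.NoCache : HttpCacheability.Public);
        context.Response.WriteFile(context.Request.PhysicalPath);
    }
}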
I am occasionally getting an "Application compilation is starting." event in my Event Log, and I can't identify what's causing it. I think I may try this - http://blogs.msdn.com/b/tess/archive/2008/11/06/troubleshooting-appdomain-restarts-and-other-issues-with-etw-tracing.aspx - but before I do, I was curious whether I can identify the problem without starting to mess with something unknown.
I have used <%= %> and <%# %> tags throughout the app, so I am wondering if this is what's causing the problems. In a couple of spots I have embedded C# code (using ), so that may add to it?
Precompiling the app is also a valid choice for me; I just don't want to end up in a position where I need to execute the precompilation command on the server every time I upload changes. Currently on my dev machine I've followed the advice from this link - http://mikehadlow.blogspot.com/2008/05/compiling-aspx-templates-using.html - and it does an awesome job, as it allows me to identify errors in the C# code in .aspx pages during a build in Visual Studio. However, I presume the precompilation results are not stored in my website directory (and won't be published when I use the Publish option).
Ideally, I want to stay where I am with the default Web Application model, with the addition of compilation automatically running as soon as I upload a changed .aspx or .ascx over FTP (not waiting for a user's HTTP request). Am I asking too much, or is this possible to set up?
From my research it seems it can be done.
Since nobody responded, I'll accept this as the answer.
Sorry if this is a duplicate, but I've been going crazy over this for the past two hours.
After changing the Master Page in an ASP.NET MVC 1.0 application, I keep getting this familiar error when I try a postback without filling in the mandatory form elements that are validated by the server:
"Validation of viewstate MAC failed.
If this application is hosted by a Web
Farm or cluster, ensure that
configuration specifies
the same validationKey and validation
algorithm. AutoGenerate cannot be used
in a cluster."
The new page refers to a lot of jQuery code with lightboxes, superfish etc. Could that be a problem while doing a postback?
If I revert back to the original master, the error disappears and I'm able to validate form fields. Both masters are located in the same path.
I know a lot of other guys have faced this issue but I was unable to find anything which could help me.
Thanks.
Edited and added
After a little debugging, I've realized that a directive in the master page:
<% Html.RenderAction("menu", "nav"); %>
is creating the problem. The directive asks the "menu" action of the controller "nav" to inject a partial view Menu.ascx. If I delete this line from the new master page, everything works OK. My app's left bar navigation relies on this directive to work properly. Is there any way I can get around this? Very mysterious.
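For reference, the controller side that the directive resolves to would look something like this; the action body is my guess at the usual pattern, not code from my app:

using System.Web.Mvc;

// Hypothetical NavController matching <% Html.RenderAction("menu", "nav"); %>;
// it returns the Menu.ascx partial that builds the left bar navigation.
public class NavController : Controller
{
    public ActionResult Menu()
    {
        return PartialView("Menu");
    }
}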
Are you using Html.AntiForgeryToken() anywhere on this page? Sometimes, when I am testing locally with multiple different sites and/or port changes, this happens to me. If it does, I clear my browser cache and it works just fine.
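If you do use it, the hidden token that Html.AntiForgeryToken() renders in the form must match a cookie, and the check is wired up on the C# side roughly like this (hypothetical action, not from the question); a stale cookie left over from another local site or port is enough to make validation fail:

using System.Web.Mvc;

public class AccountController : Controller
{
    // A form that renders <%= Html.AntiForgeryToken() %> posts a hidden
    // __RequestVerificationToken; this attribute validates it against the
    // matching cookie, and a stale or mismatched cookie fails validation.
    [AcceptVerbs(HttpVerbs.Post)]
    [ValidateAntiForgeryToken]
    public ActionResult Save(FormCollection form)
    {
        return View();
    }
}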
I experienced the same problem two days ago. :)
A simple restart of the box worked in my case, so I didn't investigate further.
I am trying to replicate a browser page to another browser on another machine. I basically want to reproduce a page exactly how it appears to a customer, for viewing by the website owner. I have done this before using some impersonation trickery, but found that it would throw the session state out of whack when the site owner switched customers. So I would like to stay away from cookie and authentication manipulation.
Anybody done anything like that? Is there a way to easily transfer the DOM to a webservice?
The tech/programming at my disposal is C#, JavaScript, and WCF.
Is sending an image an option? If so, you can use the IECapt program to take a screenshot of the page and send it to the other machine:
http://iecapt.sourceforge.net/
If session state is getting messed up when the site owner changes customer roles, your implementation might be the problem. I'd probably try fixing how your session management works before solving a problem which is really a symptom of a deeper problem, IMO.
Since you mentioned transferring the DOM to a webservice, I assume you need to inspect the page's source and not just its appearance. I recommend checking this link:
http://www.eggheadcafe.com/community/aspnet/7/10041011/view-source-of-a-web-page.aspx
It has a few suggestions for grabbing a page's source programmatically / screen-scraping.
Of course, a few more details might yield better answers. Specifically, does the customer submit their page to the owner (I imagine a scenario where a user of your site says "Hey, I'm having a problem! Take a look at this...") or is the owner looking at how the page renders when logged-in as a specific customer?
The easiest way is to post the innerHTML of the body tag to your webservice, which your other page can poll (or use Comet, or something) to get it back. You'll have to be careful to load the right CSS in your clone page. Also, you'll need to think about how often you want it to update.
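On the receiving side, a rough sketch of such a webservice, assuming WCF and invented names (Push/Pull, sessionId), might look like:

using System.Collections.Generic;
using System.ServiceModel;

// Hypothetical mirror service: the customer's page pushes its body HTML up,
// and the owner's page polls it back down by session id.
[ServiceContract]
public interface IPageMirror
{
    [OperationContract]
    void Push(string sessionId, string bodyHtml);

    [OperationContract]
    string Pull(string sessionId);
}

// Naive in-memory implementation, one snapshot per watched session.
public class PageMirror : IPageMirror
{
    static readonly Dictionary<string, string> Snapshots =
        new Dictionary<string, string>();

    public void Push(string sessionId, string bodyHtml)
    {
        lock (Snapshots) { Snapshots[sessionId] = bodyHtml; }
    }

    public string Pull(string sessionId)
    {
        lock (Snapshots)
        {
            string html;
            return Snapshots.TryGetValue(sessionId, out html) ? html : null;
        }
    }
}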
This is a bit of a hack, though; a better solution would have been to design the page from the start with this in mind (I'm assuming it's too late for that now?), so that anything that mutated the page would at the same time send a message back to the server describing what changed, or, if the page is not very interactive, to store the canonical state of the page on the server and query that from both browsers with XHRs or similar.