We have a website which we recently migrated to ASP.NET MVC. All of the URLs are now different from the original website. Google still has all of our old URLs, so if anyone finds us in a search, currently they will get a 404.
I have a catchall route that catches bad URLs, including all of the old ones. In a perfect world I would like to do a 301 redirect to the home page for all URLs matching this catchall route, and I have code for this that works properly on my development machine. However, I finally got someone at our ISP (Network Solutions) to tell me that they block 301 redirects (the web server returns a 404 instead).
So I think my only remaining option is to just accept any bad URL, and point it to the home page.
Here is my question: I know that the search engines (especially Google) are now penalizing duplicate content. If I just point all bad URLs to the home page, how much is this going to hurt us in the search rankings? Do I have any other technical options?
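For reference, the kind of catchall the question describes might look something like this sketch in ASP.NET MVC (the route, controller, and action names are illustrative assumptions, not taken from the question's actual code):

using System.Web.Mvc;
using System.Web.Routing;

// In RouteConfig.RegisterRoutes: a catchall registered after all real
// routes, so it only matches URLs nothing else claimed (e.g. old URLs).
routes.MapRoute(
    name: "CatchAll",
    url: "{*path}",
    defaults: new { controller = "Legacy", action = "CatchAll" }
);

// Hypothetical controller: RedirectPermanent() sends the HTTP 301
// that the question says the host rewrites into a 404.
public class LegacyController : Controller
{
    public ActionResult CatchAll(string path)
    {
        return RedirectPermanent("/");
    }
}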
Honestly, I would suggest that you change ISPs. 301s are an important tool in any webmaster's toolbox, and for an ISP to block them will penalize you terribly. You could easily transfer your domain to another IP address, wait for the DNS propagation, and then do your rollout.
From Google's Webmaster tools:
Use a 301 Redirect to permanently redirect all pages on your old site to your new site. This tells search engines and users that your site has permanently moved. We recommend that you move and redirect a section or directory first, and then test to make sure that your redirects are working correctly before moving all your content.

Don't do a single redirect directing all traffic from your old site to your new home page. This will avoid 404 errors, but it's not a good user experience. It's more work, but a page-to-page redirect will help preserve your site's ranking in Google while providing a consistent and transparent experience for your users. If there won't be a 1:1 match between pages on your old site and your new site (recommended), try to make sure that every page on your old site is at least redirected to a new page with similar content.
I'm sure that's much easier said than done, but I would never want an ISP that exerted that kind of filtering against its clients.
Can you do a 302 redirect at least? I do agree with what womp says, though: what ISP would block 301 redirects? Dump them; ISPs are a dime a dozen.
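For what it's worth, in ASP.NET MVC the temporary variant is just the non-permanent helper; a minimal sketch, reusing the hypothetical catchall action from the sketch above:

public ActionResult CatchAll(string path)
{
    // Redirect() issues an HTTP 302 (temporary redirect),
    // where RedirectPermanent() would issue a 301.
    return Redirect("/");
}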
I completely agree with womp. I cannot believe that an ISP would block 301s.
I was quite surprised that you can't do a 301 redirect on Network Solutions, because they're not exactly a two-bit operation.
Their own marketing material suggests that you can. There's also a forum post by someone wanting to do a 301 redirect; although they use a .htaccess file, the reply from Network Solutions tech support shows the user how to do a 301 redirect in classic ASP.
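For what that kind of reply covers, a classic ASP 301 usually looks roughly like this (a sketch with a placeholder URL, not the actual reply's code):

<%
' Send a permanent redirect from classic ASP.
Response.Status = "301 Moved Permanently"
Response.AddHeader "Location", "http://example.com/newurl"
Response.End
%>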
If you opt not to change ISPs, then the simplest solution is to display a page saying that the page has moved, with a link to the new page, and add a 5-second delay that redirects using an HTML meta tag:
<html>
  <head>
    <title>Page moved</title>
    <!-- Redirect to the new URL after 5 seconds -->
    <meta http-equiv="refresh" content="5;url=http://example.com/newurl">
  </head>
  <body>
    The page has been moved. <a href="http://example.com/newurl">Click here</a>
    if you have not been redirected to the new page within 5 seconds.
  </body>
</html>
Alternatively, you could use a URL rewriter so that the old URL "points" to the new page. There are basically two ways of doing this: the programmatic way is to create your own VirtualPathProvider; the other is to use a rewriting module such as the IIS URL Rewrite Module.
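In MVC specifically, a lightweight way to get a similar effect, where the old URL serves the new page without any redirect, is an extra route registered before the catchall (a sketch; the old URL pattern here is made up):

// A request for the hypothetical old address /products.aspx is served
// by the new ProductsController, so the old URL keeps working.
routes.MapRoute(
    name: "LegacyProducts",
    url: "products.aspx",
    defaults: new { controller = "Products", action = "Index" }
);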
Related
I am currently supporting an existing ASP.NET MVC website that was written by another developer. Many parts of the site have been upgraded to more modern frameworks, and I would like to redirect users to the new site where possible. However, there are still some pages that will have to continue to be used on the old site until I can finish the migration.
The server is Windows Server 2008, IIS 7.0, .NET 4.5
Let's say the old URL is: https://www.companysite.com/
The new site is in a virtual directory at: https://www.companysite.com/thenewsite/
What is the best way to selectively redirect users to the new site, where I have those parts built, but also leave the old site accessible for the pages that are not yet transferred to the new design?
For example, I would like to redirect:
https://www.companysite.com/contracts/ to https://www.companysite.com/thenewsite/contracts/
But I can't redirect every path globally. For example:
https://www.companysite.com/shipping/ can NOT redirect to the new site yet, as I haven't built /thenewsite/shipping/ yet.
Here are some ideas I had, but I could use some guidance as to which one would be best:
Add a Response.Redirect or an HTML meta refresh to specific pages in the old site (lots of effort)
Use the URL Rewrite module, with a custom rule (not sure how to do this)
Hopefully this makes sense. Any help or suggestions would be greatly appreciated.
For IIS 7.0 you will probably want to use https://www.iis.net/downloads/microsoft/url-rewrite
There are others, and it depends on the version of IIS. I used to use Helicon Rewrite; as an ISAPI plugin it sat at the front of the request pipeline, which is important for performance: you don't want a request to reach a Controller before discovering it's wrong; you want to catch it at routing at the latest.
Whichever URL rewrite tool you use, the key is to make it return the correct HTTP code.
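As a concrete illustration, a selective rule for the example paths in the question might look roughly like this in the old site's web.config (a sketch, assuming the URL Rewrite module is installed; test it before relying on it):

<system.webServer>
  <rewrite>
    <rules>
      <!-- Only /contracts/ is redirected; unlisted paths such as
           /shipping/ continue to be served by the old site. -->
      <rule name="Redirect contracts to new site" stopProcessing="true">
        <match url="^contracts/(.*)" />
        <!-- redirectType="Permanent" returns the 301 mentioned above -->
        <action type="Redirect" url="/thenewsite/contracts/{R:1}" redirectType="Permanent" />
      </rule>
    </rules>
  </rewrite>
</system.webServer>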
A 301 redirect is a permanent redirect: it is cacheable, and any bookmarks for the URL should be updated to point to the new URL. A 302 redirect is a temporary redirect. A 303 redirect is the same as a 302, except that the follow-up request is explicitly changed to a GET request and no confirmation is required.
Recently I made an application in which I made a stupid mistake: I hard-coded the URL I want to post data to, instead of using a proxy such as No-IP.

In short, this app regularly sends requests to my site, which has ended up consuming a lot of my site's resources. The requests are sent to a PHP page which doesn't exist on my site:
http://www.example.com/non-existing-page.php
I suspect that it's impossible to prevent the distributed app from sending requests to my site without changing my website's URL. The thing is, I cannot change the URL and move my site to a different one, so what I need to know now is what I should do to make this stupid mistake less resource-consuming:
1. Keep things as they are, or
2. Create a blank PHP page with the name of the called script.
Here is the short question:
When I call a page on a remote server through WebClient, which consumes more server resources: calling a blank PHP page or calling a nonexistent PHP page?
Thanks in advance
Broadly speaking, whether the 404 page will be displayed depends on the browser.
IE doesn't show a custom 404 page unless it's larger than 512 bytes, and the same applies to Chrome.
If you do want to serve a custom 404 page, make sure you include a favicon in it; otherwise it can lead to really long loading times.
Can anyone verify that what I am seeing is not a bug? I am checking the referrer when someone comes to my site, which is an ASP.NET C# store site. When I link from any of the other sites I control, my main page sees the referrer properly.

I am trying to support a third-party site that links to me from a Google Sites page at sites.google.com/site/whatever, and when I follow that link the referrer on my main page is blank.

Is that something Google is doing, or is it a truly bizarre bug in my code? (I know you can't see my code, but I would like verification that Google is stripping the referrer from their sites.google pages, please.)
Thanks
Google Sites is HTTPS by default, which means no referrer data is passed. This may be part of Google's move to HTTPS across the board. Implications discussed here.
The HTTP RFC says referrers shouldn't be sent when going from HTTPS to HTTP. I'm not sure whether HTTPS to HTTPS will work either. See discussion here.
I want to get the search terms that a user typed into Google to reach my long-tail landing page (and use them on that page).

Getting the "q" variable from the referrer's query string (in ASP.NET C#) works well, but only if the referring Google page was not loaded over HTTPS.

This is obviously a problem, because almost everyone is logged in to a Google account in their browser all the time, and if they are, all Google pages are automatically loaded (and redirected) over HTTPS.

When a user on https://www.google.com searches for something and clicks a search result, Google seems to redirect the user through an intermediate page that strips the request of its query string and replaces it with one that contains little more than the URL the intermediate page should redirect to (i.e. the URL of my long-tail landing page).
Is there any way that I can get the original search terms that were used on https://www.google.com anyway? Maybe if JavaScript could access the browser history or something similar?
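For reference, the extraction described above, which only works when the referrer arrives at all (i.e. not from HTTPS Google), looks something like this sketch; the method name is made up:

using System;
using System.Web;

// Reads the search terms from a Google referrer such as
// http://www.google.com/search?q=long+tail+keywords
// Returns null when the referrer is missing or has no q parameter,
// which is what happens when the search was made over HTTPS.
static string GetSearchTerms(HttpRequest request)
{
    Uri referrer = request.UrlReferrer;
    if (referrer == null)
        return null;
    return HttpUtility.ParseQueryString(referrer.Query)["q"];
}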
Is there any way that I can get the original search terms that were used on https://www.google.com
No. The full text of the HTTPS session is secured via SSL, and this includes headers, URLs, etc. In your scenario, browsers tend to omit the Referer header for security reasons, so you won't be able to access it (unless the destination URL is also secured via HTTPS). This is part of the HTTP spec: 15.1.3, Encoding Sensitive Information in URI's.

The only thing you can do is put a disclaimer on your site to say it doesn't work over HTTPS.
Since it is Google, it is not possible: there is no shared link with your website. Once you are on HTTPS, the Referer header is not sent. I am sure you are aware that headers can be manipulated and cannot be trusted, but you may trust Google. However, due to its privacy policy, activity by Google users on Google is not shared with third parties. Link

Again, in server-side languages you can find functions for the HTTP referrer, but not for an HTTPS referrer. That is for a reason!

Unless you have a collaboration with the originating server, which might make an exception to allow the Referer header only for your website, it isn't possible.

Hope that helps (in moving on)! :)

EDIT: Wikipedia link, see Referrer Hiding (second-to-last line)
To see the referrer data you need to be either a paying Google Ads customer (and have the visitor come via an ad click) or have your site on HTTPS as well. Certs are cheap these days, or you could use an intermediary like CloudFlare to do the SSL and keep a self-signed cert on your site.

You can also see queries, no matter the method used, with Google Webmaster Tools.
I wrote a little about this here: http://blogs.dixcart.com/public/technology/2012/03/say-goodbye-to-keyword-tracking.html
I'm adding the Facebook Comments plugin to a site I'm building on my localhost that has a domain similar to: http://subdomain.domain.lom/
I've added the required code to the page and the plugin appears correctly and I can add comments. The only thing is that it displays a warning message:
Warning: http://subdomain.domain.lom/path is unreachable
I've also added the moderation <meta> tag to the head of my site:
<meta property="fb:app_id" content="{APP-ID}"/>
But when I login to the Facebook comment moderation tool, I don't see any of the test comments I've added.
Is this because I'm testing locally? If so, is there a way I can get the moderation working while developing locally?
You're going to have to trick Facebook into believing this page isn't local. Since comments are keyed to a distinct URL, give the plugin a URL on the real website (but to a fake page).

So if your production page is http://www.example.com/examplesAndHowTos.php then use http://www.example.com/examplesAndHowTos.php?id=test as the comments URL. Be sure to place the fb:admins tag in the production page so that when Facebook lints it, it can grab those values correctly.
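Concretely, that means keying the plugin to the production address instead of the local one; a sketch, with a placeholder app ID and the hypothetical ?id=test URL from above:

<div id="fb-root"></div>
<script async src="https://connect.facebook.net/en_US/sdk.js#xfbml=1&appId=YOUR_APP_ID"></script>
<!-- Comments are tied to data-href, so the local page can point it
     at the production URL while you develop. -->
<div class="fb-comments"
     data-href="http://www.example.com/examplesAndHowTos.php?id=test"
     data-numposts="5"></div>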
Is your site accessible to the outside world even though it's local? What's giving you that warning message?

It sounds to me like your page is just not accessible to Facebook, since it's only available on the local network you are using. If you want to test locally, you should use the hostname 'localhost', so that Facebook knows it will not be able to reach your page.