Testing Facebook Comments Plugin Locally - C#

I'm adding the Facebook Comments plugin to a site I'm building on my localhost that has a domain similar to: http://subdomain.domain.lom/
I've added the required code to the page and the plugin appears correctly and I can add comments. The only thing is that it displays a warning message:
Warning: http://subdomain.domain.lom/path is unreachable
I've also added the moderation <meta> tag to the head of my site:
<meta property="fb:app_id" content="{APP-ID}"/>
But when I login to the Facebook comment moderation tool, I don't see any of the test comments I've added.
Is this because I'm testing locally? If so, is there a way I can get the moderation working while developing locally?

You're going to have to trick Facebook into believing this page isn't local. Since comments are keyed to a distinct URL, give the plugin a URL on the real website (but to a fake page).
So if your production page is http://www.example.com/examplesAndHowTos.php, then use http://www.example.com/examplesAndHowTos.php?id=test as the comments URL. Be sure to place the fb:admins tag in the production page so that when Facebook lints it, it can grab those values correctly.
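As a sketch, the plugin markup would then look something like this, assuming the standard comments plugin embed; the data-href value is the fake-page trick described above, and the example domain is a placeholder for your own:

```html
<!-- Comments plugin pointed at a URL on the real (production) domain -
     a fake page - so Facebook can reach it while you develop locally. -->
<div id="fb-root"></div>
<script async src="https://connect.facebook.net/en_US/sdk.js#xfbml=1"></script>
<div class="fb-comments"
     data-href="http://www.example.com/examplesAndHowTos.php?id=test"
     data-numposts="5"></div>
```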

Is your site accessible to the outside world even though it's local?
What's giving you that warning message?
It sounds to me like your page is simply not reachable by Facebook, since it's only available on the local network you are using.

If you want to test locally, use the hostname 'localhost' so that Facebook knows it will not be able to reach your page.

Related

Facebook OAuth Unable to Connect

I have been trying to create an iOS app using Xamarin (i.e. Xamarin.iOS in C#) and I have run into an error I can't seem to resolve.
Basically, I have entered all the (correct) details into the default Facebook iOS SDK object as directed in the sample documents. However, whenever I click login, the page returns 'unable to connect to server'.
The (URL Decoded) link the SDK sends me to is listed below:
https://m.facebook.com/v2.3/dialog/oauth?client_id=(ID-REDACTED)&default_audience=friends&display=touch&e2e={"init":REDACTED }&redirect_uri=fb-ID-REDACTED://authorize/&response_type=token,signed_request&return_scopes=true&scope=&sdk=ios&sdk_version=4.2.0&state={"REDACTED":"REDACTED= ","0_auth_logger_id":"REDACTED","com.facebook.sdk_client_state":true,"3_method":1}
Pointing a browser at this URL (with my real ID substituted back in) returns a blank page. I'm pretty certain this is due to an error in how I have set up my application on Facebook, rather than an error with the SDK (I'm using the default provided by Xamarin). I have removed the app from Sandbox Mode as directed and tried to follow the instructions in a couple of forum posts, but nothing has changed and I am at a loss right now.
Any advice?
Thanks!
Turns out I've managed to solve my issue...
For some really odd reason, m.facebook.com and facebook.com had been added as entries in the hosts file (meaning they were essentially blocked domains). This was weird given that I could access Facebook from standard browsers like Safari/Firefox (just not the Safari browser in the simulator).
If you have a similar issue, remove those entries from your hosts file.
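For reference, the offending entries would look something like this (the exact addresses may vary); delete or comment out any lines that remap Facebook's domains:

```
# Windows: C:\Windows\System32\drivers\etc\hosts   macOS/Linux: /etc/hosts
# Entries like these silently black-hole Facebook for every app on the machine:
127.0.0.1   facebook.com
127.0.0.1   m.facebook.com
```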

Why is my Azure website no longer visible?

I have created a new website and published it (via Visual Studio) to Azure.
Initially, everything worked fine.
But, after republishing the website a few times, the website stopped responding - i.e. it shows a "Server not found" error in Firefox or "This page can't be displayed" in IE.
Now, any new website I publish shows the same error. (Here's a basic test site I published, so you can see the message: http://www.test-website.azurewebsites.net)
The previously published websites are still working; it is just new websites that are failing.
Does anyone know why this would be happening?
Additional Info:
The website was created using Visual Studio 2012 C# ASP MVC .NET 4.5
The websites are using Microsoft's 'Free' pricing tier.
There are a total of 3 websites on the Azure account.
Claies brought this up in a comment, and I'll take it a step further. Your link should not start with www. when you're visiting a .azurewebsites.net domain.
If you're just typing this, then that's the issue.
On the other hand, given that you're asking this, I'm wondering whether maybe your configuration file is a bit messed up in VS. When you run the publish wizard, try going back a couple pages to the page with the textboxes, and double-check that none of those refer to the www. version. If they do, simply drop that.
That setting shouldn't affect the publish itself, but it does determine which URL opens when publishing completes, so you'll definitely want to fix it if it's wrong, or else this will just keep happening.

Creating custom login portal

Our company serves several clients, including some major corporations, and we're looking into giving our portal page a little something extra by having customized URLs that all link to the same portal.
client1.ourcompany.net
client2.ourcompany.net
Ideally, we'd like the URLs to be superficial. I found that Request.Url.ToString() could be used in the code-behind, but would I still need to create custom ASPX pages for each portal?
No, you could handle this in IIS. Basically, you set up your IIS endpoint to respond to all *.ourcompany.net requests with the same IIS application and your code can disambiguate details from there in the fashion you describe.
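A minimal sketch of the disambiguation step, assuming the convention from the question (client name as the left-most label of the host); the class and method names here are hypothetical:

```csharp
using System;

// Hypothetical helper: extracts the client name ("client1") from a host
// like "client1.ourcompany.net", so one IIS application can serve every
// subdomain and branch on the result.
public static class ClientPortal
{
    public static string GetClientKey(string host)
    {
        // Strip an optional port, then split into DNS labels.
        string hostname = host.Split(':')[0];
        string[] labels = hostname.Split('.');
        // Expect client.ourcompany.net (three labels); anything else is unknown.
        return labels.Length == 3 ? labels[0] : null;
    }
}
```

In the code-behind you would call it with the request's host, e.g. `var client = ClientPortal.GetClientKey(Request.Url.Host);`, and use the key to pick per-client branding or data - no per-client ASPX pages needed.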
Also worth adding that IIS doesn't support wildcard subdomains for websites which share an IP (source: http://forums.asp.net/t/1872857.aspx?Automatic+subdomain+with+c+in+IIS).
As you're working for a company - this may not be a problem but worth noting just in case.
Also some information here: https://serverfault.com/questions/84921/how-to-configure-iis-wildcards-for-account-subdomains-like-basecamp <- link explains how you'd set this up in IIS by making the website respond to all requests for the IP.

Redirecting Old Urls After Web Site Overhaul

We have a website which we recently migrated to ASP.NET MVC. All of the URLs are now different from the original website. Google still has all of our old URLs, so if anyone finds us in a search, currently they will get a 404.
I have a catchall route that catches bad URLs, including all of the old ones. In a perfect world I would like to do a 301 redirect to the home page for all urls matching this catchall route, and I do have code for this that works properly on my development machine. However, I finally got someone at our ISP (Network Solutions) to tell me that they block 301 redirections (the web server returns a 404 instead).
So I think my only remaining option is to just accept any bad URL, and point it to the home page.
Here is my question: I know that the search engines (especially Google) are now penalizing duplicate content. If I just point all bad URLs to the home page, how much is this going to hurt us in the search rankings? Do I have any other technical options?
Honestly, I would suggest that you change ISPs. 301s are an important tool in any webmaster's toolbox, and for them to block that will penalize you terribly. You could easily transfer your domain to another IP address, wait for the DNS propagation, and then do your rollout.
From Google's Webmaster Tools:
Use a 301 redirect to permanently redirect all pages on your old site to your new site. This tells search engines and users that your site has permanently moved. We recommend that you move and redirect a section or directory first, and then test to make sure that your redirects are working correctly before moving all your content.
Don't do a single redirect directing all traffic from your old site to your new home page. This will avoid 404 errors, but it's not a good user experience. It's more work, but a page-to-page redirect will help preserve your site's ranking in Google while providing a consistent and transparent experience for your users. If there won't be a 1:1 match between pages on your old site and your new site (recommended), try to make sure that every page on your old site is at least redirected to a new page with similar content.
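The page-to-page mapping Google recommends can be sketched as a simple lookup table in C#; the paths below are hypothetical examples, and in a real catchall route you would return the result with a permanent redirect (e.g. MVC's RedirectPermanent):

```csharp
using System;
using System.Collections.Generic;

// Hypothetical old-to-new URL map; a real table would list every legacy page.
public static class LegacyRedirects
{
    private static readonly Dictionary<string, string> Map =
        new Dictionary<string, string>(StringComparer.OrdinalIgnoreCase)
        {
            { "/examplesAndHowTos.php", "/Examples" },
            { "/contact.php",           "/Home/Contact" },
        };

    // Returns the new path for a known legacy path, or the home page
    // as a last resort for URLs with no close match.
    public static string Resolve(string oldPath)
    {
        return Map.TryGetValue(oldPath, out string target) ? target : "/";
    }
}
```

Falling through to "/" only for genuinely unknown URLs keeps the duplicate-content exposure small, because every page that has a real successor gets its own target.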
I'm sure that's much easier said than done, but I would never want an ISP that exerted that kind of filter against their clients.
Can you do a 302 redirect at least? I do agree with what womp says though, what ISP would block 301 redirects? Dump them. ISPs are a dime a dozen.
I completely agree with womp. I cannot believe that an ISP would block 301's.
I was surprised that you can't do a 301 redirect on Network Solutions, because they're not exactly a two-bit operation.
Their own marketing material suggests that you can. There's also a forum post by someone wanting to do a 301 redirect. Although they use a .htaccess, the reply from a Network Solutions tech support shows the user how to do a 301 redirect in ASP.
If you opt not to change ISPs, then the simplest solution is to display a page saying that the page has moved, with a link to the new location, and add a 5-second delay that redirects using an HTML meta tag:
<html>
<head>
<title>Page moved</title>
<meta http-equiv="refresh" content="5;url=http://example.com/newurl">
</head>
<body>
The page has been moved. <a href="http://example.com/newurl">Click here</a> if you have not been redirected to the new page within 5 seconds.
</body>
</html>
Alternatively you could use a URL rewriter, so that the old URL "points" to the new page. There are basically two ways of doing this: the programmatic way is to create your own VirtualPathProvider; the second is to use a rewriting module like the IIS URL Rewrite Module.
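With the IIS URL Rewrite module installed, a rule of this shape in web.config issues the redirect at the server level; the URL pattern and target below are hypothetical examples:

```xml
<!-- Sketch of an IIS URL Rewrite rule (requires the URL Rewrite module):
     permanently redirects one old URL to its new MVC route. -->
<system.webServer>
  <rewrite>
    <rules>
      <rule name="LegacyExample" stopProcessing="true">
        <match url="^examplesAndHowTos\.php$" />
        <action type="Redirect" url="/Examples" redirectType="Permanent" />
      </rule>
    </rules>
  </rewrite>
</system.webServer>
```

Note that `redirectType="Permanent"` produces a 301, so this only helps if the host actually serves 301 responses.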

Take down website to public, but leave for testing... "We're Not Open"

We are rolling out a site for a client using IIS tomorrow.
I am to take the site down for the general public (with a "Sorry, we are updating" message) and allow the client to test over the weekend after we perform the upgrade.
If it is successful, I open it to everybody - if not, I roll back.
What is the easiest way to put a "We're not open" sign for the general public, but leave the rest open to testers?
Redirect via IIS. Create a new website in IIS and put your "Sorry, updating" message in its Default.aspx. Then swap ports between the real site (move it from 80 to something else, e.g. 6666) and the 'maintenance' site (set it on 80).
Then tell your testers to go to yoursite.com:6666.
When testing is done, switch the real site back to 80 and take down the 'maintenance' site.
I thought it would be worthwhile to mention ASP.NET 2.0+'s "app offline" feature. (Yes, I realize the questioner wants to leave the app up for testing, but I'm writing this for later readers who might come here with different needs).
If you really want to take the application offline for everyone (for instance to do server maintenance) there is a very simple option. All you have to do in ASP.NET 2.0 and higher is put a file with this name:
app_offline.htm
...in the root directory of your ASP.NET application. Put an appropriate "sorry come back later" message in there. That's it. The ASP.NET runtime does the rest.
Details on Scott Guthrie's blog.
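A minimal app_offline.htm might look like this (the wording is up to you):

```html
<!-- Minimal app_offline.htm: while this file sits in the application root,
     ASP.NET serves it for every request instead of the application. -->
<html>
<head><title>Down for maintenance</title></head>
<body>
  <p>We're updating the site. Please check back shortly.</p>
</body>
</html>
```

One caveat worth knowing: some browsers (notably older IE) replace very small error responses with their own "friendly" error page, so pad the file to over 512 bytes if your message doesn't show up.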
Require that testers login. You can even hide the login page so that you need a direct link to even see it. Then, for all people not logged in, redirect to the page that displays your message.
Fire up another "site" in IIS which will catch the host header for your primary site. Use either a custom 307/503/404 page that says "we're down for maintenance", or some sort of URL rewrite to redirect people to a single static file.
Switch the host-header binding on your real site to something else, like dev.domain.com or testing.domain.com, for your developers to use.
Or block by IP, and have your custom "Not authorized" page tell visitors that you're down for maintenance.
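The IP-based variant can be configured in web.config, assuming the "IP and Domain Restrictions" IIS feature is installed; the addresses below are placeholders for your testers' real IPs:

```xml
<!-- Sketch: allow only the listed tester addresses, deny everyone else. -->
<system.webServer>
  <security>
    <ipSecurity allowUnlisted="false">
      <add ipAddress="203.0.113.10" allowed="true" />
      <add ipAddress="203.0.113.11" allowed="true" />
    </ipSecurity>
  </security>
</system.webServer>
```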
You have several options.
Some methods that I've used before:
Windows authentication and/or separate subdomains for client to test.
Disable anonymous website access in IIS and give your client a username/password combo to test the website.
Disable default document in IIS and give your client an absolute URL to the main index file.
We tend to have a login page, plus an include file shared across all pages in the site (usually the DB connection, since it's included in all files) that checks for a valid logged-in session. If you've not logged in, you get a message saying the site is down for maintenance.
