Simplest Twitter authentication - C#

What is the very, very simplest method of getting the RSS (or JSONP) feed of a Twitter account's list or friends' timeline?
Here's some background: I have a simple server script that feeds a Twitter widget. I wrote the script about two months ago but have not had the chance to deploy it until now. The script fetches the friends_timeline of a dummy account whose sole purpose is to combine the "friends" tweets into a single RSS feed (rather than making one request for every "friend"). Simple, lightweight, easy to maintain, light on Twitter's servers; it seemed like a good solution at the time.
Well, friends_timeline requires authentication, and right now the server uses NetworkCredential to pull down these RSS feeds. As a lot of you know, in August Twitter will cease to support basic authentication and force-march everyone over to OAuth. I've looked through the OAuth documentation and I'm not very happy with Twitter right now.
I'm also hoping to avoid using the Twitterizer framework. That's a lot of code to check and it won't go near our production servers without a thorough code review. I know that reinventing the wheel is a bad thing, but in this case all I want is the wheel, not a race car.

This page provides some smallish C# OAuth code: http://oauth.net/code/
Don't hate Twitter; it was a good idea.
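To give a feel for how little code the "wheel" really is, here's a minimal sketch of the OAuth 1.0a request signing (HMAC-SHA1) that those small libraries do for you. It assumes you already have a consumer key/secret and an access token/secret for the dummy account from Twitter's developer pages; the class and method names are made up for illustration, and any query-string parameters on the request would also need to be folded into the signature base string.

    // Minimal OAuth 1.0a signing sketch (HMAC-SHA1). Class/method names are
    // illustrative; this is not taken from any particular library.
    using System;
    using System.Collections.Generic;
    using System.Security.Cryptography;
    using System.Text;

    public static class OAuthSigner
    {
        public static string BuildAuthorizationHeader(
            string method, string url,
            string consumerKey, string consumerSecret,
            string token, string tokenSecret)
        {
            // OAuth protocol parameters. Any query-string parameters on the
            // request must also be included when building the base string.
            var p = new SortedDictionary<string, string>
            {
                { "oauth_consumer_key", consumerKey },
                { "oauth_nonce", Guid.NewGuid().ToString("N") },
                { "oauth_signature_method", "HMAC-SHA1" },
                { "oauth_timestamp", UnixTimestamp() },
                { "oauth_token", token },
                { "oauth_version", "1.0" }
            };

            // Signature base string: METHOD & encoded URL & encoded sorted parameters.
            var pairs = new List<string>();
            foreach (var kv in p)
                pairs.Add(Uri.EscapeDataString(kv.Key) + "=" + Uri.EscapeDataString(kv.Value));
            string baseString = method.ToUpperInvariant() + "&" +
                                Uri.EscapeDataString(url) + "&" +
                                Uri.EscapeDataString(string.Join("&", pairs.ToArray()));

            // Signing key: consumer secret & token secret.
            string signingKey = Uri.EscapeDataString(consumerSecret) + "&" +
                                Uri.EscapeDataString(tokenSecret);
            string signature;
            using (var hmac = new HMACSHA1(Encoding.ASCII.GetBytes(signingKey)))
            {
                signature = Convert.ToBase64String(
                    hmac.ComputeHash(Encoding.ASCII.GetBytes(baseString)));
            }
            p.Add("oauth_signature", signature);

            // Assemble the Authorization header value.
            var parts = new List<string>();
            foreach (var kv in p)
                parts.Add(Uri.EscapeDataString(kv.Key) + "=\"" + Uri.EscapeDataString(kv.Value) + "\"");
            return "OAuth " + string.Join(", ", parts.ToArray());
        }

        private static string UnixTimestamp()
        {
            var epoch = new DateTime(1970, 1, 1, 0, 0, 0, DateTimeKind.Utc);
            return ((long)(DateTime.UtcNow - epoch).TotalSeconds).ToString();
        }
    }

Attach the returned value as the Authorization header on the HttpWebRequest you already use for the timeline, and the rest of your script can stay as it is. It's only a sketch, but the whole wheel is basically string building plus one HMAC.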

Related

Xero.netstandard.oauth2 says UK payroll coming soon - how can I integrate in the meantime?

I'm writing a new WinForms C# integration into Xero. OAuth 1 is deprecated and Xero.netstandard.oauth2 is the correct SDK to use, but some things are flagged as 'coming soon', including UK payroll. My OAuth 2 authentication is completed, and the Accounting API is drama free.
But my customer requires payroll integration, so how can I accomplish this in the meantime? Do I have to use the soon-to-be-deprecated xero.api.sdk? Seems crap to say the least if that is the case.
Great to see you are working on the OAuth 2.0 code base and are ahead of many devs. We are working hard on improving the SDK projects to streamline the process and will introduce the Payroll UK API soon! Soon is a relative term; I am afraid that when I say this, it is still at least 2 weeks away from today.
But don't let that stop you from rolling your own API client that interfaces with our APIs directly with JSON objects! The API is well specified in our OpenAPI spec; you can even generate your own SDK with the help of any of the OpenAPI generators out there.
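As a rough sketch of what that might look like against the Payroll UK API, reusing the access token the .netstandard.oauth2 client already gives you: note that the endpoint path and the "Xero-tenant-id" header name below are assumptions taken from the public docs/OpenAPI spec, so verify them before relying on this.

    // Rough sketch: call the Payroll UK API directly over HTTP, reusing the
    // OAuth 2.0 access token. The endpoint path and the "Xero-tenant-id"
    // header are assumptions from the public spec; verify against the docs.
    using System.Net.Http;
    using System.Net.Http.Headers;
    using System.Threading.Tasks;

    public class PayrollUkClient
    {
        private readonly HttpClient _http = new HttpClient();

        public async Task<string> GetEmployeesJsonAsync(string accessToken, string tenantId)
        {
            var request = new HttpRequestMessage(
                HttpMethod.Get,
                "https://api.xero.com/payroll.xro/2.0/Employees"); // assumed endpoint

            request.Headers.Authorization = new AuthenticationHeaderValue("Bearer", accessToken);
            request.Headers.Add("Xero-tenant-id", tenantId); // assumed header name
            request.Headers.Accept.Add(new MediaTypeWithQualityHeaderValue("application/json"));

            var response = await _http.SendAsync(request);
            response.EnsureSuccessStatusCode();

            // Raw JSON back; deserialize into your own POCOs or into models
            // generated from the OpenAPI spec.
            return await response.Content.ReadAsStringAsync();
        }
    }

Deserializing against models generated from the OpenAPI spec keeps this close to what the official SDK will eventually do, so swapping it out later should be painless.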
All sorted now. It turned out to be a scope error; the token already being generated by the .netstandard.oauth2 client was correct in all other respects. UK Payroll is now working off our own SDK.
Thanks

Building a Facebook interface for an eye-gaze computer running Windows 10

I am part of a senior design group looking to build an eye-gaze interface for Facebook on Windows 10. That is, my group has a client who controls his computer with his eyes and would like an easier way of accessing Facebook.
Of course we would like to allow our client to access all aspects of Facebook, most importantly Messaging and browsing his News Feed, but also commenting and replying to friends, posts, etc.
I have spent the last few weeks building test applications in Visual Studio using Facebook's C# SDK, and have found that many things are not accessible through it. Messaging and News Feed have been inaccessible from the start, but recently, after implementing and using a system that allowed me to comment on my own posts, this feature became unavailable as well.
My question is very broad and directly to Facebook Developers: Is there any way to accomplish my group's goal? (I realize that this may require more of a discussion, but I don't want to make this post too long.) What is the direction of the Graph API? We want to create a system that will last and requires as little maintenance as possible. (The recent removal of commenting makes me worried that our system would become unusable within a year if we used Graph API.)
*We are not looking to make any money from this endeavor. My group is creating this solely for academic purposes and is customizing this interface for a single "client" who is not compensating us in any way.
Thanks for taking the time to read this

Can Bots/Spiders utilize Cookies?

I am trying to detect whether a visitor is human or not. I just had an idea, but I'm not sure whether it will work: I store a cookie in the visitor's browser and retrieve it when they are browsing my site. If I successfully retrieve the cookie, is that a good technique for detecting bots and spiders?
A well-designed bot or spider can certainly store -- and send you back -- whatever cookies you're sending. So, no, this technique won't help one bit.
Browsers are just code. Bots are just code. Code can do anything you program it to. That includes cookies.
Bots, spammers and the like work on the principle of low-hanging fruit. They're after as many sites or users as they can get with as little effort as possible. Thus they go after popular packages like phpBB and vBulletin because getting into those will get them into a lot of sites.
By the same token, they won't spend a lot of effort to get into your site if the effort is only for your site (unless your site happens to be Facebook or the like). So the best defense against malicious activity of this kind is simply to be different in such a way that an automatic script already written won't work on your site.
But an "I am human" cookie isn't the answer.
No, as Alex says this won't work; the typical process is to use a robots.txt to get them to behave. Further to that, you start to investigate the user-agent string (but this can be spoofed). Any more work than this and you're into CAPTCHA territory.
What are you actually trying to avoid?
You should take a look at the information in the actual HTTP headers and how .NET exposes these things to you. The extra information you have about the person hitting your website is there. Take a look at what Firefox is doing by downloading the Live HTTP Headers plugin and going to your own site. Basically, at the page level, the Request.Headers property exposes this information. I don't know if it's the same in ASP.NET MVC, though. The important header for what you want is the User-Agent. This can be altered, obviously, but the major crawlers will let you know who they are by sending a unique User-Agent that identifies them. Same thing with the major browsers.
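As a hedged illustration of that User-Agent approach (the bot signatures below are just a handful of well-known examples, not an exhaustive or reliable list):

    // Sketch of a simple User-Agent check in ASP.NET. The substrings are only
    // examples of well-known crawlers; the header is trivially spoofed, so
    // treat a match as a hint, not proof.
    using System.Web;

    public static class BotDetection
    {
        private static readonly string[] KnownBotFragments =
        {
            "googlebot", "bingbot", "slurp", "baiduspider", "yandexbot"
        };

        public static bool LooksLikeKnownBot(HttpRequest request)
        {
            string userAgent = request.UserAgent; // same data as Request.Headers["User-Agent"]
            if (string.IsNullOrEmpty(userAgent))
                return true; // mainstream browsers always send a User-Agent

            string ua = userAgent.ToLowerInvariant();
            foreach (string fragment in KnownBotFragments)
            {
                if (ua.Contains(fragment))
                    return true;
            }
            return false;
        }
    }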
I wrote a bot that works with cookies and JavaScript. The easiest way of bot/spam prevention is to use the NoBot component in the Ajax Control Toolkit.
http://www.asp.net/AJAX/AjaxControlToolkit/Samples/NoBot/NoBot.aspx

Google Analytics LIKE Tool

I'm considering writing my own tool for tracking visitors/sales as Google Analytics and others are just not comprehensive enough in the data dept. They have nice GUIs but if you have SQL skills those GUIs are unnecessary.
I'm wondering what the best approach is to do this.
I could simply log the IP, etc. to a text file and then have an async service run in the background to dump it into the DB. Or maybe that's overkill and I can just put it straight into the DB. But one DB write per web request seems like a poor choice where scalability is concerned. Thoughts?
As a side note, it is possible to capture the referring URL for any incoming traffic, right? So if they came from a forum post or something, you can track that actual URL, is that right?
It just seems that this is a very standard requirement and I don't want to go reinventing the wheel.
As always, thanks for the insight SOF.
The answer to this question mentions the open-source GAnalytics alternative Piwik - it's not C# but you might get some ideas looking at the implementation.
For a .NET solution I would recommend reading Matt Berseth's Visit/PageView Analysis Services Cube blog posts (and his earlier posts and examples, since they aren't easy to find on his site).
I'm not sure if he ever posted the server-side code (although you will find his openurchin.js linked in his html), but you will find most of the concepts explained. You could probably get something working pretty quickly by following his instructions.
I don't think you'd want to write to a text file - locking issues might arise; I'd go for INSERTs into a database table. If the table grows too big you can always 'roll up' the results periodically and purge old records.
As for the referrer URL, you can definitely grab that info from the HTTP headers (assuming it has been sent by the client and not stripped off by proxies or strict antivirus software settings).
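A minimal sketch of that approach, assuming a simple PageViews table whose columns match the parameters below (the table, column and connection-string names are made up for illustration):

    // Minimal sketch: one row per request with URL, referrer, IP and User-Agent.
    // Table/column names and the "Analytics" connection string are illustrative.
    using System;
    using System.Configuration;
    using System.Data.SqlClient;
    using System.Web;

    public static class VisitLogger
    {
        public static void Log(HttpRequest request)
        {
            string referrer = request.UrlReferrer != null ? request.UrlReferrer.ToString() : null;
            string connectionString =
                ConfigurationManager.ConnectionStrings["Analytics"].ConnectionString;

            using (var connection = new SqlConnection(connectionString))
            using (var command = new SqlCommand(
                "INSERT INTO PageViews (VisitedAt, Url, Referrer, Ip, UserAgent) " +
                "VALUES (@visitedAt, @url, @referrer, @ip, @userAgent)", connection))
            {
                command.Parameters.AddWithValue("@visitedAt", DateTime.UtcNow);
                command.Parameters.AddWithValue("@url", request.RawUrl);
                command.Parameters.AddWithValue("@referrer", (object)referrer ?? DBNull.Value);
                command.Parameters.AddWithValue("@ip", request.UserHostAddress);
                command.Parameters.AddWithValue("@userAgent", (object)request.UserAgent ?? DBNull.Value);

                connection.Open();
                command.ExecuteNonQuery();
            }
        }
    }

If one synchronous INSERT per request ever proves too heavy, the same call can be queued in memory and flushed in batches, and the table rolled up and purged periodically as mentioned above.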
BTW, keep in mind that Google Analytics adds a lot of value to stats - it geocodes IP addresses to show results by location (country/city) and also by ISP/IP owner. Their JavaScript does Flash detection and segments the User-Agent into useful 'browser categories', and it also detects other user settings like operating system and screen resolution. That's some non-trivial coding that you will have to do if you want to achieve the same level of reporting - not to mention the data and calculations to get entry & exit page info, returning visits, unique visitors, returning visitors, time spent on site, etc.
There is a Google Analytics API that you might want to check out, too.
Have you looked at Log Parser to parse the IIS logs?
I wouldn't have thought writing to a text file would be more efficient than writing to a database - quite the opposite, in fact. You would have to lock the text file while writing, to avoid concurrency problems, and this would probably have more of an impact than writing to a database (which is designed for exactly that kind of scenario).
I'd also be wary of re-inventing the wheel. I'm not at all clear what you think a bespoke hits logger could do better than Google Analytics, which is extremely comprehensive. Believe me, I've been down the road and written my own, and Analytics made it quite redundant.

What is recommended for monitoring traffic to my asp.net application

I have an ASP.NET 3.5 application hosted on IIS 7.0. I'm looking for a comprehensive system to monitor traffic, down to page level at a minimum. Does .NET have any specific tools, is it better to write my own, or what systems/software is freely available to use?
Thanks
Use Google Analytics. It's a small piece of JavaScript code that is inserted just before the closing </body> tag. It's based on the Urchin analytics tracking software, which Google bought. They've been doing this for a long, long time.
As long as your site is referenced using a fully qualified domain name, Google Analytics can track what you need. It's got lots of flexibility with the filter mechanism as well (lets you rewrite URLs based on query string parameters, etc.)
LOTS of functionality and well thought out as well as a pretty good API if you need to do tracking on things other than clicks.
If you have access to the IIS logs, you can use a log analyzer to interpret the data. An example is the free AWStats analyzer:
http://awstats.sourceforge.net/
An alternative (and one I recommend) is Google Analytics (http://www.google.com/analytics). This relies on you embedding a small chunk of Javascript in each page you want tracking, then Google does the grunt work for you, presenting the results in an attractive Flash-rich site.
I'd suggest trying both and seeing which suits your needs. I'd definitely recommend against rolling your own system, as the above solutions are very mature and capable. Best of luck!
You'll need a client-side / JavaScript tracking service (such as Google Analytics, but there are other good free alternatives out there) because it runs even when the user clicks the back button and the previous page (on your site) is loaded from the browser cache rather than from the server. IIS won't "see" the reload since no request is made to it.
