Could you please refer me to an example of an ASP.NET application that would allow the owner to collect a fee for viewing certain pages' content?
I have no idea where to start. Besides the technical aspect of this question, I don't know where one would get a server to host such an application. Can any computer work as one?
Thanks, and sorry if my question is too naive.
You need to look for payment gateways. They will provide you with an API you can call to process the payments. Any example would depend on the payment gateway you are using, as all of them are different.
I'd have a look at Stripe. It is quite developer friendly (you won't have to trawl through loads of docs to find out what you want to do) and the rates aren't too bad. They are similar to Sage Pay's, which is much harder to figure out and whose docs are out of date.
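To give a feel for what a gateway integration looks like, here is a rough sketch of a server-side charge using Stripe's .NET client library (Stripe.net). The API key, amount, and token handling are placeholders, and the exact class names vary between library versions, so treat this as an illustration rather than copy-paste code:

```csharp
// Sketch of charging for page access with Stripe.net (NuGet: Stripe.net).
// "sk_test_..." and the token are placeholders; real values come from your
// Stripe dashboard and from Stripe's client-side tokenization (Stripe.js).
using Stripe;

public class PaywallPayment
{
    public static Charge ChargeForAccess(string cardToken)
    {
        StripeConfiguration.ApiKey = "sk_test_your_secret_key"; // placeholder

        var options = new ChargeCreateOptions
        {
            Amount = 500,              // smallest currency unit: 500 = $5.00
            Currency = "usd",
            Source = cardToken,        // token created client-side by Stripe.js
            Description = "Access to premium page content"
        };

        var service = new ChargeService();
        // On a successful charge you would then mark the user as paid,
        // e.g. with a flag in their account record or session.
        return service.Create(options);
    }
}
```

The general shape (tokenize the card in the browser, charge it on the server, then unlock the content) is the same across most gateways; only the API surface changes.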
So, I got a dev key from CareerBuilder and I am looking to post a job on their site through their API. However, so far I have only found ways to retrieve job postings based on certain criteria. Is there a way I can post a job on their site given the provided information? (Preferably using C#, unless there is only one way.)
CareerBuilder does not publicly expose their job posting API. Posting is about the only thing you cannot do via their API.
You can see all of the exposed endpoints for the API here: http://developer.careerbuilder.com/endpoints/index#collapse4
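For the retrieval side that is exposed, a call looks roughly like the sketch below. The endpoint URL and parameter names here are taken from memory of the v1 job search API and may have changed, and "YOUR_DEV_KEY" is a placeholder for your developer key, so check the endpoint list linked above for the current shape:

```csharp
// Sketch: querying the read-only job search endpoint and getting XML back.
// URL shape and parameter names are assumptions based on the v1 API docs.
using System.Net;
using System.Xml.Linq;

public class CareerBuilderSearch
{
    public static XDocument SearchJobs(string keywords)
    {
        var url = "http://api.careerbuilder.com/v1/jobsearch" +
                  "?DeveloperKey=YOUR_DEV_KEY" +
                  "&Keywords=" + WebUtility.UrlEncode(keywords);

        using (var client = new WebClient())
        {
            // The API returns an XML document describing matching postings.
            return XDocument.Parse(client.DownloadString(url));
        }
    }
}
```

There is no equivalent write endpoint, which is why posting has to go through the DPI described below.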
However, if you want to be able to post a job to CareerBuilder and don't want to use the web site, CareerBuilder does offer you some options.
You can use their Document Post Interface (DPI). Information can be found here: http://dpi.careerbuilder.com/Site/Index.aspx
The DPI will get you what you want but it is completely separate from the API. You will need to talk to your sales rep or account manager if you are interested in posting via the DPI.
If you do not have a sales rep or account manager, you can call their 1-800 number and they can get you the information there.
Unfortunately I don't believe there is a lot of public information about the DPI. I'm a former employee of CareerBuilder and used to work on their DPI and mapping platforms.
UPDATE: Remembered this after posting. You can also email IntegrationSupport@careerbuilder.com for more information. That email address is mainly for technical support, but they can point you in the right direction.
We have a scenario where we wish to analyse the visitors to our web site and determine the following:
The operating system
The browser
Whether it is a mobile device
Whether it is a tablet device
This is purely for reporting purposes to provide to our clients, and will not be used to alter the content at all for that particular device. Ideally this would be something performed server side.
The data has to be displayed by ourselves, so we need a tool to perform the categorisation and then hand it off to our process for storage, calculation and subsequent retrieval.
We currently use the ASP.NET HttpRequest.Browser object but the information gained is not particularly useful.
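For reference, the extent of what HttpRequest.Browser gives us looks like this (the values in the comments are typical examples, not guaranteed output — they come from ASP.NET's built-in browser definition files, which is exactly why the results are often coarse or out of date):

```csharp
// Inside a Page or handler: what HttpRequest.Browser exposes today.
// These properties are populated from ASP.NET's browser definition files.
System.Web.HttpBrowserCapabilities browser = Request.Browser;

string name     = browser.Browser;        // e.g. "Chrome"
string version  = browser.Version;        // e.g. "16.0"
string platform = browser.Platform;       // e.g. "WinNT"
bool   isMobile = browser.IsMobileDevice; // note: no tablet flag at all
```

Tablet detection in particular has no built-in property, which is part of why we are looking elsewhere.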
I have spent some time looking into this scenario, the most useful options being WURFL and User Agent Info.
If my understanding of the AGPL is correct then by using WURFL and agreeing to the license we would be required to provide the source code to our application to anyone who requested it. That is out of the question and the licensing costs in our scenario are too costly (approximately $50k per annum).
User Agent Info seems a good product, however to tie a core feature of a commercial application to something that may or may not be kept up to date is a risk.
What other solutions are there out there? Does anyone have any recommendations or alternatives? We are not averse to paying for licensing, as long as the costs are realistic.
Thanks in advance.
I would use the free Google Analytics, and if you need to check things in code, I would use the JavaScript library Modernizr.
You can use custom reporting in Analytics: http://www.google.com/support/analytics/bin/answer.py?answer=98527
And you can schedule sending reports: http://www.google.com/support/analytics/bin/answer.py?hl=en&answer=57163
Luca Passani of WURFL and ScientiaMobile here. Not sure where the original poster got the $50k quote. The reality is that we have a relatively complex pricing structure to accommodate everyone's DDR needs based on how much value WURFL actually provides to them and whether they are reselling WURFL functionality in some form (be it SaaS or bundled with own software) to third-party companies.
I can only suggest that the original poster (and whoever else is interested in licensing WURFL) get in contact with ScientiaMobile through this form:
http://www.scientiamobile.com/license
We also recommend that you save your own and our time by using real email addresses (gmail, hotmail and other virtually anonymous email addresses are not OK) and, above all, provide as truthful information as possible.
Finally, to be clear, commercial licensees are relieved from the copyleft provisions of the AGPL. In addition, they get access to a personal customer vault with weekly snapshots of the data always available.
Thank you
Luca Passani
After further investigation it seems that WURFL is the most complete solution for mobile browsers and the only solution I can see for tablet detection but is slightly lacking in desktop browser support. I believe this support is to be improved in the near future.
It is then a matter of deciding whether the cost is acceptable for the application in question.
WURFL commercial licenses do not require you to release the source code to your application.
You can find the licensing costs and restrictions here
The most expensive single commercial license is $20k, and goes as low as $1500. Not sure where the $50k number is coming from, but I'd be happy to help if you have any licensing questions.
Seeing RRR's comment, further clarification is due. If your company is using WURFL as part of a SaaS service or bundled with your own software, you are effectively removing the need for your customers to license WURFL separately from ScientiaMobile. This is fine, but only as long as you cover that WURFL usage as part of your OEM license. To be clear, your customers, at that point, are not WURFL licensees, but rather authorized users of your license.
I am trying to add a feature in my website to let the user invite his email contacts to visit the website, the same as twitter and facebook are doing.
I have grown tired of trying to implement this feature for each email service (Gmail, Yahoo, MSN): whenever I succeed in implementing one, another changes something in its API and I have to start debugging all over again.
Is there an API or web service I can use that provides this feature?
Found this API, http://code.google.com/p/socialauth-net/, and it seems good; I will test it and post the result here later.
Context.IO allows you to extract the contacts directly from the email data, as long as the emails are accessible through IMAP.
http://openinviter.com/ implements those APIs. It's in PHP, but you should be able to get some idea of how to call the services from C#.
Here is a demo http://openinviter.com/demo/
This question is similar to:
https://stackoverflow.com/questions/2627722/free-api-for-friends-invite-from-gmail-yahoo-aol-hotmail-php-ajax
Sounds like the packages Gigya and Plaxo worked for that user.
http://channel9.msdn.com/events/MIX/MIX10/FT07
http://gxlive.wordpress.com/2011/09/29/getting-all-items-email-calendar-contacts-etc-from-microsoft-exchange-server-thru-c/
http://channel9.msdn.com/events/mix/mix11/FRM10
http://channel9.msdn.com/Events/TechEd/NorthAmerica/2010/DEV323
http://sourceforge.net/projects/opencontactsnet/
http://sourceforge.net/project/downloading.php?group_id=210092&filesize=178426&filename=OpenContactsNet-1.0.zip
I think you can go through all of them and find the right one for your email service.
There are no good open source solutions to this problem. I have used the commercial products from this company and they work well:
http://stescodes.com/
You can see it in action on this site: http://www.gamzoo.com
The alternative is to research each individual email provider from which you would like to download contacts and implement something custom for each. The good news is that they are all starting to use OAuth for the authentication piece, but it's still a custom job for the most part.
I've used Janrain. It's rather expensive though, and you pay for many other features, not just address book access.
Its coverage is limited to the few most popular services, but I don't think you'll get anything better using OpenID.
I am trying to detect whether a visitor is human or not. I just got an idea but am not sure if it will work: I could store a cookie in the person's browser and retrieve it when they are browsing my site. If I successfully retrieve the cookie, can this be a good technique to detect bots and spiders?
A well-designed bot or spider can certainly store -- and send you back -- whatever cookies you're sending. So, no, this technique won't help one bit.
Browsers are just code. Bots are just code. Code can do anything you program it too. That includes cookies.
Bots, spammers and the like work on the principle of low-hanging fruit. They're after as many sites or users as they can get with as little effort as possible. Thus they go after popular packages like phpBB and vBulletin because getting into those will get them into a lot of sites.
By the same token, they won't spend a lot of effort to get into your site if the effort only pays off for your site (unless your site happens to be Facebook or the like). So the best defense against malicious activity of this kind is simply to be different, in such a way that an automatic script already written won't work on your site.
But an "I am human" cookie isn't the answer.
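To make that concrete, the scheme being proposed amounts to something like the sketch below, and nothing in it distinguishes a browser from a bot — the cookie name and value are arbitrary placeholders:

```csharp
// The naive "I am human" cookie check, for illustration only.
// Any client with a cookie jar (curl, wget, most crawler frameworks)
// passes it on the second request.
using System;
using System.Web;
using System.Web.UI;

public partial class HomePage : Page
{
    protected void Page_Load(object sender, EventArgs e)
    {
        if (Request.Cookies["human"] == null)
        {
            // First visit: hand out the cookie; the visitor is "unknown".
            Response.Cookies.Add(new HttpCookie("human", "1"));
        }
        else
        {
            // The cookie came back -- but that only proves the client
            // stores and resends cookies, which bots can do too.
        }
    }
}
```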
No, as Alex says, this won't work; the typical approach is to use a robots.txt file to ask well-behaved bots to stay away. Beyond that, you can start to inspect the User-Agent string (but this can be spoofed). Any more work than this and you're into CAPTCHA territory.
What are you actually trying to avoid?
You should take a look at the information in the actual HTTP headers and how .NET exposes these things to you. The extra information you have about the person hitting your web site is there. Take a look at what your own browser sends by installing the Live HTTP Headers plugin for Firefox and visiting your own site. At the page level, the Request.Headers property exposes this information (I don't know if it's the same in ASP.NET MVC, though). The important header for what you want is User-Agent. It can be altered, obviously, but the major crawlers will let you know who they are by sending a unique User-Agent that identifies them. The same goes for the major browsers.
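A minimal version of that check could look like this — the list of substrings below is illustrative, not exhaustive, and (as noted) a User-Agent can be spoofed, so this only identifies crawlers that choose to identify themselves:

```csharp
using System;
using System.Linq;

public static class CrawlerCheck
{
    // A few substrings the major crawlers advertise in their User-Agent.
    // Illustrative only; real lists need to be maintained over time.
    private static readonly string[] KnownBots =
        { "Googlebot", "bingbot", "Slurp", "Baiduspider" };

    public static bool IsKnownCrawler(string userAgent)
    {
        if (string.IsNullOrEmpty(userAgent)) return false;
        return KnownBots.Any(bot =>
            userAgent.IndexOf(bot, StringComparison.OrdinalIgnoreCase) >= 0);
    }
}
```

In a page you would call it as `CrawlerCheck.IsKnownCrawler(Request.Headers["User-Agent"])`.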
I wrote a bot that works with both cookies and JavaScript. The easiest form of bot/spam prevention is to use the NoBot component in the AJAX Control Toolkit.
http://www.asp.net/AJAX/AjaxControlToolkit/Samples/NoBot/NoBot.aspx
Can anyone please suggest any documentation or blog posts available on the above subject?
You could probably do something with their RSS feed: by playing around with the search options you can see how they are represented in the query string, and then pull apart the returned XML...
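Pulling the feed apart is straightforward with the built-in syndication classes; a sketch, where the feed URL is a placeholder you would build from the site's search options:

```csharp
// Sketch: load a job-board RSS/Atom feed and list the items.
// Requires a reference to System.ServiceModel.dll; feedUrl is a placeholder.
using System;
using System.ServiceModel.Syndication;
using System.Xml;

public class FeedReader
{
    public static void DumpFeed(string feedUrl)
    {
        using (XmlReader reader = XmlReader.Create(feedUrl))
        {
            SyndicationFeed feed = SyndicationFeed.Load(reader);
            foreach (SyndicationItem item in feed.Items)
            {
                // Title and first link of each posting in the feed.
                Console.WriteLine("{0}: {1}", item.Title.Text,
                                  item.Links[0].Uri);
            }
        }
    }
}
```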
Ershad,
If you are looking for the resume search in Naukri and Monster, then no: they are paid service providers. You will get access to their systems once you become a paid subscriber. I am not sure whether they expose any web service APIs for this.