Screen scrape altered info from another website - C#

I'm trying to find out whether it is possible to get the result from another site after clicking some items and altering some data on that site.
For example, take a site which first asks you for some input and then gives a result. I want to capture this result, save it to a database, and display it on my own site. Is there any possibility for this?
I searched for hours and I can only find static examples.
Any help or link would be much appreciated.

It's possible. Use Fiddler to detect what the form post (assuming it's a form post) looks like. You can then recreate the post using an HttpWebRequest and swap your unique values in.
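A minimal sketch of replaying such a post with HttpWebRequest; the URL and the form field name ("query") here are placeholders for whatever Fiddler shows you:

    // Replay a form post captured with Fiddler, swapping in our own value.
    // The URL and field name below are placeholders; use what Fiddler shows.
    using System;
    using System.IO;
    using System.Net;
    using System.Text;

    class FormPostReplay
    {
        static void Main()
        {
            string postData = "query=" + Uri.EscapeDataString("my input value");
            byte[] bytes = Encoding.UTF8.GetBytes(postData);

            var request = (HttpWebRequest)WebRequest.Create("http://example.com/search");
            request.Method = "POST";
            request.ContentType = "application/x-www-form-urlencoded";
            request.ContentLength = bytes.Length;

            using (Stream body = request.GetRequestStream())
                body.Write(bytes, 0, bytes.Length);

            using (var response = (HttpWebResponse)request.GetResponse())
            using (var reader = new StreamReader(response.GetResponseStream()))
            {
                string html = reader.ReadToEnd(); // this is the result to parse and save
                Console.WriteLine(html);
            }
        }
    }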

Related

Is there a way in C# to get a 'browser' to pre-process a WebRequest so you can work with the elements that you see when you 'View Source'

I want to make a web request to e.g. http://finance.yahoo.com/q?s=rb.l and extract the share price. However, the text returned is the raw HTML from before the browser has processed it, and the <span></span> element I need to look for only exists after that processing.
Is this possible, or should I be looking at doing it another way?
Similarly, any reliable 15-minute-delayed free stock service for the LSE, or any other way of obtaining this data given just the ticker code, would be great.
There are two questions here: first, how to programmatically access data on a page after allowing JavaScript and the like to run on it, as if it were being read by a real browser; second, how to get stock ticker information programmatically.
To answer the first question: You could use something like WebDriver .NET to literally instantiate a browser that opens the page, and then access elements on the page.
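For example, a minimal sketch with Selenium WebDriver for .NET; the element id used here is hypothetical, so inspect the live page to find the real selector:

    // Drive a real browser so the page's JavaScript runs, then read the DOM.
    // Requires the Selenium WebDriver packages; the element id below is
    // hypothetical -- inspect the live page to find the right selector.
    using System;
    using OpenQA.Selenium;
    using OpenQA.Selenium.Firefox;

    class SharePriceScrape
    {
        static void Main()
        {
            IWebDriver driver = new FirefoxDriver();
            try
            {
                driver.Navigate().GoToUrl("http://finance.yahoo.com/q?s=rb.l");
                IWebElement price = driver.FindElement(By.Id("share-price")); // hypothetical id
                Console.WriteLine("Share price: " + price.Text);
            }
            finally
            {
                driver.Quit();
            }
        }
    }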
To answer the second question, I suggest you try to search for that question directly, since it's a common enough problem that you'll probably find a number of people who have answered it already.

What technology allows page content to change without changing the URL?

I have seen this on some survey websites. What C# code do they use, so that when the "Next" button is clicked the same aspx page is maintained
without having any query string;
without any change, even a single character, in the URL; and
with the grid, the data, the content, and the questions still changing?
Can anyone give a code-wise example how to achieve this?
My main question is how this is done in code-behind: changing the data of the page while maintaining the same URL.
Nothing simpler than a session, maintained on the server side. Store a "current question number" in session, increment it on each successful postback, and you have what you are asking about.
Another possibility: a cookie which contains the "current question number".
Both the cookie and the session are invisible in the query string, of course.
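A minimal code-behind sketch of the session approach; the session key and the LoadQuestion helper are illustrative names, not part of any framework:

    // Keep a question counter in session; the URL never changes because the
    // page posts back to itself. "CurrentQuestion" and LoadQuestion are
    // illustrative names.
    protected void Page_Load(object sender, EventArgs e)
    {
        if (Session["CurrentQuestion"] == null)
            Session["CurrentQuestion"] = 1;

        if (!IsPostBack)
            LoadQuestion((int)Session["CurrentQuestion"]);
    }

    protected void NextButton_Click(object sender, EventArgs e)
    {
        int current = (int)Session["CurrentQuestion"];
        Session["CurrentQuestion"] = current + 1;   // advance on successful postback
        LoadQuestion(current + 1);                  // rebind grid/content for the new question
    }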
"change data of page and maintain same url." Answer is Server.Transfer.
This method will preserve url.
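A minimal sketch; the target page name is a placeholder:

    // Server.Transfer executes another page on the server without a redirect,
    // so the browser's address bar keeps the original URL.
    protected void NextButton_Click(object sender, EventArgs e)
    {
        Server.Transfer("NextQuestion.aspx"); // placeholder page name
    }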
The Next button may submit a form using the HTTP POST method. The form data may contain the session, question and response data. The site uses that to build a new response. Unlike a GET, a POST does not incorporate data into the URL.
Developers will typically accomplish this with AJAX. The basic premise is that only a certain portion of the page (e.g. a grid or content area) makes a server call and retrieves the results (using JavaScript). The effect is that there has been no full postback, which is why you don't see the URL or parameters changing.
It is possible to do this using jQuery, pure JavaScript, or Microsoft's UpdatePanel.
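On the server side, one simple option is an ASP.NET page method that client-side JavaScript calls instead of posting the whole page back. A minimal sketch, with illustrative method and parameter names (calling it via PageMethods requires a ScriptManager with EnablePageMethods="true"):

    // Server side of an AJAX call in ASP.NET: a static page method the
    // client can hit without a full postback. Names are illustrative.
    using System.Web.Services;

    public partial class Survey : System.Web.UI.Page
    {
        [WebMethod]
        public static string GetQuestion(int questionNumber)
        {
            // Look up and return the next question; the URL never changes
            // because only this method is called, not the whole page.
            return "Question " + questionNumber + " text goes here";
        }
    }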
oleksii's comment has some good links as well:
That's the AJAX magic. There are many jQuery plugins for this, for example this one with a live demo. You can also program it easily using jQuery Get or Post or any other wrapper that uses the XmlHttpRequest object.

How to programmatically scrape the Fan Box iframe using C# or JS/jQuery and obtain specific data?

I have a new design requirement that uses the Facebook fan box in an unorthodox way. Instead of 'hacking' the fan box, I'd rather scrape the (hidden) fan box iframe and retrieve the contents of the grid_item class in the .fan_box class (i.e. .fan_box .connections_grid .grid_item). I basically need the URLs of the face images and their links.
I'd prefer .NET methodology or JS/Jquery and something to get me started and pointed in the right direction. Please don't just provide a basic method to pull in a webpage to scrape, that's not the point. It's the iframe and accessing the data within it, is the challenge here.
I've not seen anyone try this and have searched thoroughly. I am not an expert, so please give me more direction than you would give a brain-dead monkey. Thanks.
Why not just access the page's /feed connection via the API and display it in your own style?
Start here: https://developers.facebook.com/docs/reference/api/page/ (bear in mind you'll need an access token to access the feed)
A sample of the return type is here:
https://developers.facebook.com/tools/explorer/?method=GET&path=19292868552%2Ffeed
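A minimal sketch of fetching the feed from C#; PAGE_ID and ACCESS_TOKEN are placeholders, and the JSON parsing is left to your library of choice:

    // Fetch a page's feed from the Facebook Graph API instead of scraping
    // the fan box iframe. PAGE_ID and ACCESS_TOKEN are placeholders.
    using System;
    using System.IO;
    using System.Net;

    class FanPageFeed
    {
        static void Main()
        {
            string url = "https://graph.facebook.com/PAGE_ID/feed?access_token=ACCESS_TOKEN";
            var request = (HttpWebRequest)WebRequest.Create(url);

            using (var response = (HttpWebResponse)request.GetResponse())
            using (var reader = new StreamReader(response.GetResponseStream()))
            {
                string json = reader.ReadToEnd();
                Console.WriteLine(json); // parse with e.g. Json.NET and render your own markup
            }
        }
    }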

How to call a Web Page and automatically fill in a Form, passing a String

I use ASP.NET 4 with C#.
I would like to populate an input text form with a string sent by a user.
The destination page is:
http://www.maxmind.com/app/locate_demo_ip
NOTE: I'm not the developer of the target page.
Here is how it should work:
When a visitor on my site clicks a link (an IP address),
they will be sent to http://www.maxmind.com/app/locate_demo_ip
and the TextBox will automatically be populated with that value (the IP address the user clicked).
The user will then manually click the "Look Up IP addresses" button on maxmind.com to get the result.
Any idea how to do it? Maybe a sample of code?
Thanks guys as usual for your great support! :-)
If you can, generate a link of this form:
http://www.maxmind.com/app/locate_demo_ip?ip=XX.XX.XX.XX
Then the page can access this value using txt1.Text = Page.Request.QueryString["ip"]
[Edit] This assumes that you are the developer of the target page... is that the case?
You tell me you are not the developer.
Either MaxMind provides a URL syntax similar to the one above (check whether there is an API section), or you will have to inject the value with JavaScript. In that case, you have to know:
for security reasons, to avoid cross-site scripting attacks, you cannot pilot an external site from one page to another. You could perhaps add your application to the trusted zone of the client computer, but that is not possible in an Internet application;
nothing guarantees that the HTML structure of maxmind.com won't change in the future; you cannot rely on it.
Another approach would be to "proxy" the MaxMind feature by calling the target page yourself, from your server application, with an HTTP POST request (see the sketch after this list). You can then parse the results and use them in your application. Again, there are some limitations to consider:
MaxMind may disallow such calls; they may want the user to use their application;
again, the target page may change its structure and the textbox names;
parsing the result can give you a headache... and the output structure may change (again);
you have to handle the UI related to this feature yourself.
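A minimal sketch of that proxy call using WebClient; the form field name "ips" is a guess, so inspect the real form for the actual field name:

    // Server-side "proxy": post the IP to the MaxMind demo page ourselves
    // and parse the returned HTML. The field name "ips" is a guess.
    using System;
    using System.Collections.Specialized;
    using System.Net;
    using System.Text;

    class MaxMindProxy
    {
        static string LookUp(string ip)
        {
            using (var client = new WebClient())
            {
                var form = new NameValueCollection();
                form["ips"] = ip; // guessed field name; inspect the real form

                byte[] responseBytes = client.UploadValues(
                    "http://www.maxmind.com/app/locate_demo_ip", "POST", form);
                return Encoding.UTF8.GetString(responseBytes); // raw HTML to parse
            }
        }
    }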
A final thought: what is your goal? Maybe there are other ways to achieve it.

Replicate Printed Form With Web Form

I'm hoping people have some ideas to help solve this problem.
I am developing a C# ASP.NET website and the client requires an online form that users will fill in and submit. OK, so far so good.....
Imagine, say, a form that you fill in on paper - they normally have a distinctive look specific to the company and will be filed, quite possibly as a legally binding document.
I need to have an online form that when submitted emails the client with something they can print out and will look exactly like their printed forms.
As this is web based, I think the option of capturing a screenshot is out of the question, so I'm wondering how best to approach this.
Even if I just had a form that captures the data presented the way I want, how could I translate that data into the view they want?
Any ideas and suggestions greatly appreciated.
You'll need to take the raw data that was submitted and import it into a standard document (likely PDF). You can use Crystal Reports or another reporting solution, or go direct to PDF using one of the many .NET PDF solutions out there.
I don't think you'd even want to deal with making the document physically match the screen - much easier to make the web look like the web, and make the printed doc look like a printed doc.
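A minimal sketch with one such library (iTextSharp here; other .NET PDF libraries work similarly), assuming the values come from the submitted form:

    // Build a simple PDF from submitted form data using iTextSharp
    // (one of several .NET PDF libraries); layout is deliberately minimal.
    using System.IO;
    using iTextSharp.text;
    using iTextSharp.text.pdf;

    class FormPdf
    {
        static void Create(string name, string answer, string path)
        {
            var doc = new Document(PageSize.A4);
            PdfWriter.GetInstance(doc, new FileStream(path, FileMode.Create));
            doc.Open();
            doc.Add(new Paragraph("Company Form"));   // style this to match the paper form
            doc.Add(new Paragraph("Name: " + name));
            doc.Add(new Paragraph("Answer: " + answer));
            doc.Close();
        }
    }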
Print a page (like this one) from a browser and notice all the headers and footers.
If you want serious control over how it is going to look, you will need to generate a PDF (or maybe XPS).
Couldn't you just use a separate page with CSS that gives the desired look and feel?
