Call classic ASP function from ASPX - C#

I am working on an old web application where pages were written in classic ASP and are wrapped in ASPX pages using iframes. I am rewriting one of those pages in ASP.NET (C#), removing the dependency on iframes altogether. The page_to_rewrite.asp calls many functions defined in other ASP pages in the same application.
I am having difficulty calling those ASP functions from aspx.cs. I tried using the WebClient class like this:
using (WebClient wc = new WebClient())
{
    Stream _stream = wc.OpenRead("http://localhost/Employee/finance_util.asp?function=GetSalary&EmpId=12345");
    StreamReader sr = new StreamReader(_stream);
    string s = sr.ReadToEnd();
    sr.Close();
    _stream.Close();
}
Every request to this application is checked for a valid session cookie by an IIS HTTP module, and if it is not present the user is redirected to the login page. So when I call this ASP page's URL from the ASPX code-behind, I get my application's login page as the response, because no session cookie is present.
Can anyone suggest how I can call the ASP functions successfully?

As @Schadensbegrenzer pointed out in the comments, I just had to pass the session cookie in the request header like this:
using (WebClient wc = new WebClient())
{
    wc.Headers[HttpRequestHeader.Cookie] = "SessionID=" + Request.Cookies["SessionID"].Value;
    Stream _stream = wc.OpenRead("http://localhost/Employee/finance_util.asp?function=GetSalary&EmpId=12345");
    StreamReader sr = new StreamReader(_stream);
    string s = sr.ReadToEnd();
    sr.Close();
    _stream.Close();
}
In other similar questions on Stack Overflow, some people have suggested also including a User-Agent header in the request, because some web servers return a blank response without one. See if that helps in your case; mine worked even without it.
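For example, it can be set on the same WebClient before calling OpenRead; the value here is just a placeholder:
wc.Headers[HttpRequestHeader.UserAgent] = "Mozilla/5.0"; // some servers return an empty response without a User-Agent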
You will also have to handle the request in your ASP page, something like this:
Dim param_1
Dim param_2
Dim output

param_1 = Request.QueryString("function")
param_2 = Request.QueryString("EmpId")

If param_1 = "GetSalary" Then
    output = GetSalary(param_2)
    Response.Write output
End If
Hope it helps!

Related

Post Not Publishing to Page via Facebook API

All I am trying to do is post to a page using the API. This task was extremely simple with Twitter, but with Facebook it has been very challenging.
I am using the following code:
string url = @"https://graph.facebook.com/{page_id}/feed?message=Hello&access_token={app_id}|{app_secret}";
WebClient client = new WebClient();
Stream data = client.OpenRead(url);
StreamReader reader = new StreamReader(data);
string s = reader.ReadToEnd();
Console.WriteLine(s);
It returns data like this:
{"data":[{"story":"Page updated their cover photo.","created_time":"2017-03-13T22:49:56+0000","id":"1646548358..._164741855..."}...
But, the post is never seen on the page! How can I successfully post from my app to my page?
Your request needs to be a POST request; the data you're getting back shows that OpenRead performed a GET, which reads the page's feed instead of creating a post.
You also need the publish_pages permission to post to a page successfully.
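A minimal sketch of the POST version with WebClient, keeping the placeholders from the question (in practice this call likely needs a page access token with the required permission rather than the {app_id}|{app_secret} app token):
using (WebClient client = new WebClient())
{
    var values = new NameValueCollection();
    values["message"] = "Hello";
    values["access_token"] = "{page_access_token}"; // placeholder
    // UploadValues issues a POST; on success the Graph API returns the new post's id as JSON
    byte[] response = client.UploadValues("https://graph.facebook.com/{page_id}/feed", values);
    Console.WriteLine(Encoding.UTF8.GetString(response));
}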

C# loading the HTML of the webpage currently displayed

I am trying to make a small app that can log in to a website automatically, grab certain text from the site, and return it to the user.
To show what I have so far, this is how I log in:
System.Windows.Forms.HtmlDocument doc = logger.Document as System.Windows.Forms.HtmlDocument;
try
{
    doc.GetElementById("loginUsername").SetAttribute("value", "myusername");
    doc.GetElementById("loginPassword").SetAttribute("value", "mypassword");
    doc.GetElementById("loginSubmit").InvokeMember("click");
}
catch { /* one of the elements was not found */ }
And this to load the HTML of the page:
WebClient myClient = new WebClient();
Stream response = myClient.OpenRead(webbrowser.Url);
StreamReader reader = new StreamReader(response);
string src = reader.ReadToEnd(); // finally reading html and saving in variable
Now, it successfully loads HTML, but it is the HTML of the page as seen when not logged in. Is there a way to get the HTML of the current, logged-in page? Or another way to achieve my goal? Thank you for reading!
Use the WebClient class in a way that preserves sessions and cookies.
Check this Q&A: Using WebClient or WebRequest to login to a website and access data
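For reference, the pattern from that Q&A is roughly a WebClient subclass that keeps a CookieContainer, so the cookies set by the login POST are sent again on later requests. This is only a sketch: the URLs are placeholders, and the form field names are guessed from the element IDs in the question, so the site's actual POST parameters may differ.
public class CookieAwareWebClient : WebClient
{
    public CookieContainer Cookies { get; } = new CookieContainer();

    protected override WebRequest GetWebRequest(Uri address)
    {
        var request = base.GetWebRequest(address);
        var http = request as HttpWebRequest;
        if (http != null)
            http.CookieContainer = Cookies; // same cookie container for every request made by this client
        return request;
    }
}

// Usage sketch:
using (var client = new CookieAwareWebClient())
{
    var form = new NameValueCollection();
    form.Add("loginUsername", "myusername"); // assumed field names
    form.Add("loginPassword", "mypassword");
    client.UploadValues("https://example.com/login", form);          // log in; session cookies are stored
    string src = client.DownloadString("https://example.com/page");  // request is sent with the session cookie
}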
Why don't you make REST API calls and send the data, such as the username and password, from your code itself?
Is there a Web API for the URL? If so, you can simply call the service and pass the required parameters. The API will return JSON/XML, which you can parse to extract the information.

Reusing a session created with another website

We have an aspx page with some Page_Load code that calls a third party's TokenGenerator.aspx page to generate an SSO token:
private string GetSSOToken()
{
    using (WebClient client = new WebClient())
    {
        // serverUri is https://theirsite.com/Token.aspx, ssoRequestParam is a NameValueCollection
        byte[] responsebytes = client.UploadValues(serverUri.AbsoluteUri, "POST", ssoRequestParam);
        var ssoToken = Encoding.UTF8.GetString(responsebytes);
        return ssoToken;
    }
}
The returned ssoToken is added to a URL used as an iframe src. It ends up looking like this:
var frameUrl = "https://theirsite.com/SSO.aspx?ssotoken=returnedToken";
frame.Attributes["src"] = frameUrl;
The page then loads in the browser and all is well. This has been working fine for a while.
Now we need to add a way to log out of SSO.aspx. In our Logout() method, when logging out of our own application, I try calling their logout page:
using (WebClient client = new WebClient())
{
    var logoutUrl = "https://theirsite.com/SSO.aspx?logout=true";
    var s = client.DownloadString(logoutUrl);
}
But the logout never happens: their application doesn't register the logout, and I can still paste the frameUrl into a browser and see the page.
As a test, here is what does and does not log out:
1. Generate the frameUrl.
2. Copy/paste the frameUrl into a browser: it loads fine.
3. In a new, separate browser window, paste the logoutUrl: the logout does not happen, and the frameUrl still loads.
4. In another tab of the same browser, paste the logoutUrl: the logout happens, and the frameUrl gives the appropriate "token expired" message.
So I am guessing this is because the first WebClient session is not the same as the second WebClient session... but I am not sure how to reuse the first session during logout.
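One way to share a single cookie-tracked session between the token request and the logout request, assuming the third party ties its SSO session to cookies set on the Token.aspx response (a sketch only; the client and its cookie container would have to live at least as long as the user's session, e.g. stored in Session state, and it uses System.Net.Http and System.Linq):
var cookies = new CookieContainer();
var handler = new HttpClientHandler { CookieContainer = cookies };
var client = new HttpClient(handler);

// Token request: any Set-Cookie headers from theirsite.com land in "cookies".
var tokenResponse = client.PostAsync(serverUri,
    new FormUrlEncodedContent(ssoRequestParam.AllKeys
        .ToDictionary(k => k, k => ssoRequestParam[k]))).Result;
string ssoToken = tokenResponse.Content.ReadAsStringAsync().Result;

// Later, on logout: same client, same cookie container, so theirsite.com sees the same session.
client.GetAsync("https://theirsite.com/SSO.aspx?logout=true").Wait();
That said, the manual test above hints that the SSO session may actually live in the user's browser (the logout only works from the same browser that loaded the frameUrl); if so, the logout URL would have to be requested by the browser itself, for example in a hidden iframe, rather than from the server.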

WebClient redirect to response page after POST of form

I'm having some trouble trying to POST some form values to a page and redirect the user. For example:
User clicks register button
Some form values are populated
WebClient POSTS these values and redirects to the external URL
I can achieve this easily without WebClient, using an intermediate .aspx page and a standard form with onload="document.forms[0].submit()" added to the body.
However, I would like to do this without an intermediate page. Is this possible using WebClient (or something similar)?
I post my values like so:
using (WebClient wc = new WebClient())
{
    NameValueCollection nvc = new NameValueCollection(8);
    nvc.Add("_charset_", "utf-8");
    nvc.Add("shop_id", "");
    nvc.Add("email", "");
    nvc.Add("amount", "");
    nvc.Add("curr", "");
    nvc.Add("paymentType", "");
    nvc.Add("kdnr", "");
    nvc.Add("ordernr", "");
    byte[] response = wc.UploadValues("https://www.trustedshops.com/shop/protection.php", nvc);
}
I can get the response from the byte array, which is just the HTML of the page I post to; however, I want to actually send the user to the posted page so that they see the form pre-populated. Can this be done?
Thanks
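For what it's worth, the intermediate-page trick described in the question can also be generated inline from the register button's handler, since a server-side WebClient POST cannot take the user's browser along to the response page. A rough sketch with placeholder values (shopId and the other field variables are hypothetical):
var sb = new System.Text.StringBuilder();
sb.Append("<html><body onload=\"document.forms[0].submit()\">");
sb.Append("<form method=\"post\" action=\"https://www.trustedshops.com/shop/protection.php\">");
sb.Append("<input type=\"hidden\" name=\"_charset_\" value=\"utf-8\" />");
sb.AppendFormat("<input type=\"hidden\" name=\"shop_id\" value=\"{0}\" />", HttpUtility.HtmlEncode(shopId)); // shopId is a placeholder
// ... one hidden input per remaining field (email, amount, curr, paymentType, kdnr, ordernr) ...
sb.Append("</form></body></html>");
Response.Write(sb.ToString());
Response.End();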

C# Downloading HTML from a website after logging in

I've recently been looking into how to get data from a website using C#. I tried using the WebBrowser object to navigate and log in, and that worked fine, but I keep running into the same problem: when I navigate to the page I want, I get disconnected.
I've tried several things, like making sure that only one HtmlDocument exists, but I still get logged out.
TL;DR: how do you stay logged in, from page to page, while navigating a website with WebBrowser? Or are there better alternatives?
EDIT: So far I have the following code:
currentWebBrowser = new WebBrowser();
currentWebBrowser.DocumentText = @"<head></head><body></body>";
currentWebBrowser.Url = new Uri("about:blank");
currentWebBrowser.Navigate("http://google.com");

HttpWebRequest Req = (HttpWebRequest)WebRequest.Create("http://google.com");
Req.Proxy = null;
Req.UseDefaultCredentials = true;
HttpWebResponse Res = (HttpWebResponse)Req.GetResponse();
currentWebBrowser.Document.Cookie = Res.Cookies.ToString();
At which moment should I get the cookies? And is my code correct?
You have to preserve the cookies returned by your login request and reuse them on all subsequent requests; the authentication cookie is what tells the server that you are already logged in. E.g. see here on how to do that.
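A minimal sketch of that idea with HttpWebRequest (matching the style of the code in the question); the URLs, field names, and credentials are placeholders for whatever the target site's login form actually expects:
var cookies = new CookieContainer();

// Login request: the server's Set-Cookie headers (including the auth cookie) are captured in "cookies".
var login = (HttpWebRequest)WebRequest.Create("https://example.com/login");
login.Method = "POST";
login.ContentType = "application/x-www-form-urlencoded";
login.CookieContainer = cookies;
using (var writer = new StreamWriter(login.GetRequestStream()))
    writer.Write("username=me&password=secret");
login.GetResponse().Close();

// Subsequent request: the same container sends the auth cookie back, so the server treats you as logged in.
var page = (HttpWebRequest)WebRequest.Create("https://example.com/members/data.html");
page.CookieContainer = cookies;
using (var reader = new StreamReader(page.GetResponse().GetResponseStream()))
    Console.WriteLine(reader.ReadToEnd());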
