Here I am passing the session ID on a button click from one ASP.NET application to another ASP.NET application.
Application 1:
protected void imgBTN_Click(object sender, EventArgs e)
{
string sessionKey = HttpContext.Current.Session.SessionID;
HttpWebRequest req = (HttpWebRequest)WebRequest.Create(@"http://localhost:43392/PartnerHome.aspx");
string svcCredentials = Convert.ToBase64String(ASCIIEncoding.ASCII.GetBytes(sessionKey));
req.Headers.Add("Authorization", "Basic " + svcCredentials);
try
{
using (WebResponse svcResponse = (HttpWebResponse)req.GetResponse())
{
using (StreamReader sr = new StreamReader(svcResponse.GetResponseStream()))
{
string jsonTxt = sr.ReadToEnd();
}
}
}
catch (Exception c)
{
}
}
My problem is how to retrieve this session ID in the Page_Load of my second ASP.NET application.
Application 2:
protected void Page_Load(object sender, EventArgs e)
{
}
Any suggestions?
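For what it's worth, here is a minimal sketch (assuming the Authorization header is sent exactly as in Application 1) of how PartnerHome.aspx could read the value back in its Page_Load. Note that the header is only present on the server-to-server request that Application 1 makes, not when a user browses to the page directly:
protected void Page_Load(object sender, EventArgs e)
{
    // read the Basic Authorization header added by Application 1
    string authHeader = Request.Headers["Authorization"];
    if (!string.IsNullOrEmpty(authHeader) && authHeader.StartsWith("Basic "))
    {
        // strip the "Basic " prefix and decode the Base64 payload
        string encoded = authHeader.Substring("Basic ".Length).Trim();
        string sessionKey = System.Text.Encoding.ASCII.GetString(Convert.FromBase64String(encoded));
        // sessionKey now holds the SessionID string that Application 1 sent
    }
}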
Another solution:
What I would do is add an invisible image (a transparent pixel) on the page of application 1:
<img src="http://site2.domain.com/transpimage.aspx">
The invisible image would actually be an ASP.NET page hosted by site 2.
In the Page_Load event of transpimage.aspx I would output a tiny image:
public void Page_Load(...)
{
using(Bitmap image = new Bitmap(1,1))
using(MemoryStream stream = new MemoryStream())
{
image.Save(stream, System.Drawing.Imaging.ImageFormat.Png);
Response.ContentType = "image/png";
stream.WriteTo(Response.OutputStream);
Response.End();
}
}
Since the image from application 2 is served by an aspx page (not a static image file), it will keep the session of application 2 alive. I've done this before and it tends to be fairly robust.
If you are familiar with HttpHandlers you can write the "image page" as a handler, but don't forget that if you do it as a handler it needs to implement the IRequiresSessionState interface to actually access the session.
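For reference, a minimal sketch of what such a handler might look like (the file and class names here are illustrative, not from the original post):
using System.Drawing;
using System.Drawing.Imaging;
using System.IO;
using System.Web;
using System.Web.SessionState;

// Example transpimage.ashx code-behind; IRequiresSessionState gives the handler
// access to the session, so each request keeps the session of site 2 alive.
public class TranspImageHandler : IHttpHandler, IRequiresSessionState
{
    public void ProcessRequest(HttpContext context)
    {
        using (Bitmap image = new Bitmap(1, 1))
        using (MemoryStream stream = new MemoryStream())
        {
            image.Save(stream, ImageFormat.Png);
            context.Response.ContentType = "image/png";
            stream.WriteTo(context.Response.OutputStream);
        }
    }

    public bool IsReusable
    {
        get { return false; }
    }
}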
Hope this helps!
Related
We have an old ASP.NET application. We have an image element on one of the pages. For security purposes we would like to stream the image instead of referencing it directly. However, after the data link is generated from the stream, it appears to be permanent. Even after the application is closed, the data link is still alive.
Is there a way to make this link invalid when out of session, to somehow invalidate it, by maybe emptying a stream?
Please let me know
Thank you in advance
MemoryStream ms = new MemoryStream();
string physicalPath = "some path of an image";
using (FileStream fs = new FileStream(physicalPath, FileMode.Open, FileAccess.Read))
{
fs.CopyTo(ms);
}
img.ImageUrl = "data:image/png;base64," + Convert.ToBase64String(ms.ToArray(), 0, ms.ToArray().Length);
Found a solution
I changed the above code to
Session["ImageStream"] = true;
img.ImageUrl = "MyImage.aspx";
Where the code for the MyImage.aspx is
protected void Page_Load(object sender, EventArgs e)
{
    if (Session["ImageStream"] != null && (bool)Session["ImageStream"])
    {
        string somePath = "Some path";
        byte[] imageBytes = File.ReadAllBytes(somePath);
        Response.ContentType = "image/png"; // set the content type so the browser renders the bytes as an image
        Response.BinaryWrite(imageBytes);
    }
}
You may consider adding some JavaScript that changes the element's "src" attribute when the session has expired.
I am able to upload files to an API, but I need a little help. Right now the file path is hard-coded. In practice I will have PDF and XML files in two different local file storage locations; I need to get the files from those locations and upload them to the API. Can anyone help me achieve this?
private async void btnsubmit_Click(object sender, EventArgs e)
{
await UploadFileAsync(@"D:\test\SBP-1102.pdf");
}
public static async Task UploadFileAsync(string path)
{
HttpClient client = new HttpClient();
// we need to send a request with multipart/form-data
var multiForm = new MultipartFormDataContent();
// add file and directly upload it
FileStream fs = File.OpenRead(path);
multiForm.Add(new StreamContent(fs), "files", Path.GetFileName(path));
// send request to API
var url = "https://spaysaas-dev/api/getOCRDocuments";
var response = await client.PostAsync(url, multiForm);
if(response.IsSuccessStatusCode)
{
MessageBox.Show("Success");
}
else
{
MessageBox.Show(response.ToString());
}
}
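A minimal sketch, assuming the PDF and XML files live in two known local folders (the paths below are placeholders, not real locations), of how the hard-coded call could be replaced by enumerating both folders and uploading each file:
// requires using System.IO;
private async void btnsubmit_Click(object sender, EventArgs e)
{
    // placeholder paths; substitute the real PDF and XML storage locations
    string pdfFolder = @"D:\storage\pdf";
    string xmlFolder = @"D:\storage\xml";

    foreach (string file in Directory.GetFiles(pdfFolder, "*.pdf"))
    {
        await UploadFileAsync(file);
    }

    foreach (string file in Directory.GetFiles(xmlFolder, "*.xml"))
    {
        await UploadFileAsync(file);
    }
}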
This answer is incomplete in that it doesn't actually explain why the file isn't being uploaded, but it might help you diagnose the problem.
The documentation on WebClient.UploadFileAsync says:
The file is sent asynchronously using thread resources that are automatically allocated from the thread pool. To receive notification when the file upload completes, add an event handler to the UploadFileCompleted event.
So you could try handling WebClient.UploadFileCompleted and checking the UploadFileCompletedEventArgs for errors.
private void Upload(string fileName)
{
var client = new WebClient();
client.UploadFileCompleted += Client_UploadFileCompleted;
try
{
var uri = new Uri("https://saas-dev/api/getDocs");
client.Headers.Add("fileName", System.IO.Path.GetFileName(fileName));
client.UploadFileAsync(uri, fileName);
}
catch (Exception ex)
{
MessageBox.Show(ex.Message);
}
}
private void Client_UploadFileCompleted(object sender, UploadFileCompletedEventArgs e)
{
// Check e.Error for errors
if (e.Error != null)
{
MessageBox.Show(e.Error.Message);
}
}
I am able to upload the PDF file to an API using multipart form data.
I am running an ASP.NET 4.5 application. On one of the pages the user must answer several questions and then push a button to finish the test. My application uses a text file to analyze the user's answers. If the user does everything quickly the application works fine, but when it takes longer than 20 minutes for him to finish the test I get an exception:
Cannot read from a closed TextReader
I do not understand what's wrong, because I open the StreamReader only when the button is pressed. This is part of my code:
protected void Page_Load(object sender, EventArgs e)
{
if (!IsPostBack)
{
GlobalVariables.surname = Request.QueryString["surname"];
GlobalVariables.name = Request.QueryString["name"];
GlobalVariables.gender = Request.QueryString["gender"];
GlobalVariables.age = int.Parse(Request.QueryString["age"]);
}
Label1.Width = 700;
Button1.Click += new EventHandler(this.Button1_Click);
}
void Button1_Click(Object sender, EventArgs e)
{
var f0= new FileStream(Server.MapPath("./key.txt"), FileMode.Open, FileAccess.Read);
StreamReader sr = new StreamReader(f0);
//.....
sr.Close();
sr.Dispose();
}
Could somebody help me please?
when it takes longer than 20 minutes for him to finish the test I get an exception
That sounds a lot like their session expired. To fix this, I recommend adding some javascript to establish a heartbeat to the web server. The heartbeat will keep the session alive; it doesn't need to do anything other than simply make a request every minute or so, so the server knows you're still there.
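As a rough sketch (the file and class names are illustrative, not from the original code), the server side of such a heartbeat can be a trivial handler that loads the session, which is enough to reset the timeout; the page would then request it from JavaScript every minute or so:
using System.Web;
using System.Web.SessionState;

// Example Heartbeat.ashx: because the handler implements IRequiresSessionState,
// every request to it loads the session and resets its expiration timer.
public class Heartbeat : IHttpHandler, IRequiresSessionState
{
    public void ProcessRequest(HttpContext context)
    {
        string sessionId = context.Session.SessionID; // touching the session keeps it alive
        context.Response.ContentType = "text/plain";
        context.Response.Write("ok");
    }

    public bool IsReusable
    {
        get { return false; }
    }
}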
In addition to the answer from Joel, I would recommend separating the processing of the file from reading it.
List<string> lines = new List<string>();
using (var f0 = new FileStream(Server.MapPath("./key.txt"), FileMode.Open, FileAccess.Read))
{
string line;
using (StreamReader reader = new StreamReader(f0))
{
while ((line = reader.ReadLine()) != null)
{
lines.Add(line);
}
}
}
// it would need to be a very big text file to be a memory issue
// do your processing here
If the page is not a postback, you would want to set up the page as it should be viewed the first time. I would also suggest moving anything that needs to be set up before a postback inside the if (!Page.IsPostBack) block. Move your StreamReader work to the else branch, like so: if (!Page.IsPostBack) { ... } else { /* stream reader stuff */ }, and remove the explicit button click event wiring altogether, since the button is what causes the postback.
protected void Page_Load(object sender, EventArgs e)
{
if (!IsPostBack)
{
GlobalVariables.surname = Request.QueryString["surname"];
GlobalVariables.name = Request.QueryString["name"];
GlobalVariables.gender = Request.QueryString["gender"];
GlobalVariables.age = int.Parse(Request.QueryString["age"]);
Label1.Width = 700;
}
else
{
DoPostBackStuff();
}
}
private void DoPostBackStuff()
{
using (var f0 = new FileStream(Server.MapPath("./key.txt"), FileMode.Open, FileAccess.Read))
using (StreamReader sr = new StreamReader(f0))
{
//.....
}
}
I'm trying to load a remote image from my Amazon S3 bucket and send it to the browser in binary. I'm also trying to learn ASP.NET at the same time. I've been a classic ASP programmer for many years and need to change. I started yesterday and have my first headache today.
On a page in my application I have this image element:
<img src="loadImage.ashx?p=rqrewrwr">
and on loadImage.ashx, I have this exact code:
-------------------------------------------------
<%# WebHandler Language="C#" Class="Handler" %>
string url = "https://............10000.JPG";
byte[] imageData;
using (WebClient client = new WebClient()) {
imageData = client.DownloadData(url);
}
public void ProcessRequest(HttpContext context)
{
context.Response.OutputStream.Write(imageData, 0, imageData.Length);
}
-------------------------------------------------
There is probably quite a lot wrong with this, as it's my first attempt at .NET and I don't know what I'm doing. To start with, I'm getting the following error, but I'm sure there's more to come.
CS0116: A namespace does not directly contain members such as fields or methods
This is on line 3, which is string url = "https://............"
For an HttpHandler, you have to put the code in the code-behind... if you expand loadimage.ashx in Solution Explorer, you should see a loadimage.ashx.cs file. That file is where your logic should be, and all of it should be in the ProcessRequest method.
So loadimage.ashx should be basically empty:
<%# WebHandler Language="C#" Class="loadimage" %>
And loadimage.ashx.cs should contain the rest:
using System.Net;
using System.Web;
public class loadimage : IHttpHandler
{
public void ProcessRequest(HttpContext context)
{
string url = "https://............10000.JPG";
byte[] imageData;
using (WebClient client = new WebClient())
{
imageData = client.DownloadData(url);
}
context.Response.OutputStream.Write(imageData, 0, imageData.Length);
}
public bool IsReusable
{
get { return false; }
}
}
Alternatively, you can create an aspx page that serves the image. This removes the code behind requirement, but adds a little more overhead... create a loadimage.aspx page with the following:
<%@ Page Language="C#" AutoEventWireup="true" %>
<script language="c#" runat="server">
public void Page_Load(object sender, EventArgs e)
{
string url = "https://............10000.JPG";
byte[] imageData;
using (System.Net.WebClient client = new System.Net.WebClient())
{
imageData = client.DownloadData(url);
}
Response.ContentType = "image/png"; // Change the content type if necessary
Response.OutputStream.Write(imageData, 0, imageData.Length);
Response.Flush();
Response.End();
}
</script>
Then reference this loadimage.aspx in the image src instead of the ashx.
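For example, mirroring the markup from the original question (the query string value is just the placeholder used there):
<img src="loadimage.aspx?p=rqrewrwr">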
I'm making a C# image gallery for a website (I know there are many free ones out there, but I want the experience). I'm grabbing files from a directory on the website and storing them in an array.
protected void Page_Load(object sender, EventArgs e)
{
string[] files = null;
files = Directory.GetFiles(Server.MapPath(@"Pictures"), "*.jpg");
I am then creating an array of ImageButtons (which I will use as thumbnails) and dynamically adding them to a panel on the web form. The image buttons are added to the form correctly, but the pictures show the little square/circle/triangle (broken image) symbol and fail to load the actual image.
ImageButton[] arrIbs = new ImageButton[files.Length - 1];
for (int i = 0; i < files.Length-1; i++)
{
arrIbs[i] = new ImageButton();
arrIbs[i].ID = "imgbtn" + Convert.ToString(i);
arrIbs[i].ImageUrl = Convert.ToString(files[i]);
Response.Write(Convert.ToString(files[i]) + "**--**");
arrIbs[i].Width = 160;
arrIbs[i].Height = 100;
arrIbs[i].BorderStyle = BorderStyle.Inset;
//arrIbs[i].BorderStyle = 2;
arrIbs[i].AlternateText = System.IO.Path.GetFileName(Convert.ToString(files[i]));
arrIbs[i].PostBackUrl = "default.aspx?Img=" + Convert.ToString(files[i]);
pnlThumbs.Controls.Add(arrIbs[i]);
}
}
This may or may not be related to the issue (if not, consider it a sub-question). When setting Server.MapPath() to @"~/Gallery/Pictures" (which is where the directory is relative to the site root) I get an error stating that "C:/.../.../.../... could not be found". The website only builds if I set the directory as "Pictures", which is where the pictures are; "Pictures" is in the same folder as Default.aspx, which contains the code above. I never have much luck with the ~ (tilde) character. Is this a file-structure issue, or an IIS issue?
The problem with this is that you're setting a path on the server as the image button source. The browser will try to load these images from the client's machine, hence they cannot load. You will also need to make sure that the ASPNET user on the server has permissions to that folder.
What you need to do is serve the JPEG streams as the source for the image buttons.
You could have an aspx page which takes in the path in a query string parameter and loads the file and serves it.
Eg, have a page called GetImage.aspx as such:
<%# Page Language="C#" %>
<%# Import Namespace="System.IO" %>
<script runat="server" language="c#">
public void Page_Load()
{
try
{
Response.Clear();
Response.ContentType = "image/jpeg";
string filename = Page.Request.QueryString["file"];
using (FileStream stream = new FileStream(filename, FileMode.Open))
{
int streamLength = (int)stream.Length;
byte[] buffer = new byte[streamLength];
stream.Read(buffer, 0, streamLength);
Response.BinaryWrite(buffer);
}
}
finally
{
Response.End();
}
}
</script>
and now when you create your ImageButtons, this should be your ImageUrl:
arrIbs[i].ImageUrl = String.Format("GetImage.aspx?file={0}", HttpUtility.UrlEncode(files[i]));