HttpWebRequest GetResponse. How to wait for that response? - C#

I'm sending SMS messages using HttpWebRequest, but sometimes the messages appear to be sent twice, three times, or more, even though reviewing the whole process everything appears to be OK.
The only thing I can think of is that HttpWebRequest is working asynchronously.
Here is my code:
public bool sendSmsGateway(string tcDestino, string tcMensaje, string tcTitulo = "")
{
bool lReturn = false;
string contenido = HttpUtility.UrlEncode(tcMensaje);
string lcTitulo = "SMS Sender";
if (!String.IsNullOrEmpty(tcTitulo))
{
lcTitulo = tcTitulo;
}
lcTitulo = HttpUtility.UrlEncode(lcTitulo);
string fechaEnvio = DateTime.Now.ToString("s");
string url = string.Format(StrSMSGatewayURL, StrSMSGatewayUsuario, StrSMSGatewayPassword, tcDestino, contenido, lcTitulo, fechaEnvio);
HttpWebRequest enviar = (HttpWebRequest)WebRequest.Create(url);
string respuesta = new StreamReader(enviar.GetResponse().GetResponseStream()).ReadToEnd();
StrSMSGatewayRespuesta = respuesta.Trim();
if (StrSMSGatewayRespuesta == StrSMSGatewayRespuestaOk.Trim())
{
lReturn = true;
}
return lReturn;
}
This code runs in a loop routine that sends the message and marks it as sent when:
string respuesta = new StreamReader(enviar.GetResponse().GetResponseStream()).ReadToEnd();
returns the proper code.
My question is: is there a way to stop the process, or force the code to wait until either the HttpWebRequest sends the message and gets the response, or a timeout occurs?

HttpWebRequest.GetResponse() is a synchronous call, so you don't need to have a different mechanism for blocking the current thread. If you were using the asynchronous method HttpWebRequest.BeginGetResponse(), then you would need to have another mechanism to wait for the response to be asynchronously received.
If you're sending multiple SMS messages, then this is probably caused by not receiving the correct response code when you get StrSMSGatewayRespuesta. If you get the wrong code but the message was actually sent, then the message is likely to be sent again since you're marking it as not sent.
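If you also want an explicit upper bound on how long GetResponse() blocks, you can set the request's Timeout and avoid marking the message as sent when the call fails. Below is a sketch based on the method in the question (the try/catch layout and the "ERROR:" marker are illustrative additions, not part of the original code):

public bool sendSmsGateway(string tcDestino, string tcMensaje, string tcTitulo = "")
{
    string contenido = HttpUtility.UrlEncode(tcMensaje);
    string lcTitulo = HttpUtility.UrlEncode(string.IsNullOrEmpty(tcTitulo) ? "SMS Sender" : tcTitulo);
    string fechaEnvio = DateTime.Now.ToString("s");
    string url = string.Format(StrSMSGatewayURL, StrSMSGatewayUsuario, StrSMSGatewayPassword,
                               tcDestino, contenido, lcTitulo, fechaEnvio);

    HttpWebRequest enviar = (HttpWebRequest)WebRequest.Create(url);
    enviar.Timeout = 30000; // GetResponse() throws a WebException if no response arrives within 30 s

    try
    {
        using (WebResponse response = enviar.GetResponse())
        using (var reader = new StreamReader(response.GetResponseStream()))
        {
            StrSMSGatewayRespuesta = reader.ReadToEnd().Trim();
        }
    }
    catch (WebException ex)
    {
        // Timeout or network error: do not mark the message as sent, and keep the
        // status so a failed send can be told apart from a mismatched response code.
        StrSMSGatewayRespuesta = "ERROR: " + ex.Status;
        return false;
    }

    return StrSMSGatewayRespuesta == StrSMSGatewayRespuestaOk.Trim();
}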

Related

Workflow required to perform lengthy ASP.NET task

I have an ASP.NET WebForms application that mimics a help desk system. The application works fine, but recently, they asked me to make it so that it can text message everyone in the system whenever a new help desk ticket is opened.
I am using Twilio to do this and it is working just fine. The only problem is, there are about 15 people in the system who should be getting this text message, and when the ticket is submitted, the application takes about 15-20 seconds to return from the submit. In the future, there could be more than 15 people, maybe even double that.
What I am wondering is if there is a way to send these messages in the background, so that the page will come back from the submit right away. Here is my relevant code:
This is the main method I wrote for sending the text message. It's in a Utility class:
public static string SendSms(string phoneNumber, string message)
{
var request = (HttpWebRequest)WebRequest.Create("https://api.twilio.com/2010-04-01/Accounts/" + Constants.TwilioId + "/Messages.json");
string postData = "From=" + Constants.TwilioFromNumber + "&To=+1" + phoneNumber + "&Body=" + HttpUtility.HtmlEncode(message);
byte[] data = Encoding.ASCII.GetBytes(postData);
string authorization = string.Format("{0}:{1}", Constants.TwilioId, Constants.TwilioAuthToken);
string encodedAuthorization = Convert.ToBase64String(Encoding.ASCII.GetBytes(authorization));
string credentials = string.Format("{0} {1}", "Basic", encodedAuthorization);
request.Method = "POST";
request.ContentType = "application/x-www-form-urlencoded";
request.ContentLength = data.Length;
request.Headers[HttpRequestHeader.Authorization] = credentials;
using (var stream = request.GetRequestStream())
{
stream.Write(data, 0, data.Length);
}
string responseString;
using (var response = (HttpWebResponse) request.GetResponse())
{
using (var reader = new StreamReader(response.GetResponseStream()))
{
responseString = reader.ReadToEnd();
}
}
return responseString;
}
And here is how I'm calling it:
public void BtnSubmit_Click(object sender, EventArgs e)
{
//
// There is more code here, but it's irrelevant
//
var employees = new Employees();
employees.GetAll();
foreach (Employee employee in employees)
{
string number = employee.CellPhoneAreaCode + employee.CellPhonePrefix +
employee.CellPhoneSuffix;
if (!string.IsNullOrEmpty(number) && number.Length == 10)
{
Utility.SendSms(number, "A new Help Desk Ticket is in the System!");
}
}
}
The only other idea I can come up with is to create a WCF service, but that seemed like overkill. Any suggestions are welcome!
Any asynchronous approach should do the trick. For example, using a Task or (if you're on .NET 4.5+) an async method. (Remember to handle the asynchronous errors by supplying a callback with something like .ContinueWith() to examine the task for errors and respond accordingly.)
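For example, a rough sketch of off-loading the SMS loop onto a Task from the button handler (the names come from the question; LogError is a placeholder for whatever logging you use):

// Sketch: run the SMS loop on a background Task so the postback returns right away.
var employees = new Employees();
employees.GetAll();

Task.Factory.StartNew(() =>
{
    foreach (Employee employee in employees)
    {
        string number = employee.CellPhoneAreaCode + employee.CellPhonePrefix +
                        employee.CellPhoneSuffix;
        if (!string.IsNullOrEmpty(number) && number.Length == 10)
        {
            Utility.SendSms(number, "A new Help Desk Ticket is in the System!");
        }
    }
})
.ContinueWith(t =>
{
    // Runs only if the loop threw; inspect t.Exception and log or retry as needed.
    LogError(t.Exception); // placeholder for your own logging
}, TaskContinuationOptions.OnlyOnFaulted);

Keep in mind that work started this way is not guaranteed to survive an application pool recycle, which is part of why the more manual approach described below can be more robust.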
Meaningfully responding to errors in this case might be complex, though. It sounds like the sort of operation where you want to keep re-trying in the event of a failure (with logging in case of constant failures), and definitely want to continue with the loop even if one message fails. So something a little more manual might be in order.
For that I would recommend persisting the messages themselves to a simple database table from the application and continuing with the UI as you want. Then have a separate application, such as a Windows Service, which periodically polls that database table and sends the messages in a simple loop over the records.
A good approach for something like this would be to keep a simple status flag on the message records: queued, sent, error (with an error message), and so on. The Windows Service can update the records as it sends the messages in the loop. If any given message fails, just update that record and continue with the loop. Retry failed messages as appropriate.
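A minimal sketch of the service's polling loop, assuming a queue table and helper methods that are not part of the question (SmsQueueRow, GetQueuedMessages, MarkAsSent and MarkAsFailed are hypothetical names):

// Sketch: poll a queue table and send pending messages, updating a status flag per record.
// Assumes a table like SmsQueue(Id, PhoneNumber, Body, Status, Error).
while (!stopRequested)
{
    foreach (SmsQueueRow row in GetQueuedMessages()) // rows with Status == "Queued" or "Error"
    {
        try
        {
            Utility.SendSms(row.PhoneNumber, row.Body);
            MarkAsSent(row.Id);                      // Status = "Sent"
        }
        catch (Exception ex)
        {
            MarkAsFailed(row.Id, ex.Message);        // Status = "Error"; picked up again next pass
        }
    }
    Thread.Sleep(TimeSpan.FromSeconds(30));          // poll interval
}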

The connection was closed unexpectedly C# after a long running time

Hi, I was making a crawler for a site. After about 3 hours of crawling, my app stopped on a WebException. Below is my code in C#. client is a predefined WebClient object that is disposed every time gameDoc has been processed. gameDoc is an HtmlDocument object (from HtmlAgilityPack).
while (retrygamedoc)
{
try
{
gameDoc.LoadHtml(client.DownloadString(url)); // this line caused the exception
retrygamedoc = false;
}
catch
{
client.Dispose();
client = new WebClient();
retrygamedoc = true;
Thread.Sleep(500);
}
}
I tried to use the code below (to keep the WebClient fresh) from this answer:
while (retrygamedoc)
{
try
{
using (WebClient client2 = new WebClient())
{
gameDoc.LoadHtml(client2.DownloadString(url)); // this line cause the exception
retrygamedoc = false;
}
}
catch
{
retrygamedoc = true;
Thread.Sleep(500);
}
}
but the result is still the same. Then I used StreamReader and the result stayed the same! Below is my code using StreamReader.
while (retrygamedoc)
{
try
{
// using native to check the result
HttpWebRequest webreq = (HttpWebRequest)WebRequest.Create(url);
string responsestring = string.Empty;
HttpWebResponse response = (HttpWebResponse)webreq.GetResponse(); // this cause the exception
using (StreamReader reader = new StreamReader(response.GetResponseStream()))
{
responsestring = reader.ReadToEnd();
}
gameDoc.LoadHtml(client.DownloadString(url));
retrygamedoc = false;
}
catch
{
retrygamedoc = true;
Thread.Sleep(500);
}
}
What should I do and check? I am so confused because I am able to crawl some pages on the same site, and then after about 1000 results it causes the exception. The message from the exception is only "The request was aborted: The connection was closed unexpectedly." and the status is ConnectionClosed.
P.S. The app is a desktop Forms app.
Update:
Now I am skipping those values and setting them to null so that the crawling can go on. But if the data is really needed, I still have to update the crawling results manually, which is tiring because the result contains thousands of records. Please help me.
Example:
It is as if you have downloaded about 1300 records from the website, and then the application stops, saying "The request was aborted: The connection was closed unexpectedly.", while your internet connection is still up and at a good speed.
ConnectionClosed may indicate (and probably does) that the server you're downloading from is closing the connection. Perhaps it is noticing a large number of requests from your client and is denying you additional service.
Since you can't control server-side shenanigans, I'd recommend you have some sort of logic to retry the download a bit later.
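For example, a sketch of retrying with an increasing delay (the attempt count and delays are arbitrary choices; url and gameDoc are the variables from the question):

// Sketch: retry the download a few times, waiting longer after each failure,
// instead of hitting the server again immediately.
string html = null;
for (int attempt = 1; attempt <= 5 && html == null; attempt++)
{
    try
    {
        using (var client = new WebClient())
        {
            html = client.DownloadString(url);
        }
    }
    catch (WebException)
    {
        // Back off: 10 s, 20 s, 30 s, ... gives the server time to accept requests again.
        Thread.Sleep(TimeSpan.FromSeconds(10 * attempt));
    }
}
if (html != null)
{
    gameDoc.LoadHtml(html);
}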
I got this error because the server returned a 404.

how to do multitasking from same code at the same time?

Sorry if the title is not clear or correct; I don't know what title I should put. Please correct it if it's wrong.
I have this code to download images from an IP camera, and it can download the images. The problem is: how can I do the image downloading process at the same time for all cameras if I have two or more cameras?
private void GetImage()
{
string IP1 = "example.IPcam1.com:81/snapshot.cgi;
string IP2 = "example.IPcam2.com:81/snapshot.cgi;
.
.
.
string IPn = "example.IPcamn.com:81/snapshot.cgi";
for (int i = 0; i < 10; i++)
{
string ImagePath = Server.MapPath("~\\Videos\\liveRecording2\\") + string.Format("{0}", i, i + 1) + ".jpeg";
string sourceURL = ip;
WebRequest req = (WebRequest)WebRequest.Create(sourceURL);
req.Credentials = new NetworkCredential("user", "password");
WebResponse resp = req.GetResponse();
Stream stream = resp.GetResponseStream();
Bitmap bmp = (Bitmap)Bitmap.FromStream(stream);
bmp.Save(ImagePath);
}
}
You should not run long-running code like that from an ASP.NET application. They are meant to simply respond to requests.
You should place this code in a service (Windows Services are easy), and control the service through a WCF service running inside of it.
You're also going to get into trouble because you don't have your WebResponse and Stream in using blocks.
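For example, the download part of the loop body with the response and stream wrapped in using blocks (a sketch only; sourceURL and ImagePath are the variables from the question):

// Sketch: dispose the response, stream and bitmap deterministically so
// connections are released even when an exception is thrown.
WebRequest req = WebRequest.Create(sourceURL);
req.Credentials = new NetworkCredential("user", "password");
using (WebResponse resp = req.GetResponse())
using (Stream stream = resp.GetResponseStream())
using (Bitmap bmp = (Bitmap)Bitmap.FromStream(stream))
{
    bmp.Save(ImagePath);
}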
There are several methods that will depend on how you want to report feedback to the user. It all comes down to multi-threading.
Here is one example, using the ThreadPool. Note that this is missing a bunch of error checking throughout... It is here as an example of how to use the ThreadPool, not as a robust application:
private Dictionary<String, String> _cameras = new Dictionary<String, String> {
{ "http://example.IPcam1.com:81/snapshot.cgi", "/some/path/for/image1.jpg" },
{ "http://example.IPcam2.com:81/snapshot.cgi", "/some/other/path/image2.jpg" },
};
public void DoImageDownload() {
int finished = 0;
foreach (KeyValuePair<String, String> pair in _cameras) {
// Copy the loop variable so each queued delegate captures its own pair
// (before C# 5, every delegate would otherwise share the same loop variable).
KeyValuePair<String, String> camera = pair;
ThreadPool.QueueUserWorkItem(delegate {
BeginDownload(camera.Key, camera.Value);
Interlocked.Increment(ref finished); // thread-safe increment of the shared counter
});
}
while (finished < _cameras.Count) {
Thread.Sleep(1000); // sleep 1 second
}
}
private void BeginDownload(String src, String dest) {
WebRequest req = (WebRequest) WebRequest.Create(src);
req.Credentials = new NetworkCredential("username", "password");
using (WebResponse resp = req.GetResponse())
using (Stream input = resp.GetResponseStream())
using (Stream output = File.Create(dest)) {
input.CopyTo(output);
}
}
This example simply takes the work you are doing in the for loop and off-loads it to the thread pool for processing. The DoImageDownload method will return very quickly, as it is not doing much actual work.
Depending on your use case, you may need a mechanism to wait for the images to finish downloading from the caller of DoImageDownload. A common approach would be the use of event callbacks at the end of BeginDownload to notify when the download is complete. I have put a simple while loop here that will wait until the images finish... Of course, this needs error checking in case images are missing or the delegate never returns.
Be sure to add your error checking throughout... Hopefully this gives you a place to start.
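If you would rather not poll, a CountdownEvent (available since .NET 4) can replace the while loop; a sketch under that assumption:

// Sketch: wait for all queued downloads without polling a counter.
using (var countdown = new CountdownEvent(_cameras.Count))
{
    foreach (KeyValuePair<String, String> pair in _cameras)
    {
        KeyValuePair<String, String> camera = pair; // per-iteration copy for the closure
        ThreadPool.QueueUserWorkItem(delegate
        {
            try
            {
                BeginDownload(camera.Key, camera.Value);
            }
            finally
            {
                countdown.Signal(); // always count down, even if the download throws
            }
        });
    }
    countdown.Wait(); // blocks until every work item has signalled
}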

C# How to stop a method if it takes longer than 2 seconds?

The following program will connect to the web, get the HTML content of the “msnbc.com” web page, and print out the result. If it takes longer than 2 seconds to get data from the web page, I want my method to stop working and return. Can you please tell me how I can do this, with an example?
public partial class Form1 : Form
{
public Form1()
{
InitializeComponent();
}
private void button1_Click(object sender, EventArgs e)
{
gethtml();
MessageBox.Show("End of program");
}
public void gethtml()
{
HttpWebRequest WebRequestObject = (HttpWebRequest)HttpWebRequest.Create("http://msnbc.com/");
WebResponse Response = WebRequestObject.GetResponse();
Stream WebStream = Response.GetResponseStream();
StreamReader Reader = new StreamReader(WebStream);
string webcontent = Reader.ReadToEnd();
MessageBox.Show(webcontent);
}
}
Two seconds is far too long to block the UI. You should only block the UI if you are planning on getting the result in, say fifty milliseconds or less.
Read this article on how to do a web request without blocking the UI:
http://www.developerfusion.com/code/4654/asynchronous-httpwebrequest/
Note that this will all be much easier in C# 5, which is in beta release at present. In C# 5 you can simply use the await operator to asynchronously await the result of the task. If you would like to see how this sort of thing will work in C# 5, see:
http://msdn.microsoft.com/en-us/async
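As a preview, a sketch of roughly what the C# 5 / .NET 4.5 version could look like using await and WebRequest.GetResponseAsync() (this assumes the beta bits mentioned above):

// Sketch: await the response so the UI thread is never blocked.
private async void button1_Click(object sender, EventArgs e)
{
    var request = (HttpWebRequest)WebRequest.Create("http://msnbc.com/");
    using (WebResponse response = await request.GetResponseAsync())
    using (var reader = new StreamReader(response.GetResponseStream()))
    {
        MessageBox.Show(reader.ReadToEnd());
    }
    MessageBox.Show("End of program");
}

Note that the Timeout property is documented as having no effect on asynchronous requests, so a 2-second limit would have to be enforced separately (for example by aborting the request after a delay).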
Set the Timeout property of your WebRequest object. Documentation
MSDN Example:
// Create a new WebRequest Object to the mentioned URL.
WebRequest myWebRequest=WebRequest.Create("http://www.contoso.com");
Console.WriteLine("\nThe Timeout time of the request before setting is : {0} milliseconds",myWebRequest.Timeout);
// Set the 'Timeout' property in Milliseconds.
myWebRequest.Timeout=10000;
// This request will throw a WebException if it reaches the timeout limit before it is able to fetch the resource.
WebResponse myWebResponse=myWebRequest.GetResponse();
As stated above, use .Timeout:
public void gethtml()
{
HttpWebRequest WebRequestObject = (HttpWebRequest)HttpWebRequest.Create("http://msnbc.com/");
WebRequestObject.Timeout = (System.Int32)TimeSpan.FromSeconds(2).TotalMilliseconds;
try
{
WebResponse Response = WebRequestObject.GetResponse();
Stream WebStream = Response.GetResponseStream();
StreamReader Reader = new StreamReader(WebStream);
string webcontent = Reader.ReadToEnd();
MessageBox.Show(webcontent);
}
catch (System.Net.WebException E)
{
MessageBox.Show("Fail");
}
}
You can use the Timeout property on HttpWebRequest.
Consider switching to asynchronous downloading of the content. You will stop blocking UI thread and will be able to handle multiple requests easily. You will be able to increase timeout significantly without impact on UI, and can decide upon receiving response if you still want to fetch data.
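For example, a sketch of the same download done with WebClient's event-based asynchronous pattern (no extra threads in your own code, and in WinForms the completed handler is raised back on the UI thread):

// Sketch: start the download asynchronously and handle the result in an event,
// so the button click returns immediately instead of freezing the form.
public void gethtml()
{
    var client = new WebClient();
    client.DownloadStringCompleted += (s, e) =>
    {
        if (e.Error == null)
            MessageBox.Show(e.Result);
        else
            MessageBox.Show("Fail: " + e.Error.Message); // covers timeouts and HTTP errors
        client.Dispose();
    };
    client.DownloadStringAsync(new Uri("http://msnbc.com/"));
}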

AsyncHttpWebRequest (web page title retrieve program)

I'm trying to create a program that will retrieve page titles given a URL. I've written code that works when I'm not using an AsyncCallback, but when I use an AsyncCallback the code doesn't seem to work.
public void GetWebPageTitle(string URL)
{
// make request for web page
HttpWebRequest myWebRequest = (HttpWebRequest)HttpWebRequest.Create(URL);
myWebRequest.Method = "GET";
myWebRequest.BeginGetResponse(new AsyncCallback(FinishWebRequest), myWebRequest);
zConsole.WriteLine("Beginning HttpWebRequest for: " + URL);
}
void FinishWebRequest(IAsyncResult result)
{
zConsole.WriteLine("...");
string title = "Unknown";
// Code under here doesn't get executed
HttpWebResponse myWebResponse = (HttpWebResponse)((HttpWebRequest)result.AsyncState).EndGetResponse(result);
StreamReader myWebSource = new StreamReader(myWebResponse.GetResponseStream());
string source = "";
source = myWebSource.ReadToEnd();
myWebResponse.Close();
title = Regex.Match(source, @"\<title\b[^>]*\>\s*(?<Title>[\s\S]*?)\</title\>", RegexOptions.IgnoreCase).Groups["Title"].Value;
zConsole.WriteLine(title);
}
Thanks.
I think the problem is that your program ends before the async result is returned.
The main thread dies after it has done Console.WriteLine.
The rest looks okay. See BeginGetResponse at MSDN.
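If this is a console program, one way to keep the main thread alive until the callback has finished is to block on a wait handle; a sketch under that assumption (the _done field is illustrative):

// Sketch: block the caller until FinishWebRequest signals that it has completed.
private static readonly ManualResetEvent _done = new ManualResetEvent(false);

public void GetWebPageTitle(string URL)
{
    HttpWebRequest myWebRequest = (HttpWebRequest)WebRequest.Create(URL);
    myWebRequest.Method = "GET";
    myWebRequest.BeginGetResponse(new AsyncCallback(FinishWebRequest), myWebRequest);
    _done.WaitOne(); // wait here until FinishWebRequest calls _done.Set()
}

void FinishWebRequest(IAsyncResult result)
{
    try
    {
        // ... existing EndGetResponse / title-parsing code from the question ...
    }
    finally
    {
        _done.Set(); // let the waiting thread continue even if parsing throws
    }
}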
Put a try/catch block around the code inside the callback and see if anything in there is throwing an exception.
Otherwise some more details would be useful. When you say that the code doesn't get executed are you actually stepping through the code/using breakpoints or are you assuming this is the case based on your console output? Is this request being made from the main window thread of your application?
