I have a WCF service which accepts requests and, for each request, makes an HttpWebRequest call and returns the response. I use a BlockingCollection to store the requests as they come in, and a separate thread processes the items in the collection (makes the web request). Sometimes the web request throws a ThreadAbortException. I catch it and call Thread.ResetAbort, but the exception flows up and clears the BlockingCollection. I have added snippets of the code below. I need a way for the foreach loop to keep running even if I get a ThreadAbortException.
public static class YBProcessor
{
static PriorityQueue<QueuePriorityLevel, string> queue = new PriorityQueue<QueuePriorityLevel, string>();
static BlockingCollection<KeyValuePair<QueuePriorityLevel, string>> requests;
static YBProcessor()
{
requests = new BlockingCollection<KeyValuePair<QueuePriorityLevel, string>>(queue);
Task.Factory.StartNew(() => SendRequestToYB());
}
public static void AddCalcRequest(string cusip, double price, QueuePriorityLevel priority)
{
requests.Add(new KeyValuePair<QueuePriorityLevel, string>(priority, cusip + "-" + price.ToString()));
}
private static void SendRequestToYB()
{
// this is a separate thread that processes the requests as they come in.
foreach (var obj in requests.GetConsumingEnumerable())
{
try
{
var request = GetXML(obj.Value);
var response = YBClient.GetResponse(request);
//code to handle response
}
catch (ThreadAbortException ex)
{
Thread.ResetAbort();
}
catch (Exception ex)
{
}
}
}
}
// In YBClient: the GetResponse method (just the key parts; this code won't compile as-is)
private static String GetResponse(String text)
{
for (iTry = 0; iTry < MAX_TRY; iTry++)
{
try
{
// Create and setup request
bytes = encoding.GetBytes(text);
request = (HttpWebRequest)WebRequest.Create(uri);
request.Method = "POST";
request.ContentType = "text/xml";
request.ContentLength = bytes.Length;
request.CookieContainer = cookieJar;
request.Timeout = 100 * 1000;
request.ReadWriteTimeout = 100 * 1000;
// Prepare and send data
postStream = request.GetRequestStream();
postStream.Write(bytes, 0, bytes.Length);
postStream.Close();
// Get response from server
response = (HttpWebResponse)request.GetResponse();
if (response.StatusCode == HttpStatusCode.OK)
{
reader = new StreamReader(response.GetResponseStream(), encoding);
xmlResponse = reader.ReadToEnd();
reader.Close();
break;
}
response.Close();
}
catch (ThreadAbortException ex)
{
Thread.ResetAbort();
break;
}
catch (Exception ex)
{
if (ex.GetBaseException() is ThreadAbortException)
{
Thread.ResetAbort();
break;
}
}
}
return xmlResponse;
}
If the service itself is raising the ThreadAbortException, it probably means the service is going down (imagine an app pool recycle), and in that scenario your thread cannot continue to run. If you want to make sure your pending requests are not lost, you can do one of the following:
1) Start a new AppDomain: http://msdn.microsoft.com/en-us/library/ms173139(v=vs.90).aspx
This mitigates the service shutting down, but it still doesn't solve the fundamental problem of making sure all your requests are processed, because the "other" AppDomain can also go down.
2) The better solution is to persist your requests in serialized form to a central database or to files and have a worker keep popping items off that store. For simplicity, create a separate file for each request and delete it once processed (assuming you won't get thousands of requests per second); for a more scalable solution, use a Redis database (http://redis.io/) and its "list" (queue) functionality. A small sketch of the file-per-request approach follows.
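A minimal sketch of the file-per-request idea, assuming the same QueuePriorityLevel/string payload as in the question; the folder path, file-naming scheme, and method names below are illustrative, not part of the original code:
// Requires: using System; using System.IO; using System.Linq;
private static readonly string RequestDir = @"C:\YBRequests"; // illustrative location

public static void PersistRequest(QueuePriorityLevel priority, string payload)
{
    Directory.CreateDirectory(RequestDir);
    // One file per request; the name encodes priority and arrival time so the worker can order them.
    string fileName = (int)priority + "_" + DateTime.UtcNow.Ticks + ".req";
    File.WriteAllText(Path.Combine(RequestDir, fileName), payload);
}

private static void ProcessPersistedRequests()
{
    foreach (string file in Directory.GetFiles(RequestDir, "*.req").OrderBy(f => f))
    {
        string payload = File.ReadAllText(file);
        var request = GetXML(payload);                // from the question's code
        var response = YBClient.GetResponse(request); // from the question's code
        // ...handle response...
        File.Delete(file); // delete only after the request has been processed
    }
}
If the process dies mid-request, the file is still on disk and will simply be picked up again on the next pass.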
P.S. You might want to mark your thread (task) as long-running; if you don't, it uses the thread pool, which is not recommended for very long-running work:
Task.Factory.StartNew(SendRequestToYB, TaskCreationOptions.LongRunning);
Related
I've made a simple program that has to continuously check for data from an API.
So far, what I've done is create a timer and execute the GET calls in the timer's Tick event:
private void TimerStatus_Tick(object sender, EventArgs e)
{
//stop timer
TimerStatus.Stop();
//get data
getCommand();
//restart timer
TimerStatus.Start();
}
void getCommand()
{
string url = "https://somewhere/getcommand?token=somekey&param=";
string param = "0";
WebRequest request = WebRequest.Create(url + param);
request.Method = "GET";
request.ContentType = "application/x-www-form-urlencoded";
request.Credentials = CredentialCache.DefaultCredentials;
try
{
WebResponse response = request.GetResponse();
bool connected = false;
if ((((HttpWebResponse)response).StatusDescription) == "OK")
connected = true;
//continue if connected
if (connected)
{
using (Stream dataStream = response.GetResponseStream())
{
// Open the stream using a StreamReader for easy access.
StreamReader reader = new StreamReader(dataStream);
// Read the content.
string responseFromServer = reader.ReadToEnd();
//check output
Console.WriteLine("Respond from server : " + responseFromServer);
try
{
//parse data, store value
parseThenProcess(responseFromServer);
}
catch
{
//parsing data error
Console.WriteLine("exception error response");
}
}
}
// Close the response.
response.Close();
}
catch
{
Console.WriteLine("Get command failed");
}
}
This code works fine for me. However, when I add more commands that hit different API endpoints in the timer event, the WinForms UI feels laggy. Is this an error on my side unrelated to the API handling, or do I need to improve how I handle the API calls?
private void TimerStatus_Tick(object sender, EventArgs e)
{
//stop timer
TimerStatus.Stop();
//get data
getCommand_A();
getCommand_B();
getParameter_C();
getParameter_D();
//restart timer
TimerStatus.Start();
}
How about not using a Windows Forms timer? And I am not joking. You have several options:
Learn how to use async and the asynchronous web APIs, so you do not block the UI thread for long.
or
Use a separate thread or tasks (no need for a timer; a task can, for example, schedule the next task itself).
What you are doing now runs everything on the UI thread, and that is really not needed, especially because you send the request synchronously, so the UI blocks while the request executes. This is a problem that async methods have solved for years.
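A rough sketch of that approach, assuming .NET 4.5+ and keeping the question's parseThenProcess method; the HttpClient field, the PollLoopAsync name, and the 5-second interval are my own illustrative choices:
// Requires: using System; using System.Net.Http; using System.Threading; using System.Threading.Tasks;
private static readonly HttpClient client = new HttpClient();

private async Task PollLoopAsync(CancellationToken token)
{
    while (!token.IsCancellationRequested)
    {
        try
        {
            // The await frees the UI thread while the request is in flight.
            string response = await client.GetStringAsync("https://somewhere/getcommand?token=somekey&param=0");
            parseThenProcess(response); // from the question's code
        }
        catch (Exception ex)
        {
            Console.WriteLine("Get command failed: " + ex.Message);
        }
        // Replaces the timer: wait, then loop around and poll again.
        await Task.Delay(TimeSpan.FromSeconds(5));
    }
}
Start it once (for example from the form's Load handler) and cancel the token when the form closes; additional commands can simply be awaited one after another inside the loop without freezing the UI.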
I have this code, but the API limits requests per minute, so I sometimes get error 429.
I need to wait about a minute, but the WinForms UI becomes unresponsive (I guess I'm blocking the UI thread by doing this?). What would be the proper way to implement this?
Code:
public static int sleepTime { get; set; } = 60000;
public static string GetData(string URL)
{
while (true) {
try
{
Controller.SetAppStatus(AppStatusses.FetchingData);
HttpWebRequest request = (HttpWebRequest)WebRequest.Create(URL);
HttpWebResponse response = (HttpWebResponse)request.GetResponse();
Stream dataStream = response.GetResponseStream();
StreamReader reader = new StreamReader(dataStream);
string rt = reader.ReadToEnd();
return rt;
}
catch
{
Controller.SetAppStatus(AppStatusses.Timeout);
Timeout();
}
}
}
public static async void Timeout() {
await Task.Delay(sleepTime);
}
Use the HttpClient class (see the MS docs): it's the recommended way of issuing requests, and it has async methods, so you can make your logic asynchronous and you won't block the UI thread. In addition, you have to return a Task from the Timeout method and await it, otherwise no waiting actually happens.
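For instance, a sketch of that idea; the HttpClient field and the GetDataAsync name are my assumptions, while Controller, AppStatusses, and sleepTime come from the question:
// Requires: using System.Net.Http; using System.Threading.Tasks;
private static readonly HttpClient client = new HttpClient();

public static async Task<string> GetDataAsync(string url)
{
    while (true)
    {
        try
        {
            Controller.SetAppStatus(AppStatusses.FetchingData);
            // GetStringAsync throws HttpRequestException on non-success codes such as 429.
            return await client.GetStringAsync(url);
        }
        catch (HttpRequestException)
        {
            Controller.SetAppStatus(AppStatusses.Timeout);
            await Task.Delay(sleepTime); // awaited here, so the UI thread stays free
        }
    }
}
The caller has to be asynchronous as well, e.g. string data = await GetDataAsync(URL); from an async event handler.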
I've got a list of Accounts. I want to log in with all accounts on a website. I want to use Parallel.ForEach to process all accounts.
This is what my code looks like:
Parallel.ForEach(Accounts,
acc =>
{
acc.WebProxy = null;
acc.Channel = "pelvicpaladin__";
Debug.WriteLine("Connecting to {0}.", new object[] { acc.Username });
acc.ConnectToIrc().Wait();
});
Everything works fine except for one single problem:
The first account in the Accounts list does not work. Internally I have to use more than one request (it is a bit more than just logging in). The first request just does nothing. If I break into the debugger, there is no source available.
I've got about 12 accounts. I've tried removing the first account from the list, but the problem stays the same (now the new first (previously second) account fails).
And now the very strange part:
If I don't use Parallel.ForEach, everything works fine.
foreach (var acc in Accounts)
{
acc.WebProxy = null;
Debug.WriteLine("Connecting to {0}.", new object[] { acc.Username });
await acc.ConnectToIrc();
}
Again: everything works except the first account in the list. It is always the first (it does not depend on how many accounts the list contains or which account comes first).
Does anyone have any idea?
EDIT: This is how I create WebRequests:
private async Task<string> GetResponseContent(HttpWebRequest request)
{
if (request == null)
throw new ArgumentNullException("request");
using (var response = await request.GetResponseAsync())
{
return await GetResponseContent((HttpWebResponse)response);
}
}
private async Task<string> GetResponseContent(HttpWebResponse response)
{
if (response == null)
throw new ArgumentNullException("response");
using (var responseStream = response.GetResponseStream())
{
return await new StreamReader(responseStream).ReadToEndAsync();
}
}
private HttpWebRequest GetRequest(string url)
{
if (String.IsNullOrWhiteSpace(url))
throw new ArgumentNullException("url");
try
{
HttpWebRequest request = (HttpWebRequest)HttpWebRequest.Create(url);
request.CookieContainer = _cookieContainer;
request.Referer = url;
request.ContentType = "application/x-www-form-urlencoded; charset=UTF-8";
request.UserAgent = "Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.9.1.4) Gecko/20091016 Firefox/3.5.4 GTB6 (.NET CLR 3.5.30729)";
if (_webProxy != null)
request.Proxy = WebProxy.WebProxy;
request.KeepAlive = true;
request.Timeout = 30000;
return request;
}
catch (Exception ex)
{
ErrorLogger.Log(String.Format("Could not create Request on {0}.", url), ex);
return null;
}
}
You're running into the typical await deadlock situation. Your calls to ConnectToIrc use await and capture the synchronization context, so they try to marshal their continuations back onto the main thread. The problem is that your main thread is busy blocking on the call to Parallel.ForEach and cannot run any of those continuations. The main thread is waiting on the continuations to finish, and the continuations are waiting on the main thread to become free. Deadlock.
This is (one reason) why you shouldn't be synchronously waiting on asynchronous operations.
Instead just start up all of the asynchronous operations and use WhenAll to wait for them all to finish. There's no need to create new threads, or use the thread pool, etc.
var tasks = new List<Task>();
foreach (var acc in Accounts)
{
acc.WebProxy = null;
Debug.WriteLine("Connecting to {0}.", new object[] { acc.Username });
tasks.Add(acc.ConnectToIrc());
}
await Task.WhenAll(tasks);
This, unlike your second example, will perform all of the async operations in parallel, while still waiting asynchronously.
Update: you can simplify this further with LINQ:
var tasks = Accounts.Select(MyTask).ToList();
await Task.WhenAll(tasks);
then you can write a named method:
private Task MyTask(Account acc)
{
acc.WebProxy = null;
Debug.WriteLine("Connecting to {0}.", new object[] { acc.Username });
return acc.ConnectToIrc();
}
Thanks for the tip.
I'm trying to find all the FTP servers accepting anonymous connections in a given set of IPs.
Basically, I get the IPs I want to check and then try a ListDirectory on each of them. If I get no exception, the FTP server exists and is accessible.
I'm using an asynchronous method to verify each IP, which makes things much faster. However, I then need to wait until all the async calls have returned. To do this, I keep a counter of the number of outstanding async calls; the problem is this counter never gets to 0.
My code looks as follows:
to iterate over the IPs:
static int waitingOn;
public static IEnumerable<Uri> GetFtps()
{
var result = new LinkedList<Uri>();
waitingOn = 0;
IPNetwork ipn = IPNetwork.Parse("192.168.72.0/21");
IPAddressCollection ips = IPNetwork.ListIPAddress(ipn);
foreach( var ip in ips )
{
VerifyFtpAsync(ip, result);
}
while (waitingOn > 0)
{
Console.WriteLine(waitingOn);
System.Threading.Thread.Sleep(1000);
}
return result;
}
and to verify each IP:
public async static void VerifyFtpAsync( IPAddress ip, LinkedList<Uri> ftps )
{
++waitingOn;
try
{
Uri serverUri = new Uri("ftp://" + ip);
FtpWebRequest request = (FtpWebRequest)WebRequest.Create(serverUri);
request.Method = WebRequestMethods.Ftp.ListDirectoryDetails;
request.Timeout = 10000;
request.Credentials = new NetworkCredential("anonymous", "roim#search.com");
FtpWebResponse response = (FtpWebResponse) await request.GetResponseAsync();
// If we got this far, YAY!
ftps.AddLast(serverUri);
}
catch (WebException)
{
}
--waitingOn;
}
Replace
FtpWebResponse response = (FtpWebResponse) await request.GetResponseAsync();
with sync code
FtpWebResponse response = (FtpWebResponse) request.GetResponse();
FtpWebRequest.GetResponseAsync is not overridden with an FTP-specific implementation; it comes from the base WebRequest class, which doesn't seem to handle this correctly.
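If you still want VerifyFtpAsync to stay awaitable after switching to the synchronous call, one workaround (my suggestion, not part of the answer above) is to push the blocking call onto the thread pool:
// Blocks a thread-pool thread instead of using the problematic async FTP path,
// but the calling method can still await it.
FtpWebResponse response = (FtpWebResponse)await Task.Run(() => request.GetResponse());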
First, you should never use async void unless you're writing an event handler.
Next, you do need to protect variables and collections against multithreaded access if your async methods may run in parallel (e.g., if this code is run in a console app). In your case, it sounds like you may want to use Task.WhenAll instead of a manual counter, and remove the shared collection.
public async static Task<Uri> VerifyFtpAsync(IPAddress ip)
{
try
{
...
return serverUri;
}
catch (WebException)
{
return null;
}
}
...
var ipTasks = ips.Select(ip => VerifyFtpAsync(ip));
var allResults = await Task.WhenAll(ipTasks);
var result = allResults.Where(url => url != null).ToArray();
Okay, before I go on, let me state that my background is in web scripting; so applications are very foreign to me. I know very little about .NET and I've been skating by on my limited knowledge.
Anyways, in my application I have an OAuth HTTP request. The request itself works fine; it gets the data I need from the web API. However, whenever I click the button that triggers the request, my program freezes for a few seconds until the request is finished. I also have another request that is made automatically every 60 seconds, which of course means my program freezes for a few seconds every 60 seconds. How do I fix this?
private string twitchCallAPI(string accessKey, string accessSecret, string endpointURI, string httpMethod)
{
OAuthHttpWebRequest httpRequest = new OAuthHttpWebRequest();
httpRequest.ConsumerToken = new OAuthToken { Token = this.twitchConKey, TokenSecret = this.twitchConSecret };
httpRequest.Token = new OAuthToken() { Token = accessKey, TokenSecret = accessSecret };
httpRequest.SetUri(endpointURI);
httpRequest.Method = httpMethod;
try
{
using (var response = httpRequest.GetResponse())
{
using (var reader = new StreamReader(response.GetResponseStream()))
{
return reader.ReadToEnd();
}
}
}
catch (WebException ex)
{
using (var reader = new StreamReader(ex.Response.GetResponseStream()))
{
System.Windows.MessageBox.Show(reader.ReadToEnd());
}
}
catch (Exception ex)
{
System.Windows.MessageBox.Show(ex.ToString());
}
return string.Empty;
}
You could use a BackgroundWorker.
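A minimal sketch of that, assuming the OAuth keys and endpoint are available as fields and that resultTextBox is an illustrative control name:
// Requires: using System.ComponentModel;
var worker = new BackgroundWorker();

worker.DoWork += (s, e) =>
{
    // Runs on a thread-pool thread, so the UI stays responsive.
    e.Result = twitchCallAPI(accessKey, accessSecret, endpointURI, "GET");
};

worker.RunWorkerCompleted += (s, e) =>
{
    // Back on the UI thread; safe to touch controls here.
    if (e.Error == null)
        resultTextBox.Text = (string)e.Result;
};

worker.RunWorkerAsync();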
Put briefly: run the request in a task and update the UI through the UI synchronization context:
Task.Factory.StartNew(() =>
{
    // do the web request off the UI thread
})
.ContinueWith(t =>
{
    this.TextBlock1.Text = "Complete";
}, TaskScheduler.FromCurrentSynchronizationContext());
You can try using async methods, that is, using a different thread to wait for the response of the request. It's a solution you can explore.
http://msdn.microsoft.com/en-us/library/86wf6409%28v=vs.100%29.aspx
You can use the await keyword:
private async void OnButtonClick()
{
TextBox.Text = await twitchCallAPIAsync(accessKey, accessSecret, endpointURI, httpMethod);
}
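Note that the question's twitchCallAPI is synchronous, so a twitchCallAPIAsync like the one used above doesn't exist yet; one quick (admittedly stopgap) way to get an awaitable version without rewriting the OAuth plumbing is to offload the existing call to the thread pool, roughly:
private Task<string> twitchCallAPIAsync(string accessKey, string accessSecret, string endpointURI, string httpMethod)
{
    // Runs the existing blocking method on a thread-pool thread,
    // leaving the UI thread free while the request is in flight.
    return Task.Run(() => twitchCallAPI(accessKey, accessSecret, endpointURI, httpMethod));
}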
The main reason for this is that your application waits for the methods you launch to finish. You should take a look at the 'async' concept.
A program executing an 'async' method continues its workflow and doesn't wait for the method to produce a result.