I have this function in my Xamarin.iOS app that posts an object to my API. The object contains strings and one member that is a base64 string. When there is no connection, I can't get a timeout or an exception in less than 5 minutes. However, it does time out when there is no base64 data in the object. Any idea how to get this to work? Here is my code:
public static string postData(object @params, string url)
{
    MyWebClient webClient = new MyWebClient();
    try
    {
        webClient.Headers["content-type"] = "application/json";
        webClient.Headers.Add("Authorization", "Bearer " + Settings.GeneralSettings3);
        var reqString = Encoding.UTF8.GetBytes(JsonConvert.SerializeObject(@params, Formatting.Indented));
        byte[] resByte = webClient.UploadData(url, "POST", reqString);
        var resString1 = Encoding.Default.GetString(resByte);
        webClient.Dispose();
        return resString1;
    }
    catch (WebException ex)
    {
        string responseText = "";
        var responseStream = ex.Response?.GetResponseStream();
        if (responseStream != null)
        {
            using (var reader = new StreamReader(responseStream))
            {
                responseText = reader.ReadToEnd();
            }
        }
        throw new Exception(responseText);
    }
    catch (Exception)
    {
        throw;
    }
}
And here is my custom WebClient class so I can set the timeout:
public class MyWebClient : WebClient
{
    protected override WebRequest GetWebRequest(Uri uri)
    {
        WebRequest w = base.GetWebRequest(uri);
        w.Timeout = (int)TimeSpan.FromSeconds(10).TotalMilliseconds; // 20 * 60 * 1000;
        return w;
    }
}
Thanks in advance. Any help is appreciated.
EDIT:
The same code works perfectly fine on Xamarin.Android, and it times out as intended when there is no internet connection.
I don't recommend using WebClient; I would use an external dependency like RestSharp. Alternatively, you can enforce the timeout yourself using Task.Run() (a sketch follows), but since this is Xamarin I can't say for certain how it behaves.
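For illustration, a minimal sketch of the Task.Run() idea, assuming you keep the existing synchronous postData method: the upload is raced against Task.Delay, so the timeout is enforced on the caller's side even if WebRequest.Timeout is ignored on iOS. PostDataWithTimeoutAsync is an illustrative name, not part of the original code.

public static async Task<string> PostDataWithTimeoutAsync(object payload, string url, TimeSpan timeout)
{
    // Run the blocking upload on the thread pool and race it against a delay.
    var uploadTask = Task.Run(() => postData(payload, url));
    var finished = await Task.WhenAny(uploadTask, Task.Delay(timeout));
    if (finished != uploadTask)
    {
        throw new TimeoutException("The request did not complete within " + timeout);
    }
    return await uploadTask; // rethrows any exception thrown by postData
}

Note that the underlying request keeps running in the background after the timeout; this only stops the caller from waiting on it.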
I am calling the method below to get files from S3. I have a list of file paths in a collection, and I call the method inside a Parallel.ForEach loop. The AddRecords() method of Container is thread safe and ensures that only one thread enters the critical section. The code skips some of the files, and the returned result has fewer records than expected.
Parallel.ForEach(fileListAll, new ParallelOptions { MaxDegreeOfParallelism = 20 }, (currentFile) =>
{
    string RecordsRaw = _AmazonS3.GetFileFromS3(currentFile);
    Container.AddRecords(RecordsRaw);
});

public string GetFileFromS3(string filePath)
{
    try
    {
        string ResponseData = null;
        GetObjectRequest Request = new GetObjectRequest
        {
            BucketName = _AWSConfiguration.S3BucketName,
            Key = filePath
        };
        using (GetObjectResponse response = _AmazonS3Client.GetObject(Request))
        using (Stream responseStream = response.ResponseStream)
        using (StreamReader reader = new StreamReader(responseStream))
        {
            ResponseData = reader.ReadToEnd();
        }
        if (!string.IsNullOrEmpty(ResponseData))
        {
            return ResponseData;
        }
    }
    catch (Exception ex)
    {
        //Removed the code for brevity
    }
    return null;
}
I am not sure how I can change the GetFileFromS3() method so that it can be called safely by multiple threads while still maintaining concurrency.
Please bear with me, as this might be a basic question, but I have been struggling for two days to get this working correctly.
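As a starting point for debugging, here is a minimal sketch (not from the original post) that assumes the skipped files are requests whose exceptions are swallowed by the empty catch block: null results are collected in a ConcurrentBag (System.Collections.Concurrent) so the failing keys can be logged or retried instead of silently dropped. failedFiles is an illustrative name.

var failedFiles = new ConcurrentBag<string>();

Parallel.ForEach(fileListAll, new ParallelOptions { MaxDegreeOfParallelism = 20 }, currentFile =>
{
    string recordsRaw = _AmazonS3.GetFileFromS3(currentFile);
    if (recordsRaw != null)
    {
        Container.AddRecords(recordsRaw);
    }
    else
    {
        // GetFileFromS3 returned null, i.e. the download failed or the object
        // was empty; keep the key so it can be retried or logged afterwards.
        failedFiles.Add(currentFile);
    }
});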
My environment version is 2.0 (I am currently working with Unity 5.5 and C#).
I am experiencing a stall the third time I try to get an HttpWebRequest request stream.
What I want to achieve is a "retry" mechanism: when I send a request, I wait for the response, and if it times out I send another request and reset the timeout. I stop this loop when I hit MAX_ATTEMPTS or when one of the requests gets a response (even if it is from a request that has already timed out).
The code that handles this loop is here:
IEnumerator DoRequest()
{
    List<IAsyncResult> results = new List<IAsyncResult>(MAX_ATTEMPTS);
    int attemptNumber = 1;
    while (attemptNumber <= MAX_ATTEMPTS)
    {
        IAsyncResult result = null;
        try
        {
            result = SendRequest();
        }
        catch (Exception e)
        {
            ProcessException(e);
            yield break;
        }
        if (result != null)
        {
            results.Add(result);
            DateTime timeOutTime = DateTime.UtcNow + TIMEOUT;
            yield return null;
            while (DateTime.UtcNow < timeOutTime)
            {
                int completedIndex = -1;
                if (IsAnyResultCompleted(results, out completedIndex) == true)
                {
                    ProcessResponse(results[completedIndex]);
                    yield break;
                }
                else
                {
                    yield return null;
                }
            }
        }
        attemptNumber++;
    }
    if (results.Count != 0)
    {
        ProcessResponseFail("Timed out");
    }
    else
    {
        ProcessResponseFail("Unable to generate a request");
    }
}
Where TIMEOUT is a TimeSpan of 10 seconds. And IsAnyResultCompleted is just a for loop that checks if any request has IsCompleted set to true.
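For completeness, a minimal sketch of IsAnyResultCompleted as described above (the actual implementation is not shown in the question):

bool IsAnyResultCompleted(List<IAsyncResult> results, out int completedIndex)
{
    // Scan all outstanding requests and report the first one that has finished.
    for (int i = 0; i < results.Count; i++)
    {
        if (results[i].IsCompleted)
        {
            completedIndex = i;
            return true;
        }
    }
    completedIndex = -1;
    return false;
}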
Here I create and send the request:
IAsyncResult SendRequest()
{
    HttpWebRequest request = (HttpWebRequest)WebRequest.Create(RequestActionPath);
    request.ContentType = "application/json";
    request.Accept = "application/json";
    request.Method = "POST";
    string json = JsonUtility.ToJson(_dependency);
    using (StreamWriter sw = new StreamWriter(request.GetRequestStream()))
    {
        sw.Write(json);
        sw.Flush();
    }
    IAsyncResult result = request.BeginGetResponse(new AsyncCallback(onResponse), request);
    return result;
}
"onResponse" is a cached delegate and it is assigned to this:
void OnResponse(IAsyncResult asynchronousResult)
{
    try
    {
        HttpWebRequest request = (HttpWebRequest)asynchronousResult.AsyncState;
        request.EndGetResponse(asynchronousResult);
    }
    catch (WebException e)
    {
        string responseStream;
        using (var streamReader = new StreamReader(e.Response.GetResponseStream()))
        {
            responseStream = streamReader.ReadToEnd();
        }
        Utility.Console.LogError(responseStream);
    }
}
I am well aware of this, and I also tried setting ServicePointManager.DefaultConnectionLimit = 1000000, but the result is always the same: after 2 sent requests, GetRequestStream() in SendRequest stalls for 5 seconds and I cannot understand why.
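One likely cause worth ruling out, not stated in the question: the HttpWebResponse returned by EndGetResponse is never closed, so every completed request keeps its connection to the server alive, and once the per-host connection limit is exhausted the next GetRequestStream() blocks until a connection frees up. A minimal sketch of OnResponse that disposes the response (and guards against e.Response being null, e.g. for aborted requests):

void OnResponse(IAsyncResult asynchronousResult)
{
    HttpWebRequest request = (HttpWebRequest)asynchronousResult.AsyncState;
    try
    {
        // Disposing the response returns its connection to the pool.
        using (HttpWebResponse response = (HttpWebResponse)request.EndGetResponse(asynchronousResult))
        {
            // read/process the response here if needed
        }
    }
    catch (WebException e)
    {
        if (e.Response == null)
        {
            Utility.Console.LogError(e.Message);
            return;
        }
        using (var streamReader = new StreamReader(e.Response.GetResponseStream()))
        {
            Utility.Console.LogError(streamReader.ReadToEnd());
        }
        e.Response.Close();
    }
}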
I am having a hard time converting the code below, which I created for .NET 4.0, to .NET 4.5 using HttpClient.
As I understand it, I can create multiple web requests from the GUI thread itself without blocking the GUI if I go with asynchronous requests.
How do I convert the code below to be asynchronous using HttpClient in 4.5?
// This is what is called when the button is clicked
Task t3 = new Task(SpawnTask);
t3.Start();

// If noofthreads is less than 50 the GUI works fine; if the number increases, repaints take much longer,
// whereas other software works without any problem on the same system even with more than 500 threads.
public void SpawnTask()
{
    try
    {
        ParallelOptions po = new ParallelOptions();
        po.CancellationToken = cts.Token;
        po.MaxDegreeOfParallelism = noofthreads;
        Parallel.ForEach(
            urls,
            po,
            url => checkpl(url));
    }
    catch (Exception ex)
    {
    }
}

public void checkpl(string url)
{
    try
    {
        HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url);
        request.Timeout = 60 * 1000;
        HttpWebResponse response = (HttpWebResponse)request.GetResponse();
        string stext = "";
        using (BufferedStream buffer = new BufferedStream(response.GetResponseStream()))
        {
            using (StreamReader reader = new StreamReader(buffer))
            {
                stext = reader.ReadToEnd();
            }
        }
        response.Close();
        if (stext.IndexOf("domainname.com") != -1)
        {
            tfound = tfound + 1;
            string lext = "Total Found : " + tfound.ToString();
            label3.BeginInvoke(new InvokeDelegate(UpdateLabel), lext);
            slist.Add(url);
            textBox2.BeginInvoke(new InvokeDelegate4(UpdateText), "Working Url " + url);
        }
    }
    catch (Exception ex)
    {
    }
}
Since you are using .NET 4.5 you can use the new async and await keywords. Here is what it might look like.
private async void YourButton_Click(object sender, EventArgs args)
{
    YourButton.Enabled = false;
    try
    {
        var tasks = new List<Task>();
        foreach (string url in Urls)
        {
            tasks.Add(CheckAsync(url));
        }
        await Task.WhenAll(tasks);
    }
    finally
    {
        YourButton.Enabled = true;
    }
}

private async Task CheckAsync(string url)
{
    bool found = await UrlResponseContainsAsync(url, "domainname.com");
    if (found)
    {
        slist.Add(url);
        label3.Text = "Total Found: " + slist.Count.ToString();
        textBox2.Text = "Working Url " + url;
    }
}

private async Task<bool> UrlResponseContainsAsync(string url, string find)
{
    var request = WebRequest.Create(url);
    request.Timeout = 60 * 1000;
    using (WebResponse response = await request.GetResponseAsync())
    {
        using (var buffer = new BufferedStream(response.GetResponseStream()))
        using (var reader = new StreamReader(buffer))
        {
            string text = reader.ReadToEnd();
            return text.Contains(find);
        }
    }
}
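Since the question specifically asks for HttpClient (available in .NET 4.5 via the System.Net.Http assembly), here is a minimal sketch of the same check written against it; the shared _httpClient field is illustrative, not part of the original answer:

// requires: using System.Net.Http;
private static readonly HttpClient _httpClient = new HttpClient
{
    Timeout = TimeSpan.FromSeconds(60) // roughly equivalent to request.Timeout above
};

private async Task<bool> UrlResponseContainsAsync(string url, string find)
{
    try
    {
        // GetStringAsync downloads the response body without blocking the UI thread.
        string text = await _httpClient.GetStringAsync(url);
        return text.Contains(find);
    }
    catch (HttpRequestException)
    {
        return false; // unreachable host, DNS failure, non-success status, etc.
    }
    catch (TaskCanceledException)
    {
        return false; // request timed out
    }
}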
The following asynchronous C# code runs through a list of 7 URLs and tries to get HTML from each one. Right now I just have it outputting simple debug responses to the console like, "Site HTML", "No Response", or "Bad URL". It seems to work fine, but I need to fire off an event once all 7 queries have been made. How would I do this? It's important that all cases are taken into account: 1) Site HTML has been received, 2) Site timed out, 3) Site was an improper URL and couldn't be loaded. I'm covering all these cases already, but can't figure out how to connect everything to trigger a global "OnComplete" event.
Thank you.
using System;
using System.Collections;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.IO;
using System.Net;
using System.Threading;
using System.Timers;
using System.Collections.Concurrent;
using System.Diagnostics;

namespace AsyncApp_05
{
    class Program
    {
        static int _count = 0;
        static int _total = 0;

        static void Main(string[] args)
        {
            ArrayList alSites = new ArrayList();
            alSites.Add("http://www.google.com");
            alSites.Add("http://www.yahoo.com");
            alSites.Add("http://www.ebay.com");
            alSites.Add("http://www.aol.com");
            alSites.Add("http://www.bing.com");
            alSites.Add("adsfsdfsdfsdffd");
            alSites.Add("http://wwww.fjasjfejlajfl");
            alSites.Add("http://mundocinema.com/noticias/the-a-team-2/4237");
            alSites.Add("http://www.spmb.or.id/?p=64");
            alSites.Add("http://gprs-edge.ru/?p=3");
            alSites.Add("http://blog.tmu.edu.tw/MT/mt-comments.pl?entry_id=3141");

            _total = alSites.Count;
            //Console.WriteLine(_total);

            ScanSites(alSites);
            Console.Read();
        }

        private static void ScanSites(ArrayList sites)
        {
            foreach (string uriString in sites)
            {
                try
                {
                    HttpWebRequest request = (HttpWebRequest)WebRequest.Create(uriString);
                    request.Method = "GET";
                    request.Proxy = null;

                    RequestState state = new RequestState();
                    state.Request = request;

                    IAsyncResult result = request.BeginGetResponse(new AsyncCallback(ResponseCallback), state);

                    // Timeout comes here
                    ThreadPool.RegisterWaitForSingleObject(result.AsyncWaitHandle,
                        new WaitOrTimerCallback(TimeOutCallback), request, 100, true);
                }
                catch (Exception ex)
                {
                    Console.WriteLine("Bad URL");
                    Interlocked.Increment(ref _count);
                }
            }
        }

        static void ReadCallback(IAsyncResult result)
        {
            try
            {
                // Get RequestState
                RequestState state = (RequestState)result.AsyncState;

                // determine how many bytes have been read
                int bytesRead = state.ResponseStream.EndRead(result);
                if (bytesRead > 0) // stream has not reached the end yet
                {
                    // append the read data to the ResponseContent and...
                    state.ResponseContent.Append(Encoding.ASCII.GetString(state.BufferRead, 0, bytesRead));
                    // ...read the next piece of data from the stream
                    state.ResponseStream.BeginRead(state.BufferRead, 0, state.BufferSize,
                        new AsyncCallback(ReadCallback), state);
                }
                else // end of the stream reached
                {
                    if (state.ResponseContent.Length > 0)
                    {
                        Console.WriteLine("Site HTML");
                        // do something with the response content, e.g. fill a property or fire an event
                        //AsyncResponseContent = state.ResponseContent.ToString();

                        // close the stream and the response
                        state.ResponseStream.Close();
                        state.Response.Close();
                        //OnAsyncResponseArrived(AsyncResponseContent);
                    }
                }
            }
            catch (Exception ex)
            {
                // Error handling
                RequestState state = (RequestState)result.AsyncState;
                if (state.Response != null)
                {
                    state.Response.Close();
                }
            }
        }

        static void ResponseCallback(IAsyncResult result)
        {
            Interlocked.Increment(ref _count);
            Console.WriteLine("Count: " + _count);
            try
            {
                // Get and fill the RequestState
                RequestState state = (RequestState)result.AsyncState;
                HttpWebRequest request = state.Request;

                // End the asynchronous response and get the actual response object
                state.Response = (HttpWebResponse)request.EndGetResponse(result);
                Stream responseStream = state.Response.GetResponseStream();
                state.ResponseStream = responseStream;

                // Begin async reading of the contents
                IAsyncResult readResult = responseStream.BeginRead(state.BufferRead, 0, state.BufferSize, new AsyncCallback(ReadCallback), state);
            }
            catch (Exception ex)
            {
                // Error handling
                RequestState state = (RequestState)result.AsyncState;
                if (state.Response != null)
                {
                    state.Response.Close();
                }
                Console.WriteLine("No Response");
            }
        }

        static void TimeOutCallback(object state, bool timedOut)
        {
            if (timedOut)
            {
                HttpWebRequest request = state as HttpWebRequest;
                if (request != null)
                {
                    request.Abort();
                }
            }
        }
    }

    public class RequestState
    {
        public int BufferSize { get; private set; }
        public StringBuilder ResponseContent { get; set; }
        public byte[] BufferRead { get; set; }
        public HttpWebRequest Request { get; set; }
        public HttpWebResponse Response { get; set; }
        public Stream ResponseStream { get; set; }

        public RequestState()
        {
            BufferSize = 1024;
            BufferRead = new byte[BufferSize];
            ResponseContent = new StringBuilder();
            Request = null;
            ResponseStream = null;
        }
    }
}
You can use a CountdownEvent to find out when all the sites have been scanned. Initialize it to sites.Count and then wait on that event. On each completion (whether by error, timeout, or success) you signal the event. When the event's count reaches zero, the wait returns and you can fire your "OnComplete" event.
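For example, a minimal sketch of that approach (the _countdown field and the OnComplete call are illustrative additions, not from the question):

static CountdownEvent _countdown; // signalled once per site

private static void ScanSites(ArrayList sites)
{
    _countdown = new CountdownEvent(sites.Count);
    foreach (string uriString in sites)
    {
        // ... same BeginGetResponse / RegisterWaitForSingleObject code as in the
        // question, except every terminal path ("Site HTML", "No Response",
        // "Bad URL", timeout) calls _countdown.Signal() exactly once.
    }
    _countdown.Wait();  // returns when every site has finished
    OnComplete();       // hypothetical "all done" handler
}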
The simplest way, IMHO, is to create a semaphore, have each completion handler Release it, and WaitOne on it N times in a master thread (where N is the number of sites).
private static void ScanSites(ArrayList sites)
{
    var semaphore = new Semaphore(0, sites.Count);
    foreach (string uriString in sites)
    {
        try
        {
            HttpWebRequest request = (HttpWebRequest)WebRequest.Create(uriString);
            request.Method = "GET";
            request.Proxy = null;

            RequestState state = new RequestState();
            state.Request = request;
            state.Semaphore = semaphore; // add a Semaphore property to RequestState for this

            IAsyncResult result = request.BeginGetResponse(new AsyncCallback(ResponseCallback), state);

            // Timeout comes here
            ThreadPool.RegisterWaitForSingleObject(result.AsyncWaitHandle,
                new WaitOrTimerCallback(TimeOutCallback), request, 100, true);
        }
        catch (Exception ex)
        {
            Console.WriteLine("Bad URL");
            Interlocked.Increment(ref _count);
            semaphore.Release(); // this URL will never reach a callback, so release here
        }
    }
    for (var i = 0; i < sites.Count; i++) semaphore.WaitOne();
}

static void ReadCallback(IAsyncResult result)
{
    try
    { ... }
    finally
    {
        // Release once per finished site; the error path in ResponseCallback
        // needs to do the same, or the WaitOne loop will never finish.
        var state = result.AsyncState as RequestState;
        if (state != null) state.Semaphore.Release();
    }
}
Another option is to pass some WaitHandle (a ManualResetEvent fits well) to each handler and WaitHandle.WaitAll for them in the master thread.
private static void ScanSites(ArrayList sites)
{
    var handles = new List<WaitHandle>();
    foreach (string uriString in sites)
    {
        try
        {
            HttpWebRequest request = (HttpWebRequest)WebRequest.Create(uriString);
            request.Method = "GET";
            request.Proxy = null;

            RequestState state = new RequestState();
            state.Request = request;

            IAsyncResult result = request.BeginGetResponse(new AsyncCallback(ResponseCallback), state);
            handles.Add(result.AsyncWaitHandle);

            // Timeout comes here
            ThreadPool.RegisterWaitForSingleObject(result.AsyncWaitHandle,
                new WaitOrTimerCallback(TimeOutCallback), request, 100, true);
        }
        catch (Exception ex)
        {
            Console.WriteLine("Bad URL");
            Interlocked.Increment(ref _count);
        }
    }
    WaitHandle.WaitAll(handles.ToArray());
}
Surely, you can achieve the same with Interlocked as well, e.g. using the Exchange or CompareExchange methods, but IMHO WaitHandles are more straightforward here (and the performance hit from using them is not significant).
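For completeness, a minimal sketch of the Interlocked variant mentioned above (the _remaining field, the SiteFinished helper, and the OnComplete call are illustrative):

static int _remaining;

private static void ScanSites(ArrayList sites)
{
    _remaining = sites.Count;
    foreach (string uriString in sites)
    {
        // ... same BeginGetResponse / RegisterWaitForSingleObject code as in the question ...
    }
}

// Call this exactly once per site, from the success, error, and timeout paths.
static void SiteFinished()
{
    if (Interlocked.Decrement(ref _remaining) == 0)
    {
        OnComplete(); // hypothetical "all done" handler
    }
}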