I'm trying to simulate many concurrent users (>2000) to test a web service. Every user performs actions at specific pre-defined times, for example:
User A: 09:10:02, 09:10:03, 09:10:08
User B: 09:10:03, 09:10:05, 09:10:07
User C: 09:10:03, 09:10:09, 09:10:15, 09:10:20
I now want to send a web request in real time at each of those times. I can tolerate a delay of at most ~2 seconds. What I have already tried without success:
a) Aggregate all times in a single list, sort it by time then iterate it:
foreach (DateTime sendTime in times)
{
    // Busy-wait until the scheduled send time, then fire the request
    while (DateTime.Now < sendTime)
        Thread.Sleep(1);
    SendRequest();
}
b) Create a thread for each user, with each thread checking the same condition as above but with a longer sleep time
Both approaches kind of work, but the delay between the time a request was supposed to be sent and the time it was actually sent is way too high. Is there any way to send the requests with higher precision?
Edit: The suggested approaches work really well. However, the delay is still extremely high for many requests. Apparently, the reason is my SendRequest() method:
private static async Task SendRequest()
{
    // Log time difference
    string url = "http://www.request.url/newaction";
    WebRequest webRequest = WebRequest.Create(url);
    try
    {
        // Dispose the response so the underlying connection is released again
        using (WebResponse webResponse = await webRequest.GetResponseAsync())
        {
        }
    }
    catch (Exception) { }
}
Note that my web service does not return any response; maybe this is the reason for the slowdown? Can I send the request without waiting for the response?
Why are you doing this with multiple threads? Threads require slow sleep/wake context switching. You could do all of this with timers/async calls.
List<DateTime> scheduledTimes = ...;
List<Task> requests = scheduledTimes
    .Select(async t =>
    {
        // Clamp to zero so times that are already due don't make Task.Delay throw
        TimeSpan delay = t - DateTime.Now;
        if (delay > TimeSpan.Zero)
            await Task.Delay(delay);
        await SendRequest();
    })
    .ToList();
await Task.WhenAll(requests);
The above code schedules all the requests from a single thread onto the SynchronizationContext and runs them.
Simples.
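If you want to verify how precise this actually is, here is a minimal sketch (my own addition, not part of the answer above) that logs the skew between the scheduled time and the moment each request is actually started; it assumes SendRequest returns a Task, as in the question's edit:
private static async Task SendAtAsync(DateTime sendTime)
{
    TimeSpan delay = sendTime - DateTime.Now;
    if (delay > TimeSpan.Zero)
        await Task.Delay(delay);

    // How late the request is relative to its scheduled time
    TimeSpan skew = DateTime.Now - sendTime;
    Console.WriteLine($"{sendTime:HH:mm:ss.fff} started {skew.TotalMilliseconds:F0} ms late");

    await SendRequest();
}

// Usage: one task per scheduled time, awaited together
// await Task.WhenAll(times.Select(SendAtAsync));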
I would suggest using a timer object to trigger the requests:
// In Form_Load or another init method
Timer tRequest = new Timer();
tRequest.Interval = 500;
tRequest.Tick += TRequest_Tick;
tRequest.Start();
private void TRequest_Tick(object sender, EventArgs e)
{
// Take every scheduled time that is now due, and remove it so it is only sent once
var sendTimes = times.Where(t => t <= DateTime.Now).ToList();
foreach (DateTime sendTime in sendTimes)
{
SendRequest();
times.Remove(sendTime);
}
}
I have a Windows service that reads data from the database and processes this data using multiple REST API calls.
Originally, this service ran on a timer: it would read unprocessed data from the database and process it on multiple threads, limited with a SemaphoreSlim. This worked well, except that the database read had to wait for all processing to finish before reading again.
ServicePointManager.DefaultConnectionLimit = 10;
Original that works:
// Runs every 5 seconds on a timer
private void ProcessTimer_Elapsed(object sender, ElapsedEventArgs e)
{
var hasLock = false;
try
{
Monitor.TryEnter(timerLock, ref hasLock);
if (hasLock)
{
ProcessNewData();
}
else
{
log.Info("Failed to acquire lock for timer."); // This happens all of the time
}
}
finally
{
if (hasLock)
{
Monitor.Exit(timerLock);
}
}
}
public void ProcessNewData()
{
var unprocessedItems = GetDatabaseItems();
if (unprocessedItems.Count > 0)
{
var downloadTasks = new Task[unprocessedItems.Count];
var maxThreads = new SemaphoreSlim(semaphoreSlimMinMax, semaphoreSlimMinMax); // semaphoreSlimMinMax = 10 is max threads
for (var i = 0; i < unprocessedItems.Count; i++)
{
maxThreads.Wait();
var iClosure = i;
downloadTasks[i] =
Task.Run(async () =>
{
try
{
await ProcessItemsAsync(unprocessedItems[iClosure]);
}
catch (Exception ex)
{
// handle exception
}
finally
{
maxThreads.Release();
}
});
}
Task.WaitAll(downloadTasks);
}
}
To improve efficiency, I rewrote the service to run GetDatabaseItems on a separate thread from the rest, with a ConcurrentDictionary of unprocessed items between them that GetDatabaseItems fills and ProcessNewData empties.
The problem is that while 10 unprocessed items are sent to ProcessItemsAsync, they are processed two at a time instead of all 10.
The code inside ProcessItemsAsync calls var response = await client.SendAsync(request); and that is where the delay occurs. All 10 threads make it to this code but come out of it two at a time. None of this code changed between the old version and the new.
Here is the code in the new version that did change:
public void Start()
{
ServicePointManager.DefaultConnectionLimit = maxSimultaneousThreads; // 10
// Start getting unprocessed data
getUnprocessedDataTimer.Interval = getUnprocessedDataInterval; // 5 seconds
getUnprocessedDataTimer.Elapsed += GetUnprocessedData; // writes data into a ConcurrentDictionary
getUnprocessedDataTimer.Start();
cancellationTokenSource = new CancellationTokenSource();
// Create a new thread to process data
Task.Factory.StartNew(() =>
{
try
{
ProcessNewData(cancellationTokenSource.Token);
}
catch (Exception ex)
{
// error handling
}
}, TaskCreationOptions.LongRunning
);
}
private void ProcessNewData(CancellationToken token)
{
// Check if task has been canceled.
while (!token.IsCancellationRequested)
{
if (unprocessedDictionary.Count > 0)
{
try
{
var throttler = new SemaphoreSlim(maxSimultaneousThreads, maxSimultaneousThreads); // maxSimultaneousThreads = 10
var tasks = unprocessedDictionary.Select(async item =>
{
await throttler.WaitAsync(token);
try
{
// Remove the entry from the shared dictionary before processing it
if (unprocessedDictionary.TryRemove(item.Key, out var value))
{
await ProcessItemsAsync(value);
}
}
catch (Exception ex)
{
// handle error
}
finally
{
throttler.Release();
}
});
Task.WhenAll(tasks);
}
catch (OperationCanceledException)
{
break;
}
}
Thread.Sleep(1000);
}
}
Environment
.NET Framework 4.7.1
Windows Server 2016
Visual Studio 2019
Attempts to fix:
I tried the following with the same bad result (two await client.SendAsync(request) completing at a time):
Set Max threads and ServicePointManager.DefaultConnectionLimit to 30
Manually create threads using Thread.Start()
Replace async/await pattern with sync HttpClient calls
Call the data processing using Task.Run(async () => ...) and Task.WaitAll(downloadTasks)
Replace the new long-running thread for ProcessNewData with a timer
What I want is to run GetUnprocessedData and ProcessNewData concurrently with an HttpClient connection limit of 10 (set in config) so that 10 requests are processed at the same time.
Note: the issue is similar to "HttpClient.GetAsync executes only 2 requests at a time?", but here the DefaultConnectionLimit is increased and the service runs on Windows Server. It also creates more than 2 connections when the original code runs.
Update
I went back to the original project to make sure it still worked; it did. I added a new timer to perform some unrelated operations, and the HttpClient issue came back. I removed the timer and everything worked. I added a new thread to do parallel processing, and the problem came back.
This is not a direct answer to your question, but a suggestion for simplifying your service that could make debugging any problem easier. My suggestion is to implement the producer-consumer pattern, using an iterator for producing the unprocessed items and a parallel loop for consuming them. Ideally the parallel loop would have async delegates, but since you are targeting the .NET Framework you don't have access to the .NET 6 method Parallel.ForEachAsync. So I will suggest the slightly wasteful approach of using a synchronous parallel loop that blocks threads. You could use either the Parallel.ForEach method or PLINQ, as in the example below:
private IEnumerable<Item> Iterator(CancellationToken token)
{
while (true)
{
Task delayTask = Task.Delay(5000, token);
foreach (Item item in GetDatabaseItems()) yield return item;
delayTask.GetAwaiter().GetResult();
}
}
public void Start()
{
//...
ThreadPool.SetMinThreads(degreeOfParallelism, Environment.ProcessorCount);
new Thread(() =>
{
try
{
Partitioner
.Create(Iterator(token), EnumerablePartitionerOptions.NoBuffering)
.AsParallel()
.WithDegreeOfParallelism(degreeOfParallelism)
.WithCancellation(token)
.ForAll(item => ProcessItemAsync(item).GetAwaiter().GetResult());
}
catch (OperationCanceledException) { } // Ignore
}).Start();
}
The Iterator fetches unprocessed items from the database in batches, and yields them one by one. The database won't be hit more frequently than once every 5 seconds.
The PLINQ query is going to fetch a new item from the Iterator each time it has a worker available, according to the WithDegreeOfParallelism policy. The setting EnumerablePartitionerOptions.NoBuffering ensures that it won't try to fetch more items in advance.
The ThreadPool.SetMinThreads is used in order to boost the availability of ThreadPool threads, since the PLINQ is going to use lots of them. Without it the ThreadPool will not be able to satisfy the demand immediately, although it will gradually inject more threads and eventually will catch up. But since you already know how many threads you'll need, you can configure the ThreadPool from the start.
In case you dislike the idea of blocking threads, you can find a simple substitute for Parallel.ForEachAsync here, based on the TPL Dataflow library. It requires installing a NuGet package.
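For illustration only, here is a rough sketch of how the consuming side could look with an ActionBlock from TPL Dataflow, reusing the Iterator and ProcessItemAsync methods from above; the limit of 10 mirrors the question's connection limit, and the method name is mine:
// requires: using System.Threading.Tasks.Dataflow; (NuGet package System.Threading.Tasks.Dataflow)
private async Task ConsumeWithDataflowAsync(CancellationToken token)
{
    var processor = new ActionBlock<Item>(
        async item => await ProcessItemAsync(item),
        new ExecutionDataflowBlockOptions
        {
            MaxDegreeOfParallelism = 10, // at most 10 items in flight at a time
            BoundedCapacity = 10,        // the producer waits instead of buffering everything
            CancellationToken = token
        });

    // Producer: feed unprocessed items; SendAsync waits while the block is full
    foreach (Item item in Iterator(token))
        await processor.SendAsync(item, token);

    processor.Complete();
    await processor.Completion;
}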
The issue turned out to be the place where ServicePointManager.DefaultConnectionLimit is set.
In the version where HttpClient was only doing two requests at a time, ServicePointManager.DefaultConnectionLimit was being set before the threads were being created but after the HttpClient was initialized.
Once I moved it into the constructor before the HttpClient is initialized, everything started working.
Thank you very much to @Theodor Zoulias for the help.
TLDR; Set ServicePointManager.DefaultConnectionLimit before initializing the HttpClient.
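As a minimal sketch of the ordering that matters (the class and field names are illustrative, not from the original service):
public class DataProcessingService
{
    private readonly HttpClient client;

    public DataProcessingService()
    {
        // Raise the per-host connection limit before the HttpClient sends anything;
        // ServicePoints that already exist keep the limit they were created with (default 2).
        ServicePointManager.DefaultConnectionLimit = 10;

        client = new HttpClient();
    }
}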
I'm a newbie with async, and I'm making a WPF app for scraping and API-call purposes. The WPF UI is needed only for service monitoring and settings control, so all services will run simultaneously in the background, and most of them do similar work.
For this one I need to implement a strategy like this:
Start worker on threadpool
Worker must send request and process response from Website
2.1 If the response is processed and new data appeared - raise an event
2.2 If the request failed - handle the error
2.3 If the error percentage over the last x requests is too high - stop the worker
No matter whether the last request completed, failed, or is still running, we must send another request
The next request should be sent no earlier than the configured delay, but should not exceed that delay by too much (as far as possible).
private Task _workTask = Task.CompletedTask; // completed so that RunWorker can start the first run
private List<ScrapeParameters> _scrapeParams = new();
public event EventHandler<ScrapedEventArgs>? NewDataScraped;
//Can I run Worker like this?
public void RunWorker()
{
if (_workTask.IsCompleted)
_workTask = WorkAsync(_token);
}
private async Task WorkAsync(CancellationToken cancelToken)
{
List<Task> processTasks = new();
while(true)
{
if(cancelToken.IsCancellationRequested) return;
//Delay could be from 0.5 second to any value
var delayTask = Task.Delay(WorkerDelay);
var completedTasks = processTasks.Where(t => t.IsCompleted);
var setToHandle = new HashSet<Task>(completedTasks);
foreach(var task in setToHandle)
{
//Theoretical logic to handle errors and completion
if(task.IsFaulted)
HandleFaultedTask(task);
else
CountCompleted();
processTasks.Remove(task);
}
//Theoretical logic to obtain the desired parameters.
var currParameters = GetParameters();
processTasks.Add(ProcessAsync(currParameters, cancelToken));
await delayTask;
}
}
//This method usually takes around 2-4 seconds
private async Task ProcessAsync(ScrapeParameters parameters, CancellationToken cancelToken)
{
//Some work with http requests
var response = await Client.GetAsync(parameters.ToUri());
...
//Processing response
...
if(newData != null)
NewDataScraped?.Invoke(this, new(newData));
}
Does my implementation match the TAP pattern?
In particular I would like feedback on RunWorker() and setToHandle.
I am working on a Xamarin.Forms Android application that requires the phone to ping the server every 15 seconds. All my calls are asynchronous and all of them are awaited; this includes the main user functions in the main classes that are not inside Device.StartTimer callbacks, for example registering data on a button click, logging in, and logging out. For the 15-second ping I am using Device.StartTimer, and I have not had too many issues, but at times I notice that the responses overlap. I thought awaiting would have taken care of the overlapping responses, because I read that Device.StartTimer runs on the main thread. What am I doing wrong? Is there a better way to manage timed HttpClient calls?
I tried awaiting the calls to make sure they do not overlap. There is a note that Device.StartTimer runs on the main thread, so I thought ALL my async/await functions would be respected, including the async functions in the main classes.
//Function to ping to the server every 15 seconds
private void StartOfflineTimer()
{
Device.StartTimer(TimeSpan.FromSeconds(15.0), () =>
{
if(timerOffline)
{
Task.Run(async () =>
{
    // Fire-and-forget: the timer callback does not wait for this task,
    // so a slow ping can overlap with the next tick
    if (await InformarOfflineAsync(Settings.AccessToken, idRutaOffline))
    {
        DependencyService.Get<ILogUtils>().GuardarLine("**Device conected..");
    }
    else
    {
        DependencyService.Get<ILogUtils>().GuardarLine("**Device disconnected..");
    }
});
}
return timerOffline;
});
}
//Default example of how I handle ALL HttpClient calls in the app, including calls that are in the main classes and not embedded in a device timer. All of these calls are inside their own public async Task<ExampleObject> method. Once again, all of the functions that make these calls are awaited.
var jsonRequest = await Task.Run(() => JsonConvert.SerializeObject(requestObj));
var httpContent = new StringContent(jsonRequest, Encoding.UTF8, "application/json");
using (var httpClient = new HttpClient())
{
httpClient.Timeout = TimeSpan.FromSeconds(10.0);
var httpResponse = await httpClient.PostAsync(Constants.BaseUrl + "login/validarOffline", httpContent);
ExampleObjectResponseObj exampleObject = new ExampleObjectResponseObj();
var responseContent = await httpResponse.Content.ReadAsStringAsync();
exampleObject = JsonConvert.DeserializeObject<ExampleObjectResponseObj>(responseContent);
return exampleObject;
}
The HttpClient responses may overlap, or at times they double up and are sent at exactly the same time.
If you don't want your calls to overlap, don't use a timer; use a loop with a delay:
Task.Run(async () =>
{
TimeSpan checkInterval = TimeSpan.FromSeconds(15);
while (true)
{
var callTime = DateTime.UtcNow;
try
{
await server.Ping();
}
catch (Exception exception)
{
HandleException(exception);
}
var elapsedTime = DateTime.UtcNow - callTime;
var timeToWait = checkInterval - elapsedTime;
if (timeToWait > TimeSpan.Zero)
{
await Task.Delay(timeToWait);
}
}
});
The code above isn't complete and isn't enough for a very detailed and precise answer, but most if not all of the questions can still be addressed:
You run your timer callback in Task.Run, so it is NOT running on the main thread
If you ran the HttpClient call on the UI thread it might prevent the overlap, but expect your app to become completely unresponsive
To prevent the overlap you can use several methods, but most likely you are looking for SemaphoreSlim; a rough sketch follows
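A rough sketch of how a SemaphoreSlim could guard the ping so a new one is skipped while the previous one is still in flight; it reuses the names from the question, and the skip-if-busy policy is my assumption:
private static readonly SemaphoreSlim pingLock = new SemaphoreSlim(1, 1);

Device.StartTimer(TimeSpan.FromSeconds(15.0), () =>
{
    Task.Run(async () =>
    {
        // If the previous ping hasn't finished, skip this tick instead of overlapping
        if (!await pingLock.WaitAsync(0))
            return;
        try
        {
            await InformarOfflineAsync(Settings.AccessToken, idRutaOffline);
        }
        finally
        {
            pingLock.Release();
        }
    });
    return timerOffline;
});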
I'm trying to figure out the best way to implement a delay into a Task such that, after the delay, it calls itself again to attempt the same work.
My application is a server that generates reports from the database after the mobile devices sync their data with the server. However, if another user has called the report generation method recently, I want it to pause for a period of time and then attempt to run again.
This is my current attempt
private static DateTime _lastRequest = DateTime.MinValue;
public async void IssueReports()
{
await Task.Run(() =>
{
if (DateTime.Now < _lastRequest + TimeSpan.FromMinutes(3)) //checks to see when a user last completed this method
{
Task.Delay(TimeSpan.FromMinutes(2));
IssueReports(); //calls itself again after the delay
return;
}
});
//code to generate reports goes here
_lastRequest = DateTime.Now; //updates last request into the static variable after it has finished running
}
Initially, if it failed the check, the task would just end. This prevented two users from hitting the database at the same time and causing duplicate reports to be generated. However, the problem is that if two users sync within that same window, the second user's reports wouldn't be sent until another sync call is done.
The delay is supposed to give the server time to finish generating the reports and updating the database before the next batch is requested by calling itself.
Am I overcomplicating things? I'm worried about potentially hammering system resources with multiple loops in the event that the reports take a long time to process.
The following example runs a background service every 10 seconds in a loop. This approach is recommended only if you believe your task will complete within 10 seconds.
public frm_testform()
{
InitializeComponent();
dispatcherTimer_Tick().DoNotAwait();
}
private async Task dispatcherTimer_Tick()
{
DispatcherTimer timer = new DispatcherTimer();
TaskCompletionSource<bool> tcs = null;
EventHandler tickHandler = (s, e) => tcs.TrySetResult(true);
timer.Interval = TimeSpan.FromSeconds(10);
timer.Tick += tickHandler;
timer.Start();
while (true)
{
tcs = new TaskCompletionSource<bool>();
await Task.Run(() =>
{
    // Run your background service here (update the UI back on the UI thread)
});
await tcs.Task; // wait for the next 10-second tick before looping again
}
}
I need to add retry logic to a web request in case a write to the db fails. I'd like to try two more times with a delay of 500 ms before quitting. There is no requirement to block the user response during the retries. My question is: once the response has been sent, is the timer still alive to complete the retries, or is it killed immediately after the response?
Why not make use of async/await? It will probably make things a little easier:
public SomeResult HandleWebRequest()
{
StartThing(); //not awaited - fire-and-forget
//return a response
}
public async void StartThing()
{
//await Task.Yield(); //return control to call site and finish asynchronously
for (var numTries = 0; numTries < 3; numTries++)
{
    if (TrySomethingThatMightNotWork()) break; // e.g. the db write; stop retrying on success
    await Task.Delay(500); // 500 ms between attempts, per the question
}
}