I am creating a bot for a website. The system logs in with several accounts and performs a certain action. The problem comes when it is time to repeat the process: for example, I have 10 accounts and I would like every account to go through the same process as the first. The accounts are stored in a txt file. What is the correct way to do this?
Sometimes a function runs ahead of time. I'm new to C# and still studying.
My looping code:
Task.Delay(2000).ContinueWith(t => setMail());
Task.Delay(3500).ContinueWith(t => nextButton());
Task.Delay(5000).ContinueWith(t => setPass());
Task.Delay(6500).ContinueWith(t => logionButton());
Task.Delay(7500).ContinueWith(t => SucessLogin());
You are creating 5 independent tasks which will all run together, rather than one after the other. Instead, collapse them into a single async/await method:
async Task TestStuff(Account account)
{
    await Task.Delay(2000);
    setMail();
    await Task.Delay(3500);
    nextButton();
    await Task.Delay(5000);
    setPass();
    await Task.Delay(6500);
    logionButton();
    await Task.Delay(7500);
    SucessLogin();
}
You mentioned that you've got accounts stored in a file. In this example you'll need to create an Account class and populate it with the information you get from the file. For example:
List<Account> accounts = LoadAccounts("some-file.txt");
foreach (var account in accounts)
{
    await TestStuff(account);
}
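As a rough sketch of what that loading step might look like (the Account class shape and a one-account-per-line "email:password" file format are assumptions here, not something stated in the question):
using System.Collections.Generic;
using System.IO;
using System.Linq;

public class Account
{
    public string Email { get; set; }
    public string Password { get; set; }
}

public static List<Account> LoadAccounts(string path)
{
    // Assumed format: one account per line, "email:password".
    return File.ReadAllLines(path)
        .Where(line => !string.IsNullOrWhiteSpace(line))
        .Select(line =>
        {
            var parts = line.Split(':');
            return new Account { Email = parts[0], Password = parts[1] };
        })
        .ToList();
}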
In my WebApi project I have an endpoint that should delay a small block of code.
Let's make an example:
I need to implement a mechanism that permits the client to book a resource. The booking should only last 120 seconds and then expire.
In terms of code I have something like this:
//booking
foreach (var item in list)
{
    item.Status = ItemStatus.Booked;
}
await _context.SaveChangesAsync();

//Setup a delayed "thread" which removes the booking
Task.Delay(TimeSpan.FromMinutes(2))
    .ContinueWith(x =>
    {
        //loop on the list and set the status to "ReadyForSale"
        //if this is still in the Booking
    });

return;
I would like to understand whether a solution like this satisfies my requirement. The current thread should not be blocked by the delayed task, and I need to find a way to pass the list of items to the delayed task.
I think there is a risk that the task will never run. What if the server crashes, or IIS decides it needs to recycle the application pool?
There is a risk that the state would never be restored.
I would probably set a bookedAt DateTime field in the database and then check whether that time, compared to now, is more than two minutes in the past to determine if the item is booked or not. Perhaps even a computed column that checks this and returns the state.
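A minimal sketch of that idea, assuming an Item entity you can add a BookedAt column to (the names here are illustrative, not from the question):
public class Item
{
    public int Id { get; set; }
    public ItemStatus Status { get; set; }
    public DateTime? BookedAt { get; set; }

    // The item only counts as booked while the 120-second window is open,
    // so nothing is lost if the server crashes or the app pool recycles.
    public bool IsBooked =>
        Status == ItemStatus.Booked &&
        BookedAt.HasValue &&
        DateTime.UtcNow - BookedAt.Value < TimeSpan.FromSeconds(120);
}

// Booking then just stamps the time; no delayed task is needed.
foreach (var item in list)
{
    item.Status = ItemStatus.Booked;
    item.BookedAt = DateTime.UtcNow;
}
await _context.SaveChangesAsync();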
Hi, I have code like the below in a for loop. It is in a for loop because the number of times it needs to run varies depending on how many items the user has added.
var taskList = new List<Task<IEnumerable<MyObject>>>();
for (int i = 0; i < numOfBatches; i++)
{
    var task = Task.Factory.StartNew(() => MyMethod(variableA, variableB));
    taskList.Add(task);
}

//Wait for all the tasks to complete
Task.WaitAll(taskList.Cast<Task>().ToArray());

return taskList.SelectMany(x => x.Result);
Is there a better way I can run these tasks in parallel? I was thinking about a Parallel.ForEach loop, but because the number of iterations of the loop isn't fixed, I don't think I can use one.
There isn't necessarily a problem with the code. However, if 10,000 items are inputted it takes about 18 minutes, and I was thinking that running the tasks in parallel might make it return faster. If 10,000 items are inputted, the number of batches will be 10,000 / 25 = 400.
The actual code in MyMethod calls a 3rd-party external service to return data based on data entered by the user.
Processing a list in parallel is about the easiest of parallel algorithms there is:
ParallelEnumerable.Range(0, numOfBatches)
    .Select(_ => MyMethod(variableA, variableB))
    .ToList();
It is a code smell to create an unbounded number of tasks: it can lead to resource exhaustion, and the code becomes clumsy.
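Since MyMethod apparently calls an external service (I/O-bound rather than CPU-bound work), you may also want to set the degree of parallelism explicitly rather than rely on PLINQ's default. A hedged variant (the value 16 is an arbitrary starting point to tune, not a recommendation):
// Sketch: flatten the per-batch results and cap concurrency at 16 in-flight calls.
var results = ParallelEnumerable.Range(0, numOfBatches)
    .WithDegreeOfParallelism(16)
    .Select(_ => MyMethod(variableA, variableB))
    .SelectMany(batch => batch)
    .ToList();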
I have an ASP.NET app where a single request invokes 6 very slow methods. The methods are not async and I don't have the time to rewrite and test them. How can I run those 6 methods on 6 threads and then aggregate the results? I'm on .NET 4.5.
You can simply use Task.Run to create a task that runs each of the methods in another thread, and then wait for them all to finish so that you can use the results.
var tasks = new Task<YourResultType>[]
{
    Task.Run(() => Method1()),
    Task.Run(() => Method2()),
    Task.Run(() => Method3()),
    Task.Run(() => Method4()),
    Task.Run(() => Method5()),
    Task.Run(() => Method6()),
};

var results = Task.WhenAll(tasks).Result;
If the methods don't all return results of the same type (which is what allows you to put all of the tasks into one array), then you'll need a separate local variable for each task and to read Result on each one after starting them all.
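For example, a short sketch of that mixed-types case (the int and string return types are just placeholders for illustration):
// Each method returns a different type, so each task gets its own variable.
var t1 = Task.Run(() => Method1());   // assume Method1 returns int
var t2 = Task.Run(() => Method2());   // assume Method2 returns string

Task.WaitAll(t1, t2);                 // block until both have finished

int first = t1.Result;
string second = t2.Result;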
private async void btnLoadFile_Click(object sender, EventArgs e)
{
    if (AccountsFile.ShowDialog() == DialogResult.OK)
    {
        Accounts = File.ReadAllLines(AccountsFile.FileName);
        foreach (string str in Accounts)
        {
            await LoadAccount(str);
        }
    }
}
I've run into a problem. I know how asynchronous programming works: it will wait for the task to complete, but LoadAccount() will never complete because it calls a function with a never-ending while loop, so it never reaches the next string in Accounts.
I don't know where to start with this problem. Any solutions?
Instead of waiting for each account successively, you could wait for them collectively. This way, even if one of your accounts enters an infinite loop, the others could still proceed to load.
Accounts = File.ReadAllLines(AccountsFile.FileName);
Task completionTask = Task.WhenAll(Accounts.Select(LoadAccount));
You would typically want to store completionTask in a class variable. Subsequently, when you break out of the indefinite while loop within your LoadAccount calls (for example, by signalling cancellation via a polled CancellationToken), you can use this completionTask to wait for all your tasks to complete.
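A rough sketch of how those pieces could fit together, assuming a CancellationTokenSource field and that LoadAccount's inner loop can poll it (none of these names come from the question):
private CancellationTokenSource _cts = new CancellationTokenSource();
private Task completionTask;

private async Task LoadAccount(string str)
{
    // The loop polls the token instead of running forever.
    while (!_cts.Token.IsCancellationRequested)
    {
        // ... per-account work ...
        await Task.Delay(1000);
    }
}

// In btnLoadFile_Click:
//     completionTask = Task.WhenAll(Accounts.Select(LoadAccount));
// Later, e.g. in a Stop button handler:
//     _cts.Cancel();
//     await completionTask;   // all LoadAccount loops have now exited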
I have a list of sites for which log files are generated. These logs have to be robocopied, unzipped, parsed, and analysed with Ruby code by running the respective processes.
Can anybody suggest the best way to run these processes in parallel for all of the sites' logs?
Assuming your data model looks something like this:
class Website
{
    public List<WebSiteLog> Logs;
}
A possible parallel solution using TPL (Task Parallel Library) is something like this:
// var sites = your sites list
var processTasks = sites.Select(site =>
    Task.Factory.StartNew(() =>
    {
        site.UnzipLogs();
    })
    .ContinueWith(unzipTask =>
    {
        site.ParseLogs();
    })
    .ContinueWith(parseTask =>
    {
        site.AnalyzeLogs();
    }))
    .ToArray();

Task.WaitAll(processTasks);
This is a very initial solution. Exception management, partitioning, and even more parallelizing inside UnzipLogs, ParseLogs and AnalyzeLogs are all applicable.
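As one example of the exception-management part, a minimal sketch around the WaitAll call above (note that with this ContinueWith chain only exceptions thrown by the final AnalyzeLogs step reach WaitAll; earlier stages would need their own handling):
try
{
    Task.WaitAll(processTasks);
}
catch (AggregateException ex)
{
    // WaitAll bundles the failures of the awaited tasks into one AggregateException.
    foreach (var inner in ex.Flatten().InnerExceptions)
    {
        Console.WriteLine("Log pipeline failed: " + inner.Message);
    }
}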