Hi, I have code like the one below inside a for loop. It is in a for loop because the number of times it needs to run varies depending on how many items the user has added.
var taskList = new List<Task<IEnumerable<MyObject>>>();
for (int i = 0; i < numOfBatches; i++)
{
var task = Task.Factory.StartNew(() => MyMethod(variableA, variableB));
taskList.Add(task);
}
//Wait for all the tasks to complete
Task.WaitAll(taskList.Cast<Task>().ToArray());
return taskList.SelectMany(x => x.Result);
Is there a better way to run these tasks in parallel? I was thinking about a parallel foreach loop, but because the number of iterations isn't fixed I don't think I can use one.
There isn't necessarily a problem with the code. However, if 10,000 items are entered it takes about 18 minutes, and I was thinking that if I could run the tasks in parallel it might return faster. With 10,000 items the number of batches will be 10,000 / 25 = 400.
The actual code in MyMethod calls a third-party external service to return data based on the data entered by the user.
Processing a list in parallel is about the easiest of parallel algorithms there is:
ParallelEnumerable.Range(0, numOfBatches)
.Select(_ => MyMethod(variableA, variableB))
.ToList();
Creating an unbounded number of tasks is a code smell: it can lead to resource exhaustion, and the resulting code is clumsy.
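Also worth noting: the question says MyMethod calls an external third-party service, so the work is I/O-bound rather than CPU-bound. In that case an async approach may help more than extra threads. A minimal sketch, assuming a hypothetical MyMethodAsync wrapper around the service call:

var tasks = Enumerable.Range(0, numOfBatches)
    .Select(_ => MyMethodAsync(variableA, variableB)) // hypothetical async version of MyMethod
    .ToList();

// No threads are blocked while the external service responds;
// Task.WhenAll gathers every batch and SelectMany flattens the results.
var allResults = (await Task.WhenAll(tasks)).SelectMany(r => r);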
I am trying to load messages into a server as fast as possible. I currently have this bit of code, and loading 50,000 messages to a MessagingServer takes around 5 minutes. This is a list of tasks that each hold the server name, the queue name and the message to place on the queue.
taskList.Add(new Task(() => PlaceMessageOnQueue(server, queueName, message)));
This is the current code that I have, but I want to convert it to PLINQ in the hope of making it even faster.
_ = Parallel.ForEach(task, new ParallelOptions()
{
    MaxDegreeOfParallelism = Environment.ProcessorCount // I have 4 cores
}, t =>
{
    t.Start();
    t.Wait();
    Console.WriteLine(t.Status);
});
This is what I have so far, but it isn't starting the tasks. This is the code below that I need help with:
var results = task
    .AsParallel()
    .AsOrdered()
    .WithDegreeOfParallelism(Environment.ProcessorCount)
    .ToList();
You can approach this by consuming the messages in parallel and selecting the result of processing the message, as in this example:
static void Main(string[] args)
{
var messages = Enumerable.Range(0, 10).Select(i => $"Message {i}");
var results = messages
.AsParallel()
.AsOrdered()
.WithDegreeOfParallelism(Environment.ProcessorCount)
.Select(m => PlaceMessageOnQueue("FOO", "BAR", m))
.ToList();
}
static bool PlaceMessageOnQueue(string server, string queue, string message)
{
Console.WriteLine($"Starting {server}/{queue}/{message}");
Thread.Sleep(1000);
Console.WriteLine($"Finished {server}/{queue}/{message}");
return true; // E.g. representing success
}
Note there isn't a reason to believe this will be fundamentally faster than your last approach. You may wish to look at a producer/consumer pattern using Channels. In many use cases, channels outperform similar concurrency options in .NET.
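If you want to try the Channels route, here is a rough sketch (using System.Threading.Channels and the same PlaceMessageOnQueue signature as above; the consumer count is just an assumption to tune):

using System.Threading.Channels;

// One producer writes messages; several consumers drain the channel in parallel.
var channel = Channel.CreateUnbounded<string>();

var consumers = Enumerable.Range(0, Environment.ProcessorCount)
    .Select(_ => Task.Run(async () =>
    {
        await foreach (var message in channel.Reader.ReadAllAsync())
        {
            PlaceMessageOnQueue("FOO", "BAR", message);
        }
    }))
    .ToArray();

// Producer: write all messages, then signal that no more are coming.
foreach (var message in Enumerable.Range(0, 50000).Select(i => $"Message {i}"))
{
    await channel.Writer.WriteAsync(message);
}
channel.Writer.Complete();

await Task.WhenAll(consumers);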
I am creating a bot for a website. The system logs in with several accounts and performs a certain action. The problem is when it is time to repeat the process: for example, I have 10 accounts and I would like every account to go through the same process as the first. The accounts are stored in a txt file. What is the correct way to do this?
Sometimes a function runs ahead of time. I'm new to C# and still learning.
My code for the loop:
Task.Delay(2000).ContinueWith(t => setMail());
Task.Delay(3500).ContinueWith(t => nextButton());
Task.Delay(5000).ContinueWith(t => setPass());
Task.Delay(6500).ContinueWith(t => logionButton());
Task.Delay(7500).ContinueWith(t => SucessLogin());
You are creating 5 independent tasks which will all run together, rather than one after the other. Instead, just collapse them to one async/await function:
async Task TestStuff(Account account)
{
await Task.Delay(2000);
setMail();
await Task.Delay(3500);
nextButton();
await Task.Delay(5000);
setPass();
await Task.Delay(6500);
logionButton();
await Task.Delay(7500);
SucessLogin();
}
You mentioned that you've got accounts stored in a file. In this example you'll need to create an Account class and populate it with the information you get from the file. For example:
List<Account> accounts = LoadAccounts("some-file.txt");
foreach(var account in accounts)
{
await TestStuff(account);
}
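A minimal sketch of what Account and LoadAccounts might look like, assuming each line of the txt file holds an email and a password separated by a semicolon (adjust the parsing to your actual file format):

class Account
{
    public string Email { get; set; }
    public string Password { get; set; }
}

static List<Account> LoadAccounts(string path)
{
    // One account per line, e.g. "user@example.com;secret" (assumed format)
    return File.ReadAllLines(path)
        .Select(line => line.Split(';'))
        .Select(parts => new Account { Email = parts[0], Password = parts[1] })
        .ToList();
}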
I am running an experiment: I have a loop of 100,000 iterations, and inside it a task that does a specific job (writing a log entry to the database). My question is: when I run it, the loop finishes in maybe one second, but the inserts keep arriving afterwards. How does the OS handle them? Will they all be processed, or will some of them be skipped?
I tried an awaited version and it works fine, but I want to know what would happen if this code were on a server that received 100,000 requests.
The Code:
for (int i = 0; i < 100000; i++)
{
Task.Run(() => log.WriteToLog(i + "", new Models.CustomerModel.CustomerModel()));
}
I am not looking for alternative approaches; I need to know the behaviour and how the OS handles this code (is there a queue, will only some of them run, etc.).
PS: I know it's not a good approach.
1 second is a bit quick. I suspect you are not logging 100000 entries properly and entries are being lost.
Assuming that code was the only code inside say a console app's main(), then because you don't await any of the tasks, it is entirely possible your process is exiting before the logging is complete.
Change:
Task.Run(() => log.WriteToLog(i + "",
new Models.CustomerModel.CustomerModel()));
...to:
await Task.Run(() => log.WriteToLog(i + "",
new Models.CustomerModel.CustomerModel()));
Also, as ckuri mentions in the comments, spawning a great deal of tasks in a tight loop probably isn't a good idea. Consider batching the logging or using IOCP.
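If you want the loop itself to stay fast but still guarantee that nothing is lost when the process exits, one option (just a sketch, not necessarily the best design) is to collect the tasks and await them all at the end:

var tasks = new List<Task>();
for (int i = 0; i < 100000; i++)
{
    int id = i; // capture a copy so each task logs its own value
    tasks.Add(Task.Run(() => log.WriteToLog(id + "", new Models.CustomerModel.CustomerModel())));
}

// The method only continues once every log call has completed,
// so the process cannot exit with entries still pending.
await Task.WhenAll(tasks);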
So far I have only written single-core programs. Now I want to improve performance and I'm trying to find out how to pull and push data in parallel. But I'm not even sure I have the right idea about multithreading.
Actually it is a pretty simple case: I pull data from an external interface, rework it so that it is in the right order, and push it into an OpenGL pipeline in order to draw it.
I use a WPF application as the GUI and render my data with SharpGL (an OpenGL wrapper). The program runs on a dual-core processor.
Here is a sketch of my vision.
So my idea is to use a buffer array. The key question: how can I write to and read from the same array from different threads?
I was advised to look into OpenMP, but it turned out that it is not a good fit for .NET and C#.
So could you recommend some suitable reading? Maybe an explanation of how to use the Task Parallel Library (TPL) for this case.
The correct description for this is the producer/consumer pattern. In .NET you can do this using TPL Dataflow.
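As a rough illustration of the Dataflow route, an ActionBlock can act as the consumer while the pulling side posts items into it (the lambda here is just a stand-in for your reordering/OpenGL work):

using System.Threading.Tasks.Dataflow;

// Consumer: processes posted items, two at a time on a dual-core machine.
var worker = new ActionBlock<int>(
    item => Console.WriteLine($"processed {item}"),
    new ExecutionDataflowBlockOptions { MaxDegreeOfParallelism = 2 });

// Producer: push data in as it arrives from the external interface.
for (int i = 0; i < 10; i++)
{
    worker.Post(i);
}

worker.Complete();       // no more items will be posted
await worker.Completion; // wait until everything posted has been processed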
Another implementation can be built using a BlockingCollection. A basic version:
BlockingCollection<int> bc = new BlockingCollection<int>();

async Task Main()
{
    // Fire up two readers (the consumers) and keep the tasks so we can wait for them
    var reader1 = Task.Run(() => ReadCollection1());
    var reader2 = Task.Run(() => ReadCollection2());

    // Add items to process (the producer side)
    bc.Add(5);
    bc.Add(6);
    bc.Add(7);
    bc.Add(8);
    bc.Add(9);

    bc.CompleteAdding(); // Signal we are finished adding items (on close of application for example)

    // Wait for both readers to drain the collection before exiting
    await Task.WhenAll(reader1, reader2);
}

void ReadCollection1()
{
    foreach (var item in bc.GetConsumingEnumerable())
    {
        Console.WriteLine($"1 processed {item}");
    }
}

void ReadCollection2()
{
    foreach (var item in bc.GetConsumingEnumerable())
    {
        Console.WriteLine($"2 processed {item}");
    }
}
In my WebApi project I have an endpoint that should delay a small block of code.
Let me give an example: I need to implement a mechanism that permits the client to book a resource. The booking should only last 120 seconds and then expire.
In terms of code I have something like this:
//booking
foreach (var item in list) {
item.Status = ItemStatus.Booked;
}
await _context.SaveChangesAsync();
//Setup a delayed "thread" which removes the booking
Task.Delay( TimeSpan.FromMinutes( 2 ) )
.ContinueWith( x => {
//loop on the list and set the status to "ReadyForSale"
//if this is still in the Booking
} );
return;
I would like to understand whether a solution like this satisfies my requirement. The current thread should not be blocked by the delayed task, and I need to find a way to pass the list of items to the delayed task.
I think there is a risk that the task will never run. What if the server crashes or IIS decides it needs to recycle the application pool? Then there is the risk that the state would never be restored.
I would probably set a bookedAt DateTime field in the database and then check whether that time is more than two minutes before now to determine if the item is booked or not. Perhaps even a computed column that checks this and returns the state.
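A minimal sketch of that idea, with hypothetical names, assuming an EF-style entity; the booking state is derived from the timestamp, so no delayed task is needed at all:

public class Item
{
    public int Id { get; set; }

    // Set when the client books the item.
    public DateTime? BookedAt { get; set; }

    // Derived every time it is read instead of being stored.
    public bool IsBooked =>
        BookedAt.HasValue && DateTime.UtcNow - BookedAt.Value < TimeSpan.FromMinutes(2);
}

// Booking then becomes a single write and expiry happens implicitly:
foreach (var item in list)
{
    item.BookedAt = DateTime.UtcNow;
}
await _context.SaveChangesAsync();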