running quartz job in TriggerComplete event - c#

My program should run at most N jobs at a time. If more jobs need to run, they are stored in temporary storage; after one of the currently running jobs completes, I pick another trigger based on how many times it has failed to start and its priority, and then fire its job.
At the initialization phase I create, for example, 5 jobs with 5 corresponding triggers and add them to the scheduler. Everything is fine until the second job is running, but TriggerComplete of the trigger listener never fires to pick up another job to run. Could you please tell me where I'm wrong?
public class CrawlerTriggerListener : ITriggerListener
{
private int maxConcurrentCrawling = 1;
private int currentCount = 0;
private object syncLock = new object();
private Dictionary<string, MissfiredInfo> fireDic = new Dictionary<string, MissfiredInfo>();
public string Name { get { return "listener"; } }
public void TriggerFired(Trigger trigger, JobExecutionContext context)
{
if (fireDic.Count == 0)
{
IScheduler sched = context.Scheduler;
string[] triggerNameList = sched.GetTriggerNames("triggerGroup");
foreach (string triggerName in triggerNameList)
{
MissfiredInfo missedInfo = new MissfiredInfo();
missedInfo.TriggerName = triggerName;
missedInfo.Priority = sched.GetTrigger(triggerName, "triggerGroup").Priority;
fireDic.Add(triggerName, missedInfo);
}
}
}
public bool VetoJobExecution(Trigger trigger, JobExecutionContext context)
{
lock (syncLock)
{
if (currentCount < maxConcurrentCrawling)
{
currentCount++;
fireDic[trigger.Name].FailCount = 0;
fireDic[trigger.Name].LastFireTime = DateTime.UtcNow;
return false;
}
else
{
fireDic[trigger.Name].LastFireTime = DateTime.UtcNow;
fireDic[trigger.Name].FailCount++;
return true;
}
}
}
public void TriggerMisfired(Trigger trigger) { }
public void TriggerComplete(Trigger trigger, JobExecutionContext context, SchedulerInstruction triggerInstructionCode)
{
lock (syncLock)
{
currentCount--;
var validCandidate = new Dictionary<string, int>();
foreach (KeyValuePair<string, MissfiredInfo> fireDicItem in fireDic)
if (fireDicItem.Value.FailCount > 0)
validCandidate.Add(fireDicItem.Key, fireDicItem.Value.FailCount * 73 + fireDicItem.Value.Priority);
if (validCandidate.Count > 0)
{
var sorted = (from entry in validCandidate orderby entry.Value ascending select entry);
string triggerName = sorted.First().Key;
fireDic[triggerName].LastFireTime = DateTime.UtcNow;
fireDic[triggerName].FailCount = 0;
string jobName = context.Scheduler.GetTrigger(triggerName, "triggerGroup").JobName;
currentCount++;
context.Scheduler.TriggerJob(jobName, "jobGroup");
}
}
}
}

Okay, so again, I'm not sure where you are instantiating the TriggerListener, but you might want to verify that you are adding the TriggerListener to the Scheduler.
http://quartznet.sourceforge.net/tutorial/lesson_7.html
See that the scheduler instance has a method for "adding" (or registering) listeners. If you don't do that, the events will never fire.
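For reference, here is a minimal registration sketch, assuming the Quartz.NET 1.x API that the question's code is written against (later versions expose this through scheduler.ListenerManager instead, so the exact method name may differ):
// Minimal sketch, assuming Quartz.NET 1.x: register the listener globally so it
// receives TriggerFired/VetoJobExecution/TriggerComplete for every trigger.
ISchedulerFactory factory = new StdSchedulerFactory();
IScheduler sched = factory.GetScheduler();
sched.AddGlobalTriggerListener(new CrawlerTriggerListener());
sched.Start();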


C# Immutable counter for multiple fields

I have a fairly high-throughput message counter (tens of thousands of messages per second) and I'm looking for an efficient way of getting the count without putting locks everywhere, or ideally without locking on each message count, when I only publish an update every 10 seconds.
Use of immutable counter object
I am using an immutable counter class:
public class Counter
{
public Counter(int quotes, int trades)
{
Quotes = quotes;
Trades = trades;
}
readonly public int Quotes;
readonly public int Trades;
// and some other counter fields snipped
}
And would update this on each message process loop:
class MyProcessor
{
System.Timers.Timer timer;
Counter counter = new Counter(0,0);
public MyProcessor()
{
// update every 10 seconds
this.timer = new System.Timers.Timer(10000);
timer.Elapsed += (sender, e) =>
{
var quotesPerSecond = this.counter.Quotes / 10.0;
var tradesPerSecond = this.counter.Trades / 10.0;
this.counter = new Counter(0, 0);
};
timer.Start(); // start the periodic timer
}
public void ProcessMessages(Messages messages)
{
foreach(var message in messages) { /* */ }
var oldCounter = counter;
this.counter = new Counter(oldCounter.Quotes, oldCounter.Trades);
}
}
I have lots of counters (not all shown), so this would mean a lot of individual Interlocked.Increment calls on individual counter fields.
The only other way I can think of is to lock every single run of ProcessMessages, which would be extensive and heavy for something that is a utility rather than critical code that would crash the program without it.
Is it possible to use an immutable counter object in this fashion without hard interlocking/thread mechanisms when we only need to update once every 10 seconds?
Flag check idea to avoid locks
Could the timer thread set a flag for ProcessMessages to check, and if it sees it set, start the count from zero again? I.e.:
/* snipped the MyProcessor class, same as before */
System.Timers.Timer timer;
Counter counter = new Counter(0,0);
ManualResetEvent reset = new ManualResetEvent(false);
public MyProcessor()
{
// update every 10 seconds
this.timer = new System.Timers.Timer(10000);
timer.Elapsed += (sender, e) =>
{
var quotesPerSecond = this.counter.Quotes / 10.0;
var tradesPerSecond = this.counter.Trades / 10.0;
// log
this.reset.Set();
};
timer.Start(); // start the periodic timer
}
// this should be called every second with a heartbeat message posted to queue
public void ProcessMessages(Messages messages)
{
if (reset.WaitOne(0) == true)
{
this.counter = new Counter(this.counter.Quotes, this.counter.Trades, this.counter.Aggregates);
reset.Reset();
}
else
{
this.counter = new Counter(
this.counter.Quotes + message.Quotes.Count,
this.counter.Trades + message.Trades.Count);
}
}
/* end of MyProcessor class */
This would work; however, the update "stalls" when message processing comes to a halt (although the throughput is very high, it does pause for a number of hours at night, and ideally the log should show the actual rate rather than the last value).
One way around this would be to post a heartbeat message to MyProcessor.ProcessMessages() every second to force an internal update of the message counters and a subsequent reset when the reset ManualResetEvent is set.
Here are three new methods for your Counter class: one for reading the latest value from a specific location, one for safely updating a specific location, and one for easily creating a new Counter based on an existing one:
public static Counter Read(ref Counter counter)
{
return Interlocked.CompareExchange(ref counter, null, null);
}
public static void Update(ref Counter counter, Func<Counter, Counter> updateFactory)
{
var counter1 = counter;
while (true)
{
var newCounter = updateFactory(counter1);
var counter2 = Interlocked.CompareExchange(ref counter, newCounter, counter1);
if (counter2 == counter1) break;
counter1 = counter2;
}
}
public Counter Add(int quotesDelta, int tradesDelta)
{
return new Counter(Quotes + quotesDelta, Trades + tradesDelta);
}
Usage example:
Counter latest = Counter.Read(ref this.counter);
Counter.Update(ref this.counter, existing => existing.Add(1, 1));
Accessing the MyProcessor.counter field directly by multiple threads concurrently is not thread-safe, because it's neither volatile nor protected by a lock. The above methods are safe to use because they are accessing the field through interlocked operations.
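One further hedged sketch, not part of the answer above: the timer callback could snapshot and reset the counter in a single atomic step, so no increments are lost between reading the totals and zeroing them.
// Hypothetical timer callback body: atomically swap in a fresh counter and read the old one.
Counter snapshot = Interlocked.Exchange(ref this.counter, new Counter(0, 0));
var quotesPerSecond = snapshot.Quotes / 10.0;
var tradesPerSecond = snapshot.Trades / 10.0;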
I wanted to update everyone with what I came up with: the counter updates were pushed into the thread itself.
Everything is driven by the DequeueThread loop, and specifically the this.queue.ReceiveAsync(TimeSpan.FromSeconds(UpdateFrequencySeconds)) call.
This will either return an item from the queue, process it and update the counters, or time out and then update the counters. There are no other threads involved; everything, including updating the message rate, is done within that thread.
In summary, nothing runs in parallel (in terms of dequeuing the packets); items are fetched one at a time, processed, and the counters updated, before looping back to process the next item in the queue.
This removes the need for synchronisation:
internal class Counter
{
public Counter(Action<int,int,int,int> updateCallback, double updateEvery)
{
this.updateCallback = updateCallback;
this.UpdateEvery = updateEvery;
}
public void Poll()
{
if (nextUpdate < DateTimeOffset.UtcNow)
{
// post the stats, and reset
this.updateCallback(this.quotes, this.trades, this.aggregates, this.statuses);
this.quotes = 0;
this.trades = 0;
this.aggregates = 0;
this.statuses = 0;
nextUpdate = DateTimeOffset.UtcNow.AddSeconds(this.UpdateEvery);
}
}
public void AddQuotes(int count) => this.quotes += count;
public void AddTrades(int count) => this.trades += count;
public void AddAggregates(int count) => this.aggregates += count;
public void AddStatuses(int count) => this.statuses += count;
private int quotes;
private int trades;
private int aggregates;
private int statuses;
private readonly Action<int,int,int,int> updateCallback;
public double UpdateEvery { get; private set; }
private DateTimeOffset nextUpdate;
}
public class DeserializeWorker
{
private readonly BufferBlock<byte[]> queue = new BufferBlock<byte[]>();
private readonly IPolygonDeserializer polygonDeserializer;
private readonly ILogger<DeserializeWorker> logger;
private readonly Counter counter;
const double UpdateFrequencySeconds = 5.0;
long maxBacklog = 0;
public DeserializeWorker(IPolygonDeserializer polygonDeserializer, ILogger<DeserializeWorker> logger)
{
this.polygonDeserializer = polygonDeserializer ?? throw new ArgumentNullException(nameof(polygonDeserializer));
this.logger = logger;
this.counter = new Counter(ProcesCounterUpdateCallback, UpdateFrequencySeconds);
}
public void Add(byte[] data)
{
this.queue.Post(data);
}
public Task Run(CancellationToken stoppingToken)
{
return Task
.Factory
.StartNew(
async () => await DequeueThread(stoppingToken),
stoppingToken,
TaskCreationOptions.LongRunning,
TaskScheduler.Default)
.Unwrap();
}
private async Task DequeueThread(CancellationToken stoppingToken)
{
while (stoppingToken.IsCancellationRequested == false)
{
try
{
var item = await this.queue.ReceiveAsync(TimeSpan.FromSeconds(UpdateFrequencySeconds), stoppingToken);
await ProcessAsync(item);
}
catch (TimeoutException)
{
// this is ok, timeout expired
}
catch(TaskCanceledException)
{
break; // task cancelled, break from loop
}
catch (Exception e)
{
this.logger.LogError(e.ToString());
}
UpdateCounters();
}
await StopAsync();
}
protected async Task StopAsync()
{
this.queue.Complete();
await this.queue.Completion;
}
protected void ProcessStatuses(IEnumerable<Status> statuses)
{
Parallel.ForEach(statuses, (current) =>
{
if (current.Result != "success")
this.logger.LogInformation($"{current.Result}: {current.Message}");
});
}
protected void ProcessMessages<T>(IEnumerable<T> messages)
{
Parallel.ForEach(messages, (current) =>
{
// serialize by type T
// dispatch
});
}
async Task ProcessAsync(byte[] item)
{
try
{
var memoryStream = new MemoryStream(item);
var message = await this.polygonDeserializer.DeserializeAsync(memoryStream);
var messagesTask = Task.Run(() => ProcessStatuses(message.Statuses));
var quotesTask = Task.Run(() => ProcessMessages(message.Quotes));
var tradesTask = Task.Run(() => ProcessMessages(message.Trades));
var aggregatesTask = Task.Run(() => ProcessMessages(message.Aggregates));
this.counter.AddStatuses(message.Statuses.Count);
this.counter.AddQuotes(message.Quotes.Count);
this.counter.AddTrades(message.Trades.Count);
this.counter.AddAggregates(message.Aggregates.Count);
Task.WaitAll(messagesTask, quotesTask, aggregatesTask, tradesTask);
}
catch (Exception e)
{
this.logger.LogError(e.ToString());
}
}
void UpdateCounters()
{
var currentCount = this.queue.Count;
if (currentCount > this.maxBacklog)
this.maxBacklog = currentCount;
this.counter.Poll();
}
void ProcesCounterUpdateCallback(int quotes, int trades, int aggregates, int statuses)
{
var updateFrequency = this.counter.UpdateEvery;
logger.LogInformation(
$"Queue current {this.queue.Count} (max {this.maxBacklog }), {quotes / updateFrequency} quotes/sec, {trades / updateFrequency} trades/sec, {aggregates / updateFrequency} aggregates/sec, {statuses / updateFrequency} status/sec");
}
}
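For completeness, a hedged usage sketch of the worker above; the host-side names (the cancellation source and the frame callback) are placeholders, not part of the original post.
// Start the single dequeue loop and feed it raw byte frames from the producer side.
var cts = new CancellationTokenSource();
var worker = new DeserializeWorker(polygonDeserializer, logger);
Task runTask = worker.Run(cts.Token);

// Producer side: whatever receives the websocket frames simply posts them.
// worker.Add(rawFrameBytes);

// On shutdown, cancel the token; DequeueThread exits its loop and completes the queue.
cts.Cancel();
await runTask;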

Threads monitoring a Queue<Actions>

I'm doing a small project to map a network (routers only) using SNMP. In order to speed things up, I'm trying to have a pool of threads responsible for doing the jobs I need, apart from the first job, which is done by the main thread.
At this time I have two jobs; one takes a parameter, the other doesn't:
UpdateDeviceInfo(NetworkDevice nd)
UpdateLinks() *not defined yet
What I'm trying to achieve is to have those worker threads wait for a job to appear on a Queue<Action> and sleep while it is empty. The main thread will add the first job and then wait for all workers, which might add more jobs, to finish before adding the second job and waking up the sleeping threads.
My problem/questions are:
How do I define the Queue<Action> so that I can insert the methods and the parameters, if any? If that's not possible, I could make all functions accept the same parameter.
How do I keep the worker threads running indefinitely? I'm not sure where I should create the for(;;).
This is my code so far:
public enum DatabaseState
{
Empty = 0,
Learning = 1,
Updating = 2,
Stable = 3,
Exiting = 4
};
public class NetworkDB
{
public Dictionary<string, NetworkDevice> database;
private Queue<Action<NetworkDevice>> jobs;
private string _community;
private string _ipaddress;
private Object _statelock = new Object();
private DatabaseState _state = DatabaseState.Empty;
private readonly int workers = 4;
private Object _threadswaitinglock = new Object();
private int _threadswaiting = 0;
public Dictionary<string, NetworkDevice> Database { get => database; set => database = value; }
public NetworkDB(string community, string ipaddress)
{
_community = community;
_ipaddress = ipaddress;
database = new Dictionary<string, NetworkDevice>();
jobs = new Queue<Action<NetworkDevice>>();
}
public void Start()
{
NetworkDevice nd = SNMP.GetDeviceInfo(new IpAddress(_ipaddress), _community);
if (nd.Status > NetworkDeviceStatus.Unknown)
{
database.Add(nd.Id, nd);
_state = DatabaseState.Learning;
nd.Update(this); // The first job is done by the main thread
for (int i = 0; i < workers; i++)
{
Thread t = new Thread(JobRemove);
t.Start();
}
lock (_statelock)
{
if (_state == DatabaseState.Learning)
{
Monitor.Wait(_statelock);
}
}
lock (_statelock)
{
if (_state == DatabaseState.Updating)
{
Monitor.Wait(_statelock);
}
}
foreach (KeyValuePair<string, NetworkDevice> n in database)
{
using (System.IO.StreamWriter file = new System.IO.StreamWriter(n.Value.Name + ".txt"))
{
file.WriteLine(n);
}
}
}
}
public void JobInsert(Action<NetworkDevice> func, NetworkDevice nd)
{
lock (jobs)
{
jobs.Enqueue(func); // how to also carry nd along with the job is exactly the open question
if (jobs.Count == 1)
{
// wake up any blocked dequeue
Monitor.Pulse(jobs);
}
}
}
public void JobRemove()
{
Action<NetworkDevice> item;
lock (jobs)
{
while (jobs.Count == 0)
{
lock (_threadswaitinglock)
{
_threadswaiting += 1;
if (_threadswaiting == workers)
Monitor.Pulse(_statelock);
}
Monitor.Wait(jobs);
}
lock (_threadswaitinglock)
{
_threadswaiting -= 1;
}
item = jobs.Dequeue();
item.Invoke();
}
}
public bool NetworkDeviceExists(NetworkDevice nd)
{
try
{
Monitor.Enter(database);
if (database.ContainsKey(nd.Id))
{
return true;
}
else
{
database.Add(nd.Id, nd);
Action<NetworkDevice> action = new Action<NetworkDevice>(UpdateDeviceInfo);
jobs.Enqueue(action);
return false;
}
}
finally
{
Monitor.Exit(database);
}
}
//Job1 - Learning -> Update device info
public void UpdateDeviceInfo(NetworkDevice nd)
{
nd.Update(this);
try
{
Monitor.Enter(database);
nd.Status = NetworkDeviceStatus.Self;
}
finally
{
Monitor.Exit(database);
}
}
//Job2 - Updating -> After Learning, create links between neighbours
private void UpdateLinks()
{
}
}
Your best bet seems like using a BlockingCollection instead of the Queue class. They behave effectively the same in terms of FIFO, but a BlockingCollection will let each of your threads block until an item can be taken by calling GetConsumingEnumerable or Take. Here is a complete example.
http://mikehadlow.blogspot.com/2012/11/using-blockingcollection-to-communicate.html?m=1
As for including the parameters, you could use a closure to capture the NetworkDevice itself and then just enqueue Action instead of Action<NetworkDevice>, as shown in the sketch below.
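As a hedged illustration of that idea (not from the linked article), the pool below drains a BlockingCollection<Action>, and the NetworkDevice travels inside each closure, so the queue only ever holds parameterless delegates:
// Sketch: worker pool draining a BlockingCollection<Action>; parameters are carried by closures.
using System;
using System.Collections.Concurrent;
using System.Threading.Tasks;

class WorkerPool
{
    private readonly BlockingCollection<Action> jobs = new BlockingCollection<Action>();

    public void Enqueue(Action job) => jobs.Add(job);

    public Task[] Start(int workers)
    {
        var tasks = new Task[workers];
        for (int i = 0; i < workers; i++)
        {
            tasks[i] = Task.Run(() =>
            {
                // GetConsumingEnumerable blocks while the collection is empty
                // and completes once CompleteAdding() has been called.
                foreach (var job in jobs.GetConsumingEnumerable())
                    job();
            });
        }
        return tasks;
    }

    public void Stop() => jobs.CompleteAdding();
}

// Usage: the NetworkDevice is enclosed by the lambda, so no Action<NetworkDevice> is needed.
// pool.Enqueue(() => UpdateDeviceInfo(nd));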

C# - Create List of Tasks for a Quadcopter

I have a multiline textbox where I can write missions for a drone.
Example:
10 levantar
10 goto 50,40
10 goto 20,20
10 mayday
10 aterrar
I want to create a list that executes these missions step by step. Something like: take off, after takeoff go to that position, and when it reaches that position go to the next, etc.
My question is: is there a way to group this text into a list so that when it finishes a task I simply remove the first item of the list and it does the next?
private void executa_missao()
{
string[] linhas_separadas = null;
string[] pontos_separados = null;
for (int i = 0; i < tb_missoes.Lines.Length; i++)
{
linhas_separadas = tb_missoes.Lines[i].Split(null);
for(int k=0;k<drone.Length;k++)
{
listas_posicoes[k] = new List<PointF>();
if (linhas_separadas[0] == drone[k].ip_drone_final)
{
if (linhas_separadas[1] == "goto")
{
pontos_separados = linhas_separadas[2].Split(',');
drone[k].posicao_desejada = new PointF(Convert.ToSingle(pontos_separados[0]), Convert.ToSingle(pontos_separados[1]));
//guarda na lista as posicoes pretendidas
listas_posicoes[k].Add(new PointF(Convert.ToSingle(pontos_separados[0]), Convert.ToSingle(pontos_separados[1])));
}
else if (linhas_separadas[1] == "levantar")
{
drone[k]._droneClient.FlatTrim();
drone[k]._droneClient.Takeoff();
drone[k].subir_ate_altura = true;
}
else if (linhas_separadas[1] == "aterrar")
{
drone[k]._droneClient.Land();
}
}
}
}
}
At the moment it's trying to do every step at the same time. I want it to run step by step.
Use a Queue<T> instead of a List<T>
Then you can use the .Dequeue() function to get your current command, and remove it from the queue.
Creating sample working code for this behaviour can get very complicated and would take me a while, but the basic pattern would look like this:
public abstract class Command
{
public abstract bool IsComplete { get; }
public abstract void Execute();
}
public static class CommandExecutor
{
public static Queue<Command> commands;
public static Command current;
public static void Update()
{
if (commands.Count > 0
&& (current == null || current.IsComplete))
{
current = commands.Dequeue();
current.Execute();
}
}
}
Where the Update() method is called in recurring intervals.
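To make the pattern concrete, here is a hedged sketch of one command; the DroneController type and the HasReachedTarget method are placeholders for whatever the question's drone[k] objects actually expose, not a real API.
// Illustrative only: a "goto" command that completes once the drone reports arrival.
public class GotoCommand : Command
{
    private readonly DroneController drone; // placeholder type
    private readonly PointF target;

    public GotoCommand(DroneController drone, PointF target)
    {
        this.drone = drone;
        this.target = target;
    }

    public override void Execute()
    {
        drone.posicao_desejada = target; // start flying toward the target (field from the question)
    }

    // Placeholder check; in practice compare the drone's reported position with the target.
    public override bool IsComplete => drone.HasReachedTarget(target);
}
A timer or the main loop would then call CommandExecutor.Update() periodically, and each command is dequeued only after the previous one reports IsComplete.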

Threads conflict in some cases, in loop conditions

I am working on a project that uses threads. In some cases I have the problems described below.
Here is a piece of my code:
List<EmailAddress> lstEmailAddress = new List<EmailAddress>();
private void TimerCheckInternetConnection_Tick(object sender, EventArgs e)
{
lock (TicketLock)
{
if (UtilityManager.CheckForInternetConnection())
{
if (ApplicationRunStatus == Enum_ApplicationRunStatus.UnknownDisconnect || ApplicationRunStatus == Enum_ApplicationRunStatus.IsReady)
{
// Connect
ThreadPool.QueueUserWorkItem((o) =>
{
for (int i = 0; i < lstEmailAddress.Count; i++)
{
lstEmailAddress[i].IsActive = lstEmailAddress[i].Login();
}
this.BeginInvoke(new Action(() =>
{
// some code
}));
});
}
}
}
}
and this is EmailAddress class :
class EmailAddress
{
private Imap4Client imap = new Imap4Client();
private object objectLock = new object();
public bool IsActive;
public string Address;
public string Password;
public string RecieveServerAddress;
public int RecieveServerPort;
public bool Login()
{
lock (objectLock)
{
try
{
imap.ConnectSsl(RecieveServerAddress, RecieveServerPort);
}
catch (Exception)
{
}
try
{
imap.Login(Address, Password);
return true;
}
catch (Exception)
{
return false;
}
}
}
}
And my problem is this:
When I call the Login procedure that belongs to the EmailAddress class, there is a conflict. As you can see, I used a lock, but nothing changed.
For more details:
If I have 3 items in lstEmailAddress, the Login procedure should be called 3 times by this code, but every time it works on the same username and password, so not all my emails can log in correctly.
If I remove the thread pool, it works fine.
Your code is very confusing:
If you add the lock in your code, it will run synchronously, only one thread at a time, which will lead to a performance loss.
If you queue work via QueueUserWorkItem, it will run on another thread, and not inside the TicketLock lock.
You should encapsulate locks inside your classes, and should not lock the entire logic of your program.
You start work using the loop variable i, which is captured by the closure and ends up holding its last value; this leads to the problem you state in your last sentence.
The lock object in the EmailAddress class isn't static, so it is created for each instance and doesn't actually lock anything.
As you are using the BeginInvoke method, your code is started from the UI, and you need to marshal back to the synchronization context. I suggest you use the TPL for this rather than working directly with the ThreadPool (see the sketch after the corrected code below).
So I suggest you this solution:
List<EmailAddress> lstEmailAddress = new List<EmailAddress>();
private void TimerCheckInternetConnection_Tick(object sender, EventArgs e)
{
// remove this lock as we have another in Email class
//lock (TicketLock)
if (UtilityManager.CheckForInternetConnection())
{
if (ApplicationRunStatus == Enum_ApplicationRunStatus.UnknownDisconnect
|| ApplicationRunStatus == Enum_ApplicationRunStatus.IsReady)
{
for (int i = 0; i < lstEmailAddress.Count; i++)
{
// use local variable to store index
int localIndex = i;
// Connect
ThreadPool.QueueUserWorkItem((o) =>
{
// if you add a lock here, this will run synchronously,
// and you don't really need the ThreadPool
//lock (TicketLock)
lstEmailAddress[localIndex].IsActive = lstEmailAddress[localIndex].Login();
this.BeginInvoke(new Action(() =>
{
// some code
}));
});
}
}
}
}
class EmailAddress
{
// if only one user may log in at a time, use static fields here;
// otherwise simply remove the lock, as it is useless
private static Imap4Client imap;
private static object objectLock;
// static constructor for only one initialization for a static fields
static EmailAddress()
{
objectLock = new object();
imap = new Imap4Client();
}
public bool IsActive;
public string Address;
public string Password;
public string RecieveServerAddress;
public int RecieveServerPort;
public bool Login()
{
// acquire the static lock
lock (objectLock)
{
try
{
imap.ConnectSsl(RecieveServerAddress, RecieveServerPort);
}
catch (Exception)
{
// STORE THE EXCEPTION!!!
// return as you haven't connected
return false;
}
try
{
imap.Login(Address, Password);
return true;
}
catch (Exception)
{
// STORE THE EXCEPTION!!!
return false;
}
}
}
}
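To illustrate the TPL suggestion above, here is a hedged sketch (it needs System.Linq and System.Threading.Tasks); the async void event handler resumes on the UI synchronization context after the await, so no BeginInvoke is needed.
private async void TimerCheckInternetConnection_Tick(object sender, EventArgs e)
{
    if (!UtilityManager.CheckForInternetConnection())
        return;

    // Each login runs on the thread pool against its own EmailAddress instance.
    var loginTasks = lstEmailAddress
        .Select(address => Task.Run(() => { address.IsActive = address.Login(); }))
        .ToArray();

    await Task.WhenAll(loginTasks);

    // Back on the UI thread here.
    // some code
}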
Change your code as follows and try it. Your code is queuing items from lstEmailAddress in a way that always ends up hitting the last item in the list. Change it to enqueue each item to the thread pool separately; that should fix it.
for (int i = 0; i < lstEmailAddress.Count; i++)
{
int index = i; // copy the loop variable so each work item captures its own value
ThreadPool.QueueUserWorkItem((o) =>
{
lstEmailAddress[index].IsActive = lstEmailAddress[index].Login();
});
}

Child tasks are not awaited by parent task

This is my code
static void Main(string[] args)
{
List<Thing> collection = new List<Thing>
{
new Thing { IntProp = 1, BoolProp = true },
new Thing { IntProp = 1, BoolProp = true },
new Thing { IntProp = 2, BoolProp = true },
new Thing { IntProp = 1, BoolProp = false }
};
int number = 0;
var task = Task.Factory.StartNew<bool>(() =>
{
TaskFactory ts = new TaskFactory(TaskCreationOptions.AttachedToParent, TaskContinuationOptions.ExecuteSynchronously);
foreach (var item in collection)
{
if (item.BoolProp)
{
ts.StartNew(() =>
number += GetNum1(item.IntProp));
}
else
{
ts.StartNew(() =>
number += GetNum2(item.IntProp));
}
}
return true;
});
task.Wait();
Console.WriteLine(number);
}
Here are the definitions of GetNum1 and GetNum2:
static int GetNum1(int num)
{
for (int i = 0; i < 1000000000; i++) { } // simulate some job
return 10;
}
static int GetNum2(int num)
{
for (int i = 0; i < 500000000; i++) { } // simulate some job
return 3;
}
and here is the Thing class
class Thing
{
public bool BoolProp { get; set; }
public int IntProp { get; set; }
}
Basically, what I am doing is just creating the collection of Thing objects. Then I create a single parent task which will have several child tasks (which it should wait for, I guess).
There is a number variable which is incremented by each child task by the amount returned from the GetNum1 and GetNum2 methods (10 or 3). The code above should output 33 (10 + 10 + 10 + 3), as I understand it, but 10 is output instead, as if only the first child task were waited for. If I put a breakpoint in the code and go step by step, the output is correct. Why does this happen? Does it have something to do with the foreach loop inside the parent task? Please do not start asking questions like "why do you need this" and "there is no need for that"; this is just example code.
The parent task is in fact waiting for (not "awaiting") the child tasks. Your problem is that the code is accessing the number variable from multiple threads without synchronization:
var mutex = new object();
int number = 0;
var task = Task.Factory.StartNew<bool>(() =>
{
TaskFactory ts = new TaskFactory(TaskCreationOptions.AttachedToParent, TaskContinuationOptions.ExecuteSynchronously);
foreach (var item in collection)
{
if (item.BoolProp)
{
ts.StartNew(() =>
{
var value = GetNum1(item.IntProp);
lock (mutex) number += value;
});
}
else
{
ts.StartNew(() =>
{
var value = GetNum2(item.IntProp);
lock (mutex) number += value;
});
}
}
return true;
});
task.Wait();
lock (mutex)
Console.WriteLine(number);
Recommended reading: Parallel Tasks and Dynamic Task Parallelism.
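Since only an integer total is accumulated here, a hedged alternative to the lock (not covered in the reading above) is Interlocked.Add, which makes each increment atomic without an explicit mutex; it needs using System.Threading.
// Inside each child task, instead of lock (mutex) number += value:
ts.StartNew(() => Interlocked.Add(ref number, GetNum1(item.IntProp)));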
