Issues Publishing From Subsequent Rebus Instances - C#

I have several services that are essentially console applications hosted using TopShelf, and they communicate using Rebus 0.99.50. One of these services (StepManager) loops through a collection of objects (of type Step), each of which contains a Bus instance that it uses to send a message, and a handler used to handle the reply. The Step(s) used for this example, in this order, are:
ReceiveFile
LogFileMetrics
ArchiveIncomingFile
In my actual scenario, I have a total of 7 Step(s). When looping through these Step(s), ReceiveFile and LogFileMetrics behave as expected; however, when ArchiveIncomingFile runs, .Send(req) is called but the message never reaches its destination, leaving the process waiting for a reply that never returns. Regardless of the type of Step object or the order of the objects in the list, this happens consistently at the second instance of a Step that does a .Send(req) in its Run() method. BUT, when I comment out the while (!Completed) { await Task.Delay(25); } statements, the messages appear to get sent; without those statements, though, the Step(s) all run with no specific execution order, which is a problem.
Why is this happening? What am I missing/doing wrong here? And is there a better alternative to accomplish what I am trying to do?
Here are the relevant portions of the classes in question:
public class StepManager
{
...
public string ProcessName { get; set; }
public List<Step> Steps { get; set; }
public BuiltinHandlerActivator ServiceBus { get; set; }
...
public async Task Init()
{
...
Steps = new List<Step>();
var process = Db.Processes.Include("Steps")
.Where(p => p.Name == ProcessName)
.FirstOrDefault();
...
foreach (var s in process.Steps)
{
var step = container.Resolve<Step>(s.Name);
...
Steps.Add(step);
}
}
public async Task Run()
{
foreach (var step in Steps)
{
await step.Run();
}
}
}
public class Step
{
public BuiltinHandlerActivator ServiceBus { get; set; }
public Step()
{
Db = new ClearStoneConfigContext();
Timer = new Stopwatch();
StepId = Guid.NewGuid().ToString();
Completed = false;
}
public virtual async Task Run() { }
}
public class ReceiveFile : Step
{
public ReceiveFile()
{
ServiceBus = new BuiltinHandlerActivator();
Configure.With(ServiceBus)
.Logging(l => l.ColoredConsole(LogLevel.Info))
.Routing(r => r.TypeBased().Map<ProcessLog>("stepmanager"))
.Transport(t => t.UseMsmq("receivefile"))
.Start();
}
public override async Task Run()
{
...
LogEntry.Message = "File " + FileEvent.Name + " received.";
await ServiceBus.Bus.Advanced.Routing.Send("stepmanager", LogEntry);
Completed = true;
}
}
public class LogFileMetrics : Step
{
public LogFileMetrics()
{
SubscriptionTable = "SandboxServiceBusSubscriptions";
ServiceBus = new BuiltinHandlerActivator();
Configure.With(ServiceBus)
.Logging(l => l.ColoredConsole(LogLevel.Info))
.Routing(r => r.TypeBased().Map<LogFileMetricsRequest>("metrics"))
.Transport(t => t.UseMsmq("logfilemetrics"))
.Start();
ServiceBus.Handle<FileMetricsLogged>(async msg => await FileMetricsLogged(msg));
}
public override async Task Run()
{
...
await ServiceBus.Bus.Send(new LogFileMetricsRequest { ProcessId = ProcessId, FileEvent = FileEvent }).ConfigureAwait(false);
while (!Completed) { await Task.Delay(25); }
}
private async Task FileMetricsLogged(FileMetricsLogged msg)
{
...
await ServiceBus.Bus.Advanced.Routing.Send("stepmanager", LogEntry);
Completed = true;
}
}
public class ArchiveIncomingFile : Step
{
public ArchiveIncomingFile()
{
SubscriptionTable = "SandboxServiceBusSubscriptions";
ServiceBus = new BuiltinHandlerActivator();
Configure.With(ServiceBus)
.Logging(l => l.ColoredConsole(LogLevel.Info))
.Routing(r => r.TypeBased().Map<ArchiveIncomingFileRequest>("incomingarchivefilerouter"))
.Transport(t => t.UseMsmq("archiveincomingfile"))
.Start();
ServiceBus.Handle<IncomingFileArchived>(async msg => await IncomingFileArchived(msg));
}
public override async Task Run()
{
...
ServiceBus.Bus.Send(req);
while (!Completed) { await Task.Delay(25); }
}
private async Task IncomingFileArchived(IncomingFileArchived msg)
{
...
await ServiceBus.Bus.Advanced.Routing.Send("stepmanager", LogEntry);
Completed = true;
}
}

I can see several issues with your code, although it is not clear to me what is causing the funny behavior you are experiencing.
First off, it seems like you are creating new bus instances every time you are creating steps. Are you aware that Rebus' bus instance is supposed to be created once at startup in your application, kept as a singleton, and must be properly disposed when your application shuts down?
You can of course perform this create-dispose cycle as many times as you like, and Rebus will not leave anything behind, but the fact that you are NOT disposing the bus anywhere tells me that your application probably forgets to do this.
You can read more on the Rebus wiki, especially in the section about Rebus' bus instance.
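For illustration, here is a minimal sketch of that pattern (the wrapper class and queue name are made up for the example, and it assumes, as the Rebus samples do, that disposing the BuiltinHandlerActivator also disposes the bus it hosts): create the activator and bus once when the TopShelf service starts, and dispose them when the service stops.
public class StepManagerService
{
    readonly BuiltinHandlerActivator _activator = new BuiltinHandlerActivator();

    public void Start()
    {
        // register handlers on _activator here (see the next point), then start the bus ONCE
        Configure.With(_activator)
            .Logging(l => l.ColoredConsole(LogLevel.Info))
            .Transport(t => t.UseMsmq("stepmanager"))
            .Start();
    }

    public void Stop()
    {
        // disposing the activator shuts down and disposes the bus it is hosting
        _activator.Dispose();
    }
}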
Another issue is the subtle potential race condition in the ArchiveIncomingFile class whose ctor looks like this:
public ArchiveIncomingFile()
{
SubscriptionTable = "SandboxServiceBusSubscriptions";
ServiceBus = new BuiltinHandlerActivator();
Configure.With(ServiceBus)
.Logging(l => l.ColoredConsole(LogLevel.Info))
.Routing(r => r.TypeBased().Map<ArchiveIncomingFileRequest>("incomingarchivefilerouter"))
.Transport(t => t.UseMsmq("archiveincomingfile"))
.Start();
//<<< bus is receiving messages at this point, but there's no handler!!
ServiceBus.Handle<IncomingFileArchived>(async msg => await IncomingFileArchived(msg));
}
As you can see, there is a (very very very short, admittedly) time (marked by //<<<) in which the bus has been started (and thus will start to pull messages out of its input queue) where no handlers yet have been configured.
You should be sure to configure handlers BEFORE you start the bus.
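For example, the constructor shown above could simply be reordered, keeping the exact same configuration but registering the handler first:
public ArchiveIncomingFile()
{
    SubscriptionTable = "SandboxServiceBusSubscriptions";
    ServiceBus = new BuiltinHandlerActivator();

    // register the handler BEFORE starting the bus...
    ServiceBus.Handle<IncomingFileArchived>(async msg => await IncomingFileArchived(msg));

    // ...so that no message can be received while there is no handler for it
    Configure.With(ServiceBus)
        .Logging(l => l.ColoredConsole(LogLevel.Info))
        .Routing(r => r.TypeBased().Map<ArchiveIncomingFileRequest>("incomingarchivefilerouter"))
        .Transport(t => t.UseMsmq("archiveincomingfile"))
        .Start();
}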
Finally, you are asking
And is there a better alternative to accomplish what I am trying to do?
but I am unable to answer that question because I simply cannot figure out what you are trying to do ;)
(but if you explain to me at a slightly higher level what problem you are trying to solve, I might have some hints for you :))

Related

Impossible to read queue message with MassTransit

I have a queue with some messages in it (created with MassTransit).
I tried this piece of code to get the messages (see below).
I expected to get the messages on the Console.Out line, but I never hit this line and the messages are still in the queue. I didn't get any error.
Any idea?
class Program
{
static void Main(string[] args)
{
var bus = Bus.Factory.CreateUsingRabbitMq(cfg =>
{
cfg.Host("localhost", "/", h =>
{
h.Username("guest");
h.Password("guest");
});
cfg.ReceiveEndpoint("myQueue", e =>
{
e.Handler<ProcessingQueue>(context =>
{
return Console.Out.WriteLineAsync($"{context.Message.Id}");
});
});
});
}
}
public class ProcessingQueue
{
public int Id { get; set; }
public string Name { get; set; }
}
Thanks,
I tried to add :
bus.Start();
Console.WriteLine("Receive listening for messages");
Console.ReadLine();
bus.Stop();
but when I do this a new queue, myQueue_skipped, is created with my messages in it.
If messages are moved to the _skipped queue, it indicates that those messages are not consumed by any of the consumers configured on that receive endpoint. The most common mistake, as highlighted at the top of the message documentation, is a mismatched namespace.
Similar answer: here
Try with this code for the ReceiveEndpoint
cfg.ReceiveEndpoint("myQueue", e =>
{
e.Consumer<MessagesConsumer>();
});
"MessagesConsumer" must inherit from IConsumer
public class MessagesConsumer : IConsumer<ProcessingQueue>
{
public async Task Consume(ConsumeContext<ProcessingQueue> context)
{
//access the message properties
var name = context.Message.Name;
var id = context.Message.Id;
}
}
In the Consume method you will receive messages of the type "ProcessingQueue" and can access their properties there.
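For completeness, a hypothetical producer-side sketch (the Id/Name values are made up): the ProcessingQueue contract published here must be the exact same message type, namespace included, that the consumer on myQueue is bound to; otherwise the receive endpoint cannot match the message to a consumer and moves it to myQueue_skipped.
var bus = Bus.Factory.CreateUsingRabbitMq(cfg =>
{
    cfg.Host("localhost", "/", h =>
    {
        h.Username("guest");
        h.Password("guest");
    });
});

bus.Start();
try
{
    // publish a message the MessagesConsumer above can consume
    await bus.Publish(new ProcessingQueue { Id = 1, Name = "example" });
}
finally
{
    bus.Stop();
}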

Correct way to handle task cancelation

I am experiencing some weird behaviour with a windows service application I am working on. This is my 1st dip into Tasks so I am on a steep learning curve and in need of some assistance as I know my issue is probably down to something I have misunderstood.
I have the following setup:
public partial class MyService
{
protected override void OnStart(string[] args)
{
MasterTokenSource = new CancellationTokenSource();
MasterCancellationToken = MasterTokenSource.Token;
//Begin tasks.
StartAllTasks();
//This is the thread that is going to listen for updates in the database.
Task MasterService = Task.Factory.StartNew(() =>
{
while (!MasterCancellationToken.IsCancellationRequested)
{
//Sleep for the amount of time as determined in the DB
Thread.Sleep(ServiceInstance.PollInterval * 1000);
Console.WriteLine("Polled for changes");
//Check service modules for changes as per DB config
UpdateServiceModulePropertiesAndRunningTasks();
//MasterTokenSource.Cancel();
}
MasterCancellationToken.ThrowIfCancellationRequested();
}, MasterCancellationToken);
}
private void StartAllTasks()
{
//Index pages task
ServiceModule PageIndexersm = ServiceInstance.GetServiceModule("PageIndexer");
PageIndexer.StartNewInstance(PageIndexersm, ConfigInstance, MasterTokenSource);
//There are other calls to other methods to do different things here but they all follow the same logic
}
private void UpdateServiceModulePropertiesAndRunningTasks()
{
//Get a fresh copy of the service instance, and compare to current values
ServiceInstance compareServiceInstance = new ServiceInstance(ConfigInstance.OneConnectionString, ConfigInstance.TwoConnectionString, ConfigInstance.ServiceName);
foreach (ServiceModule NewServiceModuleItem in compareServiceInstance.AllServiceModules)
{
ServiceModule CurrentServiceModuleInstance = ServiceInstance.GetServiceModule(NewServiceModuleItem.ModuleName);
if (!NewServiceModuleItem.Equals(CurrentServiceModuleInstance))
{
//Trigger changed event and pass new instance
CurrentServiceModuleInstance.On_SomethingChanged(NewServiceModuleItem, MasterTokenSource);
}
}
}
}
public class PageIndexer
{
public ServiceConfig ServiceConfig { get; set; }
public ServiceModule ServiceModuleInstance { get; set; }
public Guid InstanceGUID { get; set; }
public CancellationTokenSource TokenSource { get; set; }
public CancellationToken Token { get; set; }
public PageIndexer(ServiceModule PageIndexerServiceModule, ServiceConfig _ServiceConfig)
{
ServiceModuleInstance = PageIndexerServiceModule;
ServiceModuleInstance.SomethingChanged += ServiceModuleInstance_SomethingChanged;
ServiceConfig = _ServiceConfig;
InstanceGUID = Guid.NewGuid();
}
//This is the method called within the PageIndexer instance
private void ServiceModuleInstance_SomethingChanged(ServiceModule sm, CancellationTokenSource MasterCancelToken)
{
Console.WriteLine(InstanceGUID + ": Something changed");
TokenSource.Cancel();
//Start new indexer instance
PageIndexer.StartNewInstance(sm, ServiceConfig, MasterCancelToken);
}
public void RunTask()
{
Console.WriteLine("Starting Page Indexing");
Task.Factory.StartNew(() =>
{
while (true)
{
if (TokenSource.Token.IsCancellationRequested)
{
Console.WriteLine(InstanceGUID + ": Page index CANCEL requested: " + TokenSource.IsCancellationRequested);
TokenSource.Token.ThrowIfCancellationRequested();
}
if (ServiceModuleInstance.ShouldTaskBeRun())
{
Console.WriteLine(InstanceGUID + ": RUNNING full index, Cancellation requested: " + TokenSource.IsCancellationRequested);
RunFullIndex();
}
else
{
Console.WriteLine(InstanceGUID + ": SLEEPING, module off, Cancellation requested: " + TokenSource.IsCancellationRequested);
//If the task should not be run then sleep for a bit to save resources
Thread.Sleep(5000);
}
}
}, TokenSource.Token);
}
public static void StartNewInstance(ServiceModule serviceModule, ServiceConfig eServiceConfig, CancellationTokenSource MasterCancellationToken)
{
PageIndexer pageIndexerInstance = new PageIndexer(serviceModule, eServiceConfig);
CancellationTokenSource NewInstanceCancellationTokenSource = new CancellationTokenSource();
NewInstanceCancellationTokenSource = CancellationTokenSource.CreateLinkedTokenSource(MasterCancellationToken.Token);
pageIndexerInstance.TokenSource = NewInstanceCancellationTokenSource;
pageIndexerInstance.Token = pageIndexerInstance.TokenSource.Token;
pageIndexerInstance.RunTask();
}
}
What I am seeing is that the cancel and start work fine for the first change detected, but subsequent cancels issued after later changes are not working. I can see the call to the event method happening; however, it appears to be calling on the original instance of the page indexer.
I am sure I have just been going around in circles long enough to have made a complete mess, but I would be grateful for any guidance anyone can offer to get me back on the right track.
Thank you in advance.
Regards
A CancellationTokenSource and CancellationToken can only be signaled once. They become cancelled forever. If you want multiple cancellation signals for multiple threads/tasks then you need one token for each such operation.
Often, it is a good pattern to group them in a class:
class MyOperation {
Task task; //use this for waiting
CancellationTokenSource cts; //use this for cancelling
}
That way there automatically is a 1:1 association of task and token. You are able to cancel a specific task this way.
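A minimal, self-contained sketch of that pattern (all names are hypothetical): each restart gets a brand new CancellationTokenSource, because a source that has been cancelled once stays cancelled.
using System;
using System.Threading;
using System.Threading.Tasks;

class MyOperation
{
    public Task Task { get; private set; }                    // use this for waiting
    public CancellationTokenSource Cts { get; private set; }  // use this for cancelling

    public static MyOperation Start(Func<CancellationToken, Task> work)
    {
        var op = new MyOperation { Cts = new CancellationTokenSource() };
        op.Task = work(op.Cts.Token);
        return op;
    }
}

class Example
{
    // stands in for the real work, e.g. the page indexer loop
    static Task RunIndexerAsync(CancellationToken token) =>
        Task.Run(async () =>
        {
            while (!token.IsCancellationRequested)
                await Task.Delay(1000, token);
        });

    static async Task Main()
    {
        var first = MyOperation.Start(RunIndexerAsync);
        first.Cts.Cancel();                                   // cancels ONLY this instance
        try { await first.Task; } catch (OperationCanceledException) { }

        // restarting means creating a NEW operation with a NEW token source
        var second = MyOperation.Start(RunIndexerAsync);
        second.Cts.Cancel();
    }
}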

Sharing object context Tasks

I have a list lstSubscriptionRequests whose individual items I am processing asynchronously. After all items are processed I have to return the updated list items. My current implementation is like this:
List<SubscriptionRequest> lstSubscriptionRequests = FromSomeResource();
var tsk = new List<Task>();
foreach (var oTsk in lstSubscriptionRequests.Select(objSubscriptionRequest => new Task(
() => ProcessSubscriptionForASingleRecord(objSubscriptionRequest))))
{
oTsk.Start();
lock (tsk)
{
tsk.Add(oTsk);
}
}
Task.WaitAll(tsk.ToArray());
It looks like some of the items are not updated after all the tasks have completed.
Please let me know what correction is needed.
You could accomplish this much more easily. Note that using the Task constructor is not typically necessary or recommended; also, mutating the state of a particular object can be difficult to follow or debug. Returning a new object that represents your desired state allows you to enforce a minimum valid state. The following code processes all your items and returns the completed items to your client code.
using System.Collections.Generic;
using System.Linq;
using System.Threading.Tasks;
namespace Processing {
public class MyProcessor {
public async Task<IEnumerable<Subscription>> ProcessSubscriptionRequestsAsync(IEnumerable<SubscriptionRequest> subscriptionRequests) {
var subscriptionProcessingTasks = subscriptionRequests.Select(request => ProcessSubscriptionForASingleRecord(request)).ToArray();
return await Task.WhenAll(subscriptionProcessingTasks);
}
public async Task<Subscription> ProcessSubscriptionForASingleRecord(SubscriptionRequest request) {
//process the request
try {
var subscription = await Context.ProcessRequest(request);
return subscription;
} catch {
//something went wrong with the request; return null (or rethrow) so every code path returns a value
return null;
}
}
}
public class SubscriptionRequest {
//A subcription request
}
public class Subscription {
//A completed subscription request
}
}
Update
It would help if you could exclude the new class Subscription and add the solution to your answer. I will then try it out.
Hopefully the simplified before-and-after view will be easier to integrate. The primary difference is replacing the Parallel.ForEach with a Select to create your collection of tasks, without needing to take a lock on the task list for every SubscriptionRequest. Also, spooling up Tasks in parallel isn't typically necessary, since each one executes asynchronously anyway; you only reach the point where all of them are waiting sooner, you don't finish sooner. Next, each task is allowed to start and all are awaited at await Task.WhenAll(tasks). It will be important for you to determine what type of processing each SubscriptionRequest undergoes. For the sake of example I've assumed that each request is somehow linked to database access, i.e. storing the request, updating a user profile of sorts, etc.
public class OriginalSynchronous {
public void ProcessSubscriptionRequest() {
List<SubscriptionRequest> lstSubscriptionRequests = FromSomeResource();
List<Task> tsk = new List<Task>();
Parallel.ForEach(lstSubscriptionRequests, objSubscriptionRequest => {
var oTsk =
new Task(
() => ProcessSubscriptionForASingleRecord(objSubscriptionRequest));// update some properties after processing SubscriptionRequest
oTsk.Start();
lock (tsk) {
tsk.Add(oTsk);
}
});
Task.WaitAll(tsk.ToArray());
}
private void ProcessSubscriptionForASingleRecord(SubscriptionRequest request) {
//modify SubscriptionRequest
}
}
public class ModifiedAsync {
public async Task ProcessSubscriptionRequest() {
var subscriptionRequests = await FromSomeResourceAsync();
var tasks = subscriptionRequests.Select(request => {
return ProcessSubscriptionForASingleRecord(request);
}).ToArray();
await Task.WhenAll(tasks);
}
public async Task ProcessSubscriptionForASingleRecord(SubscriptionRequest request) {
//modify SubscriptionRequest
}
}

Timing issue between async initialisation and results loading in application startup

Seeking some input on a behaviour I'm noticing in my code below. This is my first attempt at async/await using Xamarin Forms and I have perused hundreds of posts, blogs and articles on the subject, including the writings from Stephen Cleary on async from constructors and best practices to avoid locking. Although I am using an MVVM framework, I assume my issue is more generic than that, so I'll ignore it for the moment here.
If I am still missing something or there are ways to improve what I'm trying to do ... happy to listen and learn.
At a high level the logic is as follows:
Application starts and initialises
During initialisation, verify the database exists and, if not, create the SQLite DB. Currently I force this every time to simulate a new application and pre-populate it with some sample data for development purposes
After initialisation has completed, load the result set and display it
This works most of the time, but I have noticed 2 infrequent occurrences due to the async handling of the database initialisation and pre-population:
Occasionally not all sample records created are displayed once the app has started up; I assume this is because the pre-population phase has not completed when the results are loaded
Occasionally I get an error that one of the tables has not been created; I assume this is because the database initialisation has not completed when the results are loaded
The code - simplified to show the flow during initialisation and startup:
----------- VIEW / PAGE MODEL ----------------
public class MyListItemsPageModel
{
private ObservableRangeCollection<MyListItem> _myListItems;
private Command loadItemsCommand;
public MyListItemsPageModel()
{
_myListItems = new ObservableRangeCollection<MyListItem>();
}
public override void Init(object initData)
{
if (LoadItemsCommand.CanExecute(null))
LoadItemsCommand.Execute(null);
}
public Command LoadItemsCommand
{
get
{
return loadItemsCommand ?? (loadItemsCommand = new Command(async () => await ExecuteLoadItemsAsyncCommand(), () => { return !IsBusy; }));
}
}
public ObservableRangeCollection<MyListItem> MyListItems {
get { return _myListItems ?? (_myListItems = new ObservableRangeCollection<MyListItem>()); }
private set {
_myListItems = value;
}
}
private async Task ExecuteLoadItemsAsyncCommand() {
if (IsBusy)
return;
IsBusy = true;
loadItemsCommand.ChangeCanExecute();
var _results = await MySpecificDBServiceClass.LoadAllItemsAsync();
MyListItems = new ObservableRangeCollection<MyListItem>(_results.OrderBy(x => x.ItemName).ToList());
IsBusy = false;
loadItemsCommand.ChangeCanExecute();
}
}
----------- DB Service Class ----------------
// THERE IS A SPECIFIC SERVICE LAYER BETWEEN THIS CLASS AND THE PAGE VIEW MODEL HANDLING THE CASTING TO THE SPECIFIC DATA TYPE
// public class MySpecificDBServiceClass : MyGenericDBServiceClass
public class MyGenericDBServiceClass<T>: IDataAccessService<T> where T : class, IDataModel, new()
{
public SQLiteAsyncConnection _connection = FreshIOC.Container.Resolve<ISQLiteFactory>().CreateConnection();
internal static readonly AsyncLock Mutex = new AsyncLock();
public DataServiceBase()
{
// removed this from the constructor
//if (_connection != null)
//{
// IsInitialized = DatabaseManager.CreateTableAsync(_connection);
//}
}
public Task<bool> IsInitialized { get; private set; }
public virtual async Task<List<T>> LoadAllItemsAsync()
{
// Temporary async/await initialisation code. This will be moved to the start up as per Stephen's suggestion
await DBInitialiser();
var itemList = new List<T>();
using (await Mutex.LockAsync().ConfigureAwait(false))
{
itemList = await _connection.Table<T>().ToListAsync().ConfigureAwait(false);
}
return itemList;
}
}
----------- DB Manager Class ----------------
public class DatabaseManager
{
static double CURRENT_DATABASE_VERSION = 0.0;
static readonly AsyncLock Mutex = new AsyncLock();
private static bool IsDBInitialised = false;
private DatabaseManager() { }
public static async Task<bool> CreateTableAsync(SQLiteAsyncConnection CurrentConnection)
{
if (CurrentConnection == null || IsDBInitialised)
return IsDBInitialised;
await ProcessDBScripts(CurrentConnection);
return IsDBInitialised;
}
private static async Task ProcessDBScripts(SQLiteAsyncConnection CurrentConnection)
{
using (await Mutex.LockAsync().ConfigureAwait(false))
{
var _tasks = new List<Task>();
if (CURRENT_DATABASE_VERSION <= 0.1) // Dev DB - recreate every time
{
_tasks.Add(CurrentConnection.DropTableAsync<Table1>());
_tasks.Add(CurrentConnection.DropTableAsync<Table2>());
await Task.WhenAll(_tasks).ConfigureAwait(false);
}
_tasks.Clear();
_tasks.Add(CurrentConnection.CreateTableAsync<Table1>());
_tasks.Add(CurrentConnection.CreateTableAsync<Table2>());
await Task.WhenAll(_tasks).ConfigureAwait(false);
_tasks.Clear();
_tasks.Add(UpgradeDBIfRequired(CurrentConnection));
await Task.WhenAll(_tasks).ConfigureAwait(false);
}
IsDBInitialised = true;
}
private static async Task UpgradeDBIfRequired(SQLiteAsyncConnection _connection)
{
await CreateSampleData();
return;
// ... rest of code not relevant at the moment
}
private static async Task CreateSampleData()
{
IDataAccessService<MyListItem> _dataService = FreshIOC.Container.Resolve<IDataAccessService<MyListItem>>();
ObservableRangeCollection<MyListItem> _items = new ObservableRangeCollection<MyListItem>();
_items.Add(new MyListItem() { ItemName = "Test 1", ItemCount = 14 });
_items.Add(new MyListItem() { ItemName = "Test 2", ItemCount = 9 });
_items.Add(new MyListItem() { ItemName = "Test 3", ItemCount = 5 });
await _dataService.SaveAllItemsAsync(_items).ConfigureAwait(false);
_items = null;
_dataService = null;
IDataAccessService<Sample> _dataService2 = FreshIOC.Container.Resolve<IDataAccessService<Sample>>();
ObservableRangeCollection<Sample> _sampleList = new ObservableRangeCollection<Sample>();
_sampleList.Add(new Sample() { SampleName = "ABC" });
_sampleList.Add(new Sample() { SampleName = "DEF" });
await _dataService2.SaveAllItemsAsync(_sampleList).ConfigureAwait(false);
_sampleList = null;
_dataService2 = null;
}
}
In your DataServiceBase constructor, you're calling DatabaseManager.CreateTableAsync() but not awaiting it, so by the time your constructor exits, that method has not yet completed running, and given that it does very little before awaiting, it's probably barely started at that point. As you can't effectively use await in a constructor, you need to remodel things so you do that initialisation at some other point; e.g. perhaps lazily when needed.
You also want to avoid using .Result/.Wait() wherever possible, especially as you're in an async method anyway (e.g. ProcessDBScripts()), so instead of doing
var _test = CurrentConnection.DropTableAsync<MyListItem>().Result;
rather do
var _test = await CurrentConnection.DropTableAsync<MyListItem>();
You also don't need to use Task.Run() for methods that return Task types anyway. So instead of
_tasks.Add(Task.Run(() => CurrentConnection.CreateTableAsync<MyListItem>().ConfigureAwait(false)));
_tasks.Add(Task.Run(() => CurrentConnection.CreateTableAsync<AnotherSampleTable>().ConfigureAwait(false)));
just do
_tasks.Add(CurrentConnection.CreateTableAsync<MyListItem>());
_tasks.Add(CurrentConnection.CreateTableAsync<AnotherSampleTable>());
sellotape has correctly diagnosed the code problem: the constructor is starting an asynchronous method but nothing is (a)waiting for it to complete. A simple fix would be to add await IsInitialized; to the beginning of LoadAllItemsAsync.
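Concretely, reinstating the commented-out line from the question's constructor and awaiting the stored task at the top of LoadAllItemsAsync would look roughly like this (a sketch, not the full class):
public DataServiceBase()
{
    if (_connection != null)
    {
        // start initialisation here, but don't block; just remember the task
        IsInitialized = DatabaseManager.CreateTableAsync(_connection);
    }
}

public Task<bool> IsInitialized { get; private set; }

public virtual async Task<List<T>> LoadAllItemsAsync()
{
    // make sure the tables exist (and the sample data is in place) before querying
    await IsInitialized;

    using (await Mutex.LockAsync().ConfigureAwait(false))
    {
        return await _connection.Table<T>().ToListAsync().ConfigureAwait(false);
    }
}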
However, there's also a design problem:
After initialisation has completed, load the result set and display it
That's not possible on Xamarin, or any other modern UI platform. You must load your UI immediately and synchronously. What you should do is immediately display a splash/loading page and start the asynchronous initialization work. Then, when the async init is completed, update your VM/UI with your "real" page. If you just have LoadAllItemsAsync await IsInitialized, then your app will sit there for some time showing the user zero data before it "fills in".
You may find my NotifyTask<T> type (available on NuGet) useful here if you want to show a splash/spinner instead of zero data.
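A rough sketch of that approach, assuming the NotifyTask<T> type from the Nito.Mvvm.Async package (the Create factory and the IsNotCompleted/Result properties used below are my reading of its surface; double-check against the package):
public class MyListItemsPageModel
{
    // the page binds to LoadOperation.IsNotCompleted for a spinner/splash,
    // and to LoadOperation.Result once the data has arrived
    public NotifyTask<List<MyListItem>> LoadOperation { get; }

    public MyListItemsPageModel()
    {
        // kick off the async work immediately; the UI itself still loads synchronously
        LoadOperation = NotifyTask.Create(LoadItemsAsync());
    }

    private async Task<List<MyListItem>> LoadItemsAsync()
    {
        var results = await MySpecificDBServiceClass.LoadAllItemsAsync();
        return results.OrderBy(x => x.ItemName).ToList();
    }
}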

How to correctly correlate a controller saga which starts multiple instances of another controller saga?

I have a controller saga which used to have a step starting a process containing 3 actions in one transaction. I am now in the process of refactoring this sub-process into a separate saga. The result of this will be that the original saga will start multiple instances of the new "sub-saga" (This sub-saga will also be started by other non-saga processes, through the same command). My problem is how to correlate this hierarchy of sagas in the best possible way?
In the following example, the main saga will try to start three instances of the sub-saga with the same correlationId. Even if this were to work, the 3 instances would interfere with each other by handling "completed events" originating from all instances.
public class MyMainSaga : Saga<MyMainSagaData>,
IAmStartedByMessages<MyMainCommand>,
IHandleMessage<MySubProcessCommandCompletedEvent>
{
protected override void ConfigureHowToFindSaga(SagaPropertyMapper<MyMainSagaData> mapper)
{
mapper.ConfigureMapping<MyMainCommandCompletedEvent>(message => message.CorrelationId).ToSaga(data => data.CorrelationId);
}
public void Handle(MyMainCommand message)
{
Data.CorrelationId = message.CorrelationId;
foreach (var item in message.ListOfObjectsToProcess)
{
Bus.Send(new MySubProcessCommand{
CorrelationId = Data.CorrelationId,
ObjectId = item.Id
});
}
}
public void Handle(MySubProcessCommandCompletedEvent message)
{
SetHandledStatus(message.ObjectId);
if(AllObjectsWhereProcessed())
MarkAsComplete();
}
}
public class MySubSaga : Saga<MySubSagaData>,
IAmStartedByMessages<MySubProcessCommand>,
IHandleMessage<Step1CommandCompletedEvent>,
IHandleMessage<Step2CommandCompletedEvent>,
IHandleMessage<Step3CommandCompletedEvent>
{
protected override void ConfigureHowToFindSaga(SagaPropertyMapper<MySubSagaData> mapper)
{
mapper.ConfigureMapping<Step1CommandCompletedEvent>(message => message.CorrelationId).ToSaga(data => data.CorrelationId);
mapper.ConfigureMapping<Step2CommandCompletedEvent>(message => message.CorrelationId).ToSaga(data => data.CorrelationId);
mapper.ConfigureMapping<Step3CommandCompletedEvent>(message => message.CorrelationId).ToSaga(data => data.CorrelationId);
}
public void Handle(MySubProcessCommand message)
{
Data.CorrelationId = message.CorrelationId;
Data.ObjectId = message.ObjectId;
Bus.Send(new Step1Command{
CorrelationId = Data.CorrelationId
});
}
public void Handle(Step1CommandCompletedEvent message)
{
Bus.Send(new Step2Command{
CorrelationId = Data.CorrelationId
});
}
public void Handle(Step2CommandCompletedEvent message)
{
Bus.Send(new Step3Command{
CorrelationId = Data.CorrelationId
});
}
public void Handle(Step3CommandCompletedEvent message)
{
Bus.Publish<MySubProcessCommandCompletedEvent>(e => {
e.CorrelationId = Data.CorrelationId;
e.ObjectId = Data.ObjectId;
});
MarkAsComplete();
}
}
The only solution I see is to change the sub-saga to generate a separate correlationId as well as keeping the originator's correlationId, e.g.:
public void Handle(MySubProcessCommand message)
{
Data.CorrelationId = Guid.NewGuid();
Data.OriginatorCorrelationId = message.CorrelationId;
Data.ObjectId = message.ObjectId;
Bus.Send(new Step1Command{
CorrelationId = Data.CorrelationId
});
}
public void Handle(Step1CommandCompletedEvent message)
{
Bus.Send(new Step2Command{
CorrelationId = Data.CorrelationId
});
}
public void Handle(Step2CommandCompletedEvent message)
{
Bus.Send(new Step3Command{
CorrelationId = Data.CorrelationId
});
}
public void Handle(Step3CommandCompletedEvent message)
{
Bus.Publish<MySubProcessCommandCompletedEvent>(e => {
e.CorrelationId = Data.OriginatorCorrelationId;
e.ObjectId = Data.ObjectId;
});
MarkAsComplete();
}
Is there a "best practice" solution to this problem? I have been thinking of using Bus.Reply, notifying the MainSaga when the sub-saga has completed. Problem with this is that another consumer is also sending the MySubProcessCommand without waiting for a completed event/reply.
The best practice is to use ReplyToOriginator() in the sub-saga to communicate back to the main saga. This method is exposed on the Saga base class.
There are two ways to fix the issue of starting the sub-saga both by the main saga and a different initiator.
Use two different commands: let two different commands start the sub-saga, like MySubProcessFromMainSagaCommand and MySubProcessFromSomewhereElseCommand. It is fine to have multiple IAmStartedByMessages<> for a Saga.
Extend MySubProcessCommand: include some data in MySubProcessCommand to indicate whether it came from the main saga or the other initiator.
Either way will give you enough information to store how the sub-saga was started, for example in Data.WasInitiatedByMainSaga. Check this in the sub-saga completion logic: if it is true, do a ReplyToOriginator() to communicate back to the originating main saga; if not, skip the reply.
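A sketch of what that completion logic could look like in the sub-saga, assuming MySubProcessCommand gets an extra property (WasSentByMainSaga here) and that MySubProcessCommandCompletedEvent is a concrete message class that can be instantiated directly:
public void Handle(MySubProcessCommand message)
{
    Data.CorrelationId = message.CorrelationId;
    Data.ObjectId = message.ObjectId;
    // remember who started this saga instance, so completion knows whether to reply
    Data.WasInitiatedByMainSaga = message.WasSentByMainSaga;
    Bus.Send(new Step1Command { CorrelationId = Data.CorrelationId });
}

public void Handle(Step3CommandCompletedEvent message)
{
    if (Data.WasInitiatedByMainSaga)
    {
        // goes back to the endpoint (and saga) that originally sent MySubProcessCommand
        ReplyToOriginator(new MySubProcessCommandCompletedEvent
        {
            CorrelationId = Data.CorrelationId,
            ObjectId = Data.ObjectId
        });
    }

    MarkAsComplete();
}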
