Update:
Adding TaskCreationOptions.LongRunning solved the issue, but is this a good approach? If not, what is the best solution to get past this exception?
There is an issue I am trying to troubleshoot. I have implemented the suggestions that were provided on StackOverflow, but those have not helped solve the issue. I have also tried alternatives such as attaching a ContinueWith extension method instead of using Task.WaitAll. This did not help either.
I have tried ex.Handle { } and I have tried rethrowing inside catch (AggregateException ex), but that did not help me catch the actual exception.
I only have .NET 4.0 installed, so I cannot try the .NET 4.5 resolution.
The exception I keep getting is:
"System.AggregateException: The task's exception was not observed either by waiting on the task or accessing the Exception property."
After this it simply kills the worker process, the app crashes, and I see an entry in the Event Viewer.
Any help here will be appreciated.
We have the code below:
Task<List<MyBusinessObject>>[] tasks = new Task<List<MyBusinessObject>>[MyCollection.Count];
for (int i = 0; i < MyCollection.Count; i++)
{
MyDTO dto = new MyDTO();
// ... some property assignment for the MyDTO object ...
tasks[i] = Task<List<MyBusinessObject>>.Factory.StartNew(MyDelegate, dto);
}
try
{
Task.WaitAll(tasks);
}
catch (AggregateException e)
{
AddToLogFile("Exceptions thrown by WaitAll() : ");
for (int j = 0; j < e.InnerExceptions.Count; j++)
{
AddToLogFile(e.InnerExceptions[j].ToString());
}
}
catch(Exception ex)
{
AddToLogFile(ex.Message);
}
Second Alternative
public static class Extensions
{
public static Task<List<MyBusinessObject>> LogExceptions(this Task<List<MyBusinessObject>> task)
{
task.ContinueWith(t =>
{
var aggException = t.Exception.Flatten();
foreach (var exception in aggException.InnerExceptions)
{
AddToLogFile("Task Exception: " + exception.Message);
}
},
TaskContinuationOptions.OnlyOnFaulted);
return task;
}
}
//In a different class call the extension method after starting the new tasks
Task<List<MyBusinessObject>>[] tasks = new Task<List<MyBusinessObject>>[MyCollection.Count];
for (int i = 0; i < MyCollection.Count; i++)
{
MyDTO dto = new MyDTO();
// ... some property assignment for the MyDTO object ...
tasks[i] = Task<List<MyBusinessObject>>.Factory.StartNew(MyDelegate, dto).LogExceptions();
}
Instead of creating a new task for every item all at once, I created the tasks in batches of 100. I then waited until all the tasks in a batch were complete and spawned another 100, until all the work was finished.
int count = Mycollection.Count();
int nbrOfTasks = 100; // batch size used below
Task<List<MyBusinessObject>>[] tasks;
int i = 0;
int j;
while (i < count)
{
j = 0;
tasks = new Task<List<MyBusinessObject>>[nbrOfTasks];
foreach (var p in Mycollection.Skip(i).Take(nbrOfTasks))
{
MyRequestDto dto = new MyRequestDto ();
// ... some property assignment ...
tasks[j] = Task<List<MyBusinessObject>>.Factory.StartNew(MyDelegate, dto);
i++;
j++;
}
try
{
// Wait for all the tasks to finish.
if (tasks != null && tasks.Count() > 0)
{
tasks = tasks.Where(t => t != null).ToArray();
Task.WaitAll(tasks);
}
}
catch (AggregateException e)
{
}
}
This turned out to be related to the memory available to service multiple tasks. If the RAM required to service all the concurrent tasks was not available along the way, the app would just crash. That is why it is best to spawn a limited number of tasks at a time, instead of spawning them all and leaving it to the thread pool to manage.
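An alternative to fixed batches is to throttle how many tasks are in flight at once. Below is a rough, untested sketch using SemaphoreSlim (available in .NET 4.0); maxConcurrency is a value you would choose yourself, and MyCollection, MyDTO, MyDelegate and AddToLogFile are the same names used above.
var throttle = new SemaphoreSlim(maxConcurrency);
var tasks = new List<Task<List<MyBusinessObject>>>();
foreach (var item in MyCollection)
{
    throttle.Wait(); // block until one of the maxConcurrency slots frees up
    MyDTO dto = new MyDTO();
    // ... some property assignment for the MyDTO object ...
    var task = Task<List<MyBusinessObject>>.Factory.StartNew(MyDelegate, dto);
    // Touching t.Exception in the continuation marks any fault as observed,
    // so it cannot escalate to the unobserved-exception crash.
    task.ContinueWith(t => { var ignored = t.Exception; throttle.Release(); });
    tasks.Add(task);
}
try
{
    Task.WaitAll(tasks.ToArray());
}
catch (AggregateException e)
{
    AddToLogFile(e.Flatten().ToString());
}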
Related
The description of the Task.WhenAny method says that it will return the first task that finishes, even if that task is faulted. Is there a way to change this behavior so that it returns the first successful task?
Something like this should do it (may need some tweaks - haven't tested):
private static async Task<Task> WaitForAnyNonFaultedTaskAsync(IEnumerable<Task> tasks)
{
IList<Task> customTasks = tasks.ToList();
Task completedTask;
do
{
completedTask = await Task.WhenAny(customTasks);
customTasks.Remove(completedTask);
} while (completedTask.IsFaulted && customTasks.Count > 0);
return completedTask.IsFaulted ? null : completedTask;
}
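A hedged usage sketch from inside an async method (taskA, taskB and taskC are hypothetical tasks; the helper returns null when every task faulted):
Task winner = await WaitForAnyNonFaultedTaskAsync(new[] { taskA, taskB, taskC });
if (winner != null)
{
    // winner is the first task that completed without faulting
}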
First off, from my review there is no direct way of doing this without waiting for all the tasks to complete and then finding the first one that ran successfully.
To start with, I am not sure about the edge cases I haven't tested, and given that the source code around tasks and continuations requires more than an hour of review, I would like to start thinking around the following source code. Please review my thoughts at the bottom.
public static class TaskExtensions
{
public static async Task<Task> WhenFirst(params Task[] tasks)
{
if (tasks == null)
{
throw new ArgumentNullException(nameof(tasks), "Must be supplied");
}
else if (tasks.Length == 0)
{
throw new ArgumentException("Must supply at least one task", nameof(tasks));
}
int finishedTaskIndex = -1;
for (int i = 0, j = tasks.Length; i < j; i++)
{
var task = tasks[i];
if (task == null)
throw new ArgumentException($"Task at index {i} is null.", nameof(tasks));
if (finishedTaskIndex == -1 && task.IsCompleted && task.Status == TaskStatus.RanToCompletion)
{
finishedTaskIndex = i;
}
}
if (finishedTaskIndex == -1)
{
var promise = new TaskAwaitPromise(tasks.ToList());
for (int i = 0, j = tasks.Length; i < j; i++)
{
if (finishedTaskIndex == -1)
{
var taskId = i;
#pragma warning disable CS4014 // Because this call is not awaited, execution of the current method continues before the call is completed
//we don't want to await these tasks as we want to signal when the first awaited task completes.
tasks[i].ContinueWith((t) =>
{
if (t.Status == TaskStatus.RanToCompletion)
{
if (finishedTaskIndex == -1)
{
finishedTaskIndex = taskId;
promise.InvokeCompleted(taskId);
}
}
else
promise.InvokeFailed();
});
#pragma warning restore CS4014 // Because this call is not awaited, execution of the current method continues before the call is completed
}
}
return await promise.WaitCompleted();
}
return finishedTaskIndex > -1 ? tasks[finishedTaskIndex] : null;
}
class TaskAwaitPromise
{
IList<Task> _tasks;
int _taskId = -1;
int _taskCount = 0;
int _failedCount = 0;
public TaskAwaitPromise(IList<Task> tasks)
{
_tasks = tasks;
_taskCount = tasks.Count;
GC.KeepAlive(_tasks);
}
public void InvokeFailed()
{
_failedCount++;
}
public void InvokeCompleted(int taskId)
{
if (_taskId < 0)
{
_taskId = taskId;
}
}
public async Task<Task> WaitCompleted()
{
await Task.Delay(0);
while (_taskId < 0 && _taskCount != _failedCount)
{
}
return _taskId >= 0 ? _tasks[_taskId] : null; // index 0 is a valid completed task
}
}
}
The code is lengthy, I understand, and may have lots of issues; however, the concept is that you need to execute all the tasks in parallel and find the first resulting task that completed successfully.
Consider that we need to make a continuation for each of the tasks and be able to return out of the continuation back to the original caller. My main concern (other than the fact that I can't remove the continuations) is the while() loop in the code. It is probably best to add some sort of CancellationToken and/or timeout to ensure we don't deadlock while waiting for a completed task; as it stands, if zero tasks complete we never leave this block (see the sketch after the edit below).
Edit
I did change the code slightly to signal the promise on failure so we can handle a failed task. I am still not happy with the code, but it's a start.
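One way the timeout idea could look, as a rough sketch only: WaitCompleted takes a TimeSpan, yields instead of spinning, and gives up when the timeout elapses (same fields as in TaskAwaitPromise above; requires System.Diagnostics for Stopwatch).
public async Task<Task> WaitCompleted(TimeSpan timeout)
{
    var stopwatch = Stopwatch.StartNew();
    while (_taskId < 0 && _taskCount != _failedCount)
    {
        if (stopwatch.Elapsed > timeout)
            return null; // give up rather than wait forever
        await Task.Delay(10); // yield instead of burning a core in a tight loop
    }
    return _taskId >= 0 ? _tasks[_taskId] : null;
}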
I have a scenario in which I have to process multiple files (e.g. 30) in parallel based on the number of processor cores. I have to assign these files to separate tasks based on the number of cores. I don't know how to work out the start and end limits for each task to process; for example, how each task knows how many files it has to process.
private void ProcessFiles(object e)
{
try
{
var diectoryPath = _Configurations.Descendants().SingleOrDefault(Pr => Pr.Name == "DirectoryPath").Value;
var FilePaths = Directory.EnumerateFiles(diectoryPath);
int numCores = System.Environment.ProcessorCount;
int NoOfTasks = FilePaths.Count() > numCores ? (FilePaths.Count()/ numCores) : FilePaths.Count();
for (int i = 0; i < NoOfTasks; i++)
{
Task.Factory.StartNew(
() =>
{
int startIndex = 0, endIndex = 0;
for (int Count = startIndex; Count < endIndex; Count++)
{
this.ProcessFile(FilePaths);
}
});
}
}
catch (Exception ex)
{
throw;
}
}
For problems such as yours, there are concurrent data structures available in C#. You want to use a BlockingCollection and store all the file names in it.
Your idea of calculating the number of tasks from the number of cores on the machine is not very good. Why? Because ProcessFile() may not take the same amount of time for each file. It is better to start as many tasks as there are cores, then let each task take file names one by one from the BlockingCollection and process them until the collection is empty.
try
{
var directoryPath = _Configurations.Descendants().SingleOrDefault(Pr => Pr.Name == "DirectoryPath").Value;
var filePaths = CreateBlockingCollection(directoryPath);
//Start the same #tasks as the #cores (Assuming that #files > #cores)
int taskCount = System.Environment.ProcessorCount;
for (int i = 0; i < taskCount; i++)
{
Task.Factory.StartNew(
() =>
{
string fileName;
while (!filePaths.IsCompleted)
{
if (!filePaths.TryTake(out fileName)) continue;
this.ProcessFile(fileName);
}
});
}
}
And the CreateBlockingCollection() would be as follows:
private BlockingCollection<string> CreateBlockingCollection(string path)
{
var allFiles = Directory.EnumerateFiles(path);
var filePaths = new BlockingCollection<string>(allFiles.Count());
foreach(var fileName in allFiles)
{
filePaths.Add(fileName);
}
filePaths.CompleteAdding();
return filePaths;
}
You will have to modify your ProcessFile() to receive a single file name now, instead of taking all the file paths and processing its chunk.
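For completeness, the reworked ProcessFile might look roughly like this (a sketch only; the body shown is hypothetical):
private void ProcessFile(string fileName)
{
    var contents = File.ReadAllText(fileName); // hypothetical: load the single file
    // ... whatever per-file work your original chunk loop was doing goes here ...
}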
The advantage of this approach is that your CPU won't be over- or under-subscribed, and the load will be evenly balanced too.
I haven't run the code myself, so there might be some syntax error in my code. Feel free to correct the error, if you come across any.
Based on my admittedly limited understanding of the TPL, I think your code could be rewritten as such:
private void ProcessFiles(object e)
{
try
{
var diectoryPath = _Configurations.Descendants().SingleOrDefault(Pr => Pr.Name == "DirectoryPath").Value;
var FilePaths = Directory.EnumerateFiles(diectoryPath);
Parallel.ForEach(FilePaths, path => this.ProcessFile(path));
}
catch (Exception ex)
{
throw;
}
}
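If you do want to cap the parallelism at the core count, Parallel.ForEach also accepts a ParallelOptions instance; a minimal sketch:
var options = new ParallelOptions { MaxDegreeOfParallelism = Environment.ProcessorCount };
Parallel.ForEach(FilePaths, options, path => this.ProcessFile(path));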
I am quite new to the topic of TPL Dataflow. I tested the following example from the book Concurrency in C#, and I can't figure out why there is no output, which should be 2*2-2 = 2.
static void Main(string[] args)
{
//Task tt = test();
Task tt = test1();
Console.ReadLine();
}
static async Task test1()
{
try
{
var multiplyBlock = new TransformBlock<int, int>(item =>
{
if (item == 1)
throw new InvalidOperationException("Blech.");
return item * 2;
});
var subtractBlock = new TransformBlock<int, int>(item => item - 2);
multiplyBlock.LinkTo(subtractBlock,
new DataflowLinkOptions { PropagateCompletion = true });
multiplyBlock.Post(2);
await subtractBlock.Completion;
int temp = subtractBlock.Receive();
Console.WriteLine(temp);
}
catch (AggregateException e)
{
// The exception is caught here.
foreach (var v in e.InnerExceptions)
{
Console.WriteLine(v.Message);
}
}
}
Update 1: I tried another example. I still did not use Complete() on the block, but I thought that when the first block completed, the result would be passed into the second block automatically.
private static async Task test3()
{
TransformManyBlock<int, int> tmb = new TransformManyBlock<int, int>((i) => { return new int[] {i, i + 1}; });
ActionBlock<int> ab = new ActionBlock<int>((i) => Console.WriteLine(i));
tmb.LinkTo(ab);
for (int i = 0; i < 4; i++)
{
tmb.Post(i);
}
//tmb.Complete();
await ab.Completion;
Console.WriteLine("Finished post");
}
This part of the code:
await subtractBlock.Completion;
int temp = subtractBlock.Receive();
is first (asynchronously) waiting for the subtraction block to complete, and then attempting to retrieve an output from the block.
There are two problems: the source block is never completed, and the code is attempting to retrieve output from a completed block. Once a block has completed, it will not produce any more data.
(I assume you're referring to the example in recipe 4.2, which will post 1, causing the exception, which completes the block in a faulted state).
So, you can fix this test by completing the source block (and the completion will propagate along the link to the subtractBlock automatically), and by reading the output before (asynchronously) waiting for subtractBlock to complete:
multiplyBlock.Complete();
int temp = subtractBlock.Receive();
await subtractBlock.Completion;
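Putting both changes together, the corrected method might look like this (an untested sketch of the same blocks as above):
static async Task test1()
{
    var multiplyBlock = new TransformBlock<int, int>(item =>
    {
        if (item == 1)
            throw new InvalidOperationException("Blech.");
        return item * 2;
    });
    var subtractBlock = new TransformBlock<int, int>(item => item - 2);
    multiplyBlock.LinkTo(subtractBlock,
        new DataflowLinkOptions { PropagateCompletion = true });
    multiplyBlock.Post(2);
    multiplyBlock.Complete();           // complete the source block
    int temp = subtractBlock.Receive(); // read the output before awaiting completion
    Console.WriteLine(temp);            // prints 2
    await subtractBlock.Completion;     // completes once the pipeline has drained
}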
I'm trying to make a small class for multithreading usage in my WinForms projects.
I tried Threads (problems with the UI) and BackgroundWorker (something went wrong with the UI there too, let's leave that for now :)), and now I'm trying to do it with the Task class. But I can't understand how to make an infinite loop and a cancellation method (in the class) for all running tasks.
The examples I found are meant to be used inside a single method.
So here are the structure and code of the currently working part (Worker.cs and the methods used in the WinForm code).
Worker.cs
class Worker
{
public static int threadCount { get; set; }
public void doWork(ParameterizedThreadStart method)
{
Task[] tasks = Enumerable.Range(0, 4).Select(i => Task.Factory.StartNew(() => method(i))).ToArray();
}
}
Usage in
Form1.cs
private void Start_btn_Click(object sender, EventArgs e)
{
Worker.threadCount = 1; // not actually used right now; the number of tasks is declared in the class temporarily
Worker worker = new Worker();
worker.doWork(Job);
string logString_1 = string.Format("Starting {0} threads...", Worker.threadCount);
log(logString_1);
}
public static int j = 0;
private void Job(object sender)
{
Worker worker = new Worker();
Random r = new Random();
log("Thread "+Thread.CurrentThread.ManagedThreadId +" is working...");
for (int i = 0; i < 5; i++)
{
j++;
log("J==" + j);
if (j == 50)
{
//worker.Stop();
log("STOP");
}
}
Thread.Sleep(r.Next(500, 1000));
}
So it runs the example's 4 threads, they execute, I get J==20 in my log, and that's OK.
My question is how to implement an infinite loop for the tasks created by the Worker.doWork() method,
and also how to make a .Stop() method for the Worker class (which should just stop all tasks when called). As I understand it, these are related questions, so I put them together.
I tried some solutions, but all of them are based on CancellationToken usage, and I would have to create that token inside the Worker.doWork() method, so I can't use the same token to create a Worker.Stop() method.
Can someone help? The number of threads I have to use in this software is in the range of about 5-200.
Using the J computation is just an example of an easy condition used to stop the work (stop the tasks/threads).
In reality, the stop condition is usually something like a Queue<> being exhausted or a List<> running out of elements.
Finally, I got it working.
class Worker
{
public static int threadCount { get; set; }
Task[] tasks;
//ex data
public static string exception;
static CancellationTokenSource wtoken = new CancellationTokenSource();
CancellationToken cancellationToken = wtoken.Token;
public void doWork(ParameterizedThreadStart method)
{
try
{
tasks = Enumerable.Range(0, 4).Select(i => Task.Factory.StartNew(() =>
{
while (!cancellationToken.IsCancellationRequested)
{
method(i);
}
}, cancellationToken)).ToArray();
}
catch (Exception ex) { exception = ex.Message; }
}
public void HardStop()
{
try
{
using (wtoken)
{
wtoken.Cancel();
}
wtoken = null;
tasks = null;
}
catch (Exception ex) { exception = ex.Message; }
}
}
But if I use cancellationToken.ThrowIfCancellationRequested(); to quit,
I get an error:
when the Job() method reaches J == 50 and the worker.HardStop() function is called, the program window crashes and I get an exception, "OperationCanceledException was unhandled by user code",
on this line:
cancellationToken.ThrowIfCancellationRequested();
So what's wrong? I already put it in try {} catch() {}.
As I understood it, only some boolean properties should change on the Task (Task.IsCanceled == false, Task.IsFaulted == true) on wtoken.Cancel();
I'd avoid all of the mucking around with tasks and use Microsoft's Reactive Framework (NuGet "Rx-Main") for this.
Here's how:
var r = new Random();
var query =
Observable
.Range(0, 4, Scheduler.Default)
.Select(i =>
Observable
.Generate(0, x => true, x => x, x => x,
x => TimeSpan.FromMilliseconds(r.Next(500, 1000)),
Scheduler.Default)
.Select(x => i))
.Merge();
var subscription =
query
.Subscribe(i => method(i));
And when you want to cancel the calls to method just do this:
subscription.Dispose();
I've tested this and it works like a treat.
If I wrap this up in your worker class then it looks like this:
class Worker
{
private Random _r = new Random();
private IDisposable _subscription = null;
public void doWork(Action<int> method)
{
_subscription =
Observable
.Range(0, 4, Scheduler.Default)
.Select(n =>
Observable
.Generate(
0, x => true, x => x, x => x,
x => TimeSpan.FromMilliseconds(_r.Next(500, 1000)),
Scheduler.Default)
.Select(x => n))
.Merge()
.Subscribe(i => method(i));
}
public void HardStop()
{
_subscription.Dispose();
}
}
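A hedged usage sketch from the form, assuming the doWork signature above (taking the delegate to run) and that Job can accept the generated value:
var worker = new Worker();
worker.doWork(i => Job(i)); // start the four repeating streams
// ... later, e.g. in a Stop button handler:
worker.HardStop();          // disposes the subscription and stops further calls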
I have recently upgraded my projects to ASP.NET 4.5 and I have been waiting a long time to use 4.5's asynchronous capabilities. After reading the documentation I'm not sure whether I can improve my code at all.
I want to execute a task asynchronously and then forget about it. The way that I'm currently doing this is by creating delegates and then using BeginInvoke.
Here's one of the filters in my project which creates an audit in our database every time a user accesses a resource that must be audited:
public override void OnActionExecuting(ActionExecutingContext filterContext)
{
var request = filterContext.HttpContext.Request;
var id = WebSecurity.CurrentUserId;
var invoker = new MethodInvoker(delegate
{
var audit = new Audit
{
Id = Guid.NewGuid(),
IPAddress = request.UserHostAddress,
UserId = id,
Resource = request.RawUrl,
Timestamp = DateTime.UtcNow
};
var database = (new NinjectBinder()).Kernel.Get<IDatabaseWorker>();
database.Audits.InsertOrUpdate(audit);
database.Save();
});
invoker.BeginInvoke(StopAsynchronousMethod, invoker);
base.OnActionExecuting(filterContext);
}
But in order to finish this asynchronous task, I need to always define a callback, which looks like this:
public void StopAsynchronousMethod(IAsyncResult result)
{
var state = (MethodInvoker)result.AsyncState;
try
{
state.EndInvoke(result);
}
catch (Exception e)
{
var username = WebSecurity.CurrentUserName;
Debugging.DispatchExceptionEmail(e, username);
}
}
I would rather not use the callback at all, since I do not need a result from the task I am invoking asynchronously.
How can I improve this code with Task.Run() (or async and await)?
If I understood your requirements correctly, you want to kick off a task and then forget about it. When the task completes, and if an exception occurred, you want to log it.
I'd use Task.Run to create a task, followed by ContinueWith to attach a continuation task. This continuation task will log any exception that was thrown from the parent task. Also, use TaskContinuationOptions.OnlyOnFaulted to make sure the continuation only runs if an exception occurred.
Task.Run(() => {
var audit = new Audit
{
Id = Guid.NewGuid(),
IPAddress = request.UserHostAddress,
UserId = id,
Resource = request.RawUrl,
Timestamp = DateTime.UtcNow
};
var database = (new NinjectBinder()).Kernel.Get<IDatabaseWorker>();
database.Audits.InsertOrUpdate(audit);
database.Save();
}).ContinueWith(task => {
task.Exception.Handle(ex => {
var username = WebSecurity.CurrentUserName;
Debugging.DispatchExceptionEmail(ex, username);
});
}, TaskContinuationOptions.OnlyOnFaulted);
As a side-note, background tasks and fire-and-forget scenarios in ASP.NET are highly discouraged. See The Dangers of Implementing Recurring Background Tasks In ASP.NET
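If you can target .NET 4.5.2 or later, one hedged mitigation is HostingEnvironment.QueueBackgroundWorkItem, which registers the work with ASP.NET so the runtime tries to delay shutdown while it runs; delivery is still not guaranteed. A sketch reusing the audit code above:
// requires .NET 4.5.2+ and using System.Web.Hosting;
HostingEnvironment.QueueBackgroundWorkItem(cancellationToken =>
{
    try
    {
        var audit = new Audit
        {
            Id = Guid.NewGuid(),
            IPAddress = request.UserHostAddress,
            UserId = id,
            Resource = request.RawUrl,
            Timestamp = DateTime.UtcNow
        };
        var database = (new NinjectBinder()).Kernel.Get<IDatabaseWorker>();
        database.Audits.InsertOrUpdate(audit);
        database.Save();
    }
    catch (Exception e)
    {
        Debugging.DispatchExceptionEmail(e, WebSecurity.CurrentUserName);
    }
});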
It may sound a bit out of scope, but if you just want to fire and forget after you launch it, why not use the ThreadPool directly?
Something like:
ThreadPool.QueueUserWorkItem(
x =>
{
try
{
// Do something
...
}
catch (Exception e)
{
// Log something
...
}
});
I had to do some performance benchmarking of different async call methods, and I found that (not surprisingly) ThreadPool performs much better, but also that BeginInvoke is actually not that bad (I am on .NET 4.5). Here is what I found with the code at the end of the post; I did not find anything like this online, so I took the time to check it myself. The calls are not exactly equal, but they are more or less functionally equivalent in terms of what they do:
ThreadPool: 70.80ms
Task: 90.88ms
BeginInvoke: 121.88ms
Thread: 4657.52ms
public class Program
{
public delegate void ThisDoesSomething();
// Perform a very simple operation to see the overhead of
// different async calls types.
public static void Main(string[] args)
{
const int repetitions = 25;
const int calls = 1000;
var results = new List<Tuple<string, double>>();
Console.WriteLine(
"{0} parallel calls, {1} repetitions for better statistics\n",
calls,
repetitions);
// Threads
Console.Write("Running Threads");
results.Add(new Tuple<string, double>("Threads", RunOnThreads(repetitions, calls)));
Console.WriteLine();
// BeginInvoke
Console.Write("Running BeginInvoke");
results.Add(new Tuple<string, double>("BeginInvoke", RunOnBeginInvoke(repetitions, calls)));
Console.WriteLine();
// Tasks
Console.Write("Running Tasks");
results.Add(new Tuple<string, double>("Tasks", RunOnTasks(repetitions, calls)));
Console.WriteLine();
// Thread Pool
Console.Write("Running Thread pool");
results.Add(new Tuple<string, double>("ThreadPool", RunOnThreadPool(repetitions, calls)));
Console.WriteLine();
Console.WriteLine();
// Show results
results = results.OrderBy(rs => rs.Item2).ToList();
foreach (var result in results)
{
Console.WriteLine(
"{0}: Done in {1}ms avg",
result.Item1,
(result.Item2 / repetitions).ToString("0.00"));
}
Console.WriteLine("Press a key to exit");
Console.ReadKey();
}
/// <summary>
/// The do stuff.
/// </summary>
public static void DoStuff()
{
Console.Write("*");
}
public static double RunOnThreads(int repetitions, int calls)
{
var totalMs = 0.0;
for (var j = 0; j < repetitions; j++)
{
Console.Write(".");
var toProcess = calls;
var stopwatch = new Stopwatch();
var resetEvent = new ManualResetEvent(false);
var threadList = new List<Thread>();
for (var i = 0; i < calls; i++)
{
threadList.Add(new Thread(() =>
{
// Do something
DoStuff();
// Safely decrement the counter
if (Interlocked.Decrement(ref toProcess) == 0)
{
resetEvent.Set();
}
}));
}
stopwatch.Start();
foreach (var thread in threadList)
{
thread.Start();
}
resetEvent.WaitOne();
stopwatch.Stop();
totalMs += stopwatch.ElapsedMilliseconds;
}
return totalMs;
}
public static double RunOnThreadPool(int repetitions, int calls)
{
var totalMs = 0.0;
for (var j = 0; j < repetitions; j++)
{
Console.Write(".");
var toProcess = calls;
var resetEvent = new ManualResetEvent(false);
var stopwatch = new Stopwatch();
var list = new List<int>();
for (var i = 0; i < calls; i++)
{
list.Add(i);
}
stopwatch.Start();
for (var i = 0; i < calls; i++)
{
ThreadPool.QueueUserWorkItem(
x =>
{
// Do something
DoStuff();
// Safely decrement the counter
if (Interlocked.Decrement(ref toProcess) == 0)
{
resetEvent.Set();
}
},
list[i]);
}
resetEvent.WaitOne();
stopwatch.Stop();
totalMs += stopwatch.ElapsedMilliseconds;
}
return totalMs;
}
public static double RunOnBeginInvoke(int repetitions, int calls)
{
var totalMs = 0.0;
for (var j = 0; j < repetitions; j++)
{
Console.Write(".");
var beginInvokeStopwatch = new Stopwatch();
var delegateList = new List<ThisDoesSomething>();
var resultsList = new List<IAsyncResult>();
for (var i = 0; i < calls; i++)
{
delegateList.Add(DoStuff);
}
beginInvokeStopwatch.Start();
foreach (var delegateToCall in delegateList)
{
resultsList.Add(delegateToCall.BeginInvoke(null, null));
}
// We lose a bit of accuracy, but if the loop is big enough,
// it should not really matter
while (resultsList.Any(rs => !rs.IsCompleted))
{
Thread.Sleep(10);
}
beginInvokeStopwatch.Stop();
totalMs += beginInvokeStopwatch.ElapsedMilliseconds;
}
return totalMs;
}
public static double RunOnTasks(int repetitions, int calls)
{
var totalMs = 0.0;
for (var j = 0; j < repetitions; j++)
{
Console.Write(".");
var resultsList = new List<Task>();
var stopwatch = new Stopwatch();
stopwatch.Start();
for (var i = 0; i < calls; i++)
{
resultsList.Add(Task.Factory.StartNew(DoStuff));
}
// We lose a bit of accuracy, but if the loop is big enough,
// it should not really matter
while (resultsList.Any(task => !task.IsCompleted))
{
Thread.Sleep(10);
}
stopwatch.Stop();
totalMs += stopwatch.ElapsedMilliseconds;
}
return totalMs;
}
}
Here's one of the filters in my project which creates an audit in our database every time a user accesses a resource that must be audited
Auditing is certainly not something I would call "fire and forget". Remember, on ASP.NET, "fire and forget" means "I don't care whether this code actually executes or not". So, if your desired semantics are that audits may occasionally be missing, then (and only then) you can use fire and forget for your audits.
If you want to ensure your audits are all correct, then either wait for the audit save to complete before sending the response, or queue the audit information to reliable storage (e.g., Azure queue or MSMQ) and have an independent backend (e.g., Azure worker role or Win32 service) process the audits in that queue.
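For the reliable-queue option, a minimal sketch with MSMQ (System.Messaging) could look like the following; the queue path is purely illustrative, the queue is assumed to already exist, and the Audit type would need to be serializable for the default formatter:
// Enqueue the audit; a separate Windows service (or worker role) dequeues it and writes it to the database.
using (var queue = new MessageQueue(@".\private$\audits"))
{
    queue.Send(audit);
}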
But if you want to live dangerously (accepting that occasionally audits may be missing), you can mitigate the problems by registering the work with the ASP.NET runtime. Using the BackgroundTaskManager from my blog:
public override void OnActionExecuting(ActionExecutingContext filterContext)
{
var request = filterContext.HttpContext.Request;
var id = WebSecurity.CurrentUserId;
BackgroundTaskManager.Run(() =>
{
try
{
var audit = new Audit
{
Id = Guid.NewGuid(),
IPAddress = request.UserHostAddress,
UserId = id,
Resource = request.RawUrl,
Timestamp = DateTime.UtcNow
};
var database = (new NinjectBinder()).Kernel.Get<IDatabaseWorker>();
database.Audits.InsertOrUpdate(audit);
database.Save();
}
catch (Exception e)
{
var username = WebSecurity.CurrentUserName;
Debugging.DispatchExceptionEmail(e, username);
}
});
base.OnActionExecuting(filterContext);
}