I'm developing an MVC3-based web application which at one point needs to cache a large amount of data from an external database. The process takes about 10-30 minutes (depending on traffic), so I put it in a BackgroundWorker. When you click a particular button on the webpage, ajax calls another method in the controller, and depending on the returned value the proper information is displayed on the user interface.
Controller:
if (IsDbLocked())
{
    return this.Json(new
    {
        success = false,
        message = "There is already an update requested by other user on the way."
    });
}

this.model.DataUpdateBegin();
return this.Json(new { success = true });
Model:
public void DataUpdateBegin()
{
    var backgroundWorker = new BackgroundWorker
    {
        WorkerSupportsCancellation = false,
        WorkerReportsProgress = true
    };

    backgroundWorker.DoWork += this.DataUpdateWorkerDoWork;
    backgroundWorker.ProgressChanged += this.DataUpdaterWorkerProgressChanged;
    backgroundWorker.RunWorkerCompleted += this.DataUpdaterWorkerRunWorkerCompleted;

    if (this.DataUpdateLockDb(true))
    {
        backgroundWorker.RunWorkerAsync();
    }
}
Now when I do an update, the UI still freezes. While debugging the controller I can see that it starts the BackgroundWorker and instantly continues to the return statement (with success = true), but then it just finishes and nothing else happens (the returned message never reaches the webpage).
I can view the page from another browser/user and everything works OK, but this particular thread is locked for several minutes (not the entire 10-30 min, as it gets unlocked after about 5 minutes - a timeout?).
My question is: what did I do wrong, and how do I fix it? I expect the BackgroundWorker to just run in the background and leave the user free to move around the page wherever they wish. Also, making an entry in a database and having some external application fetch it and do all the work (in a real background process) is not an option for me.
Do not use BackgroundWorker like this. Yes, the work will be done on another thread, but still within the scope of that web request. Web requests are not meant to stay alive for 30 minutes; there are plenty of things that can go wrong (timeouts, app pool restarts, other IIS behaviour...).
If you have this type of long-running task, you should do it in some worker - a Windows service, maybe a console application, etc. - and from the web you just start it (console) or queue it up to be done (message queue, Azure queue).
Also, I hope you are not locking the database (your IsDbLocked() method) for 30 minutes? Just do your import in a transaction and use a proper isolation level (read committed) so the DB works normally the whole time and the changes appear instantly when the import finishes.
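For illustration, a minimal sketch of wrapping the import in a transaction with read-committed isolation (DoImport is a placeholder for the actual import work):

using System;
using System.Transactions;

public void RunImport()
{
    var options = new TransactionOptions
    {
        IsolationLevel = IsolationLevel.ReadCommitted,
        // A 30-minute import exceeds the default 1-minute timeout; raising it here
        // may also require raising the machine-wide maximum in machine.config.
        Timeout = TimeSpan.FromMinutes(40)
    };

    using (var scope = new TransactionScope(TransactionScopeOption.Required, options))
    {
        DoImport();       // placeholder for the actual 10-30 minute import work
        scope.Complete(); // commit - readers see the changes only once this succeeds
    }
}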
I'm developing an MVC3-based web application
... so I put it in a BackgroundWorker
BackgroundWorker is designed to work with Windows (Forms) applications; to achieve something similar in a web application, use Task.Factory.StartNew or Thread (more primitive).
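For example, a minimal sketch of DataUpdateBegin rewritten with Task.Factory.StartNew (DataUpdateWork is a hypothetical method standing in for the existing DoWork handler):

using System.Threading.Tasks;

public void DataUpdateBegin()
{
    if (this.DataUpdateLockDb(true))
    {
        // LongRunning hints the scheduler to use a dedicated thread instead of the pool.
        Task.Factory.StartNew(() => this.DataUpdateWork(), TaskCreationOptions.LongRunning);
    }
}

The same caveats from the answer above still apply: the task dies with the app pool.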
Related
I am writing a Windows Forms app in C# with Visual Studio 2022 on a Windows 10 machine. The app connects to an Azure database, which works fine. My issue is that sometimes it takes several seconds to connect (maybe 10 or so), or if there is an error it goes all the way to the timeout limit (usually 20 to 30 seconds) before coming back with whatever error message there is.
I am trying to provide some visual feedback to the user during this time, but the application does not appear to be processing any events, so whatever type of feedback I'm trying to send does not get done until the operation completes (at which point it is moot).
Any ideas on how to deal with this? Do I need to open the database on a different thread, and if so, will that be an issue throughout the rest of the app whenever I use the database object opened on a different thread?
I'm trying something simple, like gradually adding a row of dots, like so:
private void InitCloudDatabase()
{
    Boolean success = true;
    WorkingTimer.Enabled = true;
    WorkingTimer.Start();

    try
    {
        AzureAgDatabase db = new AzureAgDatabase();
        db.OpenConnection();
    }
    catch
    {
        success = false;
    }

    WorkingTimer.Stop();
    pbCloudResult.Image = (success) ? Properties.Resources.icons8_done_96
                                    : Properties.Resources.Red_X___Fail;
}
private void WorkingTimer_Tick(object sender, EventArgs e)
{
    lblCloud.Text += " .";
    if (lblCloud.Text.Contains(" . . . . . . . . . . ."))
    {
        lblCloud.Text = "Database Connection (Cloud)";
    }
}
I haven't really worked with Windows Forms before, but in most UI-based applications you should reserve the UI thread for UI operations only and move all time-consuming tasks (compute or I/O) to a different thread to ensure that the UI stays responsive.
In the case of Windows Forms, it looks like you have a BackgroundWorker class that you can use to offload the DB operations into. There is a walkthrough in the official docs that you can refer to.
Another approach would be to use the Task class to run your database code asynchronously, with less code than the first approach. You would simply wrap the statements that take time in a Task.Run call and have the follow-up statements in a continuation task.
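For example, a sketch of InitCloudDatabase rewritten that way, using async/await as the continuation mechanism (this assumes AzureAgDatabase.OpenConnection is a synchronous, blocking call):

// requires: using System.Threading.Tasks;
private async void InitCloudDatabase()
{
    Boolean success = true;
    WorkingTimer.Enabled = true;
    WorkingTimer.Start();

    try
    {
        // Run the blocking connect on a thread-pool thread; the UI thread keeps
        // pumping events (including WorkingTimer ticks) while we await.
        await Task.Run(() =>
        {
            AzureAgDatabase db = new AzureAgDatabase();
            db.OpenConnection();
        });
    }
    catch
    {
        success = false;
    }

    // Execution resumes here on the UI thread, so touching controls is safe.
    WorkingTimer.Stop();
    pbCloudResult.Image = success ? Properties.Resources.icons8_done_96
                                  : Properties.Resources.Red_X___Fail;
}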
I am working on an MVC 5 based report-generating web application (Excel files). On one "GenerateReports" page, a button click calls a StartMonthly function, which passes control to a void method "GenerateReportMainMonthly" in the controller. This method calls another void method "GenerateReportMain", which in turn calls five other void functions.
I do not want control to be stuck at the back end until report generation is complete. On button click, an alert box should show "Report Generation started." and control should come back to the "GenerateReports" page.
I have tried ajax but have not been able to get control back to the UI. How can I get control back to the same page without waiting for the back-end process to complete?
$('#btnStart').on('click', StartMonthly);

function StartMonthly() {
    var url = '/GenerateReport/GenerateReportMainMonthly';
    window.location.href = url;
}
public void GenerateReportMainMonthly()
{
    _isDaily = false;
    GenerateReportMain();
}
It seems you are after running background tasks in your controllers. This is generally a bad idea (see this SO answer), as you might find that your server process has been killed mid-way and your client will have to handle that somehow.
If you absolutely must run long-ish processes in your controller and cannot extract them into a background worker of some sort, you can opt for something like this SO answer suggests. The implementation will vary depending on your setup and how fancy you are willing/able to go, but the basics will remain the same (see the sketch after this list):
you make an instantaneous call to initiate your long action and get back a job id to refer to later
your backend processes the task and updates the status accordingly
your client periodically checks the status and runs your desired behaviour when the job is reported complete.
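A bare-bones sketch of that flow in an MVC 5 controller (every name here - the controller, the actions, DoLongRunningWork - is hypothetical, and the static dictionary stands in for a real status store such as a database or cache):

using System;
using System.Collections.Concurrent;
using System.Threading.Tasks;
using System.Web.Mvc;

public class ReportJobController : Controller
{
    private static readonly ConcurrentDictionary<string, string> Jobs =
        new ConcurrentDictionary<string, string>();

    [HttpPost]
    public JsonResult Start()
    {
        var jobId = Guid.NewGuid().ToString("N");
        Jobs[jobId] = "running";

        Task.Factory.StartNew(() =>
        {
            DoLongRunningWork();       // placeholder for the report generation
            Jobs[jobId] = "complete";  // a real version would also record failures
        });

        return Json(new { jobId });    // instantaneous response with the job id
    }

    [HttpGet]
    public JsonResult Status(string jobId)
    {
        string status;
        Jobs.TryGetValue(jobId, out status);
        return Json(new { status = status ?? "unknown" }, JsonRequestBehavior.AllowGet);
    }

    private static void DoLongRunningWork() { /* ... */ }
}

The client then fires Start via ajax, keeps the returned jobId, and polls Status every few seconds until it reads "complete".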
If I were to tackle this problem I'd try to avoid periodic polling and instead opt for SignalR updates as described in this blog post (it is not mine, I just googled it up as an example).
I have a long-running action/method that is called when a user clicks a button on an internal MVC5 application. The button is shared by all users, meaning a second person can come in and click it seconds after it has been clicked. The long-running task updates a shared task window for all clients via SignalR.
Is there a recommended way to check if the task is still busy and simply notifying the user it's still working? Is there another recommended approach? (can't use external windows service for the work)
Currently what I am doing seems like a bad idea, or I could be wrong and it's feasible. See below for a sample of what I am doing.
public static Task WorkerTask { get; set; }

public JsonResult SendData()
{
    if (WorkerTask == null)
    {
        WorkerTask = Task.Factory.StartNew(async () =>
        {
            // Do the 2-15 minute long running job
        });
        WorkerTask = null;
    }
    else
    {
        TempData["Message"] = "Data is already being exported. Please see task window for the status.";
    }

    return Json(Url.Action("Export", "Home"), JsonRequestBehavior.AllowGet);
}
I don't think what you're doing will work at all. I see three issues:
You are storing the WorkerTask on the controller (I think). A new controller is created for every request. Therefore, a new WorkerTask will always be created.
If #1 weren't true, you would still need to wrap the instantiation of WorkerTask in a lock because multiple clients could reach the WorkerTask == null check at the same time.
You shouldn't have long running tasks in your web application. The app pool could restart at any time killing your WorkerTask.
If you want to skip the best practices advice of "don't do long running work in your web app", you could use the HostingEnvironment.QueueBackgroundWorkItem introduced in .NET 4.5.2 to kick off the long running task. You could store a variable in the HttpApplication.Cache to indicate whether the long running process has been kicked off.
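A rough sketch of that combination (the "ExportRunning" cache key is made up, and as issue #2 above says, a real version should guard the check-then-set with a lock):

using System.Web;
using System.Web.Hosting;
using System.Web.Mvc;

public JsonResult SendData()
{
    // NOTE: this check-then-set is racy; wrap it in a lock for real use (see issue #2).
    if (HttpRuntime.Cache["ExportRunning"] == null)
    {
        HttpRuntime.Cache["ExportRunning"] = true;

        // The runtime delays app pool shutdown (up to a limit) for queued items.
        HostingEnvironment.QueueBackgroundWorkItem(cancellationToken =>
        {
            try
            {
                // Do the 2-15 minute long running job, honouring cancellationToken.
            }
            finally
            {
                HttpRuntime.Cache.Remove("ExportRunning");
            }
        });
    }
    else
    {
        TempData["Message"] = "Data is already being exported. Please see task window for the status.";
    }

    return Json(Url.Action("Export", "Home"), JsonRequestBehavior.AllowGet);
}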
This solution has more than a few issues (it won't work in a web farm, the app pool could die, etc.). A more robust solution would be to use something like Quartz.net or Hangfire.
I am working on an assignment in ASP.NET to send notification emails to users at specific intervals.
The problem is that since the server is not privately owned, I cannot install a Windows service on it.
Any ideas?
There's no reliable way to achieve that. If you cannot install a Windows service on the host, you could write an endpoint (.aspx or .ashx) that sends the email, and then purchase from some other site a service that pings this endpoint at regular intervals with an HTTP request. Obviously you should configure this endpoint to be accessible only from the IP address of the provider you purchase the service from; otherwise anyone could send an HTTP request to the endpoint and trigger the process, which is probably undesirable.
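For illustration, a minimal sketch of such an endpoint as an .ashx handler (the class name, the SendPendingEmails helper, and the IP address are all placeholders):

<%@ WebHandler Language="C#" Class="EmailJobHandler" %>

using System.Web;

public class EmailJobHandler : IHttpHandler
{
    public void ProcessRequest(HttpContext context)
    {
        // Accept pings only from the purchased ping service's IP (placeholder value).
        if (context.Request.UserHostAddress != "203.0.113.10")
        {
            context.Response.StatusCode = 403;
            return;
        }

        SendPendingEmails(); // placeholder: look up due notifications and send them
        context.Response.Write("OK");
    }

    private static void SendPendingEmails() { /* ... */ }

    public bool IsReusable { get { return true; } }
}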
Further reading: The Dangers of Implementing Recurring Background Tasks In ASP.NET.
There are several ways to get code executing on an interval that don't require a Windows service.
One option is to use the Cache class - use one of the Insert overloads that takes a CacheItemRemovedCallback; it will be called when the cache item is removed, and you can re-add the cache item with the same callback again and again...
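A minimal sketch of that trick (the key name and five-minute interval are arbitrary; note the timing is approximate, since the cache expires items lazily):

using System;
using System.Web;
using System.Web.Caching;

public static class EmailScheduler
{
    public static void ScheduleNextRun()
    {
        HttpRuntime.Cache.Insert(
            "EmailJob",
            DateTime.Now,                // the cached value itself is irrelevant
            null,
            DateTime.Now.AddMinutes(5),  // fire roughly every 5 minutes
            Cache.NoSlidingExpiration,
            CacheItemPriority.NotRemovable,
            OnRemoved);
    }

    private static void OnRemoved(string key, object value, CacheItemRemovedReason reason)
    {
        // send any due emails here...
        ScheduleNextRun(); // re-add the item so the callback keeps firing
    }
}

You would call ScheduleNextRun() once, e.g. from Application_Start.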
Though, the first thing you need to do is contact the hosting company and find out if they already have some sort of solution for you.
You could set up a scheduled task on the server to invoke a program with the desired action.
You can always use a System.Timers.Timer and make a call at specific intervals. What you need to be careful about is that this must be set up only once, e.g. on application start; if you have more than one pool it may run multiple times, and you also need to access a database to read the data for your actions.
using System.Timers;

var oTimer = new Timer();
oTimer.Interval = 30000; // 30 seconds
oTimer.Elapsed += new ElapsedEventHandler(MyThreadFun);
oTimer.Start();

private static void MyThreadFun(object sender, ElapsedEventArgs e)
{
    // read your query from the database,
    // get the next emails that must be sent,
    // send them, mark them as sent, and log any errors.
}
Why I chose the System.Timers timer: http://msdn.microsoft.com/en-us/magazine/cc164015.aspx
A few more notes: I use this in a more complex class and it works fine. Points I have also handled there (a sketch of the mutex part follows):
Signal the application stop and wait for the timer to finish.
Use a mutex and the database to synchronize the work.
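For the synchronization point, a sketch of guarding the timer callback with a named mutex (the mutex name is made up):

using System;
using System.Threading;
using System.Timers;

private static void MyThreadFun(object sender, ElapsedEventArgs e)
{
    // A named mutex is machine-wide, so two app pools can't send the same emails twice.
    using (var mutex = new Mutex(false, @"Global\EmailJobMutex"))
    {
        if (!mutex.WaitOne(TimeSpan.Zero))
            return; // another process holds the job; skip this tick

        try
        {
            // read the pending emails from the database, send and mark them
        }
        finally
        {
            mutex.ReleaseMutex();
        }
    }
}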
The easiest solution is to exploit the global.asax application events.
On the application startup event, create a thread (or task) and store it in a static singleton variable in the global class.
The thread/task will run an endless while(true) {...} loop with your "service-like" code inside.
You'll also want to put a Thread.Sleep(60000) in the loop so it doesn't eat unnecessary CPU cycles.
static void FakeService(object obj)
{
    while (true)
    {
        try
        {
            // - get a list of users to send emails to
            // - check the current time and compare it to the interval to send a new email
            // - send emails
            // - update the last_email_sent time for the users
        }
        catch (Exception ex)
        {
            // - log any exceptions
            // - choose to keep the loop (fake service) running or end it (return)
        }

        Thread.Sleep(60000); // run the code in this loop every ~60 seconds
    }
}
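The global.asax wiring for this would look something like the following sketch (the static field is the singleton variable mentioned above):

using System;
using System.Threading;

public class Global : System.Web.HttpApplication
{
    // Static singleton holding the fake-service thread for the app's lifetime.
    private static Thread _fakeServiceThread;

    protected void Application_Start(object sender, EventArgs e)
    {
        _fakeServiceThread = new Thread(FakeService) { IsBackground = true };
        _fakeServiceThread.Start();
    }
}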
EDIT: Because your task is more or less a simple timer job, the ACID-type concerns about an app pool reset or other error don't really apply: the loop can just start up again and keep trucking along without data corruption. But you could also use the thread to simply execute a request to an aspx or ashx page that holds your logic.
new WebClient().DownloadString("http://localhost/EmailJob.aspx");
We have an application that has a primary window and can launch multiple other windows in new browsers. We use a Silverlight application as a coordinating server in the primary window to close all windows that are part of the app, regardless of how they were opened (we can't guarantee it was via window.open, so we don't always have a handle to the window in JavaScript).
On log out, we want to signal all the other windows to perform an auto-save, if necessary, then close down.
So all windows have a Silverlight app, and they coordinate using LocalMessageSenders. However, these are asynchronous:
private void ProcessAutosave()
{
    foreach (string s in _windows)
    {
        SendMessage(s, "notify-logout");
    }
    // code here quoted later...
}

// SendAsync doesn't send until the method terminates, so it has to go in its own function.
private void SendMessage(string to, string message)
{
    var lms = new LocalMessageSender(to);
    lms.SendCompleted += new EventHandler<SendCompletedEventArgs>(SenderSendCompleted);
    lms.SendAsync(message);
}
Since ProcessAutosave is called from a JavaScript onunload event, which can't be cancelled, we need this to be synchronous and not complete before we have processed a response from each sub-window, so the session state will still be valid, etc.
In SenderSendCompleted we remove items from _windows once they have said they're done.
So I added a loop on the end:
while (_windows.Count > 0)
{
    Thread.Sleep(1);
}
However, that never terminates unless I put an iteration counter on it.
Am I the victim of a compiler optimisation, meaning the changes in SenderSendCompleted do not affect that while loop? Or have I fundamentally misunderstood something, or missed something obvious that's staring me in the face?
It sounds like a subtle version of a race condition caused by mixing sync and async. Couldn't the process in question also receive notifications from the windows that they have received the message and are shutting down? Once all of the counter-messages have been received, the main app could shut down without the busy wait at the end.
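A sketch of that acknowledgment idea on the parent's Silverlight side (the receiver name "parent-window" and the "autosave-done" reply are assumptions; each child would send the reply via its own LocalMessageSender):

using System.Windows.Messaging;

private int _pendingAcks;

private void ListenForAcks()
{
    var receiver = new LocalMessageReceiver("parent-window");
    receiver.MessageReceived += (s, e) =>
    {
        // MessageReceived fires on the UI thread, so a busy wait would starve it -
        // which may also be why the while loop above never sees _windows shrink.
        if (e.Message == "autosave-done" && --_pendingAcks == 0)
        {
            // every child has saved; it is now safe to finish the logout
        }
    };
    receiver.Listen();
}

private void ProcessAutosave()
{
    _pendingAcks = _windows.Count;
    foreach (string s in _windows)
    {
        SendMessage(s, "notify-logout");
    }
    // no busy wait here: the MessageReceived handler completes the logout instead
}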
I have found a workaround. However, it does not really "solve" the problem in general, just in my case, which also only supports Internet Explorer.
function WindowCloseEventHandler()
{
    var app = // get silverlight app handle...
    app.doAutoSave();

    var params = 'whatever you need';
    var args = new Object();
    args.hwnd = window;
    window.showModalDialog('blocker.aspx', args, params);
}

function checkAutoSave()
{
    var app = // get silverlight app handle...
    return app.autosavecomplete();
}
Then in blocker.aspx we display a static "performing logout handlers" type message and do:
function timerTick()
{
    if (window.dialogArguments.hwnd.checkAutoSave()) {
        window.close();
    } else {
        setTimeout(timerTick, 500);
    }
}
And start the timer on window load.
The child windows' Silverlight apps are notified to start an autosave, and they notify the parent when they are done. We then poll the parent's status from a modal dialog, which blocks the termination of WindowCloseEventHandler(), which we have wired up to the onclose event of the body.
It's hacky and horrible, but it means Silverlight stays asynchronous, and since we're using a JavaScript timer the polling isn't loading the system.
Of course, if the user closes the modal dialogue, there is the potential for issues.