Background task in an ASP.NET web app - C#

I'm fairly new to C#, and recently built a small web app using .NET 4.0. The app has two parts: one is designed to run permanently and continuously fetches data from given resources on the web; the other accesses that data upon request to analyze it. I'm struggling with the first part.
My initial approach was to set up a Timer object that would execute a fetch operation (whatever that operation is doesn't really matter here) every, say, 5 minutes. I would create that timer in Application_Start and let it live after that.
However, I recently realized that the application is created and destroyed based on user requests (from my observation, it seems to be torn down after some period of inactivity). As a consequence, my background activity will stop and resume outside of my control, whereas I would like it to run continuously, with absolutely no interruption.
So here comes my question: is that achievable in a web app? Or do I absolutely need a separate Windows service for that kind of thing?
Thanks in advance for your precious help!
Guillaume

While doing this in a web app is not ideal, it is achievable, provided the site is always up.
Here's a sample: I'm creating a Cache item in global.asax with an expiration. When it expires, an event is fired; you can fetch your data (or do whatever work you need) in the OnRemove() callback.
Then you make a call to a page (preferably a very small one) that triggers code in Application_BeginRequest, which re-adds the Cache item with an expiration.
global.asax:

using System;
using System.Configuration;
using System.Net;
using System.Web;
using System.Web.Caching;

private const string VendorNotificationCacheKey = "VendorNotification";
private const int IntervalInMinutes = 60; // Expires after X minutes & runs the tasks

protected void Application_Start(object sender, EventArgs e)
{
    // Set a value in the cache with an expiration time
    CacheItemRemovedCallback callback = OnRemove;
    Context.Cache.Add(VendorNotificationCacheKey, DateTime.Now, null,
        DateTime.Now.AddMinutes(IntervalInMinutes), TimeSpan.Zero,
        CacheItemPriority.Normal, callback);
}

private void OnRemove(string key, object value, CacheItemRemovedReason reason)
{
    SendVendorNotification();

    // Need access to HttpContext so the cache item can be re-added, so call a page.
    // Application_BeginRequest will re-add the cache item.
    var siteUrl = ConfigurationManager.AppSettings.Get("SiteUrl");
    var client = new WebClient();
    client.DownloadData(siteUrl + "default.aspx");
    client.Dispose();
}

private void SendVendorNotification()
{
    // Do your tasks here
}

protected void Application_BeginRequest(object sender, EventArgs e)
{
    // Re-add the cache item if it doesn't exist
    if (HttpContext.Current.Request.Url.ToString().ToLower().Contains("default.aspx") &&
        HttpContext.Current.Cache[VendorNotificationCacheKey] == null)
    {
        CacheItemRemovedCallback callback = OnRemove;
        Context.Cache.Add(VendorNotificationCacheKey, DateTime.Now, null,
            DateTime.Now.AddMinutes(IntervalInMinutes), TimeSpan.Zero,
            CacheItemPriority.Normal, callback);
    }
}
This works well if your scheduled task is quick.
If it's a long-running process, you definitely want to keep it out of your web app.
As long as the first request has started the application, this will keep firing every 60 minutes even if the site has no visitors.

I suggest putting it in a Windows service. You avoid all the hoops mentioned above, the big one being IIS restarts. A Windows service also has the following benefits (a minimal service skeleton is sketched after this list):
Can automatically start when the server starts. If you are running in IIS and your server reboots, you have to wait until a request is made to start your process.
Can place the data-fetching process on another machine if needed.
If you end up load-balancing your website on multiple servers, you could accidentally have multiple data-fetching processes causing you problems.
Easier to maintain the code separately (single responsibility principle). The code is simpler when it only does what it needs to do and isn't also trying to fool IIS.
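A minimal sketch of such a service, assuming a classic .NET Framework Windows service project with a reference to System.ServiceProcess; DataFetchService and FetchData are hypothetical names standing in for your own code:

using System;
using System.ServiceProcess;
using System.Timers;

public class DataFetchService : ServiceBase   // hypothetical service class
{
    private Timer _timer;

    protected override void OnStart(string[] args)
    {
        _timer = new Timer(5 * 60 * 1000);          // fire every 5 minutes
        _timer.Elapsed += (s, e) => FetchData();
        _timer.AutoReset = true;
        _timer.Start();
    }

    protected override void OnStop()
    {
        _timer.Stop();
        _timer.Dispose();
    }

    private void FetchData()
    {
        // placeholder: your fetch/analyze logic goes here
    }

    public static void Main()
    {
        ServiceBase.Run(new DataFetchService());
    }
}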

Create a static class whose static constructor sets up a timer event.
However, as Steve Sloka mentioned, IIS has a timeout that you will have to adjust to keep the site going.
using System;
using System.Runtime.Remoting.Messaging;

public static class Variables
{
    static Variables()
    {
        m_wClass = new WorkerClass();

        // creates and registers an event timer
        m_flushTimer = new System.Timers.Timer(1000);
        m_flushTimer.Elapsed += new System.Timers.ElapsedEventHandler(OnFlushTimer);
        m_flushTimer.Start();
    }

    private static void OnFlushTimer(object o, System.Timers.ElapsedEventArgs args)
    {
        // determine the frequency of your update
        if (System.DateTime.Now - m_timer1LastUpdateTime > new System.TimeSpan(0, 1, 0))
        {
            // call your class to do the update
            m_wClass.DoMyThing();
            m_timer1LastUpdateTime = System.DateTime.Now;
        }
    }

    private static readonly System.Timers.Timer m_flushTimer;
    private static System.DateTime m_timer1LastUpdateTime = System.DateTime.MinValue;
    private static readonly WorkerClass m_wClass;
}

public class WorkerClass
{
    public delegate WorkerClass MyDelegate();

    public void DoMyThing()
    {
        m_test = "Hi";
        m_test2 = "Bye";

        // create an async call to do the work
        MyDelegate myDel = new MyDelegate(Execute);
        AsyncCallback cb = new AsyncCallback(CommandCallBack);
        IAsyncResult ar = myDel.BeginInvoke(cb, null);
    }

    private WorkerClass Execute()
    {
        // do my stuff in an async call
        m_test2 = "Later";
        return this;
    }

    public void CommandCallBack(IAsyncResult ar)
    {
        // this is called when your task is complete
        AsyncResult asyncResult = (AsyncResult)ar;
        MyDelegate myDel = (MyDelegate)asyncResult.AsyncDelegate;
        WorkerClass command = myDel.EndInvoke(ar);

        // command is a reference to the original class that invoked the async call
        // m_test will equal "Hi"
        // m_test2 will equal "Later"
    }

    private string m_test;
    private string m_test2;
}

I think you can achieve it by using a BackgroundWorker, but I would rather suggest you go with a service.

Your application context lives as long as your worker process in IIS is running. IIS has some default timeouts that control when the worker process will recycle (e.g. the idle timeout of 20 minutes, or the regular recycle interval of 1,740 minutes).
That said, if you adjust those settings in IIS, you should be able to keep the application alive. However, the other answers suggesting a service would work as well; it's just a matter of how you want to implement it. A sketch of adjusting those settings programmatically follows.
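A minimal sketch of changing those recycle settings in code, assuming the Microsoft.Web.Administration assembly is referenced, the code runs with administrative rights, and the app pool is named "DefaultAppPool" (all assumptions; you can just as well change the same values in IIS Manager):

using System;
using Microsoft.Web.Administration;

class DisableRecycleTimeouts
{
    static void Main()
    {
        using (var serverManager = new ServerManager())
        {
            // "DefaultAppPool" is a placeholder for your app pool name
            ApplicationPool pool = serverManager.ApplicationPools["DefaultAppPool"];

            // TimeSpan.Zero disables the idle timeout and the periodic recycle
            pool.ProcessModel.IdleTimeout = TimeSpan.Zero;
            pool.Recycling.PeriodicRestart.Time = TimeSpan.Zero;

            serverManager.CommitChanges();
        }
    }
}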

I recently built a file-upload feature for uploading Access files to the database (not the best way, but a temporary fix for a long-term issue).
I solved it by creating a background thread that ran the ProcessAccess function and went away when it completed.
Unless IIS has a setting that kills a thread after a set amount of time regardless of activity, you should be able to create a thread that calls a function that never returns. Don't use recursion, because the pile of open stack frames will eventually blow up in your face; just use a plain loop (a for(;;) or similar) so it keeps itself busy :) A rough sketch is below.
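A minimal sketch of that idea, assuming it is started once from Application_Start; FetchData is a placeholder for your own work, and the thread is marked as a background thread so it won't block the worker process from shutting down:

using System;
using System.Threading;

public static class PollingLoop
{
    private static Thread _worker;

    public static void Start()
    {
        _worker = new Thread(Run) { IsBackground = true };
        _worker.Start();
    }

    private static void Run()
    {
        for (;;)                                        // never returns
        {
            FetchData();                                // placeholder for your own fetch logic
            Thread.Sleep(TimeSpan.FromMinutes(5));      // wait before the next pass
        }
    }

    private static void FetchData()
    {
        // your work goes here
    }
}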

The Application Initialization Module for IIS 7.5 does precisely this type of init work. More details on the module are available here: Application Initialization Module

Related

C# System.Timers.Timer - Please, how do I make it stop?

I've got a Timer that's doing a 60 second countdown. When the ticks hit 60 seconds, it stops and disposes - no problem (I think). This is run in the context of a WebApi service. I need to be able to cancel the countdown from a UI, so I've exposed a method to handle this. Since the controller is transient (thanks Luaan) and, as Daniel points out, the app pool is not predictable, I need a way to send a "cancellable" countdown to clients. Ideas anyone?
[HttpGet]
public IHttpActionResult CancelCountdown()
{
    // DOES NOTHING BECAUSE THERE'S A NEW INSTANCE OF THE CONTROLLER
    timer.Stop();
    timer.Dispose();
    return Ok();
}

private void StartCountdown()
{
    // MAY BE A BAD SOLUTION BECAUSE THE APP POOL MAY RECYCLE
    timer.Interval = _timeIntervalInMilliseconds;
    timer.Elapsed += BroadcastToClients;
    timer.Start();
}

private void BroadcastToClients(object sender, EventArgs e)
{
    _elapsed += 1;
    if (_elapsed == _duration) // _duration is 60
    {
        timer.Stop();
        timer.Dispose();
        return;
    }
    _messageHub.Clients.All.shutdown(_elapsed);
}
It's kind of hard to provide an adequate solution without knowing what you're trying to accomplish with this, but I'll give it a shot.
As Luaan pointed out, controllers are designed to be essentially stateless, so you shouldn't put instance variables on them apart from their external dependencies, since each request creates a new instance of the controller class.
You could store the timer in a static dictionary indexed by a GUID, return the GUID from your controller, and use it as the cancellation token.
Something like:
private static Dictionary<Guid, Timer> timers = new Dictionary<Guid, Timer>();

public Guid StartCountdown()
{
    // MAY BE A BAD SOLUTION BECAUSE THE APP POOL MAY RECYCLE
    var timer = new Timer();
    timer.Interval = _timeIntervalInMilliseconds;
    timer.Elapsed += BroadcastToClients;

    var guid = Guid.NewGuid();
    timers.Add(guid, timer);
    timer.Start();
    return guid;
}

public IHttpActionResult CancelCountdown(Guid cancelationToken)
{
    // If the timer no longer exists or the user supplied a wrong token
    if (!timers.ContainsKey(cancelationToken)) return NotFound();

    var timer = timers[cancelationToken];
    timer.Stop();
    timer.Dispose();
    timers.Remove(cancelationToken);
    return Ok();
}
However this won't solve the problem with the app pool recycling. For a more robust solution, instead of using a timer, you could store the start date and time of each countdown in more permanent storage (say a SQL database, a NoSQL database, a Redis server, or whatever), and have a running thread, a global timer, or something like Hangfire, initialized on startup, that constantly checks your countdown storage. If enough time has passed to send a broadcast message, you send it and mark the countdown as finished. If a user wants to cancel the countdown, the controller simply reads the appropriate record and marks it as cancelled, and your running thread can ignore it. A rough sketch of this approach follows the considerations below.
If you go with this approach, you'll need to take some considerations into account:
If the check interval is too short, you could create a performance bottleneck from hitting permanent storage too often. If the interval is too long, the countdown won't be very precise.
To alleviate this, you could keep the countdown start times in permanent storage (in case the app pool recycles and you need to restore them) and also cache them in memory in a static variable for quicker access.
Please note that if you're working with a server farm instead of a single server, static variables won't be shared across instances.
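A rough sketch of that idea: here an in-memory ConcurrentDictionary stands in for the permanent storage (in a real system you would back it with a database), and a single global timer started from Application_Start checks it once per second. All class and member names are placeholders:

using System;
using System.Collections.Concurrent;
using System.Timers;

public static class CountdownStore
{
    private class Countdown
    {
        public DateTime StartedUtc;
        public bool Cancelled;
    }

    private static readonly ConcurrentDictionary<Guid, Countdown> _countdowns =
        new ConcurrentDictionary<Guid, Countdown>();
    private static readonly Timer _checker = new Timer(1000);          // check once per second
    private static readonly TimeSpan _duration = TimeSpan.FromSeconds(60);

    public static void Initialize()                  // call from Application_Start
    {
        _checker.Elapsed += (s, e) => CheckCountdowns();
        _checker.Start();
    }

    public static Guid Start()
    {
        var id = Guid.NewGuid();
        _countdowns[id] = new Countdown { StartedUtc = DateTime.UtcNow };
        return id;
    }

    public static void Cancel(Guid id)
    {
        Countdown c;
        if (_countdowns.TryGetValue(id, out c)) c.Cancelled = true;
    }

    private static void CheckCountdowns()
    {
        foreach (var pair in _countdowns)
        {
            if (pair.Value.Cancelled ||
                DateTime.UtcNow - pair.Value.StartedUtc >= _duration)
            {
                Countdown removed;
                _countdowns.TryRemove(pair.Key, out removed);
                if (!pair.Value.Cancelled)
                {
                    // placeholder: broadcast the shutdown message to clients here
                }
            }
        }
    }
}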

Threading Timer doesn't callback

I have several Machine objects which track whether they are online or offline, and a DateTime EndsAt indicating when they will turn offline if they are online. They are mapped to the database using EF. When I turn one on, I pass the number of seconds it should stay online and create a System.Threading.Timer to change its state back to offline when the time comes (EndsAt == DateTime.Now). Turning them on works fine; however, they don't turn off - turnOff() is never called. And on top of that, if it were called and the object changed its own properties, would those changes be saved by Entity Framework?
public class Machine
{
    private Timer timer = null;

    [Key]
    public int MachineId { get; set; }
    public bool Online { get; set; }
    public bool Occuppied { get; set; }   // referenced in turnOff()
    public DateTime EndsAt { get; set; }

    public void TurnOn(TimeSpan amount)
    {
        Debug.WriteLine("Turn on reached");
        if (!Online)
        {
            EndsAt = DateTime.Today.Add(amount);
            Online = true;
            setTimer();
        }
    }

    private void turnOff(object state)
    {
        Online = false;
        Occuppied = false;
        Debug.WriteLine("Timer ended!");
    }

    private void setTimer()
    {
        Debug.WriteLine("Timer being set");
        if (EndsAt.CompareTo(DateTime.Now) == 1)
        {
            timer = new Timer(new TimerCallback(turnOff));
            int msUntilTime = (int)((EndsAt - DateTime.Now).TotalMilliseconds);
            timer.Change(msUntilTime, Timeout.Infinite);
        }
        else
        {
            Debug.WriteLine("EndsAt is smaller than current date");
        }
    }
}
Controller method where TurnOn() is called:
[HttpPost]
public ActionResult TurnOn()
{
    bool isChanged = false;
    if (Request["machineId"] != null && Request["amount"] != null)
    {
        byte machineId = Convert.ToByte(Request["machineId"].ToString());
        int amount = Convert.ToInt32(Request["amount"].ToString());
        foreach (var machine in db.Machines.ToList())
        {
            if (machine.MachineId == machineId)
            {
                machine.TurnOn(TimeSpan.FromSeconds(amount));
                db.Entry(machine).State = EntityState.Modified;
                db.SaveChanges();
                isChanged = true;
            }
        }
    }
    if (isChanged)
        return new HttpStatusCodeResult(HttpStatusCode.OK);
    else
        return new HttpStatusCodeResult(HttpStatusCode.BadRequest);
}
The problem comes not from Entity Framework but from ASP.NET.
The best way I can describe it is this: imagine your page request in ASP.NET is a console application. For every new request the application starts up, handles the request and responds to the user, waits a little while for another request to come in, and then exits the Main() function.
If you created a Timer in that kind of application, once that little while runs out and Main() returns, your timer will not be running anymore, and the thing you were waiting for will never happen. IIS does this exact process with AppDomain recycling: if no requests come in, it will shut down the AppDomain and kill your timer.
There are two ways I know of to handle this problem:
The first way is to make a second application that runs as a Windows service outside of IIS and is always running; it is what holds the timer. When you want to run any kind of long-running operation that will outlive a page request, your web app uses WCF or some other technology to tell the service to start the timer, and when the timer is done the service executes whatever operation you wanted done.
The second way is to save the timer request in a database, and then, in the background or before every request, check the database of events and see if any need to be executed. Libraries like Hangfire make this process easy; they also have tricks to keep the AppDomain alive longer or wake it back up if it shuts down (often they use two websites that talk to each other, each keeping the other one alive). A sketch of that approach is below.
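A minimal sketch of the second approach with Hangfire, assuming Hangfire is already configured at startup with persistent job storage and the turn-off logic is moved into a method Hangfire can call; MachineJobs, TurnOffMachine, and MachinesContext are hypothetical names, not part of the original code:

using System;
using Hangfire;   // assumption: Hangfire configured with persistent storage

public static class MachineJobs
{
    // Hypothetical job method: loads the machine with a fresh context and saves the change,
    // so it does not depend on the controller or the original entity instance being alive.
    public static void TurnOffMachine(int machineId)
    {
        using (var db = new MachinesContext())   // assumption: your EF DbContext type
        {
            var machine = db.Machines.Find(machineId);
            if (machine != null)
            {
                machine.Online = false;
                db.SaveChanges();
            }
        }
    }
}

// In the controller, instead of creating a Timer on the entity:
//   machine.Online = true;
//   machine.EndsAt = DateTime.Now.Add(TimeSpan.FromSeconds(amount));
//   db.SaveChanges();
//   BackgroundJob.Schedule(() => MachineJobs.TurnOffMachine(machine.MachineId),
//                          TimeSpan.FromSeconds(amount));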
Even though this specific question has been answered, here's some related discussion I hope can be helpful when a timer callback isn't firing.
Important considerations when using Threading.Timer
1.) Timer is subject to garbage collection. Even if active, it may be collected if nothing holds a reference to it (see the sketch after this list).
2.) .NET has many different types of timers, and it's important to use the right kind in the right way because threading is involved. Use Forms.Timer for Windows Forms, Threading.Timer (or wrap it in Timers.Timer; there is debate about thread safety), or Web.UI.Timer with ASP.NET for web page postbacks.
3.) The callback method is defined when the timer is instantiated and cannot be changed.
Timer-related tools
1.) You can use Thread.Sleep to release CPU resources and place your thread in a WaitSleepJoin state, which is essentially stopped.
2.) Sometimes a Task can be used along with or instead of a timer.
3.) Stopwatch can be used in different ways, for example with an empty loop.
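A small sketch of point 1, keeping the Threading.Timer in a field so it stays reachable and won't be garbage collected while it's still needed; the class and callback names are made up for illustration:

using System;
using System.Threading;

public class Poller
{
    // Holding the timer in a field keeps it alive; a timer held only in a local
    // variable can be collected after the method returns, silently stopping callbacks.
    private readonly Timer _timer;

    public Poller()
    {
        _timer = new Timer(OnTick, null,
                           TimeSpan.Zero,                 // start immediately
                           TimeSpan.FromSeconds(30));     // then every 30 seconds
    }

    private void OnTick(object state)
    {
        // callback runs on a thread-pool thread
        Console.WriteLine("tick at {0:T}", DateTime.Now);
    }
}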

How to create a scheduled long running process using windows service in c#

I want to create a Windows service that performs some really long and heavy work. The code is inside the OnStart method, like this:
protected override void OnStart(string[] args)
{
    System.IO.File.WriteAllText(
        @"C:\MMS\Logs\WinServiceLogs.txt",
        DateTime.Now + "\t MMS Service started."
    );
    this.RequestAdditionalTime(5 * 60 * 1000);
    this.RunService();
}
this.RunService() sends a request to a WCF service library hosted on IIS. It does some really long processing, ranging from 1 to 20 minutes depending on the data. The service I'm writing is supposed to be scheduled to run every day in the morning. So far it runs and works fine, but when the work takes more than a few seconds or minutes, it generates a timeout exception. This leaves the Windows service in an unstable state, and I can't stop or uninstall it without restarting the computer. Since I'm trying to create an automated system, this is an issue.
I did call this.RequestAdditionalTime(), but I'm not sure whether it's doing what it's supposed to. I no longer get the timeout error message, but now I don't know how to schedule it to run every day. If the exception occurs, then it won't run the next time. I found several articles and SO questions, but there's something I'm missing that I can't figure out.
Should I create a thread? Some articles say I shouldn't put heavy work in OnStart, so where should the heavy code go? Right now, when the service starts, it does this huge data processing, which leaves the Windows service status at "Starting", and it stays there for a long time until the program either crashes due to the timeout or completes successfully. How can I start the service and have the status set to Running while the code does its data processing?
Your situation might be better suited to a scheduled task, as Lloyd said in the comments above. But if you really want to use a Windows service, this is what you would need to add/update in your service code. This will allow your service to report as started and not time out on you. You can adjust the timer interval to suit your needs.
private Timer processingTimer;

public YourService()
{
    InitializeComponent();

    // Initialize the timer
    processingTimer = new Timer(60000); // Set to run every 60 seconds
    processingTimer.Elapsed += processingTimer_Elapsed;
    processingTimer.AutoReset = true;
    processingTimer.Enabled = true;
}

private void processingTimer_Elapsed(object sender, ElapsedEventArgs e)
{
    // Check the time
    if (timeCheck && haventRunToday)
    {
        // Run your code
        // You should probably still run this as a separate thread
        this.RunService();
    }
}

protected override void OnStart(string[] args)
{
    // Start the timer
    processingTimer.Start();
}

protected override void OnStop()
{
    // Check to make sure that your code isn't still running... (if separate thread)
    // Stop the timer
    processingTimer.Stop();
}

protected override void OnPause()
{
    // Stop the timer
    processingTimer.Stop();
}

protected override void OnContinue()
{
    // Start the timer
    processingTimer.Start();
}
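As the comments in the answer suggest, the long call itself should run on a separate thread so the Elapsed handler returns quickly. A minimal sketch of that, assuming .NET 4.5+ for Task.Run (on .NET 4.0 you could use Task.Factory.StartNew or ThreadPool.QueueUserWorkItem instead); timeCheck and haventRunToday are the same pseudo-checks as in the answer above:

// requires: using System; using System.Threading.Tasks; using System.Timers;
private void processingTimer_Elapsed(object sender, ElapsedEventArgs e)
{
    if (timeCheck && haventRunToday)
    {
        // Hand the long-running work off to the thread pool so the timer
        // callback (and service control requests) are not blocked.
        Task.Run(() =>
        {
            try
            {
                this.RunService();
            }
            catch (Exception ex)
            {
                // log the failure instead of letting it take the process down
                System.IO.File.AppendAllText(@"C:\MMS\Logs\WinServiceLogs.txt",
                    DateTime.Now + "\t RunService failed: " + ex);
            }
        });
    }
}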

Method called twice in same moment

I'm working on a Windows Forms application and fighting a very harsh error. The application is supposed to run on a local machine and handle requests from a server application. The client application looks like this:
public Reader mr_obj;

public Form1()
{
    mr_obj = new MyReader.Reader(7137);
    mr_obj.UserEvent += new ReaderEvent(UserEvent);
}

private void UserEvent(UserEvent e, long threadID)
{
    Thread.Sleep(1000);
    SafeSomethingToDB();
}
The Reader() object connects the client application to the server application. After this, the server application is able to trigger the UserEvent() method in the client application. The problem now is that the client application, which handles the UserEvents, crashes if the UserEvent() method gets triggered twice within one second.
(It's actually not crashing, just hanging until you kill the task; a try/catch won't return an error.)
What I've tried so far is to delegate the Thread.Sleep() and SafeSomethingToDB() calls to another thread. This doesn't work because the server application does not wait until the thread is finished, so it does not find the data in the DB because it isn't waiting the 1 second...
The same problem happens when I do it with background workers.
Is there a way to handle these two triggers, which come from the same server application, in parallel at the same time?
Any suggestions are very appreciated.
EDIT: I think locking the method does not cause the application to process both triggers at the same time. To make this visible I've tried this:
private void UserEventHandler(UserEvent e, long threadID)
{
    lock (_lockObject)
    {
        MessageBox.Show("Messagebox 1");
        MessageBox.Show("Messagebox 2");
    }
}
When the first request triggers UserEvent(), "Messagebox 1" appears. If you press OK, "Messagebox 2" appears. But if UserEvent gets triggered a second time while "Messagebox 2" is still open, "Messagebox 1" does not appear again; instead the application starts hanging. Shouldn't "Messagebox 1" appear again for the second trigger of UserEvent() if the two triggers are really being processed at the same time? So the two triggers are not being performed in parallel, or am I mistaken here?
Without knowing why you do the Sleep or what exactly SafeSomethingToDB does and what causes your problems, try to synchronize the calls:
private readonly object _lockObject = new object();

private void UserEvent(UserEvent e, long threadID)
{
    lock (_lockObject)
    {
        Thread.Sleep(1000);
        SafeSomethingToDB();
    }
}
I think a simple lock for synchronization will work for you; try this:
public Reader mr_obj;
private static readonly object sync = new object();

public Form1()
{
    mr_obj = new MyReader.Reader(7137);
    mr_obj.UserEvent += new ReaderEvent(UserEvent);
}

private void UserEvent(UserEvent e, long threadID)
{
    lock (sync)
    {
        SafeSomethingToDB();
    }
}
As you write in the comments, if SafeSomethingToDB() is called a second time before the first call has finished, then it crashes. So in other words: SafeSomethingToDB() is not re-entrant.
What you can do is use a Mutex (which stands for mutual exclusion), which defines a "critical section" in your code, meaning a piece of code that only one thread can execute at any one time.
For instance:
private static Mutex mutex = new Mutex();

public void SafeSomethingToDB()
{
    mutex.WaitOne();   // wait until it is safe to enter the critical section
    try
    {
        // Critical section begins here
        DoWorkAndStuff();
    }
    finally
    {
        mutex.ReleaseMutex();   // indicate the end of the critical section
    }
}
For more about System.Threading.Mutex, see http://msdn.microsoft.com/en-us/library/system.threading.mutex(v=vs.110).aspx.

How can I query a database on a second thread while my web application continues running?

I want to start a new thread to query a database while my web application continues running. I was under the impression that by using threads I can run the querying process independently while the normal web application page cycle carries on, but I could be wrong.
public class DbAsyncQuery
{
    Dictionary<string, object> serviceResults = new Dictionary<string, object>();

    public void UpdateBillingDB()
    {
        QueryAllServices();
        foreach (KeyValuePair<string, object> p in serviceResults)
        {
            IEnumerable results = (IEnumerable)p.Value;
            IEnumerable<object> sessions = results.Cast<object>();
            DbUtil.MakeBillingDBEntry(sessions, p.Key);
        }
    }

    public static string[] servicesToQuery = new string[] // Must go in config file ultimately
    {
        "xxx.x.xxx.xx"
    };

    public delegate void Worker();

    private Thread worker;

    public void InitializeThread(Worker wrk)
    {
        worker = new Thread(new ThreadStart(wrk));
        worker.Start();
    }

    public void InitializeQuery()
    {
        Worker worker = QueryAllServices;
        InitializeThread(worker);
    }

    private void QueryAllServices()
    {
        Dictionary<string, DateTime> lastEntries = DbUtil.GetLastEntries();
        foreach (string ip in servicesToQuery)
        {
            string fullServicePath =
                "http://" + ip + ":800/MyWebService.asmx";

            //object[] lastEntry = new object[] { lastEntries[ip] };
            object[] lastEntry = new object[] { new DateTime(2011, 1, 1, 0, 0, 0) };
            object obj = WebServiceHandler.CallWebService
                (fullServicePath, "MyWebService", "GetBillingDBEntries", lastEntry);
            serviceResults.Add(ip, obj);
        }
    }
}
It seems to basically stall and wait for the query to finish before loading the page (the query can return thousands of rows, so it takes a while). I put this in the following section of Global.asax:
protected void Application_Start(object sender, EventArgs e)
{
    DbAsyncQuery query = new DbAsyncQuery();
    query.UpdateBillingDB();
}
This is one of my first web pages, and I'm new to threading. I understand there is a difference between creating your own thread and using the ThreadPool. In this method, I am using my own thread, I think. Not sure if that's the best way.
Any ideas? The querying process can be completely independent, and it's only going to run on a schedule (every 8 hours or so). The user doesn't need up-to-the-minute data, so it can take a while. I just don't want the site to wait for it to finish, if that's possible.
Since your database process runs on a schedule, keep it out of the web application. Create a console application that does this work and use Windows Scheduled Tasks to run it every 8 hours (a minimal sketch of such a console app follows the reasons below).
Put the code that is common to the database procedure and the web site in a shared DLL.
The reasons you want to separate this are:
The web app won't run if there are no clients. If you want this to run on a schedule, take control of the schedule; don't leave it up to clients hitting the website to trigger it.
Separate resources. Since this routine is long-running, keep the resources it uses separate from the web site. A console app also fully cleans up any resources it uses when it exits.
Simpler to program. While it may sound more complicated at first to manage separate products, separating them and concentrating on things one at a time will actually simplify development.
Run to completion. The ASP.NET worker process recycles based on many conditions, including a set schedule, and a long-running process won't keep it from recycling. If it recycles while your process is running, your process won't complete. Separating it into a scheduled task ensures it's allowed to run to completion regardless of the ASP.NET worker process state or recycles.
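A minimal sketch of that console app, assuming the DbAsyncQuery class from the question is moved into the shared DLL; the Scheduled Task would simply run the produced .exe every 8 hours:

using System;

class BillingUpdateJob
{
    static int Main()
    {
        try
        {
            // Same class as in the question, now referenced from the shared DLL.
            var query = new DbAsyncQuery();
            query.UpdateBillingDB();    // run synchronously; no extra thread needed here
            return 0;
        }
        catch (Exception ex)
        {
            Console.Error.WriteLine("Billing update failed: " + ex);
            return 1;                   // non-zero exit code so Task Scheduler records the failure
        }
    }
}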
