Run a single while(true) on Azure - c#

I'm new to Azure and I need to run DoSomeWork() in Azure. DoSomeWork() should run periodically, every N minutes or so. However, DoSomeWork() must not execute twice at the same time; in other words, no DoSomeWork() execution may start before the prior one has finished.
I've taken a look at Azure WebJobs, particularly continuous Azure WebJobs. This seems like the way to go, but it's not clear how to start, especially with the starter code that you get in VS:
static void Main()
{
    var config = new JobHostConfiguration();

    if (config.IsDevelopment)
    {
        config.UseDevelopmentSettings();
    }

    var host = new JobHost(config);
    // The following code ensures that the WebJob will be running continuously
    host.RunAndBlock();
}
There is also a Functions class that takes a QueueTrigger-decorated parameter as input, but I don't want the code to be triggered by queue messages.
How can I get a simple Console.WriteLine("hello world") running e.g. every minute but without overlapping? If Azure Web Jobs is not the way to go, what should I use instead (should be Azure-based)?

As mentioned in the comments, Azure WebJobs supports the TimerTrigger feature (scheduled WebJobs), which can run every N minutes as per your need.
It's quite simple with an Azure WebJob. For example, in Visual Studio, create a console project -> add this line of code: Console.WriteLine("hello world") -> build the project -> zip all the necessary files, including the .exe, into a .zip file -> finally, upload the .zip file to your WebJob and set the schedule (e.g. execute the code every 5 minutes).
Please refer to this doc for more details about creating a scheduled WebJob.
You can also consider other Azure services that support a timer trigger, like Azure Functions.
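The console project from those steps can be as small as the sketch below. Note that with a scheduled WebJob the schedule itself lives in the WebJob configuration (or a settings.job CRON expression), not in the code; DoSomeWork here is a placeholder for your real work:

```csharp
using System;

// The work the scheduled WebJob performs. A scheduled WebJob is just a console
// app that does its work once and exits; the platform decides when it runs again.
string DoSomeWork() => "hello world";

Console.WriteLine(DoSomeWork());
```

As far as I know, a scheduled WebJob is not started again until the previous run has exited, so this model also gives you the no-overlap behavior you asked for.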

WebJobs can be complicated to set up and maintain. Especially when dealing with a CI/CD pipeline. I have a few running right now and am getting ready to move them into one of the following more dependable and maintainable solutions:
The way we set up scheduled work is to use an Azure Function that runs via CRON schedule. It's super dependable and durable since it's managed by Azure. You just set it up, throw your code up and the rest is up to Azure to make sure it fires off when you configured it to.
If you want to do this in your own application, take a look at running a background service in an ASP.NET Core application. You can run a timer in the background service that will fire off and do some work. Keep in mind that if your app scales horizontally, you will be running two timers, which probably isn't good in your situation.
You could do something fancy like setting up an Azure Function to hit an endpoint on your Web API at a scheduled time. Then you could hand the work to a BackgroundService, which is a singleton, so you can block a second request if your job is currently running.
We tend to go the last route. Azure fires off a timer, the function executes, sends a message to an endpoint, the endpoint places work in the background.
There are tons of options outside of what I mentioned, but these are the only ones I have had the privilege to architect.
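The "block a second request if a job is already running" part from the options above can be implemented with a simple guard. This is a minimal, Azure-free sketch of the idea (TryRunJob is a made-up name for illustration):

```csharp
using System;
using System.Threading;

int running = 0; // 0 = idle, 1 = an execution is in progress

// Runs the work unless a previous run is still going; skips (returns false)
// rather than letting two executions overlap.
bool TryRunJob(Action work)
{
    if (Interlocked.CompareExchange(ref running, 1, 0) != 0)
        return false; // a prior execution hasn't finished yet
    try { work(); }
    finally { Interlocked.Exchange(ref running, 0); }
    return true;
}

bool started = TryRunJob(() => Thread.Sleep(10)); // stand-in for DoSomeWork()
Console.WriteLine(started);
```

The same guard works whether the trigger is an HTTP request, a timer, or a queue message; the key point is that an overlapping trigger is skipped instead of queued.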


windows service equivalent using .net core hosted in AWS C#

I have several Windows services that I have written that act as middleware between 2 separate vendors. The service sends requests to the first vendor, processes the response, and sends formatted data to the second vendor. These processes run every X seconds in the background. The services use TCP and JSON for transport and data.
These services generally do nothing outside the above description except for logging.
I have been writing with windows/.net for eons and am attempting to venture into either AWS or Azure to run these particular assets. I have only scratched the surface on Lambda and Functions and already I feel overwhelmed.
Any direction to get started would be greatly appreciated.
You can do it using Azure Functions + Time Trigger. Here's a sample:
[FunctionName("TimerTriggerCSharp")]
public static void Run([TimerTrigger("0 */5 * * * *")] TimerInfo myTimer, ILogger log)
{
    if (myTimer.IsPastDue)
    {
        log.LogInformation("Timer is running late!");
    }
    log.LogInformation($"C# Timer trigger function executed at: {DateTime.Now}");
}
Notice that it has a CRON expression to specify the frequency of the trigger.
https://learn.microsoft.com/en-us/azure/azure-functions/functions-bindings-timer
PS: As far as I know, there's no time trigger on AWS Lambda.
If you are looking for something integrated, you can look at Hangfire. You can run it as a separate .NET Core app, which makes you cloud-agnostic; you can still run it using a serverless option from either Azure or AWS. It is basically a task runner that lets you schedule recurring code to run, all from C#. The company I work for uses it to send and process e-mail messages as well as to pull and import data from vendors.
I set it up in an empty Dot Net Core project, that way I can give it its own resources and not slow end-users interactions. It can be attached to an existing application, as well.
You can find more info here: https://www.hangfire.io/
Hope this helps.

How to schedule c# script to run on AWS?

I have done some research on running a CRON job on AWS but did not find any good document explaining how to run a C# script on AWS periodically.
I found that we could do it with CloudWatch + Lambda, but that will not help in my case because Lambda has a maximum 5-minute timeout.
Then I did more research and found 'AWS Elastic Beanstalk', which has two environment options:
Web server environment
Worker environment
But it seems we can't create a .NET C# script for the worker environment.
So what are another options to create cron jobs on AWS?
I do this by creating a self-hosted c#/.net webapi as a windows service running on my ec2 instance.
The main loop of this webapi/windows service polls a dedicated SQS queue for 'jobs' to complete.
I then use aws cloudwatch events to put those jobs into the SQS queue at the desired interval. i.e. every hour, every day, every 5 minutes etc.
It's not as easy as just pasting a C# script someplace and telling it when to run, but it does give you a nice, flexible framework for hanging lots of custom jobs that take a long time to run and need to run on a regular basis.
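The main loop described above boils down to "poll the queue, process whatever is there, sleep, repeat". Here is a self-contained sketch of that pattern, with a ConcurrentQueue standing in for the SQS queue (a real service would call the AWS SDK's receive-message API instead, and the job names are made up):

```csharp
using System;
using System.Collections.Concurrent;

// Stand-in for the SQS queue that CloudWatch Events writes job messages into.
var jobs = new ConcurrentQueue<string>();
jobs.Enqueue("hourly-report");
jobs.Enqueue("daily-cleanup");

int processed = 0;

// The service's main loop: drain whatever jobs are waiting. The real service
// would then sleep and poll again; here we simply stop when the queue is empty.
while (jobs.TryDequeue(out var job))
{
    Console.WriteLine($"processing {job}");
    processed++;
}

Console.WriteLine($"processed {processed} jobs");
```

Because the schedule lives in CloudWatch rather than in the service, adding a new recurring job is just a new event rule plus a new message type to handle in the loop.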

Scaling up of Azure Webjobs

I have an azure web app with 2 webjobs. I need to scale up just one webjob. Is there any way of scaling it up/out independent of the site and the other webjob?
Assuming your WebJobs are continuous,
From https://github.com/projectkudu/kudu/wiki/WebJobs-API
If a continuous job is set as singleton it'll run only on a single instance opposed to running on all instances. By default, it runs on all instances.
To set a continuous job as singleton during deployment (without the need for the REST API) you can simply create a file called settings.job with the content: { "is_singleton": true } and put it at the root of the (specific) WebJob directory.
Set any WebJobs that you'd like to stay on one instance as singletons.
The rest of them will scale automagically with your App Service Plan.
Triggered WebJobs only run on one instance.
Source: same URL as above.
Invoke a triggered job
Note: if the site has multiple instances, the job will run on one of them arbitrarily.

MVC - Run scheduled task even if nobody logs into app

I have a hosted ASP.NET MVC5 web app. Is there any way to get the app to run a "scheduled" task even if nobody logs into the app? Or is my only choice to use the App Start when the app first runs?
I need to send an email to my users first thing each morning. Is there a reasonable way to do this with the MVC5 app or am I going to have to set up a Windows service?
Most people recommend a windows service. However, a reasonable way to do this would be using a scheduling framework like Quartz .NET
http://quartznet.sourceforge.net/
I prefer this because my jobs/schedules travel with my application, and when I deploy on a new box I don't have to set up a service or anything; everything is embedded in the MVC5 application. Quartz can also sync between servers via a database if you have a load-balanced environment (as I do), and it works well enough for me. Using the DB as the job store also ensures that jobs persist across deployments and application restarts, because by default jobs live in memory.
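Under the hood, the heart of any such in-app scheduler is just "compute the delay until the next run time, sleep, fire". A minimal self-contained sketch of that idea follows (Quartz adds job stores, clustering, and misfire handling on top of this; DelayUntil is a made-up helper):

```csharp
using System;

// Delay from 'now' until the next occurrence of 'timeOfDay': today if it is
// still ahead of us, otherwise the same time tomorrow.
TimeSpan DelayUntil(TimeSpan timeOfDay, DateTime now)
{
    var next = now.Date + timeOfDay;
    if (next <= now) next = next.AddDays(1);
    return next - now;
}

// E.g. schedule the morning email for 06:00, checked at 05:00:
var delay = DelayUntil(new TimeSpan(6, 0, 0), new DateTime(2024, 1, 1, 5, 0, 0));
Console.WriteLine(delay); // 01:00:00

// In a real app the scheduler loops forever, roughly:
//   while (true) { await Task.Delay(DelayUntil(...)); SendEmails(); }
```

A framework like Quartz is still worth it in production because a bare loop like this loses its state on every app-pool recycle, which is exactly the problem the DB-backed job store solves.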
I would not put an email-sending job in an MVC application. If you think about it, an MVC application's concern is the request-response model; in which scenario would it start a new job?
If you have access to your users' emails, just create a simple Console Application or a Windows Service to do that work, and schedule it using the Windows Task Scheduler or any other task-scheduling tool.
In addition, if you're enforced to do it within your MVC application:
Read this nice old post by Jeff Atwood about how to create a job inside an ASP.NET application: Easy Background Tasks in ASP.NET
Create and schedule a call to an Action in your MVC application that will do that email sending work
Use the Quartz.NET third-party library to create scheduled background tasks in web applications
Don't use a Windows service, instead you should use the Windows Task Scheduler.
Just create a Console Application and register it in the scheduler.
You can create a singleton in your Application_Start() that launches itself every 24h and then sends the emails. This involves blocking that particular thread for 24 hours.
It's a very bad approach, but it seems you don't have any other option when you're on shared hosting with no access to the actual system.
I think the question comes down to, do you need the ability to start/stop the service and have the webapp still running?
I personally try to avoid setting up a windows service because it adds another layer of complexity that can break/not work. If you use quartz or just a basic timer in your web app, the scheduling is guaranteed to run when your app runs.
With in-app scheduling you can install your webapp with a simple file copy.
Sure, there are situations when you need to do heavy background jobs; then you might want to consider a separate batch-job project with a Windows service. But for sending out a couple of emails, just use in-app scheduling.
The common way to do this is with the Windows Task Scheduler.
The problem with calling START or some other command-line approach is that the opened browser may never close, or it might close when the task times out.
I wrote a console app to make a call to a website and wait for a response.
Imports System.Diagnostics
Imports System.Net

Module Module1
    Sub Main()
        Dim sw As New Stopwatch
        sw.Start()
        Try
            Dim args = Environment.GetCommandLineArgs.ToList
            If args.Count = 1 Then Throw New Exception("WebsiteWaitResponse url [user] [password]")
            Console.WriteLine("{0:c} WebsiteWaitResponse {1:g}", sw.Elapsed, Now())
            Dim web As New WebClient
            ' Both user and password must be supplied to use NTLM credentials.
            If args.Count > 3 Then web.Credentials = New NetworkCredential(args(2), args(3))
            Dim results = web.DownloadString(args(1))
            Console.WriteLine(results)
        Catch ex As Exception
            Console.WriteLine(ex.Message)
        End Try
        Console.WriteLine("{0:c} WebsiteWaitResponse Complete", sw.Elapsed)
        End
    End Sub
End Module
Create a scheduled task which calls this app with command line parameters as follows:
MyConsoleApp url [userid] [password]
where [userid] and [password] are optional and used for NTLM authentication
MyConsoleApp http://mywebsite/controller/function mydomain\myuser mypassword
Hangfire to the rescue. This is perhaps the easiest and most reliable way to achieve this task. It comes with a dashboard that makes managing and monitoring your background tasks effortless.
Please check the URL below, which explains how to make sure that your web application is always up, even if nobody is logged into your application or it has been idle for a long time.
You just need to configure your server as described there and make sure to start your jobs.
http://developers.de/blogs/damir_dobric/archive/2009/10/11/iis-7-5-and-always-running-web-applications.aspx

Spawning and executing a Worker process in Azure

The google has really failed me on this one. I am new to Azure and am only intermediate at .NET
I have an Azure solution going and I've written some code in a Web Role which runs great. What I would like to do now is move some of this code into an Azure Worker, which will be initialized by a controller function in the Web Role
What on earth do I need to do to get this going locally? I have created the Worker project within the SLN. I just need to know how to fire it up and run it.
I think part of my problem is I am assuming these workers behave like Heroku workers... is this the case? Because what I need is something like a queue system (a bunch of "worker tasks" in one big queue).
A lot of the links I've found for tutorials seem to tap dance around how to actually initialize the process from a Web Role.
Workers in Windows Azure are not tasks; they're entire VMs. To make your life easier, memorize this little detail: Web Role instances are Windows Server 2008 with IIS running, and Worker Roles are the same thing but with IIS disabled.
When you added that worker role to your project, you actually now have a new set of virtual machines running (at least one, depending on the instance count you set). These VMs have their own OnStart() and Run() methods you can put code into, for bootstrapping purposes.
If you grab the Windows Azure training kit, you'll see a few labs that show how to communicate between your various role instances (a common pattern being the use of Windows Azure queues). There's a good example of background processes with the Guestbook hands-on lab (the very first lab).
More info on this, as I've gotten it going now:
If you're coming from a Heroku background, then an Azure Worker is more or less the function in Rails that you'd actually execute with the queue. Unlike Heroku queued operations, an Azure Worker just runs endlessly and keeps polling for new stuff to do... hence the templated sleep(10000) in the Run() function.
The most conventional way I've found to make a Web and a Worker talk to each other is via queue messages through Azure Service Bus, which is currently NOT emulated, meaning you need a functioning Azure account to make this work. It will work even if you are running locally; you just need internet access.
A ServiceBus message can pass an entire object over to the Worker (so long as the Worker proj has the right dependencies in it), so that's kind of nice.
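The "pass an entire object" part is ordinary serialization of the message body. Here is a self-contained sketch of the round trip using System.Text.Json (the real code would wrap the string in a Service Bus message, and a shared WorkItem class referenced by both the Web and Worker projects would be more typical than the dictionary used here):

```csharp
using System;
using System.Collections.Generic;
using System.Text.Json;

// Web role side: serialize the work item into the message body. A dictionary
// keeps this sketch self-contained; a shared POCO type is the usual choice.
var work = new Dictionary<string, string> { ["Id"] = "42", ["Action"] = "resize-image" };
string body = JsonSerializer.Serialize(work);

// Worker side: deserialize the body back into an object and act on it.
var received = JsonSerializer.Deserialize<Dictionary<string, string>>(body)!;
Console.WriteLine($"{received["Id"]}: {received["Action"]}");
```

This is also why the Worker project needs the right dependencies: both sides must agree on the type being serialized, or the deserialization on the Worker side will fail.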
I think you're just having trouble starting the azure emulator along with your worker/web roles? Just set the azure configuration project as the start up project and run that. It'll boot up the emulator along with all your roles.
