Can Hangfire Handle Changes to Scheduled Tasks Without Redeployment? - C#

I have been playing around with Hangfire in an ASP.NET MVC application. I have gotten it to compile and schedule fire-and-forget tasks, but I am surprised that I cannot add or remove jobs while the program is running. Is it true that Hangfire cannot dynamically schedule tasks at runtime? Is there a well-known framework that lets me schedule tasks after the application has been compiled and deployed, without having to change the C# code every time I want to add a task?
I have also researched Quartz.NET, and it seems to have the same issue.
EDIT:
Windows Task Scheduler can allow tasks to be scheduled with a GUI, and UNIX's cron can have tasks added or removed by editing a file, but I'm looking for some sort of application running on Windows that would allow the user to add or remove tasks after the application has been deployed. I do not want to re-compile the application every time I want to add or remove tasks.

As asked, the question seems to rest on a misunderstanding of the meaning of "dynamic... during runtime". The answer is "yes": Hangfire can change tasks without redeployment (but that doesn't appear to be what you're really looking for).
Hangfire will add a dashboard UI to your application if you configure it to do so, but it is not an end-to-end task management application itself. It is designed to give your application the ability to schedule work, and have that work completed in a very disconnected way from the point of invocation--it may not even be completed on the same machine.
It is limited to invoking .NET code, but by definition this fulfills your stated requirement to "dynamically schedule tasks during runtime." This can be done in response to any event within your application that you like. Tasks can also be removed, updated and cancelled.
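For illustration, a minimal sketch against Hangfire's static API (SendReport and CleanUp are placeholder methods, not from the question):
// Schedule, cancel, and (re)define jobs while the application runs --
// no recompile or redeploy needed.
var jobId = BackgroundJob.Schedule(() => SendReport("weekly"), TimeSpan.FromHours(1));
BackgroundJob.Delete(jobId);                    // cancel the scheduled job again

RecurringJob.AddOrUpdate("nightly-cleanup", () => CleanUp(), Cron.Daily());
RecurringJob.RemoveIfExists("nightly-cleanup"); // remove the recurring job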
(Post-edit) You're correct: any scheduling UI, or deserialization of a task-file format, you'll have to write yourself. If you are looking for a tool that provides a UI and/or task file out of the box, you may need to move up to a commercial product like JAMS. (Disclaimer: even JAMS may not have the capabilities you require; I don't have direct experience with the product, but folks I've worked with have mentioned it in a positive light.)

Create an API to schedule jobs dynamically at runtime. Your API can accept input via HTTP GET/PUT/POST/DELETE etc., then run anything within your code upon the API call, using the data that you give it.
For example, say you have a hard-coded Task A and Task B in your code and you want to schedule them to run dynamically with different parameters. You can create an API that will run the desired task at the specified time, using the parameters that you choose.
[HttpPost]
public IHttpActionResult Post([FromBody]TaskDto task)
{
    var job = "";
    if (task.TaskName == "TaskA")
    {
        job = BackgroundJob.Schedule(() => RunTaskA(task.p1, task.p2), task.StartTime);
    }
    if (task.TaskName == "TaskB")
    {
        job = BackgroundJob.Schedule(() => RunTaskB(task.p1, task.p2), task.StartTime);
    }
    // Optionally chain a continuation job onto the one we just scheduled.
    if (!string.IsNullOrWhiteSpace(task.ContinueWith) && !string.IsNullOrWhiteSpace(job))
    {
        if (task.ContinueWith == "TaskB")
        {
            BackgroundJob.ContinueWith(job, () => RunTaskB(task.p3, task.p4));
        }
        if (task.ContinueWith == "TaskA")
        {
            BackgroundJob.ContinueWith(job, () => RunTaskA(task.p3, task.p4));
        }
    }
    return Ok(job);
}
Then you can call the API using a JSON POST (example using JavaScript):
// Send JSON data to start a scheduled task via POST
var xhr = new XMLHttpRequest();
var url = "https://www.example.com/api/scheduletask";
xhr.open("POST", url, true);
xhr.setRequestHeader("Content-type", "application/json");
xhr.onreadystatechange = function () {
    if (xhr.readyState === 4 && xhr.status === 200) {
        var json = JSON.parse(xhr.responseText);
    }
};
var data = JSON.stringify({
    "TaskName": "TaskA", "ContinueWith": "TaskB",
    "StartTime": "2-26-2018 10:00 PM", "p1": "myParam1", "p2": true,
    "p3": "myParam3", "p4": false
});
xhr.send(data);
And for completeness, here is the TaskDto class used in this example:
public class TaskDto
{
    public string TaskName { get; set; }
    public string ContinueWith { get; set; }
    public DateTime StartTime { get; set; }
    public string p1 { get; set; }
    public bool p2 { get; set; }
    public string p3 { get; set; }
    public bool p4 { get; set; }
}

Related

Are bitmasks ever a good way to model if a work request has one or more statuses currently applied, and what would work better?

Summary
Need to be able to tell if a work request has one or more statuses currently applied to it, and be able to remove statuses without affecting other statuses applied. Currently, the work request can only have one status at a time, and the code for determining what the 'most important' status is keeps growing.
SQL Server back end; C# with EF (mostly) for data access.
Background
I'm working on an application where we have a work request where the status changes as people do specific activities until the request is finished. There are close to 30 statuses that the request can have, and there are many instances where we need to know if one or more statuses have been applied to the work request (to determine what happens next).
Currently the request has a single status that reflects the most current status, and when we change the status it has to go through code that looks at other associated data to determine what the 'most important' status is and change the request to that one.
This business problem seems to be perfect for using bitwise calculations, but I don't want to resort to an obsolete practice. The other possibility is to just have a collection of statuses and add/remove from the list.
Thanks
X DO NOT use an enum for open sets (such as the operating system version, names of your friends, etc.).
[Microsoft Framework Design Guidelines]
Your use case sounds like an open set that will be added to over time, so based on that alone I'd say enums are not right for this use case.
X AVOID creating flag enums where certain combinations of values are invalid.
Additionally it doesn't sound like all the values from your enum can be combined and still be valid.
Lastly, here's a comment from Steven Clarke in the published copy of the Microsoft Framework Design Guidelines about the complexity of your proposed use of enums:
I'm sure that less experienced developers will be able to understand
bitwise operation on flags. The real question, though, is whether they
would expect to have to do this. Most of the APIs that I have run
through the labs don't require them to perform such operations so I
have a feeling that they would have the same experience that we
observed during a recent study - it's just not something that they are
used to doing so they might not even think about it. Where it could get
worse, I think, is that if less advanced developers don't realize they
are working with a set of flags that can be combined with one another,
they might just look at the list available and think that is all the
functionality they can access. As we've seen in other studies, if an
API makes it look to them as though a specific scenario or requirement
isn't immediately possible, it's likely that they will change the
requirement and do what does appear to be possible, rather than being
motivated to spend time investigating what they need to do to achieve
the original goal.
What follows are just some thoughts about enums should you go this route:
DO name flag enums with plural nouns or noun phrases and simple enums with singular nouns or noun phrases.
DO use powers of two for the flag enum values so they can be freely combined using the bitwise OR operation.
I don't think there is anything necessarily wrong with using a flagged enum for your status. To clarify, I think there are two things you are talking about: the set of actions that have been done for a request, and some sort of derived value that you want to communicate to the user as the most important one. I think this could be handled by a flagged enum, plus a property that is derived from your status enum (you might already be doing this). I would also recommend keeping a log of when each status was applied to a request in a separate entity.
As far as enforcing your statuses goes, one thing you could try is to represent your process as a directed graph of the steps. That data structure can then be used to determine whether all the conditions are met to move to the next step in your process (see the sketch after the logging class below).
[Flags]
public enum Status
{
    Unknown = 0,
    Completed = 1,
    Blocked = 2,
    Phase1 = 4,
    Phase2 = 8,
    Phase3 = 16,
    Closed = 32
}
public class Request
{
    public string Name { get; set; }
    public string StatusText { get { return GetStatusText(); } }
    public Status Status { get; set; }

    public Request()
    {
        this.Status = Status.Unknown;
    }

    private string GetStatusText()
    {
        string statusText = "Created";
        if (AnyStatus(Status.Closed | Status.Completed))
        {
            statusText = IsStatus(Status.Closed) ? "Closed" : "Completed";
        }
        else if (IsStatus(Status.Blocked))
        {
            statusText = "Blocked";
        }
        else if (IsStatus(Status.Phase3))
        {
            statusText = "Phase 3";
        }
        else if (IsStatus(Status.Phase2))
        {
            statusText = "Phase 2";
        }
        else if (IsStatus(Status.Phase1))
        {
            statusText = "Phase 1";
        }
        return statusText;
    }

    // True when every flag in checkStatus is set.
    private bool IsStatus(Status checkStatus)
    {
        return (this.Status & checkStatus) == checkStatus;
    }

    // True when at least one flag in checkStatus is set.
    private bool AnyStatus(Status checkStatus)
    {
        return (this.Status & checkStatus) > 0;
    }
}
A possible class for logging status changes:
public class StatusLog
{
    public int RequestId { get; set; }
    public int UserId { get; set; }
    public DateTime Date { get; set; }
    public Status Status { get; set; }
}

How can I Lock an Azure Table partition in an Azure Function using IQueryable and IAsyncCollector?

I'm fiddling with Azure Functions, combining it with CQRS and event sourcing. I'm using Azure Table Storage as an Event Store. The code below is a simplified version to not distract from the problem.
I'm not interested in any code tips, since this is not a final version of the code.
public static async Task Run(BrokeredMessage commandBrokeredMessage, IQueryable<DomainEvent> eventsQueryable, IAsyncCollector<IDomainEvent> eventsCollector, TraceWriter log)
{
    var command = commandBrokeredMessage.GetBody<FooCommand>();
    var committedEvents = eventsQueryable.Where(e => e.PartitionKey == command.AggregateRootId);
    var expectedVersion = committedEvents.Max(e => e.Version);

    // some domain logic that will result in domain events
    var uncommittedEvents = HandleFooCommand(command, committedEvents);

    // using(Some way to lock partition)
    // {
    var currentVersion = eventsQueryable.Where(e => e.PartitionKey == command.AggregateRootId).Max(e => e.Version);
    if (expectedVersion != currentVersion)
    {
        throw new ConcurrencyException("expected version is not the same as current version");
    }
    var i = currentVersion;
    foreach (var domainEvent in uncommittedEvents.OrderBy(e => e.Timestamp))
    {
        i++;
        domainEvent.Version = i;
        await eventsCollector.AddAsync(domainEvent);
    }
    // }
}
public class DomainEvent : TableEntity
{
    private string eventType;

    public virtual string EventType
    {
        get { return eventType ?? (eventType = GetType().UnderlyingSystemType.Name); }
        set { eventType = value; }
    }

    public long Version { get; set; }
}
My efforts
To be fair, I could not try anything, because I don't know where to start or whether this is even possible. I did some research; it did not solve my problem, but it could help you solve it.
Do Azure Tables support locking?
Yes, they do: Managing Concurrency in Microsoft Azure Storage. It's called leasing, but I do not know how to implement this in an Azure Function.
Other sources
Azure Functions triggers and bindings developer reference
Azure Functions C# developer reference
Tips, suggestions, alternatives
I'm always open to suggestions on how to solve problems, but I cannot accept them as an answer to my question. Unless the answer to my question is "no", I cannot mark an alternative as the answer. I'm not seeking the best way to solve my problem; I want it to work the way I engineered it. I know this is stubborn, but this is practice/fiddling.
Blob leases would indeed work pretty well for what you're trying to accomplish (the Functions runtime actually makes extensive use of that internally).
If, before working on a partition, you acquire a lease on a blob (by convention, a blob named after the partition, or something like that), you'd be able to ensure that only one function instance is working on that partition.
The article you've linked to does show an example of lease acquisition and release, you can find more information in the documentation.
One thing you want to ensure is that you flush your collector before you leave the lock scope (by calling FlushAsync on it).
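For what it's worth, here is a minimal sketch of that pattern using the Microsoft.WindowsAzure.Storage SDK (the container name is made up, and error/retry handling is omitted):

// Serialize writers on a partition by leasing a per-partition lock blob.
var account = CloudStorageAccount.Parse(connectionString);
var blob = account.CreateCloudBlobClient()
                  .GetContainerReference("partition-locks")
                  .GetBlockBlobReference(command.AggregateRootId);

// The lock blob must exist before it can be leased.
if (!await blob.ExistsAsync())
    await blob.UploadTextAsync(string.Empty);

// Acquisition throws a StorageException (409 Conflict) if another instance holds the lease.
string leaseId = await blob.AcquireLeaseAsync(TimeSpan.FromSeconds(30), null);
try
{
    // ... re-check the current version and AddAsync the uncommitted events here ...
    await eventsCollector.FlushAsync(); // flush while we still hold the lease
}
finally
{
    await blob.ReleaseLeaseAsync(AccessCondition.GenerateLeaseCondition(leaseId));
}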
I hope this helps!

Create a timer job that runs once a month, exports SharePoint list items to Excel, and stores the file in a document library

I want to create a timer job or workflow which runs once a month, exports SharePoint list data to Excel, and stores the file in a document library.
I have downloaded the code to create a timer job from the link below, but I don't know how to incorporate the above requirement:
http://code.msdn.microsoft.com/SharePoint-2010-Custom-416cd3a1
// Create a class derived from the SPJobDefinition class
class ListTimerJob : SPJobDefinition
{
    public ListTimerJob()
        : base()
    {
    }

    public ListTimerJob(string jobName, SPService service, SPServer server, SPJobLockType targetType)
        : base(jobName, service, server, targetType)
    {
    }

    public ListTimerJob(string jobName, SPWebApplication webApplication)
        : base(jobName, webApplication, null, SPJobLockType.ContentDatabase)
    {
        this.Title = "List Timer Job";
    }

    public override void Execute(Guid contentDbId)
    {
        // get a reference to the current site collection's content database
        SPWebApplication webApplication = this.Parent as SPWebApplication;
        SPContentDatabase contentDb = webApplication.ContentDatabases[contentDbId];
        // get a reference to the "ListTimerJob" list in the RootWeb of the first site collection in the content database
        SPList Listjob = contentDb.Sites[0].RootWeb.Lists["ListTimerJob"];
        // create a new list item, set the Title to the current day/time, and update the item
        SPListItem newList = Listjob.Items.Add();
        newList["Title"] = DateTime.Now.ToString();
        newList.Update();
    }
}
// Add an event receiver at the feature level
[Guid("9a724fdb-e423-4232-9626-0cffc53fb74b")]
public class Feature1EventReceiver : SPFeatureReceiver
{
    const string List_JOB_NAME = "ListLogger";

    // Handles the event raised after the feature has been activated.
    public override void FeatureActivated(SPFeatureReceiverProperties properties)
    {
        SPSite site = properties.Feature.Parent as SPSite;
        // make sure the job isn't already registered
        foreach (SPJobDefinition job in site.WebApplication.JobDefinitions)
        {
            if (job.Name == List_JOB_NAME)
                job.Delete();
        }
        // install the job
        ListTimerJob listLoggerJob = new ListTimerJob(List_JOB_NAME, site.WebApplication);
        SPMinuteSchedule schedule = new SPMinuteSchedule();
        schedule.BeginSecond = 0;
        schedule.EndSecond = 59;
        schedule.Interval = 5;
        listLoggerJob.Schedule = schedule;
        listLoggerJob.Update();
    }

    // Handles the event raised before the feature is deactivated.
    public override void FeatureDeactivating(SPFeatureReceiverProperties properties)
    {
        SPSite site = properties.Feature.Parent as SPSite;
        // delete the job
        foreach (SPJobDefinition job in site.WebApplication.JobDefinitions)
        {
            if (job.Name == List_JOB_NAME)
                job.Delete();
        }
    }
}
I would also advise you not to use the SharePoint timer job engine. It's definitely not stable: sometimes jobs simply don't trigger, and they are difficult and slow to instantiate.
Of course you could always spend time tweaking SharePoint to achieve stability, but there is no guarantee. I know it sounds imperative, but trust me: I can't remember all the problems we had with this engine, but we lost a lot of time on it.
I recommend Quartz.NET or the Windows Task Scheduler, as mentioned earlier. These are well-proven solutions used by many people, including for SharePoint.
We implemented Quartz.NET for SharePoint at my company, and all our timer jobs run on this engine. We have had no glitches for two years.
Best regards.
You should change the SPMinuteSchedule to a monthly schedule such as SPMonthlyByDaySchedule; see http://msdn.microsoft.com/en-us/library/microsoft.sharepoint.spschedule.aspx.
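For example, here is a sketch of the registration code with the minute schedule swapped out. Note I'm using SPMonthlySchedule (fixed calendar day) rather than SPMonthlyByDaySchedule ("first Monday"-style), and I'm assuming its Begin*/End* properties from the SPSchedule docs linked above:

// Replace the SPMinuteSchedule block in FeatureActivated with a monthly window.
SPMonthlySchedule schedule = new SPMonthlySchedule();
schedule.BeginDay = 1;     // run on the 1st of each month...
schedule.EndDay = 1;
schedule.BeginHour = 2;    // ...at some point between 02:00:00
schedule.EndHour = 2;
schedule.BeginMinute = 0;  // and 02:59:59
schedule.EndMinute = 59;
listLoggerJob.Schedule = schedule;
listLoggerJob.Update();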
But my recommendation is to use the Windows Server scheduler and a console application. Easy to change, easy to maintain (no iisreset!), and easy to log everything. We use console applications for various scheduled jobs, running at intervals from one hour to one day.
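As a rough sketch of the console-app approach (the site URL, list name, and library name are made up; schedule the executable from Windows Task Scheduler with a monthly trigger):

using System;
using System.Text;
using Microsoft.SharePoint;

class ExportJob
{
    static void Main(string[] args)
    {
        using (SPSite site = new SPSite("http://server/sites/mysite"))
        using (SPWeb web = site.OpenWeb())
        {
            // Dump the list to CSV, which Excel opens directly.
            SPList list = web.Lists["MyList"];
            var csv = new StringBuilder();
            foreach (SPListItem item in list.Items)
                csv.AppendLine(item["Title"] + ";" + item["Modified"]);

            // Store the export in a document library.
            byte[] bytes = Encoding.UTF8.GetBytes(csv.ToString());
            web.Files.Add("Shared Documents/export-" + DateTime.Now.ToString("yyyy-MM") + ".csv",
                          bytes, true);
        }
    }
}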

Amazon AWS Simple Workflow Service SWF C# Sample

I was wondering if there is any SWF workflow C# sample code available for the AWS .NET SDK.
AWS Forum Post: https://forums.aws.amazon.com/thread.jspa?threadID=122216&tstart=0
As part of getting familiar with SWF, I ended up writing a common case library that I hope others can use as well. It's called SimpleWorkflowFramework.NET and is available as open source at https://github.com/sdebnath/SimpleWorkflowFramework.NET. It definitely could use a lot of help, so if you are interested, jump right in! :)
I have developed an open source .NET library, Guflow, to program Amazon SWF. Here is how you can write a workflow to transcode video:
[WorkflowDescription("1.0")]
public class TranscodeWorkflow : Workflow
{
    public TranscodeWorkflow()
    {
        // DownloadActivity is the startup activity and will be scheduled when the workflow is started.
        ScheduleActivity<DownloadActivity>().OnFailure(Reschedule);

        // After DownloadActivity is completed, TranscodeActivity will be scheduled.
        ScheduleActivity<TranscodeActivity>().AfterActivity<DownloadActivity>()
            .WithInput(a => new { InputFile = ParentResult(a).DownloadedFile, Format = "MP4" });

        ScheduleActivity<UploadToS3Activity>().AfterActivity<TranscodeActivity>()
            .WithInput(a => new { InputFile = ParentResult(a).TranscodedFile });

        ScheduleActivity<SendConfirmationActivity>().AfterActivity<UploadToS3Activity>();
    }

    private static dynamic ParentResult(IActivityItem a) => a.ParentActivity().Result();
}
In the above example I have left out task routing for clarity.
Here is how you can create an activity:
[ActivityDescription("1.0")]
public class DownloadActivity : Activity
{
    // Both sync and async methods are supported.
    [ActivityMethod]
    public async Task<Response> Execute(string input)
    {
        // simulate downloading the file
        await Task.Delay(10);
        return new Response() { DownloadedFile = "downloaded path", PollingQueue = PollingQueue.Download };
    }

    public class Response
    {
        public string DownloadedFile;
        public PollingQueue PollingQueue; // task-routing queue (routing is otherwise left out of these examples)
    }
}
For clarity I'm leaving out examples of the other activities. Guflow is supported by documentation, a tutorial, and samples.

Handle a function taking a lot of time (threading)

I have a function which is taking a lot of time to execute in a web application. I have verified this with a profiler and with my own logging.
I have other functions running in the same page load.
What is the best way to display the values from those other functions right away, run this slow function on a thread, and display its result in a label when it finishes?
The function is used to get events from the application log, which takes time.
private void getEventErrors()
{
    EventLog eventLog = new EventLog("Application", ".");
    getEvents(eventLog.Entries);
}

private void getEvents(EventLogEntryCollection eventLogEntryCollection)
{
    int errorEvents = 0;
    foreach (EventLogEntry logEntry in eventLogEntryCollection)
    {
        if (logEntry.Source.Equals("XYZ"))
        {
            DateTime variable = Convert.ToDateTime(logEntry.TimeWritten);
            long eventTimeTicks = variable.Ticks;
            // 621355968000000000 = ticks at the Unix epoch (1970-01-01); dividing by
            // 10,000,000 converts ticks to seconds, so this is the event time in Unix seconds.
            long eventTimeUTC = (eventTimeTicks - 621355968000000000) / 10000000;
            long presentDayTicks = DateTime.Now.Ticks;
            // 864000000000 = ticks in one day, so this is "24 hours ago" in Unix seconds.
            long daysBackSeconds = ((presentDayTicks - 864000000000) - 621355968000000000) / 10000000;
            // count the "Error" entries written in the last 24 hours
            if (eventTimeUTC > daysBackSeconds)
            {
                if (logEntry.EntryType.ToString() == "Error")
                {
                    errorEvents = errorEvents + 1;
                }
            }
        }
    }
    btn_Link_Event_Errors_Val.Text = errorEvents.ToString(GUIUtility.TWO_DECIMAL_PT_FORMAT);
    if (errorEvents == 0)
    {
        lbl_EventErrorColor.Attributes.Clear();
        lbl_EventErrorColor.Attributes.Add("class", "green");
    }
    else
    {
        lbl_EventErrorColor.Attributes.Clear();
        lbl_EventErrorColor.Attributes.Add("class", "red");
    }
}
I have three functions in the page-load event: two that get values from the DB, and the one shown above.
Should these functions be service calls?
What I wanted was for the page to load fast; any function that takes a long time should run in the background and display its result when done, and if the user navigates to a new page in the meantime, it should be killed.
If you have a function that is running in a separate thread in ASP.NET, you may want to consider moving it to a service. There are many reasons for this; see this answer (one of many on SO) for why running long-running tasks in ASP.NET is not always a good idea.
One option for the service is to use WCF; you can get started here. Your service could implement a method, say GetEvents(), which you could use to pull your events. That way you won't tie up your page waiting for this process to complete (using AJAX, of course). Also, this allows you to change your implementation of GetEvents() without touching the code on your website.
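As a rough sketch, the contract for such a WCF service might look like this (the names and the day-window parameter are illustrative, not from the question):

using System;
using System.Diagnostics;
using System.Linq;
using System.ServiceModel;

[ServiceContract]
public interface IEventService
{
    // Count the "Error" entries written by the given source in the last N days.
    [OperationContract]
    int GetEventErrorCount(string source, int days);
}

public class EventService : IEventService
{
    public int GetEventErrorCount(string source, int days)
    {
        DateTime cutoff = DateTime.Now.AddDays(-days);
        var eventLog = new EventLog("Application", ".");
        return eventLog.Entries.Cast<EventLogEntry>()
                       .Count(e => e.Source == source
                                && e.EntryType == EventLogEntryType.Error
                                && e.TimeWritten > cutoff);
    }
}

The page can then call this asynchronously (e.g. via AJAX) and fill in the label when the count arrives, so the rest of the page load is never blocked.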
