I have this code in my ASP.NET MVC application to share a value across the application:
public static class Global
{
    public static string Token { get; private set; }

    public static void LoadFromFile()
    {
        // loads Token value from a settings file
    }
}
What I want to do is run the LoadFromFile method once a day to update the Token value.
I can't use a separate background task, like one in Hangfire, since I want to update the value in the currently running application.
How can I do it? Thanks.
Update:
Those who think this is a duplicate, please read the question. I want to update the shared value in the currently running application; changing it in a separate background task won't change it for the current application.
To help the wandering SO police rest a while: I got the answer.
I can run a scheduled background task to access an endpoint in my site, and from there update the static Token value.
I faced a similar requirement once; here is the trick I used:
private static DateTime lastRunAt;
private static object loadingTokenLock = new object();

// True when the last load happened on a different (UTC) day than today.
private static bool TokenUpdateNeeded
{
    get
    {
        return DateTime.UtcNow.DayOfYear != lastRunAt.DayOfYear;
    }
}

public static void TryLoadToken()
{
    // Double-checked locking: only one request per day actually reloads the token.
    if (TokenUpdateNeeded)
    {
        lock (loadingTokenLock)
        {
            if (TokenUpdateNeeded)
                LoadFromFile();
        }
    }
}

public static void LoadFromFile()
{
    // loads Token value from a settings file
    lastRunAt = DateTime.UtcNow;
}

// In Global.asax.cs
void Session_Start(object sender, EventArgs e)
{
    TryLoadToken();
}
I can't remember the exact code, but the idea was to update the Token upon the first request of the day.
The problem is that if your application has not been visited for more than a day, the Token will not be updated. So we defined a task in Windows Task Scheduler to visit the site once every day.
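For completeness, a rough sketch of the kind of caller that scheduled task can run once a day; the endpoint URL is hypothetical and would simply call Global.LoadFromFile() (or TryLoadToken) inside the running application:

using System;
using System.Net;

class TokenRefreshPing
{
    static void Main()
    {
        using (var client = new WebClient())
        {
            // Hitting the site keeps it warm and lets it refresh the static Token value.
            client.DownloadString("https://yoursite.example.com/maintenance/refresh-token");
        }

        Console.WriteLine("Refresh endpoint hit at {0:u}", DateTime.UtcNow);
    }
}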
I have an abstract class called HttpHelper; it has basic methods like GET, POST, PATCH, and PUT.
What I need to achieve is this:
Store the URL, time, and date in the database each time one of the GET, POST, PATCH, or PUT functions is called.
I don't want to write to the database directly each time the functions are called (that would be slow), but to put the data somewhere (like a static queue / in-memory cache) which must be faster and non-blocking, and have a long-running background process that looks into this cache-like storage and then stores the values in the database.
I have no clear idea how to do this, but the main purpose is to count the calls per hour or day, by domain, resource, and URL query.
I'm wondering if I could do the following:
Create a static class which uses ConcurrentQueue<T> to store data, and call that class in each function inside the HttpHelper class (a sketch of this option follows the list below)
Create a background task similar to this: Asp.Net core long running/background task
Or use Hangfire, but that might be too much for a simple task
Or is there a built-in method for this in .NET Core?
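A minimal sketch of how the first two options could fit together, assuming .NET Core's BackgroundService and a hypothetical RequestLogEntry type (names and intervals are placeholders, not a finished implementation):

using System;
using System.Collections.Concurrent;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Extensions.Hosting;

public class RequestLogEntry
{
    public string Method { get; set; }
    public string Url { get; set; }
    public DateTime CalledAtUtc { get; set; }
}

public static class RequestLogQueue
{
    private static readonly ConcurrentQueue<RequestLogEntry> _queue = new ConcurrentQueue<RequestLogEntry>();

    // Called from GET/POST/PATCH/PUT in HttpHelper; enqueueing is lock-free and non-blocking.
    public static void Enqueue(RequestLogEntry entry) => _queue.Enqueue(entry);

    public static bool TryDequeue(out RequestLogEntry entry) => _queue.TryDequeue(out entry);
}

// Long-running consumer, registered with services.AddHostedService<RequestLogWriter>().
public class RequestLogWriter : BackgroundService
{
    protected override async Task ExecuteAsync(CancellationToken stoppingToken)
    {
        while (!stoppingToken.IsCancellationRequested)
        {
            while (RequestLogQueue.TryDequeue(out var entry))
            {
                // Insert the entry into the database here (batching the dequeued items is even better).
            }

            try
            {
                await Task.Delay(TimeSpan.FromSeconds(30), stoppingToken);
            }
            catch (TaskCanceledException)
            {
                // The application is shutting down.
            }
        }
    }
}

The HttpHelper methods would only call RequestLogQueue.Enqueue, so the request path is never blocked by database writes.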
Both Hangfire and background tasks would do the trick as consumers of the queue items.
Hangfire predates long-running background tasks (it's from before .NET Core), so go with long-running tasks for .NET Core implementations.
There is a "but" here, though.
How important is it to you that you not miss a call? If it is important, then neither can help you.
The queue, or whatever static construct you have, will be lost the moment your application crashes, the machine restarts, or the application pool is recycled.
You need to consider some kind of external queuing mechanism, like RabbitMQ with persistence turned on.
You could also append to a file, but that might also introduce some delays due to the reads/writes.
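To make "persistence on" concrete, here is a rough sketch using the RabbitMQ.Client package; the host, queue name, and payload are made-up examples, and in real code you would reuse the connection rather than open one per publish:

using System.Text;
using RabbitMQ.Client;

public static class DurableApiLogPublisher
{
    public static void Publish(string json)
    {
        var factory = new ConnectionFactory { HostName = "localhost" };
        using (var connection = factory.CreateConnection())
        using (var channel = connection.CreateModel())
        {
            // A durable queue survives a broker restart; persistent messages are written to disk.
            channel.QueueDeclare(queue: "api-log", durable: true, exclusive: false, autoDelete: false);

            var properties = channel.CreateBasicProperties();
            properties.Persistent = true;

            channel.BasicPublish(exchange: "",
                                 routingKey: "api-log",
                                 basicProperties: properties,
                                 body: Encoding.UTF8.GetBytes(json));
        }
    }
}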
I do not know how complex your problem is, but I would consider two solutions.
The first is calling an async insert method which does not block your main thread but starts a task. You can return the response without waiting for your log to be appended to the database. Since you want it implemented in only some methods, I would do it using attributes and middleware.
Simplified example:
public IActionResult SomePostMethod()
{
    LogActionAsync("This Is Post Method");
    return StatusCode(201);
}

public static Task LogActionAsync(string someParameter)
{
    return Task.Run(() => {
        // Communicate with database (X ms)
    });
}
A better solution is creating a buffer which does not talk to the database on every call, but only when it is filled or at an interval. It would look like this:
public IActionResult SomePostMethod()
{
    APILog.Log(new APILog.Item() { Date = DateTime.Now, Item1 = "Something" });
    return StatusCode(201);
}
public partial class APILog
{
    private static List<APILog.Item> _buffer = new List<APILog.Item>();
    private const int _msTimeout = 60000; // Timeout between database updates
    private static object _updateLock = new object();

    static APILog()
    {
        StartDBUpdateLoopAsync();
    }

    private static void StartDBUpdateLoopAsync()
    {
        // check whether it has already been started, and other housekeeping
        Task.Run(() => {
            while (true) // Do not use true but some other expression that tells you whether your application is still running.
            {
                Thread.Sleep(_msTimeout);
                lock (_updateLock)
                {
                    foreach (APILog.Item item in _buffer)
                    {
                        // Import into database here
                    }
                    _buffer.Clear(); // Empty the buffer so items are not imported twice
                }
            }
        });
    }

    public static void Log(APILog.Item item)
    {
        lock (_updateLock)
        {
            _buffer.Add(item);
        }
    }
}

public partial class APILog
{
    public class Item
    {
        public string Item1 { get; set; }
        public DateTime Date { get; set; }
    }
}
Also, in this second example I would not call APILog.Log() manually each time, but would use middleware in combination with an attribute.
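A rough sketch of that middleware-plus-attribute idea, assuming ASP.NET Core endpoint routing (so app.UseRouting() must run before this middleware); the ApiLogAttribute marker is a hypothetical name:

using System;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Http;

// Marker attribute placed on the action methods you want logged (hypothetical name).
[AttributeUsage(AttributeTargets.Method)]
public class ApiLogAttribute : Attribute { }

// Middleware that records every matching request into the APILog buffer instead of calling APILog.Log manually.
public class ApiLogMiddleware
{
    private readonly RequestDelegate _next;

    public ApiLogMiddleware(RequestDelegate next) => _next = next;

    public async Task InvokeAsync(HttpContext context)
    {
        var endpoint = context.GetEndpoint();
        if (endpoint?.Metadata.GetMetadata<ApiLogAttribute>() != null)
        {
            APILog.Log(new APILog.Item { Date = DateTime.Now, Item1 = context.Request.Path });
        }

        await _next(context);
    }
}

// Registered in Startup.Configure, after app.UseRouting():
// app.UseMiddleware<ApiLogMiddleware>();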
I am using Hangfire to run jobs, and I'd like to change the default behaviour where succeeded jobs are deleted from the database after a day; I'd like them to be stored for a year.
Following the instructions in this thread, which is the same as in this SO question, I have created a class:
public class OneYearExpirationTimeAttribute : JobFilterAttribute, IApplyStateFilter
{
    public void OnStateUnapplied(ApplyStateContext context, IWriteOnlyTransaction transaction)
    {
        context.JobExpirationTimeout = TimeSpan.FromDays(365);
    }

    public void OnStateApplied(ApplyStateContext context, IWriteOnlyTransaction transaction)
    {
        context.JobExpirationTimeout = TimeSpan.FromDays(365);
    }
}
and I register it in my ASP.NET Web API startup class as a global filter:
public class Startup
{
    public void Configuration(IAppBuilder app)
    {
        // ... other stuff here ...
        GlobalJobFilters.Filters.Add(new OneYearExpirationTimeAttribute());
        GlobalConfiguration.Configuration.UseSqlServerStorage("HangFireDBConnection");
        app.UseHangfireDashboard();
    }
}
The Web API is the place where jobs are posted (i.e., the call to BackgroundJob.Enqueue(() => ...) happens). I have not changed the configuration of the clients that do the actual jobs.
If I now post a job and it succeeds, it still has an expiry of one day, as can be seen both in the dashboard and in the corresponding entry in the Hangfire database.
What am I doing wrong or what am I missing?
My mistake in the setup was that the attribute was set on the wrong application. As I stated in the question, I added the filter in the Startup.cs file of the ASP.NET Web API where jobs are posted.
Instead, I should have added the configuration in the console application where the jobs are actually executed, i.e., my console app starts with:
static void Main(string[] args)
{
    GlobalConfiguration.Configuration.UseSqlServerStorage("HangFireDBConnection");
    GlobalJobFilters.Filters.Add(new OneYearExpirationTimeAttribute());
    // ... more stuff ...
}
Then it works. The Hangfire documentation could be a bit clearer on where the filter should be configured.
Using version:
// Type: Hangfire.JobStorage
// Assembly: Hangfire.Core, Version=1.7.11.0, Culture=neutral, PublicKeyToken=null
This can be done directly (apparently):
JobStorage.Current.JobExpirationTimeout = TimeSpan.FromDays(6 * 7);
I have a simple service interface I am using to synchronize data with a server via HTTP. The service interface has a method to start and stop the synchronization process. The idea is to start the synchronization process after the user signs in, and stop the synchronization at the end of the application before the user signs out. The synchronization service will check for new messages every few minutes, and then notify the ViewModel(s) of new/changed data using the MvxMessenger plugin.
What is the recommended way to ensure the synchronization service lives for the duration of the app? I am currently using a custom IMvxAppStart which registers the service interface as a singleton, and then holds a static reference to the service interface. Is that enough to keep the service alive for the lifetime of the app, or is there a better way?
public class App : MvxApplication
{
    public override void Initialize()
    {
        ...
        RegisterAppStart(new CustomAppStart());
    }
}

public class CustomAppStart : MvxNavigatingObject, IMvxAppStart
{
    public static ISyncClient SynchronizationClient { get; set; }

    public void Start(object hint = null)
    {
        SynchronizationClient = Mvx.Resolve<ISyncClient>();
        ShowViewModel<SignInViewModel>();
    }
}

public interface ISyncClient
{
    void StartSync();
    void StopSync();
    bool IsSyncActive { get; }
}
You don't need a static property for this. When you register the interface as a singleton, the IoC container does the work for you. For example, in one of our apps we need a state property holding important data for the whole lifetime of the app.
The models that need this state just use the following code snippet:
protected IApplicationState AppState
{
    get { return _appstate ?? (_appstate = Mvx.GetSingleton<IApplicationState>()); }
}
private IApplicationState _appstate;
But you can also do it with a static property; in that case you don't need a singleton registration in the IoC container.
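For reference, a minimal sketch of the registration side, assuming MvvmCross's Mvx.ConstructAndRegisterSingleton helper and a hypothetical SyncClient implementation of ISyncClient:

public class App : MvxApplication
{
    public override void Initialize()
    {
        // One instance lives for the whole app lifetime;
        // every Mvx.Resolve<ISyncClient>() call returns that same instance.
        Mvx.ConstructAndRegisterSingleton<ISyncClient, SyncClient>();

        RegisterAppStart(new CustomAppStart());
    }
}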
I am developing a project for a log monitor and I am using an ASP.NET application with SignalR.
The main objective of the application is to provide an error log monitor on multiple clients across different locations (LCD monitors). Whenever an error log entry is created in the database, the application should notify all the clients of the new error.
I am considering creating a static Timer variable in the web application that will be started by the Application_Start method.
But, since I understand the application will have a single thread per session, I think the web server will end up with a lot of timers running together.
I need to know how to make this Timer instance unique across all the session instances on the web server.
Application_Start is not triggered by a new session, but by the start of the application. If you initialize your timer in Application_Start, you don't need to worry about multiple timer instances.
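For instance, a minimal sketch in Global.asax.cs, assuming a System.Threading.Timer kept in a static field so it is not garbage collected (the check logic is just a placeholder):

using System;

public class MvcApplication : System.Web.HttpApplication
{
    // The static field keeps the single timer alive for the whole application lifetime.
    private static System.Threading.Timer _errorLogTimer;

    protected void Application_Start()
    {
        // Runs once when the application starts, not once per session.
        _errorLogTimer = new System.Threading.Timer(
            _ => CheckForNewErrorLogs(),
            null,
            TimeSpan.Zero,
            TimeSpan.FromSeconds(30));
    }

    private static void CheckForNewErrorLogs()
    {
        // Query the database and push any new errors to all SignalR clients here.
    }
}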
You can create an instance class that has a timer.
For instance:
public class MyTimerHolder
{
    private static Lazy<MyTimerHolder> _instance = new Lazy<MyTimerHolder>(() => new MyTimerHolder());
    private readonly TimeSpan _checkPeriod = TimeSpan.FromSeconds(3);
    private IHubContext _hubProxy;
    // Threaded timer
    private Timer _timer;

    public MyTimerHolder()
    {
        _timer = new Timer(CheckDB, null, _checkPeriod, _checkPeriod);
    }

    public void BroadcastToHub(IHubContext context)
    {
        _hubProxy = context;
    }

    public void CheckDB(object state)
    {
        if (_hubProxy != null)
        {
            // Logic to check your database
            _hubProxy.Clients.All.foo("Whatever data you want to pass");
        }
    }

    public static MyTimerHolder Instance
    {
        get
        {
            return _instance.Value;
        }
    }
}
Then you can change the hub context at any point from any method. So let's say you want to broadcast to clients connected to the hub "MyDBCheckHub". At any point in your application, all you have to do is:
MyTimerHolder.Instance.BroadcastToHub(GlobalHost.ConnectionManager.GetHubContext<MyDBCheckHub>());
You can call this in your application start or wherever you please; there will only be one instance of MyTimerHolder within the app domain.
I have been tasked (ha) with creating an application that will allow users to schedule a command line app we have, with a parameter.
So the command line app takes an XML file and "runs it".
So, bottom line, I either need to create a Windows service or learn how to interact with the Task Scheduler service already running on the box (version 1, XP/2003).
At first I thought it would be easy: have a service running and, when a job is submitted, calculate the time between now and the run time and set up a timer to wait that amount of time. This is better than checking every minute whether it's time to run.
Where I hit a wall is that I realized I do not know how to communicate with a running Windows service, except maybe by creating a file with the details and having the service use a file watcher to load the file and modify the schedule.
So the underlying question is how I can execute this pseudo code
from client
serviceThatIsRunning.Add(Job)
Or interact with the Task Scheduler, or create .job files, using C# 3.5.
Edit:
To clarify, I created a small sample to get my thoughts on "paper".
So I have a Job class:
public class Job
{
    #region Properties

    public string JobName { get; set; }
    public string JobXML { get; set; }

    private Timer _JobTimer;
    public Timer JobTimer
    {
        get
        {
            return _JobTimer;
        }
    }

    #endregion

    public void SetJobTimer(TimeSpan time)
    {
        if (_JobTimer != null)
        {
            _JobTimer.Dispose();
        }
        _JobTimer = new Timer(new TimerCallback(RunJob), null, time, time);
    }

    private void RunJob(Object state)
    {
        System.Diagnostics.Debug.WriteLine(String.Format("The {0} Job would have ran with file {1}", JobName, JobXML));
    }

    public override string ToString()
    {
        return JobName;
    }

    public void StopTimer()
    {
        _JobTimer.Dispose();
    }
}
Now I need to create an app to house these Jobs that is constantly running (that is why I thought of a Windows service), and then a Windows app to allow the user to work with the job list.
So the question is: if I create a Windows service, how do I interact with methods in that service so I can change the job list (add, delete, change)?
Here is a small Windows app I created to show that the Job class does run. Interesting point: if I am doing this correctly, even when I do not add the Job to the ListBox and the Add method exits, the Job's timer still runs and the object does not get picked up by the garbage collector.
public partial class Form1 : Form
{
    public Form1()
    {
        InitializeComponent();
    }

    private void btnAddJob_Click(object sender, EventArgs e)
    {
        Job job = new Job();
        job.JobName = txtJobName.Text;
        job.JobXML = txtJobXML.Text;
        job.SetJobTimer(new TimeSpan(0, 0, Convert.ToInt32(JobTime.Value)));
        // ??Even if I don't add the Job to a list or ListBox it seems
        // ??to stay alive and not get picked up by the GC
        listBox1.Items.Add(job);
    }

    private void listBox1_SelectedIndexChanged(object sender, EventArgs e)
    {
        if (listBox1.SelectedIndex > -1)
        {
            Job job = listBox1.Items[listBox1.SelectedIndex] as Job;
            txtJobName.Text = job.JobName;
            txtJobXML.Text = job.JobXML;
        }
    }

    private void btnRemove_Click(object sender, EventArgs e)
    {
        Job job = listBox1.Items[listBox1.SelectedIndex] as Job;
        job.StopTimer();
        listBox1.Items.Remove(job);
    }

    private void btnCollect_Click(object sender, EventArgs e)
    {
        GC.Collect();
    }
}
If you want to schedule a task using the Task Scheduler, it can be as simple as below. You just need to customize the command line arguments that you pass to schtasks for your needs. See this link for a detailed explanation of the command line arguments.
Process p = Process.Start("schtasks", commandArgs);
p.WaitForExit();
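For illustration only, the arguments might look something like this; the task name, paths, and schedule are made-up placeholders:

// Create a daily task that runs the command line app with a given XML file at 09:00.
string commandArgs = "/create /tn \"RunXmlJob\" " +
                     "/tr \"C:\\Tools\\MyCmdApp.exe C:\\Jobs\\job1.xml\" " +
                     "/sc daily /st 09:00";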
If you want to start multiple tasks that run at different time intervals, you can create, for instance, a class JobThread that defines a timer which is initialized in an Initialize method:
m_timer = new Timer(new TimerCallback(this.timerHandler), null, this.Interval, this.Interval);
Furthermore, this class defines a List of Job objects. These jobs are executed from the timerHandler.
Finally, you create a singleton JobManager class that defines a Start and Stop method.
In the Start method you do something like this:
foreach (var jobThread in this.m_jobThreads)
{
    jobThread.Initialize();
}
This JobManager also has an Initialize method that accepts an XmlNode parameter. This method parses the XML job you pass from the command line.
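Putting the pieces together, a rough sketch of that design; the class and member names, and where the XML is parsed, are assumptions based on the description above rather than a finished implementation:

using System;
using System.Collections.Generic;
using System.Threading;
using System.Xml;

public class JobThread
{
    private Timer m_timer;
    private readonly List<Job> m_jobs = new List<Job>();

    public TimeSpan Interval { get; set; }
    public List<Job> Jobs { get { return m_jobs; } }

    public void Initialize()
    {
        // One timer per JobThread; all jobs in this thread share the same interval.
        m_timer = new Timer(new TimerCallback(this.timerHandler), null, this.Interval, this.Interval);
    }

    private void timerHandler(object state)
    {
        foreach (Job job in m_jobs)
        {
            // Launch the command line app with job.JobXML here.
        }
    }
}

public class JobManager
{
    private static readonly JobManager m_instance = new JobManager();
    private readonly List<JobThread> m_jobThreads = new List<JobThread>();

    private JobManager() { }

    public static JobManager Instance { get { return m_instance; } }

    public void Initialize(XmlNode jobsNode)
    {
        // Parse each job element passed from the command line into a Job
        // and group the jobs into JobThreads by their interval.
    }

    public void Start()
    {
        foreach (JobThread jobThread in m_jobThreads)
        {
            jobThread.Initialize();
        }
    }

    public void Stop()
    {
        // Dispose the timer of every JobThread here.
    }
}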
There was an answer on this thread that is no longer there, but I am going to try to create a listener by keeping a port open:
WCF through Windows Services
http://msdn.microsoft.com/en-us/library/ms733069.aspx
Also adding the attribute
[ServiceBehavior(InstanceContextMode = InstanceContextMode.Single)]
helps to keep the state of the service.
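For reference, a rough sketch of self-hosting such a WCF service inside the Windows service; the contract, binding, and address below are examples, not a prescribed setup:

using System.ServiceModel;

[ServiceContract]
public interface IJobService
{
    [OperationContract]
    void Add(string jobName, string jobXml);
}

// A single instance keeps the in-memory job list alive between client calls.
[ServiceBehavior(InstanceContextMode = InstanceContextMode.Single)]
public class JobService : IJobService
{
    public void Add(string jobName, string jobXml)
    {
        // Create a Job from the arguments, add it to the in-memory list
        // and schedule its timer here.
    }
}

// Inside the Windows service's OnStart:
// _host = new ServiceHost(new JobService());
// _host.AddServiceEndpoint(typeof(IJobService), new NetNamedPipeBinding(),
//     "net.pipe://localhost/JobScheduler");
// _host.Open();
// ...and call _host.Close() in OnStop.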