I am writing a Windows Forms app in C# with Visual Studio 2022 on a Windows 10 machine. The app connects to an Azure database, which works fine. My issue is that sometimes it takes several seconds to connect (maybe 10 or so), or if there is an error it goes all the way to the timeout limit (usually 20 to 30 seconds) before coming back with whatever error message there is.
I am trying to provide some visual feedback to the user during this time, but the application does not appear to be processing any events, so whatever type of feedback I'm trying to send does not get done until the operation completes (at which point it is moot).
Any ideas on how to deal with this? Do I need to open the database on a different thread, and if so, will that be an issue throughout the rest of the app whenever I use the database object opened on a different thread?
I'm trying something simple, like gradually adding a row of dots, like so:
private void InitCloudDatabase()
{
    Boolean success = true;
    WorkingTimer.Enabled = true;
    WorkingTimer.Start();

    try
    {
        AzureAgDatabase db = new AzureAgDatabase();
        db.OpenConnection();
    }
    catch
    {
        success = false;
    }

    WorkingTimer.Stop();
    pbCloudResult.Image = (success) ? Properties.Resources.icons8_done_96 :
                                      Properties.Resources.Red_X___Fail;
}

private void WorkingTimer_Tick(object sender, EventArgs e)
{
    lblCloud.Text += " .";
    if (lblCloud.Text.Contains(" . . . . . . . . . . ."))
    {
        lblCloud.Text = "Database Connection (Cloud)";
    }
}
I haven't really worked with Windows Forms before, but in most UI-based applications you should reserve the UI thread for UI operations only and move all time-consuming work (compute or I/O) to a different thread so that the UI stays responsive.
In the case of Windows Forms, it looks like you have a BackgroundWorker class that you can use to offload the DB operations to. Here is a walkthrough in the official docs that you can refer to.
Another approach would be to use the Task class to run your database code asynchronously, with less code than the first approach. You would simply wrap the statements that take time in a Task.Run call and put the follow-up statements in a continuation task.
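As a rough sketch of that second approach (using async/await, which the compiler turns into the same kind of continuation, and reusing the AzureAgDatabase class and controls from your code):

    private async void InitCloudDatabase()
    {
        WorkingTimer.Enabled = true;
        WorkingTimer.Start();

        // Task.Run moves the blocking OpenConnection call off the UI thread,
        // so timer ticks and repaints keep being processed while it runs.
        bool success = await Task.Run(() =>
        {
            try
            {
                var db = new AzureAgDatabase();
                db.OpenConnection();
                return true;
            }
            catch
            {
                return false;
            }
        });

        // Execution resumes here on the UI thread once the task completes.
        WorkingTimer.Stop();
        pbCloudResult.Image = success ? Properties.Resources.icons8_done_96
                                      : Properties.Resources.Red_X___Fail;
    }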
I've got a C# console app running on Windows Server 2003 whose purpose is to read a table called Notifications and a field called "NotifyDateTime" and send an email when that time is reached. I have it scheduled via Task Scheduler to run hourly, check to see if the NotifyDateTime falls within that hour, and then send the notifications.
Since I have the notification date/times in the database, it seems like there should be a better way than re-running this thing every hour.
Is there a lightweight process/console app I could leave running on the server that reads in the day's notifications from the table and issues them exactly when they're due?
I thought about a service, but that seems like overkill.
My suggestion is to write a simple application that uses Quartz.NET.
Create 2 jobs:
The first fires once a day: it reads all awaiting notification times from the database that are planned for that day and creates triggers based on them.
The second, registered for those triggers (prepared by the first job), sends your notifications.
What's more,
I strongly advise you to create a Windows service for this purpose, just so you don't have a lone console application that has to run constantly. It could be accidentally terminated by someone who has access to the server under the same account. And if the server is restarted, you have to remember to start such an application again manually, while a service can be configured to start automatically.
If you're using a web application you could always host this logic e.g. within the IIS application pool process, although that is a bad idea overall. Such a process is periodically restarted by default, so you would have to change its default configuration to be sure it is still working in the middle of the night when the application is not being used; otherwise your scheduled tasks will be terminated.
UPDATE (code samples):
Manager class, internal logic for scheduling and unscheduling jobs. For safety reasons implemented as a singleton:
using System.Collections.Specialized;
using Quartz;
using Quartz.Impl;

internal class ScheduleManager
{
    private static readonly ScheduleManager _instance = new ScheduleManager();
    private readonly IScheduler _scheduler;

    private ScheduleManager()
    {
        var properties = new NameValueCollection();
        properties["quartz.scheduler.instanceName"] = "notifier";
        properties["quartz.threadPool.type"] = "Quartz.Simpl.SimpleThreadPool, Quartz";
        properties["quartz.threadPool.threadCount"] = "5";
        properties["quartz.threadPool.threadPriority"] = "Normal";

        var sf = new StdSchedulerFactory(properties);
        _scheduler = sf.GetScheduler();
        _scheduler.Start();
    }

    public static ScheduleManager Instance
    {
        get { return _instance; }
    }

    public void Schedule(IJobDetail job, ITrigger trigger)
    {
        _scheduler.ScheduleJob(job, trigger);
    }

    public void Unschedule(TriggerKey key)
    {
        _scheduler.UnscheduleJob(key);
    }
}
First job, for gathering required information from the database and scheduling notifications (second job):
internal class Setup : IJob
{
    public void Execute(IJobExecutionContext context)
    {
        try
        {
            foreach (var kvp in DbMock.ScheduleMap)
            {
                var email = kvp.Value;
                var notify = new JobDetailImpl(email, "emailgroup", typeof(Notify))
                {
                    JobDataMap = new JobDataMap {{"email", email}}
                };
                var time = new DateTimeOffset(DateTime.Parse(kvp.Key).ToUniversalTime());
                var trigger = new SimpleTriggerImpl(email, "emailtriggergroup", time);
                ScheduleManager.Instance.Schedule(notify, trigger);
            }
            Console.WriteLine("{0}: all jobs scheduled for today", DateTime.Now);
        }
        catch (Exception e) { /* log error */ }
    }
}
Second job, for sending emails:
internal class Notify : IJob
{
    public void Execute(IJobExecutionContext context)
    {
        try
        {
            var email = context.MergedJobDataMap.GetString("email");
            SendEmail(email);
            // the trigger was created in the "emailtriggergroup" group, so the key must include it
            ScheduleManager.Instance.Unschedule(new TriggerKey(email, "emailtriggergroup"));
        }
        catch (Exception e) { /* log error */ }
    }

    private void SendEmail(string email)
    {
        Console.WriteLine("{0}: sending email to {1}...", DateTime.Now, email);
    }
}
Database mock, just for purposes of this particular example:
internal class DbMock
{
    public static IDictionary<string, string> ScheduleMap =
        new Dictionary<string, string>
        {
            {"00:01", "foo@gmail.com"},
            {"00:02", "bar@yahoo.com"}
        };
}
Main entry point of the application:
public class Program
{
    public static void Main()
    {
        FireStarter.Execute();
    }
}

public class FireStarter
{
    public static void Execute()
    {
        var setup = new JobDetailImpl("setup", "setupgroup", typeof(Setup));
        var midnight = new CronTriggerImpl("setuptrigger", "setuptriggergroup",
                                           "setup", "setupgroup",
                                           DateTime.UtcNow, null, "0 0 0 * * ?");
        ScheduleManager.Instance.Schedule(setup, midnight);
    }
}
If you're going to use a service, just put this main logic in the OnStart method (I advise starting the actual logic in a separate thread so the service doesn't wait for it during startup, which also avoids possible timeouts - not an issue in this particular example obviously, but in general):
protected override void OnStart(string[] args)
{
    try
    {
        var thread = new Thread(x => WatchThread(new ThreadStart(FireStarter.Execute)));
        thread.Start();
    }
    catch (Exception e) { /* log error */ }
}
If so, encapsulate the logic in a wrapper, e.g. WatchThread, which will catch any errors from the thread:
private void WatchThread(object pointer)
{
    try
    {
        ((Delegate) pointer).DynamicInvoke();
    }
    catch (Exception e) { /* log error and stop service */ }
}
You're trying to implement a polling approach, where a job monitors a record in the DB for changes.
In this case you are hitting the DB periodically, so if the one-hour delay is later reduced to one minute, this solution becomes a performance bottleneck.
Approach 1
For this scenario, please use a queue-based approach to avoid these issues; you can also scale up the number of instances if you are sending a lot of emails.
I understand there is a program that updates NotifyDateTime in a table; the same program can push a message to the queue announcing that there is a notification to handle.
A Windows service watches this queue for incoming messages; when there is a message, it performs the required operation (i.e. sending the email).
Approach 2
http://msdn.microsoft.com/en-us/library/vstudio/zxsa8hkf(v=vs.100).aspx
You can also invoke C# code from a SQL Server stored procedure if you are using MS SQL Server. But in this case you are using your SQL Server process to send mail, which is not good practice.
However, you could instead invoke a web service or WCF service that sends the emails.
But Approach 1 is robust, scalable, trackable and asynchronous, and it doesn't burden your database or app; you have a separate process to send the email.
Queues
Use MSMQ, which is part of Windows Server.
You can also try https://www.rabbitmq.com/dotnet.html
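As a rough sketch of the Windows-service side of Approach 1, here is a minimal MSMQ consumer; the queue path and the SendEmail helper are illustrative assumptions, not taken from the question:

    using System;
    using System.Messaging;

    class NotificationQueueListener
    {
        static void Main()
        {
            // The producer (the program that updates NotifyDateTime) sends the
            // recipient address to this queue whenever a notification is due.
            var queue = new MessageQueue(@".\private$\notifications");
            queue.Formatter = new XmlMessageFormatter(new[] { typeof(string) });

            while (true)
            {
                Message message = queue.Receive();   // blocks until a message arrives
                string email = (string)message.Body;
                SendEmail(email);
            }
        }

        static void SendEmail(string email)
        {
            Console.WriteLine("sending email to {0}...", email);
        }
    }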
Pre-scheduled tasks (at undefined times) are generally a pain to handle, as opposed to scheduled tasks where Quartz.NET seems well suited.
Furthermore, another distinction is to be made between fire-and-forget for tasks that shouldn't be interrupted/change (ex. retries, notifications) and tasks that need to be actively managed (ex. campaign or communications).
For the fire-and-forget type tasks a message queue is well suited. If the destination is unreliable, you will have to opt for retry levels (ex. try to send (max twice), retry after 5 minutes, try to send (max twice), retry after 15 minutes), which at least requires specifying message-specific TTLs with a send queue and a retry queue. Here's an explanation with a link to code for setting up a retry-level queue.
The managed pre-scheduled tasks will require a database queue approach (click here for a CodeProject article on designing a database queue for scheduled tasks). This will allow you to update, remove or reschedule notifications, given that you keep track of ownership identifiers (ex. specify a user id and you can delete all pending notifications when the user should no longer receive them, such as being deceased/unsubscribed).
Scheduled e-mail tasks (including any communication tasks) require finer grained control (expiration, retry and time-out mechanisms). The best approach to take here is to build a state machine that is able to process the e-mail task through its steps (expiration, pre-validation, pre-mailing steps such as templating, inlining css, making links absolute, adding tracking objects for open tracking, shortening links for click tracking, post-validation and sending and retrying).
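A minimal sketch of such a state machine; the states, the EmailTask shape and the retry limit are illustrative assumptions only:

    public enum EmailState { Pending, Prepared, Sent, Failed, Expired }

    public class EmailTask
    {
        public string To { get; set; }
        public EmailState State { get; set; }
        public int Attempts { get; set; }
        public DateTime ExpiresUtc { get; set; }
    }

    public class EmailStateMachine
    {
        // Advance the task one step; call repeatedly until it reaches a final state.
        public void Step(EmailTask task)
        {
            switch (task.State)
            {
                case EmailState.Pending:
                    // expiration check and pre-validation
                    task.State = DateTime.UtcNow > task.ExpiresUtc
                        ? EmailState.Expired
                        : EmailState.Prepared;
                    break;
                case EmailState.Prepared:
                    // templating, CSS inlining and link tracking would happen before
                    // this point; here we only attempt the send with bounded retries
                    task.Attempts++;
                    if (TrySend(task))
                        task.State = EmailState.Sent;
                    else if (task.Attempts >= 3)
                        task.State = EmailState.Failed;
                    break;
            }
        }

        private bool TrySend(EmailTask task)
        {
            /* call your e-mail provider here */
            return true;
        }
    }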
Hopefully you are aware that the .NET SmtpClient isn't fully compliant with the MIME specifications and that you should be using a SaaS e-mail provider such as Amazon SES, Mandrill, Mailgun, Customer.io or Sendgrid. I'd suggest you look at Mandrill or Mailgun. Also, if you have some time, take a look at MimeKit, which you can use to construct MIME messages for providers that allow sending raw e-mail and don't necessarily support things like attachments/custom headers/DKIM signing.
I hope this sets you on the right path.
Edit
You will have to use a service to poll at specific intervals (ex. 15 seconds or 1 minute). The database load can be somewhat reduced by checking out a certain number of due tasks at a time and keeping an internal pool of messages due for sending (with a time-out mechanism in place). When no messages are returned, just 'sleep' the polling for a while. I'd advise against building such a system against a single table in a database - instead, design an independent e-mail scheduling system that you can integrate with.
I would turn it into a service instead.
You can use a System.Threading.Timer with a callback for each of the scheduled times.
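A minimal sketch of that, assuming notifyTime comes from your Notifications table and SendNotification is your own method; note that you must keep the returned reference alive or the timer will be garbage collected:

    // one timer per pending notification, fired once at its scheduled time
    TimeSpan dueTime = notifyTime - DateTime.Now;
    if (dueTime < TimeSpan.Zero) dueTime = TimeSpan.Zero;

    var timer = new System.Threading.Timer(
        _ => SendNotification(),                  // runs on a thread-pool thread
        null,
        dueTime,
        TimeSpan.FromMilliseconds(System.Threading.Timeout.Infinite));  // fire once, no repeat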
Scheduled tasks can be scheduled to run just once at a specific time (as opposed to hourly, daily, etc.), so one option would be to create the scheduled task when the specific field in your database changes.
You don't mention which database you use, but some databases support the notion of a trigger, e.g. in SQL: http://technet.microsoft.com/en-us/library/ms189799.aspx
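If you want to create that one-shot task from code when the row changes, one option is to shell out to schtasks.exe. This is only a sketch: the task name, the program path and the date/time string formats (which are locale-sensitive) are assumptions.

    using System;
    using System.Diagnostics;

    static void ScheduleOneShotTask(DateTime runAt)
    {
        // /SC ONCE creates a task that runs a single time at the given date/time
        string args = "/Create /F /TN \"SendNotification\" " +
                      "/TR \"C:\\Jobs\\SendNotification.exe\" " +
                      "/SC ONCE /ST " + runAt.ToString("HH:mm") +
                      " /SD " + runAt.ToString("MM/dd/yyyy");   // adjust the date format to the server locale
        Process.Start("schtasks", args);
    }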
If you know ahead of time when the emails need to be sent, then I suggest you wait on an event handle with an appropriate timeout. At midnight, look at the table, then wait on the event handle with the timeout set to expire when the next email needs to be sent. After sending the email, wait again with the timeout based on the next mail that should be sent.
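A rough sketch of that loop, assuming GetNextNotifyTime() and SendDueEmails() are your own methods; the event handle also gives you a clean way to wake the loop early when the service stops:

    var stopSignal = new ManualResetEvent(false);   // signal this on service stop

    while (true)
    {
        DateTime? next = GetNextNotifyTime();       // read the next NotifyDateTime from the table
        TimeSpan timeout = next.HasValue
            ? next.Value - DateTime.Now
            : TimeSpan.FromHours(1);                // nothing due: re-check later
        if (timeout < TimeSpan.Zero) timeout = TimeSpan.Zero;

        // WaitOne returns true if the handle was signalled (stop requested)
        // and false if the timeout expired (time to send the next email).
        if (stopSignal.WaitOne(timeout))
            break;

        if (next.HasValue)
            SendDueEmails();
    }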
Also, based on your description, this should probably be implemented as a service but it is not required.
I dealt with the same problem about three years ago. I changed the process several times before it was good enough; let me tell you why:
The first implementation used a special daemon from the web host which called an IIS website. The website checked the caller's IP, then checked the database and sent the emails. This worked until one day I received a lot of very angry emails from users whose mailboxes I had completely spammed. The drawback of keeping emails in a database and sending them over SMTP is that there is NOTHING that ensures a DB-to-SMTP transaction. You are never sure whether the email was successfully sent or not. Sending can succeed, fail, or be a false positive or a false negative (the SMTP client tells you the email was not sent, but it was). There was a problem with the SMTP server and it returned false (email not sent), but the email was actually sent. The daemon kept resending the email every hour, all day, before the angry emails appeared.
Second implementation: to prevent spamming, I changed the algorithm so that the email is considered sent even if sending failed (my email notifications were not too important). My first piece of advice is: "Don't launch the daemon too often, because these false-negative SMTP errors make users upset."
After several months there were some changes on the server and the daemon stopped working well. I got an idea from Stack Overflow: bind a .NET timer to the web application domain. It wasn't a good idea, because it seems IIS can restart the application from time to time because of memory leaks, and the timer never fires if the restarts are more frequent than the timer ticks.
The last implementation: the Windows scheduler fires a Python batch script every hour which requests a local website, and that fires the ASP.NET code. The advantage is that the Windows scheduler calls the local batch and the website reliably. IIS doesn't hang; it can be restarted. The timer site is part of my website, so it is still one project (you could use a console app instead). Simple is better. It just works!
Your first choice is the correct option in my opinion. Task Scheduler is the MS-recommended way to perform periodic jobs. Moreover it's flexible, can report failures to ops, and is optimized and amortized amongst all tasks in the system, ...
Creating any console-type app that runs all the time is fragile. It can be shut down by anyone, needs an open session, doesn't restart automatically, ...
The other option is creating some kind of service. It's guaranteed to be running all the time, so that would at least work. But what was your motivation?
"It seems like because I have the notification date/times in the database that there should be a better way than re-running this thing every hour."
Oh yeah, optimization... So you want to add a new, permanently running service to your computer just to avoid one potentially unneeded SQL query every hour? The cure looks worse than the disease to me.
And I haven't even mentioned all the drawbacks of a service. On one hand, your scheduled task uses no resources when it isn't running. It's very simple and lightweight, and the query is efficient (provided you have the right index).
On the other hand, if your service crashes it's probably gone for good. It needs a way to be notified of new e-mails that may need to be sent earlier than what's currently scheduled. It permanently uses computer resources, such as memory. Worse, it may contain memory leaks.
I think that the cost/benefit ratio is very low for any solution other than the trivial periodic task.
I am working on an assignment in ASP.NET to send notification emails to users at specific intervals.
But the problem is that since the server is not privately owned, I cannot implement a Windows service on it.
Any ideas?
There's no reliable way to achieve that. If you cannot install a Windows service on the host, you could write an endpoint (.aspx or .ashx) that sends the email, and then purchase, from some other site, a service that pings this endpoint at regular intervals with an HTTP request. Obviously you should configure the endpoint to be accessible only from the IP address of the provider you purchase the service from; otherwise anyone could send an HTTP request to the endpoint and trigger the process, which is probably undesirable.
Further reading: The Dangers of Implementing Recurring Background Tasks In ASP.NET.
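A minimal sketch of such an endpoint as an .ashx handler; the allowed IP and the SendPendingNotifications helper are assumptions for illustration:

    using System.Web;

    public class SendNotifications : IHttpHandler
    {
        private const string AllowedIp = "203.0.113.10";   // the ping provider's IP

        public void ProcessRequest(HttpContext context)
        {
            if (context.Request.UserHostAddress != AllowedIp)
            {
                context.Response.StatusCode = 403;          // reject everyone else
                return;
            }

            SendPendingNotifications();                      // read the table, send due emails
            context.Response.Write("OK");
        }

        public bool IsReusable
        {
            get { return true; }
        }

        private void SendPendingNotifications() { /* ... */ }
    }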
There are several ways to get code executing on an interval that don't require a Windows service.
One option is to use the Cache class - use one of the Insert overloads that takes a CacheItemRemovedCallback; this will be called when the cache item is removed, and you can re-add the cache item from that callback, again and again...
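A minimal sketch of that trick, assuming a DoWork() method of your own; the key name and the interval are arbitrary:

    using System;
    using System.Web;
    using System.Web.Caching;

    public static class CacheTimer
    {
        private const string Key = "__emailJob";

        public static void Register()
        {
            HttpRuntime.Cache.Insert(
                Key, Key, null,
                DateTime.Now.AddMinutes(20),        // item expires (roughly) every 20 minutes
                Cache.NoSlidingExpiration,
                CacheItemPriority.NotRemovable,
                OnRemoved);
        }

        private static void OnRemoved(string key, object value, CacheItemRemovedReason reason)
        {
            DoWork();        // send the due notifications
            Register();      // re-add the item so the callback keeps firing
        }

        private static void DoWork() { /* ... */ }
    }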
Though, the first thing you need to do is contact the hosting company and find out if they already have some sort of solution for you.
You could set up a scheduled task on the server to invoke a program with the desired action.
You can always use a System.Timers.Timer and run a callback at specific intervals. What you need to be careful about is that the timer must be set up only once, e.g. on application start; if you have more than one worker process in the pool it may run more than once, and the callback also needs to read from the database to find the actions that are due.
using System.Timers;

var oTimer = new Timer();
oTimer.Interval = 30000; // 30 seconds
oTimer.Elapsed += new ElapsedEventHandler(MyThreadFun);
oTimer.Start();

private static void MyThreadFun(object sender, ElapsedEventArgs e)
{
    // inside here you query the database,
    // get the next emails that must be sent,
    // send them, mark them as sent, log any errors, and you're done.
}
Why I chose the System.Timers.Timer:
http://msdn.microsoft.com/en-us/magazine/cc164015.aspx
A few more notes:
I use this in a more complex class and it works fine. Additional points I have also taken care of:
Signal the application stop and wait for the timer to finish.
Use a mutex and the database to synchronize the work.
The easiest solution is to exploit the global.asax application events.
In the application startup event, create a thread (or task) and store it in a static singleton variable in the global class.
The thread/task/work item will have an endless loop while(true) {...} with your "service-like" code inside.
You'll also want to put a Thread.Sleep(60000) in the loop so it doesn't eat unnecessary CPU cycles.
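A minimal sketch of that startup wiring, assuming the FakeService method shown below:

    using System;
    using System.Threading;

    public class Global : System.Web.HttpApplication
    {
        private static Thread _fakeServiceThread;   // static reference so it isn't collected

        protected void Application_Start(object sender, EventArgs e)
        {
            // start the "service" loop once, on a background thread
            _fakeServiceThread = new Thread(FakeService) { IsBackground = true };
            _fakeServiceThread.Start();
        }

        // FakeService (shown below) is a static method of this class
    }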
static void FakeService(object obj)
{
    while (true)
    {
        try
        {
            // - get a list of users to send emails to
            // - check the current time and compare it to the interval to send a new email
            // - send emails
            // - update the last_email_sent time for the users
        }
        catch (Exception ex)
        {
            // - log any exceptions
            // - choose to keep the loop (fake service) running or end it (return)
        }
        Thread.Sleep(60000); //run the code in this loop every ~60 seconds
    }
}
EDIT: Because your task is more or less a simple timer job, the ACID-type concerns around an app pool reset or other error don't really apply; it can just start up again and keep trucking along without any data corruption. But you could also use the thread to simply execute a request to an .aspx or .ashx that holds your logic:
new WebClient().DownloadString("http://localhost/EmailJob.aspx");
I use ASP.NET MVC 3 to create a web page, and I need to change something in the database every 20 minutes...
I set up a Timer in my Global.asax.cs file. Here is the code:
protected void Application_Start()
{
    AreaRegistration.RegisterAllAreas();
    RegisterGlobalFilters(GlobalFilters.Filters);
    RegisterRoutes(RouteTable.Routes);
    Unit = new UnitOfWork();

    System.Timers.Timer timer = new System.Timers.Timer();
    timer.Interval = 1200000; //20 minutes
    timer.Elapsed += new System.Timers.ElapsedEventHandler(Elapsed);
    timer.Start();
}

void Elapsed(object sender, System.Timers.ElapsedEventArgs e)
{
    Unit.Srvc.UpdateUserActivity();
}
I ran it today and, what a pity, it works only once... After 20 minutes it changes the database, and that's all.
P.S. Yesterday I tested it with a 20-second interval and it worked fine. But today it doesn't want to work correctly with a 20-minute interval. Thank you for your help.
P.S.2 I used a stored procedure to update the database.
P.S.3 Just now I noticed that it works randomly :D At 5:32 AM I ran the program... It worked at 5:52 AM, didn't work at 6:12 AM, and works now (it is now 6:49 AM; I don't know when it worked in between).
The most likely cause is that your AppDomain is shutting down due to inactivity, which means the entire application is not running. The idle timeout is 20 minutes of inactivity, I think.
See this question:
How to keep ASP.NET assemblies in AppDomain alive?
To me it looks like your timer is being killed by garbage collection, since you are not keeping a reference to it after it goes out of scope in Application_Start. Try adding:
Application["Whatever"] = timer;
You might be finding that your thread (from the IIS AppPool) is being recycled or shut down.
Web applications typically work best when used for request-response processing rather than this type of behaviour. It's not clear what you are up to, but assuming you are using SQL Server perhaps you could look at maintenance tasks or triggers if it involves denormalizing data (i.e. rolling up calculated data). If it involves data collected during the request-response process then perhaps you might look at using the web cache and some cache expiration operations for the delayed persistence.
My guess is also that the session simply expires, but I would like to add a little extra.
Reading the code, I guess (again) that you are marking the user in the database as 'not active' or disconnected or something like that. If so, do not use a timer for this; instead, set the session expiration (when the user hasn't sent any requests for a certain period) to the required duration, and put the code you want to run when that happens in the Session_OnEnd handler.
I use the following code in an ASP.NET website.
On application init I call InitializeTimer() once.
The goal of the code was to run DoWork() once every hour (one time per hour).
I also wanted the code to execute at a slightly different time each loop, so I added the random part.
The result I got was weird, and I cannot find an explanation for why it happens.
The code executed the function after 2 hours, then again after 2 hours, then after 3 hours, then after 2 hours, and after 2 hours again.
Can anybody explain the reason?
using System.Timers;
....
private static Random random = new Random();
....
public static void InitializeTimer()
{
    tTimer = new Timer();
    tTimer.AutoReset = true;
    tTimer.Interval = TimeSpan.FromHours(1.0).TotalMilliseconds;
    tTimer.Elapsed += new ElapsedEventHandler(tTimer_Elapsed);
    tTimer.Start();
}

private static void tTimer_Elapsed(object sender, ElapsedEventArgs e)
{
    tTimer.Interval += random.Next(-5, 5);
    DoWork();
}
Update:
Please don't post "use a Windows service" or "use a scheduled task".
My question is about this code; I'm not looking for better alternatives.
Also, during this test (10 hrs) the website had high traffic, and the IIS pool did not restart!
Based on the following MSDN: (http://msdn.microsoft.com/en-us/library/system.timers.timer.interval.aspx)
If the interval is set after the Timer has started, the count is reset. For example, if you set the interval to 5 seconds and then set the Enabled property to true, the count starts at the time Enabled is set. If you reset the interval to 10 seconds when the count is 3 seconds, the Elapsed event is raised for the first time 13 seconds after Enabled was set to true.
Is it possible that resetting the interval in the Elapsed handler is the cause of the problem?
Meaning that when the tTimer_Elapsed function is called, the count is at 1 hour (minus a few milliseconds), and my code "tTimer.Interval += random.Next(-5, 5);" is effectively adding another full hour to the interval?
ASP.NET applications get shut down when not in use. If someone hits your site and then there are no more hits, it can get shut down, and your timer won't fire.
For this type of maintenance work you want to use a Windows scheduled task or a Windows service.
Check this out... Jeff Atwood actually discussed something similar. I guess it worked, but according to Jeff the site outgrew this method so they went to a dedicated task.
Since .NET 4.5.2 there is a class called HostingEnvironment that can do what you're asking. Here is how to use it:
https://blog.mariusschulz.com/2014/05/07/scheduling-background-jobs-from-an-asp-net-application-in-net-4-5-2
The HostingEnvironment.QueueBackgroundWorkItem method lets you schedule small background work items. ASP.NET tracks these items and prevents IIS from abruptly terminating the worker process until all background work items have completed.
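A minimal sketch of how that could look; SendPendingNotificationsAsync is an assumed method of your own, and note that this only protects work you have already queued - it is not a scheduler by itself:

    using System;
    using System.Threading;
    using System.Threading.Tasks;
    using System.Web.Hosting;

    public static class NotificationJob
    {
        public static void Enqueue()
        {
            // ASP.NET tracks this work item and delays worker-process shutdown
            // (within a grace period) until it has completed; you still need
            // something (e.g. a request) to call Enqueue().
            Func<CancellationToken, Task> workItem = SendPendingNotificationsAsync;
            HostingEnvironment.QueueBackgroundWorkItem(workItem);
        }

        private static async Task SendPendingNotificationsAsync(CancellationToken cancellationToken)
        {
            // read due notifications from the database and send them here
            await Task.Delay(TimeSpan.FromSeconds(1), cancellationToken);
        }
    }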
I second Sam's suggestion of using a Windows scheduled task to hit a page every hour. I tried and tried to get mine to work and it only sort of worked. I switched to a scheduled task and it has never failed.