I have an ASP.NET MVC application and I need to send an email to each user "X" minutes after he leaves the page (the time is different for each user).
How can I do it?
HTTP is stateless, and by the time the response is sent, execution of the page is finished. You need an application that keeps sending mail even when the website is not accessed by anybody for a significant interval. You can put the emails that need to be sent after an interval of time into a database. A separate application, such as a Windows service, could then poll the database at a fixed interval, say every 30 seconds, and send the mails that have reached their send time.
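A rough sketch of the loop such a service could run (the EmailQueue table, its columns, and the SMTP host are all hypothetical):

using System;
using System.Collections.Generic;
using System.Data.SqlClient;
using System.Net.Mail;
using System.Threading;

private static void PollLoop(string connectionString)
{
    while (true)
    {
        var due = new List<(int Id, string To, string Subject, string Body)>();
        using (var conn = new SqlConnection(connectionString))
        {
            conn.Open();
            // Pick up every queued mail whose send time has arrived.
            var cmd = new SqlCommand(
                "SELECT Id, Recipient, Subject, Body FROM EmailQueue " +
                "WHERE Sent = 0 AND SendAtUtc <= SYSUTCDATETIME()", conn);
            using (var reader = cmd.ExecuteReader())
                while (reader.Read())
                    due.Add((reader.GetInt32(0), reader.GetString(1),
                             reader.GetString(2), reader.GetString(3)));

            using (var smtp = new SmtpClient("smtp.example.com")) // hypothetical host
                foreach (var mail in due)
                {
                    smtp.Send("noreply@example.com", mail.To, mail.Subject, mail.Body);
                    new SqlCommand("UPDATE EmailQueue SET Sent = 1 WHERE Id = " + mail.Id, conn)
                        .ExecuteNonQuery(); // mark as sent so it is only sent once
                }
        }
        Thread.Sleep(TimeSpan.FromSeconds(30)); // the fixed polling interval
    }
}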
The solution I would choose depends on the needed scale and reliability of the system you're building.
If it's a low-scale system (i.e. one server without too many simultaneous users) and it's not mission-critical (i.e. it's OK if from time to time some emails are not actually sent, for example if your server crashes), then the solution can be as simple as managing a queue in memory, with a thread that wakes periodically to send emails to the users who recently left the page.
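A minimal in-memory sketch of that idea (names are illustrative; pending emails are lost if the app pool recycles or the server restarts):

using System;
using System.Collections.Generic;
using System.Linq;
using System.Threading;

public static class EmailScheduler
{
    private class Pending { public string Address; public DateTime DueUtc; }

    private static readonly List<Pending> Items = new List<Pending>();
    // Wake every 30 seconds and send whatever has come due.
    private static readonly Timer Timer =
        new Timer(_ => SendDue(), null, TimeSpan.Zero, TimeSpan.FromSeconds(30));

    public static void Schedule(string address, TimeSpan delay)
    {
        lock (Items)
            Items.Add(new Pending { Address = address, DueUtc = DateTime.UtcNow + delay });
    }

    private static void SendDue()
    {
        var now = DateTime.UtcNow;
        List<Pending> due;
        lock (Items)
        {
            due = Items.Where(p => p.DueUtc <= now).ToList();
            Items.RemoveAll(p => p.DueUtc <= now);
        }
        foreach (var p in due)
            SendEmail(p.Address); // hypothetical helper wrapping your SMTP code
    }

    private static void SendEmail(string address) { /* SmtpClient.Send(...) */ }
}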
If you need to build something that would be very reliable and potentially have to send a very large number of emails in a short time, and if your system has to scale to a lot of machines, then you would want to build a solution based on a queue in some storage, where as many machines as needed would pick items and handle them. An API such as Windows Azure Queue Service can be a good fit for this if you need a really high scale and reliability.
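For the storage-queue flavour, a hedged sketch with the classic Azure Storage SDK (Microsoft.WindowsAzure.Storage; the queue name and message format are illustrative):

using System;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Queue;

var account = CloudStorageAccount.Parse(storageConnectionString);
var queue = account.CreateCloudQueueClient().GetQueueReference("pending-emails");
queue.CreateIfNotExists();

// Producer (web server): the initial visibility delay keeps the message
// invisible to consumers until the email is actually due.
queue.AddMessage(new CloudQueueMessage("user@example.com"),
                 initialVisibilityDelay: TimeSpan.FromMinutes(5));

// Consumer (any number of worker machines):
var msg = queue.GetMessage();
if (msg != null)
{
    SendEmail(msg.AsString);  // hypothetical helper wrapping your SMTP code
    queue.DeleteMessage(msg); // only delete after a successful send
}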
I'm making an MVC app with the .NET Framework and in one of my controllers I call an async task that sends an e-mail to the signed in user.
This task is invoked when the user ticks a specific checkbox, and the e-mail is meant to work as a sort of reminder.
The entire task works as intended (the user gets an e-mail when the checkbox is checked), but I need it to wait 24 hours before actually sending the e-mail, as it is a reminder.
Currently the e-mail is sent right away. How can I delay the completion of my "e-mail task" while the rest of the code continues?
Use a library like Hangfire which lets you schedule background jobs and backs them with persistent storage.
You can then easily schedule a job like:
BackgroundJob.Schedule(
    () => SendEmail("user@domain"),
    TimeSpan.FromDays(1));
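Note that Hangfire also needs storage and a server process configured before scheduled jobs will fire; a minimal sketch for an OWIN-hosted MVC app (the connection string name is illustrative):

using Hangfire;
using Owin;

public class Startup
{
    public void Configuration(IAppBuilder app)
    {
        // Jobs survive app pool recycles because they live in SQL Server.
        GlobalConfiguration.Configuration
            .UseSqlServerStorage("HangfireConnection");
        app.UseHangfireDashboard();
        app.UseHangfireServer();
    }
}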
This is a classic XY problem. While it may be possible to make your system wait 24 hours, you are creating a very fragile system that can be affected by app pool resets and server reboots.
Putting aside the possibility of an unexpected reboot, what happens when your maintenance cycle comes around and a scheduled reboot is going to happen? How many queued email reminders will you have that you can't do anything with?
The best approach for systems that don't immediately use their data is to buffer it through some form of storage scheme. It could be as simple as writing queued emails to files on the system, or something more robust like a database with a dedicated email sending service.
I have used a LOT of email sending systems over the years, and even for immediate sends we have used a database intermediary, with one dedicated email sending Windows service to produce and send the actual email. By centralizing the email production you not only get one place to maintain your email sending code, but you can also increase the durability of the whole system.
Bonus points if your database is part of a high-availability cluster, as in this kind of system the database becomes the critical point. If it is, then you're protected from any form of downtime other than a total network outage.
Let the Task wait for 24 hours before sending the mail:
await Task.Delay(TimeSpan.FromHours(24));
Add this line in your async method before sending the email.
I'm currently developing a website in ASP.NET Core 2.2. This site uses an external API, but I have one big problem and don't know how to solve it: the external API has a limit of 10 requests per IP per second. If 11 users click a button on my site and call the API at the same time, the API can cut me off for a couple of hours. The API owner tells clients to take care not to exceed the limit. Do you have any idea how to do this?
P.S. Of course, a million users is a joke, but I want the site to be publicly available :)
That 10 requests/s is a hard limit, and it seems like there's no way around it, so you have to solve it on your end.
There are a couple of options:
Call the API directly from JavaScript. This way each user can make 10 requests/s, instead of 10 requests/s shared across all your users (recommended).
Queue the requests and send out at most 10/s (highly not recommended: it ties up your thread pool and can block everyone from accessing your site whenever requests come in faster than they go out).
Drop the request on the server side when you are reaching the 10/s limit and have the client retry later (the wait time grows without bound when requests come in faster than they go out).
And depending on the content returned by the API, you might be able to cache it on the server side to avoid having to request it from the third party again.
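For instance, a minimal caching sketch with IMemoryCache (register services.AddMemoryCache() in Startup; the cache key and the 30-second lifetime are illustrative):

using System;
using System.Threading.Tasks;
using Microsoft.Extensions.Caching.Memory;

public class ApiProxy
{
    private readonly IMemoryCache _cache;
    public ApiProxy(IMemoryCache cache) => _cache = cache;

    public Task<string> GetDataAsync(string id) =>
        _cache.GetOrCreateAsync("api:" + id, entry =>
        {
            // Expire after 30 seconds so stale data is bounded.
            entry.AbsoluteExpirationRelativeToNow = TimeSpan.FromSeconds(30);
            return CallExternalApiAsync(id);
        });

    // Hypothetical stand-in for your real third-party call.
    private Task<string> CallExternalApiAsync(string id) =>
        Task.FromResult("cached payload for " + id);
}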
In this scenario you would need to account for the possibility that you can't process requests in real time. You wouldn't want to have thousands of requests waiting on access to a resource that you don't control.
I second the answer about calling the API from the client, if that's an option.
Another option is to keep a counter of current requests, limit it to ten, and return a 503 error if a request comes in that exceeds that capacity. That's practical if you really don't expect to exceed ten concurrent requests often or ever but want to be sure that in the odd chance that it happens it doesn't shut down this feature of your site.
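A sketch of that counter as ASP.NET Core middleware (the limit of ten comes from the API's rule; register it with app.UseMiddleware<ThrottleMiddleware>()):

using System.Threading;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Http;

public class ThrottleMiddleware
{
    private const int Limit = 10;
    private static int _inFlight;
    private readonly RequestDelegate _next;

    public ThrottleMiddleware(RequestDelegate next) => _next = next;

    public async Task InvokeAsync(HttpContext context)
    {
        if (Interlocked.Increment(ref _inFlight) > Limit)
        {
            Interlocked.Decrement(ref _inFlight);
            context.Response.StatusCode = 503; // tell the client to retry later
            return;
        }
        try { await _next(context); }
        finally { Interlocked.Decrement(ref _inFlight); }
    }
}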
If you actually expect large volumes where you would exceed ten concurrent requests then you would need to queue the requests, but do it in a process separate from your web application. As mentioned, if you have tons of requests waiting for the same resource your application will become overloaded. You could enqueue the request with an entirely different process, and then the client would have to poll your application with occasional requests to see if there's a response.
The big flaw in this last scenario is that it means your users could end up waiting a long time because your application depends on a finite resource that you cannot scale. You can manage it in a way that keeps your application from failing, but not in a way that makes it respond quickly.
I've got a simple project that uses the M2Mqtt client library to connect to a HiveMQ broker. When a message arrives, an event fires; the trouble is that I can receive up to 100 messages per second, but the program is only able to process about 20 messages per second.
HiveMQClient.MqttMsgPublishReceived += HiveMQClient_MqttMsgPublishReceived;
So I have all the HiveMQ logs and telemetry, and I can clearly see that the messages arrive in my application at the right rate (100 per second), but the strange thing is that the CPU of the PC where the client program is hosted runs at 10% of its capacity.
I was wondering if I need to "multi thread the event" or there is something that I'm missing.
Thank you all
EDIT
Inside the MqttMsgPublishReceived event I've got a ThreadPool that stores the messages I receive in an Azure blob account. After some review I understood that this is the problem (thanks @Hans Kilian).
Now I've got an Azure blob storage account (standard configuration) that accepts only 30 calls per second. I tried to upgrade to the premium tier, but it is only for virtual machine VHD images.
Does anybody know how to improve these numbers?
The MqttMsgPublishReceived callback runs on the client's network thread; if you care about performance, you should not be doing any real work in this callback.
For high performance applications the model is normally to use the MqttMsgPublishReceived method to place the incoming message in a local queue in the client and have a thread pool consuming messages from that queue.
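A sketch of that pattern with M2Mqtt, where the callback only enqueues and a few worker tasks do the real processing (the worker count of four and the Store helper are illustrative):

using System.Collections.Concurrent;
using System.Threading.Tasks;
using uPLibrary.Networking.M2Mqtt.Messages;

var queue = new BlockingCollection<MqttMsgPublishEventArgs>();

// The callback now returns almost immediately, so the network thread is free
// to receive the next message (and complete the QOS handshake).
HiveMQClient.MqttMsgPublishReceived += (sender, e) => queue.Add(e);

for (int i = 0; i < 4; i++)
{
    Task.Run(() =>
    {
        foreach (var msg in queue.GetConsumingEnumerable())
            Store(msg.Topic, msg.Message); // the slow work, e.g. the blob upload
    });
}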
This becomes even more important when using QOS 1 or 2 messages, as the broker will not send the next message until the MqttMsgPublishReceived handler has returned and the QOS handshake completes.
As @Hans Kilian says in the comments, things like databases can also be a bottleneck, but a thread pool combined with a database connection pool can help, as it makes sure you are not building and tearing down a connection to the database for each message.
In your opinion (hopefully one formed from fact rather than emotion), what is the better way to send out email notifications from a website?
For example, say User A on your site requests a friendship with User B, at which point you would generate an email to send to User B.
The question is - when is the best time to send the email? Immediately, as part of the same execution path, or scheduling the email as part of a batch?
Like I said, my question is rather generalized, so you can assume different architectures - one server dedicated to hosting, another dedicated to emailing, a single server, cloud hosting, etc... I'm curious about all answers, really.
As I see it:
With immediate emails, you get timely emails, but you can potentially bog down your server by sending too many emails should your website receive a lot of traffic. That being said, because you're not sending a batch of emails, they are all one-offs.
If you batch your emails and have a scheduled task or cron job pick them up and send them, your emails are not as immediate, so assume you decrease the interval so that batches are sent every minute. The issue, as I see it, is concurrency: if another batch kicks off before the first one completes, you risk sending duplicate emails unless you appropriately flag or lock what you're sending.
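(For illustration, one way to flag safely, assuming the queue lives in a SQL Server table with a Status column: claim a batch atomically before sending, so two overlapping runs can never pick up the same rows.)

using System.Data.SqlClient;

var claim = new SqlCommand(
    @"UPDATE TOP (100) EmailQueue
      SET Status = 'Sending'
      OUTPUT inserted.Id, inserted.Recipient, inserted.Body
      WHERE Status = 'Queued'", conn);
// Each run only ever sends the rows returned by its own claim.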
In my personal experience, when I've had emails sent off immediately on a high traffic site, performance wasn't impacted too much, though a number of emails failed to send out.
Thoughts?
I would say definitely schedule them. There has to be some tolerance between a user's request and the server acting on it: if someone can accept a friendship, they can presumably (I hope) also refuse a friendship with the same person. If so, what happens if I click accept and refuse in rapid succession on your website?
You have 2 options in this case, imo:
like SO does, add some throttling to user clicks (you cannot accept and refuse within 2 seconds)
or you can allow it, but in that case the final message to the person whose friendship was accepted/requested is scheduled on the server and sent after, say, 30 minutes (or less; it's a matter of architectural choice)
Hope this helps.
I'm having an issue sending large volumes of emails out from an ASP.Net application. I won't post the code, but instead explain what's going on. The code should send emails to 4000 recipients but seems to stall at 385/387.
The code creates the content for the email in a string.
It then selects a list of email address to send to.
Looping through the data via a datareader it picks out the email address and sends an email.
The email sending is done by a separate method which can handle failures and returns its outcome.
As each record is sent I produce an XML node in an XML document to log each specific attempt to send.
The loop seems to end prematurely and the XML document is saved to disk.
Now I know the code works. I have run it locally using the same SMTP machine and it worked fine with 500 records. Granted there was less content, but I can't see how that would make any difference.
I don't think the page itself times out, but even if it did, I was sure .NET would continue processing the page, even if the user saw a page timeout error.
Any suggestions appreciated, because I'm pretty stumped.
You're sending lots of emails. During the span of a single request? IIS will kill a request if it takes longer than a certain (configurable) amount of time.
You need to use a separate process to do stuff like this. Whether that's a Timer you start from within global.asax, or a Thread which checks for a list of emails in a database/app_data directory, or a service you send a request to via WCF, or some combination of these.
The way I've handled this in the past is to queue the emails into a SQL Server table and then launch another thread to actually process/send the emails. Another aspx utility page can give me the status of the queue or restart the processing.
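If you start that worker thread from inside the web app, something like HostingEnvironment.QueueBackgroundWorkItem (.NET 4.5.2+) is a reasonable sketch, since ASP.NET will at least try to delay shutdown until the work completes (ProcessEmailQueueAsync is a hypothetical method that drains the SQL table):

using System.Web.Hosting;

HostingEnvironment.QueueBackgroundWorkItem(async cancellationToken =>
{
    await ProcessEmailQueueAsync(cancellationToken);
});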
I also highly recommend that you use an existing, legitimate third-party mailing service for your SMTP server if you are sending mail out to the general public. Otherwise you run the risk of your ISP shutting off your mail access or (worse) your own server being blacklisted.
If the web server has a timeout setting, it will kill the page if it runs too long.
I recommend you check the value of HttpServerUtility.ScriptTimeout - if this is set, then once a script has run for that length of time, it will be shut down.
Something you could do to help is go completely old-school: combine some Response.Writes with a few Response.Flushes to send some data back to the client browser; this tends to keep the script alive (it certainly worked on an old ASP.NET 1.1 site we had).
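A sketch of both tweaks inside the sending loop (SendOneEmail and the 10-record flush interval are illustrative):

Server.ScriptTimeout = 3600; // seconds; raise the per-request execution limit

int count = 0;
while (reader.Read())
{
    SendOneEmail(reader); // your existing per-record send method
    if (++count % 10 == 0)
    {
        Response.Write("."); // trickle output back to the browser...
        Response.Flush();    // ...which tends to keep the request alive
    }
}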
Also, you need to take into account when this script is being run - the server may well have been configured to perform an application reset (by default this happens every 29 hours in IIS). If your server is set to something like 24 hours and this coincides with the time your script runs, you could be seeing that too - although the fact that the script logs its attempts probably rules that out, unless your XML document is badly formed?
All that being said, I'd go with Will's answer of using a separate process (not just a thread hosted by the site), or, as Bryan said, go with a proper mailing service, which will help you with things like bounce-backs, click tracking, reporting, open counts, etc.