I have a couple of web applications that all send emails, whether from a contact form, notification updates, etc.
The problem I've found is that there isn't really any way to track the emails being sent from these applications, so I've come up with a possible solution:
It's pretty straightforward really: instead of having each web application send the emails itself, I would like to unify the process by creating a central Email Sender Service.
In basic terms, each application would just create a row in an 'Outbound Emails' table in the database with To, From, Subject, and Content data.
The Email Sender Service (a Windows service) would then pick up the emails from the outbox, send them, and mark them as sent.
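For illustration, queuing a message from one of the applications might look something like the sketch below; the OutboundEmails table name and its columns are assumptions based on the description above.

using System.Data.SqlClient;

public static class EmailOutbox
{
    // Queues one email row for the Email Sender Service to pick up later.
    public static void Queue(string connectionString, string to, string from,
                             string subject, string content)
    {
        const string sql =
            @"INSERT INTO OutboundEmails ([To], [From], [Subject], [Content], [Sent])
              VALUES (@to, @from, @subject, @content, 0)";

        using (var conn = new SqlConnection(connectionString))
        using (var cmd = new SqlCommand(sql, conn))
        {
            cmd.Parameters.AddWithValue("@to", to);
            cmd.Parameters.AddWithValue("@from", from);
            cmd.Parameters.AddWithValue("@subject", subject);
            cmd.Parameters.AddWithValue("@content", content);
            conn.Open();
            cmd.ExecuteNonQuery();
        }
    }
}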
Even though I would store the basic email information (to, from, subject, content) in the database, what I would really like to do is also store the 'MailMessage' object itself, so that the Email Sender Service could deserialize the original MailMessage. This would allow any application to fully customize the email.
Are there any problems with using the MailMessage object in this way?
Update: Another objective, is to store a log of emails that have been sent - hence the reason for using a database.
A far better architecture is to have the applications call some sort of public interface on the email-sending service. The service itself can then be responsible for recording sends in a database.
This architecture means that the database becomes internal to the service and so reduces the coupling between your applications (each application knows about a relatively small public contract rather than a database schema). It also means that if you do find some problem with storing MailMessage objects in the database then you can change the storage method without updating all of your clients.
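As a rough sketch, that public contract might look something like this (the names and members are purely illustrative, not a prescribed design):

using System.Runtime.Serialization;
using System.ServiceModel;

[ServiceContract]
public interface IEmailSenderService
{
    // One-way: callers fire and forget; the service records and delivers.
    [OperationContract(IsOneWay = true)]
    void Send(EmailRequest request);
}

[DataContract]
public class EmailRequest
{
    [DataMember] public string To { get; set; }
    [DataMember] public string From { get; set; }
    [DataMember] public string Subject { get; set; }
    [DataMember] public string Body { get; set; }
    [DataMember] public bool IsBodyHtml { get; set; }
}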
Why use the database? Simply have the applications call your email service directly, providing all information.
If you'd like to queue up the sends, then you can use a net.msmq binding with WCF, which will store the requests in a reliable queue that the service would read from. All of this would be done for you.
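For illustration, a client could drop sends onto the queue along these lines. This is a minimal sketch assuming a local transactional private queue named EmailQueue; note that net.msmq endpoints require one-way operations, and security is switched off here only to keep the example short.

using System.ServiceModel;

[ServiceContract]
public interface IQueuedEmailService
{
    // net.msmq bindings require one-way operations.
    [OperationContract(IsOneWay = true)]
    void Send(string to, string from, string subject, string body);
}

class Client
{
    static void Main()
    {
        // Assumes a transactional private queue ".\private$\EmailQueue" exists.
        var binding = new NetMsmqBinding(NetMsmqSecurityMode.None);
        var address = new EndpointAddress("net.msmq://localhost/private/EmailQueue");

        var factory = new ChannelFactory<IQueuedEmailService>(binding, address);
        IQueuedEmailService proxy = factory.CreateChannel();

        // MSMQ stores the call durably; the service reads it whenever it is up.
        proxy.Send("user@example.com", "app@example.com", "Hello", "Queued send.");

        ((IClientChannel)proxy).Close();
        factory.Close();
    }
}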
Related
I have a client who would like a system developed which handles sending out alert emails to opted-in registered users. The alerts are based on geographic travel-related incidents and these are provided via a third party web service. The registered userbase is expected to be around 500,000 users and each user can subscribe to multiple alert categories.
I'm just trying to scope out what would be required technically to create this alerting functionality. In my mind we would create something like the following:
Poll the alert service once hourly
Store a "queue" of alerts in a temporary database table
Process the queue table as a scheduled task and send out the emails to the opted-in users
The part that I'm really not sure about is the physical sending of the emails. I was wondering if anyone could recommend any good options here. I guess we're looking to either create a custom component to send out the emails or use a dedicated third-party service (like dotMailer maybe? Or a custom mass mail server?). I've read that rolling your own solution runs the risk of having IPs blacklisted if you're not careful, and obviously we want to keep this all completely legitimate.
I should also mention that we will be looking to create the solution using .NET (C#) and we're limited to SQL Server Express 2008 edition.
Any thoughts much appreciated
Many thanks in advance
For the Poll, Queue, and Send operations I'd create a Windows service that calls the external service, operates on the data, and then gathers the results to send out the emails, updating the database as necessary with sent flags, etc.
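A bare-bones sketch of such a service is below. The AlertQueue table layout, connection string, and sender address are all assumptions, and polling the third-party feed is left out for brevity; only the queue-draining side is shown.

using System.Collections.Generic;
using System.Data.SqlClient;
using System.Net.Mail;
using System.ServiceProcess;
using System.Timers;

public class AlertService : ServiceBase
{
    private const string ConnStr = "...";                      // assumption
    private readonly Timer _timer = new Timer(60 * 60 * 1000); // hourly cycle

    protected override void OnStart(string[] args)
    {
        _timer.Elapsed += delegate { SendPending(); };
        _timer.Start();
    }

    protected override void OnStop()
    {
        _timer.Stop();
    }

    private void SendPending()
    {
        var pending = new List<KeyValuePair<int, MailMessage>>();

        using (var conn = new SqlConnection(ConnStr))
        {
            conn.Open();

            // Pull everything still flagged as unsent.
            var select = new SqlCommand(
                "SELECT Id, Recipient, Subject, Body FROM AlertQueue WHERE Sent = 0", conn);
            using (SqlDataReader reader = select.ExecuteReader())
            {
                while (reader.Read())
                {
                    var msg = new MailMessage("alerts@example.com", reader.GetString(1),
                                              reader.GetString(2), reader.GetString(3));
                    pending.Add(new KeyValuePair<int, MailMessage>(reader.GetInt32(0), msg));
                }
            }

            var smtp = new SmtpClient("localhost");
            foreach (var item in pending)
            {
                smtp.Send(item.Value);

                // Flag the row so it is not sent twice.
                var update = new SqlCommand(
                    "UPDATE AlertQueue SET Sent = 1 WHERE Id = @id", conn);
                update.Parameters.AddWithValue("@id", item.Key);
                update.ExecuteNonQuery();
            }
        }
    }
}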
I handled a similar project recently and we found many ISPs or hosting providers got really twitchy when we mentioned mass emails. You should definitely check out the CAN-SPAM guidelines (http://en.wikipedia.org/wiki/CAN-SPAM_Act_of_2003), or similar rules in your territory.
As long as you play by the rules and follow the guidelines you should be OK sending from a local mail server; however, it's important to ensure that DNS lookups and reverse DNS lookups on the MX records all point back and forth to each other properly. Indeed, it would be easier to outsource this to a third-party mail provider or ISV, but when we tried we were unable to find a good fit and ended up doing it ourselves.
Additionally, you may want to look at SPF records and other means of improving mass email deliverability. For what it's worth, this can be a very tricky task to implement, as SMTP (my least favourite protocol) is painful to debug, and people get very upset if they receive duplicate or unsolicited emails, so ensure you have an opt-in policy and appropriate checks to prevent duplicate delivery.
I have a web application from which emails should be sent after specific actions. I have some alternatives for handling this, but I'm not sure which one is best.
The first is to send the email directly from the ASP.NET application when the user performs the action. But I don't think this is a very reliable approach, because if the SMTP server is down or something else goes wrong, the user just gets feedback that their action cannot be completed.
As an alternative I was thinking about implementing a queuing system, for which I have some ideas:
Push emails to be sent into a database table; a service application periodically checks for new messages, then sends them. On successful send it marks the email task as completed.
Use MSMQ for queuing. In this case the whole email could be passed as a message; the other way is to store the message with attachments in a DB table and pass only the data required to query the table and send the message. That way I don't have to deal with MSMQ's size limits (because of attachments).
Something else, like a local WCF service to notify the sender service.
Which way do you think is best?
MSMQ is not a good solution since it has a 4 MB limit on message size: http://blogs.msdn.com/b/johnbreakwell/archive/2007/08/22/why-is-there-a-4mb-limit-on-msmq-messages.aspx
In the worst case, if MSMQ fails (its process throws an error or shuts down unexpectedly), many messages can be lost. In my experience, this solution is only good when the hardware and software are installed in near-ideal conditions.
Using a database and a Windows service is better since it is simple and doesn't need much effort.
I usually use a combination of database and files. The database contains a table holding the header information and a flag recording whether the message has been actioned (success, error, or otherwise), and the files contain the message body (either HTML or plain text) and the attachments in their original format.
When the send process runs, it is quicker to assemble a message from files than by querying BLOB/CLOB columns.
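A sketch of assembling a message that way might look like this; the folder layout and file names are assumptions:

using System.IO;
using System.Net.Mail;

public static class MessageAssembler
{
    // Builds a MailMessage from a header row (addresses, subject) plus a
    // per-message folder holding the body and original attachments.
    public static MailMessage Assemble(string from, string to, string subject,
                                       string messageFolder)
    {
        var message = new MailMessage(from, to)
        {
            Subject = subject,
            // Body kept as a file - faster to read than a BLOB round-trip.
            Body = File.ReadAllText(Path.Combine(messageFolder, "body.html")),
            IsBodyHtml = true
        };

        // Attachments stay on disk in their original format.
        foreach (string file in Directory.GetFiles(Path.Combine(messageFolder, "attachments")))
        {
            message.Attachments.Add(new Attachment(file));
        }

        return message;
    }
}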
Since the application uses the file system, you can easily add hardware (servers or other components) to improve the availability of the system.
Database capacity can be added too, but that will cost you more in database licenses.
I also send a test email after every x sends to make sure the system is working well; this test email goes to myself or a dummy inbox, and an application checks that the received test email matches the one that was sent. If it matches, sending of pending emails continues.
Alternatively, if you are using MS Exchange, you can queue sends by using its web service. This is an easy way, but you need a license.
The MSDN library shows how to use the MS Exchange web service.
You can use an email server like hMailServer. In order to queue up the sends, you can push them to the mail server. To do that, you can write a Windows Forms application with a timer that checks for every row with Status 0 (not sent) in the email table. When the thread sends a row to the mail server, it is marked as 1 (sent).
You can also classify your emails if you use a DB. Different actions can send different emails. You can store this information in the DB as well, so that your application thread will know which email template to send.
I have many applications across an enterprise environment and they all use different methods of sending emails. Some send directly through an Exchange server, some queue up locally in an SMTP queue, and others call a web service that then sends the email.
I'm trying to decide on the best way to get guaranteed delivery of emails. If our Exchange server goes down, the applications that send to it directly can no longer send emails, and any emails sent during the downtime never get anywhere. I would also like to implement a universal templating solution that all applications can share.
Are there any pre-built solutions to this problem, or do you have an insight on how to handle this issue?
We solved this by creating a web service that sends all our emails. This web service uses the
System.Net.Mail.SmtpDeliveryMethod.PickupDirectoryFromIis
setting, which essentially saves the files to a spot on the disk and tries to send them via the main SMTP server; if the server is unavailable, they sit in the directory until it becomes available.
Guaranteed delivery, as long as the web service is up. Since we have redundancy checks in place, this is almost never an issue. If it is, we treat it as an error in code and handle it.
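For illustration, the sending code boils down to something like this minimal sketch:

using System.Net.Mail;

public static class PickupMailer
{
    // The message is written to the IIS SMTP pickup folder on disk; the
    // local SMTP service then relays it, retrying until the main SMTP
    // server is reachable again.
    public static void Send(string from, string to, string subject, string body)
    {
        var client = new SmtpClient();
        client.DeliveryMethod = SmtpDeliveryMethod.PickupDirectoryFromIis;
        client.Send(new MailMessage(from, to, subject, body));
    }
}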
edit - added
I forgot to mention that XSS is a concern even in an email, so be sure to use something like the Microsoft.Security.AntiXss library, which contains functions like GetSafeHtmlFragment to strip out potentially dangerous scripts before outputting HTML to an email.
http://msdn.microsoft.com/en-us/security/aa973814.aspx
http://msdn.microsoft.com/en-us/library/aa973813.aspx
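As a rough sketch (note that depending on the AntiXSS version, GetSafeHtmlFragment lives on the Sanitizer class, as here, or on the AntiXss class):

using Microsoft.Security.Application;

public static class EmailSanitizer
{
    public static string CleanBody(string untrustedHtml)
    {
        // Strips script and other dangerous markup, keeping safe HTML.
        return Sanitizer.GetSafeHtmlFragment(untrustedHtml);
    }
}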
I have heard good feedback about Postmark. Maybe a service like that could be a solution, as it has several integration points.
http://postmarkapp.com
We have used hMailServer (on the Windows platform), which is freeware. We configured a very high maximum retry count and used an external SMTP server to relay the emails. Our applications queue up emails on the hMailServer instance, and that server relays them onward. With the retry count set high, we can be reasonably sure the emails are delivered even if the main SMTP servers go down, though there is no guarantee if the main SMTP relay servers have a long outage.
I do this by queuing email to a SQL Server database. Any ACID-compliant database will work, or you can use MongoDB with 'safe mode' inserts, but if you really need guarantees then use SQL Server or MySQL. This way, once your mail gets into the database it is 'guaranteed' not to be lost, and your app doesn't have to think about it. You could use a web service, or just make a shared assembly with a static public method in a class that drops your email in the DB for you.
Include a column for status, like 'new', 'delivered', or 'recipient mailbox temporarily full', which you can represent with numeric values, and keep a TimeToSend column, which starts out as the time when the email is queued in the database.
Then you have a mail app that you can run once a minute as a Windows scheduled task. Make it a console app. When it loads, it checks whether an instance of it is already running, and if there is one, it exits. When running, it:
1. Queries the database for all mail whose TimeToSend is earlier than now, and attempts to deliver each one to the mail server.
2. If a mail is delivered to the mail server, marks it as logically deleted.
3. If any mail can't be delivered, advances its TimeToSend column to 10 minutes from now.
4. Deletes the logically deleted records from the table. You can do this in the app, or have a SQL job do it.
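A minimal sketch of that console app is below. The MailQueue table, its columns, and the status values are assumptions matching the scheme above (status 2 stands for "logically deleted"); a named mutex enforces the single-instance check.

using System.Collections.Generic;
using System.Data.SqlClient;
using System.Net.Mail;
using System.Threading;

class MailPump
{
    const string ConnStr = "...";   // connection string is an assumption

    static void Main()
    {
        bool createdNew;
        using (new Mutex(true, @"Global\MailPump", out createdNew))
        {
            if (!createdNew) return; // another instance is already running
            Run();
        }
    }

    static void Run()
    {
        using (var conn = new SqlConnection(ConnStr))
        {
            conn.Open();
            var smtp = new SmtpClient("localhost");

            // Step 1: load everything that is due.
            var due = new List<object[]>();
            var select = new SqlCommand(
                @"SELECT Id, [To], [From], [Subject], Body FROM MailQueue
                  WHERE TimeToSend <= GETUTCDATE() AND Status <> 2", conn);
            using (SqlDataReader reader = select.ExecuteReader())
                while (reader.Read())
                    due.Add(new object[] { reader.GetInt32(0), reader.GetString(1),
                                           reader.GetString(2), reader.GetString(3),
                                           reader.GetString(4) });

            foreach (object[] row in due)
            {
                try
                {
                    smtp.Send(new MailMessage((string)row[2], (string)row[1],
                                              (string)row[3], (string)row[4]));
                    // Step 2: delivered - mark it logically deleted.
                    Exec(conn, "UPDATE MailQueue SET Status = 2 WHERE Id = @id", (int)row[0]);
                }
                catch (SmtpException)
                {
                    // Step 3: couldn't deliver - try again in 10 minutes.
                    Exec(conn, @"UPDATE MailQueue
                                 SET TimeToSend = DATEADD(minute, 10, GETUTCDATE())
                                 WHERE Id = @id", (int)row[0]);
                }
            }

            // Step 4: purge the logically deleted rows.
            new SqlCommand("DELETE FROM MailQueue WHERE Status = 2", conn).ExecuteNonQuery();
        }
    }

    static void Exec(SqlConnection conn, string sql, int id)
    {
        var cmd = new SqlCommand(sql, conn);
        cmd.Parameters.AddWithValue("@id", id);
        cmd.ExecuteNonQuery();
    }
}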
As mentioned earlier, you can utilize a web service that you can usually POST JSON to using an HTTP request. Here are a bunch of choices:
PostageApp (Ours!)
SendGrid
PostmarkApp
Amazon SES
They all have different feature sets and offerings, so definitely give them all a spin and figure out which you prefer.
(Full Disclosure: I am the Product Manager of PostageApp.)
I have the following scenario:
my app generates a valid system email address for each user, of the form, let's say: uuid@website.com
when a user has a problem or question, he can send an email from any address to that predefined system email address
the app should receive the emails sent by the user and process them (check for spam, insert into the DB)
In this scenario, a possible first solution I had in mind was to poll the email addresses every 15 minutes, process the messages (spam or not spam) in an external desktop app (or similar), and insert them into a database.
Because I want to do this in .NET, C#, and SQL Server 2008, and it should run on a web server, is the solution below possible using WCF?
I create a WCF web service that, when an email is received at one of these addresses, captures it and starts the processing procedure.
One problem I see with WCF from the start is that I don't think it can react on its own; the only way I've used WCF until now is by calling it directly and receiving a result. So I think another layer should be put between the email server and the WCF service, and that layer should "react" when something is received.
The main idea is to process the emails as they arrive, not to pull them out of the inbox periodically.
Any pointers? Thank you.
You are right. A web service cannot capture anything for you. You will have to call the web service (using an .ashx handler or similar). That is what web services are for: to be called.
the app should receive the emails sent by the user and process them
It sounds like you are looking to develop an email client; if so, then how about:
Create an email client app (for instance here)
Create a windows service, to help process the mails.
Assuming that you've tailored the client program, the Windows service would work with the client, look for new messages, and process them accordingly.
For email client examples, checkout:
This answer
And this article
If you're connecting to Exchange 2007 SP1 or later, Exchange Web Services looks like the best approach:
Read MS Exchange email in C#
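For reference, a minimal sketch using the EWS Managed API along those lines; the server URL and credentials are placeholders:

using System;
using Microsoft.Exchange.WebServices.Data;

class InboxReader
{
    static void Main()
    {
        var service = new ExchangeService(ExchangeVersion.Exchange2007_SP1)
        {
            Credentials = new WebCredentials("user", "password", "domain"),
            Url = new Uri("https://mail.example.com/EWS/Exchange.asmx")
        };

        // Fetch the ten most recent inbox items.
        FindItemsResults<Item> results =
            service.FindItems(WellKnownFolderName.Inbox, new ItemView(10));

        foreach (Item item in results)
        {
            item.Load(); // pull down the body and full property set
            Console.WriteLine(item.Subject);
        }
    }
}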
I've inherited an ASP.NET website written in C# using an MSSQL 2008 database. It sends emails via Database Mail, based on an insert into a message table via a trigger :| A single failure anywhere and too many things roll back, nothing gets logged, and no email is sent...
What a mess. I've written a few email subsystems in the past and I thought I'd ask for input before starting a rewrite here. What are the best practices for queuing/sending email in a Microsoft environment? Should I be pushing emails to a queue, from there pulling, sending, logging? DB Email seems like a fail. Is the queue managed in SQL server? Does SQL Server call a C# app? If an email send fails, what's a good approach for recovery?
Thanks for any insight!
I think you are correct to completely separate these functions: use SQL for data, and push email off to a completely different 'service provider'; I assume you have some sort of business logic tier that will orchestrate this?
I can't talk in terms of 'best practice' from experience (for email specifically), but abstracting out the email service (just like you'd abstract out data access) is definitely on the right path, and that's probably the critical decision you need to make right now. You could then have different implementations of the email service (including SQL, if you really wanted to).
Depending on volumes, do you want an asynchronous or a synchronous call? WCF seems like a good candidate to handle the comms: it would let you fire data (for emails) at an endpoint with a queue built into it, or call (via WCF) a web service that acts synchronously.
You can send mail via SQL Server. For more, refer to this.
The architecture of it is described here.
Another implementation of sending mail via C# is this; they have developed an email factory for it... hope this helps.
sp_send_dbmail places the mail request into a queue in msdb. After the transaction that sent the mail commits, the queue activates an external process that handles the SMTP delivery, including retries, logging, and all that. The overall system is quite resilient, scalable, and performant.
Perhaps you're using the old, deprecated xp_sendmail based system?
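To illustrate the sp_send_dbmail path described above, here is a minimal sketch of calling it from application code; the profile name and connection string are assumptions.

using System.Data;
using System.Data.SqlClient;

public static class DbMail
{
    // sp_send_dbmail only enqueues the request in msdb; the external
    // Database Mail process handles the actual SMTP delivery and retries.
    public static void Send(string connStr, string to, string subject, string body)
    {
        using (var conn = new SqlConnection(connStr))
        using (var cmd = new SqlCommand("msdb.dbo.sp_send_dbmail", conn))
        {
            cmd.CommandType = CommandType.StoredProcedure;
            cmd.Parameters.AddWithValue("@profile_name", "DefaultProfile"); // assumption
            cmd.Parameters.AddWithValue("@recipients", to);
            cmd.Parameters.AddWithValue("@subject", subject);
            cmd.Parameters.AddWithValue("@body", body);
            conn.Open();
            cmd.ExecuteNonQuery();
        }
    }
}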