I'm developing an application and need to send real-time updates to logged-in users. I'm currently developing for an XRM system which only offers email notifications, so to get things up and running a bit quicker I'm going to resort to polling the database.
I want to start a new process in the Application_Start() method in global.asax that will poll the database and broadcast updates to clients. The problem is I'm a bit of a noob and don't know what I need to get started.
I've read http://haacked.com/archive/2011/10/16/the-dangers-of-implementing-recurring-background-tasks-in-asp-net.aspx and am aware of the potential flaws; as I said, it's about nothing more than getting it running first.
Could anyone tell me what kind of project I need to add to my solution? I was also thinking it might work out better if it runs asynchronously, allowing me to poll different entities for updates simultaneously.
The polling is going to be regulated by the server; clients don't request the poll, so there's no long-polling against the database. Updates get pushed to the users subscribed to a group. This will help me keep the polling minimal and also lets me turn the polling off if no one is logged in.
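A minimal sketch of what such a server-regulated poller could look like, kept independent of ASP.NET types so it could be started from Application_Start() and stopped in Application_End(). The class and method names here are my own invention, and the actual database query is left as a stub:

```csharp
using System;
using System.Threading;

// Hypothetical server-side poller. A real version would be created once in
// Application_Start() and would broadcast query results to subscribed groups.
public class UpdatePoller : IDisposable
{
    private readonly Timer _timer;
    private int _subscriberCount; // incremented/decremented as users log in/out

    public UpdatePoller(TimeSpan interval)
    {
        // Timer callbacks run on thread-pool threads, so polls don't block requests.
        _timer = new Timer(_ => Poll(), null, interval, interval);
    }

    public void UserLoggedIn()  => Interlocked.Increment(ref _subscriberCount);
    public void UserLoggedOut() => Interlocked.Decrement(ref _subscriberCount);

    // Pure decision: only poll while someone is actually logged in.
    public static bool ShouldPoll(int subscriberCount) => subscriberCount > 0;

    private void Poll()
    {
        if (!ShouldPoll(_subscriberCount)) return;
        // Stub: query the database for new rows here and push them to the
        // users subscribed to the matching group.
    }

    public void Dispose() => _timer.Dispose();
}
```

Tracking the subscriber count is what lets you turn the polling off when no one is logged in, as described above.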
This is more of a design/architectural question
I know the question sounds vague, but let me explain my application's needs.
So I have a Windows Forms app which collects certain data from the local machine and sends it to an Azure queue regularly. I also have a web app which pulls data from the queue and displays it. All is well and good here, as the web application works fine. But the web app only pulls data from the queue when I launch it. Is there a way to run this data-processing task all the time, i.e. as and when the data is available?
This is a requirement because along with displaying the data, the web app also monitors contents of this data for threshold limit and sends notifications too.
Right now, the web app can only send notifications if it is open in a client browser.
You should take a look at SignalR. It is a library for real-time communication; there are several strategies for making real-time communication happen, and SignalR implements them for you.
It can be used easily with ASP.NET; take a look at some sample code to see how to implement it in your case.
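To give a rough idea of the shape of a SignalR solution, here is a hypothetical hub sketch, assuming the Microsoft.AspNet.SignalR 2.x package. Clients subscribe to a group per entity and a server-side component pushes updates into the matching group; the names ("UpdatesHub", "updates:" prefix) are my own, not from any sample:

```csharp
using System.Threading.Tasks;
using Microsoft.AspNet.SignalR;

// Hypothetical hub: clients join one group per entity they care about.
public class UpdatesHub : Hub
{
    // One place to build group names so the hub and the server-side pusher agree.
    public static string GroupFor(string entity) => "updates:" + entity;

    public Task Subscribe(string entity) =>
        Groups.Add(Context.ConnectionId, GroupFor(entity));
}

// From server-side code outside any hub instance, broadcast to a group:
//   var hub = GlobalHost.ConnectionManager.GetHubContext<UpdatesHub>();
//   hub.Clients.Group(UpdatesHub.GroupFor("orders")).update(payload);
```

The browser side would then handle the `update` callback and refresh its display, which is what removes the need to have the page "launched" to pull from the queue.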
From the couple of suggestions, I found Azure WebJobs ideal for my task. A little more research showed Azure Worker Roles to be useful too, but they look more difficult to set up.
I have an application written in ASP.NET C# that should run for about 2 days and then stop until it is started again.
The application sometimes runs into some trouble that, at the moment, requires manual intervention to make it continue.
(So far, SQL server in itself was never the problem)
When working, the application writes a line to a log database table at least every 20 minutes.
If these lines are not there, it means the application either finished the task or needs help to continue. In these cases, I would like to get an email or some other notification.
What is the best way to detect and notify me if or when the application has stopped writing these lines to the log database table?
If it makes things easier, I could make the application perform some other regular task, such as writing to a file or sending a TCP/UDP packet somewhere.
Aside from the obvious concerns with the design of this application: given your current design, you could write a Windows service to monitor the log table (or a log file). If the latest entry is older than N minutes, fire off a System.Net.Mail.MailMessage.
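A sketch of the check such a service could run on a timer, assuming the 20-minute interval from the question; the SMTP host and addresses are placeholders, and reading the last log timestamp from the database is left out:

```csharp
using System;
using System.Net.Mail;

// Hypothetical watchdog logic for a Windows service.
public static class LogWatchdog
{
    // The question says the app logs at least every 20 minutes when healthy.
    public static readonly TimeSpan Threshold = TimeSpan.FromMinutes(20);

    // Pure check: has the application gone quiet for longer than the threshold?
    public static bool IsStale(DateTime lastLogUtc, DateTime nowUtc) =>
        nowUtc - lastLogUtc > Threshold;

    // Called periodically with the newest timestamp from the log table.
    public static void CheckAndAlert(DateTime lastLogUtc)
    {
        if (!IsStale(lastLogUtc, DateTime.UtcNow)) return;
        using (var smtp = new SmtpClient("smtp.example.local")) // placeholder host
        {
            var mail = new MailMessage(
                "watchdog@example.local", "admin@example.local",
                "Application stopped logging",
                $"Last log entry was at {lastLogUtc:u}.");
            smtp.Send(mail);
        }
    }
}
```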
If you're able to make the application do something like sending packets, is there any reason you couldn't make the application send its own alerts? The alert needs to be sent when the application has finished, or when it needs "help" (I'm assuming some sort of manual input or confirmation). It's reasonable to assume that the application can tell when either of these things has happened, so why can't the application send the alert itself? ASP.NET apps can send email really easily using System.Net.Mail.SmtpClient. It seems odd to have a separate program sending emails on behalf of a program that has access to a perfectly good email client class.
You can send emails from the global application error handler (Application_Error in Global.asax). A Windows service monitoring the database is another solution, but it may be better to write a short script/command-line application and run it with the Windows Task Scheduler. It has a lower footprint and is easier to write and install.
I have a client who would like a system developed which handles sending out alert emails to opted-in registered users. The alerts are based on geographic travel-related incidents and these are provided via a third party web service. The registered userbase is expected to be around 500,000 users and each user can subscribe to multiple alert categories.
I'm just trying to scope out what would be required technically to create this alerting functionality. In my mind we would create something like the following:
Poll the alert service once hourly
Store a "queue" of alerts in a temporary database table
Process the queue table as a scheduled task and send out the emails to the opted-in users
The part that I'm really not sure about is the physical sending of the emails. I was wondering if anyone could recommend any good options here. I guess we're looking to either create a custom component to send out the emails or use a dedicated third-party service (like dotMailer maybe? Or a custom mass mail server?). I've read that rolling out your own solution runs the risk of having IPs blacklisted if you're not careful, and obviously we want to keep this all completely legitimate.
I should also mention that we will be looking to create the solution using .NET (C#) and we're limited to SQL Server Express 2008 edition.
Any thoughts much appreciated
Many thanks in advance
For the poll, queue and send operations I'd create a Windows service that calls the external service, operates on the data and then gathers the results to send out the emails, updating the database as necessary with sent flags etc.
I handled a similar project recently and we found many ISPs or hosting providers got really twitchy when we mentioned mass emails. You should definitely check out the CAN-SPAM guidelines (http://en.wikipedia.org/wiki/CAN-SPAM_Act_of_2003) or similar rules in your territory.
As long as you play by the rules and follow the guidelines you should be OK sending out from a local mail server; however, it's important to ensure that DNS lookups and reverse DNS lookups on the MX records all point back and forth to each other properly. Indeed, this would be easier to outsource to a third-party mail provider or ISV, but when we tried we were unable to find a good fit and ended up doing it ourselves.
Additionally, you may want to glance at SPF records and other means of improving mass email deliverability. For what it's worth, this can be a very tricky task to implement, as SMTP (my least favourite protocol) is painful to debug, and people get very upset if they receive duplicate or unsolicited emails, so ensure you have an opt-in policy and appropriate checks to prevent duplicate delivery.
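One practical piece of the "don't get blacklisted" advice above is to send in throttled batches rather than blasting 500,000 messages at once. A minimal batching sketch follows; the batch size is an illustrative number, not guidance from any provider, and the actual send/pause step is only a comment:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Splits a recipient list into fixed-size batches so the sender can pause
// between them and stay under the mail server's comfort threshold.
public static class Batcher
{
    public static IEnumerable<List<T>> InBatches<T>(IReadOnlyList<T> items, int size)
    {
        for (int i = 0; i < items.Count; i += size)
            yield return items.Skip(i).Take(size).ToList();
    }
}

// Usage sketch (sizes and pauses are assumptions to tune per setup):
//   foreach (var batch in Batcher.InBatches(recipients, 100))
//   {
//       SendBatch(batch);                      // SmtpClient per batch
//       Thread.Sleep(TimeSpan.FromSeconds(30)); // throttle between batches
//   }
```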
I have a LAN composed of 3 PCs. Installed on PC1 is the MS SQL database. This computer will act as the server.
The PC2 and PC3 will each have a desktop application that will display the data from PC1.
My problem here is how to make each PC (PC2 and PC3) have the same copy of data.
Suppose that on PC2, employee 0001's first name is updated from John to Peter and the change is saved. Without refreshing the application on PC3, employee 0001 will still show John as the first name.
What would be the best approach for this? My level in programming is not that good, but I'm open to all suggestions/concepts/example/etc..
Thanks.
If you want an immediate update on all clients right after data changes, then you need some sort of notification system, either polled or pushed.
You can implement a push mechanism using, for example, WCF with a callback contract. Your client PCs would need to implement the relevant callback interface and be constantly connected to the server PC's WCF service. The callback call could actually carry the new data. Each client needs to filter out notifications which resulted from that client's own changes. Push is a quick and efficient mechanism.
Check this stackoverflow answer for example of WCF callback.
A pull mechanism would require a background thread in each client application checking the server for changes. You can use a separate database table with a version counter that gets incremented each time anything changes on the server. Client applications would poll that counter, compare it with the latest version they have and update the data when a new version is discovered. It is a much less efficient mechanism, though, as you need to poll frequently and fetch all the data each time there is a new version. You can make the versioning more sophisticated and detect what exactly changed, but that can get complicated quickly with multiple clients. Overall it does not scale very well. It is generally simpler than push, though, and for simple applications without too much data it would be enough.
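The version-counter pull loop described above can be sketched as follows. The delegates stand in for a query like "SELECT Version FROM ChangeCounter" and for the client's data reload; those names and the table are assumptions:

```csharp
using System;
using System.Threading;
using System.Threading.Tasks;

// Hypothetical background poller for the version-counter scheme.
public class ChangePoller
{
    private long _localVersion;

    // Pure decision: refresh only when the server counter has moved past ours.
    public static bool NeedsRefresh(long localVersion, long serverVersion) =>
        serverVersion > localVersion;

    public async Task RunAsync(Func<long> readServerVersion, // e.g. SELECT Version FROM ChangeCounter
                               Action refreshData,           // re-query and rebind the UI's data
                               TimeSpan interval,
                               CancellationToken token)
    {
        while (!token.IsCancellationRequested)
        {
            long server = readServerVersion();
            if (NeedsRefresh(_localVersion, server))
            {
                refreshData();
                _localVersion = server; // remember the version we've now seen
            }
            await Task.Delay(interval, token);
        }
    }
}
```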
You need to tell the other machines when to update. This could be accomplished with simple messages sent over the network using UDP broadcast. The other PCs could then execute their refresh method.
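A small sketch of that UDP broadcast, assuming an arbitrary port and a made-up "refresh:<table>" message format:

```csharp
using System;
using System.Net;
using System.Net.Sockets;
using System.Text;

// Hypothetical "please refresh" broadcast between the PCs on the LAN.
public static class RefreshNotifier
{
    public const int Port = 45678; // arbitrary choice for illustration

    public static byte[] Encode(string table) =>
        Encoding.UTF8.GetBytes("refresh:" + table);

    // Returns the table name, or null if the datagram isn't a refresh message.
    public static string DecodeTable(byte[] datagram)
    {
        var text = Encoding.UTF8.GetString(datagram);
        return text.StartsWith("refresh:") ? text.Substring("refresh:".Length) : null;
    }

    // Sender side (e.g. PC2, right after a successful save):
    public static void Broadcast(string table)
    {
        using (var udp = new UdpClient { EnableBroadcast = true })
        {
            var bytes = Encode(table);
            udp.Send(bytes, bytes.Length, new IPEndPoint(IPAddress.Broadcast, Port));
        }
    }
}

// Receiver side (e.g. PC3) would loop on new UdpClient(Port).Receive(...)
// and call its refresh method whenever DecodeTable(...) returns a table it shows.
```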
Well utivich... this is the same problem as with a web application, really, and it's a common one. Usually the other clients will have stale data until the record is reloaded, or when they save the server may throw an exception on stale data based on a SQL timestamp. However, with a desktop application you can set up a system with event notification, just like a chat application, where the server pushes events to subscribers and the clients can update the record or do whatever you need.
I have an application built that hits a third party company's web service in order to create an email account after a customer clicks a button. However, sometimes the web service takes longer than 1 minute to respond, which is way too long for my customers to be sitting there waiting for a response.
I need to devise a way to set up some sort of queuing service external from the web site. This way I can add the web service action to the queue and advise the customer it may take up to 2 minutes to create the account.
I'm curious about the best way to achieve this. My initial thought is to record the requested actions in a database table which is checked on a regular basis by a console app run via Windows Scheduled Tasks.
Any issues with that method?
Is there a better method you can think of?
I would use MSMQ, it may be an older technology but it is perfect for the scenario you describe.
Create a WCF service to manage the queue and its actions. On the service, expose a method to add an action to the queue.
This way the queue is completely independent of your website.
What if you use a combination of AJAX and a Windows Service?
On the website side: When the person chooses to create an e-mail account, you add the request to a database table. If they want to wait, provide a web page that uses AJAX to check every so often (10 seconds?) whether their account has been created or not. If it's an application-style website, you could let them continue working and pop up a message once the account is created. If they don't want to wait, they close the page or browse to another and maybe get an e-mail once it's done.
On the processing side: Create a Windows service that checks the table for new requests. Once it's done with a request it has to somehow communicate back to the user, maybe by setting a status flag on the request. This is what the AJAX call would look for. You could send an e-mail at this point too.
If you use a scheduled task with a console app instead of a Windows service, you risk having multiple instances running at the same time. You would have to implement some sort of locking mechanism (at the app or request level) to prevent processing the same thing twice.
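The request-level locking mentioned above usually comes down to an atomic "claim" step so that two overlapping runs can never both pick up the same request. A sketch, using an in-memory dictionary as a stand-in for the requests table (the SQL in the comment and the status names are illustrative assumptions):

```csharp
using System;
using System.Collections.Concurrent;

// Against SQL Server, the same idea is a single atomic statement, e.g.:
//   UPDATE TOP (1) Requests SET Status = 'Processing', ClaimedBy = @worker
//   OUTPUT inserted.Id
//   WHERE Status = 'Pending';
// Here a ConcurrentDictionary stands in for the table.
public class RequestQueue
{
    private readonly ConcurrentDictionary<int, string> _status =
        new ConcurrentDictionary<int, string>();

    public void Add(int id) => _status[id] = "Pending";

    // Atomically flip Pending -> Processing; only one caller can win,
    // which is exactly the double-processing guard a scheduled task needs.
    public bool TryClaim(int id) => _status.TryUpdate(id, "Processing", "Pending");

    // Status flag the AJAX page (or an email step) can poll for completion.
    public void Complete(int id) => _status[id] = "Done";
    public string StatusOf(int id) => _status.TryGetValue(id, out var s) ? s : null;
}
```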
What about the Queue class or the generic Queue<T> class?
Unfortunately, your question is too vague to answer in any real detail. If this is something you want managed outside the primary application, then a Windows service would be a little more appropriate than creating a console app... From an integration and lifecycle-management perspective this provides a nice foundation for adding other features (e.g. performance counters, hosted management services in WCF, remoting, etc.). MSMQ is great, although there is a bit more involved in deployment. If you are willing to invest the time, there are a lot of advantages to using MSMQ. If you really want to create your own point-to-point queue, then there are a ton of examples online that can serve as a starting point. Here is one: http://www.smelser.net/blog/page/SmellyQueue-(Durable-Queue).aspx.