How to perform server-side DB polling in ASP.NET? - C#

I have an application built on AngularJS, Web API, and MongoDB. My goal is to replace client-side polling with SignalR. Since MongoDB doesn't fire events on changes, I still need to poll on the server side, and if I detect a change, broadcast it using a SignalR hub.
Is there a solid way to do that without IIS breaking the recurring task?
What I have tried:
Hangfire: Seems like the recommended solution, but it currently supports only SQL Server and Redis (not an option).
HostingEnvironment.QueueBackgroundWorkItem: Might work, but I worry that it is not widely used and I might be missing some pitfalls.
There is a way to get something like events from MongoDB using a tailable cursor on a capped collection, but that is not an option here.

The answer as always: "it depends".
1. HostingEnvironment.QueueBackgroundWorkItem. The latest addition, introduced in .NET 4.5.2. No guarantees on execution; see the limitation noted at the end of this post.
2. WebBackgrounder. Quote from Scott Hanselman's blog: "Its code hasn't been touched in years, BUT the WebBackgrounder NuGet package has been downloaded almost a half-million times".
3. Hangfire. A popular open source project; adding MongoDB support is always a welcome PR :)
4. Using RavenDB instead of MongoDB. Its Changes API is really cool.
The first and second options don't guarantee job execution; the third and fourth rely on specific data stores instead.

QueueBackgroundWorkItem is your best option considering what we discussed above, but it's not 100% bulletproof: the app pool will still recycle. Alternatively, you can create a Windows service that signals the web app to poll the DB.
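For illustration, here is a minimal sketch of that approach. The hub name (UpdatesHub), the CheckForChangesAsync placeholder, and the 30-second interval are my own assumptions, not from the original answer; note the work item still dies with the app domain, it just receives a graceful shutdown signal first:

```csharp
using System;
using System.Threading;
using System.Threading.Tasks;
using System.Web.Hosting;
using Microsoft.AspNet.SignalR;

public class UpdatesHub : Hub { }

public static class DbPoller
{
    // Call once from Application_Start in Global.asax.
    public static void Start()
    {
        HostingEnvironment.QueueBackgroundWorkItem(PollAsync);
    }

    private static async Task PollAsync(CancellationToken token)
    {
        var hub = GlobalHost.ConnectionManager.GetHubContext<UpdatesHub>();

        // The token is signalled when the app domain shuts down, so the
        // loop can exit cleanly before an app pool recycle kills it.
        while (!token.IsCancellationRequested)
        {
            object changes = await CheckForChangesAsync();
            if (changes != null)
                hub.Clients.All.dataChanged(changes);

            try { await Task.Delay(TimeSpan.FromSeconds(30), token); }
            catch (TaskCanceledException) { break; }
        }
    }

    private static Task<object> CheckForChangesAsync()
    {
        // Placeholder: compare a timestamp/version field in MongoDB
        // against the last value that was broadcast.
        return Task.FromResult<object>(null);
    }
}
```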
Anyway, once you get the signal (through polling or otherwise), you can use event-driven design from that point on. The benefit is decoupled domains and scalability. I have created this framework for just that:
https://github.com/AndersMalmgren/SignalR.EventAggregatorProxy/wiki

Related

What does WaitForAck do in SignalR

I am considering using SignalR for server-to-client real time communication. However, I need to guarantee delivery, so I need some form of ACK in the process.
I have seen answers here with suggestions for how to do this, but I also see that the Microsoft documentation for SignalR includes a Message.WaitForAck bool property. This makes me hopeful that Microsoft baked something in to do this, but I can find no posts at all of folks using it, nor any explaining what it does.
Is it just an inert flag? That is, are we still on the hook to roll our own ACK system?
Thanks.
WaitForAck is an internal thing. SignalR is built around a MessageBus, where WaitForAck is used for some operations that should block until completed (or timed out). An example of such an operation is adding a connection to a group.
If you want guaranteed delivery, you need to implement it yourself on top of SignalR.
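To give a feel for what "implement it yourself" might look like, here is a minimal sketch of an application-level ACK layered on top of SignalR. The hub, the Pending dictionary, and the receive/Ack names are all illustrative; a real version would also re-send or flag messages that are never acknowledged:

```csharp
using System;
using System.Collections.Concurrent;
using Microsoft.AspNet.SignalR;

public class ReliableHub : Hub
{
    // Messages we have sent but the client has not yet acknowledged.
    private static readonly ConcurrentDictionary<Guid, string> Pending =
        new ConcurrentDictionary<Guid, string>();

    public void SendReliable(string connectionId, string payload)
    {
        var id = Guid.NewGuid();
        Pending[id] = payload;
        // A real implementation would also start a timer and re-send
        // (or surface) messages that stay in Pending too long.
        Clients.Client(connectionId).receive(id, payload);
    }

    // The client calls this after it has processed the message.
    public void Ack(Guid id)
    {
        string ignored;
        Pending.TryRemove(id, out ignored);
    }
}
```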

running timer from global.asax vs quartz.net

I am developing an ASP.NET site that needs to hit a few social media sites daily for blanket friend/follower data. I have chosen Arvixe business class as my hosting. In the future, if we grow, I'd love to get onto a dedicated server and run a Windows service; however, since that is not in the cards at this point, I need another reliable way of running scheduled tasks.

I am familiar with running a thread timer from App_Code (Global.asax), but app pool recycling will cause problems with the timer. I have never used task scheduling like Quartz, but I have read a lot about it on Stack Overflow.

I am looking for some advice on how to approach my goal. One big problem with either method is that I will need the crawler threads to sleep for up to an hour regularly due to API call limits. My first thought was to use the DB to record the start and end of each job. When the app pool recycles, I would clear out any parts not completed and only start parts that do not have a record of running that day. What do the experts here think? Any good links to sample architecture for this type of scheduling?
It doesn't really matter what method you use, whether you roll your own or use Quartz. You are at the mercy of ASP.NET/IIS because that's where you want to host it.
Do you have a spare computer lying around that can just run a scheduled task and upload data to a hosted database? To be honest, it's possibly safer (depending on your use case) to just do it that way than to try to run a scheduler in ASP.NET.
Somewhat along the lines of Bryan's post:
Find a spare computer.
Instead of allowing direct DB access, have it call a web service on your site. This call should be the initiator of the process you are trying to run. Don't try to pass params into it; something like "StartProcess()" should work fine (see the sketch after this list).
As far as going to sleep and resuming later, take a look at Workflow Foundation. There are some nice built-in features to persist state.
Don't expose your DB to the outside world; instead, expose that page or web service and wrap some security around it. WCF has some nice built-in security features for that.
The best part is that when you decide to move off, you can keep your web service and have it called from a Windows service in the same manner.
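A rough sketch of that parameterless "kick-off" service is below. The contract name and what StartProcess actually starts are placeholders; the point is simply that the external caller passes nothing in:

```csharp
using System.ServiceModel;
using System.Threading;

[ServiceContract]
public interface IJobService
{
    [OperationContract]
    void StartProcess();
}

public class JobService : IJobService
{
    public void StartProcess()
    {
        // Validate the caller (WCF bindings can enforce transport or
        // message security here), then run the crawl asynchronously
        // so the service call itself returns quickly.
        ThreadPool.QueueUserWorkItem(_ => RunCrawl());
    }

    private static void RunCrawl()
    {
        // The actual friend/follower crawling logic lives here.
    }
}
```

The spare computer then only needs a Windows scheduled task that invokes StartProcess() through a generated client proxy.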
As long as you use a persistent job store (like a database) and you write and schedule your jobs so that they can handle things like being killed halfway through, having IIS recycle your process is not that big a deal.
The bigger issue is that IIS shuts your site down if it doesn't get traffic. If you can keep your site up, then set the misfire policy appropriately, make sure your jobs store any state data needed to pick up where they left off, and you should be able to pull it off.
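As a sketch of what that looks like with Quartz.NET (the 2.x-era synchronous API; the job names, the daily interval, and the chosen misfire instruction are illustrative, and durable storage still requires configuring an AdoJobStore rather than the default in-memory RAMJobStore):

```csharp
using Quartz;
using Quartz.Impl;

public class CrawlJob : IJob
{
    public void Execute(IJobExecutionContext context)
    {
        // Record progress (e.g. "last account crawled") in your own table
        // so a re-run after an app pool recycle can pick up where it left off.
    }
}

public static class SchedulerSetup
{
    public static void Start()
    {
        IScheduler scheduler = new StdSchedulerFactory().GetScheduler();
        scheduler.Start();

        IJobDetail job = JobBuilder.Create<CrawlJob>()
            .WithIdentity("crawl", "social")
            .Build();

        ITrigger trigger = TriggerBuilder.Create()
            .WithIdentity("crawlTrigger", "social")
            .StartNow()
            .WithSimpleSchedule(s => s
                .WithIntervalInHours(24)
                .RepeatForever()
                // Misfire policy: if the site was down when the trigger
                // should have fired, run once now and stay on schedule.
                .WithMisfireHandlingInstructionFireNow())
            .Build();

        scheduler.ScheduleJob(job, trigger);
    }
}
```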
If you are language-agnostic and don't mind writing your "job-activation script" in your favourite Linux-supported language...
One solution that has worked very well for me is:
Getting relatively cheap, stable Linux hosting (from reputable companies),
Creating a WCF service on your .NET hosted platform that contains the logic you want to run regularly (RESTful, SOAP, or XML-RPC, whichever suits you),
Making the calls from cron jobs on the Linux host, written in your language of choice (I use PHP).
This has worked very well, like I said: no VPS expense, configurable, and externally activated. I have one central place where my jobs are activated, with 99 to 100% uptime (I have never had any failures).

Update a JQuery progressbar using WCF Duplex callbacks?

I have a fairly straightforward WCF duplex service hosted on IIS (.NET 4.x).
One particular service method is long-running (1-3 minutes), and when the client triggers it, I would like to provide visual feedback to the user via a jQuery progressbar on the page, updated from an integer value passed during multiple callbacks.
I am open to moving away from a duplex implementation, but it seemed like the logical approach at the time...
What is the best way to go about this, keeping in mind that I would like to minimize overhead and also avoid introducing more technologies like Silverlight (though I realize that is a perfectly viable solution for this problem)?
Specifically, some code examples might be helpful. Also note that I am new to both jQuery and WCF.
I wasn't sure what other information to offer to make this question more clear, so if you require more information, please ask and I will amend the original question.
This is not feasible using standard WCF without moving to Silverlight, as you would need a polling duplex channel in the browser.
There was an AJAX implementation of the client side of the PollingDuplex protocol done some time ago, which would let your client progress bar update in response to progress updates from the server (sort of: it still needs the client to poll). I'm not sure whether it works with the latest version of PollingDuplex.
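Since the asker is open to dropping duplex entirely, here is a hedged sketch of the plain-polling alternative (not the PollingDuplex approach described above): the long-running operation records its progress server-side, and the page polls an ordinary HTTP endpoint (e.g. jQuery with setInterval) to move the progressbar. All names (IProgressService, ProgressStore, the URI template) are illustrative, and hosting it requires a webHttpBinding endpoint or WebServiceHostFactory:

```csharp
using System.Collections.Concurrent;
using System.ServiceModel;
using System.ServiceModel.Web;

[ServiceContract]
public interface IProgressService
{
    [OperationContract]
    [WebGet(UriTemplate = "progress/{operationId}")]
    int GetProgress(string operationId);
}

public static class ProgressStore
{
    private static readonly ConcurrentDictionary<string, int> Progress =
        new ConcurrentDictionary<string, int>();

    // Called by the long-running service method as it advances.
    public static void Report(string operationId, int percent)
    {
        Progress[operationId] = percent;
    }

    public static int Get(string operationId)
    {
        int percent;
        return Progress.TryGetValue(operationId, out percent) ? percent : 0;
    }
}

public class ProgressService : IProgressService
{
    public int GetProgress(string operationId)
    {
        return ProgressStore.Get(operationId);
    }
}
```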

Need advice to query data from sql server on every 5 seconds and send it to other app.(.NET C#)

I have a requirement to read data from a table (SQL Server 2005) and send that data to another application every 5 seconds. I am looking for the best approach to do this.
Right now I am planning to write a console application (.NET, C#) that reads the data from SQL Server 2005 (a QUEUE table filled by various applications) and sends it to the other application over TCP/IP (a central server), running that console application as a scheduled task every 5 seconds. I am assuming the scheduler will discard a new run event if the task is already running (to avoid concurrent executions).
Has anybody come across a similar situation? Please share your experience and advise me on the best approach.
Thanks in advance for spending your valuable time on my request.
-Por-hills-
We have done similar work. If you are going to query a SQL database every 5 seconds, be sure to use a stored procedure that is optimized to be very fast, and it should not update data unless absolutely necessary. This approach is typically called "polling", and I've found it acceptable as long as your SQL Server is not otherwise bogged down with too many other calls.
In the approaches we've used, a Windows Service that does the polling works well.
How to communicate results to another app depends on what that app is doing, what type of interface you can make into it, and how quickly you need the results. The WCF class libraries from Microsoft provide many workable approaches for real-time communication. My preference is to write to the application's database and then have the application read the data (if that works for the app). If you need something real-time, WCF is the way to go: I'd suggest a stateless protocol like HTTP (standard HTTP posts) if < 5 sec response time is required, or TCP/IP if subsecond response time is required.
Since I assume your central storage is also SQL Server 2005, have you considered using what it offers out of the box to achieve your requirements? Rather than poll every 5 seconds, marshal and unmarshal TCP/IP, implement authentication and authorization for the TCP/IP pipe, scale transmission with boxcarring, manage message acknowledgments and retries, deal with central site availability, fragment large messages, implement fairness in transmission, and so on, why not simply use Service Broker? It does all you need and more, out of the box: already tested and already tuned for performance and scalability.
Getting reliable messaging right is not trivial, and you should focus your efforts on meeting your business specifics, not reinventing the wheel.
I would recommend writing a Windows Service (since you are on C#) with a timer that runs every 5 seconds. That way you won't be starting and stopping an application all the time, it can run even when no one is logged into the machine, and it will start automatically when the machine is restarted.
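A skeleton of such a service is below (the polling and forwarding logic are placeholders, and the Main/installer plumbing is omitted). Setting AutoReset = false and re-arming the timer only after a pass completes is one simple way to guarantee runs never overlap, which was the asker's concern with the Task Scheduler:

```csharp
using System.ServiceProcess;
using System.Timers;

public class QueuePollerService : ServiceBase
{
    private Timer _timer;

    protected override void OnStart(string[] args)
    {
        _timer = new Timer(5000) { AutoReset = false };
        _timer.Elapsed += (s, e) =>
        {
            try
            {
                // 1. Call the optimized stored procedure to read the QUEUE table.
                // 2. Forward any rows to the central server (TCP/WCF/HTTP).
            }
            finally
            {
                _timer.Start(); // re-arm only after the pass finishes,
                                // so executions can never overlap
            }
        };
        _timer.Start();
    }

    protected override void OnStop()
    {
        _timer.Stop();
        _timer.Dispose();
    }
}
```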
For one of my projects, I needed to do something periodically. I opted for a service and set up a timer that takes care of reading the data. You might consider that solution. It has worked well for me.
I suggest creating a Windows service rather than an application, and handling the timing yourself: create a timer and execute one step on each timer event. For the communication you have many choices; I would consider standard technologies like a web service or Windows Communication Foundation.
Besides this custom solution, I would evaluate whether the task can be solved using Microsoft Integration Services.
Finally, another question comes to mind: why do you need this application at all? Why don't the application(s) consuming the data query the database themselves? Is the expensive polling required? Is it possible for the data producers to signal the availability of new data directly to the data consumers?
I am not sure about the details of your project, specifically related to security, but maybe it would be better to create an SSIS package and schedule it as a job?

Keeping in sync with database

The solution we developed uses a database (SQL Server 2005) for persistence purposes; all updated data is saved to the database instead of being sent to the program.
I have a desktop front-end that currently keeps polling the database for updates that may happen at any time to some critical data, and I am not really a fan of database polling and of CPU cycles wasted uselessly redoing work.
Our manager doesn't seem to mind us polling the database. The amount of data is small (fewer than 100 records) and the interval is long (1 min), but I am a coder, and I do mind. Is there a better way to keep the data in memory as closely synced as possible with the data in the database? The system is developed using C# 3.5.
Since you're on SQL 2005, you can use a SqlDependency to be notified of changes. Note that you can use it pretty effortlessly with System.Web.Caching.Cache, which, despite its namespace, runs just fine in a WinForms app.
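A minimal sketch of wiring up SqlDependency follows. It assumes Service Broker is enabled on the database and that SqlDependency.Start(connectionString) is called once at startup; the table and columns are placeholders, and the query must follow the notification rules (two-part table names, an explicit column list, no SELECT *):

```csharp
using System.Data.SqlClient;

public class CriticalDataWatcher
{
    private readonly string _connectionString;

    public CriticalDataWatcher(string connectionString)
    {
        _connectionString = connectionString;
    }

    public void Subscribe()
    {
        using (var connection = new SqlConnection(_connectionString))
        using (var command = new SqlCommand(
            "SELECT Id, Value FROM dbo.CriticalData", connection))
        {
            var dependency = new SqlDependency(command);
            dependency.OnChange += (s, e) =>
            {
                // A dependency fires only once; in real code, inspect
                // e.Type/e.Info for errors, then reload and re-subscribe.
                Subscribe();
            };

            connection.Open();
            using (var reader = command.ExecuteReader())
            {
                // Read the current snapshot into memory here.
            }
        }
    }
}
```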
First thought off the top of my head is a trigger combined with a message queue.
This may be overkill for your situation, but it may be interesting to take a look at the Microsoft Sync Framework.
SQL Server Notification Services will let the database call back to an app over a number of protocols. One implementation approach is to have the notification service create (or modify) a file on an accessible network share and have your desktop app react using a FileSystemWatcher.
More information on Notification Services can be found at: http://technet.microsoft.com/en-us/library/aa226909(SQL.80).aspx
Please note that this may be a sledgehammer approach to a nut-sized problem, though.
For the ASP.NET equivalent, see http://msdn.microsoft.com/en-us/library/ms178604(VS.80).aspx.
This may also be overkill, but you could implement some sort of caching mechanism: when data is written to the database, cache it at the same time, and when you need to fetch data back from the DB, check the cache first.
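A tiny write-through sketch of that idea (Record, SaveToDatabase, and LoadFromDatabase are hypothetical placeholders). Note this only stays correct if every writer goes through this class; updates made by other processes would still require polling or a notification mechanism:

```csharp
using System.Collections.Concurrent;

public class RecordCache
{
    private readonly ConcurrentDictionary<int, Record> _cache =
        new ConcurrentDictionary<int, Record>();

    public void Save(Record record)
    {
        SaveToDatabase(record);     // persist first...
        _cache[record.Id] = record; // ...then refresh the cache
    }

    public Record Get(int id)
    {
        Record cached;
        if (_cache.TryGetValue(id, out cached))
            return cached;          // cache hit: no DB round trip

        var loaded = LoadFromDatabase(id);
        if (loaded != null)
            _cache[id] = loaded;
        return loaded;
    }

    private void SaveToDatabase(Record record) { /* INSERT/UPDATE */ }
    private Record LoadFromDatabase(int id) { /* SELECT by key */ return null; }
}

public class Record
{
    public int Id { get; set; }
}
```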
