Multi-threaded async web service call in C# .NET 3.5

I have two ASP.NET 3.5 ASMX web services, ws2 and ws3. They contain operations op21 and op31 respectively; op21 sleeps for 2 seconds and op31 sleeps for 3 seconds. I want to call both op21 and op31 asynchronously from op11 in a web service, ws1, so that when I call op11 synchronously from a client, the total time taken is 3 seconds (the duration of the slower call). I currently get 5 seconds with this code:
WS2SoapClient ws2 = new WS2SoapClient();
WS3SoapClient ws3 = new WS3SoapClient();
//capture time
DateTime now = DateTime.Now;
//make calls
IAsyncResult result1 = ws3.BeginOP31(null,null);
IAsyncResult result2 = ws2.BeginOP21(null,null);
WaitHandle[] handles = { result1.AsyncWaitHandle, result2.AsyncWaitHandle };
WaitHandle.WaitAll(handles);
//calculate time difference
TimeSpan ts = DateTime.Now.Subtract(now);
return "Asynchronous Execution Time (h:m:s:ms): " + String.Format("{0}:{1}:{2}:{3}",
ts.Hours,
ts.Minutes,
ts.Seconds,
ts.Milliseconds);
The expected result is that the total time for both requests should be equal to the time it takes for the slower request to execute.
Note that this works as expected when I debug it with Visual Studio; however, when running on IIS, the time is 5 seconds, which suggests the requests are not processed concurrently.
My question is: is there a specific configuration with IIS and the ASMX web services that might need to be set up properly for this to work as expected?

Original Answer:
I tried this with google.com and bing.com and am getting the same thing: linear execution. The problem is that you are starting the BeginOP() calls on the same thread, and the AsyncResult (for whatever reason) is not returned until the call is completed. Kind of useless.
My pre-TPL multi-threading is a bit rusty, but I tested the code at the end of this answer and it executes asynchronously. This is a .NET 3.5 console app. Note that I obviously stubbed out some of your code but made the classes look the same.
Update:
I started second-guessing myself because my execution times were so close to each other, which was confusing. So I rewrote the test a little to include both your original code and my suggested code using Thread.Start(). Additionally, I added Thread.Sleep(N) in the WebRequest methods so that they simulate vastly different execution times for the requests.
The test results show that the code you posted executed sequentially, as I stated above in my original answer.
Note that the total time is much longer in both cases than the actual web request time because of the Thread.Sleep(). I also added the Thread.Sleep() to offset the fact that the first web request to any site takes a long time to spin up (around 9 seconds in my runs). Either way you slice it, it's clear that the times are sequential in the "old" case and truly "asynchronous" in the new case.
The updated program for testing this out:
using System;
using System.Collections.Generic;
using System.Linq;
using System.Net;
using System.Text;
using System.Threading;
namespace MultiThreadedTest
{
class Program
{
static void Main(string[] args)
{
// Test both ways of executing IAsyncResult web calls
ExecuteUsingWaitHandles();
Console.WriteLine();
ExecuteUsingThreadStart();
Console.ReadKey();
}
private static void ExecuteUsingWaitHandles()
{
Console.WriteLine("Starting to execute using wait handles (old way) ");
WS2SoapClient ws2 = new WS2SoapClient();
WS3SoapClient ws3 = new WS3SoapClient();
IAsyncResult result1 = null;
IAsyncResult result2 = null;
// Time the threads
var stopWatchBoth = System.Diagnostics.Stopwatch.StartNew();
result1 = ws3.BeginOP31();
result2 = ws2.BeginOP21();
WaitHandle[] handles = { result1.AsyncWaitHandle, result2.AsyncWaitHandle };
WaitHandle.WaitAll(handles);
stopWatchBoth.Stop();
// Display execution time of individual calls
Console.WriteLine((result1.AsyncState as StateObject));
Console.WriteLine((result2.AsyncState as StateObject));
// Display time for both calls together
Console.WriteLine("Asynchronous Execution Time for both is {0}", stopWatchBoth.Elapsed.TotalSeconds);
}
private static void ExecuteUsingThreadStart()
{
Console.WriteLine("Starting to execute using thread start (new way) ");
WS2SoapClient ws2 = new WS2SoapClient();
WS3SoapClient ws3 = new WS3SoapClient();
IAsyncResult result1 = null;
IAsyncResult result2 = null;
// Create threads to execute the methods asynchronously
Thread startOp3 = new Thread( () => result1 = ws3.BeginOP31() );
Thread startOp2 = new Thread( () => result2 = ws2.BeginOP21() );
// Time the threads
var stopWatchBoth = System.Diagnostics.Stopwatch.StartNew();
// Start the threads
startOp2.Start();
startOp3.Start();
// Make this thread wait until both of those threads are complete
startOp2.Join();
startOp3.Join();
stopWatchBoth.Stop();
// Display execution time of individual calls
Console.WriteLine((result1.AsyncState as StateObject));
Console.WriteLine((result2.AsyncState as StateObject));
// Display time for both calls together
Console.WriteLine("Asynchronous Execution Time for both is {0}", stopWatchBoth.Elapsed.TotalSeconds);
}
}
// Class representing your WS2 client
internal class WS2SoapClient : TestWebRequestAsyncBase
{
public WS2SoapClient() : base("http://www.msn.com/") { }
public IAsyncResult BeginOP21()
{
Thread.Sleep(TimeSpan.FromSeconds(10D));
return BeginWebRequest();
}
}
// Class representing your WS3 client
internal class WS3SoapClient : TestWebRequestAsyncBase
{
public WS3SoapClient() : base("http://www.google.com/") { }
public IAsyncResult BeginOP31()
{
// Added sleep here to simulate a much longer request, which should make it obvious if the times are overlapping or sequential
Thread.Sleep(TimeSpan.FromSeconds(20D));
return BeginWebRequest();
}
}
// Base class that makes the web request
internal abstract class TestWebRequestAsyncBase
{
public StateObject AsyncStateObject;
protected string UriToCall;
public TestWebRequestAsyncBase(string uri)
{
AsyncStateObject = new StateObject()
{
UriToCall = uri
};
this.UriToCall = uri;
}
protected IAsyncResult BeginWebRequest()
{
WebRequest request =
WebRequest.Create(this.UriToCall);
AsyncCallback callBack = new AsyncCallback(onCompleted);
AsyncStateObject.WebRequest = request;
AsyncStateObject.Stopwatch = System.Diagnostics.Stopwatch.StartNew();
return request.BeginGetResponse(callBack, AsyncStateObject);
}
void onCompleted(IAsyncResult result)
{
this.AsyncStateObject = (StateObject)result.AsyncState;
this.AsyncStateObject.Stopwatch.Stop();
var webResponse = this.AsyncStateObject.WebRequest.EndGetResponse(result);
// Avoid passing ContentType as a format string; print both values explicitly
Console.WriteLine("{0} {1}", webResponse.ContentType, webResponse.ResponseUri);
webResponse.Close();
}
}
// Keep stopwatch on state object for illustration of individual execution time
internal class StateObject
{
public System.Diagnostics.Stopwatch Stopwatch { get; set; }
public WebRequest WebRequest { get; set; }
public string UriToCall;
public override string ToString()
{
return string.Format("Request to {0} executed in {1} seconds", this.UriToCall, Stopwatch.Elapsed.TotalSeconds);
}
}
}

There is some throttling in your system. Most likely the service is configured for only one concurrent caller, which is a common cause (WCF ConcurrencyMode). There might also be HTTP-level connection limits (ServicePointManager.DefaultConnectionLimit) or WCF throttling on the server.
Use Fiddler to determine whether both requests are actually sent at the same time, and use the debugger to break on the server and check whether both calls are running concurrently.
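To illustrate the ServicePointManager angle: by default the number of outbound HTTP connections per host is small, so ws1's concurrent calls to ws2 and ws3 can end up queued at the socket level. A minimal sketch of raising the limit at application start, assuming that limit (rather than service-side throttling) is the bottleneck:

// Global.asax.cs of ws1 -- a sketch, not a confirmed fix for the 5-second case.
using System;
using System.Net;

public class Global : System.Web.HttpApplication
{
    protected void Application_Start(object sender, EventArgs e)
    {
        // Allow more simultaneous outbound HTTP connections per host so the
        // asynchronous calls to ws2 and ws3 are not serialized while waiting for a socket.
        ServicePointManager.DefaultConnectionLimit = 24;
    }
}

The same limit can also be raised declaratively in web.config under system.net/connectionManagement.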

Related

MongoDB Client throws Wait Queue Full exception

I am seeing an odd issue where the .NET client for MongoDB throws a "The wait queue for acquiring a connection to server 127.0.0.1:27017 is full." exception.
I have a semaphore that guards any call to MongoDB, with a size of 10, meaning there are never more than 10 concurrent calls to Mongo.
The default connection pool size is 100 for the .NET driver, which is more than 10, so 10 concurrent calls should not be an issue.
To replicate this I have the following code; it's contrived, yes, but it makes the issue visible.
I also found this spec for MongoDB
https://github.com/mongodb/specifications/blob/master/source/connection-monitoring-and-pooling/connection-monitoring-and-pooling.rst#id94
Is that related?
Does each calling thread (a thread pool worker in this case) go into the wait queue and try to grab a connection? If I have more worker threads, do connections still have to be assigned to each new calling worker thread, even when the concurrency level is low?
using System;
using System.Threading.Tasks;
using MongoDB.Bson;
using MongoDB.Driver;
using System.Threading;
namespace ConsoleApp58
{
public class AsyncSemaphore
{
private readonly SemaphoreSlim _semaphore;
public AsyncSemaphore(int maxConcurrency)
{
_semaphore = new SemaphoreSlim(
maxConcurrency,
maxConcurrency
);
}
public async Task<T> WaitAsync<T>(Task<T> task)
{
await _semaphore.WaitAsync();
//proves we have the correct max concurrent calls
// Console.WriteLine(_semaphore.CurrentCount);
try
{
var result = await task;
return result;
}
finally
{
_semaphore.Release();
}
}
}
class Program
{
public class SomeEntity
{
public ObjectId Id { get; set; }
public string Name { get; set; }
}
static void Main(string[] args)
{
var settings = MongoClientSettings.FromUrl(MongoUrl.Create("mongodb://127.0.0.1:27017"));
// settings.MinConnectionPoolSize = 10;
// settings.MaxConnectionPoolSize = 1000;
// I get that I can tweak settings, but I want to know why this occurs at all?
// if we guard the calls with a semaphore, how can this happen?
var mongoClient = new MongoClient(settings);
var someCollection = mongoClient.GetDatabase("dummydb").GetCollection<SomeEntity>("some");
var a = new AsyncSemaphore(10);
// is this somehow related ?
// https://github.com/mongodb/specifications/blob/master/source/connection-monitoring-and-pooling/connection-monitoring-and-pooling.rst#id94
_ = Task.Run(() =>
{
while (true)
{
// this bit is protected by a semaphore of size 10
// (we will flood the thread pool with ongoing tasks, yes)
_ = a.WaitAsync(RunTask(someCollection))
//after the task is done, dump the result
// dot is OK, else exception message
.ContinueWith(x =>
{
if (x.IsFaulted)
{
Console.WriteLine(x.Exception);
}
});
}
}
);
Console.ReadLine();
}
private static async Task<SomeEntity> RunTask(IMongoCollection<SomeEntity> pids)
{
//simulate some mongo interaction here
var res = await pids.Find(x => x.Name == "").FirstOrDefaultAsync();
return res;
}
}
}
Connections take time to be established. You do not instantly get 100 usable connections. If you create a client and immediately request even 10 operations while there are no available connections, you can hit the wait queue timeout.
Some drivers also had a wait queue length limit. It's not standardized and should be deprecated in my understanding, but it may continue to exist for compatibility reasons. Consult your driver docs to see how to raise it.
Then either increase waitQueueTimeoutMS, ramp up the load gradually, or wait for connections to be established before starting the load (you can use CMAP events for the latter).
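With the .NET driver, those mitigations might look roughly like this. This is a sketch; WaitQueueTimeout and MinConnectionPoolSize are the settings I believe correspond to waitQueueTimeoutMS and pool pre-warming, so double-check them against your driver version:

var settings = MongoClientSettings.FromUrl(MongoUrl.Create("mongodb://127.0.0.1:27017"));
// Give callers longer to acquire a pooled connection during the initial burst.
settings.WaitQueueTimeout = TimeSpan.FromSeconds(30);
// Keep a minimum number of connections established so the first wave of
// operations does not queue behind connection handshakes.
settings.MinConnectionPoolSize = 10;
var mongoClient = new MongoClient(settings);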
Make sure your concurrency bound of 10 outstanding operations is actually working properly too.
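One thing worth checking in the posted code: RunTask(someCollection) is invoked before WaitAsync awaits the semaphore, so the Find call has already started by the time a slot is requested; the semaphore only bounds how many results are being awaited, not how many operations are in flight. A sketch of a variant that defers the work until a slot is held (the Func<Task<T>> overload is my addition, not part of the original code):

// Inside AsyncSemaphore: start the operation only after acquiring a slot.
public async Task<T> WaitAsync<T>(Func<Task<T>> taskFactory)
{
    await _semaphore.WaitAsync();
    try
    {
        return await taskFactory();
    }
    finally
    {
        _semaphore.Release();
    }
}

// Caller passes a lambda so the Mongo call does not start early:
// _ = a.WaitAsync(() => RunTask(someCollection));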

Getting HTML response fails respectively after first fail

I have a program which fetches the HTML for ~500 web pages every 5 minutes.
It runs correctly until the first failure (unable to download the source within 6 seconds); after that, all threads fail.
If I restart the program, it again runs correctly until the next failure.
Where am I going wrong, and what should I do to make this better?
This function runs every 5 minutes:
foreach (Company company in companies)
{
string link = company.GetLink();
Thread t = new Thread(() => F(company, link));
t.Start();
if (!t.Join(TimeSpan.FromSeconds(6)))
{
Debug.WriteLine( company.Name + " Fails");
t.Abort();
}
}
And this function downloads the HTML:
private void F(Company company, string link)
{
try
{
string htmlCode = GetInformationFromWeb.GetHtmlRequest(link);
company.HtmlCode = htmlCode;
}
catch (Exception ex)
{
}
}
and this class:
public class GetInformationFromWeb
{
public static string GetHtmlRequest(string url)
{
using (MyWebClient client = new MyWebClient())
{
client.Encoding = Encoding.UTF8;
string htmlCode = client.DownloadString(url);
return htmlCode;
}
}
}
And the web client class:
public class MyWebClient : WebClient
{
protected override WebRequest GetWebRequest(Uri address)
{
HttpWebRequest request = base.GetWebRequest(address) as HttpWebRequest;
request.AutomaticDecompression = DecompressionMethods.Deflate | DecompressionMethods.GZip;
return request;
}
}
If your foreach is looping over 500 companies and each creates a new thread, your internet connection can become a bottleneck, so you will receive timeouts over 6 seconds and fail very often.
I suggest you try parallelism instead. Note MaxDegreeOfParallelism, which caps the number of parallel executions; you can tune this to suit your needs.
Parallel.ForEach(companies, new ParallelOptions { MaxDegreeOfParallelism = 10 }, (company) =>
{
try
{
string htmlCode = GetInformationFromWeb.GetHtmlRequest(company.GetLink());
company.HtmlCode = htmlCode;
}
catch(Exception ex)
{
//ignore or process exception
}
});
I have four basic suggestions:
Use HttpClient instead of the obsolete WebClient. HttpClient handles asynchronous operations natively and is far more flexible. You can read the downloaded content into strings/streams on a different thread, since you can configure await not to schedule continuations back onto the original context, and you can set HttpClient's Timeout to 6 seconds so that a TaskCanceledException is raised when it is exceeded.
Avoid swallowing exceptions (as you do in your F function): it breaks debugging and obscures the real cause of problems. A correctly written program will never raise an exception during normal operation.
You are using threads in a way that defeats their purpose: they barely overlap, because the calling loop blocks on Join after starting each thread. In .NET it is better to do this kind of multitasking with Tasks, for example Task.Run(async () => await YourWorkAsync()) (or AsyncContext.Run(...) if you need UI access), which doesn't block anything.
The whole GetInformationFromWeb class is pointless as it stands, and you are spawning multiple client objects needlessly; a single HTTP client object can handle multiple requests. With HttpClient you would instantiate it once as a static field with the necessary configuration and then call it from anywhere with as little code as client.GetStringAsync(uri) - see the sketch below.
Off topic: is this some kind of academic project?
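To make the HttpClient points concrete, here is a rough sketch. It assumes .NET 4.5+ and the Company type from the question (with GetLink(), Name and HtmlCode); the 6-second timeout and the cap of 10 parallel downloads mirror the numbers already used above and are otherwise arbitrary:

using System;
using System.Collections.Generic;
using System.Diagnostics;
using System.Linq;
using System.Net;
using System.Net.Http;
using System.Threading;
using System.Threading.Tasks;

public static class PageFetcher
{
    // One client, reused for every request; HttpClient is designed to be shared.
    private static readonly HttpClient Client = new HttpClient(
        new HttpClientHandler { AutomaticDecompression = DecompressionMethods.GZip | DecompressionMethods.Deflate })
    {
        Timeout = TimeSpan.FromSeconds(6) // replaces the Join(6s)/Abort pattern
    };

    public static async Task FetchAllAsync(IEnumerable<Company> companies, int maxParallel = 10)
    {
        var throttle = new SemaphoreSlim(maxParallel); // bound concurrent downloads
        var tasks = companies.Select(async company =>
        {
            await throttle.WaitAsync();
            try
            {
                company.HtmlCode = await Client.GetStringAsync(company.GetLink());
            }
            catch (HttpRequestException ex)
            {
                Debug.WriteLine(company.Name + " failed: " + ex.Message);
            }
            catch (TaskCanceledException)
            {
                Debug.WriteLine(company.Name + " timed out");
            }
            finally
            {
                throttle.Release();
            }
        });
        await Task.WhenAll(tasks);
    }
}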

Multiple parallel calls to WCF Service takes longer than single call

I'm testing WCF concurrency and instancing.
There is wcf Service :
public class Service1 : IService1
{
public string GetData(int value)
{
Thread.Sleep(1000);
return string.Format("You entered: {0}", value);
}
}
From my Forms application I make a call to this service method. When I make a single call, it takes approximately 1 second, as expected.
private void single_Click(object sender, EventArgs e)
{
using (var service = new Service1Client())
{
var sw = new Stopwatch();
sw.Start();
service.GetData(1);
sw.Stop();
Debug.WriteLine(sw.Elapsed);
}
}
But when I call it multiple times with Tasks, it takes approximately (call count × 1 second).
private void mult_Click(object sender, EventArgs e)
{
using (var service = new Service1Client())
{
var tasks = new List<Task<string>>();
for (var i = 0; i < 5; i++)
{
int p = i;
tasks.Add(Task.Factory.StartNew(() => service.GetData(p)));
}
var sw = new Stopwatch();
sw.Start();
Task.WaitAll(tasks.ToArray());
sw.Stop();
Debug.WriteLine(sw.Elapsed);
foreach (var task in tasks)
{
Debug.WriteLine(task.Result);
}
}
}
I've tried all 9 combinations of Instancing and Concurrency (Instance mode = Per Call and Concurrency = Single, etc.).
Interestingly, if I create a new ServiceClient object for each Task, it works fine, but I don't think that is the right approach. I feel there must be something I missed. If so, can you tell me what exactly?
The issue is on the client side.
You have to explicitly call Open() on the Service1Client object before making any calls to the service. Otherwise your WCF client proxy will internally call EnsureOpened(). The problem is that EnsureOpened() makes each request wait until the previous request has completed, which is why only one request is sent out at a time rather than in parallel as desired.
Change your code like this:
using (var service = new Service1Client())
{
service.Open();
// Do stuff...
}
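Applied to the mult_Click handler from the question, the fix might look like this. This is a sketch; only the explicit Open() call is new, everything else is unchanged from the code above:

private void mult_Click(object sender, EventArgs e)
{
    using (var service = new Service1Client())
    {
        // Open the channel up front so the first call does not trigger auto-open
        // and serialize the remaining calls behind it.
        service.Open();
        var tasks = new List<Task<string>>();
        for (var i = 0; i < 5; i++)
        {
            int p = i;
            tasks.Add(Task.Factory.StartNew(() => service.GetData(p)));
        }
        var sw = Stopwatch.StartNew();
        Task.WaitAll(tasks.ToArray());
        sw.Stop();
        Debug.WriteLine(sw.Elapsed); // should now be close to 1 second, provided instancing/concurrency allow it
    }
}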
From Wenlong Dong's excellent blog post on the subject:
If you don’t call the “Open” method first, the proxy would be opened internally when the first call is made on the proxy. This is called auto-open. Why? When the first message is sent through the auto-opened proxy, it will cause the proxy to be opened automatically. You can use .NET Reflector to open the method System.ServiceModel.Channels.ServiceChannel.Call and see the following code:
if (!this.explicitlyOpened)
{
this.EnsureDisplayUI();
this.EnsureOpened(rpc.TimeoutHelper.RemainingTime());
}
When you drill down into EnsureOpened, you will see that it calls CallOnceManager.CallOnce. For non-first calls, you would hit SyncWait.Wait, which waits for the first request to complete. This mechanism ensures that all requests wait for the proxy to be opened and it also ensures the correct execution order. Thus all requests are serialized into a single execution sequence until all requests are drained out from the queue. This is not a desired behavior in most cases.

Accessing Response Object While Using WebForms Async/Await

I have a simple .NET 4.5 WebForms project that I am using to output the results of some service checks. I have already created service-checking classes that are multi-threaded and awaitable (each check might take 1-3 seconds and I want them checked in parallel). I want the result of each service check to be written to the web page and flushed as soon as it is determined (I don't care if the results are in order). The service checker methods have been tested and work fine in a console application, but I'm having trouble porting them to a WebForms app.
The code below somewhat works (very randomly). Sometimes it's perfect. Other times it "skips" displaying the results of one of the tests. Other times it mixes the CSS style declaration in with the results! Mostly, though, it crashes on the Response.Flush line in CheckService with "Overlapped I/O operation is in progress".
How can I write to the web page response buffer as each check finishes, and display the results in a stable manner?
Here's the code for the aspx page...
<%@ Page Language="C#" Async="true" AutoEventWireup="true" Inherits="ServiceMonitor._Default" Codebehind="Default.aspx.cs" %>
And here's the code behind...
public partial class _Default : System.Web.UI.Page
{
protected void Page_Load(object sender, EventArgs e)
{
Response.Write("<style type=\"text/css\">body {font-family: Arial;font-size: 10pt;}</style>");
Response.Write("Beginning processing...<br/>");
Response.Flush();
RegisterAsyncTask(new PageAsyncTask(CheckServices));
}
protected async Task CheckServices()
{
Response.Write(string.Format("Starting checks at {0}<br/>", DateTime.Now.ToLongTimeString()));
var tasks = new List<Task<ServiceStatus>>
{
Task.Run(() => CheckService("ServiceOne")),
Task.Run(() => CheckService("ServiceTwo")),
Task.Run(() => CheckService("ServiceThree"))
};
var checkResults = (await Task.WhenAll(tasks)).ToList();
Response.Write(string.Format("Checking complete at {0}<br/>", DateTime.Now.ToLongTimeString()));
Response.Flush();
}
public ServiceStatus CheckService(string serviceName)
{
var startTime = DateTime.Now;
// Simulate a longer running process, by pausing for 1-3 seconds
var random = new Random();
var end = DateTime.Now + TimeSpan.FromMilliseconds(random.Next(1000, 3000));
while (DateTime.Now < end) { }
var elapsedTime = (DateTime.Now - startTime).TotalSeconds;
var service = new ServiceStatus {Name = serviceName, IsRunning = true};
Response.Write(string.Format("Done with {0} in {1} seconds<br/>", service.Name, elapsedTime.ToString("N2")));
Response.Flush();
return service;
}
}
public class ServiceStatus
{
public string Name { get; set; }
public bool IsRunning { get; set; }
}
FYI - The app isn't supposed to be pretty, hence the removal of standard HTML markup. It also won't be accessed by many users (really just me) so IIS blocking isn't really a concern and the page should return back within 20-30 seconds.
Your problem is that you're trying to access the request context (i.e., Response.Write) from a random background thread (i.e., Task.Run).
The ideal solution is to go async all the way. By this, I mean to make your service checkers asynchronous but not multithreaded. If they're implemented using an API ping, then you can check out HttpClient for the implementation.
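If the checks really are HTTP pings, an asynchronous checker might look like this rough sketch (the url parameter and the "any 2xx response means running" criterion are assumptions, not from the original code):

// Sketch of an asynchronous, non-blocking service check via HTTP.
private static readonly HttpClient Http = new HttpClient { Timeout = TimeSpan.FromSeconds(5) };

public async Task<ServiceStatus> PingServiceAsync(string serviceName, string url)
{
    try
    {
        var response = await Http.GetAsync(url);
        return new ServiceStatus { Name = serviceName, IsRunning = response.IsSuccessStatusCode };
    }
    catch (HttpRequestException)
    {
        return new ServiceStatus { Name = serviceName, IsRunning = false };
    }
    catch (TaskCanceledException) // timeout
    {
        return new ServiceStatus { Name = serviceName, IsRunning = false };
    }
}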
Once they're asynchronous, the CheckService method will be async as well:
public async Task<ServiceStatus> CheckServiceAsync(string serviceName)
{
var startTime = DateTime.Now;
// Simulate a longer running process, by pausing for 1-3 seconds
var random = new Random();
var waitTime = TimeSpan.FromMilliseconds(random.Next(1000, 3000));
await Task.Delay(waitTime);
var elapsedTime = (DateTime.Now - startTime).TotalSeconds;
var service = new ServiceStatus {Name = serviceName, IsRunning = true};
Response.Write(string.Format("Done with {0} in {1} seconds<br/>", service.Name, elapsedTime.ToString("N2")));
Response.Flush();
return service;
}
And it can be called concurrently, without multithreading:
protected async Task CheckServicesAsync()
{
Response.Write(string.Format("Starting checks at {0}<br/>", DateTime.Now.ToLongTimeString()));
var tasks = new List<Task<ServiceStatus>>
{
CheckService("ServiceOne"),
CheckService("ServiceTwo"),
CheckService("ServiceThree")
};
var checkResults = (await Task.WhenAll(tasks)).ToList();
Response.Write(string.Format("Checking complete at {0}<br/>", DateTime.Now.ToLongTimeString()));
Response.Flush();
}
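For completeness, the Page_Load from the question would then register the renamed method. This is a sketch; the CSS and status writes are kept as in the original:

protected void Page_Load(object sender, EventArgs e)
{
    Response.Write("<style type=\"text/css\">body {font-family: Arial;font-size: 10pt;}</style>");
    Response.Write("Beginning processing...<br/>");
    Response.Flush();
    // Register the renamed async method; ASP.NET runs it on the request context,
    // so the Response.Write calls inside it are safe.
    RegisterAsyncTask(new PageAsyncTask(CheckServicesAsync));
}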

Processing Messages in Parallel in Azure Service Bus

Problem: I've got tons of emails to send; presently there are, on average, 10 emails in the queue at any point in time. The code I have processes the queue one message at a time: receive the message, process it, and eventually send the email. This causes a considerable delay in sending emails to users when they sign up for the service.
I've begun to think about modifying the code to process the messages in parallel, say 5 at a time, asynchronously. I'm imagining writing a method and using the CTP to call this method in parallel, say, 5 times.
I'm a little lost on how to implement this. The cost of making a mistake is exceedingly high, as users will be disappointed if things go wrong.
Request: I need help in writing code that process messages in Azure service bus in parallel.
Thanks.
My code in a nutshell.
Public .. Run()
{
_myQueueClient.BeginReceive(ProcessUrgentEmails, _myQueueClient);
}
void ProcessUrgentEmails(IAsyncResult result)
{
//casted the `result` as a QueueClient
//Used EndReceive on an object of BrokeredMessage
//I processed the message, then called
sendEmail.BeginComplete(ProcessEndComplete, sendEmail);
}
//This method is never called despite having it as callback function above.
void ProcessEndComplete(IAsyncResult result)
{
Trace.WriteLine("ENTERED ProcessEndComplete method...");
var bm = result.AsyncState as BrokeredMessage;
bm.EndComplete(result);
}
This page gives you performance tips when using Windows Azure Service Bus.
For parallel processing, you could have a pool of threads, and every time you get a message you grab one from that pool and hand it the message; you need to manage that pool yourself (a semaphore-based sketch of this idea follows).
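One way to realize that idea without hand-managing threads is to bound the in-flight work with a SemaphoreSlim. This is a bare sketch; ProcessMessage stands in for your existing email-sending logic and the limit of 5 is arbitrary:

// Cap the number of messages being processed at once.
static readonly SemaphoreSlim Slots = new SemaphoreSlim(5);

void OnMessageReceived(BrokeredMessage message)
{
    Slots.Wait(); // blocks when 5 messages are already being processed
    Task.Factory.StartNew(() =>
    {
        try
        {
            ProcessMessage(message); // your existing email-sending logic
            message.Complete();
        }
        finally
        {
            Slots.Release(); // free the slot for the next message
        }
    });
}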
OR, you could retrieve multiple messages at once and process them using the TPL... for example, the BeginReceiveBatch/EndReceiveBatch methods let you retrieve multiple items from the queue asynchronously, and AsParallel() then turns the returned IEnumerable into a parallel query so the messages are processed on multiple threads.
VERY simple and BARE BONES sample:
var messages = await Task.Factory.FromAsync<IEnumerable<BrokeredMessage>>(Client.BeginReceiveBatch(3, null, null), Client.EndReceiveBatch);
messages.AsParallel().WithDegreeOfParallelism(3).ForAll(item =>
{
ProcessMessage(item);
});
That code retrieves 3 messages from the queue and processes them on "3 threads". (Note: it is not guaranteed to use 3 threads; .NET analyzes the available system resources and uses up to 3 threads if necessary.)
You could also remove the WithDegreeOfParallelism part and let .NET use however many threads it needs.
At the end of the day there are multiple ways to do it; you have to decide which one works best for you.
UPDATE: Sample without using ASYNC/AWAIT
This is a basic sample (without error checking) using the regular Begin/End async pattern.
using System;
using System.Collections.Generic;
using System.Diagnostics;
using System.Linq;
using System.Net;
using System.Threading;
using Microsoft.ServiceBus;
using Microsoft.ServiceBus.Messaging;
using Microsoft.WindowsAzure;
using Microsoft.WindowsAzure.ServiceRuntime;
namespace WorkerRoleWithSBQueue1
{
public class WorkerRole : RoleEntryPoint
{
// The name of your queue
const string QueueName = "QUEUE_NAME";
const int MaxThreads = 3;
// QueueClient is thread-safe. Recommended that you cache
// rather than recreating it on every request
QueueClient Client;
bool IsStopped;
int dequeueRequests = 0;
public override void Run()
{
while (!IsStopped)
{
// Increment Request Counter
Interlocked.Increment(ref dequeueRequests);
Trace.WriteLine(dequeueRequests + " request(s) in progress");
Client.BeginReceive(new TimeSpan(0, 0, 10), ProcessUrgentEmails, Client);
// If we have made too many requests, wait for them to finish before requesting again.
while (dequeueRequests >= MaxThreads && !IsStopped)
{
System.Diagnostics.Trace.WriteLine(dequeueRequests + " requests in progress, waiting before requesting more work");
Thread.Sleep(2000);
}
}
}
void ProcessUrgentEmails(IAsyncResult result)
{
var qc = result.AsyncState as QueueClient;
var sendEmail = qc.EndReceive(result);
// We have received a message or has timeout... either way we decrease our counter
Interlocked.Decrement(ref dequeueRequests);
// If we have a message, process it
if (sendEmail != null)
{
var r = new Random();
// Process the message
Trace.WriteLine("Processing message: " + sendEmail.MessageId);
System.Threading.Thread.Sleep(r.Next(10000));
// Mark it as completed
sendEmail.BeginComplete(ProcessEndComplete, sendEmail);
}
}
void ProcessEndComplete(IAsyncResult result)
{
var bm = result.AsyncState as BrokeredMessage;
bm.EndComplete(result);
Trace.WriteLine("Completed message: " + bm.MessageId);
}
public override bool OnStart()
{
// Set the maximum number of concurrent connections
ServicePointManager.DefaultConnectionLimit = 12;
// Create the queue if it does not exist already
string connectionString = CloudConfigurationManager.GetSetting("Microsoft.ServiceBus.ConnectionString");
var namespaceManager = NamespaceManager.CreateFromConnectionString(connectionString);
if (!namespaceManager.QueueExists(QueueName))
{
namespaceManager.CreateQueue(QueueName);
}
// Initialize the connection to Service Bus Queue
Client = QueueClient.CreateFromConnectionString(connectionString, QueueName);
IsStopped = false;
return base.OnStart();
}
public override void OnStop()
{
// Waiting for all requests to finish (or time out) before closing
while (dequeueRequests > 0)
{
System.Diagnostics.Trace.WriteLine(dequeueRequests + " request(s), waiting before stopping");
Thread.Sleep(2000);
}
// Close the connection to Service Bus Queue
IsStopped = true;
Client.Close();
base.OnStop();
}
}
}
Hope it helps.
