IIS/Azure IIS - restrict max response time - C#

The web service I develop must respond within 6 seconds at most. If the web service is unable to respond within 6 seconds, it must respond with some hardcoded data.
The web service is a kind of proxy that transforms incoming data and makes a request to another service. That service is third-party and may fail or take too long to respond.
Currently there is a restriction in the application code that limits the timeout of the call to the third-party service to 3 seconds (remember, the total response time must be 6 seconds at most).
But the problem is that the application is under very heavy load, and sometimes it cannot execute its own code within the remaining 3 seconds (without load our code executes in milliseconds; there is really very little of it).
I believe that Windows cannot schedule all the threads in time, so threads sometimes sleep for more than 3 seconds, and the maximum response time is violated with no way to handle it.
The application is developed in ASP.NET Core in C# and is fully async.
So the questions are: Is it possible to limit response time at the IIS level? Is it possible to respond with predefined data if a timeout occurs? Is there a way to improve responsiveness other than adding extra instances on Azure? Is there a programmatic way to handle timeouts robustly?
EDIT
I need:
1. A guaranteed response time on IIS under high load. Is that possible at all on a non-real-time OS?
2. When a request is not processed in time, a predefined hardcoded response to be sent.
3. All of this on Azure.

For the IIS configuration itself, timeouts are limited to whole minutes, which you could otherwise have used as a simple escape hatch; since you can't do that here, I would use something similar to the Enterprise Library retry logic, that is, make a small class that takes a method and a timeout value:
public class TimeRestrictor
{
    public bool InvokeAction(Action action, TimeSpan timeout)
    {
        // run the action; return false if it did not finish within the timeout
    }

    public Tuple<bool, T> InvokeFunction<T>(Func<T> function, TimeSpan timeout)
    {
        // run the function; return (finishedInTime, result)
    }
}
You can easily implement this class with a timer of some sort, or using async/await (I left those out for simplicity).
It would then be a matter of calling this method as:
var ranInTime = timeRestrictor.InvokeAction(MyMethod, TimeSpan.FromSeconds(6));
if (!ranInTime)
{
    SetTimeOutResult();
}
This needs some tweaking, as I'm just writing this code off the top of my head, but it gives you a general idea of my suggestion.
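For completeness, here is a minimal async sketch of the same idea, racing the work against Task.Delay with Task.WhenAny (the class and method names are mine, not part of the answer above):

using System;
using System.Threading.Tasks;

public static class TimeRestrictedRunner
{
    // Races the function against a delay; returns (true, result) if it
    // finished within the timeout, otherwise (false, default(T)).
    // Note: on timeout the underlying work keeps running in the background.
    public static async Task<Tuple<bool, T>> InvokeFunctionAsync<T>(
        Func<Task<T>> function, TimeSpan timeout)
    {
        var work = function();
        var first = await Task.WhenAny(work, Task.Delay(timeout));
        if (first == work)
            return Tuple.Create(true, await work);   // propagates exceptions too
        return Tuple.Create(false, default(T));
    }
}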

Related

Keep MSSQL database (application pool) responsive C#

Firstly let me apologise, as I don't really know how to phrase the question.
The issue I'm having is trying to keep my database 'alive' while users come to my site. For example, if I build my C# ASP.NET application, publish it, and then try to navigate to it, it takes a while to respond (which I get; I understand it, and this isn't an issue for me). The problem is that if a person hasn't been to the site for a while, it seems to take a while again, as if a session timer has passed. I'm not sure if this is something to do with App Pool recycling?
I've tried running a scheduled task that hits the database every 15 minutes (trying to keep it responsive), but this doesn't seem to work: it runs fine every 15 minutes for, say, 5 hours, and then on some random call I receive a message that the request took over 4 seconds to respond and therefore failed.
My question then: how do I keep my connection to the database / the site responsive, so that each time a person requests it the site loads quickly rather than having to 'start up'?
Kind regards as always
I suggest increasing the connection pool size in your connection string.
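As a sketch (the server and database names are placeholders, and the pool sizes are only illustrative; a non-zero Min Pool Size also keeps a few connections open between visits):

var connectionString =
    "Server=myServer;Database=myDb;Integrated Security=true;" +
    "Min Pool Size=5;Max Pool Size=200;";   // illustrative values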
This looks like what you want: Keep an ASP.NET IIS website responsive when time between visits is long
You might consider IIS application auto-start?
Some web applications need to load large amounts of data, or perform expensive initialization processing, before they are ready to process requests. Developers using ASP.NET today often do this work using the “Application_Start” event handler within the Global.asax file of an application (which fires the first time a request executes). They then either devise custom scripts to send fake requests to the application to periodically “wake it up” and execute this code before a customer hits it, or simply cause the unfortunate first customer that accesses the application to wait while this logic finishes before processing the request (which can lead to a long delay for them).
ASP.NET 4 ships with a new feature called “auto-start” that better addresses this scenario, and it is available when ASP.NET 4 runs on IIS 7.5.
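As a rough sketch of the hook involved: the application implements IProcessHostPreloadClient, and the class is then registered as a serviceAutoStartProvider in applicationHost.config (the class name below is a placeholder):

using System.Web.Hosting;

public class PreWarmUp : IProcessHostPreloadClient
{
    public void Preload(string[] parameters)
    {
        // Do the expensive initialization here (e.g. open a database
        // connection and run a cheap query) before any real request arrives.
    }
}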

.NET Service to monitor web page changes

I want to create a service that will monitor changes to web pages, i.e. whether the page content has been updated. I am trying to think of the best way to achieve this, and at present I am considering a couple of options. Note that there could be hundreds of pages to monitor, and the interval for checking could be seconds or hours (configurable):
1. Create a Windows service for each page to monitor.
2. Create a Windows service that spawns a thread for each page to monitor.
Now, I am unsure which of these is the best approach and whether there is an alternative I haven't considered. I thought option 1 would have the benefit of isolating each monitoring task, but at the expense of overhead in terms of physical resources and the effort to create/maintain it. Option 2 would be slightly more complex but cleaner. Obviously it would also lose the isolation, in that if the service fails then all monitoring fails.
I have done something similar, and I solved it by having a persisted queue (a SQL Server table) that stores the remote URI along with the interval and a DateTime for the last time it ran.
I can then get all entries that I want to run by selecting the ones where lastRun + interval < now.
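A sketch of that selection in C# terms, assuming an illustrative MonitorEntry row type (the names are mine, not from the original table):

using System;
using System.Collections.Generic;
using System.Linq;

public class MonitorEntry          // illustrative row shape
{
    public string Uri;
    public TimeSpan Interval;
    public DateTime LastRun;
}

public static class DueSelector
{
    // An entry is due when its last run plus its interval is in the past.
    public static List<MonitorEntry> GetDue(IEnumerable<MonitorEntry> entries, DateTime now)
    {
        return entries.Where(e => e.LastRun + e.Interval < now).ToList();
    }
}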
If your smallest intervals are in the region of seconds, you probably want to use the ThreadPool so that you can issue several requests at the same time. (Remember to adjust the maxconnection setting in your app.config accordingly.)
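The programmatic equivalent of that setting is ServicePointManager.DefaultConnectionLimit (the value here is only illustrative):

using System.Net;

// e.g. in the service's startup code: allow more simultaneous outgoing
// connections per host than the client default of 2.
ServicePointManager.DefaultConnectionLimit = 40;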
I would use one Windows service (have a look at the TopShelf project for that) and I would then have Quartz.Net trigger the jobs. With Quartz, you can control whether it has to wait for previous jobs to finish etc.
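A minimal sketch of what such a job could look like against the Quartz.Net 2.x API (the job class, data key, URL, and interval are all illustrative); the [DisallowConcurrentExecution] attribute is what makes Quartz wait for the previous run of the same job to finish:

using Quartz;
using Quartz.Impl;

[DisallowConcurrentExecution]
public class CheckPageJob : IJob
{
    public void Execute(IJobExecutionContext context)
    {
        var uri = context.MergedJobDataMap.GetString("uri");
        // fetch the page at uri and compare it against the stored copy...
    }
}

// Scheduling it to run every 30 seconds:
var scheduler = StdSchedulerFactory.GetDefaultScheduler();
scheduler.Start();
var job = JobBuilder.Create<CheckPageJob>()
    .UsingJobData("uri", "http://example.com")
    .Build();
var trigger = TriggerBuilder.Create()
    .StartNow()
    .WithSimpleSchedule(s => s.WithIntervalInSeconds(30).RepeatForever())
    .Build();
scheduler.ScheduleJob(job, trigger);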
Creating one Windows service is the way to go. Regarding the failure of that Windows service, there are several measures you could take to deal with it; for example, configure Windows to automatically restart the service on failure.
I would recommend using a thread pool approach and/or a System.Threading.Timer in combination with a ConcurrentDictionary or ConcurrentQueue.
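A sketch of how those pieces could fit together (the type and member names are mine, and error handling plus the actual page diffing are left out):

using System;
using System.Collections.Concurrent;
using System.Threading;

public class PageMonitor
{
    private readonly ConcurrentQueue<string> _due = new ConcurrentQueue<string>();
    private Timer _timer;

    public void Start()
    {
        // Fires every second; each due page is checked on a thread-pool thread.
        _timer = new Timer(_ =>
        {
            string uri;
            while (_due.TryDequeue(out uri))
                ThreadPool.QueueUserWorkItem(CheckPage, uri);
        }, null, TimeSpan.Zero, TimeSpan.FromSeconds(1));
    }

    public void Enqueue(string uri)
    {
        _due.Enqueue(uri);   // called by whatever decides a page is due
    }

    private void CheckPage(object state)
    {
        // download (string)state and diff it against the last stored copy...
    }
}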

locking call to webservice in ASP.NET to avoid Oracle CRM time per web service call limit

I have a web application using ASP.NET that connects to Oracle CRM as a back end. The ASP.NET app uses some business objects to call into the Oracle CRM web services, and this works fine.
However, Oracle CRM has a limitation: it only allows you to make 20 web service calls per second (i.e. one call per 50 ms), and if you exceed this rate a SOAPException is returned: "The maximum rate of requests was exceeded. Please try again in X ms."
The traffic to the site has increased recently, so we are now getting a lot of these SOAPExceptions. As the code that calls the web service is wrapped up in a business object, I thought I would modify it to ensure that the 50 ms limit is never breached.
I use the following code:
private static readonly object lock_obj = new object();

lock (lock_obj)
{
    // placeholder for the actual web service call
    CallWebService();
    System.Threading.Thread.Sleep(50);   // hold the lock for 50 ms to space out calls
}
However, I am still getting some SOAPExceptions. I did try writing the code using mutexes instead of lock(), but the performance impact proved to be a problem.
Can anyone explain to me why my solution isn't working, and perhaps suggest an alternative?
Edit: moved to an answer. Possibly due to more than one IIS worker process. I don't think object locking spans worker processes, so simultaneous threads in another process could still be started, but I could be wrong.
http://hectorcorrea.com/Blog/Log4net-Thread-Safe-but-not-Process-Safe
My suggestion would be an application-wide variable that stores the tick count of the last request; from that you can work out when it's safe to fire the next one.
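A sketch of that idea (the names are mine; note that, like lock, this only throttles within a single worker process):

private static readonly object _gate = new object();
private static DateTime _lastCall = DateTime.MinValue;

// Blocks until at least 50 ms have passed since the previous call.
private static void WaitForSlot()
{
    lock (_gate)
    {
        var since = DateTime.UtcNow - _lastCall;
        var minGap = TimeSpan.FromMilliseconds(50);
        if (since < minGap)
            System.Threading.Thread.Sleep(minGap - since);
        _lastCall = DateTime.UtcNow;
    }
}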
As long as your application is running with only one ASP.NET worker process you should be OK with what you have, but there are a few things to potentially consider.
Are you using a Web Garden? If so, this creates multiple worker processes, and a lock is therefore only obtained per process.
Are you in a load-balanced environment? If so, you will need to use a different method.
OK, it turns out that a compounding issue was that we have a Windows service running on the same server that was also calling into some of the same objects every 4 minutes (running in a different process, of course). When I turned it off (and bumped the sleep up to 100 ms, as per Mitchel's suggestion) the problem seems to have gone away almost entirely.
I say almost, because every so often I still get the odd mysterious SOAPException, but I think by and large the problem is sorted. I'm still a bit mystified as to how we can get any of these exceptions, but we will live with it for now.
I think Oracle should publicise this feature of Oracle CRM on Demand a little more widely.

ASP.NET Threading - can I do Async methods or do I use threading?

My environment - C# 3.5 and ASP.NET 4.0 and VS 2010
Apologies - am a bit new to some of the concepts related to threading and Async methods.
My scenario is this:
My site will periodically make a couple of GET/POSTS to an external site and collect some data
This data will be cached in a central cache
The periodic action will happen about once every 5 minutes, and it will happen for every new member who registers on my site. The querying for a member will stop based on certain conditions.
The user does NOT need to be logged in for these periodic queries: they register on the site, and then off my async code goes. It keeps working 24/7 and messages the user once in a while via email, depending on certain trigger conditions. So essentially it should all happen in the background, regardless of whether the user is explicitly logged in or not.
Load Expected - I anticipate about 100 total running members a day (accounting for new members + old ones leaving/stopping).
The math is roughly: 100 visitors/day × 4 POSTs per fetch × 12 fetches/hour × 8 hours/day ≈ 38,400 POSTs per day.
In my mind - I'm running 100 threads a day, and each 'thread' wakes up once in 5 minutes and does some stuff. The threads will interact with a static central cache which is shared among all of them.
I've read some discussions on ThreadPools, AsyncPages, etc., and it's all a bit new territory for me. In my scenario, what would you suggest? What's the best approach to doing this efficiently?
In your response I would appreciate it if you could mention specific classes/methods/links to use so I can chase this up. Thanks a bunch!
You will not be able to do it with ASP.NET as such; you will not be able to keep the "threads" running with any level of reliability. IIS could decide to restart the application pool (i.e. the whole process) at any point in time. What you really need is some kind of Windows service that runs and makes the requests. You could then use the HttpWebRequest.BeginGetResponse method to make your calls. This will fire the relevant delegate when the response comes back, and .NET will manage the threading.
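A sketch of that pattern (the class name is a placeholder, and error handling is left out):

using System;
using System.IO;
using System.Net;

public static class Fetcher
{
    public static void StartFetch(string url)
    {
        var request = (HttpWebRequest)WebRequest.Create(url);
        // .NET invokes the callback on a thread-pool thread when the response arrives.
        request.BeginGetResponse(OnResponse, request);
    }

    private static void OnResponse(IAsyncResult ar)
    {
        var request = (HttpWebRequest)ar.AsyncState;
        using (var response = (HttpWebResponse)request.EndGetResponse(ar))
        using (var reader = new StreamReader(response.GetResponseStream()))
        {
            string body = reader.ReadToEnd();
            // process/cache the body here...
        }
    }
}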
Agreeing with Ben, I would not use threading in IIS with ASP.NET. It's not the same as using it in a desktop application.
If you're going to use some kind of polling or timed action, I recommend having a handler (.ashx) or an ASP.NET page (.aspx) that takes the request you want to run in the background and returns XML or JSON as a response. You can then add some JavaScript to your pages to make an AJAX request to that URI and get whatever data you need. That handler can do the server-side operations you need. This will let you run background processes and update the front end for your users if need be, and it will take advantage of the existing IIS thread pool, which you can scale to fit the traffic you're getting (see the handler sketch after the file list below).
So, for instance:
ajaxRequest.ashx: processes the "background" request; takes HTTP POST/GET parameters.
myPage.aspx: your UI.
someScript.js: JavaScript file with functions to call ajaxRequest.ashx from myPage.aspx (or any other page) when certain actions or intervals occur.
jQuery.js: no need to write all the AJAX code or event handlers yourself :)
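A minimal sketch of what ajaxRequest.ashx could look like (the action parameter and the JSON payload are illustrative):

using System.Web;

public class AjaxRequest : IHttpHandler
{
    public void ProcessRequest(HttpContext context)
    {
        string action = context.Request["action"];   // GET/POST parameter
        // ... do the server-side work for this action ...
        context.Response.ContentType = "application/json";
        context.Response.Write("{\"status\":\"ok\"}");
    }

    public bool IsReusable
    {
        get { return true; }   // the handler keeps no per-request state
    }
}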
You will need to create a separate Windows service (or a console app run by the Windows scheduler) to poll the remote server.
If you need to trigger requests based on user interaction with your site, the best way is to use some kind of queuing system (e.g. MSMQ) that your service monitors.

Proper way to handle thousands of calls to external service from asp.net (mvc)

I'm tasked with creating a web application. I'm currently using C# and ASP.NET (MVC, but I doubt that's relevant to the question). I am a rookie developer and somewhat new to .NET.
Part of the logic in the application I'm building is to make requests to an external SMS gateway by hitting a particular URL, either as part of a user-initiated action in the web app (which could be a couple of message sends) or as part of a scheduled task run daily (which could, and will, be several thousand message sends).
In relation to the daily task, I am afraid that looping, say, 10,000 times in one thread (especially if I also have to act on the response of each request, like writing to a DB) is not the best strategy, and that I could gain some performance/time savings from parallelization.
Ultimately, I'm more afraid that thousands of users at the same time (very likely) will perform the action that triggers a request. With a naive implementation that spawns some kind of background thread (whatever it's called) for each request, I fear a scenario with hundreds/thousands of requests at once.
So, if my assumptions are correct, how do I deal with this? Do I have to manually spawn some appropriate number of new Thread()s and coordinate their work via a producer/consumer-like queue, or is there some easier way?
Cheers
If you have to make 10,000 requests to a service then it means that the service's API is anemic - probably CRUD-based, designed as a thin wrapper over a database instead of an actual service.
A single "request" to a well-designed service should convey all of the information required to perform a single "unit of work" - in other words, those 10,000 requests could very likely be consolidated into one request, or at least a small handful of requests. This is especially important if requests are going to a remote server or may take a long time to complete (and 2-3 seconds is an extremely long time in computing).
If you do not have control over the service, if you do not have the ability to change the specification or the API - then I think you're going to find this very difficult. A single machine simply can't handle 10,000 outgoing connections at once; it will struggle with even a few hundred. You can try to parallelize this, but even if you achieve a tenfold increase in throughput, it's still going to take half an hour to complete, which is the kind of task you probably don't want running on a public-facing web site (but then, maybe you do, I don't know the specifics).
Perhaps you could be more specific about the environment, the architecture, and what it is you're trying to do?
In response to your update (possibly having thousands of users all performing an action at the same time that requires you to send one or two SMS messages for each):
This sounds like exactly the kind of scenario where you should be using Message Queuing. It's actually not too difficult to set up a solution using WCF. Some of the main reasons why one uses a message queue are:
There are a large number of messages to send;
The sending application cannot afford to send them synchronously or wait for any kind of response;
The messages must eventually be delivered.
And your requirements fit this like a glove. Since you're already on the Microsoft stack, I'd definitely recommend an asynchronous WCF service backed by MSMQ.
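As a small sketch of what that looks like at the contract level: MSMQ-backed WCF operations must be one-way (fire and forget), since there is no client waiting for a reply. The contract name and operation are illustrative, and the endpoint would use netMsmqBinding in configuration:

using System.ServiceModel;

[ServiceContract]
public interface ISmsSender
{
    // Queued (MSMQ-bound) operations must be marked one-way.
    [OperationContract(IsOneWay = true)]
    void Send(string phoneNumber, string message);
}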
If you are working with SOAP, or some other type of XML request, you may not have an issue dealing with that level of requests in a loop.
I set up something similar using a SOAP server handling 4-5K requests with no problem...
A SOAP request to a web service (assuming .NET 2.0 or later) looks something like this:
// generated proxy client for the remote service
WebServiceProxyClient myclient = new WebServiceProxyClient();
myclient.SomeOperation(parameter1, parameter2);
myclient.Close();
I'm assuming that this code will be embedded into your business logic and triggered as part of the user-initiated action or as part of the scheduled task.
You don't need to do anything special in your code to cope with a high volume of users; this is really a matter of scaling your platform.
When you say 10,000 requests, what do you mean? 10,000 requests per second/minute/hour? Is this your page hits per day, etc.?
I'd also look into using an AsyncController, so that your site doesn't quickly become completely unusable.
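A sketch of the AsyncController pattern from that era of ASP.NET MVC (the controller, action, and parameter names are illustrative, and the gateway call itself is left out):

using System.Threading;
using System.Web.Mvc;

public class SmsController : AsyncController
{
    public void SendAsync(string number, string message)
    {
        AsyncManager.OutstandingOperations.Increment();
        ThreadPool.QueueUserWorkItem(_ =>
        {
            // call the SMS gateway here...
            AsyncManager.Parameters["ok"] = true;
            AsyncManager.OutstandingOperations.Decrement();
        });
    }

    // Runs once OutstandingOperations reaches zero; "ok" is matched by name.
    public ActionResult SendCompleted(bool ok)
    {
        return Json(new { ok }, JsonRequestBehavior.AllowGet);
    }
}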
