My application needs to call a web service method to upload some XML.
The provider of this web service has a restriction that my application cannot make more than 900 calls within a minute, or it will be banned.
What kind of logic should I implement in my C# program, which uses this web method, to ensure that no more than 900 calls are made in a minute?
Please note that I am not using WCF but a simple HttpWebRequest in my application.
Thanks in advance.
Keep a fixed-size queue of 900 elements. Push a timestamp into it for every call.
Before pushing (i.e. before every call), check the timestamp at the front of the queue. If the queue isn't full yet, or that oldest timestamp is more than 1 minute old, make the call. Otherwise fail, or wait until that oldest timestamp is a full minute old before calling.
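A minimal sketch of that idea, assuming all calls are made from one process; the class and member names (RateLimiter, WaitForSlot) are just illustrative:

// Requires: using System; using System.Collections.Generic; using System.Threading;
public class RateLimiter
{
    private readonly Queue<DateTime> _callTimes = new Queue<DateTime>();
    private readonly int _maxCalls;
    private readonly TimeSpan _window;
    private readonly object _lock = new object();

    public RateLimiter(int maxCalls, TimeSpan window)
    {
        _maxCalls = maxCalls;
        _window = window;
    }

    // Blocks until a call is allowed, then records its timestamp.
    public void WaitForSlot()
    {
        while (true)
        {
            TimeSpan delay;
            lock (_lock)
            {
                // Drop timestamps that have fallen out of the sliding window.
                while (_callTimes.Count > 0 && DateTime.UtcNow - _callTimes.Peek() >= _window)
                    _callTimes.Dequeue();

                if (_callTimes.Count < _maxCalls)
                {
                    _callTimes.Enqueue(DateTime.UtcNow);
                    return;
                }

                // The oldest call has to age out of the window before we may proceed.
                delay = _window - (DateTime.UtcNow - _callTimes.Peek());
            }
            if (delay > TimeSpan.Zero)
                Thread.Sleep(delay);
        }
    }
}

You would create one shared instance, e.g. new RateLimiter(900, TimeSpan.FromMinutes(1)), and call limiter.WaitForSlot() right before each HttpWebRequest.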
I have over 5000 Account records in Salesforce that need to be deleted.
I am currently using the following in my C# code:
SalesForceEnterpriseService.DeleteResult[] delResult = _service.delete(ids);
and ids is a string array defined as follows: string[] ids.
When the above code runs, it gives an error: EXCEEDED_ID_LIMIT: delete id limit reached: 200.
Is there a way I can delete these 5000 records by calling my service from within C#?
Building an Apex REST (or SOAP) service just for this sounds like overkill; the overhead of writing unit tests, deploying... not worth it.
From C# - can you call that delete in a loop, 200 records at a time? It'll be a bit more network traffic but should be straightforward (see the sketch at the end of this answer).
From Apex - do you know how to run "execute anonymous"? delete [SELECT Id FROM Account WHERE Id IN ('001...', '001...')]; should be a good start; add LIMIT 1000 if the operation times out.
If you routinely need to process large amounts of data, consider using the Bulk API. But it's REST (your code looks like it uses SOAP) and it works a bit differently. You submit a job (CSV/XML/JSON), start it, and periodically poll "is it done yet?". Whereas your SOAP call just takes those 200 records and waits till the server returns (synchronous vs asynchronous).
Check https://trailhead.salesforce.com/content/learn/modules/api_basics/api_basics_bulk and https://trailhead.salesforce.com/content/learn/modules/large-data-volumes/load-your-data if you're curious.
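A rough sketch of the loop-based approach, assuming the _service and ids variables from the question; the chunk size constant and the result handling are illustrative, and the lowercase property names follow the typical Enterprise WSDL proxy classes (adjust if yours differ). Requires using System; and using System.Linq;.

const int ChunkSize = 200;  // the SOAP delete() call accepts at most 200 ids per request

for (int offset = 0; offset < ids.Length; offset += ChunkSize)
{
    // Send at most 200 ids per round trip.
    string[] chunk = ids.Skip(offset).Take(ChunkSize).ToArray();
    SalesForceEnterpriseService.DeleteResult[] delResult = _service.delete(chunk);

    // Inspect each result so a single failed record doesn't go unnoticed.
    foreach (var result in delResult)
    {
        if (!result.success)
            Console.WriteLine("Failed to delete {0}: {1}", result.id, result.errors[0].message);
    }
}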
The web service I develop must respond in 6 seconds max. If the web service is unable to respond within 6 seconds, it must respond with some hardcoded data.
The web service is a kind of proxy which transforms incoming data and makes a request to another service. That service is third-party and may fail or take too long to respond.
Currently there is a restriction in the application code which limits the timeout for the third-party service to 3 seconds (remember, the total response time is 6 seconds max).
But the problem is the application is under very high load, and sometimes it is unable to execute its own code in the remaining 3 seconds (without load our app code executes in milliseconds; there is really very little code).
I believe that Windows is unable to run all the threads, and sometimes a thread sleeps for more than 3 seconds, so the maximum response time is violated with no way to handle it.
The application is developed in asp.net core in C# and fully async.
So the question: is it possible to limit the response time at the IIS level? Is it possible to respond with predefined data if a timeout occurs? Is there a way, other than adding extra instances on Azure, to improve responsiveness? Is there a programmatic way to handle timeouts robustly?
EDIT
I need:
Guaranteed response time on IIS under high load. Is that possible at all on a non-realtime OS?
When a request is not processed in time, send a predefined hardcoded response.
To do this on Azure.
For the configuration itself, you're limited to whole minutes, which you could otherwise have used as a simple escape; since that's not an option here, I would use something similar to the Enterprise Library retry logic, that is, make a small class that takes a method and a timeout value:
// Requires: using System; using System.Threading.Tasks;
public class TimeRestrictor{
    public bool InvokeAction(Action action, TimeSpan timeout){
        // Run the action on a worker thread; true if it finished within the timeout.
        return Task.Run(action).Wait(timeout);
    }
    public Tuple<bool, T> InvokeFunction<T>(Func<T> function, TimeSpan timeout){
        var task = Task.Run(function);
        bool done = task.Wait(timeout);             // false if the timeout elapsed first
        return Tuple.Create(done, done ? task.Result : default(T));
    }
}
You can easily implement this class with a timer of some sort, or using async/await (I left those out for simplicity).
It would then be a matter of calling this method as:
var ranInTime = timeRestrictor.InvokeAction(MyMethod, TimeSpan.FromSeconds(6));
if(!ranInTime){
SetTimeOutResult();
}
This needs some tweaking; I'm just writing this code off the top of my head, but it gives you a general idea of my suggestion.
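Since the service is fully async, a rough variant of the same idea can be expressed with Task.WhenAny; here CallThirdPartyAsync and the fallback value are placeholders for your upstream call and your predefined data (requires using System; and using System.Threading.Tasks;):

private const string HardcodedFallback = "<response>fallback</response>";   // placeholder for the predefined data

public async Task<string> GetResponseWithFallbackAsync()
{
    Task<string> upstream = CallThirdPartyAsync();                           // placeholder for the real third-party call

    // Whichever finishes first wins: the upstream call or a 3-second timer.
    Task finished = await Task.WhenAny(upstream, Task.Delay(TimeSpan.FromSeconds(3)));

    return finished == upstream
        ? await upstream                                                     // responded in time
        : HardcodedFallback;                                                 // timed out: fall back to predefined data
}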
I am looking to redesign a service that is used by several client applications. These applications make repeated requests, at 30 to 60 second intervals, to one particular method on the service. This method gets data and then caches it for approximately 30 to 45 seconds. Because the method is driven by requests, it checks on every request whether the time since the last cache refresh is > 30 seconds and, if so, refreshes the cache before returning the results.
While I'd eventually like to move to a pub/sub model, for now I have to stay with polling. What I would like to do is create a repeating background process that refreshes the cache on a specified time interval, independent of requests to the service. Then, as requests to the method come in, it would always just return from cache.
I am not sure exactly how to accomplish this. I don't believe I want to tie the kickoff of the background thread to an initial request, but I'm not sure how else to start it. Do I have to create some kind of Windows service that shares an App Domain, or is there a better way?
Why don't you want to use a cache expiration mechanism? That way you can be sure the returned data is correct once the cached data becomes stale, and you don't need to make extra (possibly unneeded) requests to the DB.
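For example, with System.Runtime.Caching you can let the cache itself track the 30-second lifetime and simply reload on a miss; ServiceData and LoadFromDatabase below are placeholders for your own types and data access:

using System;
using System.Runtime.Caching;

public static class ServiceDataCache
{
    private static readonly MemoryCache Cache = MemoryCache.Default;

    // Returns the cached data if it is still fresh; otherwise reloads and caches it for 30 seconds.
    public static ServiceData GetData()
    {
        var data = Cache.Get("service-data") as ServiceData;
        if (data == null)
        {
            data = LoadFromDatabase();   // placeholder for the expensive fetch
            Cache.Set("service-data", data,
                new CacheItemPolicy { AbsoluteExpiration = DateTimeOffset.UtcNow.AddSeconds(30) });
        }
        return data;
    }

    private static ServiceData LoadFromDatabase()
    {
        // Placeholder: fetch from the database / downstream system here.
        return new ServiceData();
    }
}

public class ServiceData { }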
My environment - C# 3.5 and ASP.NET 4.0 and VS 2010
Apologies - I am a bit new to some of the concepts related to threading and async methods.
My scenario is this:
My site will periodically make a couple of GETs/POSTs to an external site and collect some data
This data will be cached in a central cache
The periodic action will happen about once every 5 minutes, and will happen for every new member who registers on my site. The querying for a member will stop based on certain conditions
The user does NOT need to be logged in for these periodic queries - they register on the site, and then off my async code goes - it keeps working 24/7 and messages the user once in a while via email depending on certain trigger conditions. So essentially it should all happen in the background regardless of whether the user is explicitly logged in or not.
Load Expected - I anticipate about 100 total running members a day (accounting for new members + old ones leaving/stopping).
The math is roughly 100 visitors/day x 4 POSTs per fetch x 12 fetches/hour x 8 hours/day, or about 38,400 requests a day.
In my mind, I'm running 100 threads a day, and each 'thread' wakes up once every 5 minutes and does some work. The threads will interact with a static central cache which is shared among all of them.
I've read some discussions on ThreadPools, AsyncPage, etc. - all a bit new territory. In my scenario, what would you suggest? What's the best approach to doing this so it's efficient?
In your response I would appreciate if you mention specific classes/methods/links to use so I can chase this. Thanks a bunch!
You will not be able to do it with ASP.NET as such; you will not be able to keep the "threads" running with any level of reliability. IIS could decide to restart the application pool (i.e. the whole process) at any point in time. Really, what you need is some kind of Windows service that runs and makes the requests. You could then use the HttpWebRequest.BeginGetResponse method to make your calls. This will fire off the relevant delegate when the response comes back, and .NET will manage the threading.
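A minimal sketch of that BeginGetResponse pattern; the class name and URL are placeholders:

using System;
using System.IO;
using System.Net;

public class RemotePoller
{
    // Fires an async GET; the callback runs on a thread-pool thread when the response arrives.
    public void Poll(string url)
    {
        var request = (HttpWebRequest)WebRequest.Create(url);
        request.BeginGetResponse(ar =>
        {
            using (var response = (HttpWebResponse)request.EndGetResponse(ar))
            using (var reader = new StreamReader(response.GetResponseStream()))
            {
                string body = reader.ReadToEnd();
                // TODO: merge "body" into the shared central cache here.
            }
        }, null);
    }
}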
Agreeing with Ben, I would not use threading in IIS with ASP.NET. It's not the same as using it in a desktop application.
If you're going to use some kind of polling or timed action, I recommend having a handler (.ashx) or asp.net page (aspx) that can take the request that you want to run in the background and return XML or JSON as a response. You can then set some javascript in your pages to do an AJAX request to that URI and get whatever data you need. That handler can do the server side operations that you need. This will let you run background processes and update the front-end for your users if need be, and will take advantage of the existing IIS thread pool, which you can scale to fit the traffic you're getting.
So, for instance
ajaxRequest.ashx : Processes "background" request, takes http POST/GET parameters.
myPage.aspx : your UI
someScript.js : javascript file with functions to call ajaxRequest.ashx from myPage.aspx (or any other page) when certain actions or intervals occur.
jQuery.js : No need to write all the AJAX code or event handlers yourself :)
You will need to create a separate Windows service (or a console app that runs using the Windows scheduler) to poll the remote server.
If you need to trigger requests based on user interaction with your site, the best way is to use some kind of queuing system (e.g. MSMQ) that your service monitors.
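As a very rough sketch of that separate polling process (the class name, URL, and interval are placeholders, and the downloaded data would be pushed into whatever store your site reads from):

using System;
using System.Net;
using System.Timers;

class PollingHost
{
    static void Main()
    {
        var timer = new Timer(TimeSpan.FromMinutes(5).TotalMilliseconds);
        timer.Elapsed += (s, e) =>
        {
            using (var client = new WebClient())
            {
                string data = client.DownloadString("https://example.com/data");   // remote server being polled
                // TODO: push "data" into the shared cache / database the web site reads from.
            }
        };
        timer.Start();

        Console.WriteLine("Polling every 5 minutes. Press Enter to stop.");
        Console.ReadLine();
    }
}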
I want to write a WCF service that can't run more than X times per hour. I want the service to suspend messages to a queue if the service was called more than X times in the last 60 minutes.
Any ideas on how one can limit the service?
I am willing to write custom components in the WCF stack.
Using a database, XML file, or some other datastore, record the date and time of every call to the service, and whether it did any work. Every time the service is called:
Check the count of records in your datastore that did work within the last 60 minutes.
If it is less than X, do the work, and record that you did work and when.
If it is X or more, move the request to the queue and record the request.
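A simplified in-memory sketch of that check; in practice the list of timestamps would live in your database/XML datastore as described above, and the class and member names are illustrative:

// Requires: using System; using System.Collections.Generic;
public class HourlyThrottle
{
    private readonly int _maxPerHour;
    private readonly List<DateTime> _workTimestamps = new List<DateTime>();   // stand-in for the real datastore

    public HourlyThrottle(int maxPerHour) { _maxPerHour = maxPerHour; }

    // Returns true if this call may do work now; otherwise the caller should queue the request.
    public bool TryRecordCall()
    {
        var cutoff = DateTime.UtcNow.AddHours(-1);
        _workTimestamps.RemoveAll(t => t < cutoff);        // drop entries older than 60 minutes
        if (_workTimestamps.Count >= _maxPerHour)
            return false;                                  // over the limit: suspend to the queue instead
        _workTimestamps.Add(DateTime.UtcNow);
        return true;
    }
}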
You'll also need something checking your queue of pending work (a Windows service?), and you'll need to decide whether the work done from the queue counts against your X times per hour or not.
This is all very high level, as we know nothing specific about your project. HTH.