EXCEEDED_ID_LIMIT updating Contact records - C#

We are using the Partner WSDL in our C# integration with Salesforce and we are receiving the following error when trying to update more than 200 records:
Error updating Contact: EXCEEDED_ID_LIMIT: record limit reached. cannot submit more than 200 records into this call
How do we go about increasing this number? Is it possible or are we stuck with 200 records?
Thanks ahead of time for your response.

You can only update 200 records in a single call, so you need to chunk your update into sets of 200 and make multiple calls.
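For example, a minimal sketch of the chunking, assuming binding is your already logged-in SforceService proxy generated from the Partner WSDL and records is the full sObject array you built for the update:

// Chunk the records into batches of 200 and call update() once per batch.
// Assumes: using System; plus the namespace of your Partner WSDL proxy.
private static void UpdateInChunks(SforceService binding, sObject[] records)
{
    const int batchSize = 200;   // hard limit per update() call

    for (int offset = 0; offset < records.Length; offset += batchSize)
    {
        int count = Math.Min(batchSize, records.Length - offset);
        sObject[] batch = new sObject[count];
        Array.Copy(records, offset, batch, 0, count);

        SaveResult[] results = binding.update(batch);

        foreach (SaveResult result in results)
        {
            if (!result.success)
            {
                // Log or collect the per-record errors rather than failing the whole run.
                Console.WriteLine("Update failed: " + result.errors[0].message);
            }
        }
    }
}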

The web service administrators have probably limited each call to 200 records as a safeguard. This means less load on their servers and quicker response to the client.
You probably cannot change this limit unless you contact the web service administrator directly.
For now you should keep the limit in mind and make multiple requests of 200 records each instead of a single request.
Note: Web services that limit the number of records returned per request will sometimes return an ID number. This usually allows the client to continue picking up records where they left off. Keep an eye out for this.

Related

Twilio API - How to send concurrent requests [duplicate]

I know this has been asked a few times but I'm trying to track down what my exact issue could be.
I've got a C# app which queues up messages to be sent (using Azure Storage Queues), and these are processed by an Azure WebJob. We're using the twilio-csharp NuGet package to send the messages.
The code to send a message is pretty simple:
MessageResource.Create(
body: message.Message,
from: new Twilio.Types.PhoneNumber(TwilioFromNumber),
to: new Twilio.Types.PhoneNumber(message.SendToPhoneNumber));
By default, the WebJob will process up to 16 messages at a time, but to combat this issue we've set:
context.BatchSize = 2;
context.NewBatchThreshold = 0;
So, at any given point, we're not making more than 2 requests at a time.
Even with this low threshold, we still see these errors in the log periodically:
Microsoft.Azure.WebJobs.Host.FunctionInvocationException: Microsoft.Azure.WebJobs.Host.FunctionInvocationException: Exception while executing function: TextMessageFunctions.SendTextMessage ---> Twilio.Exceptions.ApiException: Too Many Requests
at Twilio.Clients.TwilioRestClient.ProcessResponse(Response response)
Some other thoughts:
The answer on this question, from a Twilio Developer Evangelist, suggests the REST API's concurrency limit is 100 by default. Is this still true, or is there a way for me to check this on my account? There's no way we're close to 100. We never queue up more than 20-30 messages at a time, and that is on the extreme end of things.
We're using a Toll-Free US number to send from. According to Twilio, we should be able to queue up 43,200 messages on their end.
That same article says:
Notice: You can send messages to Twilio at a rapid rate, as long as the requests do not max out Twilio's REST API concurrency limit.
This makes me think I'm doing something wrong, because surely "a rapid rate" could be more than 2 requests at a time (and I still wonder about the rate of 100 mentioned above). Can we truly not call the Twilio API with 2 concurrent requests without getting this error?
Twilio developer evangelist here.
There has been a bit of a change in the concurrency limits recently that has affected you here. New accounts are now receiving a much lower concurrency allowance for POST requests, as low as 1 concurrent request. This was to combat a recent rise in fraudulent activity.
I am sure your activity isn't fraudulent, so here's what you should do:
For now, reduce your batch size to 1 so that you only make 1 request at a time to the Twilio API.
Add code to catch errors, and if they are 429 responses, re-queue the job to happen later, with exponential back-off if possible (see the sketch after this list)
Get in touch with Twilio Sales to talk to them about your use case and request an increased concurrency limit
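For the second point, here is a rough sketch; outputQueue, retryCount, and the re-serialization are placeholders rather than your existing code, and it assumes the ApiException exposes the HTTP status code via its Status property:

// Catch 429s from Twilio and re-queue the message with an exponential
// back-off delay (Azure Storage Queues support an initial visibility delay).
try
{
    MessageResource.Create(
        body: message.Message,
        from: new Twilio.Types.PhoneNumber(TwilioFromNumber),
        to: new Twilio.Types.PhoneNumber(message.SendToPhoneNumber));
}
catch (Twilio.Exceptions.ApiException ex) when (ex.Status == 429)
{
    // Back off based on how many times this message has been retried;
    // retryCount could come from the queue message's DequeueCount.
    var delay = TimeSpan.FromSeconds(Math.Pow(2, retryCount));

    await outputQueue.AddMessageAsync(
        new CloudQueueMessage(JsonConvert.SerializeObject(message)),
        timeToLive: null,
        initialVisibilityDelay: delay,   // message stays hidden until the delay expires
        options: null,
        operationContext: null);
}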
I am sure this limit is not going to be the long term solution to the issues we were facing and I am sorry that you are experiencing problems with this.

APIM requests and Application Insights count

We are using APIM for all our API requests and enabled Application Insights to make sure we get all information like country, request body, IP address, HTTP status code, etc.
We are using the AppInsights API to get the APIM data, as in the UI there is a limit of 10K per query.
https://api.applicationinsights.io/v1/apps/
It was working fine while we had a limited number of calls on APIM, around 7K-10K per day.
Now we are getting around 40K-80K requests per day.
Now when I write a Kusto query in the AppInsights UI, it gives me counts of 38,648, 29,493, and 26,847 for the 3 days.
requests
| where url contains 'abc'
| where timestamp >= startofday(datetime('30-Apr-20')) and timestamp <= endofday(datetime('02-May-20'))
| summarize count(), avg(duration) by bin(timestamp, 1d)
But when I run an API query request, it gives me around 54K records, whereas I should get around 94K.
When it runs for days where requests are higher (150+), it still gives around 54K records.
I checked the limits on the number of queries; they talk about 200 per 30 seconds and 86,400 per day. Nothing is mentioned about data size.
It seems there is a limitation on the data size returned by the AppInsights API:
When I download for 30-Apr to 01-May, the downloaded file size is around 74K.
When I download for 30-Apr to 02-May, the downloaded file size is still around 74K.
I have used the AppInsights API in a C# console application, using the webClient.DownloadString/DownloadFile methods to get this data.
The query is as follows:
https://api.applicationinsights.io/v1/apps/<code/query?query=requests|where url contains 'abc'|where timestamp >= startofday(datetime('30-Apr-20'))and timestamp <= endofday(datetime('02-May-20'))
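For reference, the raw call is roughly equivalent to the following (appId, apiKey, and kustoQuery are placeholders, not the real values):

// Call the query endpoint with the application id and an API key header.
using (var client = new HttpClient())
{
    client.DefaultRequestHeaders.Add("x-api-key", apiKey);

    var url = "https://api.applicationinsights.io/v1/apps/" + appId +
              "/query?query=" + Uri.EscapeDataString(kustoQuery);

    var json = await client.GetStringAsync(url);
    File.WriteAllText("appinsights-export.json", json);
}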
You have to set the sampling value to '100'.
How to integrate Azure API Management with Azure Application Insights
Sampling (%) (decimal): values from 0 to 100 (percent).
Specifies what percentage of requests will be logged to Azure Application Insights. 0% sampling means zero requests logged, while 100% sampling means all requests logged.
This setting is used for reducing performance implications of logging requests to Azure Application Insights (see the section below).

ASP.NET WebAPI Ajax with progress

I have a really long Web API request that basically does the following:
1. retrieve a list of item categories from the db
2. for each category, retrieve all the items in that category
Now, the entire process takes a very long time and I don't want the user to wait until the entire process is over; once a category has finished loading, I want it returned to the client.
Does anyone know how I can do that? Send a request and get progress notifications from the server whenever a part of the request has finished?
You could use SignalR to send the data from the server to the client when it's available.
The other option is polling from the client. The client makes the initial request, which triggers a server side process that prepares the data and keeps it somewhere (in memory, in a database). Then the client polls the server for new available data until the server process finishes.
You need to break up your request. Use a loop: once the elements from the first category are downloaded, do something with them before going for the second category.
You can use jQuery, or Page Methods if you are using ASP.NET WebForms.
PushStreamContent might help you:
http://weblogs.asp.net/andresv/asynchronous-streaming-in-asp-net-webapi
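A rough sketch of that approach, assuming the action lives in an ApiController and that _repository (with GetCategoriesAsync/GetItemsAsync) is a placeholder for your data access; PushStreamContent comes from System.Net.Http (System.Net.Http.Formatting):

// Write each category's items to the response stream as soon as they are
// loaded, so the client can start reading before the whole request completes.
public HttpResponseMessage GetItemsByCategory()
{
    var response = Request.CreateResponse(HttpStatusCode.OK);
    response.Content = new PushStreamContent(async (stream, content, context) =>
    {
        using (var writer = new StreamWriter(stream))
        {
            foreach (var category in await _repository.GetCategoriesAsync())
            {
                var items = await _repository.GetItemsAsync(category.Id);

                // One JSON document per line; the client parses lines as they arrive.
                await writer.WriteLineAsync(JsonConvert.SerializeObject(new { category, items }));
                await writer.FlushAsync();
            }
        }
    }, "application/json");
    return response;
}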

Asp.Net Web API preventing record duplication

I am using a C# .NET Web API for my iOS application, but I have concerns about multiple requests arriving at the same time.
Let's assume I try to prevent duplicating records while inserting a new record into the Users table by:
Check if xxx@example.com exists in the Users table.
Insert if not exists.
Return OK.
Actually it's that simple, unless the Web API handles requests concurrently.
What if the Web API method gets two requests at the same time (with the same e-mail)? The first request reaches step 2 (but hasn't executed it yet), and the second request gets a "not exists" result at step 1 since step 2 for the first request has not been executed yet. Then both e-mail addresses will be saved and I will have duplicate records.
Using a lock on a static object seems like it would solve the problem, but it would create performance issues.
If I don't want duplicated rows in the DB, how can I overcome that problem?
UPDATE:
I can't use a unique constraint on the e-mail column because I already have one on the Id column.
If you put a unique constraint on the email address column in your table, then all you have to do is insert the email address: if it is already there the insert will fail, and if not you will have inserted a new record.
You need to handle the failure, perhaps responding with an appropriate status code so the client knows the email already exists.
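For instance, something along these lines (a sketch assuming SQL Server and ADO.NET inside a Web API action; connectionString, email, and the Users schema are placeholders, and note that a unique constraint on Email can coexist with the one on Id):

// Rely on a unique index/constraint on Users.Email and translate the
// duplicate-key error into an HTTP 409 Conflict.
try
{
    using (var connection = new SqlConnection(connectionString))
    using (var command = new SqlCommand(
        "INSERT INTO Users (Email) VALUES (@Email)", connection))
    {
        command.Parameters.AddWithValue("@Email", email);
        connection.Open();
        command.ExecuteNonQuery();
    }
    return Ok();
}
catch (SqlException ex) when (ex.Number == 2627 || ex.Number == 2601)
{
    // 2627 = unique constraint violation, 2601 = unique index violation.
    return Conflict();   // 409 tells the client the email already exists
}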

Web Services. Get input data, process it at background thread

I've got several web services (ASMX, WCF). A couple of them have methods that take a lot of time to process, but the input data for these methods is small and doesn't take much time to transfer over the wire. I want to move to a non-synchronous model: the client passes data to the service, the service answers that the data transfer was correct, and then processes it on a background thread without a connection to the client. So after the transfer, the connection should be closed. Is it possible? Can you help me with articles, or maybe just a search query to use?
John is right - once you close an HTTP connection, it is done. You can't get back to the same process.
So if you can use another technology that allows duplex on one connection (e.g. WCF), do it!
However, if you have no choice but to use web services, here are three ways to make it work. You may get timeouts on any of them.
Option 1:
Forget the part about 'client answers data was correct.' Just have each thread make its request and wait for the data.
Option 2:
Now, assuming that won't work and you must do the validation, this way requires the client to make 2 requests.
First request: returns valid/invalid.
Second request: returns the long-running results.
Variation of option 2:
If you have timeout problems, you could have the first request generate a GUID or unique database key, start another process, pass it this key, and return the key to the client (if you can get the server to allow you to start a process - that depends on security settings/needs; if not, you may be able to start an async thread and have it keep running after the web service call ends). The process does the long task and, when finished, updates the row in the database with the unique id, storing the results plus a 'done' flag. The second request by the client can always return immediately: if the processing is not done, it returns that; if it is done, it returns the results. The client repeats this every 5 seconds or so until done.
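Roughly, that variation could look like the sketch below; RequestStore, DoLongRunningWork, and StatusResponse are illustrative placeholders, and a database-backed store is what you would really want here:

// Two methods: one accepts the data and returns a key immediately, the other
// is polled with that key. The background work runs on a thread-pool thread
// inside the web app, which can be lost on an app pool recycle - hence the
// suggestion to hand the work to a separate process or Windows service.
[WebMethod]
public string SubmitRequest(string data)
{
    string requestId = Guid.NewGuid().ToString();
    RequestStore.MarkPending(requestId, data);      // e.g. insert a row: id, request, status = 'pending'

    ThreadPool.QueueUserWorkItem(_ =>
    {
        var result = DoLongRunningWork(data);       // the slow part
        RequestStore.MarkDone(requestId, result);   // update the row: status = 'done', plus the result
    });

    return requestId;                               // the client keeps this id and polls with it
}

[WebMethod]
public StatusResponse CheckStatus(string requestId)
{
    // Returns "still processing" until the row is marked done, then the result.
    return RequestStore.GetStatus(requestId);
}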
Hacks, I know, but we don't always have a choice for the technology we use.
Don't do this with ASMX web services. They weren't designed for that. If you must do it with ASMX, then have the ASMX pass the data off to a Windows Service that will do the actual work, in the background.
This is more practical with WCF.
We have been writing stuff to interact with the UK gov website and the way they handle something similar is that you send your request and data to the server and it responds saying, roughly, "thanks very much - we're processing it now, please call back later using this id" - all in an XML message. You then, at some point later, send a new http request to the service saying, essentially, "I'm enquiring about the status of this particular request id" and the server returns a result that says either it has processed OK, or processed with errors, or is still processing, please try again in xx seconds.
Similar to option 2 described previously.
It's a polling solution rather than a callback or two-way conversation, but it seems to work.
The server will need to keep, or have access to, some form of persistent table or log of each request's state - it can contain, e.g., the id, the original request, the current stage in the workflow, any error messages so far, the result (if any), etc. And the web service should probably hand the bulk of the work off to a separate Windows service, as already mentioned.
