Improving throughput to Dynamics CRM via asynchronous C# OData (Simple.OData.Client) - c#

I've recently been assigned to a project where all the batch jobs are handled through horribly optimized SSIS packages, and I'm currently trying to create a proof-of-concept console application that will handle the entire process instead.
The POC is already 400% faster than the SSIS packages, so the performance gain alone makes the project worth it. However, I'm still not impressed by the throughput of the application, and I'm seeking advice on how to improve performance further.
I'm doing all CRUD-related operations, but I'll use deletion as the example here. The entities are POCO classes with a [Table] annotation and around 55 [Column] annotations, two of which are Lookup columns. So it's not an extremely convoluted class, although there is a fair bit of data, of course.
With this setup I reach around 27 deletions per second, which I'm not really impressed by.
Setup:
Threads: 4 (mainly letting Parallel.ForEachAsync handle it on its own, but roughly the same performance with MaxDegreeOfParallelism set to 4, following the linked guidelines)
Maximum batch size = 500 (no discernible difference from 100 to 500)
I ended up with these numbers based on the official documentation and a lot of testing with the POC.
https://learn.microsoft.com/en-us/dynamics365/business-central/dev-itpro/administration/operational-limits-online
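For reference, pinning the degree of parallelism explicitly looks like this (a minimal sketch assuming .NET 6+ Parallel.ForEachAsync; 4 is just the value my testing settled on):

var options = new ParallelOptions { MaxDegreeOfParallelism = 4 };

await Parallel.ForEachAsync(entityList.Chunk(batchMaximumSize), options, async (chunk, token) =>
{
    // same batching logic as in the deletion example below
});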
I'll use a Deletion method as an example, but they are all built around the same-ish logic.
var client = GetClient();
var entityList = GetEntitiesForDeletion();
var batchMaximumSize = 500;

await Parallel.ForEachAsync(entityList.Chunk(batchMaximumSize), async (chunk, _) =>
{
    var batch = new ODataBatch(client);
    foreach (var entity in chunk)
    {
        batch += oDataClient => oDataClient.For<Entity>()
            .Key(entity.Id)
            .DeleteEntryAsync(_);
    }
    await batch.ExecuteAsync(_);
});
public static IODataClient GetClient()
{
    const string baseAddress = "http://crm-address/";
    const string apiUrl = "api/data/v8.2";

    var crmDomain = Environment.GetEnvironmentVariable("DOMAIN");
    var crmUsername = Environment.GetEnvironmentVariable("CRM_USERNAME");
    var crmPassword = Environment.GetEnvironmentVariable("CRM_PASSWORD");

    var httpHandler = new HttpClientHandler
    {
        Credentials = new NetworkCredential(crmUsername, crmPassword, crmDomain)
    };

    var httpClient = new HttpClient(httpHandler)
    {
        BaseAddress = new Uri(baseAddress),
    };

    var odataSettings = new ODataClientSettings(httpClient, new Uri(apiUrl, UriKind.Relative));
    odataSettings.IgnoreResourceNotFoundException = true;

    return new ODataClient(odataSettings);
}
Could anyone tell me if I'm missing anything glaringly obvious?
I know there is a connection limit of 100 to the server; however, I don't know how to make use of all of those connections.
I also have the option of creating several more service users for the project, if that would help.
Or is the Dynamics CRM OData Web API just not that fast?
Thanks in advance.

If you are actually connecting to Dynamics 365 Business Central, please ignore this.
"Dynamics CRM" generally means Dataverse / XRM in the Power Platform; the app names vary between Dynamics CRM, Dynamics 365 Customer Engagement, Dynamics 365 Sales/Service, etc.
If you are using Dataverse, the guidance differs between the online-hosted versions and on-premises.
A good place to start (for online) is the service protection limits: https://learn.microsoft.com/en-us/power-apps/developer/data-platform/api-limits?tabs=sdk#how-to-maximize-throughput
This will help you understand what is going on on the server side.
The "maximize throughput" section links to this document: https://learn.microsoft.com/en-us/power-apps/developer/data-platform/send-parallel-requests?tabs=sdk
which explains how to optimize operation throughput via the .NET client or via the Web API.
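As a rough sketch of what that guidance amounts to when you build the HttpClient yourself (the RetryAfterHandler class and its retry logic below are my own illustration, not a documented API): back off when the service answers 429 with a Retry-After header, and make sure the client-side connection limit isn't the bottleneck.

using System;
using System.Net;
using System.Net.Http;
using System.Threading;
using System.Threading.Tasks;

// Sketch of a DelegatingHandler that honors 429 + Retry-After (service protection limits).
public class RetryAfterHandler : DelegatingHandler
{
    protected override async Task<HttpResponseMessage> SendAsync(
        HttpRequestMessage request, CancellationToken cancellationToken)
    {
        var response = await base.SendAsync(request, cancellationToken);

        // 429 = Too Many Requests; the server tells us how long to wait before retrying.
        // Note: requests whose content cannot be re-sent may need to be rebuilt before a retry.
        for (var retry = 0; retry < 3 && (int)response.StatusCode == 429; retry++)
        {
            var delay = response.Headers.RetryAfter?.Delta ?? TimeSpan.FromSeconds(5);
            await Task.Delay(delay, cancellationToken);
            response = await base.SendAsync(request, cancellationToken);
        }

        return response;
    }
}

public static class ThrottledClientFactory
{
    public static HttpClient Create(NetworkCredential credentials)
    {
        var httpHandler = new HttpClientHandler
        {
            Credentials = credentials,
            // On .NET Framework the outgoing connection cap is ServicePointManager.DefaultConnectionLimit;
            // on .NET Core / 5+, HttpClientHandler.MaxConnectionsPerServer is the equivalent knob.
            MaxConnectionsPerServer = 100
        };

        return new HttpClient(new RetryAfterHandler { InnerHandler = httpHandler })
        {
            BaseAddress = new Uri("http://crm-address/")
        };
    }
}

The exact parallelism and retry strategy still need tuning against the service protection limits described in the links above; the point is to let the server's Retry-After drive the pacing rather than hammering it.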

Related

Query Azure Application Insights CustomEvents in Azure function in C#.Net

I need to query CustomEvents under Application Insights in an Azure Function.
I was able to read CustomEvents using the package below:
Microsoft.Azure.ApplicationInsights.Query
Here is the code:
string applicationId = "xxxx-xxxx-xxxx";
string key = "xxxxxxxxxxx";

// Create client
var credentials = new ApiKeyClientCredentials(key);
var applicationInsightsClient = new ApplicationInsightsDataClient(credentials);

// Query Application Insights
var query = "customEvents" +
            " | where timestamp > ago(840h)" +
            " | take 3";
var response = await applicationInsightsClient.Query.ExecuteWithHttpMessagesAsync(applicationId, query);
The library Microsoft.Azure.ApplicationInsights.Query is, however, deprecated, and the suggestion is to use Azure.Monitor.Query.
Below is the code that the Microsoft documentation gives as an example of querying logs with Azure.Monitor.Query:
Azure.Response<Azure.Monitor.Query.Models.LogsQueryResult> response = await logsQueryClient.QueryWorkspaceAsync(
    "<workspaceId>",
    "customEvents",
    new QueryTimeRange(TimeSpan.FromMinutes(300)));
Since this library queries using a workspace ID, I linked my Application Insights instance to a Log Analytics workspace. However, the function fails with a BadArgumentError: "Failed to resolve table or column expression named 'customEvents'".
Is there a way to query CustomEvents using the package Azure.Monitor.Query?
Any help is appreciated.
Thanks
Yes, it works.
Below is tested code.
Once you link your Application Insights resource to a Log Analytics workspace, you can query your AI tables from that workspace, without the need to use app().
The catch is that the table names are different: for example, traces becomes AppTraces.
In the same manner, customEvents becomes AppEvents.
It turns out this is even documented, under "Migrate to workspace-based Application Insights resources":
Legacy table name → new table name (description):
availabilityResults → AppAvailabilityResults (summary data from availability tests)
browserTimings → AppBrowserTimings (data about client performance, such as the time taken to process the incoming data)
dependencies → AppDependencies (calls from the application to other components, including external components, recorded via TrackDependency(), for example calls to a REST API, database, or file system)
customEvents → AppEvents (custom events created by your application)
customMetrics → AppMetrics (custom metrics created by your application)
pageViews → AppPageViews (data about each website view, with browser information)
performanceCounters → AppPerformanceCounters (performance measurements from the compute resources supporting the application, for example Windows performance counters)
requests → AppRequests (requests received by your application; for example, a separate request record is logged for each HTTP request that your web app receives)
exceptions → AppExceptions (exceptions thrown by the application runtime; captures both server-side and client-side (browser) exceptions)
traces → AppTraces (detailed logs (traces) emitted through application code/logging frameworks, recorded via TrackTrace())
using Azure;
using Azure.Identity;
using Azure.Monitor.Query;
using Azure.Monitor.Query.Models;

string workspaceId = "...";
var client = new LogsQueryClient(new DefaultAzureCredential());

try
{
    Response<LogsQueryResult> response = await client.QueryWorkspaceAsync(
        workspaceId,
        "AppEvents | count",
        QueryTimeRange.All);

    LogsTable table = response.Value.Table;
    foreach (var row in table.Rows)
    {
        Console.WriteLine(row);
    }
}
catch (Exception e)
{
    Console.WriteLine(e.Message);
}
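For the equivalent of the original query, the same call works with the renamed table and a bounded time range (this only uses the documented QueryWorkspaceAsync overload shown above):

// Same as the legacy "customEvents | where timestamp > ago(840h) | take 3"
Response<LogsQueryResult> recent = await client.QueryWorkspaceAsync(
    workspaceId,
    "AppEvents | take 3",
    new QueryTimeRange(TimeSpan.FromHours(840)));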

Why does my query to Purchases.products in the Google Play Developer API take more than 40s to respond?

We use Purchases.products in the Google Play Developer API from our backend to validate purchases made by our client Android app.
This used to work fine, but now the processing of a valid purchase takes 20-40 seconds to complete. It still works, though.
My question is: what has changed to cause this behaviour, and is there a different approach to validating purchases that should be used now?
It is a very popular app that generates an average of 16k requests per day, so I've been wondering whether we are being throttled. If so, how do I find out?
I have already updated to the latest version of the Google API library for .NET, and nothing changed.
The response takes ~1 second if I pass an invalid receipt in the request.
The code is similar to this:
public void ValidatePurchase(string productId, string receipt, string bundleId, string clientEmail, string privateKey)
{
    var credential = new ServiceAccountCredential(new ServiceAccountCredential.Initializer(clientEmail)
    {
        Scopes = new[] { AndroidPublisherService.Scope.Androidpublisher }
    }.FromPrivateKey(privateKey));

    // Create the service.
    var service = new AndroidPublisherService(initializer: new BaseClientService.Initializer
    {
        HttpClientInitializer = credential
    });

    service.Purchases.Products.Get(bundleId, productId, receipt).Execute();
}
It works, but takes 20-40 seconds to complete, which is unacceptable.
Edit: I changed my actual code to re-use the AndroidPublisherService instance, as suggested in Chris' comment. As I somewhat expected, it made no difference.
Also worth noting: when validating purchases from other apps, using the same service, even ones using the same account, the responses are as fast as they should be.

Couchbase Lite Xamarin pull replication with Sync Gateway

I want to pull only the documents whose username attribute matches the current user, e.g. documents with username user1 for user1, and likewise for each user.
This is my replication code.
private void setupreplication()
{
    Console.WriteLine("Setting up replication");

    Uri Server = new Uri("http://192.168.1.213:4984/aussie-coins-syncgw/");
    var pull = _db.CreatePullReplication(Server);
    var push = _db.CreatePushReplication(Server);

    pull.Filter = "byUser";
    pull.FilterParams = new Dictionary<string, object> { { "type", "user1" } };

    pull.Continuous = true;
    push.Continuous = true;

    pull.Start();
    push.Start();
}
This is my SetFilter code:
_couchBaseLiteLocal.SetFilter("byUser", (revision, filterParams) =>
{
    var typeParam = filterParams["type"].ToString();
    return (typeParam != null) && typeParam.Equals("user1");
});
With the above code, even a generic pull isn't working.
I just tried to do what is given in the documentation.
I don't understand how the SetFilter function works to filter data from the server. It would be great if someone could help me understand how SetFilter works and how to make the above code work.
Thanks in advance.
The filter function in pull replications can indeed return only the specific documents you are interested in, but it's not very efficient: the filter function runs on every document in the remote database to determine which ones to pull, every time a pull replication is started.
Instead, Sync Gateway introduces the concept of a sync function that incrementally routes documents and computes access-control rules on them. That way, when the pull replication starts, it's fast and straightforward for Sync Gateway to return exactly the documents the user has access to.
You can also specify individual channels in a pull replication from Sync Gateway if needed (see the sketch below). The thing to remember is that filtered pull replication between Sync Gateway and Couchbase Lite is not based on filter functions; it's based on the sync function and, if needed, channel-based filtering.
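A minimal sketch of channel-based filtering on the Couchbase Lite side, assuming your sync function routes each document to a channel named after its username attribute (the channel name here is illustrative):

// Pull only the documents that the sync function routed to the "user1" channel.
// No Filter / FilterParams are needed for a Sync Gateway replication.
var pull = _db.CreatePullReplication(Server);
pull.Channels = new[] { "user1" };
pull.Continuous = true;
pull.Start();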
In a P2P scenario (replications between two Couchbase Lite instances), the filter function model is used.

Finding Connection by UserId in SignalR

I have a webpage that uses ajax polling to get stock market updates from the server. I'd like to use SignalR instead, but I'm having trouble understanding how/if it would work.
ok, it's not really stock market updates, but the analogy works.
The SignalR examples I've seen send messages to either the current connection, all connections, or groups. In my example the stock updates happen outside of the current connection, so there's no such thing as the 'current connection'. And a user's account is associated with a few stocks, so sending a stock notification to all connections or to groups doesn't work either. I need to be able to find a connection associated with a certain userId.
Here's a fake code example:
foreach (var stock in StockService.GetStocksWithBigNews())
{
    var userIds = UserService.GetUserIdsThatCareAboutStock(stock);
    var connections = /* find connections associated with user ids */;
    foreach (var connection in connections)
    {
        connection.Send(...);
    }
}
In this question on filtering connections, they mention that I could keep current connections in memory but (1) it's bad for scaling and (2) it's bad for multi node websites. Both of these points are critically important to our current application. That makes me think I'd have to send a message out to all nodes to find users connected to each node >> my brain explodes in confusion.
THE QUESTION
How do I find a connection for a specific user that is scalable? Am I thinking about this the wrong way?
I created a little project last night to learn this as well. I used the 1.0 alpha and it was straightforward. I created a Hub and from there on it just worked :)
In my project I have N compute units (servers doing the processing); when they start up they invoke ComputeUnitRegister:
await HubProxy.Invoke("ComputeUnitRegister", _ComputeGuid);
and every time they do something they call
HubProxy.Invoke("Running", _ComputeGuid);
where HubProxy is:
HubConnection Hub = new HubConnection(RoleEnvironment.IsAvailable ?
    RoleEnvironment.GetConfigurationSettingValue("SignalREndPoint") :
    "http://taskqueue.cloudapp.net/");
IHubProxy HubProxy = Hub.CreateHubProxy("ComputeUnits");
I used RoleEnvironment.IsAvailable because I can now run this as an Azure role, a console app, or whatever else in .NET 4.5. The Hub is placed in an MVC4 website project and is started like this:
GlobalHost.Configuration.ConnectionTimeout = TimeSpan.FromSeconds(50);
RouteTable.Routes.MapHubs();
public class ComputeUnits : Hub
{
    public Task Running(Guid MyGuid)
    {
        return Clients.Group(MyGuid.ToString()).ComputeUnitHeartBeat(MyGuid,
            DateTime.UtcNow.ToEpochMilliseconds());
    }

    public Task ComputeUnitRegister(Guid MyGuid)
    {
        Groups.Add(Context.ConnectionId, "ComputeUnits").Wait();
        return Clients.Others.ComputeUnitCameOnline(new { Guid = MyGuid,
            HeartBeat = DateTime.UtcNow.ToEpochMilliseconds() });
    }

    public void SubscribeToHeartBeats(Guid MyGuid)
    {
        Groups.Add(Context.ConnectionId, MyGuid.ToString());
    }
}
My clients are JavaScript clients with methods for these calls (let me know if you need to see that code as well). Basically, they listen for ComputeUnitCameOnline, and when it fires they call SubscribeToHeartBeats on the server. This means that whenever a server-side compute unit is doing some work it calls Running, which triggers a ComputeUnitHeartBeat on the JavaScript clients.
I hope this shows how groups and connections can be used. Lastly, it also scales out over multiple Azure roles by adding a few lines of code:
GlobalHost.HubPipeline.EnableAutoRejoiningGroups();
GlobalHost.DependencyResolver.UseServiceBus(
    serviceBusConnectionString,
    2,
    3,
    GetRoleInstanceNumber(),
    topicPathPrefix /* the prefix applied to the name of each topic used */
);
You can get the connection string from the Service Bus in the Azure portal; remember Provider=SharedSecret. When you add the NuGet package, the connection string syntax is also pasted into your web.config.
2 is how many topics to split it across. Topics can contain 1 GB of data, so depending on performance you can increase it.
3 is the number of nodes to split it out over. I used 3 because I have 2 Azure instances plus my localhost. You can get the role number like this (note that I hard-coded my localhost to 2):
private static int GetRoleInstanceNumber()
{
    if (!RoleEnvironment.IsAvailable)
        return 2;

    var roleInstanceId = RoleEnvironment.CurrentRoleInstance.Id;
    var li1 = roleInstanceId.LastIndexOf(".");
    var li2 = roleInstanceId.LastIndexOf("_");
    var roleInstanceNo = roleInstanceId.Substring(Math.Max(li1, li2) + 1);
    return Int32.Parse(roleInstanceNo);
}
You can see it all live at: http://taskqueue.cloudapp.net/#/compute-units
When using SignalR, after a client has connected to the server it is given a connection ID (this is essential for providing real-time communication). Yes, this is stored in memory, but SignalR can also be used in multi-node environments. You can use the Redis or even the SQL Server backplane (more to come), for example. So, long story short, we take care of your scale-out scenarios for you via backplanes/service bus without you having to worry about it.
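To tie that back to the original question, a common pattern (a sketch only; the hub, the group-per-user convention, and the method names here are my own, reusing the fake services from the question) is to add each connection to a group named after its user ID, so "find the connections for user X" becomes "send to group X", which a backplane fans out across nodes:

using System.Threading.Tasks;
using Microsoft.AspNet.SignalR;

public class StockHub : Hub
{
    public override Task OnConnected()
    {
        // One group per user: resolve the user ID however your authentication does it.
        var userId = Context.User.Identity.Name;
        Groups.Add(Context.ConnectionId, userId);
        return base.OnConnected();
    }
}

public static class StockNotifier
{
    public static void PushBigNews()
    {
        // Works from outside any hub, e.g. from the stock update job.
        var context = GlobalHost.ConnectionManager.GetHubContext<StockHub>();
        foreach (var stock in StockService.GetStocksWithBigNews())
        {
            foreach (var userId in UserService.GetUserIdsThatCareAboutStock(stock))
            {
                // The backplane delivers this to whichever node holds the user's connections.
                context.Clients.Group(userId).stockUpdated(stock);
            }
        }
    }
}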

Updating entities in Dynamics CRM 4 through the CRM web service takes forever

I'm having some trouble with Dynamics CRM 4. I'm trying to update prices and availability through the CRM service in a WPF app, but it takes forever: up to half an hour for about 6,000 products. Should it take this long? Can I do this in some other, quicker way?
QueryExpression query = new QueryExpression();
query.EntityName = EntityName.product.ToString();

BusinessEntityCollection entities = crmService.RetrieveMultiple(query);
foreach (product crmProduct in entities.BusinessEntities.OfType<product>())
{
    crmProduct.price = new CrmMoney() { Value = 123M };
    crmProduct.stock = new CrmNumber() { Value = 123 };
    crmService.Update(crmProduct);
}
To improve performance, try to update only the fields that you really want to update. Your code is updating every attribute, because you are reusing the product instance that comes back from CRM. When you do that, every plugin is fired, and because product is a core CRM entity, additional CRM logic can fire when it's updated.
Try retrieving only the primary key of product (productid), set just the two fields on a fresh instance, and call Update with that (see the sketch below). With this you should reach roughly 100 requests per second on standard hardware with a sequential process.
To achieve more updates per second, try running the process on the CRM server itself or use parallel processing.
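A minimal sketch of that approach, reusing the generated CRM 4 proxy classes and attribute names from the code above (adjust to your own generated classes; the ColumnSet restriction is the standard CRM 4 QueryExpression pattern):

// Retrieve only the primary key so the Update doesn't echo every attribute back.
QueryExpression query = new QueryExpression();
query.EntityName = EntityName.product.ToString();
var columns = new ColumnSet();
columns.Attributes = new[] { "productid" };
query.ColumnSet = columns;

BusinessEntityCollection entities = crmService.RetrieveMultiple(query);
foreach (product crmProduct in entities.BusinessEntities.OfType<product>())
{
    // Build a sparse update containing only the key and the two changed fields.
    var update = new product
    {
        productid = crmProduct.productid,
        price = new CrmMoney { Value = 123M },
        stock = new CrmNumber { Value = 123 }
    };
    crmService.Update(update);
}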
Try setting this on your CRM Service object:
crmService.UnsafeAuthenticatedConnectionSharing = true;
This makes the service authenticate only once and then reuse the same credentials. That would be a bad thing if the code ran in a website where multiple people used the same CRM service, as later users could get access to records they shouldn't; in a WPF app with just one user, however, it isn't a concern.
Here's an article with more metrics and some more things to consider tweaking. It was originally written for CRM 3, but we've found the same settings still boost performance in 4.
