Query Azure Application Insights CustomEvents in an Azure Function in C#/.NET

I need to query CustomEvents under Application Insights in an Azure Function.
I was able to read CustomEvents using the package below:
Microsoft.Azure.ApplicationInsights.Query
Here is the code:
string applicationId = "xxxx-xxxx-xxxx";
string key = "xxxxxxxxxxx";

// Create client
var credentials = new ApiKeyClientCredentials(key);
var applicationInsightsClient = new ApplicationInsightsDataClient(credentials);

// Query Application Insights
var query = "customEvents" +
            " | where timestamp > ago(840h)" +
            " | take 3";
var response = await applicationInsightsClient.Query.ExecuteWithHttpMessagesAsync(applicationId, query);
The library Microsoft.Azure.ApplicationInsights.Query is, however, deprecated, and the suggestion is to use Azure.Monitor.Query.
Below is the code that the Microsoft documentation gives as an example of querying logs with Azure.Monitor.Query:
Azure.Response<Azure.Monitor.Query.Models.LogsQueryResult> response = await logsQueryClient.QueryWorkspaceAsync(
    "<workspaceId>",
    "customEvents",
    new QueryTimeRange(TimeSpan.FromMinutes(300)));
Since this library queries by workspace ID, I linked my Application Insights instance to a Log Analytics workspace. However, the function fails with a BadArgumentError: "Failed to resolve table or column expression named 'customEvents'".
Is there a way to query CustomEvents using the package Azure.Monitor.Query?
Any help is appreciated.
Thanks

Yes, it works.
Below is tested code.
Once you link your Application Insights resource to a Log Analytics workspace, you can query the AI tables from that workspace, without needing to use app().
The catch is that the table names are different: for example, traces becomes AppTraces.
In the same manner, customEvents becomes AppEvents.
It turns out this is even documented, under "Migrate to workspace-based Application Insights resources":
Legacy table name -> New table name: Description

availabilityResults -> AppAvailabilityResults: Summary data from availability tests.
browserTimings -> AppBrowserTimings: Data about client performance, such as the time taken to process the incoming data.
dependencies -> AppDependencies: Calls from the application to other components (including external components) recorded via TrackDependency(), for example calls to a REST API, database, or file system.
customEvents -> AppEvents: Custom events created by your application.
customMetrics -> AppMetrics: Custom metrics created by your application.
pageViews -> AppPageViews: Data about each website view, with browser information.
performanceCounters -> AppPerformanceCounters: Performance measurements from the compute resources supporting the application, for example Windows performance counters.
requests -> AppRequests: Requests received by your application. For example, a separate request record is logged for each HTTP request that your web app receives.
exceptions -> AppExceptions: Exceptions thrown by the application runtime; captures both server-side and client-side (browser) exceptions.
traces -> AppTraces: Detailed logs (traces) emitted through application code/logging frameworks, recorded via TrackTrace().
using Azure;
using Azure.Identity;
using Azure.Monitor.Query;
using Azure.Monitor.Query.Models;

string workspaceId = "...";
var client = new LogsQueryClient(new DefaultAzureCredential());

try
{
    Response<LogsQueryResult> response = await client.QueryWorkspaceAsync(
        workspaceId,
        "AppEvents | count",
        QueryTimeRange.All);

    LogsTable table = response.Value.Table;
    foreach (var row in table.Rows)
    {
        Console.WriteLine(row);
    }
}
catch (Exception e)
{
    Console.WriteLine(e.Message);
}

Related

Improving throughput to Dynamics CRM via asynchronous C# OData (Simple.Odata.Client)

I've recently been assigned to a project where all the batch jobs are handled through horribly optimized SSIS packages, and I'm currently trying to create a POC console application that will handle the entire process instead.
The POC is already 400% faster than the SSIS packages, so the performance gain already makes the project worth it; however, I am still not impressed by the throughput of the application, and I am seeking advice on how to improve it.
I'm doing all CRUD-related operations, but I'll use deletion as an example here. The entities are POCO classes with a [Table] annotation and around 55 [Column] annotations, two of which are Lookup columns. So it is not an extremely convoluted class, although there is a fair bit of data, of course.
With this setup I reach around 27 record deletions per second, which does not really impress me.
Setup:
Threads: 4 (mainly letting Parallel.ForEach handle it on its own, but roughly the same performance with MaxDegreeOfParallelism at 4, following the guidelines linked below)
Maximum batch size: 500 (no discernible difference from 100-500)
I ended up with these numbers based on the official documentation and a lot of testing with the POC.
https://learn.microsoft.com/en-us/dynamics365/business-central/dev-itpro/administration/operational-limits-online
I'll use a Deletion method as an example, but they are all built around the same-ish logic.
var client = GetClient();
var entityList = GetEntitiesForDeletion();
var batchMaximumSize = 500;

await Parallel.ForEachAsync(entityList.Chunk(batchMaximumSize), async (chunk, _) =>
{
    var batch = new ODataBatch(client);
    foreach (var entity in chunk)
    {
        batch += oDataClient => oDataClient.For<Entity>()
            .Key(entity.Id)
            .DeleteEntryAsync(_);
    }
    await batch.ExecuteAsync(_);
});
public static IODataClient GetClient()
{
    const string baseAddress = "http://crm-address/";
    const string apiUrl = "api/data/v8.2";

    var crmDomain = Environment.GetEnvironmentVariable("DOMAIN");
    var crmUsername = Environment.GetEnvironmentVariable("CRM_USERNAME");
    var crmPassword = Environment.GetEnvironmentVariable("CRM_PASSWORD");

    var httpHandler = new HttpClientHandler
    {
        Credentials = new NetworkCredential(crmUsername, crmPassword, crmDomain)
    };
    var httpClient = new HttpClient(httpHandler)
    {
        BaseAddress = new Uri(baseAddress),
    };

    var odataSettings = new ODataClientSettings(httpClient, new Uri(apiUrl, UriKind.Relative));
    odataSettings.IgnoreResourceNotFoundException = true;
    return new ODataClient(odataSettings);
}
Could anyone tell me if I'm missing anything glaringly obvious?
I know there is a connection limit of 100 to the server; however, I do not know how to make use of all of those connections.
I also have the option of creating several more service users for the project, if that would help.
Or is the Dynamics CRM OData Web API just not that fast?
Thanks in advance.
If you are actually connecting to Dynamics Business Central, please ignore this.
"Dynamics CRM" generally means Dataverse / XRM in the Power Platform; the app names vary between Dynamics CRM, Dynamics 365 Customer Engagement, Dynamics Sales/Service, etc.
If you are using Dataverse, the guidance varies between the online-hosted and on-premises versions.
A good place to start (for online) is the service protection limits: https://learn.microsoft.com/en-us/power-apps/developer/data-platform/api-limits?tabs=sdk#how-to-maximize-throughput
This will help you understand what is going on server side.
The "maximize throughput" section links to this document: https://learn.microsoft.com/en-us/power-apps/developer/data-platform/send-parallel-requests?tabs=sdk
which explains how to optimize operation throughput via the .NET client or via the Web API.
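As a rough illustration of what that parallel-requests guidance boils down to on the .NET side, the usual client-side knobs are the ServicePointManager settings below. This is a sketch for the classic .NET Framework HTTP stack; the values are placeholders to tune against your own environment, not recommendations:

```csharp
using System.Net;

// Allow more concurrent HTTP connections to the Dataverse endpoint than
// the .NET Framework default of 2 per host. 65 is a placeholder value.
ServicePointManager.DefaultConnectionLimit = 65;

// Skip the extra round trip of the 100-Continue handshake on POST requests.
ServicePointManager.Expect100Continue = false;

// Send small payloads immediately instead of buffering them (Nagle's algorithm).
ServicePointManager.UseNagleAlgorithm = false;
```

These settings must run before the first request is made, since they are captured per service point. They mainly help on .NET Framework; on modern .NET the connection limit lives on the HttpClient handler instead.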

Get AWS caller Identity with C# SDK

When I execute this with the AWS CLI, e.g. inside a Fargate task, I can see the UserId that my application is going to use:
aws sts get-caller-identity
with this output on the console:
{
    "Arn": "arn:aws:sts::643518765421:assumed-role/url_process_role/6ae81f92-66f3-30de-1eaa-3a7d1902bad9",
    "UserId": "ARDYOAZLVOAQXTT5ZXTV4:4ea81f97-66f3-40de-beaa-3a7d1902bad9",
    "Account": "692438514791"
}
I would like to get the same information using the C# SDK. I tried the methods exposed in this doc, and I can see some account-related details, but not the assigned UserId.
So far I've tried this, but I cannot see any profile when running in a Fargate task:
var awsChain = new Amazon.Runtime.CredentialManagement.CredentialProfileStoreChain();
System.Console.WriteLine($"Found {awsChain.ListProfiles().Count} AWS profiles.");
My final goal is to get it and add to some task processed with Fargate to save a correlation Id in the database when something fails and easily find the Fargate log stream.
IAmazonSecurityTokenService will provide the same information when running on .NET Core. Note that this example only works inside the AWS environment, as the endpoint is not publicly available when testing from a development machine.
var getSessionTokenRequest = new GetSessionTokenRequest
{
    DurationSeconds = 7200
};

var stsClient = hostContext.Configuration.GetAWSOptions()
    .CreateServiceClient<IAmazonSecurityTokenService>();
var iden = stsClient.GetCallerIdentityAsync(new GetCallerIdentityRequest()).Result;
System.Console.WriteLine($"A={iden.Account} ARN={iden.Arn} U={iden.UserId}");

In Xamarin Forms, SetEnvironmentVariable

In a Xamarin.Forms app with .NET Standard code sharing, I want to set the environment variable for Google Datastore so that I can communicate with Google Datastore through the mobile app.
The piece of code below works fine in a console app, but in Xamarin.Forms it throws an error while trying to create the Datastore db object:
Error reading credential file from location /DB.json: Could not find file "/DB.json"
Please check the value of the Environment Variable GOOGLE_APPLICATION_CREDENTIALS
I put DB.json at the root of the solution.
try
{
    Environment.SetEnvironmentVariable("GOOGLE_APPLICATION_CREDENTIALS", "DB.json");
    var dir = Environment.CurrentDirectory;

    // Your Google Cloud Platform project ID.
    string projectId = "xamarin-project";
    // We are storing countries, so this is a Country kind.
    string kind = "Country";

    // Create the Datastore db.
    var db = DatastoreDb.Create(projectId);

    // Country entity.
    Entity countryEntity = new Entity
    {
        Key = db.CreateKeyFactory(kind).CreateKey("US"),
        ["CountryCode"] = "US",
        ["Name"] = "United States"
    };

    // Send the entity to the Datastore.
    using (var transaction = db.BeginTransaction())
    {
        transaction.Upsert(countryEntity);
        transaction.Commit();
    }
}
catch (Exception ex)
{
    await DisplayAlert("Error", ex.Message, "OK");
    Console.WriteLine(ex.Message);
}
Reading the file via DependencyService is also not working; I tried it in the .Android project.
If you are looking to use a database directly from your mobile app, I'd suggest you use Cloud Firestore (https://cloud.google.com/firestore/) instead. It has explicit security features to allow accessing the database directly from clients.
I think what you are trying to do with the snippet above is load your service-account credentials in the app, which would give any user of your app full access to your database.
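If you do stay on Datastore for a prototype, one way to sidestep the GOOGLE_APPLICATION_CREDENTIALS file-path problem on mobile is to hand the service-account JSON to the client builder directly instead of going through an environment variable. A minimal sketch, assuming the Google.Cloud.Datastore.V1 DatastoreDbBuilder API and that DB.json is bundled as an embedded resource; the resource name "MyApp.DB.json" and the LoadEmbeddedJson helper are hypothetical:

```csharp
using System.IO;
using System.Reflection;
using Google.Cloud.Datastore.V1;

// Hypothetical helper: read the service-account JSON that was added to the
// shared project as an EmbeddedResource, so no file-system path is involved.
static string LoadEmbeddedJson()
{
    var assembly = typeof(App).GetTypeInfo().Assembly; // App = your Forms app class
    using (var stream = assembly.GetManifestResourceStream("MyApp.DB.json"))
    using (var reader = new StreamReader(stream))
        return reader.ReadToEnd();
}

// Build the client from the in-memory JSON instead of the env variable.
var db = new DatastoreDbBuilder
{
    ProjectId = "xamarin-project",
    JsonCredentials = LoadEmbeddedJson()
}.Build();
```

As the answer above notes, shipping service-account credentials inside a client app gives every user of the app full access to the database, so this is only suitable for prototyping.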

TF400813: Resource not available for anonymous access. Client authentication required

I am working on a Coded UI test automation project. I am developing a framework in which I am trying to access test cases in VSTS through the REST API. I previously worked on an MVC application in which I did the same thing to pull data from VSTS via the REST API.
The problem now is that I am not able to access VSTS. Every time I try, I get the exception TF400813: Resource not available for anonymous access. Client authentication required.
I am using the same PAT token. I have all the required access on my team project, and I am able to access all work items in my project from the browser. I have tried all the options mentioned in the thread below, but it still doesn't work:
Client authentication error when starting Visual Studio 2015.3
Any leads will be appreciated. Below is my code to get data from VSTS:
public static List<WorkItem> GetWorkItemsWithSpecificFields(IEnumerable<int> ids)
{
    var collectionUri = "https://<name>.visualstudio.com";
    var fields = new string[]
    {
        "System.Id",
        "System.Title",
        "System.WorkItemType",
        "Microsoft.VSTS.Scheduling.RemainingWork"
    };

    var credential = new VssBasicCredential("", System.Configuration.ConfigurationManager.AppSettings["PATToken"]);
    using (var workItemTrackingHttpClient = new WorkItemTrackingHttpClient(new Uri(collectionUri), credential))
    {
        // Exception is thrown on the line below
        List<WorkItem> results = workItemTrackingHttpClient.GetWorkItemsAsync(ids, fields).Result;
        return results;
    }
}

Finding Connection by UserId in SignalR

I have a webpage that uses ajax polling to get stock market updates from the server. I'd like to use SignalR instead, but I'm having trouble understanding how/if it would work.
ok, it's not really stock market updates, but the analogy works.
The SignalR examples I've seen send messages to either the current connection, all connections, or groups. In my example the stock updates happen outside of the current connection, so there's no such thing as the 'current connection'. And a user's account is associated with a few stocks, so sending a stock notification to all connections or to groups doesn't work either. I need to be able to find a connection associated with a certain userId.
Here's a fake code example:
foreach (var stock in StockService.GetStocksWithBigNews())
{
    var userIds = UserService.GetUserIdsThatCareAboutStock(stock);
    var connections = /* find connections associated with user ids */;
    foreach (var connection in connections)
    {
        connection.Send(...);
    }
}
In this question on filtering connections, they mention that I could keep current connections in memory but (1) it's bad for scaling and (2) it's bad for multi node websites. Both of these points are critically important to our current application. That makes me think I'd have to send a message out to all nodes to find users connected to each node >> my brain explodes in confusion.
THE QUESTION
How do I find a connection for a specific user that is scalable? Am I thinking about this the wrong way?
I created a little project last night to learn this too. I used the 1.0 alpha and it was straightforward: I created a Hub and from there on it just worked :)
In my project I have N compute units (servers processing work); when they start up they invoke ComputeUnitRegister:
await HubProxy.Invoke("ComputeUnitRegister", _ComputeGuid);
and every time they do something they call
HubProxy.Invoke("Running", _ComputeGuid);
where HubProxy is:
HubConnection Hub = new HubConnection(RoleEnvironment.IsAvailable
    ? RoleEnvironment.GetConfigurationSettingValue("SignalREndPoint")
    : "http://taskqueue.cloudapp.net/");
IHubProxy HubProxy = Hub.CreateHubProxy("ComputeUnits");
I used RoleEnvironment.IsAvailable because I can now run this as an Azure role, a console app, or whatever else, on .NET 4.5. The Hub lives in an MVC 4 website project and is started like this:
GlobalHost.Configuration.ConnectionTimeout = TimeSpan.FromSeconds(50);
RouteTable.Routes.MapHubs();
public class ComputeUnits : Hub
{
    public Task Running(Guid MyGuid)
    {
        return Clients.Group(MyGuid.ToString()).ComputeUnitHeartBeat(MyGuid,
            DateTime.UtcNow.ToEpochMilliseconds());
    }

    public Task ComputeUnitRegister(Guid MyGuid)
    {
        Groups.Add(Context.ConnectionId, "ComputeUnits").Wait();
        return Clients.Others.ComputeUnitCameOnline(new
        {
            Guid = MyGuid,
            HeartBeat = DateTime.UtcNow.ToEpochMilliseconds()
        });
    }

    public void SubscribeToHeartBeats(Guid MyGuid)
    {
        Groups.Add(Context.ConnectionId, MyGuid.ToString());
    }
}
My clients are JavaScript clients that have methods for these calls (let me know if you need to see that code as well). Basically they listen for ComputeUnitCameOnline, and when it fires they call SubscribeToHeartBeats on the server. This means that whenever a server compute unit is doing some work it calls Running, which triggers a ComputeUnitHeartBeat on the JavaScript clients.
I hope you can use this to see how groups and connections can be used. Finally, it also scales out over multiple Azure roles by adding a few lines of code:
GlobalHost.HubPipeline.EnableAutoRejoiningGroups();
GlobalHost.DependencyResolver.UseServiceBus(
    serviceBusConnectionString,
    2,
    3,
    GetRoleInstanceNumber(),
    topicPathPrefix /* the prefix applied to the name of each topic used */);
You can get the connection string from the Service Bus on Azure; remember the Provider=SharedSecret part. When you add the NuGet package, the connection-string syntax is also pasted into your web.config.
2 is how many topics to split it across. Topics can contain 1 GB of data, so depending on performance you can increase this.
3 is the number of nodes to split it out over. I used 3 because I have 2 Azure instances plus my localhost. You can get the role number like this (note that I hard-coded my localhost to 2):
private static int GetRoleInstanceNumber()
{
    if (!RoleEnvironment.IsAvailable)
        return 2;

    var roleInstanceId = RoleEnvironment.CurrentRoleInstance.Id;
    var li1 = roleInstanceId.LastIndexOf(".");
    var li2 = roleInstanceId.LastIndexOf("_");
    var roleInstanceNo = roleInstanceId.Substring(Math.Max(li1, li2) + 1);
    return Int32.Parse(roleInstanceNo);
}
You can see it all live at http://taskqueue.cloudapp.net/#/compute-units
When using SignalR, after a client connects to the server it is issued a connection ID (this is essential to providing real-time communication). Yes, this is stored in memory, but SignalR can also be used in multi-node environments: you can use the Redis or even the SQL Server backplane (with more to come), for example. So, long story short, SignalR takes care of scale-out scenarios for you via backplanes / a service bus without you having to worry about it.
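To tie this back to the original question of finding connections by user ID: a common pattern with the same Groups API shown above is to put every connection into a per-user group on connect, then address that group instead of tracking connection IDs yourself (a backplane makes this work across nodes). A minimal sketch, assuming the ASP.NET SignalR 2.x server API and authenticated users; the hub name StockHub and the client method stockUpdated are hypothetical:

```csharp
using System.Collections.Generic;
using System.Threading.Tasks;
using Microsoft.AspNet.SignalR;

public class StockHub : Hub
{
    // Each connection joins a group named after its user, so "all of this
    // user's connections" is simply a group, even across multiple nodes.
    public override Task OnConnected()
    {
        Groups.Add(Context.ConnectionId, "user:" + Context.User.Identity.Name);
        return base.OnConnected();
    }
}

public static class StockNotifier
{
    // Called from outside any hub, e.g. from the stock-update background job.
    public static void NotifyUsers(IEnumerable<string> userIds, object stockUpdate)
    {
        var context = GlobalHost.ConnectionManager.GetHubContext<StockHub>();
        foreach (var userId in userIds)
            context.Clients.Group("user:" + userId).stockUpdated(stockUpdate);
    }
}
```

With this approach the "find connections for user X" lookup never happens in your code; the group membership (kept consistent by the backplane) does it for you.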