Cross-workspace queries in Azure Log Analytics .NET SDK - C#

I'm using the Azure Log Analytics .NET SDK to execute some Log Analytics queries.
The NuGet package I'm using for this SDK is Microsoft.Azure.OperationalInsights.
It allows me to issue some simple queries, as in the following code sample:
Authentication & simple query:
partial class QueryProvider
{
    private OperationalInsightsDataClient _operationalInsightsDataClient;

    private async Task Authenticate()
    {
        // Retrieving the credentials and settings data from the app settings.
        var domain = SettingsHelpers.PullSettingsByKey("domain");
        var clientId = SettingsHelpers.PullSettingsByKey("clientId");
        var workspaceId = SettingsHelpers.PullSettingsByKey("workspaceId");
        var authEndpoint = SettingsHelpers.PullSettingsByKey("authEndpoint");
        var clientSecret = SettingsHelpers.PullSettingsByKey("clientSecret");
        var tokenAudience = SettingsHelpers.PullSettingsByKey("tokenAudience");

        // Authenticating to the Azure Log Analytics service.
        var azureActiveDirectorySettings = new ActiveDirectoryServiceSettings
        {
            AuthenticationEndpoint = new Uri(authEndpoint),
            TokenAudience = new Uri(tokenAudience),
            ValidateAuthority = true
        };
        var credentials = await ApplicationTokenProvider.LoginSilentAsync
        (
            domain
            , clientId
            , clientSecret
            , azureActiveDirectorySettings
        );
        _operationalInsightsDataClient = new OperationalInsightsDataClient(credentials);
        _operationalInsightsDataClient.WorkspaceId = workspaceId;
    }

    public async Task<string> LogAnalyticsSampleQuery()
    {
        var query = @"Usage
            | where TimeGenerated > ago(3h)
            | where DataType == 'Perf'
            | where QuantityUnit == 'MBytes'
            | summarize avg(Quantity) by Computer
            | sort by avg_Quantity desc nulls last";
        var jsonResult = await _operationalInsightsDataClient.QueryAsync(query);
        return JsonConvert.SerializeObject(jsonResult.Results);
    }
}
Now I want to write a method that runs a cross-workspace query. I get the workspace IDs dynamically, and I want to build a query that references all of those workspaces.
I did not find any sample in the docs for building such queries.
I found a property of the OperationalInsightsDataClient class called AdditionalWorkspaces, but it's unclear how to use it to achieve this.
Any help would be greatly appreciated.

Use the ListWorkspaces method and store the workspace Id, CustomerId, or Name values in a list:
var ws = new List<string>();
foreach (var w in workspaces)
{
    ws.Add(w.Id);
}
AdditionalWorkspaces holds the workspaces you want to query, but assigning it alone has no influence on the query result.
_operationalInsightsDataClient.AdditionalWorkspaces = ws;
To run a cross-workspace query, pass the list of workspace IDs to the query method:
var jsonResult = await _operationalInsightsDataClient.QueryAsync(query, null, _operationalInsightsDataClient.AdditionalWorkspaces);
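Putting the pieces together, here is a minimal sketch of a cross-workspace method. It assumes _operationalInsightsDataClient has already been authenticated as in the question, and that workspaceIds holds the dynamically retrieved workspace IDs (e.g. from ListWorkspaces):
// Minimal sketch: _operationalInsightsDataClient is assumed to be authenticated already,
// and workspaceIds is assumed to hold the dynamically retrieved workspace IDs.
public async Task<string> LogAnalyticsCrossWorkspaceQuery(IList<string> workspaceIds)
{
    var query = @"Usage
        | where TimeGenerated > ago(3h)
        | summarize avg(Quantity) by Computer";

    _operationalInsightsDataClient.AdditionalWorkspaces = workspaceIds;
    // The primary workspace comes from WorkspaceId; the others are passed to QueryAsync.
    var jsonResult = await _operationalInsightsDataClient.QueryAsync(query, null, _operationalInsightsDataClient.AdditionalWorkspaces);
    return JsonConvert.SerializeObject(jsonResult.Results);
}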

Related

Programmatically find Partition Key Path of Cosmos Container

var cosmosClient = new CosmosClient(EndpointUrl, AuthorizationKey,
new CosmosClientOptions() {
AllowBulkExecution = true
});
var database = cosmosClient.GetDatabase(SourceDatabase);
var container = database.GetContainer(SourceContainerName);
I'm looking for an approach to programmatically find out the PartitionKeyPath of the container. Is there no API in the SDK to obtain it from the container object?
P.S. Microsoft displays the partition key path in the Azure portal. I was wondering how they display it?
In the Azure SDK for .NET, the JSON path used for container partitioning can be found using the ContainerProperties.PartitionKeyPath property.
PartitionKeyPath defaults to "/PartitionKey". Refer to the Microsoft docs.
ContainerProperties cproperties = await container.ReadContainerAsync();
Console.WriteLine(cproperties.PartitionKeyPath);
Additionally, you can go ahead and share this feedback so the CosmosDB team can look into this idea.✌
To retrieve the partition key of a container you have to use the GetContainerQueryIterator() method. Depending on what is already available within your code, you could use something like this:
private static async Task<string> GetPartitionKey(Database database, string containerName)
{
    var query = new QueryDefinition("select * from c where c.id = @id")
        .WithParameter("@id", containerName);
    using var iterator = database.GetContainerQueryIterator<ContainerProperties>(query);
    while (iterator.HasMoreResults)
    {
        foreach (var container in await iterator.ReadNextAsync())
        {
            return container.PartitionKeyPath;
        }
    }
    return null;
}
In version 3.31.2 you can access the PartitionKeyPath as follows:
var containerResponse = await cosmosClient.GetDatabase(options.Value.Database)
    .GetContainer(collectionId)
    .ReadContainerAsync(cancellationToken: cancellationToken);
Console.WriteLine(containerResponse.Resource.PartitionKeyPath);

How to copy whole Container from one Azure Blob Service to another with Azure Client Library v12

I'm trying my hand at programmatic Azure tooling, specifically trying to copy/clone/move a Container from one Azure Blob Storage account to another one. I'm not seeing a good way to do this though with v12.x of the client library in its docs (so far at least).
For example:
BlobStorage1
|-- SomeContainer
    |-- blob1
    |-- blob2
    |-- blob3
BlobStorage2
|-- SomeOtherContainer
    |-- otherBlob
I want to programmatically move SomeContainer and all its blobs to BlobStorage2.
What I've tried so far:
Docs read/consulted:
StartCopyFrom method
which doesn't seem to show how to push from account 1 to account 2. You can pull down from account 2 into account 1, but that's the opposite of what I'm after.
code attempt:
var localContainer = blobServiceClient.GetBlobContainerClient(localContainerName);
var blobs = localContainer.GetBlobs();
var remoteClient = new BlobServiceClient(remoteConnectionString);
var remoteContainer = remoteClient.GetBlobContainerClient(localContainerName);
foreach (var blob in blobs)
{
    Console.WriteLine($"copying blob: {blob.Name}");
    var sourceBlob = localContainer.GetBlobClient(blob.Name);
    var remoteBlobClient = remoteContainer.GetBlobClient(blob.Name);
    await remoteBlobClient.StartCopyFromUriAsync(sourceBlob.Uri);
}
The problem here is that I could copy from remote to local (via connection string), or within the same account since the URI would be quite similar, but not from the account I'm on to a separate storage account. What would be the recommended way to copy blobs (or containers wholesale, if possible) with the client library (v12.x) from one account to a separate one?
Please try the code below:
static async Task CopyContainersAcrossAccounts()
{
    var sourceAccountName = "source-account";
    var sourceAccountKey = "source-account-key";
    var targetAccountName = "target-account";
    var targetAccountKey = "target-account-key";
    Azure.Storage.StorageSharedKeyCredential sourceCredential = new Azure.Storage.StorageSharedKeyCredential(sourceAccountName, sourceAccountKey);
    Azure.Storage.StorageSharedKeyCredential targetCredential = new Azure.Storage.StorageSharedKeyCredential(targetAccountName, targetAccountKey);
    var sourceConnectionString = $"DefaultEndpointsProtocol=https;AccountName={sourceAccountName};AccountKey={sourceAccountKey};EndpointSuffix=core.windows.net;";
    var sourceContainer = "source-container-name";
    var targetConnectionString = $"DefaultEndpointsProtocol=https;AccountName={targetAccountName};AccountKey={targetAccountKey};EndpointSuffix=core.windows.net;";
    var targetContainer = "target-container-name";
    var sourceBlobContainerClient = new Azure.Storage.Blobs.BlobContainerClient(sourceConnectionString, sourceContainer);
    var targetBlobContainerClient = new Azure.Storage.Blobs.BlobContainerClient(targetConnectionString, targetContainer);
    var sourceBlobs = sourceBlobContainerClient.GetBlobs();
    foreach (var blob in sourceBlobs)
    {
        //Get shared access signature with "Read" permission in case source container is a private container.
        Azure.Storage.Sas.BlobSasBuilder blobSasBuilder = new Azure.Storage.Sas.BlobSasBuilder()
        {
            BlobContainerName = sourceContainer,
            BlobName = blob.Name,
            ExpiresOn = DateTime.UtcNow.AddHours(1)
        };
        blobSasBuilder.SetPermissions(Azure.Storage.Sas.BlobSasPermissions.Read);
        var sasToken = blobSasBuilder.ToSasQueryParameters(sourceCredential).ToString();
        var sourceBlobClient = sourceBlobContainerClient.GetBlobClient(blob.Name);
        var targetBlobClient = targetBlobContainerClient.GetBlobClient(blob.Name);
        var sourceBlobUri = new Uri($"{sourceBlobClient.Uri.AbsoluteUri}?{sasToken}");
        await targetBlobClient.StartCopyFromUriAsync(sourceBlobUri);
    }
}
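Note that StartCopyFromUriAsync only starts a server-side copy. If you actually want to move the container (as in the question), a rough sketch of waiting for each copy to finish before deleting the source blob might look like this, using the variable names from the loop above (you may also need targetBlobContainerClient.CreateIfNotExistsAsync() beforehand; error handling is omitted):
// Sketch: poll the destination blob until the server-side copy completes, then delete the source blob.
var props = (await targetBlobClient.GetPropertiesAsync()).Value;
while (props.CopyStatus == Azure.Storage.Blobs.Models.CopyStatus.Pending)
{
    await Task.Delay(500);
    props = (await targetBlobClient.GetPropertiesAsync()).Value;
}
if (props.CopyStatus == Azure.Storage.Blobs.Models.CopyStatus.Success)
{
    await sourceBlobClient.DeleteIfExistsAsync();
}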

How to access CloudWatch log events using C# .NET Core

I've spent a lot of time on this, but I can't find a way to query logs from CloudWatch in a C# API. I want to display those logs on a UI. Any help?
Thanks in advance
You can integrate with CloudWatch and access the logs by using the code below.
First and foremost, install the latest version of the AWS SDK (the AWSSDK.CloudWatchLogs NuGet package) and provide your AWS access key and secret key, your region endpoint, and the CloudWatch log group name.
public static async Task DescribeSubscriptionFilters()
{
    var credentials = new BasicAWSCredentials("awskey", "secretkey"); // provide AWS credentials
    IAmazonCloudWatchLogs client =
        new AmazonCloudWatchLogsClient(credentials, RegionEndpoint.USGovCloudWest1); // provide region endpoint
    var describeLogStreamsRequest = new DescribeLogStreamsRequest()
    {
        LogGroupName = "LogGroupName" // mention your CloudWatch log group
    };
    var describeLogStreamsResult = await client.DescribeLogStreamsAsync(describeLogStreamsRequest);
    foreach (var stream in describeLogStreamsResult.LogStreams)
    {
        var eventsRequest = new GetLogEventsRequest()
        {
            LogStreamName = stream.LogStreamName,
            LogGroupName = describeLogStreamsRequest.LogGroupName
        };
        var result = await client.GetLogEventsAsync(eventsRequest);
        foreach (var events in result.Events)
        {
            Console.WriteLine(events.Timestamp + " - " + events.Message);
        }
    }
}
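If you need to filter the events before showing them on the UI, rather than reading every stream in full, a rough sketch using FilterLogEventsAsync on the same client might look like this; the "ERROR" pattern is only an example:
// Sketch: query the log group with a CloudWatch filter pattern and page through the results.
var filterRequest = new FilterLogEventsRequest
{
    LogGroupName = "LogGroupName", // same log group as above
    FilterPattern = "ERROR"        // example pattern; adjust as needed
};
FilterLogEventsResponse filterResponse;
do
{
    filterResponse = await client.FilterLogEventsAsync(filterRequest);
    foreach (var e in filterResponse.Events)
    {
        Console.WriteLine(e.Timestamp + " - " + e.Message);
    }
    filterRequest.NextToken = filterResponse.NextToken; // continue from the last page
} while (!string.IsNullOrEmpty(filterResponse.NextToken));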

How to pass filtering parameters in Power BI C# SDK

I'm generating a Power BI embed token using the C# SDK.
using (var client = new PowerBIClient(new Uri(apiUrl), tokenCredentials))
{
    var workspaceId = groupId.ToString();
    var report = await client.Reports.GetReportInGroupAsync(workspaceId, reportId);
    var generateTokenRequestParameters = new GenerateTokenRequest(accessLevel: "view");
    var tokenResponsex = await client.Reports.GenerateTokenAsync(workspaceId, reportId, generateTokenRequestParameters);
    result.EmbedToken = tokenResponsex;
    result.EmbedUrl = report.EmbedUrl;
    result.Id = report.Id;
}
I need to pass a parameter for filtering, but I couldn't find a straightforward way to do this.
How do I get this done?
You can use RLS (row-level security) in embedded reports implemented with the app-owns-data scenario (a single master account for authentication) by passing EffectiveIdentity information when generating the access token for the report with GenerateTokenInGroup.
To implement RLS in the report itself, you need either to use the USERPRINCIPALNAME() DAX function to filter the data, or to define roles and filter the data based on them. If you implement it with roles, then after publishing the report, go to the dataset's security settings and add users to the roles.
To generate the token by providing an effective identity and role membership, use code like this:
var credentials = new TokenCredentials(accessToken, "Bearer");
using (var client = new PowerBIClient(new Uri("https://api.powerbi.com"), credentials))
{
    var datasets = new List<string>() { datasetId }; // dataset's GUID as a string
    var roles = new List<string>();
    roles.Add("ROLE1");
    roles.Add("ROLE2");
    roles.Add("ROLE3");
    var effectiveIdentity = new EffectiveIdentity("user@example.com", datasets, roles);
    var r = new GenerateTokenRequest("view", identities: new List<EffectiveIdentity> { effectiveIdentity });
    var token = client.Reports.GenerateTokenInGroup(groupId, reportId, r).Token;
}
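If you prefer to keep the async call from the question, the identities can be supplied through that same request object; here is a sketch reusing the question's variable names (workspaceId, reportId, client), which are assumptions carried over from the snippets above:
// Sketch: pass the effective identity through the question's existing GenerateTokenAsync call.
var generateTokenRequestParameters = new GenerateTokenRequest(
    accessLevel: "view",
    identities: new List<EffectiveIdentity> { effectiveIdentity });
var tokenResponse = await client.Reports.GenerateTokenAsync(workspaceId, reportId, generateTokenRequestParameters);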

Access TFS Team Query from Client Object API

Is there a way to run a shared team query, by name, through the TFS 2013 client object API?
I'm working on a C# script that will do some work based on the results of a shared team query. I don't want to have to maintain the query in the TFS UI as well as in my script; I'd prefer to just run the registered query that my team uses and then work with the results. When I write "registered query" I'm just referring to a query that I wrote in the TFS UI and saved as a shared query.
In other words: I'd like to use the TFS UI to create a query, save the file in my "shared queries" list, call it "foo", then access foo from the client object API in my script.
I see that there is a GetQueryDefinition(GUID) method off of WorkItemStore, but where would I get the GUID for a shared team query?
Sample code that should do what you need:
/// Handles nested query folders
private static Guid FindQuery(QueryFolder folder, string queryName)
{
    foreach (var item in folder)
    {
        if (item.Name.Equals(queryName, StringComparison.InvariantCultureIgnoreCase))
        {
            return item.Id;
        }
        var itemFolder = item as QueryFolder;
        if (itemFolder != null)
        {
            var result = FindQuery(itemFolder, queryName);
            if (!result.Equals(Guid.Empty))
            {
                return result;
            }
        }
    }
    return Guid.Empty;
}
static void Main(string[] args)
{
    var collectionUri = new Uri("http://TFS/tfs/DefaultCollection");
    var server = new TfsTeamProjectCollection(collectionUri);
    var workItemStore = server.GetService<WorkItemStore>();
    var teamProject = workItemStore.Projects["TeamProjectName"];
    var x = teamProject.QueryHierarchy;
    var queryId = FindQuery(x, "QueryNameHere");
    var queryDefinition = workItemStore.GetQueryDefinition(queryId);
    var variables = new Dictionary<string, string>() { { "project", "TeamProjectName" } };
    var result = workItemStore.Query(queryDefinition.QueryText, variables);
}
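The Query call returns a WorkItemCollection; as a quick illustration (assuming the usual Microsoft.TeamFoundation.WorkItemTracking.Client namespace is imported), you could then work with the results like this:
// Illustration: iterate the WorkItemCollection returned by workItemStore.Query.
foreach (WorkItem workItem in result)
{
    Console.WriteLine($"{workItem.Id}: {workItem.Title} ({workItem.State})");
}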
