Viewing/Collecting Service Fabric ETW Events OnPrem - c#

We are trying to view service ETW events in an on-premises cluster. The long-term plan is to stand up an Elasticsearch cluster to send the events to, but I don't have time to build that out this week; right now I need to understand why my app is blowing up.
We have installed Microsoft Message Analyzer on one of the node servers, and I can connect with a live session to view the cluster ETW events from the Service Fabric system provider.
I would like to be able to view the Application ETW Events. I've followed the instructions in the article here:
https://learn.microsoft.com/en-us/azure/service-fabric/service-fabric-diagnostic-collect-logs-without-an-agent
This article focuses on three files: eventFlowConfig.json, Program.cs and ServiceEventSource.cs.
eventFlowConfig.json has:
"inputs": [
{
"type": "EventSource",
"sources": [
{ "providerName": "Microsoft-ServiceFabric-Services" },
{ "providerName": "Microsoft-ServiceFabric-Actors" },
{ "providerName": "TMHP-CacheApp-CacheAPI" }
]
}
],
"filters": [
{
"type": "drop",
"include": "Level == Verbose"
}
],
"outputs": [
{
"type": "StdOutput"
}
In Program.cs I have:
using (var diagnosticsPipeline = ServiceFabricDiagnosticPipelineFactory.CreatePipeline("CacheApp-CacheAPI-DiagnosticsPipeline"))
{
    ServiceRuntime.RegisterServiceAsync("EndpointType",
        context => new Endpoint(context)).GetAwaiter().GetResult();

    ServiceEventSource.Current.ServiceTypeRegistered(Process.GetCurrentProcess().Id, typeof(Endpoint).Name);

    // Prevents this host process from terminating so the service keeps running.
    Thread.Sleep(Timeout.Infinite);
}
ServiceEventSource.cs has:
[EventSource(Name = "TMHP-CacheApp-CacheAPI")]
The app packages and deploys fine, but in Message Analyzer I don't know how to attach to the application provider. I think I need to add a "Custom Provider", but it asks for a GUID. Is there some way to find this GUID? I'm assuming I want to add the custom provider for my specific Service Fabric application's EventSource:
"TMHP-CacheApp-CacheAPI"
Thanks in advance,
Greg

I'd recommend that you use PerfView instead. Vance Morrison has an article on how to view EventSource-based events in PerfView: https://blogs.msdn.microsoft.com/vancem/2012/07/09/introduction-tutorial-logging-etw-events-in-c-system-diagnostics-tracing-eventsource/. The '*' in front of the EventSource name is important.
Here are some instructions for viewing the actor events; your custom EventSource follows the same pattern.
The Actors and Reliable Services events are EventSource-based, so to view them in PerfView you have to follow the instructions in Vance's blog. Don't forget the '*'!
Start PerfView with a command line like this: perfview /onlyproviders=*Microsoft-ServiceFabric-Actors
You can collect using Collect | Collect | Start Collection. Make sure that the Advanced Options | Additional Providers field contains *Microsoft-ServiceFabric-Actors.
When you finish collecting, the events are viewable under the Events node.

The GUID of a .NET EventSource-based event provider is actually derived from the name of the provider using a hash algorithm. This blog post has a short description of it: https://blogs.msdn.microsoft.com/dcook/2015/09/08/etw-provider-names-and-guids/. You can use the supplied ETWGuid.exe to generate the GUID for your provider:
C:\code> .\EtwGuid.exe TMHP-CacheApp-CacheAPI
TRACELOGGING_DEFINE_PROVIDER(
g_hMyProvider,
"TMHP-CacheApp-CacheAPI",
// {9deef099-8d1a-568a-1618-08ffbb7146b3}
(0x9deef099,0x8d1a,0x568a,0x16,0x18,0x08,0xff,0xbb,0x71,0x46,0xb3));
So the GUID for TMHP-CacheApp-CacheAPI would be 9deef099-8d1a-568a-1618-08ffbb7146b3. Note that this only works for .NET EventSources; other event providers may set their provider GUIDs differently.
You can then look for that provider in your Microsoft Message Analyzer, PerfView or any other tool for ETW viewing.
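If you don't have EtwGuid.exe handy, the hash is easy to reproduce yourself. Below is a short Python sketch of the documented EventSource name-to-GUID algorithm (the namespace GUID and the version-nibble fix-up are taken from the blog post above; this is an illustration, not an official tool):

```python
import hashlib
import uuid

# Namespace GUID used by .NET EventSource for name-based provider GUIDs
# (482C2DB2-C390-47C8-87F8-1A15BFC130FB, in big-endian byte order).
EVENTSOURCE_NAMESPACE = bytes.fromhex("482C2DB2C39047C887F81A15BFC130FB")

def eventsource_provider_guid(name: str) -> uuid.UUID:
    """Derive the ETW provider GUID from an EventSource name."""
    # The provider name is upper-cased and encoded as UTF-16 big-endian.
    data = EVENTSOURCE_NAMESPACE + name.upper().encode("utf-16-be")
    digest = bytearray(hashlib.sha1(data).digest()[:16])
    # Force the version nibble of octet 7 to 5 (name-based hash).
    digest[7] = (digest[7] & 0x0F) | 0x50
    # .NET's Guid(byte[]) constructor reads the first fields little-endian.
    return uuid.UUID(bytes_le=bytes(digest))

# Should match the EtwGuid.exe output shown above.
print(eventsource_provider_guid("TMHP-CacheApp-CacheAPI"))
```

This is handy when you need the GUID for a tool that won't resolve EventSource names itself (Message Analyzer's "Custom Provider" dialog, for instance).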
As for the Microsoft-supplied event providers, there are three built-in ones you should look at for Service Fabric:
Microsoft-ServiceFabric - CBD93BC2-71E5-4566-B3A7-595D8EECA6E8 - these are the 'low-level' Service Fabric activities
Microsoft-ServiceFabric-Actors - (https://learn.microsoft.com/en-us/azure/service-fabric/service-fabric-reliable-actors-diagnostics)
Microsoft-ServiceFabric-Services - https://learn.microsoft.com/en-us/azure/service-fabric/service-fabric-reliable-services-diagnostics

Related

Delete obsolete reported properties from Azure iot edge module Twin

I am still rather inexperienced with Microsoft Azure IoT Edge (and Stack Overflow - this being my first post) and with how module twins work, and I have an issue regarding the deletion of obsolete properties from the reported part of my module twin.
I have migrated a couple of properties from one device to another within the module twin, but have not been able to remove the properties from the reported section. I understand that setting them to null should do the trick (setting them to null and updating them in desired removes them only from the desired part of the twin). The obsolete properties are also not present in the module twin locally on the device.
I have tried updating the reported section - with a C# console app using the Microsoft.Azure package, setting the obsolete properties to null - but this doesn't seem to work either.
await registryManager.UpdateTwinAsync(deviceId, moduleId, removeProperties, eTag);
with my string removeProperties being something like the following (updating desired via this route works like a charm):
{
  "properties": {
    "reported": {
      "foo": {
        "bar": null
      }
    }
  }
}
Can anybody suggest a way to remove these properties?
You can't update the reported properties through the registry manager. However, it can be done using the device's identity: in the same way you wrote a console program to update the twin with the service SDK, you can do it with the device SDK (granted that the device is offline).
For instance if you have a file called twin.json:
{
"foo": {
"bar": null
}
}
You can update the reported properties like this:
var text = await File.ReadAllTextAsync("./twin.json");
var deviceClient = DeviceClient.CreateFromConnectionString("very-secret-connection-string");
var twinCollection = new TwinCollection(text);
await deviceClient.UpdateReportedPropertiesAsync(twinCollection);

How to connect from .NET .dll file to Azure using Linked Service

I want to write code similar to the code at the bottom of this link (https://azure.microsoft.com/en-us/blog/automating-azure-analysis-services-processing-with-azure-functions/) in Visual Studio and build a DLL file. However, instead of using the connection string, I would like to use an existing linked service from my Azure portal.
The goal is to create a DLL that refreshes my Cube, while at the same time using an existing Linked Service which is already in my Azure Portal.
Is this possible?
Thanks.
#r "Microsoft.AnalysisServices.Tabular.DLL"
#r "Microsoft.AnalysisServices.Core.DLL"
#r "System.Configuration"
using System;
using System.Configuration;
using Microsoft.AnalysisServices.Tabular;
public static void Run(TimerInfo myTimer, TraceWriter log)
{
    log.Info($"C# Timer trigger function started at: {DateTime.Now}");
    try
    {
        Microsoft.AnalysisServices.Tabular.Server asSrv = new Microsoft.AnalysisServices.Tabular.Server();
        var connStr = ConfigurationManager.ConnectionStrings["AzureASConnString"].ConnectionString; // Change this to a Linked Service connection
        asSrv.Connect(connStr);
        Database db = asSrv.Databases["AWInternetSales2"];
        Model m = db.Model;
        db.Model.RequestRefresh(RefreshType.Full); // Mark the model for refresh
        //m.RequestRefresh(RefreshType.Full); // Mark the model for refresh
        m.Tables["Date"].RequestRefresh(RefreshType.Full); // Mark only one table for refresh
        db.Model.SaveChanges(); // Commit, which will execute the refresh
        asSrv.Disconnect();
    }
    catch (Exception e)
    {
        log.Info($"C# Timer trigger function exception: {e.ToString()}");
    }
    log.Info($"C# Timer trigger function finished at: {DateTime.Now}");
}
So I guess you're using Data Factory and you want to process your Analysis Services model from your pipeline. I don't see what your question actually has to do with the Data Lake Store, though.
To trigger Azure Functions from Data Factory (v2 only), you'll have to use a Web Activity. It is possible to pass a linked service as part of your payload, as shown in the documentation. It looks like this:
{
"body": {
"myMessage": "Sample",
"linkedServices": [{
"name": "MyService1",
"properties": {
...
}
}]
}
However, there is no Analysis Services linked service in Data Factory - at least, I haven't heard of such a thing. Passing in a connection string from the pipeline seems like a good idea, however. You could pass it as a pipeline parameter in the body of the web request.
Create a parameter in your pipeline
Add it to your Web Activity Payload
{
  "body": {
    "AzureASConnString": "@pipeline().parameters.AzureASConnString"
  }
}
You can then retrieve this value in your function as described here.
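On the function side, retrieving the value is then just a matter of reading it out of the posted JSON body. A minimal sketch of that step (Python for illustration only - the function in the question is C# - and the connection string value is a placeholder):

```python
import json

# Body as posted by the Data Factory Web Activity (shape shown above);
# the connection string value here is a made-up placeholder.
posted_body = '{"AzureASConnString": "DataSource=asazure://example"}'

payload = json.loads(posted_body)
conn_str = payload["AzureASConnString"]
print(conn_str)
```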

TFS Capacity plan reading from c#

I am building a reporting tool on top of TFS. I was able to read work item and iteration-related information from TFS. How do I get the iteration capacity plan information from TFS, using WIQL or any other option? I need to get this information in my C# code.
Thanks in advance for all the help.
This isn't possible through the Client Object Model. Please refer to this similar question: TFS 11 2012 API Questions : query capacity and days off
These values are only available from the Server Object Model (there is no Client Object Model equivalent at the moment). The interfaces and objects are all marked internal, so even on the server you can't access these values.
internal TeamCapacity GetTeamIterationCapacity(Guid teamId, Guid iterationId);
Declaring Type: Microsoft.TeamFoundation.Server.WebAccess.WorkItemTracking.Common.DataAccess.TeamConfigurationComponent
Assembly: Microsoft.TeamFoundation.Server.WebAccess.WorkItemTracking.Common, Version=12.0.0.0
You could either query the ProjectCollection database directly, using the tables mentioned by James Tupper in this thread.
Or you could use the REST API to Get a team's capacity or Get a team member's capacity, which will return a response like this:
{
"values": [
{
"teamMember": {
"id": "8c8c7d32-6b1b-47f4-b2e9-30b477b5ab3d",
"displayName": "Chuck Reinhart",
"uniqueName": "fabrikamfiber3#hotmail.com",
"url": "https://fabrikam-fiber-inc.vssps.visualstudio.com/_apis/Identities/8c8c7d32-6b1b-47f4-b2e9-30b477b5ab3d",
"imageUrl": "https://fabrikam-fiber-inc.visualstudio.com/DefaultCollection/_api/_common/identityImage?id=8c8c7d32-6b1b-47f4-b2e9-30b477b5ab3d"
},
"activities": [
{
"capacityPerDay": 0,
"name": null
}
],
"daysOff": [],
"url": "https://fabrikam-fiber-inc.visualstudio.com/DefaultCollection/6d823a47-2d51-4f31-acff-74927f88ee1e/748b18b6-4b3c-425a-bcae-ff9b3e703012/_apis/work/teamsettings/iterations/2ec76bfe-ba74-4060-970d-4567a3e997ee/capacities/8c8c7d32-6b1b-47f4-b2e9-30b477b5ab3d"
}
]
}
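The url field in that response also shows the shape of the capacities endpoint, so you can construct requests for other teams or iterations. A small helper sketching that URL pattern (the IDs below are the sample ones from the response above, not real values):

```python
def capacities_url(collection_url, project_id, team_id, iteration_id, member_id=None):
    """Build the team capacity REST endpoint URL.

    Pattern taken from the 'url' field in the sample response above.
    """
    url = (f"{collection_url}/{project_id}/{team_id}"
           f"/_apis/work/teamsettings/iterations/{iteration_id}/capacities")
    if member_id is not None:
        # Appending a member id targets a single team member's capacity.
        url += f"/{member_id}"
    return url

print(capacities_url(
    "https://fabrikam-fiber-inc.visualstudio.com/DefaultCollection",
    "6d823a47-2d51-4f31-acff-74927f88ee1e",
    "748b18b6-4b3c-425a-bcae-ff9b3e703012",
    "2ec76bfe-ba74-4060-970d-4567a3e997ee",
    "8c8c7d32-6b1b-47f4-b2e9-30b477b5ab3d"))
```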
Iteration objects are part of the project settings, so you might need to query the iteration details from there rather than from the work items.

Using EventFlow to monitor ETW event on local machine

I am trying to set up a simple ETW and EventFlow example that allows specific ETW providers to be monitored - in this case, the Service Control Manager ETW provider, to monitor when service start and stop messages are issued.
I have the following input configuration for Tracing and ETW.
"inputs": [
{
"type": "Trace",
"traceLevel": "Warning"
},
{
"type": "ETW",
"providers": [
{
"providerName": "Service Control Manager"
}
]
}]
I have the following code which is starting up monitoring using EventFlow.
static void Main(string[] args)
{
    using (var pipeline = DiagnosticPipelineFactory.CreatePipeline("eventFlowConfig.json"))
    {
        System.Diagnostics.Trace.TraceWarning("EventFlow is working!");
        Console.ReadLine();
    }
}
The trace event is appearing in the console, but when I start and stop a service no ETW events are appearing.
Is EventFlow designed for this scenario on a local machine? If so, what am I missing in my configuration or code?
The console process is running as administrator, and the account is a member of the Performance Log Users and Performance Monitor Users groups.
If you want to listen for ETW events from the Service Control Manager, you'll need to listen for the provider named Microsoft-Windows-Services.
Here is what I have in my eventFlowConfig.json
{
"inputs": [
{
"type": "ETW",
"providers": [
{ "providerName": "Microsoft-Windows-Services" }
]
}
],
"filters": [],
"outputs": [
{ "type": "StdOutput" }
],
"schemaVersion": "2016-08-11",
"extensions": []
}
To check that it worked, I stopped and started SQL Server services. The events were output in the console as expected.
As an additional sanity check, you can use the Visual Studio Diagnostic Events viewer to listen for ETW events. Launch the viewer, click the cog to configure, add the provider name in the list of ETW Providers, and apply. You should now be able to see the same events in both the viewer and your console application.

How to use a ServiceBus Trigger with a topic/subscription in an Azure Function

I'd like to create an Azure Function that is triggered when a new message is added to a topic/subscription.
For the moment I've created an Azure Function using the ServiceBusQueueTrigger C# Template and I've set the Queue Name to
topicPath + "/Subscriptions/" + subscriptionName
But I've got this exception:
Microsoft.ServiceBus: Cannot get entity 'topic-test/Subscriptions/subscription-test' because it is not of type QueueDescription. Check that you are using method(s) with the correct entity type. System.Runtime.Serialization: Error in line 1 position 1762. Expecting element 'QueueDescription' from namespace 'http://schemas.microsoft.com/netservices/2010/10/servicebus/connect'.. Encountered 'None' with name '', namespace ''. .
I thought the Azure Function was using MessagingFactory.CreateMessageReceiver to initialize the message pump, but apparently not.
Is there any support for topics/subscriptions at the moment?
Yes, topics are supported, but our UI and templates are behind on that, unfortunately - we'll be releasing some updates soon addressing these issues.
For now, you can use the Advanced Editor to edit your trigger binding directly. There you can specify your subscriptionName and topicName values. Here's an example:
{
  "bindings": [
    {
      "type": "serviceBusTrigger",
      "name": "message",
      "direction": "in",
      "subscriptionName": "subscription-test",
      "topicName": "topic-test"
    }
  ]
}
In general, since Azure Functions is built atop the WebJobs SDK, our various bindings map directly to their SDK counterparts. For example, serviceBusTrigger maps to ServiceBusTriggerAttribute, which has SubscriptionName/TopicName properties. You can therefore expect to see the same properties in the Function metadata model.
