Azure Application Insights drops some custom events - C#

I have a Web Application and want to track user logins in AppInsights with the following code:
[HttpPost]
public async Task<ActionResult> Login(AccountViewModel model, string returnUrl)
{
    var success = await _userManager.CheckPasswordAsync(model);

    _telemetryClient.TrackEvent("Login",
        new Dictionary<string, string>
        {
            { "UserName", model.UserName },
            { "UsingPin", model.UsingPin.ToString() },
            { "ReturnUrl", returnUrl },
            { "LastUsedUrl", model.LastUsedUrl },
            { "Success", success.ToString() }
        });

    return View(model);
}
The code works, I get events in the Analytics portal... but only a subset of them! In the sessions table on the database side I have about 1000 unique user logins, but AI reports around 100...
What I have already checked:
_telemetryClient is a singleton configured through Unity (<lifetime type="singleton" />)
there is no quota configured in Azure portal
I have more than 1000 POST requests to the Login method, this looks just right (I mean every request comes to this exact server, returns 200 status, etc.)
So it looks like Azure just drops some of the custom events... or they are not sent to Azure for some reason. Is there any configuration I am missing?

As far as I know, if your application sends a lot of data and you are using the Application Insights SDK for ASP.NET version 2.0.0-beta3 or later, the adaptive sampling feature may kick in and send only a percentage of your telemetry.
So I guess this is the reason why you only see around 100 records in AI.
If you want to send all the records to AI, I suggest you disable adaptive sampling. For more details about how to disable it, you could refer to this article.
Notice: This is not recommended. Sampling is designed so that related telemetry is correctly transmitted, for diagnostic purposes.
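If you don't want to turn sampling off completely, a middle-ground sketch (assuming you configure the ASP.NET SDK's telemetry processor chain in code; the rate value here is just an example, and any AdaptiveSamplingTelemetryProcessor entry in ApplicationInsights.config should be removed so sampling isn't applied twice) is to keep adaptive sampling but exclude custom events from it:

using Microsoft.ApplicationInsights.Extensibility;
using Microsoft.ApplicationInsights.WindowsServer.TelemetryChannel;

// Run once at startup (e.g. in Application_Start). Adaptive sampling still applies to
// requests/dependencies, but custom events such as "Login" are never sampled out.
var builder = TelemetryConfiguration.Active.TelemetryProcessorChainBuilder;
builder.UseAdaptiveSampling(maxTelemetryItemsPerSecond: 5, excludedTypes: "Event");
builder.Build();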

PayPalAPIInterfaceServiceService and SetExpressCheckout config / credential information

I created an app in my business account on this page:
https://developer.paypal.com/developer/applications/
When I click on the app I created I see the following:
Sandbox account, Client ID, and Secret.
I am trying to call SetExpressCheckout… but the documentation is unclear and examples are all over the map.
Basically I’m seeing things like:
var request = new SetExpressCheckoutReq() { … };
var config = new Dictionary<string, string>()
{
    { "mode", "sandbox" }, // some variations of these values
    { "clientId", "fromAbovePage" },
    { "clientSecret", "fromAbovePage" },
    { "sandboxAccount", "fromAbovePage" },
    { "apiUsername", "IDontKnow" },
    { "apiPassword", "IDontKnow" },
    { "apiSignature", "IDontKnow" }
};
var service = new PayPalAPIInterfaceServiceService(config);
var response = service.SetExpressCheckout(request, new SignatureCredential(config["apiUsername"], config["apiPassword"], config["apiSignature"]));
Also, kind of weird that credentials go into both the PayPalAPIInterfaceServiceService and the actual SetExpressCheckout call.
What are (and where do I get) the correct values for the above config? (the request itself I have pretty much figured out)
Note: PayPal support told me that I need to use Reference Transactions in order to charge varying amounts over potentially varying times without subsequent user interaction, if that is relevant.
I would love to see examples of this with the most recent APIs if anyone has that information as well.
Thank you.
SetExpressCheckout is a legacy NVP API.
For sandbox, it uses credentials from the "Profile" of a sandbox account in https://www.paypal.com/signin?intent=developer&returnUri=https%3A%2F%2Fdeveloper.paypal.com%2Fdeveloper%2Faccounts%2F
For live, https://www.paypal.com/api
ClientID/Secret credentials are used for the current v2/checkout/orders REST API, which at the moment does not have any public documentation for vaulting or reference transactions; it is for one-time payments. You can find information for a server-side integration at https://developer.paypal.com/docs/checkout/reference/server-integration/
If you are using this REST API integration, create two routes: one for 'Set Up Transaction' and one for 'Create Transaction'. Then pair them with this approval flow: https://developer.paypal.com/demo/checkout/#/pattern/server
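If it helps, here is a rough sketch of those two routes done directly against the v2/checkout/orders REST API with plain HttpClient rather than an SDK. The sandbox base URL and the create/capture endpoints are the documented ones; the class name, the hard-coded USD amount, and the raw-string JSON handling are just illustrative, and you would parse the responses with your preferred JSON library.

using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using System.Threading.Tasks;

public class PayPalOrdersClient
{
    private static readonly HttpClient Http =
        new HttpClient { BaseAddress = new Uri("https://api.sandbox.paypal.com/") };

    // Step 0: exchange the dashboard client id/secret for an OAuth2 access token.
    public static async Task<string> GetAccessTokenJsonAsync(string clientId, string clientSecret)
    {
        var request = new HttpRequestMessage(HttpMethod.Post, "v1/oauth2/token")
        {
            Content = new StringContent("grant_type=client_credentials",
                Encoding.UTF8, "application/x-www-form-urlencoded")
        };
        request.Headers.Authorization = new AuthenticationHeaderValue("Basic",
            Convert.ToBase64String(Encoding.UTF8.GetBytes($"{clientId}:{clientSecret}")));
        var response = await Http.SendAsync(request);
        response.EnsureSuccessStatusCode();
        return await response.Content.ReadAsStringAsync(); // contains "access_token"
    }

    // Route 1 ("Set Up Transaction"): create an order and return its id to the browser.
    public static async Task<string> CreateOrderAsync(string accessToken)
    {
        var request = new HttpRequestMessage(HttpMethod.Post, "v2/checkout/orders")
        {
            Content = new StringContent(
                "{\"intent\":\"CAPTURE\",\"purchase_units\":[{\"amount\":{\"currency_code\":\"USD\",\"value\":\"10.00\"}}]}",
                Encoding.UTF8, "application/json")
        };
        request.Headers.Authorization = new AuthenticationHeaderValue("Bearer", accessToken);
        var response = await Http.SendAsync(request);
        response.EnsureSuccessStatusCode();
        return await response.Content.ReadAsStringAsync(); // contains the order id
    }

    // Route 2 ("Create Transaction" in the demo): capture the order after buyer approval.
    public static async Task<string> CaptureOrderAsync(string accessToken, string orderId)
    {
        var request = new HttpRequestMessage(HttpMethod.Post, $"v2/checkout/orders/{orderId}/capture")
        {
            Content = new StringContent("", Encoding.UTF8, "application/json")
        };
        request.Headers.Authorization = new AuthenticationHeaderValue("Bearer", accessToken);
        var response = await Http.SendAsync(request);
        response.EnsureSuccessStatusCode();
        return await response.Content.ReadAsStringAsync();
    }
}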

Add B2B external user to Azure AD without sending invitation email C#

We are using Azure B2B to invite external users to access our applications in the tenant.
For new users, we send the B2B invite (using C# code with a customized mail format); upon acceptance, users are able to access the application.
For bulk users without sending an email to the user, there is an option in the Azure portal: download the Excel template and fill in the details, with the [sendEmail] column set to True or False.
Now I want to add the user to Azure AD without sending the email, using C# code. Can anyone suggest how to achieve this requirement?
You could use Microsoft Graph to create B2B users without sending the invitation email.
Reference: https://learn.microsoft.com/en-us/graph/api/resources/invitation?view=graph-rest-1.0
POST https://graph.microsoft.com/v1.0/invitations
{
  "invitedUserEmailAddress": "guestuser@sampledomain.com",
  "inviteRedirectUrl": "https://sample.com",
  "sendInvitationMessage": false
}
You could experiment with the same request in the Graph Explorer and see whether it meets your requirement:
https://developer.microsoft.com/en-us/graph/graph-explorer
Having said that, you can now use the Microsoft Graph C# SDK to achieve your requirement with the above request.
Ref: https://learn.microsoft.com/en-us/graph/sdks/sdks-overview
Adding an external user without the email using the GraphClient in C# would look like this:
public static async Task CreateB2BUserAsync()
{
    try
    {
        var invitation = new Invitation
        {
            SendInvitationMessage = false,            // suppress the invitation email
            InvitedUserEmailAddress = "user@sample.com",
            InvitedUserType = "Member",
            InviteRedirectUrl = "https://sampledomain.com",
            InvitedUserDisplayName = "Sample User"
        };

        await graphClient.Invitations.Request().AddAsync(invitation);
    }
    catch (ServiceException ex)
    {
        Console.WriteLine($"Error creating user: {ex.Message}");
    }
}
This article can help you get a quick start with the authentication and creation of the GraphClient.
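For reference, a minimal sketch of one way to build that graphClient with app-only (client credentials) authentication via MSAL; the tenant id, client id, and client secret are placeholders for your own app registration, which also needs the User.Invite.All application permission:

using System.Net.Http.Headers;
using Microsoft.Graph;
using Microsoft.Identity.Client;

// Confidential client for the daemon/app-only flow.
var confidentialClient = ConfidentialClientApplicationBuilder
    .Create("<client-id>")
    .WithTenantId("<tenant-id>")
    .WithClientSecret("<client-secret>")
    .Build();

// Attach an app-only token to every Graph request.
var graphClient = new GraphServiceClient(new DelegateAuthenticationProvider(async request =>
{
    var token = await confidentialClient
        .AcquireTokenForClient(new[] { "https://graph.microsoft.com/.default" })
        .ExecuteAsync();
    request.Headers.Authorization = new AuthenticationHeaderValue("Bearer", token.AccessToken);
}));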

Why does my query to Purchases.products in the Google Play Developer API take more than 40s to respond?

We use Purchases.products in the Google Play Developer API from our backend to validate purchases made by our Android client app.
This used to work fine, but now the processing of a valid purchase takes 20-40 seconds to complete. It still works, though.
My question is: What has changed to cause such behaviour, and is there a different approach to validate purchases that should be used now?
It is a very popular app that generates an average of 16k requests per day, so I've been thinking that maybe we are being throttled? If so, how do I find out?
I already updated to the latest version of the Google API library for .NET, and nothing changed.
The response is ~1 second if I pass an invalid receipt in the request.
The code is something similar to this:
public void ValidatePurchase(string productId, string receipt, string bundleId, string clientEmail, string privateKey)
{
    var credential = new ServiceAccountCredential(new ServiceAccountCredential.Initializer(clientEmail)
    {
        Scopes = new[] { AndroidPublisherService.Scope.Androidpublisher }
    }.FromPrivateKey(privateKey));

    // Create the service.
    var service = new AndroidPublisherService(initializer: new BaseClientService.Initializer
    {
        HttpClientInitializer = credential
    });

    service.Purchases.Products.Get(bundleId, productId, receipt).Execute();
}
It works, but takes 20-40 seconds to complete, which is unacceptable.
Edit: I changed my actual code to re-use the AndroidPublisherService instance as suggested in Chris's comment. As I kind of expected, it made no difference.
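For reference, the re-used instance now looks roughly like this (simplified; <clientEmail> and <privateKey> stand in for my real service-account values, and the usings assume the current AndroidPublisher client library):

using Google.Apis.Auth.OAuth2;
using Google.Apis.Services;
// plus the AndroidPublisher client library namespace, e.g. Google.Apis.AndroidPublisher.v3

public static class PlayValidator
{
    // Built once and shared; the Google API client services are meant to be re-used,
    // so this avoids rebuilding the credential and HTTP handler on every call.
    private static readonly AndroidPublisherService Service = new AndroidPublisherService(
        new BaseClientService.Initializer
        {
            HttpClientInitializer = new ServiceAccountCredential(
                new ServiceAccountCredential.Initializer("<clientEmail>")
                {
                    Scopes = new[] { AndroidPublisherService.Scope.Androidpublisher }
                }.FromPrivateKey("<privateKey>"))
        });

    public static void ValidatePurchase(string productId, string receipt, string bundleId)
    {
        Service.Purchases.Products.Get(bundleId, productId, receipt).Execute();
    }
}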
Also worth noting: when validating purchases from other apps, using the same service, even ones using the same account, the responses are as fast as they should be.

ABP real-time notification system background jobs

How can I get online users on the server side, so that a background job sends notifications only to users who are online?
Background job:
public async Task SendNotifications()
{
    await _backgroundJobManager.EnqueueAsync<NotificationJob, UserIdentifier>(
        new UserIdentifier(_session.TenantId, _session.UserId.Value),
        delay: TimeSpan.FromSeconds(3));
}

public override void Execute(UserIdentifier args)
{
    var notifications = _userNotificationManager.GetUserNotifications(args);
    Abp.Threading.AsyncHelper.RunSync(() => _realTimeNotifier.SendNotificationsAsync(notifications.ToArray()));
}
My job is on an API where I don't have my session with user info, so I want to get all online users, publish notifications only to them, and after that call my job to notify them.
Does Module Zero use Redis by default? As Alper said, "No".
Do I need to connect to SignalR on my API too? Adding app.MapSignalR(); to my Web API startup seems to work, but sometimes it doesn't, so I am missing something.
How can I get online users at the Web API (server side)? An alternative I found was to get all subscriptions (GetSubscriptionsAsync), since there I have my userId to send the notifications. var onlineClients = _onlineClients.GetAllClients(); is returning null (see the sketch below for what I am trying to end up with).
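What I am trying to end up with is roughly this sketch (assuming ABP's Abp.RealTime.IOnlineClientManager can be injected into the background job; the other injected services are the same ones used above):

using System.Linq;
using Abp;
using Abp.BackgroundJobs;
using Abp.Dependency;
using Abp.Notifications;
using Abp.RealTime;
using Abp.Threading;

public class NotificationJob : BackgroundJob<UserIdentifier>, ITransientDependency
{
    private readonly IOnlineClientManager _onlineClientManager;
    private readonly IUserNotificationManager _userNotificationManager;
    private readonly IRealTimeNotifier _realTimeNotifier;

    public NotificationJob(
        IOnlineClientManager onlineClientManager,
        IUserNotificationManager userNotificationManager,
        IRealTimeNotifier realTimeNotifier)
    {
        _onlineClientManager = onlineClientManager;
        _userNotificationManager = userNotificationManager;
        _realTimeNotifier = realTimeNotifier;
    }

    public override void Execute(UserIdentifier args)
    {
        // Skip users that have no live SignalR connection right now.
        if (!_onlineClientManager.IsOnline(args))
        {
            return;
        }

        var notifications = _userNotificationManager.GetUserNotifications(args);
        AsyncHelper.RunSync(() => _realTimeNotifier.SendNotificationsAsync(notifications.ToArray()));
    }
}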
Thanks

Finding Connection by UserId in SignalR

I have a webpage that uses ajax polling to get stock market updates from the server. I'd like to use SignalR instead, but I'm having trouble understanding how/if it would work.
ok, it's not really stock market updates, but the analogy works.
The SignalR examples I've seen send messages to either the current connection, all connections, or groups. In my example the stock updates happen outside of the current connection, so there's no such thing as the 'current connection'. And a user's account is associated with a few stocks, so sending a stock notification to all connections or to groups doesn't work either. I need to be able to find a connection associated with a certain userId.
Here's a fake code example:
foreach (var stock in StockService.GetStocksWithBigNews())
{
    var userIds = UserService.GetUserIdsThatCareAboutStock(stock);
    var connections = /* find connections associated with user ids */;
    foreach (var connection in connections)
    {
        connection.Send(...);
    }
}
In this question on filtering connections, they mention that I could keep current connections in memory but (1) it's bad for scaling and (2) it's bad for multi node websites. Both of these points are critically important to our current application. That makes me think I'd have to send a message out to all nodes to find users connected to each node >> my brain explodes in confusion.
THE QUESTION
How do I find a connection for a specific user that is scalable? Am I thinking about this the wrong way?
I created a little project last night to learn this as well. I used the 1.0 alpha and it was straightforward. I created a Hub and from there on it just worked :)
In my project I have N compute units (some servers processing work); when they start up they invoke ComputeUnitRegister:
await HubProxy.Invoke("ComputeUnitRegister", _ComputeGuid);
and every time they do something they call
HubProxy.Invoke("Running", _ComputeGuid);
where HubProxy is :
HubConnection Hub = new HubConnection(RoleEnvironment.IsAvailable
    ? RoleEnvironment.GetConfigurationSettingValue("SignalREndPoint")
    : "http://taskqueue.cloudapp.net/");
IHubProxy HubProxy = Hub.CreateHubProxy("ComputeUnits");
I used RoleEnvironment.IsAvailable because I can now run this as an Azure role, a console app, or whatever in .NET 4.5. The Hub is placed in an MVC4 website project and is started like this:
GlobalHost.Configuration.ConnectionTimeout = TimeSpan.FromSeconds(50);
RouteTable.Routes.MapHubs();
public class ComputeUnits : Hub
{
    public Task Running(Guid MyGuid)
    {
        return Clients.Group(MyGuid.ToString()).ComputeUnitHeartBeat(MyGuid,
            DateTime.UtcNow.ToEpochMilliseconds());
    }

    public Task ComputeUnitRegister(Guid MyGuid)
    {
        Groups.Add(Context.ConnectionId, "ComputeUnits").Wait();
        return Clients.Others.ComputeUnitCameOnline(new { Guid = MyGuid,
            HeartBeat = DateTime.UtcNow.ToEpochMilliseconds() });
    }

    public void SubscribeToHeartBeats(Guid MyGuid)
    {
        Groups.Add(Context.ConnectionId, MyGuid.ToString());
    }
}
My clients are JavaScript clients that have methods for these calls (let me know if you need to see the code for this as well). Basically they listen for ComputeUnitCameOnline, and when it fires they call SubscribeToHeartBeats on the server. This means that whenever the server compute unit is doing some work it will call Running, which will trigger a ComputeUnitHeartBeat on the JavaScript clients.
I hope you can use this to see how groups and connections can be used. And last, it is also scaled out over multiple Azure roles by adding a few lines of code:
GlobalHost.HubPipeline.EnableAutoRejoiningGroups();
GlobalHost.DependencyResolver.UseServiceBus(
    serviceBusConnectionString,
    2,
    3,
    GetRoleInstanceNumber(),
    topicPathPrefix /* the prefix applied to the name of each topic used */
);
You can get the connection string for the Service Bus on Azure; remember the Provider=SharedSecret part. When you add the NuGet package, the connection string syntax is also pasted into your web.config.
2 is how many topics to split it across. Topics can contain 1 GB of data, so depending on performance you can increase it.
3 is the number of nodes to split it out on. I used 3 because I have 2 Azure instances and my localhost. You can get the role instance number like this (note that I hard-coded my localhost to 2):
private static int GetRoleInstanceNumber()
{
    if (!RoleEnvironment.IsAvailable)
        return 2;
    var roleInstanceId = RoleEnvironment.CurrentRoleInstance.Id;
    var li1 = roleInstanceId.LastIndexOf(".");
    var li2 = roleInstanceId.LastIndexOf("_");
    var roleInstanceNo = roleInstanceId.Substring(Math.Max(li1, li2) + 1);
    return Int32.Parse(roleInstanceNo);
}
You can see it all live at : http://taskqueue.cloudapp.net/#/compute-units
When using SignalR, after a client has connected to the server it is given a connection ID (this is essential for providing real-time communication). Yes, this is stored in memory, but SignalR can also be used in multi-node environments. You can use the Redis or even the SQL Server backplane (more to come), for example. So, long story short, we take care of your scale-out scenarios for you via backplanes/service buses without you having to worry about it.
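To tie this back to the stock example in the question, a minimal sketch of the per-user-group idea (StockService and UserService are the fake services from the question, and it assumes authenticated users so the user id is available as Context.User.Identity.Name): add each connection to a group named after its user id in OnConnected, then send to that group from wherever the news is detected; the backplane handles connections living on other nodes.

using System.Threading.Tasks;
using Microsoft.AspNet.SignalR;

public class StockHub : Hub
{
    public override Task OnConnected()
    {
        // One group per user id; all of a user's connections, on any node, end up in it.
        Groups.Add(Context.ConnectionId, Context.User.Identity.Name);
        return base.OnConnected();
    }
}

// From outside the hub (e.g. the code that notices the big stock news):
var hubContext = GlobalHost.ConnectionManager.GetHubContext<StockHub>();
foreach (var stock in StockService.GetStocksWithBigNews())
{
    foreach (var userId in UserService.GetUserIdsThatCareAboutStock(stock))
    {
        hubContext.Clients.Group(userId).stockUpdated(stock);
    }
}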
