How to connect to EventHubs with Apache-Kafka extension using Azure Functions? - c#

Can I use SharedAccessKey to connect to the broker (EventHubs)?
I'm unable to connect to my Azure EventHubs.
We use a SharedAccessKey instead of SSL certificates to connect, and I have this configuration for it.
"EventBusConfig": {
"BootstrapServers": "anyname.servicebus.windows.net:9093",
"SecurityProtocol": "SaslSsl",
"SaslMechanism": "Plain",
"SaslUsername": "$ConnectionString",
"SaslPassword":
"Endpoint=sb://anyname.servicebus.windows.net/;SharedAccessKeyName=anyname.;SharedAccessKey=CtDbJ/Kfjs749
8s--anypassword--SkSk749/z2Z5Fr9///33/qQ+R6Cyg=",
"SocketTimeoutMs": "60000",
"SessionTimeoutMs": "30000",
"GroupId": "NameOfTheGroup",
"AutoOffsetReset": "Earliest",
"BrokerVersionFallback": "1.0.0",
"Debug": "cgrp"
}
But it seems I need the certificate path (the .pem file).
I want to produce a simple message like this
I'm using https://github.com/Azure/azure-functions-kafka-extension but I don't know if this beta library can handle SharedAccessKey.
I got this error when trying to connect:
Any help will be appreciated

I was able to produce and consume messages using the extension "https://github.com/Azure/azure-functions-kafka-extension".
Consuming a message was easy thanks to the very intuitive "EventHubConnectionString" property.
To produce a message you need to configure a CA certificate. I thought I needed to get this from Azure, but I was wrong; I just followed these instructions to make it work.
Download and set the CA certificate location. As described in the Confluent documentation, the .NET library does not have the capability to access root CA certificates.
Missing this step will cause your function to raise the error "sasl_ssl://xyz-xyzxzy.westeurope.azure.confluent.cloud:9092/bootstrap: Failed to verify broker certificate: unable to get local issuer certificate (after 135ms in state CONNECT)"
To overcome this, we need to:
Download CA certificate (i.e. from https://curl.haxx.se/ca/cacert.pem).
Rename the certificate file to anything other than cacert.pem to avoid any conflict with the existing EventHubs Kafka certificate that is part of the extension.
Include the file in the project, setting "copy to output directory"
Set the SslCaLocation trigger attribute property. In the example, we set it to confluent_cloud_cacert.pem.
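The "copy to output directory" step can also be expressed directly in the .csproj; a minimal sketch, assuming the certificate sits in the project root under the file name used above:

```xml
<ItemGroup>
  <!-- Ship the CA bundle next to the compiled function so SslCaLocation can find it -->
  <None Update="confluent_cloud_cacert.pem">
    <CopyToOutputDirectory>PreserveNewest</CopyToOutputDirectory>
  </None>
</ItemGroup>
```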
This is my producer Azure function with Kafka binding
using System.IO;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.AspNetCore.Http;
using Microsoft.Extensions.Logging;
using Newtonsoft.Json;
using Microsoft.Azure.WebJobs.Extensions.Kafka;
namespace EY.Disruptor.AzureFunctionsWithKafka
{
public static class Function
{
[FunctionName("Producer")]
public static async Task<IActionResult> Run(
[HttpTrigger(AuthorizationLevel.Function, "get", "post", Route = null)] HttpRequest req,
[Kafka("BootstrapServer",
"topic.event",
Username = "ConfluentCloudUsername",
Password = "ConfluentCloudPassword",
SslCaLocation = "confluent_cloud_cacert.pem",
AuthenticationMode = BrokerAuthenticationMode.Plain,
Protocol = BrokerProtocol.SaslSsl
)] IAsyncCollector<KafkaEventData<string>> events,
ILogger log)
{
log.LogInformation("C# HTTP trigger function processed a request.");
string name = req.Query["name"];
string requestBody = await new StreamReader(req.Body).ReadToEndAsync();
dynamic data = JsonConvert.DeserializeObject(requestBody);
name ??= data?.name;
string responseMessage = string.IsNullOrEmpty(name)
? "This HTTP triggered function executed successfully. Pass a name in the query string or in the request body for a personalized response."
: $"Hello, {name}. This HTTP triggered function executed successfully.";
var kafkaEvent = new KafkaEventData<string>()
{
Value = name
};
await events.AddAsync(kafkaEvent);
return new OkObjectResult(responseMessage);
}
}
}
This is my consumer Azure function with Kafka binding
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Kafka;
using Microsoft.Extensions.Logging;
namespace EY.Disruptor.AzureFunctionsWithKafka
{
public static class Consumer
{
[FunctionName("FunctionKafkaConsumer")]
public static void Run(
[KafkaTrigger("BootstrapServer",
"topic.event",
Username = "ConfluentCloudUsername",
Password = "ConfluentCloudPassword",
EventHubConnectionString = "ConfluentCloudPassword",
AuthenticationMode = BrokerAuthenticationMode.Plain,
Protocol = BrokerProtocol.SaslSsl,
ConsumerGroup = "Group1")] KafkaEventData<string>[] kafkaEvents,
ILogger logger)
{
foreach (var kafkaEvent in kafkaEvents)
{
logger.LogInformation(kafkaEvent.Value);
}
}
}
}
This is my local.settings.json
{
"IsEncrypted": false,
"Values": {
"AzureWebJobsStorage": "UseDevelopmentStorage=true",
"FUNCTIONS_WORKER_RUNTIME": "dotnet",
"BootstrapServer": "zyxabc.servicebus.windows.net:9093",
"ConfluentCloudUsername": "$ConnectionString",
"ConfluentCloudPassword": "Endpoint=sb://zyxabc.servicebus.windows.net/;SharedAccessKeyName=TestSvc;SharedAccessKey=YAr/="
}
}
And of course the initialization in the Startup.cs
public void Configure(IWebJobsBuilder builder)
{
builder.AddKafka();
}
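For context, the Configure method above lives in a WebJobs startup class; a minimal sketch of the whole file (the class and namespace names are assumptions, chosen to match the project above):

```csharp
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Kafka;
using Microsoft.Azure.WebJobs.Hosting;

[assembly: WebJobsStartup(typeof(EY.Disruptor.AzureFunctionsWithKafka.Startup))]
namespace EY.Disruptor.AzureFunctionsWithKafka
{
    public class Startup : IWebJobsStartup
    {
        public void Configure(IWebJobsBuilder builder)
        {
            // Registers the Kafka extension so the [Kafka]/[KafkaTrigger] bindings resolve
            builder.AddKafka();
        }
    }
}
```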
I hope this recommendation helps other people :)

Related

Azure Function Post C# not executing with big body

I'm currently implementing an Azure Function App that exposes a few Functions (mostly gets).
The following code seems to have an issue:
using System;
using System.Collections.Specialized;
using System.Net;
using Microsoft.Azure.Functions.Worker;
using Microsoft.Azure.Functions.Worker.Http;
using Microsoft.Extensions.Logging;
using System.Net.Http;
using InternalVacanciesAzureFunction.Model;
using Microsoft.Extensions.Options;
using System.Text.Json;
using System.Collections.Generic;
namespace IVAFunction
{
public class PostFunction
{
private readonly HttpClient _httpClient;
public PostFunction(IHttpClientFactory httpClientFactory)
{
_httpClient = httpClientFactory.CreateClient();
}
[Function("postFunction")]
public HttpResponseData Run([HttpTrigger(AuthorizationLevel.Anonymous, "post", "put")] HttpRequestData req,
FunctionContext executionContext)
{
ILogger logger = executionContext.GetLogger("PostFunction");
logger.LogError("Code hit: PostFunction.cs");
HttpResponseData response = req.CreateResponse();
string body = new System.IO.StreamReader(req.Body).ReadToEnd();
JsonResponse data = Nfunction.postFunction(_httpClient, "/PostFunction", body, logger, requestPrincipalName);
if (data.responseType.Equals(ResponseType.OK))
{
response.StatusCode = HttpStatusCode.OK;
}
else
{
response.StatusCode = HttpStatusCode.InternalServerError;
}
response.Headers.Add("Content-Type", "application/json; charset=utf-8");
response.WriteString(data.json);
return response;
}
}
}
The data that is posted to this function is JSON including a BASE64-encoded string in two of the fields. The max size for each of those two fields is 1.5 MB. Every time I post something small, e.g. 2 x 400 B, everything goes fine. But when I send something like 2 x 900 kB, the logging shows up like this:
2021-12-07T07:42:28.455 [Debug] Request successfully matched the route with name 'postFunction' and template 'api/postFunction'
2021-12-07T07:42:29.129 [Information] Executing 'Functions.postFunction' (Reason='This function was programmatically called via the host APIs.', Id=xxxxxxxxxxxxxxxxxxxxxxxxxxxxx)
The "Code hit" logger code is never hit and after a while the function times out.
Anyone having a clue what is going on? I can reproduce the issue on both my local dev environment as well on actual Azure.
You've got a lot of missing directives or assembly references there; apart from that, the last } is unnecessary.
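One other thing worth trying (an assumption based on the symptoms, not a confirmed diagnosis): the function reads the request body with a synchronous `new StreamReader(req.Body).ReadToEnd()`, and synchronous reads of the request stream can block on large payloads. Switching to an asynchronous read is a cheap experiment; a sketch:

```csharp
using System.IO;
using System.Text;
using System.Threading.Tasks;

public static class BodyReader
{
    // Reads the whole request body without blocking the worker thread,
    // unlike StreamReader.ReadToEnd() which performs a synchronous read.
    public static async Task<string> ReadBodyAsync(Stream body)
    {
        using var reader = new StreamReader(body, Encoding.UTF8);
        return await reader.ReadToEndAsync();
    }
}
```

In the function this would replace the ReadToEnd() line with `string body = await BodyReader.ReadBodyAsync(req.Body);`, which also means making Run async and returning Task&lt;HttpResponseData&gt;.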

Why is my header data missing from my Azure Function Http Trigger in .Net 5 when calling from HttpClient.GetAsync

I have a client using HttpClient.GetAsync to call into a Azure Function Http Trigger in .Net 5.
When I call the function using PostMan, I get my custom header data.
However, when I try to access my response object (HttpResponseMessage) returned from HttpClient.GetAsync, my header data is empty.
I have my Content data and my Status Code, but my custom header data is missing.
Any insight would be appreciated, since I have been looking at this for hours.
Thanks for your help.
Edit: Here is the code where I am making the http call:
public async Task<HttpResponseMessage> GetQuotesAsync(int? pageNo, int? pageSize, string searchText)
{
var requestUri = $"{RequestUri.Quotes}?pageNo={pageNo}&pageSize={pageSize}&searchText={searchText}";
return await _httpClient.GetAsync(requestUri);
}
Edit 8/8/2021: See my comment below. The issue has something to do with using Blazor Wasm Client.
For anyone having problems after following the tips on this page, go back and check the configuration in the host.json file. You need Access-Control-Expose-Headers set to * or the headers won't be sent even if you add them. Note: I added the "extensions" node below and removed my logging settings for clarity.
host.json (sample file):
{
"version": "2.0",
"extensions": {
"http": {
"customHeaders": {
"Access-Control-Expose-Headers": "*"
}
}
}
}
This is because HttpResponseMessage's Headers property data type is HttpResponseHeaders, but HttpResponseData's Headers property data type is HttpHeadersCollection. Since they are different, HttpResponseHeaders could not bind to HttpHeadersCollection while calling HttpClient.GetAsync (as it returns HttpResponseMessage).
I could not find a way to read HttpHeadersCollection through HttpClient.
As long as your Azure function code is emitting the header value, you should be able to read that in your client code from the Headers collection of HttpResponseMessage. Nothing in your azure function (which is your remote endpoint you are calling) makes it any different. Remember, your client code has no idea how your remote endpoint is implemented. Today it is azure functions, tomorrow it may be a full blown aspnet core web api or a REST endpoint written in Node.js. Your client code does not care. All it cares is whether the Http response it received has your expected header.
Assuming you have an Azure Function like this, where you are adding a header called total-count to the response.
[Function("quotes")]
public static async Task<HttpResponseData> RunAsync(
[HttpTrigger(AuthorizationLevel.Anonymous, "get", "post")] HttpRequestData req,
FunctionContext executionContext)
{
var logger = executionContext.GetLogger("Quotes");
logger.LogInformation("C# HTTP trigger function processed a request for Quotes.");
var quotes = new List<Quote>
{
new Quote { Text = "Hello", ViewCount = 100},
new Quote { Text = "Azure Functions", ViewCount = 200}
};
var response = req.CreateResponse(HttpStatusCode.OK);
response.Headers.Add("total-count", quotes.Count.ToString());
await response.WriteAsJsonAsync(quotes);
return response;
}
Your existing client code should work as long as you read the Headers property.
public static async Task<HttpResponseMessage> GetQuotesAsync()
{
var requestUri = "https://shkr-playground.azurewebsites.net/api/quotes";
return await _httpClient.GetAsync(requestUri);
}
Now your GetQuotesAsync method can be called somewhere else where you will use the return value of it (HttpResponseMessage instance) and read the headers. In the below example, I am reading that value and adding to a string variable. HttpResponseMessage implements IDisposable. So I am using a using construct to implicitly call the Dispose method.
var msg = "Total count from response headers:";
using (var httpResponseMsg = await GetQuotesAsync())
{
if (httpResponseMsg.Headers.TryGetValues("total-count", out var values))
{
msg += values.FirstOrDefault();
}
}
// TODO: use "msg" variable as needed.
The types which Azure function uses for dealing with response headers is more of an implementation concern of azure functions. It has no impact on your client code where you are using HttpClient and HttpResponseMessage. Your client code is simply dealing with standard http call response (response headers and body)
The issue is not with Blazor WASM, but rather whether that header has been exposed on your API side. In your Azure Function, add the following -
Note: Postman will still show the headers even if you don't expose them as below. That's because Postman doesn't care about CORS headers. CORS is just a browser concept, not a strong security mechanism; it allows you to restrict which other web apps may use your backend resources, and that's all.
First create a Startup File to inject the HttpContextAccessor
Package Required: Microsoft.Azure.Functions.Extensions
using Microsoft.AspNetCore.Http;
using Microsoft.Azure.Functions.Extensions.DependencyInjection;
using Microsoft.Extensions.DependencyInjection;
[assembly: FunctionsStartup(typeof(FuncAppName.Startup))]
namespace FuncAppName
{
public class Startup : FunctionsStartup
{
public override void Configure(IFunctionsHostBuilder builder)
{
builder.Services.AddScoped<HttpContextAccessor>();
}
}
}
Next, inject it into your main Function -
using System.Threading.Tasks;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
namespace FuncAppName
{
public class SomeFunction
{
private readonly HttpContext _httpContext;
public SomeFunction(HttpContextAccessor contextAccessor)
{
_httpContext = contextAccessor.HttpContext;
}
[FunctionName("SomeFunc")]
public override Task<IActionResult> Run([HttpTrigger(AuthorizationLevel.Anonymous, new[] { "post" }, Route = "run")] HttpRequest req)
{
var response = "Some Response"
_httpContext.Response.Headers.Add("my-custom-header", "some-custom-value");
_httpContext.Response.Headers.Add("my-other-header", "some-other-value");
_httpContext.Response.Headers.Add("Access-Control-Expose-Headers", "my-custom-header, my-other-header");
return new OkObjectResult(response)
}
If you want to allow all headers you can use a wildcard (I think, not tested) -
_httpContext.Response.Headers.Add("Access-Control-Expose-Headers", "*");
You still need to add your web-app URL to the Azure platform CORS settings. You can add the * wildcard; more info here - https://iotespresso.com/allowing-all-cross-origin-requests-azure-functions/
To enable CORS for local apps during development - https://stackoverflow.com/a/60109518/9276081
Now, to access those headers in your Blazor WASM app, you can for example -
protected override async Task OnInitializedAsync()
{
var content = JsonContent.Create(new { query = "" });
using (var client = new HttpClient())
{
var result = await client.PostAsync("https://func-app-name.azurewebsites.net/api/run", content);
var headers = result.Headers.ToList();
}
}

create a web service which uses Azure Functions (HTTP Trigger) to Post images to a Azure Blob Container and return the URI of the same image using SAS

I'm creating an Azure web service in C# which will use Azure Functions to post an image (selfies) to a storage account using an HTTP trigger request. There will be two containers in the blob storage account: one for new user registration and another for existing users. When a new user takes a selfie, it will post that image to the registration container; if it is an existing user, the image will be posted to the other container. After this, the web service will return the URI of the posted image using a Shared Access Signature.
I created a cloud solution in visual studio 2017 (.NET Framework) using the HTTP trigger request.
using System.Linq;
using System.Net;
using System.Net.Http;
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.Azure.WebJobs.Host;
namespace KeolisKlockApp
{
public static class Function1
{
[FunctionName("Function1")]
public static async Task<HttpResponseMessage> Run([HttpTrigger(AuthorizationLevel.Anonymous, "get", "post", Route = null)]HttpRequestMessage req, TraceWriter log)
{
log.Info("C# HTTP trigger function processed a request.");
// parse query parameter
string name = req.GetQueryNameValuePairs()
.FirstOrDefault(q => string.Compare(q.Key, "name", true) == 0)
.Value;
if (name == null)
{
// Get request body
dynamic data = await req.Content.ReadAsAsync<object>();
name = data?.name;
}
return name == null
? req.CreateResponse(HttpStatusCode.BadRequest, "Please pass a name on the query string or in the request body")
: req.CreateResponse(HttpStatusCode.OK, "Hello " + name);
}
}
}
I expect the return of the web service will be a URI of the posted image in azure blob which uses a shared access signature.
Update:
local.settings.json (note: replace the value of AzureWebJobsStorage with your Azure Storage connection string):
{
"IsEncrypted": false,
"Values": {
"AzureWebJobsStorage": "DefaultEndpointsProtocol=https;AccountName=your_storage_account;AccountKey=your_storage_account_key;BlobEndpoint=https://xx.blob.core.windows.net/;TableEndpoint=https://xxx.table.core.windows.net/;QueueEndpoint=https://xx.queue.core.windows.net/;FileEndpoint=https://xx.file.core.windows.net/",
"FUNCTIONS_WORKER_RUNTIME": "dotnet"
}
}
For host.json, just leave it as default, like below:
{
"version": "2.0"
}
Please use the code below for your testing purposes (it's just simple code, and you can modify it to meet your needs):
In Visual Studio, create an HttpTrigger Azure Function, then use the code below:
using System;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.Extensions.Logging;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Auth;
using Microsoft.WindowsAzure.Storage.Blob;
public static class Function1
{
[FunctionName("Function1")]
public static async Task<IActionResult> Run(
[HttpTrigger(AuthorizationLevel.Function, "get", "post", Route = null)] HttpRequest req,
ILogger log)
{
log.LogInformation("C# HTTP trigger function processed a request.");
string name = req.Query["name"];
string account_name = "your_storage_account_name";
string account_key = "your_storage_account_key";
string container_name = "your_container_name";
string blob_name = name;
CloudStorageAccount storageAccount = new CloudStorageAccount(new StorageCredentials(account_name, account_key), true);
CloudBlobClient client = storageAccount.CreateCloudBlobClient();
CloudBlobContainer blobContainer = client.GetContainerReference(container_name);
CloudBlockBlob myblob = blobContainer.GetBlockBlobReference(blob_name);
await myblob.UploadFromStreamAsync(req.Body);
string blobSasUrl = GetBlobSasUri(blobContainer, blob_name, null);
Console.WriteLine(blobSasUrl);
return name != null
? (ActionResult)new OkObjectResult($"Hello, {name}")
: new BadRequestObjectResult("Please pass a name on the query string or in the request body");
}
private static string GetBlobSasUri(CloudBlobContainer container, string blobName, string policyName = null)
{
string sasBlobToken;
CloudBlockBlob blob = container.GetBlockBlobReference(blobName);
if (policyName == null)
{
SharedAccessBlobPolicy adHocSAS = new SharedAccessBlobPolicy()
{
SharedAccessExpiryTime = DateTime.UtcNow.AddHours(24),
Permissions = SharedAccessBlobPermissions.Read | SharedAccessBlobPermissions.Write | SharedAccessBlobPermissions.Create
};
sasBlobToken = blob.GetSharedAccessSignature(adHocSAS);
Console.WriteLine("SAS for blob (ad hoc): {0}", sasBlobToken);
Console.WriteLine();
}
else
{
sasBlobToken = blob.GetSharedAccessSignature(null, policyName);
Console.WriteLine("SAS for blob (stored access policy): {0}", sasBlobToken);
Console.WriteLine();
}
// Return the URI string for the container, including the SAS token.
return blob.Uri + sasBlobToken;
}
}
Then click the Run button in Visual Studio to run the code locally. Here I use Postman for the test; you can see the test result as below: the blob SAS URL is generated, and the file is uploaded to Azure Blob Storage:
If you don't know how to use Postman to test, please follow the steps below:
1. Click the Run button in Visual Studio, then wait for a while; in the command prompt there is a URL which you need in Postman:
2. Open Postman, paste the URL generated above into Postman and select the HTTP method POST, then in the Params section fill in a key-value pair like name and its value:
3. Then under Body -> click the binary radio button -> click Select File and select a file you want to upload. At last, click the Send button.

send an email from azure functions when a process finishes

I have a device that sends data in text form to a blob on Azure. Once the blob receives the data, it triggers an Azure Function, which is basically an executable file made from C++ code; when it finishes, it generates another text file which is stored in another blob.
It is a very simple operation. But now I would like to receive an email each time the function goes through successfully. I have searched the web, but the tutorials are very confusing or do not address this simple task.
I developed the executable file with C++, but I inherited the Azure Function from someone else and I have zero experience with Azure (I am an electrical engineer, not a computer scientist). The Azure Function is written in C#; I just need a guide.
Thank you in advance!!
You can add a SendGrid output binding to your C# Azure Function. The binding in function.json would look something like this:
{
"name": "mail",
"type": "sendGrid",
"direction": "out",
"apiKey" : "MySendGridKey"
}
and function body like this:
#r "SendGrid"
using SendGrid.Helpers.Mail;
public static void Run(string input, out string yourExistingOutput, out Mail message)
{
// Do the work you already do
message = new Mail
{
Subject = "Your Subject"
};
var personalization = new Personalization();
personalization.AddTo(new Email("recipient@contoso.com"));
Content content = new Content
{
Type = "text/plain",
Value = "Email Body"
};
message.AddContent(content);
message.AddPersonalization(personalization);
}
Read about SendGrid and SendGrid bindings.
I had a similar problem which Mikhail's solution helped me solve. In my case I wanted the static Run method to be asynchronous, which meant I couldn't use the out parameter modifier. My solution is slightly different, as it is a timer trigger and was implemented using Visual Studio and the NuGet package Microsoft.Azure.WebJobs.Extensions.SendGrid v2.1.0.
[FunctionName("MyFunction")]
public static async Task Run(
[TimerTrigger("%TimerInterval%")]TimerInfo myTimer,
[SendGrid] IAsyncCollector<Mail> messages,
TraceWriter log)
{
log.Info($"C# Timer trigger function started execution at: {DateTime.Now}");
// Do the work you already do...
log.Info($"C# Timer trigger function finished execution at: {DateTime.Now}");
var message = new Mail();
message.From = new Email("from@email.com");
var personalization = new Personalization();
personalization.AddTo(new Email("to@email.com"));
personalization.Subject = "Azure Function Executed Successfully";
message.AddPersonalization(personalization);
var content = new Content
{
Type = "text/plain",
Value = $"Function ran at {DateTime.Now}",
};
message.AddContent(content);
await messages.AddAsync(message);
}
This solution used Zain Rivzi's answer to How can I bind output values to my async Azure Function?
and the SendGrid Web API v3 quick start guide.
The answer can be slightly simplified:
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;
using SendGrid.Helpers.Mail;
public static class ExtractArchiveBlob
{
[FunctionName("MyFunction")]
public static async Task RunAsync(string input,
[SendGrid(ApiKey = "SendGridApiKey")]
IAsyncCollector<SendGridMessage> messageCollector)
{
var message = new SendGridMessage();
message.AddContent("text/plain", "Example content");
message.SetSubject("Example subject");
message.SetFrom("from@email.com");
message.AddTo("to@email.com");
await messageCollector.AddAsync(message);
}
}
Where SendGridApiKey is the app setting holding your Send Grid api key.
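For local development, the SendGridApiKey setting referenced by the attribute would live in local.settings.json. A hypothetical example (the key name matches the ApiKey property above; the value is a placeholder, not a real key):

```json
{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "FUNCTIONS_WORKER_RUNTIME": "dotnet",
    "SendGridApiKey": "SG.your-sendgrid-api-key"
  }
}
```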

ClaimsPrincipal.Current.Identity.Name Empty when authenticated from client, fine in browser

I have the following Azure Function,
#r "Newtonsoft.Json"
using Newtonsoft.Json.Linq;
using System.Net;
using System.Security.Claims;
public static async Task<HttpResponseMessage> Run(HttpRequestMessage req, TraceWriter log)
{
try
{
JObject pJOtClaims = new JObject();
foreach(Claim curClaim in ClaimsPrincipal.Current.Identities.First().Claims)
{
pJOtClaims.Add(curClaim.Type, new JValue(curClaim.Value));
}
return(req.CreateResponse(HttpStatusCode.OK, $"{pJOtClaims.ToString(Newtonsoft.Json.Formatting.None)}"));
}
catch(Exception ex)
{
return(req.CreateResponse(HttpStatusCode.OK, $"{ex.Message}"));
}
}
I have configured only Facebook authentication for this Function App. This function works for both in-browser and client authentication. When I invoke this method in browser I get a whole bunch of claims, including my registered Facebook email address. When I invoke this from client authentication, I get the following claims,
{
"stable_sid":"...",
"http://schemas.xmlsoap.org/ws/2005/05/identity/claims/nameidentifier":"...",
"http://schemas.microsoft.com/identity/claims/identityprovider":"...",
"ver":"...",
"iss":"...",
"aud":"...",
"exp":"...",
"nbf":"..."
}
Unfortunately none of these include my Facebook email address which I need. I have enabled the "email" scope for the Facebook authentication configuration. Any ideas how to get this?
Nick.
Okay, so I haven't found the exact solution I wanted, but this should get me by. Technically I only need the email address during registration; after that I can just use the stable_sid, as it is part of the identity I do get. So what I have done is pass the x-zumo-auth header on to the ".auth/me" endpoint and get the property I need. I'm using this method:
public static async Task<String> GetAuthProviderParam(String iAuthMeURL,
String iXZumoAUth,
String iParamKey)
{
using (HttpClient pHCtClient = new HttpClient())
{
pHCtClient.DefaultRequestHeaders.Add("x-zumo-auth", iXZumoAUth);
String pStrResponse = await pHCtClient.GetStringAsync(iAuthMeURL);
JObject pJOtResponse = JObject.Parse(pStrResponse.Trim(new Char[] { '[', ']' }));
if(pJOtResponse[iParamKey] != null)
{
return (pJOtResponse[iParamKey].Value<String>());
}
else
{
throw new KeyNotFoundException(String.Format("A parameter with the key '{0}' was not found.", iParamKey));
}
}
}
This can be called in the function like so,
if(req.Headers.Contains("x-zumo-auth"))
{
String pStrXZumoAuth = req.Headers.GetValues("x-zumo-auth").First();
String pStrParam = await FunctionsHelpers.GetAuthProviderParam("https://appname.azurewebsites.net/.auth/me",
pStrXZumoAuth,
"user_id");
//pStrParam = user_id
}
