I created an Azure Function which uses a package that needs a properties file in order to work. The way the package is implemented, I need to pass the property file location to it.
ConfigFactory.setConfigLocation("SomeFolder/AnotherFolder/connector.properties");
When I run this locally it works just fine and everything behaves as intended. However, when I publish it to Azure, it tells me the file URL is invalid:
[Error] Invalid url to location ]SomeFolder/AnotherFolder/connector.properties[ errorMessage :no protocol: SomeFolder/AnotherFolder/connector.properties file
Am I missing something? How come this doesn't just work on Azure?
To refer to a file in an Azure Function, you have two options.
Using ExecutionContext:
[FunctionName("MyNewHTTPAzureFunction")]
public static async Task<IActionResult> Run([HttpTrigger(AuthorizationLevel.Anonymous, "get", "post", Route = null)] HttpRequest req, ILogger log, ExecutionContext context)
{
log.LogInformation("C# HTTP trigger function processed a request.");
string name = req.Query["name"];
string requestBody = await new StreamReader(req.Body).ReadToEndAsync();
dynamic data = JsonConvert.DeserializeObject(requestBody);
name = name ?? data?.name;
string responseMessage = string.IsNullOrEmpty(name) ? "This HTTP triggered function executed successfully. Pass a name in the query string or in the request body for a personalized response." : $"Hello, {name}. This HTTP triggered function executed successfully.";
// This is how you can access the current directory with the help of context.FunctionAppDirectory
string localFile = Path.Combine(context.FunctionAppDirectory, "Data", "Your_file_with_extension");
string readLocalFile = File.ReadAllText(localFile);
// ... your business needs
return new OkObjectResult(responseMessage);
}
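Applied to the original question, a minimal sketch in the spirit of the answer above (assuming the SomeFolder/AnotherFolder files are deployed with the function app and copied to the output directory; ConfigFactory.setConfigLocation is the package call from the question):
// Build an absolute path to the properties file from the function app root
// instead of passing a relative path (which has no protocol/base on Azure).
string propertiesPath = Path.Combine(context.FunctionAppDirectory, "SomeFolder", "AnotherFolder", "connector.properties");
ConfigFactory.setConfigLocation(propertiesPath);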
Using Path.GetDirectoryName:
This lets you access the directory of the Azure Function when you don't have access to the ExecutionContext.
You can read the local file in your Startup class like below.
var binPath = Path.GetDirectoryName(Assembly.GetExecutingAssembly().Location);
var rootPath = Path.GetFullPath(Path.Combine(binPath, ".."));
// Read the file
File.ReadAllText(rootPath + "/path/to/filename_with_extension");
Azure Function Access To The Path Is Denied
To avoid the Access Denied error in Azure Functions, always try to keep your files in D:\home\site\wwwroot (the root directory), which is exactly where the Azure Function resides.
Refer here for more information & the doc for the binding expression pattern.
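As a hedged illustration of the binding expression pattern mentioned above (the function name, container, and route here are made-up placeholders, not part of the question), a blob input binding can resolve part of the blob path from the HTTP route:
// Hypothetical example: {name} from the route is substituted into the blob path
// by the binding expression, and the blob content is bound directly to a string.
[FunctionName("ReadConfigBlob")]
public static IActionResult ReadConfigBlob(
    [HttpTrigger(AuthorizationLevel.Function, "get", Route = "config/{name}")] HttpRequest req,
    [Blob("config-container/{name}", FileAccess.Read)] string blobContent,
    ILogger log)
{
    log.LogInformation("Read {Length} characters from the bound blob.", blobContent?.Length ?? 0);
    return new OkObjectResult(blobContent);
}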
Related
I have created an Azure Function (HTTP triggered with OpenAPI) and deployed it to Azure. The endpoints work fine, and I get a response when testing from Postman.
The Swagger UI also loads. But when trying to post from Swagger, it keeps saying 401 Unauthorized, even though I have copied the function key from the Azure portal (screenshot below) and specified it in the Authorize popup within Swagger.
It still says unauthorized.
When I copy the URL from portal for the http endpoint, it looks like this
https://myurls-asev3.appserviceenvironment.net/api/ObjectRead?code=mycode
Here mycode is exactly the same key I copied from the function keys. The only difference is that the code is attached as a query string when I copy the URL from the portal,
while Swagger sends it as a header.
And in the function configuration I designed it to accept the key as a header.
[FunctionName("ObjectRead")]
[OpenApiOperation(operationId: "Run", tags: new[] { "name" })]
[OpenApiSecurity("function_key", SecuritySchemeType.ApiKey, Name = "code", In = OpenApiSecurityLocationType.Header)]
[OpenApiRequestBody("application/json", typeof(FileDetails))]
[OpenApiResponseWithBody(statusCode: HttpStatusCode.OK, contentType: "application/json", bodyType: typeof(string), Description = "The OK response")]
public async Task<IActionResult> Run(
[HttpTrigger(AuthorizationLevel.Function, "post", Route = null)] HttpRequest req)
{
_logger.LogInformation($"ObjectRead function triggered at {DateTime.UtcNow.ToString("O")}.");
var requestBody = await new StreamReader(req.Body).ReadToEndAsync().ConfigureAwait(false);
var data = JsonConvert.DeserializeObject<FileDetails>(requestBody);
var responseMessage = await _objectReadService.ReadAsync(data.FileName, data.FilePath).ConfigureAwait(false);
_logger.LogInformation($"ObjectRead function completed at {DateTime.UtcNow.ToString("O")}.");
return responseMessage.Length != 0 ? new FileStreamResult(responseMessage, "application/octet-stream") { FileDownloadName = data.FileName} : new NotFoundObjectResult("Unable to retrieve the file or File not found.");
}
In the third line of the code snippet above (the OpenApiSecurity attribute), I have specified the key as a header.
So why is it not working in Swagger, and why does the code still appear as a query string in the URL copied from the portal?
From the documentation:
https://<APP_NAME>.azurewebsites.net/api/<FUNCTION_NAME>?code=<API_KEY>
The key can be included in a query string variable named code, as above. It can also be included in an x-functions-key HTTP header. The value of the key can be any function key defined for the function, or any host key.
So your attribute should look like this:
[OpenApiSecurity("function_key", SecuritySchemeType.ApiKey, Name = "x-functions-key", In = OpenApiSecurityLocationType.Header)]
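As a quick way to verify the header-based key outside Swagger, here is a minimal sketch of calling the function from HttpClient with the key in the x-functions-key header (the URL, key, and JSON body values are placeholders):
// Placeholder values; substitute your real function URL, key, and payload.
using var client = new HttpClient();
var request = new HttpRequestMessage(HttpMethod.Post, "https://<APP_NAME>.azurewebsites.net/api/ObjectRead");
request.Headers.Add("x-functions-key", "<API_KEY>");
request.Content = new StringContent("{\"FileName\":\"sample.bin\",\"FilePath\":\"some/path\"}", Encoding.UTF8, "application/json");
var response = await client.SendAsync(request);
Console.WriteLine(response.StatusCode);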
I have a client using HttpClient.GetAsync to call into an Azure Function HTTP trigger in .NET 5.
When I call the function using PostMan, I get my custom header data.
However, when I try to access my response object (HttpResponseMessage) returned from HttpClient.GetAsync, my header data is empty.
I have my Content data and my Status Code, but my custom header data is missing.
Any insight would be appreciated, since I have been looking at this for hours.
Thanks for your help.
Edit: Here is the code where I am making the http call:
public async Task<HttpResponseMessage> GetQuotesAsync(int? pageNo, int? pageSize, string searchText)
{
var requestUri = $"{RequestUri.Quotes}?pageNo={pageNo}&pageSize={pageSize}&searchText={searchText}";
return await _httpClient.GetAsync(requestUri);
}
Edit 8/8/2021: See my comment below. The issue has something to do with using the Blazor WASM client.
For anyone having problems after following the tips on this page, go back and check the configuration in the host.json file. You need Access-Control-Expose-Headers set to * or the headers won't be sent even if you add them. Note: I added the "extensions" node below and removed my logging settings for clarity.
host.json (sample file):
{
"version": "2.0",
"extensions": {
"http": {
"customHeaders": {
"Access-Control-Expose-Headers": "*"
}
}
}
}
This is because HttpResponseMessage's Headers property is of type HttpResponseHeaders, while HttpResponseData's Headers property is of type HttpHeadersCollection. Since they are different, HttpResponseHeaders could not bind to HttpHeadersCollection when calling HttpClient.GetAsync (as it returns HttpResponseMessage).
I could not find a way to read HttpHeadersCollection through HttpClient.
As long as your Azure Function code is emitting the header value, you should be able to read it in your client code from the Headers collection of HttpResponseMessage. Nothing in your Azure Function (which is the remote endpoint you are calling) makes it any different. Remember, your client code has no idea how your remote endpoint is implemented. Today it is Azure Functions; tomorrow it may be a full-blown ASP.NET Core web API or a REST endpoint written in Node.js. Your client code does not care. All it cares about is whether the HTTP response it received has your expected header.
Assuming you have an Azure Function like this, where you are adding a header called total-count to the response:
[Function("quotes")]
public static async Task<HttpResponseData> RunAsync(
[HttpTrigger(AuthorizationLevel.Anonymous, "get", "post")] HttpRequestData req,
FunctionContext executionContext)
{
var logger = executionContext.GetLogger("Quotes");
logger.LogInformation("C# HTTP trigger function processed a request for Quotes.");
var quotes = new List<Quote>
{
new Quote { Text = "Hello", ViewCount = 100},
new Quote { Text = "Azure Functions", ViewCount = 200}
};
var response = req.CreateResponse(HttpStatusCode.OK);
response.Headers.Add("total-count", quotes.Count.ToString());
await response.WriteAsJsonAsync(quotes);
return response;
}
Your existing client code should work as long as you read the Headers property.
public static async Task<HttpResponseMessage> GetQuotesAsync()
{
var requestUri = "https://shkr-playground.azurewebsites.net/api/quotes";
return await _httpClient.GetAsync(requestUri);
}
Now your GetQuotesAsync method can be called somewhere else, where you will use its return value (an HttpResponseMessage instance) and read the headers. In the example below, I am reading that header value and appending it to a string variable. HttpResponseMessage implements IDisposable, so I am using a using block to implicitly call the Dispose method.
var msg = "Total count from response headers:";
using (var httpResponseMsg = await GetQuotesAsync())
{
if (httpResponseMsg.Headers.TryGetValues("total-count", out var values))
{
msg += values.FirstOrDefault();
}
}
// TODO: use "msg" variable as needed.
The types which Azure Functions uses for dealing with response headers are an implementation concern of Azure Functions. They have no impact on your client code, where you are using HttpClient and HttpResponseMessage. Your client code is simply dealing with a standard HTTP response (response headers and body).
The issue is not with Blazor WASM, but rather whether that header has been exposed on your API side. In your Azure Function, add the following.
Note: Postman will still show the headers even if you don't expose them as shown below. That's because Postman doesn't care about CORS headers. CORS is just a browser concept, not a strong security mechanism; it allows you to restrict which other web apps may use your backend resources, and that's all.
First, create a Startup file to register the HttpContextAccessor.
Package Required: Microsoft.Azure.Functions.Extensions
[assembly: FunctionsStartup(typeof(FuncAppName.Startup))]
namespace FuncAppName
{
public class Startup : FunctionsStartup
{
public override void Configure(IFunctionsHostBuilder builder)
{
builder.Services.AddScoped<HttpContextAccessor>();
}
}
}
Next, inject it into your main Function -
using Microsoft.AspNetCore.Http;
namespace FuncAppName
{
public class SomeFunction
{
private readonly HttpContext _httpContext;
public SomeFunction(HttpContextAccessor contextAccessor)
{
_httpContext = contextAccessor.HttpContext;
}
[FunctionName("SomeFunc")]
public override Task<IActionResult> Run([HttpTrigger(AuthorizationLevel.Anonymous, new[] { "post" }, Route = "run")] HttpRequest req)
{
var response = "Some Response"
_httpContext.Response.Headers.Add("my-custom-header", "some-custom-value");
_httpContext.Response.Headers.Add("my-other-header", "some-other-value");
_httpContext.Response.Headers.Add("Access-Control-Expose-Headers", "my-custom-header, my-other-header");
return new OkObjectResult(response)
}
If you want to allow all headers, you can use a wildcard (I think; not tested):
_httpContext.Response.Headers.Add("Access-Control-Expose-Headers", "*");
You still need to add your web app URL to the Azure platform CORS settings. You can add the * wildcard; more info here - https://iotespresso.com/allowing-all-cross-origin-requests-azure-functions/
To enable CORS for local apps during development - https://stackoverflow.com/a/60109518/9276081
Now, to access those headers in your Blazor WASM app, you can for example do the following:
protected override async Task OnInitializedAsync()
{
var content = JsonContent.Create(new { query = "" });
using (var client = new HttpClient())
{
var result = await client.PostAsync("https://func-app-name.azurewebsites.net/api/run", content);
var headers = result.Headers.ToList();
}
}
I am trying to return a SAS URL to my frontend so I can redirect the user to that link and they can download the file.
This is my code to create the SAS URL:
private SasQueryParameters GenerateSaSCredentials(string containerName, string blobName) {
// Defines the resource being accessed and for how long the access is allowed.
BlobSasBuilder blobSasBuilder = new() {
StartsOn = DateTime.UtcNow.Subtract(TimeSpan.FromMinutes(10)),
ExpiresOn = DateTime.UtcNow.Add(TimeSpan.FromMinutes(120)) + TimeSpan.FromSeconds(1),
Resource = "b",
BlobName = blobName,
BlobContainerName = containerName
};
// Defines the type of permission.
blobSasBuilder.SetPermissions(BlobSasPermissions.Read);
// Builds an instance of StorageSharedKeyCredential
StorageSharedKeyCredential storageSharedKeyCredential = new(_accountName, _key);
// Builds the Sas URI.
return blobSasBuilder.ToSasQueryParameters(storageSharedKeyCredential);
}
public Uri CreateBlobUri(string blobName, string containerName) {
SasQueryParameters parameters = GenerateSaSCredentials(containerName, blobName);
return new UriBuilder {
Scheme = "https",
Host = $"{_accountName}.blob.core.windows.net",
Path = $"files/{containerName}/{blobName}",
Query = WebUtility.UrlDecode(parameters.ToString())
}.Uri;
}
You may notice the URL decode on parameters.ToString(); that is because of a similar issue I've seen on Stack Overflow where they spoke of double encoding.
However, when I return this URL to the browser and redirect, I get the following error.
This is how I return the URL:
return Ok(_blobUtils.CreateBlobUri(fileName, containerName).ToString());
<Error>
<Code>AuthenticationFailed</Code>
<Message>Server failed to authenticate the request. Make sure the value of Authorization header
is formed correctly including the signature. RequestId:01696cca-d01e-0023-2ea4-74f5df000000
Time:2021-07-09T09:23:33.0250817Z</Message>
<AuthenticationErrorDetail>Signature fields not well formed.</AuthenticationErrorDetail>
</Error>
If I remove the WebUtility.UrlDecode from parameters.ToString(), I get this error:
<Error>
<Code>AuthenticationFailed</Code>
<Message>Server failed to authenticate the request. Make sure the value of Authorization header
is formed correctly including the signature. RequestId:016a1821-d01e-0023-3da4-74f5df000000
Time:2021-07-09T09:24:38.4051042Z</Message>
<AuthenticationErrorDetail>Signature did not match. String to sign used was r 2021-07-
09T09:14:38Z 2021-07-09T11:24:39Z /blob/${_acountName}/files/bqXbY54sRRsipOUB1PF6/fyI67FYOqDS80y1vNWRL/PRE_OP_CT/0/TK1.left.TST.PTN1.PRE
_OP_CT.zip 2020-04-08 b </AuthenticationErrorDetail>
</Error>
The structure of the blob I am trying to access, and the blob we are trying to create a SAS for, are shown in the screenshots.
Can anyone see why this would fail?
Please get rid of files from Path here:
return new UriBuilder {
Scheme = "https",
Host = $"{_accountName}.blob.core.windows.net",
Path = $"files/{containerName}/{blobName}",
Query = WebUtility.UrlDecode(parameters.ToString())
}.Uri;
It should be something like:
return new UriBuilder {
Scheme = "https",
Host = $"{_accountName}.blob.core.windows.net",
Path = $"{containerName}/{blobName}",
Query = WebUtility.UrlDecode(parameters.ToString())
}.Uri;
UPDATE
Based on the screenshot and the error message, the name of your container is files and the name of the blob is bqXbY54sRRsipOUB1PF6/fyI67FYOqDS80y1vNWRL/PRE_OP_CT/0/TK1.left.TST.PTN1.PRE_OP_CT.zip. Please use them in your code and you should not get the error. You still need to remove files from the Path above, as it is already included in your containerName.
The reason your code is failing is that you're calculating the SAS token for a blob inside a blob container (the blob path becomes container-name/blob-name). However, in your request you're prepending files to the request URL, so the blob path becomes files/container-name/blob-name. Since the SAS token is computed for one path but used for another, you're getting the error.
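As an aside, a sketch of a simpler alternative that sidesteps building the URI by hand, using BlobClient.GenerateSasUri from the same Azure.Storage.Blobs SDK (this assumes the client is constructed with shared key credentials, e.g. a connection string; the variable names are placeholders):
// GenerateSasUri requires the BlobClient to be authorized with a shared key,
// which a storage connection string provides.
var blobClient = new BlobClient(connectionString, containerName, blobName);
Uri sasUri = blobClient.GenerateSasUri(BlobSasPermissions.Read, DateTimeOffset.UtcNow.AddHours(2));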
I'm creating an Azure web service in C# which will use Azure Functions to post an image (selfie) to a storage account using an HTTP trigger request. There will be two containers in the blob storage account, one for new user registration and another for existing users: when a new user takes a selfie, it will post that image to the registration container, and if it is an existing user, the image will be posted to the other container. After this, the web service will return the URI of the posted image using a Shared Access Signature.
I created a cloud solution in Visual Studio 2017 (.NET Framework) using the HTTP trigger.
using System.Linq;
using System.Net;
using System.Net.Http;
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.Azure.WebJobs.Host;
namespace KeolisKlockApp
{
public static class Function1
{
[FunctionName("Function1")]
public static async Task<HttpResponseMessage> Run([HttpTrigger(AuthorizationLevel.Anonymous, "get", "post", Route = null)]HttpRequestMessage req, TraceWriter log)
{
log.Info("C# HTTP trigger function processed a request.");
// parse query parameter
string name = req.GetQueryNameValuePairs()
.FirstOrDefault(q => string.Compare(q.Key, "name", true) == 0)
.Value;
if (name == null)
{
// Get request body
dynamic data = await req.Content.ReadAsAsync<object>();
name = data?.name;
}
return name == null
? req.CreateResponse(HttpStatusCode.BadRequest, "Please pass a name on the query string or in the request body")
: req.CreateResponse(HttpStatusCode.OK, "Hello " + name);
}
}
}
I expect the return of the web service will be a URI of the posted image in azure blob which uses a shared access signature.
Update:
local.settings.json (note: replace the value of AzureWebJobsStorage with your Azure Storage connection string):
{
"IsEncrypted": false,
"Values": {
"AzureWebJobsStorage": "DefaultEndpointsProtocol=https;AccountName=your_storage_account;AccountKey=your_storage_account_key;BlobEndpoint=https://xx.blob.core.windows.net/;TableEndpoint=https://xxx.table.core.windows.net/;QueueEndpoint=https://xx.queue.core.windows.net/;FileEndpoint=https://xx.file.core.windows.net/",
"FUNCTIONS_WORKER_RUNTIME": "dotnet"
}
}
For host.json, just leave it as default, like below:
{
"version": "2.0"
}
Please use the code below for testing purposes (it's just simple code, and you can modify it to meet your needs).
In Visual Studio, create an HttpTrigger Azure Function, then use the code below:
public static class Function1
{
[FunctionName("Function1")]
public static async Task<IActionResult> Run(
[HttpTrigger(AuthorizationLevel.Function, "get", "post", Route = null)] HttpRequest req,
ILogger log)
{
log.LogInformation("C# HTTP trigger function processed a request.");
string name = req.Query["name"];
string account_name = "your_storage_account_name";
string account_key = "your_storage_account_key";
string container_name = "your_container_name";
string blob_name = name;
CloudStorageAccount storageAccount = new CloudStorageAccount(new StorageCredentials(account_name, account_key), true);
CloudBlobClient client = storageAccount.CreateCloudBlobClient();
CloudBlobContainer blobContainer = client.GetContainerReference(container_name);
CloudBlockBlob myblob = blobContainer.GetBlockBlobReference(blob_name);
await myblob.UploadFromStreamAsync(req.Body);
string blobSasUrl = GetBlobSasUri(blobContainer, blob_name, null);
Console.WriteLine(blobSasUrl);
return name != null
? (ActionResult)new OkObjectResult($"Hello, {name}")
: new BadRequestObjectResult("Please pass a name on the query string or in the request body");
}
private static string GetBlobSasUri(CloudBlobContainer container, string blobName, string policyName = null)
{
string sasBlobToken;
CloudBlockBlob blob = container.GetBlockBlobReference(blobName);
if (policyName == null)
{
SharedAccessBlobPolicy adHocSAS = new SharedAccessBlobPolicy()
{
SharedAccessExpiryTime = DateTime.UtcNow.AddHours(24),
Permissions = SharedAccessBlobPermissions.Read | SharedAccessBlobPermissions.Write | SharedAccessBlobPermissions.Create
};
sasBlobToken = blob.GetSharedAccessSignature(adHocSAS);
Console.WriteLine("SAS for blob (ad hoc): {0}", sasBlobToken);
Console.WriteLine();
}
else
{
sasBlobToken = blob.GetSharedAccessSignature(null, policyName);
Console.WriteLine("SAS for blob (stored access policy): {0}", sasBlobToken);
Console.WriteLine();
}
// Return the URI string for the container, including the SAS token.
return blob.Uri + sasBlobToken;
}
}
Then click the Run button in Visual Studio to run the code locally. Here, I use Postman for the test; you can see the test result below: the blob SAS URL is generated, and the file is uploaded to Azure Blob Storage.
If you don't know how to use Postman to test, please follow the steps below:
1. Click the Run button in Visual Studio, then wait for a while; in the command prompt, there is a URL which you need in Postman.
2. Open Postman, paste the URL generated above, select POST as the HTTP method, then in the Params section fill in the key-value pair, like name and its value.
3. Under Body -> click the binary radio button -> click Select File and select the file you want to upload. Finally, click the Send button.
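If you prefer to exercise the function from code rather than Postman, a minimal sketch (the local URL, query value, and file path are placeholders):
// POST a local file as the raw request body to the locally running function.
using var client = new HttpClient();
using var fileStream = File.OpenRead(@"C:\temp\selfie.jpg");
var content = new StreamContent(fileStream);
var response = await client.PostAsync("http://localhost:7071/api/Function1?name=selfie.jpg", content);
Console.WriteLine(await response.Content.ReadAsStringAsync());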
Edit: Thank you @Marco.
I'm trying to write a function app that grabs an SVG from a URL and converts it to PNG. I know there are existing API's that do this like CloudConvert, but they don't work nicely with embedded fonts, which is a requirement for me.
Anyway, I wrote a very basic function app that simply downloads a file at this point. Everything works perfectly fine locally, but when I publish to Azure, I get "An exception occurred during a WebClient request."
Thanks to @Marco's suggestion, I switched from WebClient to HttpWebRequest to get more detailed error information, and as a result I see the following:
2018-10-11T13:53:53.558 [Info] Function started (Id=e3cbda04-140e-4ef7-ad6c-c871ffe179dd)
2018-10-11T13:53:53.590 [Info] C# HTTP trigger function processed a request.
2018-10-11T13:53:53.752 [Info] Download Fail
2018-10-11T13:53:53.752 [Info] Access to the path 'D:\Windows\system32\734e16961fc276df.svg' is denied.
Am I trying to do something that isn't possible, or is there a fix for this? Is there a way to configure permissions in an Azure function? I need to pull the file down to edit and not just work with the byte array.
Many thanks!
public static async Task<HttpResponseMessage> Run([HttpTrigger(AuthorizationLevel.Anonymous, "get", "post", Route = null)]HttpRequestMessage req,
TraceWriter log, ExecutionContext context)
{
log.Info("C# HTTP trigger function processed a request.");
// parse query parameter
string svgURL = req.GetQueryNameValuePairs()
.FirstOrDefault(q => string.Compare(q.Key, "l", true) == 0)
.Value;
if (svgURL == null)
{
// Get request body
dynamic data = await req.Content.ReadAsAsync<object>();
svgURL = data?.svgURL;
}
// download file from URL
var uniqueName = GenerateId() ;
try
{
using (var client = new WebClient())
{
client.DownloadFile(svgURL, uniqueName + ".svg" );
}
}
catch (Exception e)
{
log.Info("Download Fail");
log.Info(e.Message);
}
}
The easiest way to solve this is to use temp storage. I can see why Azure wouldn't want functions cluttering up the app directory anyway. Updated code below:
I replaced this:
client.DownloadFile(svgURL, uniqueName + ".svg" );
With this:
client.DownloadFile(svgURL, Path.GetTempPath() + "\\" + uniqueName + ".svg" );
Worked like a charm.
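Equivalently, Path.Combine avoids hard-coding the separator (Path.GetTempPath already ends with one, so the double backslash above also works); a small sketch:
// Same behavior, but lets the framework handle the directory separator.
client.DownloadFile(svgURL, Path.Combine(Path.GetTempPath(), uniqueName + ".svg"));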
Edit:
Below is the GitHub repo where I make this call. There's other stuff going on but you can see where I save to temp storage.
https://github.com/osuhomebase/SVG2PNG-AzureFunction