C# Azure Function trigger: create a CSV file from a given string

I am trying to create a CSV file from a string. I was able to build the CSV content as a list of strings like so:
private void SaveToCsv<T>(List<T> reportData)
{
    var lines = new List<string>();
    // Use the public properties of T for the header row
    var props = TypeDescriptor.GetProperties(typeof(T)).OfType<PropertyDescriptor>().ToList();
    var header = string.Join(",", props.Select(x => x.Name));
    lines.Add(header);
    // One CSV line per row, reading each property value through the same descriptors
    var valueLines = reportData.Select(row => string.Join(",", props.Select(p => p.GetValue(row))));
    lines.AddRange(valueLines);
}
But I cannot figure out how to create an actual .csv file from the strings in lines, since the function app cannot write to an arbitrary path (eg: D://drive/user/xya).
How do I create a file in code without a local path in a function app?

Please check whether the steps below help to work around this:
After generating the CSV files, you need a storage account to store them. One is created automatically when you create the Azure Function App.
Update the storage account name, connection string, and file path in local.settings.json as well as in the Azure Function App's configuration in the portal.
Here File_path is the blob container folder path where the CSV files will be saved.
Add the CORS option through the portal or in code to avoid issues such as requests being blocked from your local IP address, or from your current IP address when connected to a VPN.
Refer to this article for a practical workaround and the code.
If you want to save the CSV files in the temporary location of the Azure Functions host, refer to this SO thread.
References:
Storage Consideration of Azure Functions
To mount Azure local storage as a local file, refer to this MSFT Q&A
Accessing Azure File Storage from Azure Function
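Putting the steps above together, here is a minimal sketch of the upload step, assuming the Azure.Storage.Blobs NuGet package; the container name "reports" and the app setting name "AzureWebJobsStorage" are placeholders, not names from the question:

```csharp
using System;
using System.Collections.Generic;
using System.IO;
using System.Text;
using Azure.Storage.Blobs;

public static class CsvUploader
{
    // Pure helper: join the CSV lines into a single UTF-8 payload.
    public static byte[] GetCsvBytes(List<string> lines) =>
        Encoding.UTF8.GetBytes(string.Join(Environment.NewLine, lines));

    // Sketch: upload the payload to a blob instead of a local path.
    // "reports" and "AzureWebJobsStorage" are assumed names.
    public static void UploadCsv(List<string> lines, string blobName)
    {
        var connectionString = Environment.GetEnvironmentVariable("AzureWebJobsStorage");
        var container = new BlobContainerClient(connectionString, "reports");
        container.CreateIfNotExists();
        using var stream = new MemoryStream(GetCsvBytes(lines));
        container.GetBlobClient(blobName).Upload(stream, overwrite: true);
    }
}
```

This way the function never needs a local file path at all; the lines built by SaveToCsv go straight to blob storage.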

Azure Function fails to find file

I have an Azure Function that uses the Azure context. When I execute my function from Visual Studio 2019 on my machine, it executes correctly. However, when I publish it to my Azure account, I get an error that the my.azureauth file cannot be found.
Could not find file 'D:\Program Files (x86)\SiteExtensions\Functions\2.0.12950\32bit\my.azureauth'
The code that is used:
var authFilePath = "my.azureauth";
Console.WriteLine($"Authenticating with Azure using credentials in file at {authFilePath}");
azure = Azure.Authenticate(authFilePath).WithDefaultSubscription();
sub = azure.GetCurrentSubscription();
Console.WriteLine($"Authenticated with subscription '{sub.DisplayName}' (ID: {sub.SubscriptionId})");
This is code that I found in one of the Microsoft tutorials. I have set my my.azureauth file to "Copy Always".
Could anyone point me in the right direction?
You get this file path because Directory.GetCurrentDirectory() returns D:\Program Files (x86)\SiteExtensions\Functions\2.0.12950\32bit instead of D:\home\site\wwwroot\ or D:\home\site\wwwroot\FunctionName.
If you want to get the wwwroot folder or the function app directory, you should use ExecutionContext. For more information you can refer to this wiki doc.
So the right file path should be context.FunctionDirectory + "\my.azureauth" or context.FunctionAppDirectory + "\my.azureauth"; which one to use depends on where your file is stored.
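Concretely, after adding an ExecutionContext context parameter to the function's Run method, the path can be built with Path.Combine. A minimal sketch of just the path logic (the helper class name is an invention for illustration; call it as AuthFilePath.Resolve(context.FunctionAppDirectory)):

```csharp
using System.IO;

public static class AuthFilePath
{
    // Builds the full path to the auth file under the function app directory,
    // i.e. under D:\home\site\wwwroot when running in Azure.
    public static string Resolve(string functionAppDirectory) =>
        Path.Combine(functionAppDirectory, "my.azureauth");
}
```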
I have found that Kudu is extremely useful in seeing what has been deployed to Azure.
Navigate to your function in the Azure portal.
The instructions here will help you get to the Kudu console:
https://www.gslab.com/blogs/kudu-azure-web-app
From there you can browse the files which have been deployed into your function's file system.
If you add ", ExecutionContext context" at the end of the function's Run entry point parameters, you can then get the folder your function is running from with var path = context.FunctionAppDirectory;
PS: apologies for any formatting, I am editing this on my phone.
Welcome to Stack Overflow.
Firstly, I'd recommend strongly against using file-based authentication as shown in your question.
From the notes:
Note, file-based authentication is an experimental feature that may or may not be available in later releases. The file format it relies on is subject to change as well.
Instead, I would personally store the connection string details (AzureCredentials) in the config file (Web/SiteSettings) and use the provided constructor...
Again, the following is taken from the documentation notes:
Similarly to the file-based approach, this method requires a service principal registration, but instead of storing the credentials in a local file, the required inputs can be supplied directly via an instance of the AzureCredentials class:
var creds = new AzureCredentialsFactory().FromServicePrincipal(client, key, tenant, AzureEnvironment.AzureGlobalCloud);
var azure = Azure.Authenticate(creds).WithSubscription(subscriptionId);
or
var creds = new AzureCredentialsFactory().FromServicePrincipal(client, pfxCertificatePath, password, tenant, AzureEnvironment.AzureGlobalCloud);
var azure = Azure.Authenticate(creds).WithSubscription(subscriptionId);
where client, tenant, subscriptionId, and key or pfxCertificatePath and password are strings with the required pieces of information about your service principal and subscription. The last parameter, AzureEnvironment.AzureGlobalCloud, represents the Azure worldwide public cloud. You can use a different value from the currently supported alternatives in the AzureEnvironment enum.
The first example is most likely the one you should be looking at.
The notes I got this information from can be accessed here.
If you have some problems with AAD, the original answer included portal screenshots of the app registration's Client ID and Key blades (not reproduced here).
Please note that the Key value can only be copied when it is created, after which it will be hidden.
Hope this helps you get started with AAD quickly.

Where are the temp files created in a blob triggered Azure Function?

I am working on an Azure Functions application on Windows Azure.
I have created a blob triggered Azure Function that creates a file in the temp folder. To get the path of the temporary folder in which to create the temporary file, I am using the following code block:
string TempFolderLocation = Path.GetTempPath();
string TempFileName = DateTime.UtcNow.ToString("yyyyMMddHHmmssfff") + ".txt";
string TempFilePath = Path.Combine(TempFolderLocation, TempFileName);
System.IO.File.WriteAllText(TempFilePath, "This is the time log : " + DateTime.UtcNow.ToString("yyyy.MM.dd HH:mm:ss.fff"));
I get the path "D:\local\Temp\" as result of Path.GetTempPath().
I am not getting any error when the above code gets executed, but when I open Kudu for my Azure Function application, there are no files created in the folder "D:\local\Temp\".
So my questions are:
Does the execution of a blob triggered Azure Function happen at a different location than Kudu?
Is the temp folder at execution time different from the actual one?
Where can I see those temp files?
The Azure Function doesn't share temp storage with Kudu (and there is a legacy flag to force them to share).
Check "Temporary files" section here:
https://github.com/projectkudu/kudu/wiki/Understanding-the-Azure-App-Service-file-system#temporary-files
Another important note is that the main site and the scm site do not share temp files. So if you write some files there from your site, you will not see them from Kudu Console (and vice versa). You can make them use the same temp space if you disable separation (via WEBSITE_DISABLE_SCM_SEPARATION). But note that this is a legacy flag, and its use is not recommended/supported.
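Since the temp folder is instance-local and not visible from Kudu, anything written there is effectively private to the running function. A self-contained sketch of the write-to-temp step, matching the question's code with the content parameterized, which you can verify locally:

```csharp
using System;
using System.IO;

public static class TempWriter
{
    // Writes a timestamped .txt file into the host's temp folder
    // (D:\local\Temp on an Azure Functions Windows host) and returns its full path.
    public static string WriteTempLog(string content)
    {
        string fileName = DateTime.UtcNow.ToString("yyyyMMddHHmmssfff") + ".txt";
        string path = Path.Combine(Path.GetTempPath(), fileName);
        File.WriteAllText(path, content);
        return path;
    }
}
```

If the file must outlive the invocation or be visible elsewhere, copy it out to durable storage (for example a blob container) before the function returns, since the temp folder is not durable.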

Validating file or folder location in azure-storage-blob using simple http call

We have a requirement to validate a folder or file location in an azure-blob-storage container.
Example folder path: wasbs://#.blob.core.windows.net/
Example file path: wasbs://#.blob.core.windows.net//
We would like to validate whether the file or folder exists before proceeding with our business logic.
Is there any way I can validate the paths using a URI, instead of going with the storage packages?
Note: we are not allowed to use a SAS token to access the storage path.
But we can go with a storage key or connection string to connect to the storage account from application code.
wasb is the HDFS-compatible API on top of Azure Blob Storage. If you use http://, you could perhaps check the path via the HTTP response you get: 404 probably means the path/file doesn't exist; 200 means the file path exists. I hope this helps.
Update:
Thanks @Gaurav for the insightful comment. I also added an example of checking the blob status in Python; you can do so in other languages as well. Just plug in the needed information (storage account name, key, container name, blob name) and you'll get back a boolean value indicating whether the blob exists:
from azure.storage.blob import BlockBlobService

block_blob_service = BlockBlobService(account_name='', account_key='')

def blob_exists():
    container_name = ""
    blob_name = ""
    exists = block_blob_service.exists(container_name, blob_name)
    return exists

blobstat = blob_exists()
print(blobstat)
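The same HTTP-status idea can be sketched in C# with HttpClient. One caveat worth stating plainly: a bare HTTP request carries no credentials, so this only distinguishes 200 from 404 for blobs that allow anonymous read; with only a storage key available, the SDK existence check is the safer route. The helper names below are illustrative, not a library API:

```csharp
using System;
using System.Net;
using System.Net.Http;
using System.Threading.Tasks;

public static class BlobProbe
{
    // Pure helper: build the blob URL from its parts.
    public static string BuildBlobUrl(string account, string container, string blobName) =>
        $"https://{account}.blob.core.windows.net/{container}/{blobName}";

    // Sketch: HEAD the URL; 200 means the blob exists, 404 means it doesn't.
    // Only works for blobs with anonymous read access.
    public static async Task<bool> ExistsAsync(HttpClient http, string url)
    {
        using var request = new HttpRequestMessage(HttpMethod.Head, url);
        using var response = await http.SendAsync(request);
        return response.StatusCode == HttpStatusCode.OK;
    }
}
```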

How do I upload a resource file and then access that file in an Azure Cloud Service?

I want to have my Worker Role (I have not done this with a Web Role) upload a file when it is deployed, and also be able to access that deployed file in code.
How do I deploy a file with my Cloud Service to each instance?
How can I find that file in code?
Good question, Adam. To deploy a file with your cloud service you can follow this article. The two main points are as follows:
set Build Action to None
set Copy to Output Directory to "Copy if newer"
With these two properties set, the file and its folder structure will be copied to your instance deployment. This brings us to accessing that file in code.
To access it in code simply do the following which I got from here:
string appRoot = Environment.GetEnvironmentVariable("RoleRoot");
string pathToFiles = Path.Combine(appRoot + @"\", @"approot\" + @"anyFolderNameIUsed\");
It is important to remember that this is part of the deployment package, which means the file will be overwritten whenever a new deployment completes. Also, each instance will have its own copy of the file.
If you need persistence or the ability to share file contents between your worker roles, you can save your files in Blob Storage. The following article can help you accomplish this:
Using Blobs

Azure Website: writing a file to a network drive

I have a service that I currently run on our local IIS machine that will write a file to a specified directory using the following code:
//Creating the file path that is needed
string FilePath = FolderPath.Replace("'", "")
    + FileName.Replace("'", "").Replace(" ", "");
//Writing the file to the correct path
File.WriteAllBytes(FilePath, fileTransferDump.FileBinary);
When I change over to an Azure Website, how would I make the system write to the file path that I would like to write to?
Basically, how do I cause the Azure Website to write to a network share? This is just a normal Azure Website.
Azure Websites don't have the ability to write to off-machine resources using traditional Windows file shares. Your best bet would be to wrap this logic in a custom library that redirects these file writes to Azure Blob Storage instead.
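A minimal sketch of such a wrapper, assuming the Azure.Storage.Blobs NuGet package; the container name "transfers" and the setting name "StorageConnectionString" are placeholders. It keeps the original sanitizing logic and swaps File.WriteAllBytes for a blob upload:

```csharp
using System;
using Azure.Storage.Blobs;

public static class FileSink
{
    // Pure helper, kept from the original code: strip quotes and spaces from the name.
    public static string SanitizeName(string folderPath, string fileName) =>
        folderPath.Replace("'", "") + fileName.Replace("'", "").Replace(" ", "");

    // Sketch: the drop-in replacement for File.WriteAllBytes.
    // "transfers" and "StorageConnectionString" are assumed names.
    public static void WriteAllBytes(string folderPath, string fileName, byte[] fileBinary)
    {
        var connectionString = Environment.GetEnvironmentVariable("StorageConnectionString");
        var container = new BlobContainerClient(connectionString, "transfers");
        container.CreateIfNotExists();
        container.GetBlobClient(SanitizeName(folderPath, fileName))
                 .Upload(new BinaryData(fileBinary), overwrite: true);
    }
}
```

The "folder" part of the name simply becomes a prefix in the blob name, which blob storage renders as a virtual directory.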
