I use BlobContainerClient from the Azure.Storage.Blobs library. I am trying to delete some files in a blob container. When I do so, if some directories become empty after the deletions, these directories also disappear and can no longer be seen in the Azure Portal.
I need to keep all empty directories in the container. How is this possible?
Also, this question could be formulated this way:
how is it possible to create an empty directory in Azure Storage?
It is not possible to keep empty directories in Azure Blob Storage if the hierarchical namespace is disabled.
To keep empty directories in Azure Blob Storage, you need to use a Data Lake Storage Gen2 account (a storage account with the hierarchical namespace enabled).
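With the hierarchical namespace enabled, you can also create an empty directory explicitly through the Azure.Storage.Files.DataLake package. Below is a minimal sketch, assuming a container (file system) named "sample-container" and a directory named "empty-folder", both placeholders:
using Azure.Storage.Files.DataLake;
// Connect to the Data Lake Gen2 account and get the target file system (container)
var dataLakeServiceClient = new DataLakeServiceClient(connectionString);
var fileSystemClient = dataLakeServiceClient.GetFileSystemClient("sample-container");
// With the hierarchical namespace enabled, a directory is a real resource and
// persists even when it contains no files
DataLakeDirectoryClient directoryClient = fileSystemClient.GetDirectoryClient("empty-folder");
directoryClient.CreateIfNotExists();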
To check whether the hierarchical namespace is enabled or disabled, you can use the following C# code:
using Azure.Storage.Blobs;
using Azure.Storage.Blobs.Models;
var serviceClient = new BlobServiceClient(connectionString);
AccountInfo accountInfo = serviceClient.GetAccountInfo();
// True when the hierarchical namespace (Data Lake Gen2) is enabled
Console.WriteLine(accountInfo.IsHierarchicalNamespaceEnabled);
You need to use a recent version of the Azure.Storage.Blobs package, e.g.:
Azure.Storage.Blobs v12.10.0
I am trying to create a CSV file from a string. I was able to build the CSV content as strings like so:
private void SaveToCsv<T>(List<T> reportData)
{
    var lines = new List<string>();

    // Build the header row from the property names of T
    IEnumerable<PropertyDescriptor> props = TypeDescriptor.GetProperties(typeof(T)).OfType<PropertyDescriptor>();
    var header = string.Join(",", props.ToList().Select(x => x.Name));
    lines.Add(header);

    // Build one comma-separated line per row via reflection
    var valueLines = reportData.Select(row => string.Join(",", header.Split(',').Select(a => row.GetType().GetProperty(a).GetValue(row, null))));
    lines.AddRange(valueLines);
}
But I cannot seem to find out how to create an actual .csv file from the strings in lines, as the Function App cannot write to an arbitrary path (e.g. D://drive/user/xya).
How do I create a file in code without a path in a Function App?
Please check if the below steps help to work around the issue:
After generating the CSV files, you need a storage account to store them. One is automatically created when you create the Azure Function App.
Update the storage account name, connection string, and file path in local.settings.json as well as in the configuration of the Azure Function App in the portal.
Here, File_path is the blob container folder path where you're going to save the CSV files.
Add the CORS option through the portal or code to avoid issues, such as allowing requests from your local IP address (or the current IP address if connected to a VPN).
Refer to this article for a practical workaround and the code.
If you want to save the CSV files in the temporary storage of the Azure Function, refer to this SO thread.
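For the upload step itself, here is a minimal sketch (not the exact code from the linked article) of writing the generated lines to a blob with Azure.Storage.Blobs; the app setting name StorageConnectionString and the container name report-output are hypothetical placeholders:
using System;
using System.Collections.Generic;
using System.IO;
using System.Text;
using System.Threading.Tasks;
using Azure.Storage.Blobs;
private async Task UploadCsvAsync(List<string> lines, string fileName)
{
    // The connection string comes from local.settings.json locally and from the
    // Function App configuration in the Azure Portal when deployed
    var connectionString = Environment.GetEnvironmentVariable("StorageConnectionString");
    var containerClient = new BlobContainerClient(connectionString, "report-output");
    await containerClient.CreateIfNotExistsAsync();

    // Stream the CSV content straight to the blob instead of writing to a local path
    var csv = string.Join(Environment.NewLine, lines);
    using var stream = new MemoryStream(Encoding.UTF8.GetBytes(csv));
    await containerClient.UploadBlobAsync(fileName, stream);
}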
References:
Storage considerations for Azure Functions
To mount Azure local storage as a local file, refer to this MSFT Q&A
Accessing Azure File Storage from Azure Function
I built an Azure Function that gets an XML file from a POST request, converts it to JSON, and uploads it to an Azure Blob Storage container. Currently I just have the connection string to the container hard-coded within my function. However, the file needs to be uploaded to a different container depending on whether the dev or prod deployment of the function is being used.
var connectionString = "sampleConnectionString";
BlobServiceClient blobServiceClient = new BlobServiceClient(connectionString);
var containerNameXML = "sampleContainerName";
BlobContainerClient containerClient = blobServiceClient.GetBlobContainerClient(containerNameXML);
BlobClient blobClient = containerClient.GetBlobClient(xmlFileName);
I know I can store the connection string in a local.settings.json file and access it in the code, but that would only apply to one of the environments. So I am wondering if it is possible to override the local environment variable via Azure for each environment, or something similar.
Thank you in advance for any advice.
You can change the application settings in Azure for your Function App:
https://learn.microsoft.com/en-us/azure/azure-functions/functions-how-to-use-azure-function-app-settings?tabs=portal
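As a minimal sketch of that pattern (the setting names XmlBlobConnectionString and XmlContainerName are hypothetical), read the values from configuration so that local.settings.json supplies them locally and the Function App's application settings override them per environment:
using System;
using Azure.Storage.Blobs;
// The same code runs in dev and prod; only the configured values differ per environment
var connectionString = Environment.GetEnvironmentVariable("XmlBlobConnectionString");
var containerNameXML = Environment.GetEnvironmentVariable("XmlContainerName");
BlobServiceClient blobServiceClient = new BlobServiceClient(connectionString);
BlobContainerClient containerClient = blobServiceClient.GetBlobContainerClient(containerNameXML);
BlobClient blobClient = containerClient.GetBlobClient(xmlFileName);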
We have had an existing implementation with Azure using the deprecated WindowsAzure.Storage NuGet package for a few years.
We are upgrading to the new Azure.Storage.Blobs NuGet package.
As part of the original implementation, the blobs were to be stored in a folder structure within the container: container/year/month/:
test/2021/01/test.pdf
test/2020/12/test.pdf
The requirement is that this should continue going forward, but I cannot work out whether it is still possible.
Has anyone managed to get this to work?
In Azure Blob Storage, there is no such thing as a folder structure; it is all virtual. If you specify the blob's name with the full path you wish it to have (year/month in your case) in the blob reference part of your code, it's possible to retain the same upload logic.
You can simply create it through code:
using Azure.Storage;
using Azure.Storage.Blobs;
using Azure.Storage.Blobs.Models;
// Get a connection string to our Azure Storage account. You can
// obtain your connection string from the Azure Portal (click
// Access Keys under Settings in the Portal Storage account blade)
// or using the Azure CLI with:
//
// az storage account show-connection-string --name <account_name> --resource-group <resource_group>
//
// And you can provide the connection string to your application
// using an environment variable.
string connectionString = "<connection_string>";
// Get a reference to a container named "sample-container" and create it if it doesn't exist yet
BlobContainerClient container = new BlobContainerClient(connectionString, "sample-container");
container.CreateIfNotExists();
// Get a reference to a blob named "sample-file" in a container named "sample-container"
BlobClient blob = container.GetBlobClient("sample-folder/sample-file");
// Open a file and upload its data
using (FileStream file = File.OpenRead("local-file.jpg"))
{
blob.Upload(file);
}
Check out the documentation from the PG (product group) about this NuGet package.
We have a requirement to validate a folder or file location in an Azure Blob Storage container.
Example folder path: wasbs://#.blob.core.windows.net/
Example file path: wasbs://#.blob.core.windows.net//
We would like to validate whether the file or folder exists before proceeding with our business logic.
Is there any way I can validate the paths using the URI, instead of going with the storage packages?
Note: We are not allowed to use a SAS token to access the storage path.
However, we can use a storage key or connection string to connect to the storage account from the application code.
wasbs is the HDFS-compatible API on top of Azure Blob Storage. If you use the plain http(s):// blob endpoint instead, you could check the path based on the HTTP response you get: a 404 probably means the path/file doesn't exist, and a 200 means it does. I hope this helps.
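For example, a minimal C# sketch of that check with an HTTP HEAD request; the account, container, and blob names are placeholders, and this only returns 200 if the blob is publicly readable or the request is otherwise authorized:
using System;
using System.Net.Http;
using System.Threading.Tasks;
var blobUri = "https://<account_name>.blob.core.windows.net/<container_name>/<folder>/<file_name>";
using var httpClient = new HttpClient();
using var request = new HttpRequestMessage(HttpMethod.Head, blobUri);
HttpResponseMessage response = await httpClient.SendAsync(request);
// 200 => the blob exists (and is readable), 404 => it does not exist (or is not visible anonymously)
Console.WriteLine((int)response.StatusCode);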
Update:
Thanks @Gaurav for the insightful comment. I also added an example of checking the blob status in Python; you can do the same in other languages as well. Just plug in the needed information (storage account name, key, container name, blob name) and you'll get back a boolean indicating whether the blob exists or not:
from azure.storage.blob import BlockBlobService

block_blob_service = BlockBlobService(account_name='', account_key='')

def blob_exists():
    container_name = ""
    blob_name = ""
    exists = block_blob_service.exists(container_name, blob_name)
    return exists

blobstat = blob_exists()
print(blobstat)
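For comparison, a similar check with the current Azure.Storage.Blobs SDK in C#, using the connection string that the question says is allowed (container, folder, and file names are placeholders):
using System;
using Azure.Storage.Blobs;
var connectionString = "<connection_string>";
var containerClient = new BlobContainerClient(connectionString, "<container_name>");
// Exists() sends an authorized request and returns true/false instead of throwing on 404
BlobClient blobClient = containerClient.GetBlobClient("<folder>/<file_name>");
Console.WriteLine(blobClient.Exists().Value);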
I know that we can upload files to Azure Blob Storage using the code below:
CloudBlockBlob cloudBlockBlob = fileContainer.GetBlockBlobReference(fileName);
await cloudBlockBlob.UploadFromFileAsync(fileFullPath);
I already created some folders in the container. I tried a few times but the files were always uploaded outside the folder.
How do we upload files into a specific folder in blob storage?
Actually there aren't real folders in Azure Blob Storage, it's just a virtual concept. In other words, Azure Blob Storage only has simple 2-level "container - blob" structure, the so-called "folders" are just prefixes of existing blob names.
For example, if there is a blob named a/b/c.jpg, then a and b are virtual folder names; you can't directly create or delete them, because they exist only as part of the blob name a/b/c.jpg.
I guess you have to get the right reference first (you need to include the full path of the file in GetBlockBlobReference), as in:
CloudBlockBlob cloudBlockBlob = fileContainer.GetBlockBlobReference($"yourfoldername/{fileName}");
One important thing: you DON'T need to create the folder; it will be created automatically based on the path you pass to GetBlockBlobReference.
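Putting it together, a minimal end-to-end sketch with the same client types the question uses (the container name is a placeholder; fileName and fileFullPath are the variables from the question):
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;
// Parse the connection string and get the target container
CloudStorageAccount storageAccount = CloudStorageAccount.Parse("<connection_string>");
CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
CloudBlobContainer fileContainer = blobClient.GetContainerReference("sample-container");
// The "folder" is just a prefix in the blob name; no separate create step is needed
CloudBlockBlob cloudBlockBlob = fileContainer.GetBlockBlobReference($"yourfoldername/{fileName}");
await cloudBlockBlob.UploadFromFileAsync(fileFullPath);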