We have had an existing implementation with Azure using the deprecated WindowsAzure.Storage NuGet package for a few years.
We are upgrading to the new Azure.Storage.Blobs package.
As part of the original implementation, the blobs were stored in a folder structure within the container (container/year/month/):
test/2021/01/test.pdf
test/2020/12/test.pdf
The requirement is that this should continue going forward, but I cannot work out whether it is still possible.
Has anyone managed to get this to work?
In Azure Blob Storage there is no such thing as a folder structure; it is all virtual. If you specify the blob's name with the full path you want (year/month in your case) when creating the blob reference in your code, it's possible to retain the same upload logic.
You can simply create it through code:
using System.IO;
using Azure.Storage;
using Azure.Storage.Blobs;
using Azure.Storage.Blobs.Models;
// Get a connection string to our Azure Storage account. You can
// obtain your connection string from the Azure Portal (click
// Access Keys under Settings in the Portal Storage account blade)
// or using the Azure CLI with:
//
// az storage account show-connection-string --name <account_name> --resource-group <resource_group>
//
// And you can provide the connection string to your application
// using an environment variable.
string connectionString = "<connection_string>";
// Get a reference to a container named "sample-container" and then create it
BlobContainerClient container = new BlobContainerClient(connectionString, "sample-container");
container.Create();
// Get a reference to a blob named "sample-file" in a container named "sample-container"
BlobClient blob = container.GetBlobClient("sample-folder/sample-file");
// Open a file and upload its data
using (FileStream file = File.OpenRead("local-file.jpg"))
{
    blob.Upload(file);
}
Check out the product group's documentation for this NuGet package.
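To reproduce the year/month layout from the question, note that the "folder" is just a prefix baked into the blob name. A minimal sketch of building such a name (the file name is illustrative):

```csharp
using System;

// For container "test", blob name "2021/01/test.pdf" shows up in the
// portal as test/2021/01/test.pdf.
DateTime date = new DateTime(2021, 1, 15); // use DateTime.UtcNow in practice
string blobName = $"{date:yyyy}/{date:MM}/test.pdf";
Console.WriteLine(blobName); // 2021/01/test.pdf
// Then pass it to container.GetBlobClient(blobName) exactly as above.
```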
I want to upload and download data to Azure Blob containers of different storage accounts. I don't want to store the connection strings. I am passing the connection string to the BlobServiceClient.
BlobServiceClient blobServiceClient = new BlobServiceClient(connectionString, options);
string containerName = "nameXYZ";
BlobContainerClient containerClient = blobServiceClient.GetBlobContainerClient(containerName);
I see a way to do this using a POST call, as shown in this link.
Is there any other way to dynamically get the connection string?
I have the container name and Blob URL.
(I am using Azure.Storage.Blobs V12)
OAuth is a better option for what you're trying to accomplish. Here's the documentation related to storage specifically: https://learn.microsoft.com/en-us/azure/storage/common/storage-auth-aad-app?tabs=dotnet
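A sketch of what that can look like with the v12 SDK and the Azure.Identity package (the account URL below is a placeholder; DefaultAzureCredential picks up managed identity, environment, or developer-tool credentials):

```csharp
using System;
using Azure.Identity;
using Azure.Storage.Blobs;

// No connection string stored anywhere: authenticate with Azure AD instead.
var blobServiceClient = new BlobServiceClient(
    new Uri("https://<account_name>.blob.core.windows.net"), // account URL, not a secret
    new DefaultAzureCredential());

BlobContainerClient containerClient = blobServiceClient.GetBlobContainerClient("nameXYZ");
```

Note that the identity running the code needs an RBAC role such as "Storage Blob Data Contributor" on the storage account for data operations to succeed.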
I use BlobContainerClient from the Azure.Storage.Blobs library. I am trying to delete some files in a blob container. When I do so, if some directories become empty after the deletions, those directories also disappear and can no longer be seen in the Azure portal.
I need to keep all empty directories in the container. How is this possible?
This question could also be formulated this way:
how is it possible to create an empty directory in Azure Storage?
It is not possible to keep empty directories in Azure Blob Storage if the hierarchical namespace is disabled.
To keep empty directories, you need to use a Data Lake Gen2 storage account.
To check whether the hierarchical namespace is enabled or disabled, you can use the following C# code:
var serviceClient = new BlobServiceClient(connectionString);
AccountInfo accountInfo = serviceClient.GetAccountInfo();
Console.WriteLine(accountInfo.IsHierarchicalNamespaceEnabled);
You need to use a recent version of the Azure.Storage.Blobs package, for example:
Azure.Storage.Blobs v12.10.0
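If you do move to a Gen2 account, empty directories can be created explicitly with the Azure.Storage.Files.DataLake package; a minimal sketch (container and directory names are illustrative):

```csharp
using Azure.Storage.Files.DataLake;

// With the hierarchical namespace enabled, directories are real objects
// and survive even when they contain no files.
var serviceClient = new DataLakeServiceClient("<connection_string>");
DataLakeFileSystemClient fileSystem = serviceClient.GetFileSystemClient("my-container");
fileSystem.CreateDirectory("empty-folder"); // stays visible in the portal while empty
```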
I built an Azure Function that gets an XML file from a POST request, converts it to JSON, and uploads it to an Azure Blob Storage container. Currently I just have the connection string for the container hard-coded within my function. The issue is that the file needs to be uploaded to a different container depending on whether the dev or prod deployment of the function is being used.
var connectionString = "sampleConnectionString";
BlobServiceClient blobServiceClient = new BlobServiceClient(connectionString);
var containerNameXML = "sampleContainerName";
BlobContainerClient containerClient = blobServiceClient.GetBlobContainerClient(containerNameXML);
BlobClient blobClient = containerClient.GetBlobClient(xmlFileName);
I know I can store the connection string in a local.settings.json file and access it in the code, but that would only cover one of the environments. So I am wondering if it is possible to override the local environment variable in Azure for each environment, or something similar.
Thank you in advance for any advice.
You can change the application settings for your Functions app in Azure:
https://learn.microsoft.com/en-us/azure/azure-functions/functions-how-to-use-azure-function-app-settings?tabs=portal
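In code that usually boils down to reading the setting from the environment rather than hard-coding it; a minimal sketch, assuming a setting named "BlobConnectionString" defined per environment (in local.settings.json locally, and under Configuration > Application settings for each deployed function app):

```csharp
using System;
using Azure.Storage.Blobs;

// "BlobConnectionString" is an illustrative setting name; give it a different
// value in the dev and prod function apps (or deployment slots).
string connectionString = Environment.GetEnvironmentVariable("BlobConnectionString");
var blobServiceClient = new BlobServiceClient(connectionString);
```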
We have a requirement to validate a folder or file location in an Azure Blob Storage container.
Example folder path: wasbs://#.blob.core.windows.net/
Example file path: wasbs://#.blob.core.windows.net//
I would like to validate whether the file or folder exists before proceeding with my business logic.
Is there any way I can validate the paths using the URI, instead of using the storage packages?
Note: We are not allowed to use a SAS token to access the storage path.
However, we can use a storage key or connection string to connect to the storage account from application code.
WASB is the HDFS-compatible API on top of Azure Blob Storage. If you use HTTP, you could check the path via the HTTP response you get: 404 probably means the path/file doesn't exist, while 200 means it does. I hope this helps.
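A sketch of that HTTP probe in C# (the URL is a placeholder, and this only works for blobs that allow anonymous read access; private blobs return 404 for unauthenticated requests as well):

```csharp
using System;
using System.Net;
using System.Net.Http;

// HEAD the blob URL: 200 => exists, 404 => not found (or no anonymous access).
using var http = new HttpClient();
var request = new HttpRequestMessage(HttpMethod.Head,
    "https://<account>.blob.core.windows.net/<container>/<blob>");
HttpResponseMessage response = await http.SendAsync(request);
bool exists = response.StatusCode == HttpStatusCode.OK;
Console.WriteLine(exists);
```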
Update:
Thanks @Gaurav for the insightful comment. I also added an example of checking blob status in Python; you can do the same in other languages. Just plug in the needed information (storage account name, key, container name, blob name) and you'll get back a boolean indicating whether the blob exists:
from azure.storage.blob import BlockBlobService

block_blob_service = BlockBlobService(account_name='', account_key='')

def blob_exists():
    container_name = ""
    blob_name = ""
    exists = block_blob_service.exists(container_name, blob_name)
    return exists

blobstat = blob_exists()
print(blobstat)
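BlockBlobService comes from the legacy azure-storage SDK; with the current v12 libraries the same check is an Exists call, sketched here in C# (names are placeholders):

```csharp
using System;
using Azure.Storage.Blobs;

// Exists() issues a request for the blob using the account credentials,
// so no SAS token is required.
var container = new BlobContainerClient("<connection_string>", "<container_name>");
BlobClient blob = container.GetBlobClient("<blob_name>");
bool exists = blob.Exists();
Console.WriteLine(exists);
```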
I have a C# web application which uploads an image file to Azure Blob Storage. I am passing the local path of the image file from a textbox (no FileUpload control). The application works locally as expected, but when I publish it to Azure, it throws an exception:
Could not find file (filename)
What changes should be made to run it on Azure?
Code :
CloudBlobContainer container = Program.BlobUtilities.GetBlobClient.GetContainerReference(Container); // container
container.CreateIfNotExists();
container.SetPermissions(new BlobContainerPermissions
{
    PublicAccess = BlobContainerPublicAccessType.Blob
});
CloudBlobDirectory directory = container.GetDirectoryReference(foldername);
// Get reference to blob (binary content)
CloudBlockBlob blockBlob_image = directory.GetBlockBlobReference(imageid);
using (var filestream = System.IO.File.OpenRead(image_path))
{
    blockBlob_image.UploadFromStream(filestream);
}
Could not find file (filename)
The exception is caused by System.IO.File.OpenRead(filePath), because your web application has been published to Azure. If you want to use System.IO.File.OpenRead(filePath), you need to make sure the file path can be found on the Web App.
What changes should be made to run it on Azure?
If you want to run this code on Azure, you need to make sure the file can be found on the Azure website, which means copying the file to the Azure Web App first. For uploading files to Azure Blob Storage this is not recommended, since you would have to copy each file to the Web App before uploading it.
Alternatively, as you mentioned, you could use a FileUpload control to do this.
You're probably using a path on your computer, which will be different on Azure. You can try changing the path to something like this:
string path = HostingEnvironment.ApplicationPhysicalPath + @"\YourProjectName\PathToFile";
OK, I found the solution. Instead of passing a file path via a textbox, I used the FileUpload control. In the code-behind:
Stream image_path = FileUpload1.FileContent;
Actually, I had tried using the FileUpload control earlier, but Server.MapPath(FileUpload1.FileName) and Path.GetFullPath(FileUpload1.FileName) were not giving the correct path.
Also,
using (var filestream = image_path)
{
    blockBlob_image.UploadFromStream(image_path);
}
is replaced by
blockBlob_image.UploadFromStream(image_path);
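Putting the fix together, a sketch of the upload handler using the FileUpload control (blockBlob_image and directory come from the earlier snippet; the property names follow the ASP.NET Web Forms FileUpload control):

```csharp
// Only proceed if the user actually selected a file.
if (FileUpload1.HasFile)
{
    // Stream the uploaded file straight to blob storage; no local path involved.
    CloudBlockBlob blockBlob_image = directory.GetBlockBlobReference(FileUpload1.FileName);
    blockBlob_image.UploadFromStream(FileUpload1.FileContent);
}
```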