I know that we can upload files to Azure Blob Storage using the code below:
CloudBlockBlob cloudBlockBlob = fileContainer.GetBlockBlobReference(fileName);
await cloudBlockBlob.UploadFromFileAsync(fileFullPath);
I have already created some folders in the container. I tried a few times, but the files were always uploaded outside the folder.
How do we upload files into a specific folder in Blob Storage?
Actually, there are no real folders in Azure Blob Storage; they are just a virtual concept. In other words, Azure Blob Storage has only a simple two-level "container - blob" structure, and the so-called "folders" are just prefixes of existing blob names.
For example, if there is a blob named a/b/c.jpg, then a and b are virtual folder names. You can't directly create or delete them; they exist only because the blob a/b/c.jpg exists.
I guess you have to get the right reference first (you need to include the full path of the file in GetBlockBlobReference), as in:
CloudBlockBlob cloudBlockBlob = fileContainer.GetBlockBlobReference($"yourfoldername/{fileName}");
One important thing: you DON'T need to create the folder; it is created for you automatically based on the path you pass to GetBlockBlobReference.
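Once at least one blob has been uploaded under that prefix, the virtual folder appears, and you can verify it by listing blobs by prefix. Below is a minimal sketch with the same classic SDK, reusing a CloudBlobContainer like fileContainer and the folder name from above:
using System;
using System.Threading.Tasks;
using Microsoft.WindowsAzure.Storage.Blob;

public static async Task ListFolderBlobsAsync(CloudBlobContainer fileContainer)
{
    // List all blobs whose names start with the "folder" prefix. The virtual
    // folder is visible only because blobs with this prefix exist.
    BlobContinuationToken token = null;
    do
    {
        BlobResultSegment segment = await fileContainer.ListBlobsSegmentedAsync(
            "yourfoldername/", true, BlobListingDetails.None, null, token, null, null);

        foreach (IListBlobItem item in segment.Results)
        {
            if (item is CloudBlockBlob blob)
            {
                Console.WriteLine(blob.Name); // e.g. "yourfoldername/myfile.txt"
            }
        }

        token = segment.ContinuationToken;
    } while (token != null);
}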
I use BlobContainerClient from the Azure.Storage.Blobs library. I am trying to delete some files in a blob container. When I do so, if some directories become empty after the deletions, those directories also disappear and can no longer be seen in the Azure Portal.
I need to keep all empty directories in the container. How is that possible?
Also, this question could be formulated this way:
how is it possible to create an empty directory in Azure Storage?
It is not possible to keep empty directories in Azure Blob Storage if the hierarchical namespace is disabled.
To keep empty directories, you need to use a Data Lake Gen2 storage account (i.e. one with the hierarchical namespace enabled).
To check whether the hierarchical namespace is enabled or disabled, you can use the following C# code:
using Azure.Storage.Blobs;
using Azure.Storage.Blobs.Models;
var serviceClient = new BlobServiceClient(connectionString);
// IsHierarchicalNamespaceEnabled is true only for Data Lake Gen2 accounts.
AccountInfo accountInfo = serviceClient.GetAccountInfo();
Console.WriteLine(accountInfo.IsHierarchicalNamespaceEnabled);
You need to use a recent version of the Azure.Storage.Blobs package:
Azure.Storage.Blobs v12.10.0 (or later)
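If you do switch to a Data Lake Gen2 account, real directories can be created explicitly and they survive even when empty. Below is a minimal sketch using the Azure.Storage.Files.DataLake package; the file system and directory names are hypothetical, and connectionString is the same as above:
using Azure.Storage.Files.DataLake;

var dataLakeService = new DataLakeServiceClient(connectionString);

// A file system is the Data Lake Gen2 equivalent of a blob container.
DataLakeFileSystemClient fileSystem = dataLakeService.GetFileSystemClient("my-filesystem");
fileSystem.CreateIfNotExists();

// This is a real directory, not a virtual one, so it remains visible
// even after every file inside it has been deleted.
DataLakeDirectoryClient directory = fileSystem.GetDirectoryClient("empty-folder");
directory.CreateIfNotExists();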
We have had an existing implementation with Azure using the deprecated WindowsAzure.Storage NuGet package for a few years.
We are upgrading to the new Azure.Storage.Blobs NuGet package.
As part of the original implementation, the blobs were to be stored in a folder structure within the container: container/year/month/:
test/2021/01/test.pdf
test/2020/12/test.pdf
The requirement is that this should continue going forward, but I cannot work out whether it is still possible.
Has anyone managed to get this to work?
In Azure Blob Storage there is no real folder structure; it is all virtual. If you specify the blob's name with the full path you want it to have (year/month in your case) when you get the blob reference in your code, it is possible to keep the same upload logic.
You can simply create it through code:
using Azure.Storage;
using Azure.Storage.Blobs;
using Azure.Storage.Blobs.Models;
// Get a connection string to our Azure Storage account. You can
// obtain your connection string from the Azure Portal (click
// Access Keys under Settings in the Portal Storage account blade)
// or using the Azure CLI with:
//
// az storage account show-connection-string --name <account_name> --resource-group <resource_group>
//
// And you can provide the connection string to your application
// using an environment variable.
string connectionString = "<connection_string>";
// Get a reference to a container named "sample-container" and create it if it doesn't already exist
BlobContainerClient container = new BlobContainerClient(connectionString, "sample-container");
container.CreateIfNotExists();
// Get a reference to a blob named "sample-folder/sample-file" in that container
BlobClient blob = container.GetBlobClient("sample-folder/sample-file");
// Open a file and upload its data
using (FileStream file = File.OpenRead("local-file.jpg"))
{
blob.Upload(file);
}
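For the year/month layout from the question, the blob name simply carries that prefix. Here is a short sketch reusing the container client above; the file name and the use of the current UTC date are only illustrative:
// Build a blob name like "2021/01/test.pdf"; the year/month "folders" are just part of the name.
DateTime now = DateTime.UtcNow;
string blobName = $"{now:yyyy}/{now:MM}/test.pdf";
BlobClient monthlyBlob = container.GetBlobClient(blobName);

using (FileStream file = File.OpenRead("local-file.pdf"))
{
    monthlyBlob.Upload(file, overwrite: true);
}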
Check out the documentation from the product group (PG) about this NuGet package.
I want to get the full path of the uploaded file on the IIS server without saving the file to disk by calling the File.SaveAs() method. Why do I want to do this? Because once I have the full path of the file without saving it to disk, I want to upload the file to an Azure blob container.
I have successfully uploaded the file to the Azure blob container after saving it to disk, but I don't want to first save the file and then delete it from disk after the upload completes.
Files posted to the server are stored in streams, not in temporary directories. Assuming your object is of type HttpPostedFileBase, you would need to access the InputStream property and read the bytes into your Azure Blob.
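A minimal sketch of that idea, assuming an ASP.NET MVC-style helper, an HttpPostedFileBase named postedFile, and a CloudBlobContainer obtained as in the earlier snippets (the method and blob-naming choices here are hypothetical):
using System.IO;
using System.Threading.Tasks;
using System.Web;
using Microsoft.WindowsAzure.Storage.Blob;

public static async Task UploadPostedFileAsync(HttpPostedFileBase postedFile, CloudBlobContainer container)
{
    // Use the client-side file name as the blob name (hypothetical choice).
    CloudBlockBlob blob = container.GetBlockBlobReference(Path.GetFileName(postedFile.FileName));

    // InputStream exposes the uploaded content directly, so the code never
    // calls File.SaveAs() and nothing is explicitly written to disk.
    await blob.UploadFromStreamAsync(postedFile.InputStream);
}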
I have a C# web application which uploads an image file to Azure Blob Storage. I am passing the local path of the image file from a textbox (no FileUpload control). The application works locally as expected, but when I publish it to Azure, it throws an exception:
Could not find file (filename)
What changes should be made to run it on Azure?
Code:
CloudBlobContainer container = Program.BlobUtilities.GetBlobClient.GetContainerReference(Container);// container
container.CreateIfNotExists();
container.SetPermissions(new BlobContainerPermissions
{
PublicAccess = BlobContainerPublicAccessType.Blob
});
CloudBlobDirectory directory = container.GetDirectoryReference(foldername);
// Get reference to blob (binary content)
CloudBlockBlob blockBlob_image = directory.GetBlockBlobReference(imageid);
using (var filestream = System.IO.File.OpenRead(image_path))
{
blockBlob_image.UploadFromStream(filestream);
}
Could not find file (filename)
The exception is thrown by System.IO.File.OpenRead(filePath) because your web application has been published to Azure. If you want to use System.IO.File.OpenRead(filePath), you need to make sure the file path can actually be found in the Web App.
What changes should be made to run it on Azure?
If you want to use this code on Azure, you need to make sure the file can be found on the Azure website, which means copying the file to Azure first. If you want to upload files to an Azure blob this way, it is not recommended, since you would have to copy each file to the Azure Web App first.
Also, as you mentioned, you could use a FileUpload control to do that.
You're probably using a path on your own computer, which will be different on Azure. You can try changing the path to something like this:
string path = HostingEnvironment.ApplicationPhysicalPath + @"\YourProjectName\PathToFile";
OK, found the solution. Instead of passing a file path into a textbox, I used the FileUpload control. In the code-behind:
Stream image_path = FileUpload1.FileContent;
Actually, I had tried using the FileUpload control earlier too, but Server.MapPath(FileUpload1.FileName) and Path.GetFullPath(FileUpload1.FileName) were not giving the correct path.
Also,
using (var filestream = image_path)
{
blockBlob_image.UploadFromStream(image_path);
}
is replaced by
blockBlob_image.UploadFromStream(image_path);
I have a task to upload some images to blob storage simultaneously. The name of each blob is defined as the MD5 hash of the blob's content. It can happen that different threads try to upload the same file from different locations.
Now I need to know how to prevent other threads from uploading the same file if the first thread is already uploading that blob.
You can do it without leasing by using optimistic concurrency. Basically, set an access condition that says no blob with this name (i.e. no ETag for it) may already exist. If a blob with that name does exist, the second upload will fail.
var access = AccessCondition.GenerateIfNoneMatchCondition("*");
await blobRef.UploadFromStreamAsync(stream, access, null, null);
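Here is a sketch of how the losing thread might handle that failure, reusing blobRef and stream from the snippet above and assuming the classic WindowsAzure.Storage/Microsoft.Azure.Storage SDK; the exact status code the service returns for the rejected conditional write is an assumption, so both are checked:
var access = AccessCondition.GenerateIfNoneMatchCondition("*");
try
{
    await blobRef.UploadFromStreamAsync(stream, access, null, null);
}
catch (StorageException ex) when (ex.RequestInformation?.HttpStatusCode == 409
                               || ex.RequestInformation?.HttpStatusCode == 412)
{
    // Another thread (or process) already created a blob with this MD5 name,
    // so this upload can simply be skipped.
}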