Should I Be Using BlobContainerClient or BlobClient or Both? - c#

I am getting confused with the C# Azure SDK.
What I am trying to achieve: upload files from my computer to a folder in Azure.
For example:
Locally
MyFiles
  Folder1
    file.txt
    img.jpg
  Folder2
    file2.json
    test.png
Azure Result
Container
  MyFiles
    Folder1
      file.txt
      img.jpg
    Folder2
      file2.json
      test.png
So I want the same file structure in my container on Azure.
How I am doing it:
var sasCred = new AzureSasCredential("sasToken");
var container = new BlobContainerClient(new Uri("containerUrl"), sasCred);
var allFiles = Directory.GetFiles("MyFilesFolderPath", "*", SearchOption.AllDirectories);
foreach (var file in allFiles)
{
    var cloudFilePath = file.Replace("MyFilesFolderPath", string.Empty);
    var fullPath = $"MyFiles{cloudFilePath}";
    using (var s = new MemoryStream(File.ReadAllBytes(file)))
    {
        await container.UploadBlobAsync(fullPath, s);
    }
}
This seems to do what I need, though I noticed the content type of the uploaded blobs is "application/octet-stream" instead of the .json/.png/.txt type it should be.
When I search, the results talk about using BlobClient to set the content type, but now I am not sure whether I should be using BlobContainerClient or not.

You would need to use both BlobContainerClient and BlobClient in this case. You create a blob client (a BlockBlobClient specifically) from the BlobContainerClient and the blob name, and then call its UploadAsync method.
Your code (untested) would be something like:
var sasCred = new AzureSasCredential("sasToken");
var container = new BlobContainerClient(new Uri("containerUrl"), sasCred);
var allFiles = Directory.GetFiles("MyFilesFolderPath", "*", SearchOption.AllDirectories);
foreach (var file in allFiles)
{
    var cloudFilePath = file.Replace("MyFilesFolderPath", string.Empty);
    var fullPath = $"MyFiles{cloudFilePath}";
    using (var s = new MemoryStream(File.ReadAllBytes(file)))
    {
        var blockBlob = container.GetBlockBlobClient(fullPath); // Get a BlockBlobClient instance (extension method in Azure.Storage.Blobs.Specialized)
        var blobContentType = GetContentTypeFromFileSomehow(file); // Write a helper method to get the content type
        var headers = new BlobHttpHeaders() { ContentType = blobContentType }; // Set the content type header for the blob
        var blobUploadOptions = new BlobUploadOptions() { HttpHeaders = headers };
        await blockBlob.UploadAsync(s, blobUploadOptions); // Upload the blob
    }
}

You can set the ContentType property of the BlobHttpHeaders class to the desired content type.
There are also NuGet libraries that resolve the MIME type from the file name extension, if it varies per upload.
await blobClient.UploadAsync(stream, new BlobHttpHeaders { ContentType = "text/plain" });
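As an illustration, a helper like the GetContentTypeFromFileSomehow placeholder above could be written with the FileExtensionContentTypeProvider from the Microsoft.AspNetCore.StaticFiles NuGet package. This is only a sketch of one possible approach; the helper name simply matches the placeholder and the fallback value is an assumption:
using Microsoft.AspNetCore.StaticFiles; // NuGet: Microsoft.AspNetCore.StaticFiles

private static readonly FileExtensionContentTypeProvider ContentTypeProvider = new FileExtensionContentTypeProvider();

private static string GetContentTypeFromFileSomehow(string filePath)
{
    // Map the file extension (.json, .png, .txt, ...) to a MIME type,
    // falling back to application/octet-stream when the extension is unknown.
    return ContentTypeProvider.TryGetContentType(filePath, out var contentType)
        ? contentType
        : "application/octet-stream";
}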

Related

How to read from a blob path that contains the container name

I have a requirement where I want to read a blob and upload it to an SFTP location. The blob path is available in the format "/container/folder1/subfold1/abc.blob". I am using the code below to read the blob from that path:
BlobServiceClient blobServiceClient1 = new BlobServiceClient(connectionString);
BlobContainerClient containerClient1 = blobServiceClient1.GetBlobContainerClient(containerName);
var bname = "container/folder1/subfold1/abc.blob";
var blockBlobClient1 = containerClient1.GetBlockBlobClient(bname);
using (var uploadBlobStream = blockBlobClient1 .OpenReadAsync())
{
_sftp.Connect();
var blobPath = string.Format("{0}/{1}", remoteFilePath, "tst.txt");
_sftp.UploadFile(uploadBlobStream.Result, remoteFilePath, true);
_sftp.Disconnect();
}
But it throws an error that the blob is not present. Can anyone help with this?
You can try the following changes:
The blob name you pass to GetBlockBlobClient must be relative to the container (containerClient1 is already scoped to containerName), so drop the leading container segment from the path:
var bname = "folder1/subfold1/abc.blob";
The OpenReadAsync method returns a Task object that represents the asynchronous operation, rather than the stream itself. You will need to await the task to get the stream, like this:
using (var uploadBlobStream = await blockBlobClient1.OpenReadAsync())
{
// ...
}
Also, it's worth noting that the UploadFile method expects a Stream object as its first argument, but you are passing it the Task object returned by OpenReadAsync. You will need to pass the stream itself, like this:
_sftp.UploadFile(uploadBlobStream, remoteFilePath, true);
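Putting those changes together, the read-and-upload part would look roughly like this (untested sketch; _sftp, remoteFilePath and containerClient1 come from your own code):
// The blob name is relative to the container, so the container segment is not part of it.
var bname = "folder1/subfold1/abc.blob";
var blockBlobClient1 = containerClient1.GetBlockBlobClient(bname);
using (var uploadBlobStream = await blockBlobClient1.OpenReadAsync())
{
    _sftp.Connect();
    // Pass the awaited stream itself, not the Task returned by OpenReadAsync.
    _sftp.UploadFile(uploadBlobStream, remoteFilePath, true);
    _sftp.Disconnect();
}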

Creating and Uploading Append Blobs to store in Azure Container

I am trying to upload a new append blob to a container every time a message comes in from a service bus. I do not want to append to a blob that is already there; I want to create a whole new append blob each time.
Is this possible?
I was looking at this article but couldn't quite understand what they meant when they got to the content part: https://azuresdkdocs.blob.core.windows.net/$web/javascript/azure-storage-blob/12.1.1/classes/appendblobclient.html#appendblock
Here is the code that I have so far:
public static async void StoreToBlob(Services service)
{
//Serialize object
var sender = JsonConvert.SerializeObject(service);
// Create a BlobServiceClient object which will be used to create a container client
BlobServiceClient blobServiceClient = new BlobServiceClient(connectionString);
// Create the container and return a container client object
BlobContainerClient containerClient = blobServiceClient.GetBlobContainerClient(containerName);
// Create the container if it doesn't already exist.
await containerClient.CreateIfNotExistsAsync();
//Reference to blob
AppendBlobClient appendBlobClient = containerClient.GetAppendBlobClient("services" + Guid.NewGuid().ToString() + ".json");
// Create the blob.
appendBlobClient.Create();
await appendBlobClient.AppendBlock(sender, sender.Length); //here is where I am having an issue
}
Can you try something like the following (not tested code):
byte[] blockContent = Encoding.UTF8.GetBytes(sender);
using (var ms = new MemoryStream(blockContent))
{
    await appendBlobClient.AppendBlockAsync(ms); // AppendBlock/AppendBlockAsync takes just the stream; no separate length argument
}
Essentially we're converting the string to a byte array, creating a stream out of it and then uploading that stream.
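For completeness, a minimal sketch of the whole method with that change applied might look like the following (untested; it also returns Task instead of void and awaits the create call, which is a suggestion on my part rather than something the SDK requires):
public static async Task StoreToBlob(Services service)
{
    // Serialize the object
    var sender = JsonConvert.SerializeObject(service);
    BlobServiceClient blobServiceClient = new BlobServiceClient(connectionString);
    BlobContainerClient containerClient = blobServiceClient.GetBlobContainerClient(containerName);
    await containerClient.CreateIfNotExistsAsync();
    // A brand new append blob per message
    AppendBlobClient appendBlobClient = containerClient.GetAppendBlobClient("services" + Guid.NewGuid().ToString() + ".json");
    await appendBlobClient.CreateAsync();
    byte[] blockContent = Encoding.UTF8.GetBytes(sender);
    using (var ms = new MemoryStream(blockContent))
    {
        await appendBlobClient.AppendBlockAsync(ms);
    }
}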

Download and upload DriveItem from shared OneDrive Folder with MS Graph SDK

I'm currently trying to implement several tasks that involve listing, uploading and downloading files from a shared OneDrive folder. This folder is accessible via the logged-in user's OneDrive (visible in their root folder). The listing part works pretty well so far, using this code:
string remoteDriveId = string.Empty;
private GraphServiceClient graphClient { get; set; }
// Get the root of the owners OneDrive
DriveItem ownerRoot = await this.graphClient.Drive.Root.Request().Expand("thumbnails,children($expand=thumbnails)").GetAsync();
// Select the shared folders general information
DriveItem sharedFolder = ownerRoot.Children.Where(c => c.Name == "sharedFolder").FirstOrDefault();
// Check if it is a remote folder
if (sharedFolder.RemoteItem != null)
{
remoteDriveId = sharedFolder.RemoteItem.ParentReference.DriveId;
// Get complete Information of the shared folder
sharedFolder = await graphClient.Drives[remoteDriveId].Items[sharedFolder.RemoteItem.Id].Request().Expand("thumbnails,children").GetAsync();
}
So obviously I need to retrieve the shared folder's information from the OneDrive that shared it with the other OneDrive.
The next part is to list the contents of this shared folder, which also works pretty well like this:
foreach (DriveItem child in sharedFolder.Children)
{
DriveItem childItem = await graphClient.Drives[remoteDriveId].Items[child.Id].Request().Expand("thumbnails,children").GetAsync();
if(childItem.Folder == null)
{
string path = Path.GetTempPath() + Guid.NewGuid();
// Download child item to path
}
}
My problem starts with the "Download child item to path" part. There I want to download everything that is not a folder to a temporary file. The problem is that OneDrive always answers my request with an error message saying the file was not found. What I tried so far is:
using (var stream = await graphClient.Drives[remoteDriveId].Items[childItem.Id].Content.Request().GetAsync())
using (var outputStream = new System.IO.FileStream(path, System.IO.FileMode.Create))
{
await stream.CopyToAsync(outputStream);
}
In another variant I tried to use the ID of the childItem's ParentReference (but I think this will only lead me to the remote OneDrive's ID of sharedFolder):
using (var stream = await graphClient.Drives[remoteDriveId].Items[childItem.ParentReference.Id].Content.Request().GetAsync())
using (var outputStream = new System.IO.FileStream(path, System.IO.FileMode.Create))
{
await stream.CopyToAsync(outputStream);
}
After downloading the files I want to edit them and re-upload them to a different path in the shared folder. That path is created by me (which already works) like this:
DriveItem folderToCreate = new DriveItem { Name = "folderName", Folder = new Folder() };
await graphClient.Drives[remoteDriveId].Items[sharedFolder.Id].Children.Request().AddAsync(folderToCreate);
The upload then fails. I've tried it like this:
using (var stream = new System.IO.FileStream(@"C:\temp\testfile.txt", System.IO.FileMode.Open))
{
await graphClient.Drives[remoteDriveId].Items[sharedFolder.Id].Content.Request().PutAsync<DriveItem>(stream);
}
And also like this (which works if it is not a shared folder and I therefore use Drive instead of Drives):
using (var stream = new System.IO.FileStream(@"C:\temp\testfile.txt", System.IO.FileMode.Open))
{
string folderPath = sharedFolder.ParentReference == null ? "" : sharedFolder.ParentReference.Path.Remove(0, 12) + "/" + Uri.EscapeUriString(sharedFolder.Name);
var uploadPath = folderPath + "/" + uploadFileName;
await graphClient.Drives[remoteDriveId].Root.ItemWithPath(uploadPath).Content.Request().PutAsync<DriveItem>(stream);
}
I couldn't get the AddAsync method (like in the folder creation) to work because I don't know how to create a DriveItem from a Stream.
If somebody could point me in the right direction I would highly appreciate that! Thank you!
The request:
graphClient.Drives[remoteDriveId].Items[childItem.ParentReference.Id].Content.Request().GetAsync()
corresponds to the Download the contents of a DriveItem endpoint and is only valid if childItem.ParentReference.Id refers to a File resource; in other cases it fails with the expected exception:
Microsoft.Graph.ServiceException: Code: itemNotFound Message: You cannot get content for a folder
So, to download content from a folder the solution would be to:
enumerate the items under the folder: GET /drives/{drive-id}/items/{folderItem-id}/children
for every item, explicitly download its content if the driveItem corresponds to a File facet: GET /drives/{drive-id}/items/{fileItem-id}/content
Example
var sharedItem = await graphClient.Drives[driveId].Items[folderItemId].Request().Expand(i => i.Children).GetAsync();
foreach (var item in sharedItem.Children)
{
if (item.File != null)
{
var fileContent = await graphClient.Drives[item.ParentReference.DriveId].Items[item.Id].Content.Request()
.GetAsync();
using (var fileStream = new FileStream(item.Name, FileMode.Create, System.IO.FileAccess.Write))
fileContent.CopyTo(fileStream);
}
}
Example 2
The example demonstrates how to download a file from a source folder and upload it into a target folder:
var sourceDriveId = "--source drive id goes here--";
var sourceItemFolderId = "--source folder id goes here--";
var targetDriveId = "--target drive id goes here--";
var targetItemFolderId = "--target folder id goes here--";
var sourceFolder = await graphClient.Drives[sourceDriveId].Items[sourceItemFolderId].Request().Expand(i => i.Children).GetAsync();
foreach (var item in sourceFolder.Children)
{
if (item.File != null)
{
//1. download a file as a stream
var fileContent = await graphClient.Drives[item.ParentReference.DriveId].Items[item.Id].Content.Request()
.GetAsync();
//save it into file
//using (var fileStream = new FileStream(item.Name, FileMode.Create, System.IO.FileAccess.Write))
// fileContent.CopyTo(fileStream);
//2.Upload file into target folder
await graphClient.Drives[targetDriveId]
.Items[targetItemFolderId]
.ItemWithPath(item.Name)
.Content
.Request()
.PutAsync<DriveItem>(fileContent);
}
}
Instead of downloading/uploading file content, I think what you are actually after is the DriveItem copy or move operations. Let's say there are files that need to be copied from one (source) folder into another (target); the following example demonstrates how to accomplish that:
var sourceDriveId = "--source drive id goes here--";
var sourceItemFolderId = "--source folder id goes here--";
var targetDriveId = "--target drive id goes here--";
var targetItemFolderId = "--target folder id goes here--";
var sourceFolder = await graphClient.Drives[sourceDriveId].Items[sourceItemFolderId].Request().Expand(i => i.Children).GetAsync();
foreach (var item in sourceFolder.Children)
{
if (item.File != null)
{
var parentReference = new ItemReference
{
DriveId = targetDriveId,
Id = targetItemFolderId
};
await graphClient.Drives[sourceDriveId].Items[item.Id]
.Copy(item.Name, parentReference)
.Request()
.PostAsync();
}
}

Partial file name Search of Azure blob storage without file extension

I have image files in a blob container on Azure. All files have unique names. I need to search these image files by name, without the extensions. For example I have the files:
123.PNG
345.jpg
122.JPG
The present code can search if I give the complete name of the file, such as 123.PNG.
How do I make it work by passing just 123?
Code (id is passed as a parameter, which is the file name in the blob container):
var blobClient = storageAccount.CreateCloudBlobClient();
var container = blobClient.GetContainerReference("images");
container.CreateIfNotExists();
var blockBlob = container.GetBlockBlobReference(id);
blockBlob.FetchAttributes();
byte[] downloadedImage = new byte[blockBlob.Properties.Length];
blockBlob.DownloadToByteArray(downloadedImage, 0);
var imageBase64 = Convert.ToBase64String(downloadedImage);
What you could do is use the ListBlobs method that accepts a string prefix parameter like this:
var blobClient = storageAccount.CreateCloudBlobClient();
var container = blobClient.GetContainerReference("images");
container.CreateIfNotExists();
var blockBlobs = container.ListBlobs(prefix: "123.").OfType<CloudBlockBlob>();
var blockBlob = blockBlobs.First();
blockBlob.FetchAttributes();
byte[] downloadedImage = new byte[blockBlob.Properties.Length];
blockBlob.DownloadToByteArray(downloadedImage, 0);
var imageBase64 = Convert.ToBase64String(downloadedImage);
The above example will find 123.JPG or 123.PNG (or both)
You will get a list of all blobs that have a name starting with the value of prefix.
For newcomers using the newer Azure.Storage.Blobs (v12) SDK, you can do it like this:
var pagesize = 10;
var resultSegment = blobContainerClient.GetBlobsAsync(prefix: "BlobName")
.AsPages(default, pagesize);
// Enumerate the blobs returned for each page.
await foreach (Azure.Page<BlobItem> blobPage in resultSegment)
{
foreach (BlobItem blobItem in blobPage.Values)
{
Console.WriteLine("Blob name: {0}", blobItem.Name);
}
Console.WriteLine();
}
Ref: MSDN (List blobs with Azure Storage client libraries)

Replace the Contents inside Azure Storage

Is there any way to replace a file if the same name exists? I can't see any replace method in Azure Storage. Here is my code:
var client = new CloudBlobClient(
new Uri("http://sweetapp.blob.core.windows.net/"), credentials);
var container = client.GetContainerReference("cakepictures");
await container.CreateIfNotExistsAsync();
var perm = new BlobContainerPermissions();
perm.PublicAccess = BlobContainerPublicAccessType.Blob;
await container.SetPermissionsAsync(perm);
var blockBlob = container.GetBlockBlobReference(newfilename + i + file.FileType);
using (var fileStream = await file.OpenSequentialReadAsync())
{
await blockBlob.UploadFromStreamAsync(fileStream);
}
Is there anything that I could add into this code so that it replaces existing or same file name?
If a blob already exists in blob storage and you upload another file with the same name as that blob, the old blob's contents are automatically replaced with the contents of the new file. You don't have to do anything special.
As Gaurav also mentioned in his answer, the default behavior of the UploadFromStream API is to overwrite if the blob already exists.
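If you are on the newer Azure.Storage.Blobs (v12) SDK rather than the classic CloudBlobClient shown above, overwriting has to be requested explicitly. A minimal sketch, assuming a connection string and a local file path of your own (connectionString, localFilePath and the blob name are placeholders):
var containerClient = new BlobContainerClient(connectionString, "cakepictures");
await containerClient.CreateIfNotExistsAsync();
var blobClient = containerClient.GetBlobClient("cakepicture1.jpg");
using (var fileStream = File.OpenRead(localFilePath))
{
    // overwrite: true replaces the blob if one with the same name already exists;
    // without it, UploadAsync throws when the blob is already there.
    await blobClient.UploadAsync(fileStream, overwrite: true);
}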
