Trying to download a Word document from Azure blob - C#

I am trying to download a Word document stored in an Azure blob container with private access, and I need to convert the downloaded document into a byte array so that I can send it to a React app.
This is the code I am trying:
[Authorize, HttpGet("{id}/{projectphase?}")]
public async Task<ActionResult<DesignProject>> GetDesignProject(string id, string projectphase = null)
{
    var blobContainerName = Startup.Configuration["AzureStorage:BlobContainerName"];
    var azureStorageConnectionString = Startup.Configuration["AzureStorage:ConnectionString"];
    BlobContainerClient blobContainerClient = new BlobContainerClient(azureStorageConnectionString, blobContainerName);
    blobContainerClient.CreateIfNotExists();
    ....... // not sure how to proceed further
    .......
    ......
    return new InlineFileContentResult('here i need to return byte array???', "application/docx") { FileDownloadName = fileName };
}
I have the full path where the file has been stored, like below:
https://xxxx.blob.core.windows.net/design-project-files/99999-99/99999-99-BOD-Concept.docx
and I also have the file name: 99999-99-BOD-Concept.docx
Could anyone please guide me on how to proceed with downloading the document? I would be very grateful.

Please try something like the following (untested code though):
public async Task<ActionResult<DesignProject>> GetDesignProject(string id, string projectphase = null)
{
    var blobContainerName = Startup.Configuration["AzureStorage:BlobContainerName"];
    var azureStorageConnectionString = Startup.Configuration["AzureStorage:ConnectionString"];
    BlobContainerClient blobContainerClient = new BlobContainerClient(azureStorageConnectionString, blobContainerName);
    blobContainerClient.CreateIfNotExists();

    // Create a BlobClient from the URL only to extract the blob's name.
    var blobClient = new BlobClient(new Uri("https://xxxx.blob.core.windows.net/design-project-files/99999-99/99999-99-BOD-Concept.docx"));
    var blobName = blobClient.Name;

    // Recreate the client with the connection string so it can authenticate.
    blobClient = new BlobClient(azureStorageConnectionString, blobContainerName, blobName);
    var fileName = Path.GetFileName(blobName);

    using (var ms = new MemoryStream())
    {
        await blobClient.DownloadToAsync(ms);
        return new InlineFileContentResult(ms.ToArray(), "application/vnd.openxmlformats-officedocument.wordprocessingml.document") { FileDownloadName = fileName };
    }
}
Basically, we first create a BlobClient from the URL you have so that we can extract the blob's name from it (you could also parse the URL yourself). Once we have the blob's name, we create a new BlobClient instance using the connection string, the blob container name, and the blob's name.
Then we download the blob's content to a stream and convert that stream to a byte array (this is the part I am not 100% sure about), and return that byte array.
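If you're unsure about the MemoryStream round-trip, newer 12.x versions of Azure.Storage.Blobs also expose DownloadContentAsync, which returns the content as BinaryData. A sketch under that assumption (untested; it reuses the connection string, container name, and blob name variables from above):

```
// Assumes Azure.Storage.Blobs v12.9 or later.
var blobClient = new BlobClient(azureStorageConnectionString, blobContainerName, blobName);

// DownloadContentAsync buffers the whole blob in memory, which is fine
// for documents; prefer DownloadToAsync(stream) for very large files.
Response<BlobDownloadResult> response = await blobClient.DownloadContentAsync();
byte[] bytes = response.Value.Content.ToArray();
```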

You don't really need this process where your React app makes a request to your server, your server downloads the file, and then sends it on to the React app. That file in blob storage is already on the web, downloadable straight from blob storage, so it's somewhat unnecessary to make your server act as a proxy for it.
If you configure public access for blobs, then you just put that URL into your React app - the user clicks it, the bytes download. Happy days. If you have a private container, you can still generate SAS URLs for the blobs.
If you actually need the bytes in your React app, then just fetch the file with a JavaScript web request - you'll need to set a CORS policy on the blob container, though.
If you really want to download the file to/via the server, you'll probably have to stream it into the response stream connected to the React app, passed in as the SOMETHING below:
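For instance, with the v12 SDK a read-only SAS URL can be generated roughly like this (a sketch, untested; it assumes the BlobContainerClient was built from a connection string, so shared-key SAS generation is possible, and the blob path is illustrative):

```
// Requires the Azure.Storage.Sas namespace.
BlobClient blobClient = blobContainerClient.GetBlobClient("99999-99/99999-99-BOD-Concept.docx");

// Read-only SAS valid for one hour; hand sasUrl.ToString() to the React app.
Uri sasUrl = blobClient.GenerateSasUri(BlobSasPermissions.Read, DateTimeOffset.UtcNow.AddHours(1));
```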
BlobClient blob = blobContainerClient.GetBlobClient( BLOB NAME I.E PATH INSIDE CONTAINER);
//download to a file or stream
await blob.DownloadToAsync( SOMETHING );
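In ASP.NET Core, that SOMETHING can be the response body itself, so the file streams through without buffering the whole document in memory. A rough sketch inside a controller action (untested; the blob name and file name are illustrative):

```
BlobClient blob = blobContainerClient.GetBlobClient("99999-99/99999-99-BOD-Concept.docx");

Response.ContentType = "application/vnd.openxmlformats-officedocument.wordprocessingml.document";
Response.Headers["Content-Disposition"] = "attachment; filename=99999-99-BOD-Concept.docx";

// Streams the blob straight into the HTTP response, no MemoryStream needed.
await blob.DownloadToAsync(Response.Body);
```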

Related

Azure Functions: Is there optimal way to upload file without storing it in process memory?

I'm trying to upload a file to a blob container via HTTP.
The request receives the file through:
public class UploadFileFunction
{
    //Own created wrapper on BlobContainerClient
    private readonly IBlobFileStorageClient _blobFileStorageClient;

    public UploadFileFunction(IBlobFileStorageClient blobFileStorageClient)
    {
        _blobFileStorageClient = blobFileStorageClient;
    }

    [FunctionName("UploadFile")]
    public async Task<IActionResult> Run(
        [HttpTrigger(AuthorizationLevel.Anonymous, "post", Route = "/equipments/{route}")]
        HttpRequest request,
        [FromRoute] string route)
    {
        IFormFile file = request.Form.Files["File"];
        if (file is null)
        {
            return new BadRequestResult();
        }
        string fileNamePath = $"{route}_{request.Query["fileName"]}_{request.Query["fileType"]}";
        BlobClient blob = _blobFileStorageClient.Container.GetBlobClient(fileNamePath);
        try
        {
            await blob.UploadAsync(file.OpenReadStream(), new BlobHttpHeaders { ContentType = file.ContentType });
        }
        catch (Exception)
        {
            return new ConflictResult();
        }
        return new OkResult();
    }
}
Then I make the request with the file.
On UploadAsync, the whole stream of the file is uploaded into process memory.
Is there some way to upload directly to the blob without buffering it in process memory?
Thank you in advance.
The best way to avoid this is to not upload your file via your own HTTP endpoint at all; asking how to keep the uploaded data out of process memory while it passes through your HTTP endpoint makes no sense.
Simply use the Azure Blob Storage REST API to upload the file directly to Azure blob storage. Your own HTTP endpoint only needs to issue a shared access signature (SAS) token for the upload, and the client can then upload the file directly to blob storage.
This pattern should be used for file uploads unless you have a very good reason not to. Your trigger function is only called after the HTTP runtime has finished with the HTTP request, so the trigger's HttpRequest object is allocated in process memory and then passed to the trigger.
I also suggest block blobs if you want to upload in multiple stages.
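The SAS-issuing endpoint described above could be sketched roughly like this (untested; the route, query parameter, and the _blobFileStorageClient wrapper from the question are assumptions, and SAS generation requires the client to be authorized with a shared key, e.g. a connection string):

```
[FunctionName("GetUploadSas")]
public IActionResult GetUploadSas(
    [HttpTrigger(AuthorizationLevel.Function, "get", Route = "uploads/sas")] HttpRequest request)
{
    string blobName = request.Query["fileName"];
    BlobClient blob = _blobFileStorageClient.Container.GetBlobClient(blobName);

    // Short-lived SAS that allows the caller to create/write this one blob.
    Uri sasUrl = blob.GenerateSasUri(
        BlobSasPermissions.Create | BlobSasPermissions.Write,
        DateTimeOffset.UtcNow.AddMinutes(15));

    // The client then PUTs the file bytes directly to this URL,
    // sending the header "x-ms-blob-type: BlockBlob".
    return new OkObjectResult(sasUrl.ToString());
}
```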
That's the default way UploadAsync works, and it is fine for small files. I ran into an out-of-memory issue with large files; the solution there is to use an append blob and AppendBlockAsync.
You will need to create the blob as an append blob, so you can keep appending to the end of it. The basic gist is:
Create an append blob
Go through the existing file and grab x MB (say 2 MB) chunks at a time
Append these chunks to the append blob until the end of the file
Pseudocode, something like below:
var appendBlobClient = _blobFileStorageClient.GetAppendBlobClient(fileNamePath);
await appendBlobClient.CreateIfNotExistsAsync();
var maxBlockSize = appendBlobClient.AppendBlobMaxAppendBlockBytes;

using (var input = file.OpenReadStream())
{
    int bytesRead;
    var buffer = new byte[maxBlockSize];
    while ((bytesRead = await input.ReadAsync(buffer, 0, buffer.Length)) > 0)
    {
        // Wrap only the bytes actually read, then append that chunk.
        using (var chunk = new MemoryStream(buffer, 0, bytesRead))
        {
            await appendBlobClient.AppendBlockAsync(chunk);
        }
    }
}

Creating and Uploading Append Blobs to store in Azure Container

I am trying to upload a new append blob file to a container every time a message comes in from a Service Bus. I do not want to append to a blob that is already there; I want to create a whole new append blob and add it at the end.
Is this possible?
I was looking at this article but couldn't quite understand what they meant when they got to the content part: https://azuresdkdocs.blob.core.windows.net/$web/javascript/azure-storage-blob/12.1.1/classes/appendblobclient.html#appendblock
Here is the code that I have so far:
public static async void StoreToBlob(Services service)
{
    // Serialize object
    var sender = JsonConvert.SerializeObject(service);
    // Create a BlobServiceClient object which will be used to create a container client
    BlobServiceClient blobServiceClient = new BlobServiceClient(connectionString);
    // Create the container and return a container client object
    BlobContainerClient containerClient = blobServiceClient.GetBlobContainerClient(containerName);
    // Create the container if it doesn't already exist.
    await containerClient.CreateIfNotExistsAsync();
    // Reference to blob
    AppendBlobClient appendBlobClient = containerClient.GetAppendBlobClient("services" + Guid.NewGuid().ToString() + ".json");
    // Create the blob.
    appendBlobClient.Create();
    await appendBlobClient.AppendBlock(sender, sender.Length); // here is where I am having an issue
}
Can you try something like the following (not tested code):
byte[] blockContent = Encoding.UTF8.GetBytes(sender);
using (var ms = new MemoryStream(blockContent))
{
    await appendBlobClient.AppendBlockAsync(ms);
}
Essentially, we're converting the string to a byte array, creating a stream out of it, and then uploading that stream. Note that AppendBlockAsync takes the stream itself; there is no separate length parameter.

Azure blob storage returns 404 with some files

I have an Azure blob storage setup with a couple of files in it. I am able to download the files into a Stream when they are small (KB-sized), but when the files are a little larger (MB-sized) I get a 404 error. I have manually downloaded one of the images that returns 404 from the portal just fine, resized it, uploaded the smaller image back to the container, and then I can programmatically download it into a stream.
Here is the code that I'm using to download the blob:
private static byte[] PerformDownload(string fileName, CloudBlobContainer container)
{
    var blockBlob = container.GetBlockBlobReference(fileName);
    using (var memoryStream = new MemoryStream())
    {
        blockBlob.DownloadToStream(memoryStream);
        memoryStream.Seek(0, SeekOrigin.Begin);
        var binaryReader = new BinaryReader(memoryStream);
        var bytes = binaryReader.ReadBytes((int)memoryStream.Length);
        return bytes;
    }
}
The container is passed into this method, and as I mentioned, I can download some files from the container without issue, but if you need that code I can add it as well.
The container is retrieved using the standard examples that you find, but here is the code:
private static CloudBlobContainer GetContainer(string containerName)
{
    var storageAccount = CloudStorageAccount.Parse(ConnectionString);
    var container = CreateContainerIfNeeded(storageAccount, containerName);
    return container;
}

private static CloudBlobContainer CreateContainerIfNeeded(CloudStorageAccount storageAccount, string containerName)
{
    var blobClient = storageAccount.CreateCloudBlobClient();
    var container = blobClient.GetContainerReference(containerName);
    container.CreateIfNotExists();
    return container;
}
Also, case sensitivity is not the issue, because the container's name is 2017-106 and the file name is 4448.jpg.
I am able to download the files into a Stream when they are small (KB sized), but when the files are a little larger (MB sized) I get a 404 error.
Currently, the maximum size of a block blob is approximately 4.75 TB, so storing MB-sized data in a block blob should not cause the Azure Blob service to return 404 when you access the blob. A 404 error indicates that the specified blob does not exist. As Gaurav Mantri said, blob names are case-sensitive, so please make sure the file name (blob name) you provided really does exist in your container.
Besides, if only that specific blob cannot be found but it really does exist in your container, you can create a support request to report it.
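As a quick check along these lines, you could verify the blob's existence explicitly before downloading (a sketch using the same legacy CloudBlobContainer API as the question, with fileName and container as in PerformDownload):

```
var blockBlob = container.GetBlockBlobReference(fileName);

// Exists() issues a HEAD request; false means the exact blob name
// (case-sensitive, including any virtual folder prefix) was not found.
if (!blockBlob.Exists())
{
    Console.WriteLine($"Blob '{fileName}' not found in container '{container.Name}'.");
}
```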

Downloading from Azure Blob storage in C#

I have a very basic but working Azure Blob uploader/downloader built on C# ASP.NET.
Except the download portion does not work. This block is called by the webpage, and I simply get no response. The uploads are a mixture of images and raw files. I'm looking for the user to get prompted to select a destination and just have the file download to their machine. Can anyone see where I am going wrong?
[HttpPost]
public void DownloadFile(string Name)
{
    Uri uri = new Uri(Name);
    string filename = System.IO.Path.GetFileName(uri.LocalPath);
    CloudBlobContainer blobContainer = _blobStorageService.GetCloudBlobContainer();
    CloudBlockBlob blob = blobContainer.GetBlockBlobReference(filename);
    using (Stream outputFile = new FileStream("Downloaded.jpg", FileMode.Create))
    {
        blob.DownloadToStream(outputFile);
    }
}
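One likely culprit (an assumption; untested): the action writes the blob to a file on the server ("Downloaded.jpg") and returns void, so the browser never receives any bytes. A sketch of returning the content as a download instead, so the browser shows a save prompt:

```
[HttpPost]
public ActionResult DownloadFile(string Name)
{
    Uri uri = new Uri(Name);
    string filename = System.IO.Path.GetFileName(uri.LocalPath);
    CloudBlobContainer blobContainer = _blobStorageService.GetCloudBlobContainer();
    CloudBlockBlob blob = blobContainer.GetBlockBlobReference(filename);

    var ms = new MemoryStream();
    blob.DownloadToStream(ms);

    // File(...) sets Content-Disposition, prompting the browser to save the file.
    return File(ms.ToArray(), "application/octet-stream", filename);
}
```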

Azure SAS download blob

I'm trying to download a blob with SAS and I'm kind of clueless right now.
I'm listing all the blobs belonging to the user in a view. When a user clicks on a blob, it's supposed to start downloading.
Here is the view:
@foreach (var file in Model)
{
    <a href="@Url.Action("GetSaSForBlob", "Folder", new { blob = file })">
    </a>
}
Here are my two functions, located in the "Folder" controller.
public void GetSaSForBlob(CloudBlockBlob blob)
{
    var sas = blob.GetSharedAccessSignature(new SharedAccessBlobPolicy()
    {
        SharedAccessStartTime = DateTime.UtcNow.AddMinutes(-5),
        SharedAccessExpiryTime = DateTime.UtcNow.AddHours(3),
        Permissions = SharedAccessBlobPermissions.Read | SharedAccessBlobPermissions.Write,
    });
    DownloadFileTest(string.Format(CultureInfo.InvariantCulture, "{0}{1}", blob.Uri, sas));
    //return string.Format(CultureInfo.InvariantCulture, "{0}{1}", blob.Uri, sas);
}

static void DownloadFileTest(string blobSasUri)
{
    CloudBlockBlob blob = new CloudBlockBlob(new Uri(blobSasUri));
    using (MemoryStream ms = new MemoryStream())
    {
        blob.DownloadToStream(ms);
        byte[] data = new byte[ms.Length];
        ms.Position = 0;
        ms.Read(data, 0, data.Length);
    }
}
What should I be passing from my view to GetSaSForBlob? At the moment, the CloudBlockBlob blob parameter is null.
Am I missing any code in the DownloadFileTest function?
Should I be calling DownloadFileTest directly from GetSaSForBlob?
How can I protect these two functions so people can't access them outside the view? They are both static functions now; I'm guessing that is not safe?
1. What is the value of file in your view? I don't think MVC can create a CloudBlockBlob object based on the file you provided, so this might be the reason your CloudBlockBlob is null.
2. In your DownloadFileTest you just download the blob's binaries into a memory stream on your server, and that's all. If you need the user to download the file to their local disk, you need to write the binaries into the response stream. You can use something like blob.DownloadToStream(Response.OutputStream).
3. That's up to you. You can merge them into the same method if you want.
4. If you want the user to download the blob through your web front end (website or web service), as you are doing now, you need to set your blob container to private and secure your website or web service using something like the [Authorize] attribute. Basically, in your case you really don't need to use SAS at all, because all download requests are performed through your web front end.
Hope this helps a bit.
