Accessing a file located in Azure Blob Storage from an Azure Timer Trigger function - C#

I have an Azure Timer Trigger function which generates an Excel file, and I need to send it via email as an attachment.
I have done the following successfully:
Created the file and uploaded it to an Azure blob
Sent an email without an attachment (using SmtpClient and MailMessage)
How can I fetch the file from Azure Blob Storage and send it as an attachment?
P.S.
I was able to send it as an attachment when I stored the file in the function's local storage. However, I want to move the file's storage to Azure Blob Storage.

How can I fetch the file from Azure Blob and send it as an attachment?
Per my understanding, you could download your blob into a temporary local file in your function as follows:
// Save blob contents to a file.
using (var fileStream = System.IO.File.OpenWrite(@"path\myfile"))
{
    blockBlob.DownloadToStream(fileStream);
}
Then, you could construct your Attachment as follows:
System.Net.Mail.Attachment attach = new System.Net.Mail.Attachment("{file-path}");
Or you could download your blob directly into a MemoryStream as follows:
using (var memoryStream = new MemoryStream())
{
    blockBlob.DownloadToStream(memoryStream);
    memoryStream.Position = 0; // rewind before the attachment reads the stream
    System.Net.Mail.Attachment attach = new System.Net.Mail.Attachment(memoryStream, "{contentType}");
}
For details, you could follow Download blobs.
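Putting the pieces together, a minimal sketch of the timer-side code might look like this (untested; the container name, blob name, SMTP host, and addresses are all placeholders, not from the original post):

```csharp
using System.IO;
using System.Net.Mail;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

public static class ReportMailer
{
    public static void SendBlobAsAttachment(string connectionString)
    {
        // Resolve the block blob that the function previously uploaded
        CloudStorageAccount account = CloudStorageAccount.Parse(connectionString);
        CloudBlobClient client = account.CreateCloudBlobClient();
        CloudBlobContainer container = client.GetContainerReference("my-container");
        CloudBlockBlob blockBlob = container.GetBlockBlobReference("report.xlsx");

        using (var memoryStream = new MemoryStream())
        {
            blockBlob.DownloadToStream(memoryStream);
            memoryStream.Position = 0; // rewind before the attachment reads it

            var attachment = new Attachment(memoryStream, "report.xlsx",
                "application/vnd.openxmlformats-officedocument.spreadsheetml.sheet");

            using (var message = new MailMessage("from@example.com", "to@example.com"))
            using (var smtp = new SmtpClient("smtp.example.com"))
            {
                message.Subject = "Generated report";
                message.Attachments.Add(attachment);
                // The MemoryStream must stay open until Send completes,
                // which is why everything nests inside the outer using block
                smtp.Send(message);
            }
        }
    }
}
```

Note that the attachment reads from the stream only when the message is sent, so disposing the MemoryStream before `Send` would break it.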

var blob_Archive = Environment.GetEnvironmentVariable("blobName", EnvironmentVariableTarget.Process);
var archiveSheetFile = $"{blob_Archive}/FileName-{DateTime.UtcNow:ddMMyyyy_hh.mm}.csv";
string sBlobNotification = null;
using (var txReader = await binder.BindAsync<TextReader>(
    new BlobAttribute(archiveSheetFile, FileAccess.Read)))
{
    if (txReader != null)
        sBlobNotification = await txReader.ReadToEndAsync();
}
log.LogInformation($"{sBlobNotification}");
You can write the above code in your timer trigger function to get the blob content.
Hope this helps!


Azure.Storage.Blobs downloading a file from an Azure Storage container to a local path - from ASP.NET Core - file path and MIME type?

I have an Azure application built with ASP.NET Core using the MVC pattern. Document uploads are stored in Azure Blob Containers and the C# upload code I wrote is working great.
I am using Azure.Storage.Blobs version 12.14.1
Here is my download blob code:
//get document metadata stored in sql table, id comes from a method param
var document = _unitOfWork.Documents.GetById(id);
if (document == null)
{
    throw new FileNotFoundException();
}
//get connection and container from appsettings.json
string connection = _appConfig.AzureStorageConnection;
string containerName = _appConfig.AzureStorageContainer;
//work with blob client
var serviceClient = new BlobServiceClient(connection);
var container = serviceClient.GetBlobContainerClient(containerName);
var fileName = document.UniqueDocumentName;
var blobClient = container.GetBlobClient(document.UniqueDocumentName);
using (FileStream fileStream = System.IO.File.OpenWrite("<path>"))
{
    blobClient.DownloadTo(fileStream);
}
After I get to the using block that sets up the FileStream, I don't understand what to pass into the OpenWrite method as a path. This application is a B2C app, so how do I prompt a user to download the file?
I do get a file download with the above code, but the file is called download.xml. That is not the file that should be downloaded; I expected the download to be an .odt file.
Documentation seems to be very sparse on downloading from Azure Blob Storage.
EDIT 1.
I got rid of the FileStream and did this instead:
MemoryStream ms = new MemoryStream();
blobClient.DownloadTo(ms);
ms.Position = 0; // rewind after downloading so the result streams from the start
return new FileStreamResult(ms, document.FileType);
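For the "download.xml" naming problem specifically: the browser falls back to a generic name because no file name is set on the result. A hedged sketch of a corrected action, reusing names from the question (`_unitOfWork`, `container`, `document.FileType`) and assuming a hypothetical `document.DocumentName` property holds the user-facing file name:

```csharp
// Sketch only: DocumentName is an assumed property, not from the original code
public async Task<IActionResult> Download(int id)
{
    var document = _unitOfWork.Documents.GetById(id);
    var blobClient = container.GetBlobClient(document.UniqueDocumentName);

    var ms = new MemoryStream();
    await blobClient.DownloadToAsync(ms);
    ms.Position = 0; // rewind AFTER the download, or the response body is empty

    return new FileStreamResult(ms, document.FileType)
    {
        // Setting FileDownloadName emits a Content-Disposition header,
        // so the browser saves the file under this name instead of "download"
        FileDownloadName = document.DocumentName
    };
}
```

The key detail is the `FileDownloadName` property: without it, ASP.NET Core serves the bytes inline and the browser invents a name from the URL.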

Azure Functions: Is there an optimal way to upload a file without storing it in process memory?

I'm trying to upload a file to a blob container via HTTP.
The request receives the file through:
public class UploadFileFunction
{
    //Own created wrapper on BlobContainerClient
    private readonly IBlobFileStorageClient _blobFileStorageClient;

    public UploadFileFunction(IBlobFileStorageClient blobFileStorageClient)
    {
        _blobFileStorageClient = blobFileStorageClient;
    }

    [FunctionName("UploadFile")]
    public async Task<IActionResult> Run(
        [HttpTrigger(AuthorizationLevel.Anonymous, "post", Route = "/equipments/{route}")]
        HttpRequest request,
        [FromRoute] string route)
    {
        IFormFile file = request.Form.Files["File"];
        if (file is null)
        {
            return new BadRequestResult();
        }
        string fileNamePath = $"{route}_{request.Query["fileName"]}_{request.Query["fileType"]}";
        BlobClient blob = _blobFileStorageClient.Container.GetBlobClient(fileNamePath);
        try
        {
            await blob.UploadAsync(file.OpenReadStream(), new BlobHttpHeaders { ContentType = file.ContentType });
        }
        catch (Exception)
        {
            return new ConflictResult();
        }
        return new OkResult();
    }
}
Then I make a request with the file.
On UploadAsync, the whole stream of the file is held in process memory.
Is there some way to upload directly to the blob without loading it into process memory?
Thank you in advance.
The best way to avoid this is to not upload your file via your own HTTP endpoint at all; if the data passes through your endpoint, it will end up in process memory one way or another.
Instead, use the Azure Blob Storage REST API to upload the file directly to Azure Blob Storage. Your own HTTP endpoint only needs to issue a shared access signature (SAS) token for the upload, and the client can then upload the file directly to Blob Storage.
This pattern should be used for file uploads unless you have a very good reason not to. Your trigger function is only called after the HTTP runtime has finished reading the HTTP request, so the trigger's HttpRequest object is already allocated in process memory by the time it is passed to your function.
I also suggest block blobs if you want to upload in multiple stages.
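With Azure.Storage.Blobs 12.x, the SAS-issuing endpoint described above could be sketched roughly like this (the container, permissions, and lifetime are illustrative; `GenerateSasUri` requires the client to have been built with a credential that can sign, such as a connection string containing the account key):

```csharp
using System;
using Azure.Storage.Blobs;
using Azure.Storage.Sas;

public static class SasIssuer
{
    // Returns a short-lived, write-only SAS URL for one blob.
    // The browser/client then PUTs the file straight to Blob Storage,
    // so the bytes never pass through the function's process memory.
    public static Uri GetUploadSasUri(BlobContainerClient container, string blobName)
    {
        BlobClient blob = container.GetBlobClient(blobName);

        return blob.GenerateSasUri(
            BlobSasPermissions.Create | BlobSasPermissions.Write,
            DateTimeOffset.UtcNow.AddMinutes(15)); // expiry keeps the token low-risk
    }
}
```

The function's HTTP endpoint returns this URI as plain text or JSON; the client issues an HTTP PUT to it with the `x-ms-blob-type: BlockBlob` header and the file body.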
That's the default way UploadAsync works, and it is fine for small files. I ran into an out-of-memory issue with large files; the solution here is to use an append blob.
You will need to create the blob as an append blob, so you can keep appending to the end of the blob. The basic gist is:
Create an append blob
Go through the existing file and grab chunks of x MB (say 2 MB) at a time
Append these chunks to the append blob until the end of the file
Pseudo-code, something like below:
var appendBlobClient = _blobFileStorageClient.GetAppendBlobClient(fileNamePath);
await appendBlobClient.CreateIfNotExistsAsync();
var appendBlobMaxAppendBlockBytes = appendBlobClient.AppendBlobMaxAppendBlockBytes;

using (var stream = file.OpenReadStream())
{
    int bytesRead;
    var buffer = new byte[appendBlobMaxAppendBlockBytes];
    while ((bytesRead = stream.Read(buffer, 0, buffer.Length)) > 0)
    {
        // Only the bytes actually read belong in this block
        using (var blockStream = new MemoryStream(buffer, 0, bytesRead))
        {
            appendBlobClient.AppendBlock(blockStream);
        }
    }
}

Trying to download a Word document from an Azure blob

I am trying to download a Word document stored in an Azure blob container with private access, and I need to convert the downloaded document into a byte array so that I can send it to a React app.
This is the code I am trying:
[Authorize, HttpGet("{id}/{projectphase?}")]
public async Task<ActionResult<DesignProject>> GetDesignProject(string id, string projectphase = null)
{
    var blobContainerName = Startup.Configuration["AzureStorage:BlobContainerName"];
    var azureStorageConnectionString = Startup.Configuration["AzureStorage:ConnectionString"];
    BlobContainerClient blobContainerClient = new BlobContainerClient(azureStorageConnectionString, blobContainerName);
    blobContainerClient.CreateIfNotExists();
    ....... // not sure how to proceed further
    .......
    ......
    return new InlineFileContentResult('here i need to return byte array???', "application/docx") { FileDownloadName = fileName };
}
I have the full path where the file is stored, like this:
https://xxxx.blob.core.windows.net/design-project-files/99999-99/99999-99-BOD-Concept.docx
and I also have the file name: 99999-99-BOD-Concept.docx
Could anyone please guide me on how to proceed with downloading the document? I would be very grateful.
Please try something like the following (untested code though):
public async Task<ActionResult<DesignProject>> GetDesignProject(string id, string projectphase = null)
{
    var blobContainerName = Startup.Configuration["AzureStorage:BlobContainerName"];
    var azureStorageConnectionString = Startup.Configuration["AzureStorage:ConnectionString"];
    BlobContainerClient blobContainerClient = new BlobContainerClient(azureStorageConnectionString, blobContainerName);
    blobContainerClient.CreateIfNotExists();
    var blobClient = new BlobClient(new Uri("https://xxxx.blob.core.windows.net/design-project-files/99999-99/99999-99-BOD-Concept.docx"));
    var blobName = blobClient.Name;
    blobClient = new BlobClient(azureStorageConnectionString, blobContainerName, blobName);
    using (var ms = new MemoryStream())
    {
        await blobClient.DownloadToAsync(ms);
        return new InlineFileContentResult(ms.ToArray(), "application/docx") { FileDownloadName = blobName };
    }
}
Basically, we first create a BlobClient from the URL you have so that we can extract the blob's name out of that URL (you could do URL parsing as well). Once we have the blob's name, we create a new BlobClient instance using the connection string, blob container name, and blob name.
Then we download the blob's content as a stream and convert that stream to a byte array (this is the part I am not 100% sure my code would work) and return that byte array.
You don't really need this process where your React app makes a request to your server, your server downloads the file, and then sends it on to the React app. That file in blob storage is already on the web, downloadable from blob storage, so it's somewhat unnecessary to make your server act as a proxy for it.
If you configure public access for blobs, then you just put that URL into your React app: the user clicks it, and the bytes download. Happy days. If you have a private container, you can still generate SAS URLs for the blobs.
If you actually need the bytes in your React app, then just fetch the file with a JavaScript web request; you'll need to set a CORS policy on the blob container, though.
If you really want to download the file to/via the server, you'll probably have to stream it to the response stream connected to the React app, passed in as the SOMETHING below:
BlobClient blob = blobContainerClient.GetBlobClient( BLOB NAME, I.E. PATH INSIDE CONTAINER );
//download to a file or stream
await blob.DownloadToAsync( SOMETHING );
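As a sketch of that streaming variant (untested; `OpenReadAsync` is available in Azure.Storage.Blobs 12.x, and `blobContainerClient` is the client from the question's setup), the blob stream can be handed straight to the MVC `File` helper, which copies it to the response and disposes it:

```csharp
// Generic fallback content type; substitute the real MIME type if known
public async Task<IActionResult> Download(string blobName)
{
    BlobClient blob = blobContainerClient.GetBlobClient(blobName);
    Stream blobStream = await blob.OpenReadAsync(); // lazy, chunked reads from storage
    return File(blobStream, "application/octet-stream", blobName);
}
```

This avoids materializing the whole document as a byte array on the server, which matters once files get large.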

Move files from Azure Storage blob to an FTP server

I need to upload a few files from Azure Storage to an external FTP server.
Is there any way with Azure to upload these files directly, without downloading them first?
You will need to use two classes/libraries and create two methods here:
the WebClient class to download the file from blob storage to your local drive
an FTP library such as WinSCP to move the file
WebClient class:
You need to supply the URI parameter in the format: https://[accountname].blob.core.windows.net/[containername]/[filetodownloadincludingextension]
The download location then becomes the origin of the file to be uploaded to your FTP server.
string uri = "https://[accountname].blob.core.windows.net/[containername]/";
string file = "file1.txt";
string downloadLocation = @"C:\";
WebClient webClient = new WebClient();
Log("Downloading File from web...");
try
{
    webClient.DownloadFile(new Uri(uri + file), downloadLocation + file);
    Log("Download from web complete");
    webClient.Dispose();
}
catch (Exception ex)
{
    Log("Error Occurred in downloading file. See below for exception details");
    Log(ex.Message);
    webClient.Dispose();
}
return downloadLocation + file;
Once the file is downloaded to your local drive, you need to upload it to your FTP/SFTP server. You may use the WinSCP library for this:
string absPathSource = downloadLocation + file;
string destination = "/root/folder"; //this basically is your FTP path

// Setup session options
SessionOptions sessionOptions = new SessionOptions
{
    Protocol = Protocol.Sftp,
    HostName = ConfigurationManager.AppSettings["scpurl"],
    UserName = ConfigurationManager.AppSettings["scpuser"],
    Password = ConfigurationManager.AppSettings["scppass"].Trim(),
    SshHostKeyFingerprint = ConfigurationManager.AppSettings["scprsa"].Trim()
};

using (Session session = new Session())
{
    //disable version checking
    session.DisableVersionCheck = true;
    // Connect
    session.Open(sessionOptions);
    // Upload files
    TransferOptions transferOptions = new TransferOptions();
    transferOptions.TransferMode = TransferMode.Binary;
    TransferOperationResult transferResult;
    transferResult = session.PutFiles(absPathSource, destination, false, transferOptions);
    // Throw on any error
    transferResult.Check();
    // Print results
    foreach (TransferEventArgs transfer in transferResult.Transfers)
    {
        //Console.WriteLine("Upload of {0} succeeded", transfer.FileName);
    }
}
You may include a File.Delete call at the end of the FTP upload code if you want to remove the file from your local hard drive after the upload.
I came across this question whilst looking for the same answer, and I came up with the following solution:
Get the Azure file as a Stream (handled by Azure Functions for you)
Upload the Stream using WebClient
This allowed me to transfer the file directly from Blob Storage to an FTP client. For me, getting the Azure blob file as a Stream was already done, as I was creating an Azure Function based on a blob trigger.
I then converted the Stream to a MemoryStream and passed that to WebClient.UploadData() as a byte array, very roughly something like:
// ... Get the Azure Blob file into a Stream called myBlob
// As mentioned above, the Azure Function does this for you:
// public static void Run([BlobTrigger("containerName/{name}", Connection = "BlobConnection")]Stream myBlob, string name, ILogger log)
public void UploadStreamToFtp(Stream file, string targetFilePath)
{
    using (MemoryStream ms = new MemoryStream())
    {
        // Copy the Stream to the MemoryStream, which already provides ToArray()
        file.CopyTo(ms);
        using (WebClient client = new WebClient())
        {
            // Use login credentials if required
            client.Credentials = new NetworkCredential("username", "password");
            // Upload the data with the FTP STOR method.
            // targetFilePath is a fully qualified file path on the FTP server,
            // e.g. ftp://targetserver/directory/filename.ext
            client.UploadData(targetFilePath, WebRequestMethods.Ftp.UploadFile, ms.ToArray());
        }
    }
}
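A variant of the same idea that avoids buffering the whole file in a MemoryStream is to copy the blob stream straight into the FTP request stream via FtpWebRequest (a rough sketch; the credentials and target path are placeholders, as above):

```csharp
using System.IO;
using System.Net;
using System.Threading.Tasks;

public static class BlobToFtp
{
    // blobStream is the Stream the blob trigger hands you (myBlob above);
    // targetFilePath is e.g. ftp://targetserver/directory/filename.ext
    public static async Task CopyBlobToFtpAsync(Stream blobStream, string targetFilePath)
    {
        var request = (FtpWebRequest)WebRequest.Create(targetFilePath);
        request.Method = WebRequestMethods.Ftp.UploadFile;
        request.Credentials = new NetworkCredential("username", "password");

        using (Stream ftpStream = await request.GetRequestStreamAsync())
        {
            // Copies in chunks, so the file is never held in memory whole
            await blobStream.CopyToAsync(ftpStream);
        }

        using (var response = (FtpWebResponse)await request.GetResponseAsync())
        {
            // StatusDescription reports the server's final reply, e.g. "226 Transfer complete"
        }
    }
}
```

For large blobs this keeps memory usage flat, at the cost of the upload duration being bounded by both the blob read and the FTP write.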

Downloading from Azure Blob storage in C#

I have a very basic but working Azure Blob uploader/downloader built with C# ASP.NET.
Except the download portion does not work. This block is called by the webpage, and I simply get no response. The uploads are a mixture of images and raw files. I'm looking for the user to be prompted to select a destination and just have the file download to their machine. Can anyone see where I am going wrong?
[HttpPost]
public void DownloadFile(string Name)
{
    Uri uri = new Uri(Name);
    string filename = System.IO.Path.GetFileName(uri.LocalPath);
    CloudBlobContainer blobContainer = _blobStorageService.GetCloudBlobContainer();
    CloudBlockBlob blob = blobContainer.GetBlockBlobReference(filename);
    using (Stream outputFile = new FileStream("Downloaded.jpg", FileMode.Create))
    {
        blob.DownloadToStream(outputFile);
    }
}
