Can we upload/download files from a Kubernetes pod using C#?

I have an API deployed in a Kubernetes pod. We have a common persistent volume claim (PVC) and want to upload a text file (or a file of any other format) to that location using C# code. A .NET Core API will perform these operations; I am using IFormFile for the file input.
Can we do read/write operations against this storage from these pods using C#?
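Yes: from inside the pod, a mounted PVC is just a directory, so standard System.IO calls work. A minimal sketch of saving an upload to the volume, assuming a hypothetical mount path such as /mnt/shared (whatever mountPath your deployment spec declares):

```csharp
using System.IO;
using System.Threading.Tasks;

public static class VolumeStorage
{
    // Writes an uploaded stream (e.g. IFormFile.OpenReadStream()) to the
    // mounted volume and returns the full path of the stored file.
    public static async Task<string> SaveAsync(Stream upload, string mountPath, string fileName)
    {
        Directory.CreateDirectory(mountPath);           // no-op if it already exists
        string target = Path.Combine(mountPath, fileName);
        await using var output = File.Create(target);
        await upload.CopyToAsync(output);               // streams; avoids buffering the whole file in memory
        return target;
    }

    // Reads a stored file back, e.g. to return it as a FileStreamResult.
    public static Stream Open(string mountPath, string fileName) =>
        File.OpenRead(Path.Combine(mountPath, fileName));
}
```

In a controller this would be called as something like `await VolumeStorage.SaveAsync(formFile.OpenReadStream(), "/mnt/shared", formFile.FileName)`, where "/mnt/shared" must match the PVC mount path in the pod spec.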

Related

How to upload large file using Blazor to WebAPI (C#) and store it into Azure Blob storage

I need to upload a large file (up to 4 GB) using Blazor.
Please use the new version of the Azure Storage SDK (v12, the Azure.Storage.Blobs NuGet package) in your answers.
In my case the old version, Azure Storage SDK v11 (the WindowsAzure.Storage NuGet package), will not work.
Requirements:
1. Upload the file in the Blazor UI.
2. Send that file to the server side via an API.
3. Store that file in Azure Blob Storage on the server side.
I tried a few ways to do this but didn't manage it. Is there any way to do this?
What I tried:
1. Sending the file itself (2 GB) to the API, but I got an out-of-memory exception (and the general advice is not to do it this way; it is bad practice).
2. Cutting the file into chunks and sending them to the API. The first part, sending, works, but I can't collect them all together into one file and store it in Azure Blob Storage (I would have a memory issue).
Is there a way to store it in Azure Blob Storage by, for example, storing the first chunk and then appending the other parts sequentially to one file in the blob, not in service memory? In the end I would have one large file in the blob.
Yes. This is how Block Blobs work in Azure Storage.
Basically, what you will need to do is send the chunks (called blocks in Azure Storage) to your API and save them directly to blob storage using BlockBlobClient.StageBlockAsync. There is no need to save the chunks in your API. You will need to keep track of the block IDs of the chunks you upload in your Blazor application.
Once all the chunks (blocks) are uploaded, you will need to send the block IDs to your API and call BlockBlobClient.CommitBlockListAsync from your API to tell Azure Blob Storage to combine all the blocks into a single blob.
To learn more about creating block blobs by uploading blocks, you may find these links useful:
https://learn.microsoft.com/en-us/rest/api/storageservices/put-block
https://learn.microsoft.com/en-us/rest/api/storageservices/put-block-list
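The chunking and block-ID bookkeeping described above can be sketched as a plain C# helper (an illustrative sketch, not SDK code). The stageBlock callback is where a real API would call blockBlobClient.StageBlockAsync(blockId, chunk) from Azure.Storage.Blobs v12, and the returned list is what you would pass to CommitBlockListAsync afterwards:

```csharp
using System;
using System.Collections.Generic;
using System.IO;
using System.Text;
using System.Threading.Tasks;

public static class ChunkedUploader
{
    // Splits `source` into blocks of up to `blockSize` bytes, invokes
    // `stageBlock` for each one, and returns the ordered block-ID list.
    public static async Task<IReadOnlyList<string>> UploadInBlocksAsync(
        Stream source, int blockSize, Func<string, Stream, Task> stageBlock)
    {
        var blockIds = new List<string>();
        var buffer = new byte[blockSize];
        int read, index = 0;
        while ((read = await source.ReadAsync(buffer, 0, blockSize)) > 0)
        {
            // Block IDs must be Base64 strings, and all IDs within a blob must
            // have the same pre-encoding length; a zero-padded counter ensures that.
            string blockId = Convert.ToBase64String(Encoding.UTF8.GetBytes(index.ToString("D6")));
            index++;
            using var chunk = new MemoryStream(buffer, 0, read, writable: false);
            await stageBlock(blockId, chunk);   // e.g. blobClient.StageBlockAsync(blockId, chunk)
            blockIds.Add(blockId);
        }
        return blockIds;                        // pass to CommitBlockListAsync to finish the blob
    }
}
```

Note that blocks in a block blob do not need to be the same size, so uneven final chunks are fine.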

Creating a local PDF file in Azure Function Linux consumption plan with Microsoft Playwright

I am using Microsoft Playwright to create a PDF from a remote website inside an Azure Function (HTTP trigger). The method used is page.GetPdfAsync(), which only accepts a file path for the generated PDF file. I would like to store the file in an Azure Blob Storage container, but since GetPdfAsync() doesn't accept a stream or an Azure blob target, I am trying to store the generated file temporarily. I tried different local folders such as /tmp or /local, but each time I trigger the function I see an exception in the trace saying that the filesystem is read-only.
I read this blog post from Anthony Chu, so it seems that Playwright is now supported on the Linux consumption plan, but in the article the generated screenshot is sent back directly in the HTTP response and is never stored on the local disk.
For the coding environment, I am using C#, VS 2019 and Azure Function Core Tools + Azure CLI for the deployment.
Any idea how I can handle this scenario?
Page.GetPdfAsync() returns a byte[]. You can pass null for the path (or simply not set it) and upload the resulting byte[] to Azure Blob Storage directly, without ever touching the local disk.
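A minimal sketch of that approach, keeping the PDF entirely in memory. The uploadAsync delegate stands in for a real blob upload; in production you might pass something like `stream => blobClient.UploadAsync(stream, overwrite: true)` from Azure.Storage.Blobs, with the bytes coming from `await page.GetPdfAsync()` (both of those calls are assumptions here; verify against your SDK versions):

```csharp
using System;
using System.IO;
using System.Threading.Tasks;

public static class PdfPublisher
{
    // Wraps the in-memory PDF bytes in a read-only stream and hands it to an
    // upload delegate, so nothing is written to the (read-only) local filesystem.
    public static async Task PublishAsync(byte[] pdfBytes, Func<Stream, Task> uploadAsync)
    {
        using var stream = new MemoryStream(pdfBytes, writable: false);
        await uploadAsync(stream);
    }
}
```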

How to save image on Linux (Debian) using ASP.NET CORE 3

I have to accomplish a new task: saving images on a Linux (Debian) server using ASP.NET Core 3. Any idea how to do that? What are the best practices? Any advice is welcome.
Thanks
If the blob you mentioned is Azure Blob Storage, you can take a look at the following solutions, depending on your needs.
1. If you want ASP.NET Core to save images to Azure Blob Storage, and then move them from blob storage to the Linux file system: first install the NuGet package Microsoft.Azure.Storage.Blob in your ASP.NET Core project, and refer to this link for saving images to Azure Blob Storage in code (note that the linked code targets the older WindowsAzure.Storage package, so you will need to modify it a little). Once the images are in Azure Blob Storage, you can use blobfuse to mount blob storage on your Linux server.
2. If you just want to save images directly from ASP.NET Core to the Linux server, set up an FTP server on the Linux machine and save the images to it via FTP.
Hope it helps.
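Worth noting: once the ASP.NET Core app runs on the Debian server itself, saving an image is ordinary cross-platform file I/O, provided paths are built with Path.Combine rather than hard-coded backslashes. A minimal sketch (the per-user folder layout and parameter names are only examples):

```csharp
using System.IO;
using System.Threading.Tasks;

public static class ImageStore
{
    // Saves an uploaded image under a per-user subfolder using only
    // cross-platform APIs, so the same code runs on Windows and Debian.
    public static async Task<string> SaveImageAsync(
        Stream image, string rootPath, string userName, string fileName)
    {
        // Path.GetFileName strips any directory part a client may have sent.
        string safeName = Path.GetFileName(fileName);
        string userDir = Path.Combine(rootPath, userName);
        Directory.CreateDirectory(userDir);
        string target = Path.Combine(userDir, safeName);
        await using var output = File.Create(target);
        await image.CopyToAsync(output);
        return target;
    }
}
```

On Linux, make sure the account the app runs under has write permission to the root folder (e.g. via chown/chmod).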

Where do I store media in an Azure Web App with C# ASP.NET

I am creating a C# ASP.NET app (using Visual Studio), which I'm hosting on Microsoft Azure. Currently I have a folder in the solution named "Content", in which I store some media; for example, there is a logo that is placed on the website.
The purpose of the web app is to generate a document that a user can download after entering some data. To generate this document I also need to use some media (mainly images), and there can be quite a lot of such images!
Where should I store these images? I currently have them in this "Content" folder as well (in separate subfolders for each user), but I noticed that Azure also has a tab called "Storage". I have tried the service for a bit, but I don't really understand its purpose. Would it be advisable to use it for storing the media and then retrieving it from the web app when necessary, or should I leave the files on the web app server? What is considered good practice?
Thanks in advance for any help
As a starting point, using Blob storage (see the Azure Storage documentation) would be significantly better than file storage on a single web server: it's cheaper and more scalable (pricing tiers for application-server storage are expensive, and in a load-balanced environment you would have to duplicate files or set up a shared multi-server directory). The basic design is that the application uses an SDK to retrieve the bytes and then streams them back to the web browser or other client.
If you anticipate many users downloading the same file, and network performance matters, consider using a Content Delivery Network (CDN).
You should store it in an Azure Storage account and reference it using the SDK. After generating the document, you can use a Shared Access Signature (SAS) to give the user access, and you can limit that access to read or write for a specific period of time.
If you are going to generate videos, you can serve them through Azure Media Services.

Get FileInfo from remote path

I have some files in Azure Storage, e.g.:
http://mywebsite.blob.core.windows.net/scans/1d251700-5457-49c6-abec-c70fa37f77dd.png
I am using an MVC app as my API to process files, etc.
To do what I want, I need to create a FileInfo object from the image.
Is it possible somehow, using a path like the one above?
You need to use the Azure Storage SDK in your application; it can be installed via NuGet in Visual Studio. Once you have it installed, you need to set up an access key for the blob storage location(s) you are reading and writing from, and use the storage client. There is a great how-to on the Azure documentation site. Note that FileInfo only works with local file-system paths, so you will have to download the blob to a local file first.
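The last step, landing the downloaded bytes in a local file so a FileInfo can be created, can be sketched as below. The download source is abstracted as a Stream: in practice it could be the response stream of an HttpClient GET for a public blob, or a read stream obtained from the Storage SDK's blob client (the helper itself is an illustrative sketch, not SDK code):

```csharp
using System.IO;
using System.Threading.Tasks;

public static class RemoteFiles
{
    // Copies remote content (e.g. an HTTP response stream or a blob read
    // stream) to a file in the temp directory and returns a FileInfo for it.
    public static async Task<FileInfo> ToFileInfoAsync(Stream remoteContent, string fileName)
    {
        string path = Path.Combine(Path.GetTempPath(), fileName);
        await using (var local = File.Create(path))
        {
            await remoteContent.CopyToAsync(local);
        }
        return new FileInfo(path);   // FileInfo now points at a real local file
    }
}
```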
