Upload Multiple Files Async in C# WinForms App

I have an existing Java REST API that takes a file and passes it along to an S3 bucket for storage.
I've been given a C# WinForms desktop app targeting .NET Framework 4.7. This app needs to take files (around 300+ JPEGs) from a user-specified folder and upload each of them independently and asynchronously by calling the Java REST API's "upload" endpoint. What is a proper way to make multiple async REST calls from C# so that I can report back to the user as each file is uploaded, and again once all of them have been uploaded?
I've thought about using a Parallel.ForEach loop to process all the files and make a REST call for each, but I wasn't sure whether this was the most efficient approach or whether I could properly get the feedback/progress I need as the files are uploaded and finished.
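One common pattern is to start one Task per file with async/await, throttle concurrency with a SemaphoreSlim, and report progress through IProgress&lt;T&gt; (a Progress&lt;T&gt; created on the UI thread marshals its reports back to it). A minimal sketch, assuming a hypothetical endpoint URL, a multipart/form-data field named "file", and a concurrency limit of 8 (all placeholders, not part of the original question):

```csharp
using System;
using System.IO;
using System.Linq;
using System.Net.Http;
using System.Threading;
using System.Threading.Tasks;

public static class Uploader
{
    private static readonly HttpClient Client = new HttpClient();

    public static async Task UploadAllAsync(string folder, IProgress<string> progress)
    {
        // Throttle to a handful of concurrent requests instead of
        // firing all 300+ at once.
        var throttle = new SemaphoreSlim(8);

        var tasks = Directory.EnumerateFiles(folder, "*.jpg")
            .Select(async file =>
            {
                await throttle.WaitAsync();
                try
                {
                    using (var stream = File.OpenRead(file))
                    using (var content = new MultipartFormDataContent())
                    {
                        content.Add(new StreamContent(stream), "file",
                            Path.GetFileName(file));
                        // Placeholder URL; substitute the real endpoint.
                        var response = await Client.PostAsync(
                            "https://example.com/api/upload", content);
                        response.EnsureSuccessStatusCode();
                    }
                    // Report per-file completion back to the caller.
                    progress?.Report(Path.GetFileName(file) + " uploaded");
                }
                finally
                {
                    throttle.Release();
                }
            })
            .ToList();

        await Task.WhenAll(tasks);
        progress?.Report("All files uploaded");
    }
}
```

From a WinForms button handler this could be called as `await Uploader.UploadAllAsync(folderPath, new Progress<string>(msg => statusLabel.Text = msg));`, so each report updates the UI safely without explicit Invoke calls.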

Related

How to serve files stored in FTP space

My ASP.NET MVC application allows you to download files that are stored in a repository accessible via FTP.
I need to find the best strategy to serve these files to the client. I could implement a method that downloads the file from FTP and then serves it through a FileResult, but that clearly doesn't seem like the best way: especially with large files, the client would first wait for the application to download the file from FTP, and then wait a second time for the actual download to the browser.
Any pointers or help would be appreciated.
If the web server can only access the files over FTP, then that's the way to go.
If the files are on a different server, your web server needs to download them from there (either entirely or streaming) before it can serve them to its HTTP clients.
Alternatively, both servers could share the same file location, either by attaching the same (virtual) disk or through another network protocol such as NFS, SMB, ...
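To make the streaming option concrete, here is a minimal sketch of an MVC action that opens the FTP download stream and hands it directly to a FileStreamResult, so the client's download starts while the FTP transfer is still in progress. The host, path, and credentials are placeholders:

```csharp
using System.IO;
using System.Net;
using System.Web.Mvc;

public class FilesController : Controller
{
    public ActionResult Download(string name)
    {
        // Placeholder FTP host/path and credentials.
        var request = (FtpWebRequest)WebRequest.Create(
            "ftp://ftp.example.com/repo/" + name);
        request.Method = WebRequestMethods.Ftp.DownloadFile;
        request.Credentials = new NetworkCredential("user", "pass");

        var response = (FtpWebResponse)request.GetResponse();
        Stream ftpStream = response.GetResponseStream();

        // FileStreamResult streams the FTP response to the client as
        // it arrives instead of buffering the whole file first; MVC
        // disposes the stream after writing it. (For brevity this
        // sketch does not dispose the FtpWebResponse itself.)
        return new FileStreamResult(ftpStream, "application/octet-stream")
        {
            FileDownloadName = name
        };
    }
}
```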

C# batch up operations to load in remote database asynchronously

I have several front end web app servers, running ASP.NET MVC apps, which collect some analytics data on visitors.
I want to collect this data into a central database, via a Web API call.
I don't want to lose data, but I also don't want the front end servers to slow or stop in the event that the webapi/db server is not available.
I'm envisioning writing all these events to a log file and then running an async process to ship them to the central Web API/DB, with retries in case the server is unavailable for a time.
Is there some standard library or method for performing this, without building a bunch of custom code around it?
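There isn't one canonical library for this, but the shape the question describes (local buffering plus an async shipper with retry) is small to sketch. A minimal in-memory version, with the central endpoint URL as a placeholder; for real durability you would persist the queue to disk (or use a durable logger or message queue), as the question suggests, so events survive a process restart:

```csharp
using System;
using System.Collections.Concurrent;
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;

public class AnalyticsShipper
{
    private static readonly HttpClient Client = new HttpClient();
    private readonly ConcurrentQueue<string> _events = new ConcurrentQueue<string>();

    // Called from request handling; never blocks on the remote server.
    public void Enqueue(string jsonEvent) => _events.Enqueue(jsonEvent);

    // Runs in the background; retries on failure without losing events.
    public async Task ShipLoopAsync()
    {
        while (true)
        {
            if (_events.TryDequeue(out var evt))
            {
                try
                {
                    // Placeholder URL for the central Web API.
                    var response = await Client.PostAsync(
                        "https://central.example.com/api/events",
                        new StringContent(evt, Encoding.UTF8, "application/json"));
                    response.EnsureSuccessStatusCode();
                }
                catch
                {
                    _events.Enqueue(evt);                        // put it back
                    await Task.Delay(TimeSpan.FromSeconds(30));  // back off
                }
            }
            else
            {
                await Task.Delay(TimeSpan.FromSeconds(1));       // idle poll
            }
        }
    }
}
```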

Creating a zip file with C# more efficiently

Short Version
Is there a more efficient or less-resource-intensive way in C# to create a zip file of a folder recursively than using System.IO.Compression.ZipFile.CreateFromDirectory?
Long Version
I have a REST API running in an Azure App Service (Scaled at P2V2 420 total ACU and 7 GB memory). One of my endpoints accepts a POST request and generates a zip file using the following code. It then returns the name of the zip file that was generated wrapped in a standard JSON format.
System.IO.Compression.ZipFile.CreateFromDirectory(sourcePath, outputPath);
This runs fine on my local machine, but appears to have scaling issues on the cloud server. When I scale up to the next app service level (P3V2, 820 ACU, 14 GB memory) it runs fine, but without scaling up the API returns a 503 Service Unavailable. I am not sure which resource is constrained (CPU, memory, disk, etc.), but I don't think it's a server timeout issue, since it only runs for about 10-20 seconds. The generated zip file contains roughly 400 files and is about 200 MB compressed.
My question is: since scaling up costs an extra $120 per month, is there a way to generate the zip file more efficiently so I don't need to scale up the server? Would it help at all to split the zip creation into separate create-and-add-files steps, using multiple calls rather than CreateFromDirectory(...)? (See the sketch after this question.)
Other info:
This endpoint is not being hit by anyone but me
I can try to create multiple smaller zip files to see if this helps with resource consumption, but this is not ideal for my use case.
This is not the only thing running on this server which is why it's already scaled up so much
Just to clarify, the zip file does actually get created on the server and I am able to see a log entry that gets logged just before response goes back to the user, and I can see the zip file in kudu. But Azure intercepts the 200 API response for some reason and sends a 503 instead.
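To make the split-steps idea from the question concrete: a hedged sketch using ZipArchive directly, which also lets you drop to CompressionLevel.Fastest to trade compression ratio for lower CPU use. Whether this avoids the 503 depends on which resource is actually constrained:

```csharp
using System.IO;
using System.IO.Compression;

public static class Zipper
{
    public static void ZipFolder(string sourcePath, string outputPath)
    {
        using (var zipStream = new FileStream(outputPath, FileMode.Create))
        using (var archive = new ZipArchive(zipStream, ZipArchiveMode.Create))
        {
            foreach (var file in Directory.EnumerateFiles(
                sourcePath, "*", SearchOption.AllDirectories))
            {
                // Preserve the relative path inside the archive,
                // using forward slashes per the zip convention.
                var entryName = file.Substring(sourcePath.Length)
                    .TrimStart(Path.DirectorySeparatorChar)
                    .Replace('\\', '/');

                // Fastest trades compression ratio for lower CPU use.
                archive.CreateEntryFromFile(file, entryName,
                    CompressionLevel.Fastest);
            }
        }
    }
}
```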

Bulk upload large number of files to Azure from local file system

What is the best way for an admin user of a website to upload 10,000+ images spread across 2,000 sub-directories?
I have a C# ASP.NET MVC web app where four times a year the business needs to replace 10,000+ images. They have them on a network share: there is one parent directory, with around 2,000 sub-directories underneath, each housing multiple image files.
I know how to write to blob storage, use parallel Tasks, etc., but how can the app running on Azure navigate the client-side local file storage to find all the files in the sub-directories and upload them?
You can run the AzCopy tool on the local network where the files are, and use the /S flag to copy files in sub-folders recursively (see "Upload all blobs in a folder" in the AzCopy documentation).
In my opinion, I suggest writing a command-line tool or exe for the client admin to upload the files.
A web app has no permission to access the client's resources. If you want your web app to access client resources, you need special setups such as Relay Hybrid Connections or a VNET.
These also require the client admin to configure the client machine to allow the Azure web app access.
The easiest way, in my opinion, is to write an exe (which auto-uploads the files to Azure Storage using the Data Movement library) and run it as a scheduled job on the client side.
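A minimal sketch of such a client-side exe using the Azure Storage Data Movement library (namespaces vary by package version; the connection string, container name, and share path below are placeholders):

```csharp
using System;
using System.Threading.Tasks;
using Microsoft.Azure.Storage;
using Microsoft.Azure.Storage.Blob;
using Microsoft.Azure.Storage.DataMovement;

class Program
{
    static async Task Main()
    {
        // Placeholder connection string and container name.
        var account = CloudStorageAccount.Parse("<storage-connection-string>");
        var container = account.CreateCloudBlobClient()
            .GetContainerReference("images");
        await container.CreateIfNotExistsAsync();

        var options = new UploadDirectoryOptions { Recursive = true };
        var context = new DirectoryTransferContext();
        context.FileTransferred += (s, e) =>
            Console.WriteLine("Uploaded: " + e.Source);

        // Walks the ~2,000 sub-directories and uploads in parallel.
        await TransferManager.UploadDirectoryAsync(
            @"\\share\parent", // placeholder network share path
            container.GetDirectoryReference(""),
            options, context);
    }
}
```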

Upload Multiple Large Files via FTP with Progress Bar using ASP.NET C#

I need to upload large files (>2GB) to a web server, and ASP.NET has a 2GB file-upload limit. So what I would like to know is whether it's possible to upload files using FTP as I do with HTTP. In other words, is it possible to do an asynchronous (multiple) file upload with a progress bar using FTP?
I already have an async file upload with a progress bar using a handler (ashx) to send multiple files to a web server. Can I reuse this method to upload files via FTP, or do I need a totally different approach?
As mentioned, I need to upload large files to a server, so any other solution that can help me accomplish the task would be much appreciated.
Managed to solve my issue using ResumableJS
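For anyone who still wants the raw FTP route from the question, here is a minimal sketch of an async FTP upload with percentage progress, using FtpWebRequest with a placeholder URL and credentials; FTP itself has no 2 GB request limit:

```csharp
using System;
using System.IO;
using System.Net;
using System.Threading.Tasks;

public static class FtpUploader
{
    public static async Task UploadViaFtpAsync(string localPath, IProgress<int> progress)
    {
        // Placeholder FTP host/path and credentials.
        var request = (FtpWebRequest)WebRequest.Create(
            "ftp://ftp.example.com/uploads/" + Path.GetFileName(localPath));
        request.Method = WebRequestMethods.Ftp.UploadFile;
        request.Credentials = new NetworkCredential("user", "pass");

        using (var source = File.OpenRead(localPath))
        using (var target = await request.GetRequestStreamAsync())
        {
            var buffer = new byte[81920];
            long sent = 0;
            int read;
            while ((read = await source.ReadAsync(buffer, 0, buffer.Length)) > 0)
            {
                await target.WriteAsync(buffer, 0, read);
                sent += read;
                // Report percent complete for a progress bar.
                progress?.Report((int)(sent * 100 / source.Length));
            }
        }
    }
}
```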
