What is the best way for an admin user of a website to upload 10,000+ images spread across 2,000 sub-directories?
I have a C# MVC .NET web app where, 4 times a year, the business needs to replace 10,000+ images. They have them on a network share: there is 1 parent directory and around 2,000 sub-directories underneath it, each housing multiple image files.
I know how to write to BLOB storage, use parallel Tasks, etc., but how can the app running on Azure navigate the client-side local file storage to find all the files in the sub-directories and upload them?
You can run the AzCopy tool on the local network where the files live and use the /S flag to copy the files in sub-folders as well: see "Upload all blobs in a folder" in the AzCopy documentation.
In my opinion, I suggest you write a command-line tool or exe for the client admin to run that uploads the files.
A web app has no permission to access the client's resources. If you want your web app to access them, you need to use something like Relay Hybrid Connections or a VNET.
That also requires the client admin to configure the client machine to allow the Azure web app access.
In my opinion, the easiest way is to write an exe (which auto-uploads the files to Azure Storage using the Data Movement library) and run it as a scheduled job on the client side.
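For illustration, a minimal sketch of such an exe, assuming the Microsoft.Azure.Storage.DataMovement package; the connection string, container name, destination directory and network-share path below are placeholders:

using System;
using System.Threading.Tasks;
using Microsoft.Azure.Storage;
using Microsoft.Azure.Storage.Blob;
using Microsoft.Azure.Storage.DataMovement;

class ImageUploader
{
    static async Task Main()
    {
        var account = CloudStorageAccount.Parse("<storage-connection-string>");
        CloudBlobContainer container = account.CreateCloudBlobClient()
                                              .GetContainerReference("images");
        await container.CreateIfNotExistsAsync();

        // The destination "directory" mirrors the local folder structure in the container.
        CloudBlobDirectory destination = container.GetDirectoryReference("quarterly-upload");

        // Recurse through the ~2,000 sub-directories automatically.
        var options = new UploadDirectoryOptions { Recursive = true };
        var context = new DirectoryTransferContext();

        // Many small files benefit from a higher degree of parallelism.
        TransferManager.Configurations.ParallelOperations = 32;

        TransferStatus status = await TransferManager.UploadDirectoryAsync(
            @"\\network-share\parent-directory", destination, options, context);

        Console.WriteLine($"Transferred {status.NumberOfFilesTransferred} files.");
    }
}

UploadDirectoryAsync with Recursive = true walks the sub-directories itself, so the ~2,000 folders never need to be enumerated by hand, and ParallelOperations controls how many of the 10,000+ files are in flight at once.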
Related
My ASP.NET MVC application allows you to download files that are stored in a repository accessible via FTP.
I need to implement the best strategy to serve these files to the client. I could implement a method that downloads the file from FTP and then serves it through a FileResult ... but that clearly does not seem the best way at all (especially for large files, the client would first wait for the application to download the file and then wait a second time for the actual download).
Any indication or help will be appreciated.
If the web server can only access the files over FTP, then that's the way to go.
If the files are on a different server, your web server needs to download them from there (either entirely or streaming) before it can serve them to its HTTP clients.
Alternatively, both servers could share the same file location, either by attaching the same (virtual) disk or through another network protocol such as NFS, SMB, ...
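As a rough sketch of the streaming option (not necessarily how the original answerer would do it), assuming classic ASP.NET MVC and FtpWebRequest, with placeholder host, credentials and controller names:

using System.Net;
using System.Web.Mvc;

public class DownloadController : Controller
{
    public ActionResult Download(string fileName)
    {
        // In real code, validate fileName before building the FTP URI.
        var request = (FtpWebRequest)WebRequest.Create("ftp://ftp.example.com/repository/" + fileName);
        request.Method = WebRequestMethods.Ftp.DownloadFile;
        request.Credentials = new NetworkCredential("ftpUser", "ftpPassword");

        // MVC's FileStreamResult writes (and then disposes) the FTP response stream,
        // so bytes flow to the browser while they are still being pulled from FTP.
        var response = (FtpWebResponse)request.GetResponse();
        return File(response.GetResponseStream(), "application/octet-stream", fileName);
    }
}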
Please suggest the best way for me to copy data from Azure File storage to local machines every 2 hours (periodically). Can we write a C# exe to do that and deploy it on the PC?
Write a desktop application in any language that has SDK support for Azure File Storage. Within that application, create a timer to do your download through the API.
If there are configurable settings or user interactions needed, I'd say go for a desktop application.
Otherwise, and if your clients are Windows PCs, the best way would be to write a Windows service that does the job.
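A minimal sketch of such an app, assuming the classic Microsoft.Azure.Storage.File SDK; the connection string, share name and local path are placeholders, and the loop could equally live inside a Windows service:

using System;
using System.IO;
using System.Threading.Tasks;
using Microsoft.Azure.Storage;
using Microsoft.Azure.Storage.File;

class FileShareSync
{
    static async Task Main()
    {
        var account = CloudStorageAccount.Parse("<storage-connection-string>");
        CloudFileShare share = account.CreateCloudFileClient().GetShareReference("myshare");

        while (true)
        {
            await CopyDirectoryAsync(share.GetRootDirectoryReference(), @"C:\LocalCopy");
            await Task.Delay(TimeSpan.FromHours(2));   // run again in two hours
        }
    }

    static async Task CopyDirectoryAsync(CloudFileDirectory directory, string localPath)
    {
        Directory.CreateDirectory(localPath);
        foreach (IListFileItem item in directory.ListFilesAndDirectories())
        {
            if (item is CloudFile file)
                await file.DownloadToFileAsync(Path.Combine(localPath, file.Name), FileMode.Create);
            else if (item is CloudFileDirectory subDir)
                await CopyDirectoryAsync(subDir, Path.Combine(localPath, subDir.Name));
        }
    }
}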
You could perhaps use Azure Logic Apps - if you set a job to run periodically and copy files from File Storage to OneDrive, for example, then OneDrive would replicate your files onto an on-premises server.
I'm currently trying to save a file to a folder on a network location.
network//location//folder/save-here
The Web API is connected through Azure/VPN/Entity Framework; however, I need to save the file to the protected network location, not just a record in the database.
I've started trying to use a Hybrid Connection, but I'm not sure it will help solve this issue.
What is the best way to achieve saving a file to a folder on a network location from a Web API/Azure?
Unfortunately, in an App Service plan you cannot mount a file share.
If you were using an Azure file share from Azure Storage, you could just save the file using the API. However, since you are trying to save to an on-prem file share, you might need to set up some kind of service (possibly another API) running on-prem that you would call, and it would save the file for you.
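As a sketch of what that on-prem service might look like, assuming ASP.NET Web API 2 with attribute routing enabled; the route, controller name and share path are invented for illustration:

using System.IO;
using System.Threading.Tasks;
using System.Web.Http;

public class FileDropController : ApiController
{
    // Placeholder for the protected network location mentioned in the question.
    private const string SharePath = @"\\network\location\folder\save-here";

    [HttpPost]
    [Route("api/filedrop/{fileName}")]
    public async Task<IHttpActionResult> Post(string fileName)
    {
        // In real code, validate fileName to block path traversal.
        byte[] content = await Request.Content.ReadAsByteArrayAsync();

        // The account this service runs under needs write access to the share.
        File.WriteAllBytes(Path.Combine(SharePath, Path.GetFileName(fileName)), content);
        return Ok();
    }
}

The Azure-hosted Web API would then call this endpoint over the Hybrid Connection/VPN instead of touching the share directly.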
I have a web application hosted on a web farm. In the application there is functionality whereby a user can write files to a folder located in a virtual directory in IIS, like this:
var compressedFile = FileCompression.GetCompressedAndEncrypted(xml);
File.WriteAllBytes(filePath, compressedFile);
The issue here is that only the server that handled the client's request gets the updated files, but the requirement is to do this on both servers of the farm simultaneously. There are two servers in the farm.
I want to achieve this programmatically. Please suggest...
You can do one of the following:
- store in a database (easy to implement, but only good for small files)
- store in a common location that all servers can access (like Windows Azure's Blob storage) - see the sketch below
- use the Microsoft Sync Framework to sync files (steep learning curve)
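As a sketch of the second option, assuming the classic WindowsAzure.Storage SDK and a placeholder container and connection string, the write in the question could target a shared Blob container instead of a local folder:

using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

public static class SharedFileStore
{
    public static void Save(string fileName, byte[] compressedFile)
    {
        var account = CloudStorageAccount.Parse("<storage-connection-string>");
        CloudBlobContainer container = account.CreateCloudBlobClient()
                                              .GetContainerReference("shared-files");
        container.CreateIfNotExists();

        // Every server in the farm resolves the same blob, so no per-server copy is needed.
        CloudBlockBlob blob = container.GetBlockBlobReference(fileName);
        blob.UploadFromByteArray(compressedFile, 0, compressedFile.Length);
    }
}

The call in the question then becomes SharedFileStore.Save(Path.GetFileName(filePath), compressedFile), and both servers read the same blob back.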
I have built a web application which uses two web front-end servers; users are randomly directed to either one through the same URL. The web app has specific functionality to upload and download files. When a file is uploaded, it is stored in a specific directory on the server it was uploaded to.
The issue is that when a user uploads a file to the folder on Server 1, any user trying to download that same file from Server 2 will not be able to, as it only exists on the server where it was uploaded.
What's the best way of solving this? I've been looking at:
- Using a SAN; the problem here is that I don't want to change or create a domain
- Writing a Windows Service; I'd prefer to avoid this if possible, I've not done it before but will give it a go if necessary
Thanks in advance!
Joe
Unless I'm missing something very obvious, all you need is a shared location. This could be a network share addressed through a UNC path, a folder on an FTP server, a database, anything at all, as long as it's:
- shared,
- accessible from both web servers, and
- somewhere the web application's service account has read/write permissions.
Given your requirements, a network share on a file server (perhaps one of the two web servers, the load balancer, or ideally a new server entirely) would be the simplest method.
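As a sketch, assuming a hypothetical \\fileserver\uploads share and standard MVC upload/download plumbing:

using System.IO;
using System.Web;
using System.Web.Mvc;

public class FilesController : Controller
{
    // Placeholder share; the app-pool account on both servers needs read/write here.
    private const string SharePath = @"\\fileserver\uploads";

    [HttpPost]
    public ActionResult Upload(HttpPostedFileBase upload)
    {
        // Path.GetFileName strips any directory component the client sent.
        string target = Path.Combine(SharePath, Path.GetFileName(upload.FileName));
        upload.SaveAs(target);
        return Content("Uploaded " + upload.FileName);
    }

    public ActionResult Download(string fileName)
    {
        string source = Path.Combine(SharePath, Path.GetFileName(fileName));
        return File(source, "application/octet-stream", fileName);
    }
}

Because both servers resolve the same UNC path, it no longer matters which one handled the original upload.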