I have projects which:
use deployed files (e.g. a project can send mail using templates deployed with the project)
use the local file system (e.g. for temp files)
and are running as:
web app on local IIS
windows service on local machine
web role on Azure
worker role on Azure
Right now I have settings with paths to the deployed and local files and directories, but sometimes I have to set absolute paths (Windows service) and sometimes relative ones (Azure web role). Sometimes it is hard to say how a path should be set.
Is there one way of setting paths that would work in all of the above environments?
There's no single way of accessing files across these environments. When your code runs in a web/worker role, your application has access to a specialized folder (local storage) and you read/write files there. When your code runs as a Windows service on a local machine, you can grant it access to any folder on that machine (and the same goes for a web app in local IIS).
I would recommend two options:
Abstract file system access and write a concrete implementation per environment (sketched below).
Use blob storage if possible.
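For the first option, a minimal sketch of what such an abstraction could look like (the interface and class names are invented, and the role branch assumes a local storage resource declared in the service definition):

using System;
using System.IO;
using Microsoft.WindowsAzure.ServiceRuntime; // only needed for the web/worker role case

// Hypothetical abstraction over "where do my temp and deployed files live".
public interface IPathProvider
{
    string GetTempDirectory();
    string GetDeployedFile(string relativePath);
}

// Windows service / local IIS: plain file system paths relative to the install folder.
public class LocalPathProvider : IPathProvider
{
    public string GetTempDirectory() => Path.GetTempPath();

    public string GetDeployedFile(string relativePath) =>
        Path.Combine(AppDomain.CurrentDomain.BaseDirectory, relativePath);
}

// Azure web/worker role: resolve against role local storage and the role root.
public class RolePathProvider : IPathProvider
{
    public string GetTempDirectory() =>
        // "TempStorage" is an assumed <LocalStorage> name from the service definition.
        RoleEnvironment.GetLocalResource("TempStorage").RootPath;

    public string GetDeployedFile(string relativePath) =>
        // The exact layout under %RoleRoot% differs between web and worker roles.
        Path.Combine(Environment.GetEnvironmentVariable("RoleRoot") + @"\", "approot", relativePath);
}

The rest of the code only ever asks an IPathProvider for a path, and the composition root picks the implementation for whichever environment it is running in.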
We have a .NET Core application hosted in an Azure App Service (Windows) in our production environment. It consists of two components -
Email Service
Business Rules Engine
The Email Service first downloads all emails to an Attachments folder in the same directory where the application is hosted (D:\home\wwwroot\). For each email, a separate directory (named with a GUID) is created under the Attachments directory.
The Business Rules Engine accesses that folder and uses the email and its attachments. Once done, we clear out all contents of the Attachments directory.
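Roughly, the per-email flow looks like this (a sketch only; SaveEmailAndAttachments and the rules-engine call stand in for our actual code):

// Each email gets its own GUID-named folder under Attachments (under D:\home\wwwroot\).
var attachmentsRoot = Path.Combine(AppDomain.CurrentDomain.BaseDirectory, "Attachments");
var emailDir = Directory.CreateDirectory(Path.Combine(attachmentsRoot, Guid.NewGuid().ToString()));

SaveEmailAndAttachments(email, emailDir.FullName); // placeholder for the download step
businessRulesEngine.Process(emailDir.FullName);    // placeholder for the rules engine call

// Once processing is done, clear everything under Attachments again.
foreach (var dir in Directory.GetDirectories(attachmentsRoot))
    Directory.Delete(dir, recursive: true);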
The problem we're seeing is that after a certain number of emails are processed, all of a sudden our application is unable to create directories under the Attachments folder. The statement
Directory.CreateDirectory({path})
throws an error saying the specified path could not be found.
The only way we've been able to resolve this is to restart the App Service, after which it happily goes on its way creating directories and processing emails until it fails again in a day or so 8-|
What we've tried -
Ours was a multithreaded app, so assuming one thread might be holding a lock on the file system due to incorrect or incomplete disposal of resources, we changed it to single-threaded processing
Where the directories were being created we used DirectoryInfo, so we tried calling DirectoryInfo.Refresh() after every directory creation, deletion, etc.
Wherever a FileStream was being used, we added explicit .Dispose() calls (see the sketch after this list)
Called GC.Collect() at the end of each run of our service
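The FileStream change amounted to releasing the handle deterministically, along these lines (path and buffer stand in for the real values):

// Release the file handle as soon as the write completes instead of waiting for the finalizer.
var stream = new FileStream(path, FileMode.Create, FileAccess.Write);
try
{
    stream.Write(buffer, 0, buffer.Length);
}
finally
{
    stream.Dispose(); // equivalent to wrapping the stream in a using block
}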
I suspect this issue is due to the Azure environment, but we've not been able to identify what is causing it. Has anybody had any such issues, and if so, how was it resolved?
I made some changes to my code based on what I read in the links below, which give a good summary of the storage system in Azure App Service -
https://www.thebestcsharpprogrammerintheworld.com/2017/12/13/how-to-manually-create-a-directory-on-your-azure-app-service/
https://github.com/projectkudu/kudu/wiki/Understanding-the-Azure-App-Service-file-system
The D:\local directory points to a folder that is accessible only to that instance of the service, unlike what I was using earlier, D:\home, which is shared among instances.
So I changed the code to resolve the %TEMP% environment variable, which resolved to D:\local\Temp, and then used that location to store the downloaded emails.
So far, multiple test runs have been executed without any exceptions related to the file system.
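The change boiled down to something like this (a sketch; the real code also handles the per-email GUID folders and cleanup):

// On a Windows App Service instance, %TEMP% resolves to D:\local\Temp,
// which is per-instance storage rather than the shared D:\home content share.
var tempRoot = Environment.GetEnvironmentVariable("TEMP") ?? Path.GetTempPath();

var emailDir = Path.Combine(tempRoot, "Attachments", Guid.NewGuid().ToString());
Directory.CreateDirectory(emailDir);
// ... download the email and its attachments into emailDir ...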
Yes, based on your issue description, it does look like a sandbox restriction. To provide more detail on this: standard/native Azure Web Apps run in a secure environment called the sandbox. Each app runs inside its own sandbox, isolating its execution from other instances on the same machine as well as providing an additional degree of security and privacy which would otherwise not be available.
Azure App Service provides pre-defined application stacks on Windows like ASP.NET or Node.js, running on IIS. The preconfigured Windows environment locks down the operating system from administrative access, software installations, changes to the global assembly cache, and so on (see Operating system functionality on Azure App Service). If your application requires more access than the preconfigured environment allows, you can deploy a custom Windows container instead.
Symbolic link creation: While sandboxed applications can follow/open existing symbolic links, they cannot create symbolic links (or any other reparse point) anywhere.
Additionally, you can check whether the files have the read-only attribute: go to the Kudu Console ({yoursite}.scm.azurewebsites.net), run attrib somefile.txt, and check whether the output includes the R (read-only) attribute.
Please suggest the best way to copy data from Azure File storage to local machines every 2 hours (periodically). Can we write a C# exe to do that and deploy it on the PC?
Write a desktop application in any language that has SDK support for Azure File Storage. Within that application, create a timer to do your download through the API.
If there are configurable settings or user interactions needed, I'd say go for a desktop application.
Otherwise, and if your clients are Windows PCs, the best way would be to write a Windows service that does the job.
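A minimal sketch of that approach, assuming the Azure.Storage.Files.Shares package and a flat layout in the share (the names are placeholders):

using System;
using System.IO;
using System.Threading.Tasks;
using Azure.Storage.Files.Shares;

class FileShareSyncJob
{
    // Copies every file in the share's root directory down to localRoot, then waits 2 hours and repeats.
    public static async Task SyncLoopAsync(string connectionString, string shareName, string localRoot)
    {
        var share = new ShareClient(connectionString, shareName);
        while (true)
        {
            var rootDir = share.GetRootDirectoryClient();
            await foreach (var item in rootDir.GetFilesAndDirectoriesAsync())
            {
                if (item.IsDirectory) continue; // flat layout assumed for brevity

                var download = await rootDir.GetFileClient(item.Name).DownloadAsync();
                using var local = File.Create(Path.Combine(localRoot, item.Name));
                await download.Value.Content.CopyToAsync(local);
            }
            await Task.Delay(TimeSpan.FromHours(2));
        }
    }
}

The same loop body would work unchanged inside a Windows service; only the hosting (OnStart/OnStop vs. a console Main) differs.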
You could perhaps use Azure Logic Apps - if you set up a job to run periodically and copy files from File Storage to OneDrive, for example, then OneDrive would replicate your files onto an on-premises server.
What is the best way for an admin user of a website to upload 10,000+ images spread across 2,000 sub-directories?
I have a C# MVC .NET web app where, 4 times a year, the business needs to replace 10,000+ images. They have them on a network share: there is 1 parent directory and around 2,000 sub-directories underneath, each housing multiple image files.
I know how to write to blob storage, use parallel Tasks, etc., but how can the app running on Azure navigate the client-side local file storage to find all the files in the sub-directories and upload them?
You can run the AzCopy tool on the local network where the files are, and use the /S flag to copy the files in subfolders: Upload all blobs in a folder
In my opinion, you could simply write a command-line tool or exe for the client admin to upload the files.
The web app has no permission to access the client's resources. If you want your web app to access the client's resources, you need to use special approaches like Relay Hybrid Connections or a VNET.
That would also require the client admin to configure the client machine to allow the Azure web app access.
The easiest way is to write an exe (which automatically uploads the files to Azure Storage using the Data Movement library) that runs as a scheduled job on the client side.
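A sketch of such an exe, assuming the Microsoft.Azure.Storage.DataMovement package (container and folder names are placeholders; namespaces vary by package version):

using System.Threading.Tasks;
using Microsoft.Azure.Storage;
using Microsoft.Azure.Storage.Blob;
using Microsoft.Azure.Storage.DataMovement;

class ImageUploader
{
    // Recursively uploads the local image tree into a blob "directory",
    // preserving the sub-directory structure in the blob names.
    public static async Task UploadImagesAsync(string connectionString, string sourceRoot)
    {
        var account = CloudStorageAccount.Parse(connectionString);
        var container = account.CreateCloudBlobClient().GetContainerReference("images"); // placeholder container
        await container.CreateIfNotExistsAsync();

        var destination = container.GetDirectoryReference("uploads"); // placeholder virtual folder
        var options = new UploadDirectoryOptions { Recursive = true };

        await TransferManager.UploadDirectoryAsync(sourceRoot, destination, options, new DirectoryTransferContext());
    }
}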
We have a suite of applications which are add-ons to an enterprise product, installed onto a Windows server into the Program Files (x86) folder. The applications are written in C#. A share is created on the server allowing users to launch the applications in the installation folder either via Terminal Server or across the network.
The common settings for our applications are stored within a single XML file. Some of these settings only need to be read and are configured by a dedicated application that requires admin rights (as it also performs other functions such as scheduling tasks). Other settings need to be modified by various department managers to suit the way they want the applications to work and should not require admin access - but they need to be persisted in the same file as they are application-specific rather than user-specific.
I am somewhat confused with all of the available options for where the settings file might be stored (including special folders) such that admin access is not required to write to the file, yet the file location is accessible irrespective of whether the user is launching the application via terminal server, network share etc.
Is the Program Files folder the best option, with the necessary permissions created on the share? Or is there a special folder for this scenario? If there is, what is the correct way to access it? (I did try this route, but kept finding the file was being created/updated on the user's local machine rather than on the network share.)
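For reference, the "special folder" route usually means something like the following; note that CommonApplicationData resolves to a per-machine path (e.g. C:\ProgramData) on whichever machine the code runs on, not to the network share, which matches the behaviour described above (the folder names are placeholders):

// Machine-wide (not per-user) application data folder, writable without admin rights
// once the appropriate permissions are granted. This is local to the machine running the code.
var settingsDir = Path.Combine(
    Environment.GetFolderPath(Environment.SpecialFolder.CommonApplicationData),
    "MyCompany", "MySuite"); // placeholder vendor/product names
Directory.CreateDirectory(settingsDir);
var settingsFile = Path.Combine(settingsDir, "settings.xml");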
I have a web application hosted on a web farm. In the application there is functionality whereby a user can write files to a folder located under the virtual directory in IIS, like this:
var compressedFile = FileCompression.GetCompressedAndEncrypted(xml);
File.WriteAllBytes(filePath, compressedFile);
The issue is that only the server handling the client's request gets the updated files, but the requirement is to do this on both servers of the farm simultaneously. There are two servers in the farm.
I want to achieve this programmatically. Please suggest...
You can do one of the following -
store in database (easy to implement, but only good for small files)
store in a common location that all servers can access (like Windows Azure Blob storage; see the sketch below)
use Microsoft Sync Framework to sync files (steep learning curve)
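For the second option, a minimal sketch assuming the Azure.Storage.Blobs package (the container name, connection string, and file name are placeholders):

using Azure.Storage.Blobs;

// Write the compressed file to a container both farm servers can read,
// instead of to a per-server folder under the virtual directory.
var compressedFile = FileCompression.GetCompressedAndEncrypted(xml); // existing helper from the question
var blob = new BlobContainerClient(connectionString, "shared-files").GetBlobClient(fileName);
blob.Upload(new BinaryData(compressedFile), overwrite: true);

Each server then reads the file back from the container on demand rather than relying on its own copy on disk.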