Azure VM Logs and Azure Storage - C#

I need some advice about Azure services.
Suppose I have a Linux VM in Azure with an application running on it. This application generates log files, and I'm interested in uploading those log files to a directory in Azure Storage. What is the best way to do this? What about security? Can I do it without creating a public directory?

There are several ways to handle this use case; here is the simplest one. Under your storage account, create a file share (under Files). Once it is created, there is a Connect option at the top left that gives you the exact commands to run on your VM to attach the file share. After that, point your application's log files at that mounted drive. From a security standpoint, access to the file share is protected by the storage account keys, so nothing needs to be public.
Step by step guide can be found here: https://learn.microsoft.com/en-us/azure/storage/files/storage-how-to-create-file-share
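Once the share is mounted on the VM, the application needs nothing Azure-specific: it just writes to the mount point with ordinary file I/O. A minimal sketch, assuming the share is mounted at a hypothetical path like /mnt/logshare (the LogWriter class and mount path are made up for this example):

```csharp
using System;
using System.IO;

// Minimal log writer. The directory it is given would be a path on the
// mounted Azure file share (e.g. /mnt/logshare/logs -- hypothetical);
// nothing in the code needs to know it is talking to Azure Files.
class LogWriter
{
    private readonly string _logDirectory;

    public LogWriter(string logDirectory)
    {
        _logDirectory = logDirectory;
        Directory.CreateDirectory(_logDirectory);  // no-op if it already exists
    }

    // Appends one timestamped line; one file per day keeps files small.
    // Returns the path of the file written to.
    public string Append(string message)
    {
        string path = Path.Combine(_logDirectory,
            $"app-{DateTime.UtcNow:yyyy-MM-dd}.log");
        File.AppendAllText(path,
            $"{DateTime.UtcNow:O} {message}{Environment.NewLine}");
        return path;
    }
}
```

Because the share is attached with the storage account's keys, the logs end up in Azure Storage without any public endpoint being involved.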

Related

How to add a trigger that moves a file from an Azure file share to Azure blob storage when the file is uploaded to the file share?

The requirement is to move files from an Azure file share to Azure Blob storage when a user uploads a file to the file share.
I have gone through the link below:
https://feedback.azure.com/forums/287593-logic-apps/suggestions/20324680-add-trigger-for-azure-file-storage
What I have found so far is that a trigger mechanism is not supported on Azure file shares the way it is on Azure blobs. So how can I achieve the same functionality with an Azure file share?
Basically, I want a trigger on the Azure file share so that when a file is uploaded I can execute my custom logic, written in C#, to process the file and upload it to blob storage.
As you mentioned, the functionality is not yet offered as a connector. However, since you have written your custom logic in C#, you can use Azure Functions, which lets you run custom code and can be invoked from a Logic App as if it were another Logic Apps step. You can get the idea from:
https://learn.microsoft.com/en-us/azure/logic-apps/logic-apps-azure-functions
I will also upvote your feedback and discuss its status with the internal teams.
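Since there is no file-share trigger, a common workaround is a timer-triggered Function that polls the share, copies each new file to blob storage, and deletes the share copy only after the copy succeeds. Below is a sketch of just that move ordering; the actual storage calls (e.g. reading the CloudFile and uploading to a blob) are abstracted as delegates here, and the ShareToBlobMover name is made up for this example:

```csharp
using System;
using System.Collections.Generic;

// Core "move" step a polling Azure Function could run on each tick.
// The storage operations are passed in as delegates so the ordering
// logic is visible and testable without an Azure account.
static class ShareToBlobMover
{
    public static int MoveAll(
        IEnumerable<string> shareFiles,     // file names found on the share
        Func<string, bool> copyToBlob,      // returns true when the copy succeeded
        Action<string> deleteFromShare)     // removes the file from the share
    {
        int moved = 0;
        foreach (var name in shareFiles)
        {
            // Delete only after a successful copy, so a failed copy
            // never loses the file; it will be retried on the next tick.
            if (copyToBlob(name))
            {
                deleteFromShare(name);
                moved++;
            }
        }
        return moved;
    }
}
```

In a real Function the two delegates would wrap the storage SDK: open a stream from the share file, upload it to the target blob, then delete the share file.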

How to rename file/directory (not Blob file) in Azure Storage (not Blob storage)?

I thought this would be a very easy job, but after researching I have found nothing on how to rename a file or directory in Azure Storage.
I don't want to do a copy/delete (the files/directories are very large); I just want to change the name of a given file/directory programmatically through C#.
Edit: I'm talking about CloudFile/CloudFileDirectory objects.
Can someone help me?
You are talking about Azure Files. As with Azure Blob Storage, there is no rename/move operation, so you have to copy the file (e.g. using StartCopyAsync) and then delete the original with the CloudFile.DeleteAsync method.
I wrote a blog article about how to Rename Azure Storage Blob using PowerShell (same probably applies to Azure Files)
A straightforward solution through the SDK won't work: the rename operation isn't part of the REST API, so none of the wrappers (Azure CLI, SDK, etc.) have it. The Azure Storage Explorer tool can "rename", but under the hood it copies folders/files rather than renaming them (again, it works via the same API).
The Wailing Wall on the Azure Feedback portal is here, please upvote:
Rename blobs without needing to copy them
Rename, copy, move blob file from azure portal
FileShare mapping
Though, if you have some flexibility, you can mount an Azure file share via SMB on Windows, Linux, or macOS. SMB supports true renames, and to your app it is just normal disk I/O.
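With the share mounted, a rename really is a single call, no copying involved regardless of size. A sketch, assuming a mounted share path (the MountedShareRename name is made up for this example):

```csharp
using System.IO;

// On a mounted Azure file share, File.Move / Directory.Move perform a
// true SMB rename -- no copy, no re-upload, regardless of size.
static class MountedShareRename
{
    public static void RenameFile(string directory, string oldName, string newName)
    {
        File.Move(Path.Combine(directory, oldName),
                  Path.Combine(directory, newName));
    }

    public static void RenameDirectory(string parent, string oldName, string newName)
    {
        Directory.Move(Path.Combine(parent, oldName),
                       Path.Combine(parent, newName));
    }
}
```

The same code works against any directory, which is exactly the point: the application sees ordinary disk I/O.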
If you want to rename an Azure blob folder manually, you can download the Azure Storage Explorer tool linked below, connect to your blob storage, and rename blob folders; you can also clone blob files.
Azure storage explorer tool

Storing Images in Azure and Accessing it in Code

Before I published the website I've been working on to Azure, I kept all the images inside a local "Catalog" folder, which was referenced in the program like:
<img src='/Catalog/Images/Thumbs/<%#:Item.ImagePath%>' />
Now that it is deployed on Azure, I believe I need to turn to something called "unstructured blob storage" to store and retrieve the images for the website.
This is my first time using Azure, I am wondering if it is as easy as storing the images in an unstructured blob storage on Azure, then just changing the "Catalog/Images/Thumbs" to the file path on Azure.
Does anybody know exactly how this works?
Thanks!
AFAIK, after deploying your web application to Azure, you can still store your resources (e.g. images, docs, Excel files) within the web application itself. However, to better manage those resources and reduce the load on your application when serving static content, you can store them in a central data store.
This is my first time using Azure, I am wondering if it is as easy as storing the images in an unstructured blob storage on Azure, then just changing the "Catalog/Images/Thumbs" to the file path on Azure.
Based on your requirement, you could create a blob container named catalog, upload your images to the virtual directory Images/Thumbs, and set anonymous read access on the container and blobs. The simplest way is to use Azure Storage Explorer to upload the images and set the access level on your container.
Your image tag would then look like this:
<img src="https://brucchstorage.blob.core.windows.net/catalog/Images/Thumbs/lake.jpeg">
Moreover, you could leverage AzCopy to copy data to Azure Blob storage using simple commands with optimal performance. You could also use the Azure Storage client library to manage your storage resources programmatically.
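Building that image URL in code is plain string work. A small helper (the BlobUrl name is made up for this sketch), using the account and container from the example above; the virtual directory "Images/Thumbs" is simply part of the blob name:

```csharp
using System;

// Builds the public URL of a blob in a container with anonymous read
// access. "Virtual directories" are just slashes inside the blob name,
// so each path segment is escaped individually.
static class BlobUrl
{
    public static string For(string account, string container, string blobPath)
    {
        string[] segments = blobPath.Split('/');
        for (int i = 0; i < segments.Length; i++)
            segments[i] = Uri.EscapeDataString(segments[i]);
        return $"https://{account}.blob.core.windows.net/{container}/"
             + string.Join("/", segments);
    }
}
```

For example, `BlobUrl.For("brucchstorage", "catalog", "Images/Thumbs/lake.jpeg")` produces the URL shown in the image tag above, so switching from the local Catalog folder mostly amounts to changing the path prefix.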

Create a cloud storage app with ASP.NET and Azure

I need to create a cloud storage application using ASP.NET MVC, C# and integrate it with Azure storage.
I currently have a functional interface which allows users to register and securely stores their details in an SQL database. I also have a basic file uploader using Azure Blob storage that was created using this tutorial as a guideline.
My question regards how to give users their own container/page so that their files are only accessible by them. At the moment, the file uploader and Azure container is shared so that anybody with an account can view and edit the uploads. I want to restrict this so that each user has their own individual space that cannot be read or modified by others.
I have searched for answers but cannot find anything that suits my needs. Any advice would be greatly appreciated.
My question regards how to give users their own container/page so that their files are only accessible by them.
One way to achieve this is by assigning a container to each user. When a user signs up, you create a blob container for them as part of the registration process and store the container name along with the other details about the user. When the user signs in, you fetch this information and show only the files from that container. Similarly, when the user uploads files, you save them in that container.
A few things you would need to consider:
You can't set a hard limit on the size of a container, so a container can grow as large as the storage account itself. If you want to restrict how much data a user can upload, you need to enforce that outside of storage, in your application. You may also want to look into the Azure File Service if that is a requirement: in Azure File Service you can restrict the size of a share (the equivalent of a blob container).
You may even want to load-balance your users across multiple storage accounts to achieve better throughput. If you go down this route, then along with the container name you also need to store the storage account name with the user information.
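One practical wrinkle with the container-per-user approach: blob container names must be 3-63 characters of lowercase letters, digits, and single hyphens, starting with a letter or digit, so a user's email address can't be used directly. A sketch of a sanitizer run at registration time (the UserContainers name and "user-" prefix are arbitrary choices for this example):

```csharp
using System.Text;

// Derives a valid blob container name from a user identifier.
// Container name rules: 3-63 chars, lowercase letters/digits/hyphens,
// must start with a letter or digit, no consecutive hyphens.
static class UserContainers
{
    public static string NameFor(string userId)
    {
        var sb = new StringBuilder("user-");
        char last = '-';
        foreach (char c in userId.ToLowerInvariant())
        {
            bool valid = (c >= 'a' && c <= 'z') || (c >= '0' && c <= '9');
            char mapped = valid ? c : '-';
            if (mapped == '-' && last == '-') continue; // collapse "--" runs
            sb.Append(mapped);
            last = mapped;
        }
        string name = sb.ToString().TrimEnd('-');
        return name.Length <= 63 ? name : name.Substring(0, 63).TrimEnd('-');
    }
}
```

Since two different identifiers could sanitize to the same name, in practice you would append the user's database id or a Guid to guarantee uniqueness, and store the final name with the user record as described above.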
Store the document names in a separate SQL database linked to the user's account. Then display to the user only those files with filenames linked specifically to them, and you should be on your way! I've used this architecture before, and it works like a charm. You should attach a Guid or some other unique identifier to each filename before implementing this model, however.

load balancer question c# asp.net

The place where I work has 2 servers and a load balancer. The setup is horrible, since I have to manually make sure both servers have the same files. I know there are ways to automate this, but it has not been implemented; hopefully it will be soon (I have no control over this). I wrote an application that collects a bunch of information from a user, then creates a folder named after the user's email on one of the servers. The problem is that I can't control which server the folder gets created on. Say a user comes in, fills in his details, and his folder is created on server 1. He goes away for a while and later returns to the site, but this time the load balancer sends him to server 2. Now anything that needs to be saved to his folder fails with an error, because the folder was never created on this server. What can I do about this? Any suggestions?
Thanks
It sounds like you could solve a few of these issues by moving the file writes to a cloud file service such as Amazon S3: http://aws.amazon.com/s3/
Disk size management would no longer be a concern
Files are now written and read from S3 so load balancer concerns are solved
Benefits of a semi-edge network with AWS. (not truly edge but in my experience better than most internally hosted solutions)
Don't store your data in the file system, store it in a database.
If you really can't avoid using the file system, you could look at storing the files in a network share both servers have access to. This would be a terrible hack, however.
It sounds like you may be having a session state issue. The way you describe it sounds odd, but have a look at this article. It's old, but it covers the basics. If it doesn't help, try googling "asp.net session state web farm".
http://ondotnet.com/pub/a/dotnet/2003/03/24/sessionstate.html
Use a NAS or SAN to centralize storage. That same network-accessible storage can hold the shared configuration that IIS can be set up to use.
Web Deploy v2 just released from Microsoft, I would encourage the powers that be to investigate that, along with Application Request Routing and the greater Web Farm Framework.
This is a normal infrastructure setup. Below are the two commonly used solutions for the situation you are in.
If you have network-attached storage available (e.g. NetApp), you can use it to centrally store all of the user files that need to be available across every server in your web farm.
Redesign your application to store all user specific data in a database.
