How to create an empty folder in S3 using C#? [duplicate]

I'm trying to determine if a folder exists on my Amazon S3 Bucket and if it doesn't I want to create it.
At the moment I can create the folder using the .NET SDK as follows:
public void CreateFolder(string bucketName, string folderName)
{
    var folderKey = folderName + "/"; //end the folder name with "/"
    var request = new PutObjectRequest();
    request.WithBucketName(bucketName);
    request.StorageClass = S3StorageClass.Standard;
    request.ServerSideEncryptionMethod = ServerSideEncryptionMethod.None;
    //request.CannedACL = S3CannedACL.BucketOwnerFullControl;
    request.WithKey(folderKey);
    request.WithContentBody(string.Empty);
    S3Response response = m_S3Client.PutObject(request);
}
Now when I try to see if the folder exists using this code:
public bool DoesFolderExist(string key, string bucketName)
{
    try
    {
        S3Response response = m_S3Client.GetObjectMetadata(new GetObjectMetadataRequest()
            .WithBucketName(bucketName)
            .WithKey(key));
        return true;
    }
    catch (Amazon.S3.AmazonS3Exception ex)
    {
        if (ex.StatusCode == System.Net.HttpStatusCode.NotFound)
            return false;

        //status wasn't NotFound, so rethrow the exception
        throw;
    }
}
It cannot find the folder. The strange thing is that if I create the folder using the AWS Management Console, the DoesFolderExist method can see it.
I'm not sure if it's an ACL/IAM thing, and I'm not sure how to resolve it.

Your code actually works for me, but there are a few things you need to be aware of.
As I understand it, Amazon S3 does not have a concept of folders, but individual clients may display S3 objects as if they did. So if you create an object called A/B, a client may display it as an object called B inside a folder called A. This is intuitive and seems to have become a standard, but there is no standard for simulating an empty folder.
For example, I used your method to create a folder called Test, which actually ended up creating an object called Test/. But I created a folder called Test2 in AWS Explorer (i.e. the add-on to Visual Studio) and it ended up creating an object called Test2/Test2_$folder$.
(AWS Explorer will display both Test and Test2 as folders.)
One of the things this means is that you don't need to create the 'folder' before you can use it, so you may not need a DoesFolderExist method at all.
As I mentioned, I tried your code and it works: it finds the Test folder it created, but the key had to be tweaked to find the folder created by AWS Explorer, i.e.
DoesFolderExist("Test/" , bucketName); // Returns true
DoesFolderExist("Test2/" , bucketName); // Returns false
DoesFolderExist("Test2/Test2_$folder$", bucketName); // Returns true
So if you do still want a DoesFolderExist method, it might be safer to just look for any objects that start with folderName + "/", i.e. something like:
ListObjectsRequest request = new ListObjectsRequest();
request.BucketName = bucketName;
request.WithPrefix(folderName + "/");
request.MaxKeys = 1;

using (ListObjectsResponse response = m_S3Client.ListObjects(request))
{
    return (response.S3Objects.Count > 0);
}
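Wrapped up as a method, the prefix-based check might look like this (a minimal sketch in the same SDK v1 style as the rest of this answer, assuming m_S3Client is the same initialized client):

public bool DoesFolderExist(string folderName, string bucketName)
{
    var request = new ListObjectsRequest();
    request.BucketName = bucketName;
    request.WithPrefix(folderName + "/");
    request.MaxKeys = 1; // one match is enough

    using (ListObjectsResponse response = m_S3Client.ListObjects(request))
    {
        // Any object under the prefix counts, no matter which client created
        // the "folder" (both Test/ and Test2/Test2_$folder$ match this way).
        return response.S3Objects.Count > 0;
    }
}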

The code above refactored into an async method with version 2 of the AWS .NET SDK:
public async Task CreateFoldersAsync(string bucketName, string path)
{
    path = path.EnsureEndsWith('/'); // string extension: appends '/' if missing

    IAmazonS3 client = new AmazonS3Client(YOUR.AccessKeyId, YOUR.SecretAccessKey,
        RegionEndpoint.EUWest1);

    var findFolderRequest = new ListObjectsV2Request();
    findFolderRequest.BucketName = bucketName;
    findFolderRequest.Prefix = path;
    findFolderRequest.MaxKeys = 1;

    ListObjectsV2Response findFolderResponse = await client.ListObjectsV2Async(findFolderRequest);
    if (findFolderResponse.S3Objects.Any())
    {
        return; // the "folder" already exists
    }

    PutObjectRequest request = new PutObjectRequest()
    {
        BucketName = bucketName,
        StorageClass = S3StorageClass.Standard,
        ServerSideEncryptionMethod = ServerSideEncryptionMethod.None,
        Key = path,
        ContentBody = string.Empty
    };

    // add try/catch in case you have exception shielding/handling here
    PutObjectResponse response = await client.PutObjectAsync(request);
}
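Hypothetical usage, with placeholder bucket and path names:

await CreateFoldersAsync("my-bucket", "images/40");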

ListObjectsRequest findFolderRequest = new ListObjectsRequest();
findFolderRequest.BucketName = bucketName;
findFolderRequest.Prefix = path;

ListObjectsResponse findFolderResponse = s3Client.ListObjects(findFolderRequest);
bool folderExists = findFolderResponse.S3Objects.Any();
Here path can be something like "images/40/".
The code above checks whether the so-called folder "images/40/" exists under the bucket.
But the Amazon S3 data model does not have the concept of folders. When you copy an image or file to a certain path, if the so-called folder does not exist, it is created automatically as part of the key name of that file or image. Therefore, you actually do not need to check whether the folder exists.
Very important information from docs.aws.amazon.com: The Amazon S3 data model is a flat structure: you create a bucket, and the bucket stores objects. There is no hierarchy of subbuckets or subfolders; however, you can infer logical hierarchy using key name prefixes and delimiters as the Amazon S3 console does.
http://docs.aws.amazon.com/AmazonS3/latest/dev/UsingMetadata.html
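To make the inferred hierarchy concrete, here is a sketch of listing the "subfolders" under a prefix with a delimiter (same ListObjectsV2 API as the async example above; the client, bucket name, and prefix are placeholders):

var request = new ListObjectsV2Request
{
    BucketName = bucketName,
    Prefix = "images/",
    Delimiter = "/" // group keys by the next "/" after the prefix
};
ListObjectsV2Response response = await client.ListObjectsV2Async(request);

// CommonPrefixes holds the inferred "subfolders", e.g. "images/40/"
foreach (string commonPrefix in response.CommonPrefixes)
{
    Console.WriteLine(commonPrefix);
}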

Related

Downloading images from publicly shared folders and sub-folders on Dropbox

This is similar to my previous question:
Downloading images from publicly shared folder on Dropbox
I have this piece of code (simplified version) that needs to download all images from a publicly shared folder and all its sub-folders.
using Dropbox.Api;
using Dropbox.Api.Files;
...
// AccessToken - get it from app console
// FolderToDownload - https://www.dropbox.com/sh/{unicorn_string}?dl=0
using (var dbx = new DropboxClient(_dropboxSettings.AccessToken))
{
    var sharedLink = new SharedLink(_dropboxSettings.FolderToDownload);
    var sharedFiles = await dbx.Files.ListFolderAsync(path: "", sharedLink: sharedLink);
    // var sharedFiles = await dbx.Files.ListFolderAsync(path: "", sharedLink: sharedLink, recursive: true);
    // "recursive: true" throws: Error in call to API function "files/list_folder":
    // Recursive list folder is not supported for shared link.
    foreach (var entry in sharedFiles.Entries)
    {
        if (entry.IsFile)
        {
            var link = await dbx.Sharing.GetSharedLinkFileAsync(url: _dropboxSettings.FolderToDownload, path: "/" + entry.Name);
            var byteArray = await link.GetContentAsByteArrayAsync();
        }
        if (entry.IsFolder)
        {
            var subFolder = entry.AsFolder;
            // var folderContent = await dbx.Files.ListFolderAsync(path: subFolder.Id);
            // var subFolderSharedLink = new SharedLink(???);
        }
    }
}
How do I list entries of all sub-folders?
For any given subfolder, to list its contents, you'll need to call back to ListFolderAsync again, using the same sharedLink value, but supplying a path value for the subfolder, relative to the root folder for the shared link.
For example, if you list the contents of the folder shared link, and one of the entries is a folder with the name "SomeFolder", to then list the contents of "SomeFolder", you would need to make a call like:
await dbx.Files.ListFolderAsync(path: "/SomeFolder", sharedLink: sharedLink);
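Putting that together, here's a minimal sketch that walks every sub-folder by recursing manually, since recursive listing isn't supported for shared links (see the error in the question). The method name is mine, and pagination via HasMore/ListFolderContinueAsync is omitted:

async Task ListAllEntriesAsync(DropboxClient dbx, SharedLink sharedLink, string path)
{
    var result = await dbx.Files.ListFolderAsync(path: path, sharedLink: sharedLink);
    foreach (var entry in result.Entries)
    {
        if (entry.IsFile)
        {
            // path is relative to the shared link's root, e.g. "/SomeFolder"
            Console.WriteLine(path + "/" + entry.Name);
        }
        else if (entry.IsFolder)
        {
            await ListAllEntriesAsync(dbx, sharedLink, path + "/" + entry.Name);
        }
    }
}

// Usage: start at the shared link's root
// await ListAllEntriesAsync(dbx, sharedLink, "");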

Why is my download from Azure storage empty?

I can connect to the Azure Storage account and can even upload a file, but when I go to download the file using DownloadToFileAsync(), I get a 0 KB file as a result.
I have checked and the "CloudFileDirectory" and the "CloudFile" fields are all correct, which means the connection with Azure is solid. I can even write the output from the file to the console, but I cannot seem to save it as a file.
public static string PullFromAzureStorage(string azureFileConn, string remoteFileName, string clientID)
{
    var localDirectory = @"C:\cod\clients\" + clientID + @"\ftp\";
    var localFileName = clientID + "_xxx_" + remoteFileName;

    //Retrieve storage account from connection string
    var storageAccount = CloudStorageAccount.Parse(azureFileConn);
    var client = storageAccount.CreateCloudFileClient();
    var share = client.GetShareReference("testing");

    // Get a reference to the root directory for the share
    CloudFileDirectory rootDir = share.GetRootDirectoryReference();
    //Get a ref to client folder
    CloudFileDirectory cloudFileDirectory = rootDir.GetDirectoryReference(clientID);
    // Get a reference to the directory we created previously
    CloudFileDirectory unprocessed = cloudFileDirectory.GetDirectoryReference("Unprocessed");
    // Get a reference to the file
    CloudFile sourceFile = unprocessed.GetFileReference(remoteFileName);

    //write to console and log
    Console.WriteLine("Downloading file: " + remoteFileName);
    LogWriter.LogWrite("Downloading file: " + remoteFileName);

    //Console.WriteLine(sourceFile.DownloadTextAsync().Result);
    sourceFile.DownloadToFileAsync(Path.Combine(localDirectory, localFileName), FileMode.Create);

    //write to console and log
    Console.WriteLine("Download Successful!");
    LogWriter.LogWrite("Download Successful!");

    //delete remote file after download
    //sftp.DeleteFile(remoteDirectory + remoteFileName);

    return localFileName;
}
In the commented-out line of code where you write the output to the console, you explicitly use .Result because you're calling an async method from a synchronous one. You should either do the same while downloading the file, or make the entire method async.
The first solution would look something like this (note that DownloadToFileAsync returns a plain Task, so you block on it with Wait() rather than Result):
sourceFile.DownloadToFileAsync(Path.Combine(localDirectory, localFileName), FileMode.Create).Wait();
EDIT:
As far as the difference with the comment that uses GetAwaiter().GetResult() goes: .Result and .Wait() wrap any exception that might occur in an AggregateException, while GetAwaiter().GetResult() won't. Anyhow: if there's any possibility you can refactor the method to be async so you can use await, please do so.
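For completeness, a sketch of that async alternative (same setup as the question's method; only the signature and the download line change, and the method name is hypothetical):

public static async Task<string> PullFromAzureStorageAsync(string azureFileConn, string remoteFileName, string clientID)
{
    var localDirectory = @"C:\cod\clients\" + clientID + @"\ftp\";
    var localFileName = clientID + "_xxx_" + remoteFileName;

    var storageAccount = CloudStorageAccount.Parse(azureFileConn);
    var client = storageAccount.CreateCloudFileClient();
    var share = client.GetShareReference("testing");

    CloudFile sourceFile = share.GetRootDirectoryReference()
        .GetDirectoryReference(clientID)
        .GetDirectoryReference("Unprocessed")
        .GetFileReference(remoteFileName);

    // awaiting ensures the download has completed before we return
    await sourceFile.DownloadToFileAsync(Path.Combine(localDirectory, localFileName), FileMode.Create);

    return localFileName;
}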

Upload to Azure Media Services / blob storage with SasLocator not showing uploaded file

So I want to upload videos from a client desktop application to Azure Media Services (which of course uses Azure Storage).
I am trying to do a combination of:
this old documentation: 3 - Uploading Video into Microsoft Azure Media Services
and this relative new documentation: Upload multiple files with Media Services .NET SDK.
The first one shows a perfect example of my scenario, but the second one illustrates how to use BlobTransferClient to upload multiple files and have a "progress" indicator.
The problem: it does seem to upload, and I don't get any error after uploading, yet nothing shows up in the Azure portal / storage account.
It seems to upload because the task takes long, Task Manager shows Wi-Fi upload activity, and Azure Storage shows that (successful) requests are being made.
So, server-side, I create a SasLocator that is valid for a limited time:
public async Task<VideoUploadModel> GetSasLocator(string filename)
{
    var assetName = filename + DateTime.UtcNow;
    IAsset asset = await _context.Assets.CreateAsync(assetName, AssetCreationOptions.None, CancellationToken.None);
    IAccessPolicy accessPolicy = _context.AccessPolicies.Create(assetName, TimeSpan.FromMinutes(10),
        AccessPermissions.Write);
    var locator = _context.Locators.CreateLocator(LocatorType.Sas, asset, accessPolicy);
    var blobUri = new UriBuilder(locator.Path);
    blobUri.Path += "/" + filename;

    var model = new VideoUploadModel()
    {
        Filename = filename,
        AssetName = assetName,
        SasLocator = blobUri.Uri.AbsoluteUri,
        AssetId = asset.Id
    };
    return model;
}
And client-side, I try to upload:
public async Task UploadVideoFileToBlobStorage(string[] files, string sasLocator, CancellationToken cancellationToken)
{
    var blobUri = new Uri(sasLocator);
    var sasCredentials = new StorageCredentials(blobUri.Query);
    //var blob = new CloudBlockBlob(new Uri(blobUri.GetComponents(UriComponents.SchemeAndServer | UriComponents.Path, UriFormat.UriEscaped)), sasCredentials);
    var blobClient = new CloudBlobClient(new Uri(blobUri.GetComponents(UriComponents.SchemeAndServer | UriComponents.Path, UriFormat.UriEscaped)), sasCredentials);
    var blobTransferClient = new BlobTransferClient(TimeSpan.FromMinutes(1))
    {
        NumberOfConcurrentTransfers = 2,
        ParallelTransferThreadCount = 2
    };

    //register events
    blobTransferClient.TransferProgressChanged += BlobTransferClient_TransferProgressChanged;

    //files
    var uploadTasks = new List<Task>();
    foreach (var filePath in files)
    {
        await blobTransferClient.UploadBlob(blobUri, filePath, new FileEncryption(), cancellationToken, blobClient, new NoRetry());
    }

    //StorageFile storageFile = null;
    //if (string.IsNullOrEmpty(file.FutureAccessToken))
    //{
    //    storageFile = await StorageFile.GetFileFromPathAsync(file.Path).AsTask(cancellationToken);
    //}
    //else
    //{
    //    storageFile = await StorageApplicationPermissions.FutureAccessList.GetFileAsync(file.FutureAccessToken).AsTask(cancellationToken);
    //}
    //cancellationToken.ThrowIfCancellationRequested();
    //await blob.UploadFromFileAsync(storageFile);
}
I know I'm probably not doing it correctly with the naming of assets, and with using the progress indicator instead of await, but of course I first want this to work before finishing it.
I configured Azure Media Services to "Connect to Media Services API with service principal", where I created a new Azure AD app and generated keys for it, as on this documentation page. I'm not really sure how this works exactly; I'm a little inexperienced with Azure AD and Azure AD apps (guidance?).
(Screenshots omitted: the upload in progress; the asset created but containing no files; the storage account showing no files either; the storage account showing successful upload requests.)
The reason I can't exactly follow the Upload multiple files with Media Services .NET SDK documentation is that it uses the _context (a Microsoft.WindowsAzure.MediaServices.Client.CloudMediaContext). I can use that _context server-side, but not client-side, because it requires the tenant domain, REST API endpoint, ClientId, and client secret.
I guess uploading via the SasLocator is the correct way(?).
UPDATE 1
When uploading using CloudBlockBlob, it does upload and it is shown in my storage account within an asset, yet when I go to Media Services in Azure and click on the particular asset, it doesn't show any files.
So the code for that:
var blob = new CloudBlockBlob(new Uri(blobUri.GetComponents(UriComponents.SchemeAndServer | UriComponents.Path, UriFormat.UriEscaped)), sasCredentials);

//files
var uploadTasks = new List<Task>();
foreach (var filePath in files)
{
    await blob.UploadFromFileAsync(filePath, CancellationToken.None);
}
I've also tried to upload an asset manually within Azure, i.e. clicking "Upload" in the asset menu and then encoding it. That all works fine.
UPDATE 2:
Digging deeper, I came up with the following (not yet production-proof) way to make it work for now:
1. Get a shared access signature directly from storage and upload to that:
public static async Task<string> GetMediaSasLocator(string filename)
{
    CloudBlobContainer cont = await GetMediaContainerAsync();
    SharedAccessBlobPolicy policy = new SharedAccessBlobPolicy()
    {
        SharedAccessExpiryTime = DateTimeOffset.UtcNow.AddMinutes(60),
        Permissions = SharedAccessBlobPermissions.Write,
        SharedAccessStartTime = DateTimeOffset.UtcNow.AddMinutes(-5)
    };
    await cont.FetchAttributesAsync();
    return cont.Uri.AbsoluteUri + "/" + filename + cont.GetSharedAccessSignature(policy);
}
With this SAS I can upload just like I showed in UPDATE 1; nothing changed there.
2. Create an Azure Function (which was already planned) that handles the asset creation, uploading the file to the asset, encoding, and publishing.
This was done by following this tutorial: Azure Functions Tools for Visual Studio, and then implementing the code illustrated in Upload multiple files with Media Services .NET SDK.
So this "works", but it isn't perfect yet: I still don't have my progress indicator within my client WPF application, and the Azure Function takes quite a long time to complete because we basically "upload" the file again to an asset after it is already in Azure Storage. I would rather use a method that copies from one container to an asset's container.
I got to this point because an Azure Function needs a fixed container name, and since assets create their own containers within a storage account, you can't trigger an Azure Function on those. So to work with Azure Functions, it seems I really have to upload to a fixed container name and do the rest afterwards.
The question still remains: why does uploading a video file to Azure Storage via the BlobTransferClient not work? And if it does work, how do I trigger an Azure Function based on multiple containers? A "path" like asset-{name}/{name}.avi would be preferred.
Eventually it turned out that I needed to specify the base URL in the UploadBlob method, i.e. without the filename itself (which is within the SasLocator URL), so only the container name.
Once I fixed that, I also noticed it didn't upload to the filename I had provided in the SasLocator generated server-side (it includes a customer ID prefix). I had to use one of the other method overloads to get the correct filename.
public async Task UploadVideoFilesToBlobStorage(List<VideoUploadModel> videos, CancellationToken cancellationToken)
{
    var blobTransferClient = new BlobTransferClient();
    //register events
    blobTransferClient.TransferProgressChanged += BlobTransferClient_TransferProgressChanged;
    //files
    _videoCount = _videoCountLeft = videos.Count;
    foreach (var video in videos)
    {
        var blobUri = new Uri(video.SasLocator);
        //create the sasCredentials
        var sasCredentials = new StorageCredentials(blobUri.Query);
        //get the URL without sasCredentials, so only path and filename
        var blobUriBaseFile = new Uri(blobUri.GetComponents(UriComponents.SchemeAndServer | UriComponents.Path,
            UriFormat.UriEscaped));
        //get the URL without the filename (needed for BlobTransferClient; seems like an issue to me)
        var blobUriBase = new Uri(blobUriBaseFile.AbsoluteUri.Replace("/" + video.Filename, ""));

        var blobClient = new CloudBlobClient(blobUriBaseFile, sasCredentials);

        //upload using a stream; the other overload of UploadBlob forces the online filename to be the local filename
        using (FileStream fs = new FileStream(video.FilePath, FileMode.Open, FileAccess.Read, FileShare.ReadWrite))
        {
            await blobTransferClient.UploadBlob(blobUriBase, video.Filename, fs, null, cancellationToken, blobClient,
                new NoRetry(), "video/x-msvideo");
        }
        _videoCountLeft -= 1;
    }

    blobTransferClient.TransferProgressChanged -= BlobTransferClient_TransferProgressChanged;
}
private void BlobTransferClient_TransferProgressChanged(object sender, BlobTransferProgressChangedEventArgs e)
{
    Console.WriteLine("progress, seconds remaining: " + e.TimeRemaining.Seconds);
    double bytesTransfered = e.BytesTransferred;
    double bytesTotal = e.TotalBytesToTransfer;
    double thisProcent = bytesTransfered / bytesTotal;
    double procent = thisProcent;

    //divide by the number of videos
    int videosUploaded = _videoCount - _videoCountLeft;
    if (_videoCountLeft > 0)
    {
        procent = (thisProcent + videosUploaded) / _videoCount;
    }
    procent = procent * 100; //to a real percentage
    UploadProgressChangedEvent?.Invoke((int)procent, videosUploaded, _videoCount);
}
Actually, Microsoft.WindowsAzure.MediaServices.Client.BlobTransferClient should be able to do concurrent uploads, but it has no method for uploading multiple files, even though it has NumberOfConcurrentTransfers and ParallelTransferThreadCount properties; I'm not sure how to use those.
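If you want to experiment with that anyway, here is a sketch that replaces the foreach in UploadVideoFilesToBlobStorage above with a Task.WhenAll fan-out. It assumes BlobTransferClient tolerates concurrent UploadBlob calls, which its NumberOfConcurrentTransfers property hints at but which I haven't verified (the per-video progress bookkeeping is omitted):

var uploadTasks = videos.Select(async video =>
{
    var blobUri = new Uri(video.SasLocator);
    var sasCredentials = new StorageCredentials(blobUri.Query);
    var blobUriBaseFile = new Uri(blobUri.GetComponents(
        UriComponents.SchemeAndServer | UriComponents.Path, UriFormat.UriEscaped));
    var blobUriBase = new Uri(blobUriBaseFile.AbsoluteUri.Replace("/" + video.Filename, ""));
    var blobClient = new CloudBlobClient(blobUriBaseFile, sasCredentials);

    using (var fs = new FileStream(video.FilePath, FileMode.Open, FileAccess.Read, FileShare.ReadWrite))
    {
        await blobTransferClient.UploadBlob(blobUriBase, video.Filename, fs, null,
            cancellationToken, blobClient, new NoRetry(), "video/x-msvideo");
    }
}).ToList();

await Task.WhenAll(uploadTasks);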
I didn't check whether this now works with assets as well, because I now upload every file to one fixed container and later use an Azure Function to process it into an asset, mainly because I can't trigger an Azure Function on a dynamic container name (every asset creates its own container).

Get all files inside a specific folder in a library with UWP

I'm trying to get all the videos in a specific folder inside the Videos library using UWP. Right now I can get all the videos in the Videos library, but I'd like to reduce the results to only those inside the specified folder. My code is this:
Windows.Storage.Search.QueryOptions queryOption = new QueryOptions(CommonFileQuery.OrderByTitle, new string[] { ".mp4" });
queryOption.FolderDepth = FolderDepth.Deep;

var files = await KnownFolders.VideosLibrary.CreateFileQueryWithOptions(queryOption).GetFilesAsync();

StorageFile videoToPlay = (files[new Random().Next(0, files.Count)] as StorageFile);

var stream = await videoToPlay.OpenAsync(Windows.Storage.FileAccessMode.Read);
Player.SetSource(stream, videoToPlay.ContentType);
Debug.WriteLine(Player.Source);
How could I access a subfolder named "Videos to Play" and then get all the videos inside that folder? I tried accessing it by using a path like:
string localfolder = Windows.Storage.ApplicationData.Current.LocalFolder.Path;
var array = localfolder.Split('\\');
var username = array[2];
string[] allVideos = System.IO.Directory.GetFiles("C:/Users/" + username + "/Videos/Videos to Play");
But I get access denied, even though I already requested access to the Videos library (and the fact that the first example works shows that I do have access to it).
try
{
    var folder = await KnownFolders.VideosLibrary.GetFolderAsync("Videos to Play");
}
catch (FileNotFoundException exc)
{
    // TODO: Handle the case when the folder wasn't found on the user's machine.
}
In the folder variable you'll have the reference to the desired folder. Then it's the very same thing you already do, but instead of the KnownFolders.VideosLibrary folder, use this one!
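Put together, a sketch of the whole thing (folder name and query options taken from the question; note the folder variable has to be declared outside the try block if you use it afterwards):

StorageFolder videosToPlay;
try
{
    videosToPlay = await KnownFolders.VideosLibrary.GetFolderAsync("Videos to Play");
}
catch (FileNotFoundException)
{
    return; // the folder wasn't found on the user's machine
}

var queryOption = new QueryOptions(CommonFileQuery.OrderByTitle, new string[] { ".mp4" })
{
    FolderDepth = FolderDepth.Deep
};
var files = await videosToPlay.CreateFileQueryWithOptions(queryOption).GetFilesAsync();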

How to check a folder exists in DropBox using DropNet

I'm writing an app that interacts with Dropbox via the DropNet API. I want to check whether a folder exists on Dropbox so that, if it doesn't, I can create it and then upload a file into it. Everything seems fine, but if the folder already exists, it throws an exception. Like this:
if (isAccessToken)
{
    byte[] bytes = File.ReadAllBytes(fileName);
    try
    {
        string dropboxFolder = "/Public/DropboxManagement/Logs" + folder;
        // I want to check if the dropboxFolder exists here
        _client.CreateFolder(dropboxFolder);
        var upload = _client.UploadFile(dropboxFolder, fileName, bytes);
    }
    catch (DropNet.Exceptions.DropboxException ex)
    {
        MessageBox.Show(ex.Response.Content);
    }
}
I'm not familiar with DropNet, but looking at the source code, it appears you should be able to do this using the GetMetaData() method of your _client object. This method returns a MetaData object.
Example:
//gets contents at the requested path
var metaData = _client.GetMetaData("/Public/DropboxManagement/Logs");

//without knowing how this API works, Path may be a full path and may therefore
//need to be checked against "/Public/DropboxManagement/Logs" + folder
if (metaData.Contents.Any(c => c.Is_Dir && c.Path == folder))
{
    //folder exists
}
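Tying that back to the question's code, a sketch of only creating the folder when it isn't already there (Is_Dir, Path, and Contents are the DropNet MetaData members used above; comparing against the full path is my assumption):

string logsRoot = "/Public/DropboxManagement/Logs";
string dropboxFolder = logsRoot + folder;

var metaData = _client.GetMetaData(logsRoot);
bool folderExists = metaData.Contents.Any(c => c.Is_Dir && c.Path == dropboxFolder);

if (!folderExists)
{
    _client.CreateFolder(dropboxFolder);
}
var upload = _client.UploadFile(dropboxFolder, fileName, bytes);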
