Azure Blob Resource Not Found for Access Level Private - C#

The access level of my container is 'Private (no anonymous access)'. I prefer this access level as I only want authorized users to view my content. However, when I try to access a file via the container URL, I get a ResourceNotFound error.
I guess there are alternative steps to authenticate the validity of the request. Can someone please help me out by letting me know the steps to get a file displayed?
My front end is Angular/HTML.

This is expected behavior. You can't directly access a blob if it is in a blob container with Private ACL.
What you would need to do is create a Shared Access Signature (SAS) on the blob with at least read permission and use the SAS URL of the blob. You can create a Service SAS on the blob.
The way it normally works is that you would create a SAS on the blob in your backend API and then pass that SAS token/URL to your frontend.
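For reference, here is a minimal sketch of the kind of backend code that could produce such a SAS URL, using the Azure.Storage.Blobs package; the connection string, container name and blob name are assumptions and would come from your own configuration.
using System;
using Azure.Storage.Blobs;
using Azure.Storage.Sas;

class BlobSasExample
{
    // Returns a read-only SAS URL for a single blob, valid for one hour.
    public static string GetBlobReadUrl(string connectionString, string containerName, string blobName)
    {
        BlobClient blobClient = new BlobServiceClient(connectionString)
            .GetBlobContainerClient(containerName)
            .GetBlobClient(blobName);

        // GenerateSasUri signs a service SAS with the account key from the connection string.
        return blobClient.GenerateSasUri(BlobSasPermissions.Read, DateTimeOffset.UtcNow.AddHours(1)).ToString();
    }
}
Your Angular frontend can then use the returned URL directly (for example in an img tag or an HTTP GET) as long as the SAS has not expired.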

Related

How to use Single SAS URL for container to read Multiple Blobs [duplicate]

I am generating shared access signatures for blobs inside an azure blob container like so:
string sasToken = blob.GetSharedAccessSignature(new SharedAccessBlobPolicy()
{
    Permissions = SharedAccessBlobPermissions.Read,
    SharedAccessExpiryTime = DateTimeOffset.UtcNow.AddMinutes(30)
});
return new Uri(blob.Uri, sasToken).AbsoluteUri;
This returns a URI string that we can ping to download an individual blob. Great, it works.
However, I need to potentially generate hundreds of these shared access signatures, for many different blobs inside the container. It seems very inefficient to loop through each blob and make this call individually each time.
I know that I can call:
container.GetSharedAccessSignature()
in a similar manner, but how would I use the container's SAS token to distribute SAS tokens for each individual blob inside the container?
Yes, you can.
After you generate the container SAS token, it also works for every blob inside that container.
You just need to append the blob name to the URL, like below:
https://xxxxx/container/blob?container_sastoken
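As a sketch using the same WindowsAzure.Storage types as in the question (the container reference and the list of blob names are assumed to come from your own code), the composition looks like this:
using System;
using System.Collections.Generic;
using System.Linq;
using Microsoft.WindowsAzure.Storage.Blob;

static class ContainerSasExample
{
    public static List<string> GetBlobSasUrls(CloudBlobContainer container, IEnumerable<string> blobNames)
    {
        // One read-only SAS for the whole container.
        string containerSas = container.GetSharedAccessSignature(new SharedAccessBlobPolicy
        {
            Permissions = SharedAccessBlobPermissions.Read,
            SharedAccessExpiryTime = DateTimeOffset.UtcNow.AddMinutes(30)
        });

        // The token already starts with '?', so it can be appended to each blob URL as-is.
        return blobNames
            .Select(name => container.Uri.AbsoluteUri + "/" + name + containerSas)
            .ToList();
    }
}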

Azure Data Lake Gen2 - How do I move files from one folder to another folder using C#

I have provisioned Data Lake Gen2, and in C# I am looking for a way to move a file from one folder to another. With Blob Storage it's simple, but with Data Lake I am confused about which SDK to use and how it can be done in C#.
Also, can I use a SAS token generated at the container level for authentication?
If you want to move a file from one folder to another folder in Azure Data Lake Gen2, please refer to the following code:
public async Task<DataLakeFileClient> MoveFileAsync(DataLakeFileSystemClient fileSystemClient)
{
    // Get a client for the existing file, then rename (move) it to the new path.
    DataLakeFileClient fileClient = fileSystemClient.GetFileClient("<file path>");
    Response<DataLakeFileClient> response = await fileClient.RenameAsync("<new file path>");
    return response.Value;
}
For more details, please refer to here and here
Besides, we can use a SAS token to manage Azure Data Lake Gen2 resources. Regarding how to generate a SAS token, please refer to here.
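As a sketch of the SAS part (the account, file system, paths and the token itself below are placeholders), a container-level (file-system) SAS can simply be embedded in the client URI:
using System;
using System.Threading.Tasks;
using Azure.Storage.Files.DataLake;

class DataLakeSasExample
{
    public static async Task MoveWithSasAsync()
    {
        // The SAS token is appended to the file system URI; no key or AD credential is needed.
        var fileSystemClient = new DataLakeFileSystemClient(
            new Uri("https://<account>.dfs.core.windows.net/<filesystem>?<sas-token>"));

        DataLakeFileClient fileClient = fileSystemClient.GetFileClient("folder1/data.txt");

        // RenameAsync performs the server-side move within the file system.
        await fileClient.RenameAsync("folder2/data.txt");
    }
}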

After successful generation of access token using OAuth 2.0, upload file/folder to OneDrive/SharePoint in web application

Using OAuth 2.0 I have authorized the user and got the access token and refresh token. Using the access token, how do I upload a file to OneDrive?
There is a LOT of official documentation about this topic.
This is the documentation for Downloading a file, where you could take a look at:
GET /drives/{drive-id}/items/{item-id}/content
GET /groups/{group-id}/drive/items/{item-id}/content
GET /me/drive/root:/{item-path}:/content
GET /me/drive/items/{item-id}/content
GET /sites/{siteId}/drive/items/{item-id}/content
GET /users/{userId}/drive/items/{item-id}/content
and this is for Uploading
HTTP request (to replace an existing item)
PUT /drives/{drive-id}/items/{item-id}/content
PUT /groups/{group-id}/drive/items/{item-id}/content
PUT /me/drive/items/{item-id}/content
PUT /sites/{site-id}/drive/items/{item-id}/content
PUT /users/{user-id}/drive/items/{item-id}/content
HTTP request (to upload a new file)
PUT /drives/{drive-id}/items/{parent-id}:/{filename}:/content
PUT /groups/{group-id}/drive/items/{parent-id}:/{filename}:/content
PUT /me/drive/items/{parent-id}:/{filename}:/content
PUT /sites/{site-id}/drive/items/{parent-id}:/{filename}:/content
PUT /users/{user-id}/drive/items/{parent-id}:/{filename}:/content
Also here is the documentation about Drives and DriveItems
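As a rough illustration (not taken from the documentation above), the "upload a new file" request can be issued with HttpClient roughly as follows; the access token is assumed to come from your OAuth 2.0 flow, and the local path and file name are placeholders. For large files, Graph expects an upload session rather than a single PUT.
using System;
using System.IO;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Threading.Tasks;

class GraphUploadExample
{
    public static async Task UploadSmallFileAsync(string accessToken)
    {
        using var http = new HttpClient();
        http.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Bearer", accessToken);

        // Read the local file and send it as the request body.
        byte[] bytes = await File.ReadAllBytesAsync(@"C:\temp\report.pdf");
        var content = new ByteArrayContent(bytes);
        content.Headers.ContentType = new MediaTypeHeaderValue("application/octet-stream");

        // PUT /me/drive/root:/{item-path}:/content creates the file (or replaces it if it exists).
        HttpResponseMessage response = await http.PutAsync(
            "https://graph.microsoft.com/v1.0/me/drive/root:/report.pdf:/content",
            content);
        response.EnsureSuccessStatusCode();
    }
}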

Does StorageConnectionString in AzureWebjobSDK require access to the whole storage account?

I'm trying to use Azure WebJobs SDK to trigger a function when a message is posted on a queue.
This works fine when setting StorageConnectionString to a connection string with the storage account key.
I would like to use a Shared Access Signature (SAS) which has access to that queue (and only that queue) in the StorageConnectionString, but I am getting errors:
Message=Failed to validate Microsoft Azure WebJobs SDK Storage
connection string. The Microsoft Azure Storage account connection
string is not formatted correctly. Please visit
http://msdn.microsoft.com/en-us/library/windowsazure/ee758697.aspx for
details about configuring Microsoft Azure Storage connection strings.
And:
Message=The account credentials for '' are incorrect.
Source=Microsoft.Azure.WebJobs.Host
StackTrace:
at Microsoft.Azure.WebJobs.Host.Executors.DefaultStorageCredentialsValidator.<ValidateCredentialsAsyncCore>d__4.MoveNext()
The connection string I'm using is formatted this way:
BlobEndpoint=https://myaccount.blob.core.windows.net/;QueueEndpoint=https://myaccoount.queue.core.windows.net/queuename;SharedAccessSignature=token
Any chance StorageConnectionString requires access to the whole storage account? If so, do you have an idea what I could do?
Looking at the WebJobs SDK code (https://github.com/Azure/azure-webjobs-sdk/tree/dev/src), it looks like the exception you are facing is thrown by the storage account parser, which parses as follows:
public static StorageAccountParseResult TryParseAccount(string connectionString, out CloudStorageAccount account)
{
    if (String.IsNullOrEmpty(connectionString))
    {
        account = null;
        return StorageAccountParseResult.MissingOrEmptyConnectionStringError;
    }

    CloudStorageAccount possibleAccount;
    if (!CloudStorageAccount.TryParse(connectionString, out possibleAccount))
    {
        account = null;
        return StorageAccountParseResult.MalformedConnectionStringError;
    }

    account = possibleAccount;
    return StorageAccountParseResult.Success;
}
I checked the connection string you sent using CloudStorageAccount and it seems to parse successfully. Notice that you have an unnecessary '/' after the blob endpoint; maybe you are missing some text, and that is causing the parsing to fail.
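For completeness, that check can be reproduced with a few lines (a sketch; the account name and SAS token are placeholders):
using System;
using Microsoft.WindowsAzure.Storage;

class ParseCheck
{
    public static void Main()
    {
        string connectionString =
            "BlobEndpoint=https://myaccount.blob.core.windows.net/;" +
            "QueueEndpoint=https://myaccount.queue.core.windows.net/;" +
            "SharedAccessSignature=<sas-token>";

        // TryParse returns false for a malformed string, which is what triggers
        // the "not formatted correctly" error in the WebJobs SDK.
        bool ok = CloudStorageAccount.TryParse(connectionString, out CloudStorageAccount account);
        Console.WriteLine(ok ? account.QueueEndpoint.ToString() : "Malformed connection string");
    }
}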
Following your description, I referred to the official document Configure Azure Storage Connection Strings, and I occasionally encountered the same error you mentioned: The account credentials for '' are incorrect.
As far as I know, the Azure WebJobs SDK references the Azure Storage client library, which is a wrapper around the Azure Storage REST API. For troubleshooting this kind of issue, you could leverage Fiddler to capture the network traffic. Here is the screenshot from when I caught the above error via Fiddler:
Any chance StorageConnectionString requires access to the whole storage account? If so, do you have an idea what I could do?
I assume there is something wrong with your connection string:
QueueEndpoint=https://myaccoount.queue.core.windows.net/queuename
Here is the connection string from my WebJob project, which includes an account SAS for the Blob and Queue services; you could refer to it:
<add name="AzureWebJobsStorage" connectionString="BlobEndpoint=https://brucechen01.blob.core.windows.net/;QueueEndpoint=https://brucechen01.queue.core.windows.net/;SharedAccessSignature=sv=2015-12-11&ss=bq&srt=sco&sp=rwdlacup&se=2016-12-31T18:39:25Z&st=2016-12-25T10:39:25Z&spr=https&sig={signature}" />
Note: If you are specifying a SAS in a connection string in a configuration file, you may need to encode special characters in the URL (for example, HTML-encode the & characters).
UPDATE:
As you mentioned in the comment below, your SAS includes permissions for the "queuename" queue only. Since your connection string configures a SAS token for both Blob and Queue endpoints, I assume you need to create an account SAS token that covers both the Blob and Queue services. You could leverage Microsoft Azure Storage Explorer to create the SAS token as follows:
Choose your storage account, right click and select "Get Shared Access Signature".
Note: When you replace the value of SharedAccessSignature with the generated SAS token, you need to remove the first ? symbol in your SAS token.
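If you would rather generate the account SAS in code than in Storage Explorer, here is a sketch using the same WindowsAzure.Storage SDK; the account name and key are placeholders, and the services, resource types, permissions and expiry shown are only examples.
using System;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Auth;

class AccountSasExample
{
    public static string CreateBlobAndQueueAccountSas()
    {
        var account = new CloudStorageAccount(
            new StorageCredentials("<account-name>", "<account-key>"), useHttps: true);

        var policy = new SharedAccessAccountPolicy
        {
            Services = SharedAccessAccountServices.Blob | SharedAccessAccountServices.Queue,
            ResourceTypes = SharedAccessAccountResourceTypes.Service
                          | SharedAccessAccountResourceTypes.Container
                          | SharedAccessAccountResourceTypes.Object,
            Permissions = SharedAccessAccountPermissions.Read
                        | SharedAccessAccountPermissions.Write
                        | SharedAccessAccountPermissions.List
                        | SharedAccessAccountPermissions.ProcessMessages,
            SharedAccessExpiryTime = DateTimeOffset.UtcNow.AddDays(7)
        };

        // If the returned token starts with '?', drop that character before placing it
        // after SharedAccessSignature= in the connection string (see the note above).
        return account.GetSharedAccessSignature(policy);
    }
}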

How to get Azure Storage Account Key

I want to upload a file to an Azure Storage account that is automatically generated (as part of a Service Fabric resource group, with a known name), using C#.
I need to upload the file as a blob to allow it to be publicly available.
The tutorial Get started with Azure Blob storage using .NET uses a connection string stored in the App.config file.
Since I want to use the to-be-generated storage account, I can't use such a method.
The preferred method is using the user's Azure AD identity somehow in order to get the key of the storage account.
This link: Get Storage Account Key shows how to GET it with a REST request, so I guess there is a way to do it using C# code.
It seems to me that the solution is using the StorageManagementClient class, which has a StorageAccounts property, though I could not find a way to authenticate it using Azure AD.
I tried using AuthenticationContext.AcquireTokenAsync and acquiring a token for different resources, for instance https://management.azure.com/, but when using the token, I get the following error:
Microsoft.WindowsAzure.CloudException: AuthenticationFailed: The JWT token does not contain expected audience uri 'https://management.core.windows.net/'.
When using the resource https://management.core.windows.net/ I get a different error:
Microsoft.WindowsAzure.CloudException: ForbiddenError: The server failed to authenticate the request. Verify that the certificate is valid and is associated with this subscription.
Is there a different resource I should use, different method, or maybe it's impossible?
To use the Storage Service Management REST API, we need to set the resource to https://management.core.windows.net/ instead of https://management.azure.com/. This is used to operate on classic storage accounts.
https://management.azure.com/ is the endpoint for the new Azure Resource Manager REST API. If you want to handle the new storage account, you need to use this resource. Below is a sample using the new Azure REST API for your reference:
POST: https://management.azure.com/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.Storage/storageAccounts/{storageAccountName}/listKeys?api-version=2016-01-01
Authorization: Bearer {token}
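For example, the request above could be issued from C# roughly like this (a sketch; the bearer token is assumed to have been acquired for the https://management.azure.com/ resource, and the path segments are the same placeholders as above):
using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Threading.Tasks;

class ListKeysExample
{
    public static async Task<string> ListStorageKeysAsync(string managementToken)
    {
        const string url =
            "https://management.azure.com/subscriptions/{subscriptionId}" +
            "/resourceGroups/{resourceGroupName}" +
            "/providers/Microsoft.Storage/storageAccounts/{storageAccountName}" +
            "/listKeys?api-version=2016-01-01";

        using var http = new HttpClient();
        http.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Bearer", managementToken);

        // POST with an empty body; the response JSON contains the storage account keys.
        HttpResponseMessage response = await http.PostAsync(url, content: null);
        response.EnsureSuccessStatusCode();
        return await response.Content.ReadAsStringAsync();
    }
}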
