My application allows users to enter an Azure Blob Storage SAS URL. How would I go about validating it? I'm using the Azure Storage Blobs client library, and there doesn't seem to be any way of validating SAS URLs without actually performing a blob operation (which I don't want to do).
The validation operation can be asynchronous and involve an API call if necessary (i.e. it can be triggered with a button).
public class SASURLValidator
{
    public async Task<bool> ValidateSASURL(string sasURL)
    {
        // What goes here?
    }

    public async Task Test()
    {
        var result = await ValidateSASURL("https://blobstorageaccountname.blob.core.windows.net/containerName?sp=w&st=2022-02-15T02:07:49Z&se=2022-03-15T10:07:49Z&spr=https&sv=2020-08-04&sr=c&sig=JDFJEF342JDSFJIERJsdjfkajiwSKDFJIQWJIFJSKDFJWE%3D");
        // result should be true if the above is a valid SAS
    }
}
You may test the list access or the write-and-delete access. Depending on your scenario you can use one or both. You could also modify the sample to test read access to a single file.
private async Task TestSasAsync(String uri, bool testWriteAndDelete, bool testList)
{
    try
    {
        var cloudBlobContainer = new CloudBlobContainer(new Uri(uri));
        if (testList)
        {
            foreach (var blob in cloudBlobContainer.ListBlobs())
            {
                Console.WriteLine(blob.Uri);
            }
        }
        if (testWriteAndDelete)
        {
            var blockBlob = cloudBlobContainer.GetBlockBlobReference("testBlob.txt");
            await blockBlob.UploadTextAsync("Hello world");
            await blockBlob.DeleteAsync();
        }
    }
    catch (Exception ex)
    {
        throw new Exception("Failed to validate SAS Uri: " + ex.Message, ex);
    }
}
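Note that the sample above uses the classic WindowsAzure.Storage types. If you are on the newer Azure.Storage.Blobs (v12) package mentioned in the question, a similar probe might look like the sketch below. It is an assumption-laden sketch, not a definitive implementation: it assumes a container-level SAS with at least write permission (matching the `sp=w` example URL), and the probe blob name is illustrative.

```csharp
using System;
using System.IO;
using System.Text;
using System.Threading.Tasks;
using System.Web;
using Azure;
using Azure.Storage.Blobs;

public class SasUrlValidator
{
    public async Task<bool> ValidateSasUrlAsync(string sasUrl)
    {
        // Cheap local checks first: a well-formed URI carrying a signature
        // and an unexpired "se" (expiry) parameter.
        if (!Uri.TryCreate(sasUrl, UriKind.Absolute, out var uri)) return false;
        var query = HttpUtility.ParseQueryString(uri.Query);
        if (string.IsNullOrEmpty(query["sig"])) return false;
        if (DateTimeOffset.TryParse(query["se"], out var expiry) && expiry < DateTimeOffset.UtcNow) return false;

        // The signature itself can only be verified by the service, so probe
        // with a small round trip that matches the permissions you expect (sp=w here).
        try
        {
            var container = new BlobContainerClient(uri);
            var blob = container.GetBlobClient($"sas-probe-{Guid.NewGuid():N}.tmp");
            using var content = new MemoryStream(Encoding.UTF8.GetBytes("probe"));
            await blob.UploadAsync(content);
            try { await blob.DeleteAsync(); } catch (RequestFailedException) { /* SAS may lack delete permission */ }
            return true;
        }
        catch (RequestFailedException)
        {
            return false; // bad signature, expired window, wrong permissions, etc.
        }
    }
}
```

Tailor the probe operation to the permissions the SAS is supposed to grant; a read-only SAS would need a read probe instead of an upload.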
Related
I am currently working on a problem I've encountered while using Azure Blob Storage together with C# API. I also didn't find a fitting solution in the questions here since most of them just download files once and they're done.
What I want to achieve is to have an API as a proxy for handling file downloads for my mobile clients. Therefore I need fast response / fast first byte responses since the mobile applications have a rather low timeout of five seconds.
[HttpGet, Route("{id}")]
[Authorize(Policy = xxxxx)]
public async Task<FileStreamResult> Get(Guid tenantId, Guid id)
{
    if (tenantId == default)
    {
        throw new ArgumentException($"Tenant id '{tenantId}' is not valid.");
    }
    if (id == default)
    {
        throw new ArgumentException($"Package id '{id}' is not valid.");
    }
    var assetPackage = await _assetPackageService.ReadPackage(myenum.myvalue, tenantId, id).ConfigureAwait(false);
    if (assetPackage == null)
    {
        return File(new MemoryStream(), "application/octet-stream");
    }
    return File(assetPackage.FileStream, assetPackage.ContentType);
}
public async Task<AssetPackage> ReadPackage(AssetPackageContent packageContent, Guid tenantId, Guid packageId)
{
    var blobRepository = await _blobRepositoryFactory.CreateAsync(_settings, tenantId.ToString())
        .ConfigureAwait(false);
    var blobPath = string.Empty;
    //some missing irrelevant code
    var blobReference = await blobRepository.ReadBlobReference(blobPath).ConfigureAwait(false);
    if (blobReference == null)
    {
        return null;
    }
    var stream = new MemoryStream();
    await blobReference.DownloadToStreamAsync(stream).ConfigureAwait(false);
    stream.Seek(0, SeekOrigin.Begin);
    return new AssetPackage(packageContent, stream, blobReference.Properties.ContentType);
}
I am aware that MemoryStream is a poor choice for downloads, since it buffers the whole file in memory before anything reaches the client.
How would you tackle this? Is there an easy solution to have my API act as a proxy, rather than downloading the whole file and then letting the client download it again from my API?
A possible and working solution is - as silent mentioned - putting the Azure Storage account behind Azure API Management. You could add authorization there, or work with SAS links, which might or might not fit your application.
I followed this guide to set up my architecture and it works flawlessly. Thanks to silent for the initial idea.
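If putting API Management in front is too heavy for your case, the MemoryStream buffering inside `ReadPackage` can also be avoided by handing ASP.NET Core a stream that reads from storage lazily. The following is only a sketch, reusing the question's own hypothetical members (`_blobRepositoryFactory`, `_settings`, `AssetPackage`) and the classic SDK's `CloudBlob.OpenReadAsync`:

```csharp
// Sketch: return a lazy blob read stream instead of buffering into a MemoryStream.
// FileStreamResult then starts sending bytes to the client almost immediately.
public async Task<AssetPackage> ReadPackage(AssetPackageContent packageContent, Guid tenantId, Guid packageId)
{
    var blobRepository = await _blobRepositoryFactory.CreateAsync(_settings, tenantId.ToString())
        .ConfigureAwait(false);
    var blobPath = string.Empty;
    //some missing irrelevant code
    var blobReference = await blobRepository.ReadBlobReference(blobPath).ConfigureAwait(false);
    if (blobReference == null)
    {
        return null;
    }
    // OpenReadAsync pulls ranges from storage on demand instead of
    // downloading everything up front.
    var stream = await blobReference.OpenReadAsync().ConfigureAwait(false);
    return new AssetPackage(packageContent, stream, blobReference.Properties.ContentType);
}
```

With this shape the first byte reaches the mobile client as soon as the first range arrives from storage, which helps with the five-second timeout.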
Background
I'm currently working on a .Net Core - C# application that is reliant on various Azure services. I've been tasked with creating an endpoint that allows users to bulk download a varying number of files based on some querying/filtering. The endpoint will be triggered by a download all button on the frontend and should return a .zip of all said files. The total size of this zip could be anywhere from 100KB-100GB depending on the query/filters provided.
Note: although I'm familiar with asynchrony, concurrency, and streams, the interactions between these and between API layers are something I'm still getting my head around. Bear with me.
Question
How can I achieve this in a performant and scalable manner given some architectural constraints? Details provided below.
Architecture
The backend currently consists of two main layers. The API Layer consists of Azure Functions, which are the first point of contact for any and all requests from the frontend. The Service Layer stands in between the API Layer and other Azure services. In this particular case the Service Layer interacts with an Azure Blob Storage container, where the various files are stored.
Current Implementation/Plan
Request:
The request itself is straightforward. The API Layer takes queries and filters and turns them into a list of filenames. That list is then sent in the body of a POST request to the Service Layer. The Service Layer loops through the list and retrieves each file individually from blob storage. As of right now there is no way of bulk downloading attachments. This is where the complications start.
Given the potential file size, we can't pull all the data into memory at once; it may need to be streamed or batched.
Given the number of files, we may need to download files from blob storage in parallel.
We need to build the zip file from async, parallel tasks, and it can't be built completely in memory.
Response:
I currently have a working version of this that doesn't worry about memory. The above diagram is meant as an illustration of the limitations/considerations of the task rather than a concept that can be put to code. No one layer can/should hold all of the data at any given time. My original attempt/idea was to use a series of streams that pipe data down the line in some manner. However, I realized this might be a fool's errand and decided to make this post.
Any thoughts on a better high-level work flow to accomplish this task would be greatly appreciated. I would also love to hear completely different solutions to the problem.
Thank you Sha. Posting your suggestions as an answer to help other community members.
POST a list of file paths to an Azure Function (HTTP trigger)
Create a queue message containing the file paths and put on a storage queue.
Listen to said storage queue with another Azure function (Queue trigger).
Stream each file from Azure Storage -> Add it to a Zip stream -> Stream it back to Azure storage.
The code below will help with creating the ZIP file.
public class AzureBlobStorageCreateZipFileCommand : ICreateZipFileCommand
{
    private readonly UploadProgressHandler _uploadProgressHandler;
    private readonly ILogger<AzureBlobStorageCreateZipFileCommand> _logger;
    private readonly string _storageConnectionString;
    private readonly string _zipStorageConnectionString;

    public AzureBlobStorageCreateZipFileCommand(
        IConfiguration configuration,
        UploadProgressHandler uploadProgressHandler,
        ILogger<AzureBlobStorageCreateZipFileCommand> logger)
    {
        _uploadProgressHandler = uploadProgressHandler ?? throw new ArgumentNullException(nameof(uploadProgressHandler));
        _logger = logger ?? throw new ArgumentNullException(nameof(logger));
        _storageConnectionString = configuration.GetValue<string>("FilesStorageConnectionString") ?? throw new Exception("FilesStorageConnectionString was null");
        _zipStorageConnectionString = configuration.GetValue<string>("ZipStorageConnectionString") ?? throw new Exception("ZipStorageConnectionString was null");
    }

    public async Task Execute(
        string containerName,
        IReadOnlyCollection<string> filePaths,
        CancellationToken cancellationToken)
    {
        var zipFileName = $"{DateTime.UtcNow:yyyyMMddHHmmss}.{Guid.NewGuid().ToString().Substring(0, 4)}.zip";
        var stopwatch = Stopwatch.StartNew();
        try
        {
            using (var zipFileStream = await OpenZipFileStream(zipFileName, cancellationToken))
            {
                using (var zipFileOutputStream = CreateZipOutputStream(zipFileStream))
                {
                    var level = 0;
                    _logger.LogInformation("Using Level {Level} compression", level);
                    zipFileOutputStream.SetLevel(level);
                    foreach (var filePath in filePaths)
                    {
                        var blockBlobClient = new BlockBlobClient(_storageConnectionString, containerName, filePath);
                        var properties = await blockBlobClient.GetPropertiesAsync(cancellationToken: cancellationToken);
                        var zipEntry = new ZipEntry(blockBlobClient.Name)
                        {
                            Size = properties.Value.ContentLength
                        };
                        zipFileOutputStream.PutNextEntry(zipEntry);
                        await blockBlobClient.DownloadToAsync(zipFileOutputStream, cancellationToken);
                        zipFileOutputStream.CloseEntry();
                    }
                }
            }
            stopwatch.Stop();
            _logger.LogInformation("[{ZipFileName}] DONE, took {ElapsedTime}",
                zipFileName,
                stopwatch.Elapsed);
        }
        catch (TaskCanceledException)
        {
            var blockBlobClient = new BlockBlobClient(_zipStorageConnectionString, "zips", zipFileName);
            await blockBlobClient.DeleteIfExistsAsync();
            throw;
        }
    }

    private async Task<Stream> OpenZipFileStream(
        string zipFilename,
        CancellationToken cancellationToken)
    {
        var zipBlobClient = new BlockBlobClient(_zipStorageConnectionString, "zips", zipFilename);
        return await zipBlobClient.OpenWriteAsync(true, options: new BlockBlobOpenWriteOptions
        {
            ProgressHandler = _uploadProgressHandler,
            HttpHeaders = new BlobHttpHeaders
            {
                ContentType = "application/zip"
            }
        }, cancellationToken: cancellationToken);
    }

    private static ZipOutputStream CreateZipOutputStream(Stream zipFileStream)
    {
        return new ZipOutputStream(zipFileStream)
        {
            IsStreamOwner = false
        };
    }
}
See Zip File using Azure Functions for further information.
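The queue-based wiring in the steps above (HTTP trigger enqueues, queue trigger zips) could be sketched roughly like this. Everything here is illustrative, not from the original answer: the function names, the `zip-requests` queue name, and the `ZipRequest` message shape are all assumptions, and it targets the in-process Azure Functions model.

```csharp
using System.IO;
using System.Text.Json;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;

public class ZipFunctions
{
    private readonly ICreateZipFileCommand _createZipFileCommand;

    public ZipFunctions(ICreateZipFileCommand createZipFileCommand)
        => _createZipFileCommand = createZipFileCommand;

    // 1) HTTP trigger: accept the file list and put it on a storage queue.
    [FunctionName("EnqueueZipRequest")]
    public async Task<IActionResult> Enqueue(
        [HttpTrigger(AuthorizationLevel.Function, "post")] HttpRequest req,
        [Queue("zip-requests")] IAsyncCollector<string> queue)
    {
        var body = await new StreamReader(req.Body).ReadToEndAsync();
        await queue.AddAsync(body); // body: JSON { "ContainerName": ..., "FilePaths": [...] }
        return new AcceptedResult();
    }

    // 2) Queue trigger: do the long-running zip work off the HTTP path.
    [FunctionName("CreateZip")]
    public async Task CreateZip(
        [QueueTrigger("zip-requests")] string message,
        CancellationToken cancellationToken)
    {
        var request = JsonSerializer.Deserialize<ZipRequest>(message);
        await _createZipFileCommand.Execute(request.ContainerName, request.FilePaths, cancellationToken);
    }

    private class ZipRequest
    {
        public string ContainerName { get; set; }
        public string[] FilePaths { get; set; }
    }
}
```

Splitting the work this way keeps the HTTP call fast regardless of zip size; the client can later fetch the finished zip (e.g. via a SAS link to the "zips" container).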
I created a POST API which basically saves a file in one directory.
Will asynchronous code make my API handle scalability better when multiple requests come from clients?
Currently, the code works synchronously.
Should I make every method as asynchronous? And where should I place the keyword await?
The tasks:
Task 1: Read the request content (XML)
Task 2: Create a directory if not already created
Task 3: Make filenames unique
Task 4: Save the file in the directory
[System.Web.Mvc.HttpPost]
public IHttpActionResult Post(HttpRequestMessage request)
{
    try
    {
        string contentResult = string.Empty;
        ValidateRequest(ref contentResult, request);
        //contentResult = "nothing";
        //Validation of the post-requested XML
        //XmlReaderSettings(contentResult);
        using (StringReader s = new StringReader(contentResult))
        {
            doc.Load(s);
        }
        string path = MessagePath;
        //Directory creation
        DirectoryInfo dir = Directory.CreateDirectory($@"{path}\PostRequests");
        string dirName = dir.Name;
        //Format file name
        var uniqueFileName = UniqueFileNameFormat();
        doc.Save($@"{path}\{dirName}\{uniqueFileName}");
    }
    catch (Exception e)
    {
        LogService.LogToEventLog("Error occurred while receiving a message from messagedistributor: " + e, System.Diagnostics.EventLogEntryType.Error);
        throw; // rethrow without resetting the stack trace
    }
    LogService.LogToEventLog("Message is received successfully from messagedistributor: ", System.Diagnostics.EventLogEntryType.Information);
    return new ResponseMessageResult(Request.CreateResponse((HttpStatusCode)200));
}
Yes, it should.
When you use async with network or IO calls, you do not block threads, and they can be reused for processing other requests.
But if you have only one drive and other clients do the same job, you will not get speed benefits; overall system health would still be better with async calls.
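To make that concrete, here is a sketch of what the same handler might look like with the I/O made asynchronous. It reuses the question's own helpers (`ValidateRequest` is replaced by reading the body directly; `UniqueFileNameFormat`, `MessagePath`, and `LogService` are assumed to exist as in the question), so treat it as a shape, not a drop-in:

```csharp
[System.Web.Mvc.HttpPost]
public async Task<IHttpActionResult> Post(HttpRequestMessage request)
{
    try
    {
        // Reading the request body is network I/O: await it instead of blocking.
        string contentResult = await request.Content.ReadAsStringAsync();
        var doc = new XmlDocument();
        using (StringReader s = new StringReader(contentResult))
        {
            doc.Load(s);
        }
        string path = MessagePath;
        DirectoryInfo dir = Directory.CreateDirectory($@"{path}\PostRequests");
        var uniqueFileName = UniqueFileNameFormat();
        // XmlDocument.Save is synchronous; writing doc.OuterXml with
        // StreamWriter.WriteAsync keeps the disk write off the request thread.
        using (var writer = new StreamWriter($@"{path}\{dir.Name}\{uniqueFileName}"))
        {
            await writer.WriteAsync(doc.OuterXml);
        }
    }
    catch (Exception e)
    {
        LogService.LogToEventLog("Error occurred while receiving a message from messagedistributor: " + e, System.Diagnostics.EventLogEntryType.Error);
        throw;
    }
    LogService.LogToEventLog("Message is received successfully from messagedistributor: ", System.Diagnostics.EventLogEntryType.Information);
    return new ResponseMessageResult(Request.CreateResponse((HttpStatusCode)200));
}
```

The `await` keyword goes on every I/O-bound call (reading the body, writing the file); CPU-only steps such as parsing the XML gain nothing from being made async.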
On VS2019, when using this OneDrive sample with UWP from Microsoft, I am getting the following error. An online search shows some relevant links (such as this, this, or this) but their contexts are different (as they are using web apps or Python, etc.):
AADSTS50011: The reply URL specified in the request does not match the reply URLs configured for the application: '55dbdbc9-xxxxxxxxxxxxx-a24'
I have followed the sample's instructions for registering and configuring the app, where the Redirect URI type I selected is Public client (mobile & desktop), and I have set its value to https://login.microsoftonline.com/common/oauth2/nativeclient
Question: What I may be doing wrong, and how can we resolve the issue?
UPDATE:
The error occurs at the line FolderLoaded?.Invoke(this, EventArgs.Empty); of the method shown below (line 180 of OneDriveList.xaml.cs in the sample). And it is not an OperationCanceledException, since the error goes to the second catch statement.
private async Task LoadFolderAsync(string id = null)
{
    // Cancel any previous operation
    _cancellationTokenSource?.Cancel();
    _cancellationTokenSource = new CancellationTokenSource();

    // Check if session is set
    if (AuthenticationService == null) throw new InvalidOperationException($"No {nameof(AuthenticationService)} has been specified");

    // Keep a local copy of the token because the source can change while executing this function
    var token = _cancellationTokenSource.Token;

    // Add an option to the REST API in order to get thumbnails for each file
    // https://learn.microsoft.com/en-us/onedrive/developer/rest-api/api/driveitem_list_thumbnails
    var options = new[]
    {
        new QueryOption("$expand", "thumbnails"),
    };

    // Create the graph request builder for the drive
    IDriveRequestBuilder driveRequest = AuthenticationService.GraphClient.Me.Drive;

    // If folder id is null, the request refers to the root folder
    IDriveItemRequestBuilder driveItemsRequest;
    if (id == null)
    {
        driveItemsRequest = driveRequest.Root;
    }
    else
    {
        driveItemsRequest = driveRequest.Items[id];
    }

    // Raise the loading event
    FolderLoading?.Invoke(this, EventArgs.Empty);
    try
    {
        try
        {
            // Make an API request loading 50 items per time
            var page = await driveItemsRequest.Children.Request(options).Top(50).GetAsync(token);
            token.ThrowIfCancellationRequested();
            // Load each page
            await LoadGridItemsAsync(page, token);
            token.ThrowIfCancellationRequested();
        }
        finally
        {
            // Raise the loaded event
            FolderLoaded?.Invoke(this, EventArgs.Empty);
        }
    }
    catch (OperationCanceledException)
    { }
    catch (Exception ex)
    {
        // Raise the error event
        LoadingError?.Invoke(this, ex);
    }
}
I have a C# project that subscribes multiple registrations to a topic. Because of the nature of the project, and the fact that you can't check how many people have already subscribed to a topic, I need to make the following async calls to the server:
Subscribe Registrations
TopicManagementResponse response = await FirebaseMessaging.DefaultInstance.SubscribeToTopicAsync(registrationTokens, topic);
Send message to Topic
string response = await FirebaseMessaging.DefaultInstance.SendAsync(message);
Unsubscribe Registrations
TopicManagementResponse response = await FirebaseMessaging.DefaultInstance.UnsubscribeFromTopicAsync(registrationTokens, topic);
Because there are three calls, I need to create an instance of the FirebaseApp using credentials:
FirebaseApp.Create(new AppOptions()
{
    Credential = GoogleCredential.FromFile(path),
});
BUT because the async calls return a task in the "WaitingForActivation" state (even though they do correctly do what they are supposed to), I can't delete the instance to move on to the next function: it throws an error because it can't re-create another FirebaseApp instance. It also fails if I give it a name, so I can't use GetInstance(string name).
Am I missing something, or is there another way to do this?
Here is an example of a subscribe function:
internal static async Task SubscribeToTopic(string path, string topic, string regID5, string regID)
{
    FirebaseApp app = FirebaseApp.Create(new AppOptions()
    {
        Credential = GoogleCredential.FromFile(path),
    });
    var registrationTokens = new List<string>()
    {
        regID5, regID
    };
    // Subscribe the devices corresponding to the registration tokens to the topic
    try
    {
        TopicManagementResponse response = await FirebaseMessaging.DefaultInstance.SubscribeToTopicAsync(registrationTokens, topic);
        using (StreamWriter sw = System.IO.File.AppendText(HttpContext.Current.Server.MapPath("/tokens.txt")))
        {
            sw.WriteLine($"{response.SuccessCount} tokens were subscribed successfully");
        }
    }
    catch (Exception ex)
    {
        string myerror = ex.Message;
    }
}
Any ideas?
You are creating a FirebaseApp instance every time. Instead, create the instance once at application start, in the Global.asax file.
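A minimal sketch of that setup, assuming a credential file under App_Data (the file path is illustrative; `FirebaseMessaging.DefaultInstance` in your methods then reuses this single app):

```csharp
using System;
using System.Web.Hosting;
using FirebaseAdmin;
using Google.Apis.Auth.OAuth2;

public class Global : System.Web.HttpApplication
{
    protected void Application_Start(object sender, EventArgs e)
    {
        // Create the default FirebaseApp exactly once for the app's lifetime.
        FirebaseApp.Create(new AppOptions
        {
            Credential = GoogleCredential.FromFile(
                HostingEnvironment.MapPath("~/App_Data/firebase-credentials.json")),
        });
    }
}
```

With the app created once, SubscribeToTopic no longer needs its own FirebaseApp.Create call, so the "cannot re-create another FirebaseApp instance" error goes away.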