Images stored in Azure Blob don't cache in browser - C#

I have set the Cache-Control property to "max-age=3600, must-revalidate" when uploading. Looking at the network tab of the browser developer tools, I can see that the header is correct and the cache property is set.
But the photos still get fetched from Azure Blob Storage every time, with a 200 OK response.
The way I upload is as follows:
- Photo uploaded by user
- GUID added to the photo name, which is saved to an Azure SQL database with the user info
- Blob reference created using the Azure storage library in C#
- Cache properties set and saved after uploading
// Generate a unique name and upload the photo.
var uniqueFileName = Guid.NewGuid().ToString() + "_" + fileName;
var newBlob = container.GetBlockBlobReference(uniqueFileName);
using var fileStream = file.OpenReadStream();
newBlob.UploadFromStreamAsync(fileStream).Wait();

// Set the cache header after the upload and persist it.
newBlob.Properties.CacheControl = "max-age=3600, must-revalidate";
newBlob.SetPropertiesAsync().Wait();
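As a side note: with the legacy storage SDK used here, properties assigned before the upload are sent with the upload request itself, so the separate SetPropertiesAsync round trip can likely be dropped. A sketch, inside an async method:
// Set Cache-Control before uploading so it is applied in the same request
// (legacy WindowsAzure.Storage-style SDK assumed, matching the code above).
var newBlob = container.GetBlockBlobReference(uniqueFileName);
newBlob.Properties.CacheControl = "max-age=3600, must-revalidate";
using var fileStream = file.OpenReadStream();
await newBlob.UploadFromStreamAsync(fileStream); // header sent with the upload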
The blob is fetched using a URI + SAS token.
I get the name from the SQL database, look it up in Azure Blob Storage, then get the URI and append the SAS token to give the client access to the blob.
// Build a read-only SAS URL valid for one hour.
var blob = container.GetBlockBlobReference(fileName);
var sasToken = blob.GetSharedAccessSignature(new SharedAccessBlobPolicy()
{
    Permissions = SharedAccessBlobPermissions.Read,
    SharedAccessExpiryTime = DateTime.UtcNow.AddHours(1)
});
var blobUrl = blob.Uri.AbsoluteUri + sasToken;
I found that every time a new SAS is created, the image is treated as a new version, and I do create a new SAS for every request to build the link the client uses to download the image. How can I avoid this behavior? Should I make the container publicly readable so that no SAS is needed?

Cache control should work when accessing a blob with a SAS token.
If you're using a browser (like Chrome) to visit the blob with the SAS token, please make sure you DON'T select the "Disable cache" checkbox in the developer tools. With it unchecked, I can see the cache being used on my side.
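Also worth noting: the browser keys its cache on the full URL, query string included, so minting a fresh SAS for every request gives each response a brand-new URL and the cached copy is never reused. A minimal sketch of one workaround, reusing a token until shortly before it expires (the dictionary and the five-minute safety margin are illustrative assumptions, not from the question):
using System.Collections.Concurrent;

// Cache one SAS URL per blob and hand out the same URL until close to its
// expiry, so the browser's cache key stays stable across requests.
private static readonly ConcurrentDictionary<string, (string Url, DateTime Expiry)> SasCache = new();

public string GetCachedBlobUrl(CloudBlobContainer container, string fileName)
{
    if (SasCache.TryGetValue(fileName, out var entry) &&
        entry.Expiry > DateTime.UtcNow.AddMinutes(5))
    {
        return entry.Url; // same URL as last time => browser cache hit
    }

    var blob = container.GetBlockBlobReference(fileName);
    var expiry = DateTime.UtcNow.AddHours(1);
    var sasToken = blob.GetSharedAccessSignature(new SharedAccessBlobPolicy
    {
        Permissions = SharedAccessBlobPermissions.Read,
        SharedAccessExpiryTime = expiry
    });

    var url = blob.Uri.AbsoluteUri + sasToken;
    SasCache[fileName] = (url, expiry);
    return url;
}
Making the container publicly readable would also keep the URL stable, but it trades away the access control the SAS provides.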

Related

Trying to connect to Azure Data Lake using Managed Identity but getting unauthorized error

// Build the Data Lake endpoint and authenticate with the managed identity.
string dfsUri = "https://" + accountName + ".dfs.core.windows.net";
DataLakeServiceClient dataLakeServiceClient = new DataLakeServiceClient(new Uri(dfsUri), new DefaultAzureCredential(new DefaultAzureCredentialOptions()));
// Create a file system (container), a directory, and an empty file.
DataLakeFileSystemClient dataLakeFileSystemClient = await dataLakeServiceClient.CreateFileSystemAsync("test1");
DataLakeDirectoryClient directoryClient = await dataLakeFileSystemClient.CreateDirectoryAsync("my-directory");
DataLakeFileClient fileClient = await directoryClient.CreateFileAsync("uploaded-file.txt");
// Append the local file's contents, then flush to commit them.
// (The source path was left empty in the original post.)
FileStream fileStream = File.OpenRead("");
long fileSize = fileStream.Length;
await fileClient.AppendAsync(fileStream, offset: 0);
await fileClient.FlushAsync(position: fileSize);
Trying to connect to Azure Data Lake using Managed Identity, but getting an unauthorized error on this line:
DataLakeFileSystemClient dataLakeFileSystemClient = await dataLakeServiceClient.CreateFileSystemAsync("test1");
Error message: This request is not authorized to perform this operation. Status: 403 (This request is not authorized to perform this operation.) ErrorCode: AuthorizationFailure
Make sure your managed identity has at least the Storage Blob Data Reader or Storage Blob Data Contributor role.
To learn how to assign a managed identity access to a resource using the Azure portal, see: Assign a managed identity access to a resource by using the Azure portal
For details of the roles available in ADLS Gen2, see: Access control model in Azure Data Lake Storage Gen2
Could you please let us know which roles you have granted to the managed identity?

Google Cloud Storage API (C#) - cache header metadata

I upload to a Google Cloud Storage bucket via the C# storage API (Google.Cloud.Storage.V1). These are public files accessed by client pages.
Problem:
The files are served with "private, max-age=0".
Question:
I would like to set custom cache headers instead, while or after uploading the files, via the API itself. Is it possible to set the cache header or other metadata via the C# Google Storage API call?
I am also curious: since I have not set any cache header, why does Google Storage serve these files with max-age=0, instead of not sending any cache header at all?
You can set the cache control when you call UploadObject, if you specify an Object instead of just the bucket name and object name. Here's an example:
var client = StorageClient.Create();
var obj = new Google.Apis.Storage.v1.Data.Object
{
    Bucket = bucketId,
    Name = objectName,
    CacheControl = "public,max-age=3600"
};
var stream = new MemoryStream(Encoding.UTF8.GetBytes("Hello world"));
client.UploadObject(obj, stream);
You can do it after the fact as well using PatchObject:
var patch = new Google.Apis.Storage.v1.Data.Object
{
    Bucket = bucketId,
    Name = objectName,
    CacheControl = "public,max-age=7200"
};
client.PatchObject(patch);
I don't know about the details of cache control if you haven't specified anything though, I'm afraid.

ImageResizer - Need to remove images that are in the image cache

I use the ImageResizer tool with the DiskCache plugin. We use Azure Blob Storage to store images, and a custom plugin to serve those images within the resizer code. Something went awry and some of the blobs have been deleted, but they are still cached in the resizer's DiskCache.
I need to be able to build the hash key to identify the images in the cache. I tried building the key from what I can see in the code, but the string returned does not yield a file in the cache.
// Attempt to rebuild DiskCache's key: app-relative path + query string + modified-date ticks.
var vp = ResolveAppRelativeAssumeAppRelative(virtualPath);
var qs = PathUtils.BuildQueryString(queryString).Replace("&red_dot=true", "");
var blob = new Blob(this, virtualPath, queryString);
var modified = blob.ModifiedDateUTC; // note: unused below
var cachekey = string.Format("{0}{1}|{2}", vp, qs, blob.GetModifiedDateUTCAsync().Result.Ticks.ToString(NumberFormatInfo.InvariantInfo));
var relativePath = new UrlHasher().hash(cachekey, 4096, "/");
How can I query the cache to see whether the images are still cached, and then delete them if they no longer exist in the blob storage account?
Note: I have tried the AzureReader2 plugin and it doesn't work for us at the moment.
Custom plugins are responsible for controlling access to cached files.
If you want to see where an active request is being cached, check HttpContext.Current.Items["FinalCachedFile"] during the EndRequest phase of the request. You could do this with an event handler, as sketched below.
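A minimal sketch of such a handler as an IHttpModule ("FinalCachedFile" is the context item named above; the module name and the logging are illustrative assumptions):
using System.Web;

public class CachedFileLoggerModule : IHttpModule
{
    public void Init(HttpApplication app)
    {
        app.EndRequest += (sender, e) =>
        {
            var ctx = ((HttpApplication)sender).Context;
            if (ctx.Items["FinalCachedFile"] is string cachedFile)
            {
                // Record the physical path of the cached file so stale entries
                // can later be matched against what still exists in blob storage.
                System.Diagnostics.Trace.WriteLine("DiskCache wrote: " + cachedFile);
            }
        };
    }

    public void Dispose() { }
}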

Downloading from Azure Blob redirects wrong

When I press download and call this action, I get the result
"The resource you are looking for has been removed, had its name changed, or is temporarily unavailable."
and am directed to
http://integratedproject20170322032906.azurewebsites.net/MyDocumentUps/Download/2020Resume.pdf
Why does it link to the above and not to
https://filestorageideagen.blob.core.windows.net/documentuploader/2020Resume.pdf
as shown in the controller?
Here is my action link in the view:
@Html.ActionLink("Download", "Download", "MyDocumentUps",
    new { id = item.DocumentId.ToString() + item.RevisionId.ToString() + item.Attachment },
    new { target = "_blank" }) |
public ActionResult Download(string id)
{
    string path = @"https://filestorageideagen.blob.core.windows.net/documentuploader/";
    return View(path + id);
}
I can think of 2 ways by which you can force a file to download in the client browser.
1. Return a FileResult or FileStreamResult from your controller. Here's an example of doing so: How can I present a file for download from an MVC controller?. Please note that this will download the file to your server first and then stream the contents to the client browser from there. For smaller files or low load this approach may work, but as your site grows or the files to be downloaded become bigger, it will put more stress on your web server. (A minimal sketch of this approach appears after the SAS example below.)
2. Use a Shared Access Signature (SAS) for the blob with the Content-Disposition response header set. In this approach you simply create a SAS token for the blob to be downloaded and use it to build a SAS URL. Your controller then returns a RedirectResult with this URL. The advantage of this approach is that all downloads happen directly from Azure Storage and not through your server.
When creating the SAS, please ensure that:
- You have at least Read permission in the SAS.
- The Content-Disposition header is overridden in the SAS.
- The expiry of the SAS is sufficient for the file to be downloaded.
Here's the sample code to create a Shared Access Signature.
var sasToken = blob.GetSharedAccessSignature(new SharedAccessBlobPolicy()
{
    Permissions = SharedAccessBlobPermissions.Read,
    SharedAccessExpiryTime = DateTime.UtcNow.AddHours(1),
}, new SharedAccessBlobHeaders()
{
    ContentDisposition = "attachment;filename=" + blob.Name
});
var blobUrl = string.Format("{0}{1}", blob.Uri.AbsoluteUri, sasToken);
return Redirect(blobUrl);
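And for completeness, a rough sketch of the first approach (the "container" variable and the PDF content type are assumptions based on the question; DownloadToStreamAsync is from the same legacy storage SDK):
// Sketch: stream the blob through the web server and force a download.
// Prefer the SAS redirect above for larger files or heavier load.
public async Task<ActionResult> Download(string id)
{
    var blob = container.GetBlockBlobReference(id);

    var stream = new MemoryStream();
    await blob.DownloadToStreamAsync(stream);
    stream.Position = 0;

    // Passing a file name makes MVC set Content-Disposition: attachment.
    return File(stream, "application/pdf", id);
}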
P.S. While I was answering the question, the question got edited, so the answer may seem a bit out of whack :).
Given that you have the image URL as part of your model and the blob has no access restrictions:
<img src="@Model.YourImageUri" />
Or, loading it directly in code:
var image = new BitmapImage(new Uri("https://your_storage_account_name.blob.core.windows.net/your_container/your_image.jpg"));

Upload image to Azure blob storage from Windows Phone - not creating

I am using Windows Azure to store images in my Windows Phone application.
The camera takes a photo, and the chosen photo stream is then uploaded. However, it throws NO error but does not upload.
var blobContainer = CloudStorageContext.Current.Resolver.CreateCloudBlobClient();
var container = blobContainer.GetContainerReference("pics");
var blob = container.GetBlobReference("picture.jpg");
blob.UploadFromStream(e.ChosenPhoto, response => { MessageBox.Show(blob.Uri.ToString()); });
I don't have a clue what is happening. The resolver contains the correct user, key and URLs. The container "pics" does exist, but no image is being uploaded. The message box pops up with a URL which does not exist.
UPDATE - There seems to be a similar (well, almost identical) question posted here: Uploading a photo stream from camera into azure blob in WP7. However, the upper-case container name is not an issue here, so that solution did not fix this.
I have an application (Windows Phone 8) that uploads a captured image to an Azure web role, which in turn stores the image in an Azure Storage blob. The code below is how the server stores the images. Again, this code does not run on the phone, but you can use it as a reference.
// Build a unique blob name from the location ID and a GUID.
string randomGUID = locationID + "-" + Guid.NewGuid().ToString();

// Retrieve storage account from application settings
CloudStorageAccount storageAccount = GetStorageAccount();

// Create blob client
CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();

// Retrieve reference to images container
CloudBlobContainer container = blobClient.GetContainerReference(
    RoleEnvironment.GetConfigurationSettingValue("BlobContainer"));

// Retrieve reference to the blob inside the container and upload
CloudBlockBlob blockBlob = container.GetBlockBlobReference(randomGUID);
blockBlob.UploadFromStream(imageToUpload);
The variable imageToUpload is of type Stream.
As you can see, this is pretty straightforward code. Perhaps your problem has to do with the lambda expression you have in UploadFromStream? See the sketch below.
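Along those lines, two things worth ruling out (assumptions on my part, since the toolkit's callback type isn't shown): that the photo stream hasn't already been read to its end, and that the callback isn't silently swallowing a failure. A sketch:
// Rewind the stream before uploading: a stream already read to the end
// uploads zero bytes without raising an error.
e.ChosenPhoto.Seek(0, SeekOrigin.Begin);
blob.UploadFromStream(e.ChosenPhoto, response =>
{
    // Inspect "response" for an error or exception here (its exact type
    // depends on the toolkit) instead of assuming success.
    MessageBox.Show("Upload callback fired for: " + blob.Uri);
});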
