In my DownloadFile page's Page_Load method, the filename a user wishes to download is retrieved from the query string. The file is hosted in an Azure Blob Storage account. I am attempting to download the file using shared access signatures (SAS) via this approach:
var containerName = "containerName";
var con = "connectionString";
CloudStorageAccount account = CloudStorageAccount.Parse(con);
var blobClient = account.CreateCloudBlobClient();
var container = blobClient.GetContainerReference(containerName);
var blob = container.GetBlockBlobReference("file.pdf");
var sasToken = blob.GetSharedAccessSignature(new SharedAccessBlobPolicy()
{
Permissions = SharedAccessBlobPermissions.Read,
SharedAccessExpiryTime = DateTime.UtcNow.AddMinutes(10),
}, new SharedAccessBlobHeaders()
{
ContentDisposition = "attachment; filename=file.pdf"
});
var blobUrl = string.Format("{0}{1}", blob.Uri.AbsoluteUri, sasToken);
Response.Redirect(blobUrl);
However, this does not download the file; instead, the browser renders the file's raw bytes as a garbled character stream.
The blobUrl appears to be valid. But the Response.Redirect isn't working as expected. Am I doing something wrong?
Note: I am using WebForms (unfortunately), not MVC
I was able to figure out a workaround by using an iframe. If I add an iframe to the page:
<iframe id="iframeFile" runat="server" style="display:none;"></iframe>
Then set the source in the codefile:
iframeFile.Src = blobUrl;
It downloads the file.
This may not be ideal, and I'm still not sure why Response.Redirect isn't working as expected. So I'm certainly open to other suggestions as well. But this workaround does resolve the issue of not being able to download the file.
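One other option, as a minimal sketch rather than a definitive fix: skip the SAS redirect entirely and proxy the bytes through the page itself, so the download headers are fully under your control. This assumes the same connection string, container, and blob names as the code above, and the legacy Microsoft.WindowsAzure.Storage SDK already in use:

```csharp
// Hypothetical Page_Load sketch: stream the blob through the page's own
// response instead of redirecting the browser to the blob URL.
protected void Page_Load(object sender, EventArgs e)
{
    var account = CloudStorageAccount.Parse("connectionString");
    var blob = account.CreateCloudBlobClient()
                      .GetContainerReference("containerName")
                      .GetBlockBlobReference("file.pdf");

    Response.Clear();
    Response.ContentType = "application/pdf";
    Response.AddHeader("Content-Disposition", "attachment; filename=file.pdf");
    blob.DownloadToStream(Response.OutputStream); // server-side copy; no SAS needed
    Response.Flush();
    Response.End();
}
```

The trade-off is that the bytes flow through your web server rather than directly from storage to the browser, which is exactly what the SAS approach is meant to avoid.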
Related
I am trying to return a SAS URL to my frontend so I can redirect the user to that link to download the file.
This is my code to create the SAS URL:
private SasQueryParameters GenerateSaSCredentials(string containerName, string blobName) {
// Defines the resource being accessed and for how long the access is allowed.
BlobSasBuilder blobSasBuilder = new() {
StartsOn = DateTime.UtcNow.Subtract(TimeSpan.FromMinutes(10)),
ExpiresOn = DateTime.UtcNow.Add(TimeSpan.FromMinutes(120)) + TimeSpan.FromSeconds(1),
Resource = "b",
BlobName = blobName,
BlobContainerName = containerName
};
// Defines the type of permission.
blobSasBuilder.SetPermissions(BlobSasPermissions.Read);
// Builds an instance of StorageSharedKeyCredential
StorageSharedKeyCredential storageSharedKeyCredential = new(_accountName, _key);
// Builds the Sas URI.
return blobSasBuilder.ToSasQueryParameters(storageSharedKeyCredential);
}
public Uri CreateBlobUri(string blobName, string containerName) {
SasQueryParameters parameters = GenerateSaSCredentials(containerName, blobName);
return new UriBuilder {
Scheme = "https",
Host = $"{_accountName}.blob.core.windows.net",
Path = $"files/{containerName}/{blobName}",
Query = WebUtility.UrlDecode(parameters.ToString())
}.Uri;
}
You may notice the WebUtility.UrlDecode on parameters.ToString(); this is because of a similar issue I've seen on Stack Overflow involving double encoding.
However, when I return this URL to the browser and redirect, I get the following error.
This is how I return the URL:
return Ok(_blobUtils.CreateBlobUri(fileName, containerName).ToString());
<Error>
<Code>AuthenticationFailed</Code>
<Message>Server failed to authenticate the request. Make sure the value of Authorization header
is formed correctly including the signature. RequestId:01696cca-d01e-0023-2ea4-74f5df000000
Time:2021-07-09T09:23:33.0250817Z</Message>
<AuthenticationErrorDetail>Signature fields not well formed.</AuthenticationErrorDetail>
</Error>
If I remove the WebUtility.UrlDecode from parameters.ToString(), I get this error:
<Error>
<Code>AuthenticationFailed</Code>
<Message>Server failed to authenticate the request. Make sure the value of Authorization header
is formed correctly including the signature. RequestId:016a1821-d01e-0023-3da4-74f5df000000
Time:2021-07-09T09:24:38.4051042Z</Message>
<AuthenticationErrorDetail>Signature did not match. String to sign used was r 2021-07-
09T09:14:38Z 2021-07-09T11:24:39Z /blob/${_acountName}/files/bqXbY54sRRsipOUB1PF6/fyI67FYOqDS80y1vNWRL/PRE_OP_CT/0/TK1.left.TST.PTN1.PRE
_OP_CT.zip 2020-04-08 b </AuthenticationErrorDetail>
</Error>
[Screenshots: the structure of the blob container being accessed, and the blob we are trying to create a SAS for.]
Can anyone see why this would fail?
Please get rid of files from Path here:
return new UriBuilder {
Scheme = "https",
Host = $"{_accountName}.blob.core.windows.net",
Path = $"files/{containerName}/{blobName}",
Query = WebUtility.UrlDecode(parameters.ToString())
}.Uri;
It should be something like:
return new UriBuilder {
Scheme = "https",
Host = $"{_accountName}.blob.core.windows.net",
Path = $"{containerName}/{blobName}",
Query = WebUtility.UrlDecode(parameters.ToString())
}.Uri;
UPDATE
Based on the screenshot and the error message, the name of your container is files and the name of the blob is bqXbY54sRRsipOUB1PF6/fyI67FYOqDS80y1vNWRL/PRE_OP_CT/0/TK1.left.TST.PTN1.PRE. Please use them in your code and you should not get the error. You still need to remove files from the Path above as it is already included in your containerName.
The reason your code is failing is that you're calculating the SAS token for a blob inside a blob container, so the signed blob path is container-name/blob-name. However, in your request you're prepending files to the URL, so the requested path becomes files/container-name/blob-name. Since the SAS token was signed for one path but used for another, you get the error.
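As a sketch of an alternative, the Azure.Storage.Blobs SDK's BlobUriBuilder assembles the container/blob path and appends the SAS for you, which avoids the manual UriBuilder/UrlDecode dance entirely. Names like _accountName, _key, containerName, and blobName mirror the question's code; the namespaces are Azure.Storage, Azure.Storage.Blobs, and Azure.Storage.Sas:

```csharp
// Build the SAS, then let BlobUriBuilder compose the final URI.
BlobSasBuilder sasBuilder = new()
{
    ExpiresOn = DateTimeOffset.UtcNow.AddMinutes(120),
    Resource = "b",
    BlobContainerName = containerName,
    BlobName = blobName
};
sasBuilder.SetPermissions(BlobSasPermissions.Read);

BlobUriBuilder uriBuilder = new(new Uri($"https://{_accountName}.blob.core.windows.net"))
{
    BlobContainerName = containerName,   // note: no "files/" prefix
    BlobName = blobName,
    Sas = sasBuilder.ToSasQueryParameters(
        new StorageSharedKeyCredential(_accountName, _key))
};
Uri blobUri = uriBuilder.ToUri();
```

Because the builder handles encoding of both the path and the query string, there is no need for WebUtility.UrlDecode at all.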
I am trying to download a file from a client SharePoint site. I am using the SharePoint CSOM.
My code is as follows:
using Microsoft.SharePoint.Client;
var username = "username";
var password = "pass";
var url = "https://myclient.sharepoint.com/";
var fileurl = "https://myclient.sharepoint.com/teams/folder1/folder%20x/somefile.docx";
using (ClientContext context = new ClientContext(url))
{
SecureString passWord = new SecureString();
foreach (char c in password.ToCharArray()) passWord.AppendChar(c);
context.Credentials = new SharePointOnlineCredentials(username, passWord);
Uri filename = new Uri(fileurl);
string server = filename.AbsoluteUri.Replace(filename.AbsolutePath, "");
string serverrelative = filename.AbsolutePath;
Microsoft.SharePoint.Client.File file = context.Web.GetFileByServerRelativeUrl(serverrelative);
context.Load(file);
ClientResult<Stream> streamResult = file.OpenBinaryStream();
context.ExecuteQuery();
var file2 = streamResult.Value;
}
The problem is that I get Access Denied, yet when I log in with the same credentials, I can download the file successfully.
Is there a separate permission in SharePoint for downloading files via the API instead of the UI?
Could the space in the folder name be the problem?
UPDATE
Verified this does not have anything to do with spaces in folder or filename.
If the SharePoint site uses multiple authentication providers with a set of Windows credentials (also relevant for SharePoint Online), an additional header must be included in the request: X-FORMS_BASED_AUTH_ACCEPTED with a value of f.
For ClientContext class the header could be included like this:
ctx.ExecutingWebRequest += (sender, e) =>
{
e.WebRequestExecutor.WebRequest.Headers["X-FORMS_BASED_AUTH_ACCEPTED"] = "f";
};
Example
var file = ctx.Web.GetFileByUrl(fileAbsUrl);
ctx.Load(file);
var streamResult = file.OpenBinaryStream();
ctx.ExecuteQuery();
//save into file
using (var fileStream = System.IO.File.Create(@"C:\path\filename.docx"))
{
streamResult.Value.Seek(0, SeekOrigin.Begin);
streamResult.Value.CopyTo(fileStream);
}
Note: instead of converting to a server-relative URL, the GetFileByUrl method is used, which accepts an absolute URL.
The problem was that I was not connecting to the right URL in new ClientContext(url).
I was connecting to: https://myclient.sharepoint.com/
I should have been connecting to: https://myclient.sharepoint.com/teams/folder1/
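A minimal sketch of the corrected connection, reusing the username/passWord credentials from the question; the server-relative path below is an assumption derived from the file URL shown earlier:

```csharp
// The ClientContext must target the site (web) that contains the file,
// not the tenant root.
var siteUrl = "https://myclient.sharepoint.com/teams/folder1/";
using (var context = new ClientContext(siteUrl))
{
    context.Credentials = new SharePointOnlineCredentials(username, passWord);

    // Server-relative paths still start at the host root:
    var file = context.Web.GetFileByServerRelativeUrl(
        "/teams/folder1/folder x/somefile.docx");
    context.Load(file);
    context.ExecuteQuery();
}
```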
I have images in Azure that I need to add to a PDF using PDFjet.
This is the code I use when reading an image from disk; however, I have a lot of images, and it does not make sense to download them all from Azure first.
Image image = new Image(objects, new BufferedStream(new FileStream(LocalPath + "image.PNG", FileMode.Open, FileAccess.Read)), ImageType.PNG);
PS: This is done in asp.net webforms.
Thank you for your help.
I am now using the following function to read a PDF:
public MemoryStream DownloadToMemoryStream(DTO.BlobUpload b)
{
CloudStorageAccount storageAccount = Conn.SNString(b.OrgID);
CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
CloudBlobContainer container = blobClient.GetContainerReference(b.Container);
CloudBlockBlob blob = container.GetBlockBlobReference(b.FileName);
var sasToken = blob.GetSharedAccessSignature(new SharedAccessBlobPolicy()
{
Permissions = SharedAccessBlobPermissions.Read,
SharedAccessExpiryTime = DateTime.UtcNow.AddMinutes(10), // assuming the blob can be downloaded in 10 minutes
}, new SharedAccessBlobHeaders()
{
ContentDisposition = "attachment; filename=file-name"
});
using (MemoryStream ms = new MemoryStream())
{
blob.DownloadToStream(ms);
return ms;
}
}
And in the aspx page I use:
MemoryStream pdfScript = B.DownloadToMemoryStream(b);
to read the stream:
SortedDictionary<Int32, PDFobj> objects = pdfFinalScript.Read(pdfScript);
However I get the error message:
Cannot access a closed Stream
I have looked at how to reopen the stream but haven't managed to do it.
Could you please help? Thank you.
According to your description, you want to download blobs from Azure. Here are several approaches you could use:
1. Download with the blob URL: create a Shared Access Signature with Read permission and the Content-Disposition header set, build the blob URL from it, and use that URL. In this case, the blob contents are streamed directly from storage to the client browser.
2. Get the blob reference and download its contents in code with DownloadToStream.
3. Download the file to an exact path on the local machine.
Webform:
You could use Response.Redirect(blobUrl); to redirect to the blob URL and download it.
In .aspx:
<asp:Button ID="Button1" runat="server" Text="Click Me" OnClick="Button1_Click" />
In aspx.cs:
protected void Button1_Click(object sender, EventArgs e)
{
CloudStorageAccount account = new CloudStorageAccount(new StorageCredentials("accountname", "accountkey"), true);
var blobClient = account.CreateCloudBlobClient();
var container = blobClient.GetContainerReference("container");
var blob = container.GetBlockBlobReference("text.PNG");
var sasToken = blob.GetSharedAccessSignature(new SharedAccessBlobPolicy()
{
Permissions = SharedAccessBlobPermissions.Read,
SharedAccessExpiryTime = DateTime.UtcNow.AddMinutes(10), // assuming the blob can be downloaded in 10 minutes
}, new SharedAccessBlobHeaders()
{
ContentDisposition = "attachment; filename=file-name"
});
using (MemoryStream ms = new MemoryStream())
{
    blob.DownloadToStream(ms);
    ms.Position = 0; // rewind: DownloadToStream leaves the position at the end
    Image image = new Image(objects, ms, ImageType.PNG);
}
}
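On the earlier "Cannot access a closed Stream" error: DownloadToMemoryStream returns the MemoryStream from inside a using block, so it is disposed before the caller can read it. One way around that, sketched below (taking a CloudBlockBlob parameter in place of the DTO lookup, for brevity):

```csharp
// Sketch: let the caller own and dispose the stream instead of the callee.
public MemoryStream DownloadToMemoryStream(CloudBlockBlob blob)
{
    var ms = new MemoryStream(); // no using block: the caller disposes it
    blob.DownloadToStream(ms);
    ms.Position = 0;             // rewind so the caller reads from the start
    return ms;
}
```

Alternatively, copy the bytes out with ms.ToArray() inside the using block and return a fresh MemoryStream wrapped around that array.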
I have the following ASPX markup:
<asp:Image ImageUrl="placeholder.png" runat="server" ID="plhdr" />
The webpage lets a user upload an image, which is processed through a JavaScript library and then set as the source of the above control. The JavaScript sets the source as a Base64 String like this:
data:image/jpeg;base64,/9j/4AAQ...
I have a function on the same page that is meant to upload the displayed image to Azure Storage and then add a reference to it in Azure SQL. The code I have in the code behind is:
StorageCredentials creden = new StorageCredentials(accountname, accesskey);
CloudStorageAccount acc = new CloudStorageAccount(creden, useHttps: true);
CloudBlobClient client = acc.CreateCloudBlobClient();
CloudBlobContainer cont = client.GetContainerReference("testcont");
cont.CreateIfNotExists();
cont.SetPermissions(new BlobContainerPermissions
{
PublicAccess = BlobContainerPublicAccessType.Blob
});
CloudBlockBlob cblob = cont.GetBlockBlobReference("cblob");
var imagesrc = plhdr.ImageUrl;
cblob.UploadFromFile(@imagesrc);
var imageUrl = cblob.Uri;
Server Error in '/' Application. Could not find a part of the path
'D:\Windows\system32\placeholder.png'. Exception Details:
System.IO.DirectoryNotFoundException: Could not find a part of the
path 'D:\Windows\system32\dist\img\fling\space.gif'.
At this line: cblob.UploadFromFile(@imagesrc);.
Hoping someone can point me in the right direction.
According to your description, you can upload the image file to Azure Storage using UploadFromStream(stream). Please try the following sample code; it works for me.
CloudBlockBlob cblob = cont.GetBlockBlobReference("testblob");
var bytes = Convert.FromBase64String(#"iVBORw0KGgoAAAANSUhEUgAAAAUAAAAFCAYAAACNbyblAAAAHElEQVQI12P4//8/w38GIAXDIBKE0DHxgljNBAAO9TXL0Y4OHwAAAABJRU5ErkJggg==");// without data:image/jpeg;base64 prefix, just base64 string
using (var stream = new MemoryStream(bytes))
{
cblob.UploadFromStream(stream);
}
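If the control's ImageUrl still carries the full data:image/jpeg;base64, prefix, it has to be stripped before decoding, since Convert.FromBase64String rejects it. A sketch, assuming plhdr.ImageUrl holds the complete data URI:

```csharp
// Strip everything up to and including the comma that ends the data-URI
// header, then decode and upload the remaining Base64 payload.
string imagesrc = plhdr.ImageUrl;                  // "data:image/jpeg;base64,/9j/4AAQ..."
int comma = imagesrc.IndexOf(',');
byte[] bytes = Convert.FromBase64String(imagesrc.Substring(comma + 1));

using (var stream = new MemoryStream(bytes))
{
    cblob.UploadFromStream(stream);
}
```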
I'd like to create a spreadsheet from a .csv file.
Code:
static void Main()
{
String CLIENT_ID = "<MY ID>";
String CLIENT_SECRET = "<MY SECRET>";
var provider = new NativeApplicationClient(GoogleAuthenticationServer.Description, CLIENT_ID, CLIENT_SECRET);
var auth = new OAuth2Authenticator<NativeApplicationClient>(provider, GetAuthorization);
var service = new DriveService(auth);
File body = new File();
body.Title = "My spread";
body.Description = "A test spread";
body.MimeType = "application/vnd.google-apps.spreadsheet";
byte[] byteArray = System.IO.File.ReadAllBytes("spread.csv");
System.IO.MemoryStream stream = new System.IO.MemoryStream(byteArray);
service.Files.Insert(body, stream, "application/vnd.google-apps.spreadsheet").Upload();
}
private static IAuthorizationState GetAuthorization(NativeApplicationClient arg)
{
string[] scopes = new string[] { "https://www.googleapis.com/auth/drive", "https://www.googleapis.com/auth/userinfo.profile" };
IAuthorizationState state = new AuthorizationState(scopes);
state.Callback = new Uri(NativeApplicationClient.OutOfBandCallbackUrl);
Uri authUri = arg.RequestUserAuthorization(state);
// Request authorization from the user (by opening a browser window):
Process.Start(authUri.ToString());
Console.Write(" Authorization Code: ");
string authCode = Console.ReadLine();
Console.WriteLine();
// Retrieve the access token by using the authorization code:
return arg.ProcessUserAuthorization(authCode, state);
}
After inserting the file and clicking on it on the Google Drive page (logged in as the owner), I get the message:
"We're sorry.
The spreadsheet at this URL could not be found. Make sure that you have the right URL and that the owner of the spreadsheet hasn't deleted it"
When I change the MimeType to "text/csv" and insert the file, clicking on it gives the message:
"No preview available
This item was created with (MyApp'sName), a Google Drive app.
Download this file or use one of the apps you have installed to open it."
I can also right-click this file (the one created with the "text/csv" MimeType) and choose "Export to Google Docs", which gives the result I want: a spreadsheet file with my .csv file's content. But such an indirect method doesn't fully satisfy me. Is there any way to create a spreadsheet file on Google Drive, with content from a .csv file, directly from my application?
I don't know C#, but assuming it mirrors Java, within the line
service.Files.Insert(body, stream,
"application/vnd.google-apps.spreadsheet").Upload();
you need to insert the equivalent of
.setConvert(true)
Also, the MIME type passed for the uploaded media should be "text/csv".
I've found the answer :)
How to programmatically convert a file into an appropriate Google Documents format:
var service = new DriveService(new BaseClientService.Initializer
{
...
});
...
FilesResource.InsertMediaUpload request = service.Files.Insert(body, stream, _mimeType);
request.Convert = true;
request.Upload();