Using S3 Storage with .NET (C#)

I am using AWS.Net to upload user content (images) and display them on my site. This is what my code looks like currently for the upload:
using (client = Amazon.AWSClientFactory.CreateAmazonS3Client())
{
    var putObjectRequest = new PutObjectRequest
    {
        BucketName = bucketName,
        InputStream = fileStream,
        Key = fileName,
        CannedACL = S3CannedACL.PublicRead,
        //MD5Digest = md5Base64,
        //GenerateMD5Digest = true,
        Timeout = 3600000 // 1 hour
    };

    S3Response response = client.PutObject(putObjectRequest);
    response.Dispose();
}
What's the best way to store the path to these files? Is there a way to get a link to my file from the response?
Currently I just have a URL in my web.config like https://s3.amazonaws.com/<MyBucketName>/, and when I need to show an image I take that string and append the key from the object I store in the db that represents the uploaded file.
Is there a better way to do this?
None of the examples that ship with the SDK really address this kind of usage. And the documentation isn't online; I don't know how to get to it after I install the SDK, despite following Amazon's directions.

Your approach of storing the base path, including the bucket name, in web.config is the same thing that I do, and it works great. I then store just the relative paths (the object keys) in the various database tables.
The nice thing about this approach is that it makes it easier to migrate to a different storage mechanism or a CDN such as CloudFront later on. I don't think there's a better way than this, because the S3 files reside on a different domain (or subdomain, if you do CNAME mapping), so your .NET application does not run under the same domain or subdomain anyway.
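As a rough illustration of that pattern, here's a minimal sketch; the S3BaseUrl setting name and the helper class are assumptions for illustration, not part of the original code:

using System.Configuration;

public static class S3UrlHelper
{
    // Combines the base URL kept in web.config (e.g. "https://s3.amazonaws.com/<MyBucketName>/")
    // with the key stored in the database to produce the public image URL.
    public static string GetImageUrl(string key)
    {
        // "S3BaseUrl" is a hypothetical appSettings name; use whatever key you already have.
        string baseUrl = ConfigurationManager.AppSettings["S3BaseUrl"];
        return baseUrl.TrimEnd('/') + "/" + key;
    }
}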

There is also a "Location" property on the response which points directly to the URI where the S3 object lives.


How to download the picture of a Google+ user only if it has changed after a certain date?

I am using Google API to get information about an authenticated user. I can get the basic profile information, such as the ID and the full name. From the profile information, I can get the URL to the picture:
var plusMeUri = new Uri($"https://www.googleapis.com/plus/v1/people/me?key=<APP-ID>&access_token=<ACCESS-TOKEN>");
string userResponse = await HttpClient.GetStringAsync(plusMeUri);
JObject userObject = JObject.Parse(userResponse);
...
var imageObject = userObject.GetValue("image") as JObject;
var pictureUrl = imageObject.GetValue("url").Value<string>();
var pictureUri = new Uri(pictureUrl);
string uri = $"{pictureUri.Scheme}://{pictureUri.Host}{pictureUri.AbsolutePath}";
var pictureRequest = new HttpRequestMessage(HttpMethod.Get, uri);
pictureRequest.Headers.IfModifiedSince = <previous-timestamp>;
HttpResponseMessage pictureResponse = await HttpClient.SendAsync(pictureRequest);
if (pictureResponse.StatusCode == HttpStatusCode.NotModified)
{
    // No need to handle anything else
    return;
}
Question
I do not want to download the user's picture if it has not changed. This is why I am using the IfModifiedSince property. It does work with Facebook's API but it does not seem to work with Google's. How can I make it work?
From the information given, it seems like what you're trying to do is determine whether the image you're downloading/about to download is the same image as you've downloaded before. After looking at the Google+ API docs, it looks like the header you've been using isn't officially (at least not obviously) supported by their APIs.
But this is not the only way we can determine whether the image has changed or not (in fact, date last modified isn't necessarily the best way to do this anyway). Alternative methods include:
1) diffing the two images
2) checking the URL (if we can assume different resources have different URLs)
Option 1 is likely the most accurate but also the least efficient, so I'll leave that to you to solve if you decide to go that route. I think the most promising is option 2. I went ahead and played around with the API a little bit, and it looks like the image.url field changes when you update your profile picture.
For example, here are my last two Google+ profile picture URLs:
https://lh4.googleusercontent.com/-oaUVPGFNkV8/AAAAAAAAAAI/AAAAAAAAAqs/KM7H8ZIFuxk/photo.jpg?sz=50
https://lh4.googleusercontent.com/-oaUVPGFNkV8/AAAAAAAAAAI/AAAAAAAAl24/yHU99opjgN4/photo.jpg?sz=50
As such, instead of waiting for the response from the server and checking its header to decide whether the image has been updated, you may be able to short-circuit the entire HTTP request by simply checking whether the last image you pulled down came from the same URL. If it did, you've most likely already acquired that image; if it didn't, you probably don't have it and should incur the cost of downloading it anyway.
In this case, your code would read something like:
var imageObject = userObject.GetValue("image") as JObject;
var pictureUrl = imageObject.GetValue("url").Value<string>();
if (pictureUrl != <previous-picture-url>)
{
    // insert "get new picture" logic here...
}
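If it helps, here's a minimal sketch of how the stored-URL comparison might be wired up; the _lastPictureUrl/_lastPictureBytes fields are assumptions about how you persist state between calls, not anything specific to the Google API:

using System.Net.Http;
using System.Threading.Tasks;

public class ProfilePictureCache
{
    private static readonly HttpClient Http = new HttpClient();

    // Hypothetical store for the URL and bytes of the last picture we downloaded.
    private string _lastPictureUrl;
    private byte[] _lastPictureBytes;

    public async Task<byte[]> GetPictureAsync(string pictureUrl)
    {
        // Same URL as last time: assume the image is unchanged and skip the request.
        if (pictureUrl == _lastPictureUrl && _lastPictureBytes != null)
            return _lastPictureBytes;

        // URL changed (or first call): download the picture and remember it.
        _lastPictureBytes = await Http.GetByteArrayAsync(pictureUrl);
        _lastPictureUrl = pictureUrl;
        return _lastPictureBytes;
    }
}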

Downloading from Azure Blob storage redirects to the wrong URL

When I press download and call this action, I get the result
The resource you are looking for has been removed, had its name
changed, or is temporarily unavailable.
and am directed to
http://integratedproject20170322032906.azurewebsites.net/MyDocumentUps/Download/2020Resume.pdf
Why does it link to the URL above and not to
https://filestorageideagen.blob.core.windows.net/documentuploader/2020Resume.pdf
as shown in the controller?
Here is my ActionLink in the view:
@Html.ActionLink("Download", "Download", "MyDocumentUps",
    new { id = item.DocumentId.ToString() + item.RevisionId.ToString() + item.Attachment },
    new { target = "_blank" }) |
public ActionResult Download(string id)
{
    string path = @"https://filestorageideagen.blob.core.windows.net/documentuploader/";
    return View(path + id);
}
I can think of two ways by which you can force a file to download in the client browser:
Return a FileResult or FileStreamResult from your controller. Here's an example of doing so: How can I present a file for download from an MVC controller?. Please note that this will download the file to your server first and then stream the contents to the client browser from there. For smaller files or low load this approach may work, but as your site grows or the files to be downloaded get bigger, it will put more stress on your web server.
Use a Shared Access Signature (SAS) for the blob with the Content-Disposition response header set. In this approach you simply create a SAS token for the blob to be downloaded and use it to build a SAS URL. Your controller then returns a RedirectResult with this URL. The advantage of this approach is that all downloads happen directly from Azure Storage and not through your server.
When creating the SAS, please ensure that:
You have at least Read permission in the SAS.
The Content-Disposition header is overridden in the SAS.
The SAS expiry is long enough for the file to finish downloading.
Here's the sample code to create a Shared Access Signature.
var sasToken = blob.GetSharedAccessSignature(new SharedAccessBlobPolicy()
{
    Permissions = SharedAccessBlobPermissions.Read,
    SharedAccessExpiryTime = DateTime.UtcNow.AddHours(1),
}, new SharedAccessBlobHeaders()
{
    ContentDisposition = "attachment;filename=" + blob.Name
});
var blobUrl = string.Format("{0}{1}", blob.Uri.AbsoluteUri, sasToken);
return Redirect(blobUrl);
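To show where that snippet would sit, here's a rough sketch of the full Download action using the older WindowsAzure.Storage client; the connection-string name and the assumption that id maps directly to the blob name (e.g. 2020Resume.pdf) are illustrative, not taken from the question:

using System;
using System.Configuration;
using System.Web.Mvc;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

public class MyDocumentUpsController : Controller
{
    public ActionResult Download(string id)
    {
        // "StorageConnectionString" is a hypothetical name; point it at your storage account.
        var account = CloudStorageAccount.Parse(
            ConfigurationManager.ConnectionStrings["StorageConnectionString"].ConnectionString);
        var container = account.CreateCloudBlobClient().GetContainerReference("documentuploader");
        var blob = container.GetBlockBlobReference(id); // assumes id is the blob name

        var sasToken = blob.GetSharedAccessSignature(new SharedAccessBlobPolicy()
        {
            Permissions = SharedAccessBlobPermissions.Read,
            SharedAccessExpiryTime = DateTime.UtcNow.AddHours(1),
        }, new SharedAccessBlobHeaders()
        {
            ContentDisposition = "attachment;filename=" + blob.Name
        });

        // The browser downloads directly from Azure Storage, not through the web server.
        return Redirect(blob.Uri.AbsoluteUri + sasToken);
    }
}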
P.S. While I was answering the question, the question got edited so the answer may seem a bit out of whack :) :).
Given that you have the image URL as part of your model and the blob has no restrictions:
<img src="@Model.YourImageUri" />
var image = new BitmapImage(new Uri("https://your_storage_account_name.blob.core.windows.net/your_container/your_image.jpg"));

Google drive api, upload file with shared permission

I'm trying to upload a file to Google Drive, and I want to set a 'shared' permission on this file, but I don't know how to do it...
I tried to use this code, but the file is uploaded without the shared permission.
My code is:
// _drive - google drive object
Google.Apis.Drive.v2.Data.File item = new Google.Apis.Drive.v2.Data.File();
Permission permission = new Permission();
permission.Role = "reader";
permission.Type = "anyone";
permission.WithLink = true;
item.Permissions = new List<Permission>() { permission };
FilesResource.InsertMediaUpload request = _drive.Files.Insert(item, fileStream, mimeType);
request.Upload();
OK, I have spent the last hour playing around with this. If you check the Files.insert documentation, it doesn't state anywhere that you should be able to set the permissions at insert time.
At the bottom of that page you can test it out with "Try it", setting the permissions up as you have done above under the request body.
It does upload the file, but the JSON returned gives us a clue:
"shared": false,
And if I check the file in Google Drive, it is indeed not shared.
This leads me to believe that this is not supported by the Google Drive API. It is not possible to set the permissions at the time of upload. You are going to have to create a separate call to set the permissions after you have uploaded the file.
While it looks like the request body does support permissions, it doesn't appear to be working. I am not sure if this is a bug or something that is just not supported. I am going to see if I can find the issue tracker for Drive and log it as an issue.
In the meantime you are going to have to make the two calls and eat a bit of your quota.
Issue 3717: Google drive api, upload file with shared permission
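Applied to the question's own code, the two-call approach might look roughly like this; the variable names and the use of ResponseBody are assumptions, but Files.Insert followed by Permissions.Insert is the pattern described above:

// First call: upload the file without trying to set permissions inline.
var item = new Google.Apis.Drive.v2.Data.File();
FilesResource.InsertMediaUpload uploadRequest = _drive.Files.Insert(item, fileStream, mimeType);
uploadRequest.Upload();
var uploadedFile = uploadRequest.ResponseBody;

// Second call: share the uploaded file with anyone who has the link.
var permission = new Permission
{
    Role = "reader",
    Type = "anyone",
    WithLink = true
};
_drive.Permissions.Insert(permission, uploadedFile.Id).Execute();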
I also experienced this bug. Specifying permissions in the same file/directory upload did not work; I had to do it in a separate request, like below. The Google Drive API documentation is not clear about this (and not clear about how to handle file permissions when using a Service Account).
var NewDirRequest = DService.Files.Insert(GoogleDir);
var NewDir = NewDirRequest.Execute();
GoogleFolderID = NewDir.Id;

// Grant a specific user write access to the new folder.
var NewPermissionsRequest = DService.Permissions.Insert(new Permission()
{
    Kind = "drive#permission",
    Value = emailAddress,
    Role = "writer",
    Type = "user"
}, GoogleFolderID);
NewPermissionsRequest.Execute();

// Grant everyone in the domain read access; this request also needs to be executed.
var DomainPermissionsRequest = DService.Permissions.Insert(new Permission()
{
    Kind = "drive#permission",
    Value = "mydomain.com",
    Role = "reader",
    Type = "domain"
}, GoogleFolderID);
DomainPermissionsRequest.Execute();

Why does requesting a pre-signed URL in the Amazon SDK for a file that doesn't exist return a URL?

I ran into this little bit of weirdness today, and I haven't been able to find anything about it, so I was hoping someone here could help. I'm trying to get a pre-signed URL for an image in my S3 bucket using the AWS SDK for C# .NET. I make the request by doing the following:
string url = string.Empty;
using (s3Client = new AmazonS3Client("aws-access-key",
                                     "aws-secret-key",
                                     RegionEndpoint.USEast1))
{
    GetPreSignedUrlRequest request1 = new GetPreSignedUrlRequest()
    {
        BucketName = BUCKET_NAME,
        Key = "whatever.jpg",
        Expires = DateTime.Now.AddMinutes(1)
    };

    try
    {
        url = s3Client.GetPreSignedURL(request1);
    }
    catch (AmazonS3Exception amazonS3Exception)
    {
    }
}
"Whatever.jpg" doesn't exist in my bucket, but it still returns with a URL. If I try going to that URL, it just tells me that the specified key does not exist. This all seems a bit weird to me. Why does it return a URL at all instead of throwing some exception?
Would it be better to check whether the file exists on S3 first and then create the request for the pre-signed URL? Thanks for all the help in advance!
Signing URLs is a purely client-side operation (using cryptography).
There is no reason to add a network request to that.
For one thing, this allows you to sign URLs for objects before you've even uploaded them.
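That said, if you do want to confirm the object exists before handing out the signed URL (accepting the extra round trip), a rough sketch might look like this; the ObjectExists helper is made up for illustration:

using System.Net;
using Amazon.S3;
using Amazon.S3.Model;

public static class S3Existence
{
    // Returns true if the key exists; S3 reports a missing key as a 404 on the metadata call.
    public static bool ObjectExists(AmazonS3Client client, string bucketName, string key)
    {
        try
        {
            client.GetObjectMetadata(new GetObjectMetadataRequest
            {
                BucketName = bucketName,
                Key = key
            });
            return true;
        }
        catch (AmazonS3Exception e)
        {
            if (e.StatusCode == HttpStatusCode.NotFound)
                return false;
            throw;
        }
    }
}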

Getting Access Denied Exception when deleting a file in Amazon S3 using the .Net AWSSDK

I am trying to do some simple file I/O using Amazon S3 and C#.
So far I have been able to create files and list them. I am the bucket owner and I should have full access. In CloudBerry I can create and delete files in the bucket. In my code when I try to delete a file I get an access denied exception.
This is my test method:
[Test]
public void TestThatFilesCanBeCreatedAndDeleted()
{
    const string testFile = "test.txt";
    var awsS3Helper = new AwsS3Helper();
    awsS3Helper.AddFileToBucketRoot(testFile);
    var testList = awsS3Helper.ListItemsInBucketRoot();
    Assert.True(testList.ContainsKey(testFile)); // This test passes
    awsS3Helper.DeleteFileFromBucket(testFile);  // Access denied exception here
    testList = awsS3Helper.ListItemsInBucketRoot();
    Assert.False(testList.ContainsKey(testFile));
}
My method to add a file:
var request = new PutObjectRequest();
request.WithBucketName(bucketName);
request.WithKey(fileName);
request.WithContentBody("");
S3Response response = client.PutObject(request);
response.Dispose();
My method to delete a file:
var request = new DeleteObjectRequest()
{
    BucketName = bucketName,
    Key = fileKey
};
S3Response response = client.DeleteObject(request);
response.Dispose();
After running the code the file is visible in CloudBerry and I can delete it from there.
I have very little experience with Amazon S3 so I don't know what could be going wrong. Should I be putting some kind of permissions on to any files I create or upload? Why would I be able to delete a file while I am logged in to CloudBerry with the same credentials provided to my program?
I'm not sure what the source of the problem is. Possibly security rules, but maybe something very simple in your bucket configuration. You can check it using the S3 Organizer Firefox plugin, the AWS management console, or any other management tool. I also recommend request/response logging; that has helped me a lot in various investigations. The AWSSDK ships with plenty of good logging examples, so you only need to copy-paste them and everything works. Once you can see the actual requests being sent to Amazon, you can compare them with the documentation. Please also check the AccessKeyId used for your delete request; a quick sanity check is sketched below.
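For that last point, you could construct the client with the exact keys that work in CloudBerry and retry the delete, so no stored profile or config value can silently substitute different credentials. A rough sketch, reusing the request shape from the question (the key values are placeholders):

// Explicitly pass the same access/secret keys that work in CloudBerry.
using (var client = new AmazonS3Client("aws-access-key", "aws-secret-key", RegionEndpoint.USEast1))
{
    var request = new DeleteObjectRequest()
    {
        BucketName = bucketName,
        Key = fileKey
    };

    S3Response response = client.DeleteObject(request);
    response.Dispose();
}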
