I'm trying to upload a file to Google Drive and I want to set the 'shared' permission on this file, but I don't know how to do it.
I tried the code below, but the file is uploaded without the shared permission.
My code is:
// _drive - google drive object
Google.Apis.Drive.v2.Data.File item = new Google.Apis.Drive.v2.Data.File();
Permission permission = new Permission();
permission.Role = "reader";
permission.Type = "anyone";
permission.WithLink = true;
item.Permissions = new List<Permission>() { permission };
FilesResource.InsertMediaUpload request = _drive.Files.Insert(item, fileStream, mimeType);
request.Upload();
OK, I have spent the last hour playing around with this. If you check the Files.insert documentation, it doesn't really state anywhere that you should be able to set the permissions at insert time.
At the bottom of that page you can test it with "Try it", setting the permissions up as you have done above under the request body.
It does upload the file, but the JSON returned gives us a clue:
"shared": false,
Now if I check the file in Google Drive, it is not shared.
This leads me to believe that this is not supported by the Google Drive API: it is not possible to set the permissions at the time of upload. You are going to have to make a separate call to set the permissions after you have uploaded the file.
While it looks like the body does support the permissions, it doesn't appear to be working. I am not sure if this is a bug or something that is just not supported. I am going to see if I can find the issue tracker for Drive and add it as an issue.
In the meantime you are going to have to make the two calls (as sketched below) and eat a bit of your quota.
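Something like the following should work as an outline of the two-call approach. This is an untested sketch that reuses the _drive, fileStream and mimeType objects from your question:
// Untested sketch: upload first, then add the permission in a second call.
var item = new Google.Apis.Drive.v2.Data.File();
FilesResource.InsertMediaUpload request = _drive.Files.Insert(item, fileStream, mimeType);
request.Upload();
var uploaded = request.ResponseBody; // the created file, including its Id

// Second call: grant "anyone with the link" read access after the upload.
var permission = new Permission { Role = "reader", Type = "anyone", WithLink = true };
_drive.Permissions.Insert(permission, uploaded.Id).Execute();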
Issue 3717: Google drive api, upload file with shared permission
I also experienced this bug. Specifying the permissions in the same file/directory upload request did not work; I had to do it in a separate request, as shown below. The Google Drive API documentation is not clear about this (and not clear about how to handle file permissions when using a service account).
// Create the directory first
var NewDirRequest = DService.Files.Insert(GoogleDir);
var NewDir = NewDirRequest.Execute();
GoogleFolderID = NewDir.Id;

// Grant a specific user write access in a separate request
var NewPermissionsRequest = DService.Permissions.Insert(new Permission()
{
    Kind = "drive#permission",
    Value = emailAddress,
    Role = "writer",
    Type = "user"
}, GoogleFolderID);
NewPermissionsRequest.Execute();

// Grant the whole domain read access in another separate request
var DomainPermissionsRequest = DService.Permissions.Insert(new Permission()
{
    Kind = "drive#permission",
    Value = "mydomain.com",
    Role = "reader",
    Type = "domain"
}, GoogleFolderID);
DomainPermissionsRequest.Execute();
My colleague shared a lot of university documents with me and I want to copy them to my Google Drive.
I am starting a C# .NET Core application using the Google Drive v3 API.
I have:
var request = _dataService.Files.Get(id);
request.SupportsAllDrives = true;
request.Fields = "*";
return request;
where _dataService is:
_dataService = new DriveService(new BaseClientService.Initializer
{
ApiKey = settingsData.ApiKey,
ApplicationName = settingsData.ApplicationName
});
How do I copy the files? He sent me the file IDs, so I have access to those files. How do I make a copy in my Google Drive? I didn't find anything in the documentation.
Is there any reason why you cannot use the method Copy?
Google.Apis.Drive.v3.Data.File copiedFile = new Google.Apis.Drive.v3.Data.File();
//This will be the body of the request so probably you would want to modify this
copiedFile.Name = "Name of the new file";
string originFileId = "insert fileId to be copied";
FilesResource.CopyRequest copyRequest = service.Files.Copy(copiedFile, originFileId);
// You can change more parameter of the request here
copyRequest.Execute();
You can modify the body of the request through the File object (copiedFile); you can look at the available parameters here.
Make sure that you include all the relevant scopes.
If you still have doubts about the fields or methods in the C# classes, you can take a deeper look at the C# documentation for the API.
I would also suggest trying the .NET Quickstart for the Drive API and working from there, in case the solution above doesn't work.
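For example, if you have several IDs to copy, the loop might look roughly like this. This is an untested sketch: service must be a DriveService authorized with OAuth user credentials and a Drive scope (an API key alone is not enough here), and sharedFileIds is just an illustrative list of the IDs your colleague sent:
var sharedFileIds = new List<string> { "fileId1", "fileId2" };
foreach (var fileId in sharedFileIds)
{
    var body = new Google.Apis.Drive.v3.Data.File { Name = "Copy of " + fileId };
    var copyRequest = service.Files.Copy(body, fileId);
    copyRequest.SupportsAllDrives = true; // in case the originals live on a shared drive
    var copy = copyRequest.Execute();
    Console.WriteLine($"Copied {fileId} -> {copy.Id}");
}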
I wrote some logic to upload files into a Google Drive folder which is shared with a service account (uploader@uploadproject-204816.iam.gserviceaccount.com).
The Google Drive itself is associated with my G Suite account.
This worked for a while, but eventually I started getting this error:
Google.Apis.Requests.RequestError
The user's Drive storage quota has been exceeded. [403]
Errors [ Message[The user's Drive storage quota has been exceeded.] Location[ - ] Reason[storageQuotaExceeded] Domain[global] ]
Ever since, the upload has not worked; I get this error even when uploading a tiny file.
Using my real account, I cleaned up all the files owned by the service account, even those that were "unorganized".
The quota shown on the left in the Drive UI says "170MB of 30GB used".
I also waited more than 24 hours.
I think this is about the service account itself, not the drive. But I'm not sure what to check.
I tried inspecting the results of the About.Get() method but didn't find anything useful.
var cred = new ServiceAccountCredential(new ServiceAccountCredential.Initializer(clientEmail)
{
Scopes = (new[] { "https://www.googleapis.com/auth/drive" }).ToList()
}.FromPrivateKey(privateKey));
service = new DriveService(new Google.Apis.Services.BaseClientService.Initializer { ApiKey = apiKey, HttpClientInitializer = cred });
var about = service.About.Get();
EDIT:
It turns out I needed to do something else to get more information about the quota:
var get = service.About.Get();
get.Fields = "*";
var about = get.Execute();
Then I can see in about.StorageQuota:
Limit: 16106127360
Usage: 16106125128
So indeed, I understand the storage quota error now. The question is, how do I reset these numbers? All the files the service account uploaded were deleted by MY account in the same location.
Any ideas?
You can't delete files created (and therefore owned) by the service account with your real account.
Even if you do, you won't see the files in your real account, but they still count against the service account.
You have to delete them with the service account.
Below is a PHP script to delete the files. For this you have to know the IDs of the files you want to delete, so we make a request to get all the file IDs contained in the parent folder referenced by 0B1-GGKdq5A3qX1dzNkhOLW1VWVU (each file and folder is referenced by a unique ID like this), and then loop to delete every file.
First, create a client connected to the service account:
$drive_service = new Google_Service_Drive($client);
Second, get all the files in the folder referenced by 0B1-GGKdq5A3qX1dzNkhOLW1VWVU:
$files_list = $drive_service->files->listFiles(array(
    'fields' => 'files(id, name)',
    'q' => "'0B1-GGKdq5A3qX1dzNkhOLW1VWVU' in parents",
));
$files_list = $files_list->getFiles();
Then loop to delete all the files:
foreach ($files_list as $file) {
    $id = $file->id;
    $resp = $drive_service->files->delete($id);
}
It turns out that to lower the storage usage of the service account, the files need to be deleted by the service account as well.
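For completeness, the same approach with the .NET client would look roughly like this. This is an untested sketch, assuming service is a Drive v3 DriveService authenticated as the service account and folderId is the parent folder to clean up:
var listRequest = service.Files.List();
listRequest.Q = $"'{folderId}' in parents and trashed = false";
listRequest.Fields = "files(id, name)";
var fileList = listRequest.Execute();

foreach (var file in fileList.Files)
{
    // Deleting with the owning service account actually frees its quota.
    service.Files.Delete(file.Id).Execute();
}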
When I press download and call this action, I get the result:
The resource you are looking for has been removed, had its name changed, or is temporarily unavailable.
and am redirected to
http://integratedproject20170322032906.azurewebsites.net/MyDocumentUps/Download/2020Resume.pdf
Why does it link to above and not to
https://filestorageideagen.blob.core.windows.net/documentuploader/2020Resume.pdf
as shown in the controller?
Here is my action link in the view:
@Html.ActionLink("Download", "Download", "MyDocumentUps",
    new { id = item.DocumentId.ToString() + item.RevisionId.ToString() + item.Attachment },
    new { target = "_blank" }) |
public ActionResult Download(string id)
{
string path = @"https://filestorageideagen.blob.core.windows.net/documentuploader/";
return View(path + id);
}
I can think of two ways by which you can force a file download in the client browser:
1. Return a FileResult or FileStreamResult from your controller. Here's an example of doing so: How can I present a file for download from an MVC controller?. Please note that this will download the file to your server first and then stream the contents to the client browser from there. For smaller files/low load this approach may work, but as your site grows or the files to be downloaded become bigger, it will put more stress on your web server.
2. Use a Shared Access Signature (SAS) for the blob with the Content-Disposition response header set. In this approach you simply create a SAS token for the blob to be downloaded and use it to build a SAS URL. Your controller then returns a RedirectResult with this URL. The advantage of this approach is that all downloads happen directly from Azure Storage and not through your server.
When creating SAS, please ensure that
You have at least Read permission in the SAS.
Content-Disposition header is overridden in the SAS.
The expiry of SAS should be sufficient for the file to download.
Here's the sample code to create a Shared Access Signature.
var sasToken = blob.GetSharedAccessSignature(new SharedAccessBlobPolicy()
{
Permissions = SharedAccessBlobPermissions.Read,
SharedAccessExpiryTime = DateTime.UtcNow.AddHours(1),
}, new SharedAccessBlobHeaders()
{
ContentDisposition = "attachment;filename=" + blob.Name
});
var blobUrl = string.Format("{0}{1}", blob.Uri.AbsoluteUri, sasToken);
return Redirect(blobUrl);
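In case it helps, resolving blob in the snippet above might look roughly like this. This is a sketch using the classic WindowsAzure.Storage SDK; the connection string and container names are placeholders, not taken from your code:
var storageAccount = CloudStorageAccount.Parse(
    ConfigurationManager.ConnectionStrings["StorageConnectionString"].ConnectionString);
var blobClient = storageAccount.CreateCloudBlobClient();
var container = blobClient.GetContainerReference("documentuploader");
// "id" is the file name passed to the Download action in the question.
CloudBlockBlob blob = container.GetBlockBlobReference(id);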
P.S. While I was answering the question, the question got edited so the answer may seem a bit out of whack :) :).
Given that you have the image URL as part of your model and the blob has no access restrictions:
<img src="@Model.YourImageUri" />
var image = new BitmapImage(new Uri("https://your_storage_account_name.blob.core.windows.net/your_container/your_image.jpg"));
I am trying to do some simple file I/O using Amazon S3 and C#.
So far I have been able to create files and list them. I am the bucket owner and should have full access. In CloudBerry I can create and delete files in the bucket, but in my code, when I try to delete a file, I get an access denied exception.
This is my test method:
[Test]
public void TestThatFilesCanBeCreatedAndDeleted()
{
const string testFile = "test.txt";
var awsS3Helper = new AwsS3Helper();
awsS3Helper.AddFileToBucketRoot(testFile);
var testList = awsS3Helper.ListItemsInBucketRoot();
Assert.True(testList.ContainsKey(testFile)); // This test passes
awsS3Helper.DeleteFileFromBucket(testFile); // Access denied exception here
testList = awsS3Helper.ListItemsInBucketRoot();
Assert.False(testList.ContainsKey(testFile));
}
My method to add a file:
var request = new PutObjectRequest();
request.WithBucketName(bucketName);
request.WithKey(fileName);
request.WithContentBody("");
S3Response response = client.PutObject(request);
response.Dispose();
My method to delete a file:
var request = new DeleteObjectRequest()
{
BucketName = bucketName,
Key = fileKey
};
S3Response response = client.DeleteObject(request);
response.Dispose();
After running the code the file is visible in CloudBerry and I can delete it from there.
I have very little experience with Amazon S3, so I don't know what could be going wrong. Should I be setting some kind of permissions on the files I create or upload? Why can I delete a file while logged in to CloudBerry with the same credentials provided to my program?
I'm not sure what the source of the problem is. Possibly security rules, but maybe something very simple in your bucket configuration. You can check it using the S3 Organizer Firefox plugin, the AWS management console, or any other management tool. I also recommend request-response logging; that has helped me a lot in various investigations. The AWS SDK has plenty of good examples with logging, so you only need to copy-paste them and everything works. If you have the actual requests sent to Amazon, you can compare them with the documentation. Please check the AccessKeyId used for your delete request.
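For example, to rule out a credentials mix-up, you could build the client explicitly with the same keys and retry the delete. This is a sketch against the same old AWS SDK for .NET API used in the question; the key values are placeholders:
AmazonS3 client = Amazon.AWSClientFactory.CreateAmazonS3Client("YOUR_ACCESS_KEY_ID", "YOUR_SECRET_KEY");

var deleteRequest = new DeleteObjectRequest
{
    BucketName = bucketName,
    Key = fileKey
};

// If this still throws AccessDenied, enable the SDK's request/response logging
// and compare the signed request with the S3 documentation.
S3Response deleteResponse = client.DeleteObject(deleteRequest);
deleteResponse.Dispose();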
I am using AWS.Net to upload user content (images) and display them on my site. This is what my code looks like currently for the upload:
using (client = Amazon.AWSClientFactory.CreateAmazonS3Client())
{
var putObjectRequest = new PutObjectRequest
{
BucketName = bucketName,
InputStream = fileStream,
Key = fileName,
CannedACL = S3CannedACL.PublicRead,
//MD5Digest = md5Base64,
//GenerateMD5Digest = true,
Timeout = 3600000 //1 Hour
};
S3Response response = client.PutObject(putObjectRequest);
response.Dispose();
}
What's the best way to store the path to these files? Is there a way to get a link to my file from the response?
Currently I just have a URL in my web.config like https://s3.amazonaws.com/<MyBucketName>/, and when I need to show an image I take that string and append the key from the object I store in the database that represents the uploaded file.
Is there a better way to do this?
All the examples that come with the SDK don't really address this kind of usage. And the documentation isn't online, and I don't know how to get to it after installing the SDK, despite following the directions on Amazon.
Your approach of storing paths including your bucket names in web.config is the same thing that I do, which works great. I then just store the relative paths in various database tables.
The nice thing about this approach is that it makes it easier to migrate to different storage mechanisms or CDNs such as CloudFront. I don't think that there's a better way than this approach because S3 files reside on a different domain, or subdomain if you do CNAME mapping, and thus your .NET runtime does not run under the same domain or subdomain.
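A small sketch of what that looks like in practice (the "S3BaseUrl" setting name and fileKey variable are just illustrative, not from your code):
// web.config: <appSettings><add key="S3BaseUrl" value="https://s3.amazonaws.com/<MyBucketName>/" /></appSettings>
string baseUrl = ConfigurationManager.AppSettings["S3BaseUrl"];
string imageUrl = baseUrl + fileKey; // fileKey is the S3 object key stored in the database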
There is also a "Location" property in the response which points directly to the URI where the S3 object is.