Can someone please show me how to determine whether a certain file/object exists in an S3 bucket, and display a message saying whether it exists or not?
Basically I want it to:
1) Check a bucket on my S3 account such as testbucket
2) Inside of that bucket, look to see if there is a file with the prefix test_ (test_file.txt or test_data.txt).
3) If a matching file exists, display a MessageBox (or console message) saying so; otherwise report that the file does not exist.
Can someone please show me how to do this?
Using the AWS SDK for .NET, I currently do something along the lines of:
public bool Exists(string fileKey, string bucketName)
{
try
{
var response = _s3Client.GetObjectMetadata(new GetObjectMetadataRequest()
.WithBucketName(bucketName)
.WithKey(fileKey));
return true;
}
catch (Amazon.S3.AmazonS3Exception ex)
{
if (ex.StatusCode == System.Net.HttpStatusCode.NotFound)
return false;
//status wasn't not found, so throw the exception
throw;
}
}
It kinda sucks, but it works for now.
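For newer (v3) versions of the SDK, where the .NET Core builds are async-only, a minimal sketch of the same approach, assuming an IAmazonS3 field named _s3Client:

public async Task<bool> ExistsAsync(string fileKey, string bucketName)
{
    try
    {
        await _s3Client.GetObjectMetadataAsync(new GetObjectMetadataRequest
        {
            BucketName = bucketName,
            Key = fileKey
        });
        return true;
    }
    catch (AmazonS3Exception ex) when (ex.StatusCode == System.Net.HttpStatusCode.NotFound)
    {
        return false; // any other status still propagates
    }
}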
Use the S3FileInfo.Exists method:
using (var client = Amazon.AWSClientFactory.CreateAmazonS3Client(accessKey, secretKey))
{
S3FileInfo s3FileInfo = new Amazon.S3.IO.S3FileInfo(client, "your-bucket-name", "your-file-name");
if (s3FileInfo.Exists)
{
// file exists
}
else
{
// file does not exist
}
}
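Note that S3FileInfo (and the S3DirectoryInfo class used in other answers here) live in the Amazon.S3.IO namespace; as far as I know these helpers ship only with the .NET Framework flavor of the SDK, not the .NET Standard/Core builds, so check your target platform before relying on them.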
Not sure if this applies to .NET Framework, but the .NET Core version of AWS SDK (v3) only supports async requests, so I had to use a slightly different solution:
/// <summary>
/// Determines whether a file exists within the specified bucket
/// </summary>
/// <param name="bucket">The name of the bucket to search</param>
/// <param name="filePrefix">Match files that begin with this prefix</param>
/// <returns>True if the file exists</returns>
public async Task<bool> FileExists(string bucket, string filePrefix)
{
// Set this to your S3 region (of course)
var region = Amazon.RegionEndpoint.USEast1;
using (var client = new AmazonS3Client(region))
{
var request = new ListObjectsRequest {
BucketName = bucket,
Prefix = filePrefix,
MaxKeys = 1
};
var response = await client.ListObjectsAsync(request, CancellationToken.None);
return response.S3Objects.Any();
}
}
And, if you want to search a folder:
/// <summary>
/// Determines whether a file exists within the specified folder
/// </summary>
/// <param name="bucket">The name of the bucket to search</param>
/// <param name="folder">The name of the folder to search</param>
/// <param name="filePrefix">Match files that begin with this prefix</param>
/// <returns>True if the file exists</returns>
public async Task<bool> FileExists(string bucket, string folder, string filePrefix)
{
return await FileExists(bucket, $"{folder}/{filePrefix}");
}
Usage:
var testExists = await FileExists("testBucket", "test_");
// or...
var testExistsInFolder = await FileExists("testBucket", "testFolder/testSubFolder", "test_");
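Note that this is a prefix match: FileExists("testBucket", "test") would also return true for a key like testing.txt, so pass the complete key if you need an exact match.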
This solves it:
List the bucket's existing objects using a prefix, like so.
var request = new ListObjectsRequest()
.WithBucketName(_bucketName)
.WithPrefix(keyPrefix);
var response = _amazonS3Client.ListObjects(request);
var exists = response.S3Objects.Count > 0;
foreach (var obj in response.S3Objects) {
// act
}
I know this question is a few years old, but the new SDK handles this beautifully. If anyone is still searching, you are looking for the S3DirectoryInfo class:
using (IAmazonS3 s3Client = new AmazonS3Client(accessKey, secretKey))
{
S3DirectoryInfo s3DirectoryInfo = new Amazon.S3.IO.S3DirectoryInfo(s3Client, "testbucket");
if (s3DirectoryInfo.GetFiles("test*").Any())
{
//file exists -- do something
}
else
{
//file doesn't exist -- do something else
}
}
I know this question is a few years old, but the SDK nowadays handles this in an easier manner.
public async Task<bool> ObjectExistsAsync(string prefix)
{
var response = await _amazonS3.GetAllObjectKeysAsync(_awsS3Configuration.BucketName, prefix, null);
return response.Count > 0;
}
Where _amazonS3 is your IAmazonS3 instance and _awsS3Configuration.BucketName is your bucket name.
You can use your complete key as a prefix.
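For example, a hypothetical call using a complete key:
var exists = await ObjectExistsAsync("test_data.txt");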
I used the following code (C#, Amazon S3 SDK version 3.1.5, .NET 3.5) to check if the bucket exists:
BasicAWSCredentials credentials = new BasicAWSCredentials("accessKey", "secretKey");
AmazonS3Config configurationAmazon = new AmazonS3Config();
configurationAmazon.RegionEndpoint = RegionEndpoint.EUWest1; // or you can use ServiceURL
AmazonS3Client s3Client = new AmazonS3Client(credentials, configurationAmazon);
S3DirectoryInfo directoryInfo = new S3DirectoryInfo(s3Client, bucketName);
bucketExists = directoryInfo.Exists; // true if the bucket exists, otherwise false
I used the following code (C#, Amazon S3 SDK version 3.1.5, .NET 3.5) to check whether a file exists.
Option 1:
S3FileInfo info = new S3FileInfo(s3Client, "bucketName", "key");
bool fileExists = info.Exists; // true if the key exists, otherwise false
Option 2:
ListObjectsRequest request = new ListObjectsRequest();
request.BucketName = "bucketName";
request.Prefix = "prefix"; // or part of the key
request.MaxKeys = 1; // max number of keys to return
ListObjectsResponse response = s3Client.ListObjects(request);
return response.S3Objects.Count > 0;
I'm not familiar with C#, but I use this method from Java (the conversion to C# is straightforward):
public boolean exists(AmazonS3 s3, String bucket, String key) {
ObjectListing list = s3.listObjects(bucket, key);
return list.getObjectSummaries().size() > 0;
}
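For reference, a rough C# equivalent of the Java method above (a sketch, assuming the async-only v3 SDK):

public async Task<bool> ExistsAsync(IAmazonS3 s3, string bucket, string key)
{
    // Like the Java listObjects call, the key acts as a prefix here
    var response = await s3.ListObjectsAsync(new ListObjectsRequest
    {
        BucketName = bucket,
        Prefix = key,
        MaxKeys = 1
    });
    return response.S3Objects.Count > 0;
}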
my 2 cents
public async Task<bool> DoesKeyExistsAsync(string key)
{
ListObjectsResponse response = null;
try
{
response = await _s3Client.ListObjectsAsync(new ListObjectsRequest { BucketName = _settings.BucketName, Prefix = key });
}
catch (Exception ex)
{
_logger.LogError(ex, $"Error while checking key {key}");
return false;
}
return (response?.S3Objects?.Count > 0);
}
$s3 = new S3(S3KEY, S3SECRET, false);
$res = $s3->getObjectInfo($bucketName, $filename);
(This is PHP.) It returns an array if the file exists.
try this one:
NameValueCollection appConfig = ConfigurationManager.AppSettings;
AmazonS3 s3Client = AWSClientFactory.CreateAmazonS3Client(
appConfig["AWSAccessKey"],
appConfig["AWSSecretKey"],
Amazon.RegionEndpoint.USEast1
);
S3DirectoryInfo source = new S3DirectoryInfo(s3Client, "BUCKET_NAME", "Key");
if (source.Exists)
{
// do your stuff
}
using Amazon;
using Amazon.S3;
using Amazon.S3.IO;
using Amazon.S3.Model;
string accessKey = "xxxxx";
string secretKey = "xxxxx";
RegionEndpoint regionEndpoint = RegionEndpoint.EUWest1;
string bucketName = "Bucket1";
string filePath = "https://Bucket1/users/delivery/file.json"
public bool FileExistsOnS3(string filePath)
{
try
{
Uri myUri = new Uri(filePath);
string absolutePath = myUri.AbsolutePath; // /users/delivery/file.json
string key = absolutePath.Substring(1); // users/delivery/file.json
using(var client = AWSClientFactory.CreateAmazonS3Client(accessKey, secretKey, regionEndpoint))
{
S3FileInfo file = new S3FileInfo(client, bucketName, key);
if (file.Exists)
{
return true;
// custom logic
}
else
{
return false;
// custom logic
}
}
}
catch(AmazonS3Exception ex)
{
return false;
}
}
There is an overload of GetFileSystemInfos that takes a search pattern.
Notice this line uses filename.* as the pattern:
var files = s3DirectoryInfo.GetFileSystemInfos("filename.*");
public bool Check()
{
var awsCredentials = new Amazon.Runtime.BasicAWSCredentials("AccessKey", "SecretKey");
using (var client = new AmazonS3Client(awsCredentials, Amazon.RegionEndpoint.USEast1))
{
S3DirectoryInfo s3DirectoryInfo = new S3DirectoryInfo(client, bucketName, "YourFilePath");
var files = s3DirectoryInfo.GetFileSystemInfos("filename.*");
return files.Any(); // true if any matching file exists
}
}
I have to go through all team drives, folders, and files and the code I have works perfectly fine:
/// <summary>
/// Get All files inside a folder
/// </summary>
/// <param name="service"></param>
/// <param name="folderId"></param>
/// <param name="td"></param>
/// <returns></returns>
private static async Task<Google.Apis.Drive.v3.Data.FileList> GetAllFilesInsideFolder(DriveService service, string folderId, TeamDrive td)
{
string FolderId = folderId;
// Define parameters of request.
FilesResource.ListRequest listRequest = service.Files.List();
listRequest.Corpora = "drive";
listRequest.SupportsAllDrives = true;
listRequest.DriveId = td.Id;
listRequest.PageSize = 100;
listRequest.IncludeItemsFromAllDrives = true;
listRequest.Q = "'" + FolderId + "' in parents and trashed=false";
listRequest.Fields = "nextPageToken, files(*)";
// List files.
Google.Apis.Drive.v3.Data.FileList files = await listRequest.ExecuteAsync();
return files;
}
/// <summary>
/// Gets all folders from an specific team drive
/// </summary>
/// <param name="service"></param>
/// <param name="td"></param>
/// <returns></returns>
private static async Task<Google.Apis.Drive.v3.Data.FileList> GetAllFoldersFromTeamDriveAsync(DriveService service, TeamDrive td)
{
try
{
var request = service.Files.List();
request.Corpora = "drive";
request.SupportsAllDrives = true;
request.DriveId = td.Id;
request.PageSize = 100;
request.IncludeItemsFromAllDrives = true;
request.Q = "mimeType = 'application/vnd.google-apps.folder'";
var response = await request.ExecuteAsync();
return response;
}
catch (Exception ex )
{
Console.WriteLine("GetAllFoldersFromTeamDriveAsync Error: " + ex.Message);
throw;
}
}
/// <summary>
/// Gets all teamdrives
/// </summary>
/// <param name="service"></param>
/// <returns></returns>
private static async Task<Google.Apis.Drive.v3.Data.TeamDriveList> GetAllTeamDrivesAsync(DriveService service)
{
try
{
var request = service.Teamdrives.List();
request.PageSize = 100;
var response = await request.ExecuteAsync();
return response;
}
catch (Exception ex)
{
Console.WriteLine("GetAllTeamDrivesAsync Error" + ex.Message);
throw;
}
}
Main method:
public async Task RunFix()
{
UserCredential credential;
Directory.CreateDirectory(StorageLocation);
var DataStore = new FileDataStore(StorageLocation);
using var stream = new FileStream(CredentialsString, FileMode.Open, FileAccess.Read);
// Please give consent to the Auth/drive scope
credential = await GoogleWebAuthorizationBroker.AuthorizeAsync(
GoogleClientSecrets.Load(stream).Secrets,
Scopes,
Username,
CancellationToken.None,
DataStore);
try
{
var service = new DriveService(new BaseClientService.Initializer()
{
HttpClientInitializer = credential,
ApplicationName = ApplicationName,
});
var clientbillCycleGoogleDriveFilesDEV = new TableClient("...");
// Get All team drives
var allTeamDrives = await GetAllTeamDrivesAsync(service);
// We will use a CSV to copy the unmapped files
var csvUnmappedBillCycleFiles = new StringBuilder();
//Iterate over all Team Drives I have access to:
foreach (TeamDrive td in allTeamDrives.TeamDrives)
{
if (td.Name == "XXX")
{
Console.WriteLine("Team Drive: " + td.Name);
Stopwatch sAllFilesInsideFolder = Stopwatch.StartNew();
// Get All Folders inside this team drive
var allFolders = await GetAllFoldersFromTeamDriveAsync(service, td);
foreach (Google.Apis.Drive.v3.Data.File file in allFolders.Files)
{
Console.WriteLine("--Folder-- " + file.Name);
if (file.Name.StartsWith("xyz"))
{
//some business logic
var allFilesInsideFolder = await GetAllFilesInsideFolder(service, file.Id, td);
foreach (Google.Apis.Drive.v3.Data.File googlefile in allFilesInsideFolder.Files)
{
//some business logic
}
}
}
}
}
System.IO.File.WriteAllText("csvUnmappedBillCycleFiles.csv", csvUnmappedBillCycleFiles.ToString());
}
catch (Exception ex)
{
Console.WriteLine("Error" + ex.Message);
}
}
The problem is that I can't set PageSize to anything above 100.
How can I change this code to actually go through everything?
Assuming that you are using drives.list and files.list (a reasonable guess, given your GetAllTeamDrivesAsync and GetAllFoldersFromTeamDriveAsync methods):
You can set the page size to a maximum of 1000 on your files.list request.
var request = service.Files.List();
request.PageSize = 1000;
But you are still going to need to paginate over the pages. I recommend using the PageStreamer; it will loop over each of the pages for you.
var request = service.Files.List();
request.PageSize = 1000;
var pageStreamer =
new Google.Apis.Requests.PageStreamer<Google.Apis.Drive.v3.Data.File, FilesResource.ListRequest,
Google.Apis.Drive.v3.Data.FileList, string>(
(req, token) => req.PageToken = token,
response => response.NextPageToken,
response => response.Files);
// data storage
var all = new Google.Apis.Drive.v3.Data.FileList();
all.Files = new List<Google.Apis.Drive.v3.Data.File>();
// fetching data and adding it to storage
foreach (var result in await pageStreamer.FetchAllAsync(request, CancellationToken.None))
{
all.Files.Add(result);
}
I have a video up that explains how PageStreamer works: How to use the nextPageToken from the files.list method in the Google Drive API v3.
You should be able to use PageStreamer with the drives (or team drives) listing as well.
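For example, here is a sketch of the same pattern applied to the team drives listing from the question (assuming the TeamDrives resource of the v3 client library):

var tdRequest = service.Teamdrives.List();
tdRequest.PageSize = 100;
var tdPageStreamer =
    new Google.Apis.Requests.PageStreamer<TeamDrive, TeamdrivesResource.ListRequest,
        Google.Apis.Drive.v3.Data.TeamDriveList, string>(
        (req, token) => req.PageToken = token,
        response => response.NextPageToken,
        response => response.TeamDrives);
var allTeamDrives = new List<TeamDrive>();
foreach (var td in await tdPageStreamer.FetchAllAsync(tdRequest, CancellationToken.None))
{
    allTeamDrives.Add(td);
}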
When my AWS credentials file (see docs) is updated by an external process, the AmazonSQSClient doesn't re-read it, and SendMessageAsync fails with a security/token error.
We use a custom PowerShell script to refresh the local AWS credentials file periodically. The script works fine, and the file is refreshed prior to the credentials expiring on AWS. However, if my app is running when the file is refreshed, the new credentials are not re-read from the file; the client will show that the previous credentials are still in use.
The AWS docs list several AWSCredentials providers, but none of them seem to be the correct choice... I think.
Restarting the app works; the new credentials are read correctly, and messages are sent until the next time the credentials file is updated.
using (var client = new AmazonSQSClient(Amazon.RegionEndpoint.EUWest1))
{
return client.SendMessageAsync(request);
}
I don't think there is a way for a running app to pick up the default credentials being refreshed in the credentials file. There is a solution for Node.js that loads credentials from a JSON file, and you can create a similar solution in C#. You could also run a local DB to store credentials, so that whenever the credentials file is updated the DB table or JSON file is also updated. You will need to use the access key and secret key in your SQS client constructor, as opposed to using the default credentials.
// Load these from JSON file or DB.
var accessKey = "";
var secretKey = "";
using (var client = new AmazonSQSClient(accessKey, secretKey, Amazon.RegionEndpoint.EUWest1))
{
return client.SendMessageAsync(request);
}
The following works "OK", but I've only tested it with one profile, and the file watcher is not as timely as you'd like, so I'd recommend wrapping your usage inside a retry mechanism.
// Usage..
var credentials = new AwsCredentialsFile();
using (var client = new AmazonSQSClient(credentials, Amazon.RegionEndpoint.EUWest1))
{
return client.SendMessageAsync(request);
}
public class AwsCredentialsFile : AWSCredentials
{
// https://docs.aws.amazon.com/sdk-for-net/v2/developer-guide/net-dg-config-creds.html#creds-file
private const string DefaultProfileName = "default";
private static ConcurrentDictionary<string, ImmutableCredentials> _credentials = new ConcurrentDictionary<string, ImmutableCredentials>(StringComparer.OrdinalIgnoreCase);
private static FileSystemWatcher _watcher = BuildFileSystemWatcher();
private readonly System.Text.Encoding _encoding;
private readonly string _profileName;
public AwsCredentialsFile()
: this(AwsCredentialsFile.DefaultProfileName, System.Text.Encoding.UTF8)
{
}
public AwsCredentialsFile(string profileName)
: this(profileName, System.Text.Encoding.UTF8)
{
}
public AwsCredentialsFile(string profileName, System.Text.Encoding encoding)
{
_profileName = profileName;
_encoding = encoding;
}
private static FileSystemWatcher BuildFileSystemWatcher()
{
var watcher = new FileSystemWatcher
{
Path = Path.GetDirectoryName(GetDefaultCredentialsFilePath()),
NotifyFilter = NotifyFilters.LastWrite,
Filter = "credentials"
};
watcher.Changed += (object source, FileSystemEventArgs e) => { _credentials?.Clear(); };
watcher.EnableRaisingEvents = true;
return watcher;
}
public static string GetDefaultCredentialsFilePath()
{
return System.Environment.ExpandEnvironmentVariables(@"C:\Users\%USERNAME%\.aws\credentials");
}
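// Note: a more portable alternative (an assumption, not part of the original answer):
// return Path.Combine(Environment.GetFolderPath(Environment.SpecialFolder.UserProfile), ".aws", "credentials");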
public static (string AccessKey, string SecretAccessKey, string Token) ReadCredentialsFromFile(string profileName, System.Text.Encoding encoding)
{
var profile = $"[{profileName}]";
string awsAccessKeyId = null;
string awsSecretAccessKey = null;
string token = null;
var lines = File.ReadAllLines(GetDefaultCredentialsFilePath(), encoding);
for (int i = 0; i < lines.Length; i++)
{
var text = lines[i];
if (text.Equals(profile, StringComparison.OrdinalIgnoreCase))
{
awsAccessKeyId = lines[i + 1].Replace("aws_access_key_id = ", string.Empty);
awsSecretAccessKey = lines[i + 2].Replace("aws_secret_access_key = ", string.Empty);
if (lines.Length > i + 3)
{
token = lines[i + 3].Replace("aws_session_token = ", string.Empty);
}
break;
}
}
var result = (AccessKey: awsAccessKeyId, SecretAccessKey: awsSecretAccessKey, Token: token);
return result;
}
public override ImmutableCredentials GetCredentials()
{
if (_credentials.TryGetValue(_profileName, out ImmutableCredentials value))
{
return value;
}
else
{
var (AccessKey, SecretAccessKey, Token) = ReadCredentialsFromFile(_profileName, _encoding);
var credentials = new ImmutableCredentials(AccessKey, SecretAccessKey, Token);
_credentials.TryAdd(_profileName, credentials);
return credentials;
}
}
}
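For what it's worth, newer SDK versions ship a shared-credentials-file reader that can replace most of the hand parsing above. A minimal sketch, assuming the Amazon.Runtime.CredentialManagement namespace is available in your SDK version:

using Amazon.Runtime.CredentialManagement;

// Re-read the profile from disk whenever fresh credentials are needed
var credentialsFile = new SharedCredentialsFile();
if (credentialsFile.TryGetProfile("default", out CredentialProfile profile) &&
    AWSCredentialsFactory.TryGetAWSCredentials(profile, credentialsFile, out var freshCredentials))
{
    // pass freshCredentials to the AmazonSQSClient constructor
}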
I'm looping through all object ACLs in a bucket, to remove "Everyone" permissions from all of them. The idea here is to retain all the other current permissions.
My issue is that the PutACL call doesn't work. In the example below, a new AccessControlList is created, omitting the "everyone" entries. The PutACL call returns successfully, but the object's ACL is unchanged.
Perhaps there is an easier way to identify and remove specific Grants.
AmazonS3Client s3 = new AmazonS3Client();
GetACLRequest aclRequest = new GetACLRequest() { BucketName = "my-bucket", Key = "/dir/protect_me.txt" };
var aclResponse = s3.GetACL(aclRequest);
bool foundEveryonePriv = false; //if found at least one.
S3AccessControlList newAcl = new S3AccessControlList();
foreach (var grant in aclResponse.AccessControlList.Grants)
{
bool grantToEveryone = string.Compare(grant.Grantee.URI, "http://acs.amazonaws.com/groups/global/AllUsers") == 0;
Logger.log.InfoFormat("{0},{1},{2},{3}", aclRequest.BucketName, o.Key, grant.Permission, (grantToEveryone ? "EVERYONE" : string.Empty));
if (grantToEveryone)
{
foundEveryonePriv = true;
}
else
{
newAcl.AddGrant(grant.Grantee, grant.Permission);
}
}
//modify the items if necessary and requested.
if (foundEveryonePriv)
{
newAcl.Owner = aclResponse.AccessControlList.Owner;
var response = s3.PutACL(new PutACLRequest() { AccessControlList = newAcl, BucketName = aclRequest.BucketName, Key = o.Key });
}
Try modifying the existing ACL from the GET to remove the public grant. Then send the modified ACL back in a PUT request. Here's what I did and it's working well to retain the original grants and remove the public grant from a given object.
private void RemovePublicAcl(AmazonS3Client client, string bucket, string key)
{
var aclRequest = new GetACLRequest { BucketName = bucket, Key = key };
var aclResponse = client.GetACL(aclRequest);
var acl = aclResponse.AccessControlList;
const string PUBLIC_GRANTEE = "http://acs.amazonaws.com/groups/global/AllUsers";
if (acl.Grants.Any(x =>
!string.IsNullOrWhiteSpace(x.Grantee.URI) &&
x.Grantee.URI.Equals(PUBLIC_GRANTEE)))
{
acl.Grants.RemoveAll(x =>
!string.IsNullOrWhiteSpace(x.Grantee.URI) &&
x.Grantee.URI.Equals(PUBLIC_GRANTEE));
var aclUpdate = new PutACLRequest();
aclUpdate.BucketName = bucket;
aclUpdate.Key = key;
aclUpdate.AccessControlList = acl;
var response = client.PutACL(aclUpdate);
}
}
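Usage is then just (the bucket and key here are placeholders):
RemovePublicAcl(client, "my-bucket", "dir/protect_me.txt");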
Can someone please provide an example of how to use Google.Apis.Storage.v1 for uploading files to google cloud storage in c#?
I found that this basic operation is not as straightforward as you might expect. Google's documentation about its Storage API is lacking in information about using it in C# (or any other .NET language). Searching for 'how to upload a file to Google Cloud Storage in C#' didn't exactly help me, so here is my working solution, with some comments:
Preparation:
You need to create an OAuth2 account in your Google Developers Console - go to Project/APIs & auth/Credentials.
Copy Client ID & Client Secret to your code. You will also need your Project name.
Code (it assumes that you've added Google.Apis.Storage.v1 via NuGet):
First, you need to authorize your requests:
var clientSecrets = new ClientSecrets();
clientSecrets.ClientId = clientId;
clientSecrets.ClientSecret = clientSecret;
//there are different scopes, which you can find here https://cloud.google.com/storage/docs/authentication
var scopes = new[] { @"https://www.googleapis.com/auth/devstorage.full_control" };
var cts = new CancellationTokenSource();
var userCredential = await GoogleWebAuthorizationBroker.AuthorizeAsync(clientSecrets, scopes, "yourGoogle@email", cts.Token);
Sometimes you might also want to refresh authorization token via:
await userCredential.RefreshTokenAsync(cts.Token);
You also need to create Storage Service:
var service = new Google.Apis.Storage.v1.StorageService();
Now you can make requests to Google Storage API.
Let's start with creating a new bucket:
var newBucket = new Google.Apis.Storage.v1.Data.Bucket()
{
Name = "your-bucket-name-1"
};
var newBucketQuery = service.Buckets.Insert(newBucket, projectName);
newBucketQuery.OauthToken = userCredential.Token.AccessToken;
//you probably want to wrap this into try..catch block
newBucketQuery.Execute();
And it's done. Now, you can send a request to get list of all of your buckets:
var bucketsQuery = service.Buckets.List(projectName);
bucketsQuery.OauthToken = userCredential.Token.AccessToken;
var buckets = bucketsQuery.Execute();
Last part is uploading new file:
//enter bucket name to which you want to upload file
var bucketToUpload = buckets.Items.FirstOrDefault().Name;
var newObject = new Google.Apis.Storage.v1.Data.Object()
{
Bucket = bucketToUpload,
Name = "some-file-"+new Random().Next(1,666)
};
FileStream fileStream = null;
try
{
var dir = Directory.GetCurrentDirectory();
var path = Path.Combine(dir, "test.png");
fileStream = new FileStream(path, FileMode.Open);
var uploadRequest = new Google.Apis.Storage.v1.ObjectsResource.InsertMediaUpload(service, newObject,
bucketToUpload,fileStream,"image/png");
uploadRequest.OauthToken = userCredential.Token.AccessToken;
await uploadRequest.UploadAsync();
}
catch (Exception ex)
{
Console.WriteLine(ex.Message);
}
finally
{
if (fileStream != null)
{
fileStream.Dispose();
}
}
And bam! The new file will be visible in your Google Developers Console inside the selected bucket.
You can use the Google Cloud APIs without the SDK in the following way:
The api-key.json file is required.
Install the package Google.Apis.Auth.OAuth2 in order to authorize the HTTP web request.
You can set the default configuration for your application as shown below.
I did the same using a .NET Core web API, and the details are given below:
URL details:
"GoogleCloudStorageBaseUrl": "https://www.googleapis.com/upload/storage/v1/b/",
"GoogleSpeechBaseUrl": "https://speech.googleapis.com/v1/operations/",
"GoogleLongRunningRecognizeBaseUrl": "https://speech.googleapis.com/v1/speech:longrunningrecognize",
"GoogleCloudScope": "https://www.googleapis.com/auth/cloud-platform",
public void GetConfiguration()
{
// Set global configuration
bucketName = _configuration.GetValue<string>("BucketName");
googleCloudStorageBaseUrl = _configuration.GetValue<string>("GoogleCloudStorageBaseUrl");
googleSpeechBaseUrl = _configuration.GetValue<string>("GoogleSpeechBaseUrl");
googleLongRunningRecognizeBaseUrl = _configuration.GetValue<string>("GoogleLongRunningRecognizeBaseUrl");
// Set google cloud credentials
string googleApplicationCredentialsPath = _configuration.GetValue<string>("GoogleCloudCredentialPath");
using (Stream stream = new FileStream(googleApplicationCredentialsPath, FileMode.Open, FileAccess.Read))
googleCredential = GoogleCredential.FromStream(stream).CreateScoped(_configuration.GetValue<string>("GoogleCloudScope"));
}
Get Oauth token:
public string GetOAuthToken()
{
return googleCredential.UnderlyingCredential.GetAccessTokenForRequestAsync("https://accounts.google.com/o/oauth2/v2/auth", CancellationToken.None).Result;
}
To upload file to cloud bucket:
public async Task<string> UploadMediaToCloud(string filePath, string objectName = null)
{
string bearerToken = GetOAuthToken();
byte[] fileBytes = File.ReadAllBytes(filePath);
objectName = objectName ?? Path.GetFileName(filePath);
var baseUrl = new Uri(googleCloudStorageBaseUrl + bucketName + "/o?uploadType=media&name=" + objectName);
using (WebClient client = new WebClient())
{
client.Headers.Add(HttpRequestHeader.Authorization, "Bearer " + bearerToken);
client.Headers.Add(HttpRequestHeader.ContentType, "application/octet-stream");
byte[] response = await Task.Run(() => client.UploadData(baseUrl, "POST", fileBytes));
string responseInString = Encoding.UTF8.GetString(response);
return responseInString;
}
}
To perform any other action against the cloud API, you just need to make an HttpClient GET/POST request as the requirement dictates; for example, the upload above could be rewritten with HttpClient as sketched below.
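A minimal HttpClient version of the same media upload (a sketch reusing the assumed configuration fields from above; requires using System.Net.Http;):

public async Task<string> UploadMediaToCloudWithHttpClient(string filePath, string objectName = null)
{
    string bearerToken = GetOAuthToken();
    objectName = objectName ?? Path.GetFileName(filePath);
    var url = googleCloudStorageBaseUrl + bucketName + "/o?uploadType=media&name=" + objectName;
    using (var http = new HttpClient())
    using (var content = new ByteArrayContent(File.ReadAllBytes(filePath)))
    {
        http.DefaultRequestHeaders.Authorization =
            new System.Net.Http.Headers.AuthenticationHeaderValue("Bearer", bearerToken);
        content.Headers.ContentType =
            new System.Net.Http.Headers.MediaTypeHeaderValue("application/octet-stream");
        var response = await http.PostAsync(url, content);
        return await response.Content.ReadAsStringAsync();
    }
}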
Thanks
This is for Google.Cloud.Storage.V1 (not Google.Apis.Storage.v1), but it now appears to be a bit simpler to perform an upload. I started with the client libraries' "Getting Started" instructions to create a service account and bucket, then experimented to find out how to upload an image.
The process I followed was:
Sign up for Google Cloud free trial
Create a new project in Google Cloud (remember the project name\ID for later)
Create a Project Owner service account - this will result in a json file being downloaded that contains the service account credentials. Remember where you put that file.
The getting started docs have you add the path to the JSON credentials file to an environment variable called GOOGLE_APPLICATION_CREDENTIALS. I couldn't get this to work through the provided instructions, but it turns out it is not required, as you can just read the JSON file into a string and pass it to the client constructor.
I created an empty WPF project as a starting point, and a single ViewModel to house the application logic.
Install the Google.Cloud.Storage.V1 nuget package and it should pull in all the dependencies it needs.
Onto the code.
MainWindow.xaml
<StackPanel>
<Button
Margin="50"
Height="50"
Content="BEGIN UPLOAD"
Click="OnButtonClick" />
<ContentControl
Content="{Binding Path=ProgressBar}" />
</StackPanel>
MainWindow.xaml.cs
public partial class MainWindow
{
readonly ViewModel _viewModel;
public MainWindow()
{
_viewModel = new ViewModel(Dispatcher);
DataContext = _viewModel;
InitializeComponent();
}
void OnButtonClick(object sender, RoutedEventArgs args)
{
_viewModel.UploadAsync().ConfigureAwait(false);
}
}
ViewModel.cs
public class ViewModel
{
readonly Dispatcher _dispatcher;
public ViewModel(Dispatcher dispatcher)
{
_dispatcher = dispatcher;
ProgressBar = new ProgressBar {Height=30};
}
public async Task UploadAsync()
{
// Google Cloud Platform project ID.
const string projectId = "project-id-goes-here";
// The name for the new bucket.
const string bucketName = projectId + "-test-bucket";
// Path to the file to upload
const string filePath = @"C:\path\to\image.jpg";
var newObject = new Google.Apis.Storage.v1.Data.Object
{
Bucket = bucketName,
Name = System.IO.Path.GetFileNameWithoutExtension(filePath),
ContentType = "image/jpeg"
};
// read the JSON credential file saved when you created the service account
var credential = Google.Apis.Auth.OAuth2.GoogleCredential.FromJson(System.IO.File.ReadAllText(
#"c:\path\to\service-account-credentials.json"));
// Instantiates a client.
using (var storageClient = Google.Cloud.Storage.V1.StorageClient.Create(credential))
{
try
{
// Creates the new bucket. Only required the first time.
// You can also create buckets through the GCP cloud console web interface
storageClient.CreateBucket(projectId, bucketName);
System.Windows.MessageBox.Show($"Bucket {bucketName} created.");
// Open the image file filestream
using (var fileStream = new System.IO.FileStream(filePath, System.IO.FileMode.Open))
{
ProgressBar.Maximum = fileStream.Length;
// set minimum chunksize just to see progress updating
var uploadObjectOptions = new Google.Cloud.Storage.V1.UploadObjectOptions
{
ChunkSize = Google.Cloud.Storage.V1.UploadObjectOptions.MinimumChunkSize
};
// Hook up the progress callback
var progressReporter = new Progress<Google.Apis.Upload.IUploadProgress>(OnUploadProgress);
await storageClient.UploadObjectAsync(
newObject,
fileStream,
uploadObjectOptions,
progress: progressReporter)
.ConfigureAwait(false);
}
}
catch (Google.GoogleApiException e)
when (e.Error.Code == 409)
{
// When creating the bucket - The bucket already exists. That's fine.
System.Windows.MessageBox.Show(e.Error.Message);
}
catch (Exception e)
{
// other exception
System.Windows.MessageBox.Show(e.Message);
}
}
}
// Called when progress updates
void OnUploadProgress(Google.Apis.Upload.IUploadProgress progress)
{
switch (progress.Status)
{
case Google.Apis.Upload.UploadStatus.Starting:
ProgressBar.Minimum = 0;
ProgressBar.Value = 0;
break;
case Google.Apis.Upload.UploadStatus.Completed:
ProgressBar.Value = ProgressBar.Maximum;
System.Windows.MessageBox.Show("Upload completed");
break;
case Google.Apis.Upload.UploadStatus.Uploading:
UpdateProgressBar(progress.BytesSent);
break;
case Google.Apis.Upload.UploadStatus.Failed:
System.Windows.MessageBox.Show("Upload failed"
+ Environment.NewLine
+ progress.Exception);
break;
}
}
void UpdateProgressBar(long value)
{
_dispatcher.Invoke(() => { ProgressBar.Value = value; });
}
// probably better to expose progress value directly and bind to
// a ProgressBar in the XAML
public ProgressBar ProgressBar { get; }
}
Using the SDK to upload files to Google Cloud Storage in C#:
The api-key.json file is required.
Install the packages Google.Cloud.Storage.V1 and Google.Apis.Auth.OAuth2.
The code to upload a file to the cloud is given below:
private string UploadFile(string localPath, string objectName = null)
{
string projectId = ((Google.Apis.Auth.OAuth2.ServiceAccountCredential)googleCredential.UnderlyingCredential).ProjectId;
try
{
// Creates the new bucket.
var objResult = storageClient.CreateBucket(projectId, bucketName);
if (!string.IsNullOrEmpty(objResult.Id))
{
// Upload file to google cloud server
using (var f = File.OpenRead(localPath))
{
objectName = objectName ?? Path.GetFileName(localPath);
var objFileUploadStatus1 = storageClient.UploadObject(bucketName, objectName, null, f);
}
}
}
catch (Google.GoogleApiException ex)
{
// Error code 409 means the bucket already exists, so upload the file into it
if (ex.Error.Code == 409)
{
// Upload file to google cloud server
using (var f = File.OpenRead(localPath))
{
objectName = objectName ?? Path.GetFileName(localPath);
var objFileUploadStatus2 = storageClient.UploadObject(bucketName, objectName, null, f);
}
}
}
catch (Exception ex)
{
MessageBox.Show(ex.ToString());
}
return objectName;
}
To set the credentials
private bool SetStorageCredentials()
{
bool status = true;
try
{
if (File.Exists(credential_path))
{
Environment.SetEnvironmentVariable("GOOGLE_APPLICATION_CREDENTIALS", credential_path);
using (Stream objStream = new FileStream(Environment.GetEnvironmentVariable("GOOGLE_APPLICATION_CREDENTIALS"), FileMode.Open, FileAccess.Read))
googleCredential = GoogleCredential.FromStream(objStream);
// Instantiates a client.
storageClient = StorageClient.Create();
channel = new Grpc.Core.Channel(SpeechClient.DefaultEndpoint.Host, googleCredential.ToChannelCredentials());
}
else
{
DialogResult result = MessageBox.Show("File " + Path.GetFileName(credential_path) + " does not exist. Please provide the correct path.");
if (result == System.Windows.Forms.DialogResult.OK)
{
status = false;
}
}
}
catch (Exception ex)
{
MessageBox.Show(ex.ToString());
status = false;
}
return status;
}
I used the SDK in one of my Windows applications. You can adapt the same code according to your needs/requirements.
You'll be happy to know it still works in 2016...
I was googling all over using fancy keywords like "google gcp C# upload image", until I just plainly asked the question: "How do I upload an image to a Google bucket using C#"... and here I am. I removed the .Result on the user credential, and this was the final edit that worked for me.
// ******
static string bucketForImage = ConfigurationManager.AppSettings["testStorageName"];
static string projectName = ConfigurationManager.AppSettings["GCPProjectName"];
string gcpPath = Path.Combine(Server.MapPath("~/Images/Gallery/"), uniqueGcpName + ext);
var clientSecrets = new ClientSecrets();
clientSecrets.ClientId = ConfigurationManager.AppSettings["GCPClientID"];
clientSecrets.ClientSecret = ConfigurationManager.AppSettings["GCPClientSc"];
var scopes = new[] { @"https://www.googleapis.com/auth/devstorage.full_control" };
var cts = new CancellationTokenSource();
var userCredential = await GoogleWebAuthorizationBroker.AuthorizeAsync(clientSecrets, scopes, ConfigurationManager.AppSettings["GCPAccountEmail"], cts.Token);
var service = new Google.Apis.Storage.v1.StorageService();
var bucketToUpload = bucketForImage;
var newObject = new Google.Apis.Storage.v1.Data.Object()
{
Bucket = bucketToUpload,
Name = bkFileName
};
FileStream fileStream = null;
try
{
fileStream = new FileStream(gcpPath, FileMode.Open);
var uploadRequest = new Google.Apis.Storage.v1.ObjectsResource.InsertMediaUpload(service, newObject,
bucketToUpload, fileStream, "image/"+ ext);
uploadRequest.OauthToken = userCredential.Token.AccessToken;
await uploadRequest.UploadAsync();
}
catch (Exception ex)
{
Console.WriteLine(ex.Message);
}
finally
{
if (fileStream != null)
{
fileStream.Dispose();
}
}
// ******
Here is the link to their official C# example of the ".NET Bookshelf App" using Google Cloud Storage.
https://cloud.google.com/dotnet/docs/getting-started/using-cloud-storage
Source on GitHub:
https://github.com/GoogleCloudPlatform/getting-started-dotnet/blob/master/aspnet/3-binary-data/Services/ImageUploader.cs
https://github.com/GoogleCloudPlatform/getting-started-dotnet/tree/master/aspnet/3-binary-data
NuGet:
https://www.nuget.org/packages/Google.Cloud.Storage.V1/
Here are 2 examples that helped me to upload files to a bucket in Google Cloud Storage with Google.Cloud.Storage.V1 (not Google.Apis.Storage.v1):
Upload files to Google cloud storage using c#
Uploading .csv Files to Google Cloud Storage using C# .Net
I got both working on a C# Console Application just for testing purposes.
February 2021:
string _projectId = "YOUR-PROJECT-ID-GCP"; //ProjectID also present in the json file
GoogleCredential _credential = GoogleCredential.FromFile("credential-cloud-file-123418c9e06c.json");
/// <summary>
/// UploadFile to GCS Bucket
/// </summary>
/// <param name="bucketName"></param>
/// <param name="localPath">my-local-path/my-file-name</param>
/// <param name="objectName">my-file-name</param>
public void UploadFile(string bucketName, string localPath, string objectName)
{
var storage = StorageClient.Create(_credential);
using var fileStream = File.OpenRead(localPath);
storage.UploadObject(bucketName, objectName, null, fileStream);
Console.WriteLine($"Uploaded {objectName}.");
}
You get the credentials JSON file from the Google Cloud portal, where you create a bucket under your project.
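A hypothetical call (the bucket and paths are placeholders):
UploadFile("my-bucket", @"C:\temp\report.pdf", "report.pdf");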
Simple, with auth:
private void SaveFileToGoogleStorage(string path, string? fileName, string ext)
{
var filePath = Path.Combine(path, fileName + ext);
var gcCredentialsPath = Path.Combine(Environment.CurrentDirectory, "gc_sa_key.json");
Environment.SetEnvironmentVariable("GOOGLE_APPLICATION_CREDENTIALS", gcCredentialsPath);
var gcsStorage = StorageClient.Create();
using var f = File.OpenRead(filePath);
var objectName = Path.GetFileName(filePath);
gcsStorage.UploadObject(_bucketName, objectName, null, f);
Console.WriteLine($"Uploaded {objectName}.");
}
I'm using the Amazon C# SDK and trying to upload a file, but by default it has restricted permissions. I would like to make it publicly available, but I can't seem to find out how to do it as part of the upload.
My bucket is public, but when I upload a new file using the code below, the file I upload is not public.
Has anyone had to do this before?
public class S3Uploader
{
private string awsAccessKeyId;
private string awsSecretAccessKey;
private string bucketName;
private Amazon.S3.Transfer.TransferUtility transferUtility;
public S3Uploader(string bucketName)
{
this.bucketName = bucketName;
this.transferUtility = new Amazon.S3.Transfer.TransferUtility("XXXXXXXXXXXXXXXXXX", "XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX");
}
public void UploadFile(string filePath, string toPath)
{
AsyncCallback callback = new AsyncCallback(uploadComplete);
transferUtility.BeginUpload(filePath, bucketName, toPath, callback, null);
}
private void uploadComplete(IAsyncResult result)
{
var x = result;
}
}
The solution by Tahbaza is correct, but it doesn't work in the later versions of AWS SDK (2.1+)...
Consider using the following for 2.1+ versions of the SDK:
private void UploadFileToS3(string filePath)
{
var awsAccessKey = ConfigurationManager.AppSettings["AWSAccessKey"];
var awsSecretKey = ConfigurationManager.AppSettings["AWSSecretKey"];
var existingBucketName = ConfigurationManager.AppSettings["AWSBucketName"];
var client = Amazon.AWSClientFactory.CreateAmazonS3Client(awsAccessKey, awsSecretKey,RegionEndpoint.USEast1);
var uploadRequest = new TransferUtilityUploadRequest
{
FilePath = filePath,
BucketName = existingBucketName,
CannedACL = S3CannedACL.PublicRead
};
var fileTransferUtility = new TransferUtility(client);
fileTransferUtility.Upload(uploadRequest);
}
Found it; you need to use a TransferUtilityUploadRequest:
public class S3Uploader
{
private string awsAccessKeyId;
private string awsSecretAccessKey;
private string bucketName;
private Amazon.S3.Transfer.TransferUtility transferUtility;
public S3Uploader(string bucketName)
{
this.bucketName = bucketName;
this.transferUtility = new Amazon.S3.Transfer.TransferUtility("XXXXXXXXXXXXXXXXXX", "XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX");
}
public void UploadFile(string filePath, string toPath)
{
AsyncCallback callback = new AsyncCallback(uploadComplete);
var uploadRequest = new TransferUtilityUploadRequest();
uploadRequest.FilePath = filePath;
uploadRequest.BucketName = "my_s3_bucket";
uploadRequest.Key = toPath;
uploadRequest.AddHeader("x-amz-acl", "public-read");
transferUtility.BeginUpload(uploadRequest, callback, null);
}
private void uploadComplete(IAsyncResult result)
{
var x = result;
}
}
I have done this before, and in C#, but not with your library (so this may be of limited help).
The idea is to send an ACL header along with the file.
This is one way to do it that should point you in the right direction.
Here's also a link to some relevant AWS docs.
private const string AWS_ACL_HEADER = "x-amz-acl";
private static string ToACLString(S3ACLType acl) {
switch (acl) {
case S3ACLType.AuthenticatedRead:
return "authenticated-read";
case S3ACLType.BucketOwnerFullControl:
return "bucket-owner-full-control";
case S3ACLType.BucketOwnerRead:
return "bucket-owner-read";
case S3ACLType.Private:
return "private";
case S3ACLType.PublicRead:
return "public-read";
case S3ACLType.PublicReadWrite:
return "public-read-write";
default: return "";
}
}
public void Put(string bucketName, string id, byte[] bytes, string contentType, S3ACLType acl) {
string uri = String.Format("https://{0}/{1}", BASE_SERVICE_URL, bucketName.ToLower());
DreamMessage msg = DreamMessage.Ok(MimeType.BINARY, bytes);
msg.Headers[DreamHeaders.CONTENT_TYPE] = contentType;
msg.Headers[DreamHeaders.EXPECT] = "100-continue";
msg.Headers[AWS_ACL_HEADER] = ToACLString(acl);
Plug s3Client = Plug.New(uri).WithPreHandler(S3AuthenticationHeader);
s3Client.At(id).Put(msg);
}
Consider the CannedACL approach, shown in the following snippet:
string filePath = @"Path\Desert.jpg";
Stream input = new FileStream(filePath, FileMode.Open);
PutObjectRequest request2 = new PutObjectRequest();
request2.WithMetaData("title", "the title")
.WithBucketName(bucketName)
.WithCannedACL(S3CannedACL.PublicRead)
.WithKey(keyName)
.WithInputStream(input);
You can use the property CannedACL of Amazon.S3.Model.PutObjectRequest to upload files to S3 as shown below,
var putObjectRequest = new Amazon.S3.Model.PutObjectRequest
{
BucketName = bucketName,
Key = key,
CannedACL = S3CannedACL.PublicReadWrite
};
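As a sketch, the request can then be filled in and sent like this (the file path and region are placeholders, and a plain AmazonS3Client is assumed):

putObjectRequest.FilePath = @"C:\path\to\file.jpg";
using (var client = new AmazonS3Client(RegionEndpoint.USEast1))
{
    client.PutObject(putObjectRequest);
    // or: await client.PutObjectAsync(putObjectRequest); on async-only targets
}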