I've been struggling with this problem for hours; I hope you can help me. Somehow my compiler will only accept InitiateMultipartUploadAsync, not the regular InitiateMultipartUpload, and it absolutely requires callback parameters to compile, but I can't figure out what callback function to pass it.
private static async Task UploadObjectAsync()
{
// Create list to store upload part responses.
List<UploadPartResponse> uploadResponses = new List<UploadPartResponse>();
// Setup information required to initiate the multipart upload.
InitiateMultipartUploadRequest initiateRequest = new InitiateMultipartUploadRequest
{
BucketName = "XXXXXXXXX",
Key = "videos/multipart"
};
// Initiate the upload.
InitiateMultipartUploadResponse initResponse =
await S3Client.InitiateMultipartUploadAsync(initiateRequest);
// Upload parts.
long contentLength = new FileInfo("videotest").Length;
long partSize = 5 * (long)Math.Pow(2, 20); // 5 MB
try
{
Console.WriteLine("Uploading parts");
long filePosition = 0;
for (int i = 1; filePosition < contentLength; i++)
{
UploadPartRequest uploadRequest = new UploadPartRequest
{
BucketName = "XXXXXXXX",
Key = "videos/multipart",
UploadId = initResponse.UploadId,
PartNumber = i,
PartSize = partSize,
FilePosition = filePosition,
FilePath = "videotest"
};
// Track upload progress.
uploadRequest.StreamTransferProgress +=
new EventHandler<StreamTransferProgressArgs>(UploadPartProgressEventCallback);
// Upload a part and add the response to our list.
uploadResponses.Add(await S3Client.UploadPartAsync(uploadRequest));
filePosition += partSize;
}
// Setup to complete the upload.
CompleteMultipartUploadRequest completeRequest = new CompleteMultipartUploadRequest
{
BucketName = "XXXXXXXXXXX",
Key = "videos/multipart",
UploadId = initResponse.UploadId
};
completeRequest.AddPartETags(uploadResponses);
// Complete the upload.
CompleteMultipartUploadResponse completeUploadResponse =
await S3Client.CompleteMultipartUploadAsync(completeRequest);
}
catch (Exception exception)
{
Console.WriteLine("An exception was thrown: {0}", exception.Message);
// Abort the upload.
AbortMultipartUploadRequest abortMPURequest = new AbortMultipartUploadRequest
{
BucketName = "XXXXXXXXX",
Key = "videos/multipart",
UploadId = initResponse.UploadId
};
await S3Client.AbortMultipartUploadAsync(abortMPURequest);
}
}
public static void UploadPartProgressEventCallback(object sender, StreamTransferProgressArgs e)
{
// Process event.
Console.WriteLine("{0}/{1}", e.TransferredBytes, e.TotalBytes);
}
This code was inspired by the official AWS example: https://docs.aws.amazon.com/AmazonS3/latest/dev/LLuploadFileDotNet.html so I really don't understand why this doesn't work! Currently, with the above code, Visual Studio is telling me that InitiateMultipartUploadAsync, UploadPartAsync, CompleteMultipartUploadAsync and AbortMultipartUploadAsync all require a callback function, but 1) the examples say the callback is optional, and 2) none of the callbacks I tried works. Thanks in advance.
In case someone else has the same bug: I changed the code completely and used this solution: https://github.com/aws/aws-sdk-net/issues/562
You can initiate it like this:
s3Client = new AmazonS3Client(BUCKETREGION);
var initiateRequest = new InitiateMultipartUploadRequest { BucketName = BUCKET, Key = KEY };
await s3Client.InitiateMultipartUploadAsync(initiateRequest).ContinueWith(response =>
{
if (!string.IsNullOrEmpty(response.Result.UploadId))
{
uploadId = response.Result.UploadId;
}
else
{
Debug.WriteLine(response.Exception);
}
});
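As a side note, here is a minimal sketch of the same call awaited directly rather than through ContinueWith, so a faulted task surfaces as a catchable exception (BUCKETREGION, BUCKET, KEY and uploadId are the same placeholders as in the snippet above):
var s3Client = new AmazonS3Client(BUCKETREGION);
try
{
    // Await directly: no need to inspect Result/Exception on a continuation.
    var initResponse = await s3Client.InitiateMultipartUploadAsync(
        new InitiateMultipartUploadRequest { BucketName = BUCKET, Key = KEY });
    uploadId = initResponse.UploadId;
}
catch (AmazonS3Exception ex)
{
    Debug.WriteLine(ex);
}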
I am using the following code to upload a video to Vimeo. I want to use the file name as the video title, but currently I get an untitled video with this code.
Please guide me on how to add a Title/Name.
public async Task<IActionResult> UploadVideos([FromForm] IFormFile videoFile)
{
string tagName = "tagName";
//var files = Request.Form.Files;
//IFormFile file = files[0];
string uploadStatus = "";
var getVideo = new Video();
try
{
if (videoFile != null)
{
ServicePointManager.Expect100Continue = true;
ServicePointManager.SecurityProtocol = SecurityProtocolType.Tls12;
VimeoClient vimeoClient = new VimeoClient(accessToken);
var authcheck = await vimeoClient.GetAccountInformationAsync();
if (authcheck.Name != null)
{
IUploadRequest uploadRequest = new UploadRequest();
//Stream stream = file.OpenReadStream();
//using(var memoryStream = new MemoryStream())
//{
// stream.CopyTo(memoryStream);
// memoryStream.ToArray();
//}
BinaryContent binaryContent = new BinaryContent(videoFile.OpenReadStream(), videoFile.ContentType); // the parameter is videoFile; there is no obj in scope
int chunkSize = 0;
int contentLength = Convert.ToInt32(videoFile.Length);
int temp1 = contentLength / 1024;
binaryContent.OriginalFileName = "Test Name";
if (temp1 > 1)
{
chunkSize = temp1 / 1024;
if (chunkSize == 0)
{
chunkSize = 1048576;
}
else
{
if (chunkSize > 10)
{
chunkSize = chunkSize / 10;
}
chunkSize = chunkSize * 1048576;
}
}
else
{
chunkSize = 1048576;
}
var checkChunk = chunkSize;
var status = "uploading";
uploadRequest = await vimeoClient.UploadEntireFileAsync(binaryContent, chunkSize, null);
var _tag = tagName;
var tagVideo = await vimeoClient.AddVideoTagAsync(uploadRequest.ClipId.GetValueOrDefault(), _tag);
while (status != "available")
{
// Wait between polls so we don't hammer the API while Vimeo transcodes.
await Task.Delay(TimeSpan.FromSeconds(2));
getVideo = await vimeoClient.GetVideoAsync(long.Parse(uploadRequest.ClipId.Value.ToString()));
status = getVideo.Status;
}
uploadStatus = String.Concat("file Uploaded ", getVideo.Files[0].LinkSecure);
}
}
return Ok(new { status = uploadStatus, video = getVideo });
}
catch (Exception ex)
{
return BadRequest(ex.ToString());
}
}
I tried to set the title with binaryContent.OriginalFileName, but it still results in an untitled video. Please guide me by providing the modification required in the API call.
I referred to the Vimeo documentation and found that you can either use the pull approach or set metadata on the video to set the title.
Video uploads on Vimeo include metadata such as the name of the video and the video's privacy settings. Besides being useful as titles and text descriptors, metadata are also a key component in strategies for search engine optimization.
You can specify values for a video's metadata fields in the body of the initial POST request of an upload, like this:
{
"upload": {
"approach": "tus",
"size": 800000000
},
"name": "My Video",
"privacy": { "view": "nobody" }
}
If you don’t specify a name for your video, we automatically give it the name Untitled, unless you upload it with the pull approach. In that case, we use the file name of the video.
For more detailed information, please refer to the Setting video metadata section of that document.
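If you stay with the VimeoDotNet client used in the question, a hedged sketch is to update the video's metadata right after the upload returns a clip ID; this assumes your library version exposes UpdateVideoMetadataAsync and a VideoUpdateMetadata type (check your version before relying on it):
// Assumed VimeoDotNet API: set the title once UploadEntireFileAsync has returned a ClipId.
var clipId = uploadRequest.ClipId.GetValueOrDefault();
await vimeoClient.UpdateVideoMetadataAsync(clipId, new VideoUpdateMetadata
{
    Name = Path.GetFileName(videoFile.FileName) // use the uploaded file's name as the title
});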
Trying to make use of the AndroidPublisherService from Play Developer API Client.
I can list active tracks and the releases in those tracks, but when I try to upload a new build there seems to be no way of attaching the authentication that was already used to read that data.
I've authenticated using var googleCredentials = GoogleCredential.FromStream(keyDataStream) .CreateWithUser(serviceUsername); where serviceUsername is the email for my service account.
private static void Execute(string packageName, string aabfile, string credfile, string serviceUsername)
{
var credentialsFilename = credfile;
if (string.IsNullOrWhiteSpace(credentialsFilename))
{
// Check env. var
credentialsFilename =
Environment.GetEnvironmentVariable("GOOGLE_APPLICATION_CREDENTIALS",
EnvironmentVariableTarget.Process);
}
Console.WriteLine($"Using credentials {credfile} with package {packageName} for aab file {aabfile}");
var keyDataStream = File.OpenRead(credentialsFilename);
var googleCredentials = GoogleCredential.FromStream(keyDataStream)
.CreateWithUser(serviceUsername);
var credentials = googleCredentials.UnderlyingCredential as ServiceAccountCredential;
var service = new AndroidPublisherService();
var edit = service.Edits.Insert(new AppEdit { ExpiryTimeSeconds = "3600" }, packageName);
edit.Credential = credentials;
var activeEditSession = edit.Execute();
Console.WriteLine($"Edits started with id {activeEditSession.Id}");
var tracksList = service.Edits.Tracks.List(packageName, activeEditSession.Id);
tracksList.Credential = credentials;
var tracksResponse = tracksList.Execute();
foreach (var track in tracksResponse.Tracks)
{
Console.WriteLine($"Track: {track.TrackValue}");
Console.WriteLine("Releases: ");
foreach (var rel in track.Releases)
Console.WriteLine($"{rel.Name} version: {rel.VersionCodes.FirstOrDefault()} - Status: {rel.Status}");
}
using var fileStream = File.OpenRead(aabfile);
var upload = service.Edits.Bundles.Upload(packageName, activeEditSession.Id, fileStream, "application/octet-stream");
var uploadProgress = upload.Upload();
if (uploadProgress == null || uploadProgress.Exception != null)
{
Console.WriteLine($"Failed to upload. Error: {uploadProgress?.Exception}");
return;
}
Console.WriteLine($"Upload {uploadProgress.Status}");
var tracksUpdate = service.Edits.Tracks.Update(new Track
{
Releases = new List<TrackRelease>(new[]
{
new TrackRelease
{
Name = "Roswell - Grenis Dev Test",
Status = "completed",
VersionCodes = new List<long?>(new[] {(long?) upload?.ResponseBody?.VersionCode})
}
})
}, packageName, activeEditSession.Id, "internal");
tracksUpdate.Credential = credentials;
var trackResult = tracksUpdate.Execute();
Console.WriteLine($"Track {trackResult?.TrackValue}");
var commitResult = service.Edits.Commit(packageName, activeEditSession.Id);
Console.WriteLine($"{commitResult.EditId} has been committed");
}
And as the code points out, all action objects such as tracksList.Credential = credentials; can be given the credentials generated from the service account.
But the actual upload action, var upload = service.Edits.Bundles.Upload(packageName, activeEditSession.Id, fileStream, "application/octet-stream"), does not expose a .Credential property, and it always fails with:
The service androidpublisher has thrown an exception: Google.GoogleApiException: Google.Apis.Requests.RequestError
Request is missing required authentication credential. Expected OAuth 2 access token, login cookie or other valid authentication credential. See https://developers.google.com/identity/sign-in/web/devconsole-project. [401]
Errors [
Message[Login Required.] Location[Authorization - header] Reason[required] Domain[global]
]
at Google.Apis.Upload.ResumableUpload`1.InitiateSessionAsync(CancellationToken cancellationToken)
at Google.Apis.Upload.ResumableUpload.UploadAsync(CancellationToken cancellationToken)
So, how would I go about providing the actual Upload action with the given credentials here?
Managed to figure this out during the day: I was missing a call to CreateScoped() when creating the GoogleCredential object, as well as a call to InitiateSessionAsync() on the upload object.
var googleCredentials = GoogleCredential.FromStream(keyDataStream)
.CreateWithUser(serviceUsername)
.CreateScoped(AndroidPublisherService.Scope.Androidpublisher);
Once that was done, I could then get a valid OAuth token by calling:
var credentials = googleCredentials.UnderlyingCredential as ServiceAccountCredential;
var oauthToken = credentials?.GetAccessTokenForRequestAsync(AndroidPublisherService.Scope.Androidpublisher).Result;
And I can now use that oauth token in the upload request:
upload.OauthToken = oauthToken;
_ = await upload.InitiateSessionAsync();
var uploadProgress = await upload.UploadAsync();
if (uploadProgress == null || uploadProgress.Exception != null)
{
Console.WriteLine($"Failed to upload. Error: {uploadProgress?.Exception}");
return;
}
The full code example for successfully uploading a new .aab file to the Google Play Store internal test track thus looks something like this:
private async Task UploadGooglePlayRelease(string fileToUpload, string changeLogFile, string serviceUsername, string packageName)
{
var serviceAccountFile = ResolveServiceAccountCertificateInfoFile();
if (!serviceAccountFile.Exists)
throw new ApplicationException($"Failed to find the service account certificate file. {serviceAccountFile.FullName}");
var keyDataStream = File.OpenRead(serviceAccountFile.FullName);
var googleCredentials = GoogleCredential.FromStream(keyDataStream)
.CreateWithUser(serviceUsername)
.CreateScoped(AndroidPublisherService.Scope.Androidpublisher);
var credentials = googleCredentials.UnderlyingCredential as ServiceAccountCredential;
var oauthToken = credentials == null ? null : await credentials.GetAccessTokenForRequestAsync(AndroidPublisherService.Scope.Androidpublisher); // await instead of .Result to avoid blocking inside an async method
var service = new AndroidPublisherService();
var edit = service.Edits.Insert(new AppEdit { ExpiryTimeSeconds = "3600" }, packageName);
edit.Credential = credentials;
var activeEditSession = await edit.ExecuteAsync();
_logger.LogInformation($"Edits started with id {activeEditSession.Id}");
var tracksList = service.Edits.Tracks.List(packageName, activeEditSession.Id);
tracksList.Credential = credentials;
var tracksResponse = await tracksList.ExecuteAsync();
foreach (var track in tracksResponse.Tracks)
{
_logger.LogInformation($"Track: {track.TrackValue}");
_logger.LogInformation("Releases: ");
foreach (var rel in track.Releases)
_logger.LogInformation($"{rel.Name} version: {rel.VersionCodes.FirstOrDefault()} - Status: {rel.Status}");
}
using var fileStream = File.OpenRead(fileToUpload); // dispose the stream once the upload completes
var upload = service.Edits.Bundles.Upload(packageName, activeEditSession.Id, fileStream, "application/octet-stream");
upload.OauthToken = oauthToken;
_ = await upload.InitiateSessionAsync();
var uploadProgress = await upload.UploadAsync();
if (uploadProgress == null || uploadProgress.Exception != null)
{
Console.WriteLine($"Failed to upload. Error: {uploadProgress?.Exception}");
return;
}
_logger.LogInformation($"Upload {uploadProgress.Status}");
var releaseNotes = await File.ReadAllTextAsync(changeLogFile);
var tracksUpdate = service.Edits.Tracks.Update(new Track
{
Releases = new List<TrackRelease>(new[]
{
new TrackRelease
{
Name = $"{upload?.ResponseBody?.VersionCode}",
Status = "completed",
InAppUpdatePriority = 5,
CountryTargeting = new CountryTargeting { IncludeRestOfWorld = true },
ReleaseNotes = new List<LocalizedText>(new []{ new LocalizedText { Language = "en-US", Text = releaseNotes } }),
VersionCodes = new List<long?>(new[] {(long?) upload?.ResponseBody?.VersionCode})
}
})
}, packageName, activeEditSession.Id, "internal");
tracksUpdate.Credential = credentials;
var trackResult = await tracksUpdate.ExecuteAsync();
_logger.LogInformation($"Track {trackResult?.TrackValue}");
var commitResult = service.Edits.Commit(packageName, activeEditSession.Id);
commitResult.Credential = credentials;
await commitResult.ExecuteAsync();
_logger.LogInformation($"{commitResult.EditId} has been committed");
}
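For completeness, a hypothetical call site for the method above (all argument values are placeholders):
await UploadGooglePlayRelease(
    fileToUpload: "app-release.aab",
    changeLogFile: "changelog.txt",
    serviceUsername: "ci-bot@my-project.iam.gserviceaccount.com",
    packageName: "com.example.app");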
I am trying to delete all the files inside a folder whose name is basically a date.
Suppose there are 100 files under the folder "08-10-2015"; instead of sending all those 100 file names, I want to send just the folder name.
I am trying the code below and it is not working for me.
DeleteObjectsRequest multiObjectDeleteRequest = new DeleteObjectsRequest();
multiObjectDeleteRequest.BucketName = bucketName;
multiObjectDeleteRequest.AddKey(keyName + "/" + folderName + "/");
AmazonS3Config S3Config = new AmazonS3Config()
{
ServiceURL = string.Format(servicehost)
};
using (IAmazonS3 client = Amazon.AWSClientFactory.CreateAmazonS3Client(accesskey, secretkey, S3Config))
{
try
{
DeleteObjectsResponse response = client.DeleteObjects(multiObjectDeleteRequest);
Console.WriteLine("Successfully deleted all the {0} items", response.DeletedObjects.Count);
}
catch (DeleteObjectsException e)
{
// Process exception.
}
}
I think you can delete the entire folder using the following code:
AmazonS3Config cfg = new AmazonS3Config();
cfg.RegionEndpoint = Amazon.RegionEndpoint.EUCentral1;
string bucketName = "your bucket name";
AmazonS3Client s3Client = new AmazonS3Client("your access key", "your secret key", cfg);
S3DirectoryInfo directoryToDelete = new S3DirectoryInfo(s3Client, bucketName, "your folder name or full folder key");
directoryToDelete.Delete(true); // true deletes recursively, including everything inside the folder
I am using the Amazon AWSSDK.Core and AWSSDK.S3 packages, version 3.1.0.0, for .NET 3.5.
I hope it helps.
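One hedged note on top of this answer: in those 3.x SDK versions the directory helpers live in the Amazon.S3.IO namespace, so you may need the following usings. As a later answer here points out, these helpers were not kept in future SDK majors, so check availability against your SDK version.
using Amazon.S3;
using Amazon.S3.IO; // S3DirectoryInfo / S3FileInfo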
You have to:
List all objects in the folder
Retrieve key for each object
Add this key to a multiple Delete Object Request
Make the request to delete all objects
AmazonS3Config S3Config = new AmazonS3Config()
{
ServiceURL = "s3.amazonaws.com",
CommunicationProtocol = Amazon.S3.Model.Protocol.HTTP,
};
const string AWS_ACCESS_KEY = "xxxxxxxxxxxxxxxx";
const string AWS_SECRET_KEY = "yyyyyyyyyyyyyyyy";
AmazonS3Client client = new AmazonS3Client(AWS_ACCESS_KEY, AWS_SECRET_KEY, S3Config);
DeleteObjectsRequest request2 = new DeleteObjectsRequest();
ListObjectsRequest request = new ListObjectsRequest
{
BucketName = "yourbucketname",
Prefix = "yourprefix"
};
ListObjectsResponse response = await client.ListObjectsAsync(request);
// Process response.
foreach (S3Object entry in response.S3Objects)
{
request2.AddKey(entry.Key);
}
request2.BucketName = "yourbucketname";
DeleteObjectsResponse response2 = await client.DeleteObjectsAsync(request2);
I'm not sure why they didn't keep this method in future SDKs, but, for those interested, here is the implementation of the S3DirectoryInfo.Delete method:
ListObjectsRequest listObjectsRequest = new ListObjectsRequest
{
BucketName = bucket,
Prefix = directoryPrefix
};
DeleteObjectsRequest deleteObjectsRequest = new DeleteObjectsRequest
{
BucketName = bucket
};
ListObjectsResponse listObjectsResponse = null;
do
{
listObjectsResponse = s3Client.ListObjects(listObjectsRequest);
foreach (S3Object item in listObjectsResponse.S3Objects.OrderBy((S3Object x) => x.Key))
{
deleteObjectsRequest.AddKey(item.Key);
if (deleteObjectsRequest.Objects.Count == 1000)
{
s3Client.DeleteObjects(deleteObjectsRequest);
deleteObjectsRequest.Objects.Clear();
}
listObjectsRequest.Marker = item.Key;
}
}
while (listObjectsResponse.IsTruncated);
if (deleteObjectsRequest.Objects.Count > 0)
{
s3Client.DeleteObjects(deleteObjectsRequest);
}
Can someone please provide an example of how to use Google.Apis.Storage.v1 for uploading files to google cloud storage in c#?
I found that this basic operation is not as straightforward as you might expect. Google's documentation about its Storage API lacks information about using it in C# (or any other .NET language). Searching for 'how to upload file to google cloud storage in c#' didn't exactly help me, so here is my working solution, with some comments:
Preparation:
You need to create an OAuth2 account in your Google Developers Console - go to Project/APIs & auth/Credentials.
Copy the Client ID & Client Secret into your code. You will also need your project name.
Code (it assumes that you've added Google.Apis.Storage.v1 via NuGet):
First, you need to authorize your requests:
var clientSecrets = new ClientSecrets();
clientSecrets.ClientId = clientId;
clientSecrets.ClientSecret = clientSecret;
//there are different scopes, which you can find here https://cloud.google.com/storage/docs/authentication
var scopes = new[] {@"https://www.googleapis.com/auth/devstorage.full_control"};
var cts = new CancellationTokenSource();
var userCredential = await GoogleWebAuthorizationBroker.AuthorizeAsync(clientSecrets,scopes, "yourGoogle#email", cts.Token);
Sometimes you might also want to refresh the authorization token via:
await userCredential.RefreshTokenAsync(cts.Token);
You also need to create Storage Service:
var service = new Google.Apis.Storage.v1.StorageService();
Now you can make requests to Google Storage API.
Let's start with creating a new bucket:
var newBucket = new Google.Apis.Storage.v1.Data.Bucket()
{
Name = "your-bucket-name-1"
};
var newBucketQuery = service.Buckets.Insert(newBucket, projectName);
newBucketQuery.OauthToken = userCredential.Token.AccessToken; // userCredential was awaited, so there is no .Result
//you probably want to wrap this into try..catch block
newBucketQuery.Execute();
And it's done. Now you can send a request to get a list of all of your buckets:
var bucketsQuery = service.Buckets.List(projectName);
bucketsQuery.OauthToken = userCredential.Token.AccessToken;
var buckets = bucketsQuery.Execute();
The last part is uploading a new file:
//enter bucket name to which you want to upload file
var bucketToUpload = buckets.Items.FirstOrDefault().Name;
var newObject = new Google.Apis.Storage.v1.Data.Object() // fully qualified to avoid a clash with System.Object
{
Bucket = bucketToUpload,
Name = "some-file-"+new Random().Next(1,666)
};
FileStream fileStream = null;
try
{
var dir = Directory.GetCurrentDirectory();
var path = Path.Combine(dir, "test.png");
fileStream = new FileStream(path, FileMode.Open);
var uploadRequest = new Google.Apis.Storage.v1.ObjectsResource.InsertMediaUpload(service, newObject,
bucketToUpload,fileStream,"image/png");
uploadRequest.OauthToken = userCredential.Token.AccessToken;
await uploadRequest.UploadAsync();
}
catch (Exception ex)
{
Console.WriteLine(ex.Message);
}
finally
{
if (fileStream != null)
{
fileStream.Dispose();
}
}
And bam! The new file will be visible in your Google Developers Console inside the selected bucket.
You can use the Google Cloud APIs without the SDK in the following way:
The api-key.json file is required.
Install the package Google.Apis.Auth.OAuth2 in order to authorize the HTTP web request.
You can set the default configuration for your application as shown below.
I did the same using a .NET Core web API; the details are given below:
Url details:
"GoogleCloudStorageBaseUrl": "https://www.googleapis.com/upload/storage/v1/b/",
"GoogleSpeechBaseUrl": "https://speech.googleapis.com/v1/operations/",
"GoogleLongRunningRecognizeBaseUrl": "https://speech.googleapis.com/v1/speech:longrunningrecognize",
"GoogleCloudScope": "https://www.googleapis.com/auth/cloud-platform",
public void GetConfiguration()
{
// Set global configuration
bucketName = _configuration.GetValue<string>("BucketName");
googleCloudStorageBaseUrl = _configuration.GetValue<string>("GoogleCloudStorageBaseUrl");
googleSpeechBaseUrl = _configuration.GetValue<string>("GoogleSpeechBaseUrl");
googleLongRunningRecognizeBaseUrl = _configuration.GetValue<string>("GoogleLongRunningRecognizeBaseUrl");
// Set google cloud credentials
string googleApplicationCredentialsPath = _configuration.GetValue<string>("GoogleCloudCredentialPath");
using (Stream stream = new FileStream(googleApplicationCredentialsPath, FileMode.Open, FileAccess.Read))
googleCredential = GoogleCredential.FromStream(stream).CreateScoped(_configuration.GetValue<string>("GoogleCloudScope"));
}
Get the OAuth token:
public string GetOAuthToken()
{
return googleCredential.UnderlyingCredential.GetAccessTokenForRequestAsync("https://accounts.google.com/o/oauth2/v2/auth", CancellationToken.None).Result;
}
To upload a file to the cloud bucket:
public async Task<string> UploadMediaToCloud(string filePath, string objectName = null)
{
string bearerToken = GetOAuthToken();
byte[] fileBytes = File.ReadAllBytes(filePath);
objectName = objectName ?? Path.GetFileName(filePath);
var baseUrl = new Uri(googleCloudStorageBaseUrl + bucketName + "/o?uploadType=media&name=" + objectName);
using (WebClient client = new WebClient())
{
client.Headers.Add(HttpRequestHeader.Authorization, "Bearer " + bearerToken);
client.Headers.Add(HttpRequestHeader.ContentType, "application/octet-stream");
byte[] response = await Task.Run(() => client.UploadData(baseUrl, "POST", fileBytes));
string responseInString = Encoding.UTF8.GetString(response);
return responseInString;
}
}
In order to perform any action against the cloud API, you just need to make an HTTP GET/POST request as per the requirement.
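For instance, here is a hedged sketch of the same media upload using HttpClient instead of WebClient; the method name is hypothetical, and it reuses the googleCloudStorageBaseUrl, bucketName and GetOAuthToken() members shown above (requires using System.Net.Http; and using System.Net.Http.Headers;):
public async Task<string> UploadMediaWithHttpClient(string filePath, string objectName = null)
{
    objectName = objectName ?? Path.GetFileName(filePath);
    var requestUri = googleCloudStorageBaseUrl + bucketName
        + "/o?uploadType=media&name=" + Uri.EscapeDataString(objectName);
    using (var http = new HttpClient())
    using (var content = new ByteArrayContent(File.ReadAllBytes(filePath)))
    {
        // Same Bearer token as the WebClient version above.
        http.DefaultRequestHeaders.Authorization =
            new AuthenticationHeaderValue("Bearer", GetOAuthToken());
        content.Headers.ContentType = new MediaTypeHeaderValue("application/octet-stream");
        var response = await http.PostAsync(requestUri, content);
        response.EnsureSuccessStatusCode();
        return await response.Content.ReadAsStringAsync();
    }
}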
Thanks
This is for Google.Cloud.Storage.V1 (not Google.Apis.Storage.v1), but it appears to be a bit simpler to perform an upload now. I started with the client libraries' "Getting Started" instructions to create a service account and bucket, then experimented to find out how to upload an image.
The process I followed was:
Sign up for Google Cloud free trial
Create a new project in Google Cloud (remember the project name\ID for later)
Create a Project Owner service account - this will result in a json file being downloaded that contains the service account credentials. Remember where you put that file.
The getting started docs get you to add the path to the JSON credentials file into an environment variable called GOOGLE_APPLICATION_CREDENTIALS - I couldn't get this to work through the provided instructions. Turns out it is not required, as you can just read the JSON file into a string and pass it to the client constructor.
I created an empty WPF project as a starting point, and a single ViewModel to house the application logic.
Install the Google.Cloud.Storage.V1 nuget package and it should pull in all the dependencies it needs.
Onto the code.
MainWindow.xaml
<StackPanel>
<Button
Margin="50"
Height="50"
Content="BEGIN UPLOAD"
Click="OnButtonClick" />
<ContentControl
Content="{Binding Path=ProgressBar}" />
</StackPanel>
MainWindow.xaml.cs
public partial class MainWindow
{
readonly ViewModel _viewModel;
public MainWindow()
{
_viewModel = new ViewModel(Dispatcher);
DataContext = _viewModel;
InitializeComponent();
}
void OnButtonClick(object sender, RoutedEventArgs args)
{
_viewModel.UploadAsync().ConfigureAwait(false);
}
}
ViewModel.cs
public class ViewModel
{
readonly Dispatcher _dispatcher;
public ViewModel(Dispatcher dispatcher)
{
_dispatcher = dispatcher;
ProgressBar = new ProgressBar {Height=30};
}
public async Task UploadAsync()
{
// Google Cloud Platform project ID.
const string projectId = "project-id-goes-here";
// The name for the new bucket.
const string bucketName = projectId + "-test-bucket";
// Path to the file to upload
const string filePath = @"C:\path\to\image.jpg";
var newObject = new Google.Apis.Storage.v1.Data.Object
{
Bucket = bucketName,
Name = System.IO.Path.GetFileNameWithoutExtension(filePath),
ContentType = "image/jpeg"
};
// read the JSON credential file saved when you created the service account
var credential = Google.Apis.Auth.OAuth2.GoogleCredential.FromJson(System.IO.File.ReadAllText(
#"c:\path\to\service-account-credentials.json"));
// Instantiates a client.
using (var storageClient = Google.Cloud.Storage.V1.StorageClient.Create(credential))
{
try
{
// Creates the new bucket. Only required the first time.
// You can also create buckets through the GCP cloud console web interface
storageClient.CreateBucket(projectId, bucketName);
System.Windows.MessageBox.Show($"Bucket {bucketName} created.");
// Open the image file filestream
using (var fileStream = new System.IO.FileStream(filePath, System.IO.FileMode.Open))
{
ProgressBar.Maximum = fileStream.Length;
// set minimum chunksize just to see progress updating
var uploadObjectOptions = new Google.Cloud.Storage.V1.UploadObjectOptions
{
ChunkSize = Google.Cloud.Storage.V1.UploadObjectOptions.MinimumChunkSize
};
// Hook up the progress callback
var progressReporter = new Progress<Google.Apis.Upload.IUploadProgress>(OnUploadProgress);
await storageClient.UploadObjectAsync(
newObject,
fileStream,
uploadObjectOptions,
progress: progressReporter)
.ConfigureAwait(false);
}
}
catch (Google.GoogleApiException e)
when (e.Error.Code == 409)
{
// When creating the bucket - The bucket already exists. That's fine.
System.Windows.MessageBox.Show(e.Error.Message);
}
catch (Exception e)
{
// other exception
System.Windows.MessageBox.Show(e.Message);
}
}
}
// Called when progress updates
void OnUploadProgress(Google.Apis.Upload.IUploadProgress progress)
{
switch (progress.Status)
{
case Google.Apis.Upload.UploadStatus.Starting:
ProgressBar.Minimum = 0;
ProgressBar.Value = 0;
break;
case Google.Apis.Upload.UploadStatus.Completed:
ProgressBar.Value = ProgressBar.Maximum;
System.Windows.MessageBox.Show("Upload completed");
break;
case Google.Apis.Upload.UploadStatus.Uploading:
UpdateProgressBar(progress.BytesSent);
break;
case Google.Apis.Upload.UploadStatus.Failed:
System.Windows.MessageBox.Show("Upload failed"
+ Environment.NewLine
+ progress.Exception);
break;
}
}
void UpdateProgressBar(long value)
{
_dispatcher.Invoke(() => { ProgressBar.Value = value; });
}
// probably better to expose progress value directly and bind to
// a ProgressBar in the XAML
public ProgressBar ProgressBar { get; }
}
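Following up on the closing comment, here is a minimal sketch of that alternative design (type and property names are illustrative): expose a numeric progress value from the view model and bind a XAML ProgressBar to it, e.g. <ProgressBar Value="{Binding Progress}" />, instead of holding a ProgressBar control in the view model.
public class UploadProgressViewModel : System.ComponentModel.INotifyPropertyChanged
{
    double _progress;
    public double Progress
    {
        get { return _progress; }
        set
        {
            _progress = value;
            // Raise change notification so the bound ProgressBar updates.
            PropertyChanged?.Invoke(this,
                new System.ComponentModel.PropertyChangedEventArgs(nameof(Progress)));
        }
    }
    public event System.ComponentModel.PropertyChangedEventHandler PropertyChanged;
}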
Uploading files to Google Cloud Storage in C# using the SDK:
The api-key.json file is required.
Install the packages Google.Cloud.Storage.V1 and Google.Apis.Auth.OAuth2.
The code to upload the file to the cloud is given below:
private string UploadFile(string localPath, string objectName = null)
{
string projectId = ((Google.Apis.Auth.OAuth2.ServiceAccountCredential)googleCredential.UnderlyingCredential).ProjectId;
try
{
// Creates the new bucket.
var objResult = storageClient.CreateBucket(projectId, bucketName);
if (!string.IsNullOrEmpty(objResult.Id))
{
// Upload file to google cloud server
using (var f = File.OpenRead(localPath))
{
objectName = objectName ?? Path.GetFileName(localPath);
var objFileUploadStatus1 = storageClient.UploadObject(bucketName, objectName, null, f);
}
}
}
catch (Google.GoogleApiException ex)
{
// Error code 409 means the bucket already exists; in that case, just upload the file into the bucket
if (ex.Error.Code == 409)
{
// Upload file to google cloud server
using (var f = File.OpenRead(localPath))
{
objectName = objectName ?? Path.GetFileName(localPath);
var objFileUploadStatus2 = storageClient.UploadObject(bucketName, objectName, null, f);
}
}
}
catch (Exception ex)
{
MessageBox.Show(ex.ToString());
}
return objectName;
}
To set the credentials
private bool SetStorageCredentials()
{
bool status = true;
try
{
if (File.Exists(credential_path))
{
Environment.SetEnvironmentVariable("GOOGLE_APPLICATION_CREDENTIALS", credential_path);
using (Stream objStream = new FileStream(Environment.GetEnvironmentVariable("GOOGLE_APPLICATION_CREDENTIALS"), FileMode.Open, FileAccess.Read))
googleCredential = GoogleCredential.FromStream(objStream);
// Instantiates a client.
storageClient = StorageClient.Create();
channel = new Grpc.Core.Channel(SpeechClient.DefaultEndpoint.Host, googleCredential.ToChannelCredentials());
}
else
{
DialogResult result = MessageBox.Show("File " + Path.GetFileName(credential_path) + " does not exist. Please provide the correct path.");
if (result == System.Windows.Forms.DialogResult.OK)
{
status = false;
}
}
}
catch (Exception ex)
{
MessageBox.Show(ex.ToString());
status = false;
}
return status;
}
I used the SDK in one of my Windows applications. You can use the same code according to your needs/requirements.
You'll be happy to know it still works in 2016...
I was googling all over using fancy keywords like "google gcp C# upload image", until I just plainly asked the question: "How do I upload an image to a Google bucket using C#"... and here I am. I removed the .Result on the user credential, and this was the final edit that worked for me.
// ******
static string bucketForImage = ConfigurationManager.AppSettings["testStorageName"];
static string projectName = ConfigurationManager.AppSettings["GCPProjectName"];
string gcpPath = Path.Combine(Server.MapPath("~/Images/Gallery/"), uniqueGcpName + ext);
var clientSecrets = new ClientSecrets();
clientSecrets.ClientId = ConfigurationManager.AppSettings["GCPClientID"];
clientSecrets.ClientSecret = ConfigurationManager.AppSettings["GCPClientSc"];
var scopes = new[] { @"https://www.googleapis.com/auth/devstorage.full_control" };
var cts = new CancellationTokenSource();
var userCredential = await GoogleWebAuthorizationBroker.AuthorizeAsync(clientSecrets, scopes, ConfigurationManager.AppSettings["GCPAccountEmail"], cts.Token);
var service = new Google.Apis.Storage.v1.StorageService();
var bucketToUpload = bucketForImage;
var newObject = new Google.Apis.Storage.v1.Data.Object()
{
Bucket = bucketToUpload,
Name = bkFileName
};
FileStream fileStream = null;
try
{
fileStream = new FileStream(gcpPath, FileMode.Open);
var uploadRequest = new Google.Apis.Storage.v1.ObjectsResource.InsertMediaUpload(service, newObject,
bucketToUpload, fileStream, "image/"+ ext);
uploadRequest.OauthToken = userCredential.Token.AccessToken;
await uploadRequest.UploadAsync();
}
catch (Exception ex)
{
Console.WriteLine(ex.Message);
}
finally
{
if (fileStream != null)
{
fileStream.Dispose();
}
}
// ******
Here is the link to their official C# example of ".NET Bookshelf App" using Google Cloud storage.
https://cloud.google.com/dotnet/docs/getting-started/using-cloud-storage
Source on github:
https://github.com/GoogleCloudPlatform/getting-started-dotnet/blob/master/aspnet/3-binary-data/Services/ImageUploader.cs
https://github.com/GoogleCloudPlatform/getting-started-dotnet/tree/master/aspnet/3-binary-data
Nuget
https://www.nuget.org/packages/Google.Cloud.Storage.V1/
Here are 2 examples that helped me to upload files to a bucket in Google Cloud Storage with Google.Cloud.Storage.V1 (not Google.Apis.Storage.v1):
Upload files to Google cloud storage using c#
Uploading .csv Files to Google Cloud Storage using C# .Net
I got both working on a C# Console Application just for testing purposes.
February 2021:
string _projectId = "YOUR-PROJECT-ID-GCP"; //ProjectID also present in the json file
GoogleCredential _credential = GoogleCredential.FromFile("credential-cloud-file-123418c9e06c.json");
/// <summary>
/// UploadFile to GCS Bucket
/// </summary>
/// <param name="bucketName"></param>
/// <param name="localPath">my-local-path/my-file-name</param>
/// <param name="objectName">my-file-name</param>
public void UploadFile(string bucketName, string localPath, string objectName)
{
var storage = StorageClient.Create(_credential);
using var fileStream = File.OpenRead(localPath);
storage.UploadObject(bucketName, objectName, null, fileStream);
Console.WriteLine($"Uploaded {objectName}.");
}
You get the credentials JSON file from the Google Cloud portal, where you create a bucket under your project.
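A hypothetical call site for the helper above (bucket name and paths are placeholders):
UploadFile("my-example-bucket", @"C:\files\report.pdf", "report.pdf");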
Simple, with auth:
private void SaveFileToGoogleStorage(string path, string? fileName, string ext)
{
var filePath = Path.Combine(path, fileName + ext);
var gcCredentialsPath = Path.Combine(Environment.CurrentDirectory, "gc_sa_key.json");
Environment.SetEnvironmentVariable("GOOGLE_APPLICATION_CREDENTIALS", gcCredentialsPath);
var gcsStorage = StorageClient.Create();
using var f = File.OpenRead(filePath);
var objectName = Path.GetFileName(filePath);
gcsStorage.UploadObject(_bucketName, objectName, null, f);
Console.WriteLine($"Uploaded {objectName}.");
}
Using .NET SDK v1.5.21.0, I'm trying to upload a large file (63 MB) following the example at:
http://docs.aws.amazon.com/AmazonS3/latest/dev/LLuploadFileDotNet.html
but using a helper instead of the whole code, together with jQuery File Upload:
https://github.com/blueimp/jQuery-File-Upload/blob/master/basic-plus.html
What I have is:
string bucket = "mybucket";
long totalSize = long.Parse(context.Request.Headers["X-File-Size"]),
maxChunkSize = long.Parse(context.Request.Headers["X-File-MaxChunkSize"]),
uploadedBytes = long.Parse(context.Request.Headers["X-File-UloadedBytes"]),
partNumber = uploadedBytes / maxChunkSize + 1,
fileSize = partNumber * inputStream.Length;
bool lastPart = inputStream.Length < maxChunkSize;
// http://docs.aws.amazon.com/AmazonS3/latest/dev/LLuploadFileDotNet.html
if (partNumber == 1) // initialize upload
{
iView.Utilities.Amazon_S3.S3MultipartUpload.InitializePartToCloud(fileName, bucket);
}
try
{
// upload part
iView.Utilities.Amazon_S3.S3MultipartUpload.UploadPartToCloud(inputStream, fileName, bucket, (int)partNumber, uploadedBytes, maxChunkSize);
if (lastPart)
// wrap it up and go home
iView.Utilities.Amazon_S3.S3MultipartUpload.CompletePartToCloud(fileName, bucket);
}
catch (System.Exception ex)
{
// Houston, we have a problem!
//Console.WriteLine("Exception occurred: {0}", ex.Message);
iView.Utilities.Amazon_S3.S3MultipartUpload.AbortPartToCloud(fileName, bucket);
}
and
public static class S3MultipartUpload
{
private static string accessKey = System.Configuration.ConfigurationManager.AppSettings["AWSAccessKey"];
private static string secretAccessKey = System.Configuration.ConfigurationManager.AppSettings["AWSSecretKey"];
private static AmazonS3 client = Amazon.AWSClientFactory.CreateAmazonS3Client(accessKey, secretAccessKey);
public static InitiateMultipartUploadResponse initResponse;
public static List<UploadPartResponse> uploadResponses;
public static void InitializePartToCloud(string destinationFilename, string destinationBucket)
{
// 1. Initialize.
uploadResponses = new List<UploadPartResponse>();
InitiateMultipartUploadRequest initRequest =
new InitiateMultipartUploadRequest()
.WithBucketName(destinationBucket)
.WithKey(destinationFilename.TrimStart('/'));
initResponse = client.InitiateMultipartUpload(initRequest);
}
public static void UploadPartToCloud(Stream fileStream, string destinationFilename, string destinationBucket, int partNumber, long uploadedBytes, long maxChunkedBytes)
{
// 2. Upload Parts.
UploadPartRequest request = new UploadPartRequest()
.WithBucketName(destinationBucket)
.WithKey(destinationFilename.TrimStart('/'))
.WithUploadId(initResponse.UploadId)
.WithPartNumber(partNumber)
.WithPartSize(maxChunkedBytes)
.WithFilePosition(uploadedBytes)
.WithInputStream(fileStream) as UploadPartRequest;
uploadResponses.Add(client.UploadPart(request));
}
public static void CompletePartToCloud(string destinationFilename, string destinationBucket)
{
// Step 3: complete.
CompleteMultipartUploadRequest compRequest =
new CompleteMultipartUploadRequest()
.WithBucketName(destinationBucket)
.WithKey(destinationFilename.TrimStart('/'))
.WithUploadId(initResponse.UploadId)
.WithPartETags(uploadResponses);
CompleteMultipartUploadResponse completeUploadResponse =
client.CompleteMultipartUpload(compRequest);
}
public static void AbortPartToCloud(string destinationFilename, string destinationBucket)
{
// abort.
client.AbortMultipartUpload(new AbortMultipartUploadRequest()
.WithBucketName(destinationBucket)
.WithKey(destinationFilename.TrimStart('/'))
.WithUploadId(initResponse.UploadId));
}
}
My maxChunkSize is 6 MB (6 * (1024 * 1024)), as I have read that the minimum is 5 MB...
Why am I getting a "Your proposed upload is smaller than the minimum allowed size" exception? What am I doing wrong?
The error is:
<Error>
<Code>EntityTooSmall</Code>
<Message>Your proposed upload is smaller than the minimum allowed size</Message>
<ETag>d41d8cd98f00b204e9800998ecf8427e</ETag>
<MinSizeAllowed>5242880</MinSizeAllowed>
<ProposedSize>0</ProposedSize>
<RequestId>C70E7A23C87CE5FC</RequestId>
<HostId>pmhuMXdRBSaCDxsQTHzucV5eUNcDORvKY0L4ZLMRBz7Ch1DeMh7BtQ6mmfBCLPM2</HostId>
<PartNumber>1</PartNumber>
</Error>
How can the ProposedSize be 0 if I'm passing the stream and the stream length?
Here is a working solution for the latest Amazon SDK (as of today: v1.5.37.0).
Amazon S3 Multipart Upload works like this:
Initialize the request using client.InitiateMultipartUpload(initRequest)
Send chunks of the file (loop until the end) using client.UploadPart(request)
Complete the request using client.CompleteMultipartUpload(compRequest)
If anything goes wrong, remember to dispose of the client and the request, as well as fire the abort command using client.AbortMultipartUpload(abortMultipartUploadRequest)
I keep the client in Session, as we need it for each chunk upload; likewise, keep hold of the ETags, which are later used to complete the process.
You can see an example and a simple way of doing this in the Amazon Docs itself. I ended up writing a class to do everything, and I have integrated it with the lovely jQuery File Upload plugin (handler code below as well).
The S3MultipartUpload class is as follows:
public class S3MultipartUpload : IDisposable
{
string accessKey = System.Configuration.ConfigurationManager.AppSettings.Get("AWSAccessKey");
string secretAccessKey = System.Configuration.ConfigurationManager.AppSettings.Get("AWSSecretKey");
AmazonS3 client;
public string OriginalFilename { get; set; }
public string DestinationFilename { get; set; }
public string DestinationBucket { get; set; }
public InitiateMultipartUploadResponse initResponse;
public List<PartETag> uploadPartETags;
public string UploadId { get; private set; }
public S3MultipartUpload(string destinationFilename, string destinationBucket)
{
if (client == null)
{
System.Net.WebRequest.DefaultWebProxy = null; // disable proxy to make upload quicker
client = Amazon.AWSClientFactory.CreateAmazonS3Client(accessKey, secretAccessKey, new AmazonS3Config()
{
RegionEndpoint = Amazon.RegionEndpoint.EUWest1,
CommunicationProtocol = Protocol.HTTP
});
this.OriginalFilename = destinationFilename.TrimStart('/');
this.DestinationFilename = string.Format("{0:yyyy}{0:MM}{0:dd}{0:HH}{0:mm}{0:ss}{0:fffff}_{1}", DateTime.UtcNow, this.OriginalFilename);
this.DestinationBucket = destinationBucket;
this.InitializePartToCloud();
}
}
private void InitializePartToCloud()
{
// 1. Initialize.
uploadPartETags = new List<PartETag>();
InitiateMultipartUploadRequest initRequest = new InitiateMultipartUploadRequest();
initRequest.BucketName = this.DestinationBucket;
initRequest.Key = this.DestinationFilename;
// make it public
initRequest.AddHeader("x-amz-acl", "public-read");
initResponse = client.InitiateMultipartUpload(initRequest);
}
public void UploadPartToCloud(Stream fileStream, long uploadedBytes, long maxChunkedBytes)
{
int partNumber = uploadPartETags.Count() + 1; // current part
// 2. Upload Parts.
UploadPartRequest request = new UploadPartRequest();
request.BucketName = this.DestinationBucket;
request.Key = this.DestinationFilename;
request.UploadId = initResponse.UploadId;
request.PartNumber = partNumber;
request.PartSize = fileStream.Length;
//request.FilePosition = uploadedBytes; // not needed: the part is read from InputStream directly
request.InputStream = fileStream; // as UploadPartRequest;
var up = client.UploadPart(request);
uploadPartETags.Add(new PartETag() { ETag = up.ETag, PartNumber = partNumber });
}
public string CompletePartToCloud()
{
// Step 3: complete.
CompleteMultipartUploadRequest compRequest = new CompleteMultipartUploadRequest();
compRequest.BucketName = this.DestinationBucket;
compRequest.Key = this.DestinationFilename;
compRequest.UploadId = initResponse.UploadId;
compRequest.PartETags = uploadPartETags;
string r = "Something went badly wrong";
using (CompleteMultipartUploadResponse completeUploadResponse = client.CompleteMultipartUpload(compRequest))
r = completeUploadResponse.ResponseXml;
return r;
}
public void AbortPartToCloud()
{
// abort.
client.AbortMultipartUpload(new AbortMultipartUploadRequest()
{
BucketName = this.DestinationBucket,
Key = this.DestinationFilename,
UploadId = initResponse.UploadId
});
}
public void Dispose()
{
if (client != null) client.Dispose();
if (initResponse != null) initResponse.Dispose();
}
}
I use DestinationFilename as the destination file so I can avoid name collisions, and I keep the OriginalFilename because I need it later.
Using jQuery File Upload Plugin, all works inside a Generic Handler, and the process is something like this:
// Upload partial file
private void UploadPartialFile(string fileName, HttpContext context, List<FilesStatus> statuses)
{
if (context.Request.Files.Count != 1)
throw new HttpRequestValidationException("Attempt to upload chunked file containing more than one fragment per request");
var inputStream = context.Request.Files[0].InputStream;
string contentRange = context.Request.Headers["Content-Range"]; // "bytes 0-6291455/14130271"
int fileSize = int.Parse(contentRange.Split('/')[1]),
maxChunkSize = int.Parse(context.Request.Headers["X-Max-Chunk-Size"]),
uploadedBytes = int.Parse(contentRange.Replace("bytes ", "").Split('-')[0]);
iView.Utilities.AWS.S3MultipartUpload s3Upload = null;
try
{
// ######################################################################################
// 1. Initialize Amazon S3 Client
if (uploadedBytes == 0)
{
HttpContext.Current.Session["s3-upload"] = new iView.Utilities.AWS.S3MultipartUpload(fileName, awsBucket);
s3Upload = (iView.Utilities.AWS.S3MultipartUpload)HttpContext.Current.Session["s3-upload"];
string msg = System.String.Format("Upload started: {0} ({1:N0}Mb)", s3Upload.DestinationFilename, (fileSize / 1024 / 1024));
this.Log(msg);
}
// cast current session object
if (s3Upload == null)
s3Upload = (iView.Utilities.AWS.S3MultipartUpload)HttpContext.Current.Session["s3-upload"];
// ######################################################################################
// 2. Send Chunks
s3Upload.UploadPartToCloud(inputStream, uploadedBytes, maxChunkSize);
// ######################################################################################
// 3. Complete Upload
if (uploadedBytes + maxChunkSize > fileSize)
{
string completeRequest = s3Upload.CompletePartToCloud();
this.Log(completeRequest); // log S3 response
s3Upload.Dispose(); // dispose all objects
HttpContext.Current.Session["s3-upload"] = null; // we don't need this anymore
}
}
catch (System.Exception ex)
{
if (ex.InnerException != null)
while (ex.InnerException != null)
ex = ex.InnerException;
this.Log(string.Format("{0}\n\n{1}", ex.Message, ex.StackTrace)); // log error
s3Upload.AbortPartToCloud(); // abort current upload
s3Upload.Dispose(); // dispose all objects
statuses.Add(new FilesStatus(ex.Message));
return;
}
statuses.Add(new FilesStatus(s3Upload.DestinationFilename, fileSize, ""));
}
Keep in mind that to have a Session object inside a Generic Handler, you need to implement IRequiresSessionState so your handler will look like:
public class UploadHandlerSimple : IHttpHandler, IRequiresSessionState
Inside fileupload.js (under _initXHRData) I have added an extra header called X-Max-Chunk-Size so I can pass this to Amazon and calculate if it's the last part of the uploaded file.
Feel free to comment and make smart edits for everyone to use.
I guess you didn't set the content-length of the part inside the UploadPartToCloud() function.
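If that is indeed the cause, a hedged sketch of the fix, using the same 1.5.x fluent API as the question's UploadPartToCloud(), is to size each part from the actual stream and rewind it before uploading:
// Inside UploadPartToCloud: size the part from the real chunk stream,
// not the nominal maxChunkedBytes, and make sure the stream is rewound.
fileStream.Position = 0;
UploadPartRequest request = new UploadPartRequest()
    .WithBucketName(destinationBucket)
    .WithKey(destinationFilename.TrimStart('/'))
    .WithUploadId(initResponse.UploadId)
    .WithPartNumber(partNumber)
    .WithPartSize(fileStream.Length) // actual bytes in this part
    .WithInputStream(fileStream) as UploadPartRequest;
uploadResponses.Add(client.UploadPart(request));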