OneDrive Upload/Download to Specified Directory - c#

I'm trying to use the Live SDK (v5.6) to include backup/restore from OneDrive in my Windows Phone 8.1 Silverlight application. I can read/write to the standard "me/skydrive" directory, but I'm having a hard time finding a way to upload/download to a specified directory. I can create the folder if it doesn't exist, no problem.
I have been trying below with no luck.
var res = await _client.UploadAsync("me/skydrive/mydir", fileName, isoStoreFileStream, OverwriteOption.Overwrite);
I've also tried getting the directory ID and passing that in also.
var res = await _client.UploadAsync("me/skydrive/" + folderId, fileName, isoStoreFileStream, OverwriteOption.Overwrite);
Same error: I'm told 'mydir' (or the id) isn't supported:
"{request_url_invalid: Microsoft.Live.LiveConnectException: The URL contains the path 'mydir', which isn't supported."
Any suggestions? If you suggest an answer for the uploadasync, could you also include how I could download my file from the specified directory? Thanks!

Here's an extension method that checks whether a folder exists:
If it exists, it returns the folder id.
If not, it creates the folder and returns the new folder id.
You can then use this id to upload to and download from that folder.
public async static Task<string> CreateDirectoryAsync(this LiveConnectClient client,
string folderName, string parentFolder)
{
string folderId = null;
// Retrieves all the directories.
var queryFolder = parentFolder + "/files?filter=folders,albums";
var opResult = await client.GetAsync(queryFolder);
dynamic result = opResult.Result;
foreach (dynamic folder in result.data)
{
// Checks if current folder has the passed name.
if (folder.name.ToLowerInvariant() == folderName.ToLowerInvariant())
{
folderId = folder.id;
break;
}
}
if (folderId == null)
{
// Directory hasn't been found, so creates it using the PostAsync method.
var folderData = new Dictionary<string, object>();
folderData.Add("name", folderName);
opResult = await client.PostAsync(parentFolder, folderData);
result = opResult.Result;
// Retrieves the id of the created folder.
folderId = result.id;
}
return folderId;
}
You then use this as:
string skyDriveFolder = await CreateDirectoryAsync(liveConnectClient, "<YourFolderNameHere>", "me/skydrive");
Now skyDriveFolder has the folder id that you can use when uploading and downloading. Here's a sample Upload:
LiveOperationResult result = await liveConnectClient.UploadAsync(skyDriveFolder, fileName,
fileStream, OverwriteOption.Overwrite);
ADDITION TO COMPLETE THE ANSWER BY YnotDraw
Using what you provided, here's how to download a text file by specifying the file name. The code below doesn't handle a missing file or other potential exceptions, but here is what works when the stars align:
public async static Task<string> DownloadFileAsync(this LiveConnectClient client, string directory, string fileName)
{
    // Resolve (or create) the folder with the helper above and get its id.
    string skyDriveFolder = await client.CreateDirectoryAsync(directory, "me/skydrive");
    var operation = await client.GetAsync(skyDriveFolder + "/files");
    var items = operation.Result["data"] as List<object>;
    string id = string.Empty;
    // Search for the file - add handling here if File Not Found
    foreach (object item in items)
    {
        IDictionary<string, object> file = item as IDictionary<string, object>;
        if (file["name"].ToString() == fileName)
        {
            id = file["id"].ToString();
            break;
        }
    }
    var downloadResult = await client.DownloadAsync(string.Format("{0}/content", id));
    var reader = new StreamReader(downloadResult.Stream);
    string text = await reader.ReadToEndAsync();
    return text;
}
And in usage:
var text = await _client.DownloadFileAsync("MyDir", "backup.txt");

Related

Only upload updated files into Blob Storage

I have a method that uploads XML files from a folder into a Blob Storage. Connected to that Blob Storage I have a Blob Trigger that listens to changes in the Blob Storage, takes the files and then does a PUT request to a server. I got that sorted out and working.
My problem is that when I want to update a specific file in the folder and run my code, all the files in the folder seem to be uploaded again and my Blob Trigger goes off, doing a PUT for all the files. I only want to do a PUT for the files that changed in the folder (except for my initial upload to the blob, of course).
The code I have so far is as basic as my level of experience. For the import I followed a simple guide.
My code that uploads the file into the Blob Storage:
class Program
{
static void Main(string[] args)
{
Console.WriteLine("Starting...");
string accountName = ConfigurationManager.AppSettings["accountName"];
string accountKey = ConfigurationManager.AppSettings["accountKey"];
string localFolder = ConfigurationManager.AppSettings["mySourceFolder"];
string destContainer = ConfigurationManager.AppSettings["destContainer"];
var stringReturned = BlobSetup(accountName, accountKey, localFolder, destContainer).GetAwaiter().GetResult();
Console.WriteLine(stringReturned);
Console.Read();
}
static async Task UploadBlob(CloudBlobContainer container, string key, string filePath, bool deleteAfter)
{
//Get a blob reference to write this file to
var blob = container.GetBlockBlobReference(key);
await blob.UploadFromFileAsync(filePath);
Console.WriteLine("Uploaded {0}", filePath);
//if delete of file is requested, do that
if (deleteAfter)
{
File.Delete(filePath);
}
}
static async Task<string> BlobSetup(string accountName, string accountKey, string localFolder, string destContainer)
{
var storageAccount = new CloudStorageAccount(new StorageCredentials(accountName, accountKey), true);
var blobClient = storageAccount.CreateCloudBlobClient();
var container = blobClient.GetContainerReference(destContainer);
//create container if not exists
await container.CreateIfNotExistsAsync();
await container.SetPermissionsAsync(new BlobContainerPermissions()
{
PublicAccess = BlobContainerPublicAccessType.Blob
});
string[] fileEntries = Directory.GetFiles(localFolder);
foreach (string filePath in fileEntries)
{
//Handle only json and xml?
if(filePath.EndsWith(".json") || filePath.EndsWith(".xml"))
{
string keys = Path.GetFileName(filePath);
await UploadBlob(container, keys, filePath, false);
}
}
return "some response";
}
}
My Blob Trigger that does the PUT:
public static class BlobTriggerExample
{
const string serverUrl= "theurl";
static HttpClient client = new HttpClient();
[FunctionName("BlobTriggerExample")]
public static async Task Run([BlobTrigger("myblob/{name}", Connection = "AzureWebJobsStorage")]CloudBlockBlob myBlob, string name, TraceWriter log)
{
string putUrlString = "";
string idValue = "";
XDocument xdoc = new XDocument();
myBlob.StreamMinimumReadSizeInBytes = 20 * 1024 * 1024;
await myBlob.FetchAttributesAsync();
//Read stream
var blobStream = await myBlob.OpenReadAsync();
xdoc = new XDocument(XDocument.Load(blobStream));
//Read root node(resourceType)
string resourceType = xdoc.Root.Name.LocalName;
//Get id value
idValue = xdoc.Descendants().Where(x => x.Name.LocalName == "id").First().LastAttribute.Value;
//Build redirect string
putUrlString = serverUrl + resourceType + "/" + idValue;
//PUT
var httpContent = new StringContent(xdoc.ToString(), Encoding.UTF8, "application/xml");
var response = await client.PutAsync(putUrlString, httpContent);
Console.WriteLine($"Response: {response}");
Console.Read();
log.Info($"C# Blob trigger function Processed blob\n Name:{name} \n Size: {myBlob.StreamWriteSizeInBytes} Bytes : Response message: {response}");
}
}
My guess is that I want to control what files I'm uploading into the Blob Storage by doing some sort of check whether the exact same file already exists. Or maybe I should do some sort of check in the Blob Trigger before doing the PUT?
File names in the folder I'm uploading from are always the same (a must), even though some of the content might have changed.
Could anyone be so kind as to give me some guidelines on how I might approach this? I have been googling around for hours and I'm getting nowhere.
Yes, your code loops through and uploads all the files in your local folder. The blob trigger just sees that the blobs have been written and has no concept of whether or not their content has changed (or whether that matters) so it also processes all of them.
What you need to do is to compare your local files against the ones in blob storage before you upload to see whether they're new versions or not, so in your UploadBlob method you want something along the lines of
// Get a blob reference to write this file to
var blob = container.GetBlockBlobReference(key);
// If the blob already exists
if (await blob.ExistsAsync()) {
// Fetch the blob's properties
await blob.FetchAttributesAsync();
// Only proceed if modification time of local file is newer
if (blob.Properties.LastModified > File.GetLastWriteTimeUtc(filePath))
return;
}
If checking the modification time isn't enough then you can also attach your own metadata (e.g. a checksum) to blobs and use that for the comparison - see https://learn.microsoft.com/en-us/azure/storage/blobs/storage-properties-metadata.
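If you go the checksum route, here's a minimal sketch using the same CloudBlockBlob API as the question; "localmd5" is just a made-up metadata key, not anything the SDK requires:
// Sketch: skip the upload when a previously stored checksum matches the local file.
static async Task UploadBlobIfChanged(CloudBlobContainer container, string key, string filePath)
{
    // Compute the local file's MD5 and compare it to the one stored in blob metadata.
    string localMd5;
    using (var md5 = System.Security.Cryptography.MD5.Create())
    using (var stream = File.OpenRead(filePath))
    {
        localMd5 = Convert.ToBase64String(md5.ComputeHash(stream));
    }

    var blob = container.GetBlockBlobReference(key);
    if (await blob.ExistsAsync())
    {
        await blob.FetchAttributesAsync();
        if (blob.Metadata.TryGetValue("localmd5", out var storedMd5) && storedMd5 == localMd5)
        {
            return; // unchanged, nothing to upload, trigger won't fire
        }
    }

    await blob.UploadFromFileAsync(filePath);
    blob.Metadata["localmd5"] = localMd5;
    await blob.SetMetadataAsync();
}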
Below is a version we use in production (this one uses the newer Azure.Storage.Blobs client). Note that metadata is used instead of the blob's last-modified property, for better flexibility and reliability. It might not fit everybody's needs, but it is one way of doing it that is tested and working.
for (var i = 0; i < files.Length; i++)
{
var filePath = files[i];
//maintains folder structure
var blobName = filePath.Replace(selectedBuildPath, "").Replace("\\", "/");
var blobClient = GetNewBlobClientWithMaxRetry(blobName, containerClient);
// If the blob already exists, get the last modified tick count in the blobs metadata
long blobLastModifiedTick = 0;
if (await blobClient.ExistsAsync())
{
try
{
var blobProperties = (await blobClient.GetPropertiesAsync()).Value;
//this line will fail if lastmodifiedticks was not set before
blobLastModifiedTick = long.Parse(blobProperties.Metadata[_lastmodifiedticks]);
}
//if fail, blob last tick will be 0, aka outdated
catch { }
}
//if local file is latest, then upload else skip
var localLastModified = File.GetLastWriteTimeUtc(filePath);
var localIsNewer = blobLastModifiedTick < localLastModified.Ticks;
if (localIsNewer)
{
//print name 1st to know which gets stuck
//extra space to delete extra prev
Console.Write($"\r{i}/{files.Length} Uploading:{blobName} ");
//note will overwrite existing
await using var fs = File.Open(filePath, FileMode.Open);
await blobClient.UploadAsync(fs, overwrite: true, CancellationToken.None);
//auto correct content type from wrongly set "octet/stream"
var blobHttpHeaders = new BlobHttpHeaders { ContentType = MimeTypeMap.GetMimeType(filePath) };
await blobClient.SetHttpHeadersAsync(blobHttpHeaders);
await blobClient.SetMetadataAsync(new Dictionary<string, string>() { { _lastmodifiedticks, localLastModified.Ticks.ToString() } });
}
else
{
Console.Write($"\r{i}/{files.Length} Skipped:{blobName} "); //space to delete extra previous line
}
}
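For completeness, the snippet references a few members it doesn't define: GetNewBlobClientWithMaxRetry, _lastmodifiedticks, and MimeTypeMap (presumably a small MIME-type lookup helper). A minimal guess at what the first two might look like, not the original author's code:
// Assumed companion pieces (Azure.Storage.Blobs v12); names mirror the snippet above.
const string _lastmodifiedticks = "lastmodifiedticks"; // metadata key storing File.GetLastWriteTimeUtc().Ticks

static BlobClient GetNewBlobClientWithMaxRetry(string blobName, BlobContainerClient containerClient)
{
    // Retry behaviour is normally configured once on the service/container client via
    // BlobClientOptions.Retry; per-blob clients obtained this way inherit those options.
    return containerClient.GetBlobClient(blobName);
}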

How to get fields from ExecuteAsync() method

I'm familiar with creating a folder in my Google Drive, but I'm looking into doing this using the asynchronous method. However, in doing so, I'm not sure how to obtain a field that I explicitly added to the fields I'd like returned.
My code below:
private Task<Google.Apis.Drive.v3.Data.File> CreateGoogleDriveFolderAsync(DriveService service, string foldername, string parent_id = null)
{
IList<string> parent_ids = new List<string>();
Google.Apis.Drive.v3.Data.File folder = new Google.Apis.Drive.v3.Data.File
{
Name = foldername
, MimeType = "application/vnd.google-apps.folder"
, Description = "Client Name: blah\nUser: Rudy\n"
};
var insert = service.Files.Create(folder);
// The field I'd like to get somewhere.
insert.Fields = "id";
var task = insert.ExecuteAsync();
task.ContinueWith(t =>
{
// NotOnRanToCompletion - this code will be called if the upload fails
Console.WriteLine("Failed to create folder \"{0}\": " + t.Exception, foldername);
}, TaskContinuationOptions.NotOnRanToCompletion);
task.ContinueWith(t =>
{
// I'd like a way to access "id" from my insert execution.
log.insertLogging(foldername, "Directory Created");
});
return task;
}
Using an example from the documentation: await the task and get the file back; from there you should have access to its properties.
private async Task<File> CreateGoogleDriveFolderAsync(DriveService driveService, string foldername) {
var metadata = new File()
{
Name = foldername,
MimeType = "application/vnd.google-apps.folder"
};
var request = driveService.Files.Create(metadata);
request.Fields = "id";
var folder = await request.ExecuteAsync();
Console.WriteLine("Folder ID: " + folder.Id);
return folder;
}
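If you then want to create files inside that folder, the returned Id can be passed as a parent. A sketch that could sit next to the method above; the folder name, file path, and MIME type are just examples:
// Sketch: create a file inside the folder returned above.
var folder = await CreateGoogleDriveFolderAsync(driveService, "MyFolder");

var fileMetadata = new Google.Apis.Drive.v3.Data.File
{
    Name = "notes.txt",
    Parents = new List<string> { folder.Id }
};

using (var stream = new FileStream(@"C:\temp\notes.txt", FileMode.Open))
{
    var createRequest = driveService.Files.Create(fileMetadata, stream, "text/plain");
    createRequest.Fields = "id";
    var progress = await createRequest.UploadAsync();
    if (progress.Status == Google.Apis.Upload.UploadStatus.Completed)
        Console.WriteLine("Uploaded file ID: " + createRequest.ResponseBody.Id);
}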

Web API Upload Files

I have some data to save into a database.
I have created a web api post method to save data. Following is my post method:
[Route("PostRequirementTypeProcessing")]
public IEnumerable<NPAAddRequirementTypeProcessing> PostRequirementTypeProcessing(mdlAddAddRequirementTypeProcessing requTypeProcess)
{
mdlAddAddRequirementTypeProcessing rTyeProcessing = new mdlAddAddRequirementTypeProcessing();
rTyeProcessing.szDescription = requTypeProcess.szDescription;
rTyeProcessing.iRequirementTypeId = requTypeProcess.iRequirementTypeId;
rTyeProcessing.szRequirementNumber = requTypeProcess.szRequirementNumber;
rTyeProcessing.szRequirementIssuer = requTypeProcess.szRequirementIssuer;
rTyeProcessing.szOrganization = requTypeProcess.szOrganization;
rTyeProcessing.dIssuedate = requTypeProcess.dIssuedate;
rTyeProcessing.dExpirydate = requTypeProcess.dExpirydate;
rTyeProcessing.szSignedBy = requTypeProcess.szSignedBy;
rTyeProcessing.szAttachedDocumentNo = requTypeProcess.szAttachedDocumentNo;
if (String.IsNullOrEmpty(rTyeProcessing.szAttachedDocumentNo))
{
}
else
{
UploadFile();
}
rTyeProcessing.szSubject = requTypeProcess.szSubject;
rTyeProcessing.iApplicationDetailsId = requTypeProcess.iApplicationDetailsId;
rTyeProcessing.iEmpId = requTypeProcess.iEmpId;
NPAEntities context = new NPAEntities();
Log.Debug("PostRequirementTypeProcessing Request traced");
var newRTP = context.NPAAddRequirementTypeProcessing(requTypeProcess.szDescription, requTypeProcess.iRequirementTypeId,
requTypeProcess.szRequirementNumber, requTypeProcess.szRequirementIssuer, requTypeProcess.szOrganization,
requTypeProcess.dIssuedate, requTypeProcess.dExpirydate, requTypeProcess.szSignedBy,
requTypeProcess.szAttachedDocumentNo, requTypeProcess.szSubject, requTypeProcess.iApplicationDetailsId,
requTypeProcess.iEmpId);
return newRTP.ToList();
}
There is a field called 'szAttachedDocumentNo' which is a document that's being saved in the database as well.
After saving all the data, I want the physical file behind 'szAttachedDocumentNo' to be saved on the server. So I created a method called "UploadFile" as follows:
[HttpPost]
public void UploadFile()
{
if (HttpContext.Current.Request.Files.AllKeys.Any())
{
// Get the uploaded file from the Files collection
var httpPostedFile = HttpContext.Current.Request.Files["UploadedFile"];
if (httpPostedFile != null)
{
// Validate the uploaded image(optional)
string folderPath = HttpContext.Current.Server.MapPath("~/UploadedFiles");
//string folderPath1 = Convert.ToString(ConfigurationManager.AppSettings["DocPath"]);
//Directory not exists then create new directory
if (!Directory.Exists(folderPath))
{
Directory.CreateDirectory(folderPath);
}
// Get the complete file path
var fileSavePath = Path.Combine(folderPath, httpPostedFile.FileName);
// Save the uploaded file to "UploadedFiles" folder
httpPostedFile.SaveAs(fileSavePath);
}
}
}
Before running the project, I debugged the post method, so when it comes to the "UploadFile" line, it steps into that method.
From the Files line, it skipped the remaining lines and went to the last line, which means it didn't see any file.
I am able to save everything to the database, just that I don't see the physical file in the specified location.
Any help would be much appreciated.
Regards,
Somad
Make sure the request "Content-Type": "multipart/form-data" is set:
[HttpPost()]
public async Task<IHttpActionResult> UploadFile()
{
if (!Request.Content.IsMimeMultipartContent())
throw new HttpResponseException(HttpStatusCode.UnsupportedMediaType);
try
{
MultipartMemoryStreamProvider provider = new MultipartMemoryStreamProvider();
await Request.Content.ReadAsMultipartAsync(provider);
if (provider.Contents == null || provider.Contents.Count == 0)
{
return BadRequest("No files provided.");
}
foreach (HttpContent file in provider.Contents)
{
string filename = file.Headers.ContentDisposition.FileName.Trim('\"');
byte[] buffer = await file.ReadAsByteArrayAsync();
using (MemoryStream stream = new MemoryStream(buffer))
{
// save the file whereever you want
}
}
return Ok("files Uploded");
}
catch (Exception ex)
{
return InternalServerError(ex);
}
}
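For reference, here's a client-side sketch that posts a file in the shape both server methods expect; the URL and file path are placeholders, and "UploadedFile" matches the form field name read in the question's UploadFile method:
// Sketch: sending a file as multipart/form-data with HttpClient.
using (var client = new HttpClient())
using (var form = new MultipartFormDataContent())
using (var fileStream = File.OpenRead(@"C:\temp\document.pdf"))
{
    var fileContent = new StreamContent(fileStream);
    fileContent.Headers.ContentType = new System.Net.Http.Headers.MediaTypeHeaderValue("application/octet-stream");
    // "UploadedFile" is the key the server reads from Request.Files
    form.Add(fileContent, "UploadedFile", "document.pdf");

    var response = await client.PostAsync("http://localhost:12345/api/UploadFile", form);
    Console.WriteLine(response.StatusCode);
}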

Uploading objects to google cloud storage buckets in c#

Can someone please provide an example of how to use Google.Apis.Storage.v1 for uploading files to google cloud storage in c#?
I found that this basic operation is not as straightforward as you might expect. Google's documentation about its Storage API is lacking in information about using it in C# (or any other .NET language). Searching for 'how to upload file to google cloud storage in c#' didn't exactly help me, so here is my working solution with some comments:
Preparation:
You need to create an OAuth2 client in your Google Developers Console - go to Project/APIs & auth/Credentials.
Copy the Client ID & Client Secret into your code. You will also need your project name.
Code (it assumes that you've added Google.Apis.Storage.v1 via NuGet):
First, you need to authorize your requests:
var clientSecrets = new ClientSecrets();
clientSecrets.ClientId = clientId;
clientSecrets.ClientSecret = clientSecret;
//there are different scopes, which you can find here https://cloud.google.com/storage/docs/authentication
var scopes = new[] {@"https://www.googleapis.com/auth/devstorage.full_control"};
var cts = new CancellationTokenSource();
var userCredential = await GoogleWebAuthorizationBroker.AuthorizeAsync(clientSecrets, scopes, "yourGoogle@email", cts.Token);
Sometimes you might also want to refresh authorization token via:
await userCredential.RefreshTokenAsync(cts.Token);
You also need to create Storage Service:
var service = new Google.Apis.Storage.v1.StorageService();
Now you can make requests to Google Storage API.
Let's start with creating a new bucket:
var newBucket = new Google.Apis.Storage.v1.Data.Bucket()
{
Name = "your-bucket-name-1"
};
var newBucketQuery = service.Buckets.Insert(newBucket, projectName);
newBucketQuery.OauthToken = userCredential.Result.Token.AccessToken;
//you probably want to wrap this into try..catch block
newBucketQuery.Execute();
And it's done. Now, you can send a request to get list of all of your buckets:
var bucketsQuery = service.Buckets.List(projectName);
bucketsQuery.OauthToken = userCredential.Result.Token.AccessToken;
var buckets = bucketsQuery.Execute();
Last part is uploading new file:
//enter bucket name to which you want to upload file
var bucketToUpload = buckets.Items.FirstOrDefault().Name;
var newObject = new Object()
{
Bucket = bucketToUpload,
Name = "some-file-"+new Random().Next(1,666)
};
FileStream fileStream = null;
try
{
var dir = Directory.GetCurrentDirectory();
var path = Path.Combine(dir, "test.png");
fileStream = new FileStream(path, FileMode.Open);
var uploadRequest = new Google.Apis.Storage.v1.ObjectsResource.InsertMediaUpload(service, newObject,
bucketToUpload,fileStream,"image/png");
uploadRequest.OauthToken = userCredential.Result.Token.AccessToken;
await uploadRequest.UploadAsync();
}
catch (Exception ex)
{
Console.WriteLine(ex.Message);
}
finally
{
if (fileStream != null)
{
fileStream.Dispose();
}
}
And bam! The new file will be visible in your Google Developers Console inside the selected bucket.
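For the download direction with the same service, the generated objects.get request supports media download, so something along these lines should work. A sketch: the bucket and object names are examples, and note the credential is used without .Result once it has been awaited, as a later answer points out.
// Sketch: download an object's content with Google.Apis.Storage.v1.
var getRequest = service.Objects.Get("your-bucket-name-1", "some-file-123");
getRequest.OauthToken = userCredential.Token.AccessToken;
using (var output = new FileStream("downloaded.png", FileMode.Create))
{
    await getRequest.DownloadAsync(output);
}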
You can use the Google Cloud APIs without the SDK in the following way:
You need the api-key.json file.
Install the package Google.Apis.Auth.OAuth2 in order to authorize the HTTP web request.
You can set the default configuration for your application as shown below.
I did the same using a .NET Core web API; the details are given below.
Url details:
"GoogleCloudStorageBaseUrl": "https://www.googleapis.com/upload/storage/v1/b/",
"GoogleSpeechBaseUrl": "https://speech.googleapis.com/v1/operations/",
"GoogleLongRunningRecognizeBaseUrl": "https://speech.googleapis.com/v1/speech:longrunningrecognize",
"GoogleCloudScope": "https://www.googleapis.com/auth/cloud-platform",
public void GetConfiguration()
{
// Set global configuration
bucketName = _configuration.GetValue<string>("BucketName");
googleCloudStorageBaseUrl = _configuration.GetValue<string>("GoogleCloudStorageBaseUrl");
googleSpeechBaseUrl = _configuration.GetValue<string>("GoogleSpeechBaseUrl");
googleLongRunningRecognizeBaseUrl = _configuration.GetValue<string>("GoogleLongRunningRecognizeBaseUrl");
// Set google cloud credentials
string googleApplicationCredentialsPath = _configuration.GetValue<string>("GoogleCloudCredentialPath");
using (Stream stream = new FileStream(googleApplicationCredentialsPath, FileMode.Open, FileAccess.Read))
googleCredential = GoogleCredential.FromStream(stream).CreateScoped(_configuration.GetValue<string>("GoogleCloudScope"));
}
Get Oauth token:
public string GetOAuthToken()
{
return googleCredential.UnderlyingCredential.GetAccessTokenForRequestAsync("https://accounts.google.com/o/oauth2/v2/auth", CancellationToken.None).Result;
}
To upload file to cloud bucket:
public async Task<string> UploadMediaToCloud(string filePath, string objectName = null)
{
string bearerToken = GetOAuthToken();
byte[] fileBytes = File.ReadAllBytes(filePath);
objectName = objectName ?? Path.GetFileName(filePath);
var baseUrl = new Uri(string.Format(googleCloudStorageBaseUrl + "" + bucketName + "/o?uploadType=media&name=" + objectName + ""));
using (WebClient client = new WebClient())
{
client.Headers.Add(HttpRequestHeader.Authorization, "Bearer " + bearerToken);
client.Headers.Add(HttpRequestHeader.ContentType, "application/octet-stream");
byte[] response = await Task.Run(() => client.UploadData(baseUrl, "POST", fileBytes));
string responseInString = Encoding.UTF8.GetString(response);
return responseInString;
}
}
In order to perform any action against the cloud API, you just need to make an HttpClient GET/POST request as required.
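For example, downloading an object is just a GET with alt=media. Here's a sketch reusing the same fields and the GetOAuthToken helper; the download goes through the standard JSON API base URL rather than the upload URL used above (an assumption on my part):
// Sketch: download object content via the JSON API (alt=media returns the raw bytes).
public async Task<byte[]> DownloadMediaFromCloud(string objectName)
{
    string bearerToken = GetOAuthToken();
    var requestUrl = new Uri("https://storage.googleapis.com/storage/v1/b/" + bucketName +
                             "/o/" + Uri.EscapeDataString(objectName) + "?alt=media");
    using (WebClient client = new WebClient())
    {
        client.Headers.Add(HttpRequestHeader.Authorization, "Bearer " + bearerToken);
        return await Task.Run(() => client.DownloadData(requestUrl));
    }
}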
Thanks
This is for Google.Cloud.Storage.V1 (not Google.Apis.Storage.v1), but appears to be a bit simpler to perform an upload now. I started with the Client libraries "Getting Started" instructions to create a service account and bucket, then experimented to find out how to upload an image.
The process I followed was:
Sign up for Google Cloud free trial
Create a new project in Google Cloud (remember the project name\ID for later)
Create a Project Owner service account - this will result in a json file being downloaded that contains the service account credentials. Remember where you put that file.
The getting started docs get you to add the path to the JSON credentials file into an environment variable called GOOGLE_APPLICATION_CREDENTIALS - I couldn't get this to work through the provided instructions. Turns out it is not required, as you can just read the JSON file into a string and pass it to the client constructor.
I created an empty WPF project as a starting point, and a single ViewModel to house the application logic.
Install the Google.Cloud.Storage.V1 nuget package and it should pull in all the dependencies it needs.
Onto the code.
MainWindow.xaml
<StackPanel>
<Button
Margin="50"
Height="50"
Content="BEGIN UPLOAD"
Click="OnButtonClick" />
<ContentControl
Content="{Binding Path=ProgressBar}" />
</StackPanel>
MainWindow.xaml.cs
public partial class MainWindow
{
readonly ViewModel _viewModel;
public MainWindow()
{
_viewModel = new ViewModel(Dispatcher);
DataContext = _viewModel;
InitializeComponent();
}
void OnButtonClick(object sender, RoutedEventArgs args)
{
_viewModel.UploadAsync().ConfigureAwait(false);
}
}
ViewModel.cs
public class ViewModel
{
readonly Dispatcher _dispatcher;
public ViewModel(Dispatcher dispatcher)
{
_dispatcher = dispatcher;
ProgressBar = new ProgressBar {Height=30};
}
public async Task UploadAsync()
{
// Google Cloud Platform project ID.
const string projectId = "project-id-goes-here";
// The name for the new bucket.
const string bucketName = projectId + "-test-bucket";
// Path to the file to upload
const string filePath = @"C:\path\to\image.jpg";
var newObject = new Google.Apis.Storage.v1.Data.Object
{
Bucket = bucketName,
Name = System.IO.Path.GetFileNameWithoutExtension(filePath),
ContentType = "image/jpeg"
};
// read the JSON credential file saved when you created the service account
var credential = Google.Apis.Auth.OAuth2.GoogleCredential.FromJson(System.IO.File.ReadAllText(
#"c:\path\to\service-account-credentials.json"));
// Instantiates a client.
using (var storageClient = Google.Cloud.Storage.V1.StorageClient.Create(credential))
{
try
{
// Creates the new bucket. Only required the first time.
// You can also create buckets through the GCP cloud console web interface
storageClient.CreateBucket(projectId, bucketName);
System.Windows.MessageBox.Show($"Bucket {bucketName} created.");
// Open the image file filestream
using (var fileStream = new System.IO.FileStream(filePath, System.IO.FileMode.Open))
{
ProgressBar.Maximum = fileStream.Length;
// set minimum chunksize just to see progress updating
var uploadObjectOptions = new Google.Cloud.Storage.V1.UploadObjectOptions
{
ChunkSize = Google.Cloud.Storage.V1.UploadObjectOptions.MinimumChunkSize
};
// Hook up the progress callback
var progressReporter = new Progress<Google.Apis.Upload.IUploadProgress>(OnUploadProgress);
await storageClient.UploadObjectAsync(
newObject,
fileStream,
uploadObjectOptions,
progress: progressReporter)
.ConfigureAwait(false);
}
}
catch (Google.GoogleApiException e)
when (e.Error.Code == 409)
{
// When creating the bucket - The bucket already exists. That's fine.
System.Windows.MessageBox.Show(e.Error.Message);
}
catch (Exception e)
{
// other exception
System.Windows.MessageBox.Show(e.Message);
}
}
}
// Called when progress updates
void OnUploadProgress(Google.Apis.Upload.IUploadProgress progress)
{
switch (progress.Status)
{
case Google.Apis.Upload.UploadStatus.Starting:
ProgressBar.Minimum = 0;
ProgressBar.Value = 0;
break;
case Google.Apis.Upload.UploadStatus.Completed:
ProgressBar.Value = ProgressBar.Maximum;
System.Windows.MessageBox.Show("Upload completed");
break;
case Google.Apis.Upload.UploadStatus.Uploading:
UpdateProgressBar(progress.BytesSent);
break;
case Google.Apis.Upload.UploadStatus.Failed:
System.Windows.MessageBox.Show("Upload failed"
+ Environment.NewLine
+ progress.Exception);
break;
}
}
void UpdateProgressBar(long value)
{
_dispatcher.Invoke(() => { ProgressBar.Value = value; });
}
// probably better to expose progress value directly and bind to
// a ProgressBar in the XAML
public ProgressBar ProgressBar { get; }
}
Uploading files to Google Cloud Storage using the SDK (Google.Cloud.Storage.V1) in C#:
You need the api-key.json file.
Install the packages Google.Cloud.Storage.V1 and Google.Apis.Auth.OAuth2.
The code to upload a file to the cloud is given below:
private string UploadFile(string localPath, string objectName = null)
{
string projectId = ((Google.Apis.Auth.OAuth2.ServiceAccountCredential)googleCredential.UnderlyingCredential).ProjectId;
try
{
// Creates the new bucket.
var objResult = storageClient.CreateBucket(projectId, bucketName);
if (!string.IsNullOrEmpty(objResult.Id))
{
// Upload file to google cloud server
using (var f = File.OpenRead(localPath))
{
objectName = objectName ?? Path.GetFileName(localPath);
var objFileUploadStatus1 = storageClient.UploadObject(bucketName, objectName, null, f);
}
}
}
catch (Google.GoogleApiException ex)
{
// Error code =409, means bucket already created/exist then upload file in the bucket
if (ex.Error.Code == 409)
{
// Upload file to google cloud server
using (var f = File.OpenRead(localPath))
{
objectName = objectName ?? Path.GetFileName(localPath);
var objFileUploadStatus2 = storageClient.UploadObject(bucketName, objectName, null, f);
}
}
}
catch (Exception ex)
{
MessageBox.Show(ex.ToString());
}
return objectName;
}
To set the credentials
private bool SetStorageCredentials()
{
bool status = true;
try
{
if (File.Exists(credential_path))
{
Environment.SetEnvironmentVariable("GOOGLE_APPLICATION_CREDENTIALS", credential_path);
using (Stream objStream = new FileStream(Environment.GetEnvironmentVariable("GOOGLE_APPLICATION_CREDENTIALS"), FileMode.Open, FileAccess.Read))
googleCredential = GoogleCredential.FromStream(objStream);
// Instantiates a client.
storageClient = StorageClient.Create();
channel = new Grpc.Core.Channel(SpeechClient.DefaultEndpoint.Host, googleCredential.ToChannelCredentials());
}
else
{
DialogResult result = MessageBox.Show("File " + Path.GetFileName(credential_path) + " does not exist. Please provide the correct path.");
if (result == System.Windows.Forms.DialogResult.OK)
{
status = false;
}
}
}
catch (Exception ex)
{
MessageBox.Show(ex.ToString());
status = false;
}
return status;
}
I used the SDK in one of my Windows applications. You can adapt the same code to your needs/requirements.
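A hypothetical usage of the two helpers above, assuming it runs inside the same class where credential_path, bucketName, storageClient and googleCredential live (the paths and bucket name are just examples):
// Sketch: wire the helpers together.
credential_path = @"C:\keys\api-key.json";
bucketName = "my-test-bucket";
if (SetStorageCredentials())
{
    string uploadedObject = UploadFile(@"C:\temp\report.pdf");
    MessageBox.Show("Uploaded: " + uploadedObject);
}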
You'll be happy to know it still works in 2016...
I was googling all over using fancy key words like "google gcp C# upload image", until I just plain asked the question: "How do I upload an image to google bucket using C#"... and here I am. I removed the .Result in the user credential, and this was the final edit that worked for me.
// ******
static string bucketForImage = ConfigurationManager.AppSettings["testStorageName"];
static string projectName = ConfigurationManager.AppSettings["GCPProjectName"];
string gcpPath = Path.Combine(Server.MapPath("~/Images/Gallery/"), uniqueGcpName + ext);
var clientSecrets = new ClientSecrets();
clientSecrets.ClientId = ConfigurationManager.AppSettings["GCPClientID"];
clientSecrets.ClientSecret = ConfigurationManager.AppSettings["GCPClientSc"];
var scopes = new[] { @"https://www.googleapis.com/auth/devstorage.full_control" };
var cts = new CancellationTokenSource();
var userCredential = await GoogleWebAuthorizationBroker.AuthorizeAsync(clientSecrets, scopes, ConfigurationManager.AppSettings["GCPAccountEmail"], cts.Token);
var service = new Google.Apis.Storage.v1.StorageService();
var bucketToUpload = bucketForImage;
var newObject = new Google.Apis.Storage.v1.Data.Object()
{
Bucket = bucketToUpload,
Name = bkFileName
};
FileStream fileStream = null;
try
{
fileStream = new FileStream(gcpPath, FileMode.Open);
var uploadRequest = new Google.Apis.Storage.v1.ObjectsResource.InsertMediaUpload(service, newObject,
bucketToUpload, fileStream, "image/"+ ext);
uploadRequest.OauthToken = userCredential.Token.AccessToken;
await uploadRequest.UploadAsync();
}
catch (Exception ex)
{
Console.WriteLine(ex.Message);
}
finally
{
if (fileStream != null)
{
fileStream.Dispose();
}
}
// ******
Here is the link to their official C# example of ".NET Bookshelf App" using Google Cloud storage.
https://cloud.google.com/dotnet/docs/getting-started/using-cloud-storage
Source on github:
https://github.com/GoogleCloudPlatform/getting-started-dotnet/blob/master/aspnet/3-binary-data/Services/ImageUploader.cs
https://github.com/GoogleCloudPlatform/getting-started-dotnet/tree/master/aspnet/3-binary-data
Nuget
https://www.nuget.org/packages/Google.Cloud.Storage.V1/
Here are 2 examples that helped me to upload files to a bucket in Google Cloud Storage with Google.Cloud.Storage.V1 (not Google.Apis.Storage.v1):
Upload files to Google cloud storage using c#
Uploading .csv Files to Google Cloud Storage using C# .Net
I got both working on a C# Console Application just for testing purposes.
February 2021:
string _projectId = "YOUR-PROJECT-ID-GCP"; //ProjectID also present in the json file
GoogleCredential _credential = GoogleCredential.FromFile("credential-cloud-file-123418c9e06c.json");
/// <summary>
/// UploadFile to GCS Bucket
/// </summary>
/// <param name="bucketName"></param>
/// <param name="localPath">my-local-path/my-file-name</param>
/// <param name="objectName">my-file-name</param>
public void UploadFile(string bucketName, string localPath, string objectName)
{
var storage = StorageClient.Create(_credential);
using var fileStream = File.OpenRead(localPath);
storage.UploadObject(bucketName, objectName, null, fileStream);
Console.WriteLine($"Uploaded {objectName}.");
}
You get the credentials JSON file from the Google Cloud console, where you create a bucket under your project.
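The download direction is just as short with the same client; a sketch with the same credential field and example parameter names:
// Sketch: download an object to a local file.
public void DownloadFile(string bucketName, string objectName, string localDestinationPath)
{
    var storage = StorageClient.Create(_credential);
    using var outputFile = File.OpenWrite(localDestinationPath);
    storage.DownloadObject(bucketName, objectName, outputFile);
    Console.WriteLine($"Downloaded {objectName} to {localDestinationPath}.");
}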
Simple, with auth:
private void SaveFileToGoogleStorage(string path, string? fileName, string ext)
{
var filePath = Path.Combine(path, fileName + ext);
var gcCredentialsPath = Path.Combine(Environment.CurrentDirectory, "gc_sa_key.json");
Environment.SetEnvironmentVariable("GOOGLE_APPLICATION_CREDENTIALS", gcCredentialsPath);
var gcsStorage = StorageClient.Create();
using var f = File.OpenRead(filePath);
var objectName = Path.GetFileName(filePath);
gcsStorage.UploadObject(_bucketName, objectName, null, f);
Console.WriteLine($"Uploaded {objectName}.");
}

Append (n) to file names (not using system.io.file) to ensure unique names

I have a class that holds the name of a file, and the data of a file:
public class FileMeta
{
public string FileName { get; set; }
public byte[] FileData { get; set; }
}
I have a method that populates a collection of this class through an async download operation (Files are not coming from a local file system):
async Task<List<FileMeta>> ReturnFileData(IEnumerable<string> urls)
{
using (var client = new HttpClient())
{
var results = await Task.WhenAll(urls.Select(async url => new
{
FileName = Path.GetFileName(url),
FileData = await client.GetByteArrayAsync(url),
}));
return results.Select(result =>
new FileMeta
{
FileName = result.FileName,
FileData = result.FileData
}).ToList();
}
}
I am going to feed this List<FileMeta> into a ZipFile creator, and the ZipFile, like all file containers, needs unique file names.
Readability is important, and I would like to be able to do the following:
file.txt => file.txt
file.txt => file(1).txt
file.txt => file(2).txt
There are a number of examples on how to do this within the file system, but not with a simple object collection. (Using System.IO.File.Exists for example)
What's the best way to loop through this collection of objects and return a unique set of file names?
How far I've gotten
private List<FileMeta> EnsureUniqueFileNames(IEnumerable<FileMeta> fileMetas)
{
var returnList = new List<FileMeta>();
foreach (var file in fileMetas)
{
while (DoesFileNameExist(file.FileName, returnList))
{
//Append (n) in sequence until match is not found?
}
}
return returnList;
}
private bool DoesFileNameExist(string fileName, IEnumerable<FileMeta> fileMeta)
{
var fileNames = fileMeta.Select(file => file.FileName).ToList();
return fileNames.Contains(fileName);
}
You can try the following to increment the filenames:
private List<FileMeta> EnsureUniqueFileNames(IEnumerable<FileMeta> fileMetas)
{
    var returnedList = new List<FileMeta>();
    foreach (var file in fileMetas)
    {
        int count = 1;
        string originalFileName = file.FileName;
        while (returnedList.Any(fileMeta => fileMeta.FileName.Equals(file.FileName,
               StringComparison.OrdinalIgnoreCase)))
        {
            string fileNameOnly = Path.GetFileNameWithoutExtension(originalFileName);
            string extension = Path.GetExtension(originalFileName);
            file.FileName = string.Format("{0}({1}){2}", fileNameOnly, count, extension);
            count++;
        }
        returnedList.Add(file);
    }
    return returnedList;
}
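If the list gets large, rescanning it with Any for every file is O(n²). A dictionary keyed by name keeps it roughly linear while still producing file(1).txt, file(2).txt, and so on; this is an alternative sketch, not part of the original answer:
// Alternative sketch: track used names and the next suffix per original name.
private List<FileMeta> EnsureUniqueFileNames(IEnumerable<FileMeta> fileMetas)
{
    var usedNames = new Dictionary<string, int>(StringComparer.OrdinalIgnoreCase);
    var returnedList = new List<FileMeta>();
    foreach (var file in fileMetas)
    {
        string original = file.FileName;
        if (!usedNames.ContainsKey(original))
        {
            usedNames[original] = 1; // first time we see this name: keep it as-is
        }
        else
        {
            string name = Path.GetFileNameWithoutExtension(original);
            string extension = Path.GetExtension(original);
            string candidate;
            do
            {
                candidate = string.Format("{0}({1}){2}", name, usedNames[original], extension);
                usedNames[original]++;
            } while (usedNames.ContainsKey(candidate)); // also dodge literal "file(1).txt" inputs
            usedNames[candidate] = 1;
            file.FileName = candidate;
        }
        returnedList.Add(file);
    }
    return returnedList;
}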
As a side note, in your ReturnFileData, you're generating two lists, one of anonymous type and one of your actual FileMeta type. You can reduce the creation of the intermediate list. Actually, you don't need to await inside the method at all:
private Task<FileMeta[]> ReturnFileDataAsync(IEnumerable<string> urls)
{
var client = new HttpClient();
return Task.WhenAll(urls.Select(async url => new FileMeta
{
FileName = Path.GetFileName(url),
FileData = await client.GetByteArrayAsync(url),
}));
}
I made the return type FileMeta[] instead of List<FileMeta>, as the result is a fixed size anyway, which avoids calling ToList on the returned array. I also added the Async suffix, to follow the TAP guidelines.
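Putting the pieces together might then look like this (a sketch; building the actual ZipFile is out of scope here):
// Sketch: download, de-duplicate the names, then hand the list to whatever builds the archive.
var files = await ReturnFileDataAsync(urls);
var uniqueFiles = EnsureUniqueFileNames(files);
// uniqueFiles now has collision-free FileName values for the ZipFile creator.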
