Only upload updated files into Blob Storage - C#

I have a method that uploads XML files from a folder into Blob Storage. Connected to that Blob Storage I have a Blob Trigger that listens for changes, takes the files and then does a PUT request to a server. I have that part sorted out and working.
My problem is that when I want to update a specific file in the folder and run my code, all the files in the folder seem to be uploaded again and my Blob Trigger goes off, doing a PUT for all of them. I only want to do a PUT for the files that changed in the folder (except for the initial upload to the blob, of course).
The code I have so far is as basic as my level of experience; for the upload I followed a simple guide.
My code that uploads the file into the Blob Storage:
class Program
{
static void Main(string[] args)
{
Console.WriteLine("Starting...");
string accountName = ConfigurationManager.AppSettings["accountName"];
string accountKey = ConfigurationManager.AppSettings["accountKey"];
string localFolder = ConfigurationManager.AppSettings["mySourceFolder"];
string destContainer = ConfigurationManager.AppSettings["destContainer"];
//block on the async setup so the console app waits for the uploads to finish
var stringReturned = BlobSetup(accountName, accountKey, localFolder, destContainer).GetAwaiter().GetResult();
Console.WriteLine(stringReturned);
Console.Read();
}
static async Task UploadBlob(CloudBlobContainer container, string key, string filePath, bool deleteAfter)
{
//Get a blob reference to write this file to
var blob = container.GetBlockBlobReference(key);
await blob.UploadFromFileAsync(filePath);
Console.WriteLine("Uploaded {0}", filePath);
//if delete of file is requested, do that
if (deleteAfter)
{
File.Delete(filePath);
}
}
static async Task<string> BlobSetup(string accountName, string accountKey, string localFolder, string destContainer)
{
var storageAccount = new CloudStorageAccount(new StorageCredentials(accountName, accountKey), true);
var blobClient = storageAccount.CreateCloudBlobClient();
var container = blobClient.GetContainerReference(destContainer);
//create container if not exists
await container.CreateIfNotExistsAsync();
await container.SetPermissionsAsync(new BlobContainerPermissions()
{
PublicAccess = BlobContainerPublicAccessType.Blob
});
string[] fileEntries = Directory.GetFiles(localFolder);
foreach (string filePath in fileEntries)
{
//Handle only json and xml?
if(filePath.EndsWith(".json") || filePath.EndsWith(".xml"))
{
string keys = Path.GetFileName(filePath);
await UploadBlob(container, keys, filePath, false);
}
}
return "some response";
}
}
My Blob Trigger that does the PUT:
public static class BlobTriggerExample
{
const string serverUrl= "theurl";
static HttpClient client = new HttpClient();
[FunctionName("BlobTriggerExample")]
public static async Task Run([BlobTrigger("myblob/{name}", Connection = "AzureWebJobsStorage")]CloudBlockBlob myBlob, string name, TraceWriter log)
{
string putUrlString = "";
string idValue = "";
XDocument xdoc = new XDocument();
myBlob.StreamMinimumReadSizeInBytes = 20 * 1024 * 1024;
await myBlob.FetchAttributesAsync();
//Read stream
var blobStream = await myBlob.OpenReadAsync();
xdoc = new XDocument(XDocument.Load(blobStream));
//Read root node(resourceType)
string resourceType = xdoc.Root.Name.LocalName;
//Get id value
idValue = xdoc.Descendants().Where(x => x.Name.LocalName == "id").First().LastAttribute.Value;
//Build redirect string
putUrlString = serverUrl + resourceType + "/" + idValue;
//PUT
var httpContent = new StringContent(xdoc.ToString(), Encoding.UTF8, "application/xml");
var response = await client.PutAsync(putUrlString, httpContent);
Console.WriteLine($"Response: {response}");
Console.Read();
log.Info($"C# Blob trigger function Processed blob\n Name:{name} \n Size: {myBlob.StreamWriteSizeInBytes} Bytes : Response message: {response}");
}
}
My guess is that I want to control which files I upload into Blob Storage by checking whether the exact same file already exists, or maybe I should make some sort of check in the Blob Trigger before doing the PUT?
The file names in the folder I'm uploading from are always the same (a must), even though some of the content might have changed.
Could anyone be so kind as to give me some guidelines on how to approach this? I have been googling around for hours and I'm getting nowhere.

Yes, your code loops through and uploads all the files in your local folder. The blob trigger just sees that the blobs have been written and has no concept of whether or not their content has changed (or whether that matters) so it also processes all of them.
What you need to do is to compare your local files against the ones in blob storage before you upload to see whether they're new versions or not, so in your UploadBlob method you want something along the lines of
// Get a blob reference to write this file to
var blob = container.GetBlockBlobReference(key);
// If the blob already exists
if (await blob.ExistsAsync()) {
// Fetch the blob's properties
await blob.FetchAttributesAsync();
// Only proceed if modification time of local file is newer
if (blob.Properties.LastModified > File.GetLastWriteTimeUtc(filePath))
return;
}
If checking the modification time isn't enough then you can also attach your own metadata (e.g. a checksum) to blobs and use that for the comparison - see https://learn.microsoft.com/en-us/azure/storage/blobs/storage-properties-metadata.
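For example, a minimal sketch of the checksum approach, using the same WindowsAzure.Storage types as the code above (the "localmd5" metadata key is just an invented name; any stable hash and key would do):
static async Task UploadBlobIfChanged(CloudBlobContainer container, string key, string filePath)
{
    // Compute a checksum of the local file
    string localHash;
    using (var md5 = System.Security.Cryptography.MD5.Create())
    using (var fs = File.OpenRead(filePath))
        localHash = Convert.ToBase64String(md5.ComputeHash(fs));
    var blob = container.GetBlockBlobReference(key);
    if (await blob.ExistsAsync())
    {
        // Compare against the checksum stored when the blob was last uploaded
        await blob.FetchAttributesAsync();
        if (blob.Metadata.TryGetValue("localmd5", out var storedHash) && storedHash == localHash)
            return; // unchanged - skip the upload so the trigger doesn't fire
    }
    await blob.UploadFromFileAsync(filePath);
    blob.Metadata["localmd5"] = localHash;
    await blob.SetMetadataAsync();
}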

Below is a version we use in production. Note that blob metadata is used instead of the blob's last-modified property, for better flexibility and reliability. It might not fit everybody's needs, but it is one way of doing it that has been tested and works.
for (var i = 0; i < files.Length; i++)
{
var filePath = files[i];
//maintains folder structure
var blobName = filePath.Replace(selectedBuildPath, "").Replace("\\", "/");
var blobClient = GetNewBlobClientWithMaxRetry(blobName, containerClient);
// If the blob already exists, get the last modified tick count in the blobs metadata
long blobLastModifiedTick = 0;
if (await blobClient.ExistsAsync())
{
try
{
var blobProperties = (await blobClient.GetPropertiesAsync()).Value;
//this line will fail if lastmodifiedticks was not set before
blobLastModifiedTick = long.Parse(blobProperties.Metadata[_lastmodifiedticks]);
}
//if fail, blob last tick will be 0, aka outdated
catch { }
}
//if local file is latest, then upload else skip
var localLastModified = File.GetLastWriteTimeUtc(filePath);
var localIsNewer = blobLastModifiedTick < localLastModified.Ticks;
if (localIsNewer)
{
//print name 1st to know which gets stuck
//extra space to delete extra prev
Console.Write($"\r{i}/{files.Length} Uploading:{blobName} ");
//note will overwrite existing
await using var fs = File.Open(filePath, FileMode.Open);
await blobClient.UploadAsync(fs, overwrite: true, CancellationToken.None);
//auto correct content type from wrongly set "octet/stream"
var blobHttpHeaders = new BlobHttpHeaders { ContentType = MimeTypeMap.GetMimeType(filePath) };
await blobClient.SetHttpHeadersAsync(blobHttpHeaders);
await blobClient.SetMetadataAsync(new Dictionary<string, string>() { { _lastmodifiedticks, localLastModified.Ticks.ToString() } });
}
else
{
Console.Write($"\r{i}/{files.Length} Skipped:{blobName} "); //space to delete extra previous line
}
}
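The GetNewBlobClientWithMaxRetry helper and the _lastmodifiedticks key aren't shown above; a rough sketch of what they could look like with Azure.Storage.Blobs (the _connectionString field and the retry numbers are assumptions):
const string _lastmodifiedticks = "lastmodifiedticks";
static BlobClient GetNewBlobClientWithMaxRetry(string blobName, BlobContainerClient containerClient)
{
    // Build a client with a more patient retry policy than the default
    var options = new BlobClientOptions();
    options.Retry.MaxRetries = 10;
    options.Retry.Mode = Azure.Core.RetryMode.Exponential;
    options.Retry.Delay = TimeSpan.FromSeconds(2);
    return new BlobClient(_connectionString, containerClient.Name, blobName, options);
}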

Related

Reading and copying large files/blobs without storing them in memory stream in C#

Below is the code which reads blobs from my blob storage and then copies the contents into table storage. Everything works fine now, but I know that if my file is too big this reading and copying will fail. I would like to know how to handle this ideally - should we write the file to a temporary location instead of holding it in memory? If yes, can someone give me an example or show me how to do it in my existing code below?
public async Task<Stream> ReadStream(string containerName, string digestFileName, string fileName, string connectionString)
{
string data = string.Empty;
string fileExtension = Path.GetExtension(fileName);
var contents = await DownloadBlob(containerName, digestFileName, connectionString);
return contents;
}
public async Task<Stream> DownloadBlob(string containerName, string fileName, string connectionString)
{
Microsoft.Azure.Storage.CloudStorageAccount storageAccount = Microsoft.Azure.Storage.CloudStorageAccount.Parse(connectionString);
CloudBlobClient serviceClient = storageAccount.CreateCloudBlobClient();
CloudBlobContainer container = serviceClient.GetContainerReference(containerName);
CloudBlockBlob blob = container.GetBlockBlobReference(fileName);
if (!blob.Exists())
{
throw new Exception($"Unable to upload data in table store for document");
}
return await blob.OpenReadAsync();
}
private IEnumerable<Dictionary<string, EntityProperty>> ReadCSV(Stream source, IEnumerable<TableField> cols)
{
using (TextReader reader = new StreamReader(source, Encoding.UTF8))
{
var cache = new TypeConverterCache();
cache.AddConverter<float>(new CSVSingleConverter());
cache.AddConverter<double>(new CSVDoubleConverter());
var csv = new CsvReader(reader,
new CsvHelper.Configuration.CsvConfiguration(global::System.Globalization.CultureInfo.InvariantCulture)
{
Delimiter = ";",
HasHeaderRecord = true,
CultureInfo = global::System.Globalization.CultureInfo.InvariantCulture,
TypeConverterCache = cache
});
csv.Read();
csv.ReadHeader();
var map = (
from col in cols
from src in col.Sources()
let index = csv.GetFieldIndex(src, isTryGet: true)
where index != -1
select new { col.Name, Index = index, Type = col.DataType }).ToList();
while (csv.Read())
{
yield return map.ToDictionary(
col => col.Name,
col => EntityProperty.CreateEntityPropertyFromObject(csv.GetField(col.Type, col.Index)));
}
}
}
At your insistence that CsvHelper is incapable of reading from a stream connected to a blob, I threw something together:
WinForms core app (3.1)
CsvHelper latest (19)
Azure.Storage.Blobs (12.8)
A CSV from my disk, uploaded to my blob storage; in my debugger it reads record CAf255 OK, both by Read/GetRecord and by EnumerateRecords (screenshots omitted).
Using this code:
private async void button1_Click(object sender, EventArgs e)
{
var cstr = "MY CONNECTION STRING HERE";
var bbc = new BlockBlobClient(cstr, "temp", "call.csv");
var s = await bbc.OpenReadAsync(new BlobOpenReadOptions(true) { BufferSize = 16384 });
var sr = new StreamReader(s);
var csv = new CsvHelper.CsvReader(sr, new CsvConfiguration(CultureInfo.CurrentCulture) { HasHeaderRecord = true });
var x = new X();
//try by read/getrecord (breakpoint and skip over it if you want to try the other way)
while(await csv.ReadAsync())
{
var rec = csv.GetRecord<X>();
Console.WriteLine(rec.Sid);
}
//try by await foreach
await foreach (var r in csv.EnumerateRecordsAsync(x))
{
Console.WriteLine(r.Sid);
}
}
Oh, and the class that represents a CSV record in my app (I only modeled one property, Sid, to prove the concept):
class X {
public string Sid{ get; set; }
}
Maybe dial things back a bit and start simple: one string prop in your CSV, no yielding etc., just get the file reading in OK. I didn't bother with all the header faffing either - it seems to just work by saying "file has headers" in the options. You can see in my debugger an instance of X with a correctly populated Sid property showing the first value; I ran some more loops and they populated OK too.
I imagine it could look something like this (modify your ReadCSV to take a stream, not lines):
private IEnumerable<Dictionary<string, EntityProperty>> ReadCSV(Stream source, IEnumerable<TableField> cols)
{
using (TextReader reader = new StreamReader(source))
And this (modify your DownloadBlob to instead return a stream):
public async Task<Stream> GetBlobStream(string containerName, string fileName, string connectionString)
{
Microsoft.Azure.Storage.CloudStorageAccount storageAccount = Microsoft.Azure.Storage.CloudStorageAccount.Parse(connectionString);
CloudBlobClient serviceClient = storageAccount.CreateCloudBlobClient();
CloudBlobContainer container = serviceClient.GetContainerReference(containerName);
CloudBlockBlob blob = container.GetBlockBlobReference(fileName);
if (!blob.Exists())
{
throw ...
}
return await blob.OpenReadAsync();
}
And then wire them together:
var stream = await GetBlobStream(...);
ReadCSV(stream, ...);
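To make that wiring concrete, a rough sketch of the whole pipe, assuming the same table SDK that provides EntityProperty; the DynamicTableEntity partition/row keys are invented placeholders:
public async Task CopyBlobCsvToTableAsync(CloudTable table, string containerName, string fileName,
    string connectionString, IEnumerable<TableField> cols)
{
    // Stream the blob instead of buffering the whole file in memory
    var stream = await GetBlobStream(containerName, fileName, connectionString);
    foreach (var row in ReadCSV(stream, cols))
    {
        // Placeholder keys - choose partition/row keys that fit your data
        var entity = new DynamicTableEntity(fileName, Guid.NewGuid().ToString())
        {
            Properties = row
        };
        await table.ExecuteAsync(TableOperation.InsertOrReplace(entity));
    }
}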

How to upload a file to blob storage and reference it from a CosmosDB document attachment

I'm new to CosmosDB and Azure Blob Storage. As a requirement, I need to reference a file uploaded to Azure Blob Storage from a document in CosmosDB and use it in the attachment section to save the metadata.
I know the JSON metadata structure should look like this:
{
"id":"image13d65101-90c4-4c2a-a423-fbf221c73233",
"contentType":"image/jpg",
"media":"www.bing.com",
"_rid":"rnYYAMVFUAUBAAAAAAAAAEC+LNM=",
"_ts":1408056025,
"_self":"dbs\/rnYYAA==\/colls\/rnYYAMVFUAU=\/docs\/rnYYAMVFUAUBAAAAAAAAAA==\/attachments\/rnYYAMVFUAUBAAAAAAAAAEC+LNM=",
"_etag":"00002a00-0000-0000-0000-53ed3ad90000"
}
But how can I get a reference for the media property when uploading the file to Azure Blob Storage? More precisely, how do I upload a file from C# to Azure Blob Storage and reference it by setting its URL as the media property?
Here is sample code from one of my applications that uploads to blob storage and then saves the reference in CosmosDB as a URI:
public async Task<IActionResult> UploadImageAsync([FromBody] ImageUploadRequest imageRequest)
{
if (string.IsNullOrEmpty(imageRequest?.Base64))
{
return BadRequest();
}
var tokens = imageRequest.Base64.Split(',');
var ctype = tokens[0].Replace("data:", "");
var base64 = tokens[1];
var content = Convert.FromBase64String(base64);
// Upload photo to storage...
var blobUri = await UploadImageToStorage(content);
// Then create a Document in CosmosDb to notify our Function
var identifier = await UploadDocument(blobUri, imageRequest.Name ?? "Bob");
return Ok(identifier);
}
private async Task<Guid> UploadDocument(Uri uri, string petName)
{
var endpoint = new Uri(_settings.ImageConfig.CosmosUri);
var auth = _settings.ImageConfig.CosmosKey;
var client = new DocumentClient(endpoint, auth);
var identifier = Guid.NewGuid();
await client.CreateDatabaseIfNotExistsAsync(new Database() { Id = dbName });
await client.CreateDocumentCollectionIfNotExistsAsync(UriFactory.CreateDatabaseUri(dbName),
new DocumentCollection { Id = colName });
await client.CreateDocumentAsync(
UriFactory.CreateDocumentCollectionUri(dbName, colName),
new ImageDocument
{
Id = identifier,
IsApproved = null,
PetName = petName,
MediaUrl = uri.ToString(),
Created = DateTime.UtcNow
});
return identifier;
}
private async Task<Uri> UploadImageToStorage(byte[] content)
{
var storageName = _settings.PetsConfig.BlobName;
var auth = _settings.PetsConfig.BlobKey;
var uploader = new PhotoUploader(storageName, auth);
var blob = await uploader.UploadPetPhoto(content);
return blob.Uri;
}
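The PhotoUploader type used in UploadImageToStorage isn't shown in the answer; a minimal sketch of what it might look like with the classic WindowsAzure.Storage SDK (the container name and blob naming here are invented):
public class PhotoUploader
{
    readonly CloudBlobContainer _container;
    public PhotoUploader(string accountName, string accountKey)
    {
        var account = new CloudStorageAccount(new StorageCredentials(accountName, accountKey), true);
        _container = account.CreateCloudBlobClient().GetContainerReference("pet-photos");
    }
    public async Task<CloudBlockBlob> UploadPetPhoto(byte[] content)
    {
        await _container.CreateIfNotExistsAsync();
        // Use a unique name so repeated uploads never collide
        var blob = _container.GetBlockBlobReference($"{Guid.NewGuid()}.jpg");
        blob.Properties.ContentType = "image/jpeg";
        await blob.UploadFromByteArrayAsync(content, 0, content.Length);
        return blob;
    }
}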
"how do I upload a file from c# to azure blob storage and reference it by setting the url to the media property."
You can refer to the article Setting and retrieving metadata: you should fetch the blob again after uploading it.
Sample code like this:
//the code to get the blob again after uploading.
var blockblob = blobContainer.GetBlockBlobReference(blobname);
//the code to set medadata.
blockblob.FetchAttributes();
blockblob.Metadata["media"] = "www.bing.com";
blockblob.SetMetadata();
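If the goal is specifically the attachment document from the question (with the blob URL in the media property), the DocumentDB SDK can create that attachment directly. A rough sketch, assuming the DocumentClient, dbName/colName and the identifier from the first answer above are in scope:
// Reference the uploaded blob from a CosmosDB attachment; MediaLink maps to the "media" property
var attachment = new Attachment
{
    Id = "image" + Guid.NewGuid(),
    ContentType = "image/jpg",
    MediaLink = blobUri.ToString()
};
var documentLink = UriFactory.CreateDocumentUri(dbName, colName, identifier.ToString());
await client.CreateAttachmentAsync(documentLink, attachment);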

Web API Upload Files

I have some data to save into a database.
I have created a Web API POST method to save the data. Following is my POST method:
[Route("PostRequirementTypeProcessing")]
public IEnumerable<NPAAddRequirementTypeProcessing> PostRequirementTypeProcessing(mdlAddAddRequirementTypeProcessing requTypeProcess)
{
mdlAddAddRequirementTypeProcessing rTyeProcessing = new mdlAddAddRequirementTypeProcessing();
rTyeProcessing.szDescription = requTypeProcess.szDescription;
rTyeProcessing.iRequirementTypeId = requTypeProcess.iRequirementTypeId;
rTyeProcessing.szRequirementNumber = requTypeProcess.szRequirementNumber;
rTyeProcessing.szRequirementIssuer = requTypeProcess.szRequirementIssuer;
rTyeProcessing.szOrganization = requTypeProcess.szOrganization;
rTyeProcessing.dIssuedate = requTypeProcess.dIssuedate;
rTyeProcessing.dExpirydate = requTypeProcess.dExpirydate;
rTyeProcessing.szSignedBy = requTypeProcess.szSignedBy;
rTyeProcessing.szAttachedDocumentNo = requTypeProcess.szAttachedDocumentNo;
if (String.IsNullOrEmpty(rTyeProcessing.szAttachedDocumentNo))
{
}
else
{
UploadFile();
}
rTyeProcessing.szSubject = requTypeProcess.szSubject;
rTyeProcessing.iApplicationDetailsId = requTypeProcess.iApplicationDetailsId;
rTyeProcessing.iEmpId = requTypeProcess.iEmpId;
NPAEntities context = new NPAEntities();
Log.Debug("PostRequirementTypeProcessing Request traced");
var newRTP = context.NPAAddRequirementTypeProcessing(requTypeProcess.szDescription, requTypeProcess.iRequirementTypeId,
requTypeProcess.szRequirementNumber, requTypeProcess.szRequirementIssuer, requTypeProcess.szOrganization,
requTypeProcess.dIssuedate, requTypeProcess.dExpirydate, requTypeProcess.szSignedBy,
requTypeProcess.szAttachedDocumentNo, requTypeProcess.szSubject, requTypeProcess.iApplicationDetailsId,
requTypeProcess.iEmpId);
return newRTP.ToList();
}
There is a field called 'szAttachedDocumentNo', which is a document that's being saved in the database as well.
After saving all the data, I want the physical file for 'szAttachedDocumentNo' to be saved on the server, so I created a method called "UploadFile" as follows:
[HttpPost]
public void UploadFile()
{
if (HttpContext.Current.Request.Files.AllKeys.Any())
{
// Get the uploaded file from the Files collection
var httpPostedFile = HttpContext.Current.Request.Files["UploadedFile"];
if (httpPostedFile != null)
{
// Validate the uploaded image(optional)
string folderPath = HttpContext.Current.Server.MapPath("~/UploadedFiles");
//string folderPath1 = Convert.ToString(ConfigurationManager.AppSettings["DocPath"]);
//Directory not exists then create new directory
if (!Directory.Exists(folderPath))
{
Directory.CreateDirectory(folderPath);
}
// Get the complete file path
var fileSavePath = Path.Combine(folderPath, httpPostedFile.FileName);
// Save the uploaded file to "UploadedFiles" folder
httpPostedFile.SaveAs(fileSavePath);
}
}
}
Before running the project, I debugged the POST method, so when it comes to the "UploadFile" line, it takes me into that method.
From the Files line, it skipped the remaining lines and went to the last one, which means it didn't see any file.
I am able to save everything to the database; it's just that I don't see the physical file in the specified location.
Any help would be much appreciated.
Make sure the request's "content-type": "multipart/form-data" header is set:
[HttpPost()]
public async Task<IHttpActionResult> UploadFile()
{
if (!Request.Content.IsMimeMultipartContent())
throw new HttpResponseException(HttpStatusCode.UnsupportedMediaType);
try
{
MultipartMemoryStreamProvider provider = new MultipartMemoryStreamProvider();
await Request.Content.ReadAsMultipartAsync(provider);
if (provider.Contents == null || provider.Contents.Count == 0)
{
return BadRequest("No files provided.");
}
foreach (HttpContent file in provider.Contents)
{
string filename = file.Headers.ContentDisposition.FileName.Trim('\"');
byte[] buffer = await file.ReadAsByteArrayAsync();
using (MemoryStream stream = new MemoryStream(buffer))
{
// save the file wherever you want
}
}
return Ok("files Uploded");
}
catch (Exception ex)
{
return InternalServerError(ex);
}
}
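For reference, a client-side call that satisfies that requirement could look like this (the endpoint URL, file path and the "UploadedFile" field name are assumptions that should match your route and server-side code):
static async Task UploadAsync()
{
    using (var client = new HttpClient())
    using (var form = new MultipartFormDataContent())
    {
        var fileContent = new ByteArrayContent(File.ReadAllBytes(@"C:\temp\document.pdf"));
        fileContent.Headers.ContentType = MediaTypeHeaderValue.Parse("application/pdf");
        // The name/filename pair ends up in the Content-Disposition header the server reads
        form.Add(fileContent, "UploadedFile", "document.pdf");
        var response = await client.PostAsync("http://localhost/api/UploadFile", form);
        response.EnsureSuccessStatusCode();
    }
}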

Uploading objects to Google Cloud Storage buckets in C#

Can someone please provide an example of how to use Google.Apis.Storage.v1 for uploading files to Google Cloud Storage in C#?
I found that this basic operation is not as straightforward as you might expect. Google's documentation about its Storage API is lacking information about using it in C# (or any other .NET language). Searching for 'how to upload a file to google cloud storage in c#' didn't exactly help me, so here is my working solution, with some comments:
Preparation:
You need to create an OAuth2 account in your Google Developers Console - go to Project/APIs & auth/Credentials.
Copy Client ID & Client Secret to your code. You will also need your Project name.
Code (it assumes that you've added Google.Apis.Storage.v1 via NuGet):
First, you need to authorize your requests:
var clientSecrets = new ClientSecrets();
clientSecrets.ClientId = clientId;
clientSecrets.ClientSecret = clientSecret;
//there are different scopes, which you can find here https://cloud.google.com/storage/docs/authentication
var scopes = new[] {@"https://www.googleapis.com/auth/devstorage.full_control"};
var cts = new CancellationTokenSource();
var userCredential = await GoogleWebAuthorizationBroker.AuthorizeAsync(clientSecrets, scopes, "yourGoogle@email", cts.Token);
Sometimes you might also want to refresh authorization token via:
await userCredential.RefreshTokenAsync(cts.Token);
You also need to create Storage Service:
var service = new Google.Apis.Storage.v1.StorageService();
Now you can make requests to Google Storage API.
Let's start with creating a new bucket:
var newBucket = new Google.Apis.Storage.v1.Data.Bucket()
{
Name = "your-bucket-name-1"
};
var newBucketQuery = service.Buckets.Insert(newBucket, projectName);
newBucketQuery.OauthToken = userCredential.Result.Token.AccessToken;
//you probably want to wrap this into try..catch block
newBucketQuery.Execute();
And it's done. Now, you can send a request to get list of all of your buckets:
var bucketsQuery = service.Buckets.List(projectName);
bucketsQuery.OauthToken = userCredential.Result.Token.AccessToken;
var buckets = bucketsQuery.Execute();
Last part is uploading new file:
//enter bucket name to which you want to upload file
var bucketToUpload = buckets.Items.FirstOrDefault().Name;
var newObject = new Object()
{
Bucket = bucketToUpload,
Name = "some-file-"+new Random().Next(1,666)
};
FileStream fileStream = null;
try
{
var dir = Directory.GetCurrentDirectory();
var path = Path.Combine(dir, "test.png");
fileStream = new FileStream(path, FileMode.Open);
var uploadRequest = new Google.Apis.Storage.v1.ObjectsResource.InsertMediaUpload(service, newObject,
bucketToUpload,fileStream,"image/png");
uploadRequest.OauthToken = userCredential.Result.Token.AccessToken;
await uploadRequest.UploadAsync();
}
catch (Exception ex)
{
Console.WriteLine(ex.Message);
}
finally
{
if (fileStream != null)
{
fileStream.Dispose();
}
}
And bam! The new file will be visible in your Google Developers Console inside the selected bucket.
You can use the Google Cloud APIs without the SDK in the following way:
An api-key.json file is required.
Install the package Google.Apis.Auth.OAuth2 in order to authorize the HTTP web request.
You can then set the default configuration for your application as shown below.
I did the same using a .NET Core Web API; the details are given below:
Url details:
"GoogleCloudStorageBaseUrl": "https://www.googleapis.com/upload/storage/v1/b/",
"GoogleSpeechBaseUrl": "https://speech.googleapis.com/v1/operations/",
"GoogleLongRunningRecognizeBaseUrl": "https://speech.googleapis.com/v1/speech:longrunningrecognize",
"GoogleCloudScope": "https://www.googleapis.com/auth/cloud-platform",
public void GetConfiguration()
{
// Set global configuration
bucketName = _configuration.GetValue<string>("BucketName");
googleCloudStorageBaseUrl = _configuration.GetValue<string>("GoogleCloudStorageBaseUrl");
googleSpeechBaseUrl = _configuration.GetValue<string>("GoogleSpeechBaseUrl");
googleLongRunningRecognizeBaseUrl = _configuration.GetValue<string>("GoogleLongRunningRecognizeBaseUrl");
// Set google cloud credentials
string googleApplicationCredentialsPath = _configuration.GetValue<string>("GoogleCloudCredentialPath");
using (Stream stream = new FileStream(googleApplicationCredentialsPath, FileMode.Open, FileAccess.Read))
googleCredential = GoogleCredential.FromStream(stream).CreateScoped(_configuration.GetValue<string>("GoogleCloudScope"));
}
Get OAuth token:
public string GetOAuthToken()
{
return googleCredential.UnderlyingCredential.GetAccessTokenForRequestAsync("https://accounts.google.com/o/oauth2/v2/auth", CancellationToken.None).Result;
}
To upload a file to the cloud bucket:
public async Task<string> UploadMediaToCloud(string filePath, string objectName = null)
{
string bearerToken = GetOAuthToken();
byte[] fileBytes = File.ReadAllBytes(filePath);
objectName = objectName ?? Path.GetFileName(filePath);
var baseUrl = new Uri(string.Format(googleCloudStorageBaseUrl + "" + bucketName + "/o?uploadType=media&name=" + objectName + ""));
using (WebClient client = new WebClient())
{
client.Headers.Add(HttpRequestHeader.Authorization, "Bearer " + bearerToken);
client.Headers.Add(HttpRequestHeader.ContentType, "application/octet-stream");
byte[] response = await Task.Run(() => client.UploadData(baseUrl, "POST", fileBytes));
string responseInString = Encoding.UTF8.GetString(response);
return responseInString;
}
}
To perform any other action against the Cloud Storage API, you just make the corresponding HttpClient GET/POST request as required.
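As one example of such a request, downloading the same object back could be a plain GET with alt=media (the URL shape is taken from the JSON API; adjust it if your base URL differs):
public async Task DownloadMediaFromCloud(string objectName, string localPath)
{
    string bearerToken = GetOAuthToken();
    // alt=media returns the object contents instead of its metadata
    var url = "https://storage.googleapis.com/storage/v1/b/" + bucketName +
              "/o/" + Uri.EscapeDataString(objectName) + "?alt=media";
    using (var client = new HttpClient())
    {
        client.DefaultRequestHeaders.Authorization =
            new System.Net.Http.Headers.AuthenticationHeaderValue("Bearer", bearerToken);
        byte[] bytes = await client.GetByteArrayAsync(url);
        File.WriteAllBytes(localPath, bytes);
    }
}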
This is for Google.Cloud.Storage.V1 (not Google.Apis.Storage.v1), but it appears to be a bit simpler to perform an upload now. I started with the client libraries' "Getting Started" instructions to create a service account and bucket, then experimented to find out how to upload an image.
The process I followed was:
Sign up for Google Cloud free trial
Create a new project in Google Cloud (remember the project name/ID for later)
Create a Project Owner service account - this will result in a json file being downloaded that contains the service account credentials. Remember where you put that file.
The getting started docs get you to add the path to the JSON credentials file into an environment variable called GOOGLE_APPLICATION_CREDENTIALS - I couldn't get this to work through the provided instructions. Turns out it is not required, as you can just read the JSON file into a string and pass it to the client constructor.
I created an empty WPF project as a starting point, and a single ViewModel to house the application logic.
Install the Google.Cloud.Storage.V1 nuget package and it should pull in all the dependencies it needs.
Onto the code.
MainWindow.xaml
<StackPanel>
<Button
Margin="50"
Height="50"
Content="BEGIN UPLOAD"
Click="OnButtonClick" />
<ContentControl
Content="{Binding Path=ProgressBar}" />
</StackPanel>
MainWindow.xaml.cs
public partial class MainWindow
{
readonly ViewModel _viewModel;
public MainWindow()
{
_viewModel = new ViewModel(Dispatcher);
DataContext = _viewModel;
InitializeComponent();
}
void OnButtonClick(object sender, RoutedEventArgs args)
{
_viewModel.UploadAsync().ConfigureAwait(false);
}
}
ViewModel.cs
public class ViewModel
{
readonly Dispatcher _dispatcher;
public ViewModel(Dispatcher dispatcher)
{
_dispatcher = dispatcher;
ProgressBar = new ProgressBar {Height=30};
}
public async Task UploadAsync()
{
// Google Cloud Platform project ID.
const string projectId = "project-id-goes-here";
// The name for the new bucket.
const string bucketName = projectId + "-test-bucket";
// Path to the file to upload
const string filePath = @"C:\path\to\image.jpg";
var newObject = new Google.Apis.Storage.v1.Data.Object
{
Bucket = bucketName,
Name = System.IO.Path.GetFileNameWithoutExtension(filePath),
ContentType = "image/jpeg"
};
// read the JSON credential file saved when you created the service account
var credential = Google.Apis.Auth.OAuth2.GoogleCredential.FromJson(System.IO.File.ReadAllText(
#"c:\path\to\service-account-credentials.json"));
// Instantiates a client.
using (var storageClient = Google.Cloud.Storage.V1.StorageClient.Create(credential))
{
try
{
// Creates the new bucket. Only required the first time.
// You can also create buckets through the GCP cloud console web interface
storageClient.CreateBucket(projectId, bucketName);
System.Windows.MessageBox.Show($"Bucket {bucketName} created.");
// Open the image file filestream
using (var fileStream = new System.IO.FileStream(filePath, System.IO.FileMode.Open))
{
ProgressBar.Maximum = fileStream.Length;
// set minimum chunksize just to see progress updating
var uploadObjectOptions = new Google.Cloud.Storage.V1.UploadObjectOptions
{
ChunkSize = Google.Cloud.Storage.V1.UploadObjectOptions.MinimumChunkSize
};
// Hook up the progress callback
var progressReporter = new Progress<Google.Apis.Upload.IUploadProgress>(OnUploadProgress);
await storageClient.UploadObjectAsync(
newObject,
fileStream,
uploadObjectOptions,
progress: progressReporter)
.ConfigureAwait(false);
}
}
catch (Google.GoogleApiException e)
when (e.Error.Code == 409)
{
// When creating the bucket - The bucket already exists. That's fine.
System.Windows.MessageBox.Show(e.Error.Message);
}
catch (Exception e)
{
// other exception
System.Windows.MessageBox.Show(e.Message);
}
}
}
// Called when progress updates
void OnUploadProgress(Google.Apis.Upload.IUploadProgress progress)
{
switch (progress.Status)
{
case Google.Apis.Upload.UploadStatus.Starting:
ProgressBar.Minimum = 0;
ProgressBar.Value = 0;
break;
case Google.Apis.Upload.UploadStatus.Completed:
ProgressBar.Value = ProgressBar.Maximum;
System.Windows.MessageBox.Show("Upload completed");
break;
case Google.Apis.Upload.UploadStatus.Uploading:
UpdateProgressBar(progress.BytesSent);
break;
case Google.Apis.Upload.UploadStatus.Failed:
System.Windows.MessageBox.Show("Upload failed"
+ Environment.NewLine
+ progress.Exception);
break;
}
}
void UpdateProgressBar(long value)
{
_dispatcher.Invoke(() => { ProgressBar.Value = value; });
}
// probably better to expose progress value directly and bind to
// a ProgressBar in the XAML
public ProgressBar ProgressBar { get; }
}
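As the closing comment in the view model suggests, a cleaner option is to expose plain progress values and let the XAML bind to them. A minimal sketch of that idea (the class and property names here are made up):
// Bind in XAML with: <ProgressBar Value="{Binding Progress}" Maximum="{Binding ProgressMax}" />
public class UploadProgressViewModel : System.ComponentModel.INotifyPropertyChanged
{
    double _progress;
    public double ProgressMax { get; set; }
    public double Progress
    {
        get => _progress;
        set { _progress = value; OnPropertyChanged(nameof(Progress)); }
    }
    public event System.ComponentModel.PropertyChangedEventHandler PropertyChanged;
    void OnPropertyChanged(string name) =>
        PropertyChanged?.Invoke(this, new System.ComponentModel.PropertyChangedEventArgs(name));
}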
Using the SDK (Google.Cloud.Storage.V1) to upload files to Google Cloud Storage in C#:
An api-key.json file is required.
Install the packages Google.Cloud.Storage.V1 and Google.Apis.Auth.OAuth2.
The code to upload the file to the cloud is given below:
private string UploadFile(string localPath, string objectName = null)
{
string projectId = ((Google.Apis.Auth.OAuth2.ServiceAccountCredential)googleCredential.UnderlyingCredential).ProjectId;
try
{
// Creates the new bucket.
var objResult = storageClient.CreateBucket(projectId, bucketName);
if (!string.IsNullOrEmpty(objResult.Id))
{
// Upload file to google cloud server
using (var f = File.OpenRead(localPath))
{
objectName = objectName ?? Path.GetFileName(localPath);
var objFileUploadStatus1 = storageClient.UploadObject(bucketName, objectName, null, f);
}
}
}
catch (Google.GoogleApiException ex)
{
// Error code =409, means bucket already created/exist then upload file in the bucket
if (ex.Error.Code == 409)
{
// Upload file to google cloud server
using (var f = File.OpenRead(localPath))
{
objectName = objectName ?? Path.GetFileName(localPath);
var objFileUploadStatus2 = storageClient.UploadObject(bucketName, objectName, null, f);
}
}
}
catch (Exception ex)
{
MessageBox.Show(ex.ToString());
}
return objectName;
}
To set the credentials
private bool SetStorageCredentials()
{
bool status = true;
try
{
if (File.Exists(credential_path))
{
Environment.SetEnvironmentVariable("GOOGLE_APPLICATION_CREDENTIALS", credential_path);
using (Stream objStream = new FileStream(Environment.GetEnvironmentVariable("GOOGLE_APPLICATION_CREDENTIALS"), FileMode.Open, FileAccess.Read))
googleCredential = GoogleCredential.FromStream(objStream);
// Instantiates a client.
storageClient = StorageClient.Create();
channel = new Grpc.Core.Channel(SpeechClient.DefaultEndpoint.Host, googleCredential.ToChannelCredentials());
}
else
{
DialogResult result = MessageBox.Show("File " + Path.GetFileName(credential_path) + " does not exist. Please provide the correct path.");
if (result == System.Windows.Forms.DialogResult.OK)
{
status = false;
}
}
}
catch (Exception ex)
{
MessageBox.Show(ex.ToString());
status = false;
}
return status;
}
I used the SDK in one of my Windows applications. You can use the same code according to your needs/requirements.
You'll be happy to know it still works in 2016...
I was googling all over using fancy keywords like "google gcp C# upload image", until I just plainly asked the question "How do I upload an image to a google bucket using C#"... and here I am. I removed the .Result on the user credential, and this was the final edit that worked for me.
// ******
static string bucketForImage = ConfigurationManager.AppSettings["testStorageName"];
static string projectName = ConfigurationManager.AppSettings["GCPProjectName"];
string gcpPath = Path.Combine(Server.MapPath("~/Images/Gallery/"), uniqueGcpName + ext);
var clientSecrets = new ClientSecrets();
clientSecrets.ClientId = ConfigurationManager.AppSettings["GCPClientID"];
clientSecrets.ClientSecret = ConfigurationManager.AppSettings["GCPClientSc"];
var scopes = new[] { @"https://www.googleapis.com/auth/devstorage.full_control" };
var cts = new CancellationTokenSource();
var userCredential = await GoogleWebAuthorizationBroker.AuthorizeAsync(clientSecrets, scopes, ConfigurationManager.AppSettings["GCPAccountEmail"], cts.Token);
var service = new Google.Apis.Storage.v1.StorageService();
var bucketToUpload = bucketForImage;
var newObject = new Google.Apis.Storage.v1.Data.Object()
{
Bucket = bucketToUpload,
Name = bkFileName
};
FileStream fileStream = null;
try
{
fileStream = new FileStream(gcpPath, FileMode.Open);
var uploadRequest = new Google.Apis.Storage.v1.ObjectsResource.InsertMediaUpload(service, newObject,
bucketToUpload, fileStream, "image/"+ ext);
uploadRequest.OauthToken = userCredential.Token.AccessToken;
await uploadRequest.UploadAsync();
}
catch (Exception ex)
{
Console.WriteLine(ex.Message);
}
finally
{
if (fileStream != null)
{
fileStream.Dispose();
}
}
// ******
Here is the link to their official C# example of ".NET Bookshelf App" using Google Cloud storage.
https://cloud.google.com/dotnet/docs/getting-started/using-cloud-storage
Source on github:
https://github.com/GoogleCloudPlatform/getting-started-dotnet/blob/master/aspnet/3-binary-data/Services/ImageUploader.cs
https://github.com/GoogleCloudPlatform/getting-started-dotnet/tree/master/aspnet/3-binary-data
NuGet
https://www.nuget.org/packages/Google.Cloud.Storage.V1/
Here are 2 examples that helped me to upload files to a bucket in Google Cloud Storage with Google.Cloud.Storage.V1 (not Google.Apis.Storage.v1):
Upload files to Google cloud storage using c#
Uploading .csv Files to Google Cloud Storage using C# .Net
I got both working on a C# Console Application just for testing purposes.
February 2021:
string _projectId = "YOUR-PROJECT-ID-GCP"; //ProjectID also present in the json file
GoogleCredential _credential = GoogleCredential.FromFile("credential-cloud-file-123418c9e06c.json");
/// <summary>
/// UploadFile to GCS Bucket
/// </summary>
/// <param name="bucketName"></param>
/// <param name="localPath">my-local-path/my-file-name</param>
/// <param name="objectName">my-file-name</param>
public void UploadFile(string bucketName, string localPath, string objectName)
{
var storage = StorageClient.Create(_credential);
using var fileStream = File.OpenRead(localPath);
storage.UploadObject(bucketName, objectName, null, fileStream);
Console.WriteLine($"Uploaded {objectName}.");
}
You get the credentials JSON file from the Google Cloud portal, where you create a bucket under your project.
Simple, with auth:
private void SaveFileToGoogleStorage(string path, string? fileName, string ext)
{
var filePath = Path.Combine(path, fileName + ext);
var gcCredentialsPath = Path.Combine(Environment.CurrentDirectory, "gc_sa_key.json");
Environment.SetEnvironmentVariable("GOOGLE_APPLICATION_CREDENTIALS", gcCredentialsPath);
var gcsStorage = StorageClient.Create();
using var f = File.OpenRead(filePath);
var objectName = Path.GetFileName(filePath);
gcsStorage.UploadObject(_bucketName, objectName, null, f);
Console.WriteLine($"Uploaded {objectName}.");
}

OneDrive Upload/Download to Specified Directory

I'm trying to use the Live SDK (v5.6) to include backup/restore from OneDrive in my Windows Phone 8.1 Silverlight application. I can read/write to the standard "me/skydrive" directory, but I'm having a horrible time finding a way to upload/download to a specified directory. I can create the folder if it doesn't exist, no problem.
I have been trying below with no luck.
var res = await _client.UploadAsync("me/skydrive/mydir", fileName, isoStoreFileStream, OverwriteOption.Overwrite);
I've also tried getting the directory ID and passing that in instead.
var res = await _client.UploadAsync("me/skydrive/" + folderId, fileName, isoStoreFileStream, OverwriteOption.Overwrite);
Same error: I receive a message that 'mydir' (or the id) isn't supported:
"{request_url_invalid: Microsoft.Live.LiveConnectException: The URL contains the path 'mydir', which isn't supported."
Any suggestions? If you suggest an answer for UploadAsync, could you also include how I could download my file from the specified directory? Thanks!
Here's an extension method that checks whether a folder exists and:
If it exists, returns the folder id.
If not, creates it and returns the folder id.
You can then use this id to upload to and download from that folder.
public async static Task<string> CreateDirectoryAsync(this LiveConnectClient client,
string folderName, string parentFolder)
{
string folderId = null;
// Retrieves all the directories.
var queryFolder = parentFolder + "/files?filter=folders,albums";
var opResult = await client.GetAsync(queryFolder);
dynamic result = opResult.Result;
foreach (dynamic folder in result.data)
{
// Checks if current folder has the passed name.
if (folder.name.ToLowerInvariant() == folderName.ToLowerInvariant())
{
folderId = folder.id;
break;
}
}
if (folderId == null)
{
// Directory hasn't been found, so creates it using the PostAsync method.
var folderData = new Dictionary<string, object>();
folderData.Add("name", folderName);
opResult = await client.PostAsync(parentFolder, folderData);
result = opResult.Result;
// Retrieves the id of the created folder.
folderId = result.id;
}
return folderId;
}
You then use this as:
string skyDriveFolder = await CreateDirectoryAsync(liveConnectClient, "<YourFolderNameHere>", "me/skydrive");
Now skyDriveFolder has the folder id that you can use when uploading and downloading. Here's a sample Upload:
LiveOperationResult result = await liveConnectClient.UploadAsync(skyDriveFolder, fileName,
fileStream, OverwriteOption.Overwrite);
ADDITION TO COMPLETE THE ANSWER BY YnotDraw
Using what you provided, here's how to download a text file by specifying the file name. The code below does not handle the file-not-found case or other potential exceptions, but here is what works when the stars align properly:
public async static Task<string> DownloadFileAsync(this LiveConnectClient client, string directory, string fileName)
{
string skyDriveFolder = await client.CreateDirectoryAsync(directory, "me/skydrive");
var operation = await client.GetAsync(skyDriveFolder + "/files");
var items = operation.Result["data"] as List<object>;
string id = string.Empty;
// Search for the file - add handling here if File Not Found
foreach (object item in items)
{
IDictionary<string, object> file = item as IDictionary<string, object>;
if (file["name"].ToString() == fileName)
{
id = file["id"].ToString();
break;
}
}
var downloadResult= await client.DownloadAsync(string.Format("{0}/content", id));
var reader = new StreamReader(downloadResult.Stream);
string text = await reader.ReadToEndAsync();
return text;
}
And in usage:
var result = await _client.DownloadFileAsync("MyDir", "backup.txt");
