Uploading objects to Google Cloud Storage buckets in C#

Can someone please provide an example of how to use Google.Apis.Storage.v1 to upload files to Google Cloud Storage in C#?

I found that this basic operation is not as straightforward as you might expect. Google's documentation about its Storage API is lacking in information about using it in C# (or any other .NET language). Searching for 'how to upload file to google cloud storage in c#' didn't exactly help me, so here is my working solution with some comments:
Preparation:
You need to create an OAuth 2.0 client in your Google Developers Console - go to Project/APIs & auth/Credentials.
Copy the Client ID & Client Secret into your code. You will also need your project name.
Code (it assumes that you've added Google.Apis.Storage.v1 via NuGet):
First, you need to authorize your requests:
var clientSecrets = new ClientSecrets();
clientSecrets.ClientId = clientId;
clientSecrets.ClientSecret = clientSecret;
//there are different scopes, which you can find here https://cloud.google.com/storage/docs/authentication
var scopes = new[] {@"https://www.googleapis.com/auth/devstorage.full_control"};
var cts = new CancellationTokenSource();
var userCredential = await GoogleWebAuthorizationBroker.AuthorizeAsync(clientSecrets, scopes, "yourGoogle@email", cts.Token);
Sometimes you might also want to refresh authorization token via:
await userCredential.RefreshTokenAsync(cts.Token);
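If you only want to refresh when the token has actually expired, you can check the token first (a small sketch; IsExpired comes from Google.Apis.Auth.OAuth2.Responses.TokenResponse and takes the SDK's clock):
//refresh only when the access token has expired (sketch)
if (userCredential.Token.IsExpired(Google.Apis.Util.SystemClock.Default))
{
    await userCredential.RefreshTokenAsync(cts.Token);
}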
You also need to create Storage Service:
var service = new Google.Apis.Storage.v1.StorageService();
Now you can make requests to Google Storage API.
Let's start with creating a new bucket:
var newBucket = new Google.Apis.Storage.v1.Data.Bucket()
{
Name = "your-bucket-name-1"
};
var newBucketQuery = service.Buckets.Insert(newBucket, projectName);
newBucketQuery.OauthToken = userCredential.Token.AccessToken;
//you probably want to wrap this into try..catch block
newBucketQuery.Execute();
And it's done. Now you can send a request to get a list of all your buckets:
var bucketsQuery = service.Buckets.List(projectName);
bucketsQuery.OauthToken = userCredential.Token.AccessToken;
var buckets = bucketsQuery.Execute();
The last part is uploading a new file:
//enter bucket name to which you want to upload file
var bucketToUpload = buckets.Items.FirstOrDefault().Name;
var newObject = new Google.Apis.Storage.v1.Data.Object()
{
Bucket = bucketToUpload,
Name = "some-file-"+new Random().Next(1,666)
};
FileStream fileStream = null;
try
{
var dir = Directory.GetCurrentDirectory();
var path = Path.Combine(dir, "test.png");
fileStream = new FileStream(path, FileMode.Open);
var uploadRequest = new Google.Apis.Storage.v1.ObjectsResource.InsertMediaUpload(service, newObject,
bucketToUpload,fileStream,"image/png");
uploadRequest.OauthToken = userCredential.Token.AccessToken;
await uploadRequest.UploadAsync();
}
catch (Exception ex)
{
Console.WriteLine(ex.Message);
}
finally
{
if (fileStream != null)
{
fileStream.Dispose();
}
}
And bam! The new file will be visible in your Google Developers Console inside the selected bucket.

You can use the Google Cloud APIs without the SDK in the following way:
An api-key.json file is required.
Install the package Google.Apis.Auth.OAuth2 in order to authorize the HTTP web request.
You can set the default configuration for your application as shown below.
I did the same using a .NET Core Web API; the details are given below:
URL details:
"GoogleCloudStorageBaseUrl": "https://www.googleapis.com/upload/storage/v1/b/",
"GoogleSpeechBaseUrl": "https://speech.googleapis.com/v1/operations/",
"GoogleLongRunningRecognizeBaseUrl": "https://speech.googleapis.com/v1/speech:longrunningrecognize",
"GoogleCloudScope": "https://www.googleapis.com/auth/cloud-platform",
public void GetConfiguration()
{
// Set global configuration
bucketName = _configuration.GetValue<string>("BucketName");
googleCloudStorageBaseUrl = _configuration.GetValue<string>("GoogleCloudStorageBaseUrl");
googleSpeechBaseUrl = _configuration.GetValue<string>("GoogleSpeechBaseUrl");
googleLongRunningRecognizeBaseUrl = _configuration.GetValue<string>("GoogleLongRunningRecognizeBaseUrl");
// Set google cloud credentials
string googleApplicationCredentialsPath = _configuration.GetValue<string>("GoogleCloudCredentialPath");
using (Stream stream = new FileStream(googleApplicationCredentialsPath, FileMode.Open, FileAccess.Read))
googleCredential = GoogleCredential.FromStream(stream).CreateScoped(_configuration.GetValue<string>("GoogleCloudScope"));
}
Get Oauth token:
public string GetOAuthToken()
{
return googleCredential.UnderlyingCredential.GetAccessTokenForRequestAsync("https://accounts.google.com/o/oauth2/v2/auth", CancellationToken.None).Result;
}
To upload file to cloud bucket:
public async Task<string> UploadMediaToCloud(string filePath, string objectName = null)
{
string bearerToken = GetOAuthToken();
byte[] fileBytes = File.ReadAllBytes(filePath);
objectName = objectName ?? Path.GetFileName(filePath);
var baseUrl = new Uri(googleCloudStorageBaseUrl + bucketName + "/o?uploadType=media&name=" + objectName);
using (WebClient client = new WebClient())
{
client.Headers.Add(HttpRequestHeader.Authorization, "Bearer " + bearerToken);
client.Headers.Add(HttpRequestHeader.ContentType, "application/octet-stream");
byte[] response = await Task.Run(() => client.UploadData(baseUrl, "POST", fileBytes));
string responseInString = Encoding.UTF8.GetString(response);
return responseInString;
}
}
In order to perform any other action against the cloud APIs, you just need to make an HttpClient GET/POST request as required.
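For example, a minimal HttpClient version of the same media upload might look like this (a sketch under the same assumptions as above: the bearer token, base URL, and bucket name fields are already set):
public async Task<string> UploadMediaToCloudViaHttpClient(string filePath, string objectName)
{
    // Same endpoint as the WebClient version above
    string bearerToken = GetOAuthToken();
    var requestUri = new Uri(googleCloudStorageBaseUrl + bucketName + "/o?uploadType=media&name=" + objectName);
    using (var httpClient = new System.Net.Http.HttpClient())
    using (var content = new System.Net.Http.ByteArrayContent(File.ReadAllBytes(filePath)))
    {
        httpClient.DefaultRequestHeaders.Authorization =
            new System.Net.Http.Headers.AuthenticationHeaderValue("Bearer", bearerToken);
        content.Headers.ContentType = new System.Net.Http.Headers.MediaTypeHeaderValue("application/octet-stream");
        var response = await httpClient.PostAsync(requestUri, content);
        return await response.Content.ReadAsStringAsync();
    }
}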

This is for Google.Cloud.Storage.V1 (not Google.Apis.Storage.v1), but performing an upload appears to be a bit simpler now. I started with the client libraries' "Getting Started" instructions to create a service account and bucket, then experimented to find out how to upload an image.
The process I followed was:
Sign up for Google Cloud free trial
Create a new project in Google Cloud (remember the project name/ID for later)
Create a Project Owner service account - this will result in a json file being downloaded that contains the service account credentials. Remember where you put that file.
The getting started docs have you add the path to the JSON credentials file to an environment variable called GOOGLE_APPLICATION_CREDENTIALS - I couldn't get this to work through the provided instructions. It turns out it is not required, as you can just read the JSON file into a string and pass it to the client constructor.
I created an empty WPF project as a starting point, and a single ViewModel to house the application logic.
Install the Google.Cloud.Storage.V1 NuGet package and it should pull in all the dependencies it needs.
Onto the code.
MainWindow.xaml
<StackPanel>
<Button
Margin="50"
Height="50"
Content="BEGIN UPLOAD"
Click="OnButtonClick" />
<ContentControl
Content="{Binding Path=ProgressBar}" />
</StackPanel>
MainWindow.xaml.cs
public partial class MainWindow
{
readonly ViewModel _viewModel;
public MainWindow()
{
_viewModel = new ViewModel(Dispatcher);
DataContext = _viewModel;
InitializeComponent();
}
void OnButtonClick(object sender, RoutedEventArgs args)
{
_viewModel.UploadAsync().ConfigureAwait(false);
}
}
ViewModel.cs
public class ViewModel
{
readonly Dispatcher _dispatcher;
public ViewModel(Dispatcher dispatcher)
{
_dispatcher = dispatcher;
ProgressBar = new ProgressBar {Height=30};
}
public async Task UploadAsync()
{
// Google Cloud Platform project ID.
const string projectId = "project-id-goes-here";
// The name for the new bucket.
const string bucketName = projectId + "-test-bucket";
// Path to the file to upload
const string filePath = @"C:\path\to\image.jpg";
var newObject = new Google.Apis.Storage.v1.Data.Object
{
Bucket = bucketName,
Name = System.IO.Path.GetFileNameWithoutExtension(filePath),
ContentType = "image/jpeg"
};
// read the JSON credential file saved when you created the service account
var credential = Google.Apis.Auth.OAuth2.GoogleCredential.FromJson(System.IO.File.ReadAllText(
#"c:\path\to\service-account-credentials.json"));
// Instantiates a client.
using (var storageClient = Google.Cloud.Storage.V1.StorageClient.Create(credential))
{
try
{
// Creates the new bucket. Only required the first time.
// You can also create buckets through the GCP cloud console web interface
storageClient.CreateBucket(projectId, bucketName);
System.Windows.MessageBox.Show($"Bucket {bucketName} created.");
// Open the image file filestream
using (var fileStream = new System.IO.FileStream(filePath, System.IO.FileMode.Open))
{
ProgressBar.Maximum = fileStream.Length;
// set minimum chunksize just to see progress updating
var uploadObjectOptions = new Google.Cloud.Storage.V1.UploadObjectOptions
{
ChunkSize = Google.Cloud.Storage.V1.UploadObjectOptions.MinimumChunkSize
};
// Hook up the progress callback
var progressReporter = new Progress<Google.Apis.Upload.IUploadProgress>(OnUploadProgress);
await storageClient.UploadObjectAsync(
newObject,
fileStream,
uploadObjectOptions,
progress: progressReporter)
.ConfigureAwait(false);
}
}
catch (Google.GoogleApiException e)
when (e.Error.Code == 409)
{
// When creating the bucket - The bucket already exists. That's fine.
System.Windows.MessageBox.Show(e.Error.Message);
}
catch (Exception e)
{
// other exception
System.Windows.MessageBox.Show(e.Message);
}
}
}
// Called when progress updates
void OnUploadProgress(Google.Apis.Upload.IUploadProgress progress)
{
switch (progress.Status)
{
case Google.Apis.Upload.UploadStatus.Starting:
ProgressBar.Minimum = 0;
ProgressBar.Value = 0;
break;
case Google.Apis.Upload.UploadStatus.Completed:
ProgressBar.Value = ProgressBar.Maximum;
System.Windows.MessageBox.Show("Upload completed");
break;
case Google.Apis.Upload.UploadStatus.Uploading:
UpdateProgressBar(progress.BytesSent);
break;
case Google.Apis.Upload.UploadStatus.Failed:
System.Windows.MessageBox.Show("Upload failed"
+ Environment.NewLine
+ progress.Exception);
break;
}
}
void UpdateProgressBar(long value)
{
_dispatcher.Invoke(() => { ProgressBar.Value = value; });
}
// probably better to expose progress value directly and bind to
// a ProgressBar in the XAML
public ProgressBar ProgressBar { get; }
}
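As the last comment suggests, a binding-friendly alternative (a sketch with hypothetical property names) is to expose plain progress values and bind a ProgressBar in the XAML, e.g. <ProgressBar Maximum="{Binding UploadMaximum}" Value="{Binding UploadValue}" />:
public class UploadProgressViewModel : System.ComponentModel.INotifyPropertyChanged
{
    double _uploadValue;
    public double UploadMaximum { get; set; }
    public double UploadValue
    {
        get { return _uploadValue; }
        set
        {
            _uploadValue = value;
            // WPF marshals property-change notifications for scalar bindings
            // to the UI thread, so no Dispatcher plumbing is needed here.
            PropertyChanged?.Invoke(this, new System.ComponentModel.PropertyChangedEventArgs(nameof(UploadValue)));
        }
    }
    public event System.ComponentModel.PropertyChangedEventHandler PropertyChanged;
}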

Uploading files to Google Cloud Storage using the SDK (Google.Cloud.Storage.V1) in C#:
An api-key.json file is required.
Install the packages Google.Cloud.Storage.V1 and Google.Apis.Auth.OAuth2.
The code to upload a file to the cloud bucket is given below.
private string UploadFile(string localPath, string objectName = null)
{
string projectId = ((Google.Apis.Auth.OAuth2.ServiceAccountCredential)googleCredential.UnderlyingCredential).ProjectId;
try
{
// Creates the new bucket.
var objResult = storageClient.CreateBucket(projectId, bucketName);
if (!string.IsNullOrEmpty(objResult.Id))
{
// Upload file to google cloud server
using (var f = File.OpenRead(localPath))
{
objectName = objectName ?? Path.GetFileName(localPath);
var objFileUploadStatus1 = storageClient.UploadObject(bucketName, objectName, null, f);
}
}
}
catch (Google.GoogleApiException ex)
{
// Error code 409 means the bucket already exists; in that case, upload the file to the existing bucket
if (ex.Error.Code == 409)
{
// Upload file to google cloud server
using (var f = File.OpenRead(localPath))
{
objectName = objectName ?? Path.GetFileName(localPath);
var objFileUploadStatus2 = storageClient.UploadObject(bucketName, objectName, null, f);
}
}
}
catch (Exception ex)
{
MessageBox.Show(ex.ToString());
}
return objectName;
}
To set the credentials:
private bool SetStorageCredentials()
{
bool status = true;
try
{
if (File.Exists(credential_path))
{
Environment.SetEnvironmentVariable("GOOGLE_APPLICATION_CREDENTIALS", credential_path);
using (Stream objStream = new FileStream(Environment.GetEnvironmentVariable("GOOGLE_APPLICATION_CREDENTIALS"), FileMode.Open, FileAccess.Read))
googleCredential = GoogleCredential.FromStream(objStream);
// Instantiates a client.
storageClient = StorageClient.Create();
channel = new Grpc.Core.Channel(SpeechClient.DefaultEndpoint.Host, googleCredential.ToChannelCredentials());
}
else
{
DialogResult result = MessageBox.Show("File " + Path.GetFileName(credential_path) + " does not exist. Please provide the correct path.");
if (result == System.Windows.Forms.DialogResult.OK)
{
status = false;
}
}
}
catch (Exception ex)
{
MessageBox.Show(ex.ToString());
status = false;
}
return status;
}
I used the SDK in one of my Windows applications. You can use the same code according to your needs/requirements.

You'll be happy to know it still works in 2016...
I was googling all over using fancy keywords like "google gcp C# upload image", until I just plain asked the question: "How do I upload an image to google bucket using C#"... and here I am. I removed the .Result on the user credential, and this was the final edit that worked for me.
// ******
static string bucketForImage = ConfigurationManager.AppSettings["testStorageName"];
static string projectName = ConfigurationManager.AppSettings["GCPProjectName"];
string gcpPath = Path.Combine(Server.MapPath("~/Images/Gallery/"), uniqueGcpName + ext);
var clientSecrets = new ClientSecrets();
clientSecrets.ClientId = ConfigurationManager.AppSettings["GCPClientID"];
clientSecrets.ClientSecret = ConfigurationManager.AppSettings["GCPClientSc"];
var scopes = new[] { @"https://www.googleapis.com/auth/devstorage.full_control" };
var cts = new CancellationTokenSource();
var userCredential = await GoogleWebAuthorizationBroker.AuthorizeAsync(clientSecrets, scopes, ConfigurationManager.AppSettings["GCPAccountEmail"], cts.Token);
var service = new Google.Apis.Storage.v1.StorageService();
var bucketToUpload = bucketForImage;
var newObject = new Google.Apis.Storage.v1.Data.Object()
{
Bucket = bucketToUpload,
Name = bkFileName
};
FileStream fileStream = null;
try
{
fileStream = new FileStream(gcpPath, FileMode.Open);
var uploadRequest = new Google.Apis.Storage.v1.ObjectsResource.InsertMediaUpload(service, newObject,
bucketToUpload, fileStream, "image/"+ ext);
uploadRequest.OauthToken = userCredential.Token.AccessToken;
await uploadRequest.UploadAsync();
}
catch (Exception ex)
{
Console.WriteLine(ex.Message);
}
finally
{
if (fileStream != null)
{
fileStream.Dispose();
}
}
// ******

Here is the link to their official C# example of ".NET Bookshelf App" using Google Cloud storage.
https://cloud.google.com/dotnet/docs/getting-started/using-cloud-storage
Source on github:
https://github.com/GoogleCloudPlatform/getting-started-dotnet/blob/master/aspnet/3-binary-data/Services/ImageUploader.cs
https://github.com/GoogleCloudPlatform/getting-started-dotnet/tree/master/aspnet/3-binary-data
NuGet
https://www.nuget.org/packages/Google.Cloud.Storage.V1/

Here are 2 examples that helped me to upload files to a bucket in Google Cloud Storage with Google.Cloud.Storage.V1 (not Google.Apis.Storage.v1):
Upload files to Google cloud storage using c#
Uploading .csv Files to Google Cloud Storage using C# .Net
I got both working on a C# Console Application just for testing purposes.

February 2021
string _projectId = "YOUR-PROJECT-ID-GCP"; //ProjectID also present in the json file
GoogleCredential _credential = GoogleCredential.FromFile("credential-cloud-file-123418c9e06c.json");
/// <summary>
/// UploadFile to GCS Bucket
/// </summary>
/// <param name="bucketName"></param>
/// <param name="localPath">my-local-path/my-file-name</param>
/// <param name="objectName">my-file-name</param>
public void UploadFile(string bucketName, string localPath, string objectName)
{
var storage = StorageClient.Create(_credential);
using var fileStream = File.OpenRead(localPath);
storage.UploadObject(bucketName, objectName, null, fileStream);
Console.WriteLine($"Uploaded {objectName}.");
}
You get the credentials JSON file from the Google Cloud portal, where you create a bucket under your project.
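Hypothetical usage of the method above (the wrapper class name GcsUploader is made up for illustration; the method and fields are the ones shown):
// "GcsUploader" is a hypothetical class holding _credential and the UploadFile method above
var uploader = new GcsUploader();
uploader.UploadFile("my-bucket", @"C:\temp\report.pdf", "report.pdf");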

Simple, with auth:
private void SaveFileToGoogleStorage(string path, string? fileName, string ext)
{
var filePath = Path.Combine(path, fileName + ext);
var gcCredentialsPath = Path.Combine(Environment.CurrentDirectory, "gc_sa_key.json");
Environment.SetEnvironmentVariable("GOOGLE_APPLICATION_CREDENTIALS", gcCredentialsPath);
var gcsStorage = StorageClient.Create();
using var f = File.OpenRead(filePath);
var objectName = Path.GetFileName(filePath);
gcsStorage.UploadObject(_bucketName, objectName, null, f);
Console.WriteLine($"Uploaded {objectName}.");
}

Related

Downloading a file from MS Teams programmatically

Previously, I developed an application which downloaded a file from a corporate Sharepoint site and then performed some magic with it.
The powers that be have since migrated to MS Teams and I'm trying to update the application to use the new platform. However, I'm having all sorts of issues getting the file to download.
My old (working for Sharepoint) code uses a WebClient to retrieve the file based on credentials previously provided by the user:
private string GetSchedule(string username, string password, string domain)
{
string tempPath = Path.GetTempFileName().Replace(".tmp", ".xlsm");
using (WebClient client = new WebClient())
{
client.Credentials = new NetworkCredential(username, password, domain);
try
{
client.DownloadFile(_networkSchedulePath, tempPath);
}
catch (WebException e)
{
if (e.Message.Contains("401"))
{
StatusUpdated?.Invoke(this, new EventArgs<string>("Invalid Credentials Provided"));
Finished?.Invoke(this, null);
return null;
}
if (e.Message.Contains("404"))
{
StatusUpdated?.Invoke(this, new EventArgs<string>("File Not Found"));
Finished?.Invoke(this, null);
return null;
}
else
{
StatusUpdated?.Invoke(this, new EventArgs<string>(e.Message));
Finished?.Invoke(this, null);
return null;
}
}
}
return tempPath;
}
However, when I use this with the new teams link I'm getting a 403 Forbidden error. So is there any way to programmatically retrieve a file from MS Teams?
I was mistaken in the comments. Simply replacing the NetworkCredentials with SharePointOnlineCredentials is not the solution.
I'm not sure if the following is the "right" approach, but it works and seems pretty solid. Please give it a try:
private static string GetFile(string path, string username, string password, string domain)
{
var secureString = new SecureString();
foreach (var ch in password)
{
secureString.AppendChar(ch);
}
string tempPath = Path.GetTempFileName().Replace(".tmp", ".xlsm");
using (WebClient client = new WebClient())
{
var credentials = new SharePointOnlineCredentials(username, secureString);
client.Headers[HttpRequestHeader.Cookie] = credentials.GetAuthenticationCookie(new Uri(path));
try
{
client.DownloadFile(path, tempPath);
}
catch (WebException e)
{
// Error Handling
}
}
return tempPath;
}
Another option is to use the CSOM rather than using a webclient directly. n.b., I encountered errors at the OpenBinaryDirect() call when using the Microsoft.SharePoint.Client NuGet package and it looks like this package is wildly out of date. It appears that the one to use now is Microsoft.SharePointOnline.CSOM or Microsoft.SharePoint2019.CSOM:
private static string GetFileWithClientContext(string path, string username, string password, string domain)
{
var secureString = new SecureString();
foreach (var ch in password)
{
secureString.AppendChar(ch);
}
string tempPath = Path.GetTempFileName().Replace(".tmp", Path.GetExtension(path));
using (var context = new ClientContext(path))
{
context.Credentials = new SharePointOnlineCredentials(username, secureString);
try
{
using (var file = Microsoft.SharePoint.Client.File.OpenBinaryDirect(context, new Uri(path).AbsolutePath))
using (var outFile = System.IO.File.OpenWrite(tempPath))
{
file.Stream.CopyTo(outFile);
}
}
catch (WebException e)
{
// Error Handling
}
}
return tempPath;
}
Thanks to JLRishe for the help his answer and comments provided. However, the final solution varied from the one in his answer, which is why I'm posting it here:
The OfficeDevPnP.Core package is used extensively for this.
Firstly, the AuthenticationManager is used to get a ClientContext in terms of the specific sharepoint site that needs to be accessed. This pops a window up to allow for the MFA. Then various components are loaded in via the ClientContext object. From here, the file is fetched via Guid and dumped to disk.
private string GetSchedule()
{
string tempPath = Path.GetTempFileName().Replace(".tmp", ".xlsm");
try
{
AuthenticationManager authManager = new OfficeDevPnP.Core.AuthenticationManager();
ClientContext ctx = authManager.GetWebLoginClientContext("https://oursite.sharepoint.com/sites/ourspecificsite/");
Web web = ctx.Web;
Microsoft.SharePoint.Client.File schedule = web.GetFileById(new Guid("ourguid"));
ctx.Load(web);
ctx.Load(schedule);
ctx.ExecuteQuery();
FileInformation fInfo = Microsoft.SharePoint.Client.File.OpenBinaryDirect(ctx, schedule.ServerRelativeUrl);
using (var fileStream = File.Create(tempPath))
{
fInfo.Stream.CopyTo(fileStream);
}
}
catch (WebException e)
{
StatusUpdated?.Invoke(this, new EventArgs<string>(e.Message));
return null;
}
return tempPath;
}
using Microsoft.Graph;
using Microsoft.Graph.Auth;
using Microsoft.Identity.Client;
using System.IO;
using System.Linq;
namespace Answer
{
class Answer
{
static void Main(string[] args)
{
// Create Confidential Application
IConfidentialClientApplication confidentialClientApplication = ConfidentialClientApplicationBuilder
.Create("<My_Azure_Application_Client_ID>")
.WithTenantId("<My_Azure_Tenant_ID>")
.WithClientSecret("<My_Azure_Application_Client_Secret>")
.Build();
// Create an authentication provider.
ClientCredentialProvider authenticationProvider = new ClientCredentialProvider(confidentialClientApplication);
// Configure GraphServiceClient with provider.
GraphServiceClient graphServiceClient = new GraphServiceClient(authenticationProvider);
// Get a user
var user = graphServiceClient.Users["<My_Azure_User_Name>"].Request().GetAsync().Result;
// Get the teams the user is member of
var joinedTeams = graphServiceClient.Users[user.Id].JoinedTeams.Request().GetAsync().Result;
// Get the team we are interested in
var team1 = joinedTeams.FirstOrDefault(t => t.DisplayName == "<TeamName_Of_Interest>");
// Get the main folders
var folders = graphServiceClient.Groups[team1.Id].Drive.Root.Children
.Request()
.GetAsync().Result;
// Get the files in the first main folder
var files = graphServiceClient.Groups[team1.Id].Drive.Items[folders[0].Id].Children
.Request()
.GetAsync().Result;
// Get the file-Data of the first file
MemoryStream fileData = graphServiceClient.Groups[team1.Id].Drive.Items[files[0].Id].Content
.Request()
.GetAsync().Result as MemoryStream;
// Save the file to the hard-disc
System.IO.File.WriteAllBytes($"C:\\{files[0].Name}", fileData.ToArray());
}
}
}

AmazonSQSClient not refreshing AWSCredentials when Credentials File is updated

When my AWS Credentials File (see docs) is updated by an external process the AmazonSQSClient doesn't re-read it, SendMessageAsync fails with a security/token error.
We use a custom PowerShell script to refresh the local AWS credentials file periodically. The script works fine; the file is refreshed prior to the credentials expiring on AWS. However, if my app is running when the file is refreshed, the new credentials are not re-read from the file: the "client" will show that the previous credentials are still in use.
The AWS docs list several AWSCredentials providers, but none of them seem to be the correct choice... I think.
Restarting the app works; the new credentials are read correctly and messages are sent until the next time the credentials file is updated.
using (var client = new AmazonSQSClient(Amazon.RegionEndpoint.EUWest1))
{
return client.SendMessageAsync(request);
}
I don't think there is a way for a running app to pick up the default credentials being refreshed in the credentials file. There is a solution for Node.js that loads credentials from a JSON file, and you can create a similar solution in C#. You could also run a local DB to store credentials, so that whenever the credentials file is updated the DB table or JSON file is updated as well. You will need to use the access key and secret key in your SQS client constructor, as opposed to using the default credentials.
// Load these from JSON file or DB.
var accessKey = "";
var secretKey = "";
using (var client = new AmazonSQSClient(accessKey, secretKey, Amazon.RegionEndpoint.EUWest1))
{
return client.SendMessageAsync(request);
}
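As a minimal sketch of the "load from JSON file" idea, assuming a file like creds.json with accessKey/secretKey fields (the file name and shape are illustrative, and System.Text.Json is assumed to be available):
public static (string AccessKey, string SecretKey) LoadKeys(string path)
{
    // Re-parse the file on every call so refreshed credentials are picked up.
    using (var doc = System.Text.Json.JsonDocument.Parse(System.IO.File.ReadAllText(path)))
    {
        var root = doc.RootElement;
        return (root.GetProperty("accessKey").GetString(),
                root.GetProperty("secretKey").GetString());
    }
}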
The following works "ok", but I've only tested it with one profile, and the file watcher is not as timely as you'd like, so I'd recommend you wrap your usage inside a retry mechanism (see the sketch after the class below).
// Usage..
var credentials = new AwsCredentialsFile();
using (var client = new AmazonSQSClient(credentials, Amazon.RegionEndpoint.EUWest1))
{
return client.SendMessageAsync(request);
}
public class AwsCredentialsFile : AWSCredentials
{
// https://docs.aws.amazon.com/sdk-for-net/v2/developer-guide/net-dg-config-creds.html#creds-file
private const string DefaultProfileName = "default";
private static ConcurrentDictionary<string, ImmutableCredentials> _credentials = new ConcurrentDictionary<string, ImmutableCredentials>(StringComparer.OrdinalIgnoreCase);
private static FileSystemWatcher _watcher = BuildFileSystemWatcher();
private readonly System.Text.Encoding _encoding;
private readonly string _profileName;
public AwsCredentialsFile()
: this(AwsCredentialsFile.DefaultProfileName, System.Text.Encoding.UTF8)
{
}
public AwsCredentialsFile(string profileName)
: this(profileName, System.Text.Encoding.UTF8)
{
}
public AwsCredentialsFile(string profileName, System.Text.Encoding encoding)
{
_profileName = profileName;
_encoding = encoding;
}
private static FileSystemWatcher BuildFileSystemWatcher()
{
var watcher = new FileSystemWatcher
{
Path = Path.GetDirectoryName(GetDefaultCredentialsFilePath()),
NotifyFilter = NotifyFilters.LastWrite,
Filter = "credentials"
};
watcher.Changed += (object source, FileSystemEventArgs e) => { _credentials?.Clear(); };
watcher.EnableRaisingEvents = true;
return watcher;
}
public static string GetDefaultCredentialsFilePath()
{
return System.Environment.ExpandEnvironmentVariables(@"C:\Users\%USERNAME%\.aws\credentials");
}
public static (string AccessKey, string SecretAccessKey, string Token) ReadCredentialsFromFile(string profileName, System.Text.Encoding encoding)
{
var profile = $"[{profileName}]";
string awsAccessKeyId = null;
string awsSecretAccessKey = null;
string token = null;
var lines = File.ReadAllLines(GetDefaultCredentialsFilePath(), encoding);
for (int i = 0; i < lines.Length; i++)
{
var text = lines[i];
if (text.Equals(profile, StringComparison.OrdinalIgnoreCase))
{
awsAccessKeyId = lines[i + 1].Replace("aws_access_key_id = ", string.Empty);
awsSecretAccessKey = lines[i + 2].Replace("aws_secret_access_key = ", string.Empty);
if (lines.Length > i + 3)
{
token = lines[i + 3].Replace("aws_session_token = ", string.Empty);
}
break;
}
}
var result = (AccessKey: awsAccessKeyId, SecretAccessKey: awsSecretAccessKey, Token: token);
return result;
}
public override ImmutableCredentials GetCredentials()
{
if (_credentials.TryGetValue(_profileName, out ImmutableCredentials value))
{
return value;
}
else
{
var (AccessKey, SecretAccessKey, Token) = ReadCredentialsFromFile(_profileName, _encoding);
var credentials = new ImmutableCredentials(AccessKey, SecretAccessKey, Token);
_credentials.TryAdd(_profileName, credentials);
return credentials;
}
}
}
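A crude retry wrapper to pair with it might look like this (a sketch; which exception your expired credentials actually surface as is an assumption you should verify, and it assumes using Amazon.SQS):
public static async Task<T> WithRetryAsync<T>(Func<Task<T>> action, int maxAttempts = 3)
{
    for (int attempt = 1; ; attempt++)
    {
        try
        {
            return await action();
        }
        catch (AmazonSQSException) when (attempt < maxAttempts)
        {
            // Possibly stale credentials - give the watcher a moment to clear the cache.
            await Task.Delay(TimeSpan.FromSeconds(attempt));
        }
    }
}
// Usage:
// var response = await WithRetryAsync(() => client.SendMessageAsync(request));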

Only upload updated files into Blob Storage

I have a method that uploads XML files from a folder into a Blob Storage. Connected to that Blob Storage, I have a Blob Trigger that listens for changes in the Blob Storage, takes the files, and then does a PUT request to a server. I got that sorted out and working.
My problem is that when I want to update a specific file in the folder and run my code, all the files in the folder seem to be uploaded again and my Blob Trigger goes off, doing a PUT for all the files. I only want to do a PUT on the files that changed in the folder (except for my initial upload to the blob, of course).
The code I have so far is as basic as my level of experience. For the import I followed a simple guide.
My code that uploads the file into the Blob Storage:
class Program
{
static void Main(string[] args)
{
Console.WriteLine("Starting...");
string accountName = ConfigurationManager.AppSettings["accountName"];
string accountKey = ConfigurationManager.AppSettings["accountKey"];
string localFolder = ConfigurationManager.AppSettings["mySourceFolder"];
string destContainer = ConfigurationManager.AppSettings["destContainer"];
var stringReturned = BlobSetup(accountName, accountKey, localFolder, destContainer).GetAwaiter().GetResult();
Console.WriteLine(stringReturned);
Console.Read();
}
static async Task UploadBlob(CloudBlobContainer container, string key, string filePath, bool deleteAfter)
{
//Get a blob reference to write this file to
var blob = container.GetBlockBlobReference(key);
await blob.UploadFromFileAsync(filePath);
Console.WriteLine("Uploaded {0}", filePath);
//if delete of file is requested, do that
if (deleteAfter)
{
File.Delete(filePath);
}
}
static async Task<string> BlobSetup(string accountName, string accountKey, string localFolder, string destContainer)
{
var storageAccount = new CloudStorageAccount(new StorageCredentials(accountName, accountKey), true);
var blobClient = storageAccount.CreateCloudBlobClient();
var container = blobClient.GetContainerReference(destContainer);
//create container if not exists
await container.CreateIfNotExistsAsync();
await container.SetPermissionsAsync(new BlobContainerPermissions()
{
PublicAccess = BlobContainerPublicAccessType.Blob
});
string[] fileEntries = Directory.GetFiles(localFolder);
foreach (string filePath in fileEntries)
{
//Handle only json and xml?
if(filePath.EndsWith(".json") || filePath.EndsWith(".xml"))
{
string keys = Path.GetFileName(filePath);
await UploadBlob(container, keys, filePath, false);
}
}
return "some response";
}
}
My Blob Trigger that does the PUT:
public static class BlobTriggerExample
{
const string serverUrl= "theurl";
static HttpClient client = new HttpClient();
[FunctionName("BlobTriggerExample")]
public static async Task Run([BlobTrigger("myblob/{name}", Connection = "AzureWebJobsStorage")]CloudBlockBlob myBlob, string name, TraceWriter log)
{
string putUrlString = "";
string idValue = "";
XDocument xdoc = new XDocument();
myBlob.StreamMinimumReadSizeInBytes = 20 * 1024 * 1024;
await myBlob.FetchAttributesAsync();
//Read stream
var blobStream = await myBlob.OpenReadAsync();
xdoc = new XDocument(XDocument.Load(blobStream));
//Read root node(resourceType)
string resourceType = xdoc.Root.Name.LocalName;
//Get id value
idValue = xdoc.Descendants().Where(x => x.Name.LocalName == "id").First().LastAttribute.Value;
//Build redirect string
putUrlString = serverUrl + resourceType + "/" + idValue;
//PUT
var httpContent = new StringContent(xdoc.ToString(), Encoding.UTF8, "application/xml");
var response = await client.PutAsync(putUrlString, httpContent);
Console.WriteLine($"Response: {response}");
Console.Read();
log.Info($"C# Blob trigger function Processed blob\n Name:{name} \n Size: {myBlob.StreamWriteSizeInBytes} Bytes : Response message: {response}");
}
}
My guess is that I want to be able to control what files I'm uploading into the Blob Storage by doing some sort of check if the exact same file already exist. Or maybe I want to make some sort of check in the Blob Trigger before doing the PUT?
File names in the folder I'm uploading from are always the same (a must), even though some of the content might have changed.
Is there anyone who could be so kind as to give me some guidelines on how I might approach this? I have been googling around for hours and I'm getting nowhere.
Yes, your code loops through and uploads all the files in your local folder. The blob trigger just sees that the blobs have been written and has no concept of whether or not their content has changed (or whether that matters) so it also processes all of them.
What you need to do is to compare your local files against the ones in blob storage before you upload to see whether they're new versions or not, so in your UploadBlob method you want something along the lines of
// Get a blob reference to write this file to
var blob = container.GetBlockBlobReference(key);
// If the blob already exists
if (await blob.ExistsAsync()) {
// Fetch the blob's properties
await blob.FetchAttributesAsync();
// Only proceed if modification time of local file is newer
if (blob.Properties.LastModified > File.GetLastWriteTimeUtc(filePath))
return;
}
If checking the modification time isn't enough then you can also attach your own metadata (e.g. a checksum) to blobs and use that for the comparison - see https://learn.microsoft.com/en-us/azure/storage/blobs/storage-properties-metadata.
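For example, a small sketch of the checksum idea using the blob's built-in Content-MD5 property (this assumes the same classic storage SDK as above, and that your upload path actually populates ContentMD5 - single-put uploads usually do, large block-list uploads may not):
// Returns true when the local file's MD5 matches the blob's stored Content-MD5.
static async Task<bool> HasSameContentAsync(CloudBlockBlob blob, string filePath)
{
    if (!await blob.ExistsAsync())
        return false;
    await blob.FetchAttributesAsync();
    using (var md5 = System.Security.Cryptography.MD5.Create())
    using (var stream = File.OpenRead(filePath))
    {
        var localMd5 = Convert.ToBase64String(md5.ComputeHash(stream));
        return localMd5 == blob.Properties.ContentMD5;
    }
}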
Below is a version we use in production. Note that metadata is used instead of the blob's last-modified property, for better code flexibility and reliability. It might not fit everybody's needs, but this is one way of doing it that is tested and working.
for (var i = 0; i < files.Length; i++)
{
var filePath = files[i];
//maintains folder structure
var blobName = filePath.Replace(selectedBuildPath, "").Replace("\\", "/");
var blobClient = GetNewBlobClientWithMaxRetry(blobName, containerClient);
// If the blob already exists, get the last modified tick count in the blobs metadata
long blobLastModifiedTick = 0;
if (await blobClient.ExistsAsync())
{
try
{
var blobProperties = (await blobClient.GetPropertiesAsync()).Value;
//this line will fail if lastmodifiedticks was not set before
blobLastModifiedTick = long.Parse(blobProperties.Metadata[_lastmodifiedticks]);
}
//if fail, blob last tick will be 0, aka outdated
catch { }
}
//if local file is latest, then upload else skip
var localLastModified = File.GetLastWriteTimeUtc(filePath);
var localIsNewer = blobLastModifiedTick < localLastModified.Ticks;
if (localIsNewer)
{
//print name 1st to know which gets stuck
//extra space to delete extra prev
Console.Write($"\r{i}/{files.Length} Uploading:{blobName} ");
//note will overwrite existing
await using var fs = File.Open(filePath, FileMode.Open);
await blobClient.UploadAsync(fs, overwrite: true, CancellationToken.None);
//auto correct content type from wrongly set "octet/stream"
var blobHttpHeaders = new BlobHttpHeaders { ContentType = MimeTypeMap.GetMimeType(filePath) };
await blobClient.SetHttpHeadersAsync(blobHttpHeaders);
await blobClient.SetMetadataAsync(new Dictionary<string, string>() { { _lastmodifiedticks, localLastModified.Ticks.ToString() } });
}
else
{
Console.Write($"\r{i}/{files.Length} Skipped:{blobName} "); //space to delete extra previous line
}
}

How to download/upload files from/to SharePoint 2013 using CSOM?

I am developing a Win8 (WinRT, C#, XAML) client application (CSOM) that needs to download/upload files from/to SharePoint 2013.
How do I do the Download/Upload?
Upload a file
Upload a file to a SharePoint site (including SharePoint Online) using the File.SaveBinaryDirect method:
using (var clientContext = new ClientContext(url))
{
using (var fs = new FileStream(fileName, FileMode.Open))
{
var fi = new FileInfo(fileName);
var list = clientContext.Web.Lists.GetByTitle(listTitle);
clientContext.Load(list.RootFolder);
clientContext.ExecuteQuery();
var fileUrl = String.Format("{0}/{1}", list.RootFolder.ServerRelativeUrl, fi.Name);
Microsoft.SharePoint.Client.File.SaveBinaryDirect(clientContext, fileUrl, fs, true);
}
}
Download a file
Download a file from a SharePoint site (including SharePoint Online) using the File.OpenBinaryDirect method:
using (var clientContext = new ClientContext(url))
{
var list = clientContext.Web.Lists.GetByTitle(listTitle);
var listItem = list.GetItemById(listItemId);
clientContext.Load(list);
clientContext.Load(listItem, i => i.File);
clientContext.ExecuteQuery();
var fileRef = listItem.File.ServerRelativeUrl;
var fileInfo = Microsoft.SharePoint.Client.File.OpenBinaryDirect(clientContext, fileRef);
var fileName = Path.Combine(filePath,(string)listItem.File.Name);
using (var fileStream = System.IO.File.Create(fileName))
{
fileInfo.Stream.CopyTo(fileStream);
}
}
This article describes various options for accessing SharePoint content. You have a choice between REST and CSOM. I'd try CSOM if possible. File upload/download specifically is nicely described in this article.
Overall notes:
//First construct client context, the object which will be responsible for
//communication with SharePoint:
var context = new ClientContext(@"http://site.absolute.url");
//then get a hold of the list item you want to download, for example
var list = context.Web.Lists.GetByTitle("Pipeline");
var query = CamlQuery.CreateAllItemsQuery(10000);
var result = list.GetItems(query);
//note that data has not been loaded yet. In order to load the data
//you need to tell SharePoint client what you want to download:
context.Load(result, items=>items.Include(
item => item["Title"],
item => item["FileRef"]
));
//now you get the data
context.ExecuteQuery();
//here you have list items, but not their content (files). To download file
//you'll have to do something like this:
var item = result.First();
//get the URL of the file you want:
var fileRef = item["FileRef"];
//get the file contents:
FileInformation fileInfo = File.OpenBinaryDirect(context, fileRef.ToString());
using (var memory = new MemoryStream())
{
byte[] buffer = new byte[1024 * 64];
int nread = 0;
while ((nread = fileInfo.Stream.Read(buffer, 0, buffer.Length)) > 0)
{
memory.Write(buffer, 0, nread);
}
memory.Seek(0, SeekOrigin.Begin);
// ... here you have the contents of your file in memory,
// do whatever you want
}
Avoid working with the stream directly; read it into memory first. Network-bound streams don't necessarily support stream operations, not to mention performance. So, if you are reading a pic from that stream or parsing a document, you may end up with some unexpected behavior.
On a side note, I have a related question re: performance of this code above, as you are taking some penalty with every file request. See here. And yes, you need the full .NET 4.5 profile for this.
File.OpenBinaryDirect may cause an exception when you are using an OAuth access token, as explained in this article.
Code should be written as below to avoid such exceptions:
Uri filename = new Uri(filepath);
string server = filename.AbsoluteUri.Replace(filename.AbsolutePath, "");
string serverrelative = filename.AbsolutePath;
Microsoft.SharePoint.Client.File file = this.ClientContext.Web.GetFileByServerRelativeUrl(serverrelative);
this.ClientContext.Load(file);
ClientResult<Stream> streamResult = file.OpenBinaryStream();
this.ClientContext.ExecuteQuery();
return streamResult.Value;
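Hypothetical usage of the snippet above, assuming it is wrapped in a method GetFileStream(filepath) on the same class (method name and paths are made up for illustration):
using (Stream spStream = GetFileStream("https://tenant.sharepoint.com/sites/site/Shared Documents/doc.docx"))
using (var localFile = System.IO.File.Create(@"C:\temp\doc.docx"))
{
    spStream.CopyTo(localFile);
}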
A little late with this comment, but I will leave my results here from working with the SharePoint Online library. It is very easy to use and implement in your project: just go to the NuGet package manager for .NET and add Microsoft.SharePoint.CSOM to your project.
https://developer.microsoft.com/en-us/office/blogs/new-sharepoint-csom-version-released-for-office-365-may-2017/
The following code snippet will help you connect with your credentials to your SharePoint site; you can also read and download files from a specific site and folder.
using System;
using System.IO;
using System.Linq;
using System.Web;
using Microsoft.SharePoint.Client;
using System.Security;
using ClientOM = Microsoft.SharePoint.Client;
namespace MvcApplication.Models.Home
{
public class SharepointModel
{
public ClientContext clientContext { get; set; }
private string ServerSiteUrl = "https://somecompany.sharepoint.com/sites/ITVillahermosa";
private string LibraryUrl = "Shared Documents/Invoices/";
private string UserName = "someone.surname@somecompany.com";
private string Password = "********";
private Web WebClient { get; set; }
public SharepointModel()
{
this.Connect();
}
public void Connect()
{
try
{
using (clientContext = new ClientContext(ServerSiteUrl))
{
var securePassword = new SecureString();
foreach (char c in Password)
{
securePassword.AppendChar(c);
}
clientContext.Credentials = new SharePointOnlineCredentials(UserName, securePassword);
WebClient = clientContext.Web;
}
}
catch (Exception ex)
{
throw (ex);
}
}
public string UploadMultiFiles(HttpRequestBase Request, HttpServerUtilityBase Server)
{
try
{
HttpPostedFileBase file = null;
for (int f = 0; f < Request.Files.Count; f++)
{
file = Request.Files[f] as HttpPostedFileBase;
string[] SubFolders = LibraryUrl.Split('/');
string filename = System.IO.Path.GetFileName(file.FileName);
var path = System.IO.Path.Combine(Server.MapPath("~/App_Data/uploads"), filename);
file.SaveAs(path);
clientContext.Load(WebClient, website => website.Lists, website => website.ServerRelativeUrl);
clientContext.ExecuteQuery();
//https://somecompany.sharepoint.com/sites/ITVillahermosa/Shared Documents/
List documentsList = clientContext.Web.Lists.GetByTitle("Documents"); //Shared Documents -> Documents
clientContext.Load(documentsList, i => i.RootFolder.Folders, i => i.RootFolder);
clientContext.ExecuteQuery();
string SubFolderName = SubFolders[1];//Get SubFolder 'Invoice'
var folderToBindTo = documentsList.RootFolder.Folders;
var folderToUpload = folderToBindTo.Where(i => i.Name == SubFolderName).First();
var fileCreationInformation = new FileCreationInformation();
//Assign to content byte[] i.e. documentStream
fileCreationInformation.Content = System.IO.File.ReadAllBytes(path);
//Allow owerwrite of document
fileCreationInformation.Overwrite = true;
//Upload URL
fileCreationInformation.Url = ServerSiteUrl + LibraryUrl + filename;
Microsoft.SharePoint.Client.File uploadFile = documentsList.RootFolder.Files.Add(fileCreationInformation);
//Update the metadata for a field having name "DocType"
uploadFile.ListItemAllFields["Title"] = "UploadedCSOM";
uploadFile.ListItemAllFields.Update();
clientContext.ExecuteQuery();
}
return "";
}
catch (Exception ex)
{
throw (ex);
}
}
public string DownloadFiles()
{
try
{
string tempLocation = @"c:\Downloads\Sharepoint\";
System.IO.DirectoryInfo di = new DirectoryInfo(tempLocation);
foreach (FileInfo file in di.GetFiles())
{
file.Delete();
}
FileCollection files = WebClient.GetFolderByServerRelativeUrl(this.LibraryUrl).Files;
clientContext.Load(files);
clientContext.ExecuteQuery();
if (clientContext.HasPendingRequest)
clientContext.ExecuteQuery();
foreach (ClientOM.File file in files)
{
FileInformation fileInfo = ClientOM.File.OpenBinaryDirect(clientContext, file.ServerRelativeUrl);
clientContext.ExecuteQuery();
var filePath = tempLocation + file.Name;
using (var fileStream = new System.IO.FileStream(filePath, System.IO.FileMode.Create))
{
fileInfo.Stream.CopyTo(fileStream);
}
}
return "";
}
catch (Exception ex)
{
throw (ex);
}
}
}
}
The functions are then invoked from the controller (ASP.NET MVC in this case) in the following way.
using MvcApplication.Models.Home;
using System;
using System.Web.Mvc;
namespace MvcApplication.Controllers
{
public class SharepointController : MvcBoostraBaseController
{
[HttpPost]
public ActionResult Upload(FormCollection form)
{
try
{
SharepointModel sharepointModel = new SharepointModel();
return Json(sharepointModel.UploadMultiFiles(Request, Server), JsonRequestBehavior.AllowGet);
}
catch (Exception ex)
{
return ThrowJSONError(ex);
}
}
public ActionResult Download(string ServerUrl, string RelativeUrl)
{
try
{
SharepointModel sharepointModel = new SharepointModel();
return Json(sharepointModel.DownloadFiles(), JsonRequestBehavior.AllowGet);
}
catch (Exception ex)
{
return ThrowJSONError(ex);
}
}
}
}
If you need this source code, you can visit my GitHub repository:
https://github.com/israelz11/MvcBoostrapTestSharePoint/
Private Sub DownloadFile(relativeUrl As String, destinationPath As String, name As String)
Try
destinationPath = Replace(destinationPath + "\" + name, "\\", "\")
Dim fi As FileInformation = Microsoft.SharePoint.Client.File.OpenBinaryDirect(Me.context, relativeUrl)
Dim down As Stream = System.IO.File.Create(destinationPath)
Dim a As Integer = fi.Stream.ReadByte()
While a <> -1
down.WriteByte(CType(a, Byte))
a = fi.Stream.ReadByte()
End While
Catch ex As Exception
ToLog(Type.ERROR, ex.Message)
End Try
End Sub
Though this is an old post with many answers, here is my version of the code to upload a file to SharePoint 2013 using CSOM (C#).
I assume that if you are working with downloading and uploading files, you know how to create the ClientContext object and the Web object.
/* Assuming you have created ClientContext object and Web object*/
string listTitle = "List title where you want your file to upload";
string filePath = "your file physical path";
List oList = web.Lists.GetByTitle(listTitle);
clientContext.Load(oList.RootFolder);//to load the folder where you will upload the file
FileCreationInformation fileInfo = new FileCreationInformation();
fileInfo.Overwrite = true;
fileInfo.Content = System.IO.File.ReadAllBytes(filePath);
fileInfo.Url = System.IO.Path.GetFileName(filePath);
File fileToUpload = oList.RootFolder.Files.Add(fileInfo);
clientContext.ExecuteQuery();
fileToUpload.CheckIn("your checkin comment", CheckinType.MajorCheckIn);
if (oList.EnableMinorVersions)
{
fileToUpload.Publish("your publish comment");
clientContext.ExecuteQuery();
}
if (oList.EnableModeration)
{
fileToUpload.Approve("your approve comment");
}
clientContext.ExecuteQuery();
And here is the code for download
List oList = web.Lists.GetByTitle("ListNameWhereFileExist");
clientContext.Load(oList);
clientContext.Load(oList.RootFolder);
clientContext.Load(oList.RootFolder.Files);
clientContext.ExecuteQuery();
FileCollection fileCollection = oList.RootFolder.Files;
File SP_file = fileCollection.GetByUrl("fileNameToDownloadWithExtension");
clientContext.Load(SP_file);
clientContext.ExecuteQuery();
var Local_stream = System.IO.File.Open("c:/testing/" + SP_file.Name, System.IO.FileMode.CreateNew);
var fileInformation = File.OpenBinaryDirect(clientContext, SP_file.ServerRelativeUrl);
var Sp_Stream = fileInformation.Stream;
Sp_Stream.CopyTo(Local_stream);
Still, there are different ways, I believe, that can be used to upload and download.
Just a suggestion: SharePoint 2013 (Online and on-prem) expects file encoding UTF-8 with BOM.
Make sure your file is UTF-8 with BOM, otherwise your uploaded HTML and scripts may not render correctly in the browser.
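If you need to force the BOM, you can re-save the file with a BOM-emitting encoding before uploading (a small sketch using only the BCL):
// UTF8Encoding(true) writes the byte order mark; ReadAllText auto-detects the source encoding.
var text = System.IO.File.ReadAllText(filePath);
System.IO.File.WriteAllText(filePath, text, new System.Text.UTF8Encoding(encoderShouldEmitUTF8Identifier: true));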
I would suggest reading some Microsoft documentation on what you can do with CSOM. This might be one example of what you are looking for, but there is a huge API documented on MSDN.
// Starting with ClientContext, the constructor requires a URL to the
// server running SharePoint.
ClientContext context = new ClientContext("http://SiteUrl");
// Assume that the web has a list named "Announcements".
List announcementsList = context.Web.Lists.GetByTitle("Announcements");
// Assume there is a list item with ID=1.
ListItem listItem = announcementsList.Items.GetById(1);
// Write a new value to the Body field of the Announcement item.
listItem["Body"] = "This is my new value!!";
listItem.Update();
context.ExecuteQuery();
From: http://msdn.microsoft.com/en-us/library/fp179912.aspx

Determine if an object exists in a S3 bucket based on wildcard

Can someone please show me how to determine if a certain file/object exists in an S3 bucket, and display a message if it exists or if it does not exist?
Basically I want it to:
1) Check a bucket on my S3 account such as testbucket
2) Inside of that bucket, look to see if there is a file with the prefix test_ (test_file.txt or test_data.txt).
3) If that file exists, then display a MessageBox (or Console message) that the file exists, or that the file does not exist.
Can someone please show me how to do this?
Using the AWS SDK for .NET, I currently do something along the lines of:
public bool Exists(string fileKey, string bucketName)
{
try
{
var response = _s3Client.GetObjectMetadata(new GetObjectMetadataRequest()
.WithBucketName(bucketName)
.WithKey(fileKey));
return true;
}
catch (Amazon.S3.AmazonS3Exception ex)
{
if (ex.StatusCode == System.Net.HttpStatusCode.NotFound)
return false;
//status was something other than NotFound, so rethrow the exception
throw;
}
}
It kinda sucks, but it works for now.
Use the S3FileInfo.Exists method:
using (var client = Amazon.AWSClientFactory.CreateAmazonS3Client(accessKey, secretKey))
{
S3FileInfo s3FileInfo = new Amazon.S3.IO.S3FileInfo(client, "your-bucket-name", "your-file-name");
if (s3FileInfo.Exists)
{
// file exists
}
else
{
// file does not exist
}
}
Not sure if this applies to .NET Framework, but the .NET Core version of AWS SDK (v3) only supports async requests, so I had to use a slightly different solution:
/// <summary>
/// Determines whether a file exists within the specified bucket
/// </summary>
/// <param name="bucket">The name of the bucket to search</param>
/// <param name="filePrefix">Match files that begin with this prefix</param>
/// <returns>True if the file exists</returns>
public async Task<bool> FileExists(string bucket, string filePrefix)
{
// Set this to your S3 region (of course)
var region = Amazon.RegionEndpoint.USEast1;
using (var client = new AmazonS3Client(region))
{
var request = new ListObjectsRequest {
BucketName = bucket,
Prefix = filePrefix,
MaxKeys = 1
};
var response = await client.ListObjectsAsync(request, CancellationToken.None);
return response.S3Objects.Any();
}
}
And, if you want to search a folder:
/// <summary>
/// Determines whether a file exists within the specified folder
/// </summary>
/// <param name="bucket">The name of the bucket to search</param>
/// <param name="folder">The name of the folder to search</param>
/// <param name="filePrefix">Match files that begin with this prefix</param>
/// <returns>True if the file exists</returns>
public async Task<bool> FileExists(string bucket, string folder, string filePrefix)
{
return await FileExists(bucket, $"{folder}/{filePrefix}");
}
Usage:
var testExists = await FileExists("testBucket", "test_");
// or...
var testExistsInFolder = await FileExists("testBucket", "testFolder/testSubFolder", "test_");
This solves it:
List the bucket's existing objects using a prefix, like so.
var request = new ListObjectsRequest()
.WithBucketName(_bucketName)
.WithPrefix(keyPrefix);
var response = _amazonS3Client.ListObjects(request);
var exists = response.S3Objects.Count > 0;
foreach (var obj in response.S3Objects) {
// act
}
I know this question is a few years old, but the new SDK handles this beautifully. If anyone is still searching for this, you are looking for the S3DirectoryInfo class:
using (IAmazonS3 s3Client = new AmazonS3Client(accessKey, secretKey))
{
S3DirectoryInfo s3DirectoryInfo = new Amazon.S3.IO.S3DirectoryInfo(s3Client, "testbucket");
if (s3DirectoryInfo.GetFiles("test*").Any())
{
//file exists -- do something
}
else
{
//file doesn't exist -- do something else
}
}
I know this question is a few years old, but the SDK nowadays handles this in an easier manner.
public async Task<bool> ObjectExistsAsync(string prefix)
{
var response = await _amazonS3.GetAllObjectKeysAsync(_awsS3Configuration.BucketName, prefix, null);
return response.Count > 0;
}
Where _amazonS3 is your IAmazonS3 instance and _awsS3Configuration.BucketName is your bucket name.
You can use your complete key as a prefix.
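One caveat: a prefix match is not an exact match (the key test would also match test123), so if you need exact existence you can compare the returned keys (a sketch reusing the same call as above):
public async Task<bool> ObjectExistsExactlyAsync(string key)
{
    // GetAllObjectKeysAsync returns every key that begins with the prefix,
    // so filter down to an exact match.
    var keys = await _amazonS3.GetAllObjectKeysAsync(_awsS3Configuration.BucketName, key, null);
    return keys.Contains(key);
}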
I used the following code in C# with Amazon S3 version 3.1.5 (.NET 3.5) to check if the bucket exists:
BasicAWSCredentials credentials = new BasicAWSCredentials("accessKey", "secretKey");
AmazonS3Config configurationAmazon = new AmazonS3Config();
configurationAmazon.RegionEndpoint = RegionEndpoint.EUWest1; // or you can use ServiceUrl
AmazonS3Client s3Client = new AmazonS3Client(credentials, configurationAmazon);
S3DirectoryInfo directoryInfo = new S3DirectoryInfo(s3Client, bucketName);
bucketExists = directoryInfo.Exists;// true if the bucket exists in other case false.
I used the following code (in C# with Amazon S3 version 3.1.5, .NET 3.5) to check whether the file exists:
Option 1:
S3FileInfo info = new S3FileInfo(s3Client, "bucketName", "key");
bool fileExists = info.Exists; // true if the key Exists in other case false
Option 2:
ListObjectsRequest request = new ListObjectsRequest();
try
{
    request.BucketName = "bucketName";
    request.Prefix = "prefix"; // or part of the key
    request.MaxKeys = 1; // max limit to find objects
    ListObjectsResponse response = s3Client.ListObjects(request);
    return response.S3Objects.Count > 0;
}
catch (AmazonS3Exception)
{
    return false;
}
I'm not familiar with C#, but I use this method from Java (conversion to C# is immediate):
public boolean exists(AmazonS3 s3, String bucket, String key) {
ObjectListing list = s3.listObjects(bucket, key);
return list.getObjectSummaries().size() > 0;
}
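For reference, an immediate C# conversion might look like this (a sketch using the synchronous ListObjects call from the older .NET Framework SDK):
public bool Exists(IAmazonS3 s3, string bucket, string key)
{
    // Any object whose key starts with "key" counts as a hit, mirroring the Java version.
    var list = s3.ListObjects(new ListObjectsRequest { BucketName = bucket, Prefix = key });
    return list.S3Objects.Count > 0;
}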
my 2 cents
public async Task<bool> DoesKeyExistsAsync(string key)
{
ListObjectsResponse response = null;
try
{
response = await _s3Client.ListObjectsAsync(new ListObjectsRequest { BucketName = _settings.BucketName, Prefix = key });
}
catch (Exception ex)
{
_logger.LogError(ex, $"Error while checking key {key}");
return false;
}
return (response?.S3Objects?.Count > 0);
}
s3 = new S3(S3KEY, S3SECRET, false);
res = s3->getObjectInfo(bucketName, filename);
It will return an array if the file exists (note: the snippet above is PHP, not C#).
try this one:
NameValueCollection appConfig = ConfigurationManager.AppSettings;
AmazonS3 s3Client = AWSClientFactory.CreateAmazonS3Client(
appConfig["AWSAccessKey"],
appConfig["AWSSecretKey"],
Amazon.RegionEndpoint.USEast1
);
S3DirectoryInfo source = new S3DirectoryInfo(s3Client, "BUCKET_NAME", "Key");
if (source.Exists)
{
//do ur stuff
}
using Amazon;
using Amazon.S3;
using Amazon.S3.IO;
using Amazon.S3.Model;
string accessKey = "xxxxx";
string secretKey = "xxxxx";
RegionEndpoint regionEndpoint = RegionEndpoint.EUWest1;
string bucketName = "Bucket1";
string filePath = "https://Bucket1/users/delivery/file.json";
public bool FileExistsOnS3(string filePath)
{
try
{
Uri myUri = new Uri(filePath);
string absolutePath = myUri.AbsolutePath; // /users/delivery/file.json
string key = absolutePath.Substring(1); // users/delivery/file.json
using(var client = AWSClientFactory.CreateAmazonS3Client(accessKey, secretKey, regionEndpoint))
{
S3FileInfo file = new S3FileInfo(client, bucketName, key);
if (file.Exists)
{
return true;
// custom logic
}
else
{
return false;
// custom logic
}
}
}
catch(AmazonS3Exception ex)
{
return false;
}
}
There is an overload of GetFileSystemInfos that takes a search pattern.
Notice this line uses filename.*:
var files= s3DirectoryInfo.GetFileSystemInfos("filename.*");
public bool Check()
{
    var awsCredentials = new Amazon.Runtime.BasicAWSCredentials("AccessKey", "SecretKey");
    using (var client = new AmazonS3Client(awsCredentials, Amazon.RegionEndpoint.USEast1))
    {
        S3DirectoryInfo s3DirectoryInfo = new S3DirectoryInfo(client, bucketName, "YourFilePath");
        var files = s3DirectoryInfo.GetFileSystemInfos("filename.*");
        // true when at least one matching file exists
        return files.Any();
    }
}
