My code looks like this:
CloudFileClient client = ...;
client.GetShareReference("fileStorageShare")
    .GetRootDirectoryReference()
    .GetDirectoryReference("one/two/three")
    .Create();
This errors if directories one or two don't exist. Is there a way to create these nested directories with a single call?
It is not possible in a single call; the SDK does not support it, so you have to create the directories one by one.
An issue requesting this has already been submitted here.
To create them one by one, you can use the following sample code:
static void NestedDirectoriesTest()
{
    var cred = new StorageCredentials(accountName, accountKey);
    var account = new CloudStorageAccount(cred, true);
    var client = account.CreateCloudFileClient();
    var share = client.GetShareReference("temp2");
    share.CreateIfNotExists();
    var cloudFileDirectory = share.GetRootDirectoryReference();
    // Specify the nested folder
    var nestedFolderStructure = "Folder/SubFolder";
    var delimiter = new char[] { '/' };
    var nestedFolderArray = nestedFolderStructure.Split(delimiter);
    for (var i = 0; i < nestedFolderArray.Length; i++)
    {
        cloudFileDirectory = cloudFileDirectory.GetDirectoryReference(nestedFolderArray[i]);
        cloudFileDirectory.CreateIfNotExists();
        Console.WriteLine(cloudFileDirectory.Name + " created...");
    }
}
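The same loop handles the one/two/three path from the question: each GetDirectoryReference/CreateIfNotExists pair walks one level deeper, and CreateIfNotExists is a no-op for levels that already exist.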
Following the advice of Ivan Yang, I adapted my code using Azure.Storage.Files.Shares (Version=12.2.3.0).
Here's my contribution:
readonly string storageConnectionString = "yourConnectionString";
readonly string shareName = "yourShareName";

public string StoreFile(string dirName, string fileName, Stream fileContent)
{
    // Get a reference to the share and create it if it doesn't exist
    ShareClient share = new ShareClient(storageConnectionString, shareName);
    share.CreateIfNotExists();

    // Here goes the nested directories builder: create each level one at a time
    string[] arrayPath = dirName.Split('/');
    string buildPath = string.Empty;
    ShareDirectoryClient directory = null;
    for (int i = 0; i < arrayPath.Length; i++)
    {
        buildPath += arrayPath[i];
        directory = share.GetDirectoryClient(buildPath);
        directory.CreateIfNotExists();
        buildPath += '/';
    }

    // Get a reference to the file and upload it
    ShareFileClient file = directory.GetFileClient(fileName);
    using (Stream stream = fileContent)
    {
        file.Create(stream.Length);
        file.UploadRange(new HttpRange(0, stream.Length), stream);
    }
    return directory.Path;
}
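A minimal usage sketch (the directory path and file name here are made up for illustration):
// Hypothetical call: stores report.pdf under invoices/2020/oct on the share
using (var content = System.IO.File.OpenRead("report.pdf"))
{
    var storedPath = StoreFile("invoices/2020/oct", "report.pdf", content);
    Console.WriteLine("Stored at: " + storedPath);
}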
Here is a simplified version of Hagen's code:
public async Task<ShareFileClient> CreateFileClient(string connection, string shareName, string path)
{
    var share = new ShareClient(connection, shareName);
    await share.CreateIfNotExistsAsync();
    var dir = share.GetRootDirectoryClient();
    var pathChain = path.Split(Path.DirectorySeparatorChar, Path.AltDirectorySeparatorChar);
    int dirCount = pathChain.Length - 1;
    for (int i = 0; i < dirCount; ++i)
    {
        dir = dir.GetSubdirectoryClient(pathChain[i]);
        await dir.CreateIfNotExistsAsync();
    }
    return dir.GetFileClient(pathChain[dirCount]);
}
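A possible usage sketch (connection string, share name and path are placeholders); note the returned client still has to create and fill the file:
// Hypothetical call: get a client for one/two/three/data.txt, then upload into it
var file = await CreateFileClient(connectionString, "fileStorageShare", "one/two/three/data.txt");
using (var stream = System.IO.File.OpenRead("data.txt"))
{
    await file.CreateAsync(stream.Length);
    await file.UploadRangeAsync(new HttpRange(0, stream.Length), stream);
}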
My task is to download a .txt file, remove some of the data, and save it as .json. Currently I use one interface to download the file as .txt and another to read all its lines, turn them into an object, and remove what I don't need.
So what happens now is that I have to hit the download endpoint to get my .txt, then the edit endpoint to parse it and save it as .json.
This is the filter method:
public void FilterOutInvalidRates(string path)
{
    try
    {
        var targetLocation = @"TargetLocation/";
        var name = File.ReadAllLines(path).First();
        var fileName = name.Substring(0, 3);
        var lines = File.ReadAllLines(path).Skip(2);
        var model = lines.Select(p => new Rates
        {
            a = p.Split('|')[0],
            b = p.Split('|')[1],
            c = p.Split('|')[2],
            d = p.Split('|')[3],
            e = p.Split('|')[4],
        });
        List<Rates> rates = model.Where(m => IsTheCountryValid(m.Country)).ToList();
        var jsonString = Newtonsoft.Json.JsonConvert.SerializeObject(rates.ToArray(), Formatting.Indented);
        System.IO.File.WriteAllText(targetLocation + fileName + ".json", jsonString);
        System.IO.File.Delete(path);
    }
    catch (FileNotFoundException)
    {
        Console.WriteLine(@"Unable to read file: " + path);
    }
}
And this is the download service:
public async void DownloadRatesTxtFile(string uri, string outputPath, int number)
{
    var policy = BuildRetryPolicy();
    var path = await policy.ExecuteAsync(() => uri
        .DownloadFileAsync(outputPath, @"CurrencyRate" + number + ".txt"));
}
This is the controller implementation. I know it's not ideal; if I could do it all in one endpoint or with a different architecture, I could get rid of a ton of code. The issue is that before the endpoint returns OK, I am not able to manipulate or read the files in any way.
[HttpGet("download")]
public async Task<IActionResult> Download()
{
    var fileCounter = 1;
    var outputPath2 = AppDomain.CurrentDomain.BaseDirectory + @"Data/";
    var outputPath = @"TargetLocation/";
    DateTime begindate = Convert.ToDateTime("01/07/2022");
    DateTime enddate = Convert.ToDateTime("5/07/2022");
    while (begindate < enddate)
    {
        if (begindate.DayOfWeek == DayOfWeek.Saturday)
        {
            begindate = begindate.AddDays(2);
        }
        _exchangeRateConnector.DownloadRatesTxtFile(_exchangeRateConnector.GenerateRatesUrl(begindate), outputPath2, fileCounter);
        begindate = begindate.AddDays(1);
        fileCounter++;
    }
    return Ok();
}
[HttpGet("edit")]
public async Task<IActionResult> Edit()
{
    var outputPath2 = AppDomain.CurrentDomain.BaseDirectory + @"Data/";
    var outputPath = @"TargetLocation/";
    var fileCount = Directory.EnumerateFiles(outputPath2, "*.txt", SearchOption.TopDirectoryOnly).Count();
    for (int i = 1; i <= fileCount; i++)
    {
        _exchangeRateRepository.FilterOutInvalidRates(outputPath2 + "CurrencyRate" + i + ".txt");
    }
    return Ok();
}
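For what it's worth, the symptom described above is consistent with DownloadRatesTxtFile being async void: the controller never awaits it, so Download() can return OK before any file has been written. A rough sketch of the single-endpoint flow, assuming the connector method were changed to return Task so it can be awaited (all other names reuse the snippets above):
// Sketch only: assumes "public async Task DownloadRatesTxtFile(...)" instead of async void
[HttpGet("download-and-edit")]
public async Task<IActionResult> DownloadAndEdit()
{
    var dataPath = AppDomain.CurrentDomain.BaseDirectory + @"Data/";
    var fileCounter = 1;
    DateTime begindate = Convert.ToDateTime("01/07/2022");
    DateTime enddate = Convert.ToDateTime("5/07/2022");
    while (begindate < enddate)
    {
        if (begindate.DayOfWeek == DayOfWeek.Saturday)
        {
            begindate = begindate.AddDays(2);
        }
        // Awaiting here guarantees the file exists before it is filtered
        await _exchangeRateConnector.DownloadRatesTxtFile(_exchangeRateConnector.GenerateRatesUrl(begindate), dataPath, fileCounter);
        _exchangeRateRepository.FilterOutInvalidRates(dataPath + "CurrencyRate" + fileCounter + ".txt");
        begindate = begindate.AddDays(1);
        fileCounter++;
    }
    return Ok();
}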
Thank you so much for any sort of advice.
When my AWS credentials file (see docs) is updated by an external process, the AmazonSQSClient doesn't re-read it, and SendMessageAsync fails with a security/token error.
We use a custom PowerShell script to refresh the local AWS credentials file periodically. The script works fine, and the file is refreshed before the credentials expire on AWS. However, if my app is running when the file is refreshed, the new credentials are not re-read from the file; the client shows that the previous credentials are still in use.
The AWS docs list several AWSCredentials providers, but none of them seem to be the correct choice, I think.
Restarting the app works: the new credentials are read correctly and messages are sent until the next time the credentials file is updated.
using (var client = new AmazonSQSClient(Amazon.RegionEndpoint.EUWest1))
{
    return client.SendMessageAsync(request);
}
I don't think there is a way for a running app to pick up the default credentials being refreshed in the credentials file. There is a solution for Node.js that loads credentials from a JSON file; you can create a similar solution in C#. You could also run a local DB to store the credentials, so that whenever the credentials file is updated, the DB table or JSON file is updated as well. You will need to pass the access key and secret key to your SQS client constructor instead of relying on the default credentials.
// Load these from a JSON file or DB.
var accessKey = "";
var secretKey = "";
using (var client = new AmazonSQSClient(accessKey, secretKey, Amazon.RegionEndpoint.EUWest1))
{
    return client.SendMessageAsync(request);
}
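A minimal sketch of the JSON-file idea; the file name aws-keys.json and its shape are made up for illustration:
// aws-keys.json (hypothetical): { "accessKey": "...", "secretKey": "..." }
public static (string AccessKey, string SecretKey) LoadKeys(string jsonPath)
{
    // Re-read the file on every call so refreshed keys are picked up
    using (var doc = System.Text.Json.JsonDocument.Parse(System.IO.File.ReadAllText(jsonPath)))
    {
        var root = doc.RootElement;
        return (root.GetProperty("accessKey").GetString(),
                root.GetProperty("secretKey").GetString());
    }
}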
The following works "ok", but I've only tested it with one profile, and the file watcher is not as timely as you'd like, so I'd recommend wrapping your usage in a retry mechanism.
// Usage..
var credentials = new AwsCredentialsFile();
using (var client = new AmazonSQSClient(credentials, Amazon.RegionEndpoint.EUWest1))
{
    return client.SendMessageAsync(request);
}
public class AwsCredentialsFile : AWSCredentials
{
    // https://docs.aws.amazon.com/sdk-for-net/v2/developer-guide/net-dg-config-creds.html#creds-file
    private const string DefaultProfileName = "default";

    private static ConcurrentDictionary<string, ImmutableCredentials> _credentials = new ConcurrentDictionary<string, ImmutableCredentials>(StringComparer.OrdinalIgnoreCase);
    private static FileSystemWatcher _watcher = BuildFileSystemWatcher();

    private readonly System.Text.Encoding _encoding;
    private readonly string _profileName;

    public AwsCredentialsFile()
        : this(AwsCredentialsFile.DefaultProfileName, System.Text.Encoding.UTF8)
    {
    }

    public AwsCredentialsFile(string profileName)
        : this(profileName, System.Text.Encoding.UTF8)
    {
    }

    public AwsCredentialsFile(string profileName, System.Text.Encoding encoding)
    {
        _profileName = profileName;
        _encoding = encoding;
    }

    private static FileSystemWatcher BuildFileSystemWatcher()
    {
        var watcher = new FileSystemWatcher
        {
            Path = Path.GetDirectoryName(GetDefaultCredentialsFilePath()),
            NotifyFilter = NotifyFilters.LastWrite,
            Filter = "credentials"
        };
        // Drop the cache whenever the credentials file changes on disk
        watcher.Changed += (object source, FileSystemEventArgs e) => { _credentials?.Clear(); };
        watcher.EnableRaisingEvents = true;
        return watcher;
    }

    public static string GetDefaultCredentialsFilePath()
    {
        return System.Environment.ExpandEnvironmentVariables(@"C:\Users\%USERNAME%\.aws\credentials");
    }

    public static (string AccessKey, string SecretAccessKey, string Token) ReadCredentialsFromFile(string profileName, System.Text.Encoding encoding)
    {
        var profile = $"[{profileName}]";
        string awsAccessKeyId = null;
        string awsSecretAccessKey = null;
        string token = null;
        var lines = File.ReadAllLines(GetDefaultCredentialsFilePath(), encoding);
        for (int i = 0; i < lines.Length; i++)
        {
            var text = lines[i];
            if (text.Equals(profile, StringComparison.OrdinalIgnoreCase))
            {
                awsAccessKeyId = lines[i + 1].Replace("aws_access_key_id = ", string.Empty);
                awsSecretAccessKey = lines[i + 2].Replace("aws_secret_access_key = ", string.Empty);
                if (lines.Length > i + 3) // lines[i + 3] must exist before the token can be read
                {
                    token = lines[i + 3].Replace("aws_session_token = ", string.Empty);
                }
                break;
            }
        }
        var result = (AccessKey: awsAccessKeyId, SecretAccessKey: awsSecretAccessKey, Token: token);
        return result;
    }

    public override ImmutableCredentials GetCredentials()
    {
        if (_credentials.TryGetValue(_profileName, out ImmutableCredentials value))
        {
            return value;
        }
        else
        {
            var (AccessKey, SecretAccessKey, Token) = ReadCredentialsFromFile(_profileName, _encoding);
            var credentials = new ImmutableCredentials(AccessKey, SecretAccessKey, Token);
            _credentials.TryAdd(_profileName, credentials);
            return credentials;
        }
    }
}
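Since the watcher can lag behind the file write, a simple retry around the send is one option. A hand-rolled sketch (the retry count and delay are arbitrary):
// Hypothetical wrapper: retries the send, re-reading credentials on each attempt
public static async Task<SendMessageResponse> SendWithRetryAsync(SendMessageRequest request, int attempts = 3)
{
    for (int attempt = 1; ; attempt++)
    {
        try
        {
            using (var client = new AmazonSQSClient(new AwsCredentialsFile(), Amazon.RegionEndpoint.EUWest1))
            {
                return await client.SendMessageAsync(request);
            }
        }
        catch (AmazonSQSException) when (attempt < attempts)
        {
            // Credentials may be mid-refresh; wait briefly and try again
            await Task.Delay(TimeSpan.FromSeconds(1));
        }
    }
}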
I'm fairly new to coding and C#. I'm building an app that accesses Google spreadsheets via an API; I turn this data into XML, zip it, and write it to a stream. It works fine, but instead of adding every spreadsheet ID manually to my list, I want to make another API call that retrieves every ID in my Google Sheets account, so that I can store them in a list and update it every time I run the app.
I'm banging my head off a wall looking for answers, but I think this is not available via the Sheets API. Maybe it's possible through the Drive API? Any help would be greatly appreciated. I'm not looking for anyone to write my code, but please point me in the right direction, or am I trying to do something that cannot be done?
public class GoogleSheetsReader
{
    private string apiKey;

    public Stream OutStream { get; set; }
    List<string> SpreadSheetIdList { get; set; }

    public GoogleSheetsReader(string apiKey)
    {
        this.apiKey = apiKey;
    }

    public Stream Main()
    {
        SheetsService sheetsService = new SheetsService(new BaseClientService.Initializer
        {
            HttpClientInitializer = GetCredential(),
            ApplicationName = "My Project To XML",
            ApiKey = apiKey,
        });
        using (MemoryStream memory = new MemoryStream())
        {
            using (ZipArchive zip = new ZipArchive(memory, ZipArchiveMode.Create))
            {
                SpreadSheetIdList = new List<string>(); // not the best place for the list, just testing functionality
                SpreadSheetIdList.Add("1oZlfj6XZOrPs9Qti9K9iKL6itZChM8dlwwJFSvBNzUc");
                SpreadSheetIdList.Add("1oU3sjd7QoOgQ2PvmC7NxciyM1MRXns6-Z9vMayFgOjU");
                foreach (var spreadSheetId in SpreadSheetIdList)
                {
                    var ssRequest = sheetsService.Spreadsheets.Get(spreadSheetId);
                    Data.Spreadsheet service = ssRequest.Execute();
                    foreach (var sheet in service.Sheets)
                    {
                        var sheetName = sheet.Properties.Title;
                        SpreadsheetsResource.ValuesResource.GetRequest request = sheetsService.Spreadsheets.Values.Get(spreadSheetId, sheetName);
                        SpreadsheetsResource.GetRequest requestForSpreadSheet = sheetsService.Spreadsheets.Get(spreadSheetId);
                        requestForSpreadSheet.Ranges = sheetName;
                        Data.Spreadsheet response1 = requestForSpreadSheet.Execute();
                        var spreadSheetName = response1.Properties.Title;
                        Data.ValueRange response = request.Execute();
                        ZipArchiveEntry entry = zip.CreateEntry($"{spreadSheetName}\\{sheetName}.xml");
                        using (Stream zipFile = entry.Open())
                        {
                            byte[] data = Encoding.ASCII.GetBytes(SchemaMaker(response)); // SchemaMaker converts my sheet data to the required XML format
                            zipFile.Write(data, 0, data.Length);
                        }
                    }
                }
            }
            OutStream = new MemoryStream(memory.ToArray());
        }
        return OutStream;
    }

    public string SchemaMaker(ValueRange _param)
    {
        var result = "<TABLE>";
        var headers = _param.Values[0];
        for (int i = 1; i < _param.Values.Count; i++)
        {
            result = result + "<ROW>";
            for (int j = 0; j < _param.Values[i].Count; j++)
            {
                result = result + $"<{headers[j]}>{_param.Values[i][j]}</{headers[j]}>";
            }
            result = result + "</ROW>";
        }
        result = result + "</TABLE>";
        var element = XElement.Parse(result);
        var xmlFormat = element.ToString();
        return xmlFormat;
    }

    public static UserCredential GetCredential()
    {
        string[] Scopes = { SheetsService.Scope.SpreadsheetsReadonly };
        return null;
    }
}
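Enumerating spreadsheet IDs does go through the Drive API rather than the Sheets API. A minimal sketch using the Google.Apis.Drive.v3 package, assuming a DriveService authorized with a Drive read scope (the initializer mirrors the Sheets one above):
// Hypothetical helper: collects the IDs of all spreadsheets the account can see
public List<string> GetAllSpreadsheetIds(DriveService driveService)
{
    var ids = new List<string>();
    var request = driveService.Files.List();
    request.Q = "mimeType = 'application/vnd.google-apps.spreadsheet'";
    request.Fields = "nextPageToken, files(id, name)";
    do
    {
        var response = request.Execute();
        foreach (var file in response.Files)
        {
            ids.Add(file.Id);
        }
        request.PageToken = response.NextPageToken; // follow paging until exhausted
    } while (!string.IsNullOrEmpty(request.PageToken));
    return ids;
}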
I'm using Octokit.net to build my own GitHub client, and my question is: how can I delete files in the repository when they aren't in the local folder anymore?
This is the code I already have:
using System;
using System.Collections.Generic;
using System.Linq;
using System.Threading.Tasks;
using System.Reactive.Linq;
using System.IO;
using Octokit;
using Octokit.Internal;
using Octokit.Reactive;

class Program
{
    static void Main(string[] args)
    {
        MainAsync(args[0], args[1], args[2], args[3]).Wait();
    }

    static async Task MainAsync(string rootPath, string owner, string repo, string branch)
    {
        var github = new GitHubClient(new ProductHeaderValue("repository"));
        var user = await github.User.Get("username");
        github.Credentials = new Credentials("token");

        var contents = await github.Repository.Content.GetAllContents(owner, repo);
        foreach (var githubFilePath in contents)
        {
            Console.WriteLine(githubFilePath);
        }
        Console.WriteLine(contents);

        UpdateFull().Wait();

        // Update a whole folder
        async Task UpdateFull()
        {
            updateGithub(rootPath).Wait();

            async Task updateGithub(string path)
            {
                var localContents = Directory.GetFiles(path);
                var localContentFolders = Directory.GetDirectories(path);
                var headMasterRef = "heads/master";
                var masterReference = await github.Git.Reference.Get(owner, repo, headMasterRef);
                var latestCommit = await github.Git.Commit.Get(owner, repo, masterReference.Object.Sha);
                var nt = new NewTree { BaseTree = latestCommit.Tree.Sha };

                // Add items based on blobs
                AddDirectoryToTree(path, path + "\\", nt).Wait();

                var rootTree = await github.Git.Tree.Create(owner, repo, nt);

                // Create commit
                var newCommit = new NewCommit("Commit test with several files", rootTree.Sha, masterReference.Object.Sha);
                var commit = await github.Git.Commit.Create(owner, repo, newCommit);
                await github.Git.Reference.Update(owner, repo, headMasterRef, new ReferenceUpdate(commit.Sha));
            }

            async Task AddDirectoryToTree(string DirectoryPath, string RootPath, NewTree Tree)
            {
                // All direct files in the folder
                var localContents = Directory.GetFiles(DirectoryPath);
                foreach (var filePath in localContents)
                {
                    string sExtension = Path.GetExtension(filePath).ToUpper();
                    string sFilename = filePath.Replace(RootPath, "").Replace(@"\", "/");
                    Console.WriteLine("File(Local): " + sFilename);
                    switch (sExtension)
                    {
                        default:
                            // Create text blob
                            var textBlob = new NewBlob { Encoding = EncodingType.Utf8, Content = File.ReadAllText(filePath) };
                            var textBlobRef = await github.Git.Blob.Create(owner, repo, textBlob);
                            Tree.Tree.Add(new NewTreeItem { Path = sFilename, Mode = "100644", Type = TreeType.Blob, Sha = textBlobRef.Sha });
                            break;
                        case ".PNG":
                        case ".JPEG":
                        case ".JPG":
                            // For an image, get the content and convert it to Base64
                            var imgBase64 = Convert.ToBase64String(File.ReadAllBytes(filePath));
                            // Create image blob
                            var imgBlob = new NewBlob { Encoding = EncodingType.Base64, Content = imgBase64 };
                            var imgBlobRef = await github.Git.Blob.Create(owner, repo, imgBlob);
                            Tree.Tree.Add(new NewTreeItem { Path = sFilename, Mode = "100644", Type = TreeType.Blob, Sha = imgBlobRef.Sha });
                            break;
                    }
                }

                // Subfolders
                var localContentFolders = Directory.GetDirectories(DirectoryPath);
                foreach (string subDirPath in localContentFolders)
                {
                    AddDirectoryToTree(subDirPath, RootPath, Tree).Wait();
                }
            }
        }

        Console.WriteLine("-----------------------------------");
        Console.WriteLine("Successfully committed files");
        Console.ReadKey();
    }
}
Do I maybe need to add another foreach over the GitHub repository contents, with an if statement that compares the repository against the local folder? :o
I would appreciate it if you guys could help me with that! :D
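Comparing the two sides is the usual approach. A rough sketch, assuming the github, owner, repo, branch and rootPath variables from above: it walks the repository contents recursively and deletes any file with no local counterpart (note DeleteFile makes one commit per deletion):
// Sketch: remove repo files that no longer exist in the local folder
async Task DeleteRemovedFiles(string repoPath = "")
{
    var remoteContents = string.IsNullOrEmpty(repoPath)
        ? await github.Repository.Content.GetAllContents(owner, repo)
        : await github.Repository.Content.GetAllContents(owner, repo, repoPath);

    foreach (var item in remoteContents)
    {
        if (item.Type == ContentType.Dir)
        {
            await DeleteRemovedFiles(item.Path); // recurse into subdirectories
        }
        else if (item.Type == ContentType.File)
        {
            // Map the repository path back to a local path and test for existence
            var localPath = Path.Combine(rootPath, item.Path.Replace("/", "\\"));
            if (!File.Exists(localPath))
            {
                await github.Repository.Content.DeleteFile(owner, repo, item.Path,
                    new DeleteFileRequest("Delete " + item.Path, item.Sha, branch));
            }
        }
    }
}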
Is there any way to get the list of files together with the parent folder name, without making additional queries?
My Files.List() code sample:
var sheets = new List<File>();
var sheetsRequest = Service.Files.List();
sheetsRequest.Fields = "nextPageToken, files(id, name, owners, parents)";
sheetsRequest.Q = $"mimeType = '{mimeType}' and name = '{name}'";
sheetsRequest.PageSize = 500;

var sheetsFeed = await sheetsRequest.ExecuteAsync();
while (sheetsFeed.Files != null)
{
    foreach (var file in sheetsFeed.Files)
    {
        sheets.Add(file);
    }
    if (sheetsFeed.NextPageToken == null)
        break;
    sheetsRequest.PageToken = sheetsFeed.NextPageToken;
    sheetsFeed = await sheetsRequest.ExecuteAsync();
}
return sheets;
Because for now I'm making a GetFileParentDirAsync call for each File using the following method (and it's very slow):
public async Task<string> GetFileParentDirAsync(File file)
{
    return await Task.Run(async () =>
    {
        var folderName = string.Empty;
        if (file.Parents?.First() == null) return folderName;
        var parentId = file.Parents.First();
        var parentRequest = Service.Files.Get(parentId);
        var parentResponse = await parentRequest.ExecuteAsync();
        folderName = parentResponse.Name;
        return folderName;
    });
}
Thanks.
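One way to avoid the per-file lookups: the Drive API won't return the parent's name inline, but all folders can be fetched in a single extra query and parents resolved from an id-to-name map. A rough sketch under that assumption:
// Sketch: one folders query instead of one Files.Get per file
public async Task<Dictionary<string, string>> GetFolderNamesAsync()
{
    var folderNames = new Dictionary<string, string>();
    var request = Service.Files.List();
    request.Q = "mimeType = 'application/vnd.google-apps.folder'";
    request.Fields = "nextPageToken, files(id, name)";
    request.PageSize = 500;
    do
    {
        var feed = await request.ExecuteAsync();
        foreach (var folder in feed.Files)
        {
            folderNames[folder.Id] = folder.Name;
        }
        request.PageToken = feed.NextPageToken;
    } while (!string.IsNullOrEmpty(request.PageToken));
    return folderNames;
}
// Usage: folderNames.TryGetValue(file.Parents.First(), out var parentName)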