I am not able to access any path in my Azure file share folders. Everything I've tried gives the error "Operation: GETFILESTATUS failed with Unknown Error: Name or service not known". Does this code look okay?
var adlsClient = AdlsClient.CreateClient("myDataLakeAccount.azuredatalakestore.net", "Token");
using MemoryStream memoryStream = new MemoryStream();
using StreamWriter streamWriter = new StreamWriter(memoryStream);
streamWriter.WriteLine("Testing file content to insert.");
streamWriter.Flush(); // flush the writer, otherwise the MemoryStream is still empty
using var file = adlsClient.CreateFile("/Folder1/Folder2/Pending/TestFile.txt", IfExists.Overwrite);
byte[] textByteArray = memoryStream.ToArray();
file.Write(textByteArray, 0, textByteArray.Length);
I am using the below snippet of code to add a file to an ADLS Gen2 container. You can use the following:
var storageAccountName = <YourStorageAccountName>;
var storageAccountKey = <YourStorageAccountKey>;
string serviceUri = "https://" + storageAccountName + ".dfs.core.windows.net";
var sampleFilePath = <YourLocalFilePath>;
StorageSharedKeyCredential sharedKeyCredential = new StorageSharedKeyCredential(storageAccountName, storageAccountKey);
// Create DataLakeServiceClient using StorageSharedKeyCredentials
DataLakeServiceClient serviceClient = new DataLakeServiceClient(new Uri(serviceUri), sharedKeyCredential);
// Get a reference to the target filesystem and create it if it does not exist
DataLakeFileSystemClient filesystem = serviceClient.GetFileSystemClient("folder1");
filesystem.CreateIfNotExists();
DataLakeDirectoryClient directory = filesystem.GetDirectoryClient("/folder2/pending/");
directory.CreateIfNotExists();
DataLakeFileClient file = directory.GetFileClient("test1.txt");
file.CreateIfNotExists();
using var sampleFileContent = File.OpenRead(sampleFilePath);
file.Append(sampleFileContent, 0);
file.Flush(sampleFileContent.Length);
After executing the above code I could see the results in the ADLS Gen2 storage account.
Data is written to the file.
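If you are on a recent version of Azure.Storage.Files.DataLake, the Append/Flush pair can also be collapsed into the Upload convenience method; a minimal sketch, assuming the same file client as above:

using var content = File.OpenRead(sampleFilePath);
// Upload performs the append and flush steps in one call; overwrite: true replaces an existing file
file.Upload(content, overwrite: true);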
I have an Azure application built with ASP.NET Core using the MVC pattern. Document uploads are stored in Azure Blob Containers and the C# upload code I wrote is working great.
I am using Azure.Storage.Blobs version 12.14.1
Here is my download blob code:
//get document metadata stored in sql table, id comes from a method param
var document = _unitOfWork.Documents.GetById(id);
if (document == null)
{
throw new FileNotFoundException();
}
//get connection and container from appsettings.json
string connection = _appConfig.AzureStorageConnection;
string containerName = _appConfig.AzureStorageContainer;
//work with blob client
var serviceClient = new BlobServiceClient(connection);
var container = serviceClient.GetBlobContainerClient(containerName);
var fileName = document.UniqueDocumentName;
var blobClient = container.GetBlobClient(document.UniqueDocumentName);
using (FileStream fileStream = System.IO.File.OpenWrite("<path>"))
{
blobClient.DownloadTo(fileStream);
}
After I get to the using block that sets up the file stream, I don't understand what to pass into the OpenWrite method as a path. This application is a B2C app, so how do I just prompt the user to download the file?
I do get a file download with the above code, but the file is called download.xml. That is not the file that should be downloaded; I expected an .odt file.
Documentation seems to be very sparse on downloading from Azure Blob Storage.
EDIT 1.
I got rid of the FileStream and did this instead:
MemoryStream ms = new MemoryStream();
blobClient.DownloadTo(ms);
// rewind after the download, otherwise the returned stream is read from its end
ms.Position = 0;
return new FileStreamResult(ms, document.FileType);
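To have the browser prompt with the real file name and extension instead of download.xml, you can also pass a download name via the File helper; a minimal sketch, reusing the document metadata from the question (substitute the original display name if you store one):

MemoryStream ms = new MemoryStream();
blobClient.DownloadTo(ms);
ms.Position = 0;
// the third argument sets the Content-Disposition header, so the browser saves under this name
return File(ms, document.FileType, document.UniqueDocumentName);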
I have written a console app in C# following this tutorial: https://learn.microsoft.com/en-gb/training/modules/msgraph-access-file-data/3-exercise-access-files-onedrive
Now when I download a file from my OneDrive via the console app using the Microsoft Graph API, every file is saved with the generic ".file" extension, even though the original is e.g. a .docx file.
So how do I ensure that the files get downloaded in their original extension format (.docx, .ppt, .csv, etc.)?
var fileId = "01HLTXGBVIH3R6ILTKF5FKB2EMZKFG3MQ6";
var request = client.Me.Drive.Items[fileId].Content.Request();
var stream = request.GetAsync().Result;
var driveItemPath = Path.Combine(System.IO.Directory.GetCurrentDirectory(), "driveItem_" + fileId + ".file");
var driveItemFile = System.IO.File.Create(driveItemPath);
stream.Seek(0, SeekOrigin.Begin);
stream.CopyTo(driveItemFile);
Console.WriteLine("Saved file to: " + driveItemPath);
Make a request to get the file and read its name property, which holds the item's name (filename and extension).
var fileId = "01HLTXGBVIH3R6ILTKF5FKB2EMZKFG3MQ6";
// make a request to get the file
var file = client.Me.Drive.Items[fileId].Request().GetAsync().Result;
var fileName = file.Name;
var request = client.Me.Drive.Items[fileId].Content.Request();
var stream = request.GetAsync().Result;
// create a file with the same name
var driveItemPath = Path.Combine(System.IO.Directory.GetCurrentDirectory(), fileName);
using (var driveItemFile = System.IO.File.Create(driveItemPath))
{
stream.Seek(0, SeekOrigin.Begin);
stream.CopyTo(driveItemFile);
}
Console.WriteLine("Saved file to: " + driveItemPath);
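The same flow with async/await and using declarations; a sketch that avoids blocking on .Result and makes sure the file handle is flushed and closed:

var fileId = "01HLTXGBVIH3R6ILTKF5FKB2EMZKFG3MQ6";
// fetch the drive item metadata to learn its real name
var driveItem = await client.Me.Drive.Items[fileId].Request().GetAsync();
using var stream = await client.Me.Drive.Items[fileId].Content.Request().GetAsync();
var driveItemPath = Path.Combine(System.IO.Directory.GetCurrentDirectory(), driveItem.Name);
using var driveItemFile = System.IO.File.Create(driveItemPath);
await stream.CopyToAsync(driveItemFile);
Console.WriteLine("Saved file to: " + driveItemPath);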
I was given an address like the following to upload a file to an Azure file share using a Shared Access Signature (SAS):
https://myaccount.file.core.windows.net/xxxxx?sv=2020-08-04&ss=bfqt&srt=so&sp=rwdlacupitfx&se=2022-12-30T18:11:32Z&st=2021-12-12T10:11:32Z&spr=https&sig=signature
This is my test program
using Azure.Storage.Files.Shares;
public async Task TestAsync()
{
var sas = @"https://myaccount.file.core.windows.net/xxxxx?sv=2020-08-04&ss=bfqt&srt=so&sp=rwdlacupitfx&se=2022-12-30T18:11:32Z&st=2021-12-12T10:11:32Z&spr=https&sig=signature";
var localfile = @"C:\Test\local.txt";
var client = new ShareFileClient(new Uri(sas));
using (var stream = new FileStream(localfile, FileMode.Open, FileAccess.Read))
{
var response = await client.UploadAsync(stream);
}
}
The program throws a RequestFailedException with the following error:
Status: 400 (The requested URI does not represent any resource on the server.)
ErrorCode: InvalidUri
Additional Information:
UriPath: /xxxxx
My question is what this error means. Is there anything wrong in my test code?
According to this document, uploading with SAS is not possible, as it requires authentication to do so. Even with the code written correctly, it still throws an "Authentication information is not given in the correct format. Check the value of the Authorization header" exception. An alternative is to use a connection string, which works fine when we reproduce it on our end.
Here is the code
static async Task Main(string[] args) {
CloudStorageAccount storageAccount = CloudStorageAccount.Parse("<YOUR CONNECTION STRING>");
CloudFileClient fileClient = storageAccount.CreateCloudFileClient();
CloudFileShare share = fileClient.GetShareReference("<YOUR FILE SHARE>");
if (await share.ExistsAsync()) {
CloudFileDirectory rootDir = share.GetRootDirectoryReference();
CloudFile file = rootDir.GetFileReference("sample.txt");
byte[] data = File.ReadAllBytes(@"sample.txt");
Stream fileStream = new MemoryStream(data);
await file.UploadFromStreamAsync(fileStream);
}
}
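For reference, the same connection-string upload with the current Azure.Storage.Files.Shares package looks roughly like this (a sketch; the share name and file paths are placeholders):

using Azure.Storage.Files.Shares;
var share = new ShareClient("<YOUR CONNECTION STRING>", "<YOUR FILE SHARE>");
ShareDirectoryClient rootDir = share.GetRootDirectoryClient();
ShareFileClient file = rootDir.GetFileClient("sample.txt");
using FileStream stream = File.OpenRead("sample.txt");
// a share file must be created with its final size before ranges can be written
file.Create(stream.Length);
file.UploadRange(new Azure.HttpRange(0, stream.Length), stream);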
Here is a workaround using the account key (StorageSharedKeyCredential) that you can try:
using System.IO;
using Azure.Storage;
using Azure.Storage.Files.Shares;
public class SampleClass {
public void Upload() {
/// Get information from your Azure storage account
string accountName = "{Get the account name from the Azure portal.}";
string accessKey = "{Get the access key from the Azure portal.}";
Uri serverurl = new Uri(@"{Get the URL from the Azure portal.}");
/// Upload destination (Azure side)
string azureDirectoryPath = @"{Specify the destination directory on the Azure side}";
string azureFileName = "{Specify the file name to save}";
/// Upload source (local side)
string localDirectoryPath = @"{Specify the local source directory}";
string localFileName = "{Specify the local source file name}";
// Allow TLS 1.1/1.2; without this an SSL (https) error can occur.
System.Net.ServicePointManager.SecurityProtocol = System.Net.SecurityProtocolType.Tls11 | System.Net.SecurityProtocolType.Tls12;
try {
//Preparing to connect to Azure: Setting connection information
StorageSharedKeyCredential credential = new StorageSharedKeyCredential(accountName, accessKey);
//Connect to Azure
ShareClient share = new ShareClient(serverurl, credential);
ShareDirectoryClient directory = share.GetDirectoryClient(azureDirectoryPath);
//Create the destination folder on the Azure side if it does not exist
directory.CreateIfNotExists();
//Create a file instance at the destination
ShareFileClient file = directory.GetFileClient(azureFileName);
//Delete any existing file with the same name
file.DeleteIfExists();
//Open the local file to upload; a FileStream makes the binary content easy to read
FileStream stream = File.OpenRead(Path.Combine(localDirectoryPath, localFileName));
//Write the binary content to the destination file
file.Create(stream.Length);
file.UploadRange(new Azure.HttpRange(0, stream.Length), stream);
//Release the local file
stream.Dispose();
} catch (Exception ex) {
System.Console.WriteLine(ex.Message);
return;
}
}
}
REFERENCE:
How to programmatically upload files to Azure Storage (File Share)
I wanted to know how to download all documents from a SharePoint list using the SharePoint client object model (CSOM, Microsoft.SharePoint.Client) and the list's full URL.
For example, if the URL was http://teamhub.myorg.local/sites/teams/it/ISLibrary/Guides/
Is it possible to connect directly to that URL and retrieve all documents stored there?
I have tried the below code, but I am getting an error; it also seems to require that I split the URL into two parts.
string baseURL = "http://teamhub.myorg.local/sites/";
string listURL = "teams/it/ISLibrary/Guides/";
var ctx = new ClientContext(baseURL);
ctx.Credentials = new SharePointOnlineCredentials(userName, SecuredpassWord);
var list = ctx.Web.GetList(listURL);
ctx.Load(list);
ctx.ExecuteQuery();
Console.WriteLine(list.Title);
When I run this code I simply get a "File not found" error.
Can it be done by simply passing in the full url somewhere?
I will need to make this connection and get all documents hundreds of times over for many different lists, so it would be best if there were a way to do it using the full URL.
Any advice is appreciated. Thanks
Microsoft.SharePoint.Client.Web.GetListByUrl takes a web-relative URL. For example:
My site: https://tenant.sharepoint.com/sites/TST, library: https://tenant.sharepoint.com/sites/TST/MyDoc4
So the code would be:
Web web = clientContext.Web;
var lib = web.GetListByUrl("/MyDoc4");
The listURL you shared seems to be a folder, so we can get the folder and the files in it as below:
Web web = clientContext.Web;
Folder folder = web.GetFolderByServerRelativeUrl("/sites/TST/MyDoc4/Folder");
var files = folder.Files;
clientContext.Load(files);
clientContext.ExecuteQuery();
Download the files:
foreach (var file in files)
{
clientContext.Load(file);
Console.WriteLine(file.Name);
ClientResult<Stream> stream = file.OpenBinaryStream();
clientContext.ExecuteQuery();
var fileOut = Path.Combine(localPath, file.Name);
if (!System.IO.File.Exists(fileOut))
{
using (Stream fileStream = new FileStream(fileOut, FileMode.Create))
{
CopyStream(stream.Value, fileStream);
}
}
}
private static void CopyStream(Stream src, Stream dest)
{
byte[] buf = new byte[8192];
for (; ; )
{
int numRead = src.Read(buf, 0, buf.Length);
if (numRead == 0)
break;
dest.Write(buf, 0, numRead);
}
}
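If you only have the full URL, one option is to derive the server-relative path from it with Uri.AbsolutePath; a sketch, assuming the site URL from the question is http://teamhub.myorg.local/sites/teams/it:

string fullUrl = "http://teamhub.myorg.local/sites/teams/it/ISLibrary/Guides/";
// yields the server-relative part, e.g. "/sites/teams/it/ISLibrary/Guides/"
string serverRelativeUrl = new Uri(fullUrl).AbsolutePath;
using (var ctx = new ClientContext("http://teamhub.myorg.local/sites/teams/it"))
{
ctx.Credentials = new SharePointOnlineCredentials(userName, SecuredpassWord);
Folder folder = ctx.Web.GetFolderByServerRelativeUrl(serverRelativeUrl);
ctx.Load(folder.Files);
ctx.ExecuteQuery();
// folder.Files can then be saved with the same download loop as above
}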
I am writing an Azure Function that moves files from AWS S3 to Azure Data Lake. I have the download working and the upload working, but I am struggling to piece the two together, because I don't want to store the file in the intermediate app, so to speak; the Azure Function itself does not need to store it, just pass it on.
It's not easy to explain, so please bear with me a little while I try to describe what I want to do.
When I download from S3 using this code
await client.GetObjectAsync(new GetObjectRequest { BucketName = bucketName, Key = entry.Key });
I don't have a file system to store it on, and I don't want to store it; I want it as some sort of "object" that I can pass directly to the Azure Data Lake writer, which looks like this:
adlsFileSystemClient.FileSystem.UploadFile(adlsAccountName, source, destination, 1, false, true);
The code works fine if I download to my local disk and then upload it, but that's not what I want, since the Azure Function has no storage. I want to pass the downloaded object directly to the uploader, so to speak.
How can I achieve this?
**** EDIT ****
// Process the response.
foreach (S3Object entry in response.S3Objects)
{
Console.WriteLine("key = {0} size = {1}", entry.Key.Split('/').Last(), entry.Size);
string fileNameOnly = entry.Key.Split('/').Last();
//await client.GetObjectAsync(new GetObjectRequest { BucketName = bucketName, Key = entry.Key });
GetObjectResponse getObjResponse = await client.GetObjectAsync(bucketName, entry.Key);
MemoryStream stream = new MemoryStream();
getObjResponse.ResponseStream.CopyTo(stream);
if (entry.Key.Contains("MerchandiseHierarchy") == true)
{
WriteToAzureDataLake(stream, @"/PIMRAW/MerchandiseHierarchy/" + fileNameOnly);
}
}
I then pass the memory stream to the Azure method, but I need a stream uploader and I cannot find one; the following complains that it cannot convert Stream to string:
adlsFileSystemClient.FileSystem.UploadFile(adlsAccountName, source, destination, 1, false, true);
**** EDIT 2 ****
I changed the upload method as follows and it creates the file at the destination, but with size 0, so I am wondering if I am creating it before the download is done?
static void WriteToAzureDataLake(MemoryStream inputSource, string inputDestination)
{
// 1. Set Synchronization Context
SynchronizationContext.SetSynchronizationContext(new SynchronizationContext());
// 2. Create credentials to authenticate requests as an Active Directory application
var clientCredential = new ClientCredential(clientId, clientSecret);
var creds = ApplicationTokenProvider.LoginSilentAsync(tenantId, clientCredential).Result;
// 3. Initialise Data Lake Store File System Client
adlsFileSystemClient = new DataLakeStoreFileSystemManagementClient(creds);
// 4. Upload a file to the Data Lake Store
//var source = #"c:\nwsys\source.txt";
var source = inputSource;
//var destination = "/PIMRAW/MerchandiseHierarchy/destination.txt";
var destination = inputDestination;
//adlsFileSystemClient.FileSystem.UploadFile(adlsAccountName, source, destination, 1, false, true);
adlsFileSystemClient.FileSystem.Create(adlsAccountName, destination, source);
// FINISHED
Console.WriteLine("6. Finished!");
}
Regarding "it creates the file at destination but with 0 size":
You need to set the stream position to 0 before writing to Data Lake.
stream.Position = 0;
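In context, that means rewinding the MemoryStream after copying from S3 and before handing it to the Data Lake client, roughly like this:

GetObjectResponse getObjResponse = await client.GetObjectAsync(bucketName, entry.Key);
MemoryStream stream = new MemoryStream();
getObjResponse.ResponseStream.CopyTo(stream);
// CopyTo leaves the position at the end, so Create would otherwise write 0 bytes
stream.Position = 0;
adlsFileSystemClient.FileSystem.Create(adlsAccountName, destination, stream);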