Error InvalidUri when uploading a file to Azure File Share - C#

I was given an address like the following to upload a file to an Azure File Share using a Shared Access Signature (SAS):
https://myaccount.file.core.windows.net/xxxxx?sv=2020-08-04&ss=bfqt&srt=so&sp=rwdlacupitfx&se=2022-12-30T18:11:32Z&st=2021-12-12T10:11:32Z&spr=https&sig=signature
This is my test program:
using Azure.Storage.Files.Shares;

public async Task TestAsync()
{
    var sas = @"https://myaccount.file.core.windows.net/xxxxx?sv=2020-08-04&ss=bfqt&srt=so&sp=rwdlacupitfx&se=2022-12-30T18:11:32Z&st=2021-12-12T10:11:32Z&spr=https&sig=signature";
    var localfile = @"C:\Test\local.txt";
    var client = new ShareFileClient(new Uri(sas));
    using (var stream = new FileStream(localfile, FileMode.Open, FileAccess.Read))
    {
        var response = await client.UploadAsync(stream);
    }
}
The program throws a RequestFailedException with the following error:
Status: 400 (The requested URI does not represent any resource on the server.)
ErrorCode: InvalidUri
Additional Information:
UriPath: /xxxxx
My question is: what does this error mean? Is there anything wrong with my test code?

According to this document, uploading with that SAS is not possible, as the operation requires authentication. Even with the code written correctly, it still throws an "Authentication information is not given in the correct format. Check the value of the Authorization header" exception. An alternative is to use a connection string, which worked fine when we reproduced the scenario on our end.
Here is the code:
static async Task Main(string[] args)
{
    CloudStorageAccount storageAccount = CloudStorageAccount.Parse("<YOUR CONNECTION STRING>");
    CloudFileClient fileClient = storageAccount.CreateCloudFileClient();
    CloudFileShare share = fileClient.GetShareReference("<YOUR FILE SHARE>");
    if (await share.ExistsAsync())
    {
        CloudFileDirectory rootDir = share.GetRootDirectoryReference();
        CloudFile file = rootDir.GetFileReference("sample.txt");
        byte[] data = File.ReadAllBytes(@"sample.txt");
        Stream fileStream = new MemoryStream(data);
        await file.UploadFromStreamAsync(fileStream);
    }
}
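If you prefer to stay on the newer Azure.Storage.Files.Shares package that the question already uses, a roughly equivalent connection-string sketch might look like this (the connection string, share name, and file paths are placeholders):

```csharp
using System.IO;
using System.Threading.Tasks;
using Azure.Storage.Files.Shares;

public static class ShareUploadSample
{
    public static async Task UploadAsync()
    {
        // Placeholders: substitute your own connection string, share, and paths.
        var share = new ShareClient("<YOUR CONNECTION STRING>", "<YOUR FILE SHARE>");

        // Get a client for a file in the share's root directory.
        ShareDirectoryClient rootDir = share.GetRootDirectoryClient();
        ShareFileClient file = rootDir.GetFileClient("sample.txt");

        using FileStream stream = File.OpenRead(@"C:\Test\local.txt");
        // A share file must be created with its final length before content is uploaded.
        await file.CreateAsync(stream.Length);
        await file.UploadRangeAsync(new Azure.HttpRange(0, stream.Length), stream);
    }
}
```

Note that the file client here points at an actual file path inside the share, which is what the `UriPath: /xxxxx` in the error was missing.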
Here is a workaround using the account key that you can try:
using System;
using System.IO;
using Azure.Storage;
using Azure.Storage.Files.Shares;

public class SampleClass
{
    public void Upload()
    {
        // Get this information from your Azure storage account in the portal
        string accountName = "{Get the account name from the Azure portal site.}";
        string accessKey = "{Get the access key from the Azure portal site.}";
        Uri serverurl = new Uri("{Get the URL from the Azure portal.}");
        // Upload destination (Azure side)
        string azureDirectoryPath = "{Specify the destination directory on the Azure side}";
        string azureFileName = "{Specify the file name to save}";
        // Upload source (local side)
        string localDirectoryPath = "{Specify the local source directory}";
        string localFileName = "{Specify the local source file name}";
        // SSL communication permission setting.
        // Without this, an error occurs in SSL (https) communication.
        System.Net.ServicePointManager.SecurityProtocol = System.Net.SecurityProtocolType.Tls11 | System.Net.SecurityProtocolType.Tls12;
        try
        {
            // Prepare to connect to Azure: set the connection credentials
            StorageSharedKeyCredential credential = new StorageSharedKeyCredential(accountName, accessKey);
            // Connect to Azure
            ShareClient share = new ShareClient(serverurl, credential);
            ShareDirectoryClient directory = share.GetDirectoryClient(azureDirectoryPath);
            // Create the destination folder on the Azure side if it does not exist
            directory.CreateIfNotExists();
            // Create a file instance at the destination
            ShareFileClient file = directory.GetFileClient(azureFileName);
            // Delete any existing file with the same name
            file.DeleteIfExists();
            // Open the local file to be uploaded; a FileStream makes it easy to read the binary content
            FileStream stream = File.OpenRead(Path.Combine(localDirectoryPath, localFileName));
            // Write the binary content into the destination file instance
            file.Create(stream.Length);
            file.UploadRange(new Azure.HttpRange(0, stream.Length), stream);
            // Release the local file
            stream.Dispose();
        }
        catch (Exception ex)
        {
            System.Console.WriteLine(ex.Message);
            return;
        }
    }
}
REFERENCE:
How to programmatically upload files to Azure Storage (File Share)

Related

Getting "Invalid path" when sending MemoryStream from Amazon S3 bucket to a remote server using SFTP in C#

I'm working on logic to retrieve a file from an Amazon S3 Bucket and then send it to a remote server using SFTP.
However, I'm not able to send the file because I don't have a path value - i.e., since my file is in S3, I don't have a path for it on my local machine.
This is my method to send a file - it's inside a class called SftpService:
public async Task SendFile(SftpRequest request, CancellationToken cancellationToken = default)
{
    // Connect to server
    SftpClient client = new SftpClient(request.Host, request.Port, request.Username, request.Password);
    client.Connect();
    if (!client.IsConnected)
    {
        throw new SftpClientException("SFTP Connection was not complete.");
    }
    // Create an object of File Stream and pass a temp file path.
    var tempFilePath = Path.GetTempFileName();
    FileStream fileStream = new FileStream(tempFilePath, FileMode.Open);
    // Copy the MemoryStream (from S3) to the FileStream
    request.RefundFile.CopyTo(fileStream);
    fileStream.Close();
    // Upload the file. [TBD] What's the path for the file? (it is a MemoryStream)
    client.UploadFile(fileStream, tempFilePath);
    // Dispose the object by calling dispose method of sftpClient once the file has uploaded.
    client.Dispose();
}
Now, I'm trying to perform a test on this method (using NUnit).
[SetUp]
public void Setup()
{
    _sut = new SftpService(); // class under test
}

[Test]
public async Task SftpService_ShouldSendFileToServer_Success()
{
    var request = MockSftpRequest();
    await _sut.SendFile(request);
}

private SftpRequest MockSftpRequest()
{
    var serverMock = new ServerOptions()
    {
        BaseAddress = "195.144.107.198",
        Username = "demo",
        Password = "password",
    };
    // Create a mock MemoryStream
    var s3FileMock = new MemoryStream(Encoding.UTF8.GetBytes("This is a mock of the s3 file."));
    SftpRequest request = new SftpRequest(serverMock)
    {
        RefundFile = s3FileMock,
        Port = 22,
    };
    return request;
}
When running the test, I'm getting this error:
Message: Renci.SshNet.Common.SftpPathNotFoundException : Invalid path.
So, my question is: how can I achieve to send a MemoryStream over SFTP, without a path?
You have both parameters of SftpClient.UploadFile wrong.
Your immediate issue is that you are passing a local path as the second parameter of SftpClient.UploadFile. Why? One parameter must be for the local file/data and the other for the remote path. You are already passing the local data as the first argument, so SftpClient.UploadFile does not need the local path anymore - it needs the remote path.
Also, why are you saving the MemoryStream to a temporary file only to open that file as another FileStream and pass it to SftpClient.UploadFile? Pass the MemoryStream to SftpClient.UploadFile straight away:
client.UploadFile(request.RefundFile, "/remote/path/file");
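Putting the fix together, a corrected version of the method could look something like this (simplified to synchronous, since SSH.NET's UploadFile is blocking; the remote path "/upload/refund-file.txt" is a hypothetical destination on the SFTP server):

```csharp
public void SendFile(SftpRequest request)
{
    using (var client = new SftpClient(request.Host, request.Port, request.Username, request.Password))
    {
        client.Connect();
        if (!client.IsConnected)
        {
            throw new SftpClientException("SFTP Connection was not complete.");
        }
        // Rewind the MemoryStream in case it has already been read
        request.RefundFile.Position = 0;
        // Upload the in-memory stream straight to a *remote* path; no temp file needed
        client.UploadFile(request.RefundFile, "/upload/refund-file.txt");
        client.Disconnect();
    }
}
```

The `using` block also ensures the client is disposed even if the upload throws, which the original method did not guarantee.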
Related question: Upload data from memory to SFTP server using SSH.NET

Azure Data Lake Storage File Share File Path

I am not able to access any path in the Azure file share folders. Everything I've tried gives me the error "Operation: GETFILESTATUS failed with Unknown Error: Name or service not known". Does this code look okay?
var adlsClient = AdlsClient.CreateClient("myDataLakeAccount.azuredatalakestore.net", "Token");
using MemoryStream memoryStream = new MemoryStream();
using StreamWriter streamWriter = new StreamWriter(memoryStream);
streamWriter.WriteLine("Testing file content to insert.");
using var file = adlsClient.CreateFile("/Folder1/Folder2/Pending/TestFile.txt", IfExists.Overwrite);
byte[] textByteArray = memoryStream.ToArray();
file.Write(textByteArray, 0, textByteArray.Length);
I am using the below snippet of code to add a file to an ADLS Gen2 container. You can use the following:
var storageAccountName = "<YourStorageAccountName>";
var storageAccountKey = "<YourStorageAccountKey>";
string serviceUri = "https://" + storageAccountName + ".dfs.core.windows.net";
var sampleFilePath = "<YourLocalFilePath>";
StorageSharedKeyCredential sharedKeyCredential = new StorageSharedKeyCredential(storageAccountName, storageAccountKey);
// Create a DataLakeServiceClient using StorageSharedKeyCredential
DataLakeServiceClient serviceClient = new DataLakeServiceClient(new Uri(serviceUri), sharedKeyCredential);
// Get a reference to the "folder1" filesystem and create it if it does not exist
DataLakeFileSystemClient filesystem = serviceClient.GetFileSystemClient("folder1");
filesystem.CreateIfNotExists();
DataLakeDirectoryClient directory = filesystem.GetDirectoryClient("/folder2/pending/");
directory.CreateIfNotExists();
DataLakeFileClient file = directory.GetFileClient("test1.txt");
file.CreateIfNotExists();
var sampleFileContent = File.OpenRead(sampleFilePath);
file.Append(sampleFileContent, 0);
file.Flush(sampleFileContent.Length);
After executing the above code I could see the results in the ADLS Gen2 storage account - the data is written to the file.
Refer here
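For the in-memory snippet from the question, the same Append/Flush calls work on a MemoryStream; the snippet just needs the StreamWriter flushed and the stream rewound before writing, otherwise zero bytes get appended. A sketch (the AppendText helper and its client parameter are illustrative, not part of the original code):

```csharp
using System.IO;
using Azure.Storage.Files.DataLake;

static class AdlsAppendSample
{
    // 'file' is a DataLakeFileClient like the one obtained in the snippet above
    public static void AppendText(DataLakeFileClient file)
    {
        using var memoryStream = new MemoryStream();
        using var streamWriter = new StreamWriter(memoryStream);
        streamWriter.WriteLine("Testing file content to insert.");
        streamWriter.Flush();        // push buffered text into the MemoryStream
        memoryStream.Position = 0;   // rewind before handing the stream to Append
        file.CreateIfNotExists();
        file.Append(memoryStream, 0);
        file.Flush(memoryStream.Length);
    }
}
```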

Why cant I upload to an existing Azure container?

I am new to working with Azure via C# and I am looking to upload an existing file to an existing Azure storage container. I am currently able to create a new local file and then upload it to a container that I create, as seen in the CreateContainerAndUploadFile method. However, when I try to upload an existing file to an existing container, it does not work: with the CreateContainerAndUploadFile method I see the new container and .txt file appear, but the UploadFile method runs all the way through with no errors and I do not see the file appear in the container.
If anyone knows why the method is running through but not uploading I would greatly appreciate the help.
public class Launcher
{
    public static void Main(string[] args)
    {
        BlobManager manager = new BlobManager();
        manager.UploadFile("DemoText.txt", "democontainer");
        manager.CreateContainerAndUploadFile("demo");
    }
}

public class BlobManager
{
    private BlobServiceClient blobServiceClient;

    public BlobManager()
    {
        try
        {
            // Get azure table storage connection string.
            string connectionString = ConfigurationManager.AppSettings.Get("StorageConnectionString");
            blobServiceClient = new BlobServiceClient(connectionString);
        }
        catch (Exception ExceptionObj)
        {
            throw ExceptionObj;
        }
    }

    public void UploadFile(string fileName, string container)
    {
        Console.WriteLine("Entering Upload to existing blob");
        BlobContainerClient containerClient = blobServiceClient.GetBlobContainerClient(container);
        containerClient.CreateIfNotExistsAsync();
        Console.WriteLine("Existing blob obtained");
        string localPath = "./";
        string localFilePath = Path.Combine(localPath, fileName);
        Console.WriteLine("Local file path set");
        //// Create a block blob
        BlobClient blobClient = containerClient.GetBlobClient(fileName);
        Console.WriteLine($"BlobClient established with filename {fileName}");
        //// Open the file and upload its data
        using FileStream uploadFileStream = File.OpenRead(localFilePath);
        blobClient.UploadAsync(uploadFileStream, true);
        uploadFileStream.Close();
        Console.WriteLine("Uploaded File to existing container");
    }

    public void CreateContainerAndUploadFile(string containerName)
    {
        // Create a BlobServiceClient object which will be used to create a container client
        BlobServiceClient blobServiceClient =
            new BlobServiceClient(ConfigurationManager.AppSettings.Get("StorageConnectionString"));
        Console.WriteLine("Pre-container");
        // Create the container and return a container client object
        BlobContainerClient containerClient = blobServiceClient.CreateBlobContainer(containerName + Guid.NewGuid());
        Console.WriteLine("Post-container");
        // Create a local file in the ./data/ directory for uploading and downloading
        string localPath = "./";
        string fileName = "demo" + Guid.NewGuid().ToString() + ".txt";
        string localFilePath = Path.Combine(localPath, fileName);
        // Write text to the file
        File.WriteAllTextAsync(localFilePath, "Hello, World!");
        // Get a reference to the blob
        BlobClient blobClient = containerClient.GetBlobClient(fileName);
        Console.WriteLine($"Uploading to Blob storage as blob: {fileName}");
        //// Open the file and upload its data
        using FileStream uploadFileStream = File.OpenRead(localFilePath);
        blobClient.UploadAsync(uploadFileStream, true);
        uploadFileStream.Close();
    }
}
The reason you're not able to upload the blob is because you're calling an async function but not waiting for it to complete.
Considering your calling method is sync and not async, I would recommend changing the following line of code:
blobClient.UploadAsync(uploadFileStream, true);
to
blobClient.Upload(uploadFileStream, true);
Another alternative would be to convert the wrapper methods to async and await the completion of all async method calls.
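An async version of the upload portion of UploadFile might look roughly like this (a sketch, keeping the question's field and path conventions):

```csharp
public async Task UploadFileAsync(string fileName, string container)
{
    BlobContainerClient containerClient = blobServiceClient.GetBlobContainerClient(container);
    // Await container creation too; the original fire-and-forget call had the same problem
    await containerClient.CreateIfNotExistsAsync();

    string localFilePath = Path.Combine("./", fileName);
    BlobClient blobClient = containerClient.GetBlobClient(fileName);

    using FileStream uploadFileStream = File.OpenRead(localFilePath);
    // Awaiting here ensures the upload actually completes before the method returns
    await blobClient.UploadAsync(uploadFileStream, overwrite: true);
}
```

The caller then needs to await it as well, e.g. an `async Task Main` calling `await manager.UploadFileAsync("DemoText.txt", "democontainer");`.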

Move files from Azure Storage blob to an Ftp server

I need to upload a few files from Azure Storage to an external Ftp server.
Is there any way with Azure to upload these files directly, without downloading them first?
You will need to use two classes/libraries and create two methods here:
The WebClient class to download the file from blob storage to your local drive
An FTP library such as WinSCP to move the file
WebClient class:
You need to supply the URI parameter in the format https://[accountname].blob.core.windows.net/[containername]/[filetodownloadincludingextension]
The download location then becomes the source location of the file to be uploaded to your FTP server.
string uri = "https://[accountname].blob.core.windows.net/[containername]/";
string file = "file1.txt";
string downloadLocation = @"C:\";
WebClient webClient = new WebClient();
Log("Downloading File from web...");
try
{
    // The destination must be a file path, not just the directory
    webClient.DownloadFile(new Uri(uri + file), downloadLocation + file);
    Log("Download from web complete");
    webClient.Dispose();
}
catch (Exception ex)
{
    Log("Error Occurred in downloading file. See below for exception details");
    Log(ex.Message);
    webClient.Dispose();
}
return downloadLocation + file;
Once the file is on your local drive, you need to upload it to your FTP/SFTP server. You can use the WinSCP library for this:
string absPathSource = downloadLocation + file;
string destination = "/root/folder"; // this is your FTP path
// Set up session options
SessionOptions sessionOptions = new SessionOptions
{
    Protocol = Protocol.Sftp,
    HostName = ConfigurationManager.AppSettings["scpurl"],
    UserName = ConfigurationManager.AppSettings["scpuser"],
    Password = ConfigurationManager.AppSettings["scppass"].Trim(),
    SshHostKeyFingerprint = ConfigurationManager.AppSettings["scprsa"].Trim()
};
using (Session session = new Session())
{
    // Disable version checking
    session.DisableVersionCheck = true;
    // Connect
    session.Open(sessionOptions);
    // Upload files
    TransferOptions transferOptions = new TransferOptions();
    transferOptions.TransferMode = TransferMode.Binary;
    TransferOperationResult transferResult;
    transferResult = session.PutFiles(absPathSource, destination, false, transferOptions);
    // Throw on any error
    transferResult.Check();
    // Print results
    foreach (TransferEventArgs transfer in transferResult.Transfers)
    {
        //Console.WriteLine("Upload of {0} succeeded", transfer.FileName);
    }
}
You may include a File.Delete code at the end of the upload to FTP code if you want to delete the file from your local hard drive after the upload.
I came across this question whilst looking for the same answer, and I came up with the following solution:
Get the Azure file as a Stream (handled by Azure Functions for you)
Upload the Stream using WebClient
This allowed me to transfer the file directly from Blob Storage to an FTP client. In my case, getting the Azure Blob file as a Stream was already done, as I was creating an Azure Function based on a blob trigger.
I then converted the Stream to a MemoryStream and passed that to WebClient.UploadData() as a byte array, very roughly like this:
// ... Get the Azure Blob file into a Stream called myBlob
// As mentioned above, the Azure Function does this for you:
// public static void Run([BlobTrigger("containerName/{name}", Connection = "BlobConnection")]Stream myBlob, string name, ILogger log)
public void UploadStreamToFtp(Stream file, string targetFilePath)
{
    using (MemoryStream ms = new MemoryStream())
    {
        // Copy the Stream to a MemoryStream, which can hand back the bytes via ToArray()
        file.CopyTo(ms);
        using (WebClient client = new WebClient())
        {
            // Use login credentials if required
            client.Credentials = new NetworkCredential("username", "password");
            // Upload the data with the STOR method call
            // targetFilePath is a fully qualified file path on the FTP server, e.g. ftp://targetserver/directory/filename.ext
            client.UploadData(targetFilePath, WebRequestMethods.Ftp.UploadFile, ms.ToArray());
        }
    }
}

Downloading from Azure Blob storage in C#

I have a very basic but working Azure Blob uploader/downloader built on C# ASP.net.
Except the download portion does not work. This block is called by the webpage and I simply get no response. The uploads are a mixture of images and raw files. I'm looking for the user to get prompted to select a destination and just have the file download to their machine. Can anyone see where I am going wrong?
[HttpPost]
public void DownloadFile(string Name)
{
    Uri uri = new Uri(Name);
    string filename = System.IO.Path.GetFileName(uri.LocalPath);
    CloudBlobContainer blobContainer = _blobStorageService.GetCloudBlobContainer();
    CloudBlockBlob blob = blobContainer.GetBlockBlobReference(filename);
    using (Stream outputFile = new FileStream("Downloaded.jpg", FileMode.Create))
    {
        blob.DownloadToStream(outputFile);
    }
}