I need to upload a few files from Azure Storage to an external FTP server.
Is there any way with Azure to upload these files directly, without downloading them first?
You will need to use two classes/libraries and create two methods here:
WebClient class to download the file from blob storage to your local drive
an FTP library such as WinSCP to move the file
WebClient Class:
You need to supply the URI parameter in the format: https://[accountname].blob.core.windows.net/[containername]/[filetodownloadincludingextension]
The download location then becomes the source path of the file to be uploaded to your FTP server.
string uri = "https://[accountname].blob.core.windows.net/[containername]/";
string file = "file1.txt";
string downloadLocation = @"C:\";

Log("Downloading File from web...");
using (WebClient webClient = new WebClient())
{
    try
    {
        // The second argument must be the full target file path, not just the folder
        webClient.DownloadFile(new Uri(uri + file), downloadLocation + file);
        Log("Download from web complete");
    }
    catch (Exception ex)
    {
        Log("Error occurred in downloading file. See below for exception details");
        Log(ex.Message);
    }
}
return downloadLocation + file;
Once the file is on your local drive, you need to upload it to your FTP/SFTP server. You may use the WinSCP library for this:
string absPathSource = downloadLocation + file;
string destination = "/root/folder"; // this basically is your FTP path

// Set up session options
SessionOptions sessionOptions = new SessionOptions
{
    Protocol = Protocol.Sftp,
    HostName = ConfigurationManager.AppSettings["scpurl"],
    UserName = ConfigurationManager.AppSettings["scpuser"],
    Password = ConfigurationManager.AppSettings["scppass"].Trim(),
    SshHostKeyFingerprint = ConfigurationManager.AppSettings["scprsa"].Trim()
};

using (Session session = new Session())
{
    // Disable version checking
    session.DisableVersionCheck = true;

    // Connect
    session.Open(sessionOptions);

    // Upload files
    TransferOptions transferOptions = new TransferOptions();
    transferOptions.TransferMode = TransferMode.Binary;

    TransferOperationResult transferResult = session.PutFiles(absPathSource, destination, false, transferOptions);

    // Throw on any error
    transferResult.Check();

    // Print results
    foreach (TransferEventArgs transfer in transferResult.Transfers)
    {
        Console.WriteLine("Upload of {0} succeeded", transfer.FileName);
    }
}
You may include a File.Delete call at the end of the FTP upload code if you want to remove the file from your local hard drive after the upload.
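For example, a minimal sketch reusing the variables from the snippets above:
// Remove the local copy once the upload has succeeded (i.e. after transferResult.Check())
if (File.Exists(absPathSource))
{
    File.Delete(absPathSource);
}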
I came across this question whilst looking for the same answer, and I came up with the following solution:
Get the Azure file as a Stream [handled by Azure Functions for you]
Upload the Stream using WebClient
This allowed me to transfer the file directly from Blob Storage to an FTP server. For me, the Azure Blob file as a Stream was already done, as I was creating an Azure Function based on a blob trigger.
I then converted the Stream to a MemoryStream and passed that to WebClient.UploadData() as a byte array [very roughly something like]:
// ... Get the Azure Blob file in to a Stream called myBlob
// As mentioned above the Azure function does this for you:
// public static void Run([BlobTrigger("containerName/{name}", Connection = "BlobConnection")]Stream myBlob, string name, ILogger log)
public void UploadStreamToFtp(Stream file, string targetFilePath)
{
    using (MemoryStream ms = new MemoryStream())
    {
        // Copy the Stream into a MemoryStream, which exposes ToArray()
        file.CopyTo(ms);

        using (WebClient client = new WebClient())
        {
            // Use login credentials if required
            client.Credentials = new NetworkCredential("username", "password");

            // Upload the data with the FTP STOR method call.
            // targetFilePath is a fully qualified filepath on the FTP,
            // e.g. ftp://targetserver/directory/filename.ext
            client.UploadData(targetFilePath, WebRequestMethods.Ftp.UploadFile, ms.ToArray());
        }
    }
}
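For example, the blob-triggered function above could then hand the stream straight to this method (a rough sketch only; the FTP URL is a placeholder, and UploadStreamToFtp is assumed to be reachable from the function, e.g. made static):
public static void Run([BlobTrigger("containerName/{name}", Connection = "BlobConnection")] Stream myBlob, string name, ILogger log)
{
    // "ftp://targetserver/directory/" is a hypothetical target; substitute your own server and path
    UploadStreamToFtp(myBlob, "ftp://targetserver/directory/" + name);
    log.LogInformation($"Uploaded {name} to FTP");
}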
I'm working on logic to retrieve a file from an Amazon S3 Bucket and then send it to a remote server using SFTP.
However, I'm not able to send the file because I don't have a path value - i.e., since my file is in S3, I don't have a path for it on my local machine.
This is my method to send a file - it's inside a class called SftpService:
public async Task SendFile(SftpRequest request, CancellationToken cancellationToken = default)
{
    // Connect to server
    SftpClient client = new SftpClient(request.Host, request.Port, request.Username, request.Password);
    client.Connect();

    if (!client.IsConnected)
    {
        throw new SftpClientException("SFTP Connection was not complete.");
    }

    // Create an object of File Stream and pass a temp file path.
    var tempFilePath = Path.GetTempFileName();
    FileStream fileStream = new FileStream(tempFilePath, FileMode.Open);

    // Copy the MemoryStream (from S3) to the FileStream
    request.RefundFile.CopyTo(fileStream);
    fileStream.Close();

    // Upload the file. [TBD] What's the path for the file? (it is a MemoryStream)
    client.UploadFile(fileStream, tempFilePath);

    // Dispose the object by calling dispose method of sftpClient once the file has uploaded.
    client.Dispose();
}
Now, I'm trying to perform a test on this method (using NUnit).
[SetUp]
public void Setup()
{
    _sut = new SftpService(); // class under test
}

[Test]
public async Task SftpService_ShouldSendFileToServer_Success()
{
    var request = MockSftpRequest();
    await _sut.SendFile(request);
}

private SftpRequest MockSftpRequest()
{
    var serverMock = new ServerOptions()
    {
        BaseAddress = "195.144.107.198",
        Username = "demo",
        Password = "password",
    };

    // Create a mock MemoryStream
    var s3FileMock = new MemoryStream(Encoding.UTF8.GetBytes("This is a mock of the s3 file."));

    SftpRequest request = new SftpRequest(serverMock)
    {
        RefundFile = s3FileMock,
        Port = 22,
    };

    return request;
}
When running the test, I'm getting this error:
Message: Renci.SshNet.Common.SftpPathNotFoundException : Invalid path.
So, my question is: how can I send a MemoryStream over SFTP, without a local file path?
You have both parameters of SftpClient.UploadFile wrong.
Your immediate issue is that you are passing a local path to the second parameter of SftpClient.UploadFile. Why? Obviously one parameter must be for the local file/data and the other for the remote. You are already passing the local data to the first argument, so SftpClient.UploadFile does not need the local path anymore. It needs the remote path.
And why are you saving the [Memory]Stream to a temporary file, only to open the file as another [File]Stream to pass it to SftpClient.UploadFile? Pass the MemoryStream to SftpClient.UploadFile straight away:
client.UploadFile(request.RefundFile, "/remote/path/file");
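For completeness, a corrected SendFile might look roughly like this (a sketch only; "/remote/path/file.txt" is a placeholder, and RefundFile is assumed to be positioned at the start of the stream):
public Task SendFile(SftpRequest request, CancellationToken cancellationToken = default)
{
    using (SftpClient client = new SftpClient(request.Host, request.Port, request.Username, request.Password))
    {
        client.Connect();
        if (!client.IsConnected)
        {
            throw new SftpClientException("SFTP Connection was not complete.");
        }

        // No temporary file: upload the stream from S3 directly to the remote path
        client.UploadFile(request.RefundFile, "/remote/path/file.txt");

        client.Disconnect();
        return Task.CompletedTask;
    }
}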
Related question: Upload data from memory to SFTP server using SSH.NET
To start with, my goal is to use FTP to retrieve a file and put it on multiple servers. This is intended as a backup procedure where we're taking a file and putting it onto two different backup servers then removing it from the FTP server.
In case this is relevant, the program runs as a service on a server.
I have tried two different methods and gotten 3 different errors so I will detail those. I am open to trying different libraries, whole different methods, and that's why I'm focusing on the overall goal instead of the specific code and errors.
So the first method worked successfully when I tested it locally on my dev laptop; I was able to do everything I wanted. It's worth noting that I debugged as the same domain account that the service runs on in the environments. When I deployed to the test environment, I received the error "The underlying connection was closed: The server committed a protocol violation."
I have (of course) trimmed this code down to what I believe are the relevant parts.
This gets called for each path that's being delivered to. They read similar to:
first case: \\fqdn\directory
second case: \\192.168.123.123\directory$
private bool CopyFTPToPath(string file, string path)
{
    try
    {
        string filePath = "ftp://" + Settings["Host"].Value + "/" + Settings["Path"].Value + "/" + file;

        FtpWebRequest request = (FtpWebRequest)WebRequest.Create(filePath);
        request.Method = WebRequestMethods.Ftp.DownloadFile;
        request.Credentials = new NetworkCredential("user", "pass");

        // Download the file over FTP and write it to the target path
        using (FtpWebResponse response = (FtpWebResponse)request.GetResponse())
        using (Stream responseStream = response.GetResponseStream())
        using (StreamReader reader = new StreamReader(responseStream))
        using (StreamWriter writer = new StreamWriter(path + "/" + file, false))
        {
            writer.Write(reader.ReadToEnd());
        }

        return File.Exists(path + "/" + file);
    }
    catch (Exception ex)
    {
        Utility.LogError(ErrorType.Alert, ex);
        return false;
    }
}
Again, this is successful when I run it locally; it fails when run on the server with the same credentials, with the above protocol violation error. The error occurs when it attempts to create the StreamWriter.
So I tried using a library to handle the connection and grabbed the WinSCP library because I've seen it recommended on here frequently. The code I wrote for that is this:
private bool CopyFTPToPath(string file, string path)
{
    try
    {
        string filePath = "ftp://" + Settings["Host"].Value + "/" + Settings["Path"].Value + "/" + file;

        SessionOptions sessionOptions = new SessionOptions()
        {
            Protocol = Protocol.Ftp,
            HostName = Settings.Settings["Host"].Value,
            UserName = "user",
            Password = "pass"
        };

        using (Session session = new Session())
        {
            session.Open(sessionOptions);

            TransferOptions transferOptions = new TransferOptions();
            transferOptions.TransferMode = TransferMode.Binary;

            TransferEventArgs result = session.GetFileToDirectory(filePath, path, false, transferOptions);
        }

        return File.Exists(path + "/" + file);
    }
    catch (Exception ex)
    {
        Utility.LogError(ErrorType.Alert, ex);
        return false;
    }
}
Now this block of code fails at the session.GetFileToDirectory call, throwing a System.IO.DirectoryNotFoundException on the path. In this case I'm wondering if WinSCP is unable to handle a network path as the local directory path.
Here is a sanitized stack trace from the error, for what that's worth. If the local directory parameter is actually local it seems to work, so I think that's what's going on there.
{System.IO.DirectoryNotFoundException: \\192.***.***.***\Path\Path
at WinSCP.Session.DoGetFilesToDirectory(String remoteDirectory, String localDirectory, String filemask, Boolean remove, TransferOptions options, String additionalParams)
at WinSCP.Session.GetEntryToDirectory(String remoteFilePath, String localDirectory, Boolean remove, TransferOptions options, String additionalParams)
at WinSCP.Session.GetFileToDirectory(String remoteFilePath, String localDirectory, Boolean remove, TransferOptions options)
at Program.RoutineTasks.Backup.CopyFTPToPath(String file, String path) in C:\Projects\Program\RoutineTasks\Backup.cs:line 114}
FTP is my only option to access this file. The file is 130 GB, and I don't have disk space on the server that runs this to make a local copy to hand out. Both of those constraints are outside of my control.
EDIT: I have found a solution that will definitely work once the streams are managed properly; because the files are huge, they must be copied in chunks to prevent running out of memory.
That code is this, in place of the using Session block above:
using (Session session = new Session())
{
    session.Open(sessionOptions);

    TransferOptions transferOptions = new TransferOptions();
    transferOptions.TransferMode = TransferMode.Binary;

    // Copy the remote stream to the local file in chunks, so the whole
    // file is never held in memory. Raw streams are used instead of
    // StreamReader/StreamWriter, which are meant for text and would
    // corrupt binary data.
    using (Stream remote = session.GetFile(filePath, transferOptions))
    using (FileStream local = File.Create(path + "/" + file))
    {
        remote.CopyTo(local);
    }
}
In the end this wound up being related to permissions on the account. When I presented to the sysadmin that the service account's access to the path was behaving inconsistently between my laptop and the test environment, here is what he said:
"you have to configure all non windows shares through the mmc snapin"
The paths I am delivering to are non-Windows. He made a change there, and the service account was then able to access the paths from the test server. This was not something that could have been solved simply by code.
I was given an address like the following to upload a file to an Azure File Share using a Shared Access Signature (SAS):
https://myaccount.file.core.windows.net/xxxxx?sv=2020-08-04&ss=bfqt&srt=so&sp=rwdlacupitfx&se=2022-12-30T18:11:32Z&st=2021-12-12T10:11:32Z&spr=https&sig=signature
This is my test program
using Azure.Storage.Files.Shares;
public async Task TestAsync()
{
    var sas = @"https://myaccount.file.core.windows.net/xxxxx?sv=2020-08-04&ss=bfqt&srt=so&sp=rwdlacupitfx&se=2022-12-30T18:11:32Z&st=2021-12-12T10:11:32Z&spr=https&sig=signature";
    var localfile = @"C:\Test\local.txt";
    var client = new ShareFileClient(new Uri(sas));

    using (var stream = new FileStream(localfile, FileMode.Open, FileAccess.Read))
    {
        var response = await client.UploadAsync(stream);
    }
}
The program throws a RequestFailedException with the following error:
Status: 400 (The requested URI does not represent any resource on the server.)
ErrorCode: InvalidUri
Additional Information:
UriPath: /xxxxx
My question is: what does this error mean, and is there anything wrong in my test code?
According to this document, uploading with that SAS is not possible, as the operation requires authentication. Even with the code correct, it still throws an "Authentication information is not given in the correct format. Check the value of the Authorization header" exception. An alternative way is to use a connection string, which works fine when we try reproducing it on our end.
Here is the code
static async Task Main(string[] args)
{
    CloudStorageAccount storageAccount = CloudStorageAccount.Parse("<YOUR CONNECTION STRING>");
    CloudFileClient fileClient = storageAccount.CreateCloudFileClient();
    CloudFileShare share = fileClient.GetShareReference("<YOUR FILE SHARE>");

    if (await share.ExistsAsync())
    {
        CloudFileDirectory rootDir = share.GetRootDirectoryReference();
        CloudFile file = rootDir.GetFileReference("sample.txt");

        byte[] data = File.ReadAllBytes(@"sample.txt");
        Stream fileStream = new MemoryStream(data);
        await file.UploadFromStreamAsync(fileStream);
    }
}
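For reference, the connection string can be copied from the Azure portal (Storage account > Access keys) and has the form:
DefaultEndpointsProtocol=https;AccountName=myaccount;AccountKey=<your-account-key>;EndpointSuffix=core.windows.net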
Here is a workaround using the storage account key (StorageSharedKeyCredential) that you can try:
using System;
using System.IO;
using Azure.Storage;
using Azure.Storage.Files.Shares;

public class SampleClass
{
    public void Upload()
    {
        // Get this information from your Azure storage account on the Azure portal site
        string accountName = "{account name from the Azure portal}";
        string accessKey = "{access key from the Azure portal}";
        Uri serverurl = new Uri(@"{file share URL from the Azure portal}");

        // Upload destination (Azure side)
        string azureDirectoryPath = @"{directory on the Azure side}";
        string azureFileName = "{file name to save as}";

        // Upload source (local side)
        string localDirectoryPath = @"{local directory of the file to upload}";
        string localFileName = "{local file name to upload}";

        // SSL communication permission setting
        // Without this, an error occurs in SSL (https) communication.
        System.Net.ServicePointManager.SecurityProtocol = System.Net.SecurityProtocolType.Tls11 | System.Net.SecurityProtocolType.Tls12;

        try
        {
            // Preparing to connect to Azure: set the connection credentials
            StorageSharedKeyCredential credential = new StorageSharedKeyCredential(accountName, accessKey);

            // Connect to Azure
            ShareClient share = new ShareClient(serverurl, credential);
            ShareDirectoryClient directory = share.GetDirectoryClient(azureDirectoryPath);

            // Create the destination directory (Azure side) if it does not exist
            directory.CreateIfNotExists();

            // Create a file instance at the destination (Azure side)
            ShareFileClient file = directory.GetFileClient(azureFileName);

            // Delete any file with the same name
            file.DeleteIfExists();

            // Open the local file to upload; FileStream makes it easy to get the binary data
            using (FileStream stream = File.OpenRead(Path.Combine(localDirectoryPath, localFileName)))
            {
                // Allocate the file on the Azure side, then write the binary data into it
                file.Create(stream.Length);
                file.UploadRange(new Azure.HttpRange(0, stream.Length), stream);
            }
        }
        catch (Exception ex)
        {
            System.Console.WriteLine(ex.Message);
        }
    }
}
REFERENCE:
How to programmatically upload files to Azure Storage (File Share)
I have an Azure Timer Trigger function, which generates an excel file, and I need to send it via an email, as an attachment.
I have done the following successfully -
Created the file and uploaded it into an azure blob
Sent email without attachment (Using SmtpClient and MailMessage)
How can I fetch the file from Azure Blob and send it as an attachment?
P.S.
I was able to send it as an attachment when I stored the file in the function's local storage. However, I want to move the storage of the file to Azure Blob.
How can I fetch the file from Azure Blob and send it as an attachment?
Per my understanding, you could download your blob into a temp local file in your function as follows:
// Save blob contents to a file.
using (var fileStream = System.IO.File.OpenWrite(@"path\myfile"))
{
    blockBlob.DownloadToStream(fileStream);
}
Then, you could construct your Attachment as follows:
System.Net.Mail.Attachment attach = new System.Net.Mail.Attachment("{file-path}");
Or you could directly download your blob into the MemoryStream as follows:
using (var memoryStream = new MemoryStream())
{
    blockBlob.DownloadToStream(memoryStream);
    memoryStream.Position = 0;
    System.Net.Mail.Attachment attach = new System.Net.Mail.Attachment(memoryStream, "{attachmentName}", "{contentType}");
}
For details, you could follow Download blobs.
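Putting the pieces together, a rough sketch of downloading the blob and sending it as an attachment in one go (the SMTP host, addresses, and file name here are placeholders; blockBlob is your blob reference from above):
using (var memoryStream = new MemoryStream())
{
    blockBlob.DownloadToStream(memoryStream);
    memoryStream.Position = 0;

    using (var message = new MailMessage("sender@example.com", "recipient@example.com"))
    using (var smtpClient = new SmtpClient("smtp.example.com"))
    {
        message.Subject = "Generated report";
        message.Body = "Please find the report attached.";
        message.Attachments.Add(new Attachment(memoryStream, "report.xlsx",
            "application/vnd.openxmlformats-officedocument.spreadsheetml.sheet"));
        smtpClient.Send(message);
    }
}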
var blob_Archive = Environment.GetEnvironmentVariable("blobName", EnvironmentVariableTarget.Process);
var archiveSheetFile = $"{blob_Archive}/FileName-{DateTime.UtcNow:ddMMyyyy_hh.mm}.csv";

string sBlobNotification = string.Empty;

// file2read is the path of the blob to read, e.g. the archiveSheetFile built above
using (var txReader = await binder.BindAsync<TextReader>(
    new BlobAttribute(file2read, FileAccess.Read)))
{
    if (txReader != null)
        sBlobNotification = await txReader.ReadToEndAsync();
}

log.LogInformation($"{sBlobNotification}");
You can use the above code in a timer trigger function to get the blob content.
Hope this helps!
I have code that allows me to access a server to do FTP transactions. I have tested the connection and it works. The problem is saving files. To help paint a picture, this is how the addresses are set up:
ftp server: ftp.MyMainDomain.com
path login ftp points to: www.another_website_Under_myDomain.com/gallery/images
When I tested the connection, the FTP server takes me directly to the images folder, and I have even read the subdirectories out (i.e. ..images/subdirectory1, ..images/subdirectory2).
What I need now is to be able to save files into each of those folders. I thought that all I had to do was add the subdirectory I wanted to access to the end of the FTP server URI, but that doesn't work. What should I do?
Uri ftpUri = new Uri((Ftp_Server_Address + "/" + SubDirectory + "/"), UriKind.Absolute);
if (ftpUri.Scheme == Uri.UriSchemeFtp) // check ftp address
{
    DirRequest = (FtpWebRequest)FtpWebRequest.Create(ftpUri);
    DirRequest.Method = ReqMethod;
    DirRequest.Credentials = new NetworkCredential(FtpUserName, FtpPassword);
    DirRequest.UsePassive = true;
    DirRequest.UseBinary = true;
    DirRequest.KeepAlive = false;

    // change picture to stream
    Stream PicAsStream = Pic.Bitmap_to_Stream(Pic.BitmapImage_to_Bitmap(Pic.Photo));

    // send ftp with picture
    Stream ftpReqStream = DirRequest.GetRequestStream();
    ftpReqStream = PicAsStream;
    ftpReqStream.Close();

    SendFtpRequest(ReqMethod);
}
In this line:
ftpReqStream = PicAsStream;
you are not sending the Stream to the FTP server; you are assigning PicAsStream to ftpReqStream and then closing it. Your code does nothing.
Do it like this:
Create a buffer from your image file (not a stream):
FileStream ImageFileStream = File.OpenRead(your_file_path_Here);
byte[] ImageBuffer = new byte[ImageFileStream.Length];
ImageFileStream.Read(ImageBuffer, 0, ImageBuffer.Length);
ImageFileStream.Close();
and then simply write it to your ftpReqStream:
Stream ftpReqStream = DirRequest.GetRequestStream();
ftpReqStream.Write(ImageBuffer, 0, ImageBuffer.Length);
ftpReqStream.Close();
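Alternatively, since you already have the picture as a Stream (PicAsStream), you could skip the byte array and copy it into the request stream directly, for example:
using (Stream ftpReqStream = DirRequest.GetRequestStream())
{
    // Rewind in case the stream has already been read
    PicAsStream.Position = 0;
    PicAsStream.CopyTo(ftpReqStream);
}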