I have a FileSystemWatcher for a local directory and it's working fine. I want to implement the same for FTP. Is there any way I can achieve it? I have checked many solutions, but none of them is clear.
Logic: I want to get files from FTP that are newer than some timestamp.
Problem faced: Getting all files from FTP and then filtering the result hurts performance (I used FtpWebRequest).
Is there a right way to do this? (WinSCP is on hold; I can't use it now.)
FileSystemWatcher oFsWatcher = new FileSystemWatcher();
OFSWatchers.Add(oFsWatcher);
oFsWatcher.Path = sFilePath;
oFsWatcher.Filter = string.IsNullOrWhiteSpace(sFileFilter) ? "*.*" : sFileFilter;
oFsWatcher.NotifyFilter = NotifyFilters.FileName;
oFsWatcher.IncludeSubdirectories = bIncludeSubdirectories;
oFsWatcher.Created += new FileSystemEventHandler(OFsWatcher_Created);
oFsWatcher.EnableRaisingEvents = true;
You cannot use the FileSystemWatcher or any other way, because the FTP protocol does not have any API to notify a client about changes in the remote directory.
All you can do is to periodically iterate the remote tree and find changes.
It's actually rather easy to implement, if you use an FTP client library that supports recursive listing of a remote tree. Unfortunately, the built-in .NET FTP client, the FtpWebRequest does not. But for example with WinSCP .NET assembly, you can use the Session.EnumerateRemoteFiles method.
See the article Watching for changes in SFTP/FTP server:
// Setup session options
SessionOptions sessionOptions = new SessionOptions
{
Protocol = Protocol.Ftp,
HostName = "example.com",
UserName = "user",
Password = "password",
};
using (Session session = new Session())
{
// Connect
session.Open(sessionOptions);
List<string> prevFiles = null;
while (true)
{
// Collect file list
List<string> files =
session.EnumerateRemoteFiles(
"/remote/path", "*.*", EnumerationOptions.AllDirectories)
.Select(fileInfo => fileInfo.FullName)
.ToList();
if (prevFiles == null)
{
// In the first round, just print number of files found
Console.WriteLine("Found {0} files", files.Count);
}
else
{
// Then look for differences against the previous list
IEnumerable<string> added = files.Except(prevFiles);
if (added.Any())
{
Console.WriteLine("Added files:");
foreach (string path in added)
{
Console.WriteLine(path);
}
}
IEnumerable<string> removed = prevFiles.Except(files);
if (removed.Any())
{
Console.WriteLine("Removed files:");
foreach (string path in removed)
{
Console.WriteLine(path);
}
}
}
prevFiles = files;
Console.WriteLine("Sleeping 10s...");
Thread.Sleep(10000);
}
}
(I'm the author of WinSCP)
Though, if you actually want to just download the changes, it's way easier. Just use Session.SynchronizeDirectories in a loop.
while (true)
{
SynchronizationResult result =
session.SynchronizeDirectories(
SynchronizationMode.Local, @"C:\local\path", "/remote/path", true);
result.Check();
// You can inspect result.Downloads for a list of updated files
Console.WriteLine("Sleeping 10s...");
Thread.Sleep(10000);
}
This will update even modified files, not only new files.
Though using the WinSCP .NET assembly from a web application might be problematic. If you do not want to use a 3rd party library, you have to live with the limitations of the FtpWebRequest. For an example of how to recursively list a remote directory tree with the FtpWebRequest, see my answer to List names of files in FTP directory and its subdirectories.
You have edited your question to say that you have performance problems with the solutions I've suggested. Though you have already asked a new question that covers this:
Get FTP file details based on datetime in C#
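For completeness, here is a rough sketch of that timestamp-based filtering with the WinSCP .NET assembly (it assumes an open Session as in the example above; the cut-off value is a placeholder):
DateTime threshold = DateTime.Now.AddMinutes(-10); // placeholder cut-off
IEnumerable<RemoteFileInfo> newFiles =
    session.EnumerateRemoteFiles(
        "/remote/path", "*.*", EnumerationOptions.AllDirectories)
        .Where(fileInfo => fileInfo.LastWriteTime > threshold);
foreach (RemoteFileInfo fileInfo in newFiles)
{
    Console.WriteLine("{0} ({1})", fileInfo.FullName, fileInfo.LastWriteTime);
}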
Unless you have access to the OS that hosts the service, it will be a bit harder.
FileSystemWatcher places a hook on the filesystem, which notifies your application as soon as something happens.
The FTP command specification does not have such a hook. Besides that, FTP is always initiated by the client.
Therefore, to implement such logic, you should periodically perform an NLST to list the FTP directory contents and track the changes (or modification times, perhaps via MDTM) yourself; a rough sketch follows.
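For illustration, a rough polling sketch with the FtpWebRequest (hostname, credentials, and path are placeholders; assumes the usual System.Net, System.IO, System.Linq, and System.Threading usings):
HashSet<string> previous = null;
while (true)
{
    // NLST: plain name listing of the watched directory
    FtpWebRequest request = (FtpWebRequest)WebRequest.Create("ftp://example.com/remote/path/");
    request.Credentials = new NetworkCredential("user", "password");
    request.Method = WebRequestMethods.Ftp.ListDirectory;

    HashSet<string> current = new HashSet<string>();
    using (FtpWebResponse response = (FtpWebResponse)request.GetResponse())
    using (StreamReader reader = new StreamReader(response.GetResponseStream()))
    {
        string line;
        while ((line = reader.ReadLine()) != null)
        {
            current.Add(line);
        }
    }

    if (previous != null)
    {
        // Names present now but not in the previous round
        foreach (string added in current.Except(previous))
        {
            Console.WriteLine("New entry: " + added);
        }
    }
    previous = current;

    Thread.Sleep(10000); // poll every 10 seconds
}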
More info:
FTP return codes
FTP
I found an alternative solution for my requirement.
Explanation:
I am downloading the files from FTP (read permission required) with the same folder structure.
So every time the job/service runs, I can check whether the same file (full path) already exists in the physical path. If it does not exist, it can be considered a new file, I can do some action for it, and download it as well.
It's just an alternative solution.
Code Changes:
private static void GetFiles()
{
using (FtpClient conn = new FtpClient())
{
string ftpPath = "ftp://myftp/";
string downloadFileName = @"C:\temp\FTPTest\";
downloadFileName += "\\";
conn.Host = ftpPath;
//conn.Credentials = new NetworkCredential("ftptest", "ftptest");
conn.Connect();
//Get all directories
foreach (FtpListItem item in conn.GetListing(conn.GetWorkingDirectory(),
FtpListOption.Modify | FtpListOption.Recursive))
{
// if this is a file
if (item.Type == FtpFileSystemObjectType.File)
{
string localFilePath = downloadFileName + item.FullName;
//Only newly created files will be downloaded.
if (!File.Exists(localFilePath))
{
conn.DownloadFile(localFilePath, item.FullName);
//Do any action here.
Console.WriteLine(item.FullName);
}
}
}
}
}
Normally we can move a directory using:
// source is: "C:\Songs\Elvis my Man"
// newLocation is: "C:\Songs\Elvis"
try
{
// Previous command was: Directory.Move(source, newLocation);
DirectoryInfo dir = new DirectoryInfo(source);
dir.MoveTo(newLocation);
}
catch (Exception e)
{
Console.WriteLine("Error: "+ e.Message);
}
Now, when using Azure:
string myconnectionString = ConfigurationManager.ConnectionStrings["StorageConnection"].ConnectionString.ToString();
string myshareName = "Mysampleshare";
string mydirName = "Mysampledir";
// Get a reference to a share and then create it
ShareClient myshare = new ShareClient(myconnectionString, myshareName);
ShareDirectoryClient directory1 = myshare.GetDirectoryClient(mydirName);
string myshareName2 = "Mysampleshare2";
string mydirName2 = "Mysampledir2";
// Get a reference to a share and then create it
ShareClient myshare2 = new ShareClient(myconnectionString, myshareName2);
ShareDirectoryClient directory2 = myshare2.GetDirectoryClient(mydirName2);
Directory.Move(directory1.Path, directory2.Path);
I am unable to move the directory from one location to another using Azure. I am getting an exception:
DirectoryNotFoundException: The path specified by sourceDirName is invalid
Please advise.
ShareDirectoryClient essentially implements Azure File Service REST API thus you cannot use System.IO operations like Directory.Move with it.
There are two possible solutions:
If you want to use the SDK, what you will need to do is list files and directories recursively in the source directory and then copy individual files from source directory to target directory. You will also need to create directories in the target directory as well. Once the copy operation is complete, then you will need to delete all files and directories from the source directory. Once the source directory is empty, only then you will be able to delete the source directory.
You will need to do all this because the REST API does not natively support move operation. To accomplish move, you will need to perform copy operation followed by delete operation.
If you want to use System.IO, then you will need to mount the file share as a shared network drive so that you can get a drive letter assigned to that file share. Once you have that, then you will be able to use operations like Directory.Move available in System.IO namespace.
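For instance, a minimal sketch of the second option (assuming the share has already been mounted as drive Z: with something like "net use Z: \\account-name.file.core.windows.net\Mysampleshare /u:AZURE\account-name account-key", and that the source and target directories sit on the same share, because Directory.Move cannot move across volumes):
// Once the share is mounted as a drive letter, plain System.IO calls work against it
string source = @"Z:\Mysampledir";
string destination = @"Z:\Mysampledir2";
Directory.Move(source, destination);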
UPDATE
Please try the code below:
using System;
using System.Threading.Tasks;
using Azure.Storage.Files.Shares;
namespace SO69798149
{
class Program
{
const string MyconnectionString = "DefaultEndpointsProtocol=https;AccountName=account-name;AccountKey=account-key";
const string MyshareName = "share-name";
const string SourceDirectoryName = "source-directory-name";
private const string RenamedDirectoryName = "new-directory-name";
static async Task Main(string[] args)
{
ShareClient myshare = new ShareClient(MyconnectionString, MyshareName);
ShareDirectoryClient sourceDirectoryClient = myshare.GetDirectoryClient(SourceDirectoryName);
ShareDirectoryClient targetDirectoryClient = myshare.GetDirectoryClient(RenamedDirectoryName);
await RenameDirectory(sourceDirectoryClient, targetDirectoryClient);
Console.WriteLine("Directory renamed.");
}
static async Task RenameDirectory(ShareDirectoryClient sourceDirectoryClient,
ShareDirectoryClient targetDirectoryClient)
{
//Create target directory
await targetDirectoryClient.CreateIfNotExistsAsync();
//List files and folders from the source directory
var result = sourceDirectoryClient.GetFilesAndDirectoriesAsync();
await foreach (var items in result.AsPages())
{
foreach (var item in items.Values)
{
if (item.IsDirectory)
{
//If item is directory, then get the child items in that directory recursively.
await RenameDirectory(sourceDirectoryClient.GetSubdirectoryClient(item.Name),
targetDirectoryClient.GetSubdirectoryClient(item.Name));
}
else
{
//If item is file, then copy the file and then delete it.
var sourceFileClient = sourceDirectoryClient.GetFileClient(item.Name);
var targetFileClient = targetDirectoryClient.GetFileClient(item.Name);
await targetFileClient.StartCopyAsync(sourceFileClient.Uri);
await sourceFileClient.DeleteIfExistsAsync();
}
}
}
//Delete source directory.
await sourceDirectoryClient.DeleteIfExistsAsync();
}
}
}
I have been using WinSCP to periodically download files from a Unix server to a Windows server and it has been working with no issues. I also check if the remote file is older or already exists (in which case I don't copy it).
Now, I have to do the same, but this time I have to download files and folders. Files are copied fine, but folders aren't. When playing with the settings, I got it to copy the contents of the folder, but they get copied to my local root folder; I thought WinSCP would copy everything.
In the code below, LocalFolder is Z:\My_Data and LogRootFolder is /xyz/gtc/a00/
The folder structure on the remote is /xyz/gtc/a00/ABCD/outcomes/ with a subfolder "backup" that has many subfolders named as dates (e.g. /xyz/gtc/a00/ABCD/outcomes/backup/2021-06-23/)
Either none of the "backup/2021-xx-xx/" files and folders are copied, or they are all copied directly to Z:\My_Data\ABCD
After setting up the session, called SFTP_Session:
string sRemotePath = LogRootFolder + "ABCD/outcomes/";
string sLocalFolder = Path.Combine(LocalFolder, @"ABCD\");
if (SFTP_Session.Opened)
{
using (SFTP_Session)
{
SFTP_Session.QueryReceived += (sender, e) =>
{
...
e.Continue();
};
//var opts = EnumerationOptions.EnumerateDirectories | EnumerationOptions.AllDirectories;
//IEnumerable<RemoteFileInfo> fileInfos = SFTP_Session.EnumerateRemoteFiles(sRemotePath, "*.dat", opts); <-- This copies files in folder(s) to my local root folder
Regex mask = new Regex(#"\.(dat|err)$", RegexOptions.IgnoreCase);
IEnumerable<RemoteFileInfo> fileInfos =
SFTP_Session.EnumerateRemoteFiles(sRemotePath, null, EnumerationOptions.AllDirectories)
.Where(fileInfo => mask.Match(fileInfo.Name).Success)
.ToList();
foreach (RemoteFileInfo fileInfo in fileInfos)
{
string localFilePath = Path.Combine(sLocalFolder, fileInfo.Name);
if (fileInfo.IsDirectory)
{
// Create local subdirectory, if it does not exist yet
if (!Directory.Exists(localFilePath))
{
Directory.CreateDirectory(localFilePath);
}
}
else
{
string remoteFilePath = RemotePath.EscapeFileMask(fileInfo.FullName);
// If file does not exist in local folder, download
if (!File.Exists(localFilePath))
{
bDownload = true;
}
else // If file exists in local folder but is older, download; else skip
{
DateTime remoteWriteTime = SFTP_Session.GetFileInfo(remoteFilePath).LastWriteTime;
DateTime localWriteTime = File.GetLastWriteTime(localFilePath);
if (remoteWriteTime > localWriteTime)
{
bDownload = true;
}
else
{
bDownload = false;
}
}
if (bDownload)
{
// Download file
TransferOptions oTrRes = new TransferOptions();
oTrRes.TransferMode = TransferMode.Automatic; //The Transfer Mode - Automatic, Binary, or Ascii
oTrRes.FilePermissions = null; //Permissions applied to remote files; null for default permissions. Can set user, Group, or other Read/Write/Execute permissions.
oTrRes.PreserveTimestamp = false; //Set last write time of destination file to that of source file - basically change the timestamp to match destination and source files.
oTrRes.ResumeSupport.State = TransferResumeSupportState.Off;
TransferOperationResult transferResult = SFTP_Session.GetFiles(remoteFilePath, localFilePath, false, oTrRes);//.Replace("\\","")); // I thought this would get files AND folders
// Throw on any error
transferResult.Check();
foreach (TransferEventArgs transfer in transferResult.Transfers)
{
// Store local file info in a data table for processing later
...
}
SessionRemoteExceptionCollection srec = transferResult.Failures;
foreach (SessionRemoteException sre in srec)
{
// Log errors
}
// Did the download succeed?
if (!transferResult.IsSuccess)
{
// Log error (but continue with other files)
}
}
}
}
In the end, in the local folder I see the downloaded files and the subfolders that I created (using the above code), but no files in those folders. I can't see what I am missing here.
Your code basically synchronizes a remote directory to a local one.
Instead of fixing your code, you can replace most of it with a single call to Session.SynchronizeDirectories:
https://winscp.net/eng/docs/library_session_synchronizedirectories
Try the synchronization first in WinSCP GUI to see if it does what you need.
If you need to do some processing with the synchronized files, use the SynchronizationResult returned by Session.SynchronizeDirectories. It contains a list of all synchronized files.
If you need to exclude some files from the synchronization, use TransferOptions.FileMask.
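For instance, a rough sketch using the session and paths from the question (treat it as an outline rather than a drop-in replacement):
// Restrict the synchronization to *.dat and *.err files
TransferOptions transferOptions = new TransferOptions
{
    FileMask = "*.dat; *.err"
};

SynchronizationResult result =
    SFTP_Session.SynchronizeDirectories(
        SynchronizationMode.Local, sLocalFolder, sRemotePath,
        false, false, SynchronizationCriteria.Time, transferOptions);
result.Check();

// Process the files that were actually downloaded
foreach (TransferEventArgs download in result.Downloads)
{
    Console.WriteLine(download.Destination);
}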
I have searched the net and didn't find any result. Actually, I want to get the names of all the files that I have in the root directory and its subdirectories. I tried the code below, but it gives me only the files in the root of my FTP.
The folder structure that I have on the FTP is like below:
/ds/product/Jan/
/ds/subproduct/Jan/
/ds/category/Jan/
The code that I tried:
FtpWebRequest ftpRequest = (FtpWebRequest)WebRequest.Create("ftp://" + FtpIP);
ftpRequest.Credentials = new NetworkCredential(FtpUser, FtpPass);
ftpRequest.Method = WebRequestMethods.Ftp.ListDirectory;
FtpWebResponse response = (FtpWebResponse)ftpRequest.GetResponse();
StreamReader streamReader = new StreamReader(response.GetResponseStream());
List<string> directories = new List<string>();
string line = streamReader.ReadLine();
while (!string.IsNullOrEmpty(line))
{
// directories.Add(line);
line = streamReader.ReadLine().ToString();
MessageBox.Show(line);
}
streamReader.Close();
It is not easy to implement this without any external library. Unfortunately, neither the .NET Framework nor PowerShell have any explicit support for recursively listing files in an FTP directory.
You have to implement that yourself:
List the remote directory
Iterate the entries, recursing into subdirectories - listing them again, etc.
The tricky part is to distinguish files from subdirectories. There's no way to do that in a portable way with the .NET Framework (FtpWebRequest). The .NET Framework unfortunately does not support the MLSD command, which is the only portable way to retrieve a directory listing with file attributes in the FTP protocol. See also Checking if object on FTP server is file or directory.
Your options are:
Do an operation on a file name that is certain to fail for file and succeeds for directories (or vice versa). I.e. you can try to download the "name".
You may be lucky and in your specific case, you can tell a file from a directory by a file name (i.e. all your files have an extension, while subdirectories do not)
You use a long directory listing (LIST command = ListDirectoryDetails method) and try to parse a server-specific listing. Many FTP servers use *nix-style listing, where you identify a directory by the d at the very beginning of the entry. But many servers use a different format. The following example uses this approach (assuming the *nix format)
static void ListFtpDirectory(string url, NetworkCredential credentials)
{
WebRequest listRequest = WebRequest.Create(url);
listRequest.Method = WebRequestMethods.Ftp.ListDirectoryDetails;
listRequest.Credentials = credentials;
List<string> lines = new List<string>();
using (WebResponse listResponse = listRequest.GetResponse())
using (Stream listStream = listResponse.GetResponseStream())
using (StreamReader listReader = new StreamReader(listStream))
{
while (!listReader.EndOfStream)
{
string line = listReader.ReadLine();
lines.Add(line);
}
}
foreach (string line in lines)
{
string[] tokens =
line.Split(new[] { ' ' }, 9, StringSplitOptions.RemoveEmptyEntries);
string name = tokens[8];
string permissions = tokens[0];
if (permissions[0] == 'd')
{
Console.WriteLine($"Directory {name}");
string fileUrl = url + name;
ListFtpDirectory(fileUrl + "/", credentials);
}
else
{
Console.WriteLine($"File {name}");
}
}
}
Use the function like:
NetworkCredential credentials = new NetworkCredential("user", "mypassword");
string url = "ftp://ftp.example.com/directory/to/list/";
ListFtpDirectory(url, credentials);
If you want to avoid troubles with parsing the server-specific directory listing formats, use a 3rd party library that supports the MLSD command and/or parsing various LIST listing formats.
For example with WinSCP .NET assembly you can list whole directory recursively with a single call to Session.EnumerateRemoteFiles:
// Setup session options
var sessionOptions = new SessionOptions
{
Protocol = Protocol.Ftp,
HostName = "ftp.example.com",
UserName = "user",
Password = "mypassword",
};
using (var session = new Session())
{
// Connect
session.Open(sessionOptions);
// Enumerate files
var options =
EnumerationOptions.EnumerateDirectories |
EnumerationOptions.AllDirectories;
IEnumerable<RemoteFileInfo> fileInfos =
session.EnumerateRemoteFiles("/directory/to/list", null, options);
foreach (var fileInfo in fileInfos)
{
Console.WriteLine(fileInfo.FullName);
}
}
Not only is the code simpler, more robust, and platform-independent. It also makes all other file attributes (size, modification time, permissions, ownership) readily available via the RemoteFileInfo class.
Internally, WinSCP uses the MLSD command, if supported by the server. If not, it uses the LIST command and supports dozens of different listing formats.
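For illustration, the loop above could print some of those attributes as well:
// Inside the foreach over fileInfos
Console.WriteLine(
    "{0}: {1} bytes, modified {2}, permissions {3}",
    fileInfo.FullName, fileInfo.Length, fileInfo.LastWriteTime, fileInfo.FilePermissions);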
(I'm the author of WinSCP)
I want to use FileSystemWatcher.Changed on an FTP directory
What do I put in the property FileSystemWatcher.Path ?
You cannot do this. A FileSystemWatcher watches the filesystem, not an FTP folder. So unless you have filesystem access to the FTP path (UNC paths are supported), you are unable to do this.
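If you do have such access, the watcher is set up like for any other path (the UNC share below is a placeholder):
// Placeholder UNC path pointing at the directory the FTP server serves from
FileSystemWatcher watcher = new FileSystemWatcher(@"\\ftpserver\ftproot");
watcher.IncludeSubdirectories = true;
watcher.Created += (sender, e) => Console.WriteLine("Created: " + e.FullPath);
watcher.EnableRaisingEvents = true;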
You cannot use the FileSystemWatcher or any other way, because the FTP protocol does not have any API to notify a client about changes in the remote directory.
All you can do is to periodically iterate the remote tree and find changes.
It's actually rather easy to implement, if you use an FTP client that supports recursive listing of a remote tree. Unfortunately, the built-in .NET FTP client, the FtpWebRequest, does not. But for example with the WinSCP .NET assembly, you can use the Session.EnumerateRemoteFiles method.
See the article Watching for changes in SFTP/FTP server:
// Setup session options
SessionOptions sessionOptions = new SessionOptions
{
Protocol = Protocol.Ftp,
HostName = "example.com",
UserName = "user",
Password = "password",
};
using (Session session = new Session())
{
// Connect
session.Open(sessionOptions);
List<string> prevFiles = null;
while (true)
{
// Collect file list
List<string> files =
session.EnumerateRemoteFiles(
"/remote/path", "*.*", EnumerationOptions.AllDirectories)
.Select(fileInfo => fileInfo.FullName)
.ToList();
if (prevFiles == null)
{
// In the first round, just print number of files found
Console.WriteLine("Found {0} files", files.Count);
}
else
{
// Then look for differences against the previous list
IEnumerable<string> added = files.Except(prevFiles);
if (added.Any())
{
Console.WriteLine("Added files:");
foreach (string path in added)
{
Console.WriteLine(path);
}
}
IEnumerable<string> removed = prevFiles.Except(files);
if (removed.Any())
{
Console.WriteLine("Removed files:");
foreach (string path in removed)
{
Console.WriteLine(path);
}
}
}
prevFiles = files;
Console.WriteLine("Sleeping 10s...");
Thread.Sleep(10000);
}
}
(I'm the author of WinSCP)
Though, if you actually want to just download the changes, it's way easier. Just use Session.SynchronizeDirectories in the loop.
session.SynchronizeDirectories(
SynchronizationMode.Local, @"C:\local\path", "/remote/path", true).Check();
If you do not want to use a 3rd party library, you have to live with the limitations of the FtpWebRequest. For an example of how to recursively list a remote directory tree with the FtpWebRequest, see my answer to C# Download all files and subdirectories through FTP.
How can I connect to a folder of a website already hosted on IIS and compare it with a local client folder?
This is my Website URL on localhost
IISHostedWebsite/Updates //Folder in Website
I need to compare the files in this Updates folder with my local client machine's D:\Updates folder.
If new updates are available on the server, they should be copied to my D:\Updates folder.
How can we achieve this sort of scenario?
I have some code in C#:
var directory = new DirectoryInfo(@"D:\Anand\Work\FolderCheck\Server");
var myFile = (from f in directory.GetFiles()
orderby f.LastWriteTime descending
select f).First();
This code gets the latest updated file from the folder.
The MSDeploy tool (http://www.iis.net/downloads/microsoft/web-deploy) was designed to solve this problem. It lets you compare IIS virtual directories to other directories, and synchronize them if needed. It can also be used just as a diff tool.
In your example, after installing MSDeploy, you could do the following:
msdeploy.exe -verb:sync -source:contentPath="IISHostedWebSite/Updates" -dest:contentPath=d:\updates -whatIf
This command will show the list of changes needed to update D:\Updates to look like IISHostedWebsite/Updates. If you remove "-whatIf", it will actually make the changes.
You can also call MSDeploy programmatically to do the same thing.
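Roughly like this with the Microsoft.Web.Deployment API (a sketch; it requires a reference to Microsoft.Web.Deployment, uses the site path and local folder from the question, and the exact overloads may differ between Web Deploy versions):
// Source: content of the "Updates" folder of the IIS site; destination: local folder
DeploymentBaseOptions destinationOptions = new DeploymentBaseOptions();
DeploymentSyncOptions syncOptions = new DeploymentSyncOptions();

using (DeploymentObject source = DeploymentManager.CreateObject(
    DeploymentWellKnownProvider.ContentPath, "IISHostedWebsite/Updates"))
{
    DeploymentChangeSummary summary = source.SyncTo(
        DeploymentWellKnownProvider.ContentPath, @"D:\Updates",
        destinationOptions, syncOptions);

    Console.WriteLine("Added: {0}, Updated: {1}, Deleted: {2}",
        summary.ObjectsAdded, summary.ObjectsUpdated, summary.ObjectsDeleted);
}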
Also the problem with your code snippet is that you wouldn't detect deleted files from source that also need to be deleted from destination.
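A plain System.IO way to catch those deletions would be along these lines (a sketch using the folders from the question; it compares paths relative to each root):
string serverPath = @"D:\Anand\Work\FolderCheck\Server";
string clientPath = @"D:\Anand\Work\FolderCheck\Client";

// Relative paths of all files under each root
var serverFiles = Directory.EnumerateFiles(serverPath, "*", SearchOption.AllDirectories)
    .Select(p => p.Substring(serverPath.Length + 1));
var clientFiles = Directory.EnumerateFiles(clientPath, "*", SearchOption.AllDirectories)
    .Select(p => p.Substring(clientPath.Length + 1));

// Present in the destination but no longer in the source -> delete the local copy
foreach (string relativePath in clientFiles
    .Except(serverFiles, StringComparer.OrdinalIgnoreCase).ToList())
{
    File.Delete(Path.Combine(clientPath, relativePath));
}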
Here my code gets the files from a directory and checks whether they are the same; if not, it shows "No updates found". After that, it checks whether a newer file is available in the folder; if one is available, that file should be copied and moved to the destination folder. This is my task. I have completed the part that gets the latest updated file, but I don't know how that file can be copied into the destination folder.
string serverPath = @"D:\Anand\Work\FolderCheck\Server"; //Source folder
string clientPath = @"D:\Anand\Work\FolderCheck\Client"; //Destination folder
private static bool CompareFileSizes(string fileServer, string fileClient)
{
bool fileSizeEqual = true;
if (fileServer.Length == fileClient.Length) // Compare file sizes
{
fileSizeEqual = false; // File sizes are not equal therefore files are not identical
}
return fileSizeEqual;
}
try
{
if (!File.Exists(serverPath) || !File.Exists(clientPath))
{
try
{
var Server = Path.GetFileName(serverPath);
var Client = Path.GetFileName(clientPath);
string ServerFile = Server.ToString();
string ClientFile = Client.ToString();
if (CompareFileSizes(ServerFile, ClientFile))
{
lblServerMsg.Text = "No Updates are Found: ";
}
else
{
var directoryServer = new DirectoryInfo(@"D:\Anand\Work\FolderCheck\Server"); //Check latest available file from Server
var myFile = (from f in directoryServer.GetFiles()
orderby f.LastWriteTime descending
select f).First();
lblServerMsg.Text = "Updates Are Available Click for Update Button:";
btnCheckUpates.Visible = false;
btnUpdates.Visible = true;
}
}
catch (Exception msg)
{
lblServerMsg.Text = "No Updates are Found: " + msg.Message;
}
}
else
{
throw new FileNotFoundException();
}
}
catch (FileNotFoundException ex)
{
lblServerMsg.Text = ex.Message;
}
With the above code I get the latest file from the source folder, but I don't know how to proceed:
var myFile = (from f in directoryServer.GetFiles()
orderby f.LastWriteTime descending
select f).First();
In the above code, myFile is the latest file from the source folder; this is what I want to copy into the destination folder.