I want to use FileSystemWatcher.Changed on an FTP directory.
What do I put in the FileSystemWatcher.Path property?
You cannot do this. A FileSystemWatcher watches the filesystem, not an FTP folder. So unless you have filesystem access to the FTP path (UNC paths are supported), you are unable to do this.
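For illustration, if the directory behind the FTP server also happens to be reachable as a Windows share, a watcher pointed at the UNC path works the same way as for a local folder. A minimal sketch (the share path below is hypothetical):

FileSystemWatcher watcher = new FileSystemWatcher
{
    // Hypothetical UNC path to the directory the FTP server serves
    Path = @"\\fileserver\ftproot\upload",
    Filter = "*.*",
    IncludeSubdirectories = true,
    NotifyFilter = NotifyFilters.FileName | NotifyFilters.LastWrite
};
watcher.Changed += (sender, e) => Console.WriteLine("{0}: {1}", e.ChangeType, e.FullPath);
watcher.Created += (sender, e) => Console.WriteLine("Created: {0}", e.FullPath);
// Start raising events
watcher.EnableRaisingEvents = true;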
You cannot use the FileSystemWatcher or any other way, because the FTP protocol does not have any API to notify a client about changes in the remote directory.
All you can do is to periodically iterate the remote tree and find changes.
It's actually rather easy to implement, if you use an FTP client that supports recursive listing of a remote tree. Unfortunately, the built-in .NET FTP client, FtpWebRequest, does not. For example, with the WinSCP .NET assembly, you can use the Session.EnumerateRemoteFiles method.
See the article Watching for changes in SFTP/FTP server:
// Setup session options
SessionOptions sessionOptions = new SessionOptions
{
Protocol = Protocol.Ftp,
HostName = "example.com",
UserName = "user",
Password = "password",
};
using (Session session = new Session())
{
// Connect
session.Open(sessionOptions);
List<string> prevFiles = null;
while (true)
{
// Collect file list
List<string> files =
session.EnumerateRemoteFiles(
"/remote/path", "*.*", EnumerationOptions.AllDirectories)
.Select(fileInfo => fileInfo.FullName)
.ToList();
if (prevFiles == null)
{
// In the first round, just print number of files found
Console.WriteLine("Found {0} files", files.Count);
}
else
{
// Then look for differences against the previous list
IEnumerable<string> added = files.Except(prevFiles);
if (added.Any())
{
Console.WriteLine("Added files:");
foreach (string path in added)
{
Console.WriteLine(path);
}
}
IEnumerable<string> removed = prevFiles.Except(files);
if (removed.Any())
{
Console.WriteLine("Removed files:");
foreach (string path in removed)
{
Console.WriteLine(path);
}
}
}
prevFiles = files;
Console.WriteLine("Sleeping 10s...");
Thread.Sleep(10000);
}
}
(I'm the author of WinSCP)
Though, if you actually want to just download the changes, it's way easier. Just use Session.SynchronizeDirectories in a loop.
session.SynchronizeDirectories(
SynchronizationMode.Local, "/remote/path", @"C:\local\path", true).Check();
If you do not want to use a 3rd party library, you have to make do with the limitations of FtpWebRequest. For an example of how to recursively list a remote directory tree with FtpWebRequest, see my answer to C# Download all files and subdirectories through FTP.
Related
I'm trying to implement some logic to compare file information between a remote server and a local server.
I need to compare the file names between the local folder and the remote folder and download only the new files.
I tried loading the files into lists and using the Except function, but it didn't work.
Appreciate your help.
Please find below one of the scenarios I tried.
using (Session session = new Session())
{
// Connect
session.Open(sessionOptions);
const string remotePath = "/Test";
const string localPath = @"C:\Local";
const string ArchivePath = @"C:\Users\Local\Archive";
System.IO.DirectoryInfo dir2 = new System.IO.DirectoryInfo(ArchivePath);
RemoteDirectoryInfo dir1 = session.ListDirectory(remotePath);
IEnumerable<System.IO.FileInfo> list2 =
dir2.GetFiles("*.*", System.IO.SearchOption.AllDirectories);
IEnumerable<RemoteFileInfo> list1 =
session.EnumerateRemoteFiles(remotePath, "*.csv", EnumerationOptions.None);
var firstNotSecond = list1.Except(list2).ToList();
}
I am getting an error like:
'IEnumerable' does not contain a definition for 'Except' and the best extension method overload 'Queryable.Except(IQueryable, IEnumerable)' requires a receiver of type 'IQueryable'
You will have to compare just the filenames:
var firstNotSecond =
list1.Select(_ => _.Name).Except(list2.Select(_ => _.Name)).ToList();
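Once you have the new names, you can download each of them with Session.GetFiles. A minimal sketch, assuming the entries in firstNotSecond are plain file names located directly under remotePath and that localPath is where they should be downloaded:

foreach (string fileName in firstNotSecond)
{
    // Build the full remote path and escape characters that have
    // a special meaning in WinSCP file masks
    string remoteFilePath =
        RemotePath.EscapeFileMask(RemotePath.Combine(remotePath, fileName));
    // Download to the local folder, throwing on any error
    session.GetFiles(remoteFilePath, System.IO.Path.Combine(localPath, fileName)).Check();
}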
Though note that the WinSCP .NET assembly has this functionality built in: Session.CompareDirectories.
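A rough sketch of what that could look like for this case (printing what would need to be downloaded to bring the local folder up to date; check the Session.CompareDirectories documentation for the exact overloads):

var differences =
    session.CompareDirectories(
        SynchronizationMode.Local, localPath, remotePath, false);
foreach (ComparisonDifference difference in differences)
{
    // Each difference describes one file that differs between the two sides
    Console.WriteLine(
        "{0}: {1}",
        difference.Action,
        difference.Remote != null ? difference.Remote.FileName : difference.Local.FileName);
}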
And if you actually want to synchronize the directories, there's Session.SynchronizeDirectories, one method that will do everything for you.
session.SynchronizeDirectories(
SynchronizationMode.Local, localPath, remotePath, false).Check();
I have been using WinSCP to periodically download files from a Unix server to a Windows server, and it has been working with no issues. I also check whether the remote file is older or already exists (and don't copy it).
Now I have to do the same, but this time I have to download files and folders. Files are copied fine, but folders aren't. When playing with the settings, I got it to copy the contents of the folders, but they get copied to my local root folder; I thought WinSCP would copy everything.
In the code below, LocalFolder is Z:\My_Data and LogRootFolder is /xyz/gtc/a00/.
Folder structure on remote is /xyz/gtc/a00/ABCD/outcomes/ with subfolder "backup" that has many subfolders named as dates (e.g. /xyz/gtc/a00/ABCD/outcomes/backup/2021-06-23/)
Either none of the "backup/2021-xx-xx/" files and folders are copied, or they are all copied to Z:\My_Data\ABCD.
After setting up the session, called SFTP_Session:
string sRemotePath = LogRootFolder + "ABCD/outcomes/";
string sLocalFolder = Path.Combine(LocalFolder, @"ABCD\");
if (SFTP_Session.Opened)
{
using (SFTP_Session)
{
SFTP_Session.QueryReceived += (sender, e) =>
{
...
e.Continue();
};
//var opts = EnumerationOptions.EnumerateDirectories | EnumerationOptions.AllDirectories;
//IEnumerable<RemoteFileInfo> fileInfos = SFTP_Session.EnumerateRemoteFiles(sRemotePath, "*.dat", opts); <-- This copies files in folder(s) to my local root folder
Regex mask = new Regex(@"\.(dat|err)$", RegexOptions.IgnoreCase);
IEnumerable<RemoteFileInfo> fileInfos =
SFTP_Session.EnumerateRemoteFiles(sRemotePath, null, EnumerationOptions.AllDirectories)
.Where(fileInfo => mask.Match(fileInfo.Name).Success)
.ToList();
foreach (RemoteFileInfo fileInfo in fileInfos)
{
string localFilePath = Path.Combine(sLocalFolder, fileInfo.Name);
if (fileInfo.IsDirectory)
{
// Create local subdirectory, if it does not exist yet
if (!Directory.Exists(localFilePath))
{
Directory.CreateDirectory(localFilePath);
}
}
else
{
string remoteFilePath = RemotePath.EscapeFileMask(fileInfo.FullName);
// If file does not exist in local folder, download
if (!File.Exists(localFilePath))
{
bDownload = true;
}
else // If file exists in local folder but is older, download; else skip
{
DateTime remoteWriteTime = SFTP_Session.GetFileInfo(remoteFilePath).LastWriteTime;
DateTime localWriteTime = File.GetLastWriteTime(localFilePath);
if (remoteWriteTime > localWriteTime)
{
bDownload = true;
}
else
{
bDownload = false;
}
}
if (bDownload)
{
// Download file
TransferOptions oTrRes = new TransferOptions();
oTrRes.TransferMode = TransferMode.Automatic; //The Transfer Mode - Automatic, Binary, or Ascii
oTrRes.FilePermissions = null; //Permissions applied to remote files; null for default permissions. Can set user, Group, or other Read/Write/Execute permissions.
oTrRes.PreserveTimestamp = false; //Set last write time of destination file to that of source file - basically change the timestamp to match destination and source files.
oTrRes.ResumeSupport.State = TransferResumeSupportState.Off;
TransferOperationResult transferResult = SFTP_Session.GetFiles(remoteFilePath, localFilePath, false, oTrRes);//.Replace("\\","")); // I thought this would get files AND folders
// Throw on any error
transferResult.Check();
foreach (TransferEventArgs transfer in transferResult.Transfers)
{
// Store local file info in a data table for processing later
...
}
SessionRemoteExceptionCollection srec = transferResult.Failures;
foreach (SessionRemoteException sre in srec)
{
// Log errors
}
// Did the download succeed?
if (!transferResult.IsSuccess)
{
// Log error (but continue with other files)
}
}
}
}
At the end, in the local folder, I see the downloaded files and the subfolders that I created (using the above code), but no files in those subfolders. I can't see what I am missing here.
Your code basically synchronizes a remote directory to a local one.
Instead of fixing your code, you can replace most of it with a simple call to Session.SynchronizeDirectories:
https://winscp.net/eng/docs/library_session_synchronizedirectories
Try the synchronization first in WinSCP GUI to see if it does what you need.
If you need to do some processing with the synchronized files, use the SynchronizationResult returned by Session.SynchronizeDirectories. It contains a list of all synchronized files.
If you need to exclude some files from the synchronization, use TransferOptions.FileMask.
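A rough sketch of how those two pieces could fit your .dat/.err case (the mask and paths reuse the variables from your snippet; treat it as a starting point rather than a drop-in replacement):

// Restrict the synchronization to *.dat and *.err files
TransferOptions transferOptions = new TransferOptions
{
    FileMask = "*.dat;*.err"
};

// Download new/updated files, keeping the remote folder structure
SynchronizationResult result =
    SFTP_Session.SynchronizeDirectories(
        SynchronizationMode.Local, sLocalFolder, sRemotePath,
        false, false, SynchronizationCriteria.Time, transferOptions);

// Throw on any error
result.Check();

// Process the files that were actually downloaded
foreach (TransferEventArgs download in result.Downloads)
{
    Console.WriteLine("Downloaded {0} to {1}", download.FileName, download.Destination);
}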
I have a FileSystemWatcher for a local directory. It's working fine. I want to implement the same for FTP. Is there any way I can achieve it? I have checked many solutions, but it's not clear.
Logic: I want to get files from FTP that are newer than some timestamp.
Problem faced: Getting all files from FTP and then filtering the result hurts performance (I used FtpWebRequest).
Is there a right way to do this? (WinSCP is on hold. Can't use it now.)
FileSystemWatcher oFsWatcher = new FileSystemWatcher();
OFSWatchers.Add(oFsWatcher);
oFsWatcher.Path = sFilePath;
oFsWatcher.Filter = string.IsNullOrWhiteSpace(sFileFilter) ? "*.*" : sFileFilter;
oFsWatcher.NotifyFilter = NotifyFilters.FileName;
oFsWatcher.EnableRaisingEvents = true;
oFsWatcher.IncludeSubdirectories = bIncludeSubdirectories;
oFsWatcher.Created += new FileSystemEventHandler(OFsWatcher_Created);
You cannot use the FileSystemWatcher or any other way, because the FTP protocol does not have any API to notify a client about changes in the remote directory.
All you can do is to periodically iterate the remote tree and find changes.
It's actually rather easy to implement, if you use an FTP client library that supports recursive listing of a remote tree. Unfortunately, the built-in .NET FTP client, FtpWebRequest, does not. But for example, with the WinSCP .NET assembly, you can use the Session.EnumerateRemoteFiles method.
See the article Watching for changes in SFTP/FTP server:
// Setup session options
SessionOptions sessionOptions = new SessionOptions
{
Protocol = Protocol.Ftp,
HostName = "example.com",
UserName = "user",
Password = "password",
};
using (Session session = new Session())
{
// Connect
session.Open(sessionOptions);
List<string> prevFiles = null;
while (true)
{
// Collect file list
List<string> files =
session.EnumerateRemoteFiles(
"/remote/path", "*.*", EnumerationOptions.AllDirectories)
.Select(fileInfo => fileInfo.FullName)
.ToList();
if (prevFiles == null)
{
// In the first round, just print number of files found
Console.WriteLine("Found {0} files", files.Count);
}
else
{
// Then look for differences against the previous list
IEnumerable<string> added = files.Except(prevFiles);
if (added.Any())
{
Console.WriteLine("Added files:");
foreach (string path in added)
{
Console.WriteLine(path);
}
}
IEnumerable<string> removed = prevFiles.Except(files);
if (removed.Any())
{
Console.WriteLine("Removed files:");
foreach (string path in removed)
{
Console.WriteLine(path);
}
}
}
prevFiles = files;
Console.WriteLine("Sleeping 10s...");
Thread.Sleep(10000);
}
}
(I'm the author of WinSCP)
Though, if you actually want to just download the changes, it's way easier. Just use Session.SynchronizeDirectories in the loop.
while (true)
{
SynchronizationResult result =
session.SynchronizeDirectories(
SynchronizationMode.Local, "/remote/path", @"C:\local\path", true);
result.Check();
// You can inspect result.Downloads for a list of updated files
Console.WriteLine("Sleeping 10s...");
Thread.Sleep(10000);
}
This will update even modified files, not only new files.
Though using the WinSCP .NET assembly from a web application might be problematic. If you do not want to use a 3rd party library, you have to make do with the limitations of FtpWebRequest. For an example of how to recursively list a remote directory tree with FtpWebRequest, see my answer to List names of files in FTP directory and its subdirectories.
You have edited your question to say that you have performance problems with the solutions I've suggested. Though you have already asked a new question that covers this:
Get FTP file details based on datetime in C#
Unless you have access to the OS that hosts the FTP service, it will be a bit harder.
FileSystemWatcher places a hook on the filesystem, which will notify your application as soon as something happens.
The FTP command specification does not have such a hook. Besides that, the protocol is always initiated by the client.
Therefore, to implement such logic, you have to periodically perform an NLST to list the FTP directory contents and track the changes (or the modification times, via MDTM) yourself; a minimal polling sketch follows the links below.
More info:
FTP return codes
FTP
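A minimal polling sketch of that idea with plain FtpWebRequest (host, credentials, and interval are placeholders; it only compares the set of names, so it detects additions and removals, not modified content):

HashSet<string> previous = null;
while (true)
{
    // NLST: plain name listing of the watched directory
    FtpWebRequest request =
        (FtpWebRequest)WebRequest.Create("ftp://ftp.example.com/watched/folder/");
    request.Method = WebRequestMethods.Ftp.ListDirectory;
    request.Credentials = new NetworkCredential("user", "password");

    HashSet<string> current = new HashSet<string>();
    using (FtpWebResponse response = (FtpWebResponse)request.GetResponse())
    using (StreamReader reader = new StreamReader(response.GetResponseStream()))
    {
        string line;
        while ((line = reader.ReadLine()) != null)
        {
            current.Add(line);
        }
    }

    if (previous != null)
    {
        foreach (string added in current.Except(previous))
        {
            Console.WriteLine("Added: {0}", added);
        }
        foreach (string removed in previous.Except(current))
        {
            Console.WriteLine("Removed: {0}", removed);
        }
    }
    previous = current;

    Thread.Sleep(TimeSpan.FromSeconds(10));
}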
I have found an alternative solution to achieve my functionality.
Explanation:
I am downloading the files from FTP (read permission required) with the same folder structure.
So every time the job/service runs, I can check whether the same file (full path) already exists in the physical path. If it does not exist, it can be considered a new file, and I can take some action for it and download it as well.
It's just an alternative solution.
Code Changes:
private static void GetFiles()
{
using (FtpClient conn = new FtpClient())
{
string ftpPath = "ftp://myftp/";
string downloadFileName = @"C:\temp\FTPTest\";
downloadFileName += "\\";
conn.Host = ftpPath;
//conn.Credentials = new NetworkCredential("ftptest", "ftptest");
conn.Connect();
//Get all directories
foreach (FtpListItem item in conn.GetListing(conn.GetWorkingDirectory(),
FtpListOption.Modify | FtpListOption.Recursive))
{
// if this is a file
if (item.Type == FtpFileSystemObjectType.File)
{
string localFilePath = downloadFileName + item.FullName;
//Only newly created files will be downloaded.
if (!File.Exists(localFilePath))
{
conn.DownloadFile(localFilePath, item.FullName);
//Do any action here.
Console.WriteLine(item.FullName);
}
}
}
}
}
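If the original requirement was to pick up only files newer than some timestamp, the same listing can also be filtered on the modification time that FtpListOption.Modify already requests. A rough sketch that would go inside the same using block, reusing the conn client from above (lastRun is a placeholder for wherever the last successful run time is persisted):

DateTime lastRun = DateTime.UtcNow.AddHours(-1); // placeholder: load from your job state

foreach (FtpListItem item in conn.GetListing(conn.GetWorkingDirectory(),
    FtpListOption.Modify | FtpListOption.Recursive))
{
    // Only files modified after the last run
    if (item.Type == FtpFileSystemObjectType.File && item.Modified > lastRun)
    {
        string localFilePath = Path.Combine(@"C:\temp\FTPTest",
            item.FullName.TrimStart('/').Replace('/', Path.DirectorySeparatorChar));
        Directory.CreateDirectory(Path.GetDirectoryName(localFilePath));
        conn.DownloadFile(localFilePath, item.FullName);
        Console.WriteLine("New/updated file: {0}", item.FullName);
    }
}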
I have searched the net and didn't find any results. Actually, I want to get the names of all the files that I have in the root directory and its subdirectories. I tried the code below, but it gives me only the files in the root of my FTP.
The folder structure that I have on the FTP is as below:
/ds/product/Jan/
/ds/subproduct/Jan/
/ds/category/Jan/
The code that I tried:
FtpWebRequest ftpRequest = (FtpWebRequest)WebRequest.Create("ftp://" + FtpIP);
ftpRequest.Credentials = new NetworkCredential(FtpUser, FtpPass);
ftpRequest.Method = WebRequestMethods.Ftp.ListDirectory;
FtpWebResponse response = (FtpWebResponse)ftpRequest.GetResponse();
StreamReader streamReader = new StreamReader(response.GetResponseStream());
List<string> directories = new List<string>();
string line = streamReader.ReadLine();
while (!string.IsNullOrEmpty(line))
{
// directories.Add(line);
line = streamReader.ReadLine();
MessageBox.Show(line);
}
streamReader.Close();
It is not easy to implement this without any external library. Unfortunately, neither the .NET Framework nor PowerShell has any explicit support for recursively listing files in an FTP directory.
You have to implement that yourself:
List the remote directory
Iterate the entries, recursing into subdirectories - listing them again, etc.
The tricky part is to distinguish files from subdirectories. There's no way to do that in a portable way with the .NET Framework (FtpWebRequest). The .NET Framework unfortunately does not support the MLSD command, which is the only portable way to retrieve a directory listing with file attributes in the FTP protocol. See also Checking if object on FTP server is file or directory.
Your options are:
Do an operation on a file name that is certain to fail for a file and succeed for a directory (or vice versa). E.g., you can try to download the "name".
You may be lucky and, in your specific case, be able to tell a file from a directory by its name (e.g. all your files have an extension, while subdirectories do not).
You use a long directory listing (LIST command = ListDirectoryDetails method) and try to parse a server-specific listing. Many FTP servers use *nix-style listings, where you identify a directory by the d at the very beginning of the entry. But many servers use a different format. The following example uses this approach (assuming the *nix format):
static void ListFtpDirectory(string url, NetworkCredential credentials)
{
WebRequest listRequest = WebRequest.Create(url);
listRequest.Method = WebRequestMethods.Ftp.ListDirectoryDetails;
listRequest.Credentials = credentials;
List<string> lines = new List<string>();
using (WebResponse listResponse = listRequest.GetResponse())
using (Stream listStream = listResponse.GetResponseStream())
using (StreamReader listReader = new StreamReader(listStream))
{
while (!listReader.EndOfStream)
{
string line = listReader.ReadLine();
lines.Add(line);
}
}
foreach (string line in lines)
{
string[] tokens =
line.Split(new[] { ' ' }, 9, StringSplitOptions.RemoveEmptyEntries);
string name = tokens[8];
string permissions = tokens[0];
if (permissions[0] == 'd')
{
Console.WriteLine($"Directory {name}");
string fileUrl = url + name;
ListFtpDirectory(fileUrl + "/", credentials);
}
else
{
Console.WriteLine($"File {name}");
}
}
}
Use the function like:
NetworkCredential credentials = new NetworkCredential("user", "mypassword");
string url = "ftp://ftp.example.com/directory/to/list/";
ListFtpDirectory(url, credentials);
If you want to avoid trouble with parsing server-specific directory listing formats, use a 3rd party library that supports the MLSD command and/or parsing the various LIST listing formats.
For example with WinSCP .NET assembly you can list whole directory recursively with a single call to Session.EnumerateRemoteFiles:
// Setup session options
var sessionOptions = new SessionOptions
{
Protocol = Protocol.Ftp,
HostName = "ftp.example.com",
UserName = "user",
Password = "mypassword",
};
using (var session = new Session())
{
// Connect
session.Open(sessionOptions);
// Enumerate files
var options =
EnumerationOptions.EnumerateDirectories |
EnumerationOptions.AllDirectories;
IEnumerable<RemoteFileInfo> fileInfos =
session.EnumerateRemoteFiles("/directory/to/list", null, options);
foreach (var fileInfo in fileInfos)
{
Console.WriteLine(fileInfo.FullName);
}
}
Not only is the code simpler, more robust, and platform-independent, it also makes all other file attributes (size, modification time, permissions, ownership) readily available via the RemoteFileInfo class.
Internally, WinSCP uses the MLSD command, if supported by the server. If not, it uses the LIST command and supports dozens of different listing formats.
(I'm the author of WinSCP)
I want to find out if the remote host has read/write access to a network share. To start out, I wanted to see if I could query the target host's ability to query the UNC path for info, like:
var query = string.Format("select * from CIM_Directory where name = '{0}'", path);
This works fine for local files, e.g.
var path = @"c:\\Windows";
However, I can't figure out an appropriate way of querying a UNC path (e.g. \\foo\bar). The query always returns a blank set. I saw a related question about executing remote files and the solution for that one ended up being PsExec. I was hoping to ideally solve this problem entirely using WMI without having to rely on 3rd party execs, or uploading my own tool to the remote host.
Cheers
Here's a little usage sample of what I am trying to do right now (var values taken out):
using System;
using System.Linq;
using System.Management;
namespace netie
{
class Program
{
static void Main()
{
var connection = new ConnectionOptions
{
Username = "user",
Password = "pass",
Authority = "domain",
Impersonation = ImpersonationLevel.Impersonate,
EnablePrivileges = true
};
var scope = new ManagementScope("\\\\remote\\root\\CIMV2", connection);
scope.Connect();
var path = @"\\\\foo\\bar\\";
var queryString = string.Format("select * from CIM_Directory where name = '{0}'", path);
try
{
var query = new ObjectQuery(queryString);
var searcher = new ManagementObjectSearcher(scope, query);
foreach (var queryObj in searcher.Get().Cast<ManagementObject>())
{
Console.WriteLine("Number of properties: {0}", queryObj.Properties.Count);
foreach (var prop in queryObj.Properties)
{
Console.WriteLine("{0}: {1}", prop.Name, prop.Value);
}
Console.WriteLine();
}
}
catch (Exception e)
{
Console.WriteLine(e);
}
Console.ReadLine();
}
}
}
So it looks like this is basically impossible, as WMI locks you out of network access for security reasons. It looks like your best bet is WinRM or PsExec for one-offs. You can potentially enable WinRM through WMI if that's your only path of access, but I imagine that ability can be blocked by group policies. The third option is to write your own Windows service that will respond to requests and install it through WMI if you have the access.
In short: the answer to my question is no. Use WinRM, PsExec, or a custom Windows service solution.
I know this is an old question, but for anyone looking to do this, the following code works. (I know that it's not WMI. Given the OP's answer I didn't even try it with WMI, but I shudder to think that people may write a service for something like this.)
if (System.IO.Directory.Exists(@"[SOME UNC PATH]"))
{
System.IO.DirectoryInfo info = new System.IO.DirectoryInfo(@"[SOME UNC PATH]");
var securityInfo = info.GetAccessControl();
var rules = securityInfo.GetAccessRules(
true,
true,
typeof(System.Security.Principal.SecurityIdentifier));
foreach (var rule in rules)
{
var fileSystemRule = rule as System.Security.AccessControl.FileSystemAccessRule;
if (fileSystemRule != null)
{
string user = fileSystemRule.IdentityReference.Translate(
typeof(System.Security.Principal.NTAccount)).Value;
System.Diagnostics.Debug.Print("{0} User: {1} Permissions: {2}",
fileSystemRule.AccessControlType.ToString(),
user,
fileSystemRule.FileSystemRights.ToString());
}
}
}
When run it produces the following output:
Allow User: Everyone Permissions: ReadAndExecute, Synchronize
Allow User: CREATOR OWNER Permissions: FullControl
Allow User: NT AUTHORITY\SYSTEM Permissions: FullControl
Allow User: BUILTIN\Administrators Permissions: FullControl
Allow User: BUILTIN\Users Permissions: ReadAndExecute, Synchronize
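If the underlying goal is just to confirm effective read/write access rather than to enumerate the ACL, a blunt but practical alternative is to attempt the operations and catch the failures. Like the ACL listing above, this checks access from the machine the code runs on; the share path below is hypothetical:

string uncPath = @"\\server\share\folder";
bool canRead = false;
bool canWrite = false;

try
{
    // Listing the directory proves read access
    System.IO.Directory.GetFileSystemEntries(uncPath);
    canRead = true;
}
catch (System.UnauthorizedAccessException) { }
catch (System.IO.IOException) { }

try
{
    // Creating and deleting a throw-away file proves write access
    string probe = System.IO.Path.Combine(
        uncPath, System.Guid.NewGuid().ToString("N") + ".tmp");
    using (System.IO.File.Create(probe)) { }
    System.IO.File.Delete(probe);
    canWrite = true;
}
catch (System.UnauthorizedAccessException) { }
catch (System.IO.IOException) { }

System.Diagnostics.Debug.Print("Read: {0} Write: {1}", canRead, canWrite);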