I have been using WinSCP to periodically download files from a Unix server to a Windows server, and it has been working with no issues. I also check whether the remote file is older or already exists locally (in which case I don't copy it).
Now I have to do the same, but this time I have to download files and folders. Files are copied fine, but folders aren't. When playing with the settings, I got it to copy the contents of the folders, but they get copied to my local root folder; I thought WinSCP would copy everything.
In the code below, LocalFolder is Z:\My_Data and LogRootFolder is /xyz/gtc/a00/
The folder structure on the remote side is /xyz/gtc/a00/ABCD/outcomes/ with a subfolder "backup" that has many subfolders named as dates (e.g. /xyz/gtc/a00/ABCD/outcomes/backup/2021-06-23/).
Either none of the "backup/2021-xx-xx/" files and folders are copied, or they are all copied directly to Z:\My_Data\ABCD.
After setting up the session, called SFTP_Session:
string sRemotePath = LogRootFolder + "ABCD/outcomes/";
string sLocalFolder = Path.Combine(LocalFolder, @"ABCD\");

if (SFTP_Session.Opened)
{
    using (SFTP_Session)
    {
        SFTP_Session.QueryReceived += (sender, e) =>
        {
            ...
            e.Continue();
        };

        //var opts = EnumerationOptions.EnumerateDirectories | EnumerationOptions.AllDirectories;
        //IEnumerable<RemoteFileInfo> fileInfos = SFTP_Session.EnumerateRemoteFiles(sRemotePath, "*.dat", opts); // <-- This copies files in folder(s) to my local root folder

        Regex mask = new Regex(@"\.(dat|err)$", RegexOptions.IgnoreCase);
        IEnumerable<RemoteFileInfo> fileInfos =
            SFTP_Session.EnumerateRemoteFiles(sRemotePath, null, EnumerationOptions.AllDirectories)
                .Where(fileInfo => mask.Match(fileInfo.Name).Success)
                .ToList();

        foreach (RemoteFileInfo fileInfo in fileInfos)
        {
            string localFilePath = Path.Combine(sLocalFolder, fileInfo.Name);

            if (fileInfo.IsDirectory)
            {
                // Create local subdirectory, if it does not exist yet
                if (!Directory.Exists(localFilePath))
                {
                    Directory.CreateDirectory(localFilePath);
                }
            }
            else
            {
                string remoteFilePath = RemotePath.EscapeFileMask(fileInfo.FullName);

                bool bDownload;
                // If the file does not exist in the local folder, download it
                if (!File.Exists(localFilePath))
                {
                    bDownload = true;
                }
                else // If the file exists in the local folder but is older, download it; else skip it
                {
                    DateTime remoteWriteTime = SFTP_Session.GetFileInfo(remoteFilePath).LastWriteTime;
                    DateTime localWriteTime = File.GetLastWriteTime(localFilePath);
                    bDownload = remoteWriteTime > localWriteTime;
                }

                if (bDownload)
                {
                    // Download file
                    TransferOptions oTrRes = new TransferOptions();
                    oTrRes.TransferMode = TransferMode.Automatic; // Transfer mode: Automatic, Binary, or Ascii
                    oTrRes.FilePermissions = null; // Permissions applied to remote files; null for default permissions
                    oTrRes.PreserveTimestamp = false; // Do not set the last write time of the destination file to that of the source file
                    oTrRes.ResumeSupport.State = TransferResumeSupportState.Off;

                    TransferOperationResult transferResult = SFTP_Session.GetFiles(remoteFilePath, localFilePath, false, oTrRes); // I thought this would get files AND folders

                    // Throw on any error
                    transferResult.Check();

                    foreach (TransferEventArgs transfer in transferResult.Transfers)
                    {
                        // Store local file info in a data table for processing later
                        ...
                    }

                    SessionRemoteExceptionCollection srec = transferResult.Failures;
                    foreach (SessionRemoteException sre in srec)
                    {
                        // Log errors
                    }

                    // Did the download succeed?
                    if (!transferResult.IsSuccess)
                    {
                        // Log error (but continue with other files)
                    }
                }
            }
        }
    }
}
In the end, in the local folder I see the downloaded files and the subfolders created by the code above, but there are no files inside those subfolders. I can't see what I am missing here.
Your code basically synchronizes a remote directory to a local one.
Instead of fixing your code, you can replace most of it with a single call to Session.SynchronizeDirectories:
https://winscp.net/eng/docs/library_session_synchronizedirectories
Try the synchronization first in WinSCP GUI to see if it does what you need.
If you need to do some processing with the synchronized files, use the SynchronizationResult returned by Session.SynchronizeDirectories. It contains a list of all synchronized files.
If you need to exclude some files from the synchronization, use TransferOptions.FileMask.
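For illustration, a minimal sketch of what that could look like, reusing the paths from the question (the exact options are assumptions, not a drop-in replacement):

// Sketch only: assumes an opened WinSCP Session named SFTP_Session, as in the question
TransferOptions transferOptions = new TransferOptions
{
    // Limit the synchronization to *.dat and *.err files
    FileMask = "*.dat; *.err"
};

SynchronizationResult result =
    SFTP_Session.SynchronizeDirectories(
        SynchronizationMode.Local,
        Path.Combine(LocalFolder, "ABCD"),   // local:  Z:\My_Data\ABCD
        LogRootFolder + "ABCD/outcomes",     // remote: /xyz/gtc/a00/ABCD/outcomes
        removeFiles: false,
        options: transferOptions);

// Throw on any error
result.Check();

foreach (TransferEventArgs download in result.Downloads)
{
    // Each entry describes one downloaded file; process as needed
    Console.WriteLine(download.FileName);
}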
Related
Normally we can move a directory using:
// source is: "C:\Songs\Elvis my Man"
// newLocation is: "C:\Songs\Elvis"
try
{
    // Previous command was: Directory.Move(source, newLocation);
    DirectoryInfo dir = new DirectoryInfo(source);
    dir.MoveTo(newLocation);
}
catch (Exception e)
{
    Console.WriteLine("Error: " + e.Message);
}
Now, when using Azure:
string myconnectionString = ConfigurationManager.ConnectionStrings["StorageConnection"].ConnectionString.ToString();
string myshareName = "Mysampleshare";
string mydirName = "Mysampledir";
// Get a reference to a share and then create it
ShareClient myshare = new ShareClient(myconnectionString, myshareName);
ShareDirectoryClient directory1 = myshare.GetDirectoryClient(mydirName);
string myshareName2 = "Mysampleshare2";
string mydirName2 = "Mysampledir2";
// Get a reference to a share and then create it
ShareClient myshare2 = new ShareClient(myconnectionString, myshareName2);
ShareDirectoryClient directory2 = myshare2.GetDirectoryClient(mydirName2);
Directory.Move(directory1.Path, directory2.Path);
I am unable to move the directory from one location to another using Azure. I am getting this exception:
DirectoryNotFoundException: The path specified by sourceDirName is invalid
Please advise.
ShareDirectoryClient essentially implements the Azure File Service REST API, thus you cannot use System.IO operations like Directory.Move with it.
There are two possible solutions:
If you want to use the SDK, you will need to list files and directories recursively in the source directory and then copy individual files from the source directory to the target directory, creating directories in the target as you go. Once the copy operation is complete, you will need to delete all files and directories from the source directory. Only once the source directory is empty will you be able to delete it.
You need to do all this because the REST API does not natively support a move operation. To accomplish a move, you have to perform a copy operation followed by a delete operation.
If you want to use System.IO, then you will need to mount the file share as a shared network drive so that you get a drive letter assigned to it. Once you have that, you will be able to use operations like Directory.Move from the System.IO namespace.
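For reference, mounting typically looks something like this (account name, share name, and key are placeholders):

net use Z: \\account-name.file.core.windows.net\share-name /u:AZURE\account-name account-key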
UPDATE
Please try the code below:
using System;
using System.Threading.Tasks;
using Azure.Storage.Files.Shares;

namespace SO69798149
{
    class Program
    {
        const string MyconnectionString = "DefaultEndpointsProtocol=https;AccountName=account-name;AccountKey=account-key";
        const string MyshareName = "share-name";
        const string SourceDirectoryName = "source-directory-name";
        private const string RenamedDirectoryName = "new-directory-name";

        static async Task Main(string[] args)
        {
            ShareClient myshare = new ShareClient(MyconnectionString, MyshareName);
            ShareDirectoryClient sourceDirectoryClient = myshare.GetDirectoryClient(SourceDirectoryName);
            ShareDirectoryClient targetDirectoryClient = myshare.GetDirectoryClient(RenamedDirectoryName);
            await RenameDirectory(sourceDirectoryClient, targetDirectoryClient);
            Console.WriteLine("Directory renamed.");
        }

        static async Task RenameDirectory(ShareDirectoryClient sourceDirectoryClient,
            ShareDirectoryClient targetDirectoryClient)
        {
            // Create target directory
            await targetDirectoryClient.CreateIfNotExistsAsync();

            // List files and folders from the source directory
            var result = sourceDirectoryClient.GetFilesAndDirectoriesAsync();
            await foreach (var items in result.AsPages())
            {
                foreach (var item in items.Values)
                {
                    if (item.IsDirectory)
                    {
                        // If the item is a directory, process its child items recursively
                        await RenameDirectory(sourceDirectoryClient.GetSubdirectoryClient(item.Name),
                            targetDirectoryClient.GetSubdirectoryClient(item.Name));
                    }
                    else
                    {
                        // If the item is a file, copy it and then delete the source
                        var sourceFileClient = sourceDirectoryClient.GetFileClient(item.Name);
                        var targetFileClient = targetDirectoryClient.GetFileClient(item.Name);
                        await targetFileClient.StartCopyAsync(sourceFileClient.Uri);
                        await sourceFileClient.DeleteIfExistsAsync();
                    }
                }
            }

            // Delete the (now empty) source directory
            await sourceDirectoryClient.DeleteIfExistsAsync();
        }
    }
}
I am trying to create a simple "directory/file copy" console application in C#. What I need is to copy all folders and files (keeping the original hierarchy) from one drive to another, like from drive C:\Data to drive E:\Data.
However, I only want it to copy any NEW or MODIFIED files from the source to the destination.
If the file on the destination drive is newer than the one on the source drive, then it does not copy.
(the problem)
In the code I have, it's comparing file "abc.pdf" in the source with file "xyz.pdf" in the destination, and thus it overwrites the destination file with whatever is in the source, even though the destination file is newer. I am trying to figure out how to make it compare "abc.pdf" in the source to "abc.pdf" in the destination.
This works if I drill the source and destination down to a specific file, but when I back out to the folder level, it overwrites the destination file with the source file, even though the destination file is newer.
(my solutions – that didn’t work)
I thought that by putting the "if (file.LastWriteTime > destination.LastWriteTime)" check after the "foreach" statement, it would compare the files in the two folders one-to-one, File1 in source to File1 in destination, but it doesn't.
It seems I'm missing something in either the "FileInfo[]", "foreach" or "if" statements to make this a one-to-one comparison. I think maybe some reference to the "Path.Combine" statement or a "SearchOption.AllDirectories", but I'm not sure.
Any suggestions?
As you can see from my basic code sample, I'm new to C# so please put your answer in simple terms.
Thank you.
Here is the code I have tried, but it’s not working.
class Copy
{
    public static void CopyDirectory(DirectoryInfo source, DirectoryInfo destination)
    {
        if (!destination.Exists)
        {
            destination.Create();
        }

        // Copy files.
        FileInfo[] files = source.GetFiles();
        FileInfo[] destFiles = destination.GetFiles();
        foreach (FileInfo file in files)
            foreach (FileInfo fileD in destFiles)
                // Copy only modified files
                if (file.LastWriteTime > fileD.LastWriteTime)
                {
                    file.CopyTo(Path.Combine(destination.FullName, file.Name), true);
                }
                // Copy all new files
                else if (!fileD.Exists)
                {
                    file.CopyTo(Path.Combine(destination.FullName, file.Name), true);
                }

        // Process subdirectories.
        DirectoryInfo[] dirs = source.GetDirectories();
        foreach (DirectoryInfo dir in dirs)
        {
            // Get destination directory.
            string destinationDir = Path.Combine(destination.FullName, dir.Name);

            // Call CopyDirectory() recursively.
            CopyDirectory(dir, new DirectoryInfo(destinationDir));
        }
    }
}
You can just take the array of files in "source" and check for a matching name in "destination":
/// <summary>
/// Checks whether the target file needs an update (if it doesn't exist, it needs one).
/// </summary>
public static bool NeedsUpdate(FileInfo localFile, DirectoryInfo localDir, DirectoryInfo backUpDir)
{
    bool needsUpdate = false;
    if (!File.Exists(Path.Combine(backUpDir.FullName, localFile.Name)))
    {
        needsUpdate = true;
    }
    else
    {
        FileInfo backUpFile = new FileInfo(Path.Combine(backUpDir.FullName, localFile.Name));
        DateTime lastBackUp = backUpFile.LastWriteTimeUtc;
        DateTime lastChange = localFile.LastWriteTimeUtc;
        if (lastChange != lastBackUp)
        {
            needsUpdate = true;
        }
        else
        {
            /* no change */
        }
    }
    return needsUpdate;
}
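With that helper, the nested foreach in the question could become a single check per source file. A sketch (source and destination are the DirectoryInfo parameters from the question's CopyDirectory):

// Sketch: copy only the files that NeedsUpdate flags
foreach (FileInfo file in source.GetFiles())
{
    if (NeedsUpdate(file, source, destination))
    {
        file.CopyTo(Path.Combine(destination.FullName, file.Name), true);
    }
}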
Update:
I modified my code with the suggestions above and all went well. It did exactly what I expected.
However, the problem I ran into was the amount of time it took to run the application on a large folder (containing 6,000 files and 5 sub-folders).
On a small folder (28 files in 5 sub-folders), it only took a few seconds to run. But on the larger folder it took 35 minutes to process only 1,300 files.
Solution:
The code below will do the same thing but much faster. This new version processed 6,000 files in about 10 seconds. It processed 40,000 files in about 1 minute and 50 seconds.
What this new code does (and doesn’t do)
If the destination folder is empty, copy all from the source to the destination.
If the destination has some or all of the same files / folders as the source, compare and copy any new or modified files from the source to the destination.
If the destination file is newer than the source, don’t copy.
So, here’s the code to make it happen. Enjoy and share.
Thanks to everyone who helped me get a better understanding of this.
using System;
using System.IO;

namespace VSU1vFileCopy
{
    class Program
    {
        static void Main(string[] args)
        {
            const string Src_FOLDER = @"C:\Data";
            const string Dest_FOLDER = @"E:\Data";

            string[] originalFiles = Directory.GetFiles(Src_FOLDER, "*", SearchOption.AllDirectories);

            Array.ForEach(originalFiles, (originalFileLocation) =>
            {
                FileInfo originalFile = new FileInfo(originalFileLocation);
                FileInfo destFile = new FileInfo(originalFileLocation.Replace(Src_FOLDER, Dest_FOLDER));

                if (destFile.Exists)
                {
                    // Copy only if the source file is newer than the destination file
                    if (originalFile.LastWriteTimeUtc > destFile.LastWriteTimeUtc)
                    {
                        originalFile.CopyTo(destFile.FullName, true);
                    }
                }
                else
                {
                    Directory.CreateDirectory(destFile.DirectoryName);
                    originalFile.CopyTo(destFile.FullName, false);
                }
            });
        }
    }
}
I have a FileSystemWatcher for a local directory. It's working fine. I want to implement the same for FTP. Is there any way I can achieve it? I have checked many solutions, but it's not clear.
Logic: I want to get files from FTP that are newer than some timestamp.
Problem faced: getting all files from FTP and then filtering the result hits performance (I used FtpWebRequest).
Is there any right way to do this? (WinSCP is on hold. Can't use it now.)
FileSystemWatcher oFsWatcher = new FileSystemWatcher();
OFSWatchers.Add(oFsWatcher);
oFsWatcher.Path = sFilePath;
oFsWatcher.Filter = string.IsNullOrWhiteSpace(sFileFilter) ? "*.*" : sFileFilter;
oFsWatcher.NotifyFilter = NotifyFilters.FileName;
oFsWatcher.EnableRaisingEvents = true;
oFsWatcher.IncludeSubdirectories = bIncludeSubdirectories;
oFsWatcher.Created += new FileSystemEventHandler(OFsWatcher_Created);
You cannot use the FileSystemWatcher or any other way, because the FTP protocol does not have any API to notify a client about changes in the remote directory.
All you can do is periodically iterate the remote tree and look for changes.
It's actually rather easy to implement if you use an FTP client library that supports recursive listing of a remote tree. Unfortunately, the built-in .NET FTP client, FtpWebRequest, does not. But for example with the WinSCP .NET assembly, you can use the Session.EnumerateRemoteFiles method.
See the article Watching for changes in SFTP/FTP server:
// Setup session options
SessionOptions sessionOptions = new SessionOptions
{
    Protocol = Protocol.Ftp,
    HostName = "example.com",
    UserName = "user",
    Password = "password",
};

using (Session session = new Session())
{
    // Connect
    session.Open(sessionOptions);

    List<string> prevFiles = null;

    while (true)
    {
        // Collect file list
        List<string> files =
            session.EnumerateRemoteFiles(
                "/remote/path", "*.*", EnumerationOptions.AllDirectories)
            .Select(fileInfo => fileInfo.FullName)
            .ToList();

        if (prevFiles == null)
        {
            // In the first round, just print number of files found
            Console.WriteLine("Found {0} files", files.Count);
        }
        else
        {
            // Then look for differences against the previous list
            IEnumerable<string> added = files.Except(prevFiles);
            if (added.Any())
            {
                Console.WriteLine("Added files:");
                foreach (string path in added)
                {
                    Console.WriteLine(path);
                }
            }

            IEnumerable<string> removed = prevFiles.Except(files);
            if (removed.Any())
            {
                Console.WriteLine("Removed files:");
                foreach (string path in removed)
                {
                    Console.WriteLine(path);
                }
            }
        }

        prevFiles = files;

        Console.WriteLine("Sleeping 10s...");
        Thread.Sleep(10000);
    }
}
(I'm the author of WinSCP)
Though, if you actually want to just download the changes, it's way easier: just use Session.SynchronizeDirectories in a loop.
while (true)
{
    SynchronizationResult result =
        session.SynchronizeDirectories(
            SynchronizationMode.Local, @"C:\local\path", "/remote/path", true);
    result.Check();

    // You can inspect result.Downloads for a list of updated files

    Console.WriteLine("Sleeping 10s...");
    Thread.Sleep(10000);
}
This will update even modified files, not only new files.
Though, using the WinSCP .NET assembly from a web application might be problematic. If you do not want to use a 3rd-party library, you have to live with the limitations of the FtpWebRequest. For an example of how to recursively list a remote directory tree with the FtpWebRequest, see my answer to List names of files in FTP directory and its subdirectories.
You have edited your question to say that you have performance problems with the solutions I've suggested. Though you have already asked a new question that covers this:
Get FTP file details based on datetime in C#
Unless you have access to the OS which hosts the service, it will be a bit harder.
FileSystemWatcher places a hook on the filesystem, which notifies your application as soon as something happens.
The FTP command specification does not have such a hook. Besides that, the communication is always initiated by the client.
Therefore, to implement such logic, you have to periodically perform an NLST to list the FTP directory contents and track the changes (or modification timestamps, via MDTM) yourself, as in the sketch below.
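A minimal sketch of one polling round with FtpWebRequest (host, credentials, and path are placeholders; assumes using System, System.Net, System.IO, and System.Collections.Generic):

// Sketch: one polling round using NLST (ListDirectory) and MDTM (GetDateTimestamp)
var listRequest = (FtpWebRequest)WebRequest.Create("ftp://example.com/remote/path/");
listRequest.Method = WebRequestMethods.Ftp.ListDirectory; // NLST
listRequest.Credentials = new NetworkCredential("user", "password");

var names = new List<string>();
using (var listResponse = (FtpWebResponse)listRequest.GetResponse())
using (var reader = new StreamReader(listResponse.GetResponseStream()))
{
    string line;
    while ((line = reader.ReadLine()) != null)
    {
        names.Add(line);
    }
}

// Compare 'names' against the previous round's list to detect added/removed entries;
// to detect modified files, query each entry's timestamp via MDTM
foreach (string name in names)
{
    var timeRequest = (FtpWebRequest)WebRequest.Create("ftp://example.com/remote/path/" + name);
    timeRequest.Method = WebRequestMethods.Ftp.GetDateTimestamp; // MDTM
    timeRequest.Credentials = new NetworkCredential("user", "password");

    using (var timeResponse = (FtpWebResponse)timeRequest.GetResponse())
    {
        Console.WriteLine("{0}: {1}", name, timeResponse.LastModified);
    }
}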
More info:
FTP return codes
FTP
I have found an alternative solution for my functionality.
Explanation:
I am downloading the files from FTP (read permission required) with the same folder structure.
So every time the job/service runs, I can check whether the same file (full path) already exists in the physical path. If it does not exist, it can be considered a new file, and I can take some action for it and download it as well.
It's just an alternative solution.
Code Changes:
private static void GetFiles()
{
    using (FtpClient conn = new FtpClient())
    {
        string ftpPath = "ftp://myftp/";
        string downloadFileName = @"C:\temp\FTPTest\";

        conn.Host = ftpPath;
        //conn.Credentials = new NetworkCredential("ftptest", "ftptest");
        conn.Connect();

        // Get all directories
        foreach (FtpListItem item in conn.GetListing(conn.GetWorkingDirectory(),
            FtpListOption.Modify | FtpListOption.Recursive))
        {
            // If this is a file
            if (item.Type == FtpFileSystemObjectType.File)
            {
                string localFilePath = downloadFileName + item.FullName;

                // Only newly created files will be downloaded
                if (!File.Exists(localFilePath))
                {
                    conn.DownloadFile(localFilePath, item.FullName);
                    // Do any action here.
                    Console.WriteLine(item.FullName);
                }
            }
        }
    }
}
I'm writing an application launcher as a Windows application in C#, VS 2017. Currently, I'm having a problem with this piece of code:
if (System.IO.Directory.Exists(extractPath))
{
    string[] files = System.IO.Directory.GetFiles(extractPath);
    string[] dirs = Directory.GetDirectories(extractPath);

    // Copy the files and overwrite destination files if they already exist.
    foreach (string s in files)
    {
        // Use static Path methods to extract only the file name from the path.
        var fileName = System.IO.Path.GetFileName(s);
        var destFile = System.IO.Path.Combine(oldPath, fileName);
        System.IO.File.Move(s, destFile);
    }

    foreach (string dir in dirs)
    {
        //var dirSplit = dir.Split('\\');
        //var last = dirSplit.Last();
        //if (last != "Resources")
        //{
        var fileName = System.IO.Path.GetFileName(dir);
        var destFile = System.IO.Path.Combine(oldPath, fileName);
        System.IO.Directory.Move(dir, destFile);
        //}
    }
}
I'm getting the well-known error:
"The process cannot access the file 'XXX' because it is being used by another process."
I was looking for a solution and found several on MSDN and Stack Overflow, but my problem is quite specific: only one directory fails to move, the Resources folder of my main application.
Here is my explanation of why the problem is specific:
I'm not having any issues moving other files from the parent directory. The error occurs only when the loop reaches the /Resources directory.
At first, I thought it was being used by the VS instance in which I had the main app opened. Nothing changed after closing VS and killing the process.
I've copied and moved the whole project to another directory. I never opened it in VS nor started it via the *.exe file, to make sure that none of the files in the new, copied directory is used by any process.
Finally, I've restarted the PC.
I know that this error is pretty common when you try to delete/move files, but in my case I'm sure the files are used only by my launcher app. Here is a slightly longer sample to show what file operations I'm actually doing:
private void RozpakujRepo()
{
    string oldPath = @"path\Debug Kopia\Old";
    string extractPath = @"path\Debug Kopia";
    var tempPath = @"path\ZipRepo\RexTempRepo.zip";

    if (System.IO.File.Exists(tempPath) == true)
    {
        System.IO.File.Delete(tempPath);
    }
    System.IO.Compression.ZipFile.CreateFromDirectory(extractPath, tempPath);

    if (System.IO.Directory.Exists(oldPath))
    {
        DeleteDirectory(oldPath);
    }
    if (!System.IO.Directory.Exists(oldPath))
    {
        System.IO.Directory.CreateDirectory(oldPath);
    }

    if (System.IO.Directory.Exists(extractPath))
    {
        string[] files = System.IO.Directory.GetFiles(extractPath);
        string[] dirs = Directory.GetDirectories(extractPath);

        // Copy the files and overwrite destination files if they already exist.
        foreach (string s in files)
        {
            // Use static Path methods to extract only the file name from the path.
            var fileName = System.IO.Path.GetFileName(s);
            var destFile = System.IO.Path.Combine(oldPath, fileName);
            System.IO.File.Move(s, destFile);
        }

        foreach (string dir in dirs)
        {
            //var dirSplit = dir.Split('\\');
            //var last = dirSplit.Last();
            //if (last != "Resources")
            //{
            var fileName = System.IO.Path.GetFileName(dir);
            var destFile = System.IO.Path.Combine(oldPath, fileName);
            System.IO.Directory.Move(dir, destFile);
            //}
        }
    }

    string zipPath = @"path\ZipRepo\RexRepo.zip";
    ZipFile.ExtractToDirectory(zipPath, extractPath);
}
And now, my questions:
Can it be related to the file types (.png, .ico, .bmp)?
Can it be related to the fact that those resource files are used, for example, as the .exe file icon in my main application? Or just because they are resource files?
Is there anything else I'm missing that could cause the error?
EDIT:
To clarify:
There are 2 apps:
Main Application
Launcher Application (to launch Main Application)
And the Resources folder is Main Application/Resources; I'm moving it while doing an application version update.
It turned out that the problem was in a different place than the /Resources directory. The problem was actually with the /Old directory: since it lives inside the directory being moved, the loop tried to move it into itself, which caused infinite recursion.
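A minimal fix, based on the code above, is to skip oldPath while enumerating (a sketch):

foreach (string dir in dirs)
{
    // Skip the Old directory itself; moving it into itself causes the recursion
    if (string.Equals(dir, oldPath, StringComparison.OrdinalIgnoreCase))
    {
        continue;
    }

    var fileName = System.IO.Path.GetFileName(dir);
    var destFile = System.IO.Path.Combine(oldPath, fileName);
    System.IO.Directory.Move(dir, destFile);
}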
How can I connect to a folder on a website already hosted on IIS and compare it with a local client folder?
This is my website URL on localhost:
IISHostedWebsite/Updates // folder in website
I need to compare the files in that Updates folder with my local client machine's D:\Updates folder.
If new updates are available on the server, they should be copied to my D:\Updates folder.
How can we achieve this sort of scenario?
I have some code in C#:
var directory = new DirectoryInfo(@"D:\Anand\Work\FolderCheck\Server");
var myFile = (from f in directory.GetFiles()
              orderby f.LastWriteTime descending
              select f).First();
This code gets the latest updated file from the folder.
The MSDeploy tool (http://www.iis.net/downloads/microsoft/web-deploy) was designed to solve this problem. It lets you compare IIS virtual directories with other directories, and synchronize them if needed. It can also be used just as a diff tool.
In your example, after installing MSDeploy, you could do the following:
msdeploy.exe -verb:sync -source:contentPath="IISHostedWebSite/Updates" -dest:contentPath=d:\updates -whatif
This command will show the list of changes needed to make d:\updates look like IISHostedWebSites/Updates. If you remove the "-whatif", it will actually make the changes.
You can also call MSDeploy programmatically to do the same thing.
Also, the problem with your code snippet is that you wouldn't detect files deleted from the source that also need to be deleted from the destination.
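One way to call it from code is simply to shell out to the executable. A sketch (the msdeploy.exe path below is a typical default install location, not guaranteed):

// Sketch: run the same sync from C#; the msdeploy.exe path is an assumption
var startInfo = new System.Diagnostics.ProcessStartInfo
{
    FileName = @"C:\Program Files\IIS\Microsoft Web Deploy V3\msdeploy.exe",
    Arguments = "-verb:sync -source:contentPath=\"IISHostedWebSite/Updates\" " +
                @"-dest:contentPath=d:\updates -whatif",
    UseShellExecute = false,
    RedirectStandardOutput = true
};

using (var process = System.Diagnostics.Process.Start(startInfo))
{
    // Print the list of changes MSDeploy would make
    Console.WriteLine(process.StandardOutput.ReadToEnd());
    process.WaitForExit();
}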
Here my code gets the files from the directory and checks whether they are the same; if not, it shows "No updates found". After that, it checks whether a newer file is available in the folder; if one is, that file should be copied to the destination folder. This is my task. I have completed the part that gets the latest updated file, but I don't know how to copy that file into the destination folder.
string serverPath = @"D:\Anand\Work\FolderCheck\Server"; // source folder
string clientPath = @"D:\Anand\Work\FolderCheck\Client"; // destination folder

private static bool CompareFileSizes(string fileServer, string fileClient)
{
    bool fileSizeEqual = true;
    if (fileServer.Length == fileClient.Length) // Compare file sizes
    {
        fileSizeEqual = false; // File sizes are not equal, therefore files are not identical
    }
    return fileSizeEqual;
}
try
{
    if (!File.Exists(serverPath) || !File.Exists(clientPath))
    {
        try
        {
            var Server = Path.GetFileName(serverPath);
            var Client = Path.GetFileName(clientPath);
            string ServerFile = Server.ToString();
            string ClientFile = Client.ToString();

            if (CompareFileSizes(ServerFile, ClientFile))
            {
                lblServerMsg.Text = "No Updates are Found: ";
            }
            else
            {
                // Check the latest available file from the server
                var directoryServer = new DirectoryInfo(@"D:\Anand\Work\FolderCheck\Server");
                var myFile = (from f in directoryServer.GetFiles()
                              orderby f.LastWriteTime descending
                              select f).First();
                lblServerMsg.Text = "Updates Are Available Click for Update Button:";
                btnCheckUpates.Visible = false;
                btnUpdates.Visible = true;
            }
        }
        catch (Exception msg)
        {
            lblServerMsg.Text = "No Updates are Found: " + msg.Message;
        }
    }
    else
    {
        throw new FileNotFoundException();
    }
}
catch (FileNotFoundException ex)
{
    lblServerMsg.Text = ex.Message;
}
With the code above I get the latest file from the source folder, but I don't know how to proceed from here:
var myFile = (from f in directoryServer.GetFiles()
              orderby f.LastWriteTime descending
              select f).First();
In the code above, myFile is the latest file from the source folder; this is the file I want to copy into the destination folder.
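If it helps, a minimal sketch of that final step, reusing the paths from the question (FileInfo.CopyTo overwrites when the second argument is true):

// Sketch: copy the newest file from the source folder into the destination folder
var directoryServer = new DirectoryInfo(@"D:\Anand\Work\FolderCheck\Server");
FileInfo myFile = directoryServer.GetFiles()
    .OrderByDescending(f => f.LastWriteTime)
    .First();

string destination = Path.Combine(@"D:\Anand\Work\FolderCheck\Client", myFile.Name);
myFile.CopyTo(destination, true); // overwrite the destination copy if it exists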