This function takes a few seconds to run locally. It unpacks a zip file onto the local server and is called before serving individual files from the zip. It should run only once, when the extracted folder does not yet exist.
internal static void Init(string releaseName)
{
    var unpackedFolderDirectory = Settings.Editor.FilesRootFolder + releaseName + "\\";
    var dirInfo = new DirectoryInfo(unpackedFolderDirectory);
    if (dirInfo.Exists) return;
    lock ("UnpackLock")
    {
        dirInfo.Refresh();
        if (dirInfo.Exists) return;
        // Unpack all
        var bytes = getBytesFromAzure();
        using var ms = new MemoryStream(bytes);
        using (var archive = new ZipArchive(ms))
        {
            archive.ExtractToDirectory(unpackedFolderDirectory);
        }
    }
}
If I make multiple requests to this function, some of them return the error:
The file 'C:\SomeFolder\SomeSubFolder\SomeFile.png' already exists.
on the line archive.ExtractToDirectory(unpackedFolderDirectory);
I expect archive.ExtractToDirectory(unpackedFolderDirectory); to execute only once, but it appears to be running multiple times.
What am I doing wrong? Is there some race condition here I'm not spotting?
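Worth noting: a lock on a string literal only synchronizes within a single process, because the literal is interned per process; if the site runs in more than one worker process, two workers can still extract concurrently. A pattern that stays safe even then, sketched here reusing the question's Settings and getBytesFromAzure helpers (a sketch only, not necessarily the fix for this exact error), is to extract into a uniquely named temporary folder and publish it with a single Directory.Move, so no request ever observes a half-extracted folder:

internal static void Init(string releaseName)
{
    var unpackedFolderDirectory = Settings.Editor.FilesRootFolder + releaseName + "\\";
    if (Directory.Exists(unpackedFolderDirectory)) return;

    // Extract next to the target, under a unique temporary name
    var tempDirectory = Settings.Editor.FilesRootFolder + releaseName + "_" + Guid.NewGuid().ToString("N");
    var bytes = getBytesFromAzure();
    using (var ms = new MemoryStream(bytes))
    using (var archive = new ZipArchive(ms))
    {
        archive.ExtractToDirectory(tempDirectory);
    }

    try
    {
        // Publish atomically (same volume); this fails if another request won the race
        Directory.Move(tempDirectory, unpackedFolderDirectory.TrimEnd('\\'));
    }
    catch (IOException)
    {
        // Another request already created the folder; discard our copy
        Directory.Delete(tempDirectory, recursive: true);
    }
}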
I have been using WinSCP to periodically download files from a Unix server to a Windows server, and it has been working with no issues. I also check whether the remote file is older or already exists (in which case I don't copy).
Now I have to do the same, but this time I have to download files and folders. Files are copied fine, but folders aren't. When playing with the settings, I got it to copy the contents of the folders, but they get copied to my root local folder; I thought WinSCP would copy everything.
In the code below, LocalFolder is Z:\My_Data and LogRootFolder is /xyz/gtc/a00/
The folder structure on the remote is /xyz/gtc/a00/ABCD/outcomes/, with a subfolder "backup" that has many subfolders named as dates (e.g. /xyz/gtc/a00/ABCD/outcomes/backup/2021-06-23/).
Either none of the "backup/2021-xx-xx/" files and folders are copied, or they are all copied to Z:\My_Data\ABCD.
After setting up the session, called SFTP_Session:
string sRemotePath = LogRootFolder + "ABCD/outcomes/";
string sLocalFolder = Path.Combine(LocalFolder, @"ABCD\");
bool bDownload = false;
if (SFTP_Session.Opened)
{
    using (SFTP_Session)
    {
        SFTP_Session.QueryReceived += (sender, e) =>
        {
            ...
            e.Continue();
        };
        //var opts = EnumerationOptions.EnumerateDirectories | EnumerationOptions.AllDirectories;
        //IEnumerable<RemoteFileInfo> fileInfos = SFTP_Session.EnumerateRemoteFiles(sRemotePath, "*.dat", opts); <-- This copies files in folder(s) to my local root folder
        Regex mask = new Regex(@"\.(dat|err)$", RegexOptions.IgnoreCase);
        IEnumerable<RemoteFileInfo> fileInfos =
            SFTP_Session.EnumerateRemoteFiles(sRemotePath, null, EnumerationOptions.AllDirectories)
                .Where(fileInfo => mask.Match(fileInfo.Name).Success)
                .ToList();
        foreach (RemoteFileInfo fileInfo in fileInfos)
        {
            string localFilePath = Path.Combine(sLocalFolder, fileInfo.Name);
            if (fileInfo.IsDirectory)
            {
                // Create local subdirectory, if it does not exist yet
                if (!Directory.Exists(localFilePath))
                {
                    Directory.CreateDirectory(localFilePath);
                }
            }
            else
            {
                string remoteFilePath = RemotePath.EscapeFileMask(fileInfo.FullName);
                // If file does not exist in local folder, download
                if (!File.Exists(localFilePath))
                {
                    bDownload = true;
                }
                else // If file exists in local folder but is older, download; else skip
                {
                    DateTime remoteWriteTime = SFTP_Session.GetFileInfo(remoteFilePath).LastWriteTime;
                    DateTime localWriteTime = File.GetLastWriteTime(localFilePath);
                    bDownload = remoteWriteTime > localWriteTime;
                }
                if (bDownload)
                {
                    // Download file
                    TransferOptions oTrRes = new TransferOptions();
                    oTrRes.TransferMode = TransferMode.Automatic; // The transfer mode - Automatic, Binary, or Ascii
                    oTrRes.FilePermissions = null; // Permissions applied to remote files; null for default permissions
                    oTrRes.PreserveTimestamp = false; // Set last write time of destination file to that of source file
                    oTrRes.ResumeSupport.State = TransferResumeSupportState.Off;
                    TransferOperationResult transferResult = SFTP_Session.GetFiles(remoteFilePath, localFilePath, false, oTrRes); //.Replace("\\","")) // I thought this would get files AND folders
                    // Throw on any error
                    transferResult.Check();
                    foreach (TransferEventArgs transfer in transferResult.Transfers)
                    {
                        // Store local file info in a data table for processing later
                        ...
                    }
                    SessionRemoteExceptionCollection srec = transferResult.Failures;
                    foreach (SessionRemoteException sre in srec)
                    {
                        // Log errors
                    }
                    // Did the download succeed?
                    if (!transferResult.IsSuccess)
                    {
                        // Log error (but continue with other files)
                    }
                }
            }
        }
    }
}
In the end, in the local folder I see the downloaded files and the subfolders that I created (using the above code), but no files in those folders. I can't see what I am missing here.
Your code basically synchronizes a remote directory to a local one.
Instead of fixing your code, you can replace most of it with a single call to Session.SynchronizeDirectories:
https://winscp.net/eng/docs/library_session_synchronizedirectories
Try the synchronization first in WinSCP GUI to see if it does what you need.
If you need to do some processing with the synchronized files, use the SynchronizationResult returned by Session.SynchronizeDirectories. It contains a list of all synchronized files.
If you need to exclude some files from the synchronization, use TransferOptions.FileMask.
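A minimal sketch of that call with the paths from the question (the FileMask limiting it to .dat/.err files is my assumption, mirroring the question's regex):

TransferOptions transferOptions = new TransferOptions
{
    // Only synchronize the file types the question's regex matched
    FileMask = "*.dat; *.err"
};

SynchronizationResult result =
    SFTP_Session.SynchronizeDirectories(
        SynchronizationMode.Local,   // download: remote => local
        sLocalFolder, sRemotePath,
        removeFiles: false,          // never delete local files
        options: transferOptions);

// Throw on any error
result.Check();

// Process the files that were actually downloaded
foreach (TransferEventArgs download in result.Downloads)
{
    Console.WriteLine(download.FileName);
}

Because the synchronization recreates the remote directory structure locally, this should also fix the files landing in your root folder.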
I have the following code that I want to change to use batching. In this code, I first create a copy of a file, and then, using the id of the new file, I add a permission.
File readOnlyFile = new File();
readOnlyFile.Name = newFileName.Replace(' ', '-') + "_assess";
readOnlyFile.Parents = new List<string> { targetFolderId };
FilesResource.CopyRequest fileCreateRequest = _driveService.Files.Copy(readOnlyFile, fileId);
string readOnlyFileId = fileCreateRequest.Execute().Id;
if (readOnlyFileId != null)
{
    Permission newPermission = new Permission();
    newPermission.ExpirationTime = expirationDate;
    newPermission.Type = "anyone";
    newPermission.Role = "reader";
    PermissionsResource.CreateRequest req = _driveService.Permissions.Create(newPermission, readOnlyFileId);
    req.SendNotificationEmail = false;
    req.Execute();
}
However, I am puzzled about how to use a batch for this task, since I need the id of the newly copied file to add the permission. Below is my initial attempt, where I do not know how to proceed after batch.Queue(fileCreateRequest, callback). I can add a new action to the batch to add the permission, but I do not know how to get the id of the file. Any suggestions? I need to do this for three different files.
var batch = new BatchRequest(_driveService);
BatchRequest.OnResponse<Permission> callback = delegate (
    Permission permission,
    RequestError error,
    int index,
    System.Net.Http.HttpResponseMessage message)
{
    if (error != null)
    {
        // Handle error
        Console.WriteLine(error.Message);
    }
    else
    {
        Console.WriteLine("Permission ID: " + permission.Id);
    }
};
Permission newPermission = new Permission();
File readOnlyFile = new File();
readOnlyFile.Name = newFileName.Replace(' ', '-') + "_assess";
readOnlyFile.Parents = new List<string> { targetFolderId };
FilesResource.CopyRequest fileCreateRequest = _driveService.Files.Copy(readOnlyFile, fileId);
batch.Queue(fileCreateRequest, callback);
Copying a file and configuring its permissions are two different operations that cannot be batched together. Your approach is correct; you only have to set up the permissions in a second call.
After copying the files and retrieving the ids as you do, you have to create a second call to create the permissions.
There is no way to do it in a single request, because setting up the permissions requires the id of the file, and that id only exists after the copy has finished. If you need any more clarification, please don't hesitate to ask.
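As an illustration, a sketch of the two-step batching inside an async method (the fileIds list holding your three source files is an assumption; the other names follow your question):

// First batch: queue the three copy requests and collect the new ids
var copiedIds = new List<string>();
var copyBatch = new BatchRequest(_driveService);
foreach (string sourceId in fileIds) // hypothetical list of the three files
{
    var copy = new File { Parents = new List<string> { targetFolderId } };
    copyBatch.Queue<File>(
        _driveService.Files.Copy(copy, sourceId),
        (file, error, index, message) =>
        {
            if (error == null) copiedIds.Add(file.Id);
        });
}
await copyBatch.ExecuteAsync();

// Second batch: one permission request per copied file
var permissionBatch = new BatchRequest(_driveService);
foreach (string id in copiedIds)
{
    var permission = new Permission { Type = "anyone", Role = "reader" };
    permissionBatch.Queue<Permission>(
        _driveService.Permissions.Create(permission, id),
        (p, error, index, message) => { /* log the error, if any */ });
}
await permissionBatch.ExecuteAsync();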
I'm currently building a Windows service that will be used to create backups of logs. Currently, the logs are stored at the path E:\Logs, and the intent is to copy the contents into a new timestamped folder and compress it. After this, you should have E:\Logs and E:\Logs_[Timestamp].zip. The zip will be moved to C:\Backups\ for later processing. Currently, I am using the following to try to zip the log folder:
var logDirectory = "E://Logs";
var timeStamp = DateTime.Now.ToString("yyyyMMddHHmm");
var zippedFolder = logDirectory + "_" + timeStamp + ".zip";
System.IO.Compression.ZipFile.CreateFromDirectory(logDirectory, zippedFolder);
While this appears to create a zip file, I get the error: Windows cannot open the folder. The Compressed (zipped) Folder E:\Logs_201805161035.zip is invalid.
To address troubleshooting questions up front: the service runs under an AD account with sufficient permissions to perform administrative tasks. Another thing to consider is that the service kicks off when its FileSystemWatcher detects a new zip file in the path C:\Aggregate. Since many zip files are added to C:\Aggregate at once, the FileSystemWatcher creates a new Task for each zip found. You can see how this works in the following:
private void FileFoundInDrops(object sender, FileSystemEventArgs e)
{
    var aggregatePath = new DirectoryInfo("C://Aggregate");
    if (e.FullPath.Contains(".zip"))
    {
        Task task = Task.Factory.StartNew(() =>
        {
            try
            {
                var logDirectory = "E://Logs";
                var timeStamp = DateTime.Now.ToString("yyyyMMddHHmm");
                var zippedFolder = logDirectory + "_" + timeStamp + ".zip";
                ZipFile.CreateFromDirectory(logDirectory, zippedFolder);
            }
            catch (Exception ex)
            {
                Log.WriteLine(System.DateTime.Now.ToString() + " - ERROR: " + ex);
            }
        });
        task.Dispose();
    }
}
How can I get around the error I am receiving? Any help would be appreciated!
Environment: I have a Windows console application, and I am running the exe from the command line.
Below is my code:
static void Main(string[] args)
{
    CreatePSTUsingRedemption(args[0], args[1]);
}

private static void CreatePSTUsingRedemption(string messageFilePath, string pstPath)
{
    RDOSession pstSession = new RDOSession();
    RDOPstStore store = null;
    store = pstSession.LogonPstStore(pstPath, 1, "combinedPST");
    // actually there is a loop here to loop through each message file
    RDOMail rdo_Mail = pstSession.GetMessageFromMsgFile(messageFilePath);
    rdo_Mail.CopyTo(store.IPMRootFolder);
    rdo_Mail.Save();
    store.Save();
    completedCount++;
    Console.WriteLine("FILES_PROCESSED:" + completedCount);
    pstSession.Logoff();
}
The main purpose of this code is to create a single PST file combining email message (.msg) files.
When I run the exe, a PST file is created in the given location, and its size keeps increasing as the code runs. After all the message files are processed, the application exits. There is no error. However, when I try to load this PST file in Outlook 2013, the PST is empty, and its size is also reduced to 265 KB every time. I don't think any process is using this PST, because I can copy and move it anywhere I want. What might be the issue? Any suggestions, please?
UPDATE 1
private static void CreatePSTUsingRedemption(XmlNodeList nodelist, string pstPath)
{
    System.Diagnostics.Debugger.Launch();
    RDOSession pstSession = null;
    RDOPstStore store = null;
    RDOFolder folder = null;
    RDOMail rdo_Mail = null;
    try
    {
        pstSession = new RDOSession();
        store = pstSession.LogonPstStore(pstPath, 1, Path.GetFileNameWithoutExtension(pstPath));
        var enumerator = store.IPMRootFolder.Folders.GetEnumerator(); // DELETE DEFAULT FOLDERS
        while (enumerator.MoveNext())
        {
            var defaultFolders = enumerator.Current as RDOFolder;
            defaultFolders.Delete();
        }
        int completedCount = 0;
        folder = store.IPMRootFolder;
        foreach (XmlNode node in nodelist)
        {
            rdo_Mail = pstSession.GetMessageFromMsgFile(node["FullPath"].InnerText);
            rdo_Mail.CopyTo(folder);
            rdo_Mail.Save();
            store.Save();
            completedCount++;
            Console.WriteLine("FILES_PROCESSED:" + completedCount);
        }
    }
    finally
    {
        Marshal.ReleaseComObject(rdo_Mail);
        Marshal.ReleaseComObject(folder);
        Marshal.ReleaseComObject(store);
    }
    pstSession.Logoff();
    Marshal.ReleaseComObject(pstSession);
    GC.Collect();
}
Above is my code with the actual loop. I am loading all the email message file paths from an XML file. I still encounter the same issue as above.
This is an indication that the PST store is not fully flushed to disk, and Outlook "fixes" the PST file by resetting it.
Try to explicitly release all Redemption objects before logging off, and call GC.Collect(). If you have a loop processing multiple files, release the message on each iteration of the loop.
private static void CreatePSTUsingRedemption(string messageFilePath, string pstPath)
{
    RDOSession pstSession = null;
    RDOPstStore store = null;
    RDOFolder folder = null;
    RDOMail rdo_Mail = null;
    try
    {
        pstSession = new RDOSession();
        store = pstSession.LogonPstStore(pstPath, 1, "combinedPST");
        // actually there is a loop here to loop through each message file
        rdo_Mail = pstSession.GetMessageFromMsgFile(messageFilePath);
        folder = store.IPMRootFolder;
        rdo_Mail.CopyTo(folder);
        rdo_Mail.Save();
        store.Save();
        completedCount++;
        Console.WriteLine("FILES_PROCESSED:" + completedCount);
    }
    finally
    {
        // Release in reverse order of creation; guard against a failure partway through
        if (rdo_Mail != null) Marshal.ReleaseComObject(rdo_Mail);
        if (folder != null) Marshal.ReleaseComObject(folder);
        if (store != null) Marshal.ReleaseComObject(store);
    }
    pstSession.Logoff();
    Marshal.ReleaseComObject(pstSession);
    GC.Collect();
}
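Applied to the loop from UPDATE 1, that per-message release might look like this (a sketch reusing the names from your code; pstSession, store, folder, nodelist, and completedCount are set up as you already do):

foreach (XmlNode node in nodelist)
{
    RDOMail rdo_Mail = pstSession.GetMessageFromMsgFile(node["FullPath"].InnerText);
    rdo_Mail.CopyTo(folder);
    rdo_Mail.Save();
    store.Save();
    // Release each message as soon as it has been copied
    Marshal.ReleaseComObject(rdo_Mail);
    completedCount++;
    Console.WriteLine("FILES_PROCESSED:" + completedCount);
}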
I have a question about dealing with threads. I am copying files from one folder to another and then zipping them up. The problem is that the WinForms app appears to attempt to zip the files before they have finished copying, which in turn causes the zip function to never complete. I did some looking around on here, and to be honest, I am having issues wrapping my head around how it works. MSDN has a nice little snippet:
// Wait on a single task with no timeout specified.
Task taskA = Task.Factory.StartNew(() => DoSomeWork(10000000));
taskA.Wait();
Console.WriteLine("taskA has completed.");

static void DoSomeWork(int val)
{
    // Pretend to do something.
    Thread.SpinWait(val);
}
Again, it's a bit over my head. Here is the code whose completion I am trying to wait for:
error_handling("Daily Backup Started", "BackupLog.txt");
string fileName = "";
string Source = @"C:\folder\Program";
string target = @"C:\folder\day_backup";
string datestamp = DateTime.Now.ToString("MMddyy-HHmm");
string[] files = System.IO.Directory.GetFiles(Source, "*.mdb");
foreach (string file in files)
{
    fileName = System.IO.Path.GetFileName(file);
    string destfile = System.IO.Path.Combine(target, fileName);
    System.IO.File.Copy(file, destfile);
    sub_error_handling(fileName + " has been copied", "DailyBackupLog.txt");
}
compression(@"C:\backupfolder\day_backup", @"\day_backup" + datestamp + ".zip");
sub_error_handling("Files were packaged for transmission", "DailyBackupLog.txt");
Also, here is my zip code:
private void compression(string zipdir, string zipfilename)
{
    try
    {
        using (ZipFile zip = new ZipFile())
        {
            zip.AddDirectory(zipdir);
            zip.Comment = "This backup was created at " + System.DateTime.Now.ToString("G");
            zip.Save(zipdir + zipfilename);
        }
    }
    catch (Exception error)
    {
        error_handling("Incremental Backup Failed: Compression was unsuccessful", "Incbackuplog.txt");
        sub_error_handling(error + "", "Incbackuplog.txt");
        error_handling("End Of Error Report", "Incbackuplog.txt");
    }
}
It won't let me use a void method to perform the new task, so I'm not sure what else to try. Any suggestions?
I'd say this is causing the hang:
zip.AddDirectory(zipdir);
zip.Comment = "This backup was created at " + System.DateTime.Now.ToString("G");
zip.Save(zipdir + zipfilename);
You are compressing all files in zipdir into a new file in the same location, so the new file will be included in the zip file. In other words, you are trying to include the new file inside itself, which is impossible and explains, I think, why the zip process never ends.
By the way, you've set zipdir to a directory that's not the same as the one your code copies the files to.
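A minimal sketch of a fix along those lines, keeping your method and only changing where the archive is saved (the C:\Backups target comes from the question's stated intent):

private void compression(string zipdir, string zipfilename)
{
    try
    {
        using (ZipFile zip = new ZipFile())
        {
            zip.AddDirectory(zipdir);
            zip.Comment = "This backup was created at " + System.DateTime.Now.ToString("G");
            // Save outside the directory being compressed, so the archive
            // cannot end up inside itself; TrimStart because the caller
            // passes a filename with a leading backslash
            zip.Save(System.IO.Path.Combine(@"C:\Backups", zipfilename.TrimStart('\\')));
        }
    }
    catch (Exception error)
    {
        error_handling("Incremental Backup Failed: Compression was unsuccessful", "Incbackuplog.txt");
        sub_error_handling(error + "", "Incbackuplog.txt");
        error_handling("End Of Error Report", "Incbackuplog.txt");
    }
}

Note also that File.Copy is synchronous, so by the time compression is called, the copy loop on that thread has already finished; no extra waiting should be needed.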