I'm trying to find out if the user added new music to the Music folder on the phone since the app was last used.
I try to do this by checking the DateModified of the Music folder (which updates correctly on the computer when new music is added to the phone):
async void GetModifiedDate ()
{
BasicProperties props = await KnownFolders.MusicLibrary.GetBasicPropertiesAsync();
Debug.WriteLine("DATEMODIFIED: " + props.DateModified.ToString());
}
Unfortunately this returns:
DATEMODIFIED: 1/1/1601 1:00:00 AM +01:00
Am I doing something wrong or is there another quick way to check if user added new music?
KnownFolders.MusicLibrary is a virtual location, so I suspect that may be why getting its properties fails.
The other problem is that DateModified may be a bad idea anyway, as it can stay the same when the user adds a file, so you cannot rely on it. You can check this yourself - when I tried moving files between folders, their DateModified didn't change.
So in this case, I'm afraid you will have to list the files in the MusicLibrary and then decide what to save for future comparison. The sum of the file sizes can be a good idea, since there is little chance that two different music files would have exactly the same size. It also depends on whether you want to be notified when the user moves a file from one folder to another (the total size won't change). If you want to be more thorough, you can remember the whole list of Tuple<file.FolderRelativeId, fileSize> (for example).
As file queries are not yet available for Windows Phone, you will have to retrieve the files recursively. The simple code can look like this:
// first - a method to retrieve files from folder recursively
private async Task RetrieveFilesInFolder(List<StorageFile> list, StorageFolder parent)
{
    foreach (var item in await parent.GetFilesAsync()) list.Add(item);
    foreach (var item in await parent.GetFoldersAsync()) await RetrieveFilesInFolder(list, item);
}
private async Task<List<StorageFile>> GetFilesInMusic()
{
    StorageFolder folder = KnownFolders.MusicLibrary;
    List<StorageFile> listOfFiles = new List<StorageFile>();
    await RetrieveFilesInFolder(listOfFiles, folder);
    return listOfFiles;
}
Once you have a list of your files, you can decide what to remember for further comparison upon next app launch.
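For illustration, here is a minimal sketch of one possible comparison, building on GetFilesInMusic above; the LoadSnapshot/SaveSnapshot helpers are hypothetical (e.g. you could serialize the dictionary to local storage):

// Build a snapshot of (FolderRelativeId, size) pairs for the current library state.
private async Task<Dictionary<string, ulong>> BuildSnapshotAsync()
{
    var snapshot = new Dictionary<string, ulong>();
    foreach (StorageFile file in await GetFilesInMusic())
    {
        BasicProperties props = await file.GetBasicPropertiesAsync();
        snapshot[file.FolderRelativeId] = props.Size;
    }
    return snapshot;
}

// Compare against the snapshot saved during the previous run.
private async Task<bool> HasLibraryChangedAsync()
{
    Dictionary<string, ulong> oldSnapshot = LoadSnapshot();           // hypothetical helper
    Dictionary<string, ulong> newSnapshot = await BuildSnapshotAsync();

    bool changed = oldSnapshot.Count != newSnapshot.Count;
    if (!changed)
    {
        foreach (var kv in newSnapshot)
        {
            ulong oldSize;
            if (!oldSnapshot.TryGetValue(kv.Key, out oldSize) || oldSize != kv.Value)
            {
                changed = true;
                break;
            }
        }
    }

    SaveSnapshot(newSnapshot);                                        // hypothetical helper
    return changed;
}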
You can check the folder size instead:
object storedSize = ApplicationData.Current.LocalSettings.Values["musicFolderSize"];
ulong musicFolderSize = storedSize != null ? (ulong)storedSize : 0;   // 0 on the first run, when nothing has been stored yet
BasicProperties props = await KnownFolders.MusicLibrary.GetBasicPropertiesAsync();
if (props.Size != musicFolderSize)
{
    // ...
    ApplicationData.Current.LocalSettings.Values["musicFolderSize"] = props.Size;
}
Adding onto Romansz's answer:
The only way to guarantee that you know of a file change would be to track the files on the device and then compare if there is a change between launches.
A lazier way to get all of the files would be to use KnownFolders.MusicLibrary.GetFilesAsync(CommonFileQuery.OrderByName), which is a deep query by default. It will grab all of the files in one query, but can be really slow on massive directories. You then have to page through all of the files, as shown below:
// cache the virtual storage folder for the loop
StorageFolder musicLibrary = KnownFolders.MusicLibrary;
uint stepSize = 50;
uint startIndex = 0;
while (true)
{
    IReadOnlyList<StorageFile> files = await musicLibrary.GetFilesAsync(CommonFileQuery.OrderByName, startIndex, stepSize);
    foreach (var file in files)
    {
        // compare to see if this file is already in your stored list
    }
    if (files.Count < stepSize) break;
    startIndex += stepSize;
}
Related
I have the following code to rename the files in the tree below sequentially with 8-character zero padding, e.g. from 00000001.pdf up to the last file, say 00000100.pdf:
Folder1
    subfolder1
        childfolder1
            pdffile1
            pdffile2
        childfolder2
            pdffile3
            pdffile4
    subfolder2
        childfolder3
            pdffile5
            pdffile6
But for some reason, in some of those child folders it keeps renaming the files endlessly.
Sometimes it just jumps to another number, as if it were an async operation. But if I stop and start again, it goes okay until the second next folder, where it messes up again.
This error only happened within 19 of the folders.
Their pdf names are indeed different from the others, but I don't see how that is related.
The other files were named something like "DOCUMENT_01" and so on, but these are:
0000000100000001.pdf
0000000200000001.pdf
0000000300000001.pdf
etc
static void Main(string[] args)
{
    Console.WriteLine("Enter the 'parent' folder where PDFs will be searched for inside the 'child' folders:");
    string path = Console.ReadLine();
    foreach (string dir in Directory.EnumerateDirectories(path))
    {
        foreach (string subdir in Directory.EnumerateDirectories(dir))
        {
            Console.WriteLine($"{dir} - {subdir}");
            int n = 1;
            foreach (string pdffile in Directory.EnumerateFiles(subdir, "*.pdf", SearchOption.AllDirectories))
            {
                Console.WriteLine(n.ToString().PadLeft(8, '0') + " " + new FileInfo(pdffile).Length);
                File.Move(pdffile, subdir + $"\\{n.ToString().PadLeft(8, '0')}.pdf");
                n++;
            }
            Console.WriteLine("\n\n");
        }
    }
}
What could be going wrong?
It should wait for the File.Move method to finish before incrementing n and moving on to the next pdffile, as a synchronous operation. So why does it jump numbers after a random time, and why does it keep going forever at other times?
And just to note: if I stop the program, start it again, and put the folder that was messed up as the first one, it goes ok, and only when it gets to the next folder, or the folder after that, does it start to give me this error again.
Hope that I could make myself clear... Thanks for your attention!
EDIT: I will try using the FileInfo class to give me the parent folder together with the SearchOption.AllDirectories option, dropping this 3-stage loop, which should also work for any kind of tree structure.
EDIT2: Tried it; it works as a "tree-independent" script, but I get the same result with the file names after the first folder... As it's really fast, in 3 seconds it goes from 00000169.pdf to 00006239.pdf in a folder with just 330 items.
As commented already, it is not a good idea to move or rename files "WHILE" the code is enumerating through the list of those files, as the posted code appears to do. This will cause obvious problems; you should simply mark the files somehow, then come back later and rename or move them.
More importantly, the big issue with renaming/moving files is exactly what you describe: the errors are erratic and inconsistent, which makes them very difficult to trace. However, the problems you describe are classic hallmarks of moving/renaming files while enumerating through those files.
With that said, the best and easiest way to traverse an unknown number of folder levels from a starting folder is recursion. In many cases recursion can be avoided with some well-thought-out loops, but when we do not know how many levels of folders there are, a simple loop or foreach paradigm may be doable, yet you will most likely end up adding variables and code that only make things more complex. This is already visible in the current code with the addition of the dir variable to keep track of "when" a different folder is being used. Recursion is ideally suited for this situation.
In this case, the recursive method will be called ONCE for each folder and subfolder below a given "starting" folder. This means that each time the recursive method is called, a different folder is beginning to be processed, so n always starts at 1 and we do not need to keep track of the current folder's path.
So the signature of this method takes a DirectoryInfo object as the "starting" folder. First we create some variables: a FileInfo array pdffiles to hold the pdf files in the given folder, a DirectoryInfo array foldersInThisFolder to hold the other folders in this starting folder, and an int n to index the files as the posted code does.
Next we get all the pdf files in this "starting" folder. If there are pdf files, we loop through them and process them. Then we get all the other folders in this "starting" folder and loop through each one, making the recursive call back to this method with that folder as the new "starting" folder. The whole process continues until the loop through the folders ends.
static void TraverseDirectoryTree(DirectoryInfo startingFolder) {
    FileInfo[] pdffiles = null;
    DirectoryInfo[] foldersInThisFolder = null;
    int n = 1;
    Console.WriteLine(startingFolder.FullName);
    // get all the pdf files in this folder
    try {
        pdffiles = startingFolder.GetFiles("*.pdf");
    }
    catch (Exception e) {
        // you may want to catch specific exceptions,
        // however in this example we do not care what
        // the exception is, we will simply ignore it.
        // in most cases pdffiles will be null if an exception is thrown
        Console.WriteLine(e.Message);
    }
    if (pdffiles != null) {
        foreach (FileInfo pdffile in pdffiles) {
            Console.WriteLine(pdffile.FullName + " -> " + n.ToString().PadLeft(8, '0') + " " + pdffile.Length);
            //File.Move(pdffile.FullName, pdffile.DirectoryName + $"\\{n.ToString().PadLeft(8, '0')}.pdf");
            // add file path to a list of files to rename later?
            n++;
        }
    }
    // continue with the sub folders in this folder (even if no pdf files were found here)
    foldersInThisFolder = startingFolder.GetDirectories();
    foreach (DirectoryInfo dirInfo in foldersInThisFolder) {
        TraverseDirectoryTree(dirInfo);
    }
}
Usage…
Console.WriteLine("Type the folder you want to start with:");
string path = Console.ReadLine();
DirectoryInfo di = new DirectoryInfo(path);
TraverseDirectoryTree(di);
Edit… after further testing, it appears that what you want to do is simply "rename" the pdf files. As suggested, a simple solution is to save the files we want to rename and then, after collecting them, loop through that collection and rename them. This should eliminate any problems caused by renaming files while enumerating through the files collection.
To help, I created a Dictionary<string, int> called filesToRename. While recursively looping through all the folders, we add the full path of each pdf file we want to rename as the Key and the int value n as the Value. After the dictionary is filled, we simply loop through it and rename the files.
private static Dictionary<string, int> filesToRename = new Dictionary<string, int>();
Then replace the commented-out line in the recursive method TraverseDirectoryTree…
//File.Move(pdffile.FullName, pdffile.DirectoryName + $"\\{n.ToString().PadLeft(8, '0')}.pdf");
With…
filesToRename.Add(pdffile.FullName, n);
Then after the dictionary is filled we would loop through it and rename the files, something like…
DirectoryInfo di = new DirectoryInfo(path);
TraverseDirectoryTree(di);
foreach (KeyValuePair<string, int> kvp in filesToRename) {
    int index = kvp.Key.LastIndexOf(@"\");
    string dir = kvp.Key.Substring(0, index);
    File.Move(kvp.Key, dir + $"\\{kvp.Value.ToString().PadLeft(8, '0')}.pdf");
}
I am hoping this makes sense…
Answer: as Klaus Gütter helped me in the comments, I just added .ToList() to the Directory.EnumerateFiles call so that it materializes a fixed list first, and then ran the foreach over each file.
It renames every pdf within the folder and its subfolders:
Console.WriteLine("Type the folder you want to start with:");
string path = Console.ReadLine();
string dir = "";
int n = 1;
foreach (string pdffile in Directory.EnumerateFiles(path, "*.pdf", SearchOption.AllDirectories).ToList())
{
FileInfo fi = new FileInfo(pdffile);
if (fi.DirectoryName == dir)
{
Console.WriteLine("\t" + n.ToString().PadLeft(8, '0'));
File.Move(pdffile, dir + $"\\{n.ToString().PadLeft(8, '0')}.pdf");
n++;
}
else
{
n = 1;
dir = fi.DirectoryName;
Console.WriteLine("\n\n" + dir);
File.Move(pdffile, dir + $"\\{n.ToString().PadLeft(8, '0')}.pdf");
Console.WriteLine("\t" + n.ToString().PadLeft(8, '0'));
n++;
}
}
I need to process N files at a time, so I've stored all the file information (filename, size, and sequence number) in a Dictionary. Now I have to select 5 files from that dictionary and process them; as soon as processing of any file completes, another file should be selected from the dictionary.
For example:
If I have 10 files in the dictionary and I select the first 5 (File 1, File 2, File 3, File 4, File 5) and process them, then when File 3 is finished, processing of File 6 should start.
So Help me.
Thank You.
Thanks @netmage, I finally found my answer with the use of ConcurrentBag, so I'll post the answer to my own question.
The System.Collections.Concurrent namespace provides several thread-safe collection classes; I have used ConcurrentBag from that namespace.
Unlike List, a ConcurrentBag allows modification while we are iterating over it; it is also thread-safe and allows concurrent access.
I implemented the following code as the solution to my problem.
I declared the ConcurrentBag object FileData as a class-level field.
ConcurrentBag<string[]> FileData = new ConcurrentBag<string[]>();
I created one function to get the file information and store it in FileData.
private void GetFileInfoIntoBag(string DirectoryPath)
{
    var files = Directory.GetFiles(DirectoryPath, "*", SearchOption.AllDirectories);
    foreach (var file in files)
    {
        FileInfo f1 = new FileInfo(file);
        string[] fileData = new string[4];
        fileData[0] = f1.Name;
        fileData[1] = GetFileSize.ToActualFileSize(f1.Length, 2);
        fileData[2] = Convert.ToString(i);   // i is assumed to be a class-level sequence counter
        fileData[3] = f1.FullName;
        i++;
        FileData.Add(fileData);
    }
}
Then, for the upload process, I created N tasks as required and implemented the upload logic inside them.
private void Upload_Click(object sender, EventArgs e)
{
List<Task> tskCopy = new List<Task>();
for (int i = 0; i < N; i++)
{
tskCopy.Add(Task.Run(() =>
{
while (FileData.Count > 0)
{
string[] file;
FileData.TryTake(out file);
if (file != null && file.Count() > 3)
{
/* Upload Logic*/
GC.Collect();
}
}
}));
}
Task.WaitAll(tskCopy.ToArray());
MessageBox.Show("Upload Complited Successfully");
}
Thank you all for your support.
Apparently, you wish to process your files in a specific order, at most five at a time.
So far, information about your files is stored sequentially in a List<T>.
One straightforward way to move across the list is to store the index of the next element to access in an int variable, e.g. nextFileIndex. You initialize it to 0.
When starting to process one of your files, you take the information from your list:
MyFileInfo currentFile;
lock (myFiles)
{
if (nextFileIndex < myFiles.Count)
{
currentFile = myFiles[nextFileIndex++];
}
}
You start five "processes" like that in the beginning, and whenever one of them has ended, you start a new one.
Now, for these "processes" to run in parallel (it seems like that is what you intend), please read about multithreading, e.g. the task parallel library that is part of .NET. My suggestion would be to create five tasks that grab the next file as long as the nextFileIndex has not exceeded the maximum index in the list, and use something like Task<TResult>.WaitAll to wait until none of the tasks has anything to do anymore.
Be aware of multi-threading issues.
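For illustration, here is a minimal sketch of that pattern; MyFileInfo, GetAllFiles and ProcessFile are hypothetical placeholders for your own type and per-file work:

List<MyFileInfo> myFiles = GetAllFiles();   // hypothetical: fills the list in the desired order
int nextFileIndex = 0;

Task[] workers = new Task[5];
for (int t = 0; t < workers.Length; t++)
{
    workers[t] = Task.Run(() =>
    {
        while (true)
        {
            MyFileInfo currentFile = null;
            lock (myFiles)
            {
                if (nextFileIndex < myFiles.Count)
                {
                    currentFile = myFiles[nextFileIndex++];
                }
            }
            if (currentFile == null) break;   // nothing left to process
            ProcessFile(currentFile);         // hypothetical: the actual per-file work
        }
    });
}
Task.WaitAll(workers);                        // block until all five workers are done

Each worker keeps pulling the next index under the lock, so every file is handed out exactly once and at most five files are processed at the same time.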
I'm trying to get all the videos in a specific folder inside the Videos library using UWP, right now I can get all videos inside the Videos library, but I'd like to reduce my results to only those inside the specified folder. My code is this:
Windows.Storage.Search.QueryOptions queryOption = new QueryOptions(CommonFileQuery.OrderByTitle, new string[] {".mp4"});
queryOption.FolderDepth = FolderDepth.Deep;
var files = await KnownFolders.VideosLibrary.CreateFileQueryWithOptions(queryOption).GetFilesAsync();
StorageFile videoToPlay = (files[new Random().Next(0, files.Count)] as StorageFile);
var stream = await videoToPlay.OpenAsync(Windows.Storage.FileAccessMode.Read);
Player.SetSource(stream, videoToPlay.ContentType);
Debug.WriteLine(Player.Source);
How could I access a subfolder named "Videos to Play" and then get all the videos inside that folder? I tried accessing it by using a path like:
string localfolder = Windows.Storage.ApplicationData.Current.LocalFolder.Path;
var array = localfolder.Split('\\');
var username = array[2];
string[] allVideos = System.IO.Directory.GetFiles("C:/Users/" + username + "/Videos/Videos to Play");
But I get access denied even though I already requested access to the Videos library (and the fact that the first example works shows that I actually have access to it).
try
{
var folder = await KnownFolders.VideosLibrary.GetFolderAsync("Videos to Play");
}
catch (FileNotFoundException exc)
{
// TODO: Handle the case when the folder wasn't found on the user's machine.
}
In the folder variable you'll have a reference to the desired folder. Then it's the very same thing you already do, but using this folder instead of KnownFolders.VideosLibrary!
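For illustration, a sketch of the whole flow using that subfolder (reusing the queryOption and Player from the question; the folder name "Videos to Play" is assumed to match exactly):

var queryOption = new Windows.Storage.Search.QueryOptions(CommonFileQuery.OrderByTitle, new string[] { ".mp4" });
queryOption.FolderDepth = FolderDepth.Deep;

// Query only the subfolder instead of the whole Videos library.
StorageFolder videosToPlay = await KnownFolders.VideosLibrary.GetFolderAsync("Videos to Play");
var files = await videosToPlay.CreateFileQueryWithOptions(queryOption).GetFilesAsync();

StorageFile videoToPlay = files[new Random().Next(0, files.Count)];
var stream = await videoToPlay.OpenAsync(Windows.Storage.FileAccessMode.Read);
Player.SetSource(stream, videoToPlay.ContentType);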
How can I get a list of all physical drives in a UWP (Windows 10) app? I'm trying to use Windows.Storage.KnownFolders, but this way I can only get the library folders.
In UWP you cannot list all the files/drives just like that (with the official API) - this is by design, probably for security reasons. Windows Store apps are isolated, and access is granted only to limited resources/locations. You are freely able to access virtual locations like MusicLibrary, PicturesLibrary and so on; the list of access permissions can be found on MSDN.
If you want to access a file/folder outside the above scope, the user will have to grant your app access to it. For this purpose you can use pickers.
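For example, a minimal FolderPicker sketch (to be called from UI code such as a button handler); the "PickedFolderToken" name is just an example, and storing the folder in FutureAccessList is optional but lets you keep the permission across app launches:

var picker = new Windows.Storage.Pickers.FolderPicker();
picker.SuggestedStartLocation = Windows.Storage.Pickers.PickerLocationId.ComputerFolder;
picker.FileTypeFilter.Add("*");   // the picker requires at least one filter entry

Windows.Storage.StorageFolder folder = await picker.PickSingleFolderAsync();
if (folder != null)
{
    // Optionally remember the permission for later sessions.
    Windows.Storage.AccessCache.StorageApplicationPermissions.FutureAccessList
        .AddOrReplace("PickedFolderToken", folder);
}

The user's choice in the picker is what grants your app access to that location.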
I know you asked this question a long time ago, but I created a question (Get Internal Drives Using Windows.Storage Namespace in UWP) to provide my method for getting internal drives and encourage feedback/discussion on a better alternative.
I had exactly the same problem to solve and everything else I can find online doesn't fit with what I'm trying to do. So, with the broadFileSystemAccess attribute added to the manifest file and File System access switched on for the app in Privacy Settings, it is possible to call StorageFolder.GetFolderFromPathAsync for a drive letter and it will return an instance of StorageFolder if the drive exists.
Sadly there isn't a method to list the drives, so I wrote something to cycle through all the letters of the alphabet and call GetFolderFromPathAsync to see if a drive handle is returned.
The method I created to obtain the list of drives is as follows:
public List<StorageFolder> GetInternalDrives()
{
string driveLetters = "ABCDEFGHIJKLMNOPQRSTUVWXYZ";
int driveLettersLen = driveLetters.Length;
string removableDriveLetters = "";
string driveLetter;
List<StorageFolder> drives = new List<StorageFolder>();
StorageFolder removableDevices = KnownFolders.RemovableDevices;
IReadOnlyList<StorageFolder> folders = Task.Run<IReadOnlyList<StorageFolder>>(async () => await removableDevices.GetFoldersAsync()).Result;
foreach (StorageFolder removableDevice in folders)
{
if (string.IsNullOrEmpty(removableDevice.Path)) continue;
driveLetter = removableDevice.Path.Substring(0, 1).ToUpper();
if (driveLetters.IndexOf(driveLetter) > -1) removableDriveLetters += driveLetter;
}
for (int curDrive = 0; curDrive < driveLettersLen; curDrive++)
{
driveLetter = driveLetters.Substring(curDrive, 1);
if (removableDriveLetters.IndexOf(driveLetter) > -1) continue;
try
{
StorageFolder drive = Task.Run<StorageFolder>(async () => await StorageFolder.GetFolderFromPathAsync(driveLetter + ":")).Result;
drives.Add(drive);
}
catch (System.AggregateException) { }
}
return drives;
}
And here is the calling code:
List<StorageFolder> drives = GetInternalDrives();
panScanParams.Children.Clear();
foreach (StorageFolder drive in drives)
{
CheckBox cb = new CheckBox();
cb.Content = drive.DisplayName;
cb.IsChecked = true;
panScanParams.Children.Add(cb);
}
Whilst the code works, it's not good practice to call methods with bad parameters and handle the exception. But for lack of a suitable alternative, I don't know what other choice there is.
I want to index all my music files and store them in a database.
I have this function that I call recursively, starting from the root of my music drive, e.g. ReadFiles(@"C:\music\"). Roughly:
void ReadFiles(string path)
{
    foreach (string file in Directory.GetFiles(path))
        SaveToIndex(file);

    foreach (string directory in Directory.GetDirectories(path))
        ReadFiles(directory);
}
This works fine, but while the program runs, the amount of memory used grows and grows until my system finally runs out of memory.
Does anyone have a better approach that doesn't need 4 GB of RAM to complete this task?
Best Regards, Tys
Alxandr's queue based solution should work fine.
If you're using .NET 4.0, you could also take advantage of the new Directory.EnumerateFiles method, which enumerates files lazily, without loading them all in memory:
void ReadFiles(string path)
{
IEnumerable<string> files =
Directory.EnumerateFiles(
path,
"*",
SearchOption.AllDirectories); // search recursively
foreach(string file in files)
SaveToIndex(file);
}
Did you check for the . and .. entries that show up in every directory except the root?
If you don't skip those, you'll have an infinite loop.
You can implement this with a queue. I think (but I'm not sure) that this will save memory; at least it will free up your stack. Whenever you find a folder you add it to the queue, and whenever you find a file you just read it. This prevents deep recursion.
Something like this:
Queue<string> dirs = new Queue<string>();
dirs.Enqueue("basedir");
while (dirs.Count > 0)
{
    string current = dirs.Dequeue();
    foreach (string directory in Directory.GetDirectories(current))
        dirs.Enqueue(directory);
    foreach (string file in Directory.GetFiles(current))
        SaveToIndex(file);
}
Beware, though, that EnumerateFiles() will stop running if you don't have access to a file, if a path is too long, or if some other exception occurs. This is what I use for the moment to work around those problems:
public static List<string> getFiles(string path, List<string> files)
{
    IEnumerable<string> fileInfo = null;
    IEnumerable<string> folderInfo = null;
    try
    {
        fileInfo = Directory.EnumerateFiles(path);
    }
    catch
    {
        // ignore folders we cannot read
    }
    if (fileInfo != null)
    {
        files.AddRange(fileInfo);
        // recurse through the subfolders
        folderInfo = Directory.EnumerateDirectories(path);
        foreach (string s in folderInfo)
        {
            try
            {
                getFiles(s, files);
            }
            catch
            {
                // ignore subfolders we cannot read
            }
        }
    }
    return files;
}
Example use:
List<string> files = new List<string>();
files = folder.getFiles(path, files);
My solution is based on the code at this page: http://msdn.microsoft.com/en-us/library/vstudio/bb513869.aspx.
Update: A MUCH faster method to get files recursively can be found at http://social.msdn.microsoft.com/Forums/vstudio/en-US/ae61e5a6-97f9-4eaa-9f1a-856541c6dcce/directorygetfiles-gives-me-access-denied?forum=csharpgeneral. Using Stack is new to me (I didn't even know it existed), but the method seems to work. At least it listed all files on my C and D partition with no errors.
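For reference, a minimal sketch of that iterative idea (not the exact code from the linked thread), assuming you just want the file paths while skipping folders you cannot read:

public static List<string> GetFilesIterative(string root)
{
    var result = new List<string>();
    var pending = new Stack<string>();
    pending.Push(root);

    while (pending.Count > 0)
    {
        string dir = pending.Pop();
        try
        {
            result.AddRange(Directory.EnumerateFiles(dir));
            foreach (string sub in Directory.EnumerateDirectories(dir))
                pending.Push(sub);
        }
        catch (UnauthorizedAccessException) { /* skip folders we cannot read */ }
        catch (PathTooLongException) { /* skip paths that are too long */ }
    }
    return result;
}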
It could be junction folders, which lead to an infinite loop when doing recursion, but I am not sure - check it out and see for yourself. Link: https://learn.microsoft.com/en-us/windows-server/administration/windows-commands/mklink
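If you suspect that, one hedge is to skip reparse points (junctions/symbolic links) during the recursion, for example inside the ReadFiles method from the question:

foreach (string sub in Directory.GetDirectories(path))
{
    // Junctions and symlinks are flagged as reparse points; skipping them avoids cycles.
    var info = new DirectoryInfo(sub);
    if ((info.Attributes & FileAttributes.ReparsePoint) != 0)
        continue;

    ReadFiles(sub);
}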