My goal is to check all files that are shared BY me (whatever their type or permissions).
So I need to find files that I own and that have the Shared property set to true, so I wrote the following code:
var listRequest = driveService.Files.List();
listRequest.Q = "'me' in owners and shared = 'true'";
FileList fileList = listRequest.Execute();
foreach (var file in fileList.Items)
    SharedByMe.Add(file);
but the part with "shared = 'true'" doesn't work. I found that custom properties can be put in the query, but what about standard ones? Downloading all files and then checking them locally is obviously not an option.
I'm trying to query the list of all files and folders from the Drive of a G Suite user and copy all those files to a user in another domain.
I am able to get the list of all files, but I am not able to copy those files to the other domain.
I have gone through several references on sharing Drive files, but I couldn't work out how to grant authority so that the destination domain can copy the files.
I will be happy if anyone can help me solve this problem.
public void Execute()
{
    string strFileId = "";
    // get the list of all files from the source domain (e.g. user@a.jp)
    IList<Google.Apis.Drive.v3.Data.File> fileList = service.getDriveFiles();
    Google.Apis.Drive.v3.Data.File title = new Google.Apis.Drive.v3.Data.File();
    // loop over the file list
    foreach (var fileItem in fileList)
    {
        strFileId = fileItem.Id;
        title.Name = fileItem.Name;
        // copy each file to the destination domain (e.g. user@b.jp) by file ID
        service.Files.Copy(title, strFileId).Execute();
    }
}
private IList<Google.Apis.Drive.v3.Data.File> getDriveFiles()
{
    // Define parameters of the request.
    Google.Apis.Drive.v3.FilesResource.ListRequest FileListRequest = source_drive_service.Files.List();
    // get all files
    FileListRequest.Fields = "nextPageToken, files(*)";
    IList<Google.Apis.Drive.v3.Data.File> files = FileListRequest.Execute().Files;
    return files;
}
There are many ways to go about this, but one simple way is to:
With the Drive API, get a list of all your files (folders are also files in Drive) by using Files.list().
For every file, you can transfer ownership to a new owner (the admin of Domain B for example).
Note: When a file is transferred, the previous owner's role is downgraded to writer.
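A rough sketch of that approach, assuming a Google.Apis.Drive.v3 DriveService authorized as the source user and a hypothetical newOwnerEmail for the destination account (untested):
using Google.Apis.Drive.v3;
using Google.Apis.Drive.v3.Data;

// Sketch: transfer ownership of every file owned by the current user to newOwnerEmail.
// `service` and `newOwnerEmail` are assumed to exist already.
void TransferAllOwnedFiles(DriveService service, string newOwnerEmail)
{
    string pageToken = null;
    do
    {
        var list = service.Files.List();
        list.Q = "'me' in owners";
        list.Fields = "nextPageToken, files(id, name)";
        list.PageToken = pageToken;
        var result = list.Execute();

        foreach (var file in result.Files)
        {
            var permission = new Permission
            {
                Type = "user",
                Role = "owner",
                EmailAddress = newOwnerEmail
            };
            var create = service.Permissions.Create(permission, file.Id);
            create.TransferOwnership = true; // required when granting the "owner" role
            create.Execute();
        }
        pageToken = result.NextPageToken;
    } while (pageToken != null);
}
As the note above says, after the transfer the previous owner is downgraded to writer, so the files remain accessible from the source account.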
I've got an app that communicates with Drive (specifically TeamDrives) and I need to pull a hierarchical folder structure within a given TeamDrive.
I can list my available TeamDrives and then get the files within them (filtered to the folder MIME type), but there doesn't seem to be any parent info on each folder, so my structure comes back 'flat'.
I get that in Drive a folder is just a label and that 'sub-folders' could be shared in several places; I will cater for that, but I just want to be able to build the directory tree in my app.
e.g. structure as I want to show it in my app:
Team Drive Name
    Main Folder
        Sub Folder
My 'list' code:
var request = service.Files.List();
request.Corpora = "teamDrive";
request.IncludeTeamDriveItems = true;
request.SupportsTeamDrives = true;
request.OrderBy = "name";
request.PageSize = 100;
request.TeamDriveId = "[teamDriveId]";
request.Q = "mimeType='application/vnd.google-apps.folder'";
This gives me for a given 'teamDriveId':
Main Folder
Sub Folder
The Parents property on the Sub Folder result is null.
You haven't specified which fields you want in the response, so you are getting the default, which doesn't include parent information. Try adding request.Fields = "*". I'm not familiar with the library you are using, so I might have the wrong syntax - please double check.
You might also find this useful: In Google Apps Script, how would I get all subfolders and all subsubfolders and all subsubsub folders etc.?
As pinoyyid suggested, you need to add 'Fields' to your request:
request.Fields = "nextPageToken, files(id, name, parents)";
https://developers.google.com/drive/api/v3/folder
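Once parents comes back populated, one way to rebuild the hierarchy is to group the folders by their parent ID; top-level folders have the Team Drive ID itself as their parent. A sketch (untested; service and teamDriveId are assumed to exist already, and paging is omitted for brevity):
using System;
using System.Linq;

var request = service.Files.List();
request.Corpora = "teamDrive";
request.IncludeTeamDriveItems = true;
request.SupportsTeamDrives = true;
request.TeamDriveId = teamDriveId;
request.Q = "mimeType='application/vnd.google-apps.folder'";
request.Fields = "nextPageToken, files(id, name, parents)";
var folders = request.Execute().Files;

// Group folders by their (first) parent ID.
var byParent = folders
    .Where(f => f.Parents != null && f.Parents.Count > 0)
    .ToLookup(f => f.Parents[0]);

// Print the tree recursively, starting from the Team Drive root.
void PrintTree(string parentId, int depth)
{
    foreach (var folder in byParent[parentId].OrderBy(f => f.Name))
    {
        Console.WriteLine(new string(' ', depth * 2) + folder.Name);
        PrintTree(folder.Id, depth + 1);
    }
}

PrintTree(teamDriveId, 0);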
I am using .NET Framework 4.0, VS 2015, and Ionic.Zip.Reduced (DotNetZip.Reduced) v1.9.1.8. When I try to add a folder to the archive, I get an exception with the text:
The path is too long
Sample code:
using (var zipFile = new ZipFile(zipFilePath))
{
    zipFile.UseZip64WhenSaving = Zip64Option.AsNecessary;
    zipFile.AlternateEncodingUsage = ZipOption.Always;
    zipFile.AlternateEncoding = Encoding.UTF8;
    zipFile.ParallelDeflateThreshold = -1;
    var dirPath = @"C:\AAAAAAAAAAA\AAAAAA\AAAAAAAAAAAAAAA\AAAAAAAAA\AAAAAAAAAAAAA\AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA\";
    zipFile.AddDirectory(dirPath); // <- exception thrown here
    zipFile.Save();
}
The folder contains a file named: AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA.zip
which results in the error:
The path is too long
So I rewrote it to add files to the archive one by one, using relative paths:
using (var zipFile = new ZipFile(zipFilePath))
{
    zipFile.UseZip64WhenSaving = Zip64Option.AsNecessary;
    zipFile.AlternateEncodingUsage = ZipOption.Always;
    zipFile.AlternateEncoding = Encoding.UTF8;
    zipFile.ParallelDeflateThreshold = -1;
    var dirPath = @"C:\AAAAAAAAAAA\AAAAAA\AAAAAAAAAAAAAAA\AAAAAAAAA\AAAAAAAAAAAAA\AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA\";
    Directory.SetCurrentDirectory(dirPath);
    var files = Directory.GetFiles(dirPath, "*", SearchOption.AllDirectories).ToArray();
    foreach (var fullFilePath in files)
    {
        var fileName = Path.GetFileName(fullFilePath);
        var relatedPath = fullFilePath.Substring(0, fullFilePath.LastIndexOf(fileName, StringComparison.InvariantCultureIgnoreCase)).Replace(dirPath, "");
        var relatedFilePath = Path.Combine(relatedPath, fileName);
        zipFile.AddFile(relatedFilePath); // <- exception thrown here
    }
    zipFile.Save();
}
The error is the same:
The path is too long
I tried calling the Path.GetDirectoryName() method, but it also returns an error:
The specified path, file name, or both are too long. The fully
qualified file name must be less than 260 characters, and the
directory name must be less than 248 characters.
I found a lot of suggested solutions, but I could not get any of them to work (because of the specifics of the application, I cannot move to a newer version of the Framework):
Use Framework 4.6.2 and set UseLegacyPathHandling = false in App.config, or Switch.System.IO.UseLegacyPathHandling = false; Switch.System.IO.BlockLongPaths = false.
Enable the Group Policy option Configuration > Administrative Templates > System > Filesystem > Enable NTFS long paths, or enable it via the manifest: <ws2:longPathAware>true</ws2:longPathAware>.
Use the \\?\ prefix in the path (as I understand it, this only works with newer versions of the Framework).
Convert the file path to 8.3 format using the GetShortPathName function (the error remains).
Maybe someone has faced this problem before. I would be glad of any advice. Thanks.
If your path is too long, there's not much you can do about it. Even if you push the Windows limits a step further, your application won't work well on a system that hasn't been specially configured for that scenario.
You can work around it by copying the files you have to work with to a temp folder like C:\temp and adding them to the archive from there.
You can even mimic the same folder tree structure with directory names composed of only 1 or 2 letters and then map each complete (but much shorter) directory path to the original path somewhere (in a file, for example), so that you can rebuild the original folder tree with the original names later on.
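A rough sketch of that staging idea (untested; the paths below are placeholders, and the source paths still have to be readable by File.Copy):
using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;

// Stage files under short directory names and remember the mapping,
// so the original tree can be rebuilt after extraction.
var sourceRoot = @"C:\AAAAAAAAAAA\AAAAAA";   // long source path (placeholder)
var tempRoot = @"C:\t";                      // very short staging root (placeholder)
var map = new Dictionary<string, string>();  // short dir -> original dir
int counter = 0;

var allDirs = new[] { sourceRoot }
    .Concat(Directory.GetDirectories(sourceRoot, "*", SearchOption.AllDirectories));

foreach (var dir in allDirs)
{
    // e.g. C:\t\0, C:\t\1, ... keeps the staged file paths well under the limit
    var shortDir = Path.Combine(tempRoot, (counter++).ToString());
    Directory.CreateDirectory(shortDir);
    map[shortDir] = dir;

    foreach (var file in Directory.GetFiles(dir))
        File.Copy(file, Path.Combine(shortDir, Path.GetFileName(file)), true);
}

// Persist the mapping next to the staged files.
File.WriteAllLines(Path.Combine(tempRoot, "map.txt"),
    map.Select(kv => kv.Key + "|" + kv.Value));
The archive can then be built from the short staging root with AddDirectory, exactly as in the original code.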
I'm trying to get a list of files in date order in a Metro app in C#.
I thought this code would do it:
var queryOptions = new QueryOptions(CommonFileQuery.OrderByDate, new[] { ".xml" });
queryOptions.FolderDepth = FolderDepth.Deep;
StorageFolder folder = await ApplicationData.Current.LocalFolder.CreateFolderAsync("Recent", CreationCollisionOption.OpenIfExists);
StorageFileQueryResult query = folder.CreateFileQueryWithOptions(queryOptions);
var files = await query.GetFilesAsync();
but this gives me the following error:
WinRT information: The requested enumeration option is not available
for this folder because it is not within a library or homegroup. Only folders within a library or a homegroup support all enumeration options.
Is there a way to get a list of files in date order when reading files from directories inside the Local folder?
You could retrieve the files yourself and then use LINQ to Objects to perform the sorting for you.
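For example, a sketch (not tested in a real Metro project) that walks the folder tree with plain GetFilesAsync/GetFoldersAsync calls, which are allowed in the Local folder, reads each file's DateModified, and sorts with LINQ:
using System;
using System.Collections.Generic;
using System.Linq;
using System.Threading.Tasks;
using Windows.Storage;

// Enumerate .xml files under `root` (recursively) and return them ordered by date modified.
async Task<List<StorageFile>> GetXmlFilesByDateAsync(StorageFolder root)
{
    var withDates = new List<Tuple<StorageFile, DateTimeOffset>>();
    var pending = new Queue<StorageFolder>();
    pending.Enqueue(root);

    while (pending.Count > 0)
    {
        var folder = pending.Dequeue();
        foreach (var sub in await folder.GetFoldersAsync())
            pending.Enqueue(sub);

        foreach (var file in await folder.GetFilesAsync())
        {
            if (!file.Name.EndsWith(".xml", StringComparison.OrdinalIgnoreCase))
                continue;
            var props = await file.GetBasicPropertiesAsync();
            withDates.Add(Tuple.Create(file, props.DateModified));
        }
    }

    // Sort by date modified, newest first (flip the ordering if needed).
    return withDates
        .OrderByDescending(t => t.Item2)
        .Select(t => t.Item1)
        .ToList();
}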
Say I have the following directories and files in an Amazon S3 bucket (the .txt entries are files; the rest are folders):
bucketname/
bucketname/folder1/
bucketname/folder1/foobar.txt
bucketname/folder1/subfolder1/
bucketname/folder1/subfolder1/hello.txt
bucketname/folder1/subfolder2/
bucketname/folder1/subfolder2/world.txt
bucketname/folder1/subfolder2/subsubfolder1/
bucketname/folder1/subfolder2/subsubfolder1/file.txt
How can I list all objects and immediate subdirectories of a given directory with the .NET AWS S3 API, without recursively getting everything below that directory? In other words, how can I "browse" the contents of a directory at a single level?
For example, imagine I want to browse the contents of bucketname/folder1/. What I would like to see is the following:
bucketname/folder1/foobar.txt
bucketname/folder1/subfolder1/
bucketname/folder1/subfolder2/
...and nothing else. I don't want to list the files and directories in subdirectories, I just want to list the files and subdirectories at the folder1 level.
Is there a way to apply filters to a single AWS API call so that it doesn't return everything and force me to manually parse only what I need?
I've found that this code lets me get just the immediate subdirectories (as intended), but I can't figure out how to include the immediate files too:
var request = new ListObjectsRequest()
    .WithBucketName("bucketname")
    .WithPrefix(@"folder1/")
    .WithDelimiter(@"/");

using (var client = Amazon.AWSClientFactory.CreateAmazonS3Client(accessKey, secretKey))
using (var response = client.ListObjects(request))
{
    foreach (var item in response.CommonPrefixes)
    {
        /* ... */
    }
}
I had the opposite problem (I knew how to get the files in the specified folder, but not the subdirectories).
The answer is that Amazon lists files differently than it does sub-folders.
Sub-folders are listed, as your example shows, in the ListObjectsResponse.CommonPrefixes collection.
Files are listed in the ListObjectsResponse.S3Objects collection.
So your code should look like this:
var request = new ListObjectsRequest()
    .WithBucketName("bucketname")
    .WithPrefix(@"folder1/")
    .WithDelimiter(@"/");

using (var client = Amazon.AWSClientFactory.CreateAmazonS3Client(accessKey, secretKey))
using (var response = client.ListObjects(request))
{
    foreach (var subFolder in response.CommonPrefixes)
    {
        /* list the sub-folders */
    }
    foreach (var file in response.S3Objects)
    {
        /* list the files */
    }
}
My Google search turned up this post on the burningmonk blog, with this in the comment section:
When you make the ListObjects request, to list the top-level folders, don't set the prefix but set the delimiter to '/', then inspect the 'CommonPrefixes' property on the response for the folders that are in the top folder.
To list the contents of a 'rootfolder', make the request with the prefix set to the name of the folder plus a trailing slash, e.g. 'rootfolder/', and set the delimiter to '/'. In the response you'll always have the folder itself as an element with the same key as the prefix you used in the request, plus any subfolders in the 'CommonPrefixes' property.
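Following that comment, listing just the top level of the bucket might look like this (a sketch using the same older-style .NET SDK calls as above; accessKey and secretKey are as in the question):
// List only the top level of the bucket: no prefix, delimiter "/".
var request = new ListObjectsRequest()
    .WithBucketName("bucketname")
    .WithDelimiter(@"/");

using (var client = Amazon.AWSClientFactory.CreateAmazonS3Client(accessKey, secretKey))
using (var response = client.ListObjects(request))
{
    foreach (var topLevelFolder in response.CommonPrefixes)
    {
        // e.g. "folder1/"
        Console.WriteLine(topLevelFolder);
    }
    foreach (var s3Object in response.S3Objects)
    {
        // files stored directly at the bucket root
        Console.WriteLine(s3Object.Key);
    }
}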