Get to an Exchange folder by path using EWS - c#

I need to retrieve items from the 'Inbox\test\final' Exchange folder using EWS. The folder is provided by a literal path as written above. I know I can split this string into folder names and recursively search for the necessary folder, but is there a more optimal way that can translate a string path into a folder instance or folder ID?
I'm using the latest EWS 2.0 assemblies. Do these assemblies provide any help, or am I stuck with manual recursion?

You could use an extended property, as in this example:
private string GetFolderPath(ExchangeService service, FolderId folderId)
{
var folderPathExtendedProp = new ExtendedPropertyDefinition(26293, MapiPropertyType.String);
var folderPropSet = new PropertySet(BasePropertySet.FirstClassProperties) { folderPathExtendedProp };
var folder = Folder.Bind(service, folderId, folderPropSet);
string path = null;
folder.TryGetProperty(folderPathExtendedProp, out path);
return path?.Replace("\ufffe", "\\"); // the raw property value uses U+FFFE as the folder separator
}
Source: https://social.msdn.microsoft.com/Forums/en-US/e5d07492-f8a3-4db5-b137-46e920ab3dde/exchange-ews-managed-getting-full-path-for-a-folder?forum=exchangesvrdevelopment
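For reference, a call might look like this (a minimal sketch; it assumes service is an already-authenticated ExchangeService instance):
// Sketch: print the full path of the Inbox.
string inboxPath = GetFolderPath(service, new FolderId(WellKnownFolderName.Inbox));
Console.WriteLine(inboxPath); // e.g. "\Inbox"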

Since Exchange Server maps everything together with Folder.Id, the only way to find the path you're looking for is by looking at folder names.
You'll need to create a recursive function that goes through all folders in a folder collection and tracks the path as it moves through the tree of email folders.
Another parameter is needed to carry the path that you're looking for.
public static Folder GetPathFolder(ExchangeService service, FindFoldersResults results,
string lookupPath, string currentPath)
{
foreach (Folder folder in results)
{
string path = currentPath + @"\" + folder.DisplayName;
if (folder.DisplayName == "Calendar")
{
continue;
}
Console.WriteLine(path);
FolderView view = new FolderView(50);
FindFoldersResults folderResults = service.FindFolders(folder.Id, view);
Folder result = GetPathFolder(service, folderResults, lookupPath, path);
if (result != null)
{
return result;
}
string[] pathSplitForward = path.Split(new[] { "/" }, StringSplitOptions.RemoveEmptyEntries);
string[] pathSplitBack = path.Split(new[] { @"\" }, StringSplitOptions.RemoveEmptyEntries);
string[] lookupPathSplitForward = lookupPath.Split(new[] { "/" }, StringSplitOptions.RemoveEmptyEntries);
string[] lookupPathSplitBack = lookupPath.Split(new[] { @"\" }, StringSplitOptions.RemoveEmptyEntries);
if (ArraysEqual(pathSplitForward, lookupPathSplitForward) ||
ArraysEqual(pathSplitBack, lookupPathSplitBack) ||
ArraysEqual(pathSplitForward, lookupPathSplitBack) ||
ArraysEqual(pathSplitBack, lookupPathSplitForward))
{
return folder;
}
}
return null;
}
"ArraysEqual":
public static bool ArraysEqual<T>(T[] a1, T[] a2)
{
if (ReferenceEquals(a1, a2))
return true;
if (a1 == null || a2 == null)
return false;
if (a1.Length != a2.Length)
return false;
EqualityComparer<T> comparer = EqualityComparer<T>.Default;
for (int i = 0; i < a1.Length; i++)
{
if (!comparer.Equals(a1[i], a2[i])) return false;
}
return true;
}
I do all the extra array checking since sometimes my clients enter paths with forward slashes, back slashes, starting with a slash, etc. They're not tech savvy so let's make sure the program works every time!
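To kick off the recursion, you'd first list the folders under the mailbox root and pass that collection in; a rough sketch, assuming service is an authenticated ExchangeService and using the path from the question:
// Sketch: start the recursive search from the top of the mailbox.
FolderView rootView = new FolderView(50);
FindFoldersResults rootFolders = service.FindFolders(WellKnownFolderName.MsgFolderRoot, rootView);
Folder foundFolder = GetPathFolder(service, rootFolders, @"Inbox\test\final", "");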
As you go through each directory, compare the desired path to the iterated path. Once it's found, bubble up the Folder object that it's currently on. You'll need to create a search filter for that folder's id:
FindItemsResults<Item> results = service.FindItems(foundFolder.Id, searchFilter, view);
Loop through the emails in results!
foreach (Item item in results)
{
// do something with item (email)
}

Here's my recursive descent implementation, which attempts to fetch as little information as possible on the way to the target folder:
private readonly FolderView _folderTraversalView = new FolderView(1) { PropertySet = PropertySet.IdOnly };
private Folder TraceFolderPathRec(string[] pathTokens, FolderId rootId)
{
var token = pathTokens.FirstOrDefault();
var matchingSubFolder = _exchangeService.FindFolders(
rootId,
new SearchFilter.IsEqualTo(FolderSchema.DisplayName, token),
_folderTraversalView)
.FirstOrDefault();
if (matchingSubFolder != null && pathTokens.Length == 1) return matchingSubFolder;
return matchingSubFolder == null ? null : TraceFolderPathRec(pathTokens.Skip(1).ToArray(), matchingSubFolder.Id);
}
For a '/'-delimited path, it can be called as follows:
public Folder TraceFolderPath(string folderPath)
{ // Handle folder names with '/' in them
var tokens = folderPath
.Replace("\\/", "<slash>")
.Split('/')
.Select(t => t.Replace("<slash>", "/"))
.ToArray();
return TraceFolderPathRec(tokens, WellKnownFolderName.MsgFolderRoot);
}
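For the folder in the question, a call could look like this (a sketch; it assumes _exchangeService is the same authenticated ExchangeService field used above):
// Sketch: resolve the folder from the question and list a few of its items.
Folder target = TraceFolderPath("Inbox/test/final");
if (target != null)
{
    foreach (Item item in _exchangeService.FindItems(target.Id, new ItemView(50)))
    {
        Console.WriteLine(item.Subject);
    }
}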

No, you don't need recursion, and you can go straight to the folder efficiently. This uses the same extended property as the first answer, applied as a search filter:
using Microsoft.Exchange.WebServices.Data; // from nuget package "Microsoft.Exchange.WebServices"
...
private static Folder GetOneFolder(ExchangeService service, string folderPath)
{
var propertySet = new PropertySet(BasePropertySet.IdOnly);
propertySet.AddRange(new List<PropertyDefinitionBase> {
FolderSchema.DisplayName,
FolderSchema.TotalCount
});
var pageSize = 100;
var folderView = new FolderView(pageSize)
{
Offset = 0,
OffsetBasePoint = OffsetBasePoint.Beginning,
PropertySet = propertySet
};
folderView.Traversal = FolderTraversal.Deep;
var searchFilter = new SearchFilter.IsEqualTo(ExchangeExtendedProperty.FolderPathname, folderPath);
FindFoldersResults findFoldersResults;
var baseFolder = new FolderId(WellKnownFolderName.MsgFolderRoot);
var localFolderList = new List<Folder>();
do
{
findFoldersResults = service.FindFolders(baseFolder, searchFilter, folderView);
localFolderList.AddRange(findFoldersResults.Folders);
folderView.Offset += pageSize;
} while (findFoldersResults.MoreAvailable);
return localFolderList.SingleOrDefault();
}
...
public static class ExchangeExtendedProperty
{
/// <summary>PR_FOLDER_PATHNAME String</summary>
public static ExtendedPropertyDefinition FolderPathname { get => new ExtendedPropertyDefinition(0x66B5, MapiPropertyType.String); }
}
The path will need to be prefixed with a backslash, i.e. \Inbox\test\final.
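For the folder in the question, usage might look like this (a sketch; service is assumed to be an authenticated ExchangeService instance):
// Sketch: look up the folder from the question by its full path.
var finalFolder = GetOneFolder(service, @"\Inbox\test\final");
Console.WriteLine(finalFolder != null
    ? $"Found '{finalFolder.DisplayName}' with {finalFolder.TotalCount} items."
    : "Folder not found.");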

Related

How to have an AWS Lambda/Rekognition Function return an array of object keys

This feels like a simple question and I feel like I am overthinking it. I am doing an AWS project that will compare face(s) in an image to a database (S3 bucket) of other faces. So far, I have a Lambda function for the CompareFacesRequest, a class library which invokes the function, and a UWP app that inputs the image file and outputs a result. It has worked so far based on boolean (true or false) functions, but now I want it to instead return which face(s) are recognized via an array. I'm struggling to implement this.
Below is my Lambda function. I have adjusted the Task to return an Array instead of a bool and changed the return value to an array. At the bottom, I have created a global-variable class with a testing array so I could attempt to reference the array elsewhere.
public class Function
{
//Function
public async Task<Array> FunctionHandler(string input, ILambdaContext context)
{
//number of matched faces
int matched = 0;
//Client setup
var rekognitionclient = new AmazonRekognitionClient();
var s3client = new AmazonS3Client();
//Create list of target images
ListObjectsRequest list = new ListObjectsRequest
{
BucketName = "bucket2"
};
ListObjectsResponse listre = await s3client.ListObjectsAsync(list);
//loop of list
foreach (Amazon.S3.Model.S3Object obj in listre.S3Objects)
{
//face request with input and obj.key images
var comparefacesrequest = new CompareFacesRequest
{
SourceImage = new Image
{
S3Object = new S3Objects
{
Bucket = "bucket1",
Name = input
}
},
TargetImage = new Image
{
S3Object = new S3Objects
{
Bucket = "bucket2",
Name = obj.Key
}
},
};
//compare with confidence of 95 (subject to change) to current target image
var detectresponse = await rekognitionclient.CompareFacesAsync(comparefacesrequest);
detectresponse.FaceMatches.ForEach(match =>
{
ComparedFace face = match.Face;
if (match.Similarity > 95)
{
//if face detected, raise matched
matched++;
for(int i = 0; i < Globaltest.testingarray.Length; i++)
{
if (Globaltest.testingarray[i] == "test")
{
Globaltest.testingarray[i] = obj.Key;
}
}
}
});
}
//Return true or false depending on if it is matched
if (matched > 0)
{
return Globaltest.testingarray;
}
return Globaltest.testingarray;
}
}
public static class Globaltest
{
public static string[] testingarray = { "test", "test", "test" };
}
Next is my invoke request in my class library. It has so far been based on the Lambda outputting a boolean result, but I thought, "hey, it is parsing the result, it should be fine, right"? I do convert the result to a string, as there is no GetArray as far as I know.
public async Task<bool> IsFace(string filePath, string fileName)
{
await UploadS3(filePath, fileName);
AmazonLambdaClient client = new AmazonLambdaClient(accessKey, secretKey, Amazon.RegionEndpoint.USWest2);
InvokeRequest ir = new InvokeRequest();
ir.InvocationType = InvocationType.RequestResponse;
ir.FunctionName = "ImageTesting";
ir.Payload = "\"" + fileName + "\"";
var result = await client.InvokeAsync(ir);
var strResponse = Encoding.ASCII.GetString(result.Payload.ToArray());
if (bool.TryParse(strResponse, out bool result2))
{
return result2;
}
return false;
}
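Since the Lambda now returns an array, the payload that comes back is JSON rather than the string "true"/"false". Purely as an illustration (Newtonsoft.Json is an assumption here, not part of the original code), the response could be read like this:
// Illustration only: deserialize the JSON payload into a string array instead of bool.TryParse.
// Assumes the Newtonsoft.Json package is referenced.
string[] matchedKeys = Newtonsoft.Json.JsonConvert.DeserializeObject<string[]>(strResponse);
// matchedKeys now holds whatever array FunctionHandler returned.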
Finally, here is the section of my UWP app where I perform the function. I am referencing the Lambda client via "using Lambdaclienttest" (the name of the Lambda project; this is the only place I use the reference, though). When I run my project, I do still get a face detected when it should, but Globaltest.testingarray[0] is still equal to "test".
var Facedetector = new FaceDetector(Credentials.accesskey, Credentials.secretkey);
try
{
var result = await Facedetector.IsFace(filepath, filename);
if (result)
{
textBox1.Text = "There is a face detected";
textBox2.Text = Globaltest.testingarray[0];
}
else
{
textBox1.Text = "Try Again";
}
}
catch
{
textBox1.Text = "Please use a photo";
}
Does anyone have any suggestions?

Is it better to use IEnumerable as read only list?

I am downloading a .tgz file from a remote server to a local folder and then unzipping it. After that I read all of those json/txt files into memory. Below is my code which does that:
public IEnumerable<DataHolder> GetFiles(string fileName)
{
// this will download files to a directory
var isDownloadSuccess = DownloadFiles(_url, fileName, _directoryToDownload);
if (!isDownloadSuccess.Result) { yield return default; }
// this will unzip files in same directory
var isUnzipSuccess = UnzipTgzFile(_directoryToDownload, fileName);
if (!isUnzipSuccess) { yield return default; }
// this will get list of all files in same directory
IList<string> files = GetListOfFiles(_directoryToDownload);
if (files == null || files.Count == 0) { yield return default; }
// total files will be 500 max
for (int i = 0; i < files.Count; i++)
{
var cfgPath = files[i];
if (!File.Exists(cfgPath)) { continue; }
var fileDate = File.GetLastWriteTimeUtc(cfgPath);
var fileContent = File.ReadAllText(cfgPath);
var pathPieces = cfgPath.Split(System.IO.Path.DirectorySeparatorChar, StringSplitOptions.RemoveEmptyEntries);
var fileName = pathPieces[pathPieces.Length - 1];
var md5Hash = CheckMD5(cfgPath);
yield return new DataHolder
{
FileName = fileName,
FileDate = fileDate,
FileContent = fileContent,
FileMD5HashValue = md5Hash
};
}
}
Use Case:
If I am not able to download files successfully (isDownloadSuccess is false) then I want to return empty IEnumerable back.
If I am not able to unzip files successfully (isUnzipSuccess is false) then I want to return empty IEnumerable back as well.
If I am not able to get list of files successfully (files list is empty) then I want to return empty IEnumerable back as well.
If I had some processing issues in the for loop then I want to return empty IEnumerable back as well.
Otherwise just return readonly IEnumerable back to the caller with data in it.
The problem I am having with the above approach is that I cannot do an empty check in the cases where it does yield return default, and I am also confused about what happens if processing fails in the for loop: will it return an empty IEnumerable back to the caller?
IEnumerable<DataHolder> dataHolders = GetFiles(fileName);
// below check doesn't work on negative cases
if (dataHolders == null || !dataHolders.Any())
return false;
//....
So is this the right way to use IEnumerable here, or can I use some other data structure that provides a read-only list to the caller (and an empty list for the negative cases) which I can easily check for null or empty?
Question:
My goal is just to return a read-only list with data in it back to the caller (for positive cases). And for all negative cases, I need to return an empty read-only list.
We talked in chat, but I'll reiterate.
yield doesn't really work here, as we don't actually need its lazy-evaluation semantics for any specific reason. You want a list of files to use later for comparing against other lists of files (they all have to be read into memory eventually, so we may as well do it now):
public IReadOnlyList<DataHolder> GetFiles(string fileName)
{
// this will download files to a directory
var isDownloadSuccess = DownloadFiles(_url, fileName, _directoryToDownload);
if (!isDownloadSuccess.Result) { return Array.Empty<DataHolder>(); }
// this will unzip files in same directory
var isUnzipSuccess = UnzipTgzFile(_directoryToDownload, fileName);
if (!isUnzipSuccess) { return Array.Empty<DataHolder>(); }
// this will get list of all files in same directory
IList<string> files = GetListOfFiles(_directoryToDownload);
if (files == null || files.Count == 0) { return Array.Empty<DataHolder>(); }
var lst = new List<DataHolder>(files.Count);
for (int i = 0; i < files.Count; i++)
{
var cfgPath = files[i];
if (!File.Exists(cfgPath)) { continue; }
var fileDate = File.GetLastWriteTimeUtc(cfgPath);
var fileContent = File.ReadAllText(cfgPath);
var pathPieces = cfgPath.Split(System.IO.Path.DirectorySeparatorChar, StringSplitOptions.RemoveEmptyEntries);
var fileName = pathPieces[pathPieces.Length - 1];
var md5Hash = CheckMD5(cfgPath);
lst.Add(new DataHolder
{
FileName = fileName,
FileDate = fileDate,
FileContent = fileContent,
FileMD5HashValue = md5Hash
});
}
return lst.AsReadOnly();
}
We are now just returning a read-only list of all your items, which allows you to do checks if any items exist, such as:
if(lst?.Count > 0){ /* There are items to process */ }
Also, this doesn't break your pattern as IReadOnlyList implements IEnumerable, so it will fit in quite nicely.
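As an aside on why the checks in the question fail: an iterator method never returns null, and yield return default yields a single default (null) element rather than producing an empty sequence. A quick illustration (Any() needs System.Linq):
// Illustration: what "yield return default" actually produces.
static IEnumerable<DataHolder> Broken()
{
    yield return default; // yields one element whose value is null
}
var result = Broken();
Console.WriteLine(result == null); // False: an iterator never returns null
Console.WriteLine(result.Any());   // True: the sequence contains one (null) element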
Since you download and unzip all files at once, I understand that you're not concerned about this implementation being an actual iterator (a foreach would wait until everything is done before it could iterate anyway).
Keeping that in mind, the easiest thing you can do is get rid of the yields and return arrays.
Sample implementation (might need some spell check):
public IEnumerable<DataHolder> GetFiles(string fileName)
{
// this will download files to a directory
var isDownloadSuccess = DownloadFiles(_url, fileName, _directoryToDownload);
if (!isDownloadSuccess.Result) { return Array.Empty<DataHolder>(); }
// this will unzip files in same directory
var isUnzipSuccess = UnzipTgzFile(_directoryToDownload, fileName);
if (!isUnzipSuccess) { return Array.Empty<DataHolder>(); }
// this will get list of all files in same directory
IList<string> files = GetListOfFiles(_directoryToDownload);
if (files == null || files.Count == 0) { return Array.Empty<DataHolder>(); }
var data = new DataHolder[files.Count];
try
{
for (int i = 0; i < files.Count; i++)
{
var cfgPath = files[i];
if (!File.Exists(cfgPath)) { continue; }
var fileDate = File.GetLastWriteTimeUtc(cfgPath);
var fileContent = File.ReadAllText(cfgPath);
var pathPieces = cfgPath.Split(System.IO.Path.DirectorySeparatorChar, StringSplitOptions.RemoveEmptyEntries);
var fileName = pathPieces[pathPieces.Length - 1];
var md5Hash = CheckMD5(cfgPath);
data[i] = new DataHolder
{
FileName = fileName,
FileDate = fileDate,
FileContent = fileContent,
FileMD5HashValue = md5Hash
};
}
return data;
}
catch (Exception ex)
{
return Array.Empty<DataHolder>();
}
}
For consuming this, you would, for example:
var files = GetFiles("somename.txt");
if (!files.Any()) // do not check for files being null
{
return;
}
Side note, I would change the first few lines into this, so you don't do sync-over-async which can cause deadlocks:
public async Task<IEnumerable<DataHolder>> GetFiles(string fileName)
{
// this will download files to a directory
var isDownloadSuccess = await DownloadFiles(_url, fileName, _directoryToDownload);
if (!isDownloadSuccess) { return Array.Empty<DataHolder>(); }
...
}
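The call site then awaits it instead of blocking; for example (a sketch, assuming the caller can be async):
// Sketch: consuming the async version without sync-over-async.
public async Task ProcessAsync()
{
    var files = await GetFiles("somename.txt");
    if (!files.Any())
    {
        return;
    }
    // process the files...
}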

How do I get the full path of a folder from the Google Drive API with C#

How do I get the full path of a folder with the Google Drive API in C#? Let's say I want a .NET list filled with a Folder class, where Folder is a class with two properties: URL and folder name. I am new to this, so sorry if the question is bad/dumb. Anything would help at this point.
There is a fantastic command-line tool for working with Google Drive available at github.com/prasmussen/gdrive/
The logic exists in that codebase to walk the directory tree up from each file and construct the full path.
I've followed the .NET Quickstart instructions, then converted the relevant Go code from path.go into the C# equivalent below.
using Google.Apis.Auth.OAuth2;
using Google.Apis.Drive.v3;
using Google.Apis.Services;
using Google.Apis.Util.Store;
using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Threading;
namespace DriveQuickstart
{
class Program
{
// If modifying these scopes, delete your previously saved credentials
// at ~/.credentials/drive-dotnet-quickstart.json
static string[] Scopes = { DriveService.Scope.DriveReadonly };
static string ApplicationName = "Drive API .NET Quickstart";
static DriveService service;
static Dictionary<string, Google.Apis.Drive.v3.Data.File> files = new Dictionary<string, Google.Apis.Drive.v3.Data.File>();
static void Main(string[] args)
{
UserCredential credential;
using (var stream =
new FileStream("client_secret.json", FileMode.Open, FileAccess.Read))
{
string credPath = System.Environment.GetFolderPath(
System.Environment.SpecialFolder.Personal);
credPath = Path.Combine(credPath, ".credentials/drive-dotnet-quickstart.json");
credential = GoogleWebAuthorizationBroker.AuthorizeAsync(
GoogleClientSecrets.Load(stream).Secrets,
Scopes,
"user",
CancellationToken.None,
new FileDataStore(credPath, true)).Result;
Console.WriteLine("Credential file saved to: " + credPath);
}
// Create Drive API service.
service = new DriveService(new BaseClientService.Initializer()
{
HttpClientInitializer = credential,
ApplicationName = ApplicationName,
});
// Define parameters of request.
FilesResource.ListRequest listRequest = service.Files.List();
listRequest.PageSize = 10;
listRequest.Fields = "nextPageToken, files(id, name, parents)";
// List files.
IList<Google.Apis.Drive.v3.Data.File> files = listRequest.Execute()
.Files;
Console.WriteLine("Files:");
if (files != null && files.Count > 0)
{
foreach (var file in files)
{
var absPath = AbsPath(file);
Console.WriteLine("{0} ({1})", absPath, file.Id);
}
}
else
{
Console.WriteLine("No files found.");
}
Console.Read();
}
private static object AbsPath(Google.Apis.Drive.v3.Data.File file)
{
var name = file.Name;
if (file.Parents.Count() == 0)
{
return name;
}
var path = new List<string>();
while (true)
{
var parent = GetParent(file.Parents[0]);
// Stop when we find the root dir
if (parent.Parents == null || parent.Parents.Count() == 0)
{
break;
}
path.Insert(0, parent.Name);
file = parent;
}
path.Add(name);
return path.Aggregate((current, next) => Path.Combine(current, next));
}
private static Google.Apis.Drive.v3.Data.File GetParent(string id)
{
// Check cache
if (files.ContainsKey(id))
{
return files[id];
}
// Fetch file from drive
var request = service.Files.Get(id);
request.Fields = "name,parents";
var parent = request.Execute();
// Save in cache
files[id] = parent;
return parent;
}
}
}
Folders and files are both considered a File in Google Drive, so the following code works for both scenarios.
Create a function to return the full path, plus the two other helper functions it requires:
private IList<string> GetFullPath(Google.Apis.Drive.v3.Data.File file, IList<Google.Apis.Drive.v3.Data.File> files)
{
IList<string> Path = new List<string>();
if (file.Parents == null || file.Parents.Count == 0)
{
return Path;
}
while (GetParentFromID(file.Parents[0], files) != null)
{
Path.Add(GetFolderNameFromID(GetParentFromID(file.Parents[0], files).Id, files));
file = GetParentFromID(file.Parents[0], files);
}
return Path;
}
private Google.Apis.Drive.v3.Data.File GetParentFromID(string FileID, IList<Google.Apis.Drive.v3.Data.File> files)
{
if (files != null && files.Count > 0)
{
foreach (var file in files)
{
if (file.Parents != null && file.Parents.Count > 0)
{
if (file.Id == FileID)
{
return file;
}
}
}
}
return null;
}
private string GetFolderNameFromID(string FolderID, IList<Google.Apis.Drive.v3.Data.File> files)
{
string FolderName = "";
if (files != null && files.Count > 0)
{
foreach (var file in files)
{
if (file.Id == FolderID)
{
FolderName = file.Name;
}
}
}
return FolderName;
}
Now you may call the function as:
string Path = "My Drive";
foreach (string Item in GetFullPath(file, files).Reverse())
{
Path += " / " + Item;
}
Here, two parameters are passed:
1. file - the file whose path you are trying to find.
2. files - the list of files on your drive.
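The files list itself isn't shown above; a sketch of building it, assuming service is an authorized DriveService as in the Quickstart code earlier:
// Sketch: list all files with the id, name and parents fields, paging through the results.
var allFiles = new List<Google.Apis.Drive.v3.Data.File>();
string pageToken = null;
do
{
    FilesResource.ListRequest listRequest = service.Files.List();
    listRequest.PageSize = 1000;
    listRequest.Fields = "nextPageToken, files(id, name, parents)";
    listRequest.PageToken = pageToken;
    var response = listRequest.Execute();
    allFiles.AddRange(response.Files);
    pageToken = response.NextPageToken;
} while (pageToken != null);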

Cannot find public folder using Exchange web service API 2.0?

My Outlook client has a shared folder "xxxx yyyy". However, the following code, which iterates all folders and subfolders recursively, doesn't print out that folder. Why can't the code find the folder?
private static void PrintAllPubFolder(ExchangeService service)
{
var folderView = new FolderView(int.MaxValue);
var findFolderResults = service.FindFolders(WellKnownFolderName.PublicFoldersRoot, folderView);
foreach (var folder in findFolderResults.Where(x => !ignore.Any(i => i == x.DisplayName)))
{
Console.WriteLine(folder.DisplayName);
PrintSubFolder(service, folder.Id, " ");
}
}
private static void PrintSubFolder(ExchangeService service, FolderId folderId, string p)
{
var folderView = new FolderView(int.MaxValue);
var findFolderResults = service.FindFolders(folderId, folderView);
foreach (var folder in findFolderResults.Where(x => !ignore.Any(i => i == x.DisplayName)))
{
Console.WriteLine("{0}{1}", p, folder.DisplayName);
PrintSubFolder(service, folder.Id, p + " ");
}
}
If you're using Exchange 2010 or later, don't use
var folderView = new FolderView(int.MaxValue);
Throttling will limit the results returned to 1000, so if you expect more than 1000 entries to be returned you'll need to page the results (a paging sketch follows the link below). However, it doesn't make much sense to enumerate through every public folder to get to the target; look at the method in the following link:
Searching Of Folders in Public Folders by giving its PATH Name
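A rough sketch of that paging, assuming service is an authenticated ExchangeService:
// Sketch: page FindFolders 1000 at a time instead of FolderView(int.MaxValue),
// so EWS throttling doesn't silently truncate the results.
var allFolders = new List<Folder>();
var pagedView = new FolderView(1000, 0);
FindFoldersResults page;
do
{
    page = service.FindFolders(WellKnownFolderName.PublicFoldersRoot, pagedView);
    allFolders.AddRange(page.Folders);
    pagedView.Offset += page.Folders.Count;
} while (page.MoreAvailable);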
If the folder is in your mailbox, then just do a search for it based on the name, e.g.
FolderView ffView = new FolderView(1000);
ffView.Traversal = FolderTraversal.Deep;
SearchFilter fSearch = new SearchFilter.IsEqualTo(FolderSchema.DisplayName, "xxxx yyyy");
FindFoldersResults ffResults = service.FindFolders(WellKnownFolderName.MsgFolderRoot, fSearch, ffView);
Cheers
Glen

How to create folder structure in SDL Tridion 2011 SP1 using Core Service

I am using the Core Service on Tridion 2011. I want to create a folder structure, and then create a component in that structure.
Example:
Path of folder structure: /ABCD/DEFG/aaaaa
If the folder exists, we need not create it. If it doesn't exist, we have to create it and then create the component in it.
I know how to create the component in a folder when I have its URI.
The following is the code I use when I need to get or create folders with SDL Tridion's Core Service. It's a simple recursive method that checks for the existence of the current folder. If it doesn't exist, it calls GetOrCreateFolder on the parent folder, and so on, until it finds an existing path. On the way out of the recursion, it simply creates the new folders relative to their immediate parent.
Note: this method does not check the input folderPath. Rather, it assumes it represents a valid path.
private FolderData GetOrCreateFolder(string folderPath, SessionAwareCoreServiceClient client)
{
ReadOptions readOptions = new ReadOptions();
if (client.IsExistingObject(folderPath))
{
return client.Read(folderPath, readOptions) as FolderData;
}
else
{
int lastSlashIdx = folderPath.LastIndexOf("/");
string newFolder = folderPath.Substring(lastSlashIdx + 1);
string parentFolder = folderPath.Substring(0, lastSlashIdx);
FolderData parentFolderData = GetOrCreateFolder(parentFolder, client);
FolderData newFolderData = client.GetDefaultData(ItemType.Folder, parentFolderData.Id) as FolderData;
newFolderData.Title = newFolder;
return client.Save(newFolderData, readOptions) as FolderData;
}
}
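For the path in the question, a call might look like this (the publication's WebDAV URL here is a made-up example):
// Sketch: get or create the nested folders from the question under a hypothetical publication.
FolderData folder = GetOrCreateFolder("/webdav/MyPublication/Building%20Blocks/ABCD/DEFG/aaaaa", client);
// folder.Id can then be used as the organizational item when creating the component.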
I would use IsExistingObject - passing in the WebDAV URL - to see if the Folder already exists. If it returns false, you can go ahead and create the folder.
Edit: Here's some quick pseudo code...
string parentFolderId = @"/webdav/MyPublication/Building%20Blocks";
var client = GetCoreServiceClient();
if (!client.IsExistingObject(parentFolderId + "/AAA"))
{
var folder = client.GetDefaultData(2, parentFolderId);
folder.Title = "AAA";
client.Save(folder);
// Create the other folders and components here
}
This is what we used on one of our projects to create folders for a path.
static FolderData GetOrCreateFolder(List<string> folders,
FolderData root,
SessionAwareCoreService2010Client client)
{
var filter = new OrganizationalItemItemsFilterData();
filter.ItemTypes = new [] { ItemType.Folder };
var items = client.GetListXml(root.Id, filter).
Elements(TRIDION_NAMESPACE + "Item");
foreach (var element in items)
{
if (folders.Count == 0)
{
break; // break from foreach
}
var titleAttribute = element.Attribute("Title");
var idAttribute = element.Attribute("ID");
if (titleAttribute != null && titleAttribute.Value == folders[0] &&
idAttribute != null)
{
// folder exists
FolderData fd = client.Read(idAttribute.Value,
EXPANDED_READ_OPTIONS) as FolderData;
// We just took care of this guy, remove it to recurse
folders.RemoveAt(0);
return GetOrCreateFolder(folders, fd, client);
}
}
if (folders.Count != 0)
{
//Folder doesn't exist, lets create it and return its folderdata
var newfolder = new FolderData();
newfolder.Title = folders[0];
newfolder.LocationInfo = new LocationInfo {
OrganizationalItem = new LinkToOrganizationalItemData {
IdRef = root.Id
}
};
newfolder.Id = "tcm:0-0-0";
var folder = client.Create(newfolder, EXPANDED_READ_OPTIONS)
as FolderData;
folders.RemoveAt(0);
if (folders.Count > 0)
{
folder = GetOrCreateFolder(folders, folder, client);
}
return folder;
}
return root;
}
So you'd invoke it with something like this:
var root = client.Read("tcm:1-1-2", null) as FolderData;
var pathParts = "/ABCD/DEFG/aaaaa".Trim('/').Split('/').ToList();
var folder = GetOrCreateFolder(pathParts, root, client);
To create a folder, use the following code as a sample.
You will have to check whether the folder exists, of course; this code shows how to create a folder within a folder:
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using CoreWebService.ServiceReference1;
namespace CoreWebService
{
class CoreWebServiceSamples
{
public static void createFolder()
{
string folderWebDavUrl = "/webdav/020%20Content/Building%20Blocks/Content/wstest";
CoreServicesUtil coreServicesUtil = new CoreServicesUtil();
FolderData folderData = coreServicesUtil.getFolderData(folderWebDavUrl);
FolderData folderDataChild = folderData.AddFolderData();
folderDataChild.Title = "childFolder";
folderDataChild = (FolderData)coreServicesUtil.coreServiceClient.Save(folderDataChild, coreServicesUtil.readOptions);
coreServicesUtil.coreServiceClient.Close();
}
}
}
Here is some code for the methods referenced:
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using CoreWebService.ServiceReference1;
using CoreWebService.Properties;
using System.Xml;
using System.Xml.Serialization;
namespace CoreWebService
{
public class CoreServicesUtil
{
public CoreService2010Client coreServiceClient;
public ReadOptions readOptions;
/// <summary>
///
/// </summary>
public CoreServicesUtil()
{
this.coreServiceClient = new CoreService2010Client("basicHttp_2010");
this.readOptions = new ReadOptions();
}
public FolderData getFolderData(string tcmuri)
{
FolderData folderData = (FolderData)coreServiceClient.Read(tcmuri, readOptions);
return folderData;
}
}
public static class CoreServicesItemCreator
{
/**
* <summary>
* Name: AddFolder
* Description: returns a new Folder Data created in the folder Data
* </summary>
**/
public static FolderData AddFolderData(this FolderData folderData)
{
FolderData childFolder = new FolderData();
// getLocationInfo was not shown in the original answer; build the LocationInfo inline instead,
// using the same pattern as the GetListXml-based answer above.
childFolder.LocationInfo = new LocationInfo
{
OrganizationalItem = new LinkToOrganizationalItemData
{
IdRef = folderData.Id
}
};
childFolder.Id = "tcm:0-0-0";
return childFolder;
}
}
}
