I have a problem counting the messages in a mailbox.
I am using C# and Microsoft.Graph 1.18.0.
Here is my code:
public async Task<long> GetItemsCountAsync(string userId)
{
var countOption = new QueryOption("$count", "true");
var request = ServiceClient.Value.Users[userId].Messages.Request();
request.QueryOptions.Add(countOption);
var resultMessages = new List<Message>();
var count = 0L;
do
{
var messagesResult = await request.GetAsync();
if (messagesResult.AdditionalData != null && messagesResult.AdditionalData.TryGetValue("@odata.count", out var messagesCount))
{
count = (long)messagesCount;
}
resultMessages.AddRange(messagesResult);
request = messagesResult.NextPageRequest;
}
while (request != null);
return count;
}
At the end I get count = 1417 but resultMessages.Count = 760.
Did I miss something?
Thank you for any help!
Everything is fine with the provided example. It appears $count cannot be trusted for the List messages endpoint here, since the API does not return an accurate count for messages from a specified search folder (see, for example, this answer for more details).
To get the message count, the List mailFolders endpoint can be used instead:
GET /users/{id | userPrincipalName}/mailFolders?$select=totalItemCount
where totalItemCount represents the number of items in the mail folder.
C# example
var folders = await graphClient.Users[userId].MailFolders.Request().Select(f => f.TotalItemCount).GetAsync();
var totalMessagesCount = folders.Sum(folder => folder.TotalItemCount);
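Two caveats worth noting: TotalItemCount is a nullable int, and the folder collection can itself be paged (NextPageRequest), so a robust total should null-guard each count and walk every page. A minimal sketch of just that accumulation, with the Graph SDK types replaced by in-memory pages:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// In-memory stand-in for paged MailFolders results; each inner list is
// one page of TotalItemCount values (nullable, as in the SDK).
var pages = new List<List<int?>>
{
    new List<int?> { 120, 3, null },   // first page
    new List<int?> { 500, 794 },       // page reached via NextPageRequest
};

long total = 0;
foreach (var page in pages)
    total += page.Sum(c => (long)(c ?? 0));   // a null count contributes 0

Console.WriteLine(total); // 1417
```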
Related
What is the code to add users to or remove users from an AAD group in batches in C#? (First determine the batch size, then add or remove the users.) Any sample code would be great.
UPDATE:
I added the following code:
private HttpRequestMessage MakeRequest(AzureADUser user, Guid targetGroup)
{
return new HttpRequestMessage(HttpMethod.Patch, $"https://graph.microsoft.com/v1.0/groups/{targetGroup}")
{
Content = new StringContent(MakeAddRequestBody(user), System.Text.Encoding.UTF8, "application/json"),
};
}
private static string MakeAddRequestBody(AzureADUser user)
{
JObject body = new JObject
{
["members@odata.bind"] = JArray.FromObject($"https://graph.microsoft.com/v1.0/users/{user.ObjectId}")
};
return body.ToString(Newtonsoft.Json.Formatting.None);
}
public async Task AddUsersToGroup1(IEnumerable<AzureADUser> users, AzureADGroup targetGroup)
{
try
{
var batches = GetBatchRequest(users, targetGroup.ObjectId);
foreach (var batchRequestContent in batches)
{
var response = await _graphServiceClient
.Batch
.Request()
.WithMaxRetry(10)
.PostAsync(batchRequestContent);
var responses = await response.GetResponsesAsync();
}
}
catch (Exception ex)
{
}
}
On running this I get the following exception: Object serialized to String. JArray instance expected. What am I missing? Also, once I get the responses, I need to check whether every response returned an 'OK' status, similar to:
return responses.Any(x => x == ResponseCode.Error) ? ResponseCode.Error : ResponseCode.Ok;
How would I do that?
Add users into AAD Group in batch:
GraphServiceClient graphClient = new GraphServiceClient(authProvider);
var additionalData = new Dictionary<string, object>()
{
{"members@odata.bind", new List<string>()}
};
(additionalData["members@odata.bind"] as List<string>).Add("https://graph.microsoft.com/v1.0/users/{id}");
(additionalData["members@odata.bind"] as List<string>).Add("https://graph.microsoft.com/v1.0/users/{id}");
var group = new Group
{
AdditionalData = additionalData
};
await graphClient.Groups["{group-id}"]
.Request()
.UpdateAsync(group);
There is no endpoint we can use to remove users from an AAD group in batch. But there is a $batch endpoint that combines multiple requests in one HTTP call. It has a limit of 20 requests per batch, so we can't delete too many users in one call.
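Given that cap of 20 requests per batch, a larger user list has to be split into chunks before building each batch request. A sketch of just the chunking, over hypothetical user ids:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Hypothetical user ids; the $batch endpoint caps a request at 20 steps.
var userIds = Enumerable.Range(1, 45).Select(i => $"user-{i}").ToList();

const int batchLimit = 20;
var chunks = userIds
    .Select((id, index) => (id, index))
    .GroupBy(x => x.index / batchLimit)          // groups 0..19, 20..39, 40..44
    .Select(g => g.Select(x => x.id).ToList())
    .ToList();

Console.WriteLine(chunks.Count);        // 3 batches
Console.WriteLine(chunks.Last().Count); // 5 ids in the final partial batch
```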
Here is an example, remove users from AAD Group in batch (Reference here):
GraphServiceClient graphClient = new GraphServiceClient(authProvider);
var removeUserRequest1 = graphClient.Groups["{group-id}"].Members["{id}"].Reference.Request().GetHttpRequestMessage();
var removeUserRequest2 = graphClient.Groups["{group-id}"].Members["{id}"].Reference.Request().GetHttpRequestMessage();
removeUserRequest1.Method = HttpMethod.Delete;
removeUserRequest2.Method = HttpMethod.Delete;
var batchRequestContent = new BatchRequestContent();
batchRequestContent.AddBatchRequestStep(removeUserRequest1);
batchRequestContent.AddBatchRequestStep(removeUserRequest2);
await graphClient.Batch.Request().PostAsync(batchRequestContent);
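As for checking the results asked about in the question's update: GetResponsesAsync on the batch response hands back the individual HttpResponseMessage per step, keyed by request id, so a pass/fail summary is one LINQ call. Simulated here with plain HttpResponseMessage values instead of a live batch call:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;
using System.Net;
using System.Net.Http;

// Stand-in for what GetResponsesAsync returns: step id -> response.
var responses = new Dictionary<string, HttpResponseMessage>
{
    ["1"] = new HttpResponseMessage(HttpStatusCode.NoContent),
    ["2"] = new HttpResponseMessage(HttpStatusCode.NoContent),
    ["3"] = new HttpResponseMessage(HttpStatusCode.TooManyRequests), // a throttled step
};

var allSucceeded = responses.Values.All(r => r.IsSuccessStatusCode);
var result = allSucceeded ? "Ok" : "Error";
Console.WriteLine(result); // Error
```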
I'm working on a project that uses the YouTube API v3 with the Google .NET client library.
The project is done; however, sometimes it takes a long time to respond when getting the video list for a playlist id. When this happens, it fails with a "bad request" error after 1 minute and tells me the request timed out.
Is there any way to increase the request timeout, or another solution for this problem?
List<YTVideo> videos = new List<YTVideo>();
var searchRequest = youtubeService.PlaylistItems.List("snippet");
searchRequest.PlaylistId = playlistId;
searchRequest.MaxResults = 1;
var searchResponse = await searchRequest.ExecuteAsync();
var playlistItemsInfo = searchResponse.Items.FirstOrDefault();
// to get videos from playlistItemsInfo
foreach (var searchResult in searchResponse.Items)
{
var videoSearchRequest = youtubeService.Videos.List("snippet,statistics,contentDetails");
videoSearchRequest.Id = searchResult.Snippet.ResourceId.VideoId;
videoSearchRequest.MaxResults = 1;
var videoSearchResponse = await videoSearchRequest.ExecuteAsync();
var video = videoSearchResponse.Items.FirstOrDefault();
if (video != null)
{
YTVideo yTVideo = new YTVideo
{
Title = video.Snippet.Title,
VideoId = video.Id,
Image = video.Snippet.Thumbnails.Maxres != null ? video.Snippet.Thumbnails.Maxres.Url : video.Snippet.Thumbnails.High.Url,
IsSelected = true
};
videos.Add(yTVideo);
}
}
The problem happens in searchResponse, when I execute the search request.
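One thing to check before restructuring anything: the Google .NET client exposes the underlying HttpClient on the service object (youtubeService.HttpClient), and its Timeout defaults to 100 seconds. Raising it works the same as on a plain HttpClient, which is what this sketch uses:

```csharp
using System;
using System.Net.Http;

// Plain HttpClient standing in for youtubeService.HttpClient.
var client = new HttpClient();
Console.WriteLine(client.Timeout);          // 00:01:40 (the 100-second default)

client.Timeout = TimeSpan.FromMinutes(3);   // allow slow playlist responses
Console.WriteLine(client.Timeout);          // 00:03:00
```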
I load data into my ListView asynchronously via an API.
I have a list of items (1 to 6300), and in my search box I can search for items, which are then displayed in the ListView.
Now I want to show each item's average price, which comes from a JSON API.
If you want to take a look at the tool, here is the git link: https://github.com/Triky313/AlbionOnline-StatisticsAnalysis
My current method looks like this. Data is loaded fresh from the API once, and then again only when it is older than one hour.
public static ObservableCollection<MarketStatChartItem> MarketStatChartItemList = new ObservableCollection<MarketStatChartItem>();
public static async Task<string> GetMarketStatAvgPriceAsync(string uniqueName, Location location)
{
try
{
using (var wc = new WebClient())
{
var apiString = "https://www.albion-online-data.com/api/v1/stats/charts/" +
$"{FormattingUniqueNameForApi(uniqueName)}?date={DateTime.Now:MM-dd-yyyy}";
var itemCheck = MarketStatChartItemList?.FirstOrDefault(i => i.UniqueName == uniqueName);
if (itemCheck == null)
{
var itemString = await wc.DownloadStringTaskAsync(apiString);
var values = JsonConvert.DeserializeObject<List<MarketStatChartResponse>>(itemString);
var newItem = new MarketStatChartItem()
{
UniqueName = uniqueName,
MarketStatChartResponse = values,
LastUpdate = DateTime.Now
};
MarketStatChartItemList?.Add(newItem);
var data = newItem.MarketStatChartResponse
.FirstOrDefault(itm => itm.Location == Locations.GetName(location))?.Data;
var findIndex = data?.TimeStamps?.FindIndex(t => t == data.TimeStamps.Max());
if (findIndex != null)
return data.PricesAvg[(int) findIndex].ToString("N", LanguageController.DefaultCultureInfo);
return "-";
}
if (itemCheck.LastUpdate <= DateTime.Now.AddHours(-1))
{
var itemString = await wc.DownloadStringTaskAsync(apiString);
var values = JsonConvert.DeserializeObject<List<MarketStatChartResponse>>(itemString);
itemCheck.LastUpdate = DateTime.Now;
itemCheck.MarketStatChartResponse = values;
}
var itemCheckData = itemCheck.MarketStatChartResponse
.FirstOrDefault(itm => itm.Location == Locations.GetName(location))?.Data;
var itemCheckFindIndex =
itemCheckData?.TimeStamps?.FindIndex(t => t == itemCheckData.TimeStamps.Max());
if (itemCheckFindIndex != null)
return itemCheckData.PricesAvg[(int) itemCheckFindIndex]
.ToString("N", LanguageController.DefaultCultureInfo);
return "-";
}
}
catch (Exception ex)
{
Debug.Print(ex.StackTrace);
Debug.Print(ex.Message);
return "-";
}
}
Because of the API requests everything takes very long to load, and I cannot use the search normally.
Does anyone know a better solution for loading data asynchronously without these search problems?
EDIT:
Here it is again, shown visually...
The item list is already loaded and the search is very fast.
Now you can see some minuses on the right side where there should be numbers. These numbers are loaded later, when the item is shown in the list.
The problem: the search becomes extremely sluggish once there are 50+ items in the search results that have to be filled with data from the API.
An API request is made for each item.
Can these API requests be canceled when the search changes, or is there another possibility?
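On the cancellation idea: passing a CancellationToken into the request lets a stale price lookup be abandoned the moment the search text changes. A minimal sketch with hypothetical names, where the HTTP call is simulated by Task.Delay:

```csharp
using System;
using System.Threading;
using System.Threading.Tasks;

// Hypothetical price lookup; Task.Delay stands in for the real HTTP call.
// Passing the token through lets an in-flight request be abandoned.
static async Task<string> GetAvgPriceAsync(string uniqueName, CancellationToken ct)
{
    await Task.Delay(TimeSpan.FromSeconds(5), ct); // simulated slow API call
    return "42.00";
}

var cts = new CancellationTokenSource();
var pending = GetAvgPriceAsync("T4_BAG", cts.Token);

cts.Cancel(); // the user typed a new search term
var wasCanceled = false;
try
{
    await pending;
}
catch (OperationCanceledException)
{
    wasCanceled = true; // stale lookup abandoned instead of blocking the search
}
Console.WriteLine(wasCanceled); // True
```

In the real tool, a new CancellationTokenSource would be created on each search-text change and the previous one canceled.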
I have 3000 emails in my Gmail account. I want to create an aggregated list of all the senders so that I can more effectively clean up my inbox. I don't need to download the message bodies or the attachments.
I used this sample to get started (https://developers.google.com/gmail/api/quickstart/dotnet), although now I can't figure out how to return more than 100 message ids when I execute this code:
using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Net;
using System.Net.Http;
using System.Runtime.InteropServices;
using System.Text;
using System.Threading;
using System.Threading.Tasks;
using Google.Apis.Auth.OAuth2;
using Google.Apis.Gmail.v1;
using Google.Apis.Gmail.v1.Data;
using Google.Apis.Requests;
using Google.Apis.Services;
using Google.Apis.Util;
using Google.Apis.Util.Store;
namespace GmailQuickstart
{
class Program
{
static string[] Scopes = { GmailService.Scope.GmailReadonly };
static string ApplicationName = "Gmail API .NET Quickstart";
static void Main(string[] args)
{
UserCredential credential;
using (var stream = new FileStream("credentials.json", FileMode.Open, FileAccess.Read))
{
string credPath = "token.json";
credential = GoogleWebAuthorizationBroker.AuthorizeAsync(
GoogleClientSecrets.Load(stream).Secrets,
Scopes,
"user",
CancellationToken.None,
new FileDataStore(credPath, true)).Result;
Console.WriteLine("Credential file saved to: " + credPath);
}
// Create Gmail API service.
var service = new GmailService(new BaseClientService.Initializer()
{
HttpClientInitializer = credential,
ApplicationName = ApplicationName,
});
////get all of the message ids for the messages in the inbox
var messageRequest = service.Users.Messages.List("me");
messageRequest.LabelIds = "INBOX";
var messageList = new List<Message>();
ListMessagesResponse messageResponse1 = new ListMessagesResponse();
var k = 0;
do
{
messageResponse1 = messageRequest.Execute();
messageList.AddRange(messageResponse1.Messages);
var output = $"Request {k} - Message Count: {messageList.Count()} Page Token: {messageRequest.PageToken} - Next Page Token: {messageResponse1.NextPageToken}";
Console.WriteLine(output);
System.IO.File.AppendAllText(@"C:\000\log.txt", output);
messageRequest.PageToken = messageResponse1.NextPageToken;
k++;
//this switch allowed me to walk through getting multiple pages of emails without having to get them all
//if (k == 5)
//{
// break;
//}
} while (!String.IsNullOrEmpty(messageRequest.PageToken));
//once i created the list of all the message ids i serialized the list to JSON and wrote it to a file
//so I could test the next portions without having to make the calls against the above each time
var serializedMessageIdList = Newtonsoft.Json.JsonConvert.SerializeObject(messageList);
System.IO.File.WriteAllText(@"C:\000\MessageIds.json", serializedMessageIdList);
//read in the serialized list and rehydrate it to test the next portion
var mIdList = Newtonsoft.Json.JsonConvert.DeserializeObject<List<Message>>(System.IO.File.ReadAllText(@"C:\000\MessageIds.json"));
//this method takes those message ids and gets the message object from the api for each of them
//1000 is the maximum number of requests google allows in a batch request
var messages = BatchDownloadEmails(service, mIdList.Select(m => m.Id), 1000);
//again i'm serializing the message list and writing them to a file
var serializedMessageList = Newtonsoft.Json.JsonConvert.SerializeObject(messages);
System.IO.File.WriteAllText(@"C:\000\Messages.json", serializedMessageList);
//and then reading them in and rehydrating the list to test the next portion
var mList = Newtonsoft.Json.JsonConvert.DeserializeObject<IList<Message>>(System.IO.File.ReadAllText(@"C:\000\Messages.json"));
//then i loop through each message and pull the values out of the payload header i'm looking for
var emailList = new List<EmailItem>();
foreach (var message in mList)
{
if (message != null)
{
var from = message.Payload.Headers.SingleOrDefault(h => h.Name == "From")?.Value;
var date = message.Payload.Headers.SingleOrDefault(h => h.Name == "Date")?.Value;
var subject = message.Payload.Headers.SingleOrDefault(h => h.Name == "Subject")?.Value;
emailList.Add(new EmailItem() { From = from, Subject = subject, Date = date });
}
}
//i serialized this list as well
var serializedEmailItemList = Newtonsoft.Json.JsonConvert.SerializeObject(emailList);
System.IO.File.WriteAllText(@"C:\000\EmailItems.json", serializedEmailItemList);
//rehydrate for testing
var eiList = Newtonsoft.Json.JsonConvert.DeserializeObject<List<EmailItem>>(System.IO.File.ReadAllText(@"C:\000\EmailItems.json"));
//here is where i do the actual aggregation to determine which senders i have the most email from
var senderSummary = eiList.GroupBy(g => g.From).Select(g => new { Sender = g.Key, Count = g.Count() }).OrderByDescending(g => g.Count);
//serialize and output the results
var serializedSummaryList = Newtonsoft.Json.JsonConvert.SerializeObject(senderSummary);
System.IO.File.WriteAllText(@"C:\000\SenderSummary.json", serializedSummaryList);
}
public static IList<Message> BatchDownloadEmails(GmailService service, IEnumerable<string> messageIds, int chunkSize)
{
// Create a batch request.
var messages = new List<Message>();
//because the google batch request will only allow 1000 requests per batch the list needs to be split
//based on chunk size
var lists = messageIds.ChunkBy(chunkSize);
//double batchRequests = (2500 + 999) / 1000;
//for each list create a request with the message id and add it to the batch request queue
for (int i = 0; i < lists.Count(); i++)
{
var list = lists.ElementAt(i);
Console.WriteLine($"list: {i}...");
var request = new BatchRequest(service);
foreach (var messageId in list)
{
//Console.WriteLine($"message id: {messageId}...");
var messageBodyRequest = service.Users.Messages.Get("me", messageId);
//messageBodyRequest.Format = UsersResource.MessagesResource.GetRequest.FormatEnum.Metadata;
request.Queue<Message>(messageBodyRequest,
(content, error, index, message) =>
{
messages.Add(content);
});
}
Console.WriteLine("");
Console.WriteLine("ExecuteAsync");
//execute all the requests in the queue
request.ExecuteAsync().Wait();
System.Threading.Thread.Sleep(5000);
}
return messages;
}
}
public class EmailItem
{
public string From { get; set; }
public string Subject { get; set; }
public string Date { get; set; }
}
public static class IEnumerableExtensions
{
public static IEnumerable<IEnumerable<T>> ChunkBy<T>(this IEnumerable<T> source, int chunkSize)
{
return source
.Select((x, i) => new { Index = i, Value = x })
.GroupBy(x => x.Index / chunkSize)
.Select(x => x.Select(v => v.Value));
}
}
}
The research I've done says I need to use a batch request, but based on the information I've found I'm not able to adapt it to what I'm trying to accomplish. My understanding is that I would use the batch request to get all of the message ids, and then make 3000 individual calls to get the actual from, subject, and date received for each email in my inbox?
You can use paging to get a full list.
Pass the page token from the previous response into the next call to Users.Messages.List (don't pass one into the first call, to get things started). Detect the end when the result contains no messages.
This allows you to get all the messages in the mailbox.
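That loop can be sketched with the Gmail call stubbed out as a token-to-page lookup (the real code would call messageRequest.Execute() and read NextPageToken from the response):

```csharp
using System;
using System.Collections.Generic;

// Fake Users.Messages.List: maps a page token to (message ids, next token).
// "" stands for the first call, which is made without a token.
var fakePages = new Dictionary<string, (string[] Ids, string Next)>
{
    [""]      = (new[] { "m1", "m2" }, "tok-2"),
    ["tok-2"] = (new[] { "m3", "m4" }, "tok-3"),
    ["tok-3"] = (new[] { "m5" },       null),    // last page: no next token
};

var allIds = new List<string>();
var token = "";
do
{
    var (ids, next) = fakePages[token]; // stands in for messageRequest.Execute()
    allIds.AddRange(ids);
    token = next;                       // messageRequest.PageToken = response.NextPageToken
} while (!string.IsNullOrEmpty(token));

Console.WriteLine(allIds.Count); // 5
```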
NB. I suggest you make the code async: if there are more than a few messages to read, it can take an appreciable time to get them all.
You can also use PageStreamer to get the remainder of the results.
var pageStreamer = new PageStreamer<Google.Apis.Gmail.v1.Data.Message, UsersResource.MessagesResource.ListRequest, ListMessagesResponse, string>(
(request, token) => request.PageToken = token,
response => response.NextPageToken,
response => response.Messages);
var req = service.Users.Messages.List("me");
req.MaxResults = 1000;
foreach (var result in pageStreamer.Fetch(req))
{
Console.WriteLine(result.Id);
}
This code will continue to run as long as there are additional results to request. Batching isn't really going to help you here, as there is no way to know what the next page token will be.
I am trying to perform paginated search on Active Directory using System.DirectoryServices.Protocols.PageResultRequestControl.
I do get the search results in pages, however, the searchResponse that I get does NOT have the correct TotalCount for total number of pages.
Is it not supported? Or am I missing something here?
This is sample code that I have used in order to implement above. I am using System.DirectoryServices.Protocols to query Active Directory.
When PageResultRequestControl is added with a page size, everything works perfectly except for the total count.
For example, in this code
LdapConnection connection = new LdapConnection(ldapDirectoryIdentifier, credential);
SearchRequest sr = new SearchRequest("", "(displayName=*)", System.DirectoryServices.Protocols.SearchScope.Subtree, new[] { "displayName"});
PageResultRequestControl pr = new PageResultRequestControl(50);
SearchOptionsControl so = new SearchOptionsControl(SearchOption.DomainScope);
sr.Controls.Add(pr);
sr.Controls.Add(so);
SearchResponse searchResponse;
while (true)
{
searchResponse = (SearchResponse)connection.SendRequest(sr);
if (searchResponse.Controls.Length != 1 || !(searchResponse.Controls[0] is PageResultResponseControl))
{
totalPageCount = 0;
return null;
}
PageResultResponseControl pageResponse = (PageResultResponseControl)searchResponse.Controls[0];
totalPageCount = pageResponse.TotalCount;
if (pageResponse.Cookie.Length == 0)
{
break;
}
else
{
pr.Cookie = pageResponse.Cookie;
}
}
As the documentation says, the TotalCount property contains only the server's estimated count of the total result set (https://technet.microsoft.com/en-us/library/system.directoryservices.protocols.pageresultresponsecontrol.totalcount), and the server is not obliged to return an accurate value.
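Since the estimate can't be relied on (Active Directory commonly returns 0 here), an exact total has to be accumulated from the entries each page actually returns. A sketch with the LDAP round-trips reduced to per-page entry counts:

```csharp
using System;
using System.Collections.Generic;

// Each element stands for searchResponse.Entries.Count on one
// cookie round-trip of the paged search (page size 50 in the question).
var pageEntryCounts = new Queue<int>(new[] { 50, 50, 23 });

var exactTotal = 0;
while (pageEntryCounts.Count > 0)        // until the cookie comes back empty
    exactTotal += pageEntryCounts.Dequeue();

Console.WriteLine(exactTotal); // 123
```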