Synchronise asynchronous message processing - TPL Dataflow pipelines - C#

What I am doing:
I have a microservice, part of an order processing system, which constantly consumes order messages from RabbitMQ. I need to temporarily keep them in my microservice's DB until they are handled (accepted/declined). The order messages are forwarded to a TPL Dataflow pipeline, which has two branches.
The first one being for processing 'created' orders.
When an order is created I perform some operations, such as collecting statistics for the order, validations, persisting it to the DB and notifying the user through SignalR. (I have 5 blocks for this branch, and some include network I/O calls.)
The second branch (2 blocks) is for accepted/declined orders. When I receive an order with status accepted/declined I need to remove it from my DB and also notify the user through SignalR.
The message for 'created' and 'accepted/declined' order differs only in its status property.
The forwarding to each of the two branches happens through TPL Dataflow's link predicate, based on the order status.
My problem:
Sometimes a message for a declined/accepted order arrives 50-150 ms after the message for the same order being created. Normally 50-150 ms is plenty of time in computing terms, but the first dataflow branch depends on external calls to other services, which can delay its processing.
I want to make sure that I have fully processed the message with status 'created' and only after that to process the message for the same order being 'accepted/declined'.
using System;
using System.Collections.Generic;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;
using System.Threading.Tasks.Dataflow;
namespace ConsoleApp12
{
public static class Program
{
static void Main(string[] args)
{
var linkOptions = new DataflowLinkOptions { PropagateCompletion = true };
var executionOptions = new ExecutionDataflowBlockOptions
{
MaxDegreeOfParallelism = 1, // potentially would be increased
BoundedCapacity = 50
};
var deserialize = new TransformBlock<OrderStatus, Order>(o =>
{
return new Order { Status = o };
}, executionOptions);
#region Created order
var b11 = new TransformBlock<Order, Order>(async o =>
{
await Task.Delay(15); // do something
return o;
}, executionOptions);
var b12 = new TransformBlock<Order, Order>(async o =>
{
await Task.Delay(15); // do something
return o;
}, executionOptions);
var b13 = new TransformBlock<Order, Order>(async o =>
{
await Task.Delay(15);
Console.WriteLine("Saved In DB");
return o;
}, executionOptions);
var b14 = new ActionBlock<Order>(async o =>
{
await Task.Delay(5);
Console.WriteLine("SignalR order created");
}, executionOptions);
#endregion
#region Accepted/Declined
var b21 = new ActionBlock<Order>(async o =>
{
await Task.Delay(5);
Console.WriteLine("Deleted from DB");
}, executionOptions);
var b22 = new ActionBlock<Order>(async o =>
{
await Task.Delay(5);
Console.WriteLine("SignalR order deleted");
}, executionOptions);
#endregion
var deleteFromDbAndSignalRInParallelJob = new List<ITargetBlock<Order>> { b21, b22 }.CreateGuaranteedBroadcastBlock();
deserialize.LinkTo(b11, linkOptions, x => x.Status == OrderStatus.Created);
b11.LinkTo(b12, linkOptions);
b12.LinkTo(b13, linkOptions);
b13.LinkTo(b14, linkOptions);
deserialize.LinkTo(deleteFromDbAndSignalRInParallelJob, linkOptions);
deserialize.Post(OrderStatus.Created);
Thread.Sleep(30); // delay between messages
deserialize.Post(OrderStatus.Declined);
Console.ReadKey();
}
}
class Order
{
public OrderStatus Status { get; init; }
}
enum OrderStatus
{
Created = 1,
Declined = 2
}
public static class DataflowExtensions
{
public static ITargetBlock<T> CreateGuaranteedBroadcastBlock<T>(this IEnumerable<ITargetBlock<T>> targets)
{
var targetsList = targets.ToList();
return new ActionBlock<T>(async item =>
{
var tasks = targetsList.Select(t => t.SendAsync(item));
await Task.WhenAll(tasks);
},
new ExecutionDataflowBlockOptions { BoundedCapacity = 100 });
}
}
}
Above is a sample with simplified models and logic, simulating that the 'created' order branch takes more time to complete.
The output is:
SignalR order deleted
Deleted from DB
Saved In DB
SignalR order created
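One possible way to enforce that ordering (a rough sketch, not part of the original post) is to record a per-order completion signal when the last 'created' block finishes and await it at the start of the accepted/declined branch. The OrderId key used below is hypothetical, since the simplified Order model above only carries a Status:
using System.Collections.Concurrent;
using System.Threading.Tasks;
public sealed class OrderSequencer
{
    // One completion source per (hypothetical) order id.
    private readonly ConcurrentDictionary<string, TaskCompletionSource<bool>> _created = new();
    private TaskCompletionSource<bool> Get(string orderId) =>
        _created.GetOrAdd(orderId, _ => new TaskCompletionSource<bool>(TaskCreationOptions.RunContinuationsAsynchronously));
    // Call from the last block of the 'created' branch (b14) once the order is fully processed.
    public void MarkCreatedProcessed(string orderId) => Get(orderId).TrySetResult(true);
    // Await at the start of the accepted/declined branch (b21/b22) before deleting/notifying.
    public Task WaitForCreatedAsync(string orderId) => Get(orderId).Task;
}
The accepted/declined blocks would then begin with await sequencer.WaitForCreatedAsync(order.OrderId), so they only proceed once the 'created' pipeline has completed for that particular order.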

Related

Kafka subscriber not receiving the messages

I am writing sample applications in .NET Core to interact with Kafka.
I have downloaded the official Kafka and Zookeeper Docker images to my machine.
I am using the Confluent.Kafka NuGet package for both the producer and the consumer. I am able to produce messages to Kafka, but my consumer part is not working.
Below are my producer and consumer code snippets. I am not sure what mistake I'm making here.
Do we need to explicitly create a consumer group?
Consumer code (this code is not working; the thread waits at consumer.Consume(cToken)):
var config = new ConsumerConfig
{
BootstrapServers = "localhost:9092",
GroupId = "myroupidd",
AutoOffsetReset = AutoOffsetReset.Earliest,
};
var ctokenSource = new CancellationTokenSource(); // not shown in the original snippet; added so it compiles
var cToken = ctokenSource.Token;
var consumerBuilder = new ConsumerBuilder<Null, string>(config);
consumerBuilder.SetPartitionsAssignedHandler((consumer, partitionlist) =>
{
consumer.Assign(new TopicPartition("myactual-toppics", 0) { });
Console.WriteLine("inside SetPartitionsAssignedHandler action");
});
using var consumer = consumerBuilder.Build();
consumer.Subscribe("myactual-toppics");
while (!cToken.IsCancellationRequested)
{
var consumeResult = consumer.Consume(cToken);
if (consumeResult.Message != null)
Console.WriteLine(consumeResult.Message.Value);
}
Producer code (this is working fine; I am able to see the messages using the Conduktor tool):
var config = new ProducerConfig
{
BootstrapServers = "localhost:9092",
ClientId = Dns.GetHostName(),
};
using (var producer = new ProducerBuilder<Null, string>(config).Build())
{
while (!stoppingToken.IsCancellationRequested)
{
var top = new TopicPartition("myactual-toppics", 0);
var result = await producer.ProduceAsync(top, new Message<Null, string> { Value = "My First Message" });
Console.WriteLine($"Publishedss1234" );
await Task.Delay(5000, stoppingToken);
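As a way to isolate the problem, a pared-down consumer loop (a sketch, not a confirmed fix) can be tried with no custom partitions-assigned handler, since Subscribe with a group id normally handles partition assignment on its own; the Ignore key type here is an assumption to sidestep key deserialization:
var config = new ConsumerConfig
{
    BootstrapServers = "localhost:9092",
    GroupId = "myroupidd",
    AutoOffsetReset = AutoOffsetReset.Earliest,
};
using var cts = new CancellationTokenSource();
using var consumer = new ConsumerBuilder<Ignore, string>(config).Build();
consumer.Subscribe("myactual-toppics");
while (!cts.Token.IsCancellationRequested)
{
    // Consume blocks until a message arrives or the token is cancelled.
    var result = consumer.Consume(cts.Token);
    Console.WriteLine(result.Message.Value);
}
If this loop receives messages, the SetPartitionsAssignedHandler/Assign combination in the original consumer is the place to look.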

Run one job queue at the same time on multiple servers

I have a real scenario with two servers. I have one queue and I want only one job at a time to run, on one of the two servers. But it looks like both servers are trying to handle this job. Is this possible?
My code doesn't work. The requirement is that if one server runs a job, the other one should not put it on the queue but skip it. In the real scenario, I need to distribute this across multiple servers.
Possible storages can be only Redis and PostgreSQL.
Any ideas on how to resolve this?
using System;
using System.Threading;
using System.Threading.Tasks;
using Hangfire;
using Hangfire.Logging;
using Hangfire.Logging.LogProviders;
using Hangfire.MemoryStorage;
using Hangfire.MemoryStorage.Database;
using Hangfire.PostgreSql;
using Hangfire.Pro.Redis;
namespace HangfireLockQueue
{
public class Program
{
static void Main(string[] args)
{
// 1.memory
//JobStorage storage = new MemoryStorageFake(new MemoryStorageOptions());
// 2.redis
//JobStorage storage = new RedisStorage("redis:6379,ssl = false", new RedisStorageOptions
//{
// Prefix = "HangfireRedis-Local:"
//});
// 3.postgres
var connectionString = "Host=postgres;Database=scheduler;Username=test;Password=<>;";
JobStorage storage = new PostgreSqlStorage(connectionString, new PostgreSqlStorageOptions
{
InvisibilityTimeout = TimeSpan.FromMinutes(5),
QueuePollInterval = TimeSpan.FromMilliseconds(200),
DistributedLockTimeout = TimeSpan.FromSeconds(10),
});
JobStorage.Current = storage;
Console.WriteLine(JobStorage.Current);
LogProvider.SetCurrentLogProvider(new ColouredConsoleLogProvider());
RunServer1(storage);
RunServer2(storage);
Console.ReadLine();
}
private static void RunServer1(JobStorage storage)
{
Task.Factory.StartNew(() => RunServer1Client1Queue(storage));
}
private static void RunServer1Client1Queue(JobStorage storage)
{
var serverOptions = new BackgroundJobServerOptions
{
ShutdownTimeout = TimeSpan.FromMinutes(5),
ServerName = $"{Environment.MachineName}1.{Guid.NewGuid()}",
Queues = new[] { "queue1" },
WorkerCount = 1
};
using (new BackgroundJobServer(serverOptions, storage))
{
Log("Hangfire Server 1 started. Press any key to exit...");
RecurringJob.AddOrUpdate(
Guid.NewGuid().ToString(),
() => JobThree(),
"* * * * *",
queue: "queue1");
Console.ReadKey();
}
}
private static void RunServer2(JobStorage storage)
{
Task.Factory.StartNew(() => RunServer2Client1Queue(storage));
}
private static void RunServer2Client1Queue(JobStorage storage)
{
Thread.Sleep(2000);
var serverOptions = new BackgroundJobServerOptions
{
ShutdownTimeout = TimeSpan.FromMinutes(5),
ServerName = $"{Environment.MachineName}2.{Guid.NewGuid()}",
Queues = new[] { "queue1" },
WorkerCount = 1
};
using (new BackgroundJobServer(serverOptions, storage))
{
Log("Hangfire Server 2 started. Press any key to exit...");
RecurringJob.AddOrUpdate(
Guid.NewGuid().ToString(),
() => JobThree(),
"* * * * *",
queue: "queue1");
Console.ReadKey();
}
}
public static void JobThree()
{
// implement guard if not working
Thread.Sleep(3000);
Log($"JobThree, current time:{DateTime.Now.ToString()}");
}
public static void Log(string msg, LogLevel level = LogLevel.Info)
{
LogProvider.GetLogger("Main").Log(level, () => { return msg; });
}
}
}
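One detail that stands out (an observation and a sketch, not a verified fix): each server registers the recurring job under Guid.NewGuid().ToString(), so the two servers end up with two independent recurring job definitions rather than sharing one. Registering the job under a fixed id makes AddOrUpdate idempotent across servers, and Hangfire's DisableConcurrentExecution attribute guards the method against overlapping runs:
// Register the same recurring job id from both servers; only one definition exists,
// and only one worker picks it up per scheduled run.
RecurringJob.AddOrUpdate(
    "job-three",          // fixed id shared by both servers
    () => JobThree(),
    "* * * * *",
    queue: "queue1");

// Guard the method itself in case two servers ever try to run it at the same time.
[DisableConcurrentExecution(timeoutInSeconds: 60)]
public static void JobThree()
{
    Thread.Sleep(3000);
    Log($"JobThree, current time:{DateTime.Now}");
}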

Parallel.ForEach blocking calling method

I am having a problem with Parallel.ForEach. I have written a simple application that adds file names to be downloaded to a queue, then iterates through the queue in a while loop, downloading one file at a time; when a file has been downloaded, another async method is called to create an object from the downloaded MemoryStream. The returned result of this method is not awaited, it is discarded, so the next download starts immediately. Everything works fine if I use a simple foreach for the object creation - objects are being created while the download continues. But if I try to speed up the object creation and use Parallel.ForEach, it stops the download process until the object is created. The UI is fully responsive, but it just won't download the next object. I don't understand why this is happening - the Parallel.ForEach is inside await Task.Run(), and to my limited knowledge of asynchronous programming this should do the trick. Can anyone help me understand why it is blocking the first method and how to avoid it?
Here is a small sample:
public async Task DownloadFromCloud(List<string> constructNames)
{
_downloadDataQueue = new Queue<string>();
var _gcsClient = StorageClient.Create();
foreach (var item in constructNames)
{
_downloadDataQueue.Enqueue(item);
}
while (_downloadDataQueue.Count > 0)
{
var memoryStream = new MemoryStream();
await _gcsClient.DownloadObjectAsync("companyprojects",
_downloadDataQueue.Peek(), memoryStream);
memoryStream.Position = 0;
_ = ReadFileXml(memoryStream);
_downloadDataQueue.Dequeue();
}
}
private async Task ReadFileXml(MemoryStream memoryStream)
{
var reader = new XmlReader();
var properties = reader.ReadXmlTest(memoryStream);
await Task.Run(() =>
{
var entityList = new List<Entity>();
foreach (var item in properties)
{
entityList.Add(CreateObjectsFromDownloadedProperties(item));
}
//Parallel.ForEach(properties, item =>
//{
// entityList.Add(CreateObjectsFromDownloadedProperties(item));
//});
});
}
EDIT
This is the simplified object creation method:
public Entity CreateObjectsFromDownloadedProperties(RebarProperties properties)
{
var path = new LinearPath(properties.Path);
var section = new Region(properties.Region);
var sweep = section.SweepAsMesh(path, 1);
return sweep;
}
Returned result of this method is not awaited, it is discarded, so the next download starts immediately.
This is also dangerous. "Fire and forget" means "I don't care when this operation completes, or if it completes. Just discard all exceptions because I don't care." So fire-and-forget should be extremely rare in practice. It's not appropriate here.
UI is fully responsive, but it just won't download the next object.
I have no idea why it would block the downloads, but there's a definite problem in switching to Parallel.ForEach: List<T>.Add is not threadsafe.
private async Task ReadFileXml(MemoryStream memoryStream)
{
var reader = new XmlReader();
var properties = reader.ReadXmlTest(memoryStream);
await Task.Run(() =>
{
var entityList = new List<Entity>();
Parallel.ForEach(properties, item =>
{
var itemToAdd = CreateObjectsFromDownloadedProperties(item);
lock (entityList) { entityList.Add(itemToAdd); }
});
});
}
One tip: if you have a result value, PLINQ is often cleaner than Parallel:
private async Task ReadFileXml(MemoryStream memoryStream)
{
var reader = new XmlReader();
var properties = reader.ReadXmlTest(memoryStream);
await Task.Run(() =>
{
var entityList = properties
.AsParallel()
.Select(CreateObjectsFromDownloadedProperties)
.ToList();
});
}
However, the code still suffers from the fire-and-forget problem.
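A minimal way to address that (a sketch, not part of the original answer) is to keep the tasks returned by ReadFileXml and await them all before DownloadFromCloud returns, so exceptions surface and the caller knows when processing is really done:
public async Task DownloadFromCloud(List<string> constructNames)
{
    var gcsClient = StorageClient.Create();
    var readTasks = new List<Task>();
    foreach (var constructName in constructNames)
    {
        var memoryStream = new MemoryStream();
        await gcsClient.DownloadObjectAsync("companyprojects", constructName, memoryStream);
        memoryStream.Position = 0;
        readTasks.Add(ReadFileXml(memoryStream)); // keep the task instead of discarding it
    }
    await Task.WhenAll(readTasks); // surfaces any exceptions from object creation
}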
For a better fix, I'd recommend taking a step back and using something more suited to "pipeline"-style processing. E.g., TPL Dataflow:
public async Task DownloadFromCloud(List<string> constructNames)
{
// Set up the pipeline.
var gcsClient = StorageClient.Create();
var downloadBlock = new TransformBlock<string, MemoryStream>(async constructName =>
{
var memoryStream = new MemoryStream();
await gcsClient.DownloadObjectAsync("companyprojects", constructName, memoryStream);
memoryStream.Position = 0;
return memoryStream;
});
var processBlock = new TransformBlock<MemoryStream, List<Entity>>(memoryStream =>
{
var reader = new XmlReader();
var properties = reader.ReadXmlTest(memoryStream);
return properties
.AsParallel()
.Select(CreateObjectsFromDownloadedProperties)
.ToList();
});
var resultsBlock = new ActionBlock<List<Entity>>(entities => { /* TODO */ });
downloadBlock.LinkTo(processBlock, new DataflowLinkOptions { PropagateCompletion = true });
processBlock.LinkTo(resultsBlock, new DataflowLinkOptions { PropagateCompletion = true });
// Push data into the pipeline.
foreach (var constructName in constructNames)
await downloadBlock.SendAsync(constructName);
downloadBlock.Complete();
// Wait for pipeline to complete.
await resultsBlock.Completion;
}

How to find all ids and pass them automatically as parameters using LINQ?

I have 2 projects. One of them is an ASP.NET Core Web API and the second one is a console application which consumes the API.
The API method looks like:
[HttpPost]
public async Task<IActionResult> CreateBillingInfo(BillingSummary billingSummaryCreateDto)
{
var role = User.FindFirst(ClaimTypes.Role).Value;
if (role != "admin")
{
return BadRequest("Available only for admin");
}
... other properties
billingSummaryCreateDto.Price = icu * roc.Price;
billingSummaryCreateDto.Project = await _context.Projects.FirstOrDefaultAsync(x => x.Id == billingSummaryCreateDto.ProjectId);
await _context.BillingSummaries.AddAsync(billingSummaryCreateDto);
await _context.SaveChangesAsync();
return StatusCode(201);
}
Console application which consumes the API:
public static async Task CreateBillingSummary(int projectId)
{
var json = JsonConvert.SerializeObject(new {projectId});
var data = new StringContent(json, Encoding.UTF8, "application/json");
using var client = new HttpClient();
client.DefaultRequestHeaders.Authorization =
new AuthenticationHeaderValue("Bearer", await Token.GetToken());
var loginResponse = await client.PostAsync(LibvirtUrls.createBillingSummaryUrl, data);
WriteLine("Response Status Code: " + (int) loginResponse.StatusCode);
string result = loginResponse.Content.ReadAsStringAsync().Result;
WriteLine(result);
}
Program.cs main method looks like:
static async Task Main(string[] args)
{
if (Environment.GetEnvironmentVariable("TAIKUN_USER") == null ||
Environment.GetEnvironmentVariable("TAIKUN_PASSWORD") == null ||
Environment.GetEnvironmentVariable("TAIKUN_URL") == null)
{
Console.WriteLine("Please specify all credentials");
Environment.Exit(0);
}
Timer timer = new Timer(1000); // show time every second
timer.Elapsed += Timer_Elapsed;
timer.Start();
while (true)
{
Thread.Sleep(1000); // after 1 second begin
await PollerRequests.CreateBillingSummary(60); // auto id
await PollerRequests.CreateBillingSummary(59); // auto id
Thread.Sleep(3600000); // 1hour wait again requests
}
}
Is it possible to find all the ids and pass them automatically instead of 59 and 60? The ids come from the projects table (_context.Projects).
I also tried an approach using a method which returns the ids:
public static async Task<IEnumerable<int>> GetProjectIds2()
{
var json = await Helpers.Transformer(LibvirtUrls.projectsUrl);
List<ProjectListDto> vmList = JsonConvert.DeserializeObject<List<ProjectListDto>>(json);
return vmList.Select(x => x.Id).AsEnumerable(); // tried ToList() as well
}
and in the main method I used:
foreach (var i in await PollerRequests.GetProjectIds2())
new List<int> { i }
.ForEach(async c => await PollerRequests.CreateBillingSummary(c));
For the first 3 ids it worked, but it does not get the other ones; tested with Console.WriteLine, the method returns all ids.
First get all Ids:
var ids = await PollerRequests.GetProjectIds2();
Then create a list of tasks and run them all:
var taskList = new List<Task>();
foreach(var id in ids)
taskList.Add(PollerRequests.CreateBillingSummary(id));
await Task.WhenAll(taskList);
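If this needs to keep running on the hourly cycle from the original Main method, the same idea can be folded into the loop (a sketch, assuming the one-hour delay between rounds is still wanted):
while (true)
{
    var ids = await PollerRequests.GetProjectIds2();
    // one CreateBillingSummary call per project id, all awaited together
    await Task.WhenAll(ids.Select(id => PollerRequests.CreateBillingSummary(id)));
    await Task.Delay(TimeSpan.FromHours(1)); // wait an hour before the next round of requests
}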

Retrieving more than 100 message ids from Gmail API

I have 3000 emails in my Gmail account. I want to create an aggregated list of all the senders so that I can more effectively clean up my inbox. I don't need to download the message bodies or the attachments.
I used this sample to get me started (https://developers.google.com/gmail/api/quickstart/dotnet), although now I can't figure out how to return more than 100 message ids when I execute this code:
using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Net;
using System.Net.Http;
using System.Runtime.InteropServices;
using System.Text;
using System.Threading;
using System.Threading.Tasks;
using Google.Apis.Auth.OAuth2;
using Google.Apis.Gmail.v1;
using Google.Apis.Gmail.v1.Data;
using Google.Apis.Requests;
using Google.Apis.Services;
using Google.Apis.Util;
using Google.Apis.Util.Store;
namespace GmailQuickstart
{
class Program
{
static string[] Scopes = { GmailService.Scope.GmailReadonly };
static string ApplicationName = "Gmail API .NET Quickstart";
static void Main(string[] args)
{
UserCredential credential;
using (var stream = new FileStream("credentials.json", FileMode.Open, FileAccess.Read))
{
string credPath = "token.json";
credential = GoogleWebAuthorizationBroker.AuthorizeAsync(
GoogleClientSecrets.Load(stream).Secrets,
Scopes,
"user",
CancellationToken.None,
new FileDataStore(credPath, true)).Result;
Console.WriteLine("Credential file saved to: " + credPath);
}
// Create Gmail API service.
var service = new GmailService(new BaseClientService.Initializer()
{
HttpClientInitializer = credential,
ApplicationName = ApplicationName,
});
////get all of the message ids for the messages in the inbox
var messageRequest = service.Users.Messages.List("me");
messageRequest.LabelIds = "INBOX";
var messageList = new List<Message>();
ListMessagesResponse messageResponse1 = new ListMessagesResponse();
var k = 0;
do
{
messageResponse1 = messageRequest.Execute();
messageList.AddRange(messageResponse1.Messages);
var output = $"Request {k} - Message Count: {messageList.Count()} Page Token: {messageRequest.PageToken} - Next Page Token: {messageResponse1.NextPageToken}";
Console.WriteLine(output);
System.IO.File.AppendAllText(@"C:\000\log.txt", output);
messageRequest.PageToken = messageResponse1.NextPageToken;
k++;
//this switch allowed me to walk through getting multiple pages of emails without having to get them all
//if (k == 5)
//{
// break;
//}
} while (!String.IsNullOrEmpty(messageRequest.PageToken));
//once i created the list of all the message ids i serialized the list to JSON and wrote it to a file
//so I could test the next portions without having to make the calls against the above each time
var serializedMessageIdList = Newtonsoft.Json.JsonConvert.SerializeObject(messageList);
System.IO.File.WriteAllText(@"C:\000\MessageIds.json", serializedMessageIdList);
//read in the serialized list and rehydrate it to test the next portion
var mIdList = Newtonsoft.Json.JsonConvert.DeserializeObject<List<Message>>(System.IO.File.ReadAllText(@"C:\000\MessageIds.json"));
//this method takes those message ids and gets the message object from the api for each of them
//1000 is the maximum number of requests google allows in a batch request
var messages = BatchDownloadEmails(service, mIdList.Select(m => m.Id), 1000);
//again i'm serializing the message list and writing them to a file
var serializedMessageList = Newtonsoft.Json.JsonConvert.SerializeObject(messages);
System.IO.File.WriteAllText(@"C:\000\Messages.json", serializedMessageList);
//and then reading them in and rehydrating the list to test the next portion
var mList = Newtonsoft.Json.JsonConvert.DeserializeObject<IList<Message>>(System.IO.File.ReadAllText(@"C:\000\Messages.json"));
//then i loop through each message and pull the values out of the payload header i'm looking for
var emailList = new List<EmailItem>();
foreach (var message in mList)
{
if (message != null)
{
var from = message.Payload.Headers.SingleOrDefault(h => h.Name == "From")?.Value;
var date = message.Payload.Headers.SingleOrDefault(h => h.Name == "Date")?.Value;
var subject = message.Payload.Headers.SingleOrDefault(h => h.Name == "Subject")?.Value;
emailList.Add(new EmailItem() { From = from, Subject = subject, Date = date });
}
}
//i serialized this list as well
var serializedEmailItemList = Newtonsoft.Json.JsonConvert.SerializeObject(emailList);
System.IO.File.WriteAllText(@"C:\000\EmailItems.json", serializedEmailItemList);
//rehydrate for testing
var eiList = Newtonsoft.Json.JsonConvert.DeserializeObject<List<EmailItem>>(System.IO.File.ReadAllText(@"C:\000\EmailItems.json"));
//here is where i do the actual aggregation to determine which senders i have the most email from
var senderSummary = eiList.GroupBy(g => g.From).Select(g => new { Sender = g.Key, Count = g.Count() }).OrderByDescending(g => g.Count);
//serialize and output the results
var serializedSummaryList = Newtonsoft.Json.JsonConvert.SerializeObject(senderSummary);
System.IO.File.WriteAllText(@"C:\000\SenderSummary.json", serializedSummaryList);
}
public static IList<Message> BatchDownloadEmails(GmailService service, IEnumerable<string> messageIds, int chunkSize)
{
// Create a batch request.
var messages = new List<Message>();
//because the google batch request will only allow 1000 requests per batch the list needs to be split
//based on chunk size
var lists = messageIds.ChunkBy(chunkSize);
//double batchRequests = (2500 + 999) / 1000;
//for each list create a request with the message id and add it to the batch request queue
for (int i = 0; i < lists.Count(); i++)
{
var list = lists.ElementAt(i);
Console.WriteLine($"list: {i}...");
var request = new BatchRequest(service);
foreach (var messageId in list)
{
//Console.WriteLine($"message id: {messageId}...");
var messageBodyRequest = service.Users.Messages.Get("me", messageId);
//messageBodyRequest.Format = UsersResource.MessagesResource.GetRequest.FormatEnum.Metadata;
request.Queue<Message>(messageBodyRequest,
(content, error, index, message) =>
{
messages.Add(content);
});
}
Console.WriteLine("");
Console.WriteLine("ExecuteAsync");
//execute all the requests in the queue
request.ExecuteAsync().Wait();
System.Threading.Thread.Sleep(5000);
}
return messages;
}
}
public class EmailItem
{
public string From { get; set; }
public string Subject { get; set; }
public string Date { get; set; }
}
public static class IEnumerableExtensions
{
public static IEnumerable<IEnumerable<T>> ChunkBy<T>(this IEnumerable<T> source, int chunkSize)
{
return source
.Select((x, i) => new { Index = i, Value = x })
.GroupBy(x => x.Index / chunkSize)
.Select(x => x.Select(v => v.Value));
}
}
}
The research I've done says I need to use a batch request, and based on the information I've found I'm not able to adapt it to what I'm trying to accomplish. My understanding is that I would use the batch request to get all of the message ids and then make 3000 individual calls to get the actual from, subject, and date received from each email in my inbox?
You can use paging to get a full list.
Pass the page token from the previous page into the next call to Users.Messages.List (don't pass one into the first call; that's how you get things started). Detect the end when the result contains no messages.
This allows you to get all the messages in the mailbox.
NB. I suggest you make the code async: if there are more than a few messages to read, it can take an appreciable time to get them all.
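For reference, an async version of the question's paging loop might look like this (a sketch; it keeps the question's variable names and only swaps Execute for ExecuteAsync and asks for larger pages):
var messageList = new List<Message>();
var messageRequest = service.Users.Messages.List("me");
messageRequest.LabelIds = "INBOX";
messageRequest.MaxResults = 500; // larger pages mean far fewer round trips than the default 100
do
{
    var response = await messageRequest.ExecuteAsync();
    if (response.Messages != null)
        messageList.AddRange(response.Messages);
    messageRequest.PageToken = response.NextPageToken;
} while (!string.IsNullOrEmpty(messageRequest.PageToken));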
You can also use PageStreamer to get the remainder of the results.
var pageStreamer = new PageStreamer<Google.Apis.Gmail.v1.Data.Message, UsersResource.MessagesResource.ListRequest, ListMessagesResponse, string>(
(request, token) => request.PageToken = token,
response => response.NextPageToken,
response => response.Messages);
var req = service.Users.Messages.List("me");
req.MaxResults = 1000;
foreach (var result in pageStreamer.Fetch(req))
{
Console.WriteLine(result.Id);
}
This code will continue to run as long as there are additional results to request. Batching isn't really going to help you here, as there is no way to know what the next page token will be.
