I am using the Microsoft Graph SDK to create a batch request that contains 20 individual requests, each fetching a different user. When I call GetNextLinkAsync() the result is always null. I have tried requesting 1000 different users via batch requests, each containing 20 individual requests, and this works fine; the response is always returned in a single batch response.
I don't understand why the result comes back in a single batch response content rather than giving me a link to fetch the next set of responses.
Also, even if the next link were not always null, how would I follow it using the Graph SDK? It is just a string, not a next page request.
foreach (var batchRequest in batchRequests)
{
    try
    {
        var responses = await PostBatchRequest(batchRequest.Request);
        foreach (var id in batchRequest.RequestIds)
        {
            try
            {
                var user = await responses.GetResponseByIdAsync<User>(id);
                users.Add(user.UserPrincipalName, user.Id);
            }
            catch (ServiceException e)
            {
                logger.LogInformation(e.StatusCode.ToString());
            }
        }
    }
    catch (ServiceException e)
    {
        logger.LogInformation(e.StatusCode.ToString());
    }
}
I couldn't find proper documentation that explains how to follow the next link using the Graph SDK, why it is always null for this type of request, or whether there are special types of requests for which a next link is returned.
So a lot of this is framework stuff that's inside my wrapper classes, but the crux of the solution to your issue should be inside here:
var results = await batch.ParseAsync<ContactFolderContactsCollectionResponse, IContactFolderContactsCollectionPage>(response =>
{
    var page = response.Value;
    if (response?.AdditionalData != null && response.AdditionalData.ContainsKey("@odata.nextLink"))
    {
        page.InitializeNextPageRequest(Application.GraphConnection.Client, (string)response.AdditionalData["@odata.nextLink"]);
    }
    return page;
});
In this snippet I'm parsing out a ContactFolderContactsCollectionResponse from a batch whose steps were generated from a GET request that would normally return an IContactFolderContactsCollectionPage. The ContactFolderContactsCollectionResponse is fetched by the wrapper internals using
GetResponseByIdAsync<ContactFolderContactsCollectionResponse>(id)
So it's pretty analogous to what you're doing, except that there is probably some kind of UserResponse type that you should be using instead of User.
In my case the ContactFolderContactsCollectionResponse contains the IContactFolderContactsCollectionPage I actually want in its Value property, hence:
var page = response.Value;
Now, IContactFolderContactsCollectionPage normally has a NextPageRequest property, but when you parse it directly from the ContactFolderContactsCollectionResponse, this is not filled out. Luckily, we can find the raw @odata.nextLink in the ContactFolderContactsCollectionResponse's AdditionalData dictionary, and we can set it using the IContactFolderContactsCollectionPage.InitializeNextPageRequest method.
Hence the:
if (response?.AdditionalData != null && response.AdditionalData.ContainsKey("@odata.nextLink"))
{
    page.InitializeNextPageRequest(Application.GraphConnection.Client, (string)response.AdditionalData["@odata.nextLink"]);
}
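If I were to guess at the users equivalent, it might look roughly like the sketch below. This is hypothetical: it assumes your SDK version exposes the generated GraphServiceUsersCollectionResponse / IGraphServiceUsersCollectionPage pair (the users analogue of the contacts types above), and graphClient stands in for your GraphServiceClient instance.
// Hypothetical sketch for a batch step that GETs a users collection (e.g. /users).
var usersResponse = await responses.GetResponseByIdAsync<GraphServiceUsersCollectionResponse>(id);
var usersPage = usersResponse.Value;

// A page parsed this way has no NextPageRequest wired up, so build it from the raw annotation.
if (usersResponse?.AdditionalData != null && usersResponse.AdditionalData.ContainsKey("@odata.nextLink"))
{
    usersPage.InitializeNextPageRequest(graphClient, (string)usersResponse.AdditionalData["@odata.nextLink"]);
    var nextPage = await usersPage.NextPageRequest.GetAsync();
}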
Hopefully that gives you enough thread to pull on. Sorry if the rest of the syntax is confusing; as I said, a lot of it is operating in a wrapper framework I'm building, and I don't have time to build and test a clean solution.
It's also possible that the whole thing is different anyway, on account of you having a batch with a thousand steps as opposed to a batch whose steps return thousands of objects, as in my case.
Either way, happy hunting.
I have a very simple command to get a user's avatar. However, it works unreliably and incorrectly (example in the screenshot). The bot does not respond to all users, and not every time. What could this be related to? Maybe I don't understand something, but in my opinion there is a problem with SocketGuildUser. Here is a simplified code snippet for clarity:
[Command("avatar")]
[Alias("getavatar")]
public async Task GetAvatar(ushort res, SocketGuildUser user = null)
{
    if (user == null)
    {
        await Context.Channel.SendMessageAsync(Context.User.GetAvatarUrl(size: res));
    }
    else
    {
        await Context.Channel.SendMessageAsync(user.GetAvatarUrl(size: res));
    }
}
Simply enable the server members intent in the Discord developer portal. Without it, users will not be cached and your user type reader will fail to parse any mentioned users.
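For reference, requesting the intent on the client as well might look roughly like this with Discord.Net 3.x (a sketch; the privileged intent still has to be switched on in the portal):
using Discord;
using Discord.WebSocket;

// Sketch: request the guild-members gateway intent so SocketGuildUser parameters
// can be resolved from the member cache. This is a privileged intent, so it must
// also be enabled for the bot in the Discord developer portal.
var config = new DiscordSocketConfig
{
    GatewayIntents = GatewayIntents.AllUnprivileged | GatewayIntents.GuildMembers
};

var client = new DiscordSocketClient(config);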
Edit: An alternate method would be to create your own custom type reader that falls back to making a REST request if fetching the user from cache yields no results.
I'm polling data into my Angular app from a C# Web API. Every time, all of the data is polled, even though much of it hasn't changed. I would like to poll only the objects that have actually been updated in some way.
This is my code in my Controller.cs
//Get all details of the available vehicles
[HttpGet]
[Route("api/details")]
public object GetFleetStatusDetails()
{
    var fmsData = this.fmsdb.Value.GetFleetStatusDetails();
    var data = fmsData.Entries;
    List<VehicleDetails> result = new List<VehicleDetails>();
    foreach (var item in data)
    {
        if (item != null)
        {
            var details = ConvertVehicleDetail(item);
            result.Add(details);
        }
    }
    return result;
}
As you can see, I'm converting the data into VehicleDetails objects, which I then add to my VehicleDetails list. The data I'm getting is in JSON format. Is there a way of comparing my last poll with the current poll without going much deeper into the database? If so, how would I do that?
I suggest saving the last state either through the browser's local storage or using ngrx.
Without knowing what this.fmsdb.Value.GetFleetStatusDetails() does or the schema of the database you're calling, my best guess would be to create two endpoints in the API.
The first would use this.fmsdb.Value.GetFleetStatusDetails() to get the full set of results. Once the data has been downloaded from that endpoint, store the current date/time in the Angular view and pass it to the second endpoint.
The second endpoint would return only the records that have changed since the date/time you got the first set of data. This assumes that you are storing an updated date/time on each record.
[HttpGet]
[Route("api/updateddetails")]
public object GetUpdatedFleetStatusDetails([FromUri] DateTime date)
{
    var fmsData = this.fmsdb.Value.GetUpdatedFleetStatusDetails(date);
    return fmsData.Entries
        .Where(x => x != null)
        .Select(ConvertVehicleDetail);
}
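As a purely hypothetical sketch of what the data-access side of that could look like (the real schema behind fmsdb isn't shown; VehicleEntry, dbContext and UpdatedAt are assumed names):
// Hypothetical: assumes each vehicle record carries an UpdatedAt timestamp column.
public IEnumerable<VehicleEntry> GetUpdatedFleetStatusDetails(DateTime since)
{
    return this.dbContext.VehicleEntries
        .Where(v => v.UpdatedAt > since)
        .ToList();
}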
I could give more details if I knew what was in this.fmsdb.Value.GetFleetStatusDetails().
PS: I haven't tested this code.
I am using Google API to get information about an authenticated user. I can get the basic profile information, such as the ID and the full name. From the profile information, I can get the URL to the picture:
var plusMeUri = new Uri($"https://www.googleapis.com/plus/v1/people/me?key=<APP-ID>&access_token=<ACCESS-TOKEN>");
string userResponse = await HttpClient.GetStringAsync(plusMeUri);
JObject userObject = JObject.Parse(userResponse);
...
var imageObject = userObject.GetValue("image") as JObject;
var pictureUrl = imageObject.GetValue("url").Value<string>();
var pictureUri = new Uri(pictureUrl);
string uri = $"{pictureUri.Scheme}://{pictureUri.Host}{pictureUri.AbsolutePath}";
var pictureRequest = new HttpRequestMessage(HttpMethod.Get, uri);
pictureRequest.Headers.IfModifiedSince = <previous-timestamp>;
HttpResponseMessage pictureResponse = await HttpClient.SendAsync(pictureRequest);
if (pictureResponse.StatusCode == HttpStatusCode.NotModified)
// No need to handle anything else
return;
Question
I do not want to download the user's picture if it has not changed. This is why I am using the IfModifiedSince property. It does work with Facebook's API but it does not seem to work with Google's. How can I make it work?
From the information given, it seems like what you're trying to do is determine whether the image you're downloading/about to download is the same image as you've downloaded before. After looking at the Google+ API docs, it looks like the header you've been using isn't officially (at least not obviously) supported by their APIs.
But this is not the only way we can determine whether the image has changed or not (in fact, date last modified isn't necessarily the best way to do this anyway). Alternative methods include:
1) diffing the two images
2) checking the url (if we can assume different resources have different urls)
Option 1 is likely the most accurate but also the least efficient, so I'll leave that to you to solve if you decide to go that route. I think the most promising is option 2. I went ahead and played around with the API a little bit, and it looks like the image.url field changes when you update your profile picture.
For example, here are my last two Google+ profile picture URLs:
https://lh4.googleusercontent.com/-oaUVPGFNkV8/AAAAAAAAAAI/AAAAAAAAAqs/KM7H8ZIFuxk/photo.jpg?sz=50
https://lh4.googleusercontent.com/-oaUVPGFNkV8/AAAAAAAAAAI/AAAAAAAAl24/yHU99opjgN4/photo.jpg?sz=50
As such, instead of waiting for the response from the server and checking its header to decide whether the image has been updated, you may be able to short-circuit the entire HTTP request by simply checking whether the last image you pulled down came from the same URL. If it did, it's likely you've already acquired that image; otherwise you may not have it, so you should incur the cost of downloading it anyway.
In this case, your code would read something like:
var imageObject = userObject.GetValue("image") as JObject;
var pictureUrl = imageObject.GetValue("url").Value<string>();
if (pictureUrl != <previous-picture-url>)
{
    // insert get-new-picture logic here...
}
I have wrapped the action in Task.Run, but it seems that I am missing something very basic and I am unable to figure it out.
public void SaveOrderList(List<Order> inputList)
{
    Dictionary<string, string> result = new Dictionary<string, string>();
    string code = string.Empty;

    Task.Run(() =>
    {
        foreach (var item in inputList)
        {
            code = CreateSingleOrder(item);
            result.Add(item.TicketNumber, code);
        }

        ////TODO: Write logic to send mail
        emailSender.SendEmail("abc#xyz.com");
    });
}
Since there can be many entries in inputList, and each entry may take 5 seconds to process, I don't want the UI to be blocked for the end user. Instead, I want to send a mail notifying how many orders were processed successfully and which ones failed.
To achieve this, the best option I knew of was Task.Run. The problem is that as soon as the function completes, I don't see any sign that the code inside the foreach loop ever worked, because nothing ever made it to the DB.
Can anyone help me figure out what I am missing here?
Just for information, this function is called from a Web API, and the Web API POST method is called from JavaScript. Below is the code for the Web API endpoint.
[HttpPost, Route("SaveOrderList")]
[ResponseType(typeof(bool))]
public IHttpActionResult SaveOrderList(List<Order> orderList)
{
    orderManagerService.SaveOrderList(orderList);
    return this.Ok();
}
Thanks in advance for help.
You need to consider carefully how this works. There are a few suggestions in this article:
https://blog.stephencleary.com/2014/06/fire-and-forget-on-asp-net.html
But I would point out that 'fire and forget' on a web application is usually the wrong approach.
For your example, you really want to consider your UX - if I make an order on your site and then only find out some time later that the order failed (via email, which I may not be checking), I'd not be too impressed. It would be better to await the save result, or make multiple API requests for single order items and show the incremental result of successful orders on your front end.
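As a rough, untested sketch of the "wait for the result instead of firing and forgetting" shape, reusing the types from your code but changing the signatures so the per-order outcomes are returned to the caller:
// Sketch only: do the work as part of the request and return the outcome,
// so nothing is abandoned when the request completes.
public Dictionary<string, string> SaveOrderList(List<Order> inputList)
{
    var result = new Dictionary<string, string>();
    foreach (var item in inputList)
    {
        result.Add(item.TicketNumber, CreateSingleOrder(item));
    }
    return result;
}

[HttpPost, Route("SaveOrderList")]
[ResponseType(typeof(Dictionary<string, string>))]
public IHttpActionResult SaveOrderList(List<Order> orderList)
{
    var result = orderManagerService.SaveOrderList(orderList);
    return this.Ok(result); // the caller sees which orders succeeded or failed right away
}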
I'd also suggest a hard look at why your order saving is so slow - this will continue to be problematic for you until it's faster.
I have a C# script task in an SSIS package designed to geocode data through my company's proprietary system. It currently works like this:
1) Pull a query of addresses and put it in a data table
2) Loop through that table and, for each row, build a request, send it, wait for the response, then insert the result back into the database.
The issue is that each call takes forever to return, because before going out and getting a new address on the API side, it checks a current database (string match) to ensure the address does not already exist. If it doesn't exist, it then goes out and gets new data from a service like Google.
Because I'm doing one at a time, it's easy to keep the ID field with the record when I go back to insert it into the database.
Now comes the issue at hand... I was told to configure this as multi-threaded or asynchronous. Here is the page I was reading on here about this topic:
ASP.NET Multithreading Web Requests
var urls = new List<string>();
var results = new ConcurrentBag<OccupationSearch>();

Parallel.ForEach(urls, url =>
{
    WebRequest request = WebRequest.Create(url);
    string response = new StreamReader(request.GetResponse().GetResponseStream()).ReadToEnd();
    var result = new JsonSerializer().Deserialize<OccupationSearch>(new JsonTextReader(new StringReader(response)));
    results.Add(result);
});
Perhaps I'm thinking about this wrong, but if I send two requests (A and B) and let's say B actually returns first, how can I ensure that when I go back to update my database I'm updating the correct record? Can I send the ID with the API call and get it back?
My thought is to create an array of requests, burn through them without waiting for a response, and collect the return values in another array that I will then loop through for my insert statements.
Is this a good way of going about this? I've never used Parallel.ForEach, and all the info I find on it is too technical for me to visualize and apply to my situation.
Perhaps I'm thinking about this wrong, but if I send two requests (A and B) and let's say B actually returns first, how can I ensure that when I go back to update my database I'm updating the correct record? Can I send the ID with the API call and get it back?
None of your code contains anything that looks like an "ID," but I assume everything you need is in the URL. If that is the case, one simple answer is to use a Dictionary instead of a Bag.
List<string> urls = GetListOfUrlsFromSomewhere();
var results = new ConcurrentDictionary<string, OccupationSearch>();

Parallel.ForEach(urls.Distinct(), url =>
{
    WebRequest request = WebRequest.Create(url);
    string response = new StreamReader(request.GetResponse().GetResponseStream()).ReadToEnd();
    var result = new JsonSerializer().Deserialize<OccupationSearch>(new JsonTextReader(new StringReader(response)));
    results.TryAdd(url, result);
});
After this code is done, the results dictionary will contain entries that correlate each response back to the original URL.
Note: you might want to use HttpClient instead of WebRequest, and you should take care to dispose of your disposable objects, e.g. the StreamReader and StringReader.
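A rough sketch of the same idea using HttpClient and async I/O instead of Parallel.ForEach (this goes inside an async method; GetListOfUrlsFromSomewhere is the same helper as above, and Newtonsoft.Json's JsonConvert is assumed for deserialization):
using System.Collections.Concurrent;
using System.Linq;
using System.Net.Http;
using System.Threading.Tasks;
using Newtonsoft.Json;

// ...

var results = new ConcurrentDictionary<string, OccupationSearch>();

using (var client = new HttpClient())
{
    var tasks = GetListOfUrlsFromSomewhere()
        .Distinct()
        .Select(async url =>
        {
            // Keyed by URL so each response can be matched back to its request.
            string json = await client.GetStringAsync(url);
            results.TryAdd(url, JsonConvert.DeserializeObject<OccupationSearch>(json));
        });

    await Task.WhenAll(tasks);
}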