Jama REST API - Injecting a where clause to GET - C#

I'm using the REST API of Jama, detailed here:
https://dev.jamasoftware.com/rest#operation_getItems
Using this allows me to get a JSON object of all the items in a given project.
However, what is returned is limited to just 20 results; to get all the results I need to loop through using pagination.
There is a field in the returned data called itemType. Is there any way of injecting a 'where' clause into the GET request so that only items of a certain type are returned?
All I'm after is the total number of matching items; it seems unnecessary to have to grab everything and then loop through to extract the items of a certain type.

Not sure which version of Jama you are using but in 8.25 we have the abstractitems endpoint, which optionally takes an itemType [array] parameter in the query.
See https://dev.jamasoftware.com/rest#operation_getAbstractItems
The GET request looks like this:
https://myjamahost.com/rest/latest/abstractitems?project=23&itemType=60&itemType=67&...&maxResults=50
Unfortunately the response size here also defaults to 20, with a maximum of 50.
Hope that helps
S.
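To close the loop on the original question (counting items of a given type): paged Jama responses also carry the overall match count in their metadata (meta.pageInfo.totalResults, per the REST docs), so a single small page can be enough. A Python sketch with the HTTP call stubbed out behind fetch_page so the startAt/maxResults paging logic is visible (fetch_page and its response shape are assumptions modeled on the docs, not a Jama client API):

```python
def iter_items(fetch_page, page_size=50):
    """Walk a paged Jama-style endpoint using startAt/maxResults.

    fetch_page(start_at) is expected to return the decoded JSON body of
    GET .../abstractitems?project=...&itemType=...&startAt=...&maxResults=...
    with the usual "data" list and "meta.pageInfo.totalResults" fields.
    """
    start_at = 0
    while True:
        page = fetch_page(start_at)
        items = page.get("data", [])
        yield from items
        total = page["meta"]["pageInfo"]["totalResults"]
        start_at += page_size
        if not items or start_at >= total:
            break

def total_matching(fetch_page):
    """Read the match count from the first page's metadata alone."""
    return fetch_page(0)["meta"]["pageInfo"]["totalResults"]
```

If only the count is needed, one request (even with a tiny maxResults) should do, since the server reports the overall total in the metadata rather than making you fetch every page.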

Related

Sharepoint Lookup values

I have one SharePoint list which uses a Lookup field whose source points to another list in the same site (master data). When using an OData query I don't see that column in my result.
If I use the OData URL with "FieldValuesAsText" I can see the data. I have also tried using ContentType, Expand and filter, but I have had no luck so far.
Since my list is very big and I want to retrieve the data in a minimum number of calls, I'm looking for some kind of approach or URL which will help me achieve this.
Try expanding the lookup field to get the real value, like this:
/_api/web/lists(guid'')/items?$select=Title,LookupFieldName1/FieldToBeExpanded1,LookupField2/FieldToBeExpanded2&$expand=LookupFieldName1,LookupFieldName2
In this endpoint, FieldToBeExpanded1 should be the same as the column the lookup takes its value from.
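Since every lookup needs a matching entry in both $select and $expand, a tiny helper can keep the pair consistent. A Python sketch (the site URL, GUID, and field names are placeholders, mirroring the URL above):

```python
def odata_lookup_url(site, list_guid, lookups):
    """Build a SharePoint OData items URL that expands lookup fields.

    lookups maps each lookup field name to the field to expand on the
    target list, e.g. {"LookupFieldName1": "FieldToBeExpanded1"}.
    Each lookup appears as Name/SubField in $select and as Name in $expand.
    """
    select = ",".join(f"{name}/{sub}" for name, sub in lookups.items())
    expand = ",".join(lookups)
    return (f"{site}/_api/web/lists(guid'{list_guid}')/items"
            f"?$select={select}&$expand={expand}")
```

Because the whole window of columns comes back in one round trip, this keeps the number of calls down even for a large list.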

How to create pagination based on client.Search results Elasticsearch Nest

Is there any way in which I can retrieve all the results from client.Search (I think this can be done using the scroll API) and create pagination for these results when displaying them? Is there an API from ES for doing so?
Or can it be done using From() and Size()?
For example: let's say I have 100,000 documents in the index, and when I search for a keyword it generates some 200 results. How can I use scroll, from and size to show them?
TIA
We use from and size options to implement pagination for Elasticsearch results.
The code snippet can be something like below:
def query(page)
  size = 10
  page ||= 1
  from = (page - 1) * size
  # run the Elasticsearch query with these from and size options
end
You may also want to know the total number of results so you can build the pagination without sending an additional count request. To get it, you can use the total field of the response.
=== Updated
If you want to get the search results of the first page, then you can use query(1). If you want to get the search results of the second page, then you can use query(2) and so on.
The purpose of scroll is slightly different. Let's say you need to get all records of the search results and the number of results is very large (e.g., millions of results). If you retrieve all the data at once, it can cause memory issues or problems due to the high load. In this case, you can use scroll to fetch the results step by step.
For pagination, you don't need all the data of the search results, only the data for a specific page. In that case, you need just a query with the from and size options, NOT scroll.
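The from/size arithmetic, plus a limit worth knowing about, can be sketched as follows (Python; the 10,000 cap is Elasticsearch's default index.max_result_window, beyond which from/size requests are rejected and search_after or scroll is the better fit):

```python
def page_window(page, size=10, max_result_window=10000):
    """Translate a 1-based page number into from/size query options.

    Raises ValueError when the requested window would exceed
    index.max_result_window, Elasticsearch's default deep-paging limit.
    """
    page = max(page or 1, 1)          # treat None/0 as the first page
    frm = (page - 1) * size
    if frm + size > max_result_window:
        raise ValueError("page too deep; use search_after or scroll")
    return {"from": frm, "size": size}

def total_pages(total_hits, size=10):
    """Number of pages needed for total_hits results (ceiling division)."""
    return (total_hits + size - 1) // size
```

For the example in the question (200 hits, 10 per page) this yields 20 pages, and page 3 maps to {"from": 20, "size": 10}, which is exactly what gets passed into the search body.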

How to add a limit parameter in the SOAP request URL?

My aim is to get all records from a table in ServiceNow. By default, I am able to get 250 records, but I need to get all the records. I learnt from here that we can set the limit in the SOAP request URL.
Code:
var url = "https://*****.service-now.com/rm_story.do?SOAP";
Additional code in getting records for reference
I added the limit parameter to the above url as
https://*****.service-now.com/rm_story.do?&limit=300&SOAP.
Even so, I am still getting the same 250 records.
Is this the correct way to add a limit? If not, how do I add a limit to the SOAP request URL in ServiceNow? Thanks.
The __limit parameter must be passed in the body of the HTTP request, not in the URL. See the answer to ServiceNow - Getting all records for an example HTTP request to the ServiceNow SOAP API using the __limit parameter.
No, you need to look at __first_row (perhaps combined with __limit):
__first_row: Instructs the results to be offset by this number of records from the beginning of the set. When used with __last_row, this has the effect of querying for a window of results. The results are inclusive of the first row number.
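Putting __first_row and __last_row together, fetching everything in fixed-size windows might look like this (a Python sketch; it assumes __first_row is a 0-based offset and __last_row marks the end of the window, per the quoted description - verify against your instance):

```python
def row_windows(total, window=250):
    """Yield __first_row/__last_row pairs covering `total` records
    in fixed-size windows, one pair per SOAP request to be made."""
    first = 0
    while first < total:
        yield {"__first_row": first,
               "__last_row": min(first + window, total)}
        first += window
```

Each yielded pair would go into the body of one getRecords-style request; for 600 records and the default 250-record window that is three requests.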

CAML query items in increments

I currently have a SharePoint 2010 list that contains roughly 200,000 records.
I want to get each record, tweak it, massage it, and store it in a SQL table.
As of now, I am using the SharePoint 2010 web service method GetListItems like so...
System.Xml.XmlNode nodeListItems = client.GetListItems("guid", "", query, viewFields, RowNumber, queryOptions, null);
Querying 200,000 records at once is too much. How can I get around this? The GetListItems method takes CAML query parameters.
Is there a way to do this in increments, say 5,000 records at a time? How would one structure the CAML query to do that?
Unless someone has a better way of accomplishing this altogether?
Yes, there is: you can paginate the results. The fifth parameter is the page size, which you have set via RowNumber. Set it to 5000 if you want pages of 5000 records.
Details on accessing subsequent pages are in the documentation for the GetListItems method:
The GetListItems method supports server-side paging. The XML data
returned by this method includes a ListItemCollectionPositionNext
attribute inside the rs:Data element that contains the information to
support paging. This string contains data for the fields in the sort
and for other items needed for paging. You should consider this string
internal and not to be modified; modifying it can produce unexpected
results. The following example shows the form of this return value
when paging is supported.
<rs:Data ListItemCollectionPositionNext="Paged=TRUE&p_ID=100&View=%7bC68F4A6A%2d9AFD%2d406C%2dB624%2d2CF8D729901E%7d&PageFirstRow=101" Count="1000">
   <z:row ows_FirstName="Nancy" ows_LastName="Name" ... />
   ...
</rs:Data>
To get the next page of data, the queryOption parameter is used, as
shown in the following example.
<QueryOptions>
   <Paging ListItemCollectionPositionNext="Paged=TRUE&p_ID=100&View=%7bC68F4A6A%2d9AFD%2d406C%2dB624%2d2CF8D729901E%7d&PageFirstRow=101" />
</QueryOptions>
So all you need to do is grab the value of that attribute from each page's result set and add it to the query options to get the next page.
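The loop this describes - issue a query, read ListItemCollectionPositionNext, feed it back via the Paging element - can be sketched like so (Python, with the actual GetListItems call abstracted behind a get_page callable so the token handling stands out; get_page is a hypothetical helper, not part of the web service):

```python
def fetch_all_pages(get_page):
    """Drive SharePoint-style server-side paging to completion.

    get_page(position) must return (rows, next_position), where
    next_position is the ListItemCollectionPositionNext string taken
    from rs:Data, or None once the last page has been reached.
    """
    rows, position = [], None
    while True:
        page_rows, position = get_page(position)
        rows.extend(page_rows)
        if position is None:
            return rows
```

In the real C# call, each iteration would pass the token back inside the queryOptions XML exactly as in the QueryOptions example above, with the row-limit argument capping the page size (e.g. 5000).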

Can I use LINQ to skip a collection and just return 100 records?

I have the following that returns a collection from Azure table storage where Skip is not implemented. The number of rows returned is approximately 500.
ICollection<City> a = cityService.Get("0001I");
What I would like to do is to be able to depending on an argument have just the following ranges returned:
records 1-100 passing in 0 as an argument to a LINQ expression
records 101-200 passing in 100 as an argument to a LINQ expression
records 201-300 passing in 200 as an argument to a LINQ expression
records 301-400 passing in 300 as an argument to a LINQ expression
etc
Is there some way I can add to the above and use LINQ to get these ranges of records returned?
As you already stated in your question, the Skip method is not implemented in Windows Azure Table storage. This means you have 2 options left:
Option 1
Download all data from table storage (by using ToList, see abatishchev's answer) and execute the Skip and Take methods on this complete list. In your question you're talking about 500 records. If the number of records doesn't grow too much this solution should be OK for you, just make sure that all records have the same partition key.
If the data grows you can still use this approach, but I suggest you evaluate a caching solution to store all the records instead of loading them from table storage over and over again (this will improve the performance, but don't expect this to work with very large amounts of data). Caching is possible in Windows Azure using:
Windows Azure Caching (Preview)
Windows Azure Shared Caching
Option 2
The CloudTableQuery class allows you to query for data but, more importantly, to receive a continuation token to build a paging implementation. This allows you to detect whether you can query for more data; the pagination example on Scott's blogpost (see nemensv's comment) uses this.
For more information on continuation tokens I suggest you take a look at Jim's blogpost: Azure#home Part 7: Asynchronous Table Storage Pagination. By using continuation tokens you only download the data for the current page, meaning it will work correctly even if you have millions of records. But you have to know the downsides of using continuation tokens:
This won't work with the Skip method out of the box, so it might not be a solution for you.
No page 'numbers', because you only know if there's more data (not how much)
No way to count all records
If paging is not supported by the underlying engine, the only way to implement it is to load all the data into memory and then perform paging:
var list = cityService.Get("0001I").ToList(); // materialize
var result = list.Skip(x).Take(y);
Try something like this:
cityService.Get("0001I").ToList().Skip(n).Take(100);
This should return records 201-300:
cityService.Get("0001I").ToList().Skip(200).Take(100);
a.AsEnumerable().Skip(m).Take(n)
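For completeness, the same window can be taken lazily, without materializing the whole collection first (a Python sketch of the Skip/Take pattern; in C#, dropping the intermediate ToList over an IEnumerable achieves the same deferred behavior):

```python
from itertools import islice

def window(items, offset, count=100):
    """Return records offset+1 through offset+count of any iterable,
    consuming only as much of the source as needed - the equivalent
    of Skip(offset).Take(count)."""
    return list(islice(items, offset, offset + count))
```

So window(a, 200) corresponds to records 201-300, matching the ranges listed in the question, and gracefully returns a short (or empty) page when the source runs out.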
