I am a bit new to NAV. So far I have published web services in NAV and have been able to consume these SOAP web services using C#.
Now the data has grown and it is taking longer to load. My idea is to query the data in chunks (e.g. chunks of 10) using DataTables, but I have yet to figure out how to set limits and offsets.
Here is my C# code that reads the NAV SOAP service:
public string getItemCardList(itemCardService_Service itemCardServiceObj, List<itemCardService_Filter> filter)
{
    serializer.MaxJsonLength = 50000000;
    return serializer.Serialize(itemCardServiceObj.ReadMultiple(filter.ToArray(), null, 0));
}
After several searches I found the answer on the [MSDN website](https://msdn.microsoft.com/en-us/library/ff477110.aspx) and modified it to work for me.
public string holder()
{
    const int fetchSize = 10;
    string bookmarkKey = null;
    List<itemCardService> itemList = new List<itemCardService>();

    // Read items data in pages of 10
    itemCardService[] results = itemCardServiceObj.ReadMultiple(filter.ToArray(), bookmarkKey, fetchSize);
    while (results.Length > 0)
    {
        bookmarkKey = results.Last().Key;
        itemList.AddRange(results);
        results = itemCardServiceObj.ReadMultiple(filter.ToArray(), bookmarkKey, fetchSize);
    }

    serializer.MaxJsonLength = 50000000;
    return serializer.Serialize(itemList);
}
I am trying to get the full contents of my modules from Zoho into our local server. The Deluge code does work, in that it returns the data which is being sent via the API. However, once it reaches the API, it is null. Any idea?
Below is the Deluge code:
// Create a map that holds the values of the new contact that needs to be created
evaluation_info = Map();
evaluation_info.put("BulkData",zoho.crm.getRecords("Publishers"));
data = Map();
data.put(evaluation_info);
response = invokeurl
[
url :"https://zohoapi.xxxxx.com/publisher/publish"
type :POST
parameters:data
connection:"zohowebapi"
];
info data; // data returns all the data from publishers
Here is my ASP.NET Core RESTful API. Deluge does ping it and the file gets created, but the content of the file is null.
Route("[controller]")]
[ApiController]
public class PublisherController : ControllerBase
{
[HttpGet("[action]"), HttpPost("[action]")]
public void Publish(string data)
{
(it's already null when it comes here. why?)
string JSONresult = JsonConvert.SerializeObject(data);
string path = #"C:\storage\journalytics_evaluationsv2.json";
using (var file = new StreamWriter(path, true))
{
file.WriteLine(JSONresult.ToString());
file.Close();
}
}
}
}
What am I missing? Thank you
After contacting Zoho support, the solution they offered was to loop through the data in order to get all the contents from a module (if there are more than 200 records). With the solution provided, one doesn't really need the Deluge code anymore, as long as you have the Zoho API set up for your account in code. This was my final solution, but it is not scalable at all; it's best to work with the BULK CSV.
// Our own ZohoAPI which lets us connect and authenticate etc. Yours may look slightly different
ZohoApi zohoApi = new ZohoApi();
zohoApi.Initialize();

ZCRMRestClient restClient = ZCRMRestClient.GetInstance();
var allMedicalJournals = new List<ZCRMRecord>();
for (int i = 1; i <= 30; i++)
{
    List<ZCRMRecord> accountAccessRecords2 =
        restClient.GetModuleInstance("Journals").SearchByCriteria("Tag:equals:MedicalSet", i, 200).BulkData.ToList();
    foreach (var newData in accountAccessRecords2)
        allMedicalJournals.Add(newData);
}
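Since the hard-coded 30 iterations cap this at 6,000 records, a variation like the sketch below (reusing the same ZCRMRestClient calls, so only an illustration) keeps requesting pages until one comes back with fewer than 200 records:

var allMedicalJournals = new List<ZCRMRecord>();
int page = 1;
const int perPage = 200;
while (true)
{
    // Same search as above; the 2nd and 3rd arguments are the page number and page size.
    var batch = restClient.GetModuleInstance("Journals")
        .SearchByCriteria("Tag:equals:MedicalSet", page, perPage)
        .BulkData.ToList();
    allMedicalJournals.AddRange(batch);
    if (batch.Count < perPage)
        break; // a short (or empty) page means this was the last batch
    page++;
}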
I'm getting mixed results with the Azure Key Phrases API: sometimes I get a 200 result, other times a 400 Bad Request. To test the service, I'm sending the contents of an Azure PDF about their NoSQL service.
The documentation says that each document may be up to 5k characters. To rule that out (I started off with 5k), I'm limiting each document to at most 1k characters.
How can I get more information on the cause of the failure? I've already checked the Portal, but there's not much detail there.
I am using this endpoint: https://eastus.api.cognitive.microsoft.com/text/analytics/v2.0/keyPhrases
Some sample failures:
{"documents":[{"language":"en","id":1,"text":"David Chappell Understanding NoSQL on Microsoft Azure Sponsored by Microsoft Corporation Copyright © 2014 Chappell & Associates"}]}
{"documents":[{"language":"en","id":1,"text":"3 Relational technology has been the dominant approach to working with data for decades. Typically accessed using Structured Query Language (SQL), relational databases are incredibly useful. And as their popularity suggests, they can be applied in many different situations. But relational technology isn’t always the best approach. Suppose you need to work with very large amounts of data, for example, too much to store on a single machine. Scaling relational technology to work effectively across many servers (physical or virtual) can be challenging. Or suppose your application works with data that’s not a natural fit for relational systems, such as JavaScript Object Notation (JSON) documents. Shoehorning the data into relational tables is possible, but a storage technology expressly designed to work with this kind of information might be simpler. NoSQL technologies have been created to address problems like these. As the name suggests, the label encompasses a variety of storage"}]}
Added my quick/dirty PoC code:
List<string> sendRequest(object data)
{
    string url = "https://eastus.api.cognitive.microsoft.com/text/analytics/v2.0/keyPhrases";
    string key = "api-code-here";
    string hdr = "Ocp-Apim-Subscription-Key";

    var wc = new WebClient();
    wc.Headers.Add(hdr, key);
    wc.Headers.Add(HttpRequestHeader.ContentType, "application/json");

    TextAnalyticsResult results = null;
    string json = JsonConvert.SerializeObject(data);
    try
    {
        var bytes = Encoding.Default.GetBytes(json);
        var d2 = wc.UploadData(url, bytes);
        var dataString = Encoding.Default.GetString(d2);
        results = JsonConvert.DeserializeObject<TextAnalyticsResult>(dataString);
    }
    catch (Exception ex)
    {
        var s = ex.Message;
    }
    System.Threading.Thread.Sleep(125);

    if (results != null && results.documents != null)
        return results.documents.SelectMany(x => x.keyPhrases).ToList();
    else
        return new List<string>();
}
Called by:
foreach (var k in vals)
{
    data.documents.Clear();
    int countSpaces = k.Count(Char.IsWhiteSpace);
    if (countSpaces > 3)
    {
        if (k.Length > maxLen)
        {
            var v = k;
            while (v.Length > maxLen)
            {
                var tmp = v.Substring(0, maxLen);
                var idx = tmp.LastIndexOf(" ");
                tmp = tmp.Substring(0, idx).Trim();
                data.documents.Add(new
                {
                    language = "en",
                    id = data.documents.Count() + 1,
                    text = tmp
                });
                v = v.Substring(idx + 1).Trim();
                phrases.AddRange(sendRequest(data));
                data.documents.Clear();
            }
            data.documents.Add(new
            {
                language = "en",
                id = data.documents.Count() + 1,
                text = v
            });
            phrases.AddRange(sendRequest(data));
            data.documents.Clear();
        }
        else
        {
            data.documents.Add(new
            {
                language = "en",
                id = 1,
                text = k
            });
            phrases.AddRange(sendRequest(data));
            data.documents.Clear();
        }
    }
}
I manually created some requests using the document samples that you indicated had errors, and they were processed by the service correctly and returned key phrases. So an encoding issue looks likely.
In the future, you can also look at the inner error returned by the service. Usually you'll see some more details, like in the response sample below.
{
    "code": "BadRequest",
    "message": "Invalid request",
    "innerError": {
        "code": "InvalidRequestContent",
        "message": "Request contains duplicated Ids. Make sure each document has a unique Id."
    }
}
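For example, with the WebClient call from your sendRequest method, reading the body of the WebException (rather than just ex.Message) exposes that inner error. A minimal sketch, reusing the question's wc, url, json and results variables in place of the existing try/catch:

try
{
    var bytes = Encoding.Default.GetBytes(json);
    var d2 = wc.UploadData(url, bytes);
    var dataString = Encoding.Default.GetString(d2);
    results = JsonConvert.DeserializeObject<TextAnalyticsResult>(dataString);
}
catch (WebException ex) when (ex.Response != null)
{
    // The 400 response body carries the code/message/innerError JSON shown above.
    using (var reader = new System.IO.StreamReader(ex.Response.GetResponseStream()))
    {
        var errorBody = reader.ReadToEnd();
        Console.WriteLine(errorBody);
    }
}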
Also, there is a .NET SDK for Text Analytics that can help simplify calling the service.
https://github.com/Azure/azure-rest-api-specs/tree/current/specification/cognitiveservices/data-plane/TextAnalytics
Try changing this line
var bytes = Encoding.Default.GetBytes(json);
to
var bytes = Encoding.UTF8.GetBytes(json);
Your sample documents contain non-ASCII characters (for example "©" and typographic apostrophes), which Encoding.Default can mangle into an invalid payload, while UTF-8 matches the application/json content type being sent.
I am working on an application which needs to fetch entities from one system (contacts from Dynamics CRM). The app does some processing on those contacts, basically converting them to a payload compatible with the other system, and then syncs them back.
The Dynamics CRM instance has over 8000 contacts, so, no-brainer, I went ahead with a pagination approach. The code looks something like this:
string nextPageUrl = string.Empty;
bool nextPageAvailable = true;
int totalContactsCount = 0;
bool firstCall = true;
JArray totalContacts = new JArray();

while (nextPageAvailable)
{
    string contacts = string.Empty;
    if (string.IsNullOrEmpty(nextPageUrl) && firstCall)
    {
        contacts = await _genericService.GetEntity("contacts", pageSize: 100);
        firstCall = false;
    }
    else
    {
        contacts = await _genericService.GetEntityFromPaginationUrl(nextPageUrl, pageSize: 100);
    }

    var contactsObj = JObject.Parse(contacts);
    var contactArr = (JArray)contactsObj["value"];
    var contactsLength = contactArr.Count;

    foreach (var contactObject in contactArr)
    {
        totalContacts.Add(contactObject);
    }
    var contactsCt = totalContacts.Count;

    nextPageUrl = contactsObj["@odata.nextLink"].IsNullOrEmpty() ? string.Empty : contactsObj["@odata.nextLink"].ToString();
    if (string.IsNullOrEmpty(nextPageUrl) && !firstCall)
    {
        nextPageAvailable = false;
    }
    Console.WriteLine(nextPageUrl);
}
This works perfectly, but by the time I have fetched all 8000 contacts (100 at a time), memory utilization is over 370 MB. I am sure there is a better way to achieve this, but I cannot think of one.
The memory problem will get more serious when I iterate over totalContacts, which will hold over 8000 contacts, and process them.
Since the contacts coming from Dynamics CRM may have newly added properties, I cannot strongly type the objects with models. This is why I decided to go with JArray. Is there a better alternative to achieve this?
Thanks in advance.
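One possible direction, shown only as a sketch that reuses the same _genericService calls as above: convert and sync each 100-contact page as soon as it is parsed instead of collecting everything into totalContacts. ProcessPage below is a hypothetical placeholder for that per-page convert-and-sync work.

string nextPageUrl = string.Empty;
bool firstCall = true;

while (true)
{
    string contacts = firstCall
        ? await _genericService.GetEntity("contacts", pageSize: 100)
        : await _genericService.GetEntityFromPaginationUrl(nextPageUrl, pageSize: 100);
    firstCall = false;

    var contactsObj = JObject.Parse(contacts);
    var contactArr = (JArray)contactsObj["value"];

    // Convert and sync this page now, then let it go out of scope
    // instead of keeping 8000+ JObjects alive at once.
    ProcessPage(contactArr); // hypothetical placeholder

    nextPageUrl = (string)contactsObj["@odata.nextLink"] ?? string.Empty;
    if (string.IsNullOrEmpty(nextPageUrl))
        break;
}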
In ServiceNow, I am able to get only a maximum of 250 records in a SOAP request. How can I get all the records?
Web Reference Url = https://*****.service-now.com/rm_story.do?WSDL
Code:
var url = "https://*****.service-now.com/rm_story.do?SOAP";
var userName = *****;
var password = *****;
var proxy = new ServiceNow_rm_story
{
Url = url,
Credentials = new NetworkCredential(userName, password)
};
try
{
var objRecord = new Namespace.WebReference.getRecords
{
// filters..
};
var recordResults = proxy.getRecords(objRecord);
}
catch (Exception ex)
{
}
In recordResults, I am getting only 250 records. How can I get all the records?
Also see this Stack Overflow answer, which provides more info:
Get ServiceNow Records Powershell - More than 250
Note that returning a large number of records can affect performance of the response, and it may be more efficient to process your query in batches using offsets (i.e., get 1-100, then 101-200, ...). This can be achieved by using a sort order and offset. The ServiceNow REST Table API actually returns link headers from GET requests, providing links for the first, next and last set of records, making it easy to know the URL to query for the next batch of records.
See: http://wiki.servicenow.com/index.php?title=Table_API#Methods
and look under 'Response Header'.
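Here is a minimal sketch of that batching approach against the REST Table API, assuming basic authentication and the standard sysparm_limit / sysparm_offset query parameters. The instance URL, credentials and page size are placeholders; the rm_story table name comes from the question.

using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using System.Threading.Tasks;
using Newtonsoft.Json.Linq;

class ServiceNowBatchFetch
{
    static async Task<JArray> FetchAllAsync()
    {
        const int pageSize = 100;
        var all = new JArray();
        using (var client = new HttpClient())
        {
            // Placeholder credentials; sent as a basic auth header.
            var token = Convert.ToBase64String(Encoding.ASCII.GetBytes("user:password"));
            client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Basic", token);

            for (int offset = 0; ; offset += pageSize)
            {
                var url = "https://your-instance.service-now.com/api/now/table/rm_story"
                        + "?sysparm_limit=" + pageSize + "&sysparm_offset=" + offset;
                var page = (JArray)JObject.Parse(await client.GetStringAsync(url))["result"];
                foreach (var record in page)
                    all.Add(record);
                if (page.Count < pageSize)
                    break; // a short page means the last batch has been read
            }
        }
        return all;
    }
}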
Have you tried to pass/override the __limit parameter?
Google / wiki / user manual / release notes are always helpful.
In your code snippet, on the line where it says // filters.. you should define __limit (and potentially __first_row and __last_row, as explained in the example below):
int Skip = 0;
int Take = 250;
while (true)
{
    using (var soapClient = new ServiceNowLocations.ServiceNow_cmn_location())
    {
        var cred = new System.Net.NetworkCredential(_user, _pass);
        soapClient.Credentials = cred;
        soapClient.Url = _apiUrl + "cmn_location.do?SOAP";

        var getParams = new ServiceNowLocations.getRecords()
        {
            __first_row = Skip.ToString(),
            __last_row = (Skip + Take).ToString(),
            __limit = Take.ToString()
        };
        var records = soapClient.getRecords(getParams);
        if (records != null)
        {
            if (records.Count() == 0)
            {
                break;
            }
            Skip += records.Count();
            if (records.Count() != Take)
            {
                // last batch or everything in first batch
                break;
            }
        }
        else
        {
            // service now web service endpoint not configured correctly
            break;
        }
    }
}
I made a library that makes interacting with the ServiceNow REST API much easier:
https://emersonbottero.github.io/ServiceNow.Core/
I am trying to get direction details between a source and a destination by calling the Google Maps Directions API (XML output) from a web service in C#. When I call the function below locally, it works fine, but when I deploy the code to the server, it does not return direction details for some locations. The local system is Win2K8 R2 and the web server is Win2K3. Here is my code:
public List<DirectionSteps> getDistance(string sourceLat, string sourceLong, string destLat, string destLong)
{
    var requestUrl = String.Format("http://maps.google.com/maps/api/directions/xml?origin=" + sourceLat + "," + sourceLong + "&destination=" + destLat + "," + destLong + "&sensor=false&units=metric");
    try
    {
        var client = new WebClient();
        var result = client.DownloadString(requestUrl);
        //return ParseDirectionResults(result);
        var directionStepsList = new List<DirectionSteps>();
        var xmlDoc = new System.Xml.XmlDocument { InnerXml = result };
        if (xmlDoc.HasChildNodes)
        {
            var directionsResponseNode = xmlDoc.SelectSingleNode("DirectionsResponse");
            if (directionsResponseNode != null)
            {
                var statusNode = directionsResponseNode.SelectSingleNode("status");
                if (statusNode != null && statusNode.InnerText.Equals("OK"))
                {
                    var legs = directionsResponseNode.SelectNodes("route/leg");
                    foreach (System.Xml.XmlNode leg in legs)
                    {
                        int stepCount = 1;
                        var stepNodes = leg.SelectNodes("step");
                        var steps = new List<DirectionStep>();
                        foreach (System.Xml.XmlNode stepNode in stepNodes)
                        {
                            var directionStep = new DirectionStep();
                            directionStep.Index = stepCount++;
                            directionStep.Distance = stepNode.SelectSingleNode("distance/text").InnerText;
                            directionStep.Duration = stepNode.SelectSingleNode("duration/text").InnerText;
                            directionStep.Description = Regex.Replace(stepNode.SelectSingleNode("html_instructions").InnerText, "<[^<]+?>", "");
                            steps.Add(directionStep);
                        }
                        var directionSteps = new DirectionSteps();
                        //directionSteps.OriginAddress = leg.SelectSingleNode("start_address").InnerText;
                        //directionSteps.DestinationAddress = leg.SelectSingleNode("end_address").InnerText;
                        directionSteps.TotalDistance = leg.SelectSingleNode("distance/text").InnerText;
                        directionSteps.TotalDuration = leg.SelectSingleNode("duration/text").InnerText;
                        directionSteps.Steps = steps;
                        directionStepsList.Add(directionSteps);
                    }
                }
            }
        }
        return directionStepsList;
    }
    catch (Exception ex)
    {
        throw ex;
    }
}
After reading many posts and the Google usage policy, it turns out that Google does not allow such automated queries from an FQDN or any public server. I was making around 15-20 direction requests, which were blocked after about 10. I had to change my logic: I implemented the same function on the mobile device and called the Google Maps Directions API from there, and it works perfectly. It seems that Google is not blocking such requests when they come from mobile devices, but you never know when they will change that.
You can use the Google Maps Directions API:
http://maps.googleapis.com/maps/api/directions/
Instructions here: https://developers.google.com/maps/documentation/directions/
Limits:
2,500 directions requests per day.
Google Maps API for Business customers have higher limits:
100,000 directions requests per day.
23 waypoints allowed in each request.
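For reference, a typical keyed request looks like the URL below (the coordinates and the key value are placeholders; origin, destination and units are the same parameters used in the question's XML request):

http://maps.googleapis.com/maps/api/directions/json?origin=52.5200,13.4050&destination=48.8566,2.3522&units=metric&key=YOUR_API_KEY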