Getting a list of all users via Valence - c#

I am trying to get a list of all users in our instance of Desire2Learn by looping through the paging bookmarks, but for some reason it loops continuously and never returns. When I debug it, it shows a massive number of users (far more than we actually have in the system, as shown by the User Management Tool). A portion of my code is here:
public async Task<List<UserData>> GetAllUsers(int pages = 0)
{
    //List<UserData> users = new List<UserData>();
    HashSet<UserData> users = new HashSet<UserData>();
    int pageCount = 0;
    bool getMorePages = true;
    var response = await Get<PagedResultSet<UserData>>("/d2l/api/lp/1.4/users/");
    var qParams = new Dictionary<string, string>();
    do
    {
        qParams["bookmark"] = response.PagingInfo.Bookmark;
        //users = users.Concat(response.Items).ToList<UserData>();
        users.UnionWith(response.Items);
        response = await Get<PagedResultSet<UserData>>("/d2l/api/lp/1.4/users/", qParams);
        if (pages != 0)
        {
            pageCount++;
            if (pageCount >= pages)
            {
                getMorePages = false;
            }
        }
    }
    while (response.PagingInfo.HasMoreItems && getMorePages);
    return users.ToList();
}
I was originally using the List container that is commented out, but switched to the HashSet to see whether duplicates were being added.
It's fairly simple, but for whatever reason it's not working. The Get<PagedResultSet<UserData>>() method simply wraps the HTTP request logic. We set the bookmark each time and send it on.
The User Management Tool indicates there are 39,695 users in the system. After running for just a couple of minutes and breaking on the UnionWith in the loop I'm showing that my set has 211,800 users.
What am I missing?

It appears that you’ve encountered a defect in this API. The next course of action is for you to have your institution’s Approved Support Contact open an Incident through the Desire2Learn Helpdesk. Please make mention in the Incident report that Sarah-Beth Bianchi is aware of the issue, and I will work with our Support team to direct this issue appropriately.
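Until that is resolved, a defensive guard in the paging loop can at least keep it from running away if the bookmark ever stops advancing. The sketch below reuses the question's Get<T> wrapper and PagedResultSet type and is only illustrative, not a fix for the underlying API defect:

public async Task<List<UserData>> GetAllUsersGuarded(int pages = 0)
{
    var users = new HashSet<UserData>();   // note: de-duplication relies on UserData value equality;
                                           // otherwise key a dictionary on the user id instead
    int pageCount = 0;
    string lastBookmark = null;

    var qParams = new Dictionary<string, string>();
    var response = await Get<PagedResultSet<UserData>>("/d2l/api/lp/1.4/users/");

    while (true)
    {
        users.UnionWith(response.Items);
        pageCount++;

        if (!response.PagingInfo.HasMoreItems) break;
        if (pages != 0 && pageCount >= pages) break;

        // If the bookmark has not advanced, paging is broken server-side:
        // stop instead of looping (and accumulating duplicates) forever.
        if (response.PagingInfo.Bookmark == lastBookmark) break;
        lastBookmark = response.PagingInfo.Bookmark;

        qParams["bookmark"] = lastBookmark;
        response = await Get<PagedResultSet<UserData>>("/d2l/api/lp/1.4/users/", qParams);
    }

    return users.ToList();
}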

Related

Loop API till null

I need some assistance creating a loop that runs until null is returned, for a piece of software I've written.
The software basically takes information from an API call and deserializes it into a readable format for our online service. The difficulty I am facing is that when I make an API call it only returns 100 employee records, when the client has far more than that.
foreach (var bankRecord in bankDetailDto.Value)
{
    var deducRecords = deductions.Where(d => d.Ee_Number == bankRecord.EmployeeNumber).ToList();
    if (deducRecords.Any())
    {
        foreach (var deducRecord in deducRecords)
        {
            deducRecord.Bank_Account_Number = bankRecord.BankAccountNo;
            deducRecord.Bank_Account_Type = bankRecord.AccountType;
        }
    }
}
This is just an example of the loop I've tried to create, but it does not seem to work. I am under the impression I need to create a class to perhaps run a loop on a background worker?
Apologies, I have not been developing for very long.
I guess the API call has pagination with a limit of 100 records per call. I think you need to use a while-loop and check whether the API response still returns any objects.
Since you didn't include your API call code, I guess it's something like this:
parameter.page = 1;
List<BankData> response = yourApi.yourApiAction(parameter);
while (response != null && response.Count > 0)
{
    // ... do your logic to process the data here ...

    // Increase the pagination number
    parameter.page += 1;

    // Call the API again to get the next page of data
    response = yourApi.yourApiAction(parameter);
}
This is just example code showing how it could be done.
Check your API documentation to see whether it uses pagination and how to advance the page.
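Putting the two pieces together, a paged fetch that accumulates every bank record before running the original matching loop might look roughly like this; yourApi.yourApiAction, parameter.page, BankData and the property names are placeholders carried over from the snippets above, not a real API:

// Accumulate every page of bank records before matching them to deductions.
var allBankRecords = new List<BankData>();
parameter.page = 1;

List<BankData> page = yourApi.yourApiAction(parameter);
while (page != null && page.Count > 0)
{
    allBankRecords.AddRange(page);
    parameter.page += 1;                     // move to the next page
    page = yourApi.yourApiAction(parameter); // fetch it
}

// Now run the original matching logic over the full set of records.
foreach (var bankRecord in allBankRecords)
{
    var deducRecords = deductions
        .Where(d => d.Ee_Number == bankRecord.EmployeeNumber)
        .ToList();

    foreach (var deducRecord in deducRecords)
    {
        deducRecord.Bank_Account_Number = bankRecord.BankAccountNo;
        deducRecord.Bank_Account_Type = bankRecord.AccountType;
    }
}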

Avoid fast post on webapi c#

I have a problem when the user posts data. Sometimes the post runs so fast that it causes a problem on my website.
The user wants to register a form for about $100 and has a $120 balance.
When the post (save) button is pressed, sometimes two posts arrive at the server almost simultaneously, like:
2018-01-31 19:34:43.660 Register Form 5760$
2018-01-31 19:34:43.663 Register Form 5760$
Therefore my client's balance becomes negative.
I use an if statement in my code to check the balance, but the code runs so fast that I think both checks happen at the same time and the overdraft slips through.
Therefore I made a LockControllers class to avoid concurrency per user, but it does not work well.
I made a global action filter to control the users; this is my code:
public void OnActionExecuting(ActionExecutingContext context)
{
    try
    {
        var controller = (Controller)context.Controller;
        if (controller.User.Identity.IsAuthenticated)
        {
            bool jobDone = false;
            int delay = 0;
            int counter = 0;
            do
            {
                delay = LockControllers.IsRequested(controller.User.Identity.Name);
                if (delay == 0)
                {
                    LockControllers.AddUser(controller.User.Identity.Name);
                    jobDone = true;
                }
                else
                {
                    counter++;
                    System.Threading.Thread.Sleep(delay);
                }
                if (counter >= 10000)
                {
                    context.HttpContext.Response.StatusCode = 400;
                    jobDone = true;
                    context.Result = new ContentResult()
                    {
                        Content = "Attack Detected"
                    };
                }
            } while (!jobDone);
        }
    }
    catch (System.Exception)
    {
    }
}

public void OnActionExecuted(ActionExecutedContext context)
{
    try
    {
        var controller = (Controller)context.Controller;
        if (controller.User.Identity.IsAuthenticated)
        {
            LockControllers.RemoveUser(controller.User.Identity.Name);
        }
    }
    catch (System.Exception)
    {
    }
}
I made a static list of users and sleep their threads until the previous request completes.
Is there any better way to manage this problem?
The original question has since been edited, so this answer may be invalid.
The issue isn't that the code runs too fast. Fast is always good :) The issue is that the account is going into negative funds. If the client decides to post a form twice, that is the client's fault. It may be that you only want the client to pay once, which is another problem.
For the first problem, I would recommend using transactions (https://en.wikipedia.org/wiki/Database_transaction) to lock your table. That means you apply the update/add (or set of changes) and force other calls to that table to wait until those operations are done. You can always begin your transaction and check that the account has the correct amount of funds.
If they are only ever meant to pay once, then have a separate table that records whether the user has paid (again within a transaction), before processing the update/add.
http://www.entityframeworktutorial.net/entityframework6/transaction-in-entity-framework.aspx
(Edit: fixing link)
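As a rough illustration of that transaction idea with EF6 (the version the linked tutorial covers), here is a sketch; the ShopContext, Accounts and Registrations names are made up for this example, not taken from the question:

// Minimal sketch only: context and entity names are hypothetical.
using (var db = new ShopContext())
using (var tx = db.Database.BeginTransaction(System.Data.IsolationLevel.Serializable))
{
    var account = db.Accounts.Single(a => a.UserId == userId);

    // The balance check happens inside the transaction; with Serializable
    // isolation two overlapping requests cannot both read the old balance
    // and both succeed: one of them will block or be rolled back.
    if (account.Balance < amount)
        throw new InvalidOperationException("Insufficient funds.");

    account.Balance -= amount;
    db.Registrations.Add(new Registration { UserId = userId, Amount = amount });

    db.SaveChanges();
    tx.Commit();
}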
You have a few options here:
You implement ETag functionality in your app, which you can use for optimistic concurrency. This works well when you are working with records, i.e. you have a data record in the database, return it to the user, and then the user changes it.
You could add a required field with a GUID to your view model, pass it to your app, add it to an in-memory cache, and check it on each request.
public class RegisterViewModel
{
    [Required]
    public Guid Id { get; set; }

    /* other properties here */
    ...
}
and then use IMemoryCache or IDistributedCache (see the ASP.NET Core docs) to put this Id into the memory cache and validate it on each request:
public IActionResult Register(RegisterViewModel register)
{
    if (!ModelState.IsValid)
        return BadRequest(ModelState);

    var userId = ...; /* get userId */

    if (_cache.TryGetValue($"Registration-{userId}", out _))
    {
        return BadRequest(new { ErrorMessage = "Command already received by this user" });
    }

    // Set cache options.
    var cacheEntryOptions = new MemoryCacheEntryOptions()
        // Keep in cache for 5 minutes, reset time if accessed.
        .SetSlidingExpiration(TimeSpan.FromMinutes(5));

    // When we're here, the command wasn't executed before, so we save the key in the cache
    _cache.Set($"Registration-{userId}", register.Id, cacheEntryOptions);

    // call your service here to process it
    registrationService.Register(...);

    return Ok();
}
When the second request arrives, the value will already be in the (distributed) memory cache and the operation will fail.
If the caller does not set the Id, validation will fail.
Of course, everything that Jonathan Hickey listed in his answer below still applies: you should always validate that there is enough balance and use EF Core's optimistic or pessimistic concurrency.
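For the balance check itself, the EF Core optimistic concurrency mentioned above usually amounts to a concurrency token on the entity plus handling the conflict exception. A minimal sketch, with an assumed Account entity and injected _db context (none of these names come from the question):

using System.ComponentModel.DataAnnotations; // [Timestamp]
using Microsoft.EntityFrameworkCore;         // DbUpdateConcurrencyException

public class Account
{
    public int Id { get; set; }
    public decimal Balance { get; set; }

    [Timestamp]                              // rowversion column used as the concurrency token
    public byte[] RowVersion { get; set; }
}

// Inside the controller action, after the duplicate-request check:
try
{
    var account = await _db.Accounts.SingleAsync(a => a.Id == accountId);
    if (account.Balance < amount)
        return BadRequest(new { ErrorMessage = "Insufficient balance" });

    account.Balance -= amount;
    await _db.SaveChangesAsync();            // throws if another request changed the row first
}
catch (DbUpdateConcurrencyException)
{
    return Conflict(new { ErrorMessage = "The account was modified by another request, please retry" });
}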

NetSuite SuiteTalk: SavedSearch for "Deleted Record" Type

How does one get the results of a "Saved Search" of type "Deleted Record" in NetSuite? Other search types are obvious (CustomerSearchAdvanced, ItemSearchAdvanced, etc...), but this one seems to have no reference online, just documentation around deleting records, not running saved searches on them.
Update 1
I should clarify a little bit more what I'm trying to do. In NetSuite you can run (and save) Saved Searches on the record type "Deleted Record". I believe you are able to access at least 5 columns (excluding user-defined ones) through this process from the web interface:
Date Deleted
Deleted By
Context
Record Type
Name
You are also able to set up search criteria as part of the "Saved Search". I would like to access a series of these Saved Searches already present in my system, utilizing their already-configured search criteria and retrieving data from all 5 of their displayed columns.
The Deleted Record record isn't supported in SuiteTalk as of version 2016_2 which means you can't run a Saved Search and pull down the results.
This is not uncommon when integrating with NetSuite. :(
What I've always done in these situations is create a RESTlet (NetSuite's wannabe RESTful API framework) SuiteScript that will run the search (or do whatever is possible with SuiteScript and not possible with SuiteTalk) and return the results.
From the documentation:
You can deploy server-side scripts that interact with NetSuite data following RESTful principles. RESTlets extend the SuiteScript API to allow custom integrations with NetSuite. Some benefits of using RESTlets include the ability to:

Find opportunities to enhance usability and performance, by implementing a RESTful integration that is more lightweight and flexible than SOAP-based web services.
Support stateless communication between client and server.
Control client and server implementation.
Use built-in authentication based on token or user credentials in the HTTP header.
Develop mobile clients on platforms such as iPhone and Android.
Integrate external Web-based applications such as Gmail or Google Apps.
Create backends for Suitelet-based user interfaces.

RESTlets offer ease of adoption for developers familiar with SuiteScript and support more behaviors than NetSuite's SOAP-based web services, which are limited to those defined as SuiteTalk operations. RESTlets are also more secure than Suitelets, which are made available to users without login. For a more detailed comparison, see RESTlets vs. Other NetSuite Integration Options.
In your case this would be a nearly trivial script to create; it would gather the results and return them JSON-encoded (easiest) or in whatever format you need.
You will likely spend more time getting the Token Based Authentication (TBA) working than you will writing the script.
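For reference, calling such a RESTlet from C# is just an HTTP request once the TBA header is built. The sketch below assumes a deployed RESTlet URL (the account host and script/deploy ids are placeholders) and a hypothetical BuildTbaAuthorizationHeader helper that does the OAuth 1.0 signing, which is not shown:

// Sketch only: URL and BuildTbaAuthorizationHeader() are placeholders.
private static readonly HttpClient client = new HttpClient();

public async Task<string> GetDeletedRecordsJsonAsync()
{
    var url = "https://<account>.restlets.api.netsuite.com/app/site/hosting/restlet.nl?script=123&deploy=1";

    var request = new HttpRequestMessage(HttpMethod.Get, url);

    // Token Based Authentication: the OAuth 1.0 header must be signed per request.
    request.Headers.TryAddWithoutValidation("Authorization", BuildTbaAuthorizationHeader(url, "GET"));

    var response = await client.SendAsync(request);
    response.EnsureSuccessStatusCode();

    // The RESTlet decides the payload format; JSON is the easiest to consume from C#.
    return await response.Content.ReadAsStringAsync();
}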
[Update] Adding some code samples related to what I mentioned in the comments below:
Note that the SuiteTalk proxy object model is frustrating in that it lacks the inheritance it could make such good use of. So you end up with code like your SafeTypeCastName(). Reflection is one of the best tools in my toolbox when it comes to working with SuiteTalk proxies. For example, all *RecordRef types have common fields/props, so reflection saves you type checking all over the place to work with the object you suspect you have.
public static TType GetProperty<TType>(object record, string propertyID)
{
    PropertyInfo pi = record.GetType().GetProperty(propertyID);
    return (TType)pi.GetValue(record, null);
}

public static string GetInternalID(Record record)
{
    return GetProperty<string>(record, "internalId");
}

public static string GetInternalID(BaseRef recordRef)
{
    PropertyInfo pi = recordRef.GetType().GetProperty("internalId");
    return (string)pi.GetValue(recordRef, null);
}

public static CustomFieldRef[] GetCustomFieldList(Record record)
{
    return GetProperty<CustomFieldRef[]>(record, CustomFieldPropertyName);
}
Credit to #SteveK for both his revised and final answer. I think long term I'm going to have to implement what is suggested; short term I tried implementing his first solution ("getDeleted") and I'd like to add some more detail in case anyone needs to use this method in the future:
//private NetSuiteService nsService = new DataCenterAwareNetSuiteService("login");
//private TokenPassport createTokenPassport() { ... }

private IEnumerable<DeletedRecord> DeletedRecordSearch()
{
    List<DeletedRecord> results = new List<DeletedRecord>();
    int totalPages = Int32.MaxValue;
    int currentPage = 1;

    while (currentPage <= totalPages)
    {
        //You may need to reauthenticate here
        nsService.tokenPassport = createTokenPassport();

        var queryResults = nsService.getDeleted(new GetDeletedFilter
        {
            //Add any filters here...
            //Example
            /*
            deletedDate = new SearchDateField()
            {
                @operator = SearchDateFieldOperator.after,
                operatorSpecified = true,
                searchValue = DateTime.Now.AddDays(-49),
                searchValueSpecified = true,
                predefinedSearchValueSpecified = false,
                searchValue2Specified = false
            }
            */
        }, currentPage);

        currentPage++;
        totalPages = queryResults.totalPages;
        results.AddRange(queryResults.deletedRecordList);
    }

    return results;
}
private Tuple<string, string> SafeTypeCastName(
    Dictionary<string, string> customList,
    BaseRef input)
{
    if (input.GetType() == typeof(RecordRef))
    {
        return new Tuple<string, string>(((RecordRef)input).name,
            ((RecordRef)input).type.ToString());
    }
    //Not sure why "Last Sales Activity Record" doesn't return a type...
    else if (input.GetType() == typeof(CustomRecordRef))
    {
        return new Tuple<string, string>(((CustomRecordRef)input).name,
            customList.ContainsKey(((CustomRecordRef)input).internalId) ?
                customList[((CustomRecordRef)input).internalId] :
                "Last Sales Activity Record");
    }
    else
    {
        return new Tuple<string, string>("", "");
    }
}
public Dictionary<string, string> GetListCustomTypeName()
{
    //You may need to reauthenticate here
    nsService.tokenPassport = createTokenPassport();

    return nsService.search(new CustomListSearch())
        .recordList.Select(a => (CustomList)a)
        .ToDictionary(a => a.internalId, a => a.name);
}

//Main code starts here
var results = DeletedRecordSearch();
var customList = GetListCustomTypeName();

var demoResults = results.Select(a => new
{
    DeletedDate = a.deletedDate,
    Type = SafeTypeCastName(customList, a.record).Item2,
    Name = SafeTypeCastName(customList, a.record).Item1
}).ToList();
I have to apply all the filters API side, and this only returns three columns:
Date Deleted
Record Type (not formatted in the same way as the Web UI)
Name

How to separate parallel requests?

I'll try to explain the issue with a simplified console application example; however, the real project is an ASP.NET MVC3 application.
Given the following tables (TestReport and TestReportRef), imagine the following scenario:
user creates a report (a line in TestReport, where Text is the report string content, and Ready is a bool flag saying whether the report is ready to be processed); by default Ready is set to false, i.e. not ready.
user wants the report to be processed, so he submits it; Ready is set to true here.
The system gives an opportunity to recall the report, if it has not been processed yet. So, when the report is recalled, Ready is set back to false. Conversely, when the report is processed, a line in TestReportRef, referencing the report by its Id, is created.
Now imagine that at one and the same moment
user wants to recall the report;
the report is added to the process list;
Since these can happen simultaneously, errors may occur: the report ends up with Ready == false and yet is referenced in TestReportRef.
Here is a simple console example of how this may happen:
var cs = "my connection string";
var dc = new TestDataContext(cs);

dc.TestReport.InsertOnSubmit(new TestReport
{
    Text = "My report content",
    Ready = true //ready at once
});
dc.SubmitChanges();
Action recallReport = () =>
{
    var _dc = new TestDataContext(cs);
    var report = _dc.TestReport.FirstOrDefault(t => t.Ready);
    if (report != null && !report.TestReportRef.Any())
    {
        Thread.Sleep(1000);
        report.Ready = false;
        _dc.SubmitChanges();
    }
};

Action acceptReport = () =>
{
    var _dc = new TestDataContext(cs);
    var report = _dc.TestReport.FirstOrDefault(t => t.Ready);
    if (report != null && !report.TestReportRef.Any())
    {
        Thread.Sleep(1000);
        _dc.TestReportRef.InsertOnSubmit(new TestReportRef
        {
            FK_ReportId = report.Id
        });
        _dc.SubmitChanges();
    }
};

var task1 = new Task(recallReport);
var task2 = new Task(acceptReport);
task1.Start();
task2.Start();
task1.Wait();
task2.Wait();

foreach (var t in dc.TestReport)
{
    Console.WriteLine(string.Format("{0}\t{1}\t{2}", t.Id, t.Text, t.Ready));
}

foreach (var t in dc.TestReportRef)
{
    Console.WriteLine("ref id:\t" + t.FK_ReportId);
}
Thread.Sleep(1000) is added to ensure that both tasks observe one and the same situation.
The given example may sound awkward; however, I hope it explains the issue I'm dealing with.
How can I avoid this? Making the repository a singleton doesn't seem to be a good idea. Should I use some shared mutex (one for all web requests) to separate write operations only?
Or is there a pattern I should use in this kind of scenario?
This is only a simplified example of one of the scenarios I have. However, there are several scenarios in which it may run into a similar discrepancy. The best thing would be to make this kind of intersection impossible, I guess.
Why not add a version column to the Report table? A task starts by tracking the current version; when the task ends, if the version is still the same as the tracked one, the operation is OK, otherwise it fails. If the operation is OK, update the version to version + 1. This is a sort of optimistic lock; it implicitly assumes that conflicts may occur, but that they are not very frequent.
UPDATE
If you are using LINQ to SQL, maybe you can have a look at the UpdateCheck parameter, [Column(UpdateCheck = UpdateCheck.Always)].
This can be useful to handle concurrency in your case.
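To make the version-column idea concrete, here is a minimal LINQ to SQL sketch; the Version property is an assumed addition to the TestReport mapping, not something in the question's schema:

// Assumed extra column on TestReport: a ROWVERSION the database bumps on every update.
// LINQ to SQL then includes it in the WHERE clause of the generated UPDATE.
[Column(IsVersion = true, IsDbGenerated = true)]
public System.Data.Linq.Binary Version { get; set; }

// Both recallReport and acceptReport then become "the later writer detects the conflict":
try
{
    var report = _dc.TestReport.FirstOrDefault(t => t.Ready);
    if (report != null && !report.TestReportRef.Any())
    {
        report.Ready = false;   // the accept path should also update the report row
                                // (e.g. a Processed flag) so the version check fires for it too
        _dc.SubmitChanges();    // throws ChangeConflictException if the row changed since it was read
    }
}
catch (System.Data.Linq.ChangeConflictException)
{
    // Someone else recalled or accepted the report first; reload and decide what to do.
}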

Implementing queue in c#

I am developing a C# application in which the server gets requests from many clients at a time. Each client also gets its data from a different database. In this situation data leakage sometimes happens, meaning clients get data from an incorrect database. Say for example client1 should get data from db1 and client2 from db2; instead they get data from the opposite databases: client1 gets data from db2 and client2 gets data from db1.
I am adding the code below, where the data is collected.
public string List()
{
    Response.ContentType = ContentType.Xml;
    try
    {
        ThingzFilter filter = null;
        Dictionary<string, string> parameters = new Dictionary<string, string>();

        if (Id != "")
        {
            // get parameters from http request
            foreach (HttpInputItem param in Request.Param)
                parameters.Add(param.Name, param.Value);

            setServerURLs();
            //Request.Clear();

            if (Request.QueryString["lang"].Value != null)
            {
                ThingzDB.TzThing.get_language = Request.QueryString["lang"].Value.ToString();
            }
            else
            {
                ThingzDB.TzThing.get_language = SessionDatabase.DefaultLanguage;
            }
        }

        ThingzDatabase db = SessionDatabase;
        langStr = db.Language;

        // this is run if there was no ID supplied
        // which means we want all items of all types
        if (Id == "")
        {
            if (Request.AcceptTypes == null)
            {
                //TypeController.session_id = Request.QueryString["sessionid"].Value;
                jobs.Add(Request.QueryString["sessionid"].Value);
                if (nextJobPos > jobs.Count - 1)
                    return "";
                else
                {
                    TypeController.session_id = jobs[nextJobPos];
                    nextJobPos++;
                    langStr = SessionDatabase.Language;
                }
                filter = new AllThingzFilter(SessionDatabase, parameters, langStr);
                TypeController.session_id = "";
                filter.Execute();
            }
In this setup the server is a console application and the clients are Windows applications, where the site names (i.e. the database names) are specified.
Please give me a solution to overcome this issue.
Without knowing precisely how SessionDatabase is scoped (from the name it seems to be a session variable) or whether its implementation is a property that does some kind of complex logic, I would guess you have two problems:
Storing the value at the wrong scope with multiple clients accessing it
Using db and SessionDatabase interchangeably in your code.
For the latter, I would suggest assigning db = SessionDatabase once at the top of the method (making sure that SessionDatabase is the right thing for that client), and then using db for the rest of the method.
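A rough sketch of that shape, reusing the names from the question (the trimmed-down body is illustrative only):

public string List()
{
    Response.ContentType = ContentType.Xml;

    // Resolve the per-client database exactly once for this request...
    ThingzDatabase db = SessionDatabase;
    string langStr = db.Language;

    var parameters = new Dictionary<string, string>();
    foreach (HttpInputItem param in Request.Param)
        parameters.Add(param.Name, param.Value);

    if (Id == "")
    {
        // ...and pass it down instead of re-reading SessionDatabase or going
        // through shared static state such as TypeController.session_id.
        var filter = new AllThingzFilter(db, parameters, langStr);
        filter.Execute();
    }

    // ... rest of the original method, always using `db` ...
    return "";
}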
