I want to create multiple User Stories using the .NET REST API, but I only know how to create them one by one.
I'm also getting very slow responses, averaging about 2 seconds per creation.
Can anybody help me?
Code that I'm using:
foreach (RCStoryRecord RecStory in ssStoryList)
{
    // Build the JSON payload for this story (field assignments omitted here).
    StoryObject = new DynamicJsonObject();
    RecStoryResult = new RCStoryResultRecord();
    // One synchronous HTTP round trip per story; this is where the ~2 seconds each add up.
    CreateResult creationResult = restApi.Create("HierarchicalRequirement", StoryObject);
}
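As far as I can tell, the toolkit only exposes single-object Create calls, so one common workaround is to overlap the HTTP round trips instead of issuing them sequentially. The sketch below (using Parallel.ForEach from System.Threading.Tasks) is only that, a sketch: it assumes restApi.Create is safe to call concurrently (if it is not, use one RallyRestApi instance per worker), and the "Name" mapping from RecStory is a hypothetical placeholder.

// Sketch only: overlap the per-story HTTP round trips with Parallel.ForEach.
// Assumes restApi.Create is thread-safe; "Name" is a hypothetical field mapping.
Parallel.ForEach(ssStoryList, new ParallelOptions { MaxDegreeOfParallelism = 4 },
    RecStory =>
    {
        var storyObject = new DynamicJsonObject();
        storyObject["Name"] = RecStory.Name; // hypothetical mapping from the record
        CreateResult result = restApi.Create("HierarchicalRequirement", storyObject);
        if (!result.Success)
        {
            // Surface per-story errors instead of silently dropping them.
            Console.WriteLine(string.Join("; ", result.Errors));
        }
    });

This does not reduce the per-request latency, but it cuts the wall-clock time roughly by the degree of parallelism, subject to whatever rate limits the server imposes.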
I need some help with the batchUpdate function in the Google Sheets API. Currently in my project I do some processing and call my CreateEntry function to add a message to a specific cell on the sheet using the append function, as shown below. I realised that every call adds to my quota of API requests, and once the quota is hit the Google API stops responding and so does my program.
As a workaround I thought I would store my data in a 2D array and update it all in one shot at the end using batchUpdate, to avoid exceeding the number of user requests per 100 seconds per user set by the Google API limits.
My questions are:
1. Is my logic correct in using batchUpdate to solve the request-quota problem (keeping in mind I might later be adding thousands of data points across 6 columns)?
2. If yes, how can I use batchUpdate to insert various data points in a range of cells?
PS: I have already tried putting the thread to sleep for 20 seconds after every 20 or so inserts, but that does not solve the problem, as it is unpredictable: sometimes I can get up to 400 cells of data in, and sometimes it stops at just 70.
The code I am currently using is:
public void CreateEntry(string col, int ctw, string msg)
{
    // Target a single cell, e.g. "Sheet1!B7"; each call is its own API request.
    var range = $"{sheet}!{col}{ctw}";
    var valueRange = new ValueRange();
    var oblist = new List<object> { msg };
    valueRange.Values = new List<IList<object>> { oblist };
    var appendRequest = service.Spreadsheets.Values.Append(valueRange, SpreadsheetId, range);
    appendRequest.ValueInputOption = SpreadsheetsResource.ValuesResource.AppendRequest.ValueInputOptionEnum.USERENTERED;
    var appendResponse = appendRequest.Execute();
}
As you can see from my code above, right now I am writing each cell individually. I want to add everything in one shot instead, to avoid exceeding Google's user-requests-per-100-seconds limit.
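If the buffering logic holds, the flush could look something like the sketch below, which writes the whole 2D block in a single call via Spreadsheets.Values.BatchUpdate. It assumes the same service, SpreadsheetId, and sheet members as CreateEntry above; FlushEntries and startCell are hypothetical names.

// Hypothetical buffered counterpart to CreateEntry: rows collected in a 2D
// list during processing are written with one API request at the end.
public void FlushEntries(IList<IList<object>> rows, string startCell)
{
    // One range anchored at startCell (e.g. "A2"); the values expand down
    // and to the right to fit the supplied block.
    var valueRange = new ValueRange
    {
        Range = $"{sheet}!{startCell}",
        Values = rows
    };

    var body = new BatchUpdateValuesRequest
    {
        ValueInputOption = "USER_ENTERED",
        Data = new List<ValueRange> { valueRange }
    };

    // A single HTTP call, no matter how many cells `rows` contains.
    service.Spreadsheets.Values.BatchUpdate(body, SpreadsheetId).Execute();
}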
I'm currently trying to prevent lag on my Discord server using a bot that checks the latency and switches to another region when lag occurs. I've seen a couple of bots written in JS doing that; I'd like to know if there's something in Discord.NET able to do it. Of course, I've been scouring Google for hours to find a way.
Thanks
You can call ModifyAsync on an instance of your SocketGuild and set the RegionId property to your desired new region.
// I'm doing this as a command, but the logic can be implemented wherever you decide.
[Command("region")]
[Remarks("Let's pretend my module has access to a singleton instance of Random")]
public async Task ChangeRegion()
{
    // Get a collection of all available voice regions,
    // excluding the region we are currently in.
    var voiceRegions = Context.Client.VoiceRegions
        .Where(r => !r.Id.Equals(Context.Guild.VoiceRegionId))
        .ToList();
    // Select a random new region.
    var newRegion = voiceRegions[random.Next(voiceRegions.Count)];
    // Update the RegionId property for the guild.
    await Context.Guild.ModifyAsync(prop => prop.RegionId = newRegion.Id);
    await ReplyAsync($"Region updated to {newRegion.Name}: {newRegion.Id}");
}
I'm having a very hard time with what I feel should be a simple task. Every week, our team queries VMware vCenter for three pieces of output: VM counts in three different locations. Here is what it looks like:
Name Value
---- -----
locationA 1433
locationB 278
locationC 23
The information is emailed to our team, as well as some of the higher-ups who like to see the data. This is all automated with a PowerShell script and Windows Task Scheduler running on a server, no problems.
That data is also placed in a Google Sheet. We just append a new row with the date and copy and paste the data into the three existing columns. It takes 30 seconds, once a week. It seems silly given how little time the copying takes, but I really want to automate that last step using the Google Sheets API.
I seem to keep finding and pursuing what feel like online wild goose chases through Google's scripting options for accessing and editing Google Sheets. I've downloaded and installed the Sheets API libraries, the Drive API libraries, and the Google .NET library, set up the Google developer site, and worked through the Google Sheets API documentation and OAuth authentication. I'm using Visual Studio 2013 because I figured that would play best with PowerShell and calling the .NET commands.
I have pretty much no coding experience outside of PowerShell (if you can call that coding). I can't even figure out how to pull the Google Sheet, much less do anything to it. Nothing I've tried is working so far, and given what little time it takes to copy this info manually every week, I've already spent far more time than it is probably worth. Still, I feel like if I can get a handle on this, it would open the door to further Google automation in the future, since we operate on a Google domain. At any rate, help is very much appreciated.
Here is my latest scripting attempt in Visual Studio:
using System;
using Google.GData.Client;
using Google.GData.Spreadsheets;

namespace MySpreadsheetIntegration
{
    class Program
    {
        static void Main(string[] args)
        {
            string CLIENT_ID = "abunchofcharacters.apps.googleusercontent.com";
            string CLIENT_SECRET = "secretnumber";
            string REDIRECT_URI = "https://code.google.com/apis/console";
            string SCOPE = "https://spreadsheets.google.com/feeds";

            OAuth2Parameters parameters = new OAuth2Parameters();
            parameters.ClientId = CLIENT_ID;
            parameters.ClientSecret = CLIENT_SECRET;
            parameters.RedirectUri = REDIRECT_URI;
            parameters.Scope = SCOPE;

            string authorizationUrl = OAuthUtil.CreateOAuth2AuthorizationUrl(parameters);
            Console.WriteLine(authorizationUrl);
            Console.WriteLine("Please visit the URL above to authorize your OAuth "
                + "request token. Once that is complete, type in your access code to "
                + "continue...");
            parameters.AccessCode = Console.ReadLine();

            OAuthUtil.GetAccessToken(parameters);
            string accessToken = parameters.AccessToken;
            Console.WriteLine("OAuth Access Token: " + accessToken);

            GOAuth2RequestFactory requestFactory =
                new GOAuth2RequestFactory(null, "MySpreadsheetIntegration-v1", parameters);
            SpreadsheetsService service = new SpreadsheetsService("MySpreadsheetIntegration-v1");
            service.RequestFactory = requestFactory;

            var driveService = new DriveService(auth);
            var file = new File();
            file.Title = "VSI - VM Totals by Service TEST";
            file.Description = string.Format("Created via {0} at {1}", ApplicationName, DateTime.Now.ToString());
            file.MimeType = "application/vnd.google-apps.spreadsheet";
            var request = driveService.Files.Insert(file);
            var result = request.Fetch();

            var spreadsheetLink = "https://docs.google.com/spreadsheets/d/GoogleDoc_ID";
            Console.WriteLine("Created at " + spreadsheetLink);
        }
    }
}
For anyone still following this, I found a solution. I was going about this entirely the wrong way (or at least in a way I couldn't comprehend). The solution to my issue was to create a Google Apps Script that accesses my email once a week (after we get the report), teases out everything but the data I'm looking for, and sends it to the Google spreadsheet.
Here's the script:
function SendtoSheet() {
  // Grab the most recent matching report email; the subject never changes.
  var thread = GmailApp.search("from:THESENDER in:anywhere subject:THESUBJECTOFTHEEMAILWHICHNEVERCHANGES")[0];
  var message = thread.getMessages().pop();
  var bodytext = message.getBody();
  // Split the HTML body on line breaks, then on colons, keeping the numeric parts.
  var counts = [];
  var lines = bodytext.split('<br>');
  for (var i = 0; i < lines.length; i++) {
    var parts = lines[i].split(':');
    if (parts.length > 1 && !isNaN(parts[1])) {
      counts.push(parts[1]);
    }
  }
  // Prepend the date, then append everything as a single new row.
  var now = new Date();
  counts.unshift(Utilities.formatDate(now, 'EST', 'MM/dd/yyyy'));
  var spreadsheet = SpreadsheetApp.openById("GoogleDocID");
  var sheet = spreadsheet.setActiveSheet(spreadsheet.getSheetByName("Data by Week"));
  sheet.appendRow(counts);
}
That counts array contains the magic that extracts the numeric data, by breaking the body up on line breaks and colons. It works perfectly. It didn't involve figuring out how to use Visual Studio, or the .NET Google libraries, or editing the running PowerShell script. Clean and easy.
Hope this helps someone.
I have been working on an application that sends DOPU (drop-off/pick-up) requests for CCD documents via HealthVault. Creating the DOPU requests and getting the corresponding token generated by HealthVault both work fine.
There are two SDK methods I am using to get Meaningful Use report data right now:
OfflineWebApplicationConnection.GetMeaningfulUseTimelyAccessDOPUDocumentReport gets me all the DOPU requests sent. This works fine; it always gives me the correct DOPU requests (with date/time stamp, token, and application ID).
The other is the OfflineWebApplicationConnection.GetMeaningfulUseVDTReport method. This is the one causing problems. No matter what date range I set (a week, a month, DateTime.MinValue to DateTime.MaxValue), I always get no results. No matter how many times I go into my HV account to view and download my connected DOPU documents, that SDK method still gives me no results.
I have also tried using the CCD extension XML when sending a CCD, to specifically set the patient-id and entry-date. Again, this doesn't affect my report results.
Does anyone with more experience than I have with the Meaningful Use methods in the SDK have any suggestions as to why I get nothing at all, ever, from the OfflineWebApplicationConnection.GetMeaningfulUseVDTReport call?
Here is some sample code that I am using to run the reports (some of the commented-out lines are just me trying different date ranges). I can also post snippets of the code that sends the DOPU requests, even though that all seems to be behaving as expected.
class Program
{
static void Main(string[] args)
{
var applicationId = ConfigurationManager.AppSettings["ApplicationId"];
var url = ConfigurationManager.AppSettings["HealthServiceUrl"];
var connection = new OfflineWebApplicationConnection(new Guid(applicationId), url, Guid.Empty/* offlinePersonId */);
Console.WriteLine("\nGetMeaningfulUseTimelyAccessDOPUDocumentReport");
//var receipts = connection.GetMeaningfulUseTimelyAccessDOPUDocumentReport(new DateRange(new DateTime(2014, 11, 19), new DateTime(2014, 12, 19)));
var receipts = connection.GetMeaningfulUseTimelyAccessDOPUDocumentReport(new DateRange(DateTime.MinValue, DateTime.MaxValue));
//var receipts = connection.GetMeaningfulUseTimelyAccessDOPUDocumentReport(new DateRange(DateTime.UtcNow.AddMonths(-12), DateTime.UtcNow));
foreach (var receipt in receipts)
{
Console.WriteLine(string.Format("{0} - {1} - {2}", receipt.AvailableDate, receipt.PackageId, receipt.Source));
}
Console.WriteLine("\nGetMeaningfulUseVDTReport");
//var activities = connection.GetMeaningfulUseVDTReport(new DateRange(new DateTime(2000, 12, 3), new DateTime(2014, 12, 10)));
//var activities = connection.GetMeaningfulUseVDTReport(new DateRange(DateTime.MinValue, DateTime.MaxValue));
var activities = connection.GetMeaningfulUseVDTReport(new DateRange(DateTime.UtcNow.AddMonths(-12), DateTime.UtcNow.AddDays(1)));
foreach (var activity in activities)
{
Console.WriteLine(activity.PatientId);
}
Console.ReadLine();
}
}
Update 1
Tried the sample Meaningful Use web application that MS had on CodePlex, using it with our application ID/credentials. Well, it worked. Not sure what is different, at least so far.
Update 2
So I have tried many other real CCDs (in our PPE environment, immediately deleting them when done) as well as test CCDs. I even set up the ConnectPackage in my app to behave the same as the test application from MS. No matter what I send, I get no Meaningful Use VDT data for the CCDs. The test CCD in the MS test application, however, works.
Update 3
Tried sending CCDs through the MS test application. Again, it sends, and I can connect to an HV account with no problem, but I get no VDT data no matter the date range used. Maybe there is an issue with our CCDs?
I find myself staring at a wall right now. I've started working with eConnect to communicate with Dynamics GP in order to access information.
I've come across a question that I have yet to see answered, and I'm tired of searching all over the web and through the pile of documents I have. In case someone reads this, I'll list a few sources after my question so you can guide yourself even if this post doesn't help you.
My question is: how can I create a new PMClassMaster record through C#? In the end it's an XML file that you need to generate, but I wonder if there is a method that does that for me. For example, to create a new vendor you can do the following:
PMVendorMasterType vendorMasterType = new PMVendorMasterType();
vendorMasterType.eConnectProcessInfo = new eConnectProcessInfo();
vendorMasterType.eConnectProcessInfo.ConnectionString = dynamicGPcs;
vendorMasterType.taUpdateCreateVendorRcd = new taUpdateCreateVendorRcd();
vendorMasterType.taUpdateCreateVendorRcd.VENDORID = vendorGP.VENDORID;
vendorMasterType.taUpdateCreateVendorRcd.VENDNAME = vendorGP.VENDNAME;
vendorMasterType.taUpdateCreateVendorRcd.VENDSHNM = vendorGP.VENDSHNM;
//... etc...
PMVendorMasterType[] vendors = { vendorMasterType };
eConnect.PMVendorMasterType = vendors;
This will pretty much create the XML for you, because that's what GP receives through eConnect's CreateEntity and UpdateEntity methods.
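To make that flow concrete, here is a minimal sketch of the usual serialize-and-submit step, assuming the standard eConnect pattern (eConnectType, XmlSerializer, eConnectMethods.CreateEntity) and the eConnect and dynamicGPcs names from the snippet above:

using System.IO;
using System.Xml.Serialization;
using Microsoft.Dynamics.GP.eConnect;
using Microsoft.Dynamics.GP.eConnect.Serialization;

public static void SubmitToGP(eConnectType eConnect, string dynamicGPcs)
{
    // Serialize the eConnectType graph (with PMVendorMasterType set as above) to XML.
    string xml;
    var serializer = new XmlSerializer(typeof(eConnectType));
    using (var writer = new StringWriter())
    {
        serializer.Serialize(writer, eConnect);
        xml = writer.ToString();
    }

    // CreateEntity hands the XML document to eConnect for processing.
    using (var methods = new eConnectMethods())
    {
        methods.CreateEntity(dynamicGPcs, xml);
    }
}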
I can't seem to find the same for PMClassMaster, which is the table that holds all the vendor class IDs. Does anyone know the answer? For reference: https://www.gptablereference.com/2010/Table/PM00100
----- Sources for GP -----
http://mbsguru.blogspot.pt/
http://victoriayudin.com/
http://www.gptablereference.com
There is no eConnect node for the PM Class Master. Not everything that can be done in GP can be done via eConnect.
For this you will have to manually insert records into the relevant SQL table in the desired database.
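A minimal sketch of that direct insert from C# follows. The PM00100 column names (VNDCLSID, VNDCLDSC) are assumptions taken from the table reference linked in the question, and GP tables generally require every column to be populated with a non-null default, so verify the full column list before relying on anything like this:

using System.Data.SqlClient;

public static void CreateVendorClass(string connectionString)
{
    // PM00100 is the PM Class Master table. Only two illustrative columns are
    // shown; a real insert must supply appropriate defaults for every
    // remaining column, since GP does not allow NULLs.
    const string sql =
        "INSERT INTO PM00100 (VNDCLSID, VNDCLDSC) VALUES (@classId, @classDesc)";

    using (var conn = new SqlConnection(connectionString))
    using (var cmd = new SqlCommand(sql, conn))
    {
        cmd.Parameters.AddWithValue("@classId", "USA-VEND");           // hypothetical class ID
        cmd.Parameters.AddWithValue("@classDesc", "Domestic vendors"); // hypothetical description
        conn.Open();
        cmd.ExecuteNonQuery();
    }
}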