I have been working on an application that sends DOPU (drop-off/pick-up) requests for CCD documents via HealthVault. Creating the DOPU requests and getting the corresponding token generated by HealthVault both work fine.
There are two SDK methods I am using to get Meaningful Use report data right now:
OfflineWebApplicationConnection.GetMeaningfulUseTimelyAccessDOPUDocumentReport gets me all the DOPU requests sent. This works fine; it always gives me the correct DOPU requests (with date/time stamp, token, and application ID).
The other is the OfflineWebApplicationConnection.GetMeaningfulUseVDTReport method. This is the one causing problems. No matter what date range I set (a week, a month, DateTime.MinValue to DateTime.MaxValue), I always get no results. No matter how many times I go into my HV account to view and download my connected DOPU documents, that SDK method still gives me no results.
I have also tried using CCD extension XML when sending a CCD to specifically set the patient-id and entry-date. Again, this doesn't affect my report results.
Does anyone with more experience than I have with the Meaningful Use methods in the SDK have any suggestions on why I get nothing at all, ever, from the OfflineWebApplicationConnection.GetMeaningfulUseVDTReport call?
Here is some sample code that I am using to run the reports (some of the commented lines are just me trying different date ranges). I can also post snippets of code showing how I am sending the DOPU requests, even though that all seems to be behaving as expected.
class Program
{
    static void Main(string[] args)
    {
        var applicationId = ConfigurationManager.AppSettings["ApplicationId"];
        var url = ConfigurationManager.AppSettings["HealthServiceUrl"];
        var connection = new OfflineWebApplicationConnection(new Guid(applicationId), url, Guid.Empty /* offlinePersonId */);

        Console.WriteLine("\nGetMeaningfulUseTimelyAccessDOPUDocumentReport");
        //var receipts = connection.GetMeaningfulUseTimelyAccessDOPUDocumentReport(new DateRange(new DateTime(2014, 11, 19), new DateTime(2014, 12, 19)));
        var receipts = connection.GetMeaningfulUseTimelyAccessDOPUDocumentReport(new DateRange(DateTime.MinValue, DateTime.MaxValue));
        //var receipts = connection.GetMeaningfulUseTimelyAccessDOPUDocumentReport(new DateRange(DateTime.UtcNow.AddMonths(-12), DateTime.UtcNow));
        foreach (var receipt in receipts)
        {
            Console.WriteLine(string.Format("{0} - {1} - {2}", receipt.AvailableDate, receipt.PackageId, receipt.Source));
        }

        Console.WriteLine("\nGetMeaningfulUseVDTReport");
        //var activities = connection.GetMeaningfulUseVDTReport(new DateRange(new DateTime(2000, 12, 3), new DateTime(2014, 12, 10)));
        //var activities = connection.GetMeaningfulUseVDTReport(new DateRange(DateTime.MinValue, DateTime.MaxValue));
        var activities = connection.GetMeaningfulUseVDTReport(new DateRange(DateTime.UtcNow.AddMonths(-12), DateTime.UtcNow.AddDays(1)));
        foreach (var activity in activities)
        {
            Console.WriteLine(activity.PatientId);
        }

        Console.ReadLine();
    }
}
Update 1
Tried the sample Meaningful Use web application that MS had on CodePlex, using it with our application ID/credentials. Well, it worked. Not sure what is different, at least so far.
Update 2
So I have tried many other real CCDs (in our PPE environment, immediately deleting them when done), including test CCDs. I even set up the ConnectPackage in my app to behave the same as the test application from MS. No matter what I send, I get no Meaningful Use VDT data for the CCDs. The test CCD in the MS test application, however, works.
Update 3
Tried sending CCDs through the MS test application. Again, it sends and I can connect to an HV account with no problem. I get no VDT data, no matter the date range used. Maybe there is an issue with our CCDs?
Related
I'm trying to push about 150k updates into a MongoDB database (v4.2.9 running on Windows, a staging replica set with two nodes) using BulkWrite with the C# driver (v2.11.6), and it looks like it is impossible. The project is .NET Framework 4.7.2.
The Mongo C# driver documentation is terrible, but with forums and a lot of googling I was finally able to find a way to run about 150k updates as a batch, something like this (slightly simplified for SO):
client = new MongoClient(connString);
database = client.GetDatabase(db);

// Build all the updates
List<UpdateOneModel<GroupEntry>> updates = new List<UpdateOneModel<GroupEntry>>();
foreach (GroupEntry groupEntry in stats)
{
    FilterDefinition<GroupEntry> filter = Builders<GroupEntry>.Filter.Eq(e => e.Key, groupEntry.Key);
    UpdateDefinitionBuilder<GroupEntry> update = Builders<GroupEntry>.Update;
    var groupEntrySubUpdates = new List<UpdateDefinition<GroupEntry>>();
    if (groupEntry.Value.Clicks != 0)
        groupEntrySubUpdates.Add(update.Inc(u => u.Value.Clicks, groupEntry.Value.Clicks));
    if (groupEntry.Value.Position != 0)
        groupEntrySubUpdates.Add(update.Set(u => u.Value.Position, groupEntry.Value.Position));

    UpdateOneModel<GroupEntry> groupEntryUpdate = new UpdateOneModel<GroupEntry>(filter, update.Combine(groupEntrySubUpdates));
    groupEntryUpdate.IsUpsert = true;
    updates.Add(groupEntryUpdate);
}

// Now BulkWrite them in a transaction to make sure the data stays consistent
IClientSessionHandle session = client.StartSession();
session.StartTransaction();
IMongoCollection<GroupEntry> collection = database.GetCollection<GroupEntry>(collectionName);

// Following line FAILS after some time
BulkWriteResult<GroupEntry> bulkWriteResult = collection.BulkWrite(session, updates);
if (!bulkWriteResult.IsAcknowledged)
    throw new Exception("Mongo BulkWrite is not acknowledged!");

session.CommitTransaction();
The problem is that I keep getting the following exception:
{
    "operationTime" : Timestamp(1612737199, 1),
    "ok" : 0.0,
    "errmsg" : "Exec error resulting in state FAILURE :: caused by :: operation was interrupted",
    "code" : 262,
    "codeName" : "ExceededTimeLimit",
    "$clusterTime" : {
        "clusterTime" : Timestamp(1612737199, 1),
        "signature" : {
            "hash" : new BinData(0, "ljcwS5Gf2JBpEu/OgPFbvRqclLw="),
            "keyId" : NumberLong("6890288652832735234")
        }
    }
}
Does anyone have any clue? The Mongo C# driver docs are completely useless. It looks like I should somehow set the $maxTimeMS property, but that is not possible on BulkWrite. I have tried:
Restarts and rebuilds
Different versions of MongoDriver
Set much bigger timeouts for all "timeout" properties on MongoClient and session
Create smaller batches for BulkWrite (up to 1000 items per batch). Fails after 50-100 updates.
Spent hours and hours in useless Mongo docs and Mongo JIRA
So far, no luck. The funny thing is that the same approach works with C# driver 2.10.3 on .NET Core 3.1 (yes, I tried), even with bigger batches (about 300k updates).
What am I missing?
EDIT:
I tried setting maxCommitTime to 25 minutes based on dododo's comments, like this:
IClientSessionHandle session = client.StartSession(new ClientSessionOptions()
{
    DefaultTransactionOptions = new TransactionOptions(
        new Optional<ReadConcern>(ReadConcern.Default),
        new Optional<ReadPreference>(ReadPreference.Primary),
        new Optional<WriteConcern>(WriteConcern.Acknowledged),
        new Optional<TimeSpan?>(TimeSpan.FromMinutes(25)))
});
It now throws an exception while doing the commit: NoSuchTransaction - Transaction 1 has been aborted. We checked the MongoDB log file and found a new error in there:
Aborting transaction with txnNumber 1 on session 09ea7755-7148-43e8-83d8-8bf58c211bda because it has been running for longer than 'transactionLifetimeLimitSeconds'
Based on docs, this is 60 seconds by default. So we set it to 5 minutes and now it works.
So, thank you dododo for pointing me in the right direction.
Anyway, it would be really great if the Mongo team described errors better and wrote documentation beyond basic CRUD operations.
As dododo suggested, this error was a manifestation of the server closing the transaction because it took longer than transactionLifetimeLimitSeconds, which is 60 seconds by default. So two things need to be done:
Set the server parameter transactionLifetimeLimitSeconds to more than 60 seconds (a runtime sketch follows after this list)
Set maxCommitTime to a higher value. I'm unable to find the default value, so I set it to 10 minutes (the same as transactionLifetimeLimitSeconds). Set it while starting a session (see the question).
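For reference, here is a minimal sketch (mine, not from the original code) of raising transactionLifetimeLimitSeconds at runtime through the C# driver rather than editing mongod.conf. It assumes you are allowed to run setParameter against the admin database, and the 300-second value is only an example:

using MongoDB.Bson;
using MongoDB.Driver;

var client = new MongoClient(connString);

// setParameter only works against the admin database. On a replica set it has to be
// applied to every mongod, and it reverts to the default after a restart unless it is
// also set in the configuration file.
var setParameterCommand = new BsonDocument
{
    { "setParameter", 1 },
    { "transactionLifetimeLimitSeconds", 300 } // example: 5 minutes
};

BsonDocument result = client.GetDatabase("admin")
    .RunCommand<BsonDocument>(setParameterCommand);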
Anyway, documentation for this is missing and the error itself was misleading, so I hope this helps anyone who has to deal with it.
I need some help with the batchUpdate function of the Google Sheets API. Currently in my project I do some processing and call my CreateEntry function to add a message to a specific cell on the gsheet using the append function, as shown below. I realised that every call adds to my quota of API requests, and eventually the Google API stops responding and hence so does my program.
As a workaround, I thought I would store my data in a 2D array and update everything in one shot at the end using batchUpdate, to avoid exceeding the per-100-seconds, per-user request limit set by the Google API.
My questions are:
Is my logic correct in using batchUpdate to solve the Google API user-request problem (keeping in mind I might later be adding thousands of data points across 6 columns)?
If yes, how can I use batchUpdate to insert various data points into a range of cells?
PS: I have already tried putting the thread to sleep for 20 seconds after every 20 or so inserts, but that does not solve the problem, as it is unpredictable: sometimes I can get up to 400 cells of data in and sometimes it stops at just 70.
The code I am currently using is:
public void CreateEntry(string col, int ctw, string msg)
{
    var range = $"{sheet}!" + col.ToString() + ctw.ToString();
    var valueRange = new ValueRange();
    var oblist = new List<object>() { };
    oblist.Add(msg);
    valueRange.Values = new List<IList<object>> { oblist };

    var appendRequest = service.Spreadsheets.Values.Append(valueRange, SpreadsheetId, range);
    appendRequest.ValueInputOption = SpreadsheetsResource.ValuesResource.AppendRequest.ValueInputOptionEnum.USERENTERED;
    var appendReponse = appendRequest.Execute();
}
A simple sample of the gsheet data points for reference-
As of right now, as you can see from my code above, I am writing each cell individually. I now want to add everything in one shot to avoid exceeding Google's limit on user requests per 100 seconds.
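For what it could look like, here is a rough sketch of a one-shot write using spreadsheets.values.batchUpdate from the same Google.Apis.Sheets.v4 client. The method name WriteAllRows, the rows parameter, and the A1 anchor range are my own illustrations; service, SpreadsheetId, and sheet are assumed to be the same members used by CreateEntry above:

// Requires the same namespaces as CreateEntry: Google.Apis.Sheets.v4,
// Google.Apis.Sheets.v4.Data, and System.Collections.Generic.
public void WriteAllRows(IList<IList<object>> rows)
{
    // One ValueRange can carry a whole rectangular block of cells
    // (e.g. thousands of rows x 6 columns).
    var valueRange = new ValueRange
    {
        Range = $"{sheet}!A1", // top-left corner of the block being written
        Values = rows
    };

    var body = new BatchUpdateValuesRequest
    {
        ValueInputOption = "USER_ENTERED",
        Data = new List<ValueRange> { valueRange }
    };

    // This is a single request no matter how many cells are in "rows",
    // so it only counts once against the per-100-seconds quota.
    var request = service.Spreadsheets.Values.BatchUpdate(body, SpreadsheetId);
    BatchUpdateValuesResponse response = request.Execute();
}

As far as I understand the quota, it is counted per request rather than per cell, so a single batchUpdate (or a single Append carrying a multi-row ValueRange) should stay well inside the per-100-seconds limit.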
How can I reset the scores of the game on certain days using Firebase in Unity?
I want the scores I marked in the picture to be reset every day, every week, at the end of every month, in short, when their time comes. How can I do this?
What you want to look into is the Cloud Functions feature called scheduled functions.
If you're only familiar with Unity, you'll want to follow this getting started guide for more details. The basic gist of it is that you'll create a tiny snippet of JavaScript that runs at a fixed schedule and lets you perform some administrative tasks on your database.
I'll try to encapsulate the basic setup:
install Node
run npm install -g firebase-tools
create a directory where you want to work on functions - you probably want to do this outside of your Unity directory
run firebase login to log in to the Firebase CLI
run firebase init (or firebase init functions) and follow the steps in the wizard to create some functions code to test
when you're ready to use them in your game, you can use firebase deploy to send them off to the cloud.
From the Scheduled functions doc page, you can see this example of how to run a function every day:
exports.scheduledFunctionCrontab = functions.pubsub.schedule('5 11 * * *')
    .timeZone('America/New_York') // Users can choose timezone - default is America/Los_Angeles
    .onRun((context) => {
        console.log('This will be run every day at 11:05 AM Eastern!');
        return null;
    });
You can use these with the Node Admin SDK. Something like:
// Import the Admin SDK (and the functions SDK, which this snippet also needs)
var functions = require("firebase-functions");
var admin = require("firebase-admin");
admin.initializeApp();

// Get a database reference to our blog
var db = admin.database();

exports.scheduledFunctionCrontab = functions.pubsub.schedule('5 11 * * *')
    .timeZone('America/New_York') // Users can choose timezone - default is America/Los_Angeles
    .onRun((context) => {
        db.ref(`users/${user_id}/`).update({userGameScore: 0, userMonthScore: 0, userScore: 0, userWeeklyScore: 0});
        return null;
    });
Of course, here I'm not iterating over user IDs, etc.
One final note: this is a very literal interpretation of and answer to your question. It may be easier (and save you some money if your game scales up) to write a score and timestamp (maybe using ServerValue.Timestamp) together into your database and just have the scores appear zeroed out via client logic, as in the sketch below. I would personally try that approach first and abandon it if it felt like it was getting too complex.
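For what it's worth, here is a rough client-side sketch of that alternative using the Firebase Unity SDK (Firebase.Database). The node names (userWeeklyScore, weeklyScoreWrittenAt), the weekly granularity, and the helper class are illustrative assumptions, not something taken from your project:

using System;
using System.Collections.Generic;
using System.Globalization;
using System.Threading.Tasks;
using Firebase.Database;

public static class ScoreResetClient
{
    // Writes the weekly score together with a server timestamp, so any client can
    // later tell which week the stored value belongs to.
    public static Task WriteWeeklyScoreAsync(string userId, long score)
    {
        DatabaseReference userRef = FirebaseDatabase.DefaultInstance
            .GetReference("users").Child(userId);

        var updates = new Dictionary<string, object>
        {
            { "userWeeklyScore", score },
            { "weeklyScoreWrittenAt", ServerValue.Timestamp }
        };
        return userRef.UpdateChildrenAsync(updates);
    }

    // Reads the weekly score, but reports 0 if the stored value was written in an
    // earlier calendar week than "now" (i.e. the score has logically expired).
    public static async Task<long> ReadWeeklyScoreAsync(string userId, DateTime utcNow)
    {
        DataSnapshot snap = await FirebaseDatabase.DefaultInstance
            .GetReference("users").Child(userId).GetValueAsync();

        long score = snap.Child("userWeeklyScore").Value is long s ? s : 0;
        long writtenAtMs = snap.Child("weeklyScoreWrittenAt").Value is long t ? t : 0;

        DateTime writtenAt = DateTimeOffset.FromUnixTimeMilliseconds(writtenAtMs).UtcDateTime;
        Calendar cal = CultureInfo.InvariantCulture.Calendar;
        bool sameWeek = writtenAt.Year == utcNow.Year
            && cal.GetWeekOfYear(writtenAt, CalendarWeekRule.FirstFourDayWeek, DayOfWeek.Monday)
               == cal.GetWeekOfYear(utcNow, CalendarWeekRule.FirstFourDayWeek, DayOfWeek.Monday);

        return sameWeek ? score : 0;
    }
}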
I'm having a very hard time with what I feel should be a simple task. Every week, our team queries VMware vCenter for three pieces of output: VM counts in three different locations. Here is what it looks like:
Name Value
---- -----
locationA 1433
locationB 278
locationC 23
The information is emailed to our team, as well as some of the higher-ups who like to see the data. This is all automated with a Powershell script and Windows Task Scheduler running on a server, no problems.
That data is also placed in a Google sheet. We just append a new row with the date, and copy and paste the data into the three existing columns. It takes 30 seconds, once a week. Seems silly given how little time it takes to copy it over to the Google sheet but I really want to automate that last process using Google Sheets API.
I seem to keep finding and pursuing what feel like online wild goose chases for scripting access to and editing of Google sheets. I've downloaded and installed the Sheets API libraries, the Drive API libraries, and the Google .NET library, set up the Google developer site, and run through the Google Sheets API documentation and OAuth authentication. I'm using Visual Studio 2013 because I figured that would play best with PowerShell and calling the .NET commands.
I have pretty much no coding experience outside of Powershell (if you can call that coding). I can't even figure out how to pull the Google sheet, much less do anything to it. Nothing I've tried is working so far, and for what little time it takes to copy this info manually every week I've already spent so much more time than is probably worth it. I feel like if I can get a handle on this, that would open the door for further Google automation in the future since we operate with a Google domain. At any rate, help is very much appreciated.
Here is my latest scripting attempt in Visual Studio:
using System;
using Google.GData.Client;
using Google.GData.Spreadsheets;

namespace MySpreadsheetIntegration
{
    class Program
    {
        static void Main(string[] args)
        {
            string CLIENT_ID = "abunchofcharacters.apps.googleusercontent.com";
            string CLIENT_SECRET = "secretnumber";
            string REDIRECT_URI = "https://code.google.com/apis/console";
            string SCOPE = "https://spreadsheets.google.com/feeds";

            OAuth2Parameters parameters = new OAuth2Parameters();
            parameters.ClientId = CLIENT_ID;
            parameters.ClientSecret = CLIENT_SECRET;
            parameters.RedirectUri = REDIRECT_URI;
            parameters.Scope = SCOPE;

            string authorizationUrl = OAuthUtil.CreateOAuth2AuthorizationUrl(parameters);
            Console.WriteLine(authorizationUrl);
            Console.WriteLine("Please visit the URL above to authorize your OAuth "
                + "request token. Once that is complete, type in your access code to "
                + "continue...");
            parameters.AccessCode = Console.ReadLine();

            OAuthUtil.GetAccessToken(parameters);
            string accessToken = parameters.AccessToken;
            Console.WriteLine("OAuth Access Token: " + accessToken);

            GOAuth2RequestFactory requestFactory =
                new GOAuth2RequestFactory(null, "MySpreadsheetIntegration-v1", parameters);
            SpreadsheetsService service = new SpreadsheetsService("MySpreadsheetIntegration-v1");
            service.RequestFactory = requestFactory;

            var driveService = new DriveService(auth);
            var file = new File();
            file.Title = "VSI - VM Totals by Service TEST";
            file.Description = string.Format("Created via {0} at {1}", ApplicationName, DateTime.Now.ToString());
            file.MimeType = "application/vnd.google-apps.spreadsheet";
            var request = driveService.Files.Insert(file);
            var result = request.Fetch();

            var spreadsheetLink = "https://docs.google.com/spreadsheets/d/GoogleDoc_ID";
            Console.WriteLine("Created at " + spreadsheetLink);
        }
    }
}
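For comparison, here is a rough sketch of what appending a single row looks like with the newer Google.Apis.Sheets.v4 client instead of the old GData libraries. The service-account JSON file, spreadsheet ID, and range are placeholders, and (as described below) this is not the route I ultimately took:

using System;
using System.Collections.Generic;
using Google.Apis.Auth.OAuth2;
using Google.Apis.Services;
using Google.Apis.Sheets.v4;
using Google.Apis.Sheets.v4.Data;

class AppendRowSample
{
    static void Main()
    {
        // Placeholder credential: a service account that has been shared on the target sheet.
        GoogleCredential credential = GoogleCredential
            .FromFile("service-account.json")
            .CreateScoped(SheetsService.Scope.Spreadsheets);

        var service = new SheetsService(new BaseClientService.Initializer
        {
            HttpClientInitializer = credential,
            ApplicationName = "MySpreadsheetIntegration-v1"
        });

        // One row: the date plus the three location counts from the weekly report.
        var row = new ValueRange
        {
            Values = new List<IList<object>>
            {
                new List<object> { DateTime.Today.ToShortDateString(), 1433, 278, 23 }
            }
        };

        var append = service.Spreadsheets.Values.Append(row, "SPREADSHEET_ID", "Sheet1!A:D");
        append.ValueInputOption =
            SpreadsheetsResource.ValuesResource.AppendRequest.ValueInputOptionEnum.USERENTERED;
        append.Execute();
    }
}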
For anyone still following this, I found a solution. I was going about this entirely the wrong way (or at least in a way that I couldn't comprehend). One solution to my issue was to create a new Google Apps Script that accesses my email once a week (after we get the report), teases out everything but the data I am looking for, and sends it to the Google spreadsheet.
Here's the script:
function SendtoSheet(){
  var threads = GmailApp.search("from:THESENDER in:anywhere subject:THESUBJECTOFTHEEMAILWHICHNEVERCHANGES")[0];
  var message = threads.getMessages().pop();
  var bodytext = message.getBody();

  counts = [];
  bodytext = bodytext.split('<br>');
  for (var i = 0; i < bodytext.length; i++){
    line = bodytext[i].split(':');
    if (line.length > 0){
      if (!isNaN(line[1])){
        counts.push(line[1]);
      }
    }
  }

  var now = new Date();
  counts.unshift(Utilities.formatDate(now, 'EST', 'MM/dd/yyyy'));

  var sheet = SpreadsheetApp.openById("GoogleDocID");
  sheet = sheet.setActiveSheet(sheet.getSheetByName("Data by Week"));
  sheet.appendRow(counts);
}
That counts array contains the magic that extracts the numeric data by splitting on line breaks and colons. Works perfectly. It didn't involve figuring out how to use Visual Studio, or the .NET Google libraries, or editing the running PowerShell script. Clean and easy.
Hope this helps someone.
I am using Rotativa 1.6.4 from NuGet and have noticed the following issue with the code below.
ActionAsPdf hangs randomly for an indeterminate amount of time.
Code below that is hanging:
var pdfResult = new ActionAsPdf("Report", new {id = Request.Params["id"]})
{
Cookies = cookieCollection,
FormsAuthenticationCookieName = FormsAuthentication.FormsCookieName,
CustomSwitches = "--load-error-handling ignore"
};
Background info that may help:
The CustomSwitches value is used to work around a documented issue when calling wkhtmltopdf.exe via ActionAsPdf, but it only suppresses errors in the wkhtmltopdf call, not in the code.
Observations, usage and testing:
It works, but when running the application (whether or not I am stepping through code), it can take anywhere from 10 seconds up to about 4 minutes between hitting pdfResult = new ActionAsPdf and finally entering the "Report" action being called. I can't discern anything actually happening in the Visual Studio output window, and no errors are thrown that I have found. Just a random, slow transition into the Report() action.
I can run the Report() action directly via URL and it never slows like this; PDF generation is quite fast. I am using ActionAsPdf to obtain the binary so I can save it to the file system and send it via email, which is the prescribed method of doing so for this library.
The behavior exists on both a local Windows 10 dev box and a remote Server 2008R2 Test box. .Net 4.5.1 on both boxes, default IIS on each.
Questions I have:
Any idea on what might cause this slow down and how to remedy it?
I ended up using UrlAsPdf() instead of ActionAsPdf() and it works. It seems there may be some issues with ActionAsPdf(), and I have filed a bug with the Rotativa project on GitHub. ActionAsPdf() is still marked as beta, so hopefully it gets fixed in future versions or by the community.
In my case, I had to do a few more tweaks along with using UrlAsPdf(). I narrowed the issue down to the cookie collection that I was adding, so I tried adding just the cookie that I needed, and the issue was resolved. Following is the sample code that I used.
var report = new UrlAsPdf(url);
Dictionary<string, string> cookieCollection = new Dictionary<string, string>();
foreach (var key in Request.Cookies.AllKeys)
{
if (Crypto.Hash("_user").Equals(key))
{
cookieCollection.Add(key, Request.Cookies.Get(key).Value);
break;
}
}
report.Cookies = cookieCollection;
report.FormsAuthenticationCookieName = FormsAuthentication.FormsCookieName;
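As a follow-up, getting the raw bytes out of the UrlAsPdf result (for saving to disk or attaching to an email, which is what the original ActionAsPdf call was for) can be done with BuildFile, which I believe is the Rotativa call intended for this; the output path below is only an illustration:

// Inside the same controller action: render the PDF to a byte array and persist it.
// The path is illustrative, not from the original post.
byte[] pdfBytes = report.BuildFile(ControllerContext);
System.IO.File.WriteAllBytes(@"C:\Reports\report.pdf", pdfBytes);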