I am trying to encapsulate our PowerBI Server REST API via the PowerBI Client NuGet package from Microsoft.
I ran into the problem that a wrong relative path gets appended to the request URI.
The REST API of our Reporting Server (on-premises) has a base URI like https://niceReportingURL.com/reports/api/v2.0, but the NuGet package appends another "/v1.0/myorg" to it, which is not necessary.
As a result, the request URI looks like this: https://niceReportingURL.com/reports/api/v2.0/v1.0/myorg
Looking at the source code of the ReportsOperations class, I saw that this relative URI is hardcoded:
string uriString = new Uri(new Uri(absoluteUri + (absoluteUri.EndsWith("/") ? "" : "/")), "v1.0/myorg/reports").ToString();
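For illustration, the same resolution can be reproduced outside .NET. This is only a sketch, not part of either SDK: Python's urllib.parse.urljoin follows the same RFC 3986 relative-reference rules as new Uri(baseUri, relative), which shows why the hardcoded path always lands after the v2.0 base:

```python
from urllib.parse import urljoin

# The base URI passed to PowerBIClient, with a trailing slash ensured
# (mirroring the absoluteUri.EndsWith("/") check in ReportsOperations).
absolute_uri = "https://niceReportingURL.com/reports/api/v2.0"
base = absolute_uri if absolute_uri.endswith("/") else absolute_uri + "/"

# The relative path that the NuGet package hardcodes.
request_uri = urljoin(base, "v1.0/myorg/reports")
print(request_uri)
# https://niceReportingURL.com/reports/api/v2.0/v1.0/myorg/reports
```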
I omitted the "/Reports" segment in my example URIs because this appears to be a general problem.
Is there an option or workaround so that the NuGet package doesn't add this relative URI?
The request looks like this:
var c = new BasicAuthenticationCredentials
{
    UserName = "reportingUser",
    Password = "secretReportingPW"
};
var client = new PowerBIClient(new Uri("https://niceReportingURL.com/reports/api/v2.0"), c);
var result = await client.Reports.GetReportsAsync().ConfigureAwait(false); // The call fails here
The PowerBI Client package encapsulates access to the PowerBI REST API.
This API is distinct from the Reporting Services REST API, which appears to have no ready-made NuGet package that encapsulates it, but does have an OpenAPI specification that makes it easy to use.
Both APIs have endpoints for retrieving reports, but they're different kinds of report. Confusingly, Microsoft has chosen to rebrand Reporting Services as "paginated reports" in the PowerBI ecosystem, so at least some Reporting Services reports can be retrieved using the PowerBI REST API. For reports hosted by an on-premises Reporting Services instance, though, you want the Reporting Services API and can't use the PowerBI REST API.
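Since there is no ready-made NuGet package, you can call the Reporting Services REST API directly. Below is a minimal, network-free Python sketch using only the standard library; the /Reports endpoint and the basic-auth credentials are taken from the question, but whether your server actually accepts basic authentication depends on its configuration:

```python
import base64
import urllib.request

# Base URI of the on-premises Reporting Services REST API (from the question).
BASE_URI = "https://niceReportingURL.com/reports/api/v2.0"

def build_reports_request(user: str, password: str) -> urllib.request.Request:
    """Build a GET request for the /Reports endpoint with basic auth."""
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    return urllib.request.Request(
        BASE_URI + "/Reports",
        headers={"Authorization": "Basic " + token},
    )

req = build_reports_request("reportingUser", "secretReportingPW")
# To actually issue the request (requires network access to the server):
#   with urllib.request.urlopen(req) as resp:
#       body = resp.read()
```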
We are a team trying to upgrade our windows application to use Azure DevOps Services' new REST based .NET Client Libraries instead of Client OM that uses SOAP.
The part of the application that we are upgrading does the following:
Checks out all AssemblyInfoVersion.cs files.
Updates the version on those files.
Checks in all the files.
Create a Label with information about that the version was updated.
We managed to do the first three steps with the new REST based .NET Client Libraries using the CreateChangesetAsync method.
But we cannot find any information about how to create a Label, so we have not been able to do the last step. Is this really not supported?
Currently you can't create a new label with the new Azure DevOps REST API; you can only get labels.
As a workaround, you can use tf.exe with the label command to label the files.
In your code, add something like this (using System.Diagnostics):
string tfExePath = "path/to/exe";
string tfArgs = "label test /version:45 $test/src";
Process.Start(tfExePath, tfArgs);
According to the Azure DevOps Services REST API Reference, the request URI has the following format:
https://{instance}[/{team-project}]/_apis[/{area}]/{resource}?api-version={version}
Regarding the api-version:
Every API request should include an api-version to avoid having your app or service break as APIs evolve.
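Filling in the documented placeholders can be expressed as a small helper. The following Python sketch is purely illustrative (the function name and the fabrikam/MyProject sample values are not part of any SDK); it just assembles the documented format:

```python
from typing import Optional
from urllib.parse import urlencode

def build_request_uri(instance: str, area: str, resource: str,
                      api_version: str,
                      team_project: Optional[str] = None) -> str:
    """Assemble an Azure DevOps REST request URI in the documented format:
    https://{instance}[/{team-project}]/_apis[/{area}]/{resource}?api-version={version}
    """
    parts = [f"https://{instance}"]
    if team_project:
        parts.append(team_project)
    parts.append("_apis")
    if area:
        parts.append(area)
    parts.append(resource)
    return "/".join(parts) + "?" + urlencode({"api-version": api_version})

uri = build_request_uri("dev.azure.com/fabrikam", "wit", "workitems",
                        "5.0", team_project="MyProject")
print(uri)
# https://dev.azure.com/fabrikam/MyProject/_apis/wit/workitems?api-version=5.0
```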
I started using the .NET client libraries for Azure DevOps Services (and TFS) to manage dashboards programmatically.
I am able to connect to Azure DevOps using a Personal Access Token:
var credential = new VssBasicCredential(string.Empty, "PersonalAccessToken");
using (VssConnection connection = new VssConnection(new Uri("...."), credential))
using (var client = connection.GetClient<DashboardHttpClient>())
{
// ...
}
How can I specify the API version? Does it still make sense to do it, when using the .NET client libraries?
The API version is decided by the client libraries. You can confirm this by disassembling them (e.g. using ILSpy).
For example, in the current stable release of Microsoft.TeamFoundationServer.Client, DashboardHttpClientBase has a CreateDashboardAsync method that makes the following call:
this.SendAsync<Dashboard>(..., new ApiResourceVersion("4.1-preview.2"), ...);
Why are some properties of the build definitions obtained through the TFS client libraries empty? For example, I want to get the retentionRules and daysToKeep properties of a certain build definition, but the values returned are empty. When I enter the URL of the build definition in the browser, I get the JSON object with all the expected details.
public static void BuildDefinitionSample()
{
    VssConnection connection = new VssConnection(new Uri(collectionUri), new VssClientCredentials());
    var buildClient = connection.GetClient<BuildHttpClient>();
    var buildDefinitions = buildClient.GetDefinitionsAsync(project: projectName, name: "MyBuildDefinition").Result;
    var daysToKeep = buildDefinitions.FirstOrDefault()?.RetentionRules?.FirstOrDefault()?.DaysToKeep;
}
How can I get the daysToKeep property of a certain build definition through the TFS client libraries?
Thank you
The TFS client libraries (SOAP API) use the legacy client object model, while the WebApi libraries call the new REST API to achieve the same functions.
The client libraries are primarily there to supply backward compatibility with XAML builds. They cannot work well with the new vNext build system, as they were written before its time.
TFS even uses two different build retention policies for XAML builds and vNext builds. You cannot set "daysToKeep" for a XAML build; for details, please refer to my answer to this question.
So the REST API is the way forward, and you need to use it to get the values in your question.
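To illustrate what reading the value from the REST response looks like, here is a network-free Python sketch. The sample JSON fragment is illustrative (the numbers are made up), but the retentionRules and daysToKeep field names match the ones the question sees in the browser:

```python
import json

# Illustrative fragment of a vNext build definition as returned by
# GET .../_apis/build/definitions/{id}?api-version=... ; in a real
# client this string would be the HTTP response body.
definition_json = """
{
  "retentionRules": [
    { "daysToKeep": 10 }
  ]
}
"""

definition = json.loads(definition_json)
days_to_keep = definition["retentionRules"][0]["daysToKeep"]
print(days_to_keep)  # 10
```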
I'm interested in writing a client library for the NuGet v3 API in a non-.NET language.
What are the requests required to get a package, and what do the responses look like?
i.e.
GET {package-versions}
GET {package-version}
Can you also link to the official documentation that covers this scenario?
Here is the official NuGet V3 API documentation. The API is composed of multiple protocols, including:
The PackageBaseAddress - The store that contains the actual packages and their manifest files (the nuspec).
The Service Index - used by the client to discover the NuGet services
The Search Service - used by clients to search for NuGet packages
The Registration - A JSON-LD based structure that stores the packages' metadata. This includes packages' content, dependencies, descriptions, etc.
For example, say you wanted to download the package "Newtonsoft.Json":
Get the service index: GET https://api.nuget.org/v3/index.json
The response contains the address of the PackageBaseAddress (also known, somewhat inaccurately, as the flat container, since it is actually hierarchical and not flat :) ):
{
    "@id": "https://api.nuget.org/v3-flatcontainer/",
    "@type": "PackageBaseAddress/3.0.0",
    "comment": "Base URL of Azure storage where NuGet package registration info for DNX is stored, in the format https://api.nuget.org/v3-flatcontainer/{id-lower}/{version-lower}/{id-lower}.{version-lower}.nupkg"
},
Use the URI provided by the @id as a base URI to list the versions of the desired package: GET https://api.nuget.org/v3-flatcontainer/newtonsoft.json/index.json. Note that this URI is subject to change and is not part of the API.
Use the same base URI to download a package: GET https://api.nuget.org/v3-flatcontainer/newtonsoft.json/6.0.4/newtonsoft.json.6.0.4.nupkg
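The three steps above boil down to composing URLs from the flat-container base. A network-free Python sketch (package id and version taken from the example; in a real client the base address would come from the service index's @id rather than being hardcoded):

```python
# Base address of the flat container, as discovered via the service index.
PACKAGE_BASE_ADDRESS = "https://api.nuget.org/v3-flatcontainer/"

def versions_url(package_id: str) -> str:
    """URL that lists the available versions of a package."""
    # Package ids must be lowercased in these URLs.
    return f"{PACKAGE_BASE_ADDRESS}{package_id.lower()}/index.json"

def nupkg_url(package_id: str, version: str) -> str:
    """URL of the .nupkg for a specific package version."""
    pid, ver = package_id.lower(), version.lower()
    return f"{PACKAGE_BASE_ADDRESS}{pid}/{ver}/{pid}.{ver}.nupkg"

print(versions_url("Newtonsoft.Json"))
print(nupkg_url("Newtonsoft.Json", "6.0.4"))
```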
You may also want to look at the NuGet client. The client's source code is available on GitHub; you'll want to start from the NuGet.CommandLine project and work your way down the stack.
I have recently been trying to write code to add and delete content from an Amazon S3 bucket. I am completely new to Amazon S3 and the AWS SDK for .NET.
The bucket region endpoint is http://sqs.eu-west-1.amazonaws.com so I constructed my client like this:
_s3Client = AWSClientFactory.CreateAmazonS3Client(accessKey, awsSecretKey, new AmazonS3Config().WithServiceURL("http://sqs.eu-west-1.amazonaws.com"));
If I leave out the AmazonS3Config bit I get this error:
A redirect was returned without a new location. This can be caused by
attempting to access buckets with periods in the name in a different
region then the client is configured for.
When I put in the AmazonS3Config bit I no longer get that error but I appear to have no access to this bucket at all or any other bucket that I would usually have access to. Any request I send returns null.
I have tested my code with other buckets that are configured to the standard US region and it all works well. The single difference is in the CreateAmazonS3Client method where I set the config with the EU endpoint.
Could anybody give me some guidance on how I should set up my client to work with a bucket in the EU(Ireland) region. I have been searching for a few hours and every tutorial or document I have followed has not worked so far.
Just use the standard endpoint - s3.amazonaws.com
AmazonS3Config S3Config = new AmazonS3Config {
    ServiceURL = "s3.amazonaws.com",
    CommunicationProtocol = Amazon.S3.Model.Protocol.HTTP
};
AmazonS3 client = Amazon.AWSClientFactory.CreateAmazonS3Client(AWS_Key, AWS_SecretKey, S3Config);
PutObjectRequest UploadToS3Request = new PutObjectRequest();
UploadToS3Request.WithFilePath(localPath)
.WithBucketName(bucket)
.WithKey(key);
client.PutObject(UploadToS3Request);
To whom it may still concern..
With the old AWS SDKs (version 1), you can simply create the S3 client without a region or an AmazonS3Config. No need to specify a service URL, it uses the default, mentioned above, for you. The only time you really need the region for work with S3 is when you create a bucket which is probably rarely a requirement for an application.
This works for me and all communication I perform to S3 is over https.
With the new AWS SDK for .NET (version 2 and above), the region parameter seems to be required; in fact, the AmazonS3Client throws an exception if it is not given one. I've tried working around this limitation by specifying a generic https://s3.amazonaws.com URL and failed, because the new SDK does not follow the 301 redirect from the default (US-EAST-1, I think) endpoint.
So in summary, it is best to specify the region, even on the old API, to avoid breaking in the future. And if your application makes cross-region calls, which are (possibly) slower and more expensive, it's probably best that your code testifies to that.
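For what it's worth, the regional S3 endpoints follow a simple naming pattern. This sketch assumes the modern dotted s3.{region}.amazonaws.com form; around the SDK v1 era, regions outside us-east-1 were commonly addressed with the dashed s3-{region}.amazonaws.com form instead:

```python
def s3_regional_endpoint(region: str) -> str:
    """Return the S3 endpoint hostname for a region.

    Assumes the modern dotted form; older SDKs and docs also used the
    dashed form s3-{region}.amazonaws.com for regions outside us-east-1.
    """
    if region == "us-east-1":
        return "s3.amazonaws.com"  # the default/standard endpoint
    return f"s3.{region}.amazonaws.com"

print(s3_regional_endpoint("eu-west-1"))  # s3.eu-west-1.amazonaws.com
```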