I'm trying to connect to an SSRS server and get report data via the .NET WebClient. I'm doing this because I can't use forms and I don't want to just send the user to the report server; I'd rather keep everything in my web application.
So I have this bit of code in a controller:
public IHttpActionResult GetSpecs(int Id)
{
    using (var client = new WebClient())
    {
        client.Credentials = new System.Net.NetworkCredential("username", "pw", "domain");
        var data = client.DownloadString(ReportServerUrl + "?%2fFactory+Specs+Reports%2fSpecs_Stats_Matrix&rs:Command=Render&a=" + Id + "&b=" + CurrentUser.Id);
        return Ok(data);
    }
}
It successfully connects to the SSRS server, and it does get data. Inspecting the data, it looks like the report I need, but it's just one giant string of HTML and JavaScript that the SSRS server spits out.
My question is, is there a good way of handling this data?
I'm in unfamiliar territory, and it doesn't seem like a lot of people interact with SSRS in this way.
I'm not quite sure how to display all the data to the end user.
Thanks!
That's quite easy. :)
First you need some assemblies to access the Reporting Services instance; the code below uses the Microsoft.Reporting.WebForms namespace, which ships with the ReportViewer package. All of these assemblies are easy to include in your project via NuGet.
After this, connect to your SSRS instance like this:
using Microsoft.Reporting.WebForms;

// CustomReportCredentials is a custom IReportServerCredentials implementation
// wrapping the user name, password and domain.
CustomReportCredentials reportServerCredentials = new CustomReportCredentials("User", "Password", "REPORTINGSERVER");

ServerReport report = new ServerReport()
{
    ReportServerUrl = new Uri("https://reporting.xxxx.com/ReportServer"),
    ReportServerCredentials = reportServerCredentials,
    ReportPath = "/Reports/MyReport",
    Timeout = 200000
};
Ask the service for the supported rendering extensions, set the parameters and generate the report:
var renderExtensions = report.ListRenderingExtensions();

report.SetParameters(new ReportParameter[]
{
    new ReportParameter("parameter1", dcStringID.ToString()),
    new ReportParameter("parameter2", begin.ToString()),
    new ReportParameter("parameter3", end.ToString())
});

string mimeType = String.Empty;
string fileNameExtension = String.Empty;
byte[] reportBytes = report.Render(renderExtensions.First().Name, null, null, out mimeType, out fileNameExtension);
At this point you have everything you need: the rendered report bytes, the MIME type and the file name extension.
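To tie this back to the original Web API controller, here is a minimal sketch of returning the rendered report as a download. BuildReport is a hypothetical helper that creates and parameterizes the ServerReport exactly as shown above, and "PDF" is assumed to be among the formats returned by ListRenderingExtensions():

public IHttpActionResult GetSpecs(int Id)
{
    // Hypothetical helper wrapping the ServerReport setup shown above.
    ServerReport report = BuildReport(Id);

    string mimeType;
    string fileNameExtension;
    byte[] reportBytes = report.Render("PDF", null, null, out mimeType, out fileNameExtension);

    // Wrap the bytes in an HTTP response (System.Net.Http / System.Net.Http.Headers).
    var response = new HttpResponseMessage(HttpStatusCode.OK)
    {
        Content = new ByteArrayContent(reportBytes)
    };
    response.Content.Headers.ContentType = new MediaTypeHeaderValue(mimeType);
    response.Content.Headers.ContentDisposition = new ContentDispositionHeaderValue("attachment")
    {
        FileName = "Specs_Stats_Matrix." + fileNameExtension
    };
    return ResponseMessage(response);
}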
Related
I am trying to get the full contents of my modules from Zoho to our local server. The Deluge code does work, as it returns the data that is being sent via the API. However, once it reaches the API, it is null. Any idea?
Below is the Deluge code:
// Create a map that holds the values of the new contact that needs to be created
evaluation_info = Map();
evaluation_info.put("BulkData",zoho.crm.getRecords("Publishers"));
data = Map();
data.put(evaluation_info);
response = invokeurl
[
url :"https://zohoapi.xxxxx.com/publisher/publish"
type :POST
parameters:data
connection:"zohowebapi"
];
info data; // data returns all the data from publishers
Here is my ASP.NET Core RESTful API. The endpoint does get hit and the file is created, but the content of the file is null.
Route("[controller]")]
[ApiController]
public class PublisherController : ControllerBase
{
[HttpGet("[action]"), HttpPost("[action]")]
public void Publish(string data)
{
(it's already null when it comes here. why?)
string JSONresult = JsonConvert.SerializeObject(data);
string path = #"C:\storage\journalytics_evaluationsv2.json";
using (var file = new StreamWriter(path, true))
{
file.WriteLine(JSONresult.ToString());
file.Close();
}
}
}
}
What am I missing? Thank you
After contacting Zoho support, the solution they offered was to loop through the data in order to get all the contents of a module (if there are more than 200 records). With the solution provided, you don't really need the Deluge code anymore, as long as you have the Zoho API set up for your account in code. This was my final solution, but note that it is not scalable at all; it's best to work with the bulk CSV API instead.
// Our own ZohoApi wrapper which lets us connect and authenticate etc. Yours may look slightly different.
ZohoApi zohoApi = new ZohoApi();
zohoApi.Initialize();

ZCRMRestClient restClient = ZCRMRestClient.GetInstance();
var allMedicalJournals = new List<ZCRMRecord>();

// Page through the module, 200 records (the API page size limit) at a time.
for (int i = 1; i <= 30; i++)
{
    List<ZCRMRecord> accountAccessRecords2 =
        restClient.GetModuleInstance("Journals").SearchByCriteria("Tag:equals:MedicalSet", i, 200).BulkData.ToList();
    foreach (var newData in accountAccessRecords2)
        allMedicalJournals.Add(newData);
}
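To land those records in the JSON file the controller was originally writing, a minimal follow-up sketch (assuming Newtonsoft.Json; serializing ZCRMRecord objects directly is an assumption here, and you may prefer to project them onto your own DTO first):

// Persist the fetched records as JSON, using the same path the controller wrote to.
string json = JsonConvert.SerializeObject(allMedicalJournals, Formatting.Indented);
System.IO.File.WriteAllText(@"C:\storage\journalytics_evaluationsv2.json", json);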
As part of an ML automation process I want to dynamically create a new AutoML model. I'm using C# (.NET Framework) and Google.Cloud.AutoML.V1.
After trying to run CreateDataSet code:
var autoMlClient = AutoMlClient.Create();
var parent = LocationName.FromProjectLocation(_projectId, _locationId);
var dataset = new Google.Cloud.AutoML.V1.Dataset();
dataset.DisplayName = "NewDataSet";
var response = autoMlClient.CreateDataset(parent, dataset);
I get the following error:
Field: dataset.dataset_metadata; Message: Required field not set
According to this user manual I should set the dataset metadata type, but the list contains only specific kinds of classification (Translation/ImageClassification etc.); I can't find a plain classification type.
How do I create a simple classification dataset with the API? In the AutoML UI it's just a simple button click ("NEW DATASET") where you provide only a name and a region, and no classification type.
I also tried to set:
dataset.TextClassificationDatasetMetadata =
new TextClassificationDatasetMetadata() { ClassificationType = ClassificationType.Multiclass };
But I was unable to import data into it (I got too many "invalid input" errors from the input CSV file); I guess that's because the input format is not suitable for text classification.
UPDATE
I've just noticed that the NuGet package works with AutoML v1, but v1beta1 does contain a TablesDatasetMetadata dataset metadata type for plain classifications. I'm speechless.
I also ran into this scenario today while creating a dataset using the Node.js client. Since the Google AutoML Tables service is at the beta level, you need to use the beta version of the AutoML client. In the Google Cloud documentation they use the beta client to create a dataset.
In Node.js, importing the beta version, require('@google-cloud/automl').v1beta1.AutoMlClient, instead of the normal version (v1), require('@google-cloud/automl').v1, worked for me to successfully execute the create dataset functionality.
In C# you can achieve the same through a POST request. Hope this helps :)
After @RajithaWarusavitarana's comment, and my last question update, below is the code that did the trick. The token is generated with the Google client API NuGet package, and the AutoML call is handled via REST.
string GcpGlobalEndPointUrl = "https://automl.googleapis.com";
string GcpGlobalLocation = "us-central1"; // api "parent" parameter

public string GetToken(string jsonFilePath)
{
    // Load the service account key file and build a credential scoped to the cloud platform.
    var serviceAccountCredentialFileContents = System.IO.File.ReadAllText(jsonFilePath);
    var credentialParameters = NewtonsoftJsonSerializer.Instance.Deserialize<JsonCredentialParameters>(serviceAccountCredentialFileContents);
    var initializer = new ServiceAccountCredential.Initializer(credentialParameters.ClientEmail)
    {
        Scopes = new List<string> { "https://www.googleapis.com/auth/cloud-platform" }
    };
    var cred = new ServiceAccountCredential(initializer.FromPrivateKey(credentialParameters.PrivateKey));
    string accessToken = cred.GetAccessTokenForRequestAsync("https://oauth2.googleapis.com/token").Result;
    return accessToken;
}

public void GetDataSetList(string projectId, string token)
{
    var restClient = new RestClient(GcpGlobalEndPointUrl);
    var listDataSetsUrl = $"v1beta1/projects/{projectId}/locations/{GcpGlobalLocation}/datasets";
    var listDataSetsReq = new RestRequest(listDataSetsUrl, Method.GET);
    listDataSetsReq.AddHeader("Authorization", $"Bearer {token}");
    var listDataSetsResponse = restClient.Execute(listDataSetsReq);
    listDataSetsResponse.Dump(); // LINQPad's Dump(); replace with your own logging
}
I took the token generation code from google-api-dotnet-client Test File
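For the dataset creation itself, a POST against the same v1beta1 datasets endpoint should work; the sketch below reuses GetToken and the constants above, and the tablesDatasetMetadata body shape is taken from the AutoML Tables REST documentation (treat it as an assumption):

public string CreateTablesDataSet(string projectId, string token, string displayName)
{
    var restClient = new RestClient(GcpGlobalEndPointUrl);
    var createDataSetUrl = $"v1beta1/projects/{projectId}/locations/{GcpGlobalLocation}/datasets";
    var createDataSetReq = new RestRequest(createDataSetUrl, Method.POST);
    createDataSetReq.AddHeader("Authorization", $"Bearer {token}");
    // tablesDatasetMetadata selects the AutoML Tables (plain classification) dataset
    // type, which is only available in v1beta1; an empty object suffices at creation.
    createDataSetReq.AddJsonBody(new { displayName, tablesDatasetMetadata = new { } });
    var createDataSetResponse = restClient.Execute(createDataSetReq);
    return createDataSetResponse.Content; // the created dataset resource as JSON
}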
I am trying to print multiple files using the Google Cloud Print API.
I have integrated a local printer via chrome://devices/
and tried to implement the printer API from the link below:
https://github.com/lppkarl/GoogleCloudPrintApi
var provider = new GoogleCloudPrintOAuth2Provider(ClientId, SecreteKey);
var url = provider.BuildAuthorizationUrl(redirectUri);
// These calls are async, so await their results.
var token = await provider.GenerateRefreshTokenAsync(url, redirectUri);
var client = new GoogleCloudPrintClient(provider, token);
var request = new ListRequest { Proxy = proxy };
var response = await client.ListPrinterAsync(request);
Using this method I am not getting the token or the authorization code from the request URL.
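For reference, the library's README exchanges the authorization code (which the user obtains after visiting the authorization URL), not the URL itself, for the refresh token; a sketch of that reading, with authorizationCode standing in for the value captured from the consent redirect:

var provider = new GoogleCloudPrintOAuth2Provider(ClientId, SecreteKey);
var url = provider.BuildAuthorizationUrl(redirectUri);
// Direct the user to `url`, then capture the authorization code from the redirect.
var token = await provider.GenerateRefreshTokenAsync(authorizationCode, redirectUri);
var client = new GoogleCloudPrintClient(provider, token);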
I also tried to implement printing using "Print to Google Cloud Print" with a dynamic URL:
for (var i = 0; i < url.length; i++) {
    var currentUrl = url[i];
    var gadget = new cloudprint.Gadget();
    gadget.setPrintDocument("application/pdf", "pdf");
    gadget.setPrintDocument("url", "document title1", currentUrl);
    gadget.openPrintDialog();
}
The code above works for multiple files, but it asks for confirmation for every file.
I want to print multiple files where it asks only the first time, with the rest of the files automatically added to the print queue.
I have tried many approaches, but none have worked.
If anyone has an idea, it would be helpful. Thank you
I'm trying to move a file from one folder to another using the Google Drive API v3. I found documentation on how to do this here. I used the .NET sample code from the documentation page and created a method that looks like this:
public ActionResult MoveFile(string fileToMove, string destination)
{
    DriveService service = new DriveService(new BaseClientService.Initializer
    {
        HttpClientInitializer = <USER CREDENTIAL>,
        ApplicationName = "APPNAME"
    });

    // Look up the IDs of the file and the destination folder by name.
    var searchFiles = service.Files.List();
    searchFiles.Corpus = FilesResource.ListRequest.CorpusEnum.User;
    searchFiles.Q = "name = '" + fileToMove + "'";
    searchFiles.Fields = "files(*)";
    string fileToMoveId = searchFiles.Execute().Files[0].Id;
    searchFiles.Q = "name = '" + destination + "'";
    string destinationId = searchFiles.Execute().Files[0].Id;

    // Code used from documentation
    // Retrieve the existing parents to remove
    var getRequest = service.Files.Get(fileToMoveId);
    getRequest.Fields = "parents";
    var file = getRequest.Execute();
    var previousParents = String.Join(",", file.Parents);

    // Move the file to the new folder
    var updateRequest = service.Files.Update(file, fileToMoveId);
    updateRequest.Fields = "id, parents";
    updateRequest.AddParents = destinationId;
    updateRequest.RemoveParents = previousParents;
    file = updateRequest.Execute();

    return RedirectToAction("Files", new { folderId = destinationId });
}
When I execute this code I get the following error:
The parents field is not directly writable in update requests. Use the
addParents and removeParents parameters instead.
The error doesn't really make sense to me, because this code sample came from the documentation page itself. I can't figure out which other parameters they mean. Which addParents and removeParents parameters do they mean? Are updateRequest.AddParents and updateRequest.RemoveParents not the right ones?
OK, here is the problem.
var updateRequest = service.Files.Update(file, fileToMoveId);
The method requires that you send the body of a file to be updated. This normally makes sense, as any changes you want to make can be added to the body.
The problem is that you got your file from a files.get call. That is totally normal; this is how you should be doing it. However, there are some fields in that file that you can't update, so by sending the full file the API rejects your update. If you check Files: update under Request body you will see which fields are updatable.
Issue:
Now, this is either a problem with the client library or with the API; I am going to have to track down a few people at Google to see which is the case.
Fix:
I did some testing and sending an empty file object as the body works just fine. The file is moved.
var updateRequest = service.Files.Update(new Google.Apis.Drive.v3.Data.File(), fileToMove.Id);
updateRequest.AddParents = directoryToMove.Id;
updateRequest.RemoveParents = fileToMove.Parents[0];
var movedFile = updateRequest.Execute();
This method works well in your own drive, but not in a team drive, where a file (or folder) can have strictly one parent. I do not have a solution for team drives.
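One possible angle for team (shared) drives, offered as an untested sketch: the v3 .NET client exposes a SupportsAllDrives property on its request objects, which has to be set when touching shared-drive items, and since a shared-drive file has exactly one parent, the move is still a single AddParents/RemoveParents pair:

// Untested sketch: move a shared-drive file (destinationFolderId is the target folder's id).
var updateRequest = service.Files.Update(new Google.Apis.Drive.v3.Data.File(), fileToMove.Id);
updateRequest.AddParents = destinationFolderId;
updateRequest.RemoveParents = fileToMove.Parents[0]; // the single existing parent
updateRequest.SupportsAllDrives = true;
var movedFile = updateRequest.Execute();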
EDIT: I think I can simplify this question by asking only for what is needed:
I am working with C# using the SSRS 2010 Web Service:
'ReportService2010.asmx' http://technet.microsoft.com/en-us/library/ee640743.aspx
I can use the 'CreateDataSource' method to create a data source on an instance of an SSRS server, http://(servername)/ReportServer.
I can also use the 'CreateCatalogItem' method to create a report on a server from a project's local RDL file, by serializing it to a byte array and passing that as the 'Definition' to the method.
Now everything I do works, with one major caveat: I can only deploy everything to the same folder. If I deploy a data source to, say, the 'Data Sources' folder and then a report to, say, 'Test Reports', the report does not know it has a shared data source to reference at a different location. So I dug into the TechNet articles and tried the 'GetItemDataSources' method, but it only gives a name and a type for the ReportingService2010.DataSource return type. Does anyone know how to link up a report's or dataset's CatalogItem 'DataSource' property so that it points to a reference in a different folder on the SSRS server when deploying? There has to be a way, because Business Intelligence Development Studio can deploy like this.
I've had similar issues when deploying report files: when deploying through rs.exe or code, reports lose their link to the data source.
We solved this by explicitly pointing the report to the server-side data source immediately after deployment from our application; is this similar to what you're trying to do?
Anyway, here's the slightly adapted code we use in our report deployment application:
static void SetReportDataSource(string reportPath)
{
    string dsPath = CombinePath(DataSourcePath, DataSourceFolder, DataSourceName);

    // Build a reference to the shared data source on the server.
    DataSourceReference dsRef = new DataSourceReference()
    {
        Reference = dsPath
    };

    DataSource ds = new DataSource();
    ds.Item = dsRef as DataSourceDefinitionOrReference;
    ds.Name = DataSourceName;

    // Repoint the report's data sources at the shared data source.
    Server.SetItemDataSources(reportPath, new DataSource[] { ds });
}
So, basically we have variables that define information like the Data Source name, Data Source location on server, and the same for a report. They can be in different folders.
Based on this, we create a new reference to a Data Source and then repoint the report to this using SetItemDataSources.
This sorted out the Data Source issue for me, anyway. Not sure about Shared Datasets and how they handle all of this, but hopefully this will be of some help.
Also, note that this code was written against the ReportService2005 endpoint, but it's probably not too different for ReportService2010.
Edit:
The paths mentioned here are relative to the server, e.g. /Reports/. You don't need the fully qualified name, because you define the Url property of the ReportService2010 object, which contains the destination.
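As a concrete illustration (the server name and folder names are placeholders), the proxy setup implied above might look like this:

// The Url property carries the server portion; item paths stay server-relative.
ReportingService2010 Server = new ReportingService2010();
Server.Url = "http://myserver/ReportServer/ReportService2010.asmx";
Server.Credentials = System.Net.CredentialCache.DefaultCredentials;

// Only server-relative paths are needed from here on.
SetReportDataSource("/Test Reports/MyReport");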
Maybe this will be of some help. I used it to reset the data source for all the reports in a given parent folder and its subfolders:
using System;
using GetPropertiesSample.ReportService2010;
using System.Diagnostics;
using System.Collections.Generic; //<== required for LISTS
using System.Reflection;

namespace GetPropertiesSample
{
    class Program
    {
        static void Main(string[] args)
        {
            GetListOfObjectsInGivenFolder_and_ResetTheReportDataSource("0_Contacts"); //<=== This is the parent folder
        }

        private static void GetListOfObjectsInGivenFolder_and_ResetTheReportDataSource(string sParentFolder)
        {
            // Create a Web service proxy object and set credentials
            ReportingService2010 rs = new ReportingService2010();
            rs.Credentials = System.Net.CredentialCache.DefaultCredentials;

            CatalogItem[] reportList = rs.ListChildren(@"/" + sParentFolder, true);
            int iCounter = 0;
            foreach (CatalogItem item in reportList)
            {
                iCounter += 1;
                Debug.Print(iCounter.ToString() + "]#########################################");
                if (item.TypeName == "Report")
                {
                    Debug.Print("Report: " + item.Name);
                    ResetTheDataSource_for_a_Report(item.Path, "/DataSources/Shared_New"); //<=== This is the DataSource that I want them to use
                }
            }
        }

        private static void ResetTheDataSource_for_a_Report(string sPathAndFileNameOfTheReport, string sPathAndFileNameForDataSource)
        {
            // From: http://stackoverflow.com/questions/13144604/ssrs-reportingservice2010-change-embedded-datasource-to-shared-datasource
            ReportingService2010 rs = new ReportingService2010();
            rs.Credentials = System.Net.CredentialCache.DefaultCredentials;

            string reportPathAndName = sPathAndFileNameOfTheReport;
            // Example of sPathAndFileNameOfTheReport: "/0_Contacts/207_Practices_County_CareManager_Role_ContactInfo"

            List<ReportService2010.ItemReference> itemRefs = new List<ReportService2010.ItemReference>();
            ReportService2010.DataSource[] itemDataSources = rs.GetItemDataSources(reportPathAndName);
            foreach (ReportService2010.DataSource itemDataSource in itemDataSources)
            {
                ReportService2010.ItemReference itemRef = new ReportService2010.ItemReference();
                itemRef.Name = itemDataSource.Name;
                // Example of a DataSource reference: "/DataSources/SharedDataSource_DB2_CRM"
                itemRef.Reference = sPathAndFileNameForDataSource;
                itemRefs.Add(itemRef);
            }
            rs.SetItemReferences(reportPathAndName, itemRefs.ToArray());
        }
    }
}