Moving files with Google Drive API v3 - c#

I'm trying to move a file from one folder to another using the Google Drive API v3. I found documentation on how to do this here. I used the .NET sample code from the documentation page and created a method that looks like this:
public ActionResult MoveFile(string fileToMove, string destination)
{
    DriveService service = new DriveService(new BaseClientService.Initializer
    {
        HttpClientInitializer = <USER CREDENTIAL>,
        ApplicationName = "APPNAME"
    });

    var searchFiles = service.Files.List();
    searchFiles.Corpus = FilesResource.ListRequest.CorpusEnum.User;
    searchFiles.Q = "name = '" + fileToMove + "'";
    searchFiles.Fields = "files(*)";
    string fileToMoveId = searchFiles.Execute().Files[0].Id;
    searchFiles.Q = "name = '" + destination + "'";
    string destinationId = searchFiles.Execute().Files[0].Id;

    // Code used from documentation
    // Retrieve the existing parents to remove
    var getRequest = service.Files.Get(fileToMoveId);
    getRequest.Fields = "parents";
    var file = getRequest.Execute();
    var previousParents = String.Join(",", file.Parents);

    // Move the file to the new folder
    var updateRequest = service.Files.Update(file, fileToMoveId);
    updateRequest.Fields = "id, parents";
    updateRequest.AddParents = destinationId;
    updateRequest.RemoveParents = previousParents;
    file = updateRequest.Execute();

    return RedirectToAction("Files", new { folderId = destinationId });
}
When I execute this code I get the following error:
The parents field is not directly writable in update requests. Use the
addParents and removeParents parameters instead.
The error doesn't really make sense to me, because this code sample came from the documentation page itself. I can't figure out which other parameters they mean. What addParents and removeParents parameters do they mean? Are updateRequest.AddParents and updateRequest.RemoveParents not the right parameters?

Ok, here is the problem.
var updateRequest = service.Files.Update(file, fileToMoveId);
The method requires that you send the body of the file to be updated. This normally makes sense, as any changes you want to make can be added to the body.
Now, the problem you are having is that you got your file from a files.get, which is totally normal; this is how you should be doing it. The problem is that there are some fields in that file that you can't update, so by sending the full file the API rejects your update. If you check Files: update under Request body you will see which fields are updatable.
Issue:
Now, this is either a problem with the client library or with the API; I am going to have to track down a few people at Google to see which is the case.
Fix:
I did some testing, and sending an empty file object as the body works just fine. The file is moved.
var updateRequest = service.Files.Update(new Google.Apis.Drive.v3.Data.File(), fileToMove.Id);
updateRequest.AddParents = directoryToMove.Id;
updateRequest.RemoveParents = fileToMove.Parents[0];
var movedFile = updateRequest.Execute();
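Presumably this works because the v3 update call is a PATCH under the hood: the client library only serializes the properties you actually set, so an empty body carries no read-only fields (like parents) to trip the validator, and the move is expressed entirely through the addParents and removeParents query parameters.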

This method works well in your own drive, but not in a team drive, where a file (or folder) is strictly limited to a single parent. I do not have the solution for a team drive.
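For what it's worth, a minimal sketch of what a team drive (shared drive) move might look like, assuming a client library version that exposes SupportsAllDrives; I have not verified this, so treat it as an assumption:
var updateRequest = service.Files.Update(new Google.Apis.Drive.v3.Data.File(), fileToMove.Id);
updateRequest.AddParents = directoryToMove.Id;
// In a shared drive a file has exactly one parent, so a single
// add/remove pair still expresses the move.
updateRequest.RemoveParents = fileToMove.Parents[0];
updateRequest.SupportsAllDrives = true; // needed for shared drive items
var movedFile = updateRequest.Execute();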


Retrieve all contents of Zoho module via REST API c#

I am trying to get the full contents of my modules from Zoho to our local server. The Deluge code does work, as it returns the data which is being sent via the API. However, once it reaches the API, it is null. Any idea?
Below is the deluge code:
// Create a map that holds the values of the new contact that needs to be created
evaluation_info = Map();
evaluation_info.put("BulkData",zoho.crm.getRecords("Publishers"));
data = Map();
// Map.put needs a key as well as a value; the key name "data" is an
// assumption here and must match whatever the receiving API reads.
data.put("data",evaluation_info);
response = invokeurl
[
    url :"https://zohoapi.xxxxx.com/publisher/publish"
    type :POST
    parameters:data
    connection:"zohowebapi"
];
info data; // data returns all the data from publishers
Here is my ASP.NET Core RESTful API. The endpoint does get hit and the file is created, but the content of the file is null.
Route("[controller]")]
[ApiController]
public class PublisherController : ControllerBase
{
[HttpGet("[action]"), HttpPost("[action]")]
public void Publish(string data)
{
(it's already null when it comes here. why?)
string JSONresult = JsonConvert.SerializeObject(data);
string path = #"C:\storage\journalytics_evaluationsv2.json";
using (var file = new StreamWriter(path, true))
{
file.WriteLine(JSONresult.ToString());
file.Close();
}
}
}
}
What am I missing? Thank you
After contacting Zoho support, the solution they offered was to loop through the data in order to get all the contents from a module (if there are more than 200 records). With the solution provided, one doesn't really need the Deluge code anymore, as long as you have the Zoho API set up for your account in code. This was my final solution. It is not scalable at all; it's best to work with the bulk CSV instead.
// Our own ZohoAPI which lets us connect and authenticate etc. Yours may look slightly different
ZohoApi zohoApi = new ZohoApi();
zohoApi.Initialize();
ZCRMRestClient restClient = ZCRMRestClient.GetInstance();

var allMedicalJournals = new List<ZCRMRecord>();
for (int i = 1; i <= 30; i++)
{
    List<ZCRMRecord> accountAccessRecords2 =
        restClient.GetModuleInstance("Journals").SearchByCriteria("Tag:equals:MedicalSet", i, 200).BulkData.ToList();
    foreach (var newData in accountAccessRecords2)
        allMedicalJournals.Add(newData);
}
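As for why the data parameter arrived null in the first place: with [ApiController], a plain string parameter binds from the route or query string, not from the POSTed form body, so the map sent by invokeurl never reaches it. A minimal sketch of one workaround, reading the raw request body inside the same controller (untested against Zoho's actual payload, so treat it as an assumption):
[HttpPost("[action]")]
public async Task<IActionResult> Publish()
{
    // Read the raw POST body instead of relying on parameter binding.
    string data;
    using (var reader = new StreamReader(Request.Body))
    {
        data = await reader.ReadToEndAsync();
    }
    string path = @"C:\storage\journalytics_evaluationsv2.json";
    await System.IO.File.AppendAllTextAsync(path, data + Environment.NewLine);
    return Ok();
}
Alternatively, [FromForm] string data should pick the value out of a form-encoded body, provided the Deluge map actually uses "data" as its key.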

How to create new AutoML DataSet for simple classification (C#)

As part of an ML automation process I want to dynamically create a new AutoML model. I'm using C# (.NET Framework) and Google.Cloud.AutoML.V1.
After trying to run the CreateDataset code:
var autoMlClient = AutoMlClient.Create();
var parent = LocationName.FromProjectLocation(_projectId, _locationId);
var dataset = new Google.Cloud.AutoML.V1.Dataset();
dataset.DisplayName = "NewDataSet";
var response = autoMlClient.CreateDataset(parent, dataset);
I get the following error:
Field: dataset.dataset_metadata; Message: Required field not set
According to this user manual I should set the dataset metadata type, but the list contains only specific types of classification (Translation/ImageClassification etc.); I can't find a simple classification type.
How do I create a simple classification dataset with the API? In the AutoML UI it's just a simple button click ("NEW DATASET") where you have to provide only a name & region, no classification type.
I also tried to set:
dataset.TextClassificationDatasetMetadata =
new TextClassificationDatasetMetadata() { ClassificationType = ClassificationType.Multiclass };
But I was unable to import data into it (I got too many errors about invalid inputs from the input CSV file); I guess that's because the input format is not suitable for text classification.
UPDATE
I've just noticed that the NuGet package works with AutoML v1, but v1beta1 does contain the TablesDatasetMetadata dataset metadata type for normal classifications. I'm speechless.
I also experienced this scenario today while creating a dataset using the NodeJS client. Since the Google AutoML Tables service is at the beta level, you need to use the beta version of the AutoML client. In the Google Cloud documentation they have used the beta client to create a dataset.
In NodeJS, importing the beta version require('@google-cloud/automl').v1beta1.AutoMlClient instead of the normal version (v1) require('@google-cloud/automl').v1 worked for me to successfully execute the create dataset functionality.
In C# you can achieve the same through a POST request. Hope this helps :)
After @RajithaWarusavitarana's comment, and my last question update, below is the code that did the trick. The token is generated by the Google client API NuGet package, and AutoML is handled by REST.
string GcpGlobalEndPointUrl = "https://automl.googleapis.com";
string GcpGlobalLocation = "us-central1"; // api "parent" parameter

public string GetToken(string jsonFilePath)
{
    var serviceAccountCredentialFileContents = System.IO.File.ReadAllText(jsonFilePath);
    var credentialParameters = NewtonsoftJsonSerializer.Instance.Deserialize<JsonCredentialParameters>(serviceAccountCredentialFileContents);
    var initializer = new ServiceAccountCredential.Initializer(credentialParameters.ClientEmail)
    {
        Scopes = new List<string> { "https://www.googleapis.com/auth/cloud-platform" }
    };
    var cred = new ServiceAccountCredential(initializer.FromPrivateKey(credentialParameters.PrivateKey));
    string accessToken = cred.GetAccessTokenForRequestAsync("https://oauth2.googleapis.com/token").Result;
    return accessToken;
}

public void GetDataSetList(string projectId, string token)
{
    var restClient = new RestClient(GcpGlobalEndPointUrl);
    var createDataSetReqUrl = $"v1beta1/projects/{projectId}/locations/{GcpGlobalLocation}/datasets";
    var createDataSetReq = new RestRequest(createDataSetReqUrl, Method.GET);
    createDataSetReq.AddHeader("Authorization", $"Bearer {token}");
    var createDatasetResponse = restClient.Execute(createDataSetReq);
    createDatasetResponse.Dump();
}
I took the token generation code from the google-api-dotnet-client test file.
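For completeness, a sketch of the corresponding create-dataset call against the same v1beta1 REST surface. The empty tablesDatasetMetadata object is what marks the dataset as an AutoML Tables dataset; I put this together from the v1beta1 docs rather than running it, so treat the request shape as an assumption:
public void CreateTablesDataSet(string projectId, string token)
{
    var restClient = new RestClient(GcpGlobalEndPointUrl);
    var createDataSetReqUrl = $"v1beta1/projects/{projectId}/locations/{GcpGlobalLocation}/datasets";
    var createDataSetReq = new RestRequest(createDataSetReqUrl, Method.POST);
    createDataSetReq.AddHeader("Authorization", $"Bearer {token}");
    // An empty tablesDatasetMetadata marks this as a Tables dataset.
    createDataSetReq.AddJsonBody(new { displayName = "NewDataSet", tablesDatasetMetadata = new { } });
    var createDatasetResponse = restClient.Execute(createDataSetReq);
    createDatasetResponse.Dump();
}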

TFS API Returns only first level folders C#

I'm trying to use the TFS API to fetch all work items. It establishes the connection without any trouble, but the issue I'm facing is that it only fetches the first level of folders. The folder I am actually looking for is nested inside those folders, somewhere around the 4th level. Here is the code I'm trying:
string collectionUri = ConfigurationManager.AppSettings["tfsPath"].ToString(); // http://myserver:8080/tfs/defaultcollection
string teamProjectName = ConfigurationManager.AppSettings["tfsProject"]; // mycompany
VssConnection connection = new VssConnection(new Uri(collectionUri), new VssCredentials());

// Create instance of WorkItemTrackingHttpClient using VssConnection
WorkItemTrackingHttpClient witClient = connection.GetClient<WorkItemTrackingHttpClient>();
List<QueryHierarchyItem> queryHierarchyItems = witClient.GetQueriesAsync(teamProjectName, depth: 2).Result;
foreach (QueryHierarchyItem qh in queryHierarchyItems)
{
    string s = qh.Name;
}

// Search for 'Special Queries' folder
QueryHierarchyItem myQueriesFolder = queryHierarchyItems.FirstOrDefault(qhi => qhi.Name.Equals("Special Queries"));
Here myQueriesFolder is always null. I tried using the foreach loop above and found it never gets into the second level of folders. So how can I accomplish my requirement, and what did I do wrong?
If you know the path to the query folder use:
var folder = witClient.GetQueryAsync(teamProject, path, depth: 1).Result;
Then you can access the queries in that folder using something like:
var queries = folder.Children.Where(x => !x.IsFolder.GetValueOrDefault());
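If you don't know the full path, one option is to walk the tree yourself. GetQueriesAsync only supports a depth of up to 2, so deeper levels have to be expanded folder by folder; a rough sketch (assuming the usual QueryHierarchyItem properties, untested):
// Recursively search the query hierarchy for a folder by name.
// GetQueriesAsync is capped at depth 2, so each folder's children
// are re-fetched with GetQueryAsync as needed.
static QueryHierarchyItem FindQueryFolder(
    WorkItemTrackingHttpClient client, string project,
    IEnumerable<QueryHierarchyItem> items, string folderName)
{
    foreach (var item in items)
    {
        if (!item.IsFolder.GetValueOrDefault())
            continue;
        if (item.Name.Equals(folderName, StringComparison.OrdinalIgnoreCase))
            return item;
        if (item.HasChildren.GetValueOrDefault())
        {
            // Children may not be populated at this depth; re-fetch the folder.
            var children = item.Children ?? client.GetQueryAsync(project, item.Path, depth: 1).Result.Children;
            var match = FindQueryFolder(client, project, children, folderName);
            if (match != null)
                return match;
        }
    }
    return null;
}
Called as FindQueryFolder(witClient, teamProjectName, queryHierarchyItems, "Special Queries"), this should locate the folder at any depth.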

Navigating to DNN Module

I'm forming a newsletter with links to various HTML modules within my DNN website. I have access to each of their ModuleIDs and I'm wanting to use them to get the URLs. The current approach (made by a third-party developer) worked, but only to a degree: the URLs are incorrectly formed when the modules are located deeper in the website.
For example, a module located at www.website.com/website/articles.aspx works fine, but a module located at www.website.com/website/articles/subarticles.aspx won't. I know this is because the URL is incorrectly formed.
Here's the current code:
DotNetNuke.Entities.Modules.ModuleController objModCtrlg = new DotNetNuke.Entities.Modules.ModuleController();
DotNetNuke.Entities.Modules.ModuleInfo dgfdgdg = objModCtrlg.GetModule(ContentMID);
TabController objtabctrll = new TabController();
TabInfo objtabinfoo = objtabctrll.GetTab(tabidfrcontent);
string tabnamefremail = objtabinfoo.TabName;
moduletitlefrEmail = dgfdgdg.ModuleTitle;
string readmorelinkpath = basePath + "/" + tabnamefremail + ".aspx";
ContentMID is the current module ID I'm looking at. I've tried to use Globals.NavigateURL, but that always crashes with an "Object reference not set to an instance of an object" error. Same thing when I use objtabinfoo.FullUrl, so I'm currently at a loss as to how to get the specific module's URL.
EDIT: Here's some more code as to how the tabId is retrieved.
IDictionary<int, TabInfo> dicTabInfo12 = new Dictionary<int, TabInfo>();
ContentMID = Convert.ToInt32(dsNewsList.Tables[0].Rows[i]["ModuleID"]);
dicTabInfo12 = objTabctrl.GetTabsByModuleID(ContentMID);
if (dicTabInfo12.Count > 0)
{
    string tester = ""; // Debug
    foreach (KeyValuePair<int, TabInfo> item1 in dicTabInfo12)
    {
        tabidfrcontent = item1.Key;
    }
}
You really should be using NavigateUrl to build the links, and if you have the tabid, you are golden.
string readMoreLinkPath = NavigateUrl(tabidfrcontent);
Nice and simple.
Okay, a colleague suggested this and it works great within a scheduler.
string linkPath = basePath + "/Default.aspx?TabID=" + tabID;
This will navigate you to the correct tab. So this would be the best solution if you're forced to work within a scheduler, where there is no HTTP request context (likely why NavigateUrl throws the null reference error) and you can't use it without some major workarounds.

Setting up 3DCart API with a C# App

I have been trying to create an application that goes through our database at a set interval and updates/adds any new items to 3DCart's database. Their code example uses SOAP in an XML file to send one request per call, so I need to be able to generate the XML with the item's information on the fly before sending it. I have done hardly anything with XML files like this and cannot figure out how to create the chunk of code I need and send it. One method that has been suggested is to create a file, but executing that has been a problem and would be very inefficient for a large number of items. Here is what I have so far:
sqlStatement = "SELECT * FROM products WHERE name = '" + Convert.ToString(reader.GetValue(0)) + "'";
ServiceReferenceCart.cartAPIAdvancedSoapClient bcsClient = new ServiceReferenceCart.cartAPIAdvancedSoapClient();
ServiceReferenceCart.runQueryResponse bcsResponse = new ServiceReferenceCart.runQueryResponse();
bcsClient.runQuery(storeUrl, userKey, sqlStatement, callBackURL);
string result = Convert.ToString(bcsResponse);
listBox1.Items.Add(result);
EDIT: Changed from the sample code block to my current code block, as I finally got a service reference set up. They provide no details, though, for using the functions in the reference. With this, bcsResponse is just blank; when I try adding .Body I get the same result, but when I add .runQuery to the .Body I get an "Object reference not set to an instance of an object." error. As I have said, I have not messed with service references before.
I hope I have explained this well enough; I just really have not worked with this kind of stuff before and it has become extremely frustrating.
Thank you in advance for any assistance.
I actually ended up figuring this out after playing around with it. Here is what I did to get the reference to work. This may be easy for anyone who has used these references before, but I have not, so I decided to post it in case anyone else has this problem. The SQL can be SELECT, ADD, UPDATE and DELETE statements; this one was to check whether the SKU was listed before updating/adding.
// Will be using these multiple times so variables make more sense.
// DO NOT include http:// in the url. Also, id is not shown in the
// database layout pdf they will give you, but it is the sku/product number.
string sqlStatement = "SELECT id FROM products WHERE id = '" + Convert.ToString(reader.GetValue(0)) + "'";
string userKey = "YourKeyHere";
string storeUrl = "YourStoresURLHere";

// Setting up instances from the 3DCart API
cartAPIAdvancedSoapClient bcsClient = new cartAPIAdvancedSoapClient();
runQueryRequest bcsRequest = new runQueryRequest();
runQueryResponse bcsResponse = new runQueryResponse();
runQueryResponseBody bcsRespBod = new runQueryResponseBody();
runQueryRequestBody bcsReqBod = new runQueryRequestBody();

// Assigning required values to the request body
bcsReqBod.storeUrl = storeUrl;
bcsReqBod.sqlStatement = sqlStatement;
bcsReqBod.userKey = userKey;

// Assigning the body to the request
bcsRequest.Body = bcsReqBod;

// Setting the response body to be the result
bcsRespBod.runQueryResult = bcsClient.runQuery(bcsReqBod.storeUrl, bcsReqBod.userKey, bcsReqBod.sqlStatement, bcsReqBod.callBackURL);
bcsResponse.Body = bcsRespBod;

// Adding the result to a string
string result = bcsResponse.Body.runQueryResult.Value;

// Displaying the string; this for me was more of a test
listBox1.Items.Add(result);
You will also need to activate the Advanced API on your shop. As you may notice, there is no actual option as the PDFs say; you need to go to their store and purchase it (it's free) and wait for them to activate it. This took about 2 hours for us.
