We have a build definition in Azure DevOps that creates a Docker image and pushes it to Azure Container Registry. The definition has a pipeline variable that has to be set at build time. I have to queue the build from my .NET code. I can get the definition, but I don't see how to update the pipeline variable.
VssBasicCredential credentials = new VssBasicCredential("",persAccToken);
VssConnection connection = new VssConnection(uri, credentials);
BuildHttpClient buildClient = connection.GetClient<BuildHttpClient>();
BuildDefinition def = buildClient.GetDefinitionAsync(projectName, definitionId).Result;
The pipeline variable is marked "settable at queue time", but I can't find a way to set it from my code.
BuildDefinition has a Variables property that contains the pipeline variables. A variable can be removed and then re-added with a new value, or updated in place:
BuildDefinitionVariable bdv = new BuildDefinitionVariable
{
    AllowOverride = true,
    IsSecret = false,
    Value = "new-value"
};
def.Variables.Remove("variable-name");
def.Variables.Add("variable-name", bdv);
buildClient.UpdateDefinitionAsync(def, projectName, def.Id);
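Once the definition is updated, the build can be queued from the same client. Alternatively, since the variable is marked "settable at queue time", the value can be passed for a single run via Build.Parameters without touching the definition. A minimal sketch based on my reading of the Build.WebApi client ("variable-name" / "new-value" are placeholders):

Build queued = buildClient.QueueBuildAsync(new Build
{
    Definition = new DefinitionReference { Id = def.Id },
    Project = def.Project,
    // Parameters is a JSON string of queue-time variable values.
    Parameters = "{\"variable-name\":\"new-value\"}"
}, projectName).Result;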
As a workaround, you can update a variable's value by emitting the following logging command from a PowerShell script:
"##vso[task.setvariable variable=testvar;]testvalue"
Check the Logging Commands documentation for more details.
You could then call this .ps1 from your .NET code.
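For example, a rough sketch of invoking such a script with System.Diagnostics.Process ("set-variable.ps1" is a hypothetical script name); the logging command only takes effect when the script runs inside the pipeline and its output reaches the build log:

using System;
using System.Diagnostics;

var psi = new ProcessStartInfo
{
    FileName = "powershell.exe",
    Arguments = "-NoProfile -ExecutionPolicy Bypass -File \"set-variable.ps1\"",
    RedirectStandardOutput = true,
    UseShellExecute = false
};

using (var process = Process.Start(psi))
{
    // Forward the script output (including the ##vso[...] line) to stdout
    // so the agent can pick up the logging command.
    Console.WriteLine(process.StandardOutput.ReadToEnd());
    process.WaitForExit();
}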
Hope this helps.
I am writing a C# binary cmdlet. In the end it should be installed on the machine so it can be used immediately when starting the ISE or the PS console, without calling Import-Module.
Now I want to provide some of my own machine-wide $env: variables, like $env:ProgramFiles for example. How can I do this?
Thanks!
EDIT:
For a more descriptive example, here is a code snippet:
using System;
using System.Management.Automation;

namespace InstallTools.UpdateEnvironment
{
    [Cmdlet(VerbsData.Update, "Environment")]
    [OutputType(typeof(UpdateEnvironment))]
    public class UpdateEnvironment : PSCmdlet
    {
        [Parameter(Position = 1,
            Mandatory = false,
            ValueFromPipeline = true,
            ValueFromPipelineByPropertyName = true,
            ParameterSetName = "ENVIRONMENT")]
        public SwitchParameter SetEnvironment { get; set; }

        protected override void ProcessRecord()
        {
            if (SetEnvironment.IsPresent)
            {
                // Sets the variables for the current process only.
                Environment.SetEnvironmentVariable("CDir", this.MyInvocation.PSScriptRoot);
                Environment.SetEnvironmentVariable("CurrentTime", DateTime.Now.ToString("yyyy.MM.dd HH:mm:ss"));
            }
        }
    }
}
If I call Update-Environment -SetEnvironment in PowerShell, everything runs. In my case, Environment.SetEnvironmentVariable("test", "testval") makes $env:test available at runtime.
However, I want the variables to be initialized automatically when the ISE is opened, without calling Update-Environment.
Thanks to all!
For immediate usage of your module when starting the ISE or the PS console, without typing Import-Module, you can modify the PSModulePath environment variable.
According to the documentation on Modifying the PSModulePath Installation Path:
The PSModulePath environment variable stores the paths to the locations of the modules that are installed on disk. PowerShell uses this variable to locate modules when the user does not specify the full path to a module. The paths in this variable are searched in the order in which they appear.
When PowerShell starts, PSModulePath is created as a system environment variable with the following default value: $HOME\Documents\PowerShell\Modules; $PSHOME\Modules on Windows, $HOME/.local/share/powershell/Modules : /usr/local/share/powershell/Modules on Linux or macOS, and $HOME\Documents\WindowsPowerShell\Modules; $PSHOME\Modules for Windows PowerShell.
This article also answers the second part of your question:
To add paths to the PSModulePath environment variable, use one of the following three methods:
1) To add a temporary value that is available only for the current session, run the following command at the command line:
$env:PSModulePath = $env:PSModulePath + "$([System.IO.Path]::PathSeparator)$MyModulePath"
2) To add a persistent value that is available whenever a session is opened, add the above command to a PowerShell profile file ($PROFILE):
$profile.AllUsersAllHosts
C:\Windows\System32\WindowsPowerShell\v1.0\profile.ps1
For more information about profiles, see about_Profiles.
3) To add a persistent variable to the registry, create a new user environment variable called PSModulePath using the Environment Variables Editor in the System Properties dialog box.
To add a persistent variable by using a script, use the .NET SetEnvironmentVariable method on the System.Environment class. For example, the following script adds the C:\Program Files\Fabrikam\Modules path to the value of the PSModulePath environment variable for the computer. To add the path to the user's PSModulePath environment variable, set the target to "User".
$CurrentValue = [Environment]::GetEnvironmentVariable("PSModulePath", "Machine")
[Environment]::SetEnvironmentVariable("PSModulePath", $CurrentValue + [System.IO.Path]::PathSeparator + "C:\Program Files\Fabrikam\Modules", "Machine")
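The same change can also be made from C#, e.g. from a binary cmdlet or an installer. A minimal sketch (modifying the Machine target requires an elevated process; use EnvironmentVariableTarget.User for the per-user value):

using System;
using System.IO;

// Append a module folder to the machine-wide PSModulePath.
string current = Environment.GetEnvironmentVariable("PSModulePath", EnvironmentVariableTarget.Machine);
string updated = current + Path.PathSeparator + @"C:\Program Files\Fabrikam\Modules";
Environment.SetEnvironmentVariable("PSModulePath", updated, EnvironmentVariableTarget.Machine);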
I'm using the Azure DevOps .NET SDK (v16.530.0-preview), and when I try to create a test run using CreateTestRunAsync (in TestManagementHttpClient) with a RunCreateModel object as the parameter, it looks like RunCreateModel's State property is read-only. The docs also confirm this.
TestApi.RunCreateModel run = new TestApi.RunCreateModel
{
    Name = testRun.Name
    //State = testRun.State (is read-only)
};
https://learn.microsoft.com/en-us/dotnet/api/microsoft.teamfoundation.testmanagement.webapi.runcreatemodel.state?view=azure-devops-dotnet-preview#Microsoft_TeamFoundation_TestManagement_WebApi_RunCreateModel_State
But the odd part is that when the run is created, the UI shows In Progress and then reports the test run as running for {x hours, minutes, sec}.
Is there a way I can avoid this and set the state to NotStarted?
You can define the state in the constructor:
var testRun = new RunCreateModel(name: "API Test Run", state: "NotStarted");
var results = testApi.CreateTestRunAsync(testRun, "Project").Result;
I was able to create a test run in the "Not Started" state with the code above.
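If the run later needs to leave the "Not Started" state, it looks like a RunUpdateModel can change it afterwards. A minimal sketch based on my reading of TestManagementHttpClient (the constructor parameter name and the UpdateTestRunAsync signature are assumptions, so verify against your SDK version):

// Hypothetical follow-up: move the run created above to In Progress.
var update = new RunUpdateModel(state: "InProgress");
var updatedRun = testApi.UpdateTestRunAsync(update, "Project", results.Id).Result;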
I installed the stable version of Microsoft.ML in my web application and followed this tutorial: https://dotnet.microsoft.com/learn/machinelearning-ai/ml-dotnet-get-started-tutorial#install
However, the line var model = pipeline.Fit(trainingDataView); neither throws an error nor continues to the next step. I also tried the same steps in a console app and got the same result.
My Code is:
var mlContext = new MLContext();
var reader = mlContext.Data.CreateTextReader<IrisData>(separatorChar: ',', hasHeader: true);
IDataView trainingDataView = reader.Read("C:/Users/HACKBAL/Documents/visual studio 2017/Projects/WebApplication1/WebApplication1/Data/Test.txt");

var pipeline = mlContext.Transforms.Conversion.MapValueToKey("Label")
    .Append(mlContext.Transforms.Concatenate("Features", "SepalLength", "SepalWidth", "PetalLength", "PetalWidth"))
    .Append(mlContext.MulticlassClassification.Trainers.StochasticDualCoordinateAscent(labelColumn: "Label", featureColumn: "Features"))
    .Append(mlContext.Transforms.Conversion.MapKeyToValue("PredictedLabel"));

var model = pipeline.Fit(trainingDataView);

var prediction = model.CreatePredictionEngine<IrisData, IrisPrediction>(mlContext).Predict(
    new IrisData()
    {
        SepalLength = 3.3f,
        SepalWidth = 1.6f,
        PetalLength = 0.2f,
        PetalWidth = 5.1f,
    });

Console.WriteLine($"Predicted flower type is: {prediction.PredictedLabels}");
I might be a bit late, but I found the cause and the solution.
After waiting for the .Fit() method to 'run' for about 30 minutes, it finally gave me the following error:
Apparently it's trying to download the ResNet model from a remote server. Why this is not documented, and why this is done in the Fit() method, is beyond me...
Anyways, the suggested solution is to download the meta file from here:
https://aka.ms/mlnet-resources/meta/resnet_v2_101_299.meta
Then copy the file to the following folder:
C:\Users\YourName\AppData\Local\Temp\MLNET\
Once the file is copied, run your app again, it should be working now.
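A small sanity check I find useful (my own sketch; the folder and file name are inferred from the path and URL above, not from ML.NET documentation):

using System;
using System.IO;

// ML.NET appears to look for the downloaded meta file under %TEMP%\MLNET.
string metaFile = Path.Combine(Path.GetTempPath(), "MLNET", "resnet_v2_101_299.meta");

if (!File.Exists(metaFile))
{
    Console.WriteLine($"Meta file not found at {metaFile}. " +
        "Download https://aka.ms/mlnet-resources/meta/resnet_v2_101_299.meta and copy it there.");
}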
Try using the .NET Core SDK, not the .NET Framework.
Download .NET SDKs for Visual Studio
Try my working sample on GitHub: Console.Example.ML.NET.
I'm trying to access the branch name in Cake on TeamCity running inside a Linux Docker container, but whenever I try to fetch any of the "Configuration Parameters", the values are returning nothing.
In my branch, the following build parameter values are visible on TeamCity:
Configuration parameters
vcsroot.branch: refs/heads/master
teamcity.build.branch: 5/merge
Environment variables
env.vcsroot.branch: 5/merge
The env.vcsroot.branch variable has a value of %teamcity.build.branch%.
My Cake script simply tries to print the values, and all of the ones below come back empty:
var branch = EnvironmentVariable("vcsroot.branch");
var tcbranch = EnvironmentVariable("teamcity.build.branch");
var agent = EnvironmentVariable("system.agent.name");
var confName = EnvironmentVariable("system.teamcity.buildConfName");
var buildId = EnvironmentVariable("teamcity.build.id");
var vcsRootBranch = EnvironmentVariable("vcsroot.Root_TemplatedVcsRoot1.branch");
var argOrEnv = ArgumentOrEnvironmentVariable("teamcity.build.branch", "vcsroot.branch", "Unfound");
Information($"vcsroot.branch = {branch}");
Information($"teamcity.build.branch = {tcbranch}");
Information($"system agent name = {agent}");
Information($"system TC build cof name= {confName}");
Information($"param buildId = {buildId}");
Information($"vcsroot template branch = {vcsRootBranch}");
Information($"test argument or env variables = {argOrEnv}");
The actual output:
[12:34:51][Step 1/2] vcsroot.branch =
[12:34:51][Step 1/2] teamcity.build.branch =
[12:34:51][Step 1/2] system agent name =
[12:34:51][Step 1/2] system TC build cof name=
[12:34:51][Step 1/2] param buildId =
[12:34:51][Step 1/2] vcsroot template branch =
[12:34:51][Step 1/2] test argument or env variables = Unfound
Oddly enough, on our non-docker Windows-based TeamCity agents, the values seem to return fine. I have a feeling I'm missing something here that's painfully simple, but I'm a relative novice when it comes to Cake, TeamCity, and Docker. Any help would be greatly appreciated. Thanks!
Edit: to clarify, most of the environment variables are coming back as expected; the only one I've noticed that doesn't is the one that references a configuration parameter.
For environment variables, TeamCity replaces non-alphanumeric characters with "_".
I.e. vcsroot.branch becomes vcsroot_branch.
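So from the Cake script, the value should be readable under the mangled name. A quick sketch (assuming the replacement rule above applies to env.vcsroot.branch inside the container):

var branchFromEnv = EnvironmentVariable("vcsroot_branch");
Information($"vcsroot.branch (read as vcsroot_branch) = {branchFromEnv}");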
I figured it out...
First off, I missed the subtext on the TC project parameters page for Configuration Parameters; it states: "Configuration parameters are not passed into build, can be used in references only."
Secondly, I didn't realize that none of the System Properties were visible (I don't know if that's an issue), but its subtext also states: "System properties will be passed into the build (without system. prefix), they are only supported by the build runners that understand the property notion."
Therefore, in order to get the Configuration Parameter value, I needed to create an Environment Variable using the Configuration Parameter as its value:
env.TCBranch = %teamcity.build.branch%
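The Cake script can then read the new variable by its plain name, since TeamCity exposes env.-prefixed parameters to the build process without the env. prefix (a short sketch):

var tcBranch = EnvironmentVariable("TCBranch");
Information($"TCBranch = {tcBranch}");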
It was a little unsettling that teamcity.build.branch didn't show up in the type-ahead when specifying the value, but it works.
This raises the question of why the environment variable env.vcsroot.branch didn't work; I presume it's because its name is identical to a different configuration parameter's name. Considering those parameters aren't passed into the build, I don't see why that should matter, but I can't think of any other reason. Anyway, thanks to @devlead for the suggestions (above).
I have an SSIS package with a script task, and I get the following error when I try to run it on my local system. It works fine for my colleagues as well as in production; however, I am not able to run it locally to test. I set a breakpoint in the Main method, but it is never reached: I get the error before execution enters Main.
I am using VS 2010 and .NET Framework 4.5.
The script task does compile. I get the following messages:
SSIS package "..\Test.dtsx" starting.
Error: 0x1 at Test: Exception has been thrown by the target of an invocation.
Task failed: Test
SSIS package "..\Test.dtsx" finished: Success.
The program '[2552] DtsDebugHost.exe: DTS' has exited with code 0 (0x0).
The following is the code:
public void Main()
{
    try
    {
        LogMessages("Update Bug package execution started at :: " + DateTime.Now.ToLongTimeString());
        LogMessages("Loading package configuration values to local variables.");

        strDBConn = Dts.Variables["User::DBConnection"] != null ? Dts.Variables["User::DBConnection"].Value.ToString() : string.Empty;
        strTPCUrl = Dts.Variables["User::TPCUrl"] != null ? Dts.Variables["User::TPCUrl"].Value.ToString() : string.Empty;

        TfsTeamProjectCollection objTPC = new TfsTeamProjectCollection(new Uri(strTPCUrl));
        WorkItemStore objWIS = new WorkItemStore(objTPC);
        WorkItemCollection objWIC = objWIS.Query("SELECT...");
        foreach (WorkItem wi in objWIC)
        {
        }
    }
    catch (Exception ex)
    {
        // The exception is swallowed here, so the real error never surfaces.
    }
}
When I comment out the code from TfsTeamProjectCollection objTPC = new TfsTeamProjectCollection(new Uri(strTPCUrl)); onwards, the script executes successfully. However, if I keep that line and comment out the rest, I get the exception.
I do have access to the URL.
I am using Microsoft.TeamFoundation.Client.dll and Microsoft.TeamFoundation.WorkItemTracking.Client.dll in my script task. However, the DLL version referenced in the package is 10.0, and the version in my GAC is 12.0. Would that cause a problem?
I had the same problem (i.e. the same error code, Error: 0x1 ...).
The issue was that some of the referenced libraries pointed to a missing folder.
Removing the references and adding them back from the correct path fixed the issue.
The Microsoft reference for the error code (https://msdn.microsoft.com/en-us/library/ms345164.aspx) is very generic and doesn't help much.
However, from reading other articles it is quite likely that the code simply indicates an unknown reason for the Script Task failing to run.
Hexadecimal code: 0x1
Decimal Code: 1
Symbolic Name: DTS_MSG_CATEGORY_SERVICE_CONTROL
Description: Incorrect function.
I got this error message when I referred to a passed SSIS variable via Dts.Variables["User::xxxx"].Value, where xxxx did not exist and was not passed from the calling program. It was a simple Console.WriteLine referring to a passed variable that didn't exist.
I fixed this error by changing the TargetServerVersion of the SSIS Project.
Integration Services Project Property Pages
This is just a different situation and not intended to be the be-all and end-all solution for everyone.
When I was installing my DLLs into the GAC I forgot to run my script as Administrator and the script ran silently without error as though it was working.
I felt really dumb when I realized that's what I did wrong. Hopefully this can help prevent other people from wasting time on something so silly.
For reference, this is what I use to install my DLLs into the GAC; I have now modified it to tell me when it is not running as Administrator:
#https://superuser.com/questions/749243/detect-if-powershell-is-running-as-administrator
$isAdmin = ([Security.Principal.WindowsPrincipal] `
[Security.Principal.WindowsIdentity]::GetCurrent() `
).IsInRole([Security.Principal.WindowsBuiltInRole]::Administrator)
if($isAdmin -eq $false)
{
Write-Host "You have to run this script as Administrator or it just won't work!" -ForegroundColor Red
return;
}
$strDllPath = "C:\PathToYourDllsHere\"
#Note that you should be running PowerShell as an Administrator
[System.Reflection.Assembly]::Load("System.EnterpriseServices, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a")
$publish = New-Object System.EnterpriseServices.Internal.Publish
$arr = @(
    "YourDLL01.dll",
    "YourDLL02.dll",
    "YourDLL03.dll"
)
get-date
foreach($d in $arr)
{
    $p = ($strDllPath + $d);
    $p
    $publish.GacInstall($p);
}
#If installing into the GAC on a server hosting web applications in IIS, you need to restart IIS for the applications to pick up the change.
#Uncomment the next line if necessary...
#iisreset
Credit for how to determine if your PowerShell script is running in Admin mode or not:
https://superuser.com/questions/749243/detect-if-powershell-is-running-as-administrator
In my case it was missing DLLs or not having the correct version installed on the server.
Locally all tests were fine, but on the server the error message "Runtime error: Exception has been thrown by the target of an invocation" kept popping up.
No exception handler can catch that error: the code in the script task is not executed at all once it needs a DLL that is not in the assembly cache on the server (the same can happen on a local machine, with the same error).
The difficulty here is finding out what is missing, and then either updating the references to the correct version or installing the missing DLL in the assembly cache with gacutil. The way I approached the debugging was to remove parts of the code in the script task until the error no longer appeared, then analyze the removed part for its references.
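An alternative to trimming code until the error disappears (a sketch I would suggest, not part of the original package): replace the empty catch with one that writes the full exception through Dts.Events.FireError, so the inner assembly-load or TFS error text shows up directly in the SSIS output:

public void Main()
{
    try
    {
        // ... existing script task logic ...
    }
    catch (Exception ex)
    {
        // Surface the complete exception, including inner exceptions, in the SSIS log.
        Dts.Events.FireError(0, "Script task", ex.ToString(), string.Empty, 0);
        Dts.TaskResult = (int)ScriptResults.Failure;
        return;
    }

    Dts.TaskResult = (int)ScriptResults.Success;
}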
After not having any luck with the other answers here, I finally found that in my package, the problem was that I had created a new variable but not carried its name across into my new copy of the C# script. The variables were the ones to be used as connection string expressions. So it was ultimately a matter of changing:
Dts.Variables["Exists"].Value = File.Exists(Dts.Variables["OldSSISPackageVariableName"].Value.ToString());
to:
Dts.Variables["Exists"].Value = File.Exists(Dts.Variables["NewSSISPackageVariableName"].Value.ToString());
Once I kept them in sync, it worked fine.
It's fixed: I added a reference to the 12.0.0 version of the DLL and changed the Target Framework to .NET Framework 4.5.
My problem was that, in the script task, I tried to fetch data like this:
public void Main()
{
    using (var connection = Dts.Connections["localhost.Test"].AcquireConnection(Dts.Transaction) as SqlConnection)
    {
        connection.Open();
        var command = new SqlCommand("select * from Table;", connection);
        var reader = command.ExecuteReader();
        while (reader.Read())
        {
            MessageBox.Show($"{reader[0]} {reader[1]} {reader[2]}");
        }
    }

    Dts.TaskResult = (int)ScriptResults.Success;
}
However, my connection is of the type OLE DB, and therefore I needed to connect to it like this instead:
public void Main()
{
    var connectionString = Dts.Connections["localhost.Test"].ConnectionString;
    using (var connection = new OleDbConnection(connectionString))
    {
        connection.Open();
        var command = new OleDbCommand("select * from Table;", connection);
        var reader = command.ExecuteReader();
        while (reader.Read())
        {
            MessageBox.Show($"{reader[0]} {reader[1]} {reader[2]}");
        }
    }

    Dts.TaskResult = (int)ScriptResults.Success;
}
Notice that I'm using OleDbConnection here.
For me, what fixed the issue was updating the string parameter I passed to the script: it was missing a trailing "\" at the end of the path (i.e. "e:\arcive" needed a "\" added at the end).