How do I access a newly created custom pipeline variable inside an executable written in C#, so that I can assign a value inside the program in an Azure DevOps build pipeline?
Let me explain my query in detail:
I have a build pipeline created in Azure DevOps for my project. If I select the EDIT option on this pipeline, I see a tab or section at the top titled Variables; there I selected Pipeline variables, clicked ADD to create my own custom variable, let's say VersionNumber, and checked the "Settable at queue time" option.
Alongside this, I also have a C#-based console application which accepts command line arguments and does some processing.
I created an additional Command Line task in the build pipeline and gave it the path to my executable so that the pipeline can pick it up and execute it.
Now, here is my query or issue:
How do I pass the custom variable created above as a command line argument to my exe, so that I can assign the value inside my program and use it in further tasks in the pipeline?
If this is not possible then:
How can I return values from my exe such that, when it is executed as part of the pipeline, those values can be used in the next tasks of the build pipeline?
Please note that the logic in the exe is written in C# inside the Main method itself, as there are only a few lines of code that execute and retrieve the value.
How to achieve or tackle the above issue/query?
Can you please help me with this? Please try to explain it as a detailed step-by-step guide, as I am a bit new to coding, C#, and Azure DevOps.
I tried to pass the command line argument as $VersionNumber, but that was not useful; apart from this I did not find anything else to try.
In the Command Line task you pass the variable in this way (under the Arguments field):
$(VersionNumber)
In your C# code you read the variable like this (inside your Main(string[] args) method):
var versionNumber = args[0];
To set a new variable that will be available in the next steps, you do it like this:
Console.WriteLine("##vso[task.setvariable variable=newVar;]myValue")
I have very big SSIS processes that contain a lot of SQL Script Tasks.
I want to export the content of all the SQL Script Tasks to one file
using C# code.
I looked for code but didn't find any.
According to KeithL's comment:
You can open the project's file with the .dtsx extension
and take all the elements named SQLTask:SqlTaskData; the statement is in the attribute SQLTask:SqlStatementSource.
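A minimal C# sketch of that approach, assuming the usual .dtsx layout (the namespace and attribute names are taken from typical packages; verify them against your own file):

using System;
using System.IO;
using System.Linq;
using System.Xml.Linq;

class ExportSqlStatements
{
    static void Main(string[] args)
    {
        // Example paths only; adjust to your project.
        var packagePath = args[0];   // path to the .dtsx file
        var outputPath = args[1];    // file that will receive all statements

        // Namespace used by Execute SQL Task data in a .dtsx package.
        XNamespace sqlTask = "www.microsoft.com/sqlserver/dts/tasks/sqltask";

        var doc = XDocument.Load(packagePath);

        // Grab the SQL statement stored on every SqlTaskData element.
        var statements = doc.Descendants(sqlTask + "SqlTaskData")
            .Select(e => (string)e.Attribute(sqlTask + "SqlStatementSource"))
            .Where(s => !string.IsNullOrWhiteSpace(s));

        File.WriteAllText(outputPath,
            string.Join(Environment.NewLine + "GO" + Environment.NewLine, statements));
    }
}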
I want to use some shared classes on my Azure Functions to not duplicate code.
I have tried to create an empty C# function, create the classes inside that function, and then import them into the other functions with:
#r "../Shared/Class.cs"
First, put your shared code inside a folder in the root of your Function App directory (e.g. "Shared"). Let's say I put a shared Message.csx class in that folder (e.g. full path D:\home\site\wwwroot\Shared\Message.csx).
To include this into your function use the #load command:
#load "..\Shared\Message.csx"
using System;
using Microsoft.Azure.WebJobs.Host;
public static void Run(Message message, TraceWriter log)
{
log.Info($"C# Queue trigger function processed message: {message.Id}");
}
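For completeness, a minimal sketch of what Shared\Message.csx could contain (the Id property is assumed only because the log statement above uses it):

// D:\home\site\wwwroot\Shared\Message.csx
public class Message
{
    public string Id { get; set; }
}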
See the help page here for more information. By default, the files in that directory won't be tracked for changes. If you want to ensure that when files in that directory change your functions will pick up the changes and be recompiled, you can add your "Shared" directory to the watchDirectories list in host.json. E.g.:
{
"watchDirectories": [ "Shared" ]
}
Due to the rate of change within Azure Functions, this is no longer the recommended approach for C# functions (see the Azure Functions Tools Roadmap). Refer to the following blog posts for the most modern and efficient patterns for structuring a C# project in Visual Studio, giving you all the advantages of shared DLLs the way you normally get them in C#.
https://azure.github.io/AppService/2017/03/16/Publishing-a-.NET-class-library-as-a-Function-App.html
https://github.com/devkimchi/Precompiled-Azure-Functions-Revisited
Let me answer this question in a more human-understandable way, taking into account that Azure Functions are new and don't have proper documentation yet.
Let's go step by step.
You need to go to the Azure function "Platform features" section.
Then navigate to Development tools -> Advanced tools:
Next, navigate to Tools -> Zip Push Deploy:
Next, create a folder called "Shared" inside the root folder, as recommended in the Microsoft documentation:
Inside this folder you can also create additional folders and classes if you want.
For example, if you want to reuse model classes between Azure Functions, create an additional folder called "Models" and put your desired class there.
After creating the *.csx file, you can edit it, put your code there, and save:
Then reuse the class inside your Azure function by loading it with #load:
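For example, if the shared class lives in Shared\Models\MyModel.csx (the names here are assumed for illustration), the function's run.csx would start like this:

#load "..\Shared\Models\MyModel.csx"

using System;
using Microsoft.Azure.WebJobs.Host;

public static void Run(string myQueueItem, TraceWriter log)
{
    // MyModel comes from the shared .csx file loaded above.
    var model = new MyModel();
    log.Info($"Processing {myQueueItem} with {model.GetType().Name}");
}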
Note:
Yet another way is to use Visual Studio with Azure DevOps CI/CD. There it will be much more straightforward and easier to do the same.
When working from Visual Studio and looking for a way to share some C# script files between functions within your Function App like this:
#load "..\Shared\MySharedCode.csx"
Please be aware that you should set the 'Copy to Output Directory' property for the files in your shared folder to 'Copy always'.
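In an SDK-style project file, that setting roughly corresponds to an entry like the following (the file name is just an example; in older project formats the item uses Include instead of Update):

<ItemGroup>
  <None Update="Shared\MySharedCode.csx">
    <CopyToOutputDirectory>Always</CopyToOutputDirectory>
  </None>
</ItemGroup>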
I have an application which uses an Oracle database, so installing the application requires running some Oracle command script files to create the database and perform some DDL operations.
I was trying to prepare an installation wizard as a C# Windows Forms application. This wizard needs to run these commands. My questions are: is a C# Windows Forms app a good choice for this? And how do I do it, i.e. how do I run Oracle command script files from inside the application? I need exactly a function that takes the file path as an input parameter and executes the commands within the script file...
Thanks in advance
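For illustration only, a minimal sketch of the kind of function described, assuming the Oracle.ManagedDataAccess driver and a script made of plain SQL/DDL statements separated by semicolons (PL/SQL blocks and SQL*Plus-specific commands would need more careful parsing):

using System;
using System.IO;
using Oracle.ManagedDataAccess.Client;  // NuGet package: Oracle.ManagedDataAccess (assumed driver)

public static class ScriptRunner
{
    // Reads an Oracle script file and executes each statement in turn.
    // Naively splits on ';', which works for simple DDL/DML scripts only.
    public static void RunScript(string connectionString, string scriptPath)
    {
        var scriptText = File.ReadAllText(scriptPath);
        var statements = scriptText.Split(new[] { ';' }, StringSplitOptions.RemoveEmptyEntries);

        using (var connection = new OracleConnection(connectionString))
        {
            connection.Open();
            foreach (var statement in statements)
            {
                var sql = statement.Trim();
                if (sql.Length == 0) continue;

                using (var command = new OracleCommand(sql, connection))
                {
                    command.ExecuteNonQuery();
                }
            }
        }
    }
}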
I have about 10 different CSV files that I need to parse, transform, and load into SQL Server.
I don't want to create 10 different SSIS packages for this. Would it be possible to create a single SSIS package that I could create 10 different instances of, and simply alter a config file so that each one pulls from the correct source CSV file and saves to the correct table?
Is it also possible to create reusable pieces of code that would do things like email a report of what was loaded, what had errors, and what the errors were, etc.? There are other parts that I am sure I could reuse between the 10 different files that need to be parsed.
Also, is modifying and deploying an SSIS package painful?
This question is the same as the one below. I got a solution there and it really helped me. I hope it will help you too.
Best Way to ETL Multiple Different Excel Sheets Into SQL Server 2008 Using SSIS