I am using Azure Functions backed by a SQL database using the Entity Framework.
To ensure code correctness, I would like to run fully automated component tests on my local machine.
Injecting a connection string that points to a local database solves the SQL Server issue, but I have yet to find a way to spin up a local instance of Azure Functions programmatically. Since Visual Studio exposes this behavior through its "Debug" functionality, it should theoretically be possible.
How do I programmatically spin up a local version of Azure Functions from a project located in the same solution as my testing project?
VS uses the Azure Functions Core Tools for running your functions locally.
You can run it yourself without VS in the picture. By default, VS downloads the Core Tools to C:\Users\<Name>\AppData\Local\Azure.Functions.Cli\<version>\func.exe
You can run func.exe on its own to see the help, but to launch the host the same way VS does, just cd into the output folder of your project (e.g. bin\Debug\net461\) and run func host start
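As a minimal sketch of driving this from a test setup script (the Core Tools version and the paths below are placeholders for whatever is on your machine):

    # Hypothetical paths: adjust the Core Tools version and project output folder.
    $func = "$env:LOCALAPPDATA\Azure.Functions.Cli\1.0.12\func.exe"
    $outputDir = 'C:\src\MySolution\MyFunctions\bin\Debug\net461'

    # Start the Functions host in the background so tests can run against it.
    $proc = Start-Process -FilePath $func -ArgumentList 'host start' -WorkingDirectory $outputDir -PassThru

    # ... run the component tests against http://localhost:7071 here ...

    # Tear the host down once the tests finish.
    Stop-Process -Id $proc.Id

The same start/stop pair can be wrapped in your test framework's fixture setup and teardown.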
More docs are here: Code and test Azure Functions locally
Related
I have several Azure solutions with multiple projects inside; say one solution is named Customer. Inside are web jobs and APIs like:
CustomerHandler (web job)
CustomerApi (api)
This is written in C# and runs on .NET, and every time I commit my changes to git, the code goes through a pipeline in Jenkins and then to the Azure portal.
I have read in many places that I need to publish my project from Visual Studio to be able to attach a debugger and debug the web job running in Azure.
And that is the problem: I want to keep the flow that I have via git and Jenkins. I don't want to publish the project from Visual Studio straight to Azure, bypassing the proper process.
So the question is: how can I remote debug my CustomerHandler web job that is live and running on Azure?
I do of course have the code locally and can run the debugger locally, but sometimes the web job works locally and not in Azure. It would be great to be able to debug in the cloud directly from my local Visual Studio; I have Visual Studio 2022.
Maybe one solution, though I don't know how to do it, is to point my local version of CustomerHandler at the Azure CustomerHandler, or somehow connect the two?
I don't know; I'm at a loss here.
But there must be a way to just attach to the Azure web job without publishing my local code directly to Azure and possibly breaking the working web job running there.
My company has recently switched from automatic migrations on our Entity Framework databases to now relying on a command being run to migrate and afterwards seed the database.
We want this to run on the database of each environment being deployed to via AppVeyor, so that when we push to a specified branch, AppVeyor builds, deploys, and then runs the migrate and seed commands on each environment after the deploy completes.
We usually place all build and deploy configuration in a YAML file, but there doesn't seem to be any way to run commands after deploy on the environment itself: the after_deploy YAML step runs commands as part of the build process, not on the environment.
The AppVeyor environment has an "After deployment command" setting that seems to be the key, but it requires the "runCommand" provider to be allowed on the server side. The reason for this is understandable, but how do I go about setting it up?
The guides I've found haven't really helped: nothing applicable on SO, and the Microsoft link that AppVeyor prints as an error during the deploy process is no longer accurate.
Because environment deployments run on shared worker servers, custom scripting is not allowed. But you could try a deployment project approach: this divides your project into a "main project" and a "deployment project", thus simulating a deployment environment and allowing you to decouple builds and deployments.
Assuming you are deploying to Azure, you could then use this script my colleague wrote to run your commands on the server.
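I can't speak for that exact script, but as a hedged sketch of the general approach, Kudu's command API can run a command on the deployed App Service (site name, credentials, and the migrate command below are all placeholders):

    # Hypothetical values: replace with your site, deployment credentials, and command.
    $siteName = 'my-app'
    $user = '$my-app'        # deployment username from the publish profile
    $pass = 'xxxxxxxx'       # deployment password from the publish profile
    $auth = [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes("$($user):$pass"))

    $body = @{ command = 'Migrate.exe'; dir = 'site\wwwroot\bin' } | ConvertTo-Json

    # Kudu's command endpoint executes the command on the deployed site.
    Invoke-RestMethod -Uri "https://$siteName.scm.azurewebsites.net/api/command" `
        -Method Post -Headers @{ Authorization = "Basic $auth" } `
        -ContentType 'application/json' -Body $body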
If you are not deploying to Azure, you could switch to the AppVeyor Deployment Agent, which uses Web Deploy behind the scenes.
I know you can publish a Service Fabric application written in C# using Visual Studio, and I have read this article on using TFS or VSTS to set up continuous integration DevOps builds of a Service Fabric application.
How can I just do this all manually using PowerShell? I know I can do the following using PowerShell from this article on deployment:
Use Visual Studio to package the project.
Transfer the package to a remote server.
Use the PowerShell script examples in the article to deploy the package while I am in the context of the remote server.
Instead, here are the two bits I can't seem to figure out, which would let me do all of this from PowerShell:
Using PowerShell, how can I package my Service Fabric project the same way you can when you are in the context of Visual Studio?
Using PowerShell, how can I remotely deploy my Service Fabric project the same way you can when you are in the context of Visual Studio?
To generate the package through the command line, you can call the "Package" target on the sfproj file.
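For example (the project path and configuration are placeholders):

    # Call the Package target on the Service Fabric application project.
    msbuild .\MyApp\MyApp.sfproj /t:Package /p:Configuration=Release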
See my answer on how to create a deployment package for Service Fabric that includes all artifacts necessary to run the designed workflows at runtime.
Then follow the instructions from https://azure.microsoft.com/en-us/documentation/articles/service-fabric-deploy-remove-applications/ as blackSphere suggested.
If you haven't seen it, take a look at the Deploy and Remove Packages using PowerShell article.
Suppose you have a folder named MyApplicationType that contains the necessary application manifest, service manifests, and code/config/data packages. The Copy-ServiceFabricApplicationPackage command uploads the package to the cluster image store.
That takes a directory and uploads it. Then you have to tell the cluster to take that image and use it for an application.
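A sketch of that upload, assuming a cluster endpoint and the default image store connection string (all values are placeholders):

    # Connect to the cluster first (endpoint is hypothetical).
    Connect-ServiceFabricCluster -ConnectionEndpoint 'mycluster:19000'

    # Upload the package folder to the cluster image store.
    Copy-ServiceFabricApplicationPackage -ApplicationPackagePath '.\MyApplicationType' `
        -ImageStoreConnectionString 'fabric:ImageStore' `
        -ApplicationPackagePathInImageStore 'MyApplicationType'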
The Register-ServiceFabricApplicationType command returns only after the system has successfully copied the application package. How long this takes depends on the contents of the application package. If needed, the -TimeoutSec parameter can be used to supply a longer timeout.
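For example:

    # Register the uploaded package; supply a longer timeout for large packages if needed.
    Register-ServiceFabricApplicationType -ApplicationPathInImageStore 'MyApplicationType' -TimeoutSec 600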
After you register it, you can create the application:
You can instantiate an application from any application type version that has been registered successfully, by using the New-ServiceFabricApplication command.
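For example (the application name, type name, and version are placeholders matching whatever is in your application manifest):

    # Create an application instance from the registered type.
    New-ServiceFabricApplication -ApplicationName 'fabric:/MyApp' `
        -ApplicationTypeName 'MyApplicationType' `
        -ApplicationTypeVersion '1.0.0'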
I am completely new to Azure and PowerShell but have been tasked with setting up a build and deploy solution for several app services.
We currently have a build server (Azure VM) that is running CruiseControl.NET to build and test some C# .NET solutions that should be deployed in Azure.
This build server currently handles the following tasks:
Pulling code from source control when commits happen
Building the projects
Running some unit test cases
Copying output/binaries to an output location
However, as it exists now, developers of each of our app services need to 'Publish' their services manually from their development machines by clicking the button in Visual Studio once they have verified that the build and test cases have passed in the test environment on the server.
As I am hoping for a completely automated solution, I expect I need to use something like PowerShell or the Azure Cross-Platform CLI (installed via npm) to do this?
I'm extremely confused by the Azure Service Management vs. Azure Resource Management versions in the new Azure PowerShell 1.0. All of our services appear to be the newer Resource Manager kind, not 'classic'.
The eventual goal is to have the build server do the following
Pulling code from source control when commits happen
Building the projects
Running some unit test cases
Copying output/binaries to an output location
If the build and test cases are successful, update the service in Azure to the latest build
I am hoping there is a way to set up these projects, or take the existing binaries that result from the builds, and have them deployed into Web Apps using the new Azure Resource Manager PowerShell features.
Any advice or resources for more information about how this can be done?
Hopefully this makes some sense. Please let me know if I am going about this completely the wrong way or direct me to a more correct forum.
Thanks!
Have you considered using Azure App Service? You can get that build infrastructure for free, e.g. https://azure.microsoft.com/en-us/documentation/articles/web-sites-publish-source-control/
Once you set up continuous deployment, you will get the three steps below whenever there is a push event (if you are using git):
Pulling code from source control when commits happen
Building the projects
Copying output/binaries to an output location
and to "Running some unit test cases", you can create your own batch or powershell script with post deployment hook https://github.com/projectkudu/kudu/wiki/Post-Deployment-Action-Hooks
I have been tasked with updating the company's outdated build process, which is all done in batch and Perl scripts. The current build process is:
Schedule a build through a web interface.
Build server takes the build process off the queue.
Build server checks out all of the files from the TFS source control.
Build server runs a couple of code injection scripts that modify the source before the build.
Build server updates versions and signs the code.
Build server uses visual studio to compile the projects.
After that is finished the build server zips up the output and drops it in a network share location.
The really difficult part is the code-injection scripts: three Perl scripts that modify a lot of code. They were also designed in a very machine-dependent way, so I can't just drop them into the new build process without a lot of modification.
My end goal is to be able to run the build process on local dev machines and also have CI running on the TFS server.
In my searching, it seems there is no way to emulate a TFS build on a local machine. So is my only option to use pre-/post-build command-line scripts in my .csproj files? Or is there a better way to run complex builds on the local machine and the same builds on the TFS server?
I have seen Using TFS build definitions on a local machine, but that seems a bit hacky to me. I guess it wouldn't be a horrible solution if there isn't a better one.
I have tried to do something similar in the past myself. Unfortunately, there isn't a good way to go about it because of everything the TFS Build Workflow requires. What I found is that there are basically two ways to go about it:
Create an MSBuild script that will run on both the server and local machines.
Create an MSBuild script for local builds and a Custom Activity for the server.
If you are intent on, or have a requirement for, reproducing the build exactly on both the developer machine and the build server, then I would opt for #1; otherwise I would go with #2. The second option is nice because you can stay within the TFS workflow for the main builds, which provides many of the objects you would need and gives you a nice place to configure settings without having to check files out and in to change how the build occurs.
For either method, you would most likely have to modify your Perl scripts to take parameters that account for any customization between systems. You can then have the user pass these in, default them in the MSBuild script for local builds, and set them up as parameters in the TFS Build Workflow, so they can be modified easily when needed. Regardless of the method, the only good way to do this is to standardize how things are set up on the developer machines and the build server, so you don't have to provide as much customization.
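As an illustration of that parameter plumbing (the property names, paths, and defaults here are all hypothetical), a local wrapper might look like this, with the shared MSBuild script forwarding the same properties to the Perl scripts via an Exec task:

    # local-build.ps1: defaults are placeholders, override per machine as needed.
    param(
        [string]$PerlScriptDir = 'C:\build\scripts',
        [string]$Configuration = 'Release'
    )

    # Pass machine-specific values in as MSBuild properties; the TFS Build
    # Workflow would set the same properties through its own parameters.
    msbuild .\Build.proj /t:Build `
        /p:PerlScriptDir=$PerlScriptDir `
        /p:Configuration=$Configuration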
If you do opt for the first option, you can use the legacy build configuration for TFS Build, which supports using an MSBuild script for everything, and share that script between the developers and the build server; but if someone accidentally changes the script, it does run the risk of breaking the build.