Running an executable independently and simultaneously with "dotnet watch run" - c#

I have an ASP.NET Core 5 project whose main function runs a typical ASP.NET web host. Typically I start it with dotnet watch run and the server runs continuously in the background. But I'm also trying to add a way to execute console commands against the same executable, which reuse the code from the server version but don't start the actual WebHost from ASP.NET. The command-line version should be able to run either while the server version is already running, or entirely standalone.
The problem is that I'm quite obviously doing something dotnet run wasn't meant for, because the build is extremely flaky and fails quite often with unhelpful errors (e.g. file access errors). It does make sense that running dotnet run again while it's still running in the background isn't a supported case, so I'm trying to find a different way to do this.
The whole thing runs inside Docker, though this probably doesn't change anything for my specific problem. I'm using VS Code, so Visual Studio specific solutions would not work for me, it must run with the dotnet CLI alone.
What I want is to have dotnet watch run going in the background during development, plus a way to just execute the binary it built separately, without triggering a simultaneous build that leads to these problems. As far as I can tell this should avoid the issues I've been running into; if I'm wrong and running simultaneous dotnet run or dotnet watch run shouldn't cause build issues, please correct me. I assume that a separate dotnet build would run into the same issues, and I couldn't see anything in the dotnet CLI documentation that fits my use case.
How can I safely run my project in development a second time while it is still running via dotnet watch run in the background?

A quick workaround is to use different build configurations (e.g. dotnet watch run -c Debug and dotnet run -c Release). If you haven't overridden the default output directory (./bin/<configuration>/<framework>/), you will effectively be using two separately compiled copies of the application, avoiding those 'flaky builds'.
You could also create your own build configuration for this purpose (e.g. dotnet run -c My_Debug) and give it 'debug' properties, so you don't have to use the default Release configuration.
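As a rough sketch of the two-terminal workflow (the configuration names are just the defaults mentioned above):
Terminal 1, the long-running dev server:
dotnet watch run -c Debug
Terminal 2, a one-off run of the console command against a separately built output:
dotnet run -c Release
Since each configuration builds into its own ./bin/<configuration>/<framework>/ folder, the two invocations shouldn't fight over the same output files.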

Related

Is it possible to debug a C# console application using git bash?

I am using Visual Studio Community 2022, and I was wondering if you could use another CLI to run/debug a c# console application. (I can't use cmd.exe or powershell because it's blocked by the admins of the device I'm using). Git Bash would be preferred, because it's already installed. Thank you!
Edit: Looks like you can't do this. dotnet run just tries to open the blocked binary, no matter where you run it. I think the admins hate everyone though because they allow you to install lots of things (like Unity, vscode, etc.) but running anything just doesn't work.
Thanks for trying everyone!
Use this to run your program from the CLI, but you cannot debug it this way; if you want to debug, use the debugger in VS 2022.
If it's a framework-dependent application (the default), you run it with dotnet yourapp.dll.
Run the project in the current directory:
dotnet run
Check the Microsoft Docs for dotnet run for more options.
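For example, from Git Bash in the project folder (the path, target framework and app name below are hypothetical):
cd /c/Users/you/source/repos/MyConsoleApp
dotnet run
or, to run the already-built framework-dependent output directly:
dotnet bin/Debug/net6.0/MyConsoleApp.dll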
CLI: Use the VS 2022 terminal (View -> Terminal); it may help you to run it.

Unable to extract .NET Single File Application in Azure DevOps release pipeline

I created a single-file application with C#. I have this executable both checked into a repository and pushed as an artifact. The former is used in build pipelines, where I simply add the repo so I can use it. The latter is used as an artifact for a release pipeline.
During a build pipeline I can simply use it; I've tested it and it works. Not really important, but the application does two things: it sends out mails and updates work items.
When using that exact same application in a release pipeline I get the following error:
Failure extracting contents of the application bundle.
I/O failure when writing extracted files.
I don't know for sure, but could this have something to do with the release pipeline's agent running in a containerized environment? The reason I ask is that someone else had a similar issue using this approach on AWS with containers; see this Reddit link
[UPDATE]
The release pipeline was running on a self-hosted Azure DevOps agent. The environment it is installed on has neither the .NET 5 runtime nor the SDK installed. But I expected the single-file application to contain the runtime as well, or am I wrong?
I published my application as a simple folder publish, then put all the files from the publish folder into the Azure DevOps repository.
Next I pushed all those files as an artifact with a build pipeline, had to install the .NET 5 runtime on the environment (as the release pipeline is running on a self-hosted agent), and then I was able to run the application.
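For reference, if the goal is for the single-file executable to also carry the runtime with it (so the self-hosted agent doesn't need .NET 5 installed), a publish along these lines would be needed; the runtime identifier here is an assumption:
dotnet publish -c Release -r win-x64 -p:PublishSingleFile=true --self-contained true
A plain folder publish without --self-contained produces a framework-dependent build, which matches having to install the runtime on the agent.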

Dotnet build behaves differently when executed from TeamCity

We're seeing a Lucene.NET analyzer build warning that shouldn't be there when executing dotnet build for one of our ASP.NET Core 3.1 web apps. The strange thing is that this warning seems to happen only when executed with the .NET runner in TeamCity; it (correctly) doesn't happen when running the dotnet build command directly from a PowerShell runner, or from the PowerShell console on the same machine (with the same user account, same folder, same code).
The build warning is actually due to a bug (that was fixed in a later Lucene.NET version) but the point is that we see different behavior.
We have tried running the exact same command that's displayed in the Build Log, including passing the TeamCity-generated RSP file to it. Still, outside of the .NET runner we can't reproduce the warning. We confirmed that the same user account runs the process, uses the same NuGet cache and dotnet.exe (and .NET SDK), runs in the same folder for the same code, with the same parameters.
The only thing we can think of is that somehow the .NET Plugin executes dotnet build differently but in a way that's invisible from the Build Log. However, we are out of ideas what else to try or investigate.
Could you help pinpoint where the difference in the two builds could be? Thank you in advance!
I've previously asked this on the TeamCity forum but didn't get a reply.
We ended up just running dotnet build directly, not using the .NET Runner. It has also cut certain builds' times down drastically (40-50% where there were a lot of build warnings).
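For reference, a plain command-line/PowerShell build step of roughly this shape can stand in for the .NET runner (the solution name and configuration here are placeholders):
dotnet build MySolution.sln -c Release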

How to publish my WPF solution with Rider?

I cannot seem to deploy my WPF app using the Rider IDE. Most support answers tell me to use the "Publish" button in VS, which of course doesn't really help me.
I do produce an executable with the .NET Project and .NET Executable build configurations, but it doesn't run on any other machine: it closes immediately without printing any error messages (even when launched from PowerShell).
N.B. the app launches fine from my own machine, and from anywhere I choose to move it to.
I've tried to set Edit Configurations... > Runtime arguments: to
dotnet publish -c Release -r win10-x64
as suggested in this post, but that wasn't enough.
Running dotnet publish -c Release -r win10-x64 .\my_app.csproj from PowerShell returns
error MSB4216: Impossible to execute the task "GenerateResource"
[...]
Failed to connect to "CLR4" runtime and the "x86" architecture.
Make sure that
(1) The necessary runtime and/or architecture is present on the machine
(2) "C:\Program Files\dotnet\sdk\<version>\MSBuild.exe" exists and
has permissions to execute.
Now, MSBuild.exe is missing from that folder, but there is an MSBuild.dll there.
Am I really missing a fool-proof easy way of publishing a C# WPF solution with Rider?
I use https://wixtoolset.org/ to publish my apps. It's free and independent of the IDE...
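If staying on the command line is acceptable, a self-contained publish along these lines is another option for producing output that runs on machines without the .NET runtime installed (a sketch only; it won't by itself fix the MSB4216 error above):
dotnet publish .\my_app.csproj -c Release -r win10-x64 --self-contained true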

How to get docker toolbox to work with .net core 2.0 project

I'm trying to use the Docker functionality with my .NET Core 2.0 project, and I've been getting an error message saying
Visual Studio Container Tools requires Docker to be running before
building, debugging or running a containerized project. For more info,
please see: http://aka.ms/DockerToolsTroubleshooting
I followed the link and realized that, since I have Windows 10 Home x64, I had to install Docker Toolbox instead of Docker for Windows. It installed an executable called
Docker Quickstart Terminal
Is this how one is supposed to start up the Docker services? I have tried running this executable and it seems to be working. My containers are running, but the error from Visual Studio Container Tools still persists.
What am I missing? Is a version of Windows higher than Home required in order to use the Docker container support within Visual Studio 2017?
UPDATE:
I tried to follow Quetzcoatl's suggestion, and I am still getting the same error within Visual Studio about those tools. Here is what I ran in the Docker Quickstart Terminal. I tried building the project after Visual Studio successfully opened it, and was still getting the aforementioned error regarding the container tools.
My devenv.exe file is located at
C:\Program Files (x86)\Microsoft Visual Studio\2017\Community\Common7\IDE\devenv.exe
and my solution file is located at
D:\Development\Visual Studio\Musify2\Musify2\Musify2.sln
UPDATE 2:
I ran some of the suggested commands in the Docker Quickstart Terminal, and here were the results of those commands, quetz.
With Docker Toolbox that's a little tricky, but actually Core 2.0 has nothing to do with it. It's all about Docker, Docker Toolbox, and VS.
First of all:
Is this how one is supposed to start up the Docker services? I have tried running this executable and it seems to be working.
Yes it is. If docker machine/services are running - that's good!
Now, you have to realize that in Docker, typically, the information about how/where Docker is running is kept in environment variables. The quickstart script not only starts the docker-machine for you and checks some basic things, it also sets up a couple of environment variables so that later all commands like docker, docker-compose etc. know where to look for the Docker virtual machine. In your/our case that information mainly consists of the VM's IP and the port number that Docker listens on.
.. and your Visual Studio has no knowledge of that because, I'd bet, you launched Visual Studio from the Start menu, from a desktop icon, or by double-clicking a solution file, so it had no chance of picking up the environment variables from the quickstart console.
The solution is quite simple: make sure that VS gets that information. That is, make sure it gets those environment variables, and make sure it gets a fresh copy of them, because the IP/port may change sometimes. So don't just copy them into your OS settings, because nothing will ever automagically refresh them.
The simplest way I found is to just close Visual Studio, run the Docker Toolbox quickstart console, then run Visual Studio from within that console. For example, for my VS2017 Community Edition:
Starting "default"...
(default) Check network to re-create if needed...
(default) Waiting for an IP...
(.......snip..........)
(boot2docker whale ASCII banner)
docker is configured to use the default machine with IP 192.168.99.100
For help getting started, check out the docs at https://docs.docker.com
Start interactive shell
quetzalcoatl#LAP049 MINGW32 ~
$ /c/Program\ Files\ \(x86\)/Microsoft\ Visual\ Studio/2017/Community/Common7/IDE/devenv.exe C:\\PATH\\TO\\MY\\SOLUTION.sln
The path is pretty long to type, even with TAB completion, so I usually make a tiny .sh script to run that for me.
BTW! Notice that the path to devenv must be unix-like (/c/Program\ Files...), because the MinGW shell has to understand it, while the path to the solution must be a normal Windows path (c:\projects\foo\bar\..) because Visual Studio will try to read it after starting up.
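A minimal sketch of such a script, using the devenv and solution paths from the question (adjust both for your own machine):
#!/bin/sh
/c/Program\ Files\ \(x86\)/Microsoft\ Visual\ Studio/2017/Community/Common7/IDE/devenv.exe 'D:\Development\Visual Studio\Musify2\Musify2\Musify2.sln'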
This is what I did to get VS 2017 working on Windows 10 Home with Docker Toolbox. Follow this and I guarantee it will work. Note this only applies to Windows 10 Home, which doesn't support the native Docker for Windows application:
Install docker-toolbox on w10 home
Run the Docker Quickstart Terminal once to create the docker-machine. It takes a while, so be patient while it assigns the IP address and does other things
Once it's done it will show you a command prompt. Type 'docker-machine ip default'. Note down the IP address, as you're going to need it later
Close the Quickstart Terminal window. That was just to initialize the boot2docker.iso image of a tiny Linux server into the VirtualBox app (aka docker-machine, aka the 'default' VM). If you're not familiar with virtualization technology or Oracle VirtualBox, stop reading, read up on them first, and then start over. But if you are, gladly continue
As I mentioned, your docker-machine instance is a Linux VM, and therefore you can only open projects built on .NET Core. Unfortunately, for the full .NET Framework you'll either need to run Windows containers, which are only available on Windows 10 Pro, or build your own Windows Nano Server or Server 2016 VM in VirtualBox and then follow the steps for native Docker for Windows on Docker's website. From here on, the remainder of this answer will be helpful to those wanting to run Core projects on the Linux VM / docker-machine only
Open Windows PowerShell in administrator mode and type 'docker-machine ls' to confirm that the default VM is running. You can also do 'docker-machine status default' and it should return 'running'
Now open the VirtualBox application which is running your default VM and click on Settings. Open the "Shared Folders" tab, where you need to make sure the 'c:\Users' folder on the host machine is mapped/mounted as the 'c/Users' folder in the VM. Note that this step is super important and missing it will cause a lot of trouble getting things to work
Also a quick note that your solution/project/codebase MUST be saved under 'c:\Users\' for it to work correctly, at least if you want to use it OOTB. I didn't want to waste time trying to mount a folder outside the permitted path, but if you're the adventurous kind, by all means try to figure it out and let us know how you did
Now, as Quetzalcoatl correctly mentioned, VS needs to know about this docker-machine. The only way that happens is if the environment variables are set. Therefore go ahead and run the command 'docker-machine env default | Invoke-Expression' in the PowerShell window. This is the magic sauce that gets VS to behave nicely with docker-machine
Go ahead and open VS normally, either by double-clicking your project solution or by creating a new project/solution. In PowerShell, use the 'start' command to open your existing VS solution or a new VS instance. Pro tip: if you create a new solution, DO NOT select the Linux Docker option when picking the project template type. You can totally add Docker support once your solution is all set up and ready to go. Matter of fact, leave it unchecked and let VS create your solution; this way you'll get a chance to build and run your solution in IIS Express or self-hosted mode to see if your Core 2.0 app even works properly
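A minimal sketch of that PowerShell sequence, reusing the solution path from the question (yours will differ):
docker-machine env default | Invoke-Expression
start 'D:\Development\Visual Studio\Musify2\Musify2\Musify2.sln'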
Once satisfied that everything worked and you saw the OOTB homepage, it's time to add Docker support by right-clicking on your project, hovering over Add, and then clicking on 'Add Docker Support'. This will create a new Docker project (.dcproj) and add a bunch of Docker-related files
Now I'm not gonna go into the nitty gritty of Docker here, but you'll notice that your project is no longer the startup project and the newly created Docker project is. That's perfectly normal and intended behavior. It means you're set up and ready to fire up your app using Docker containers. So go ahead and click on the 'Docker' button to see your hard work finally pay off. Again, be patient as it takes a while to build images and spin up containers, but once it's done VS will start and attach the debugger
Here you'll once again be disappointed and feel worthless, because when the browser opens a new window or tab there'll be a page-unreachable error. The reason is that the browser address points to localhost, which is not the web server anymore. Your "web server" is now your Docker container, so you'll need to replace localhost with the IP address you retrieved above. The port number stays the same. Once you load the page you'll be relieved and ecstatic to see the home page/route work. This should also enable debugging in VS. If for some reason it doesn't, you may need to delete a folder called .vsdbg in the c:\Users\ folder and rerun the application.
