Visual Studio - Debug vs Release - c#

I built a Windows service, targeting .NET 2.0, in VS 2008. I run it as a console app to debug it.
The console app works great. I installed it on my local computer as a service, compiled in Debug mode, and it still works great. I'm ready to release now, and suddenly, when I switch to Release mode, the service compiles and installs, but nothing happens. (No code in the service runs at all.)
I realize that Release vs Debug are project configuration settings, but it seems that in Release mode, even when I check 'Define DEBUG constant', uncheck 'Optimize code', and set 'Debug info' to 'full', it still does nothing.
Set it back to debug and it's working like a charm again.
(As a sidenote, I tried resetting the target framework to 3.5 to make sure that wasn't the issue, too)
So my questions (in order of importance) are these:
Will using my "debug" version in any way ever cause any problems?
What settings are different between debug and release besides the three I've been trying to change already?
This seems like a weird error to me and has stirred up my curiosity. Any idea what would cause this?
EDIT:
Should mention, I already am using a custom installer. Basically I compile the program (in either debug or release) and then install it with the respective installer.

1) It might: if not directly, then indirectly, by making the application slower and making it use more memory.
2) When it runs in debug mode, certain things work differently, for example:
The code is compiled with extra NOP instructions, so that there is at least one instruction at the start of each source line and a breakpoint can be placed on any line.
Instructions can be rearranged in release mode, but not in debug mode, so that the code can be single-stepped and execution corresponds to the exact order of the source code.
The garbage collector works differently: references are kept alive for their entire scope instead of only as long as they are actually used, so that variables can still be inspected in the debugger before the scope ends (see the sketch below).
Exceptions contain more information and take a lot longer to process when thrown.
All those differences are relatively small, but they are actual differences and they may matter in some cases.
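To illustrate the garbage-collector point above, here is a minimal sketch (hypothetical demo code, not from the question) that typically behaves differently in an optimized Release build with no debugger attached versus a Debug build:
using System;

class GcLifetimeDemo
{
    static void Main()
    {
        var data = new byte[1024];
        var weak = new WeakReference(data);

        // 'data' is never read again after this point. In an optimized Release
        // build with no debugger attached, the JIT reports the local as dead,
        // so the array is typically collected here. In a Debug build (or with a
        // debugger attached) locals are kept alive to the end of the method.
        GC.Collect();

        // Typically prints False in Release (no debugger), True in Debug.
        Console.WriteLine("Still alive: " + weak.IsAlive);
    }
}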
If you see a great difference in performance between debug mode and release mode, it's usually because there is something wrong with the code, for example if it's throwing and catching a huge number of exceptions. If there is a race condition in the code, it may only show up in release mode, because the extra overhead in debug mode makes the code run slightly slower.
3) As to what the problem with your service is, I don't know, but it doesn't seem to be related to how the code is executed in debug mode or release mode. The code would start in any case, and if it was a problem with the code, it would crash and you would be able to see it in the event log.

I'm not sure I can speak to #1 or #2, but when I've had problems like that, it was because of incorrect threading/concurrency. I'm not sure how large your app is, but that might be a good place to start.

Related

Unable to properly step through code in static constructor (VS2019, C#, .NET 4.7.2)

I am trying to step through the code of a static constructor while in break mode.
The project is C#/.NET 4.7.2/64-bit/WinForms. Visual Studio version is 2019 16.9.4 Community.
Visual Studio correctly breaks when it gets to the breakpoint set in the static constructor. I can then step through the code using the "Step Into" command (F11). The static constructor calls a static method which does the heavy work.
The code contains a loop that should iterate 10 times. I should be able to step through all iterations. Instead, after the first iteration, the debugger suddenly jumps -- or "resumes" -- to some much later point, back in the calling class (or more accurately, the class that triggered the CLR to execute the static constructor). So I am unable to step through the remaining 9 iterations.
I am confident that all iterations are indeed executed, because I added some debug code to print something every time the finally block is executed. But, I am frustrated that I cannot step through the code. Seems like a VS bug of the ludicrous kind, since it's a pretty fundamental thing to be able to step through code when debugging.
Because the problem is so hard to describe well, I have created an animated GIF to visually show the debug session.
I have searched the web far and wide and I can't find anyone else reporting the kind of issue I am here. Which has me wondering whether I am doing something wrong (e.g. do I need to adjust some debugger settings or something?). Any help or insight is much appreciated.
UPDATE
I cannot reproduce the issue in 32-bit Debug builds. The issue is (so far) only present in 64-bit Debug builds.
HOW TO REPRODUCE
I made a tiny demo project. Feel free to download it and try debugging it for yourself. When the breakpoint is hit, use F11 to step through the code. See if you can get through the loop 10 times ;-)
Debug Test Project (VS2019)
On my end, the issue disappears if I change the build configuration from x64 to Any CPU. So it may be a 64-bit only issue.
The try/finally block really seems to reveal the issue. I'm not sure what other patterns might reveal the issue too.
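For reference, a rough sketch of the kind of pattern described above (hypothetical names, not the actual demo project linked here):
using System;

static class Worker
{
    // Static constructor: set a breakpoint on the call below and step in (F11).
    static Worker()
    {
        DoWork();
    }

    // Calling this from Main triggers the CLR to run the static constructor.
    public static void Touch() { }

    static void DoWork()
    {
        for (int i = 0; i < 10; i++)
        {
            try
            {
                Console.WriteLine("iteration " + i);
            }
            finally
            {
                // In the reported bug, stepping stops following the code after
                // the first pass through this finally block (x64 Debug builds).
                Console.WriteLine("finally " + i);
            }
        }
    }
}

class Program
{
    static void Main()
    {
        Worker.Touch();
    }
}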
BUG REPORT FILED
https://developercommunity.visualstudio.com/t/The-debugger-does-not-step-through-a-met/1407274
It's a bug in the .NET runtime. You can track the bug here:
https://github.com/dotnet/runtime/issues/52328

Application throws an exception in the Debug configuration but works in Release

I am trying to repair an application left behind by a colleague who no longer works here. The application connects to devices over a serial port and shows the received data in a Windows Forms UI.
It turns out the application works almost perfectly when I use the Release configuration, but throws exceptions when it's in Debug.
I used another program to check the frames I send and receive, because in Debug the exceptions are related to reading these frames. The app often throws an exception because a frame is too short to read, or I read something that should not arrive at all; in addition, the second program shows that, despite everything, virtually all frames come through correctly.
I don't know the differences between Release and Debug, and I don't know where in my project to look for differences in how the app is built, except the Configuration Manager, which is the same for both.
Can someone tell me why it works in Release and does not in Debug?
The app was written for .NET 2.0. I have now changed it to 4.7.2, but that changed nothing in either Release or Debug.
Did you already try to rebuild your app, or to delete the debug output files and restart your program?
Also, this is a good explanation I found:
When you compile in debug mode, you get ".pdb" files along with your .exe or .dll by default. The pdb files are called "symbols". This is what allows exceptions to give you a stack trace that tells you exactly which class and method failed, and even points to the line number in your .cs file. It also allows a debugger to be attached to your running program and allows you to "step through" your code.
When you compile in release mode, the compiler "optimizes" your compiled code (so that execution is as efficient as possible). To do this, it will compile your code a bit differently from what you actually wrote. In so doing, classes, methods and line numbers will not be as accurate if an exception is thrown. In some cases, the exception won't be traceable except at a binary level, because something has been compiled into classes or methods that are not contained in any .cs file.
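As a small illustration (hypothetical demo code, not from the question), you can see the difference by printing a caught exception:
using System;

class StackTraceDemo
{
    static void Main()
    {
        try
        {
            Fail();
        }
        catch (Exception ex)
        {
            // With a Debug build (or a Release build deployed together with its
            // .pdb files), this prints file names and line numbers per frame.
            // With an optimized Release build and no symbols, you typically get
            // method names only, and inlined frames may be missing entirely.
            Console.WriteLine(ex.ToString());
        }
    }

    static void Fail()
    {
        throw new InvalidOperationException("demo failure");
    }
}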

Visual Studio 2010 extremely slow when populating ListBoxes while debugging

While debugging inside VS2010, programs naturally run a lot slower than otherwise.
However, lately my programs run at an indescribably slow rate if I'm updating the values of a ListBox. (Other controls may also be affected, I'm not sure... but ListBox is a sure thing).
Operations which happen in tiny fractions of a second outside the debugger, like adding 100 elements to a ListBox, can take as long as 3 to 5 minutes inside VS.
Clearly, this isn't normal behaviour.
I'm not sure when this started, but it hasn't been happening always. It started happening a couple of months ago. Maybe when I installed the service pack? I'm not sure.
When I look at the processes, msvsmon.exe is chewing through CPU.
Any ideas if there is some option somewhere that I may have changed which causes this? I'm trying to debug something with a ListBox containing 8,000 elements and it's just completely impossible.
Windows 7 x64, 4GB RAM, VS2010-SP1
Yes, I can see a lot of System.InvalidCastExceptions in the output window
That's what causes the slow-down: the debugger does a lot of work when it processes an exception, and adding the notification message to the Output window isn't cheap. That's especially true for the remote debugger you are using now, which is required because your project's platform target is AnyCPU.
You can't ignore this problem, it is not just a debugger artifact. Debug + Exceptions, tick the Thrown box for CLR Exceptions. The debugger will now stop when the exception is thrown. You'll need to fix that code.
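As an illustration of the kind of fix involved (hypothetical names: 'items', 'Customer' and 'listBox1' are not from the question), replacing a throwing cast with a non-throwing one removes the first-chance exceptions that are flooding the debugger:
// Typical pattern behind a flood of InvalidCastExceptions while filling a list:
foreach (object item in items)
{
    // Throws InvalidCastException for every element that isn't a Customer:
    // var customer = (Customer)item;

    // Non-throwing alternative: 'as' returns null instead of throwing.
    var customer = item as Customer;
    if (customer != null)
    {
        listBox1.Items.Add(customer.Name);
    }
}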
The problem might be the way VS2010 handles breakpoints. Look at this link:
VS2010 Debug entry very slow
Two interesting notes:
Searching for symbols is often very slow at the start of debug, particularly if you have one of the remote symbol options configured, and have not set 'ignores' on the various DLLs which will not have symbols on MS servers.
...
Yes, msvsmon.exe will be used when you debug a 64-bit program. Since Visual Studio is completely 32-bit, the remote debugger is needed to bridge the divide. ... Working mightily to find and load the .pdb files would be likely. Or accidentally having the mixed-mode debugging option turned on so the debugger is also seeing all unmanaged DLL loads and finding symbols for them. These are just guesses of course.
One more cause of slowness: conditional breakpoints, since the condition needs to be evaluated on every hit of the breakpoint. A breakpoint whose condition is "false" inside a long loop will slow debugging significantly.
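A common workaround (a sketch with hypothetical names; 'Process' is just a placeholder) is to move the condition into the code and break only when it is actually met, instead of letting the debugger evaluate it on every hit:
using System.Diagnostics;

class ConditionalBreakDemo
{
    static void Main()
    {
        for (int i = 0; i < 1000000; i++)
        {
            // Instead of a conditional breakpoint on Process(i) (evaluated by
            // the debugger on every single hit), test the condition in code and
            // only involve the debugger when it actually holds.
            if (i == 987654 && Debugger.IsAttached)
            {
                Debugger.Break();
            }

            Process(i);
        }
    }

    static void Process(int i) { /* placeholder for the real work */ }
}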

Deploying app to production using Debug Mode rather than Release Mode?

I work for a shop that maintains a fairly new app. The app still has its fair share of bugs, with numerous tickets coming in daily. The error information we're given with those tickets is not as useful as it might be because the application was compiled in Release mode, which I read is smaller and faster (makes sense).
Are there any ramifications to deploying a .NET application to production that was compiled in Debug mode? I would expect it to be a bit slower, but I've read the difference is nominal. This would assure us that when we get errors on tickets we have line numbers associated with those errors, which, of course, makes debugging much easier.
Any major red flags that would prevent you from doing this? I'm tasked with researching the possibility. So thanks for any feedback.
Deploying your app in DEBUG instead of Release mode will slow down your performance. Of course compromises can be made. I would suggest one of the following:
Look at adding a global error handler to the Error event (Application_Error) in your Global.asax (see the sketch after this list).
Look at a compromise similar to this one suggested by Scott Hanselman.
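A minimal sketch of such a handler, assuming an ASP.NET app and some logging helper of your own (the Log call below is hypothetical):
// In Global.asax.cs
protected void Application_Error(object sender, EventArgs e)
{
    // Capture the full exception, including the stack trace, so that even a
    // Release build produces useful information for tickets.
    Exception ex = Server.GetLastError();
    if (ex != null)
    {
        Log(ex.ToString()); // hypothetical logging helper
    }
}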
My experience is that this can work okay if you're thinking about a desktop (winforms/WPF) app, but under no circumstances should you try this with an asp.net app.
You tagged this [vb.net]: you cannot ship debug builds of programs that use WithEvents. There's a known and, as far as I know, unsolved memory leak involving WeakReference instances when no debugger is attached. They are used to support Edit+Continue.
First thing you can do is ship the .pdb files along with your apps. In the C# IDE use Project + Properties, Build tab, Advanced, change Debug Info to "Full". You'll get line number info in the exception stack trace.
You cannot completely trust the line numbers; the JIT optimizer will move code around to make it execute faster, and inline short functions like property getters. You can add a yourapp.ini file in the same directory as the executable that disables the JIT optimizer:
[.NET Framework Debugging Control]
GenerateTrackingInfo=1
AllowOptimize=0
It all depends on the significance of your production environment and your business and performance requirements. Nothing is strict.
Deploying Debug builds is a red flag to me though it is not unheard of. Is this a desktop or server app? Any calls to Debug.Assert that fail could be an issue as those can shut down your app and/or cause a debugger to attach (VS.NET is not the only debugger and if I recall .net fx installs a lightweight debugger). While that can be helpful as a dev it certainly can be confusing to a normal person.
One option that works well: rather than shipping debug builds, ensure that your error reporting mechanism includes (either displays or logs) the stack trace information from any thrown exceptions. This helps pinpoint bugs very nicely without needing pdbs.
If this is a desktop app, you could try it with a few customers, but heed the advice given in the other answers. Someone who is more of a power-user, or who is having a lot of issues, may be willing to volunteer.

Reasons to NOT run a business-critical C# console application via the debugger?

I'm looking for a few talking points I could use to convince coworkers that it's NOT OK to run a 24/7 production application by simply opening Visual Studio and running the app in debug mode.
What's different about running a compiled console application vs. running that same app in debug mode?
Are there ever times when you would use the debugger in a live setting? (live: meaning connected to customer facing databases)
Am I wrong in assuming that it's always a bad idea to run a live configuration via the debugger?
You will suffer from reduced performance when running under the debugger (not to mention the complexity concerns mentioned by Bruce), and there is nothing to keep you from getting the same functionality as running under the debugger when compiled in release mode -- you can always set your program up to log unhandled exceptions and generate a core dump that will allow you to debug issues even after restarting your app.
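For example, a minimal sketch of logging unhandled exceptions in a Release build (the log path and wiring here are assumptions, not from the answer):
using System;
using System.IO;

static class CrashLogging
{
    // Call once at startup, before the real work begins.
    public static void Install()
    {
        AppDomain.CurrentDomain.UnhandledException += (sender, e) =>
        {
            // e.ExceptionObject.ToString() includes the stack trace, which is
            // still useful in Release builds (ship the .pdb files for line numbers).
            File.AppendAllText("crash.log",
                DateTime.Now + " " + e.ExceptionObject + Environment.NewLine);
        };
    }
}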
In addition, it sounds just plain wrong to be manually managing an app that needs 24/7 availability. You should be using scheduled tasks or some sort of automated process restarting mechanism.
Stepping back a bit, this question may provide some guidance on influencing your team.
In itself there's no issue with running it under the debugger if the performance is good enough. What strikes me as odd is that you are running business-critical 24/7 applications under user accounts, perhaps even on a workstation. If you want to ensure robustness and availability, you should consider running this on dedicated hardware that no one uses besides the application. If you are indeed running this on a user's machine, accidents can easily happen, such as closing the "wrong" Visual Studio, or crashing the computer, etc.
Running in debug should be done in the test environment. Where I work (and have worked), we usually have three environments: Production, Release and Test.
Production
Dedicated hardware
Limited access, usually only the main developers/technology
Version control, a certain tagged version from SVN/CVS
Runs the latest stable version that has been promoted to production status
Release
Dedicated hardware
Full access to all developers
Version control, a certain tagged version from SVN/CVS
Runs the next version of the product, not yet promoted to production status, but will probably be. "Gold" if you like.
Test
Virtual machine or lousy hardware
Full access
No version control, could be the next, next version, or just a custom build that someone wanted to test out on "near prod environment"
This way we can easily test a new version in Release, and even debug it there. In the Test environment it's anything-goes; it's more for when someone wants to test something involving more than one box (beyond their own).
This way you are protected against quick hacks that weren't tested enough, by having dedicated test machines, while still being able to release those hacks in an emergency.
Speaking very generically, when you run a program under a debugger you're actually running two processes - the target and the debugger - and tying them together pretty intimately. So the opportunities for unexpected influences and errors (that aren't in a production run) exist. Of course, the folks who write the debuggers do their best to minimize these effects, but running that scenario 24/7 is likely to expose any issues that do exist.
If you're trying to track down a particular failure, sometimes running under a debugger is the best solution; but even there, often enabling tracing of one sort or another is a lower-impact solution that is just as effective.
The debugger is also using up resources - depending on the machine and the app, that could be an issue. If you need more specific examples of things that could go wrong using a debugger 24/7 let me know.
Ask them if they'd like to be publicly mocked on The Daily WTF. (Because with enough details in the write up, this would qualify.)
I can't speak for everyone's experience, but for me Visual Studio crashes a lot. It not only crashes itself, it also crashes Explorer. This is exacerbated by add-ons and plugins. I'm not sure it's ever been tested to run 24/7 for days and days the same way the OS has.
You're essentially putting the running of your app at the mercy of a huge behemoth of a second app that is easily orders of magnitude larger and more complex than yours. You're just going to get bug reports, and most of them are going to involve Visual Studio crashing.
Also, are you paying for visual studio licenses for production machines?
You definitely don't want an application that needs to be up 24/7 to be run manually from the debugger, regardless of the performance issues. If you have to convince your co-workers of that, find a new job.
I have sometimes used the debugger live (i.e. against live customer data) to debug data-related application problems in situations where I couldn't exactly reproduce the production data in a test environment.
Simple answer: you will almost certainly reduce performance (most likely considerably) and you will vastly increase your dependencies. In one step you've added the entire VS stack including the IDE and every other little bit to your dependencies. Smart people keep the dependencies of high-uptime services as tight as possible.
If you want to run under a debugger, then you should at least use a lighter-weight debugger like ntsd; this is just madness.
We never run it via the debugger. There are compiler options which may accidentally be turned on/off. Optimizations aren't turned on, and running it in production is a huge security risk.
Aside from the debug code possibly having different code paths (#if DEBUG, Debug.Assert(), etc.), code-wise it will run the same (see the sketch below).
A little scary, mind you: someone can set breakpoints, set the next line of code to execute, get interactive exception popups, and the whole thing is not as stable running under Visual Studio. There are also debugger options that break whenever an exception is thrown. Even inspecting objects can cause side effects if you haven't written your code properly... It sure isn't something I'd want to do as the normal 24x7 process.
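A small sketch of those two mechanisms (hypothetical example, not from the answer above):
using System;
using System.Diagnostics;

static class Validation
{
    public static void Check(int count)
    {
        // Compiled only into DEBUG builds: Debug.Assert is marked
        // [Conditional("DEBUG")], so the call disappears entirely in Release.
        // In production, a failing assert can pop up a dialog or abort the app.
        Debug.Assert(count >= 0, "count must not be negative");

#if DEBUG
        // This block only exists in Debug builds, so Debug and Release can
        // genuinely take different code paths.
        Console.WriteLine("Checking count = " + count);
#endif
    }
}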
The only reason to run from the debugger is to debug the application. If you're doing that on a regular basis in production, it's a big red flag that your code and your process need help.
To date I've never had to run debug mode interactively in production. On rare occasions we switched over to a debug build for extra logging, but we never sat there with Visual Studio open.
I would ask them what is the advantage of running it via Visual Studio?
There are plenty of disadvantages that have been listed in the replies. I can't think of any advantages.
