When debugging my project in Visual Studio 2008, my Settings.settings file keeps getting reset between builds. Is there a way to prevent this from happening?
Thanks.
Okay, I found out the answer I was really looking for. Basically, you need to call LocalFileSettingsProvider.Upgrade. However, since I will be deploying using ClickOnce, it will do it for you automatically.
Q: Okay, but how do I know when to call Upgrade?
A: Good question. In ClickOnce, when you install a new version of your application, ApplicationSettingsBase will detect it and automatically upgrade settings for you at the point settings are loaded. In non-ClickOnce cases, there is no automatic upgrade; you have to call Upgrade yourself. Here is one idea for determining when to call Upgrade:
Have a boolean setting called CallUpgrade and give it a default value of true. When your app starts up, you can do something like:
if (Properties.Settings.Default.CallUpgrade)
{
    // Pull the previous version's user-scoped settings forward.
    Properties.Settings.Default.Upgrade();
    Properties.Settings.Default.CallUpgrade = false;
}
This will ensure that Upgrade() is called only the first time the application runs after a new version is deployed.
REF: http://blogs.msdn.com/rprabhu/articles/433979.aspx
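In practice this check belongs at startup, before anything reads a setting, and the flag needs to be persisted with Save() or the upgrade will run again on every launch. A minimal sketch for a WinForms app (MainForm is a placeholder):

using System;
using System.Windows.Forms;

static class Program
{
    [STAThread]
    static void Main()
    {
        if (Properties.Settings.Default.CallUpgrade)
        {
            // Copy the previous version's user-scoped settings forward.
            Properties.Settings.Default.Upgrade();
            Properties.Settings.Default.CallUpgrade = false;
            // Persist the flag so the upgrade runs only once per new version.
            Properties.Settings.Default.Save();
        }
        Application.Run(new MainForm());
    }
}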
I believe Settings.settings files are saved per version number, essentially as a "feature" whereby settings are not shared between differing versions of the same program on a machine. Assuming you're incrementing the version number automatically when compiling (1.0.* in AssemblyInfo.cs), you'll be resetting your settings every time you compile a new version.
To correct this, the best course would be to serialize your own settings file to the Application Data directory.
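A minimal sketch of that approach, assuming a hypothetical UserSettings class and an app folder named "MyApp": XmlSerializer reads and writes a path that does not change with the assembly version.

using System;
using System.IO;
using System.Xml.Serialization;

public class UserSettings
{
    public string LastProject { get; set; }   // example persisted value

    private static string SettingsPath()
    {
        string appData = Environment.GetFolderPath(Environment.SpecialFolder.ApplicationData);
        return Path.Combine(Path.Combine(appData, "MyApp"), "settings.xml");
    }

    public void Save()
    {
        string path = SettingsPath();
        Directory.CreateDirectory(Path.GetDirectoryName(path));
        using (StreamWriter writer = new StreamWriter(path))
            new XmlSerializer(typeof(UserSettings)).Serialize(writer, this);
    }

    public static UserSettings Load()
    {
        string path = SettingsPath();
        if (!File.Exists(path))
            return new UserSettings();        // first run: fall back to defaults
        using (StreamReader reader = new StreamReader(path))
            return (UserSettings)new XmlSerializer(typeof(UserSettings)).Deserialize(reader);
    }
}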
Off the top of my head: I think you can set an option in the file's properties (right click the file in Visual Studio and choose Properties) so that it is not copied when the project is built/run. What is probably happening is that when your project is built, the settings file is copied to the debug bin directory, overwriting the settings file from the previous run.
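For reference, that property maps to a CopyToOutputDirectory element in the .csproj; a sketch of what the entry might look like (the file name is a placeholder, and "Do not copy" can also be represented by simply omitting the element):

<None Include="Settings.settings">
  <CopyToOutputDirectory>Never</CopyToOutputDirectory>
</None>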
Among other reasons, the Settings file gets reset each time you debug so that the next time you debug, you can test the whole application all over again from a clean state. Not resetting the settings could let bugs go undetected.
Related
I'm working on a Windows service and using SlowCheetah to manage the transforms. I added some settings and therefore updated the app.config file for the first time in months, and for some reason, when I build the Debug configuration, it generates an old version of app.config with a modify date from six weeks ago (Jan 3rd).
To make matters worse, looking through the source control history it doesn't look like the file was ever checked in with the changes that are showing up when I build. That is, one setting is set to a different url for me to test something... but that was never checked in.
If I build in Release or Test configurations it works perfectly.
I've tried:
Cleaning/rebuilding the solution
Deleting/recreating the app.config file (it even generated the 1/3 version when I built with the file deleted!)
Restarting Visual Studio
Rebooting my computer
Nothing works and none of the changes I make in the config file are reflected when I build Debug.
Any ideas why or how to fix it?
Did you try looking at the post build section? Sounds like something there is overriding the built version with a previous one.
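For example, a hypothetical post-build event like this one (Project Properties > Build Events) would quietly replace the freshly built config with a stale copy on every Debug build:

REM hypothetical post-build step that would produce exactly this symptom
if "$(ConfigurationName)" == "Debug" copy /Y "$(ProjectDir)configs\old.config" "$(TargetDir)$(TargetName).exe.config"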
This question is NOT answered at the link above...
I had some problems with TFS where Visual Studio builds on the local machine with no problem, but when I try to build on the server I get build errors. The errors are not relevant to this question.
The issue I have is that in order to fix the errors I had to manually edit some of the files on my local machine, and since they were manually edited, TFS doesn't detect the change, so it won't let me upload my change set to the server, which would have fixed the problems there.
My first, and main question is:
How can I force TFS to copy all my local files to the server? Sort of like the reverse of:
Go to View / Other Windows / Source Control Explorer / Right click on the relevant project / Advanced / Get specific version / Check "Overwrite all files..." / Press OK.
which (though in my experience it doesn't always get everything) pulls a full file set from the server down to the local machine.
I am so tired of TFS uploading partial file sets; then, when the solution is removed locally and re-downloaded from the server, I have to pull in the missing files and references from some backup. If it had been uploaded fully and correctly at the start, I would probably have fewer of these problems. So this is my main question: how do I force TFS to upload all my files, regardless of whether or not it thinks they have changed?
To explain a particular problem further: I disabled the NuGet package manager option "Allow NuGet to download missing packages during build". So locally I got actual build errors. Yes, that's right: build errors, like this...
Project XXX: Package restore is disabled by default. To give consent,
open the Visual Studio Options dialog, click on Package Manager node
and check 'Allow NuGet to download missing packages during build.' You
can also give consent by setting the environment variable
'EnableNuGetPackageRestore' to 'true'.
This now forces me to re-enable the option to get rid of the errors. I am annoyed at this, because such errors should be presented as warnings so as not to prevent a build. This is seriously disappointing, and as a result I have a very poor opinion of NuGet.
Anyway, enough ranting. I re-enabled it; it's not as if I had a choice. But the version where this was disabled went up to the server, and now I get this error on the server. I tried re-enabling it and doing a check-in, but TFS won't let me (Warning... No pending changes). Of course not: it's just a silly check box, so what could possibly have changed? But now I get an error on the server that I cannot fix, all because someone at NuGet didn't put some thought into properly designing their warnings (as opposed to preventing a build by causing errors).
So, once again: how can I force TFS to take all my files, changed or unchanged, verbatim, to the server? This way, when I fix a problem locally, it "should" also be fixed on the server. Or at least, when I next download the solution, I am assured of getting the whole file set.
Apologies if this sounds negative, but I have been at this for 5 solid days, and it has completely stopped development, costing me and the company a lot of money. Any help would be very much appreciated.
There is no way to tell TFS to take everything in your file system. What you need to do is a multistep process. First, check out the entire directory; this will not overwrite your manual changes, but it will allow you to check back in and have TFS detect those changes. Second, if any new files were created, do an Add at the root directory to pick up all the new files. Lastly, check in all of your changes. It really should be that simple.
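If you prefer the command line, the same three steps can be sketched with tf.exe (the path is a placeholder; exact options may vary by TFS version):

cd /d C:\src\MyProject
REM 1. mark everything as edited (does not change file contents)
tf checkout * /recursive
REM 2. pick up any files that are new to source control
tf add * /recursive
REM 3. check all pending changes back in
tf checkin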
If you are using TFS 2012, then Local Workspaces would be a good solution for you.
Just to be clear - Local Workspaces does not mean that your developers are working locally. All code is still under Source Control in exactly the same way with Server Workspaces.
Local Workspaces are new to TFS 2012 and would resolve your issue. The main advantage for you is that files are no longer marked read-only. This allows you to be able to edit them from anywhere (notepad, any other IDE, Visual Studio) without first doing a checkout. Visual Studio will then automatically detect that the file has changed and list it in the Pending Changes window.
Good article comparing the different types of workspaces: http://blogs.msdn.com/b/willy-peter_schaub/archive/2011/11/30/team-foundation-server-trying-to-understand-server-versus-local-workspaces.aspx
I like to use the Source Control Explorer window: right click on a folder and choose Compare. You can choose to see what is different in the target folder, or what is a new file. The files that are different appear in red. You can then right click on them to check files out and in, or use Ctrl-click to select more than one file at a time.
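The same comparison is also available from the command line via tf folderdiff; a sketch with placeholder paths:

REM compare a server folder against your local workspace copy
tf folderdiff $/MyProject C:\src\MyProject /recursive /view:different,sourceOnly,targetOnly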
Yes, I know, before the flames start to rise: this is not conventional... but working in Unity I encounter this problem quite often. I am literally dealing with it now, as I updated to Unity 5.5 and it fubared a lot of stuff in my current project. After the resolutions from Unity, my project is completely out of sync, and for whatever reason Visual Studio refuses to monitor these changes. It is rather annoying, to be sure, but not to worry, as I have resolved it successfully every time over the iterations by doing the following:
1. Create a backup folder in a separate location and copy the entire thing over. It doesn't matter where you back up to, as long as it's a unique, separate folder. Be sure to copy FOLDERS, not individual FILES, as some files may be hidden.
2. Do a full Get of the latest version (according to TFS). This can take some time, which is why you back up to a separate location. Be sure to KEEP VS OPEN AT THIS POINT!!!
3. Overwrite ALL the files you "got" with the backup you made in step 1. The local workspace will monitor the files as you overwrite them into your local space and mark any file that differs as having pending changes.
4. Check your pending changes in to the TFS server in Source Control Explorer. Any change detected during the overwrite will be added to the Included Changes section.
Voila! You have forced your files, work, and will onto the TFS server!
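Steps 1 through 3 can also be scripted; a rough sketch with placeholder paths (robocopy /E copies the whole tree, including hidden files, and tf get /all /overwrite forces a full download):

REM 1. back up the whole folder tree
robocopy C:\src\MyProject C:\backup\MyProject /E
REM 2. force-get the server's latest version
tf get /all /overwrite /recursive
REM 3. overwrite the fresh copy with your backup
robocopy C:\backup\MyProject C:\src\MyProject /E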
We're creating WCF services (.NET 3.5) via Visual Studio 2010. When I make a change to the .svc.cs file, save, clean, rebuild and copy to GAC (using WSPBuilder, which recycles the various IIS processes) I still get cached data. Only drastic things like checking in all my files and running a gated checkin build, or restarting the computer clear this 'cache'.
The upshot of this is that the development->testing cycle is extremely slow. But it need not be! Here's my question. Where is the cache that VS2010 or Windows is keeping for WCFTestClient? I can add debug breakpoints and the symbols show up in the debug (so that means on some level I'm using the new assembly) but key things like watches will show old, cached variable values.
Is there a cache somewhere for this data? Looking at the list of .dll files that WCFTestClient uses when I run debug (F5) shows that it uses the correct .dll (and my observations during debugging confirm this).
I just need to be able to remove (manually if needed) this cache between rebuilds of my assembly. Otherwise, I can't actually rectify problems in the code.
If I'm missing something obvious here, let me know.
Try deleting your solution's .suo file.
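The .suo is a hidden file next to the .sln. With the solution closed, something like this deletes it (names and paths are placeholders):

cd /d C:\src\MySolution
REM the .suo is hidden, hence /A:H
del /A:H MySolution.suo
REM optionally also clear stale build output (adjust to your project folders)
rd /S /Q MyProject\bin MyProject\obj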
I've been working on a simple project that uses some common .NET classes, isolated storage, some resources and no external libraries.
Somehow the EXE generated (either in debug or release mode) no longer runs (stops working as soon as it's opened) without giving any details or displaying any exceptions.
It runs normally in Visual Studio, and there's a .application file in the same folder that, when clicked, starts an install process.
I'm not interested in installation files; I just want it to be the way it was: running an EXE (it's easier to get testers when all they have to do is run it).
I have previous versions of the program, and all of them run normally through the EXE's.
I don't recall changing anything regarding the framework, deployment, or build. I reviewed the project, and nothing has changed apart from using new objects from the .NET Framework.
--[Update]--
Just checked the event viewer. Event data "not available" and answer "not available".
This is a classic example of when a personal version control system would have helped. It would have automatically kept every version of your code including the one right before you made the change that messed up your exe.
Anyway, to fix your issue, comment out the majority of the code until it at least runs. Add a simple output statement just to make sure it is doing something. Then slowly add the code back in.
I suggest you run your exe in a console (cmd.exe) to see whether your application displays errors or exceptions there.
Check the <YourAppName>.exe.config file.
It is probably not well-formed XML.
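A quick way to verify (a sketch; the file name is a placeholder): load the config with the XML parser and let it report the offending line if the file is malformed.

using System;
using System.Xml;

class ConfigCheck
{
    static void Main()
    {
        try
        {
            new XmlDocument().Load("MyApp.exe.config");
            Console.WriteLine("Config is well-formed XML.");
        }
        catch (XmlException ex)
        {
            Console.WriteLine("Malformed config at line {0}: {1}", ex.LineNumber, ex.Message);
        }
    }
}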
I'd start by removing the setup project from the solution, rebuilding, and then running it in debug mode.
C# visual studio project: Properties.Settings.Default.SomeValueOrAnother has me baffled.
I have a relatively simple project. It saves a bunch of last-entered values between sessions, and restores them on next invocation. Was working fine. I changed a control so that the minimum allowed value was no longer 1, but was now 100. Last used value had been 3. On startup, it now complains that 3 is not a valid value.
Well, duh, of course. But hear the rest.
I have edited Settings in VS to default to 500. I have edited the exename.config file to contain 500 instead of 3. I have examined every possible config file (vshost.exe.config, the files in bin/debug, bin/release, obj/debug, obj/release). They all contain value=500. I have rebuilt repeatedly. I have copied just the exe and the config file to a separate PC, so that the development environment was not a factor. I still get this error message.
To further confuse me, on the dev PC, if I run (directly, not in the debugger) the exe in /obj/debug or /obj/release I do NOT get the error. If I run the ones under /bin I DO get the error. The config files have identical contents. If I copy the exe & config from /obj (the one that does not give an error on the dev PC) to another PC, I DO get the error.
I thought exename.config was all I had to deal with, but it looks like VS is doing something behind my back, at least something that I cannot find in the documentation. I imagine this is something trivial. If anyone can explain what I've missed, I'd sure appreciate it. All I really want to do is reliably save some user settings from one run of the program to the next, and get this app to 'forget' that obsolete value.
TIA
Mickey
Look for *.settings files in your solution. That's where the value that's used when you build your project is stored.
This was not the entire answer, but it did explain where the mystery values were being stored (found in another post here):
"This might help some people dealing with Settings.settings and App.config: watch out for the GenerateDefaultValueInCode attribute in the Properties pane while editing any of the values in the Settings.settings grid in Visual Studio (VS2008 in my case). If you set GenerateDefaultValueInCode to True (True is the default here!), the default value is compiled into the exe (or dll); you can find it embedded in the file when you open it in a plain text editor. I was working on a console application, and if I had defaults in the exe, the application always ignored the config file placed in the same directory! Quite a nightmare, and no information about this on the whole internet."
I am now individually testing values read from settings, rather than trusting them, and forcing them into a valid range if required.
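A minimal sketch of that defensive pattern (the setting name, control, and 100 to 500 range are placeholders):

// Clamp a possibly-stale user setting into the control's legal range
// instead of assigning it blindly and triggering "not a valid value".
int stored = Properties.Settings.Default.SomeValue;
int clamped = Math.Max(100, Math.Min(500, stored));
numericUpDown1.Value = clamped;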
Truly an ugly, and well hidden, default. I haven't embedded data values in my executables since the days of CP/M. Jeesh.
Thanks again, Microsoft.