My team has a fairly large set of desktop applications with many shared libraries between them, all in one common solution file in our repository. We'd like to use semantic versioning for a number of reasons, chief of which is making it easier for our users to install updates. However, given the number of assemblies we're dealing with, updating the AssemblyInfo file for each one is pretty tedious, especially when a library is a dependency for multiple applications.
I was wondering if there's an easy way to use git tags or some kind of external tool to tell the build server that, for example, XYZ has a bug fix and its patch number needs to be updated.
Use GitVersion: https://gitversion.readthedocs.io/en/latest/
It will handle semantic versioning automatically, based on the last tag and the git history.
You can use GitVersionTask if you use MSBuild, or (better) use it with build tools like FAKE or Cake.
Edit: there are now also alternatives that are easier to use: https://www.nuget.org/packages/Nerdbank.GitVersioning/, https://www.nuget.org/packages/GitInfo/, ...
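If you want to verify what the tool stamped into a build, here is a minimal sketch that reads the version attributes back at runtime, assuming the tool patches the usual assembly attributes (GitVersion-style tools typically write the full SemVer string to the informational version):

    using System;
    using System.Reflection;

    static class VersionCheck
    {
        static void Main()
        {
            Assembly assembly = Assembly.GetExecutingAssembly();

            // AssemblyVersion is what the CLR binds against.
            Console.WriteLine("AssemblyVersion: " + assembly.GetName().Version);

            // Tools like GitVersion typically put the full SemVer string
            // (e.g. "1.2.3+Branch.main.Sha.abc123") into the informational version.
            var info = (AssemblyInformationalVersionAttribute)Attribute.GetCustomAttribute(
                assembly, typeof(AssemblyInformationalVersionAttribute));
            Console.WriteLine("InformationalVersion: " +
                (info != null ? info.InformationalVersion : "(not stamped)"));
        }
    }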
I'm refactoring a WPF application and cleaning up how settings are stored. I've rewritten most of it to use the application's settings (Properties.Settings.Default), and while this technically works, it seems to generate rather ugly paths in the %appdata% folder, such as:
C:\Users\me\AppData\Local\Company_Name_With_Underscores\program.exe_Url_xs3vufrvyvfeg4xv01dvlt54k5g2xzfr\3.0.1.0\
These also result in a new version-number folder for each version, which never get cleaned up unless, apparently, I do so manually via file I/O functions.
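For reference, I'm just using the standard generated settings pattern; "WindowWidth" here is a placeholder for one of our settings:

    // Standard Properties.Settings usage; Save() writes user.config
    // under the hash-suffixed path shown above.
    Properties.Settings.Default.WindowWidth = 1280;
    Properties.Settings.Default.Save();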
Other programs don't seem to follow this format, including Microsoft's own, so I'm under the impression this is one of those "technically the best way, but not practical, so nobody uses it" solutions. Is that the case, or am I missing something that would make this more practical?
I ask mainly because I can foresee issues if we ever have to direct a customer to one of these folders to check a file or send us one from there.
I'm using .NET 4.0 right now.
I am looking at moving our .NET (C#) projects from TFS to git. The team has reached a general consensus that we do not want to continue with TFS, and we wish to trial git. We currently do not have that many projects to migrate over, but we expect the number to grow as our old systems are replaced.
Currently we have a TFS project for everything we think will be needed by multiple projects: database stuff, third-party DLLs, etc. What is the best way to have a similar structure in git?
The best way I can see is to have something similar to our current structure, with a separate repository for all the common files.
I have read about using submodules, but there seem to be a lot of complaints about them. Is it worth trying something like repo or another alternative? Or is there a better way to handle this?
This question is going to be pretty subjective, but IMO I would solve this by having a separate repository for your common stuff.
Another option is to migrate your common stuff to NuGet packages, so you can move it forward without worrying about breaking all your existing projects.
In my experience, common projects in an enterprise environment tend to calcify your ability to respond to change quickly. Instead you spend lots of time worrying about how changes in your "Core" or "Lib" modules will affect the 80+ projects that are using them. Worse, people start shoving everything into those modules even if it is only pertinent to a few projects, simply because it's easy.
So, my application depends on a huge number of small files; the actual number is somewhere around 90,000. Now, I use a component that needs access to these files, but the only way it accepts them is via a URI.
So far I have simply added a directory containing all the files to my debug folder while developing the application. However, now I have to consider deployment. What are my options for including all these files with my deployment?
So far I have come up with a couple of different solutions, none of which I've managed to make work completely. The first was to simply add all the files to the installer, which would then copy them to their places. This would work, in theory at least, but it would make maintaining the installer (a standard MSI installer generated with VS) absolute hell.
The next option I came up with was to zip them into a single file, add that as part of the installer, and then unzip it with a custom action. The standard libraries, however, do not seem to support complex zip files, making this a rather hard option.
Finally, I realized that I could create a separate project and add all the files as resources in that project. What I don't know is how URIs pointing to resources stored in other assemblies work. Meaning, is it "standard" for everything to support the "pack://application:,,,/Assembly;component/..." format?
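For reference, the kind of URI I mean looks like this ("ResourceAssembly" and the path are placeholders; as I understand it, the pack:// scheme is only registered once a WPF Application exists):

    // Pack URI for a resource compiled into a referenced assembly.
    var uri = new Uri(
        "pack://application:,,,/ResourceAssembly;component/Data/item0001.dat",
        UriKind.Absolute);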
So, are these the only options I have, or are there some other ones as well? And which would be the best way to go about this?
I would use a single zip-like archive file, and not unzip it on the hard disk but leave it as is. This is also the approach used by several well-known applications that depend on lots of smaller files.
Windows supports using zip files as virtual folders (since XP); users can see and edit their content with standard tools like Windows Explorer.
C# also has excellent support for zip files. If you're not happy with the built-in tools, I recommend one of the major zip libraries out there - they're very easy to use.
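For instance, with System.IO.Compression (.NET 4.5+; reference System.IO.Compression and System.IO.Compression.FileSystem), serving entries straight from the archive might look like this - just a sketch, the class and method names are mine:

    using System;
    using System.IO;
    using System.IO.Compression;

    // Keeps the archive open and hands out entry streams on demand,
    // so the 90,000 files never need to be extracted to disk.
    sealed class ZipStore : IDisposable
    {
        readonly ZipArchive _archive;

        public ZipStore(string archivePath)
        {
            _archive = ZipFile.OpenRead(archivePath);
        }

        public Stream Open(string entryName)
        {
            ZipArchiveEntry entry = _archive.GetEntry(entryName);
            if (entry == null)
                throw new FileNotFoundException("Entry not found.", entryName);
            return entry.Open();  // stream over the entry; dispose after use
        }

        public void Dispose()
        {
            _archive.Dispose();
        }
    }

Keep one ZipStore alive for the application's lifetime; disposing it closes the archive and invalidates any open entry streams.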
In case you worry about performance, caching files in memory is a simple exercise. If your use case actually requires the files to exist on disk, also not an issue, just unzip them on first use - it's just a few lines of code.
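And the unzip-on-first-use fallback really is just a few lines, for example:

    using System.IO;
    using System.IO.Compression;

    static class DataFiles
    {
        // Extract once on first use, then serve from disk afterwards
        // ("archivePath" and "dataDir" are whatever your app uses).
        public static void EnsureExtracted(string archivePath, string dataDir)
        {
            if (!Directory.Exists(dataDir))
                ZipFile.ExtractToDirectory(archivePath, dataDir);
        }
    }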
In short, just use a zip archive and a good library and you won't run into any trouble.
In any case, I would not embed this huge number of files in your application directly. Data files should be kept separate.
You could include the files in a zip archive, and have the application itself unzip them on first launch as part of a final configuration, if it's not practical to do that from the installer. This isn't entirely atypical (e.g. it seems like most Microsoft apps do a post-install config on first run).
Depending on how the resources are used, you could have a service that provides them on demand from a store of some kind and caches them, rather than dumping them somewhere. This may or may not make sense depending on what these resources are for; e.g. if they're UI elements, a delay on first access might not be acceptable.
You could even serve them over HTTP from a local or non-local server, or from a SQL server if the application already uses one, again with caching. That would be great for maintenance, but may not work in your environment.
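As a sketch of that serve-on-demand idea (the interface and names are illustrative, not any particular library; the store behind it could be a zip, a database, or an HTTP endpoint):

    using System.Collections.Concurrent;

    // Hypothetical backing store; implement it over a zip, SQL, HTTP, etc.
    interface IResourceStore
    {
        byte[] Fetch(string name);
    }

    // Serves resources on demand and caches them after the first fetch.
    sealed class CachingResourceProvider
    {
        readonly IResourceStore _store;
        readonly ConcurrentDictionary<string, byte[]> _cache =
            new ConcurrentDictionary<string, byte[]>();

        public CachingResourceProvider(IResourceStore store)
        {
            _store = store;
        }

        public byte[] Get(string name)
        {
            // First access pays the fetch cost; later accesses hit the cache.
            return _cache.GetOrAdd(name, _store.Fetch);
        }
    }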
I wouldn't do anything that involves an embedded resource for each file individually, that would be hell to maintain.
Another option could be to create a self-extracting zip/rar archive and extract it from the installer.
One option is to keep them in compound storage and access them right in the storage. The article on our site describes various types of storage and their advantages/specifics.
All the refactoring tools for C# and VB.NET that I have seen only consider the source code in a single Visual Studio solution.
For better or worse, our large system (many related programs) is spread over many solution files; however:
All the code is below a single Windows folder.
Our NAnt-based build system builds all files in a Windows folder to produce a single DLL (it's a bit more complex than this, but that's not important for this question).
Therefore ALL ".cs" and ".vb" files below the single root folder are part of the system.
So I am looking for refactoring and reverse engineering tools that take a single folder as input and act on all files below that folder.
(The tools may need some help deciding what "public" and "internal" mean; however, most of the time "internal" means "in the same code tree", where a "code tree" is a folder that contains code and any child folders.)
Now I am being greedy: I would like the tools to keep a log of all the refactorings that have been done and to be able to replay them. Then I could try out ideas and, if they work, throw away my code, get the latest code, replay the refactorings, and check in before anyone else changes the files. (Likewise for when branches need merging.)
In the past I have done what Pratik has suggested and pulled all the projects into a single solution just for the purposes of refactoring. Personally, I would use ReSharper every time.