CNTK.GPU conflicts with CNTK.CpuEval-mkl - c#

I was previously using the NuGet package Microsoft.Research.CNTK.CpuEval-mkl to evaluate some models, until I ran into problems with CPU speed. At that point I switched over to the CNTK.GPU library.
Unfortunately, I noticed a marked decrease in the efficacy of my models, so I suspect that I did something wrong. Just to compare, I would like to run both versions within a single application, have it output the raw evaluation results of each and compare them.
Sadly, when building I get messages about missing DLLs. Which ones are missing depends on which NuGet package was added first. If I add the GPU one first, I get the following list:
- Cntk.Core-2.0rc1.dll
- cudart64_80.dll
- curand64_80.dll
- cusparse64_80.dll
- cudnn64_5.dll
- cublas64_80.dll
- Cntk.Core.CSBinding-2.0rc1.dll
- nvml.dll
If I add the CPU version first, then only one DLL is missing:
- Cntk.Eval-2.0rc1.dll
Is there a way to force the two to work together, or do I need to run two separate applications and manually compare the output?

Installing both NuGet packages into the same application is not a supported scenario. You might get it to work by manually adding references to your project, but we have never tried or tested this.

Related

Build project twice with different DefineConstants

As part of a bigger solution I'm writing a wrapper for a third-party tool. We need to support two different versions of that tool, depending on which version is already installed on the end user's machine. The versions are similar, but some APIs have changed.
I have a wrapper project that can do the right thing depending on some DefineConstants, say TOOL_VERSION_1 and TOOL_VERSION_2.
Is there a way to automatically build the wrapper project twice when I build the solution? Each of the builds should of course use one of the DefineConstants and should output to a different file, say "Wrapper.V1.dll" and "Wrapper.V2.dll". Ideally the solution should scale to 3 or more versions.
I'm hoping for something simple like target frameworks, where you can just give a list of frameworks in the .csproj file and the SDK will build each one in turn. I was looking into custom build targets, but that hasn't been very productive so far.
I only just learned about shared projects, and they should be able to solve my problem. However, as I have several projects that all need to be treated like this, I'd rather have a solution that doesn't require me to triple the number of projects.
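For reference, one direction for such a custom target might be something like the sketch below (the target and property names are illustrative, and you may also need distinct intermediate output paths so the two inner builds don't trample each other's obj folder):

```xml
<!-- Illustrative sketch: build the wrapper once per tool version by re-invoking
     MSBuild on the same project with different DefineConstants and assembly names.
     The InnerToolBuild guard stops the inner builds from recursing into this target. -->
<Target Name="BuildAllToolVersions" AfterTargets="Build"
        Condition="'$(InnerToolBuild)' != 'true'">
  <MSBuild Projects="$(MSBuildProjectFullPath)"
           Properties="InnerToolBuild=true;DefineConstants=TOOL_VERSION_1;AssemblyName=Wrapper.V1" />
  <MSBuild Projects="$(MSBuildProjectFullPath)"
           Properties="InnerToolBuild=true;DefineConstants=TOOL_VERSION_2;AssemblyName=Wrapper.V2" />
</Target>
```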

Do I really need whole Grpc.Core NuGet package to use Google PubSub in simple C# application

I am creating a simple C# desktop application that pulls messages from Google Cloud PubSub. I noticed that PubSub uses the Grpc.Core package, which is more than 500 MB when installed! It contains many files that I don't need (for Mac, Android, etc.), and it doesn't seem reasonable to use such a package when my application is only a few MB in size. There is a discussion here:
Why is Grpc.Core NuGet package so big?
In the comments section it is said that it is possible to target more specific packages to suit specific needs. So my question is - are there more specific packages that can be used to simply pull messages from Cloud PubSub into a desktop application?
I feel your pain. (In fact, I feel it many times over. When I do a complete build of the google-cloud-dotnet repo that I work on, it pulls in those libraries many times over, and ends up being vast.)
It would be nice if you could add a sort of "negative dependency" to say "I don't want Grpc.Core even though Google.Cloud.PubSub.V1 depends on it indirectly, please use Grpc.Net.Client instead", but I don't believe there's any simple way of doing that in MSBuild projects.
We do make a "best-effort" attempt to support Grpc.Net.Client via the Google.Api.Gax.Grpc.GrpcNetClient package - you can depend on that package and then set the GrpcAdapter property in a ClientBuilderBase<TClient> to GrpcNetClientAdapter.Default. However:
The Pub/Sub libraries are slightly trickier to reconfigure than others, due to the manual layer of code wrapping the generated code. (I can look into how to perform that configuration if you're interested.)
We haven't done significant testing with Grpc.Net.Client, and the Pub/Sub library in particular performs a lot of streaming; while it should all just work, it's possible that there could be problems.
Doing that doesn't actually remove the Grpc.Core dependency anyway - so you'd need to manually remove the files you don't need.
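As a rough illustration of the opt-in (the builder type and namespaces here are assumptions based on the low-level API surface; check the current package documentation, and note the Pub/Sub caveat above):

```csharp
// Hedged sketch: asking a low-level Pub/Sub API client to use Grpc.Net.Client
// via the adapter shipped in the Google.Api.Gax.Grpc.GrpcNetClient package.
// Namespaces and the exact builder type are assumptions.
using Google.Api.Gax.Grpc;
using Google.Cloud.PubSub.V1;

var subscriberApi = new SubscriberServiceApiClientBuilder
{
    GrpcAdapter = GrpcNetClientAdapter.Default
}.Build();
```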
There really isn't a more specific package that you can target - all I can suggest is that you delete the files you don't need. You could do that in a build target that runs post compile, for example. It's possible that there's some cunning way to tell MSBuild that when it would copy (say) the iOS libraries into a specific location, just exclude them instead - but I don't know enough MSBuild to say how you'd do that (when they're being copied due to a dependency rather than due to the project itself).
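For the "delete the files you don't need" route, a post-build target along these lines is one option (the runtime folders listed are illustrative; check what actually lands in your output directory):

```xml
<!-- Illustrative sketch: delete native gRPC runtimes you don't need after the build.
     Adjust the Include paths to whatever your dependency graph actually copies. -->
<Target Name="TrimUnwantedGrpcRuntimes" AfterTargets="Build">
  <ItemGroup>
    <UnwantedRuntime Include="$(OutputPath)runtimes\ios\**\*" />
    <UnwantedRuntime Include="$(OutputPath)runtimes\android\**\*" />
    <UnwantedRuntime Include="$(OutputPath)runtimes\osx\**\*" />
  </ItemGroup>
  <Delete Files="@(UnwantedRuntime)" />
</Target>
```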

Version number in filename

At work we have a tracing library that has to be referenced by all applications.
Recently, following a major version change, the trace lib name changed as well:
From
dotnet_tracing-w32r-1-2
To
dotnet_tracing-w32r-2-0
This broke several of our pre-packaged solutions (projects that have a main branch that gets forked for customization for specific customers).
What I'm trying to figure out is: is there any way to (auto-magically) reference one OR the other? Having the version in the filename is screwing everything up, and I really don't want to maintain two separate branches of these projects.
To solve this problem, we used Conditional References
First, I created two different build configurations, and based upon those build configurations I used conditional references to reference the proper assembly. Finally, we use some post-build scripting to generate our two different NuGet packages and publish them to our NuGet feed.
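A hedged sketch of what such conditional references can look like, using the assembly names from the question (the configuration names and hint paths are illustrative):

```xml
<!-- Illustrative sketch: reference one tracing assembly or the other
     depending on the selected build configuration. -->
<ItemGroup Condition="'$(Configuration)' == 'TracingV1'">
  <Reference Include="dotnet_tracing-w32r-1-2">
    <HintPath>..\libs\dotnet_tracing-w32r-1-2.dll</HintPath>
  </Reference>
</ItemGroup>
<ItemGroup Condition="'$(Configuration)' == 'TracingV2'">
  <Reference Include="dotnet_tracing-w32r-2-0">
    <HintPath>..\libs\dotnet_tracing-w32r-2-0.dll</HintPath>
  </Reference>
</ItemGroup>
```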

Detecting dependency collisions

Short version of the question:
Is there a good way to detect at build time if I have any cases where 2 or more projects reference different versions of the same assembly? (really, I would like to teach our CI server to do this)
Long Version:
So here's an interesting problem (simplified a bit for easy digestion):
We recently encountered a situation where we had 2 projects in a solution, A and B. Both A and B depend upon a third-party NuGet package C.
A always loads C; B only needs C in rare circumstances.
So, during this sprint, a developer updated Project A to use the latest version of the C package (not realizing that B also depended upon C).
Everything built and the tests that we had passed (we have insufficient test coverage), but when we released to production, we had failures occurring when B attempted to use the dependency (loader issues, because it expected a different version of the strongly named assembly).
We found the problem and corrected it, but I would really love to be able to catch this during development. It would be even cooler if our build server (TFS 2012) could detect this when it does a CI build.
How might I go about detecting this situation?
VS can't do this for you because of the dynamic loading (unless I'm missing something): it just has no way of knowing which assemblies will be loaded at runtime.
We had the same problem once (using Prism - all our assemblies are normally loaded at application startup, and the order is described in a config file, though most are optional). I first thought of making a small tool that simply scans all packages.config or csproj files to see which assemblies are used in which version, and complains when the same package is found in two different versions. But I ended up dealing with it at a higher level, which is more direct and foolproof: we now have a simple class, sort of a stub of the actual application, that just loads all of the application's components and modules as described in the config file. This results in every assembly that can ever get loaded actually being loaded, so if something goes wrong it will be found. This functionality is simply placed in a unit test.
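For the CI side of the question, a minimal sketch of that scanning idea (assuming classic packages.config files; treating any version mismatch as a build failure is an assumption):

```csharp
// Hedged sketch: find NuGet packages referenced with more than one version across
// all packages.config files under a solution directory. A CI build (e.g. TFS)
// can run this and fail the build when the exit code is non-zero.
using System;
using System.IO;
using System.Linq;
using System.Xml.Linq;

class PackageVersionConflicts
{
    static int Main(string[] args)
    {
        string root = args.Length > 0 ? args[0] : ".";

        var conflicts = Directory
            .EnumerateFiles(root, "packages.config", SearchOption.AllDirectories)
            .SelectMany(file => XDocument.Load(file)
                .Descendants("package")
                .Select(p => new
                {
                    Id = (string)p.Attribute("id"),
                    Version = (string)p.Attribute("version"),
                    File = file
                }))
            .GroupBy(p => p.Id)
            .Where(g => g.Select(p => p.Version).Distinct().Count() > 1)
            .ToList();

        foreach (var group in conflicts)
        {
            Console.WriteLine($"Package '{group.Key}' is referenced with multiple versions:");
            foreach (var reference in group)
                Console.WriteLine($"  {reference.Version} in {reference.File}");
        }

        // Non-zero exit code fails the CI build when conflicts exist.
        return conflicts.Count == 0 ? 0 : 1;
    }
}
```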

How to force VS 2010 to skip "builds" of projects which haven't changed?

Our product's solution has more than 100 projects (500+ ksloc of production code). Most of them are C# projects, but we also have a few using C++/CLI to bridge communication with native code.
Rebuilding the whole solution takes several minutes. That's fine. If I want to rebuild the solution, I expect it to really take some time. What is not fine is the time needed to build the solution after a full rebuild. Imagine I did a full rebuild and now, without making any changes to the solution, I press Build (F6 or Ctrl+Shift+B). Why does it take 35s if there was no change? In the output I see that it started "building" each project - it doesn't perform a real build, but it does something that consumes a significant amount of time.
That 35s delay is a pain in the ass. Yes, I can improve the time by not building the whole solution and building only a project (Shift+F6). If I run a project build on the particular test project I'm currently working on, it takes "only" 8+s. It requires me to run the project build on the correct project (the test project, to ensure the dependent tested code is built as well). At least the ReSharper test runner correctly recognizes that only this single project must be built, so rerunning a test usually involves only 8+s of compilation. My current coding kata is: don't touch Ctrl+Shift+B.
The test project build takes 8s even if I don't make any changes. The reason it takes 8s is that it also "builds" its dependencies - in my case it "builds" more than 20 projects, even though I made changes only to a unit test or a single dependency! I don't want it to touch other projects.
Is there a way to simply tell VS to build only the projects where changes were made, plus the projects that depend on the changed ones (preferably as another build option)? I worry you will tell me that this is exactly what VS is doing, but in the MS way ...
I want to improve my TDD experience and reduce the time of compilation (in TDD the compilation can happen twice per minute).
To make this even more frustrating, I'm working in a team where most developers worked on Java projects prior to joining this one. So you can imagine how pissed off they are when they must use VS, in contrast to the fully incremental compilation they had in Java. I don't require incremental compilation of classes. I expect working incremental compilation of solutions. Especially in a product like VS 2010 Ultimate, which costs several thousand dollars.
I really don't want to get answers like:
- Make a separate solution
- Unload projects you don't need
- etc.
I can read those answers here. Those are not acceptable solutions. We're not paying for VS to make such compromises.
By default, Visual Studio will build every project in your solution when you run a single project, even if that project doesn't depend on every other project in the solution.
Go to Tools | Options | Projects and Solutions | Build and Run and check the box "Only build startup projects and dependencies on Run".
From now on, when you run your project (F5), Visual Studio will only build your startup project and those projects in your solution that it depends on.
Is there a way to simply tell VS to build only the projects where changes were made, plus the projects that depend on the changed ones (preferably as another build option)? I worry you will tell me that this is exactly what VS is doing, but in the MS way ...
Not really (you understand it already).
You are talking about a "build system". MSVS is not that. It is an IDE, which happens to permit you to organize your assets into projects-and-solutions, and yes, to "build". But, it is not a build system. It will never be a build system (long story, but a very different technology is required).
In contrast, MSVS is an IDE for accelerated iterative development, including the "debugging" cycle (e.g., "step-into" and "step-over" in the debugger during a system run). That's where MSVS "shines".
It does not, and will never, "shine" as a build system. That's not what it was created to do. And, this will likely never change (long story, even Microsoft will likely agree).
I'm not trying to be cute, and I sincerely apologize for delivering this news. This answer hurts me too.
I expect working incremental compilation of solutions. Especially in a product like VS 2010 Ultimate, which costs several thousand dollars.
MSVS is an IDE for interactive debugging/development, and not a build system (see above). So, you are measuring it in a product scenario for which it was not designed, and in which it will likely never function as you desire.
I really don't want to get answers like:
- Make a separate solution
- Unload projects you don't need
- etc.
I can read those answers here. Those are not acceptable solutions. We're not paying for VS to make such compromises.
Your expectations are reasonable. I want them too. However, MSVS is not a product that will ever deliver that.
Again, I'm not trying to be "cute". If you are willing to invest in a "build system", you may find value in using something like CMake to manage your configurations and export Makefiles (or something) to perform your "real" builds, but to also "export" *.vcproj and *.sln files for when you want to do work iteratively and interactively within the MSVS IDE.
EDIT: Rather, what you want is an SSD (solid-state disk) for your build workspace to get a 10x improvement in speed, or a RAM disk for a 100x improvement in speed for builds (not kidding, 64 GB of RAM on an LGA2011 socket gives you a 32 GB RAM disk, which is what we use).
One thing you can do is break your app into small solutions, each one being a cohesive part. Build each solution separately. Have each solution use the outputs of the solutions it depends on, rather than using the source code.
This will allow for shorter feedback cycles for each component.
EDIT: Modified Solution
Additionally, you will create an integrative build that, rather than getting all of the sources, compiling, and testing, gets the binary build products of the component CI builds. This integrative build should be triggered to run after every successful component build.
This build should be the binary equivalent of a complete build (which you should still run every night), but it will take considerably less time, because it triggers after a component increment and doesn't need to compile or fetch any sources.
Moreover, if you use an enterprise grade build system that supports the concept of distributing your builds among multiple agents, you will be able to scale your efforts and shorten your complete CI cycle to the amount of time it takes to build the longest component, and test the integrative suite (at most).
Hope this helps.
Weighing in a bit late on this, but have you considered having different build configurations?
You can tell Visual Studio not to build certain projects depending on the build configuration.
The developer could simply select the configuration relevant for the project they're working on.
Pretty ancient thread, but I can say I was suffering from a smaller version of the same thing, and I upgraded to Visual Studio 2012 and the problem seems to have finally been fixed. The RedGate .NET Demon solution mentioned above also seems to work pretty well so far.
This is an old problem.
Use parallel builds and an SSD. See here (I think - quick Google):
http://www.hanselman.com/blog/HackParallelMSBuildsFromWithinTheVisualStudioIDE.aspx
I found a tool which does mostly what I want (and even more): RedGate .NET Demon. It is probably still the first version, because I encountered a few issues in our big solution (problems with C++ projects, problems with switching build targets, and a few others), but I really like it so far. I especially like the way it tries to track changed files in the VS IDE and rebuilds only affected projects.
Edit: .NET Demon has been retired as it should not be needed for VS 2015. It still works with previous versions.
