I'm struggling to get my unit tests to work. I've wrestled with this issue for several hours now and have no explanation for why things don't work. I recently did a fairly major refactor on my codebase and have since gone through and fixed all the unit tests. The test project builds and outputs a new unit test DLL. However, when I go to run the tests in Test Explorer, I get this message:
[2/27/2019 5:08:05 PM Warning] [MSTest][Discovery][C:\pathtotest.dll] Failed to discover tests from assembly C:\pathtotest.dll. Reason:Could not load file or assembly 'System.Runtime, Version=4.1.2.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a' or one of its dependencies. The system cannot find the file specified.
[2/27/2019 5:08:05 PM Warning] No test matches the given testcase filter FullyQualifiedName=<namespace.namespace...testmethod> in C:\pathtotest.dll
Here is what I know:
I recently updated Visual Studio (within the past two weeks, don't remember exactly when I did it).
All NuGet packages in the test project have been updated to their latest versions.
I have another unit test project that targets .NET Core 2.1; this one targets .NET Framework 4.7.2. The other project works.
Some suggestions in other posts are to make sure your test architecture setting is correct and to delete a folder in %TEMP% (I don't recall the exact name, except that it was something about VisualStudioExtensions). The folder they suggested deleting was not in %TEMP%, and I tried running my tests under both architectures with the same result.
So the next step was a sanity check to make sure the built test DLL exists. It does.
At this point I'm about ready to just start a new test project, copy the tests over one by one, and see if maybe one of them is throwing a silent error. I can't find any useful information with my own Google-fu and I'm hoping someone has some useful insight or tricks to try.
From the comment above:
Are you using this: C:\pathtotest.dll?
Also check whether it targets the 32-bit or 64-bit runtime.
In most cases the DLLs are 32-bit.
Hope it helps.
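If you want to confirm the bitness from code rather than guess, a sketch like this works on .NET Framework (the path here is just the placeholder from the question; point it at the real test DLL):
using System;
using System.Reflection;

class CheckBitness
{
    static void Main()
    {
        // Placeholder path from the question; substitute the real location of the test assembly.
        AssemblyName name = AssemblyName.GetAssemblyName(@"C:\pathtotest.dll");

        // MSIL means AnyCPU; X86 and Amd64 mean 32-bit-only and 64-bit-only.
        Console.WriteLine(name.ProcessorArchitecture);
    }
}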
I had a similar issue with a brand new test project in an existing solution - all of my other test projects compiled and tested correctly, but the new one repeatedly came up with the error:
No test matches the given testcase filter...
The answer came from the post "VSTest: A testsettings file or a runsettings with a ForcedLegacyMode set to true is not supported with the MSTest V2 Adapter. No test is available", which suggested switching off the testsettings file that had somehow been selected for the new project.
As soon as I deselected the Test -> Test Settings -> c:...\Repos...testsettings option in Visual Studio the tests were runnable.
In my case, the issue ended up being a mix of two different test engines. I inadvertently decorated my test methods with [TestMethod] (MSTest) when I was using NUnit. Once I changed my test methods to just [Test] and used the proper test runner, I was finally in business.
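For example, a test that NUnit will actually pick up looks roughly like this (class and method names are invented for illustration); [TestMethod] comes from MSTest and means nothing to the NUnit runner:
using NUnit.Framework;

[TestFixture]
public class CalculatorTests
{
    // NUnit looks for [Test]; MSTest's [TestMethod] is invisible to the NUnit runner.
    [Test]
    public void Adds_Two_Numbers()
    {
        Assert.That(1 + 1, Is.EqualTo(2));
    }
}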
I had a similar issue. In my case, the problem was the return type.
The test method needed to be async, but its return type was void instead of Task. A simple, small thing that can eat a lot of time.
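In other words, something like this (a minimal sketch; the names are made up, and the NUnit attributes shown apply equally to MSTest's [TestMethod]):
using System.Threading.Tasks;
using NUnit.Framework;

[TestFixture]
public class AsyncTests
{
    // Returning Task (not void) lets the runner await the test and report it correctly;
    // async void tests are often skipped or rejected by the runner.
    [Test]
    public async Task Saves_Record_Without_Error()
    {
        await Task.Delay(1);   // stand-in for the real asynchronous call under test
        Assert.That(true, Is.True);
    }
}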
Related
I'm working with unit tests in VS2013 Professional. In particular I'm using the NUnit framework (NUnit TestAdapter for VS2013). My problem is that when I run my tests, VS starts building all the projects in the solution. Currently the unit test project does not reference any other project in the solution.
If I simply code a single test method like:
[Test]
public void SimpleTestMethod()
{
    Assert.That("a", Is.EqualTo("a"));
}
and the unit test project is in a solution with N projects, then when I run my test VS will build all the other N-1 projects... In my case this behavior is painful because it takes too much time (the solution contains many projects) and some of the projects contain errors.
Is there a way to run my SimpleTestMethod() without building the complete solution?
Break your test project into multiple projects that each reference only a subset of the solution's projects.
This is also good test housekeeping - have a separate unit test project for each solution project instead of one huge project with dependencies on everything else. There are several advantages to this:
Tests run faster
It's a lot easier to isolate test cases, especially configuration settings
You can version projects and their test cases together
A good naming practice is to name your test projects the same as their target projects with a .Tests suffix. You can also create a solution folder (not a real folder) called "Tests" and move the test projects into it.
As for the why: Test runners use the Unit Test assembly and its dependencies to run their tests. If any of the assembly's dependencies change, the assembly and the dependencies have to be rebuilt. Visual Studio doesn't know what the external tool will call so it has to build all changed assemblies and their dependents.
If the build fails, there are no valid assemblies for the test runner to use so VS has to rebuild the entire solution before the runner can work. In this case, the obvious solution is to fix the error.
There are some stopgap options you can use until you can fix the error:
Temporarily remove the broken project from the build configuration
Split the solution so that you have a solution that can be built and tested
I struggled with this for a very long time as well. I actually hated the automatic build process, even when everything was successful.
I started running tests through the command line instead. No build process is necessary. You can write your own .bat files and keep logs of the test results, and there are plenty of command-line parameters you can add to customize the run for what you're looking for.
https://msdn.microsoft.com/en-us/library/jj155796.aspx
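As a rough sketch (the install path, DLL name, and log file here are placeholders; the linked page lists the full set of switches), a .bat file can be as small as:
rem run_tests.bat - adjust the paths for your machine and solution
"C:\Program Files (x86)\Microsoft Visual Studio 12.0\Common7\IDE\CommonExtensions\Microsoft\TestWindow\vstest.console.exe" ^
  "C:\MySolution\Tests\bin\Debug\MyTests.dll" /Logger:trx > test-run.log
If the tests are NUnit-based, vstest.console also needs the NUnit test adapter to be available, either installed as an extension or pointed at via a switch such as /TestAdapterPath, depending on your VS version.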
Is it possible to let Resharper (or NUnit?) know that I want each test to look for an App.config under its own project, even when running all tests in a solution together?
Background:
I'm using NUnit and the test-runner that ships with Resharper, and I've got several test-projects in the same solution. Some of my tests depend on config-files located under their respective projects.
When I run a test-project by itself, it will use its own App.config and everything works fine. When I try to run all the tests in the solution, or use the shortcut to run all tests in the current test session, however, no config-file gets picked up, and any test depending on a config will fail by default.
For this reason, I typically end up running all tests in the solution once first, then right-clicking the nodes in the test-runner for each of the config-dependent projects and running them separately afterwards.
Solved:
Apparently assemblies containing NUnit-tests can be run in separate processes or domains using command line options.
For the testrunner under Resharper, this setting can be found under Resharper > Options > Unit Testing.
There is an option "Use separate AppDomain for each assembly with tests". Checking that solved my problem.
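For context, the config-dependent tests involved are of this general shape (all names invented); ConfigurationManager resolves App.config per AppDomain, which is why a single shared AppDomain ends up seeing the wrong config file or none at all:
using System.Configuration;   // requires a reference to System.Configuration
using NUnit.Framework;

[TestFixture]
public class ConfigDependentTests
{
    [Test]
    public void Reads_Setting_From_Own_AppConfig()
    {
        // "StoragePath" is a hypothetical key defined in this project's App.config.
        string value = ConfigurationManager.AppSettings["StoragePath"];
        Assert.That(value, Is.Not.Null.And.Not.Empty);
    }
}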
I've written a couple of unit tests in a separate project. While developing, I loaded the DLL into NUnit.exe each time to check the results. Now that I am done writing the unit tests, how do I organize them and attach them to the solution?
I tried creating a "tools" folder in the solution directory, placed all the NUnit-related libraries there, and hooked NUnit into the post-build event of the test project like below, and it works:
"$(SolutionDir)tools\nunit\nunit-console.exe" "$(TargetPath)"
But one of my tests refers to an SQLite DB. If I hardcode its location it works fine, but I intend to place it in the test project under an "App_Data" folder. So I tried the code below to get a relative path, but it is not working. When I copied the file to the "tools" folder it worked fine, so I'd guess the execution context is the NUnit folder.
Path.Combine(Path.GetDirectoryName(System.Reflection.Assembly.GetExecutingAssembly().Location),"App_Data\\test.txt")
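For what it's worth, a sketch that tends to survive NUnit's shadow copying (assuming test.txt is marked "Copy to Output Directory" so it ends up under App_Data next to the test DLL) resolves the path from the assembly's CodeBase instead of Location, since Location can point into the shadow-copy cache:
using System;
using System.IO;
using System.Reflection;

static class TestPaths
{
    // CodeBase is the original build-output location of the test assembly;
    // Location may point into NUnit's shadow-copy folder instead.
    public static string DataFile(string fileName)
    {
        string codeBase = new Uri(Assembly.GetExecutingAssembly().CodeBase).LocalPath;
        return Path.Combine(Path.GetDirectoryName(codeBase), "App_Data", fileName);
    }
}
On newer NUnit versions, TestContext.CurrentContext.TestDirectory returns the same folder directly.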
If you don't specifically need NUnit, you could just as well use the built-in Visual Studio support for unit testing... It has been incorporated into the newest Express editions as well, and you won't have to struggle with problems like this ;)
Either way, I believe you've misunderstood the basics of unit testing. We never test against an actual database; that's a very bad practice. You should do some research on the topic and look into the theory behind mocking things like this.
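A minimal sketch of what I mean (all of these names are invented): put the database behind an interface and hand the test a fake, so the test needs no SQLite file at all.
using NUnit.Framework;

public interface ICustomerStore
{
    string GetName(int id);
}

public class GreetingService
{
    private readonly ICustomerStore _store;

    public GreetingService(ICustomerStore store)
    {
        _store = store;
    }

    public string Greet(int id)
    {
        return "Hello " + _store.GetName(id);
    }
}

// Hand-rolled fake used only by the tests; a mocking library would work just as well.
public class FakeCustomerStore : ICustomerStore
{
    public string GetName(int id)
    {
        return "Alice";
    }
}

[TestFixture]
public class GreetingServiceTests
{
    [Test]
    public void Greets_Customer_By_Name()
    {
        var service = new GreetingService(new FakeCustomerStore());
        Assert.That(service.Greet(42), Is.EqualTo("Hello Alice"));
    }
}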
I'm trying to run some unit tests for NxBRE before I start referencing its implementation in the rules-engine project I'm working on. I'm using NUnit 2.6 to test NxBRE 3.2. Since NxBRE comes with its own unit tests in its own friendly project folder that uses NUnit.Framework, I figured it would be quick and painless. After making sure everything compiled, I went ahead and ran the tests... and got a million errors, mostly along the lines of:
NxBRE.Test.FlowEngine.TestBackwardChainer.CircularityDetection:
SetUp : System.IO.FileNotFoundException : Could not find file 'C:\car-loan-rules.xbre'.
or:
NxBRE.Test.InferenceEngine.TestEngineCoreFeaturesRuleML091.TestEngineCoreFeaturesRuleML09.NxBREOperators:
System.IO.FileNotFoundException : Could not find file 'C:\test-0_91.ruleml'.
Befuddled, I went to the NxBRE website and looked for information about their unit-tests. This was all I could find: http://sourceforge.net/apps/trac/nxbre/wiki/UnitTesting
Which doesn't describe the process very specifically. How do I configure the engine so that the paths point to the correct location of the test rule bases? Is this something I have to do in NUnit, or in my IDE (SharpDevelop)? Also, I know where the output folder is, but how do I figure out which DTD or XML files I need to copy there? This probably exposes my inexperience, which is where your expertise would be much appreciated.
Well, I figured it out: the unit tests were set up to look in locations that didn't exist, and bundled with NxBRE is a PDF that describes the fields that need to be defined for them to run.
Summary: I can run unit tests and code-coverage, but the report only includes NUnit classes, not my application classes.
I have successfully used PartCover in the past. Not so this time. I tried the latest PartCover (4.0), downgraded to the next latest (2.0), both with NUnit 2.5.6.
I created a simple .NET 4.0 class library (I also tried this with a web application that has a class-library project) with a single class in some namespace, and two test methods in another class in a separate library.
NUnit/PartCover installed correctly; I can run the NUnit tests both in NUnit, and through PartCover (I can see them running and saying "2 passed"), but the report only shows me NUnit namespaces. (Yes, I'm using +[] as my coverage rule.)
Any ideas? As much as I like NUnit, I'd like to see coverage for my own classes :o)
And I also tried aligning the test-DLL and code-DLL namespaces, to no avail.
Edit: I tried re-running my previously working code-covered sample from a year ago; all the tests run, but the actual project namespaces don't appear. There's a hint here, which seems to imply that it depends on the NUnit version you use: http://sourceforge.net/projects/partcover/forums/forum/605222/topic/3308367 (and yes, I already tried the appdomain-reporting checkbox)
I've tried NUnit 2.5.5.x and 2.5.6.x and both give me the same results.
Edit: This fork of the official 4.0 version seems to work, albeit sporadically (Google for "PartCover fork"; I can't add more hyperlinks).
Madness. Apparently, pressing Pause/Break on your keyboard after NUnit prints out the summary of total passed/failed, and waiting approximately one second for the second "CoreProfiler is turned off" message, makes it all work.
Surely, this cannot be the real solution. Sure, I can rig up a batch file that'll sleep ~1 second after executing NUnit, but this seems like a major hack.
The correct way to handle this is to add the required runtime to NUnit's configuration file. You will notice that NUnit is running under CLR 2.0 instead of 4.0. There are numerous answers to this question on SO, but I found this one first. Doing this alone fixed it for me. Note that your version of the runtime may be slightly different, so you may need to confirm.
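For reference, the change usually boils down to adding a startup section like the one below to nunit.exe.config (and nunit-console.exe.config); treat the exact version string as an assumption and match it to the runtime actually installed on your machine:
<configuration>
  <startup useLegacyV2RuntimeActivationPolicy="true">
    <!-- Forces NUnit to run under the .NET 4.0 CLR so that 4.0 test assemblies
         and the coverage profiler are loaded in the same runtime. -->
    <supportedRuntime version="v4.0.30319" />
  </startup>
</configuration>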