Writing standards for unit testing - C#

I plan to introduce a set of standards for writing unit tests into my team. But what to include?
These two posts (Unit test naming best practices and Best practices for file system dependencies in unit/integration tests) have given me some food for thought already.
My standards should also cover how test classes are set up and how to organize them. For example, if you have a class called OrderLineProcessor, there should be a test class called OrderLineProcessorTest. If there's a method called Process() on that class, then there should be a test called ProcessTest (maybe more, to test different states).
Any other things to include?
Does your company have standards for unit testing?
EDIT: I'm using Visual Studio Team System 2008 and I develop in C#.Net

Have a look at Michael Feathers on what a unit test is (or what makes a unit test a bad unit test).
Have a look at the idea of "Arrange, Act, Assert", i.e. the idea that a test does only three things, in a fixed order:
Arrange any input data and processing classes needed for the test
Perform the action under test
Test the results with one or more asserts. Yes, it can be more than one assert, so long as they all work to test the action that was performed.
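As a minimal sketch of that layout in C# with NUnit (the OrderLineProcessor and OrderLine types echo the example from the question and are hypothetical):
using NUnit.Framework;

[TestFixture]
public class OrderLineProcessorTest
{
    [Test]
    public void ProcessTest()
    {
        // Arrange: build the input data and the object under test
        var orderLine = new OrderLine("SKU-1", 2);  // hypothetical type from the question
        var processor = new OrderLineProcessor();

        // Act: perform the single action under test
        var result = processor.Process(orderLine);

        // Assert: one or more asserts, all about the same action
        Assert.IsTrue(result.Succeeded);
        Assert.AreEqual(2, result.ProcessedQuantity);
    }
}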
Have a look at Behaviour Driven Development for a way to align test cases with requirements.
Also, my opinion on standards documents today is that you shouldn't write them unless you have to - there are already lots of well-written resources available. Link to them rather than rehashing their content. Provide a reading list for developers who want to know more.

You should probably take a look at the "Pragmatic Unit Testing" series. This is the C# version but there is another for Java.
With respect to your spec, I would not go overboard. You have a very good start there - the naming conventions are very important. We also require that the directory structure match the original project. Coverage also needs to extend to boundary cases and illegal values (checking for exceptions). This is obvious, but your spec is the place to write it down, for the argument you'll inevitably have in the future with the guy who doesn't want to test for someone passing an illegal value. But don't make the spec more than a few pages, or no one will use it for a task that is so context-dependent.
Update: I disagree with Mr. Potato Head about only one assert per Unit Test. It sounds quite fine in theory but, in practice, it leads to either loads of mostly redundant tests or people doing tons of work in setup and tear-down that itself should be tested.

I follow the BDD style of TDD. See:
http://blog.daveastels.com/files/BDD_Intro.pdf
http://dannorth.net/introducing-bdd
http://behaviour-driven.org/Introduction
In short, this means that:
The tests are not thought as "tests", but as specifications of the system's behaviour (hereafter called "specs"). The intention of the specs is not to verify that the system works under every circumstance. Their intention is to specify the behaviour and to drive the design of the system.
The spec method names are written as full English sentences. For example the specs for a ball could include "the ball is round" and "when the ball hits a floor then it bounces".
There is no forced 1:1 relation between the production classes and the spec classes (and generating a test method for every production method would be insane). Instead there is a 1:1 relation between the behaviour of the system and the specs.
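As a hypothetical sketch of what such specs could look like in C# with NUnit (the Ball class and its members are invented for illustration):
using NUnit.Framework;

// The spec class is named after a behaviour, not after a production class.
[TestFixture]
public class BallSpec
{
    [Test]
    public void TheBallIsRound()
    {
        Assert.IsTrue(new Ball().IsRound);
    }

    [Test]
    public void WhenTheBallHitsAFloorThenItBounces()
    {
        var ball = new Ball();
        ball.HitFloor();
        Assert.IsTrue(ball.IsBouncing);
    }
}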
Some time ago I wrote a TDD tutorial (where you begin writing a Tetris game using the provided tests) which shows this style of writing tests as specs. You can download it from http://www.orfjackal.net/tdd-tutorial/tdd-tutorial_2008-09-04.zip The instructions on how to do TDD/BDD are still missing from that tutorial, but the example code is ready, so you can see how the tests are organized and write code that passes them.
You will notice that in this tutorial the production classes are named such as Board, Block, Piece and Tetrominoe which are centered around the concepts of a Tetris game. But the test classes are centered around the behaviour of the Tetris game: FallingBlocksTest, RotatingPiecesOfBlocksTest, RotatingTetrominoesTest, FallingPiecesTest, MovingAFallingPieceTest, RotatingAFallingPieceTest etc.

Try to use as few assert statements per test method as possible. This makes sure that the purpose of the test is well-defined.
I know this will be controversial, but don't test the compiler - time spent testing Java Bean accessors and mutators is better spent writing other tests.
Try, where possible, to use TDD instead of writing your tests after your code.

I've found that most testing conventions can be enforced through the use of a standard base class for all your tests, forcing the tester to override methods so that they all have the same name.
I also advocate the Arrange-Act-Assert (AAA) style of testing as you can then generate fairly useful documentation from your tests. It also forces you to consider what behaviour you are expecting due to the naming style.
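One possible shape for such a base class, as a sketch (the member names are my own invention, not from any particular framework):
using NUnit.Framework;

// Every fixture inherits this class and is forced into the same three-step shape.
public abstract class TestBase
{
    protected abstract void Arrange();      // build inputs and the object under test
    protected abstract void Act();          // perform the single action under test
    protected abstract void AssertResult(); // verify the expected behaviour

    [Test]
    public void RunScenario()
    {
        Arrange();
        Act();
        AssertResult();
    }
}
Each derived fixture then documents exactly one behaviour, which is what makes the generated documentation readable.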

Another item you can put in your standards is to keep your unit tests small - that is, the actual test methods themselves. Unless you are doing a full integration test, there is usually no need for large unit tests, say more than 100 lines. I'll allow that much in case you have a lot of setup to get to your one test. However, if you do, you should probably refactor it.
People also talk about refactoring their code; make sure people realize that unit tests are code too. So refactor, refactor, refactor.
The biggest problem I have seen is that people do not recognize that you want to keep your unit tests light and agile - you don't want a monolithic beast for your tests, after all. With that in mind, if you have a method you are trying to test, you should not test every possible path in one unit test. You should have multiple unit tests to account for every possible path through the method, as sketched below.
Yes, if you are doing your unit tests correctly you should, on average, have more lines of unit test code than application code. While this sounds like a lot of work, it will save you a lot of time in the end, when the inevitable business requirement change comes along.
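A hypothetical illustration of the one-test-per-path idea, using NUnit and an invented Divide method with two paths:
using System;
using NUnit.Framework;

// Invented example class with two paths through Divide.
public static class Divider
{
    public static int Divide(int a, int b)
    {
        if (b == 0) throw new ArgumentException("divisor must not be zero");
        return a / b;
    }
}

[TestFixture]
public class DividerTests
{
    [Test]
    public void Divide_ReturnsQuotient_ForValidInput()
    {
        Assert.AreEqual(2, Divider.Divide(10, 5));
    }

    [Test]
    public void Divide_ThrowsArgumentException_WhenDivisorIsZero()
    {
        Assert.Throws<ArgumentException>(() => Divider.Divide(10, 0));
    }
}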

Users of full-featured IDEs will find that "some of them" have quite detailed support for creating tests in a specific pattern. Given this class:
public class MyService {
    public String method1() {
        return "";
    }

    public void method2() {
    }

    public void method3HasAlongName() {
    }
}
When I press Ctrl-Shift-T in IntelliJ IDEA, I get this test class after answering one dialog box:
public class MyServiceTest {
    @Test
    public void testMethod1() {
        // Add your code here
    }

    @Test
    public void testMethod2() {
        // Add your code here
    }

    @Test
    public void testMethod3HasAlongName() {
        // Add your code here
    }
}
So you may want to take a close look at tool support before writing your standards.

I use nearly plain English for my unit test function names. This helps to define exactly what they do:
TEST( TestThatVariableFooDoesNotOverflowWhenCalledRecursively )
{
    /* do test */
}
I use C++ but the naming convention can be used anywhere.

Make sure to include what is not a unit test. See: What not to test when it comes to Unit Testing?
Include a guideline so integration tests are clearly identified and can be run separately from unit tests. This is important, because you can end up with a set of "unit" tests that are really slow when unit tests are mixed with other types of tests.
Check this for more info on it: How can I improve my junit tests ... especially the second update.

If you are using tools from the JUnit family (OCUnit, SHUnit, ...), the names of tests already follow some rules.
For my tests, I use custom Doxygen tags in order to gather their documentation on a specific page.

Related

How to do "component testing" in Visual Studio

I am looking for ideas to lead me to implement component testing for my application. Sure, I use unit testing to test my single methods, utilizing TestMethods in a separate project, but at this point I am more interested in testing at a higher level. Say I have a class for caching, and I wrote unit tests for each and every method. They all contain their own instance of the class, and it works fine: when I run the tests, each one instantiates an object of that class and does one thing with it. But this doesn't cover the real-life scenario in which a method is called by other methods, and so on. I want to be able to test the entire caching component. How should I do it?
It sounds like you are talking about integration testing. Unit testing, as you say, does a great job of testing classes and methods in isolation, but integration testing tests that several components work together as expected.
One way to do this is to pick a top (or high) level object, create it with all of its dependencies as "real" objects as well and test that the public methods all produce the expected result.
In most cases you'll probably have to substitute stubs for the lowest-level classes, like DB or file access classes, and instrument them for the tests, but most objects would be the real thing.
Of course, like most testing efforts, all of this is much easier to achieve if your classes have been designed with some sort of dependency injection and with attention to good design patterns like separation of concerns.
All of this can be done using the same unit testing tools you've been using.
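As a rough sketch of the approach in NUnit (every type name here is an invented stand-in for your caching component; only the lowest-level store is stubbed, while the component under test is real):
using System.Collections.Generic;
using NUnit.Framework;

public interface ICustomerStore { string Load(int id); }

// The real component under test.
public class CustomerCache
{
    private readonly ICustomerStore store;
    private readonly Dictionary<int, string> cached = new Dictionary<int, string>();

    public CustomerCache(ICustomerStore store) { this.store = store; }

    public string Get(int id)
    {
        string value;
        if (!cached.TryGetValue(id, out value))
        {
            value = store.Load(id);
            cached[id] = value;
        }
        return value;
    }
}

// An instrumented stub replacing the real database-backed store.
public class StubCustomerStore : ICustomerStore
{
    public int LoadCount;
    public string Load(int id) { LoadCount++; return "customer-" + id; }
}

[TestFixture]
public class CachingComponentIntegrationTest
{
    [Test]
    public void SecondGetIsServedFromTheCacheNotTheStore()
    {
        var store = new StubCustomerStore();
        var cache = new CustomerCache(store);

        cache.Get(42);
        cache.Get(42);

        Assert.AreEqual(1, store.LoadCount); // only the first call reached the store
    }
}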
I would download the NUnit framework:
http://www.nunit.org/
It's free and simple.

Do I need duplicate test methods; 1 for unit and 1 for integration tests?

I'm new to unit testing, and it seems like most of the information I find is on the unit testing side of things. I'm getting a good grasp of this and am planning on using the MS Test Framework with Moq, so I don't have to hand-roll any mocks for my unit test dependencies.
Let's say I have the following unit test method:
[TestMethod]
public void GetCustomerByIDUnitTest()
{
    //Uses Moq for dependency for getting customer to make sure
    //ID I set up is same one returned to test in Assertion
}
Do I have to create another identical test that instead uses the actual Entity Framework and Database call to make an integration test?
[TestMethod]
public void GetCustomerByIDIntegrationTest()
{
    //Uses actual repository interface for EF and DB to do integration testing
}
For the purpose of this question, please leave topics about TDD or BDD out; I'm simply trying to determine if I physically need two separate tests and how to organize them. Is this a requirement when doing both unit and integration testing?
Thanks!
In my opinion, it is somewhat situational. If I am working on a small personal project, then no, I just do the unit tests.
If it is a corporate / enterprise project then I do tend to do both unit and integration tests. However, keep unit and integration tests separated. Developers should be able to run unit tests frequently and quickly. Integration tests can be run less frequently because they usually take a long time to run. Usually I just run integration tests once before a commit, whereas I run unit tests much more frequently.
As an additional note, make your test names explain what should be happening. The test name GetCustomerByIDUnitTest really doesn't tell me much. Better would be something like GetCustomerByID_ReturnsTheCorrectUser_WhenAValidIdIsPassed and, conversely, GetCustomerByID_ReturnsNull_WhenNonExistentIdIsPassed.
I tend to favor a What_Does_When naming convention, but that too is a personal preference. In general, the more explanatory the better though.
Hmm, I hope I do not fail you by mentioning things you would rather have unmentioned, but here are my 2 cents on it. One disclaimer up front: I use NUnit with Rhino Mocks, so the syntax could be different; the concepts are the same though.
Yes, you need separate tests. You can debate whether you want to store the tests in the same test class and tag them with [Category("integrationtest")], so that you can easily run your unit tests without running integration tests and the other way around - see the sketch below. With your TDD practices (oops, I know you don't want me talking about that :)) you need your unit tests to complete as fast as possible.
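For example, a minimal sketch of that tagging (the test names are invented):
using NUnit.Framework;

[TestFixture]
public class CustomerTests
{
    [Test]
    public void GetCustomerByID_ReturnsCustomer_WhenIdExists()
    {
        // fast unit test using a mocked repository
    }

    [Test, Category("integrationtest")] // tagged so it can be excluded from quick runs
    public void GetCustomerByID_ReturnsCustomer_FromRealDatabase()
    {
        // slower test against Entity Framework and a real database
    }
}
Most NUnit runners can then include or exclude categories; for example, the NUnit 2.x console runner has an /exclude option you can point at integrationtest to keep the fast feedback loop fast.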
To look at this from a slightly different angle; you are not really duplicating your tests. Your integration tests validate the functionality, while your unit tests validate a method in isolation. So they can very well have completely different names. As long as they make sense to you (or if you develop something with a team: as long as it makes sense for your team).
I think the most important thing is that you find a way that works for you. There isn't really a right or wrong. I think it's a big plus that you are writing both unit tests and integration tests. How you organize them is kinda up to you. I had different approaches in different projects I participated in:
Project A:
1 test class for integration tests
1 test class for unit tests
That helped to create meaningful names for the test classes; they could capture the actual feature we were testing. As for the unit tests, the test class had the same name as the class we were testing.
Project B:
Mixed up integration tests with unit tests in one test class.
This worked fine as well, although we sometimes had trouble finding an integration test. But to be honest, with ReSharper at your side, how hard can it be :).
As far as I know, you should have separate projects for unit testing and integration testing.
The suggestion of this book is to create two projects and name them like ProjectName.UnitTests and ProjectName.IntegrationTests.
Developers must be able to run each of them separately and easily.
You can find many interesting topics and videos about testing here

How to organize unit tests and do not make refactoring a nightmare?

My current way of organizing unit tests boils down to the following:
Each project has its own dedicated project with unit tests. For a project BusinessLayer, there is a BusinessLayer.UnitTests test project.
For each class I want to test, there is a separate test class in the test project placed within exactly the same folder structure and in exactly the same namespace as the class under test. For a class CustomerRepository from a namespace BusinessLayer.Repositories, there is a test class CustomerRepositoryTests in a namespace BusinessLayerUnitTests.Repositories.
Methods within each test class follow simple naming convention MethodName_Condition_ExpectedOutcome. So the class CustomerRepositoryTests that contains tests for a class CustomerRepository with a Get method defined looks like the following:
[TestFixture]
public class CustomerRepositoryTests
{
    [Test]
    public void Get_WhenX_ThenRecordIsReturned()
    {
        // ...
    }

    [Test]
    public void Get_WhenY_ThenExceptionIsThrown()
    {
        // ...
    }
}
This approach has served me quite well, because it makes locating the tests for a piece of code really simple. On the other hand, it makes refactoring more difficult than it should be:
When I decide to split one project into multiple smaller ones, I also need to split my test project.
When I want to change namespace of a class, I have to remember to change a namespace (and folder structure) of a test class as well.
When I change name of a method, I have to go through all tests and change the name there, as well. Sure, I can use Search & Replace, but that is not very reliable. In the end, I still need to check the changes manually.
Is there some clever way of organizing unit tests that would still allow me to locate tests for a specific code quickly and at the same time lend itself more towards refactoring?
Alternatively, is there some, uh, perhaps Visual Studio extension that would allow me to somehow say "hey, these tests are for that method, so when the name of the method changes, please be so kind and change the tests as well"? To be honest, I am seriously considering writing something like that myself :)
After working a lot with tests, I've come to realize that (at least for me) having all those restrictions brings a lot of problems in the long run, rather than good things. So instead of using names and conventions to determine what tests what, we've started using code. Each project and each class can have any number of test projects and test classes. All the test code is organized based on what is being tested from a functionality perspective (or which requirement it implements, or which bug it reproduces, etc...). Then, to find the tests for a piece of code, we do this:
[TestFixture]
public class MyFunctionalityTests
{
    public IEnumerable<Type> TestedClasses()
    {
        // We can find the tests for a class because the test cases reference it in this special method.
        return new[] { typeof(SomeTestedType), typeof(OtherTestedType) };
    }

    [Test]
    public void TestRequirement23423432()
    {
        // ... test code.
        // We do something similar for methods if we want to track which methods are being tested (we usually don't):
        this.TestingMethod(someObject.methodBeingTested);
        // ...
    }
}
We can use tools like ReSharper's "Find Usages" to find the test cases, etc... And when that's not enough, we do some magic with reflection and LINQ by loading all the test classes and running something like allTestClasses.Where(testClass => testClass.TestedClasses().FindSomeTestClasses());
You can also use TearDown to gather information about which methods are tested by each test method/class, and do the same.
One way to keep class and test locations in sync when moving the code:
Move the code to a uniquely named temporary namespace
Search for references to that namespace in your tests to identify the tests that need to be moved
Move the tests to the proper new location
Once all references to the temporary namespace from tests are in the right place, then move the original code to its intended target
One strength of end-to-end or behavioral tests is the tests are grouped by requirement and not code, so you avoid the problem of keeping test locations in sync with the corresponding code.
Regarding VS extensions that associate code to tests, take a look at Visual Studio's Test Impact. It runs the tests under a profiler and creates a compact database that maps IL sequence points to unit tests. So in other words, when you change the code Visual Studio knows which tests need to be run.
One unit test project per project is the way to go. We tried a mega unit test project, but it increased compile time.
To help you refactor, use a product like ReSharper or CodeRush.
"Is there some clever way of organizing unit tests that would still allow me to locate tests for a specific piece of code quickly?"
ReSharper has some good shortcuts that allow you to search for files or code.
As you said, for the class CustomerRepository there is a test class CustomerRepositoryTests.
The R# shortcut shows an input box for what you want to find; in your case you can just input CRT and it will show you all the files whose names have a capital C, then R, then T.
It also allows you to search with wildcards; for example, CR* will show you the files CustomerRepository and CustomerRepositoryTests.

Is there a common way to test complex functions with NUnit?

Is there a common way to test complex functions with several parameters with NUnit? I think it is very hard, or impossible, to test every condition.
I'm afraid that a combination of parameters that isn't expected in the function is also not expected in the test.
So the expected conditions will not fail, but the unexpected ones will.
Thanks
This shouldn't be hard to test at all. If it is, the method isn't designed for testability, and that is a code smell that tells you that you need to refactor it.
I tend to write tests in these cases as follows (others may have better suggestions):
Does it work as intended when all appropriate parameters are passed?
Does it throw expected exceptions when I think it should? (ArgumentNullException, etc.)
For each parameter, what happens when I pass null, the minimum and the maximum. (This can be very extensive, depending on the number of arguments.)
If your method takes a lot of parameters, consider refactoring it to take an object with the information on it, so that you can encapsulate the rules for it in the object, and pass the object to the method.
For data-driven tests in NUnit, there is the [TestCase] attribute. Unit tests usually don't test every possible scenario; they just test a representative set of inputs with good coverage of what the SUT does on various inputs. Just pick some characteristic inputs, and you'll be fine - see the sketch below.
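A minimal sketch (requires NUnit 2.5 or later; Calculator is an invented stand-in for the SUT):
using NUnit.Framework;

public static class Calculator
{
    public static int Add(int a, int b) { return a + b; }
}

[TestFixture]
public class CalculatorTests
{
    // One attribute per representative input; NUnit runs the method once per case.
    [TestCase(1, 1, 2)]
    [TestCase(0, 5, 5)]
    [TestCase(-3, 3, 0)]
    public void Add_ReturnsSum(int a, int b, int expected)
    {
        Assert.AreEqual(expected, Calculator.Add(a, b));
    }
}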
Don't know if this is the kind of thing you are looking for, but there is an automated unit test generator created by Microsoft Research called Pex.
Pex automatically generates test suites with high code coverage. Right from the Visual Studio code editor, Pex finds interesting input-output values of your methods, which you can save as a small test suite with high code coverage. Microsoft Pex is a Visual Studio add-in for testing .NET Framework applications.
Use RowTest. A similar question can be found at
C#, NUnit Assert in a Loop
Have a look at Sam Holder's reply there; I copied the code from it, with a few tweaks.
[TestFixture]
public class TestExample
{
    [RowTest]
    [Row(1)]
    [Row(2)]
    [Row(3)]
    [Row(4)]
    public void TestMethodExample(int value)
    {
        // ...
        Assert.IsTrue(value > 0); // assert some condition on the row value
    }
}
I agree with Mike Hofer that the question indicates a code smell.
Nevertheless, NUnit has a Combinatorial attribute that might help you, if you're not refactoring/redesigning.
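A small sketch of how it pairs with [Values] to generate every combination of parameters (the method under test is left hypothetical):
using NUnit.Framework;

[TestFixture]
public class CombinatorialExampleTests
{
    // NUnit generates one test per combination: 3 quantities x 2 flags = 6 cases.
    [Test, Combinatorial]
    public void Process_HandlesCombination(
        [Values(1, 2, 3)] int quantity,
        [Values(true, false)] bool expressShipping)
    {
        // ... call the method under test with (quantity, expressShipping) and assert
    }
}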

Is it ok to change method visibility for the sake of unit testing?

Many times I find myself torn between making a method private, to prevent someone from calling it in a context that doesn't make sense (or would screw up the internal state of the object involved), and making the method public (or, typically, internal) in order to expose it to the unit test assembly. I was just wondering what the Stack Overflow community thinks of this dilemma.
So I guess the question truly is, is it better to focus on testability or on maintaining proper encapsulation?
Lately I've been leaning towards testability, as most of the code is only going to be leveraged by a small group of developers, but I thought I would see what everyone else thinks.
It's NOT OK to change method visibility on methods that customers or users can see. Doing this is ugly, a hack; it exposes methods that any user could try to call and blow up your app... it's a liability you do not need.
You are using C#, yes? Check out the InternalsVisibleTo attribute.
You can declare your testable methods as internal, and allow your unit testing assembly access to your internals.
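A minimal sketch, assuming a test assembly named MyProject.UnitTests:
// In the production project (typically in AssemblyInfo.cs):
using System.Runtime.CompilerServices;

[assembly: InternalsVisibleTo("MyProject.UnitTests")]

// Now the method can stay internal instead of public:
internal class OrderLineProcessor
{
    internal void Process() { /* ... */ }
}
Note that if your assemblies are strongly named, the attribute must include the test assembly's full public key.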
It depends on whether the method is part of a public API or not. If a method does not belong to part of a public API, but is called publicly from other types within the same assembly, use internal, friend your unit test assembly, and unit test it.
However, if the method is not/should not be part of a public API, and it is not called by other types internal to the assembly, DO NOT test it directly. It should be protected or private, and it should only be tested indirectly by unit testing your public API. If you write unit tests for non-public (or what should be non-public) members of your types, you are binding test code to internal implementation details.
That's a bad kind of coupling: it increases the number of unit tests you need and increases workload both in the short term (more unit tests) and in the long term (more test maintenance and modification in response to refactoring internal implementation details). Another problem with testing non-public members is that you test code that may not actually be needed or used. A GREAT way to find dead code is when it is not covered by any of your unit tests while your public API is covered 100%. Removing dead code is a great way to keep your code base lean and mean, and it is impossible if you are not careful about what you put into your public API and what parts of your code you unit test.
EDIT:
As a quick additional note...with a properly designed public API, you can very effectively use a tool like Microsoft PEX to automatically generate full-coverage unit tests that test every execution path of your code. Combined with a few manually written tests that cover critical behavior, anything not covered can be considered dead code and removed, and you can greatly shortcut your unit testing process.
This is a common thought.
It's generally best to test the private methods by testing the public methods that call them (so you don't explicitly test the private methods). However, I understand that there are times when you really do want to test those private methods.
The answers to this question (Java) and this question (.NET) should be helpful.
To answer the question: no, you shouldn't change method visibility for the sake of testing. You generally shouldn't be testing private methods, and when you do, there are better ways to do it.
In general I agree with @jrista. But, as usual, it depends.
When trying to work with legacy code, the key is to get it under test. After that, you can add tests for new features and existing bugs, refactor to improve design, etc. This is risky without tests. Legacy code tends to be rife with dependencies, and is often extremely difficult to get under test.
In Working Effectively with Legacy Code, Michael Feathers suggests multiple techniques for getting code under test. Many of these techniques involve breaking encapsulation or complicating the design, and the author is up front about this. Once tests are in place, the code can be improved safely.
So for legacy code, do what you have to do.
In .NET you can use accessors for unit testing, rather than the InternalsVisibleTo attribute. Accessors allow you to get access to any method in the class, even if it is private. They even let you test abstract classes using an empty mock derived object (see the "PrivateObject" class).
Basically, in your test project you use the accessor class rather than the actual class with the methods you want to test. The accessor class is the same as the "real" class, except everything is public to your test project. Visual Studio can generate accessors for you.
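A minimal sketch using the PrivateObject class from MSTest (the class and method names here are invented):
using Microsoft.VisualStudio.TestTools.UnitTesting;

[TestClass]
public class OrderLineProcessorPrivateTests
{
    [TestMethod]
    public void CanInvokeAPrivateMethodViaPrivateObject()
    {
        var target = new OrderLineProcessor();  // hypothetical class with a private ValidateLine method
        var accessor = new PrivateObject(target);

        // Invoke the private method by name, through reflection.
        object result = accessor.Invoke("ValidateLine", "SKU-1");

        Assert.IsNotNull(result);
    }
}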
NEVER make a type more visible to facilitate unit testing.
IMO it is WRONG to say that you should not unit test private methods. Unit tests are of exceptional value for regression testing, and there is no reason why private methods should not be regression tested with granular unit tests.
