How to do "component testing" in Visual Studio - c#

I am looking for ideas to lead me to implement component testing for my application. Sure, I use unit testing to test my single methods by utilizing TestMethods in a separate project, but at this point I am more interested in testing at a higher level. Say that I have a class for caching and I wrote unit tests for each and every method. Each test contains its own instance of the class, and it works fine when I run the tests: each one creates an object of that class and does one thing with it. But this doesn't cover the real-life scenario in which the method is called by other methods, and so on. I want to be able to test the entire caching component. How should I do it?

It sounds like you are talking about integration testing. Unit testing, as you say, does a great job of testing classes and methods in isolation, but integration testing verifies that several components work together as expected.
One way to do this is to pick a top-level (or high-level) object, create it with all of its dependencies as "real" objects as well, and test that the public methods all produce the expected results.
In most cases you'll probably have to substitute stubs for the lowest-level classes, like DB or file access classes, and instrument them for the tests, but most objects would be the real thing.
Of course, like most testing efforts, all of this is much easier if your classes have been designed with some sort of dependency injection and with attention to good design principles like separation of concerns.
All of this can be done using the same unit testing tools you've been using.
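For the caching example, such a test might look roughly like this (a sketch only; all type names here are hypothetical stand-ins for your own classes):
[TestMethod]
public void Cache_RoundTripsAValueThroughItsRealCollaborators()
{
    // Only the lowest level is stubbed; everything else is the real thing.
    var stubStore = new InMemoryBackingStoreStub();
    var cache = new Cache(new CachePolicy(), stubStore);

    cache.Put("key", "value");

    Assert.AreEqual("value", cache.Get("key"));
}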

I would download the NUnit framework:
http://www.nunit.org/
It's free and simple.
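A minimal NUnit fixture has roughly this shape (the class under test is illustrative):
using NUnit.Framework;

[TestFixture]
public class CacheTests
{
    [Test]
    public void Get_ReturnsThePreviouslyStoredValue()
    {
        var cache = new Cache();
        cache.Put("key", "value");
        Assert.AreEqual("value", cache.Get("key"));
    }
}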

Related

DDD and Unit Test, should I create entities directly or through their Domain service?

Some of the entities under test cannot be created directly using the constructor, but only through a domain service, because a repository is needed, perhaps for some validation that requires a hit on the DB (imagine a unique-code validation).
In my tests I have two options:
Create the entity using the domain service that exposes entity creation; this requires me to mock all the repository interfaces needed by that service and instruct the relevant ones to behave correctly for a successful creation.
Somehow use the entity constructor directly (I use C#, so I can expose an internal constructor to the test assembly) and get the entity, bypassing the service logic.
I'm not sure which is the best approach.
The 1st is the one I prefer because it tests the public behaviour of the domain model, since from an outside perspective the only way to create the entity is to pass through the domain service. But this solution brings in a lot of "Arrange" code due to the mock configuration needed.
The 2nd one is more direct: it creates the object bypassing the service logic, but it's a sort of cheating on the domain model; it assumes that the test code knows the internals of the domain model, and that's not a good thing. But the code is a bit more readable.
I make use of builders to create entities in tests, so the configuration code needed by the 1st approach would be isolated in the builder code, but I still want to know what the correct way would be.
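For reference, a test data builder along those lines might look like this (a sketch only; the entity and its field are made up, and the internal-constructor route assumes [InternalsVisibleTo] on the domain assembly):
public class ProductBuilder
{
    private string _code = "P-001";

    public ProductBuilder WithCode(string code)
    {
        _code = code;
        return this;
    }

    public Product Build()
    {
        // Option 2: call the internal constructor directly,
        // bypassing the domain service and its repositories.
        return new Product(_code);
    }
}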
Essentially you are asking what 'level' you should test at. Option 2 is very much a Unit Test, as it would test the code of a single class only. Option 1 is more of an Integration Test as it would test several components together.
I tend to prefer Option 2 for unit tests, for the following reasons:
Unit tests are simpler and more effective if they test a single class only. If you use the factory service to create the object under test, your test doesn't have direct control over how the object is constructed. This will lead to messy and tedious test code, such as mocking all the repository interfaces.
I will usually have, in a different part of my test code base, actual Integration Tests (or Acceptance Tests) which test the entire application from front to back via its public interfaces (with external dependencies such as databases mocked/stubbed out). I would expect these tests to cover Option 1 from your question, so I don't really need to repeat Option 1 in the unit test suite.
You may ask, what's the point of starting up my whole application just to test a couple of classes? The answer is quite simple - by sticking to only two levels of testing, your test code base will be clean, readable and easy to refactor. If your tests are very varied in terms of the 'level' that they test at (some test a single class, some a couple of classes together, some the whole application) then the test code just becomes hard to maintain.
Some caveats:
This advice applies if you are developing an "application" that will be deployed and run. If you are developing a "shared library" that will be distributed to other teams to use as they see fit, then you should test from all the public entry points to the library, regardless of the 'level'. (But I still wouldn't call these tests "unit tests" and would separate them in the code base.)
If you don't have the ability to write full integration tests, then I would use Option 1 and 2. Just be wary of the test code base becoming bloated.
One more point - test things together if they change for the same reason. The situation you don't want to end up in after choosing Option 1 is having to change your Entity tests every time you make a change to the factory/repository code. If the behavior of each Entity has not changed, then you shouldn't have to change the tests.
You could probably avoid that conundrum by not creating your entity through a domain service in the first place.
If you feel the need to validate something about an entity before creating it, you could probably see it as a domain invariant and have it enforced by an Aggregate. That aggregate root would expose a method to create the entity.
As soon as the invariant is guaranteed by the aggregate in charge of spawning the new entity, everything can be tested against concrete objects in memory, since the aggregate should have all the data it needs inside itself to check the invariant - there is no resorting to an external repository. You can set up the creator aggregate in an invariant-breaking or non-invariant-breaking state, all in memory, and exercise the test directly on the aggregate's CreateMyEntity method.
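A rough sketch of that idea (all names hypothetical): the aggregate holds the data needed to check the invariant, so the test touches no repository at all.
using System;
using System.Collections.Generic;

public class Order // aggregate root
{
    private readonly HashSet<string> _lineCodes = new HashSet<string>();

    public OrderLine AddLine(string code)
    {
        // The uniqueness invariant is checked against in-memory state.
        if (!_lineCodes.Add(code))
            throw new InvalidOperationException("Duplicate line code: " + code);
        return new OrderLine(code); // the entity is created by its aggregate
    }
}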
Don't Create Aggregate Roots by Udi Dahan is a good read on that approach - the basic idea is that entities and aggregate roots aren't just born out of nowhere.

Retrofit unit tests to large solution, IOC, Moq

I am in the process of retrofitting unit tests for an ASP.NET solution written in VB.NET and C#.
The unit tests need to verify the current functionality and act as a check for future breaking changes.
The solution comprises:
1 MVC web project
written in vb.net (don't ask, it's a legacy thing)
10 other supporting projects each containing logically grouped functionality
written in C#, each project contains repositories and DAL
All the classes are tightly coupled, as there is no inversion of control (IoC) implemented anywhere, yet.
Currently, to test a controller, there is the following stack:
controller
repository
dal
logging
First question: to unit test this correctly, would I set up 1 test project and run all tests from it, or should I set up 1 test project for each project to test the functionality of that DLL only?
Second question: do I need to implement IoC to be able to use Moq?
Third question: is it even possible to refactor IoC into a huge solution like this?
Fourth question: what other options are available to get this done ASAP?
I am in the process of retrofitting unit tests for an ASP.NET solution written in VB.NET and C#. The unit tests need to verify the current functionality and act as a check for future breaking changes.
When working with a large code base that doesn't have unit tests and hasn't been written with testing in mind, there is a good chance that in order to write a useful set of unit tests you will have to modify the code, so you'll be making exactly the kind of change that you're planning to write the unit tests to guard against. This is obviously risky, but it may not be any riskier than what you're already doing on a day-to-day basis.
There are a number of approaches that you could take (and there's a good chance that this question will be closed as too broad). One approach is to create a good set of integration tests to ensure that the core functionality is working. These tests won't be as fast to run as unit tests, but they will be further decoupled from the legacy code base. This will give you a good safety net for any changes that you need to make as part of introducing unit testing.
If you have an appropriate version of Visual Studio, then you may also be able to use shims (or, if you have funds, Typemock may be an option) to isolate elements of your application when writing your initial tests. So you could, for example, create shims for your DAL to isolate the rest of your code from the DB.
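For illustration, the documented shim pattern has this shape (it assumes a Visual Studio edition that includes Microsoft Fakes plus a generated Fakes assembly; DateTime is the standard example, and shimming your own DAL types works analogously):
using (ShimsContext.Create())
{
    // Every call to DateTime.Now inside this context returns a fixed date.
    System.Fakes.ShimDateTime.NowGet = () => new DateTime(2000, 1, 1);
    // ...exercise code that depends on DateTime.Now...
}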
First question: to unit test this correctly, would I set up 1 test project and run all tests from it, or should I set up 1 test project for each project to test the functionality of that DLL only?
Personally, I prefer to think of each assembly as a testable unit, so I tend to create at least one test project for each assembly containing production code. Whether or not that makes sense, though, depends a bit on what's contained in each of the assemblies... I'd also tend to have at least one test project for integration tests of the top-level project.
Second question: do I need to implement IoC to be able to use Moq?
The short answer is no, but it depends on what your classes do. If you want to test using Moq, then it's certainly easier to do so if your classes support dependency injection, although you don't need to use an IoC container to achieve this. Hand-rolled injection, either through constructors as below or through properties, can form a bridge that allows test stubs to be injected.
public class SomeClass {
    private readonly ISomeDependency _someDependency;

    // Hand-rolled constructor injection: a test can pass in a stub,
    // while production callers get the real implementation by default.
    public SomeClass(ISomeDependency someDependency = null) {
        _someDependency = someDependency ?? new SomeDependency();
    }
}
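With that constructor in place, a Moq-based test can inject a stub without any container. A minimal sketch (SomeClass/ISomeDependency are the illustrative names from the snippet above, and DoWork/DoSomething are made-up members):
[TestMethod]
public void DoWork_CallsTheDependency()
{
    var dependency = new Mock<ISomeDependency>();
    var sut = new SomeClass(dependency.Object);

    sut.DoWork();

    // Verifies the stubbed dependency was invoked exactly once.
    dependency.Verify(d => d.DoSomething(), Times.Once());
}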
Third question: is it even possible to refactor IoC into a huge solution like this?
Yes, it's possible. The bigger question is whether it's worth it. You appear to be suggesting a big-bang approach to the migration. If you have a team of developers who don't have much experience in this area, this seems awfully risky. A safer approach might be to target a specific area of the application and migrate that section. If your assemblies are discrete, then they should form fairly easy split points in your application. Learn what works and what doesn't, along with what benefits and unexpected pain you're feeling. Use that to inform your decision about how and when to migrate the rest of the code.
Fourth question: what other options are available to get this done ASAP?
As I've said above, I'm not sure that ASAP is really the right approach to take. Working towards unit testing can be done as a slow migration, adding tests as you actually change the code due to business requirements. This helps ensure that testers are also allocated to catch any errors that you introduce as part of the refactoring that might need to take place to support the testing.

Effective transition from unit testing to integration testing

I'm currently investigating how we should perform our testing in an upcoming project. In order to find bugs early in the development process, the developers will write unit tests before the actual code (TDD-ish). The unit tests will focus, as they should, on the unit (a method in this case) in isolation, so dependencies will be mocked, etc.
Now I would also like to test these units when they interact with other units, and I was thinking that there should be an effective best practice to do this, since the unit tests have already been written. My thought is that the unit tests would be reused but the mocked objects removed and replaced with real ones. The different ideas I have right now are:
Use a global flag in each test class that decides whether mock objects should be used or not. This approach will require several if statements.
Use a factory class that creates either an "instanceWithMocks" or an "instanceWithoutMocks". This approach might be troublesome for new developers to use and requires some extra classes.
Separate the integration tests from the unit tests in different classes. This will however require a lot of redundant code and maintaining the test cases will be twice the work
The way I see it, all of these approaches have pros and cons. Which of these would be preferred, and why? And is there a better way to transition effectively from unit testing to integration testing? Or is this usually done in some other way?
I would go for the third option:
Separate the integration tests from the unit tests in different
classes. This will however require a lot of redundant code and
maintaining the test cases will be twice the work
This is because unit tests and integration tests have different purposes. A unit test shows that an individual piece of functionality works in isolation. An integration test shows that different pieces of functionality still work when they interact with each other.
So for a unit test you want to mock things so that you are only testing the one piece of functionality.
For an integration test mock as little as possible.
I would have them in separate projects. What works well at my place is to have a unit test project using NUnit and Moq. This is written TDD-style as the code is written. The integration tests are SpecFlow/Selenium, and the feature files are written with the help of the product owner in the planning session so we can verify that we are delivering what the owner wants.
This does create extra work in the short term but leads to fewer bugs, easier maintenance, and delivery matching requirements.
I agree with most other answers that unit testing should be separate from integration testing (option 3).
But I do not agree with your counterarguments:
[...] This (separating unit from integration testing) will however
require a lot of redundant code and maintaining the test cases will be twice the work.
Generating objects with test data can be a lot of work, but it can be refactored into test-helper classes (aka ObjectMother) that can be used from both unit and integration testing, so there is no need for redundancy there.
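A hypothetical ObjectMother can be as simple as this; both the unit tests and the integration tests call it, so the setup code is written once:
public static class TestOrders
{
    public static Order OrderWithOneLine()
    {
        var order = new Order("ORD-1");
        order.AddLine("A-100");
        return order;
    }
}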
In unit tests you check different conditions of the class under test.
For integration testing it is not necessary to re-check every one of these special cases.
Instead you check that the components work together.
Example
You may have unit tests for 4 different situations where an exception is thrown.
For the integration it is not necessary to re-test all 4 conditions;
one exception-related integration test is enough to verify that the integrated system can handle exceptions.
An IoC container like Ninject/Autofac/StructureMap may be of use to you here. The unit tests can resolve the system under test through the container, and it is simply a matter of registration whether you have mocks or real objects registered. This is similar to your factory approach, but the IoC container is the factory. New developers would need a little training to understand it, but that's the case with any complex system. The disadvantage is that the registration scenarios can become fairly complicated, but it's hard to say for any given system whether they'd be too complicated without trying it out. I suspect this is the reason you haven't found any answers that seem definitive.
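To make that concrete, the registration switch might look roughly like this with Autofac (a sketch; the repository/service types and the useMocks flag are illustrative):
var builder = new ContainerBuilder();
if (useMocks)
    builder.RegisterInstance(new Mock<IOrderRepository>().Object).As<IOrderRepository>();
else
    builder.RegisterType<SqlOrderRepository>().As<IOrderRepository>();
builder.RegisterType<OrderService>();

using (var container = builder.Build())
{
    // The test resolves the system under test; only the registrations
    // decide whether it runs against mocks or real objects.
    var sut = container.Resolve<OrderService>();
}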
The integration tests should be different classes than your unit tests since you are testing a different behavior. The way that I think of integration tests is that they are the ones that you execute when trying to make sure that everything works together. They would be using inputs to portions of the application and making sure that the expected output is returned.
I think you are conflating the purposes of unit testing and integration testing.
Unit testing is for testing a single class - this is the low-level API.
Integration testing tests how classes cooperate - this is another, higher-level API.
Normally, you cannot reuse unit tests in integration testing because they represent different views of the system.
Using a Spring context may help with setting up the environment for integration testing.
I'm not sure reusing your unit tests with real objects instead of mocks is the right approach to implementing integration tests.
The purpose of a unit test is to verify the basic correctness of an object in isolation from the outside world. Mocks are there to ensure that isolation. If you substitute them with real implementations, you'll actually end up testing something completely different - the correctness of large portions of the same chain of objects - and you'll be testing it redundantly many times.
By making integration tests distinct from unit tests, you'll be able to choose the portions of your system you want to verify - generally, it's a good idea to test parts that involve configuration, I/O, interaction with third-party systems, the UI, or anything else that unit tests have a hard time covering.

Is it ok to change method visibility for the sake of unit testing?

Many times I find myself torn between making a method private to prevent someone from calling it in a context that doesn't make sense (or would screw up the internal state of the object involved), or making the method public (or, typically, internal) in order to expose it to the unit test assembly. I was just wondering what the Stack Overflow community thinks of this dilemma.
So I guess the question truly is: is it better to focus on testability or on maintaining proper encapsulation?
Lately I've been leaning towards testability, as most of the code is only going to be leveraged by a small group of developers, but I thought I would see what everyone else thinks.
It's NOT OK to change method visibility on methods that customers or users can see. Doing this is ugly, a hack; it exposes methods that any user could try to call and blow up your app... it's a liability you do not need.
You are using C#, yes? Check out the InternalsVisibleTo attribute.
You can declare your testable methods as internal and allow your unit testing assembly access to your internals.
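The attribute goes in the production assembly (typically AssemblyInfo.cs), naming your test assembly; the assembly name below is a placeholder:
using System.Runtime.CompilerServices;

// For strong-named assemblies, append ", PublicKey=..." to the name.
[assembly: InternalsVisibleTo("MyProduct.UnitTests")]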
It depends on whether the method is part of a public API or not. If a method is not part of a public API but is called publicly from other types within the same assembly, make it internal, friend your unit test assembly, and unit test it.
However, if the method is not/should not be part of a public API, and it is not called by other types internal to the assembly, DO NOT test it directly. It should be protected or private, and it should only be tested indirectly by unit testing your public API. If you write unit tests for non-public (or what should be non-public) members of your types, you are binding test code to internal implementation details.
That's a bad kind of coupling: it increases the number of unit tests you need and increases workload both in the short term (more unit tests) and in the long term (more test maintenance and modification in response to refactoring internal implementation details). Another problem with testing non-public members is that you test code that may not actually be needed or used. A GREAT way to find dead code is to see what is not covered by any of your unit tests when your public API is covered 100%. Removing dead code is a great way to keep your code base lean and mean, and that is impossible if you are not careful about what you put into your public API and what parts of your code you unit test.
EDIT:
As a quick additional note... with a properly designed public API, you can very effectively use a tool like Microsoft Pex to automatically generate full-coverage unit tests that exercise every execution path of your code. Combined with a few manually written tests that cover critical behavior, anything not covered can be considered dead code and removed, and you can greatly shortcut your unit testing process.
This is a common thought.
It's generally best to test the private methods by testing the public methods that call them (so you don't explicitly test the private methods). However, I understand that there are times when you really do want to test those private methods.
The answers to this question (Java) and this question (.NET) should be helpful.
To answer the question: no, you shouldn't change method visibility for the sake of testing. You generally shouldn't be testing private methods, and when you do, there are better ways to do it.
In general I agree with #jrista. But, as usual, it depends.
When trying to work with legacy code, the key is to get it under test. After that, you can add tests for new features and existing bugs, refactor to improve design, etc. This is risky without tests. Legacy code tends to be rife with dependencies, and is often extremely difficult to get under test.
In Working Effectively with Legacy Code, Michael Feathers suggests multiple techniques for getting code under test. Many of these techniques involve breaking encapsulation or complicating the design, and the author is up front about this. Once tests are in place, the code can be improved safely.
So for legacy code, do what you have to do.
In .NET you can use accessors for unit testing, rather than the InternalsVisibleTo attribute. Accessors give you access to any method in the class, even if it is private. They even let you test abstract classes using an empty mock derived object (see the "PrivateObject" class).
Basically in your test project you use the accessor class rather than the actual class with the methods you want to test. The accessor class is the same as the "real" class, except everything is public to your test project. Visual studio can generate accessors for you.
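Usage looks roughly like this (MSTest's PrivateObject; the class and method names here are made up):
var processor = new OrderProcessor();
var accessor = new PrivateObject(processor);

// Invokes the private method by name via reflection.
var discount = (decimal)accessor.Invoke("ComputeDiscount", 100m);
Assert.AreEqual(10m, discount);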
NEVER make a type more visible to facilitate unit testing.
IMO it is WRONG to say that you should not unit test private methods. Unit tests are of exceptional value for regression testing, and there is no reason why private methods should not be regression-tested with granular unit tests.

Writing standards for unit testing

I plan to introduce a set of standards for writing unit tests to my team. But what should they include?
These two posts (Unit test naming best practices and Best practices for file system dependencies in unit/integration tests) have given me some food for thought already.
Other areas my standards should cover are how test classes are set up and how to organize them. For example, if you have a class called OrderLineProcessor, there should be a test class called OrderLineProcessorTest. If there's a method called Process() on that class, then there should be a test called ProcessTest (maybe more, to test different states).
Any other things to include?
Does your company have standards for unit testing?
EDIT: I'm using Visual Studio Team System 2008 and I develop in C#.
Have a look at Michael Feathers on what a unit test is (or what makes unit tests bad unit tests).
Have a look at the idea of "Arrange, Act, Assert", i.e. the idea that a test does only three things, in a fixed order:
Arrange any input data and processing classes needed for the test
Perform the action under test
Test the results with one or more asserts. Yes, it can be more than one assert, so long as they all work to test the action that was performed. (A short example follows this list.)
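Using the OrderLineProcessor naming from the question, an AAA-style test might look like this (a sketch with made-up members):
[TestMethod]
public void ProcessTest_ValidLine_MarksLineProcessed()
{
    // Arrange
    var line = new OrderLine("A-100");
    var sut = new OrderLineProcessor();

    // Act
    sut.Process(line);

    // Assert
    Assert.IsTrue(line.IsProcessed);
}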
Have a look at Behaviour-Driven Development for a way to align test cases with requirements.
Also, my opinion of standard documents today is that you shouldn't write them unless you have to - there are lots of resources available already written. Link to them rather than rehashing their content. Provide a reading list for developers who want to know more.
You should probably take a look at the "Pragmatic Unit Testing" series. This is the C# version but there is another for Java.
With respect to your spec, I would not go overboard. You have a very good start there - the naming conventions are very important. We also require that the directory structure match the original project. Coverage also needs to extend to boundary cases and illegal values (checking for exceptions). This is obvious but your spec is the place to write it down for that argument that you'll inevitably have in the future with the guy who doesn't want to test for someone passing an illegal value. But don't make the spec more than a few pages or no one will use it for a task that is so context-dependent.
Update: I disagree with Mr. Potato Head about only one assert per unit test. It sounds quite fine in theory but, in practice, it leads either to loads of mostly redundant tests or to people doing tons of work in setup and tear-down that itself should be tested.
I follow the BDD style of TDD. See:
http://blog.daveastels.com/files/BDD_Intro.pdf
http://dannorth.net/introducing-bdd
http://behaviour-driven.org/Introduction
In short, this means that:
The tests are not thought of as "tests" but as specifications of the system's behaviour (hereafter called "specs"). The intention of the specs is not to verify that the system works under every circumstance. Their intention is to specify the behaviour and to drive the design of the system.
The spec method names are written as full English sentences. For example the specs for a ball could include "the ball is round" and "when the ball hits a floor then it bounces".
There is no forced 1:1 relation between the production classes and the spec classes (and generating a test method for every production method would be insane). Instead there is a 1:1 relation between the behaviour of the system and the specs.
Some time ago I wrote a TDD tutorial (where you begin writing a Tetris game using the provided tests) which shows this style of writing tests as specs. You can download it from http://www.orfjackal.net/tdd-tutorial/tdd-tutorial_2008-09-04.zip The instructions on how to do TDD/BDD are still missing from that tutorial, but the example code is ready, so you can see how the tests are organized and write code that passes them.
You will notice that in this tutorial the production classes are named such as Board, Block, Piece and Tetrominoe which are centered around the concepts of a Tetris game. But the test classes are centered around the behaviour of the Tetris game: FallingBlocksTest, RotatingPiecesOfBlocksTest, RotatingTetrominoesTest, FallingPiecesTest, MovingAFallingPieceTest, RotatingAFallingPieceTest etc.
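In C# with NUnit, spec-style names in that spirit might look like this (illustrative only):
[TestFixture]
public class FallingBlocksSpec
{
    [Test]
    public void a_falling_block_moves_down_one_row_per_tick() { /* ... */ }

    [Test]
    public void a_block_stops_when_it_hits_the_floor() { /* ... */ }
}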
Try to use as few assert statements per test method as possible. This makes sure that the purpose of the test is well-defined.
I know this will be controversial, but don't test the compiler - time spent testing Java Bean accessors and mutators is better spent writing other tests.
Try, where possible, to use TDD instead of writing your tests after your code.
I've found that most testing conventions can be enforced through the use of a standard base class for all your tests, forcing the tester to override methods so that they all have the same name.
I also advocate the Arrange-Act-Assert (AAA) style of testing, as you can then generate fairly useful documentation from your tests. It also forces you to consider what behaviour you are expecting, thanks to the naming style.
Another item you can put in your standards is to keep your unit tests small - that is, the actual test methods themselves. Unless you are doing a full integration test, there is usually no need for large unit tests, say more than 100 lines. I'll grant you that much in case you have a lot of setup to get to your one test; but if you do, you should probably refactor it.
People also talk about refactoring their code; make sure people realize that unit tests are code too. So refactor, refactor, refactor.
The biggest problem in the uses I have seen is that people do not tend to recognize that you want to keep your unit tests light and agile. You don't want a monolithic beast for your tests, after all. With that in mind, if you have a method you are trying to test, you should not test every possible path in one unit test. You should have multiple unit tests to account for every possible path through the method.
Yes, if you are doing your unit tests correctly, you should on average have more lines of unit test code than application code. While this sounds like a lot of work, it will save you a lot of time in the end when the inevitable business requirement change comes.
Users of full-featured IDEs will find that some of them have quite detailed support for creating tests in a specific pattern. Given this class:
public class MyService {
    public String method1() {
        return "";
    }
    public void method2() {
    }
    public void method3HasAlongName() {
    }
}
When I press Ctrl-Shift-T in IntelliJ IDEA, I get this test class after answering one dialog box:
public class MyServiceTest {
    @Test
    public void testMethod1() {
        // Add your code here
    }

    @Test
    public void testMethod2() {
        // Add your code here
    }

    @Test
    public void testMethod3HasAlongName() {
        // Add your code here
    }
}
So you may want to take a close look at tool support before writing your standards.
I use nearly plain English for my unit test function names. It helps to define exactly what they do:
TEST( TestThatVariableFooDoesNotOverflowWhenCalledRecursively )
{
    /* do test */
}
I use C++ but the naming convention can be used anywhere.
Make sure to include what is not a unit test. See: What not to test when it comes to Unit Testing?
Include a guideline so integration tests are clearly identified and can be run separately from unit tests. This is important because you can end up with a set of "unit" tests that are really slow if the unit tests are mixed with other types of tests.
Check this for more info on it: How can I improve my JUnit tests... especially the second update.
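One simple way to make that identification explicit is a category attribute that the test runner can filter on (NUnit shown; MSTest has [TestCategory] for the same purpose):
[Test, Category("Integration")]
public void OrderRepository_RoundTripsAnOrderThroughTheDatabase()
{
    // Talks to a real database, so it is excluded from the fast unit-test run.
}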
If you are using tools from the JUnit family (OCUnit, SHUnit, ...), the names of tests already follow some rules.
For my tests, I use custom Doxygen tags in order to gather their documentation on a specific page.
