Can ReSharper jump to the file that contains the unit tests? - C#

Is it possible to somehow link or use some convention so I can jump between my unit tests for a given class?
Also, creating shortcuts for jumping between the interface and the implementations?
(keyboard shortcuts)
Example:
IUserService
UserService
UserServiceTests
It would be great if I could somehow link these together so I can jump to any of these files while currently in any one of them.

I just implemented that feature in TestLinker, which is a ReSharper 2016.1 extension. It is available to install from the ReSharper Gallery.

Is it possible to somehow link or use some convention so I can jump between my unit tests for a given class?
To jump between unit tests for a given class, launch ReSharper's Find Usages on the class name, and as soon as you have results in the Find Results tool window, group them in a way that helps you focus on usages in a particular part of your code base - for example, by project and type. This will let you detect usages in your test project. From there, you can quickly jump from Find Results to the actual usages in code. As an alternative, you can use ReSharper's Go to Usages of Symbol, which works in a similar way but displays search results in a pop-up menu instead of sending them to Find Results.
If your test classes contain metadata showing which business logic they're covering, this helps even more in differentiating the usages you need. For example, if you're using MSpec, test classes are marked with the Subject attribute: [Subject(typeof(MyCoveredClass))]. This is handy because usages within this attribute are very visible, and navigating to them leads you directly to the declarations of your test classes.
With NUnit and MSTest, this is a bit more complicated as their attributes take strings as parameters, like this: [TestProperty("TestKind", "MyCoveredClass")]. In order to find such a usage of MyCoveredClass, you'll have to use ReSharper's Find Usages Advanced and turn on the Textual occurrences option.
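For illustration, here is a rough sketch of the two attribute styles just described; the class and property names are made up, not taken from the answer:

using Machine.Specifications;                       // MSpec
using Microsoft.VisualStudio.TestTools.UnitTesting; // MSTest

// MSpec: the covered class appears as a typeof() usage, so Find Usages on
// MyCoveredClass lands directly on the declaration of the test class.
[Subject(typeof(MyCoveredClass))]
public class When_working_with_my_covered_class
{
    // ... Establish / Because / It fields go here
}

// MSTest: the covered class is only named inside a string, so you need
// Find Usages Advanced with the "Textual occurrences" option to find it.
[TestClass]
public class MyCoveredClassTests
{
    [TestMethod]
    [TestProperty("TestKind", "MyCoveredClass")]
    public void DoesSomething()
    {
    }
}

public class MyCoveredClass { }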
Also, creating shortcuts for jumping between the interface and the implementations?
As for jumping within an inheritance chain, ReSharper provides multiple options to do that, including Type Hierarchy (ReSharper > Inspect > Type Hierarchy) and Go to Implementation (ReSharper > Navigate > Go to Implementation).

As has already been mentioned, you can do this using the TestCop ReSharper plugin (gallery link).
It ties the class under test to the test fixture by using regular expressions to identify class names and namespaces. You can customise these to fit your needs, but I found it took a fair amount of trial and error to get this right on existing code.
Once it's all set up, you can go back and forth with a keyboard shortcut. It can also do things like create the test fixture or the class under test for you.

ReSharper doesn't have a specific Go to Test/Code feature other than navigating through the list of usages.
However, TestDriven.NET has this feature, which uses naming conventions to find the test/code peer so that you can flip back and forth.
Also, creating shortcuts for jumping between the interface and the implementations?
ReSharper has this feature. Using the Visual Studio scheme:
Alt + Home navigates to the class's base; if there is more than one, a context menu will list them.
Alt + End navigates down the inheritance hierarchy and behaves like Alt + Home.
Ctrl + U and Ctrl + Alt + B, respectively, are the equivalents in the ReSharper 2.x / IDEA scheme.

I don't think this is going to be possible with just ReSharper. As far as ReSharper is concerned, your unit test is just another usage of UserService.
Also, all of the different unit testing frameworks specify things differently, so it would be tough to know. For instance, doing BDD would yield test class names almost entirely unrelated to the class(es) being tested.
You might be able to write an extension to do this, maybe using attributes or something? Not sure.

You can use the ReSharper extension TestCop.
This plugin is designed for use with MSTest and NUnit, but should work with any other unit test framework that requires you to mark tests with an attribute.

With ReSharper and NUnit, to jump from test to subject, you can use the TestOf property of the TestFixture attribute. Just Ctrl + Click on MyClass in the test file:
[TestFixture(TestOf = typeof(MyClass))]
public class MyClassTest
{
}
To jump from subject to test, use ReSharper's Find Usages command.

Related

C# class used only in referencing projects inspection?

Take a look at the following solution:
MySolution.sln
MyApp.csproj
MyClassLib.csproj
MyClass.cs
The MyClassLib project is referenced by the MyApp project and contains MyClass.
MyClass is used only in MyApp, so it can be moved there.
Is there a way to determine such cases with some tool? Maybe Roslyn or ReSharper inspections?
For a complex solution with a long history and many projects, this would be a much-needed feature.
No, there is no such tool for this.
Why? Easy: What if, sometime in the future, you create a MyApp2 and that also needs MyClass? Then it would be better if MyClass is not in the MyApp assembly.
Now you, as the human developing this, might know that there will never (although never say never) be a MyApp2 but a tool cannot possibly know this.
I have limited experience with ReSharper, but in my experience, ReSharper cannot automatically detect these cases where a file can be moved, although it can visualize these hierarchies.
Going back to your earlier example, the hierarchy tool would show that your MyClass.cs file is only used by a file in MyApp.csproj. (It would not explicitly say this, but you would be able to tell based on the hierarchy.)
You can either use CodeLens in Visual Studio to check where the class is used,
or right-click on the class (or press Shift+F12) to "Find all references" and check where it is used. This gives you a quick overview, given that you know your project structure, of whether a class needs to be moved somewhere else.
Or use
code analysis tools or other code tools to check for redundancy, etc.
You cannot determine these cases automatically unless you fiddle with such tools; it is an edge case where you yourself know whether a class should live in a particular place or not, and no AI can replace that, unless you write your own custom code analysis tool that does this particular task.
Edit: Since the author seems so driven and determined to dig into this problem, I suggest taking a shot at T4 code generation, DSLs, or CodeDOM to check whether you can actually generate or analyze the code you want.
Or create custom code analysis rulesets, or check whether the ones already present suit you.
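If you do decide to experiment with Roslyn (which the question mentions), a minimal sketch of such a custom check could look like the one below. It assumes the Microsoft.CodeAnalysis.Workspaces.MSBuild and Microsoft.Build.Locator NuGet packages; the solution path and project name are placeholders:

using System;
using System.Collections.Generic;
using System.Linq;
using System.Threading.Tasks;
using Microsoft.Build.Locator;
using Microsoft.CodeAnalysis;
using Microsoft.CodeAnalysis.FindSymbols;
using Microsoft.CodeAnalysis.MSBuild;

class SingleProjectUsageFinder
{
    static async Task Main()
    {
        MSBuildLocator.RegisterDefaults();
        using var workspace = MSBuildWorkspace.Create();
        var solution = await workspace.OpenSolutionAsync(@"C:\src\MySolution.sln"); // placeholder path
        var lib = solution.Projects.Single(p => p.Name == "MyClassLib");            // placeholder name
        var compilation = await lib.GetCompilationAsync();

        foreach (var type in AllTypes(compilation.Assembly.GlobalNamespace)
                             .Where(t => t.DeclaredAccessibility == Accessibility.Public))
        {
            // Collect the other projects whose documents reference this type.
            var references = await SymbolFinder.FindReferencesAsync(type, solution);
            var referencingProjects = references
                .SelectMany(r => r.Locations)
                .Select(location => location.Document.Project.Name)
                .Where(name => name != lib.Name)
                .Distinct()
                .ToList();

            // Referenced by exactly one other project: a candidate for moving there.
            if (referencingProjects.Count == 1)
                Console.WriteLine($"{type.ToDisplayString()} is only used by {referencingProjects[0]}");
        }
    }

    static IEnumerable<INamedTypeSymbol> AllTypes(INamespaceSymbol ns)
    {
        foreach (var type in ns.GetTypeMembers())
            yield return type;
        foreach (var child in ns.GetNamespaceMembers())
            foreach (var nested in AllTypes(child))
                yield return nested;
    }
}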
@MindSwipe is right. However, if you really need to do this, then here's a hack:
Ensure your solution is under version control; this can help later.
Select the MyClassLib project and run a find and replace in all files of the current project, replacing public class with internal class.
Build your solution to get a bunch of errors.
Open the Error List pane and sort it by Description.
You should see error messages such as:
The type or namespace name 'MyClass' could not be found (are you missing a using directive or an assembly reference?).
If you see exactly 1 message per class then it means that class can be moved from the library project to the project that yielded this error. Otherwise it means it is shared by at least 2 projects; in this case you have to make it public again (undo the change made by the global replace for this class).

Override a step declaration in SpecFlow?

So at my job we have a core SpecFlow library that our different teams can use for their automation. This library has some declared steps.
For example, the library might have something like this:
When I click the button
However, let's say I want to define my own step declaration that uses that exact same wording. Is it possible to override it?
As @Grasshopper wrote, the step definitions are global.
But you could use scoped bindings to override it.
See http://www.specflow.org/documentation/Scoped-Bindings/
In this case, do not forget to specify the tag on every scenario, or the original step definition will be called.
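A rough sketch of what such a scoped override could look like (the tag and class names are made up, not from the documentation linked above):

using TechTalk.SpecFlow;

[Binding]
[Scope(Tag = "customButtonHandling")] // only applies to scenarios tagged @customButtonHandling
public class MyTeamButtonSteps
{
    [When("I click the button")]
    public void WhenIClickTheButton()
    {
        // Team-specific implementation. Scenarios without the tag still fall
        // back to the core library's step definition with the same wording.
    }
}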
It would be a very bad idea to do this, as any scenario that uses this step and fails will be much harder to understand and debug.
In general, using generic library steps in scenarios is also not such a good idea. Scenarios should not contain generic steps or descriptions of HOW things are done. Instead they should contain steps specific to your business context, and these should describe WHAT is being done and WHY it is being done.
So instead of
When I click on sign in
And I fill in my email with ...
...
we get the much simpler and more abstract
When I sign in
which is all about WHAT we are doing, and nothing about HOW we are doing it.
You will get a DuplicateStepException if you have the same step (in your case, When I click the button) defined twice, whether in the same step definition file or in another one, even if you use a Given or Then annotation instead. This is because the step definitions are loaded globally, which results in a conflict.
Also, you cannot extend a file containing step definitions or hooks, as Cucumber will throw an error saying this is not acceptable. Thus there is no way to override behaviour by inheritance.
You will need to write a different step altogether or, if possible, pass the button as a parameter to the existing step and put in the logic, if you are allowed to modify the library code.
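If you are allowed to modify the library, the parameterised variant suggested above could look roughly like this (the regex and method name are illustrative):

using TechTalk.SpecFlow;

[Binding]
public class ButtonSteps
{
    // One step now handles "I click the save button", "I click the cancel button", etc.
    [When(@"I click the (.*) button")]
    public void WhenIClickTheNamedButton(string buttonName)
    {
        // Look up and click the named button here instead of hard-coding a single one.
    }
}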

How to automate property/method headers when implementing an interface to satisfy StyleCop

We use StyleCop to enforce documentation of our code.
StyleCop (out of the box) requires properties and methods to be documented. Theoretically, interfaces and their concretions can have different headers, but in practice they're usually identical.
However, when an interface is implemented in the concretion, the header isn't copied over, meaning that it has to be done manually. Is there a better way to automate this rather than having to copy over each one?
Obviously we could simply copy the interface code en masse, but you lose a lot of the stub code, so it isn't really a perfect solution.
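To make the situation concrete, a small sketch (the names are illustrative, not from the question):

/// <summary>Provides read access to customers.</summary>
public interface ICustomerService
{
    /// <summary>Gets the display name of the customer with the given id.</summary>
    string GetCustomerName(int id);
}

/// <summary>Default implementation of <see cref="ICustomerService"/>.</summary>
public class CustomerService : ICustomerService
{
    // StyleCop's documentation rule expects a header here as well, but implementing
    // the interface does not copy it over, so it currently has to be pasted by hand.
    public string GetCustomerName(int id)
    {
        return string.Empty;
    }
}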
You can use GhostDoc, an add-in for Visual Studio. After installing it, just right-click on the properties, classes or methods and click "Document This".
If you have long properties or methods, you can use ReSharper to auto-implement the interface and copy the interface documentation.
I found that Atomineer Pro Documentation does this nicely. I believe the statement in the overview summarizes what you have asked.
Intelligent automatic duplication of existing documentation for overrides of interface and base class methods, throughout groups of overloaded methods, and across related parameters within a class to maximise documentation consistency with minimal effort.
There is a free trial if you want to take it for a test run and see if it meets your needs.

How to organize unit tests and do not make refactoring a nightmare?

My current way of organizing unit tests boils down to the following:
Each project has its own dedicated project with unit tests. For a project BusinessLayer, there is a BusinessLayer.UnitTests test project.
For each class I want to test, there is a separate test class in the test project placed within exactly the same folder structure and in exactly the same namespace as the class under test. For a class CustomerRepository from a namespace BusinessLayer.Repositories, there is a test class CustomerRepositoryTests in a namespace BusinessLayerUnitTests.Repositories.
Methods within each test class follow simple naming convention MethodName_Condition_ExpectedOutcome. So the class CustomerRepositoryTests that contains tests for a class CustomerRepository with a Get method defined looks like the following:
[TestFixture]
public class CustomerRepositoryTests
{
    [Test]
    public void Get_WhenX_ThenRecordIsReturned()
    {
        // ...
    }

    [Test]
    public void Get_WhenY_ThenExceptionIsThrown()
    {
        // ...
    }
}
This approach has served me quite well, because it makes locating tests for a piece of code really simple. On the other hand, it makes code refactoring more difficult than it should be:
When I decide to split one project into multiple smaller ones, I also need to split my test project.
When I want to change the namespace of a class, I have to remember to change the namespace (and folder structure) of the test class as well.
When I change the name of a method, I have to go through all the tests and change the name there as well. Sure, I can use Search & Replace, but that is not very reliable; in the end, I still need to check the changes manually.
Is there some clever way of organizing unit tests that would still allow me to locate tests for a specific code quickly and at the same time lend itself more towards refactoring?
Alternatively, is there some, uh, perhaps Visual Studio extension, that would allow me to somehow say "hey, these tests are for that method, so when the name of the method changes, please be so kind and change the tests as well"? To be honest, I am seriously considering writing something like that myself :)
After working a lot with tests, I've come to realize that (at least for me) having all those restrictions brings a lot of problems in the long run, rather than benefits. So instead of using names and conventions to determine that, we've started using code. Each project and each class can have any number of test projects and test classes. All the test code is organized based on what is being tested from a functionality perspective (or which requirement it implements, or which bug it reproduced, etc.). Then, to find the tests for a piece of code, we do this:
[TestFixture]
public class MyFunctionalityTests
{
    public IEnumerable<Type> TestedClasses()
    {
        // We can find the tests for a class because the tested classes are referenced in this special method.
        return new[] { typeof(SomeTestedType), typeof(OtherTestedType) };
    }

    [Test]
    public void TestRequirement23423432()
    {
        // ... test code.
        // We do something similar for methods if we want to track which methods are being tested (we usually don't).
        this.TestingMethod(someObject.methodBeingTested);
        // ...
    }
}
We can use tools like ReSharper's "usages" to find the test cases, etc. And when that's not enough, we do some magic with reflection and LINQ by loading all the test classes and running something like allTestClasses.Where(testClass => testClass.TestedClasses().Contains(someTestedType));
You can also use the TearDown to gather information about which methods are tested by each method/class and do the same.
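A rough sketch of that reflection-and-LINQ lookup, based on the TestedClasses() convention shown above (the helper name is made up):

using System;
using System.Collections.Generic;
using System.Linq;
using System.Reflection;

static class TestFinder
{
    // Returns every test class in the assembly whose TestedClasses() method declares the given type.
    public static IEnumerable<Type> FixturesCovering(Assembly testAssembly, Type testedType)
    {
        return from candidate in testAssembly.GetTypes()
               let testedClasses = candidate.GetMethod("TestedClasses")
               where testedClasses != null && candidate.GetConstructor(Type.EmptyTypes) != null
               let covered = (IEnumerable<Type>)testedClasses.Invoke(Activator.CreateInstance(candidate), null)
               where covered.Contains(testedType)
               select candidate;
    }
}

// Usage, e.g. from a small console tool or a TearDown:
// var fixtures = TestFinder.FixturesCovering(typeof(MyFunctionalityTests).Assembly, typeof(SomeTestedType));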
One way to keep class and test locations in sync when moving the code:
Move the code to a uniquely named temporary namespace
Search for references to that namespace in your tests to identify the tests that need to be moved
Move the tests to the proper new location
Once all references to the temporary namespace from tests are in the right place, then move the original code to its intended target
One strength of end-to-end or behavioral tests is that the tests are grouped by requirement rather than by code, so you avoid the problem of keeping test locations in sync with the corresponding code.
Regarding VS extensions that associate code to tests, take a look at Visual Studio's Test Impact. It runs the tests under a profiler and creates a compact database that maps IL sequence points to unit tests. So in other words, when you change the code Visual Studio knows which tests need to be run.
One unit test project per project is the way to go. We tried a single mega unit test project, but this increased the compile time.
To help you refactor, use a product like ReSharper or CodeRush.
Is there some clever way of organizing unit tests that would still allow me to locate tests for a specific code quickly
ReSharper has some good shortcuts that allow you to search for a file or code.
As you said, for the class CustomerRepository there is a test class CustomerRepositoryTests.
The R# shortcut shows an input box for what you want to find; in your case you can just type CRT and it will show you all the files whose names contain a capital C, then R, then T.
It also allows you to search by wildcards; for example, CR* will show you the files CustomerRepository and CustomerRepositoryTests.

Writing standards for unit testing

I plan to introduce a set of standards for writing unit tests into my team. But what to include?
These two posts (Unit test naming best practices and Best practices for file system dependencies in unit/integration tests) have given me some food for thought already.
Other areas that should be covered in my standards are how test classes are set up and how to organize them. For example, if you have a class called OrderLineProcessor, there should be a test class called OrderLineProcessorTest. If there's a method called Process() on that class, then there should be a test called ProcessTest (maybe more, to test different states).
Any other things to include?
Does your company have standards for unit testing?
EDIT: I'm using Visual Studio Team System 2008 and I develop in C#.Net
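As a small sketch of the convention described in the question (MSTest attributes shown, since Team System 2008 ships with MSTest; the names are only illustrative):

using Microsoft.VisualStudio.TestTools.UnitTesting;

[TestClass]
public class OrderLineProcessorTest
{
    [TestMethod]
    public void ProcessTest()
    {
        // Exercise OrderLineProcessor.Process() here; additional tests
        // (e.g. ProcessTest_EmptyOrder, ProcessTest_InvalidLine) cover other states.
    }
}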
Have a look at Michael Feathers on what a unit test is (or what makes unit tests bad unit tests).
Have a look at the idea of "Arrange, Act, Assert", i.e. that a test does only three things, in a fixed order:
Arrange any input data and processing classes needed for the test
Perform the action under test
Test the results with one or more asserts. Yes, it can be more than one assert, so long as they all work to test the action that was performed.
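A minimal sketch of a test laid out in that order (NUnit shown; the calculator class is made up for the example):

using NUnit.Framework;

public class Calculator
{
    public int Add(int a, int b) { return a + b; }
}

[TestFixture]
public class CalculatorTests
{
    [Test]
    public void Add_TwoNumbers_ReturnsTheirSum()
    {
        // Arrange: set up the input data and the object under test.
        var calculator = new Calculator();

        // Act: perform the single action under test.
        var result = calculator.Add(2, 3);

        // Assert: verify the outcome of that action.
        Assert.AreEqual(5, result);
    }
}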
Have a look at Behaviour Driven Development for a way to align test cases with requirements.
Also, my opinion of standard documents today is that you shouldn't write them unless you have to - there are lots of resources available already written. Link to them rather than rehashing their content. Provide a reading list for developers who want to know more.
You should probably take a look at the "Pragmatic Unit Testing" series. This is the C# version but there is another for Java.
With respect to your spec, I would not go overboard. You have a very good start there - the naming conventions are very important. We also require that the directory structure match the original project. Coverage also needs to extend to boundary cases and illegal values (checking for exceptions). This is obvious but your spec is the place to write it down for that argument that you'll inevitably have in the future with the guy who doesn't want to test for someone passing an illegal value. But don't make the spec more than a few pages or no one will use it for a task that is so context-dependent.
Update: I disagree with Mr. Potato Head about only one assert per Unit Test. It sounds quite fine in theory but, in practice, it leads to either loads of mostly redundant tests or people doing tons of work in setup and tear-down that itself should be tested.
I follow the BDD style of TDD. See:
http://blog.daveastels.com/files/BDD_Intro.pdf
http://dannorth.net/introducing-bdd
http://behaviour-driven.org/Introduction
In short this means that
The tests are not thought of as "tests", but as specifications of the system's behaviour (hereafter called "specs"). The intention of the specs is not to verify that the system works under every circumstance. Their intention is to specify the behaviour and to drive the design of the system.
The spec method names are written as full English sentences. For example the specs for a ball could include "the ball is round" and "when the ball hits a floor then it bounces".
There is no forced 1:1 relation between the production classes and the spec classes (and generating a test method for every production method would be insane). Instead there is a 1:1 relation between the behaviour of the system and the specs.
Some time ago I wrote a TDD tutorial (where you begin writing a Tetris game using the provided tests) which shows this style of writing tests as specs. You can download it from http://www.orfjackal.net/tdd-tutorial/tdd-tutorial_2008-09-04.zip The instructions on how to do TDD/BDD are still missing from that tutorial, but the example code is ready, so you can see how the tests are organized and write code that passes them.
You will notice that in this tutorial the production classes have names such as Board, Block, Piece and Tetrominoe, which are centered around the concepts of a Tetris game. But the test classes are centered around the behaviour of the Tetris game: FallingBlocksTest, RotatingPiecesOfBlocksTest, RotatingTetrominoesTest, FallingPiecesTest, MovingAFallingPieceTest, RotatingAFallingPieceTest etc.
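A small sketch of what such spec-style names look like in C#/NUnit, using the ball examples above (the class name is made up):

using NUnit.Framework;

[TestFixture]
public class BallSpec
{
    [Test]
    public void The_ball_is_round()
    {
        // specify and verify the behaviour, not the implementation details
    }

    [Test]
    public void When_the_ball_hits_a_floor_then_it_bounces()
    {
        // ...
    }
}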
Try to use as few assert statements per test method as possible. This makes sure that the purpose of the test is well-defined.
I know this will be controversial, but don't test the compiler - time spent testing Java Bean accessors and mutators is better spent writing other tests.
Try, where possible, to use TDD instead of writing your tests after your code.
I've found that most testing conventions can be enforced through the use of a standard base class for all your tests, forcing the tester to override methods so that they all have the same name.
I also advocate the Arrange-Act-Assert (AAA) style of testing as you can then generate fairly useful documentation from your tests. It also forces you to consider what behaviour you are expecting due to the naming style.
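One possible shape for such a base class, sketched with NUnit (the method names and one-fixture-per-test-case structure are assumptions, not a standard):

using NUnit.Framework;

public abstract class ArrangeActAssertTest
{
    // Every fixture derived from this class overrides the same three methods,
    // which keeps naming uniform and makes the Arrange-Act-Assert structure explicit.
    protected abstract void Arrange();
    protected abstract void Act();
    protected abstract void AssertOutcome();

    [Test]
    public void Execute()
    {
        Arrange();
        Act();
        AssertOutcome();
    }
}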
Another item you can put in your standards is to try to keep your unit tests small - that is, the actual test methods themselves. Unless you are doing a full integration test, there usually is no need for large unit tests, say more than 100 lines. I'll give you that much in case you have a lot of setup to get to your one test; however, if you do, you should maybe refactor it.
People also talk about refactoring their code; make sure people realize that unit tests are code too. So refactor, refactor, refactor.
The biggest problem I find in the cases I have seen is that people do not tend to recognize that you want to keep your unit tests light and agile; you don't want a monolithic beast for your tests, after all. With that in mind, if you have a method you are trying to test, you should not test every possible path in one unit test. You should have multiple unit tests to account for every possible path through the method.
Yes, if you are doing your unit tests correctly, you should on average have more lines of unit test code than application code. While this sounds like a lot of work, it will save you a lot of time in the end when the time comes for the inevitable business requirement change.
Users of full-featured IDEs will find that "some of them" have quite detailed support for creating tests in a specific pattern. Given this class:
public class MyService {
    public String method1() {
        return "";
    }
    public void method2() {
    }
    public void method3HasAlongName() {
    }
}
When I press Ctrl-Shift-T in IntelliJ IDEA, I get this test class after answering one dialog box:
public class MyServiceTest {
    @Test
    public void testMethod1() {
        // Add your code here
    }

    @Test
    public void testMethod2() {
        // Add your code here
    }

    @Test
    public void testMethod3HasAlongName() {
        // Add your code here
    }
}
So you may want to take a close look at tool support before writing your standards.
I use nearly plain English for my unit test function names. It helps to define exactly what they do:
TEST( TestThatVariableFooDoesNotOverflowWhenCalledRecursively )
{
/* do test */
}
I use C++ but the naming convention can be used anywhere.
Make sure to include what is not a unit test. See: What not to test when it comes to Unit Testing?
Include a guideline so integration tests are clearly identified and can be run separately from unit tests. This is important, because you can end up with a set of "unit" tests that are really slow if the unit tests are mixed with other types of tests.
Check this for more info on it: How can I improve my junit tests ... especially the second update.
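For example, in a C#/NUnit code base the guideline could be as simple as tagging integration tests with a category so they can be filtered out of the fast run (the category name and console filter are assumptions):

using NUnit.Framework;

[TestFixture]
[Category("Integration")]
public class OrderRepositoryIntegrationTests
{
    [Test]
    public void Save_WritesRowToDatabase()
    {
        // Talks to a real database, so it is excluded from the default unit-test run,
        // e.g. nunit3-console Tests.dll --where "cat != Integration".
    }
}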
If you are using tools from the JUnit family (OCUnit, shUnit, ...), the names of tests already follow some rules.
For my tests, I use custom Doxygen tags in order to gather their documentation on a specific page.
