We've just started using SpecFlow to do testing on a large codebase.
Since we're always adding features, we decided to write tests for the new code and add tests for the old code when it's time to refactor it.
One of our new features involves copying cars from one user's database to another's. There's not much UI for this, so we have no way to check that everything went OK other than hitting the database.
Is there an alternative to hitting the database when writing tests like this?
If the database access is isolated well enough from the business logic, i.e. hidden behind some sort of repository interface, then you can mock the real repository in your tests and validate that the mocked repository is called as would be appropriate for such a copy operation.
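For example, here is a minimal sketch of that idea using Moq and NUnit; the ICarRepository, Car and CarCopyService names are made up for illustration and would need to match your actual code:

using System.Collections.Generic;
using Moq;
using NUnit.Framework;

// Hypothetical repository abstraction behind which the database access hides.
public interface ICarRepository
{
    IList<Car> GetCarsForUser(int userId);
    void AddCarForUser(int userId, Car car);
}

public class Car
{
    public string Model { get; set; }
}

// Hypothetical business logic that performs the copy.
public class CarCopyService
{
    private readonly ICarRepository repository;

    public CarCopyService(ICarRepository repository)
    {
        this.repository = repository;
    }

    public void CopyCars(int fromUserId, int toUserId)
    {
        foreach (var car in repository.GetCarsForUser(fromUserId))
            repository.AddCarForUser(toUserId, car);
    }
}

[TestFixture]
public class CarCopyServiceTests
{
    [Test]
    public void CopyCars_AddsEachSourceCarToTargetUser()
    {
        var cars = new List<Car> { new Car { Model = "A" }, new Car { Model = "B" } };
        var repo = new Mock<ICarRepository>();
        repo.Setup(r => r.GetCarsForUser(1)).Returns(cars);

        new CarCopyService(repo.Object).CopyCars(1, 2);

        // No database involved: the test only checks that the copy made the expected calls.
        repo.Verify(r => r.AddCarForUser(2, It.IsAny<Car>()), Times.Exactly(2));
    }
}

The step definitions behind your SpecFlow scenarios can assert against the mock in the same way instead of querying the database.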
I know my question looks like I'm searching for a shortcut, but I have actually searched a lot and couldn't find a resource on unit testing specifically for a business layer that involves data updates in a database.
I did find resources on writing unit test cases in NUnit, but they all tested static data. I don't need beginner material; I already know how to write unit tests in NUnit. My main question is: how do we test methods that insert/update/delete data in a database? Do we actually insert data while testing? Does mocking play a role in it?
Please point me to any resource that covers this specific case.
Thanks!
If your business layer is written in C#, as your tag suggests, then mocking the database is the best practice. You use a mocking framework that allows you to control the data layer and write assertions that your business layer does the correct thing under controlled (mocked) responses. If you use this approach, your tests will be isolated and will execute very fast without any connection to the database (you should be able to execute thousands of unit tests within a second).
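As a minimal sketch of that approach (the IPersonRepository and PersonService names are made up for illustration), note how the mocked response controls which path the business logic takes:

using Moq;
using NUnit.Framework;

// Hypothetical data layer interface and entity.
public interface IPersonRepository
{
    Person GetById(int id);
    void Update(Person person);
}

public class Person
{
    public int Id { get; set; }
    public string Email { get; set; }
}

// Hypothetical business logic that updates data through the repository.
public class PersonService
{
    private readonly IPersonRepository repository;

    public PersonService(IPersonRepository repository)
    {
        this.repository = repository;
    }

    public bool ChangeEmail(int id, string newEmail)
    {
        var person = repository.GetById(id);
        if (person == null)
            return false;               // nothing to update
        person.Email = newEmail;
        repository.Update(person);
        return true;
    }
}

[TestFixture]
public class PersonServiceTests
{
    [Test]
    public void ChangeEmail_UnknownPerson_DoesNotCallUpdate()
    {
        var repo = new Mock<IPersonRepository>();
        repo.Setup(r => r.GetById(42)).Returns((Person)null);   // controlled (mocked) response

        var result = new PersonService(repo.Object).ChangeEmail(42, "new@example.com");

        Assert.IsFalse(result);
        repo.Verify(r => r.Update(It.IsAny<Person>()), Times.Never());
    }
}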
If your database itself contains a lot of business logic, you can unit test it in isolation, but I would not use NUnit for that.
If you decide to include the database in your NUnit C# business logic tests, you will have to deal with connections to the database and with shared state between tests. You will have a much more complex setup and deal with more complex errors.
Here is a good resource to start with: https://www.amazon.com/Pragmatic-Unit-Testing-Nunit-Programmers/dp/0974514020
I have run into the issue of unit testing data providers. What is the best way to implement that?
One solution would be to insert something into the database, read it back to make sure it's as expected, and then remove it again. But this requires more coding.
The other solution is to have an extra database that I could test against. This also requires a lot of work to implement.
What is the correct way to do it?
As others have pointed out, what you are describing is called integration testing. Integration testing is something you should definitely do but it's good to understand the differences.
A unit test tests an individual piece of code without any dependencies. Dependencies are things like a database, file system or a web service but also other internal classes that are complex and require their own unit tests. Unit tests are made to run very fast. Especially when performing test driven development (TDD) you want your unit tests to execute in the order of milliseconds.
Integration tests are used to test how different components work together. If you have made sure through unit tests that your business logic is correct, your integration tests only have to make sure that all connections between different elements are in place. Integration tests can take a long time but you have fewer of them than unit tests.
I wrote a blog post on this some time ago that explains the differences and shows you ways to remove external dependencies while unit testing: Unit Testing, hell or heaven?.
Now, regarding your question. When running integration tests against a database you have a couple of options:
Use delta testing. This means that at the beginning of your test you record the current state of your database. For example, you record that there are currently 3 people in the people table. Then in your test you add one person and verify that there are now 4 people in the database. This can work quite effectively in simple scenarios. However, when your project grows more complex, this is probably not the way to go.
Use a transaction around your tests. This is an easy way to make sure that your tests don't leave any data behind. Just start a new transaction (using the TransactionScope class in the .NET Framework) at the beginning of the test. As long as you don't complete the transaction, all changes will be rolled back automatically; see the sketch after this list.
Use a new database for each test. Using LocalDB support in Visual Studio 2012 and higher, this can be done relatively fast.
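A minimal sketch of the transaction option with NUnit (the fixture and test names are placeholders; the actual data access calls are up to you):

using System.Transactions;
using NUnit.Framework;

[TestFixture]
public class PeopleIntegrationTests
{
    private TransactionScope scope;

    [SetUp]
    public void SetUp()
    {
        // Everything the test does against the database happens inside this transaction.
        scope = new TransactionScope();
    }

    [TearDown]
    public void TearDown()
    {
        // Never calling scope.Complete() means all changes are rolled back here.
        scope.Dispose();
    }

    [Test]
    public void AddPerson_InsertsOneRow()
    {
        // Call your real data access code here and assert on the result;
        // the insert is rolled back automatically when the test finishes.
    }
}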
I've chosen the transaction scope approach a couple of times before and it worked quite well. One thing that's very important when writing integration tests like this is to make sure that your tests don't depend upon each other. They need to be able to run in whatever order the test runner decides on.
You should also make sure to avoid any 'magic numbers'. For example, maybe you know that your database contains 3 people, so in your test you add one person and then assert that there are four in the database. For readers of your tests (which will be you in a couple of days, weeks or months) this is very hard to understand. Make sure that your tests are self-explanatory and that you don't depend on external state that's not obvious from the test.
You cannot unit test external dependencies like database connections. There is a good post here about why this is the case. In short: external dependencies should be tested, but that's integration testing, not unit testing.
Normally you write integration tests when you call your database from code. If you want to write unit tests, you should have a look at mocking frameworks.
I'm in the midst of rewriting a database application and using Entity Framework to access the DB. Currently I am using MSTest together with a copy of the underlying database for these tests. My test classes include the following code, which runs around each test:
private TransactionScope transScope;   // declared on the test class

[TestInitialize()]
public void MyTestInitialize()
{
    // Wrap each test in its own transaction with a generous timeout.
    transScope = new TransactionScope(TransactionScopeOption.RequiresNew,
        new TransactionOptions { Timeout = new TimeSpan(0, 10, 0) });
}

[TestCleanup()]
public void MyTestCleanup()
{
    // Roll everything back so the next test starts from a clean database.
    Transaction.Current.Rollback();
    transScope.Dispose();
}
Now, this seems to work pretty well for testing and resets the DB between tests. My tests use the DB context to do CRUD operations against the test DB and then roll them back afterward.
I've read a bit about isolating the C# library from the underlying DB for testing, but I'm wondering what this actually buys me. As part of this rewrite, most (but not all) of the code that was in stored procedures has been moved into the C# layer, but a few procedures remain and are called via triggers on tables. What do I gain from going through the exercise of mocking out the database layer? Frankly, when I look at doing this, it seems like a lot of additional work without any obvious value, but perhaps I'm missing the point here.
Thoughts?
It depends on the kinds of tests you are writing. When writing unit tests, where you want to test one unit of code--typically a class--it's usually good for two things to hold true:
The tests should run as fast as possible, meaning hundreds of tests per second.
The code you are testing should be isolated from other code such that you can control the way dependencies work and test different kinds of inputs and outputs easily.
Both of these things are difficult to do if you use a real database for all tests. The first, because a round-trip to the database usually takes a lot more time than just running some code. The second, because you need to have your database set up with lots of different kinds of data, including negative cases and corner cases, and sometimes you even need to make the database fail. It's often easier to instead mock the dependencies of your class and pass in whatever inputs are needed.
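For instance, making the "database" fail on demand is trivial with a mock. A hedged sketch using Moq and NUnit, with a made-up IOrderRepository and CreditChecker:

using System;
using Moq;
using NUnit.Framework;

// Hypothetical data access interface.
public interface IOrderRepository
{
    decimal GetOutstandingBalance(int customerId);
}

// Hypothetical business rule: treat a failing data layer as "not approved" rather than crashing.
public class CreditChecker
{
    private readonly IOrderRepository repository;

    public CreditChecker(IOrderRepository repository)
    {
        this.repository = repository;
    }

    public bool IsApproved(int customerId)
    {
        try
        {
            return repository.GetOutstandingBalance(customerId) < 1000m;
        }
        catch (TimeoutException)
        {
            return false;
        }
    }
}

[TestFixture]
public class CreditCheckerTests
{
    [Test]
    public void IsApproved_WhenDataLayerTimesOut_ReturnsFalse()
    {
        var repo = new Mock<IOrderRepository>();
        // Simulate the database failing, which is hard to arrange with a real database.
        repo.Setup(r => r.GetOutstandingBalance(It.IsAny<int>())).Throws(new TimeoutException());

        Assert.IsFalse(new CreditChecker(repo.Object).IsApproved(7));
    }
}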
That being said, I have written many tests that use a similar pattern to the one you describe and they work well and run relatively fast. Personally, I would use a combination of real unit tests without a database, and tests like the ones you have that are more like functional testing of a component.
In the end, do what works for you.
I am creating a unit test, but there are many entities. Do I have to insert all the entities into the database manually, or is there a better solution?
Are you looking for something like Moq? You can use it to create mock objects and queryable lists of objects so that you don't need to put fake data into your database to test.
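For example, a small sketch of that idea; IProductRepository and Product are made-up names standing in for your own entities:

using System.Collections.Generic;
using System.Linq;
using Moq;

// Hypothetical repository interface exposing a queryable set of entities.
public interface IProductRepository
{
    IQueryable<Product> Products { get; }
}

public class Product
{
    public string Name { get; set; }
    public decimal Price { get; set; }
}

public static class Example
{
    public static int CountExpensiveProducts()
    {
        // In-memory data instead of rows inserted into a real database.
        var data = new List<Product>
        {
            new Product { Name = "Widget", Price = 10m },
            new Product { Name = "Gadget", Price = 25m }
        }.AsQueryable();

        var repo = new Mock<IProductRepository>();
        repo.SetupGet(r => r.Products).Returns(data);

        // Code under test queries the mocked repository exactly as it would the real one.
        return repo.Object.Products.Count(p => p.Price > 20m);   // 1
    }
}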
Have a look at this link on how to get going with writing unit tests. The one point I think may help most with your question:
Mock out all external services and state
Otherwise, behaviour in those external services overlaps multiple tests, and state data means that different unit tests can influence each other’s outcome.
You’ve definitely taken a wrong turn if you have to run your tests in a specific order, or if they only work when your database or network connection is active.
(By the way, sometimes your architecture might mean your code touches static variables during unit tests. Avoid this if you can, but if you can’t, at least make sure each test resets the relevant statics to a known state before it runs.)
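As a small illustration of that last point (AppState is a hypothetical static that your production code might touch):

using NUnit.Framework;

// Hypothetical shared static state used by the code under test.
public static class AppState
{
    public static int RetryCount = 3;
}

[TestFixture]
public class SomeTests
{
    [SetUp]
    public void ResetStatics()
    {
        // Put shared static state back to a known value before every test,
        // so no test depends on what a previous test left behind.
        AppState.RetryCount = 3;
    }
}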
I am just starting with unit testing and wondering how to unit test methods that make actual changes to my database. Would the best way be to put them into transactions and then roll back, or are there better approaches?
If you want proper test coverage, you need two types of tests:
Unit tests, which mock all your actual data access. These tests will not actually write to the database, but test the behaviour of the class that does (which methods it calls on other dependencies, etc.).
System tests (or integration tests), which check that your database can be accessed and modified. I would consider two types of tests here: plain CRUD tests (create / read / update / delete) for each of your model objects, and more complex system tests for your actual methods and everything you deem interesting or valuable to test. Good practice here is to have each test start from an empty (or "ready for the test") database, do its stuff, then check the state of the database. Transactions / rollbacks are one good way to achieve this.
For unit testing, you need to mock or stub the data access code. Usually you have a repository interface, and you can stub it by creating a concrete repository that stores data in memory (see the sketch below), or you can mock it using a dynamic mocking framework.
For system or integration testing, you need to re-create the entire database before each test method in order to keep a stable, known state for every test.
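A minimal sketch of such an in-memory stub; ICustomerRepository and Customer are made-up names standing in for your own repository interface and entity:

using System.Collections.Generic;
using System.Linq;

// Hypothetical repository interface used by the code under test.
public interface ICustomerRepository
{
    void Add(Customer customer);
    Customer Get(int id);
    IEnumerable<Customer> GetAll();
}

public class Customer
{
    public int Id { get; set; }
    public string Name { get; set; }
}

// Concrete stub that keeps everything in memory instead of a database.
public class InMemoryCustomerRepository : ICustomerRepository
{
    private readonly Dictionary<int, Customer> store = new Dictionary<int, Customer>();

    public void Add(Customer customer)
    {
        store[customer.Id] = customer;
    }

    public Customer Get(int id)
    {
        Customer customer;
        return store.TryGetValue(id, out customer) ? customer : null;
    }

    public IEnumerable<Customer> GetAll()
    {
        return store.Values.ToList();
    }
}

In a unit test you hand an InMemoryCustomerRepository to the class under test in place of the real database-backed implementation.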
As per some of the previous answers, if you want to test your data access code, then you might want to think about mocks and a system/integration test strategy.
But if you want to unit test your SQL objects (e.g. stored procedures, views, table constraints), then there are a number of database unit testing frameworks out there that might be of interest (including one that I have written).
Some implement the tests within SQL; others within your code, using MbUnit/NUnit, etc.
I have written a number of articles with examples on how I approach this - see http://dbtestunit.wordpress.com/
Other resources that might be of use:
http://www.simple-talk.com/sql/t-sql-programming/close-those-loopholes---testing-stored-procedures--/
http://tsqlt.org/articles/
The general approach is to have a way to mock your database actions, so that your unit tests are not reliant on the database being available or in a certain state. That said, it also implies a design that facilitates the isolation required to mock away your data layer. Unit testing, and how to do it well, is a huge topic. Take a look on Google at mocking frameworks and dependency injection for a start.
If you are not developing an O/R mapper, there's no need to test database code. You don't want to test the ADO.NET methods themselves, right? Instead, you want to verify that the ADO.NET methods are called with the right values.
Search Google for the repository pattern. You create an implementation of an IRepository interface with CRUD methods and then test/mock that.
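One common shape for such an interface (a sketch only; the member list depends on your domain):

using System.Collections.Generic;

// Generic repository interface with the usual CRUD members.
public interface IRepository<T> where T : class
{
    T GetById(int id);
    IEnumerable<T> GetAll();
    void Add(T entity);
    void Update(T entity);
    void Delete(T entity);
}

The business layer depends only on this interface, so tests can substitute a mock or an in-memory implementation for the ADO.NET-backed one.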
If you want to test against a real database, that would be more of an integration test than a unit test. Wrapping your tests in transactions could be a way to keep your database in a consistent state.
We've done this in a base class and used the TestInitialize and TestCleanup functions to make sure this always happens.
But testing against a real database will certainly lead to performance problems, so make sure from the beginning that you can swap your database access code for something that runs in memory. I don't know which database access technology you're targeting, but design patterns like Unit of Work and Repository can help you isolate your database code and replace it with an in-memory solution.