I know my question looks like I am searching for a shortcut, but I have actually searched a lot and couldn't find a resource for learning unit testing, specifically for a business layer that involves data updates in a database.
I did find resources on writing unit tests in NUnit, but they all tested static data. I don't need to learn unit testing from scratch; I already know how to write unit tests in NUnit. My main question is: how do we test methods that insert, update, or delete data in the database? Do we actually insert data while testing? Does mocking play a role in it?
Please help me with any resource that covers this case specifically.
Thanks!
If your business layer is written in C#, as your tag suggests, then mocking the database is the best practice. You use a mocking framework that lets you control the data layer and write asserts that your business layer does the correct thing under controlled (mocked) responses. With this approach your tests are isolated and execute very fast without any connection to the database (you should be able to execute thousands of unit tests within a second).
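A rough sketch of what such a test can look like with NUnit and a mocking framework like Moq is shown below. The Order, IOrderRepository and OrderService types are invented for the example and stand in for your own data layer and business layer.

```csharp
// Hedged sketch: the types below are invented for illustration, not part of
// any real project. Requires the NUnit and Moq NuGet packages.
using Moq;
using NUnit.Framework;

public class Order { public int Id; public decimal Total; }

public interface IOrderRepository
{
    Order GetOrder(int id);
    void Save(Order order);
}

// A trivial business-layer class; in a real project this is the code under test.
public class OrderService
{
    private readonly IOrderRepository _repository;
    public OrderService(IOrderRepository repository) { _repository = repository; }

    public Order ApplyGoldDiscount(int orderId)
    {
        var order = _repository.GetOrder(orderId);
        order.Total = order.Total * 0.9m;   // 10% discount for gold customers
        _repository.Save(order);
        return order;
    }
}

[TestFixture]
public class OrderServiceTests
{
    [Test]
    public void ApplyGoldDiscount_ReducesTotalByTenPercent()
    {
        // Arrange: the data layer is mocked, so no database connection is needed.
        var repository = new Mock<IOrderRepository>();
        repository.Setup(r => r.GetOrder(42)).Returns(new Order { Id = 42, Total = 100m });

        var service = new OrderService(repository.Object);

        // Act
        var result = service.ApplyGoldDiscount(42);

        // Assert: verify the business rule and that the save was requested.
        Assert.AreEqual(90m, result.Total);
        repository.Verify(r => r.Save(It.Is<Order>(o => o.Total == 90m)), Times.Once());
    }
}
```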
If your database itself contains a lot of business logic, you can unit test it in isolation, but I would not use NUnit for that.
If you decide to include the database in your NUnit business-logic tests, then you will have to deal with connections to the database and shared state between tests. You will have a much more complex setup and deal with more complex errors.
Here is a good resource to start with: https://www.amazon.com/Pragmatic-Unit-Testing-Nunit-Programmers/dp/0974514020
Related
I have run into an issue regarding unit testing of data providers. What is the best way to implement that?
One solution would be to insert something into the database and read it back to make sure it's as expected, and then remove it again. But this requires more coding.
The other solution is to have an extra database which I could test against. This also requires a lot of work to implement.
What is the correct way to do it?
As others have pointed out, what you are describing is called integration testing. Integration testing is something you should definitely do but it's good to understand the differences.
A unit test tests an individual piece of code without any dependencies. Dependencies are things like a database, file system or a web service but also other internal classes that are complex and require their own unit tests. Unit tests are made to run very fast. Especially when performing test driven development (TDD) you want your unit tests to execute in the order of milliseconds.
Integration tests are used to test how different components work together. If you have made sure through unit tests that your business logic is correct, your integration tests only have to make sure that all connections between different elements are in place. Integration tests can take a long time but you have fewer of them than unit tests.
I wrote a blog post on this some time ago that explains the differences and shows you ways to remove external dependencies while unit testing: Unit Testing, hell or heaven?.
Now, regarding your question. When running integration tests against a database you have a couple of options:
Use delta testing. This means that at the beginning of your test you record the current state of your database. For example, you record that there are currently 3 people in the people table. Then in your test you add one person and verify that there are now 4 people in the database. This can be used quite effectively in simple scenarios. However, when your project grows more complex this is probably not the way to go.
Use a transaction around your tests. This is an easy way to make sure that your tests don't leave any data behind. Just start a new transaction (using the TransactionScope class in the .NET Framework) at the beginning of the test. As long as you don't complete the transaction, all changes will be rolled back automatically (see the sketch after these options).
Use a new database for each test. Using LocalDB support in Visual Studio 2012 and higher, this can be done relatively fast.
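Here is a minimal sketch of the transaction option mentioned above. It assumes NUnit and plain ADO.NET; the connection string and the People table are invented for the example.

```csharp
// Sketch of the transaction option: all work happens inside a TransactionScope
// that is never completed, so every change is rolled back when the scope is
// disposed. The connection string and People table are invented examples.
using System.Data.SqlClient;
using System.Transactions;
using NUnit.Framework;

[TestFixture]
public class PeopleTableIntegrationTests
{
    private const string ConnectionString =
        "Server=(localdb)\\MSSQLLocalDB;Database=AppTest;Integrated Security=true";

    [Test]
    public void InsertPerson_AddsExactlyOneRow()
    {
        using (new TransactionScope())
        using (var connection = new SqlConnection(ConnectionString))
        {
            connection.Open();   // the connection enlists in the ambient transaction

            int before = CountPeople(connection);
            using (var insert = new SqlCommand(
                "INSERT INTO People (Name) VALUES ('Test Person')", connection))
            {
                insert.ExecuteNonQuery();
            }

            Assert.AreEqual(before + 1, CountPeople(connection));
        }   // scope disposed without Complete() => everything is rolled back
    }

    private static int CountPeople(SqlConnection connection)
    {
        using (var count = new SqlCommand("SELECT COUNT(*) FROM People", connection))
        {
            return (int)count.ExecuteScalar();
        }
    }
}
```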
I've chosen the transaction-scope approach a couple of times before and it worked quite well. One thing that's very important when writing integration tests like this is to make sure that your tests don't depend on each other. They need to be able to run in whatever order the test runner decides on.
You should also make sure to avoid any 'magic numbers'. For example, maybe you know that your database contains 3 people, so in your test you add one person and then assert that there are 4 in the database. For readers of your tests (which will be you in a couple of days, weeks or months) this is very hard to understand. Make sure that your tests are self-explanatory and that you don't depend on external state that isn't obvious from the test.
You cannot unit test external dependencies like database connections. There is a good post here about why this is the case. In short: external dependencies should be tested, but that's integration tests, not unit tests.
Normally you write integration tests when you call your database from code. If you want to write unit tests, you should have a look at mocking frameworks.
We've just started using specflow to do testing on a large codebase.
Since we're always adding features, we decided to write tests for new code and add tests for the old code when it's time to refactor it.
One of our new features involves copying cars from one user's database to another's. There's not much UI for this, so we don't have a way to check everything went OK other than hitting the database.
Is there an alternative to hitting the database when writing tests like this?
If the database access is isolated from the business logic well enough, i.e. behind some sort of repository interface, then you can mock the real repository in your tests and validate that the mocked repository is called as would be appropriate for such a copy operation.
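For illustration, here is a hedged sketch of that idea, written as a plain NUnit test rather than a SpecFlow scenario for brevity. ICarRepository, Car and CarCopyService are invented stand-ins for your own repository interface and copy logic.

```csharp
// Hypothetical sketch: ICarRepository, Car and CarCopyService are invented
// stand-ins for the real repository interface and copy logic.
using System.Collections.Generic;
using Moq;
using NUnit.Framework;

public class Car { public int Id; public string Model; }

public interface ICarRepository
{
    IEnumerable<Car> GetCarsForUser(int userId);
    void AddCarForUser(int userId, Car car);
}

public class CarCopyService
{
    private readonly ICarRepository _repository;
    public CarCopyService(ICarRepository repository) { _repository = repository; }

    public void CopyCars(int sourceUserId, int targetUserId)
    {
        foreach (var car in _repository.GetCarsForUser(sourceUserId))
            _repository.AddCarForUser(targetUserId, car);
    }
}

[TestFixture]
public class CarCopyTests
{
    [Test]
    public void CopyCars_AddsEverySourceCarToTargetUser()
    {
        var repository = new Mock<ICarRepository>();
        repository.Setup(r => r.GetCarsForUser(1))
                  .Returns(new[] { new Car { Id = 10 }, new Car { Id = 11 } });

        new CarCopyService(repository.Object).CopyCars(1, 2);

        // No database involved: we only verify the repository was asked to add
        // both cars to the target user.
        repository.Verify(r => r.AddCarForUser(2, It.IsAny<Car>()), Times.Exactly(2));
    }
}
```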
I am just starting with unit testing and am wondering how to unit test methods that make actual changes to my database. Would the best way be to wrap them in transactions and then roll back, or are there better approaches to this?
If you want proper test coverage, you need two types of tests:
Unit tests which mock all your actual data access. These tests will not actually write to the database, but test the behaviour of the class that does (which methods it calls on other dependencies, etc.).
System tests (or integration tests) which check that your database can be accessed and modified. I would consider two types of tests here: simple plain CRUD tests (create / read / update / delete) for each of your model objects, and more complex system tests for your actual methods and everything you deem interesting or valuable to test. Good practice here is to have each test start from an empty (or "ready for the test") database, do its work, then check the state of the database. Transactions / rollbacks are one good way to achieve this.
For unit testing you need to mock or stub the data access code. Typically you have a repository interface, and you can stub it by creating a concrete repository that stores data in memory (see the sketch below), or you can mock it using a dynamic mocking framework.
For system or integration testing, you need to re-create the entire database before each test method in order to maintain a stable state for each test.
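As a sketch of the in-memory stub approach mentioned above (all names invented for the example):

```csharp
// Hedged sketch of an in-memory repository stub; IProductRepository and Product
// are placeholders for your own interfaces and model classes.
using System.Collections.Generic;
using System.Linq;

public class Product { public int Id; public string Name; }

public interface IProductRepository
{
    void Add(Product product);
    Product GetById(int id);
    IEnumerable<Product> GetAll();
    void Delete(int id);
}

// Stores everything in a dictionary, so tests need no database at all.
public class InMemoryProductRepository : IProductRepository
{
    private readonly Dictionary<int, Product> _store = new Dictionary<int, Product>();

    public void Add(Product product) { _store[product.Id] = product; }
    public Product GetById(int id) { return _store.ContainsKey(id) ? _store[id] : null; }
    public IEnumerable<Product> GetAll() { return _store.Values.ToList(); }
    public void Delete(int id) { _store.Remove(id); }
}
```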
As per some of the previous answers, if you want to test your data access code then you might want to think about mocks and a system/integration test strategy.
But if you want to unit test your SQL objects (e.g. sprocs, views, constraints in tables, etc.), then there are a number of database unit-testing frameworks out there that might be of interest (including one that I have written).
Some implement tests within SQL, others within your code using MbUnit/NUnit etc.
I have written a number of articles with examples on how I approach this - see http://dbtestunit.wordpress.com/
Other resources that might be of use:
http://www.simple-talk.com/sql/t-sql-programming/close-those-loopholes---testing-stored-procedures--/
http://tsqlt.org/articles/
The general approach is to have a way to mock your database actions, so that your unit tests are not reliant on the database being available or in a certain state. That said, it also implies a design that facilitates the isolation required to mock away your data layer. Unit testing and how to do it well is a huge topic. Search for mocking frameworks and dependency injection for a start.
If you are not developing an O/R mapper, there's no need to test database code. You don't want to test ADO.NET methods, right? Instead you want to verify that the ADO.NET methods are called with the right values.
Search Google for repository pattern. You will create an implementation of IRepository interface with CRUD methods and test/mock this.
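One common shape for such an interface looks roughly like this; the exact methods and names will depend on your own model:

```csharp
// One possible shape for a generic CRUD repository interface; adjust to your
// own entities. Concrete implementations talk to the database, while tests
// mock or stub this interface instead.
using System.Collections.Generic;

public interface IRepository<T> where T : class
{
    T GetById(int id);
    IEnumerable<T> GetAll();
    void Add(T entity);
    void Update(T entity);
    void Delete(int id);
}
```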
If you want to test against a real database, this would be more of an integration test than a unit test. Wrapping your tests in a transaction could be an idea to keep your database in a consistent state.
We've done this in a base class and used the TestInitialize and TestCleanup functions to make sure this always happens.
But testing against a real database will certainly bring you performance problems. So make sure from the beginning that you can swap your database access code with something that runs in memory. I don't know which database access code you're targeting, but design patterns like Unit of Work and Repository can help you isolate your database code and replace it with an in-memory solution.
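As a sketch of the base-class approach mentioned above (assuming MSTest, since TestInitialize and TestCleanup are MSTest attributes):

```csharp
// Sketch of a base class that wraps every test in a transaction and rolls it
// back afterwards, keeping the database in a consistent state.
using System.Transactions;
using Microsoft.VisualStudio.TestTools.UnitTesting;

public abstract class TransactionalTestBase
{
    private TransactionScope _scope;

    [TestInitialize]
    public void BeginTransaction()
    {
        _scope = new TransactionScope();
    }

    [TestCleanup]
    public void RollbackTransaction()
    {
        // Disposing without Complete() rolls back everything the test changed.
        _scope.Dispose();
    }
}
```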
I'm trying to improve the automated testing in my application, but am unsure of the best way to proceed.
My app gathers data from multiple forms, recodes it and stores it in a database. I have created a pretty complex SQL view, which flattens the structure out, so it can be imported into a stats package (SPSS).
My concern is that the view is complex, and I want to automate some tests around it.
Currently I have some functional tests which create a complete form object model and send it into the application. I then retrieve the view from the database and use reflection to test that the retrieved view fields match the original data.
The problem is that this is very manual and heavy: my fixtures are lengthy, and it is cumbersome to add new scenarios (i.e. various parts of the model being incomplete).
Does anyone have any advice on how I could improve my test strategy? Tips tricks all welcome!
Thanks!
DbFit is perfect for this. DbFit is an extension of FitNesse, which maybe you are already using since you spoke of "fixtures". In any case, DbFit makes it really easy to set up a test where you can seed some data, run the view, compare the expected results, and then have it automatically roll back the data you seeded for the test. It is very easy to update as you add more fields to the view. And it requires no additional objects in your DB, unlike some other SQL "unit" testing suites.
You can find more info on using DbFit at:
http://benilovj.github.com/dbfit
http://groups.google.com/group/dbfit
And here is a tutorial that I wrote for it that explains the basic options:
http://www.sqlservercentral.com/articles/Testing/64636/
This is a very difficult question to answer. It almost sounds to me like you want to make a single test that tests everything in one go.
First, your app should be constructed so that each piece of functionality is isolated in its own class, making it easy to test AND easy to replace with stubs when testing other things. Dependencies on other functionality should be injected (dependency injection).
Second, you should use the same technique for external systems like database connections and SPSS file writers. This involves wrapping such functionality so these dependencies can also be injected, and thus replaced by stubs when testing other aspects of your app (a small sketch follows below).
Third, be aware that if tests are hard to write, 99.99% of the time this indicates that your design is not as strong as it could be.
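As a small sketch of the second point (all type names invented for illustration), the exporter could depend only on abstractions that wrap the database and the SPSS writer:

```csharp
// Hedged sketch: IFormStore wraps the database, ISpssWriter wraps the SPSS
// export. All names here are invented for illustration.
using System.Collections.Generic;

public class FormData
{
    public IDictionary<string, object> Fields = new Dictionary<string, object>();
}

public interface IFormStore
{
    FormData GetForm(int formId);
}

public interface ISpssWriter
{
    void WriteRow(IDictionary<string, object> row);
}

public class FormExporter
{
    private readonly IFormStore _store;
    private readonly ISpssWriter _writer;

    // Both external systems are injected, so tests can pass in simple stubs
    // instead of a real database connection and a real SPSS file writer.
    public FormExporter(IFormStore store, ISpssWriter writer)
    {
        _store = store;
        _writer = writer;
    }

    public void Export(int formId)
    {
        var form = _store.GetForm(formId);
        _writer.WriteRow(form.Fields);
    }
}
```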
Regards,
Morten
I use a code generator (CodeSmith with the .NetTiers template) to generate all the DAL code. I write unit tests for my code (business layer), and these tests are becoming pretty slow to run. The problem is that for each test I reset the database to a clean state. Also, as I run a lot of tests, the latency of the database operations adds up to a pretty big delay.
All DB operations are performed through a DataRepository class that is generated by .NetTiers. Do you know if there is a way to generate (or code myself) a mock DataRepository that would use in-memory storage instead of the database?
This way I would be able to use this mock repository in my unit tests, speeding them up a lot, without actually changing anything in my current code!
Take a look at dependency injection (DI) and inversion of control (IoC) containers. Essentially, you will create an interface that a new mock DB object can implement, and then the DI framework will inject your mock DB when running tests and the real DB when running your app.
There are numerous free and open-source libraries that can help you out. Since you are in C#, one of the newer, up-and-coming DI libraries is Ninject. There are many others too. Check out this Wikipedia article for others and a high-level description.
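As a very small, hedged sketch of that idea with Ninject (the repository interface and its implementations are placeholders, not .NetTiers types):

```csharp
// Hedged sketch of swapping implementations with an IoC container (Ninject);
// IDataRepository and the two implementations are placeholders.
using Ninject;
using Ninject.Modules;

public interface IDataRepository { /* CRUD methods would go here */ }
public class SqlDataRepository : IDataRepository { /* talks to the real DB */ }
public class InMemoryDataRepository : IDataRepository { /* in-memory test double */ }

public class ProductionModule : NinjectModule
{
    public override void Load()
    {
        Bind<IDataRepository>().To<SqlDataRepository>();      // real database
    }
}

public class TestModule : NinjectModule
{
    public override void Load()
    {
        Bind<IDataRepository>().To<InMemoryDataRepository>(); // in-memory fake
    }
}

public static class CompositionExample
{
    public static IDataRepository ResolveForTests()
    {
        // The app would use new StandardKernel(new ProductionModule()) instead.
        var kernel = new StandardKernel(new TestModule());
        return kernel.Get<IDataRepository>();
    }
}
```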
From the description of the issue, I think you are performing integration tests, because your tests make use of the business layer, the DAL, and a live database.
For unit testing, you deal with one layer of code, with all other dependencies either mocked or stubbed. With this approach, your unit tests will be really fast to execute on every incremental code change.
There are various mocking frameworks that you can use, such as Rhino Mocks, Moq, and TypeMock, to name a few. (In my project, I use Rhino Mocks to mock the DAL layer and unit test the business layer in isolation.)
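For what it's worth, here is a tiny hedged sketch of stubbing a DAL interface with Rhino Mocks; ICustomerDal and Customer are invented names, and the business layer that would consume the stub is left out for brevity.

```csharp
// Minimal sketch of stubbing a DAL interface with Rhino Mocks; ICustomerDal
// and Customer are invented names, not part of any real library.
using System.Collections.Generic;
using NUnit.Framework;
using Rhino.Mocks;

public class Customer { public int Id; public string Name; }

public interface ICustomerDal
{
    IList<Customer> GetCustomers();
}

[TestFixture]
public class CustomerDalStubTests
{
    [Test]
    public void Dal_CanBeStubbed_WithoutTouchingTheDatabase()
    {
        var dal = MockRepository.GenerateStub<ICustomerDal>();
        dal.Stub(d => d.GetCustomers())
           .Return(new List<Customer> { new Customer { Id = 1, Name = "Alice" } });

        // The business layer under test would receive 'dal' via its constructor;
        // here we only show that the stub returns the canned data.
        Assert.AreEqual(1, dal.GetCustomers().Count);
    }
}
```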
Harsha
Some of our unit tests use data fetched from XML files which were generated from a database to mock DB access. The DAL classes are replaced by mock ones because they are all resolved from a DI container.
The generation of the XML files is custom code; if you find an open-source solution for this, I'm happy to hear about it.
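A hedged sketch of what such an XML-backed DAL replacement can look like (Account and IAccountDal are invented names; the real XML format would be whatever your export produces):

```csharp
// Hedged sketch of a DAL replacement that reads previously exported XML instead
// of hitting the database; Account and IAccountDal are invented names.
using System.Collections.Generic;
using System.IO;
using System.Xml.Serialization;

public class Account { public int Id; public string Owner; }

public interface IAccountDal
{
    List<Account> GetAccounts();
}

public class XmlAccountDal : IAccountDal
{
    private readonly string _xmlPath;
    public XmlAccountDal(string xmlPath) { _xmlPath = xmlPath; }

    public List<Account> GetAccounts()
    {
        // Deserialize the account list that was exported from the database.
        var serializer = new XmlSerializer(typeof(List<Account>));
        using (var stream = File.OpenRead(_xmlPath))
        {
            return (List<Account>)serializer.Deserialize(stream);
        }
    }
}
```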
Edit after Stefan's answer: I recall another team using SQL CE for their test database