I am developing an ERP that is not based on EF. My question is: how can I perform tests using AutoFixture for all the CRUD operations without using mocking? And how can I perform setup and cleanup operations for the unit tests?
Likewise, I need to test the update and delete operations along with insert. From what I have seen on different sites, there are only insert fixture examples that don't use mocks.
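For illustration only, here is a minimal sketch of what such a test could look like with NUnit and AutoFixture; the Customer type, the ICustomerRepository abstraction, and the cleanup strategy are all assumptions standing in for whatever your non-EF data layer actually exposes:

```csharp
using AutoFixture;
using NUnit.Framework;

// Hypothetical shapes; substitute whatever your data layer actually provides.
public class Customer { public int Id { get; set; } public string Name { get; set; } }

public interface ICustomerRepository
{
    void Insert(Customer c);
    void Update(Customer c);
    void Delete(int id);
    Customer GetById(int id);
}

[TestFixture]
public class CustomerRepositoryTests
{
    private Fixture _fixture;
    private ICustomerRepository _repository;
    private Customer _created;

    [SetUp]
    public void SetUp()
    {
        _fixture = new Fixture();
        _repository = CreateRepositoryAgainstTestDatabase(); // your concrete, non-EF implementation
        _created = _fixture.Create<Customer>();              // AutoFixture supplies the test data
        _repository.Insert(_created);                        // real insert against the test database
    }

    [Test]
    public void Update_PersistsChangedName()
    {
        _created.Name = _fixture.Create<string>();
        _repository.Update(_created);
        Assert.AreEqual(_created.Name, _repository.GetById(_created.Id).Name);
    }

    [Test]
    public void Delete_RemovesRow()
    {
        _repository.Delete(_created.Id);
        Assert.IsNull(_repository.GetById(_created.Id));
    }

    [TearDown]
    public void TearDown()
    {
        // Cleanup: remove whatever the test created so each run starts from a known state
        // (assumes Delete is a no-op when the row is already gone).
        _repository.Delete(_created.Id);
    }

    private static ICustomerRepository CreateRepositoryAgainstTestDatabase()
    {
        // Placeholder: wire up your real ADO.NET/Dapper/etc. repository
        // against a dedicated test database here.
        throw new System.NotImplementedException();
    }
}
```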
Related
I need to write unit tests for legacy code. I'm using Moq to write DbContext-based unit tests. One repository method saves 2 entities in a transaction.
When the test execution hits dbContext.Database.BeginTransaction(), it gives the error "No connection string named 'LimaV3DbContext' could be found in the application config file".
It seems the test is trying to execute the actual DB context rather than using the mock DB context. Is there a way to force the test execution to use the mock DB context created in the test initialization?
I've set up the tests based on the following article.
https://learn.microsoft.com/en-us/ef/ef6/fundamentals/testing/mocking
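For context, the pattern from that article relies on the context's DbSet properties being virtual so Moq can replace them; the Database.BeginTransaction() call is not something the mocked context intercepts, so it falls through to the real EF plumbing, which then goes looking for the connection string. If you can make a small change to the legacy context, one possible workaround (sketched below with assumed entity and context shapes, not the real LimaV3DbContext) is to funnel the transaction through a virtual member that the mock can stub:

```csharp
using System;
using System.Data.Entity;
using Moq;

public class Order { public int Id { get; set; } }
public class OrderLine { public int Id { get; set; } }

// Assumed shape of the legacy context; only the members the test needs are shown.
public class LimaV3DbContext : DbContext
{
    public virtual DbSet<Order> Orders { get; set; }
    public virtual DbSet<OrderLine> OrderLines { get; set; }

    // Route the transaction through a virtual method so a mock can replace it.
    public virtual void ExecuteInTransaction(Action work)
    {
        using (var tx = Database.BeginTransaction())
        {
            work();
            SaveChanges();
            tx.Commit();
        }
    }
}

public class RepositoryTests
{
    public void Saves_two_entities_without_touching_the_database()
    {
        var mockContext = new Mock<LimaV3DbContext>();

        // The stub runs the body directly, so no real transaction (and no
        // connection string) is ever needed in the unit test.
        mockContext.Setup(c => c.ExecuteInTransaction(It.IsAny<Action>()))
                   .Callback<Action>(work => work());

        // ... hand mockContext.Object to the repository under test and assert
        // that Orders/OrderLines received the expected Add calls.
    }
}
```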
I need to write integration tests for the layer that exposes my service's methods, but I need the database to be in a certain state for the tests to pass.
For example, to test the GetStoreByID method, I need store 1 to be in my database but not store 2 (for the negative/KO test).
The database is developed and deployed by another team using a SQL project (DACPAC).
I use Entity Framework 6.1.3 with an EDMX.
What is the best way, in this case, to set up the data in the database before the tests?
This MSDN article, https://msdn.microsoft.com/en-us/library/dn314431(v=vs.113).aspx, explains how to write tests with EF6 by faking out the database.
In summary, the article covers the following: 'When writing tests for your application it is often desirable to avoid hitting the database. Entity Framework allows you to achieve this by creating a context – with behavior defined by your tests – that makes use of in-memory data.'
If your data access is separated by an interface (i.e. using something like IDbSet<T> for each of the types), I'd personally avoid depending on the database completely where possible, and use either fake DbSets (if you need to test repository/DAL code) or plain mocking with Moq or NSubstitute if you don't.
I know this doesn't specifically answer your question, but in my experience the best way to set up test data is to do it in memory, with as few external dependencies as possible. The database adds extra moving parts that you don't really want to depend on in unit/integration tests, and it adds complexity if you have a CI server etc. that you generally want to avoid.
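For instance, if the service layer depends on a repository interface rather than on the EDMX context directly, the "store 1 exists, store 2 doesn't" state from the question can be expressed as a tiny in-memory fake. The IStoreRepository and Store names below are assumptions, not the actual code:

```csharp
using System.Collections.Generic;
using System.Linq;

// Hypothetical abstraction over the data access layer.
public class Store { public int Id { get; set; } public string Name { get; set; } }

public interface IStoreRepository
{
    Store GetStoreByID(int id);
}

// In-memory fake: the "database state" is simply the list passed to the constructor.
public class InMemoryStoreRepository : IStoreRepository
{
    private readonly List<Store> _stores;

    public InMemoryStoreRepository(IEnumerable<Store> stores)
    {
        _stores = stores.ToList();
    }

    public Store GetStoreByID(int id)
    {
        return _stores.SingleOrDefault(s => s.Id == id);
    }
}

// Usage in a test: store 1 is present, store 2 is not, so the KO case returns null.
// var repository = new InMemoryStoreRepository(new[] { new Store { Id = 1, Name = "Store 1" } });
// Assert.IsNotNull(repository.GetStoreByID(1));
// Assert.IsNull(repository.GetStoreByID(2));
```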
I am implementing an interface in a C# class whose entire job is basically just to call some T-SQL stored procedures and return the data. Other implementations of the interface may obtain data through web services, reading files, etc., so to test this particular class I'd ideally mock a SQL Server database and its procedures.
I'm not sure if this is feasible. I've seen tools like Rhino Mocks used to mock database tables, but since the entire purpose of my code is to talk to a DB, can I mock the entire DB, or is that a bit of a waste of time? I'd ideally like a way to transparently provide a replacement for a real DB, so local testing can be done by making real stored-procedure calls against a fake DB.
If you are building unit tests then you should not execute the stored procedure and you should stub this interface.
If you are building integration tests then you must have it all working.
In both cases your class shouldn't talk to the database directly: put an inner handler such as an IDbHandler behind it. In your unit tests you mock that handler, and in your integration tests you use the concrete implementation.
The purpose of a mock is to verify that the other side (the DB in your case) received a request with the expected parameters. A physical component can't be mocked directly, but simply adding an interface between your code and the physical component enables those verifications.
BTW, I would start with the unit tests and only afterwards add the integration tests.
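A rough sketch of that shape, with IDbHandler as a stand-in name (the answer doesn't prescribe its members, so the signature below is an assumption):

```csharp
using System.Collections.Generic;
using Moq;

public class CustomerDto { public string Name { get; set; } }

// The class under test only knows about this abstraction; the production
// implementation would call the stored procedure via ADO.NET and is only
// exercised by the integration tests.
public interface IDbHandler
{
    IReadOnlyList<CustomerDto> ExecuteGetCustomers(string region);
}

public class CustomerService
{
    private readonly IDbHandler _db;
    public CustomerService(IDbHandler db) { _db = db; }

    public int CountCustomers(string region)
    {
        return _db.ExecuteGetCustomers(region).Count;
    }
}

public class CustomerServiceTests
{
    public void Unit_test_uses_a_mock_instead_of_the_real_procedure()
    {
        var handler = new Mock<IDbHandler>();
        handler.Setup(h => h.ExecuteGetCustomers("EU"))
               .Returns(new List<CustomerDto> { new CustomerDto { Name = "Acme" } });

        var service = new CustomerService(handler.Object);
        var count = service.CountCustomers("EU");

        // The mock also lets you verify the expected parameters reached the "DB side".
        handler.Verify(h => h.ExecuteGetCustomers("EU"), Times.Once());
    }
}
```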
I'm still struggling with a small issue when it comes to TDD.
I need a method that will get a certain filtered record set from the data layer (LINQ to SQL). Please note that I am using the LINQ classes generated from the DBML. Now the problem is that I want to write a test for this.
Do I:
a) first insert the records in the test and then execute the method and test the results
b) use data that might already be in the database? I'm not too keen on this, because it could cause things to break.
c) whatever you suggest?
You should choose option a).
A unit test should be repeatable and has to be fully under your control. So for the test to be meaningful it is absolutely necessary that the test itself prepares the data for its execution - only this way you can rely on the test outcome.
Use a test database and clean it each time you run the tests, or you might try to create a mock object.
When I run tests using a database, I usually use an in-memory SQLite database.
Using an in memory db generally makes the tests quicker.
Also it is easy to maintain, because the database is "gone" after you close the connection to it.
In the test setup, I set up the db connection and I create the database schema.
In the test, I insert the data needed by the test. (your option a))
In the test teardown, I close the connection to the db.
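Stripped down to plain ADO.NET with System.Data.SQLite and NUnit, that lifecycle could look roughly like this (the schema and table names are made up for illustration):

```csharp
using System.Data.SQLite;
using NUnit.Framework;

[TestFixture]
public class OrderQueryTests
{
    private SQLiteConnection _connection;

    [SetUp]
    public void SetUp()
    {
        // The in-memory database lives only as long as this connection stays open.
        _connection = new SQLiteConnection("Data Source=:memory:");
        _connection.Open();

        using (var cmd = _connection.CreateCommand())
        {
            cmd.CommandText = "CREATE TABLE Orders (Id INTEGER PRIMARY KEY, Total REAL)";
            cmd.ExecuteNonQuery();
        }
    }

    [Test]
    public void Returns_inserted_order()
    {
        // Option a): the test inserts the data it needs, then exercises the query.
        using (var cmd = _connection.CreateCommand())
        {
            cmd.CommandText = "INSERT INTO Orders (Id, Total) VALUES (1, 99.5)";
            cmd.ExecuteNonQuery();
        }

        using (var cmd = _connection.CreateCommand())
        {
            cmd.CommandText = "SELECT Total FROM Orders WHERE Id = 1";
            Assert.AreEqual(99.5, (double)cmd.ExecuteScalar());
        }
    }

    [TearDown]
    public void TearDown()
    {
        // Closing the connection discards the in-memory database completely.
        _connection.Dispose();
    }
}
```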
I used this approach successfully for my NHibernate applications (howto 1 | howto 2 + nice summary), but I'm not that familiar with Linq2SQL.
Some pointers on running SQLite and Linq2SQL are on SO (link 1 | link 2).
Some people argue that a test using a database isn't a unit test. Regardless, I believe that there are situations where you want automated testing against a database:
You can have an architecture / design, where the database is hard to mock out, for instance when using an ActiveRecord pattern, or when you're using Linq2SQL (although there is an interesting solution in one of the comments to Peter's answer)
You want to run integration tests, with the complete system of application and database
What I have done in the past:
Start a transaction
Delete all data from all the tables in the database
Set up the reference data all your tests need
Set up the test data you need in the database tables
Run your test
Abort the transaction
This works well provided your database does not have much data in it; otherwise it is slow, so you will want to use a test database. If you have a test database that is well controlled, you can just run the test in the transaction without the need to delete all data first.
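A minimal version of that wrap-everything-in-a-transaction-and-abort pattern, using TransactionScope and NUnit (the connection string, table names and the call to the code under test are placeholders):

```csharp
using System.Data.SqlClient;
using System.Transactions;
using NUnit.Framework;

[TestFixture]
public class CustomerDalTests
{
    private const string ConnectionString = "Server=.;Database=ErpTest;Integrated Security=true";
    private TransactionScope _scope;

    [SetUp]
    public void SetUp()
    {
        // Everything the test does joins this ambient transaction.
        _scope = new TransactionScope();

        using (var connection = new SqlConnection(ConnectionString))
        {
            connection.Open();
            using (var cmd = connection.CreateCommand())
            {
                // Clean out and reseed only what this fixture needs.
                cmd.CommandText = @"DELETE FROM Customers;
                                    INSERT INTO Customers (Id, Name) VALUES (1, 'Acme');";
                cmd.ExecuteNonQuery();
            }
        }
    }

    [Test]
    public void GetCustomer_ReturnsSeededRow()
    {
        // Call the data access code under test here; it sees the seeded data
        // because its connection enlists in the same ambient transaction.
    }

    [TearDown]
    public void TearDown()
    {
        // Disposing without calling Complete() aborts the transaction, so the
        // database is left exactly as it was before the test ran.
        _scope.Dispose();
    }
}
```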
Try to design your system so you can mock the data access layer for most of your tests. It is valid (and often useful) to unit test database code, but the unit tests for your other code should not need to touch the database.
You should consider whether you would get more benefit from “end to end” system tests, with unit tests only for your “logic” code. This depends to a large extent on other factors within the project.
I have an existing ASP.NET MVC application with some sample data in the SQL Server database, which is working fine.
Assuming I have all of the necessary repositories and IOC in place, is there a tool that will extract the data from a group of tables, and "freeze-dry" it into a mock object (perhaps using an XML file to store the data), so that I can detach the database and use the mock data for my unit tests?
Depending on what exactly you are trying to test there might be different approaches.
If you want to test the data access logic then this is no longer a unit test but an integration test. For this kind of test it is a good idea to be able to easily replace the real database with a lighter, maybe even in-memory, database like SQLite, which is rebuilt before each test run. If you are using an ORM this task is easy. All you need to do is generate SQL scripts (INSERT INTO...) from your existing database, adapt the dialect to SQLite (if necessary), read and inject them into a SQLite file, and finally instruct your data access layer to use the SQLite dialect and connection string for the tests.
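As a rough sketch of the "inject the adapted scripts into a SQLite file" step (the file name, paths and schema are assumptions; the scripts themselves would be your generated and dialect-adapted dump):

```csharp
using System.Data.SQLite;
using System.IO;

public static class SqliteTestDatabase
{
    // Rebuilds the SQLite test database file from adapted schema/data scripts
    // before each test run, so every test starts from the same known state.
    public static string Rebuild(string schemaScriptPath, string dataScriptPath)
    {
        var dbPath = Path.Combine(Path.GetTempPath(), "integration-tests.sqlite");
        if (File.Exists(dbPath))
            File.Delete(dbPath);

        using (var connection = new SQLiteConnection("Data Source=" + dbPath))
        {
            connection.Open();
            using (var cmd = connection.CreateCommand())
            {
                cmd.CommandText = File.ReadAllText(schemaScriptPath); // CREATE TABLE ... (SQLite dialect)
                cmd.ExecuteNonQuery();

                cmd.CommandText = File.ReadAllText(dataScriptPath);   // INSERT INTO ... dumped from MSSQL
                cmd.ExecuteNonQuery();
            }
        }

        // Point the ORM / data access layer at this connection string in the test setup.
        return "Data Source=" + dbPath;
    }
}
```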
Now if you are not using an ORM and your data access logic is tied to MSSQL, things get uglier: you will need a live database in order to perform those integration tests. In this case I would suggest you duplicate your real database and point the tests at the copy by modifying only the connection string. Once again you will need to properly set up and tear down (preferably in a transaction) this test database in order to put it into a known state for the tests.
If you want to test code that depends on those repositories (such as your controllers, for example), you don't even need to bother about extracting the data, as your controllers depend on abstract repositories and not the real implementations (don't they?), so you can simply mock the repository methods in order to test the logic in the controllers.
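For example (with a hypothetical IProductRepository and ProductsController, since the question doesn't show the real ones), mocking the repository for a controller test could look like this:

```csharp
using System.Collections.Generic;
using System.Web.Mvc;
using Moq;

// Hypothetical abstractions standing in for the real repositories behind the IOC container.
public class Product { public int Id { get; set; } public string Name { get; set; } }

public interface IProductRepository
{
    IEnumerable<Product> GetAll();
}

public class ProductsController : Controller
{
    private readonly IProductRepository _products;
    public ProductsController(IProductRepository products) { _products = products; }

    public ViewResult Index()
    {
        return View(_products.GetAll());
    }
}

public class ProductsControllerTests
{
    public void Index_passes_repository_data_to_the_view()
    {
        var repository = new Mock<IProductRepository>();
        repository.Setup(r => r.GetAll())
                  .Returns(new[] { new Product { Id = 1, Name = "Widget" } });

        var controller = new ProductsController(repository.Object);

        // The controller logic is exercised without any database at all.
        var result = controller.Index();
        var model = (IEnumerable<Product>)result.ViewData.Model;
    }
}
```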
This is actually a well-known "test smell": http://xunitpatterns.com/Obscure%20Test.html#General
(From question 2098937: Proper way to Mock repository objects for unit tests using Moq and Unity)
I don't know of a direct way to do what you're asking for, but MSSQL supports export to CSV, Access, and Excel. However, this requires you to change the Data Access Layer in your mid-tier, and furthermore, I don't feel this answers your question:
"freeze-dry" it into a mock object
This I'm not sure of. Is it feasible to just restore a backup of the database either on the same SQL server as a new database, or possibly on a dev SQL server?