How do I close or "uninitialize" Castle ActiveRecord? - c#

I'm running some unit tests using Castle ActiveRecord that interact with a database. I have a procedure to drop the database (if it exists), then re-create it, before I interact with it in each test.
If I run one test, this works fine.
If I run multiple tests, the second one fails because it can't drop the database.
Is there some way in Castle ActiveRecord to tell it to shut down and let go of the database?

Instead of dropping the whole database, I recommend dropping and recreating the schema.
To drop the schema: ActiveRecordStarter.DropSchema();
To create the schema: ActiveRecordStarter.CreateSchema();
To reinitialize ActiveRecord: ActiveRecordStarter.ResetInitializationFlag(); then reconfigure it.
See the base AR test class for guidance.
For testing, I recommend taking a look at the new InMemoryTest.
See also: docs for ActiveRecord unit-testing.
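A minimal sketch of what that can look like in an NUnit fixture, assuming configuration comes from the ActiveRecord section in app.config and MyEntity is a placeholder for one of your mapped classes (adjust to your own types and config source):

using Castle.ActiveRecord;
using Castle.ActiveRecord.Framework.Config;
using NUnit.Framework;

[TestFixture]
public class MyDatabaseTests
{
    [SetUp]
    public void SetUp()
    {
        // allow Initialize to be called again for every test
        ActiveRecordStarter.ResetInitializationFlag();
        ActiveRecordStarter.Initialize(ActiveRecordSectionHandler.Instance, typeof(MyEntity));
        ActiveRecordStarter.CreateSchema();   // build the tables for the mapped types
    }

    [TearDown]
    public void TearDown()
    {
        // drop the schema instead of the whole database, so nothing keeps hold of it
        ActiveRecordStarter.DropSchema();
    }
}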

Related

Unit Testing Legacy Code without DI

We are trying to add unit testing to our business layer. The technology stack is ASP.NET Web Forms, WCF, and ADO.NET calling stored procedures. The business layer calls static methods on data classes, which makes it difficult to introduce DI without making a lot of changes.
It may not be a conventional way to do it, but I'm thinking of keeping the DB as a dependency in the unit tests and using it as a test DB, either using an existing frozen DB or having mocked data in tables. I was wondering about the feasibility of using a test DB where the stored procedures are used like mocks: instead of duplicating the entire DB, just create tables named after the stored procedures.
Each stored procedure would just query one table and return static data, essentially trying to emulate the kind of data mocking you would do with something like Moq, but from a DB perspective.
Can anyone recommend any designs that include the DB in testing and are still deterministic?
If you want to use the DB in the tests and have everything be deterministic, then you need each test to have its own DB, which means creating (and potentially populating) a new DB for each test.
Depending on how your DB layer creates its connection, this is feasible. I have done something similar by generating a DB using LocalDB in the test setup, with a GUID for the name, and then deleting the DB again in the tear-down at the end of the test.
It ends up being reasonably slow (not surprisingly), but having the DBs created on a RAM disk helped with that.
This worked fine for empty DBs that then had their schemas created by EF, but if you need a fixed set of data in the DB you might need to restore it from a backup in the test setup.
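A rough sketch of that per-test LocalDB approach; the instance name, NUnit attributes and connection details are assumptions to adapt:

using System;
using System.Data.SqlClient;
using NUnit.Framework;

[TestFixture]
public class LocalDbPerTest
{
    private const string Master = @"Server=(localdb)\MSSQLLocalDB;Integrated Security=true";
    private string _dbName;

    [SetUp]
    public void CreateTestDatabase()
    {
        _dbName = "TestDb_" + Guid.NewGuid().ToString("N");
        using (var conn = new SqlConnection(Master))
        {
            conn.Open();
            using (var cmd = new SqlCommand("CREATE DATABASE [" + _dbName + "]", conn))
                cmd.ExecuteNonQuery();
        }
        // point your DAL / EF connection string at _dbName here
    }

    [TearDown]
    public void DropTestDatabase()
    {
        using (var conn = new SqlConnection(Master))
        {
            conn.Open();
            var sql = "ALTER DATABASE [" + _dbName + "] SET SINGLE_USER WITH ROLLBACK IMMEDIATE;" +
                      "DROP DATABASE [" + _dbName + "]";
            using (var cmd = new SqlCommand(sql, conn))
                cmd.ExecuteNonQuery();
        }
    }
}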
It seems to me that it's going to be a lot of work setting up your stored procedures to do what you want them to do when they are called for each test, and you still end up with the speed problems that databases always present. I'd recommend you do one or both of the following instead:
Use Typemock Isolator, a powerful isolation tool. It basically intercepts calls in your compiled code so that your unit tests can mock even static methods.
Instead of just unit tests, try creating "acceptance tests," which focus on mimicking a complete user experience: log in, create object, view object (verify object looks right), update object, view object again (ditto), delete object (verify object is deleted). Begin each of these tests by setting up all the objects you'll need for this particular test, and end by deleting all those objects, so that other tests can run based on an assumed starting state.
The first approach gives you the speed and mockability of true "unit" tests, whereas the second one allows you to exercise much more of your code, increasing the likelihood that you'll catch bugs, even in things like stored procedures.
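A bare skeleton of the acceptance-test approach from the second suggestion; AppSession and the widget methods are hypothetical stand-ins for your own application API:

using NUnit.Framework;

[TestFixture]
public class WidgetAcceptanceTests
{
    [Test]
    public void Widget_FullLifecycle()
    {
        // hypothetical application facade; replace with your own entry points
        var session = AppSession.LogIn("testuser", "password");

        var id = session.CreateWidget("My widget");
        Assert.AreEqual("My widget", session.GetWidget(id).Name);        // view after create

        session.RenameWidget(id, "Renamed widget");
        Assert.AreEqual("Renamed widget", session.GetWidget(id).Name);   // view after update

        session.DeleteWidget(id);
        Assert.IsNull(session.GetWidget(id));   // verify deletion, leaving the DB as we found it
    }
}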

Entity Framework with Federated Service implementation

I'm not sure exactly what I'm looking for, and everything I've seen so far looks like it will work until I really dive into it. I just need some pointers from the brains here. I'm working on an ASP.NET MVC / EF5 / SQL 2012 project. We have a model set that isn't Code First (the entities were built using the designer), and as of right now everything is working just fine. But we have this setup script (as convoluted as I've ever seen) and I need to turn it into something more automated. Right now the setup script pre-populates the tables with data: lookups, reference data, etc. I'm looking for a way to automate this further, without having to run this script, and even more so, to generate the database and tables automatically.
Every article I've read seems to do the trick (migrations, seeding, etc.), but the one thing they don't take into consideration is that we federate services. The actual EDMX sits behind a WCF Data Services 5.6 service. I have access to the models and whatnot, but the WCF service exposes a DataServiceContext, which doesn't have a Seed method on it. Am I looking at the right stuff here, or is the only option to keep this confounded setup script (all C#-driven)? This website has been helpful: http://www.entityframeworktutorial.net/code-first/seed-database-in-code-first.aspx as well as this: Auto Create Database Tables from Objects, Entity Framework, but I don't see how I can use these over WCF 5.6.
The short answer is that Model-First doesn't give you a seed method because they want you to use a SQL script, but you have a few choices:
Use EF PowerTools (or VS2013's EF Designer) to generate the "Code-First" model from your DB. This will allow you to seed your DB, and have finer control over how everything operates under the hood.
Use a SQL script to seed with. Generally, if you make changes to your schema, you'll re-run your recreate DB script. Create a separate script to populate your DB and keep it handy. If you feel more comfortable in code than SQL, you can make a console app (or whatever type of app you want) and keep it up to date with your schema.
If all you need is seed data, and there is a good business case to expose a method to your service consumers, you can keep Model-First, create a stored procedure to seed your DB, and expose it as an EF function. You can then expose this in your WCF service.
Personally, I tend towards designing the DB myself, using VS 2013's EF6 POCO generator, then using Code-First because of the better granular control that you get with real data classes. Then I do some cleanup work, write my seed methods, etc.
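For the Code-First route suggested above, a minimal sketch of a seed; MyContext, Statuses and Status are illustrative names for your reverse-engineered context and a lookup table:

using System.Data.Entity;
using System.Data.Entity.Migrations;   // for the AddOrUpdate extension

// Seed lookup/reference data when the database is created.
public class MyContextInitializer : CreateDatabaseIfNotExists<MyContext>
{
    protected override void Seed(MyContext context)
    {
        context.Statuses.AddOrUpdate(s => s.Name,
            new Status { Name = "Active" },
            new Status { Name = "Archived" });
        context.SaveChanges();
    }
}

// registered once at application startup:
// Database.SetInitializer(new MyContextInitializer());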

Unit testing with rollback on a database

I'm just starting to understand the importance of unit testing in a C# environment. Now I'm wondering how to implement a black-box unit test that does inserts, deletes and updates on a database and then cleans up the data after a successful test.
How do you actually roll back data that was inserted/updated/deleted? Do you simply reset the identity and remove the inserted rows, or restore the original state of the table with a script?
Please guide me; I'd appreciate it. Thanks!
Here is what we do in our development cycle: we always have unit testing and load testing in mind while developing the application. We add a UserId column (or similar) to every database table. When we run a load test we insert a UserId of -1 into that column to mark the rows as load-test data, and -2 in the case of unit-test data. We then have predefined jobs on the database side that clean up that data after some time.
As long as your test is concise (and I presume it must be, since it is testing your DAL), why not just do the inserts, updates and deletes in a transaction that is rolled back once your test is complete?
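For example, with TransactionScope; CustomerDal, Customer and connectionString are placeholders for whatever DAL code you are testing:

using System.Transactions;
using NUnit.Framework;

[Test]
public void InsertCustomer_CanBeReadBack()
{
    // Dispose() without Complete() rolls everything back, so the DB is untouched afterwards.
    using (new TransactionScope())
    {
        var dal = new CustomerDal(connectionString);   // connectionString defined elsewhere in the fixture
        dal.InsertCustomer(new Customer { Name = "Test customer" });

        var fetched = dal.GetCustomerByName("Test customer");
        Assert.IsNotNull(fetched);
    }
}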
Another option is to use specific update/delete scripts in your test cleanup methods to restore the exact rows you inserted or updated to their pre-test values.
I think deleting the rows in the cleanup method is a good choice; that way you are also always exercising your row-deletion code.
I was doing some research on this recently and found this thread. Here are my findings, which might be of some help to future readers:
Make tests responsible for restoring the data they change, something like an undo for each command. Tests usually know what data changes to expect, so in theory they can revert them. This involves extra work and can introduce noise unless it is automated, e.g. by keeping track of the data created/updated in each test in some general way;
Wrap each test in a transaction and roll it back afterwards. Much like the option above, but easier to implement with something like TransactionScope. It might not be suitable if the application creates its own transactions, since transactions aren't composable in general, or if the application doesn't work with TransactionScope (there are issues with Entity Framework, for example);
Assert in some smart way on the data relevant to the test only. Then you don't need to clean anything up unless there is too much data. E.g. you could make the application test-aware and write a specific value into a test-only column added to every table. I've never tried that in practice;
Create and initialize a fresh database from scratch for every test;
Use database backups to restore the database to the point you need;
Use database snapshots for the restore;
Execute scripts to delete all the data and insert it again.
I personally use the last option and even ended up implementing a Reseed library, which does all the work for me.
Test frameworks usually allow you to execute some logic before and after each test or test fixture run, which will most likely be needed for the ideas above. E.g. for NUnit this is implemented with the OneTimeSetUp, OneTimeTearDown, FixtureSetUp, FixtureTearDown, SetUp and TearDown attributes.
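A sketch of that last option (delete everything and insert it again) wired into an NUnit SetUp; the connection string, SQL and table names are placeholders for your own reseed script:

using System.Data.SqlClient;
using NUnit.Framework;

[SetUp]
public void ResetDatabase()
{
    using (var conn = new SqlConnection(connectionString))   // connectionString defined elsewhere
    {
        conn.Open();
        var sql = @"DELETE FROM Orders;
                    DELETE FROM Customers;
                    INSERT INTO Customers (Id, Name) VALUES (1, 'Seed customer');";
        using (var cmd = new SqlCommand(sql, conn))
            cmd.ExecuteNonQuery();
    }
}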
One option is to use a Mock database in place of a real database. Here's a link that describes it.

How would I unit test database logic?

I am still having trouble getting over a small issue when it comes to TDD.
I need a method that will get a certain filtered record set from the data layer (Linq2SQL). Note that I am using the LINQ classes that are generated from the DBML. The problem is that I want to write a test for this.
Do I:
a) first insert the records in the test and then execute the method and test the results,
b) use data that might already be in the database? I'm not too keen on this because it could cause things to break, or
c) whatever you suggest?
You should choose option a).
A unit test should be repeatable and fully under your control. So for the test to be meaningful, it is absolutely necessary that the test itself prepares the data for its execution; only that way can you rely on the test outcome.
Use a test database and clean it each time you run the tests. Or you could try to create a mock object.
When I run tests using a database, I usually use an in-memory SQLite database.
Using an in-memory DB generally makes the tests quicker.
It is also easy to maintain, because the database is "gone" once you close the connection to it.
In the test setup, I set up the db connection and I create the database schema.
In the test, I insert the data needed by the test. (your option a))
In the test teardown, I close the connection to the db.
I used this approach successfully for my NHibernate applications (howto 1 | howto 2 + nice summary), but I'm not that familiar with Linq2SQL.
Some pointers on running SQLite and Linq2SQL are on SO (link 1 | link 2).
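Roughly, that setup/teardown looks like the following with NHibernate and in-memory SQLite; BuildConfiguration() stands in for however you build your NHibernate configuration and mappings:

using NHibernate;
using NHibernate.Cfg;
using NHibernate.Tool.hbm2ddl;
using NUnit.Framework;

[TestFixture]
public class InMemorySqliteFixture
{
    private Configuration _configuration;   // built with the SQLite dialect and your mappings
    private ISessionFactory _sessionFactory;
    private ISession _session;

    [SetUp]
    public void SetUp()
    {
        _configuration = BuildConfiguration();            // placeholder for your NHibernate/SQLite config
        _sessionFactory = _configuration.BuildSessionFactory();
        _session = _sessionFactory.OpenSession();
        // create the schema on the open in-memory connection
        new SchemaExport(_configuration).Execute(false, true, false, _session.Connection, null);
    }

    [TearDown]
    public void TearDown()
    {
        _session.Dispose();   // closing the connection discards the in-memory database
    }
}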
Some people argue that a test using a database isn't a unit test. Regardless, I believe there are situations where you want automated testing using a database:
You can have an architecture/design where the database is hard to mock out, for instance when using an ActiveRecord pattern, or when you're using Linq2SQL (although there is an interesting solution in one of the comments to Peter's answer)
You want to run integration tests, with the complete system of application and database
What I have done in the past:
Start a transaction
Delete all data from all the tables in the database
Setup the reference data all your tests need
Setup the test data you need in database tables
Run your test
Abort the transaction
This works well provided your database does not have much data in it; otherwise it is slow, so you will want to use a test database. If you have a test database that is well controlled, you can just run the test in the transaction without the need to delete all the data first.
Try to design your system so that you can mock out the data access layer for most of your tests. It is valid (and often useful) to unit test database code, but the unit tests for your other code should not need to touch the database.
You should also consider whether you would get more benefit from "end to end" system tests, with unit tests only for your "logic" code. This depends to a large extent on other factors within the project.

Clear NHibernate database fast

I am using NHibernate for ORM, and everything works fine.
Now I have started to write some unit tests (using the DB; I don't want to put too much effort into abstracting this away, I know it's not perfect, but it works).
I need to be sure that the DB is completely empty for some tests. I can, of course, recreate the whole DB, but that seems like overkill and I think it takes longer...
Is there a DELETE_ALL command in NHibernate which clears all tables?
Chris
EDIT: A short update: I decided to go the SQLite route; it's no problem to switch to this with NHibernate. There are some pitfalls. I am using the config below and it works; otherwise you might get "table not found" errors, because NHibernate releases the connection during the session, resulting in a "lost" in-memory database...
For your convenience, copy and paste:
.Database(SQLiteConfiguration.Standard
    .ConnectionString("Data Source=:memory:;Version=3;New=True;Pooling=True;Max Pool Size=1;")
    // keep the single pooled connection open until the session factory closes,
    // otherwise the in-memory database vanishes with the connection
    .Raw("connection.release_mode", "on_close"))
.Mappings(obj => obj.AutoMappings.Add(_config.APModel));
Drop and recreate the database. You can use SchemaExport:
var export = new SchemaExport(config);
export.Drop(false, true);    // drop the existing schema
export.Create(true, true);   // recreate it (the first argument echoes the generated script)
SQLite in memory runs faster for tests than a "normal" database, but the disadvantage is that the SQLite dialect can differ from the SQL dialect used in production.
I would recommend you check out NDbUnit. It's not NHibernate-specific, but I've used it for testing NHibernate projects in the past and it works well. Basically, it provides functions to clear the database, pre-fill it with test data, or restore it to known states after each test. You just have to provide an XSD of the database schema, and optionally some XML data for pre-filling.
I believe I first saw this in the Summer of NHibernate screencast series, so check those out to see it in use.
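From memory, basic NDbUnit usage looks roughly like this; double-check the calls against the version you use, and the file names are just examples:

using NDbUnit.Core;
using NDbUnit.Core.SqlClient;

// assumes MyDatabase.xsd describes the schema and seed-data.xml holds the test data
var db = new SqlDbUnitTest(connectionString);
db.ReadXmlSchema(@"MyDatabase.xsd");
db.ReadXml(@"seed-data.xml");

// wipe the covered tables and insert the known test data before each test
db.PerformDbOperation(DbOperationFlag.CleanInsertIdentity);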
You're not writing unit tests (i.e. tests that test one unit); you're writing integration tests, where units interact with things like your database.
Part of your test infrastructure could run a SQL script that does one of the following:
Drop db and recreate.
Truncate all tables.
Ideally, you do want to put a bit of work in abstracting the db away, especially since you have NH which makes it much easier than some other frameworks.
Use an in-memory database like SQLite, and set up the necessary data in it before each test. The initial setup takes a bit of time, but each test runs very fast afterwards and you can make sure that you start off with a clean slate. Ayende has a few blog posts about how to set it up.
Just in case dropping and recreating the DB does not suit your needs (for example, if the DB contains objects NHibernate is not aware of, like stored procedures, functions, etc.), you could always make a backup point with the DB empty and, after you're done testing, just restore to that point.
