I have been looking for a solution for a few days already and couldn't find anything that would help solve my problem.
I have a WCF service for which I have to write some unit tests. The problem is that the service takes data from the database in this way:
using (var context = new MyProjectEntities())
{
// the actions go here
}
MyProjectEntities is autogenerated from the EDMX model, I guess (Database First).
So this way it takes all the data from the database and operates on it.
My question is: what's the correct way to feed the service with fake data for testing, instead of data from the database?
The most trivial way is to use a live database. This isn't too flexible, because you need a new database in a fixed initial state for every single run, and also multiple developers can't use the same database at the same time.
What we do at my company is this: use a single-file database, namely SQL Server CE. If your code is database-engine independent, this can totally work: just change your connection string, and you can even put your database into a fixed state by copying a template data file to the right place. This isn't a really isolated unit test, but it's very simple to implement, it doesn't have the above problems, and you basically get what you need in the end. If your code is database-engine dependent, now you have one more reason to use an ORM solution like NHibernate or Entity Framework.
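As a rough sketch of that approach (the file names here are hypothetical, and it assumes your context has a constructor overload that accepts a connection string, like the one a later answer on this page adds):

// Before each test run: overwrite the working database with a pristine template.
// "template.sdf" and "current.sdf" are placeholder names; File is System.IO.File.
File.Copy(@"TestData\template.sdf", @"TestData\current.sdf", overwrite: true);

// Point the context at the copy instead of the real server.
using (var context = new MyProjectEntities(@"Data Source=TestData\current.sdf"))
{
    // exercise the code under test against the known initial state
}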
The best, most flexible, and also most complex solution is using a dependency injection or mocking framework. This is textbook stuff, there's tons of literature on that, it will give you all the flexibility there is.
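To give a flavor of that route, here is a minimal sketch using Moq; the interface, entity, and service names are all invented for illustration:

using System.Collections.Generic;
using Moq;

// The service depends on an abstraction rather than on MyProjectEntities directly.
public interface IVendorRepository
{
    IList<Vendor> GetVendors();
}

// In a unit test, the repository is stubbed with canned data:
var repository = new Mock<IVendorRepository>();
repository.Setup(r => r.GetVendors())
          .Returns(new List<Vendor> { new Vendor { Name = "Test vendor" } });

var service = new MyService(repository.Object);
// assert on service behavior; no database is involved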
Related
I have written a .NET application which has many components; some of those components are database access layers which hide from the rest of the components where the data comes from.
I have unit tested the rest of the components by mocking the database access layer. One way I have of testing the database access layers themselves is to create new empty databases on the test servers. This can be slow, and most would argue that it is not a unit test, as I depend on the database server.
What I think I want is a mocked database which I can use to test my database access layer. The mocked database could be given a schema and process SQL commands as if it were a remote database, but in fact it would all be in-memory. Does this exist? Or how else can I test my SQL and database <-> data model code?
To help solve my problem, you may want to know that I am using SQL Server (versions 2008 and later), and my code is written in C#, running on .NET 4.5 and using Visual Studio 2013.
Note: I do not want to use Linq2SQL/EntityFramework to replace my database access layer, as in my experience it makes issues and performance problems difficult to debug.
I tried to phrase my question carefully to avoid people lecturing me on their beliefs in what should be tested and how, but perhaps to be a little more blunt:
I want to unit test my SQL; small changes to it have a big impact on the outcome of the program. I do have integration tests, but it takes much longer to create a release for the test environment than it does to tweak code and run the unit tests.
I appreciate people taking the time to read my question and respond anyhow.
I don't know if it's going to be the best answer, but here's the way we're doing it: we're using SQLite as an in-memory database. There are a number of different ways to set it up; we use NHibernate as an ORM, and for that it is fairly easy to set up using FluentNHibernate, but I don't think it's much harder using any other framework either:
var sessionFactory = Fluently.Configure()
    .Database(SQLiteConfiguration.Standard.InMemory())
    .Mappings(m => ... ) // your entity mappings go here
    .BuildConfiguration()
    .BuildSessionFactory();
I think you can run queries against a SQLite database without any ORM as well, e.g. using plain ADO.NET via the System.Data.SQLite provider's SQLiteConnection class.
This database can accept migrations, schemas, etc. It behaves very close to a proper MS SQL database; we can run any DDL and DML statements against it, and it works fine. Since it's all in-memory, it's also pretty fast.
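A minimal sketch of that ORM-free route, using the System.Data.SQLite provider (the table and queries here are invented for illustration):

using System;
using System.Data.SQLite;

// Everything lives in memory and disappears when the connection closes.
using (var connection = new SQLiteConnection("Data Source=:memory:;Version=3;New=True;"))
{
    connection.Open();

    using (var create = new SQLiteCommand(
        "CREATE TABLE Vendor (Id INTEGER PRIMARY KEY, Name TEXT)", connection))
    {
        create.ExecuteNonQuery();
    }

    using (var insert = new SQLiteCommand(
        "INSERT INTO Vendor (Name) VALUES ('Test vendor')", connection))
    {
        insert.ExecuteNonQuery();
    }

    using (var count = new SQLiteCommand("SELECT COUNT(*) FROM Vendor", connection))
    {
        Console.WriteLine(count.ExecuteScalar()); // prints 1
    }
}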
I'm not sure what I'm looking for, and everything that I've seen so far looks like it will work until I really dive into it. I just need some pointers from the brains here. I'm working on an ASP.NET MVC / EF5 / SQL 2012 project. We have a model set that isn't Code First (the entities were built using the designer), and as of right now everything is working just fine. But we have this setup script (as convoluted as I've ever seen) and I need to get it into something more automated. Right now the setup script pre-populates the tables with data: lookups, reference data, etc.

I'm looking for a way to automate this further, without having to run this script, and even more so, to generate the database and tables automatically. Every article I've read seems to do the trick (migrations, seeding, etc.), but there's one thing they don't take into consideration: we federate services. So the actual EDMX sits behind a WCF Data Services 5.6 service. I have access to the models and whatnot, but the WCF service exposes a DataServiceContext, which doesn't have a Seed on it. Am I looking at the right stuff here? Or is the only option this confounded setup script (all C# driven)? This website has been instrumental to this: http://www.entityframeworktutorial.net/code-first/seed-database-in-code-first.aspx as well as this: Auto Create Database Tables from Objects, Entity Framework but I don't see how I can use these over WCF 5.6.
The short answer is that Model-First doesn't give you a seed method because they want you to use a SQL script, but you have a few choices:
1) Use EF PowerTools (or VS2013's EF Designer) to generate a "Code-First" model from your DB. This will allow you to seed your DB (a sketch follows below) and have finer control over how everything operates under the hood.
2) Use a SQL script to seed with. Generally, if you make changes to your schema, you'll re-run your recreate-DB script. Create a separate script to populate your DB and keep it handy. If you feel more comfortable in code than SQL, you can make a console app (or whatever type of app you want) and keep it up to date with your schema.
3) If all you need is seed data, and there is a good business case to expose a method to your service consumers, you can keep Model-First, create a stored procedure to seed your DB, and expose it as an EF function. You can then expose this in your WCF service.
Personally, I tend towards designing the DB myself, using VS 2013's EF6 POCO generator, then using Code-First because of the better granular control that you get with real data classes. Then I do some cleanup work, write my seed methods, etc.
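For reference, a minimal sketch of such a Seed method on the EF6 Code-First migrations route (the context, entity, and property names are hypothetical):

using System.Data.Entity.Migrations;

internal sealed class Configuration : DbMigrationsConfiguration<PayrollContext>
{
    public Configuration()
    {
        AutomaticMigrationsEnabled = false;
    }

    protected override void Seed(PayrollContext context)
    {
        // AddOrUpdate keeps the seed idempotent: it runs on every update-database
        // without duplicating rows, matching on the key expression given first.
        context.LookupValues.AddOrUpdate(
            l => l.Code,
            new LookupValue { Code = "FT", Description = "Full time" },
            new LookupValue { Code = "PT", Description = "Part time" });
    }
}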
I have searched back and forth but could not seem to get hold of what I need. I am sorry if this has already been answered; a redirection to that discussion would do me good.
This is the scenario: I have been instructed by my boss to move from Microsoft Visual FoxPro (MS is withdrawing support come 2015) to .NET C#. For the sake of a good foundation and the adoption of best practices, I decided to first learn, piece pertinent information together, then start coding. This is now the second year.
We are a bureau company that offer payroll processing outsource services to over 50 clients. Each client currently has their own database. The databases have tables with completely identical structures.
I am a newbie. Totally new to .net world.
I had started off with raw SQL using DataTables and DataReaders, but in my research I came across some discussions discouraging this. Many were of the view that Entity Framework should serve the purpose, though one is allowed to mix approaches, especially when complex queries are involved.
Can someone point me to some good reading on implementing Entity Framework with over 50 identical databases? Each database is totally independent and has nothing to do with any other. When the user logs in, they select which client they need to process payroll for, then EF points to that database.
EF needs 2 different pieces of information to work with data from a database:
1) The database schema: This is included as compiled code in your application and cannot normally be changed at runtime.
2) The connection string: This is provided at runtime, normally from a config file.
In your case, all the databases have the same schema, so you can just model one database and it will work for all the others.
The piece you want to change is the connection string. This tells EF how to find the database and can be provided at runtime.
There is an overload of the DbContext constructor which takes a connection string as a parameter: MSDN: DbContext Constructor (String)
And there are even classes in the framework that help create connection strings for you:
MSDN: EntityConnectionStringBuilder Class
MSDN: Connection String Builders
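A hedged sketch of building such a connection string at runtime for a Database-First (EDMX) model; the server, database, and context names are placeholders, and the metadata paths come from your own generated model:

using System.Data.Entity.Core.EntityClient; // System.Data.EntityClient on EF5
using System.Data.SqlClient;

// Build the inner SQL Server connection string for the client selected at login.
var sqlBuilder = new SqlConnectionStringBuilder
{
    DataSource = "PAYROLL_SERVER",   // placeholder server name
    InitialCatalog = "Client042_Db", // the selected client's database
    IntegratedSecurity = true
};

// Wrap it in an entity connection string pointing at the EDMX metadata resources.
var entityBuilder = new EntityConnectionStringBuilder
{
    Provider = "System.Data.SqlClient",
    ProviderConnectionString = sqlBuilder.ToString(),
    Metadata = "res://*/PayrollModel.csdl|res://*/PayrollModel.ssdl|res://*/PayrollModel.msl"
};

// Assumes a context constructor overload that accepts a connection string,
// like the one added in the next answer.
using (var context = new PayrollEntities(entityBuilder.ToString()))
{
    // query the selected client's database
}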
It is very simple. I already had the following in the autogenerated Model.Context.cs in the EDMX folder:
// WMSEntities is the connection string name in Web.config
// and also the name of the Entity Framework context
public WMSEntities() : base("name=WMSEntities")
{
}
To connect to multiple databases at runtime, I created another constructor that takes a connection string name as a parameter, like below, in the same file (Model.Context.cs):
public WMSEntities(string connStringName)
: base("name=" + connStringName)
{
}
Now, I added another connection string in Web.config, for example:
<add name="WMSEntities31" connectionString="data source=TESTDBSERVER_NAME;
     initial catalog=TESTDB;user id=TestUser;password=TestUserPW" />
<add name="WMSEntities" connectionString="data source=TESTDBSERVER_NAME12;
     initial catalog=TESTDB12;user id=TestUser12;password=TestUserPW12" />
Then, when connecting to the database, I call the method below, passing the connection string name as a parameter:
public static List<v_POVendor> GetPOVendorList(string connectionStringName)
{
    using (WMSEntities db = new WMSEntities(connectionStringName))
    {
        return db.v_POVendor.ToList();
    }
}
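A hypothetical call site, selecting the database by connection string name:

// Pulls vendors from the database configured as "WMSEntities31" in Web.config.
List<v_POVendor> vendors = GetPOVendorList("WMSEntities31");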
Hrmmm I happen to really like EF Code First but I'm not certain it suits what you're doing. How often does your schema change?
Should You Be Using EF?
Advantages of EF
If the schema changes somewhat regularly, the Migrations part of EF Code First might save you a lot of time and effort because you can often do away with SQL scripts for schema upgrades - schema changes end up in your source repository with the rest of your code instead. You'd start here:
https://stackoverflow.com/a/8909092/176877
I also happen to really like how easy EF is to setup, and how easy it is to write LINQ queries against it and return exactly the POCOs I built from the DB.
But EF might not be the best fit.
Other ORMs to consider
Many other ORMs support LINQ and POCOs, with better support for existing databases (there are things that can be pretty difficult to map in EF Code First) and, at the time this was written, support for asynchronous operation that EF 5.0 lacked. (Update: EF6 is now the latest and its async support is great. Its bulk delete is terrible though and should be avoided like the plague; drop to plain SQL for that.)
In particular NHibernate is the beast on the scene for existing db support, but it's a bit of a configuration chore and what appears to be political infighting has caused the documentation to be conflicting for different versions and forks of it.
Much simpler are the many "Micro ORMs" - that link is to a short list from 2011, but if you poke around you'll find 30 or so in .NET. Some generate better or worse queries, some generate none at all, and some make you write SQL (don't use those) - you'll have to explore a bit to decide which is for you. This can be a bigger research task, but I suspect the easy configuration and small learning curve of one of these best suits what you're trying to do.
Answer to your specific question
Talk to All client Dbs at once
If you're connecting to all 50 databases from one app at the same time you'll need to instantiate 50 DbContexts like:
var dbClient1 = new DbClient1();
var dbClient2 = new DbClient2();
Assuming you went around making little wrapper classes like:
public class DbClient1 : CoreDbContext
{
    public DbClient1()
        : base("DbClient1") // Means use the connection string named "DbClient1" in Web.Config
    {
    }
}
Where CoreDbContext is the main EF class in your Project that extends DbContext (standard part of any EF project).
Talk to just one at a time
If you're using just the one per app then any EF tutorial will do.
The only major trick will be migrating those DBs when schema changes occur. There are two basic approaches. Either way, you grab a backup and restore a copy of them locally so you can test your migrations against them (update-database -f -verbose). If you don't, you risk data errors - like changing a column to NOT NULL and finding your local test instance had no nulls but one client's database did: kaboom. Once you get the migrations working, you're onto deciding how you want to update Production. There are a lot of ways you might do this, ranging from writing a custom roll-forward/back tool (or finding one) with SQL scripts checked into git, to hiring a DBA, or, much simpler:
The Obvious - SQL Script
Dump the migration to SQL (update-database -script) and run it against the actual production database.
My Crazy Way for Small Numbers of Dbs
Add entries for each DB to Web.Config, and create a Project Configuration for each of them, like "DbDeployClient1," "DbDeployClient2," etc. In each of those, define a conditional compilation symbol like DbDeployClient1, and then add this to your DbContext class:
public CoreDbContext()
#if DbDeployClient1
    : base("DbDeployClient1")
#elif DbDeployClient2
    : base("DbDeployClient2")
// etc
#endif
{
}
That allows you to quickly switch to your DbDeploy config and run the migration directly from Visual Studio against the target database. Obviously if you're doing this you'll need to temporarily open a port, preferably only allowing in your IP, on the actual SQL Server instance you're migrating. One nicety is you get clear errors from your migration right there, and full rollback capability, without any real work - all that rollback support you're leveraging is just part of EF. And one dev can do it without a bunch of other bottlenecks. That said, this approach still leaves plenty of room to reduce risk and improve automation further.
I have an existing ASP.NET MVC application with some sample data in the SQL Server database, which is working fine..
Assuming I have all of the necessary repositories and IOC in place, is there a tool that will extract the data from a group of tables, and "freeze-dry" it into a mock object (perhaps using an XML file to store the data), so that I can detach the database and use the mock data for my unit tests?
Depending on what exactly you are trying to test there might be different approaches.
If you want to test the data access logic then this is no longer unit test but integration test. For this kind of tests it is a good idea to be able to easily replace the real database with some lighter maybe even in-memory database like SQLite which will be reconstructed before running each test. If you are using an ORM this task is easy. All you need to do is to generate SQL scripts (INSERT INTO...) from your existing database, modify and adapt the dialect to SQLite (if necessary), read and inject into a SQLite file and finally all that's left is to instruct your data access layer to use SQLite dialect and connection string for the unit test.
Now if you are not using an ORM and your data access logic is tied to MSSQL, things get uglier: you will need a live database in order to perform those integration tests. In this case I would suggest you duplicate your real database and point the tests at the copy by modifying only the connection string. Once again, you will need to properly set up and tear down (preferably in a transaction) this test database in order to put it into a known state for the tests.
If you want to test code that depends on those repositories (such as your controllers, for example) you don't even need to bother about mocking the data, as your controllers depend on abstract repositories and not the real implementations (don't they?), so you can easily mock the methods of the repository in order to test the logic in the controllers.
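As a hedged sketch of that controller-level test (the repository, entity, and controller names are all invented):

using System.Collections.Generic;
using System.Linq;
using System.Web.Mvc;
using Moq;

// Stub the abstract repository with Moq and test the controller in isolation.
var repository = new Mock<IEmployeeRepository>();
repository.Setup(r => r.GetAll())
          .Returns(new List<Employee> { new Employee { Name = "Alice" } });

var controller = new EmployeeController(repository.Object);

var result = (ViewResult)controller.Index();
var model = (IEnumerable<Employee>)result.Model;
Assert.AreEqual(1, model.Count()); // the controller rendered the stubbed data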
This is actually a well known "test smell":
http://xunitpatterns.com/Obscure%20Test.html#General
From: question 2098937, "Proper way to Mock repository objects for unit tests using Moq and Unity"
I don't know of a direct way to do what you're asking for, but MSSQL supports export to CSV, Access, and Excel. However, this requires you to change the data access layer in your mid-tier, and furthermore, I don't feel this answers your question:
"freeze-dry" it into a mock object
This I'm not sure of. Is it feasible to just restore a backup of the database either on the same SQL server as a new database, or possibly on a dev SQL server?
I am using NHibernate for ORM, and everything works fine.
Now I have started to write some unit tests (using the DB; I do not want to put too much effort into abstracting this away. I know it's not perfect, but it works).
I need to be sure that the DB is completely empty for some tests. I can, of course, create the whole DB from scratch, but that seems to be overkill and I think it takes longer...
Is there a DELETE_ALL command which clears all tables, I can use in NHibernate?
Chris
EDIT: A short update. I decided to go the SQLite way; it's no problem to switch to this with NHibernate. There are some pitfalls: I am using the config below, and it works. Otherwise you might get "table not found" errors, due to NHibernate closing the connection while in session, resulting in a "lost" in-memory database...
For your convenience, copy and paste:
.Database(SQLiteConfiguration.Standard
    .ConnectionString("Data Source=:memory:;Version=3;New=True;Pooling=True;Max Pool Size=1;")
    .Raw("connection.release_mode", "on_close"))
.Mappings(obj => obj.AutoMappings.Add(_config.APModel));
Drop and recreate the database. You can use SchemaExport:
// SchemaExport comes from NHibernate.Tool.hbm2ddl and works off the mapping metadata.
var export = new SchemaExport(config);
export.Drop(false, true);   // first bool: echo the script to stdout; second: execute it
export.Create(true, true);  // recreate the schema from the mappings
SQLite in memory runs faster for tests than a "normal" database, but the disadvantage is that the SQLite dialect can differ from the SQL dialect in production.
I would recommend you check out NDbUnit. It's not NHibernate-specific, but I've used it for testing NHibernate projects in the past and it works well. Basically, it provides functions to clear the database, prefill it with test data, or restore it to known states after each test. You just have to provide an XSD of the database schema, and optionally some XML data for pre-filling.
I believe I first saw this in the Summer of NHibernate screen-cast series, so check those out to see it in use.
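From memory, usage looks roughly like the sketch below; treat the exact type and method names as approximate and check the NDbUnit docs, and note the file paths are placeholders:

// Reset the database to a known state before each test.
var database = new NDbUnit.Core.SqlClient.SqlDbUnitTest(connectionString);
database.ReadXmlSchema(@"TestData\Schema.xsd"); // schema of the tables under test
database.ReadXml(@"TestData\SeedData.xml");     // the test data to load
// Wipe the tables and insert the seed data, keeping identity values.
database.PerformDbOperation(NDbUnit.Core.DbOperationFlag.CleanInsertIdentity);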
You're not writing unit tests (i.e. tests that test one unit), you're writing integration tests where units interact (i.e. with your database).
Part of your test infrastructure could run a SQL script that does one of the following:
Drop db and recreate.
Truncate all tables.
Ideally, you do want to put a bit of work into abstracting the DB away, especially since you have NH, which makes it much easier than some other frameworks do.
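NHibernate has no single DELETE_ALL command, but for the truncate option a hedged sketch is possible using NHibernate's own mapping metadata (it assumes foreign-key ordering is manageable for your schema, or that constraints are deferred; the helper name is invented):

using NHibernate;
using NHibernate.Cfg;

public static class DbCleaner
{
    // Issues an HQL bulk delete for every mapped entity class.
    public static void DeleteAll(ISessionFactory factory, Configuration config)
    {
        using (var session = factory.OpenSession())
        using (var tx = session.BeginTransaction())
        {
            foreach (var mapping in config.ClassMappings)
            {
                // HQL bulk delete; may violate FK order, so adjust ordering as needed.
                session.CreateQuery("delete from " + mapping.MappedClass.Name)
                       .ExecuteUpdate();
            }
            tx.Commit();
        }
    }
}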
Use an in memory database like SQLite, and setup the necessary data in it before each test. The initial setup takes a bit of time, but each test runs very fast afterwards and you can make sure that you start off with a clean slate. Ayende has a few blog posts about how to set it up.
Just in case the drop/create DB approach does not suit your needs (e.g. if the DB contains objects that NHibernate is not aware of, like SPs, functions, etc.), you could always make a backup point with the DB empty, and after you're done testing, just restore to that point.