Using Deployment in the test configuration makes a copy every time the unit tests run, which is time-consuming. The data is a set of bitmaps that might change only after builds.
What's the convention for deploying such large test data?
I just wrote up a blog post about File Dependencies and testing here.
http://tsells.wordpress.com/2012/03/06/how-to-run-integration-tests-with-file-dependencies/
The tests you are performing are integration tests, since you are going to the file system. I would use the per-class example from the post for what you are trying to achieve.
Contents of Post
Disclaimer
In many cases a developer test needs to execute with a specific file / set of files on the file system. Many "purists" think this is a bad idea and that you should "mock" your system out in such a way that this is not required. In certain cases that may be true, but I am a "realist" and understand that the complexity of doing things like this can, and many times does, far outweigh the benefit of just leveraging the file system for the test. This does, however, move the test from being a true "unit" test to an "integration" test. I am content with this, as it is also my belief that integration tests provide more value than unit tests. If set up and run in the correct way, this test can run locally in Visual Studio, via MSBuild and the command line, or on a build server running build agents like TeamCity.
You can download the source code here: TestClass.zip
Requirements for this Test
Test runner (I like TestDriven.Net, as it lets me execute inline as well as debug).
A file class that wraps some of the System.IO functions and implements IDisposable. This is useful for creating a "sandbox" that, once out of scope, removes any temp files that were used, so cleanup happens automatically (a minimal sketch follows this list; an example class is also attached in the sample).
NUnit or MSTest. I still prefer NUnit.
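The post does not include the FileSandbox source inline, so here is a minimal sketch of what such a class might look like. The real class in the sample may differ; GetTempFileName and the temp-directory layout are assumptions based on how the class is used below.

using System;
using System.IO;

// Hypothetical minimal FileSandbox: creates a private temp directory and deletes it on Dispose.
public class FileSandbox : IDisposable
{
    private readonly string _root;

    public FileSandbox()
    {
        _root = Path.Combine(Path.GetTempPath(), Guid.NewGuid().ToString("N"));
        Directory.CreateDirectory(_root);
    }

    // Returns a unique file path inside the sandbox with the given extension (assumed signature).
    public string GetTempFileName(string extension)
    {
        return Path.Combine(_root, Guid.NewGuid().ToString("N") + "." + extension);
    }

    public void Dispose()
    {
        // Remove everything the tests wrote into the sandbox.
        if (Directory.Exists(_root))
        {
            Directory.Delete(_root, recursive: true);
        }
    }
}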
Usage Options
How the test files are used determines how and when to set them up and clean them up (delete them).
Per Test – files are regenerated per test run against them
Per Test Class – files are generated once per test class
In both instances the FileSandbox class is used to create a temporary location for the files to live in, which is then removed once the test(s) are complete.
Per Class Usage
using System.IO;
using System.Reflection;
using NUnit.Framework;

[TestFixture]
public class PerClass
{
    private FileSandbox _sandbox;
    private string _tempFileLocation;

    public PerClass() { }

    /// <summary>
    /// Setup class - runs once per class
    /// </summary>
    [TestFixtureSetUp]
    public void SetupClass()
    {
        _sandbox = new FileSandbox();
        // Getting Temp file name to use
        _tempFileLocation = _sandbox.GetTempFileName("txt");
        // Get the current executing assembly (in this case it's the test dll)
        Assembly myassembly = Assembly.GetExecutingAssembly();
        // Get the stream (embedded resource) - be sure to wrap in a using block
        using (Stream stream = myassembly.GetManifestResourceStream("TestClass.TestFiles.TextFile1.txt"))
        {
            // In this case using an external method to write the stream to the file system
            _tempFileLocation = TestHelper.StreamToFile(stream, _tempFileLocation);
        }
    }

    /// <summary>
    /// Tear down class (cleanup)
    /// </summary>
    [TestFixtureTearDown]
    public void TearDownClass()
    {
        _sandbox.Dispose();
    }

    [Test, Description("Testing doing something with files on the filesystem")]
    public void MyFileSystemTest()
    {
        string[] lines = File.ReadAllLines(_tempFileLocation);
        Assert.IsTrue(lines.Length > 0);
    }
}
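TestHelper.StreamToFile is referenced above but not shown in the excerpt. Here is a minimal sketch of what such a helper might look like; the name and signature come from the calls above, the body is an assumption.

using System.IO;

public static class TestHelper
{
    // Copies the embedded-resource stream to the given path and returns that path.
    public static string StreamToFile(Stream stream, string path)
    {
        using (FileStream file = File.Create(path))
        {
            stream.CopyTo(file);
        }
        return path;
    }
}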
Per Test Usage (Option 1)
[TestFixture]
public class PerTest
{
    public PerTest() { }

    /// <summary>
    /// Setup class - runs once per class
    /// </summary>
    [TestFixtureSetUp]
    public void SetupClass()
    {
        // NOOP
    }

    /// <summary>
    /// Tear down class (cleanup)
    /// </summary>
    [TestFixtureTearDown]
    public void TearDownClass()
    {
        // NOOP
    }

    [Test, Description("Testing doing something with files on the filesystem")]
    public void MyFileSystemTest()
    {
        using (FileSandbox sandbox = new FileSandbox())
        {
            // Getting Temp file name to use
            string tempfile = sandbox.GetTempFileName("txt");
            // Get the current executing assembly (in this case it's the test dll)
            Assembly myassembly = Assembly.GetExecutingAssembly();
            // Get the stream (embedded resource) - be sure to wrap in a using block
            using (Stream stream = myassembly.GetManifestResourceStream("TestClass.TestFiles.TextFile1.txt"))
            {
                // In this case using an external method to write the stream to the file system
                tempfile = TestHelper.StreamToFile(stream, tempfile);
                string[] lines = File.ReadAllLines(tempfile);
                Assert.IsTrue(lines.Length > 0);
            }
        }
    }
}
Per Test Usage (Option 2)
[TestFixture]
public class PerEachTest
{
    private FileSandbox _sandbox;
    private string _tempFileLocation;

    public PerEachTest() { }

    /// <summary>
    /// Setup class - runs once per class
    /// </summary>
    [TestFixtureSetUp]
    public void SetupClass()
    {
        // NOOP
    }

    /// <summary>
    /// Tear down class (cleanup)
    /// </summary>
    [TestFixtureTearDown]
    public void TearDownClass()
    {
        // NOOP
    }

    [SetUp]
    public void Setup()
    {
        _sandbox = new FileSandbox();
        // Getting Temp file name to use
        _tempFileLocation = _sandbox.GetTempFileName("txt");
        // Get the current executing assembly (in this case it's the test dll)
        Assembly myassembly = Assembly.GetExecutingAssembly();
        // Get the stream (embedded resource) - be sure to wrap in a using block
        using (Stream stream = myassembly.GetManifestResourceStream("TestClass.TestFiles.TextFile1.txt"))
        {
            // In this case using an external method to write the stream to the file system
            _tempFileLocation = TestHelper.StreamToFile(stream, _tempFileLocation);
        }
    }

    [TearDown]
    public void Teardown()
    {
        _sandbox.Dispose();
    }

    [Test, Description("Testing doing something with files on the filesystem")]
    public void MyFileSystemTest()
    {
        string[] lines = File.ReadAllLines(_tempFileLocation);
        Assert.IsTrue(lines.Length > 0);
    }
}
You can download the source code here: Source Code
I currently have a suite of Selenium automation tests using SpecFlow and C# (IDE is Visual Studio 2017).
I created a batch file to run the applicable feature files.
Currently I set my test environment (i.e. QA, UAT, Prod) within Environments.cs using the following property:
public static string CurrentEnvironment { get; set; } = "uat";
What I want to achieve is to somehow pass the test environment via the batch file, so there is no need to open the solution and modify it before running the BAT file.
There will likely be other parameters I'll want to set this way in the future, such as SpecFlow parameters whose values I might want to override.
I've tried Googling a solution, but I've found it hard to phrase the question in a way that yields the results I want.
Batch file:
ECHO ON
set Date=%date:~0,2%%date:~3,2%%date:~6,4%
set Time=%time:~0,2%%time:~3,2%
cd C:\Users\%username%\source\repos\AutomationTests\TestProject\packages\SpecRun.Runner.1.8.0\tools
SpecRun.exe run default.srprofile /basefolder:C:\Users\%username%\source\repos\AutomationTests\TestProject\TestProject\bin\Debug /filter:testpath:"Feature:TestFeature"
In essence, if the CurrentEnvironment property in my solution is set to 'UAT', I want to be able to override that to, say, 'QA' via the BAT file.
What modifications do I need to make to the BAT file and what to my solution (if any)?
You can set the test environment with environment variables. Environment.GetEnvironmentVariable() is the method to call to read environment variables.
Here is an example:
Program.cs (in a console app):
using System;

namespace TestEnvironmentVariable
{
    class Program
    {
        static void Main(string[] args)
        {
            string testEnvironment = Environment.GetEnvironmentVariable("test_env");
            Console.WriteLine($"Test environment: {testEnvironment}");
        }
    }
}
run.bat:
set test_env=uat
TestEnvironmentVariable.exe
When running run.bat:
>run.bat
>set test_env=uat
>TestEnvironmentVariable.exe
Test environment: uat
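Applied to the question, the existing CurrentEnvironment property in Environments.cs could read that variable and fall back to the current default when it is not set. A minimal sketch, assuming the test_env variable name from the batch example above (the fallback value is an assumption):

using System;

public static class Environments
{
    // Reads the environment set by the batch file; falls back to "uat" when the variable is missing.
    public static string CurrentEnvironment { get; set; } =
        Environment.GetEnvironmentVariable("test_env") ?? "uat";
}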
You can also put all your settings in a JSON file that you use as a configuration file. That also makes it possible to change the settings without having to recompile. Here is a small example:
Create a json file, e.g. settings.json:
{
    "TestEnvironment": "UAT"
}
It can be created in the root folder of the solution. In the file's properties, set Copy to Output Directory to Copy always or Copy if newer. This makes sure it's copied to the binary output directory.
Then create a Settings.cs file representing the class we deserialize the json file into:
namespace TestEnvironmentVariable
{
    public sealed class Settings
    {
        public Settings() { }

        public string TestEnvironment { get; set; }
    }
}
You can add more variables here as they are needed; the json file should have the same variables. And then the code that does the deserialization:
using System.IO;
using Newtonsoft.Json;

namespace TestEnvironmentVariable
{
    public static class SettingsUtil
    {
        public static T GetObjectFromJsonFile<T>(string filename)
        {
            string json = File.ReadAllText(filename);
            var deserializedObject = JsonConvert.DeserializeObject<T>(json);
            return deserializedObject;
        }
    }
}
You have to add Newtonsoft.Json with NuGet. We can then read the json file in our code:
using System;

namespace TestEnvironmentVariable
{
    class Program
    {
        static void Main(string[] args)
        {
            Settings settings = SettingsUtil.GetObjectFromJsonFile<Settings>("settings.json");
            Console.WriteLine($"Test environment: {settings.TestEnvironment}");
        }
    }
}
Output:
>TestEnvironmentVariable.exe
Test environment: UAT
First of all, I'm new to SpecFlow.
I have a feature file which I have / want to automate using MSTest, to run as a functional test involving a fully set-up server, data access, etc.
For this purpose I have to configure the server with the data in SpecFlow's 'Given' blocks and start it afterwards. I also have to copy some files to the test's output directory.
In the non-SpecFlow functional tests I was using the ClassInitialize attribute to get the TestDeploymentDir from the TestContext; something like this:
[ClassInitialize]
public static void ClassSetup(TestContext context)
{
    TargetDataDeploymentRoot = context.TestDeploymentDir;
}
Now with SpecFlow I can't use this attribute anymore as it is used by SpecFlow itself.
Some new attributes do exist, like BeforeFeature which acts similarly BUT it doesn't pass on the TestContext as a parameter.
I just need to get access to the TestContext's TestDeploymentDir in order to copy some files there before really lauching my functional test server - easily doable without SpecFlow but almost impossible with SpecFlow.
How to deal with this issue?
Is it possible at all?
Thanks a lot for advice!
robert
Environment:
Visual Studio 2012
SpecFlow 1.9.0.77
Since SpecFlow 2.2.1 the TestContext is available via Context Injection. (https://github.com/techtalk/SpecFlow/pull/882)
You can get it from the container directly:
ScenarioContext.Current.ScenarioContainer.Resolve<Microsoft.VisualStudio.TestTools.UnitTesting.TestContext>()
or via context injection:
[Binding]
public class MyStepDefs
{
    private readonly TestContext _testContext;

    public MyStepDefs(TestContext testContext) // use it as ctor parameter
    {
        _testContext = testContext;
    }

    [BeforeScenario()]
    public void BeforeScenario()
    {
        // now you can access the TestContext
    }
}
In order to have access to values in the TestContext you have to create a partial class for each scenario (feature) file you have, in which you add a TestContext property, as shown below.
using Microsoft.VisualStudio.TestTools.UnitTesting;
using TechTalk.SpecFlow;

/// <summary>
/// Partial class for TestContext support.
/// </summary>
public partial class DistributionFeature
{
    /// <summary>
    /// Test execution context.
    /// </summary>
    private TestContext testContext;

    /// <summary>
    /// Gets or sets test execution context.
    /// </summary>
    public TestContext TestContext
    {
        get
        {
            return this.testContext;
        }

        set
        {
            this.testContext = value;
            // see https://github.com/techtalk/SpecFlow/issues/96
            this.TestInitialize();
            FeatureContext.Current["TestContext"] = value;
        }
    }
}
Then you could access the deployment directory from your steps using
var testContext = (TestContext)FeatureContext.Current["TestContext"];
var deploymentDir = testContext.TestDeploymentDir;
If you have too many scenarios, then you probably have to automate the creation of such files, e.g. with T4.
You can create a plugin and customize the IUnitTestGeneratorProvider implementation. The following should add the required line to MSTest's ClassInitialize method.
[assembly: GeneratorPlugin(typeof(MyGenerator.Generator.SpecflowPlugin.MyGeneratorPlugin))]

// It's very important this is named Generator.SpecflowPlugin.
namespace MyGenerator.Generator.SpecflowPlugin
{
    public class MyGeneratorProvider : MsTest2010GeneratorProvider
    {
        public MyGeneratorProvider(CodeDomHelper codeDomHelper)
            : base(codeDomHelper)
        {
        }

        public override void SetTestClassInitializeMethod(TestClassGenerationContext generationContext)
        {
            base.SetTestClassInitializeMethod(generationContext);

            generationContext.TestClassInitializeMethod.Statements.Add(new CodeSnippetStatement(
                @"TargetDataDeploymentRoot = context.TestDeploymentDir;"));
        }
    }

    public class MyGeneratorPlugin : IGeneratorPlugin
    {
        public void RegisterDependencies(ObjectContainer container)
        {
        }

        public void RegisterCustomizations(ObjectContainer container, SpecFlowProjectConfiguration generatorConfiguration)
        {
            container.RegisterTypeAs<MyGeneratorProvider, IUnitTestGeneratorProvider>();
        }

        public void RegisterConfigurationDefaults(SpecFlowProjectConfiguration specFlowConfiguration)
        {
        }
    }
}
And reference it in the App.config file:
<specFlow>
  <plugins>
    <add name="MyGenerator" type="Generator"/>
  </plugins>
</specFlow>
Next time you re-save the .feature files, the generated ClassInitialize code should set TargetDataDeploymentRoot.
I had to do something similar. Here's my working code https://github.com/marksl/Specflow-MsTest and blog post http://codealoc.wordpress.com/2013/09/30/bdding-with-specflow/
There is a FeatureContext as well as the more commonly used ScenarioContext. The difference of course is that the FeatureContext exists during the execution of the complete feature, while the ScenarioContext only exists during a scenario.
For example:
Add to context:
ScenarioContext.Current.Add("ObjectName", myObject);
Get:
var myObject = ScenarioContext.Current.Get<object>("ObjectName");
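The same dictionary-style access works on FeatureContext when a value should live for the whole feature rather than a single scenario; for example (MyObjectType stands in for whatever type you stored):

// Store once for the whole feature (e.g. in a [BeforeFeature] hook):
FeatureContext.Current["ObjectName"] = myObject;

// Read it back (and cast) from any step in that feature:
var myObject = (MyObjectType)FeatureContext.Current["ObjectName"];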
You can read more about it here.
I have set up some In Memory SQLite Unit Tests for my Fluent NHibernate Database, which looks like this. It works fine. (Using NUnit)
namespace Testing.Database {

    /// <summary>
    /// Represents a memory only database that does not persist beyond the immediate
    /// testing usage, using <see cref="System.Data.SQLite"/>.
    /// </summary>
    public abstract class InMemoryDatabase : IDisposable {

        /// <summary>
        /// The configuration of the memorized database.
        /// </summary>
        private static Configuration Configuration { get; set; }

        /// <summary>
        /// The singleton session factory.
        /// </summary>
        protected static ISessionFactory SessionFactory { get; set; }

        /// <summary>
        /// The current session being used.
        /// </summary>
        protected ISession Session { get; set; }

        protected InMemoryDatabase() {
            SessionFactory = CreateSessionFactory();
            Session = SessionFactory.OpenSession();
            BuildSchema(Session);
        }

        /// <summary>
        /// Construct a memory based session factory.
        /// </summary>
        /// <returns>
        /// The session factory in an SQLite Memory Database.
        /// </returns>
        private static ISessionFactory CreateSessionFactory() {
            return FluentNHibernate.Cfg.Fluently.Configure()
                .Database(FluentNHibernate.Cfg.Db.SQLiteConfiguration
                    .Standard
                    .InMemory()
                    .ShowSql())
                .Mappings(mappings => mappings.FluentMappings.AddFromAssemblyOf<Data.Mappings.AspectMap>())
                .ExposeConfiguration(configuration => Configuration = configuration)
                .BuildSessionFactory();
        }

        /// <summary>
        /// Builds the NHibernate Schema so that it can be mapped to the SessionFactory.
        /// </summary>
        /// <param name="Session">
        /// The <see cref="NHibernate.ISession"/> to build a schema into.
        /// </param>
        private static void BuildSchema(ISession Session) {
            var export = new NHibernate.Tool.hbm2ddl.SchemaExport(Configuration);
            export.Execute(true, true, false, Session.Connection, null);
        }

        /// <summary>
        /// Dispose of the session and released resources.
        /// </summary>
        public void Dispose() {
            Session.Dispose();
        }
    }
}
So now, in order to use it, I just inherit InMemoryDatabase and add my Test methods, like this.
[TestFixture]
public class PersistenceTests : InMemoryDatabase {

    [Test]
    public void Save_Member() {
        var member = // ...;
        Session.Save(member); // not really how it looks, but you get the idea...
    }
}
My problem isn't that this doesn't work. It does. But if I have two tests in the same class that test similar data, for instance ...
Username_Is_Unique() and then Email_Is_Unique(). Not real tests again, but it's a good example.
[Test]
public void Username_Is_Unique() {
    var user = new User {
        Name = "uniqueName",
        Email = "uniqueEmail"
    };
    // do some testing here...
}

[Test]
public void Email_Is_Unique() {
    var user = new User {
        Name = "uniqueName",
        Email = "uniqueEmail"
    };
    // do some testing here...
}
I realize these are very bad tests. These are not real tests, I am just citing an example.
In both cases, I would construct a mock User or Member or what-have you and submit it to the database.
The first one works fine, but since the database lives in memory and is shared between the tests (which makes sense, since I told it to be), the second one doesn't: the data from the first test is still there. Effectively, the unit tests do not reflect real-world situations when each test stands alone, but when running them sequentially in a batch the suite behaves like it should in the real world (I suppose that's partially a good thing).
What I want to do is flush the in-memory database after each test method, so I came up with a simple way to do this by repeating what the constructor does. This goes in the InMemoryDatabase class.
protected void Restart() {
    SessionFactory = CreateSessionFactory();
    Session = SessionFactory.OpenSession();
    BuildSchema(Session);
}
So now, in each method in my inheriting class, I call Restart() before I do my testing.
I feel like this isn't the intended, or efficient way to solve my problem. Can anyone propose a better solution?
If it is of any relevance, I am using Fluent NHibernate for the persistence and Telerik JustMock for my mocking, but for my database stuff I've yet to need any mocking.
You need to drop and recreate the database for every test; every test should be independent of the others. You can do this in two ways. First, have your tests use a setup method (assuming NUnit here, but other frameworks have the same functionality):
[SetUp]
public void Setup()
{
    // Create in memory database
    Memdb = new InMemoryDatabase();
}
Alternatively, you can wrap each test in a using statement for the database. For example
[Test]
public void Test()
{
    using (var db = new InMemoryDatabase())
    {
        // Do some testing here
    }
}
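Since InMemoryDatabase as posted is abstract and keeps Session protected, a small concrete subclass is needed for either approach. Here is a minimal sketch of the setup-method variant; the TestDatabase subclass and its CurrentSession property are illustrative assumptions, not part of the original code:

using NHibernate;
using NUnit.Framework;
using Testing.Database;

[TestFixture]
public class PersistenceTests
{
    // Hypothetical concrete subclass that also exposes the protected Session property.
    private sealed class TestDatabase : InMemoryDatabase
    {
        public ISession CurrentSession { get { return Session; } }
    }

    private TestDatabase _db;

    [SetUp]
    public void Setup()
    {
        // A fresh connection, schema and session for every test, so no data leaks between tests.
        _db = new TestDatabase();
    }

    [TearDown]
    public void Teardown()
    {
        _db.Dispose();
    }

    [Test]
    public void Username_Is_Unique()
    {
        // _db.CurrentSession is backed by an empty in-memory database here.
    }
}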
I am writing some unit tests for the persistence layer of my C#.NET application. Before and after the tests of a test class execute, I want to do some cleaning up to erase possibly inserted dummy values, therefore, this cleaning up happens in methods marked with the attributes [ClassInitialize()] and [ClassCleanup()].
(I know that a better way would be to use an in-memory database, but it is not really doable so far as we depend on lots of stored procs....)
I would like to output some information about the results of the cleaning up, but I cannot find a way to get the output into the test results with Visual Studio 2010.
This is what I am doing so far :
/// ... lots of stuff before ...

// global for the test run
private static TestContext context;

// for each test
private IRepository repo;

#region Initialisation and cleanup

/// <summary>
/// Execute once before the test-suite
/// </summary>
[ClassInitialize()]
public static void InitTestSuite(TestContext testContext)
{
    context = testContext;
    removeTestDataFromDb();
}

[ClassCleanup()]
public static void CleanupTestSuite()
{
    removeTestDataFromDb();
}

private static void removeTestDataFromDb()
{
    context.WriteLine("removeTestDataFromDb starting");
    using (ISession session = NHibernateHelper.OpenSession())
    {
        IDbConnection cn = session.Connection;
        IDbCommand cmd = cn.CreateCommand();
        // remove any test data
        cmd.CommandText = @"DELETE FROM SomeTable
                            WHERE somefield LIKE 'easyToFindTestData%Test'";
        int res = cmd.ExecuteNonQuery();
        context.WriteLine("removeTestDataFromDb done - affected {0} rows", res);
    }
}

[TestInitialize()]
public void InitTest()
{
    repo = new MyRepositoryImplementation();
}

[TestCleanup()]
public void CleanupTest()
{
    // cleanup
    repo = null;
}

#endregion
I'm trying to use context.WriteLine() ...
I also tried just using Console.WriteLine() with the same results.
How do you write to standard output in the ClassInitialize part and where can you access that output ?
The [ClassInitialize] and [ClassCleanup] methods run just once for all the tests in that class. You'd be better off using [TestInitialize] and [TestCleanup], which run before and after each test. Also try wrapping the complete test in a database transaction. This way you can simply roll back the operation (by not committing the transaction) and your database stays in a consistent state (which is essential for trustworthy automated tests).
A trick I use for integration tests is to define a base class that all my integration test classes inherit from. The base class ensures that each test runs in a transaction and that this transaction is rolled back. Here is the code:
using System.Transactions;
using Microsoft.VisualStudio.TestTools.UnitTesting;

public abstract class IntegrationTestBase
{
    private TransactionScope scope;

    [TestInitialize]
    public void TestInitialize()
    {
        scope = new TransactionScope();
    }

    [TestCleanup]
    public void TestCleanup()
    {
        scope.Dispose();
    }
}
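A test class then just inherits from it; a minimal usage sketch (the class and method names are illustrative):

[TestClass]
public class RepositoryTests : IntegrationTestBase
{
    [TestMethod]
    public void Insert_Is_Rolled_Back_After_The_Test()
    {
        // Anything inserted here happens inside the ambient TransactionScope created by
        // the base class's TestInitialize and is rolled back when TestCleanup disposes it.
    }
}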
Good luck.
The trace output from ClassInitialize and ClassCleanup appears in the result summary.
You can access it by doing the following:
Open the Test Results window [Test -> Windows -> Test Results].
There should be a link named "Test run completed" in the top left corner of the Test Results window.
Click that link.
It should open a window with the header "Result Summary", which will show the debug trace created during ClassInitialize and ClassCleanup.
You can see the Console output on each test if you double-click the test method in the Test Results pane. It is also present in the .trx xml results file.
In addition, if you specify the "Define DEBUG constant" build option, you can use
System.Diagnostics.Debug.WriteLine("ClassInitialize Method invoked, yeah.");
which will end up in the "Output" pane.
I am creating Selenium RC test scripts in Visual Studio (C#). I am struggling with refactoring the tests; all my tests are in a single file. I would appreciate any input and/or pointers to websites, books, etc. to learn about modularizing the tests.
I have to run the same tests on different sites (the same application, but configured differently for different clients and logins) which are 95% the same. Would anybody like to provide some good examples or best practices for doing this?
Thanks!
Best practice for writing Selenium tests, or any UI tests, is the Page Object Model: the idea that you create an object for each of the pages. Each of these objects abstracts the page, so when you write a test it doesn't really look like you have been working with Selenium directly.
So for a blog you would do something like this to create an object for the home page
public class Home
{
    private readonly ISelenium _selenium;

    /// <summary>
    /// Instantiates a new Home page object. Pass in the Selenium object created in the test SetUp().
    /// When the object is instantiated it will navigate to the root.
    /// </summary>
    /// <param name="selenium">Selenium object created in the tests</param>
    public Home(ISelenium selenium)
    {
        this._selenium = selenium;
        if (!selenium.GetTitle().Contains("home"))
        {
            selenium.Open("/");
        }
    }

    /// <summary>
    /// Navigates to the Selenium Tutorials page. The Selenium object will be passed through.
    /// </summary>
    /// <returns>SeleniumTutorials object representing selenium_training.htm</returns>
    public SeleniumTutorials ClickSelenium()
    {
        _selenium.Click("link=selenium");
        _selenium.WaitForPageToLoad("30000");
        return new SeleniumTutorials(_selenium);
    }

    /// <summary>
    /// Click on the blog or blog year and then wait for the page to load.
    /// </summary>
    /// <param name="year">blog or blog year</param>
    /// <returns>Blog object representing the /blog.* pages</returns>
    public Blog ClickBlogYear(string year)
    {
        _selenium.Click("link=" + year);
        _selenium.WaitForPageToLoad("30000");
        return new Blog(_selenium);
    }

    // Add more methods as you need them
}
Then you would create a test that looks like the following:
[TestFixture]
public class SiteTests
{
    private ISelenium selenium;

    [SetUp]
    public void Setup()
    {
        selenium = new DefaultSelenium("localhost", 4444, "*chrome", "http://www.theautomatedtester.co.uk");
        selenium.Start();
    }

    [TearDown]
    public void Teardown()
    {
        selenium.Stop();
    }

    [Test]
    public void ShouldLoadHomeThenGoToXpathTutorial()
    {
        Home home = new Home(selenium);
        SeleniumTutorials seleniumTutorials = home.ClickSelenium();
        SeleniumXPathTutorial seleniumXPathTutorial = seleniumTutorials.ClickXpathTutorial();

        Assert.True(seleniumXPathTutorial.IsInputOnScreen(SeleniumXPathTutorial.FirstInput));
        Assert.True(seleniumXPathTutorial.IsInputOnScreen(SeleniumXPathTutorial.SecondInput));
        Assert.True(seleniumXPathTutorial.IsInputOnScreen(SeleniumXPathTutorial.Total));
    }
}
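The question also asks about running the same tests against differently configured sites and logins. One hedged approach with this structure is to feed the base URL (and similar settings) in from the environment instead of hard-coding it in Setup(). A minimal sketch, assuming a SELENIUM_BASE_URL environment variable (the variable name and fallback are illustrative):

[SetUp]
public void Setup()
{
    // Hypothetical variable; each client/site run sets it before launching the tests,
    // and we fall back to the default site when it is not set.
    string baseUrl = Environment.GetEnvironmentVariable("SELENIUM_BASE_URL")
                     ?? "http://www.theautomatedtester.co.uk";

    selenium = new DefaultSelenium("localhost", 4444, "*chrome", baseUrl);
    selenium.Start();
}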