Sunday, February 9, 2014

Solving the Challenges of Dapper Integration Testing: Part 1

In my last post I outlined the series of events that led me to want to write some lightweight integration tests for Dapper that would allow me to develop my Dapper repositories using Test-Driven Development (TDD). These tests wouldn't replace application-level integration tests, but would allow me to "unit test" my Dapper code so that I could develop it in a test-driven manner. My implementation would have to meet several requirements (see my previous post for a fuller explanation):
  • Automated
  • Fast
  • Simple
  • Minimal Tooling
  • Independent
  • Self-contained
  • Safe
Achieving these requirements could be a challenge with integration tests. I outlined several challenges that I would face, including automatically setting up the database, cleaning up data between tests, isolating the operations of the tests, and programmatically providing a connection to the repository that could be mocked out. In this post I will describe how I addressed each of these challenges, including the code I created or adapted for my scenario.

All of the code for these examples can be found in the accompanying repository. The inline code snippets may omit some details to save space, but you can find the complete code there.

Let's take them from simplest to most complex.  In this post we’ll cover providing a connection and isolating the execution of the tests from each other.

Providing a Connection

Our repository will function as a "repository of queries." Each method encapsulates some query or command that interacts with the database. In some cases it might be several queries that work together to provide complex data or complete a complex transaction. Importantly, each method in our repository will function as an independent unit of work. For this reason, rather than providing a connection to the repository, we want to provide a mechanism for obtaining a connection that can be created and disposed within the scope of a single method.

We will use dependency injection to provide an IConnectionFactory instance. In production, the connection factory will be hooked up in our dependency injection framework to provide connections to our database provider of choice. As long as that provider supports DbConnection, it will work for our purposes. Since Dapper requires this to function, that doesn't restrict our choice of provider any more than the choice of Dapper itself does. IConnectionFactory needs only one method, Create(string connectionStringName), which takes the name of a connection string and provides a DbConnection to the provider specified by the connection string. In practice, our implementations of this class will likely depend on some configuration wrapper that allows them to retrieve the named connection string from a configuration file.
We'll provide a base class for our repositories that uses the factory and makes it available to all repositories derived from it.

public interface IConnectionFactory
{
    DbConnection Create(string connectionStringName);
}
And here is one possible implementation for a SQL Server-backed repository using a configuration wrapper (code not shown).
public class MsSqlConnectionFactory : IConnectionFactory
{
    private readonly IConfiguration _configuration;

    public MsSqlConnectionFactory(IConfiguration configuration)
    {
        _configuration = configuration;
    }

    public DbConnection Create(string connectionStringName)
    {
        return new SqlConnection(_configuration.GetConnectionString(connectionStringName));
    }
}
DapperRepositoryBase (constructor)
public abstract class DapperRepositoryBase
{
    protected readonly IConnectionFactory _connectionFactory;
    protected readonly string _connectionStringName;

    protected DapperRepositoryBase(IConnectionFactory connectionFactory, string connectionStringName)
    {
        _connectionFactory = connectionFactory;
        _connectionStringName = connectionStringName;
    }

    // remainder omitted
}
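To illustrate how a derived repository uses the base class, here is a hypothetical repository method (UserRepository, User, and the query are illustrative assumptions, not code from the post) showing the single-method unit-of-work pattern described above:

```csharp
using System.Linq;
using Dapper;

// Hypothetical example; the entity and query are assumptions for illustration.
public class UserRepository : DapperRepositoryBase
{
    public UserRepository(IConnectionFactory connectionFactory, string connectionStringName)
        : base(connectionFactory, connectionStringName)
    {
    }

    public User GetUser(int id)
    {
        // Each method creates and disposes its own connection:
        // an independent unit of work.
        using (var connection = _connectionFactory.Create(_connectionStringName))
        {
            connection.Open();
            return connection.Query<User>(
                "SELECT Id, Name FROM Users WHERE Id = @id",
                new { id })
                .FirstOrDefault();
        }
    }
}
```

Because the connection is obtained inside the method, a test can hand the repository a factory that points anywhere it likes, which is exactly what we exploit below.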
In our tests we will use a SQL Server LocalDB instance added to the test project as a content item. Each of our test classes will have a test context that provides a mechanism to obtain the connection factory and pass back a mock that will return a connection to our LocalDB instance. We'll see more of this test context base class later, as it provides additional common behavior to support our test classes. Notice that it's internal – it's not needed outside our test project – and it implements IDisposable. That will be important later, but I'm not going to give away how just yet.

Note: I’m using FakeItEasy, a dead-simple dynamic fake framework, to provide a mock connection factory. Since these are integration tests, this is the only place we’re likely to use it. I’ve set it up to provide a new connection instance every time it’s called, since that’s how our real implementations will work.

internal abstract class TestContextBase : IDisposable
{
    // The .mdf file added to the test project as a content item
    private static readonly string _dbFile =
        Path.Combine(Environment.CurrentDirectory, "DapperTesting.mdf");

    private static readonly string _connectionString =
        @"Data Source=(LocalDB)\v11.0;Initial Catalog=DapperTesting;"
        + "Integrated Security=True;AttachDbFilename="
        + _dbFile;

    protected static DbConnection GetConnection()
    {
        return new SqlConnection(_connectionString);
    }

    protected static IConnectionFactory CreateConnectionFactory(string connectionStringName)
    {
        var connectionFactory = A.Fake<IConnectionFactory>();

        A.CallTo(() => connectionFactory.Create(connectionStringName))
         .ReturnsLazily(f => GetConnection());

        return connectionFactory;
    }

    // Disposable implementation omitted (we'll see it later)
}
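A concrete test context derived from this base class might then look something like the following sketch (the repository type, connection string name, and method names here are assumptions for illustration):

```csharp
// Hypothetical concrete context; all names are illustrative.
internal class UserRepositoryTestContext : TestContextBase
{
    private const string ConnectionStringName = "DapperTesting";

    // Builds a repository wired to the LocalDB instance
    // through the faked connection factory.
    public UserRepository GetRepository()
    {
        var factory = CreateConnectionFactory(ConnectionStringName);
        return new UserRepository(factory, ConnectionStringName);
    }
}
```

The repository under test never knows it's talking to LocalDB; it just asks the factory for a connection as it would in production.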

Isolating Test Execution

Because our tests are touching the same database, there’s a strong likelihood that we could get inconsistent or incorrect results if more than one test runs at the same time. A test that inserts some data into the database and then runs a query against it may get different results than expected if another test, running at the same time, also inserts data into the tables being queried by the first test. To prevent this, we want to make sure that our tests run serially, one after the other, instead of in parallel.

One way to do this is to use Ordered Tests. This forces the tests to run in a specific order and, if we check the Continue after failure option, gives us what we want: a way to run independent tests in an execution-isolated way. Doing this, however, adds some manual maintenance to our test suite and prevents us from simply running these tests along with our other (unit) tests in the way we are used to. My preference is to use the TestInitialize and TestCleanup methods with some synchronization code to enforce execution isolation. To do that, I create a TestBase class from which all of my test classes will derive. This class contains Start and End methods that our setup/teardown methods can use to enforce a single thread of execution through the test suite using a Monitor. Using a Monitor with a shared lock object allows our tests to synchronize themselves through test setup/teardown.

Note: one downside to this method is that certain types of exceptions can fail a test in a way that the teardown code never gets called. In that case, the lock is never released and subsequent tests will hang. This could be a problem if you do continuous integration and check in code that fails in that particular way. It should only happen in failure conditions, though, not when your tests are passing. You do make sure your tests pass before you check code into your repositories, don't you? To me, being able to use my normal unit testing process for these tests is worth that potential risk.

public abstract class TestBase
{
    private static readonly object _lock = new object();

    protected void Start()
    {
        Monitor.Enter(_lock);
    }

    protected void End()
    {
        Monitor.Exit(_lock);
    }
}
Example set up and teardown
[TestInitialize]
public void Init()
{
    Start();

    // create our concrete test context
}

[TestCleanup]
public void Cleanup()
{
    // dispose of our concrete test context

    End();
}
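Putting the pieces together, an individual test can then use the context freely, knowing it has exclusive access to the database while it runs. The context field, repository method, and assertion below are hypothetical, for illustration only:

```csharp
// Hypothetical test; _context, GetRepository, and GetUser are
// illustrative names, created in Init() and disposed in Cleanup().
[TestMethod]
public void GetUser_WhenUserDoesNotExist_ReturnsNull()
{
    var repository = _context.GetRepository();

    var user = repository.GetUser(-1);

    Assert.IsNull(user);
}
```

Because Start() acquired the shared lock in Init(), no other test touches the database until Cleanup() calls End().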
That wraps it up for this post. In future posts I'll tackle data cleanup and then database setup. Those will require more complex solutions and some convention-based programming.