Tuesday, December 4, 2018

Controller Testing

Prompted by a tweet from @amandaiverso (retweeted by Scott Hanselman), I've decided to write down a few thoughts on what I consider to be "industry standard" (or at least my standard) for testing controllers. First, I should say that most often these days I'm writing APIs, not web sites. That will likely skew my thoughts, but I'll try to be general, or at least differentiate where necessary between the two, because there are subtle differences. My thoughts cover both how and what to test. The what probably reflects, at least somewhat, what I think controllers ought to do, not just what you ought to test. I haven't spent a lot of time thinking this through - it's mostly a quick response to the tweet. I'd be happy to get additional thoughts about this from others.

Kinds of Tests

For controllers I typically do two or three types of testing. The two types of tests that I would always create are integration tests and unit tests. In some, but not all, scenarios I might add a quasi-integration test. What do I mean by that? In my practice it would be an automated test using a unit-test framework but invoking an in-memory server implementation. This type of test allows me to exercise the plumbing set up around the controller to ensure that things like custom model binding or authentication are properly configured. In this latter case, because I typically mock out everything but the controller, I consider it mostly a controller integration test, but I'd typically have these executed with the unit tests: they run fast enough, and in isolation from other parts of the environment, that they can easily be run by the build server. For API tests, I would probably also write unit tests using reflection to validate key parts of my attribute-based Swashbuckle documentation - ensuring, for example, that the return values of any actions match those documented via the attributes and that all public actions carry certain types of documentation attributes.
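A quasi-integration test of this kind might look like the following sketch, using ASP.NET Core's in-memory TestServer. The names (WidgetsController's route, IWidgetService, Startup) are hypothetical stand-ins, not from the post - the point is that everything except the controller and its plumbing is mocked out.

```csharp
// Quasi-integration test sketch: in-memory server, mocked service layer.
using System.Net;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Hosting;
using Microsoft.AspNetCore.TestHost;
using Microsoft.Extensions.DependencyInjection;
using Moq;
using Xunit;

public class WidgetsControllerPlumbingTests
{
    [Fact]
    public async Task Get_UnknownId_Returns404()
    {
        // Mock out everything but the controller itself.
        var service = new Mock<IWidgetService>();
        service.Setup(s => s.FindAsync(42)).ReturnsAsync((Widget)null);

        var builder = new WebHostBuilder()
            .ConfigureServices(s => s.AddSingleton(service.Object))
            .UseStartup<Startup>();

        using (var server = new TestServer(builder))
        using (var client = server.CreateClient())
        {
            // Goes through routing, model binding, and filters - the "plumbing".
            var response = await client.GetAsync("/api/widgets/42");
            Assert.Equal(HttpStatusCode.NotFound, response.StatusCode);
        }
    }
}
```

Because the request travels through the real pipeline, this catches misconfigured routes or binders that a plain controller unit test would miss.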

Approach to Testing

Generally I follow an approach that keeps controller actions thin. Normally this means that my controllers handle some (but not all) validation, authorization, and output formatting. They delegate business logic and data access to services. The things I would test, then, are the success and failure conditions for the validation handled at the controller level, valid and invalid access, and proper translation of service models into view-specific models, either for use in an MVC view or for presentation on the wire in an API. In the latter case, I'm mostly checking that the model is populated correctly. I don't typically mock out my mapping services, usually AutoMapper, so this functions as a check that they are configured properly when used.
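The "thin action" shape described above might be sketched as follows; IOrderService, OrderViewModel, and the route are hypothetical names I've introduced for illustration.

```csharp
// Sketch of a thin controller action: validate/short-circuit, delegate to a
// service, and map the service model onto a view model.
using System.Threading.Tasks;
using AutoMapper;
using Microsoft.AspNetCore.Mvc;

[Route("api/[controller]")]
public class OrdersController : Controller
{
    private readonly IOrderService _orderService;
    private readonly IMapper _mapper;

    public OrdersController(IOrderService orderService, IMapper mapper)
    {
        _orderService = orderService;
        _mapper = mapper;
    }

    [HttpGet("{id}")]
    public async Task<IActionResult> Get(int id)
    {
        var order = await _orderService.FindAsync(id); // business logic lives in the service
        if (order == null)
            return NotFound();

        // Translate the service model into the wire model.
        return Ok(_mapper.Map<OrderViewModel>(order));
    }
}
```

With this shape, the controller tests only need to cover the branching (found vs. not found) and that the mapping produced a sensible model.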

Validation

Much of the time I use model-based validation. Typically I'll ensure that my controller checks ModelState.IsValid. If I'm using FluentValidation, then I'll make sure that the controller invokes the validator and properly responds depending on the validator's response. For ModelState.IsValid checking, this is typically configured in the test setup by constructing a controller context and seeding ModelState with an error (ModelState.IsValid is read-only, so adding any model error makes it report false). In my controller tests for this I'm not particularly interested in whether the model binder works correctly ("don't test the framework"), just that my code responds correctly when the state is either valid or invalid. I'll test this by checking the output of the action being tested.
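A minimal sketch of that setup, assuming a hypothetical OrdersController and CreateOrderModel:

```csharp
// Sketch: force ModelState invalid in the test, then assert the action
// short-circuits with a 400 rather than calling through to services.
using Microsoft.AspNetCore.Mvc;
using Xunit;

public class OrdersControllerValidationTests
{
    [Fact]
    public void Post_InvalidModel_ReturnsBadRequest()
    {
        var controller = new OrdersController(/* mocked dependencies */);

        // Adding any error makes ModelState.IsValid report false.
        controller.ModelState.AddModelError("Name", "Required");

        var result = controller.Post(new CreateOrderModel());

        Assert.IsType<BadRequestObjectResult>(result);
    }
}
```

Note that this never exercises the model binder itself - the test only cares how the action responds to an invalid state.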

For manually invoked validation, I'll use mocks with particular setups, verify that the validator is called, and then test the output of the action to ensure that the controller properly responded to the validation result the mock returns.
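With FluentValidation and Moq, that pattern might look like this sketch (OrdersController and CreateOrderModel are again hypothetical):

```csharp
// Sketch: the mocked validator returns a failure; we verify it was invoked
// and assert the action translates the failure into a 400.
using FluentValidation;
using FluentValidation.Results;
using Microsoft.AspNetCore.Mvc;
using Moq;
using Xunit;

public class OrdersControllerValidatorTests
{
    [Fact]
    public void Post_ValidatorFails_ReturnsBadRequest()
    {
        var validator = new Mock<IValidator<CreateOrderModel>>();
        validator
            .Setup(v => v.Validate(It.IsAny<CreateOrderModel>()))
            .Returns(new ValidationResult(
                new[] { new ValidationFailure("Name", "Required") }));

        var controller = new OrdersController(validator.Object /*, other mocks */);

        var result = controller.Post(new CreateOrderModel());

        validator.Verify(v => v.Validate(It.IsAny<CreateOrderModel>()), Times.Once);
        Assert.IsType<BadRequestObjectResult>(result);
    }
}
```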

Access

Since access is frequently enforced by attributes, this is a place where I might use quasi-integration tests. If an API is public, or all access is controlled by restricting IP ranges, then I might skip these. If I am testing these, then the tests use the setup to establish responses from authentication and authorization services, create a request using credentials that trigger the configured responses, and then check that the proper response code is returned in failure cases or that a valid OK response is returned in success cases. These tests typically don't inspect the response values, just the error codes. Some configuration for other services is required to ensure that you don't get errors from invalid data, but these can be configured loosely to avoid tight coupling to their implementations. Using something like AutoFixture works well to ensure that I get valid data without too much configuration trouble.
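A sketch of the failure side of such a test, again through the in-memory server (the route and Startup are hypothetical):

```csharp
// Sketch: an unauthenticated request through TestServer should be rejected
// by the [Authorize] plumbing before the action runs.
using System.Net;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Hosting;
using Microsoft.AspNetCore.TestHost;
using Xunit;

public class OrdersControllerAccessTests
{
    [Fact]
    public async Task Get_NoCredentials_Returns401()
    {
        var builder = new WebHostBuilder().UseStartup<Startup>();

        using (var server = new TestServer(builder))
        using (var client = server.CreateClient())
        {
            // No Authorization header; only the status code matters here.
            var response = await client.GetAsync("/api/orders/1");
            Assert.Equal(HttpStatusCode.Unauthorized, response.StatusCode);
        }
    }
}
```

The success case would attach credentials that the configured authentication handler accepts and assert a 200, without inspecting the body.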

Model Translation

I typically treat model translation as white-box testing. If I'm using something like AutoMapper, then I'm probably not rigorous about testing that every individual property is set. Things that are a straightforward translation from the entity model to the view model don't really need to be tested. You can verify that they are handled by validating the AutoMapper configuration - that will let you know where you might have missing properties. Generally, I'll restrict myself to making sure an important property has been copied correctly (the Id property is a good choice if it's a primary key) and then any custom mappings/flattenings that I expect to occur for that particular model. Again, I don't try to test the framework, in this case AutoMapper, but only those things that I've customized. When testing one property, say of a flattened model, will suffice to ensure that the entire model has been flattened, I'll do that. Where it's possible to make mistakes, I'll be more rigorous.
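Both halves of that approach - validating the configuration once and spot-checking a key property plus a custom flattening - might be sketched like this. Order, Customer, OrderViewModel, and OrderProfile are hypothetical names.

```csharp
// Sketch: AssertConfigurationIsValid catches unmapped destination properties
// across all maps; the second test spot-checks one key copy and one flattening.
using AutoMapper;
using Xunit;

public class MappingTests
{
    private static readonly MapperConfiguration Config =
        new MapperConfiguration(cfg => cfg.AddProfile<OrderProfile>());

    [Fact]
    public void Configuration_IsValid()
    {
        Config.AssertConfigurationIsValid();
    }

    [Fact]
    public void Map_Order_CopiesIdAndFlattensCustomerName()
    {
        var mapper = Config.CreateMapper();
        var order = new Order { Id = 7, Customer = new Customer { Name = "Ada" } };

        var model = mapper.Map<OrderViewModel>(order);

        Assert.Equal(7, model.Id);               // key property copied
        Assert.Equal("Ada", model.CustomerName); // custom flattening
    }
}
```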