How to test your C# Web API
If you've read some of my other blog posts already, you probably know that I'm not a big fan of unit tests. Sure, they have their purposes, but often they mean that one or more parts of the System Under Test are being mocked or stubbed. It's this practice that I'm not too keen on. To have full confidence in my code, it's integration tests that I reach for.
With an integration test, we test the API from the outside in by spinning up the API and making an actual HTTP request. I get confidence out of it because I mock as little as possible, and I consume my API in the same way as an application (or user) would.
Show me some code!
The following tests are written in .NET Core 3 and use xUnit as the test runner. The setup might change with other versions and test runners, but the idea remains the same.
The only requirement is that the Microsoft.AspNetCore.Mvc.Testing package is installed. You can do this with the following command.
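From the project's directory:

```shell
dotnet add package Microsoft.AspNetCore.Mvc.Testing
```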
TIP: I also use FluentAssertions to write my assertions, because the package contains some useful utility methods and it's easy to read.
The package includes a WebApplicationFactory<TEntryPoint> class, which is used to bootstrap the API in memory. This is convenient, as we don't need to have the API running before we run these tests.
In the test class, we inject the factory into the constructor. With the factory, we can create an HttpClient, which is used in the tests to make HTTP requests.
Because the test class implements xUnit's IClassFixture interface, the tests inside this class share a single test context. The API is only bootstrapped once for all the tests, and is cleaned up afterward.
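Put together, a minimal test class could look like the sketch below. It assumes the API's entry point class is named `Startup`; the class and field names are illustrative.

```csharp
public class WeatherForecastControllerTests
    : IClassFixture<WebApplicationFactory<Startup>>
{
    private readonly HttpClient _client;

    public WeatherForecastControllerTests(WebApplicationFactory<Startup> factory)
    {
        // The factory bootstraps the API in memory; CreateClient returns
        // an HttpClient that sends requests to that in-memory server.
        _client = factory.CreateClient();
    }
}
```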
This is everything we need to write the first test. With the HttpClient we can make a GET request and assert the response it gives back.
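A first test might look like this, assuming the client created from the factory is stored in a `_client` field and the API exposes a `/weatherforecast` endpoint (both illustrative):

```csharp
[Fact]
public async Task Get_Should_Return_The_Forecasts()
{
    var response = await _client.GetAsync("/weatherforecast");

    // FluentAssertions makes the expectation read like a sentence.
    response.StatusCode.Should().Be(HttpStatusCode.OK);
}
```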
How neat is this! To write a test that provides real value, we needed (almost) no setup!
Sadly, in a real application, things get more complicated. There will be external dependencies, and these will still need to be mocked or stubbed.
I suggest keeping real instances of the dependencies you're in control of, for example the database. But for dependencies that are out of reach, mostly 3rd-party driven ports, I would use a stubbed or mocked instance.
Luckily, it's simple to override service instances. By creating a custom WebApplicationFactory, the configuration can be altered before the API is built. To do this, override the ConfigureWebHost method.
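A custom factory could look like this sketch. The service and stub names are hypothetical placeholders for a dependency you want to swap out:

```csharp
public class ApiWebApplicationFactory : WebApplicationFactory<Startup>
{
    protected override void ConfigureWebHost(IWebHostBuilder builder)
    {
        // ConfigureTestServices runs after the API's own Startup.ConfigureServices,
        // so registrations here override the real ones.
        builder.ConfigureTestServices(services =>
        {
            services.AddTransient<IWeatherForecastConfigService,
                WeatherForecastConfigStub>();
        });
    }
}
```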
To work with a real database, I find it easier to create a separate database to run these tests against. Therefore, we need to provide some integration test settings. These settings contain the new connection string that points to the integration test database. In more complex scenarios, the same settings can also be used to override environment variables.
To configure the application, we can use the ConfigureAppConfiguration method to add our configuration settings.
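Inside the custom factory's ConfigureWebHost, this could be sketched as follows; the settings file name is an assumption:

```csharp
protected override void ConfigureWebHost(IWebHostBuilder builder)
{
    builder.ConfigureAppConfiguration((context, config) =>
    {
        // integrationsettings.json holds e.g. the connection string
        // that points to the integration test database.
        config.AddJsonFile("integrationsettings.json");
    });
}
```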
What I like to do is make each test independent of the others. This has the benefit that tests won't interfere with each other, and each test can be written and debugged on its own. To be able to do this, we have to reseed the database before each test runs.
To reseed my databases, I'm using the Respawn package.
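A minimal sketch of how Respawn (pre-v6 API, matching the .NET Core 3 era) could be used; the connection string and ignored tables are placeholders:

```csharp
private static readonly Checkpoint Checkpoint = new Checkpoint
{
    // Reference tables that should survive a reset (illustrative).
    TablesToIgnore = new[] { "__EFMigrationsHistory" },
};

public Task ResetDatabase()
{
    // Deletes the data from all (non-ignored) tables,
    // respecting foreign key constraints.
    return Checkpoint.Reset("<integration-test-connection-string>");
}
```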
To keep things DRY and to hide some of this logic, one possibility is to create an abstraction layer.
With an abstract class, IntegrationTest, it's possible to expose commonly used variables, the most important one being the HttpClient, because we need it to create the HTTP requests.
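Such an abstraction could be sketched like this, assuming the custom factory from earlier and a Respawn checkpoint; the connection string is a placeholder:

```csharp
public abstract class IntegrationTest : IClassFixture<ApiWebApplicationFactory>
{
    // Illustrative: point this at the integration test database.
    private const string ConnectionString = "<integration-test-connection-string>";

    private static readonly Checkpoint Checkpoint = new Checkpoint();

    protected readonly ApiWebApplicationFactory Factory;
    protected readonly HttpClient Client;

    protected IntegrationTest(ApiWebApplicationFactory fixture)
    {
        Factory = fixture;
        Client = fixture.CreateClient();

        // Reseed the database so every test starts from a clean slate.
        Checkpoint.Reset(ConnectionString).GetAwaiter().GetResult();
    }
}
```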
The test class can now inherit from the IntegrationTest fixture and looks as follows.
As you can see in the code above, the test class doesn't contain any setup logic because of the IntegrationTest abstraction.
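For example, a test class built on top of this abstraction might look like the following sketch (endpoint and names illustrative):

```csharp
public class WeatherForecastControllerTests : IntegrationTest
{
    public WeatherForecastControllerTests(ApiWebApplicationFactory fixture)
        : base(fixture) { }

    [Fact]
    public async Task Get_Should_Return_The_Forecasts()
    {
        // The inherited Client is ready to use; no setup needed here.
        var response = await Client.GetAsync("/weatherforecast");
        response.StatusCode.Should().Be(HttpStatusCode.OK);
    }
}
```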
To prevent an exponential growth of test fixtures, we can use the WithWebHostBuilder method on the WebApplicationFactory. This is helpful for tests that require a different, specific setup.
The WithWebHostBuilder method will create a new instance of the WebApplicationFactory. If a custom WebApplicationFactory class is used (in this example, ApiWebApplicationFactory), the logic inside ConfigureWebHost will still be executed.
In the code below, we use the InvalidWeatherForecastConfigStub class to fake an invalid configuration, which should result in a bad request. Because this setup is only required once, we can set it up inside the test itself.
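A sketch of such a test, assuming the fixture exposes the factory as `Factory` and the stubbed service interface is named `IWeatherForecastConfigService` (both illustrative):

```csharp
[Fact]
public async Task Get_Should_Return_BadRequest_For_An_Invalid_Config()
{
    // One-off setup: create a client against a modified host
    // instead of introducing a new fixture.
    var client = Factory
        .WithWebHostBuilder(builder =>
            builder.ConfigureTestServices(services =>
                services.AddTransient<IWeatherForecastConfigService,
                    InvalidWeatherForecastConfigStub>()))
        .CreateClient();

    var response = await client.GetAsync("/weatherforecast");
    response.StatusCode.Should().Be(HttpStatusCode.BadRequest);
}
```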
For tests that require an identical setup, we can write a Theory with InlineData to test multiple endpoints at once.
This tip only applies to simple queries and is a quick way to verify that these endpoints don't fail.
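Such a smoke test could look like this; the endpoints are placeholders for your own:

```csharp
[Theory]
[InlineData("/weatherforecast")]
[InlineData("/healthcheck")]
public async Task Endpoint_Should_Not_Fail(string endpoint)
{
    // The same test body runs once per InlineData entry.
    var response = await Client.GetAsync(endpoint);
    response.StatusCode.Should().Be(HttpStatusCode.OK);
}
```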
For testing endpoints where you have to be authenticated, we have some options.
The simplest one is to just allow anonymous requests; this can be done by adding the AllowAnonymousFilter.
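Inside the factory's ConfigureWebHost, this could be sketched as:

```csharp
builder.ConfigureTestServices(services =>
{
    // AllowAnonymousFilter short-circuits the [Authorize] checks
    // on every endpoint during the tests.
    services.AddMvc(options => options.Filters.Add(new AllowAnonymousFilter()));
});
```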
The second option is to create a custom authentication handler.
In a GitHub issue you can find multiple solutions to implement this.
The authentication handler will create a claim to represent an authenticated user.
We must configure the application by adding the authentication handler.
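A sketch of both pieces, loosely following the pattern from the official docs; the scheme name and claim values are illustrative:

```csharp
public class IntegrationTestAuthenticationHandler
    : AuthenticationHandler<AuthenticationSchemeOptions>
{
    public IntegrationTestAuthenticationHandler(
        IOptionsMonitor<AuthenticationSchemeOptions> options,
        ILoggerFactory logger, UrlEncoder encoder, ISystemClock clock)
        : base(options, logger, encoder, clock) { }

    protected override Task<AuthenticateResult> HandleAuthenticateAsync()
    {
        // Every request is treated as coming from this fake user.
        var claims = new[] { new Claim(ClaimTypes.Name, "IntegrationTest User") };
        var identity = new ClaimsIdentity(claims, "IntegrationTest");
        var ticket = new AuthenticationTicket(
            new ClaimsPrincipal(identity), "IntegrationTest");
        return Task.FromResult(AuthenticateResult.Success(ticket));
    }
}

// Registration, inside the factory's ConfigureWebHost:
// builder.ConfigureTestServices(services =>
//     services
//         .AddAuthentication("IntegrationTest")
//         .AddScheme<AuthenticationSchemeOptions,
//             IntegrationTestAuthenticationHandler>("IntegrationTest", _ => { }));
```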
To create an authenticated request, we must add the Authorization header to the request.
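For example, on the shared client (the scheme name matches whatever the handler was registered under, here assumed to be "IntegrationTest"):

```csharp
Client.DefaultRequestHeaders.Authorization =
    new AuthenticationHeaderValue("IntegrationTest");
```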
The last option is to use a real token. This also means you will have to generate a token before the tests run. Once generated, the token can be stored so that we don't have to generate a new one for each test, which would slow down the execution of the tests. Plus, we're not testing the authentication itself in these integration tests.
Just like before, we add the Authorization header to the request, but this time we also assign the token as the header's value.
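Assuming the generated token is available in a `token` variable:

```csharp
Client.DefaultRequestHeaders.Authorization =
    new AuthenticationHeaderValue("Bearer", token);
```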
If multiple tests try to read from and write to the same database, this may lead to deadlocks. That's why we had to turn off the parallelization of our tests.
For xUnit, this can be done by setting the parallelizeTestCollections property to false inside the xunit.runner.json config file.
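The config file then looks like this:

```json
{
  "parallelizeTestCollections": false
}
```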
Read more about this in the XUnit docs.
Previously, I didn't like writing tests for a C# API. But now that I've discovered functional testing, I enjoy writing them.
With little to no setup required, the time spent on writing tests has been cut in half. Previously, most of the time (at least for me) was spent on the setup of the test, and not on the actual test itself. Now, the time spent on writing tests feels more like time well spent.
If you follow the theory about refactoring, you shouldn't have to change your tests. In practice, we found out (the hard way) that this is not always true, and that this usually also meant regression bugs. Because integration tests don't care about implementation details, you won't have to refactor or rewrite previously written tests when the code changes. This gives us, as maintainers of the codebase, more confidence when we change, move, and delete code. The tests themselves will barely change over time, which also trims down the time spent on their maintenance.
Does this mean I don't write unit tests? No, it does not, but I write fewer of them: only for real business logic that doesn't require dependencies, just input in and a result as output.
These integration tests might be slower to run, but in my opinion it's worth it. Why? Because they give me more confidence that the code we ship actually works the way it's intended to. We're not mocking or stubbing parts of the application; we're testing the whole application. With machines getting faster, there won't be much of a difference between the other tests and the integration tests anyway. A couple of years ago, this time difference was bigger, and it usually meant that fewer (or no) integration tests were written. Time to change that, if you ask me!
The full example can be found on GitHub.
- The official docs about integration tests
- Easier functional and integration testing of ASP.NET Core applications by Scott Hanselman
- Avoid In-Memory Databases for Tests by Jimmy Bogard
Please consider supporting me if you have enjoyed this post and found it useful: