In this post, we're exploring the capabilities of the new output caching middleware and how to use it. As an example, let's create a new API and add caching to the weather forecast endpoint.
Use the following command to create a new web API project. To follow along with me, make sure you have at least version 7.0.0-preview.6 installed.
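The original command isn't shown here; the one below is one way to do it (the project name is a placeholder, and the `-minimal` flag creates a minimal API project):

```shell
dotnet new webapi -minimal -o OutputCachingExample
cd OutputCachingExample
```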
To use the output caching middleware on top of your endpoints, you first need to register the output cache into your application. Do this by using the
IServiceCollection.AddOutputCache and
IApplicationBuilder.UseOutputCache methods while setting up the API in
Program.cs.
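In Program.cs, the registration looks roughly like this (a minimal sketch):

```csharp
var builder = WebApplication.CreateBuilder(args);

// Register the output caching services
builder.Services.AddOutputCache();

var app = builder.Build();

// Add the output caching middleware to the pipeline
app.UseOutputCache();

app.Run();
```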
With the output caching middleware enabled, we can start adding a caching layer on top of the API's endpoints.
The most basic way to do this is by using the
CacheOutput extension method on an
IEndpointRouteBuilder (a route).
In the following example, the
/weatherforecast route is cached because we "marked" the route to be cached with
CacheOutput.
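A sketch of what this can look like; the handler body is illustrative:

```csharp
app.MapGet("/weatherforecast", () =>
{
    // Simulate an expensive operation so cache hits are easy to spot
    Thread.Sleep(1_000);
    return new[] { new { City = "Brussels", TemperatureC = 21 } };
})
.CacheOutput();
```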
The result is that the handler of the endpoint is invoked the first time it receives a request. Then, when it receives new requests, the caching layer returns the cached response without executing the endpoint's logic.
This is made visible in the next GIF, where a delay is added to the handler to make it clearer when the cache is empty, or when the cache is hit.
Because a weather forecast is specific to a region, using a single cache doesn't make sense. Let's see what happens when we add a query parameter "city" to request the weather forecast for a specific city.
In the example below, the city query parameter (with Brussels as value) is added to the URL, which means that the handler of the endpoint is executed. When the city value is changed (from Brussels to Paris), the handler is also invoked again. Lastly, a second query parameter "other" is added to the URL, which also results in the handler running again.
By testing this out we learn that the default behavior uses the URL path with query parameters to identify unique requests, and also to read and write to the cache. When the value of a query parameter changes or when a new query parameter is added, the cache isn't hit. If this isn't the desired behavior, the configuration of the output cache can be fine-tuned to your needs.
To cache a request by one (or multiple) unique query parameter(s), use the overload of
CacheOutput.
The overload gives us access to the policy builder
OutputCachePolicyBuilder to configure the caching strategy.
One of the options is the
OutputCachePolicyBuilder.VaryByQuery method, which uniquely identifies requests by specific query parameter(s). The method expects none, one, or multiple query parameter names to vary the cache by.
Let's see what happens when we configure the policy to identify requests by the "city" query parameter.
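Configured that way, the endpoint could look like this (GetForecast is a hypothetical helper):

```csharp
// Only the "city" query parameter is part of the cache key
app.MapGet("/weatherforecast", (string? city) => GetForecast(city))
   .CacheOutput(policy => policy.VaryByQuery("city"));
```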
To completely ignore query parameters within the URL, use
OutputCachePolicyBuilder.VaryByQuery but leave the query parameters empty.
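A sketch of this variant (GetForecast is a hypothetical helper):

```csharp
// No arguments: query parameters are ignored, all URLs share one cache entry
app.MapGet("/weatherforecast", (string? city) => GetForecast(city))
   .CacheOutput(policy => policy.VaryByQuery());
```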
This gives us the following result.
Instead of using query parameters, you can also build the cache from request headers with
OutputCachePolicyBuilder.VaryByHeader.
In the following example, the "X-City" request header differentiates requests by city; each city has an individual cache.
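A sketch of what this can look like (GetForecast is a hypothetical helper):

```csharp
// Each distinct "X-City" header value gets its own cache entry
app.MapGet("/weatherforecast", (HttpContext context) =>
        GetForecast(context.Request.Headers["X-City"]))
   .CacheOutput(policy => policy.VaryByHeader("X-City"));
```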
To have even more fine-grained control over the cache policy, you can make use of
OutputCachePolicyBuilder.VaryByValue.
OutputCachePolicyBuilder.VaryByValue takes a
Func that receives the
HttpContext as argument and returns a
KeyValuePair<string, string> as result.
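A hypothetical sketch (GetForecast is a made-up helper), varying the cache per hour of the day:

```csharp
app.MapGet("/weatherforecast", () => GetForecast())
   .CacheOutput(policy => policy.VaryByValue(context =>
       // Hypothetical: a separate cache entry per hour of the day
       new KeyValuePair<string, string>("hour", DateTime.Now.Hour.ToString())));
```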
I haven't had the need to implement a cache using this, if you have a good use case feel free to contact me.
The default implementation expires the cache after one minute (absolute duration).
To control how long a cache entry is valid, set an expiration for routes by using the
OutputCachePolicyBuilder.Expire method and pass it a
TimeSpan.
To change the default expiration time, update the
OutputCacheOptions.DefaultExpirationTimeSpan while registering the output cache.
For example, to use the cache for one hour:
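A sketch of both options (GetForecast is a hypothetical helper):

```csharp
// Per-route expiration of one hour
app.MapGet("/weatherforecast", () => GetForecast())
   .CacheOutput(policy => policy.Expire(TimeSpan.FromHours(1)));

// Or change the default expiration for all cached endpoints
builder.Services.AddOutputCache(options =>
{
    options.DefaultExpirationTimeSpan = TimeSpan.FromHours(1);
});
```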
In some cases, you might want to purge (clear) the cache when a certain event occurs. For example, when you know that the data has been updated, and you want to provide your users the updated data. When the cache is purged, the cache is revalidated when the next request is received.
To purge the cache, first add a tag (which is a
string) to the cache output with the
OutputCachePolicyBuilder.Tag method.
Then, inject the
IOutputCacheStore store and call the
IOutputCacheStore.EvictByTagAsync method, which accepts the tag name that needs to be purged.
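A sketch of both halves; the route names and GetForecast helper are hypothetical:

```csharp
// Tag the cached endpoint
app.MapGet("/weatherforecast", () => GetForecast())
   .CacheOutput(policy => policy.Tag("weatherforecast"));

// Evict all cache entries carrying the tag, e.g. from another endpoint
app.MapPost("/weatherforecast/purge",
    async (IOutputCacheStore store, CancellationToken token) =>
    {
        await store.EvictByTagAsync("weatherforecast", token);
    });
```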
Now that we've seen how the basics of the
CacheOutput middleware work, let's see how to use it in more complex scenarios.
When you have multiple endpoints that require a cache with similar policies, you don't have to add the
CacheOutput middleware to each endpoint. That would be hard to maintain and easy to get wrong.
To make it more manageable, create a route group and add the middleware to the group.
In the example below, we first create a route group called
wf and apply the output cache middleware to it.
Then, two endpoints are added to the group.
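A sketch of the group; the endpoint routes, policy, and helper methods are hypothetical:

```csharp
// Create a group and apply the output cache middleware to it
var wf = app.MapGroup("/wf")
            .CacheOutput(policy => policy.VaryByQuery("city"));

// Both endpoints inherit the group's cache policy
wf.MapGet("/today", (string? city) => GetForecast(city));
wf.MapGet("/tomorrow", (string? city) => GetTomorrowForecast(city));
```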
When working with groups (or policies later), the configuration that is applied on individual endpoints takes precedence over the configuration that is applied to the group.
For example, if you don't want to cache an endpoint, you can use the
OutputCachePolicyBuilder.NoCache method.
In the example below, the
/nocache route is added to the
wf group that was created in the previous step.
NoCache is applied to the
/nocache route; because of this, the route won't use the cache policies that were applied on the
group.
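A sketch of the opt-out, assuming a `wf` route group and a hypothetical GetForecast helper:

```csharp
// Opt this route out of the group's cache policies
wf.MapGet("/nocache", () => GetForecast())
  .CacheOutput(policy => policy.NoCache());
```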
When you find yourself configuring many endpoints and/or endpoint groups with the same configuration, creating your own cache policy is probably the best way to go. While creating custom policies you can choose to create a named policy or to add a default policy to the whole application.
You can include your own custom policies within the callback of
IServiceCollection.AddOutputCache.
In the next example, the named cache policy
InvariantQueries is created to ignore all query parameters.
This is done with the
OutputCacheOptions.AddPolicy method, which expects a policy name (a
string) and an
Action<OutputCachePolicyBuilder> to configure the cache policy.
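A sketch of the named policy:

```csharp
builder.Services.AddOutputCache(options =>
{
    // Named policy that ignores all query parameters
    options.AddPolicy("InvariantQueries", policy => policy.VaryByQuery());
});
```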
To make use of the
InvariantQueries policy, use another overload of the
CacheOutput middleware and pass it the name of the policy. In our example,
CacheOutput("InvariantQueries").
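A sketch of applying the named policy (GetForecast is a hypothetical helper):

```csharp
app.MapGet("/weatherforecast", (string? city) => GetForecast(city))
   .CacheOutput("InvariantQueries");
```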
To create a default policy that is used for all requests, use
OutputCacheOptions.AddBasePolicy.
The difference with a named policy is that
AddBasePolicy doesn't expect a policy name as an argument.
Just like a named policy,
AddBasePolicy uses the
Action<OutputCachePolicyBuilder> to configure the cache policy.
In the implementation of the example below, all requests that satisfy the base policy's condition are cached.
Notice that it isn't required to add
CacheOutput to a route to enable this base policy.
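Since the exact condition of the original example isn't reproduced here, the sketch below uses a made-up condition: only requests carrying a "cache" query parameter are cached.

```csharp
builder.Services.AddOutputCache(options =>
{
    // Hypothetical base policy: cache only requests with a "cache" query parameter
    options.AddBasePolicy(policy =>
        policy.With(context => context.HttpContext.Request.Query.ContainsKey("cache")));
});
```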
When a cache entry doesn't exist and needs to be created, but the server receives multiple requests simultaneously for that entry, then the handler is only executed once. The handler of the first request populates the cache, and the other requests wait until the first request is finished. This prevents the server from being overloaded with requests.
To know how the default implementation behaves, you can take a look at the source code of the default output caching policy.
One detail that pops out, and which I didn't expect at first but makes sense, is the check whether output caching should be used. Here, we can see that authorized requests are excluded from caching.
If this doesn't work for your application, you can create your own policy by implementing the
IOutputCachePolicy interface.
The created policy then has to be registered while configuring the output cache.
So far, we've only seen how to use the
OutputCache middleware with minimal APIs.
But, because it's just a middleware, it can also be used with traditional controllers.
As you can see in the example below, you simply have to add the
OutputCache attribute on top of the controller's method.
Or, when you want to enable output caching for the whole controller, you can add the
OutputCache attribute to the controller class.
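A sketch of both placements; the controller names and action bodies are illustrative:

```csharp
using Microsoft.AspNetCore.Mvc;
using Microsoft.AspNetCore.OutputCaching;

[ApiController]
[Route("[controller]")]
public class WeatherForecastController : ControllerBase
{
    // Cache only this action's response
    [HttpGet]
    [OutputCache]
    public IActionResult Get() => Ok(new[] { new { City = "Brussels", TemperatureC = 21 } });
}

// Or cache every action of the controller
[ApiController]
[Route("[controller]")]
[OutputCache]
public class CachedWeatherForecastController : ControllerBase
{
    [HttpGet]
    public IActionResult Get() => Ok(new[] { new { City = "Paris", TemperatureC = 23 } });
}
```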
In this post, we've covered most of the features of the new output caching middleware. Using the middleware improves the performance of your application. Looking at the API, we've seen that it requires almost no code to use the caching middleware, while it's flexible enough to configure to your own needs.
I'd appreciate your support if you have enjoyed this post and found it useful. Thank you in advance.