Get started with the rate limiting middleware in ASP.NET Core 7

Take advantage of the new rate limiting middleware in ASP.NET Core 7 to protect against malicious attacks on your applications and ensure equitable use of server resources.


Rate limiting is a technique used to restrict the number of requests allowed to a particular resource to thwart DDoS attacks and API abuses. Once a rate limiting threshold is reached, subsequent requests to the resource are disallowed, delayed, or throttled.

Until .NET 7, implementing rate limiting in ASP.NET Core applications required using third-party packages such as AspNetCoreRateLimit. But with ASP.NET Core 7, rate limiting is now a built-in feature, available as a middleware designed to prevent abuse, protect against attacks, and ensure fair resource allocation.

I discussed how to use AspNetCoreRateLimit to implement rate limiting in earlier versions of ASP.NET Core in a previous article. In this article, we’ll examine how to use the new built-in rate limiting middleware in ASP.NET Core 7.

To use the code examples provided in this article, you should have Visual Studio 2022 installed on your system. If you don’t already have a copy, you can download Visual Studio 2022 here.

Create an ASP.NET Core 7 Web API project in Visual Studio 2022

First off, let’s create an ASP.NET Core 7 project in Visual Studio 2022. Follow these steps:

  1. Launch the Visual Studio 2022 IDE.
  2. Click on “Create new project.”
  3. In the “Create new project” window, select “ASP.NET Core Web API” from the list of templates displayed.
  4. Click Next.
  5. In the “Configure your new project” window, specify the name and location for the new project.
  6. Optionally check the “Place solution and project in the same directory” check box, depending on your preferences.
  7. Click Next.
  8. In the “Additional Information” window shown next, leave the “Use controllers (uncheck to use minimal APIs)” box checked, since we won’t be using minimal APIs in this project. Leave the “Authentication Type” set to “None” (the default).
  9. Ensure that the check boxes “Enable Open API Support,” “Configure for HTTPS,” and “Enable Docker” remain unchecked as we won’t be using those features here.
  10. Click Create.

We’ll use this ASP.NET Core 7 Web API project to work with built-in rate limiting middleware in the sections below.

Built-in rate limiting in ASP.NET Core 7

Rate limiting in ASP.NET Core 7 is built on the System.Threading.RateLimiting namespace, whose central type is the abstract base class RateLimiter. The middleware that plugs these limiters into the ASP.NET Core request pipeline lives in the Microsoft.AspNetCore.RateLimiting namespace.

RateLimiter can be configured with several options including the maximum number of requests allowed, the response status code, and the time window. You can define the rate limit depending on the HTTP method, the client IP address, and other factors. You even have the option of queueing requests instead of rejecting them.

The following rate limiter algorithms are supported:

  • Fixed window
  • Sliding window
  • Token bucket
  • Concurrency
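
Each of these algorithms can be registered as a named policy through an extension method on the rate limiter options. The sketch below illustrates all four; the policy names ("fixed", "sliding", "token", "concurrency") and the specific limit values are arbitrary choices for illustration.

```csharp
using System.Threading.RateLimiting;
using Microsoft.AspNetCore.RateLimiting;

builder.Services.AddRateLimiter(options =>
{
    // Fixed window: at most 5 requests per 10-second window.
    options.AddFixedWindowLimiter("fixed", opt =>
    {
        opt.PermitLimit = 5;
        opt.Window = TimeSpan.FromSeconds(10);
    });

    // Sliding window: the window is split into segments that slide over time,
    // smoothing out the burst allowed at each fixed-window boundary.
    options.AddSlidingWindowLimiter("sliding", opt =>
    {
        opt.PermitLimit = 5;
        opt.Window = TimeSpan.FromSeconds(10);
        opt.SegmentsPerWindow = 2;
    });

    // Token bucket: a bucket of tokens is refilled periodically;
    // each request consumes one token.
    options.AddTokenBucketLimiter("token", opt =>
    {
        opt.TokenLimit = 5;
        opt.TokensPerPeriod = 5;
        opt.ReplenishmentPeriod = TimeSpan.FromSeconds(10);
    });

    // Concurrency: caps requests in flight at once, not requests per time window.
    options.AddConcurrencyLimiter("concurrency", opt =>
    {
        opt.PermitLimit = 5;
    });
});
```

A named policy does nothing until it is attached to one or more endpoints, which we'll touch on below.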

To add the rate limiting middleware to your ASP.NET Core 7 application, you should first add the required services to the container as shown in the code snippet given below.

builder.Services.AddRateLimiter(options =>
{
    //Write your code to configure the middleware here
});

To add the middleware to the pipeline, you should call the UseRateLimiter extension method as shown below.

app.UseRateLimiter();
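
If you register named policies, they can be attached to specific endpoints rather than applied globally. The following sketch assumes a policy named "fixed" has already been registered; the controller is illustrative.

```csharp
using Microsoft.AspNetCore.Mvc;
using Microsoft.AspNetCore.RateLimiting;

// In Program.cs, a named policy can be attached to a whole endpoint group:
// app.MapControllers().RequireRateLimiting("fixed");

[EnableRateLimiting("fixed")]   // apply the named policy to the whole controller
[ApiController]
[Route("[controller]")]
public class WeatherForecastController : ControllerBase
{
    [HttpGet]
    public IActionResult Get() => Ok("rate limited");

    [DisableRateLimiting]       // exempt this action from rate limiting
    [HttpGet("health")]
    public IActionResult Health() => Ok("not rate limited");
}
```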

Configure rate limiting middleware in ASP.NET Core 7

Now, write the following code in the Program.cs file to configure the rate limiter.

builder.Services.AddRateLimiter(options =>
{
    options.GlobalLimiter = PartitionedRateLimiter.Create<HttpContext, string>(httpContext =>
        RateLimitPartition.GetFixedWindowLimiter(
            partitionKey: httpContext.Request.Headers.Host.ToString(),
            factory: _ => new FixedWindowRateLimiterOptions
            {
                PermitLimit = 5,
                AutoReplenishment = true,
                Window = TimeSpan.FromSeconds(10)
            }));
});

The call to the AddRateLimiter method registers the rate limiting services with the service collection. This example sets a GlobalLimiter, which applies to every request, to a PartitionedRateLimiter that partitions requests by the Host header of the incoming request.

Each partition uses a fixed window limiter that permits at most five requests per 10-second window, replenishing permits automatically when the window elapses. Note that when you run the application and call an endpoint more often than the permitted limit, HTTP status code 503 “Service Unavailable” is returned by default.

Alternatively, you can configure the middleware to return HTTP Status Code 429 “Too Many Requests.” To do so, use the following code snippet.

options.OnRejected = (context, token) =>
{
    context.HttpContext.Response.StatusCode = 429;
    return ValueTask.CompletedTask;
};
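
If all you need is a different status code, the OnRejected callback isn’t strictly necessary: the rate limiter options also expose a RejectionStatusCode property. A minimal sketch:

```csharp
builder.Services.AddRateLimiter(options =>
{
    // Status code returned for rejected requests; the default is 503.
    options.RejectionStatusCode = StatusCodes.Status429TooManyRequests;
});
```

The OnRejected callback remains useful when you also want to write a response body or add headers such as Retry-After.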

If you would like to return a custom error message as well, write the response body inside the OnRejected callback, as shown in the snippet below.

options.OnRejected = async (context, token) =>
{
    context.HttpContext.Response.StatusCode = 429;
    await context.HttpContext.Response.WriteAsync(
        "Too many requests. Please try again later.", cancellationToken: token);
};

Complete rate limiting example (Program.cs source)

Here is the complete source code of the Program.cs file for your reference.

using System.Threading.RateLimiting;

var builder = WebApplication.CreateBuilder(args);

// Add services to the container.
builder.Services.AddControllers();
builder.Services.AddRateLimiter(options =>
{
    options.GlobalLimiter = PartitionedRateLimiter.Create<HttpContext, string>(httpContext =>
        RateLimitPartition.GetFixedWindowLimiter(
            partitionKey: httpContext.Request.Headers.Host.ToString(),
            factory: _ => new FixedWindowRateLimiterOptions
            {
                PermitLimit = 5,
                AutoReplenishment = true,
                Window = TimeSpan.FromSeconds(10)
            }));
    options.OnRejected = async (context, token) =>
    {
        context.HttpContext.Response.StatusCode = 429;
        await context.HttpContext.Response.WriteAsync(
            "Too many requests. Please try again later.", cancellationToken: token);
    };
});

var app = builder.Build();

// Configure the HTTP request pipeline.
app.UseRateLimiter();
app.UseAuthorization();
app.MapControllers();
app.Run();

Queue requests instead of rejecting them

You can also queue requests instead of rejecting them. To achieve this, you should take advantage of the QueueLimit property and set your desired value as shown in the code snippet given below.

builder.Services.AddRateLimiter(options =>
{
    options.GlobalLimiter = PartitionedRateLimiter.Create<HttpContext, string>(httpContext =>
        RateLimitPartition.GetFixedWindowLimiter(
            partitionKey: httpContext.Request.Headers.Host.ToString(),
            factory: _ => new FixedWindowRateLimiterOptions
            {
                PermitLimit = 5,
                AutoReplenishment = true,
                QueueLimit = 5,
                QueueProcessingOrder = QueueProcessingOrder.OldestFirst,
                Window = TimeSpan.FromSeconds(10)
            }));
    options.OnRejected = async (context, token) =>
    {
        context.HttpContext.Response.StatusCode = 429;
        await context.HttpContext.Response.WriteAsync(
            "Too many requests. Please try again later.", cancellationToken: token);
    };
});

Note how QueueProcessingOrder has been set to OldestFirst. If instead you want the last inserted items in the queue to be processed first, you can set QueueProcessingOrder to NewestFirst.
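
The queueing behavior is easiest to see with the underlying System.Threading.RateLimiting types used directly, outside of the middleware. The following console sketch (limit values chosen for illustration) creates a fixed window limiter with a queue; the first acquisition succeeds immediately, and subsequent requests up to QueueLimit wait for the window to replenish instead of being rejected.

```csharp
using System.Threading.RateLimiting;

var limiter = new FixedWindowRateLimiter(new FixedWindowRateLimiterOptions
{
    PermitLimit = 1,
    Window = TimeSpan.FromSeconds(10),
    AutoReplenishment = true,
    QueueLimit = 2,
    QueueProcessingOrder = QueueProcessingOrder.NewestFirst
});

// A permit is available, so this lease is acquired immediately. The next two
// AcquireAsync calls would queue; with NewestFirst, the most recently queued
// caller receives the next permit when the window replenishes.
using RateLimitLease lease = await limiter.AcquireAsync(permitCount: 1);
Console.WriteLine(lease.IsAcquired); // True
```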

By using rate limiting, you can reduce the load on your server and protect it from bad actors, ensuring the availability of your service and fair usage of available resources. In future posts on rate limiting, I’ll discuss the different rate limiting algorithms available in ASP.NET Core 7 and how we can implement custom rate limit policies.

Copyright © 2023 IDG Communications, Inc.