Integration Testing for ASP.NET APIs (2/3) - Data

Overview

Where's the code?

Fully working examples for this code are in the 02-api-with-sqlite and 03-api-with-postgres folders of the GitHub repo with the examples for this series of posts:

In the previous post we got started with some basic integration tests. In this one we'll use a real data store for the tests - first SQLite, then Postgres.

API Changes

I added a simple ProductsController that uses an EF Core DbContext to retrieve existing products (two GET methods) and to create new ones (a POST method).

Here's the code:

[HttpGet]
public async Task<IEnumerable<Product>> GetProducts(string category = "all")
{
    return await context.Products
        .Where(p => p.Category == category || category == "all")
        .ToListAsync();
}

[HttpGet]
[Route("{id}")]
public async Task<Product> Get(int id)
{
    var product = await context.Products.FindAsync(id);
    return product ?? throw new KeyNotFoundException($"Product with id {id} not found");
}

[HttpPost]
public async Task<Product> Post(Product product, CancellationToken token)
{
    await validator.ValidateAndThrowAsync(product, token);
    await context.Products.AddAsync(product, token);
    await context.SaveChangesAsync(token);
    return product;
}

It's intentionally pretty simple but does have a few nuances:

  • Any GET by ID with an ID that doesn't exist will return a 404 Not Found with a ProblemDetails response
  • The POST method uses FluentValidation to perform validation on the submitted object and will return a 400 Bad Request with ProblemDetails if validation fails. The ProblemDetails object will include information about the validation failure(s) - see the validator sketch just after this list.
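
To give a feel for what that validation looks like, here's a rough sketch of a FluentValidation validator for Product. This is a hypothetical version - the real ProductValidator lives in the GitHub repo and may differ - and it assumes the validator gets the LocalContext injected so it can enforce a unique product name:

// Hypothetical sketch - the real ProductValidator is in the repo and may differ
public class ProductValidator : AbstractValidator<Product>
{
    public ProductValidator(LocalContext context)
    {
        RuleFor(p => p.Name)
            .NotEmpty()
            // assumed uniqueness rule backed by the database
            .MustAsync(async (name, token) =>
                !await context.Products.AnyAsync(p => p.Name == name, token))
            .WithMessage("A product with the same name already exists.");

        RuleFor(p => p.Price).GreaterThan(0);
    }
}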

In our tests, we want to start with some sample data and make sure all of the above actually works like we want.

Establishing Sample Data - SQLite

I've been using SQLite for most of my demo projects lately - it doesn't require any installs or anything running locally, and it's supported by EF Core. So people can simply clone a repo and run it, and things should work: super handy.

The DbContext I created for the demo API is really simple:

public class LocalContext(DbContextOptions<LocalContext> options) : DbContext(options)
{
    public DbSet<Product> Products { get; set; } = null!;
}

When the API starts up (in Program.cs) I call a method that will either simply perform any EF Core migrations or (if we're in the Development environment) create some hard-coded sample data:

public static void InitializeDatabase(this IServiceProvider serviceProvider, string environmentName)
{
    // scope is required because dbContext is a scoped service
    using var scope = serviceProvider.CreateScope();
    using var context = new LocalContext(scope.ServiceProvider.GetRequiredService<DbContextOptions<LocalContext>>());

    if (environmentName == "Development")
    {
        StartWithFreshDevData(context);
    }
    else
    {
        context.Database.Migrate(); // just make sure we're up-to-date
    }
}

You can see the Program.cs code and the StartWithFreshDevData method in the repo - they're not that interesting and look like what you would expect. One note about StartWithFreshDevData: it deletes the entire database on start, then runs the migrations, and finally inserts some hard-coded data (a rough sketch follows).
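
For reference, here's a rough sketch of what StartWithFreshDevData does - the actual method and its sample products are in the repo, and the values below are made up:

// Rough sketch - the real StartWithFreshDevData and its sample data are in the repo
private static void StartWithFreshDevData(LocalContext context)
{
    context.Database.EnsureDeleted();   // drop the whole database
    context.Database.Migrate();         // recreate it from the EF Core migrations

    // insert a handful of hard-coded products (the values here are made up)
    context.Products.AddRange(
        new Product { Name = "Sample Widget", Category = "widgets", Price = 9.99 },
        new Product { Name = "Sample Gadget", Category = "gadgets", Price = 19.99 });
    context.SaveChanges();
}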

The two main points to recall here are:

  • A DbContext is created during API startup
  • Either a migration is completed or (if Development) fresh sample data is created

Using Fixtures in XUnit

We can use a Fixture (see the "Class Fixture" section) to share context between test classes and conditions in our XUnit tests.

In our DatabaseFixture, we'll use an in-memory version of SQLite so that we don't create new files for the db when running tests, and we'll also generate our own data for the tests (the Development data is only 6 hard-coded products and we want to test with more).

Here's the starting point for a DatabaseFixture class that at least gets an in-memory SQLite database created:

public class DatabaseFixture : IAsyncLifetime
{
    public const string DatabaseName = "InMemTestDb;Mode=Memory;Cache=Shared;";
    private LocalContext? _dbContext;

    public async Task InitializeAsync()
    {
        var options = new DbContextOptionsBuilder<LocalContext>()
            .UseSqlite($"Data Source={DatabaseName}")
            .UseQueryTrackingBehavior(QueryTrackingBehavior.NoTracking)
            .Options;

        _dbContext = new LocalContext(options);

        await _dbContext.Database.EnsureDeletedAsync();
        await _dbContext.Database.EnsureCreatedAsync();
        await _dbContext.Database.OpenConnectionAsync();
        await _dbContext.Database.MigrateAsync();
    }

    public async Task DisposeAsync()
    {
        if (_dbContext != null) await _dbContext.DisposeAsync();
    }
}

The DatabaseName constant defines the name of the SQLite database and specifies that it will be in-memory. The rest of the code should be pretty familiar-looking - just creating a DbContext and doing migration-type stuff.
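
For clarity, once DatabaseName is interpolated into the UseSqlite call, the full connection string becomes:

Data Source=InMemTestDb;Mode=Memory;Cache=Shared;

The Mode=Memory and Cache=Shared pieces are what make this a shared in-memory database, and keeping a connection open (the OpenConnectionAsync call above) is what keeps it alive for the duration of the tests.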

Sample Data Creation with Bogus

To create sample data (I wanted more than 6 hard-coded products and didn't want to have to make them up myself), I used the handy Bogus NuGet package.

I added some code to the above DatabaseFixture class that works the magic (I am omitting some of the original code shown above - but it's all in the GitHub repo):

public class DatabaseFixture : IAsyncLifetime
{
    //...
    public List<Product> OriginalProducts = [];
    public Faker<Product> ProductFaker { get; } = new Faker<Product>()
        .RuleFor(p => p.Name, f => f.Commerce.ProductName())
        .RuleFor(p => p.Price, f => Convert.ToDouble(f.Commerce.Price()))
        .RuleFor(p => p.Description, f => f.Commerce.ProductDescription())
        .RuleFor(p => p.Category, f => f.Commerce.Categories(1)[0])
        .RuleFor(p => p.ImgUrl, f => f.Image.PicsumUrl());

    public async Task InitializeAsync()
    {
        // ... original code to create dbcontext and do migrations
        CreateFreshSampleData(100);

        OriginalProducts = await _dbContext.Products.ToListAsync();
    }

    private void CreateFreshSampleData(int numberOfProductsToCreate)
    {
        var products = ProductFaker.Generate(numberOfProductsToCreate);
        _dbContext!.Products.AddRange(products);
        _dbContext.SaveChanges();
    }
    //...
}

The code above creates a Faker<Product> to generate sample Product objects and uses some handy methods built into the Bogus library to create things like product names and image URLs.

The call to CreateFreshSampleData creates 100 products - handy if I want to test paging - and the number can be whatever I want it to be. Really useful.

I capture OriginalProducts from exactly this first set of generated data so that I can reference it during the tests.

Aside: The CollectionFixture in XUnit

When to Use Collection Fixtures

This comes straight from the XUnit docs: "When to use: when you want to create a single test context and share it among tests in several test classes, and have it cleaned up after all the tests in the test classes have finished."

Since the database will be used by any / all API routes and we (likely) will have multiple API routes that we want to put in different test classes, this makes total sense for us to use.

We create a class definition like this (I generally put mine at the bottom of the file with the fixture implementation - DatabaseFixture.cs in this case):

[CollectionDefinition("IntegrationTests")]
public class DatabaseCollection : ICollectionFixture<DatabaseFixture>
{
    // This class has no code, and is never created. Its purpose is simply
    // to be the place to apply [CollectionDefinition] and all the
    // ICollectionFixture<> interfaces.
}

The generic parameter for ICollectionFixture<T> should be your custom fixture - in this example, DatabaseFixture.

Also, the matching [Collection("IntegrationTests")] attribute should be placed above your test classes, so a test class declaration would look something like this:

[Collection("IntegrationTests")]
public class ProductControllerTests(CustomApiFactory<Program> factory, 
  ITestOutputHelper output, DatabaseFixture dbFixture) : BaseTest(factory, output), IClassFixture<CustomApiFactory<Program>>

All of the above is "plumbing" to make sure your tests "wire up" properly.
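
For context, BaseTest and CustomApiFactory<Program> both came from the previous post. A rough sketch of what a BaseTest like this might look like (hypothetical - the real one is in the repo):

// Hypothetical sketch of a BaseTest class - the real one from the previous post may differ
public abstract class BaseTest(CustomApiFactory<Program> factory, ITestOutputHelper output)
{
    protected HttpClient Client { get; } = factory.CreateClient();
    protected ITestOutputHelper Output { get; } = output;
}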

Replacing the DbContext Used by the API

Now that we've got the in-memory SQLite database created and have added some sample data to it, we need to make sure that this database is used by the API code when we run our tests.

This is where we take more advantage of the CustomApiFactory we established in the previous post.

We specified builder.UseEnvironment("test") so we should not be creating any hard-coded sample data (it only does that in "Development"), and we can add some code into the same ConfigureWebHost to achieve what we want:

protected override void ConfigureWebHost(IWebHostBuilder builder)
{
    builder.UseEnvironment("test");

    builder.ConfigureServices(services =>
    {
        var dbContextDescriptor = services.SingleOrDefault(
            d => d.ServiceType == typeof(DbContextOptions<LocalContext>));
        services.Remove(dbContextDescriptor!);

        var ctx = services.SingleOrDefault(d => d.ServiceType == typeof(LocalContext));
        services.Remove(ctx!);

        // add back the container-based dbContext
        services.AddDbContext<LocalContext>(opts =>
            opts.UseSqlite($"Data Source={DatabaseFixture.DatabaseName}")
                .UseQueryTrackingBehavior(QueryTrackingBehavior.NoTracking));
    });
}

The code above removes the LocalContext registration that was added to the services collection in Program.cs and then adds back one pointing at the database we created in the DatabaseFixture class -- note the use of the DatabaseFixture.DatabaseName constant in the UseSqlite call.

With this code in place, any tests we write should use a new in-memory instance of SQLite and the sample data we generated in the DatabaseFixture -- sweet!

Writing Tests

With the HttpClient extensions we created in the previous post, the DatabaseFixture, and the OriginalProducts list, writing highly readable tests is pretty straightforward. Here's a first test for the GET many method:

[Fact(DisplayName = "Get all products")]
public async Task GetProductsReturnsProducts()
{
    var retrievedProducts = await Client.GetJsonResultAsync<List<Product>>("/v1/products", 
        HttpStatusCode.OK);

    Assert.NotNull(retrievedProducts);

    // NOTE: because the tests are not explicitly ordered and there are
    // tests that add products, we may have more products than we started with
    Assert.True(dbFixture.OriginalProducts.Count <= retrievedProducts.Count);

    var randomProduct = BogusFaker.PickRandom(dbFixture.OriginalProducts);
    Assert.Contains(retrievedProducts, c => c.Id == randomProduct.Id);

    var product = retrievedProducts.First(c => c.Id == randomProduct.Id);
    Assert.Equal(randomProduct.Name, product.Name);
}

The above calls the /v1/products route and expects a status of OK.

It also finds a random product from the OriginalProducts list that we created in the DatabaseFixture and makes sure that product is in the list of products we got back.

Here's another sample that checks for a validation error when trying to POST a new product:

[Fact(DisplayName = "Cannot create product with duplicate name")]
public async Task CreateProductDuplicate()
{
    var existingCompany = BogusFaker.PickRandom(dbFixture.OriginalProducts);

    var newProduct = dbFixture.ProductFaker.Generate();
    newProduct.Name = existingCompany.Name;

    var problem = await Client.PostJsonForResultAsync<ProblemDetails>(
        $"/v1/products", newProduct, HttpStatusCode.BadRequest);

    Assert.Equal("Validation error(s) occurred.", problem.Title);
    Assert.Equal("A product with the same name already exists.", problem.Extensions["Name"]!.ToString());
}

The above uses a Faker to generate a new Product record, and then overwrites the Name property with the name of a random existing product from the OriginalProducts list.

Then it does a POST and expects a Bad Request response and evaluates the returned ProblemDetails record to make sure the right information was returned to the API caller.
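
If you don't have the previous post handy, the two HttpClient extension methods used above look roughly like the sketch below - hypothetical versions, with the real implementations in the repo:

// Rough sketch of the HttpClient extensions from the previous post - the real versions may differ
public static class HttpClientExtensions
{
    public static async Task<T> GetJsonResultAsync<T>(this HttpClient client,
        string url, HttpStatusCode expectedStatusCode)
    {
        var response = await client.GetAsync(url);
        Assert.Equal(expectedStatusCode, response.StatusCode);

        var result = await response.Content.ReadFromJsonAsync<T>();
        Assert.NotNull(result);
        return result;
    }

    public static async Task<T> PostJsonForResultAsync<T>(this HttpClient client,
        string url, object body, HttpStatusCode expectedStatusCode)
    {
        var response = await client.PostAsJsonAsync(url, body);
        Assert.Equal(expectedStatusCode, response.StatusCode);

        var result = await response.Content.ReadFromJsonAsync<T>();
        Assert.NotNull(result);
        return result;
    }
}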

You can explore the rest of the tests in the GitHub repo.

Switching to Use Postgres

The technique we used above makes for some great tests - as defined by reliability, readability, and ease of writing the test conditions.

And while using SQLite is fine for demo apps, most real production applications will use "heavier" databases like Postgres and SQL Server.

But these techniques apply equally well to those databases too!

In the 03-api-with-postgres folder of the GitHub repo, I've updated the API to use Postgres instead of SQLite.

This was a matter of:

  • Swapping out the EF Core SQLite NuGet package for the Npgsql one
  • Deleting the Data/Migrations folder and recreating the EF Core migrations
  • Updating Program.cs to UseNpgsql() when adding the LocalContext to the services collection (which uses a connection string defined in appsettings.json) - sketched just after this list
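
The Program.cs registration ends up looking roughly like the sketch below; the connection string name ("LocalContext") is an assumption here - check the repo for the exact code:

// Sketch of the Postgres registration in Program.cs - the connection string name is an assumption
builder.Services.AddDbContext<LocalContext>(options =>
    options.UseNpgsql(builder.Configuration.GetConnectionString("LocalContext")));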

Updating the Tests to use Postgres - with TestContainers

Postgres doesn't run in-memory (and if it did we probably wouldn't want to test that way anyway).

But there's a handy NuGet package called Testcontainers that lets you use container technology to host things like a Postgres database in a container that exists for the duration of your test run and is removed once the run completes. Again - super handy. It also has built-in support for a bunch of common modules - which includes Postgres (and SQL Server (MSSQL) and lots more).

Using TestContainers Requires a Container Runtime

You need to have a Docker (container) runtime set up on your computer to use Testcontainers. On Windows, that is two steps: installing WSL 2 and then installing Docker Desktop.

Postgres support comes in the Testcontainers.PostgreSql NuGet package. Using it in our DatabaseFixture class looks like the updated code below:

public class DatabaseFixture : IAsyncLifetime
{
    private readonly PostgreSqlContainer _dbContainer =
        new PostgreSqlBuilder()
            .WithDatabase("simple_api_tests")
            .WithUsername("postgres")
            .WithPassword("notapassword")
            .Build();

    public string TestConnectionString => _dbContainer.GetConnectionString();

    private LocalContext? _dbContext;

    // OriginalProducts and Faker<Product> declarations

    public async Task InitializeAsync()
    {
        await _dbContainer.StartAsync();

        var optionsBuilder = new DbContextOptionsBuilder<LocalContext>()
            .UseNpgsql(TestConnectionString);
        _dbContext = new LocalContext(optionsBuilder.Options);

        await _dbContext.Database.MigrateAsync();

        //... rest of method and class

Note the handy GetConnectionString method, which returns a connection string for the containerized database - we use it in the CustomApiFactory when we need to replace the DbContext that was wired up in Program.cs for the API:

// add back the container-based dbContext
services.AddDbContext<LocalContext>(options =>
    options.UseNpgsql(dbFixture.TestConnectionString));

Those were the only updates needed to use Postgres! The tests themselves don't change at all! You could use exactly the same technique if you were using a different database, including SQL Server, MySQL, and more.
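
For example, switching the fixture to SQL Server would mean using the Testcontainers.MsSql package instead - a rough sketch (builder defaults and the EF Core provider package would also change):

// Sketch: the same fixture idea with SQL Server via the Testcontainers.MsSql package
private readonly MsSqlContainer _dbContainer = new MsSqlBuilder().Build();

// ...and in InitializeAsync, the EF Core provider call changes to:
// .UseSqlServer(_dbContainer.GetConnectionString())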

Next...

In the next post we'll get into how to add authentication and authorization logic into the tests.

Stay tuned!