Chapter 5

Implementation and Deployment Strategies

Automation at different test levels within pipelines

  • Configuration tests (part of the ‘build’ step of the pipeline) – verify that the paths to the files used by the test scripts are correct, i.e. that the files exist at the specified locations.
  • Component tests (part of the ‘build’ step of the pipeline) – executed on library classes & web components.
  • Component integration tests (‘continuous integration’ pipeline) – executed together with component tests.
  • System tests (‘continuous deployment’ pipeline) – the last quality gate of the system under test.
  • System integration tests (‘continuous delivery’ pipeline) – ensure that separately developed system components are working together.
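The configuration tests listed above can be sketched as a simple existence check. This is a minimal illustration, not a prescribed implementation; the file paths below are hypothetical and would normally come from the testware's own configuration.

```python
# Configuration test sketch: verify that every file referenced by the
# test scripts exists at its specified location.
from pathlib import Path

# Illustrative paths only; a real project would load this list from
# the testware's configuration.
REQUIRED_FILES = [
    "tests/data/users.csv",
    "tests/config/staging.json",
]

def missing_files(paths):
    """Return the subset of paths that do not exist on disk."""
    return [p for p in paths if not Path(p).is_file()]

# A 'build' step would fail the pipeline if missing_files(REQUIRED_FILES)
# is non-empty, before any test scripts are executed.
```

In a pipeline, this check runs first so that a misconfigured path fails the build immediately rather than surfacing later as a confusing test error.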

Two approaches to incorporating system tests / system integration tests / user acceptance tests in pipelines:

  1. Test cases are executed as part of the deployment phase (CD) after the component deployment. If the tests fail, the deployment can be rolled back. If the tests need to be re-run, a re-deployment is required.
  2. Test cases are executed as a separate pipeline, triggered by the successful deployment (beneficial if you want different suites to run on each deployment). In this case, the tests do not act as a quality gate, and rollbacks must be performed manually. Such tests can confirm that the system has been deployed, but they do not verify its functional suitability.

Pipelines can also be used for other test automation purposes, such as:

  • Running different tests periodically, e.g. nightly regression.
  • Running non-functional tests, to periodically monitor things like performance efficiency.
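A periodic performance check of the kind mentioned above can be as simple as timing an operation against a budget. This is a hedged sketch; the budget and the measured operation are placeholders, and real monitoring would record and trend the measurements rather than just pass/fail them.

```python
# Sketch of a scheduled non-functional check: measure an operation's
# elapsed wall-clock time and compare it against a time budget.
import time

def within_budget(operation, budget_seconds):
    """Run the operation once and report whether it finished in time."""
    start = time.perf_counter()
    operation()
    elapsed = time.perf_counter() - start
    return elapsed <= budget_seconds
```

A nightly pipeline job could call `within_budget` on a representative transaction and fail (or alert) when the budget is exceeded.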

Configuration management for testware

Test environment configuration

  • Each environment in the pipeline can have different configurations such as URLs or credentials.
  • The test environment configuration is usually stored with the testware.
  • It can also be part of the common core library or in a shared repository.
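The per-environment configuration described above can be sketched as a lookup keyed by environment name. The environment names, URLs, and credential identifiers below are purely illustrative; in practice these values would live in configuration files stored with the testware or in a shared repository, not in code.

```python
# Sketch: per-environment test configuration selected at run time,
# e.g. via a TEST_ENV environment variable set by the pipeline.
import os

# Hypothetical values for illustration only.
ENVIRONMENTS = {
    "dev":     {"base_url": "https://dev.example.com",     "db_user": "dev_tester"},
    "staging": {"base_url": "https://staging.example.com", "db_user": "stg_tester"},
    "prod":    {"base_url": "https://www.example.com",     "db_user": "prod_tester"},
}

def load_config(env_name=None):
    """Return the configuration for the requested test environment."""
    name = env_name or os.environ.get("TEST_ENV", "dev")
    try:
        return ENVIRONMENTS[name]
    except KeyError:
        raise ValueError(f"Unknown test environment: {name!r}")
```

Each pipeline stage then only has to export the environment name; the testware resolves the matching URLs and credentials itself.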

Test data

  • Test data can be specific for the test environment or for the release.
  • In smaller test automation frameworks, test data is usually stored together with the testware; dedicated test data management systems can also be used.

Test suites / test cases

  • It is common to set up different test suites, based on their purpose, such as smoke testing or regression testing.
  • A feature toggle configuration can be defined per release or per test environment.
  • The testware can also be released together with the system under test under the same release version. This guarantees an exact match between the version of the system under test and the testware that can test it. Such a release is usually implemented in a configuration management system by means of tags or branches.
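Purpose-based test suites like the smoke and regression suites mentioned above are commonly implemented with tags or markers (e.g. pytest markers or JUnit tags). The sketch below shows the idea with plain data; the test names and tags are invented for illustration.

```python
# Sketch of tag-based suite selection: each test carries a set of tags,
# and the pipeline picks a suite by its purpose.

TESTS = {
    "test_login":          {"smoke", "regression"},
    "test_password_reset": {"regression"},
    "test_checkout":       {"smoke", "regression"},
    "test_report_export":  {"regression"},
}

def select_suite(tag):
    """Return the names of all tests carrying the given tag."""
    return sorted(name for name, tags in TESTS.items() if tag in tags)

# select_suite("smoke") -> ["test_checkout", "test_login"]
# select_suite("regression") -> all four tests
```

A commit-stage pipeline might run only the smoke suite for fast feedback, while the nightly pipeline runs the full regression suite.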

Test automation dependencies for an API infrastructure

When performing API test automation, you need the following information:

  • API connections – understand the business logic that can be tested automatically and the relationship between APIs.
  • API documentation – serves as a baseline for test automation (e.g. parameters, headers, distinct types of request-response objects).

Contract testing

  • This is a type of integration testing verifying that services can communicate with each other and that the data shared is consistent with a specified set of rules.
  • Contract testing ensures that APIs follow predefined communication agreements, helping manage API dependencies.
  • Contract testing can validate the compatibility of two separate systems.
  • It captures the interactions exchanged between the services, storing them in a contract, which can then be used to verify that both parties adhere to it.
  • Advantage: defects occurring from underlying services can be found early on.
  • Consumer-driven approach: the consumer sets its expectations, determining how the provider shall respond to requests coming from this consumer.
  • Provider-driven approach: the provider creates the contract, which shows how its services are operating.

Example: Contract testing between a frontend and backend API:

A frontend client calls an API to fetch user details. A backend API provider returns user data in JSON format. The frontend expects the backend to return:

{
  "id": 123,
  "name": "Bronwen",
  "email": "bronwen@example.com"
}

This structure becomes the contract – a formal agreement that defines how the backend should respond.

Any time the backend changes, the contract tests validate that changes don’t break the agreed structure. E.g. if the backend suddenly removes the email field or renames name to fullName, contract testing would catch the break before deployment.
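The example above can be sketched as a small consumer-driven contract check. The contract mirrors the JSON structure shown earlier (field names and types taken from that example); real projects would typically use a dedicated contract testing tool such as Pact rather than hand-rolled checks.

```python
# Sketch: verify a provider response against the consumer's contract.
# The contract is derived from the JSON example above.

CONTRACT = {"id": int, "name": str, "email": str}

def satisfies_contract(response, contract=CONTRACT):
    """Check that the response contains every agreed field with the
    agreed type; extra fields are tolerated."""
    return all(
        field in response and isinstance(response[field], expected_type)
        for field, expected_type in contract.items()
    )

# The agreed response passes ...
ok = satisfies_contract(
    {"id": 123, "name": "Bronwen", "email": "bronwen@example.com"}
)
# ... but renaming 'name' to 'fullName' breaks the contract.
broken = satisfies_contract(
    {"id": 123, "fullName": "Bronwen", "email": "bronwen@example.com"}
)
```

Running such a check in the provider's pipeline catches a breaking change like the removed `email` field or the `name` → `fullName` rename before the change is deployed.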