Chapter 3

Test Automation Architecture

Major Capabilities in a Test Automation Architecture

A Test Automation Architecture defines the technical design of a Test Automation Solution, which is built on top of a Test Automation Framework.

Generic Test Automation Architecture (gTAA)

  • System Under Test interface – the connection between the system under test and the test automation framework.
  • Project management interface – reporting on the test automation development progress.
  • Test management interface – mapping between test case definitions and automated test cases.
  • Configuration management interface – managing the CI/CD pipelines, test environments and testware.

Capabilities provided by test automation tools and libraries

  • Test generation – automatically generating test cases from models that define the System Under Test (i.e. automated model-based testing). The generated tests can then be traced back to the model (elements). E.g. GraphWalker is a model-based testing tool that uses graph-based models to represent system behaviour. It automatically generates test cases by exploring paths through the model, ensuring coverage of different scenarios.
  • Test definition – defining and implementing high- and low-level test cases and test suites, which can be derived from a test model.
  • Test execution – executing test cases automatically and reporting the test results. This includes the setup and teardown of test suites.
  • Test adaptation – supporting tools for controlling the test harness, monitoring the System Under Test and simulating or emulating the test environment. Functionality is provided for distributing the test execution across multiple test devices/interfaces. E.g. the VS Code Ruby Test Adapter, which runs tests and displays test results directly in the VS Code sidebar.
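As an illustration of the test generation capability, a toy model-based path generator might look like the sketch below. GraphWalker itself is a Java tool; the graph model and method names here are invented purely for illustration:

```ruby
# Toy model-based test generation: a state graph of the SUT is
# walked recursively to enumerate test paths (sequences of actions).
# The login-flow model below is invented for illustration.
MODEL = {
  "LoggedOut" => { "log_in" => "LoggedIn" },
  "LoggedIn"  => { "view_profile" => "Profile", "log_out" => "LoggedOut" },
  "Profile"   => { "go_back" => "LoggedIn" }
}.freeze

# Enumerate every path of up to max_depth actions starting from `state`.
def generate_paths(state, max_depth, path = [])
  return [path] if max_depth.zero?

  edges = MODEL.fetch(state, {})
  return [path] if edges.empty?

  edges.flat_map do |action, next_state|
    generate_paths(next_state, max_depth - 1, path + [action])
  end
end

paths = generate_paths("LoggedOut", 3)
# Each generated path is a candidate test case, traceable back to
# the model elements (states and edges) it exercised.
```

Each generated path (e.g. `["log_in", "view_profile", "go_back"]`) corresponds to one candidate test case covering a route through the model.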

How to Design a Test Automation Solution

The test automation architecture defines the technical design for the overall test automation solution and should address:

  • Selecting test automation tools and specific libraries. E.g. if you are using Cucumber and need to check the contents of PDFs, you will need to install the pdf-reader Ruby gem.
  • Developing plugins & components.
  • Identifying connectivity and interface requirements (e.g. firewalls, database, URLs, mocks/stubs, message queues and protocols).
  • Connecting to the test management & defect management tools. E.g. you can integrate Cucumber with Jira via the AIO plugin, so that you can view test results on each ticket.
  • Using a version control system and repositories.

Layering of Test Automation Frameworks

The Test Automation Framework is the foundation of a Test Automation Solution and often includes a test harness (test runner), test libraries, test scripts and suites.

Test Automation Framework layers:

  1. Test scripts layer: contains the test scripts and suites, which call the services of the business logic layer. No direct calls to the core libraries should be made from the test scripts.
  2. Business logic layer (libraries dependent on the system under test): facade classes or modules that abstract and group actions into business workflows, built on top of the core libraries (by inheriting from them or using their facades). This layer knows the intent (‘log in as a user’), not how to fill in fields. It is also used to configure the test automation framework to run against the specific system under test.
  3. Core libraries layer (libraries that are independent of any system under test), e.g. Capybara setup, browser configuration and assertion helpers.
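As a minimal sketch of the layering (the class and method names here are invented; `CoreDriver` is a fake stand-in for something like a Capybara session):

```ruby
# Core libraries layer: SUT-independent helpers. CoreDriver is a fake
# driver standing in for e.g. a configured Capybara session.
class CoreDriver
  attr_reader :log

  def initialize
    @log = []
  end

  def fill_in(field, value)
    @log << "fill #{field}=#{value}"
  end

  def click(element)
    @log << "click #{element}"
  end
end

# Business logic layer: a facade that knows the *intent*
# ("log in as a user"), built on top of the core layer.
class LoginFlow
  def initialize(driver)
    @driver = driver
  end

  def log_in_as(user)
    @driver.fill_in("username", user)
    @driver.fill_in("password", "secret")
    @driver.click("Log in")
  end
end

# Test scripts layer: calls only the business logic layer,
# never the core driver directly.
driver = CoreDriver.new
LoginFlow.new(driver).log_in_as("bronwen")
```

If the login form changes, only `LoginFlow` needs updating; the test scripts that express intent stay untouched.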

Approaches for Automating Test Cases

Capture/playback

Test scripts are produced by recording manual interactions with the system under test. “No-code” solutions don’t expose code, while “low-code” solutions do.

Advantage:

  • Initially easy to set up and use.

Disadvantages:

  • Difficult to maintain, scale and evolve.
  • The system needs to be available while capturing the tests.
  • Only feasible for a small scope and system that rarely changes.
  • The captured execution depends highly on the version from which the capture was taken.
  • Recording each individual test case instead of re-using code is time consuming.

Linear scripting

Linear scripting does not require custom test libraries written by an automation tester. Test scripts created by a record/playback tool can be modified directly.

Advantages:

  • Easy to set up and start writing test scripts.
  • Compared to capture/playback, scripts can be modified more easily.

Disadvantages:

  • Hard to maintain, scale and evolve.
  • The system needs to be available while capturing a test case.
  • Only feasible for a small scope and a system that rarely changes.
  • Compared to capture/playback, some programming knowledge is needed.

Structured Scripting

Test libraries are introduced with re-usable elements, test steps and/or user journeys.

Advantages:

  • Easy to maintain, scale, port and evolve.
  • Business logic can be separated from the test scripts.

Disadvantages:

  • Programming knowledge is necessary.
  • Initial investment into development and defining the testware is time consuming.
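A structured scripting library might look like the sketch below, where reusable steps live in a shared module and the individual test scripts stay short (the `CartSteps` module and its methods are invented for illustration):

```ruby
# Structured scripting: reusable steps live in a shared test library
# so individual test scripts can re-use them instead of duplicating code.
module CartSteps
  def self.add_item(cart, item)
    cart << item
  end

  def self.total(cart, prices)
    cart.sum { |item| prices[item] }
  end
end

PRICES = { "book" => 10, "pen" => 2 }.freeze

# Test script 1 re-uses the library...
cart = []
CartSteps.add_item(cart, "book")
total_one = CartSteps.total(cart, PRICES)

# ...and so does test script 2, with different items.
cart = []
CartSteps.add_item(cart, "book")
CartSteps.add_item(cart, "pen")
total_two = CartSteps.total(cart, PRICES)
```

The business logic (how items are added and totalled) sits in the library, separated from the test scripts that use it.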

Test-Driven Development

Test cases are defined before the code as part of the development process: write a failing test, write just enough code to make it pass, then refactor (red, green, refactor).
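A minimal sketch of the cycle, using Minitest from Ruby's standard library (the `Price` class is invented for illustration):

```ruby
# TDD sketch: the test below is written first (red); the minimal
# implementation of Price then makes it pass (green); refactoring
# follows with the test as a safety net.
require "minitest/autorun"

class Price
  def initialize(net)
    @net = net
  end

  def with_vat(rate)
    (@net * (1 + rate)).round(2)
  end
end

class PriceTest < Minitest::Test
  def test_adds_vat
    assert_equal 12.0, Price.new(10).with_vat(0.2)
  end
end
```

In true TDD the test class would exist (and fail) before `Price` is written; they are shown together here only for brevity.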

Advantages:

  • Simplifies component level test case development.
  • Improves code quality and the structure of code.
  • Improves testability.
  • Makes it easier to achieve a desired code coverage.
  • Reduces defect propagation to higher test levels.
  • Improves communication between developers, business representatives and testers, because the tests are defined before a new feature is implemented.
  • Exit criteria for user stories that cannot be verified through GUI or API testing can be achieved quickly with TDD-style component tests.

Disadvantages:

  • Initially takes more time to get accustomed to TDD.
  • Not following TDD properly can result in false confidence in code quality.

Data-driven testing

DDT builds upon the structured scripting approach. Test scripts are provided with test data, which allows for running the same test script multiple times with different data.
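A minimal data-driven sketch: one script, many data rows. The CSV rows and the `login_ok?` stand-in for a real SUT call are invented for illustration:

```ruby
# Data-driven testing: the same test logic runs once per data row.
# Test analysts could extend coverage just by adding rows.
require "csv"

TEST_DATA = CSV.parse(<<~ROWS, headers: true)
  username,password,expected
  alice,correct,true
  alice,wrong,false
ROWS

# Fake SUT behaviour, standing in for a real login call.
def login_ok?(username, password)
  username == "alice" && password == "correct"
end

results = TEST_DATA.map do |row|
  login_ok?(row["username"], row["password"]).to_s == row["expected"]
end
# Each element of `results` is true when the SUT matched the expectation.
```

Adding a new test case is a one-line change to the data, with no new code.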

Advantages:

  • Quick and easy test case expansion through data feeds.
  • Reduced cost of adding new automated tests.
  • Test analysts can specify tests just by populating data files (e.g. updating a spreadsheet) so they have less dependency on technical test analysts.

Disadvantage:

  • Proper test data management may be necessary.

Keyword-driven testing

Test cases are written as a list or table of steps built from keywords (e.g. Robot Framework). This technique is often built upon data-driven testing.
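The mechanism can be sketched as a tiny keyword interpreter: keyword rows (as a test analyst might write them in a table) are mapped onto actions. The keywords and actions below are invented for illustration:

```ruby
# Keyword-driven testing sketch: each keyword maps to an action;
# a test case is just a table of keyword/argument rows.
KEYWORDS = {
  "open"   => ->(log, page) { log << "opened #{page}" },
  "type"   => ->(log, text) { log << "typed #{text}" },
  "verify" => ->(log, text) { log << "verified #{text}" }
}.freeze

# A test case as a non-programmer might author it in a table.
test_case = [
  ["open",   "login page"],
  ["type",   "alice"],
  ["verify", "welcome message"]
]

log = []
test_case.each { |keyword, arg| KEYWORDS.fetch(keyword).call(log, arg) }
```

The same table could also be followed by hand, which is why keyword-driven test cases double as manual test scripts.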

Advantages:

  • Test analysts and business analysts can be involved in test case creation.
  • Can also be used for manual testing, by manually following the keyword steps.

Disadvantages:

  • Implementing and maintaining keywords is a complex task.
  • Huge effort for smaller systems.

Behaviour-Driven Development

Test scenarios are written collaboratively in natural language (e.g. Gherkin’s Given/When/Then) and then automated as executable specifications.

Advantages:

  • Improves communication between developers, business representatives and testers.
  • Automated BDD scenarios act as test cases and ensure coverage of specifications.
  • Multiple test types can be produced on different levels of the test pyramid (unit tests / integration tests / UI tests).

Disadvantages:

  • Additional test cases (negative tests & edge cases) still need to be defined.
  • Many teams don’t involve business representatives and just use BDD to write tests.
  • Implementing and maintaining natural language test steps is complex.
  • Overly complex test steps turn debugging into a difficult & costly activity.

Design Principles and Design Patterns in Test Automation

Object-oriented programming principles

  • Encapsulation – hide internal details of how an object works and expose only what’s necessary. E.g. a BankAccount class with a private instance variable, @balance, and a public method deposit() – you can interact with the account but can’t directly change its internal state:
class BankAccount
  def initialize
    @balance = 0
  end

  def deposit(amount)
    @balance += amount if amount > 0
  end

  def balance
    @balance
  end
end
  • Abstraction – focus on essential features and hide the complexity. You define an interface for what an object does, not how it does it. E.g. a Shape class might have a method calculate_area() without specifying how. Subclasses like Circle and Rectangle will implement the details:
class Shape
  def calculate_area
    raise NotImplementedError, "Subclass must implement this"
  end
end

class Circle < Shape
  def initialize(radius)
    @radius = radius
  end

  def calculate_area
    Math::PI * @radius**2
  end
end
  • Inheritance – create new classes from existing ones, which promotes reuse and logical hierarchy. E.g. an Employee class can be extended by Manager and Developer, inheriting common attributes like name and salary:
class Employee
  def initialize(name)
    @name = name
  end

  def work
    "#{@name} is working"
  end
end

class Developer < Employee
  def work
    "#{@name} is writing code"
  end
end
  • Polymorphism – allows objects of different types to be treated through a common interface, with behaviour varying based on their actual class. E.g. you call work() on any Employee, but a Developer and a Manager respond differently:
employees = [Developer.new("Bronwen"), Employee.new("Carl")]

employees.each do |e|
  puts e.work
end
# Output:
# Bronwen is writing code
# Carl is working

SOLID principles

  • Single responsibility – every component of a Test Automation Solution should be in charge of exactly one thing, e.g. generating keywords or data, executing test cases, logging results, generating execution reports.
  • Open-closed – software entities should be open for extension but closed for modification, i.e. it should be possible to modify or enrich the behaviour of the components without breaking the backward-compatible functionality.
  • Liskov substitution – every Test Automation Solution component must be replaceable without affecting the overall behaviour of the Test Automation Solution.
  • Interface segregation – it is better to have more specific components than a general, multi-purpose component. This makes substitution and maintenance easier by eliminating unnecessary dependencies.
  • Dependency inversion – the components of a Test Automation Solution must depend on abstractions rather than on low-level details. I.e. the components should not depend on specific automated test scenarios.
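Dependency inversion, for example, could be sketched like this in Ruby (the `Report` and writer classes are invented for illustration): the high-level report component depends only on an abstract "writer" duck type, not on any concrete output class.

```ruby
# Dependency inversion sketch: Report depends on an abstraction
# (anything responding to #write), not on a low-level detail.
class Report
  def initialize(writer)
    @writer = writer # any object responding to #write
  end

  def publish(results)
    results.each { |result| @writer.write(result) }
  end
end

# One concrete detail; a FileWriter or ConsoleWriter could be
# swapped in without touching Report (open-closed as well).
class MemoryWriter
  attr_reader :lines

  def initialize
    @lines = []
  end

  def write(line)
    @lines << line
  end
end

writer = MemoryWriter.new
Report.new(writer).publish(["login: pass", "checkout: fail"])
```

Because `Report` never names a concrete writer, it can be reused unchanged across test automation solutions with different reporting back ends.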

Design patterns

  • Facade pattern – hides implementation details and exposes only what testers need to create their test cases.
  • Singleton pattern – ensures a class has only one instance and provides a global point of access to it. It is often used to make sure there is only one driver that communicates with the system under test.
  • Page object model – each page (or component) of the system under test is represented by a class, the page object. When the system under test’s structure changes, you only need to make updates in one place, instead of updating locators in each test case. E.g. if a class interacts with a button element and the ID of the button changes when the website is redesigned, updating it in the class will ensure that all test cases that use this class still work.
  • Flow model – builds on page objects and stores the user actions that interact with them, so test steps can be re-used in multiple test scripts. Typically used when the system’s structure changes frequently. Cucumber allows you to make use of nested steps, so that your Gherkin doesn’t have to spell out all the detailed steps. E.g. when logging in, you could just re-use the following step:
steps %{
   Given I have logged in
}
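A page object in plain Ruby might look like the sketch below (`FakeDriver` and the locator are invented stand-ins, not a real library API):

```ruby
# Page object model sketch: locators live in one class, so a changed
# button ID is updated once rather than in every test case.
class FakeDriver
  attr_reader :actions

  def initialize
    @actions = []
  end

  def click(locator)
    @actions << "click #{locator}"
  end
end

class LoginPage
  SUBMIT_BUTTON = "#login-submit" # update here when the UI changes

  def initialize(driver)
    @driver = driver
  end

  def submit
    @driver.click(SUBMIT_BUTTON)
  end
end

driver = FakeDriver.new
LoginPage.new(driver).submit
```

If the button's ID changes in a redesign, only `SUBMIT_BUTTON` is edited; every test that calls `LoginPage#submit` keeps working.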