Test Suite

From GRASS-Wiki

GRASS Test Suite

We aim to create a comprehensive test suite for GRASS modules and libraries.

Background

See what has been done so far by Sören Gebbert and others at Development#QA and the VTK test suite.

Keep an eye on the GRASS 7 ideas collection.

The big picture

We plan to run unittests and integration tests for both libraries and modules. The test suite will be run after compilation, with a command like:

 $ make tests [ proj ]

Options:

  • proj: run the tests with a [list of] CRS, so that reprojection and map unit handling are tested.
  • ...

Tests are executed recursively. If "make tests" is executed in the root source folder, all libraries and modules are tested. To test only the libraries, switch into the lib directory and run "make tests". To test only the raster modules, switch into the raster directory and run "make tests"; the same applies to the other module directories. To test a single module, switch into that module's directory and run "make tests".

Tests

Modules

Tests should eventually cover all modules as well as the library test modules.

The module tests should be as independent as possible from other GRASS modules.

Tests are written as simple shell scripts with annotations. A test script's name should start with test., followed by the module name (e.g. r.mapcalc), and end with .sh, for example test.r.mapcalc.1.sh. All tests should be well documented using shell comments.

The framework will execute all test scripts starting with test. and ending with .sh located in module or library directories.
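
The naming convention above lends itself to simple recursive discovery. A minimal sketch, assuming a POSIX shell and using made-up directory and file names for illustration (the real framework's discovery code is not specified here):

```shell
# Hypothetical sketch: discovering test scripts that start with "test."
# and end with ".sh", recursively below a source directory.
tmp=$(mktemp -d)
mkdir -p "$tmp/raster/r.mapcalc"
touch "$tmp/raster/r.mapcalc/test.r.mapcalc.1.sh"   # matches the convention
touch "$tmp/raster/r.mapcalc/test.r.mapcalc.2.sh"   # matches the convention
touch "$tmp/raster/r.mapcalc/helper.sh"             # does not match (no "test." prefix)
touch "$tmp/raster/r.mapcalc/test.txt"              # does not match (no ".sh" suffix)

# Collect all matching scripts recursively
find "$tmp" -type f -name 'test.*.sh' -exec basename {} \; | sort
# prints:
# test.r.mapcalc.1.sh
# test.r.mapcalc.2.sh

rm -rf "$tmp"
```

Running the same `find` from a subdirectory (e.g. raster/) naturally restricts discovery to that subtree, which matches the recursive "make tests" behaviour described above.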

Annotations are embedded in the test documentation (simple shell comments) and specify the pre-processing steps, the tests, and the type of the data to validate. The following annotations should be supported:

  • The @preprocess annotation
    • All commands below this annotation are handled as preprocessing steps for the tests.
    • If any of the preprocessing commands fail, the framework should stop testing and create a detailed error report.
    • Preprocessing steps may be data generation, region settings and so on.
    • The preprocess annotation is valid until a @test annotation marks the beginning of a test.
    • Preprocess annotations may also appear between tests.
  • The @test annotation
    • All commands below this annotation are handled as tests by the framework
    • The test annotation must be integrated in the comment block which describes the test
    • Data validation is performed by the framework for tests if reference data is present
    • The test annotation is valid until a @preprocess annotation marks the beginning of a preprocess block for further test runs
  • The data type annotations
    • Data type annotations should be specified in the same comment block as the @test annotation
    • Data type annotations specify the GRASS data types which should be validated against reference data
    • The following data type annotations should be supported:
      • @file: the framework should compare the reference data with files
      • @raster: the framework should compare the reference data with raster maps, using r.out.ascii for export
      • @vector: the framework should compare the reference data with vector maps, using v.out.ascii for export
      • @raster3d: the framework should compare the reference data with raster3d maps, using r3.out.ascii for export
      • @color: the framework should compare the reference data with color rules, using r.colors.out for export
      • @table: the framework should compare the reference data with SQL tables, using db.select for export
      • ... please add more
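
To make the conventions above concrete, a sketch of how the framework might pull annotations out of a test script's comments. The sample script and the grep-based parsing are assumptions for illustration, not the framework's actual implementation:

```shell
# Hypothetical sketch: extracting annotations from a test script.
# The sample script follows the naming and annotation conventions above.
f=$(mktemp)
cat > "$f" <<'EOF'
# @preprocess
# Set up the region and generate input data.
g.region n=10 s=0 e=10 w=0 res=1
r.mapcalc expression="input = 1"

# @test
# Add 1 to every cell; validate the result as a raster map.
# @raster
r.mapcalc expression="result = input + 1"
EOF

# List every distinct annotation used in the script
grep -o '@[a-z0-9]*' "$f" | sort -u
# prints:
# @preprocess
# @raster
# @test

rm -f "$f"
```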

Reference data for validation must be located in the module/library directory. The reference data names must be identical to those of the generated data (files, maps, ...), except that reference data always carries a .ref suffix.
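
The .ref convention makes validation a straightforward pairwise comparison. A minimal sketch, assuming plain-text exports and illustrative file names (the real framework would export maps with the modules listed above before comparing):

```shell
# Hypothetical sketch: validating generated data against shipped
# reference data carrying a .ref suffix.
tmp=$(mktemp -d)
printf '1 2 3\n4 5 6\n' > "$tmp/result.txt.ref"   # reference data (shipped with the test)
printf '1 2 3\n4 5 6\n' > "$tmp/result.txt"       # data generated by the test run

# For every reference file, compare it against the generated counterpart
for ref in "$tmp"/*.ref; do
    data=${ref%.ref}                               # strip the .ref suffix
    if diff -q "$data" "$ref" > /dev/null; then
        echo "OK: $(basename "$data")"
    else
        echo "FAIL: $(basename "$data")"
    fi
done
# prints:
# OK: result.txt

rm -rf "$tmp"
```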

Tests live in each module's and each library's folder. To test library functions, special test modules must be implemented. Library test modules exercise the library functions directly and should be written in C. Have a look at the test directories in the g3d, gpde and gmath libraries of GRASS 7.

The framework should be able to generate and compare GRASS data types: raster, vector, raster3d, general, db, icon, imagery, d.*?

The wxGUI should be tested separately.

Automated test runs on a server should generate an HTML report and should cover several platforms.

Test framework

What the test framework should do:

  • Creation of a test mapset for each test case in a dedicated test location within the GRASS sources
  • Setting of the environment variables for the test location and the GRASS installation (the GRASS environment needed to run modules)
  • Parsing, interpretation and execution of test scripts
  • Support of several test scripts in a single module directory
  • Running of location-specific test scripts (e.g. only LL or UTM test scripts)
  • Handling of module exit codes and stderr messages
  • Validation of module output based on reference data and the data type annotations in the test description
  • Creation of an HTML report for single modules and for the whole test run
  • Deletion of the generated test mapset to clean up the test location
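
The steps above can be sketched as a driver loop. Everything here is illustrative: the script names are invented, and the real framework would additionally create and remove the test mapset and emit an HTML report rather than plain text:

```shell
# Hypothetical sketch of the framework's driver loop: run every test
# script, capture exit codes and stderr, and summarize the results.
tmp=$(mktemp -d)
cat > "$tmp/test.example.1.sh" <<'EOF'
exit 0
EOF
cat > "$tmp/test.example.2.sh" <<'EOF'
echo "simulated failure" >&2
exit 1
EOF

pass=0; fail=0
for script in "$tmp"/test.*.sh; do
    if sh "$script" 2> "$script.stderr"; then
        pass=$((pass + 1))
    else
        fail=$((fail + 1))
        echo "FAILED: $(basename "$script"): $(cat "$script.stderr")"
    fi
done
echo "passed=$pass failed=$fail"
# prints:
# FAILED: test.example.2.sh: simulated failure
# passed=1 failed=1

rm -rf "$tmp"
```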

Timeline and status

TBD

Interested people