Unit testing

Author(s): Edison Mera, Pedro López, Manuel Hermenegildo, Alvaro Sevilla San Mateo, Nataliia Stulova, Ignacio Casso, Jose Luis Bueno.

Stability: [beta] Most of the functionality is there but it is still missing some testing and/or verification.


The Ciao assertion language (see The Ciao assertion language) allows writing tests (including unit tests) by means of test assertions. These assertions make it possible to write specific test cases at the predicate level. This library contains predicates that can be used to run tests in modules and gather or pretty-print the results. It also provides some special properties that are convenient when writing tests and the corresponding run-time support.

Writing test assertions

As described in The Ciao assertion language, a test assertion is written as follows:

:- test predicate(A1, A2, ..., An) 
   :  <Precondition>
   => <Postcondition>
   +  <Global properties>
   #  <Comment>.

Where the fields of the test assertion have the usual meaning in Ciao assertions, i.e., they contain conjunctions of properties which must hold at certain points in the execution. Here we give a somewhat more operational ("test oriented") reading to these fields:

  • predicate/n is the predicate to be tested.

  • Precondition is a goal (a literal or a conjunction of literals) that is called before the predicate being tested, and can be used to generate values of the input parameters. While in some other types of assertions these preconditions contain properties to be checked, the typical role of the preconditions here is to provide concrete input values for which the predicate can be actually executed.

  • Postcondition is a goal that should succeed after predicate/n has been called. This is used to test that the output of the predicate is the correct one for the input provided.

  • Global properties specifies some global properties that the predicate should meet, in the same way as in other assertions. For example, not_fails means that the predicate does not fail, exception(error(a,b)) means that the predicate should throw the exception error(a,b), and so on.

  • Comment is a string that documents the test.

The following are some example tests for a complex number evaluator (see Examples (unittest) for the full code):

:- module(ceval2, [ceval/2], [assertions, regtypes, nativeprops]).

:- test ceval(A, B) : (A = c(3, 4) + c(1, 2) - c(2, 3))
    => (B = c(2, 3)) + (not_fails, is_det).

:- test ceval(A, B) : (A = c(3, 4) * c(1, 2) / c(1, 2))
    => (B = c(3.0, 4.0)) + (not_fails, is_det).

ceval(A,   A) :- complex(A), !.
ceval(A+B, C) :- ceval(A, CA), ceval(B, CB), add(CA, CB, C).
ceval(A-B, C) :- ceval(A, CA), ceval(B, CB), sub(CA, CB, C).
ceval(A*B, C) :- ceval(A, CA), ceval(B, CB), mul(CA, CB, C).
ceval(A/B, C) :- ceval(A, CA), ceval(B, CB), div(CA, CB, C).

...

:- regtype complex/1.
:- export(complex/1).

complex(c(A, B)) :-
    num(A),
    num(B).

Test assertions can be combined with other assertions:

:- test ceval(A, B) : (A = c(3, 4) + c(1, 2) - c(2, 3))
    => (B = c(2, 3)) + (not_fails, is_det).
:- test ceval(A, B) : (A = c(3, 4) * c(1, 2) / c(1, 2))
    => (B = c(3.0, 4.0)) + (not_fails, is_det).
:- check pred ceval/2 : gnd * term => gnd * complex.

Test assertions can also take the standard assertion status prefixes. In particular, a status of false can be used to state that a test fails. This can be useful to flag known bugs.

:- false test ceval(A, B) : (A = c(3, 4) + c(1, 2) - c(2, 3))
    => (B = c(2, 3)) + (not_fails, is_det).

Tests with a false (or true) prefix are not run.

There are some specific properties that only apply to testing which are provided in module unittest_props.pl (see Special properties for testing). For example, the limit to the number of solutions to be generated for the tested predicate can be set with the property try_sols(N), a timeout to a test can be set with the property timeout(N), times(N) specifies that the given test should be executed N times, etc.
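For instance, the following sketch combines several of these properties in one test assertion (nrev/2 is a hypothetical list-reversal predicate used here only for illustration; see unittest_props.pl for the exact meaning of each property):

```
% Ask for at most one solution, repeat the test 10 times,
% and apply a timeout to each execution:
:- test nrev(A, B) : (A = [1,2,3])
   => (B = [3,2,1])
   + (not_fails, is_det, try_sols(1), times(10), timeout(1000)).
```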

Unit tests as examples

The special property example can be used to mark the unit test as an example, so that it is documented as such in manuals. The default behavior in lpdoc is to not include the unit tests in manuals unless they are marked this way. For example, the following test would be included in the manual as an example:

:- test ceval(A, B) : (A = c(3, 4) + c(1, 2) - c(2, 3))
    => (B = c(2, 3)) + (not_fails, is_det, example).

Running tests in a bundle

To run all these tests in a given bundle (as well as the other standard tests in the system), run the following (at the top level of the source tree or a bundle):

ciao test

Running the tests from the IDE

A convenient way to run these tests is by selecting options in the CiaoDbg menu within the development environment. This menu offers the following options:

  1. Run tests in current module: execute only the tests specified in the current module.

  2. Run tests in current and all related modules: execute the tests specified in the current module and in all the modules being used by it.

  3. Show untested exported predicates: show the exported predicates that do not have any test assertions.

Running the tests from the top level or programmatically

The tests can also be run from the top level, loading this module (unittest.pl) and calling the appropriate predicates that it exports (see the module Usage and interface section below). This can also be done from a program, provided it imports this module.
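For example, assuming the ceval2.pl module shown earlier is in the current directory, a top-level session could look as follows (a sketch; the exact output format may differ):

```
?- use_module(library(unittest)).
?- run_tests_in_module('ceval2.pl').
```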

Combination with run-time tests

These tests can be combined with the run-time checking of other assertions present in the involved modules. This can be done by including the rtchecks package in the desired modules. Any check assertions present in the code will then be checked dynamically during the execution of the tests and can detect additional errors.
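As a sketch, based on the ceval2 module above, it suffices to add the rtchecks package to the module declaration; any check assertions are then verified dynamically while the tests run:

```
:- module(ceval2, [ceval/2], [assertions, regtypes, nativeprops, rtchecks]).

% This assertion will now be checked at run time during testing:
:- check pred ceval/2 : gnd * term => gnd * complex.
```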

Integration tests

If you need to write tests for predicates that are spread over several modules, but work together, it may be useful to create a separate module, and reexport the predicates required to build the tests. This allows performing integration testing, using the syntax and functionality of the test assertions.
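A minimal sketch of such an integration-test module, assuming two hypothetical modules parser and evaluator that are meant to work together:

```
:- module(calc_itests, [], [assertions, nativeprops]).

% Reexport the predicates under test from the modules involved:
:- reexport(parser,    [parse/2]).
:- reexport(evaluator, [eval/2]).

% An integration test exercising both modules together:
:- test parse_and_eval(S, V) : (S = "1+2") => (V = 3) + (not_fails, is_det).
parse_and_eval(S, V) :- parse(S, E), eval(E, V).
```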


Documentation on exports

Usage:show_untested_exp_preds(Alias)

Show any exported predicates that do not have test assertions. This is an aid towards ensuring that all exported predicates have tests.

  • The following properties should hold at call time:
    (sourcename/1)Alias is a source name.

Usage:run_tests_in_dir_rec(BaseDir,Opts,S)

Executes all the tests in the modules of the given directory and its subdirectories. You can indicate that the modules in a subdirectory should not be tested by placing an empty NOTEST file in that subdirectory. Also, if a NOTESTFILES file is present, containing patterns for modules, those modules will not be tested. The statistical summary of the test results is reflected in S, which is bound to 0 if there are as many successes as expected, and to 1 otherwise.

  • The following properties should hold at call time:
    (pathname/1)BaseDir is a pathname (encoded as an atom)
    (list/2)Opts is a list of test_options.
    (var/1)S is a free variable.

Usage:test_option(Opt)

Opt is a testing option.

A global option that controls the testing system. The current set of options is:

  • stdout(save): save stdout as test output, do not show it
  • stdout(dump): save stdout as test output, then show it
  • stdout(show): just show stdout (do not save it)
  • stdout(null): ignore stdout
  • stderr(Opt): same options as for stdout
  • stderr(stdout): redirect stderr to stdout

  • dump_output: equivalent to stdout(dump)
  • dump_error: equivalent to stderr(dump)

  • rtc_entry: force run-time checking of at least exported assertions even if the runtime_checks flag has not been activated. (This is a workaround, since currently run-time checks cannot be enabled smoothly in system libraries.)

  • treat_related: run tests in the current module and all related modules;

  • dir_rec: run tests in a specified directory recursively.
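For example, such options can be passed in the Opts argument of the test-running predicates; a sketch, assuming ceval2.pl is in the current directory:

```
?- run_tests_in_module('ceval2.pl', [stdout(show), treat_related]).
```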

Usage:test_action(Action)

Action is a testing action.

A global option that specifies a testing routine. The current set of actions is:

  • check: run tests and temporarily save the results in the auto-rewritable module.testout file;

  • show_output: print the testing trace to the standard output;

  • show_stats: print the test result statistics to the standard output;

  • save: save the test results in the module.testout-saved file;

  • briefcompare: check whether the current and saved test output files differ;

  • compare: show the differences between the current and saved test output files, in diff format.

Usage:show_test_output(Alias,Format)

Given a file Alias, tries to look up the corresponding unittest output file and print it to the standard output in the specified Format ('output' for the full test trace, 'stats' for a statistical summary only, 'full' for both); otherwise it emits a warning message stating that no test output file is available.

  • The following properties should hold at call time:
    (sourcename/1)Alias is a source name.
    (atm/1)Format is an atom.


PREDICATE run_tests/3
No further documentation available for this predicate.

PREDICATE run_tests/4
No further documentation available for this predicate. Meta-predicate with arguments: run_tests(?,?,?,pred(1)).

Usage:run_tests_in_module(Alias,Opts,TestSummaries)

Run the tests in module Alias (with options Opts). The results of the tests are returned as data in TestSummaries. TestSummaries can be pretty-printed using show_test_summaries/1 and statistical_summary/2.

  • The following properties should hold at call time:
    (sourcename/1)Alias is a source name.
    (list/2)Opts is a list of test_options.
  • The following properties should hold upon exit:
    (list/1)TestSummaries is a list.

Usage:run_tests_in_module(Alias,Opts)

Run the tests in module Alias. The results of the tests are printed out.

  • The following properties should hold at call time:
    (sourcename/1)Alias is a source name.
    (list/2)Opts is a list of test_options.

Usage:run_tests_in_module(Alias)

Run the tests in module Alias (using default options). The results of the tests are printed out.

  • The following properties should hold at call time:
    (sourcename/1)Alias is a source name.

PREDICATE run_tests_related_modules/1
No further documentation available for this predicate.

Usage:run_tests_related_modules(Alias)

  • The following properties should hold at call time:
    (sourcename/1)Alias is a source name.

(UNDOC_REEXPORT) show_test_summaries/1
Imported from unittest_summaries (see the corresponding documentation for details).

(UNDOC_REEXPORT) statistical_summary/1
Imported from unittest_statistics (see the corresponding documentation for details).

Documentation on multifiles

PREDICATE define_flag/3

Usage:define_flag(Flag,FlagValues,Default)

  • The following properties hold upon exit:
    (atm/1)Flag is an atom.
    (flag_values/1)FlagValues defines the valid flag values.
  The predicate is multifile.

PREDICATE test_filter/2
No further documentation available for this predicate. The predicate is multifile.