Unit testing

Author(s): Edison Mera, Pedro López, Manuel Hermenegildo, Alvaro Sevilla San Mateo, Nataliia Stulova.

The Ciao assertion language (see The Ciao assertion language) allows writing tests (including unit tests) by means of test assertions. These assertions make it possible to write specific test cases at the predicate level. This library contains predicates that can be used to run tests in modules and gather or pretty print the results. It also provides some special properties that are convenient when writing tests and the required run-time support.

Writing test assertions

Recall that a test assertion is written as follows:

:- test predicate(A1, A2, ..., An) 
   :  <Precondition>
   => <Postcondition>
   +  <Global properties>
   #  <Comment>.

Where the fields of the test assertion have the usual meaning in Ciao assertions, i.e., they contain conjunctions of properties which must hold at certain points in the execution. Here we give a somewhat more operational ("test oriented") reading of these fields:

  • predicate/n is the predicate to be tested.

  • Precondition is a goal that is called before the predicate being tested, and can be used to generate values for the input arguments. While in other types of assertions preconditions typically describe a class of possible input values, in tests the idea is to provide concrete input values for which the predicate can actually be executed.

  • Postcondition is a goal that should succeed after predicate/n has been called. This is used to test that the output of the predicate is the correct one for the input provided.

  • Global properties specifies properties that the execution of the predicate should meet, in the same way as in other assertions. For example, not_fails means that the predicate does not fail, exception(error(a,b)) means that the predicate should throw the exception error(a,b), and so on.

  • Comment is a string that documents the test.

The following are some example tests for a complex number evaluator (see Examples (unittest) for the full code):

:- module(ceval2, [ceval/2], [assertions, regtypes, nativeprops]).

:- test ceval(A, B) : (A = c(3, 4) + c(1, 2) - c(2, 3))
        => (B = c(2, 3)) + (not_fails, is_det).

:- test ceval(A, B) : (A = c(3, 4) * c(1, 2) / c(1, 2))
        => (B = c(3.0, 4.0)) + (not_fails, is_det).

ceval(A,   A) :- complex(A), !.
ceval(A+B, C) :- ceval(A, CA), ceval(B, CB), add(CA, CB, C).
ceval(A-B, C) :- ceval(A, CA), ceval(B, CB), sub(CA, CB, C).
ceval(A*B, C) :- ceval(A, CA), ceval(B, CB), mul(CA, CB, C).
ceval(A/B, C) :- ceval(A, CA), ceval(B, CB), div(CA, CB, C).

...

:- regtype complex/1.
:- export(complex/1).

complex(c(A, B)) :-
        num(A),
        num(B).
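
Per the first test assertion above, evaluating the same expression at the top level should bind the result accordingly (assuming the elided arithmetic helper predicates implement the usual complex arithmetic):

?- ceval(c(3, 4) + c(1, 2) - c(2, 3), R).

R = c(2, 3)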

Test assertions can be combined with other assertions:

:- test ceval(A, B) : (A = c(3, 4) + c(1, 2) - c(2, 3))
        => (B = c(2, 3)) + (not_fails, is_det).
:- test ceval(A, B) : (A = c(3, 4) * c(1, 2) / c(1, 2))
        => (B = c(3.0, 4.0)) + (not_fails, is_det).
:- check pred ceval/2 : gnd * term => gnd * complex.

Test assertions can also take the standard assertion status prefixes. In particular, a status of false can be used to state that a test fails. This can be useful for flagging known bugs.

:- false test ceval(A, B) : (A = c(3, 4) + c(1, 2) - c(2, 3))
        => (B = c(2, 3)) + (not_fails, is_det).

Tests with a false (or true) prefix are not run.

Due to the non-determinism of logic programs, the test engine needs to check all the solutions of the predicate, up to given limits (for example, a maximum number of solutions, or a given timeout).

There are some specific properties that only apply to testing which are provided in module unittest_props.pl. For example try_sols(N) specifies that the first N solutions of the predicate predicate/n are tested. times(N) specifies that the given test should be executed N times, etc.
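
For instance, a test for a hypothetical nondeterministic predicate pick/2 (not part of this library; shown only as a sketch, assuming the unittest_props properties are visible in the module) could limit checking to its first two solutions:

:- test pick(X, L) : (L = [1, 2, 3])
        => member(X, L)
        + (not_fails, try_sols(2)).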

Running tests in a bundle

To run all these tests (as well as the other standard tests in the system) run the following command (at the top level of the source tree or of a bundle):

ciao test

Running the tests from the IDE

A convenient way to run these tests is by selecting options in the CiaoDbg menu within the development environment:

  1. Run tests in current module: execute only the tests specified in the current module.

  2. Run tests in current and all related modules: execute the tests specified in the current module and in all the modules it uses.

  3. Show untested exported predicates: show the exported predicates that do not have any test assertion.

Running the tests from the top level or programmatically

The tests can also be run from the top level, by loading this module (unittest.pl) and calling the appropriate predicates that it exports (see the module Usage and Interface section below). This can also be done from a program, provided it imports this module.
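
For example, the tests of the ceval2 module shown earlier could be run from the top level as follows (a sketch; the module argument is a source name, so it may need to be a path in your setup):

?- use_module(library(unittest)).
?- run_tests_in_module(ceval2).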

Integration tests

If you need to write tests for predicates that are spread over several modules but work together, it may be useful to create a separate module that reexports the predicates required to build the tests. This allows performing integration testing, using the same test assertion syntax.
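
As a sketch, with hypothetical module and predicate names, such a wrapper module could look like this:

:- module(calc_itest, [], [assertions, nativeprops]).

:- reexport(parser,    [parse/2]).
:- reexport(evaluator, [eval/2]).

% Integration test: parse an expression, then evaluate the resulting term.
:- test eval(E, V) : (parse("1+2", E)) => (V = 3) + (not_fails, is_det).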


Documentation on exports

Usage:run_tests_in_module(Alias)

Run the tests in module Alias (using default options). The results of the tests are printed out.

  • The following properties should hold at call time:
    (sourcename/1)Alias is a source name.

Usage:run_tests_in_module(Alias,Opts)

Run the tests in module Alias. The results of the tests are printed out.

  • The following properties should hold at call time:
    (sourcename/1)Alias is a source name.
    (list/2)Opts is a list of test_options.

Usage:run_tests_in_module(Alias,Opts,TestSummaries)

Run the tests in module Alias (with options Opts). The results of the tests are returned as data in TestSummaries. TestSummaries can be pretty printed using show_test_summaries/1 and statistical_summary/2.

  • The following properties should hold at call time:
    (sourcename/1)Alias is a source name.
    (list/2)Opts is a list of test_options.
  • The following properties should hold upon exit:
    (list/1)TestSummaries is a list.
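
For example, the tests of the ceval2 module shown earlier could be run programmatically and their results pretty printed afterwards (a sketch):

?- run_tests_in_module(ceval2, [dump_output], Summaries),
   show_test_summaries(Summaries).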

Usage:run_tests_in_dir_rec(BaseDir,Opts)

Runs all the tests in the modules of the given directory and its subdirectories. You can indicate that the modules in a subdirectory should not be tested by placing an empty NOTEST file in that subdirectory. Also, if a NOTESTFILES file is present, containing patterns for modules, the modules matching those patterns will not be tested.

  • The following properties should hold at call time:
    (pathname/1)BaseDir is a pathname (encoded as an atom)
    (list/2)Opts is a list of test_options.
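
For example, to run all the tests under a directory (hypothetical path):

?- run_tests_in_dir_rec('mylib/src', []).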

Usage:run_tests_related_modules(Alias)

Run the tests in module Alias and in all the modules that it uses.

  • The following properties should hold at call time:
    (sourcename/1)Alias is a source name.

PREDICATE run_tests/3
No further documentation available for this predicate.

Usage:show_untested_exp_preds(Alias)

Show any exported predicates that do not have test assertions. This is an aid towards ensuring that all exported predicates have tests.

  • The following properties should hold at call time:
    (sourcename/1)Alias is a source name.

Usage:show_test_summaries(TestSummaries)

Pretty print the test results contained in TestSummaries.

Usage:show_test_output(Alias,Format)

Given a file Alias, tries to look up the corresponding unit test output file and print it to the standard output in the specified Format ('output' for the full test trace, 'stats' for a statistical summary only, 'full' for both); otherwise it emits a warning message saying that no test output file is available.

  • The following properties should hold at call time:
    (sourcename/1)Alias is a source name.
    (atm/1)Format is an atom.

REGTYPE test_option/1

A global option that controls the testing system. The current set of options is:

  • dump_output: Show the standard output of the test execution.

  • dump_error: Show the standard error of the test execution.

  • rtc_entry: Force run-time checking of at least exported assertions even if the flag runtime_checks has not been activated. (This is a workaround, since currently run-time checks cannot be enabled smoothly in system libraries.)

  • treat_related: Run tests in the current module and in all related modules.

  • dir_rec: Run tests in a specified directory recursively.

Usage:test_option(Opt)

Opt is a testing option.

REGTYPE test_action/1

A global option that specifies a testing routine. The current set of actions is:

  • check: run tests and temporarily save the results in the auto-rewritable module.testout file.

  • show_output: print the testing trace to the standard output.

  • show_stats: print the test result statistics to the standard output.

Usage:test_action(Action)

Action is a testing action.