Worker for CiaoPP batch processing

Author(s): Isabel Garcia-Contreras.

This module implements an execution mode for CiaoPP as a worker for ciaopp-batch (using ciaopp_batch(db_analysis) for communication).

Note that this tool has to be run with an external timeout, because the analysis of some modules could require too much memory or time to complete. It must be used together with the Analysis management predicates.

Generated data

  • Analysis information: For each module, abstract analysis information is stored in a .dump file generated in the same location. This information can be restored later.

  • Run-time statistics: Statistics on the time and memory used are stored as a term in a .err file in the same location as the original module. This file also contains the output of the analyzer.

  • Status of the analysis: For each module, information about load and analysis success is stored in data/task_status.pl.

    This also allows the script to be incremental, i.e., it does not repeat the CiaoPP analysis of a module if it has already been done (see the sketch below).

    If the user wants the tool to redo the analysis for all files, task_status.pl has to be removed before starting.

  • last_analyzed_file.pl: This file contains the name of the file that is currently being analyzed.
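
As an illustration of the incremental behavior, the following is a minimal sketch of a driver predicate that skips modules whose analysis already succeeded. The fact format module_status/2 and the predicate names are hypothetical, used only for illustration; the actual contents of data/task_status.pl are defined by ciaopp-batch.

    :- module(_, [needs_analysis/1], []).

    :- use_module(library(read), [read/2]).
    :- use_module(library(system), [file_exists/1]).

    % needs_analysis(ModPath): ModPath has no successful analysis
    % recorded in the status database kept by ciaopp-batch.
    needs_analysis(ModPath) :-
        \+ analyzed_ok(ModPath).

    analyzed_ok(ModPath) :-
        file_exists('data/task_status.pl'),
        open('data/task_status.pl', read, S),
        scan_status(S, ModPath, Found),
        close(S),
        Found = yes.

    % Scan the stored terms, looking for a fact of the assumed
    % (hypothetical) form module_status(ModPath, ok).
    scan_status(S, ModPath, Found) :-
        read(S, Term),
        ( Term = end_of_file -> Found = no
        ; Term = module_status(ModPath, ok) -> Found = yes
        ; scan_status(S, ModPath, Found)
        ).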


Usage and interface

Documentation on exports

No further documentation available for this predicate.

Usage: logged_once_port_reify(Goal,Port,OutString,ErrString)

Executes Goal; Port is the state in which the goal finishes (true or fail). Its standard output and standard error are stored in OutString and ErrString, respectively.

  • The following properties should hold upon exit:
    (string/1) OutString is a string (a list of character codes).
    (string/1) ErrString is a string (a list of character codes).
Meta-predicate with arguments: logged_once_port_reify(goal,?,?,?).
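
A minimal usage sketch follows, assuming it is placed in a module that imports this worker module (the import is omitted since the module path depends on the bundle layout); do_something/0 is a placeholder goal used only for illustration. The captured output is printed afterwards with format/2 from library(format).

    :- use_module(library(format), [format/2]).

    % Placeholder goal whose output we want to capture.
    do_something :-
        format("analyzing module...~n", []).

    run_logged :-
        logged_once_port_reify(do_something, Port, OutString, ErrString),
        % Port indicates how the goal finished (true or fail);
        % OutString and ErrString hold the captured stdout/stderr
        % as lists of character codes.
        format("port: ~w~nstdout: ~s~nstderr: ~s~n",
               [Port, OutString, ErrString]).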

Documentation on internals

Usage: task_status(Status)

Type of analysis status.