Author: Roscoe A. Bartlett (rabartl@sandia.gov)
Date: 2024-11-01
Version: tribits_start-3525-g860f3d0
Abstract: This document describes the internal implementation and the maintenance of the TriBITS project itself. The primary audience is those individuals who will make changes and contributions to the TriBITS project or who just want to understand its implementation details. It contains all of the same information as the TriBITS Users Guide but also includes the file names and line numbers for all of the documented TriBITS macros and functions.
Contents
This document describes the usage and maintenance of the TriBITS (Tribal Build, Integration, Test System) package itself. This document includes a super-set of the material from the TriBITS Users Guide and Reference document. In addition, all of the detailed function and macro documentation blocks include the file names and line numbers where they are implemented in the TriBITS source repository. This makes it easier to navigate the TriBITS source code when developing on TriBITS itself or just trying to understand its implementation.
In order to easily find the most appropriate documentation, see the TriBITS Developer and User Roles guide. This guide describes the different roles that users of TriBITS may play and offers links to relevant sections of the documentation. Additionally, the reader may wish to review the CMake Language Overview and Gotchas section which is meant for users that are new to both CMake and TriBITS. This section gives a brief overview of getting started with CMake and provides some warnings about non-obvious CMake behavior that often trips up new users of TriBITS.
There are roughly five different roles related to TriBITS. These roles require different levels of expertise in CMake and different levels of knowledge of the TriBITS system. The primary roles are 1) TriBITS Project User, 2) TriBITS Project Developer, 3) TriBITS Project Architect, 4) TriBITS System Developer, and 5) TriBITS System Architect. Each of these roles builds on the necessary knowledge of the lower-level roles.
The first role is that of a TriBITS Project User who only needs to be able to configure, build, and test a project that uses TriBITS as its build system. A person acting in this role needs to know little about CMake other than basics about how to run the cmake and ctest executables, how to set CMake cache variables, and the basics of building software by typing make and running tests with ctest. The proper reference for a TriBITS Project User is the Project-Specific Build Reference. The TriBITS Overview document may also be of some help. A TriBITS project user may also need to consult Package Dependencies and Enable/Disable Logic.
A TriBITS Project Developer is someone who contributes to a software project that uses TriBITS. They will add source files, libraries, executables, and test executables, and define tests run with ctest. They have to configure and build the project code in order to be able to develop and run tests, and therefore this role includes all of the necessary knowledge and functions of a TriBITS Project User. A casual TriBITS Project Developer typically does not need to know a lot about CMake and really only needs to know a subset of the TriBITS Macros and Functions defined in the document TriBITS Users Guide and Reference, in addition to the generic TriBITS Build Reference document. A slightly more sophisticated TriBITS Project Developer will also add new packages, add new package dependencies, and define new external packages/TPLs. The TriBITS Users Guide and Reference should supply everything such a developer needs to know and more. However, only a smaller part of that document needs to be understood and accessed by people assuming this role.
The next role is that of a TriBITS Project Architect. This is someone (perhaps only one person on a project development team) who knows the usage and functioning of TriBITS in great detail. They understand how to set up a TriBITS project from scratch, how to set up automated testing using the TriBITS system, and how to use TriBITS to implement the overall software development process. A person in this role is also likely to be the one who makes the initial technical decision for their project to adopt TriBITS for its native build and test system. The document TriBITS Users Guide and Reference, detailed CMake/CTest/CDash documentation provided by Kitware, and great books like Professional CMake should provide most of what a person in this role needs to know. A person assuming this role is the primary audience for a lot of the more advanced material in that document.
The last two roles, TriBITS System Developer and TriBITS System Architect, are for those individuals who actually extend and modify the TriBITS system itself. A TriBITS System Developer needs to know how to add new TriBITS functionality while maintaining backward compatibility, how to add new unit tests for the TriBITS system (see The TriBITS Test Package), and how to perform other related tasks. Such a developer needs to be very knowledgeable about the basic functioning of CMake and know how TriBITS is implemented in the CMake language. A TriBITS System Architect is someone who must be consulted on almost all non-trivial changes or additions to the TriBITS system. A TriBITS System Architect additionally needs to know the entire TriBITS system and the design philosophy that provides the foundation for TriBITS, and be an expert in CMake, CTest, and CDash. Much of what needs to be known by a TriBITS System Developer and a TriBITS System Architect is contained in the document TriBITS Maintainers Guide and Reference. The rest of the primary documentation for these roles is in the TriBITS CMake source code itself and the various unit tests defined in The TriBITS Test Package. At the time of this writing, there is only one TriBITS System Architect (who also happens to be the primary author of this document).
An explicit goal of the document TriBITS Users Guide and Reference is to foster the creation of new TriBITS Project Architects (i.e. those who would make the decision to adopt TriBITS for their projects). An explicit goal of the document TriBITS Maintainers Guide and Reference is to foster the creation of new TriBITS System Developers to help extend and maintain the TriBITS package itself.
Depending on the particular role that a reader falls into, this document may not be necessary and the TriBITS Overview or the <Project>BuildReference documents may be more appropriate. Hopefully the above roles and discussion help the reader select the right document to start with.
NOTE: Before getting started with TriBITS, if a reader is unfamiliar with CMake, please review the CMake Language Overview and Gotchas. Once those CMake basics and common gotchas have been reviewed, we get into the meat of TriBITS, starting with the software engineering principles that lie at the foundation of TriBITS. That is followed by an overview of the overall structure of a TriBITS project.
The design of TriBITS takes into account standard software engineering packaging principles. In his book [Agile Software Development, 2003], Robert Martin defines several software engineering principles related to packaging software which are listed below:

* REP (Release-Reuse Equivalency Principle): The granule of reuse is the granule of release.
* CCP (Common Closure Principle): The classes in a package should be closed together against the same kinds of changes.
* CRP (Common Reuse Principle): The classes in a package are reused together; if you reuse one of the classes in a package, you reuse them all.
* ADP (Acyclic Dependencies Principle): Allow no cycles in the package dependency graph.
* SDP (Stable Dependencies Principle): Depend in the direction of stability.
* SAP (Stable Abstractions Principle): A package should be as abstract as it is stable.

Any of these six OO packaging principles (and other issues) may be considered when deciding how to partition software into different TriBITS Packages.
NOTE: The purpose of this TriBITS Developers Guide document is not to teach basic software engineering, so these various principles will not be expanded on further. However, interested readers are strongly encouraged to read [Agile Software Development, 2003] as one of the better software engineering books out there (see https://bartlettroscoe.github.io/reading-list/#most_recommended_se_books).
TriBITS is a Framework, implemented in CMake, to create CMake projects. As a Software Framework, TriBITS defines the overall structure of a CMake build system for a project and it processes the various project-, repository-, and package-specific files in a specified order. Almost all of this processing takes place in the tribits_project() macro (or macros and functions it calls). The following subsections define the essence of the TriBITS framework in some detail. Later sections cover specific topics and the various sections link to each other. Within this section, the subsection TriBITS Structural Units defines the basic units TriBITS Project, TriBITS Repository, TriBITS Package, TriBITS External Package/TPL and other related structural units. The subsection Processing of TriBITS Files: Ordering and Details defines exactly what files TriBITS processes and in what order. It also shows how to get TriBITS to show exactly what files it is processing to help in debugging issues. The subsection Coexisting Projects, Repositories, and Packages gives some of the rules and constraints for how the different structural units can co-exist in the same directories. The last two subsections in this section cover Standard TriBITS TPLs and Common TriBITS TPLs.
A CMake project that uses TriBITS as its build and test system is composed of a single TriBITS Project, one or more TriBITS Repositories and one or more TriBITS Packages. In addition, a TriBITS Package can be broken up into TriBITS Subpackages. Together, the collection of TriBITS Packages and TriBITS Subpackages are called TriBITS Software Engineering Packages, or TriBITS Packages for short.
First, to better establish the basic nomenclature, the key structural TriBITS units are:
In this document, dependencies are described as either being upstream or downstream/forward defined as:
The following subsections define the major structural units of a TriBITS project in more detail. Each structural unit is described along with the files and directories associated with each. In addition, a key set of TriBITS CMake variables for each are defined as well.
In the next major section following this one, some Example TriBITS Projects are described. For those who just want to jump in and learn best by example, these example projects are a good way to start. These example projects will be referenced in the more detailed descriptions given in this document.
The last issue to touch on before getting into the detailed descriptions of the different TriBITS structural units is the issue of how CMake variables are defined and used by TriBITS. The CMake variables described in the TriBITS structural units below fall into one of two major types:
More information about these various files is described in section Processing of TriBITS Files: Ordering and Details.
A TriBITS Project:
For more details on the definition of a TriBITS Project, see:
The core files making up a TriBITS Project (where <projectDir> = ${PROJECT_SOURCE_DIR}) are:
<projectDir>/
  ProjectName.cmake                        # Defines PROJECT_NAME
  CMakeLists.txt                           # Base project CMakeLists.txt file
  CTestConfig.cmake                        # [Optional] Needed for CDash submits
  Version.cmake                            # [Optional] Dev mode, Project version, VC branch
  project-checkin-test-config.py           # [Optional] checkin-test.py config
  cmake/
    NativeRepositoriesList.cmake           # [Optional] Rarely used
    ExtraRepositoriesList.cmake            # [Optional] Lists repos and VC URLs
    ProjectCiFileChangeLogic.py            # [Optional] CI global change/test logic
    ProjectCompilerPostConfig.cmake        # [Optional] Override/tweak build flags
    ProjectDependenciesSetup.cmake         # [Optional] Project deps overrides
    CallbackDefineProjectPackaging.cmake   # [Optional] CPack settings
    tribits/                               # [Optional] Or provide ${PROJECT_NAME}_TRIBITS_DIR
    ctest/
      CTestCustom.cmake.in                 # [Optional] Custom ctest settings
These TriBITS Project files are documented in more detail below:
<projectDir>/ProjectName.cmake: [Required] At a minimum, provides a set() statement to set the local variable PROJECT_NAME. This file is the first file that is read by a number of tools in order to get the TriBITS project's name. This file is read first in every context that involves processing the TriBITS project's files, including processes and tools that just need to build the package dependency tree (see Reduced Package Dependency Processing). Since this is the first file read in for a TriBITS project, and since it is read in at the top-level scope in every context, this is a good file in which to put other universal static project options. Note that this is a project file, not a repository file, so no repository-specific settings should go in this file. A simple example of this file is TribitsExampleProject/ProjectName.cmake:
# Must set the project name at very beginning before including anything else
set(PROJECT_NAME TribitsExProj)

# Turn on export dependency generation for WrapExteranl package
set(${PROJECT_NAME}_GENERATE_EXPORT_FILE_DEPENDENCIES_DEFAULT ON)

# Turn on by default the generation of the export files
set(${PROJECT_NAME}_ENABLE_INSTALL_CMAKE_CONFIG_FILES_DEFAULT ON)
A meta-project's ProjectName.cmake file might have a number of other variables set that define how the various different TriBITS repos are cobbled together into a single TriBITS meta-project (usually because the different TriBITS repos and packages are a bit messy and have other issues). For example, the CASL VERA TriBITS meta-project at one point had a very extensive collection of set statements in this file.
<projectDir>/CMakeLists.txt: [Required] The top-level CMake project file. This is the first file that the cmake executable processes, the one that starts everything off, and it is the base-level scope for local (non-cache) CMake variables. Due to a few CMake limitations and quirks, a project's top-level CMakeLists.txt file is not quite as clean as one might otherwise hope, but it is not too bad. A simple, but representative, example is TribitsExampleProject/CMakeLists.txt:
################################################################################
#                                                                              #
#                           TribitsExampleProject                              #
#                                                                              #
################################################################################

# To be safe, define your minimum CMake version.  This may be newer than the
# min required by TriBITS.
cmake_minimum_required(VERSION 3.23.0 FATAL_ERROR)

# Make CMake set WIN32 with CYGWIN for older CMake versions.  CMake requires
# this to be in the top-level CMakeLists.txt file and not an include file :-(
set(CMAKE_LEGACY_CYGWIN_WIN32 1 CACHE BOOL "" FORCE)

#
# A) Define your project name and set up major project options
#
# NOTE: Don't set options that would impact what packages get defined or
# enabled/disabled in this file as that would not impact other tools that
# don't process this file.
#

# Get PROJECT_NAME (must be in a file for other parts of system to read)
include("${CMAKE_CURRENT_SOURCE_DIR}/ProjectName.cmake")

# CMake requires that you declare the CMake project in the top-level file and
# not in an include file :-(
project(${PROJECT_NAME} NONE)

set(TRIBITS_HIDE_DEPRECATED_INCLUDE_DIRECTORIES_OVERRIDE TRUE)

#
# B) Pull in the TriBITS system and execute
#

set(${PROJECT_NAME}_TRIBITS_DIR "${CMAKE_CURRENT_LIST_DIR}/../.." CACHE STRING
  "TriBITS base directory (default assumes in TriBITS source tree)")
include("${${PROJECT_NAME}_TRIBITS_DIR}/TriBITS.cmake")

# Set default location for header-only TPL to make easy to configure out of
# the TriBITS source tree.
set(HeaderOnlyTpl_INCLUDE_DIRS
  "${${PROJECT_NAME}_TRIBITS_DIR}/examples/tpls/HeaderOnlyTpl"
  CACHE PATH "Default set by TriBITS/CMakeLists.txt" )

# Do all of the processing for this Tribits project
tribits_project()
A couple of CMake and TriBITS quirks that the above example CMakeLists.txt addresses are worth some discussion. First, to avoid duplication, the project's ProjectName.cmake file is read in with an include() that defines the local variable PROJECT_NAME. Right after this initial include, the built-in CMake command project(${PROJECT_NAME} NONE) is run. This command must be explicitly called with NONE so as to avoid default CMake behavior for defining compilers. The definition of compilers comes later as part of the TriBITS system inside of the tribits_project() command (see Full Processing of TriBITS Project Files).
As noted in the above example file, the only project defaults that should be set in this top-level CMakeLists.txt file are those that do not impact the list of package enables/disables. The latter type of defaults should be set in other files (see below).
In this example project, a CMake cache variable ${PROJECT_NAME}_TRIBITS_DIR must be set by the user to define where the base tribits source directory is located. With this variable set (e.g. passed on the cmake command-line using -DTribitsExProj_TRIBITS_DIR=<someDir>), one just includes a single file to pull in the TriBITS system:
include("${${PROJECT_NAME}_TRIBITS_DIR}/TriBITS.cmake")
With the TriBITS.cmake file included, the configuration of the project using TriBITS occurs with a single call to tribits_project().
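For reference, a basic configure of TribitsExampleProject from a separate build directory might look something like the following (the paths here are placeholders and not part of the example project):

  cmake \
    -D TribitsExProj_TRIBITS_DIR=<path-to-TriBITS>/tribits \
    <path-to-TribitsExampleProject>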
Some projects, like Trilinos, actually snapshot the TriBITS/tribits/ directory into their source tree <projectDir>/cmake/tribits/ and therefore don't need to have this variable set. In Trilinos, the include line is just:
include(${CMAKE_CURRENT_SOURCE_DIR}/cmake/tribits/TriBITS.cmake)
The minimum CMake version must also be declared in the top-level CMakeLists.txt file as shown. Explicitly setting the minimum CMake version avoids strange errors that can occur when someone tries to build the project using a version of CMake that is too old. The project should set the minimum CMake version based on the CMake features used in that project's own CMake files. The minimum CMake version required by TriBITS is defined in the variable TRIBITS_CMAKE_MINIMUM_REQUIRED (the current minimum version of CMake required by TriBITS is given in Getting set up to use CMake). For example, the VERA/CMakeLists.txt file starts with:
set(VERA_TRIBITS_CMAKE_MINIMUM_REQUIRED 3.23.0)
cmake_minimum_required(VERSION ${VERA_TRIBITS_CMAKE_MINIMUM_REQUIRED} FATAL_ERROR)
<projectDir>/CTestConfig.cmake: [Optional] Specifies the CDash site and project to submit results to when doing an automated build driven by the CTest driver function tribits_ctest_driver() (see TriBITS CTest/CDash Driver). This file is also required to use the TriBITS-generated dashboard target (see Dashboard Submissions). An example of this file is TribitsExampleProject/CTestConfig.cmake:
include(SetDefaultAndFromEnv)

set(CTEST_NIGHTLY_START_TIME "04:00:00 UTC") # 10 PM MDT or 9 PM MST

if (NOT DEFINED CTEST_DROP_METHOD)
  set_default_and_from_env(CTEST_DROP_METHOD "https")
endif()

if (CTEST_DROP_METHOD STREQUAL "http" OR CTEST_DROP_METHOD STREQUAL "https")
  set_default_and_from_env(CTEST_DROP_SITE "my.cdash.org")
  set_default_and_from_env(CTEST_PROJECT_NAME "TribitsExampleProject")
  set_default_and_from_env(CTEST_DROP_LOCATION
    "/submit.php?project=TribitsExampleProject")
  set_default_and_from_env(CTEST_TRIGGER_SITE "")
  set_default_and_from_env(CTEST_DROP_SITE_CDASH TRUE)
endif()
Most of the variables set in this file are directly understood by raw ctest and those variables will not be explained further here (see the documentation for the standard CMake module CTest). The usage of the function set_default_and_from_env() allows the variables to be overridden both as CMake cache variables and in the environment. The latter is needed when running with ctest as the driver (since older versions of ctest did not support -D<var-name>:<type>=<value> command-line arguments like cmake does). Given that all of these variables are nicely namespaced, overriding them in the shell environment is not as dangerous as might otherwise be the case, but this is what had to be done to get around limitations of older versions of CMake/CTest.
NOTE: One can also set:
set_default_and_from_env(TRIBITS_2ND_CTEST_DROP_SITE ...)
set_default_and_from_env(TRIBITS_2ND_CTEST_DROP_LOCATION ...)
in this file in order to submit to a second CDash site/location. For details, see Dashboard Submissions. This is useful when considering a CDash upgrade and/or implementing new CDash features or tweaks.
<projectDir>/Version.cmake: If defined, gives the project's version and determines development/release mode (see Project and Repository Versioning and Release Mode). This file is read in (using include()) in the project's base-level <projectDir>/CMakeLists.txt file scope so local variables set in this file are seen by the entire CMake project. For example, TribitsExampleProject/Version.cmake, looks like:
set(${REPOSITORY_NAME}_VERSION 1.1)
set(${REPOSITORY_NAME}_MAJOR_VERSION 01)
set(${REPOSITORY_NAME}_MAJOR_MINOR_VERSION 010100)
set(${REPOSITORY_NAME}_VERSION_STRING "1.1 (Dev)")
set(${REPOSITORY_NAME}_ENABLE_DEVELOPMENT_MODE_DEFAULT ON) # Change to 'OFF' for a release
When this file exists in the base project, these will be used to create standard SOVERSION symlinks to shared libs. For example, on Linux, in addition to the real shared lib lib<libname>.so, the standard SOVERSION symlinks are created like:
lib<libname>.so.01
lib<libname>.so.1.1
When this file exists at the repository level, the prefix ${REPOSITORY_NAME}_ is used instead of hard-coding the project name. This is so that the same Version.cmake file can be used as the <repoDir>/Version.cmake file and have the repository name be flexible. TriBITS sets REPOSITORY_NAME = ${PROJECT_NAME} when it reads in this file at the project-level scope.
It is strongly recommended that every TriBITS project contain a Version.cmake file, even if a release has never occurred. Otherwise, the project needs to define the variable ${PROJECT_NAME}_ENABLE_DEVELOPMENT_MODE_DEFAULT at the global project scope (perhaps in <projectDir>/ProjectName.cmake) to get the correct development-mode behavior.
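For example, a project that chooses not to provide a Version.cmake file might add something like the following to its <projectDir>/ProjectName.cmake file (a minimal sketch; the value OFF here is just an illustration of a release-like default):

  set(${PROJECT_NAME}_ENABLE_DEVELOPMENT_MODE_DEFAULT OFF)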
<projectDir>/project-checkin-test-config.py: [Optional] Used to define the --default-builds and other project-level configuration options for the project's usage of the checkin-test.py tool. Machine or package-specific options should not be placed in this file. An example of this file for TribitsExampleProject/project-checkin-test-config.py is shown below:
#
# Define project-specific options for the checkin-test script for
# TribitsExampleProject.
#

configuration = {

  # Default command line arguments
  'defaults': {
    '--send-email-to-on-push': 'trilinos-checkin-tests@software.sandia.gov',
    },

  # CMake options (-DVAR:TYPE=VAL) cache variables.
  'cmake': {

    # Options that are common to all builds.
    'common': [],

    # Defines --default-builds, in order.
    'default-builds': [

      # Options for the MPI_DEBUG build.
      ('MPI_DEBUG', [
        '-DTPL_ENABLE_MPI:BOOL=ON',
        '-DCMAKE_BUILD_TYPE:STRING=RELEASE',
        '-DTribitsExProj_ENABLE_DEBUG:BOOL=ON',
        '-DTribitsExProj_ENABLE_CHECKED_STL:BOOL=ON',
        '-DTribitsExProj_ENABLE_DEBUG_SYMBOLS:BOOL=ON',
        ]),

      # Options for the SERIAL_RELEASE build.
      ('SERIAL_RELEASE', [
        '-DTPL_ENABLE_MPI:BOOL=OFF',
        '-DCMAKE_BUILD_TYPE:STRING=RELEASE',
        '-DTribitsExProj_ENABLE_DEBUG:BOOL=OFF',
        '-DTribitsExProj_ENABLE_CHECKED_STL:BOOL=OFF',
        ]),

      ], # default-builds

    }, # cmake

  } # configuration
The contents of the file project-checkin-test-config.py shown above are pretty self-explanatory. This file defines a single Python dictionary data-structure called configuration which gives some default arguments in defaults, and then cmake options that define the project's --default-builds. For more details, see the section Pre-push Testing using checkin-test.py.
<projectDir>/cmake/NativeRepositoriesList.cmake: [Deprecated] If present, this file gives the list of native repositories for the TriBITS project. The file must contain a set() statement defining the variable ${PROJECT_NAME}_NATIVE_REPOSITORIES which is just a flat list of repository names that must also be directory names under <projectDir>/. For example, if this file contains:
set(${PROJECT_NAME}_NATIVE_REPOSITORIES Repo0 Repo1)
then the directories <projectDir>/Repo0/ and <projectDir>/Repo1/ must exist and must be valid TriBITS repositories (see TriBITS Repository).
There are no examples for the usage of this file in any of the TriBITS examples or test projects. However, support for this file is maintained for backward compatibility since there may be some TriBITS projects that still use it. It is recommended instead to define multiple repositories using the <projectDir>/cmake/ExtraRepositoriesList.cmake file as it allows for more flexibility in how extra repositories are specified and how they are accessed. The latter file allows the various tools to perform version control (VC) activities with these repos while "native repositories" do not.
If this file NativeRepositoriesList.cmake does not exist, then TriBITS sets ${PROJECT_NAME}_NATIVE_REPOSITORIES equal to ".", or the base project directory (i.e. <projectDir>/.). In this case, the file <projectDir>/PackagesList.cmake and <projectDir>/TPLsList.cmake must exist. However, if the project has no native packages or external packages/TPLs, then these files can be set up with empty lists. This is the case for meta-projects like CASL VERA that have only extra repositories specified in the file <projectDir>/cmake/ExtraRepositoriesList.cmake.
<projectDir>/cmake/ExtraRepositoriesList.cmake: [Optional] If present, this file defines a list of extra repositories that are added on to the project's native repositories. The list of repositories is defined using the macro tribits_project_define_extra_repositories(). For example, the extra repos file:
tribits_project_define_extra_repositories(
  ExtraRepo1  ""                         GIT  someurl.com:/ExtraRepo1   ""          Continuous
  ExtraRepo2  packages/SomePackage/Blah  GIT  someurl2.com:/ExtraRepo2  NOPACKAGES  Nightly
  ExtraRepo3  ""                         HG   someurl3.com:/ExtraRepo3  ""          Continuous
  ExtraRepo4  ""                         SVN  someurl4.com:/ExtraRepo4  ""          Nightly
  )
shows the specification of both TriBITS Repositories and non-TriBITS VC Repositories. In the above file, the repositories ExtraRepo1, ExtraRepo3, and ExtraRepo4 are both TriBITS and VC repositories that are cloned into directories under <projectDir> of the same names from the URLs someurl.com:/ExtraRepo1, someurl3.com:/ExtraRepo3, and someurl4.com:/ExtraRepo4, respectively. However, the repository ExtraRepo2 is not a TriBITS Repository because it is marked as NOPACKAGES. In this case, it gets cloned as the directory:
<projectDir>/packages/SomePackage/Blah
However, the code in the tools checkin-test.py and tribits_ctest_driver() will still consider non-TriBITS VC repos like ExtraRepo2, and any changes to this repository will be listed as changes to SomePackage (see Pre-push Testing using checkin-test.py).
NOTE: This file can be overridden by setting the cache variable <Project>_EXTRAREPOS_FILE.
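For example, to point the project at an alternate extra repositories file, one might configure with an option like the following (the file name here is hypothetical):

  -D <Project>_EXTRAREPOS_FILE=<some-dir>/MyExtraRepositoriesList.cmake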
<projectDir>/cmake/ProjectCiFileChangeLogic.py: [Optional] If present, then this Python module is imported and the Python class ProjectCiFileChangeLogic defined in it is used to determine which files need to trigger a global rebuild of the project enabling all packages.
An example of this given in the file TribitsExampleProject/cmake/ProjectCiFileChangeLogic.py:
#
# Specialized logic for what file changes should trigger a global build in CI
# testing where testing should only occur package impacted by the change.
#

class ProjectCiFileChangeLogic:

  def isGlobalBuildFileRequiringGlobalRebuild(self, modifiedFileFullPath):
    modifiedFileFullPathArray = modifiedFileFullPath.split('/')
    lenPathArray = len(modifiedFileFullPathArray)
    if lenPathArray==1:
      # Files sitting directly under <projectDir>/
      if modifiedFileFullPathArray[0] == "CMakeLists.txt":
        return True
      if modifiedFileFullPathArray[0].rfind(".cmake") != -1:
        return True
    elif modifiedFileFullPathArray[0] == 'cmake':
      # Files under <projectDir>/cmake/
      if modifiedFileFullPathArray[1]=='ExtraRepositoriesList.cmake':
        return False
      elif modifiedFileFullPathArray[1] == 'ctest' and lenPathArray >= 3:
        if lenPathArray > 3:
          # This is a file
          # <projectDir>/cmake/ctest/<something>/[...something...] so this is
          # for a specific machine and should not trigger a global build.
          return False
        else:
          # Any other file directly under cmake/ctest/ should trigger a global
          # build.
          return True
      else:
        # All other files under cmake/
        if modifiedFileFullPath.rfind(".cmake") != -1:
          # All other *.cmake files under cmake/ trigger a global build.
          return True
    # Any other files should not trigger a global build
    return False
This logic is used in all code that is used in CI testing including checkin-test.py, tribits_ctest_driver() and get-tribits-packages-from-files-list.py. If this file does not exist, then TriBITS has some default logic which may or may not be sufficient for the needs of a given project.
<projectDir>/cmake/ProjectCompilerPostConfig.cmake: [Optional] If present, then this file is read using include() at the top-level CMakeLists.txt file scope right after the compilers for the languages <LANG> = C, CXX, and Fortran are determined and checked using enable_language(<LANG>) but before any other checks are performed. This file can contain logic for the project to adjust the flags set in CMAKE_<LANG>_FLAGS and changes to other aspects of the build flags (including link flags, etc.).
One example of the usage of this file is the Trilinos project where this file is (or was) used to apply specialized logic implemented in the Kokkos build system to select compiler options and to determine how C++11 and OpenMP flags are set. This file in Trilinos looked like:
if (${Trilinos_ENABLE_Kokkos})
  ...
  include(${Kokkos_GEN_DIR}/kokkos_generated_settings.cmake)
  if (NOT KOKKOS_ARCH STREQUAL "None")
    set(CMAKE_CXX_FLAGS "${CMAKE_CXX_FLAGS} ${KOKKOS_CXX_FLAGS}")
    message("-- " "Skip adding flags for OpenMP because Kokkos flags does that ...")
    set(OpenMP_CXX_FLAGS_OVERRIDE " ")
  endif()
endif()
The exact context where this file is processed (if it exists) is described in Full Processing of TriBITS Project Files and TriBITS Environment Probing and Setup.
<projectDir>/cmake/ProjectDependenciesSetup.cmake: [Optional] If present, this file is included a single time as part of the generation of the project's dependency data-structure (see Reduced Package Dependency Processing). It gets included at the top project level scope after all of the <repoDir>/cmake/RepositoryDependenciesSetup.cmake files have been included but before all of the package <packageDir>/cmake/Dependencies.cmake files are included. Any local variables set in this file have project-wide scope. The primary purpose for this file is to set variables that will impact the processing of project's package Dependencies.cmake files.
The typical usage of this file is to set the default CDash email address for all packages or to override the CDash regression email addresses for all of a repository's packages (see CDash regression email addresses). For example, to set the default email address for all of the packages, one would set in this file:
set_default(${PROJECT_NAME}_PROJECT_MASTER_EMAIL_ADDRESS projectx-regressions@somemailserver.org)
The repository email address variables ${REPOSITORY_NAME}_REPOSITORY_EMAIL_URL_ADDRESS_BASE and ${REPOSITORY_NAME}_REPOSITORY_MASTER_EMAIL_ADDRESS possibly set in the just processed <repoDir>/cmake/RepositoryDependenciesSetup.cmake files can also be overridden in this file. The CASL VERA meta-project uses this file to override several of the repository-specific email addresses for its constituent repositories.
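For example, a meta-project could override the master regression email address for one of its repositories with something like the following (the repository name RepoX and the email address are hypothetical placeholders):

  set(RepoX_REPOSITORY_MASTER_EMAIL_ADDRESS metaproject-repox-regressions@somemailserver.org)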
In general, variables that affect how package dependencies are defined or affect package enable/disable logic for only this particular project should be defined in this file.
<projectDir>/cmake/CallbackDefineProjectPackaging.cmake: [Optional] If it exists, this file defines the CPack settings for the project (see the Official CPack Documentation and the Online CPack Wiki). This file must define a macro called tribits_project_define_packaging() which is then invoked by TriBITS. The file:
TribitsExampleProject/cmake/CallbackDefineProjectPackaging.cmake
provides a good example which is:
macro(TRIBITS_PROJECT_DEFINE_PACKAGING)

  tribits_copy_installer_resource(TribitsExProj_README
    "${TribitsExProj_SOURCE_DIR}/README.md"
    "${TribitsExProj_BINARY_DIR}/README.md")
  tribits_copy_installer_resource(TribitsExProj_LICENSE
    "${TribitsExProj_SOURCE_DIR}/LICENSE"
    "${TribitsExProj_BINARY_DIR}/LICENSE.txt")

  set(CPACK_PACKAGE_DESCRIPTION
    "TribitsExampleProject just shows you how to use TriBITS.")
  set(CPACK_PACKAGE_FILE_NAME "tribitsexproj-setup-${TribitsExProj_VERSION}")
  set(CPACK_PACKAGE_INSTALL_DIRECTORY "TribitsExProj ${TribitsExProj_VERSION}")
  set(CPACK_PACKAGE_REGISTRY_KEY "TribitsExProj ${TribitsExProj_VERSION}")
  set(CPACK_PACKAGE_NAME "tribitsexproj")
  set(CPACK_PACKAGE_VENDOR "Sandia National Laboratories")
  set(CPACK_PACKAGE_VERSION "${TribitsExProj_VERSION}")
  set(CPACK_RESOURCE_FILE_README "${TribitsExProj_README}")
  set(CPACK_RESOURCE_FILE_LICENSE "${TribitsExProj_LICENSE}")
  set(${PROJECT_NAME}_CPACK_SOURCE_GENERATOR_DEFAULT "TGZ;TBZ2")
  set(CPACK_SOURCE_FILE_NAME "tribitsexproj-source-${TribitsExProj_VERSION}")
  set(CPACK_COMPONENTS_ALL ${TribitsExProj_PACKAGES} Unspecified)

endmacro()
The CPack variables shown above that should be defined at the project level are described in the Official CPack Documentation.
Settings that are general for all distributions (like non-package repository files to exclude from the tarball) should be set in the file <repoDir>/cmake/CallbackDefineRepositoryPackaging.cmake. See Creating Source Distributions for more details.
<projectDir>/cmake/tribits/: [Optional] This is the typical location of the TriBITS/tribits/ source tree for projects that choose to snapshot TriBITS into their source tree. In fact, TriBITS assumes this is the default location for the TriBITS source tree if ${PROJECT_NAME}_TRIBITS_DIR is not otherwise specified. Trilinos, for example, currently snapshots the TriBITS source tree into this directory. See TriBITS directory snapshotting for more details.
<projectDir>/cmake/ctest/CTestCustom.cmake.in: [Optional] If this file exists, it is processed using a configure_file() command to write the file CTestCustom.cmake in the project's base build directory ${PROJECT_BINARY_DIR}/. This file is picked up automatically by ctest (see the CTest documentation). This file is typically used to change the maximum size of test output. For example, the TribitsExampleProject/cmake/ctest/CTestCustom.cmake.in looks like:
# Increase the amount of output being produced
set(CTEST_CUSTOM_MAXIMUM_PASSED_TEST_OUTPUT_SIZE 54321)
set(CTEST_CUSTOM_MAXIMUM_FAILED_TEST_OUTPUT_SIZE 123456)

# NOTE: Above, we use these numbers to make it easy to determine that these
# values are getting read correctly in various modes of running
# tribits_ctest_driver().  In a real project, you could set something like:
#
#   set(CTEST_CUSTOM_MAXIMUM_PASSED_TEST_OUTPUT_SIZE 50000)
#   set(CTEST_CUSTOM_MAXIMUM_FAILED_TEST_OUTPUT_SIZE 5000000)
which changes the maximum size of the test output submitted to CDash for passed and failed tests (the odd values here just make it easy for the TriBITS test suite to verify that they get read correctly). The values used by Trilinos at one time were:
set(CTEST_CUSTOM_MAXIMUM_PASSED_TEST_OUTPUT_SIZE 50000)
set(CTEST_CUSTOM_MAXIMUM_FAILED_TEST_OUTPUT_SIZE 5000000)
which sets the maximum output for passed and failed tests to 50000 and 5000000 bytes (about 50 KB and 5 MB), respectively.
For documentation of the options one can change for CTest, see online CTest documentation.
The following local variables are defined in the top-level Project CMakeLists.txt file scope and are therefore accessible by all files processed by TriBITS:
PROJECT_NAME

  The name of the TriBITS Project. This exists to support, among other things, the ability for subordinate units (Repositories and Packages) to determine the Project in which they are participating. This is typically read from a set() statement in the project's <projectDir>/ProjectName.cmake file.

PROJECT_SOURCE_DIR

  The absolute path to the base Project source directory. This is set automatically by TriBITS, from the source directory passed to cmake at configure time, at the beginning of the tribits_project() macro.

PROJECT_BINARY_DIR

  The absolute path to the base Project binary/build directory. This is set automatically by TriBITS, to the directory that cmake is run from, at the beginning of the tribits_project() macro.

${PROJECT_NAME}_SOURCE_DIR

  Set to the same directory as ${PROJECT_SOURCE_DIR} automatically by the built-in project() command called in the top-level <projectDir>/CMakeLists.txt file.

${PROJECT_NAME}_BINARY_DIR

  Set to the same directory as ${PROJECT_BINARY_DIR} automatically by the built-in project() command called in the top-level <projectDir>/CMakeLists.txt file.
The following cache variables are defined for every TriBITS project:
${PROJECT_NAME}_TRIBITS_DIR

  CMake cache variable that gives the path to the TriBITS implementation directory. When set to a relative path (set as type STRING, see below), this is taken relative to ${CMAKE_CURRENT_SOURCE_DIR}/ (the project base source dir). When an absolute path is given, it is used without modification. If this variable is not set in the <projectDir>/CMakeLists.txt file, then it will be automatically set as a PATH cache variable by the include of TriBITS.cmake by the statement:

    set( ${PROJECT_NAME}_TRIBITS_DIR "${CMAKE_CURRENT_SOURCE_DIR}/cmake/tribits"
      CACHE PATH "...")

  Therefore, projects that snapshot TriBITS into <projectDir>/cmake/tribits/ don't need to explicitly set ${PROJECT_NAME}_TRIBITS_DIR. In addition, one can also point to a different TriBITS implementation just by setting the absolute path:

    -D <Project>_TRIBITS_DIR=<some-abs-dir>

  or to a relative path using, for example:

    -D <Project>_TRIBITS_DIR:STRING=TriBITS/tribits

  Note that when the TriBITS git repo itself is cloned by a TriBITS project, then ${PROJECT_NAME}_TRIBITS_DIR should be set to the directory TriBITS/tribits (see TriBITS/tribits/) as shown above.
${PROJECT_NAME}_ENABLE_TESTS

  CMake cache variable that, if set to ON, turns on the tests for all explicitly enabled packages. This has a default value of OFF. This is used in logic to enable individual package tests (see <Project>_ENABLE_TESTS only enables explicitly enabled package tests and the example below).

${PROJECT_NAME}_ENABLE_EXAMPLES

  CMake cache variable that, if set to ON, turns on the examples for all explicitly enabled packages. This has a default value of OFF.
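For example, a configure of TribitsExampleProject that enables a single package and its tests might pass options like the following (the exact set of options is just an illustration; SimpleCxx is one of the example project's packages):

  -D TribitsExProj_ENABLE_SimpleCxx=ON \
  -D TribitsExProj_ENABLE_TESTS=ON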
The following internal project-scope local (non-cache) CMake variables are defined by TriBITS giving the project's TriBITS repositories:
${PROJECT_NAME}_NATIVE_REPOSITORIES

  The list of Native Repositories for a given TriBITS project (i.e. repositories that are always present when configuring the Project and are typically managed in the same VC repo). This variable is set in the file <projectDir>/cmake/NativeRepositoriesList.cmake if it exists. If the file NativeRepositoriesList.cmake does not exist, then the project is assumed to also be a repository and the list of native repositories is just the local project directory ${PROJECT_SOURCE_DIR}/. In this case, ${PROJECT_SOURCE_DIR}/ must contain at a minimum a PackagesList.cmake file and a TPLsList.cmake file (see TriBITS Repository).

${PROJECT_NAME}_EXTRA_REPOSITORIES

  The list of Extra Repositories that the project is being configured with. This list of repositories either comes from processing the project's <projectDir>/cmake/ExtraRepositoriesList.cmake file or comes from the CMake cache variable ${PROJECT_NAME}_EXTRA_REPOSITORIES. See Enabling extra repositories with add-on packages for details.

${PROJECT_NAME}_ALL_REPOSITORIES

  Concatenation of all the repos listed in ${PROJECT_NAME}_NATIVE_REPOSITORIES and ${PROJECT_NAME}_EXTRA_REPOSITORIES in the order they are processed.
A TriBITS Repository is the basic unit of ready-made composition between different collections of software that use the TriBITS CMake build and test system.
In short, a TriBITS Repository:
For more details on the definition of a TriBITS Repository, see:
The core files making up a TriBITS Repository (where <repoDir> = ${${REPOSITORY_NAME}_SOURCE_DIR}) are:
<repoDir>/
  PackagesList.cmake
  TPLsList.cmake
  Copyright.txt    # [Optional] Only needed if creating version header file
  Version.cmake    # [Optional] Info inserted into ${REPO_NAME}_version.h
  cmake/
    RepositoryDependenciesSetup.cmake        # [Optional] CDash email addresses?
    CallbackSetupExtraOptions.cmake          # [Optional] Called after main options
    CallbackDefineRepositoryPackaging.cmake  # [Optional] CPack packaging
These TriBITS Repository files are documented in more detail below:
<repoDir>/PackagesList.cmake: [Required] Provides the list of top-level packages defined by the repository. This file typically just calls the macro tribits_repository_define_packages() to define the list of packages along with their directories and other properties. For example, the file TribitsExampleProject/PackagesList.cmake looks like:
tribits_repository_define_packages(
  SimpleCxx        packages/simple_cxx        PT
  MixedLang        packages/mixed_lang        PT
  InsertedPkg      InsertedPkg                ST
  WithSubpackages  packages/with_subpackages  PT
  WrapExternal     packages/wrap_external     ST
  )

tribits_disable_package_on_platforms(WrapExternal Windows)
tribits_allow_missing_external_packages(InsertedPkg)
Other commands that are appropriate to use in this file include tribits_disable_package_on_platforms() and tribits_allow_missing_external_packages(). Also, if the binary directory for any package <packageName> needs to be changed from the default, then the variable <packageName>_SPECIFIED_BINARY_DIR can be set, as sketched below (see TriBITS Package == TriBITS Repository == TriBITS Project).
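For example, to place a package's binary directory somewhere other than the default location under the project build directory, one might add something like the following to the PackagesList.cmake file (a hypothetical sketch; the package name and relative path are just illustrations):

  set(SimpleCxx_SPECIFIED_BINARY_DIR utils/simple_cxx)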
It is perfectly legal for a TriBITS repository to define no packages at all with:
tribits_repository_define_packages()
and this would be the case for a TriBITS meta-project that has no native packages, only extra repositories.
<repoDir>/TPLsList.cmake: [Required] Provides the list of external packages/TPLs that are referenced as dependencies in the repository's package's <packageDir>/cmake/Dependencies.cmake files (see TriBITS External Package/TPL). This file typically just calls the macro tribits_repository_define_tpls() to define the TPLs along with their find modules and other properties. An example is ReducedMockTrilinos/TPLsList.cmake which shows:
tribits_repository_define_tpls(
  MPI      "${${PROJECT_NAME}_TRIBITS_DIR}/core/std_tpls/"  PT
  BLAS     "${${PROJECT_NAME}_TRIBITS_DIR}/common_tpls/"    PT
  LAPACK   "${${PROJECT_NAME}_TRIBITS_DIR}/common_tpls/"    PT
  Boost    cmake/TPLs/                                      ST
  UMFPACK  cmake/TPLs/                                      ST
  AMD      cmake/TPLs/                                      EX
  PETSC    "${${PROJECT_NAME}_TRIBITS_DIR}/common_tpls/"    ST
  )
See TriBITS External Package/TPL for details on what gets defined for each TriBITS TPL once this file is processed.
It is perfectly fine to specify no TPLs at all for a repository with:
tribits_repository_define_tpls()
but the macro tribits_repository_define_tpls() has to be called, even if there are no TPLs. See tribits_repository_define_tpls() for further details and constraints.
<repoDir>/Copyright.txt: [Optional] Gives the default copyright and license declaration for all of the software in the TriBITS repository directory <repoDir>/. This file is read into a string and then used to configure the repository's version header file (see Project and Repository Versioning and Release Mode). Even if a repository version header file is not produced, it is a good idea for every TriBITS repository to define this file, just for legal purposes. For a good open-source license, one should consider copying the TriBITS/Copyright.txt file which is a simple 3-clause BSD-like license like:
TriBITS: Tribal Build, Integrate, and Test System
Copyright (c) 2013 NTESS

Copyright 2013 National Technology & Engineering Solutions of Sandia, LLC
(NTESS).  Under the terms of Contract DE-NA0003525 with NTESS, the U.S.
Government retains certain rights in this software.

Copyright the TriBITS contributors.
<repoDir>/Version.cmake: [Optional] Contains version information for the repository (and for the project as well if this repository is also the base project). For example, TribitsExampleProject/Version.cmake looks like:
set(${REPOSITORY_NAME}_VERSION 1.1)
set(${REPOSITORY_NAME}_MAJOR_VERSION 01)
set(${REPOSITORY_NAME}_MAJOR_MINOR_VERSION 010100)
set(${REPOSITORY_NAME}_VERSION_STRING "1.1 (Dev)")
set(${REPOSITORY_NAME}_ENABLE_DEVELOPMENT_MODE_DEFAULT ON) # Change to 'OFF' for a release
Note that the prefix ${REPOSITORY_NAME}_ is used instead of hard-coding the repository's name to allow flexibility in what a meta-project names a given TriBITS repository.
The local variables in these set statements are processed in the base project directory's local scope and are therefore seen by the entire CMake project. When this file is read in repository mode, the variable ${REPOSITORY_NAME}_ENABLE_DEVELOPMENT_MODE_DEFAULT is ignored.
<repoDir>/cmake/RepositoryDependenciesSetup.cmake: [Optional] If present, this file is included a single time as part of the generation of the project dependency data-structure (see Reduced Package Dependency Processing). It gets included in the order listed in ${PROJECT_NAME}_ALL_REPOSITORIES. Any local variables set in this file have project-wide scope. The primary purpose for this file is to set variables that will impact the processing of project's package's Dependencies.cmake files and take care of other enable/disable issues that are not otherwise cleanly handled by the TriBITS system automatically.
The typical usage of this file is to set the default CDash email address for all of the defined packages (see CDash regression email addresses). For example, to set the default email address for all of the packages in this repository, one would set in this file:
set_default(${REPOSITORY_NAME}_REPOSITORY_MASTER_EMAIL_ADDRESS repox-regressions@somemailserver.org)
Note that the prefix ${REPOSITORY_NAME}_ is used instead of hard-coding the repo name to allow greater flexibility in how meta-projects refer to this TriBITS repo.
<repoDir>/cmake/CallbackSetupExtraOptions.cmake: [Optional] If defined, this file is processed (included) for each repo in order right after the basic TriBITS options are defined in the macro tribits_define_global_options_and_define_extra_repos(). This file must define the macro tribits_repository_setup_extra_options() which is then called by the TriBITS system. This file is only processed when doing a basic configuration of the project and not when it is just building up the dependency data-structures (i.e. it is not processed in the Reduced Package Dependency Processing). Any local variables set have project-wide scope.
A few additional variables are defined by the time this file is processed and can be used in the logic in these files. Some of the variables that should already be defined (in addition to all of the basic user TriBITS cache variables set in tribits_define_global_options_and_define_extra_repos()) include CMAKE_HOST_SYSTEM_NAME, ${PROJECT_NAME}_HOSTNAME, and Python3_EXECUTABLE (see Python Support). The types of commands and logic to put in this file include:
An example of this file is:
TribitsExampleProject/cmake/CallbackSetupExtraOptions.cmake
which currently looks like:
macro(TRIBITS_REPOSITORY_SETUP_EXTRA_OPTIONS)

  assert_defined(${PROJECT_NAME}_ENABLE_INSTALL_CMAKE_CONFIG_FILES)
  if (${PROJECT_NAME}_ENABLE_INSTALL_CMAKE_CONFIG_FILES)
    message(
      "\n***"
      "\n*** NOTE: Setting ${PROJECT_NAME}_ENABLE_WrapExternal=OFF"
      " because ${PROJECT_NAME}_ENABLE_INSTALL_CMAKE_CONFIG_FILES='${${PROJECT_NAME}_ENABLE_INSTALL_CMAKE_CONFIG_FILES}'!"
      "\n***\n"
      )
    set(${PROJECT_NAME}_ENABLE_WrapExternal OFF)
  endif()

  if ("${Python3_EXECUTABLE}" STREQUAL "")
    message(
      "\n***"
      "\n*** NOTE: Setting ${PROJECT_NAME}_ENABLE_WrapExternal=OFF"
      " because Python3_EXECUTABLE=''!"
      "\n***\n"
      )
    set(${PROJECT_NAME}_ENABLE_WrapExternal OFF)
  endif()

  assert_defined(${PROJECT_NAME}_ENABLE_Fortran)
  if (NOT ${PROJECT_NAME}_ENABLE_Fortran)
    message(
      "\n***"
      "\n*** NOTE: Setting ${PROJECT_NAME}_ENABLE_MixedLang=OFF"
      " because ${PROJECT_NAME}_ENABLE_Fortran='${${PROJECT_NAME}_ENABLE_Fortran}'!"
      "\n***\n"
      )
    set(${PROJECT_NAME}_ENABLE_MixedLang OFF)
  endif()

endmacro()
<repoDir>/cmake/CallbackDefineRepositoryPackaging.cmake: [Optional] If this file exists, then it defines extra CPack-related options that are specific to this TriBITS Repository. This file must define the macro tribits_repository_define_packaging() which is called by TriBITS. This file is processed at the top project-level scope, so any local variables set have project-wide effect. This file is processed after the project's <projectDir>/cmake/CallbackDefineProjectPackaging.cmake file, so any project-level CPACK variables are already defined by the time the repository-level options and commands are processed. This file typically just sets extra excludes to remove files from the tarball. The file:
TribitsExampleProject/cmake/CallbackDefineRepositoryPackaging.cmake
provides a good example which is:
macro(TRIBITS_REPOSITORY_DEFINE_PACKAGING)
  assert_defined(${REPOSITORY_NAME}_SOURCE_DIR)
  append_set(CPACK_SOURCE_IGNORE_FILES
    "${${REPOSITORY_NAME}_SOURCE_DIR}/cmake/ctest/"
    )
endmacro()
As shown in the above example, it is important to prefix the excluded files and directories with the repository base directory ${${REPOSITORY_NAME}_SOURCE_DIR}/ since these are interpreted by CPack as regular-expressions.
The following temporary local variables are defined automatically by TriBITS before processing a given TriBITS repository's files (e.g. PackagesList.cmake, TPLsList.cmake, etc.):
REPOSITORY_NAME

  The name of the current TriBITS repository. This name will be the repository name listed in the <projectDir>/cmake/ExtraRepositoriesList.cmake file or, if this repository directory is the project base directory, REPOSITORY_NAME will be set to ${PROJECT_NAME}.

REPOSITORY_DIR

  Path of the current Repository relative to the Project's base source directory ${PROJECT_NAME}_SOURCE_DIR. This is typically just the repository name but can be an arbitrary directory if specified through the <projectDir>/cmake/ExtraRepositoriesList.cmake file.
The following project-scope (non-cache) local variables are set once the list of TriBITS repositories is processed and before any of the repository's files are processed:
${REPOSITORY_NAME}_SOURCE_DIR

  The absolute path to the base of a given TriBITS Repository's source directory. CMake code, for example in a package's CMakeLists.txt file, typically refers to this by the raw name like RepoX_SOURCE_DIR. This makes such CMake code independent of where the various TriBITS repos are in relation to each other or the TriBITS Project (but it does hard-code the repository name, which is not ideal).

${REPOSITORY_NAME}_BINARY_DIR

  The absolute path to the base of a given TriBITS Repository's binary directory. CMake code, for example in packages, refers to this by the raw name like RepoX_BINARY_DIR. This makes such CMake code independent of where the various TriBITS repos are in relation to each other or the Project.
The following project-level local variables can be defined by the project or the user to help define what packages from the repository ${REPOSITORY_NAME} contribute to the primary meta-project packages (PMPP):
${REPOSITORY_NAME}_NO_PRIMARY_META_PROJECT_PACKAGES

  If set to TRUE, then the packages in the TriBITS repository are not considered to be part of the primary meta-project packages. This affects what packages get enabled by default when enabling all packages with ${PROJECT_NAME}_ENABLE_ALL_PACKAGES=ON and what tests and examples get enabled by default when setting ${PROJECT_NAME}_ENABLE_TESTS=ON. See TriBITS Dependency Handling Behaviors for more details.

${REPOSITORY_NAME}_NO_PRIMARY_META_PROJECT_PACKAGES_EXCEPT

  When the above variable is set to TRUE, this variable is read by TriBITS to find the list of selected packages in the repository ${REPOSITORY_NAME} which are still considered to be part of the set of the project's primary meta-project packages. NOTE: It is not necessary to list all of the subpackages in a given parent package. Only the parent package need be listed and it is equivalent to listing all of its subpackages. See TriBITS Dependency Handling Behaviors for more details.
The above primary meta-project variables should be set in the meta-project's <projectDir>/ProjectName.cmake file so that they will be set in all situations.
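For example, a meta-project that pulls in a large repository but only wants a couple of its packages treated as primary meta-project packages might add something like the following to its ProjectName.cmake file (RepoX, PkgA, and PkgB are hypothetical names used only for illustration):

  set(RepoX_NO_PRIMARY_META_PROJECT_PACKAGES TRUE)
  set(RepoX_NO_PRIMARY_META_PROJECT_PACKAGES_EXCEPT PkgA PkgB)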
A TriBITS Package:
WARNING: As noted above, one must be very careful to pick globally unique TriBITS package names. This name must be unique not only within its defined TriBITS repository but also across all packages in all TriBITS repositories that ever might be cobbled together into a single TriBITS (meta) project! Choosing a good package name is the single most important decision when it comes to defining a TriBITS package. One must be careful not to pick names like "Debug" or "StandardUtils" that have a high chance of clashing with poorly named TriBITS packages from other TriBITS repositories.
For more details on the definition of a TriBITS Package (or subpackage), see:
The core files that make up a TriBITS Package (where <packageDir> = ${${PACKAGE_NAME}_SOURCE_DIR}) are:
<packageDir>/
  CMakeLists.txt               # Only processed if the package is enabled
  cmake/
    Dependencies.cmake         # Always processed if its repo is processed
    <packageName>_config.h.in  # [Optional], name is not fixed
There are a few simple rules for the location and the contents of the <packageDir>/ directory:
The above rules are not needed for basic building and testing but are needed for extended features like automatically detecting when a package has changed by looking at what files have changed (see Pre-push Testing using checkin-test.py) and for creating source tarballs correctly (see Creating Source Distributions). Therefore, it would be wise to abide by the above rules when defining packages.
The following TriBITS Package files are documented in more detail below:
<packageDir>/cmake/Dependencies.cmake: [Required] Defines the dependencies for a given TriBITS package using the macro tribits_package_define_dependencies(). This file is processed at the top-level project scope (using an include()) so any local variables set will be seen by the entire project. This file is always processed, including when just building the project's dependency data-structure (see Reduced Package Dependency Processing).
An example of a Dependencies.cmake file for a package with optional and required dependencies is for the mock Panzer package in MockTrilinos:
tribits_package_define_dependencies(
  LIB_REQUIRED_PACKAGES  Teuchos Sacado Phalanx Intrepid Thyra Tpetra
    Epetra EpetraExt
  LIB_OPTIONAL_PACKAGES  Stokhos
  TEST_OPTIONAL_PACKAGES  Stratimikos
  LIB_REQUIRED_TPLS  MPI Boost
  )
An example of a package with subpackages is WithSubpackages which has the dependencies file:
TribitsExampleProject/packages/with_subpackages/cmake/Dependencies.cmake
which is:
tribits_package_define_dependencies(
  SUBPACKAGES_DIRS_CLASSIFICATIONS_OPTREQS
    A  a  PT  REQUIRED
    B  b  ST  OPTIONAL
    C  c  ST  OPTIONAL
  REGRESSION_EMAIL_LIST  with_packages-regressions@someurl.none
  )
WithSubpackages defines three subpackages which creates three new packages with names WithSubpackagesA, WithSubpackagesB, and WithSubpackagesC.
If a TriBITS Package or Subpackage has no dependencies, it still has to call tribits_package_define_dependencies(), but it is called with no arguments, as in:
TribitsHelloWorld/hello_world/cmake/Dependencies.cmake:
which contains:
tribits_package_define_dependencies()
Other TriBITS macros/functions that can be called in this file include tribits_tpl_tentatively_enable() and tribits_allow_missing_external_packages().
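As a hypothetical sketch (the TPL name SomeTpl is not from the example projects), a package's Dependencies.cmake file that declares an optional TPL dependency and tentatively enables that TPL might look something like:

  tribits_package_define_dependencies(
    LIB_OPTIONAL_TPLS  SomeTpl
    )

  tribits_tpl_tentatively_enable(SomeTpl)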
<packageDir>/cmake/<packageName>_config.h.in: [Optional] The package's configured header file. This file contains placeholders for variables that will be substituted at configure time with tribits_configure_file(). This includes usage of #cmakedefine <varName> and other standard CMake file configuration features used by CMake's configure_file() command.
An example of this file is shown in:
TribitsExampleProject/packages/simple_cxx/cmake/SimpleCxx_config.h.in
which is:
#ifndef SIMPLECXX_CONFIG_H
#define SIMPLECXX_CONFIG_H

#cmakedefine HAVE_SIMPLECXX___INT64

#cmakedefine HAVE_SIMPLECXX_DEBUG

#cmakedefine HAVE_SIMPLECXX_SIMPLETPL

@SIMPLECXX_DEPRECATED_DECLARATIONS@

#endif /** SIMPLECXX_CONFIG_H **/
The variable HAVE_SIMPLECXX___INT64 is set up in the base file SimpleCxx/CMakeLists.txt (see <packageDir>/CMakeLists.txt below). For an explanation of HAVE_SIMPLECXX_DEBUG, see tribits_add_debug_option(). For an explanation of HAVE_SIMPLECXX_SIMPLETPL, see How to add a new TriBITS Package dependency. For an explanation of @SIMPLECXX_DEPRECATED_DECLARATIONS@, see Setting up support for deprecated code handling.
NOTE: The file name <packageName>_config.h.in is not at all fixed and the package can call this file anything it wants. Also, a package can configure multiple header files in different directories for different purposes using tribits_configure_file() or even calls to the raw CMake function configure_file().
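For example, one of the package's CMakeLists.txt files (often the one under src/) would typically generate the configured header with a call like the following (a sketch based on the SimpleCxx naming convention described above):

  tribits_configure_file(${PACKAGE_NAME}_config.h)

As documented for tribits_configure_file(), this configures the template <packageDir>/cmake/${PACKAGE_NAME}_config.h.in into the package's binary directory.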
<packageDir>/CMakeLists.txt: [Required] The package's top-level CMakeLists.txt file that defines the libraries, include directories, and contains the tests for the package.
The basic structure of this file for a package without subpackages is shown in:
TribitsExampleProject/packages/simple_cxx/CMakeLists.txt
which is:
#
# A) Define the package
#

tribits_package( SimpleCxx  ENABLE_SHADOWING_WARNINGS  CLEANED )

#
# B) Platform-specific checks
#

include(CheckFor__int64)
check_for___int64(HAVE_SIMPLECXX___INT64)
tribits_pkg_export_cache_var(HAVE_SIMPLECXX___INT64)

#
# C) Set up package-specific options
#

tribits_add_debug_option()
tribits_add_show_deprecated_warnings_option()

#
# D) Add the libraries, tests, and examples
#

add_subdirectory(src)
tribits_add_test_directories(test)

# Set a variable that will be used in downstream packages
if (SimpleCxx_ENABLE_SimpleTpl)
  set(simpletplText "simpletpl ")
else()
  set(simpletplText)
endif()
global_set(EXPECTED_SIMPLECXX_AND_DEPS "SimpleCxx ${simpletplText}headeronlytpl")
tribits_pkg_export_cache_var(EXPECTED_SIMPLECXX_AND_DEPS)

#
# E) Do standard post processing
#

tribits_package_postprocess()
The first command at the top of the file is a call to tribits_package() which takes the package name (SimpleCxx in this case) in addition to a few other options. While TriBITS obviously already knows the package name (since it read it from the <repoDir>/PackagesList.cmake file), the purpose for repeating it in this call is as documentation for the developer's sake (and this name is checked against the expected package name). Then a set of configure-time tests is typically performed (if the package needs any of these). In this example, the existence of the C++ __int64 data-type is checked using the module CheckFor__int64.cmake (which is in the cmake/ directory of this package; CMake has great support for Configure-time System Tests). This is followed by package-specific options. In this case, the standard TriBITS options for debug checking and deprecated warnings are added using the standard macros tribits_add_debug_option() and tribits_add_show_deprecated_warnings_option(). After all of this up-front setup is complete (which will be present in any moderately complex CMake-configured project), the source and test sub-directories are added that actually define the library and the tests. In this case, the standard tribits_add_test_directories() macro is used which only conditionally adds the tests for the package.
The final command in the package's base CMakeLists.txt file must always be tribits_package_postprocess(). This is needed in order to perform some necessary post-processing by TriBITS.
It is also possible for the package's top-level CMakeLists.txt to be the only CMakeLists.txt file for a package. Such an example can be seen in the example project TribitsHelloWorld in the HelloWorld package.
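For such a single-file package, a minimal sketch (condensed from the HelloWorld example shown in full later in this guide) simply defines the library and tests directly between tribits_package() and tribits_package_postprocess():

tribits_package(HelloWorld)

# Libraries, executables, and tests are defined directly in this one file
# (no add_subdirectory() calls are needed).
tribits_add_library(hello_world_lib
  HEADERS hello_world_lib.hpp
  SOURCES hello_world_lib.cpp)

tribits_add_executable_and_test(unit_tests
  SOURCES hello_world_unit_tests.cpp
  PASS_REGULAR_EXPRESSION "All unit tests passed")

tribits_package_postprocess()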
When a TriBITS package is broken up into subpackages (see TriBITS Subpackage), its CMakeLists.txt file looks a little different from a package with no subpackages as shown above. The basic structure of this file for a package with subpackages is shown in:
TribitsExampleProject/packages/with_subpackages/CMakeLists.txt
which contains:
tribits_package_decl(WithSubpackages)

tribits_add_debug_option()

tribits_process_subpackages()

tribits_exclude_files(
  b/ExcludeFromRelease.txt
  b/src/AlsoExcludeFromTarball.txt
  )

tribits_package_def()

tribits_package_postprocess()
Compared to the CMakeLists.txt file for a package without subpackages, the main difference is that the tribits_package() command is broken up into two parts: tribits_package_decl() and tribits_package_def(). In between these two commands, the parent package can define the common package options and then call the command tribits_process_subpackages(), which fully processes the subpackages. If the parent package has libraries and/or tests/examples of its own, it can define those after calling tribits_package_def(), just like with a regular package. However, it is rare for a package broken up into subpackages to have its own libraries, tests, or examples. As always, the final command called inside of a package's base CMakeLists.txt file is tribits_package_postprocess().
NOTE: The package's base CMakeLists.txt file only gets processed if the package is actually enabled (i.e. ${PROJECT_NAME}_ENABLE_${PACKAGE_NAME}=ON). This is an important design feature of TriBITS: errors in the contents of non-enabled packages cannot damage the configure, build, and testing of the enabled packages. This is critical to allow experimental EX test-group packages and lower-maturity packages to exist safely in the same source repositories as higher-maturity and more important packages.
A package's core variables are broken down into the following categories:
The following locally scoped TriBITS Package Local Variables are defined when the files for a given TriBITS Package (or any package for that matter) are being processed:
PACKAGE_NAME
The name of the current TriBITS package. This is set automatically by TriBITS before the package's CMakeLists.txt file is processed. WARNING: This name must be globally unique across the entire project (see Globally unique TriBITS package names).

PACKAGE_SOURCE_DIR
The absolute path to the package's base source directory. This is set automatically by TriBITS in the macro tribits_package().

PACKAGE_BINARY_DIR
The absolute path to the package's base binary/build directory. This is set automatically by TriBITS in the macro tribits_package().

PACKAGE_NAME_UC
This is set to the upper-case version of ${PACKAGE_NAME}. This is set automatically by TriBITS in the macro tribits_package(). (A small usage sketch for these variables is given after this list.)
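As a small, purely illustrative sketch of how these local variables might be used inside a package's CMakeLists.txt files (assuming nothing beyond the definitions above):

# Print where the current package is being processed from (illustrative only).
message(STATUS
  "Processing package ${PACKAGE_NAME} (upper case: ${PACKAGE_NAME_UC})")
message(STATUS "  source dir: ${PACKAGE_SOURCE_DIR}")
message(STATUS "  binary dir: ${PACKAGE_BINARY_DIR}")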
Once all of the TriBITS package's Dependencies.cmake files have been processed, the following TriBITS Package Top-Level Local Variables are defined:
${PACKAGE_NAME}_SOURCE_DIR
The absolute path to the package's base source directory. CMake code, for example in other packages, can refer to this by the raw name like PackageX_SOURCE_DIR (see the sketch after this list). This makes such CMake code independent of where the package is in relation to other packages. NOTE: This variable is defined for all declared packages that exist, independent of whether they are enabled or not. This variable is set as soon as it is known if the given package exists or not.

${PACKAGE_NAME}_REL_SOURCE_DIR
The relative path to the package's base source directory, relative to the project's base source directory ${PROJECT_NAME}_SOURCE_DIR. This is used in various contexts such as processing the package's <packageDir>/CMakeLists.txt file and generating the project's <Project>PackageDependencies.xml file where relative paths are needed.

${PACKAGE_NAME}_BINARY_DIR
The absolute path to the package's base binary directory. CMake code, for example in other packages, can refer to this by the raw name like PackageX_BINARY_DIR. This makes such CMake code independent of where the package is in relation to other packages. NOTE: This variable is only defined if the package is actually enabled!

${PACKAGE_NAME}_PARENT_REPOSITORY
The name of the package's parent repository. This can be used by a package to access information about its parent repository. For example, the variable ${${PACKAGE_NAME}_PARENT_REPOSITORY}_SOURCE_DIR can be dereferenced and read if needed (but it is not recommended that packages be aware of their parent repository in general).

${PACKAGE_NAME}_TESTGROUP
Defines the Package Test Group for the package. This determines in what contexts the package is enabled or not for testing-related purposes (see Nested Layers of TriBITS Project Testing).

${PACKAGE_NAME}_SUBPACKAGES
Defines the list of subpackage names for a top-level parent package. This gives the unique subpackage name without the parent package prefix. For example, the ReducedMockTrilinos package Thyra has the subpackages CoreLibs, GoodStuff, etc. (which form the full package names ThyraCoreLibs, ThyraGoodStuff, etc.). If a top-level package is not broken down into subpackages, then this list is empty.
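As referenced in the ${PACKAGE_NAME}_SOURCE_DIR item above, a downstream package can refer to an upstream package's directories by their raw variable names. A hedged sketch (the SimpleCxx paths come from the TribitsExampleProject layout shown later in this guide, and the usage itself is only illustrative):

# Refer to the SimpleCxx package's source tree without hard-coding where it
# lives relative to the current package (illustrative usage only).
tribits_include_directories(${SimpleCxx_SOURCE_DIR}/src)

# NOTE: <PackageX>_BINARY_DIR is only defined if PackageX is enabled.
message(STATUS "SimpleCxx builds in: ${SimpleCxx_BINARY_DIR}")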
In addition, the following user-settable TriBITS Package Cache Variables are defined before a Package's CMakeLists.txt file is processed:
${PROJECT_NAME}_ENABLE_${PACKAGE_NAME}
Set to ON if the package is enabled and is to be processed, or will be set to ON or OFF automatically during enable/disable logic. For a parent package that is not directly enabled but where one of its subpackages is enabled, this will get set to ON (but that is not the same as the parent package being directly enabled and therefore does not imply that all of the required subpackages will be enabled, only that the parent package will be processed).

${PACKAGE_NAME}_ENABLE_${UPSTREAM_PACKAGE_NAME}
Set to ON if support for the optional upstream dependent package ${UPSTREAM_PACKAGE_NAME} is enabled in package ${PACKAGE_NAME}. Here ${UPSTREAM_PACKAGE_NAME} corresponds to each optional upstream package listed in the LIB_OPTIONAL_PACKAGES and TEST_OPTIONAL_PACKAGES arguments to the tribits_package_define_dependencies() macro.
NOTE: It is important that the CMake code in the package's CMakeLists.txt files key off of this variable and not the project-level variable ${PROJECT_NAME}_ENABLE_${UPSTREAM_PACKAGE_NAME} because the package-level variable ${PACKAGE_NAME}_ENABLE_${UPSTREAM_PACKAGE_NAME} can be explicitly turned off by the user even though the packages ${PACKAGE_NAME} and ${UPSTREAM_PACKAGE_NAME} are both enabled at the project level! See Support for optional package can be explicitly disabled (and see the sketch after this list).
NOTE: This variable will also be set for required dependencies as well to allow for uniform processing such as when looping over the items in ${PACKAGE_NAME}_LIB_DEFINED_DEPENDENCIES or ${PACKAGE_NAME}_TEST_DEFINED_DEPENDENCIES.
NOTE: The value of this variable also determines the value of the macro define variable name HAVE_<PACKAGE_NAME_UC>_<UPSTREAM_PACKAGE_NAME_UC>.
${PACKAGE_NAME}_ENABLE_TESTS
Set to ON if the package's tests are to be enabled. This will enable a package's tests and all of its subpackages' tests.

${PACKAGE_NAME}_ENABLE_EXAMPLES
Set to ON if the package's examples are to be enabled. This will enable a package's examples and all of its subpackages' examples.
The above global cache variables can be explicitly set by the user or may be set automatically as part of the Package Dependencies and Enable/Disable Logic.
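As a hedged sketch of the NOTE above about keying off the package-level enable variable (the package and upstream-package names here are hypothetical, and the guarded commands are only illustrative):

# Inside the MyPkg package's CMakeLists.txt files: test the package-level
# enable variable, NOT ${PROJECT_NAME}_ENABLE_SomeUpstreamPkg, since the user
# may have disabled this specific optional support.
if (MyPkg_ENABLE_SomeUpstreamPkg)
  # Hypothetical optional source file compiled only when support is enabled;
  # the associated HAVE_MYPKG_SOMEUPSTREAMPKG define comes from the
  # configured header.
  list(APPEND SOURCES MyPkg_SomeUpstreamPkg_Adapters.cpp)
endif()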
The following local TriBITS Package Optional Dependency Macro Variables are defined in the top-level project scope before a Package's CMakeLists.txt file is processed:
HAVE_<PACKAGE_NAME_UC>_<UPSTREAM_PACKAGE_NAME_UC>
Set to ON if support for the optional upstream package ${UPSTREAM_PACKAGE_NAME} is enabled in the downstream package ${PACKAGE_NAME} (i.e. ${PACKAGE_NAME}_ENABLE_${UPSTREAM_PACKAGE_NAME} = ON) and is set to FALSE otherwise. Here, <PACKAGE_NAME_UC> and <UPSTREAM_PACKAGE_NAME_UC> are the upper-case names for the packages ${PACKAGE_NAME} and ${UPSTREAM_PACKAGE_NAME}, respectively. For example, if optional support for upstream package Triutils is enabled in downstream package EpetraExt in ReducedMockTrilinos, then EpetraExt_ENABLE_TriUtils=ON and HAVE_EPETRAEXT_TRIUTILS=ON. This variable is meant to be used in:

  #cmakedefine HAVE_<PACKAGE_NAME_UC>_<UPSTREAM_PACKAGE_NAME_UC>

in configured header files (e.g. <packageDir>/cmake/<packageName>_config.h.in). For example, for the EpetraExt and Triutils example, this would be:

  #cmakedefine HAVE_EPETRAEXT_TRIUTILS

NOTE: TriBITS automatically sets this variable depending on the value of ${PACKAGE_NAME}_ENABLE_${UPSTREAM_PACKAGE_NAME} during the step "Adjust package and TPLs enables and disables" in Full Processing of TriBITS Project Files. And tweaking this variable after that must be done carefully as described in How to tweak downstream TriBITS "ENABLE" variables during package configuration.
Currently, a Package can refer to its containing Repository's source and binary directories. This is so that it can refer to repository-level resources (e.g. the Trilinos_version.h file for Trilinos packages). However, this may be undesirable because it makes it harder to pull a package out of one TriBITS repository and place it in another repository for a different use. A package can, however, indirectly refer to its own repository without loss of generality by reading the variable ${PACKAGE_NAME}_PARENT_REPOSITORY; the real problem is referring to other TriBITS repositories explicitly.
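A minimal sketch of the indirect reference described above (illustrative only; as noted, packages being aware of their parent repository is discouraged in general):

# Look up the name of this package's parent repository and then its source
# directory, without hard-coding the repository name.
set(parentRepoName "${${PACKAGE_NAME}_PARENT_REPOSITORY}")
message(STATUS
  "Parent repository '${parentRepoName}' source dir:"
  " ${${parentRepoName}_SOURCE_DIR}")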
A TriBITS Subpackage:
The contents of a TriBITS Subpackage are almost identical to those of a TriBITS Package. The differences are described below and in How is a TriBITS Subpackage different from a TriBITS Package?.
For more details on the definition of a TriBITS Package (or subpackage), see:
The set of core files for a subpackage are identical to the TriBITS Package Core Files and are:
<packageDir>/<spkgDir>/
  CMakeLists.txt        # Only processed if this subpackage is enabled
  cmake/
    Dependencies.cmake  # Always processed if the parent package
                        # is listed in the enclosing Repository
(where <packageDir> = ${${PARENT_PACKAGE_NAME}_SOURCE_DIR} and <spkgDir> is the subpackage directory listed in the SUBPACKAGES_DIRS_CLASSIFICATIONS_OPTREQS argument to tribits_package_define_dependencies()).
There are a few simple rules for the location and the contents of the <spkgDir>/ directory:
The above rules are not needed for basic building and testing but are needed for extended features like automatically detecting when a package has changed by looking at what files have changed (see Pre-push Testing using checkin-test.py) and for creating source tarballs correctly (see Creating Source Distributions). Therefore, it would be wise to abide by the above rules when defining subpackages.
These TriBITS Subpackage files are documented in more detail below:
<packageDir>/<spkgDir>/cmake/Dependencies.cmake: [Required] The contents of this file for subpackages are identical as for top-level packages. It just contains a call to the macro tribits_package_define_dependencies() to define this package's upstream package dependencies. A simple example is for the example subpackage WithSubpackagesB (declared in with_subpackages/cmake/Dependencies.cmake) with the file:
TribitsExampleProject/packages/with_subpackages/b/cmake/Dependencies.cmake
which is:
tribits_package_define_dependencies(
  LIB_REQUIRED_PACKAGES  SimpleCxx
  LIB_OPTIONAL_PACKAGES  WithSubpackagesA InsertedPkg
  TEST_OPTIONAL_PACKAGES  MixedLang
  )
What this shows is that subpackages must list their dependencies on each other (if such dependencies exist) using the full package name ${PARENT_PACKAGE_NAME}${SUBPACKAGE_NAME} or in this case:
'WithSubpackagesA' = 'WithSubpackages' + 'A'
Note that the parent package depends on its subpackages, not the other way around. For example, the WithSubpackages parent package automatically depends on its subpackages WithSubpackagesA, WithSubpackagesB, and WithSubpackagesC. As such, all (direct) dependencies for a subpackage must be listed in its own Dependencies.cmake file. For example, the WithSubpackages subpackage A depends on the SimpleCxx package and is declared as such as shown in:
TribitsExampleProject/packages/with_subpackages/a/cmake/Dependencies.cmake
which is:
tribits_package_define_dependencies(
  LIB_REQUIRED_PACKAGES  SimpleCxx
  )
What this means is that any package dependencies listed in the parent package's <packageDir>/cmake/Dependencies.cmake file are NOT dependencies of its subpackages. For example, if with_subpackages/cmake/Dependencies.cmake were changed to be:
tribits_package_define_dependencies(
  LIB_REQUIRED_TPLS  Boost
  SUBPACKAGES_DIRS_CLASSIFICATIONS_OPTREQS
    A   A   PT  REQUIRED
    ...
  )
then the Boost TPL would NOT be a dependency of the package WithSubpackagesA but instead would be listed as a dependency of the parent package WithSubpackages. (And in this case, this TPL dependency is pretty worthless since the package WithSubpackages does not even define any libraries or tests of its own.)
<packageDir>/<spkgDir>/CMakeLists.txt: [Required] The subpackage's top-level CMakeLists.txt file that defines the libraries, include directories, and contains the tests for the subpackage. The contents of a subpackage's top-level CMakeLists.txt file are almost identical to a top-level package's <packageDir>/CMakeLists.txt file. The primary difference is that the commands tribits_package() and tribits_package_postprocess() are replaced with tribits_subpackage() and tribits_subpackage_postprocess(), as shown in the file:
TribitsExampleProject/packages/with_subpackages/a/CMakeLists.txt
which contains:
#
# A) Define the subpackage
#

tribits_subpackage(A)

#
# B) Set up subpackage-specific options
#

set(${PACKAGE_NAME}_SPECIAL_VALUE 3 CACHE STRING "Integer special value")
tribits_pkg_export_cache_var(${PACKAGE_NAME}_SPECIAL_VALUE)

#
# C) Add the libraries, tests, and examples
#

tribits_configure_file(${PACKAGE_NAME}_config.h)
tribits_include_directories(${CMAKE_CURRENT_BINARY_DIR})

tribits_include_directories(${CMAKE_CURRENT_SOURCE_DIR})
tribits_add_library(pws_a
  SOURCES A.cpp
  HEADERS A.hpp ${CMAKE_CURRENT_BINARY_DIR}/${PACKAGE_NAME}_config.h
  )

tribits_add_test_directories(tests)

#
# D) Do standard post processing
#

tribits_subpackage_postprocess()
Unlike tribits_package(), tribits_subpackage() does not take any extra arguments; those extra settings are assumed to be defined by the top-level parent package. Like top-level packages, subpackages are free to define user-settable options and configure-time tests but typically don't. The idea is that subpackages should be lighter weight than top-level packages. Other than using tribits_subpackage() and tribits_subpackage_postprocess(), a subpackage can be laid out just like any other package and can call any of the other commands to add libraries, add executables, add tests, etc.
The core variables associated with a subpackage are identical to the TriBITS Package Core Variables. In addition, a subpackage may need to refer to its top-level parent package, whereas a top-level package has no parent package. These additional variables that are defined for subpackages are broken down into the following categories:
In addition to the TriBITS Package Local Variables, the following locally scoped TriBITS Subpackage Local Variables are defined when the files for a given TriBITS Subpackage are being processed:
PARENT_PACKAGE_NAME
The name of the parent package.

PARENT_PACKAGE_SOURCE_DIR
The absolute path to the parent package's base source directory.

PARENT_PACKAGE_BINARY_DIR
The absolute path to the parent package's base binary directory.
In addition to the TriBITS Package Top-Level Local Variables, once all of a TriBITS subpackage's Dependencies.cmake files have been processed, the following TriBITS Subpackage Top-Level Local Variables are defined:
${PACKAGE_NAME}_PARENT_PACKAGE
The name of the parent package. (NOTE: If this is empty "", then ${PACKAGE_NAME} is actually a parent package and not a subpackage.)
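A small illustrative sketch using this variable (assuming only the definition above):

# Distinguish a subpackage from a top-level package (illustrative only).
if ("${${PACKAGE_NAME}_PARENT_PACKAGE}" STREQUAL "")
  message(STATUS "${PACKAGE_NAME} is a top-level package")
else()
  message(STATUS
    "${PACKAGE_NAME} is a subpackage of ${${PACKAGE_NAME}_PARENT_PACKAGE}")
endif()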
A common question that is natural to ask is how a TriBITS Subpackage differs from a TriBITS Package. They contain the same basic files (i.e. a cmake/Dependencies.cmake file, a top-level CMakeLists.txt file, source files, test files, etc.). They both are included in the list of TriBITS Packages and therefore can both be enabled/disabled by the user or in automatic dependency logic (see Package Dependencies and Enable/Disable Logic). The primary difference is that a subpackage is meant to involve less overhead and is to be used to partition the parent package's software into chunks according to Software Engineering Packaging Principles. Also, the dependency logic treats a parent package's subpackages as part of itself, so explicitly enabling or disabling the parent package is identical to explicitly enabling or disabling all of its subpackages (see Enable/disable of parent package is enable/disable for subpackages). Also, subpackages are tested along with their peer subpackages with the parent package as part of TriBITS CTest/CDash Driver. This effectively means that if a build failure is detected in any subpackage, then that will effectively disable the parent package and all of its other subpackages in downstream testing. This is a type of "all for one and one for all" relationship between the subpackages within a single parent package. These are some of the issues to consider when breaking up software into packages and subpackages, and they will be mentioned in other sections as well.
A TriBITS External Package/TPL:
The TriBITS external package/TPL mechanism provides a uniform way to find and provide access to any type of external resource, no matter how it might be installed or accessed. Using a TriBITS external package/TPL is preferred over using a raw CMake find_package(<externalPkg>) call because the TriBITS system guarantees that only a single unique version of an external package/TPL will be used by all of the downstream packages that use it. Also, by defining a TriBITS TPL, automatic enable/disable logic will be applied as described in Package Dependencies and Enable/Disable Logic. For example, if an external package/TPL is explicitly disabled, all of the downstream packages that depend on it will be automatically disabled as well (see Package disable triggers auto-disables of downstream dependencies).
NOTE: The TriBITS TPL system implements a mechanism to turn external dependencies into both TriBITS-compliant packages for consumption by downstream TriBITS internal packages and also writes <tplName>Config.cmake files that are TriBITS-compliant external packages for consumption by downstream <Package>Config.cmake files (which are TriBITS-compliant external packages) generated by TriBITS-compliant internal packages.
WARNING: One must be very careful to pick Globally unique TriBITS External Package/TPL names <tplName> across all TPLs in all TriBITS repositories that ever might be cobbled together into a single TriBITS (meta) project! However, choosing TPL names is usually much easier and less risky than choosing Globally unique TriBITS package names because widely used TPLs tend to already be uniquely named. For example, the external package/TPL names BLAS and LAPACK are well defined in the applied math and computational science community and are not likely to clash.
The core files that define a TriBITS External Package/TPL are:
<tplDefsDir>/
  FindTPL<tplName>.cmake              # The name is not fixed (see <tplName>_FINDMOD)
  FindTPL<tplName>Dependencies.cmake  # [Optional], defines upstream dependencies
Above, <tplDefsDir>/ can be a subdirectory under a parent TriBITS repository <repoDir>/ (e.g. <repoDir>/cmake/tpls/) or can be under a TriBITS package directory <packageDir>/ (e.g. <packageDir>/cmake/tpls/).
The following TriBITS External Package/TPL files are documented in more detail below:
<tplDefsDir>/FindTPL<tplName>.cmake: [Required] TriBITS TPL find module that defines how a TriBITS external package/TPL is found and provided for usage by a downstream TriBITS package. This module must provide the <tplName>::all_libs target and must create a TriBITS-compliant external package wrapper package config file <tplName>Config.cmake. (See the requirements for a FindTPL<tplName>.cmake file in Requirements for FindTPL<tplName>.cmake modules).
The form of a simple FindTPL<tplName>.cmake file that uses an internal call to find_package(<externalPkg>) which provides modern IMPORTED CMake targets can use the tribits_extpkg_create_imported_all_libs_target_and_config_file() function and looks like:
find_package(<externalPkg> REQUIRED)

tribits_extpkg_create_imported_all_libs_target_and_config_file(
  <tplName>
  INNER_FIND_PACKAGE_NAME  <externalPkg>
  IMPORTED_TARGETS_FOR_ALL_LIBS  <importedTarget0> <importedTarget1> ... )
In this case, the purpose for the FindTPL<tplName>.cmake file (as opposed to a direct call to find_package(<externalPkg>)) is to ensure the definition of the complete target <tplName>::all_libs which contains all usage requirements for the external package/TPL (i.e. all of the libraries, include directories, etc.) and this also generates the wrapper package config file <tplName>Config.cmake.
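As a hedged, concrete instance of the generic form above, a hypothetical FindTPLFoo.cmake module (the external package name Foo and its imported target Foo::foo are made up for illustration) might look like:

# Let CMake find the hypothetical external package Foo (assumed to provide a
# FooConfig.cmake or FindFoo.cmake that defines the IMPORTED target Foo::foo).
find_package(Foo REQUIRED)

# Define Foo::all_libs wrapping Foo::foo and generate the TriBITS-compliant
# FooConfig.cmake wrapper file.
tribits_extpkg_create_imported_all_libs_target_and_config_file( Foo
  INNER_FIND_PACKAGE_NAME  Foo
  IMPORTED_TARGETS_FOR_ALL_LIBS  Foo::foo )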
The form of a simple FindTPL<tplName>.cmake file that just provides a list of required header files and libraries that does not use an internal call to find_package() and instead uses the function tribits_tpl_find_include_dirs_and_libraries() looks like:
tribits_tpl_find_include_dirs_and_libraries( <tplName>
  REQUIRED_HEADERS <header0> <header1> ...
  REQUIRED_LIBS_NAMES <libname0> <libname1> ...
  MUST_FIND_ALL_LIBS
  )
An example concrete file is tribits/common_tpls/FindTPLPETSC.cmake:
tribits_tpl_find_include_dirs_and_libraries( PETSC
  REQUIRED_HEADERS petsc.h
  REQUIRED_LIBS_NAMES petsc
  )
For complete details, see Creating the FindTPL<tplName>.cmake file.
<tplDefsDir>/FindTPL<tplName>Dependencies.cmake: [Optional] Declares dependencies on upstream external packages/TPLs for the external package/TPL <tplName>. Many external packages/TPLs defined with a FindTPL<tplName>.cmake file do not have any upstream dependencies or have internal mechanisms to get those (such as when using find_package(<externalPkg>), where the <externalPkg>Config.cmake file recursively uses find_dependency() to pull in its upstream dependencies). But for FindTPL<tplName>.cmake files that just use tribits_tpl_find_include_dirs_and_libraries() (see Creating a FindTPL<tplName>.cmake module without find_package()), TriBITS needs to be told about any upstream external packages/TPLs that it may depend on so it can add the dependencies between the created IMPORTED target libraries.
The file FindTPL<tplName>Dependencies.cmake is typically just a single call to tribits_extpkg_define_dependencies() and takes the form:
tribits_extpkg_define_dependencies( <tplName>
  DEPENDENCIES <upstreamTpl_0> <upstreamTpl_1> ... )
This defines all of the TPLs that <tplName> could directly depend on, but only dependencies for enabled upstream TPLs will be added to the IMPORTED targets.
NOTE: TPL-to-TPL dependencies are optional. Therefore, in the above example, enabling the TPL <tplName> will not auto-enable a dependent upstream TPL <upstreamTpl_i>. Likewise, disabling an upstream TPL <upstreamTpl_i> will not auto-disable a dependent downstream TPL <tplName>.
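Continuing the hypothetical Foo example from above, and assuming the made-up upstream TPLs Bar and Baz are also defined in some TPLsList.cmake file, a FindTPLFooDependencies.cmake file might be just:

# Declare that the TPL Foo (found with
# tribits_tpl_find_include_dirs_and_libraries()) may link against the
# upstream TPLs Bar and Baz; only the enabled ones get wired into the
# IMPORTED targets.
tribits_extpkg_define_dependencies( Foo
  DEPENDENCIES  Bar Baz )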
Once the <repoDir>/TPLsList.cmake files are all processed, then each defined TPL TPL_NAME is assigned the following global non-cache variables:
${TPL_NAME}_FINDMOD
For a non-TriBITS-compliant external package, this is the relative path (w.r.t. <projectDir>) or absolute path for the TriBITS TPL find module (typically named FindTPL<tplName>.cmake). This is set using the FINDMOD field in the call to tribits_repository_define_tpls(). The final value of the variable is defined by the last <repoDir>/TPLsList.cmake file that is processed that declares the TPL TPL_NAME. For example, if Repo1/TPLsList.cmake and Repo2/TPLsList.cmake both list the TPL SomeTpl, then if Repo2 is processed after Repo1, then SomeTpl_FINDMOD is determined by Repo2/TPLsList.cmake and the find module listed in Repo1/TPLsList.cmake is ignored. NOTE: For a TriBITS-compliant external package, the special value TRIBITS_PKG is also recognized. (Any pre-installed TriBITS package is a TriBITS-compliant external package.)

${TPL_NAME}_DEPENDENCIES_FILE
Relative path (w.r.t. <projectDir>) or absolute path for the external package/TPL's dependencies file (typically named FindTPL<tplName>Dependencies.cmake). This is always beside the find module ${TPL_NAME}_FINDMOD. (In fact, for a non-TriBITS-compliant external package, ${TPL_NAME}_DEPENDENCIES_FILE is constructed from ${TPL_NAME}_FINDMOD.) NOTE: A TriBITS-compliant external package with dependencies will also have this file set and the path will be specified independent of the path to the non-existent FindTPL<tplName>.cmake file (see the FINDMOD field in the call to tribits_repository_define_tpls()).

${TPL_NAME}_TESTGROUP
The TPL's Package Test Group. This is set using the CLASSIFICATION field in the call to tribits_repository_define_tpls(). If multiple repos define a given TPL, then the first <repoDir>/TPLsList.cmake file that is processed that declares the TPL TPL_NAME specifies the test group. For example, if Repo1/TPLsList.cmake and Repo2/TPLsList.cmake both list the TPL SomeTpl, then if Repo2 is processed after Repo1, then SomeTpl_TESTGROUP is determined by Repo1/TPLsList.cmake and the test group in Repo2/TPLsList.cmake is ignored. However, if ${TPL_NAME}_TESTGROUP is already set before the <repoDir>/TPLsList.cmake files are processed, then that test group will be used. Therefore, the project can override the test group for a given TPL if desired by setting ${TPL_NAME}_TESTGROUP before the first <repoDir>/TPLsList.cmake file gets processed (see the sketch after the notes below).

${TPL_NAME}_TPLS_LIST_FILE
Absolute path of the (last) <repoDir>/TPLsList.cmake file that declared this external package/TPL.
Note, the <findmod> field path in the call to tribits_repository_define_tpls() is relative to the TriBITS repository dir <repoDir>, but a relative path for the variable <tplName>_FINDMOD is relative to the project dir <projectDir>. A translation of the <findmod> field to the variable <tplName>_FINDMOD takes place when the <repoDir>/TPLsList.cmake file is processed to make this so.
As noted above, it is allowed for the same TPL to be listed in multiple <repoDir>/TPLsList.cmake files. In this case, the rules for overrides of the find module and the test group are as described above.
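As referenced in the ${TPL_NAME}_TESTGROUP item above, a sketch of such a project-level override (the TPL name SomeTpl is hypothetical, and placing the set() in the project's ProjectName.cmake file is just one choice of a file assumed to be processed before any TPLsList.cmake files):

# In a project-level file processed before any <repoDir>/TPLsList.cmake file
# (e.g. ProjectName.cmake): force the test group of the hypothetical TPL
# SomeTpl regardless of what the repositories' TPLsList.cmake files say.
set(SomeTpl_TESTGROUP ST)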
The specification given in Enabling support for an optional Third-Party Library (TPL) and Creating the FindTPL<tplName>.cmake file describes how to create a FindTPL<tplName>.cmake module. However, all that is really required is that some CMake file fragment exist that, once included, will define the target <tplName>::all_libs and create the <tplName>Config.cmake file in the correct location (see Requirements for FindTPL<tplName>.cmake modules).
One of the most important things to know about TriBITS is what files it processes, in what order, and in what context. This is critical to being able to understand what impact (if any) setting a variable or otherwise changing the CMake run-time state will have on configuring a CMake project which uses TriBITS. While the different files that make up a TriBITS Project, TriBITS Repository, TriBITS Package, TriBITS Subpackage, and TriBITS TPL were defined in the section TriBITS Structural Units, that material did not fully describe the context and in what order these files are processed by the TriBITS framework.
The TriBITS system processes the project's files in one of two general use cases. The first use case is in the basic configuration of the project with a standard cmake command invocation in order to set up the build files in the binary directory (see Full TriBITS Project Configuration). The second use case is in reading the project's dependency-related files in order to build the package dependency data-structure (e.g. the <Project>PackageDependencies.xml file, see Reduced Package Dependency Processing). The second use case of reading the project's dependency files is largely a subset of the first.
Another factor that is important to understand is the scoping in which the various files are processed (with include() or add_subdirectory()). This scoping has a large impact on the configuration of the project and what effect the processing of files and setting variables have on the project as a whole. Some of the strange scoping rules for CMake are discussed in CMake Language Overview and Gotchas and should be understood before trying to debug issues with processing. Many of the basic files are processed (included) in the base project <projectDir>/CMakeLists.txt scope and therefore any local variables set in these files are accessible to the entire CMake project (after the file is processed, of course). Other files get processed inside of functions which have their own local scope and therefore only impact the rest of the project in more purposeful ways. And of course all of the package <packageDir>/CMakeLists.txt files that are processed using add_subdirectory() create a new local scope for that given package.
The first use case to describe is the full processing of all of the TriBITS project's files starting with the base <projectDir>/CMakeLists.txt file. This begins with the invocation of the following CMake command to generate the project's build files:
$ cmake [options] <projectDir>
Below, is a short pseudo-code algorithm for the TriBITS framework processing and callbacks that begins in the <projectDir>/CMakeLists.txt file and proceeds through the call to tribits_project().
Full Processing of TriBITS Project Files:
The TriBITS Framework obviously does a lot more than what is described above but the basic trace of major operations and ordering and the processing of project, repository, package, and subpackage files should be clear. All of this information should also be clear when enabling File Processing Tracing and watching the output from the cmake configure STDOUT.
In addition to the full processing that occurs as part of the Full TriBITS Project Configuration, there are also TriBITS tools that only process a subset of the project's files. This reduced processing is performed in order to build up the project's package dependencies data-structure and to write the file <Project>PackageDependencies.xml. For example, the tool checkin-test.py and the function tribits_ctest_driver() both drive this type of processing. In particular, the CMake -P script TribitsDumpDepsXmlScript.cmake reads all of the project's dependency-related files and dumps out the <Project>PackageDependencies.xml file (see TriBITS Project Dependencies XML file and tools). This reduced processing (e.g. as executed in cmake -P TribitsDumpDepsXmlScript.cmake) is described below.
Reduced Dependency Processing of TriBITS Project:
When comparing the above reduced dependency processing to the Full Processing of TriBITS Project Files it is important to note that several files are not processed in the reduced algorithm shown above. The files that are not processed include <projectDir>/Version.cmake, <repoDir>/Version.cmake and <repoDir>/cmake/CallbackSetupExtraOptions.cmake (in addition to not processing any of the CMakeLists.txt files obviously). Therefore, one cannot put anything in these non-processed files that would impact the definition of TriBITS repositories, packages, TPLs, etc. Anything that would affect the dependencies data-structure that gets written out as <Project>PackageDependencies.xml must be contained in the files that are processed shown in the reduced processing above.
Debugging issues with Reduced Dependency Processing of TriBITS Project Files is more difficult because one cannot easily turn on File Processing Tracing like one can when doing the full CMake configure. However, options may be added to the various tools to show this file processing and help debug problems.
In order to aid in debugging problems with Full TriBITS Project Configuration and Reduced Package Dependency Processing, TriBITS defines the CMake cache option ${PROJECT_NAME}_TRACE_FILE_PROCESSING. When enabled, TriBITS will print out when any of the project-related, repository-related, or package-related file is being processed by TriBITS. When ${PROJECT_NAME}_TRACE_FILE_PROCESSING=ON, lines starting with "-- File Trace:" are printed to cmake stdout for files that TriBITS automatically processes where there may be any confusion about what files are processed and when.
For example, for TribitsExampleProject, the configure file trace for the configure command:
$ cmake \
  -DTribitsExProj_TRIBITS_DIR=<tribitsDir> \
  -DTribitsExProj_ENABLE_MPI=ON \
  -DTribitsExProj_ENABLE_ALL_PACKAGES=ON \
  -DTribitsExProj_ENABLE_TESTS=ON \
  -DTribitsExProj_TRACE_FILE_PROCESSING=ON \
  -DTribitsExProj_ENABLE_CPACK_PACKAGING=ON \
  -DTribitsExProj_DUMP_CPACK_SOURCE_IGNORE_FILES=ON \
  <tribitsDir>/doc/TribitsExampleProject \
  | grep "^-- File Trace:"
looks something like:
-- File Trace: PROJECT INCLUDE [...]/Version.cmake
-- File Trace: REPOSITORY INCLUDE [...]/cmake/CallbackSetupExtraOptions.cmake
-- File Trace: REPOSITORY INCLUDE [...]/PackagesList.cmake
-- File Trace: REPOSITORY INCLUDE [...]/TPLsList.cmake
-- File Trace: PACKAGE INCLUDE [...]/packages/simple_cxx/cmake/Dependencies.cmake
-- File Trace: PACKAGE INCLUDE [...]/packages/mixed_lang/cmake/Dependencies.cmake
-- File Trace: PACKAGE INCLUDE [...]/packages/with_subpackages/cmake/Dependencies.cmake
-- File Trace: PACKAGE INCLUDE [...]/packages/with_subpackages/a/cmake/Dependencies.cmake
-- File Trace: PACKAGE INCLUDE [...]/packages/with_subpackages/b/cmake/Dependencies.cmake
-- File Trace: PACKAGE INCLUDE [...]/packages/with_subpackages/c/cmake/Dependencies.cmake
-- File Trace: PACKAGE INCLUDE [...]/packages/wrap_external/cmake/Dependencies.cmake
-- File Trace: PROJECT CONFIGURE [...]/cmake/ctest/CTestCustom.cmake.in
-- File Trace: REPOSITORY READ [...]/Copyright.txt
-- File Trace: REPOSITORY INCLUDE [...]/Version.cmake
-- File Trace: PACKAGE ADD_SUBDIR [...]/packages/simple_cxx/CMakeLists.txt
-- File Trace: PACKAGE ADD_SUBDIR [...]/packages/simple_cxx/test/CMakeLists.txt
-- File Trace: PACKAGE ADD_SUBDIR [...]/packages/mixed_lang/CMakeLists.txt
-- File Trace: PACKAGE ADD_SUBDIR [...]/packages/mixed_lang/test/CMakeLists.txt
-- File Trace: PACKAGE ADD_SUBDIR [...]/packages/with_subpackages/CMakeLists.txt
-- File Trace: PACKAGE ADD_SUBDIR [...]/packages/with_subpackages/a/CMakeLists.txt
-- File Trace: PACKAGE ADD_SUBDIR [...]/packages/with_subpackages/a/tests/CMakeLists.txt
-- File Trace: PACKAGE ADD_SUBDIR [...]/packages/with_subpackages/b/CMakeLists.txt
-- File Trace: PACKAGE ADD_SUBDIR [...]/packages/with_subpackages/b/tests/CMakeLists.txt
-- File Trace: PACKAGE ADD_SUBDIR [...]/packages/with_subpackages/c/CMakeLists.txt
-- File Trace: PACKAGE ADD_SUBDIR [...]/packages/with_subpackages/c/tests/CMakeLists.txt
-- File Trace: PROJECT INCLUDE [...]/cmake/CallbackDefineProjectPackaging.cmake
-- File Trace: REPOSITORY INCLUDE [...]/cmake/CallbackDefineRepositoryPackaging.cmake
Note, however, that not every file that TriBITS processes is printed in this file trace; a file is omitted if it should be obvious that the file is being processed. For example, the package's configured header file created using tribits_configure_file() does not result in a file trace print statement because this is an unconditional command that is explicitly called in one of the package's CMakeLists.txt files, so it should be clear that this file is being processed and exactly when it is processed.
Certain simplifications are allowed when defining TriBITS projects, repositories and packages. The known allowed simplifications are described below.
TriBITS Repository Dir == TriBITS Project Dir: It is allowed for a TriBITS Project and a TriBITS Repository to be the same source directory, and in fact this is the default for every TriBITS project (unless <projectDir>/cmake/NativeRepositoriesList.cmake is defined). In this case, the repository name REPOSITORY_NAME and the project name PROJECT_NAME are the same as well, and the project, being also a TriBITS repository, must contain <repoDir>/PackagesList.cmake and <repoDir>/TPLsList.cmake files. This is the case, for example, with the Trilinos and TribitsExampleProject projects and repositories. In this case, the Project's and the Repository's Version.cmake and Copyright.txt files are also one and the same, as they should be (see Project and Repository Versioning and Release Mode).
TriBITS Package Dir == TriBITS Repository Dir: It is also allowed for a TriBITS Repository to have only one package and to have that package be the base repository directory. The TriBITS Repository and the single TriBITS Package would typically have the same name in this case (that is not actually required, but it is confusing if they are not the same). For example, in the TriBITS test project MockTrilinos, the repository and package extraRepoOnePackage are the same directory. In this case, the file extraRepoOnePackage/PackagesList.cmake looks like:
tribits_repository_define_packages(
  extraRepoOnePackage  .  ST
  )
(Note the dot '.' for the package directory.)
This is also how the real TriBITS repository and package DataTransferKit is set up (at least that is the way it was when this document was first written).
However, to maximize flexibility, it is recommended that a TriBITS package and its TriBITS repository not share the same directory or the same name. This allows a TriBITS repository to define more packages later.
TriBITS Package Dir == TriBITS Repository Dir == TriBITS Project Dir: In the extreme, it is possible to collapse a single TriBITS package, repository, and project into the same base source directory. They can also share the same name for the package, repository, and project. One example of this is the TriBITS project and The TriBITS Test Package themselves, which are both rooted in the base TriBITS/ source directory of the stand-alone TriBITS repository. There are a few restrictions and modifications needed to get this to work:
Other than that simple modification to the top-level CMakeLists.txt file, a TriBITS project, repository, and package can all be rooted in the same source directory.
The primary use case for collapsing a project, repository, and package into a single base source directory would be to support the stand-alone build of a TriBITS package as its own entity that uses an independent installation of TriBITS (or a minimal snapshot of TriBITS). If a given TriBITS package has no required upstream TriBITS package dependencies and minimal external package/TPL dependencies (or only uses Standard TriBITS TPLs or Common TriBITS TPLs already defined in the tribits/core/std_tpls/ or tribits/common_tpls/ directories), then creating a stand-alone project build of a single TriBITS package requires fairly little extra overhead or duplication.
While a TriBITS Repository can define its own external packages/TPLs and its own TPL find modules (see TriBITS External Package/TPL), the TriBITS source tree contains TriBITS find modules for a few different standard TPLs and common TPLs. Standard TriBITS TPLs are integral to the TriBITS system itself, while Common TriBITS TPLs are TPLs that are used in several different TriBITS Repositories and are contained in TriBITS for convenience and uniformity.
TriBITS contains find modules for a few standard TPLs integral to the TriBITS system. The standard TriBITS TPLs are contained under the directory:
tribits/core/std_tpls/
The current list of standard TriBITS TPL find modules is:
FindTPLCUDA.cmake
FindTPLMPI.cmake
The TPLs MPI and CUDA are standard because they are special in that they define compilers and other special tools that are used in tribits_add_library(), tribits_add_executable(), tribits_add_test() and other commands.
These standard TPLs are used in a <repoDir>/TPLsList.cmake file as:
tribits_repository_define_tpls(
  MPI   "${${PROJECT_NAME}_TRIBITS_DIR}/core/std_tpls/"  PT
  CUDA  "${${PROJECT_NAME}_TRIBITS_DIR}/core/std_tpls/"  ST
  ...
  )
TriBITS also contains find modules for several TPLs that are used across many independent TriBITS repositories. The goal of maintaining these under TriBITS is to enforce conformity in case these independent repositories are combined into a single meta-project.
The common TriBITS TPLs are contained under the directory:
tribits/common_tpls/
The current list of common TriBITS TPL find modules is:
find_modules
FindTPLBinUtils.cmake
FindTPLBLAS.cmake
FindTPLBoost.cmake
FindTPLCGNS.cmake
FindTPLCGNSDependencies.cmake
FindTPLHDF5.cmake
FindTPLLAPACK.cmake
FindTPLLAPACKDependencies.cmake
FindTPLNetcdf.cmake
FindTPLNetcdfDependencies.cmake
FindTPLPETSC.cmake
FindTPLPnetcdf.cmake
FindTPLProjectLastLib.cmake
utils
Common TPLs are used in a <repoDir>/TPLsList.cmake file as:
tribits_repository_define_tpls(
  BLAS    "${${PROJECT_NAME}_TRIBITS_DIR}/common_tpls/"  PT
  LAPACK  "${${PROJECT_NAME}_TRIBITS_DIR}/common_tpls/"  PT
  ...
  )
By using a standard TPL definition, it is guaranteed that the TPL used will be consistent with all of the TriBITS packages that depend on these TPLs in case they are combined into a single project.
Note that just because packages in two different TriBITS repositories reference the same TPL does not necessarily mean that it needs to be moved into the TriBITS source tree under tribits/common_tpls/. For example, if the TPL QT is defined in an upstream repository (e.g. Trilinos), then a package in a downstream repository can list a dependency on the TPL QT without having to define its own QT TPL in its repository's <repoDir>/TPLsList.cmake file. For more details, see TriBITS TPL.
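A hedged sketch of that last point (the downstream package itself is hypothetical; QT is assumed to already be defined by the upstream Trilinos repository's TPLsList.cmake file): the downstream package's Dependencies.cmake file simply lists QT, and the downstream repository does not define the TPL again:

# cmake/Dependencies.cmake of a hypothetical package in a downstream repo:
# the QT TPL is defined by the upstream repository, so it is only referenced
# here, not redefined in this repository's TPLsList.cmake file.
tribits_package_define_dependencies(
  LIB_OPTIONAL_TPLS  QT
  )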
At the CMake build-system level, there are just a few key requirements that a TriBITS package has for its upstream dependent packages when it is being configured to be built. These requirements apply whether the upstream package is defined internally in the current CMake project or provided externally and pulled in through find_package(<Package>).
The common requirements for both internal and external TriBITS-compliant packages as imposed by downstream TriBITS internal packages are:
The TriBITS system will also set the variable:
for all packages that are determined to be TriBITS-compliant packages and satisfy the above criteria.
The above are all that is needed by downstream TriBITS packages to build and link against their upstream dependencies.
Additional requirements are placed on TriBITS-compliant packages depending on if they are defined as internal CMake packages (i.e. TriBITS-compliant internal packages) or are pulled in as external pre-built/pre-installed packages (i.e. TriBITS-compliant external packages).
For TriBITS packages that are defined, built, and installed from a TriBITS CMake project, there is an additional set of requirements for them to behave correctly with respect to other TriBITS packages.
The requirements for TriBITS-compliant internal packages are:
If a TriBITS package provides any CTest tests/examples, then it must also satisfy the following requirements:
TriBITS internal packages that are defined with the TriBITS-provided macros and functions such as tribits_add_library(), and that have tests defined using the functions tribits_add_test() and tribits_add_advanced_test(), are automatically TriBITS-compliant internal packages. When these TriBITS-implemented internal packages are installed, they automatically provide TriBITS-compliant external packages. But it is also possible for a CMake package to write its own raw CMake code to satisfy these basic requirements for both internal and (installed) external packages.
For packages that are installed on the system and not built in the current CMake project, a streamlined type of TriBITS External Package/TPL is a TriBITS-compliant external package. These special types of external packages don't need to provide a FindTPL<tplName>.cmake find module. Instead, they are fully defined by calling find_package(<Package>) or include(<someBaseDir>/<Package>Config.cmake) to load their <Package>Config.cmake package config file.
The requirements for TriBITS-compliant external packages are:
NOTE: TriBITS-compliant external packages that provide TriBITS-compliant external packages for all of their upstream dependencies are said to be fully TriBITS-compliant external packages while those that support the minimal requirements are said to be minimally TriBITS-compliant external packages. The TriBITS external package/TPL system is robust enough to deal with minimally TriBITS-compliant external packages. Any TriBITS external packages/TPLs upstream from a minimally TriBITS-compliant external package will be found again in the current TriBITS project. (In these cases, it is up to the user to make sure that the same upstream packages are found.)
In this section, a few different example TriBITS projects and packages are previewed. Most of these examples exist in the TriBITS source directory tribits itself so they are available to all users of TriBITS. These examples also provide a means to test the TriBITS system itself (see The TriBITS Test Package).
The first example covered is the bare-bones TribitsHelloWorld example project. The second example covered in detail is TribitsExampleProject. This example covers all the basics for setting up a simple multi-package TriBITS project. The third example outlined is MockTrilinos which mostly exists to test the TriBITS system itself but also contains some nice examples of a few different TriBITS features and behaviors. The fourth example is the ReducedMockTrilinos project which is used to demonstrate TriBITS behavior in this document. Also mentioned is the Trilinos project itself which can be a useful example of the usage of TriBITS (see disclaimers in the section Trilinos). The last example mentioned is The TriBITS Test Package itself which allows the TriBITS system to be tested and installed from any TriBITS project that lists it, including the TriBITS project itself (see Coexisting Projects, Repositories, and Packages).
The directory tribits/examples/ contains some other example TriBITS projects and repositories as well that are referred to in this and other documents.
TribitsHelloWorld is about the simplest possible TriBITS project that you can imagine and is contained under the directory:
tribits/examples/TribitsHelloWorld/
This example project contains only a single TriBITS package and no frills at all (it does not support MPI or Fortran). However, it does show how minimal a TriBITS Project (which is also a TriBITS Repository) and a TriBITS Package can be and still demonstrates some of the value of TriBITS over raw CMake. The simple HelloWorld package is used to compare with the raw CMakeLists.txt file in the RawHelloWorld example project in the TriBITS Overview document.
The directory structure for this example is shown below:
TribitsHelloWorld/
  CMakeLists.txt
  PackagesList.cmake
  ProjectName.cmake
  README
  TPLsList.cmake
  hello_world/
    CMakeLists.txt
    cmake/
      Dependencies.cmake
    hello_world_lib.cpp
    hello_world_lib.hpp
    hello_world_main.cpp
    hello_world_unit_tests.cpp
This has all of the required TriBITS Project Core Files, TriBITS Repository Core Files, and TriBITS Package Core Files. It just builds a simple library, a simple executable, and a test executable, and then tests them, as shown by the file TribitsHelloWorld/hello_world/CMakeLists.txt which is:
tribits_package(HelloWorld)

tribits_add_library(hello_world_lib
  HEADERS hello_world_lib.hpp
  SOURCES hello_world_lib.cpp)

tribits_add_executable(hello_world NOEXEPREFIX
  SOURCES hello_world_main.cpp
  INSTALLABLE)

tribits_add_test(hello_world NOEXEPREFIX
  PASS_REGULAR_EXPRESSION "Hello World")

tribits_add_executable_and_test(unit_tests
  SOURCES hello_world_unit_tests.cpp
  PASS_REGULAR_EXPRESSION "All unit tests passed")

tribits_package_postprocess()
The build and test of this simple project is tested in the The TriBITS Test Package file:
TriBITS/test/core/ExamplesUnitTests/CMakeLists.txt
Note that this little example is a fully functional TriBITS Repository and can be embedded into a larger TriBITS meta-project and be seamlessly built along with any other such TriBITS-based software.
TribitsExampleProject is an example TriBITS Project and TriBITS Repository contained in the TriBITS source tree under:
tribits/examples/TribitsExampleProject/
When this is used as the base TriBITS project, this directory corresponds to <projectDir> and <repoDir> referenced in TriBITS Project Core Files and TriBITS Repository Core Files, respectively.
Several files from this project are used as examples in the section TriBITS Project Structure. Here, a fuller description is given of this project and a demonstration of how TriBITS works. From this simple example project, one can quickly see how the basic structural elements of a TriBITS project, repository, and package (and subpackage) are pulled together.
This simple project shows how what is listed in files:
are used to specify the packages and TPLs in a TriBITS project and repository. More details about the contents of the Dependencies.cmake files are described in the section Package Dependencies and Enable/Disable Logic.
The name of this project PROJECT_NAME is given in its TribitsExampleProject/ProjectName.cmake file:
# Must set the project name at very beginning before including anything else
set(PROJECT_NAME TribitsExProj)

# Turn on export dependency generation for WrapExteranl package
set(${PROJECT_NAME}_GENERATE_EXPORT_FILE_DEPENDENCIES_DEFAULT ON)

# Turn on by default the generation of the export files
set(${PROJECT_NAME}_ENABLE_INSTALL_CMAKE_CONFIG_FILES_DEFAULT ON)
The variable PROJECT_NAME=TribitsExProj is used to prefix (using "${PROJECT_NAME}_") all of the project's global TriBITS variables like TribitsExProj_ENABLE_TESTS, TribitsExProj_ENABLE_ALL_PACKAGES, etc.
The directory structure and key files for this example project are shown in the partial list of TribitsExampleProject Files and Directories below:
TribitsExampleProject/
  CMakeLists.txt
  Copyright.txt
  PackagesList.cmake
  ProjectName.cmake
  project-checkin-test-config.py
  TPLsList.cmake
  Version.cmake
  ...
  cmake/
    CallbackDefineProjectPackaging.cmake
    CallbackDefineRepositoryPackaging.cmake
    CallbackSetupExtraOptions.cmake
  packages/
    simple_cxx/
      CMakeLists.txt
      cmake/
        CheckFor__int64.cmake
        Dependencies.cmake
        SimpleCxx_config.h.in
      src/
        CMakeLists.txt
        SimpleCxx_HelloWorld.cpp
        SimpleCxx_HelloWorld.hpp
      test/
        CMakeLists.txt
        SimpleCxx_HelloWorld_Tests.cpp
    mixed_lang/
      ...
    with_subpackages/
      CMakeLists.txt
      cmake/
        Dependencies.cmake
      A/
        CMakeLists.txt
        cmake/
          Dependencies.cmake
        ...
      B/
        ...
      C/
        ...
    wrap_external/
      ...
Above, the sub-directories under packages/ are sorted according to the order listed in the TribitsExampleProject/PackagesList.cmake file:
tribits_repository_define_packages(
  SimpleCxx        packages/simple_cxx        PT
  MixedLang        packages/mixed_lang        PT
  InsertedPkg      InsertedPkg                ST
  WithSubpackages  packages/with_subpackages  PT
  WrapExternal     packages/wrap_external     ST
  )

tribits_disable_package_on_platforms(WrapExternal Windows)
tribits_allow_missing_external_packages(InsertedPkg)
From this file, we get the list of top-level packages SimpleCxx, MixedLang, WithSubpackages, and WrapExternal (and their base package directories and testing group, see <repoDir>/PackagesList.cmake). (NOTE: By default the package InsertedPkg is not defined because its directory is missing, see How to insert a package into an upstream repo.)
A full listing of package files in TribitsExampleProject Files and Directories is only shown for the SimpleCxx package directory packages/simple_cxx/. For this package, <packageDir> = <repoDir>/packages/simple_cxx and PACKAGE_NAME = SimpleCxx. As explained in TriBITS Package Core Files, the files <packageDir>/cmake/Dependencies.cmake and <packageDir>/CMakeLists.txt must exist for every package directory listed in <repoDir>/PackagesList.cmake, and we see these files in the directory packages/simple_cxx/. The package SimpleCxx does not have any upstream package dependencies.
Now consider the example top-level package WithSubpackages which, as the name suggests, is broken down into subpackages. The WithSubpackages dependencies file:
TribitsExampleProject/packages/with_subpackages/cmake/Dependencies.cmake
with contents:
tribits_package_define_dependencies(
  SUBPACKAGES_DIRS_CLASSIFICATIONS_OPTREQS
    A   a   PT  REQUIRED
    B   b   ST  OPTIONAL
    C   c   ST  OPTIONAL
  REGRESSION_EMAIL_LIST  with_packages-regressions@someurl.none
  )
references the three subpackages with sub-directories <spkgDir> = a, b, and c under the parent package directory packages/with_subpackages/ which are shown in TribitsExampleProject Files and Directories. This gives another set of three packages WithSubpackagesA, WithSubpackagesB, and WithSubpackagesC. Combining <packageDir> = packages/with_subpackages and <spkgDir> for each subpackage gives the subpackage directories:
TribitsExampleProject/packages/with_subpackages/a/
TribitsExampleProject/packages/with_subpackages/b/
TribitsExampleProject/packages/with_subpackages/c/
Together with the top-level parent package WithSubpackages itself, this top-level package provides four TriBITS Packages giving the final list of packages provided by this TriBITS repo as:
SimpleCxx MixedLang WithSubpackagesA WithSubpackagesB WithSubpackagesC \
  WithSubpackages WrapExternal 7
The above list of packages is printed (with the number of packages printed at the end) by TriBITS to the cmake stdout on the line starting with "Final set of non-enabled packages:" when no packages are enabled (see Selecting the list of packages to enable). (Note that TriBITS does not put in line breaks with continuation characters "\" as shown above.) TriBITS defines enable/disable cache variables for each of these defined packages, like TribitsExProj_ENABLE_SimpleCxx and TribitsExProj_ENABLE_WithSubpackagesA, and defines all the variables listed in TriBITS Package Cache Variables that are settable by the user or by the dependency logic described in section Package Dependencies and Enable/Disable Logic.
When starting a new TriBITS project, repository, or package, one should consider basing these on the examples in TribitsExampleProject. In fact, the skeletons for any of these project, repository, and package files should be copied from this example project as they represent best practice when using TriBITS for the typical use cases.
TribitsExampleProject2 is an example TriBITS Project and TriBITS Repository contained in the TriBITS source tree under:
tribits/examples/TribitsExampleProject2/
This example TriBITS project provides some examples for a few other features and testing scenarios. It contains three internal packages Package1, Package2, and Package3 as shown in its PackagesList.cmake file:
tribits_repository_define_packages(
  Package1  packages/package1  PT
  Package2  packages/package2  PT
  Package3  packages/package3  PT
  )
and supports four external packages/TPLs Tpl1, Tpl2, Tpl3, and Tpl4 as shown in its TPLsList.cmake file:
tribits_repository_define_tpls(
  Tpl1  "cmake/tpls/"                            PT
  Tpl2  "cmake/tpls/"                            PT
  Tpl3  "${CMAKE_CURRENT_LIST_DIR}/cmake/tpls/"  PT
  Tpl4  "cmake/tpls/"                            PT
  )

# NOTE: Above we are setting the findmod path to an absolute path just to test
# that case with TPL dependencies (see trilinos/Trilinos#10774).
The TriBITS project MockTrilinos is contained under the directory:
tribits/examples/MockTrilinos/
This TriBITS project is not a full TriBITS project (i.e. it does not build anything). Instead, it is primarily used to test the TriBITS system using tests defined in The TriBITS Test Package. The MockTrilinos project is actually given the name PROJECT_NAME = Trilinos and contains a subset of Trilinos packages with slightly modified dependencies from a real version of the Trilinos project from May 2009. The list of packages in:
tribits/examples/MockTrilinos/PackagesList.cmake
is:
tribits_repository_define_packages(
  TrilinosFramework  cmake                 PT
  Teuchos            packages/teuchos      PT
  RTOp               packages/rtop         PT
  Epetra             packages/epetra       PT
  Zoltan             packages/zoltan       PT
  Shards             packages/shards       PT
  Triutils           packages/triutils     PT
  Tpetra             packages/tpetra       PT
  EpetraExt          packages/epetraext    PT
  Stokhos            packages/stokhos      EX
  Sacado             packages/sacado       ST
  Thyra              packages/thyra        PT
  Isorropia          packages/isorropia    PT
  AztecOO            packages/aztecoo      PT
  Galeri             packages/galeri       PT
  Amesos             packages/amesos       PT
  Intrepid           packages/intrepid     PT
  Ifpack             packages/ifpack       PT
  ML                 packages/ml           PT
  Belos              packages/belos        ST
  Stratimikos        packages/stratimikos  PT
  RBGen              packages/rbgen        PT
  Phalanx            packages/phalanx      ST
  Panzer             packages/panzer       ST
  AlwaysMissing      AlwaysMissing         PT
  )

# NOTE: Sacado was really PT but for testing purpose it is made ST
# NOTE: Belos was really PT but for testing purpose it is made ST

tribits_allow_missing_external_packages(AlwaysMissing)
tribits_disable_package_on_platforms(ML BadSystem1)
tribits_disable_package_on_platforms(Ifpack BadSystem1 BadSystem2)
All of the package directories listed above have <packageDir>/cmake/Dependencies.cmake files but generally do not have <packageDir>/CMakeLists.txt files since most usage of MockTrilinos just involves testing the algorithms and behaviors described in the section Package Dependencies and Enable/Disable Logic.
MockTrilinos also contains a number of extra TriBITS repositories that are used in various tests and that offer examples of different types of TriBITS repositories. New extra test repositories are added when new types of tests require new package and TPL dependency structures, since the existing dependency tests based on MockTrilinos are, by their very nature, expensive to change.
The primary reason that the MockTrilinos test project is mentioned in this developers guide is that it contains a variety of packages, subpackages, and TPLs with a variety of different types of dependencies. This variety is needed to more fully test the TriBITS system, but this project and its tests also serve as examples and extra documentation for the behavior of the TriBITS system. Several of the dependency-related examples referenced in this document come from MockTrilinos.
Most of the dependency tests involving MockTrilinos are specified in:
TriBITS/test/core/DependencyUnitTests/CMakeLists.txt
A great deal about the current behavior of TriBITS Package Dependencies and Enable/Disable Logic can be learned from inspecting these tests. There are also some faster-running unit tests involving MockTrilinos defined in the file:
TriBITS/test/core/TribitsAdjustPackageEnables_UnitTests.cmake
The TriBITS project ReducedMockTrilinos is contained under the directory:
tribits/examples/ReducedMockTrilinos/
It is a scaled-down version of the MockTrilinos test project with just a handful of packages and some modified dependencies. Its primary purpose is to provide the examples used in the section Package Dependencies and Enable/Disable Logic and to test a few features of the TriBITS system not covered in other tests.
The list of packages in:
tribits/examples/ReducedMockTrilinos/PackagesList.cmake
is:
tribits_repository_define_packages(
  Teuchos    packages/teuchos    PT
  RTOp       packages/rtop       PT
  Epetra     packages/epetra     PT
  Triutils   packages/triutils   ST
  EpetraExt  packages/epetraext  ST
  Thyra      packages/thyra      PT
  )
All of the listed packages are standard TriBITS packages except for the mock Thyra package which is broken down into subpackages. More details of this example project are described in Package Dependencies and Enable/Disable Logic.
The real Trilinos project and repository itself is an advanced example of the usage of TriBITS. Almost every single-repository use case for TriBITS is demonstrated somewhere in Trilinos. While some of the usage of TriBITS in Trilinos may not be exemplary (e.g., because it represents old usage, or was written by CMake/TriBITS beginners), it does represent real working usage. Given that Trilinos is a widely available software repository, anyone should be able to access a newer version of Trilinos and mine it for CMake and TriBITS examples.
The last TriBITS example mentioned here is the TriBITS test package named (appropriately) TriBITS itself defined in the TriBITS repository. The directory for the TriBITS test package is the base TriBITS source directory tribits. This allows any TriBITS project to add testing for the TriBITS system by just listing the TriBITS repository in its <projectDir>/cmake/ExtraRepositoriesList.cmake file. Trilinos lists the TriBITS repository in its ExtraRepositoriesList.cmake file as:
tribits_project_define_extra_repositories(
  TriBITS  ""  GIT  https://github.com/TriBITSPub/TriBITS  ""  Continuous
  ...
  )
No downstream TriBITS packages list a dependency on TriBITS in their <packageDir>/cmake/Dependencies.cmake files. Defining the TriBITS test package is only done for running the TriBITS tests.
Once the TriBITS test package is added to the list of project/repository packages, it can be enabled just like any other package by adding the following to the cmake command-line options:
-D <Project>_ENABLE_TriBITS=ON \
-D <Project>_ENABLE_TESTS=ON
One can then inspect the added tests prefixed by "TriBITS_" to see what tests are defined and how they are run. There is a wealth of information about the TriBITS system embedded in these tests and, where the documentation and these tests disagree, believe the tests!
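For example, after configuring with the above options and building, one could run just these tests with a standard ctest invocation that matches the "TriBITS_" prefix:

  $ ctest -R '^TriBITS_'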
Arguably, the most important feature/aspect of the TriBITS system is the partitioning of a large software project into packages and the management of the dependencies between these packages to support building, testing, and deploying different pieces as needed (see the discussion of Software Engineering Packaging Principles). This is especially useful in incremental CI testing of large projects. However, maintaining such dependencies is also a critical component in creating and maintaining Self-Sustaining Software (see the TriBITS Lifecycle Model). The fundamental mechanism for breaking up a large body of software into manageable pieces is to partition the software into different TriBITS Packages and then define the dependencies between these packages (which are defined inside of the <packageDir>/cmake/Dependencies.cmake files for each package).
Note that the basic idea of breaking up a large set of software into pieces, defining dependencies between the pieces, and then applying algorithms to manipulate the dependency data-structures is nothing new. In fact, nearly every binary package deployment system provided in various Linux OS distributions has the concept of packages and dependencies and will automatically install all of the necessary upstream dependencies when the install of a downstream package is requested. The main difference (and the added complexity) with TriBITS is that it can handle both required and optional dependencies since it can build from source. A binary package installation system, however, typically can't support optional dependencies because only pre-built binary libraries and tools are available to install.
This section is organized and broken-down as follows. First, the subsection Example ReducedMockTrilinos Project Dependency Structure presents the ReducedMockTrilinos example project, describes its dependency structure, and uses it to begin to describe how TriBITS sets up and manages package dependencies. The packages in this ReducedMockTrilinos example project are used in the following subsections so one will be constantly referred back to this subsection. The following subsection TriBITS Dependency Handling Behaviors defines and describes the nitty-gritty details of the TriBITS package dependency structure and the algorithms that manipulate the various package and test enables and disables. Specific examples for the TriBITS dependency handling algorithms are given in the subsection Example Enable/Disable Use Cases. Finally, the subsection <Project>PackageDependencies.xml describes the standard XML output data-structure that gets created by TriBITS that defines a project's package dependencies.
To demonstrate the TriBITS package dependency handling system, the small and simple ReducedMockTrilinos project is used. The list of packages for this project is defined in the file ReducedMockTrilinos/PackagesList.cmake (see <repoDir>/PackagesList.cmake), whose contents are:
tribits_repository_define_packages(
  Teuchos    packages/teuchos    PT
  RTOp       packages/rtop       PT
  Epetra     packages/epetra     PT
  Triutils   packages/triutils   ST
  EpetraExt  packages/epetraext  ST
  Thyra      packages/thyra      PT
  )
All of the listed packages are standard TriBITS packages except for the mock Thyra package which is broken down into subpackages as shown in packages/thyra/cmake/Dependencies.cmake (see <packageDir>/cmake/Dependencies.cmake) which is:
tribits_package_define_dependencies(
  SUBPACKAGES_DIRS_CLASSIFICATIONS_OPTREQS
    CoreLibs    src                 PT  REQUIRED
    GoodStuff   good_stuff          ST  OPTIONAL
    CrazyStuff  crazy_stuff         EX  OPTIONAL
    Epetra      adapters/epetra     PT  OPTIONAL
    EpetraExt   adapters/epetraext  ST  OPTIONAL
  )
This gives the full list of top-level TriBITS packages:
Teuchos RTOp Epetra Triutils EpetraExt Thyra
Adding in the subpackages defined in the top-level Thyra package, the full set of TriBITS Packages for this project is:
Teuchos RTOp Epetra Triutils EpetraExt ThyraCoreLibs ThyraGoodStuff \
  ThyraCrazyStuff ThyraEpetra ThyraEpetraExt Thyra
Note that one can see this full list of top-level packages and the full list of packages in the cmake configure output lines starting with:
Final set of non-enabled top-level packages:
Final set of non-enabled packages:
respectively, when configuring with no package enables as shown in the example Default configure with no packages enabled on input.
The list of TriBITS External Packages/TPLs for this example project is given in the file ReducedMockTrilinos/TPLsList.cmake (see <repoDir>/TPLsList.cmake), which is:
tribits_repository_define_tpls(
  MPI      "${${PROJECT_NAME}_TRIBITS_DIR}/core/std_tpls/"  PT
  BLAS     "${${PROJECT_NAME}_TRIBITS_DIR}/common_tpls/"    PT
  LAPACK   "${${PROJECT_NAME}_TRIBITS_DIR}/common_tpls/"    PT
  Boost    cmake/TPLs/                                      ST
  UMFPACK  cmake/TPLs/                                      ST
  AMD      cmake/TPLs/                                      EX
  PETSC    "${${PROJECT_NAME}_TRIBITS_DIR}/common_tpls/"    ST
  )
Take note of the Package Test Group (i.e. PT, ST, or EX) assigned to each package as it plays a significant role in how the TriBITS dependency system handles enables and disables.
The dependency structure of this simple TriBITS project is shown below in ReducedMockTrilinos Dependencies.
ReducedMockTrilinos Dependencies:
Package dependencies information:

-- Trilinos_DEFINED_TPLS: MPI BLAS LAPACK Boost UMFPACK AMD PETSC
-- Trilinos_NUM_DEFINED_TPLS='7'
-- Trilinos_DEFINED_INTERNAL_TOPLEVEL_PACKAGES: Teuchos RTOp Epetra Triutils EpetraExt Thyra
-- Trilinos_NUM_DEFINED_INTERNAL_TOPLEVEL_PACKAGES='6'
-- Trilinos_DEFINED_TOPLEVEL_PACKAGES: MPI BLAS LAPACK Boost UMFPACK AMD PETSC Teuchos RTOp Epetra Triutils EpetraExt Thyra
-- Trilinos_NUM_DEFINED_TOPLEVEL_PACKAGES='13'
-- Trilinos_DEFINED_INTERNAL_PACKAGES: Teuchos RTOp Epetra Triutils EpetraExt ThyraCoreLibs ThyraGoodStuff ThyraCrazyStuff ThyraEpetra ThyraEpetraExt Thyra
-- Trilinos_NUM_DEFINED_INTERNAL_PACKAGES='11'
-- Trilinos_DEFINED_PACKAGES: MPI BLAS LAPACK Boost UMFPACK AMD PETSC Teuchos RTOp Epetra Triutils EpetraExt ThyraCoreLibs ThyraGoodStuff ThyraCrazyStuff ThyraEpetra ThyraEpetraExt Thyra
-- Trilinos_NUM_DEFINED_PACKAGES='18'
-- MPI_FORWARD_LIB_DEFINED_DEPENDENCIES: Teuchos[O] Epetra[O]
-- BLAS_FORWARD_LIB_DEFINED_DEPENDENCIES: LAPACK[O] Teuchos[R] Epetra[R]
-- LAPACK_LIB_DEFINED_DEPENDENCIES: BLAS[O]
-- LAPACK_FORWARD_LIB_DEFINED_DEPENDENCIES: Teuchos[R] Epetra[R]
-- Boost_FORWARD_LIB_DEFINED_DEPENDENCIES: Teuchos[O]
-- UMFPACK_FORWARD_LIB_DEFINED_DEPENDENCIES: EpetraExt[O]
-- AMD_FORWARD_LIB_DEFINED_DEPENDENCIES: EpetraExt[O]
-- PETSC_FORWARD_LIB_DEFINED_DEPENDENCIES: EpetraExt[O]
-- Teuchos_LIB_DEFINED_DEPENDENCIES: BLAS[R] LAPACK[R] Boost[O] MPI[O]
-- Teuchos_FORWARD_LIB_DEFINED_DEPENDENCIES: RTOp[R] EpetraExt[R] ThyraCoreLibs[R]
-- RTOp_LIB_DEFINED_DEPENDENCIES: Teuchos[R]
-- RTOp_FORWARD_LIB_DEFINED_DEPENDENCIES: ThyraCoreLibs[R]
-- Epetra_LIB_DEFINED_DEPENDENCIES: BLAS[R] LAPACK[R] MPI[O]
-- Epetra_FORWARD_LIB_DEFINED_DEPENDENCIES: Triutils[R] EpetraExt[R] ThyraEpetra[R]
-- Triutils_LIB_DEFINED_DEPENDENCIES: Epetra[R]
-- Triutils_FORWARD_LIB_DEFINED_DEPENDENCIES: EpetraExt[O]
-- EpetraExt_LIB_DEFINED_DEPENDENCIES: Teuchos[R] Epetra[R] Triutils[O] UMFPACK[O] AMD[O] PETSC[O]
-- EpetraExt_FORWARD_LIB_DEFINED_DEPENDENCIES: ThyraEpetraExt[R]
-- ThyraCoreLibs_LIB_DEFINED_DEPENDENCIES: Teuchos[R] RTOp[R]
-- ThyraCoreLibs_FORWARD_LIB_DEFINED_DEPENDENCIES: ThyraGoodStuff[R] ThyraEpetra[R] Thyra[R]
-- ThyraGoodStuff_LIB_DEFINED_DEPENDENCIES: ThyraCoreLibs[R]
-- ThyraGoodStuff_FORWARD_LIB_DEFINED_DEPENDENCIES: ThyraCrazyStuff[R] Thyra[O]
-- ThyraCrazyStuff_LIB_DEFINED_DEPENDENCIES: ThyraGoodStuff[R]
-- ThyraCrazyStuff_FORWARD_LIB_DEFINED_DEPENDENCIES: Thyra[O]
-- ThyraEpetra_LIB_DEFINED_DEPENDENCIES: Epetra[R] ThyraCoreLibs[R]
-- ThyraEpetra_FORWARD_LIB_DEFINED_DEPENDENCIES: ThyraEpetraExt[R] Thyra[O]
-- ThyraEpetraExt_LIB_DEFINED_DEPENDENCIES: ThyraEpetra[R] EpetraExt[R]
-- ThyraEpetraExt_FORWARD_LIB_DEFINED_DEPENDENCIES: Thyra[O]
-- Thyra_LIB_DEFINED_DEPENDENCIES: ThyraCoreLibs[R] ThyraGoodStuff[O] ThyraCrazyStuff[O] ThyraEpetra[O] ThyraEpetraExt[O]

Dumping direct enabled dependencies for each package ...
-- MPI: No enabled dependencies!
-- BLAS: No enabled dependencies!
-- LAPACK_LIB_ENABLED_DEPENDENCIES: BLAS[O]
-- Boost: No enabled dependencies!
-- UMFPACK: No enabled dependencies!
-- AMD: No enabled dependencies!
-- PETSC: No enabled dependencies!
-- Teuchos_LIB_ENABLED_DEPENDENCIES: BLAS[R] LAPACK[R]
-- RTOp_LIB_ENABLED_DEPENDENCIES: Teuchos[R]
-- Epetra_LIB_ENABLED_DEPENDENCIES: BLAS[R] LAPACK[R]
-- Triutils: No enabled dependencies!
-- EpetraExt: No enabled dependencies!
-- ThyraCoreLibs_LIB_ENABLED_DEPENDENCIES: Teuchos[R] RTOp[R]
-- ThyraGoodStuff: No enabled dependencies!
-- ThyraCrazyStuff: No enabled dependencies!
-- ThyraEpetra_LIB_ENABLED_DEPENDENCIES: Epetra[R] ThyraCoreLibs[R]
-- ThyraEpetraExt: No enabled dependencies!
-- Thyra_LIB_ENABLED_DEPENDENCIES: ThyraCoreLibs[R] ThyraEpetra[O]

Setting up export dependencies for all enabled packages ...
-- Teuchos: No library dependencies!
-- RTOp_FULL_ENABLED_DEP_PACKAGES: Teuchos
-- Epetra: No library dependencies!
-- ThyraCoreLibs_FULL_ENABLED_DEP_PACKAGES: RTOp Teuchos
-- ThyraEpetra_FULL_ENABLED_DEP_PACKAGES: ThyraCoreLibs Epetra RTOp Teuchos
-- Thyra_FULL_ENABLED_DEP_PACKAGES: ThyraEpetra ThyraCoreLibs Epetra RTOp Teuchos
The above dependency structure printout is produced by configuring with ${PROJECT_NAME}_DUMP_PACKAGE_DEPENDENCIES=ON (which also results in more dependency information than what is shown above, e.g. the computed forward package dependencies). Note that the top-level package Thyra is shown to depend on its subpackages (not the other way around). (Many people are confused about the nature of the dependencies between packages and subpackages. See <packageDir>/<spkgDir>/cmake/Dependencies.cmake for more discussion.)
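To connect this printout back to the input files, note that the line ThyraEpetra_LIB_DEFINED_DEPENDENCIES: Epetra[R] ThyraCoreLibs[R] above simply reflects the ThyraEpetra subpackage declaring Epetra and ThyraCoreLibs as required library dependencies in its own Dependencies.cmake file, roughly as in the following sketch (the exact argument keyword used in the real packages/thyra/adapters/epetra/cmake/Dependencies.cmake file may differ, e.g. older TriBITS versions use LIB_REQUIRED_DEP_PACKAGES):

  # Sketch of the ThyraEpetra subpackage's Dependencies.cmake file
  tribits_package_define_dependencies(
    LIB_REQUIRED_PACKAGES  Epetra ThyraCoreLibs
    )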
A number of user-settable CMake cache variables determine what packages and what tests and examples get enabled. These cache variables are described in Selecting the list of packages to enable and are summarized below. The assigned Package Test Group (i.e. PT, ST, and EX) also affects what packages get enabled or disabled.
Any of these packages can be enabled or disabled with ${PROJECT_NAME}_ENABLE_<TRIBITS_PACKAGE>=(ON|OFF) (the default enable is typically empty "", see PT/ST packages given default unset enable/disable state). For ReducedMockTrilinos, this gives the enable/disable cache variables (with the initial default values):
Trilinos_ENABLE_Teuchos="" Trilinos_ENABLE_RTOp"" Trilinos_ENABLE_Epetra="" Trilinos_ENABLE_Triutils="" Trilinos_ENABLE_EpetraExt="" Trilinos_ENABLE_ThyraCore="" Trilinos_ENABLE_ThyraGoodStuff="" Trilinos_ENABLE_ThyraCrazyStuff="OFF" # Because it is 'EX' Trilinos_ENABLE_ThyraEpetra="" Trilinos_ENABLE_ThyraEpetraExt=""
Every TriBITS package is assumed to have tests and/or examples so TriBITS defines the following cache variables as well (with the initial default values):
Teuchos_ENABLE_TESTS="" RTOp_ENABLE_TESTS="" Epetra_ENABLE_TESTS="" Triutils_ENABLE_TESTS="" EpetraExt_ENABLE_TESTS="" ThyraCoreLibs_ENABLE_TESTS="" ThyraGoodStuff_ENABLE_TESTS="" ThyraEpetra_ENABLE_TESTS="" ThyraEpetraExt_ENABLE_TESTS="" Thyra_ENABLE_TESTS=""
NOTE: TriBITS only sets the variables <TRIBITS_PACKAGE>_ENABLE_TESTS into the cache if the package <TRIBITS_PACKAGE> becomes enabled at some point. This cuts down the clutter in the CMake cache for large projects with lots of packages where the user only enables a subset of the packages.
NOTE: TriBITS also defines the cache variables <TRIBITS_PACKAGE>_ENABLE_EXAMPLES for each enabled TriBITS package, which are handled the same way as the <TRIBITS_PACKAGE>_ENABLE_TESTS variables.
Also, every defined external package/TPL is given its own TPL_ENABLE_<TRIBITS_TPL> enable/disable cache variable. For the TPLs in ReducedMockTrilinos, this gives the enable/disable cache variables (with default values):
TPL_ENABLE_MPI="" TPL_ENABLE_BLAS="" TPL_ENABLE_LAPACK="" TPL_ENABLE_Boost="" TPL_ENABLE_UMFPACK="" TPL_ENABLE_AMD="" TPL_ENABLE_PETSC=""
In addition, for every optional package dependency, TriBITS defines a cache variable <TRIBITS_PACKAGE>_ENABLE_<OPTIONAL_DEP>. For the optional dependencies shown in ReducedMockTrilinos Dependencies, that gives the additional cache variables (with default values):
Teuchos_ENABLE_Boost="" Teuchos_ENABLE_MPI="" Teuchos_ENABLE_Boost="" Epetra_ENABLE_MPI="" EpetraExt_ENABLE_Triutils="" EpetraExt_ENABLE_UMFPACK="" EpetraExt_ENABLE_AMD="" EpetraExt_ENABLE_PETSC="" Thyra_ENABLE_ThyraGoodStuff="" Thyra_ENABLE_ThyraCrazyStuff="" Thyra_ENABLE_ThyraEpetra="" Thyra_ENABLE_ThyraEpetraExt=""
The above optional package-specific cache variables allow one to control whether or not support for upstream dependency X is turned on in package Y independent of whether or not X and Y are themselves both enabled. For example, if the packages Triutils and EpetraExt are both enabled, one can explicitly disable support for the optional dependency Triutils in EpetraExt by setting EpetraExt_ENABLE_Triutils=OFF. One may want to do this for several reasons but the bottom line is that this gives the user more detailed control over package dependencies. See the TriBITS Dependency Handling Behaviors and Explicit disable of an optional package dependency for more discussion and examples.
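For instance, a configure along the following lines (a hypothetical command that just mirrors the variables discussed above; the source path is a placeholder) would build both packages but compile EpetraExt without its optional Triutils support:

  $ cmake -DTrilinos_ENABLE_EpetraExt:BOOL=ON \
    -DTrilinos_ENABLE_Triutils:BOOL=ON \
    -DEpetraExt_ENABLE_Triutils:BOOL=OFF \
    <path-to>/ReducedMockTrilinos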
Before getting into specific Example Enable/Disable Use Cases, some of the TriBITS Dependency Handling Behaviors are first defined below.
Below, some of the rules and behaviors of the TriBITS dependency management system are described. Examples refer to the Example ReducedMockTrilinos Project Dependency Structure. More detailed examples of these behaviors are given in the section Example Enable/Disable Use Cases.
The rules/behaviors of the TriBITS package dependency management system are described in detail below.
Disables trump enables where there is a conflict and TriBITS will never override a disable in order to satisfy some dependency. For example, if the user sets Trilinos_ENABLE_Teuchos=OFF and Trilinos_ENABLE_RTOp=ON, then TriBITS will not override the disable of Teuchos in order to satisfy the required dependency of RTOp. In cases such as this, the behavior of the TriBITS dependency adjustment system depends on the setting of the top-level user cache variable ${PROJECT_NAME}_DISABLE_ENABLED_FORWARD_DEP_PACKAGES.
For an example of both behaviors, see Conflicting explicit enable and disable.
Enable/disable of parent package is enable/disable for subpackages: An explicit enable/disable of a top-level parent package with subpackages with ${PROJECT_NAME}_ENABLE_<TRIBITS_PACKAGE>=(ON|OFF) is equivalent to the explicit enable/disable of all of the parent package's subpackages. For example, explicitly setting Trilinos_ENABLE_Thyra=ON is equivalent to explicitly setting:
Trilinos_ENABLE_ThyraCoreLibs=ON
Trilinos_ENABLE_ThyraGoodStuff=ON   # Only if enabling ST code!
Trilinos_ENABLE_ThyraEpetra=ON
Trilinos_ENABLE_ThyraEpetraExt=ON   # Only if enabling ST code!
(Note that Trilinos_ENABLE_ThyraCrazyStuff is not set to ON because it is already set to OFF by default, see EX packages disabled by default.) Likewise, explicitly setting Trilinos_ENABLE_Thyra=OFF is equivalent to explicitly setting all of the Thyra subpackages to OFF at the outset. For a PT example, see Explicit enable of a package and its tests. For a ST example, see Explicit enable of a package, its tests, an optional TPL, with ST enabled.
<Project>_ENABLE_ALL_PACKAGES enables all PT (cond. ST) packages: Setting the user cache variable ${PROJECT_NAME}_ENABLE_ALL_PACKAGES=ON will result in the enable of all PT packages when ${PROJECT_NAME}_ENABLE_SECONDARY_TESTED_CODE=OFF and all PT and ST packages when ${PROJECT_NAME}_ENABLE_SECONDARY_TESTED_CODE=ON. For an example, see Enable all packages. When the project is a meta-project, this will only enable the project's primary meta-project packages (PMPP). That is, a package will only be enabled due to ${PROJECT_NAME}_ENABLE_ALL_PACKAGES=ON when its parent repository does not have ${REPOSITORY_NAME}_NO_PRIMARY_META_PROJECT_PACKAGES set to TRUE. However, if:
${REPOSITORY_NAME}_NO_PRIMARY_META_PROJECT_PACKAGES=TRUE
then the package may be enabled if it (or its parent package) is listed in ${REPOSITORY_NAME}_NO_PRIMARY_META_PROJECT_PACKAGES_EXCEPT.
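As a sketch (the repository and package names below are hypothetical, and exactly where a project sets these variables depends on its layout), a meta-project might mark all of an add-on repository's packages as non-primary except for a couple of exceptions with:

  set(MyExtraRepo_NO_PRIMARY_META_PROJECT_PACKAGES  TRUE)
  set(MyExtraRepo_NO_PRIMARY_META_PROJECT_PACKAGES_EXCEPT  PackageA PackageB)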
TriBITS prints out a lot of information about the enable/disable logic as it applies the above rules/behaviors. For a large TriBITS project with lots of packages, this can produce a lot of output to stdout. One just needs to understand what TriBITS is printing out and where to look in the output for different information. The examples in the section Example Enable/Disable Use Cases show what this output looks like for the various enable/disable scenarios and try to explain in more detail the reasons why the given behavior is implemented the way that it is. Given this output, the rule definitions given above, and the detailed Example Enable/Disable Use Cases, one should always be able to figure out exactly why the final set of enables/disables is the way it is, even in the largest and most complex of TriBITS projects. (NOTE: The same cannot be said for many other large software configuration and deployment systems where basic decisions about what to enable and disable are hidden from the user and can be very difficult to track down and debug.)
The above behaviors address the majority of the functionality of the TriBITS dependency management system. However, when dealing with TriBITS projects with multiple repositories, some other behaviors are supported through the definition of a few more variables. The following TriBITS repository-related variables alter what packages in a given TriBITS repository get enabled implicitly or not by TriBITS:
${REPOSITORY_NAME}_NO_IMPLICIT_PACKAGE_ENABLE

  If set to ON, then the packages in Repository ${REPOSITORY_NAME} will not be
  implicitly enabled in any of the package adjustment logic.

${REPOSITORY_NAME}_NO_IMPLICIT_PACKAGE_ENABLE_EXCEPT

  List of packages in the Repository ${REPOSITORY_NAME} that will be allowed to
  be implicitly enabled. Only checked if
  ${REPOSITORY_NAME}_NO_IMPLICIT_PACKAGE_ENABLE is true.
The above variables are typically defined in the outer TriBITS Project's CTest driver scripts or even in top-level project files in order to adjust how packages in its listed repositories are handled. What these variables do is allow a large project to turn off the auto-enable of optional packages in a given TriBITS repository in order to provide more detailed control of what gets used from that repository. This, for example, is used in the CASL VERA project to manage some of its extra repositories and packages to further reduce the number of packages that get auto-enabled.
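For example, a project-level file or ctest -S driver script might contain something along the lines of the following sketch (the repository and package names are hypothetical):

  # Do not implicitly enable any packages from the extra repo MyExtraRepo ...
  set(MyExtraRepo_NO_IMPLICIT_PACKAGE_ENABLE  ON)
  # ... except for the two packages this project actually needs.
  set(MyExtraRepo_NO_IMPLICIT_PACKAGE_ENABLE_EXCEPT  PackageA PackageB)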
Below, a few of the standard enable/disable use cases for a TriBITS project are given using the Example ReducedMockTrilinos Project Dependency Structure that demonstrate the TriBITS Dependency Handling Behaviors.
The use cases covered are:
All of these use cases and more can be easily run from the command-line by first setting:
$ export REDUCED_MOCK_TRILINOS=<base-dir>/tribits/examples/ReducedMockTrilinos
and then copying and pasting the cmake commands shown below. Make sure to run these in a temp directory because each command actually configures a CMake project in the local directory. Also make sure to run:
$ rm -r CMake*
before each run to clear the CMake cache.
These use cases are now described in detail below.
Default configure with no packages enabled on input
The first use-case to consider is the configure of a TriBITS project without enabling any packages. For the ReducedMockTrilinos project, this is done with:
$ cmake ${REDUCED_MOCK_TRILINOS}
which produces the relevant dependency-related output:
Explicitly enabled top-level packages on input (by user): 0
Explicitly enabled packages on input (by user): 0
Explicitly disabled top-level packages on input (by user or by default): 0
Explicitly disabled packages on input (by user or by default): ThyraCrazyStuff 1
Explicitly enabled external packages/TPLs on input (by user): 0
Explicitly disabled external packages/TPLs on input (by user or by default): 0
...
Final set of enabled top-level packages: 0
Final set of enabled packages: 0
Final set of non-enabled top-level packages: Teuchos RTOp Epetra Triutils EpetraExt Thyra 6
Final set of non-enabled packages: Teuchos RTOp Epetra Triutils EpetraExt \
  ThyraCoreLibs ThyraGoodStuff ThyraCrazyStuff ThyraEpetra ThyraEpetraExt Thyra 11
Final set of enabled external packages/TPLs: 0
Final set of non-enabled external packages/TPLs: MPI BLAS LAPACK Boost UMFPACK AMD PETSC 7
...
***
*** WARNING: There were no packages configured so no libraries or tests/examples will be built!
***
The above example demonstrates several of the default behaviors of the TriBITS dependency handling system, e.g. that EX packages like ThyraCrazyStuff are disabled by default and that nothing gets enabled unless the user (or the dependency logic) enables it.
Explicit enable of a package and its tests
One of the most typical use cases is for the user to explicitly enable one or more top-level TriBITS packages and enable their tests. This configuration would be used to drive local development on a specific set of packages (i.e. tests do not need to be enabled for packages that are not being changed).
Consider the configure of the ReducedMockTrilinos project enabling the top-level Thyra package and its tests with:
$ cmake -DTrilinos_ENABLE_Thyra:BOOL=ON \
  -DTrilinos_ENABLE_TESTS:BOOL=ON \
  ${REDUCED_MOCK_TRILINOS}
which produces the relevant dependency-related output:
Explicitly enabled top-level packages on input (by user): Thyra 1
Explicitly enabled packages on input (by user): Thyra 1
Explicitly disabled top-level packages on input (by user or by default): 0
Explicitly disabled packages on input (by user or by default): ThyraCrazyStuff 1
Explicitly enabled external packages/TPLs on input (by user): 0
Explicitly disabled external packages/TPLs on input (by user or by default): 0

Enabling subpackages for hard enables of parent packages due to \
  Trilinos_ENABLE_<PARENT_PACKAGE>=ON ...
-- Setting subpackage enable Trilinos_ENABLE_ThyraCoreLibs=ON because parent \
   package Trilinos_ENABLE_Thyra=ON
-- Setting subpackage enable Trilinos_ENABLE_ThyraEpetra=ON because parent package \
   Trilinos_ENABLE_Thyra=ON

Disabling forward required packages and optional intra-package support that have a \
  dependency on disabled packages Trilinos_ENABLE_<TRIBITS_PACKAGE>=OFF ...
-- Setting Thyra_ENABLE_ThyraCrazyStuff=OFF because Thyra has an optional library \
   dependence on disabled package ThyraCrazyStuff

Enabling all tests and/or examples that have not been explicitly disabled because \
  Trilinos_ENABLE_[TESTS,EXAMPLES]=ON ...
-- Setting ThyraCoreLibs_ENABLE_TESTS=ON
-- Setting ThyraCoreLibs_ENABLE_EXAMPLES=ON
-- Setting ThyraEpetra_ENABLE_TESTS=ON
-- Setting ThyraEpetra_ENABLE_EXAMPLES=ON
-- Setting Thyra_ENABLE_TESTS=ON
-- Setting Thyra_ENABLE_EXAMPLES=ON

Enabling all required (and optional since Trilinos_ENABLE_ALL_OPTIONAL_PACKAGES=ON) \
  upstream packages for current set of enabled packages \
  (Trilinos_ENABLE_SECONDARY_TESTED_CODE=OFF) ...
-- NOTE: Not Setting Trilinos_ENABLE_ThyraGoodStuff=ON even though Thyra \
   has an optional dependence on ThyraGoodStuff because Trilinos_ENABLE_SECONDARY_TESTED_CODE=OFF
-- NOTE: Not Setting Trilinos_ENABLE_ThyraEpetraExt=ON even though Thyra \
   has an optional dependence on ThyraEpetraExt because Trilinos_ENABLE_SECONDARY_TESTED_CODE=OFF
-- Setting Trilinos_ENABLE_Epetra=ON because ThyraEpetra has a required dependence on Epetra
-- Setting Trilinos_ENABLE_Teuchos=ON because ThyraCoreLibs has a required dependence on Teuchos
-- Setting Trilinos_ENABLE_RTOp=ON because ThyraCoreLibs has a required dependence on RTOp
-- Setting TPL_ENABLE_BLAS=ON because Epetra has a required dependence on BLAS
-- Setting TPL_ENABLE_LAPACK=ON because Epetra has a required dependence on LAPACK

Enabling all optional intra-package enables <TRIBITS_PACKAGE>_ENABLE_<DEPPACKAGE> that are
  not currently disabled if both sets of packages are enabled ...
-- Setting Thyra_ENABLE_ThyraEpetra=ON since Trilinos_ENABLE_Thyra=ON AND Trilinos_ENABLE_ThyraEpetra=ON

Final set of enabled top-level packages: Teuchos RTOp Epetra Thyra 4
Final set of enabled packages: Teuchos RTOp Epetra ThyraCoreLibs ThyraEpetra Thyra 6
Final set of non-enabled top-level packages: Triutils EpetraExt 2
Final set of non-enabled packages: Triutils EpetraExt ThyraGoodStuff ThyraCrazyStuff ThyraEpetraExt 5
Final set of enabled external packages/TPLs: BLAS LAPACK 2
Final set of non-enabled external packages/TPLs: MPI Boost UMFPACK AMD PETSC 5

Getting information for all enabled external packages/TPLs ...
Processing enabled external package/TPL: BLAS
Processing enabled external package/TPL: LAPACK

Configuring individual enabled Trilinos packages ...
Processing enabled top-level package: Teuchos (Libs)
Processing enabled top-level package: RTOp (Libs)
Processing enabled top-level package: Epetra (Libs)
Processing enabled top-level package: Thyra (CoreLibs, Epetra, Tests, Examples)
This is a configuration that a developer would use to develop on the Thyra package and its subpackages, for example. There is no need to enable the tests and examples for upstream packages unless those packages are going to be changed as well.
This case demonstrates a number of TriBITS dependency handling behaviors that are worth some discussion.
First, note that enabling the parent package Thyra with Trilinos_ENABLE_Thyra=ON right away results in the auto-enable of its PT subpackages ThyraCoreLibs and ThyraEpetra, which demonstrates the behavior Enable/disable of parent package is enable/disable for subpackages. Note that the ST subpackages ThyraGoodStuff and ThyraEpetraExt were not enabled because ${PROJECT_NAME}_ENABLE_SECONDARY_TESTED_CODE=OFF (which is the default), which demonstrates the behavior ST packages only auto-enabled if ST code is enabled.
Second, note the auto-enable of required upstream packages Epetra, RTOp and Teuchos shown in lines like:
-- Setting Trilinos_ENABLE_Teuchos=ON because ThyraCoreLibs has a required dependence on Teuchos
Lastly, note that the final set of enabled top-level packages, packages, tests/examples, and external packages/TPLs can be clearly seen when processing the external packages/TPLs and top-level packages in the lines:
Getting information for all enabled external packages/TPLs ...
-- Processing enabled external package/TPL: BLAS
-- Processing enabled external package/TPL: LAPACK

Configuring individual enabled Trilinos packages ...
Processing enabled top-level package: Teuchos (Libs)
Processing enabled top-level package: RTOp (Libs)
Processing enabled top-level package: Epetra (Libs)
Processing enabled top-level package: Thyra (CoreLibs, Epetra, Tests, Examples)
Note that subpackage enables are listed with their parent packages along with whether the tests and/or examples are enabled. Top-level packages that don't have subpackages just show Libs, plus Tests and Examples if those have been enabled as well.
Explicit enable of a package, its tests, an optional TPL, with ST enabled
An extended use case shown here is for the explicit enable of a package and its tests along with the enable of an optional TPL and with ST code enabled. This is a configuration that would be used to support the local development of a TriBITS package that involves modifying ST software.
Consider the configure of the ReducedMockTrilinos project with:
$ cmake -DTPL_ENABLE_Boost:BOOL=ON \
  -DTrilinos_ENABLE_Thyra:BOOL=ON \
  -DTrilinos_ENABLE_TESTS:BOOL=ON \
  -DTrilinos_ENABLE_SECONDARY_TESTED_CODE:BOOL=ON \
  ${REDUCED_MOCK_TRILINOS}
which produces the relevant dependency-related output:
Explicitly enabled top-level packages on input (by user): Thyra 1
Explicitly enabled packages on input (by user): Thyra 1
Explicitly disabled top-level packages on input (by user or by default): 0
Explicitly disabled packages on input (by user or by default): ThyraCrazyStuff 1
Explicitly enabled external packages/TPLs on input (by user): Boost 1
Explicitly disabled external packages/TPLs on input (by user or by default): 0

Enabling subpackages for hard enables of parent packages due to \
  Trilinos_ENABLE_<PARENT_PACKAGE>=ON ...
-- Setting subpackage enable Trilinos_ENABLE_ThyraCoreLibs=ON because parent package \
   Trilinos_ENABLE_Thyra=ON
-- Setting subpackage enable Trilinos_ENABLE_ThyraGoodStuff=ON because parent package \
   Trilinos_ENABLE_Thyra=ON
-- Setting subpackage enable Trilinos_ENABLE_ThyraEpetra=ON because parent package \
   Trilinos_ENABLE_Thyra=ON
-- Setting subpackage enable Trilinos_ENABLE_ThyraEpetraExt=ON because parent package \
   Trilinos_ENABLE_Thyra=ON

Disabling forward required packages and optional intra-package support that have a \
  dependency on disabled packages Trilinos_ENABLE_<TRIBITS_PACKAGE>=OFF ...
-- Setting Thyra_ENABLE_ThyraCrazyStuff=OFF because Thyra has an optional library \
   dependence on disabled package ThyraCrazyStuff

Enabling all tests and/or examples that have not been explicitly disabled because \
  Trilinos_ENABLE_[TESTS,EXAMPLES]=ON ...
-- Setting ThyraCoreLibs_ENABLE_TESTS=ON
-- Setting ThyraGoodStuff_ENABLE_TESTS=ON
-- Setting ThyraEpetra_ENABLE_TESTS=ON
-- Setting ThyraEpetraExt_ENABLE_TESTS=ON
-- Setting Thyra_ENABLE_TESTS=ON

Enabling all required (and optional since Trilinos_ENABLE_ALL_OPTIONAL_PACKAGES=ON) \
  upstream packages for current set of enabled packages ...
-- Setting Trilinos_ENABLE_EpetraExt=ON because ThyraEpetraExt has a required dependence on EpetraExt
-- Setting Trilinos_ENABLE_Epetra=ON because ThyraEpetra has a required dependence on Epetra
-- Setting Trilinos_ENABLE_Teuchos=ON because ThyraCoreLibs has a required dependence on Teuchos
-- Setting Trilinos_ENABLE_RTOp=ON because ThyraCoreLibs has a required dependence on RTOp
-- Setting Trilinos_ENABLE_Triutils=ON because EpetraExt has an optional dependence on Triutils
-- Setting TPL_ENABLE_BLAS=ON because Epetra has a required dependence on BLAS
-- Setting TPL_ENABLE_LAPACK=ON because Epetra has a required dependence on LAPACK

Enabling all optional intra-package enables <TRIBITS_PACKAGE>_ENABLE_<DEPPACKAGE> that are \
  not currently disabled if both sets of packages are enabled ...
-- Setting Teuchos_ENABLE_BLAS=ON since Trilinos_ENABLE_Teuchos=ON AND TPL_ENABLE_BLAS=ON
-- Setting Teuchos_ENABLE_LAPACK=ON since Trilinos_ENABLE_Teuchos=ON AND TPL_ENABLE_LAPACK=ON
-- Setting Teuchos_ENABLE_Boost=ON since Trilinos_ENABLE_Teuchos=ON AND TPL_ENABLE_Boost=ON
-- NOT setting Teuchos_ENABLE_MPI=ON since MPI is NOT enabled at this point!
-- Setting RTOp_ENABLE_Teuchos=ON since Trilinos_ENABLE_RTOp=ON AND Trilinos_ENABLE_Teuchos=ON
-- Setting Epetra_ENABLE_BLAS=ON since Trilinos_ENABLE_Epetra=ON AND TPL_ENABLE_BLAS=ON
-- Setting Epetra_ENABLE_LAPACK=ON since Trilinos_ENABLE_Epetra=ON AND TPL_ENABLE_LAPACK=ON
-- NOT setting Epetra_ENABLE_MPI=ON since MPI is NOT enabled at this point!
-- Setting Triutils_ENABLE_Epetra=ON since Trilinos_ENABLE_Triutils=ON AND Trilinos_ENABLE_Epetra=ON
-- Setting EpetraExt_ENABLE_Teuchos=ON since Trilinos_ENABLE_EpetraExt=ON AND Trilinos_ENABLE_Teuchos=ON
-- Setting EpetraExt_ENABLE_Epetra=ON since Trilinos_ENABLE_EpetraExt=ON AND Trilinos_ENABLE_Epetra=ON
-- Setting EpetraExt_ENABLE_Triutils=ON since Trilinos_ENABLE_EpetraExt=ON AND Trilinos_ENABLE_Triutils=ON
-- NOT setting EpetraExt_ENABLE_UMFPACK=ON since UMFPACK is NOT enabled at this point!
-- NOT setting EpetraExt_ENABLE_AMD=ON since AMD is NOT enabled at this point!
-- NOT setting EpetraExt_ENABLE_PETSC=ON since PETSC is NOT enabled at this point!
-- Setting ThyraCoreLibs_ENABLE_Teuchos=ON since Trilinos_ENABLE_ThyraCoreLibs=ON AND Trilinos_ENABLE_Teuchos=ON
-- Setting ThyraCoreLibs_ENABLE_RTOp=ON since Trilinos_ENABLE_ThyraCoreLibs=ON AND Trilinos_ENABLE_RTOp=ON
-- Setting ThyraGoodStuff_ENABLE_ThyraCoreLibs=ON since Trilinos_ENABLE_ThyraGoodStuff=ON AND Trilinos_ENABLE_ThyraCoreLibs=ON
-- Setting ThyraEpetra_ENABLE_Epetra=ON since Trilinos_ENABLE_ThyraEpetra=ON AND Trilinos_ENABLE_Epetra=ON
-- Setting ThyraEpetra_ENABLE_ThyraCoreLibs=ON since Trilinos_ENABLE_ThyraEpetra=ON AND Trilinos_ENABLE_ThyraCoreLibs=ON
-- Setting ThyraEpetraExt_ENABLE_ThyraEpetra=ON since Trilinos_ENABLE_ThyraEpetraExt=ON AND Trilinos_ENABLE_ThyraEpetra=ON
-- Setting ThyraEpetraExt_ENABLE_EpetraExt=ON since Trilinos_ENABLE_ThyraEpetraExt=ON AND Trilinos_ENABLE_EpetraExt=ON
-- Setting Thyra_ENABLE_ThyraCoreLibs=ON since Trilinos_ENABLE_Thyra=ON AND Trilinos_ENABLE_ThyraCoreLibs=ON
-- Setting Thyra_ENABLE_ThyraGoodStuff=ON since Trilinos_ENABLE_Thyra=ON AND Trilinos_ENABLE_ThyraGoodStuff=ON
-- Setting Thyra_ENABLE_ThyraEpetra=ON since Trilinos_ENABLE_Thyra=ON AND Trilinos_ENABLE_ThyraEpetra=ON
-- Setting Thyra_ENABLE_ThyraEpetraExt=ON since Trilinos_ENABLE_Thyra=ON AND Trilinos_ENABLE_ThyraEpetraExt=ON

Final set of enabled top-level packages: Teuchos RTOp Epetra Triutils EpetraExt Thyra 6
Final set of enabled packages: Teuchos RTOp Epetra Triutils EpetraExt ThyraCoreLibs \
  ThyraGoodStuff ThyraEpetra ThyraEpetraExt Thyra 10
Final set of non-enabled top-level packages: 0
Final set of non-enabled packages: ThyraCrazyStuff 1
Final set of enabled external packages/TPLs: BLAS LAPACK Boost 3
Final set of non-enabled external packages/TPLs: MPI UMFPACK AMD PETSC 4

Getting information for all enabled external packages/TPLs ...
Processing enabled external package/TPL: BLAS
Processing enabled external package/TPL: LAPACK
Processing enabled external package/TPL: Boost

Configuring individual enabled Trilinos packages ...
Processing enabled top-level package: Teuchos (Libs)
Processing enabled top-level package: RTOp (Libs)
Processing enabled top-level package: Epetra (Libs)
Processing enabled top-level package: Triutils (Libs)
Processing enabled top-level package: EpetraExt (Libs)
Processing enabled top-level package: Thyra (CoreLibs, GoodStuff, Epetra, EpetraExt, Tests, Examples)
A few more behaviors of the TriBITS system that this particular configuration use-case shows are described below.
First, note the enable of the ST Thyra subpackages in lines like:
-- Setting subpackage enable Trilinos_ENABLE_ThyraGoodStuff=ON because parent package \ Trilinos_ENABLE_Thyra=ON
Second, note the auto-enable of support for optional packages in lines like:
-- Setting EpetraExt_ENABLE_Triutils=ON since Trilinos_ENABLE_EpetraExt=ON \ AND Trilinos_ENABLE_Triutils=ON
Third, note the auto-enable of support for the optional TPL Boost in the line:
-- Setting Teuchos_ENABLE_Boost=ON since Trilinos_ENABLE_Teuchos=ON AND TPL_ENABLE_Boost=ON
Explicit disable of a package
Another common use case is to enable a package but to disable an optional upstream package. This type of configuration would be used as part of a "black list" approach to enabling only a subset of packages and optional support. The "black list" approach is to enable a package with ${PROJECT_NAME}_ENABLE_ALL_OPTIONAL_PACKAGES=ON (the TriBITS default) but then turn off the specific set of packages that you don't want. This is contrasted with a "white list" approach where you would configure with ${PROJECT_NAME}_ENABLE_ALL_OPTIONAL_PACKAGES=OFF and then have to manually enable all of the optional packages you want. Experience with projects like Trilinos shows that the "black list" approach is generally preferred for a few reasons.
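For contrast, a "white list" configure would look something like the following sketch (hypothetical; <OtherOptionalUpstreamPackage> is a placeholder and every desired optional upstream package has to be enabled by hand):

  $ cmake -DTrilinos_ENABLE_ALL_OPTIONAL_PACKAGES:BOOL=OFF \
    -DTrilinos_ENABLE_Thyra:BOOL=ON \
    -DTrilinos_ENABLE_<OtherOptionalUpstreamPackage>:BOOL=ON \
    ${REDUCED_MOCK_TRILINOS}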
Consider the configure of the ReducedMockTrilinos project enabling Thyra but disabling Epetra with:
$ cmake -DTrilinos_ENABLE_Thyra:BOOL=ON \
  -DTrilinos_ENABLE_Epetra:BOOL=OFF \
  -DTrilinos_ENABLE_TESTS:BOOL=ON \
  ${REDUCED_MOCK_TRILINOS}
which produces the relevant dependency-related output:
Disabling forward required packages and optional intra-package support that have a \
  dependency on disabled packages Trilinos_ENABLE_<TRIBITS_PACKAGE>=OFF ...
-- Setting Trilinos_ENABLE_Triutils=OFF because Triutils has a required library dependence \
   on disabled package Epetra
-- Setting Trilinos_ENABLE_EpetraExt=OFF because EpetraExt has a required library dependence \
   on disabled package Epetra
-- Setting Trilinos_ENABLE_ThyraEpetra=OFF because ThyraEpetra has a required library dependence \
   on disabled package Epetra
-- Setting Trilinos_ENABLE_ThyraEpetraExt=OFF because ThyraEpetraExt has a required library dependence \
   on disabled package EpetraExt
-- Setting Thyra_ENABLE_ThyraCrazyStuff=OFF because Thyra has an optional library dependence \
   on disabled package ThyraCrazyStuff
-- Setting Thyra_ENABLE_ThyraEpetra=OFF because Thyra has an optional library dependence \
   on disabled package ThyraEpetra
-- Setting Thyra_ENABLE_ThyraEpetraExt=OFF because Thyra has an optional library dependence \
   on disabled package ThyraEpetraExt

Final set of enabled top-level packages: Teuchos RTOp Thyra 3
Final set of enabled packages: Teuchos RTOp ThyraCoreLibs Thyra 4
Final set of non-enabled top-level packages: Epetra Triutils EpetraExt 3
Final set of non-enabled packages: Epetra Triutils EpetraExt ThyraGoodStuff \
  ThyraCrazyStuff ThyraEpetra ThyraEpetraExt 7
Final set of enabled external packages/TPLs: BLAS LAPACK 2
Final set of non-enabled external packages/TPLs: MPI Boost UMFPACK AMD PETSC 5

Configuring individual enabled Trilinos packages ...
Processing enabled top-level package: Teuchos (Libs)
Processing enabled top-level package: RTOp (Libs)
Processing enabled top-level package: Thyra (CoreLibs, Tests, Examples)
Note how the disable of Epetra wipes out all of the required and optional packages and intra-package dependencies that depend on Epetra. What is left is only ThyraCoreLibs and its upstream dependencies that don't depend on Epetra (which are only RTOp and Teuchos).
Conflicting explicit enable and disable
One use case that occasionally comes up is when a set of inconsistent enables and disables are set. While it seems illogical that anyone would ever do this, in larger more complex projects with lots of packages and lots of dependencies this can happen very easily. In some cases, someone enabling a set of packages they want tries to weed out as many of what they think are optional dependencies they don't need, and accidentally disables a package that is an indirect required dependency of one of the packages they want (in which case the configure should likely fail and provide a good error message). The other case where conflicting enables/disables can occur is in CTest drivers using tribits_ctest_driver() where an upstream package has failed and is explicitly disabled (in which case it should gracefully disable downstream dependent packages). TriBITS can either be set up to have the disable override the explicit enable or to stop the configure with an error, depending on the value of the cache variable ${PROJECT_NAME}_DISABLE_ENABLED_FORWARD_DEP_PACKAGES (see Disables trump enables where there is a conflict).
For example, consider what happens with the ReducedMockTrilinos project if someone tries to enable the RTOp package and disable the Teuchos package. This is not consistent because RTOp has a required dependency on Teuchos. The default behavior of TriBITS in this case is shown in the configure below:
$ cmake -DTrilinos_ENABLE_Epetra:BOOL=ON \
  -DTrilinos_ENABLE_RTOp:BOOL=ON \
  -DTrilinos_ENABLE_Teuchos:BOOL=OFF \
  ${REDUCED_MOCK_TRILINOS}
which produces the relevant dependency-related output:
Explicitly enabled top-level packages on input (by user): RTOp Epetra 2
Explicitly disabled top-level packages on input (by user or by default): Teuchos 1

Disabling forward required packages and optional intra-package support that have \
  a dependency on disabled packages Trilinos_ENABLE_<TRIBITS_PACKAGE>=OFF ...
***
*** ERROR: Setting Trilinos_ENABLE_RTOp=OFF which was 'ON' because RTOp has a required \
    library dependence on disabled package Teuchos!
***
As shown above, the TriBITS default (which is ${PROJECT_NAME}_DISABLE_ENABLED_FORWARD_DEP_PACKAGES=OFF) results in a configure-time error with a good error message.
However, if one sets ${PROJECT_NAME}_DISABLE_ENABLED_FORWARD_DEP_PACKAGES=ON and configures with:
$ cmake -DTrilinos_ENABLE_Epetra:BOOL=ON \
  -DTrilinos_ENABLE_RTOp:BOOL=ON \
  -DTrilinos_ENABLE_Teuchos:BOOL=OFF \
  -DTrilinos_DISABLE_ENABLED_FORWARD_DEP_PACKAGES:BOOL=ON \
  ${REDUCED_MOCK_TRILINOS}
then the disable trumps the enable and results in a successful configure as shown in the following relevant dependency-related output:
Explicitly enabled top-level packages on input (by user): RTOp Epetra 2
Explicitly disabled top-level packages on input (by user or by default): Teuchos 1

Disabling forward required packages and optional intra-package support that \
  have a dependency on disabled packages Trilinos_ENABLE_<TRIBITS_PACKAGE>=OFF ...
***
*** NOTE: Setting Trilinos_ENABLE_RTOp=OFF which was 'ON' because RTOp has \
    a required library dependence on disabled package Teuchos but \
    Trilinos_DISABLE_ENABLED_FORWARD_DEP_PACKAGES=ON!
***

Final set of enabled top-level packages: Epetra 1
Final set of non-enabled top-level packages: Teuchos RTOp Triutils EpetraExt Thyra 5
As shown above, what you end up with is just the enabled package Epetra which does not have a required dependency on the disabled package Teuchos. Developers of large complex TriBITS projects would be wise to set the default for ${PROJECT_NAME}_DISABLE_ENABLED_FORWARD_DEP_PACKAGES to ON, especially in automated builds and testing.
Explicit enable of an optional TPL
ToDo: Set Trilinos_ENABLE_Thyra=ON and TPL_ENABLE_MPI=ON
Explicit disable of an optional TPL
ToDo: Set Trilinos_ENABLE_Thyra=ON and TPL_ENABLE_MPI=OFF
Explicit disable of a required TPL
ToDo: Set Trilinos_ENABLE_Epetra=ON and TPL_ENABLE_BLAS=OFF
Explicit enable of a subpackage
ToDo: Enable ThyraEpetra and show how it enables other packages and at the end, enables the Thyra package (just for show).
Explicit enable of an optional package dependency
ToDo: Set Trilinos_ENABLE_EpetraExt=ON and EpetraExt_ENABLE_Triutils=ON and show how it enables Trilinos_ENABLE_Triutils=ON even though ST code is not enabled.
Explicit disable of an optional package dependency
ToDo: Set Trilinos_ENABLE_EpetraExt=ON, Trilinos_ENABLE_Triutils=ON, and EpetraExt_ENABLE_Triutils=OFF. Discuss how EpetraExt's and ThyraEpetraExt's CMakeLists.txt files might turn off some features if they detect that EpetraExt/Triutils support is turned off.
Explicit enable of an optional TPL dependency
ToDo: The current ReducedMockTrilinos is not set up to give a good example of this. We should add an optional Boost dependency to say, Epetra. Then we could show the enable of Teuchos and Epetra and Epetra_ENABLE_Boost=ON. That would enable Boost and enable support for Boost in Epetra but would not provide support for Boost in Teuchos.
Explicit disable of an optional TPL dependency
ToDo: The current ReducedMockTrilinos is not set up to give a good example of this. We should add an optional Boost dependency to say, Epetra. Then we could show the enable of Teuchos and Epetra and TPL_ENABLE_Boost=ON but set Epetra_ENABLE_Boost=OFF. That would provide support for Boost in Teuchos but not in Epetra.
Explicit enable of a package and downstream packages and tests
ToDo: Set Trilinos_ENABLE_RTOp=ON, Trilinos_ENABLE_ALL_FORWARD_DEP_PACKAGES=ON, and Trilinos_ENABLE_TESTS=ON and show what packages and tests/examples get enabled. This is the use case for the checkin-test.py tool for PT enabled code.
Enable all packages
The last use case to consider is enabling all defined packages. This configuration would be used for either doing a full test of all of the packages defined or to create a distribution of the project.
Enabling all PT packages is done with:
$ cmake -DTrilinos_ENABLE_ALL_PACKAGES:BOOL=ON \
  -DTrilinos_DUMP_PACKAGE_DEPENDENCIES:BOOL=ON \
  ${REDUCED_MOCK_TRILINOS}
produces the relevant dependency-related output:
Enabling all packages that are not currently disabled because of \
  Trilinos_ENABLE_ALL_PACKAGES=ON (Trilinos_ENABLE_SECONDARY_TESTED_CODE=OFF) ...
-- Setting Trilinos_ENABLE_Teuchos=ON
-- Setting Trilinos_ENABLE_RTOp=ON
-- Setting Trilinos_ENABLE_Epetra=ON
-- Setting Trilinos_ENABLE_ThyraCoreLibs=ON
-- Setting Trilinos_ENABLE_ThyraEpetra=ON
-- Setting Trilinos_ENABLE_Thyra=ON

Enabling all required (and optional since Trilinos_ENABLE_ALL_OPTIONAL_PACKAGES=ON)
  upstream packages for current set of enabled packages
  (Trilinos_ENABLE_SECONDARY_TESTED_CODE=OFF) ...
-- NOTE: Not Setting Trilinos_ENABLE_ThyraGoodStuff=ON even though Thyra has an
   optional dependence on ThyraGoodStuff because Trilinos_ENABLE_SECONDARY_TESTED_CODE=OFF
-- NOTE: Not Setting Trilinos_ENABLE_ThyraEpetraExt=ON even though Thyra has an
   optional dependence on ThyraEpetraExt because Trilinos_ENABLE_SECONDARY_TESTED_CODE=OFF
-- Setting TPL_ENABLE_BLAS=ON because Epetra has a required dependence on BLAS
-- Setting TPL_ENABLE_LAPACK=ON because Epetra has a required dependence on LAPACK

Final set of enabled top-level packages: Teuchos RTOp Epetra Thyra 4
Final set of enabled packages: Teuchos RTOp Epetra ThyraCoreLibs ThyraEpetra Thyra 6
Final set of non-enabled top-level packages: Triutils EpetraExt 2
Final set of non-enabled packages: Triutils EpetraExt ThyraGoodStuff ThyraCrazyStuff ThyraEpetraExt 5
Final set of enabled external packages/TPLs: BLAS LAPACK 2
Final set of non-enabled external packages/TPLs: MPI Boost UMFPACK AMD PETSC 5

Getting information for all enabled external packages/TPLs ...
Processing enabled external package/TPL: BLAS
Processing enabled external package/TPL: LAPACK

Configuring individual enabled Trilinos packages ...
Processing enabled top-level package: Teuchos (Libs)
Processing enabled top-level package: RTOp (Libs)
Processing enabled top-level package: Epetra (Libs)
Processing enabled top-level package: Thyra (CoreLibs, Epetra)
As shown above, only the PT packages get enabled. To enable the ST packages as well, one would additionally set ${PROJECT_NAME}_ENABLE_SECONDARY_TESTED_CODE=ON at configure time.
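For example, a configure along the lines of the following (mirroring the command above) would also pull in the ST packages:

  $ cmake -DTrilinos_ENABLE_ALL_PACKAGES:BOOL=ON \
    -DTrilinos_ENABLE_SECONDARY_TESTED_CODE:BOOL=ON \
    ${REDUCED_MOCK_TRILINOS}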
The TriBITS CMake configure system can write out the project's package dependencies into a file <Project>PackageDependencies.xml (or any name one wants to give it). This file is used by a number of the TriBITS SE-related tools. The structure of this XML file, showing one of the more interesting mock packages from the MockTrilinos project, is shown below:
<PackageDependencies project="Trilinos"> ... <Package name="Amesos" dir="packages/amesos" type="PT"> <LIB_REQUIRED_DEP_PACKAGES value="Teuchos,Epetra"/> <LIB_OPTIONAL_DEP_PACKAGES value="EpetraExt"/> <TEST_REQUIRED_DEP_PACKAGES/> <TEST_OPTIONAL_DEP_PACKAGES value="Triutils,Galeri"/> <LIB_REQUIRED_DEP_TPLS/> <LIB_OPTIONAL_DEP_TPLS value="SuperLUDist,ParMETIS,UMFPACK,SuperLU,MUMPS"/> <TEST_REQUIRED_DEP_TPLS/> <TEST_OPTIONAL_DEP_TPLS/> <EmailAddresses> <Regression address="amesos-regression@repo.site.gov"/> </EmailAddresses> <ParentPackage value=""/> </Package> ... </PackageDependencies>
This XML file contains the names, directories, Package Test Group (i.e. type), CDash regression email address, and all of the package and TPL dependencies for every package in the TriBITS project (including add-on repositories if specified). There are several python tools under tribits/ci_support/ that read in this file and use the created data-structure for various tasks. This file and these tools are used by checkin-test.py and tribits_ctest_driver(), but they can also be used to construct other workflows and tools. These tools require Python 3 (i.e. the python3 executable must be available).
A TriBITS project can create this file as a byproduct of configuration by setting the configure option <Project>_DEPS_XML_OUTPUT_FILE (see Outputting package dependency information), or the CMake -P script TribitsDumpDepsXmlScript.cmake can be used to create this file on the fly without having to configure a TriBITS project. To create this file outside of configuration, one can run:
cmake \
  [-D PROJECT_SOURCE_DIR=<projectSourceDir>] \
  [-D <Project>_PRE_REPOSITORIES=<prepo0>,<prepo1>,...] \
  [-D <Project>_EXTRA_REPOSITORIES=<erepo0>,<erepo1>,...] \
  -D <Project>_DEPS_XML_OUTPUT_FILE=<projectDepsFileOut> \
  -P <tribitsDir>/ci_support/TribitsDumpDepsXmlScript.cmake
If TriBITS is snapshotted into the project in the standard location <projectDir>/cmake/tribits or the entire TriBITS repo is cloned under <projectDir>/TriBITS (so that the tribits dir is <projectDir>/TriBITS/tribits), then one can leave off -D PROJECT_SOURCE_DIR=<projectSourceDir> and (if not wanting to include extra repos) just run:
cmake \
  -D <Project>_DEPS_XML_OUTPUT_FILE=<projectDepsFileOut> \
  -P <projectSourceDir>/cmake/tribits/ci_support/TribitsDumpDepsXmlScript.cmake
Once the XML file <projectDepsFileOut> is created, it can be used in various types of analysis and used with different tools and commands.
The tool get-tribits-packages-from-files-list.py can be used to determine the list of TriBITS packages that need to be tested given a list of changed files (e.g. as returned from git diff --name-only <from>..<to> > changed-files.txt). This is used in the checkin-test.py tool and the tribits_ctest_driver() function to determine what TriBITS packages need to be tested based on what files have been changed.
The tool get-tribits-packages-from-last-tests-failed.py can be used to extract the list of TriBITS packages that correspond to the failing tests listed in the CTest-generated <build-dir>/Testing/Temporary/LastTestsFailed*.log file. This tool is used in the tribits_ctest_driver() function in CI-testing mode to determine what packages must be re-tested if they failed in the last CI iteration.
The tool filter-packages-list.py takes in a list of TriBITS package names and then filters the list according to the Test Test Category of the packages. This is used in testing workflows that only test a subset of packages according to the Test Test Category at different stages in the workflow. For example, the checkin-test.py tool and the tribits_ctest_driver() function use this filtering to only test Primary Tested (PT) or Secondary Tested (ST) packages for a given set of changed files in a continuous integration workflow (see Nested Layers of TriBITS Project Testing).
Much of the value provided by the TriBITS system is its support for testing of complex projects. Many different types of testing are required in a complex project and development effort. A large project with lots of repositories and packages presents a number of testing and development challenges, but it also provides a number of opportunities to do testing in an efficient way, especially pre-push and post-push continuous integration (CI) testing. In addition, a number of post-push automated nightly test cases must be managed. TriBITS takes full advantage of the features of raw CMake, CTest, and CDash in support of testing, and where gaps exist, TriBITS provides tools and customizations.
The following subsections describe several aspects of the TriBITS support for testing. The subsection Test Classifications for Repositories, Packages, and Tests defines the different types of test-related classifications that are defined by TriBITS. These different test classifications are then used to define a number of different standard Nested Layers of TriBITS Project Testing, which include different types of CI testing as well as nightly and other tests. One of the most important types of CI testing, pre-push testing, is then described in more detail in the subsection Pre-push Testing using checkin-test.py. The subsection TriBITS CTest/CDash Driver describes the usage of the advanced tribits_ctest_driver() function to do incremental testing of a project using advanced ctest -S scripts. The final subsection TriBITS CDash Customizations describes how projects can use a CDash server to more effectively display test results and provide notifications for failures that are compartmentalized for a large project.
TriBITS defines a few different testing-related classifications for a TriBITS project. These different classifications are used to select subsets of the project's repositories, packages (and code within these packages), and tests to be included in a given project build and test definition. These different classifications are:
These different test-related classifications are used to define several different Nested Layers of TriBITS Project Testing. First, the Repository Test Classification determines which repositories are processed so that their packages can even be considered for enabling. Second, if a repository is selected, then the Package Test Group determines which packages (and optional code in those packages) are enabled such that their <packageDir>/CMakeLists.txt files get processed (i.e. according to TriBITS Dependency Handling Behaviors). Lastly, if a package gets enabled, then the Test Test Category determines which test executables and test cases get defined using the functions tribits_add_executable(), tribits_add_test() and tribits_add_advanced_test().
More detailed descriptions of Repository Test Classifications, Package Test Groups, and Test Test Categories are given in the following subsections.
Repository Test Classification
The first type of test-related classification is for extra repositories defined in the file <projectDir>/cmake/ExtraRepositoriesList.cmake (pulled in through the ${PROJECT_NAME}_EXTRAREPOS_FILE cache variable) using the REPO_CLASSIFICATION field in the macro call tribits_project_define_extra_repositories(). These classifications map to the standard CTest dashboard types Continuous, Nightly, and Experimental (see CTest documentation and TriBITS CTest/CDash Driver for details).
Package Test Group
Once a set of TriBITS repositories is selected in accordance with their Repository Test Classification, that selection determines the set of packages defined for the TriBITS project. Given the set of defined packages, the set of packages that get enabled is determined by the Package Test Group, which is defined and described here.
Every TriBITS Package is assigned a test group. The test groups are Primary Tested (PT) code, Secondary Tested (ST) code, and Experimental (EX) code. The test group defines which packages get selected (or are excluded from being selected) for inclusion in a given build for testing-related purposes. Packages may also conditionally build in additional code based on the test group. The detailed rules for when a package is selected or excluded from the build based on the test group are given in TriBITS Dependency Handling Behaviors. We only summarize those rules here.
More detailed descriptions of the test groups are given below.
The test group for each type of entity is assigned in the following places:
After these files are processed, the variable ${PACKAGE_NAME}_TESTGROUP gives the test group for each defined Package while the variable ${TPL_NAME}_TESTGROUP gives the test group for each defined TPL.
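For example, test groups for packages are typically assigned in the last column of the rows passed to tribits_repository_define_packages() in the repository's PackagesList.cmake file. The package names and directories below are hypothetical; this is only a sketch of the layout:

tribits_repository_define_packages(
  CorePkg      packages/core_pkg      PT
  AddOnPkg     packages/add_on_pkg    ST
  ResearchPkg  packages/research_pkg  EX
  )

A similar classification column appears in the rows passed to tribits_repository_define_tpls() in the repository's <repoDir>/TPLsList.cmake file.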
Note that the test group classification PT/ST/EX is not to be confused with the maturity level of the package as discussed in the TriBITS Lifecycle Model. The test group classification in no way implies the maturity of the given TriBITS Package or piece of code. Instead, the test group is just used to sub-select the packages (and pieces of code within those packages) that are the most important to sustain for the development group's current activities. While more-mature code would typically never be classified as EX (Experimental), there are cases where immature packages may be classified as ST or even PT. For example, a very important research project may be driving the development of a very new algorithm with the low maturity level of Research Stable (RS) or even Exploratory (EP), because keeping that code working may be critical to keeping the research project on track.
In addition to just selecting PT and ST packages as a whole, a TriBITS PT package can also contain conditional code and test directories that get enabled when ${PROJECT_NAME}_SECONDARY_TESTED_CODE=ON and therefore represents more ST code. The package's <packageDir>/CMakeLists.txt files can contain simple if statements and can use the tribits_set_st_for_dev_mode() function to automatically select extra code to enable when ST is enabled or when the project is in release mode.
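For example, a PT package might conditionally pull in extra ST-only code with something like the following sketch in one of its CMakeLists.txt files (the variable and directory names are hypothetical):

# Sets the output variable to ON when ST code is enabled (or in release mode)
tribits_set_st_for_dev_mode(MyPkg_ENABLE_EXTRA_ST_CODE)

if (MyPkg_ENABLE_EXTRA_ST_CODE)
  add_subdirectory(st_only_code)  # extra ST code and/or tests
endif()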
Test Test Category
Once a package is defined (due to its parent repository's selection, consistent with its Repository Test Classification) and the package is enabled (consistent with its Package Test Group), then the set of individual test executables and test cases that are included in that package depends on the CATEGORIES argument in the functions tribits_add_executable(), tribits_add_test() and tribits_add_advanced_test(), and on the ${PROJECT_NAME}_TEST_CATEGORIES variable. This Test Test Category defines the last "knob" that the development team has for controlling which tests get run in a particular test scenario, as described in the section Nested Layers of TriBITS Project Testing.
The currently allowed values for the Test Test Category are BASIC, CONTINUOUS, NIGHTLY, HEAVY, and PERFORMANCE. Tests are enabled based on their assigned test test category matching the categories set in the CMake cache variable ${PROJECT_NAME}_TEST_CATEGORIES. The test test categories BASIC, CONTINUOUS, NIGHTLY, and HEAVY are nested subsets of each other. That is, a BASIC test is automatically included in the set of CONTINUOUS, NIGHTLY, and HEAVY tests (as set using ${PROJECT_NAME}_TEST_CATEGORIES).
The different test test categories are described below in more detail:
Every TriBITS project has a default setting for ${PROJECT_NAME}_TEST_CATEGORIES that is set for a basic cmake configure of the project (see ${PROJECT_NAME}_TEST_CATEGORIES_DEFAULT for more details). In addition, the different testing processes described in the section Nested Layers of TriBITS Project Testing set this to different values.
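As a small sketch, a test that should only run in nightly (and heavier) builds might be declared with the CATEGORIES argument as follows (the executable name, test name, and arguments are hypothetical):

tribits_add_test( my_convergence_test
  NAME my_convergence_test_large
  ARGS "--num-elements=1000000"
  CATEGORIES NIGHTLY
  NUM_MPI_PROCS 4
  )

If no CATEGORIES argument is given, a test defaults to the BASIC category and is therefore included in all of the nested category sets.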
Now that the different types of Test Classifications for Repositories, Packages, and Tests have been defined, this section describes how these different test-related classifications are used to select repositories, packages (and code) and tests to run in the standard project testing processes. More than any other section in this document, this section will describe and assume a certain class of software development processes (namely agile processes) where testing and continuous integration (CI) are critical components. However, detailed descriptions of these processes are deferred to the later sections Pre-push Testing using checkin-test.py and TriBITS CTest/CDash Driver.
The standard TriBITS-supported project testing processes are:
These standard testing processes are outlined in more detail below, showing how the different test-related classifications are used to define each of them.
Pre-Push CI Testing
The first level of testing is Pre-Push CI Testing that is performed before changes to the project are pushed to the master branch(es) in the global repository(s). With TriBITS, this type of testing and the following push is typically done using the checkin-test.py tool. This category of testing is described in much more detail in Pre-push Testing using checkin-test.py. All of the "default builds" used with the checkin-test.py tool select repositories, packages and code, and individual tests using the following test-related classifications:
Classification Type | Classification | (See Reference) |
---|---|---|
Repository Test Classif. | Continuous | (Repository Test Continuous) |
Package Test Group | PT | (PT) |
Test Test Category | BASIC | (Test Test Category BASIC) |
Typically a TriBITS project will define a "standard development environment", which is composed of a standard compiler (e.g. GCC 8.3.0), external package/TPL versions (e.g. OpenMPI 4.0.5, Boost 4.9, etc.), and other tools (e.g. cmake 3.23.0, git 2.10.1, etc.). This standard development environment is expected to be used to test changes to the project's code before any push. By using a standard development environment, if the code builds and all the tests pass for the "default" pre-push builds for one developer, then that maximizes the probability that the code will also build and all tests will pass for every other developer using the same development environment. This is critical to keep the development team maximally productive. Portability is also important for most projects, but portability testing is best done in a secondary feedback loop using Nightly Testing builds. TriBITS has some support for helping to set up a standard software development environment as described in section TriBITS Development Toolset.
The basic assumption of all CI processes (including the one described here) is that if anyone pulls the project's development sources at any time, then all of the code will build and all of the tests will pass for the "default" build cases. For a TriBITS project, this means that the project's --default-builds (see above) will all pass for every PT package. All of these software development processes make this basic assumption and agile software development methods fall apart if this is not true.
Post-Push CI Testing
After changes are pushed to the master branch(es) in the global repository(s), Post-Push CI Testing is performed where a CI server detects the changes and immediately fires off a CI build using CTest to test the changes and the results are posted to a CDash server (in the "Continuous" section on the project's dashboard page). This process is driven by CTest driver code that calls tribits_ctest_driver() as described in the section TriBITS CTest/CDash Driver. Various types of specific CI builds can be constructed and run (see CTest/CDash CI Server) but these post-push CI builds typically select repositories, packages and code, and individual tests using the following test-related classifications:
Classification Type | Classification | (See Reference) |
---|---|---|
Repository Test Classif. | Continuous | (Repository Test Continuous) |
Package Test Group | PT & ST | (PT and ST) |
Test Test Category | CONTINUOUS | (Test Test Category CONTINUOUS) |
Post-push CI testing is assumed to use the same standard development environment as Pre-Push CI Testing. The project may also choose to run additional automated post-push CI builds that exactly match the pre-push CI default builds, to continuously check the health of those builds rather than relying on the development team to always perform the pre-push CI builds correctly before pushing.
Nightly Testing
In addition to pre-push and post-push CI testing, a typical TriBITS project will set up multiple Nightly Testing builds (or once-a-day builds; they don't need to only run at night). These builds are also driven by CTest driver scripts described in the section TriBITS CTest/CDash Driver and post results to the project's CDash server (in the "Nightly" section on the project's dashboard page). Nightly builds don't run in a continuous loop but instead are run once a day (e.g. driven by a cron job), and there tend to be many different nightly build cases that test the project using different compilers (e.g. GCC, Intel, Microsoft, etc., and different versions of each), different external package/TPL versions (e.g. different OpenMPI versions, different MPICH versions, etc.), different platforms (e.g. Linux, Windows, etc.), and varying many other options and settings on these different platforms. What all nightly builds have in common is that they tend to select repositories, packages and code, and individual tests using the following test-related classifications:
Classification Type | Classification | (See Reference) |
---|---|---|
Repository Test Classif. | Nightly | (Repository Test Nightly) |
Package Test Group | PT & ST | (PT and ST) |
Test Test Category | NIGHTLY | (Test Test Category NIGHTLY) |
The nightly builds comprise the basic "heartbeat" for the project.
Heavy Testing
Heavy Testing builds are just an extension to the Nightly Testing builds that add on more expensive tests marked using the Test Test Category HEAVY. For projects that define heavy tests and heavy builds, individual test cases may be allowed to take 24 hours or longer to run so they can't even be run every day in nightly testing. What standard heavy builds have in common is that they tend to select repositories, packages and code, and individual tests using the following test-related classifications:
Classification Type | Classification | (See Reference) |
---|---|---|
Repository Test Classif. | Nightly | (Repository Test Nightly) |
Package Test Group | PT & ST | (PT and ST) |
Test Test Category | HEAVY | (Test Test Category HEAVY) |
Project developer teams should strive to limit the number of test cases that are marked as HEAVY since these tests will typically not get run in very many builds, or may not be run every day, and developers will tend to never enable them when doing more extensive testing using --st-extra-builds with the checkin-test.py tool in extended pre-push testing.
Performance Testing
Performance Testing builds are a special class of builds that have tests specifically designed to test the run-time performance of a particular piece of code or algorithm. These tests tend to be sensitive to loads on the machine and therefore typically need to be run on an unloaded machine for reliable results. Details on how to write good performance tests with hard pass/fail time limits are beyond the scope of this document. All TriBITS does is define the special Test Test Category PERFORMANCE to allow TriBITS packages to declare these tests in a consistent way so that they can be run along with performance tests defined in other TriBITS packages. From a TriBITS standpoint, all performance testing builds would tend to select repositories, packages and code, and individual tests using the following test-related classifications:
Classification Type | Classification | (See Reference) |
---|---|---|
Repository Test Classif. | Nightly | (Repository Test Nightly) |
Package Test Group | PT & ST | (PT and ST) |
Test Test Category | PERFORMANCE | (Test Test Category PERFORMANCE) |
CMake provides the integrated tool CTest (executable ctest) which is used to define and run different tests. However, a lot more needs to be done to effectively test changes for a large project before pushing to the master branch(es) in the main repository(s). Things get especially complicated and tricky when multiple version-control (VC) repositories are involved. The TriBITS system provides the tool checkin-test.py for automating the process of:
There are several advantages to using a project's checkin-test.py tool for pushing changes to the main development branch which include:
When using the checkin-test.py tool, every TriBITS project defines one or more "default builds" (specified through the --default-builds argument) for pre-push CI testing that form the criteria for whether it is okay to push code changes or not. The "default builds" select repositories, packages and code, and individual tests as described in Pre-Push CI Testing. A TriBITS project defines its default pre-push builds using the file <projectDir>/project-checkin-test-config.py. For an example, the file TribitsExampleProject/project-checkin-test-config.py is shown below:
#
# Define project-specific options for the checkin-test script for
# TribitsExampleProject.
#

configuration = {

    # Default command line arguments
    'defaults': {
        '--send-email-to-on-push': 'trilinos-checkin-tests@software.sandia.gov',
        },

    # CMake options (-DVAR:TYPE=VAL) cache variables.
    'cmake': {

        # Options that are common to all builds.
        'common': [],

        # Defines --default-builds, in order.
        'default-builds': [

            # Options for the MPI_DEBUG build.
            ('MPI_DEBUG', [
                '-DTPL_ENABLE_MPI:BOOL=ON',
                '-DCMAKE_BUILD_TYPE:STRING=RELEASE',
                '-DTribitsExProj_ENABLE_DEBUG:BOOL=ON',
                '-DTribitsExProj_ENABLE_CHECKED_STL:BOOL=ON',
                '-DTribitsExProj_ENABLE_DEBUG_SYMBOLS:BOOL=ON',
                ]),

            # Options for the SERIAL_RELEASE build.
            ('SERIAL_RELEASE', [
                '-DTPL_ENABLE_MPI:BOOL=OFF',
                '-DCMAKE_BUILD_TYPE:STRING=RELEASE',
                '-DTribitsExProj_ENABLE_DEBUG:BOOL=OFF',
                '-DTribitsExProj_ENABLE_CHECKED_STL:BOOL=OFF',
                ]),

            ], # default-builds

        }, # cmake

    } # configuration
This gives --default-builds=MPI_DEBUG,SERIAL_RELEASE. As shown, typically two default builds are defined so that various options can be toggled between the two builds. Typical options to toggle include enabling/disabling MPI and enabling/disabling run-time debug mode checking (i.e. toggle ${PROJECT_NAME}_ENABLE_DEBUG). Typically, other important options will also be toggled between these two builds.
Note that both of the default builds shown above, including the MPI_DEBUG build, actually set optimized compiler flags with -DCMAKE_BUILD_TYPE:STRING=RELEASE. What makes the MPI_DEBUG build a "debug" build is turning on optional run-time debug-mode checking, not disabling optimized code. This is important so that the defined tests run fast. For most projects, the default pre-push builds should not be used to build debug-enabled code suitable for running through a debugger (e.g. gdb). Instead, these "debug" builds are designed to test changes to the project's code efficiently before pushing changes. Typically, a development team should not have to test the chosen compiler's ability to generate non-optimized debug code and suffer slower test times before pushing.
Note that turning on -DTribitsExProj_ENABLE_CHECKED_STL=ON as shown above can only be used when the external packages/TPLs have no C++ code using the C++ STL or if that particular build points to C++ TPLs also compiled with checked STL enabled. The TribitsExampleProject default builds do not depend on any C++ TPLs that might use the C++ STL so enabling this option adds additional positive debug-mode checking for C++ code.
The checkin-test.py tool is a fairly sophisticated piece of software that is well tested and very robust. The level of testing of this tool is likely greater than that of any of the software it will be used to test (unless the project is a real-time flight control system or nuclear reactor control system or something). This is needed to give developers confidence that the tool will only push their changes if everything checks out as it should. There are a lot of details and boundary cases that one has to consider and a number of use cases that need to be supported by such a tool. For more detailed documentation, see checkin-test.py --help.
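A typical local invocation, run from a dedicated base directory next to the project source tree, might look like the following sketch. The directory layout is hypothetical; the --do-all and --push options are standard, but see checkin-test.py --help for the full set of options and the recommended setup for a given project.

# Sketch only: paths are hypothetical
$ cd ~/TribitsExProj-checkin
$ ~/TribitsExProj/cmake/tribits/ci_support/checkin-test.py \
    --do-all --push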
Note that the checkin-test.py tool can also be used to implement "poor-man's" post-push testing processes as described in Post-Push CI and Nightly Testing using checkin-test.py. However, most software projects will want to go with the more elaborate and more feature-full CTest/CDash system described in TriBITS CTest/CDash Driver.
The TriBITS system uses a sophisticated and highly customized CTest -S driver script to test TriBITS projects and submit results to a CDash server. The primary code for driving this is contained in the CTest function tribits_ctest_driver() contained in the file TribitsCTestDriverCore.cmake. This script loops through all of the specified TriBITS packages for a given TriBITS project, does a configure, build, and test for each, and then submits results to the specified CDash server incrementally. If the configure or library build of any upstream TriBITS package fails, then that TriBITS package is disabled in all downstream TriBITS package builds so as not to propagate already-known failures. Each TriBITS top-level package is assigned its own CDash regression email address (see CDash regression email addresses) and each package configure/build/test is given its own row for the package build on the CDash server. A CTest script using tribits_ctest_driver() is run in one of three different modes. First, it can run standard once-a-day, from-scratch builds as described in CTest/CDash Nightly Testing. Second, it can run as a CI server as described in CTest/CDash CI Server. Third, it can run in experimental mode, testing a local repository using the TriBITS-defined make dashboard target.
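A minimal outer ctest -S driver script using this function might look like the following sketch. The relative path to TribitsCTestDriverCore.cmake and the particular variables set here are assumptions; most settings can also be passed in as environment variables, as described in the tribits_ctest_driver() documentation.

# ctest_driver.cmake -- minimal sketch of an outer ctest -S driver script
set(CTEST_BUILD_FLAGS "-j8")   # parallel build level (assumed setting)
set(CTEST_DO_SUBMIT TRUE)      # submit results to the project's CDash site
include("${CMAKE_CURRENT_LIST_DIR}/../cmake/tribits/ctest_driver/TribitsCTestDriverCore.cmake")
tribits_ctest_driver()

Such a script would typically be invoked as ctest -S ctest_driver.cmake, with variables like CTEST_TEST_TYPE (e.g. Nightly or Continuous) set in the environment by the cron job or CI loop that drives it.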
When a TriBITS CTest script using tribits_ctest_driver() is run in "Nightly" testing mode, it builds the project from scratch package-by-package and submits results to the TriBITS project's CDash project on the designated CDash server.
When a TriBITS ctest driver script is used in continuous integration (CI) mode, it starts every day with a clean from-scratch build and then performs incremental rebuilds as new commits are pulled from the master branch in the main repository(s). In this mode, a continuous loop is performed after the initial baseline build constantly pulling commits from the master git repository(s). If any package changes are detected (looking at git file diffs), then the tests and examples for those packages and all downstream packages are enabled and run using a reconfigure/rebuild. Since all of the upstream package libraries are already built, this rebuild and retest can take place in a fraction of the time of a complete from-scratch build and test of the project.
CDash is not currently designed to accommodate multi-package, multi-repository VC projects in the way they are supported by TriBITS. However, CDash provides some ability to customize a CDash project and its submissions to address missing features. Each TriBITS package is given a CTest/CDash "Label" with the name of the TriBITS package. CDash will then aggregate the different package configure/build/test runs for each package into aggregated "builds". The commits pulled for each of the extra VC repos listed in the <projectDir>/cmake/ExtraRepositoriesList.cmake file are shown in an uploaded CDash "Notes" file for each TriBITS package configure/build/test submit. This uploaded "Notes" file also contains a cleaned-up version of the CMakeCache.txt file as well as a copy of the ctest -S script that ran the case.
CDash offers numerous features, such as the ability to construct a number of different types of queries, and is extremely helpful for examining past test data.
Every TriBITS Package has a regression email address associated with it. This address gets uploaded to the CDash project on the CDash server and is used to determine what email address to notify when a package has configure, build, or test failures. Because of the complex organizational nature of different projects and different integration models, a single static email address for a given package across every project build is not practical.
The TriBITS system allows for a package's regression email address to be specified in the following order of precedence:
1) ${REPOSITORY_NAME}_REPOSITORY_OVERRIDE_PACKAGE_EMAIL_LIST (typically defined in <projectDir>/cmake/ProjectDependenciesSetup.cmake): Defines a single email address for all packages for the repository ${REPOSITORY_NAME} and overrides all other package email regression specification variables. This is typically used by a meta-project to redefine the regression email addresses for all of the packages in an externally developed repository.
2) REGRESSION_EMAIL_LIST (defined in <packageDir>/cmake/Dependencies.cmake): Package-specific email address specified in the package's Dependencies.cmake file using tribits_package_define_dependencies().
3) ${REPOSITORY_NAME}_REPOSITORY_EMAIL_URL_ADDRESS_BASE (set in <repoDir>/cmake/RepositoryDependenciesSetup.cmake): A base email address specified at the Repository level creating package-specific email addresses (e.g. <lower-case-package-name>-regression@some.repo.gov, where ${REPOSITORY_NAME}_REPOSITORY_EMAIL_URL_ADDRESS_BASE=some.repo.gov). This variable is used, for example, by the Trilinos project to provide automatic regression email addresses for packages.
4) ${REPOSITORY_NAME}_REPOSITORY_MASTER_EMAIL_ADDRESS (set in <repoDir>/cmake/RepositoryDependenciesSetup.cmake): A single email address for all packages specified at the Repository level (e.g. my-repo-regression@some.repo.gov). This variable is used for smaller repositories with smaller development groups who just want all regression emails for the repository's packages going to a single email address. This reduces the overhead of managing a bunch of individual package email addresses but at the expense of spamming too many people with CDash failure emails.
5) ${PROJECT_NAME}_PROJECT_EMAIL_URL_ADDRESS_BASE (set in <projectDir>/cmake/ProjectDependenciesSetup.cmake): A base email address specified at the Project level creating package-specific email addresses (e.g. <lower-case-package-name>-regression@some.project.gov, where ${PROJECT_NAME}_PROJECT_EMAIL_URL_ADDRESS_BASE=some.project.gov). If not already set, this variable will be set to ${REPOSITORY_NAME}_REPOSITORY_EMAIL_URL_ADDRESS_BASE for the first repository processed that has this set. This behavior is used, for example by the Trilinos project to automatically assign email addresses for add-on packages and was added to maintain backward compatibility.
6) ${PROJECT_NAME}_PROJECT_MASTER_EMAIL_ADDRESS (set in <projectDir>/cmake/ProjectDependenciesSetup.cmake): A single default email address for all packages specified at the Project level (e.g. my-project-regression@some.project.gov). If not already set, this variable will be set to ${REPOSITORY_NAME}_REPOSITORY_MASTER_EMAIL_ADDRESS for the first repository processed that has this set. Every meta-project should set this variable so that it will be the default email address for any new package added.
WARNING: If any of the email lists or URL string variables listed above are set to "OFF" or "FALSE" (or some other value that CMake interprets as false, see CMake Language Overview and Gotchas) then the variables are treated as empty and not set.
If a TriBITS project does not use CDash, then no email address needs to be assigned to packages at all (and therefore none of the above variables need be set).
As a general rule, repository-level settings override project-level settings and package-level settings override both. Also, a project can redefine a repository's regression email list settings by resetting the variables in the project's <projectDir>/cmake/ProjectDependenciesSetup.cmake file.
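For example, a meta-project that wants a single catch-all address plus an override for one externally developed repository might put something like the following in its <projectDir>/cmake/ProjectDependenciesSetup.cmake file. The repository name and email addresses are hypothetical; the variable names are those listed above:

# Send all CDash regression email for packages in ExtraRepo1 to one list
set(ExtraRepo1_REPOSITORY_OVERRIDE_PACKAGE_EMAIL_LIST
  metaproject-extrarepo1-regression@some.project.gov)

# Default address for any package that has no other setting
set(${PROJECT_NAME}_PROJECT_MASTER_EMAIL_ADDRESS
  metaproject-regression@some.project.gov)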
All of the email dependency management logic must be accessible by just running the macro:
tribits_read_all_project_deps_files_create_deps_graph()
The above email address configuration variables are read from the Repository and Project files <repoDir>/cmake/RepositoryDependenciesSetup.cmake and <projectDir>/cmake/ProjectDependenciesSetup.cmake, respectively. The RepositoryDependenciesSetup.cmake files are read first in the specified repository order followed up by reading the ProjectDependenciesSetup.cmake file. In this way, the project can override any of the repository settings.
In review, the precedence order for how regression email addresses are selected for a given package is:
The above setup results in the TriBITS system (in the tribits_ctest_driver() function called from a ctest -S script) creating a file called CDashSubprojectDependencies.xml, which contains the list of TriBITS packages (which CDash calls "subprojects") and the email address for each package to which regression emails should be sent; that file gets sent to the CDash server. CDash then takes this file and creates, or updates, a set of CDash users (with the same name and password as the email list address) and sets up a mapping of Labels (which are used for TriBITS package names) to CDash user email addresses. CDash is automatically set up to process this XML file and create and update CDash users. There are several consequences of this implementation of which project maintainers need to be aware.
First, one should not list the email address for a CDash user account already on CDash. This is because labels will be added for the TriBITS packages that this email address is associated with, and CDash emails will not be sent out for any other TriBITS packages, no matter what settings that CDash user account has. Therefore, one should only list, as CDash regression email lists, email addresses that are not already CDash user accounts and that are to be maintained separately. For example, if there is an email list that one wants CDash emails sent to but it is already a CDash user account, then one can create another email list (e.g. using Mailman), register that new list with the TriBITS packages in the CDashSubprojectDependencies.xml file, and have the new list forward email to the target email list.
Second, the CDash implementation currently is not set up to remove labels from existing users when an email address is disassociated from a TriBITS package in the CDashSubprojectDependencies.xml file. Therefore, if one changes a TriBITS package's CDash regression email address, then one needs to manually remove the associated labels from the old email address; CDash will not remove them automatically. Otherwise, email will continue to be sent to that email address for that package.
Therefore, to change the mapping of CDash regression email addresses to TriBITS packages, one must perform the following actions:
Hopefully that should be enough information to manage the mapping of CDash regression email lists to TriBITS packages for single and multi-repository TriBITS projects.
TriBITS has built-in support for projects involving multiple TriBITS Repositories which contain multiple TriBITS Packages (see How to set up multi-repository support). The basic configuration, build, and test of such projects requires only raw CMake/CTest, just like any other CMake project (see TriBITS System Project Dependencies). Every TriBITS project automatically supports tacking on add-on TriBITS packages and external packages/TPLs through the ${PROJECT_NAME}_EXTRA_REPOSITORIES cmake cache variable as described in Enabling extra repositories with add-on packages. In addition, a TriBITS project can be set up to pull in other TriBITS Repositories using the <projectDir>/cmake/ExtraRepositoriesList.cmake file. A special form of this type of project is a TriBITS Meta-Project that contains no native packages or TPLs of its own. The ability to create meta-projects out of individual TriBITS repositories allows TriBITS to be used to provide coordinated builds (or meta-builds) of large aggregations of software.
To help set up a full-featured development environment (i.e. not just the basic configure, build, test, and install) for TriBITS projects with multiple repositories, TriBITS provides some extra development tools implemented using Python which are provided in the "extended" parts of TriBITS (see TriBITS/tribits/ Directory Contents). The primary tools supporting multi-repository projects are the Python tools clone_extra_repos.py, gitdist, and checkin-test.py.
To demonstrate, consider the TriBITS meta-project MetaProject with the following ExtraRepositoriesList.cmake file:
tribits_project_define_extra_repositories(
  ExtraRepo1  ""                       GIT  git@someurl.com:ExtraRepo1  ""          Continuous
  ExtraRepo2  "ExtraRepo1/ExtraRepo2"  GIT  git@someurl.com:ExtraRepo2  NOPACKAGES  Continuous
  ExtraRepo3  ""                       GIT  git@someurl.com:ExtraRepo3  ""          Nightly
  )
Once cloned, the directories would be laid out as:
MetaProject/
  .git/
  .gitignore
  ExtraRepo1/
    .git/
    ExtraRepo2/
      .git/
  ExtraRepo3/
    .git/
The tool clone_extra_repos.py is used to clone the extra repositories for a multi-repository TriBITS project. It reads the repository URLs and destination directories from the file <projectDir>/cmake/ExtraRepositoriesList.cmake and performs the clones. For example, to clone all the repos for the MetaProject project, one would use the commands:
$ git clone git@someurl.com:MetaProject
$ cd MetaProject/
$ ./cmake/tribits/ci_support/clone_extra_repos.py
which produces output like:
...
***
*** Clone the selected extra repos:
***

Cloning repo ExtraRepo1 ...
Running: git clone git@someurl.com:ExtraRepo1 ExtraRepo1

Cloning repo ExtraRepo2 ...
Running: git clone git@someurl.com:ExtraRepo2 ExtraRepo1/ExtraRepo2

Cloning repo ExtraRepo3 ...
Running: git clone git@someurl.com:ExtraRepo3 ExtraRepo3
See clone_extra_repos.py --help for more details.
Once cloned, one needs to work with the multiple repositories to perform basic VC operations. For this, TriBITS provides the tool gitdist which is a simple stand-alone Python script that distributes a git command across a set of git repos. This tool is not specific to TriBITS but it is very useful for dealing with TriBITS projects with multiple repositories. It only requires a local base git repo and a set of zero or more git repos cloned under it.
To use gitdist with this aggregate meta-project, one would first set up the file MetaProject/.gitdist (or a version-controlled MetaProject/.gitdist.default file) containing the lines:
ExtraRepo1
ExtraRepo1/ExtraRepo2
ExtraRepo3
and one would set up the tracked ignore file MetaProject/.gitignore which contains the lines:
/ExtraRepo1/
/ExtraRepo1/ExtraRepo2/
/ExtraRepo3/
To use gitdist, one would put gitdist into their path and also set up the command-line shell aliases gitdist-status and gitdist-mod (see gitdist --dist-help=aliases).
Some of the aggregate commands that one would typically run under the base MetaProject/ directory are:
# See status of all repos at once
gitdist-status

# Pull updates to all
gitdist pull

# Push local commits to tracking branches
gitdist push
The tool gitdist is provided under the TriBITS directory:
cmake/tribits/python_utils/gitdist
and can be installed by the install_devtools.py tool (see TriBITS Development Toolset). See gitdist documentation for more details.
For projects with a standard set of extra repositories defined in the <projectDir>/cmake/ExtraRepositoriesList.cmake file, the checkin-test.py tool only requires passing in the options --extra-repos-file=project and --extra-repos-type=Continuous (or Nightly, see Repository Test Classification), and it will automatically perform all of the various actions for all of the selected repositories. See checkin-test.py and checkin-test.py --help for more details.
To keep track of compatible versions of the git repos, TriBITS provides support for a <Project>RepoVersion.txt file. Any TriBITS project can generate this file automatically by setting the option ${PROJECT_NAME}_GENERATE_REPO_VERSION_FILE. For the above example MetaProject, this file looks like:
*** Base Git Repo: MetaProject
e102e27 [Mon Sep 23 11:34:59 2013 -0400] <author0@someurl.com>
First summary message
*** Git Repo: ExtraRepo1
b894b9c [Fri Aug 30 09:55:07 2013 -0400] <author1@someurl.com>
Second summary message
*** Git Repo: ExtraRepo1/ExtraRepo2
97cf1ac [Thu Dec 1 23:34:06 2012 -0500] <author2@someurl.com>
Third summary message
*** Git Repo: ExtraRepo3
cd4a3af [Mon Mar 9 19:39:06 2013 -0400] <author3@someurl.com>
Fourth summary message
This file gets created in the build directory, gets echoed in the configure output, gets installed into the install directory, and gets added to the source distribution tarball. It also gets pushed up to CDash for all automated builds. The tool gitdist can then use this file to check out and tag compatible versions, diff two versions of the meta-project, etc. (see the gitdist documentation for more details on these git operations).
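For example, generation of this file for the MetaProject example above would be enabled with a configure option along the following lines (a sketch; other configure options omitted):

$ cmake \
    -D MetaProject_GENERATE_REPO_VERSION_FILE=ON \
    [other options] \
    <path-to-MetaProject-source>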
The TriBITS approach to managing multiple VC repos described above works well for around 20 or 30 VC repos but is likely not a good solution for many more git repos. For larger numbers of VC repos, one should consider nested integration by creating snapshot git repos (e.g. using the tool snapshot-dir.py) that aggregate several related repositories into a single git repo. Another approach might be to use git submodules. (However, note that the TriBITS tools and processes described here are not currently set up to support aggregate VC repos that use git submodules.) The design decision with TriBITS was to explicitly handle the different git VC repos by listing them in the <projectDir>/cmake/ExtraRepositoriesList.cmake file and then using the simple, easy-to-understand tools clone_extra_repos.py and gitdist. There are advantages and disadvantages to explicitly handling the different git repos, as is currently done by the TriBITS software development tools, versus using git submodules. It is possible that TriBITS will add support for aggregate git repos using git submodules in the future, but only if there are important projects that choose to use them. The discussion of these various approaches and strategies for dealing with aggregate repos is beyond the scope of this document.
In this section, the typical development workflows for a TriBITS project are described. First, the Basic Development Workflow for a single-repository TriBITS project is described. This is followed up with a slightly more complex Multi-Repository Development Workflow.
The basic development workflow for a TriBITS project is not much different than with any other CMake project that uses CTest to define and run tests. One pulls updates from the master VC repo, configures with cmake, then iteratively builds, runs tests, adds files, and changes files, does a final test, and then pushes updates. The major difference is that a well-constructed development process will use the checkin-test.py tool to test and push all changes that affect the build or the tests. The basic steps in configuring, building, running tests, etc., are given in the project's <Project>BuildReference file (see Project-Specific Build Reference).
The development workflow for a project with multiple VC repos is very similar to a project with just a single VC repo if the project provides a standard <projectDir>/cmake/ExtraRepositoriesList.cmake file. The major difference is in making changes, creating commits, etc. The gitdist tool makes these steps easier and has been shown to work fairly well for up to 20 extra VC repos (as used in the CASL VERA project). The checkin-test.py tool automatically handles all of the details of pulling, diffing, pushing etc. to all the VC repos.
This section provides short, succinct lists of the steps to accomplish a few common tasks. Extra details are referenced.
To add a new TriBITS package, it is recommended to take the template from one of the TribitsExampleProject packages that most closely fits the needs of the new package and modify it for the new package. For example, the files for the SimpleCxx package can be copied one at a time and modified for the new package.
To add a new TriBITS package (with no subpackages), do the following:
Once the new package is defined, downstream packages can define dependencies on this new package.
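As a rough sketch of the minimal pieces involved (the package, directory, and upstream-package names are hypothetical, and the argument names follow the legacy dependency field names shown in the XML file above; see the referenced sections for the authoritative details):

# <repoDir>/PackagesList.cmake: register the new package and its test group
tribits_repository_define_packages(
  NewPkg   packages/new_pkg   PT
  )

# <packageDir>/cmake/Dependencies.cmake: declare upstream dependencies
tribits_package_define_dependencies(
  LIB_REQUIRED_DEP_PACKAGES  UpstreamPkg
  )

# <packageDir>/CMakeLists.txt: define the package itself
tribits_package(NewPkg)
add_subdirectory(src)
tribits_add_test_directories(test)
tribits_package_postprocess()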
Adding a new package with subpackages is similar to adding a new regular package described in How to add a new TriBITS Package. Again, it is recommended that one copies an example package from TribitsExampleProject. For example, one could copy files and directories from the example package WithSubpackages.
To add a new TriBITS package with subpackages, do the following:
Once the new packages are defined, downstream packages can define dependencies on these.
Given an existing top-level TriBITS package that is already broken down into subpackages (see How to add a new TriBITS Package with Subpackages), adding a new subpackage does not require changing any project-level or repository-level files. One only needs to add the declaration for the new subpackage in its parent's <packageDir>/cmake/Dependencies.cmake file and then fill out the pieces of the new subpackage described in the section TriBITS Subpackage Core Files. It is recommended to copy files from one of the TribitsExampleProject subpackages in the WithSubpackages package.
To add a new TriBITS subpackage to a top-level package that already has subpackages, do the following:
In order for a TriBITS package to define a dependency on a new TriBITS External Package/TPL (i.e. a TPL that has not already been declared in the current repo's or an upstream repo's <repoDir>/TPLsList.cmake file), one must add and modify a few repository-level files in addition to modifying files within the TriBITS packages that use the new external package/TPL.
To add a new TriBITS TPL, do the following:
Choose a name <tplName> for the new TPL (it must be globally unique across all TriBITS repos; see Globally unique TriBITS TPL names).
Choose the subdirectory <tplDefsDir> for the new TPL files. These files are usually placed under the TriBITS repository directory <repoDir>/ (e.g. <repoDir>/cmake/tpls/) where the first downstream dependent package is defined or can be under a TriBITS package directory <packageDir>/ (e.g. <packageDir>/cmake/tpls/) if only one package is using that TPL.
Create the FindTPL<tplName>.cmake file (or some other name, see <tplName>_FINDMOD) under <tplDefsDir>/. (See Creating the FindTPL<tplName>.cmake file.) However, if the external package/TPL is a TriBITS-compliant external package, this file is not needed and is ignored.
[Optional] Create the FindTPL<tplName>Dependencies.cmake file in the same directory as the FindTPL<tplName>.cmake file, <tplDefsDir>/. (See FindTPL<tplName>Dependencies.cmake.) NOTE: This file is needed for a TriBITS-compliant external package that has upstream dependent external packages/TPLs (even though the file FindTPL<tplName>.cmake itself is not needed).
Add a row to the <repoDir>/TPLsList.cmake file for the new external package/TPL after any upstream TPLs that this new TPL may depend on (see <repoDir>/TPLsList.cmake and the sketch after this list). NOTE: For a TriBITS-compliant external package, the special value TRIBITS_PKG is used for the TPL's find-module entry.
Configure the TriBITS project enabling the new TPL with TPL_ENABLE_<tplName>=ON and see that the TPL is found correctly at configure time.
Add <tplName> to the package Dependencies.cmake files of downstream dependent packages that will use this TPL (see <packageDir>/cmake/Dependencies.cmake or <packageDir>/<spkgDir>/cmake/Dependencies.cmake for a subpackage).
[Optional] Add #cmakedefine for an optional package LIB TPL dependency in the package's <packageName>_config.h.in file using:
#cmakedefine HAVE_<PACKAGE_NAME_UC>_<TPL_NAME_UC>
so that the package's LIB code build knows if the TPL is defined or not (see <packageName>_config.h.in and tribits_configure_file()). (NOTE: Do not add this define for an optional test-only TPL dependency. We don't want all of the libraries in a package to have to be rebuilt when we enable or disable tests for the package.)
Use the TPL functionality in the packages that define the dependency on the new TPL, configure, test, etc.
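As a rough sketch of the repository-level and package-level pieces from the steps above (the TPL name, directory, and package names are hypothetical, and the dependency field name follows the legacy naming shown in the XML file earlier; check the linked reference sections for the exact current forms):

# <repoDir>/TPLsList.cmake: one row per TPL -- name, find-module location, test group
tribits_repository_define_tpls(
  SomeTpl   cmake/tpls/   PT
  )

# <packageDir>/cmake/Dependencies.cmake of a downstream package that can use it
tribits_package_define_dependencies(
  LIB_OPTIONAL_DEP_TPLS  SomeTpl
  )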
Creating the FindTPL<tplName>.cmake file
The main axes of variation in defining FindTPL<tplName>.cmake modules are:
See the different variations in the below sections:
When defining a FindTPL<tplName>.cmake file, it is encouraged to use find_package(<externalPkg> ...) to provide the default find operation (and, when supported, the definition of the IMPORTED targets needed to use the external package). (Here, <tplName> and <externalPkg> are typically the same name, but there are cases where the names differ or use different capitalization.) However, the state of the current ecosystem of Find<externalPkg>.cmake modules and <externalPkg>Config.cmake package config files is a bit uneven. While all find modules and package config files should define modern CMake IMPORTED targets that carry all the needed usage requirements (such as the target properties INTERFACE_INCLUDE_DIRECTORIES and INTERFACE_LINK_LIBRARIES) and should use find_dependency() to pull in all of their required upstream external package dependencies, many do not. Also, many of these don't provide a complete <tplName>::all_libs target, which is required for a TriBITS-compliant external package/TPL.
In this case, the FindTPL<tplName>.cmake file provides a thin "glue" layer to adapt the information and objects provided by the find_package(<externalPkg> ...) call into a complete <tplName>::all_libs target and a wrapper <tplName>Config.cmake file for consumption by downstream TriBITS-compliant packages.
The following subsections will describe how to create these TriBITS-compliant FindTPL<tplName>.cmake modules for all of the various cases using an internal call to find_package(<externalPkg> ...):
For cases where find_package(<externalPkg>) provides complete and proper modern (namespaced) IMPORTED targets (but is missing the <tplName>::all_libs target, or the names <tplName> and <externalPkg> differ), the FindTPL<tplName>.cmake module can simply call the function tribits_extpkg_create_imported_all_libs_target_and_config_file() after calling find_package(<externalPkg>). In these cases, such a FindTPL<tplName>.cmake module file is nothing more than:
find_package(<externalPkg> REQUIRED)
tribits_extpkg_create_imported_all_libs_target_and_config_file(
  <tplName>
  INNER_FIND_PACKAGE_NAME <externalPkg>
  IMPORTED_TARGETS_FOR_ALL_LIBS <importedTarget0> <importedTarget1> ... )
The function tribits_extpkg_create_imported_all_libs_target_and_config_file() creates the target <tplName>::all_libs and the wrapper file <tplName>Config.cmake which is installed by TriBITS. The only unique information required to create this glue module is the name of the external package <externalPkg> and the exact full names of the IMPORTED targets <importedTarget0> <importedTarget1> ... provided by the find_package(<externalPkg> ...) call. The TriBITS function tribits_extpkg_create_imported_all_libs_target_and_config_file() takes care of all of the rest of the details.
Such simple FindTPL<tplName>.cmake modules do not follow the legacy TriBITS TPL convention of allowing users to specify a TPL by setting the cache variables <tplName>_INCLUDE_DIRS, <tplName>_LIBRARY_DIRS, and <tplName>_LIBRARY_NAMES or by setting TPL_<tplName>_INCLUDE_DIRS and <tplName>_LIBRARIES. But as the ecosystem of CMake software transitions to modern CMake along with the proper usage of complete <Package>Config.cmake files, this is the reasonable thing to do.
However, to maintain backwards compatibility with the legacy TriBITS TPL system (such as when upgrading an existing FindTPL<tplName>.cmake file), a FindTPL<tplName>.cmake file can be extended to use the function tribits_tpl_allow_pre_find_package() in combination with the functions tribits_extpkg_create_imported_all_libs_target_and_config_file() and tribits_tpl_find_include_dirs_and_libraries() as follows:
set(REQUIRED_HEADERS <header0> <header1> ...)
set(REQUIRED_LIBS_NAMES <libname0> <libname1> ...)
set(IMPORTED_TARGETS_FOR_ALL_LIBS <importedTarget0> <importedTarget1> ...)

tribits_tpl_allow_pre_find_package(<tplName> <tplName>_ALLOW_PREFIND)

if (<tplName>_ALLOW_PREFIND)
  message("-- Using find_package(<externalPkg> ...) ...")
  find_package(<externalPkg>)
  if (<externalPkg>_FOUND)
    message("-- Found <externalPkg>_DIR='${<externalPkg>_DIR}'")
    message("-- Generating <tplName>::all_libs and <tplName>Config.cmake")
    tribits_extpkg_create_imported_all_libs_target_and_config_file(<tplName>
      INNER_FIND_PACKAGE_NAME <externalPkg>
      IMPORTED_TARGETS_FOR_ALL_LIBS ${IMPORTED_TARGETS_FOR_ALL_LIBS} )
  endif()
endif()

if (NOT TARGET <tplName>::all_libs)
  tribits_tpl_find_include_dirs_and_libraries( <tplName>
    REQUIRED_HEADERS ${REQUIRED_HEADERS}
    REQUIRED_LIBS_NAMES ${REQUIRED_LIBS_NAMES} )
endif()
Above, if find_package(<externalPkg>) is called and returns <externalPkg>_FOUND=FALSE, then as fallback it will call tribits_tpl_find_include_dirs_and_libraries() to find the components of the TPL. See the documentation for tribits_tpl_allow_pre_find_package() for conditions where <tplName>_ALLOW_PREFIND is set to FALSE (and therefore find_package(<externalPkg>) is not called).
Note in the above FindTPL<tplName>.cmake file that find_package(<externalPkg>) will be called even on reconfigures. That is critical since find_package(<externalPkg>) defines IMPORTED targets that must be available each time configure is called. Also, if find_package(<externalPkg>) is called and <externalPkg>_FOUND=TRUE, then the function tribits_extpkg_create_imported_all_libs_target_and_config_file() is called, which defines the target <tplName>::all_libs, which in turn means that the function tribits_tpl_find_include_dirs_and_libraries() will not be called.
A concrete (and tested) example of the latter case can be found in the file TribitsExampleProject2/cmake/tpls/FindTPLTpl2.cmake:
set(REQUIRED_HEADERS Tpl2a.hpp) # Only look for one header file to find include dir
set(REQUIRED_LIBS_NAMES tpl2b tpl2a)
set(IMPORTED_TARGETS_FOR_ALL_LIBS tpl2::tpl2a tpl2::tpl2b)

tribits_tpl_allow_pre_find_package(Tpl2 Tpl2_ALLOW_PREFIND)

if (Tpl2_ALLOW_PREFIND)
  message("-- Using find_package(Tpl2 ...) ...")
  find_package(Tpl2)
  if (Tpl2_FOUND)
    message("-- Found Tpl2_DIR='${Tpl2_DIR}'")
    message("-- Generating Tpl2::all_libs and Tpl2Config.cmake")
    tribits_extpkg_create_imported_all_libs_target_and_config_file(Tpl2
      INNER_FIND_PACKAGE_NAME Tpl2
      IMPORTED_TARGETS_FOR_ALL_LIBS ${IMPORTED_TARGETS_FOR_ALL_LIBS} )
  endif()
endif()

if (NOT TARGET Tpl2::all_libs)
  tribits_tpl_find_include_dirs_and_libraries( Tpl2
    REQUIRED_HEADERS ${REQUIRED_HEADERS}
    REQUIRED_LIBS_NAMES ${REQUIRED_LIBS_NAMES} )
endif()
There are cases where calling find_package(<externalPkg>) to find the parts of an external package does not create proper IMPORTED targets that can be directly used. For example, legacy Find<externalPkg>.cmake modules (even many standard Find*.cmake modules shipped with CMake as of CMake 3.23) do not provide IMPORTED targets and instead only provide variables for the list of include directories <externalPkg>_INCLUDE_DIRS, a list of library files <externalPkg>_LIBRARIES, and other information. In cases such as this, one can implement the FindTPL<tplName>.cmake module by calling find_package(<externalPkg>), setting TPL_<tplName>_INCLUDE_DIRS and TPL_<tplName>_LIBRARIES, and then calling tribits_tpl_find_include_dirs_and_libraries() to create the <tplName>::all_libs target and the <tplName>Config.cmake wrapper config file.
The simplest way to implement FindTPL<tplName>.cmake is to always call find_package(<externalPkg>) as:
message("-- Using find_package(<externalPkg> ...) ...") find_package(<externalPkg> REQUIRED) # Tell TriBITS that we found <tplName> and there no need to look any further set(TPL_<tplName>_INCLUDE_DIRS ${<externalPkg>_INCLUDE_DIRS} CACHE PATH "...") set(TPL_<tplName>_LIBRARIES ${<externalPkg>_LIBRARIES} CACHE FILEPATH "...") set(TPL_<tplName>_LIBRARY_DIRS ${<externalPkg>_LIBRARY_DIRS} CACHE PATH "...") tribits_tpl_find_include_dirs_and_libraries( <tplName> REQUIRED_HEADERS neverFindThisHeader REQUIRED_LIBS_NAMES neverFindThisLib )
The above will always call find_package(<externalPkg> REQUIRED), and if it can't find the package, it will error out and stop the configure process.
While the above FindTPL<tplName>.cmake file is pretty simple, it does not allow a fall-back to the legacy TriBITS TPL find system based on tribits_tpl_find_include_dirs_and_libraries(). One can allow such a fall-back by passing a set of header files and library names for tribits_tpl_find_include_dirs_and_libraries() to find. This can be done with the following FindTPL<tplName>.cmake module:
set(REQUIRED_HEADERS <header0> <header1> ...)
set(REQUIRED_LIBS_NAMES <libname0> <libname1> ...)

message("-- Using find_package(<externalPkg> ...) ...")
find_package(<externalPkg>)

if (<externalPkg>_FOUND)
  # Tell TriBITS that we found <tplName> and there is no need to look any further
  set(TPL_<tplName>_INCLUDE_DIRS ${<externalPkg>_INCLUDE_DIRS} CACHE PATH "...")
  set(TPL_<tplName>_LIBRARIES ${<externalPkg>_LIBRARIES} CACHE FILEPATH "...")
  set(TPL_<tplName>_LIBRARY_DIRS ${<externalPkg>_LIBRARY_DIRS} CACHE PATH "...")
endif()

tribits_tpl_find_include_dirs_and_libraries( <tplName>
  REQUIRED_HEADERS ${REQUIRED_HEADERS}
  REQUIRED_LIBS_NAMES ${REQUIRED_LIBS_NAMES}
  MUST_FIND_ALL_LIBS )
Above, if find_package(<externalPkg>) can't find <externalPkg>, then the function tribits_tpl_find_include_dirs_and_libraries() will try to find the listed required header files and libraries using the legacy TriBITS TPL find system. And if find_package(<externalPkg>) can find <externalPkg>, then the call to tribits_tpl_find_include_dirs_and_libraries() will not look for anything; it will just take the information in the variables TPL_<tplName>_INCLUDE_DIRS and TPL_<tplName>_LIBRARIES and build the IMPORTED targets and the <tplName>::all_libs target.
Finally, if one is upgrading an existing FindTPL<tplName>.cmake file to use find_package(<externalPkg>) but needs to maintain backward compatibility with existing configure scripts for the project that might be using the legacy TriBITS TPL system, one can use the function tribits_tpl_allow_pre_find_package() to determine if find_package(<externalPkg>) can be called or if the legacy TriBITS TPL find must be used. This FindTPL<tplName>.cmake looks like:
set(REQUIRED_HEADERS <header0> <header1> ...)
set(REQUIRED_LIBS_NAMES <libname0> <libname1> ...)

tribits_tpl_allow_pre_find_package(<tplName> <tplName>_ALLOW_PREFIND)

if (<tplName>_ALLOW_PREFIND)
  message("-- Using find_package(<externalPkg> ...) ...")
  find_package(<externalPkg>)
  if (<externalPkg>_FOUND)
    # Tell TriBITS that we found <tplName> and there is no need to look any further
    set(TPL_<tplName>_INCLUDE_DIRS ${<externalPkg>_INCLUDE_DIRS} CACHE PATH "...")
    set(TPL_<tplName>_LIBRARIES ${<externalPkg>_LIBRARIES} CACHE FILEPATH "...")
    set(TPL_<tplName>_LIBRARY_DIRS ${<externalPkg>_LIBRARY_DIRS} CACHE PATH "...")
  endif()
endif()

tribits_tpl_find_include_dirs_and_libraries( <tplName>
  REQUIRED_HEADERS ${REQUIRED_HEADERS}
  REQUIRED_LIBS_NAMES ${REQUIRED_LIBS_NAMES}
  MUST_FIND_ALL_LIBS )
The above will result in skipping the call of find_package(<externalPkg>) if any of the legacy TPL find variables are set. But if the legacy TPL find variables are not set, then the default find will use find_package(<externalPkg>) and will only fall back on the legacy TriBITS TPL find operation if find_package(<externalPkg>) sets <externalPkg>_FOUND=FALSE.
With this last approach, the FindTPL<tplName>.cmake module preserves all of the user behavior described in Enabling support for an optional Third-Party Library (TPL) for overriding what TPL components to look for, where to look, and finally to override what is actually used. That is, if the user sets the cache variables TPL_<tplName>_INCLUDE_DIRS, TPL_<tplName>_LIBRARIES, or TPL_<tplName>_LIBRARY_DIRS, then they should be used without question (which is why the set( ... CACHE ...) calls in the above example do not use FORCE).
If one wants to skip and ignore the standard TriBITS TPL override variables <tplName>_INCLUDE_DIRS, <tplName>_LIBRARY_NAMES, or <tplName>_LIBRARY_DIRS, then one can set:
set(<tplName>_FORCE_PRE_FIND_PACKAGE TRUE CACHE BOOL "Always first call find_package(<tplName> ...) unless explicit override")
at the top of the FindTPL<tplName>.cmake file, and tribits_tpl_allow_pre_find_package() will ignore these variables and return TRUE. This avoids name clashes between <tplName>_INCLUDE_DIRS and <tplName>_LIBRARY_DIRS and the variables <externalPkg>_INCLUDE_DIRS and <externalPkg>_LIBRARY_DIRS (when <tplName> == <externalPkg>), which are often used internally by the concrete legacy CMake Find<tplName>.cmake module files themselves.
For a slightly more complex (but real-life) example, see FindTPLHDF5.cmake which is:
########################################################################
# See associated tribits/Copyright.txt file for copyright and license! #
########################################################################

set(HDF5_INTERNAL_IS_MODERN FALSE)

if (Netcdf_ALLOW_MODERN)

  print_var(Netcdf_ALLOW_MODERN)
  message("-- Using find_package(HDF5 CONFIG) ...")
  find_package(HDF5 CONFIG)
  if (HDF5_FOUND)
    message("-- Found HDF5_CONFIG=${HDF5_CONFIG}")
    message("-- Generating Netcdf::all_libs and NetcdfConfig.cmake")
    message("-- HDF5_EXPORT_LIBRARIES=${HDF5_EXPORT_LIBRARIES}")
    tribits_extpkg_create_imported_all_libs_target_and_config_file(
      HDF5
      INNER_FIND_PACKAGE_NAME HDF5
      IMPORTED_TARGETS_FOR_ALL_LIBS ${HDF5_EXPORT_LIBRARIES})
    set(HDF5_INTERNAL_IS_MODERN TRUE)
  endif()

endif()

set(HDF5_FOUND_MODERN_CONFIG_FILE ${HDF5_INTERNAL_IS_MODERN} CACHE INTERNAL
  "True if HDF5 was found by the modern method")

if (NOT TARGET HDF5::all_libs)

  # First, set up the variables for the (backward-compatible) TriBITS way of
  # finding HDF5.  These are used in case find_package(HDF5 ...) is not called
  # or does not find HDF5.  Also, these variables need to be non-null in order
  # to trigger the right behavior in the function
  # tribits_tpl_find_include_dirs_and_libraries().

  set(REQUIRED_HEADERS hdf5.h)
  set(REQUIRED_LIBS_NAMES hdf5)

  if (HDF5_REQUIRE_FORTRAN)
    set(REQUIRED_LIBS_NAMES ${REQUIRED_LIBS_NAMES} hdf5_fortran)
  endif()

  if (TPL_ENABLE_MPI)
    set(REQUIRED_LIBS_NAMES ${REQUIRED_LIBS_NAMES} z)
  endif()

  if (TPL_ENABLE_Netcdf)
    set(REQUIRED_LIBS_NAMES ${REQUIRED_LIBS_NAMES} hdf5_hl)
  endif()

  #
  # Second, search for HDF5 components (if allowed) using the standard
  # find_package(HDF5 ...).
  #
  tribits_tpl_allow_pre_find_package(HDF5  HDF5_ALLOW_PREFIND)

  if (HDF5_ALLOW_PREFIND)

    message("-- Using find_package(HDF5 ...) ...")

    set(HDF5_COMPONENTS C)
    if (HDF5_REQUIRE_FORTRAN)
      list(APPEND HDF5_COMPONENTS Fortran)
    endif()

    if (TPL_ENABLE_MPI)
      set(HDF5_PREFER_PARALLEL TRUE)
    endif()

    find_package(HDF5 COMPONENTS ${HDF5_COMPONENTS})

    # Make sure that HDF5 is parallel.
    if (TPL_ENABLE_MPI AND NOT HDF5_IS_PARALLEL)
      message(FATAL_ERROR "Trilinos is configured for MPI, HDF5 is not.
        Did CMake find the correct libraries?
        Try setting HDF5_INCLUDE_DIRS and/or HDF5_LIBRARY_DIRS explicitly.
        ")
    endif()

    if (HDF5_FOUND)
      # Tell TriBITS that we found HDF5 and there is no need to look any further!
      set(TPL_HDF5_INCLUDE_DIRS ${HDF5_INCLUDE_DIRS} CACHE PATH "HDF5 include dirs")
      set(TPL_HDF5_LIBRARIES ${HDF5_LIBRARIES} CACHE FILEPATH "HDF5 libraries")
      set(TPL_HDF5_LIBRARY_DIRS ${HDF5_LIBRARY_DIRS} CACHE PATH "HDF5 library dirs")
    endif()

  endif()

  #
  # Third, call tribits_tpl_find_include_dirs_and_libraries()
  #
  tribits_tpl_find_include_dirs_and_libraries( HDF5
    REQUIRED_HEADERS ${REQUIRED_HEADERS}
    REQUIRED_LIBS_NAMES ${REQUIRED_LIBS_NAMES}
    )

  # NOTE: If find_package(HDF5 ...) was called and successfully found HDF5, then
  # tribits_tpl_find_include_dirs_and_libraries() will use the already-set
  # variables TPL_HDF5_INCLUDE_DIRS and TPL_HDF5_LIBRARIES and then print them
  # out (and set some other standard variables as well).  This is the final
  # "hook" into the TriBITS TPL system.

endif()
Note that some specialized Find<externalPkg>.cmake modules do more than just return a list of include directories and libraries. Some, like FindQt4.cmake, also return other variables that are used in downstream packages. Therefore, in these cases, find_package(Qt4 ...) must be called on every configure. Such find modules cannot completely adhere to the standard legacy TriBITS behavior described in Enabling support for an optional Third-Party Library (TPL).
For external packages that don't have a Find<externalPkg>.cmake module or <externalPkg>Config.cmake package config file, it may make sense to create a simple FindTPL<tplName>.cmake module that just calls tribits_tpl_find_include_dirs_and_libraries() with the set of required header files and libraries that must be found. A simple FindTPL<tplName>.cmake module of this form is:
tribits_tpl_find_include_dirs_and_libraries( <tplName>
  REQUIRED_HEADERS <header0> <header1> ...
  REQUIRED_LIBS_NAMES <libname0> <libname1> ...
  MUST_FIND_ALL_LIBS
  )
Note that a set of alternate names can be given for each library by placing the alternative library names in quotes, using the syntax:
tribits_tpl_find_include_dirs_and_libraries( <tplName>
  ...
  REQUIRED_LIBS_NAMES "<libname0> <libname0alt0> <libname0alt1> ..." ...
  ...
  )
This is most commonly used for simple single-library TPLs like BLAS, which has several potential implementations, for example:
tribits_tpl_find_include_dirs_and_libraries( BLAS
  REQUIRED_LIBS_NAMES "blas openblas atlas"
  ...
  )
It is possible to create a FindTPL<tplName>.cmake find module without using any TriBITS functions. The only firm requirements for a FindTPL<tplName>.cmake file are:
TriBITS will set the remaining variables to provide a complete TriBITS-Compliant Package for the current CMake project and will add the install target to install the file <buildDir>/external_packages/<tplName>/<tplName>Config.cmake to create a TriBITS-compliant external package. TriBITS will also automatically create an appropriate package version file <tplName>ConfigVersion.cmake.
Some of the issues to consider in this case (and the role of the <tplName>ConfigVersion.cmake file) are described in the section Tricky considerations for TriBITS-generated <tplName>Config.cmake files.
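As a rough, hand-rolled sketch (not taken from the TriBITS sources; the TPL name SomeTpl, the library name somelib, and the header some_header.h are hypothetical placeholders), such a FindTPL<tplName>.cmake module that avoids the TriBITS helper functions might locate the pieces itself and define the <tplName>::all_libs target directly:

# Hypothetical FindTPLSomeTpl.cmake that does not use any TriBITS functions
find_library(SomeTpl_LIBRARY NAMES somelib)            # locate the library
find_path(SomeTpl_INCLUDE_DIR NAMES some_header.h)     # locate the header dir

if (NOT SomeTpl_LIBRARY OR NOT SomeTpl_INCLUDE_DIR)
  message(FATAL_ERROR "Could not find the SomeTpl library and/or header!")
endif()

# Provide the target that TriBITS-compliant downstream code links against
add_library(SomeTpl::all_libs UNKNOWN IMPORTED GLOBAL)
set_target_properties(SomeTpl::all_libs PROPERTIES
  IMPORTED_LOCATION "${SomeTpl_LIBRARY}"
  INTERFACE_INCLUDE_DIRECTORIES "${SomeTpl_INCLUDE_DIR}")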
To add a new TriBITS and/or git VC repository to a TriBITS project that already contains other extra repositories, do the following:
See <projectDir>/cmake/ExtraRepositoriesList.cmake for more details and links.
It is often the case where one will want to add a new dependency for an existing downstream package to an existing upstream (internal or external) TriBITS Package. This can either be a required dependency or an optional dependency. Here, we will refer to the downstream package as <packageName> with base directory <packageDir> and will refer to the upstream (internal or external) package as <upstreamPackageName>.
The process for adding a new dependency to an existing upstream package is as follows:
Add the name of the upstream package to the downstream package's Dependencies.cmake file: Add <upstreamPackageName> to the call of tribits_package_define_dependencies() in the downstream package's <packageDir>/cmake/Dependencies.cmake file. If this is to be a required library dependency, then <upstreamPackageName> is added to the LIB_REQUIRED_PACKAGES argument. Alternatively, if this is to be an optional library dependency, then <upstreamPackageName> is added to the LIB_OPTIONAL_PACKAGES argument. (For example, see the file packages/EpetraExt/cmake/Dependencies.cmake in the ReducedMockTrilinos project.) If only the test and/or example sources, and not the package's core library sources, will have the required or optional dependency, then <upstreamPackageName> is added to the arguments TEST_REQUIRED_PACKAGES or TEST_OPTIONAL_PACKAGES, respectively. (A sketch of the result is shown below.)
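For illustration only (the package names are placeholders, not an actual project's dependency list), the resulting <packageDir>/cmake/Dependencies.cmake file might look something like:

tribits_package_define_dependencies(
  LIB_REQUIRED_PACKAGES   <upstreamPackageName>      # new required library dependency
  LIB_OPTIONAL_PACKAGES   <someOptionalUpstreamPkg>
  TEST_OPTIONAL_PACKAGES  <someTestOnlyUpstreamPkg>  # test/example-only dependency
  )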
For an optional dependency, add a `HAVE_` preprocessor macro to the package's configured header file: If this is an optional dependency, typically a C/C++ preprocessor macro will be added to the package's configured <packageDir>/cmake/<packageName>_config.h.in file using the line:
#cmakedefine HAVE_<PACKAGE_NAME_UC>_<UPSTREAM_PACKAGE_NAME_UC>
(see HAVE_<PACKAGE_NAME_UC>_<UPSTREAM_PACKAGE_NAME_UC>.)
Warning, do not add optional defines for tests/examples to configured header files: If this is a test-only and/or example-only dependency then please do not add a #cmakedefine to the package's core <packageDir>/cmake/<packageName>_config.h.in file. Instead, add the #cmakedefine line to a configured header that is only included by sources for the tests/examples or just add a define on the compiler command line (see the DEFINES argument to tribits_add_library() and tribits_add_executable(), but see the warning about problems with add_definitions() in Miscellaneous Notes (tribits_add_library())). We don't want the package's header files to change or libraries to have to be rebuilt if tests/examples get enabled or disabled. Otherwise, the TriBITS CTest/CDash Driver process will result in unnecessary rebuilds of software over and over again.
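For instance, here is a sketch (with a hypothetical test executable name) of attaching such a define to just the test executable through the DEFINES argument instead of the package's configured header:

tribits_add_executable( some_unit_test
  SOURCES some_unit_test.cpp
  DEFINES -DHAVE_<PACKAGE_NAME_UC>_<UPSTREAM_PACKAGE_NAME_UC>
  )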
Use the features of the upstream package in the downstream package's sources and/or tests/examples: Using the features of the upstream package <upstreamPackageName> in the downstream package <packageName> will typically involve adding #include "<upstreamPackageName>_<fileName>" in the package's C/C++ source (or test/example) files (or the equivalent in Fortran). If it is an optional dependency, then these includes will typically be protected using preprocessor ifdefs, for example, as:
#include "<packageName>_config.h" #if HAVE_<PACKAGE_NAME_UC>_<UPSTREAM_PACKAGE_NAME_UC> # include "<upstreamPackageName>_<fileName>" #endif
For an optional dependency, use CMake if() statements based on ${PACKAGE_NAME}_ENABLE_<upstreamPackageName>: When a package PACKAGE_NAME has an optional dependency on an upstream package <upstreamPackageName> and needs to put in optional logic in a CMakeLists.txt file, then the if() statements should use the variable ${PACKAGE_NAME}_ENABLE_<upstreamPackageName> and not the variable ${PROJECT_NAME}_ENABLE_<upstreamPackageName> or TPL_ENABLE_<upstreamPackageName> (if <upstreamPackageName> is an external package/TPL). For example, to optionally enable a test that depends on the enable of the optional upstream dependent package <upstreamPackageName>, one would use:
tribits_add_test( ...
  EXCLUDE_IF_NOT_TRUE ${PACKAGE_NAME}_ENABLE_<upstreamPackageName>
  )
or:
if (${PACKAGE_NAME}_ENABLE_<upstreamPackageName>)
  tribits_add_test( ... )
endif()
NOTE: TriBITS will automatically add the include directories for the upstream package to the compile lines for the downstream package source builds and will add the libraries for the upstream package to the link lines to the downstream package library and executable links. See documentation in the functions tribits_add_library() and tribits_add_executable(), and the DEPLIBS argument to these functions, for more details.
A TriBITS package can request the tentative enable of any of its optional external packages/TPLs (see How to add a new TriBITS Package dependency). This is done by calling tribits_tpl_tentatively_enable() in the package's <packageDir>/cmake/Dependencies.cmake file. For example:
tribits_package_define_dependencies(
  ...
  LIB_OPTIONAL_TPLS SomeTpl
  ...
  )

tribits_tpl_tentatively_enable(SomeTpl)
This will result in an attempt to find the components for the TPL SomeTpl. But if that attempt fails, then the TPL will be disabled and ${PACKAGE_NAME}_ENABLE_SomeTpl will be set to OFF.
Sometimes it is desirable to insert a package from a downstream VC repo into an upstream TriBITS Repository in order for one or more packages in the upstream repo to define a dependency on that package. The way this is supported in TriBITS is to list the inserted package in the PackagesList.cmake file of the upstream TriBITS repo, after the packages it depends on and before the packages that will use it, and then call the tribits_allow_missing_external_packages() function to allow the package to be missing. This is demonstrated in TribitsExampleProject with the package InsertedPkg, which is not included in the default TribitsExampleProject source tree. The TribitsExampleProject/PackagesList.cmake file looks like:
tribits_repository_define_packages(
  SimpleCxx        packages/simple_cxx        PT
  MixedLang        packages/mixed_lang        PT
  InsertedPkg      InsertedPkg                ST
  WithSubpackages  packages/with_subpackages  PT
  WrapExternal     packages/wrap_external     ST
  )

tribits_disable_package_on_platforms(WrapExternal Windows)
tribits_allow_missing_external_packages(InsertedPkg)
In this example, InsertedPkg has a required dependency on SimpleCxx and the package WithSubpackagesB has an optional dependency on InsertedPkg. Therefore, the inserted package InsertedPkg has upstream and downstream dependencies on packages in the TribitsExampleProject repo.
The function tribits_allow_missing_external_packages() tells TriBITS to treat InsertedPkg the same as any other package if the directory TribitsExampleProject/InsertedPkg exists, or to completely ignore the package InsertedPkg otherwise. In addition, TriBITS will automatically disable all downstream package dependencies on the missing package (and print a note about the disables). NOTE: By default, TriBITS will silently ignore missing inserted packages and disable optional support for the missing package. To see what packages are missing and being ignored, configure with:
-D <Project>_WARN_ABOUT_MISSING_EXTERNAL_PACKAGES=TRUE
For example, if these were in separate VC (e.g. git) repos, the way one would set up TribitsExampleProject to enable InsertedPkg would be to do:
$ git clone <some-url-base>/TribitsExampleProject
$ cd TribitsExampleProject
$ git clone <some-other-url-base>/InsertedPkg
$ echo /InsertedPkg/ >> .git/info/excludes
Then, when you configure TribitsExampleProject, the package InsertedPkg would automatically appear and could then be enabled or disabled like any other TriBITS package. The TriBITS test Tribits_TribitsExampleProject_InsertedPkg demonstrates this.
Assuming that one would put the (new) external package in a separate VC repo, one would perform the following steps:
There are cases where it is advantageous to have a raw CMake build system and a TriBITS CMake build system sit side-by-side in a CMake project. There are various ways to accomplish this but a very simple way that has minimal impact on the raw CMake build system is described here. An example of how to accomplish this is shown in the example project RawAndTribitsHelloWorld. This CMake project is a copy of the TribitsHelloWorld project that puts a primary default raw CMake build system side-by-side with a secondary TriBITS CMake build system. The key aspects of this basic approach shown in this example are:
The top-level RawAndTribitsHelloWorld/CMakeLists.txt file demonstrates the basic approach:
cmake_minimum_required(VERSION 3.23.0 FATAL_ERROR)

include(${CMAKE_CURRENT_SOURCE_DIR}/ProjectName.cmake)

# Called at the top of every CMakeLists.txt file
macro(include_tribits_build)
  if (${PROJECT_NAME}_TRIBITS_DIR)
    include("${CMAKE_CURRENT_SOURCE_DIR}/CMakeLists.tribits.cmake")
    return()
  endif()
endmacro()

if (${PROJECT_NAME}_TRIBITS_DIR)
  # TriBITS CMake project
  project(${PROJECT_NAME} NONE)
  include("${${PROJECT_NAME}_TRIBITS_DIR}/TriBITS.cmake")
  # Only one package in this simple project so just enable it :-)
  set(${PROJECT_NAME}_ENABLE_HelloWorld ON CACHE BOOL "" FORCE)
  tribits_project()
else()
  # Raw CMake project
  project(RawHelloWorld)
  enable_testing()
  add_subdirectory(hello_world)
endif()

# NOTE: The cmake_minimum_required() and project() commands must be executed
# in the base CMakeLists.txt file and *NOT* in an included file.
Then every raw CMakeLists.txt file starts with the command include_tribits_build() at the very top as shown in the example file RawAndTribitsHelloWorld/hello_world/CMakeLists.txt:
include_tribits_build()

# Build and install library
set(HEADERS hello_world_lib.hpp)
set(SOURCES hello_world_lib.cpp)
add_library(hello_world_lib ${SOURCES})
install(TARGETS hello_world_lib DESTINATION lib)
install(FILES ${HEADERS} DESTINATION include)

# Build and install user executable
add_executable(hello_world hello_world_main.cpp)
target_link_libraries(hello_world hello_world_lib)
install(TARGETS hello_world DESTINATION bin)

# Test the executable
add_test(test ${CMAKE_CURRENT_BINARY_DIR}/hello_world)
set_tests_properties(test PROPERTIES PASS_REGULAR_EXPRESSION "Hello World")

# Build and run some unit tests
add_executable(unit_tests hello_world_unit_tests.cpp)
target_link_libraries(unit_tests hello_world_lib)
add_test(unit_test ${CMAKE_CURRENT_BINARY_DIR}/unit_tests)
set_tests_properties(unit_test PROPERTIES
  PASS_REGULAR_EXPRESSION "All unit tests passed")
To configure the project as a raw CMake project, just configure it as you would any raw CMake project:
cmake [options] <some_base_dir>/RawAndTribitsHelloWorld
To configure it as a TriBITS project, just set the cache var RawAndTribitsHelloWorld_TRIBITS_DIR to point to a valid TriBITS source tree:
cmake [options] \
  -DRawAndTribitsHelloWorld_TRIBITS_DIR=<tribits_dir> \
  <some_base_dir>/RawAndTribitsHelloWorld
A twist on this use case is for a package that only builds as a TriBITS package inside of some larger TriBITS project and not as its own TriBITS CMake project. In this case, some slight changes are needed to this example but the basic approach is nearly identical. One still needs an if() statement at the top of the first CMakeLists.txt file (this time for the package) and the macro include_tribits_build() needs to be defined at the top of that file as well. Then every CMakeLists.txt file in subdirectories just calls include_tribits_build(). That is it.
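A minimal sketch of how that might look is given below (the use of if (COMMAND tribits_package) to detect a TriBITS build is an assumption here; a real package may key off a different variable or command):

# Hypothetical <packageDir>/CMakeLists.txt for a package that builds either
# under a parent TriBITS project or with a raw CMake build system
macro(include_tribits_build)
  if (COMMAND tribits_package)
    include("${CMAKE_CURRENT_SOURCE_DIR}/CMakeLists.tribits.cmake")
    return()
  endif()
endmacro()

include_tribits_build()

# ... raw CMake build for the package continues below ...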
As described in TriBITS-Compliant Internal Packages, it is possible to create a raw CMake build system for a CMake package that can build under a parent TriBITS CMake project. The raw CMake code for such a package must provide the <Package>::all_libs target both in the current CMake build system and also in the generated <Package>Config.cmake file for the build directory and in the installed <Package>Config.cmake file. Every such TriBITS-compliant internal package therefore is also capable of installing a TriBITS-compliant external package <Package>Config.cmake file (see How to implement a TriBITS-compliant external package using raw CMake).
A raw CMake build system for a TriBITS-compliant internal package is demonstrated in the TribitsExampleProject2 project Package1 package. The base CMakeLists.txt file for building Package1 with a raw CMake build system (called package1/CMakeLists.raw.cmake in that directory) looks like:
cmake_minimum_required(VERSION 3.23.0 FATAL_ERROR)

if (COMMAND tribits_package)
  message("Configuring raw CMake package Package1")
else()
  message("Configuring raw CMake project Package1")
endif()

# Standard project-level stuff
project(Package1 LANGUAGES C CXX)
include(GNUInstallDirs)
find_package(Tpl1 CONFIG REQUIRED)
add_subdirectory(src)
if (Package1_ENABLE_TESTS)
  include(CTest)
  add_subdirectory(test)
endif()

# Stuff that TriBITS does automatically
include("${CMAKE_CURRENT_LIST_DIR}/cmake/raw/DefineAllLibsTarget.cmake")
include("${CMAKE_CURRENT_LIST_DIR}/cmake/raw/GeneratePackageConfigFileForBuildDir.cmake")
include("${CMAKE_CURRENT_LIST_DIR}/cmake/raw/GeneratePackageConfigFileForInstallDir.cmake")
As shown above, this simple CMake package contains the basic features of any CMake project/package including calling the cmake_minimum_required() and project() commands as well as including GNUInstallDirs. In this example, the project/package being built Package1 has a dependency on an external upstream package Tpl1 pulled in with find_package(Tpl1). Also in this example, the package has native tests it defines with include(CTest) and add_subdirectory() (if Package1_ENABLE_TESTS is set to ON).
The file package1/src/CMakeLists.raw.cmake (which gets included from package1/src/CMakeLists.txt) creates a library and executable for the package and has the contents:
# Create and install library 'package1'

add_library(Package1_package1 Package1.hpp Package1.cpp)
target_include_directories(Package1_package1
  PUBLIC $<BUILD_INTERFACE:${CMAKE_CURRENT_SOURCE_DIR}>)
target_link_libraries(Package1_package1 PRIVATE tpl1::tpl1)
set_target_properties(Package1_package1 PROPERTIES
  EXPORT_NAME package1)
add_library(Package1::package1 ALIAS Package1_package1)

install(
  TARGETS Package1_package1
  EXPORT ${PROJECT_NAME}
  INCLUDES DESTINATION ${CMAKE_INSTALL_INCLUDEDIR}
  )
install(
  FILES Package1.hpp
  DESTINATION ${CMAKE_INSTALL_INCLUDEDIR}
  )

# Create and install executable 'package1-prg'

add_executable(package1-prg Package1_Prg.cpp)
target_link_libraries(package1-prg PRIVATE Package1::package1)

install(
  TARGETS package1-prg
  EXPORT ${PROJECT_NAME}
  INCLUDES DESTINATION ${CMAKE_INSTALL_INCLUDEDIR}
  )
This creates a single installable library target Package1_package1 which is aliased as Package1::package1 in the current CMake project and sets up to create the IMPORTED target Package1::package1 in the generated Package1ConfigTargets.cmake file, which gets included in the installed Package1Config.cmake (<Package>Config.cmake) file (as recommended in the book "Professional CMake", see below). In addition, the above code creates the installable executable package1-prg.
The Package1::all_libs (<Package>::all_libs) target is defined and set up inside of the included file package1/cmake/raw/DefineAllLibsTarget.cmake which contains the code:
# Generate the all_libs target(s)

add_library(Package1_all_libs INTERFACE)
set_target_properties(Package1_all_libs PROPERTIES
  EXPORT_NAME all_libs)
target_link_libraries(Package1_all_libs INTERFACE Package1_package1)

install(
  TARGETS Package1_all_libs
  EXPORT ${PROJECT_NAME}
  COMPONENT ${PROJECT_NAME}
  INCLUDES DESTINATION ${CMAKE_INSTALL_INCLUDEDIR}
  )

add_library(Package1::all_libs ALIAS Package1_all_libs)
The above code creates the ALIAS library target Package1::all_libs (<Package>::all_libs) for the current CMake project and also sets up the IMPORTED target Package1::all_libs (<Package>::all_libs) that gets put in the generated Package1ConfigTargets.cmake file (see below).
The Package1Config.cmake (<Package>Config.cmake) file for the build directory is generated inside of the included file package1/cmake/raw/GeneratePackageConfigFileForBuildDir.cmake which has the contents:
if (COMMAND tribits_package)

  # Generate Package1Config.cmake file for the build tree (for internal
  # TriBITS-compliant package)

  set(packageBuildDirCMakePackagesDir
    "${${CMAKE_PROJECT_NAME}_BINARY_DIR}/cmake_packages/${PROJECT_NAME}")
  export(EXPORT ${PROJECT_NAME}
    NAMESPACE ${PROJECT_NAME}::
    FILE "${packageBuildDirCMakePackagesDir}/${PROJECT_NAME}ConfigTargets.cmake" )
  configure_file(
    "${CMAKE_CURRENT_LIST_DIR}/Package1Config.cmake.in"
    "${packageBuildDirCMakePackagesDir}/${PROJECT_NAME}/Package1Config.cmake"
    @ONLY )

endif()
The above code uses the export(EXPORT ...) command to generate the file Package1ConfigTargets.cmake for the build directory which provides the IMPORTED targets Package1::package1 and Package1::all_libs. The command configure_file(...) generates the Package1Config.cmake file that includes it for the build directory <buildDir>/cmake_packages/Package1/. (NOTE: The above code only runs when the package is being built from inside of a TriBITS project which defines the command tribits_package. So this code gets skipped when building Package1 as a stand-alone raw CMake project.)
Finally, the code for generating and installing the Package1Config.cmake file for the install directory CMAKE_PREFIX_PATH=<installDir> is specified in the included file package1/cmake/raw/GeneratePackageConfigFileForInstallDir.cmake with the contents:
# Generate and install the Package1Config.cmake file for the install tree
# (needed for both internal and external TriBITS package)

set(pkgConfigInstallDir "${CMAKE_INSTALL_LIBDIR}/cmake/${PROJECT_NAME}")

install(EXPORT ${PROJECT_NAME}
  DESTINATION "${pkgConfigInstallDir}"
  NAMESPACE ${PROJECT_NAME}::
  FILE ${PROJECT_NAME}ConfigTargets.cmake )

configure_file(
  "${CMAKE_CURRENT_SOURCE_DIR}/cmake/raw/Package1Config.cmake.in"
  "${CMAKE_CURRENT_BINARY_DIR}/CMakeFiles/Package1Config.install.cmake"
  @ONLY )

install(
  FILES "${CMAKE_CURRENT_BINARY_DIR}/CMakeFiles/Package1Config.install.cmake"
  RENAME "Package1Config.cmake"
  DESTINATION "${pkgConfigInstallDir}" )
The above uses the command install(EXPORT ...) to have CMake automatically generate and install the file Package1ConfigTargets.cmake in the install directory <installDir>/lib/cmake/Package1/, which provides the IMPORTED targets Package1::package1 and Package1::all_libs. The command configure_file() is used to generate the file Package1Config.install.cmake in the build directory from the template file Package1Config.cmake.in. Finally, the install() command is used in the file GeneratePackageConfigFileForInstallDir.cmake to set up the installation of the Package1Config.cmake file.
Note, the template file package1/cmake/raw/Package1Config.cmake.in (which is unique to Package1) is:
set(Tpl1_DIR "@Tpl1_DIR@")
find_package(Tpl1 CONFIG REQUIRED)

include("${CMAKE_CURRENT_LIST_DIR}/Package1ConfigTargets.cmake")
As shown in all of the above code, there is a lot of boilerplate CMake code needed to correctly define the targets such that they get put into the installed Package1Config.cmake file using the correct namespace Package1::, and care must be taken to ensure that a consistent "export set" is used for this purpose. (For more details, see the book "Professional CMake".)
NOTE: One should compare the above raw CMakeLists files to the more compact TriBITS versions for the base package1/CMakeLists.txt file (called package1/CMakeLists.tribits.cmake in the base directory package1/):
message("Configuring package Package1 as full TriBITS package") tribits_package(Package1) add_subdirectory(src) tribits_add_test_directories(test) tribits_package_postprocess()
and the TriBITS package1/src/CMakeLists.txt file (called package1/src/CMakeLists.tribits.cmake):
tribits_include_directories(${CMAKE_CURRENT_SOURCE_DIR})

tribits_add_library(package1
  HEADERS Package1.hpp
  SOURCES Package1.cpp
  )

tribits_add_executable(package1-prg NOEXEPREFIX NOEXESUFFIX
  SOURCES Package1_Prg.cpp
  INSTALLABLE
  )
This shows the amount of boilerplate code that TriBITS addresses automatically (which reduces the overhead of finer-grained packages and avoids common mistakes with tedious low-level CMake code).
As described in TriBITS-Compliant External Packages, it is possible to create a raw CMake build system for a CMake package such that, once it is installed, it satisfies the requirements for a TriBITS-compliant external package. These installed packages provide a <Package>Config.cmake file that provides the required targets and behaviors as if it were produced by a TriBITS project. For most existing raw CMake projects that already produce a "Professional CMake"-compliant <Package>Config.cmake file, that usually just means adding the IMPORTED target called <Package>::all_libs to the installed <Package>Config.cmake file.
A raw CMake build system for a TriBITS-compliant external package is demonstrated in the TribitsExampleProject2 project Package1 package. The base package1/CMakeLists.txt file for building Package1 with a raw CMake build system (called package1/CMakeLists.raw.cmake) for implementing a TriBITS-compliant external package looks like:
cmake_minimum_required(VERSION 3.23.0 FATAL_ERROR)

if (COMMAND tribits_package)
  message("Configuring raw CMake package Package1")
else()
  message("Configuring raw CMake project Package1")
endif()

# Standard project-level stuff
project(Package1 LANGUAGES C CXX)
include(GNUInstallDirs)
find_package(Tpl1 CONFIG REQUIRED)
add_subdirectory(src)
if (Package1_ENABLE_TESTS)
  include(CTest)
  add_subdirectory(test)
endif()

# Stuff that TriBITS does automatically
include("${CMAKE_CURRENT_LIST_DIR}/cmake/raw/DefineAllLibsTarget.cmake")
include("${CMAKE_CURRENT_LIST_DIR}/cmake/raw/GeneratePackageConfigFileForInstallDir.cmake")
Note that the raw build system in this example is identical to the build system for the raw TriBITS-compliant internal package described in How to implement a TriBITS-compliant internal package using raw CMake. The only differences are:
Other than that, see How to implement a TriBITS-compliant internal package using raw CMake for how to implement a TriBITS-compliant external package.
The TriBITS test support functions tribits_add_test() and tribits_add_advanced_test() can be used from any raw (i.e. non-TriBITS) CMake project. To do so, one just needs to include the TriBITS modules:
and set the variable ${PROJECT_NAME}_ENABLE_TESTS to ON. For an MPI-enabled CMake project, the CMake variables MPI_EXEC, MPI_EXEC_PRE_NUMPROCS_FLAGS, MPI_EXEC_NUMPROCS_FLAG and MPI_EXEC_POST_NUMPROCS_FLAGS must also be set which define the MPI runtime program launcher command-line used in the TriBITS testing functions:
${MPI_EXEC} ${MPI_EXEC_PRE_NUMPROCS_FLAGS} ${MPI_EXEC_NUMPROCS_FLAG} <NP> ${MPI_EXEC_POST_NUMPROCS_FLAGS} <TEST_EXECUTABLE_PATH> <TEST_ARGS>
(NOTE: These variables are defined automatically in a TriBITS project when TPL_ENABLE_MPI is set to ON.)
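As a rough sketch (the launcher name and flags below are placeholders, not taken from any particular project), a raw MPI-enabled CMake project might set these up before including the TriBITS test support modules like:

# Enable the TriBITS-defined tests in this (raw) CMake project
set(${PROJECT_NAME}_ENABLE_TESTS ON)

# MPI launcher settings used by tribits_add_test()/tribits_add_advanced_test()
set(MPI_EXEC mpiexec CACHE FILEPATH "MPI program launcher")
set(MPI_EXEC_PRE_NUMPROCS_FLAGS "" CACHE STRING "Flags before the num-procs flag")
set(MPI_EXEC_NUMPROCS_FLAG -np CACHE STRING "Flag that sets the number of processes")
set(MPI_EXEC_POST_NUMPROCS_FLAGS "" CACHE STRING "Flags after the num-procs value")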
This is demonstrated in the TribitsExampleProject2 project Package1 package. The base package1/CMakeLists.txt file for building Package1 with a raw CMake build system using TriBITS testing functions (called package1/CMakeLists.raw.cmake) looks like:
cmake_minimum_required(VERSION 3.23.0 FATAL_ERROR)

if (COMMAND tribits_package)
  message("Configuring raw CMake package Package1")
else()
  message("Configuring raw CMake project Package1")
endif()

# Standard project-level stuff
project(Package1 LANGUAGES C CXX)
include(GNUInstallDirs)
find_package(Tpl1 CONFIG REQUIRED)
add_subdirectory(src)
if (Package1_ENABLE_TESTS)
  include(CTest)
  include("${CMAKE_CURRENT_LIST_DIR}/cmake/raw/EnableTribitsTestSupport.cmake")
  add_subdirectory(test)
endif()
The only difference between this base package1/CMakeLists.txt file and one for a raw CMake project is the inclusion of the file package1/cmake/raw/EnableTribitsTestSupport.cmake which has the contents:
set(Package1_USE_TRIBITS_TEST_FUNCTIONS OFF CACHE BOOL
  "Use TriBITS testing functions")
set(Package1_TRIBITS_DIR "" CACHE PATH
  "Path to TriBITS implementation base dir (e.g. TriBITS/tribits)")
if (Package1_USE_TRIBITS_TEST_FUNCTIONS AND Package1_TRIBITS_DIR)
  # Pull in and turn on TriBITS testing support
  include("${Package1_TRIBITS_DIR}/core/test_support/TribitsAddTest.cmake")
  include("${Package1_TRIBITS_DIR}/core/test_support/TribitsAddAdvancedTest.cmake")
  set(Package1_ENABLE_TESTS ON)
endif()
The key lines are:
include("${Package1_TRIBITS_DIR}/core/test_support/TribitsAddTest.cmake") include("${Package1_TRIBITS_DIR}/core/test_support/TribitsAddAdvancedTest.cmake")
These includes define the CMake functions tribits_add_test() and tribits_add_advanced_test(), respectively.
The above code demonstrates that CMAKE_MODULE_PATH does not need to be updated to use these TriBITS test_support modules. However, one is free to update CMAKE_MODULE_PATH and then include the modules by name only like:
list(PREPEND CMAKE_MODULE_PATH "${Package1_TRIBITS_DIR}/core/test_support")
include(TribitsAddTest)
include(TribitsAddAdvancedTest)
Once these TriBITS modules are included, one can use the TriBITS functions as demonstrated in the file package1/test/CMakeLists.tribits.cmake (which is included from the file package1/test/CMakeLists.txt) and has the contents:
tribits_add_test(package1-prg NOEXEPREFIX NOEXESUFFIX
  NAME Prg
  DIRECTORY "${CMAKE_CURRENT_BINARY_DIR}/../src"
  NUM_MPI_PROCS 1
  PASS_REGULAR_EXPRESSION "Package1 Deps: tpl1"
  )

tribits_add_advanced_test(Prg-advanced
  TEST_0
    EXEC package1-prg
      DIRECTORY "${CMAKE_CURRENT_BINARY_DIR}/../src"
      NOEXEPREFIX NOEXESUFFIX
    ARGS "something_extra"
    PASS_REGULAR_EXPRESSION_ALL
      "Package1 Deps: tpl1"
      "something_extra"
    ALWAYS_FAIL_ON_NONZERO_RETURN
  )
Note that in this example, the executable package1-prg was already created. If new test libraries and executables need to be created, then the raw CMake commands to create those will need to be added as well.
TriBITS defines a number of special <XXX>_ENABLE_<YYY> variables for enabling/disabling various entities that allow for a default "undefined" empty "" enable status. Examples of these special variables include:
(see TriBITS Dependency Handling Behaviors).
To check for and tweak these special "ENABLE" variables, perform the following:
To check whether an ENABLE variable has not yet been enabled or disabled (either explicitly or through auto enable/disable logic), use:
if ("${<XXX>_ENABLE_<YYY>}" STREQUAL "") # Variable has not been set to 'ON' or 'OFF' yet ... endif()
This works correctly regardless of whether the cache variable has been default-defined or not.
To tweak the enable/disable of one or more of these variables after user input but before the step "Adjust package and TPLs enables and disables" in Full Processing of TriBITS Project Files:
- To tweak the enables/disables for a TriBITS Repository (i.e. affecting all TriBITS projects) add enable/disable code to the file <repoDir>/cmake/RepositoryDependenciesSetup.cmake.
- To tweak the enables/disables for a specific TriBITS Project (i.e. affecting only that TriBITS project) add enable/disable code to the file <projectDir>/cmake/ProjectDependenciesSetup.cmake.
For example, one might default disable a package if it has not been explicitly enabled (or disabled) in one of these files using logic like:
if (NOT ${PROJECT_NAME}_ENABLE_Fortran)
  if ("${${PROJECT_NAME}_ENABLE_<SomeFortranPackage>}" STREQUAL "")
    message("-- " "NOTE: Setting ${PROJECT_NAME}_ENABLE_<SomeFortranPackage>=OFF because "
      "${PROJECT_NAME}_ENABLE_Fortran = '${${PROJECT_NAME}_ENABLE_Fortran}'")
    set(${PROJECT_NAME}_ENABLE_<SomeFortranPackage> OFF)
  endif()
endif()
In order to understand the above steps for properly querying and tweaking these ENABLE variables, one must understand how TriBITS CMake code defines and interprets variables of this type.
First, note that most of these particular ENABLE variables are not BOOL cache variables but are actually STRING variables with the possible values of ON, OFF, and empty "" (see the macro set_cache_on_off_empty()). Therefore, just because the value of a <XXX>_ENABLE_<YYY> variable is defined (e.g. if (DEFINED <XXX>_ENABLE_<YYY>) ... endif()) does not mean that it has been set to ON or OFF yet (or any non-empty value that evaluates to true or false in CMake). To see if an ENABLE variable is one of these types, look in the CMakeCache.txt file. If the type of the variable <XXX>_ENABLE_<YYY> is STRING and you see another variable set with the name <XXX>_ENABLE_<YYY>-STRINGS, then it is most likely this special type of ENABLE variable with a typical default value of empty "". However, if the cache variable is of type BOOL, then it is likely a standard bool variable that is not allowed to have a value of empty "".
Second, note that the value of empty "" evaluates to FALSE in CMake if() statements. Therefore, if one just wants to know if one of these variables evaluates to true, then just use if (<XXX>_ENABLE_<YYY>) ... endif().
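For example:

if (<XXX>_ENABLE_<YYY>)
  # Enabled (either explicitly or through auto enable/disable logic)
  ...
endif()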
Third, note that TriBITS will not define cache variables for these ENABLE variables until TriBITS processes the Dependencies.cmake files on the first configure (see Full TriBITS Project Configuration). On future reconfigures, these variables are all defined (but most will have a default value of empty "" stored in the cache).
The reason the files RepositoryDependenciesSetup.cmake and ProjectDependenciesSetup.cmake are the best places to put in these tweaks is because, as shown in Full Processing of TriBITS Project Files, they get processed after all of the user input has been read (in CMake cache variables set with -D<variable>=<value> and read in from ${PROJECT_NAME}_CONFIGURE_OPTIONS_FILE files) but before TriBITS adjusts the package enables and disables (see Package Dependencies and Enable/Disable Logic). Also, these files get processed in Reduced Package Dependency Processing as well so they get processed in all contexts where enable/disable logic is applied.
However, if one wants to tweak these variables once packages are starting to be processed (in step "For each <packageDir> in all enabled top-level packages" in Full TriBITS Project Configuration), there are fewer situations where that can be done correctly as described in the next section.
There are cases where one may need to enable or disable some feature that TriBITS may have enabled by default (such as in "Adjust package and TPLs enables and disables" in Full Processing of TriBITS Project Files) and that decision can only be made while processing a package's <packageDir>/CMakeLists.txt file and not before. (And therefore the logic for this disable cannot be performed in the ProjectDependenciesSetup.cmake or RepositoryDependenciesSetup.cmake files as described in How to check for and tweak TriBITS "ENABLE" cache variables.) Also, there are cases where it is necessary to make this change visible to downstream packages. The main example is when optional support of an upstream package in a downstream package <DownstreamPackage>_ENABLE_<UpstreamPackage> must be changed in the package's <packageDir>/CMakeLists.txt file. But there are other examples such as support for a given data-type that may impact multiple downstream packages.
When the internal configuration of a package (i.e. while processing its <packageDir>/CMakeLists.txt file) determines that an optional feature <Package>_ENABLE_<YYY> must change the value previously set (e.g. that was set automatically by TriBITS during the "Adjust package and TPLs enables and disables" stage in Full Processing of TriBITS Project Files), one cannot use a simple set() statement. Changing the value of a <Package>_ENABLE_<YYY> variable inside a package's <packageDir>/CMakeLists.txt file using a raw set(<Package>_ENABLE_<YYY> <newValue>) statement only changes the variable's value inside the package's scope, and all other packages will see the old value of <Package>_ENABLE_<YYY>. To correctly change the value of one of these variables, instead use dual_scope_set() from the top-level <packageDir>/CMakeLists.txt file. To perform this disable more robustly than calling dual_scope_set() directly, use the provided macro tribits_disable_optional_dependency(). For example, to disable optional support for <UpstreamPackage> in the <DownstreamPackage> package's <packageDir>/CMakeLists.txt file based on some criteria, add the CMake code:
if (<some-condition>)
  tribits_disable_optional_dependency( <UpstreamPackage>
    "NOTE: ${PACKAGE_NAME}_ENABLE_<UpstreamPackage> being set to OFF because of <reason>" )
endif()
Calling dual_scope_set() in the package's top-level <packageDir>/CMakeLists.txt file sets the value in both the local scope of <packageDir>/CMakeLists.txt (and therefore propagated to all other CMakeLists.txt files in that package) and in base-level (global) project scope. (But this does not change the value of a cache variable <Package>_ENABLE_<YYY> that may have been set by the user or some other means which is the desired behavior; see TriBITS auto-enables/disables done using non-cache local variables.) In this way, any downstream package (configured after processing <packageDir>/CMakeLists.txt) will see the new value for <Package>_ENABLE_<YYY>.
It is also strongly recommended that a message be printed to CMake STDOUT using message("-- " "NOTE: <message>") when changing the value of one of these <Package>_ENABLE_<YYY> variables. The user may have set it explicitly, or TriBITS may have applied automatic enable/disable logic to set it by default, and the user needs to know why and where the value is being overridden.
NOTE: However, one is not allowed to change the value of a global enable of an upstream or downstream package by changing the value of <Project>_ENABLE_<Package> or TPL_ENABLE_<Package> in a <packageDir>/CMakeLists.txt file. Changing the value of these variables after the "Adjust package and TPLs enables and disables" stage in Full Processing of TriBITS Project Files will result in undefined behavior.
The following steps describe how to set up a TriBITS project to manage multiple version control and TriBITS repositories by default (see Multi-Repository Support).
1) Add file <projectDir>/cmake/ExtraRepositoriesList.cmake and list out extra repos
For example, this file would contain something like:
tribits_project_define_extra_repositories(
  ExtraRepo1  ""                        GIT  git@someurl.com:ExtraRepo1  ""          Continuous
  ExtraRepo2  "ExtraRepo1/ExtraRepos2"  GIT  git@someurl.com:ExtraRepo2  NOPACKAGES  Continuous
  ExtraRepo3  ""                        GIT  git@someurl.com:ExtraRepo3  ""          Nightly
  )

NOTE: If one will not be using the checkin-test.py tool, or clone_extra_repos.py tool, or the TriBITS CTest/CDash Driver system, then one can leave the REPO_VCTYPE and REPO_URL fields empty (see tribits_project_define_extra_repositories() for details). (TriBITS Core does not have any dependencies on any specific VC tool. These fields are listed here to avoid duplicating the list of repos in another file when using these additional TriBITS tools.)
2) Set default values for ${PROJECT_NAME}_EXTRAREPOS_FILE and ${PROJECT_NAME}_ENABLE_KNOWN_EXTERNAL_REPOS_TYPE (and possibly ${PROJECT_NAME}_IGNORE_MISSING_EXTRA_REPOSITORIES) in the file <projectDir>/ProjectName.cmake
For example, add:
set(${PROJECT_NAME}_EXTRAREPOS_FILE cmake/ExtraRepositoriesList.cmake
  CACHE FILEPATH "Set in ProjectName.cmake")
set(${PROJECT_NAME}_ENABLE_KNOWN_EXTERNAL_REPOS_TYPE Continuous
  CACHE STRING "Set in ProjectName.cmake")

to the <projectDir>/ProjectName.cmake file. Otherwise, no extra repos will be defined or processed by default when configuring the project.
And if the project can operate without all of its extra repos, the project can set the following default in this file as well with:
set(${PROJECT_NAME}_IGNORE_MISSING_EXTRA_REPOSITORIES TRUE
  CACHE STRING "Set in ProjectName.cmake")

Otherwise, all of the extra repos need to be present or the project configure will fail.
3) If using git as the VC tool, then set the variable ${PROJECT_NAME}_GENERATE_REPO_VERSION_FILE_DEFAULT in the <projectDir>/ProjectName.cmake file
For example:
set(${PROJECT_NAME}_GENERATE_REPO_VERSION_FILE_DEFAULT TRUE)
4) If wanting a clone tool with git repos, set up a link to the clone_extra_repos.py tool in the base <projectDir>/ directory
Create a symlink to the script clone_extra_repos.py in the base project repo, for example with:
cd <projectDir>/
ln -s cmake/tribits/ci_support/clone_extra_repos.py .
git add clone_extra_repos.py
git commit
The following steps describe how to submit results to a CDash site using the TriBITS CTest/CDash Driver support.
To do this, one must have an account on the CDash site and the permissions to create a new CDash project. The name of the project on CDash should generally match the name of the TriBITS project PROJECT_NAME but it does not have to. In fact, one can generally submit to any CDash project with any name so creating a new CDash project is actually optional.
NOTE: For open-source projects, Kitware provides the free CDash site my.cdash.org that allows a limited number of submits and data per day. But it should be enough to test out submitting to CDash.
- The file <projectDir>/CTestConfig.cmake can be copied and pasted from TribitsExampleProject/CTestConfig.cmake. To customize for your project, you generally just need to update the variables CTEST_DROP_SITE, CTEST_PROJECT_NAME, and CTEST_DROP_LOCATION (a minimal sketch is shown after this list).
- The file <projectDir>/cmake/ctest/CTestCustom.cmake.in can be copied and pasted from TribitsExampleProject/cmake/ctest/CTestCustom.cmake.in and then modified as desired.
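For reference, a minimal sketch of a <projectDir>/CTestConfig.cmake file is given below (all values are hypothetical placeholders; copy the real file from TribitsExampleProject and adjust):

# Hypothetical <projectDir>/CTestConfig.cmake sketch
set(CTEST_NIGHTLY_START_TIME "04:00:00 UTC")
set(CTEST_DROP_METHOD "https")
set(CTEST_DROP_SITE "my.cdash.org")
set(CTEST_PROJECT_NAME "MyTribitsProject")
set(CTEST_DROP_LOCATION "/submit.php?project=MyTribitsProject")
set(CTEST_DROP_SITE_CDASH TRUE)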
Once the CDash project and the <projectDir>/CTestConfig.cmake and <projectDir>/cmake/ctest/CTestCustom.cmake.in files are created, one can perform an experimental submission by just configuring the project as normal (except configuring additionally with -DCTEST_BUILD_FLAGS=-j8 and -DCTEST_PARALLEL_LEVEL=8 to use parallelism in the build and testing in the ctest -S script) and then running the build target:
make dashboard

That will configure, build, test, and submit results to the Experimental track of the target CDash project on the CDash site (see Dashboard Submissions).
To work out problems locally without spamming the CDash site, one can run with:
env CTEST_DO_SUBMIT=OFF make dashboard

To submit to a different CDash site and project, change the cache vars CTEST_DROP_SITE, CTEST_PROJECT_NAME, and CTEST_DROP_LOCATION at configure time. For example, to submit TribitsExampleProject results to a different CDash site, configure with:
cmake \
  -DCTEST_DROP_SITE=testing.sandia.gov/cdash \
  -DCTEST_PROJECT_NAME=TribitsExProj \
  -DCTEST_DROP_LOCATION="/submit.php?project=TribitsExProj" \
  [other cmake options] \
  <baseDir>/TribitsExampleProject

and then run make dashboard.
For driving different builds and tests, one needs to set up one or more CTest -S driver scripts. There are various ways to do this but a simple approach that avoids duplication is to first create a file like TribitsExampleProject/cmake/ctest/TribitsExProjCTestDriver.cmake:
#
# Set the locations of things for this project
#

set(TRIBITS_PROJECT_ROOT "${CMAKE_CURRENT_LIST_DIR}/../..")
set(CTEST_SOURCE_NAME "TribitsExampleProject")
include("${TRIBITS_PROJECT_ROOT}/ProjectName.cmake")
if (NOT "$ENV{${PROJECT_NAME}_TRIBITS_DIR}" STREQUAL "")
  set(${PROJECT_NAME}_TRIBITS_DIR "$ENV{${PROJECT_NAME}_TRIBITS_DIR}")
endif()
if("${${PROJECT_NAME}_TRIBITS_DIR}" STREQUAL "")
  # If not set externally, then assume this is inside of tribits example
  # directory.
  set(${PROJECT_NAME}_TRIBITS_DIR "${CMAKE_CURRENT_LIST_DIR}/../../../..")
endif()

#
# Include the TriBITS file to get other modules included
#

include("${${PROJECT_NAME}_TRIBITS_DIR}/ctest_driver/TribitsCTestDriverCore.cmake")

function(tribitsexproj_ctest_driver)
  set_default_and_from_env( CTEST_BUILD_FLAGS "-j1 -i" )
  set_default_and_from_env( CTEST_PARALLEL_LEVEL "1" )
  tribits_ctest_driver()
endfunction()

and then create a set of CTest -S driver scripts that use that file. One example is the file TribitsExampleProject/cmake/ctest/general_gcc/ctest_serial_debug.cmake:
include("${CMAKE_CURRENT_LIST_DIR}/../TribitsExProjCTestDriver.cmake") set(COMM_TYPE SERIAL) set(BUILD_TYPE DEBUG) set(COMPILER_VERSION GCC) set(BUILD_DIR_NAME ${COMM_TYPE}_${BUILD_TYPE}) set( EXTRA_CONFIGURE_OPTIONS "-DBUILD_SHARED_LIBS:BOOL=ON" "-DCMAKE_BUILD_TYPE=DEBUG" "-DCMAKE_C_COMPILER=gcc" "-DCMAKE_CXX_COMPILER=g++" "-DCMAKE_Fortran_COMPILER=gfortran" "-DTribitsExProj_ENABLE_Fortran=ON" "-DTribitsExProj_TRACE_ADD_TEST=ON" ) set_default_and_from_env(TribitsExProj_CMAKE_INSTALL_PREFIX "") if (TribitsExProj_CMAKE_INSTALL_PREFIX) set(EXTRA_CONFIGURE_OPTIONS "${EXTRA_CONFIGURE_OPTIONS}" "-DCMAKE_INSTALL_PREFIX=${TribitsExProj_CMAKE_INSTALL_PREFIX}" ) endif() set(CTEST_TEST_TYPE Continuous) tribitsexproj_ctest_driver()
Once a CTest -S driver script (like the ctest_serial_debug.cmake example shown above) is created, one can test it locally and then test a submit to CDash. To test the exact state of the repository locally, one can create a temporary base directory, symbolically link in the local project source directory, and then run the CTest -S script by setting CTEST_DO_SUBMIT=OFF. For example, the TribitsExampleProject CTest -S script can be run and tested locally by doing:
$ mkdir MOCK_TRIBITSEXPROJ_SERIAL_DEBUG
$ cd MOCK_TRIBITSEXPROJ_SERIAL_DEBUG/
$ ln -s $TRIBITS_DIR/examples/TribitsExampleProject .
$ env CTEST_DASHBOARD_ROOT=$PWD \
    CTEST_DROP_SITE=testing.sandia.gov/cdash \
    CTEST_PROJECT_NAME=TribitsExProj \
    CTEST_DROP_LOCATION="/submit.php?project=TribitsExProj" \
    CTEST_TEST_TYPE=Experimental \
    CTEST_DO_SUBMIT=OFF \
    CTEST_DO_UPDATES=OFF \
    CTEST_START_WITH_EMPTY_BINARY_DIRECTORY=TRUE \
  ctest -V -S \
    $TRIBITS_DIR/examples/TribitsExampleProject/cmake/ctest/general_gcc/ctest_serial_debug.cmake \
  &> console.out

where TRIBITS_DIR is an env var that points to the location of the TriBITS/tribits directory on the local machine (and the location of the CDash site and project is changed, since the free my.cdash.org site can only accept a small number of builds each day).
Once that CTest -S driver script is working correctly without submitting to CDash, the above ctest -S command can be run with CTEST_DO_SUBMIT=ON, which will submit results to CDash and then print the location on CDash of the submitted configure, build, and test results.
The custom CTest -S driver scripts created above can be run and used to submit to CDash in a variety of ways:
- Cron jobs can be set up to run them at the same time every day.
- Jenkins jobs can be set up to run them based on various criteria.
- GitHub Actions can run them to respond to GitHub pushes or to test pull requests.
- GitLab CI can run them to respond to GitLab pushes or to test merge requests.
- Use the legacy TriBITS Dashboard Driver system (not recommended).
The setup of Jenkins, GitHub Actions, GitLab CI and other more sophisticated automated testing systems will not be described here. What will be briefly outlined below is the setup using cron jobs on a Linux machine. That is sufficient for most smaller projects and provides tremendous value.
To set up an automated build using a cron job, one will typically create a shell driver script that sets the env and then calls the ctest -S <script> command. Then one just adds a call to that shell driver script using crontab -e. That is about all there is to it.
Following up on How to submit testing results to a CDash site, to submit build and test results to a custom "Group" on CDash (instead of just "Nightly", "Continuous" or "Experimental"), one just has to create the new group on CDash using the CDash GUI interface and then tell the ctest -S local driver to submit results to that CDash group. The steps for doing this are given below.
Create the new CDash group <special_group> for CDash project <ProjectName> on the targeted CDash site.
If the CDash group <special_group> is not already created, then one can create it by first logging into CDash with an account that can modify the CDash project <ProjectName>. Once logged in, go to the project edit page and select "Settings" and "Groups". From there, create the new group <special_group>. Set the "Build Type" to either "Daily" or "Latest".
Set ${PROJECT_NAME}_TRACK=<special_group> with the CTest -S driver script.
One can either add set(${PROJECT_NAME}_TRACK <special_group>) to the CTest -S *.cmake driver script itself or set it in the environment when running the ctest -S driver script. For example:
$ env <Project>_TRACK=<special_group> ... \
  ctest -V -S <ctest_driver>.cmake
If the "build type" for the CDash group <special_group> was set to "Daily", then set CTEST_TEST_TYPE to Nightly. Otherwise, CTEST_TEST_TYPE can be set to Continuous or Experimental.
In this section, a number of miscellaneous topics and TriBITS features are discussed. These features and topics are either not considered primary features of TriBITS or don't neatly fit into one of the other sections.
The TriBITS git repository is organized as a TriBITS Project, TriBITS Repository, and TriBITS Package all in the same base directory. The base contents are described in the file:
TriBITS/README.DIRECTORY_CONTENTS.rst
The part of TriBITS that is installed or snapshotted is contained in the subdirectory TriBITS/tribits/ and is described in the following section.
This base directory for TriBITS acts as a TriBITS Project, a TriBITS Repository, and a TriBITS Package. As such, it contains the standard files that are found in a TriBITS Project, Repository, and Package:
ProjectName.cmake    # PROJECT_NAME=TriBITS
CMakeLists.txt       # PROJECT_NAME = PACKAGE_NAME = TriBITS
PackagesList.cmake   # Lists just "TriBITS . PT"
TPLsList.cmake       # Lists only MPI
cmake/               # Dependencies.cmake, etc.
The core functionality of TriBITS is provided in the following directory, tribits/:
tribits/: The part of TriBITS that CMake projects use to access TriBITS functionality and assimilate into the TriBITS framework. It also contains basic documentation and examples. Files and directories from here are what get installed on the system or are snapshotted into <projectDir>/cmake/tribits/. Each TriBITS Project decides what parts of it wants to install or snapshot using the script tribits/snapshot_tribits.py (which takes arguments for what dirs to snapshot, see below). This directory contains no tests at all. All of the tests for TriBITS are in the test/ directory (see below). The breakdown of the contents of tribits/ are described in the file tribits/README.DIRECTORY_CONTENTS.rst.
The following directories are not snapshotted into <projectDir>/cmake/tribits/ by the script tribits/snapshot_tribits.py:
test/: Contains all of the automated tests for TriBITS as part of the TriBITS "TriBITS" package. When doing development, these tests are critical.
dev_testing/: Contains scripts that support the development of the TriBITS system itself in various contexts.
common_tools/: Contains misc utilities that are not central to the TriBITS system but are very helpful to keep around and do not take up too much space.
refactoring/: Some scripts and other files that have aided in various refactorings of TriBITS and are used to upgrade client TriBITS projects.
This directory contains the implementation for the various parts of TriBITS that are used by TriBITS projects to implement TriBITS functionality. It also contains basic documentation in the subdirectory doc/ that is very close to the TriBITS implementation. Files and directories from here are what get installed on the system or will be snapshotted into <projectDir>/cmake/tribits/. Each TriBITS Project decides what parts of TriBITS it wants to install or snapshot using the script tribits/snapshot_tribits.py (which takes arguments for what dirs to snapshot). This directory contains no tests at all. All of the tests for TriBITS are in the test/ directory in the parent TriBITS repository.
The breakdown of the contents of this directory is described below:
TriBITS.cmake: The one file that needs to be included in order to use TriBITS in a CMake project. This one file insulates clients from future TriBITS refactorings of TriBITS.
Version.cmake: Version of TriBITS. This gets included by TriBITS.cmake
core/: Core TriBITS test support and package-based architecture for CMake projects. This contains just the minimal support for building, testing, installing, and deployment, and this CMake code depends only on raw CMake and nothing else.
python_utils/: Some basic Python utilities that are not specific to TriBITS but are used in TriBITS CI and testing support software. There are some very useful python scripts here like gitdist and snapshot-dir.py.
ci_support/: Support code for pre-push continuous integration testing. This contains the checkin-test.py script and its supporting Python modules.
ctest_driver/: Support for package-by-package testing driven by CTest submitting to CDash (to CDash project <Project>). This contains the file TribitsCTestDriverCore.cmake and some supporting modules.
dashboard_driver/: TriBITS Dashboard Driver system which uses CTest to drive individual project builds in parallel and submits results to a separate <Project>Driver CDash project. WARNING: This was written by a contractor and is the least well tested, most confusing, and least desirable part of TriBITS. If you have a better way to manage multiple builds (e.g. Jenkins), then use that instead.
common_tpls/: TPLs that are very common and are used by several different TriBITS projects but are not built into the TriBITS system itself. Having some of these common TPLs in a central location enhances uniformity, reuse, and makes it easier to pull TriBITS packages out of a repo and build them independently.
doc/: Basic TriBITS documentation built using docutils. The generated documents are not stored in the git repo but instead are built on command using the doc/build_docs.sh script.
examples/: Example TriBITS projects and TPLs. These can be copied out and used as examples
devtools_install/: Basic install scripts for tools like CMake, GCC, OpenMPI, MPICH, Git, etc. By default, these all download tarballs from the github.com/TriBITSPub/ site, and the repos are named devtools-<toolname>-<version>-base. This makes it easy to set up a new dev environment for projects that use TriBITS (or don't use TriBITS, for that matter).
win_interface/: Some non-Windows C header files ported to Windows to make porting to Windows easier.
The script snapshot_tribits.py installs the different pieces of this tribits/ directory into a project's <projectDir>/cmake/tribits/ subdirectory. It supports the argument --components with values core, python_utils, ci_support, ctest_driver, dashboard_driver, common_tpls, doc, examples, win_interface, and devtools_install. These snapshot components have the dependencies:
The TriBITS core/ directory is broken down into several subdirectories of its own:
core/utils: General CMake utilities that are not specific to the TriBITS system and can be reused in any CMake project.
core/common: A small set of common modules that the different TriBITS Core module files in different directories depend on. These include things like common TriBITS constants and TriBITS CMake policies.
core/test_support: Modules that help define CTest tests using functions like tribits_add_test() and tribits_add_advanced_test(). These can be used in CMake projects that are not full-blown TriBITS projects.
core/config_tests: Some basic configure-time tests used by the TriBITS package architecture framework.
core/std_tpls: Some Find<tplName>.cmake files for key external dependencies that are handled as TriBITS TPLs but are more central to the TriBITS system. (Examples include CUDA and MPI support.)
core/installation: A collection of *.cmake.in and related CMake code supporting installations.
core/package_arch: Modules for the full-blown TriBITS package architecture framework including package dependency management, multi-repository support, installations (including the generation of <Package>Config.cmake files), etc.
The dependencies between these different TriBITS core subdirectories are:
The core TriBITS system itself (see tribits/core/ in TriBITS/tribits/), which is used to configure, build, test, create tarballs, and install software, has no dependencies other than a basic installation of CMake (which typically includes the executables cmake, ctest, and cpack). Great effort has been expended to implement all of this core functionality of TriBITS using just raw CMake. That means that anyone who needs to configure, build, and install software that uses TriBITS just needs a compatible CMake installation. CMake is becoming ubiquitous enough that many machines will already have a current-enough version of CMake installed by default, and therefore no extra software needs to be downloaded or installed when building and installing a project that uses TriBITS (assuming the necessary compilers etc. required by the project are also installed). If a current-enough version of CMake is not installed on a given system, it is easy to download the source code, and all it needs is a basic C++ compiler to build and install.
However, note that a specific TriBITS project is free to use any newer CMake features it wants and therefore these projects will require newer versions of CMake than what is required by TriBITS (see discussion of cmake_minimum_required() in <projectDir>/CMakeLists.txt). But also note that specific TriBITS projects and packages will also require additional tools like compilers, Python (see Python Support), Perl, or many other such dependencies. It is just that TriBITS itself does not require any of these in order to perform the basic configure, build, test, and install of software. The goal of TriBITS is not to make the portability of software that uses it any worse than it already is but instead to make it easier in most cases (that after all is the whole goal of CMake).
While the TriBITS Core functionality to configure, build, test, and install software is written using only raw CMake, the more sophisticated development tools needed to implement the full TriBITS development environment require Python (see Python Support). Python is needed for tools like checkin-test.py and gitdist. In addition, these Python tools are used in tribits_ctest_driver() to drive automated testing and submit results to CDash. Also note that git is the chosen version control tool for the TriBITS software development tools and all the VC-related functionality in TriBITS. But none of this is required for doing the most basic building, testing, or installation of a project using TriBITS Core.
TriBITS Core does not require anything other than raw CMake. However, Python Utils, TriBITS CI Support, and other extended TriBITS components require Python. These extra TriBITS tools only require Python 3.6+. By default, when a TriBITS project starts to configure using CMake, it will try to find Python 3.6+ on the system (see Full Processing of TriBITS Project Files). If Python is found, it will set the global cache variable Python3_EXECUTABLE. If it is not found, then it will print a warning and Python3_EXECUTABLE will be empty. With this default behavior, if Python is found, then the TriBITS project can use it. Otherwise, it can do without it.
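Since Python3_EXECUTABLE may legitimately be empty in this default mode, package code that needs Python should guard its use. The following is a minimal sketch (the test name and driver script are hypothetical) of how a package's CMakeLists.txt file might only add a Python-driven test when an interpreter was found:

  # Only define the Python-driven test if the project-level find of Python
  # succeeded and set Python3_EXECUTABLE (hypothetical test name and script).
  if (Python3_EXECUTABLE)
    tribits_add_advanced_test( MyPythonDrivenTest
      TEST_0
        CMND ${Python3_EXECUTABLE}
        ARGS ${CMAKE_CURRENT_SOURCE_DIR}/myDriver.py
        PASS_REGULAR_EXPRESSION "ALL TESTS PASSED"
      )
  endif()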
While the default behavior for finding Python described above is useful for many TriBITS projects, some TriBITS projects need different behavior such as:
If a project that uses TriBITS is going to have a significant user base that will configure, build, and test the project, then having some documentation that explains how to do this is useful. For this purpose, TriBITS provides a mechanism to quickly create a project-specific build reference document in reStructuredText (RST) format with HTML and LaTeX/PDF outputs. This document is generally created in the base project source tree and given the name <Project>BuildReference.[rst,html,pdf]. This document consists of two parts. One part is a generic template document:
tribits/doc/TribitsBuildReferenceBody.rst
provided in the TriBITS source tree that uses the place-holder <Project> for the real project name. The second part is a project-specific template file:
<projectDir>/cmake/<Project>BuildReferenceTemplate.rst
which provides the outer RST document (with title, authors, abstract, introduction, other introductory sections). From these two files, the script:
tribits/doc/build_ref/create-project-build-quickref.py
is used to replace <Project> in the TribitsBuildReferenceBody.rst file with the real project name (read from the project's ProjectName.cmake file by default) and then generates the read-only files:
  <projectDir>/
    <Project>BuildReference.rst
    <Project>BuildReference.html
    <Project>BuildReference.pdf
For a simple example of this, see:
tribits/doc/build_ref/create-build-ref.sh
A project-independent version of this file is provided in TribitsBuildReference.[rst,html,pdf], which is referred to many times in this developers guide.
TriBITS has built-in support for project and repository versioning and release-mode control. When the project contains the file <projectDir>/Version.cmake, it is used to define the project's official version. The idea is that when it is time to branch for a release, the only file that needs to be changed is the file <projectDir>/Version.cmake.
Each TriBITS repository can also contain a <repoDir>/Version.cmake file that sets version-related variables which TriBITS packages in that repository can use to derive development and release version information. If the TriBITS repository also contains a <repoDir>/Copyright.txt file, then the information in <repoDir>/Version.cmake and <repoDir>/Copyright.txt are used to configure a repository version header file:
${${REPOSITORY_NAME}_BINARY_DIR}/${REPOSITORY_NAME}_version.h
The configured header file ${REPOSITORY_NAME}_version.h defines C pre-processor macros that give the repository version number in several formats, which allows C/C++ code (or any software that uses the C preprocessor) to write conditional code like:
  #if Trilinos_MAJOR_MINOR_VERSION > 100200
    /* Contains feature X */
    ...
  #else
    /* Does not contain feature X */
    ...
  #endif
Of course when the TriBITS project and the TriBITS repository are the same directory, the <projectDir>/Version.cmake and <repoDir>/Version.cmake files are the same file, which works just fine.
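For reference, a <repoDir>/Version.cmake file typically just sets a handful of version variables using the ${REPOSITORY_NAME} prefix so that the same file works whether the repo is the base project or an extra repository. The following is a minimal sketch; the exact variable set and values are illustrative assumptions modeled on the TriBITS example projects:

  # Sketch of a <repoDir>/Version.cmake file (version numbers are made up)
  set(${REPOSITORY_NAME}_VERSION 1.1)
  set(${REPOSITORY_NAME}_MAJOR_VERSION 01)
  set(${REPOSITORY_NAME}_MAJOR_MINOR_VERSION 010100)
  set(${REPOSITORY_NAME}_VERSION_STRING "1.1 (Dev)")
  set(${REPOSITORY_NAME}_ENABLE_DEVELOPMENT_MODE_DEFAULT ON)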
Part of the TriBITS Framework is to probe the environment, set up the compilers, and get ready to compile code. This was mentioned in Full Processing of TriBITS Project Files. This is executed by the TriBITS macro tribits_setup_env(). Some of the things this macro does are:
Probe and set up the environment:
At the completion of this part of the processing, the TriBITS CMake project is ready to compile code. All of the major variables set as part of this process are printed to the cmake stdout when the project is configured.
An issue that needs to be discussed here for external packages/TPLs like HDF5 is the fact that the FindTPL<tplName>.cmake modules (see How to add a new TriBITS TPL) create, and TriBITS installs, package config files of the name <tplName>Config.cmake. These TriBITS-generated <tplName>Config.cmake files could potentially be found by calls to find_package(<externalPkg>) (i.e. when <tplName> == <externalPkg>, as with HDF5). These TriBITS-generated <tplName>Config.cmake files are primarily meant to provide a TriBITS-compliant external package for downstream TriBITS-compliant <Package>Config.cmake files. They will usually not behave the way existing Find<tplName>.cmake find modules or native <tplName>Config.cmake package config files would behave, as expected by downstream projects, when found by find_package(<tplName>) calls in some arbitrary downstream raw CMake project. Therefore, to avoid having an installed TriBITS-generated HDF5Config.cmake file, for example, found by the inner call to find_package(HDF5 ...) in the file FindTPLHDF5.cmake (which could be disastrous), TriBITS employs two safeguards.
First, TriBITS-generated <tplName>Config.cmake package config files are placed into the build directory under:
<buildDir>/external_packages/<tplName>/<tplName>Config.cmake
and installed into the installation directory under:
<installDir>/lib/external_packages/<tplName>/<tplName>Config.cmake
so they will not be found by find_package(<tplName>) by default when <buildDir>/cmake_packages and/or <installDir>, respectively, are added to CMAKE_PREFIX_PATH.
Second, even if the directories <installDir>/lib/external_packages or <buildDir>/external_packages do get added to the search path somehow (e.g. by appending them to CMAKE_PREFIX_PATH), the companion TriBITS-generated <tplName>ConfigVersion.cmake files will set PACKAGE_VERSION_COMPATIBLE=OFF, which results in find_package(<tplName>) not selecting the TriBITS-generated <tplName>Config.cmake file. (It turns out that CMake's find_package(<Package>) command always includes the file <Package>ConfigVersion.cmake, even if no version information is passed to the find_package(<Package>) call. This allows special logic to be placed in the file <Package>ConfigVersion.cmake to determine if find_package(<Package>) will select a given <Package>Config.cmake file that is in the search path, based on a number of different criteria such as in this case.)
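To make the second safeguard concrete, the essential logic in such a <tplName>ConfigVersion.cmake file can be as simple as the sketch below. This is an illustrative assumption of the kind of logic involved, not the literal file that TriBITS generates:

  # <tplName>ConfigVersion.cmake (sketch): report the package as incompatible
  # so that a plain find_package(<tplName>) call from an arbitrary CMake
  # project will not select the companion <tplName>Config.cmake file sitting
  # in the same directory.
  set(PACKAGE_VERSION_COMPATIBLE OFF)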
For the most part, installation is pretty straightforward with a TriBITS-based CMake project. TriBITS automatically puts in appropriate default install() commands to install header files, libraries, executables, and other commonly installed artifacts (such as TriBITS-autogenerated <Package>Config.cmake files). And packages can add their own custom install() commands to install items under CMAKE_INSTALL_PREFIX (or the subdirs under CMAKE_INSTALL_PREFIX mentioned in Setting the install prefix). However, there are some special situations that need to be addressed and some tweaks to built-in CMake support that need to be made.
One issue that can occur is that there are cases where a Unix/Linux system is set up not to honor the group sticky bit and therefore one cannot control what group owns the created installed files and directories (i.e. the default group will be used). Also, there are cases where one cannot easily control the default file or directory creation permissions using umask. And there are cases where one would like to recursively install a set of directories and files where some of the files are scripts that need the execute permission set on them in order to work. The only flexible way to accomplish that with CMake (if one does not know the exact list of those files or their extensions) is to pass the USE_SOURCE_PERMISSIONS option to the install(DIRECTORY ...) command. An example of this is shown in:
that has:
  install(
    DIRECTORY "${CMAKE_CURRENT_LIST_DIR}/stuff"
    DESTINATION "${CMAKE_INSTALL_PREFIX}/share/${PACKAGE_NAME}"
    USE_SOURCE_PERMISSIONS
    PATTERN "*~" EXCLUDE
    )
In this case, CMake will preserve the execute permission on any of the scripts contained under the stuff/ subdirectory but group and other permissions will not be set based on umask or the default CMake install permissions. Instead, these permissions are set based on the source directory permissions (which is often set to 700 or rwx------).
To address cases like this, TriBITS can automatically run chgrp and chmod on the created files and directories that are created during the install target as described in Setting install ownership and permissions. This is completely automatic and requires nothing for the TriBITS Project developers to do to enable support for this (other than to note the below warning).
WARNING: Do not add any install() commands after the tribits_project() command completes. Otherwise, any extra files or directories will not have their group and permissions fixed by these special TriBITS-added chgrp and chmod commands run at install time. Instead, try to put all install() commands inside of a package's <packageDir>/CMakeLists.txt file. Currently, there really is no good place to add repo-level or project-level install() commands. But if one had to sneak them in, one could add various install() commands to files like <projectDir>/CMakeLists.txt (before the tribits_project() command), <repoDir>/cmake/CallbackSetupExtraOptions.cmake, <projectDir>/cmake/CallbackDefineProjectPackaging.cmake and/or <repoDir>/cmake/CallbackDefineRepositoryPackaging.cmake. (Note that install commands from the former two files are run before install commands for the enabled packages, while install commands from the latter two files are run after.)
One can also change what compilers are written into the generated <Project>Config.cmake and <Package>Config.cmake files for the build and the install trees. By default, the compilers pointed to in these Config.cmake files will be CMAKE_<LANG>_COMPILER where <LANG> = CXX, C, and Fortran. But one can change this by setting any of the following:
  set(CMAKE_CXX_COMPILER_FOR_CONFIG_FILE_BUILD_DIR <path>)
  set(CMAKE_C_COMPILER_FOR_CONFIG_FILE_BUILD_DIR <path>)
  set(CMAKE_Fortran_COMPILER_FOR_CONFIG_FILE_BUILD_DIR <path>)
  set(CMAKE_CXX_COMPILER_FOR_CONFIG_FILE_INSTALL_DIR <path>)
  set(CMAKE_C_COMPILER_FOR_CONFIG_FILE_INSTALL_DIR <path>)
  set(CMAKE_Fortran_COMPILER_FOR_CONFIG_FILE_INSTALL_DIR <path>)
before the Config.cmake files are generated. These can also be set in the CMake cache using, for example, -DCMAKE_CXX_COMPILER_FOR_CONFIG_FILE_INSTALL_DIR:FILEPATH=<path>.
This is used, for example, when compiler wrappers are used for the build tree and are set to CMAKE_<LANG>_COMPILER but when one wants to point to the original underlying compilers for the installed Config.cmake files.
As explained in Setting install RPATH, TriBITS changes the CMake defaults to write the RPATH into shared libraries and executables so that they run right out of the install directory without needing to set paths in the environment (e.g. LD_LIBRARY_PATH). However, these defaults can be changed by setting different project defaults for the variables ${PROJECT_NAME}_SET_INSTALL_RPATH and CMAKE_INSTALL_RPATH_USE_LINK_PATH. Most projects should likely keep these defaults in place since they make builds and installations on a single machine work correctly by default out of the box. For other installation/distribution use cases, the user is told how to manipulate CMake variables for those cases in Setting install RPATH.
CMake has good support for defining configure-time checks of the system to help in configuring the project. One can check whether a header file exists, whether the compiler supports a given data-type or language feature, or perform almost any other type of check that can be implemented using the configured compilers, libraries, system tools, etc. An example was given in TribitsExampleProject. Just follow that example, look at some of the built-in CMake configure-time test modules, and consult the on-line CMake documentation in order to learn how to create a configure-time test for almost anything.
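As a reminder of what such checks look like, below is a small sketch using standard CMake check modules; the package name and result variable names are hypothetical, and the resulting variables would then typically be configured into the package's configured header file:

  # Check for a system header (hypothetical result variable name).
  include(CheckIncludeFileCXX)
  check_include_file_cxx(unistd.h HAVE_MYPACKAGE_UNISTD_H)

  # Check that a small piece of C++ code compiles with the configured compiler.
  include(CheckCXXSourceCompiles)
  check_cxx_source_compiles(
    "#include <memory>
     int main() { auto p = std::make_unique<int>(5); return (*p == 5) ? 0 : 1; }"
    HAVE_MYPACKAGE_STD_MAKE_UNIQUE)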
The TriBITS system uses CMake's built-in CPack support to create source distributions in a variety of zipped and tarred formats. (Note that the term "source tarball" or just "tarball" may be used below but should be interpreted as "source distribution".) TriBITS will automatically add support for CPack when the variable ${PROJECT_NAME}_ENABLE_CPACK_PACKAGING is set to ON. The commands for creating a source distribution are described in Creating a tarball of the source tree using the built-in package_source build target. The value added by TriBITS is that TriBITS will automatically exclude the source for any defined packages that are not enabled, and TriBITS provides a framework for systematically excluding files and directories from individual repositories and packages. In addition, the source for non-enabled subpackages can also be excluded depending on the value of ${PROJECT_NAME}_EXCLUDE_DISABLED_SUBPACKAGES_FROM_DISTRIBUTION. All of this allows one to create distributions which only include subsets of a larger project (even a single package in some cases).
Unlike other build systems (like autotools), CMake will by default put EVERYTHING sitting in the source tree into the source distribution (e.g. tarball). Therefore, setting up for a source distribution usually means deciding what extra files and directories should be excluded. Beyond the directories for non-enabled packages, further files can be selected to be excluded on a package-by-package basis and at the repository level (see below).
Individual packages can list additional files/directories under the package's source tree to be excluded from the project's source distribution using a call to tribits_exclude_files() in their <packageDir>/CMakeLists.txt file, as sketched below. Note that if the package is not enabled, these excludes will never get added! That is no problem if these excludes only apply to the given package, since TriBITS will add an exclude for the entire package, but it is a problem if a package lists excludes for files outside of the package's source tree.
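A minimal sketch of such a call is shown below; the excluded paths are hypothetical, are given relative to the package's source directory, and are treated as regexes:

  # In <packageDir>/CMakeLists.txt (hypothetical excludes)
  tribits_exclude_files(
    doc/internal_notes
    test/large_data_files/.*[.]dat
    )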
Additional files and entire directories can also be excluded at the repository level by listing them in the <repoDir>/cmake/CallbackDefineRepositoryPackaging.cmake file, which typically just appends to the built-in CMake variable CPACK_SOURCE_IGNORE_FILES (see the sketch below). However, if a repository is not processed, this file is never processed either, and therefore no files will be excluded from a non-processed repository sitting in the source tree (see below).
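A sketch of such a repository-level callback file follows; the callback macro name shown is an assumption based on the usual TriBITS callback naming convention, and the exclude regexes are hypothetical:

  # <repoDir>/cmake/CallbackDefineRepositoryPackaging.cmake (sketch)
  macro(tribits_repository_define_packaging)
    set(CPACK_SOURCE_IGNORE_FILES
      ${CPACK_SOURCE_IGNORE_FILES}
      /[.]git/
      [.]gitignore$
      )
  endmacro()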
There are a number of project-level settings that need to be defined and these are specified in the file <projectDir>/cmake/CallbackDefineProjectPackaging.cmake.
The TribitsExampleProject is set up for creating source distributions and this is demonstrated in one of the tests defined in:
TriBITS/test/core/ExamplesUnitTests/CMakeLists.txt
There are a few points of caution to note about creating source distributions.
NOTE: It is worth stressing again that EVERY file that is in the source tree will be included in the source distribution (tarball) unless there is an exclude regex matching it appended to the variable CPACK_SOURCE_IGNORE_FILES. TriBITS can only add excludes for defined non-enabled packages. Every other file sitting in the source tree will be included in the tarball.
NOTE: The entries in CPACK_SOURCE_IGNORE_FILES are interpreted as REGULAR EXPRESSIONS, not globs, so if you add "someFile.*" as an exclude, it will exclude every file in the entire source tree that has "someFile" in its name! This is because, in regex terminology, the trailing ".*" means "match any character zero or more times" and "someFile" can match anywhere in the file name path. Also note that an exclude like ".pyc" (i.e. trying to exclude all of the generated Python byte-code files) will exclude every file that has "pyc" anywhere in its name and not just those with the file extension "pyc". For example, the exclude ".pyc" would exclude the files "puppyc", "lpycso", etc. If you want to exclude all files with the extension "pyc", you have to add the exclude regex ".*[.]pyc$"! A lack of understanding of this fact can cost hours of lost time debugging why random files are missing when one tries to configure what is left. Sometimes, what is left will actually configure and might almost build!
NOTE: As warned in TriBITS Package Core Files and TriBITS Subpackage Core Files, Packages must have directories that are strictly independent of the directories of other packages. If they don't, then the source directory for an enabled package will get excluded from the source distribution if its directory is under the directory of a package that is not enabled. For example, if PackageA is enabled but its package directory packageb/packagea/ is under the package directory packageb/ for the disabled package PackageB, then every file and directory under packageb/ will be excluded from the source distribution (tarball), including everything under packageb/packagea/! It would be too expensive to put in an automated check for cases like this so package developers should just take care not to nest the directories of packages inside of each other to avoid problems like this.
NOTE: Extra repositories that are sitting in the source tree but are not processed by TriBITS for some reason (e.g. because the variable ${PROJECT_NAME}_EXTRA_REPOSITORIES explicitly lists only a subset of the repositories listed in <projectDir>/cmake/ExtraRepositoriesList.cmake) will get added to the source distribution in full by default, even though there are no enabled packages from these repos. These non-processed repo dirs are like any other random directory sitting in the source tree: they will get copied over into the source distribution!
NOTE: When debugging tarball creation problems, always configure with the variable <Project>_DUMP_CPACK_SOURCE_IGNORE_FILES=ON. If you don't see a regex listed for the file or directory you expect to be excluded, then that file/directory will be included in the source distribution!
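Putting the above pieces together, a hypothetical debugging workflow for source-distribution excludes (using only the variables and the package_source target described above) looks like:

  $ cmake \
      -D<Project>_ENABLE_CPACK_PACKAGING=ON \
      -D<Project>_DUMP_CPACK_SOURCE_IGNORE_FILES=ON \
      [other configure options] <projectDir>
  $ make package_source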
There are cases where a customer will do an update of an upstream project from a git repo and then find out that some feature or behavior is broken with respect to their usage. This can happen even if the upstream project's own test suite passes all of its tests. Depending on the situation, there may be hundreds to thousands of commits between the last known "good" working version of the code and pulled "bad" version. To help customers find the first commit that contains the changes which are causing the breakage, git supplies git bisect. This set of commands does a binary search of the commits in the range <good-sha>..<bad_sha> and finds the first commit that is "bad" (or a range of commits which contains the first "bad" commit if commits are skipped, as described below).
But the git bisect commands require that all of the commits in the range <good-sha>..<bad_sha> be complete commits that provide full working versions of the upstream project. However, many beginning git developers and even many experienced developers don't always create individual git commits that build and pass all of the upstream project's tests and therefore can create false "bad" commits during the binary search. This can happen when developers create intermediate "check-point" commits during the development process but did not squash the intermediate commits together to create cohesive working commits. This can also happen when experienced developers have a set of what they believe are all working commits but do not actually test all of the commits to know that they pass all of the upstream project's tests before pushing these commits to the main development branch. This lack of detailed testing of each and every individual commit can give rise to false "bad" commits which will result in git bisect reporting the wrong first "bad" commit.
Projects that use the checkin-test.py tool to push sets of commits to the main development branch have an advantage in the usage of git bisect. This is because the default mode of the checkin-test.py script is to amend the top commit message with a summary of what was tested and therefore marks a set of commits that are known to have more complete testing. For example, the checkin-test.py tool amends the top commit (after the final pull and rebase by default) as shown in the following Trilinos commit:
  commit 71ce56bd2d268922fda7b8eca74fad0ffbd7d807
  Author: Roscoe A. Bartlett <bartlettra@ornl.gov>
  Date:   Thu Feb 19 12:04:11 2015 -0500

      Define HAVE_TEUCHOSCORE_CXX11 in TeuchosCore_config.h

      This makes TeuchosCore a good example for how Trilinos (or any TriBITS)
      subpackages should put in an optional dependency on C++11.

      Build/Test Cases Summary
      Enabled Packages: TeuchosCore
      Disabled Packages: [...]
      0) MPI_DEBUG => passed: passed=44,notpassed=0 (2.61 min)
      1) SERIAL_RELEASE => passed: passed=43,notpassed=0 (1.08 min)
Therefore, these special known-tested commits can be flagged by grepping the git log -1 HEAD output for the string "Build/Test Cases Summary". By bisecting on these commits, one has a lower chance of encountering false "bad" commits and has a higher chance of finding a smaller range of commits where the first true "bad" commit might be found. To aid in performing git bisect and only checking checkin-test.py-tested commits, the tool is_checkin_tested_commit.py is provided.
To demonstrate how the is_checkin_tested_commit.py tool can be used with git bisect, suppose that someone writes a customized script build_and_test_customer_code.sh that will build the upstream project and the downstream customer's code and then run a set of tests to see if the "bad" behavior seen by the customer code is present in the current HEAD version of the upstream project. Assume this script is copied into the upstream project's local git repo using:
  $ cd <upstream-repo>/
  $ cp ~/build_and_test_customer_code.sh .
  $ echo /build_and_test_customer_code.sh >> .git/info/exclude
Now, one could use build_and_test_customer_code.sh directly with:
$ git bisect run ./build_and_test_customer_code.sh
but that would result in testing all the commits which may have a high chance of producing false "bad" commits as described above and fail to correctly bracket the location of the true first "bad" commit.
So instead, one can write a filtered testing script safe_build_and_test_customer_code.sh which calls is_checkin_tested_commit.py and build_and_test_customer_code.sh as follows:
  #!/bin/bash
  #
  # Script: safe_build_and_test_customer_code.sh

  $TRIBITS_DIR/ci_support/is_checkin_tested_commit.py
  IS_CHECKIN_TESTED_COMMIT_RTN=$?
  if [ "$IS_CHECKIN_TESTED_COMMIT_RTN" != "0" ] ; then
    exit 125  # Skip the commit because HEAD is not known to be tested!
  fi

  ./build_and_test_customer_code.sh  # Rtn 0 "good", or [1, 124] if "bad"
The above test script safe_build_and_test_customer_code.sh will skip the testing of commits that are not marked by the checkin-test.py tool.
To demonstrate how to use the is_checkin_tested_commit.py script with git bisect, an example from Trilinos is used below. (Note that the current Trilinos public repository may have been filtered so the commit SHA1s shown below may not match what is in the current Trilinos repository. But one can use the commit summary message, author, and author date to find the updated SHA1s and then to update this example for the current repository.)
Consider a scenario where a customer application updates Trilinos from the commit:
d44c17d "Merge branch 'master' of software.sandia.gov:/space/git/Trilinos" Author: Roscoe A. Bartlett <xxx@ornl.gov> Date: Tue May 26 12:43:25 2015 -0400
to the commit:
605b91b "Merge branch 'master' of software.sandia.gov:/git/Trilinos" Author: Vitus Leung <xxx@sandia.gov> Date: Tue Sep 29 20:18:54 2015 -0600
and it is found that some critical feature broke or is not behaving acceptably for the customer code (but all of the tests for Trilinos pass just fine). This range of commits d44c17d..605b91b gives 2257 commits to search as shown by:
  $ cd Trilinos/
  $ git log --oneline d44c17d..605b91b | wc -l
  2257
However, as described above, doing git bisect on that full set of 2257 commits is likely to hit false "bad" commits and therefore result in a false bracketing of the first "bad" commit. This is where the usage of the checkin-test.py tool, which is used by many (but not currently all) Trilinos developers to push changes to the Trilinos 'master' branch in the current single-branch workflow, helps. The commits marked with the checkin-test.py tool are known (with some caveats mentioned below) to be working commits, and for the range of commits d44c17d..605b91b this yields 166 commits, as shown by:
  $ git log --oneline --grep="Build/Test Cases Summary" d44c17d..605b91b | wc -l
  166
That is an average of 2257/166 = 13.6 commits between commits pushed with the checkin-test.py tool. So bisecting on just the commits marked by checkin-test.py should bound the "bad" commit in a set of 13.6 commits on average. Bisecting on this set of 166 commits should likely give no false "bad" commits, and therefore result in the correct bracketing of the first "bad" commit.
Using the safe_build_and_test_customer_code.sh shown above, one would search for the first bad commit over this range using:
  $ git bisect start 605b91b d44c17d
  $ time git bisect run ./safe_build_and_test_customer_code.sh
and this would return the range of commits that contains the first "bad" commit (listed at the end of git bisect log output, see example below).
To provide a concrete example, suppose the commit that first introduced the problem in the range of commits d44c17d..605b91b was:
83f05e8 "MueLu: stop semi-coarsening if no z-layers are left." Author: Tobias Wiesner <tawiesn@sandia.gov> Date: Wed Jul 1 14:54:20 2015 -0600
Instead of using the script safe_build_and_test_customer_code.sh, we use a dummy driver script dummy_test_commit.sh to simulate this, which is provided with the TriBITS documentation in:
$TRIBITS_DIR/doc/developers_guide/scripts/dummy_test_commit.sh
as:
  #!/bin/bash
  #
  # Script: dummy_test_commit.sh
  #
  # This script simulates a test script used with 'git bisect run <script>' to
  # show how to use the is_checkin_tested_commit.py script to skip commits that
  # are not known to be tested with the checkin-test.py script.  To use this
  # script, set the env variable DUMMY_TEST_COMMIT_BAD_SHA to the SHA1 of a
  # commit that you are pretending is the bad commit in the range of commits
  # <good-commit>..<bad-commit> and then run:
  #
  #   git bisect start <bad-commit> <good-commit>
  #   git bisect run ./dummy_test_commit.sh
  #
  # This should result in git-bisect bounding the commits around
  # $DUMMY_TEST_COMMIT_BAD_SHA.  To see the sorted set of commits containing the
  # first bad commit, run:
  #
  #   git bisect log | grep "possible first bad commit"
  #

  LOG_DUMMY_COMMIT=`git log --oneline HEAD ^$DUMMY_TEST_COMMIT_BAD_SHA^`
  if [ "$LOG_DUMMY_COMMIT" == "" ] ; then
    echo "Commit is *before* bad commit $DUMMY_TEST_COMMIT_BAD_SHA!"
  else
    echo "Commit is or comes *after* bad commit $DUMMY_TEST_COMMIT_BAD_SHA!"
  fi

  # Skip the commit if not tested with the checkin-test.py script
  $TRIBITS_DIR/ci_support/is_checkin_tested_commit.py
  IS_CHECKIN_TESTED_COMMIT_RTN=$?
  if [ "$IS_CHECKIN_TESTED_COMMIT_RTN" != "0" ] ; then
    exit 125  # Skip the commit because it was not known to be tested!
  fi

  echo "Building the current version ..."

  if [ "$LOG_DUMMY_COMMIT" == "" ] ; then
    echo "Commit is *before* bad commit $DUMMY_TEST_COMMIT_BAD_SHA so marking good!"
    exit 0
  else
    echo "Commit is or comes *after* bad commit $DUMMY_TEST_COMMIT_BAD_SHA so marking bad!"
    exit 1
  fi
This driver script allows one to simulate the usage of git bisect to understand how it works without having to actually build and test code. It is a useful training and experimentation tool.
Using git bisect (with git version 2.1.0) over the range of commits d44c17d..605b91b searching for the first "bad" commit is done by running the commands:
  $ git bisect start 605b91b d44c17d

  $ env DUMMY_TEST_COMMIT_BAD_SHA=83f05e8 \
    time git bisect run \
    $TRIBITS_DIR/doc/developers_guide/scripts/dummy_test_commit.sh \
    &> ../git_bisect_run.log

  $ git bisect log &> ../git_bisect_log.log

  $ cat ../git_bisect_log.log | grep "possible first bad commit" | \
    sed "s/possible first bad commit://g" | sed "s/[a-z0-9]\{30\}\]/]/g"

  $ git bisect reset
This set of commands yield the output:
  Bisecting: 1128 revisions left to test after this (roughly 10 steps)
  [9634d462dba77704b598e89ba69ba3ffa5a71471] Revert "Trilinos: remove _def.hpp [...]"

  real    1m22.961s
  user    0m57.157s
  sys     3m40.376s

  # [165067ce53] MueLu: SemiCoarsenPFactory. Use strided maps to properly transfer [...]
  # [ada21a95a9] MueLu: refurbish LineDetectionFactory
  # [83f05e8970] MueLu: stop semi-coarsening if no z-layers are left.

  Previous HEAD position was 83f05e8... MueLu: stop semi-coarsening if no z-layers are left.
  Switched to branch 'master'
This output shows the dummy bad commit 83f05e8 in a set of just 3 commits, bounded in the set of commits 8b79832..165067c:
165067c "MueLu: SemiCoarsenPFactory. Use strided maps to properly [...]." Author: Tobias Wiesner <tawiesn@sandia.gov> Date: Thu Jul 2 12:11:24 2015 -0600 8b79832 "Ifpack2: RBILUK: adding additional ETI types" Author: Jonathan Hu <jhu@sandia.gov> Date: Thu Jul 2 14:17:40 2015 -0700
The set of commits that were actually tested by git bisect run <script> is shown by:
  $ cat ../git_bisect_log.log | grep "\(good:\|bad:\)" | sed "s/[a-z0-9]\{30\}\]/]/g"

  # bad: [605b91b012] Merge branch 'master' of software.sandia.gov:/git/Trilinos
  # good: [d44c17d5d2] Merge branch 'master' of software.sandia.gov:/space/git/Trilinos
  # good: [7e13a95774] Ifpack2: If the user does not provide the bandwidth of the banded [...]
  # bad: [7335d8bc92] MueLu: fix documentation
  # bad: [9997ecf0ba] Belos::LSQRSolMgr: Fixed bug in setParameters.
  # bad: [b6e0453224] MueLu: add a nightly test for the combination of semicoarsening [...]
  # bad: [165067ce53] MueLu: SemiCoarsenPFactory. Use strided maps to properly [...]
  # good: [3b5453962e] Ifpack2: Nuking the old ETI system
  # good: [8b79832f1d] Ifpack2: RBILUK: adding additional ETI types
This is only 9 commits out of the possible set of 166 checkin-test.py marked commits which is out of the total set of 2257 possible commits. With just 9 build/test cycles, it bounded the first "bad" commit in a set of 3 commits in this case. And it does not matter how sloppy or broken the intermediate commits are in Trilinos. All that matters is the usage of the checkin-test.py tool (another motivation for the usage of the checkin-test.py tool, see Pre-push Testing using checkin-test.py for others as well).
Note that above, we grep the output from git bisect log for the set of possible "bad" commits instead of just looking at the output from the git bisect run <script> command (which also lists the set of possible "bad" commits). This is because the direct output from the git bisect run <script> command (shown in the log file git_bisect_run.log) shows the set of possible bad commits at the end of the output but they are unsorted and give no other git commit information:
  There are only 'skip'ped commits left to test.
  The first bad commit could be any of:
  83f05e89706590c4b384dd191f51ef4ab00ce9bb
  ada21a95a991cd238581e5a6a96800d209a57924
  165067ce538af2cd0bd403e2664171726ec86f3f
  We cannot bisect more!
  bisect run cannot continue any more
The problem with unsorted commits is that it is difficult to use an unsorted set to do further bisection. However, the set of commits output by git bisect log is sorted and also shows the commit summary messages and is therefore much more useful. (Note that older versions of git don't show this set of commits at the end of git bisect log, so make sure to use an updated version of git, preferably >= 2.1.0.)
Now that one has the range of possible "bad" commits (just 3 in this example) doing a further manual bisection or even manual inspection of these commits may be enough to find the change that is causing the problem for the downstream customer application.
Without the usage of the checkin-test.py tool, one would not have an automated way to ensure that git bisect avoids false "bad" commits. This allows for less experienced developers to create commits and push to the main development branch but still ensure effective usage of git bisect. (This is another example where automated tools in TriBITS help to overcome lacking developer experience and discipline.)
The checkin-test.py tool can be used to implement staged integration of the various repositories in a multi-repo TriBITS project (see Multi-Repository Support). This is referred to here as Almost Continuous Integration (ACI). The basic concept of Almost Continuous Integration (ACI) is defined and described in the paper [Integration Strategies for CSE, 2009].
This topic is broken down into the following subsections:
The TriBITS system allows for setting up composite meta-builds of large collections of software pulled in from many different git/TriBITS code repositories as described in the section Multi-Repository Support. The checkin-test.py tool is a key tool to enable the testing of a set of packages in different git/TriBITS repos before pushing to remote tracking branches for the set of git repos; all in one robust command invocation.
While the checkin-test.py tool was originally designed and its default behavior is to test a set of local commits created by a developer before pushing changes to one or more (public) git repos, it can also be used to set up an Almost Continuous Integration (ACI) process to keep these various git/TriBITS repos in sync thereby integrating the work of various disconnected development teams and projects. To use the checkin-test.py tool for ACI requires some setup and changing what the tool does a little by passing in additional options that a regular developer typically never uses.
The following subsections describe how to use the checkin-test.py tool to implement an ACI process for a given set of git/TriBITS repositories and also provides a little background and context behind ACI.
In order to set up the context for the ACI process, consider the following simple TriBITS project with two extra repositories:
  BaseProj/
    ExtraRepo1
    ExtraRepo2
Here, BaseProj is the base TriBITS project/repository and ExtraRepo1 and ExtraRepo2 are extra repositories that supply additional TriBITS packages that are appended to the TriBITS packages defined in BaseProj (see <projectDir>/cmake/ExtraRepositoriesList.cmake). Also, assume that BaseProj, ExtraRepo1, and ExtraRepo2 are developed by three different development teams that all have different funding sources and different priorities so they tend not to work closely together or consider the other efforts too much when developing their software. However, in this example, there is great value in combining all of this software into a single integrated TriBITS meta-project. This combined meta-build is driven by a 4th integration/development team. In this case, the core developers for each of these three different git/TriBITS repos do not test compatibility with the other git/TriBITS repos when pushing commits to their own git/TriBITS repos. This gives three different git repos on three different machines:
BaseProj main repo: Pushed to by the core BaseProj team:
url1.gov:/git/BaseProj
ExtraRepo1 main repo: Pushed to by the core ExtraRepo1 team:
url2.gov:/git/ExtraRepo1
ExtraRepo2 main repo: Pushed to by the core ExtraRepo2 team:
url3.gov:/git/ExtraRepo2
Because of the independent development processes of these three teams, unless these development teams maintain 100% backward compatibility w.r.t. the interfaces and behavior of the combined software, one cannot at any time pull the code from these three different git repos and expect to be able to successfully build all of the code and have all of the tests pass. Therefore, how does the 4th integration team expect to be able to build, test, and possibly extend the combined software? In this case, the integration team would set up their own clones of all three git/TriBITS repos on their own machine such as:
Integration project mirrored git repos:
  url4.gov:/git/BaseProj
  url4.gov:/git/ExtraRepo1
  url4.gov:/git/ExtraRepo2
Once an initial collaboration effort between the integration team and the three other development teams is able to get a version of all three git/TriBITS repos to work correctly in the combined meta-project, these versions (assume the master branches) would be pushed to the git repos on the git integration server url4.gov. The state where the TriBITS packages in the three different git/TriBITS repos on the master branches on url4.gov all work together correctly constitutes the initial condition for the ACI process described below. From that initial condition, the ACI process ensures that updates to the master branches for the git/TriBITS repos on url4.gov do not break any builds or tests of the integrated software.
In order to describe how to set up an ACI process using the checkin-test.py tool, the following subsections will focus on the update of the git/TriBITS ExtraRepo1 repo keeping the other two git/TriBITS repos BaseProj and ExtraRepo2 constant as the ACI use case.
In order to set up an ACI process for the multi-git/TriBITS repo example outlined above, first local repos are created by cloning the repos on the integration server url4.gov as follows (all of which become 'origin'):
  $ cd $SYNC_BASE_DIR
  $ git clone url4.gov:/git/BaseProj
  $ cd BaseProj
  $ git clone url4.gov:/git/ExtraRepo1
  $ git clone url4.gov:/git/ExtraRepo2
where SYNC_BASE_DIR=~/sync_base_dir, for example, which must already exist.
Next, one defines a remote to pull changes for the ExtraRepo1 from the main development repo:
  $ cd $SYNC_BASE_DIR/BaseProj/ExtraRepo1
  $ git remote add public url2.gov:/git/ExtraRepo1
Here, one should pick a name for the remote repo for ExtraRepo1 that is most descriptive for that particular situation. In this case, the name public is chosen to signify the main public development repo.
This gives the remotes:
  $ cd $SYNC_BASE_DIR/BaseProj
  $ gitdist remote -v | grep -v push | grep -v "^$"

  *** Base Git Repo: BaseProj
  origin    url4.gov:/git/BaseProj (fetch)

  *** Git Repo: ExtraRepo1
  origin    url4.gov:/git/ExtraRepo1 (fetch)
  public    url2.gov:/git/ExtraRepo1 (fetch)

  *** Git Repo: ExtraRepo2
  origin    url4.gov:/git/ExtraRepo2 (fetch)
The remote public is used by the checkin-test.py wrapper script (see ACI Sync Driver Script below) to pull and merge in additional changes that will be tested and pushed to the 'origin' repos on url4.gov. In this case, the ExtraRepo1 remote public will result in updates being pulled from the main development repo on url2.gov, thereby facilitating the update of ExtraRepo1 in the integrated meta-project.
After the git repos are cloned and the remotes are set up as described above, a build base directory is set up as:
  $ cd $SYNC_BASE_DIR
  $ mkdir BUILDS
  $ mkdir BUILDS/CHECKIN
An ACI wrapper script for checkin-test.py is created to drive the syncing process. It is assumed that this script would be called only once a day and not continuously in a loop (that is possible as well, but it is not documented here).
NOTE: Other build directory structures are possible, it all depends how one writes the checkin-test.py wrapper scripts but the above directory structure is fairly standard in the usage of the checkin-test.py script.
The sync driver script for this example should be called something like sync_ExtraRepo1.sh, placed under version control, and would look something like:
  #!/bin/bash -e

  # Set up the environment (i.e. PATH; needed for cron jobs)
  ...

  SYNC_BASE_DIR=~/sync_base_dir
  CHECKIN_TEST_WRAPPER=$SYNC_BASE_DIR/BaseProj/sampleScripts/checkin-test-foo.sh

  cd $SYNC_BASE_DIR/BUILDS/CHECKIN

  $CHECKIN_TEST_WRAPPER \
    --extra-pull-from=ExtraRepo1:public:master \
    --abort-gracefully-if-no-changes-to-push \
    --enable-extra-packages=Package1A \
    --send-build-case-email=only-on-failure \
    --send-email-to=base-proj-integrators@url4.gov \
    --send-email-to-on-push=base-proj-integrators@url4.gov \
    --no-append-test-results --no-rebase \
    --do-all --push \
    -j16 \
    --wipe-clean \
    "$@"
NOTE, in the above example sync_ExtraRepo1.sh script, the variable CHECKIN_TEST_WRAPPER is set to a wrapper script:
BaseProj/sampleScripts/checkin-test-foo.sh
which would be set up to call the project's checkin-test.py tool with configure options for the specific machine. The location and the nature of the wrapper script will vary from project to project and machine to machine. In some simple cases, CHECKIN_TEST_WRAPPER might just be set to be the raw machine-independent checkin-test.py tool for the project.
A description of each option passed into this invocation of the checkin-test.py tool is given below (see checkin-test.py --help for more details):
--extra-pull-from=ExtraRepo1:public:master

This option instructs the checkin-test.py tool to pull and merge in the commits that define the integration. One could do the pull(s) manually, but doing so has the disadvantage that if they fail for some reason, they will not be seen by the checkin-test.py tool and no notification email would go out.

--abort-gracefully-if-no-changes-to-push

This option makes the checkin-test.py tool gracefully terminate without sending out any emails if, after all the pulls, there are no local changes to push to the 'origin' repos. This can happen, for example, if no commits were pushed to the main development git repo for ExtraRepo1 at url2.gov:/git/ExtraRepo1 since the last time this sync process was run. This avoids getting confusing and annoying emails like "PUSH FAILED". The reason this option is not generally needed for local developer usage of the checkin-test.py tool is that a developer will generally not run the checkin-test.py tool with --push unless they have made local changes; it does not make sense to do that, and if they do so by accident, they should get an error email. However, for an automated ACI sync process, there is no easy way to know a priori if changes need to be synced, so the script supports this option to deal with that case gracefully.

--enable-extra-packages=Package1A

This option should be set if one wants to ensure that all commits get synced, even when these changes don't impact the build or the tests of the project. If --enable-extra-packages=<some-package> is not set, then the checkin-test.py tool will decide on its own what packages to test based only on which packages have changed files in the ExtraRepo1 repo, and if no modified files map to a package, then no packages will be auto-enabled and therefore no packages will be enabled at all. For example, if a top-level README file in the base ExtraRepo1 repo that does not sit under a package directory gets modified, then the automatic logic in the checkin-test.py tool will not trigger a package enable. In that case, no configure, build, testing, or push will take place (at least some tests must run in order to assume it is safe to push) and therefore the sync will not occur. Therefore, if one wants to ensure that every commit gets safely synced over on every invocation, then the safest way to do that is to always enable at least one or more packages by specifying --enable-extra-packages=<pkg0>,<pkg1>,.... WARNING: it is not advisable to manually set --enable-packages=<package-list> since it turns off the auto-enable logic for changed files. If there are changes to other packages, then those packages will not get enabled or tested, which could break the global build and tests. This is also fragile if new packages are later added to ExtraRepo1 that are not listed in --enable-packages=<pkg0>,<pkg1>,..., as they will not be included in the testing. Likewise, if someone makes local commits in other local git repos before running the sync script again, then those packages will not get enabled and tested. Therefore, in general, don't set --enable-packages=<pkg0>,<pkg1>,... in a sync script; only set --enable-extra-packages=<pkg0>,<pkg1>,... to be robust and safe.

--send-build-case-email=only-on-failure

This makes the checkin-test.py tool skip sending email about a build case (e.g. MPI_DEBUG) unless it fails. That way, if everything passes, only a final DID PUSH email will go out. But if a build case does fail (i.e. the configure, build, or tests fail), then an "early warning" email will still go out. If one never wants the early build-case emails, they can be turned off by setting --send-build-case-email=never.

--send-email-to=base-proj-integrators@url4.gov

The results of the builds will be sent to this email address. If you only want an email sent when a push actually happens, you can set --send-email-to='' and rely on --send-email-to-on-push.

--send-email-to-on-push=base-proj-integrators@url4.gov

A confirmation and summary email will be sent to this address if the push happens. This can be a different email address than the one set by the --send-email-to option. It is highly recommended that a mail list be used for this email address since this will be the only more permanent logging of the ACI process.

--no-append-test-results --no-rebase

These options are needed to stop the checkin-test.py tool from modifying the commits being tested and pushed from one public git repo to another. The option --no-append-test-results instructs the checkin-test.py tool to NOT amend the last commit with the test results. The option --no-rebase avoids rebasing the newly pulled commits. While the default behavior of the checkin-test.py tool is to amend the last commit message and rebase the local commits (which is considered best practice when testing local commits), this is a very bad thing to do when an ACI sync server is only testing and moving commits between public repos. Amending the last commit would change the SHA1 of the commit (just as a rebase would), which would fork the history and mess up a number of workflows that otherwise should work smoothly. Since an email logging what was tested will go out if a push happens (due to the --send-email-to-on-push argument), there is no value in appending the test results to the last commit pulled and merged (which will generally not be a merge commit but just a fast-forward). There are cases where appending the test results in an ACI process might be acceptable, but they are not discussed here.

--do-all --push -j16

These are standard options that always get used when invoking the checkin-test.py tool and need no further explanation.

--wipe-clean

This option is added if you want to make the sync server more robust to changes that might require a configure from scratch. If you care more about using less computer resources and testing that rebuilds work smoothly, remove this option.
The sync script can be created and tested locally first, to ensure that it works correctly before setting it up as a cron job as described next. Also, the sync script should be version controlled in one of the project's git repos. This ensures that changes to the script that are pushed to the repos will get picked up automatically when they are pulled (but only on the second invocation of the script). If a change to the script is critical in order to do the pull, then one must manually pull the updated commit into the local sync repo.
Note that if using this in a continuous sync server that runs many times a day in a loop, one will also want to set the option --abort-gracefully-if-no-changes-pulled in addition to the option --abort-gracefully-if-no-changes-to-push. That is because if the updated repos are in a broken state such that there are always local changes at every CI iteration (because they have not been pushed to 'origin'), you don't want to do a new CI build unless something has changed that might make the error go away. That allows the CI server to sit ready to try out any change that gets pulled that might allow the integrated build to work, and then push the updates.
Once the sync script sync_ExtraRepo1.sh has been locally tested, then it should be committed to a version control git repo and then run automatically as a cron job. For example, the cron script shown below would fire off the daily ACI process at 8pm local time every night:
  # ----------------- minute (0 - 59)
  # |  -------------- hour (0 - 23)
  # |  |  ----------- day of month (1 - 31)
  # |  |  |  -------- month (1 - 12)
  # |  |  |  |  ----- day of week (0 - 7) (Sunday=0 or 7)
  # |  |  |  |  |
  # *  *  *  *  *  command to be executed
  00 20  *  *  *  ~/sync_base_dir/sync_ExtraRepo1.sh &> ~/sync_base_dir/sync_ExtraRepo1.out
In the above crontab file (set with 'crontab -e' or 'crontab my-crontab-file.txt'), the script:
~/sync_base_dir/sync_ExtraRepo1.sh
is assumed to be a soft symbolic link to some version controlled copy of the ACI sync script. For example, it might make sense for this script to be version controlled in the BaseProj repo and therefore the symbolic link would be created with something like:
  $ cd ~/sync_base_dir/
  $ ln -s BaseProj/sampleScripts/sync_ExtraRepo1.sh .
Such a setup ensures that the sync scripts will always be up-to-date due to the git pulls that are part of the ACI process.
After the above cron job starts running (setup described above), the checkin-test.py tool will send out emails to the email addresses passed into the underlying checkin-test.py tool. If the emails report an update, configure, build, or test failure, then someone will need to log onto the machine where the ACI sync server is running and investigate what went wrong, just like they would if they were running the checkin-test.py tool for testing locally modified changes before pushing.
In the above example, only a single git/TriBITS repo is integrated by this ACI sync script. For a complete system, other ACI sync scripts would be written to sync the two other git/TriBITS repos in order to maintain some independence. Or, a single ACI sync script that tries to update all three git/TriBITS repos at once could be written and used. The pattern of integrations chosen will depend on many different factors, and these patterns can change over time according to current needs and circumstances.
In summary, the checkin-test.py tool can be used to set up robust and effective Almost Continuous Integration (ACI) sync servers that can be used to integrate large collections of software in logical configurations at any logical frequency. Such an approach, together with the practice of Regulated Backward Compatibility and Deprecated Code, can allow very large collections of software to be kept integrated in short time intervals using ACI.
While the post-push CI and Nightly testing processes using ctest -S scripts with tribits_ctest_driver() (see TriBITS CTest/CDash Driver), which post results to a CDash server (see TriBITS CDash Customizations), constitute a very attractive testing system with many advantages, setting up a CDash server can be a bit difficult, a CDash server can require non-trivial storage and CPU resources (due to the MySQL DB of test results), and it requires some amount of extra maintenance. As an intermediate approach, one can consider just using the project's checkin-test.py tool to implement basic post-push CI and/or Nightly testing servers using simple cron jobs and some other helper scripts. The checkin-test.py tool will robustly pull new commits, configure the project, build, run tests, and send out emails with results and pass/fail. A number of builds can be run at once using multiple builds specified in the --default-builds, --st-extra-builds, and --extra-builds arguments, or different invocations of the checkin-test.py tool can be run in parallel for better machine utilization.
What one gives up with this approach over the full-blown CTest/CDash implementation is:
The process for setting up a basic nightly build using checkin-test.py as a cron job is a subset of the steps needed to set up a driver and cron job for the more complex ACI process described in ACI Sync Driver Script and ACI Cron Job Setup. Setting up a CI server using checkin-test.py is a little more involved than setting up a nightly build (and less desirable because it blows away old CI iteration results pretty fast) but can be done using existing tools.
For smaller, private, less-critical projects with few developers, setting up CI and Nightly testing using checkin-test.py may be quite adequate. In fact, post-push testing processes implemented with checkin-test.py are much more solid and feature-full than those that have been employed in many software projects that we have seen over the years that were larger, more public, had many developers, and were quite important to many users and development teams.
TriBITS also includes a system based on CMake/CTest/CDash to drive the builds of a TriBITS project and post results to another CDash project. This system is contained under the directory:
tribits/dashboard_driver/
If the TriBITS project name is <projectName>, the TDD driver CDash project is typically called <projectName>Driver. Using CTest to drive the Nightly builds of a TriBITS project makes sense because CTest can run different builds in parallel, can time-out builds that are taking too long, and will report results to a dashboard and submit notification emails when things fail. However, this is the most confusing and immature part of the TriBITS system. The TriBITS CTest/CDash Driver system using the tribits_ctest_driver() can be used without this TriBITS Dashboard Driver (TDD) system.
However, this TriBITS subsystem is not well tested with automated tests, is difficult to extend and test manually, and has other problems. Therefore, it is not recommended that projects adopt the usage of this subsystem. A simple set of cron jobs or a tool like Jenkins is likely a better option (if done carefully).
The motivation and ideas behind Regulated Backward Compatibility and deprecated code are described in the TriBITS Lifecycle Model document. Here, the details of the implementation in TriBITS are given, along with how transitions between non-backward-compatible major versions are accomplished.
This section describes the process for deprecating code within a major backward-compatible version sequence and finally removing deprecated code when transitioning from major version X to X+1 in the semantic versioning numbering scheme X.Y.Z. This information is given as a process with different phases.
The process for managing deprecated code is as follows, with more details given below:
Setting up support for managing deprecated code in a TriBITS package requires just two simple changes to the TriBITS-related files in a package. First, the top-level package <packageDir>/CMakeLists.txt file needs to have a call to:
tribits_add_show_deprecated_warnings_option()
Second, the package's configure header:
<packageDir>/cmake/<PACKAGE_UCNAME>_config.h.in
file needs to have:
@<PACKAGE_UCNAME>_DEPRECATED_DECLARATIONS@
where <PACKAGE_UCNAME> is the upper-case name of the TriBITS package.
That is all that is needed to provide support for TriBITS deprecated code.
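To make this concrete, the following is a minimal sketch of a top-level package CMakeLists.txt file that wires in this support for a hypothetical package SomePackage (the package name, subdirectory, and configured header name are illustrative; only the tribits_add_show_deprecated_warnings_option() call and the @SOMEPACKAGE_DEPRECATED_DECLARATIONS@ placeholder in the configured header template are required by this feature):

  # <packageDir>/CMakeLists.txt (sketch)
  tribits_package(SomePackage)

  # Adds the SomePackage_SHOW_DEPRECATED_WARNINGS option and defines the
  # declarations substituted into the configured header below
  tribits_add_show_deprecated_warnings_option()

  # Generates SomePackage_config.h from cmake/SomePackage_config.h.in, which
  # contains the line @SOMEPACKAGE_DEPRECATED_DECLARATIONS@
  tribits_configure_file(${PACKAGE_NAME}_config.h)

  add_subdirectory(src)

  tribits_package_postprocess()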
When a TriBITS project is configured with:
-D<Project>_SHOW_DEPRECATED_WARNINGS=ON
by default, all packages will show deprecated warnings. These deprecated warnings can be turned on and off on a package-by-package basis by setting:
-D<PackageName>_SHOW_DEPRECATED_WARNINGS=[ON|OFF]
This gives developers and users a little extra control over what deprecated warnings are shown.
In addition, deprecated code can be hidden from the build to help certify that downstream code is clean by configuring with:
-D<Project>_HIDE_DEPRECATED_CODE=ON
This will remove the deprecated code from the build and install (see details below) so that other software in the TriBITS project can be shown to build clean without deprecated code and so that outside client code can be shown to be clean of deprecated code.
As with deprecated warnings, showing or hiding deprecated code can be controlled on a package-by-package basis by setting:
-D<PackageName>_HIDE_DEPRECATED_CODE=[ON|OFF]
In this case, hiding deprecated code on a package-by-package basis may not work because deprecated code in a downstream package might rely on deprecated code in an upstream package (which might have its deprecated code hidden).
One of the most important aspects of the TriBITS Lifecycle Model for later-stage Production Growth and Production Maintenance code is to provide backward compatibility between a continuous stream of versions of the software within a major version number X in the version numbering scheme X.Y.Z. In all cases, if a piece of client code builds and runs correctly against version X0.Y0.Z0, it should also build, without modification, against versions X0.Y1.Z1 for all Y1 >= Y0 and all Z1, up to (but not including) version X0+1.0.0. There are many types of constructs that one will want to deprecate and therefore later remove. When deprecating code, one wants to give users compile-time warnings about the usage of deprecated features so that they know what they need to remove. One also wants to allow them to certify that their code is free of deprecated warnings by hiding deprecated code. Below, the different types of entities that one wants to deprecate and how to support hiding code (which also facilitates removing it later) are described.
To deprecate standard C/C++ constructs, one can just use the standard TriBITS compile-time macro <PACKAGE_UCNAME>_DEPRECATED which is properly ifdefed by the TriBITS system to add a GCC/Intel deprecated attribute or not. For example, one would deprecate standard C/C++ constructs for the package SomePackage with:
  // Deprecate a class (or struct)
  class SOMEPACKAGE_DEPRECATED SomeClass { ... };

  // Deprecate a function
  SOMEPACKAGE_DEPRECATED int somefunc(...);

  // Deprecate a typedef
  SOMEPACKAGE_DEPRECATED typedef int someTypeDef;
The GCC (version 3.1 and newer) and Intel C and C++ compilers both support adding extra attributes including the __deprecated__ attribute. When this attribute is applied to a given C/C++ entity, it produces a compiler warning that can be searched for in the compiler output and elevated to an error (when -Werror is also passed to the compiler).
In addition to the basic deprecated warning, one can also add an optional deprecated warning message using the macro <PACKAGE_UCNAME>_DEPRECATED_MSG(). For example, if a new function is replacing an old function, one might use:
  // Deprecated old unsafe function taking raw pointers
  SOMEPACKAGE_DEPRECATED_MSG(
    "Please use the safe somefunc(const Ptr<const std::string>&) instead!")
  void somefunc(const std::string *str);

  // New version that does not take raw pointers (and is therefore safer)
  void somefunc(const Teuchos::Ptr<const std::string> &str);
Then, if user code calls the version somefunc(const std::string*) they will get the string:
"Please use the safe somefunc(const Ptr<const std::string>&) instead!"
printed as well by the compiler. Note that the custom message will only be printed for GCC versions 4.5 and newer. If an older version of GCC is used, then the message string is ignored.
A C/C++ preprocessor macro is not an entity seen by the C/C++ compiler and therefore cannot directly take advantage of a feature such as the __deprecated__ attribute of the GCC/Intel compilers. However, in some cases, for function-like macros such as:
  // The definition of the macro
  #define some_old_func_macro(ARG1, ARG2, ...) ...

  ...

  // A use of the macro
  some_old_func_macro(a, b, ...)
there is a strategy where one can define the macro to call a dummy deprecated function such as with:
  SOMEPACKAGE_DEPRECATED
  inline void some_old_func_macro_is_deprecated() {}

  // The definition of the macro
  #define some_old_func_macro(ARG1, ARG2, ...) \
    { \
      some_old_func_macro_is_deprecated(); \
      ... \
    }

  ...

  // A use of the macro
  some_old_func_macro(a, b, ...)
In the above example, client code calling some_old_func_macro() will now result in a deprecated compiler warning which should make it clear what is being deprecated.
Note that this approach will not work in cases where a macro only provides a single value such as a constant (but one should not use macros for providing constants anyway).
There are times when one wants to deprecate an entire set of files and all of the contents in those files. In addition to deprecating the contents of the files one will want to deprecate the entire file as well. There are a few steps to this. First, one wants to put a warning in the file such as:
  #ifdef __GNUC__
  #  warning "This header <THIS_HEADER> is deprecated!  Use <NEW_HEADER> instead!"
  #endif
The above ifdef on __GNUC__ is needed in order to avoid the non-standard #warning preprocessor directive on compilers that do not support it (but it should work on all later versions of the GCC and Intel compilers).
In addition to adding deprecation warnings at preprocessing or compile-time, it is also highly desirable to allow the deprecated code to be removed from the build to help certify that client code indeed no longer needs the deprecated code. The following subsections describe how to hide deprecated code from existing files and how to hide deprecated files entirely.
In the case when various C/C++ entities will be removed from an existing file, but the file will remain, then the deprecated code can be ifdefed out, for the package SomePackage for example, using:
  #ifndef SOMEPACKAGE_HIDE_DEPRECATED_CODE

  // Deprecate a class (or struct)
  class SOMEPACKAGE_DEPRECATED SomeClass { ... };

  ...

  #endif /* SOMEPACKAGE_HIDE_DEPRECATED_CODE */
In this way, when the CMake variable SomePackage_HIDE_DEPRECATED_CODE=ON is set, the deprecated code will be completely removed, resulting in compile errors for any downstream code still using it.
In order to hide entire deprecated header and source files when the CMake variable <PackageName>_HIDE_DEPRECATED_CODE=ON is set, one needs to move the headers and sources to another directory and provide for conditional inclusion in the TriBITS build of the library. For example, suppose one wants to remove support for the deprecated files SomeOldStuff.hpp and SomeOldStuff.cpp. In this case, one would move the files into a new deprecated/ sub-directory and then write the CMakeLists.txt file like:
  set(HEADERS "")
  set(SOURCES "")

  set_and_inc_dirs(DIR ${CMAKE_CURRENT_SOURCE_DIR})
  append_glob(HEADERS ${DIR}/*.hpp)
  append_glob(SOURCES ${DIR}/*.cpp)

  if (NOT ${PACKAGE_NAME}_HIDE_DEPRECATED_CODE)
    include_directories(${CMAKE_CURRENT_SOURCE_DIR}/deprecated)
    append_set(HEADERS
      SomeOldStuff.hpp
      )
    append_set(SOURCES
      SomeOldStuff.cpp
      )
  endif()

  ...

  tribits_add_library(
    <LIBRARY_NAME>
    HEADERS ${HEADERS}
    SOURCES ${SOURCES}
    )
In this way, when ${PACKAGE_NAME}_HIDE_DEPRECATED_CODE=TRUE, then the directory for the deprecated headers will not be in the include path (so downstream clients will not even be able to see them) and they will not be installed so external clients will not be able to see them either. However, when ${PACKAGE_NAME}_HIDE_DEPRECATED_CODE=FALSE, then these files will be included in the build and include path and downstream clients can use them.
Once these files need to be permanently removed, one just then needs to remove them from the version control repository (i.e. git rm <files_to_remove>) and then remove them from the above CMakeLists.txt code.
The final step in the code deprecation cycle is to actually remove the deprecated code. This is necessary to clean house, remove clutter and finally get the payoff in the reduction of technical debt that occurs when removing what is no longer needed or wanted.
It is recommended to remove deprecated files first, then remove deprecated file fragments from remaining files. Also, it is recommended to create git commits after each step.
It is also recommended that some time before deprecated code is actually removed, that a TriBITS repo change the default of <Project>_HIDE_DEPRECATED_CODE from OFF to ON so that downstream clients will see the effects of hiding the deprecated code before the code is actually removed. In fact, this default should be changed several days to a week or more before the code is actually removed. This way, downstream code developers will get a "shock" about removal of the deprecated code but can manually configure with -D<Project>_HIDE_DEPRECATED_CODE=OFF to keep building in the short-term until they can remove their usage of deprecated code.
To remove entire deprecated header and source files one just needs to first remove them from the version control repository and local directories (e.g. git rm deprecated/*) and then remove any traces of them from the CMakeLists.txt file. For the example in Hiding entire deprecated header and source files, one would just remove the files SomeOldStuff.hpp and SomeOldStuff.cpp from the CMakeLists.txt file leaving:
  if (NOT ${PACKAGE_NAME}_HIDE_DEPRECATED_CODE)
    include_directories(${CMAKE_CURRENT_SOURCE_DIR}/deprecated)
    append_set(HEADERS
      )
    append_set(SOURCES
      )
  endif()
Since more files may be deprecated later, it may be a good idea to leave the machinery for conditionally including deprecated files by leaving the above empty CMake code or just commenting it out.
To find which CMakeLists.txt files need to be modified, do a search like:
$ find . -name CMakeLists.txt -exec grep -nH HIDE_DEPRECATED_CODE {} \;
After removing the files, create a local commit of the removed files and the updated CMakeLists.txt files before removing deprecated fragments from the source files. In other words, do:
  $ emacs -nw CMakeLists.txt  # Remove the references to the deprecated files
  $ git rm SomeOldStuff.hpp SomeOldStuff.cpp
  $ git commit -m "Removing deprecated files"
The deprecated ifdefed blocks described in Hiding C/C++ entities can be removed manually, but it is generally preferred to use a tool. One simple tool that can do this is called unifdef, which can be downloaded from and is documented at:
http://dotat.at/prog/unifdef/
Just download, build, and install the program unifdef (see unifdef/INSTALL in untarred source) and then run it as described below. In the example below, assume the program is installed in the user's home directory under:
~/install/bin/unifdef
For a given TriBITS package, the program is then run as:
  $ find . -name "*pp" -exec ~/install/bin/unifdef \
    -DSOMEPACKAGE_HIDE_DEPRECATED_CODE {} -o {} \;
After the above command runs, look at the diffs to make sure the ifdef deprecated code blocks were removed correctly. For example, run:
$ git diff -- .
If the diffs look correct, commit the changes:
$ git commit -m "Removing deprecated code blocks" -- .
Then test everything and push using the checkin-test.py tool.
After that, all deprecated code is removed and the next period of incremental change and deprecation begins.
TriBITS has some built-in support for installation testing and backward compatibility testing. The way it works is that one can install the headers, libraries, and executables for a TriBITS project and then configure the tests and examples in the TriBITS project against the installed headers/libraries/executables. In this mode, the TriBITS project's libraries and executables are not built and the header file locations in the local source tree are not included.
When the same version of the project sources are used to build the tests/examples against the installed headers/libraries/executables, then this constitutes installation testing. When an older version of the project is used to build and run tests and examples against the headers/libraries/executables for a version of the project, then this constitutes backward compatibility testing which also includes installation testing of course.
It is possible to take an external piece of software that uses any arbitrary build system, wrap it as a TriBITS package, and have it integrate with the package dependency infrastructure. The TribitsExampleProject package WrapExternal shows how this can be done. Support for this in TriBITS is slowly evolving, but some key TriBITS features that have been added to support the arbitrary case include:
While it is possible to wrap nearly any externally configured and built piece of software as a TriBITS package, in most cases it is usually better to just create a TriBITS build system for the software. For projects that use a raw CMake build system, a TriBITS build system can be created side-by-side with the existing raw CMake build using a number of approaches. A common approach that is not too invasive is to create a CMakeLists.tribits.txt file alongside every native CMakeLists.txt file in the external software project and define the native CMakeLists.txt file like:
  if (DOING_A_TRIBITS_BUILD)
    include("${CMAKE_CURRENT_SOURCE_DIR}/CMakeLists.tribits.txt")
    return()
  endif()

  # Rest of native CMakeLists.txt file ...
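A corresponding CMakeLists.tribits.txt file then contains the TriBITS versions of the build rules. A minimal hypothetical sketch (the package, library, and file names are illustrative, not from any real project) might look like:

  # CMakeLists.tribits.txt (sketch, sits next to the native CMakeLists.txt)
  tribits_package(SomeExternalSoftware)

  # Build the same sources as the native build, but as a TriBITS library
  tribits_add_library(
    some_external_lib
    HEADERS  SomeHeader.hpp
    SOURCES  SomeSource.cpp
    )

  # Tests live in their own subdirectory and are only processed when enabled
  tribits_add_test_directories(test)

  tribits_package_postprocess()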
Experience from the CASL VERA project has shown that, overall, there is less hassle, less work, and better portability when creating a native TriBITS build, even if it is a secondary build system for a given piece of software.
The decision whether to just wrap the build system for the existing software or to create a (secondary) TriBITS build system for it depends on a number of factors.
Note that while it is generally recommended to create a TriBITS build for an existing piece of software, it is generally not recommended to switch over all of the tests to use CTest (unless the existing software is going to ditch its current test driver and reporting system). Instead, the external software's native test system can just be called by the wrapped TriBITS package in one or more CTest tests, as shown in the sketch below.
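The following is only a sketch under assumed names (the run_native_tests.sh driver script, its arguments, and the pass regex are hypothetical and not part of TriBITS):

  # Sketch: run the external software's own test driver as one CTest test
  tribits_add_advanced_test( RunNativeTestSuite
    OVERALL_WORKING_DIRECTORY TEST_NAME
    TEST_0 CMND ${CMAKE_CURRENT_SOURCE_DIR}/external_src/run_native_tests.sh
      ARGS --build-dir=${CMAKE_CURRENT_BINARY_DIR}
      PASS_REGULAR_EXPRESSION "ALL TESTS PASSED"
    )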
Some TriBITS projects choose to snapshot the TriBITS/tribits/ directory source tree into their project's source tree, typically under <projectDir>/cmake/tribits/. The independent TriBITS/tribits/ source tree contains the tool snapshot_tribits.py (which calls snapshot-dir.py) that allows one to update the snapshot of the TriBITS source tree as simply as:
  $ cd <projectDir>/cmake/tribits/
  $ <some-base-dir>/TriBITS/tribits/snapshot_tribits.py
This will create a git commit in the local <projectDir>/ git repo that looks like:
  Automatic snapshot commit from tribits at f8c1682

  Origin repo remote tracking branch: 'casl-dev-collab/tribits_reorg_26'
  Origin repo remote repo URL: 'casl-dev-collab = git@casl-dev:collaboration/TriBITS'

  At commit:

  f8c1682 Assert TriBITS min CMake version in TriBITS itself
  Author: Roscoe A. Bartlett <bartlettra@ornl.gov>
  Date: Fri Dec 5 05:40:49 2014 -0500
This, of course, assumes that <projectDir>/ is a local git repo (or is inside a local git repo). If that is not the case, then one cannot use the script snapshot_tribits.py, or one must use it with the --skip-commit option.
See snapshot-dir.py --help for more details. Note the guidance on using a different branch for the snapshot sync followed by a merge. This allows one to maintain local changes to TriBITS and use git to manage the merges. However, this will increase the chances of merge conflicts, so one should consider just directly snapshotting into the master branch to avoid merge conflicts.
Most TriBITS projects need git, a compiler (e.g. GCC), MPI, and a number of other standard TPLs and other tools in order to develop on and test the project code. To this end, TriBITS contains some helper scripts for downloading, configuring, building, and installing packages like git, cmake, GCC, MPICH, and others needed to set up a development environment for a typical computational science software project. These tools are used to set up development environments on new machines for projects like Trilinos and CASL VERA. Scripts with names like install-gcc.py are defined which pull sources from public git repos then configure, build, and install into specified installation directories.
The script install_devtools.py is provided in the directory:
tribits/devtools_install/
To use this script, one just needs to create some scratch directory like:
  $ mkdir scratch
  $ cd scratch/
then install the tools using, for example:
  $ install_devtools.py --install-dir=~/install/tribits_devtools \
    --parallel=16 --do-all
Then to access the installed development environment, one just needs to source the script:
~/install/tribits_devtools/load_dev_env.sh
and then the installed versions of GCC, MPICH, CMake, and gitdist are placed in one's path.
See install_devtools.py --help for more details.
The following subsections contain detailed reference documentation for the various TriBITS variables, functions, and macros that are used by TriBITS projects that TriBITS Project Developers need to know about. Variables, functions and macros that are used only internally in TriBITS are generally not documented here (see the TriBITS *.cmake source files).
TriBITS defines a number of global project-level settings that can be set by the user and can have their default determined by each individual TriBITS project. If a given TriBITS project does not define its own default, a reasonable default is set by the TriBITS system automatically. These options are defined and are set, for the most part, in the internal TriBITS function tribits_define_global_options_and_define_extra_repos() in the TriBITS CMake code file TribitsGlobalMacros.cmake which gets called inside of the tribits_project() macro. That function and that file are the definitive source for the options that a TriBITS project takes and for their default values, but we strive to document them here as well. Many of these global options (i.e. cache variables) such as ${PROJECT_NAME}_<SOME_OPTION> allow the project to define a default by setting a local variable ${PROJECT_NAME}_<SOME_OPTION>_DEFAULT as:
set(${PROJECT_NAME}_<SOME_OPTION>_DEFAULT <someDefault>)
either in its top-level CMakeLists.txt file or in its ProjectName.cmake file (depends on what variable it is as to where it should be set). If ${PROJECT_NAME}_<SOME_OPTION>_DEFAULT is not set by the project, then TriBITS provides a reasonable default value. The TriBITS code that uses these defaults for this looks like:
  if ("${${PROJECT_NAME}_<SOME_OPTION>_DEFAULT}" STREQUAL "")
    set(${PROJECT_NAME}_<SOME_OPTION>_DEFAULT <someDefault>)
  endif()

  advanced_set( ${PROJECT_NAME}_<SOME_OPTION>
    ${${PROJECT_NAME}_<SOME_OPTION>_DEFAULT}
    CACHE BOOL "[documentation]."
    )
where <SOME_OPTION> is an option name, for example like TEST_CATEGORIES, and <someDefault> is the default set by TriBITS if the project does not define a default. In this way, if the project sets the variable ${PROJECT_NAME}_<SOME_OPTION>_DEFAULT before this code executes, then ${${PROJECT_NAME}_<SOME_OPTION>_DEFAULT} will be used as the default for the cache variable ${PROJECT_NAME}_<SOME_OPTION> which, of course, can be overridden by the user when calling cmake in a number of ways.
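For example, a hypothetical project MyProject (the project name and the particular defaults chosen here are illustrative) might set several of these defaults in its <projectDir>/ProjectName.cmake file as:

  # <projectDir>/ProjectName.cmake (sketch)
  set(PROJECT_NAME MyProject)

  # Project-specific defaults for a few of the options documented below
  set(${PROJECT_NAME}_TEST_CATEGORIES_DEFAULT BASIC)
  set(${PROJECT_NAME}_ENABLE_Fortran_DEFAULT OFF)
  set(${PROJECT_NAME}_ENABLE_SECONDARY_TESTED_CODE_DEFAULT ON)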
Most of these global options that can be overridden externally by setting the cache variable ${PROJECT_NAME}_<SOME_OPTION> should be documented in the Project-Specific Build Reference document. A generic version of this document is found in TribitsBuildReference. Some of the more unusual options that might only be of interest to developers mentioned below may not be documented in TribitsBuildReference.
The global project-level TriBITS options for which defaults can be provided by a given TriBITS project are:
These options are described below.
${PROJECT_NAME}_ASSERT_CORRECT_TRIBITS_USAGE
The CMake cache variable ${PROJECT_NAME}_ASSERT_CORRECT_TRIBITS_USAGE is used to define how some invalid TriBITS usage checks are handled. The valid values include 'FATAL_ERROR', 'SEND_ERROR', 'WARNING', and 'IGNORE'. The default value is 'FATAL_ERROR' for a project when ${PROJECT_NAME}_ENABLE_DEVELOPMENT_MODE=ON, which is best for development mode for a project that currently has no invalid usage patterns. The default is 'IGNORE' when ${PROJECT_NAME}_ENABLE_DEVELOPMENT_MODE=OFF. But a project with some existing invalid usage patterns might want to set, for example, a default of 'WARNING' in order to allow for a smooth upgrade of TriBITS. To do so, set:
  set(${PROJECT_NAME}_ASSERT_CORRECT_TRIBITS_USAGE_DEFAULT WARNING)

in the project's base <projectDir>/ProjectName.cmake file.
${PROJECT_NAME}_ASSERT_DEFINED_DEPENDENCIES
To give ${PROJECT_NAME}_ASSERT_DEFINED_DEPENDENCIES a different default, set:
  set(${PROJECT_NAME}_ASSERT_DEFINED_DEPENDENCIES_DEFAULT <newDefault>)

in the project's base <projectDir>/ProjectName.cmake file, where <newDefault> can be FATAL_ERROR, SEND_ERROR, WARNING, NOTICE, or IGNORE.
Otherwise, the default is WARNING when ${PROJECT_NAME}_ENABLE_DEVELOPMENT_MODE is ON and IGNORE when ${PROJECT_NAME}_ENABLE_DEVELOPMENT_MODE is OFF.
${PROJECT_NAME}_C_Standard
The variable ${PROJECT_NAME}_C_Standard is used to define the C standard passed to the compiler with --std=<cstd> for GCC builds of the project. TriBITS sets the default as c99, but the project can set a new default in the project's base <projectDir>/CMakeLists.txt file with, for example:
set(${PROJECT_NAME}_C_Standard_DEFAULT c11)
${PROJECT_NAME}_CHECK_FOR_UNPARSED_ARGUMENTS
The variable ${PROJECT_NAME}_CHECK_FOR_UNPARSED_ARGUMENTS determines how unparsed and otherwise ignored arguments are handled in TriBITS functions that are called by client TriBITS projects. These are arguments that are left over from parsing input options to functions and macros that take both positional arguments and keyword arguments/options handled with the cmake_parse_arguments() function. For example, for a TriBITS function declared like:
  tribits_copy_files_to_binary_dir( <targetName>
    [SOURCE_FILES <file1> <file2> ...]
    [SOURCE_DIR <sourceDir>]
    ...
    )

the arguments SOURCE_FILES <file1> <file2> ... and those that follow are parsed by the cmake_parse_arguments() function while the argument <targetName> is a positional argument. The problem is that any arguments passed between the first <targetName> argument and the specified keyword arguments like SOURCE_FILES and SOURCE_DIR are returned as unparsed arguments and are basically ignored (which is what happened in earlier versions of TriBITS). For example, calling the function as:
  tribits_copy_files_to_binary_dir( FooTestCopyFiles
    ThisArgumentIsNotParsedAndIsIgnored
    SOURCE_FILES file1.cpp file2.cpp ...
    ...
    )

would result in the unparsed argument ThisArgumentIsNotParsedAndIsIgnored.
The value of ${PROJECT_NAME}_CHECK_FOR_UNPARSED_ARGUMENTS determines how that ignored argument is handled. If the value is WARNING, then it will just result in a message(WARNING ...) command that states the warning but configure is allowed to be completed. This would be the right value to allow an old TriBITS project to keep configuring until the warnings can be cleaned up. If the value is SEND_ERROR, then message(SEND_ERROR ...) is called. This will result in the configure failing but will allow configure to continue until the end (or a FATAL_ERROR is raised). This would be the right value when trying to upgrade a TriBITS project where you wanted to see all of the warnings when upgrading TriBITS (so you could fix them all in one shot). Finally, the value of FATAL_ERROR will result in message(FATAL_ERROR ...) being called which will halt configure right away. This is the best value when developing on a TriBITS project that is already clean but you want to catch new developer-inserted errors right away.
The default value for ${PROJECT_NAME}_CHECK_FOR_UNPARSED_ARGUMENTS is WARNING, so that it will be backward compatible for TriBITS projects that might have previously undetected unparsed and therefore ignored arguments. However, a project can change the default by setting, for example:
  set(${PROJECT_NAME}_CHECK_FOR_UNPARSED_ARGUMENTS_DEFAULT FATAL_ERROR)

in the <projectDir>/ProjectName.cmake file.
The user of a TriBITS project should not be able to trigger this unparsed arguments condition so this variable is not documented in the TriBITS Build Reference. But it is still a CMake cache var that is documented in the CMakeCache.txt file and can be set by the user or developer if desired.
${PROJECT_NAME}_CONFIGURE_OPTIONS_FILE_APPEND
The variable ${PROJECT_NAME}_CONFIGURE_OPTIONS_FILE_APPEND is used to define the absolute path to a file (or a list of files) that should be included after the files listed in ${PROJECT_NAME}_CONFIGURE_OPTIONS_FILE. This variable can be used by the TriBITS project to define, for example, a standard set of development environments in the base <projectDir>/CMakeLists.txt file with:
  set(${PROJECT_NAME}_CONFIGURE_OPTIONS_FILE_APPEND_DEFAULT
    "${CMAKE_CURRENT_LIST_DIR}/cmake/StdDevEnvs.cmake")

before the tribits_project() command. By including this file (or files) after the file(s) listed in ${PROJECT_NAME}_CONFIGURE_OPTIONS_FILE, the user can override the variables set in this appended file(s). But it is important that these variables be set after the user's options have been set but before the package and TPL dependency analysis is done (because this might enable or disable some TPLs).
${PROJECT_NAME}_CPACK_SOURCE_GENERATOR
The variable ${PROJECT_NAME}_CPACK_SOURCE_GENERATOR determines the CPack source generation types that are created when the package_source target is run. The TriBITS default is set to TGZ. However, this default can be overridden by setting, for example:
  set(${PROJECT_NAME}_CPACK_SOURCE_GENERATOR_DEFAULT "TGZ;TBZ2")

This variable should generally be set in the file:
  <projectDir>/cmake/CallbackDefineProjectPackaging.cmake

instead of in the base-level CMakeLists.txt file so that it goes along with the rest of the project-specific CPack packaging options.
${PROJECT_NAME}_CTEST_DO_ALL_AT_ONCE
The variable ${PROJECT_NAME}_CTEST_DO_ALL_AT_ONCE determines if the CTest driver scripts using tribits_ctest_driver() configure, build, test, and submit results to CDash all-at-once for all of the packages being tested, or if this is instead done package-by-package. Currently, the default is set to FALSE for the package-by-package mode (for historical reasons) but the default can be set to TRUE by setting:
  set(${PROJECT_NAME}_CTEST_DO_ALL_AT_ONCE_DEFAULT "TRUE")

in the project's <projectDir>/ProjectName.cmake file. (This default must be changed in the <projectDir>/ProjectName.cmake file and NOT the <projectDir>/CMakeLists.txt file because the latter is not directly processed in CTest -S driver scripts using tribits_ctest_driver().)
In general, a project should change the default to TRUE when using a newer CDash installation with CDash versions 3.0+ that can accommodate the results coming from ctest -S and display them package-by-package very nicely. Otherwise, most projects are better off with package-by-package mode since it results in nicer display on CDash for older CDash versions.
${PROJECT_NAME}_DISABLE_ENABLED_FORWARD_DEP_PACKAGES
If ${PROJECT_NAME}_DISABLE_ENABLED_FORWARD_DEP_PACKAGES=ON (the TriBITS default value), then any explicitly enabled packages that have disabled upstream required packages or TPLs will be disabled. If ${PROJECT_NAME}_DISABLE_ENABLED_FORWARD_DEP_PACKAGES=OFF, then a configure error will occur. For more details, also see TribitsBuildReference and Disables trump enables where there is a conflict. A project can define a different default value by setting:
set(${PROJECT_NAME}_DISABLE_ENABLED_FORWARD_DEP_PACKAGES_DEFAULT FALSE)
${PROJECT_NAME}_ELEVATE_ST_TO_PT
If ${PROJECT_NAME}_ELEVATE_ST_TO_PT is set to ON, then all ST packages will be elevated to PT packages. The TriBITS default is obviously OFF. The default can be changed by setting:
  set(${PROJECT_NAME}_ELEVATE_ST_TO_PT_DEFAULT ON)

There are projects, especially meta-projects, where the distinction between PT and ST code is not helpful or where the assignment of PT and ST packages in a repository is not appropriate with respect to the outer meta-project. An example of such a project is CASL VERA. Changing the default to ON allows any and all packages to be considered in pre-push testing.
${PROJECT_NAME}_ENABLE_CPACK_PACKAGING
If ${PROJECT_NAME}_ENABLE_CPACK_PACKAGING is ON, then CPack support is enabled and some TriBITS code is run that is needed to set up data-structures that are used by the built-in CMake target package_source. The TriBITS default is OFF with the idea that the average developer or user will not be wanting to create source distributions with CPack. However, this default can be changed by setting:
set(${PROJECT_NAME}_ENABLE_CPACK_PACKAGING_DEFAULT ON)
${PROJECT_NAME}_ENABLE_CXX
If ${PROJECT_NAME}_ENABLE_CXX is ON, then C++ language support for the project will be enabled and the C++ compiler must be found. By default, TriBITS sets this to ON for all systems. A project that never requires C++ can set this to OFF by default by setting:
set(${PROJECT_NAME}_ENABLE_CXX_DEFAULT FALSE)
${PROJECT_NAME}_ENABLE_C
If ${PROJECT_NAME}_ENABLE_C is ON, then C language support for the project will be enabled and the C compiler must be found. By default, TriBITS sets this to ON for all systems. A project that never requires C can set this to OFF by default by setting:
  set(${PROJECT_NAME}_ENABLE_C_DEFAULT FALSE)

If a project does not have any native C code, a good default would be:
  set(${PROJECT_NAME}_ENABLE_C_DEFAULT FALSE)

NOTE: It is usually not a good idea to always force off C, or any compiler, because extra repositories and packages might be added by someone that might require the compiler, and we don't want to unnecessarily limit the generality of a given TriBITS build. Setting the default for all platforms should be sufficient.
${PROJECT_NAME}_ENABLE_DEVELOPMENT_MODE
The variable ${PROJECT_NAME}_ENABLE_DEVELOPMENT_MODE switches the TriBITS project from development mode to release mode. The default for this variable ${PROJECT_NAME}_ENABLE_DEVELOPMENT_MODE_DEFAULT should be set in the project's <projectDir>/Version.cmake file and switched from ON to OFF when creating a release (see Project and Repository Versioning and Release Mode). When ${PROJECT_NAME}_ENABLE_DEVELOPMENT_MODE is ON, several other variables are given defaults appropriate for development mode. For example, ${PROJECT_NAME}_ASSERT_DEFINED_DEPENDENCIES is set to FATAL_ERROR by default in development mode but is set to IGNORE by default in release mode. In addition, strong compiler warnings are enabled by default in development mode but are disabled by default in release mode. This variable also affects the behavior of tribits_set_st_for_dev_mode().
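A minimal sketch of how this default is typically laid out in a project's <projectDir>/Version.cmake file (the version values shown are illustrative) is:

  # <projectDir>/Version.cmake (sketch)
  set(${PROJECT_NAME}_VERSION 1.1)
  set(${PROJECT_NAME}_VERSION_STRING "1.1 (Dev)")
  # Change 'ON' to 'OFF' when creating a release
  set(${PROJECT_NAME}_ENABLE_DEVELOPMENT_MODE_DEFAULT ON)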
${PROJECT_NAME}_ENABLE_Fortran
If ${PROJECT_NAME}_ENABLE_Fortran is ON, then Fortran support for the project will be enabled and the Fortran compiler(s) must be found. By default, TriBITS sets this to ON.
If a project does not have any native Fortran code a good default would be:
  set(${PROJECT_NAME}_ENABLE_Fortran_DEFAULT OFF)

This default can be set in <projectDir>/ProjectName.cmake or <projectDir>/CMakeLists.txt.
On WIN32 systems, the default for ${PROJECT_NAME}_ENABLE_Fortran_DEFAULT is set to OFF since it can be difficult to get a Fortran compiler for native Windows.
Given that a native Fortran compiler is not supported by default on Windows and on most Mac OSX systems, projects that have optional Fortran code may decide to set the default depending on the platform by setting, for example:
  if ( (WIN32 AND NOT CYGWIN) OR
      (CMAKE_HOST_SYSTEM_NAME STREQUAL "Darwin")
    )
    message(STATUS "Warning: Setting ${PROJECT_NAME}_ENABLE_Fortran=OFF by default"
      " because this is Windows (not cygwin) and we assume to not have Fortran!")
    set(${PROJECT_NAME}_ENABLE_Fortran_DEFAULT OFF)
  else()
    set(${PROJECT_NAME}_ENABLE_Fortran_DEFAULT ON)
  endif()

NOTE: It is usually not a good idea to always force off Fortran, or any compiler, because extra repositories and packages might be added by someone that might require the compiler, and we don't want to unnecessarily limit the generality of a given TriBITS build. Setting the default for all platforms should be sufficient.
${PROJECT_NAME}_ENABLE_INSTALL_CMAKE_CONFIG_FILES
If ${PROJECT_NAME}_ENABLE_INSTALL_CMAKE_CONFIG_FILES is set to ON, then <PackageName>Config.cmake files are created at configure time in the build tree and installed into the install tree. These files are used by external CMake projects to pull in the list of compilers, compiler options, include directories and libraries. The TriBITS default is OFF but a project can change the default by setting, for example:
  set(${PROJECT_NAME}_ENABLE_INSTALL_CMAKE_CONFIG_FILES_DEFAULT ON)

A project would want to leave off the creation and installation of <PackageName>Config.cmake files if it was only installing and providing executables (see ${PROJECT_NAME}_INSTALL_LIBRARIES_AND_HEADERS). However, if it is wanting to provide libraries for other projects to use, then it should turn on the default generation of these files.
${PROJECT_NAME}_ENABLE_SECONDARY_TESTED_CODE
If ${PROJECT_NAME}_ENABLE_SECONDARY_TESTED_CODE is ON, then packages and subpackages marked as ST in the <repoDir>/PackagesList.cmake file will be implicitly enabled along with the PT packages. Additional code and tests may also be enabled using this option. The TriBITS default is OFF but this can be changed by setting:
  set(${PROJECT_NAME}_ENABLE_SECONDARY_TESTED_CODE_DEFAULT ON)

in the <projectDir>/ProjectName.cmake file.
${PROJECT_NAME}_EXCLUDE_DISABLED_SUBPACKAGES_FROM_DISTRIBUTION
If ${PROJECT_NAME}_EXCLUDE_DISABLED_SUBPACKAGES_FROM_DISTRIBUTION is TRUE, then the directories for subpackages that are not enabled are left out of the source tarball. This reduces the size of the tarball as much as possible but does require that the TriBITS packages and subpackages be properly set up to allow disabled subpackages to be excluded. The TriBITS default is TRUE but this can be changed by setting:
set(${PROJECT_NAME}_EXCLUDE_DISABLED_SUBPACKAGES_FROM_DISTRIBUTION_DEFAULT FALSE)
${PROJECT_NAME}_GENERATE_EXPORT_FILE_DEPENDENCIES
If ${PROJECT_NAME}_GENERATE_EXPORT_FILE_DEPENDENCIES is ON, then the data-structures needed to generate <PackageName>Config.cmake files are created. These data structures are also needed in order to generate export makefiles on demand using the function tribits_write_flexible_package_client_export_files(). The default in TriBITS is to turn this ON automatically if ${PROJECT_NAME}_ENABLE_INSTALL_CMAKE_CONFIG_FILES is ON; otherwise, by default, TriBITS sets this to OFF. The only reason for the project to override the default and set it to ON, as with:
  set(${PROJECT_NAME}_GENERATE_EXPORT_FILE_DEPENDENCIES_DEFAULT ON)

is so that the necessary data-structures are generated in order to use the function tribits_write_flexible_package_client_export_files().
${PROJECT_NAME}_GENERATE_VERSION_DATE_FILES
If ${PROJECT_NAME}_GENERATE_VERSION_DATE_FILES is ON, then the files VersionDate.cmake and <RepoName>_version_date.h will get generated and the generated file <RepoName>_version_date.h will get installed for each TriBITS version-controlled repository when the local directories are git repositories. The default is OFF but the project can change that by setting:
  set(${PROJECT_NAME}_GENERATE_VERSION_DATE_FILES_DEFAULT ON)

in the <projectDir>/ProjectName.cmake file.
${PROJECT_NAME}_GENERATE_REPO_VERSION_FILE
If ${PROJECT_NAME}_GENERATE_REPO_VERSION_FILE is ON, then the file <Project>RepoVersion.txt will get generated as a byproduct of configuring with CMake. See Multi-Repository Support and <Project>_GENERATE_REPO_VERSION_FILE. The default is OFF but the project can change that by setting:
  set(${PROJECT_NAME}_GENERATE_REPO_VERSION_FILE_DEFAULT ON)

in the <projectDir>/ProjectName.cmake file.
Note that if a git executable cannot be found at configure time, then the default ${PROJECT_NAME}_GENERATE_REPO_VERSION_FILE_DEFAULT will be overridden to OFF. But if the user sets ${PROJECT_NAME}_GENERATE_REPO_VERSION_FILE=ON in the cache and git can't be found, then a configure-time error will occur.
${PROJECT_NAME}_IMPORTED_NO_SYSTEM
By default, include directories from IMPORTED library targets from the TriBITS project's installed <Package>Config.cmake files will be considered SYSTEM headers and therefore be included on the compile lines of downstream CMake projects with -isystem with most compilers. However, if ${PROJECT_NAME}_IMPORTED_NO_SYSTEM is set to ON (only supported for CMake versions 3.23 or greater), then all of the IMPORTED library targets exported into the set of installed <Package>Config.cmake files will have the IMPORTED_NO_SYSTEM property set. This will cause downstream customer CMake projects to apply the include directories from these IMPORTED library targets as non-system include directories. On most compilers, that means that the include directories will be listed on the compile lines with -I instead of with -isystem. (See more details in the TriBITS Build Reference for <Project>_IMPORTED_NO_SYSTEM.)
The default value set by TriBITS itself is OFF but a TriBITS project can change the default value to ON by adding:
  if (CMAKE_VERSION VERSION_GREATER_EQUAL 3.23)
    set(${PROJECT_NAME}_IMPORTED_NO_SYSTEM_DEFAULT ON)
  endif()

in the <projectDir>/ProjectName.cmake file. (NOTE: The above if() statement ensures that a configure error will not occur if a version of CMake less than 3.23 is used. But if the TriBITS project minimum CMake version is 3.23 or greater, then the above if() statement guard can be removed.)
${PROJECT_NAME}_INSTALL_LIBRARIES_AND_HEADERS
If ${PROJECT_NAME}_INSTALL_LIBRARIES_AND_HEADERS is set to ON, then any defined libraries or header files that are listed in calls to tribits_add_library() or tribits_install_headers() will be installed (unless options are passed into tribits_add_library() that disable installs). If set to OFF, then headers and libraries will not be installed by default and only INSTALLABLE executables added with tribits_add_executable() will be installed. However, as described in TribitsBuildReference, shared libraries will always be installed if enabled since they are needed by the installed executables.
For a TriBITS project that is primarily delivering libraries (e.g. Trilinos), it makes sense to leave the TriBITS default, which is ON, or explicitly set:
  set(${PROJECT_NAME}_INSTALL_LIBRARIES_AND_HEADERS_DEFAULT ON)

For a TriBITS project that is primarily delivering executables (e.g. VERA), it makes sense to set the default to:
set(${PROJECT_NAME}_INSTALL_LIBRARIES_AND_HEADERS_DEFAULT OFF)
${PROJECT_NAME}_MAKE_INSTALL_GROUP_READABLE ${PROJECT_NAME}_MAKE_INSTALL_GROUP_WRITABLE ${PROJECT_NAME}_MAKE_INSTALL_WORLD_READABLE
Determines the permissions for directories and files created during the execution of the install and install_package_by_package targets.
To make the created directories and files group readable by default for the project, set:
  set(${PROJECT_NAME}_MAKE_INSTALL_GROUP_READABLE_DEFAULT TRUE)

To make the created directories and files group writable (and readable) by default for the project, set:
  set(${PROJECT_NAME}_MAKE_INSTALL_GROUP_WRITABLE_DEFAULT TRUE)

To make the created directories and files world readable by default for the project, set:
  set(${PROJECT_NAME}_MAKE_INSTALL_WORLD_READABLE_DEFAULT TRUE)

On non-Windows systems, these set permissions for all files and directories from the user-set base directory ${PROJECT_NAME}_SET_GROUP_AND_PERMISSIONS_ON_INSTALL_BASE_DIR on down. For more details see Installation considerations.
These defaults can be set in the <projectDir>/ProjectName.cmake file.
${PROJECT_NAME}_MUST_FIND_ALL_TPL_LIBS
Determines if all of the libraries listed in <tplName>_LIBRARY_NAMES must be found for each enabled TPL. By default, this is FALSE, which means that whether all of the listed libs for a TPL must be found is determined by the MUST_FIND_ALL_LIBS option to the tribits_tpl_find_include_dirs_and_libraries() function in the TPL find module. To change the default for this, set:
  set(${PROJECT_NAME}_MUST_FIND_ALL_TPL_LIBS_DEFAULT TRUE)

in the <projectDir>/ProjectName.cmake file.
${PROJECT_NAME}_Python3_FIND_VERSION
Determines the version of Python that is looked for. TriBITS requires at least version "3.6". A particular TriBITS project can require a higher version of Python, and this is set using, for example:
  set(${PROJECT_NAME}_Python3_FIND_VERSION_DEFAULT "3.8")

in the <projectDir>/ProjectName.cmake file (see Python Support). The user can force a more recent version of Python by configuring with, for example:
-D <Project>_Python3_FIND_VERSION="3.8"
${PROJECT_NAME}_REQUIRES_PYTHON
If the TriBITS project requires Python, set:
  set(${PROJECT_NAME}_REQUIRES_PYTHON TRUE)

in the <projectDir>/ProjectName.cmake file (see Python Support). The default is implicitly FALSE.
${PROJECT_NAME}_SET_INSTALL_RPATH
The cache variable ${PROJECT_NAME}_SET_INSTALL_RPATH is used to define the default RPATH mode for the TriBITS project (see Setting install RPATH for details). The TriBITS default is to set this to TRUE, but the TriBITS project can set the default to FALSE by setting:
  set(${PROJECT_NAME}_SET_INSTALL_RPATH_DEFAULT FALSE)

in the project's <projectDir>/ProjectName.cmake file (see RPATH Handling).
${PROJECT_NAME}_SHOW_GIT_COMMIT_PARENTS
The cache variable ${PROJECT_NAME}_SHOW_GIT_COMMIT_PARENTS results in the repo version file showing the parent commits for each repo commit. By default, this variable is set to OFF, but projects can set it to ON by default by setting:
  set(${PROJECT_NAME}_SHOW_GIT_COMMIT_PARENTS_DEFAULT ON)

in the project's ProjectName.cmake file. (That way, it will also impact cmake -P scripts that don't configure the project itself to be built.)
${PROJECT_NAME}_SHOW_TEST_START_END_DATE_TIME
The cache variable ${PROJECT_NAME}_SHOW_TEST_START_END_DATE_TIME determines if the start and end date/time for each advanced test (i.e. added with tribits_add_advanced_test()) is printed or not with each test. If set to TRUE, this also causes the timing for each TEST_<IDX> block to be printed as well. The TriBITS default is OFF but a TriBITS project can change this default by setting:
  set(${PROJECT_NAME}_SHOW_TEST_START_END_DATE_TIME_DEFAULT ON)

The implementation of this feature currently uses execute_process(date) and therefore will work on many (but perhaps not all) Linux/Unix/Mac systems and not on Windows systems.
NOTE: In a future version of CTest, this option may turn on start and end date/time for regular tests added with tribits_add_test() (which uses a raw command with add_test()).
${PROJECT_NAME}_SKIP_INSTALL_PROJECT_CMAKE_CONFIG_FILES
To change the default value of the ${PROJECT_NAME}_SKIP_INSTALL_PROJECT_CMAKE_CONFIG_FILES to TRUE, for example, for a TriBITS project, set:
  set(${PROJECT_NAME}_SKIP_INSTALL_PROJECT_CMAKE_CONFIG_FILES_DEFAULT TRUE)

in the project's <projectDir>/CMakeLists.txt or <projectDir>/ProjectName.cmake files.
${PROJECT_NAME}_SKIP_EXTRAREPOS_FILE
The cache variable ${PROJECT_NAME}_SKIP_EXTRAREPOS_FILE is set in the <projectDir>/ProjectName.cmake file as:
  set(${PROJECT_NAME}_SKIP_EXTRAREPOS_FILE TRUE)

for projects that don't have a <projectDir>/cmake/ExtraRepositoriesList.cmake file. This variable needs to be set when using the CTest driver script and does not need to be set for the basic configure and build process.
${PROJECT_NAME}_TEST_CATEGORIES
The cache variable ${PROJECT_NAME}_TEST_CATEGORIES determines what tests defined using tribits_add_test() and tribits_add_advanced_test() will be added for ctest to run (see Test Test Category). The TriBITS default is NIGHTLY for a standard local build. The checkin-test.py tool sets this to BASIC by default. A TriBITS project can override the default for a basic configure using, for example:
  set(${PROJECT_NAME}_TEST_CATEGORIES_DEFAULT BASIC)

The justification for having the default Test Test Category be NIGHTLY instead of BASIC is that when someone is enabling a package to develop on it or install it, we want them by default to be seeing the full version of the test suite (shy of the Test Test Category HEAVY tests which can be very expensive) for the packages they are explicitly enabling. Typically they will not be enabling forward/downstream dependent packages, so the cost of running the test suite should not be too prohibitive. This all depends on how good of a job the development teams do in making their test suites run fast and keeping the cost of running the tests down. See the section TriBITS Automated Testing for a more detailed discussion.
${PROJECT_NAME}_TPL_SYSTEM_INCLUDE_DIRS
If ${PROJECT_NAME}_TPL_SYSTEM_INCLUDE_DIRS is set to TRUE, then the SYSTEM flag will be passed into the include_directories() command for TPL include directories for every TPL for every package, by default. On some systems this will result in include directories being passed to the compiler with -isystem instead of -I. This helps to avoid compiler warnings coming from TPL header files for C and C++. However, with CMake version 3.2 and earlier, this also results in -isystem being passed to the Fortran compiler (e.g. gfortran) as well. This breaks the reading of Fortran module files (perhaps a bug in gfortran). Because of this issue with Fortran, the TriBITS default for this option is FALSE, but a project can override the default using:
  set(${PROJECT_NAME}_TPL_SYSTEM_INCLUDE_DIRS_DEFAULT TRUE)

(This would be a good default if the project has no Fortran files, or has no Fortran files that use modules provided by TPLs.)
However, if a package or subpackage sets:
  set(${PACKAGE_NAME}_SKIP_TPL_SYSTEM_INCLUDE_DIRS TRUE)

in its CMakeLists.txt files before the tribits_add_library() or tribits_add_executable() commands are called in that package, then SYSTEM will not be passed into include_directories() for TPL include dirs. This is how some TriBITS packages with Fortran files that use Fortran modules avoid passing in -isystem to the Fortran compiles and thereby avoid the defect with gfortran described above. If CMake version 3.3 or greater is used, this variable is not required.
NOTE: Currently, a TriBITS package must have a direct dependency on a TPL to have -isystem added to a TPL's include directories on the compile lines for that package. That is, the TPL must be listed in the LIB_REQUIRED_TPLS or LIB_OPTIONAL_TPLS arguments passed into the tribits_package_define_dependencies() function in the package's <packageDir>/cmake/Dependencies.cmake file. In addition, to have -isystem added to the include directories for a TPL when compiling the tests for a package, it must be listed in the TEST_REQUIRED_TPLS or TEST_OPTIONAL_TPLS arguments. This is a limitation of the TriBITS implementation that will be removed in a future version of TriBITS.
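For example, a hypothetical package that wants -isystem applied to its TPL include directories both for its library sources and for its test sources would list the TPLs in its <packageDir>/cmake/Dependencies.cmake file along these lines (the TPL names shown are illustrative):

  # <packageDir>/cmake/Dependencies.cmake (sketch)
  tribits_package_define_dependencies(
    LIB_REQUIRED_TPLS   BLAS LAPACK  # -isystem applied when compiling the package's libraries
    TEST_OPTIONAL_TPLS  Boost        # -isystem applied when compiling the package's tests
    )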
${PROJECT_NAME}_TRACE_ADD_TEST
If ${PROJECT_NAME}_TRACE_ADD_TEST is set to TRUE, then a single line will be printed for each call to tribits_add_test() and tribits_add_advanced_test() stating whether the test was added or not, and if not, why. The default is set based on the value of ${PROJECT_NAME}_VERBOSE_CONFIGURE but a project can override the default by setting:
set(${PROJECT_NAME}_TRACE_ADD_TEST_DEFAULT TRUE)
${PROJECT_NAME}_USE_GNUINSTALLDIRS
If ${PROJECT_NAME}_USE_GNUINSTALLDIRS is set to TRUE, then the default install paths will be determined by the standard CMake module GNUInstallDirs. Otherwise, platform independent install paths are used by default.
A project can use the paths given by the CMake module GNUInstallDirs by default by setting:
  set(${PROJECT_NAME}_USE_GNUINSTALLDIRS_DEFAULT TRUE)

in the project's top-level <projectDir>/CMakeLists.txt file or its <projectDir>/ProjectName.cmake file. The default is FALSE.
${PROJECT_NAME}_USES_PYTHON
If the TriBITS project can use Python, but does not require it, set:
  set(${PROJECT_NAME}_USES_PYTHON TRUE)

in the <projectDir>/ProjectName.cmake file (see Python Support). The default for a TriBITS project is implicitly TRUE. To explicitly state that Python is never needed, set:
set(${PROJECT_NAME}_USES_PYTHON FALSE)
DART_TESTING_TIMEOUT
The cache variable DART_TESTING_TIMEOUT is a built-in CMake variable that provides a default timeout for all tests (see Setting test timeouts at configure time). By default, TriBITS defines this to be 1500 seconds (which is also the raw CMake default) but the project can change this default, from 1500 to 300 for example, by setting the following in the project's <projectDir>/ProjectName.cmake or <projectDir>/CMakeLists.txt file:
set(DART_TESTING_TIMEOUT_DEFAULT 300)
CMAKE_INSTALL_RPATH_USE_LINK_PATH
The cache variable CMAKE_INSTALL_RPATH_USE_LINK_PATH is a built-in CMake variable that determines if the paths for external libraries (i.e. from TPLs) are put into the installed library RPATHs (see RPATH Handling). TriBITS sets the default for this to TRUE but a project can change the default back to FALSE by setting the following in the project's <projectDir>/ProjectName.cmake file:
set(CMAKE_INSTALL_RPATH_USE_LINK_PATH_DEFAULT FALSE)
MPI_EXEC_MAX_NUMPROCS
The variable MPI_EXEC_MAX_NUMPROCS gives the maximum number of processes for an MPI test that will be allowed as defined by tribits_add_test() and tribits_add_advanced_test(). The TriBITS default is set to be 4 (for no good reason really but it needs to stay that way for backward compatibility). This default can be changed by setting:
  set(MPI_EXEC_MAX_NUMPROCS_DEFAULT <newDefaultMax>)

While this default can be changed for the project as a whole on all platforms, it is likely better to change this default on a machine-by-machine basis to correspond to the load that can be accommodated by a given machine (or class of machines). For example, if a given machine has 64 cores, a reasonable number for MPI_EXEC_MAX_NUMPROCS_DEFAULT is 64.
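One way a project might pick a machine-appropriate default is to base it on the host's physical core count; the following is only a sketch (the use of cmake_host_system_information() for this purpose and the variable name numCores are illustrative, not a TriBITS convention):

  # Sketch: size MPI_EXEC_MAX_NUMPROCS_DEFAULT from the host's core count
  cmake_host_system_information(RESULT numCores QUERY NUMBER_OF_PHYSICAL_CORES)
  if (numCores GREATER 0)
    set(MPI_EXEC_MAX_NUMPROCS_DEFAULT ${numCores})
  endif()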
TRIBITS_HANDLE_TRIBITS_DEPRECATED_CODE
Determines how the function tribits_deprecated() behaves. To change the default behavior, such as to call message(FATAL_ERROR ...), set:
  set(TRIBITS_HANDLE_TRIBITS_DEPRECATED_CODE_DEFAULT FATAL_ERROR)

in the project's <projectDir>/ProjectName.cmake file, or <projectDir>/CMakeLists.txt file, or on an individual package basis in its <packageDir>/CMakeLists.txt file.
The following subsections give detailed documentation for the CMake macros and functions that make up the core TriBITS system. These are what are used by TriBITS project developers in their CMakeLists.txt and other files. All of these functions and macros should be automatically available when processing the project's and packages' CMakeLists.txt and related files if used properly. Therefore, no explicit include() statements should be needed other than the initial include of the TriBITS.cmake file in the top-level <projectDir>/CMakeLists.txt file so that the command tribits_project() can be executed.
Function that creates an advanced test defined by stringing together one or more executable and/or command invocations that is run as a cmake -P script with very flexible pass/fail criteria.
Usage:
  tribits_add_advanced_test( <testNameBase>
    TEST_0 (EXEC <execTarget0> | CMND <cmndExec0>) ...
    [TEST_1 (EXEC <execTarget1> | CMND <cmndExec1>) ...]
    ...
    [TEST_N (EXEC <execTargetN> | CMND <cmndExecN>) ...]
    [OVERALL_WORKING_DIRECTORY (<overallWorkingDir> | TEST_NAME)]
    [SKIP_CLEAN_OVERALL_WORKING_DIRECTORY]
    [FAIL_FAST]
    [RUN_SERIAL]
    [KEYWORDS <keyword1> <keyword2> ...]
    [COMM [serial] [mpi]]
    [OVERALL_NUM_MPI_PROCS <overallNumProcs>]
    [OVERALL_NUM_TOTAL_CORES_USED <overallNumTotalCoresUsed>]
    [CATEGORIES <category0> <category1> ...]
    [HOST <host0> <host1> ...]
    [XHOST <host0> <host1> ...]
    [HOSTTYPE <hosttype0> <hosttype1> ...]
    [XHOSTTYPE <hosttype0> <hosttype1> ...]
    [EXCLUDE_IF_NOT_TRUE <varname0> <varname1> ...]
    [DISABLED <messageWhyDisabled>]
    [FINAL_PASS_REGULAR_EXPRESSION "<regex>" |
      FINAL_FAIL_REGULAR_EXPRESSION "<regex>"]
    [ENVIRONMENT <var1>=<value1> <var2>=<value2> ...]
    [TIMEOUT <maxSeconds>]
    [LIST_SEPARATOR <sep>]
    [ADDED_TEST_NAME_OUT <testName>]
    )
This function allows one to add a single CTest test that is actually a sequence of one or more separate commands strung together in some way to define the final pass/fail. One will want to use this function to add a test instead of tribits_add_test() when one needs to run more than one command, or one needs more sophisticated checking of the test result other than just grepping STDOUT (e.g. by running separate post-processing programs to examine output files).
For more details on these arguments, see TEST_<idx> EXEC/CMND Test Blocks and Arguments (tribits_add_advanced_test()).
The most common type of an atomic test block TEST_<idx> runs a command as either a package-built executable or just any command. An atomic test command block TEST_<idx> (i.e. TEST_0, TEST_1, ...) takes the form:
  TEST_<idx>
    (EXEC <exeRootName> [NOEXEPREFIX] [NOEXESUFFIX] [ADD_DIR_TO_NAME]
        [DIRECTORY <dir>]
      | CMND <cmndExec>)
    [ARGS "<arg0>" "<arg1>" ... "<argn>"]
    [MESSAGE "<message>"]
    [WORKING_DIRECTORY <workingDir>]
    [SKIP_CLEAN_WORKING_DIRECTORY]
    [NUM_MPI_PROCS <numProcs>]
    [NUM_TOTAL_CORES_USED <numTotalCoresUsed>]
    [OUTPUT_FILE <outputFile>]
    [NO_ECHO_OUTPUT]
    [PASS_ANY
      | PASS_REGULAR_EXPRESSION "<regex0>" "<regex1>" ...
      | PASS_REGULAR_EXPRESSION_ALL "<regex0>" "<regex1>" ...
      | STANDARD_PASS_OUTPUT ]
    [FAIL_REGULAR_EXPRESSION "<regex0>" "<regex1>" ...]
    [ALWAYS_FAIL_ON_NONZERO_RETURN | ALWAYS_FAIL_ON_ZERO_RETURN]
    [WILL_FAIL]
For more information on these arguments, see TEST_<idx> EXEC/CMND Test Blocks and Arguments (tribits_add_advanced_test()).
The other type of TEST_<idx> block supported is for copying files and takes the form:
  TEST_<idx>
    COPY_FILES_TO_TEST_DIR <file0> <file1> ... <filen>
    [SOURCE_DIR <srcDir>]
    [DEST_DIR <destDir>]
This makes it easy to copy files from the source tree (or other location) to inside of the test directory (usually created with OVERALL_WORKING_DIR TEST_NAME) so that tests can run in their own private working directory (and so these files get deleted and recopied each time the test runs). This approach has several advantages:
For more information on these arguments, see TEST_<idx> COPY_FILES_TO_TEST_DIR Test Blocks and Arguments (tribits_add_advanced_test()).
By default, each and every atomic TEST_<idx> block needs to pass (as defined in Test case Pass/Fail (tribits_add_advanced_test())) in order for the overall test to pass.
Finally, the test is only added if tests are enabled for the package (i.e. ${PACKAGE_NAME}_ENABLE_TESTS = ON) and if other criteria are met (see Overall Arguments (tribits_add_advanced_test())). (NOTE: A more efficient way to optionally enable tests is to put them in a test/ subdir and then include that subdir with tribits_add_test_directories().)
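For orientation, here is a small hypothetical example (the executable, file, and regex names are illustrative, not from any real package) that copies an input file into the test's private working directory, runs a package-built MPI executable on it, and then compares the generated output against a gold file:

  tribits_add_advanced_test( MySolver_ConvergenceCheck
    OVERALL_WORKING_DIRECTORY TEST_NAME
    TEST_0 COPY_FILES_TO_TEST_DIR input.xml
      SOURCE_DIR ${CMAKE_CURRENT_SOURCE_DIR}/data
    TEST_1 EXEC my_solver
      ARGS --input=input.xml --output=results.dat
      NUM_MPI_PROCS 4
      PASS_REGULAR_EXPRESSION "Solver converged"
    TEST_2 CMND ${CMAKE_COMMAND}
      ARGS -E compare_files results.dat
        ${CMAKE_CURRENT_SOURCE_DIR}/data/results.gold.dat
    )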
Sections:
Overall Arguments (tribits_add_advanced_test())
Below, some of the overall arguments are described. The rest of the overall arguments that control overall pass/fail are described in Overall Pass/Fail (tribits_add_advanced_test()). (NOTE: All of these arguments must be listed outside of the TEST_<idx> blocks, see Argument Parsing and Ordering (tribits_add_advanced_test())).
<testNameBase>
The base name of the test (which will have ${PACKAGE_NAME}_ prepended to the name, see <testName> below) that will be used to name the output CMake script file as well as the CTest test name passed into add_test(). This must be the first argument to this function. The name is allowed to contain '/' chars but these will be replaced with '__' in the overall working directory name and the ctest -P script (Debugging and Examining Test Generation (tribits_add_advanced_test())).OVERALL_WORKING_DIRECTORY <overallWorkingDir>
If specified, then the working directory <overallWorkingDir> (relative or absolute path) will be created and all of the test commands by default will be run from within this directory. If the value <overallWorkingDir>=TEST_NAME is given, then the working directory will be given the name <testName> where any '/' chars are replaced with '__'. By default, if the directory <overallWorkingDir> exists before the test runs, it will be deleted and created again. If one wants to preserve the contents of this directory between test runs then set SKIP_CLEAN_OVERALL_WORKING_DIRECTORY. Using a separate test directory is a good option to use if the commands create intermediate files and one wants to make sure they get deleted before the test cases are run again. It is also important to create a separate test directory if multiple tests are defined in the same CMakeLists.txt file that read/write files with the same name.SKIP_CLEAN_OVERALL_WORKING_DIRECTORY
If specified, then <overallWorkingDir> will not be deleted if it already exists.FAIL_FAST
If specified, then the remaining test commands will be aborted when any test command fails. Otherwise, all of the test cases will be run.RUN_SERIAL
If specified, then no other tests will be allowed to run while this test is running. See the RUN_SERIAL argument in the function tribits_add_test() for more details.COMM [serial] [mpi]
If specified, selects if the test will be added in serial and/or MPI mode. See the COMM argument in the function tribits_add_test() for more details.OVERALL_NUM_MPI_PROCS <overallNumProcs>
If specified, gives the default number of MPI processes that each executable command runs on. If <overallNumProcs> is greater than ${MPI_EXEC_MAX_NUMPROCS} then the test will be excluded. If not specified, then the default number of processes for an MPI build will be ${MPI_EXEC_DEFAULT_NUMPROCS}. For serial builds, this argument is ignored. For MPI builds with all TEST_<idx> CMND blocks, <overallNumProcs> is used to set the property PROCESSORS. (see Running multiple tests at the same time (tribits_add_advanced_test())). WARNING! If just running a serial script or other command, then the property PROCESSORS will still get set to ${OVERALL_NUM_MPI_PROCS} so in order to avoid CTest unnecessarily reserving ${OVERALL_NUM_MPI_PROCS} processes for a serial non-MPI test, then one must leave off OVERALL_NUM_MPI_PROCS or explicitly pass in MPI_EXEC_DEFAULT_NUMPROCS 1!OVERALL_NUM_TOTAL_CORES_USED <overallNumTotalCoresUsed>
Used for NUM_TOTAL_CORES_USED if missing in a TEST_<idx> block.CATEGORIES <category0> <category1> ...
Gives the Test Test Categories for which this test will be added. See tribits_add_test() for more details.HOST <host0> <host1> ...
The list of hosts for which to enable the test (see tribits_add_test()).XHOST <host0> <host1> ...
The list of hosts for which not to enable the test (see tribits_add_test()).HOSTTYPE <hosttype0> <hosttype1> ...
The list of host types for which to enable the test (see tribits_add_test()).XHOSTTYPE <hosttype0> <hosttype1> ...
The list of host types for which not to enable the test (see tribits_add_test()).EXCLUDE_IF_NOT_TRUE <varname0> <varname1> ...
If specified, gives the names of CMake variables that must evaluate to true, or the test will not be added (see tribits_add_test()).DISABLED <messageWhyDisabled>
If <messageWhyDisabled> is non-empty and does not evaluate to FALSE by CMake, then the test will be added by ctest but the DISABLED test property will be set (see tribits_add_test()).ENVIRONMENT "<var1>=<value1>" "<var2>=<value2>" ....
If passed in, the listed environment variables will be set by CTest before calling the test. This is set using the built-in CTest test property ENVIRONMENT. Note, if the env var values contain semi-colons ';', then replace the semi-colons ';' with another separator '<sep>' and pass in LIST_SEPARATOR <sep> so <sep> will be replaced with ';' at point of usage. If the env var values contain any spaces, also quote the entire variable/value pair as "<vari>=<valuei>". For example, the env var and value my_env_var="arg1 b;arg2;I have spaces" would need to be passed as "my_env_var=arg1 b<sep>arg2<sep>I have spaces".TIMEOUT <maxSeconds>
If passed in, gives maximum number of seconds the test will be allowed to run before being timed-out and killed (see Setting timeouts for tests (tribits_add_test())). This is for the full CTest test, not individual TEST_<idx> commands!LIST_SEPARATOR <sep>
String used as placeholder for the semi-colon char ';' in order to allow pass-through. For example, if arguments to the ARGS or ENVIRONMENT need to use semi-colons, then replace ';' with '<semicolon>' (for example) such as with "somearg=arg1<semicolon>arg2", then at the point of usage, '<semicolon>' will be replaced with ';' and it will be passed to the final command as "somearg=arg1;arg2" (with as many preceding escape backslashes '\' in front of ';' as is needed for the given usage context).ADDED_TEST_NAME_OUT <testName>
If specified, then on output the variable <testName> will be set with the name of the test passed to add_test(). Having this name allows the calling CMakeLists.txt file to access and set additional test properties (see Setting additional test properties (tribits_add_advanced_test())).
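To illustrate the ENVIRONMENT and LIST_SEPARATOR arguments described above, a hedged sketch (the environment variable name and paths are hypothetical) might look like:

  tribits_add_advanced_test( env_list_example
    OVERALL_WORKING_DIRECTORY TEST_NAME
    TEST_0
      CMND ${CMAKE_COMMAND}
      ARGS -E environment          # print the environment seen by the test
      PASS_REGULAR_EXPRESSION "MY_PATH_LIST=/opt/a"
    # '<semicolon>' stands in for ';' and is substituted back at the point of use
    ENVIRONMENT "MY_PATH_LIST=/opt/a<semicolon>/opt/b"
    LIST_SEPARATOR "<semicolon>"
    )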
TEST_<idx> EXEC/CMND Test Blocks and Arguments (tribits_add_advanced_test())
Each test general command block TEST_<idx> runs either a package-built test executable or some general command executable and is defined as either EXEC <exeRootName> or an arbitrary command CMND <cmndExec> with the arguments:
EXEC <exeRootName> [NOEXEPREFIX] [NOEXESUFFIX] [ADD_DIR_TO_NAME] [DIRECTORY <dir>]
If EXEC is specified, then <exeRootName> gives the root name of an executable target that will be run as the command. The full executable name and path is determined in exactly the same way it is in the tribits_add_test() function (see Determining the Executable or Command to Run (tribits_add_test())). If this is an MPI build, then the executable will be run with MPI using NUM_MPI_PROCS <numProcs> (or OVERALL_NUM_MPI_PROCS <overallNumProcs> if NUM_MPI_PROCS is not set for this test case). If the maximum number of MPI processes allowed is less than this number of MPI processes, then the test will not be run. Note that EXEC <exeRootName> when NOEXEPREFIX and NOEXESUFFIX are specified is basically equivalent to CMND <cmndExec> except that in an MPI build, <exeRootName> is always run using MPI. In this case, one can pass in <exeRootName> to any command one would like and it will get run with MPI in MPI mode just like any other MPI-enabled built executable.
CMND <cmndExec>
If CMND is specified, then <cmndExec> gives the executable for a command to be run. In this case, MPI will never be used to run the executable even when configured in MPI mode (i.e. TPL_ENABLE_MPI=ON). If one wants to run an arbitrary command using MPI, use EXEC <fullPathToCmndExec> NOEXEPREFIX NOEXESUFFIX instead. WARNING: If you want to run such tests using valgrind, you have to use the raw executable as the <cmndExec> argument and not the script. For example, if you have a python script my_python_test.py with /usr/bin/env python3 at the top, you can't just use:
CMND <path>/my_python_test.py ARGS "<arg0>" "<arg1>" ...

The same goes for Perl or any other scripting language.
Instead, you have to use:
CMND ${Python3_EXECUTABLE} ARGS <path>/my_python_test.py <arg0> <arg1> ...

ARGS "<arg0>" "<arg1>" ... "<argN>"
The list of command-line arguments to pass to the CMND command or EXEC executable. Put each argument <argi> in quotes "<argi>" if it contains any spaces. Also, if any of the individual arguments need to contain semi-colons ';' such as --my-arg=a;b a;c;d, then pass that quoted as "--my-arg=a<sep>b a<sep>c<sep>d" where <sep> matches the <sep> argument to the input LIST_SEPARATOR <sep>.
By default, the output (stdout/stderr) for each test command is captured and is then echoed to stdout for the overall test. This is done in order to be able to grep the result to determine pass/fail.
Other miscellaneous arguments for each TEST_<idx> block include:
DIRECTORY <dir>
If specified, then the executable is assumed to be in the directory given by relative <dir>. See tribits_add_test().MESSAGE "<message>"
If specified, then the string in "<message>" will be printed before this test command is run. This allows adding some documentation about each individual test invocation to make the test output more understandable.WORKING_DIRECTORY <workingDir>
If specified, then the working directory <workingDir> (relative or absolute) will be created and the test will be run from within this directory. If the directory <workingDir> exists before the test runs, it will be deleted and created again. If one wants to preserve the contents of this directory between test blocks, then one needs to set SKIP_CLEAN_WORKING_DIRECTORY. Using a different WORKING_DIRECTORY for individual test commands allows creating independent working directories for each test case. This would be useful if a single OVERALL_WORKING_DIRECTORY was not sufficient for some reason.SKIP_CLEAN_WORKING_DIRECTORY
If specified, then <workingDir> will not be deleted if it already exists.NUM_MPI_PROCS <numProcs>
If specified, then <numProcs> is the number of processors used for MPI executables. If not specified, this will default to <overallNumProcs> from OVERALL_NUM_MPI_PROCS <overallNumProcs>. If that is not specified, then the value is taken from ${MPI_EXEC_DEFAULT_NUMPROCS}. For serial builds (i.e. TPL_ENABLE_MPI=OFF), passing in a value <numMpiProcs> > 1 will cause the entire test to not be added.NUM_TOTAL_CORES_USED <numTotalCoresUsed>
If specified, gives the total number of processes used by this command/executable. If this is missing, but NUM_MPI_PROCS <numProcs> is specified, then <numProcs> is used instead. If NUM_TOTAL_CORES_USED is missing BUT OVERALL_NUM_TOTAL_CORES_USED <overallNumTotalCoresUsed> is, then <overallNumTotalCoresUsed> is used for <numTotalCoresUsed>. This argument is used for test scripts/executables that use more cores than MPI processes (i.e. <numProcs>) and its only purpose is to inform CTest and TriBITS of the maximum number of cores that are used by the underlying test executable/script. When <numTotalCoresUsed> is greater than ${MPI_EXEC_MAX_NUMPROCS}, then the test will not be added. Otherwise, the CTest property PROCESSORS is set to the max over all <numTotalCoresUsed> so that CTest knows how to best schedule the test w.r.t. other tests on a given number of available processes.OUTPUT_FILE <outputFile>
If specified, then stdout and stderr for the test case will be sent to <outputFile>. By default, the contents of this file will also be printed to STDOUT unless NO_ECHO_OUTPUT is passed as well.
NOTE: Contrary to CMake documentation for execute_process(), STDOUT and STDERR may not get output in the correct order interleaved correctly, even in serial without MPI. Therefore, you can't write any tests that depend on the order of STDOUT and STDERR output in relation to each other. Also note that all of STDOUT and STDERR will be first read into the CTest executable process main memory before the file <outputFile> is written. Therefore, don't run executables or commands that generate massive amounts of console output or it may exhaust main memory. Instead, have the command or executable write directly to a file instead of going through STDOUT.
NO_ECHO_OUTPUT
If specified, then the output for the test command will not be echoed to the output for the entire test command.
By default, an individual test case TEST_<IDX> is assumed to pass if the executable or command returns a zero value to the shell. However, a test case can also be defined to pass or fail based on the arguments/options (see Test case Pass/Fail (tribits_add_advanced_test())):
PASS_ANY
If specified, the test command will be assumed to pass regardless of the return value or any other output. This would be used when a command that is to follow will determine pass or fail based on output from this command in some way.PASS_REGULAR_EXPRESSION "<regex0>" "<regex1>" ...
If specified, the test command will be assumed to pass if it matches any of the given regular expressions. Otherwise, it is assumed to fail. TIPS: Replace ';' with '[;]' or CMake will interpret this as an array element boundary. To match '.', use '[.]'.PASS_REGULAR_EXPRESSION_ALL "<regex0>" "<regex1>" ...
If specified, the test command will be assumed to pass if the output matches all of the provided regular expressions. Note that this is not a capability of raw ctest and represents an extension provided by TriBITS. NOTE: It is critical that you replace ';' with '[;]' or CMake will interpret this as an array element boundary.STANDARD_PASS_OUTPUT
If specified, the test command will be assumed to pass if the string expression "End Result: TEST PASSED" is found in the output for the test. This is equivalent to directly passing in PASS_REGULAR_EXPRESSION "End Result: TEST PASSED".
FAIL_REGULAR_EXPRESSION "<regex0>" "<regex1>" ...
If specified, the test command will be assumed to fail if it matches any of the given regular expressions. This is applied after, and takes precedence over, the pass criteria above. For example, even if PASS_REGULAR_EXPRESSION or PASS_REGULAR_EXPRESSION_ALL match, the test will be marked as failed if any of the fail regexes match the output.
ALWAYS_FAIL_ON_NONZERO_RETURN
If specified, then the test case will be marked as failed if the test command returns nonzero, independent of the other pass/fail criteria. This option is used in cases where one wants to grep for strings in the output but still wants to require a zero return code. This makes for a stronger test by requiring that both the strings are found and that the command returns 0.
ALWAYS_FAIL_ON_ZERO_RETURN
If specified, then the test case will be marked as failed if the test command returns zero '0', independent of the other pass/fail criteria. This option is used in cases where one wants to grep for strings in the output but still wants to require a nonzero return code. This makes for a stronger test by requiring that both the strings are found and that the command returns != 0.
WILL_FAIL
If specified, invert the result from the other pass/fail criteria. For example, if the regexes in PASS_REGULAR_EXPRESSION or PASS_REGULAR_EXPRESSION_ALL indicate that a test should pass, then setting WILL_FAIL will invert that and report the test as failing. But typically this is used to report a test that returns a nonzero code as passing.
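As an illustration of combining these pass/fail arguments inside a single test case (the executable name, arguments, and regular expressions below are hypothetical), a TEST_<idx> block might look like:

  TEST_0
    EXEC solverDriver             # hypothetical package-built executable
    ARGS --tol=1e-8
    # Require that both strings appear in the output ...
    PASS_REGULAR_EXPRESSION_ALL
      "Solver converged"
      "Final residual norm"
    # ... but fail anyway on an error string or a nonzero return code
    FAIL_REGULAR_EXPRESSION "Error:"
    ALWAYS_FAIL_ON_NONZERO_RETURN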
All of the arguments for a test block TEST_<idx> must appear directly below their TEST_<idx> argument and before the next test block (see Argument Parsing and Ordering (tribits_add_advanced_test())).
NOTE: The current implementation limits the number of TEST_<idx> blocks to just 20 (i.e. for <idx>=0...19). If more test blocks are added (e.g. TEST_20), then a fatal error message will be printed and processing will end. To increase this max in a local scope, call:
set(TRIBITS_ADD_ADVANCED_TEST_MAX_NUM_TEST_BLOCKS <larger-num>)
where <larger-num> > 20. This can be set in any scope in any CMakeLists.txt file or inside of a function and it will impact all of the future calls to tribits_add_advanced_test() in that scope.
TEST_<idx> COPY_FILES_TO_TEST_DIR Test Blocks and Arguments (tribits_add_advanced_test())
The arguments for the TEST_<idx> COPY_FILES_TO_TEST_DIR block are:
COPY_FILES_TO_TEST_DIR <file0> <file1> ... <filen>
Required list of 1 or more file names for files that will be copied from <srcDir>/ to <destDir>/.SOURCE_DIR <srcDir>
Optional source directory where the files will be copied from. If <srcDir> is not given, then it is assumed to be ${CMAKE_CURRENT_SOURCE_DIR}. If <srcDir> is given but is a relative path, then it is interpreted relative to ${CMAKE_CURRENT_SOURCE_DIR}. If <srcDir> is an absolute path, then that path is used without modification.DEST_DIR <destDir>
Optional destination directory where the files will be copied to. If <destDir> is not given, then it is assumed to be the working directory where the test is running (typically a new directory created under ${CMAKE_CURRENT_BINARY_DIR} when OVERALL_WORKING_DIR TEST_NAME is given). If <destDir> is given but is a relative path, then it is interpreted relative to the current test working directory. If <destDir> is an absolute path, then that path is used without modification. If <destDir> does not exist, then it will be created (including several directory levels deep if needed).
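For example, a hedged sketch that stages hypothetical input files into the test directory before running a (hypothetical) executable might look like:

  tribits_add_advanced_test( copy_and_run
    OVERALL_WORKING_DIRECTORY TEST_NAME
    TEST_0
      COPY_FILES_TO_TEST_DIR input_mesh.xml params.yaml  # hypothetical input files
      SOURCE_DIR ${CMAKE_CURRENT_SOURCE_DIR}/data
    TEST_1
      EXEC runSimulation          # hypothetical package-built executable
      ARGS --input=input_mesh.xml --params=params.yaml
      STANDARD_PASS_OUTPUT
    )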
Test case Pass/Fail (tribits_add_advanced_test())
The logic given below can be used to determine pass/fail criteria for a test case based both on what is printed in the test output and on the return code for the test block command. Raw CTest, as of version 3.23, does not allow that. With raw CTest, one can only set pass/fail criteria based on the test output or the return code, but not both. This makes tribits_add_advanced_test() more attractive to use than tribits_add_test() or raw add_test() in cases where it is important to check both.
The logic for how pass/fail for a TEST_<IDX> EXEC or CMND case is applied is given by:
# A) Apply first set of pass/fail logic
TEST_CASE_PASSED = FALSE
If PASS_ANY specified:
  TEST_CASE_PASSED = TRUE
Else If PASS_REGULAR_EXPRESSION is specified:
  For each "<regexi>" in PASS_REGULAR_EXPRESSION:
    If "<regexi>" matches STDOUT:
      TEST_CASE_PASSED = TRUE
    Endif
  Endforeach
Else if PASS_REGULAR_EXPRESSION_ALL specified:
  TEST_CASE_PASSED = TRUE
  For each "<regexi>" in PASS_REGULAR_EXPRESSION_ALL:
    If "<regexi>" does not match STDOUT:
      TEST_CASE_PASSED = FALSE
    Endif
  Endforeach
Else
  If command return code == 0:
    TEST_CASE_PASSED = TRUE
  Endif
Endif

# B) Check for failing regex matching?
If FAIL_REGULAR_EXPRESSION specified:
  For each "<regexi>" in FAIL_REGULAR_EXPRESSION:
    If "<regexi>" matches STDOUT:
      TEST_CASE_PASSED = FALSE
    Endif
  Endforeach
Endif

# C) Check for return code always 0 or != 0?
If ALWAYS_FAIL_ON_NONZERO_RETURN specified and return code != 0:
  TEST_CASE_PASSED = FALSE
Else If ALWAYS_FAIL_ON_ZERO_RETURN specified and return code == 0:
  TEST_CASE_PASSED = FALSE
Endif

# D) Invert pass/fail result?
If WILL_FAIL specified:
  If TEST_CASE_PASSED:
    TEST_CASE_PASSED = FALSE
  Else
    TEST_CASE_PASSED = TRUE
  Endif
Endif
Note that the above is the exact same logic that CTest uses to determine pass/fail w.r.t. to the CTest properties PASS_REGULAR_EXPRESSION, FAIL_REGULAR_EXPRESSION and WILL_FAIL. (It is just that raw CMake/CTest, as of version 3.23, does not support any pass/fail criteria like PASS_REGULAR_EXPRESSION_ALL or ALWAYS_FAIL_ON_NONZERO_RETURN/ALWAYS_FAIL_ON_ZERO_RETURN.)
Overall Pass/Fail (tribits_add_advanced_test())
By default, the overall test will be assumed to pass if it prints:
"OVERALL FINAL RESULT: TEST PASSED (<testName>)"
However, this can be changed by setting one of the following optional arguments:
FINAL_PASS_REGULAR_EXPRESSION "<regex0>" "<regex1>" ...
If specified, the test will be assumed to pass if the output matches any of the provided regular expressions <regexi>. (Sets the CTest property PASS_REGULAR_EXPRESSION for the overall test.)FINAL_FAIL_REGULAR_EXPRESSION "<regex0>" "<regex1>" ...
If specified, the test will be assumed to fail if the output matches any of the provided regular expressions <regexi> regardless if other criteria would have the test passing. (Sets the CTest property FAIL_REGULAR_EXPRESSION for the overall test.)
NOTE: It is not recommended to set FINAL_PASS_REGULAR_EXPRESSION or FINAL_FAIL_REGULAR_EXPRESSION directly, but instead to determine pass/fail for each test case individually as described in TEST_<idx> EXEC/CMND Test Blocks and Arguments (tribits_add_advanced_test()) and Test case Pass/Fail (tribits_add_advanced_test()). Otherwise, the test will confuse most people and the output behavior will seem very strange.
Argument Parsing and Ordering (tribits_add_advanced_test())
The basic tool used for parsing the arguments to this function is the command cmake_parse_arguments() which has a certain set of behaviors. The parsing using cmake_parse_arguments() is actually done in two phases. There is a top-level parsing of the "overall" arguments listed in Overall Arguments (tribits_add_advanced_test()) that also pulls out the test blocks. Then there is a second level of parsing using cmake_parse_arguments() for each of the TEST_<idx> blocks. Because of this usage, there are a few restrictions that one needs to be aware of when using tribits_add_advanced_test(). This short section tries to explain the behaviors and what is allowed and what is not allowed.
For the most part, the "overall" arguments and the arguments inside of any individual TEST_<idx> blocks can be listed in any order but there are restrictions related to the grouping of overall arguments and TEST_<idx> blocks which are as follows:
Other than that, the keyword arguments and options can appear in any order.
Implementation Details (tribits_add_advanced_test())
Since raw CTest does not support the features provided by this function, the way an advanced test is implemented is that a cmake -P script with the name <testName>.cmake (with any '/' replaced with '__') gets created in the current binary directory that then gets added to CTest using:
add_test(<testName> cmake [other options] -P <testName>.cmake)
This cmake -P script then runs the various test cases and checks the pass/fail for each case to determine overall pass/fail and implement other functionality described above.
Setting Additional Test Properties (tribits_add_advanced_test())
After this function returns, if the test gets added using add_test(), then additional properties can be set and changed using set_tests_properties(<testName> ...), where <testName> is returned using the ADDED_TEST_NAME_OUT <testName> argument. Therefore, any tests properties that are not directly supported by this function and passed through the argument list to this wrapper function can be set in the outer CMakeLists.txt file after the call to tribits_add_advanced_test(). For example:
tribits_add_advanced_test( someTest ...
  ADDED_TEST_NAME_OUT  someTest_TEST_NAME
  )

if (someTest_TEST_NAME)
  set_tests_properties( ${someTest_TEST_NAME}
    PROPERTIES ATTACHED_FILES someTest.log )
endif()
where the test writes a log file someTest.log that we want to submit to CDash also.
This approach will work no matter what TriBITS names the individual test(s) or whether the test(s) are added or not (depending on other arguments like COMM, XHOST, etc.).
The following built-in CTest test properties are set through Overall Arguments (tribits_add_advanced_test()) or are otherwise automatically set by this function and should NOT be overridden by direct calls to set_tests_properties(): ENVIRONMENT, FAIL_REGULAR_EXPRESSION, LABELS, PASS_REGULAR_EXPRESSION, RUN_SERIAL, TIMEOUT, WILL_FAIL, and WORKING_DIRECTORY.
However, generally, other built-in CTest test properties can be set after the test is added as shown above. Examples of test properties that can be set using direct calls to set_tests_properties() include ATTACHED_FILES, ATTACHED_FILES_ON_FAIL, COST, DEPENDS, MEASUREMENT, and RESOURCE_LOCK.
For example, one can set a dependency between two tests using:
tribits_add_advanced_test( test_a [...]
  ADDED_TEST_NAME_OUT  test_a_TEST_NAME
  )

tribits_add_advanced_test( test_b [...]
  ADDED_TEST_NAME_OUT  test_b_TEST_NAME
  )

if (test_a_TEST_NAME AND test_b_TEST_NAME)
  set_tests_properties(${test_b_TEST_NAME}
    PROPERTIES DEPENDS ${test_a_TEST_NAME})
endif()
This ensures that test test_b will always be run after test_a if both tests are run by CTest.
Running multiple tests at the same time (tribits_add_advanced_test())
Just as with tribits_add_test(), setting NUM_MPI_PROCS <numProcs> or OVERALL_NUM_MPI_PROCS <numOverallProcs> or NUM_TOTAL_CORES_USED <numTotalCoresUsed> or OVERALL_NUM_TOTAL_CORES_USED <overallNumTotalCoresUsed> will set the PROCESSORS CTest property to allow CTest to schedule and run multiple tests at the same time when 'ctest -j<N>' is used (see Running multiple tests at the same time (tribits_add_test())).
Disabling Tests Externally (tribits_add_advanced_test())
The test can be disabled externally by setting the CMake cache variable <testName>_DISABLE=TRUE. This allows tests to be disabled on a case-by-case basis. The name <testName> must be the exact name that shows up in ctest -N when running the test.
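For example, assuming the full test name is MyPackage_someTest (hypothetical), the test could be disabled from a CMake cache initialization fragment (or an equivalent -D option on the configure line) with something like:

  set(MyPackage_someTest_DISABLE TRUE CACHE BOOL
    "Externally disable the (hypothetical) test MyPackage_someTest")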
Debugging and Examining Test Generation (tribits_add_advanced_test())
In order to see what tests get added and if not then why, configure with ${PROJECT_NAME}_TRACE_ADD_TEST=ON. That will print one line per test that shows that the test got added or not and if not then why the test was not added (i.e. due to COMM, OVERALL_NUM_MPI_PROCS, NUM_MPI_PROCS, CATEGORIES, HOST, XHOST, HOSTTYPE, or XHOSTTYPE).
Likely the best way to debug test generation using this function is to examine the generated file <testName>.cmake in the current binary directory (see Implementation Details (tribits_add_advanced_test())) and the generated CTestTestfile.cmake file that should list this test case.
Using tribits_add_advanced_test() in non-TriBITS CMake projects
The function tribits_add_advanced_test() can be used to add tests in non-TriBITS projects. To do so, one just needs to set the variables ${PROJECT_NAME}_ENABLE_TESTS=TRUE and ${PROJECT_NAME}_TRIBITS_DIR (pointing to the TriBITS location). For example, a valid project can be as simple as:
cmake_minimum_required(VERSION 3.23.0)

set(PROJECT_NAME TAATDriver)
project(${PROJECT_NAME} NONE)

set(${PROJECT_NAME}_TRACE_ADD_TEST TRUE)
set(${PROJECT_NAME}_TRIBITS_DIR ""  CACHE  FILEPATH
  "Location of TriBITS to use." )
set(PACKAGE_NAME ${PROJECT_NAME})
set(${PACKAGE_NAME}_ENABLE_TESTS TRUE)

include("${${PROJECT_NAME}_TRIBITS_DIR}/core/test_support/TribitsAddAdvancedTest.cmake")

include(CTest)
enable_testing()

tribits_add_advanced_test( HelloWorld
  OVERALL_WORKING_DIRECTORY TEST_NAME
  TEST_0
    CMND echo
    ARGS "Hello World!"
    PASS_REGULAR_EXPRESSION "Hello World"
  )
Above, one can replace:
include("${${PROJECT_NAME}_TRIBITS_DIR}/core/test_support/TribitsAddAdvancedTest.cmake")
with:
list(PREPEND CMAKE_MODULE_PATH "${${PROJECT_NAME}_TRIBITS_DIR}/core/test_support")
include(TribitsAddAdvancedTest)
and it will have the same effect.
In: core/test_support/TribitsAddAdvancedTest.cmake:22
Add the standard cache variable option ${PACKAGE_NAME}_ENABLE_DEBUG for the package.
Usage:
tribits_add_debug_option()
This option is given the default value ${${PROJECT_NAME}_ENABLE_DEBUG}, and if true, this macro will set the variable HAVE_${PACKAGE_NAME_UC}_DEBUG (to be used in the package's configured header file <packageDir>/cmake/<packageName>_config.h.in). This macro is typically called in the package's <packageDir>/CMakeLists.txt file (see the example SimpleCxx/CMakeLists.txt).
NOTE: This also calls tribits_pkg_export_cache_var() to export the variable ${PACKAGE_NAME}_ENABLE_DEBUG.
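A hedged sketch of typical usage, for a hypothetical package MyPkg:

  # In <packageDir>/CMakeLists.txt:
  tribits_add_debug_option()

  # The configured header template <packageDir>/cmake/MyPkg_config.h.in would
  # then contain a line like:
  #
  #   #cmakedefine HAVE_MYPKG_DEBUG
  #
  # and the package's sources can guard debug-only checks with this macro.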
In: core/package_arch/TribitsPackageMacros.cmake:467
Macro called to conditionally add a set of example directories for a package.
Usage:
tribits_add_example_directories(<dir1> <dir2> ...)
This macro typically is called from the top-level <packageDir>/CMakeLists.txt file for which all subdirectories are all "examples" according to standard package layout.
This macro can be called several times within a package as desired to break up example directories any way one would like.
Currently, all this macro does is call add_subdirectory(<diri>) if ${PACKAGE_NAME}_ENABLE_EXAMPLES = TRUE.
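For example, a hypothetical package with two example subdirectories might call:

  tribits_add_example_directories(example/basic example/advanced)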
In: core/package_arch/TribitsPackageMacros.cmake:550
Function used to create an executable (typically for a test or example), using the built-in CMake command add_executable().
Usage:
tribits_add_executable( <exeRootName>
  [NOEXEPREFIX] [NOEXESUFFIX] [ADD_DIR_TO_NAME]
  SOURCES <src0> <src1> ...
  [CATEGORIES <category0> <category1> ...]
  [HOST <host0> <host1> ...]
  [XHOST <host0> <host1> ...]
  [HOSTTYPE <hosttype0> <hosttype1> ...]
  [XHOSTTYPE <hosttype0> <hosttype1> ...]
  [EXCLUDE_IF_NOT_TRUE <varname0> <varname1> ...]
  [DIRECTORY <dir>]
  [TESTONLYLIBS <lib0> <lib1> ...]
  [IMPORTEDLIBS <lib0> <lib1> ...]
  [COMM [serial] [mpi]]
  [LINKER_LANGUAGE (C|CXX|Fortran)]
  [TARGET_DEFINES -D<define0> -D<define1> ...]
  [INSTALLABLE]
  [ADDED_EXE_TARGET_NAME_OUT <exeTargetName>]
  )
Sections:
Formal Arguments (tribits_add_executable())
<exeRootName>
The root name of the executable (and CMake target) (see Executable and Target Name (tribits_add_executable())). This must be the first argument.NOEXEPREFIX
If passed in, then ${PACKAGE_NAME}_ is not added to the beginning of the executable name (see Executable and Target Name (tribits_add_executable())).
NOEXESUFFIX
If passed in, then ${${PROJECT_NAME}_CMAKE_EXECUTABLE_SUFFIX} is not added to the end of the executable name (except for native Windows builds, see Executable and Target Name (tribits_add_executable())).
ADD_DIR_TO_NAME
If passed in, the directory path relative to the package's base directory (with "/" replaced by "_") is added to the executable name (see Executable and Target Name (tribits_add_executable())). This provides a simple way to create unique test executable names inside of a given TriBITS package. Only test executables in the same directory would need to have unique <execRootName> passed in.SOURCES <src0> <src1> ...
Gives the source files that will be compiled into the built executable. By default, these sources are assumed to be in the current working directory (or can contain the relative path or absolute path). If <srci> is an absolute path, then that full file path is used. This list of sources (with adjusted directory path) is passed into add_executable(<exeTargetName> ... ). After calling this function, the properties of the source files can be altered using the built-in CMake command set_source_files_properties().
DIRECTORY <dir>
If specified, then the generated executable <exeTargetName> is placed in the relative or absolute directory <dir>. If <dir> is not an absolute path, then the generated executable is placed in the directory ${CMAKE_CURRENT_BINARY_DIR}/<dir>/. Also, the sources for the executable listed in SOURCES <src0> <src1> ... are assumed to be in the relative or absolute directory <dir> instead of the current source directory. This directory path is prepended to each source file name <srci> unless <srci> is an absolute path. If <dir> is not an absolute path, then source files listed in SOURCES are assumed to be in the directory ${CMAKE_CURRENT_SOURCE_DIR}/<dir>/.CATEGORIES <category0> <category1> ...
Gives the Test Test Categories for which this test will be added. See tribits_add_test() for more details.HOST <host0> <host1> ...
The list of hosts for which to enable the test (see tribits_add_test()).XHOST <host0> <host1> ...
The list of hosts for which not to enable the test (see tribits_add_test()).HOSTTYPE <hosttype0> <hosttype1> ...
The list of host types for which to enable the test (see tribits_add_test()).XHOSTTYPE <hosttype0> <hosttype1> ...
The list of host types for which not to enable the test (see tribits_add_test()).EXCLUDE_IF_NOT_TRUE <varname0> <varname1> ...
If specified, gives the names of CMake variables that must evaluate to true, or the test will not be added (see tribits_add_test()).TESTONLYLIBS <lib0> <lib1> ...
Specifies extra test-only libraries defined in this CMake project that will be linked to the executable using target_link_libraries(). Note that regular libraries (i.e. not TESTONLY) defined in the current package or any upstream packages can NOT be listed! TriBITS automatically links non-TESTONLY libraries in this package and upstream packages to the executable. The only libraries that should be listed in this argument are TESTONLY libraries.
IMPORTEDLIBS <lib0> <lib1> ...
Specifies extra external libraries that will be linked to the executable using target_link_libraries(). This can only be used for libraries that are built external from this CMake project and are not provided through a proper TriBITS TPL. The latter usage of passing in external libraries is not recommended. External libraries should be handled as declared TriBITS TPLs. So far, the only case where IMPORTEDLIBS has been shown to be necessary is to pass in the standard C math library m. In every other case, a TriBITS TPL should be used instead.COMM [serial] [mpi]
If specified, selects if the test will be added in serial and/or MPI mode. See the COMM argument in the script tribits_add_test() for more details.LINKER_LANGUAGE (C|CXX|Fortran)
If specified, overrides the linker language used by setting the built-in CMake target property LINKER_LANGUAGE. TriBITS sets the default linker language as follows:
if (${PROJECT_NAME}_ENABLE_CXX)
  set(LINKER_LANGUAGE CXX)
elseif (${PROJECT_NAME}_ENABLE_C)
  set(LINKER_LANGUAGE C)
else()
  # Let CMake set the default linker language it wants based
  # on source file extensions passed into ``add_executable()``.
endif()

The reason for this logic is that on some platforms, if you have a Fortran or C main that links to C++ libraries, then you need the C++ compiler to do the final linking. CMake does not seem to automatically know that it is pulling in C++ libraries and therefore needs to be told to use C++ for linking. This is the correct default behavior for mixed-language projects. However, this argument allows the developer to override this logic and use any linker language desired based on other considerations.
TARGET_DEFINES -D<define0> -D<define1> ...
Add the listed defines using target_compile_definitions(<exeTargetName> ...). These should only affect the listed sources for the built executable and not other targets.INSTALLABLE
If passed in, then an install target will be added to install the built executable into the ${CMAKE_INSTALL_PREFIX}/bin/ directory (see Install Target (tribits_add_executable())).ADDED_EXE_TARGET_NAME_OUT <exeTargetName>
If specified, then on output the variable <exeTargetName> will be set with the name of the executable target passed to add_executable(<exeTargetName> ... ). Having this name allows the calling CMakeLists.txt file to access and set additional target properties (see Additional Executable and Source File Properties (tribits_add_executable())).
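To make these arguments concrete, a hedged sketch follows (the source files and the TESTONLY library name are hypothetical):

  tribits_add_executable( unitTestDriver
    SOURCES UnitTestDriver.cpp TestHelpers.cpp
    TESTONLYLIBS mypkg_test_utils       # hypothetical TESTONLY library
    CATEGORIES BASIC
    ADDED_EXE_TARGET_NAME_OUT  unitTestDriver_TARGET_NAME
    )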
Executable and Target Name (tribits_add_executable())
By default, the full name of the executable and target name is:
<exeTargetName> = ${PACKAGE_NAME}_<exeRootName>
If ADD_DIR_TO_NAME is set, then the directory path relative to the package base directory (with "/" replaced with "_"), or <relDirName>, is added to the executable name to form:
<exeTargetName> = ${PACKAGE_NAME}_<relDirName>_<exeRootName>
If the option NOEXEPREFIX is passed in, then the prefix ${PACKAGE_NAME}_ is removed.
The executable suffix ${${PROJECT_NAME}_CMAKE_EXECUTABLE_SUFFIX} will be added to the actual executable file name if the option NOEXESUFFIX is not passed in but this suffix is never added to the target name. (However, note that on Windows platforms, the default *.exe extension is always added because windows will not run an executable in many contexts unless it has the *.exe extension.)
The reason that a default prefix is prepended to the executable and target name is because the primary reason to create an executable is typically to create a test or an example that is private to the package. This prefix helps to namespace the executable and its target so as to avoid name clashes with targets in other packages. It also helps to avoid clashes if the executable gets installed into the install directory (if INSTALLABLE is specified). For general utility executables on Linux/Unix systems, NOEXEPREFIX and NOEXESUFFIX should be passed in. In this case, one must be careful to pick <exeRootName> that will be sufficiently globally unique. Please use common sense when picking non-namespaced names.
Additional Executable and Source File Properties (tribits_add_executable())
Once add_executable(<exeTargetName> ... ) is called and this function exits, one can set and change properties on the <exeTargetName> executable target using the built-in set_target_properties() command as well as properties on any of the source files listed in SOURCES using the built-in set_source_files_properties() command just like in any CMake project. If the executable is added, its name will be returned by the argument ADDED_EXE_TARGET_NAME_OUT <exeTargetName>. For example:
tribits_add_executable( someExe ...
  ADDED_EXE_TARGET_NAME_OUT  someExe_TARGET_NAME )

if (someExe_TARGET_NAME)
  set_target_properties( ${someExe_TARGET_NAME}
    PROPERTIES LINKER_LANGUAGE CXX )
endif()
The if(someExe_TARGET_NAME) is needed in case the executable does not get added for some reason (see Formal Arguments (tribits_add_executable()) for logic that can result in the executable target not getting added).
Install Target (tribits_add_executable())
If INSTALLABLE is passed in, then an install target using the built-in CMake command install(TARGETS <exeTargetName> ...) is added to install the built executable into the ${CMAKE_INSTALL_PREFIX}/bin/ directory (actual install directory path is determined by ${PROJECT_NAME}_INSTALL_RUNTIME_DIR, see Setting the install prefix).
In: core/package_arch/TribitsAddExecutable.cmake:24
Add an executable and a test (or several tests) all in one shot (just calls tribits_add_executable() followed by tribits_add_test()).
Usage:
tribits_add_executable_and_test( <exeRootName>
  [NOEXEPREFIX] [NOEXESUFFIX] [ADD_DIR_TO_NAME]
  SOURCES <src0> <src1> ...
  [NAME <testName> | NAME_POSTFIX <testNamePostfix>]
  [CATEGORIES <category0> <category1> ...]
  [HOST <host0> <host1> ...]
  [XHOST <xhost0> <xhost1> ...]
  [XHOST_TEST <xhost0> <xhost1> ...]
  [HOSTTYPE <hosttype0> <hosttype1> ...]
  [XHOSTTYPE <xhosttype0> <xhosttype1> ...]
  [XHOSTTYPE_TEST <xhosttype0> <xhosttype1> ...]
  [EXCLUDE_IF_NOT_TRUE <varname0> <varname1> ...]
  [DISABLED "<messageWhyDisabled>"]
  [DIRECTORY <dir>]
  [TESTONLYLIBS <lib0> <lib1> ...]
  [IMPORTEDLIBS <lib0> <lib1> ...]
  [COMM [serial] [mpi]]
  [ARGS "<arg0> <arg1> ..." "<arg2> <arg3> ..." ...]
  [NUM_MPI_PROCS <numProcs>]
  [RUN_SERIAL]
  [LINKER_LANGUAGE (C|CXX|Fortran)]
  [STANDARD_PASS_OUTPUT
    | PASS_REGULAR_EXPRESSION "<regex0>;<regex1>;..."]
  [FAIL_REGULAR_EXPRESSION "<regex0>;<regex1>;..."]
  [WILL_FAIL]
  [ENVIRONMENT <var0>=<value0> <var1>=<value1> ...]
  [INSTALLABLE]
  [TIMEOUT <maxSeconds>]
  [LIST_SEPARATOR <sep>]
  [ADDED_EXE_TARGET_NAME_OUT <exeTargetName>]
  [ADDED_TESTS_NAMES_OUT <testsNames>]
  )
This function takes a fairly common set of arguments to tribits_add_executable() and tribits_add_test() but not the full set passed to tribits_add_test(). See the documentation for tribits_add_executable() and tribits_add_test() to see which arguments are accepted by which functions.
Arguments that are specific to this function and not directly passed on to tribits_add_executable() or tribits_add_test() include:
XHOST_TEST <xhost0> <xhost1> ...
When specified, this disables just running the tests for the named hosts <xhost0>, <xhost1>, etc. but still builds the executable for the test. These are just passed in through the XHOST argument to tribits_add_test().
XHOSTTYPE_TEST <xhosttype0> <xhosttype1> ...
When specified, this disables just running the tests for the named host types <xhosttype0>, <xhosttype1>, ..., but still builds the executable for the test. These are just passed in through the XHOSTTYPE argument to tribits_add_test().
This is the function to use for simple test executables that you want to run that either takes no arguments or just a simple set of arguments passed in through ARGS. For more flexibility, just use tribits_add_executable() followed by tribits_add_test().
Finally, the tests are only added if tests are enabled for the package (i.e. ${PACKAGE_NAME}_ENABLE_TESTS = ON) and other criteria are met. But the test executable will always be added if this function is called, regardless of the value of ${PACKAGE_NAME}_ENABLE_TESTS. To avoid adding the test (or example) executable when ${PACKAGE_NAME}_ENABLE_TESTS=OFF, put this command in a subdir under test/ or example/ and include that subdir with tribits_add_test_directories() or tribits_add_example_directories(), respectively.
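For example, a hedged sketch that builds a (hypothetical) executable and defines two MPI tests from it in one call:

  tribits_add_executable_and_test( vectorOpsTest
    SOURCES VectorOpsTest.cpp            # hypothetical source file
    NUM_MPI_PROCS 4
    ARGS "--use-threads" "--no-threads"  # two quoted groups => two test cases
    STANDARD_PASS_OUTPUT
    ADDED_TESTS_NAMES_OUT  vectorOpsTest_TEST_NAMES
    )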
In: core/package_arch/TribitsAddExecutableAndTest.cmake:54
Function used to add a CMake library and target using add_library() and also the ALIAS target ${PACKAGE_NAME}::<libname> (where <libname> is the full CMake target name as returned from ${<libTargetName>}).
Usage:
tribits_add_library( <libBaseName>
  [HEADERS <h0> <h1> ...]
  [HEADERS_INSTALL_SUBDIR <headerssubdir>]
  [NOINSTALLHEADERS <nih0> <nih1> ...]
  [SOURCES <src0> <src1> ...]
  [DEPLIBS <deplib0> <deplib1> ...]
  [IMPORTEDLIBS <ideplib0> <ideplib1> ...]
  [STATIC|SHARED]
  [TESTONLY]
  [NO_INSTALL_LIB_OR_HEADERS]
  [CUDALIBRARY]
  [ADDED_LIB_TARGET_NAME_OUT <libTargetName>]
  )
Sections:
Formal Arguments (tribits_add_library())
<libBaseName>
Required base name of the library. The actual library name will be prefixed by ${${PROJECT_NAME}_LIBRARY_NAME_PREFIX} to produce:
<libTargetName> = ${${PROJECT_NAME}_LIBRARY_NAME_PREFIX}<libBaseName>

This is the name passed to add_library(<libTargetName> ...). The name is not prefixed by the package name. CMake will of course add any standard prefix or post-fix to the library file name appropriate for the platform and for whether this is a static or shared library build (e.g. on Linux, prefix = 'lib', postfix = '.so' for a shared lib and postfix = '.a' for a static lib) (see the documentation for the built-in CMake command add_library()).
HEADERS <h0> <h1> ...
List of public header files for using this library. By default, these header files are assumed to be in the current source directory. They can also contain the relative path or absolute path to the files if they are not in the current source directory. This list of headers is passed into add_library(...) as well (which is not strictly needed but is helpful for some build tools, like MS Visual Studio). By default, these headers will be installed (see Install Targets (tribits_add_library())).HEADERS_INSTALL_SUBDIR <headerssubdir>
Optional subdirectory that the headers will be installed under the standard installation directory. If <headerssubdir>!="", then the headers will be installed under ${${PROJECT_NAME}_INSTALL_INCLUDE_DIR}/<headerssubdir>. Otherwise, they will be installed under ${${PROJECT_NAME}_INSTALL_INCLUDE_DIR}/. See Install Targets (tribits_add_library()).
NOINSTALLHEADERS <nih0> <nih1> ...
List of private header files which are used by this library. These headers are not installed and do not need to be passed in for any purpose other than to pass them into add_library() as some build tools like to have these listed (e.g. MS Visual Studio).
SOURCES <src0> <src1> ...
List of source files passed into add_library() that are compiled into object files and included in the library. The compiler used to compile the files is determined automatically based on the file extension (see CMake documentation for add_library()).
DEPLIBS <deplib0> <deplib1> ...
List of dependent libraries that are built in the current package that this library is dependent on. These libraries are passed into target_link_libraries(<libTargetName> ...) so that CMake knows about the dependency structure of the libraries within this package. NOTE: One must not list libraries in other upstream TriBITS Packages or libraries built externally from this TriBITS CMake project in DEPLIBS. The TriBITS system automatically handles linking to libraries in upstream TriBITS packages. External libraries need to be listed in the IMPORTEDLIBS argument instead if they are not already specified automatically using a TriBITS TPL.IMPORTEDLIBS <ideplib0> <ideplib1> ...
List of dependent libraries built externally from this TriBITS CMake project. These libraries are passed into target_link_libraries(<libTargetName> ...) so that CMake knows about the dependency. However, note that external libraries are often better handled as TriBITS TPLs. A well constructed TriBITS package and library should never have to use this option! So far, the only case where IMPORTEDLIBS has been shown to be necessary is to pass in the standard C math library m. In every other case, a TriBITS TPL should be used instead.STATIC or SHARED
If STATIC is passed in, then a static library will be created independent of the value of BUILD_SHARED_LIBS. If SHARED is passed in, then a shared library will be created independent of the value of BUILD_SHARED_LIBS. If neither STATIC nor SHARED is passed in, then a shared library will be created if BUILD_SHARED_LIBS evaluates to true, otherwise a static library will be created. If both STATIC and SHARED are passed in (which is obviously a mistake), then a shared library will be created. WARNING: Once you mark a library with STATIC, then all of the downstream libraries in the current package and all downstream packages must also be marked with STATIC. That is because, generally, one cannot link a static lib against a downstream shared lib since that is not portable (but it can be done on some platforms if, for example, -fPIC is specified). So be careful to use STATIC in all downstream libraries!
TESTONLY
If passed in, then <libTargetName> will not be added to ${PACKAGE_NAME}_LIBRARIES and an install target for the library will not be added. In this case, the current include directories will be set in the global variable <libTargetName>_INCLUDE_DIR which will be used in tribits_add_executable() when a test-only library is linked in through its DEPLIBS argument. Also, the custom property TRIBITS_TESTONLY_LIB will be set to TRUE which will ensure that this library will not be added to the ${PACKAGE_NAME}::all_libs target.NO_INSTALL_LIB_OR_HEADERS
If specified, then no install targets will be added for the library <libTargetName> or the header files listed in HEADERS.CUDALIBRARY
If specified, then cuda_add_library() is used instead of add_library() where cuda_add_library() is assumed to be defined by the standard FindCUDA.cmake module as processed using the standard TriBITS FindTPLCUDA.cmake file (see Standard TriBITS TPLs). For this option to work, this package must have an enabled direct or indirect dependency on the TriBITS CUDA TPL or a configure-time error may occur about not knowing about cuda_add_library().
ADDED_LIB_TARGET_NAME_OUT <libTargetName>
If specified, then on output the variable <libTargetName> will be set with the name of the library passed to add_library(). Having this name allows the calling CMakeLists.txt file to access and set additional target properties (see Additional Library and Source File Properties (tribits_add_library())).
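As a sketch of how a TESTONLY library is typically defined and then consumed by a test executable (all names below are hypothetical):

  # Typically in a test/ subdirectory of the package:
  tribits_add_library( mypkg_test_utils
    TESTONLY
    HEADERS TestUtils.hpp
    SOURCES TestUtils.cpp
    )

  # Link the test-only library into a test executable via TESTONLYLIBS:
  tribits_add_executable( someUnitTest
    SOURCES SomeUnitTest.cpp
    TESTONLYLIBS mypkg_test_utils
    )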
Include Directories (tribits_add_library())
Any base directories for the header files listed in the arguments HEADERS or NOINSTALLHEADERS should be passed into the standard CMake command include_directories() before calling this function. For example, a CMakeLists.txt file will look like:
...
tribits_configure_file(${PACKAGE_NAME}_config.h)
configure_file(...)
...
include_directories(${CMAKE_CURRENT_SOURCE_DIR})
include_directories(${CMAKE_CURRENT_BINARY_DIR})
...
tribits_add_library( <libName>
  SOURCES
    <src0>.c
    <subdir0>/<src1>.cpp
    <subdir1>/<src2>.F90
    ...
  HEADERS
    <header0>.h
    <subdir0>/<header1>.hpp
    ...
  NOINSTALLHEADERS <header2>.hpp <header3>.hpp ...
  ...
  )
The include of ${CMAKE_CURRENT_BINARY_DIR} is needed for any generated header files (e.g. using raw configure_file() or tribits_configure_file()) or any generated Fortran *.mod module files generated as a byproduct of compiling F90+ source files (that contain one or more Fortran module declarations).
The function tribits_add_library() will grab the list of all of the include directories in scope from prior calls to include_directories() and will add them to the generated library target so that they get propagated downstream as well.
Install Targets (tribits_add_library())
By default, an install target for the library is created using install(TARGETS <libTargetName> ...) to install into the directory ${CMAKE_INSTALL_PREFIX}/lib/ (the actual install directory is given by ${PROJECT_NAME}_INSTALL_LIB_DIR, see Setting the install prefix). However, this install target will not get created if ${PROJECT_NAME}_INSTALL_LIBRARIES_AND_HEADERS is FALSE and BUILD_SHARED_LIBS=OFF. But when BUILD_SHARED_LIBS=ON, the install target will get added. Also, this install target will not get added if TESTONLY or NO_INSTALL_LIB_OR_HEADERS are passed in.
By default, an install target for the headers listed in HEADERS will get added using install(FILES <h0> <h1> ...), but only if TESTONLY and NO_INSTALL_LIB_OR_HEADERS are not passed in as well. Also, the install target for the headers will not get added if ${PROJECT_NAME}_INSTALL_LIBRARIES_AND_HEADERS is FALSE. If this install target is added, then the headers get installed into the flat directory ${${PROJECT_NAME}_INSTALL_INCLUDE_DIR}/ (default is ${CMAKE_INSTALL_PREFIX}/include/, see Setting the install prefix). If HEADERS_INSTALL_SUBDIR is set, then the headers will be installed under ${${PROJECT_NAME}_INSTALL_INCLUDE_DIR}/<headerssubdir>/.
Note that an install target will not get created for the headers listed in NOINSTALLHEADERS.
Additional Library and Source File Properties (tribits_add_library())
Once add_library(<libTargetName> ... <src0> <src1> ...) is called, one can set and change properties on the <libTargetName> library target using the built-in CMake command set_target_properties() as well as set and change properties on any of the source files listed in SOURCES using the built-in CMake command set_source_files_properties() just like in any CMake project. For example:
tribits_add_library( somelib ...
  ADDED_LIB_TARGET_NAME_OUT  somelib_TARGET_NAME )

set_target_properties( ${somelib_TARGET_NAME}
  PROPERTIES LINKER_LANGUAGE CXX )
Miscellaneous Notes (tribits_add_library())
When the file Version.cmake exists and the CMake variables ${PROJECT_NAME}_VERSION and ${PROJECT_NAME}_MAJOR_VERSION are defined, then produced shared libraries will be given the standard SOVERSION symlinks (see <projectDir>/Version.cmake).
WARNING: Do NOT use the built-in CMake command add_definitions() to add defines -D<someDefine> to the compile command line that will affect any of the header files in the package! These CMake-added defines are only set locally in this directory and child directories. These defines will NOT be set when code in peer directories (e.g. downstream TriBITS packages) that may include these header files gets compiled. To add defines that affect header files, please use a configured header file (see tribits_configure_file()).
In: core/package_arch/TribitsAddLibrary.cmake:25
Add an option and an optional macro define variable in one shot.
Usage:
tribits_add_option_and_define( <userOptionName> <macroDefineName> "<docStr>" <defaultValue> [NONCACHE])
This macro sets the user cache BOOL variable <userOptionName> and if it is true, then sets the global (internal cache) macro define variable <macroDefineName> to ON, and otherwise sets it to OFF. If NONCACHE is passed in, then <macroDefineName> is set as a non-cache local variable instead of a cache variable.
This is designed to make it easy to add a user-enabled option to a configured header file and have the define set in one shot. This would require that the package's configure file (see tribits_configure_file()) have the line:
#cmakedefine <macroDefineName>
NOTE: This also calls tribits_pkg_export_cache_var() to export the variables <userOptionName> and <macroDefineName> (when NONCACHE is not passed). This also requires that local variables with the same names of these cache variables not be assigned with a different value from these cache variables. If they are, then an error will occur later when these variables are read.
NOTE: The define var name <macroDefineName> can be empty "" in which case all logic related to <macroDefineName> is skipped. (But in this case, it would be better to just call:
set(<userOptionName> <defaultValue> CACHE BOOL "<docStr>")
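For illustration, a hedged sketch of typical usage for a hypothetical package MyPkg might look like:

  tribits_add_option_and_define( MyPkg_ENABLE_EXTRA_CHECKS
    HAVE_MYPKG_EXTRA_CHECKS
    "Enable (hypothetical) extra runtime checking in MyPkg."
    OFF
    )

  # MyPkg's configured header template would then contain the line:
  #
  #   #cmakedefine HAVE_MYPKG_EXTRA_CHECKS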
In: core/package_arch/TribitsAddOptionAndDefine.cmake:14
Add the standard option ${PACKAGE_NAME}_SHOW_DEPRECATED_WARNINGS for the package.
Usage:
tribits_add_show_deprecated_warnings_option()
This macro should be called in the package's <packageDir>/CMakeLists.txt file. This option is given the default value ${${PROJECT_NAME}_SHOW_DEPRECATED_WARNINGS}. This option is then looked for in tribits_configure_file() to add macros to add deprecated warnings to deprecated parts of a package.
In: core/package_arch/TribitsPackageMacros.cmake:505
Add a test or a set of tests for a single executable or command using CTest add_test().
Usage:
tribits_add_test( <exeRootName> [NOEXEPREFIX] [NOEXESUFFIX]
  [NAME <testName> | NAME_POSTFIX <testNamePostfix>]
  [DIRECTORY <directory>]
  [ADD_DIR_TO_NAME]
  [RUN_SERIAL]
  [ARGS "<arg0> <arg1> ..." "<arg2> <arg3> ..." ...
    | POSTFIX_AND_ARGS_0 <postfix0> <arg0> <arg1> ...
      POSTFIX_AND_ARGS_1 ... ]
  [COMM [serial] [mpi]]
  [NUM_MPI_PROCS <numMpiProcs>]
  [NUM_TOTAL_CORES_USED <numTotalCoresUsed>]
  [CATEGORIES <category0> <category1> ...]
  [HOST <host0> <host1> ...]
  [XHOST <host0> <host1> ...]
  [HOSTTYPE <hosttype0> <hosttype1> ...]
  [XHOSTTYPE <hosttype0> <hosttype1> ...]
  [EXCLUDE_IF_NOT_TRUE <varname0> <varname1> ...]
  [DISABLED <messageWhyDisabled>]
  [STANDARD_PASS_OUTPUT
    | PASS_REGULAR_EXPRESSION "<regex0>" "<regex1>" ...]
  [FAIL_REGULAR_EXPRESSION "<regex0>" "<regex1>" ...]
  [WILL_FAIL]
  [ENVIRONMENT <var0>=<value0> <var1>=<value1> ...]
  [TIMEOUT <maxSeconds>]
  [LIST_SEPARATOR <sep>]
  [ADDED_TESTS_NAMES_OUT <testsNames>]
  )
The tests are only added if tests are enabled for the package (i.e. ${PACKAGE_NAME}_ENABLE_TESTS = ON). (NOTE: A more efficient way to optionally enable tests or examples is to put them in a test/ or example/ subdir and then include that subdir with tribits_add_test_directories() or tribits_add_example_directories(), respectively.)
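For orientation, a minimal hedged sketch (the executable, arguments, and regular expression below are hypothetical):

  tribits_add_test( solverDriver
    NAME solverDriver_belos
    ARGS "--solver=belos --tol=1e-8"
    NUM_MPI_PROCS 2
    PASS_REGULAR_EXPRESSION "Converged"
    ADDED_TESTS_NAMES_OUT  solverDriver_belos_TEST_NAMES
    )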
Sections:
Formal Arguments (tribits_add_test())
<exeRootName>
The name of the executable or path to the executable to run for the test (see Determining the Executable or Command to Run (tribits_add_test())). This name is also the default root name for the test (see Determining the Full Test Name (tribits_add_test())).NOEXEPREFIX
If specified, then the prefix ${PACKAGE_NAME}_ is assumed not to be prepended to <exeRootName> (see Determining the Executable or Command to Run (tribits_add_test())).NOEXESUFFIX
If specified, then the postfix ${${PROJECT_NAME}_CMAKE_EXECUTABLE_SUFFIX} is assumed not to be post-pended to <exeRootName> (except on Windows platforms, see Determining the Executable or Command to Run (tribits_add_test())).NAME <testRootName>
If specified, gives the root name of the test. If not specified, then <testRootName> is taken to be <exeRootName>. The actual test name passed to add_test() will always be prefixed as ${PACKAGE_NAME}_<testRootName>. The main purpose of this argument is to allow multiple tests to be defined for the same executable. CTest requires all test names to be globally unique in a single project. See Determining the Full Test Name (tribits_add_test()).NAME_POSTFIX <testNamePostfix>
If specified, gives a postfix that will be added to the standard test name based on <exeRootName> (appended as _<NAME_POSTFIX>). If the NAME <testRootName> argument is given, this argument is ignored. See Determining the Full Test Name (tribits_add_test()).DIRECTORY <dir>
If specified, then the executable is assumed to be in the directory given by <dir>. The directory <dir> can either be a relative or absolute path. If not specified, the executable is assumed to be in the current binary directory ${CMAKE_CURRENT_BINARY_DIR}. See Determining the Executable or Command to Run (tribits_add_test()).ADD_DIR_TO_NAME
If specified, then the directory name that this test resides in will be added into the name of the test after the package name is added and before the root test name (see Determining the Full Test Name (tribits_add_test())). The directory name will have the package's base directory stripped off so only the unique part of the test directory will be used. All directory separators "/" will be changed into underscores "_".RUN_SERIAL
If specified, then no other tests will be allowed to run while this test is running. This is useful for devices (like CUDA GPUs) that require exclusive access for processes/threads. This just sets the CTest test property RUN_SERIAL using the built-in CMake function set_tests_properties(). Also, the addition of the RUN_SERIAL test property can be triggered by (the user) setting the global cache variable <fullTestName>_SET_RUN_SERIAL=ON. NOTE: If RUN_SERIAL is passed in but <fullTestName>_SET_RUN_SERIAL=OFF (or any value evaluating to FALSE), then the RUN_SERIAL test property will NOT be set on the added test(s).ARGS "<arg0> <arg1> ..." "<arg2> <arg3> ..." ...
If specified, then a set of arguments can be passed in quotes. If multiple groups of arguments are passed in different quoted clusters of arguments then a different test will be added for each set of arguments. In this way, many different tests can be added for a single executable in a single call to this function. Each of these separate tests will be named <fullTestName>_xy where xy = 00, 01, 02, and so on. WARNING: When defining multiple tests it is preferred to use the POSTFIX_AND_ARGS_<IDX> form instead. WARNING: Multiple arguments passed to a single test invocation must be quoted or multiple tests taking single arguments will be created instead! See Adding Multiple Tests (tribits_add_test()) for more details and examples.POSTFIX_AND_ARGS_<IDX> <postfix> <arg0> <arg1> ...
If specified, gives a sequence of sets of test postfix names and arguments lists for different tests (up to POSTFIX_AND_ARGS_19). For example, a set of three different tests with argument lists can be specified as:
POSTFIX_AND_ARGS_0 postfix0 --arg1 --arg2=dummy
POSTFIX_AND_ARGS_1 postfix1 --arg2=fly
POSTFIX_AND_ARGS_2 postfix2 --arg2=bags

This will create three different test cases with the postfix names postfix0, postfix1, and postfix2. The indexes must be consecutive, starting at 0 and going up to (currently) 19. The main advantages of using these arguments instead of just ARGS are that one can give a meaningful name to each test case, one can specify multiple arguments without having to quote them, and one can allow long argument lists to span multiple lines. See Adding Multiple Tests (tribits_add_test()) for more details and examples.
Note that one of the <postfix> arguments can be empty, in which case nothing is appended to the base test name, so:

POSTFIX_AND_ARGS_0 "" --arg1 --arg2=dummy

would create one test without appending any postfix to the test name.
COMM [serial] [mpi]
If specified, determines if the test will be added in serial and/or MPI mode. If the COMM argument is missing, the test will be added in both serial and MPI builds of the code. That is if COMM mpi is passed in, then the test will not be added if TPL_ENABLE_MPI=OFF. Likewise, if COMM serial is passed in, then the test will not be added if TPL_ENABLE_MPI=ON. If COMM serial mpi or COMM mpi serial is passed in, then the value of TPL_ENABLE_MPI does not determine if the test is added or not.NUM_MPI_PROCS <numMpiProcs>
If specified, gives the number of MPI processes used to run the test with the MPI exec program ${MPI_EXEC}. If <numMpiProcs> is greater than ${MPI_EXEC_MAX_NUMPROCS} then the test will be excluded. If not specified, then the default number of processes for an MPI build (i.e. TPL_ENABLE_MPI=ON) will be ${MPI_EXEC_DEFAULT_NUMPROCS}. For serial builds (i.e. TPL_ENABLE_MPI=OFF), passing in a value <numMpiProcs> > 1 will cause the test to not be added. The value <numMpiProcs> will also be set as the built-in test property PROCESSORS if NUM_TOTAL_CORES_USED is not specified.NUM_TOTAL_CORES_USED <numTotalCoresUsed>
If specified, gives the total number of processes or cores that is reported to CTest as the built-in CTest PROCESSORS property. If this is not specified, then PROCESSORS is specified by the argument NUM_MPI_PROCS <numMpiProcs>. This argument is used for test scripts/executables that use more cores than MPI processes (i.e. <numMpiProcs>) and its only purpose is to inform CTest and TriBITS of the maximum number of processes or cores that are used by the underlying test executable/script. When specified, if <numTotalCoresUsed> is greater than ${MPI_EXEC_MAX_NUMPROCS}, then the test will not be added. Otherwise, the CTest property PROCESSORS is set to <numTotalCoresUsed> so that CTest knows how to best schedule the test w.r.t. other tests on a given number of available processes. See Running multiple tests at the same time (tribits_add_test()).CATEGORIES <category0> <category1> ...
If specified, gives the specific categories of the test. Valid test categories include BASIC, CONTINUOUS, NIGHTLY, HEAVY and PERFORMANCE. If not specified, the default category is BASIC. When the test category does not match ${PROJECT_NAME}_TEST_CATEGORIES, then the test is not added. When CATEGORIES contains BASIC it will match ${PROJECT_NAME}_TEST_CATEGORIES equal to CONTINUOUS, NIGHTLY, and HEAVY. When CATEGORIES contains CONTINUOUS it will match ${PROJECT_NAME}_TEST_CATEGORIES equal to CONTINUOUS, NIGHTLY, and HEAVY. When CATEGORIES contains NIGHTLY it will match ${PROJECT_NAME}_TEST_CATEGORIES equal to NIGHTLY and HEAVY. When CATEGORIES contains PERFORMANCE it will match ${PROJECT_NAME}_TEST_CATEGORIES=PERFORMANCE only.HOST <host0> <host1> ...
If specified, gives a list of hostnames where the test will be included. The current hostname is determined by the built-in CMake command site_name(${PROJECT_NAME}_HOSTNAME). On Linux/Unix systems, this is typically the value returned by uname -n. If this list is given, the value of ${${PROJECT_NAME}_HOSTNAME} must equal one of the listed host names <hosti> or test will not be added. The value of ${PROJECT_NAME}_HOSTNAME gets printed out in the TriBITS cmake output under the section Probing the environment (see Full Processing of TriBITS Project Files).XHOST <host0> <host1> ...
If specified, gives a list of hostnames (see HOST argument) on which the test will not be added. This check is performed after the check for the hostnames in the HOST list if it should exist. Therefore, this exclusion list overrides the HOST inclusion list.HOSTTYPE <hosttype0> <hosttype1> ...
If specified, gives the names of the host system type (given by the built-in CMake cache variable CMAKE_HOST_SYSTEM_NAME which is printed in the TriBITS cmake configure output in the section Probing the environment) for which the test is allowed to be added. If HOSTTYPE is specified and CMAKE_HOST_SYSTEM_NAME is not equal to one of the values of <hosttypei>, then the test will not be added. Typical host system type names include Linux, Darwin, Windows, etc.XHOSTTYPE <hosttype0> <hosttype1> ...
If specified, gives the names of the host system type (see the HOSTTYPE argument above) for which not to include the test on. This check is performed after the check for the host system names in the HOSTTYPE list if it should exist. Therefore, this exclusion list overrides the HOSTTYPE inclusion list.EXCLUDE_IF_NOT_TRUE <varname0> <varname1> ...
If specified, gives the names of CMake variables that must evaluate to true, or the test will not be added.DISABLED <messageWhyDisabled>
If <messageWhyDisabled> is non-empty and does not evaluate to FALSE, then the test will be added by add_test() (so CTest will see it) but the ctest test property DISABLED will be set. Therefore, CTest will not run the test and will instead list it as "Not Run" when tests are run locally and when submitting test results to CDash (with test details "Not Run (Disabled)"). Also, the message <messageWhyDisabled> will be printed to STDOUT by cmake after the line stating the test was added when ${PROJECT_NAME}_TRACE_ADD_TEST=ON is set. If <messageWhyDisabled> evaluates to FALSE in CMake (e.g. "FALSE", "false", "NO", "no", "0", "", etc.), then the DISABLED property will not be set. This property can also be set with the CMake cache var -D <fullTestName>_SET_DISABLED_AND_MSG="<msgSetByVar>" and in fact that var will override the value of <messageWhyDisabled> passed in here (if <msgSetByVar> is non-empty). This allows a user to enable tests that are disabled in the CMakeList.txt files using this input.STANDARD_PASS_OUTPUT
If specified, then the test stdout is grepped for the standard test output string End Result: TEST PASSED to determine success. This is needed for MPI tests on some platforms since the return value from MPI executables is unreliable. This is set using the built-in CTest property PASS_REGULAR_EXPRESSION.

PASS_REGULAR_EXPRESSION "<regex0>" "<regex1>" ...
If specified, then the test will be assumed to pass only if one of the regular expressions <regex0>, <regex1> etc. match the output sent to stdout. Otherwise, the test will fail. This is set using the built-in CTest property PASS_REGULAR_EXPRESSION. Consult standard CMake documentation for full behavior. TIPS: Replace ';' with '[;]' or CMake will interpret this as an array element boundary. To match '.', use '[.]'.

FAIL_REGULAR_EXPRESSION "<regex0>" "<regex1>" ...
If specified, then a test will be assumed to fail if one of the regular expressions <regex0>, <regex1> etc. match the output sent to stdout. Otherwise, the test will pass. This is set using the built-in CTest property FAIL_REGULAR_EXPRESSION. Consult standard CMake documentation for full behavior (and see above tips for PASS_REGULAR_EXPRESSION).

WILL_FAIL
If passed in, then the pass/fail criteria will be inverted. This is set using the built-in CTest property WILL_FAIL. Consult standard CMake documentation for full behavior.ENVIRONMENT "<var1>=<value1>" "<var2>=<value2>" ....
If passed in, the listed environment variables will be set by CTest before calling the test. This is set using the built-in CTest test property ENVIRONMENT. Note, if the env var values contain semi-colons ';', then replace the semi-colons ';' with another separator '<sep>' and pass in LIST_SEPARATOR <sep> so <sep> will be replaced with ';' at point of usage. If the env var values contain any spaces, also quote the entire variable/value pair as "<vari>=<valuei>". For example, the env var and value my_env_var="arg1 b;arg2;I have spaces" would need to be passed as "my_env_var=arg1 b<sep>arg2<sep>I have spaces".TIMEOUT <maxSeconds>
If passed in, gives maximum number of seconds the test will be allowed to run before being timed-out and killed. This sets the CTest property TIMEOUT. The value <maxSeconds> will be scaled by the value of ${PROJECT_NAME}_SCALE_TEST_TIMEOUT. See Setting timeouts for tests (tribits_add_test()) for more details.
WARNING: Rather than just increasing the timeout for an expensive test, please try to either make the test run faster or relegate the test to being run less often (i.e. set CATEGORIES NIGHTLY or even HEAVY for extremely expensive tests). Expensive tests are one of the worst forms of technical debt that a project can have!
LIST_SEPARATOR <sep>
String used as placeholder for the semi-colon char ';' in order to allow pass-through. For example, if arguments to the ARGS or ENVIRONMENT need to use semi-colons, then replace ';' with '<semicolon>' (for example) such as with "somearg=arg1<semicolon>arg2", then at the point of usage, '<semicolon>' will be replaced with ';' and it will be passed to the final command as "somearg=arg1;arg2" (with as many preceding escape backslashes '\' in front of ';' as is needed for the given usage context).ADDED_TESTS_NAMES_OUT <testsNames>
If specified, then on output the variable <testsNames> will be set with the name(s) of the tests passed to add_test(). If more than one test is added, then this will be a list of test names. Having these names allows the calling CMakeLists.txt file to access and set additional test properties (see Setting additional test properties (tribits_add_test())).
In the end, this function just calls the built-in CMake commands add_test(${TEST_NAME} ...) and set_tests_properties(${TEST_NAME} ...) to set up an executable process for ctest to run, determine pass/fail criteria, and set some other test properties. Therefore, this wrapper function does not provide any fundamentally new features that are not already available in the basic usage of CMake/CTest. However, this wrapper function takes care of many of the details and boiler-plate CMake code that it takes to add such a test (or tests) and enforces consistency across a large project for how tests are defined, run, and named (to avoid test name clashes).
If more flexibility or control is needed when defining tests, then the function tribits_add_advanced_test() should be used instead.
In the following subsections, more details on how tests are defined and run are given.
Determining the Executable or Command to Run (tribits_add_test())
This function is primarily designed to make it easy to run tests for executables built using the function tribits_add_executable(). To set up tests to run arbitrary executables, see below.
By default, the executable to run is determined by first getting the executable name which by default is assumed to be:
<fullExeName> = ${PACKAGE_NAME}_<exeRootName>${${PROJECT_NAME}_CMAKE_EXECUTABLE_SUFFIX}
which is (by no coincidence) identical to how it is selected in tribits_add_executable() (see Executable and Target Name (tribits_add_executable())). This name can be altered by passing in NOEXEPREFIX, NOEXESUFFIX, and ADD_DIR_TO_NAME as described in Executable and Target Name (tribits_add_executable()).
By default, this executable is assumed to be in the current CMake binary directory ${CMAKE_CURRENT_BINARY_DIR} but the directory location can be changed using the DIRECTORY <dir> argument.
If an arbitrary executable is to be run (i.e. not built inside of the project), then pass in NOEXEPREFIX and NOEXESUFFIX and set <exeRootName> to the relative or absolute path of the executable to be run. If <exeRootName> is not an absolute path, then ${CMAKE_CURRENT_BINARY_DIR}/<exeRootName> is set as the executable to run in this case.
NOTE: On native Windows platforms, the NOEXESUFFIX will still allow CTest to run executables that have the *.exe suffix.
Whatever executable path is specified using this logic, if the executable is not found, then when ctest goes to run the test, it will mark it as NOT RUN.
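For instance (a hypothetical sketch; the script path and test name are made up for illustration), an arbitrary script in the source tree could be registered as a test with:

  tribits_add_test( ${CMAKE_CURRENT_SOURCE_DIR}/scripts/checkOutput.sh
    NOEXEPREFIX  NOEXESUFFIX
    NAME checkOutput
    NUM_MPI_PROCS 1
    )

Because an absolute path is given here, no prefix/suffix mangling or binary-directory lookup is applied to the command that is run.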
Determining the Full Test Name (tribits_add_test())
By default, the base test name is selected to be:
<fullTestName> = ${PACKAGE_NAME}_<exeRootName>
If NAME <testRootName> is passed in, then <testRootName> is used instead of <exeRootName> above.
If NAME_POSTFIX <testNamePostfix> is passed in, then the base test name is selected to be:
<fullTestName> = ${PACKAGE_NAME}_<exeRootName>_<testNamePostfix>
If ADD_DIR_TO_NAME is passed in, then the directory name relative to the package base directory is added to the name as well to help disambiguate the test name (see the above).
Let the test name determined as described above be <fullTestName>. If no arguments or only a single set of arguments are passed in through ARGS, then this is the test name actually passed in to add_test(). If multiple tests are defined, then this name becomes the base test name for each of the tests (see Adding Multiple Tests (tribits_add_test())).
Finally, for any test that gets defined, if MPI is enabled (i.e. TPL_ENABLE_MPI=ON), then the terminal suffix _MPI_${NUM_MPI_PROCS} will be added to the end of the test name (even for multiple tests). No such suffix is added for the serial case (i.e. TPL_ENABLE_MPI=OFF).
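As a quick illustration (the package and executable names are hypothetical), the naming rules above combine as follows:

  # Inside package 'FooPkg' with TPL_ENABLE_MPI=ON
  tribits_add_test( solverTest  NAME_POSTFIX belos  NUM_MPI_PROCS 4 )
  # Resulting ctest test name: FooPkg_solverTest_belos_MPI_4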
Adding Multiple Tests (tribits_add_test())
Using this function, one can add executable arguments and can even add multiple tests in one of two ways. One can either pass in one or more quoted clusters of arguments using:
ARGS "<arg0> <arg1> ..." "<arg2> <arg3> ..." ...
or can pass in an explicit test name postfix and arguments with:
POSTFIX_AND_ARGS_0 <postfix0> <arg0> <arg1> ... POSTFIX_AND_ARGS_1 <postfix1> <arg2> ... ...
If only one short set of arguments needs to be passed in, then passing:
ARGS "<arg0> <arg1>"
may be preferable since it will not add any postfix name to the test. To add more than one test case using ARGS, one will use more than one quoted set of arguments such as with:
ARGS "<arg0> <arg1>" "<arg2> <arg2>"
which creates 2 tests with the names <fullTestName>_00 passing arguments "<arg0> <arg1>" and <fullTestName>_01 passing arguments "<arg2> <arg3>". However, when passing multiple sets of arguments it is preferable to not use ARGS but instead use:
POSTFIX_AND_ARGS_0 test_a <arg0> <arg1>
POSTFIX_AND_ARGS_1 test_b <arg2> <arg3>
which also creates the same 2 tests but now with the improved names <fullTestName>_test_a passing arguments "<arg0> <arg1>" and <fullTestName>_test_b passing arguments "<arg2> <arg3>". In this way, the individual tests can be given more understandable names.
The other advantage of the POSTFIX_AND_ARGS_<IDX> form is that the arguments <arg0>, <arg1>, ... do not need to be quoted and can therefore be extended over multiple lines like:
POSTFIX_AND_ARGS_0 long_args --this-is-the-first-long-arg=very --this-is-the-second-long-arg=verylong
If one does not use quotes when using ARGS one will actually get more than one test. For example, if one passes in:
ARGS --this-is-the-first-long-arg=very --this-is-the-second-long-arg=verylong
one actually gets two tests, not one test. This is a common mistake that people make when using the ARGS form of passing arguments. This can't be fixed or it will break backward compatibility. If this could be designed fresh, the ARGS argument would only create a single test and the arguments would not be quoted.
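Putting this together, a sketch of the preferred multi-test form (all executable names, postfixes, and arguments below are hypothetical) is:

  tribits_add_test( vectorOpsTest
    POSTFIX_AND_ARGS_0  dot_prod  --op=dot  --n=1000
    POSTFIX_AND_ARGS_1  norm_two  --op=norm2
      --use-long-double
    NUM_MPI_PROCS 1
    )

This adds the two tests <PackageName>_vectorOpsTest_dot_prod and <PackageName>_vectorOpsTest_norm_two, with no quoting of the individual arguments required.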
Determining Pass/Fail (tribits_add_test())
The only means to determine pass/fail is to use the built-in CTest properties PASS_REGULAR_EXPRESSION and FAIL_REGULAR_EXPRESSION which can only grep the test's STDOUT/STDERR or to check for a 0 return value (or invert these using WILL_FAIL). For simple tests, that is enough. However, for more complex executables, one may need to examine one or more output files to determine pass/fail. Raw CMake/CTest cannot do this. In this case, one should use tribits_add_advanced_test() instead to add the test.
Setting additional test properties (tribits_add_test())
After this function returns, any tests that get added using add_test() can have additional properties set and changed using set_tests_properties(). Therefore, any tests properties that are not directly supported and passed through this wrapper function can be set in the outer CMakeLists.txt file after the call to tribits_add_test().
If tests are added, then the names of those tests will be returned in the variable ADDED_TESTS_NAMES_OUT <testsNames>. This can be used, for example, to set additional properties on the added tests, such as attaching a log file with:
tribits_add_test( someTest ...
  ADDED_TESTS_NAMES_OUT  someTest_TEST_NAME )

if (someTest_TEST_NAME)
  set_tests_properties( ${someTest_TEST_NAME}
    PROPERTIES ATTACHED_FILES someTest.log )
endif()
where the test writes a log file someTest.log that we want to submit to CDash also.
This approach will work no matter what TriBITS names the individual test(s) or whether the test(s) are added or not (depending on other arguments like COMM, XHOST, etc.).
The following built-in CTest test properties are set through Formal Arguments (tribits_add_test()) or are otherwise automatically set by this function and should NOT be overridden by direct calls to set_tests_properties(): ENVIRONMENT, FAIL_REGULAR_EXPRESSION, LABELS, PASS_REGULAR_EXPRESSION, RUN_SERIAL, TIMEOUT, and WILL_FAIL.
However, generally, other built-in CTest test properties can be set after the test is added as shown above. Examples of test properties that can be set using direct calls to set_tests_properties() include ATTACHED_FILES, ATTACHED_FILES_ON_FAIL, COST, DEPENDS, MEASUREMENT, RESOURCE_LOCK and WORKING_DIRECTORY.
For example, one can set a dependency between two tests using:
tribits_add_test( test_a [...]
  ADDED_TESTS_NAMES_OUT  test_a_TEST_NAME )

tribits_add_test( test_b [...]
  ADDED_TESTS_NAMES_OUT  test_b_TEST_NAME )

if (test_a_TEST_NAME AND test_b_TEST_NAME)
  set_tests_properties(${test_b_TEST_NAME}
    PROPERTIES DEPENDS ${test_a_TEST_NAME})
endif()
This ensures that test test_b will always be run after test_a if both tests are run by CTest.
Running multiple tests at the same time (tribits_add_test())
By default, CTest will run as many tests defined with add_test() at the same time as it can according to its parallel level (e.g. 'ctest -j<N>' or the CTest property CTEST_PARALLEL_LEVEL). For example, when raw 'ctest -j10' is run, CTest will run multiple tests at the same time to try to make use of 10 processors/cores. If all of the defined tests only used one process (which is assumed by default except for MPI tests), then CTest will run 10 tests at the same time and will launch new tests as running tests finish. One can also define tests which use more than one process or use more cores than the number of MPI processes. When passing in NUM_MPI_PROCS <numMpiProcs> (see above), this TriBITS function will set the built-in CTest property PROCESSORS to <numMpiProcs> using:
set_tests_properties(<fullTestName> PROPERTIES PROCESSORS <numMpiProcs>)
This tells CTest that the defined test uses <numMpiProcs> processes and CTest will use that information to not exceed the requested parallel level. For example, if several NUM_MPI_PROCS 3 tests are defined and CTest is run with 'ctest -j12', then CTest would schedule and run 4 of these tests at a time (to make use of 12 processors/cores on the machine), starting new tests as running tests finish, until all of the tests have been run.
There are some situations where a test will use more processes/cores than specified by NUM_MPI_PROCS <numMpiProcs> such as when the underlying executable fires off more processes in parallel to do processing. Also, an MPI program may use threading and therefore use overall more cores than the number of MPI processes. For these cases, it is critical to set NUM_TOTAL_CORES_USED <numTotalCoresUsed> to tell TriBITS and CTest how many cores will be used by a threaded test. This is needed to exclude the test if there are too many processes/cores needed to run the test than are available. Also, if the test is added, then this is needed to set the built-in CTest PROCESSORS property so CTest can avoid overloading the machine. For example, for test where the MPI executable running on 4 processes uses 10 threads per process, one would set:
NUM_MPI_PROCS 4 NUM_TOTAL_CORES_USED 40
In this case, it sets the CTest PROCESSORS property as:
set_tests_properties(<fullTestName> PROPERTIES PROCESSORS <numTotalCoresUsed>)
When the number of processes a test uses does not cleanly divide into the requested CTest parallel level, it is not clear how CTest schedules the tests (it is hard to find documentation on this, but one could always inspect the CTest source code to find out for sure). However, one boundary case that is well observed is that CTest will run all defined tests regardless of the size of the PROCESSORS property or the value of CTEST_PARALLEL_LEVEL. For example, if there are tests where PROCESSORS is set to 20 but 'ctest -j10' is used, then CTest will still run those tests (using 20 processes) but will not schedule any other tests while the parallel level is exceeded. This can obviously overload the machine. Therefore, always set MPI_EXEC_MAX_NUMPROCS to the maximum number of cores/processes that can be comfortably run on a given machine. Also note that MPI tests are very fragile on overloaded machines.
NOTE: Never manually override the PROCESSORS CTest property. Instead, always use NUM_TOTAL_CORES_USED <numTotalCoresUsed> to set this. This is important because TriBITS needs to know how many processes/cores are required for a test so that it can disable the test if the test requires more cores/processes than a given machine can handle or would exceed an imposed budget on the number of processes to be used. The latter is important when running multiple ctest -j<N> invocations on the same test machine.
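For example (a hypothetical MPI+threads test), a test that runs 4 MPI ranks with 10 threads per rank would be declared as:

  tribits_add_test( threadedSolverTest
    NUM_MPI_PROCS 4
    NUM_TOTAL_CORES_USED 40
    ENVIRONMENT OMP_NUM_THREADS=10
    )

so that the PROCESSORS property is set to 40 and CTest can schedule the test against the available cores accordingly.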
Setting timeouts for tests (tribits_add_test())
By default, all tests have a default timeout (1500 seconds for most projects, see DART_TESTING_TIMEOUT). That means that if they hang (e.g. as is common when deadlocking occurs with multi-process MPI-based tests and multi-threaded tests) then each test may hang for a long time, causing the overall test suite to take a long time to complete. For many CMake projects, this default timeout is way too long.
Timeouts for tests can be adjusted in a couple of ways. First, a default timeout for all tests is enforced by CTest given the configured value of the variable DART_TESTING_TIMEOUT (typically set by the user but set by default to 1500 for most projects). This is a global timeout that applies to all tests that don't otherwise have individual timeouts set using the TIMEOUT CTest property (see below). The value of DART_TESTING_TIMEOUT in the CMake cache on input to CMake will get scaled by ${PROJECT_NAME}_SCALE_TEST_TIMEOUT and the scaled value gets written into the file DartConfiguration.tcl as the field TimeOut. The ctest executable reads TimeOut from this file when it runs to determine the default global timeout. The value of this default global TimeOut can be overridden using the ctest argument --timeout <seconds> (see Overriding test timeouts).
Alternatively, timeouts for individual tests can be set using the input argument TIMEOUT (see Formal Arguments (tribits_add_test()) above). The timeout value passed in to this function is then scaled by ${PROJECT_NAME}_SCALE_TEST_TIMEOUT and the scaled timeout is then set as the CTest test property TIMEOUT. One can observe the value of this property in the CMake-generated file CTestTestfile.cmake in the current build directory. Individual test timeouts set this way are not impacted by the global default timeout DART_TESTING_TIMEOUT or the ctest argument --timeout <seconds>.
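For example (the test name is hypothetical), an individual expensive test can be given a 10-minute timeout and relegated to nightly testing with:

  tribits_add_test( meshConvergenceTest
    NUM_MPI_PROCS 4
    TIMEOUT 600
    CATEGORIES NIGHTLY
    )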
In summary, CTest determines the timeout for any individual test as follows: if the TIMEOUT test property is set for that test (e.g. through the TIMEOUT argument above), then that (already scaled) value is used; otherwise, if the ctest argument --timeout <seconds> is passed, then that value is used; otherwise, the global TimeOut value written into DartConfiguration.tcl (i.e. the scaled value of DART_TESTING_TIMEOUT) is used.
Debugging and Examining Test Generation (tribits_add_test())
In order to see what tests get added, and if not then why, configure with ${PROJECT_NAME}_TRACE_ADD_TEST=ON. That will print one line per test showing either that the test got added or, if not, why the test was not added (i.e. due to COMM, NUM_MPI_PROCS, CATEGORIES, HOST, XHOST, HOSTTYPE, or XHOSTTYPE).
Also, CMake writes a file CTestTestfile.cmake in the current binary directory which contains all of the added tests and test properties that are set. This is the file that is read by ctest when it runs to determine what tests to run, determine pass/fail, and adjust other behavior using test properties. In this file, one can see the exact add_test() and set_tests_properties() commands. This is the ultimate way to debug exactly what tests are getting added by this function (or if the test is even being added at all).
Disabling Tests Externally (tribits_add_test())
The test can be disabled externally by setting the CMake cache variable <fullTestName>_DISABLE=TRUE. This allows tests to be disabled on a case-by-case basis by the user (for whatever reason). Here, <fullTestName> must be the exact name that shows up in 'ctest -N' when running the test. If multiple tests are added in this function through multiple argument sets to ARGS or through multiple POSTFIX_AND_ARGS_<IDX> arguments, then <fullTestName>_DISABLE=TRUE must be set for each test individually. When a test is disabled in this way, TriBITS will always print a warning to the cmake stdout at configure time warning that the test is being disabled.
Adding extra commandline arguments externally (tribits_add_test())
One can add additional command-line arguments for any ctest test added using this function. In order to do so, set the CMake cache variable:
set(<fullTestName>_EXTRA_ARGS "<earg0>;<earg1>;<earg2>;..." CACHE STRING "Extra args")
in a *.cmake configure options fragment file or:
-D <fullTestName>_EXTRA_ARGS="<earg0>;<earg1>;<earg2>;..."
on the CMake command-line.
These extra command-line arguments are added after any arguments passed in through ARGS "<oarg0> <oarg1> ..." or POSTFIX_AND_ARGS_<IDX> <oarg0> <oarg1> .... This allows these extra arguments to override the earlier arguments.
The primary motivating use case for <fullTestName>_EXTRA_ARGS is to allow one to alter how a test runs on a specific platform or build. For example, this allows one to disable specific individual unit tests for a GTest executable such as with:
set(<fullTestName>_EXTRA_ARGS "--gtest_filter=-<unittest0>:<unittest1>:..." CACHE STRING "Disable specific unit tests" )
For example, this would be an alternative to disabling an entire unit testing executable using -D<fullTestName>_DISABLE=ON as described above.
In: core/test_support/TribitsAddTest.cmake:15
Macro called to add a set of test directories for a package.
Usage:
tribits_add_test_directories(<dir1> <dir2> ...)
This macro only needs to be called from the top most CMakeLists.txt file for which all subdirectories are all "tests".
This macro can be called several times within a package and it will have the right effect.
Currently, all this macro does is call add_subdirectory(<diri>) if ${PACKAGE_NAME}_ENABLE_TESTS is TRUE.
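A typical usage (a minimal sketch; the directory name is the common convention but is up to the package) at the bottom of a package's top-level CMakeLists.txt file is simply:

  # Adds the test/ subdir only when ${PACKAGE_NAME}_ENABLE_TESTS is ON
  tribits_add_test_directories(test)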
In: core/package_arch/TribitsPackageMacros.cmake:429
Allow listed packages to be missing and automatically excluded from the package dependency data-structures.
Usage:
tribits_allow_missing_external_packages(<pkg0> <pkg1> ...)
If the missing upstream package <pkgi> is optional, then the effect will be to simply ignore the missing package (i.e. it will never be added to package's list and not added to dependency data-structures) and remove it from the dependency lists for downstream packages that have an optional dependency on the missing upstream package <pkgi>. However, all downstream packages that have a required dependency on the missing upstream package <pkgi> will be hard disabled, i.e. ${PROJECT_NAME}_ENABLE_{CURRENT_PACKAGE}=OFF and a note on the disable will be printed.
WARNING: This macro just sets the cache variable <pkgi>_ALLOW_MISSING_EXTERNAL_PACKAGE=TRUE for each package <pkgi>. Therefore, using this function effectively turns off error checking for misspelled package names so it is important to only use it when it absolutely is needed (use cases mentioned below). Also note that missing packages are silently ignored by default. Therefore, when doing development involving these packages, it is usually a good idea to set:
-D<pkgi>_ALLOW_MISSING_EXTERNAL_PACKAGE=FALSE
so that it will catch errors in the misspelling of package names or source directories. However, notes on what missing packages are being ignored can be printed by configuring with:
-D <Project>_WARN_ABOUT_MISSING_EXTERNAL_PACKAGES=TRUE
This macro is typically called in one of two different contexts:
For some meta-projects that compose packages from many different TriBITS repositories, one might need to also call this function from the file <projectDir>/cmake/ProjectDependenciesSetup.cmake.
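For example (a sketch with hypothetical repository and package names, assuming the standard tribits_repository_define_packages() call used in PackagesList.cmake files), a <repoDir>/PackagesList.cmake file might contain:

  tribits_repository_define_packages(
    CorePkg      packages/core_pkg   PT
    ExternalPkg  ../ExternalPkg      ST
    )
  # ExternalPkg lives outside this repo and may not be cloned; allow it to be missing
  tribits_allow_missing_external_packages(ExternalPkg)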
In: core/package_arch/TribitsProcessPackagesAndDirsLists.cmake:113
Assert that a cache variable and a possible local variable (if it exists) have the same value.
Usage:
tribits_assert_cache_and_local_vars_same_value(<cacheVarName>)
If the local var <cacheVarName> and the cache var <cacheVarName> both exist but have different values, then message(SEND_ERROR ...) is called with an informative error message.
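A minimal (hypothetical) usage sketch:

  set(MyPkg_SOME_OPTION ON CACHE BOOL "Some package option")
  # Error out if a local MyPkg_SOME_OPTION variable shadows the cache
  # variable with a different value
  tribits_assert_cache_and_local_vars_same_value(MyPkg_SOME_OPTION)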
In: core/package_arch/TribitsPkgExportCacheVars.cmake:45
Macro that configures the package's main configured header file (typically called ${PACKAGE_NAME}_config.h but any name can be used).
Usage:
tribits_configure_file(<packageConfigFile>)
This function requires the file:
${PACKAGE_SOURCE_DIR}/cmake/<packageConfigFile>.in
exists and it creates the file:
${CMAKE_CURRENT_BINARY_DIR}/<packageConfigFile>
by calling the built-in configure_file() command:
configure_file(
  ${PACKAGE_SOURCE_DIR}/cmake/<packageConfigFile>.in
  ${CMAKE_CURRENT_BINARY_DIR}/<packageConfigFile>
  )
which does basic substitution of CMake variables (see documentation for built-in CMake configure_file() command for rules on how it performs substitutions). This command is typically used to configure the package's main <packageDir>/cmake/<packageName>_config.h.in file.
In addition to just calling configure_file(), this function also aids in creating configured header files by adding macros for deprecating code, as described below.
Deprecated Code Macros
If ${PARENT_PACKAGE_NAME}_SHOW_DEPRECATED_WARNINGS is TRUE (see tribits_add_show_deprecated_warnings_option()), then the local CMake variable ${PARENT_PACKAGE_NAME_UC}_DEPRECATED_DECLARATIONS is set which adds a define <PARENT_PACKAGE_NAME_UC>_DEPRECATED (where <PARENT_PACKAGE_NAME_UC> is the package name in all upper-case letters) which adds a compiler-specific deprecated warning for an entity. To take advantage of this, just add the line:
@<PARENT_PACKAGE_NAME_UC>_DEPRECATED_DECLARATIONS@
to the <packageConfigFile>.in file and it will be expanded at configure time.
Then C/C++ code can use this macro to deprecate functions, variables, classes, etc., for example, using:
<PARENT_PACKAGE_NAME_UC>_DEPRECATED class SomeDeprecatedClass { ... };
If the particular compiler does not support deprecated warnings, then this macro is defined to be empty. See Regulated Backward Compatibility and Deprecated Code for more details.
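For example (a hypothetical package layout; the package name and use of raw include_directories() are illustrative only), given a file <packageDir>/cmake/MyPkg_config.h.in, the package's src/CMakeLists.txt might contain:

  tribits_configure_file(MyPkg_config.h)
  # The generated header lands in the current binary directory
  include_directories(${CMAKE_CURRENT_BINARY_DIR})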
In: core/package_arch/TribitsConfigureFile.cmake:27
Function that copies a list of files from a source directory to a destination directory at configure time, typically so that it can be used in one or more tests.
Usage:
tribits_copy_files_to_binary_dir( <targetName>
  [SOURCE_FILES <file1> <file2> ...]
  [SOURCE_DIR <sourceDir>]
  [DEST_FILES <dfile1> <dfile2> ...]
  [DEST_DIR <destDir>]
  [TARGETDEPS <targDep1> <targDep2> ...]
  [EXEDEPS <exeDep1> <exeDep2> ...]
  [NOEXEPREFIX]
  [CATEGORIES <category1> <category2> ...]
  )
This sets up all of the custom CMake commands and targets to ensure that the files in the destination directory are always up to date just by building the ALL target.
NOTE: The target name <targetName> must be unique from all other targets in the same TriBITS Package. Otherwise, one will get a configure failure complaining that a target name has already been defined. Therefore, be sure to pick long and unique target names!
This function has a few valid calling modes:
1) Source files and destination files have the same name:
tribits_copy_files_to_binary_dir( <targetName>
  SOURCE_FILES <file1> <file2> ...
  [SOURCE_DIR <sourceDir>]
  [DEST_DIR <destDir>]
  [TARGETDEPS <targDep1> <targDep2> ...]
  [EXEDEPS <exeDep1> <exeDep2> ...]
  [NOEXEPREFIX]
  [CATEGORIES <category1> <category2> ...]
  )
In this case, the names of the source files and the destination files are the same but just live in different directories.
2) Source files have a prefix different from the destination files:
tribits_copy_files_to_binary_dir( <targetName>
  DEST_FILES <file1> <file2> ...
  SOURCE_PREFIX <srcPrefix>
  [SOURCE_DIR <sourceDir>]
  [DEST_DIR <destDir>]
  [EXEDEPS <exeDep1> <exeDep2> ...]
  [NOEXEPREFIX]
  [CATEGORIES <category1> <category2> ...]
  )
In this case, the source files have the same basic name as the destination files except they have the prefix <srcPrefix> prepended to the name.
3) Source files and destination files have completely different names:
tribits_copy_files_to_binary_dir( <targetName>
  SOURCE_FILES <sfile1> <sfile2> ...
  [SOURCE_DIR <sourceDir>]
  DEST_FILES <dfile1> <dfile2> ...
  [DEST_DIR <destDir>]
  [EXEDEPS <exeDep1> <exeDep2> ...]
  [NOEXEPREFIX]
  [CATEGORIES <category1> <category2> ...]
  )
In this case, the source files and destination files have completely different prefixes.
The individual arguments are:
SOURCE_FILES <file1> <file2> ...
Listing of the source files relative to the source directory given by the argument SOURCE_DIR <sourceDir>. If omitted, this list will be the same as DEST_FILES with the argument SOURCE_PREFIX <srcPrefix> prepended to each name.

SOURCE_DIR <sourceDir>
Optional argument that gives the (absolute) base directory for all of the source files. If omitted, this takes the default value of ${CMAKE_CURRENT_SOURCE_DIR}.DEST_FILES <file1> <file2> ...
Listing of the destination files relative to the destination directory given by the argument DEST_DIR <destDir>. If omitted, this list will be the same as given by the SOURCE_FILES list.DEST_DIR <destDir>
Optional argument that gives the (absolute) base directory for all of the destination files. If omitted, this takes the default value of ${CMAKE_CURRENT_BINARY_DIR}.

TARGETDEPS <targDep1> <targDep2> ...
Listing of general CMake targets that these files will be added as dependencies to. This results in the copies being performed when any of the targets <targDepi> are built.

EXEDEPS <exeDep1> <exeDep2> ...
Listing of executable targets that these files will be added as dependencies to. By default, the prefix ${PACKAGE_NAME}_ will be prepended to the names of these targets. This ensures that if the executable target is built then these files will also be copied as well.

NOEXEPREFIX
Option that determines if the prefix ${PACKAGE_NAME}_ will be prepended to the arguments in the EXEDEPS list.
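As a sketch of calling mode 1 above (the target, file, and executable names are hypothetical):

  tribits_copy_files_to_binary_dir( MyPkgSolverTestCopyFiles
    SOURCE_FILES  input_mesh.xml  solver_params.xml
    SOURCE_DIR    ${CMAKE_CURRENT_SOURCE_DIR}/data
    EXEDEPS       solverTest
    )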
In: core/package_arch/TribitsCopyFilesToBinaryDir.cmake:14
Universal platform-independent CTest/CDash driver function for CTest -S scripts for TriBITS projects
Usage (in <script>.cmake file run with CTest -S <script>.cmake):
# Set some basic vars and include tribits_ctest_driver()
set(TRIBITS_PROJECT_ROOT "${CMAKE_CURRENT_LIST_DIR}/../../../..")
include(
  "${TRIBITS_PROJECT_ROOT}/cmake/tribits/ctest_driver/TribitsCTestDriverCore.cmake")

# Set variables that define this build
set(CTEST_BUILD_NAME <buildName>)
set(CTEST_TEST_TYPE Nightly)
set(CTEST_DASHBOARD_ROOT PWD)
set(MPI_EXEC_MAX_NUMPROCS 16)
set(CTEST_BUILD_FLAGS "-j16")
set(CTEST_PARALLEL_LEVEL 16)
set(${PROJECT_NAME}_REPOSITORY_LOCATION <git-url-to-the-base-git-repo>)
[... Set other vars ...]

# Call the driver script to handle the rest
tribits_ctest_driver()
This platform independent code is used in CTest -S scripts to drive the testing process for submitting to CDash for a TriBITS project.
This function drives the following operations:
After each of these steps, results are submitted to CDash if CTEST_DO_SUBMIT = TRUE and otherwise no data is submitted to any CDash site (which is good for local debugging of CTest -S driver scripts). For the package-by-package mode these steps 7-11 for configure, build, and running tests shown above are actually done in a loop package-by-package with submits for each package to be tested. For the all-at-once mode, these steps are done all at once for the selected packages to be tested and results are submitted to CDash all-at-once for all packages together (see All-at-once versus package-by-package mode (tribits_ctest_driver())).
For context for how this function is used, see:
Also note that this function executes Reduced Package Dependency Processing so all of the files described in that process are read in while this function runs. This processing is needed to determine the TriBITS package dependency graph and to determine the set of packages to be enabled or disabled when determining the set of packages to be tested.
Sections:
The following is an alphabetical listing of all of the variables that impact the behavior of the function tribits_ctest_driver() with links to their more detailed documentation:
List of all variables (tribits_ctest_driver()):
Setting variables (tribits_ctest_driver()):
Variables can be set to control the behavior of this function before the function is called. Some variables must be set in the CTest -S driver script before calling this function tribits_ctest_driver(). Many variables have a default value that will work in most cases.
In general, these variables fall into one of three different categories:
Which variables are which are described below for each variable.
Source and Binary Directory Locations (tribits_ctest_driver()):
To understand how to set the source and binary directories, one must understand that CTest -S scripts using this function get run in one of two different modes:
Mode 1: Run where there are already existing source and binary directories (i.e. CTEST_DASHBOARD_ROOT is set empty before the call). In this case, CTEST_SOURCE_DIRECTORY and CTEST_BINARY_DIRECTORY must be set by the user before calling this function (and CTEST_DASHBOARD_ROOT is empty). This mode is typically used to test a local build or an existing cloned and set-up source tree and post to CDash (see the custom dashboard target in Dashboard Submissions).
Mode 2: A new binary directory is created and optionally new sources are cloned or updated under a driver directory (i.e. CTEST_DASHBOARD_ROOT is set before the call and that directory will be created if it does not already exist). In this case, there are typically two (partial) project source trees: a) the "driver" skeleton source tree (typically with an embedded tribits/ directory) that bootstraps the testing process and contains the CTest -S driver script, and b) the full "source" tree that is (optionally) cloned and/or updated and is directly configured, built, and tested. This mode can also handle the case where the source tree is already set up in the location pointed to by CTEST_SOURCE_DIRECTORY and CTEST_DO_SUBMIT is set to FALSE, so this mode can get away with a single source tree and can handle a variety of use cases that may pre-manipulate the source tree before tribits_ctest_driver() is run.
There are a few different directory locations that are significant for this script used in one or both of the modes described above:
TRIBITS_PROJECT_ROOT=<projectDir>.
The root directory to an existing source tree where the project's <projectDir>/ProjectName.cmake (defining the PROJECT_NAME variable) and <projectDir>/Version.cmake files can be found. This can be set() in the CTest -S script or override as an env var. The default and env override is set for this during the include() of the module TribitsCTestDriverCore.cmake.${PROJECT_NAME}_TRIBITS_DIR=<tribits-dir>
The base directory for the TriBITS system's various CMake modules, python scripts, and other files. By default this is assumed to be ${TRIBITS_PROJECT_ROOT}/cmake/tribits. This can be set() in the CTest -S script or overridden as an env var. The default and env override is set for this during the include() of TribitsCTestDriverCore.cmake.CTEST_DASHBOARD_ROOT=<dashboard-root-dir>
If set, this is the base directory where this script runs that clones the sources for the project. If this directory does not exist, it will be created. If provided as the special value PWD, then the present working directory is used. If empty, then this var has no effect. This can be set() in CTest -S script before the call to tribits_ctest_driver() or override as an env var.CTEST_SOURCE_NAME=<src-dir-name>
The name of the source directory. This can be set() in the CTest -S script before the call to tribits_ctest_driver() or overridden as an env var. By default, this is set to ${PROJECT_NAME}.CTEST_SOURCE_DIRECTORY=<src-dir-full-path>
Built-in CTest variable that determines the location of the sources that are used to define packages, dependencies and configure, build, and test the software. This is a variable that CTest directly reads and must therefore be set. This is used to set PROJECT_SOURCE_DIR which is used by the TriBITS system. If CTEST_DASHBOARD_ROOT is set, then this is hard-coded internally to ${CTEST_DASHBOARD_ROOT}/${CTEST_SOURCE_NAME} and will therefore override any value that might be set in the CTest -S driver script. However, if CTEST_DASHBOARD_ROOT is empty when TribitsCTestDriverCore.cmake is included(), then by default it set to ${TRIBITS_PROJECT_ROOT}. This can only be set() in the CTest -S driver script and is not overridden as an env var. The only way to override in the ENV is to indirectly set through ${CTEST_DASHBOARD_ROOT}.CTEST_BINARY_DIRECTORY=<binary-dir-full-path>
Built-in CTest variable that determines the location of the binary tree where output from CMake/CTest is put. This is used to set PROJECT_BINARY_DIR which is used by the TriBITS system, and this variable is directly read by CTest itself. If CTEST_DASHBOARD_ROOT is set, then this is hard-coded internally to ${CTEST_DASHBOARD_ROOT}/BUILD (overwriting any existing value of CTEST_BINARY_DIRECTORY). If CTEST_BINARY_DIRECTORY is empty when TribitsCTestDriverCore.cmake is included(), then by default it is set to $ENV{PWD}/BUILD. CTEST_BINARY_DIRECTORY cannot be overridden in the env.
Determining What Packages Get Tested (tribits_ctest_driver()):
Before any testing is done, the set of packages to be tested is determined. This determination uses the basic TriBITS Dependency Handling Behaviors and logic. By default, the set of packages to be tested and otherwise explicitly processed is determined by the vars (which can also be set as env vars):
${PROJECT_NAME}_PACKAGES=<pkg0>,<pkg1>,...
A semi-colon ';' or comma ',' separated list of packages that determines the specific set of packages to test. If left at the default value of empty "", then ${PROJECT_NAME}_ENABLE_ALL_PACKAGES is set to ON and that enables packages as described in <Project>_ENABLE_ALL_PACKAGES enables all PT (cond. ST) packages. This variable can use ',' to separate package names instead of ';'. The default value is empty "".${PROJECT_NAME}_ADDITIONAL_PACKAGES=<pkg0>,<pkg1>,...
If ${PROJECT_NAME}_PACKAGES is empty (and therefore ${PROJECT_NAME}_ENABLE_ALL_PACKAGES=ON is set), then additional packages not enabled in that logic can be listed in ${PROJECT_NAME}_ADDITIONAL_PACKAGES and they will be tested as well. For example, this would be used when there are some additional ST or EX packages that should be tested in a PT build (e.g. ${PROJECT_NAME}_ENABLE_SECONDARY_TESTED_CODE=FALSE). The default value is empty "".

${PROJECT_NAME}_PACKAGE_ENABLES_FILE=<filepath>
A file that is expected to define a set of set() statements to enable a set of packages. The set of packages enabled will determine what packages are specifically processed and tested (according to other options as well). NOTE: To get this set of enables passed to the inner configure, also list this file in the inner configure cache variable ${PROJECT_NAME}_CONFIGURE_OPTIONS_FILE (see passing such options through in Setting variables in the inner CMake configure (tribits_ctest_driver())). This is used instead of the variable ${PROJECT_NAME}_PACKAGES to specify the set of packages to enable and test. (If both ${PROJECT_NAME}_PACKAGES and ${PROJECT_NAME}_PACKAGE_ENABLES_FILE are set, then a fatal error will occur.) The default value is empty "".

${PROJECT_NAME}_ENABLE_ALL_FORWARD_DEP_PACKAGES=[TRUE|FALSE]
If set to TRUE, then all of the downstream packages from those specified in ${PROJECT_NAME}_PACKAGES will be enabled (see <Project>_ENABLE_ALL_FORWARD_DEP_PACKAGES enables downstream packages/tests). The default value is FALSE unless CTEST_ENABLE_MODIFIED_PACKAGES_ONLY=TRUE is set in which case the default value is TRUE.${PROJECT_NAME}_ENABLE_SECONDARY_TESTED_CODE=[TRUE|FALSE]
If set to TRUE, then ST packages will get enabled in automated logic in the outer determination of what packages to get tested. This value also gets passed to the inner CMake configure. The default value is OFF.${PROJECT_NAME}_EXCLUDE_PACKAGES=<pkg0>,<pkg1>,...
A semi-colon ';' or comma ',' separated list of packages NOT to enable when determining the set of packages to be tested. NOTE: Listing packages here will not disable the package in the inner CMake configure when using the package-by-packages approach. To do that, you will have to disable them in the variable EXTRA_CONFIGURE_OPTIONS (set in your driver script). But for the all-at-once approach this list of package disables IS pass into the inner configure.${PROJECT_NAME}_DISABLE_ENABLED_FORWARD_DEP_PACKAGES=[TRUE|FALSE]
If set to ON (or TRUE), then if there are conflicts between explicit enables and disables then explicit disables will override the explicit enables (see Disables trump enables where there is a conflict). The default is ON and likely should not be changed. The default value is ON.CTEST_EXPLICITLY_ENABLE_IMPLICITLY_ENABLED_PACKAGES=[TRUE|FALSE]
If set to TRUE, then all of the upstream packages for those selected to be explicitly tested will be processed with results posted to CDash. The default is TRUE unless CTEST_ENABLE_MODIFIED_PACKAGES_ONLY==TRUE. Most builds that specify a specific set of packages in ${PROJECT_NAME}_PACKAGES should likely set this to FALSE.
NOTE: Any and all of the above vars can be set as env vars and they will override the value set inside the CTest -S script with set() (or set_default()) statements. Also, for any of the vars that take a list, the CMake standard semi-colon char ';' can be used to separate list items, or commas ',' can be used so that they can be used when setting env vars. (The commas ',' are then replaced with semi-colons ';' internally before being interpreted as a list by CMake.)
The other mode for selecting the set of packages to be tested is to only test the packages that have changes since the last time this build was run and testing packages that previously failed. That mode is turned on by the var:
CTEST_ENABLE_MODIFIED_PACKAGES_ONLY=[TRUE|FALSE]
If TRUE, then only packages that have changes pulled from the git repos since the last time the build ran will be tested (in addition to packages that failed in the last build). If FALSE, the set of packages to be tested is determined by ${PROJECT_NAME}_PACKAGES and other variables as described above.
Setting variables in the inner CMake configure:
It is important to understand that none of the CMake vars that get set in the outer CTest -S program that calls tribits_ctest_driver() automatically get passed into the inner configure of the TriBITS CMake project using the ctest_configure() command by CMake. From the perspective of raw CTest and CMake, these are completely separate programs. However, the tribits_ctest_driver() function will forward a subset of variables documented below into the inner CMake configure. The following variables that are set in the outer CTest -S program will be passed into the inner CMake configure by default (but their values can be overridden by options listed in EXTRA_SYSTEM_CONFIGURE_OPTIONS or EXTRA_CONFIGURE_OPTIONS):
-D${PROJECT_NAME}_IGNORE_MISSING_EXTRA_REPOSITORIES=ON
Missing extra repos are always ignored in the inner CMake configure. This is because any problems reading an extra repo will be caught in the outer CTest -S driver script.-D${PROJECT_NAME}_ENABLE_ALL_OPTIONAL_PACKAGES:BOOL=ON
Because of the behavior of the package-by-package mode, currently, this is hard-coded to ON. (This set may be removed in the future for the all-at-once mode.)-D${PROJECT_NAME}_ALLOW_NO_PACKAGES:BOOL=ON
This is currently set for the package-by-package mode since some packages may get disabled because required upstream dependent packages may be disabled. (This set may be removed in the future for the all-at-once mode.)
The following variables set in the outer CTest -S driver script will be passed down into the inner CMake configure through the OPTIONS variable to the ctest_configure() command:
Arbitrary options can be set to be passed into the inner CMake configure after the above options are passed by setting the following variables:
EXTRA_SYSTEM_CONFIGURE_OPTIONS
Additional list of system-specific options to be passed to the inner CMake configure. This must be set in the CTest -S driver script with a set() statement (i.e. env var is not read). These options get added after all of the above pass-through options so they can override any of those options. WARNING: Do not include any semicolons ';' in these arguments (see below WARNING).EXTRA_CONFIGURE_OPTIONS
Additional list of extra cmake configure options to be passed to the inner CMake configure. This must be set in the CTest -S driver script with a set() statement (i.e. env var is not read). These options get added after all of the above pass-through options and the options listed in EXTRA_SYSTEM_CONFIGURE_OPTIONS so they can override any of those options. WARNING: Do not include any semicolons ';' in these arguments (see below WARNING).${PROJECT_NAME}_EXTRA_CONFIGURE_OPTIONS:
A yet additional list of extra cmake configure options to be passed to the inner CMake configure after all of the others. Unlike the above options, this var is read from the env and allows the user to set arbitrary configure options that overrides all others. WARNING: Do not include any semicolons ';' in these arguments (see below WARNING).
These configure options are passed into the ctest_configure() command in the order:
<initial options> ${EXTRA_SYSTEM_CONFIGURE_OPTIONS} \
  ${EXTRA_CONFIGURE_OPTIONS} ${${PROJECT_NAME}_EXTRA_CONFIGURE_OPTIONS}
WARNING: The options listed in EXTRA_SYSTEM_CONFIGURE_OPTIONS, EXTRA_CONFIGURE_OPTIONS, and ${PROJECT_NAME}_EXTRA_CONFIGURE_OPTIONS should not contain any semi-colons ';' or they will be interpreted as array bounds and mess up the arguments when passed to the inner CMake configure. To avoid problems with spaces and semicolons, it is usually a good idea to put these cache vars into *.cmake file fragments and then pass them through using the variable <Project>_CONFIGURE_OPTIONS_FILE as:
-D<Project>_CONFIGURE_OPTIONS_FILE=<optionsfile1>.cmake,<optionsfile2>.cmake,...
or using the built-in CMake option:
-C<abs-base>/<optionsfile1>.cmake -C<abs-base>/<optionsfile2>.cmake ...
NOTE: The full list of options passed into the inner CMake is printed out before calling ctest_configure() so any issues setting options and the ordering of options can be seen in that printout.
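For instance (a hypothetical fragment; the project name and cache values are illustrative only), such an <optionsfile>.cmake fragment might contain:

  set(TPL_ENABLE_MPI  ON  CACHE BOOL  "")
  set(CMAKE_CXX_FLAGS  "-O3 -Wall"  CACHE STRING  "")
  set(MyProject_ENABLE_SECONDARY_TESTED_CODE  ON  CACHE BOOL  "")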
When run, tribits_ctest_driver() always performs a configure and build but other actions are optional. By default, a version control update (or clone) is performed as well as running tests and submitting results to CDash. But the version control update, the running tests and submitting results to CDash can be disabled. Also, coverage testing and memory testing are not performed by default but they can be turned on with results submitted to CDash. These actions are controlled by the following variables (which can be set in the CTest -S script before calling tribits_ctest_driver() and can be overridden by env vars of the same name):
CTEST_DO_NEW_START=[TRUE|FALSE]
If TRUE, ctest_start() is called to set up a new "dashboard" (i.e. define a new CDash build with a unique Build Stamp defined in the Testing/TAG file). If FALSE, then ctest_start(... APPEND) is called which allows this ctest -S invocation to append results to an existing CDash build. (See ???). Default TRUE.

CTEST_DO_UPDATES=[TRUE|FALSE]
If TRUE, then the source repos will be updated as specified in Repository Updates (tribits_ctest_driver()). Default TRUE.CTEST_UPDATE_ARGS
Any extra arguments to use with git clone to clone the base git repo. The default value is empty "". This is only used for the base git repo (not the extra repos).CTEST_UPDATE_VERSION_ONLY:
Built-in CTest variable that if set to TRUE will change the default behavior of ctest_update() such that it will not clone or pull from the remote repo or update the local branch in any way. This also skips any actions on extra repos and skips the creation of the Updates.txt or UpdateCommandsOutput.txt files. Setting this to TRUE along with CTEST_DO_UPDATES=ON and doing a submit to CDash will result in the "Revision" column being present with the Git SHA1 of the base repo (and a hyperlink to the commit in the public git repo). This is useful when used with a CI testing system that handles all of the git repo manipulation like GitHub Actions, GitLab CI, or Jenkins.

CTEST_START_WITH_EMPTY_BINARY_DIRECTORY=[TRUE|FALSE]
If TRUE, then if the binary directory ${CTEST_BINARY_DIRECTORY} already exists, it will be cleaned out using the CTest command ctest_empty_binary_directory(). However, this can be set to FALSE in which case a rebuild (using existing object files, libraries, etc.) will be performed, which is useful when using an incremental CI server. But this is ignored if CTEST_DO_NEW_START=FALSE. Default TRUE (which is the most robust option).

CTEST_DO_CONFIGURE=[TRUE|FALSE]
If TRUE, then the selected packages will be configured. If FALSE, it is assumed that a relevant configure is already in place in the binary directory if a build or running tests is to be done. Note that for the package-by-package mode, a configure is always done if a build or any testing is to be done but results will not be sent to CDash unless CTEST_DO_CONFIGURE=TRUE. Default TRUE.CTEST_WIPE_CACHE=[TRUE|FALSE]
If TRUE, then ${CTEST_BINARY_DIRECTORY}/CMakeCache.txt and ${CTEST_BINARY_DIRECTORY}/CMakeFiles/ will be deleted before performing a configure. (This value is set to FALSE by the make dashboard target that does an experimental build, test, and submit to CDash.) Default TRUE (which is the most robust option in general).CTEST_DO_BUILD=[TRUE|FALSE]
If TRUE, then the selected packages will be built. If FALSE, it is assumed that a relevant build is already in place in the binary directory if any testing is to be done. Default TRUE.

CTEST_BUILD_FLAGS
Built-in CTest variable that gives the flags passed to the build command called inside of the built-in CTest command ctest_build(). The default is -j2 when CTEST_CMAKE_GENERATOR is set to Unix Makefiles. Otherwise, the default is empty "". Useful options to set are -j<N> (to build in parallel) and -k (to keep going when there are build errors so we can see all of the build errors). When CTEST_CMAKE_GENERATOR is set to Ninja, the -j<N> option can be left off (in which case all of the available unloaded cores are used to build) and the option -k 999999 can be used to build all targets when there are build failures.

CTEST_DO_INSTALL=[TRUE|FALSE]
If TRUE, then -DCMAKE_SKIP_INSTALL_ALL_DEPENDENCY=ON will be passed to the inner CMake configure and the 'install_package_by_package' target will be built to install what has been configured and built by the build step for the all-at-once mode (i.e. ${PROJECT_NAME}_CTEST_DO_ALL_AT_ONCE=TRUE). If FALSE, then -DCMAKE_SKIP_INSTALL_ALL_DEPENDENCY=ON is not added to the inner configure and no install is performed. (NOTE: The cmake var CMAKE_INSTALL_PREFIX must be set on the inner cmake configure for this to work correctly. Also, the install is currently not implemented for the package-by-package mode ${PROJECT_NAME}_CTEST_DO_ALL_AT_ONCE=FALSE and this option will simply be ignored in that case.) Default FALSE.
CTEST_DO_TEST=[TRUE|FALSE]
If TRUE, then ctest_test() will be called and test results will be submitted to CDash. This should be set to FALSE when one wants to only test the configure and build of a project but not run any tests (e.g. when cross compiling or if the tests are too expensive to run). The default value is TRUE.
CTEST_PARALLEL_LEVEL=<num>
The parallel level passed in the PARALLEL_LEVEL argument to ctest_test() AND ctest_memcheck(). The default value is 1 (one).
${PROJECT_NAME}_INNER_ENABLE_TESTS
If OFF, then ${PROJECT_NAME}_ENABLE_TESTS=OFF will be passed to the inner CMake configure. This will avoid the building of all tests and examples for the enabled packages and also no CTest tests will be defined using calls to add_test() in the inner project configure. This results in just the building of the libraries and non-test, non-example executables (which will be much faster for some projects). The default value is ON (in which case all of the test and example executables and other targets will get built for all of the explicitly enabled packages and the associated ctest tests will be defined and run).
${PROJECT_NAME}_SKIP_CTEST_ADD_TEST:
If set to TRUE, then ${PROJECT_NAME}_SKIP_CTEST_ADD_TEST=TRUE is passed in to the inner CMake configure. This will result in all of the test and example executables for the enabled packages being built but no ctest tests getting defined or run by skipping the inner CMake calls to add_test(). Setting this to TRUE allows all of the libraries, production executables and the test and example executables and other targets to get built, but no tests will be run. However, when CTEST_DO_TEST=ON, the ctest_test() command will still be run and test results will still be submitted to CDash, which will report zero tests. This avoids the test results being reported as missing on CDash for tools like ctest_analyze_and_report.py. The default value is FALSE (in which case any enabled tests or examples in the explicitly enabled packages will get run).
CTEST_DO_COVERAGE_TESTING=[TRUE|FALSE]
If TRUE, then ctest_coverage() is called to collect coverage and submit results generated from the previous ctest_test() command. Setting this to TRUE also results in -D${PROJECT_NAME}_ENABLE_COVERAGE_TESTING=ON getting passed down to the inner CMake configure of the project (i.e. so that the executables are instrumented to generate coverage data when run by the tests in the ctest_test() command). (Default is OFF.)
CTEST_COVERAGE_COMMAND
Built-in CTest variable that determines the command that is run by ctest_coverage() to collect coverage results. The default value is gcov.
CTEST_DO_MEMORY_TESTING=[TRUE|FALSE]
If TRUE, then ctest_memcheck() is called to run the test suite with the memory checking tool and results submitted to CDash.
CTEST_MEMORYCHECK_COMMAND
Built-in CTest variable that determines the command that is used to run the command for each test run by the ctest_memcheck() command. If valgrind is found on the local system, then that is used by default. Otherwise, the default is empty "".
CTEST_MEMORYCHECK_COMMAND_OPTIONS
Built-in CTest variable that determines what options are passed to the memory checking command before the actual test command. The default value is empty "".
CTEST_GENERATE_OUTER_DEPS_XML_OUTPUT_FILE=[TRUE|FALSE]
If TRUE, then the <Project>PackageDependencies.xml file will be generated in the outer CTest -S program. This file is used to help determine what packages have changed and need to be tested when in CI mode (e.g. when CTEST_ENABLE_MODIFIED_PACKAGES_ONLY=TRUE is set). It is also needed to generate the CDashSubprojectDependencies.xml file that gets submitted to CDash to inform it of the list of subprojects and subproject dependencies (i.e. TriBITS packages). The default value is TRUE.
CTEST_SUBMIT_CDASH_SUBPROJECTS_DEPS_FILE=[TRUE|FALSE]
If TRUE, then the CDash subprojects XML file is generated and submitted to CDash. This file tells CDash about the subproject (i.e. TriBITS package) structure. The default value is TRUE.
CTEST_DO_SUBMIT=[TRUE|FALSE]
If TRUE, then all of the results generated locally are submitted to CDash using ctest_submit(). One can set this to FALSE when locally debugging a CTest -S driver script to avoid spamming CDash. The default value is TRUE. (NOTE: This may submit to more than one CDash site as noted in Specifying where the results go to CDash (tribits_ctest_driver())).
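As a concrete illustration, the following is a minimal sketch of a CTest -S driver script that sets a few of the above options before calling tribits_ctest_driver(). The project name MyProject, the WORKSPACE env var, and all of the paths (including the location of TribitsCTestDriverCore.cmake under the project's copy of TriBITS) are hypothetical placeholders, not a prescribed layout:

  # my_project_ctest_driver.cmake (hypothetical file name)
  set(CTEST_SOURCE_DIRECTORY "$ENV{WORKSPACE}/MyProject")
  set(CTEST_BINARY_DIRECTORY "$ENV{WORKSPACE}/BUILD")
  set(CTEST_CMAKE_GENERATOR "Ninja")
  set(CTEST_BUILD_FLAGS "-k 999999")
  set(CTEST_DO_UPDATES TRUE)   # Clone/update the repos (default)
  set(CTEST_DO_TEST TRUE)      # Run the test suite (default)
  set(CTEST_DO_SUBMIT TRUE)    # Submit results to CDash (default)
  # Pull in the TriBITS CTest driver (path is project-specific)
  include("${CTEST_SOURCE_DIRECTORY}/cmake/tribits/ctest_driver/TribitsCTestDriverCore.cmake")
  tribits_ctest_driver()

Any of the set() defaults above can still be overridden by exporting env vars of the same name before running ctest -S, as described above.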
Determining how the results are displayed on CDash (tribits_ctest_driver()):
These options primarily determine how the VC update, configure, build, test, and other submitted results are displayed on CDash (but not which CDash site(s) or project(s) they are submitted to; see Specifying where the results go to CDash (tribits_ctest_driver())). These options can all be set in the CTest -S script using set() statements before tribits_ctest_driver() is called and can be overridden in the env when running the CTest -S driver script (a hypothetical env-override example is shown after this list).
CTEST_TEST_TYPE=[Nightly|Continuous|Experimental]
Determines the model for the build. This value is passed in as the first argument to the built-in CTest function ctest_start(). Valid values include Nightly, Continuous, and Experimental. As far as CTest is concerned, the only real impact this CTest "Model" has is on setting the time stamp in the build stamp field (which is stored in the file Testing/TAG). For the model Nightly, the time stamp in the build stamp is taken from the variable CTEST_NIGHTLY_START_TIME read in from the file <projectDir>/CTestConfig.cmake. Otherwise, the time stamp used is the current build start time. (The reason this is significant is that builds on CDash that have the same site, buildname, and build stamp are considered the same build and will combine results.) This also defines the default value for ${PROJECT_NAME}_TRACK (see below) as well as the default value for ${PROJECT_NAME}_ENABLE_KNOWN_EXTERNAL_REPOS_TYPE. The default value is Experimental.
${PROJECT_NAME}_TRACK=<cdash-group>
Specifies the testing track that determines the CDash group under which the results are displayed (i.e. the "Group" filter field on CDash). This is the value used for the (deprecated) TRACK argument (renamed GROUP in CMake/CTest versions 3.16+) of the built-in CTest function ctest_start(). The default value is set to ${CTEST_TEST_TYPE}. However, if CTEST_TEST_TYPE==Experimental (or EXPERIMENTAL), then ${PROJECT_NAME}_TRACK is forced to Experimental, even if it was set to a different value. The default value can also be set in the ctest -S driver script itself by calling set(${PROJECT_NAME}_TRACK <cdash-group>). And, of course, if the environment variable export <Project>_TRACK=<cdash-group> is set, then that value will be used for the CDash Track/Group to submit results to.
CTEST_SITE=<site-name>
This is a built-in CTest variable that determines what is displayed for the site field for the build on CDash. This is specified by default by calling the built-in CMake/CTest function site_name().
COMPILER_VERSION=<compiler-version>
Gives the name of the compiler that is used to compose a default CTEST_BUILD_NAME. If CTEST_BUILD_NAME is explicitly set, then this value is ignored.
CTEST_BUILD_NAME=<build-name>
This is a built-in CTest variable that determines the name of the build on CDash. Builds that have the same CTEST_SITE, CTEST_BUILD_NAME and ${PROJECT_NAME}_TRACK are considered to be related builds and CDash will relate them as "previous" and "next" builds (good for showing number of added or removed tests, new test failures, new passing tests, etc.). If not specified, it is given the default value ${HOST_TYPE}-${COMPILER_VERSION}-${BUILD_DIR_NAME}. Here, HOST_TYPE is determined automatically from the uname system command using find_program(uname). The value of BUILD_DIR_NAME is expected to be set in each specific CTest -S driver script.
CTEST_NOTES_FILES="<filepath1>;<filepath2>;..."
Built-in CTest variable that specifies a semi-colon separated list of files that will get uploaded to CDash as "notes files". This function also adds notes files of its own, such as the file CMakeCache.clean.txt (cleaned-up version of the CMakeCache.txt file), the file Updates.txt (lists new git commits pulled in all the git repos), the file UpdateCommandsOutput.txt (list of commands and their output which are run by ctest_update() in the base git repo), and the file ${PROJECT_NAME}RepoVersion.txt (gives the version of all the git repos being tested).
CTEST_CHANGE_ID
Built-in CTest variable that can be set to an integer for the GitHub Pull Request (PR) ID or GitLab Merge Request (MR) ID (or other such repository and development management system's change control ID). If the CDash project is properly configured to point to the GitHub or GitLab (or other supported) repository for the project, then CDash will put a hyper-linked icon beside the build name that links back to the PR or MR issue with that ID. It may also be used for other purposes in the future.
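For example, these display-related options might be overridden in the env when launching the driver (the build name, site name, project name MyProject, and script name below are hypothetical):

  env CTEST_TEST_TYPE=Nightly \
    CTEST_BUILD_NAME=Linux-GCC-11.2-OPT \
    CTEST_SITE=my-build-machine \
    MyProject_TRACK="Specialized" \
    ctest -V -S my_project_ctest_driver.cmake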
Specifying where the results go to CDash (tribits_ctest_driver()):
By default, the target CDash server and CDash project are specified by the variables set in the file <projectDir>/CTestConfig.cmake; specifically, CTEST_DROP_SITE, CTEST_PROJECT_NAME, and CTEST_DROP_LOCATION. If these are set using set_default_and_from_env(), as shown in the example TribitsExampleProject/CTestConfig.cmake file, then they can be overridden with set() statements in the CTest -S script or as env vars; simple enough.
In addition, results can be sent to a second CDash site using the variables:
TRIBITS_2ND_CTEST_DROP_SITE
CDash drop site for second upload of results. If empty, then CTEST_DROP_SITE is used.
TRIBITS_2ND_CTEST_DROP_LOCATION
Location for the second drop site. If empty, then CTEST_DROP_LOCATION is used.
At least one of these vars must be set to non-empty or a second submit will not be performed. For more details, see TRIBITS_2ND_CTEST_DROP_SITE and TRIBITS_2ND_CTEST_DROP_LOCATION.
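For example, a driver script might add a backup CDash server with set() statements like the following (the server URL, drop location, and project name are hypothetical placeholders):

  set(TRIBITS_2ND_CTEST_DROP_SITE "backup-testing.someorg.gov")
  set(TRIBITS_2ND_CTEST_DROP_LOCATION "/cdash/submit.php?project=MyProject")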
Links to results on CDash (tribits_ctest_driver()):
Links to where the results will be posted on CDash are printed to STDOUT before any actions are performed and at the end after all of the actions and submits have been completed.
The results are printed to STDOUT in a section that looks like:
  Link to this build's results on CDash:
    <cdash-build-url>
  Link to all builds for this repo version on CDash:
    <cdash-revision-builds-url>
  Link to all nonpassing tests for all builds for this repo version on CDash:
    <cdash-revision-nonpassing-tests-url>
The URL <cdash-build-url> is created from the buildname, site, and buildstarttime fields, which are known from the TAG file created by CTest. This allows accessing the results for this particular build on CDash by just clicking that link.
The URL <cdash-revision-builds-url> provides a link to a CDash index.php query that includes all of the builds with the same base Git repo SHA1. This allows comparing the results of this build against other builds for this same version of the base Git repository.
The URL <cdash-revision-nonpassing-tests-url> gives a link to a CDash queryTests.php query for all of the nonpassing tests for all of the builds with this same base project Git repo SHA1. This allows comparing test failures across all of the builds for the same base project Git repo version.
NOTE: The links <cdash-revision-builds-url> and <cdash-revision-nonpassing-tests-url> are only provided if the base project Git repo has the .git/ subdirectory and if git log successfully returns the SHA1 for that base Git repo.
NOTE: The links <cdash-revision-builds-url> and <cdash-revision-nonpassing-tests-url> only consider the Git SHA1 of the base project Git repo. For multi-repo projects (see Multi-Repository Support), you may get results for builds with different subrepo versions and therefore may be comparing apples and oranges. (Projects that commit a <Project>SubRepoVersion.txt file to their base Git repo or use Git Submodules will have unique base project Git repo SHA1s for different versions of the project's repos.)
In addition, a text file CDashResults.txt will be written in the build directory that contains this same CDash link information shown above. This allows a process to cat the file CDashResults.txt to get links to the results on CDash.
Determining what TriBITS repositories are included (tribits_ctest_driver()):
This script is set up to process extra VC and TriBITS repos that contribute additional TriBITS packages to the base TriBITS project. This set of extra repos is determined using the following vars (which can be set in the CTest -S script or overridden with env vars of the same name):
${PROJECT_NAME}_EXTRAREPOS_FILE=<extrarepos-file-path>
Points to a file that lists the extra VC and TriBITS repos. If not explicitly set, then by default it will read from the file <projectDir>/cmake/ExtraRepositoriesList.cmake unless ${PROJECT_NAME}_SKIP_EXTRAREPOS_FILE=TRUE is set in the ProjectName.cmake file, in which case no extra repos file is read in. See <Project>_EXTRAREPOS_FILE.
${PROJECT_NAME}_ENABLE_KNOWN_EXTERNAL_REPOS_TYPE=[Nightly|Continuous|Experimental]
The category of extra repos to process from the file ${PROJECT_NAME}_EXTRAREPOS_FILE (see <Project>_ENABLE_KNOWN_EXTERNAL_REPOS_TYPE).
${PROJECT_NAME}_PRE_REPOSITORIES=<reponame1>,<reponame2>,...
Subset of "pre" extra repos specified in the file ${PROJECT_NAME}_EXTRAREPOS_FILE to process (see <Project>_PRE_REPOSITORIES).${PROJECT_NAME}_EXTRA_REPOSITORIES=<reponame1>,<reponame2>,...
Subset of "post" extra repos specified in the file ${PROJECT_NAME}_EXTRAREPOS_FILE to process (see <Project>_EXTRA_REPOSITORIES).
The behavior for selecting extra repos using these variables is determined as described in:
All-at-once versus package-by-package mode (tribits_ctest_driver()):
This function supports driving the configure, build, testing, and submitting to CDash of the packages in the TriBITS project either all-at-once or package-by-package, based on the vars (which can be set in the CTest -S script and overridden by env vars):
${PROJECT_NAME}_CTEST_DO_ALL_AT_ONCE=[TRUE|FALSE]
If TRUE, then single calls to ctest_configure(), ctest_build() and ctest_test() are made for all of the packages to be tested all at once with ctest_submit() called after each of these. If FALSE then ctest_configure(), ctest_build() and ctest_test() and ctest_submit() are called in a loop, once for each package to be explicitly tested.
Both the all-at-once mode and the package-by-package mode should produce equivalent builds of the project and submits to CDash (for correctly constructed TriBITS projects and packages). However, the package-by-package mode will disable packages with failing library builds when processing downstream packages, which reduces the propagation of failures to downstream packages and therefore makes it more robust. On the other hand, the package-by-package mode is more expensive in several respects for many projects.
For newer versions of CDash (3.1+), in the all-at-once mode the CDash server will break down the build and test results on a package-by-package basis on CDash.
Multiple ctest -S invocations (tribits_ctest_driver()):
By default, this function is meant to be used in a single invocation of the ctest -S <script>.cmake command in order to do everything from the beginning and submit to CDash. But there are times when one needs to do the various steps in multiple ctest -S invocations that all send data to the same CDash build. For example, on some clusters, configure and build must be done on "compile nodes" but the tests must be run on "compute nodes". Typically, these types of machines have a shared file system. On a system like this, one would use two different invocations as:
  # Start new dashboard, update, configure, and build on compile node
  env CTEST_DO_TEST=OFF \
    ctest -S <script>.cmake

  # Run tests only on compute node
  <run-on-compute-node> \
    env CTEST_DO_NEW_START=OFF CTEST_DO_UPDATES=OFF \
    CTEST_DO_CONFIGURE=OFF CTEST_DO_BUILD=OFF \
    CTEST_DO_TEST=ON \
    ctest -S <script>.cmake
Above, CTEST_DO_NEW_START=OFF is needed to ensure that the test results go to the same CDash build. (NOTE: A CDash build is uniquely determined by the site name, build name and build stamp.)
This approach works for both the all-at-once mode and the package-by-package mode.
Also, one can run each of the basic steps in its own ctest -S invocation, starting with CTEST_DO_NEW_START=ON, then CTEST_DO_UPDATES=ON, then CTEST_DO_CONFIGURE=ON, then CTEST_DO_BUILD=ON, then CTEST_DO_TEST=ON, etc. While there is typically no reason to split things up to this level of granularity, CTest and this tribits_ctest_driver() function will support such usage. All that is required is that those steps be performed in that order. For example, one cannot do a build in one ctest -S invocation and then try to do a configure in the next because the build will fail since a valid configuration has not been performed yet. And one cannot run just tests if there is not a valid configuration and build already in place.
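A hypothetical breakdown of this kind into separate ctest -S invocations (all sharing the same source and binary directories and therefore appending to the same CDash build) might look like:

  # 1) New start, update, and configure only
  env CTEST_DO_BUILD=OFF CTEST_DO_TEST=OFF \
    ctest -S <script>.cmake
  # 2) Build only (append to the same CDash build)
  env CTEST_DO_NEW_START=OFF CTEST_DO_UPDATES=OFF CTEST_DO_CONFIGURE=OFF \
    CTEST_DO_TEST=OFF \
    ctest -S <script>.cmake
  # 3) Run tests only
  env CTEST_DO_NEW_START=OFF CTEST_DO_UPDATES=OFF CTEST_DO_CONFIGURE=OFF \
    CTEST_DO_BUILD=OFF \
    ctest -S <script>.cmake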
Repository Updates (tribits_ctest_driver()):
Like the rest of TriBITS, ctest -S scripts written using this function support a collection of extra repositories in addition to the base git repository.
Whether the local repos are updated (or left as is) is determined by the variable:
CTEST_DO_UPDATES=[TRUE|FALSE]
If set to TRUE, then each of the git repos will be cloned if they do not already exist, and those already present will be updated as described below (which will wipe out any local changes). If set to FALSE, then the git repos will be left alone and must therefore already be cloned and updated to the desired state. For example, this should be set to FALSE when running against a local development repo (e.g. the make dashboard target sets this to FALSE automatically) or when other logic is used to set up the source directories. WARNING: If you are running against a local repo with local changes and you don't set this to FALSE, then your local uncommitted changes will be wiped out and the local branch will be hard reset to the remote tracking branch! The default value is TRUE.
WARNING: If you don't want local changes in your git repos to get blown away, then set CTEST_DO_UPDATES to FALSE!
If the base repo pointed to by ${CTEST_SOURCE_DIRECTORY} is missing, it is cloned inside of the ctest_start() function using the custom command:
  git clone [-b ${${PROJECT_NAME}_BRANCH}] \
    -o ${${PROJECT_NAME}_GIT_REPOSITORY_REMOTE} \
    ${${PROJECT_NAME}_REPOSITORY_LOCATION}
where:
${PROJECT_NAME}_REPOSITORY_LOCATION=<repo-url>
The URL of the base git repo <repo-url> to clone inside of ctest_start(). The default is ${${PROJECT_NAME}_REPOSITORY_LOCATION_NIGHTLY_DEFAULT} when CTEST_TEST_TYPE=Nightly and otherwise the default is ${${PROJECT_NAME}_REPOSITORY_LOCATION_DEFAULT}.
${PROJECT_NAME}_GIT_REPOSITORY_REMOTE=<remote-name>
The git remote name given to the cloned repo. This is needed for robust git operations as described below (Default 'origin'). If a repo is already cloned, then a remote with this name must already exist in that repo or the update will fail.
${PROJECT_NAME}_BRANCH=<branch>
The branch of the base repo to explicitly check out after clone (and on each update). The value of empty "" is allowed, which results in the default branch being checked out on clone (and the -b <branch> argument being omitted from the git clone command). The default value is determined by the variable ${${PROJECT_NAME}_REPOSITORY_BRANCH}. The default value for ${PROJECT_NAME}_REPOSITORY_BRANCH is empty.
If the base repo already exists, no initial clone is performed and it is assumed that it is in a state to allow it to be updated as described below.
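A hypothetical driver script might set these clone-related defaults as follows (the project name MyProject, the repo URL, and the branch name are placeholders):

  set(MyProject_REPOSITORY_LOCATION "git@github.com:someorg/MyProject.git")
  set(MyProject_GIT_REPOSITORY_REMOTE "origin")
  set(MyProject_BRANCH "develop")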
After the base repo is cloned, any missing extra git repositories are cloned using CMake/CTest code in this tribits_ctest_driver() function (raw CTest does not support cloning a list of extra repos) using the command:
  git clone [-b ${${PROJECT_NAME}_EXTRAREPO_BRANCH}] \
    -o ${${PROJECT_NAME}_GIT_REPOSITORY_REMOTE} \
    <extrarepo_url>
where:
${PROJECT_NAME}_EXTRAREPOS_BRANCH=<extrarepo-branch>
The branch <extrarepo-branch> that each extra VC repo is checked out to. The default value is set to ${${PROJECT_NAME}_BRANCH}. If empty "", then the -b <branch> argument is omitted from the git clone command. (NOTE: Checking out a separate branch on the extra repos from the base repo was needed for backward compatibility for the Trilinos project and is not recommended usage as it violates the "single branch" approach for using gitdist.)
<extrarepo_url>
The git repo remote URL given in the file ${PROJECT_NAME}_EXTRAREPOS_FILE.
When CTEST_DO_UPDATES=TRUE (after a possible initial clone), the function ctest_update() is called to update the base git repo. The base git repo is updated with custom git commands executed inside of ctest_update():
  $ git fetch ${${PROJECT_NAME}_GIT_REPOSITORY_REMOTE}
  $ git clean -fdx         # Remove untracked ignored files
  $ git reset --hard HEAD  # Clean files and set ORIG_HEAD to HEAD
  $ git checkout -B ${${PROJECT_NAME}_BRANCH} \
      --track origin/${${PROJECT_NAME}_BRANCH}  # Sets HEAD
The above set of commands is the maximally robust way to update a git repo. They will correct any local state of the local repo and will put the local repo on the requested local tracking branch. They can handle hard-reset remote branches, a previous tracking branch that has gone missing, etc. The only requirement is that the remote repo pointed to by ${${PROJECT_NAME}_GIT_REPOSITORY_REMOTE} is valid and has not changed since the repo was first cloned. (NOTE: A future version of TriBITS may automate the update of this git remote.)
If ${PROJECT_NAME}_BRANCH is empty "", the last git checkout -B <branch> ... command is replaced with the git command:
$ git reset --hard @{u} # Sets HEAD
After the base git repo is updated inside of ctest_update() as described above, each of the extra repos is updated using a similar set of git commands:
  $ git fetch ${${PROJECT_NAME}_GIT_REPOSITORY_REMOTE}
  $ git clean -fdx         # Remove untracked ignored files
  $ git reset --hard HEAD  # Clean files and set ORIG_HEAD to HEAD
  $ git checkout -B ${${PROJECT_NAME}_EXTRAREPO_BRANCH} \
      --track origin/${${PROJECT_NAME}_EXTRAREPO_BRANCH}  # Sets HEAD
where, if ${PROJECT_NAME}_EXTRAREPO_BRANCH is empty, the last git checkout -B <branch> ... command is replaced with:
$ git reset --hard @{u}
WARNING: This version of the git checkout -B <branch> ... command is not supported in older versions of git. Therefore, a newer version of git is required when using named branches.
The command git clean -fdx removes any untracked ignored files that may have been created since the last update (either by the build process or by someone messing around in that local git repository). The command git reset --hard HEAD removes any untracked non-ignored files, any modified tracked files, and sets ORIG_HEAD to the current HEAD. This sets ORIG_HEAD after the initial clone (which is needed since ORIG_HEAD is not set after the initial git clone command). This allows using the range ORIG_HEAD..HEAD with git diff and git log commands even after the initial clone. (Directly after the initial clone, the range ORIG_HEAD..HEAD will be empty.) The git commands git checkout -B <branch> <remote>/<branch> or git reset --hard @{u} are used to update the local repo to match the remote tracking branch. This is done to deal with a possible forced push of the remote tracking branch or even a change to a different tracking branch (when using an explicit <branch> name).
Note that the repository updating approach described above using a non-empty ${PROJECT_NAME}_BRANCH is more robust, because it can recover from a state where someone may have put a repo on a detached head or checked out a different branch. One of these repos might get into this state when a person is messing around in the Nightly build and source directories to try to figure out what happened and forgets to put the repos back on the correct tracking branch. Therefore, it is recommended to always set an explicit ${PROJECT_NAME}_BRANCH to a non-null value like master or develop for the git repos, even if this branch is the default repo branch.
Other CTest Driver options (tribits_ctest_driver()):
Other miscellaneous vars that can be set in the CTest -S script or as env vars are given below.
CTEST_CMAKE_GENERATOR="[Unix Makefiles|Ninja|..]"
Built-in CTest variable that determines the CMake generator used in the inner configure. If an existing CMakeCache.txt file exists, then the default value for the generator will be read out of that file. Otherwise, the default generator is selected to be Unix Makefiles. Another popular option is Ninja. The value of this variable determines the type of generator used in the inner CMake configure done by the command ctest_configure(...) called in this function. This is done implicitly by CTest. The selected generator has an impact on what flags can be used in CTEST_BUILD_FLAGS since make and ninja accept different arguments in some cases.
${PROJECT_NAME}_ENABLE_DEVELOPMENT_MODE=[TRUE|FALSE]
Puts TriBITS configure into development mode (vs. release mode) in the outer CTest -S script. The default is provided by ${${PROJECT_NAME}_ENABLE_DEVELOPMENT_MODE_DEFAULT} (which is typically set in the <projectDir>/Version.cmake file). See <Project>_ENABLE_DEVELOPMENT_MODE.
${PROJECT_NAME}_VERBOSE_CONFIGURE=[TRUE|FALSE]
Make TriBITS run in verbose mode. (Useful for debugging hard problems.) See <Project>_VERBOSE_CONFIGURE.
CTEST_CONFIGURATION_UNIT_TESTING=[TRUE|FALSE]
If set to TRUE, then tribits_ctest_driver() is put in unit testing mode and does not actually drive configure, build, test, and submit. This is used to drive automated testing of the code in tribits_ctest_driver().
Return value (tribits_ctest_driver()):
Currently, the ctest -S script will return 0 if all of the requested operations completed without failure. That is, the update, configure, build, tests, coverage, dynamic analysis and submits must pass with no CMake errors in order for a 0 return code to be returned. Therefore, the return code from the ctest -S script can be used to drive other automated processes that require all passing builds and tests.
ToDo: Add another mode that will return 0 if no errors are reported in the ctest -S driver script but ignore configure, build, and test failures that are submitted to a CDash site (and therefore will be reported there).
In: ctest_driver/TribitsCTestDriverCore.cmake:185
Determine at configure time if any of the upstream dependencies for a package require the current package to be rebuilt.
Usage:
  tribits_determine_if_current_package_needs_rebuilt(
    [SHOW_MOST_RECENT_FILES]
    [SHOW_OVERALL_MOST_RECENT_FILES]
    CURRENT_PACKAGE_OUT_OF_DATE_OUT <currentPackageOutOfDate>
    )
Arguments:
SHOW_MOST_RECENT_FILES
If specified, then the most recently modified file for each individual base source and binary directory searched will be printed to STDOUT. Setting this implies SHOW_OVERALL_MOST_RECENT_FILE.
SHOW_OVERALL_MOST_RECENT_FILE
If specified, then only the most recently modified file over all of the individual directories for each category (i.e. one for upstream package source dirs, one for upstream package binary dirs, one for the package's source dir, and one for the package's own binary dir) is printed to STDOUT.
CURRENT_PACKAGE_OUT_OF_DATE_OUT <currentPackageOutOfDate>
On output, the local variable <currentPackageOutOfDate> will be set to TRUE if any of the upstream most modified files are more recent than the most modified file in the package's binary directory. Otherwise, this variable is set to FALSE.
Description:
This function is designed to help take an externally configured and built piece of software (that generates libraries) and wrap it as a TriBITS package or subpackage. This function uses the lower-level functions tribits_find_most_recent_source_file_timestamp() and tribits_find_most_recent_binary_file_timestamp()
to determine the most recent modified files in the upstream TriBITS packages' source and binary directories as well as the most recent source file for the current package. It then compares these timestamps to the most recent binary file timestamp in this package's binary directory. If any of these three files are more recent than this package's most recent binary file, then the output variable <currentPackageOutOfDate> is set to TRUE. Otherwise, it is set to FALSE.
NOTE: The source and binary directories for full packages are searched, not individual subpackage dirs. This is to reduce the number of dirs searched. This will, however, result in changes in non-dependent subpackages being considered as well.
See the demonstration of the usage of this function in the WrapExternal package in TribitsExampleProject.
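A minimal sketch of how a wrapper package might use this function in its <packageDir>/CMakeLists.txt file (the message and the rebuild step are made-up placeholders for whatever external build logic the package actually drives):

  tribits_determine_if_current_package_needs_rebuilt(
    SHOW_OVERALL_MOST_RECENT_FILES
    CURRENT_PACKAGE_OUT_OF_DATE_OUT  currentPackageOutOfDate
    )
  if (currentPackageOutOfDate)
    message(STATUS "Re-running the external configure/build of the wrapped software ...")
    # ... invoke the external build here ...
  endif()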
In: core/package_arch/TribitsFindMostRecentFileTimestamp.cmake:444
Macro called to disable an optional dependency in the current package for an optional (internal or external) upstream package.
Usage:
tribits_disable_optional_dependency(<upstreamPackageName> "<reasonStr>")
This macro can be called from a top-level package's <packageDir>/CMakeLists.txt file to disable an optional dependency that may have been enabled by the user or through automated enable/disable logic.
This is most useful in cases where multiple criteria must be considered before support for some upstream dependency can really be supported. In that case, the dependency can be disabled in the current package and telegraphed to all downstream packages. See How to tweak downstream TriBITS "ENABLE" variables during package configuration for more details.
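As a hypothetical example, a package might turn off its otherwise-enabled optional Boost support when an additional criterion is not met (the variable BOOST_VERSION_IS_NEW_ENOUGH is a made-up placeholder for such a criterion):

  if (${PACKAGE_NAME}_ENABLE_Boost AND NOT BOOST_VERSION_IS_NEW_ENOUGH)
    tribits_disable_optional_dependency(Boost
      "NOTE: Setting ${PACKAGE_NAME}_ENABLE_Boost=OFF because the found Boost is too old!")
  endif()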
In: core/package_arch/TribitsPackageMacros.cmake:381
Disable a package automatically for a list of platforms.
Usage:
tribits_disable_package_on_platforms( <packageName> <hosttype0> <hosttype1> ...)
If any of the host-type arguments <hosttypei> matches the ${PROJECT_NAME}_HOSTTYPE variable for the current platform, then package <packageName> test group classification is changed to EX. Changing the package test group classification to EX results in the package being disabled by default (see EX packages disabled by default). However, an explicit enable can still enable the package.
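For example, a repository might disable a hypothetical package by default on particular host types (the package name and the host-type values shown are placeholders; the actual values of ${PROJECT_NAME}_HOSTTYPE are determined by the system):

  tribits_disable_package_on_platforms(MyUnportablePackage  Windows  Darwin)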
In: core/package_arch/TribitsListHelpers.cmake:55
Exclude package files/dirs from the source distribution by appending CPACK_SOURCE_IGNORE_FILES.
Usage:
tribits_exclude_files(<file0> <file1> ...)
This is called in the top-level parent package's <packageDir>/CMakeLists.txt file and each file or directory name <filei> is actually interpreted by CMake/CPack as a regex that is prefixed by the project's and package's source directory names so as to not exclude files and directories of the same name and path from other packages. If <filei> is an absolute path it is not prefixed but is appended to CPACK_SOURCE_IGNORE_FILES unmodified.
In general, do NOT put in excludes for files and directories that are not under this package's source tree. If the given package is not enabled, then this command will never be called! For example, don't put in excludes for PackageB's files in PackageA's CMakeLists.txt file because if PackageB is enabled but PackageA is not, the excludes for PackageB will never get added to CPACK_SOURCE_IGNORE_FILES.
Also, be careful to note that the <filei> arguments are actually regexes and one must be very careful to understand how CPack will use these regexes to match files that get excluded from the tarball. For more details, see Creating Source Distributions.
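A hypothetical call in a parent package's CMakeLists.txt file might exclude a large test-data directory and an internal notes file from source distributions (both paths are made up):

  tribits_exclude_files(
    test/data/large_inputs
    doc/internal_notes.txt
    )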
In: core/package_arch/TribitsPackagingSupport.cmake:19
Macro called from inside of a FindTPL<tplName>Dependencies.cmake file to define the direct upstream dependencies of an external package/TPL.
Usage:
  tribits_extpkg_define_dependencies(
    <tplName>
    DEPENDENCIES <upstreamTpl_0> <upstreamTpl_1>:<vis_1> ...
    )
The listed upstream dependencies <upstreamTpl_i> are other external packages/TPLs listed before this external package/TPL <tplName> in a <repoDir>/TPLsList.cmake file. Each upstream dependency can include a visibility specification <vis_i> that is added to the dependency using a colon : as in <upstreamTpl_1>:<vis_1>, where <vis_i> can take the allowed values PUBLIC or PRIVATE.
If <vis_i> is not specified, then the default is PRIVATE. (If a package needs the include directories from some external package/TPL, then it should list that external package/TPL as a direct dependency and not expect to get include directories from indirect dependencies.)
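A hypothetical FindTPL<tplName>Dependencies.cmake file for a TPL named SuperTpl that depends on two other (made-up) upstream TPLs might contain:

  tribits_extpkg_define_dependencies( SuperTpl
    DEPENDENCIES  UpstreamTplA  UpstreamTplB:PUBLIC )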
In: core/package_arch/TribitsPackageDependencies.cmake:30
Extract <PkgName> and <Vis> from <PkgName>[:<Vis>] input with default <Vis> of PRIVATE.
Usage:
tribits_extpkg_get_dep_name_and_vis( <upstreamTplDepEntry> <upstreamTplDepNameOut> <upstreamTplDepVisOut>)
In: core/package_arch/TribitsPackageDependencies.cmake:143
Called from a FindTPL<tplName>.cmake module which first calls find_package(<externalPkg>) and then calls this function to get an external package that uses modern CMake IMPORTED targets. This function creates the <tplName>::all_libs target and creates a TriBITS-compliant external package wrapper file <tplName>Config.cmake.
Usage:
  tribits_extpkg_create_imported_all_libs_target_and_config_file(
    <tplName>
    INNER_FIND_PACKAGE_NAME <externalPkg>
    IMPORTED_TARGETS_FOR_ALL_LIBS <importedTarget0> <importedTarget1> ...
    )
This function is called from a TriBITS FindTPL<tplName>.cmake wrapper module after it calls find_package(<externalPkg>) and then this function creates the IMPORTED target <tplName>::all_libs from the list of IMPORTED targets <importedTarget0> <importedTarget1> ... which are defined from the call find_package(<externalPkg>). This function also takes care of generating the correct <tplName>Config.cmake file under the directory:
${${PROJECT_NAME}_BINARY_DIR}/${${PROJECT_NAME}_BUILD_DIR_EXTERNAL_PKGS_DIR}
The generated <tplName>Config.cmake file pulls in the upstream TriBITS-compliant <UpstreamPkg>Config.cmake files, calls find_dependency(<externalPkg>) (with no other arguments), defines the <tplName>::all_libs target, and then sets up the correct dependencies between these targets.
For more details, see Creating FindTPL<tplName>.cmake using find_package() with IMPORTED targets.
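A sketch of a FindTPLHDF5.cmake module using this approach, assuming that find_package(HDF5) defines the IMPORTED target HDF5::HDF5 (which target names are actually defined depends on the CMake version and the HDF5 installation, so treat the target list as an assumption):

  find_package(HDF5 REQUIRED)
  tribits_extpkg_create_imported_all_libs_target_and_config_file( HDF5
    INNER_FIND_PACKAGE_NAME  HDF5
    IMPORTED_TARGETS_FOR_ALL_LIBS  HDF5::HDF5 )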
In: core/package_arch/TribitsExternalPackageWithImportedTargetsFindTplModuleHelpers.cmake:29
Returns the type of the library entry in the list TPL_<tplName>_LIBRARIES
Usage:
tribits_extpkg_tpl_libraries_entry_type(<libentry> <libEntryTypeOut>)
Arguments:
<libentry> [in]: Element of TPL_<tplName>_LIBRARIES
<libEntryTypeOut> [out]: Variable set on output to the type of entry.
The types of entries set on libEntryTypeOut include:
- FULL_LIB_PATH: A full library path
- LIB_NAME_LINK_OPTION: A library name link option of the form -l<libname>
- LIB_NAME: A library name of the form <libname>
- LIB_DIR_LINK_OPTION: A library directory search option of the form -L<dir>
- GENERAL_LINK_OPTION: Some other general link option that starts with - but is not -l or -L.
- UNSUPPORTED_LIB_ENTRY: An unsupported lib option
In: core/package_arch/TribitsExternalPackageWriteConfigFile.cmake:535
Write out a <tplName>Config.cmake file for a TriBITS TPL given the list of include directories and libraries for an external package/TPL.
Usage:
tribits_write_external_package_config_file( <tplName> <tplConfigFile> )
The arguments are:
<tplName>: Name of the external package/TPL
<tplConfigFile>: Full file path for the <tplName>Config.cmake file that will be written out.
This function just calls tribits_extpkg_write_config_file_str() and writes that text to the file <tplConfigFile> so see that function for more details.
NOTE: This is used for a classic TriBITS TPL that does not use find_package(<externalPkg>) with modern IMPORTED targets.
In: core/package_arch/TribitsExternalPackageWriteConfigFile.cmake:27
Create the text string for a <tplName>Config.cmake file given the list of include directories and libraries for an external package/TPL from the legacy TriBITS TPL specification.
Usage:
tribits_extpkg_write_config_file_str( <tplName> <tplConfigFileStrOut> )
The function arguments are:
<tplName>: Name of the external package/TPL
<tplConfigFileStrOut>: Name of variable that will contain the string for the config file on output.
This function reads from the (cache) variables
- TPL_<tplName>_INCLUDE_DIRS
- TPL_<tplName>_LIBRARIES
- <tplName>_LIB_ENABLED_DEPENDENCIES
(which must already be set) and uses that information to produce the contents of the <tplName>Config.cmake which is returned as a string variable that contains IMPORTED targets to represent these libraries and include directories as well as find_dependency() calls for upstream packages listed in <tplName>_LIB_ENABLED_DEPENDENCIES.
The arguments in TPL_<tplName>_LIBRARIES are handled in special ways in order to create the namespaced IMPORTED targets tribits::<tplName>::<libname> and the <tplName>::all_libs target that depends on these.
The types of arguments that are handled and how they are interpreted:
<abs-base-path>/[lib]<libname>.<longest-ext>
Arguments that are absolute file paths are treated as libraries and an imported target name <libname> is derived from the file name (of the form lib<libname>.<longest-ext>, removing the beginning lib and the file extension .<longest-ext>). The IMPORTED target tribits::<tplName>::<libname> is created and the file path is set using the IMPORTED_LOCATION target property.
-l<libname>
Arguments of the form -l<libname> are used to create IMPORTED targets with the name tribits::<tplName>::<libname> using the IMPORTED_LIBNAME target property.
<libname>
Arguments that are a raw name matching the regex ^[a-zA-Z_][a-zA-Z0-9_-]*$ are interpreted to be a library name <libname> and are used to create an IMPORTED target <tplName>::<libname> using the IMPORTED_LIBNAME target property.
-L<dir>
Link directories. These are pulled off and added to the <tplName>::all_libs target using target_link_options(). (The order of these options is maintained.)
-<any-option>
Any other option that starts with - is assumed to be a link argument where the order does not matter in relation to the libraries (but the order of these extra options is maintained w.r.t. each other).
<unrecognized>
Any other argument that does not match one of the above patterns is regarded as an error.
For more details on the handling of individual TPL_<tplName>_LIBRARIES arguments, see tribits_extpkg_tpl_libraries_entry_type().
The list of directories given in TPL_<tplName>_INCLUDE_DIRS is added to the <tplName>::all_libs target using target_include_directories().
Finally, for every <upstreamTplName> listed in <tplName>_LIB_ENABLED_DEPENDENCIES, a link dependency is created using target_link_libraries(<tplName>::all_libs INTERFACE <upstreamTplName>).
NOTE: The IMPORTED targets generated for each library argument <tplName>::<libname> are prefixed with tribits:: to give tribits::<tplName>::<libname>. This is to avoid clashing with IMPORTED targets <tplName>::<libname> from other package config files <tplName>Config.cmake or find modules Find<tplName>.cmake (see TriBITSPub/TriBITS#548). But the generated INTERFACE IMPORTED target <tplName>::all_libs is not namespaced with tribits:: since the all_libs target is unlikely to clash. The targets tribits::<tplName>::<libname> are not directly used in downstream target_link_libraries() calls, so the names of these targets are really just an implementation detail. (The reason we give these a name based on the library name they represent <libname> is to make it more clear what the matching library is and to make the name unique.)
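For example, a hypothetical classic TPL specification that mixes several of the entry types described above might look like (all paths and names are made up):

  set(TPL_SomeTpl_INCLUDE_DIRS "/opt/sometpl/include")
  set(TPL_SomeTpl_LIBRARIES
    "/opt/sometpl/lib/libsometpl.so"   # FULL_LIB_PATH
    "-L/opt/extras/lib"                # LIB_DIR_LINK_OPTION
    "-lsometplextras"                  # LIB_NAME_LINK_OPTION
    )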
In: core/package_arch/TribitsExternalPackageWriteConfigFile.cmake:250
Find the most modified binary file in a set of base directories and return its timestamp.
Usage:
  tribits_find_most_recent_binary_file_timestamp(
    BINARY_BASE_DIRS <dir0> <dir1> ...
    [BINARY_BASE_BASE_DIR <dir>]
    [MOST_RECENT_TIMESTAMP_OUT <mostRecentTimestamp>]
    [MOST_RECENT_FILEPATH_BASE_DIR_OUT <mostRecentFilepathBaseDir>]
    [MOST_RECENT_RELATIVE_FILEPATH_OUT <mostRecentRelativeFilePath>]
    [SHOW_MOST_RECENT_FILES]
    [SHOW_OVERALL_MOST_RECENT_FILE]
    )
This function just calls tribits_find_most_recent_file_timestamp() passing in a set of basic exclude regexes like CMakeFiles/, [.]cmake$, and /Makefile$, etc. These types of files usually don't impact the build of downstream software in CMake projects.
In: core/package_arch/TribitsFindMostRecentFileTimestamp.cmake:355
Find the most modified file in a set of base directories and return its timestamp.
Usage:
  tribits_find_most_recent_file_timestamp(
    BASE_DIRS <dir0> <dir1> ...
    [BASE_BASE_DIR <dir>]
    [EXCLUDE_REGEXES "<re0>" "<re1>" ...]
    [SHOW_MOST_RECENT_FILES]
    [SHOW_OVERALL_MOST_RECENT_FILE]
    [MOST_RECENT_TIMESTAMP_OUT <mostRecentTimestamp>]
    [MOST_RECENT_FILEPATH_BASE_DIR_OUT <mostRecentFilepathBaseDir>]
    [MOST_RECENT_RELATIVE_FILEPATH_OUT <mostRecentRelativeFilePath>]
    )
Arguments:
BASE_DIRS <dir0> <dir1> ...
Gives the absolute base directory paths that will be searched for the most recently modified files, as described above.
BASE_BASE_DIR <dir>
Absolute path to print file paths relative to. This makes the output less verbose and easier to read (optional).
EXCLUDE_REGEXES "<re0>" "<re1>" ...
Gives the regular expressions that are used to exclude files from consideration. Each "<rei>" regex is used with a grep -v "<rei>" filter to exclude files before sorting by time stamp.
SHOW_MOST_RECENT_FILES
If specified, then the most recently modified file for each individual directory <dir0>, <dir1>, ... will be printed to STDOUT. Setting this implies SHOW_OVERALL_MOST_RECENT_FILE.
SHOW_OVERALL_MOST_RECENT_FILE
If specified, then only the most recently modified file over all of the individual directories is printed to STDOUT.
MOST_RECENT_TIMESTAMP_OUT <mostRecentTimestamp>
On output, the variable <mostRecentTimestamp> is set to the timestamp of the most recently modified file over all the directories. This number is given as the number of seconds since Jan. 1, 1970, 00:00 GMT.
MOST_RECENT_FILEPATH_BASE_DIR_OUT <mostRecentFilepathBaseDir>
On output, the variable <mostRecentFilepathBaseDir> gives the absolute base directory of the file with the most recent timestamp over all directories.
MOST_RECENT_RELATIVE_FILEPATH_OUT <mostRecentRelativeFilePath>
On output, the variable <mostRecentRelativeFilePath> gives the file name with the relative path to the file with the most recent timestamp over all directories.
Description:
This function uses the Linux/Unix command:
  $ find . -type f -printf '%T@ %p\n' \
      | grep -v "<re0>" | grep -v "<re1>" | ... \
      | sort -n | tail -1
to return the most recent file in each listed directory <dir0>, <dir1>, etc. It then determines the most recently modified file over all of the directories, prints it, and returns it in the variables <mostRecentTimestamp>, <mostRecentFilepathBaseDir>, and <mostRecentRelativeFilePath>.
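A hypothetical call (the directory paths and the exclude regex are placeholders):

  tribits_find_most_recent_file_timestamp(
    BASE_DIRS  "${CMAKE_CURRENT_SOURCE_DIR}/src"  "${CMAKE_CURRENT_SOURCE_DIR}/include"
    EXCLUDE_REGEXES  "[.]git/"
    SHOW_OVERALL_MOST_RECENT_FILE
    MOST_RECENT_TIMESTAMP_OUT  mostRecentTimestamp
    MOST_RECENT_RELATIVE_FILEPATH_OUT  mostRecentRelativeFilePath
    )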
In: core/package_arch/TribitsFindMostRecentFileTimestamp.cmake:16
Find the most modified source file in a set of base directories and return its timestamp.
Usage:
  tribits_find_most_recent_source_file_timestamp(
    SOURCE_BASE_DIRS <dir0> <dir1> ...
    [SOURCE_BASE_BASE_DIR <dir>]
    [SHOW_MOST_RECENT_FILES]
    [SHOW_OVERALL_MOST_RECENT_FILE]
    [MOST_RECENT_TIMESTAMP_OUT <mostRecentTimestamp>]
    [MOST_RECENT_FILEPATH_BASE_DIR_OUT <mostRecentFilepathBaseDir>]
    [MOST_RECENT_RELATIVE_FILEPATH_OUT <mostRecentRelativeFilePath>]
    )
This function just calls tribits_find_most_recent_file_timestamp() passing in a set of basic exclude regexes like [.]git/, [.]svn/, etc. These types of version control files can not possibly directly impact the source code.
In: core/package_arch/TribitsFindMostRecentFileTimestamp.cmake:269
Function that determines a given external or internal package's enable status (e.g. 'ON' or 'OFF' or any valid CMake bool)
Usage:
tribits_get_package_enable_status(<packageName> <packageEnableOut> <packageEnableVarNameOut>)
On return, if non-empty, the variable <packageEnableOut> will contain the actual value of ${${PROJECT_NAME}_ENABLE_<packageName>} or ${TPL_ENABLE_<packageName>} or will return empty "". If ${packageName}_PACKAGE_BUILD_STATUS == "INTERNAL", then only the value of ${PROJECT_NAME}_ENABLE_<packageName> will be considered.
On return, if non-empty, the variable <packageEnableVarNameOut> will be either ${${PROJECT_NAME}_ENABLE_<packageName>} or ${TPL_ENABLE_<packageName>}, depending on which one is used to obtain the value <packageEnableOut>.
This works for both external packages/TPLs and internal packages.
In: core/package_arch/TribitsGetPackageEnableStatus.cmake:11
Function used to (optionally) install header files using the install() command.
Usage:
  tribits_install_headers(
    HEADERS <h0> <h1> ...
    [INSTALL_SUBDIR <subdir>]
    [COMPONENT <component>]
    )
The formal arguments are:
HEADERS <h0> <h1> ...
List of header files to install. By default, these header files are assumed to be in the current source directory. They can also contain the relative path or absolute path to the files if they are not in the current source directory.
INSTALL_SUBDIR <subdir>
Optional subdirectory under which the headers will be installed relative to the standard installation directory. If <subdir>!="", then the headers will be installed under ${${PROJECT_NAME}_INSTALL_INCLUDE_DIR}/<subdir>. Otherwise, they will be installed under ${${PROJECT_NAME}_INSTALL_INCLUDE_DIR}/.
COMPONENT <component>
If specified, then COMPONENT <component> will be passed into install(). Otherwise, COMPONENT ${PROJECT_NAME} will get used.
If ${PROJECT_NAME}_INSTALL_LIBRARIES_AND_HEADERS is FALSE, then the headers will not get installed.
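A hypothetical call from a package's src/CMakeLists.txt file (the header file names and install subdirectory are made up):

  tribits_install_headers(
    HEADERS  MyPackage_Vector.hpp  MyPackage_Utils.hpp
    INSTALL_SUBDIR  mypackage
    )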
In: core/package_arch/TribitsInstallHeaders.cmake:14
This function overrides the standard behavior of the built-in CMake include_directories() command for special behavior for installation testing.
Usage:
tribits_include_directories( [REQUIRED_DURING_INSTALLATION_TESTING] <dir0> <dir1> ... )
If specified, REQUIRED_DURING_INSTALLATION_TESTING can appear anywhere in the argument list.
This function allows overriding the default behavior of include_directories() during installation testing to ensure that include directories will not be inadvertently added to the build lines for tests (see Installation and Backward Compatibility Testing). Normally we want the include directories to be handled as CMake usually does. However, during TriBITS installation testing we do not want most of the include directories to be used, as the majority of the files should come from the installation we are building against. The exception is when there are test-only headers that are needed. For that case, REQUIRED_DURING_INSTALLATION_TESTING must be passed in to ensure the include paths are added for installation testing.
In: core/package_arch/TribitsIncludeDirectories.cmake:14
Macro that registers a package-level cache var to be exported in the <Package>Config.cmake file
Usage:
tribits_pkg_export_cache_var(<cacheVarName>)
where <cacheVarName> must be the name of a cache variable (or an error will occur).
NOTE: This will also export this variable to the <Package><Spkg>Config.cmake file for every enabled subpackage (if this is called from a CMakeLists.txt file of a top-level package that has subpackages). That way, any top-level package cache vars are provided by any of the subpackages' <Package><Spkg>Config.cmake files.
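A hypothetical usage from a package's CMakeLists.txt file (the cache variable name and docstring are placeholders; the argument must name an existing cache variable):

  set(${PACKAGE_NAME}_ENABLE_FANCY_SOLVER  OFF  CACHE  BOOL
    "Enable the fancy solver capability in ${PACKAGE_NAME}.")
  tribits_pkg_export_cache_var(${PACKAGE_NAME}_ENABLE_FANCY_SOLVER)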
In: core/package_arch/TribitsPkgExportCacheVars.cmake:11
Macro called at the very beginning of a package's top-level <packageDir>/CMakeLists.txt file.
Usage:
  tribits_package(
    <packageName>
    [ENABLE_SHADOWING_WARNINGS]
    [DISABLE_STRONG_WARNINGS]
    [CLEANED]
    [DISABLE_CIRCULAR_REF_DETECTION_FAILURE]
    )
See tribits_package_decl() for the documentation of the arguments, and see tribits_package_decl() and tribits_package() for a description of the side-effects (and variables set) after calling this macro.
In: core/package_arch/TribitsPackageMacros.cmake:323
Macro called at the very beginning of a package's top-level <packageDir>/CMakeLists.txt file when a package has subpackages.
Usage:
  tribits_package_decl(
    <packageName>
    [ENABLE_SHADOWING_WARNINGS]
    [DISABLE_STRONG_WARNINGS]
    [CLEANED]
    [DISABLE_CIRCULAR_REF_DETECTION_FAILURE]
    )
The arguments are:
<packageName>
Gives the name of the Package, mostly just for checking and documentation purposes. This must match the name of the package provided in the <repoDir>/PackagesList.cmake or an error is issued.
ENABLE_SHADOWING_WARNINGS
If specified, then shadowing warnings for the package's sources will be turned on for supported platforms/compilers. The default is for shadowing warnings to be turned off. Note that this can be overridden globally by setting the cache variable ${PROJECT_NAME}_ENABLE_SHADOWING_WARNINGS.
DISABLE_STRONG_WARNINGS
If specified, then all strong warnings for the package's sources will be turned off, if they are not already turned off by global cache variables. Strong warnings are turned on by default in development mode.
CLEANED
If specified, then warnings will be promoted to errors for compiling the package's sources for all defined warnings.
DISABLE_CIRCULAR_REF_DETECTION_FAILURE
If specified, then the standard grep looking for RCPNode circular references in tribits_add_test() and tribits_add_advanced_test() that causes tests to fail will be disabled. Note that if these warnings are being produced, then it means that the test is leaking memory and user code likely is also leaking memory.
There are several side-effects of calling this macro:
If the package does not have subpackages, just call tribits_package() which calls this macro.
In: core/package_arch/TribitsPackageMacros.cmake:78
Macro called in <packageDir>/CMakeLists.txt after subpackages are processed in order to handle the libraries, tests, and examples of the parent package.
Usage:
tribits_package_def()
If the package does not have subpackages, just call tribits_package() which calls this macro.
This macro has several side effects:
In: core/package_arch/TribitsPackageMacros.cmake:251
Define the dependencies for a given TriBITS Package (i.e. a top-level TriBITS Package or a TriBITS Subpackage) in the package's <packageDir>/cmake/Dependencies.cmake file.
Usage:
  tribits_package_define_dependencies(
    [LIB_REQUIRED_PACKAGES <pkg1> <pkg2> ...]
    [LIB_OPTIONAL_PACKAGES <pkg1> <pkg2> ...]
    [TEST_REQUIRED_PACKAGES <pkg1> <pkg2> ...]
    [TEST_OPTIONAL_PACKAGES <pkg1> <pkg2> ...]
    [LIB_REQUIRED_TPLS <tpl1> <tpl2> ...]
    [LIB_OPTIONAL_TPLS <tpl1> <tpl2> ...]
    [TEST_REQUIRED_TPLS <tpl1> <tpl2> ...]
    [TEST_OPTIONAL_TPLS <tpl1> <tpl2> ...]
    [SUBPACKAGES_DIRS_CLASSIFICATIONS_OPTREQS
      <spkg1_name> <spkg1_dir> <spkg1_classifications> <spkg1_optreq>
      <spkg2_name> <spkg2_dir> <spkg2_classifications> <spkg2_optreq>
      ...
      ]
    [REGRESSION_EMAIL_LIST <regression-email-address>]
    )
Every argument in this macro is optional (that is, a package can have no upstream dependencies). The arguments that apply to all packages are:
LIB_REQUIRED_PACKAGES
List of required upstream packages that must be enabled in order to build and use the libraries (or capabilities) in this package.
LIB_OPTIONAL_PACKAGES
List of additional optional upstream packages that can be used in this package if enabled. These upstream packages need not be enabled in order to enable this package but not enabling one or more of these optional upstream packages will result in diminished capabilities of this package.
TEST_REQUIRED_PACKAGES
List of additional upstream packages that must be enabled in order to build and/or run the tests and/or examples in this package. If any of these upstream packages are not enabled, then there will be no tests or examples defined or run for this package.
TEST_OPTIONAL_PACKAGES
List of additional optional upstream packages that can be used by the tests in this package. These upstream packages need not be enabled in order to run some basic tests or examples for this package. Typically, extra tests that depend on optional test packages involve integration testing of some type. Not enabling these optional upstream packages will result in diminished tests or examples.
LIB_REQUIRED_TPLS
DEPRECATED: List of required upstream TPLs that must be enabled in order to build and use the libraries (or capabilities) in this package. (Add these to LIB_REQUIRED_PACKAGES instead.)
LIB_OPTIONAL_TPLS
DEPRECATED: List of additional optional upstream TPLs that can be used in this package if enabled. These upstream TPLs need not be enabled in order to use this package but not enabling one or more of these optional upstream TPLs will result in diminished capabilities of this package. (Add these to LIB_OPTIONAL_PACKAGES instead.)
TEST_REQUIRED_TPLS
DEPRECATED: List of additional upstream TPLs that must be enabled in order to build and/or run the tests and/or examples in this package. If any of these upstream TPLs are not enabled, then there will be no tests or examples defined or run for this package. (Add these to TEST_REQUIRED_PACKAGES instead.)
TEST_OPTIONAL_TPLS
DEPRECATED: List of additional optional upstream TPLs that can be used by the tests in this package. These upstream TPLs need not be enabled in order to run basic tests for this package. Typically, extra tests that depend on optional TPLs involve integration testing or some additional testing of some type. (Add these to TEST_OPTIONAL_PACKAGES instead.)
NOTE: The above XXX_TPLS arguments/lists are deprecated. At the package level, there is no distinction between upstream internal and external packages/TPLs, so all upstream package dependencies can (and should) be listed in the XXX_PACKAGES arguments/lists. (There is no change in behavior listing upstream packages in XXX_PACKAGES or the XXX_TPLS arguments/lists.)
Only upstream packages can be listed (as defined by the order the packages are listed in tribits_repository_define_packages() in the <repoDir>/PackagesList.cmake or <repoDir>/TPLsList.cmake files). Otherwise an error will occur and processing will stop. Misspelled package names are caught as well.
Only direct package dependencies need to be listed. Indirect package dependencies are automatically handled. For example, if this package directly depends on package PKG2 which depends on package PKG1 (but this package does not directly depend on anything in PKG1) then this package only needs to list a dependency on PKG2, not PKG1. The dependency on PKG1 will be taken care of automatically by the TriBITS dependency management system.
The packages listed in LIB_REQUIRED_PACKAGES are implicitly also dependencies in TEST_REQUIRED_PACKAGES. Likewise LIB_OPTIONAL_PACKAGES are implicitly also dependencies in TEST_OPTIONAL_PACKAGES.
The upstream dependencies within a single list do not need to be listed in any particular order. For example, if PKG2 depends on PKG1, and this given package depends on both, then one can list the dependencies as:
LIB_REQUIRED_PACKAGES PKG2 PKG1
or:
LIB_REQUIRED_PACKAGES PKG1 PKG2
If some upstream packages are allowed to be missing, this can be specified by calling the macro tribits_allow_missing_external_packages().
A top-level TriBITS Package can also be broken down into TriBITS Subpackages. In this case, the following argument must be passed in:
SUBPACKAGES_DIRS_CLASSIFICATIONS_OPTREQS
2D array with rows listing the subpackages where each row has the columns:
- SUBPACKAGE (Column 0): The name of the subpackage <spkg_name>. The full package name is ${PARENT_PACKAGE_NAME}<spkg_name>. The full package name is what is used in listing dependencies in other packages.
- DIRS (Column 1): The subdirectory <spkg_dir> relative to the parent package's base directory. All of the contents of the subpackage should be under this subdirectory. This is assumed by the TriBITS testing support software when mapping modified files to packages that need to be tested (see checkin-test.py).
- CLASSIFICATIONS (Column 2): The test group PT, ST, or EX and the maturity level EP, RS, PG, PM, GRS, GPG, GPM, or UM, separated by a comma ',' with no spaces in between (e.g. "PT,GPM"). These have exactly the same meaning as for full packages (see tribits_repository_define_packages()).
- OPTREQ (Column 3): Determines if the outer parent package has an OPTIONAL or REQUIRED dependence on this subpackage.
Other variables that this macro handles:
REGRESSION_EMAIL_LIST
The email list that is used to send CDash error messages. If this argument is missing, then the email list that CDash errors go to is determined by other means (see CDash regression email addresses).
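A hypothetical <packageDir>/cmake/Dependencies.cmake file for a package that requires PKG2, optionally uses PKG3, and needs PKG4 only for its tests might contain:

  tribits_package_define_dependencies(
    LIB_REQUIRED_PACKAGES   PKG2
    LIB_OPTIONAL_PACKAGES   PKG3
    TEST_REQUIRED_PACKAGES  PKG4
    )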
In: core/package_arch/TribitsPackageDefineDependencies.cmake:13
Process an enabled TPL's FindTPL${TPL_NAME}.cmake module.
In: core/package_arch/TribitsProcessEnabledTpls.cmake:124
Function that determines if a package's enable variable evaluates to true or is unset.
Usage:
tribits_package_is_enabled_or_unset(<packageEnableVarName> <packageIsEnabledOrUnsetOut>)
On return, the value of <packageIsEnabledOrUnsetOut> will be set to TRUE if the variable <packageEnableVarName> evaluates to true or is empty "". Otherwise, <packageIsEnabledOrUnsetOut> will be set to FALSE on return.
In: core/package_arch/TribitsGetPackageEnableStatus.cmake:64
Function that determines if a package's enable variable is explicitly disabled (i.e. evaluates to false but is not empty).
Usage:
tribits_package_is_explicitly_disabled(<packageEnableVarName> <packageIsExplicitlyDisabledOut>)
On return, the value of <packageIsExplicitlyDisabledOut> will be set to TRUE if the variable <packageEnableVarName> evaluates to false and is not empty "". Otherwise, <packageIsExplicitlyDisabledOut> will be set to FALSE on return.
In: core/package_arch/TribitsGetPackageEnableStatus.cmake:92
Macro called at the very end of a package's top-level <packageDir>/CMakeLists.txt file that performs some critical post-processing activities.
Usage:
tribits_package_postprocess()
NOTE: This creates the aliased target ${PACKAGE_NAME}::all_libs for all libraries in all subdirectories that don't have the TRIBITS_TESTONLY_LIB target property set on them.
NOTE: It is unfortunate that this macro must be called in a package's top-level CMakeLists.txt file but limitations of the CMake language make it necessary to do so.
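For reference, a minimal sketch of a top-level <packageDir>/CMakeLists.txt file that ends with this call might look like (the subdirectory layout and the use of tribits_add_test_directories() are illustrative assumptions):
tribits_package(SomePackage)
add_subdirectory(src)
tribits_add_test_directories(test)
tribits_package_postprocess()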
In: core/package_arch/TribitsPackageMacros.cmake:742
Macro that processes the TriBITS Subpackages for a parent TriBITS package that is broken down into subpackages. This is called in the parent package's top-level <packageDir>/CMakeLists.txt file.
Usage:
tribits_process_subpackages()
This macro must be called after tribits_package_decl() but before tribits_package_def().
In: core/package_arch/TribitsPackageMacros.cmake:852
Processes a TriBITS Project's files and configures its software which is called from the project's top-level <projectDir>/CMakeLists.txt file.
Usage:
tribits_project()
This macro requires that the variable PROJECT_NAME be defined before calling this macro. All default values for project settings should be set before calling this macro (see TriBITS Global Project Settings). Also, the variable ${PROJECT_NAME}_TRIBITS_DIR must be set as well.
This macro then adds all of the necessary paths to CMAKE_MODULE_PATH and then performs all processing of the TriBITS project files (see Full TriBITS Project Configuration).
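A minimal sketch of such a <projectDir>/CMakeLists.txt file is shown below (the project name, the TriBITS snapshot location under cmake/tribits, and the CMake version requirement are illustrative assumptions; real projects typically set PROJECT_NAME in a separate ProjectName.cmake file):
cmake_minimum_required(VERSION 3.23.0 FATAL_ERROR)
set(PROJECT_NAME MyTribitsProj)
project(${PROJECT_NAME} NONE)
set(${PROJECT_NAME}_TRIBITS_DIR "${CMAKE_CURRENT_SOURCE_DIR}/cmake/tribits"
  CACHE PATH "Base directory for TriBITS")
include("${${PROJECT_NAME}_TRIBITS_DIR}/TriBITS.cmake")
tribits_project()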
In: core/package_arch/TribitsProject.cmake:43
Declare a set of extra repositories for the TriBITS Project (i.e. in the project's <projectDir>/cmake/ExtraRepositoriesList.cmake file).
Usage:
tribits_project_define_extra_repositories(
  <repo0_name> <repo0_dir> <repo0_vctype> <repo0_url> <repo0_packstat> <repo0_classif>
  <repo1_name> <repo1_dir> <repo1_vctype> <repo1_url> <repo1_packstat> <repo1_classif>
  ...
  )
This macro takes in a 2D array with 6 columns, where each row defines an extra repository. The 6 columns (ordered 0-5) are:
This command is used to put together one or more VC and/or TriBITS repositories to construct a composite TriBITS Project. The option <Project>_EXTRAREPOS_FILE is used to point to files that call this macro.
Repositories with <repoi_packstat>=NOPACKAGES are not TriBITS Repositories and are technically not considered at all during the basic configuration of a TriBITS project. They are only listed in this file so that they can be used in the version control logic for tools that perform version control with the repositories (such as getting git versions, cloning, updating, looking for changed files, etc.). For example, a non-TriBITS repo can be used to grab a set of directories and files that fill in the definition of a package in an upstream repository (see How to insert a package into an upstream repo). Also, non-TriBITS repos can be used to provide extra test data for a given package or a set of packages so that extra tests can be run.
Repositories with <repoi_vctype>='' are not VC repos. This can be used, for example, to represent the project's native repos, or it can be used to point to a TriBITS repository that was cloned in an earlier listed VC repo.
NOTE: These repositories must be listed in the order of package dependencies. That is, all of the packages listed in repository i must have upstream TPL and package dependencies listed before this package in this repository or in upstream repositories i-1, i-2, etc.
NOTE: This macro just sets the local variable:
${PROJECT_NAME}_EXTRAREPOS_DIR_VCTYPE_REPOURL_PACKSTAT_CATEGORY
in the current scope. The advantages of using this macro instead of directly setting this variable are that the macro:
In: core/package_arch/TribitsProcessExtraRepositoriesList.cmake:20
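For example, a hypothetical <projectDir>/cmake/ExtraRepositoriesList.cmake file might contain (the repository names, URLs, and classifications are made up):
tribits_project_define_extra_repositories(
  ExtraRepo1  ""             GIT  git@some.url:ExtraRepo1  ""          Continuous
  DataRepo1   ExtraData/Dat  GIT  git@some.url:DataRepo1   NOPACKAGES  Nightly
  )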
Process a project where you enable all of the packages by default.
Usage:
tribits_project_enable_all()
This macro just sets the global cache var ${PROJECT_NAME}_ENABLE_ALL_PACKAGES to ON by default and then calls tribits_project(). That is all. This macro is generally used for TriBITS projects that have just a single package or that just want to enable all packages by default.
In: core/package_arch/TribitsProjectImpl.cmake:299
Define the set of packages for a given TriBITS Repository. This macro is typically called from inside of a <repoDir>/PackagesList.cmake file for a given TriBITS repo.
Usage:
tribits_repository_define_packages( <pkg0> <pkg0_dir> <pkg0_classif> <pkg1> <pkg1_dir> <pkg1_classif> ... )
This macro sets up a 2D array of NumPackages by NumColumns listing out the packages for a TriBITS repository. Each row (with 3 column entries) specifies a package which contains the columns (ordered 0-2):
IMPORTANT: The packages must be listed in increasing order of package dependencies. That is, no circular dependencies of any kind are allowed (see the ADP (Acyclic Dependencies Principle) in Software Engineering Packaging Principles). Package i can only list dependencies (in <packageDir>/cmake/Dependencies.cmake) for packages listed before this package in this list (or in upstream TriBITS repositories). This avoids an expensive package sorting algorithm and makes it easy to flag packages with circular dependencies or misspelled package names.
NOTE: For some rare use cases, the package directory <pkgi_dir> is allowed to be specified as an absolute directory but this absolute directory must be a subdirectory of the project source base directory given by PROJECT_SOURCE_DIR. If not, message(FATAL_ERROR ...) is called and processing stops immediately.
NOTE: This macro just sets the variable:
${REPOSITORY_NAME}_PACKAGES_AND_DIRS_AND_CLASSIFICATIONS
in the current scope. The advantages of using this macro instead of directly setting this variable are that the macro:
In: core/package_arch/TribitsProcessPackagesAndDirsLists.cmake:26
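For example, a hypothetical <repoDir>/PackagesList.cmake file might contain (the package names and directories are made up):
tribits_repository_define_packages(
  PkgA  packages/pkgA  PT
  PkgB  packages/pkgB  ST
  PkgC  packages/pkgC  EX
  )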
Define the list of TriBITS External Packages/TPLs for a given TriBITS Repository, which includes the external package/TPL name, the TriBITS TPL find module, and the classification. This macro is typically called from inside of a TriBITS Repository's <repoDir>/TPLsList.cmake file.
Usage:
tribits_repository_define_tpls( <tpl0_name> <tpl0_findmod> <tpl0_classif> <tpl1_name> <tpl1_findmod> <tpl1_classif> ... )
This macro sets up a 2D array of NumTPLS by NumColumns listing out the TriBITS TPLs for a TriBITS Repository. Each row (with 3 entries) specifies a different TriBITS external package/TPL which contains the columns (ordered 0-2):
A TPL defined in an upstream repo can be listed again in a downstream repo, which allows redefining the find module that is used to specify the external package/TPL. This allows downstream repos to add additional requirements for a given TPL (i.e. add more libraries, headers, etc.). However, the downstream repo's find module file must find TPL components that are fully compatible with the upstream-defined find module in terms of what it provides for packages in the upstream repos.
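For example, a hypothetical <repoDir>/TPLsList.cmake file might contain (the TPL names and find-module locations are illustrative):
tribits_repository_define_tpls(
  MPI      "${${PROJECT_NAME}_TRIBITS_DIR}/core/std_tpls/"  PT
  SomeTpl  cmake/tpls/                                      ST
  )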
In: core/package_arch/TribitsProcessTplsLists.cmake:18
Set a variable to an include directory and call tribits_include_directories() (removes boiler-plate code).
Usage:
tribits_set_and_inc_dirs(<dirVarName> <includeDir>)
On output, this sets <dirVarName> to <includeDir> in the local scope and calls tribits_include_directories(<includeDir>).
In: core/package_arch/TribitsSetAndIncDirs.cmake:11
Function that allows packages to easily make a feature ST for development builds and PT for release builds by default.
Usage:
tribits_set_st_for_dev_mode(<outputVar>)
This function is typically called in a package's top-level <packageDir>/CMakeLists.txt file before defining other options for the package. The output variable ${<outputVar>} is set to ON or OFF based on the configure state. In development mode (i.e. ${PROJECT_NAME}_ENABLE_DEVELOPMENT_MODE==ON), ${<outputVar>} will be set to ON only if ST code is enabled (i.e. ${PROJECT_NAME}_ENABLE_SECONDARY_TESTED_CODE==ON), otherwise it is set to OFF. In release mode (i.e. ${PROJECT_NAME}_ENABLE_DEVELOPMENT_MODE==OFF), ${<outputVar>} is always set to ON. This allows some parts of a TriBITS package to be considered ST for development mode (thereby reducing testing time by not enabling the dependent features/tests), while still having important functionality available to users by default in a release of the package.
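For example, a package might use this to set the default for one of its own options (the package and option names below are hypothetical):
tribits_set_st_for_dev_mode(SomePkg_ENABLE_EXTRA_FEATURE_DEFAULT)
advanced_set(SomePkg_ENABLE_EXTRA_FEATURE
  ${SomePkg_ENABLE_EXTRA_FEATURE_DEFAULT}
  CACHE BOOL "Enable the (hypothetical) extra feature in SomePkg.")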
In: core/package_arch/TribitsGeneralMacros.cmake:62
Forward declare a TriBITS Subpackage called at the top of the subpackage's <packageDir>/<spkgDir>/CMakeLists.txt file.
Usage:
tribits_subpackage(<spkgName>)
Once called, the following local variables are in scope:
PARENT_PACKAGE_NAME
The name of the parent package.
SUBPACKAGE_NAME
The local name of the subpackage (does not contain the parent package name).
SUBPACKAGE_FULLNAME
The full project-level name of the subpackage (which includes the parent package name at the beginning, ${PARENT_PACKAGE_NAME}${SUBPACKAGE_NAME}).
PACKAGE_NAME
Inside the subpackage, the same as SUBPACKAGE_FULLNAME.
In: core/package_arch/TribitsSubPackageMacros.cmake:14
Macro that performs standard post-processing after defining a TriBITS Subpackage which is called at the bottom of a subpackage's <packageDir>/<spkgDir>/CMakeLists.txt file.
Usage:
tribits_subpackage_postprocess()
NOTE: This creates the aliased target ${PACKAGE_NAME}::all_libs for all libraries in all subdirectories that don't have the TRIBITS_TESTONLY_LIB target property set on them.
NOTE: It is unfortunate that a subpackage's CMakeLists.txt file must call this macro but limitations of the CMake language make it necessary to do so.
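Taken together, a minimal sketch of a subpackage's <packageDir>/<spkgDir>/CMakeLists.txt file might look like (the subpackage name and subdirectory layout are illustrative assumptions):
tribits_subpackage(SpkgA)
add_subdirectory(src)
tribits_add_test_directories(test)
tribits_subpackage_postprocess()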
In: core/package_arch/TribitsSubPackageMacros.cmake:113
Function that determines if a TriBITS find module file FindTPL<tplName>.cmake is allowed to call find_package(<tplName> ...) before calling tribits_tpl_find_include_dirs_and_libraries().
Usage:
tribits_tpl_allow_pre_find_package( <tplName> <allowPackagePrefindOut> )
The required arguments are:
<tplName> : The input name of the TriBITS TPL (e.g. HDF5).
<allowPackagePrefindOut> : Name of a variable which will be set to TRUE on output if find_package(<tplName> ...) should be called to find the TPL <tplName> or FALSE if it should not be called.
This function will set <allowPackagePrefindOut> to FALSE if any of the variables TPL_<tplName>_INCLUDE_DIRS, TPL_<tplName>_LIBRARIES, or TPL_<tplName>_LIBRARY_DIRS are set. This allows the user to override the search for the library components and just specify the absolute locations. The function will also set <allowPackagePrefindOut> to FALSE if <tplName>_INCLUDE_DIRS, <tplName>_LIBRARY_NAMES, or <tplName>_LIBRARY_DIRS is set and <tplName>_FORCE_PRE_FIND_PACKAGE is set to FALSE. Otherwise, if <tplName>_FORCE_PRE_FIND_PACKAGE is set to TRUE, the function will not return FALSE for <allowPackagePrefindOut> no matter what the values of <tplName>_INCLUDE_DIRS, <tplName>_LIBRARY_NAMES, or <tplName>_LIBRARY_DIRS. Finally, <allowPackagePrefindOut> is set to FALSE if <tplName>_ALLOW_PACKAGE_PREFIND=OFF is set in the cache.
The variable <tplName>_FORCE_PRE_FIND_PACKAGE is needed to allow users (or the FindTPL<tplName>.cmake module itself) to avoid name clashes with the variables <tplName>_INCLUDE_DIRS or <tplName>_LIBRARY_DIRS in the usage of find_package(<tplName> ...) because a lot of default Find<tplName>.cmake modules also use these variables. This function sets <tplName>_FORCE_PRE_FIND_PACKAGE as a cache variable with default value FALSE to maintain backward compatibility with existing FindTPL<tplName>.cmake modules.
The cache variable <tplName>_ALLOW_PACKAGE_PREFIND is to allow the user to disable the prefind call to find_package() even if it would be allowed otherwise.
See Creating FindTPL<tplName>.cmake using find_package() without IMPORTED targets for details in how to use this function to create a FindTPL<tplName>.cmake module file.
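A minimal sketch of how a FindTPL<tplName>.cmake module might use this function together with tribits_tpl_find_include_dirs_and_libraries() is shown below, using HDF5 as an illustrative example (the actual FindTPLHDF5.cmake module shipped with TriBITS may differ):
tribits_tpl_allow_pre_find_package(HDF5  HDF5_ALLOW_PREFIND)
if (HDF5_ALLOW_PREFIND)
  find_package(HDF5)
  if (HDF5_FOUND)
    # Translate the standard find_package() results into the TriBITS TPL variables
    set(TPL_HDF5_INCLUDE_DIRS ${HDF5_INCLUDE_DIRS} CACHE PATH "HDF5 include dirs")
    set(TPL_HDF5_LIBRARIES ${HDF5_LIBRARIES} CACHE FILEPATH "HDF5 libraries")
  endif()
endif()
tribits_tpl_find_include_dirs_and_libraries(HDF5
  REQUIRED_HEADERS hdf5.h
  REQUIRED_LIBS_NAMES hdf5)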
In: core/package_arch/TribitsTplFindIncludeDirsAndLibraries.cmake:26
This function reads (cache) variables that specify where to find a TriBITS TPL's headers and libraries and then creates IMPORTED targets, the <tplName>::all_libs target, and writes the file <tplName>Config.cmake into the standard location in the build directory. This function is typically called inside of a FindTPL<tplName>.cmake module file (see ${TPL_NAME}_FINDMOD).
Usage:
tribits_tpl_find_include_dirs_and_libraries( <tplName>
  [REQUIRED_HEADERS <header1> <header2> ...]
  [MUST_FIND_ALL_HEADERS]
  [REQUIRED_LIBS_NAMES "<libname1> <libname1alt1> ..." <libname2> ...]
  [MUST_FIND_ALL_LIBS]
  [NO_PRINT_ENABLE_SUCCESS_FAIL]
  )
This function can be called to specify/require header files and include directories and/or a list of libraries.
The input arguments to this function are:
<tplName>
Name of the TPL that is listed in a <repoDir>/TPLsList.cmake file.
REQUIRED_HEADERS
List of header files that are searched for in order to find the TPL's include directories using find_path().
MUST_FIND_ALL_HEADERS
If set, then all of the header files listed in REQUIRED_HEADERS must be found (unless TPL_<tplName>_INCLUDE_DIRS is already set).
REQUIRED_LIBS_NAMES "<libname1> <libname1alt1> ..." <libname2> ...
List of libraries that are searched for when looking for the TPL's libraries using find_library(). A single list of library names of the form:
<libname1> <libname2> ...
are searched for and must all be found and will define the libraries for this TPL on the link line in that order. However, a library name along with alternate library names can be provided using outer quotes with inner spaces:
"<libname1> <libname1alt1> <libname1alt2> ..."
In this case, <libname1> is looked for first and used if it is found. If not found, then the next alternate library name <libname1alt1> is looked for and is used if found. This continues with each successive alternate library name in the set until one is found. If none of the library names in the set of alternate names is found, then this is an error. Providing a set of alternate library names (in order of preference) allows the default find operation to look for different library names for different situations and implementations. For example, the BLAS library can be called blas, openblas or atlas for different BLAS implementations and can be specified as:
"blas openblas atlas"
The list of required library names can be overridden by the user by setting <tplName>_LIBRARY_NAMES (see below).
MUST_FIND_ALL_LIBS
If set, then all of the library files listed in REQUIRED_LIBS_NAMES must be found or the TPL is considered not found (unless TPL_<tplName>_LIBRARIES is already set). If the global cache var <Project>_MUST_FIND_ALL_TPL_LIBS is set to TRUE, then this is turned on as well. WARNING: The default is not to require finding all of the listed libs. (This is to maintain backward compatibility with some older FindTPL<tplName>.cmake modules.)
NO_PRINT_ENABLE_SUCCESS_FAIL
If set, then the final success/fail will not be printed
This function implements the TPL find behavior described in Enabling support for an optional Third-Party Library (TPL).
The following (cache) variables, if set, will be used by this function:
<tplName>_INCLUDE_DIRS (type PATH)
List of paths to search first for header files defined in REQUIRED_HEADERS <header1> <header2> ....
<tplName>_LIBRARY_DIRS (type PATH)
The list of directories to search first for the libraries listed in REQUIRED_LIBS_NAMES <libname1> <libname2> .... If, for some reason, no libraries should be linked in for this particular configuration, then <tplName>_LIBRARY_DIRS can be set to OFF or left empty, in which case no special paths will be searched.
<tplName>_LIBRARY_NAMES (type STRING)
List of library names to be looked for instead of what is specified in REQUIRED_LIBS_NAMES <libname1> <libname2> .... If set, only a single set of libraries can be specified, of which all need to be found.
<tplName>_LIB_ENABLED_DEPENDENCIES
List of direct upstream external package/TPL dependencies that also define <upstreamTplName>::all_libs targets.
In addition, the function will avoid calling the find operations if the following (cache) variables are set on input:
TPL_<tplName>_INCLUDE_DIRS (type PATH)
A list of comma-separated full directory paths that contain the TPL's header files.
TPL_<tplName>_LIBRARIES (type FILEPATH)
A list of comma-separated full library names (i.e. output from find_library()) for all of the libraries for the TPL.
This function produces the following:
TPL_<tplName>_NOT_FOUND (type BOOL)
Will be set to ON if all of the parts of the TPL could not be found.
<tplName>::<libname>
Namespaced IMPORTED target for every library found or specified in TPL_<tplName>_LIBRARIES. These IMPORTED targets will have the <upstreamTplName>::all_libs targets for the upstream external packages/TPLs listed in <tplName>_LIB_ENABLED_DEPENDENCIES.
<tplName>::all_libs
INTERFACE target that depends on all of the created IMPORTED targets.
<buildDir>/external_packages/<tplName>/<tplName>Config.cmake
A package configure file that contains all of the generated IMPORTED targets <tplName>::<libname> and the <tplName>::all_libs target. This file will also call find_dependency() to pull in <upstreamTplName>Config.cmake files for upstream TPLs that are listed in <tplName>_LIB_ENABLED_DEPENDENCIES. (For more information, see tribits_extpkg_write_config_file().)
Note that if TPL_TENTATIVE_ENABLE_<tplName>=ON and all of the parts of the TPL can't be found, then TPL_ENABLE_<tplName> will be (forced) set to OFF in the cache. See tribits_tpl_tentatively_enable().
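As a small example, a FindTPL module for a BLAS-like TPL that uses the alternate-library-names feature described above might contain little more than the following (a sketch; the real FindTPLBLAS.cmake module may differ):
tribits_tpl_find_include_dirs_and_libraries(BLAS
  REQUIRED_LIBS_NAMES "blas openblas atlas")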
In: core/package_arch/TribitsTplFindIncludeDirsAndLibraries.cmake:129
Function that sets up for an optionally enabled TPL that is attempted to be enabled but will be disabled if all of the parts are not found.
Usage:
tribits_tpl_tentatively_enable(<tplName>)
This function can be called from any CMakeLists.txt file to put a TPL in tentative enable mode. But typically, it is called from a package's <packageDir>/cmake/Dependencies.cmake file (see How to tentatively enable an external package/TPL).
This should only be used for optional TPLs. It will not work correctly for required TPLs because any enabled packages that require this TPL will not be disabled and instead will fail to configure or fail to build.
All this function does is to force set TPL_ENABLE_<tplName>=ON if it has not already been set, and sets TPL_TENTATIVE_ENABLE_<tplName>=ON in the cache.
NOTE: This function will only tentatively enable a TPL if its enable has not been explicitly set on input, i.e. if -D TPL_ENABLE_<tplName>="". If the TPL has been explicitly enabled (i.e. -D TPL_ENABLE_<tplName>=ON) or disabled (i.e. -D TPL_ENABLE_<tplName>=OFF), then this function has no effect and the TPL will be unconditionally enabled or disabled.
In: core/package_arch/TribitsTplFindIncludeDirsAndLibraries.cmake:731
Utility function for writing the ${PACKAGE_NAME}Config.cmake files for the build dir and/or for the install dir for the package <packageName> with some flexibility. (See the NOTE below for what is actually generated and what is NOT generated.)
Usage:
tribits_write_flexible_package_client_export_files(
  PACKAGE_NAME <packageName>
  [EXPORT_FILE_VAR_PREFIX <exportFileVarPrefix>]
  [PACKAGE_CONFIG_FOR_BUILD_BASE_DIR <packageConfigForBuildBaseDir>]
  [PACKAGE_CONFIG_FOR_INSTALL_BASE_DIR <packageConfigForInstallBaseDir>]
  )
The arguments are:
PACKAGE_NAME <packageName>
Gives the name of the TriBITS package for which the export files should be created. (This must match the export set for the libraries for the generated/exported <packageName>ConfigTargets.cmake file.)
EXPORT_FILE_VAR_PREFIX <exportFileVarPrefix>
If specified, then all of the variables in the generated export files will be prefixed with <exportFileVarPrefix>_ instead of <packageName>_.
PACKAGE_CONFIG_FOR_BUILD_BASE_DIR <packageConfigForBuildBaseDir>
If specified, then the package's <packageName>Config.cmake file will be written under the directory <packageConfigForBuildBaseDir>/ (and any subdirs that do not exist will be created). The generated file <packageName>Config.cmake is for usage of the package in the build tree (not the install tree) and points to include directories and libraries in the build tree. (NOTE: The included <packageName>Targets.cmake file is NOT generated in this function.)
PACKAGE_CONFIG_FOR_INSTALL_BASE_DIR <packageConfigForInstallBaseDir>
If specified, then the package's <packageName>Config_install.cmake file will be written under the directory <packageConfigForInstallBaseDir>/ (and any subdirs that do not exist will be created). The file ${PACKAGE_NAME}Config_install.cmake is meant to be installed renamed as <packageName>Config.cmake in the install tree and it points to installed include directories and libraries. (NOTE: The included <packageName>Targets.cmake file is NOT generated in this function.)
NOTE: This function does not generate the <packageName>Targets.cmake files (which will be created later using export() or install()) that are included in these generated package config files. Also, this function does not invoke the install() command to install the package config file for the install directory. The export() and install() commands are not allowed in cmake -P scripting mode that is used for unit testing this function. Instead, the commands to generate the <packageName>Targets.cmake files and install the package config file for the install tree are produced by the function tribits_write_package_client_export_files_export_and_install_targets() which is called after this function. This allows the function tribits_write_package_client_export_files() to be run in unit testing with a cmake -P script.
In: core/package_arch/TribitsInternalPackageWriteConfigFile.cmake:304
Print a variable giving its name then value if ${PROJECT_NAME}_VERBOSE_CONFIGURE=TRUE.
Usage:
tribits_verbose_print_var(<varName>)
This prints:
message("-- " "${VARIABLE_NAME}='${${VARIABLE_NAME}}'")
The variable <varName> can be defined, undefined, or empty. This uses an explicit "-- " line prefix so that it prints nicely even on Windows CMake.
In: core/package_arch/TribitsVerbosePrintVar.cmake:14
The following subsections give detailed documentation for some CMake macros and functions which are not a core part of the TriBITS system but are included in the TriBITS source tree, are used inside of the TriBITS system, and are provided as a convenience to TriBITS project developers. One will see many of these functions and macros used throughout the implementation of TriBITS and even in the CMakeLists.txt files for different projects that use TriBITS.
These macros and functions are not prefixed with TRIBITS_. However, there is really not a large risk to defining and using these non-namespaced utility functions and macros. It turns out that CMake allows one to redefine any macro or function, even built-in ones, inside of one's project. Therefore, even if CMake did add new commands that clashed with these names, there would be no conflict. When overriding a built-in command, e.g. some_builtin_command(), one can always access the original built-in command as _some_builtin_command().
Macro that adds a list of subdirectories all at once (removes boiler-plate code).
Usage:
add_subdirectories(<dir1> <dir2> ...)
instead of:
add_subdirectory(<dir1>)
add_subdirectory(<dir2>)
...
In: core/utils/AddSubdirectories.cmake:11
Macro that sets an option and marks it as advanced (removes boiler-plate and duplication).
Usage:
advanced_option(<varName> [other arguments])
This just calls the built-in CMake commands:
option(<varName> [other arguments])
mark_as_advanced(<varName>)
In: core/utils/AdvancedOption.cmake:11
Macro that sets a variable and marks it as advanced (removes boiler-plate and duplication).
Usage:
advanced_set(<varName> [other arguments])
This just calls the built-in commands:
set(<varName> [other arguments])
mark_as_advanced(<varName>)
In: core/utils/AdvancedSet.cmake:11
Utility function that appends command-line arguments to a variable of command-line arguments.
Usage:
append_cmndline_args(<var> "<extraArgs>")
This function just appends the command-line arguments in the string "<extraArgs>" but does not add an extra space if <var> is empty on input. This just makes the formatting of command-line arguments easier.
In: core/utils/AppendCmndlineArgs.cmake:11
Utility macro that does a file(GLOB ...) and appends to an existing list (removes boiler-plate code).
Usage:
append_glob(<fileListVar> <glob0> <glob1> ...)
On output, <fileListVar> will have the list of glob files appended.
In: core/utils/AppendGlob.cmake:13
Utility macro that appends arguments to a global variable (reduces boiler-plate code and mistakes).
Usage:
append_global_set(<varName> <arg0> <arg1> ...)
NOTE: The variable <varName> must exist before calling this function. To set it empty initially use global_null_set().
In: core/utils/AppendGlobalSet.cmake:14
Utility function to append elements to a variable (reduces boiler-plate code).
Usage:
append_set(<varName> <arg0> <arg1> ...)
This just calls:
list(APPEND <varName> <arg0> <arg1> ...)
There is better error reporting if one misspells APPEND_SET than if one misspells APPEND.
In: core/utils/AppendSet.cmake:11
Append strings to an existing string variable (reduces boiler-plate code and reduces mistakes).
Usage:
append_string_var(<stringVar> "<string1>" "<string2>" ...)
Note that the characters '[', ']', '{', and '}' are taken by CMake to bypass the meaning of ';' to separate string characters. If one wants to ignore the meaning of these special characters and is okay with just adding one string at a time, then use append_string_var_ext().
DEPRECATED: Instead, use:
string(APPEND <stringVar> "<string1>" "<string2>" ...)
In: core/utils/AppendStringVar.cmake:17
Append a single string to an existing string variable, ignoring ';' (reduces boiler-plate code and reduces mistakes).
Usage:
append_string_var_ext(<stringVar> "<string>")
Simply sets <stringVar> = "${<stringVar>}<string>" and leaves in ';' without creating new array elements.
In: core/utils/AppendStringVar.cmake:46
Append strings to a given string variable, joining them using a separator string.
Usage:
append_string_var_with_sep(<stringVar> "<sepStr>" "<str0>" "<str1>" ...)
Each of the strings <stri> are appended to <stringVar> using the separation string <sepStr>.
In: core/utils/AppendStringVarWithSep.cmake:13
Assert that a variable is defined and if not call message(SEND_ERROR ...).
Usage:
assert_defined(<varName>)
This is used to get around the problem of CMake not asserting the dereferencing of undefined variables. For example, how does one know if one did not misspell the name of a variable in an if statement like:
if (SOME_VARBLE) ... endif()
?
If one misspelled the variable SOME_VARBLE (which is likely in this case), then the if statement will always be false! To avoid this problem when one always expects that a variable is explicitly set, instead do:
assert_defined(SOME_VARBLE)
if (SOME_VARBLE)
  ...
endif()
Now if one misspells this variable, then CMake will assert and stop processing. This is not a perfect solution since one can misspell the variable name in the following if statement, but typically one would just copy and paste between the two statements so these names are always the same. Unfortunately, this is the best that can be done in CMake to catch usage of misspelled undefined variables.
In: core/utils/AssertDefined.cmake:11
Set up a BOOL cache variable (i.e. an option) based on a set of dependent options.
Usage:
combined_option( <combinedOptionName>
  DEP_OPTIONS_NAMES <depOptName0> <depOptName1> ...
  DOCSTR "<docstr0>" "<docstr1>" ...
  )
This sets up a BOOL cache variable <combinedOptionName> which is defaulted to ON if all of the listed dependent option variables <depOpName0>, <depOptName1>, ... are all ON. However, if <combinedOptionName> is set to ON by the user and not all of the dependent option variables are also ON, then this results in a fatal error and all processing stops.
This is used by a CMake project to automatically turn on a feature that requires a set of other features (when they are all enabled) but allows a user to disable the feature if desired.
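For example, a project might define a combined feature option like the following (all of the option names here are hypothetical):
combined_option(MyProj_ENABLE_FANCY_FEATURE
  DEP_OPTIONS_NAMES  MyProj_ENABLE_FEATURE_A  MyProj_ENABLE_FEATURE_B
  DOCSTR "Enable the fancy feature, which requires"
    " both FEATURE_A and FEATURE_B to be enabled." )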
In: core/utils/CombinedOption.cmake:15
Concatenate a set of string arguments.
Usage:
concat_strings(<outputVar> "<str0>" "<str1>" ...)
On output, <outputVar> is set to "<str0><str1>...". This makes it easier to format a long string over multiple CMake source code lines.
In: core/utils/ConcatStrings.cmake:13
Utility function that appends command-line arguments to a variable of command-line options and sets the result in current scope and parent scope.
Usage:
dual_scope_append_cmndline_args(<var> "<extraArgs>")
Just calls append_cmndline_args() and then set(<var> ${<var>} PARENT_SCOPE).
In: core/utils/DualScopeAppendCmndlineArgs.cmake:14
Utility function that prepends command-line arguments to a variable of command-line arguments and sets the result in current scope and parent scope.
Usage:
dual_scope_prepend_cmndline_args(<var> "<extraArgs>")
Just calls prepend_cmndline_args() and then set(<var> ${<var>} PARENT_SCOPE).
In: core/utils/DualScopePrependCmndlineArgs.cmake:14
Macro that sets a variable name both in the current scope and the parent scope.
Usage:
dual_scope_set(<varName> [other args])
It turns out that when one calls add_subdirectory(<someDir>) or enters a FUNCTION, CMake actually creates a copy of all of the regular non-cache variables in the current scope in order to create a new set of variables for the CMakeLists.txt file in <someDir>. This means that if you call set(SOMEVAR Blah PARENT_SCOPE), it will not affect the value of SOMEVAR in the current scope! This macro therefore is designed to set the value of the variable in the current scope and the parent scope in one shot to avoid confusion.
Global variables are different. When one moves to a subordinate CMakeLists.txt file or enters a FUNCTION, then a local copy of the variable is not created. If one sets the variable locally, it will shadow the global variable. However, if one sets the global cache value with set(SOMEVAR someValue CACHE INTERNAL ""), then the value will get changed in the current subordinate scope and in all parent scopes all in one shot!
In: core/utils/DualScopeSet.cmake:11
Set a variable as a null internal global (cache) variable (removes boiler-plate code).
Usage:
global_null_set(<varName>)
This just calls:
set(<varName> "" CACHE INTERNAL "")
This avoids problems with misspelling CACHE.
In: core/utils/GlobalNullSet.cmake:11
Set a variable as an internal global (cache) variable (removes boiler-plate code).
Usage:
global_set(<varName> [other args])
This just calls:
set(<varName> [other args] CACHE INTERNAL "")
This avoids misspelling CACHE.
In: core/utils/GlobalSet.cmake:11
Join a set of strings into a single string using a join string.
Usage:
join(<outputStrVar> "<sepStr>" <quoteElements> "<string0>" "<string1>" ...)
Arguments:
<outputStrVar>
The name of a variable that will hold the output string.
"<sepStr>"
A string to use to join the list of strings.
<quoteElements>
If TRUE, then each <stringi> is quoted using an escaped quote char \". If FALSE, then no escaped quote is used.
"<string0>" "<string1>" ...
Zero or more string arguments to be joined.
On output, the variable <outputStrVar> is set to:
"<string0><sepStr><string1><sepStr>..."
If <quoteElements>=TRUE, then <outputStrVar> is set to:
"\"<string0>\"<sepStr>\"<string1>\"<sepStr>..."
For example, the latter can be used to set up a set of command-line arguments given a CMake array like:
join(CMND_LINE_ARGS " " TRUE ${CMND_LINE_ARRAY})
WARNING: Be careful to quote string arguments that have spaces because CMake interprets those as array boundaries.
In: core/utils/Join.cmake:11
Function that wraps the standard CMake/CTest message() function call in order to allow unit testing to intercept the output.
Usage:
message_wrapper(...)
This function takes exactly the same arguments as the built-in message() function. However, when the variable MESSAGE_WRAPPER_UNIT_TEST_MODE is set to TRUE, then this function will not call message(...) but instead will append to the global variable MESSAGE_WRAPPER_INPUT the input arguments that would have gone to message(). To capture just this call's input, first call:
global_null_set(MESSAGE_WRAPPER_INPUT)
before calling this function (or the functions/macros that call this function).
This function allows one to unit test other user-defined CMake macros and functions that call this function to catch error conditions without stopping the CMake program. Otherwise, this is used to capture print messages to verify that they say the right thing.
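A sketch of how a unit-test driver might capture output is shown below (the exact format in which the arguments are stored in MESSAGE_WRAPPER_INPUT is an assumption here, so the captured value is just printed rather than asserted):
set(MESSAGE_WRAPPER_UNIT_TEST_MODE TRUE)
global_null_set(MESSAGE_WRAPPER_INPUT)
message_wrapper(SEND_ERROR "Something went wrong")
print_var(MESSAGE_WRAPPER_INPUT)  # Inspect the captured arguments instead of aborting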
In: core/utils/MessageWrapper.cmake:14
Function to set a single string by concatenating a list of separate strings
Usage:
multiline_set(<outputStrVar> "<string0>" "<string1>" ... )
On output, the local variable <outputStrVar> is set to:
"<string0><string1>..."
The purpose of this function is to make it easier to set longer strings over multiple lines.
This function is exactly the same as concat_strings() and should not even exist :-(
In: core/utils/MultilineSet.cmake:11
Utility function that prepends command-line arguments to a variable of command-line arguments.
Usage:
prepend_cmndline_args(<var> "<extraArgs>")
This function just prepends the command-line arguments in the string "<extraArgs>" but does not add an extra space if <var> is empty on input.
In: core/utils/PrependCmndlineArgs.cmake:11
Utility macro that prepends arguments to a global variable (reduces boiler-plate code and mistakes).
Usage:
prepend_global_set(<varName> <arg0> <arg1> ...)
The variable <varName> must exist before calling this function. To set it empty initially use global_null_set().
In: core/utils/PrependGlobalSet.cmake:14
Print a defined variable giving its name then value only if it is not empty.
Usage:
print_nonempty_var(<varName>)
Calls print_var(<varName>) if ${<varName>} is not empty.
In: core/utils/PrintNonemptyVar.cmake:14
Print a list variable giving its name then value printed with spaces instead of ';', but only if the list is non-empty.
Usage:
print_nonempty_var_with_spaces(<varName> <printedVarOut>)
Prints the variable as:
<varName>: <ele0> <ele1> ...
If <printedVarOut> is TRUE on input, then the variable is not touched. If, however, <printedVarOut> is not TRUE and the list <varName> is non-empty, then <printedVarOut> is set to TRUE on output.
In: core/utils/PrintNonemptyVarWithSpaces.cmake:14
Unconditionally print a variable giving its name then value.
Usage:
print_var(<varName>)
This prints:
message("-- " "${VARIABLE_NAME}='${${VARIABLE_NAME}}'")
The variable <varName> can be defined, undefined, or empty. This uses an explicit "-- " line prefix so that it prints nicely even on Windows CMake.
In: core/utils/PrintVar.cmake:12
Print a defined variable giving its name then value printed with spaces instead of ';'.
Usage:
print_var_with_spaces(<varName> <printedVarInOut>)
Prints the variable as:
<varName>: <ele0> <ele1> ...
If <printedVarInOut> is TRUE on input, then the variable is not touched. If, however, <printedVarInOut> is not TRUE on input, then it is set to TRUE on output.
In: core/utils/PrintVarWithSpaces.cmake:14
Remove duplicate elements from a global list variable (removes boiler-plate code and errors).
Usage:
remove_global_duplicates(<globalVarName>)
This function is necessary in order to preserve the "global" nature of the variable. If one just calls list(REMOVE_DUPLICATES ...) it will actually create a local variable of the same name and shadow the global variable! That is a fun bug to track down! The variable <globalVarName> must be defined before this function is called. If <globalVarName> is actually not a global cache variable before this function is called it will be after it completes.
In: core/utils/RemoveGlobalDuplicates.cmake:14
Usage:
set_cache_on_off_empty(<varName> <initialVal> "<docString>" [FORCE])
Sets a special string cache variable with possible values "", "ON", or "OFF". This results in a nice drop-down box in the CMake cache manipulation GUIs.
In: core/utils/SetCacheOnOffEmpty.cmake:11
Give a local variable a default value if a non-empty value is not already set.
Usage:
set_default(<varName> <arg0> <arg1> ...)
If on input "${<varName>}"=="", then <varName> is set to the given default <arg0> <arg1> .... Otherwise, the existing non-empty value is preserved.
In: core/utils/SetDefault.cmake:11
Set a default value for a local variable and override from an environment variable of the same name if it is set.
Usage:
set_default_and_from_env(<varName> <defaultVal>)
First calls set_default(<varName> <defaultVal>) and then looks for an environment variable named <varName>, and if non-empty then overrides the value of the local variable <varName>.
This macro is primarily used in CTest code to provide a way to pass in the value of CMake variables. Older versions of ctest did not support the option -D <var>:<type>=<value> to allow variables to be set through the command-line like cmake always allowed.
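For example, a ctest -S driver script might set overridable defaults like (the variable values here are illustrative):
set_default_and_from_env(CTEST_BUILD_FLAGS "-j4")
set_default_and_from_env(CTEST_PARALLEL_LEVEL "4")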
In: core/utils/SetDefaultAndFromEnv.cmake:15
Split a string variable into a string array/list variable.
Usage:
split("<inputStr>" "<sepStr>" <outputStrListVar>)
The <sepStr> string is used with string(REGEX ...) to replace all occurrences of <sepStr> in <inputStr> with ";", writing the result into <outputStrListVar>.
WARNING: <sepStr> is interpreted as a regular expression (regex) so keep that in mind when considering special regex chars like '*', '.', etc!
In: core/utils/Split.cmake:11
Return the raw time in seconds (nano-second accuracy) since epoch, i.e., since 1970-01-01 00:00:00 UTC.
Usage:
timer_get_raw_seconds(<rawSecondsVar>)
This function is used along with timer_get_rel_seconds(), and timer_print_rel_time() to time big chunks of CMake code for timing and profiling purposes. See timer_print_rel_time() for more details and an example.
NOTE: This function runs an external process with execute_process() to run the date command. Therefore, it only works on Unix/Linux and other systems that have a standard date command. Since this uses execute_process(), this function should only be used to time very coarse-grained operations (i.e. that take longer than a second). If the date command does not exist, then ${<rawSecondsVar>} will be empty on output!
In: core/utils/TimingUtils.cmake:20
Return the relative time between start and stop seconds.
Usage:
timer_get_rel_seconds(<startSeconds> <endSeconds> <relSecondsOutVar>)
This simple function computes the relative number of seconds between <startSeconds> and <endSeconds> (returned from timer_get_raw_seconds()) and sets the result in the local variable <relSecondsOutVar>.
In: core/utils/TimingUtils.cmake:49
Print the relative time between start and stop timers in <min>m<sec>s format.
Usage:
timer_print_rel_time(<startSeconds> <endSeconds> "<messageStr>")
Differences the raw times <startSeconds> and <endSeconds> (i.e. gotten from timer_get_raw_seconds()) and prints the time in <min>m<sec>s format.
This is meant to be used with timer_get_raw_seconds() to time expensive blocks of CMake code like:
timer_get_raw_seconds(REAL_EXPENSIVE_START)
real_expensive(...)
timer_get_raw_seconds(REAL_EXPENSIVE_END)
timer_print_rel_time(${REAL_EXPENSIVE_START} ${REAL_EXPENSIVE_END} "real_expensive() time")
This will print something like:
real_expensive() time: 0m5.235s
In: core/utils/TimingUtils.cmake:86
Set up a string cache variable that must match a fixed set of values (i.e. an enum) and assert that it matches those values.
Usage:
tribits_add_enum_cache_var( <cacheVarName>
  DEFAULT_VAL <defaultVal>
  DOC_STRING "<docString>"
  ALLOWED_STRINGS_LIST "<val0>" "<val1>" ...
  [IS_ADVANCED]
  )
On output, <cacheVarName> will be set to the list of paths
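For example, a project might define a constrained string option like (all of the names and values here are hypothetical):
tribits_add_enum_cache_var(MyProj_SOME_MODE
  DEFAULT_VAL "BASIC"
  DOC_STRING "Mode for the (hypothetical) some-mode feature"
  ALLOWED_STRINGS_LIST "BASIC" "ADVANCED" "OFF"
  IS_ADVANCED
  )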
In: core/utils/TribitsAddEnumCacheVar.cmake:13
Set an advanced cache variable with a default value (passing in a default default value).
Usage:
tribits_advanced_set_cache_var_and_default(<cacheVarName> <cacheVarType> <defaultDefaultVal> <docString>)
If the variable <cacheVarName>_DEFAULT already exists with a value, that is used as the default cache variable. Otherwise, <cacheVarName>_DEFAULT is set to <defaultDefaultVal> first.
In: core/utils/TribitsSetCacheVarAndDefault.cmake:13
Notify the user that some TriBITS functionality is deprecated.
Usage:
tribits_deprecated(<message>)
Depending on the value of the cache variable TRIBITS_HANDLE_TRIBITS_DEPRECATED_CODE, this can do one of several things:
DEPRECATION: Issue a CMake DEPRECATION message and continue.
AUTHOR_WARNING: Issue a CMake AUTHOR_WARNING message and continue.
SEND_ERROR: Issue a CMake SEND_ERROR message and continue.
FATAL_ERROR: Issue a CMake FATAL_ERROR message and exit.
IGNORE: Issue no message and continue.
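For example, a deprecation notice emitted from TriBITS code might look like (the variable names here are made up):
tribits_deprecated("The variable SOME_OLD_CACHE_VAR is deprecated; use SOME_NEW_CACHE_VAR instead.")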
In: core/utils/TribitsDeprecatedHelpers.cmake:25
Notify the user that a TriBITS function or macro is deprecated. This should be the first command called at the top of any deprecated function or macro.
Usage:
tribits_deprecated_command(<name> [MESSAGE <message>] )
In: core/utils/TribitsDeprecatedHelpers.cmake:66
Create a reverse list var in one shot.
Usage:
tribits_create_reverse_list(<oldListName> <newListName>)
In: core/utils/TribitsCreateReverseList.cmake:11
Set a cache variable with a default value (passing in a default default value).
Usage:
tribits_set_cache_var_and_default(<cacheVarName> <cacheVarType> <defaultDefaultVal> <docString>)
If the variable <cacheVarName>_DEFAULT already exists with a value, that is used as the default cache variable. Otherwise, <cacheVarName>_DEFAULT is set to <defaultDefaultVal> first.
In: core/utils/TribitsSetCacheVarAndDefault.cmake:36
Remove one set of quotes from the outside of a string if they exist.
Usage:
tribits_strip_quotes_from_str(<str_in> <str_var_out>)
If <str_in> does not contain a quote char '"' as the first and last char, then the original <str_in> is returned in <str_var_out>.
In: core/utils/TribitsStripQuotesFromStr.cmake:1
Perform a single unit test equality check and update overall test statistics
Usage:
unittest_compare_const(<varName> <expectedValue>)
If ${<varName>} == <expectedValue>, then the check passes, otherwise it fails. This prints the variable name and values and shows the test result.
This updates the global variables UNITTEST_OVERALL_NUMRUN, UNITTEST_OVERALL_NUMPASSED, and UNITTEST_OVERALL_PASS which are used by the unit test harness system to assess overall pass/fail.
In: core/utils/UnitTestHelpers.cmake:34
Check that a given string var contains the given substring and update overall test statistics
Usage:
unittest_has_substr_const(<varName> <substr>)
If ${<varName>} contains the substring <substr>, then the check passes, otherwise it fails. This prints the variable name and values and shows the test result.
This updates the global variables UNITTEST_OVERALL_NUMRUN, UNITTEST_OVERALL_NUMPASSED, and UNITTEST_OVERALL_PASS which are used by the unit test harness system to assess overall pass/fail.
In: core/utils/UnitTestHelpers.cmake:318
Check that a given string var does NOT contain the given substring and update overall test statistics
Usage:
unittest_not_has_substr_const(<varName> <substr>)
If ${<varName>} contains the substring <substr>, then the check fails, otherwise it passes. This prints the variable name and values and shows the test result.
This updates the global variables UNITTEST_OVERALL_NUMRUN, UNITTEST_OVERALL_NUMPASSED, and UNITTEST_OVERALL_PASS which are used by the unit test harness system to assess overall pass/fail.
In: core/utils/UnitTestHelpers.cmake:364
Perform a series of regexes on a given string and update overall test statistics.
Usage:
unittest_string_regex( "<inputString>" REGEX_STRINGS "<str0>" "<str1>" ... )
If the <inputString> matches all of the regexes "<str0>", "<str1>", ..., then the test passes. Otherwise it fails.
This updates the global variables UNITTEST_OVERALL_NUMRUN, UNITTEST_OVERALL_NUMPASSED, and UNITTEST_OVERALL_PASS which are used by the unit test harness system to assess overall pass/fail.
In: core/utils/UnitTestHelpers.cmake:208
Perform a series of regexes on a given string variable and update overall test statistics.
Usage:
unittest_string_var_regex( <inputStringVar> REGEX_STRINGS "<str0>" "<str1>" ... )
If the "${<inputStringVar>}" matches all of the regexes "<str0>", "<str1>", ..., then the test passes. Otherwise it fails.
This updates the global variables UNITTEST_OVERALL_NUMRUN, UNITTEST_OVERALL_NUMPASSED, and UNITTEST_OVERALL_PASS which are used by the unit test harness system to assess overall pass/fail.
In: core/utils/UnitTestHelpers.cmake:267
Perform a series of regexes on the contents of a given file and update overall test statistics.
Usage:
unittest_file_regex( <inputFileName> REGEX_STRINGS "<str1>" "<str2>" ... )
The contents of <inputFileName> are read into a string and then passed to unittest_string_regex() to assess pass/fail.
In: core/utils/UnitTestHelpers.cmake:410
Print final statistics from all tests and assert final pass/fail
Usage:
unittest_final_result(<expectedNumPassed>)
If ${UNITTEST_OVERALL_PASS}==TRUE and ${UNITTEST_OVERALL_NUMPASSED} == <expectedNumPassed>, then the overall test program is determined to have passed and the string:
"Final UnitTests Result: PASSED"
is printed. Otherwise, the overall test program is determined to have failed, the string:
"Final UnitTests Result: FAILED"
is printed, and message(SEND_ERROR "FAIL") is called.
The reason that we require passing in the expected number of passed tests is as an extra precaution to make sure that important unit tests are not left out. CMake is a very loosely typed language and it pays to be a little paranoid.
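A small sketch of a cmake -P unit-test driver fragment using these helpers is shown below (the explicit initialization of the UNITTEST_OVERALL_* globals and the tested values are assumptions for illustration):
# Assumed initialization of the overall unit-test statistics (illustrative)
global_set(UNITTEST_OVERALL_PASS TRUE)
global_set(UNITTEST_OVERALL_NUMPASSED 0)
global_set(UNITTEST_OVERALL_NUMRUN 0)
set(SOME_OUTPUT "The operation completed OK")
unittest_has_substr_const(SOME_OUTPUT "completed OK")
unittest_not_has_substr_const(SOME_OUTPUT "FAILED")
unittest_final_result(2)  # Expect exactly 2 passed checks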
In: core/utils/UnitTestHelpers.cmake:431
This section contains more detailed information about the internal implementation of TriBITS. This information is meant to make it easier to understand and manipulate the data-structures and macros/functions that make up internal implementation of TriBITS and is important for TriBITS System Developers and TriBITS System Architects.
This section describes the global CMake variables that make up the data-structures defining the TriBITS package dependency system and the macros/functions that create them. All of these variables exist at the base project-level CMakeLists.txt file and are typically not cache variables (and therefore are recomputed on every reconfigure, which allows them to accommodate changing enables/disables without a full reconfigure from scratch). These variables define a graph of external packages/TPLs (i.e. pre-built and found out on the system) and internal packages (i.e. buildable CMake packages). This information is meant for maintainers of the TriBITS system itself and should not need to be known by TriBITS Project Developers or even TriBITS Project Architects.
Before describing the TriBITS package architecture data-structures and the macros/functions that create and manipulate those data-structures in detail, first we define some naming conventions for TriBITS macros/function and variables.
First, the term "Package" is overloaded in the context of TriBITS (and in CMake for that matter). In the context of TriBITS, here are the different types of entities called a "Package":
Also, there is the collection of all three of these TriBITS-related "packages".
To try to avoid ambiguity, we define the following identifiers that appear in TriBITS variable, macro, and function names:
TriBITS uses the following general naming conventions for variables, macros, functions, and module files:
The user-level variables that define a TriBITS Project, Repository, Package and Subpackage are listed in:
These are variables that can be accessed by TriBITS Project Developers but are also used in the internal implementation of TriBITS functionality.
List of non-cache top-level project variables:
All of the above list variables are sorted in a valid dependency ordering in that any upstream dependent packages are listed before a given package in these lists. After these variables have been set in the macro tribits_read_all_project_deps_files_create_deps_graph(), they should be considered to be constant and not modified.
These variables are described in more detail below.
The original list of all defined external packages (TPLs) read from the processed <repoDir>/TPLsList.cmake files is given in the list variable:
${PROJECT_NAME}_DEFINED_TPLS
with size:
${PROJECT_NAME}_NUM_DEFINED_TPLS
The original list of all defined internal top-level packages read in from the processed <repoDir>/PackagesList.cmake files is given in the list variable:
${PROJECT_NAME}_DEFINED_INTERNAL_TOPLEVEL_PACKAGES
with size:
${PROJECT_NAME}_NUM_DEFINED_INTERNAL_TOPLEVEL_PACKAGES
In this list, a defined internal TriBITS Package (i.e. a package that can be built from source) will have ${PACKAGE_NAME}_SOURCE_DIR != "" while a defined external package/TPL will have ${PACKAGE_NAME}_FINDMOD != "".
The full list of defined external packages/TPLs and top-level internal packages (i.e. TriBITS top-level packages) (not including subpackages) is stored in the project-level non-cache list variable:
${PROJECT_NAME}_DEFINED_TOPLEVEL_PACKAGES
with size:
${PROJECT_NAME}_NUM_DEFINED_TOPLEVEL_PACKAGES
The first set of elements in this list are the defined external packages/TPLs that are read in from the <repoDir>/TPLsList.cmake files from each processed TriBITS repository, in order. This is followed by the set of internal packages (TriBITS packages) that are defined in the <repoDir>/PackagesList.cmake files from each processed TriBITS repository, read in order. This list does not include any subpackages.
Note that some of the packages listed in ${PROJECT_NAME}_DEFINED_INTERNAL_TOPLEVEL_PACKAGES may actually be treated as external packages and not built from source code and instead will be found on the system as pre-built/pre-installed packages using find_package(<PackageName>). The final decision for if a package is treated as an internal or external package is determined by the variable:
${PACKAGE_NAME}_PACKAGE_BUILD_STATUS=[INTERNAL|EXTERNAL]
which gets set using various criteria as described in section Determining if a package is internal or external. This variable determines what pre-built/pre-installed packages must be found out on the system if enabled and what internal packages need to be built if enabled.
The set of external packages, internal top-level packages, and internal sub-packages are just called the list of "Packages". When the term "Packages" is used without an adjective, it is usually meant in this more general context.
The set of all of the defined internal top-level packages and subpackages is given by the non-cache project-level list variable:
${PROJECT_NAME}_DEFINED_INTERNAL_PACKAGES
with the size:
${PROJECT_NAME}_NUM_DEFINED_INTERNAL_PACKAGES
The set of all of the defined external packages/TPLs, internal top-level packages and subpackages is given by the non-cache project-level list variable:
${PROJECT_NAME}_DEFINED_PACKAGES
with the size:
${PROJECT_NAME}_NUM_DEFINED_PACKAGES
These data-structures, as well as the package dependencies graph, are built up in the macro tribits_read_all_project_deps_files_create_deps_graph() with the call graph described in the section Function call tree for constructing package dependency graph. These data-structures don't consider what packages are actually enabled or disabled.
The enable/disable logic (given an initial set of enables and disables) is applied in the macro tribits_adjust_package_enables(). Once all of this logic has been applied, several lists of enabled and non-enabled packages are computed.
The list of enabled internal top-level packages is given in the non-cache project-level list variable:
${PROJECT_NAME}_ENABLED_INTERNAL_TOPLEVEL_PACKAGES
with size:
${PROJECT_NAME}_NUM_ENABLED_INTERNAL_TOPLEVEL_PACKAGES
The list of enabled external packages/TPLs and internal top-level packages is given in the non-cache project-level list variable:
${PROJECT_NAME}_ENABLED_TOPLEVEL_PACKAGES
with size:
${PROJECT_NAME}_NUM_ENABLED_TOPLEVEL_PACKAGES
The list of enabled external packages/TPLs, internal top-level and subpackages is given in the non-cache project-level list variable:
${PROJECT_NAME}_ENABLED_PACKAGES
with size:
${PROJECT_NAME}_NUM_ENABLED_PACKAGES
TriBITS sets up the following project-level non-cache variables that define the dependencies for each external package/TPL and internal package:
${PACKAGE_NAME}_LIB_DEFINED_DEPENDENCIES
The list of all defined direct required and optional upstream external package/TPL and internal package dependencies, regardless of whether they are enabled or not. To determine if a given direct upstream package <depPkg> in this list is enabled or not for this package ${PACKAGE_NAME}, check the value of ${PACKAGE_NAME}_ENABLE_<depPkg>. NOTE: The variables ${PACKAGE_NAME}_ENABLE_<depPkg> will be set even for required upstream packages to allow for uniform loops involving required and optional upstream dependencies. (And for a parent package with subpackages, it is possible for a required subpackage to not be enabled and for ${PACKAGE_NAME}_ENABLE_<depPkg> to be OFF as explained in Subpackage enable does not auto-enable the parent package.) This list will be set regardless of whether the package ${PACKAGE_NAME} is enabled or not.
${PACKAGE_NAME}_LIB_ENABLED_DEPENDENCIES
List of all enabled direct required and optional upstream external package/TPL and internal package dependencies. This is a strict subset of ${PACKAGE_NAME}_LIB_DEFINED_DEPENDENCIES (i.e. all of the <depPkg> items in this list will have ${PACKAGE_NAME}_ENABLE_<depPkg> set to ON).
${PACKAGE_NAME}_LIB_DEP_REQUIRED_<depPkg>
Is TRUE if the entry <depPkg> in ${PACKAGE_NAME}_LIB_DEFINED_DEPENDENCIES or ${PACKAGE_NAME}_LIB_ENABLED_DEPENDENCIES is a required LIB dependency and is FALSE if it is only an optional LIB dependency.
${PACKAGE_NAME}_TEST_DEFINED_DEPENDENCIES
The list of all defined direct extra test required and optional upstream external package/TPL and internal package dependencies. This list is set regardless of whether the package ${PACKAGE_NAME} is enabled or not. NOTE: This list does not contain the items in the list ${PACKAGE_NAME}_LIB_DEFINED_DEPENDENCIES (but those are implicitly also required/optional test dependencies as well).
${PACKAGE_NAME}_TEST_ENABLED_DEPENDENCIES
The list of all enabled direct extra required and optional upstream external package/TPL and internal package dependencies. This is a strict subset of ${PACKAGE_NAME}_TEST_DEFINED_DEPENDENCIES.
${PACKAGE_NAME}_TEST_DEP_REQUIRED_<depPkg>
Is TRUE if the entry <depPkg> in ${PACKAGE_NAME}_TEST_DEFINED_DEPENDENCIES or ${PACKAGE_NAME}_TEST_ENABLED_DEPENDENCIES is a required TEST dependency and is FALSE if it is only an optional TEST dependency. For the sake of simplicity and generality, ${PACKAGE_NAME}_TEST_DEP_REQUIRED_<depPkg> will also be set to TRUE or FALSE for <depPkg> in the lists ${PACKAGE_NAME}_LIB_DEFINED_DEPENDENCIES or ${PACKAGE_NAME}_LIB_ENABLED_DEPENDENCIES because a LIB dependency is also implicitly a TEST dependency.
NOTE: The same upstream package <depPkg> can be included in both the lists ${PACKAGE_NAME}_LIB_DEFINED_DEPENDENCIES and ${PACKAGE_NAME}_TEST_DEFINED_DEPENDENCIES if <depPkg> is optional in the former but required in the latter (which is a valid situation if you think about it: a package that is optional for the libs of a package may be required by the tests for that package). (Otherwise, duplicate entries will be removed from the list ${PACKAGE_NAME}_TEST_DEFINED_DEPENDENCIES.)
NOTE: Having flat lists containing both optional and required dependencies with the bool variables ${PACKAGE_NAME}_[LIB|TEST]_DEP_REQUIRED_<depPkg> defining which entries are required or optional is modeled after the CMake standard for handing the COMPONENTS and OPTIONAL_COMPONENTS arguments to find_package() in that it passes that info to the <Package>Config.cmake file as the single list variable ${CMAKE_FIND_PACKAGE_NAME}_FIND_COMPONENTS and the bool vars ${CMAKE_FIND_PACKAGE_NAME}_FIND_REQUIRED_<comp>.
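For example, a hypothetical loop over a package's direct enabled upstream dependencies that distinguishes required from optional LIB dependencies might look like:
foreach(depPkg IN LISTS ${PACKAGE_NAME}_LIB_ENABLED_DEPENDENCIES)
  if (${PACKAGE_NAME}_LIB_DEP_REQUIRED_${depPkg})
    message("-- ${depPkg} is a required LIB dependency of ${PACKAGE_NAME}")
  else()
    message("-- ${depPkg} is an optional LIB dependency of ${PACKAGE_NAME}")
  endif()
endforeach()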
Given the above upstream dependency list variables, the following derived list variables are then constructed which provide navigation from a package to its downstream/forward dependent packages:
${PACKAGE_NAME}_FORWARD_LIB_DEFINED_DEPENDENCIES
For a given package ${PACKAGE_NAME}, lists the names of all of the forward packages <fwdDepPkg> that list this package in their <fwdDepPkg>_LIB_DEFINED_PACKAGES variables.
${PACKAGE_NAME}_FORWARD_TEST_DEFINED_DEPENDENCIES
For a given package ${PACKAGE_NAME}, lists the names of all of the forward packages <fwdDepPkg> that list this package in their <fwdDepPkg>_TEST_DEFINED_PACKAGES variables.
The following variables can be set by the user to determine what packages get enabled or disabled:
${PROJECT_NAME}_ENABLE_ALL_PACKAGES
${PROJECT_NAME}_ENABLE_ALL_FORWARD_DEP_PACKAGES
${PROJECT_NAME}_ENABLE_ALL_OPTIONAL_PACKAGES
${PROJECT_NAME}_ENABLE_${PACKAGE_NAME}
${PROJECT_NAME}_ENABLE_TESTS
${PROJECT_NAME}_ENABLE_EXAMPLES
${PACKAGE_NAME}_ENABLE_${OPTIONAL_DEP_PACKAGE_NAME}
${PACKAGE_NAME}_ENABLE_TESTS
${PACKAGE_NAME}_ENABLE_EXAMPLES
according to the rules described in Package Dependencies and Enable/Disable Logic.
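For example, a user of a hypothetical project MyProject containing a package MyPkg (the project and package names here are illustrative only) might set a few of these cache variables at configure time with something like:

$ cmake \
  -D MyProject_ENABLE_ALL_PACKAGES=OFF \
  -D MyProject_ENABLE_MyPkg=ON \
  -D MyPkg_ENABLE_TESTS=ON \
  <path-to-project-source>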
As mentioned above, some subset of the initially internal packages listed in ${PROJECT_NAME}_DEFINED_INTERNAL_TOPLEVEL_PACKAGES (which all have ${PACKAGE_NAME}_SOURCE_DIR != "") may be chosen to be treated as external packages (and therefore located on the system using find_package()) by setting:
-D TPL_ENABLE_<PackageTreatedAsExternal>=ON
The final status of whether a package is treated as an internal package or an external package is provided by the variable:
${PACKAGE_NAME}_PACKAGE_BUILD_STATUS=[INTERNAL|EXTERNAL]
(NOTE: The value of ${PACKAGE_NAME}_PACKAGE_BUILD_STATUS is only changed after all of the enable/disable dependency logic is complete.)
As a result, every other package upstream from any of these <PackageTreatedAsExternal> packages must therefore also be treated as external packages automatically and will have ${PACKAGE_NAME}_PACKAGE_BUILD_STATUS=EXTERNAL set accordingly. Also, if any subpackage is determined to be EXTERNAL, then the parent package of that subpackage and every other peer subpackage will also be set to EXTERNAL.
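For instance, downstream CMake logic can branch on this final status; a minimal sketch (MyPkg is a hypothetical package name) is:

if (MyPkg_PACKAGE_BUILD_STATUS STREQUAL "EXTERNAL")
  message("-- MyPkg will be found pre-built on the system via find_package()")
else()
  message("-- MyPkg will be configured and built as an internal package")
endif()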
The processing of external packages/TPLs is influenced by whether the external package is a regular TriBITS TPL (i.e., one with a FindTPL<tplName>.cmake module) or is a TriBITS-compliant external package. Here, a TriBITS-Compliant External Package has a <tplName>Config.cmake file that satisfies the following properties:
That means that when calling find_package() for a fully TriBITS-compliant external package, there is no need to worry about finding any of its upstream dependent external packages. As a result, any external packages/TPLs defined in a TriBITS project that are upstream from a TriBITS-compliant external package will be uniquely defined by calling find_package() on the most downstream TriBITS-compliant external package that depends on them. Therefore, defining the external packages and their targets in this set of external packages just involves calling find_package() on the terminal TriBITS-compliant external packages (i.e. TriBITS-compliant external packages that don't have any downstream dependencies that are external packages). Then the remaining subset of external packages/TPLs that don't have a downstream TriBITS-compliant external package dependency will be defined as usual. (However, as mentioned above, some of these are not fully TriBITS compliant and don't fully define the <UpstreamTpl>::all_libs targets for all of their upstream dependencies; see below.)
By having all fully TriBITS-compliant external packages, an external dependency is never found more than once.
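A rough illustration of this behavior, assuming hypothetical external packages UpstreamExtPkg and DownstreamExtPkg where the latter is the terminal TriBITS-compliant external package that depends on the former, is:

# Finding only the terminal TriBITS-compliant external package ...
find_package(DownstreamExtPkg CONFIG REQUIRED)
# ... is expected to also define the IMPORTED targets of its upstream
# external packages (e.g. UpstreamExtPkg::all_libs), so no separate
# find_package(UpstreamExtPkg ...) call should be needed and the upstream
# dependency is found only once.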
The variables that are set internally to define these different subsets of external packages/TPLs are:
An external package with <Package>_IS_TRIBITS_COMPLIANT=TRUE AND <Package>_PROCESSED_BY_DOWNSTREAM_TRIBITS_EXTERNAL_PACKAGE=FALSE is the one for which find_package(<Package> CONFIG REQUIRED) will be called and does not have any downstream packages that are being treated as external packages. Also, find_package(<Package> CONFIG REQUIRED) will be called on TriBITS-compliant external packages if <Package>::all_libs was not defined by a downstream non fully TriBITS-compliant external package.
The variable <Package>_IS_TRIBITS_COMPLIANT is set right when the packages are initially defined by reading in the various input files. That is, all initially internal packages that are listed in a <repoDir>/PackagesList.cmake file will have <Package>_IS_TRIBITS_COMPLIANT=TRUE set, while all external packages/TPLs listed in a <repoDir>/TPLsList.cmake file will have <Package>_IS_TRIBITS_COMPLIANT=FALSE set (except for those tagged with TRIBITS_PKG, which will have <Package>_IS_TRIBITS_COMPLIANT=TRUE set).
The processing of external packages/TPLs is done in two loops:
For more details, see the implementation in tribits_process_enabled_tpls().
Below is the CMake macro and function call graph for constructing the packages lists and dependency data-structures described above.
These are key macros and functions that are used to implement the guts of TriBITS that TriBITS System Maintainers need to know about in order to understand the internals of TriBITS.
Usage:
tribits_abort_on_missing_package(<depPkg> <packageName>)
Function that creates an error message about a missing/misspelled package. This error message also suggests that the package might be defining an upstream dependency on a downstream dependency (i.e. a circular dependency).
See Function call tree for constructing package dependency graph.
In: core/package_arch/TribitsReadDepsFilesCreateDepsGraph.cmake:667
Usage:
tribits_abort_on_self_dep(<packageName> <depPkgListName>)
Prints a fatal error message for an attempted self-dependency declaration and reports which list it comes from.
See Function call tree for constructing package dependency graph.
In: core/package_arch/TribitsReadDepsFilesCreateDepsGraph.cmake:696
Usage:
tribits_adjust_package_enables()
Macro that adjusts all of the package enables from what the user input to the final set that will be used to enable packages.
In: core/package_arch/TribitsAdjustPackageEnables.cmake:43
Appends forward/downstream package dependency lists for the upstream dependent package list provided.
Usage:
tribits_append_forward_dep_packages(<packageName> <listType>)
In particular, it appends the var:
<packageName>_FORWARD_<listType>
for one of the vars listed in Variables defining the package dependencies graph.
This function is called multiple times to build up the forward package dependencies for a given <packageName> by the downstream packages that declare dependencies on it.
See Function call tree for constructing package dependency graph.
In: core/package_arch/TribitsReadDepsFilesCreateDepsGraph.cmake:569
Usage:
tribits_assert_read_dependency_vars(<packageName>)
Assert that all of the required variables set by the function tribits_package_define_dependencies() in the file <packageDir>/cmake/Dependencies.cmake have been set.
See Function call tree for constructing package dependency graph.
In: core/package_arch/TribitsReadDepsFilesCreateDepsGraph.cmake:329
Usage:
tribits_dump_package_dependencies_info()
Function that dumps (prints to STDOUT) the package dependency info if ${PROJECT_NAME}_DUMP_PACKAGE_DEPENDENCIES==TRUE.
This function does not modify any state!
In: core/package_arch/TribitsPrintDependencyInfo.cmake:56
Creates the <tplName>::all_libs target command text using input info and from TPL_<tplName>_INCLUDE_DIRS.
Usage:
tribits_extpkg_append_create_all_libs_target_str( <tplName> LIB_TARGETS_LIST <libTargetsList> LIB_LINK_FLAGS_LIST <libLinkFlagsList> CONFIG_FILE_STR_INOUT <configFileFragStrInOut> )
The arguments are:
<tplName>: [in] Name of the external package/TPL
<libTargetsList>: [in] List of targets created from processing TPL_<tplName>_LIBRARIES.
<libLinkFlagsList>: [in] List of -L<dir> library directory path entries found while processing TPL_<tplName>_LIBRARIES.
<configFileFragStrInOut>: [out] A string variable that will be appended with the <tplName>::all_libs target statements.
In: core/package_arch/TribitsExternalPackageWriteConfigFile.cmake:821
Add includes for all upstream external packages/TPLs listed in <tplName>_LIB_ENABLED_DEPENDENCIES.
Usage:
tribits_extpkg_append_find_upstream_dependencies_str(tplName configFileFragStrInOut)
NOTE: This also requires that <upstreamTplName>_TRIBITS_COMPLIANT_PACKAGE_CONFIG_FILE be set for each external package/TPL listed in <tplName>_LIB_ENABLED_DEPENDENCIES.
In: core/package_arch/TribitsExternalPackageWriteConfigFile.cmake:396
Install an already-generated <tplName>Config.cmake file.
Usage:
tribits_write_external_package_install_config_file( <tplName> <tplConfigFile> )
The arguments are:
<tplName>: Name of the external package/TPL
<tplConfigFile>: Full file path for the <tplName>Config.cmake file that will be installed into the correct location.
In: core/package_arch/TribitsExternalPackageWriteConfigFile.cmake:97
Install an already-generated <tplName>ConfigVersion.cmake file.
Usage:
tribits_write_external_package_install_config_version_file( <tplName> <tplConfigVersionFile> )
The arguments are:
<tplName>: Name of the external package/TPL
<tplConfigVersionFile>: Full file path for the <tplName>ConfigVersion.cmake file that will be installed into the correct location ${${PROJECT_NAME}_INSTALL_LIB_DIR}/external_packages/
In: core/package_arch/TribitsExternalPackageWriteConfigFile.cmake:121
Read the TPL_<tplName>_LIBRARIES and <tplName>_LIB_ENABLED_DEPENDENCIES list variables and produce the string for the IMPORTED targets commands with upstream linkages and return list of targets and left over linker flags.
Usage:
tribits_extpkg_process_libraries_list( <tplName> LIB_TARGETS_LIST_OUT <libTargetsListOut> LIB_LINK_FLAGS_LIST_OUT <libLinkFlagsListOut> CONFIG_FILE_STR_INOUT <configFileFragStrInOut> )
The arguments are:
<tplName>: [In] Name of the external package/TPL
<libTargetsListOut>: [Out] Name of list variable that will be set with the list of IMPORTED library targets generated from this list.
<libLinkFlagsListOut>: [Out] Name of list variable that will be set with the list of -L<dir> library directory paths.
<configFileFragStrInOut>: [Inout] A string variable that will be appended with the IMPORTED library commands for the list of targets given in <libTargetsList>.
In: core/package_arch/TribitsExternalPackageWriteConfigFile.cmake:446
Usage:
tribits_extpkg_setup_enabled_dependencies(<externalPkgName>)
Macro that sets up the list of enabled external package/TPL dependencies.
Takes the list <externalPkgName>_LIB_DEFINED_DEPENDENCIES and sets the default entries of the non-cache var <externalPkgName>_LIB_ENABLED_DEPENDENCIES. However, if <externalPkgName>_LIB_ENABLED_DEPENDENCIES is non-empty when this macro is called, then it will not be changed. That allows the user to override the list of enabled TPL dependencies in the cache. This also sets the non-cache vars <externalPkgName>_ENABLE_<upstreamPkgName>=ON for each enabled package listed in <externalPkgName>_LIB_ENABLED_DEPENDENCIES and to OFF for each <upstreamPkgName> listed in <externalPkgName>_LIB_DEFINED_DEPENDENCIES but not in <externalPkgName>_LIB_ENABLED_DEPENDENCIES.
In: core/package_arch/TribitsPackageDependencies.cmake:86
Write out a <tplName>ConfigVersion.cmake file.
Usage:
tribits_write_external_package_config_version_file( <tplName> <tplConfigVersionFile> )
ToDo: Add version arguments!
The arguments are:
<tplName>: Name of the external package/TPL
<tplConfigVersionFile>: Full file path for the <tplName>ConfigVersion.cmake file that will be written out.
In: core/package_arch/TribitsExternalPackageWriteConfigFile.cmake:57
Filter a list of packages based on several criteria, including internal/external status and enable status (with empty or non-empty enable values).
Usage:
tribits_filter_package_list_from_var( <packageListVarName> <internalOrExternal> <enabledFlag> <enableEmptyStatus> <packageSublistOut> )
Where:
In: core/package_arch/TribitsGetPackageSublists.cmake:13
Get sub-list of disabled packages
Usage:
tribits_get_sublist_disabled( <enableListName> <disabledSublistNameOut> [<numDisabledVarOut>] )
On output, <disabledSublistNameOut> contains the sublist of entries in <enableListName> that evaluate to FALSE and are not empty "" in an if () statement.
In: core/package_arch/TribitsGetPackageSublists.cmake:123
Get sub-list of enabled packages
Usage:
tribits_get_sublist_enabled( <enableListName> <enabledSublistNameOut> [<numEnabledVarOut>] )
On output, <enabledSublistNameOut> contains the sublist of entries in <enableListName> which evaluate to TRUE in an if () statement.
In: core/package_arch/TribitsGetPackageSublists.cmake:62
Get sub-list of packages that are INTERNAL, EXTERNAL, or either.
Usage:
tribits_get_sublist_internal_external( <inputPackageListName> <internalOrExternal> <sublistNameOut> [<sizeSublistOut>] )
where:
- <internalOrExternal> is either INTERNAL, EXTERNAL or empty "".
On output, <sublistNameOut> contains the sublist of entries in <inputPackageListName> which are either INTERNAL or EXTERNAL or both (if <internalOrExternal> is "") based on <Package>_PACKAGE_BUILD_STATUS for each element package name.
In: core/package_arch/TribitsGetPackageSublists.cmake:185
Get sub-list of non-disabled packages
Usage:
tribits_get_sublist_nondisabled( <enableListName> <nondisabledListNameOut> [<numNondisabledVarOut>] )
On output, <nondisabledListNameOut> contains the sublist of entries from <enableListName> that evaluate to TRUE or are empty "" in an if () statement.
In: core/package_arch/TribitsGetPackageSublists.cmake:92
Get sub-list of non-enabled entries
Usage:
tribits_get_sublist_nonenabled( <enableListName> <nonenabledListNameOut> [<numNonenabledVarOut>] )
On output, <nonenabledListNameOut> contains the subset of entries in <enableListName> that evaluate to FALSE (which can also be empty "") in an if () statement.
In: core/package_arch/TribitsGetPackageSublists.cmake:154
Usage:
tribits_print_initial_dependency_info()
Function that prints whatever initial dependency information is available and requested by the user after the initial construction of the package dependency graph but before the call to tribits_adjust_package_enables().
Calls:
In: core/package_arch/TribitsPrintDependencyInfo.cmake:14
Usage:
tribits_print_tentatively_enabled_tpls()
Function that prints the set of tentatively enabled TPLs.
Does not modify any state!
In: core/package_arch/TribitsPrintDependencyInfo.cmake:36
Usage:
tribits_parse_subpackages_append_packages_add_options(<parentPackageName>)
Macro that parses the read-in variable SUBPACKAGES_DIRS_CLASSIFICATIONS_OPTREQS set by the macro tribits_package_define_dependencies(), adds subpackages to the list of defined packages, and defines user cache var options for those subpackages.
This sets the list variables for the parent package <parentPackageName>:
<parentPackageName>_SUBPACKAGES <parentPackageName>_SUBPACKAGE_DIRS <parentPackageName>_SUBPACKAGE_OPTREQ
For each subpackage <subpackageFullName>, this sets:
<subpackageFullName>_SOURCE_DIR <subpackageFullName>_REL_SOURCE_DIR <subpackageFullName>_PARENT_PACKAGE <subpackageFullName>_PARENT_REPOSITORY
And it appends each subpackage to the list variable:
See Function call tree for constructing package dependency graph.
In: core/package_arch/TribitsReadDepsFilesCreateDepsGraph.cmake:720
Usage:
tribits_prep_to_read_dependencies(<packageName>)
Macro that sets to undefined all of the variables that must be set by the tribits_package_define_dependencies() macro.
It also sets to empty the forward dependency list vars:
for each of the forward/downstream package/dependency in Variables defining the package dependencies graph.
See Function call tree for constructing package dependency graph.
In: core/package_arch/TribitsReadDepsFilesCreateDepsGraph.cmake:288
Process any dependency logic at the repo level by loading <repoDir>/cmake/RepositoryDependenciesSetup.cmake files.
See Function call tree for constructing package dependency graph.
In: core/package_arch/TribitsReadDepsFilesCreateDepsGraph.cmake:60
Gather information and targets from enabled TPLs
For more info, see Processing of external packages/TPLs and TriBITS-compliant external packages.
In: core/package_arch/TribitsProcessEnabledTpls.cmake:24
Usage:
tribits_process_package_dependencies_lists(<packageName>)
Sets up the upstream/backward and downstream/forward package dependency list variables for <packageName> described in Variables defining the package dependencies graph. Note that the downstream/forward dependencies of upstream packages for this package <packageName> are built up incrementally. (The forward dependency list vars are initialized to empty in tribits_prep_to_read_dependencies().)
See Function call tree for constructing package dependency graph.
In: core/package_arch/TribitsReadDepsFilesCreateDepsGraph.cmake:407
Usage:
tribits_process_packages_and_dirs_lists()
Macro that processes the list variable:
${REPOSITORY_NAME}_PACKAGES_AND_DIRS_AND_CLASSIFICATIONS
from a <repoDir>/PackagesList.cmake file that just got read in and creates/updates the top-level non-cache variables:
- ${PROJECT_NAME}_DEFINED_INTERNAL_TOPLEVEL_PACKAGES
- ${PROJECT_NAME}_NUM_DEFINED_INTERNAL_TOPLEVEL_PACKAGES
- ${PROJECT_NAME}_LAST_DEFINED_INTERNAL_TOPLEVEL_PACKAGE_IDX
For each of the listed top-level (parent) packages ${PACKAGE_NAME}, it also sets up constant variables defined in TriBITS Package Top-Level Local Variables like:
and sets up some standard enable/disable vars with default values as defined in TriBITS Package Cache Variables like:
NOTE: Set TRIBITS_PROCESS_PACKAGES_AND_DIRS_LISTS_VERBOSE=TRUE to see really verbose debug output from this macro.
See Function call tree for constructing package dependency graph.
In: core/package_arch/TribitsProcessPackagesAndDirsLists.cmake:338
Process any dependency logic at the project level by loading the <projectDir>/cmake/ProjectDependenciesSetup.cmake file
See Function call tree for constructing package dependency graph.
In: core/package_arch/TribitsReadDepsFilesCreateDepsGraph.cmake:92
This macro processes the project-level variable:
${REPOSITORY_NAME}_TPLS_FINDMODS_CLASSIFICATIONS
This updates the project-level variables:
For each TPL, it also sets the variables:
See Function call tree for constructing package dependency graph
In: core/package_arch/TribitsProcessTplsLists.cmake:89
Usage:
tribits_read_all_package_deps_files_create_deps_graph()
This macro reads in all of the <packageDir>/cmake/Dependencies.cmake and <packageDir>/<spkgDir>/cmake/Dependencies.cmake files for top-level packages and subpackages, respectively, and builds the package dependency graph variables.
This macro reads from the variables:
And writes to the variable:
as well as creates the package dependency variables described in Variables defining the package dependencies graph that define the package dependency directed acyclic graph (DAG) (with navigation up and down the graph).
See Function call tree for constructing package dependency graph.
In: core/package_arch/TribitsReadDepsFilesCreateDepsGraph.cmake:119
Usage:
tribits_read_all_project_deps_files_create_deps_graph()
Macro run at the top project-level scope that reads the lists of packages and TPLs and creates the package dependency graph.
On output, this creates all of the package lists and dependency data-structures described in the section TriBITS System Data Structures and more specifically the sections:
See Function call tree for constructing package dependency graph.
In: core/package_arch/TribitsReadAllProjectDepsFilesCreateDepsGraph.cmake:23
Usage:
tribits_read_back_dependencies_vars(<postfix>)
Read back the local package dependency vars from the saved-off vars with suffix _<postfix>.
See Function call tree for constructing package dependency graph.
In: core/package_arch/TribitsReadDepsFilesCreateDepsGraph.cmake:381
Usage:
tribits_read_defined_external_and_internal_toplevel_packages_lists()
Macro run at the top project-level scope that reads in the contents of all of the <repoDir>/TPLsList.cmake and <repoDir>/PackagesList.cmake files to get the list of defined external packages (TPLs) and internal top-level (TriBITS) packages.
On output, this produces the local variables:
and the length vars for these:
This includes the files:
and calls the macros:
which set their variables.
See Function call tree for constructing package dependency graph
In: core/package_arch/TribitsReadAllProjectDepsFilesCreateDepsGraph.cmake:65
Usage:
tribits_read_deps_files_create_deps_graph()
This macro reads all of the package dependencies and builds the package dependency graph. It first executes the logic in the files <repoDir>/cmake/RepositoryDependenciesSetup.cmake (for each TriBITS repo) and <projectDir>/cmake/ProjectDependenciesSetup.cmake and then reads in all of the <packageDir>/cmake/Dependencies.cmake and <packageDir>/<spkgDir>/cmake/Dependencies.cmake files and builds the package dependency graph variables.
This macro reads from the variables:
and writes to the variables:
as well as creates the package dependency variables described in Variables defining the package dependencies graph that define the package dependency directed acyclic graph (DAG) (with navigation up and down the graph).
See Function call tree for constructing package dependency graph.
In: core/package_arch/TribitsReadDepsFilesCreateDepsGraph.cmake:15
Reads in dependencies for the external packages/TPL <tplName> and creates the package dependency graph entries for it.
Usage:
tribits_read_external_package_deps_files_add_to_graph(<tplName>)
This reads in the file ${<tplName>_DEPENDENCIES_FILE} and sets the variable:
<tplName>_LIB_DEFINED_DEPENDENCIES
See Function call tree for constructing package dependency graph.
In: core/package_arch/TribitsReadDepsFilesCreateDepsGraph.cmake:165
Usage:
tribits_read_package_subpackage_deps_files_add_to_graph(<toplevelPackageName>)
Read in subpackages dependencies files and add to dependencies graph variables.
See Function call tree for constructing package dependency graph.
In: core/package_arch/TribitsReadDepsFilesCreateDepsGraph.cmake:851
Usage:
tribits_read_subpackage_deps_file_add_to_graph(<toplevelPackageName> <subpackageName> <subpackageDir>)
Macro that reads in a single subpackage dependencies file <packageDir>/<spkgDir>/cmake/Dependencies.cmake and sets up the dependency structure for it.
See Function call tree for constructing package dependency graph.
In: core/package_arch/TribitsReadDepsFilesCreateDepsGraph.cmake:875
Usage:
tribits_read_toplevel_package_deps_files_add_to_graph(<packageName>)
Macro that reads in package dependencies for a top-level package from the file <packageDir>/cmake/Dependencies.cmake and appends the forward dependencies list vars for packages already read in for this package <packageName> (see Variables defining the package dependencies graph).
It also appends the list variable:
Also, the subpackage dependencies under this top-level package are read in order, and then this top-level package is appended and its dependencies are created.
See Function call tree for constructing package dependency graph.
In: core/package_arch/TribitsReadDepsFilesCreateDepsGraph.cmake:202
Usage:
tribits_save_off_dependency_vars(<postfix>)
Saves off package dependency variables with variable suffix _<postfix>.
See Function call tree for constructing package dependency graph.
In: core/package_arch/TribitsReadDepsFilesCreateDepsGraph.cmake:356
Macro that sets up the backward package dependency lists for a given package, given the vars read in from the macro tribits_package_define_dependencies().
Usage:
tribits_set_dep_packages(<packageName> <testOrLib> <requiredOrOptional> <pkgsOrTpls>)
where:
Sets the upstream/backward dependency variables defined in the section Variables defining the package dependencies graph.
This also handles several types of issues:
See Function call tree for constructing package dependency graph.
In: core/package_arch/TribitsReadDepsFilesCreateDepsGraph.cmake:451
Print trace of file processing when ${PROJECT_NAME}_TRACE_FILE_PROCESSING is TRUE.
Usage:
tribits_trace_file_processing( <type> <processingType> <filePath>)
Arguments:
In: core/package_arch/TribitsGeneralMacros.cmake:103
Create the <Package>ConfigTargets.cmake file and install rules and the install() target for the previously generated <Package>Config_install.cmake files generated by the tribits_write_flexible_package_client_export_files() function.
Usage:
tribits_write_package_client_export_files_export_and_install_targets( PACKAGE_NAME <packageName> PACKAGE_CONFIG_FOR_BUILD_BASE_DIR <packageConfigForBuildBaseDir> PACKAGE_CONFIG_FOR_INSTALL_BASE_DIR <packageConfigForInstallBaseDir> )
The install() commands must be in a different subroutine or CMake will not allow you to call the routine, even if you if() it out!
In: core/package_arch/TribitsInternalPackageWriteConfigFile.cmake:62
Usage:
tribits_write_xml_dependency_files()
Macro that outputs XML dependency files, if requested, based on the global project package dependency graph previously constructed.
In: ci_support/TribitsWriteXmlDependenciesFiles.cmake:21
Usage:
tribits_write_xml_dependency_files_if_supported()
Function that writes XML dependency files if support for that exists in this instance of TriBITS.
See Function call tree for constructing package dependency graph
In: core/package_arch/TribitsReadAllProjectDepsFilesCreateDepsGraph.cmake:221
Martin, Robert. Agile Software Development (Principles, Patterns, and Practices). Prentice Hall. 2003.
Bartlett, Roscoe. Integration Strategies for Computational Science & Engineering Software. 2009-0655, Second International Workshop on Software Engineering for Computational Science and Engineering, 2009. https://bartlettroscoe.github.io/publications/CSE_SoftwareIntegration_Strategies.pdf.
SCALE: A Comprehensive Modeling and Simulation Suite for Nuclear Safety Analysis and Design, ORNL/TM-2005/39, Version 6.1, Oak Ridge National Laboratory, Oak Ridge, Tennessee, June 2011. Available from Radiation Safety Information Computational Center at Oak Ridge National Laboratory as CCC-785. http://scale.ornl.gov/
Scott, Craig. Professional CMake: A Practical Guide (5th Edition). ISBN 978-1-925904-03-1. 2019. https://crascit.com/
Q: Why doesn't TriBITS just use the standard CMake Find<PACKAGE_NAME>.cmake modules and the standard find_package() function to find TPLs?
A: The different "standard" CMake Find<PACKAGE_NAME>.cmake modules do not have a standard set of outputs and therefore can't be handled in a uniform way. For example,
TriBITS removes a lot of the boilerplate code needed to write a CMake project. As a result, many people can come into a project that uses TriBITS and quickly start to contribute by adding new source files, adding new libraries, adding new tests, and even adding new TriBITS packages and external packages/TPLs; all without really having learned anything about CMake. Often one can use existing example CMake code as a guide and be successful using basic functionality. As long as nothing out of the ordinary happens, many people can get along just fine in this mode for a time.
However, we have observed that most mistakes and problems that people run into when using TriBITS are due to lack of basic knowledge of the CMake language. One can find basic tutorials and references on the CMake language in various locations online for free. One can also purchase the official CMake reference book. Also, documentation for any built-in CMake command is available locally by running:
$ cmake --help-command <CMAKE_COMMAND>
Because tutorials and detailed documentation for the CMake language already exists, this document does not attempt to provide a first reference to CMake (which is a large topic in itself). However, what we try to provide below is a short overview of the more quirky or surprising aspects of the CMake language that a programmer experienced in another language might get tripped up or surprised by. Some of the more unique features of the language are described in order to help avoid some of these common mistakes and provide greater understanding of how TriBITS works.
The CMake language is used to write CMake projects with TriBITS. In fact, the core TriBITS functionality itself is implemented in the CMake language (see TriBITS System Project Dependencies). CMake is a fairly simple programming language with relatively simple rules (for the most part). However, compared to other programming languages, there are a few peculiar aspects to the CMake language that can make working with it difficult if you don't understand these rules. For example, there are unexpected variable scoping rules, and the way arguments are passed to macros and functions can be tricky. Also, CMake has some interesting gotchas. In order to effectively use TriBITS (or just raw CMake) to construct and maintain a project's CMake files, one must know the basic rules of CMake and be aware of these gotchas.
The first thing to understand about the CMake language is that nearly every line of CMake code is just a call to a command that takes strings (or arrays of strings) as arguments and operates on strings. An array argument is just a single string literal with elements separated by semi-colons "<str0>;<str1>;...". CMake is a bit odd in how it deals with these arrays, which are just represented as strings with elements separated by semi-colons ';'. For example, all of the following are equivalent and pass in a CMake array with 3 elements [A], [B], and [C]:
some_func(A B C)
some_func("A" "B" "C")
some_func("A;B;C")
However, the above is not the same as:
some_func("A B C")
which just passes in a single element with value [A B C]. Raw quotes in CMake basically escape the interpretation of space characters as array element boundaries. Quotes around arguments with no spaces do nothing (as seen above, except for the interpretation as variable names in an if() statement). In order to get a quote char ["] into a string, you must escape it as:
some_func(\"A\")
which passes an array with the single argument [\"A\"].
Variables are set using the built-in CMake set() command that just takes string arguments like:
set(SOME_VARIABLE "some_value")
In CMake, the above is identical, in every way, to:
set(SOME_VARIABLE some_value)
set("SOME_VARIABLE";"some_value")
set("SOME_VARIABLE;some_value")
The function set() simply interprets the first argument as the name of a variable to set in the local scope. Many other built-in and user-defined CMake functions work the same way. That is, some of the string arguments are interpreted as the names of variables. There is no special language feature that interprets them as variables (except in an if() statement).
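For example, a user-defined function can adopt the same convention by de-referencing the passed-in name (a minimal sketch; the function and variable names here are made up):

function(my_set_to_hello outputVarName)
  # The string argument is interpreted as the *name* of a variable to set
  # in the caller's scope.
  set(${outputVarName} "hello" PARENT_SCOPE)
endfunction()

my_set_to_hello(GREETING)
message("GREETING='${GREETING}'")   # Prints: GREETING='hello'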
However, CMake appears to parse arguments differently for built-in CMake control structure functions like foreach() and if() and does not just interpret them as a string array. For example:
foreach (SOME_VAR "a;b;c")
  message("SOME_VAR='${SOME_VAR}'")
endforeach()
prints SOME_VAR='a;b;c' instead of printing SOME_VAR='a' followed by SOME_VAR='b', etc., as you would otherwise expect. Therefore, this simple rule for the handling of function arguments as string arrays does not hold for CMake logic control commands. Just follow the CMake documentation for these control structures (i.e. see cmake --help-command if and cmake --help-command foreach).
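To iterate element-by-element over a list variable, one can instead use the IN LISTS form of foreach(), as in this small sketch:

set(SOME_LIST "a;b;c")
foreach (SOME_VAR IN LISTS SOME_LIST)
  message("SOME_VAR='${SOME_VAR}'")   # Prints 'a', then 'b', then 'c'
endforeach()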
CMake offers a rich assortment of built-in commands for doing all sorts of things. Two of these are the built-in macro() and function() commands, which allow you to create user-defined macros and functions. TriBITS is actually built on CMake functions and macros. All of the built-in and user-defined macros, and some functions, take an array of string arguments. Some functions take positional arguments; in fact, most functions take a combination of positional and keyword arguments.
Variable names are translated into their stored values using ${SOME_VARIABLE}. The value that is extracted depends on whether the variable is set in the local or global (cache) scope. The local scopes for CMake start in the base project directory in its base CMakeLists.txt file. Any variables that are created by macros in that base local scope are seen across an entire project but are not persistent across multiple successive cmake configure invocations, even when the cache file CMakeCache.txt is not deleted in between.
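As a small sketch of the difference (the variable names are made up), compare a regular local variable to a cache variable:

# Local variable: visible in this scope and below; not saved in CMakeCache.txt.
set(MY_LOCAL_VAR "abc")
# Cache variable: global; stored in CMakeCache.txt and persistent across
# reconfigures (an existing cache value is not overwritten by this call).
set(MY_CACHE_VAR "abc" CACHE STRING "An example cache variable")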
The handling of variables is one area where CMake is radically different from most other languages. First, a variable that is not defined simply returns nothing. What is surprising to most people about this is that it does not even return an empty string that would register as an array element! For example, the following set statement:
set(SOME_VAR a ${SOME_UNDEFINED_VAR} c)
(where SOME_UNDEFINED_VAR is an undefined variable) produces SOME_VAR='a;c' and not 'a;;c'! The same thing occurs when an empty variable is de-referenced, such as with:
set(EMPTY_VAR "")
set(SOME_VAR a ${EMPTY_VAR} c)
which produces SOME_VAR='a;c' and not 'a;;c'. In order to always produce an element in the array even if the variable is empty, one must quote the argument as with:
set(EMPTY_VAR "")
set(SOME_VAR a "${EMPTY_VAR}" c)
which produces SOME_VAR='a;;c', or three elements as one might assume.
This is a common error that people make when they call CMake functions (built-in or TriBITS-defined) involving variables that might be undefined or empty. For example, for the macro:
macro(some_macro A_ARG B_ARG C_ARG)
  ...
endmacro()
if someone tries to call it with (misspelled variable?):
some_macro(a ${SOME_OHTER_VAR} c)
and if SOME_OHTER_VAR="" or if it is undefined, then CMake will error out with the error message saying that the macro some_macro() takes 3 arguments but only 2 were provided. If a variable might be empty but that is still a valid argument to the command, then it must be quoted as:
some_macro(a "${SOME_OHTER_VAR}" c)
Related to this problem is that if you misspell the name of a variable in a CMake if() statement like:
if (SOME_VARBLE)
  ...
endif()
then it will always be false and the code inside the if statement will never be executed! To avoid this problem, use the utility function assert_defined() as:
assert_defined(SOME_VARBLE)
if (SOME_VARBLE)
  ...
endif()
In this case, the misspelled variable would be caught.
While on the subject of if() statements, CMake has a strange convention. When you say:
if (SOME_VAR)
  do_something()
endif()
then SOME_VAR is interpreted as a variable and will be considered true and do_something() will be called if ${SOME_VAR} does not evaluate to 0, OFF, NO, FALSE, N, IGNORE, "", or ends in the suffix -NOTFOUND. How about that for a true/false rule! To be safe, use ON/OFF and TRUE/FALSE pairs for setting variables. Look up native CMake documentation on if() for all the interesting details and all the magical things it can do.
WARNING: If you mistype "ON" as "NO", it evaluates to FALSE/OFF! (That is a fun defect to track down!)
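A few concrete cases of this rule (an illustrative sketch only):

set(VAR_A ON)
set(VAR_B NO)                    # Oops: 'NO' evaluates to false (not 'ON'!)
set(VAR_C "SomeLib-NOTFOUND")
if (VAR_A)
  message("VAR_A is true")       # Printed
endif()
if (NOT VAR_B)
  message("VAR_B is false")      # Printed
endif()
if (NOT VAR_C)
  message("VAR_C is false")      # Printed: values ending in -NOTFOUND are false
endif()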
CMake language behavior with respect to case sensitivity is also strange:
- Calls of built-in and user-defined macros and functions are case insensitive (e.g. set(...), SET(...), and Set(...) all invoke the same command).
- However, the names of CMake variables (local or cache/global) are case sensitive (e.g. SOME_VAR and some_var are different variables).
I don't know of any other programming language that uses different case sensitivity rules for variables and functions. However, because we must parse macro and function arguments when writing user-defined macros and functions, it is a good thing that CMake variables are case sensitive. Case insensitivity would make it much harder and more expensive to parse argument lists that take keyword-based arguments.
Other mistakes that people make result from not understanding how CMake scopes variables and other entities. CMake defines a global scope (i.e. "cache" variables) and several nested local scopes that are created by add_subdirectory() and entering functions. See dual_scope_set() for a short discussion of these scoping rules. And it is not just variables that can have local and global scoping rules. Other entities, like defines set with the built-in command add_definitions() only apply to the local scope and child scopes. That means that if you call add_definitions() to set a define that affects the meaning of a header-file in C or C++, for example, that definition will not carry over to a peer subdirectory and those definitions will not be set (see warning in Miscellaneous Notes (tribits_add_library())).
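The sketch below shows the function-scope behavior (the names are made up); by default, a set() inside a function only affects the function's own local scope unless PARENT_SCOPE is used:

set(SOME_VAR "outer")

function(modify_some_var)
  set(SOME_VAR "inner")                        # Only changes this function's copy
  message("inside:  SOME_VAR='${SOME_VAR}'")   # Prints 'inner'
endfunction()

modify_some_var()
message("outside: SOME_VAR='${SOME_VAR}'")     # Still prints 'outer'

# To push a value up to the caller's scope, the function would instead use:
#   set(SOME_VAR "inner" PARENT_SCOPE)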
TriBITS started development in November 2007 as a set of helper macros to provide a CMake build system for a small subset of packages in Trilinos. The initial goal was to support a native Windows build (using Visual C++) to compile and install these few Trilinos packages on Windows for usage by another project (the Sandia Titan project which included VTK). At that time, Trilinos was using a highly customized and augmented autotools build system. Initially, this CMake system was just a set of macros to streamline creating executables and tests. Some of the conventions started in that early effort (e.g. naming conventions of variables and macros where functions use upper case like old FORTRAN and variables are mixed case) were continued in later efforts and are reflected in the current implementation.

Then, starting in early 2008, a more detailed evaluation was performed to see if Trilinos should switch over to CMake as the default (and soon only) supported build and test system (see "Why CMake?" in TriBITS Overview). This led to the initial implementation of a scalable package-based architecture (PackageArch) for the Trilinos CMake project in late 2008. This Trilinos CMake PackageArch system evolved over the next few years with development in the system slowing into 2010. This Trilinos CMake build system was then adopted as the build infrastructure for the CASL VERA effort in 2011 where CASL VERA packages were treated as add-on Trilinos packages (see Section Multi-Repository Support). Over the next year, there was significant development of the system to support larger multi-repo projects in support of CASL VERA. That led to the decision to formally generalize the Trilinos CMake PackageArch build system outside of Trilinos, and the name TriBITS was formally adopted in November 2011.

Work to refactor the Trilinos CMake system into a general reusable stand-alone CMake-based build system started in October 2011, and an initial implementation was complete in December 2011 when it was used for the CASL VERA build system. In early 2012, the ORNL CASL-related projects Denovo and SCALE (see [SCALE, 2011]) adopted TriBITS as their native development build systems. Shortly after, TriBITS was adopted as the native build system for the CASL-related University of Michigan code MPACT. In addition to being used in CASL, all of these codes also had a significant life outside of CASL. Because they used the same TriBITS build system, it proved relatively easy to keep these various codes integrated together in the CASL VERA code meta-build. At the same time, TriBITS well served the independent development teams and non-CASL projects independent from CASL VERA. Since the initial extraction of TriBITS from Trilinos, the TriBITS system was further extended and refined, driven by CASL VERA development and expansion. Independently, an early version of TriBITS from 2012 was adopted by the LiveV project (see [LiveV]), which was forked and extended independently.
Note that a TriBITS Package is not the same thing as a "Package" in raw CMake terminology. In raw CMake, a "Package" is some externally provided bit of software or other utility for which the current CMake project has an optional or required dependency (see CMake: How to Find Libraries). Therefore, a raw CMake "Package" actually maps to a TriBITS TPL. A raw CMake "Package" (e.g. Boost, CUDA, etc.) can be found using a standard CMake find module Find<rawPackageName>.cmake using the built-in CMake command find_package(<rawPackageName>). It is unfortunate that the TriBITS and the raw CMake definitions of the term "Package" are not exactly the same. However, the term "Package" was coined by the Trilinos project long before CMake was adopted as the Trilinos build system; Trilinos' definition of "Package" (going back to 1998) pre-dates the development of CMake (see History of CMake), and therefore Trilinos dictated the terminology of TriBITS and the definition of the term "Package" in the TriBITS system. Note, however, that both meanings of the term "Package" are consistent with the more general software engineering definition of a "Package" according to Software Engineering Packaging Principles.
Some of the basic requirements and design goals for TriBITS are outlined in the TriBITS Overview document.
As stated in TriBITS Dependency Handling Behaviors, No circular dependencies of any kind are allowed. That is, no TriBITS package (or its tests) can declare a dependency on a downstream package, period! To some, this might seem over-constraining, but adding support for circular dependencies to the TriBITS system would add significant complexity and space/time overhead and is a bad idea from a basic software engineering perspective (see the ADP (Acyclic Dependencies Principle) in Software Engineering Packaging Principles). From a versioning, building, and change-propagation perspective, any packages involved in a circular dependency would need to be treated as a single software engineering package anyway, so TriBITS forces development teams to glob all of this together into a single TriBITS Package when cycles in the software exist. There are numerous wonderful ways to break circular dependencies between packages that are proven and well established in the SE community (for example, see [Agile Software Development, 2003]).
Below is a snapshot of the output from clone_extra_repos.py --help. For more details on the usage of clone_extra_repos.py, see Multi-Repository Support and Multi-Repository Development Workflow.
Usage: clone_extra_repos.py [options] This script clones one more extra repos listed in a TriBITS ExtraRepositoriesList.cmake file. The standard usage is: $ cd base <projectDir> $ ./cmake/tribits/ci_support/clone-extra-repos.py where <projectDir> is the base TriBITS project dir and base git repo. By default, this will clone all the 'Nightly' extra repos that are listed in the file: <projectDir>/cmake/ExtraRepositoriesList.cmake (other repo types can be selected using --extra-repos-type). The list of which repos to clone can be "white-list" selected with the option --extra-repos (see options below for details). Extra repos can in addition be "back-listed" using the option --not-extra-repos. To see the full list of repos that can be cloned, pass in just: --skip-clone --verbosity=more That will print out a table like: ------------------------------------------------------------------------------ | ID | Repo Name | Repo Dir | VC | Repo URL | Category | |----|------------|------------|-----|--------------------------|------------| | 1 | ExtraRepo1 | ExtraRepo1 | GIT | someurl.com.ExtraRepo1 | Continuous | | 2 | ExtraRepo3 | ExtraRepo3 | GIT | someurl3.com:/ExtraRepo3 | Continuous | ------------------------------------------------------------------------------ If the git repo server is using gitolite, one can set --gitolite-root=<gitolite-root> and that will result in git repos being selected only if the selected repos are listed in 'ssh <gitolite-root> info'. This allows one to automatically exclude repos from being cloned that the user has no permissions to clone. NOTE: See warning about the --gitolite-root option below! TIP: After cloning the set of repos, a nice way to interact with the repos is to use the tool 'gitdist'. If your project does not have a version controlled .gitdist.default file, you can generate one using the --create-gitdist-file=<gitdist-file> argument, for example with: --create-gitdist-file=.gitdist This will restrict the list of repos processed by gitdist to just the repos cloned. Options: -h, --help show this help message and exit --extra-repos=EXTRAREPOS List of names of extra repos to be cloned <extra- repos> (i.e. "repo0,repo1,,..."). When set to empty '' (the default value) then all repos that match <extra-repos-type> listed in <extra-repos-file> will be selected. But the repos listed in <extra-repos> must always be a subset of the repos of type <extra- repos-type> selected from <extra-repos-file>. (Default '') --not-extra-repos=NOTEXTRAREPOS List of names of extra repos *NOT* to clone (i.e. "repo0,repo1,..."). (Default '') --extra-repos-file=EXTRAREPOSFILE The file path <extra-repos-file> for the ExtraRepositoriesList.cmake file. This can be an absolute or relative path. (Default = 'cmake/ExtraRepositoriesList.cmake') --extra-repos-type=EXTRAREPOSTYPE Type of extra repositories <extra-repos-type> to select from <extra-repos-file>. When --extra-repos is set, then this argument is ignored. Choices = ('Continuous', 'Nightly', 'Experimental'). [default = 'Nightly'] --gitolite-root=GITOLITEROOT Gives the root for a gitolite repos <gitolite-root> (e.g. git@<some-url>). If specified, then any git repos with the <gitolite-root> listed as their root will only be selected if they are listed with 'R' permissions returned from 'ssh <gitolite-root> info'. WARNING: Make sure that you have your gitoliote SSH registered correctly before using this option by typing the command 'ssh <gitlite-root> info' and make sure that it does *not* ask for a password! 
(Default = '') --with-cmake=WITHCMAKE CMake executable to use with cmake -P scripts internally (only set by unit testing code). (Default = 'cmake') --verbosity=VERBLEVEL Verbosity of the script (levels are cumulative): none = no output at all (except for commands with --no-op). minimal = print script args echo and clone commands. more = print basic repo include/exclude logic and print repo table. most = print output from cmake script called, the output from gitolite, and other detailed info. Choices = ('none', 'minimal', 'more', 'most'). [default = 'more'] --do-clone Do the clone of the selected repos. [default] --skip-clone Skip the clone of the repos and just show what would be done. --do-op Do the clone of the selected repos. [default] --no-op Skip cloning the repos and just show the clone commands. --create-gitdist-file=CREATEGITDISTFILE If specified, the file <gitdist-file> will get generated with the list of git repos (the same list that is cloned with --do-clone). (Default = '') --show-defaults Show the default option values and do nothing at all.
The sections below show snapshots of the output from the gitdist tool from gitdist --help and gitdist --dist-help=<topic>:
For more details on the usage of gitdist, see Multi-Repository Support and Multi-Repository Development Workflow.
Usage: gitdist [gitdist arguments] <raw-git-command> [git arguments] gitdist [gitdist arguments] dist-repo-status gitdist [gitdist arguments] dist-repo-versions-table Run git over a set of git repos in a multi-repository git project (see --dist-help=overview --help). This script also includes other tools like printing a compact repo status table (see --dist-help=dist-repo-status) and tracking compatible versions through multi-repository SHA1 version files (see --dist-help=repo-versions). The options in [gitdist options] are prefixed with '--dist-' and are pulled out before running 'git <raw-git-command> [git arguments]' in each local git repo that is processed (see --dist-help=repo-selection-and-setup). Options: -h, --help show this help message and exit --dist-help=HELPTOPIC Print a gitdist help topic. Using --dist-help=all prints all help topics. If --help is also specified, then the help usage header and command-line 'options' are also printed. Choices = ('', 'overview', 'repo- selection-and-setup', 'dist-repo-status', 'repo- versions', 'dist-repo-versions-table', 'aliases', 'default-branch', 'move-to-base-dir', 'usage-tips', 'script-dependencies', 'all'). [default = ''] --dist-use-git=USEGIT Path to the git executable to use for each git repo command. By default, gitdist will use 'git' in the environment. If it can't find 'git' in the environment, then it will require setting --dist-use- git=<path-to-git>. (Typically only used in automated testing.) (default='git') --dist-repos=REPOS Comma-separated list of repo relative paths '<repo0>,<repo1>,...'. The base repo is specified with '.' and should usually be listed first. If left empty '', then the list of repos to process is taken from the file ./.gitdist (which lists the relative path of each git repo separated by newlines). If the file ./.gitdist does not exist, then the repos listed in the file ./.gitdist.default are processed. If the file the file ./.gitdist.default is missing, then no extra repos are processed and it is assumed that the base repo will be processed. Also, any git repos listed that don't exist are ignored. See --dist- help=repo-selection-and-setup. (default='') --dist-not-repos=NOTREPOS Comma-separated list of extra repo relative paths '<repoX>,<repoY>,...' to *not* process. (default='') --dist-mod-only If set, then only git repos that have changes w.r.t. their tracking branches will be processed. That is, only repos that have modified or untracked files or where 'git diff --name-only ^<tracking-branch>' returns non-empty output will be processed (where <tracking-branch> is returned from 'rev-parse --abbrev-ref --symbolic-full-name @{u})'. If a local repo does not have a tracking branch, then the repo will be skipped as well. Therefore, be careful to first run 'gitdist-status' (see --dist-help=dist-repo- status) to see the status of each local git repo to know which repos don't have tracking branches. --dist-legend If set, then a legend will be printed below the repo summary table for the special dist-repo-status command. Only applicable with dist-repo-status (see --dist-help=dist-repo-status). --dist-utf8-output If set, use UTF-8 box drawing characters instead of ASCII ones when creating the repo summary table. --dist-version-file=VERSIONFILE Path to a file which contains a list of extra repo relative directories and git versions (replaces _VERSION_). (See --dist-help=repo-versions.) 
(default='') --dist-version-file2=VERSIONFILE2 Path to a second file contains a list of extra repo relative directories and git versions (replaces _VERSION2_). (See --dist-help=repo-versions.) (default='') --dist-no-color If set, don't use color in the output for gitdist and set '-c color.status=never' before the git command (like 'status'). NOTE: user should also pass in --color=never for git commands accept that argument. (Better for output to a file). --dist-debug If set, then debugging info is printed. --dist-no-opt If set, then no git commands will be run but instead will just be printed. --dist-short If set, then the repo versions table will only include the Repo Dir and SHA1 columns; Commit Date, Author, and Summary will be omitted.
OVERVIEW:

Running:

  $ gitdist [gitdist options] <raw-git-command> [git arguments]

will distribute git commands specified by '<raw-git-command> [git arguments]'
across the current base git repo and the set of git repos listed in the file
./.gitdist (or the file ./.gitdist.default, or the argument
--dist-repos=<repo0>,<repo1>,..., see --dist-help=repo-selection-and-setup).

For example, consider the following base git repo 'BaseRepo' with three other
"extra" git repos cloned under it:

  BaseRepo/
    .git/
    .gitdist
    ExtraRepo1/
      .git/
      ExtraRepo2/
        .git/
    ExtraRepo3/
      .git/

The file .gitdist shown above is created by the user and in this example
should have the contents (note the base repo entry '.'):

  .
  ExtraRepo1
  ExtraRepo1/ExtraRepo2
  ExtraRepo3

For this example, running the command:

  $ cd BaseRepo/
  $ gitdist status

results in the following commands:

  $ git status
  $ cd ExtraRepo1/ ; git status ; cd ..
  $ cd ExtraRepo1/ExtraRepo2/ ; git status ; cd ../..
  $ cd ExtraRepo3/ ; git status ; cd ..

which produces output like:

  *** Base Git Repo: BaseRepo
  On branch master
  Your branch is up-to-date with 'origin/master'.
  nothing to commit, working directory clean

  *** Git Repo: ExtraRepo1
  On branch master
  Your branch is up-to-date with 'origin/master'.
  nothing to commit, working directory clean

  *** Git Repo: ExtraRepo1/ExtraRepo2
  On branch master
  Your branch is up-to-date with 'origin/master'.
  nothing to commit, working directory clean

  *** Git Repo: ExtraRepo3
  On branch master
  Your branch is up-to-date with 'origin/master'.
  nothing to commit, working directory clean

The gitdist tool allows managing a set of git repos like one big integrated
git repo.  For example, after cloning a set of git repos, one can perform
basic operations, just like for a single git repo, such as creating a new
release branch and pushing it with:

  $ gitdist checkout master
  $ gitdist pull
  $ gitdist tag -a -m "Start of the 2.3 release" release-2.3-start
  $ gitdist checkout -b release-2.3 release-2.3-start
  $ gitdist push origin release-2.3-start
  $ gitdist push origin -u release-2.3
  $ gitdist checkout master

The above gitdist commands create the same tag 'release-2.3-start' and the
same branch 'release-2.3' in all of the local git repos and push these to the
remote 'origin' for each git repo.

For more information about a certain topic, use '--dist-help=<topic-name>
[--help]' for <topic-name>:

  * 'overview'
  * 'repo-selection-and-setup'
  * 'dist-repo-status'
  * 'repo-versions'
  * 'dist-repo-versions-table'
  * 'aliases'
  * 'default-branch'
  * 'move-to-base-dir'
  * 'usage-tips'
  * 'script-dependencies'

To see full help with all topics, use '--dist-help=all [--help]'.

This script is self-contained and has no dependencies other than standard
python 2.6+ packages so it can be copied anywhere and used.

REPO SELECTION AND SETUP:

Before using the gitdist tool, one must first add the gitdist script to one's
default path.  On bash, the simplest way to do this is to source the
gitdist-setup.sh script:

  $ source <some-base-dir>/TriBITS/tribits/python_utils/gitdist-setup.sh

This will set an alias to the gitdist script in that same directory by
default, will set up the useful aliases 'gitdist-status', 'gitdist-mod',
'gitdist-mod-status', and 'gitdist-repo-versions', and will set up
command-line completion just like for raw git (assuming that
git-completion.bash has been sourced first).

The files 'gitdist' and 'gitdist-setup.sh' can also be copied to another
directory (e.g. ~/bin) and then 'gitdist-setup.sh' can be sourced from there
(as a simple "install"):

  $ cp <some-base-dir>/TriBITS/tribits/python_utils/gitdist \
    <some-base-dir>/TriBITS/tribits/python_utils/gitdist-setup.sh \
    ~/bin/
  $ source ~/bin/gitdist-setup.sh
  $ export PATH=$HOME/bin:$PATH

This script can also be set up manually, for example, by copying the gitdist
script to one's ~/bin/ directory:

  $ cp <some-base-dir>/TriBITS/tribits/python_utils/gitdist ~/bin/
  $ chmod a+x ~/bin/gitdist

and then adding $HOME/bin to one's 'PATH' env var with:

  $ export PATH=$HOME/bin:$PATH

(i.e. in one's ~/.bash_profile file).  Then, one will want to set up some
useful shell aliases like 'gitdist-status', 'gitdist-mod',
'gitdist-mod-status', and 'gitdist-repo-versions' (see --dist-help=aliases).

The set of git repos processed by gitdist is determined by the argument:

  --dist-repos=<repo0>,<repo1>,...

or the files .gitdist or .gitdist.default.  If --dist-repos="", then the list
of repos to process will be read from the file '.gitdist' in the current
directory.  If the file '.gitdist' does not exist, then the list of repos to
process will be read from the file '.gitdist.default' in the current
directory.  The format of the files '.gitdist' and '.gitdist.default' is to
have one repo relative directory per line, for example:

  $ cat .gitdist
  .
  ExtraRepo1
  ExtraRepo1/ExtraRepo2
  ExtraRepo3

where each line is the relative path under the base git repo (i.e. under
'BaseRepo/').  The file .gitdist.default is meant to be committed to the base
git repo (i.e. 'BaseRepo') so that gitdist is ready to use right away after
the base repo and the extra repos are cloned.

If an extra repository directory (i.e. listed in
--dist-repos=<repo0>,<repo1>,..., .gitdist, or .gitdist.default) does not
exist, then it will be ignored by the script.  Therefore, be careful to
manually verify that the script recognizes the repositories that you list.
The best way to do that is to run 'gitdist-status' and see which repos are
listed.

Certain git repos can also be selectively excluded using the option
'--dist-not-repos=<repox>,<repoy>,...'.

Setting up to use gitdist on a specific set of local git repos first requires
cloning and organizing the local git repos.  For the example listed here, one
would clone the base repo 'BaseRepo' and the three extra git repos, set up a
.gitdist file, and then add ignores for the extra cloned repos like:

  # A) Clone and organize the git repos
  $ git clone git@some.url:BaseRepo.git
  $ cd BaseRepo/
  $ git clone git@some.url:ExtraRepo1.git
  $ cd ExtraRepo1/
  $ git clone git@some.url:ExtraRepo2.git
  $ cd ..
  $ git clone git@some.url:ExtraRepo3.git

  # B) Create .gitdist
  $ echo . > .gitdist
  $ echo ExtraRepo1 >> .gitdist
  $ echo ExtraRepo1/ExtraRepo2 >> .gitdist
  $ echo ExtraRepo3 >> .gitdist

  # C) Add ignores in base repo
  $ echo /ExtraRepo1/ >> .git/info/exclude
  $ echo /ExtraRepo3/ >> .git/info/exclude

  # D) Add ignore in nested extra repo
  $ echo /ExtraRepo2/ >> ExtraRepo1/.git/info/exclude

(Note that one may instead add the above ignores to the version-controlled
files BaseRepo/.gitignore and ExtraRepo1/.gitignore.)

This produces the local repo structure:

  BaseRepo/
    .git/
    .gitdist
    ExtraRepo1/
      .git/
      ExtraRepo2/
        .git/
    ExtraRepo3/
      .git/

After this setup, running:

  $ gitdist <raw-git-command> [git arguments]

in the 'BaseRepo/' directory will automatically distribute a given command
across the base repo 'BaseRepo/' and the extra repos ExtraRepo1/,
ExtraRepo1/ExtraRepo2/, and ExtraRepo3/, in that order.
To simplify the setup for the usage of gitdist for a given set of local git
repos, one may choose to instead create the file .gitdist.default in the base
repo (i.e. 'BaseRepo/') and add the ignores for the extra repos to the
.gitignore files and commit the files to the repo(s).  That way, one does not
have to manually do any extra setup for every new set of local clones of the
repos.  But if the file .gitdist is present, then it will override the file
.gitdist.default as described above (which allows customization of what git
repos are processed at any time).

SUMMARY OF REPO STATUS:

The script gitdist also supports the special command 'dist-repo-status' which
prints a compact table showing the current status of all the repos (see alias
'gitdist-status' in --dist-help=aliases).  For the example set of repos shown
in OVERVIEW (see --dist-help=overview), running:

  $ gitdist dist-repo-status   # alias 'gitdist-status'

outputs a table like:

  ----------------------------------------------------------------------
  | ID | Repo Dir              | Branch | Tracking Branch | C | M  | ? |
  |----|-----------------------|--------|-----------------|---|----|---|
  |  0 | BaseRepo (Base)       | dummy  |                 |   |    |   |
  |  1 | ExtraRepo1            | master | origin/master   | 1 | 2  |   |
  |  2 | ExtraRepo1/ExtraRepo2 | abc123 |                 |   | 25 | 4 |
  |  3 | ExtraRepo3            | master | origin/master   |   |    |   |
  ----------------------------------------------------------------------

If the option --dist-legend is also passed in, the output will include:

  Legend:
  * ID: Repository ID, zero based (order git commands are run)
  * Repo Dir: Relative to base repo (base repo shown first with '(Base)')
  * Branch: Current branch, or (if detached HEAD) tag name or SHA1
  * Tracking Branch: Tracking branch (or empty if no tracking branch exists)
  * C: Number of local commits w.r.t. tracking branch (empty if zero or no TB)
  * M: Number of tracked modified (uncommitted) files (empty if zero)
  * ?: Number of untracked, non-ignored files (empty if zero)

In the case of a detached head state, as shown above with the repo
'ExtraRepo1/ExtraRepo2', the SHA1 (e.g. 'abc123') is printed instead of
'HEAD'.  However, if the repo is in the detached head state but a tag happens
to point to the current commit (e.g. 'git tag --points-at' returns
non-empty), then the tag name (e.g. 'v1.2.3') is printed instead of the SHA1
of the commit.

One can also show the status of only changed repos with the command:

  $ gitdist dist-repo-status --dist-mod-only   # alias 'gitdist-mod-status'

which produces a table like:

  ----------------------------------------------------------------------
  | ID | Repo Dir              | Branch | Tracking Branch | C | M  | ? |
  |----|-----------------------|--------|-----------------|---|----|---|
  |  1 | ExtraRepo1            | master | origin/master   | 1 | 2  |   |
  |  2 | ExtraRepo1/ExtraRepo2 | abc123 |                 |   | 25 | 4 |
  ----------------------------------------------------------------------

(see the alias 'gitdist-mod-status' in --dist-help=aliases).  Note that rows
for the repos BaseRepo and ExtraRepo3 were left out but the repo indexes for
the remaining repos are preserved.  This allows one to compactly show the
status of the changed local repos even when there are many local git repos by
filtering out rows for repos that have no changes w.r.t. their tracking
branches.  This allows one to get the status on a few repos with changes out
of a large number of local repos (i.e. 10s and even 100s of local git repos).
REPO VERSION FILES:

The script gitdist also supports the options
--dist-version-file=<versionFile> and --dist-version-file2=<versionFile2>
which are used to provide different SHA1 versions for each local git repo.
Each of these version files is expected to represent a compatible set of
versions of the repos (e.g. in the same style as .gitmodules files used by
the 'git submodule' command).

The format of these repo version files is shown in the following example:

  *** Base Git Repo: BaseRepo
  e102e27 [Mon Sep 23 11:34:59 2013 -0400] <author0@someurl.com>
  First summary message
  *** Git Repo: ExtraRepo1
  b894b9c [Fri Aug 30 09:55:07 2013 -0400] <author1@someurl.com>
  Second summary message
  *** Git Repo: ExtraRepo1/ExtraRepo2
  97cf1ac [Thu Dec 1 23:34:06 2011 -0500] <author2@someurl.com>
  Third summary message
  *** Git Repo: ExtraRepo3
  6facf33 [Fri May 6 15:28:35 2013 -0400] <author3@someurl.com>
  Fourth summary message

Each repository entry can have a summary message or not (i.e. use two or
three lines per repo in the file).  A compatible repo version file can be
generated with this script listing three lines per repo (e.g. as shown above)
using (for example):

  $ gitdist --dist-no-color log --color=never -1 --pretty=format:"%h [%ad] <%ae>%n%s" \
    | grep -v "^$" &> RepoVersion.txt

(which is defined as the alias 'gitdist-repo-versions' in the file
'gitdist-setup.sh') or two lines per repo using (for example):

  $ gitdist --dist-no-color log --color=never -1 --pretty=format:"%h [%ad] <%ae>" \
    | grep -v "^$" &> RepoVersion.txt

This allows checking out consistent versions of the set of git repos, diffing
two consistent versions of the set of git repos, etc.

To check out an older set of consistent versions of the set of repos
represented by the set of versions given in a file RepoVersion.<date>.txt,
use:

  $ gitdist fetch origin
  $ gitdist --dist-version-file=RepoVersion.<date>.txt checkout _VERSION_

The string '_VERSION_' is replaced with the SHA1 for each of the repos listed
in the file 'RepoVersion.<date>.txt'.  (NOTE: this puts the repos into a
detached head state so one has to know what that means.)

To tag a set of repos using a consistent set of versions, use (for example):

  $ gitdist --dist-version-file=RepoVersion.<date>.txt \
    tag -a -m "<message>" <some_tag> _VERSION_

To create a branch off of a consistent set of versions, use (for example):

  $ gitdist --dist-version-file=RepoVersion.<date>.txt \
    checkout -b some-branch _VERSION_

To diff two sets of versions of the repos, use (for example):

  $ gitdist \
    --dist-version-file=RepoVersion.<new-date>.txt \
    --dist-version-file2=RepoVersion.<old-date>.txt \
    diff _VERSION_ ^_VERSION2_

Here, _VERSION_ is replaced by the SHA1s listed in the file
'RepoVersion.<new-date>.txt' and _VERSION2_ is replaced by the SHA1s listed
in 'RepoVersion.<old-date>.txt'.

One can construct any git command taking one or two different repo version
arguments (SHA1s) using this approach (which covers a huge number of
different git operations).

Note that the set of git repos listed in the 'RepoVersion.txt' file must be a
super-set of those processed by this script or an error will occur and the
script will abort (before running any git commands).  If there are additional
repos RepoX, RepoY, etc. not listed in the 'RepoVersion.txt' file, then one
can exclude them with:

  $ gitdist --dist-not-repos=RepoX,RepoY,... \
    --dist-version-file=RepoVersion.txt \
    <raw-git-command> [git arguments]

REPO VERSION TABLE:

The script gitdist also supports the special command
'dist-repo-versions-table', which prints a Markdown-formatted table of
repositories and corresponding commit information for easy inclusion in an
issue tracking system.  For instance, running:

  $ gitdist dist-repo-versions-table

outputs a table like:

  | Repository     | SHA1    | Commit Date         | Author                 | Summary                                        |
  |:-------------- |:-------:|:------------------- |:---------------------- |:---------------------------------------------- |
  | MockProjectDir | e2dc488 | 2019-10-23 10:16:07 | user@domain.com        | Merge Pull Request #1234 from user/repo/branch |
  | ExtraRepo1     | f671414 | 2019-10-22 11:18:47 | wile.e.coyote@acme.com | Fixed a Bug                                    |
  | ExtraRepo2     | 50bbf3e | 2019-10-17 16:32:15 | someone@somewhere.com  | Did Some Work                                  |

If the option --dist-short is also passed in, the output will be limited to:

  | Repository     | SHA1    |
  |:-------------- |:-------:|
  | MockProjectDir | e2dc488 |
  | ExtraRepo1     | f671414 |
  | ExtraRepo2     | 50bbf3e |

USEFUL ALIASES:

A few very useful (bash) shell aliases and setup commands to use with gitdist
include:

  $ alias gitdist-status="gitdist dist-repo-status"
  $ alias gitdist-mod="gitdist --dist-mod-only"
  $ alias gitdist-mod-status="gitdist dist-repo-status --dist-mod-only"
  $ alias gitdist-repo-versions="gitdist --dist-no-color log --color=never -1 \
    --pretty=format:\"%h [%ad] <%ae>%n%s\" | grep -v \"^$\""

These are added by sourcing the provided file 'gitdist-setup.sh' (which
should be sourced in your ~/.bash_profile file), which also adds some useful
command-line tab completions.  This avoids lots of extra typing as these
gitdist arguments are used a lot.

For example, to see the compact status table of all your local git repos, do:

  $ gitdist-status

To just see a compact status table of only changed repos, do:

  $ gitdist-mod-status

To process only repos that have changes and see commits in these repos
w.r.t. their tracking branches, do (for example):

  $ gitdist-mod log --name-status HEAD ^@{u}

or

  $ gitdist-mod local-stat

(where 'local-stat' is a useful git alias defined in the script
'git-config-alias.sh', which adds these to your ~/.gitconfig file).

DEFAULT BRANCH SPECIFICATION:

When using any git command that accepts a reference (a SHA1, or branch or tag
name), it is possible to use _DEFAULT_BRANCH_ instead.  For instance,

  gitdist checkout _DEFAULT_BRANCH_

will check out the default development branch in each repository being
managed by gitdist.  You can specify the default branch for each repository
in your .gitdist[.default] file.  For instance, if your .gitdist file
contains:

  . master
  extraRepo1 develop
  extraRepo2 app-devel

then the command above would check out 'master' in the base repo, 'develop'
in extraRepo1, and 'app-devel' in extraRepo2.  This makes it convenient when
working with multiple repositories that have different names for their main
development branches.  For instance, you can do a topic branch workflow like:

  gitdist checkout _DEFAULT_BRANCH_
  gitdist pull
  gitdist checkout -b newFeatureBranch
  <create some commits>
  gitdist fetch
  gitdist merge origin/_DEFAULT_BRANCH_
  <create some commits>
  gitdist checkout _DEFAULT_BRANCH_
  gitdist pull
  gitdist merge newFeatureBranch

and not worry about this 'newFeatureBranch' being off of 'master' in the root
repo, off of 'develop' in extraRepo1, and off of 'app-devel' in extraRepo2.
If no branch name is specified for any given repository in the
.gitdist[.default] file, then 'master' is assumed.
MOVE TO BASE DIRECTORY:

By default, when you run gitdist, it will look in your current working
directory for a .gitdist[.default] file.  If it fails to find one, it will
treat the current directory as the base git repository (as if there was a
.gitdist file in it, having a single line with only "." in it) and then run
as usual.  You have the ability to change this behavior by setting the
GITDIST_MOVE_TO_BASE_DIR environment variable.

To describe the behavior for the different options, consider the following
set of nested git repositories and directories:

  BaseRepo/
    .git
    .gitdist
    ...
    ExtraRepo/
      .git
      .gitdist
      ...
      path/
        ...
        to/
          ...
          some/
            ...
            directory/
              ...

The valid settings for GITDIST_MOVE_TO_BASE_DIR include:

  "" (Empty)

    This gives the default behavior where gitdist runs in the current working
    directory.

  IMMEDIATE_BASE

    In this case, gitdist will start moving up the directory tree until it
    finds a .gitdist[.default] file, and then run in the directory where it
    finds it.  In the above example, if you are in
    BaseRepo/ExtraRepo/path/to/some/directory/ when you run gitdist, it will
    move up to ExtraRepo to execute the command you give it from there.

  EXTREME_BASE

    In this case, gitdist will continue moving up the directory tree until it
    finds the outer-most repository containing a .gitdist[.default] file, and
    then run in that directory.  Given the directory tree above, if you were
    in BaseRepo/ExtraRepo/path/to/some/directory, it will move up to BaseRepo
    to execute the command you give it.

With either of the settings above, when gitdist is finished running, it will
leave you in the same directory you were in when you executed the command in
the first place.  Additionally, if no .gitdist[.default] file can be found,
gitdist will execute the command you give it in your current working
directory, as if GITDIST_MOVE_TO_BASE_DIR hadn't been set.

USAGE TIPS:

Since gitdist allows treating a set of git repos as one big git repo, almost
any git workflow that is used for a single git repo can be used for a set of
repos using gitdist.  The main difference is that one will typically need to
create commits individually for each repo.  Also, pulls and pushes are no
longer atomic like is guaranteed for a single git repo.

In general, the mapping between the commands for a single-repo git workflow
using raw git vs. a multi-repo git workflow using gitdist (using the shell
aliases 'gitdist-status', 'gitdist-mod-status', and 'gitdist-mod'; see
--dist-help=aliases) is given by:

  git pull                          => gitdist pull
  git checkout -b <branch> [<ref>]  => gitdist checkout -b <branch> [<ref>]
  git checkout <branch>             => gitdist checkout <branch>
  git tag -a -m "<message>" <tag>   => gitdist tag -a -m "<message>" <tag>
  git status                        => gitdist-mod status    # status details
                                    => gitdist-status        # table for all
                                    => gitdist-mod-status    # table for mod.
  git commit                        => gitdist-mod commit
  git log HEAD ^@{u}                => gitdist-mod log HEAD ^@{u}
  git push                          => gitdist-mod push
  git push [-u] <remote> <branch>   => gitdist push [-u] <remote> <branch>
  git push <remote> <tag>           => gitdist push <remote> <tag>

NOTE: The usage of 'gitdist-mod' can be replaced with just 'gitdist' in all
of the above commands.  It is just that in these cases gitdist-mod produces
more compact output and avoids do-nothing commands for repos that have no
changes with respect to their tracking branch.  But when in doubt, just use
raw 'gitdist'.
A typical development iteration of the centralized workflow using multiple
git repos looks like the following:

1) Update the local branches from the remote tracking branches:

  $ cd BaseRepo/
  $ gitdist pull

2) Make local modifications for each repo:

  $ emacs <base-files>
  $ cd ExtraRepo1/
  $ emacs <files-in-extra-repo1>
  $ cd ..
  $ cd ExtraRepo1/ExtraRepo2/
  $ emacs <files-in-extra-repo2>
  $ cd ../..
  $ cd ExtraRepo3/
  $ emacs <files-in-extra-repo3>
  $ cd ..

3) Build and test local modifications:

  $ cd BUILD/
  $ make -j16
  $ make test   # hopefully all pass!
  $ cd ..

4) View the modifications before committing:

  $ gitdist-mod-status   # Produces a summary table
  $ gitdist-mod status   # See status details

5) Make commits to each repo:

  $ gitdist-mod commit -a   # Opens editor for each repo in order

or use the same commit message for all repos:

  $ emacs commitmsg.txt
  $ echo /commitmsg.txt >> .git/info/exclude
  $ gitdist-mod commit -a -F $PWD/commitmsg.txt

or manually create the commits in each repo separately with raw git:

  $ cd BaseRepo/
  $ git commit -a
  $ cd ExtraRepo1/
  $ git commit -a
  $ cd ..
  $ cd ExtraRepo1/ExtraRepo2/
  $ git commit -a
  $ cd ../..
  $ cd ExtraRepo3/
  $ git commit -a
  $ cd ..

6) Examine the local commits that are about to be pushed:

  $ gitdist-mod-status   # Should show no modified or untracked files!
  $ gitdist-mod log --name-status HEAD ^@{u}   # or ...
  $ gitdist-mod local-stat   # alias defined in 'git-config-alias.sh'

7) Rebase and push local commits to the remote tracking branch:

  $ gitdist pull --rebase
  $ gitdist-mod push
  $ gitdist-mod-status   # Make sure all the pushes occurred!

Another example workflow is creating a new release branch as shown in the
OVERVIEW section (--dist-help=overview).

Other usage tips:

- 'gitdist --help' will run gitdist help, not git help.  If you want raw git
  help, then run 'git --help'.

- Be sure to run 'gitdist-status' to make sure that each repo is on the
  correct local branch and is tracking the correct remote branch.

- In general, for most workflows, one should use the same local branch name,
  remote repo name, and remote tracking branch name in each local git repo.
  That allows commands like 'gitdist checkout --track <remote>/<branch>' and
  'gitdist checkout <branch>' to work correctly.

- For many git commands, it is better to process only repos that are changed
  w.r.t. their tracking branch with 'gitdist-mod <raw-git-command> [git
  arguments]'.  For example, to see the status of only changed repos use
  'gitdist-mod status'.  This allows the usage of gitdist to scale well when
  there are even 100s of git repos.

- As an exception to the last item, a few different types of git commands
  tend to be run on all the git repos like 'gitdist pull', 'gitdist
  checkout', and 'gitdist tag'.

- If one is not sure whether to run 'gitdist' or 'gitdist-mod', then just run
  'gitdist' to be safe.

SCRIPT DEPENDENCIES:

The Python script gitdist only depends on the Python 2.6+ standard modules
'sys', 'os', 'subprocess', and 're'.  Also, of course, it requires some
compatible version of 'git' in your path (but gitdist works with several
versions of git starting as far back as git 1.6+).
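The repo-selection and command fan-out behavior documented in the OVERVIEW
and REPO SELECTION AND SETUP topics above boils down to a simple loop.  The
following Python sketch is only an illustration of that documented behavior
(it is not gitdist's actual implementation), and the helper names
read_dist_repos() and distribute() are invented for this example:

  # Illustrative sketch only (not the actual gitdist implementation)
  import os
  import subprocess

  def read_dist_repos(base_dir):
      # Prefer .gitdist, then .gitdist.default; else just the base repo '.'
      for fname in (".gitdist", ".gitdist.default"):
          path = os.path.join(base_dir, fname)
          if os.path.exists(path):
              with open(path) as f:
                  # First token on each line is the repo dir; an optional
                  # second token is the default branch (see DEFAULT BRANCH
                  # SPECIFICATION above)
                  repos = [line.split()[0] for line in f if line.strip()]
              break
      else:
          repos = ["."]
      # Repo dirs that do not exist locally are silently ignored
      return [r for r in repos if os.path.isdir(os.path.join(base_dir, r))]

  def distribute(base_dir, git_args):
      # Run the same git command in each selected repo, in listed order
      for repo in read_dist_repos(base_dir):
          print("*** Git Repo: " + repo)
          subprocess.call(["git"] + git_args, cwd=os.path.join(base_dir, repo))

  # e.g. the equivalent in spirit of running 'gitdist status' in BaseRepo/:
  # distribute(os.getcwd(), ["status"])

The real gitdist script layers much more on top of this (the repo banners,
the dist-repo-status table, _VERSION_ and _DEFAULT_BRANCH_ substitution,
--dist-not-repos filtering, etc.), but the sketch shows the basic idea of
distributing one git command across the listed repos.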
Below is a snapshot of the output from snapshot-dir.py --help. For more details on the usage of snapshot-dir.py, specifically for snapshotting the <projectDir>/cmake/tribits/ directory, see TriBITS directory snapshotting.
usage: snapshot-dir.py [-h] [--show-defaults] [--orig-dir ORIGDIR]
                       [--dest-dir DESTDIR] [--exclude [EXCLUDE [EXCLUDE ...]]]
                       [--assert-clean-orig-dir] [--allow-dirty-orig-dir]
                       [--assert-clean-dest-dir] [--allow-dirty-dest-dir]
                       [--clean-ignored-files-orig-dir]
                       [--no-clean-ignored-files-orig-dir] [--do-commit]
                       [--skip-commit] [--verify-commit] [--no-verify-commit]
                       [--no-op]

This tool uses rsync and some git commands to snapshot the contents of an
origin directory ('orig-dir') in one Git repo to a destination directory
('dest-dir') in another Git repo and logs version information in a new Git
commit message.

WARNING: Because this tool uses 'rsync --delete', it can be quite destructive
to the contents of dest-dir if one is not careful and selects the wrong
orig-dir or wrong dest-dir!  Therefore, please carefully read this entire
help message and especially the warnings given below and always start by
running with --no-op!

To sync between any two arbitrary directories between two different git
repos, invoking this script from any directory location, one can do:

  $ <some-base-dir>/snapshot-dir.py \
    --orig-dir=<orig-dir> \
    --dest-dir=<dest-dir> \
    [--no-op]

(where --no-op should be used on the initial trial run to check that the
correct rsync and other commands will be run, see below).

To describe how this script is used, consider the desire to snapshot the
directory tree from one git repo:

  <some-orig-base-dir>/orig-dir/

and exactly duplicate it in another git repo under:

  <some-dest-base-dir>/dest-dir/

Here, the directories can be any two directories from local git repos with
any names.  (Note if the paths don't end with '/', then '/' will be added.
Otherwise, rsync will copy the contents from 'orig-dir' into a subdir of
'dest-dir' which is usually not what you want.)

A typical case is to have snapshot-dir.py soft linked into orig-dir/ to allow
a simple sync process.  The linked-in location of snapshot-dir.py gives the
default 'orig-dir' directory automatically (but can be overridden with the
--orig-dir=<orig-dir> option).

When snapshot-dir.py is soft-linked into the 'orig-dir' directory base, the
way to run this script would be:

  $ cd <some-dest-base-dir>/dest-dir/
  $ <some-orig-base-dir>/orig-dir/snapshot-dir.py

By default, this assumes that git repos are used for both the 'orig-dir' and
'dest-dir' locations.  The info about the origin of the snapshot from
'orig-dir' is recorded in the commit message of the 'dest-dir' git repo to
provide traceability for the versions (see below).

By default, this script does the following:

1) Assert that 'orig-dir' in its git repo is clean (i.e. no uncommitted
   files).  (Can be disabled by passing in --allow-dirty-orig-dir.)

2) Assert that <dest-dir>/ in its git repo is clean (same checks as for
   'orig-dir' above).  (Can be disabled by passing in --allow-dirty-dest-dir.
   Also, this must be skipped on the initial snapshot where <dest-dir>/ does
   not exist yet.)

3) Clean out the ignored files from <some-orig-base-dir>/orig-dir using 'git
   clean -xdf' run in that directory.  (Only if
   --clean-ignored-files-orig-dir is passed.)

4) Run 'rsync -cav --delete [other options] <orig-dir>/ <dest-dir>/' to copy
   the contents from 'orig-dir' to 'dest-dir', excluding the '.git/'
   directory if it exists in either git repo dir.  After this runs,
   <dest-dir>/ should be an exact duplicate of <orig-dir>/ (except for
   otherwise noted excluded files).  This rsync will delete any files in
   'dest-dir' that are not in 'orig-dir'.  Note that if there are any ignored
   untracked files in 'orig-dir' that get copied over, then the copied
   .gitignore files should avoid treating them as tracked files in the
   'dest-dir' git repo.  (The rsync command is skipped if the argument
   --no-op is passed.)

5) Run 'git add .' in <dest-dir>/ to stage any files copied over.  (Note that
   git will automatically stage deletes for any files removed by the 'rsync
   -cav --delete' command.  Also note that any untracked, unknown, or ignored
   files in 'orig-dir' that get copied over and are not ignored in 'dest-dir'
   (by the copied-over '.gitignore' files or other git ignore files in
   'dest-dir') will be added to the new commit in 'dest-dir'.)

6) Get the git remote URL from the orig-dir git repo, and the git log for the
   last commit for the directory orig-dir from its git repo.  (If the
   orig-dir repo is on a tracking branch, then this remote will be guaranteed
   to be correct.  However, if the orig-dir repo is not on a tracking branch,
   then the first remote returned from 'git remote -v' will be used.  This
   information is used to provide traceability of the version info back to
   the originating git repo in the commit created in the dest-dir repo.)

7) Commit the updated dest-dir directory using a commit message with the
   orig-dir snapshot version info.  (This will only commit files in
   'dest-dir' and not in other directories in the destination git repo!)
   (The 'git commit' will be skipped if the options --skip-commit or --no-op
   are passed.)

NOTES:

* When first running this tool, use the --no-op option to see what commands
  would be run without actually performing any mutable operations.
  Especially pay attention to the rsync command that would be run and make
  sure it is operating on the desired directories.

* On the first creation of <dest-dir>/, one must pass in
  --allow-dirty-dest-dir to avoid checks of <dest-dir>/ (because it does not
  yet exist).

* This script allows the syncing between base git repos or subdirs within git
  repos.  This is allowed because the rsync command is told to ignore the
  .git/ directory when syncing.

* The cleaning of orig-dir/ using 'git clean -xdf' may be somewhat dangerous
  but it is recommended (by passing in --clean-ignored-files-orig-dir) to
  avoid copying locally-ignored files in orig-dir/ (e.g. ignored in
  .git/info/exclude and not ignored in a committed .gitignore file in
  orig-dir/) that would get copied to and then committed in the dest-dir/
  repo.  Therefore, be sure you don't have any of these types of ignored
  files in orig-dir/ that you want to keep before you run this tool with the
  option --clean-ignored-files-orig-dir!

* Snapshotting with this script will create an exact duplicate of 'orig-dir'
  in 'dest-dir' and therefore if there are any local changes to the files or
  changes after the last snapshot, they will get wiped out.  To avoid this,
  one can do the snapshot on a branch in the 'dest-dir' git repo, then merge
  that branch into the main branch (e.g. 'master') in the 'dest-dir' repo.
  As long as there are no merge conflicts, this will preserve local changes
  for the mirrored directories and files.  This strategy can work well as a
  way to allow for local modifications but still do snapshotting.

WARNINGS:

* Make sure that orig-dir is a non-ignored subdir of the origin git repo and
  that it does not contain any subdirs and files that are ignored by listing
  them in the .git/info/exclude file instead of .gitignore files that would
  get copied over to dest-dir.  (If ignored files are ignored using
  .gitignore files within orig-dir, then those files will be copied by the
  rsync command along with the rest of the files from orig-dir and those
  files will also be ignored after being copied to dest-dir.  However, if
  these files and dirs are ignored because of being listed in the
  .git/info/exclude file in orig-dir, then those files will not be ignored
  when copied over to dest-dir and will therefore be added to the git commit
  in the dest-dir git repo.  That is why it is recommended to run with
  --clean-ignored-files-orig-dir to ensure that all such ignored files are
  removed from orig-dir before doing the rsync.)

* Make sure that the top commit in orig-dir has been pushed (or will be
  pushed) to the remote git repo shown in the output line 'origin remote
  name' or 'origin remote URL'.  Otherwise, others will not be able to trace
  that exact version by cloning that repo and traceability is lost.

* Make sure that dest-dir is a non-ignored subdir of the destination git repo
  and does not contain any ignored subdirs or files that you don't care if
  they are deleted.  (Otherwise, the 'rsync --delete' command will delete any
  files in dest-dir that are not also in orig-dir and since these non-ignored
  files in dest-dir are not under version control, they will not be
  recoverable.)

* As a corollary to the above warnings about ignored files and directories,
  do not snapshot from an orig-dir or to a dest-dir that contains build
  directories or other generated files that you want to keep!  Even if those
  files and directories are ignored in copied-over .gitignore files, the copy
  of those ignored files will still occur.  (Just another reason to keep your
  build directories completely outside of the source tree!)

optional arguments:
  -h, --help            show this help message and exit
  --show-defaults       Show the default option values and do nothing at all.
  --orig-dir ORIGDIR    Original directory that is the source for the
                        snapshotted directory.  If a trailing '/' is missing
                        then it will be added.  The default is the directory
                        where this script lives (or is soft-linked).
                        [default: '<orig-dir>']
  --dest-dir DESTDIR    Destination directory that is the target for the
                        snapshotted directory.  If a trailing '/' is missing
                        then it will be added.  The default dest-dir is the
                        current working directory.  [default: '<dest-dir>']
  --exclude [EXCLUDE [EXCLUDE ...]]
                        List of files/directories/globs to exclude from
                        orig-dir when snapshotting.
  --assert-clean-orig-dir
                        Check that orig-dir is committed and clean.  [default]
  --allow-dirty-orig-dir
                        Skip clean check of orig-dir.
  --assert-clean-dest-dir
                        Check that dest-dir is committed and clean.  [default]
  --allow-dirty-dest-dir
                        Skip clean check of dest-dir.
  --clean-ignored-files-orig-dir
                        Clean out the ignored files from orig-dir/ before
                        snapshotting.
  --no-clean-ignored-files-orig-dir
                        Do not clean out orig-dir/ ignored files before
                        snapshotting.  [default]
  --do-commit           Actually do the commit.  [default]
  --skip-commit         Skip the commit.
  --verify-commit       Do not pass --no-verify to git commit.  [default]
  --no-verify-commit    Pass --no-verify to git commit.
  --no-op               Don't actually run any commands that would change the
                        state (other than the natural side effects of running
                        git query commands).
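To make steps 4) through 7) in the help output above more concrete, here is a
minimal Python sketch of the core mirror-and-commit operation (rsync with
--delete, then 'git add .' and 'git commit' in the destination repo).  This
is only an illustration under simplifying assumptions (no --exclude patterns,
no clean/dirty assertions, no version-info lookup); the function name
snapshot() and the example paths are hypothetical and not part of
snapshot-dir.py:

  # Illustrative sketch only (not the snapshot-dir.py implementation)
  import subprocess

  def snapshot(orig_dir, dest_dir, message):
      # Ensure trailing '/' so rsync copies the *contents* of orig_dir
      # into dest_dir instead of creating a subdirectory
      orig_dir = orig_dir.rstrip("/") + "/"
      dest_dir = dest_dir.rstrip("/") + "/"
      # Mirror orig_dir into dest_dir, never touching either repo's .git/
      subprocess.check_call(["rsync", "-cav", "--delete", "--exclude", ".git",
                             orig_dir, dest_dir])
      # Stage everything rsync changed (adds, modifications, and deletions)
      subprocess.check_call(["git", "add", "."], cwd=dest_dir)
      # Record the snapshot; the real script puts orig-dir version info here
      subprocess.check_call(["git", "commit", "-m", message], cwd=dest_dir)

  # Hypothetical usage:
  # snapshot("/home/me/TriBITS/tribits", "/home/me/MyProject/cmake/tribits",
  #          "Automatic snapshot commit from tribits")

The real script wraps this core with the clean-repo assertions, the remote
URL and commit-log lookup used to build the commit message, and the --no-op
dry-run mode described above.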
Below is a snapshot of the output from checkin-test.py --help. This --help output contains a lot of information about the recommended development workflow (mostly related to pushing commits) and outlines a number of different use cases for using the tool.
Usage: checkin-test.py [OPTIONS]

This tool does testing of a TriBITS-based project using CTest and this script
can actually do the push itself using git in a safe way.  In fact, it is
recommended that one uses this script to push since it will amend the last
commit message with a (minimal) summary of the builds and tests run with
results and/or send out a summary email about the builds/tests performed.

QUICKSTART
-----------

In order to do a safe push, perform the following recommended workflow
(different variations on this workflow are described in the COMMON USE CASES
section below):

1) Commit changes in the local repo:

  # 1.a) See what files are changed, newly added, etc.
  $ git status

  # 1.b) Stage the files you want to commit
  $ git stage <files you want to commit>

  # 1.c) Create your local commits
  $ git commit -- SOMETHING
  $ git commit -- SOMETHING_ELSE
  ...

  # 1.d) Stash whatever changes are left you don't want to test/push
  $ git stash

  NOTE: You can group your commits any way that you would like (see the basic
  git documentation).

  NOTE: When multiple repos are involved, use the 'gitdist' command instead
  of 'git'.  This script is provided at tribits/python_utils/gitdist.  See
  gitdist --help for details.

2) Review the changes that you have made to make sure it is safe to push:

  $ cd $PROJECT_HOME
  $ git local-stat | less              # Look at the full status of local repo
  $ git diff --name-status HEAD ^@{u}  # [Optional] Look at the files that have changed

  NOTE: The command 'local-stat' is a git alias that can be installed with
  the script tribits/python_utils/git-config-alias.sh.  This command is
  recommended over just a raw 'git status' or 'git log' to review commits
  before attempting to test/push commits.  If you have not installed these
  aliases, then run the following commands instead:

    $ git status
    $ git log --oneline --name-status HEAD ^@{u}

  NOTE: If you see any files/directories that are listed as 'unknown'
  returned from 'git local-stat', then you will need to do a 'git add' to
  track them or add them to an ignore list *before* you run the
  checkin-test.py script.  The checkin-test.py script will not allow you to
  push if there are new 'unknown' files or uncommitted changes to tracked
  files.

  NOTE: When multiple repos are involved, use 'gitdist-mod-status' to see the
  state of your repos before pushing.  See gitdist --help for details.

3) Set up the checkin base build directory (first time only):

  $ cd $PROJECT_HOME
  $ echo CHECKIN >> .git/info/exclude
  $ mkdir CHECKIN
  $ cd CHECKIN

  NOTE: You may need to set up some configuration files if CMake cannot find
  the right compilers, MPI, and TPLs by default (see detailed documentation
  below).

  NOTE: You might want to set up a simple shell driver script for your common
  use cases.

  NOTE: You can set up a CHECKIN directory of any name in any location you
  want.  If you create one outside of the main source dir, then you will not
  have to add the git exclude shown above.

4) Do the pull, configure, build, test, and push:

  $ cd $PROJECT_HOME
  $ cd CHECKIN
  $ ../checkin-test.py -j4 --do-all --push

  NOTE: The above will: a) pull updates from the tracking branch, b)
  automatically enable the correct packages based on changed files, c)
  configure and build the changed and downstream packages, d) run the tests,
  e) send you emails about what happened, f) do a final pull from the global
  repo, g) optionally amend the last local commit with the test results, and
  h) finally push local commits to the tracking branch if everything passes.
NOTE: The repo must be on a branch (not a detached head state) with a tracking branch '<remoterepo>/<remotebranch>' so that a raw 'git pull' can be performed to get updates and to push changes. Also, the push is done explicitly to the tracking branch using: git push <remoterepo> <remotebranch> NOTE: You must not have any uncommitted changes or the script will stop right away. To run the script, you will may need to first use 'git stash' to stash away your unstagged/uncommitted changes *before* running this script. NOTE: You need to have SSH public/private keys set up to the remote repo machines for the git commands invoked in the script to work without you having to type a password. If the 'git pull' fails, the detailed output is generally in a pull*.out file (see the checkin-test.out log file for details). NOTE: You can do the final push in a second invocation of the script with a follow-up run with --push and removing --do-all (it will remember the results from the build/test cases just ran). For more details, see detailed documentation below. NOTE: Once you start running the checkin-test.py script, you can go off and do something else and just check your email to see if all the builds and tests passed and if the push happened or not. NOTE: The commands 'cmake', 'ctest', and 'make' must be in your default path before running this script. NOTE: Defaults like -j4 can be set using a local-checkin-test-defaults.py file (see below). For more details on using this script, see the detailed documentation below. DETAILED DOCUMENTATION ----------------------- The following approximate steps are performed by this script: ---------------------------------------------------------------------------- 1) Check to see if the local repo(s) are clean: $ git status NOTE: If any modified or any unknown files are shown, the process will be aborted. The local repo(s) working directory must be clean and ready to push *everything* that is not stashed away. 2) Do a 'git pull' to update the repo (done if --pull or --do-all is set): NOTE: If not doing a pull, use --allow-no-pull or --local-do-all. 3) Select the list of packages to enable forward/downstream based on the package directories where there are changed files (or from a list of packages passed in by the user). NOTE: The automatic enable behavior can be overridden or modified using the options --enable-all-packages=on/off, --enable-packages=<p0>,<p1>,... , --disable-packages=<p0>,<p1>,... , and/or --no-enable-fwd-packages. 4) For each build/test case <BUILD_NAME> (e.g. MPI_DEBUG, SERIAL_RELEASE, extra builds specified with --st-extra-builds and --extra-builds): 4.a) Configure a build directory <BUILD_NAME> in a standard way for all of the packages that have changed and all of the packages that depend on these packages forward/downstream. You can manually select which packages get enabled (see the enable options above). (done if --configure, --do-all, or --local-do-all is set.) 4.b) Build all configured code with 'make' (e.g. with -jN set through -j or --make-options). (done if --build, --do-all, or --local-do-all is set.) 4.c) Run all BASIC tests for enabled packages. (done if --test, --do-all, or --local-do-all is set.) 4.d) Analyze the results of the pull, configure, build, and tests and send email about results. 
(emails only sent out if --send-email-to!="") 5) Do final pull and rebase, append test results to last commit message, and push (done if --push or --do-all is set) 5.a) Do a final 'git pull' (done if --pull or --do-all is set) 5.b) Do 'git rebase <remoterepo>/<remotebranch>' (done if --rebase is set) 5.c) Amend commit message of the most recent commit with the summary of the testing performed. (done if --append-test-results is set.) 5.d) Push the local commits to the global repo (done if --push is set) 6) Send out a final email on the actions performed (i.e. 'DID PUSH' email if a push occurred). (done if --send-email-to!="" is set and --send-email-only-on-failure is *not* set) ---------------------------------------------------------------------------- The recommended way to use this script is to create a new base CHECKIN test directory apart from your standard build directories such as with: $ cd $PROJECT_HOME $ mkdir CHECKIN $ echo CHECKIN >> .git/info/exclude The most basic way to do pre-push testing is with: $ cd CHECKIN $ ../checkin-test.py --do-all [other options] If your MPI installation, other compilers, and standard TPLs can be found automatically, then this is all you will need to do. However, if the setup cannot be determined automatically, then you can add a set of CMake variables that will get read in the files: COMMON.config MPI_DEBUG.config SERIAL_RELEASE.config (or whatever your standard --default-builds are). Actually, for built-in build/test cases, skeletons of these files will automatically be written out with typical CMake cache variables (commented out) that you would need to set. Any CMake cache variables listed in these files will be read in and passed on the configure line to 'cmake'. WARNING: Please do not add any more CMake cache variables than are needed to get the Primary Tested (PT) --default-builds builds to work. Adding other enables/disables will make the builds non-standard and can break these PT builds. The goal of these configuration files is to allow you to specify the minimum environment to find MPI, your compilers, and the required TPLs (e.g. BLAS, LAPACK, etc.). If you need to fudge what packages are enabled, please use the script arguments --enable-packages, --enable-extra-packages, --disable-packages, --no-enable-fwd-packages, and/or --enable-all-packages to control this, not the *.config files! WARNING: Please do not add any CMake cache variables in the *.config files that will alter what packages or TPLs are enabled or what tests are run. Actually, the script will not allow you to change TPL enables in these standard *.config files because to do so deviates from a consistent build configuration for Primary Tested (PT) Code. NOTE: All tentatively-enabled TPLs (e.g. Pthreads and BinUtils) are hard disabled in order to avoid different behaviors between machines where they would be enabled and machines where they would be disabled. NOTE: If you want to add extra build/test cases that do not conform to the standard build/test configurations described above, then you need to create extra builds with the --extra-builds and/or --st-extra-builds options (see below). NOTE: Before running this script, you should first do a 'git status' and 'git diff --name-status HEAD ^@{u}' and examine what files are changed to make sure you want to push what you have in your local working directory. Also, please look out for unknown files that you may need to add to the git repository with 'git add' or add to your ignores list.
There cannot be any uncommitted changes in the local repo before running this script. NOTE: You don't need to run this script if you have not changed any files that affect the build or the tests. For example, if all you have changed are documentation files, then you don't need to run this script before pushing manually. NOTE: To see detailed debug-level information, set TRIBITS_CHECKIN_TEST_DEBUG_DUMP=ON in the env before running this script. COMMON USE CASES (EXAMPLES): ---------------------------- (*) Basic full testing with integrating with global repo(s) without push: ../checkin-test.py --do-all NOTE: This will result in a set of emails getting sent to your email address for the different configurations and an overall push readiness status email. NOTE: If everything passed, you can follow this up with a --push (see below). (*) Basic full testing with integrating with local repo and push: ../checkin-test.py --do-all --push NOTE: By default this will rebase your local commits and amend the last commit with a short summary of test results. This is appropriate for pushing commits that only exist in your local repo and are not shared with any remote repo. (*) Push to global repo after a completed set of tests have finished: ../checkin-test.py [other options] --push NOTE: This will pick up the results for the last completed test runs with [other options] and append the results of those tests to the log of the most recent commit. NOTE: Take the action options for the prior run and replace --do-all with --push but keep all of the rest of the options the same. For example, if you did: ../checkin-test.py --enable-packages=Blah --default-builds=MPI_DEBUG --do-all then follow that up with: ../checkin-test.py --enable-packages=Blah --default-builds=MPI_DEBUG --push NOTE: This is a common use case when some tests are failing which aborted the initial push but you determine it is okay to push anyway and do so with --force-push. (*) Test only the packages modified and not the forward dependent packages: ../checkin-test.py --do-all --no-enable-fwd-packages NOTE: This is a safe thing to do when only tests in the modified packages are changed and not library code. This can speed up the testing process and is to be preferred over not running this script at all. It would be very hard to make this script automatically determine if only test code has changed because every package does not follow a set pattern for tests and test code. (*) Run the most important default (e.g. MPI_DEBUG) build/test only: ../checkin-test.py --do-all --default-builds=MPI_DEBUG (*) The minimum acceptable testing when code has been changed: ../checkin-test.py \ --do-all --enable-all-packages=off --no-enable-fwd-packages \ --default-builds=MPI_DEBUG NOTE: This will do only an MPI DEBUG build and will only build and run the tests for the packages that have directly been changed and not any forward packages. Replace "MPI_DEBUG" with whatever your most important default build is. (*) Test only a specific set of packages and no others: ../checkin-test.py \ --enable-packages=<P0>,<P1>,<P2> --no-enable-fwd-packages \ --do-all NOTE: This will override all logic in the script about which packages will be enabled based on file changes and only the given packages will be enabled. When there are tens of thousands of changed files and hundreds of defined packages, this auto-detection algorithm can be very expensive! 
NOTE: You might also want to pass in --enable-all-packages=off in case the script wants to enable all the packages (see the output in the checkin-test.py log file for details) and you think it is not necessary to do so. NOTE: Using these options is greatly preferred to not running this script at all and should not be any more expensive than the testing you would already do manually before a push. (*) Test changes locally without pulling updates: ../checkin-test.py --local-do-all NOTE: This will just configure, build, test, and send an email notification without updating or changing the status of the local git repo in any way and without any communication with the global repo. Hence, you can have uncommitted changes and still run configure, build, test without having to commit or having to stash changes. NOTE: This will determine what packages to enable and test based on changes w.r.t. to the tracking branch. If not on a tracking branch, or in a detached head state, see below. NOTE: This is typically not a sufficient level of testing in order to push the changes to a shared branch because you have not fully integrated your changes yet with other developers. However, this would be a sufficient level of testing in order to do a commit on the local machine and then pull to a remote machine for further testing and a push (see below). (*) Local test of repo version on a detached head or with no tracking branch: ../checkin-test.py --enable-all-packages=[on|off] \ --enable-packages=<P0>,... --local-do-all By specifying what packages are enabled and not doing a pull or push, the script allows the repo(s) to be in a detached head state or on a branch that does not have a tracking branch. This allows the checkin-test.py script to be used, for example, to test versions using 'git bisect'. (*) Adding extra build/test cases: Often you will be working on Secondary Tested (ST) Code or Experimental (EX) Code and want to include the testing of this in your pre-push testing process along with the standard --default-builds build/test cases which can only include Primary Tested (PT) Code. In this case you can run with: ../checkin-test.py --extra-builds=<BUILD1>,<BUILD2>,... [other options] For example, if you have a build that enables the TPL CUDA you would do: echo " -DTPL_ENABLE_MPI:BOOL=ON -DTPL_ENABLE_CUDA:BOOL=ON " > MPI_DEBUG_CUDA.config and then run with: ../checkin-test.py --extra-builds=MPI_DEBUG_CUDA --do-all This will do the standard --default-builds (e.g. MPI_DEBUG and SERIAL_RELEASE) build/test cases along with your non-standard MPI_DEBUG_CUDA build/test case. NOTE: You can disable the default build/test cases with --default-builds="". However, please only do this when you are not going to push because you need at least one default build/test case (the most important default PT case, e.g. MPI_DEBUG) to do a safe push. (*) Including extra repos and extra packages: You can also use the checkin-test.py script to continuously integrate multiple git repos containing add-on packages. To do so, just run: ../checkin-test.py --extra-repos=<REPO1>,<REPO2>,... [options] NOTE: You have to create local commits in all of the extra repos where there are changes or the script will abort. NOTE: Extra repos can be specified with more flexibility using the --extra-repos-file and --extra-repos-type arguments (also see --ignore-missing-extra-repos). 
NOTE: Each of the last local commits in each of the changed repos will get amended with the appended summary of what was enabled in the build/test (if --append-test-results is set). (*) Avoid changing any of the local commit SHA1s: If you are pushing commits from a shared branch, it is critical that you do not change any of the SHA1s of the commits. Changing the SHA1s for any of the commits will mess up various multi-repo, multi-branch workflows. To avoid changing any of the SHA1s of the local commits, one must run with: ../checkin-test.py --no-rebase --no-append-test-results [options] (*) Performing a remote test/push: If you develop on a slow machine like your laptop, doing an appropriate level of testing can take a long time. In this case, you can pull the changes to another faster remote workstation and do a more complete set of tests and push from there. If you are knowledgeable with git, this will be easy and natural to do, without any help from this script. However, this script can still help and automate the steps and can do so in one command invocation on the part of the developer. On your slow local development machine 'mymachine', do the limited testing with: ../checkin-test.py --do-all --no-enable-fwd-packages On your fast remote test machine, do a full test and push with: ../checkin-test.py \ --extra-pull-from=<remote-repo>:master \ --do-all --push where <remote-repo> is a git remote repo name pointing to mymachine:/some/dir/to/your/src (see 'git help remote'). NOTE: You can of course adjust the packages and/or build/test cases that get enabled on the different machines. NOTE: Once you invoke the checkin-test.py script on the remote test machine and it has pulled the commits from mymachine, then you can start changing files again on your local development machine and just check your email to see what happens on the remote test machine. NOTE: If something goes wrong on the remote test machine, you can either work on fixing the problem there or you can fix the problem on your local development machine and then do the process over again. NOTE: If you alter the commits on the remote machine (such as squashing commits), you will have trouble merging back on your local machine. Therefore, if you have to fix problems, make new commits and don't alter the ones you pulled from your local machine (but rebasing them should be okay as long as the local commits on mymachine are not pushed to other repos). NOTE: Git will resolve the duplicated commits when you pull the commits pushed from the remote machine. Git knows that the commits are the same and will do the right thing when rebasing (or just merging). NOTE: This would also work for multiple repos if the remote name '<remote-repo>' pointed to the right remote repo in all the local repos. (*) Check push readiness status: ../checkin-test.py This will examine results for the last testing process and send out an email stating if a push is ready to perform or not. (*) See the default option values without doing anything: ../checkin-test.py --show-defaults This is the easiest way to figure out what all of the default options are. Hopefully the above documentation, the example use cases, the documentation of the command-line arguments below, and some experimentation will be enough to get you going using this script for all of your pre-push testing and pushes. If that is not sufficient, send email to your development support team to ask for help.
LOCAL DEFAULT COMMAND LINE DEFAULTS ----------------------------------- If the file local-checkin-test-defaults.py exists in the current directory, then it will be read in and will change the project defaults for the command-line arguments. For example, a valid local-checkin-test-defaults.py file would look like: defaults = [ "-j10", "--no-rebase", "--ctest-options=-E '(PackageA_Test1|PackageB_Test2)'" ] Any of the project's checkin-test.py command-line argument defaults can be changed in this way. The updated defaults can be observed by running: ./checkin-test.py --show-defaults Any command-line arguments explicitly passed in will override these local defaults. HANDLING OF PT, ST, AND EX CODE IN BUILT-IN AND EXTRA BUILDS: ------------------------------------------------------------- This script will only process PT (Primary Tested) packages in the --default-builds (e.g. MPI_DEBUG and SERIAL_RELEASE) builds. This is to avoid problems of side-effects of turning on ST packages that would impact PT packages (e.g. an ST package getting enabled that enables an ST TPL which turns on support for that TPL in a PT package producing different code which might work, while the pure PT build without the extra TPL may actually be broken without anyone knowing it). Therefore, any non-PT packages that are enabled (either implicitly through changed files or explicitly by listing in --enable-packages) will be turned off in the --default-builds builds. If none of the enabled packages are PT, then they will all be disabled and the --default-builds builds will be skipped. In order to better support the development of ST and EX packages, this script allows you to define some extra builds that will be invoked and used to determine overall pass/fail before a potential push. The option --st-extra-builds is used to specify extra builds that will test ST packages (and also PT packages if any are enabled). If only PT packages are enabled then the builds specified in --st-extra-builds will still be run. The reasoning is that PT packages may contain extra ST features and therefore if the goal is to test these ST builds it is desirable to also run these builds because they may also impact downstream ST packages. Finally, the option --extra-builds will test all enabled packages, including EX packages, regardless of their test group. Therefore, when using --extra-builds, be careful that you watch what packages are enabled. If you change an EX package, it will be enabled in --extra-builds builds. A few use cases might help better demonstrate the behavior. Consider the following input arguments specifying extra builds --st-extra-builds=MPI_DEBUG_ST --extra-builds=INTEL_DEBUG with the packages Teuchos, Phalanx, and Meros where Teuchos is PT, Phalanx is ST, and Meros is EX. Here is what packages would be enabled in each of the builds: --default-builds=MPI_DEBUG,SERIAL_RELEASE \ --st-extra-builds=MPI_DEBUG_ST \ --extra-builds=INTEL_DEBUG and which packages would be excluded: A) --enable-packages=Teuchos: MPI_DEBUG: [Teuchos] SERIAL_RELEASE: [Teuchos] MPI_DEBUG_ST: [Teuchos] INTEL_DEBUG: [Teuchos] Always enabled! B) --enable-packages=Phalanx: MPI_DEBUG: [] Skipped, no PT packages! SERIAL_RELEASE: [] Skipped, no PT packages! MPI_DEBUG_ST: [Phalanx] INTEL_DEBUG: [Phalanx] C) --enable-packages=Meros: MPI_DEBUG: [] Skipped, no PT packages! SERIAL_RELEASE: [] Skipped, no PT packages! MPI_DEBUG_ST: [] Skipped, no PT or ST packages!
INTEL_DEBUG: [Meros] D) --enable-packages=Teuchos,Phalanx: MPI_DEBUG: [Teuchos] SERIAL_RELEASE: [Teuchos] MPI_DEBUG_ST: [Teuchos,Phalanx] INTEL_DEBUG: [Teuchos,Phalanx] E) --enable-packages=Teuchos,Phalanx,Meros: MPI_DEBUG: [Teuchos] SERIAL_RELEASE: [Teuchos] MPI_DEBUG_ST: [Teuchos,Phalanx] INTEL_DEBUG: [Teuchos,Phalanx,Meros] The --extra-builds=INTEL_DEBUG build is always performed with all of the enabled packages. The logic given above must be understood in order to understand the output given by the script. CONVENTIONS FOR COMMAND-LINE ARGUMENTS: --------------------------------------- The command-line arguments are segregated into three broad categories: a) action commands, b) aggregate action commands, and c) others. a) The action commands are those such as --build, --test, etc. and are shown with [ACTION] in their documentation. These action commands have no off complement. If the action command appears, then the action will be performed. b) Aggregate action commands such as --do-all and --local-do-all turn on sets of other action commands and are shown with [AGGR ACTION] in their documentation. The sub-actions that these aggregate action commands turn on cannot be disabled with other arguments. c) Other arguments, which are not marked with [ACTION] or [AGGR ACTION], either pass in data or turn control flags on or off. EXIT CODE: --------- This script returns 0 if the actions requested are successful. This does not necessarily imply that it is okay to do a push or that a push was done. For example, if only --pull is passed in and is successful, then 0 will be returned but that does *not* mean that it is okay to do a push. Therefore, a return value of 0 is a necessary but not sufficient condition for readiness to push; it depends on the requested actions. Options: -h, --help show this help message and exit --project-configuration=PROJECTCONFIGURATION Custom file to provide configuration defaults for the project. By default, the file project-checkin-test-config.py is looked for in <checkin-test-path> (in case it is symlinked into <projectDir>/checkin-test.py); if not found there, then it is looked for in <checkin-test-path>/../../.. (assuming default TriBITS snapshot <projectDir>/cmake/tribits/ci_support/) If this file is set to a location that is not in the project's base directory, then --src-dir must be set to point to the project's base directory. --show-defaults Show the default option values and do nothing at all. --project-name=PROJECTNAME Set the project's name. This is used to locate various files. If not set, then it reads the project name from the PROJECT_NAME variable set in the file SRCDIR/ProjectName.cmake. --src-dir=SRCDIR The source base directory for code to be tested. The default is determined by the location of the found project-checkin-test-config.py file. --default-builds=DEFAULTBUILDS Comma separated list of builds that should always be run by default. --extra-repos-file=EXTRAREPOSFILE File path to an extra repositories list file. If set to 'project', then <project_dir>/cmake/ExtraRepositoriesList.cmake is read. See the argument --extra-repos for details on how this list is used (default empty '') --extra-repos-type=EXTRAREPOSTYPE The test type of repos to read from <extra_repos_file>. Choices = ('', 'Continuous', 'Nightly', 'Experimental'). [default = ''] --extra-repos=EXTRAREPOS List of comma separated extra repositories containing extra packages that can be enabled. The order in which these repos are listed is not important.
This option overrides --extra-repos-file. --ignore-missing-extra-repos If set, then missing extra repos read in from <extra_repos_file> will be ignored and removed from the list. This option is not applicable if <extra_repos_file>=='' or <extra_repos_type>==''. --require-extra-repos-exist If set, then all listed extra repos must exist or the script will exit. [default] --with-cmake=WITHCMAKE CMake executable to use with cmake -P scripts internally (only set by unit testing code). --skip-deps-update If set, skip the update of the dependency XML file. If the package structure has not changed since the last invocation, then it is safe to use this option. --enable-packages=ENABLEPACKAGES List of comma separated packages to test changes for (example, 'Teuchos,Epetra'). If this list of packages is empty, then the list of packages to enable will be determined automatically by examining the set of modified files from the version control update log. Note that this will skip the auto-detection of changed packages based on changed files. --enable-extra-packages=ENABLEEXTRAPACKAGES List of comma separated packages to test in addition to the packages that are enabled as determined automatically by examining the set of modified files from the version control update log. This option is mostly just used in ACI sync servers. --disable-packages=DISABLEPACKAGES List of comma separated packages to explicitly disable (example, 'Tpetra,NOX'). This list of disables will be appended after all of the listed enables no matter how they are determined (see --enable-packages option). NOTE: Only use this option to remove packages that will not build for some reason. You can disable tests that run by using the CTest option -E passed through the --ctest-options argument in this script. --enable-all-packages=ENABLEALLPACKAGES Determine if all packages are enabled 'on', or 'off', or 'auto' (let other logic decide). Setting to 'off' is appropriate when the logic in this script determines that a global build file has changed but you know that you don't need to rebuild and test every package for a reasonable test. Setting --enable-packages effectively disables this option. Setting this to 'off' does *not* stop the forward enabling of downstream packages for packages that are modified or set by --enable-packages. Setting this to 'on' will skip the automatic detection of changed packages based on changed files. It can be helpful to stop the auto-detection of changed packages when there are thousands of changed files and hundreds of defined packages. Choices = ('auto', 'on', 'off'). [default = 'auto'] --enable-fwd-packages Enable forward packages. [default] --no-enable-fwd-packages Do not enable forward packages. --continue-if-no-updates If set, then the script will continue if no updates are pulled from any repo. [default] --abort-gracefully-if-no-changes-pulled If set, then the script will abort gracefully if no updates are pulled from any repo. --continue-if-no-changes-to-push If set, then the script will continue if there are no changes to push from any repo. [default] --abort-gracefully-if-no-changes-to-push If set, then the script will abort gracefully if there are no changes to push from any repo. --continue-if-no-enables If set, then the script will continue if no packages are enabled. [default] --abort-gracefully-if-no-enables If set, then the script will abort gracefully if no packages are enabled. --extra-cmake-options=EXTRACMAKEOPTIONS Extra options to pass to 'cmake' after all other options. This should be used only as a last resort.
To disable packages, instead use --disable-packages. To change test categories, use --test-categories. --test-categories=TESTCATEGORIES Change the test categories. Can be 'BASIC', 'CONTINUOUS', 'NIGHTLY', or 'HEAVY' (default set by project, see --show-defaults). -j OVERALLNUMPROCS, --parallel=OVERALLNUMPROCS The options to pass to make and ctest (e.g. -j4). --use-makefiles If set, then -G'Unix Makefiles' is used for the backend build tool. Note: The command 'make' must be in the default path. [default] --use-ninja If set, then -GNinja is used for the backend build tool. Note: The command 'ninja' must be in the default path. --make-options=MAKEOPTIONS The options to pass to 'make' (e.g. -j4) or ninja (if --use-ninja given). --ctest-options=CTESTOPTIONS Extra options to pass to 'ctest' (e.g. -j2). --ctest-timeout=CTESTTIMEOUT Timeout (in seconds) for each single 'ctest' test (e.g. 180 for three minutes). This sets the CMake cache var DART_TESTING_TIMEOUT which becomes the default timeout for tests, even when running raw ctest. This value can be overridden using the ctest argument --timeout. Individual tests may have their own timeouts set which will not be impacted by this default global timeout. See the configure variable <Project>_SCALE_TEST_TIMEOUT to scale up timeouts for all tests, even those that have individual timeouts set. --show-all-tests Show all of the tests in the summary email and in the commit message summary (see --append-test-results). --no-show-all-tests Don't show all of the test results in the summary email. [default] --without-default-builds Skip the default builds (same as --default-builds=''). You would use this option along with --extra-builds=BUILD1,BUILD2,... to run your own local custom builds. --st-extra-builds=STEXTRABUILDS List of comma-separated ST extra build names. For each of the build names in --st-extra-builds=<BUILD1>,<BUILD2>,..., there must be a file <BUILDN>.config in the local directory alongside the COMMON.config file that defines the special build options for the extra build. --ss-extra-builds=SSEXTRABUILDS DEPRECATED! Use --st-extra-builds instead! (Default empty '') --extra-builds=EXTRABUILDS List of comma-separated extra build names. For each of the build names in --extra-builds=<BUILD1>,<BUILD2>,..., there must be a file <BUILDN>.config in the local directory alongside the COMMON.config file that defines the special build options for the extra build. --log-file=LOGFILE File used for detailed log info. --send-email-to=SENDEMAILTO List of comma-separated email addresses to send email notification to after every build/test case finishes and at the end for an overall summary and push status. By default, this is the email address you set for git returned by `git config --get user.email`. In order to turn off email notification, just set --send-email-to='' and no email will be sent. --skip-case-send-email If set, then if a build/test case is skipped for some reason (i.e. because no packages are enabled) then an email will go out for that case. [default] --skip-case-no-email If set, then if a build/test case is skipped for some reason (i.e. because no packages are enabled) then no email will go out for that case. (opposite of --skip-case-send-email) [default] --send-build-case-email=SENDBUILDCASEEMAIL Determines when email goes out to --send-email-to=<email> for a build case. But the final status email will still go out if --send-email-to=<email> is not empty. Choices = ('always', 'only-on-failure', 'never').
[default = 'always'] --send-email-for-all If set, then emails will get sent out for all operations. [default] --send-email-only-on-failure If set, then emails will only get sent out for failures. --send-email-to-on-push=SENDEMAILTOONPUSH List of comma-separated email addresses to send email notification to on a successful push. This is used to log pushes to a central list. In order to turn off this email notification, just set --send-email-to-on-push='' and no email will be sent to these email lists. --force-push Force the local push even if there are build/test errors. WARNING: Only do this when you are 100% certain that the errors are not caused by your code changes. This only applies when --push is specified. --no-force-push Do not force a push if there are failures. [default] --do-push-readiness-check Check the push readiness status at the end and send email if not actually pushing. [default] --skip-push-readiness-check Skip push status check. --rebase Rebase the local commits on top of <remoterepo>/<remotebranch> before amending the last commit and pushing. Rebasing keeps a nice linear commit history like with CVS or SVN and will work perfectly for the basic workflow of adding commits to the 'master' branch and then syncing up with <remoterepo>/<remotebranch> before the final push. [default] --no-rebase Do *not* rebase the local commits on top of <remoterepo>/<remotebranch> before amending the final commit and pushing. This allows for some more complex workflows involving local branches with multiple merges. However, this will result in non-linear history and will allow for trivial merge commits with <remoterepo>/<remotebranch> to get pushed. This mode should only be used in cases where the rebase mode will not work or when it is desired to use a merge commit to integrate changes on a branch that you wish to be able to easily back out. For sophisticated users of git, this may in fact be the preferred mode. --append-test-results Before the final push, amend the most recent local commit by appending a summary of the test results. This provides a record of what builds and tests were performed in order to test the local changes. This is only performed if --push is also set. NOTE: If the same local commit is amended more than once, the prior test summary sections will be overwritten with the most recent test results from the current run. [default] --no-append-test-results Do not amend the last local commit with test results. NOTE: If you have uncommitted local changes that you do not want this script to commit then you must select this option to avoid this last amending commit. Also, if you are pushing commits from a shared branch and don't want to change any of the SHA1s for the commits, then you must set this option! --extra-pull-from=EXTRAPULLFROM Optional extra git pull(s) to merge in changes from after pulling in changes from the tracking branch. The format of this argument is: ...,<local-repoi>:<remote-repoi>:<remote-branchi>,... where each pull specification gives the name (not the directory) of the local repo <local-repoi>, the remote repo name <remote-repoi>, and the branch in the remote repo to pull <remote-branchi>. If only two colons ':' are given then a pull field takes the form ...,<remote-repo>:<remote-branch>,... where the remote <remote-repo> must be defined in all the repos and the branch <remote-branch> must exist in all the remote repos. If the <local-repoi> is empty such as with ...,:<remote-repoi>:<remote-branchi>,...
then this matches the base git repo. The extra pull(s) are only done if --pull is also specified. NOTE: when using --extra-repos=<repo0>,<repo1>,... the <local-repoi> must be a named repository that is present in all of the git repos or it will be an error. --allow-no-pull Allow for no pull to be performed while still doing the other actions. This option is useful for testing against local changes without having to get the updates from the global repo. However, if you don't pull, you can't push your changes to the global repo. WARNING: This does *not* stop a pull attempt from being performed by --pull or --do-all! --wipe-clean [ACTION] Blow away existing build directories and build/test results. The action can be performed on its own or with other actions in which case the wipe clean will be performed before any other actions. NOTE: This will only wipe clean the builds that are specified and will not touch those being ignored (e.g. SERIAL_RELEASE will not be removed if --default-builds=MPI_DEBUG is specified). --pull [ACTION] Do the pull from the tracking branch and optionally also merge in changes from the repo pointed to by --extra-pull-from. --configure [ACTION] Do the configure step. --build [ACTION] Do the build step. --test [ACTION] Do the running of the enabled tests. --local-do-all [AGGR ACTION] Do configure, build, and test with no pull (same as setting --allow-no-pull --configure --build --test). This is the same as --do-all except it does not do --pull and also allows for no pull. --do-all [AGGR ACTION] Do update, configure, build, and test (same as --pull --configure --build --test). NOTE: This will do a --pull regardless of whether --allow-no-pull is set or not. To avoid the pull, use --local-do-all. --push [ACTION] Push the committed changes in the local repo into the remote repo pointed to by the tracking branch. --execute-on-ready-to-push=EXECUTEONREADYTOPUSH [ACTION] A command to execute on successful execution and 'READY TO PUSH' status from this script. This can be used to do a remote SSH invocation to a remote machine to do a remote pull/test/push after this machine finishes.
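The QUICKSTART section of the help output above suggests setting up a simple shell driver script for your common use cases. Below is a minimal sketch of such a wrapper; the file name (checkin-test-driver.sh), the relative path to checkin-test.py, and the specific option values (the -j level, the timeout, and the ST extra build name) are illustrative assumptions and should be adapted to the project and machine:

  #!/bin/bash
  # Hypothetical driver script (e.g. CHECKIN/checkin-test-driver.sh) that
  # captures the checkin-test.py options used for every pre-push test on
  # this machine.  Any extra arguments (e.g. --push) are passed through.
  EXTRA_ARGS=$@
  ../checkin-test.py \
    -j4 \
    --ctest-timeout=300 \
    --st-extra-builds=MPI_DEBUG_ST \
    --do-all \
    $EXTRA_ARGS

A developer would then run './checkin-test-driver.sh' for the full test cycle and './checkin-test-driver.sh --push' to push once everything passes.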
Below is a snapshot of the output from is_checkin_tested_commit.py --help. For more details see Using Git Bisect with checkin-test.py workflows.
File "../../ci_support/is_checkin_tested_commit.py", line 27 print "NOTE: TRIBITS_IS_CHECKIN_TESTED_COMMIT=ON set in env, doing debug dump ..." ^ SyntaxError: Missing parentheses in call to 'print'. Did you mean print("NOTE: TRIBITS_IS_CHECKIN_TESTED_COMMIT=ON set in env, doing debug dump ...")?
Below is a snapshot of the output from get-tribits-packages-from-files-list.py --help. For more details see TriBITS Project Dependencies XML file and tools.
Usage: get-tribits-packages-from-files-list.py --deps-xml-file=<DEPS_XML_FILE> \ --files-list-file=<FILES_LIST_FILE> [--project-dir=<projectDir>] This script returns a comma-separated list of all of the project's TriBITS packages that must be directly tested for changes in the input list of files. This may also include the special package name 'ALL_PACKAGES' which means that at least one changed file (e.g. <projectDir>/CMakeLists.txt) should result in having to test all of the TriBITS packages in the project. The logic for which file changes should trigger testing all packages can be specialized for the project through the Python module: <projectDir>/cmake/ProjectCiFileChangeLogic.py (if that file exists). This script is used in continuous integration testing workflows involving TriBITS projects where only packages impacted by the changes are tested. For such a scenario, the list of changed files can come from: git diff --name-only <upstream>..<branch-tip> > changed-files.txt where <upstream> (e.g. origin/master) is the commit reference that the local branch was created from and <branch-tip> is the tip of the topic branch. Options: -h, --help show this help message and exit --deps-xml-file=DEPSXMLFILE File containing the TriBITS-generated XML data-structure listing the packages, dir names, dependencies, etc. --files-list-file=FILESLISTFILE File containing the list of modified files relative to the project base directory, one file per line. --project-dir=PROJECTDIR Base project directory. Used to access more specialized logic beyond what is known in the <DEPSXMLFILE>. If empty '', then it will be set automatically if TriBITS is in the standard location w.r.t. the project when this script is run from the TriBITS dir.
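For example, a continuous-integration driver might determine the set of packages impacted by a topic branch roughly as follows. The dependencies XML file name (ProjectPackageDependencies.xml), the origin/master upstream reference, and the project directory path are illustrative assumptions; the XML file must first have been generated by TriBITS for the project:

  # List the files changed on the topic branch (as described in the help above)
  git diff --name-only origin/master..HEAD > changed-files.txt

  # Map those files to the TriBITS packages that must be directly tested;
  # the comma-separated package list (possibly 'ALL_PACKAGES') is printed out
  ./get-tribits-packages-from-files-list.py \
    --deps-xml-file=ProjectPackageDependencies.xml \
    --files-list-file=changed-files.txt \
    --project-dir=/path/to/project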
Below is a snapshot of the output from get-tribits-packages-from-last-tests-failed.py --help. For more details see TriBITS Project Dependencies XML file and tools.
Usage: get-tribits-packages-from-last-tests-failed.py --deps-xml-file=<file> \ --last-tests-failed-file=<file> This returns a comma-separated list of TriBITS packages that correspond to the list of failing tests provided in the passed-in file LastTestsFailed*.log generated by ctest in the directory <build-dir>/Testing/Temporary/. Options: -h, --help show this help message and exit --deps-xml-file=DEPSXMLFILE File containing the listing of packages, dir names, dependencies, etc. --last-tests-failed-file=LASTTESTSFAILEDFILE Path to file LastTestsFailed*.log file generated by CTest under <build-dir>/Testing/Temporary/.
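A typical follow-up step in a CI loop is to map the tests that failed in the previous build onto the packages that need to be re-enabled and retested. A sketch, where the build directory path, the dependencies XML file name, and the exact LastTestsFailed*.log file name are illustrative:

  # LastTestsFailed*.log is written by ctest under <build-dir>/Testing/Temporary/
  ./get-tribits-packages-from-last-tests-failed.py \
    --deps-xml-file=ProjectPackageDependencies.xml \
    --last-tests-failed-file=BUILD/Testing/Temporary/LastTestsFailed.log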
Below is a snapshot of the output from filter-packages-list.py --help. For more details see TriBITS Project Dependencies XML file and tools.
Usage: filter-packages-list.py --deps-xml-file=<PROJECT_DEPS_FILE> \ --input-packages-list=<P1>,<P2>,... --keep-test-test-categories=<T1>,<T2>,... This script takes in a comma-separated list of TriBITS package names in --input-packages-list='<P1>,<P2>,...' and then filters out the names of packages that don't match the test categories listed in --keep-test-test-categories='<T1>,<T2>,...' (where each package's test category is given in the input TriBITS-generated project dependencies file --deps-xml-file=<PROJECT_DEPS_FILE>). The filtered list of packages is printed to STDOUT as a comma-separated list. For example, to keep only the Primary Tested (PT) packages, use: filter-packages-list.py --keep-test-test-categories=PT [other args] To keep both Primary Tested and Secondary Tested packages, use: filter-packages-list.py --keep-test-test-categories=PT,ST [other args] To keep all packages, use: filter-packages-list.py --keep-test-test-categories=PT,ST,EX [other args] (or don't bother running the script). Options: -h, --help show this help message and exit --deps-xml-file=DEPSXMLFILE TriBITS-generated XML file containing the listing of packages, dir names, dependencies, etc. --input-packages-list=INPUTPACKAGESLIST Comma-separated list of packages that need to be filtered (i.e. "P1,P2,..."). --keep-test-test-categories=KEEPTESTTESTCATEGORIES List of package types to keep (i.e. "PT,ST,EX").
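These small tools compose naturally in a shell pipeline. For example, a CI script might take the packages impacted by a set of changed files and filter them down to only PT and ST packages before enabling them; the file names and variable names below are illustrative:

  # Packages impacted by the changed files (see the previous tool above)
  CHANGED_PACKAGES=$(./get-tribits-packages-from-files-list.py \
    --deps-xml-file=ProjectPackageDependencies.xml \
    --files-list-file=changed-files.txt)

  # Keep only the PT and ST packages from that list
  PACKAGES_TO_TEST=$(./filter-packages-list.py \
    --deps-xml-file=ProjectPackageDependencies.xml \
    --input-packages-list="$CHANGED_PACKAGES" \
    --keep-test-test-categories=PT,ST)

  echo "Packages to enable and test: $PACKAGES_TO_TEST"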
Below is a snapshot of the output from install_devtools.py --help.
Usage: install-devtools.py [OPTIONS] This script drives the installation of a number of tools needed by many TriBITS-based projects. The most typical usage is to first create a scratch directory with:: mkdir scratch cd scratch and then run: install-devtools.py --install-dir=<dev_env_base> \ --parallel=<num-procs> --do-all By default, this installs the following tools in the dev env install directory: <dev_env_base>/ common_tools/ autoconf-<autoconf-version>/ cmake-<cmake-version>/ gitdist gcc-<gcc-version>/ load_dev_env.[sh,csh] toolset/ gcc-<gcc-version>/ mpich-<mpich-version>/ The default versions of the tools installed are: * autoconf-2.69 * cmake-3.3.2 * gcc-4.8.3 * mpich-3.1.3 The tools installed under common_tools/ only need to be installed once independent of any compilers that may be used to build TriBITS-based projects. The tools installed under gcc-<gcc-version>/ are specific to a GCC compiler and MPICH configuration and build. The download and install of each of these tools is driven by its own install-<toolname>.py script in the same directory as install-devtools.py. Before running this script, some version of a C and C++ compiler must already be installed on the system. At a high level, this script performs the following actions. 1) Create the base directories (if they don't already exist) and install load_dev_env.sh (csh). (if --initial-setup is passed in.) 2) Download the sources for all of the requested common tools and compiler toolset. (if --download is passed in.) 3) Configure, build, and install the requested common tools under common_tools/. (if --install is passed in.) 4) Configure, build, and install the downloaded GCC and MPICH tools. First install GCC then MPICH using the installed GCC and install under gcc-<gcc-version>/. (if --install is passed in.) The informational arguments to this function are: --install-dir=<dev_env_base> The base directory that will be used for the install. There is no default. If this is not specified then the script will abort. --source-git-url-base=<url_base> Gives the base URL used to get the tool sources from. The default is: https://github.com/tribitsdevtools/ This is used to build the full git URL as: <url_base><tool_name>-<tool_version>-base This can also accommodate gitolite repos and other directory structures, for example, with: git@<host-name>:prerequisites/ --common-tools=all Specifies the tools to download and install under common_tools/. One can pick specific tools with: --common-tools=autoconf,cmake,... This will download and install the default versions of these tools. To select specific versions, use: --common-tools=autoconf:2.69,cmake:3.3.2,... The default is 'all'. To install none of these, pass in empty: --common-tools='' (NOTE: A version of 'git' is *not* installed using this script but can be installed using the script install-git.py. But note the extra packages that must be installed on a system in order to fully install git and its documentation. All of the git-related TriBITS tools can use any recent version of git and most systems will already have a current-enough version of git so there is no need to install one to do effective development.) --compiler-toolset=all Specifies GCC and MPICH (and other compiler-specific tools) to download and install under gcc-<gcc-version>/toolset/.
One can pick specific components with: --compiler-toolset=gcc,mpich or specific versions with: --compiler-toolset=gcc:4.8.3,mpich:3.1.3 Of course if one is only installing GCC with an existing installed MPICH, one will also need to reinstall MPICH. The default is 'all'. To install none of these, pass in empty: --compiler-toolset='' The action arguments are: --initial-setup: Create <dev_env_base>/ directories and install load_dev_env.sh --download: Download all of the requested tools --install: Configure, build, and install all of the requested tools --do-all: Do everything. Implies --initial-setup --download --install To modify the permissions of the installed files, see the options --install-owner, --install-group, and --install-for-all. Note that the user can see what operations and commands would be run without actually running them by passing in --no-op. This can be used to show how to run each of the individual install commands so that the user can run them themselves and customize them as needed. If the user needs more customization, then they can just run with --do-all --no-op and see what commands are run to install things and then they can run the commands themselves manually and make whatever modifications they need. NOTE: The actual tool installs are performed using the scripts: * install-autoconf.py * install-cmake.py * install-gcc.py * install-git.py * install-mpich.py * install-openmpi.py More information about what versions are installed, how they are installed, etc. is found in these scripts. Note that some of these scripts apply patches for certain versions. For details, look at the --help output from these scripts and look at the implementation of these scripts. Options: -h, --help show this help message and exit --install-dir=INSTALLDIR The base directory <dev_env_base> that will be used for the install. There is no default. If this is not specified then the script will abort. --install-owner=INSTALLOWNER If set, then 'chown -R <install-owner> <install-dir>' will be run after install. Note that you can only change the owner when running this script as sudo. --install-group=INSTALLGROUP If set, then 'chgrp -R <install-group> <install-dir>' and 'chmod -R g+rX <install-dir>' will be run after install. Note that you can only change to a group that the owner is a member of. --install-for-all If set, then 'chmod -R a+rX <install-dir>' will be run after install. --no-install-for-all If set, then <install-dir> is not opened up to everyone. --source-git-url-base=SOURCEGITURLBASE Gives the base URL <url_base> for the git repos to obtain the source from. --load-dev-env-file-base-name=LOADDEVENVFILEBASENAME Base name of the load dev env script that will be installed. (Default = 'load_dev_env') --common-tools=COMMONTOOLS Specifies the common tools to download and install under common_tools/. Can be 'all', or empty '', or any combination of 'gitdist,autoconf,cmake' (separated by commas, no spaces). --compiler-toolset=COMPILERTOOLSET Specifies GCC and MPICH and other compiler-specific tools to download and install under gcc-<gcc-version>/toolset/. Can be 'all', or empty '', or any combination of 'gcc,mpich' (separated by commas, no spaces). --parallel=PARALLELLEVEL Number of parallel processes to use in the build. The default is just '1'. Use something like '8' to get faster parallel builds. --do-op Do all of the requested actions [default]. --no-op Skip all of the requested actions and just print what would be done. --show-defaults [ACTION] Show the defaults and exit.
--initial-setup [ACTION] Create base directories under <dev_env_base>/ and install load_dev_env.[sh,csh]. --download [ACTION] Download all of the tools specified by --common-tools and --compiler-toolset. WARNING: If the source for a tool has already been downloaded, it will be deleted (along with the build directory) and downloaded from scratch! --install [ACTION] Configure, build, and install all of the tools specified by --common-tools and --compiler-toolset. --show-final-instructions [ACTION] Show final instructions for using the installed dev env. --do-all [AGGR ACTION] Do everything. Implies --initial-setup --download --install --show-final-instructions
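Putting the pieces of the help output together, a typical one-shot install might look like the following; the install prefix and parallel level are example values, and install-devtools.py is assumed to be reachable on the PATH or referenced by its full path in the TriBITS source tree:

  # Create a throw-away scratch directory and drive the full install from it
  mkdir scratch
  cd scratch
  install-devtools.py \
    --install-dir=$HOME/TriBITS-dev-env \
    --parallel=8 \
    --do-all

After the install completes, the load_dev_env.sh (or load_dev_env.csh) script installed under the chosen install directory is intended to be sourced to load the installed tools into the shell environment.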