CMake configuration

Last update: 28 May 2024

This page describes how to write CMakeLists.txt files for building ATLAS (offline or analysis) packages.

Overall declarations


atlas_subdir

Every CMakeLists.txt file in ATLAS packages should begin with a “package declaration”: a line that declares the name of the current package, using the atlas_subdir function:

atlas_subdir( MyPackage )

The function declares a name for the current package, just like the package command did in CMT requirements files.

atlas_depends_on_subdirs [obsolete]

Ideally you should not need to use this function to set up the build of your package: as long as the dependencies are declared correctly on all the libraries/executables/tests of the package, this function call can simply be removed from the package.

It is a left-over from the automatic CMT migration and was used to declare additional package dependencies, e.g. for header-only dependencies. The latter can equally be achieved by using an interface library. The syntax is:

atlas_depends_on_subdirs( PUBLIC Control/AthContainers Control/AthLinks
   PRIVATE Event/xAOD/xAODEgamma Event/xAOD/xAODEventInfo )

This function can be called with any number of package names. Packages listed after the PUBLIC keyword are treated as public dependencies of the package, and packages listed after PRIVATE become private dependencies.
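Such a header-only dependency can be expressed with an interface library instead. A minimal sketch, assuming a hypothetical header-only package called MyPackage whose headers use AthContainers and AthLinks:

```cmake
# Interface "library": nothing is compiled here. It only carries the
# header directory and the public dependencies for its clients.
atlas_add_library( MyPackageLib
   MyPackage/*.h
   INTERFACE
   PUBLIC_HEADERS MyPackage
   LINK_LIBRARIES AthContainers AthLinks )
```

Clients then simply list MyPackageLib in their own LINK_LIBRARIES, inheriting the include path and public dependencies.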

Generic build target declarations

This section describes how to set up the building of libraries and executables in a package.


atlas_add_library

Building an installed library is done by calling the atlas_add_library function. This function demonstrates all the complexities of CMake function argument parsing, so let’s look at the accepted arguments one by one.

atlas_add_library( libraryName <sources>
   [PUBLIC_HEADERS HeaderDir... | NO_PUBLIC_HEADERS]
   [SHARED | INTERFACE | STATIC | OBJECT | MODULE]
   [INCLUDE_DIRS dir1...]
   [PRIVATE_INCLUDE_DIRS dir2...]
   [LINK_LIBRARIES lib1...]
   [PRIVATE_LINK_LIBRARIES lib2...]
   [DEFINITIONS def1...] )

  • The first argument of the function call must be the name of the library. This is the name of the “CMake target” that is created in the background, so in other places of the code you can refer to this library by this name. (More on this for the LINK_LIBRARIES and PRIVATE_LINK_LIBRARIES options.)
  • Next you declare all the source files that need to be built into the library. (Though you don’t always have to; see the INTERFACE option later.) This declaration accepts wildcards, so in the simplest case you can just write src/*.cxx as the argument. (All relative paths given in the CMakeLists.txt files are taken relative to the package’s main directory.) Note that technically you only have to list the source (.cxx) files here. But in order for the header files of the library to show up in some IDEs, those have to be listed here as well. Something like “HeaderDir/*.h” works for that perfectly.
  • Next you need to decide whether your library provides any public headers. The convention with installed libraries is that they provide all their public headers in a subdirectory of the package that has the same name as the package itself. So for a regular installed library you would use PUBLIC_HEADERS MyPackage here. Notice that it’s also possible to declare multiple header directories for your library. But it’s not conventional in the ATLAS code to use this feature. Finally, if the library doesn’t provide any public headers, you have to use the NO_PUBLIC_HEADERS argument. This option doesn’t take any additional arguments.
  • Next you can decide what sort of library it should be.
    • SHARED libraries are the default. These produce (as the name suggests) shared libraries, which other components can link against.
    • INTERFACE libraries are purely a “CMake concept”. These are “libraries” that describe header files. Either in packages that only hold headers, or in packages that hold some pure virtual interface headers, and may also hold the implementation of those headers. (Where the implementation would be compiled with a separate CMake call.)
    • STATIC libraries produce (as the name suggests) static libraries. Note that by default static libraries are not built with “position independent code”. In case you want to use the static library in one or more shared libraries, you need to declare that it should be compiled with position independent code. (Using the POSITION_INDEPENDENT_CODE CMake target property.)
    • OBJECT libraries are built by CMake as .o object files, which would later be linked into shared or static libraries. Such libraries are mainly used in “big/fat” libraries and when writing CUDA code.
    • MODULE libraries should in practice never be created with an atlas_add_library(...) call. It is an option that atlas_add_component(...) uses internally. (Just for completeness, MODULE libraries are libraries that can be loaded at runtime, but can’t be linked against during the build.)
  • Next you need to list all the extra include directories that are needed in order to build the library. Note that you only need to list directories here referring to “externals”. When you need to have this library use another library declared by atlas_add_library in this package or another one, it’s enough to just list those dependencies in LINK_LIBRARIES or PRIVATE_LINK_LIBRARIES. So, you would only have declarations like ${Boost_INCLUDE_DIRS}, ${EIGEN_INCLUDE_DIRS}, etc. here.
    • You have to put all directories that are used in the public headers of the library into INCLUDE_DIRS.
    • All the directories that are only used by private headers, and by the source files, need to be listed as PRIVATE_INCLUDE_DIRS.
  • Next you need to list the libraries that this library needs to be linked against. The linked libraries come in two flavours.
    • Libraries built by CMake in one of the ATLAS packages need to be referred to with their “target name”. I.e. the libraryName argument used in their own atlas_add_library call. In this case you don’t need to do anything extra for the build system to find the headers of this library. CMake will know where to find the header files of a library that it itself created.
    • Libraries provided by external packages need to be added using the variables provided by their find_package call. So, expressions like ${Boost_SYSTEM_LIBRARY} or ${CLHEP_LIBRARIES}. For these cases you also have to add (an) appropriate entry/entries to the INCLUDE_DIRS and/or PRIVATE_INCLUDE_DIRS parameter(s).
  • Just like the include directories, the linked libraries can also be specified either as public or private dependencies. You should list libraries from which headers are used in public headers of this library as public dependencies, while libraries whose headers only appear in source files or private headers, must be listed as private dependencies.
  • Finally libraries can have public and private “definitions” applied to them. These are declarations in the format NAME VALUE, which are passed to the pre-processor as definitions/macros.
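The STATIC case above needs special treatment before the result can be linked into a shared library. A sketch, with a hypothetical library name, using the standard CMake target property mentioned above:

```cmake
# Static convenience library without public headers.
atlas_add_library( MyHelpers src/*.cxx
   STATIC
   NO_PUBLIC_HEADERS )

# Compile it as position independent code, so that shared
# libraries may link it in.
set_target_properties( MyHelpers PROPERTIES
   POSITION_INDEPENDENT_CODE TRUE )
```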

When all is said and done, a full library declaration may look like:

atlas_add_library( MySuperLibrary MySuperLibrary/*.h Root/*.cxx src/*.cxx
   PUBLIC_HEADERS MySuperLibrary
   INCLUDE_DIRS ${Boost_INCLUDE_DIRS}
   PRIVATE_INCLUDE_DIRS ${CLHEP_INCLUDE_DIRS}
   LINK_LIBRARIES ${Boost_LIBRARIES} AthContainers
   PRIVATE_LINK_LIBRARIES ${CLHEP_LIBRARIES} )


atlas_add_component

This function sets up the build of a “component library” in the package, which is a MODULE library that:

  • Can’t be linked against;
  • Implements “Gaudi components” (algorithms, tools, services, etc.);
  • Has Python configurables generated from it during the build;
  • Gets its component names extracted during the build.

The full argument list of the function is:

atlas_add_component( libraryName source1.cxx source2.cxx...
   [NOCLIDDB]
   [INCLUDE_DIRS Include1...]
   [LINK_LIBRARIES Library1 Library2...] )

Its arguments are:

  • The library name and source file declarations work the same way as for atlas_add_library.
  • NOCLIDDB: By default component libraries are processed by the genCLIDDB application during the build in order to extract all the class IDs out of them. But for some libraries, most notably the libraries needed for genCLIDDB itself, this needs to be turned off. For regular user libraries this flag should not be used.
  • INCLUDE_DIRS and LINK_LIBRARIES are treated the same as in atlas_add_library. The only difference here is that you don’t separate public and private dependencies. Since component libraries can’t be linked against, technically all dependencies are private in the end.
  • A component library is never meant to provide public headers, so the function doesn’t have an option for declaring such things.

If your package builds both an installed library (atlas_add_library( myPackageLib ... )) and a component library with the same library dependencies, the component library can simply be declared with a single dependency on the installed library:

atlas_add_component( myPackage ...
   LINK_LIBRARIES myPackageLib)
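Spelled out a little more, a component library declaration might look like the following sketch (the package name and the MyPackageLib dependency are hypothetical):

```cmake
# MODULE library implementing Gaudi components. It can't be linked
# against, so all of its dependencies are effectively private.
atlas_add_component( MyPackage
   src/components/*.cxx
   LINK_LIBRARIES AthenaBaseComps GaudiKernel MyPackageLib )
```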


atlas_add_tpcnv_library

This is a simple helper function that builds a “normal” library with the same arguments that atlas_add_library accepts, and just extracts the component information from the produced library. It produces a library that other components can link against, and which Athena can auto-load for the converter(s) declared in it.
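A call to this function could look like the following sketch (the library names, apart from AthenaPoolCnvSvcLib, are hypothetical):

```cmake
# Normal, linkable library holding transient/persistent converters;
# the component information is extracted from it after the build.
atlas_add_tpcnv_library( MyEventTPCnv src/*.cxx
   PUBLIC_HEADERS MyEventTPCnv
   LINK_LIBRARIES AthenaPoolCnvSvcLib MyEvent )
```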


atlas_add_poolcnv_library

This is a highly specialised function for building POOL converter libraries, which are typically housed by packages with an “AthenaPool” suffix in their name.

In order to ease the transition from CMT to CMake, the argument list of this function mimics the parameter list of the poolcnv CMT pattern. Its arguments are:

atlas_add_poolcnv_library( EventInfoAthenaPool source1.cxx...
   FILES xAODEventInfo/EventInfo.h...
   [INCLUDE_DIRS Include1...]
   [LINK_LIBRARIES Library1...]
   [TYPES_WITH_NAMESPACE xAOD::EventInfo...]
   [MULT_CHAN_TYPES My::SuperType...]
   [CNV_PFX xAOD] )

The roles of these arguments are:

  • The library name and source file declarations go exactly the same way as in the other library declaration functions. Usually for AthenaPool packages we pick up all source files from the package with src/*.cxx.
  • INCLUDE_DIRS and LINK_LIBRARIES are treated the same as for atlas_add_component. POOL converter libraries can’t be linked against, so all dependencies are treated as private dependencies.
  • FILES: List of files that declare EDM objects that we need to generate a converter for. The convention is exactly the same as for the poolcnv pattern: a file called Foo.h is expected to hold a class called Foo, for which this function declares a converter called FooCnv. If the source file for FooCnv (src/FooCnv.cxx) is found in the package, it is used as-is. If not, a default POOL converter is generated during the build. Notice that the header files need to be declared with the name that you would use to include them in a source file, which usually means <PkgName>/<HdrName>.
  • TYPES_WITH_NAMESPACE: The convention in ATLAS is that header file names would not describe the namespace of the class that they hold. As explained earlier, a file called Foo.h is expected to hold a class called Foo. But according to the ATLAS naming rules, it could also hold a class called Bar::Foo. In order to tell the function that the class in one of the headers is in a namespace, the namespaced type names need to be listed here. Note that this means that it’s impossible to build POOL converters, within the same library, for classes that have the same name, but are in different namespaces. (Putting converters for such types into separate libraries is absolutely possible though.) Note that the POOL converter for Bar::Foo is called FooCnv by default. (See the CNV_PFX for details on this.)
  • MULT_CHAN_TYPES: There are two types of POOL converters. Ones inheriting from AthenaPoolCnvSvc/T_AthenaPoolCnv.h, and ones inheriting from AthenaPoolCnvSvc/T_AthenaPoolCoolMultChanCnv.h. By default auto-generated converters are made to inherit from the former. By listing a type name after this argument, it gets an auto-generated converter inheriting from the second class.
  • CNV_PFX: When we do have two or more EDM classes with the same class name, in separate namespaces, by default they would all get a converter class with the same name associated to them. The classic example is CaloClusterContainer and xAOD::CaloClusterContainer, which would both be converted by a class called CaloClusterContainerCnv. In order to be able to differentiate between these converters, a common prefix can be declared for all the converters in the current library. This way we generate a converter for xAOD::CaloClusterContainer called xAODCaloClusterContainerCnv. (Notice that there is still no namespace in the converter’s name.)

A fully fledged function call may look like:

atlas_add_poolcnv_library( xAODEventInfoAthenaPool src/*.h src/*.cxx
   FILES xAODEventInfo/EventInfo.h xAODEventInfo/EventAuxInfo.h
   xAODEventInfo/EventInfoContainer.h xAODEventInfo/EventInfoAuxContainer.h
   TYPES_WITH_NAMESPACE xAOD::EventInfo xAOD::EventAuxInfo
   xAOD::EventInfoContainer xAOD::EventInfoAuxContainer
   LINK_LIBRARIES AthenaPoolCnvSvcLib AthenaPoolUtilities xAODEventInfo )


atlas_add_sercnv_library

This function is very similar to the previous one. It builds serialiser converters for the trigger ByteStream conversion, and is modelled after the sercnv CMT pattern. Its full list of arguments is:

atlas_add_sercnv_library( TrigMuonEventSerCnv source1.cxx...
   FILES TrigMuonEvent/MuonFeature.h...
   [INCLUDE_DIRS Include1...]
   [LINK_LIBRARIES Library1...]
   [TYPES_WITH_NAMESPACE xAOD::MuonContainer...]
   [CNV_PFX xAOD] )

Since the two calls are very similar (they are technically implemented by the same code under the hood), only the differences are listed here:

  • Since serialiser converters are typically declared in the main EDM packages, one has to be very careful with declaring source files for these libraries. src/*.cxx is a clean no-go. (Even the sercnv pattern does this wrong. It builds the EDM code into both the main EDM libraries and the serialiser converter ones…) Actually, since there are no hand-written serialiser converters that I would know of (most of the POOL converters have custom implementations), usually you’re not supposed to list any source files for this function.
  • There is only one type of serialiser converter, so there’s no equivalent of MULT_CHAN_TYPES for this function.
  • CNV_PFX is used the same way as for the previous function. And corresponds to the libtag parameter of the sercnv CMT pattern.
  • The name of the converter class for class Foo is set up to be FooSerCnv. Or if CNV_PFX is used, it’s ${CNV_PFX}FooSerCnv. In case a converter ever needs to be implemented by hand, the implementation file needs to be called src/FooSerCnv.cxx (or src/${CNV_PFX}FooSerCnv.cxx).
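By analogy with the poolcnv example above, a minimal call could look like the following sketch (the linked library list is an assumption for illustration, not taken from a real package):

```cmake
# No source files listed: all serialiser converters are
# auto-generated during the build.
atlas_add_sercnv_library( TrigMuonEventSerCnv
   FILES TrigMuonEvent/MuonFeature.h TrigMuonEvent/MuonFeatureContainer.h
   LINK_LIBRARIES TrigMuonEvent )
```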


atlas_add_executable

This is the main function of AtlasCMake for building executable targets. Its full argument list is very simple, and looks like:

atlas_add_executable( ExecutableName util/source1.cxx...
   [INCLUDE_DIRS Include1...]
   [LINK_LIBRARIES Library1...] )

The arguments are:

  • The executable name is what the executable will be called on POSIX platforms, and is suffixed by .exe on Windows.
  • The source files of the executable can be picked up from any directory of the package. The executable may have one or more source files.
  • The INCLUDE_DIRS and LINK_LIBRARIES arguments are used as for all the other functions.

Since in the ATLAS CMT builds the executables by default got a .exe suffix, even on POSIX platforms, for backwards compatibility reasons this function sets up an “alias” with that suffix as well. Technically, just like with atlas_add_alias, it’s a script that is set up, which prints a warning asking the user to stop using the .exe suffixed name, and then runs the application.
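A complete executable declaration is then as simple as the following sketch (the executable name and source file are hypothetical):

```cmake
# Builds util/MyDumper.cxx into an executable called MyDumper
# (with a deprecated MyDumper.exe alias created next to it).
atlas_add_executable( MyDumper util/MyDumper.cxx
   LINK_LIBRARIES xAODEventInfo )
```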


Dictionary generation functions

The following functions are used for generating ROOT dictionaries during the CMake builds.


atlas_add_dictionary

This is the primary way in the ATLAS source code for generating dictionaries. It builds a separate dictionary library that only has the code of the dictionaries in it, not the code of the classes/functions being described by the dictionaries.

The full argument list is as follows:

atlas_add_dictionary( LibName Lib/LibDict.h Lib/selection.xml
   [INCLUDE_DIRS Include1...]
   [LINK_LIBRARIES Library1 Library2...]
   [EXTRA_FILES Root/dict/*.cxx]
   [NAVIGABLES type1...]
   [DATA_LINKS type2...]
   [ELEMENT_LINKS type3...]
   [ELEMENT_LINK_VECTORS type4...]
   [NO_ROOTMAP_MERGE] )

  • The name of the dictionary library can in principle be anything, but usually we call it mainLibNameDict, where mainLibName is the name of the main library that this library provides the dictionaries for.
  • The compulsory header file and selection file declarations are not detailed here. For a generic description on the format of those files, look at the ROOT documentation.
  • Dictionary libraries are (for technical reasons) SHARED libraries, not MODULE ones. So technically it is possible to link code against them. But by convention this should never be done. As a result, we don’t separate public and private dependencies for these libraries. All dependencies of the library are handled by INCLUDE_DIRS and LINK_LIBRARIES.
  • EXTRA_FILES: In some special cases we need to build some custom code into the dictionary libraries beside the code generated by ROOT itself. The source files of this extra code can be declared in this optional argument.
  • NAVIGABLES, DATA_LINKS, ELEMENT_LINKS, ELEMENT_LINK_VECTORS: These arguments allow the user to specify class names for which Navigable<T>, DataLink<T>, ElementLink<T> or ElementLinkVector<T> dictionaries should be built. Declaring the dictionaries for these types like this, instead of declaring them explicitly in the header file and selection file of the dictionary, provides more flexibility for us in the construction of these special classes. So on the whole, whenever possible, these types of dictionaries should be declared in the CMake code using these arguments.
  • NO_ROOTMAP_MERGE: By default the code collects all the types that we generate dictionaries for, and puts their declarations into a project-wide “rootmap” file, so that ROOT can automatically load the libraries it needs in order to handle some type that it encounters. In some cases we need to build (mostly test) libraries that contain some dictionaries, which we don’t want to declare to ROOT for automatic loading. This can be achieved by using this argument.
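A typical call, following the naming convention described above (the package name is hypothetical), looks like:

```cmake
# Dictionary library for the types selected in MyPackage/selection.xml.
# Only the ROOT-generated code is built into it.
atlas_add_dictionary( MyPackageDict
   MyPackage/MyPackageDict.h
   MyPackage/selection.xml
   LINK_LIBRARIES MyPackageLib )
```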

In ROOT 5 times you would’ve thought of the dictionaries produced by this function as “Reflex dictionaries”.


atlas_add_root_dictionary

For classes that don’t inherit from TObject, by convention, we build separate libraries holding their dictionaries. This is what was explained for atlas_add_dictionary. But for the classes that do inherit from TObject, we must build the dictionaries into the main libraries that the types are provided by.

Because of this, this function doesn’t build a library itself. It just generates the source file(s) of a dictionary, which then need to be used in the build of a library.

The arguments of the function are as follows:

atlas_add_root_dictionary( mainLibName dictFileNameVar
   [ROOT_HEADERS Header1.h... LinkDef.h]
   [EXTERNAL_PACKAGES Ext1...]
   [INCLUDE_PATHS Include1...] )

  • The first argument is the name of the library that the generated source file will be built into. This is necessary to be able to build the dictionary such that it can be auto-loaded by ROOT. You need to provide the target name of the library here, which doesn’t need to be defined at this point yet. (Requiring that would create a chicken-or-egg problem in the declaration order.)
  • The second argument is a special one. CMake functions can’t return values like C/C++ functions do. The way in which they can return some value to the caller is that the caller specifies a variable name as an argument to the function call, and after the function has finished, that variable is set to something in the caller’s scope. The second argument of this function is the name of a variable that will be set to the name of the generated source file that needs to be built into the library specified as the first argument. (See the example further down.)
  • ROOT_HEADERS: The formalism for declaring ROOT dictionaries is not discussed here. This argument allows you to specify the header files for/from which dictionaries need to be generated.
  • EXTERNAL_PACKAGES: This is an “old type” argument that may be removed in the future. You can specify external package names to it that the generation function will call find_package(...) on, and include their INCLUDE_DIRS variables for the dictionary generation. When writing new code, you should always use the INCLUDE_PATHS variable instead.
  • INCLUDE_PATHS: To be renamed to INCLUDE_DIRS. Extra include directories that are needed during the dictionary generation.

Note that normally you will probably not need to use either the EXTERNAL_PACKAGES or the INCLUDE_PATHS arguments. The dictionary generation inherits the include directories requested by the main library that the dictionary is generated for, which in most cases is enough.

A typical call of this function would look like:

atlas_add_root_dictionary( MyLibrary _dictSource
   ROOT_HEADERS MyLibrary/HeaderA.h MyLibrary/HeaderB.h Root/LinkDef.h )
atlas_add_library( MyLibrary Root/*.cxx ${_dictSource}
   PUBLIC_HEADERS MyLibrary )

In ROOT 5 times you would’ve thought of the dictionaries produced by this function as “CINT dictionaries”.


Test declarations

The following describes how to set up unit tests in the packages, which can be run using CTest.


atlas_add_test

This is the main function for declaring unit tests in packages. It provides two signatures: one for setting up a compiled executable that is run as the test, and one for setting up a custom script that executes some test.

Technically the function generates a simple bash script in the build area that:

  • Sets up the right runtime environment;
  • Runs a possible pre-exec command;
  • Runs the executable/script specified;
  • Runs a possible post-exec check. By default this runs a post-processing script that compares the output of the test to a test reference file.

The first signature, used to build a test executable, is:

atlas_add_test( TestName SOURCES test/source1.cxx...
   [INCLUDE_DIRS Dir1...]
   [LINK_LIBRARIES Library1...]
   [LOG_IGNORE_PATTERN patterns]
   [LOG_SELECT_PATTERN patterns]
   [PRE_EXEC_SCRIPT script]
   [POST_EXEC_SCRIPT script]
   [DEPENDS OtherTest1...]
   [LABELS label1...]
   [ENVIRONMENT env...]
   [PROPERTIES TIMEOUT <seconds> ]
   [PROPERTIES <name> <value>...] )

This formalism builds an executable out of all the source files specified, and sets it up to be run by CTest.

  • SOURCES: This mandatory argument specifies the source files that should be built for the executable.
  • INCLUDE_DIRS: Extra include directories needed to build the test executable.
  • LINK_LIBRARIES: Libraries to link the test executable against.
  • LOG_IGNORE_PATTERN: Extra patterns to ignore in the log file comparison. Has no effect if POST_EXEC_SCRIPT is used.
  • LOG_SELECT_PATTERN: Select matching lines for log file comparison. If both SELECT and IGNORE are specified the lines are first selected and then filtered by the ignore pattern. Error messages are always selected. Has no effect if POST_EXEC_SCRIPT is used.
  • PRE_EXEC_SCRIPT: The name of a script, or in general any executable BASH command that should be run before the test. For example to remove some file left over by previous tests in the package.
  • POST_EXEC_SCRIPT: This argument allows you to override the post execution check on the test. As discussed before, by default a reference comparison script is executed. This optional argument allows you to set up anything to be run as a post test check, e.g. if you wish to disable the reference file comparison. See the list of available post-processing scripts or write your own if needed.
  • DEPENDS: Run this test after the specified list of tests. Same functionality as the DEPENDS test property but takes care of converting the local per-package test names to global ones. (since atlasexternals 2.0.129)
  • LABELS: Additional labels to be associated with the test (each test always carries the package name as label).
  • ENVIRONMENT: Special environment settings for the test. To include settings on top of the usual ATLAS runtime environment.
  • PRIVATE_WORKING_DIRECTORY: Run the test in a private working directory that is cleaned before each test invocation. Useful for tests that create output files that may interfere when run in parallel.
  • PROPERTIES WORKING_DIRECTORY: Allows the user to create the run directory themselves if they intend to use a non-standard one. See atlasexternals!253.
  • PROPERTIES: This argument allows you to set any additional properties on the test. You can get a list of possible properties from this link. It can be used for instance to set up an explicit order in which tests need to be executed.
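Putting the main arguments together, a compiled unit test could be declared like the following sketch (all names hypothetical):

```cmake
# Builds test/MyPackage_test.cxx and registers it with CTest,
# with a 5 minute timeout.
atlas_add_test( MyPackage_test
   SOURCES test/MyPackage_test.cxx
   LINK_LIBRARIES MyPackageLib
   PROPERTIES TIMEOUT 300 )
```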

Deprecated options:

  • EXTRA_PATTERNS: deprecated/replaced by LOG_IGNORE_PATTERN since atlasexternals 2.0.50

The second signature, used to run a script as the test, is:

atlas_add_test( TestName SCRIPT test/
   [LOG_IGNORE_PATTERN patterns]
   [LOG_SELECT_PATTERN patterns]
   [PRE_EXEC_SCRIPT script]
   [POST_EXEC_SCRIPT script]
   [LABELS label...]
   [ENVIRONMENT env...]
   [PROPERTIES TIMEOUT <seconds> ]
   [PROPERTIES <name> <value>...] )

This formalism runs an executable script from the package as the unit test.

  • SCRIPT: The location of a single executable script. Can be either a shell/python/etc. script. The script itself can do practically anything, including running an Athena job.
  • All the other arguments have the same meaning as in the previous incantation.
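A script-based declaration could then look like the following sketch (the test and script names are hypothetical):

```cmake
# Runs test/ as the unit test, in the usual ATLAS
# runtime environment.
atlas_add_test( MyJob
   SCRIPT test/
   PROPERTIES TIMEOUT 600 )
```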

Test reference files

If the test uses the default post-processing script, you have two options for storing the test reference file:

  1. In the package itself under the name share/TestName.ref. This is suitable for reasonably sized reference files that can be added to the git repository itself.
  2. Large reference files should be kept in the data-art area on CVMFS. See the ART documentation on how to manage files in that area. You will need permissions for the grid-input area. Since the reference files are now separate from the testing code, you need to take care of versioning them yourself. Add each new set of references to a new directory and tell the unit test which version to use via:
    atlas_add_test( TestName
       ENVIRONMENT "ATLAS_REFERENCE_TAG=MyPackage/MyPackage-00-00-01" )

Utility/helper functions

These are functions doing various helper operations during the build.


atlas_generate_componentslist

This function is used behind the scenes for extracting the component lists of libraries, using the listcomponents.exe executable from Gaudi. The function is available to be called directly as well though, for special libraries that fall into neither of the other categories, but implement some components.

The function is extremely simple to call. It just needs one argument, the name of the library that it should process. It should be the name of a “library target” specified earlier. (It can’t be called on a library not built by CMake in the current project.) So, the call can simply be:

atlas_generate_componentslist( MyLibrary )


atlas_generate_cliddb

This function is used behind the scenes to extract CLID declarations from libraries that usually have such things. But it can also be called explicitly by the user on any type of library. Mostly one would call it on an EDM library though.

It just receives a single argument, the name of the library to extract CLIDs from. As for the previous function, this needs to be the target name of a library built in the current CMake project.

atlas_generate_cliddb( MyEDMLibrary )


atlas_add_alias

This function sets up an “alias” for a command, with a different name. Technically it’s not actually an alias that’s created, but rather a shell script (put into the project’s bin directory) that, under the name of the “alias”, executes the command(s) given to the function. The function’s declaration looks like:

atlas_add_alias( name command [arg1...] )

A typical call for it looks like:

atlas_add_alias( athena )

Note that when specifying multiple strings/words as the command, this is expected to be a single command with multiple arguments. So, something like:

atlas_add_alias( super_ls ls -la )

Installation functions

The following functions take care of installing “resource files” from the package into both the build and installation areas in a way that’s compatible with the runtime environment configured by the code.

atlas_install_headers [obsolete]

The header installation is a bit special. The function expects one or more directory names to be given to it, which it then soft links under ${CMAKE_BINARY_DIR}/${ATLAS_PLATFORM}/include in the build area. During the installation step a custom operation is set up that creates a soft link from the install area’s include/ directory to the appropriate directories under src/.

The specified directory doesn’t have to necessarily be a directory in the package’s main directory, but by convention it usually is. By convention we put all the public headers of a package into a directory in the package that has the package’s name. In such a case the appropriate call looks like:

atlas_install_headers( MyPackage )

Note that for most packages this function should not be used. If the package sets up one or more libraries that export public headers, then atlas_add_library takes care of calling this function itself. At the moment it’s only really used in the packages that build a component library using atlas_add_component, but still provide some (mostly pure virtual) header files. But even those cases should be replaced by an INTERFACE library.


atlas_install_python_modules

This function, and all the other functions listed here, just call atlas_install_generic (shown further down) with different sets of arguments.

This function makes sure that all the files and directories in its argument list get installed under ${CMAKE_BINARY_DIR}/${ATLAS_PLATFORM}/python/${pkgName} in the build area, and under python/${pkgName} in the installation area. The most usual call is:

atlas_install_python_modules( python/*.py )


atlas_install_joboptions

This function makes sure that all the files and directories in its argument list get installed under ${CMAKE_BINARY_DIR}/${ATLAS_PLATFORM}/jobOptions/${pkgName} in the build area, and under jobOptions/${pkgName} in the installation area. The most usual call is:

atlas_install_joboptions( share/*.py )


atlas_install_docs

This function makes sure that all the files and directories in its argument list get installed under ${CMAKE_BINARY_DIR}/${ATLAS_PLATFORM}/doc/${pkgName} in the build area, and under doc/${pkgName} in the installation area. The most usual call is:

atlas_install_docs( doc/*.html )


atlas_install_runtime

This function makes sure that all the files and directories in its argument list get installed under ${CMAKE_BINARY_DIR}/${ATLAS_PLATFORM}/share in the build area, and under share in the installation area. (Note that no subdirectory is added with the package name.) The files are set executable.


atlas_install_scripts

This function makes sure that all the files and directories in its argument list get installed under ${CMAKE_BINARY_DIR}/${ATLAS_PLATFORM}/bin in the build area, and under bin in the installation area. (Note that no subdirectory is added with the package name.) The files are set executable.
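A typical call, assuming the package keeps its executable scripts in a scripts/ directory, would be:

```cmake
atlas_install_scripts( scripts/*.py scripts/*.sh )
```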


atlas_install_xmls

This function makes sure that all the files and directories in its argument list get installed under ${CMAKE_BINARY_DIR}/${ATLAS_PLATFORM}/XML/${pkgName} in the build area, and under XML/${pkgName} in the installation area. The most usual call is:

atlas_install_xmls( share/*.xml share/*.dtd )


atlas_install_data

This function makes sure that all the files and directories in its argument list get installed under ${CMAKE_BINARY_DIR}/${ATLAS_PLATFORM}/data/${pkgName} in the build area, and under data/${pkgName} in the installation area.

It is not actively used in the offline build at the moment; it was mainly added to mimic the behaviour of RootCore’s data file installation.


atlas_install_generic

This function can be used to install practically any type of file from the package under any directory in the build and install areas. It provides the following options:

atlas_install_generic( dir/file1 dir/dir2...
   [DESTINATION dir]
   [BUILD_DESTINATION dir]
   [TYPENAME type]
   [EXECUTABLE]
   [PKGNAME_SUBDIR] )

  • First you need to provide it with a (possibly wildcarded) list of file/directory names. These will be the files/directories that will be installed.
  • DESTINATION: A directory name under which the specified files/directories should be installed. Like “share”, “data”, etc.
  • BUILD_DESTINATION: In case the installation path should be different for the build area w.r.t. the installation area, you can provide an absolute path using this argument for the location inside the build area for these files/directories. When you do this, you will usually want to make use of the ${CMAKE_BINARY_DIR} and ${ATLAS_PLATFORM} variables to define this value. If such an option is not provided, the installation path inside the build directory becomes ${CMAKE_BINARY_DIR}/${ATLAS_PLATFORM}/${ARG_DESTINATION}.
  • TYPENAME: An optional short, descriptive type that makes the printouts during the build a little nicer. By default the function creates a custom target called ${pkgName}GenericInstall, which takes care of installing all the files/directories specified. With TYPENAME provided, this becomes ${pkgName}${ARG_TYPENAME}Install. So it’s really only a stylistic thing in the end.
  • EXECUTABLE: Flag specifying that the installed files should get their executable flag turned on. (On POSIX systems…)
  • PKGNAME_SUBDIR: When specified, the installation adds the package’s name as the last part of the installation path. This is used for instance to install python modules or jobOptions from the packages.

An example call could look like:

atlas_install_generic( data/*_test.xml
   DESTINATION data/TestXmls
   TYPENAME TestXmls )