From 9417dff2cbd2ac86239cb60cea0fb08bd6e76910 Mon Sep 17 00:00:00 2001 From: "github-actions[bot]" Date: Mon, 5 Aug 2024 09:31:36 +0000 Subject: [PATCH] Deployed 7817d75 to dev with MkDocs 1.5.3 and mike 2.0.0 --- dev/authentication_and_upload/index.html | 4 +- dev/search/search_index.json | 2 +- dev/sitemap.xml | 52 +++++++++++------------ dev/sitemap.xml.gz | Bin 466 -> 466 bytes 4 files changed, 29 insertions(+), 29 deletions(-) diff --git a/dev/authentication_and_upload/index.html b/dev/authentication_and_upload/index.html index 5eb8f7da..9efc9e34 100644 --- a/dev/authentication_and_upload/index.html +++ b/dev/authentication_and_upload/index.html @@ -1333,7 +1333,7 @@

Authenticating with a server "BearerToken": "your_token" }, "otherhost.com": { - "BasicHttp": { + "BasicHTTP": { "username": "your_username", "password": "your_password" } @@ -1351,7 +1351,7 @@

Authenticating with a serverUploading packages#

If you want to upload packages, then rattler-build comes with a built-in diff --git a/dev/search/search_index.json b/dev/search/search_index.json index 296a337f..ad7b5f29 100644 --- a/dev/search/search_index.json +++ b/dev/search/search_index.json @@ -1 +1 @@ -{"config":{"lang":["en"],"separator":"[\\s\\-]+","pipeline":["stopWordFilter"]},"docs":[{"location":"","title":"Home","text":""},{"location":"#rattler-build-a-fast-conda-package-builder","title":"rattler-build: A Fast Conda Package Builder","text":"

The rattler-build tooling and library create cross-platform, relocatable binaries and packages from a simple recipe format. The recipe format is heavily inspired by conda-build and boa, and the output of a regular rattler-build run is a package that can be installed using mamba, rattler, or conda.

rattler-build does not have any dependencies on conda-build or Python and works as a standalone binary.

"},{"location":"#installation","title":"Installation","text":"

Since rattler-build is a conda-package builder, the recommended way to install it is through a conda package manager. Next to rattler-build, we are also building pixi.

With pixi you can install rattler-build globally:

pixi global install rattler-build\n

Other options are:

Conda, Homebrew, Arch Linux, or a binary download:
conda install rattler-build -c conda-forge\n\nmamba install rattler-build -c conda-forge\nmicromamba install rattler-build -c conda-forge\n\npixi global install rattler-build\npixi add rattler-build # To a pixi project\n
brew install rattler-build\n
pacman -S rattler-build\n

# Download the latest release from the GitHub releases page, for example the Linux x86_64 version with curl:\ncurl -SL --progress-bar -o rattler-build https://github.com/prefix-dev/rattler-build/releases/latest/download/rattler-build-x86_64-unknown-linux-musl\nchmod +x rattler-build\n
You can grab a version of rattler-build from the GitHub Releases.

"},{"location":"#completion","title":"Completion","text":"

When installing rattler-build, you might want to enable shell completion. Afterwards, restart the shell or source the shell config file.

"},{"location":"#bash-default-on-most-linux-systems","title":"Bash (default on most Linux systems)","text":"
echo 'eval \"$(rattler-build completion --shell bash)\"' >> ~/.bashrc\n
"},{"location":"#zsh-default-on-macos","title":"Zsh (default on macOS)","text":"
echo 'eval \"$(rattler-build completion --shell zsh)\"' >> ~/.zshrc\n
"},{"location":"#powershell-pre-installed-on-all-windows-systems","title":"PowerShell (pre-installed on all Windows systems)","text":"
Add-Content -Path $PROFILE -Value '(& rattler-build completion --shell powershell) | Out-String | Invoke-Expression'\n

Failure because no profile file exists

Make sure your profile file exists, otherwise create it with:

New-Item -Path $PROFILE -ItemType File -Force\n

"},{"location":"#fish","title":"Fish","text":"
echo 'rattler-build completion --shell fish | source' >> ~/.config/fish/config.fish\n
"},{"location":"#nushell","title":"Nushell","text":"

Add the following to the end of your Nushell env file (find it by running $nu.env-path in Nushell):

mkdir ~/.cache/rattler-build\nrattler-build completion --shell nushell | save -f ~/.cache/rattler-build/completions.nu\n

And add the following to the end of your Nushell configuration (find it by running $nu.config-path):

use ~/.cache/rattler-build/completions.nu *\n
"},{"location":"#elvish","title":"Elvish","text":"
echo 'eval (rattler-build completion --shell elvish | slurp)' >> ~/.elvish/rc.elv\n
"},{"location":"#dependencies","title":"Dependencies","text":"

Currently rattler-build needs some dependencies on the host system which are executed as subprocesses. We plan to reduce the number of external dependencies over time by writing what we need in Rust to make rattler-build fully self-contained.

  • tar to unpack tarballs downloaded from the internet in a variety of formats; .gz, .bz2 and .xz are widely used, and you might have to install the corresponding compression packages as well (e.g. gzip, bzip2, ...)
  • patch to patch source code after downloading
  • install_name_tool is necessary on macOS to rewrite the rpath of shared libraries and executables to make it relative
  • patchelf is required on Linux to rewrite the rpath and runpath of shared libraries and executables
  • git to check out Git repositories (not implemented yet, but will require git in the future)
  • msvc on Windows because we cannot ship the MSVC compiler on conda-forge (needs to be installed on the host machine)

On Windows, to obtain these dependencies from conda-forge, one can install m2-patch, m2-bzip2, m2-gzip, m2-tar.
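
For example, one way to pull them in (a sketch; any conda-compatible package manager works):

pixi global install m2-patch m2-bzip2 m2-gzip m2-tar\n# or, inside an activated environment\nconda install -c conda-forge m2-patch m2-bzip2 m2-gzip m2-tar\n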

"},{"location":"#github-action","title":"GitHub Action","text":"

There is a GitHub Action for rattler-build. It can be used to install rattler-build in CI/CD workflows and run a build command. Please check out the GitHub Action documentation for more information.

"},{"location":"#the-recipe-format","title":"The recipe format","text":"

Note: You can find all examples below in the examples folder of the codebase and run them with rattler-build.

A simple example recipe for the xtensor header-only C++ library:

# yaml-language-server: $schema=https://raw.githubusercontent.com/prefix-dev/recipe-format/main/schema.json\n\ncontext:\n  name: xtensor\n  version: 0.24.6\n\npackage:\n  name: ${{ name|lower }}\n  version: ${{ version }}\n\nsource:\n  url: https://github.com/xtensor-stack/xtensor/archive/${{ version }}.tar.gz\n  sha256: f87259b51aabafdd1183947747edfff4cff75d55375334f2e81cee6dc68ef655\n\nbuild:\n  number: 0\n  script:\n    - if: win\n      then: |\n        cmake -G \"NMake Makefiles\" -D BUILD_TESTS=OFF -D CMAKE_INSTALL_PREFIX=%LIBRARY_PREFIX% %SRC_DIR%\n        nmake\n        nmake install\n      else: |\n        cmake ${CMAKE_ARGS} -DBUILD_TESTS=OFF -DCMAKE_INSTALL_PREFIX=$PREFIX $SRC_DIR -DCMAKE_INSTALL_LIBDIR=lib\n        make install\n\nrequirements:\n  build:\n    - ${{ compiler('cxx') }}\n    - cmake\n    - if: unix\n      then: make\n  host:\n    - xtl >=0.7,<0.8\n  run:\n    - xtl >=0.7,<0.8\n  run_constraints:\n    - xsimd >=8.0.3,<10\n\ntests:\n  - script:\n    - if: unix or emscripten\n      then:\n        - test -d ${PREFIX}/include/xtensor\n        - test -f ${PREFIX}/include/xtensor/xarray.hpp\n        - test -f ${PREFIX}/share/cmake/xtensor/xtensorConfig.cmake\n        - test -f ${PREFIX}/share/cmake/xtensor/xtensorConfigVersion.cmake\n    - if: win\n      then:\n        - if not exist %LIBRARY_PREFIX%\\include\\xtensor\\xarray.hpp (exit 1)\n        - if not exist %LIBRARY_PREFIX%\\share\\cmake\\xtensor\\xtensorConfig.cmake (exit 1)\n        - if not exist %LIBRARY_PREFIX%\\share\\cmake\\xtensor\\xtensorConfigVersion.cmake (exit 1)\n\nabout:\n  homepage: https://github.com/xtensor-stack/xtensor\n  license: BSD-3-Clause\n  license_file: LICENSE\n  summary: The C++ tensor algebra library\n  description: Multi dimensional arrays with broadcasting and lazy computing\n  documentation: https://xtensor.readthedocs.io\n  repository: https://github.com/xtensor-stack/xtensor\n\nextra:\n  recipe-maintainers:\n    - some-maintainer\n

A recipe for the rich Python package (using noarch):

context:\n  version: \"13.4.2\"\n\npackage:\n  name: \"rich\"\n  version: ${{ version }}\n\nsource:\n  - url: https://pypi.io/packages/source/r/rich/rich-${{ version }}.tar.gz\n    sha256: d653d6bccede5844304c605d5aac802c7cf9621efd700b46c7ec2b51ea914898\n\nbuild:\n  # Thanks to `noarch: python` this package works on all platforms\n  noarch: python\n  script:\n    - python -m pip install . -vv --no-deps --no-build-isolation\n\nrequirements:\n  host:\n    - pip\n    - poetry-core >=1.0.0\n    - python 3.10\n  run:\n    # sync with normalized deps from poetry-generated setup.py\n    - markdown-it-py >=2.2.0\n    - pygments >=2.13.0,<3.0.0\n    - python 3.10\n    - typing_extensions >=4.0.0,<5.0.0\n\ntests:\n  - python:\n      imports:\n        - rich\n      pip_check: true\n\nabout:\n  homepage: https://github.com/Textualize/rich\n  license: MIT\n  license_file: LICENSE\n  summary: Render rich text, tables, progress bars, syntax highlighting, markdown and more to the terminal\n  description: |\n    Rich is a Python library for rich text and beautiful formatting in the terminal.\n\n    The Rich API makes it easy to add color and style to terminal output. Rich\n    can also render pretty tables, progress bars, markdown, syntax highlighted\n    source code, tracebacks, and more \u2014 out of the box.\n  documentation: https://rich.readthedocs.io\n  repository: https://github.com/Textualize/rich\n

A recipe for the curl library:

context:\n  version: \"8.0.1\"\n\npackage:\n  name: curl\n  version: ${{ version }}\n\nsource:\n  url: http://curl.haxx.se/download/curl-${{ version }}.tar.bz2\n  sha256: 9b6b1e96b748d04b968786b6bdf407aa5c75ab53a3d37c1c8c81cdb736555ccf\n\nbuild:\n  number: 0\n\nrequirements:\n  build:\n    - ${{ compiler('c') }}\n    - if: win\n      then:\n        - cmake\n        - ninja\n    - if: unix\n      then:\n        - make\n        - perl\n        - pkg-config\n        - libtool\n  host:\n    - if: linux\n      then:\n        - openssl\n\nabout:\n  homepage: http://curl.haxx.se/\n  license: MIT/X derivate (http://curl.haxx.se/docs/copyright.html)\n  license_file: COPYING\n  summary: tool and library for transferring data with URL syntax\n  description: |\n    Curl is an open source command line tool and library for transferring data\n    with URL syntax. It is used in command lines or scripts to transfer data.\n  documentation: https://curl.haxx.se/docs/\n  repository: https://github.com/curl/curl\n

For the curl library recipe, two additional script files (build.sh and build.bat) are needed.

build.sh

#!/bin/bash\n\n# Get an updated config.sub and config.guess\ncp $BUILD_PREFIX/share/libtool/build-aux/config.* .\n\nif [[ $target_platform =~ linux.* ]]; then\n    USESSL=\"--with-openssl=${PREFIX}\"\nelse\n    USESSL=\"--with-secure-transport\"\nfi;\n\n./configure \\\n    --prefix=${PREFIX} \\\n    --host=${HOST} \\\n    ${USESSL} \\\n    --with-ca-bundle=${PREFIX}/ssl/cacert.pem \\\n    --disable-static --enable-shared\n\nmake -j${CPU_COUNT} ${VERBOSE_AT}\nmake install\n\n# Includes man pages and other miscellaneous.\nrm -rf \"${PREFIX}/share\"\n

build.bat

mkdir build\n\ncmake -GNinja ^\n      -DCMAKE_BUILD_TYPE=Release ^\n      -DBUILD_SHARED_LIBS=ON ^\n      -DCMAKE_INSTALL_PREFIX=%LIBRARY_PREFIX% ^\n      -DCMAKE_PREFIX_PATH=%LIBRARY_PREFIX% ^\n      -DCURL_USE_SCHANNEL=ON ^\n      -DCURL_USE_LIBSSH2=OFF ^\n      -DUSE_ZLIB=ON ^\n      -DENABLE_UNICODE=ON ^\n      %SRC_DIR%\n\nIF %ERRORLEVEL% NEQ 0 exit 1\n\nninja install --verbose\n
"},{"location":"authentication_and_upload/","title":"Server authentication","text":""},{"location":"authentication_and_upload/#authenticating-with-a-server","title":"Authenticating with a server","text":"

You may want to use private channels for which you need to be authenticated. To do this ephemerally, you can use the RATTLER_AUTH_FILE environment variable to point to a JSON file with the following structure:

{\n    \"*.prefix.dev\": {\n        \"BearerToken\": \"your_token\"\n    },\n    \"otherhost.com\": {\n        \"BasicHttp\": {\n            \"username\": \"your_username\",\n            \"password\": \"your_password\"\n        }\n    },\n    \"anaconda.org\": {\n        \"CondaToken\": \"your_token\"\n    }\n}\n

The keys are the host names. You can use wildcard specifiers here (e.g. *.prefix.dev to match all subdomains of prefix.dev, such as repo.prefix.dev). This will allow you to also obtain packages from any private channels that you have access to.
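
For example, assuming the JSON above is saved as ~/.rattler-auth.json (both the path and the channel name below are hypothetical), you can point rattler-build at it for a single invocation:

export RATTLER_AUTH_FILE=$HOME/.rattler-auth.json\nrattler-build build --recipe recipe.yaml -c https://repo.prefix.dev/my-private-channel\n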

The following known authentication methods are supported:

  • BearerToken: prefix.dev
  • CondaToken: anaconda.org, quetz
  • BasicHTTP: artifactory
"},{"location":"authentication_and_upload/#uploading-packages","title":"Uploading packages","text":"

If you want to upload packages, rattler-build comes with a built-in upload command. There are four options:

  • prefix.dev: you can create public or private channels on the prefix.dev hosted server
  • anaconda.org: you can upload packages to the free anaconda.org server
  • quetz: you can host your own quetz server and upload packages to it
  • artifactory: you can upload packages to a JFrog Artifactory server

The command is:

rattler-build upload <server> <package_files>\n

Note: you can also use the RATTLER_AUTH_FILE environment variable to authenticate with the server.

"},{"location":"authentication_and_upload/#prefixdev","title":"prefix.dev","text":"

To upload to prefix.dev, you need to have an account and a token. You can create a token in the settings of your account. The token is used to authenticate the upload.

export PREFIX_API_KEY=<your_token>\nrattler-build upload prefix -c <channel> <package_files>\n

You can also use the --api-key=$PREFIX_API_KEY option to pass the token directly to the command. Note that you need to have created the channel on the prefix.dev website before you can upload to it.

"},{"location":"authentication_and_upload/#quetz","title":"Quetz","text":"

To upload to a channel on your own Quetz server, you need to pass the server URL and an API key. The API key is used to authenticate the upload.

export QUETZ_API_KEY=<your_token>\nrattler-build upload quetz -u <url> -c <channel> <package_files>\n
"},{"location":"authentication_and_upload/#artifactory","title":"Artifactory","text":"

To upload to an Artifactory server, you need to pass a username and password. The username and password are used to authenticate the upload.

export ARTIFACTORY_USERNAME=<your_username>\nexport ARTIFACTORY_PASSWORD=<your_password>\nrattler-build upload artifactory -u <url> -c <channel> <package_files>\n
"},{"location":"authentication_and_upload/#anacondaorg","title":"anaconda.org","text":"

To upload to anaconda.org, you need to specify the owner and API key. The API key is used to authenticate the upload.

The owner is the owner of the distribution, for example your username or organization.

One can also specify a label such as dev for release candidates using the -c flag. The default value is main.

You can also add the --force argument to forcibly upload a new package (and overwrite any existing ones).

export ANACONDA_API_KEY=<your_token>\nrattler-build upload anaconda -o <your_username> -c <label> <package_files>\n
"},{"location":"automatic_linting/","title":"Enabling Automatic Linting in VSCode","text":"

Our new recipe format adheres to a strict JSON schema, which you can access here.

This schema is implemented using pydantic and can be rendered into a JSON schema file. The YAML language server extension in VSCode is capable of recognizing this schema, providing useful hints during the editing process.

To enable automatic linting with the YAML language server, you need to add the following line at the beginning of your recipe file:

# yaml-language-server: $schema=https://raw.githubusercontent.com/prefix-dev/recipe-format/main/schema.json\n

Alternatively, if you prefer not to add this line to your file, you can install the JSON Schema Store Catalog extension. This extension will also enable automatic linting for your recipe files.

"},{"location":"build_options/","title":"Advanced build options","text":"

There are some specialized build options to control various features:

  • prefix replacement
  • variant configuration
  • encoded file type

These are all found under the build key in the recipe.yaml.

"},{"location":"build_options/#include-only-certain-files-in-the-package","title":"Include only certain files in the package","text":"

Sometimes you may want to include only a subset of the files installed by the build process in your package. For this, the files key can be used. Only new files are considered for inclusion (i.e. files that were not in the host environment beforehand).

recipe.yaml
build:\n  # select files to be included in the package\n  # this can be used to remove files from the package, even if they are installed in the\n  # environment\n  files: list of globs\n

For example, to only include the header files in a package, you could use:

recipe.yaml
build:\n  files:\n    - include/**/*.h\n

Glob patterns throughout the recipe file can also use a flexible include / exclude pair, such as:

recipe.yaml
build:\n  files:\n    include:\n      - include/**/*.h\n    exclude:\n      - include/**/private.h\n
"},{"location":"build_options/#glob-evaluation","title":"Glob evaluation","text":"

Glob patterns are used throughout the build options to specify files. The patterns are matched against the relative path of the file in the build directory. Patterns can contain * to match any number of characters, ? to match a single character, and ** to match any number of directories.

For example:

  • *.txt matches all files ending in .txt
  • **/*.txt matches all files ending in .txt in any directory
  • **/test_*.txt matches all files starting with test_ and ending in .txt in any directory
  • foo/ matches all files under the foo directory

The globs are always evaluated relative to the prefix directory. If you have no include globs, but an exclude glob, then all files are included except those that match the exclude glob. This is equivalent to include: ['**'].
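
For example, a sketch of an exclude-only selection that keeps everything in the prefix except documentation and libtool archives:

build:\n  files:\n    exclude:\n      - share/doc/**\n      - \"**/*.la\"\n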

"},{"location":"build_options/#always-include-and-always-copy-files","title":"Always include and always copy files","text":"

There are some options that control the inclusion of files in the final package.

The always_include_files option can be used to include files even if they are already in the environment as part of some other host dependency. This normally counts as "clobbering" and should be used with caution (since packages should not have any overlapping files).

The always_copy_files option can be used to copy files instead of linking them. This is useful for files that might be modified inside the environment (e.g. configuration files). Normally, files are linked from a central cache into the environment to save space \u2013 that means that files modified in one environment will be modified in all environments. This is not always desirable, and in that case you can use the always_copy_files option.

Note: How always_copy_files works. The always_copy_files option works by setting the no_link option in info/paths.json to true for the files in question. This means that the files are copied instead of linked when the package is installed.

recipe.yaml
build:\n  # include files even if they are already in the environment\n  # as part of some other host dependency\n  always_include_files: list of globs\n\n  # do not soft- or hard-link these files, but always copy them (was `no_link` in conda-build)\n  always_copy_files: list of globs\n
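
As a concrete sketch (the glob is hypothetical), a package that ships configuration files which users may edit in place might use:

build:\n  always_copy_files:\n    - etc/mypackage/*.conf\n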
"},{"location":"build_options/#merge-build-and-host-environments","title":"Merge build and host environments","text":"

In very rare cases you might want to merge the build and host environments to obtain the \"legacy\" behavior of conda-build.

recipe.yaml
build:\n  # merge the build and host environments (used in many R packages on Windows)\n  merge_build_and_host_envs: bool (defaults to false)\n
"},{"location":"build_options/#prefix-detection-replacement-options","title":"Prefix detection / replacement options","text":"

At installation time, the install prefix is injected into text and binary files. Sometimes this is not desired, and sometimes the user might want closer control over the automatic text/binary detection.

The main difference between prefix replacement for text and binary files is that for binary files, the prefix string is padded with null bytes to match the length of the original prefix. The original prefix is the very long placeholder string that you might have seen in the build process.

On Windows, binary prefix replacement is never performed.

recipe.yaml
package:\n  name: mypackage\n  version: 1.0\n\nbuild:\n  # settings concerning the prefix detection in files\n  prefix_detection:\n    # force the file type of the given files to be TEXT or BINARY\n    # for prefix replacement\n    force_file_type:\n      # force TEXT file type (list of globs)\n      text: list of globs\n      # force binary file type (list of globs)\n      binary: list of globs\n\n    # ignore all or specific files for prefix replacement\n    ignore: bool | [path] (defaults to false)\n\n    # whether to detect binary files with prefix or not\n    # defaults to true on Unix and (always) false on Windows\n    ignore_binary_files: bool\n
"},{"location":"build_options/#variant-configuration","title":"Variant configuration","text":"

To control the variant precisely you can use the \"variant configuration\" options.

A variant package has the same version number, but a different "hash" and potentially different dependencies or build options. Variant keys are extracted from the variant_config.yaml file; usually, any Jinja variables that are used, as well as dependencies without a version specifier, become variant keys.

Variant keys can also be forcibly set or ignored with the use_keys and ignore_keys options.

In order to decide which of the variant packages to prefer and install by default, the down_prioritize_variant option can be used. The higher the value, the less preferred the variant is.

More about variants can be found in the variant documentation.

The following options are available in the build section to control the variant configuration:

recipe.yaml
build:\n  # settings for the variant\n  variant:\n    # Keys to forcibly use for the variant computation\n    # even if they are not in the dependencies\n    use_keys: list of strings\n\n    # Keys to forcibly ignore for the variant computation\n    # even if they are in the dependencies\n    ignore_keys: list of strings\n\n    # used to prefer this variant less\n    down_prioritize_variant: integer (defaults to 0, higher is less preferred)\n
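
As a hedged example (the key name is only illustrative), the following forces openssl into the variant computation and makes this variant less preferred than the default:

build:\n  variant:\n    use_keys:\n      - openssl\n    down_prioritize_variant: 1\n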
"},{"location":"build_options/#dynamic-linking-configuration","title":"Dynamic linking configuration","text":"

After the package is built, rattler-build performs some \"post-processing\" on the binaries and libraries.

This entails making the shared libraries relocatable and checking that all linked libraries are present in the run requirements. The following settings control this behavior.

With the rpath option you can forcibly set the rpath of the shared libraries. The path is relative to the install prefix. Any rpath setting is ignored on Windows.

The rpath_allowlist option can be used to allow the rpath to point to locations outside of the environment. This is useful if you want to link against libraries that are not part of the conda environment (e.g. proprietary software).

If you want to stop rattler-build from relocating the binaries, you can set binary_relocation to false. If you want to only relocate some binaries, you can select the relevant ones with a glob pattern.

To read more about rpaths and how rattler-build creates relocatable binary packages, see the internals docs.

If you link against some libraries (possibly even outside of the prefix, in a system location), then you can use the missing_dso_allowlist to allow linking against these and suppress any warnings. This list is pre-populated with a list of known system libraries on the different operating systems.

As part of the post-processing, rattler-build checks for overlinking and overdepending. \"Overlinking\" is when a binary links against a library that is not specified in the run requirements. This is usually a mistake because the library would not be present in the environment when the package is installed.

Conversely, \"overdepending\" is when a library is part of the run requirements, but is not actually used by any of the binaries/libraries in the package.

recipe.yaml
build:\n  # settings for shared libraries and executables\n  dynamic_linking:\n    # linux only, list of rpaths relative to the installation prefix\n    rpaths: list of paths (defaults to ['lib/'])\n\n    # Allow runpath / rpath to point to these locations\n    # outside of the environment\n    rpath_allowlist: list of globs\n\n    # whether to relocate binaries or not. If this is a list of paths, then\n    # only the listed paths are relocated\n    binary_relocation: bool (defaults to true) | list of globs\n\n    # Allow linking against libraries that are not in the run requirements\n    missing_dso_allowlist: list of globs\n\n    # what to do when detecting overdepending\n    overdepending_behavior: \"ignore\" or \"error\" # (defaults to \"error\")\n\n    # what to do when detecting overlinking\n    overlinking_behavior: \"ignore\" or \"error\" # (defaults to \"error\")\n
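
As a sketch (paths and globs are hypothetical), a package that ships plugins and links against a system CUDA driver might configure:

build:\n  dynamic_linking:\n    rpaths:\n      - lib/\n      - lib/mypackage/plugins/\n    missing_dso_allowlist:\n      - \"**/libcuda.so*\"\n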
"},{"location":"build_script/","title":"Build scripts","text":"

The build.sh file is the build script for Linux and macOS and build.bat is the build script for Windows. These scripts contain the logic that carries out your build steps. Anything that your build script copies into the $PREFIX or %PREFIX% folder will be included in your output package.

For example, this build.sh installs a script from the recipe directory into $PREFIX/bin:

build.sh
mkdir -p $PREFIX/bin\ncp $RECIPE_DIR/my_script_with_recipe.sh $PREFIX/bin/super-cool-script.sh\n

There are many environment variables defined for you to use in build.sh and build.bat. Please see environment variables for more information.

build.sh and build.bat are optional. You can instead use the build/script key in your recipe.yaml, with each value being either a string command or a list of string commands. Any commands you put there must be able to run on every platform for which you build. For example, you can't use the cp command because cmd.exe won't understand it on Windows.

build.sh is run with bash and build.bat is run with cmd.exe.

recipe.yaml
build:\n  script:\n    - if: unix\n      then:\n        - mkdir -p $PREFIX/bin\n        - cp $RECIPE_DIR/my_script_with_recipe.sh $PREFIX/bin/super-cool-script.sh\n    - if: win\n      then:\n        - mkdir %PREFIX%\\bin\n        - copy %RECIPE_DIR%\\my_script_with_recipe.bat %PREFIX%\\bin\\super-cool-script.bat\n
"},{"location":"build_script/#environment-variables","title":"Environment variables","text":"

There are many environment variables that are automatically set during the build process.

However, you can also set your own environment variables easily in the script section of your recipe:

recipe.yaml
build:\n  script:\n    # Either use `content` or `file` to specify the script\n    # Note: this script only works on Unix :)\n    content: |\n      echo $FOO\n      echo $BAR\n      echo \"Secret value: $BAZ\"\n    env:\n      # hard coded value for `FOO`\n      FOO: \"foo\"\n      # Forward a value from the \"outer\" environment\n      # Without `default=...`, the build process will error if `BAR` is not set\n      BAR: ${{ env.get(\"BAR\", default=\"NOBAR\") }}\n    secrets:\n      # This value is a secret and will be masked in the logs and not stored in the rendered recipe\n      # The value needs to be available as an environment variable in the outer environment\n      - BAZ\n
"},{"location":"build_script/#alternative-script-interpreters","title":"Alternative script interpreters","text":"

With rattler-build and the new recipe syntax you can select an interpreter for your script.

So far, the following interpreters are supported:

  • bash (default on Unix)
  • cmd.exe (default on Windows)
  • nushell
  • python

Note

Using alternative interpreters is less battle-tested than using bash or cmd.exe. If you encounter any issues, please open an issue.

"},{"location":"build_script/#using-nushell","title":"Using nushell","text":"

In order to use nushell you can select the interpreter: nu or have a build.nu file in your recipe directory. Nushell works on Windows, macOS and Linux with the same syntax.

recipe.yaml
build:\n  script:\n    interpreter: nu\n    content: |\n      echo \"Hello from nushell!\"\n\n# Note: it's required to have `nushell` in the `build` section of your recipe!\nrequirements:\n  build:\n    - nushell\n
"},{"location":"build_script/#using-python","title":"Using python","text":"

In order to use python you can select the interpreter: python or have a build.py file in your recipe directory and python in the requirements/build section.

recipe.yaml
build:\n  script:\n    interpreter: python\n    content: |\n      print(\"Hello from Python!\")\n\n# Note: it's required to have `python` in the `build` section of your recipe!\nrequirements:\n  build:\n    - python\n
"},{"location":"build_script/#default-environment-variables-set-during-the-build-process","title":"Default environment variables set during the build process","text":"

During the build process, the following environment variables are set, on Windows with build.bat and on macOS and Linux with build.sh. By default, these are the only variables available to your build script. Unless otherwise noted, no variables are inherited from the shell environment in which you invoke rattler-build. To override this behavior, you can forward variables explicitly via the env section of the build script (see above).

ARCH

Either 32 or 64, to specify whether the build is 32-bit or 64-bit. The value depends on the ARCH environment variable and defaults to the architecture the interpreter running conda was compiled with.

CMAKE_GENERATOR

The CMake generator string for the current build environment. On Linux systems, this is always Unix Makefiles. On Windows, it is generated according to the Visual Studio version activated at build time, for example, Visual Studio 9 2008 Win64.

CONDA_BUILD=1

Always set to indicate that the conda-build process is running.

CPU_COUNT

Represents the number of CPUs on the system.

SHLIB_EXT

Denotes the shared library extension specific to the operating system (e.g. .so for Linux, .dylib for macOS, and .dll for Windows).

HTTP_PROXY, HTTPS_PROXY

Inherited from the user's shell environment, specifying the HTTP and HTTPS proxy settings.

LANG

Inherited from the user's shell environment, defining the system language and locale settings.

MAKEFLAGS

Inherited from the user's shell environment. This can be used to set additional arguments for the make command, such as -j2 to utilize 2 CPU cores for building the recipe.

PY_VER

Specifies the Python version against which the build is occurring. This can be modified with a variant_config.yaml file.
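
For example, a variant_config.yaml that builds the package against two Python versions might contain (a sketch):

python:\n  - \"3.10\"\n  - \"3.11\"\n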

PATH

Inherited from the user's shell environment and augmented with the activated host and build prefixes.

PREFIX

The build prefix to which the build script should install the software.

PKG_BUILDNUM

Indicates the build number of the package currently being built.

PKG_NAME

The name of the package that is being built.

PKG_VERSION

The version of the package currently under construction.

PKG_BUILD_STRING

The complete build string of the package being built, including the hash (e.g. py311h21422ab_0).

PKG_HASH

Represents the hash of the package being built, excluding the leading 'h' (e.g. 21422ab). This is applicable from conda-build 3.0 onwards.

PYTHON

The path to the Python executable in the host prefix. Python is installed in the host prefix only when it is listed as a host requirement.

R

The path to the R executable in the build prefix. R is installed in the build prefix only when it is listed as a build requirement.

RECIPE_DIR

The directory where the recipe is located.

SP_DIR

The location of Python's site-packages, where Python libraries are installed.

SRC_DIR

The path to where the source code is unpacked or cloned. If the source file is not a recognized archive format, this directory contains a copy of the source file.

STDLIB_DIR

The location of Python's standard library.

build_platform

Represents the native subdirectory of the conda executable, indicating the platform for which the build is occurring.

The following variables from conda-build have been removed:

  • NPY_VER
  • PY3K
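
To illustrate a few of these variables working together, a typical autotools-style build.sh fragment might look like this (a sketch; the extra file is hypothetical):

#!/bin/bash\n# build with all available CPU cores and install into the host prefix\n./configure --prefix=$PREFIX\nmake -j$CPU_COUNT\nmake install\n\n# ship an extra file that lives next to the recipe\nmkdir -p $PREFIX/share/mypackage\ncp $RECIPE_DIR/extra-config.txt $PREFIX/share/mypackage/\n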

"},{"location":"build_script/#windows","title":"Windows","text":"

Unix-style packages on Windows are built in a special Library directory under the build prefix. The environment variables listed below are defined only on Windows.

  • LIBRARY_BIN: <build prefix>\Library\bin
  • LIBRARY_INC: <build prefix>\Library\include
  • LIBRARY_LIB: <build prefix>\Library\lib
  • LIBRARY_PREFIX: <build prefix>\Library
  • SCRIPTS: <build prefix>\Scripts

Not yet supported in rattler-build:

  • CYGWIN_PREFIX
  • VS_MAJOR
  • VS_VERSION
  • VS_YEAR

Additionally, the following variables are forwarded from the environment:

  • ALLUSERSPROFILE
  • APPDATA
  • CommonProgramFiles
  • CommonProgramFiles(x86)
  • CommonProgramW6432
  • COMPUTERNAME
  • ComSpec
  • HOMEDRIVE
  • HOMEPATH
  • LOCALAPPDATA
  • LOGONSERVER
  • NUMBER_OF_PROCESSORS
  • PATHEXT
  • ProgramData
  • ProgramFiles
  • ProgramFiles(x86)
  • ProgramW6432
  • PROMPT
  • PSModulePath
  • PUBLIC
  • SystemDrive
  • SystemRoot
  • TEMP
  • TMP
  • USERDOMAIN
  • USERNAME
  • USERPROFILE
  • windir
  • PROCESSOR_ARCHITEW6432
  • PROCESSOR_ARCHITECTURE
  • PROCESSOR_IDENTIFIER
"},{"location":"build_script/#unix","title":"Unix","text":"

The environment variables listed below are defined only on macOS and Linux.

  • HOME: standard $HOME environment variable
  • PKG_CONFIG_PATH: path to the pkgconfig directory, defaults to $PREFIX/lib/pkgconfig
  • SSL_CERT_FILE: path to the SSL certificate file
  • CFLAGS: empty; can be forwarded from the outer environment to set additional arguments for the C compiler
  • CXXFLAGS: same as CFLAGS, for the C++ compiler
  • LDFLAGS: empty; additional flags to be passed to the linker when linking object files into an executable or shared object
"},{"location":"build_script/#macos","title":"macOS","text":"

The environment variables listed below are defined only on macOS.

  • MACOSX_DEPLOYMENT_TARGET: same as the Anaconda Python macOS deployment target; currently 10.9 for Intel (32- and 64-bit) macOS, and 11.0 for arm64
  • OSX_ARCH: i386, x86_64 or arm64, depending on the target platform
"},{"location":"build_script/#linux","title":"Linux","text":"

The environment variables listed below are defined only on Linux.

  • LD_RUN_PATH: defaults to <build prefix>/lib
  • QEMU_LD_PREFIX: the prefix used by QEMU's user mode emulation for library paths
  • QEMU_UNAME: set qemu uname release string to 'uname'
  • DEJAGNU: the path to the dejagnu testing framework used by the GCC test suite
  • DISPLAY: the X11 display to use for graphical applications
  • BUILD: target triple ({build_arch}-conda_{build_distro}-linux-gnu) where build_distro is one of cos6 or cos7, for CentOS 6 or 7
"},{"location":"compilers/","title":"Compilers and cross-compilation","text":"

To use a compiler in your project, it's best to use the ${{ compiler('lang') }} template function. The compiler function works by taking a language, determining the configured compiler for that language, and adding some information about the target platform to the selected compiler. To configure a compiler for a specific language, the variant_config.yaml file can be used.

For example, in a recipe that uses a C-compiler, you can use the following code:

requirements:\n  build:\n    - ${{ compiler('c') }}\n

To set the compiler that you want to use, create a variant config that looks like the following:

c_compiler:\n  - gcc\n\n# optionally you can specify a version\nc_compiler_version:\n  - 9.3.0\n

When the template function is evaluated, it will resolve to something like gcc_linux-64 9.3.0. You can also define your own compilers: for Rust, for example, you can use ${{ compiler('rust') }} together with rust_compiler and rust_compiler_version keys in your variant config.
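
A sketch of such a variant config for Rust (the version shown is hypothetical):

rust_compiler:\n  - rust\nrust_compiler_version:\n  - \"1.77\"\n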

"},{"location":"compilers/#cross-compilation","title":"Cross-compilation","text":"

Cross-compilation is supported by rattler-build and the compiler template function is part of what makes it possible. When you want to cross-compile from linux-64 to linux-aarch64 (i.e. intel to ARM), you can pass --target-platform linux-aarch64 to the rattler-build command. This will cause the compiler template function to select a compiler that is configured for linux-aarch64. The above example would resolve to gcc_linux-aarch64 9.3.0. Provided that the package is available for linux-64 (your build platform), the compilation should succeed.
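
For example (the recipe path is hypothetical):

rattler-build build --recipe myrecipe/recipe.yaml --target-platform linux-aarch64\n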

The distinction between the build and host sections begins to make sense when thinking about cross-compilation. The build environment is resolved to packages that need to run at compilation time. For example, cmake, gcc, and autotools are all tools that need to be executed. Therefore, the build environment resolves to packages for the linux-64 architecture (in our example). On the other hand, the host packages resolve to linux-aarch64 - those are packages that we want to link against.

# packages that need to run at build time (cmake, gcc, autotools, etc.)\n# in the platform that rattler-build is executed on (the build_platform)\nbuild:\n  - cmake\n  - ${{ compiler('c') }}\n# packages that we want to link against in the architecture we are\n# cross-compiling to the target_platform\nhost:\n  - libcurl\n  - openssl\n
"},{"location":"converting_from_conda_build/","title":"Converting a recipe from conda-build","text":"

The recipe format of rattler-build differs in some aspects from conda-build. This document aims to help you convert a recipe from conda-build to rattler-build.

"},{"location":"converting_from_conda_build/#automatic-conversion","title":"Automatic conversion","text":"

To convert a recipe from meta.yaml to recipe.yaml you can use the automatic conversion utility.

To install conda-recipe-manager, run

pixi global install conda-recipe-manager\n# or\nconda install -c conda-forge conda-recipe-manager\n

Then, run the conversion utility:

conda-recipe-manager convert my-recipe/meta.yaml\n

This will print the converted recipe to the console. You can save it to a file by redirecting the output:

conda-recipe-manager convert my-recipe/meta.yaml > recipe.yaml\n

To learn more about the tool, or contribute, find the repository here.

"},{"location":"converting_from_conda_build/#converting-jinja-and-selectors","title":"Converting Jinja and selectors","text":"

To use Jinja in the new recipes, you need to keep two conversions in mind. The {% set version = "1.2.3" %} syntax is replaced by the context section in the new recipe format.

{% set version = \"1.2.3\" %}\n

becomes

context:\n  version: \"1.2.3\"\n

To use the values or other Jinja expressions (e.g. from the variant config) you can use the ${{ version }} syntax. Note the $ sign before the curly braces - it makes Jinja fully compatible with the YAML format.

meta.yaml
# instead of\npackage:\n  version: \"{{ version }}\"\nsource:\n  url: https://example.com/foo-{{ version }}.tar.gz\n

becomes

recipe.yaml
package:\n  version: ${{ version }}\nsource:\n  url: https://example.com/foo-${{ version }}.tar.gz\n
"},{"location":"converting_from_conda_build/#converting-selectors","title":"Converting selectors","text":"

conda-build has a line-based "selector" system to, e.g., disable certain fields on Windows vs. Unix.

In rattler-build we use two different syntaxes: an if/then/else map or an inline Jinja expression.

A typical selector in conda-build looks something like this:

meta.yaml
requirements:\n  host:\n    - pywin32  # [win]\n

To convert this to rattler-build syntax, you can use one of the following two syntaxes:

recipe.yaml
requirements:\n  host:\n    - ${{ \"pywin32\" if win }}  # empty strings are automatically filtered\n    # or\n    - if: win\n      then:\n        - pywin32  # this list extends the outer list\n
"},{"location":"converting_from_conda_build/#converting-the-recipe-script","title":"Converting the recipe script","text":"

We still support the build.sh script, but the bld.bat script was renamed to build.bat in order to be more consistent with the build.sh script.

You can also choose a different name for your script:

build:\n  # note: if there is no extension, we will try to find .sh on unix and .bat on windows\n  script: my_build_script\n

There are also new ways of writing scripts, for example with nushell or python.

Variant keys in build scripts

conda-build tries to analyze the build scripts for any usage of variant keys. We do not attempt that. If you want to use variant keys in your build script that are not used anywhere else, you need to add them to your script environment manually, e.g.

recipe.yaml
build:\n  script:\n    content: echo $MY_VARIANT\n    env:\n      MY_VARIANT: ${{ my_variant }}\n
"},{"location":"converting_from_conda_build/#converting-the-recipe-structure","title":"Converting the recipe structure","text":"

There are a few differences in the recipe structure. However, the schema will tell you quite easily what is expected and you should see red squiggly lines in your editor (e.g. VSCode) if you make a mistake.

Here are a few differences:

  • build.run_exports is now requirements.run_exports
  • requirements.run_constrained is now requirements.run_constraints
  • build.ignore_run_exports is now requirements.ignore_run_exports.by_name
  • build.ignore_run_exports_from is now requirements.ignore_run_exports.from_package
  • A git source now uses git, tag, ... and not git_url and git_rev, e.g.
    git: https://github.com/foo/bar.git\ntag: 1.2.3\n
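
To illustrate the first of these mappings, a run-exports section now lives under requirements (a minimal sketch; the package name is hypothetical):

requirements:\n  run_exports:\n    - libfoo\n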
"},{"location":"converting_from_conda_build/#converting-the-test-section","title":"Converting the test section","text":"

The test section is renamed to tests and is a list of independent tests. Each test runs in its own environment.

Let's have a look at converting an existing test section:

meta.yaml
test:\n  imports:\n    - mypackage\n  commands:\n    - mypackage --version\n

This would now be split into two tests:

recipe.yaml
tests:\n  - script:\n      - mypackage --version\n  - python:\n      imports:\n        - mypackage\n      # by default we perform a `pip check` in the python test but\n      # it can be disabled by setting this to false\n      pip_check: false\n

The script tests also take a requirements section with run and build requirements. The build requirements can be used to install emulators and similar tools that need to run to execute tests in a cross-compilation environment.
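
A sketch of what that looks like (all package names are hypothetical):

tests:\n  - script:\n      - mypackage --version\n    requirements:\n      run:\n        # extra packages installed into the test environment\n        - coreutils\n      build:\n        # e.g. an emulator when testing cross-compiled packages\n        - qemu-user-static\n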

"},{"location":"experimental_features/","title":"Experimental features","text":"

Warning

These are experimental features of rattler-build and may change or go away completely.

Currently only the build and rebuild commands support the following experimental features.

To enable them, use the --experimental flag with the command, or set the environment variable RATTLER_BUILD_EXPERIMENTAL=1.
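
For example, both of the following enable the experimental features for a build (the recipe path is hypothetical):

rattler-build build --experimental --recipe myrecipe/recipe.yaml\n# or, via the environment variable\nRATTLER_BUILD_EXPERIMENTAL=1 rattler-build build --recipe myrecipe/recipe.yaml\n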

"},{"location":"experimental_features/#jinja-functions","title":"Jinja functions","text":""},{"location":"experimental_features/#load_from_filefile_path","title":"load_from_file(<file_path>)","text":"

The Jinja function load_from_file allows loading data from files; specifically, TOML, JSON, and YAML files are parsed into objects so that values can be fetched directly from the file. All other files are loaded as strings.

"},{"location":"experimental_features/#usage","title":"Usage","text":"

load_from_file is useful when there is a project description in a well-defined project file such as Cargo.toml, package.json, pyproject.toml, package.yaml, or stack.yaml. It keeps the recipe as simple as possible and avoids having to keep values in sync by hand; example use cases are CI/CD infrastructure or any project with a well-defined metadata file.

Below is an example loading a Cargo.toml inside of the rattler-build GitHub repository:

recipe.yaml
context:\n  name: ${{ load_from_file(\"Cargo.toml\").package.name }}\n  version: ${{ load_from_file(\"Cargo.toml\").package.version }}\n  source_url: ${{ load_from_file(\"Cargo.toml\").package.homepage }}\n  rust_toolchain: ${{ load_from_file(\"rust-toolchains\") }}\n\npackage:\n  name: ${{ name }}\n  version: ${{ version }}\n\nsource:\n  git: ${{ source_url }}\n  tag: ${{ source_tag }}\n\nrequirements:\n  build:\n    - rust ==${{ rust_toolchain }}\n\nbuild:\n  script: cargo build --release -p ${{ name }}\n\ntest:\n  - script: cargo test -p ${{ name }}\n  - script: cargo test -p rust-test -- --test-threads=1\n\nabout:\n  home: ${{ source_url }}\n  repository: ${{ source_url }}\n  documentation: ${{ load_from_file(\"Cargo.toml\").package.documentation }}\n  summary: ${{ load_from_file(\"Cargo.toml\").package.description }}\n  license: ${{ load_from_file(\"Cargo.toml\").package.license }}\n
"},{"location":"experimental_features/#git-functions","title":"git functions","text":"

git functions are useful for getting the latest tag and commit hash. These can be used in the context section of the recipe, to fetch version information from a repository.

Examples
# latest tag in the repo\ngit.latest_tag(<git_repo_url>)\n\n# latest tag revision (i.e. the hash of the tag commit) in the repo\ngit.latest_tag_rev(<git_repo_url>)\n\n# latest commit revision (i.e. the hash of the head commit) in the repo\ngit.head_rev(<git_repo_url>)\n
"},{"location":"experimental_features/#usage_1","title":"Usage","text":"

These can be useful for automating minor things inside the recipe itself, such as checking whether the current version is the latest tag or whether the current commit is the latest revision.

recipe.yaml
context:\n  git_repo_url: \"https://github.com/prefix-dev/rattler-build\"\n  latest_tag: ${{ git.latest_tag( git_repo_url ) }}\n\npackage:\n  name: \"rattler-build\"\n  version: ${{ latest_tag }}\n\nsource:\n  git: ${{ git_repo_url }}\n  tag: ${{ latest_tag }}\n

There is currently no guarantee of caching for repo fetches when using git functions. This may lead to some performance issues.

"},{"location":"highlevel/","title":"What is rattler-build?","text":"

rattler-build is a tool to build and package software so that it can be installed on any operating system \u2013 with any compatible package manager such as mamba, conda, or rattler. We are also intending for rattler-build to be used as a library to drive builds of packages from any other recipe format in the future.

"},{"location":"highlevel/#how-does-rattler-build-work","title":"How does rattler-build work?","text":"

Building a package consists of several steps. It all begins with a recipe.yaml file that specifies how the package is to be built and what its dependencies are. From the recipe file, rattler-build executes several steps:

  1. Rendering: Parse the recipe file and evaluate conditionals, Jinja expressions, variables, and variants.

  2. Fetch source: Retrieve the specified source files, such as .tar.gz archives, git repositories, or local paths. Additionally, this step applies any patches that are specified alongside the source file.

  3. Install build environments: Download and install dependencies into temporary \"host\" and \"build\" workspaces. Any dependencies that are needed at build time are installed in this step.

  4. Build source: Execute the build script to build/compile the source code and install it into the host environment.

  5. Prepare package files: Collect all files that are new in the \"host\" environment and apply some transformations if necessary; specifically, we edit the rpath on Linux and macOS to make binaries relocatable.

  6. Package: Bundle all the files in a package and write out any additional metadata into the info/index.json, info/about.json, and info/paths.json files. This also creates the test files that are bundled with the package.

  7. Test: Run any tests specified in the recipe. The package is considered done if it passes all the tests; otherwise, it is moved to broken/ in the output directory.

After this process, a package is created. This package can then be uploaded, for example to a private or public channel on prefix.dev.

"},{"location":"highlevel/#how-to-run-rattler-build","title":"How to run rattler-build","text":"

Running rattler-build is straightforward. It can be done on the command line:

rattler-build build --recipe myrecipe/recipe.yaml\n

A custom channel that is not conda-forge (the default) can be specified like so:

rattler-build build -c robostack --recipe myrecipe/recipe.yaml\n

You can also use the --recipe-dir argument if you want to build all the packages in a directory:

rattler-build build --recipe-dir myrecipes/\n
"},{"location":"highlevel/#overview-of-a-recipeyaml","title":"Overview of a recipe.yaml","text":"

A recipe.yaml file is separated into multiple sections and can conditionally include or exclude sections. Recipe files also support a limited amount of string interpolation with Jinja (specifically minijinja in our case).

A simple example of a recipe file for the zlib package would look as follows:

recipe.yaml
# variables from the context section can be used in the rest of the recipe\n# in jinja expressions\ncontext:\n  version: 1.2.13\n\npackage:\n  name: zlib\n  version: ${{ version }}\n\nsource:\n  url: http://zlib.net/zlib-${{ version }}.tar.gz\n  sha256: b3a24de97a8fdbc835b9833169501030b8977031bcb54b3b3ac13740f846ab30\n\nbuild:\n  # build numbers can be set arbitrarily\n  number: 0\n  script:\n    # build script to install the package into the $PREFIX (host prefix)\n    - if: unix\n      then:\n      - ./configure --prefix=$PREFIX\n      - make -j$CPU_COUNT\n    - if: win\n      then:\n      - cmake -G \"Ninja\" -DCMAKE_BUILD_TYPE=Release -DCMAKE_PREFIX_PATH=%LIBRARY_PREFIX%\n      - ninja install\n\nrequirements:\n  build:\n    # compiler is a special function.\n    - ${{ compiler(\"c\") }}\n    # The following two dependencies are only needed on Windows,\n    # and thus conditionally selected\n    - if: win\n      then:\n        - cmake\n        - ninja\n    - if: unix\n      then:\n        - make\n

The sections of a recipe are:

  • context: defines variables that can be used in the Jinja context later in the recipe (e.g. name and version are commonly interpolated in strings)
  • package: defines the name and version of the package you are currently building; this will be the name of the final output
  • source: defines where the source code is going to be downloaded from, and its checksums
  • build: settings for the build and the build script
  • requirements: allows the definition of build, host, run and run-constrained dependencies
"},{"location":"internals/","title":"What does rattler-build do to build a package?","text":"

rattler-build creates conda packages which are relocatable packages. These packages are built up with some rules and conventions in mind.

"},{"location":"internals/#what-goes-into-a-package","title":"What goes into a package?","text":"

Generally speaking, any new files that are copied into the $PREFIX directory at build time are part of the new package. However, there is some filtering going on to exclude unwanted files, and noarch: python packages have special handling as well. The rules are as follows:

"},{"location":"internals/#filtering","title":"Filtering","text":""},{"location":"internals/#general-file-filtering","title":"General File Filtering","text":"

Certain files are filtered out to prevent them from being included in the package. These include:

  • .pyo files: Optimized Python files are not included because they are considered harmful.
  • .la files: Libtool archive files that are not needed at runtime.
  • .DS_Store files: macOS-specific files that are irrelevant to the package.
  • .git files and directories: Version control files, including .gitignore and the .git directory, which are not needed in the package.
  • share/info/dir: this file is ignored because it would be written to by multiple packages.
"},{"location":"internals/#special-handling-for-noarch-python-packages","title":"Special Handling for noarch: python Packages","text":"

For packages marked as noarch: python, special transformations are applied to ensure compatibility across different platforms:

  • Stripping Python Library Prefix: The \"lib/pythonX.X\" prefix is removed, retaining only the \"site-packages\" part of the path.
  • Skipping __pycache__ Directories and .pyc Files: These are excluded and recreated during installation (they are specific to the Python version).
  • Replacing bin and Scripts Directories:
    • On Unix systems, the bin directory is replaced with python-scripts.
    • On Windows systems, the Scripts directory is replaced with python-scripts.
  • Remove explicitly mentioned entrypoints: For noarch: python packages, entry points registered in the package are also taken into account. Files in the bin or Scripts directories that match entry points are excluded to avoid duplications.
"},{"location":"internals/#symlink-handling","title":"Symlink Handling","text":"

Symlinks are carefully managed to ensure they are relative rather than absolute, which aids in making the package relocatable:

  • Absolute symlinks pointing within the $PREFIX are converted to relative symlinks.
  • On Unix systems, this conversion is handled directly by creating new relative symlinks.
  • On Windows, a warning is issued since symlink creation requires administrator privileges.
"},{"location":"internals/#making-packages-relocatable-with-rattler-build","title":"Making Packages Relocatable with rattler-build","text":"

Often, the most challenging aspect of building a package using rattler-build is making it relocatable. A relocatable package can be installed into any prefix, allowing it to be used outside the environment in which it was built. This is in contrast to a non-relocatable package, which can only be utilized within its original build environment.

rattler-build automatically performs the following actions to make packages relocatable:

  1. Binary object file conversion: Binary object files are converted to use relative paths using install_name_tool on macOS and patchelf on Linux. This uses $ORIGIN for elf files on Linux and @loader_path for Mach-O files on macOS to make the rpath relative to the executable / shared library.
  2. Text file prefix registration: Any text file without NULL bytes that contains the placeholder prefix has the registered prefix replaced with the install prefix.
  3. Binary file prefix detection and registration: Binary files containing the build prefix can be automatically registered. The registered files will have their build prefix replaced with the install prefix at install time. This works by padding the install prefix with null terminators, such that the length of the binary file remains the same. The build prefix must be long enough to accommodate any reasonable installation prefix. On macOS and Linux, rattler-build pads the build prefix to 255 characters by appending _placehold to the end of the build directory name.
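
For illustration only, the kind of commands this corresponds to looks roughly like the following (file names are hypothetical; rattler-build performs this automatically):

# Linux: make the rpath relative to the location of the binary\npatchelf --set-rpath '$ORIGIN/../lib' $PREFIX/bin/mytool\n\n# macOS: the equivalent using install_name_tool\ninstall_name_tool -add_rpath @loader_path/../lib $PREFIX/bin/mytool\n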
"},{"location":"multiple_output_cache/","title":"The cache for multiple outputs","text":"

Note

The \"multi-output\" cache is a little bit different from a compilation cache. If you look for tips and tricks on how to use sccache or ccache with rattler-build, please refer to the tips and tricks section.

Sometimes you build a package and want to split the contents into multiple sub-packages. For example, when building a C/C++ package, you might want to create multiple packages for the runtime requirements (library), and the development time requirements such as header files.

The \"cache\" output makes this easy. It allows you to specify a single top-level cache that can produce arbitrary files, that can then be used in other packages.

Let's take a look at an example:

recipe.yaml
recipe:\n  name: mypackage\n  version: '0.1.0'\n\ncache:\n  requirements:\n    build:\n      - ${{ compiler('c') }}\n  build:\n    script:\n      - mkdir -p $PREFIX/lib\n      - mkdir -p $PREFIX/include\n      - echo \"This is the library\" > $PREFIX/lib/library.txt\n      - echo \"This is the header\" > $PREFIX/include/header.txt\n\noutputs:\n  - package:\n      name: mypackage-library\n    build:\n      files:\n        - lib/*\n\n  - package:\n      name: mypackage-headers\n    build:\n      files:\n        - include/*\n

Note

Since this is an experimental feature, you need to pass the --experimental flag to enable parsing of the cache top-level section.

In this example, we have a single package called mypackage that creates two outputs: mypackage-library and mypackage-headers. The cache output will run like a regular output, but after the build is finished, the files will be copied to a \"cache\" directory (in your output folder, under output/build_cache).

The files in the cache folder are then copied into the $PREFIX of each output package. Since they are \"new\" files in the prefix, they will be included in the output package. The easiest way to select a subset of the files in the prefix is by using the files field in the output definition. You can use a list of globs to select only the files that you want.

For something more complicated, you can also use the include and exclude fields in the files selector. Please refer to the build options documentation.
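
For example, an output could select everything under lib/ except debug artifacts (a sketch; see the build options documentation for the exact schema):

outputs:\n  - package:\n      name: mypackage-library\n    build:\n      files:\n        include:\n          - lib/**\n        exclude:\n          - lib/debug/**\n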

"},{"location":"multiple_output_cache/#run-exports-from-the-cache","title":"Run exports from the cache","text":"

Since the cache output also has build and host requirements, we need to take care of any "run exports" coming from the cache output. Run exports from the cache dependencies are handled very similarly to run exports from a regular output: we append them to the outputs.

If the cache has an \"ignore run exports\" section, than we apply those filters at the cache level. If the output ignores any run exports, then we also ignore the run-exports if they would come from the cache.

"},{"location":"multiple_output_cache/#caching-in-the-src_dir","title":"Caching in the $SRC_DIR","text":"

If you have used conda-build a lot, you might have noticed that a top-level build also caches the changes in the $SRC_DIR. This is not yet the case for rattler-build.

You could work around this by, for example, copying files into the $PREFIX and restoring them in each output.

"},{"location":"package_spec/","title":"Package specification","text":"

rattler-build produces \"conda\" packages. These packages work with the mamba and conda package managers, and they work cross-platform on Windows, Linux and macOS.

By default, a conda package is a tar.bz2 archive which contains:

  • Metadata under the info/ directory
  • A collection of files that are installed directly into an install prefix

The format is identical across platforms and operating systems. During the install process, all files are extracted into the install prefix, except the ones in info/. Installing a conda package into an environment is similar to executing the following commands:

cd <environment prefix>\ntar xjf mypkg-1.0.0-h2134.tar.bz2\n

Only files, including symbolic links, are part of a conda package. Directories are not included. Directories are created and removed as needed, but you cannot create an empty directory from the tar archive directly.

There is also a newer archive type, suffixed with .conda. This archive type consists of an outer \"zip\" archive that is not compressed, and two inner archives that are compressed with zstd, which is very fast for decompression.

The inner archives are split into info and pkg files, which makes it possible to extract only the info part of the archive (only the metadata), which is often smaller in size.
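
Because the outer archive is a regular zip file, you can inspect the metadata with standard tools (a sketch, assuming unzip and a zstd-capable tar are available):

# list the two inner archives (info-... and pkg-...)\nunzip -l mypkg-1.0.0-h2134.conda\n\n# extract and decompress only the metadata part\nunzip mypkg-1.0.0-h2134.conda 'info-*.tar.zst'\ntar --zstd -xf info-mypkg-1.0.0-h2134.tar.zst\n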

"},{"location":"package_spec/#package-filename","title":"Package filename","text":"

A conda package conforms to the following filename:

<name>-<version>-<hash>.tar.bz2 OR <name>-<version>-<hash>.conda\n
"},{"location":"package_spec/#special-files-in-packages","title":"Special files in packages","text":"

There are some special files in a package:

  • activation and deactivation scripts that are executed when the environment is activated or deactivated
  • post-link and pre-unlink scripts that are executed when the package is installed or uninstalled

You can read more about these files in the activation scripts and other special files section.

"},{"location":"package_spec/#package-metadata","title":"Package metadata","text":"

The info/ directory contains all metadata about a package. Files in this location are not installed under the install prefix. Although you are free to add any file to this directory, conda only inspects the content of the files discussed below:

"},{"location":"package_spec/#infoindexjson","title":"info/index.json","text":"

This file contains basic information about the package, such as name, version, build string, and dependencies. The content of this file is stored in repodata.json, which is the repository index file, hence the name index.json. The JSON object is a dictionary containing the keys shown below.

name: string

The lowercase name of the package. May contain lowercase characters, underscores, and dashes.

version: string

The package version. May not contain \"-\". Acknowledges PEP 440.

build: string

The build string. May not contain \"-\". Differentiates builds of packages with otherwise identical names and versions, such as:

  • A build with other dependencies, such as Python 3.4 instead of Python 2.7.
  • A bug fix in the build process.
  • Some different optional dependencies, such as MKL versus ATLAS linkage.

Nothing in conda actually inspects the build string. Strings such as np18py34_1 are designed only for human readability and conda never parses them.

build_number: integer

A non-negative integer representing the build number of the package. Unlike the build string, the build_number is inspected by conda. Conda uses it to sort packages that have otherwise identical names and versions to determine the latest one. This is important because new builds that contain bug fixes for the way a package is built may be added to a repository.

depends: list of match specs

A list of dependency specifications, where each element is a string. These come from the run section of the recipe or any run exports of dependencies.

constrains: list of match specs

A list of optional dependency constraints. The packages listed under constrains are not installed by default, but if they are installed they have to respect the constraints.

subdir: string

The subdir (like linux-64) of this package.

arch: string

Optional. The architecture the package is built for, e.g. x86_64. This key is generally not used (it duplicates information from subdir).

platform: string

Optional. The OS that the package is built for, e.g. osx. This key is generally not used (it duplicates information from subdir).
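
Putting these keys together, a minimal index.json might look like this (illustrative values):

{\n  \"name\": \"mypkg\",\n  \"version\": \"1.0.0\",\n  \"build\": \"h2134_0\",\n  \"build_number\": 0,\n  \"depends\": [\"python >=3.8\"],\n  \"constrains\": [],\n  \"subdir\": \"linux-64\"\n}\n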

"},{"location":"package_spec/#infopathsjson","title":"info/paths.json","text":"

The paths.json file lists all files that are installed into the environment.

It consists of a list of path entries, each with the following keys:

_path: string

The relative path of the file

path_type: optional, string

The type of linking, can be hardlink, softlink, or directory. Default is hardlink.

file_mode: optional, string

The file mode can be binary or text. This is only relevant for prefix replacement.

prefix_placeholder: optional, string

The prefix placeholder string that is encoded in the text or binary file, which is replaced at installation time. Note that this prefix placeholder uses / even on Windows.

no_link: optional, bool

Determines whether this file should be linked or not when installing the package (linking the file from the cache into the environment). Defaults to false.

sha256: string

The SHA256 hash of the file. For symbolic links it contains the SHA256 hash of the file pointed to.

size_in_bytes: number

The size, in bytes, of the file. For symbolic links, it contains the file size of the file pointed to.

Due to the way the binary replacement works, the placeholder prefix must be longer than the install prefix.
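
A minimal paths.json could look like this (illustrative values; the sha256 is truncated here for readability):

{\n  \"paths\": [\n    {\n      \"_path\": \"lib/libfoo.so.1\",\n      \"path_type\": \"hardlink\",\n      \"sha256\": \"4a3aee7acb...\",\n      \"size_in_bytes\": 12345\n    }\n  ],\n  \"paths_version\": 1\n}\n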

"},{"location":"package_spec/#infolicense","title":"info/license/<...>","text":"

All licenses mentioned in the recipe are copied to this folder.

"},{"location":"package_spec/#infoaboutjson","title":"info/about.json","text":"

Optional file. Contains the entries of the "about" section of the recipe.yaml file. The following keys are added to info/about.json if present in the build recipe:

Renamed fields

The new recipe spec renamed a few fields (from conda-build's original implementation). This means that some fields in the about.json file still have the old names (for backwards compatibility), while you would generally use different names in the recipe.

home: url (from about.homepage)

The URL of the homepage of the package.

dev_url: url (from about.repository)

The URL of the development repository of the package.

doc_url: url (from about.documentation)

The URL of the documentation of the package.

license_url: url

The URL of the license of the package.

license: string (from about.license)

The SPDX license identifier of the package.

summary: string

A short summary of the package.

description: string

A longer description of the package.

license_family: string

(this field is not used anymore as we rely on SPDX license identifiers)

"},{"location":"package_spec/#inforecipe","title":"info/recipe/<...>","text":"

A directory containing the full contents of the build recipe. This folder also contains a rendered version of the recipe (rendered_recipe.yaml). This rendered version is used for the rebuild command. However, note that currently this format is still in flux and can change at any time.

You can also use --no-include-recipe to disable the inclusion of the recipe in the package.

"},{"location":"rebuild/","title":"Rebuilding a package","text":"

The rebuild command allows you to rebuild a package from an existing package. The main use case is to examine if a package can be rebuilt in a reproducible manner. You can read more about reproducible builds here.

"},{"location":"rebuild/#usage","title":"Usage","text":"
rattler-build rebuild ./mypkg-0.1.0-h60d57d3_0.tar.bz2\n
"},{"location":"rebuild/#how-it-works","title":"How it works","text":"

The recipe is \"rendered\" and stored into the package. The way the recipe is rendered is subject to change. For the moment, the rendered recipe is stored as info/recipe/rendered_recipe.yaml. It includes the exact package versions that were used at build time. When rebuilding, we use the package resolutions from the rendered recipe, and execute the same build script as the original package.

We also take great care to sort files in a deterministic manner and to erase any timestamps. The SOURCE_DATE_EPOCH environment variable is set to the same timestamp as the original build for additional determinism (some build tools use this variable to set timestamps).

"},{"location":"rebuild/#how-to-check-the-reproducibility-of-a-package","title":"How to check the reproducibility of a package","text":"

There is an excellent tool called diffoscope that allows you to compare two packages and see the differences. You can install it with pixi:

pixi global install diffoscope\n

To compare two packages, you can use the following command:

rattler-build rebuild ./build0.tar.bz2\ndiffoscope ./build0.tar.bz2 ./mypkg-0.1.0-h60d57d3_0.tar.bz2\n
"},{"location":"recipe_generation/","title":"Generating recipes for different ecosystems","text":"

rattler-build has built-in functionality to generate recipes for different (existing) ecosystems.

Currently we support the following ecosystems:

  • pypi (Python) - generates a recipe for a Python package
  • cran (R) - generates a recipe for an R package

To generate a recipe for a Python package, you can use the following command:

rattler-build generate-recipe pypi jinja2\n

This will generate a recipe for the jinja2 package from PyPI and print it to the console. To turn it into a recipe, you can either pipe the stdout to a file or use the -w flag. The -w flag will create a new folder with the recipe in it.
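
For example (both commands assume the jinja2 invocation above):

# pipe the recipe to a file yourself\nrattler-build generate-recipe pypi jinja2 > recipe.yaml\n\n# or let rattler-build create a folder containing the recipe\nrattler-build generate-recipe pypi jinja2 -w\n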

The generated recipe for jinja2 will look something like:

recipe.yaml
package:\n  name: jinja2\n  version: 3.1.4\n\nsource:\n- url: https://files.pythonhosted.org/packages/ed/55/39036716d19cab0747a5020fc7e907f362fbf48c984b14e62127f7e68e5d/jinja2-3.1.4.tar.gz\n  sha256: 4a3aee7acbbe7303aede8e9648d13b8bf88a429282aa6122a993f0ac800cb369\n\nbuild:\n  script: python -m pip install .\n\nrequirements:\n  host:\n  - flit_core <4\n  - python >=3.7\n  - pip\n  run:\n  - python >=3.7\n  - markupsafe >=2.0\n  # - babel >=2.7  # extra == 'i18n'\n\ntests: []\n\nabout:\n  summary: A very fast and expressive template engine.\n  documentation: https://jinja.palletsprojects.com/\n
"},{"location":"recipe_generation/#generating-recipes-for-r-packages","title":"Generating recipes for R packages","text":"

To generate a recipe for an R package, you can use the following command:

rattler-build generate-recipe cran dplyr\n

The R recipe generation supports some additional flags:

  • -u/--universe select an R universe to use (e.g. bioconductor)
  • -t/--tree generate multiple recipes, for every dependency as well
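
For example, to generate recipes from the bioconductor universe, including recipes for all dependencies (a hypothetical invocation):

rattler-build generate-recipe cran limma -u bioconductor -t\n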

R packages will be prefixed with r- to avoid name conflicts with Python packages. The generated recipe for dplyr will look something like:

recipe.yaml
package:\n  name: r-dplyr\n  version: 1.1.4\n\nsource:\n- url: https://cran.r-project.org/src/contrib/dplyr_1.1.4.tar.gz\n  md5: e3066ea859b26e0d3b992c476ea3af2e\n\nbuild:\n  script: R CMD INSTALL --build .\n  python: {}\n\nrequirements:\n  host:\n  - r-base >=3.5.0\n  run:\n  - r-cli >=3.4.0\n  - r-generics\n  - r-glue >=1.3.2\n  - r-lifecycle >=1.0.3\n  - r-magrittr >=1.5\n  - r-methods\n  - r-pillar >=1.9.0\n  - r-r6\n  - r-rlang >=1.1.0\n  - r-tibble >=3.2.0\n  - r-tidyselect >=1.2.0\n  - r-utils\n  - r-vctrs >=0.6.4\n  # -  r-bench  # suggested\n  # -  r-broom  # suggested\n  # -  r-callr  # suggested\n  # -  r-covr  # suggested\n  # -  r-dbi  # suggested\n  # -  r-dbplyr >=2.2.1  # suggested\n  # -  r-ggplot2  # suggested\n  # -  r-knitr  # suggested\n  # -  r-lahman  # suggested\n  # -  r-lobstr  # suggested\n  # -  r-microbenchmark  # suggested\n  # -  r-nycflights13  # suggested\n  # -  r-purrr  # suggested\n  # -  r-rmarkdown  # suggested\n  # -  r-rmysql  # suggested\n  # -  r-rpostgresql  # suggested\n  # -  r-rsqlite  # suggested\n  # -  r-stringi >=1.7.6  # suggested\n  # -  r-testthat >=3.1.5  # suggested\n  # -  r-tidyr >=1.3.0  # suggested\n  # -  r-withr  # suggested\n\nabout:\n  homepage: https://dplyr.tidyverse.org, https://github.com/tidyverse/dplyr\n  summary: A Grammar of Data Manipulation\n  description: |-\n    A fast, consistent tool for working with data frame like\n    objects, both in memory and out of memory.\n  license: MIT\n  license_file: LICENSE\n  repository: https://github.com/cran/dplyr\n

Tip

You can use the generated recipes to build your own \"forge\" with rattler-build. Read more about it in the Building your own forge section.

"},{"location":"selectors/","title":"Selectors in recipes","text":"

Recipe and variant configuration files can utilize selectors to conditionally add, remove, or modify dependencies, configuration options, or even skip recipe execution based on specific conditions.

Selectors are implemented using an if / then / else map, which is a valid YAML dictionary. The condition is evaluated using minijinja and follows the same syntax as a Python expression.

During rendering, several variables are set based on the platform and variant being built. For example, the unix variable is true for macOS and Linux, while win is true for Windows. Consider the following recipe executed on Linux:

requirements:\n  host:\n    - if: unix\n      then: unix-tool\n    - if: win\n      then: win-tool\n

This will be evaluated as:

requirements:\n  host:\n    - unix-tool\n

The line containing the Windows-specific configuration is removed. Multiple items can also be selected, such as:

host:\n  - if: linux\n    then:\n      - linux-tool-1\n      - linux-tool-2\n      - linux-tool-3\n

For Linux, this will result in:

host:\n  - linux-tool-1\n  - linux-tool-2\n  - linux-tool-3\n

Other examples often found in the wild:

if: build_platform != target_platform ... # true if cross-platform build\nif: osx and arm64 ... # true for Apple silicon (osx-arm64)\nif: linux and (aarch64 or ppc64le) ... # true for linux-aarch64 or linux-ppc64le\n
"},{"location":"selectors/#available-variables","title":"Available variables","text":"

The following variables are available during rendering of the recipe:

  • target_platform: the configured target_platform for the build
  • build_platform: the configured build_platform for the build
  • linux: \"true\" if target_platform is Linux
  • osx: \"true\" if target_platform is OSX / macOS
  • win: \"true\" if target_platform is Windows
  • unix: \"true\" if target_platform is a Unix (macOS or Linux)
  • x86, x86_64: 32-bit / 64-bit x86 architecture
  • aarch64, arm64: 64-bit Arm (the two names are equivalent; both are supported for legacy reasons)
  • armV6l, armV7l: 32-bit Arm
  • ppc64, s390x: big-endian architectures
  • ppc64le: little-endian PowerPC
  • riscv32, riscv64: the RISC-V architecture
  • wasm32: the WebAssembly architecture
"},{"location":"selectors/#variant-selectors","title":"Variant selectors","text":"

To select based on variant configuration you can use the names in the selectors as well. For example, if the build uses python: 3.8 as a variant, we can use if: python == \"3.8\" to enable a dependency for only when the Python version is 3.8.

String comparison

The comparison is a string comparison done by minijinja, so it is important to use the correct string representation of the variant. Use the match function to compare versions.

variants.yaml
python:\n  - 3.8\n  - 3.9\n
recipe.yaml
requirements:\n  host:\n    - if: python == \"3.8\" # (1)!\n      then: mydep\n      else: otherdep\n
  1. This will only add mydep when the Python version is 3.8. This comparison is a string comparison, so it is important to use the correct string representation of the variant.
"},{"location":"selectors/#the-match-function","title":"The match function","text":"

Note

The cmp function has been renamed to match to better reflect its purpose.

Inside selectors, one can use the special match function to test whether the selected variant version matches a given version spec. For example, with the following variants file, we could use these tests:

variants.yaml
python:\n  - 3.8\n  - 3.9\n
recipe.yaml
- if: match(python, \"3.8\")    # true, false\n  then: mydep\n- if: match(python, \">=3.8\")  # true, true\n  then: mydep\n- if: match(python, \"<3.8\")   # false, false (1)\n  then: mydep\n
  1. else: would also have worked here.

This function eliminates the need to implement any Python-specific conda-build selectors (such as py3k, py38, etc.) or the py and npy integers.

Please note that during the initial phase of rendering we do not know the variant, and thus the match condition always evaluates to true.

"},{"location":"selectors/#selector-evaluation","title":"Selector evaluation","text":"

Except for the rattler-build specific selectors, selectors are evaluated using the minijinja engine and therefore follow Python-like expression syntax. Some notable options are:

- if: python == \"3.8\" # equal\n- if: python != \"3.8\" # not equal\n- if: python and linux # true if python variant is set and the target_platform is linux\n- if: python and not linux # true if python variant is set and the target_platform is not linux\n- if: python and (linux or osx) # true if python variant is set and the target_platform is linux or osx\n
"},{"location":"special_files/","title":"Activation scripts and other special files","text":"

A conda package can contain "special" files in the prefix. These files are scripts that are executed during the activation, installation, or uninstallation process.

If possible, they should be avoided since they execute arbitrary code at installation time and slow down the installation and activation process.

"},{"location":"special_files/#activation-scripts","title":"Activation scripts","text":"

The activation scripts are executed when the environment containing the package is activated (e.g. when doing micromamba activate myenv or pixi run ...).

The scripts are located in special folders:

  • etc/conda/activate.d/{script.sh/bat} - scripts in this folder are executed before the environment is activated
  • etc/conda/deactivate.d/{script.sh/bat} - scripts in this folder are executed when the environment is deactivated

The scripts are executed in lexicographical order, so you can prefix them with numbers to control the order of execution.

To add a script to the package, just make sure that you install the file in this folder. For example, on Linux:

mkdir -p $PREFIX/etc/conda/activate.d\ncp activate-mypkg.sh $PREFIX/etc/conda/activate.d/10-activate-mypkg.sh\n\nmkdir -p $PREFIX/etc/conda/deactivate.d\ncp deactivate-mypkg.sh $PREFIX/etc/conda/deactivate.d/10-deactivate-mypkg.sh\n
"},{"location":"special_files/#post-link-and-pre-unlink-scripts","title":"Post-link and pre-unlink scripts","text":"

The post-link and pre-unlink scripts are executed when the package is installed or uninstalled. Both are heavily discouraged, but have been implemented in rattler-build since version 0.17 for compatibility with conda.

For a post-link script to be executed when a package is installed, the built package needs to have a .<package_name>-post-link.{sh/bat} in its bin/ folder. The same is applicable for pre-unlink scripts, just with the name .<package_name>-pre-unlink.{sh/bat} (note the leading period). For example, for a package mypkg, you would need to have a .mypkg-post-link.sh in its bin/ folder.

To make sure the scripts are included in the correct location, use your recipe's build script or build/script key. For example, assuming you have a post-link.sh script in your source, alongside the recipe in the recipe's folder, the following configuration will copy it correctly:

build:\n  ...\n  script:\n    - ...\n    - mkdir -p $PREFIX/bin\n    - cp $RECIPE_DIR/post-link.sh $PREFIX/bin/.mypkg-post-link.sh\n    - chmod +x $PREFIX/bin/.mypkg-post-link.sh\n

The $PREFIX and $RECIPE_DIR environment variables will be set during the build process to help you specify the correct paths.

"},{"location":"testing/","title":"Testing packages","text":"

When you are developing a package, you should write tests for it. The tests are automatically executed right after the package build has finished.

The tests from the test section are actually packaged into your package and can also be executed straight from the existing package.

The idea behind adding the tests into the package is that you can execute the tests independently from building the package. That is also why we are shipping a test subcommand that takes as input an existing package and executes the tests:

rattler-build test --package-file ./xtensor-0.24.6-h60d57d3_0.tar.bz2\n

Running the above command will extract the package and create a clean environment where the package and dependencies are installed. Then the tests are executed in this newly-created environment.

If you inspect the package contents, you will find the test files under info/tests/*.

"},{"location":"testing/#how-tests-are-translated","title":"How tests are translated","text":"

The tests section allows you to specify the following things:

tests:\n  - script:\n      # commands to run to test the package. If any of the commands\n      # returns with an error code, the test is considered failed.\n      - echo \"Hello world\"\n      - pytest ./tests\n\n    # additional requirements at test time\n    requirements:\n      run:\n        - pytest\n\n    files:\n      # Extra files to be copied to the test directory from the \"work directory\"\n      source:\n        - tests/\n        - test.py\n        - \"*.sh\"\n      recipe:\n        - more_tests/*.py\n\n  # This test section tries to import the Python modules and errors if it can't\n  - python:\n      imports:\n        - mypkg\n        - mypkg.subpkg\n

When you are writing a test for your package, additional files are created and added to your package. These files are placed under the info/tests/{index}/ folder for each test.

For a script test:

  • All the files are copied straight into the test folder (under info/tests/{index}/)
  • The script is turned into a run_test.sh or run_test.bat file
  • The extra requirements are stored as a JSON file called test_time_dependencies.json

For a Python import test:

  • A JSON file is created that is called python_test.json and stores the imports to be tested and whether to execute pip check or not. This file is placed under info/tests/{index}/

For a downstream test:

  • A JSON file is created that is called downstream_test.json and stores the downstream tests to be executed. This file is placed under info/tests/{index}/
"},{"location":"testing/#legacy-tests","title":"Legacy tests","text":"

Legacy tests (from conda-build) are still supported for execution. These tests are stored as files under the info/test/ folder.

The files are:

  • run_test.sh (Unix)
  • run_test.bat (Windows)
  • run_test.py (for the Python import tests)
  • test_time_dependencies.json (for additional dependencies at test time)

Additionally, the info/test/ folder contains all the files specified in the test section as source_files and files. The tests are executed pointing to this directory as the current working directory.

"},{"location":"tips_and_tricks/","title":"Tips and tricks for rattler-build","text":"

This section contains some tips and tricks for using rattler-build.

"},{"location":"tips_and_tricks/#using-sccache-or-ccache-with-rattler-build","title":"Using sccache or ccache with rattler-build","text":"

When debugging a recipe it can help a lot to use sccache or ccache. You can install both tools e.g. with pixi global install sccache.

To use them with a CMake project, you can use the following variables:

export CMAKE_C_COMPILER_LAUNCHER=sccache\nexport CMAKE_CXX_COMPILER_LAUNCHER=sccache\n\n# or more generally\n\nexport CC=\"sccache $CC\"\nexport CXX=\"sccache $CXX\"\n

However, both ccache and sccache are sensitive to changes in the build location. Since rattler-build, by default, creates a new build directory containing a timestamp, you need to use the --no-build-id flag. This flag removes the timestamp from the build directory name and allows ccache and sccache to reuse cached results.

rattler-build build --no-build-id --recipe ./path/to/recipe.yaml\n
"},{"location":"tips_and_tricks/#building-your-own-forge","title":"Building your own \"forge\"","text":"

You might want to publish your own software packages to a channel you control. These might be packages that are not available in the main conda-forge channel, or proprietary packages, or packages that you have modified in some way.

Doing so is pretty straightforward with rattler-build and a CI provider of your choice. We have a number of example repositories for \"custom\" forges:

  • rust-forge: This repository builds a number of Rust packages for Windows, macOS and Linux on top of Github Actions.
  • r-forge: The same idea, but for R packages
"},{"location":"tips_and_tricks/#directory-structure","title":"Directory structure","text":"

To create your own forge, you should create a number of sub-directories where each sub-directory should contain at most one recipe. With the --recipe-dir flag of rattler-build, the program will go and collect all recipes it finds in the given directory or sub-directories.

We can combine this with the --skip-existing=all flag which will skip all packages that are already built locally or in the channel (if you upload them). Using all will also look at the repodata.json file in the channel to see if the package is already there. Packages are skipped based on their complete name, including the version and build string.

Note that the build string changes whenever the variant configuration changes, so updating the variant configuration causes the affected packages to be rebuilt.
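
A typical local invocation for such a forge layout might look like this (a sketch; the channel URL is a placeholder for your own channel):

rattler-build build --recipe-dir ./recipes \\\n  --skip-existing=all \\\n  -c conda-forge -c https://prefix.dev/my-forge\n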

Note

You can generate recipes for different ecosystems with the rattler-build generate-recipe command. Read more about it in the Generating recipes section.

"},{"location":"tips_and_tricks/#ci-setup","title":"CI setup","text":"

As an example, the following is the CI setup for rust-forge. The workflow uses rattler-build to build and upload packages to a custom channel on https://prefix.dev \u2013 but you can also use rattler-build to upload to your own quetz instance, or a channel on anaconda.org.

Example CI setup for rust-forge

The following is an example of a Github Actions workflow for rust-forge:

.github/workflows/forge.yml
name: Build all packages\n\non:\n  push:\n    branches:\n      - main\n  workflow_dispatch:\n  pull_request:\n    branches:\n      - main\n\njobs:\n  build:\n    strategy:\n      matrix:\n        include:\n          - { target: linux-64, os: ubuntu-20.04 }\n          - { target: win-64, os: windows-latest }\n          # force older macos-13 to get x86_64 runners\n          - { target: osx-64, os: macos-13 }\n          - { target: osx-arm64, os: macos-14 }\n      fail-fast: false\n\n    runs-on: ${{ matrix.os }}\n    steps:\n      - uses: actions/checkout@v4\n        with:\n          fetch-depth: 2\n      - uses: prefix-dev/setup-pixi@v0.5.1\n        with:\n          pixi-version: v0.24.2\n          cache: true\n\n      - name: Run code in changed subdirectories\n        shell: bash\n        env:\n          TARGET_PLATFORM: ${{ matrix.target }}\n\n        run: |\n          pixi run rattler-build build --recipe-dir . \\\n            --skip-existing=all --target-platform=$TARGET_PLATFORM \\\n            -c conda-forge -c https://prefix.dev/rust-forge\n\n      - name: Upload all packages\n        shell: bash\n        # do not upload on PR\n        if: github.event_name == 'push'\n        env:\n          PREFIX_API_KEY: ${{ secrets.PREFIX_API_KEY }}\n        run: |\n          # ignore errors because we want to ignore duplicate packages\n          for file in output/**/*.conda; do\n            pixi run rattler-build upload prefix -c rust-forge \"$file\" || true\n          done\n
"},{"location":"tui/","title":"Terminal User Interface","text":"

rattler-build offers a terminal user interface for building multiple packages and viewing the logs.

To launch the TUI, run the build command with the --tui flag as shown below:

$ rattler-build build -r recipe.yaml --tui\n

Note

rattler-build-tui is gated behind the tui feature flag to avoid extra dependencies. Build the project with the --features tui flag to enable the TUI functionality.

"},{"location":"tui/#key-bindings","title":"Key Bindings","text":"Key Action \u23ce Build a Build all j/k Next/previous package up/down/left/right Scroll logs e Edit recipe (via $EDITOR) c, : Open command prompt (available commands: edit) q, ctrl-c, esc, Quit"},{"location":"variants/","title":"Variant configuration","text":"

rattler-build can automatically build multiple variants of a given package. For example, a Python package might need multiple variants per Python version (especially if it is a binary package such as numpy).

For this use case, one can specify variant configuration files. A variant configuration file has 2 special entries and a list of packages with variants. For example:

variants.yaml
# special entry #1, the zip keys\nzip_keys:\n- [python, numpy]\n\n# special entry #2, the pin_run_as_build key\npin_run_as_build:\n  numpy:\n    max_pin: 'x.x'\n\n# entries per package version that users are interested in\npython:\n# Note that versions are _strings_ (not numbers)\n- \"3.8\"\n- \"3.9\"\n- \"3.10\"\n\nnumpy:\n- \"1.12\"\n- \"1.12\"\n- \"1.20\"\n

If we have a recipe that has a build, host, or run dependency on python, we will build multiple variants of this package, one for each configured Python version ("3.8", "3.9" and "3.10").

For example:

# ...\nrequirements:\n  host:\n  - python\n

... will be rendered as (for the first variant):

# ...\nrequirements:\n  host:\n  - python 3.8*\n

Note that variants are only applied if the requirement doesn't specify any constraints. If the requirement would be python >3.8,<3.10 then the variant entry would be ignored.

"},{"location":"variants/#automatic-variantsyaml-discovery","title":"Automatic variants.yaml discovery","text":"

rattler-build automatically includes the variant configuration from a variants.yaml file next to a recipe. Use the --ignore-recipe-variants option to disable automatic discovery of variants.yaml files next to the recipes.

To include a variant config file from another location or include multiple configuration files use the --variant-config option:

rattler-build build --variant-config ~/user_variants.yaml --variant-config /opt/rattler-build/global_variants.yaml --recipe myrecipe.yaml\n
"},{"location":"variants/#package-hash-from-variant","title":"Package hash from variant","text":"

You might have wondered what the role of the build string is. The build string is (if not explicitly set) computed from the variant configuration. It serves as a mechanism to discern different build configurations that produce a package with the same name and version.

The hash is computed by dumping all of the variant configuration values that are used by a given recipe into a JSON file, and then hashing that JSON file.

For example, in our python example, we would get a variant configuration file that looks something like:

{\n    \"python\": \"3.8\"\n}\n

This JSON string is then hashed with the MD5 hash algorithm to produce the hash. For certain packages (such as Python packages) special rules exist, and the py<Major.Minor> version is prepended to the hash, so that the final hash looks something like py38h123123.
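
Conceptually, the computation boils down to something like this (a sketch, not the exact implementation; the build string uses a truncated form of the resulting digest):

echo -n '{\"python\": \"3.8\"}' | md5sum\n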

"},{"location":"variants/#zip-keys","title":"Zip keys","text":"

Zip keys modify how variants are combined. Usually, each variant key that has multiple entries is expanded to a build matrix. For example, if we have:

python: [\"3.8\", \"3.9\"]\nnumpy: [\"1.12\", \"1.14\"]\n

...then we obtain 4 variants for a recipe that uses both numpy and python:

- python 3.8, numpy 1.12\n- python 3.8, numpy 1.14\n- python 3.9, numpy 1.12\n- python 3.9, numpy 1.14\n

However, if we use zip_keys and specify:

zip_keys: [\"python\", \"numpy\"]\npython: [\"3.8\", \"3.9\"]\nnumpy: [\"1.12\", \"1.14\"]\n

...then the versions are \"zipped up\" and we only get 2 variants. Note that both python and numpy need to specify the exact same number of versions to make this work.

The resulting variants with the zip applied are:

- python 3.8, numpy 1.12\n- python 3.9, numpy 1.14\n
"},{"location":"variants/#pin-run-as-build","title":"Pin run as build","text":"

The pin_run_as_build key allows the user to inject additional pins. Usually, the run_exports mechanism is used to specify constraints for runtime dependencies from build time dependencies, but pin_run_as_build offers a mechanism to override that if the package does not contain a run exports file.

For example:

pin_run_as_build:\n  libcurl:\n    min_pin: 'x'\n    max_pin: 'x'\n

If we now have a recipe that uses libcurl in the host and run dependencies like:

requirements:\n  host:\n  - libcurl\n  run:\n  - libcurl\n

During resolution, libcurl might be evaluated to libcurl 8.0.1 h13284. Our new runtime dependency then looks like:

requirements:\n  host:\n  - libcurl 8.0.1 h13284\n  run:\n  - libcurl >=8,<9\n
"},{"location":"variants/#prioritizing-variants","title":"Prioritizing variants","text":"

You might produce multiple variants for a package, but want to define a priority for a given variant. The variant with the highest priority would be the default package that is selected by the resolver.

There are two mechanisms to make this possible: mutex packages and the down_prioritize_variant option in the recipe.

"},{"location":"variants/#the-down_prioritize_variant-option","title":"The down_prioritize_variant option","text":"

Note

It is not always necessary to use the down_prioritize_variant option - only if the solver has no other way to prefer a given variant. For example, if you have a package that has multiple variants for different Python versions, the solver will automatically prefer the variant with the highest Python version.

The down_prioritize_variant option allows you to specify a variant that should be down-prioritized. For example:

recipe.yaml
build:\n  variant_config:\n    use_keys:\n      # use cuda from the variant config, e.g. to build multiple CUDA variants\n      - cuda\n    # this will down-prioritize the cuda variant versus other variants of the package\n    down_prioritize_variant: ${{ 1 if cuda else 0 }}\n
"},{"location":"variants/#mutex-packages","title":"Mutex packages","text":"

Another way to make sure the right variants are selected is a "mutex" package. A mutex package exists in several mutually exclusive variants; we use the fact that only one package of a given name can be installed at a time (the solver has to choose).

A mutex package might be useful to make sure that all packages that depend on BLAS are compiled against the same BLAS implementation. In this case, the mutex package guarantees that "openblas" and "mkl" can never be installed at the same time.

We could define a BLAS mutex package like this:

variant_config.yaml
blas_variant:\n  - \"openblas\"\n  - \"mkl\"\n

And then the recipe.yaml for the mutex package could look like this:

recipe.yaml
package:\n  name: blas_mutex\n  version: 1.0\n\nbuild:\n  string: ${{ blas_variant }}${{ hash }}_${{ build_number }}\n  variant_config:\n    # make sure that `openblas` is preferred over `mkl`\n    down_prioritize_variant: ${{ 1 if blas_variant == \"mkl\" else 0 }}\n

This will create two packages: blas_mutex-1.0-openblas and blas_mutex-1.0-mkl. Only one of these packages can be installed at a time because they share the same name; the solver will select at most one of them.

The blas package in turn should have a run_export for the blas_mutex package, so that any package that links against blas also has a dependency on the correct blas_mutex package:

recipe.yaml
package:\n  name: openblas\n  version: 1.0\n\nrequirements:\n  # any package depending on openblas should also depend on the correct blas_mutex package\n  run_exports:\n    # Add a run export on _any_ version of the blas_mutex package whose build string starts with \"openblas\"\n    - blas_mutex * openblas*\n

Then the recipe of a package that wants to build two variants, one for openblas and one for mkl could look like this:

recipe.yaml
package:\n  name: fastnumerics\n  version: 1.0\n\nrequirements:\n  host:\n    # build against both openblas and mkl\n    - ${{ blas_variant }}\n  run:\n    # implicitly adds the correct blas_mutex package through run exports\n    # - blas_mutex * ${{ blas_variant }}*\n
"},{"location":"reference/cli/","title":"Command-Line Help for rattler-build","text":"

This document contains the help content for the rattler-build command-line program.

"},{"location":"reference/cli/#rattler-build","title":"rattler-build","text":"

Usage: rattler-build [OPTIONS] [COMMAND]

"},{"location":"reference/cli/#subcommands","title":"Subcommands:","text":"
  • build \u2014 Build a package from a recipe
  • test \u2014 Run a test for a single package
  • rebuild \u2014 Rebuild a package from a package file instead of a recipe
  • upload \u2014 Upload a package
  • completion \u2014 Generate shell completion script
  • generate-recipe \u2014 Generate a recipe from PyPI or CRAN
  • auth \u2014 Handle authentication to external channels
"},{"location":"reference/cli/#options","title":"Options:","text":"
  • -v, --verbose

    Increase logging verbosity

  • -q, --quiet

    Decrease logging verbosity

  • --log-style <LOG_STYLE>

    Logging style

    • Default value: fancy
    • Possible values:
      • fancy: Use fancy logging output
      • json: Use JSON logging output
      • plain: Use plain logging output
  • --color <COLOR>

    Enable or disable colored output from rattler-build. Also honors the CLICOLOR and CLICOLOR_FORCE environment variable

    • Default value: auto
    • Possible values:
      • always: Always use colors
      • never: Never use colors
      • auto: Use colors when the output is a terminal
"},{"location":"reference/cli/#build","title":"build","text":"

Build a package from a recipe

Usage: rattler-build build [OPTIONS]

"},{"location":"reference/cli/#options_1","title":"Options:","text":"
  • -r, --recipe <RECIPE>

    The recipe file or directory containing recipe.yaml. Defaults to the current directory

    • Default value: .
  • --recipe-dir <RECIPE_DIR>

    The directory that contains recipes

  • --up-to <UP_TO>

    Build recipes up to the specified package

  • --build-platform <BUILD_PLATFORM>

    The build platform to use for the build (e.g. for building with emulation, or rendering)

    • Default value: current platform
  • --target-platform <TARGET_PLATFORM>

    The target platform for the build

    • Default value: current platform
  • -c, --channel <CHANNEL>

    Add a channel to search for dependencies in

    • Default value: conda-forge
  • -m, --variant-config <VARIANT_CONFIG>

    Variant configuration files for the build

  • --ignore-recipe-variants

    Do not read the variants.yaml file next to a recipe

    • Possible values: true, false
  • --render-only

    Render the recipe files without executing the build

    • Possible values: true, false
  • --with-solve

    Render the recipe files with solving dependencies

    • Possible values: true, false
  • --keep-build

    Keep intermediate build artifacts after the build

    • Possible values: true, false
  • --no-build-id

    Don't use build id(timestamp) when creating build directory name

    • Possible values: true, false
  • --compression-threads <COMPRESSION_THREADS>

    The number of threads to use for compression (only relevant when also using --package-format conda)

  • --use-zstd

    Enable support for repodata.json.zst

    • Default value: true
    • Possible values: true, false
  • --use-bz2

    Enable support for repodata.json.bz2

    • Default value: true
    • Possible values: true, false
  • --experimental

    Enable experimental features

    • Possible values: true, false
  • --auth-file <AUTH_FILE>

    Path to an auth-file to read authentication information from

  • --tui

    Launch the terminal user interface

    • Default value: false
    • Possible values: true, false
"},{"location":"reference/cli/#modifying-result","title":"Modifying result","text":"
  • --package-format <PACKAGE_FORMAT>

    The package format to use for the build. Can be one of tar-bz2 or conda. You can also add a compression level to the package format, e.g. tar-bz2:<number> (from 1 to 9) or conda:<number> (from -7 to 22).

    • Default value: conda
  • --no-include-recipe

    Don't store the recipe in the final package

    • Possible values: true, false
  • --no-test

    Don't run the tests after building the package

    • Default value: false
    • Possible values: true, false
  • --color-build-log

    Don't force colors in the output of the build script

    • Default value: true
    • Possible values: true, false
  • --output-dir <OUTPUT_DIR>

    Output directory for build artifacts.

    • Default value: ./output
  • --skip-existing <SKIP_EXISTING>

    Whether to skip packages that already exist in any channel. If set to none, do not skip any packages (default when not specified). If set to local, only skip packages that already exist locally (default when using --skip-existing). If set to all, skip packages that already exist in any channel

    • Default value: none
    • Possible values:
      • none: Do not skip any packages
      • local: Skip packages that already exist locally
      • all: Skip packages that already exist in any channel
"},{"location":"reference/cli/#test","title":"test","text":"

Run a test for a single package

This creates a temporary directory, copies the package file into it, and then runs the indexing. It then creates a test environment that installs the package and any extra dependencies specified in the package test dependencies file.

With the activated test environment, the packaged test files are run:

  • info/test/run_test.sh (or info/test/run_test.bat on Windows)
  • info/test/run_test.py

These test files are written at \"package creation time\" and are part of the package.

Usage: rattler-build test [OPTIONS] --package-file <PACKAGE_FILE>

"},{"location":"reference/cli/#options_2","title":"Options:","text":"
  • -c, --channel <CHANNEL>

    Channels to use when testing

  • -p, --package-file <PACKAGE_FILE>

    The package file to test

  • --compression-threads <COMPRESSION_THREADS>

    The number of threads to use for compression

  • --use-zstd

    Enable support for repodata.json.zst

    • Default value: true
    • Possible values: true, false
  • --use-bz2

    Enable support for repodata.json.bz2

    • Default value: true
    • Possible values: true, false
  • --experimental

    Enable experimental features

    • Possible values: true, false
  • --auth-file <AUTH_FILE>

    Path to an auth-file to read authentication information from

"},{"location":"reference/cli/#modifying-result_1","title":"Modifying result","text":"
  • --output-dir <OUTPUT_DIR>

    Output directory for build artifacts.

    • Default value: ./output
"},{"location":"reference/cli/#rebuild","title":"rebuild","text":"

Rebuild a package from a package file instead of a recipe

Usage: rattler-build rebuild [OPTIONS] --package-file <PACKAGE_FILE>

"},{"location":"reference/cli/#options_3","title":"Options:","text":"
  • -p, --package-file <PACKAGE_FILE>

    The package file to rebuild

  • --no-test

    Do not run tests after building

    • Default value: false
    • Possible values: true, false
  • --compression-threads <COMPRESSION_THREADS>

    The number of threads to use for compression

  • --use-zstd

    Enable support for repodata.json.zst

    • Default value: true
    • Possible values: true, false
  • --use-bz2

    Enable support for repodata.json.bz2

    • Default value: true
    • Possible values: true, false
  • --experimental

    Enable experimental features

    • Possible values: true, false
  • --auth-file <AUTH_FILE>

    Path to an auth-file to read authentication information from

"},{"location":"reference/cli/#modifying-result_2","title":"Modifying result","text":"
  • --output-dir <OUTPUT_DIR>

    Output directory for build artifacts.

    • Default value: ./output
"},{"location":"reference/cli/#upload","title":"upload","text":"

Upload a package

Usage: rattler-build upload [OPTIONS] [PACKAGE_FILES]... <COMMAND>

"},{"location":"reference/cli/#subcommands_1","title":"Subcommands:","text":"
  • quetz \u2014 Upload to a Quetz server. Authentication is used from the keychain / auth-file
  • artifactory \u2014 Options for uploading to an Artifactory channel. Authentication is used from the keychain / auth-file
  • prefix \u2014 Options for uploading to a prefix.dev server. Authentication is used from the keychain / auth-file
  • anaconda \u2014 Options for uploading to an Anaconda.org server
"},{"location":"reference/cli/#arguments","title":"Arguments:","text":"
  • <PACKAGE_FILES>

    The package file to upload

"},{"location":"reference/cli/#options_4","title":"Options:","text":"
  • --use-zstd

    Enable support for repodata.json.zst

    • Default value: true
    • Possible values: true, false
  • --use-bz2

    Enable support for repodata.json.bz2

    • Default value: true
    • Possible values: true, false
  • --experimental

    Enable experimental features

    • Possible values: true, false
  • --auth-file <AUTH_FILE>

    Path to an auth-file to read authentication information from

"},{"location":"reference/cli/#modifying-result_3","title":"Modifying result","text":"
  • --output-dir <OUTPUT_DIR>

    Output directory for build artifacts.

    • Default value: ./output
"},{"location":"reference/cli/#quetz","title":"quetz","text":"

Upload to a Quetz server. Authentication is used from the keychain / auth-file

Usage: rattler-build upload quetz [OPTIONS] --url <URL> --channel <CHANNEL>

"},{"location":"reference/cli/#options_5","title":"Options:","text":"
  • -u, --url <URL>

    The URL to your Quetz server

  • -c, --channel <CHANNEL>

    The URL to your channel

  • -a, --api-key <API_KEY>

    The Quetz API key, if none is provided, the token is read from the keychain / auth-file

"},{"location":"reference/cli/#artifactory","title":"artifactory","text":"

Options for uploading to an Artifactory channel. Authentication is used from the keychain / auth-file

Usage: rattler-build upload artifactory [OPTIONS] --url <URL> --channel <CHANNEL>

"},{"location":"reference/cli/#options_6","title":"Options:","text":"
  • -u, --url <URL>

    The URL to your Artifactory server

  • -c, --channel <CHANNEL>

    The URL to your channel

  • -r, --username <USERNAME>

    Your Artifactory username

  • -p, --password <PASSWORD>

    Your Artifactory password

"},{"location":"reference/cli/#prefix","title":"prefix","text":"

Options for uploading to a prefix.dev server. Authentication is used from the keychain / auth-file

Usage: rattler-build upload prefix [OPTIONS] --channel <CHANNEL>

"},{"location":"reference/cli/#options_7","title":"Options:","text":"
  • -u, --url <URL>

    The URL to the prefix.dev server (only necessary for self-hosted instances)

    • Default value: https://prefix.dev
  • -c, --channel <CHANNEL>

    The channel to upload the package to

  • -a, --api-key <API_KEY>

    The prefix.dev API key, if none is provided, the token is read from the keychain / auth-file

"},{"location":"reference/cli/#anaconda","title":"anaconda","text":"

Options for uploading to an Anaconda.org server

Usage: rattler-build upload anaconda [OPTIONS] --owner <OWNER>

"},{"location":"reference/cli/#options_8","title":"Options:","text":"
  • -o, --owner <OWNER>

    The owner of the distribution (e.g. conda-forge or your username)

  • -c, --channel <CHANNEL>

    The channel / label to upload the package to (e.g. main / rc)

    • Default value: main
  • -a, --api-key <API_KEY>

    The Anaconda API key, if none is provided, the token is read from the keychain / auth-file

  • -u, --url <URL>

    The URL to the Anaconda server

    • Default value: https://api.anaconda.org
  • -f, --force

    Replace files on conflict

    • Default value: false
    • Possible values: true, false
"},{"location":"reference/cli/#completion","title":"completion","text":"

Generate shell completion script

Usage: rattler-build completion --shell <SHELL>

"},{"location":"reference/cli/#options_9","title":"Options:","text":"
  • -s, --shell <SHELL>

    Specifies the shell for which the completions should be generated

    • Possible values:
      • bash: Bourne Again SHell (bash)
      • elvish: Elvish shell
      • fish: Friendly Interactive SHell (fish)
      • nushell: Nushell
      • powershell: PowerShell
      • zsh: Z SHell (zsh)
"},{"location":"reference/cli/#generate-recipe","title":"generate-recipe","text":"

Generate a recipe from PyPI or CRAN

Usage: rattler-build generate-recipe <COMMAND>

"},{"location":"reference/cli/#subcommands_2","title":"Subcommands:","text":"
  • pypi \u2014 Generate a recipe for a Python package from PyPI
  • cran \u2014 Generate a recipe for an R package from CRAN
"},{"location":"reference/cli/#pypi","title":"pypi","text":"

Generate a recipe for a Python package from PyPI

Usage: rattler-build generate-recipe pypi [OPTIONS] <PACKAGE>

"},{"location":"reference/cli/#arguments_1","title":"Arguments:","text":"
  • <PACKAGE>

    Name of the package to generate

"},{"location":"reference/cli/#options_10","title":"Options:","text":"
  • -w, --write

    Whether to write the recipe to a folder

    • Possible values: true, false
  • -u, --use-mapping

    Whether to use the conda-forge PyPI name mapping

    • Default value: true
    • Possible values: true, false
  • -t, --tree

    Whether to generate recipes for all dependencies

    • Possible values: true, false
"},{"location":"reference/cli/#cran","title":"cran","text":"

Generate a recipe for an R package from CRAN

Usage: rattler-build generate-recipe cran [OPTIONS] <PACKAGE>

"},{"location":"reference/cli/#arguments_2","title":"Arguments:","text":"
  • <PACKAGE>

    Name of the package to generate

"},{"location":"reference/cli/#options_11","title":"Options:","text":"
  • -u, --universe <UNIVERSE>

    The R Universe to fetch the package from (defaults to cran)

  • -t, --tree

    Whether to create recipes for the whole dependency tree or not

    • Possible values: true, false
  • -w, --write

    Whether to write the recipe to a folder

    • Possible values: true, false
"},{"location":"reference/cli/#auth","title":"auth","text":"

Handle authentication to external channels

Usage: rattler-build auth <COMMAND>

"},{"location":"reference/cli/#subcommands_3","title":"Subcommands:","text":"
  • login \u2014 Store authentication information for a given host
  • logout \u2014 Remove authentication information for a given host
"},{"location":"reference/cli/#login","title":"login","text":"

Store authentication information for a given host

Usage: rattler-build auth login [OPTIONS] <HOST>

"},{"location":"reference/cli/#arguments_3","title":"Arguments:","text":"
  • <HOST>

    The host to authenticate with (e.g. repo.prefix.dev)

"},{"location":"reference/cli/#options_12","title":"Options:","text":"
  • --token <TOKEN>

    The token to use (for authentication with prefix.dev)

  • --username <USERNAME>

    The username to use (for basic HTTP authentication)

  • --password <PASSWORD>

    The password to use (for basic HTTP authentication)

  • --conda-token <CONDA_TOKEN>

    The token to use on anaconda.org / quetz authentication

"},{"location":"reference/cli/#logout","title":"logout","text":"

Remove authentication information for a given host

Usage: rattler-build auth logout <HOST>

"},{"location":"reference/cli/#arguments_4","title":"Arguments:","text":"
  • <HOST>

    The host to remove authentication for

This document was generated automatically by clap-markdown.

"},{"location":"reference/jinja/","title":"Jinja","text":"

rattler-build comes with a couple of useful Jinja functions and filters that can be used in the recipe.

"},{"location":"reference/jinja/#functions","title":"Functions","text":""},{"location":"reference/jinja/#the-compiler-function","title":"The compiler function","text":"

The compiler function can be used to put together a compiler that works for the current platform and the compilation \"target_platform\". The syntax looks like: ${{ compiler('c') }} where 'c' signifies the programming language that is used.

This function evaluates to <compiler>_<target_platform> <compiler_version>. For example, when compiling on linux for the linux-64 target platform, this function evaluates to gcc_linux-64.

The values can be influenced by the variant_configuration. The <lang>_compiler and <lang>_compiler_version variables are the keys with influence. See below for an example:

"},{"location":"reference/jinja/#usage-in-a-recipe","title":"Usage in a recipe","text":"recipe.yaml
requirements:\n  build:\n    - ${{ compiler('c') }}\n

With a corresponding variant_configuration:

variant_configuration.yaml
c_compiler:\n- clang\nc_compiler_version:\n- 9.0\n

The variables shown above would select the clang compiler in version 9.0. Note that the final output will still contain the target_platform, so that the full compiler will read clang_linux-64 9.0 when compiling with --target-platform linux-64.

rattler-build defines some default compilers for the following languages (inherited from conda-build):

  • c: gcc on Linux, clang on osx and vs2017 on Windows
  • cxx: gxx on Linux, clangxx on osx and vs2017 on Windows
  • fortran: gfortran on Linux, gfortran on osx and vs2017 on Windows
  • rust: rust
"},{"location":"reference/jinja/#the-stdlib-function","title":"The stdlib function","text":"

The stdlib function closely mirrors the compiler function. It can be used to put together a standard library that works for the current platform and the compilation \"target_platform\".

Usage: ${{ stdlib('c') }}

Results in <stdlib>_<target_platform> <stdlib_version>. And uses the variant variables <lang>_stdlib and <lang>_stdlib_version to influence the output.

"},{"location":"reference/jinja/#usage-in-a-recipe_1","title":"Usage in a recipe:","text":"recipe.yaml
requirements:\n  build:\n    # these are usually paired!\n    - ${{ compiler('c') }}\n    - ${{ stdlib('c') }}\n

With a corresponding variant_configuration:

variant_configuration.yaml
# these are the values `conda-forge` uses in their pinning file\n# found at https://github.com/conda-forge/conda-forge-pinning-feedstock/blob/main/recipe/conda_build_config.yaml\nc_stdlib:\n- sysroot\nc_stdlib_version:\n- 2.17\n
"},{"location":"reference/jinja/#the-pin-functions","title":"The pin functions","text":"

A pin is created based on the version input (from a subpackage or a package resolution).

The pin functions take the following arguments:

  • lower_bound (default: \"x.x.x.x.x.x\"): The lower bound pin expression to be used. When set to None, no lower bound is set.
  • upper_bound (default: \"x\"): The maximum pin to be used. When set to None, no upper bound is set.

The lower bound and upper bound can either be a \"pin expression\" (only x and . are allowed) or a hard-coded version string.

A \"pin expression\" is applied to the version input to create the lower and upper bounds. For example, if the version is 3.10.5 with a lower_bound=\"x.x\", upper_bound=\"x.x.x\", the lower bound will be 3.10 and the upper bound will be 3.10.6.0a0. A pin expression for the upper_bound will increment the last selected segment of the version by 1, and append .0a0 to the end to prevent any alpha versions from being selected.

If the last segment of the version contains a letter (e.g. 9e or 1.1.1j), then incrementing the version will set that letter to a: 9e becomes 10a, and 1.1.1j becomes 1.1.2a. In this case, no .0a0 is appended to the end either.
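
For example, assuming the resolved version of the (illustrative) openssl package is 1.1.1j, a minimal sketch:

requirements:\n  run:\n    # the default lower_bound keeps the full version; upper_bound=\"x.x.x\" increments\n    # the last segment, so this renders to: openssl >=1.1.1j,<1.1.2a\n    - ${{ pin_compatible('openssl', upper_bound='x.x.x') }}\n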

Sometimes you want to strongly connect your outputs. This can be achieved with the following input:

  • exact=True (default: False): This will pin the version exactly to the version of the output, incl. the build string.

To override the lower or upper bound with a hard-coded value, pass a version string instead of a pin expression:

  • lower_bound: overrides the lower bound with the given version.
  • upper_bound: overrides the upper bound with the given version.

In this case, both lower_bound and upper_bound expect a valid version string (e.g. 1.2.3).

"},{"location":"reference/jinja/#the-pin_subpackage-function","title":"The pin_subpackage function","text":"
  • ${{ pin_subpackage(\"mypkg\", lower_bound=\"x.x\", upper_bound=\"x.x\") }} creates a pin to another output in the recipe. With an input of 3.1.5, this would create a pin of mypkg >=3.1,<3.2.0a0.
  • ${{ pin_subpackage(\"other_output\", exact=True) }} creates a pin to another output in the recipe with an exact version.
  • ${{ pin_subpackage(\"other_output\", lower_bound=\"1.2.3\", upper_bound=\"1.2.4\") }} creates a pin to another output in the recipe with a lower bound of 1.2.3 and an upper bound of 1.2.4. This is equivalent to writing other_output >=1.2.3,<1.2.4.
"},{"location":"reference/jinja/#the-pin_compatible-function","title":"The pin_compatible function","text":"

The pin_compatible function works just like the pin_subpackage function, but instead of pinning another output of the recipe, it pins a package in the run requirements based on the package resolved in the host or build section.
"},{"location":"reference/jinja/#the-cdt-function","title":"The cdt function","text":"
  • ${{ cdt(\"mypkg\") }} creates a Core Dependency Tree (CDT) package dependency for the current architecture.

This function helps add Core Dependency Tree packages as dependencies by converting packages as required according to hard-coded logic. See below for an example of how this function can be used:

# on x86_64 system\ncdt('package-name') # outputs: package-name-cos6-x86_64\n# on aarch64 system\ncdt('package-name') # outputs: package-name-cos6-aarch64\n
"},{"location":"reference/jinja/#the-hash-variable","title":"The hash variable","text":"
  • ${{ hash }} is the variant hash and is useful in the build string computation.
"},{"location":"reference/jinja/#the-version_to_buildstring-function","title":"The version_to_buildstring function","text":"
  • ${{ python | version_to_buildstring }} converts a version from the variant to a build string (it takes only the first two segments of the version and removes the . characters).
"},{"location":"reference/jinja/#the-env-object","title":"The env object","text":"

You can use the env object to retrieve environment variables and forward them to your build script. ${{ env.get(\"MY_ENV_VAR\") }} will return the value of the environment variable MY_ENV_VAR or throw an error if it is not set.

To supply a default value when the environment variable is not set, you can use ${{ env.get(\"MY_ENV_VAR\", default=\"default_value\") }}. In this case, if MY_ENV_VAR is not set, the value default_value will be returned (and no error is thrown).

You can also check for the existence of an environment variable:

  • ${{ env.exists(\"MY_ENV_VAR\") }} will return true if the environment variable MY_ENV_VAR is set and false otherwise.
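
Putting these together, a minimal sketch of an env-driven context value (the GIT_DESCRIBE_TAG variable name is illustrative):

context:\n  # falls back to \"0.0.0\" when the environment variable is not set\n  version: ${{ env.get(\"GIT_DESCRIBE_TAG\", default=\"0.0.0\") }}\n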
"},{"location":"reference/jinja/#filters","title":"Filters","text":"

Jinja has a feature called \"filters\": these are functions that can be applied to variables in a template expression.

The syntax for a filter is {{ variable | filter_name }}. A filter can also take arguments, such as ... | replace('foo', 'bar').

The following Jinja filters are available, taken from the upstream minijinja library:

  • replace: replace a string with another string (e.g. \"{{ 'foo' | replace('oo', 'aa') }}\" will return \"faa\")
  • lower: convert a string to lowercase (e.g. \"{{ 'FOO' | lower }}\" will return \"foo\")
  • upper: convert a string to uppercase (e.g. \"{{ 'foo' | upper }}\" will return \"FOO\")
  • int: convert a string to an integer (e.g. \"{{ '42' | int }}\" will return 42)
  • abs: return the absolute value of a number (e.g. \"{{ -42 | abs }}\" will return 42)
  • bool: convert a value to a boolean (e.g. \"{{ 'foo' | bool }}\" will return true)
  • default: return a default value if the value is falsy (e.g. \"{{ '' | default('foo') }}\" will return \"foo\")
  • first: return the first element of a list (e.g. \"{{ [1, 2, 3] | first }}\" will return 1)
  • last: return the last element of a list (e.g. \"{{ [1, 2, 3] | last }}\" will return 3)
  • length: return the length of a list (e.g. \"{{ [1, 2, 3] | length }}\" will return 3)
  • list: convert a string to a list (e.g. \"{{ 'foo' | list }}\" will return ['f', 'o', 'o'])
  • join: join a list with a separator (e.g. \"{{ [1, 2, 3] | join('.') }}\" will return \"1.2.3\")
  • min: return the minimum value of a list (e.g. \"{{ [1, 2, 3] | min }}\" will return 1)
  • max: return the maximum value of a list (e.g. \"{{ [1, 2, 3] | max }}\" will return 3)
  • reverse: reverse a list (e.g. \"{{ [1, 2, 3] | reverse }}\" will return [3, 2, 1])
  • slice: slice a list (e.g. \"{{ [1, 2, 3] | slice(1, 2) }}\" will return [2])
  • batch: works like slice, but the other way round: it returns a list of lists with the given number of items per sub-list (e.g. \"{{ [1, 2, 3] | batch(2) }}\" will return [[1, 2], [3]]). An optional second parameter is used to fill up missing items.
  • sort: sort a list (e.g. \"{{ [3, 1, 2] | sort }}\" will return [1, 2, 3])
  • trim: remove leading and trailing whitespace from a string (e.g. \"{{ ' foo ' | trim }}\" will return \"foo\")
  • unique: remove duplicates from a list (e.g. \"{{ [1, 2, 1, 3] | unique }}\" will return [1, 2, 3])
  • split: split a string into a list (e.g. \"{{ '1.2.3' | split('.') }}\" will return ['1', '2', '3']). By default, splits on whitespace.
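
Filters can also be chained. A minimal sketch that derives a major version in the context section, using only the split and first filters from the list above:

context:\n  version: \"1.2.3\"\n  # split on \".\" and take the first element, yielding \"1\"\n  major_version: ${{ version | split('.') | first }}\n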
Removed filters

The following filters are removed from the builtins:

  • attr
  • indent
  • select
  • selectattr
  • dictsort
  • reject
  • rejectattr
  • round
  • map
  • title
  • capitalize
  • urlencode
  • escape
  • pprint
  • safe
  • items
  • float
  • tojson
"},{"location":"reference/jinja/#extra-filters-for-recipes","title":"Extra filters for recipes","text":""},{"location":"reference/jinja/#the-version_to_buildstring-filter","title":"The version_to_buildstring filter","text":"
  • ${{ python | version_to_buildstring }} converts a version from the variant to a build string (it takes only the first two segments of the version and removes the . characters).

For example, the following:

context:\n  cuda: \"11.2.0\"\n\nbuild:\n  string: ${{ hash }}_cuda${{ cuda | version_to_buildstring }}\n

This would evaluate to abc123_cuda112 (assuming the hash was abc123).

"},{"location":"reference/jinja/#various-remarks","title":"Various remarks","text":""},{"location":"reference/jinja/#inline-conditionals-with-jinja","title":"Inline conditionals with Jinja","text":"

The new recipe format allows for inline conditionals with Jinja. If they are falsy and no else branch exists, they render to an empty string (which is, for example in a list or dictionary, equivalent to a YAML null).

When a recipe is rendered, all values that are null must be filtered from the resulting YAML.

requirements:\n  host:\n    - ${{ \"numpy\" if cuda == \"yes\" }}\n

If cuda is not equal to yes, the first item of the host requirements will be empty (null) and thus filtered from the final list.
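
Rendered with cuda not equal to \"yes\", the null entry is dropped and the list collapses (a sketch of the resulting YAML):

requirements:\n  host: []\n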

This must also work for dictionary values. For example:

build:\n  number: ${{ 100 if cuda == \"yes\" }}\n  # or an `else` branch can be used, of course\n  number: ${{ 100 if cuda == \"yes\" else 0 }}\n
"},{"location":"reference/recipe_file/","title":"The recipe spec","text":"

rattler-build implements a new recipe spec, different from the traditional \"meta.yaml\" file used in conda-build. A recipe has to be stored as a recipe.yaml file.

"},{"location":"reference/recipe_file/#history","title":"History","text":"

A discussion was started on what a new recipe spec could or should look like. The fragments of this discussion can be found here.

The reasons for a new spec are:

  • make it easier to parse (i.e. \"pure YAML\"); conda-build uses a mix of comments and Jinja to achieve a great deal of flexibility, but it's hard to parse the recipe with a computer
  • iron out some inconsistencies around multiple outputs (build vs. build/script and more)
  • remove any need for recursive parsing & solving
  • finally, the initial implementation in boa relied on conda-build; rattler-build removes any dependency on Python or conda-build and reimplements everything in Rust
"},{"location":"reference/recipe_file/#major-differences-from-conda-build","title":"Major differences from conda-build","text":"
  • recipe filename is recipe.yaml, not meta.yaml
  • outputs have less complicated behavior; keys are the same as in the top-level recipe (e.g. build/script instead of just script, and package/name instead of just name)
  • no implicit meta-packages in outputs
  • no full Jinja2 support: no conditionals or {% set ... %} blocks, only string interpolation; variables can be set in the top-level \"context\" section, which is valid YAML
  • Jinja string interpolation needs to be preceded by a dollar sign at the beginning of a string, e.g. - ${{ version }} in order for it to be valid YAML
  • selectors use a YAML dictionary style (vs. comments in conda-build). Instead of - somepkg #[osx] we use:
    if: osx\nthen:\n  - somepkg\n
  • skip instruction uses a list of skip conditions and not the selector syntax from conda-build (e.g. skip: [\"osx\", \"win and py37\"])
"},{"location":"reference/recipe_file/#spec","title":"Spec","text":"

The recipe spec has the following parts:

  • context: to set up variables that can later be used in Jinja string interpolation
  • package: defines name, version etc. of the top-level package
  • source: points to the sources that need to be downloaded in order to build the recipe
  • build: defines how to build the recipe and what build number to use
  • requirements: defines requirements of the top-level package
  • test: defines tests for the top-level package
  • outputs: a recipe can have multiple outputs. Each output can and should have a package, requirements and test section
"},{"location":"reference/recipe_file/#spec-reference","title":"Spec reference","text":"

The spec is also made available through a JSON Schema (which is used for validation). The schema (and pydantic source file) can be found in this repository: recipe-format

To use it with VSCode (yaml-plugin) and other IDEs:

Either start the document with the following line:

# yaml-language-server: $schema=https://raw.githubusercontent.com/prefix-dev/recipe-format/main/schema.json\n
Or, using yaml.schemas,
yaml.schemas: {\n  \"https://raw.githubusercontent.com/prefix-dev/recipe-format/main/schema.json\": \"**/recipe.yaml\",\n}\n
Read more about this here.

See more in the automatic linting chapter.

"},{"location":"reference/recipe_file/#examples","title":"Examples","text":"recipe.yaml
# this sets up \"context variables\" (in this case name and version) that\n# can later be used in Jinja expressions\ncontext:\n  version: 1.1.0\n  name: imagesize\n\n# top level package information (name and version)\npackage:\n  name: ${{ name }}\n  version: ${{ version }}\n\n# location to get the source from\nsource:\n  url: https://pypi.io/packages/source/${{ name[0] }}/${{ name }}/${{ name }}-${{ version }}.tar.gz\n  sha256: f3832918bc3c66617f92e35f5d70729187676313caa60c187eb0f28b8fe5e3b5\n\n# build number (should be incremented if a new build is made, but version is not incrementing)\nbuild:\n  number: 1\n  script: python -m pip install --no-deps --ignore-installed .\n\n# the requirements at build and runtime\nrequirements:\n  host:\n    - python\n    - pip\n  run:\n    - python\n\n# tests to validate that the package works as expected\ntests:\n  - python:\n      imports:\n        - imagesize\n\n# information about the package\nabout:\n  homepage: https://github.com/shibukawa/imagesize_py\n  license: MIT\n  summary: 'Getting image size from png/jpeg/jpeg2000/gif file'\n  description: |\n    This module analyzes jpeg/jpeg2000/png/gif image header and\n    return image size.\n  repository: https://github.com/shibukawa/imagesize_py\n  documentation: https://pypi.python.org/pypi/imagesize\n\n# the below is conda-forge specific!\nextra:\n  recipe-maintainers:\n    - somemaintainer\n
"},{"location":"reference/recipe_file/#package-section","title":"Package section","text":"

Specifies package information.

package:\n  name: bsdiff4\n  version: \"2.1.4\"\n
  • name: The lower case name of the package. It may contain \"-\", but no spaces.
  • version: The version number of the package. Use the PEP-386 verlib conventions. Cannot contain \"-\". YAML interprets version numbers such as 1.0 as floats, meaning that 0.10 will be the same as 0.1. To avoid this, put the version number in quotes so that it is interpreted as a string.
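
The quoting pitfall from the version bullet above, as a minimal sketch (the package name is illustrative):

package:\n  name: mypkg\n  # unquoted, YAML would parse 0.10 as the float 0.1\n  version: \"0.10\"\n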
"},{"location":"reference/recipe_file/#source-section","title":"Source section","text":"

Specifies where the source code of the package is coming from. The source may come from a tarball file, git, hg, or svn. It may be a local path and it may contain patches.

"},{"location":"reference/recipe_file/#source-from-tarball-or-zip-archive","title":"Source from tarball or zip archive","text":"
source:\n  url: https://pypi.python.org/packages/source/b/bsdiff4/bsdiff4-1.1.4.tar.gz\n  md5: 29f6089290505fc1a852e176bd276c43\n  sha1: f0a2c9a30073449cfb7d171c57552f3109d93894\n  sha256: 5a022ff4c1d1de87232b1c70bde50afbb98212fd246be4a867d8737173cf1f8f\n

If an extracted archive contains only 1 folder at its top level, its contents will be moved 1 level up, so that the extracted package contents sit in the root of the work folder.

"},{"location":"reference/recipe_file/#source-from-git","title":"Source from git","text":"
source:\n  git: https://github.com/ilanschnell/bsdiff4.git\n  # branch: master # note: defaults to fetching the repo's default branch\n

You can use rev to pin the commit version directly:

source:\n  git: https://github.com/ilanschnell/bsdiff4.git\n  rev: \"50a1f7ed6c168eb0815d424cba2df62790f168f0\"\n

Or you can use the tag:

source:\n  git: https://github.com/ilanschnell/bsdiff4.git\n  tag: \"1.1.4\"\n

git can also be a relative path to the recipe directory:

source:\n  git: ../../bsdiff4/.git\n  tag: \"1.1.4\"\n

Furthermore, if you want to fetch just the current \"HEAD\" (this may result in non-deterministic builds), then you can use depth.

source:\n  git: https://github.com/ilanschnell/bsdiff4.git\n  depth: 1 # note: the behaviour defaults to -1\n

Note: a tag or rev may not be reachable within the fetched commit depth; therefore, combining rev or tag with depth is only allowed when depth is set to -1.

source:\n  git: https://github.com/ilanschnell/bsdiff4.git\n  tag: \"1.1.4\"\n  depth: 1 # error: use of `depth` with `tag` (or `rev`) is invalid, they are mutually exclusive\n

When you want to use git-lfs, you need to set lfs: true. This will also pull the lfs files from the repository.

source:\n  git: ../../bsdiff4/.git\n  tag: \"1.1.4\"\n  lfs: true # note: defaults to false\n
"},{"location":"reference/recipe_file/#source-from-a-local-path","title":"Source from a local path","text":"

If the path is relative, it is taken relative to the recipe directory. The source is copied to the work directory before building.

  source:\n    path: ../src\n    use_gitignore: false # note: defaults to true\n

By default, all files in the local path that are ignored by git are also ignored by rattler-build. You can disable this behavior by setting use_gitignore to false.

"},{"location":"reference/recipe_file/#patches","title":"Patches","text":"

Patches may optionally be applied to the source.

  source:\n    #[source information here]\n    patches:\n      - my.patch # the patch file is expected to be found in the recipe\n
"},{"location":"reference/recipe_file/#destination-path","title":"Destination path","text":"

Within rattler-build's work directory, you may specify a particular folder to place the source into. rattler-build will always drop you into the same folder ([build folder]/work), but it's up to you whether you want your source extracted into that folder, or nested deeper. This feature is particularly useful when dealing with multiple sources, but can apply to recipes with single sources as well.

source:\n  #[source information here]\n  target_directory: my-destination/folder\n
"},{"location":"reference/recipe_file/#source-from-multiple-sources","title":"Source from multiple sources","text":"

Some software is most easily built by aggregating several pieces.

The syntax is a list of source dictionaries. Each member of this list follows the same rules as the single source. All features for each member are supported.

Example:

source:\n  - url: https://package1.com/a.tar.bz2\n    target_directory: stuff\n  - url: https://package1.com/b.tar.bz2\n    target_directory: stuff\n  - git: https://github.com/mamba-org/boa\n    target_directory: boa\n

Here, the two URL tarballs will go into one folder, and the git repo is checked out into its own space. git will not clone into a non-empty folder.

"},{"location":"reference/recipe_file/#build-section","title":"Build section","text":"

Specifies build information.

Each field that expects a path can also handle a glob pattern. The matching is performed from the top of the build environment, so to match files inside your project you can use a pattern similar to the following one: \"**/myproject/**/*.txt\". This pattern will match any .txt file found in your project. Quotation marks (\"\") are required for patterns that start with a *.

Recursive globbing using ** is also supported.

"},{"location":"reference/recipe_file/#build-number-and-string","title":"Build number and string","text":"

The build number should be incremented for new builds of the same version. The number defaults to 0. The build string cannot contain \"-\". If not set, the string defaults to the standard rattler-build build string plus the build number.

build:\n  number: 1\n  string: abc\n
"},{"location":"reference/recipe_file/#dynamic-linking","title":"Dynamic linking","text":"

This section contains settings for the shared libraries and executables.

build:\n  dynamic_linking:\n    rpath_allowlist: [\"/usr/lib/**\"]\n
"},{"location":"reference/recipe_file/#python-entry-points","title":"Python entry points","text":"

The following example creates a Python entry point named \"bsdiff4\" that calls bsdiff4.cli.main_bsdiff4().

build:\n  python:\n    entry_points:\n      - bsdiff4 = bsdiff4.cli:main_bsdiff4\n      - bspatch4 = bsdiff4.cli:main_bspatch4\n
"},{"location":"reference/recipe_file/#script","title":"Script","text":"

By default, rattler-build uses a build.sh file on Unix (macOS and Linux) and a build.bat file on Windows, if they exist in the same folder as the recipe.yaml file. With the script parameter you can either supply a different filename or write out short build scripts. You may need to use selectors to use different scripts for different platforms.

build:\n  # A very simple build script\n  script: pip install .\n\n  # The build script can also be a list\n  script:\n    - pip install .\n    - echo \"hello world\"\n    - if: unix\n      then:\n        - echo \"unix\"\n
"},{"location":"reference/recipe_file/#skipping-builds","title":"Skipping builds","text":"

Lists conditions under which rattler-build should skip the build of this recipe. Particularly useful for defining recipes that are platform-specific. By default, a build is never skipped.

build:\n  skip:\n    - win\n    ...\n
"},{"location":"reference/recipe_file/#architecture-independent-packages","title":"Architecture-independent packages","text":"

Allows you to specify \"no architecture\" when building a package, thus making it compatible with all platforms and architectures. Architecture-independent packages can be installed on any platform.

Setting the noarch key to generic tells conda not to attempt any manipulation of the contents.

build:\n  noarch: generic\n

noarch: generic is most useful for packages such as static JavaScript assets and source archives. For pure Python packages that can run on any Python version, you can use the noarch: python value instead:

build:\n  noarch: python\n

Note

At the time of this writing, noarch packages should not make use of preprocessing selectors: noarch packages are built with the directives that evaluate to true on the platform they are built on, which will likely result in incorrect or incomplete installations on other platforms.

"},{"location":"reference/recipe_file/#include-build-recipe","title":"Include build recipe","text":"

The recipe and rendered recipe.yaml file are included in the package metadata by default. You can disable this by passing --no-include-recipe on the command line.

Note

There are many more options in the build section. These additional options control how variants are computed, prefix replacements, and more. See the full build options for more information.

"},{"location":"reference/recipe_file/#requirements-section","title":"Requirements section","text":"

Specifies the build and runtime requirements. Dependencies of these requirements are included automatically.

Versions for requirements must follow the conda/mamba match specification. See build-version-spec.

"},{"location":"reference/recipe_file/#build","title":"Build","text":"

Tools required to build the package.

These packages are run on the build system and include things such as version control systems (git, svn), make tools (GNU make, Autotools, CMake), compilers (real cross, pseudo-cross, or native when not cross-compiling), and any source pre-processors.

Packages which provide \"sysroot\" files, like the CDT packages (see below), also belong in the build section.

requirements:\n  build:\n    - git\n    - cmake\n
"},{"location":"reference/recipe_file/#host","title":"Host","text":"

Represents packages that need to be specific to the target platform when the target platform is not necessarily the same as the native build platform. For example, in order for a recipe to be \"cross-capable\", shared libraries requirements must be listed in the host section, rather than the build section, so that the shared libraries that get linked are ones for the target platform, rather than the native build platform. You should also include the base interpreter for packages that need one. In other words, a Python package would list python here and an R package would list mro-base or r-base.

requirements:\n  build:\n    - ${{ compiler('c') }}\n    - if: linux\n      then:\n        - ${{ cdt('xorg-x11-proto-devel') }}\n  host:\n    - python\n

Note

When both \"build\" and \"host\" sections are defined, the build section can be thought of as \"build tools\" - things that run on the native platform, but output results for the target platform (e.g. a cross-compiler that runs on linux-64, but targets linux-armv7).

The PREFIX environment variable points to the host prefix. With respect to activation during builds, both the host and build environments are activated. The build prefix is activated before the host prefix so that the host prefix has priority over the build prefix. Executables that don't exist in the host prefix should be found in the build prefix.

The build and host prefixes are always separate when both are defined, or when ${{ compiler() }} Jinja2 functions are used. The only time that build and host are merged is when the host section is absent and no ${{ compiler() }} Jinja2 functions are used in recipe.yaml.

"},{"location":"reference/recipe_file/#run","title":"Run","text":"

Packages required to run the package.

These are the dependencies that are installed automatically whenever the package is installed. Package names should follow the package match specifications.

requirements:\n  run:\n    - python\n    - six >=1.8.0\n

To build a recipe against different versions of NumPy and ensure that each version is part of the package dependencies, list numpy as a requirement in recipe.yaml and use a conda_build_config.yaml file with multiple NumPy versions.
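
A minimal sketch of such a variant file (the NumPy versions shown are illustrative):

conda_build_config.yaml
numpy:\n- \"1.26\"\n- \"2.0\"\n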

"},{"location":"reference/recipe_file/#run-constraints","title":"Run constraints","text":"

Packages that are optional at runtime but must obey the supplied additional constraint if they are installed.

Package names should follow the package match specifications.

requirements:\n  run_constraints:\n    - optional-subpackage ==${{ version }}\n

For example, let's say we have an environment that has package \"a\" installed at version 1.0. If we install package \"b\" that has a run_constraints entry of \"a >1.0\", then mamba would need to upgrade \"a\" in the environment in order to install \"b\".

This is especially useful in the context of virtual packages, where the run_constraints dependency is not a package that mamba manages, but rather a virtual package that represents a system property that mamba can't change. For example, a package on Linux may impose a run_constraints dependency on __glibc >=2.12. This is the version bound consistent with CentOS 6. Software built against glibc 2.12 will be compatible with CentOS 6. This run_constraints dependency helps mamba, conda or pixi tell the user that a given package can't be installed if their system glibc version is too old.
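
A minimal sketch of the virtual-package constraint described above:

requirements:\n  run_constraints:\n    - __glibc >=2.12\n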

"},{"location":"reference/recipe_file/#run-exports","title":"Run exports","text":"

Packages may have runtime requirements, such as shared libraries (e.g. zlib), which are required for linking at build time and for resolving the link at run time. Such packages use run_exports to define their runtime requirements, so that dependent packages pick them up automatically.

Example from zlib:

  requirements:\n    run_exports:\n      - ${{ pin_subpackage('libzlib', exact=True) }}\n

Run exports are weak by default, but you can also define strong run_exports:

  requirements:\n    run_exports:\n      strong:\n        - ${{ pin_subpackage('libzlib', exact=True) }}\n
"},{"location":"reference/recipe_file/#ignore-run-exports","title":"Ignore run exports","text":"

There may be cases where an upstream package has a problematic run_exports constraint. You can ignore it in your recipe by listing the upstream package name in the ignore_run_exports section of requirements.

You can ignore them by package name, or by naming the runtime dependency directly.

  requirements:\n    ignore_run_exports:\n      from_package:\n        - zlib\n

Using a runtime dependency name:

  requirements:\n    ignore_run_exports:\n      by_name:\n        - libzlib\n

Note

ignore_run_exports only applies to runtime dependencies coming from an upstream package.

"},{"location":"reference/recipe_file/#tests-section","title":"Tests section","text":"

rattler-build supports four different types of tests. The \"script test\" installs the package and runs a list of commands. The \"Python test\" attempts to import a list of Python modules and runs pip check. The \"downstream test\" runs the tests of a downstream package that reverse depends on the package being built. And lastly, the \"package content test\" checks if the built package contains the mentioned items.

The tests section is a list of these items:

tests:\n  - script:\n      - echo \"hello world\"\n    requirements:\n      run:\n        - pytest\n    files:\n      source:\n        - test-data.txt\n\n  - python:\n      imports:\n        - bsdiff4\n      pip_check: true  # this is the default\n  - downstream: numpy\n
"},{"location":"reference/recipe_file/#script-test","title":"Script test","text":"

The script test has 3 top-level keys: script, files and requirements. Only the script key is required.

"},{"location":"reference/recipe_file/#test-commands","title":"Test commands","text":"

Commands that are run as part of the test.

tests:\n  - script:\n      - echo \"hello world\"\n      - bsdiff4 -h\n      - bspatch4 -h\n
"},{"location":"reference/recipe_file/#extra-test-files","title":"Extra test files","text":"

Test files that are copied from the source work directory into the temporary test directory and are needed during testing (note that the source work directory is otherwise not available at all during testing).

You can also include files that come from the recipe folder. They are copied into the test directory as well.

At test execution time, the test directory is the current working directory.

tests:\n  - script:\n      - ls\n    files:\n      source:\n        - myfile.txt\n        - tests/\n        - some/directory/pattern*.sh\n      recipe:\n        - extra-file.txt\n
"},{"location":"reference/recipe_file/#test-requirements","title":"Test requirements","text":"

In addition to the runtime requirements, you can specify requirements needed during testing. The runtime requirements that you specified in the \"run\" section described above are automatically included during testing (because the built package is installed as it regularly would be).

In the build section you can specify additional requirements that are only needed on the build system for cross-compilation (e.g. emulators or compilers).

tests:\n  - script:\n      - echo \"hello world\"\n    requirements:\n      build:\n        - myemulator\n      run:\n        - nose\n
"},{"location":"reference/recipe_file/#python-tests","title":"Python tests","text":"

For this test type you can list a set of Python modules that need to be importable. The test will fail if any of the modules cannot be imported.

The test will also automatically run pip check to check for any broken dependencies. This can be disabled by setting pip_check: false in the YAML.

tests:\n  - python:\n      imports:\n        - bsdiff4\n        - bspatch4\n      pip_check: true  # can be left out because this is the default\n

Internally this will write a small Python script that imports the modules:

import bsdiff4\nimport bspatch4\n
"},{"location":"reference/recipe_file/#check-for-package-contents","title":"Check for package contents","text":"

Checks if the built package contains the mentioned items. These checks are executed directly at the end of the build process to make sure that all expected files are present in the package.

tests:\n  - package_contents:\n      # checks for the existence of files inside $PREFIX or %PREFIX%\n      # or, checks that there is at least one file matching the specified `glob`\n      # pattern inside the prefix\n      files:\n        - etc/libmamba/test.txt\n        - etc/libmamba\n        - etc/libmamba/*.mamba.txt\n\n      # checks for the existence of `mamba/api/__init__.py` inside of the\n      # Python site-packages directory (note: also see Python import checks)\n      site_packages:\n        - mamba.api\n\n\n      # looks in $PREFIX/bin/mamba for unix and %PREFIX%\\Library\\bin\\mamba.exe on Windows\n      # note: also check the `commands` and execute something like `mamba --help` to make\n      # sure things work fine\n      bin:\n        - mamba\n\n      # searches for `$PREFIX/lib/libmamba.so` or `$PREFIX/lib/libmamba.dylib` on Linux or macOS,\n      # on Windows for %PREFIX%\\Library\\lib\\mamba.dll & %PREFIX%\\Library\\bin\\mamba.bin\n      lib:\n        - mamba\n\n      # searches for `$PREFIX/include/libmamba/mamba.hpp` on unix, and\n      # on Windows for `%PREFIX%\\Library\\include\\libmamba\\mamba.hpp`\n      include:\n        - libmamba/mamba.hpp\n
"},{"location":"reference/recipe_file/#downstream-tests","title":"Downstream tests","text":"

Warning

Downstream tests are not yet implemented in rattler-build.

A downstream test can mention a single package that has a dependency on the package being built. The test will install the package and run the tests of the downstream package with our current package as a dependency.

Sometimes downstream packages do not resolve. In this case, the test is ignored.

tests:\n  - downstream: numpy\n
"},{"location":"reference/recipe_file/#outputs-section","title":"Outputs section","text":"

Explicitly specifies packaging steps. This section supports multiple outputs, as well as different package output types. The format is a list of mappings.

When using multiple outputs, certain top-level keys are \"forbidden\": package and requirements. Instead of package, a top-level recipe key can be defined. The recipe.name is ignored but the recipe.version key is used as default version for each output. Other \"top-level\" keys are merged into each output (e.g. the about section) to avoid repetition. Each output is a complete recipe, and can have its own build, requirements, and test sections.

recipe:\n  # the recipe name is ignored\n  name: some\n  version: 1.0\n\noutputs:\n  - package:\n      # version is taken from recipe.version (1.0)\n      name: some-subpackage\n\n  - package:\n      name: some-other-subpackage\n      version: 2.0\n

Each output acts like an independent recipe and can have its own script, build_number, and so on.

outputs:\n  - package:\n      name: subpackage-name\n    build:\n      script: install-subpackage.sh\n

Each output is built independently. Take care not to package the same files twice.

"},{"location":"reference/recipe_file/#subpackage-requirements","title":"Subpackage requirements","text":"

Like a top-level recipe, a subpackage may have zero or more dependencies listed as build, host or run requirements.

The dependencies listed as subpackage build requirements are available only during the packaging phase of that subpackage.

outputs:\n  - package:\n      name: subpackage-name\n    requirements:\n      build:\n        - some-dep\n      run:\n        - some-dep\n

You can also use the pin_subpackage function to pin another output from the same recipe.

outputs:\n  - package:\n      name: libtest\n  - package:\n      name: test\n    requirements:\n      build:\n        - ${{ pin_subpackage('libtest', max_pin='x.x') }}\n

The outputs are topologically sorted by the dependency graph, which takes the pin_subpackage invocations into account. When using pin_subpackage(name, exact=True), a special behavior is used: the name package is injected as a \"variant\" and the variant matrix is expanded appropriately. For example, consider the following situation, with a variant_config.yaml file that contains openssl: [1, 3]:

outputs:\n  - package:\n      name: libtest\n    requirements:\n      host:\n        - openssl\n  - package:\n      name: test\n    requirements:\n      build:\n        - ${{ pin_subpackage('libtest', exact=True) }}\n

Due to the variant config file, this will build two versions of libtest. We will also build two versions of test, one that depends on libtest (openssl 1) and one that depends on libtest (openssl 3).
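
A sketch of the resulting build matrix (the exact build strings are hypothetical):

libtest (openssl 1)  ->  test (exact pin on libtest with openssl 1)\nlibtest (openssl 3)  ->  test (exact pin on libtest with openssl 3)\n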

"},{"location":"reference/recipe_file/#about-section","title":"About section","text":"

Specifies identifying information about the package. This information is displayed on package servers.

about:\n  homepage: https://example.com/bsdiff4\n  license: BSD-3-Clause # (1)!\n  license_file: LICENSE\n  summary: binary diff and patch using the BSDIFF4-format\n  description: |\n    Long description of bsdiff4 ...\n  repository: https://github.com/ilanschnell/bsdiff4\n  documentation: https://docs.com\n
  1. Only SPDX license specifiers are allowed; more info here: SPDX. If you need another license type, LicenseRef-<YOUR-LICENSE> can be used, e.g. license: LicenseRef-Proprietary
"},{"location":"reference/recipe_file/#license-file","title":"License file","text":"

Adds a file containing the software license to the package metadata. Many licenses require the license statement to be distributed with the package. The filename is relative to the source or recipe directory. The value can be a single filename or a YAML list for multiple license files. Values can also point to directories with license information. Directory entries must end with a / suffix (this is to lessen unintentional inclusion of non-license files; all the directory's contents will be unconditionally and recursively added).

about:\n  license_file:\n    - LICENSE\n    - vendor-licenses/\n
"},{"location":"reference/recipe_file/#extra-section","title":"Extra section","text":"

A schema-free area for storing non-conda-specific metadata in standard YAML form.

Example: To store recipe maintainers information
extra:\n  maintainers:\n   - name of maintainer\n
"},{"location":"reference/recipe_file/#templating-with-jinja","title":"Templating with Jinja","text":"

rattler-build supports limited Jinja templating in the recipe.yaml file.

You can set up Jinja variables in the context section:

context:\n  name: \"test\"\n  version: \"5.1.2\"\n  # later keys can reference previous keys\n  # and use jinja functions to compute new values\n  major_version: ${{ version.split('.')[0] }}\n

Later in your recipe.yaml you can use these values in string interpolation with Jinja:

source:\n  url: https://github.com/mamba-org/${{ name }}/v${{ version }}.tar.gz\n

Jinja has built-in support for some common string manipulations.

In rattler-build, complex Jinja is completely disallowed, as we try to produce YAML that is valid at all times. So you should not use any {% if ... %} or similar Jinja constructs that produce invalid YAML. Furthermore, instead of plain double curly brackets, Jinja statements need to be prefixed by $, e.g. ${{ ... }}:

package:\n  name: {{ name }}   # WRONG: invalid yaml\n  name: ${{ name }} # correct\n

For more information, see the Jinja template documentation and the list of available environment variables env-vars.

Jinja templates are evaluated during the build process.

"},{"location":"reference/recipe_file/#additional-jinja2-functionality-in-rattler-build","title":"Additional Jinja2 functionality in rattler-build","text":"

Besides the default Jinja2 functionality, additional Jinja functions are available during the rattler-build process: pin_compatible, pin_subpackage, and compiler.

The compiler function takes c, cxx, fortran and other values as arguments and automatically selects the right (cross-)compiler for the target platform.

build:\n  - ${{ compiler('c') }}\n

The pin_subpackage function pins another package produced by the recipe with the supplied parameters.

Similarly, the pin_compatible function will pin a package according to the specified rules.

"},{"location":"reference/recipe_file/#pin-expressions","title":"Pin expressions","text":"

rattler-build supports pin expressions. A pin expression can have a min_pin, max_pin and exact value. A max_pin and min_pin are specified with a string containing only x and . characters; e.g. max_pin=\"x.x.x\" would pin the given package to <1.2.3 (if the package version is 1.2.2, for example).

A pin with min_pin=\"x.x\",max_pin=\"x.x\" for a package of version 1.2.2 would evaluate to >=1.2,<1.3.

If exact=true, then the hash is included, and the package is pinned exactly, e.g. ==1.2.2 h1234. This is a unique package variant that cannot exist more than once, and thus is \"exactly\" pinned.

"},{"location":"reference/recipe_file/#pin-subpackage","title":"Pin subpackage","text":"

Pin subpackage refers to another package from the same recipe file. It is commonly used in the requirements/run_exports section to add a run export to the package, or with multiple outputs to refer to a previous build.

It looks something like:

package:\n  name: mypkg\n  version: \"1.2.3\"\n\nrequirements:\n  run_exports:\n    # this will evaluate to `mypkg <1.3`\n    - ${{ pin_subpackage(name, max_pin='x.x') }}\n
"},{"location":"reference/recipe_file/#pin-compatible","title":"Pin compatible","text":"

Pin compatible lets you pin a package based on the version retrieved from the variant file (if the pinning from the variant file needs customization).

For example, if the variant specifies a pin for numpy: 1.11, one can use pin_compatible to relax it:

requirements:\n  host:\n    # this will select numpy 1.11\n    - numpy\n  run:\n    # this will export `numpy >=1.11,<2`, instead of the stricter `1.11` pin\n    - ${{ pin_compatible('numpy', min_pin='x.x', max_pin='x') }}\n
"},{"location":"reference/recipe_file/#the-env-jinja-functions","title":"The env Jinja functions","text":"

You can access the current environment variables using the env object in Jinja.

There are three functions:

  • env.get(\"ENV_VAR\") will insert the value of \"ENV_VAR\" into the recipe.
  • env.get(\"ENV_VAR\", default=\"undefined\") will insert the value of ENV_VAR into the recipe or, if ENV_VAR is not defined, the specified default value (in this case \"undefined\")
  • env.exists(\"ENV_VAR\") returns true if the environment variable is set to any value, and false otherwise

This can be used for some light templating, for example:

build:\n  string: ${{ env.get(\"GIT_BUILD_STRING\") }}_${{ PKG_HASH }}\n
"},{"location":"reference/recipe_file/#match-function","title":"match function","text":"

This function matches the first argument (the package version) against the second argument (the version spec) and returns the resulting boolean. This only works for packages defined in the \"variant_config.yaml\" file.

recipe.yaml
match(python, '>=3.4')\n

For example, you could require a certain dependency only for builds against python 3.4 and above:

recipe.yaml
requirements:\n  build:\n    - if: match(python, '>=3.4')\n      then:\n        - some-dep\n

With a corresponding variant config that looks like the following:

variant_config.yaml
python: [\"3.2\", \"3.4\", \"3.6\"]\n

Example: match usage example

"},{"location":"reference/recipe_file/#cdt-function","title":"cdt function","text":"

This function helps add Core Dependency Tree packages as dependencies by converting packages as required according to hard-coded logic.

# on x86_64 system\ncdt('package-name') # outputs: package-name-cos6-x86_64\n# on aarch64 system\ncdt('package-name') # outputs: package-name-cos6-aarch64\n

Example: cdt usage example

"},{"location":"reference/recipe_file/#preprocessing-selectors","title":"Preprocessing selectors","text":"

You can add selectors to any item, and the selector is evaluated in a preprocessing stage. If a selector evaluates to true, the item is flattened into the parent element. If a selector evaluates to false, the item is removed.

Selectors can use if ... then ... else as follows:

source:\n  - if: not win\n    then:\n      - url: http://path/to/unix/source\n    else:\n      - url: http://path/to/windows/source\n\n# or the equivalent with two if conditions:\n\nsource:\n  - if: unix\n    then:\n      - url: http://path/to/unix/source\n  - if: win\n    then:\n      - url: http://path/to/windows/source\n

A selector is a valid Python expression that is evaluated. You can read more about them in the \"Selectors in recipes\" chapter.

The use of the Python version selectors py27, py34, etc. is discouraged in favor of the more general comparison operators; additional selectors in this series will not be added.

Because the selector is any valid Python expression, complicated logic is possible:

- if: unix and not win\n  then: ...\n- if: (win or linux) and not py27\n  then: ...\n

Lists are automatically \"merged\" upwards, so it is possible to group multiple items under a single selector:

tests:\n  - script:\n    - if: unix\n      then:\n      - test -d ${PREFIX}/include/xtensor\n      - test -f ${PREFIX}/lib/cmake/xtensor/xtensorConfigVersion.cmake\n    - if: win\n      then:\n      - if not exist %LIBRARY_PREFIX%\\include\\xtensor\\xarray.hpp (exit 1)\n      - if not exist %LIBRARY_PREFIX%\\lib\\cmake\\xtensor\\xtensorConfigVersion.cmake (exit 1)\n\n# On unix this is rendered to:\ntests:\n  - script:\n    - test -d ${PREFIX}/include/xtensor\n    - test -f ${PREFIX}/lib/cmake/xtensor/xtensorConfigVersion.cmake\n
"},{"location":"reference/recipe_file/#experimental-features","title":"Experimental features","text":"

Warning

These are experimental features of rattler-build and may change or go away completely.

"},{"location":"reference/recipe_file/#jinja-functions","title":"Jinja functions","text":"
  • load_from_file
  • git.* functions
"},{"location":"tutorials/cpp/","title":"Packaging a C++ package","text":"

This tutorial will guide you through making a C++ package with rattler-build.

"},{"location":"tutorials/cpp/#building-a-header-only-library","title":"Building a Header-only Library","text":"

To build a package for the header-only library xtensor, you need to manage dependencies and ensure proper installation paths.

"},{"location":"tutorials/cpp/#key-steps","title":"Key Steps","text":"
  1. Dependencies: Ensure cmake, ninja, and a compiler are available as dependencies.

  2. CMake Installation Prefix: Use the CMAKE_INSTALL_PREFIX setting to instruct CMake to install the headers in the correct location.

  3. Unix Systems: Follow the standard Unix prefix:

    $PREFIX/include\n$PREFIX/lib\n

  4. Windows Systems: Use a Unix-like prefix but nested in a Library directory:

    $PREFIX/Library/include\n$PREFIX/Library/lib\n
    Utilize the handy variables %LIBRARY_PREFIX% and %LIBRARY_BIN% to guide CMake to install the headers and libraries correctly.

This approach ensures that the headers and libraries are installed in the correct directories on both Unix and Windows systems.

"},{"location":"tutorials/cpp/#recipe","title":"Recipe","text":"recipe.yaml
context:\n  version: \"0.24.6\"\n\npackage:\n  name: xtensor\n  version: ${{ version }}\n\nsource:\n  url: https://github.com/xtensor-stack/xtensor/archive/${{ version }}.tar.gz\n  sha256: f87259b51aabafdd1183947747edfff4cff75d55375334f2e81cee6dc68ef655\n\nbuild:\n  number: 0\n  script:\n    - if: win # (1)!\n      then: |\n        cmake -GNinja ^\n            -D BUILD_TESTS=OFF -DCMAKE_INSTALL_PREFIX=%LIBRARY_PREFIX% ^\n            %SRC_DIR%\n        ninja install\n      else: |\n        cmake -GNinja \\\n              -DBUILD_TESTS=OFF -DCMAKE_INSTALL_PREFIX=$PREFIX \\\n              $SRC_DIR\n        ninja install\n\nrequirements:\n  build:\n    - ${{ compiler('cxx') }} # (2)!\n    - cmake\n    - ninja\n  host:\n    - xtl >=0.7,<0.8\n  run:\n    - xtl >=0.7,<0.8\n  run_constraints: # (3)!\n    - xsimd >=8.0.3,<10\n\ntests:\n  - package_contents:\n      include: # (4)!\n        - xtensor/xarray.hpp\n      files: # (5)!\n        - ${{ \"Library\" if win }}/share/cmake/xtensor/xtensorConfig.cmake\n        - ${{ \"Library\" if win }}/share/cmake/xtensor/xtensorConfigVersion.cmake\n\nabout:\n  homepage: https://github.com/xtensor-stack/xtensor\n  license: BSD-3-Clause\n  license_file: LICENSE\n  summary: The C++ tensor algebra library\n  description: Multi dimensional arrays with broadcasting and lazy computing\n  documentation: https://xtensor.readthedocs.io\n  repository: https://github.com/xtensor-stack/xtensor\n\nextra:\n  recipe-maintainers:\n    - some-maintainer\n
  1. The if: condition allows the user to switch the behavior of the build based on checks such as the operating system.
  2. The compiler function is used to get the C++ compiler for the build system.
  3. The run_constraints section specifies the version range of a package that this package can run \"with\", but does not itself depend on.
  4. The include section specifies the header file to be tested for existence.
  5. The files section specifies the files to be tested for existence, using a glob pattern.

CMAKE_ARGS

It can be tedious to remember all the different variables one needs to pass to CMake to create the perfect build. The cmake package on conda-forge introduces the CMAKE_ARGS environment variable. This variable contains the necessary flags to make the package build correctly, even when cross-compiling from one machine to another. Therefore, it is often not necessary to pass any additional flags to the cmake command. However, because this is a tutorial, we will show how to pass the necessary flags to cmake manually.

For more information please refer to the conda-forge documentation.

"},{"location":"tutorials/cpp/#building-a-c-application","title":"Building A C++ application","text":"

In this example, we'll build poppler, a C++ application for manipulating PDF files from the command line. The final package will install several tools into the bin/ folder. We'll use external build scripts and run actual scripts in the test.

"},{"location":"tutorials/cpp/#key-steps_1","title":"Key Steps","text":"
  1. Dependencies:

    • Build Dependencies: These are necessary for the building process, including cmake, ninja, and pkg-config.
    • Host Dependencies: These are the libraries poppler links against, such as cairo, fontconfig, freetype, glib, and others.
  2. Compiler Setup: We use the compiler function to obtain the appropriate C and C++ compilers.

  3. Build Script: The build.script field points to an external script (poppler-build.sh) which contains the build commands.

  4. Testing: Simple tests are included to verify that the installed tools (pdfinfo, pdfunite, pdftocairo) are working correctly by running them, and expecting an exit code 0.

"},{"location":"tutorials/cpp/#recipe_1","title":"Recipe","text":"recipe.yaml
context:\n  version: \"24.01.0\"\n\npackage:\n  name: poppler\n  version: ${{ version }}\n\nsource:\n  url: https://poppler.freedesktop.org/poppler-${{ version }}.tar.xz\n  sha256: c7def693a7a492830f49d497a80cc6b9c85cb57b15e9be2d2d615153b79cae08\n\nbuild:\n  script: poppler-build.sh\n\nrequirements:\n  build:\n    - ${{ compiler('c') }} # (1)!\n    - ${{ compiler('cxx') }}\n    - pkg-config\n    - cmake\n    - ninja\n  host:\n    - cairo # (2)!\n    - fontconfig\n    - freetype\n    - glib\n    - libboost-headers\n    - libjpeg-turbo\n    - lcms2\n    - libiconv\n    - libpng\n    - libtiff\n    - openjpeg\n    - zlib\n\ntests:\n  - script:\n      - pdfinfo -listenc  # (3)!\n      - pdfunite --help\n      - pdftocairo --help\n
  1. The compiler Jinja function selects the correct compilers for C and C++ on the build system.
  2. These are all the dependencies that the library links against.
  3. The script test executes some of the installed tools to check that they work. These tests can be as complex as you want (bash or cmd.exe).
"},{"location":"tutorials/cpp/#external-build-script","title":"External Build Script","text":"

We've defined an external build script in the recipe. rattler-build searches for a file with the given name next to the recipe; if no script is specified, the default names build.sh on Unix and build.bat on Windows are searched for.

poppler-build.sh
#! /bin/bash\n\nextra_cmake_args=(\n    -GNinja\n    -DCMAKE_INSTALL_LIBDIR=lib\n    -DENABLE_UNSTABLE_API_ABI_HEADERS=ON\n    -DENABLE_GPGME=OFF\n    -DENABLE_LIBCURL=OFF\n    -DENABLE_LIBOPENJPEG=openjpeg2\n    -DENABLE_QT6=OFF\n    -DENABLE_QT5=OFF\n    -DENABLE_NSS3=OFF\n)\n\nmkdir build && cd build\n\ncmake ${CMAKE_ARGS} \"${extra_cmake_args[@]}\" \\\n    -DCMAKE_PREFIX_PATH=$PREFIX \\\n    -DCMAKE_INSTALL_PREFIX=$PREFIX \\\n    -DTIFF_INCLUDE_DIR=$PREFIX/include \\\n    $SRC_DIR\n\nninja\n\n# The `install` command will take care of copying the files to the right place\nninja install\n
"},{"location":"tutorials/cpp/#parsing-the-rattler-build-build-output","title":"Parsing the rattler-build build Output","text":"

When running the rattler-build command, you might notice some interesting information in the output. Our package will have some run dependencies, even though we didn't specify any.

These come from the run-exports of the packages listed in the host section of the recipe. This is indicated by \"RE of [host: package]\" in the output.

For example, libcurl specifies that if you depend on it in the host section, you should also depend on it during runtime with specific version ranges. This ensures proper linking to shared libraries.

Run dependencies:\n\u256d\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u252c\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u256e\n\u2502 Name                  \u2506 Spec                                         \u2502\n\u255e\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u256a\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2561\n\u2502 libcurl               \u2506 >=8.5.0,<9.0a0 (RE of [host: libcurl])       \u2502\n\u2502 fontconfig            \u2506 >=2.14.2,<3.0a0 (RE of [host: fontconfig])   \u2502\n\u2502 fonts-conda-ecosystem \u2506 (RE of [host: fontconfig])                   \u2502\n\u2502 lcms2                 \u2506 >=2.16,<3.0a0 (RE of [host: lcms2])          \u2502\n\u2502 gettext               \u2506 >=0.21.1,<1.0a0 (RE of [host: gettext])      \u2502\n\u2502 freetype              \u2506 >=2.12.1,<3.0a0 (RE of [host: freetype])     \u2502\n\u2502 openjpeg              \u2506 >=2.5.0,<3.0a0 (RE of [host: openjpeg])      \u2502\n\u2502 libiconv              \u2506 >=1.17,<2.0a0 (RE of [host: libiconv])       \u2502\n\u2502 cairo                 \u2506 >=1.18.0,<2.0a0 (RE of [host: cairo])        \u2502\n\u2502 libpng                \u2506 >=1.6.42,<1.7.0a0 (RE of [host: libpng])     \u2502\n\u2502 libzlib               \u2506 >=1.2.13,<1.3.0a0 (RE of [host: zlib])       \u2502\n\u2502 libtiff               \u2506 >=4.6.0,<4.7.0a0 (RE of [host: libtiff])     \u2502\n\u2502 libjpeg-turbo         \u2506 >=3.0.0,<4.0a0 (RE of [host: libjpeg-turbo]) \u2502\n\u2502 libglib               \u2506 >=2.78.3,<3.0a0 (RE of [host: glib])         \u2502\n\u2502 libcxx                \u2506 >=16 (RE of [build: clangxx_osx-arm64])      \u2502\n\u2570\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2534\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u256f\n

You can also see \"linking\" information in the output, for example on macOS:

[lib/libpoppler-glib.8.26.0.dylib] links against:\n \u251c\u2500 @rpath/libgio-2.0.0.dylib\n \u251c\u2500 @rpath/libgobject-2.0.0.dylib\n \u251c\u2500 /usr/lib/libSystem.B.dylib\n \u251c\u2500 @rpath/libglib-2.0.0.dylib\n \u251c\u2500 @rpath/libpoppler.133.dylib\n \u251c\u2500 @rpath/libfreetype.6.dylib\n \u251c\u2500 @rpath/libc++.1.dylib\n \u251c\u2500 @rpath/libpoppler-glib.8.dylib\n \u2514\u2500 @rpath/libcairo.2.dylib\n

rattler-build ensures that:

  1. All shared libraries linked against are present in the run dependencies. Missing libraries trigger an overlinking warning.
  2. You don't require any packages in the host section that you are not linking against; unused host dependencies trigger an overdepending warning.
"},{"location":"tutorials/python/","title":"Writing a Python package","text":"

Writing a Python package is fairly straightforward, especially for \"Python-only\" packages. In the second example, we will build a package for numpy, which contains compiled code.

"},{"location":"tutorials/python/#a-python-only-package","title":"A Python-only package","text":"

The following recipe uses the noarch: python setting to build a noarch package that can be installed on any platform without modification. This is very handy for packages that are pure Python and do not contain any compiled extensions.

Additionally, noarch: python packages work with a range of Python versions (in contrast to packages with compiled extensions, which are tied to a specific Python version).

recipe.yaml
context:\n  version: \"8.1.2\"\n\npackage:\n  name: ipywidgets\n  version: ${{ version }}\n\nsource:\n  url: https://pypi.io/packages/source/i/ipywidgets/ipywidgets-${{ version }}.tar.gz\n  sha256: d0b9b41e49bae926a866e613a39b0f0097745d2b9f1f3dd406641b4a57ec42c9\n\nbuild:\n  noarch: python # (1)!\n  script: pip install . -v\n\nrequirements:\n  # note that there is no build section\n  host:\n    - pip\n    - python >=3.7\n    - setuptools\n    - wheel\n  run:\n    - comm >=0.1.3\n    - ipython >=6.1.0\n    - jupyterlab_widgets >=3.0.10,<3.1.0\n    - python >=3.7\n    - traitlets >=4.3.1\n    - widgetsnbextension >=4.0.10,<4.1.0\n\ntests:\n  - python:\n      imports:\n        - ipywidgets # (2)!\n\nabout:\n  homepage: https://github.com/ipython/ipywidgets\n  license: BSD-3-Clause\n  license_file: LICENSE\n  summary: Jupyter Interactive Widgets\n  description: |\n    ipywidgets are interactive HTML widgets for Jupyter notebooks and the IPython kernel.\n  documentation: https://ipywidgets.readthedocs.io/en/latest/\n
  1. The noarch: python line tells rattler-build that this package is pure Python: a single one-size-fits-all build can be installed on any platform, which is very handy.
  2. The imports section in the tests is used to check that the package is installed correctly and can be imported.
"},{"location":"tutorials/python/#running-the-recipe","title":"Running the recipe","text":"

To build this recipe, simply run:

rattler-build build --recipe ./ipywidgets\n
"},{"location":"tutorials/python/#a-python-package-with-compiled-extensions","title":"A Python package with compiled extensions","text":"

We will build a package for numpy, which contains compiled code. Since compiled code is Python-version-specific, we need to specify the Python versions explicitly. The best way to do this is with a variants.yaml file:

variants.yaml
python:\n  - 3.11\n  - 3.12\n

This will replace any python dependency without a version specifier in the recipe with the versions listed in the variants.yaml file, building one package per version.

recipe.yaml
context:\n  version: 1.26.4\n\npackage:\n  name: numpy\n  version: ${{ version }}\n\nsource:\n  - url: https://github.com/numpy/numpy/releases/download/v${{ version }}/numpy-${{ version }}.tar.gz\n    sha256: 2a02aba9ed12e4ac4eb3ea9421c420301a0c6460d9830d74a9df87efa4912010\n\nbuild:\n  python:\n    entry_points:\n      - f2py = numpy.f2py.f2py2e:main  # [win]\n\nrequirements:\n  build:\n    - ${{ compiler('c') }}\n    - ${{ compiler('cxx') }}\n  host:\n    # note: variant is injected here!\n    - python\n    - pip\n    - meson-python\n    - ninja\n    - pkg-config\n    - python-build\n    - cython\n    - libblas\n    - libcblas\n    - liblapack\n  run:\n    - python\n  run_exports:\n    - ${{ pin_subpackage(\"numpy\") }}\n\ntests:\n  - python:\n      imports:\n        - numpy\n        - numpy.array_api\n        - numpy.array_api.linalg\n        - numpy.ctypeslib\n\n  - script:\n    - f2py -h\n\nabout:\n  homepage: http://numpy.org/\n  license: BSD-3-Clause\n  license_file: LICENSE.txt\n  summary: The fundamental package for scientific computing with Python.\n  documentation: https://numpy.org/doc/stable/\n  repository: https://github.com/numpy/numpy\n

The build script for Unix:

build.sh
mkdir builddir\n\n$PYTHON -m build -w -n -x \\\n    -Cbuilddir=builddir \\\n    -Csetup-args=-Dblas=blas \\\n    -Csetup-args=-Dlapack=lapack\n\n$PYTHON -m pip install dist/numpy*.whl\n

The build script for Windows:

build.bat
mkdir builddir\n\n%PYTHON% -m build -w -n -x ^\n    -Cbuilddir=builddir ^\n    -Csetup-args=-Dblas=blas ^\n    -Csetup-args=-Dlapack=lapack\nif %ERRORLEVEL% neq 0 exit 1\n\n:: `pip install dist\\numpy*.whl` does not work on windows,\n:: so use a loop; there's only one wheel in dist/ anyway\nfor /f %%f in ('dir /b /S .\\dist') do (\n    pip install %%f\n    :: use `if errorlevel` instead of %ERRORLEVEL% here: the latter is\n    :: expanded when the block is parsed, not when the command runs\n    if errorlevel 1 exit 1\n)\n
"},{"location":"tutorials/python/#running-the-recipe_1","title":"Running the recipe","text":"

Running this recipe with the variant config file will build a total of two numpy packages, one per Python version:

rattler-build build --recipe ./numpy\n

At the beginning of the build process, rattler-build will print the following message to show you the variants it found:

Found variants:\n\nnumpy-1.26.4-py311h5f8ada8_0\n\u256d\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u252c\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u256e\n\u2502 Variant         \u2506 Version   \u2502\n\u255e\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u256a\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2561\n\u2502 python          \u2506 3.11      \u2502\n\u2502 target_platform \u2506 osx-arm64 \u2502\n\u2570\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2534\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u256f\n\nnumpy-1.26.4-py312h440f24a_0\n\u256d\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u252c\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u256e\n\u2502 Variant         \u2506 Version   \u2502\n\u255e\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u256a\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2561\n\u2502 python          \u2506 3.12      \u2502\n\u2502 target_platform \u2506 osx-arm64 \u2502\n\u2570\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2534\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u256f\n
"},{"location":"tutorials/rust/","title":"Building a Rust package","text":"

We're using rattler-build to build a Rust package for the cargo-edit utility. This utility manages Cargo dependencies from the command line.

recipe.yaml
context:\n  version: \"0.11.9\"\n\npackage:\n  name: cargo-edit\n  version: ${{ version }}\n\nsource:\n  url: https://github.com/killercup/cargo-edit/archive/refs/tags/v${{ version }}.tar.gz\n  sha256: 46670295e2323fc2f826750cdcfb2692fbdbea87122fe530a07c50c8dba1d3d7\n\nbuild:\n  script:\n    - cargo-bundle-licenses --format yaml --output ${SRC_DIR}/THIRDPARTY.yml  # !(1)\n    - $BUILD_PREFIX/bin/cargo install --locked --bins --root ${PREFIX} --path .\n\nrequirements:\n  build:\n    - ${{ compiler('rust') }}\n    - cargo-bundle-licenses\n\ntests:\n  - script:\n      - cargo-upgrade --help # !(2)\n\nabout:\n  homepage: https://github.com/killercup/cargo-edit\n  license: MIT\n  license_file:\n    - LICENSE\n    - THIRDPARTY.yml\n  description: \"A utility for managing cargo dependencies from the command line.\"\n  summary: \"A utility for managing cargo dependencies from the command line.\"\n

Note

The ${{ compiler(...) }} functions are very useful in the context of cross-compilation. When the function is evaluated, it inserts the correct compiler (as selected with the variant config) as well as the target_platform. The "rendered" compiler will look like rust_linux-64 when you are targeting the linux-64 platform.

You can read more about this in the cross-compilation section.

  1. The cargo-bundle-licenses utility is used to bundle all the licenses of the dependencies into a THIRDPARTY.yml file. This file is then included in the package. You should always include this file in your package when you are redistributing it.
  2. Test scripts are run with bash or cmd.exe to check that the package was built correctly; a test passes when the script exits with code 0.

To build this recipe, simply run:

rattler-build build \\\n    --recipe ./cargo-edit/recipe.yaml\n
"}]} \ No newline at end of file +{"config":{"lang":["en"],"separator":"[\\s\\-]+","pipeline":["stopWordFilter"]},"docs":[{"location":"","title":"Home","text":""},{"location":"#rattler-build-a-fast-conda-package-builder","title":"rattler-build: A Fast Conda Package Builder","text":"

The rattler-build tooling and library creates cross-platform relocatable binaries / packages from a simple recipe format. The recipe format is heavily inspired by conda-build and boa, and the output of a regular rattler-build run is a package that can be installed using mamba, rattler or conda.

rattler-build does not have any dependencies on conda-build or Python and works as a standalone binary.

"},{"location":"#installation","title":"Installation","text":"

The recommended way of installing rattler-build, being a conda-package builder, is through a conda package manager. Next to rattler-build we are also building pixi.

With pixi you can install rattler-build globally:

pixi global install rattler-build\n

Other options are:

CondaHomebrewArch LinuxBinary
conda install rattler-build -c conda-forge\n\nmamba install rattler-build -c conda-forge\nmicromamba install rattler-build -c conda-forge\n\npixi global install rattler-build\npixi add rattler-build # To a pixi project\n
brew install rattler-build\n
pacman -S rattler-build\n

# Download the latest release from the GitHub releases page, for example the linux x86 version with curl:\ncurl -SL --progress-bar https://github.com/prefix-dev/rattler-build/releases/latest/download/rattler-build-x86_64-unknown-linux-musl\n
You can grab a version of rattler-build from the GitHub Releases.

"},{"location":"#completion","title":"Completion","text":"

When installing rattler-build you might want to enable shell completion. Afterwards, restart the shell or source the shell config file.

"},{"location":"#bash-default-on-most-linux-systems","title":"Bash (default on most Linux systems)","text":"
echo 'eval \"$(rattler-build completion --shell bash)\"' >> ~/.bashrc\n
"},{"location":"#zsh-default-on-macos","title":"Zsh (default on macOS)","text":"
echo 'eval \"$(rattler-build completion --shell zsh)\"' >> ~/.zshrc\n
"},{"location":"#powershell-pre-installed-on-all-windows-systems","title":"PowerShell (pre-installed on all Windows systems)","text":"
Add-Content -Path $PROFILE -Value '(& rattler-build completion --shell powershell) | Out-String | Invoke-Expression'\n

Failure because no profile file exists

Make sure your profile file exists, otherwise create it with:

New-Item -Path $PROFILE -ItemType File -Force\n

"},{"location":"#fish","title":"Fish","text":"
echo 'rattler-build completion --shell fish | source' >> ~/.config/fish/config.fish\n
"},{"location":"#nushell","title":"Nushell","text":"

Add the following to the end of your Nushell env file (find it by running $nu.env-path in Nushell):

mkdir ~/.cache/rattler-build\nrattler-build completion --shell nushell | save -f ~/.cache/rattler-build/completions.nu\n

And add the following to the end of your Nushell configuration (find it by running $nu.config-path):

use ~/.cache/rattler-build/completions.nu *\n
"},{"location":"#elvish","title":"Elvish","text":"
echo 'eval (rattler-build completion --shell elvish | slurp)' >> ~/.elvish/rc.elv\n
"},{"location":"#dependencies","title":"Dependencies","text":"

Currently, rattler-build needs some dependencies on the host system which are executed as subprocesses. We plan to reduce the number of external dependencies over time by writing what we need in Rust, making rattler-build fully self-contained.

  • tar to unpack tarballs downloaded from the internet in a variety of formats. .gz, .bz2 and .xz are widely used and one might have to install the compression packages as well (e.g. gzip, bzip2, ...)
  • patch to patch source code after downloading
  • install_name_tool is necessary on macOS to rewrite the rpath of shared libraries and executables to make it relative
  • patchelf is required on Linux to rewrite the rpath and runpath of shared libraries and executables
  • git to check out Git repositories (not implemented yet, but will require git in the future)
  • msvc on Windows because we cannot ship the MSVC compiler on conda-forge (needs to be installed on the host machine)

On Windows, to obtain these dependencies from conda-forge, one can install m2-patch, m2-bzip2, m2-gzip, m2-tar.

"},{"location":"#github-action","title":"GitHub Action","text":"

There is a GitHub Action for rattler-build. It can be used to install rattler-build in CI/CD workflows and run a build command. Please check out the GitHub Action documentation for more information.

"},{"location":"#the-recipe-format","title":"The recipe format","text":"

Note: You can find all examples below in the examples folder in the codebase and run them with rattler-build.

A simple example recipe for the xtensor header-only C++ library:

# yaml-language-server: $schema=https://raw.githubusercontent.com/prefix-dev/recipe-format/main/schema.json\n\ncontext:\n  name: xtensor\n  version: 0.24.6\n\npackage:\n  name: ${{ name|lower }}\n  version: ${{ version }}\n\nsource:\n  url: https://github.com/xtensor-stack/xtensor/archive/${{ version }}.tar.gz\n  sha256: f87259b51aabafdd1183947747edfff4cff75d55375334f2e81cee6dc68ef655\n\nbuild:\n  number: 0\n  script:\n    - if: win\n      then: |\n        cmake -G \"NMake Makefiles\" -D BUILD_TESTS=OFF -D CMAKE_INSTALL_PREFIX=%LIBRARY_PREFIX% %SRC_DIR%\n        nmake\n        nmake install\n      else: |\n        cmake ${CMAKE_ARGS} -DBUILD_TESTS=OFF -DCMAKE_INSTALL_PREFIX=$PREFIX $SRC_DIR -DCMAKE_INSTALL_LIBDIR=lib\n        make install\n\nrequirements:\n  build:\n    - ${{ compiler('cxx') }}\n    - cmake\n    - if: unix\n      then: make\n  host:\n    - xtl >=0.7,<0.8\n  run:\n    - xtl >=0.7,<0.8\n  run_constraints:\n    - xsimd >=8.0.3,<10\n\ntests:\n  - script:\n    - if: unix or emscripten\n      then:\n        - test -d ${PREFIX}/include/xtensor\n        - test -f ${PREFIX}/include/xtensor/xarray.hpp\n        - test -f ${PREFIX}/share/cmake/xtensor/xtensorConfig.cmake\n        - test -f ${PREFIX}/share/cmake/xtensor/xtensorConfigVersion.cmake\n    - if: win\n      then:\n        - if not exist %LIBRARY_PREFIX%\\include\\xtensor\\xarray.hpp (exit 1)\n        - if not exist %LIBRARY_PREFIX%\\share\\cmake\\xtensor\\xtensorConfig.cmake (exit 1)\n        - if not exist %LIBRARY_PREFIX%\\share\\cmake\\xtensor\\xtensorConfigVersion.cmake (exit 1)\n\nabout:\n  homepage: https://github.com/xtensor-stack/xtensor\n  license: BSD-3-Clause\n  license_file: LICENSE\n  summary: The C++ tensor algebra library\n  description: Multi dimensional arrays with broadcasting and lazy computing\n  documentation: https://xtensor.readthedocs.io\n  repository: https://github.com/xtensor-stack/xtensor\n\nextra:\n  recipe-maintainers:\n    - some-maintainer\n

A recipe for the rich Python package (using noarch):

context:\n  version: \"13.4.2\"\n\npackage:\n  name: \"rich\"\n  version: ${{ version }}\n\nsource:\n  - url: https://pypi.io/packages/source/r/rich/rich-${{ version }}.tar.gz\n    sha256: d653d6bccede5844304c605d5aac802c7cf9621efd700b46c7ec2b51ea914898\n\nbuild:\n  # Thanks to `noarch: python` this package works on all platforms\n  noarch: python\n  script:\n    - python -m pip install . -vv --no-deps --no-build-isolation\n\nrequirements:\n  host:\n    - pip\n    - poetry-core >=1.0.0\n    - python 3.10\n  run:\n    # sync with normalized deps from poetry-generated setup.py\n    - markdown-it-py >=2.2.0\n    - pygments >=2.13.0,<3.0.0\n    - python 3.10\n    - typing_extensions >=4.0.0,<5.0.0\n\ntests:\n  - python:\n      imports:\n        - rich\n      pip_check: true\n\nabout:\n  homepage: https://github.com/Textualize/rich\n  license: MIT\n  license_file: LICENSE\n  summary: Render rich text, tables, progress bars, syntax highlighting, markdown and more to the terminal\n  description: |\n    Rich is a Python library for rich text and beautiful formatting in the terminal.\n\n    The Rich API makes it easy to add color and style to terminal output. Rich\n    can also render pretty tables, progress bars, markdown, syntax highlighted\n    source code, tracebacks, and more \u2014 out of the box.\n  documentation: https://rich.readthedocs.io\n  repository: https://github.com/Textualize/rich\n

A recipe for the curl library:

context:\n  version: \"8.0.1\"\n\npackage:\n  name: curl\n  version: ${{ version }}\n\nsource:\n  url: http://curl.haxx.se/download/curl-${{ version }}.tar.bz2\n  sha256: 9b6b1e96b748d04b968786b6bdf407aa5c75ab53a3d37c1c8c81cdb736555ccf\n\nbuild:\n  number: 0\n\nrequirements:\n  build:\n    - ${{ compiler('c') }}\n    - if: win\n      then:\n        - cmake\n        - ninja\n    - if: unix\n      then:\n        - make\n        - perl\n        - pkg-config\n        - libtool\n  host:\n    - if: linux\n      then:\n        - openssl\n\nabout:\n  homepage: http://curl.haxx.se/\n  license: MIT/X derivate (http://curl.haxx.se/docs/copyright.html)\n  license_file: COPYING\n  summary: tool and library for transferring data with URL syntax\n  description: |\n    Curl is an open source command line tool and library for transferring data\n    with URL syntax. It is used in command lines or scripts to transfer data.\n  documentation: https://curl.haxx.se/docs/\n  repository: https://github.com/curl/curl\n

For the curl library recipe, two additional script files (build.sh and build.bat) are needed.

build.sh

#!/bin/bash\n\n# Get an updated config.sub and config.guess\ncp $BUILD_PREFIX/share/libtool/build-aux/config.* .\n\nif [[ $target_platform =~ linux.* ]]; then\n    USESSL=\"--with-openssl=${PREFIX}\"\nelse\n    USESSL=\"--with-secure-transport\"\nfi;\n\n./configure \\\n    --prefix=${PREFIX} \\\n    --host=${HOST} \\\n    ${USESSL} \\\n    --with-ca-bundle=${PREFIX}/ssl/cacert.pem \\\n    --disable-static --enable-shared\n\nmake -j${CPU_COUNT} ${VERBOSE_AT}\nmake install\n\n# Includes man pages and other miscellaneous.\nrm -rf \"${PREFIX}/share\"\n

build.bat

mkdir build\n\ncmake -GNinja ^\n      -DCMAKE_BUILD_TYPE=Release ^\n      -DBUILD_SHARED_LIBS=ON ^\n      -DCMAKE_INSTALL_PREFIX=%LIBRARY_PREFIX% ^\n      -DCMAKE_PREFIX_PATH=%LIBRARY_PREFIX% ^\n      -DCURL_USE_SCHANNEL=ON ^\n      -DCURL_USE_LIBSSH2=OFF ^\n      -DUSE_ZLIB=ON ^\n      -DENABLE_UNICODE=ON ^\n      %SRC_DIR%\n\nIF %ERRORLEVEL% NEQ 0 exit 1\n\nninja install --verbose\n
"},{"location":"authentication_and_upload/","title":"Server authentication","text":""},{"location":"authentication_and_upload/#authenticating-with-a-server","title":"Authenticating with a server","text":"

You may want to use private channels for which you need to be authenticated. To do this ephemerally you can use the RATTLER_AUTH_FILE environment variable to point to a JSON file with the following structure:

{\n    \"*.prefix.dev\": {\n        \"BearerToken\": \"your_token\"\n    },\n    \"otherhost.com\": {\n        \"BasicHTTP\": {\n            \"username\": \"your_username\",\n            \"password\": \"your_password\"\n        }\n    },\n    \"anaconda.org\": {\n        \"CondaToken\": \"your_token\"\n    }\n}\n

The keys are the host names. You can use wildcard specifiers here (e.g. *.prefix.dev to match all subdomains of prefix.dev, such as repo.prefix.dev). This will allow you to also obtain packages from any private channels that you have access to.

The following known authentication methods are supported:

  • BearerToken: prefix.dev
  • CondaToken: anaconda.org, quetz
  • BasicHTTP: artifactory
"},{"location":"authentication_and_upload/#uploading-packages","title":"Uploading packages","text":"

If you want to upload packages, rattler-build comes with a built-in upload command. There are four options:

  • prefix.dev: you can create public or private channels on the prefix.dev hosted server
  • anaconda.org: you can upload packages to the free anaconda.org server
  • quetz: you can host your own quetz server and upload packages to it
  • artifactory: you can upload packages to a JFrog Artifactory server

The command is:

rattler-build upload <server> <package_files>\n

Note: you can also use the RATTLER_AUTH_FILE environment variable to authenticate with the server.

"},{"location":"authentication_and_upload/#prefixdev","title":"prefix.dev","text":"

To upload to prefix.dev, you need to have an account and a token. You can create a token in the settings of your account. The token is used to authenticate the upload.

export PREFIX_API_KEY=<your_token>\nrattler-build upload prefix -c <channel> <package_files>\n

You can also use the --api-key=$PREFIX_API_KEY option to pass the token directly to the command. Note that you need to have created the channel on the prefix.dev website before you can upload to it.

"},{"location":"authentication_and_upload/#quetz","title":"Quetz","text":"

You need to pass an API key to upload to a channel on your own Quetz server. The key is used to authenticate the upload.

export QUETZ_API_KEY=<your_token>\nrattler-build upload quetz -u <url> -c <channel> <package_files>\n
"},{"location":"authentication_and_upload/#artifactory","title":"Artifactory","text":"

To upload to an Artifactory server, you need to pass a username and password. The username and password are used to authenticate the upload.

export ARTIFACTORY_USERNAME=<your_username>\nexport ARTIFACTORY_PASSWORD=<your_password>\nrattler-build upload artifactory -u <url> -c <channel> <package_files>\n
"},{"location":"authentication_and_upload/#anacondaorg","title":"anaconda.org","text":"

To upload to anaconda.org, you need to specify the owner and API key. The API key is used to authenticate the upload.

The owner is the owner of the distribution, for example, your user name or organization.

One can also specify a label such as dev for release candidates using the -c flag. The default value is main.

You can also add the --force argument to forcibly upload a new package (and overwrite any existing ones).

export ANACONDA_API_KEY=<your_token>\nrattler-build upload anaconda -o <your_username> -c <label> <package_files>\n
"},{"location":"automatic_linting/","title":"Enabling Automatic Linting in VSCode","text":"

Our new recipe format adheres to a strict JSON schema, which you can access here.

This schema is implemented using pydantic and can be rendered into a JSON schema file. The YAML language server extension in VSCode is capable of recognizing this schema, providing useful hints during the editing process.

To enable automatic linting with the YAML language server, you need to add the following line at the beginning of your recipe file:

# yaml-language-server: $schema=https://raw.githubusercontent.com/prefix-dev/recipe-format/main/schema.json\n

Alternatively, if you prefer not to add this line to your file, you can install the JSON Schema Store Catalog extension. This extension will also enable automatic linting for your recipe files.

"},{"location":"build_options/","title":"Advanced build options","text":"

There are some specialized build options to control various features:

  • prefix replacement
  • variant configuration
  • encoded file type

These are all found under the build key in the recipe.yaml.

"},{"location":"build_options/#include-only-certain-files-in-the-package","title":"Include only certain files in the package","text":"

Sometimes you may want to include only a subset of the files installed by the build process in your package. For this, the files key can be used. Only new files are considered for inclusion (i.e. files that were not in the host environment beforehand).

recipe.yaml
build:\n  # select files to be included in the package\n  # this can be used to remove files from the package, even if they are installed in the\n  # environment\n  files: list of globs\n

For example, to only include the header files in a package, you could use:

recipe.yaml
build:\n  files:\n    - include/**/*.h\n

Glob patterns throughout the recipe file can also use a flexible include / exclude pair, such as:

recipe.yaml
build:\n  files:\n    include:\n      - include/**/*.h\n    exclude:\n      - include/**/private.h\n
"},{"location":"build_options/#glob-evaluation","title":"Glob evaluation","text":"

Glob patterns are used throughout the build options to specify files. The patterns are matched against the relative path of the file in the build directory. Patterns can contain * to match any number of characters, ? to match a single character, and ** to match any number of directories.

For example:

  • *.txt matches all files ending in .txt
  • **/*.txt matches all files ending in .txt in any directory
  • **/test_*.txt matches all files starting with test_ and ending in .txt in any directory
  • foo/ matches all files under the foo directory

The globs are always evaluated relative to the prefix directory. If you have no include globs, but an exclude glob, then all files are included except those that match the exclude glob. This is equivalent to include: ['**'].
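
For example, an exclude-only pattern behaves as if include: ['**'] were also set, so everything except the excluded files is packaged (a sketch; the path is illustrative):

recipe.yaml
build:\n  files:\n    exclude:\n      # package everything except the documentation\n      - share/doc/**\n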

"},{"location":"build_options/#always-include-and-always-copy-files","title":"Always include and always copy files","text":"

There are some options that control the inclusion of files in the final package.

The always_include_files option can be used to include files even if they are already in the environment as part of some other host dependency. This is normally \"clobbering\" and should be used with caution (since packages should not have any overlapping files).

The always_copy_files option can be used to copy files instead of linking them. This is useful for files that might be modified inside the environment (e.g. configuration files). Normally, files are linked from a central cache into the environment to save space \u2013 that means that files modified in one environment will be modified in all environments. This is not always desirable, and in that case you can use the always_copy_files option.

Note: How always_copy_files works: the always_copy_files option works by setting the no_link option in info/paths.json to true for the files in question. This means that the files are copied instead of linked when the package is installed.

recipe.yaml
build:\n  # include files even if they are already in the environment\n  # as part of some other host dependency\n  always_include_files: list of globs\n\n  # do not soft- or hard-link these files, but always copy them (formerly `no_link`)\n  always_copy_files: list of globs\n
"},{"location":"build_options/#merge-build-and-host-environments","title":"Merge build and host environments","text":"

In very rare cases you might want to merge the build and host environments to obtain the \"legacy\" behavior of conda-build.

recipe.yaml
build:\n  # merge the build and host environments (used in many R packages on Windows)\n  merge_build_and_host_envs: bool (defaults to false)\n
"},{"location":"build_options/#prefix-detection-replacement-options","title":"Prefix detection / replacement options","text":"

At installation time, the "install" prefix is injected into text and binary files. Sometimes this is not desired, and sometimes the user might want closer control over the automatic text/binary detection.

The main difference between prefix replacement for text and binary files is that for binary files, the prefix string is padded with null bytes to match the length of the original prefix. The original prefix is the very long placeholder string that you might have seen in the build process.

On Windows, binary prefix replacement is never performed.

recipe.yaml
package:\n  name: mypackage\n  version: 1.0\n\nbuild:\n  # settings concerning the prefix detection in files\n  prefix_detection:\n    # force the file type of the given files to be TEXT or BINARY\n    # for prefix replacement\n    force_file_type:\n      # force TEXT file type (list of globs)\n      text: list of globs\n      # force binary file type (list of globs)\n      binary: list of globs\n\n    # ignore all or specific files for prefix replacement\n    ignore: bool | [path] (defaults to false)\n\n    # whether to detect binary files with prefix or not\n    # defaults to true on Unix and (always) false on Windows\n    ignore_binary_files: bool\n
"},{"location":"build_options/#variant-configuration","title":"Variant configuration","text":"

To control the variant precisely you can use the \"variant configuration\" options.

A variant package has the same version number but a different "hash" and potentially different dependencies or build options. Variant keys are extracted from the variant configuration file; usually, any Jinja variable used in the recipe and any dependency without a version specifier become variant keys.

Variant keys can also be forcibly set or ignored with the use_keys and ignore_keys options.

In order to decide which of the variant packages to prefer and install by default, the down_prioritize_variant option can be used. The higher the value, the less preferred the variant is.

More about variants can be found in the variant documentation.

The following options are available in the build section to control the variant configuration:

recipe.yaml
build:\n  # settings for the variant\n  variant:\n    # Keys to forcibly use for the variant computation\n    # even if they are not in the dependencies\n    use_keys: list of strings\n\n    # Keys to forcibly ignore for the variant computation\n    # even if they are in the dependencies\n    ignore_keys: list of strings\n\n    # used to prefer this variant less\n    down_prioritize_variant: integer (defaults to 0, higher is less preferred)\n
"},{"location":"build_options/#dynamic-linking-configuration","title":"Dynamic linking configuration","text":"

After the package is built, rattler-build performs some \"post-processing\" on the binaries and libraries.

This entails making the shared libraries relocatable and checking that all linked libraries are present in the run requirements. The following settings control this behavior.

With the rpaths option you can forcibly set the rpath of the shared libraries. The paths are relative to the install prefix. Any rpath setting is ignored on Windows.

The rpath_allowlist option can be used to allow the rpath to point to locations outside of the environment. This is useful if you want to link against libraries that are not part of the conda environment (e.g. proprietary software).

If you want to stop rattler-build from relocating the binaries, you can set binary_relocation to false. If you want to only relocate some binaries, you can select the relevant ones with a glob pattern.

To read more about rpaths and how rattler-build creates relocatable binary packages, see the internals docs.

If you link against some libraries (possibly even outside of the prefix, in a system location), then you can use the missing_dso_allowlist to allow linking against these and suppress any warnings. This list is pre-populated with a list of known system libraries on the different operating systems.
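
For instance, to allow linking against a system library that lives outside of any conda environment (a sketch; the glob is illustrative):

recipe.yaml
build:\n  dynamic_linking:\n    missing_dso_allowlist:\n      # system CUDA driver library, provided by the host system\n      - \"**/libcuda.so*\"\n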

As part of the post-processing, rattler-build checks for overlinking and overdepending. \"Overlinking\" is when a binary links against a library that is not specified in the run requirements. This is usually a mistake because the library would not be present in the environment when the package is installed.

Conversely, \"overdepending\" is when a library is part of the run requirements, but is not actually used by any of the binaries/libraries in the package.

recipe.yaml
build:\n  # settings for shared libraries and executables\n  dynamic_linking:\n    # linux only, list of rpaths relative to the installation prefix\n    rpaths: list of paths (defaults to ['lib/'])\n\n    # Allow runpath / rpath to point to these locations\n    # outside of the environment\n    rpath_allowlist: list of globs\n\n    # whether to relocate binaries or not. If this is a list of paths, then\n    # only the listed paths are relocated\n    binary_relocation: bool (defaults to true) | list of globs\n\n    # Allow linking against libraries that are not in the run requirements\n    missing_dso_allowlist: list of globs\n\n    # what to do when detecting overdepending\n    overdepending_behavior: \"ignore\" or \"error\" # (defaults to \"error\")\n\n    # what to do when detecting overlinking\n    overlinking_behavior: \"ignore\" or \"error\" # (defaults to \"error\")\n
"},{"location":"build_script/","title":"Build scripts","text":"

The build.sh file is the build script for Linux and macOS and build.bat is the build script for Windows. These scripts contain the logic that carries out your build steps. Anything that your build script copies into the $PREFIX or %PREFIX% folder will be included in your output package.

For example, this build.sh installs a script from the recipe directory into $PREFIX/bin:

build.sh
mkdir -p $PREFIX/bin\ncp $RECIPE_DIR/my_script_with_recipe.sh $PREFIX/bin/super-cool-script.sh\n

There are many environment variables defined for you to use in build.sh and build.bat. Please see environment variables for more information.

build.sh and build.bat are optional. You can instead use the build/script key in your recipe.yaml, with each value being either a string command or a list of string commands. Any commands you put there must be able to run on every platform for which you build. For example, you can't use the cp command because cmd.exe won't understand it on Windows.

build.sh is run with bash and build.bat is run with cmd.exe.

recipe.yaml
build:\n  script:\n    - if: unix\n      then:\n        - mkdir -p $PREFIX/bin\n        - cp $RECIPE_DIR/my_script_with_recipe.sh $PREFIX/bin/super-cool-script.sh\n    - if: win\n      then:\n        - mkdir %PREFIX%\\bin\n        - copy %RECIPE_DIR%\\my_script_with_recipe.bat %PREFIX%\\bin\\super-cool-script.bat\n
"},{"location":"build_script/#environment-variables","title":"Environment variables","text":"

There are many environment variables that are automatically set during the build process.

However, you can also set your own environment variables easily in the script section of your recipe:

recipe.yaml
build:\n  script:\n    # Either use `content` or `file` to specify the script\n    # Note: this script only works on Unix :)\n    content: |\n      echo $FOO\n      echo $BAR\n      echo \"Secret value: $BAZ\"\n    env:\n      # hard coded value for `FOO`\n      FOO: \"foo\"\n      # Forward a value from the \"outer\" environment\n      # Without `default=...`, the build process will error if `BAR` is not set\n      BAR: ${{ env.get(\"BAR\", default=\"NOBAR\") }}\n    secrets:\n      # This value is a secret and will be masked in the logs and not stored in the rendered recipe\n      # The value needs to be available as an environment variable in the outer environment\n      - BAZ\n
"},{"location":"build_script/#alternative-script-interpreters","title":"Alternative script interpreters","text":"

With rattler-build and the new recipe syntax you can select an interpreter for your script.

So far, the following interpreters are supported:

  • bash (default on Unix)
  • cmd.exe (default on Windows)
  • nushell
  • python

Note

Using alternative interpreters is less battle-tested than using bash or cmd.exe. If you encounter any issues, please open an issue.

"},{"location":"build_script/#using-nushell","title":"Using nushell","text":"

In order to use nushell you can select the interpreter: nu or have a build.nu file in your recipe directory. Nushell works on Windows, macOS and Linux with the same syntax.

recipe.yaml
build:\n  script:\n    interpreter: nu\n    content: |\n      echo \"Hello from nushell!\"\n\n# Note: it's required to have `nushell` in the `build` section of your recipe!\nrequirements:\n  build:\n    - nushell\n
"},{"location":"build_script/#using-python","title":"Using python","text":"

In order to use python you can select the interpreter: python or have a build.py file in your recipe directory and python in the requirements/build section.

recipe.yaml
build:\n  script:\n    interpreter: python\n    content: |\n      print(\"Hello from Python!\")\n\n# Note: it's required to have `python` in the `build` section of your recipe!\nrequirements:\n  build:\n    - python\n
"},{"location":"build_script/#default-environment-variables-set-during-the-build-process","title":"Default environment variables set during the build process","text":"

During the build process, the following environment variables are set: on Windows with build.bat, and on macOS and Linux with build.sh. By default, these are the only variables available to your build script. Unless otherwise noted, no variables are inherited from the shell environment in which you invoke rattler-build. To forward additional variables, use the env and secrets options of the script section described above.

ARCH

Either 32 or 64, to specify whether the build is 32-bit or 64-bit. The value is derived from the architecture of the target platform.

CMAKE_GENERATOR

The CMake generator string for the current build environment. On Linux systems, this is always Unix Makefiles. On Windows, it is generated according to the Visual Studio version activated at build time, for example, Visual Studio 9 2008 Win64.

CONDA_BUILD=1

Always set to 1, indicating that a package build is running (kept for compatibility with conda-build).

CPU_COUNT

Represents the number of CPUs on the system.

SHLIB_EXT

Denotes the shared library extension specific to the operating system (e.g. .so for Linux, .dylib for macOS, and .dll for Windows).
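
For example, a build script can use it to check for an installed library in a platform-independent way (a sketch; the library name is illustrative):

recipe.yaml
build:\n  script:\n    # resolves to libfoo.so on Linux and libfoo.dylib on macOS\n    - test -f $PREFIX/lib/libfoo$SHLIB_EXT\n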

HTTP_PROXY, HTTPS_PROXY

Inherited from the user's shell environment, specifying the HTTP and HTTPS proxy settings.

LANG

Inherited from the user's shell environment, defining the system language and locale settings.

MAKEFLAGS

Inherited from the user's shell environment. This can be used to set additional arguments for the make command, such as -j2 to utilize 2 CPU cores for building the recipe.

PY_VER

Specifies the Python version against which the build is occurring. This can be modified with a variant configuration file.

PATH

Inherited from the user's shell environment and augmented with the activated host and build prefixes.

PREFIX

The path to the (host) prefix where the build script should install the software.

PKG_BUILDNUM

Indicates the build number of the package currently being built.

PKG_NAME

The name of the package that is being built.

PKG_VERSION

The version of the package currently under construction.

PKG_BUILD_STRING

The complete build string of the package being built, including the hash (e.g. py311h21422ab_0).

PKG_HASH

Represents the hash of the package being built, excluding the leading 'h' (e.g. 21422ab).

PYTHON

The path to the Python executable in the host prefix. Python is installed in the host prefix only when it is listed as a host requirement.

R

The path to the R executable in the build prefix. R is installed in the build prefix only when it is listed as a build requirement.

RECIPE_DIR

The directory where the recipe is located.

SP_DIR

The location of Python's site-packages, where Python libraries are installed.

SRC_DIR

The path to where the source code is unpacked or cloned. If the source file is not a recognized archive format, this directory contains a copy of the source file.

STDLIB_DIR

The location of Python's standard library.

build_platform

The platform on which the build is running (e.g. linux-64), in contrast to the target_platform that the package is being built for.

Compared to conda-build, the following variables have been removed:

  • NPY_VER
  • PY3K

"},{"location":"build_script/#windows","title":"Windows","text":"

Unix-style packages on Windows are built in a special Library directory under the build prefix. The environment variables listed in the following table are defined only on Windows.

  • LIBRARY_BIN: <build prefix>\Library\bin
  • LIBRARY_INC: <build prefix>\Library\include
  • LIBRARY_LIB: <build prefix>\Library\lib
  • LIBRARY_PREFIX: <build prefix>\Library
  • SCRIPTS: <build prefix>\Scripts

Not yet supported in rattler-build:

  • CYGWIN_PREFIX
  • VS_MAJOR
  • VS_VERSION
  • VS_YEAR

Additionally, the following variables are forwarded from the environment:

  • ALLUSERSPROFILE
  • APPDATA
  • CommonProgramFiles
  • CommonProgramFiles(x86)
  • CommonProgramW6432
  • COMPUTERNAME
  • ComSpec
  • HOMEDRIVE
  • HOMEPATH
  • LOCALAPPDATA
  • LOGONSERVER
  • NUMBER_OF_PROCESSORS
  • PATHEXT
  • ProgramData
  • ProgramFiles
  • ProgramFiles(x86)
  • ProgramW6432
  • PROMPT
  • PSModulePath
  • PUBLIC
  • SystemDrive
  • SystemRoot
  • TEMP
  • TMP
  • USERDOMAIN
  • USERNAME
  • USERPROFILE
  • windir
  • PROCESSOR_ARCHITEW6432
  • PROCESSOR_ARCHITECTURE
  • PROCESSOR_IDENTIFIER
"},{"location":"build_script/#unix","title":"Unix","text":"

The environment variables listed in the following table are defined only on macOS and Linux.

  • HOME: the standard $HOME environment variable
  • PKG_CONFIG_PATH: path to the pkgconfig directory, defaults to $PREFIX/lib/pkgconfig
  • SSL_CERT_FILE: path to the SSL_CERT_FILE certificate bundle
  • CFLAGS: empty; can be forwarded from the outer environment to pass additional arguments to the C compiler
  • CXXFLAGS: same as CFLAGS, for the C++ compiler
  • LDFLAGS: empty; additional flags to be passed to the linker when linking object files into an executable or shared object
"},{"location":"build_script/#macos","title":"macOS","text":"

The environment variables listed in the following table are defined only on macOS.

  • MACOSX_DEPLOYMENT_TARGET: same as the Anaconda Python macOS deployment target; currently 10.9 for Intel 32- and 64-bit macOS, and 11.0 for arm64
  • OSX_ARCH: i386, x86_64, or arm64, depending on the target platform
"},{"location":"build_script/#linux","title":"Linux","text":"

The environment variables listed in the following table are defined only on Linux.

  • LD_RUN_PATH: defaults to <build prefix>/lib
  • QEMU_LD_PREFIX: the prefix used by QEMU's user mode emulation for library paths
  • QEMU_UNAME: set qemu uname release string to 'uname'
  • DEJAGNU: the path to the dejagnu testing framework used by the GCC test suite
  • DISPLAY: the X11 display to use for graphical applications
  • BUILD: target triple ({build_arch}-conda_{build_distro}-linux-gnu) where build_distro is one of cos6 or cos7, for CentOS 6 or 7
"},{"location":"compilers/","title":"Compilers and cross-compilation","text":"

To use a compiler in your project, it's best to use the ${{ compiler('lang') }} template function. The compiler function works by taking a language, determining the configured compiler for that language, and adding some information about the target platform to the selected compiler. To configure a compiler for a specific language, the variant configuration file can be used.

For example, in a recipe that uses a C-compiler, you can use the following code:

requirements:\n  build:\n    - ${{ compiler('c') }}\n

To set the compiler that you want to use, create a variant config that looks like the following:

c_compiler:\n  - gcc\n\n# optionally you can specify a version\nc_compiler_version:\n  - 9.3.0\n

When the template function is evaluated, it will look something like: gcc_linux-64 9.3.0. You can define your own compilers: for example, for Rust you can use ${{ compiler('rust') }} together with the rust_compiler and rust_compiler_version keys in your variant config.
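
A sketch of such a variant configuration (the key names follow the <lang>_compiler / <lang>_compiler_version convention; the version shown is illustrative):

rust_compiler:\n  - rust\n\n# the version is illustrative\nrust_compiler_version:\n  - 1.77\n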

"},{"location":"compilers/#cross-compilation","title":"Cross-compilation","text":"

Cross-compilation is supported by rattler-build, and the compiler template function is part of what makes it possible. When you want to cross-compile from linux-64 to linux-aarch64 (i.e. Intel to ARM), you can pass --target-platform linux-aarch64 to the rattler-build command. This will cause the compiler template function to select a compiler that is configured for linux-aarch64. The above example would resolve to gcc_linux-aarch64 9.3.0. Provided that the compiler package is available for linux-64 (your build platform), the compilation should succeed.

The distinction between the build and host sections begins to make sense when thinking about cross-compilation. The build environment is resolved to packages that need to run at compilation time. For example, cmake, gcc, and autotools are all tools that need to be executed. Therefore, the build environment resolves to packages for the linux-64 architecture (in our example). On the other hand, the host packages resolve to linux-aarch64 - those are packages that we want to link against.

# packages that need to run at build time (cmake, gcc, autotools, etc.)\n# in the platform that rattler-build is executed on (the build_platform)\nbuild:\n  - cmake\n  - ${{ compiler('c') }}\n# packages that we want to link against in the architecture we are\n# cross-compiling to the target_platform\nhost:\n  - libcurl\n  - openssl\n
"},{"location":"converting_from_conda_build/","title":"Converting a recipe from conda-build","text":"

The recipe format of rattler-build differs in some aspects from conda-build. This document aims to help you convert a recipe from conda-build to rattler-build.

"},{"location":"converting_from_conda_build/#automatic-conversion","title":"Automatic conversion","text":"

To convert a recipe from meta.yaml to recipe.yaml you can use the automatic conversion utility.

To install conda-recipe-manager, run

pixi global install conda-recipe-manager\n# or\nconda install -c conda-forge conda-recipe-manager\n

Then, run the conversion utility:

conda-recipe-manager convert my-recipe/meta.yaml\n

This will print the converted recipe to the console. You can save it to a file by redirecting the output:

conda-recipe-manager convert my-recipe/meta.yaml > recipe.yaml\n

To learn more about the tool, or contribute, find the repository here.

"},{"location":"converting_from_conda_build/#converting-jinja-and-selectors","title":"Converting Jinja and selectors","text":"

To use Jinja in the new recipes, you need to keep two conversions in mind. The {% set version = "1.2.3" %} syntax is replaced by the context section in the new recipe format.

{% set version = \"1.2.3\" %}\n

becomes

context:\n  version: \"1.2.3\"\n

To use the values or other Jinja expressions (e.g. from the variant config) you can use the ${{ version }} syntax. Note the $ sign before the curly braces - it makes Jinja fully compatible with the YAML format.

meta.yaml
# instead of\npackage:\n  version: \"{{ version }}\"\nsource:\n  url: https://example.com/foo-{{ version }}.tar.gz\n

becomes

recipe.yaml
package:\n  version: ${{ version }}\nsource:\n  url: https://example.com/foo-${{ version }}.tar.gz\n
"},{"location":"converting_from_conda_build/#converting-selectors","title":"Converting selectors","text":"

conda-build has a line-based "selector" system, e.g. to disable certain fields on Windows vs. Unix.

In rattler-build we use two different syntaxes: an if/then/else map or an inline Jinja expression.

A typical selector in conda-build looks something like this:

meta.yaml
requirements:\n  host:\n    - pywin32  # [win]\n

To convert this to rattler-build syntax, you can use one of the following two syntaxes:

recipe.yaml
requirements:\n  host:\n    - ${{ \"pywin32\" if win }}  # empty strings are automatically filtered\n    # or\n    - if: win\n      then:\n        - pywin32  # this list extends the outer list\n
"},{"location":"converting_from_conda_build/#converting-the-recipe-script","title":"Converting the recipe script","text":"

We still support the build.sh script, but the bld.bat script was renamed to build.bat in order to be more consistent with the build.sh script.

You can also choose a different name for your script:

build:\n  # note: if there is no extension, we will try to find .sh on unix and .bat on windows\n  script: my_build_script\n

There are also new ways of writing scripts, for example with nushell or python.

Variant keys in build scripts

conda-build tries to analyze the build scripts for any usage of variant keys; rattler-build does not attempt that. If you want to use variant keys in your build script that are not used anywhere else, you need to manually add them to your script environment, e.g.:

recipe.yaml
build:\n  script:\n    content: echo $MY_VARIANT\n    env:\n      MY_VARIANT: ${{ my_variant }}\n
"},{"location":"converting_from_conda_build/#converting-the-recipe-structure","title":"Converting the recipe structure","text":"

There are a few differences in the recipe structure. However, the schema will tell you quite easily what is expected and you should see red squiggly lines in your editor (e.g. VSCode) if you make a mistake.

Here are a few differences (a combined sketch follows this list):

  • build.run_exports is now requirements.run_exports
  • requirements.run_constrained is now requirements.run_constraints
  • build.ignore_run_exports is now requirements.ignore_run_exports.by_name
  • build.ignore_run_exports_from is now requirements.ignore_run_exports.from_package
  • A git source now uses git, tag, ... and not git_url and git_rev, e.g.
    git: https://github.com/foo/bar.git\ntag: 1.2.3\n
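
Combined into a sketch of the new key locations (the package names are illustrative):

recipe.yaml
requirements:\n  run_exports:\n    - ${{ pin_subpackage(\"mylib\") }}\n  run_constraints:\n    - mylib-extras >=1.0\n  ignore_run_exports:\n    by_name:\n      - zlib\n    from_package:\n      - openssl\n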
"},{"location":"converting_from_conda_build/#converting-the-test-section","title":"Converting the test section","text":"

The test section is renamed to tests and is a list of independent tests. Each test runs in its own environment.

Let's have a look at converting an existing test section:

meta.yaml
test:\n  imports:\n    - mypackage\n  commands:\n    - mypackage --version\n

This would now be split into two tests:

recipe.yaml
tests:\n  - script:\n      - mypackage --version\n  - python:\n      imports:\n        - mypackage\n      # by default we perform a `pip check` in the python test but\n      # it can be disabled by setting this to false\n      pip_check: false\n

The script tests also take a requirements section with run and build requirements. The build requirements can be used to install emulators and similar tools that need to run to execute tests in a cross-compilation environment.
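
A sketch of such a test (the emulator package name is hypothetical):

recipe.yaml
tests:\n  - script:\n      - mypackage --version\n    requirements:\n      build:\n        # hypothetical emulator for running cross-compiled binaries\n        - some-emulator\n      run:\n        # extra packages installed into the test environment\n        - pytest\n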

"},{"location":"experimental_features/","title":"Experimental features","text":"

Warning

These are experimental features of rattler-build and may change or go away completely.

Currently only the build and rebuild commands support the following experimental features.

To enable them, use the --experimental flag with the command. Or, use the environment variable, RATTLER_BUILD_EXPERIMENTAL=1.

"},{"location":"experimental_features/#jinja-functions","title":"Jinja functions","text":""},{"location":"experimental_features/#load_from_filefile_path","title":"load_from_file(<file_path>)","text":"

The Jinja function load_from_file loads data from files; specifically, it parses toml, json, and yaml files into objects, so that values can be fetched directly from the file contents. All other file types are loaded as strings.

"},{"location":"experimental_features/#usage","title":"Usage","text":"

load_from_file is useful when there is a project description in a well-defined project file such as Cargo.toml, package.json, pyproject.toml, package.yaml, or stack.yaml. It keeps the recipe as simple as possible and avoids having to keep metadata in sync by hand; example use cases are CI/CD infrastructure or projects with a well-defined output format.

Below is an example loading a Cargo.toml inside of the rattler-build GitHub repository:

recipe.yaml
context:\n  name: ${{ load_from_file(\"Cargo.toml\").package.name }}\n  version: ${{ load_from_file(\"Cargo.toml\").package.version }}\n  source_url: ${{ load_from_file(\"Cargo.toml\").package.homepage }}\n  rust_toolchain: ${{ load_from_file(\"rust-toolchains\") }}\n\npackage:\n  name: ${{ name }}\n  version: ${{ version }}\n\nsource:\n  git: ${{ source_url }}\n  tag: ${{ source_tag }}\n\nrequirements:\n  build:\n    - rust ==${{ rust_toolchain }}\n\nbuild:\n  script: cargo build --release -p ${{ name }}\n\ntests:\n  - script: cargo test -p ${{ name }}\n  - script: cargo test -p rust-test -- --test-threads=1\n\nabout:\n  home: ${{ source_url }}\n  repository: ${{ source_url }}\n  documentation: ${{ load_from_file(\"Cargo.toml\").package.documentation }}\n  summary: ${{ load_from_file(\"Cargo.toml\").package.description }}\n  license: ${{ load_from_file(\"Cargo.toml\").package.license }}\n
"},{"location":"experimental_features/#git-functions","title":"git functions","text":"

git functions are useful for getting the latest tag and commit hash. These can be used in the context section of the recipe, to fetch version information from a repository.

Examples
# latest tag in the repo\ngit.latest_tag(<git_repo_url>)\n\n# latest tag revision(aka, hash of tag commit) in the repo\ngit.latest_tag_rev(<git_repo_url>)\n\n# latest commit revision(aka, hash of head commit) in the repo\ngit.head_rev(<git_repo_url>)\n
"},{"location":"experimental_features/#usage_1","title":"Usage","text":"

These can be useful for automating minor things inside the recipe itself, such as checking whether the current version or the current commit hash is the latest one.

recipe.yaml
context:\n  git_repo_url: \"https://github.com/prefix-dev/rattler-build\"\n  latest_tag: ${{ git.latest_tag( git_repo_url ) }}\n\npackage:\n  name: \"rattler-build\"\n  version: ${{ latest_tag }}\n\nsource:\n  git: ${{ git_repo_url }}\n  tag: ${{ latest_tag }}\n

There is currently no guarantee of caching for repo fetches when using git functions. This may lead to some performance issues.

"},{"location":"highlevel/","title":"What is rattler-build?","text":"

rattler-build is a tool to build and package software so that it can be installed on any operating system \u2013 with any compatible package manager such as mamba, conda, or rattler. We are also intending for rattler-build to be used as a library to drive builds of packages from any other recipe format in the future.

"},{"location":"highlevel/#how-does-rattler-build-work","title":"How does rattler-build work?","text":"

Building of packages consists of several steps. It all begins with a recipe.yaml file that specifies how the package is to be built and what the dependencies are. From the recipe file, rattler-build executes several steps:

  1. Rendering: Parse the recipe file and evaluate conditionals, Jinja expressions, variables, and variants.

  2. Fetch source: Retrieve the specified sources, such as .tar.gz files, git repositories, or local paths. Additionally, this step applies any patches that are specified alongside the source.

  3. Install build environments: Download and install dependencies into temporary \"host\" and \"build\" workspaces. Any dependencies that are needed at build time are installed in this step.

  4. Build source: Execute the build script to build/compile the source code and install it into the host environment.

  5. Prepare package files: Collect all files that are new in the \"host\" environment and apply some transformations if necessary; specifically, we edit the rpath on Linux and macOS to make binaries relocatable.

  6. Package: Bundle all the files in a package and write out any additional metadata into the info/index.json, info/about.json, and info/paths.json files. This also creates the test files that are bundled with the package.

  7. Test: Run any tests specified in the recipe. The package is considered done if it passes all the tests; otherwise it's moved to broken/ in the output directory.

After this process, a package is created. This package can then be uploaded, for example to a private or public channel on prefix.dev.

"},{"location":"highlevel/#how-to-run-rattler-build","title":"How to run rattler-build","text":"

Running rattler-build is straightforward. It can be done on the command line:

rattler-build build --recipe myrecipe/recipe.yaml\n

A custom channel that is not conda-forge (the default) can be specified like so:

rattler-build build -c robostack --recipe myrecipe/recipe.yaml\n

You can also use the --recipe-dir argument if you want to build all the packages in a directory:

rattler-build build --recipe-dir myrecipes/\n
"},{"location":"highlevel/#overview-of-a-recipeyaml","title":"Overview of a recipe.yaml","text":"

A recipe.yaml file is separated into multiple sections and can conditionally include or exclude sections. Recipe files also support a limited amount of string interpolation with Jinja (specifically minijinja in our case).

A simple example of a recipe file for the zlib package would look as follows:

recipe.yaml
# variables from the context section can be used in the rest of the recipe\n# in jinja expressions\ncontext:\n  version: 1.2.13\n\npackage:\n  name: zlib\n  version: ${{ version }}\n\nsource:\n  url: http://zlib.net/zlib-${{ version }}.tar.gz\n  sha256: b3a24de97a8fdbc835b9833169501030b8977031bcb54b3b3ac13740f846ab30\n\nbuild:\n  # build numbers can be set arbitrarily\n  number: 0\n  script:\n    # build script to install the package into the $PREFIX (host prefix)\n    - if: unix\n      then:\n      - ./configure --prefix=$PREFIX\n      - make -j$CPU_COUNT\n    - if: win\n      then:\n      - cmake -G \"Ninja\" -DCMAKE_BUILD_TYPE=Release -DCMAKE_PREFIX_PATH=%LIBRARY_PREFIX%\n      - ninja install\n\nrequirements:\n  build:\n    # compiler is a special function.\n    - ${{ compiler(\"c\") }}\n    # The following two dependencies are only needed on Windows,\n    # and thus conditionally selected\n    - if: win\n      then:\n        - cmake\n        - ninja\n    - if: unix\n      then:\n        - make\n

The sections of a recipe are:

  • context: Defines variables that can be used in the Jinja context later in the recipe (e.g. name and version are commonly interpolated in strings)
  • package: Defines the name and version of the package you are currently building; this will be the name of the final output
  • source: Defines where the source code is going to be downloaded from, together with checksums
  • build: Settings for the build and the build script
  • requirements: Allows the definition of build, host, run, and run-constrained dependencies
"},{"location":"internals/","title":"What does rattler-build do to build a package?","text":"

rattler-build creates conda packages which are relocatable packages. These packages are built up with some rules and conventions in mind.

"},{"location":"internals/#what-goes-into-a-package","title":"What goes into a package?","text":"

Generally speaking, any new files that are copied into the $PREFIX directory at build time are part of the new package. However, there is some filtering going on to exclude unwanted files, and noarch: python packages have special handling as well. The rules are as follows:

"},{"location":"internals/#filtering","title":"Filtering","text":""},{"location":"internals/#general-file-filtering","title":"General File Filtering","text":"

Certain files are filtered out to prevent them from being included in the package. These include:

  • .pyo files: Optimized Python files are not included because they are considered harmful.
  • .la files: Libtool archive files that are not needed at runtime.
  • .DS_Store files: macOS-specific files that are irrelevant to the package.
  • .git files and directories: Version control files, including .gitignore and the .git directory, which are not needed in the package.
  • share/info/dir: This file is ignored because it would be written by multiple packages.
"},{"location":"internals/#special-handling-for-noarch-python-packages","title":"Special Handling for noarch: python Packages","text":"

For packages marked as noarch: python, special transformations are applied to ensure compatibility across different platforms (see the example after this list):

  • Stripping Python Library Prefix: The \"lib/pythonX.X\" prefix is removed, retaining only the \"site-packages\" part of the path.
  • Skipping __pycache__ Directories and .pyc Files: These are excluded and recreated during installation (they are specific to the Python version).
  • Replacing bin and Scripts Directories:
    • On Unix systems, the bin directory is replaced with python-scripts.
    • On Windows systems, the Scripts directory is replaced with python-scripts.
  • Removing explicitly mentioned entry points: For noarch: python packages, entry points registered in the package are also taken into account. Files in the bin or Scripts directories that match entry points are excluded to avoid duplication.
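
For illustration, these transformations map hypothetical files roughly as follows:

lib/python3.10/site-packages/mypkg/__init__.py -> site-packages/mypkg/__init__.py\nlib/python3.10/site-packages/mypkg/__pycache__/ -> (skipped; recreated at install time)\nbin/mycli -> python-scripts/mycli\n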
"},{"location":"internals/#symlink-handling","title":"Symlink Handling","text":"

Symlinks are carefully managed to ensure they are relative rather than absolute, which aids in making the package relocatable:

  • Absolute symlinks pointing within the $PREFIX are converted to relative symlinks.
  • On Unix systems, this conversion is handled directly by creating new relative symlinks.
  • On Windows, a warning is issued since symlink creation requires administrator privileges.
"},{"location":"internals/#making-packages-relocatable-with-rattler-build","title":"Making Packages Relocatable with rattler-build","text":"

Often, the most challenging aspect of building a package using rattler-build is making it relocatable. A relocatable package can be installed into any prefix, allowing it to be used outside the environment in which it was built. This is in contrast to a non-relocatable package, which can only be utilized within its original build environment.

rattler-build automatically performs the following actions to make packages relocatable:

  1. Binary object file conversion: Binary object files are converted to use relative paths using install_name_tool on macOS and patchelf on Linux. This uses $ORIGIN for ELF files on Linux and @loader_path for Mach-O files on macOS to make the rpath relative to the executable / shared library (see the sketch after this list).
  2. Text file prefix registration: Any text file without NULL bytes that contains the placeholder prefix has the registered prefix replaced with the install prefix.
  3. Binary file prefix detection and registration: Binary files containing the build prefix can be automatically registered. The registered files will have their build prefix replaced with the install prefix at install time. This works by padding the install prefix with null terminators, such that the length of the binary file remains the same. The build prefix must be long enough to accommodate any reasonable installation prefix. On macOS and Linux, rattler-build pads the build prefix to 255 characters by appending _placehold to the end of the build directory name.
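
To check the result of the rpath conversion on Linux, you can print the rpath of a shipped binary with patchelf; a minimal sketch (the library name is hypothetical):

patchelf --print-rpath $PREFIX/lib/libmylib.so\n# expected output similar to: $ORIGIN/../lib\n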
"},{"location":"multiple_output_cache/","title":"The cache for multiple outputs","text":"

Note

The \"multi-output\" cache is a little bit different from a compilation cache. If you look for tips and tricks on how to use sccache or ccache with rattler-build, please refer to the tips and tricks section.

Sometimes you build a package and want to split the contents into multiple sub-packages. For example, when building a C/C++ package, you might want to create multiple packages for the runtime requirements (library), and the development time requirements such as header files.

The \"cache\" output makes this easy. It allows you to specify a single top-level cache that can produce arbitrary files, that can then be used in other packages.

Let's take a look at an example:

recipe.yaml
recipe:\n  name: mypackage\n  version: '0.1.0'\n\ncache:\n  requirements:\n    build:\n      - ${{ compiler('c') }}\n  build:\n    script:\n      - mkdir -p $PREFIX/lib\n      - mkdir -p $PREFIX/include\n      - echo \"This is the library\" > $PREFIX/lib/library.txt\n      - echo \"This is the header\" > $PREFIX/include/header.txt\n\noutputs:\n  - package:\n      name: mypackage-library\n    build:\n      files:\n        - lib/*\n\n  - package:\n      name: mypackage-headers\n    build:\n      files:\n        - include/*\n

Note

Since this is an experimental feature, you need to pass the --experimental flag to enable parsing of the cache top-level section.

In this example, we have a single package called mypackage that creates two outputs: mypackage-library and mypackage-headers. The cache output will run like a regular output, but after the build is finished, the files will be copied to a \"cache\" directory (in your output folder, under output/build_cache).

The files in the cache folder are then copied into the $PREFIX of each output package. Since they are \"new\" files in the prefix, they will be included in the output package. The easiest way to select a subset of the files in the prefix is by using the files field in the output definition. You can use a list of globs to select only the files that you want.

For something more complicated, you can also use the include and exclude fields in the files selector. Please refer to the build options documentation.
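
As a sketch (the globs are illustrative), an output using include and exclude might look like this:

outputs:\n  - package:\n      name: mypackage-library\n    build:\n      files:\n        include:\n          - lib/*\n        exclude:\n          - lib/*.a\n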

"},{"location":"multiple_output_cache/#run-exports-from-the-cache","title":"Run exports from the cache","text":"

Since the cache output also has build and host requirements, we additionally need to take care of any \"run exports\" from the cache output. Run exports from the cache dependencies are handled very similarly to run exports of a given output: we append them to the outputs.

If the cache has an \"ignore run exports\" section, then we apply those filters at the cache level. If an output ignores any run exports, we also ignore them if they would come from the cache.

"},{"location":"multiple_output_cache/#caching-in-the-src_dir","title":"Caching in the $SRC_DIR","text":"

If you have used conda-build a lot, you might have noticed that a top-level build also caches the changes in the $SRC_DIR. This is not yet the case for rattler-build.

You can work around this by, for example, copying files into the $PREFIX and restoring them in each output.

"},{"location":"package_spec/","title":"Package specification","text":"

rattler-build produces \"conda\" packages. These packages work with the mamba and conda package managers, and they work cross-platform on Windows, Linux and macOS.

By default, a conda package is a tar.bz2 archive which contains:

  • Metadata under the info/ directory
  • A collection of files that are installed directly into an install prefix

The format is identical across platforms and operating systems. During the install process, all files are extracted into the install prefix, except the ones in info/. Installing a conda package into an environment is similar to executing the following commands:

cd <environment prefix>\ntar xjf mypkg-1.0.0-h2134.tar.bz2\n

Only files, including symbolic links, are part of a conda package; directories are not included. Directories are created and removed as needed during installation, but you cannot create an empty directory from the tar archive directly.

There is also a newer archive type, suffixed with .conda. This archive type consists of an outer \"zip\" archive that is not compressed, and two inner archives that are compressed with zstd, which is very fast for decompression.

The inner archives are split into info and pkg files, which makes it possible to extract only the info part of the archive (only the metadata), which is often smaller in size.
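
Because the outer archive is an uncompressed zip file, you can inspect a .conda package with standard zip tools. A minimal sketch (the package name is hypothetical; the inner archives typically follow the info-/pkg- naming convention):

unzip -l mypkg-1.0.0-h2134.conda\n# expected contents (illustrative):\n#   metadata.json\n#   info-mypkg-1.0.0-h2134.tar.zst\n#   pkg-mypkg-1.0.0-h2134.tar.zst\n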

"},{"location":"package_spec/#package-filename","title":"Package filename","text":"

A conda package filename conforms to the following scheme:

<name>-<version>-<hash>.tar.bz2 OR <name>-<version>-<hash>.conda\n
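
For example, xtensor-0.24.6-h60d57d3_0.tar.bz2 breaks down into the package name (xtensor), the version (0.24.6), and the build string (h60d57d3_0), where the trailing _0 is the build number.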
"},{"location":"package_spec/#special-files-in-packages","title":"Special files in packages","text":"

There are some special files in a package:

  • activation and deactivation scripts that are executed when the environment is activated or deactivated
  • post-link and pre-unlink scripts that are executed when the package is installed or uninstalled

You can read more about these files in the activation scripts and other special files section.

"},{"location":"package_spec/#package-metadata","title":"Package metadata","text":"

The info/ directory contains all metadata about a package. Files in this location are not installed under the install prefix. Although you are free to add any file to this directory, conda only inspects the content of the files discussed below:

"},{"location":"package_spec/#infoindexjson","title":"info/index.json","text":"

This file contains basic information about the package, such as name, version, build string, and dependencies. The content of this file is stored in repodata.json, which is the repository index file, hence the name index.json. The JSON object is a dictionary containing the keys shown below.

name: string

The lowercase name of the package. May contain lowercase characters, underscores, and dashes.

version: string

The package version. May not contain \"-\". Conforms to PEP 440.

build: string

The build string. May not contain \"-\". Differentiates builds of packages with otherwise identical names and versions, such as:

  • A build with other dependencies, such as Python 3.4 instead of Python 2.7.
  • A bug fix in the build process.
  • Some different optional dependencies, such as MKL versus ATLAS linkage.

Nothing in conda actually inspects the build string. Strings such as np18py34_1 are designed only for human readability and conda never parses them.

build_number: integer

A non-negative integer representing the build number of the package. Unlike the build string, the build_number is inspected by conda. Conda uses it to sort packages that have otherwise identical names and versions to determine the latest one. This is important because new builds that contain bug fixes for the way a package is built may be added to a repository.

depends: list of match specs

A list of dependency specifications, where each element is a string. These come from the run section of the recipe or any run exports of dependencies.

constrains: list of match specs

A list of optional dependency constraints. The packages listed under constrains are not installed by default, but if they are installed they have to respect the constraints.

subdir: string

The subdir (like linux-64) of this package.

arch: string

Optional. The architecture the package is built for, e.g. x86_64. This key is generally not used (duplicate information from subdir).

platform: string

Optional. The OS that the package is built for, e.g. osx. This key is generally not used (duplicate information from subdir).
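
Taken together, an info/index.json could look something like this (values are illustrative):

{\n  \"name\": \"zlib\",\n  \"version\": \"1.2.13\",\n  \"build\": \"h1234567_0\",\n  \"build_number\": 0,\n  \"depends\": [\n    \"libgcc-ng >=12\"\n  ],\n  \"constrains\": [],\n  \"subdir\": \"linux-64\",\n  \"platform\": \"linux\",\n  \"arch\": \"x86_64\"\n}\n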

"},{"location":"package_spec/#infopathsjson","title":"info/paths.json","text":"

The paths.json file lists all files that are installed into the environment.

It consists of a list of path entries, each with the following keys:

_path: string

The relative path of the file

path_type: optional, string

The type of linking; can be hardlink, softlink, or directory. The default is hardlink.

file_mode: optional, string

The file mode can be binary or text. This is only relevant for prefix replacement.

prefix_placeholder: optional, string

The prefix placeholder string that is encoded in the text or binary file, which is replaced at installation time. Note that this prefix placeholder uses / even on Windows.

no_link: bool, optional

Determines whether this file should be linked or not when installing the package (linking the file from the cache into the environment). Defaults to false.

sha256: string

The SHA256 hash of the file. For symbolic links it contains the SHA256 hash of the file pointed to.

size_in_bytes: number

The size, in bytes, of the file. For symbolic links, it contains the file size of the file pointed to.

Due to the way the binary replacement works, the placeholder prefix must be longer than the install prefix.
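
An illustrative info/paths.json with a single entry could look like this (the path, hash, size, and placeholder values are illustrative placeholders):

{\n  \"paths\": [\n    {\n      \"_path\": \"lib/libz.so.1.2.13\",\n      \"path_type\": \"hardlink\",\n      \"file_mode\": \"binary\",\n      \"prefix_placeholder\": \"/path/to/build_prefix_placehold_placehold\",\n      \"sha256\": \"<sha256 of the file>\",\n      \"size_in_bytes\": 123456\n    }\n  ],\n  \"paths_version\": 1\n}\n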

"},{"location":"package_spec/#infolicense","title":"info/license/<...>","text":"

All licenses mentioned in the recipe are copied to this folder.

"},{"location":"package_spec/#infoaboutjson","title":"info/about.json","text":"

Optional file. Contains the entries of the \"about\" section of the recipe.yaml file. The following keys are added to info/about.json if present in the build recipe:

Renamed fields

The new recipe spec renamed a few fields (from conda-build's original implementation). This means that some fields in the about.json file still have the old names (for backwards compatibility), while you would generally use different names in the recipe.

home: url (from about.homepage)

The URL of the homepage of the package.

dev_url: url (from about.repository)

The URL of the development repository of the package.

doc_url: url (from about.documentation)

The URL of the documentation of the package.

license_url: url

The URL of the license of the package.

license: string (from about.license)

The SPDX license identifier of the package.

summary: string

A short summary of the package.

description: string

A longer description of the package.

license_family: string

(This field is no longer used, as we rely on SPDX license identifiers.)
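
A minimal info/about.json could then look like this (the URLs are hypothetical):

{\n  \"home\": \"https://example.org/mypkg\",\n  \"dev_url\": \"https://github.com/example/mypkg\",\n  \"doc_url\": \"https://mypkg.example.org/docs\",\n  \"license\": \"MIT\",\n  \"summary\": \"A short summary of the package.\"\n}\n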

"},{"location":"package_spec/#inforecipe","title":"info/recipe/<...>","text":"

A directory containing the full contents of the build recipe. This folder also contains a rendered version of the recipe (rendered_recipe.yaml). This rendered version is used for the rebuild command. However, note that currently this format is still in flux and can change at any time.

You can also use --no-include-recipe to disable the inclusion of the recipe in the package.

"},{"location":"rebuild/","title":"Rebuilding a package","text":"

The rebuild command allows you to rebuild a package from an existing package. The main use case is to examine if a package can be rebuilt in a reproducible manner. You can read more about reproducible builds here.

"},{"location":"rebuild/#usage","title":"Usage","text":"
rattler-build rebuild ./mypkg-0.1.0-h60d57d3_0.tar.bz2\n
"},{"location":"rebuild/#how-it-works","title":"How it works","text":"

The recipe is \"rendered\" and stored into the package. The way the recipe is rendered is subject to change. For the moment, the rendered recipe is stored as info/recipe/rendered_recipe.yaml. It includes the exact package versions that were used at build time. When rebuilding, we use the package resolutions from the rendered recipe, and execute the same build script as the original package.

We also take great care to sort files in a deterministic manner and to erase any timestamps. The SOURCE_DATE_EPOCH environment variable is set to the same timestamp as the original build for additional determinism (some build tools use this variable to set timestamps).

"},{"location":"rebuild/#how-to-check-the-reproducibility-of-a-package","title":"How to check the reproducibility of a package","text":"

There is an excellent tool called diffoscope that allows you to compare two packages and see the differences. You can install it with pixi:

pixi global install diffoscope\n

To compare two packages, you can use the following command:

rattler-build rebuild ./build0.tar.bz2\ndiffoscope ./build0.tar.bz2 ./mypkg-0.1.0-h60d57d3_0.tar.bz2\n
"},{"location":"recipe_generation/","title":"Generating recipes for different ecosystems","text":"

rattler-build has built-in functionality to generate recipes for different (existing) ecosystems.

Currently we support the following ecosystems:

  • pypi (Python) - generates a recipe for a Python package
  • cran (R) - generates a recipe for an R package

To generate a recipe for a Python package, you can use the following command:

rattler-build generate-recipe pypi jinja2\n

This will generate a recipe for the jinja2 package from PyPI and print it to the console. To save it, you can either pipe stdout to a file or use the -w flag, which will create a new folder containing the recipe.

The generated recipe for jinja2 will look something like:

recipe.yaml
package:\n  name: jinja2\n  version: 3.1.4\n\nsource:\n- url: https://files.pythonhosted.org/packages/ed/55/39036716d19cab0747a5020fc7e907f362fbf48c984b14e62127f7e68e5d/jinja2-3.1.4.tar.gz\n  sha256: 4a3aee7acbbe7303aede8e9648d13b8bf88a429282aa6122a993f0ac800cb369\n\nbuild:\n  script: python -m pip install .\n\nrequirements:\n  host:\n  - flit_core <4\n  - python >=3.7\n  - pip\n  run:\n  - python >=3.7\n  - markupsafe >=2.0\n  # - babel >=2.7  # extra == 'i18n'\n\ntests: []\n\nabout:\n  summary: A very fast and expressive template engine.\n  documentation: https://jinja.palletsprojects.com/\n
"},{"location":"recipe_generation/#generating-recipes-for-r-packages","title":"Generating recipes for R packages","text":"

To generate a recipe for an R package, you can use the following command:

rattler-build generate-recipe cran dplyr\n

The R recipe generation supports some additional flags:

  • -u/--universe: select an R universe to use (e.g. bioconductor)
  • -t/--tree: generate recipes for every dependency in the tree as well

R packages will be prefixed with r- to avoid name conflicts with Python packages. The generated recipe for dplyr will look something like:

recipe.yaml
package:\n  name: r-dplyr\n  version: 1.1.4\n\nsource:\n- url: https://cran.r-project.org/src/contrib/dplyr_1.1.4.tar.gz\n  md5: e3066ea859b26e0d3b992c476ea3af2e\n\nbuild:\n  script: R CMD INSTALL --build .\n  python: {}\n\nrequirements:\n  host:\n  - r-base >=3.5.0\n  run:\n  - r-cli >=3.4.0\n  - r-generics\n  - r-glue >=1.3.2\n  - r-lifecycle >=1.0.3\n  - r-magrittr >=1.5\n  - r-methods\n  - r-pillar >=1.9.0\n  - r-r6\n  - r-rlang >=1.1.0\n  - r-tibble >=3.2.0\n  - r-tidyselect >=1.2.0\n  - r-utils\n  - r-vctrs >=0.6.4\n  # -  r-bench  # suggested\n  # -  r-broom  # suggested\n  # -  r-callr  # suggested\n  # -  r-covr  # suggested\n  # -  r-dbi  # suggested\n  # -  r-dbplyr >=2.2.1  # suggested\n  # -  r-ggplot2  # suggested\n  # -  r-knitr  # suggested\n  # -  r-lahman  # suggested\n  # -  r-lobstr  # suggested\n  # -  r-microbenchmark  # suggested\n  # -  r-nycflights13  # suggested\n  # -  r-purrr  # suggested\n  # -  r-rmarkdown  # suggested\n  # -  r-rmysql  # suggested\n  # -  r-rpostgresql  # suggested\n  # -  r-rsqlite  # suggested\n  # -  r-stringi >=1.7.6  # suggested\n  # -  r-testthat >=3.1.5  # suggested\n  # -  r-tidyr >=1.3.0  # suggested\n  # -  r-withr  # suggested\n\nabout:\n  homepage: https://dplyr.tidyverse.org, https://github.com/tidyverse/dplyr\n  summary: A Grammar of Data Manipulation\n  description: |-\n    A fast, consistent tool for working with data frame like\n    objects, both in memory and out of memory.\n  license: MIT\n  license_file: LICENSE\n  repository: https://github.com/cran/dplyr\n

Tip

You can use the generated recipes to build your own \"forge\" with rattler-build. Read more about it in the Building your own forge section.

"},{"location":"selectors/","title":"Selectors in recipes","text":"

Recipe and variant configuration files can utilize selectors to conditionally add, remove, or modify dependencies, configuration options, or even skip recipe execution based on specific conditions.

Selectors are implemented using an if / then / else map, which is a valid YAML dictionary. The condition is evaluated using minijinja and follows the same syntax as a Python expression.

During rendering, several variables are set based on the platform and variant being built. For example, the unix variable is true for macOS and Linux, while win is true for Windows. Consider the following recipe executed on Linux:

requirements:\n  host:\n    - if: unix\n      then: unix-tool\n    - if: win\n      then: win-tool\n

This will be evaluated as:

requirements:\n  host:\n    - unix-tool\n

The line containing the Windows-specific configuration is removed. Multiple items can also be selected, such as:

host:\n  - if: linux\n    then:\n      - linux-tool-1\n      - linux-tool-2\n      - linux-tool-3\n

For Linux, this will result in:

host:\n  - linux-tool-1\n  - linux-tool-2\n  - linux-tool-3\n

Other examples often found in the wild:

if: build_platform != target_platform ... # true if cross-platform build\nif: osx and arm64 ... # true for apple silicon (osx-arm64)\nif: linux and (aarch64 or ppc64le) ... # true for linux-aarch64 or linux-ppc64le\n
"},{"location":"selectors/#available-variables","title":"Available variables","text":"

The following variables are available during rendering of the recipe:

  • target_platform: the configured target_platform for the build
  • build_platform: the configured build_platform for the build
  • linux: \"true\" if target_platform is Linux
  • osx: \"true\" if target_platform is OSX / macOS
  • win: \"true\" if target_platform is Windows
  • unix: \"true\" if target_platform is a Unix (macOS or Linux)
  • x86, x86_64: 32-bit / 64-bit x86 architecture
  • aarch64, arm64: 64-bit Arm (these are the same, but both are supported for legacy reasons)
  • armV6l, armV7l: 32-bit Arm
  • ppc64, s390x: big endian architectures
  • ppc64le: little endian
  • riscv32, riscv64: the RISC-V architecture
  • wasm32: the WebAssembly architecture
"},{"location":"selectors/#variant-selectors","title":"Variant selectors","text":"

To select based on the variant configuration, you can use the variant names in selectors as well. For example, if the build uses python: 3.8 as a variant, we can use if: python == \"3.8\" to enable a dependency only when the Python version is 3.8.

String comparison

The comparison is a string comparison done by minijinja, so it is important to use the correct string representation of the variant. Use the match function to compare versions.

variants.yaml
python:\n  - 3.8\n  - 3.9\n
recipe.yaml
requirements:\n  host:\n    - if: python == \"3.8\" # (1)!\n      then: mydep\n      else: otherdep\n
  1. This will only add mydep when the Python version is 3.8. This comparison is a string comparison, so it is important to use the correct string representation of the variant.
"},{"location":"selectors/#the-match-function","title":"The match function","text":"

Note

Rename from cmp to match: the cmp function has been renamed to match to better reflect its purpose.

Inside selectors, one can use a special match function to test whether the selected variant version matches a given version spec. For example, with the following variants file, we could use these tests:

variants.yaml
python:\n  - 3.8\n  - 3.9\n
recipe.yaml
- if: match(python, \"3.8\")    # true, false\n  then: mydep\n- if: match(python, \">=3.8\")  # true, true\n  then: mydep\n- if: match(python, \"<3.8\")   # false, false (1)\n  then: mydep\n
  1. else: would also have worked here.

This function eliminates the need to implement any Python-specific conda-build selectors (such as py3k, py38, etc.) or the py and npy integers.

Please note that during the initial phase of rendering we do not know the variant, and thus the match condition always evaluates to true.

"},{"location":"selectors/#selector-evaluation","title":"Selector evaluation","text":"

Except for the rattler-build-specific selectors, selectors are evaluated using the minijinja engine, so conditions follow Python-like expression syntax. Some notable options are:

- if: python == \"3.8\" # equal\n- if: python != \"3.8\" # not equal\n- if: python and linux # true if python variant is set and the target_platform is linux\n- if: python and not linux # true if python variant is set and the target_platform is not linux\n- if: python and (linux or osx) # true if python variant is set and the target_platform is linux or osx\n
"},{"location":"special_files/","title":"Activation scripts and other special files","text":"

A conda package can contain \"special\" files in the prefix. These files are scripts that are executed during the activation, installation, or uninstallation process.

If possible, they should be avoided since they execute arbitrary code at installation time and slow down the installation and activation process.

"},{"location":"special_files/#activation-scripts","title":"Activation scripts","text":"

The activation scripts are executed when the environment containing the package is activated (e.g. when doing micromamba activate myenv or pixi run ...).

The scripts are located in special folders:

  • etc/conda/activate.d/{script.sh/bat} - scripts in this folder are executed when the environment is activated
  • etc/conda/deactivate.d/{script.sh/bat} - scripts in this folder are executed when the environment is deactivated

The scripts are executed in lexicographical order, so you can prefix them with numbers to control the order of execution.

To add a script to the package, just make sure that you install the file in this folder. For example, on Linux:

mkdir -p $PREFIX/etc/conda/activate.d\ncp activate-mypkg.sh $PREFIX/etc/conda/activate.d/10-activate-mypkg.sh\n\nmkdir -p $PREFIX/etc/conda/deactivate.d\ncp deactivate-mypkg.sh $PREFIX/etc/conda/deactivate.d/10-deactivate-mypkg.sh\n
"},{"location":"special_files/#post-link-and-pre-unlink-scripts","title":"Post-link and pre-unlink scripts","text":"

The post-link and pre-unlink scripts are executed when the package is installed or uninstalled. They are both heavily discouraged but implemented for compatibility with conda in rattler-build since version 0.17.

For a post-link script to be executed when a package is installed, the built package needs to have a .<package_name>-post-link.{sh/bat} in its bin/ folder. The same is applicable for pre-unlink scripts, just with the name .<package_name>-pre-unlink.{sh/bat} (note the leading period). For example, for a package mypkg, you would need to have a .mypkg-post-link.sh in its bin/ folder.

To make sure the scripts are included in the correct location, use your recipe's build script or build/script key. For example, assuming you have a post-link.sh script in your source, alongside the recipe in the recipe's folder, the following configuration will copy it correctly:

build:\n  ...\n  script:\n    - ...\n    - mkdir -p $PREFIX/bin\n    - cp $RECIPE_DIR/post-link.sh $PREFIX/bin/.mypkg-post-link.sh\n    - chmod +x $PREFIX/bin/.mypkg-post-link.sh\n

The $PREFIX and $RECIPE_DIR environment variables will be set during the build process to help you specify the correct paths.

"},{"location":"testing/","title":"Testing packages","text":"

When you are developing a package, you should write tests for it. The tests are automatically executed right after the package build has finished.

The tests from the test section are bundled into your package and can also be executed straight from an existing package.

The idea behind adding the tests into the package is that you can execute the tests independently from building the package. That is also why we are shipping a test subcommand that takes as input an existing package and executes the tests:

rattler-build test --package-file ./xtensor-0.24.6-h60d57d3_0.tar.bz2\n

Running the above command will extract the package and create a clean environment where the package and dependencies are installed. Then the tests are executed in this newly-created environment.

If you inspect the package contents, you will find the test files under info/tests/*.

"},{"location":"testing/#how-tests-are-translated","title":"How tests are translated","text":"

The tests section allows you to specify the following things:

tests:\n  - script:\n      # commands to run to test the package. If any of the commands\n      # returns with an error code, the test is considered failed.\n      - echo \"Hello world\"\n      - pytest ./tests\n\n    # additional requirements at test time\n    requirements:\n      run:\n        - pytest\n\n    files:\n      # Extra files to be copied to the test directory from the \"work directory\"\n      source:\n        - tests/\n        - test.py\n        - \"*.sh\"\n      recipe:\n        - more_tests/*.py\n\n  # This test section tries to import the Python modules and errors if it can't\n  - python:\n      imports:\n        - mypkg\n        - mypkg.subpkg\n

When you write a test for your package, additional files are created and added to your package. These files are placed under the info/tests/{index}/ folder for each test (see the inspection example after the following lists).

For a script test:

  • All the files are copied straight into the test folder (under info/tests/{index}/)
  • The script is turned into a run_test.sh or run_test.bat file
  • The extra requirements are stored as a JSON file called test_time_dependencies.json

For a Python import test:

  • A JSON file is created that is called python_test.json and stores the imports to be tested and whether to execute pip check or not. This file is placed under info/tests/{index}/

For a downstream test:

  • A JSON file is created that is called downstream_test.json and stores the downstream tests to be executed. This file is placed under info/tests/{index}/
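
Since the package is a plain archive, you can list the bundled test files directly; a minimal sketch using the xtensor package from above (the output is illustrative):

tar -tjf ./xtensor-0.24.6-h60d57d3_0.tar.bz2 | grep info/tests/\n# info/tests/0/run_test.sh\n# info/tests/0/test_time_dependencies.json\n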
"},{"location":"testing/#legacy-tests","title":"Legacy tests","text":"

Legacy tests (from conda-build) are still supported for execution. These tests are stored as files under the info/test/ folder.

The files are:

  • run_test.sh (Unix)
  • run_test.bat (Windows)
  • run_test.py (for the Python import tests)
  • test_time_dependencies.json (for additional dependencies at test time)

Additionally, the info/test/ folder contains all the files specified in the test section as source_files and files. The tests are executed pointing to this directory as the current working directory.

"},{"location":"tips_and_tricks/","title":"Tips and tricks for rattler-build","text":"

This section contains some tips and tricks for using rattler-build.

"},{"location":"tips_and_tricks/#using-sccache-or-ccache-with-rattler-build","title":"Using sccache or ccache with rattler-build","text":"

When debugging a recipe it can help a lot to use sccache or ccache. You can install both tools e.g. with pixi global install sccache.

To use them with a CMake project, you can use the following variables:

export CMAKE_C_COMPILER_LAUNCHER=sccache\nexport CMAKE_CXX_COMPILER_LAUNCHER=sccache\n\n# or more generally\n\nexport CC=\"sccache $CC\"\nexport CXX=\"sccache $CXX\"\n

However, both ccache and sccache are sensitive to changes in the build location. Since rattler-build, by default, creates a new build directory containing a timestamp for every build, you need to use the --no-build-id flag. This removes the timestamp from the build directory name and allows ccache and sccache to reuse cached results.

rattler-build build --no-build-id --recipe ./path/to/recipe.yaml\n
"},{"location":"tips_and_tricks/#building-your-own-forge","title":"Building your own \"forge\"","text":"

You might want to publish your own software packages to a channel you control. These might be packages that are not available in the main conda-forge channel, or proprietary packages, or packages that you have modified in some way.

Doing so is pretty straightforward with rattler-build and a CI provider of your choice. We have a number of example repositories for \"custom\" forges:

  • rust-forge: This repository builds a number of Rust packages for Windows, macOS and Linux on top of Github Actions.
  • r-forge: The same idea, but for R packages
"},{"location":"tips_and_tricks/#directory-structure","title":"Directory structure","text":"

To create your own forge, you should create a number of sub-directories, where each sub-directory contains at most one recipe. With the --recipe-dir flag, rattler-build will collect all recipes it finds in the given directory and its sub-directories.

We can combine this with the --skip-existing=all flag which will skip all packages that are already built locally or in the channel (if you upload them). Using all will also look at the repodata.json file in the channel to see if the package is already there. Packages are skipped based on their complete name, including the version and build string.

Note that the build string changes if the variant configuration changes! So if you update the variant configuration, any packages affected by the change will be rebuilt.
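
A hypothetical forge layout and the corresponding invocation could look like this:

# myforge/\n#   package-a/recipe.yaml\n#   package-b/recipe.yaml\n#   package-c/recipe.yaml\nrattler-build build --recipe-dir myforge/ --skip-existing=all\n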

Note

You can generate recipes for different ecosystems with the rattler-build generate-recipe command. Read more about it in the Generating recipes section.

"},{"location":"tips_and_tricks/#ci-setup","title":"CI setup","text":"

As an example, the following is the CI setup for rust-forge. The workflow uses rattler-build to build and upload packages to a custom channel on https://prefix.dev \u2013 but you can also use rattler-build to upload to your own quetz instance, or a channel on anaconda.org.

Example CI setup for rust-forge

The following is an example of a Github Actions workflow for rust-forge:

.github/workflows/forge.yml
name: Build all packages\n\non:\n  push:\n    branches:\n      - main\n  workflow_dispatch:\n  pull_request:\n    branches:\n      - main\n\njobs:\n  build:\n    strategy:\n      matrix:\n        include:\n          - { target: linux-64, os: ubuntu-20.04 }\n          - { target: win-64, os: windows-latest }\n          # force older macos-13 to get x86_64 runners\n          - { target: osx-64, os: macos-13 }\n          - { target: osx-arm64, os: macos-14 }\n      fail-fast: false\n\n    runs-on: ${{ matrix.os }}\n    steps:\n      - uses: actions/checkout@v4\n        with:\n          fetch-depth: 2\n      - uses: prefix-dev/setup-pixi@v0.5.1\n        with:\n          pixi-version: v0.24.2\n          cache: true\n\n      - name: Run code in changed subdirectories\n        shell: bash\n        env:\n          TARGET_PLATFORM: ${{ matrix.target }}\n\n        run: |\n          pixi run rattler-build build --recipe-dir . \\\n            --skip-existing=all --target-platform=$TARGET_PLATFORM \\\n            -c conda-forge -c https://prefix.dev/rust-forge\n\n      - name: Upload all packages\n        shell: bash\n        # do not upload on PR\n        if: github.event_name == 'push'\n        env:\n          PREFIX_API_KEY: ${{ secrets.PREFIX_API_KEY }}\n        run: |\n          # ignore errors because we want to ignore duplicate packages\n          for file in output/**/*.conda; do\n            pixi run rattler-build upload prefix -c rust-forge \"$file\" || true\n          done\n
"},{"location":"tui/","title":"Terminal User Interface","text":"

rattler-build offers a terminal user interface for building multiple packages and viewing the logs.

To launch the TUI, run the build command with the --tui flag as shown below:

$ rattler-build build -r recipe.yaml --tui\n

Note

rattler-build-tui is gated behind the tui feature flag to avoid extra dependencies. Build the project with the --features tui argument to enable the TUI functionality.

"},{"location":"tui/#key-bindings","title":"Key Bindings","text":"Key Action \u23ce Build a Build all j/k Next/previous package up/down/left/right Scroll logs e Edit recipe (via $EDITOR) c, : Open command prompt (available commands: edit) q, ctrl-c, esc, Quit"},{"location":"variants/","title":"Variant configuration","text":"

rattler-build can automatically build multiple variants of a given package. For example, a Python package might need multiple variants per Python version (especially if it is a binary package such as numpy).

For this use case, one can specify variant configuration files. A variant configuration file has 2 special entries and a list of packages with variants. For example:

variants.yaml
# special entry #1, the zip keys\nzip_keys:\n- [python, numpy]\n\n# special entry #2, the pin_run_as_build key\npin_run_as_build:\n  numpy:\n    max_pin: 'x.x'\n\n# entries per package version that users are interested in\npython:\n# Note that versions are _strings_ (not numbers)\n- \"3.8\"\n- \"3.9\"\n- \"3.10\"\n\nnumpy:\n- \"1.12\"\n- \"1.12\"\n- \"1.20\"\n

If we have a recipe that has a build, host, or run dependency on python, we will build multiple variants of this package, one for each configured python version (\"3.8\", \"3.9\" and \"3.10\").

For example:

# ...\nrequirements:\n  host:\n  - python\n

... will be rendered as (for the first variant):

# ...\nrequirements:\n  host:\n  - python 3.8*\n

Note that variants are only applied if the requirement doesn't specify any constraints. If the requirement would be python >3.8,<3.10 then the variant entry would be ignored.

"},{"location":"variants/#automatic-variantsyaml-discovery","title":"Automatic variants.yaml discovery","text":"

rattler-build automatically includes the variant configuration from a variants.yaml file next to a recipe. Use the --ignore-recipe-variants option to disable automatic discovery of variants.yaml files next to the recipes.

To include a variant config file from another location or include multiple configuration files use the --variant-config option:

rattler-build build --variant-config ~/user_variants.yaml --variant-config /opt/rattler-build/global_variants.yaml --recipe myrecipe.yaml\n
"},{"location":"variants/#package-hash-from-variant","title":"Package hash from variant","text":"

You might have wondered what the role of the build string is. The build string is (if not explicitly set) computed from the variant configuration. It serves as a mechanism to discern different build configurations that produce a package with the same name and version.

The hash is computed by dumping all of the variant configuration values that are used by a given recipe into a JSON file, and then hashing that JSON file.

For example, in our python example, we would get a variant configuration file that looks something like:

{\n    \"python\": \"3.8\"\n}\n

This JSON string is then hashed with the MD5 hash algorithm to produce the hash. For certain packages (such as Python packages) special rules exist, and the py<Major.Minor> version is prepended to the hash, so that the final hash would look something like py38h123123.
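
A rough shell sketch of the idea (the exact JSON canonicalization and hash truncation are implementation details that may differ):

# hash the variant JSON; a \"py38\"-style prefix is prepended and the hash truncated\necho -n '{\"python\": \"3.8\"}' | md5sum\n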

"},{"location":"variants/#zip-keys","title":"Zip keys","text":"

Zip keys modify how variants are combined. Usually, each variant key that has multiple entries is expanded to a build matrix. For example, if we have:

python: [\"3.8\", \"3.9\"]\nnumpy: [\"1.12\", \"1.14\"]\n

...then we obtain 4 variants for a recipe that uses both numpy and python:

- python 3.8, numpy 1.12\n- python 3.8, numpy 1.14\n- python 3.9, numpy 1.12\n- python 3.9, numpy 1.14\n

However, if we use the zip_keys and specify:

zip_keys: [\"python\", \"numpy\"]\npython: [\"3.8\", \"3.9\"]\nnumpy: [\"1.12\", \"1.14\"]\n

...then the versions are \"zipped up\" and we only get 2 variants. Note that both python and numpy need to specify the exact same number of versions to make this work.

The resulting variants with the zip applied are:

- python 3.8, numpy 1.12\n- python 3.9, numpy 1.14\n
"},{"location":"variants/#pin-run-as-build","title":"Pin run as build","text":"

The pin_run_as_build key allows the user to inject additional pins. Usually, the run_exports mechanism is used to specify constraints for runtime dependencies from build time dependencies, but pin_run_as_build offers a mechanism to override that if the package does not contain a run exports file.

For example:

pin_run_as_build:\n  libcurl:\n    min_pin: 'x'\n    max_pin: 'x'\n

If we now have a recipe that uses libcurl in the host and run dependencies like:

requirements:\n  host:\n  - libcurl\n  run:\n  - libcurl\n

During resolution, libcurl might be evaluated to libcurl 8.0.1 h13284. Our new runtime dependency then looks like:

requirements:\n  host:\n  - libcurl 8.0.1 h13284\n  run:\n  - libcurl >=8,<9\n
"},{"location":"variants/#prioritizing-variants","title":"Prioritizing variants","text":"

You might produce multiple variants for a package, but want to define a priority for a given variant. The variant with the highest priority would be the default package that is selected by the resolver.

There are two mechanisms to make this possible: mutex packages and the down_prioritize_variant option in the recipe.

"},{"location":"variants/#the-down_prioritize_variant-option","title":"The down_prioritize_variant option","text":"

Note

It is not always necessary to use the down_prioritize_variant option - only if the solver has no other way to prefer a given variant. For example, if you have a package that has multiple variants for different Python versions, the solver will automatically prefer the variant with the highest Python version.

The down_prioritize_variant option allows you to specify a variant that should be down-prioritized. For example:

recipe.yaml
build:\n  variant_config:\n    use_keys:\n      # use cuda from the variant config, e.g. to build multiple CUDA variants\n      - cuda\n    # this will down-prioritize the cuda variant versus other variants of the package\n    down_prioritize_variant: ${{ 1 if cuda else 0 }}\n
"},{"location":"variants/#mutex-packages","title":"Mutex packages","text":"

Another way to make sure the right variants are selected is \"mutex\" packages. A mutex package is a package that is mutually exclusive with other variants of itself. We use the fact that only one package of a given name can be installed at a time (the solver has to choose).

A mutex package can be useful, for example, to make sure that all packages that depend on BLAS are compiled against the same BLAS implementation. The mutex package ensures that \"openblas\" and \"mkl\" can never be installed at the same time.

We could define a BLAS mutex package like this:

variant_config.yaml
blas_variant:\n  - \"openblas\"\n  - \"mkl\"\n

And then the recipe.yaml for the mutex package could look like this:

recipe.yaml
package:\n  name: blas_mutex\n  version: 1.0\n\nbuild:\n  string: ${{ blas_variant }}${{ hash }}_${{ build_number }}\n  variant_config:\n    # make sure that `openblas` is preferred over `mkl`\n    down_prioritize_variant: ${{ 1 if blas_variant == \"mkl\" else 0 }}\n

This will create two packages: blas_mutex-1.0-openblas and blas_mutex-1.0-mkl. Only one of these packages can be installed at a time because they share the same name, so the solver will select only one of them.

The blas package in turn should have a run_export for the blas_mutex package, so that any package that links against blas also has a dependency on the correct blas_mutex package:

recipe.yaml
package:\n  name: openblas\n  version: 1.0\n\nrequirements:\n  # any package depending on openblas should also depend on the correct blas_mutex package\n  run_export:\n    # Add a run export on _any_ version of the blas_mutex package whose build string starts with \"openblas\"\n    - blas_mutex * openblas*\n

Then the recipe of a package that wants to build two variants, one for openblas and one for mkl could look like this:

recipe.yaml
package:\n  name: fastnumerics\n  version: 1.0\n\nrequirements:\n  host:\n    # build against both openblas and mkl\n    - ${{ blas_variant }}\n  run:\n    # implicitly adds the correct blas_mutex package through run exports\n    # - blas_mutex * ${{ blas_variant }}*\n
"},{"location":"reference/cli/","title":"Command-Line Help for rattler-build","text":"

This document contains the help content for the rattler-build command-line program.

"},{"location":"reference/cli/#rattler-build","title":"rattler-build","text":"

Usage: rattler-build [OPTIONS] [COMMAND]

"},{"location":"reference/cli/#subcommands","title":"Subcommands:","text":"
  • build \u2014 Build a package from a recipe
  • test \u2014 Run a test for a single package
  • rebuild \u2014 Rebuild a package from a package file instead of a recipe
  • upload \u2014 Upload a package
  • completion \u2014 Generate shell completion script
  • generate-recipe \u2014 Generate a recipe from PyPI or CRAN
  • auth \u2014 Handle authentication to external channels
"},{"location":"reference/cli/#options","title":"Options:","text":"
  • -v, --verbose

    Increase logging verbosity

  • -q, --quiet

    Decrease logging verbosity

  • --log-style <LOG_STYLE>

    Logging style

    • Default value: fancy
    • Possible values:
      • fancy: Use fancy logging output
      • json: Use JSON logging output
      • plain: Use plain logging output
  • --color <COLOR>

Enable or disable colored output from rattler-build. Also honors the CLICOLOR and CLICOLOR_FORCE environment variables

    • Default value: auto
    • Possible values:
      • always: Always use colors
      • never: Never use colors
      • auto: Use colors when the output is a terminal
"},{"location":"reference/cli/#build","title":"build","text":"

Build a package from a recipe

Usage: rattler-build build [OPTIONS]

"},{"location":"reference/cli/#options_1","title":"Options:","text":"
  • -r, --recipe <RECIPE>

    The recipe file or directory containing recipe.yaml. Defaults to the current directory

    • Default value: .
  • --recipe-dir <RECIPE_DIR>

    The directory that contains recipes

  • --up-to <UP_TO>

    Build recipes up to the specified package

  • --build-platform <BUILD_PLATFORM>

    The build platform to use for the build (e.g. for building with emulation, or rendering)

    • Default value: current platform
  • --target-platform <TARGET_PLATFORM>

    The target platform for the build

    • Default value: current platform
  • -c, --channel <CHANNEL>

    Add a channel to search for dependencies in

    • Default value: conda-forge
  • -m, --variant-config <VARIANT_CONFIG>

    Variant configuration files for the build

  • --ignore-recipe-variants

    Do not read the variants.yaml file next to a recipe

    • Possible values: true, false
  • --render-only

    Render the recipe files without executing the build

    • Possible values: true, false
  • --with-solve

    Render the recipe files with solving dependencies

    • Possible values: true, false
  • --keep-build

    Keep intermediate build artifacts after the build

    • Possible values: true, false
  • --no-build-id

Don't use the build id (timestamp) when creating the build directory name

    • Possible values: true, false
  • --compression-threads <COMPRESSION_THREADS>

    The number of threads to use for compression (only relevant when also using --package-format conda)

  • --use-zstd

    Enable support for repodata.json.zst

    • Default value: true
    • Possible values: true, false
  • --use-bz2

    Enable support for repodata.json.bz2

    • Default value: true
    • Possible values: true, false
  • --experimental

    Enable experimental features

    • Possible values: true, false
  • --auth-file <AUTH_FILE>

    Path to an auth-file to read authentication information from

  • --tui

    Launch the terminal user interface

    • Default value: false
    • Possible values: true, false
"},{"location":"reference/cli/#modifying-result","title":"Modifying result","text":"
  • --package-format <PACKAGE_FORMAT>

    The package format to use for the build. Can be one of tar-bz2 or conda. You can also add a compression level to the package format, e.g. tar-bz2:<number> (from 1 to 9) or conda:<number> (from -7 to 22).

    • Default value: conda
  • --no-include-recipe

    Don't store the recipe in the final package

    • Possible values: true, false
  • --no-test

    Don't run the tests after building the package

    • Default value: false
    • Possible values: true, false
  • --color-build-log

    Don't force colors in the output of the build script

    • Default value: true
    • Possible values: true, false
  • --output-dir <OUTPUT_DIR>

    Output directory for build artifacts.

    • Default value: ./output
  • --skip-existing <SKIP_EXISTING>

    Whether to skip packages that already exist in any channel. If set to none, do not skip any packages (default when not specified). If set to local, only skip packages that already exist locally (default when using --skip-existing). If set to all, skip packages that already exist in any channel

    • Default value: none
    • Possible values:
      • none: Do not skip any packages
      • local: Skip packages that already exist locally
      • all: Skip packages that already exist in any channel
"},{"location":"reference/cli/#test","title":"test","text":"

Run a test for a single package

This creates a temporary directory, copies the package file into it, and then runs the indexing. It then creates a test environment that installs the package and any extra dependencies specified in the package test dependencies file.

With the activated test environment, the packaged test files are run:

  • info/test/run_test.sh (or info/test/run_test.bat on Windows)
  • info/test/run_test.py

These test files are written at \"package creation time\" and are part of the package.

Usage: rattler-build test [OPTIONS] --package-file <PACKAGE_FILE>

"},{"location":"reference/cli/#options_2","title":"Options:","text":"
  • -c, --channel <CHANNEL>

    Channels to use when testing

  • -p, --package-file <PACKAGE_FILE>

    The package file to test

  • --compression-threads <COMPRESSION_THREADS>

    The number of threads to use for compression

  • --use-zstd

    Enable support for repodata.json.zst

    • Default value: true
    • Possible values: true, false
  • --use-bz2

    Enable support for repodata.json.bz2

    • Default value: true
    • Possible values: true, false
  • --experimental

    Enable experimental features

    • Possible values: true, false
  • --auth-file <AUTH_FILE>

    Path to an auth-file to read authentication information from

"},{"location":"reference/cli/#modifying-result_1","title":"Modifying result","text":"
  • --output-dir <OUTPUT_DIR>

    Output directory for build artifacts.

    • Default value: ./output
"},{"location":"reference/cli/#rebuild","title":"rebuild","text":"

Rebuild a package from a package file instead of a recipe

Usage: rattler-build rebuild [OPTIONS] --package-file <PACKAGE_FILE>

"},{"location":"reference/cli/#options_3","title":"Options:","text":"
  • -p, --package-file <PACKAGE_FILE>

    The package file to rebuild

  • --no-test

    Do not run tests after building

    • Default value: false
    • Possible values: true, false
  • --compression-threads <COMPRESSION_THREADS>

    The number of threads to use for compression

  • --use-zstd

    Enable support for repodata.json.zst

    • Default value: true
    • Possible values: true, false
  • --use-bz2

    Enable support for repodata.json.bz2

    • Default value: true
    • Possible values: true, false
  • --experimental

    Enable experimental features

    • Possible values: true, false
  • --auth-file <AUTH_FILE>

    Path to an auth-file to read authentication information from

"},{"location":"reference/cli/#modifying-result_2","title":"Modifying result","text":"
  • --output-dir <OUTPUT_DIR>

    Output directory for build artifacts.

    • Default value: ./output
"},{"location":"reference/cli/#upload","title":"upload","text":"

Upload a package

Usage: rattler-build upload [OPTIONS] [PACKAGE_FILES]... <COMMAND>

"},{"location":"reference/cli/#subcommands_1","title":"Subcommands:","text":"
  • quetz \u2014 Upload to a Quetz server. Authentication is used from the keychain / auth-file
  • artifactory \u2014 Options for uploading to an Artifactory channel. Authentication is used from the keychain / auth-file
  • prefix \u2014 Options for uploading to a prefix.dev server. Authentication is used from the keychain / auth-file
  • anaconda \u2014 Options for uploading to an Anaconda.org server
"},{"location":"reference/cli/#arguments","title":"Arguments:","text":"
  • <PACKAGE_FILES>

    The package file to upload

"},{"location":"reference/cli/#options_4","title":"Options:","text":"
  • --use-zstd

    Enable support for repodata.json.zst

    • Default value: true
    • Possible values: true, false
  • --use-bz2

    Enable support for repodata.json.bz2

    • Default value: true
    • Possible values: true, false
  • --experimental

    Enable experimental features

    • Possible values: true, false
  • --auth-file <AUTH_FILE>

    Path to an auth-file to read authentication information from

"},{"location":"reference/cli/#modifying-result_3","title":"Modifying result","text":"
  • --output-dir <OUTPUT_DIR>

    Output directory for build artifacts.

    • Default value: ./output
"},{"location":"reference/cli/#quetz","title":"quetz","text":"

Upload to a Quetz server. Authentication is used from the keychain / auth-file

Usage: rattler-build upload quetz [OPTIONS] --url <URL> --channel <CHANNEL>

"},{"location":"reference/cli/#options_5","title":"Options:","text":"
  • -u, --url <URL>

    The URL to your Quetz server

  • -c, --channel <CHANNEL>

    The URL to your channel

  • -a, --api-key <API_KEY>

    The Quetz API key. If none is provided, the token is read from the keychain / auth-file

"},{"location":"reference/cli/#artifactory","title":"artifactory","text":"

Options for uploading to an Artifactory channel. Authentication is used from the keychain / auth-file

Usage: rattler-build upload artifactory [OPTIONS] --url <URL> --channel <CHANNEL>

"},{"location":"reference/cli/#options_6","title":"Options:","text":"
  • -u, --url <URL>

    The URL to your Artifactory server

  • -c, --channel <CHANNEL>

    The URL to your channel

  • -r, --username <USERNAME>

    Your Artifactory username

  • -p, --password <PASSWORD>

    Your Artifactory password

"},{"location":"reference/cli/#prefix","title":"prefix","text":"

Options for uploading to a prefix.dev server. Authentication is used from the keychain / auth-file

Usage: rattler-build upload prefix [OPTIONS] --channel <CHANNEL>

"},{"location":"reference/cli/#options_7","title":"Options:","text":"
  • -u, --url <URL>

    The URL to the prefix.dev server (only necessary for self-hosted instances)

    • Default value: https://prefix.dev
  • -c, --channel <CHANNEL>

    The channel to upload the package to

  • -a, --api-key <API_KEY>

    The prefix.dev API key. If none is provided, the token is read from the keychain / auth-file

"},{"location":"reference/cli/#anaconda","title":"anaconda","text":"

Options for uploading to an Anaconda.org server

Usage: rattler-build upload anaconda [OPTIONS] --owner <OWNER>

"},{"location":"reference/cli/#options_8","title":"Options:","text":"
  • -o, --owner <OWNER>

    The owner of the distribution (e.g. conda-forge or your username)

  • -c, --channel <CHANNEL>

    The channel / label to upload the package to (e.g. main / rc)

    • Default value: main
  • -a, --api-key <API_KEY>

    The Anaconda API key. If none is provided, the token is read from the keychain / auth-file

  • -u, --url <URL>

    The URL to the Anaconda server

    • Default value: https://api.anaconda.org
  • -f, --force

    Replace files on conflict

    • Default value: false
    • Possible values: true, false
"},{"location":"reference/cli/#completion","title":"completion","text":"

Generate shell completion script

Usage: rattler-build completion --shell <SHELL>

"},{"location":"reference/cli/#options_9","title":"Options:","text":"
  • -s, --shell <SHELL>

    Specifies the shell for which the completions should be generated

    • Possible values:
      • bash: Bourne Again SHell (bash)
      • elvish: Elvish shell
      • fish: Friendly Interactive SHell (fish)
      • nushell: Nushell
      • powershell: PowerShell
      • zsh: Z SHell (zsh)
"},{"location":"reference/cli/#generate-recipe","title":"generate-recipe","text":"

Generate a recipe from PyPI or CRAN

Usage: rattler-build generate-recipe <COMMAND>

"},{"location":"reference/cli/#subcommands_2","title":"Subcommands:","text":"
  • pypi \u2014 Generate a recipe for a Python package from PyPI
  • cran \u2014 Generate a recipe for an R package from CRAN
"},{"location":"reference/cli/#pypi","title":"pypi","text":"

Generate a recipe for a Python package from PyPI

Usage: rattler-build generate-recipe pypi [OPTIONS] <PACKAGE>

"},{"location":"reference/cli/#arguments_1","title":"Arguments:","text":"
  • <PACKAGE>

    Name of the package to generate

"},{"location":"reference/cli/#options_10","title":"Options:","text":"
  • -w, --write

    Whether to write the recipe to a folder

    • Possible values: true, false
  • -u, --use-mapping

    Whether to use the conda-forge PyPI name mapping

    • Default value: true
    • Possible values: true, false
  • -t, --tree

    Whether to generate recipes for all dependencies

    • Possible values: true, false
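
For example, to generate a recipe for a PyPI package (here flask, an arbitrary example) and write it to a folder:

rattler-build generate-recipe pypi --write flask\n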
"},{"location":"reference/cli/#cran","title":"cran","text":"

Generate a recipe for an R package from CRAN

Usage: rattler-build generate-recipe cran [OPTIONS] <PACKAGE>

"},{"location":"reference/cli/#arguments_2","title":"Arguments:","text":"
  • <PACKAGE>

    Name of the package to generate

"},{"location":"reference/cli/#options_11","title":"Options:","text":"
  • -u, --universe <UNIVERSE>

    The R Universe to fetch the package from (defaults to cran)

  • -t, --tree

    Whether to create recipes for the whole dependency tree or not

    • Possible values: true, false
  • -w, --write

    Whether to write the recipe to a folder

    • Possible values: true, false
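
For example, to generate recipes for a CRAN package (here ggplot2, an arbitrary example) and its dependency tree:

rattler-build generate-recipe cran --write --tree ggplot2\n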
"},{"location":"reference/cli/#auth","title":"auth","text":"

Handle authentication to external channels

Usage: rattler-build auth <COMMAND>

"},{"location":"reference/cli/#subcommands_3","title":"Subcommands:","text":"
  • login \u2014 Store authentication information for a given host
  • logout \u2014 Remove authentication information for a given host
"},{"location":"reference/cli/#login","title":"login","text":"

Store authentication information for a given host

Usage: rattler-build auth login [OPTIONS] <HOST>

"},{"location":"reference/cli/#arguments_3","title":"Arguments:","text":"
  • <HOST>

    The host to authenticate with (e.g. repo.prefix.dev)

"},{"location":"reference/cli/#options_12","title":"Options:","text":"
  • --token <TOKEN>

    The token to use (for authentication with prefix.dev)

  • --username <USERNAME>

    The username to use (for basic HTTP authentication)

  • --password <PASSWORD>

    The password to use (for basic HTTP authentication)

  • --conda-token <CONDA_TOKEN>

    The token to use for anaconda.org / Quetz authentication

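For example, to store a token for repo.prefix.dev (the token value is a placeholder):

rattler-build auth login repo.prefix.dev --token <your-token>\n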
"},{"location":"reference/cli/#logout","title":"logout","text":"

Remove authentication information for a given host

Usage: rattler-build auth logout <HOST>

"},{"location":"reference/cli/#arguments_4","title":"Arguments:","text":"
  • <HOST>

    The host to remove authentication for

This document was generated automatically by clap-markdown.

"},{"location":"reference/jinja/","title":"Jinja","text":"

rattler-build comes with a couple of useful Jinja functions and filters that can be used in the recipe.

"},{"location":"reference/jinja/#functions","title":"Functions","text":""},{"location":"reference/jinja/#the-compiler-function","title":"The compiler function","text":"

The compiler function can be used to put together a compiler that works for the current platform and the compilation \"target_platform\". The syntax looks like: ${{ compiler('c') }} where 'c' signifies the programming language that is used.

This function evaluates to <compiler>_<target_platform> <compiler_version>. For example, when compiling on linux for the target platform linux-64, this function evaluates to gcc_linux-64.

The values can be influenced by the variant_configuration: the <lang>_compiler and <lang>_compiler_version variant keys control the compiler and its version. See below for an example:

"},{"location":"reference/jinja/#usage-in-a-recipe","title":"Usage in a recipe","text":"recipe.yaml
requirements:\n  build:\n    - ${{ compiler('c') }}\n

With a corresponding variant_configuration:

variant_configuration.yaml
c_compiler:\n- clang\nc_compiler_version:\n- 9.0\n

The variables shown above would select the clang compiler in version 9.0. Note that the final output will still contain the target_platform, so that the full compiler will read clang_linux-64 9.0 when compiling with --target-platform linux-64.

rattler-build defines some default compilers for the following languages (inherited from conda-build):

  • c: gcc on Linux, clang on osx and vs2017 on Windows
  • cxx: gxx on Linux, clangxx on osx and vs2017 on Windows
  • fortran: gfortran on Linux, gfortran on osx and vs2017 on Windows
  • rust: rust
"},{"location":"reference/jinja/#the-stdlib-function","title":"The stdlib function","text":"

The stdlib function closely mirrors the compiler function. It can be used to put together a standard library that works for the current platform and the compilation \"target_platform\".

Usage: ${{ stdlib('c') }}

It results in <stdlib>_<target_platform> <stdlib_version> and uses the variant variables <lang>_stdlib and <lang>_stdlib_version to influence the output.

"},{"location":"reference/jinja/#usage-in-a-recipe_1","title":"Usage in a recipe:","text":"recipe.yaml
requirements:\n  build:\n    # these are usually paired!\n    - ${{ compiler('c') }}\n    - ${{ stdlib('c') }}\n

With a corresponding variant_configuration:

variant_configuration.yaml
# these are the values `conda-forge` uses in their pinning file\n# found at https://github.com/conda-forge/conda-forge-pinning-feedstock/blob/main/recipe/conda_build_config.yaml\nc_stdlib:\n- sysroot\nc_stdlib_version:\n- 2.17\n
"},{"location":"reference/jinja/#the-pin-functions","title":"The pin functions","text":"

A pin is created based on the version input (from a subpackage or a package resolution).

The pin functions take the following arguments:

  • lower_bound (default: \"x.x.x.x.x.x\"): The lower bound pin expression to be used. When set to None, no lower bound is set.
  • upper_bound (default: \"x\"): The maximum pin to be used. When set to None, no upper bound is set.

The lower bound and upper bound can either be a \"pin expression\" (only x and . are allowed) or a hard-coded version string.

A \"pin expression\" is applied to the version input to create the lower and upper bounds. For example, if the version is 3.10.5 with a lower_bound=\"x.x\", upper_bound=\"x.x.x\", the lower bound will be 3.10 and the upper bound will be 3.10.6.0a0. A pin expression for the upper_bound will increment the last selected segment of the version by 1, and append .0a0 to the end to prevent any alpha versions from being selected.

If the last segment of the version contains a letter (e.g. 9e or 1.1.1j), then incrementing the version will set that letter to a, e.g. 9e will become 10a, and 1.1.1j will become 1.1.2a. In this case, no .0a0 is appended to the end either.

Sometimes you want to strongly connect your outputs. This can be achieved with the following input:

  • exact=True (default: False): This will pin the version exactly to the version of the output, including the build string.

To override the lower or upper bound with a hard-coded value, you can use the following input:

  • lower_bound (default: None): This will override the lower bound with the given value.
  • upper_bound (default: None): This will override the upper bound with the given value.

Both lower_bound and upper_bound expect a valid version string (e.g. 1.2.3).

"},{"location":"reference/jinja/#the-pin_subpackage-function","title":"The pin_subpackage function","text":"
  • ${{ pin_subpackage(\"mypkg\", lower_bound=\"x.x\", upper_bound=\"x.x\") }} creates a pin to another output in the recipe. With an input of 3.1.5, this would create a pin of mypkg >=3.1,<3.2.0a0.
  • ${{ pin_subpackage(\"other_output\", exact=True) }} creates a pin to another output in the recipe with an exact version.
  • ${{ pin_subpackage(\"other_output\", lower_bound=\"1.2.3\", upper_bound=\"1.2.4\") }} creates a pin to another output in the recipe with a lower bound of 1.2.3 and an upper bound of 1.2.4. This is equivalent to writing other_output >=1.2.3,<1.2.4.
"},{"location":"reference/jinja/#the-pin_compatible-function","title":"The pin_compatible function","text":"

The pin_compatible function works exactly like the pin_subpackage function, but it pins the package in the run requirements based on the resolved package of the host or build section.
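
For example, a sketch using the bound arguments described above (assuming numpy was resolved in the host section):

requirements:\n  host:\n    - numpy\n  run:\n    - ${{ pin_compatible('numpy', lower_bound='x.x', upper_bound='x') }}\n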
"},{"location":"reference/jinja/#the-cdt-function","title":"The cdt function","text":"
  • ${{ cdt(\"mypkg\") }} adds a dependency on a Core Dependency Tree (CDT) package.

This function helps add Core Dependency Tree packages as dependencies by converting packages as required according to hard-coded logic. See below for an example of how this function can be used:

# on x86_64 system\ncdt('package-name') # outputs: package-name-cos6-x86_64\n# on aarch64 system\ncdt('package-name') # outputs: package-name-cos6-aarch64\n
"},{"location":"reference/jinja/#the-hash-variable","title":"The hash variable","text":"
  • ${{ hash }} is the variant hash and is useful in the build string computation.
"},{"location":"reference/jinja/#the-version_to_buildstring-function","title":"The version_to_buildstring function","text":"
  • ${{ python | version_to_buildstring }} converts a version from the variant to a build string (it removes the . character and takes only the first two elements of the version).
"},{"location":"reference/jinja/#the-env-object","title":"The env object","text":"

You can use the env object to retrieve environment variables and forward them to your build script. ${{ env.get(\"MY_ENV_VAR\") }} will return the value of the environment variable MY_ENV_VAR or throw an error if it is not set.

To supply a default value when the environment variable is not set, you can use ${{ env.get(\"MY_ENV_VAR\", default=\"default_value\") }}. In this case, if MY_ENV_VAR is not set, the value default_value will be returned (and no error is thrown).

You can also check for the existence of an environment variable:

  • ${{ env.exists(\"MY_ENV_VAR\") }} will return true if the environment variable MY_ENV_VAR is set and false otherwise.
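
For example, a context variable could be filled from a (hypothetical) environment variable with a fallback:

context:\n  build_suffix: ${{ env.get(\"BUILD_SUFFIX\", default=\"local\") }}\n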
"},{"location":"reference/jinja/#filters","title":"Filters","text":"

A feature of Jinja is called \"filters\". Filters are functions that can be applied to variables in a template expression.

The syntax for a filter is {{ variable | filter_name }}. A filter can also take arguments, such as ... | replace('foo', 'bar').

The following Jinja filters are available, taken from the upstream minijinja library:

  • replace: replace a string with another string (e.g. \"{{ 'foo' | replace('oo', 'aa') }}\" will return \"faa\")
  • lower: convert a string to lowercase (e.g. \"{{ 'FOO' | lower }}\" will return \"foo\")
  • upper: convert a string to uppercase (e.g. \"{{ 'foo' | upper }}\" will return \"FOO\")
  • int: convert a string to an integer (e.g. \"{{ '42' | int }}\" will return 42)
  • abs: return the absolute value of a number (e.g. \"{{ -42 | abs }}\" will return 42)
  • bool: convert a value to a boolean (e.g. \"{{ 'foo' | bool }}\" will return true)
  • default: return a default value if the value is falsy (e.g. \"{{ '' | default('foo') }}\" will return \"foo\")
  • first: return the first element of a list (e.g. \"{{ [1, 2, 3] | first }}\" will return 1)
  • last: return the last element of a list (e.g. \"{{ [1, 2, 3] | last }}\" will return 3)
  • length: return the length of a list (e.g. \"{{ [1, 2, 3] | length }}\" will return 3)
  • list: convert a string to a list (e.g. \"{{ 'foo' | list }}\" will return ['f', 'o', 'o'])
  • join: join a list with a separator (e.g. \"{{ [1, 2, 3] | join('.') }}\" will return \"1.2.3\")
  • min: return the minimum value of a list (e.g. \"{{ [1, 2, 3] | min }}\" will return 1)
  • max: return the maximum value of a list (e.g. \"{{ [1, 2, 3] | max }}\" will return 3)
  • reverse: reverse a list (e.g. \"{{ [1, 2, 3] | reverse }}\" will return [3, 2, 1])
  • slice: slice a list (e.g. \"{{ [1, 2, 3] | slice(1, 2) }}\" will return [2])
  • batch: works pretty much like slice, just the other way round. It returns a list of lists with the given number of items. If you provide a second parameter, it is used to fill up missing items.
  • sort: sort a list (e.g. \"{{ [3, 1, 2] | sort }}\" will return [1, 2, 3])
  • trim: remove leading and trailing whitespace from a string (e.g. \"{{ ' foo ' | trim }}\" will return \"foo\")
  • unique: remove duplicates from a list (e.g. \"{{ [1, 2, 1, 3] | unique }}\" will return [1, 2, 3])
  • split: split a string into a list (e.g. \"{{ '1.2.3' | split('.') }}\" will return ['1', '2', '3']). By default, splits on whitespace.
Removed filters

The following filters are removed from the builtins:

  • attr
  • indent
  • select
  • selectattr
  • dictsort
  • reject
  • rejectattr
  • round
  • map
  • title
  • capitalize
  • urlencode
  • escape
  • pprint
  • safe
  • items
  • float
  • tojson
"},{"location":"reference/jinja/#extra-filters-for-recipes","title":"Extra filters for recipes","text":""},{"location":"reference/jinja/#the-version_to_buildstring-filter","title":"The version_to_buildstring filter","text":"
  • ${{ python | version_to_buildstring }} converts a version from the variant to a build string (it removes the . character and takes only the first two elements of the version).

For example, the following:

context:\n  cuda_version: \"11.2.0\"\n\nbuild:\n  string: ${{ hash }}_cuda${{ cuda_version | version_to_buildstring }}\n

Would evaluate to abc123_cuda112 (assuming the hash was abc123).

"},{"location":"reference/jinja/#various-remarks","title":"Various remarks","text":""},{"location":"reference/jinja/#inline-conditionals-with-jinja","title":"Inline conditionals with Jinja","text":"

The new recipe format allows for inline conditionals with Jinja. If the condition is false and no else branch exists, the expression renders to an empty string (which is, for example in a list or dictionary, equivalent to a YAML null).

When a recipe is rendered, all values that are null must be filtered from the resulting YAML.

requirements:\n  host:\n    - ${{ \"numpy\" if cuda == \"yes\" }}\n

If cuda is not equal to yes, the first item of the host requirements will be empty (null) and thus filtered from the final list.

This must also work for dictionary values. For example:

build:\n  number: ${{ 100 if cuda == \"yes\" }}\n  # or an `else` branch can be used, of course\n  number: ${{ 100 if cuda == \"yes\" else 0 }}\n
"},{"location":"reference/recipe_file/","title":"The recipe spec","text":"

rattler-build implements a new recipe spec, different from the traditional \"meta.yaml\" file used in conda-build. A recipe has to be stored as a recipe.yaml file.

"},{"location":"reference/recipe_file/#history","title":"History","text":"

A discussion was started on what a new recipe spec could or should look like. The fragments of this discussion can be found here.

The reasons for a new spec are:

  • make it easier to parse (i.e. \"pure YAML\"); conda-build uses a mix of comments and Jinja to achieve a great deal of flexibility, but it's hard to parse the recipe with a computer
  • iron out some inconsistencies around multiple outputs (build vs. build/script and more)
  • remove any need for recursive parsing & solving
  • finally, the initial implementation in boa relied on conda-build; rattler-build removes any dependency on Python or conda-build and reimplements everything in Rust
"},{"location":"reference/recipe_file/#major-differences-from-conda-build","title":"Major differences from conda-build","text":"
  • recipe filename is recipe.yaml, not meta.yaml
  • outputs have less complicated behavior; keys are the same as in the top-level recipe (e.g. build/script, not just script, and package/name, not just name)
  • no implicit meta-packages in outputs
  • no full Jinja2 support: no conditional blocks or {% set ... %} support, only string interpolation; variables can be set in the top-level \"context\" section, which is valid YAML
  • Jinja string interpolation needs to be preceded by a dollar sign at the beginning of a string, e.g. - ${{ version }} in order for it to be valid YAML
  • selectors use a YAML dictionary style (vs. comments in conda-build). Instead of - somepkg #[osx] we use:
    if: osx\nthen:\n  - somepkg\n
  • skip instruction uses a list of skip conditions and not the selector syntax from conda-build (e.g. skip: [\"osx\", \"win and py37\"])
"},{"location":"reference/recipe_file/#spec","title":"Spec","text":"

The recipe spec has the following parts:

  • context: to set up variables that can later be used in Jinja string interpolation
  • package: defines name, version etc. of the top-level package
  • source: points to the sources that need to be downloaded in order to build the recipe
  • build: defines how to build the recipe and what build number to use
  • requirements: defines requirements of the top-level package
  • test: defines tests for the top-level package
  • outputs: a recipe can have multiple outputs. Each output can and should have a package, requirements and test section
"},{"location":"reference/recipe_file/#spec-reference","title":"Spec reference","text":"

The spec is also made available through a JSON Schema (which is used for validation). The schema (and pydantic source file) can be found in this repository: recipe-format

To use with VSCode (yaml-plugin) and other IDEs:

Either start the document with the following line:

# yaml-language-server: $schema=https://raw.githubusercontent.com/prefix-dev/recipe-format/main/schema.json\n
Or, using yaml.schemas,
yaml.schemas: {\n  \"https://raw.githubusercontent.com/prefix-dev/recipe-format/main/schema.json\": \"**/recipe.yaml\",\n}\n
Read more about this here.

See more in the automatic linting chapter.

"},{"location":"reference/recipe_file/#examples","title":"Examples","text":"recipe.yaml
# this sets up \"context variables\" (in this case name and version) that\n# can later be used in Jinja expressions\ncontext:\n  version: 1.1.0\n  name: imagesize\n\n# top level package information (name and version)\npackage:\n  name: ${{ name }}\n  version: ${{ version }}\n\n# location to get the source from\nsource:\n  url: https://pypi.io/packages/source/${{ name[0] }}/${{ name }}/${{ name }}-${{ version }}.tar.gz\n  sha256: f3832918bc3c66617f92e35f5d70729187676313caa60c187eb0f28b8fe5e3b5\n\n# build number (should be incremented if a new build is made, but version is not incrementing)\nbuild:\n  number: 1\n  script: python -m pip install --no-deps --ignore-installed .\n\n# the requirements at build and runtime\nrequirements:\n  host:\n    - python\n    - pip\n  run:\n    - python\n\n# tests to validate that the package works as expected\ntests:\n  - python:\n      imports:\n        - imagesize\n\n# information about the package\nabout:\n  homepage: https://github.com/shibukawa/imagesize_py\n  license: MIT\n  summary: 'Getting image size from png/jpeg/jpeg2000/gif file'\n  description: |\n    This module analyzes jpeg/jpeg2000/png/gif image header and\n    return image size.\n  repository: https://github.com/shibukawa/imagesize_py\n  documentation: https://pypi.python.org/pypi/imagesize\n\n# the below is conda-forge specific!\nextra:\n  recipe-maintainers:\n    - somemaintainer\n
"},{"location":"reference/recipe_file/#package-section","title":"Package section","text":"

Specifies package information.

package:\n  name: bsdiff4\n  version: \"2.1.4\"\n
  • name: The lower case name of the package. It may contain \"-\", but no spaces.
  • version: The version number of the package. Use the PEP-386 verlib conventions. Cannot contain \"-\". YAML interprets version numbers such as 1.0 as floats, meaning that 0.10 will be the same as 0.1. To avoid this, put the version number in quotes so that it is interpreted as a string.
"},{"location":"reference/recipe_file/#source-section","title":"Source section","text":"

Specifies where the source code of the package is coming from. The source may come from a tarball file, git, hg, or svn. It may be a local path and it may contain patches.

"},{"location":"reference/recipe_file/#source-from-tarball-or-zip-archive","title":"Source from tarball or zip archive","text":"
source:\n  url: https://pypi.python.org/packages/source/b/bsdiff4/bsdiff4-1.1.4.tar.gz\n  md5: 29f6089290505fc1a852e176bd276c43\n  sha1: f0a2c9a30073449cfb7d171c57552f3109d93894\n  sha256: 5a022ff4c1d1de87232b1c70bde50afbb98212fd246be4a867d8737173cf1f8f\n

If an extracted archive contains only 1 folder at its top level, its contents will be moved 1 level up, so that the extracted package contents sit in the root of the work folder.

"},{"location":"reference/recipe_file/#source-from-git","title":"Source from git","text":"
source:\n  git: https://github.com/ilanschnell/bsdiff4.git\n  # branch: master # note: defaults to fetching the repo's default branch\n

You can use rev to pin the commit version directly:

source:\n  git: https://github.com/ilanschnell/bsdiff4.git\n  rev: \"50a1f7ed6c168eb0815d424cba2df62790f168f0\"\n

Or you can use the tag:

source:\n  git: https://github.com/ilanschnell/bsdiff4.git\n  tag: \"1.1.4\"\n

git can also be a relative path to the recipe directory:

source:\n  git: ../../bsdiff4/.git\n  tag: \"1.1.4\"\n

Furthermore, if you want to fetch just the current \"HEAD\" (this may result in non-deterministic builds), then you can use depth.

source:\n  git: https://github.com/ilanschnell/bsdiff4.git\n  depth: 1 # note: the behaviour defaults to -1\n

Note: a tag or rev may not be reachable within the given clone depth, so combining rev or tag with depth is not allowed unless depth is set to -1.

source:\n  git: https://github.com/ilanschnell/bsdiff4.git\n  tag: \"1.1.4\"\n  depth: 1 # error: use of `depth` with `tag` is invalid, they are mutually exclusive\n

When you want to use git-lfs, you need to set lfs: true. This will also pull the lfs files from the repository.

source:\n  git: ../../bsdiff4/.git\n  tag: \"1.1.4\"\n  lfs: true # note: defaults to false\n
"},{"location":"reference/recipe_file/#source-from-a-local-path","title":"Source from a local path","text":"

If the path is relative, it is taken relative to the recipe directory. The source is copied to the work directory before building.

  source:\n    path: ../src\n    use_gitignore: false # note: defaults to true\n

By default, all files in the local path that are ignored by git are also ignored by rattler-build. You can disable this behavior by setting use_gitignore to false.

"},{"location":"reference/recipe_file/#patches","title":"Patches","text":"

Patches may optionally be applied to the source.

  source:\n    #[source information here]\n    patches:\n      - my.patch # the patch file is expected to be found in the recipe\n
"},{"location":"reference/recipe_file/#destination-path","title":"Destination path","text":"

Within rattler-build's work directory, you may specify a particular folder to place the source into. rattler-build will always drop you into the same folder ([build folder]/work), but it's up to you whether you want your source extracted into that folder, or nested deeper. This feature is particularly useful when dealing with multiple sources, but can apply to recipes with single sources as well.

source:\n  #[source information here]\n  target_directory: my-destination/folder\n
"},{"location":"reference/recipe_file/#source-from-multiple-sources","title":"Source from multiple sources","text":"

Some software is most easily built by aggregating several pieces.

The syntax is a list of source dictionaries. Each member of this list follows the same rules as the single source. All features for each member are supported.

Example:

source:\n  - url: https://package1.com/a.tar.bz2\n    target_directory: stuff\n  - url: https://package1.com/b.tar.bz2\n    target_directory: stuff\n  - git: https://github.com/mamba-org/boa\n    target_directory: boa\n

Here, the two URL tarballs will go into one folder, and the git repo is checked out into its own space. git will not clone into a non-empty folder.

"},{"location":"reference/recipe_file/#build-section","title":"Build section","text":"

Specifies build information.

Each field that expects a path can also handle a glob pattern. The matching is performed from the top of the build environment, so to match files inside your project you can use a pattern similar to the following one: \"**/myproject/**/*.txt\". This pattern will match any .txt file found in your project. Quotation marks (\"\") are required for patterns that start with a *.

Recursive globbing using ** is also supported.

"},{"location":"reference/recipe_file/#build-number-and-string","title":"Build number and string","text":"

The build number should be incremented for new builds of the same version. The number defaults to 0. The build string cannot contain \"-\". If not set, the string defaults to rattler-build's default build string (based on the variant hash) plus the build number.

build:\n  number: 1\n  string: abc\n
"},{"location":"reference/recipe_file/#dynamic-linking","title":"Dynamic linking","text":"

This section contains settings for the shared libraries and executables.

build:\n  dynamic_linking:\n    rpath_allowlist: [\"/usr/lib/**\"]\n
"},{"location":"reference/recipe_file/#python-entry-points","title":"Python entry points","text":"

The following example creates a Python entry point named \"bsdiff4\" that calls bsdiff4.cli.main_bsdiff4().

build:\n  python:\n    entry_points:\n      - bsdiff4 = bsdiff4.cli:main_bsdiff4\n      - bspatch4 = bsdiff4.cli:main_bspatch4\n
"},{"location":"reference/recipe_file/#script","title":"Script","text":"

By default, rattler-build uses a build.sh file on Unix (macOS and Linux) and a build.bat file on Windows, if they exist in the same folder as the recipe.yaml file. With the script parameter you can either supply a different filename or write out short build scripts. You may need to use selectors to use different scripts for different platforms.

build:\n  # A very simple build script\n  script: pip install .\n\n  # The build script can also be a list\n  script:\n    - pip install .\n    - echo \"hello world\"\n    - if: unix\n      then:\n        - echo \"unix\"\n
"},{"location":"reference/recipe_file/#skipping-builds","title":"Skipping builds","text":"

Lists conditions under which rattler-build should skip the build of this recipe. Particularly useful for defining recipes that are platform-specific. By default, a build is never skipped.

build:\n  skip:\n    - win\n    ...\n
"},{"location":"reference/recipe_file/#architecture-independent-packages","title":"Architecture-independent packages","text":"

Allows you to specify \"no architecture\" when building a package, thus making it compatible with all platforms and architectures. Architecture-independent packages can be installed on any platform.

Setting the noarch key to generic tells conda not to perform any manipulation of the contents.

build:\n  noarch: generic\n

noarch: generic is most useful for packages such as static JavaScript assets and source archives. For pure Python packages that can run on any Python version, you can use the noarch: python value instead:

build:\n  noarch: python\n

Note

At the time of writing, noarch packages should not make use of preprocessing selectors: a noarch package is built with the directives that evaluate to true on the platform it is built on, which will likely result in an incorrect or incomplete installation on other platforms.

"},{"location":"reference/recipe_file/#include-build-recipe","title":"Include build recipe","text":"

The recipe and the rendered recipe.yaml file are included in the package metadata by default. You can disable this by passing --no-include-recipe on the command line.
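
For example (assuming the recipe lives in the current directory):

rattler-build build --recipe ./recipe.yaml --no-include-recipe\n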

Note

There are many more options in the build section. These additional options control how variants are computed, prefix replacements, and more. See the full build options for more information.

"},{"location":"reference/recipe_file/#requirements-section","title":"Requirements section","text":"

Specifies the build and runtime requirements. Dependencies of these requirements are included automatically.

Versions for requirements must follow the conda/mamba match specification. See build-version-spec.

"},{"location":"reference/recipe_file/#build","title":"Build","text":"

Tools required to build the package.

These packages are run on the build system and include things such as version control systems (git, svn), make tools (GNU make, Autotools, CMake), compilers (real cross, pseudo-cross, or native when not cross-compiling), and any source pre-processors.

Packages which provide \"sysroot\" files, like the CDT packages (see below), also belong in the build section.

requirements:\n  build:\n    - git\n    - cmake\n
"},{"location":"reference/recipe_file/#host","title":"Host","text":"

Represents packages that need to be specific to the target platform when the target platform is not necessarily the same as the native build platform. For example, in order for a recipe to be \"cross-capable\", shared libraries requirements must be listed in the host section, rather than the build section, so that the shared libraries that get linked are ones for the target platform, rather than the native build platform. You should also include the base interpreter for packages that need one. In other words, a Python package would list python here and an R package would list mro-base or r-base.

requirements:\n  build:\n    - ${{ compiler('c') }}\n    - if: linux\n      then:\n        - ${{ cdt('xorg-x11-proto-devel') }}\n  host:\n    - python\n

Note

When both \"build\" and \"host\" sections are defined, the build section can be thought of as \"build tools\" - things that run on the native platform, but output results for the target platform (e.g. a cross-compiler that runs on linux-64, but targets linux-armv7).

The PREFIX environment variable points to the host prefix. With respect to activation during builds, both the host and build environments are activated. The build prefix is activated before the host prefix so that the host prefix has priority over the build prefix. Executables that don't exist in the host prefix should be found in the build prefix.

The build and host prefixes are always separate when both are defined, or when ${{ compiler() }} Jinja functions are used. The only time that build and host are merged is when the host section is absent and no ${{ compiler() }} Jinja functions are used in the recipe.

"},{"location":"reference/recipe_file/#run","title":"Run","text":"

Packages required to run the package.

These are the dependencies that are installed automatically whenever the package is installed. Package names should follow the package match specifications.

requirements:\n  run:\n    - python\n    - six >=1.8.0\n

To build a recipe against different versions of NumPy and ensure that each version is part of the package dependencies, list numpy as a requirement in recipe.yaml and use a variant configuration file with multiple NumPy versions.

"},{"location":"reference/recipe_file/#run-constraints","title":"Run constraints","text":"

Packages that are optional at runtime but must obey the supplied additional constraint if they are installed.

Package names should follow the package match specifications.

requirements:\n  run_constraints:\n    - optional-subpackage ==${{ version }}\n

For example, let's say we have an environment that has package \"a\" installed at version 1.0. If we install package \"b\" that has a run_constraints entry of \"a >1.0\", then mamba would need to upgrade \"a\" in the environment in order to install \"b\".

This is especially useful in the context of virtual packages, where the run_constraints dependency is not a package that mamba manages, but rather a virtual package that represents a system property that mamba can't change. For example, a package on Linux may impose a run_constraints dependency on __glibc >=2.12. This is the version bound consistent with CentOS 6. Software built against glibc 2.12 will be compatible with CentOS 6. This run_constraints dependency helps mamba, conda or pixi tell the user that a given package can't be installed if their system glibc version is too old.

"},{"location":"reference/recipe_file/#run-exports","title":"Run exports","text":"

Packages may have runtime requirements such as shared libraries (e.g. zlib), which are required for linking at build time, and for resolving the link at run time. Such packages use run_exports for defining the runtime requirements to let the dependent packages understand the runtime requirements of the package.

Example from zlib:

  requirements:\n    run_exports:\n      - ${{ pin_subpackage('libzlib', exact=True) }}\n

Run exports are weak by default. But you can also define strong run_exports.

  requirements:\n    run_exports:\n      strong:\n        - ${{ pin_subpackage('libzlib', exact=True) }}\n
"},{"location":"reference/recipe_file/#ignore-run-exports","title":"Ignore run exports","text":"

There may be cases where an upstream package has a problematic run_exports constraint. You can ignore it in your recipe by listing the upstream package name in the ignore_run_exports section in requirements.

You can ignore them by package name, or by naming the runtime dependency directly.

  requirements:\n    ignore_run_exports:\n      from_package:\n        - zlib\n

Using a runtime dependency name:

  requirements:\n    ignore_run_exports:\n      by_name:\n        - libzlib\n

Note

ignore_run_exports only applies to runtime dependencies coming from an upstream package.

"},{"location":"reference/recipe_file/#tests-section","title":"Tests section","text":"

rattler-build supports four different types of tests. The \"script test\" installs the package and runs a list of commands. The \"Python test\" attempts to import a list of Python modules and runs pip check. The \"downstream test\" runs the tests of a downstream package that reverse depends on the package being built. And lastly, the \"package content test\" checks if the built package contains the mentioned items.

The tests section is a list of these items:

tests:\n  - script:\n      - echo \"hello world\"\n    requirements:\n      run:\n        - pytest\n    files:\n      source:\n        - test-data.txt\n\n  - python:\n      imports:\n        - bsdiff4\n      pip_check: true  # this is the default\n  - downstream: numpy\n
"},{"location":"reference/recipe_file/#script-test","title":"Script test","text":"

The script test has 3 top-level keys: script, files and requirements. Only the script key is required.

"},{"location":"reference/recipe_file/#test-commands","title":"Test commands","text":"

Commands that are run as part of the test.

tests:\n  - script:\n      - echo \"hello world\"\n      - bsdiff4 -h\n      - bspatch4 -h\n
"},{"location":"reference/recipe_file/#extra-test-files","title":"Extra test files","text":"

Test files that are copied from the source work directory into the temporary test directory and are needed during testing (note that the source work directory is otherwise not available at all during testing).

You can also include files that come from the recipe folder. They are copied into the test directory as well.

At test execution time, the test directory is the current working directory.

tests:\n  - script:\n      - ls\n    files:\n      source:\n        - myfile.txt\n        - tests/\n        - some/directory/pattern*.sh\n      recipe:\n        - extra-file.txt\n
"},{"location":"reference/recipe_file/#test-requirements","title":"Test requirements","text":"

In addition to the runtime requirements, you can specify requirements needed during testing. The runtime requirements that you specified in the \"run\" section described above are automatically included during testing (because the built package is installed as it regularly would be).

In the build section you can specify additional requirements that are only needed on the build system for cross-compilation (e.g. emulators or compilers).

tests:\n  - script:\n      - echo \"hello world\"\n    requirements:\n      build:\n        - myemulator\n      run:\n        - nose\n
"},{"location":"reference/recipe_file/#python-tests","title":"Python tests","text":"

For this test type you can list a set of Python modules that need to be importable. The test will fail if any of the modules cannot be imported.

The test will also automatically run pip check to check for any broken dependencies. This can be disabled by setting pip_check: false in the YAML.

tests:\n  - python:\n      imports:\n        - bsdiff4\n        - bspatch4\n      pip_check: true  # can be left out because this is the default\n

Internally this will write a small Python script that imports the modules:

import bsdiff4\nimport bspatch4\n
"},{"location":"reference/recipe_file/#check-for-package-contents","title":"Check for package contents","text":"

Checks if the built package contains the mentioned items. These checks are executed directly at the end of the build process to make sure that all expected files are present in the package.

tests:\n  - package_contents:\n      # checks for the existence of files inside $PREFIX or %PREFIX%\n      # or, checks that there is at least one file matching the specified `glob`\n      # pattern inside the prefix\n      files:\n        - etc/libmamba/test.txt\n        - etc/libmamba\n        - etc/libmamba/*.mamba.txt\n\n      # checks for the existence of `mamba/api/__init__.py` inside of the\n      # Python site-packages directory (note: also see Python import checks)\n      site_packages:\n        - mamba.api\n\n\n      # looks in $PREFIX/bin/mamba for unix and %PREFIX%\\Library\\bin\\mamba.exe on Windows\n      # note: also check the `commands` and execute something like `mamba --help` to make\n      # sure things work fine\n      bin:\n        - mamba\n\n      # searches for `$PREFIX/lib/libmamba.so` or `$PREFIX/lib/libmamba.dylib` on Linux or macOS,\n      # on Windows for %PREFIX%\\Library\\lib\\mamba.dll & %PREFIX%\\Library\\bin\\mamba.bin\n      lib:\n        - mamba\n\n      # searches for `$PREFIX/include/libmamba/mamba.hpp` on unix, and\n      # on Windows for `%PREFIX%\\Library\\include\\libmamba\\mamba.hpp`\n      include:\n        - libmamba/mamba.hpp\n
"},{"location":"reference/recipe_file/#downstream-tests","title":"Downstream tests","text":"

Warning

Downstream tests are not yet implemented in rattler-build.

A downstream test can mention a single package that has a dependency on the package being built. The test will install the package and run the tests of the downstream package with our current package as a dependency.

Sometimes downstream packages do not resolve. In this case, the test is ignored.

tests:\n  - downstream: numpy\n
"},{"location":"reference/recipe_file/#outputs-section","title":"Outputs section","text":"

Explicitly specifies packaging steps. This section supports multiple outputs, as well as different package output types. The format is a list of mappings.

When using multiple outputs, certain top-level keys are \"forbidden\": package and requirements. Instead of package, a top-level recipe key can be defined. The recipe.name is ignored but the recipe.version key is used as default version for each output. Other \"top-level\" keys are merged into each output (e.g. the about section) to avoid repetition. Each output is a complete recipe, and can have its own build, requirements, and test sections.

recipe:\n  # the recipe name is ignored\n  name: some\n  version: 1.0\n\noutputs:\n  - package:\n      # version is taken from recipe.version (1.0)\n      name: some-subpackage\n\n  - package:\n      name: some-other-subpackage\n      version: 2.0\n

Each output acts like an independent recipe and can have its own script, build_number, and so on.

outputs:\n  - package:\n      name: subpackage-name\n    build:\n      script: install-subpackage.sh\n

Each output is built independently. Take care not to package the same files twice.

"},{"location":"reference/recipe_file/#subpackage-requirements","title":"Subpackage requirements","text":"

Like a top-level recipe, a subpackage may have zero or more dependencies listed as build, host or run requirements.

The dependencies listed as subpackage build requirements are available only during the packaging phase of that subpackage.

outputs:\n  - package:\n      name: subpackage-name\n    requirements:\n      build:\n        - some-dep\n      run:\n        - some-dep\n

You can also use the pin_subpackage function to pin another output from the same recipe.

outputs:\n  - package:\n      name: libtest\n  - package:\n      name: test\n    requirements:\n      build:\n        - ${{ pin_subpackage('libtest', max_pin='x.x') }}\n

The outputs are topologically sorted by the dependency graph, which takes pin_subpackage invocations into account. When using pin_subpackage(name, exact=True), a special behavior is used where the name package is injected as a \"variant\" and the variant matrix is expanded appropriately. For example, consider the following situation, with a variant_config.yaml file that contains openssl: [1, 3]:

outputs:\n  - package:\n      name: libtest\n    requirements:\n      host:\n        - openssl\n  - package:\n      name: test\n    requirements:\n      build:\n        - ${{ pin_subpackage('libtest', exact=True) }}\n

Due to the variant config file, this will build two versions of libtest. We will also build two versions of test, one that depends on libtest (openssl 1) and one that depends on libtest (openssl 3).

"},{"location":"reference/recipe_file/#about-section","title":"About section","text":"

Specifies identifying information about the package. This information is displayed by the package server.

about:\n  homepage: https://example.com/bsdiff4\n  license: BSD-3-Clause # (1)!\n  license_file: LICENSE\n  summary: binary diff and patch using the BSDIFF4-format\n  description: |\n    Long description of bsdiff4 ...\n  repository: https://github.com/ilanschnell/bsdiff4\n  documentation: https://docs.com\n
  1. Only SPDX license identifiers are allowed; more info here: SPDX. If you want another license type, LicenseRef-<YOUR-LICENSE> can be used, e.g. license: LicenseRef-Proprietary
"},{"location":"reference/recipe_file/#license-file","title":"License file","text":"

Adds a file containing the software license to the package metadata. Many licenses require the license statement to be distributed with the package. The filename is relative to the source or recipe directory. The value can be a single filename or a YAML list for multiple license files. Values can also point to directories with license information. Directory entries must end with a / suffix (this is to lessen unintentional inclusion of non-license files; all the directory's contents will be unconditionally and recursively added).

about:\n  license_file:\n    - LICENSE\n    - vendor-licenses/\n
"},{"location":"reference/recipe_file/#extra-section","title":"Extra section","text":"

A schema-free area for storing non-conda-specific metadata in standard YAML form.

Example: To store recipe maintainers information
extra:\n  maintainers:\n   - name of maintainer\n
"},{"location":"reference/recipe_file/#templating-with-jinja","title":"Templating with Jinja","text":"

rattler-build supports limited Jinja templating in the recipe.yaml file.

You can set up Jinja variables in the context section:

context:\n  name: \"test\"\n  version: \"5.1.2\"\n  # later keys can reference previous keys\n  # and use jinja functions to compute new values\n  major_version: ${{ version.split('.')[0] }}\n

Later in your recipe.yaml you can use these values in string interpolation with Jinja:

source:\n  url: https://github.com/mamba-org/${{ name }}/v${{ version }}.tar.gz\n

Jinja has built-in support for some common string manipulations.

In rattler-build, complex Jinja is completely disallowed as we try to produce YAML that is valid at all times. So you should not use any {% if ... %} or similar Jinja constructs that produce invalid YAML. Furthermore, instead of plain double curly brackets, Jinja statements need to be prefixed by $, e.g. ${{ ... }}:

package:\n  name: {{ name }}   # WRONG: invalid yaml\n  name: ${{ name }} # correct\n

For more information, see the Jinja template documentation and the list of available environment variables env-vars.

Jinja templates are evaluated during the build process.

"},{"location":"reference/recipe_file/#additional-jinja2-functionality-in-rattler-build","title":"Additional Jinja2 functionality in rattler-build","text":"

Besides the default Jinja2 functionality, additional Jinja functions are available during the rattler-build process: pin_compatible, pin_subpackage, and compiler.

The compiler function takes c, cxx, fortran and other values as argument and automatically selects the right (cross-)compiler for the target platform.

build:\n  - ${{ compiler('c') }}\n

The pin_subpackage function pins another package produced by the recipe with the supplied parameters.

Similarly, the pin_compatible function will pin a package according to the specified rules.

"},{"location":"reference/recipe_file/#pin-expressions","title":"Pin expressions","text":"

rattler-build knows pin expressions. A pin expression can have a min_pin, max_pin and exact value. A max_pin and min_pin are specified with a string containing only x and ., e.g. max_pin=\"x.x.x\" would signify to pin the given package to <1.2.3 (if the package version is 1.2.2, for example).

A pin with min_pin=\"x.x\", max_pin=\"x.x\" for a package of version 1.2.2 would evaluate to >=1.2,<1.3.

If exact=true, then the hash is included, and the package is pinned exactly, e.g. ==1.2.2 h1234. This is a unique package variant that cannot exist more than once, and thus is \"exactly\" pinned.

"},{"location":"reference/recipe_file/#pin-subpackage","title":"Pin subpackage","text":"

Pin subpackage refers to another package from the same recipe file. It is commonly used in the requirements/run_exports section to export a run export from the package, or with multiple outputs to refer to a previous build.

It looks something like:

package:\n  name: mypkg\n  version: \"1.2.3\"\n\nrequirements:\n  run_exports:\n    # this will evaluate to `mypkg >=1.2.3,<1.3`\n    - ${{ pin_subpackage(name, max_pin='x.x') }}\n
"},{"location":"reference/recipe_file/#pin-compatible","title":"Pin compatible","text":"

Pin compatible lets you pin a package based on the version retrieved from the variant file (if the pinning from the variant file needs customization).

For example, if the variant specifies a pin for numpy: 1.11, one can use pin_compatible to relax it:

requirements:\n  host:\n    # this will select numpy 1.11\n    - numpy\n  run:\n    # this will export `numpy >=1.11,<2`, instead of the stricter `1.11` pin\n    - ${{ pin_compatible('numpy', min_pin='x.x', max_pin='x') }}\n
"},{"location":"reference/recipe_file/#the-env-jinja-functions","title":"The env Jinja functions","text":"

You can access the current environment variables using the env object in Jinja.

There are three functions:

  • env.get(\"ENV_VAR\") will insert the value of \"ENV_VAR\" into the recipe.
  • env.get(\"ENV_VAR\", default=\"undefined\") will insert the value of ENV_VAR into the recipe or, if ENV_VAR is not defined, the specified default value (in this case \"undefined\")
  • env.exists(\"ENV_VAR\") returns true if the environment variable is set (to any value) and false otherwise

This can be used for some light templating, for example:

build:\n  string: ${{ env.get(\"GIT_BUILD_STRING\") }}_${{ PKG_HASH }}\n
"},{"location":"reference/recipe_file/#match-function","title":"match function","text":"

This function matches the first argument (the package version) against the second argument (the version spec) and returns the resulting boolean. This only works for packages defined in the \"variant_config.yaml\" file.

recipe.yaml
match(python, '>=3.4')\n

For example, you could require a certain dependency only for builds against python 3.4 and above:

recipe.yaml
requirements:\n  build:\n    - if: match(python, '>=3.4')\n      then:\n        - some-dep\n

With a corresponding variant config that looks like the following:

variant_config.yaml
python: [\"3.2\", \"3.4\", \"3.6\"]\n

Example: match usage example

"},{"location":"reference/recipe_file/#cdt-function","title":"cdt function","text":"

This function helps add Core Dependency Tree packages as dependencies by converting packages as required according to hard-coded logic.

# on x86_64 system\ncdt('package-name') # outputs: package-name-cos6-x86_64\n# on aarch64 system\ncdt('package-name') # outputs: package-name-cos6-aarch64\n

Example: cdt usage example

"},{"location":"reference/recipe_file/#preprocessing-selectors","title":"Preprocessing selectors","text":"

You can add selectors to any item, and the selector is evaluated in a preprocessing stage. If a selector evaluates to true, the item is flattened into the parent element. If a selector evaluates to false, the item is removed.

Selectors can use if ... then ... else as follows:

source:\n  - if: not win\n    then:\n      - url: http://path/to/unix/source\n    else:\n      - url: http://path/to/windows/source\n\n# or the equivalent with two if conditions:\n\nsource:\n  - if: unix\n    then:\n      - url: http://path/to/unix/source\n  - if: win\n    then:\n      - url: http://path/to/windows/source\n

A selector is a valid Python expression that is evaluated. You can read more about selectors in the \"Selectors in recipes\" chapter.

The use of the Python version selectors, py27, py34, etc. is discouraged in favor of the more general comparison operators. Additional selectors in this series will not be added to conda-build.

Because the selector is any valid Python expression, complicated logic is possible:

- if: unix and not win\n  then: ...\n- if: (win or linux) and not py27\n  then: ...\n

Lists are automatically \"merged\" upwards, so it is possible to group multiple items under a single selector:

tests:\n  - script:\n    - if: unix\n      then:\n      - test -d ${PREFIX}/include/xtensor\n      - test -f ${PREFIX}/lib/cmake/xtensor/xtensorConfigVersion.cmake\n    - if: win\n      then:\n      - if not exist %LIBRARY_PREFIX%\\include\\xtensor\\xarray.hpp (exit 1)\n      - if not exist %LIBRARY_PREFIX%\\lib\\cmake\\xtensor\\xtensorConfigVersion.cmake (exit 1)\n\n# On unix this is rendered to:\ntests:\n  - script:\n    - test -d ${PREFIX}/include/xtensor\n    - test -f ${PREFIX}/lib/cmake/xtensor/xtensorConfigVersion.cmake\n
"},{"location":"reference/recipe_file/#experimental-features","title":"Experimental features","text":"

Warning

These are experimental features of rattler-build and may change or go away completely.

"},{"location":"reference/recipe_file/#jinja-functions","title":"Jinja functions","text":"
  • load_from_file
  • git.* functions
"},{"location":"tutorials/cpp/","title":"Packaging a C++ package","text":"

This tutorial will guide you through making a C++ package with rattler-build.

"},{"location":"tutorials/cpp/#building-a-header-only-library","title":"Building a Header-only Library","text":"

To build a package for the header-only library xtensor, you need to manage dependencies and ensure proper installation paths.

"},{"location":"tutorials/cpp/#key-steps","title":"Key Steps","text":"
  1. Dependencies: Ensure cmake, ninja, and a compiler are available as dependencies.

  2. CMake Installation Prefix: Use the CMAKE_INSTALL_PREFIX setting to instruct CMake to install the headers in the correct location.

  3. Unix Systems: Follow the standard Unix prefix:

    $PREFIX/include\n$PREFIX/lib\n

  4. Windows Systems: Use a Unix-like prefix but nested in a Library directory:

    $PREFIX/Library/include\n$PREFIX/Library/lib\n
    Utilize the handy variables %LIBRARY_PREFIX% and %LIBRARY_BIN% to guide CMake to install the headers and libraries correctly.

This approach ensures that the headers and libraries are installed in the correct directories on both Unix and Windows systems.

"},{"location":"tutorials/cpp/#recipe","title":"Recipe","text":"recipe.yaml
context:\n  version: \"0.24.6\"\n\npackage:\n  name: xtensor\n  version: ${{ version }}\n\nsource:\n  url: https://github.com/xtensor-stack/xtensor/archive/${{ version }}.tar.gz\n  sha256: f87259b51aabafdd1183947747edfff4cff75d55375334f2e81cee6dc68ef655\n\nbuild:\n  number: 0\n  script:\n    - if: win # (1)!\n      then: |\n        cmake -GNinja ^\n            -D BUILD_TESTS=OFF -DCMAKE_INSTALL_PREFIX=%LIBRARY_PREFIX% ^\n            %SRC_DIR%\n        ninja install\n      else: |\n        cmake -GNinja \\\n              -DBUILD_TESTS=OFF -DCMAKE_INSTALL_PREFIX=$PREFIX \\\n              $SRC_DIR\n        ninja install\n\nrequirements:\n  build:\n    - ${{ compiler('cxx') }} # (2)!\n    - cmake\n    - ninja\n  host:\n    - xtl >=0.7,<0.8\n  run:\n    - xtl >=0.7,<0.8\n  run_constraints: # (3)!\n    - xsimd >=8.0.3,<10\n\ntests:\n  - package_contents:\n      include: # (4)!\n        - xtensor/xarray.hpp\n      files: # (5)!\n        - ${{ \"Library\" if win }}/share/cmake/xtensor/xtensorConfig.cmake\n        - ${{ \"Library\" if win }}/share/cmake/xtensor/xtensorConfigVersion.cmake\n\nabout:\n  homepage: https://github.com/xtensor-stack/xtensor\n  license: BSD-3-Clause\n  license_file: LICENSE\n  summary: The C++ tensor algebra library\n  description: Multi dimensional arrays with broadcasting and lazy computing\n  documentation: https://xtensor.readthedocs.io\n  repository: https://github.com/xtensor-stack/xtensor\n\nextra:\n  recipe-maintainers:\n    - some-maintainer\n
  1. The if: condition allows the user to switch the behavior of the build based on checks such as the operating system.
  2. The compiler function is used to get the C++ compiler for the build system.
  3. The run_constraints section specifies the version range of a package that the built package can run \"with\" but does not itself depend on.
  4. The include section specifies the header file to be tested for existence.
  5. The files section specifies the files to be tested for existence, using a glob pattern.

CMAKE_ARGS

It can be tedious to remember all the different variables one needs to pass to CMake to create the perfect build. The cmake package on conda-forge introduces the CMAKE_ARGS environment variable. This variable contains the necessary flags to make the package build correctly, even when cross-compiling from one machine to another. It is therefore often not necessary to pass any additional flags to the cmake command. However, because this is a tutorial, we will show how to pass the necessary flags to cmake manually.

For more information please refer to the conda-forge documentation.

"},{"location":"tutorials/cpp/#building-a-c-application","title":"Building A C++ application","text":"

In this example, we'll build poppler, a C++ application for manipulating PDF files from the command line. The final package will install several tools into the bin/ folder. We'll use external build scripts and run actual scripts in the test.

"},{"location":"tutorials/cpp/#key-steps_1","title":"Key Steps","text":"
  1. Dependencies:

    • Build Dependencies: These are necessary for the building process, including cmake, ninja, and pkg-config.
    • Host Dependencies: These are the libraries poppler links against, such as cairo, fontconfig, freetype, glib, and others.
  2. Compiler Setup: We use the compiler function to obtain the appropriate C and C++ compilers.

  3. Build Script: The build.script field points to an external script (poppler-build.sh) which contains the build commands.

  4. Testing: Simple tests are included to verify that the installed tools (pdfinfo, pdfunite, pdftocairo) work correctly by running them and expecting an exit code of 0.

"},{"location":"tutorials/cpp/#recipe_1","title":"Recipe","text":"recipe.yaml
context:\n  version: \"24.01.0\"\n\npackage:\n  name: poppler\n  version: ${{ version }}\n\nsource:\n  url: https://poppler.freedesktop.org/poppler-${{ version }}.tar.xz\n  sha256: c7def693a7a492830f49d497a80cc6b9c85cb57b15e9be2d2d615153b79cae08\n\nbuild:\n  script: poppler-build.sh\n\nrequirements:\n  build:\n    - ${{ compiler('c') }} # (1)!\n    - ${{ compiler('cxx') }}\n    - pkg-config\n    - cmake\n    - ninja\n  host:\n    - cairo # (2)!\n    - fontconfig\n    - freetype\n    - glib\n    - libboost-headers\n    - libjpeg-turbo\n    - lcms2\n    - libiconv\n    - libpng\n    - libtiff\n    - openjpeg\n    - zlib\n\ntests:\n  - script:\n      - pdfinfo -listenc  # (3)!\n      - pdfunite --help\n      - pdftocairo --help\n
  1. The compiler Jinja function selects the correct C and C++ compilers for the build system.
  2. These are all the dependencies that the library links against.
  3. The script tests simply execute some of the installed tools to check that they work. They run in bash or cmd.exe and can be as complex as you want.
"},{"location":"tutorials/cpp/#external-build-script","title":"External Build Script","text":"

We've defined an external build script in the recipe. The script is searched for next to the recipe under the file name given; if no name is given, the defaults build.sh on Unix and build.bat on Windows are used.

poppler-build.sh
#! /bin/bash\n\nextra_cmake_args=(\n    -GNinja\n    -DCMAKE_INSTALL_LIBDIR=lib\n    -DENABLE_UNSTABLE_API_ABI_HEADERS=ON\n    -DENABLE_GPGME=OFF\n    -DENABLE_LIBCURL=OFF\n    -DENABLE_LIBOPENJPEG=openjpeg2\n    -DENABLE_QT6=OFF\n    -DENABLE_QT5=OFF\n    -DENABLE_NSS3=OFF\n)\n\nmkdir build && cd build\n\ncmake ${CMAKE_ARGS} \"${extra_cmake_args[@]}\" \\\n    -DCMAKE_PREFIX_PATH=$PREFIX \\\n    -DCMAKE_INSTALL_PREFIX=$PREFIX \\\n    -DTIFF_INCLUDE_DIR=$PREFIX/include \\\n    $SRC_DIR\n\nninja\n\n# The `install` command will take care of copying the files to the right place\nninja install\n
"},{"location":"tutorials/cpp/#parsing-the-rattler-build-build-output","title":"Parsing the rattler-build build Output","text":"

When running the rattler-build command, you might notice some interesting information in the output: our package ends up with several run dependencies, even though we didn't specify any.

These come from the run-exports of the packages listed in the host section of the recipe. This is indicated by \"RE of [host: package]\" in the output.

For example, libcurl specifies that if you depend on it in the host section, you should also depend on it during runtime with specific version ranges. This ensures proper linking to shared libraries.
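
For illustration, run-exports are declared in the library's own recipe under the requirements section. A hypothetical sketch (not libcurl's actual recipe) could look like this:

requirements:\n  run_exports:\n    # whoever lists this package in the host section automatically\n    # gets a matching, version-pinned run dependency\n    - ${{ pin_subpackage(\"mylib\") }}\n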

Run dependencies:\n\u256d\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u252c\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u256e\n\u2502 Name                  \u2506 Spec                                         \u2502\n\u255e\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u256a\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2561\n\u2502 libcurl               \u2506 >=8.5.0,<9.0a0 (RE of [host: libcurl])       \u2502\n\u2502 fontconfig            \u2506 >=2.14.2,<3.0a0 (RE of [host: fontconfig])   \u2502\n\u2502 fonts-conda-ecosystem \u2506 (RE of [host: fontconfig])                   \u2502\n\u2502 lcms2                 \u2506 >=2.16,<3.0a0 (RE of [host: lcms2])          \u2502\n\u2502 gettext               \u2506 >=0.21.1,<1.0a0 (RE of [host: gettext])      \u2502\n\u2502 freetype              \u2506 >=2.12.1,<3.0a0 (RE of [host: freetype])     \u2502\n\u2502 openjpeg              \u2506 >=2.5.0,<3.0a0 (RE of [host: openjpeg])      \u2502\n\u2502 libiconv              \u2506 >=1.17,<2.0a0 (RE of [host: libiconv])       \u2502\n\u2502 cairo                 \u2506 >=1.18.0,<2.0a0 (RE of [host: cairo])        \u2502\n\u2502 libpng                \u2506 >=1.6.42,<1.7.0a0 (RE of [host: libpng])     \u2502\n\u2502 libzlib               \u2506 >=1.2.13,<1.3.0a0 (RE of [host: zlib])       \u2502\n\u2502 libtiff               \u2506 >=4.6.0,<4.7.0a0 (RE of [host: libtiff])     \u2502\n\u2502 libjpeg-turbo         \u2506 >=3.0.0,<4.0a0 (RE of [host: libjpeg-turbo]) \u2502\n\u2502 libglib               \u2506 >=2.78.3,<3.0a0 (RE of [host: glib])         \u2502\n\u2502 libcxx                \u2506 >=16 (RE of [build: clangxx_osx-arm64])      \u2502\n\u2570\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2534\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u256f\n

You can also see \"linking\" information in the output, for example on macOS:

[lib/libpoppler-glib.8.26.0.dylib] links against:\n \u251c\u2500 @rpath/libgio-2.0.0.dylib\n \u251c\u2500 @rpath/libgobject-2.0.0.dylib\n \u251c\u2500 /usr/lib/libSystem.B.dylib\n \u251c\u2500 @rpath/libglib-2.0.0.dylib\n \u251c\u2500 @rpath/libpoppler.133.dylib\n \u251c\u2500 @rpath/libfreetype.6.dylib\n \u251c\u2500 @rpath/libc++.1.dylib\n \u251c\u2500 @rpath/libpoppler-glib.8.dylib\n \u2514\u2500 @rpath/libcairo.2.dylib\n

rattler-build ensures that:

  1. All shared libraries linked against are present in the run dependencies. Missing libraries trigger an overlinking warning.
  2. You don't require any packages in the host section that you are not linking against; an unused host dependency triggers an overdepending warning.
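
If the linter flags a shared library that your build legitimately links against, the warning can be silenced in the recipe. A hedged sketch, assuming the dynamic_linking build options and a hypothetical library path:

build:\n  dynamic_linking:\n    missing_dso_allowlist:\n      - /usr/lib/libfoo.dylib  # hypothetical system library\n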
"},{"location":"tutorials/python/","title":"Writing a Python package","text":"

Writing a Python package is fairly straightforward, especially for \"Python-only\" packages. In the second example, we will build a package for numpy, which contains compiled code.

"},{"location":"tutorials/python/#a-python-only-package","title":"A Python-only package","text":"

The following recipe uses the noarch: python setting to build a noarch package that can be installed on any platform without modification. This is very handy for packages that are pure Python and do not contain any compiled extensions.

Additionally, noarch: python packages work with a range of Python versions (contrary to packages with compiled extensions that are tied to a specific Python version).

recipe.yaml
context:\n  version: \"8.1.2\"\n\npackage:\n  name: ipywidgets\n  version: ${{ version }}\n\nsource:\n  url: https://pypi.io/packages/source/i/ipywidgets/ipywidgets-${{ version }}.tar.gz\n  sha256: d0b9b41e49bae926a866e613a39b0f0097745d2b9f1f3dd406641b4a57ec42c9\n\nbuild:\n  noarch: python # (1)!\n  script: pip install . -v\n\nrequirements:\n  # note that there is no build section\n  host:\n    - pip\n    - python >=3.7\n    - setuptools\n    - wheel\n  run:\n    - comm >=0.1.3\n    - ipython >=6.1.0\n    - jupyterlab_widgets >=3.0.10,<3.1.0\n    - python >=3.7\n    - traitlets >=4.3.1\n    - widgetsnbextension >=4.0.10,<4.1.0\n\ntests:\n  - python:\n      imports:\n        - ipywidgets # (2)!\n\nabout:\n  homepage: https://github.com/ipython/ipywidgets\n  license: BSD-3-Clause\n  license_file: LICENSE\n  summary: Jupyter Interactive Widgets\n  description: |\n    ipywidgets are interactive HTML widgets for Jupyter notebooks and the IPython kernel.\n  documentation: https://ipywidgets.readthedocs.io/en/latest/\n
  1. The noarch: python line tells rattler-build that this package is pure Python and that a single build fits all platforms. noarch packages can be installed on any platform without modification, which is very handy.
  2. The imports section in the tests is used to check that the package is installed correctly and can be imported.
"},{"location":"tutorials/python/#running-the-recipe","title":"Running the recipe","text":"

To build this recipe, simply run:

rattler-build build --recipe ./ipywidgets\n
"},{"location":"tutorials/python/#a-python-package-with-compiled-extensions","title":"A Python package with compiled extensions","text":"

We will build a package for numpy, which contains compiled code. Since compiled code is Python version-specific, we need to specify the Python versions explicitly. The best way to do this is with a variants.yaml file:

variants.yaml
python:\n  - 3.11\n  - 3.12\n

This will replace any python dependency found in the recipe with the versions specified in the variants.yaml file, building one package per listed version.
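
Conceptually, the bare python entry in the host requirements is rendered once per variant, roughly like this (an illustrative sketch):

# rendered host requirements, first variant\n- python 3.11*\n\n# rendered host requirements, second variant\n- python 3.12*\n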

recipe.yaml
context:\n  version: 1.26.4\n\npackage:\n  name: numpy\n  version: ${{ version }}\n\nsource:\n  - url: https://github.com/numpy/numpy/releases/download/v${{ version }}/numpy-${{ version }}.tar.gz\n    sha256: 2a02aba9ed12e4ac4eb3ea9421c420301a0c6460d9830d74a9df87efa4912010\n\nbuild:\n  python:\n    entry_points:\n      - f2py = numpy.f2py.f2py2e:main  # [win]\n\nrequirements:\n  build:\n    - ${{ compiler('c') }}\n    - ${{ compiler('cxx') }}\n  host:\n    # note: variant is injected here!\n    - python\n    - pip\n    - meson-python\n    - ninja\n    - pkg-config\n    - python-build\n    - cython\n    - libblas\n    - libcblas\n    - liblapack\n  run:\n    - python\n  run_exports:\n    - ${{ pin_subpackage(\"numpy\") }}\n\ntests:\n  - python:\n      imports:\n        - numpy\n        - numpy.array_api\n        - numpy.array_api.linalg\n        - numpy.ctypeslib\n\n  - script:\n    - f2py -h\n\nabout:\n  homepage: http://numpy.org/\n  license: BSD-3-Clause\n  license_file: LICENSE.txt\n  summary: The fundamental package for scientific computing with Python.\n  documentation: https://numpy.org/doc/stable/\n  repository: https://github.com/numpy/numpy\n

The build script for Unix:

build.sh
mkdir builddir\n\n# build a wheel (-w) with no build isolation (-n) and no dependency\n# check (-x); the -C flags are passed through to the meson backend\n$PYTHON -m build -w -n -x \\\n    -Cbuilddir=builddir \\\n    -Csetup-args=-Dblas=blas \\\n    -Csetup-args=-Dlapack=lapack\n\n# install the single wheel that was just built\n$PYTHON -m pip install dist/numpy*.whl\n

The build script for Windows:

build.bat
mkdir builddir\n\n%PYTHON% -m build -w -n -x ^\n    -Cbuilddir=builddir ^\n    -Csetup-args=-Dblas=blas ^\n    -Csetup-args=-Dlapack=lapack\nif %ERRORLEVEL% neq 0 exit 1\n\n:: `pip install dist\\numpy*.whl` does not work on windows,\n:: so use a loop; there's only one wheel in dist/ anyway\nfor /f %%f in ('dir /b /S .\\dist') do (\n    pip install %%f\n    if %ERRORLEVEL% neq 0 exit 1\n)\n
"},{"location":"tutorials/python/#running-the-recipe_1","title":"Running the recipe","text":"

Running this recipe with the variant config file will build a total of 2 numpy packages:

rattler-build build --recipe ./numpy\n

At the beginning of the build process, rattler-build will print the following message to show you the variants it found:

Found variants:\n\nnumpy-1.26.4-py311h5f8ada8_0\n\u256d\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u252c\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u256e\n\u2502 Variant         \u2506 Version   \u2502\n\u255e\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u256a\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2561\n\u2502 python          \u2506 3.11      \u2502\n\u2502 target_platform \u2506 osx-arm64 \u2502\n\u2570\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2534\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u256f\n\nnumpy-1.26.4-py312h440f24a_0\n\u256d\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u252c\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u256e\n\u2502 Variant         \u2506 Version   \u2502\n\u255e\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u256a\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2561\n\u2502 python          \u2506 3.12      \u2502\n\u2502 target_platform \u2506 osx-arm64 \u2502\n\u2570\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2534\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u256f\n
"},{"location":"tutorials/rust/","title":"Building a Rust package","text":"

We're using rattler-build to build a Rust package for the cargo-edit utility. This utility manages Cargo dependencies from the command line.

recipe.yaml
context:\n  version: \"0.11.9\"\n\npackage:\n  name: cargo-edit\n  version: ${{ version }}\n\nsource:\n  url: https://github.com/killercup/cargo-edit/archive/refs/tags/v${{ version }}.tar.gz\n  sha256: 46670295e2323fc2f826750cdcfb2692fbdbea87122fe530a07c50c8dba1d3d7\n\nbuild:\n  script:\n    - cargo-bundle-licenses --format yaml --output ${SRC_DIR}/THIRDPARTY.yml  # !(1)\n    - $BUILD_PREFIX/bin/cargo install --locked --bins --root ${PREFIX} --path .\n\nrequirements:\n  build:\n    - ${{ compiler('rust') }}\n    - cargo-bundle-licenses\n\ntests:\n  - script:\n      - cargo-upgrade --help # !(2)\n\nabout:\n  homepage: https://github.com/killercup/cargo-edit\n  license: MIT\n  license_file:\n    - LICENSE\n    - THIRDPARTY.yml\n  description: \"A utility for managing cargo dependencies from the command line.\"\n  summary: \"A utility for managing cargo dependencies from the command line.\"\n

Note

The ${{ compiler(...) }} functions are very useful in the context of cross-compilation. When the function is evaluated, it inserts the correct compiler (as selected with the variant config) as well as the target_platform. The \"rendered\" compiler will look like rust_linux-64 when targeting the linux-64 platform.

You can read more about this in the cross-compilation section.
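
The compiler version itself can be pinned through the variant config. A hedged sketch, assuming the conventional rust_compiler_version variant key and a hypothetical version:

variants.yaml
rust_compiler_version:\n  - \"1.77\"\n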

  1. The cargo-bundle-licenses utility bundles all the licenses of the dependencies into a THIRDPARTY.yml file, which is then included in the package. You should always include the bundled licenses when redistributing a package.
  2. The script tests run in bash or cmd.exe to check that the package built well; each command must exit with code 0 for the test to pass.

To build this recipe, simply run:

rattler-build build \\\n    --recipe ./cargo-edit/recipe.yaml\n
"}]} \ No newline at end of file diff --git a/dev/sitemap.xml b/dev/sitemap.xml index f02546c2..3d95e2f6 100644 --- a/dev/sitemap.xml +++ b/dev/sitemap.xml @@ -2,132 +2,132 @@ https://prefix-dev.github.io/rattler-build/dev/ - 2024-07-24 + 2024-08-05 daily https://prefix-dev.github.io/rattler-build/dev/authentication_and_upload/ - 2024-07-24 + 2024-08-05 daily https://prefix-dev.github.io/rattler-build/dev/automatic_linting/ - 2024-07-24 + 2024-08-05 daily https://prefix-dev.github.io/rattler-build/dev/build_options/ - 2024-07-24 + 2024-08-05 daily https://prefix-dev.github.io/rattler-build/dev/build_script/ - 2024-07-24 + 2024-08-05 daily https://prefix-dev.github.io/rattler-build/dev/compilers/ - 2024-07-24 + 2024-08-05 daily https://prefix-dev.github.io/rattler-build/dev/converting_from_conda_build/ - 2024-07-24 + 2024-08-05 daily https://prefix-dev.github.io/rattler-build/dev/experimental_features/ - 2024-07-24 + 2024-08-05 daily https://prefix-dev.github.io/rattler-build/dev/highlevel/ - 2024-07-24 + 2024-08-05 daily https://prefix-dev.github.io/rattler-build/dev/internals/ - 2024-07-24 + 2024-08-05 daily https://prefix-dev.github.io/rattler-build/dev/multiple_output_cache/ - 2024-07-24 + 2024-08-05 daily https://prefix-dev.github.io/rattler-build/dev/package_spec/ - 2024-07-24 + 2024-08-05 daily https://prefix-dev.github.io/rattler-build/dev/rebuild/ - 2024-07-24 + 2024-08-05 daily https://prefix-dev.github.io/rattler-build/dev/recipe_generation/ - 2024-07-24 + 2024-08-05 daily https://prefix-dev.github.io/rattler-build/dev/selectors/ - 2024-07-24 + 2024-08-05 daily https://prefix-dev.github.io/rattler-build/dev/special_files/ - 2024-07-24 + 2024-08-05 daily https://prefix-dev.github.io/rattler-build/dev/testing/ - 2024-07-24 + 2024-08-05 daily https://prefix-dev.github.io/rattler-build/dev/tips_and_tricks/ - 2024-07-24 + 2024-08-05 daily https://prefix-dev.github.io/rattler-build/dev/tui/ - 2024-07-24 + 2024-08-05 daily https://prefix-dev.github.io/rattler-build/dev/variants/ - 2024-07-24 + 2024-08-05 daily https://prefix-dev.github.io/rattler-build/dev/reference/cli/ - 2024-07-24 + 2024-08-05 daily https://prefix-dev.github.io/rattler-build/dev/reference/jinja/ - 2024-07-24 + 2024-08-05 daily https://prefix-dev.github.io/rattler-build/dev/reference/recipe_file/ - 2024-07-24 + 2024-08-05 daily https://prefix-dev.github.io/rattler-build/dev/tutorials/cpp/ - 2024-07-24 + 2024-08-05 daily https://prefix-dev.github.io/rattler-build/dev/tutorials/python/ - 2024-07-24 + 2024-08-05 daily https://prefix-dev.github.io/rattler-build/dev/tutorials/rust/ - 2024-07-24 + 2024-08-05 daily \ No newline at end of file diff --git a/dev/sitemap.xml.gz b/dev/sitemap.xml.gz index 557d35e4d98b5e082b33eea565aaa6580be7ec89..72a8908945b5abad843821fcc37d7bdde3e4b15c 100644 GIT binary patch literal 466 zcmV;@0WJO?iwFq!o3LgA|8r?{Wo=<_E_iKh0M(blZrmUchVOZbmG3rg)JkdPb#8fr z^o%BWY)^uL!mzt}`+|4X?gLbnhKpgqqmMEEXBeI@n?0N`Ci3pseChh;fX?`d+;VJw zzP_|i&GYy$jK}emCi%MoKgQRXo)py^U@!?Qfb#U$N zZLr>q)k`XFLd2Oa?S!XpAzt%aM_xr>wiw&DOm$W*nupT;JKOnjo{pVaZ?x`f>sh%v=e^Qqf6JBU6F7^hS7 z^T+r0OY_n{9Sa2p+1167Pt7cnzZrA6TslU9#bxdiO@{qk2W;*}v+uq({nM_t)xoW| zwPQ%WU%ha3lL7qxD>6eA^iFC_t&4*#IV3k|+Cf?>y|)4D zM&hJ~eCyei*T}OnYW9R#)Ij^RWUMAuL>#Not>@W}longZgR{4JEqj1#2EwK4iwkxH zCxt-I7npIH1DqkKA+Gud6kX60GP=U2~-#Y5`VN_f^H?t