Contributing packages

The staging process

This document presents an overview of how to contribute packages to conda-forge.

Getting Started

There are multiple ways to get started:

  1. Look at the example recipe in the staged-recipes repository and modify it as necessary.

  2. If it is an R package from CRAN, please instead start by using the conda-forge helper script for R recipes. Then if necessary you can make manual edits to the recipe.

  3. If it is a Python package, you can generate a skeleton as a starting point with conda skeleton pypi your_package_name. You do not have to use the skeleton, and the recipes it produces will need to be edited. In particular, you’ll at least need to change the build line to use pip, add yourself as a maintainer, and specify a license_file (see the sketch below).
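
As an illustration, the parts of a skeleton-generated recipe that typically need editing might end up looking like the following minimal sketch; your-github-id is a placeholder:

build:
  script: "{{ PYTHON }} -m pip install . --no-deps -vv"

about:
  license_file: LICENSE

extra:
  recipe-maintainers:
    - your-github-id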

Your final recipe should have no comments (unless they’re actually relevant to the recipe, and not generic instruction comments), and follow the order in the example.

If there are details you are not sure about please open a pull request. The conda-forge team will be happy to answer your questions.

If you are building your first recipe for conda-forge, the following step-by-step instructions and checklist will help you toward a successful build.

Step-by-step Instructions

  1. Ensure your source code can be downloaded as a single file. Source code should be downloadable as an archive (.tar.gz, .zip, .tar.bz2, .tar.xz) or tagged on GitHub, to ensure that it can be verified. (For further detail, see Build from tarballs, not repos.)

  2. Fork the staged-recipes repository.

  3. Create a new branch from the staged-recipes master branch.

  4. Within your forked copy, create a new folder in the recipes subdirectory and copy the meta.yaml file from the example directory. Please leave the example directory unchanged!

  5. Edit the copied recipe (meta.yaml) as needed. For details, see The recipe meta.yaml.

  6. Generate the SHA256 checksum for your source code archive with the openssl tool, as described in the example recipe. Alternatively, you can copy the SHA256 directly from the package’s file listing on PyPI.

  7. Be sure to fill in the test section. The simplest test simply verifies that the module can be imported, as described in the example.

  8. Remove all irrelevant comments in the meta.yaml file.
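
Taken together, a finished recipe for a simple pure-Python package might look roughly like the following sketch. The package name, version, URL, checksum, and maintainer handle are all placeholders:

{% set name = "your_package" %}
{% set version = "1.0.0" %}

package:
  name: {{ name }}
  version: {{ version }}

source:
  url: https://pypi.io/packages/source/{{ name[0] }}/{{ name }}/{{ name }}-{{ version }}.tar.gz
  sha256: 0123456789abcdef...  # placeholder; use the real checksum

build:
  noarch: python  # only if the package is pure Python
  number: 0
  script: "{{ PYTHON }} -m pip install . --no-deps -vv"

requirements:
  host:
    - python
    - pip
  run:
    - python

test:
  imports:
    - your_package

about:
  home: https://github.com/example/your_package
  license: MIT
  license_file: LICENSE
  summary: One-line description of the package

extra:
  recipe-maintainers:
    - your-github-id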

Checklist

  • Ensure that the license and the optional license family descriptors have the right case and that the license is correct. Note that these fields are case sensitive (e.g. Apache 2.0 rather than APACHE 2.0). A sketch follows this checklist.

  • Ensure that you have included a license file if your license requires one – most do (see Packaging the license manually).

  • If your project includes tests, decide whether they should be executed while building the conda-forge feedstock.

  • Make sure that all tests pass successfully at least on your development machine.

  • Recommended: run the tests locally to ensure your recipe works (see Running tests locally for staged recipes).

  • Make sure that your changes do not interfere with other recipes that are in the recipes folder (e.g. the example recipe).
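
As an illustration, license metadata satisfying the first two checklist items might look like this in the about section of meta.yaml (a sketch; the values are hypothetical):

about:
  license: Apache 2.0
  license_family: APACHE
  license_file: LICENSE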

Feedback and revision

Once you have finished your PR, all you have to do is wait for feedback from our reviewer team.

The reviewer team will assist you by pointing out improvements and answering questions. Once the package is ready, the reviewers will approve and merge your pull request.

After merging the PR, our infrastructure will build the package and make it available in the conda-forge channel.

Note

If you have questions or have not heard back for a while, you can notify us by including @conda-forge/staged-recipes in your GitHub message.

Post staging process

  • After the PR is merged, our CI services will create a new git repo automatically. For example, the recipe for a package named pydstool will be moved to a new repository https://github.com/conda-forge/pydstool-feedstock.

  • CI services will be enabled automatically, and a build will be triggered that builds the conda package and uploads it to https://anaconda.org/conda-forge.

  • If this is your first contribution, you will be added to the conda-forge team and given access to the CI services so that you can stop and restart builds. You will also be given commit rights to the new git repository.

  • If you want to make a change to the recipe, send a PR to the git repository from a fork. Branches of the main repository are used for maintaining different versions only.

Maintainer role

The maintainer’s job is to:

  • Keep the feedstock updated by merging maintenance PRs from conda-forge’s bots.

  • Keep the feedstock on par with new releases of the source package by

    • Bumping the version number and checksum.

    • Making sure that the feedstock’s requirements stay accurate.

    • Making sure the test requirements match those of the updated package.

  • Answer questions about the package on the feedstock issue tracker.

The recipe meta.yaml

The meta.yaml file in the recipe directory is at the heart of every conda package. It defines everything that is required to build and use the package.

meta.yaml is written in YAML format, augmented with Jinja templating.

A full reference of the structure and fields of the meta.yaml file can be found in the Defining metadata (meta.yaml) section of the conda-build documentation.

In the following, we highlight particularly important and conda-forge specific information and guidelines, ordered by section in meta.yaml.

Source

Build from tarballs, not repos

Packages should be built from tarballs using the url key, not from repositories directly by using e.g. git_url.

There are several reasons behind this rule:

  • Repositories are usually larger than tarballs, draining shared CI time and bandwidth.

  • Repositories are not checksummed. Using a tarball thus gives a stronger guarantee that the downloaded source is in fact the intended package.

  • On some systems, it is possible to lack the permissions needed to remove a repository once it has been created.

Populating the hash field

If your package is on PyPI, you can get the sha256 hash from your package’s page on PyPI; look for the SHA256 link next to the download link on your package’s files page, e.g. https://pypi.org/project/<your-project>/#files.

You can also generate a hash from the command line on Linux (and on macOS if you install the necessary tools below).

To generate the sha256 hash:

$ openssl sha256 your_sdist.tar.gz

You may need the openssl package, available on conda-forge:

$ conda install openssl -c conda-forge
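
The resulting hash then goes into the source section of the recipe, for example (a sketch; the URL and hash are placeholders):

source:
  url: https://pypi.io/packages/source/y/your_package/your_package-1.0.0.tar.gz
  sha256: 0123456789abcdef...  # paste the hash generated above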

Downloading extra sources and data files

conda-build 3 supports multiple sources per recipe. Examples are available in the conda-build docs.
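
As a sketch of what multiple sources look like (the URLs, hashes, and folder names are hypothetical), each source can be unpacked into its own folder:

source:
  - url: https://example.com/your_package-1.0.0.tar.gz
    sha256: 0123456789abcdef...
    folder: your_package
  - url: https://example.com/test-data.tar.gz
    sha256: fedcba9876543210...
    folder: test-data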

Build

Skipping builds

Use the skip key in the build section together with a selector. For example, you can specify not to build:

  • on specific architectures:

build:
  skip: true  # [win]
  • for specific python versions:

build:
  skip: true  # [py<35]

A full description of selectors is in the conda docs.

Optional: bld.bat and/or build.sh

In many cases, bld.bat and/or build.sh files are not required. Pure Python packages almost never need them.

If the build can be executed with one line, you may put this line in the script entry of the build section of the meta.yaml file: script: "{{ PYTHON }} -m pip install . --no-deps -vv".

Remember to always add pip to the host requirements.

Use pip

Normally Python packages should use this line:

build:
  script: "{{ PYTHON }} -m pip install . --no-deps -vv"

as the installation script in the meta.yaml file or in the bld.bat/build.sh script files, while adding pip to the host requirements:

requirements:
  host:
    - pip

These options ensure a clean installation of the package without its dependencies. This helps make sure that we’re only including this package, and not accidentally bundling any dependencies into the conda package.

Note that the --no-deps flag means that, for pure-Python packages, usually only python and pip are needed as build or host requirements; the real package dependencies are run requirements only.

Requirements

Build, host and run

Conda-build distinguishes three different kinds of dependencies. The following paragraphs give a very short overview of which packages go where. For a detailed explanation, please refer to the conda-build documentation.

Build

Build dependencies are required in the build environment; they comprise all tools that run during the build but are not needed on the host platform the package is built for.

The following packages are examples of typical build dependencies:

  • compilers (e.g. {{ compiler('c') }})

  • build tools such as cmake, make

  • pkg-config

Host

Host dependencies are required during the build phase, but in contrast to build dependencies, they have to be present on the host.

The following packages are typical examples of host dependencies:

  • shared libraries (C/C++)

  • Python/R libraries that link against C libraries (see e.g. Building Against NumPy)

  • python, r-base

  • setuptools, pip (see Use pip)

Run

Run dependencies are only required at run time of the package. They typically include

  • most Python/R libraries
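
Putting the three kinds together, a package with compiled extensions that links against numpy might declare something like this (a sketch; see the Jinja templating section below for the compiler and pin_compatible functions):

requirements:
  build:
    - {{ compiler('c') }}
    - make
  host:
    - python
    - pip
    - numpy
  run:
    - python
    - {{ pin_compatible('numpy') }}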

Avoid external dependencies

As a general rule: all dependencies have to be packaged by conda-forge as well. This is necessary to ensure ABI compatibility for all our packages.

There are only a few exceptions to this rule:

  1. Some dependencies have to be satisfied with CDT packages (see Core dependency tree packages (CDT)).

  2. Some packages require root access (e.g. device drivers) that cannot be distributed by conda-forge. These dependencies should be avoided whenever possible.

Pinning

Linking against shared C/C++ libraries makes the package depend on the ABI of the library that was used at build time. The exposed interface changes when previously exposed symbols are deleted or modified in a newer version.

It is therefore crucial to ensure that only library versions with a compatible ABI are used after linking.

In the best case, the shared library you depend on is pinned globally by conda-forge (through the conda-forge-pinning package). In these cases you do not have to worry about version requirements:

requirements:
  # [...]
  host:
    - readline
    - libpng

In other cases, you have to specify ABI-compatible versions manually:

requirements:
  # [...]
  host:
    - libawesome 1.1.*

For more information on pinning, please refer to Pinned dependencies.

Constraining packages at runtime

The run_constrained section allows defining restrictions on packages at runtime without depending on them. It can be used to restrict the allowed versions of optional dependencies and to declare incompatible packages.

Defining non-dependency restrictions

Imagine a package that can be used together with version 1 of awesome-software when it is present, but does not strictly depend on it. You would therefore like to let users choose whether to use the package with or without awesome-software. Let’s assume further that the package is incompatible with version 2 of awesome-software.

In this case run_constrained can be used to restrict awesome-software to version 1.*, if the user chooses to install it:

requirements:
  # [...]
  run_constrained:
    - awesome-software 1.*

Here run_constrained acts as a means to protect users from incompatible versions without introducing an unwanted dependency.

Defining conflicts

Sometimes packages interfere with each other and therefore only one of them can be installed at any time. In combination with an unsatisfiable version, run_constrained can define blockers:

package:
  name: awesome-db

requirements:
  # [...]
  run_constrained:
    - amazing-db ==9999999999

In this example, awesome-db cannot be installed together with amazing-db, as there is no package amazing-db-9999999999.

Test

All recipes need tests. Here are some tips, tricks, and justifications. How you should test depends on the type of package (python, c-lib, command-line tool, … ), and what tests are available for that package. But every conda package must have at least some tests.

Simple existence tests

Sometimes defining tests seems hard, e.g. because:

  • tests for the underlying code base may not exist.

  • test suites may take too long to run on limited CI infrastructure.

  • tests may take too much bandwidth.

In these cases, conda-forge may not be able to execute the prescribed test suite.

However, this is no reason for the recipe to not have tests. At the very least we want to verify that the package has installed the desired files in the desired locations. This is called existence testing.

Existence testing can be accomplished in the meta.yaml file in the test/commands block.

On POSIX systems, use the test utility and the $PREFIX variable.

On Windows, use the exist command. See below for an example.

Simple existence testing example:

test:
  commands:
    - test -f $PREFIX/lib/libboost_log$SHLIB_EXT  # [unix]
    - if not exist %LIBRARY_LIB%\\boost_log-vc140-mt.lib exit 1  # [win]

Testing python packages

For the best information about testing, see the conda build docs test section.

Testing importing

The minimal test of a python package should make sure that the package can be successfully imported. This can be accomplished with this stanza in the meta.yaml:

test:
  imports:
    - package_name

Note that package_name is the name imported by Python, not necessarily the name of the conda package (they are sometimes different).

Testing for an import will catch the bulk of packaging errors, generally including the presence of dependencies. However, it does not assure that the package works correctly; in particular, it does not test whether the package works with the installed versions of its dependencies.

It is good to run some other tests of the code itself (the test suite) if possible.

Running unit tests

The trick here is that there are multiple ways to run unit tests in Python, including nose, pytest, etc.

Also, some packages install the tests with the package, so they can be run in place, while others keep the tests with the source code, so they cannot be run straight from an installed package.

Test requirements

Sometimes there are packages required to run the tests that are not required to simply use the package. This is usually a test-running framework, such as nose or pytest. You can ensure that it is included by adding it to the requires section of the test stanza:

test:
  imports:
    - package_name
  requires:
    - pytest

Copying test files

Often, test files are not installed alongside the package. conda-build executes the test stage in a fresh working directory that does not contain the files from the source package.

You can include files required for testing with the source_files section:

test:
  imports:
    - package_name
  requires:
    - pytest
  source_files:
    - tests
    - test_pkg_integration.py
  commands:
    - pytest tests test_pkg_integration.py

The source_files section works for files and directories.

Built-in tests

Some packages have testing built-in. In this case, you can put a test command directly in the test stanza:

test:
  commands:
    - python -c "import package_name; package_name.tests.runall()"

Alternatively, you can add a file called run_test.py in the recipe that will be run at test time. This allows an arbitrarily complicated test script.

pytest tests

If the tests are installed with the package, pytest can find and run them for you with the following command:

test:
  requires:
    - pytest
  commands:
    - pytest --pyargs package_name

Command Line Utilities

If a python package installs command line utilities, you probably want to test that they were properly installed:

test:
  commands:
    - util_1 --help

If the utility actually has a test mode, great. Otherwise, simply invoking --help or --version will at least test that it is installed and can run.

Tests outside of the package

Note that conda-build runs the tests in an isolated environment after installing the package; thus, at this point it does not have access to the original source tarball. This ensures that the test environment is as close as possible to what an end user will see.

This makes it very hard to run tests that are not installed with the package.

Running tests locally for staged recipes

If you want to build and test packages from the staged-recipes repository locally, go to the root directory of the repository and run the .circleci/run_docker_build.sh script.

This requires that you have Docker installed on your machine.

$ cd staged-recipes
$ ./.circleci/run_docker_build.sh

About

Packaging the license manually

Sometimes upstream maintainers do not include a license file in their tarball even though the license requires one.

If this is the case, you can add the license to the recipe directory (here named LICENSE.txt) and reference it inside the meta.yaml:

about:
  license_file: LICENSE.txt

In this case, please also notify the upstream developers that the license file is missing.

Important

The license should only be shipped along with the recipe if there is no license file in the downloaded archive. If there is a license file in the archive, please set license_file to the path of the license file in the archive.
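
For example, if the downloaded archive ships its license in a docs subdirectory (a hypothetical layout), point license_file at that path:

about:
  license_file: docs/LICENSE.txt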

Miscellaneous

Activate scripts

Recipes are allowed to have activate scripts, which will be sourced or called when the environment is activated. It is generally recommended to avoid using activate scripts when another option is possible because people do not always activate environments the expected way and these packages may then misbehave.

When using them in a recipe, feel free to name them activate.bat, activate.sh, deactivate.bat, and deactivate.sh in the recipe. It is recommended to prefix the installed scripts with the package name and a separator (the sample code below uses ${PKG_NAME}_). Below is some sample code for Unix and Windows that will make this install process easier; please feel free to lift it.

In build.sh:

# Copy the [de]activate scripts to $PREFIX/etc/conda/[de]activate.d.
# This will allow them to be run on environment activation.
for CHANGE in "activate" "deactivate"
do
    mkdir -p "${PREFIX}/etc/conda/${CHANGE}.d"
    cp "${RECIPE_DIR}/${CHANGE}.sh" "${PREFIX}/etc/conda/${CHANGE}.d/${PKG_NAME}_${CHANGE}.sh"
done

In build.bat:

setlocal EnableDelayedExpansion

:: Copy the [de]activate scripts to %PREFIX%\etc\conda\[de]activate.d.
:: This will allow them to be run on environment activation.
for %%F in (activate deactivate) DO (
    if not exist %PREFIX%\etc\conda\%%F.d mkdir %PREFIX%\etc\conda\%%F.d
    copy %RECIPE_DIR%\%%F.bat %PREFIX%\etc\conda\%%F.d\%PKG_NAME%_%%F.bat
)

Jinja templating

The recipe meta.yaml can contain expressions that are evaluated at build time. These expressions are written in Jinja syntax.

Jinja expressions serve the following purposes in meta.yaml:

  • They allow defining variables to avoid code duplication. Using a variable for the version, for example, means the version only needs to be changed in one place with every update.

    {% set version = "3.7.3" %}
     [...]
    
    package:
      name: python
      version: {{ version }}
    
    source:
      url: https://www.python.org/ftp/python/{{ version }}/Python-{{ version }}.tar.xz
      sha256: da60b54064d4cfcd9c26576f6df2690e62085123826cff2e667e72a91952d318
    
  • They can call conda-build functions for automatic code generation. Examples are the compilers, cdt packages or the pin_compatible function.

    requirements:
      build:
        - {{ compiler('c') }}
        - {{ compiler('cxx') }}
        - {{ cdt('xorg-x11-proto-devel') }}  # [linux]
        - {{ cdt('libx11-devel') }}          # [linux]
    

    or

    requirements:
      build:
        - {{ compiler('c') }}
        - {{ compiler('cxx') }}
      host:
        - python
        - numpy
      run:
        - python
        - {{ pin_compatible('numpy') }}
    

For more information please refer to the Templating with Jinja section in the conda-build docs.