How do I speed up package installation?

Speedup option 1: use mamba

mamba is a drop-in replacement for conda that uses a faster dependency-solving library and reimplements parts of conda in C++ for speed. Install it into the base environment so that it's always available, like this:

conda install mamba -n base -c conda-forge

Then use mamba instead of conda.

For example, instead of conda install, use mamba install. Instead of conda env create use mamba env create, and so on. mamba also uses the same configuration as conda, so you don’t need to reconfigure the channels.


Installing mamba into the base environment (-n base in the command above) means that it does not need to be installed into each subsequent environment you create.

Speedup option 2: use environments strategically

Here are several ways you can use environments to minimize the time spent solving dependencies, which is typically what takes the longest:

  1. Keep the base environment small.

    If you install everything into the same environment (e.g. the base environment, which is used whenever you don't otherwise specify an environment), then every time you add or update packages, the solver has to do a lot of work to verify that all of the many packages are mutually compatible.

  2. Use smaller environments.

    Fewer packages means less work for the solver. Try to use environments only containing what you need for a particular project or task.

  3. Pin dependencies.

    Sometimes pinning dependencies to a specific version can speed up solving, since it reduces the solver's search space. This can backfire in some cases, though: for example, you can't pin an older version of R and also use newer R packages that don't support that version of R.

  4. Create an environment from a file with all dependencies.

    Creating an environment with all dependencies at once can be faster than incrementally adding packages to an existing environment. For example, conda create -n myenv --file requirements.txt, or conda env create --file env.yaml.

  5. Use strict channel priority.

    Ensure that you've run conda config --set channel_priority strict so that the configured channel order is respected. This can also speed up solving.
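The strategies above can be combined in a short shell session like the following. Everything here is illustrative: the environment name, the packages, and the pinned version are assumptions for the example, not recommendations.

```shell
# Write a small, self-contained environment spec (illustrative packages;
# pinning samtools narrows the solver's search space).
cat > env.yaml <<'EOF'
name: myproject
channels:
  - conda-forge
  - bioconda
dependencies:
  - samtools=1.15
  - bwa
EOF

# Respect the configured channel order, then solve the whole environment
# in one step rather than adding packages incrementally. Guarded so the
# sketch is harmless on machines without conda installed.
if command -v conda >/dev/null 2>&1; then
    conda config --set channel_priority strict
    conda env create --file env.yaml || true   # may be slow or offline in some setups
fi
```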

What versions are supported?

Operating Systems

Bioconda only supports 64-bit Linux and macOS. ARM is not currently supported.


Python

Bioconda only supports Python 2.7, 3.6, 3.7, 3.8, and 3.9.

The exception to this is Bioconda packages that declare noarch: python and depend only on other such packages - these can be installed into an environment with any version of Python they say they support. However, many Python packages in Bioconda depend on other Bioconda packages with architecture-specific builds, such as pysam, and so do not meet this criterion.

Pinned packages

Some packages require ABI compatibility with underlying libraries. To ensure that packages can work together, there are some libraries that need to be pinned, or fixed to a particular version. Other packages are then built with that specific version (and therefore that specific ABI) to ensure they can all work together.

The authoritative source for which packages are pinned and to which versions can be found in the bioconda_utils-conda_build_config.yaml file.

This is in addition to the versions specified by conda-forge in its conda_build_config.yaml, which pins versions of base dependencies like boost, zlib, and many others.
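For illustration only, entries in such a pinning file follow the conda-build variant-config format. The package names and versions below are invented for the example; the files named above remain the authoritative source.

```yaml
# Hypothetical excerpt in the conda_build_config.yaml pinning format.
# Names and versions here are placeholders, not real pins.
htslib:
  - "1.15"
libdeflate:
  - "1.10"
```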

Unsupported versions

If there is a version of a dependency you wish to build against that Bioconda does not currently support, please reach out to the Bioconda Gitter to find out whether supporting that version is feasible, whether work on it is already underway, and how you can help.

To find out which version you can pin a package against (e.g. x.y.* or x.*), please use ABI-Laboratory.

How do I keep track of environments?

You can view your created environments with conda env list.

Note that if keeping track of different environment names becomes a burden, you can create an environment in the same directory as a project with the -p argument, e.g.,

conda create -p ./env --file requirements.txt

and then activate the environment with

conda activate ./env

This also works quite well in a shared directory so everyone can use (and maintain) the same environment.

What’s the difference between Anaconda, conda, Miniconda, and mamba?

  • conda is the name of the package manager, which is what runs when you call, e.g., conda install.

  • mamba is a drop-in replacement for conda (see above for details)

  • Anaconda is a large installation including Python, conda, and a large number of packages.

  • Miniconda just has conda and its dependencies (in contrast to the larger Anaconda distribution).

The Anaconda Python distribution started out as a bundle of scientific Python packages that were otherwise difficult to install. It was created by ContinuumIO and remains the easiest way to install the full scientific Python stack.

Many packaging problems had to be solved in order to provide all of that software in Anaconda in a cross-platform bundle, and one of the tools that came out of that work was the conda package manager. So conda is part of the Anaconda Python distribution. But conda ended up being very useful on its own and for things other than Python, so ContinuumIO spun it out into its own separate open-source package.

Conda became very useful for setting up lightweight environments for testing code or running individual steps of a workflow. To avoid needing to install the entire Anaconda distribution each time, the Miniconda installer was created. This installs only what you need to run conda itself, which can then be used to create other environments. So the “mini” in Miniconda means that it’s a fraction of the size of the full Anaconda installation.

So: conda is a package manager, Miniconda is a minimal installer that provides just conda, and Anaconda is a scientific Python distribution that also includes conda.

What’s the difference between a recipe and a package?

A recipe is a directory containing a small set of files that define the package's name, version, dependencies, and the URL of its source code. A recipe typically contains a meta.yaml file that defines these settings and a build.sh script that builds the software.
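A minimal sketch of what such a meta.yaml might look like is below; the package name, version, URL, and checksum are all placeholders invented for illustration, not a real package.

```yaml
# meta.yaml -- hypothetical recipe skeleton; every value is a placeholder.
package:
  name: mytool
  version: "1.0.0"

source:
  url: https://example.com/mytool-1.0.0.tar.gz
  sha256: 0000000000000000000000000000000000000000000000000000000000000000

requirements:
  build:
    - {{ compiler('c') }}
  run:
    - zlib

test:
  commands:
    - mytool --version
```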

A recipe is converted into a package by running conda-build on the recipe. A package is a bzip2-compressed tar file (.tar.bz2) that contains the built software in the expected subdirectories, along with a list of what other packages are dependencies. For example, a conda package built for a Python package would end up with .py files in the lib/python3.8/site-packages/ directory inside the tarball, and would specify (at least) Python as a dependency.

Packages are uploaded to anaconda.org so that users can install them with conda install.

See also

The Conda package specification has details on exactly what a package contains and how it is installed into an environment.