Here we maintain a list of Python packages, mostly cosmology related, affiliated with or used by the Berkeley Center for Cosmological Physics (BCCP). To install a conda package maintained in this repository, run:
```
conda install -c bccp package_name
```
The packages maintained here are:
- abopt: ABstract OPTimizer: optimization of generic numerical models
- bigfile: A reproducible massively parallel IO library for hierarchical data
- cachey: Caching based on computation time and storage space
- classylss: A Python binding of CLASS for large-scale structure calculations
- Corrfunc: Blazing fast correlation functions on the CPU
- fastpm-python: quasi N-body simulations using the FastPM scheme in Python
- fitsio: A python package for FITS input/output wrapping cfitsio
- halotools: Python package for studying large scale structure, cosmology, and galaxy evolution using N-body simulations and halo models
- kdcount: A KDTree pair counter
- mpsort: A Python binding of MP-sort, a peta-scale sorting routine
- mcfit: multiplicatively convolutional fast integral transforms in Python
- nbodykit: Analysis kit for large-scale structure datasets, the massively parallel way
- pfft-python: A Python binding of PFFT, a massively parallel FFT library
- pmesh: Particle Mesh in Python
- runtests: A test runner for pytest-based Python projects, with optional support for variable MPI sizes
The Anaconda channel of BCCP can be found at: http://anaconda.org/bccp/
We use the cross-compilation toolchain introduced in Anaconda 5.0 to build the packages on Linux and OSX.
BCCP used to ship its own builds of mpich and mpi4py that allowed packages to compile properly with the crosstool-ng cross-compilation toolchain provided by conda. This has not been an issue since at least Mar 15, 2020, so we have switched to the mpich and mpi4py packages on the default channel.
All packages must be listed in build-order before they can be built. This is our poor man's dependency resolution: we simply loop over the packages in the listed order, so each package is built after the packages it depends on.
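The loop above can be sketched as follows. Only the file name build-order comes from this repo; the example package names and the `echo` stand-in for the actual `conda build` invocation are illustrative:

```shell
# Illustrative sketch of the build-order loop.
# Create an example build-order file (in the real repo it is checked in).
printf 'bigfile\nkdcount\nnbodykit\n' > build-order

# Build each package in the listed order, so dependencies come first.
while read -r pkg; do
    echo "building $pkg"      # stand-in for the real conda-build invocation
done < build-order
```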
All packages must also be listed in requirements.yaml, except those hard-coded in the platform directory. A Python script, extrude_recipes.py, finds the latest version of each package on PyPI and generates a correctly versioned recipe in the recipes directory.
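As a rough illustration only (the actual schema of requirements.yaml may differ from this sketch), the file lists the packages the script should process:

```yaml
# Hypothetical sketch -- the real requirements.yaml schema may differ.
# extrude_recipes.py reads the listed packages, looks up the latest
# release of each on PyPI, and emits a pinned recipe under recipes/.
packages:
  - nbodykit
  - kdcount
```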
We use a travis-ci matrix to select the Python version for conda-build. Splitting the builds this way shrinks each job enough to finish within the Travis time limit.
We sometimes need to add LDFLAGS to make the gfortran compiler on OSX happy; see ContinuumIO/anaconda-issues#739 (comment). Currently this still applies to the cross-compilation toolchain. Hopefully this will be fixed soon.
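A minimal sketch of such a workaround in a recipe's build script. The exact flags are an assumption, not taken from this repo; conda-build normally exports `PREFIX` itself:

```shell
# Sketch: point gfortran's link step at the conda build prefix on OSX.
PREFIX="${PREFIX:-/usr/local}"              # conda-build sets PREFIX in real builds
export LDFLAGS="-L${PREFIX}/lib ${LDFLAGS:-}"
echo "$LDFLAGS"
```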
There are also scripts to set up the environment for working with python-mpi-bcast on NERSC computers. The environments can be rebuilt nightly with a cron job.
On a system with a really old gdb that cannot interpret the debugging info generated by newer GCC, add -gdwarf-2 to make sure the debugging info uses an old enough format.
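A sketch of where that flag would go (the choice of environment variables is an assumption; a recipe might set them elsewhere):

```shell
# Sketch: force DWARF version 2 debug info so an old gdb can read it.
export CFLAGS="-gdwarf-2 ${CFLAGS:-}"
export FFLAGS="-gdwarf-2 ${FFLAGS:-}"
echo "$CFLAGS"
```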