Merge pull request #140 from datalad/gh-139
Convert .travis.yml to a GitHub Actions workflow
yarikoptic authored May 20, 2024
2 parents 68a224d + 32b3bd9 commit e4d1050
Showing 3 changed files with 105 additions and 81 deletions.
104 changes: 104 additions & 0 deletions .github/workflows/test.yml
@@ -0,0 +1,104 @@
name: Test

on:
  pull_request:
  push:

concurrency:
  group: ${{ github.workflow }}-${{ github.event_name }}-${{ github.ref_name }}
  cancel-in-progress: true

jobs:
  test:
    env:
      # will be used in the matrix, where neither other variable is used
      BOTO_CONFIG: /tmp/nowhere
      DATALAD_TESTS_SSH: "1"
      DATALAD_LOG_CMD_ENV: GIT_SSH_COMMAND
      TESTS_TO_PERFORM: datalad_crawler
      PYTEST_OPTS: -s
      #PYTEST_SELECTION_OP: "not " # so it would be "not (integration or usecase)"
      # Special settings/helper for combined coverage from special remotes execution
      COVERAGE: coverage
      DATALAD_DATASETS_TOPURL: http://datasets-tests.datalad.org
    runs-on: ubuntu-latest
    strategy:
      fail-fast: false
      matrix:
        python-version:
          - '3.7'
          - '3.8'
          - '3.9'
          - '3.10'
          - '3.11'
    steps:
      - name: Check out repository
        uses: actions/checkout@v4

      - name: Set up Python
        uses: actions/setup-python@v5
        with:
          python-version: ${{ matrix.python-version }}

      # Just in case we need to check if nfs is there etc
      - run: sudo lsmod

      - name: Install git-annex
        run: |
          # The ultimate one-liner setup for NeuroDebian repository
          bash <(wget -q -O- http://neuro.debian.net/_files/neurodebian-travis.sh)
          sudo apt-get update -qq
          sudo apt-get install eatmydata # to speedup some installations
          # install git-annex with the relevant bits
          # no recommends to avoid inheriting the entire multimedia stack
          sudo eatmydata apt-get install --no-install-recommends git-annex-standalone aria2 git-remote-gcrypt lsof gnupg nocache p7zip-full
          sudo eatmydata apt-get install zip pandoc
      - name: Install Python dependencies
        run: |
          # upgrade of importlib-metadata due to https://github.com/pypa/setuptools/issues/3293
          python -m pip install --upgrade pip importlib-metadata
          python -m pip install build
          python -m pip install codecov
          pip install -r requirements-devel.txt
          # To heal bloody scrapy -- yoh guesses that pip isn't good enough to
          # guarantee specified versioning in the depends
          pip install --upgrade attrs
          #pip install 'sphinx>=1.6.2'
      - name: Configure Git
        run: |
          git config --global user.email "test@github.land"
          git config --global user.name "GitHub Almighty"
      - name: Verify that building doesn't puke
        run: python3 -m build

      - name: Test installation system-wide
        run: sudo pip install .

      - name: Run tests
        run: |
          # -m "$PYTEST_SELECTION_OP(integration or usecase or slow)"
          http_proxy=
          PATH="$PWD/tools/coverage-bin:$PATH"
          pytest $PYTEST_OPTS \
            -v \
            --cov datalad_crawler \
            --cov-report xml \
            --log-cli-level=INFO \
            --pyargs \
            $TESTS_TO_PERFORM
          #if [ ! "${DATALAD_LOG_LEVEL:-}" = 2 ]; then
          # PYTHONPATH=$PWD make -C docs html doctest
          #fi
      - name: Report WTF information using system-wide installed version
        run: datalad wtf

      - name: Upload coverage to Codecov
        uses: codecov/codecov-action@v4
        with:
          fail_ci_if_error: false
          token: ${{ secrets.CODECOV_TOKEN }}
          name: ${{ matrix.python-version }}
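
For orientation, the "Run tests" step above amounts to roughly the following local invocation. This is only a sketch under assumptions not stated in the commit: git-annex is already installed, the current directory is a datalad-crawler checkout, and the CI-only coverage wrapper in tools/coverage-bin is skipped.

    # Rough local equivalent of the "Run tests" step above (a sketch, not part of this commit).
    # Assumes git-annex is installed and the current directory is a datalad-crawler checkout;
    # CI additionally exports DATALAD_TESTS_SSH=1 and prepends tools/coverage-bin to PATH.
    python -m pip install -r requirements-devel.txt
    python -m pip install .
    # Mirror the pytest options used in CI; TESTS_TO_PERFORM is datalad_crawler there.
    pytest -s -v --cov datalad_crawler --cov-report xml --log-cli-level=INFO \
        --pyargs datalad_crawler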
80 changes: 0 additions & 80 deletions .travis.yml

This file was deleted.
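
With .travis.yml gone, the workflow above is the only CI entry point. As a hedged aside (nothing below is added by this commit), it can be dry-run locally with the third-party nektos/act CLI, assuming Docker is available:

    # Optional local dry-run of the new workflow using nektos/act (requires Docker).
    # act and these flags are an assumption of this note, not part of the commit.
    act pull_request -W .github/workflows/test.yml -j test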

2 changes: 1 addition & 1 deletion README.md
@@ -5,7 +5,7 @@
|____/ \__,_| \__| \__,_||_____| \__,_| \__,_|
Crawler

[![Travis tests status](https://secure.travis-ci.org/datalad/datalad-crawler.png?branch=master)](https://travis-ci.org/datalad/datalad-crawler) [![codecov.io](https://codecov.io/github/datalad/datalad-crawler/coverage.svg?branch=master)](https://codecov.io/github/datalad/datalad-crawler?branch=master) [![Documentation](https://readthedocs.org/projects/datalad-crawler/badge/?version=latest)](http://datalad-crawler.rtfd.org) [![License: MIT](https://img.shields.io/badge/License-MIT-yellow.svg)](https://opensource.org/licenses/MIT) [![GitHub release](https://img.shields.io/github/release/datalad/datalad-crawler.svg)](https://GitHub.com/datalad/datalad-crawler/releases/) [![PyPI version fury.io](https://badge.fury.io/py/datalad-crawler.svg)](https://pypi.python.org/pypi/datalad-crawler/) [![Average time to resolve an issue](http://isitmaintained.com/badge/resolution/datalad/datalad-crawler.svg)](http://isitmaintained.com/project/datalad/datalad-crawler "Average time to resolve an issue") [![Percentage of issues still open](http://isitmaintained.com/badge/open/datalad/datalad-crawler.svg)](http://isitmaintained.com/project/datalad/datalad-crawler "Percentage of issues still open")
[![Test Status](https://github.com/datalad/datalad-crawler/actions/workflows/test.yml/badge.svg)](https://github.com/datalad/datalad-crawler/actions/workflows/test.yml) [![codecov.io](https://codecov.io/github/datalad/datalad-crawler/coverage.svg?branch=master)](https://codecov.io/github/datalad/datalad-crawler?branch=master) [![Documentation](https://readthedocs.org/projects/datalad-crawler/badge/?version=latest)](http://datalad-crawler.rtfd.org) [![License: MIT](https://img.shields.io/badge/License-MIT-yellow.svg)](https://opensource.org/licenses/MIT) [![GitHub release](https://img.shields.io/github/release/datalad/datalad-crawler.svg)](https://GitHub.com/datalad/datalad-crawler/releases/) [![PyPI version fury.io](https://badge.fury.io/py/datalad-crawler.svg)](https://pypi.python.org/pypi/datalad-crawler/) [![Average time to resolve an issue](http://isitmaintained.com/badge/resolution/datalad/datalad-crawler.svg)](http://isitmaintained.com/project/datalad/datalad-crawler "Average time to resolve an issue") [![Percentage of issues still open](http://isitmaintained.com/badge/open/datalad/datalad-crawler.svg)](http://isitmaintained.com/project/datalad/datalad-crawler "Percentage of issues still open")

This extension enhances DataLad (http://datalad.org) for crawling external
web resources into an automated data distribution. Please see the [extension