Merge pull request #1660 from microsoft/staging
Staging to main (SARplus, SASrec, NCF, RBM etc.)
anargyri authored Mar 16, 2022
2 parents 6003323 + 17204b1 commit c4435a9
Showing 89 changed files with 5,586 additions and 2,145 deletions.
11 changes: 10 additions & 1 deletion .github/workflows/actions/merge-cov/action.yml
@@ -33,9 +33,18 @@ runs:
# different parallelized runs
- name: Merge coverage reports
shell: bash
+# NOTE: Merging coverage reports generated from self-hosted (aml) runners can be problematic,
+# as reports may reference the source code at an unresolvable absolute path.
+# For example, you may encounter errors like:
+# "CoverageWarning: Couldn't parse
+# '/home/azureuser/runner/work/recommenders/recommenders/recommenders/evaluation/__init__.py':
+# No source for code"
+# Workaround: create a symlink at the root that points to the default local runner folder: '/home/runner/work/recommenders'
run: |
+sudo mkdir -p /home/azureuser
+sudo ln -s /home/runner /home/azureuser/runner
python -m coverage combine .coverage*
-python -m coverage report
+python -m coverage report -i
python -m coverage xml -i
- name: Show merged report
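Outside of CI, the symlink workaround above can be sketched in a scratch directory; every path below is a stand-in for the real `/home/azureuser` and `/home/runner` locations:

```shell
# Simulate the absolute-path mismatch in a scratch directory instead of /home.
set -e
root=$(mktemp -d)
mkdir -p "$root/runner/work/recommenders"
echo "print('hello')" > "$root/runner/work/recommenders/module.py"

# Reports from the self-hosted runner reference $root/azureuser/runner/...,
# which does not exist locally until the symlink is created.
mkdir -p "$root/azureuser"
ln -s "$root/runner" "$root/azureuser/runner"

# The recorded absolute path now resolves through the symlink.
cat "$root/azureuser/runner/work/recommenders/module.py"
```

Once the link exists, any tool that recorded the self-hosted prefix can read the sources through it; `coverage report -i` then only has to tolerate files that genuinely never existed.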
1 change: 1 addition & 0 deletions .github/workflows/nightly.yml
@@ -184,6 +184,7 @@ jobs:
build-gpu:
runs-on: [self-hosted, Linux, gpu, nightly] # all of these labels must match to select the specific self-hosted machine
needs: static-analysis
+timeout-minutes: 420
strategy:
matrix:
python: [3.7]
7 changes: 5 additions & 2 deletions .github/workflows/pr-gate.yml
@@ -129,7 +129,7 @@ jobs:
strategy:
matrix:
os: [ubuntu-latest]
-java: [11]
+java: [8]
python: [3.7]
# different kinds of tests are located in tests/<unit|integration|smoke> folders
test-kind: ['unit']
@@ -176,7 +176,10 @@ jobs:
# different kinds of tests are located in tests/<unit|integration|smoke> folders
test-kind: ['unit']
# pytest markers configured in tox.ini. See https://docs.pytest.org/en/6.2.x/example/markers.html
-test-marker: ['gpu and notebooks and not spark and not experimental', 'gpu and not notebooks and not spark and not experimental']
+test-marker: [
+  'gpu and not notebooks and not spark and not experimental',
+  'gpu and notebooks and not spark and not experimental'
+]

steps:
- uses: actions/checkout@v2
31 changes: 22 additions & 9 deletions .github/workflows/sarplus.yml
@@ -19,6 +19,7 @@ on:
- .github/workflows/sarplus.yml

env:
+SARPLUS_ROOT: ${{ github.workspace }}/contrib/sarplus
PYTHON_ROOT: ${{ github.workspace }}/contrib/sarplus/python
SCALA_ROOT: ${{ github.workspace }}/contrib/sarplus/scala

@@ -52,15 +52,20 @@ jobs:
- name: Package and check
run: |
# build
cd "${PYTHON_ROOT}"
-cp ../VERSION ./pysarplus/
+cp "${SARPLUS_ROOT}/VERSION" ./pysarplus/VERSION
python -m build --sdist
python -m twine check dist/*
+# set sarplus_version
+SARPLUS_VERSION=$(cat "${SARPLUS_ROOT}/VERSION")
+echo "sarplus_version=${SARPLUS_VERSION}" >> $GITHUB_ENV
- name: Test
run: |
cd "${PYTHON_ROOT}"
-python -m pip install dist/*.gz
+python -m pip install dist/*.tar.gz
cd "${SCALA_ROOT}"
export SPARK_VERSION=$(python -m pip show pyspark | grep -i version | cut -d ' ' -f 2)
@@ -75,14 +81,13 @@
cd "${PYTHON_ROOT}"
pytest ./tests
-echo "sarplus_version=$(cat ../VERSION)" >> $GITHUB_ENV
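The Test step above derives `SPARK_VERSION` by parsing `pip show pyspark` output. A self-contained sketch of that `grep | cut` pipeline, using a hypothetical stand-in for the `pip show` output:

```shell
set -e

# Hypothetical `pip show pyspark` output; only the Version line matters here.
pip_show_output="Name: pyspark
Version: 3.1.2
Summary: Apache Spark Python API"

# Same pipeline as the workflow: match the Version line (case-insensitively),
# then take the second space-separated field.
SPARK_VERSION=$(echo "$pip_show_output" | grep -i version | cut -d ' ' -f 2)
echo "$SPARK_VERSION"   # prints 3.1.2
```

Note that `cut -d ' ' -f 2` relies on `pip show` printing exactly one space after the colon, which is why the pipeline is kept in one place rather than repeated per step.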
- name: Upload Python package as GitHub artifact
if: github.ref == 'refs/heads/main' && matrix.python-version == '3.10'
uses: actions/upload-artifact@v2
with:
name: pysarplus-${{ env.sarplus_version }}
-path: ${{ env.PYTHON_ROOT }}/dist/*.gz
+path: ${{ env.PYTHON_ROOT }}/dist/*.tar.gz

scala-test:
# Test sarplus with different versions of Databricks runtime, 2 LTSs and 1
@@ -129,6 +134,8 @@ jobs:
env:
GPG_KEY: ${{ secrets.SARPLUS_GPG_PRI_KEY_ASC }}
run: |
+SARPLUS_VERSION=$(cat "${SARPLUS_ROOT}/VERSION")
# generate artifacts
cd "${SCALA_ROOT}"
export SPARK_VERSION="3.1.2"
@@ -142,18 +149,24 @@
export HADOOP_VERSION="3.3.1"
export SCALA_VERSION="2.12.14"
sbt ++${SCALA_VERSION}! package
sbt ++${SCALA_VERSION}! packageDoc
sbt ++${SCALA_VERSION}! packageSrc
sbt ++${SCALA_VERSION}! makePom
# sign with GPG
-cd target/scala-2.12
+cd "${SCALA_ROOT}/target/scala-2.12"
gpg --import <(cat <<< "${GPG_KEY}")
for file in {*.jar,*.pom}; do gpg -ab "${file}"; done
# bundle
-jar cvf sarplus-bundle_2.12-$(cat ../VERSION).jar *.jar *.pom *.asc
-echo "sarplus_version=$(cat ../VERSION)" >> $GITHUB_ENV
+jar cvf sarplus-bundle_2.12-${SARPLUS_VERSION}.jar sarplus_*.jar sarplus_*.pom sarplus_*.asc
+jar cvf sarplus-spark-3.2-plus-bundle_2.12-${SARPLUS_VERSION}.jar sarplus-spark*.jar sarplus-spark*.pom sarplus-spark*.asc
+# set sarplus_version
+echo "sarplus_version=${SARPLUS_VERSION}" >> $GITHUB_ENV
- name: Upload Scala bundle as GitHub artifact
uses: actions/upload-artifact@v2
with:
name: sarplus-bundle_2.12-${{ env.sarplus_version }}
-path: ${{ env.SCALA_ROOT }}/target/scala-2.12/sarplus-bundle_2.12-${{ env.sarplus_version }}.jar
+path: ${{ env.SCALA_ROOT }}/target/scala-2.12/*bundle*.jar
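Several steps in these workflows use the `echo "key=value" >> $GITHUB_ENV` idiom to hand a value (here `sarplus_version`) to later steps. A minimal simulation of that mechanism; the version string and the temp file are stand-ins, not the real runner machinery:

```shell
set -e

# Stand-in for the file GitHub Actions exposes via $GITHUB_ENV.
GITHUB_ENV=$(mktemp)

# Step 1: persist a value for later steps (version string hypothetical).
SARPLUS_VERSION="0.6.0"
echo "sarplus_version=${SARPLUS_VERSION}" >> "$GITHUB_ENV"

# Between steps the runner loads each key=value line into the environment;
# simulate that load here.
while IFS='=' read -r key value; do
  export "$key=$value"
done < "$GITHUB_ENV"

# Step 2: the value is now an ordinary env var, reachable as
# ${{ env.sarplus_version }} in workflow expressions or "$sarplus_version" in shell.
echo "$sarplus_version"
```

This is why the refactor above computes `SARPLUS_VERSION` once per job and appends it to `$GITHUB_ENV`, instead of re-reading `../VERSION` from whatever directory each step happens to be in.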
3 changes: 3 additions & 0 deletions .gitignore
@@ -175,3 +175,6 @@ examples/07_tutorials/KDD2020-tutorial/data_folder/

tests/**/resources/
reports/

+### pip folders
+pip-wheel*
5 changes: 5 additions & 0 deletions AUTHORS.md
@@ -41,6 +41,9 @@ To contributors: please add your name to the list when you submit a patch to the
* **[Aaron He](https://github.com/AaronHeee)**
* Reco utils of NCF
* Deep dive notebook demonstrating the use of NCF
+* **[Abir Chakraborty](https://github.com/aeroabir)**
+  * Self-Attentive Sequential Recommendation (SASRec)
+  * Sequential Recommendation Via Personalized Transformer (SSEPT)
* **[Alexandros Ioannou](https://github.com/aioannou96)**
* Standard VAE algorithm
* Multinomial VAE algorithm
@@ -90,6 +93,8 @@ To contributors: please add your name to the list when you submit a patch to the
* Improving documentation
* Quick start notebook
* Operationalization notebook
+* **[Nile Wilson](https://github.com/niwilso)**
+  * Term Frequency - Inverse Document Frequency (TF-IDF) quickstart, utils
* **[Pratik Jawanpuria](https://github.com/pratikjawanpuria)**
* RLRMC algorithm
* GeoIMC algorithm