1.1 update version-bump GHA (#562)
* update version-bump GHA

* align dev-requirements file naming w core

* fix dev-requirements name

* Bumping version to 1.1.1rc0 and generate CHANGELOG (#568)

Co-authored-by: Github Build Bot <buildbot@fishtownanalytics.com>

* Bumping version to 1.1.1 and generate CHANGELOG (#571)

Co-authored-by: Github Build Bot <buildbot@fishtownanalytics.com>

* skip databricks integ tests

* fix dev-requirements naming

* fix circle CI config

Co-authored-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>
Co-authored-by: Github Build Bot <buildbot@fishtownanalytics.com>
3 people authored Dec 23, 2022
1 parent 6bf9d97 commit f0a0110
Showing 15 changed files with 94 additions and 172 deletions.
2 changes: 1 addition & 1 deletion .bumpversion.cfg
@@ -1,5 +1,5 @@
[bumpversion]
- current_version = 1.1.0
+ current_version = 1.1.1
parse = (?P<major>\d+)
\.(?P<minor>\d+)
\.(?P<patch>\d+)
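As an aside, the `parse` pattern above is an ordinary regex with named groups. A minimal Python sketch of how it splits a version string (mirroring only the `major.minor.patch` portion visible here; the full config may also parse prerelease segments not shown):

```python
import re

# Mirrors the (truncated) bumpversion `parse` pattern shown above;
# the real config may also handle prerelease suffixes not visible here.
VERSION_PATTERN = re.compile(r"(?P<major>\d+)\.(?P<minor>\d+)\.(?P<patch>\d+)")

match = VERSION_PATTERN.match("1.1.1")
assert match is not None
parts = match.groupdict()  # {'major': '1', 'minor': '1', 'patch': '1'}
```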
11 changes: 11 additions & 0 deletions .changes/1.1.1.md
@@ -0,0 +1,11 @@
## dbt-spark 1.1.1 - December 20, 2022
### Features
- backport changie to 1.1.latest ([#417](https://github.com/dbt-labs/dbt-spark/issues/417), [#425](https://github.com/dbt-labs/dbt-spark/pull/425))
### Fixes
- Support new error messages in the future Spark. ([#515](https://github.com/dbt-labs/dbt-spark/issues/515), [#520](https://github.com/dbt-labs/dbt-spark/pull/520))
### Under the Hood
- updating python version in tox ([#536](https://github.com/dbt-labs/dbt-spark/issues/536), [#534](https://github.com/dbt-labs/dbt-spark/pull/534))
- fix post new release tox issues around passenv and allowlist_externals ([#546](https://github.com/dbt-labs/dbt-spark/issues/546), [#546](https://github.com/dbt-labs/dbt-spark/pull/546))

### Contributors
- [@ueshin](https://github.com/ueshin) ([#520](https://github.com/dbt-labs/dbt-spark/pull/520))
7 changes: 0 additions & 7 deletions .changes/unreleased/Features-20220810-131800.yaml

This file was deleted.

7 changes: 0 additions & 7 deletions .changes/unreleased/Fixes-20221116-234601.yaml

This file was deleted.

7 changes: 0 additions & 7 deletions .changes/unreleased/Under the Hood-20221202-140724.yaml

This file was deleted.

7 changes: 0 additions & 7 deletions .changes/unreleased/Under the Hood-20221209-143611.yaml

This file was deleted.

42 changes: 24 additions & 18 deletions .circleci/config.yml
@@ -80,21 +80,27 @@ jobs:
environment:
DBT_INVOCATION_ENV: circle
DBT_DATABRICKS_RETRY_ALL: True
DBT_TEST_USER_1: "buildbot+dbt_test_user_1@dbtlabs.com"
DBT_TEST_USER_2: "buildbot+dbt_test_user_2@dbtlabs.com"
DBT_TEST_USER_3: "buildbot+dbt_test_user_3@dbtlabs.com"
docker:
- image: fishtownanalytics/test-container:10
steps:
- checkout
- run:
name: Run integration tests
command: tox -e integration-spark-databricks-http
no_output_timeout: 1h
- store_artifacts:
path: ./logs
# - run:
# name: Run integration tests
# command: tox -e integration-spark-databricks-http
# no_output_timeout: 1h
# - store_artifacts:
# path: ./logs

integration-spark-databricks-odbc-cluster: &databricks-odbc
environment:
DBT_INVOCATION_ENV: circle
ODBC_DRIVER: Simba # TODO: move env var to Docker image
DBT_TEST_USER_1: "buildbot+dbt_test_user_1@dbtlabs.com"
DBT_TEST_USER_2: "buildbot+dbt_test_user_2@dbtlabs.com"
DBT_TEST_USER_3: "buildbot+dbt_test_user_3@dbtlabs.com"
docker:
# image based on `fishtownanalytics/test-container` w/ Simba ODBC Spark driver installed
- image: 828731156495.dkr.ecr.us-east-1.amazonaws.com/dbt-spark-odbc-test-container:latest
@@ -103,23 +109,23 @@
aws_secret_access_key: $AWS_SECRET_ACCESS_KEY_STAGING
steps:
- checkout
- run:
name: Run integration tests
command: tox -e integration-spark-databricks-odbc-cluster
no_output_timeout: 1h
- store_artifacts:
path: ./logs
# - run:
# name: Run integration tests
# command: tox -e integration-spark-databricks-odbc-cluster
# no_output_timeout: 1h
# - store_artifacts:
# path: ./logs

integration-spark-databricks-odbc-endpoint:
<<: *databricks-odbc
steps:
- checkout
- run:
name: Run integration tests
command: tox -e integration-spark-databricks-odbc-sql-endpoint
no_output_timeout: 1h
- store_artifacts:
path: ./logs
# - run:
# name: Run integration tests
# command: tox -e integration-spark-databricks-odbc-sql-endpoint
# no_output_timeout: 1h
# - store_artifacts:
# path: ./logs

workflows:
version: 2
2 changes: 1 addition & 1 deletion .github/workflows/main.yml
@@ -69,7 +69,7 @@ jobs:
python -m pip install mypy==0.942
mypy --version
python -m pip install -r requirements.txt
- python -m pip install -r dev_requirements.txt
+ python -m pip install -r dev-requirements.txt
dbt --version
unit:
47 changes: 23 additions & 24 deletions .github/workflows/release.yml
@@ -3,28 +3,28 @@ name: Build and Release

on:
workflow_dispatch:

# Release version number that must be updated for each release
env:
- version_number: '0.20.0rc2'
+ version_number: "0.20.0rc2"

- jobs:
+ jobs:
Test:
runs-on: ubuntu-latest
steps:
- name: Setup Python
uses: actions/setup-python@v2.2.2
- with:
- python-version: '3.8'
+ with:
+ python-version: "3.8"

- uses: actions/checkout@v2

- - name: Test release
+ - name: Test release
run: |
python3 -m venv env
source env/bin/activate
sudo apt-get install libsasl2-dev
- pip install -r dev_requirements.txt
+ pip install -r dev-requirements.txt
pip install twine wheel setuptools
python setup.py sdist bdist_wheel
pip install dist/dbt-spark-*.tar.gz
@@ -38,29 +38,29 @@ jobs:
steps:
- name: Setup Python
uses: actions/setup-python@v2.2.2
- with:
- python-version: '3.8'
+ with:
+ python-version: "3.8"

- uses: actions/checkout@v2

- name: Bumping version
run: |
python3 -m venv env
source env/bin/activate
sudo apt-get install libsasl2-dev
- pip install -r dev_requirements.txt
+ pip install -r dev-requirements.txt
bumpversion --config-file .bumpversion-dbt.cfg patch --new-version ${{env.version_number}}
bumpversion --config-file .bumpversion.cfg patch --new-version ${{env.version_number}} --allow-dirty
git status
- name: Commit version bump and tag
uses: EndBug/add-and-commit@v7
with:
- author_name: 'Leah Antkiewicz'
- author_email: 'leah.antkiewicz@dbtlabs.com'
- message: 'Bumping version to ${{env.version_number}}'
+ author_name: "Leah Antkiewicz"
+ author_email: "leah.antkiewicz@dbtlabs.com"
+ message: "Bumping version to ${{env.version_number}}"
tag: v${{env.version_number}}

# Need to set an output variable because env variables can't be taken as input
# This is needed for the next step with releasing to GitHub
- name: Find release type
@@ -69,7 +69,7 @@ jobs:
IS_PRERELEASE: ${{ contains(env.version_number, 'rc') || contains(env.version_number, 'b') }}
run: |
echo ::set-output name=isPrerelease::$IS_PRERELEASE
- name: Create GitHub release
uses: actions/create-release@v1
env:
@@ -88,7 +88,7 @@ jobs:
# or
$ pip install "dbt-spark[PyHive]==${{env.version_number}}"
```
PypiRelease:
name: Pypi release
runs-on: ubuntu-latest
@@ -97,13 +97,13 @@ jobs:
steps:
- name: Setup Python
uses: actions/setup-python@v2.2.2
- with:
- python-version: '3.8'
+ with:
+ python-version: "3.8"

- uses: actions/checkout@v2
with:
ref: v${{env.version_number}}

- name: Release to pypi
env:
TWINE_USERNAME: ${{ secrets.PYPI_USERNAME }}
@@ -112,8 +112,7 @@
python3 -m venv env
source env/bin/activate
sudo apt-get install libsasl2-dev
- pip install -r dev_requirements.txt
+ pip install -r dev-requirements.txt
pip install twine wheel setuptools
python setup.py sdist bdist_wheel
twine upload --non-interactive dist/dbt_spark-${{env.version_number}}-py3-none-any.whl dist/dbt-spark-${{env.version_number}}.tar.gz
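The `Find release type` step in the release workflow above decides prerelease status with the expression `contains(env.version_number, 'rc') || contains(env.version_number, 'b')`. A rough Python equivalent of that substring check (illustrative only; note it would also flag any version string that merely contains a `b` anywhere):

```python
def is_prerelease(version_number: str) -> bool:
    # Same substring logic as the workflow's IS_PRERELEASE expression:
    # contains(version_number, 'rc') || contains(version_number, 'b')
    return "rc" in version_number or "b" in version_number

assert is_prerelease("0.20.0rc2")   # release candidate
assert is_prerelease("1.3.0b1")     # beta
assert not is_prerelease("1.1.1")   # final release
```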
103 changes: 14 additions & 89 deletions .github/workflows/version-bump.yml
@@ -1,103 +1,28 @@
# **what?**
# This workflow will take a version number and a dry run flag. With that
# it will run versionbump to update the version number everywhere in the
# code base and then generate an update Docker requirements file. If this
# is a dry run, a draft PR will open with the changes. If this isn't a dry
# run, the changes will be committed to the branch this is run on.
# This workflow will take the new version number to bump to. With that
# it will run versionbump to update the version number everywhere in the
# code base and then run changie to create the corresponding changelog.
# A PR will be created with the changes that can be reviewed before committing.

# **why?**
# This is to aid in releasing dbt and making sure we have updated
# the versions and Docker requirements in all places.
# This is to aid in releasing dbt and making sure we have updated
# the version in all places and generated the changelog.

# **when?**
# This is triggered either manually OR
# from the repository_dispatch event "version-bump" which is sent from
# the dbt-release repo Action
# This is triggered manually

name: Version Bump

on:
workflow_dispatch:
inputs:
version_number:
- description: 'The version number to bump to'
+ description: 'The version number to bump to (ex. 1.2.0, 1.3.0b1)'
required: true
is_dry_run:
description: 'Creates a draft PR to allow testing instead of committing to a branch'
required: true
default: 'true'
repository_dispatch:
types: [version-bump]

jobs:
bump:
runs-on: ubuntu-latest
steps:
- name: Check out the repository
uses: actions/checkout@v2

- name: Set version and dry run values
id: variables
env:
VERSION_NUMBER: "${{ github.event.client_payload.version_number == '' && github.event.inputs.version_number || github.event.client_payload.version_number }}"
IS_DRY_RUN: "${{ github.event.client_payload.is_dry_run == '' && github.event.inputs.is_dry_run || github.event.client_payload.is_dry_run }}"
run: |
echo Repository dispatch event version: ${{ github.event.client_payload.version_number }}
echo Repository dispatch event dry run: ${{ github.event.client_payload.is_dry_run }}
echo Workflow dispatch event version: ${{ github.event.inputs.version_number }}
echo Workflow dispatch event dry run: ${{ github.event.inputs.is_dry_run }}
echo ::set-output name=VERSION_NUMBER::$VERSION_NUMBER
echo ::set-output name=IS_DRY_RUN::$IS_DRY_RUN
- uses: actions/setup-python@v2
with:
python-version: "3.8"

- name: Install python dependencies
run: |
sudo apt-get install libsasl2-dev
python3 -m venv env
source env/bin/activate
pip install --upgrade pip
- name: Create PR branch
if: ${{ steps.variables.outputs.IS_DRY_RUN == 'true' }}
run: |
git checkout -b bumping-version/${{steps.variables.outputs.VERSION_NUMBER}}_$GITHUB_RUN_ID
git push origin bumping-version/${{steps.variables.outputs.VERSION_NUMBER}}_$GITHUB_RUN_ID
git branch --set-upstream-to=origin/bumping-version/${{steps.variables.outputs.VERSION_NUMBER}}_$GITHUB_RUN_ID bumping-version/${{steps.variables.outputs.VERSION_NUMBER}}_$GITHUB_RUN_ID
- name: Bumping version
run: |
source env/bin/activate
pip install -r dev_requirements.txt
env/bin/bumpversion --allow-dirty --new-version ${{steps.variables.outputs.VERSION_NUMBER}} major
git status
- name: Commit version bump directly
uses: EndBug/add-and-commit@v7
if: ${{ steps.variables.outputs.IS_DRY_RUN == 'false' }}
with:
author_name: 'Github Build Bot'
author_email: 'buildbot@fishtownanalytics.com'
message: 'Bumping version to ${{steps.variables.outputs.VERSION_NUMBER}}'

- name: Commit version bump to branch
uses: EndBug/add-and-commit@v7
if: ${{ steps.variables.outputs.IS_DRY_RUN == 'true' }}
with:
author_name: 'Github Build Bot'
author_email: 'buildbot@fishtownanalytics.com'
message: 'Bumping version to ${{steps.variables.outputs.VERSION_NUMBER}}'
branch: 'bumping-version/${{steps.variables.outputs.VERSION_NUMBER}}_${{GITHUB.RUN_ID}}'
push: 'origin origin/bumping-version/${{steps.variables.outputs.VERSION_NUMBER}}_${{GITHUB.RUN_ID}}'

- name: Create Pull Request
uses: peter-evans/create-pull-request@v3
if: ${{ steps.variables.outputs.IS_DRY_RUN == 'true' }}
with:
author: 'Github Build Bot <buildbot@fishtownanalytics.com>'
draft: true
base: ${{github.ref}}
title: 'Bumping version to ${{steps.variables.outputs.VERSION_NUMBER}}'
branch: 'bumping-version/${{steps.variables.outputs.VERSION_NUMBER}}_${{GITHUB.RUN_ID}}'
jobs:
version_bump_and_changie:
uses: dbt-labs/actions/.github/workflows/version-bump.yml@main
with:
version_number: ${{ inputs.version_number }}
secrets: inherit # ok since what we are calling is internally maintained
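The rewritten workflow above now simply delegates to a shared reusable workflow in `dbt-labs/actions`. For context, a workflow consumed via `uses:` like this must declare a `workflow_call` trigger. A hypothetical sketch of the called workflow's interface (the actual definition lives in that repo and may differ):

```yaml
# Hypothetical interface of dbt-labs/actions/.github/workflows/version-bump.yml
on:
  workflow_call:
    inputs:
      version_number:
        description: "The version number to bump to"
        required: true
        type: string
```

With `secrets: inherit`, the caller passes all of its available secrets through to the reusable workflow, which is why no individual secrets need to be listed here.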
13 changes: 11 additions & 2 deletions CHANGELOG.md
@@ -4,7 +4,17 @@
- Changes are listed under the (pre)release in which they first appear. Subsequent releases include changes from previous releases.
- "Breaking changes" listed under a version may require action from end users or external maintainers when upgrading to that version.
- Do not edit this file directly. This file is auto-generated using [changie](https://github.com/miniscruff/changie). For details on how to document a change, see [the contributing guide](https://github.com/dbt-labs/dbt-spark/blob/main/CONTRIBUTING.md#adding-changelog-entry)
## dbt-spark 1.1.1 - December 20, 2022
### Features
- backport changie to 1.1.latest ([#417](https://github.com/dbt-labs/dbt-spark/issues/417), [#425](https://github.com/dbt-labs/dbt-spark/pull/425))
### Fixes
- Support new error messages in the future Spark. ([#515](https://github.com/dbt-labs/dbt-spark/issues/515), [#520](https://github.com/dbt-labs/dbt-spark/pull/520))
### Under the Hood
- updating python version in tox ([#536](https://github.com/dbt-labs/dbt-spark/issues/536), [#534](https://github.com/dbt-labs/dbt-spark/pull/534))
- fix post new release tox issues around passenv and allowlist_externals ([#546](https://github.com/dbt-labs/dbt-spark/issues/546), [#546](https://github.com/dbt-labs/dbt-spark/pull/546))

### Contributors
- [@ueshin](https://github.com/ueshin) ([#520](https://github.com/dbt-labs/dbt-spark/pull/520))
## dbt-spark 1.1.0 - April 28, 2022

### Features
@@ -25,11 +35,10 @@
- Configure insert_overwrite models to use parquet ([#301](https://github.com/dbt-labs/dbt-spark/pull/301))

### Contributors
- - [@JCZuurmond](https://github.com/dbt-labs/dbt-spark/pull/279) ([#279](https://github.com/dbt-labs/dbt-spark/pull/279))
+ - [@JCZuurmond](https://github.com/dbt-labs/dbt-spark/pull/279) ( [#279](https://github.com/dbt-labs/dbt-spark/pull/279))
- [@ueshin](https://github.com/ueshin) ([#320](https://github.com/dbt-labs/dbt-spark/pull/320))
- [@amychen1776](https://github.com/amychen1776) ([#288](https://github.com/dbt-labs/dbt-spark/pull/288))
- [@ueshin](https://github.com/ueshin) ([#285](https://github.com/dbt-labs/dbt-spark/pull/285))

## Previous Releases
For information on prior major and minor releases, see their changelogs:
- [1.0](https://github.com/dbt-labs/dbt-spark/blob/1.0.latest/CHANGELOG.md)
