1.0 update version-bump GHA (#563)
* 1.0 update version-bump GHA

* align dev-requirements naming w core

* Bumping version to 1.0.2rc0 and generate changelog (#566)

* Bumping version to 1.0.2rc0 and generate CHANGELOG

* Update CHANGELOG.md

Co-authored-by: Github Build Bot <buildbot@fishtownanalytics.com>
Co-authored-by: colin-rogers-dbt <111200756+colin-rogers-dbt@users.noreply.github.com>

* Bumping version to 1.0.2 and generate changelog (#569)

* Bumping version to 1.0.2 and generate CHANGELOG

* Update CHANGELOG.md

Co-authored-by: Github Build Bot <buildbot@fishtownanalytics.com>
Co-authored-by: colin-rogers-dbt <111200756+colin-rogers-dbt@users.noreply.github.com>

* skip databricks integration tests

* experimental skipping of integ tests

Co-authored-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>
Co-authored-by: Github Build Bot <buildbot@fishtownanalytics.com>
3 people authored Dec 23, 2022
1 parent 2756ec0 commit df882a2
Showing 12 changed files with 87 additions and 135 deletions.
2 changes: 1 addition & 1 deletion .bumpversion.cfg

```diff
@@ -1,5 +1,5 @@
 [bumpversion]
-current_version = 1.0.1
+current_version = 1.0.2
 parse = (?P<major>\d+)
     \.(?P<minor>\d+)
     \.(?P<patch>\d+)
```
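The `parse` pattern above is how bumpversion splits a version string into named parts before rewriting it in every tracked file (the collapsed remainder of the config also handles prerelease suffixes like `1.0.2rc0`). A quick way to preview a bump without touching the tree — a sketch only, assuming bump2version is installed and using a hypothetical target version — mirrors the invocation in this commit's release.yml:

```sh
# Dry-run sketch: print old/new values instead of rewriting files.
# "1.0.3" is a hypothetical next version, not part of this commit.
pip install bump2version
bumpversion --config-file .bumpversion.cfg patch \
    --new-version 1.0.3 --allow-dirty --dry-run --list
```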
6 changes: 6 additions & 0 deletions .changes/1.0.2.md

```diff
@@ -0,0 +1,6 @@
+## dbt-spark 1.0.2 - December 20, 2022
+### Features
+- backport changie to 1.0.latest ([#417](https://github.com/dbt-labs/dbt-spark/issues/417), [#426](https://github.com/dbt-labs/dbt-spark/pull/426))
+### Under the Hood
+- fix post new release tox issues around passenv and allowlist_externals ([#547](https://github.com/dbt-labs/dbt-spark/issues/547), [#547](https://github.com/dbt-labs/dbt-spark/pull/547))
+
```
7 changes: 0 additions & 7 deletions .changes/unreleased/Features-20220810-133356.yaml

This file was deleted.

7 changes: 0 additions & 7 deletions .changes/unreleased/Under the Hood-20221209-143353.yaml

This file was deleted.
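Both deleted files are changie "unreleased" entries (each typically records a kind, body, timestamp, and issue/PR metadata); cutting 1.0.2 consumes them into `.changes/1.0.2.md` and the changelog. A sketch of the usual changie release flow — assuming the repo's `.changie.yaml` defaults, not commands recorded in this commit:

```sh
# Fold every .changes/unreleased/*.yaml entry into .changes/1.0.2.md;
# `batch` deletes the consumed entry files, `merge` regenerates CHANGELOG.md.
changie batch 1.0.2
changie merge
```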

66 changes: 47 additions & 19 deletions .circleci/config.yml

```diff
@@ -10,6 +10,24 @@ jobs:
       - checkout
       - run: tox -e flake8,unit
 
+  integration-spark-session:
+    environment:
+      DBT_INVOCATION_ENV: circle
+    docker:
+      - image: godatadriven/pyspark:3.1
+    steps:
+      - checkout
+      - run: apt-get update
+      - run: python3 -m pip install --upgrade pip
+      - run: apt-get install -y git gcc g++ unixodbc-dev libsasl2-dev
+      - run: python3 -m pip install tox
+      - run:
+          name: Run integration tests
+          command: tox -e integration-spark-session
+          no_output_timeout: 1h
+      - store_artifacts:
+          path: ./logs
+
   integration-spark-thrift:
     environment:
       DBT_INVOCATION_ENV: circle
@@ -61,21 +79,28 @@ jobs:
   integration-spark-databricks-http:
     environment:
       DBT_INVOCATION_ENV: circle
+      DBT_DATABRICKS_RETRY_ALL: True
+      DBT_TEST_USER_1: "buildbot+dbt_test_user_1@dbtlabs.com"
+      DBT_TEST_USER_2: "buildbot+dbt_test_user_2@dbtlabs.com"
+      DBT_TEST_USER_3: "buildbot+dbt_test_user_3@dbtlabs.com"
     docker:
       - image: fishtownanalytics/test-container:10
     steps:
       - checkout
-      - run:
-          name: Run integration tests
-          command: tox -e integration-spark-databricks-http
-          no_output_timeout: 1h
-      - store_artifacts:
-          path: ./logs
+      # - run:
+      #     name: Run integration tests
+      #     command: tox -e integration-spark-databricks-http
+      #     no_output_timeout: 1h
+      # - store_artifacts:
+      #     path: ./logs
 
   integration-spark-databricks-odbc-cluster: &databricks-odbc
     environment:
       DBT_INVOCATION_ENV: circle
       ODBC_DRIVER: Simba # TODO: move env var to Docker image
+      DBT_TEST_USER_1: "buildbot+dbt_test_user_1@dbtlabs.com"
+      DBT_TEST_USER_2: "buildbot+dbt_test_user_2@dbtlabs.com"
+      DBT_TEST_USER_3: "buildbot+dbt_test_user_3@dbtlabs.com"
     docker:
       # image based on `fishtownanalytics/test-container` w/ Simba ODBC Spark driver installed
       - image: 828731156495.dkr.ecr.us-east-1.amazonaws.com/dbt-spark-odbc-test-container:latest
@@ -84,29 +109,32 @@ jobs:
         aws_secret_access_key: $AWS_SECRET_ACCESS_KEY_STAGING
     steps:
       - checkout
-      - run:
-          name: Run integration tests
-          command: tox -e integration-spark-databricks-odbc-cluster
-          no_output_timeout: 1h
-      - store_artifacts:
-          path: ./logs
+      # - run:
+      #     name: Run integration tests
+      #     command: tox -e integration-spark-databricks-odbc-cluster
+      #     no_output_timeout: 1h
+      # - store_artifacts:
+      #     path: ./logs
 
   integration-spark-databricks-odbc-endpoint:
     <<: *databricks-odbc
     steps:
       - checkout
-      - run:
-          name: Run integration tests
-          command: tox -e integration-spark-databricks-odbc-sql-endpoint
-          no_output_timeout: 1h
-      - store_artifacts:
-          path: ./logs
+      # - run:
+      #     name: Run integration tests
+      #     command: tox -e integration-spark-databricks-odbc-sql-endpoint
+      #     no_output_timeout: 1h
+      # - store_artifacts:
+      #     path: ./logs
 
 workflows:
   version: 2
   test-everything:
     jobs:
       - unit
+      - integration-spark-session:
+          requires:
+            - unit
       - integration-spark-thrift:
           requires:
             - unit
```
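The new `integration-spark-session` job is just a containerized tox run, so reproducing it locally is mostly a matter of installing the same system libraries. A rough local equivalent — a sketch assuming a Debian-based machine with Python 3, not a documented workflow of this repo:

```sh
# Same steps the CircleCI job runs, minus the container and artifact upload.
sudo apt-get update
sudo apt-get install -y git gcc g++ unixodbc-dev libsasl2-dev
python3 -m pip install --upgrade pip tox
tox -e integration-spark-session
```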
6 changes: 3 additions & 3 deletions .github/workflows/release.yml
```diff
@@ -24,7 +24,7 @@ jobs:
           python3 -m venv env
           source env/bin/activate
           sudo apt-get install libsasl2-dev
-          pip install -r dev_requirements.txt
+          pip install -r dev-requirements.txt
           pip install twine wheel setuptools
           python setup.py sdist bdist_wheel
           pip install dist/dbt-spark-*.tar.gz
@@ -48,7 +48,7 @@ jobs:
           python3 -m venv env
           source env/bin/activate
           sudo apt-get install libsasl2-dev
-          pip install -r dev_requirements.txt
+          pip install -r dev-requirements.txt
           bumpversion --config-file .bumpversion.cfg patch --new-version ${{env.version_number}} --allow-dirty
           git status
@@ -111,7 +111,7 @@ jobs:
           python3 -m venv env
           source env/bin/activate
           sudo apt-get install libsasl2-dev
-          pip install -r dev_requirements.txt
+          pip install -r dev-requirements.txt
           pip install twine wheel setuptools
           python setup.py sdist bdist_wheel
           twine upload --non-interactive dist/dbt_spark-${{env.version_number}}-py3-none-any.whl dist/dbt-spark-${{env.version_number}}.tar.gz
```
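All three touched steps build or verify the distribution the same way; only the requirements filename changes. A condensed local sketch of that build path, with `twine check` added here as a pre-upload sanity step that is not part of the workflow itself:

```sh
# Build sdist + wheel the way release.yml does, then verify locally.
pip install -r dev-requirements.txt twine wheel setuptools
python setup.py sdist bdist_wheel
pip install dist/dbt-spark-*.tar.gz   # smoke-test that the sdist installs
twine check dist/*                    # not in the workflow; local sanity check
```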
103 changes: 14 additions & 89 deletions .github/workflows/version-bump.yml
```diff
@@ -1,103 +1,28 @@
 # **what?**
-# This workflow will take a version number and a dry run flag. With that
-# it will run versionbump to update the version number everywhere in the
-# code base and then generate an update Docker requirements file. If this
-# is a dry run, a draft PR will open with the changes. If this isn't a dry
-# run, the changes will be committed to the branch this is run on.
+# This workflow will take the new version number to bump to. With that
+# it will run versionbump to update the version number everywhere in the
+# code base and then run changie to create the corresponding changelog.
+# A PR will be created with the changes that can be reviewed before committing.
 
 # **why?**
-# This is to aid in releasing dbt and making sure we have updated
-# the versions and Docker requirements in all places.
+# This is to aid in releasing dbt and making sure we have updated
+# the version in all places and generated the changelog.
 
 # **when?**
-# This is triggered either manually OR
-# from the repository_dispatch event "version-bump" which is sent from
-# the dbt-release repo Action
+# This is triggered manually
 
 name: Version Bump
 
 on:
   workflow_dispatch:
     inputs:
       version_number:
-        description: 'The version number to bump to'
+        description: 'The version number to bump to (ex. 1.2.0, 1.3.0b1)'
         required: true
-      is_dry_run:
-        description: 'Creates a draft PR to allow testing instead of committing to a branch'
-        required: true
-        default: 'true'
-  repository_dispatch:
-    types: [version-bump]
 
-jobs:
-  bump:
-    runs-on: ubuntu-latest
-    steps:
-      - name: Check out the repository
-        uses: actions/checkout@v2
-
-      - name: Set version and dry run values
-        id: variables
-        env:
-          VERSION_NUMBER: "${{ github.event.client_payload.version_number == '' && github.event.inputs.version_number || github.event.client_payload.version_number }}"
-          IS_DRY_RUN: "${{ github.event.client_payload.is_dry_run == '' && github.event.inputs.is_dry_run || github.event.client_payload.is_dry_run }}"
-        run: |
-          echo Repository dispatch event version: ${{ github.event.client_payload.version_number }}
-          echo Repository dispatch event dry run: ${{ github.event.client_payload.is_dry_run }}
-          echo Workflow dispatch event version: ${{ github.event.inputs.version_number }}
-          echo Workflow dispatch event dry run: ${{ github.event.inputs.is_dry_run }}
-          echo ::set-output name=VERSION_NUMBER::$VERSION_NUMBER
-          echo ::set-output name=IS_DRY_RUN::$IS_DRY_RUN
-      - uses: actions/setup-python@v2
-        with:
-          python-version: "3.8"
-
-      - name: Install python dependencies
-        run: |
-          sudo apt-get install libsasl2-dev
-          python3 -m venv env
-          source env/bin/activate
-          pip install --upgrade pip
-      - name: Create PR branch
-        if: ${{ steps.variables.outputs.IS_DRY_RUN == 'true' }}
-        run: |
-          git checkout -b bumping-version/${{steps.variables.outputs.VERSION_NUMBER}}_$GITHUB_RUN_ID
-          git push origin bumping-version/${{steps.variables.outputs.VERSION_NUMBER}}_$GITHUB_RUN_ID
-          git branch --set-upstream-to=origin/bumping-version/${{steps.variables.outputs.VERSION_NUMBER}}_$GITHUB_RUN_ID bumping-version/${{steps.variables.outputs.VERSION_NUMBER}}_$GITHUB_RUN_ID
-      - name: Bumping version
-        run: |
-          source env/bin/activate
-          pip install -r dev_requirements.txt
-          env/bin/bumpversion --allow-dirty --new-version ${{steps.variables.outputs.VERSION_NUMBER}} major
-          git status
-      - name: Commit version bump directly
-        uses: EndBug/add-and-commit@v7
-        if: ${{ steps.variables.outputs.IS_DRY_RUN == 'false' }}
-        with:
-          author_name: 'Github Build Bot'
-          author_email: 'buildbot@fishtownanalytics.com'
-          message: 'Bumping version to ${{steps.variables.outputs.VERSION_NUMBER}}'
-
-      - name: Commit version bump to branch
-        uses: EndBug/add-and-commit@v7
-        if: ${{ steps.variables.outputs.IS_DRY_RUN == 'true' }}
-        with:
-          author_name: 'Github Build Bot'
-          author_email: 'buildbot@fishtownanalytics.com'
-          message: 'Bumping version to ${{steps.variables.outputs.VERSION_NUMBER}}'
-          branch: 'bumping-version/${{steps.variables.outputs.VERSION_NUMBER}}_${{GITHUB.RUN_ID}}'
-          push: 'origin origin/bumping-version/${{steps.variables.outputs.VERSION_NUMBER}}_${{GITHUB.RUN_ID}}'
-
-      - name: Create Pull Request
-        uses: peter-evans/create-pull-request@v3
-        if: ${{ steps.variables.outputs.IS_DRY_RUN == 'true' }}
-        with:
-          author: 'Github Build Bot <buildbot@fishtownanalytics.com>'
-          draft: true
-          base: ${{github.ref}}
-          title: 'Bumping version to ${{steps.variables.outputs.VERSION_NUMBER}}'
-          branch: 'bumping-version/${{steps.variables.outputs.VERSION_NUMBER}}_${{GITHUB.RUN_ID}}'
+jobs:
+  version_bump_and_changie:
+    uses: dbt-labs/actions/.github/workflows/version-bump.yml@main
+    with:
+      version_number: ${{ inputs.version_number }}
+    secrets: inherit # ok since what we are calling is internally maintained
```
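With the dry-run input and `repository_dispatch` trigger gone, the rewritten workflow delegates everything to the shared reusable workflow in dbt-labs/actions and takes a single manual input. One way to kick it off — a sketch using the GitHub CLI with a hypothetical version number:

```sh
# Dispatch the rewritten workflow from a checkout of the repo.
gh workflow run version-bump.yml -f version_number=1.0.3
gh run list --workflow=version-bump.yml --limit 1   # confirm it queued
```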
9 changes: 8 additions & 1 deletion CHANGELOG.md
```diff
@@ -4,6 +4,11 @@
 - Changes are listed under the (pre)release in which they first appear. Subsequent releases include changes from previous releases.
 - "Breaking changes" listed under a version may require action from end users or external maintainers when upgrading to that version.
 - Do not edit this file directly. This file is auto-generated using [changie](https://github.com/miniscruff/changie). For details on how to document a change, see [the contributing guide](https://github.com/dbt-labs/dbt-spark/blob/main/CONTRIBUTING.md#adding-changelog-entry)
+## dbt-spark 1.0.2 - December 20, 2022
+### Features
+- backport changie to 1.0.latest ([#417](https://github.com/dbt-labs/dbt-spark/issues/417), [#426](https://github.com/dbt-labs/dbt-spark/pull/426))
+### Under the Hood
+- fix post new release tox issues around passenv and allowlist_externals ([#547](https://github.com/dbt-labs/dbt-spark/issues/547), [#547](https://github.com/dbt-labs/dbt-spark/pull/547))
 
 ## dbt-spark 1.0.1 - April 19, 2022
 
@@ -16,7 +21,7 @@
 
 ### Contributors
 - [@ueshin](https://github.com/ueshin) ([#285](https://github.com/dbt-labs/dbt-spark/pull/285), [#320](https://github.com/dbt-labs/dbt-spark/pull/320))
-
+
 ## dbt-spark 1.0.0 - December 3, 2021
 
@@ -35,6 +40,8 @@
 - [@grindheim](https://github.com/grindheim) ([#262](https://github.com/dbt-labs/dbt-spark/pull/262/))
 - [@vingov](https://github.com/vingov) ([#210](https://github.com/dbt-labs/dbt-spark/pull/210))
 
+
+
 ## Previous Releases
 For information on prior releases of dbt-spark prior to 1.0.0 please see
 - [0.21](https://github.com/dbt-labs/dbt-spark/blob/0.21.latest/CHANGELOG.md)
```
2 changes: 1 addition & 1 deletion dbt/adapters/spark/__version__.py
```diff
@@ -1 +1 @@
-version = "1.0.1"
+version = "1.0.2"
```
dev_requirements.txt → dev-requirements.txt: file renamed without changes.
2 changes: 1 addition & 1 deletion setup.py
```diff
@@ -52,7 +52,7 @@ def _get_dbt_core_version():
 
 # TODO remove old logic and add to versionBump script
 package_name = "dbt-spark"
-package_version = "1.0.1"
+package_version = "1.0.2"
 dbt_core_version = _get_dbt_core_version()
 description = """The Apache Spark adapter plugin for dbt"""
 
```
12 changes: 6 additions & 6 deletions tox.ini
```diff
@@ -11,7 +11,7 @@ passenv =
     DBT_*
     PYTEST_ADDOPTS
 deps =
-    -r{toxinidir}/dev_requirements.txt
+    -r{toxinidir}/dev-requirements.txt
 
 [testenv:unit]
 allowlist_externals =
@@ -23,7 +23,7 @@ passenv =
     PYTEST_ADDOPTS
 deps =
     -r{toxinidir}/requirements.txt
-    -r{toxinidir}/dev_requirements.txt
+    -r{toxinidir}/dev-requirements.txt
 
 [testenv:integration-spark-databricks-http]
 allowlist_externals =
@@ -35,7 +35,7 @@ passenv =
     PYTEST_ADDOPTS
 deps =
     -r{toxinidir}/requirements.txt
-    -r{toxinidir}/dev_requirements.txt
+    -r{toxinidir}/dev-requirements.txt
     -e.
 
 [testenv:integration-spark-databricks-odbc-cluster]
@@ -49,7 +49,7 @@ passenv =
     PYTEST_ADDOPTS ODBC_DRIVER
 deps =
     -r{toxinidir}/requirements.txt
-    -r{toxinidir}/dev_requirements.txt
+    -r{toxinidir}/dev-requirements.txt
     -e.
 
 [testenv:integration-spark-databricks-odbc-sql-endpoint]
@@ -63,7 +63,7 @@ passenv =
     PYTEST_ADDOPTS ODBC_DRIVER
 deps =
     -r{toxinidir}/requirements.txt
-    -r{toxinidir}/dev_requirements.txt
+    -r{toxinidir}/dev-requirements.txt
     -e.
 
 
@@ -78,5 +78,5 @@ passenv =
     PYTEST_ADDOPTS
 deps =
     -r{toxinidir}/requirements.txt
-    -r{toxinidir}/dev_requirements.txt
+    -r{toxinidir}/dev-requirements.txt
     -e.
```
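Since every testenv now points at `dev-requirements.txt`, the likely failure mode after this rename is a leftover `dev_requirements.txt` reference somewhere. A quick local check, assuming tox is installed:

```sh
# List the envs defined above, then run the cheapest one to confirm
# dependency resolution against the renamed dev-requirements.txt.
tox -l
tox -e unit
```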
