Add options clause to create table macro (#171)
* Add option clause macro

* Add option clause to create table macro

* Add test for options clause

* Add change log entry

* Add file format delta to test

* Change order of table expression

* Make options lower case

* Change order of table definitions

* Add options to spark config
JCZuurmond authored Jun 2, 2021
1 parent dff1b61 commit c13f1dd
Showing 4 changed files with 24 additions and 0 deletions.
4 changes: 4 additions & 0 deletions CHANGELOG.md
@@ -1,5 +1,8 @@
## dbt next

### Features
- Allow setting table `OPTIONS` using `config` ([#171](https://github.com/fishtown-analytics/dbt-spark/pull/171))

### Fixes

- Cast `table_owner` to string to avoid errors generating docs ([#158](https://github.com/fishtown-analytics/dbt-spark/pull/158), [#159](https://github.com/fishtown-analytics/dbt-spark/pull/159))
@@ -13,6 +16,7 @@
- [@friendofasquid](https://github.com/friendofasquid) ([#159](https://github.com/fishtown-analytics/dbt-spark/pull/159))
- [@franloza](https://github.com/franloza) ([#160](https://github.com/fishtown-analytics/dbt-spark/pull/160))
- [@Fokko](https://github.com/Fokko) ([#165](https://github.com/fishtown-analytics/dbt-spark/pull/165))
- [@JCZuurmond](https://github.com/JCZuurmond) ([#171](https://github.com/fishtown-analytics/dbt-spark/pull/171))

## dbt-spark 0.19.1 (Release TBD)

1 change: 1 addition & 0 deletions dbt/adapters/spark/impl.py
@@ -36,6 +36,7 @@ class SparkConfig(AdapterConfig):
partition_by: Optional[Union[List[str], str]] = None
clustered_by: Optional[Union[List[str], str]] = None
buckets: Optional[int] = None
options: Optional[Dict[str, str]] = None


class SparkAdapter(SQLAdapter):
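With the new `options` field on `SparkConfig`, table `OPTIONS` can be set from a model's `config` block. A minimal sketch of such a dbt model (the model body and the `compression` key are illustrative, taken from the unit test below):

```sql
{{ config(
    materialized='table',
    file_format='delta',
    options={'compression': 'gzip'}
) }}

select 1 as id
```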
11 changes: 11 additions & 0 deletions dbt/include/spark/macros/adapters.sql
@@ -13,6 +13,16 @@
{%- endif %}
{%- endmacro -%}

{% macro options_clause() -%}
{%- set options = config.get('options') -%}
{%- if options is not none %}
options (
{%- for option in options -%}
{{ option }} "{{ options[option] }}" {% if not loop.last %}, {% endif %}
{%- endfor %}
)
{%- endif %}
{%- endmacro -%}

{% macro comment_clause() %}
{%- set raw_persist_docs = config.get('persist_docs', {}) -%}
@@ -83,6 +93,7 @@
create table {{ relation }}
{% endif %}
{{ file_format_clause() }}
{{ options_clause() }}
{{ partition_cols(label="partitioned by") }}
{{ clustered_cols(label="clustered by") }}
{{ location_clause() }}
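Outside of Jinja, the rendering logic of `options_clause` can be mirrored in plain Python. This is an illustrative sketch, not part of the PR (the helper name `render_options_clause` is made up), and it normalizes the trailing space that the macro's own output contains:

```python
def render_options_clause(options):
    """Mirror of the Jinja options_clause macro.

    Emits `options (key "value", ...)` for a dict of options, or an
    empty string when no options are configured. Illustrative only;
    whitespace differs slightly from the macro's rendered output.
    """
    if not options:
        return ""
    pairs = ", ".join(f'{key} "{value}"' for key, value in options.items())
    return f"options ({pairs})"

print(render_options_clause({"compression": "gzip"}))
# -> options (compression "gzip")
```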
8 changes: 8 additions & 0 deletions test/unit/test_macros.py
@@ -43,6 +43,14 @@ def test_macros_create_table_as_file_format(self):
sql = self.__run_macro(template, 'spark__create_table_as', False, 'my_table', 'select 1').strip()
self.assertEqual(sql, "create or replace table my_table using delta as select 1")

def test_macros_create_table_as_options(self):
template = self.__get_template('adapters.sql')

self.config['file_format'] = 'delta'
self.config['options'] = {"compression": "gzip"}
sql = self.__run_macro(template, 'spark__create_table_as', False, 'my_table', 'select 1').strip()
self.assertEqual(sql, 'create or replace table my_table using delta options (compression "gzip" ) as select 1')

def test_macros_create_table_as_partition(self):
template = self.__get_template('adapters.sql')

