DLT source for Affinity.
If you don't know dlt but stumbled across this while searching for a way to get your data out of Affinity: this will do it. It lets you pull almost any data (except some enriched data) out of your Affinity instance and into any target system supported by dlt (Snowflake, Postgres, etc.).
Create a `.dlt/secrets.toml` with your API key:

```toml
affinity_api_key = "<YOUR_API_KEY>"
```
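Alternatively, dlt can resolve this secret from an environment variable, which is how the pipeline is invoked at the end of this README:

```sh
# Equivalent to setting affinity_api_key in .dlt/secrets.toml
export AFFINITY_API_KEY="<YOUR_API_KEY>"
```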
Then run the default source, optionally passing list references:
```py
import dlt
from dlt_source_affinity import ListReference, source as affinity_source

pipeline = dlt.pipeline(
    pipeline_name="affinity_pipeline",
    destination="duckdb",
    dev_mode=True,
)

affinity_data = affinity_source(
    # By default, the source loads:
    # - organizations
    # - persons
    # - lists
    # - opportunities
    # - notes
    # Optionally, you can pass an arbitrary number of lists and list views:
    list_refs=[
        # Loads the list with ID 123,
        # e.g. https://<your-subdomain>.affinity.co/lists/123/
        ListReference(123),
        # Loads the view with ID 456 in list 123,
        # e.g. https://<your-subdomain>.affinity.co/lists/123/views/456-all-organizations
        ListReference(123, 456),
    ]
)
pipeline.run(affinity_data)
```
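`pipeline.run` returns load info you can inspect, and dlt's SQL client lets you query the loaded tables directly. A minimal sketch continuing the example above (the `companies` table name is taken from the resource list below):

```py
# Capture the result of the load (replaces the bare pipeline.run above)
load_info = pipeline.run(affinity_data)
print(load_info)  # destination, dataset, and per-load-package status

# Query a loaded table through dlt's SQL client (duckdb in this example)
with pipeline.sql_client() as client:
    with client.execute_query("SELECT count(*) FROM companies") as cursor:
        print(cursor.fetchall())
```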
Resources that can be loaded using this verified source are:
| Name | Description | API version | Permissions needed |
| --- | --- | --- | --- |
| companies | The stored companies | V2 | Requires the "Export All Organizations directory" permission. |
| persons | The stored persons | V2 | Requires the "Export All People directory" permission. |
| opportunities | The stored opportunities | V2 | Requires the "Export data from Lists" permission. |
| lists | A given list and/or a saved view of a list | V2 | Requires the "Export data from Lists" permission. |
| notes | Notes attached to companies, persons, and opportunities | Legacy | n/a |
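If you only need a subset of these resources, you can select them on the source before running the pipeline. A minimal sketch using dlt's standard `with_resources` selector (resource names as in the table above):

```py
import dlt
from dlt_source_affinity import source as affinity_source

pipeline = dlt.pipeline(pipeline_name="affinity_pipeline", destination="duckdb")

# Load only companies and persons, skipping the other resources
pipeline.run(affinity_source().with_resources("companies", "persons"))
```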
There are two versions of the Affinity API:
- Legacy, which is available on all plans.
- V2, which is only available to customers on an Enterprise plan.

This verified source makes use of both APIs. The authentication credentials are the same for both; however, the two APIs differ in their authentication behavior.
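For illustration, here is roughly how the two schemes differ at the HTTP level. This is a hedged sketch based on Affinity's public API documentation, not code from this source: the Legacy API expects HTTP Basic auth with an empty username and the API key as the password, while V2 expects a Bearer token.

```py
import base64

api_key = "<YOUR_API_KEY>"

# Legacy (V1) API: HTTP Basic auth, empty username, API key as password
legacy_headers = {
    "Authorization": "Basic " + base64.b64encode(f":{api_key}".encode()).decode()
}

# V2 API: standard Bearer token
v2_headers = {"Authorization": f"Bearer {api_key}"}
```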
To scaffold a pipeline project, initialize this source with dlt's CLI:

```sh
dlt init affinity duckdb
```

Here, we chose duckdb as the destination. Alternatively, you can choose redshift, bigquery, or any of the other supported destinations.
You'll need to obtain your API key and configure the pipeline with it (see the `secrets.toml` example above).
This project uses devenv.

Run the example pipeline with your API key set in the environment:

```sh
AFFINITY_API_KEY=[...] python affinity_pipeline.py
```
Run:

```sh
generate-model
```