cheqd Studio (formerly known as Credential Service) enables users to consume cheqd's identity functionality, such as DIDs, Trust Registries, Status Lists, Credential Payments and DID-Linked Resources over REST API. This enables users to integrate cheqd's functionality into existing applications or create a full end-to-end trusted ecosystem from the ground up.
We run hosted endpoints for this package (in case you don't want to run it yourself) which have Swagger / OpenAPI definition endpoints that list all of the APIs and how they work.
The Swagger API definition pages are:
The application allows configuring the following parameters using environment variables.
- `LOG_LEVEL`: specifies the log level: `trace`, `debug`, `info`, `warn` or `error`.
- `MAINNET_RPC_URL`: RPC endpoint for cheqd mainnet. (Default: `https://rpc.cheqd.net:443`)
- `TESTNET_RPC_URL`: RPC endpoint for cheqd testnet. (Default: `https://rpc.cheqd.network:443`)
- `RESOLVER_URL`: API endpoint for a DID Resolver that supports `did:cheqd`. (Default: `https://resolver.cheqd.net/1.0/identifiers/`)
- `APPLICATION_BASE_URL`: URL of the application (external domain name).
- `CORS_ALLOWED_ORIGINS`: CORS allowed origins used in the app (optional). (Default: `APPLICATION_BASE_URL`)
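For reference, below is a minimal `.env` sketch using the documented defaults; the `studio.example.com` domain is a placeholder for your own deployment.

```bash
# Base configuration sketch (defaults from above; example.com is a placeholder)
LOG_LEVEL=info
MAINNET_RPC_URL=https://rpc.cheqd.net:443
TESTNET_RPC_URL=https://rpc.cheqd.network:443
RESOLVER_URL=https://resolver.cheqd.net/1.0/identifiers/
APPLICATION_BASE_URL=https://studio.example.com
CORS_ALLOWED_ORIGINS=https://studio.example.com
```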
The application supports two modes in which keys are managed: either storing them in-memory while a container is running, or persisting them in a PostgreSQL database with the Veramo SDK. Using an external Postgres database allows for "custodian" mode, where identity and cheqd/Cosmos keys can be offloaded by client applications and stored in the database.
By default, `ENABLE_EXTERNAL_DB` is set to off/`false`. To enable the external Veramo KMS database, set `ENABLE_EXTERNAL_DB` to `true`, then define the environment variables below in the `.env` file:

- `EXTERNAL_DB_CONNECTION_URL`: PostgreSQL database connection URL, e.g. `postgres://<user>:<password>@<host>:<port>/<database>`.
- `EXTERNAL_DB_ENCRYPTION_KEY`: Secret key used to encrypt the Veramo key-specific database tables. This adds a layer of protection by not storing the database in plaintext.
- `EXTERNAL_DB_CERTIFICATE`: Custom CA certificate required to connect to the database (optional).
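For example, a hedged `.env` sketch for the external Veramo KMS database; the connection details and encryption key are placeholders:

```bash
ENABLE_EXTERNAL_DB=true
# Placeholder credentials; substitute your own Postgres connection details
EXTERNAL_DB_CONNECTION_URL=postgres://cheqd:changeme@postgres:5432/app
EXTERNAL_DB_ENCRYPTION_KEY=replace-with-a-long-random-secret
# Only needed if your database requires a custom CA certificate
# EXTERNAL_DB_CERTIFICATE=
```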
By default, the application has API authentication disabled (which can be changed in configuration). If, however, you'd like to run the app with API authentication features, the following variables need to be configured.
We use a self-hosted version of LogTo, which supports OpenID Connect. Theoretically, these values could also be replaced with LogTo Cloud or any other OpenID Connect identity provider.
By default, `ENABLE_AUTHENTICATION` is set to off/`false`. To enable API authentication, set `ENABLE_AUTHENTICATION` to `true`, then define the environment variables below in the `.env` file:
- Endpoints
  - `LOGTO_ENDPOINT`: API endpoint for the LogTo server.
  - `LOGTO_DEFAULT_RESOURCE_URL`: Root of API resources in this application to be guarded. (Default: `http://localhost:3000/` on localhost.)
  - `LOGTO_MANAGEMENT_API`: URL of the management API for LogTo. This is typically static within self-hosted LogTo applications and is not meant to be a resolvable URL. (Default: `https://default.logto.app/api`)
- User-facing APIs
  - `LOGTO_APP_ID`: Application ID for the cheqd Studio application in LogTo. This can be set up as type "Traditional Web".
  - `LOGTO_APP_SECRET`: Application secret associated with the App ID above.
- Machine-to-machine backend APIs
  - `LOGTO_M2M_APP_ID`: Application ID for the machine-to-machine application in LogTo. This is used for elevated management APIs within LogTo.
  - `LOGTO_M2M_APP_SECRET`: Application secret associated with the machine-to-machine App ID above.
- Default role update using LogTo webhooks: LogTo supports webhooks that fire off requests to an API when it detects certain actions/changes. If you want to automatically assign a role to users, we recommend setting up a webhook that fires whenever a new account is created or there is a new sign-in.
  - `LOGTO_DEFAULT_ROLE_ID`: LogTo Role ID for the default role to put new users into.
  - `LOGTO_WEBHOOK_SECRET`: Webhook secret to authenticate incoming webhook requests from LogTo.
- Miscellaneous
  - `COOKIE_SECRET`: Secret for cookie encryption.
  - `API_KEY_EXPIRATION` (optional): Expiration time for API keys in days. (Default: 30 days)
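As an illustration, a hedged `.env` sketch for the authentication settings above; every ID, secret and domain shown is a placeholder to be replaced with values from your own LogTo setup:

```bash
ENABLE_AUTHENTICATION=true
# Placeholder values; copy the real IDs/secrets from your LogTo console
LOGTO_ENDPOINT=https://logto.example.com
LOGTO_DEFAULT_RESOURCE_URL=http://localhost:3000/
LOGTO_MANAGEMENT_API=https://default.logto.app/api
LOGTO_APP_ID=<app-id>
LOGTO_APP_SECRET=<app-secret>
LOGTO_M2M_APP_ID=<m2m-app-id>
LOGTO_M2M_APP_SECRET=<m2m-app-secret>
LOGTO_DEFAULT_ROLE_ID=<role-id>
LOGTO_WEBHOOK_SECRET=<webhook-secret>
COOKIE_SECRET=replace-with-a-long-random-secret
API_KEY_EXPIRATION=30
```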
This section describes bootstrapping for new accounts. If enabled, cheqd Studio automatically tops up new accounts with some tokens on testnet to make the onboarding process simpler.
- `ENABLE_ACCOUNT_TOPUP`: Enable/disable such functionality (`false` by default).
- `FAUCET_URI`: Faucet service API endpoint. (Default: `https://faucet-api.cheqd.network/credit`)
- `TESTNET_MINIMUM_BALANCE`: Minimum balance on an account before it is automatically topped up from the faucet. This value should be expressed as an integer in `CHEQ` tokens, which will then be converted in the background to the `ncheq` denomination. The account balance check is carried out on every account creation/login. (Default: 10,000 CHEQ testnet tokens)
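For example, a hedged `.env` sketch enabling testnet top-ups with the documented defaults:

```bash
ENABLE_ACCOUNT_TOPUP=true
FAUCET_URI=https://faucet-api.cheqd.network/credit
# Whole CHEQ testnet tokens; converted to ncheq in the background
TESTNET_MINIMUM_BALANCE=10000
```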
The application supports Stripe integration for payment processing.
- `STRIPE_ENABLED`: Enable/disable Stripe integration (`false` by default).
- `STRIPE_SECRET_KEY`: Secret key for the Stripe API. Keep this secret when deploying.
- `STRIPE_PUBLISHABLE_KEY`: Publishable key for the Stripe API.
- `STRIPE_WEBHOOK_SECRET`: Secret for the Stripe webhook.
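For example, a hedged `.env` sketch for Stripe; the key values are placeholders following Stripe's usual key prefixes:

```bash
STRIPE_ENABLED=true
# Placeholder keys; use the values from your Stripe dashboard
STRIPE_SECRET_KEY=sk_test_replace_me
STRIPE_PUBLISHABLE_KEY=pk_test_replace_me
STRIPE_WEBHOOK_SECRET=whsec_replace_me
```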
The app supports 3rd party connectors for credential storage and delivery.
The app's Verida Network connector can be enabled to deliver generated credentials to Verida Wallet.
By default, `ENABLE_VERIDA_CONNECTOR` is set to off/`false`. To enable the Verida connector, set `ENABLE_VERIDA_CONNECTOR` to `true`, then define the environment variables below in the `.env` file:
- `VERIDA_PRIVATE_KEY`: Secret key for the Verida Network API.
- `POLYGON_PRIVATE_KEY`: Secret key for the Polygon Network.
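For example, a hedged `.env` sketch for the Verida connector; both keys are placeholders for your own Verida and Polygon private keys:

```bash
ENABLE_VERIDA_CONNECTOR=true
# Placeholder keys; supply your own private keys
VERIDA_PRIVATE_KEY=<verida-private-key>
POLYGON_PRIVATE_KEY=<polygon-private-key>
```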
If you want to run the application without any external databases or dependent services, we provide a Docker Compose file to spin up a standalone service.
docker compose -f docker/no-external-db/docker-compose-no-db.yml up --detach
This standalone service uses an in-memory database with no persistence, and therefore is recommended only if you're managing key/secret storage separately.
The no-db.env
file in the same folder contains all the environment variables
necessary to configure the service. (See section Configuration above.)
Construct the Postgres connection URL and configure the environment variables mentioned above.
Spinning up a Docker container from the pre-built cheqd Studio Docker image on GitHub is as simple as the Docker Compose commands shown in the steps below.
Configure the environment variables in the postgres.env
file:
- `POSTGRES_USER`: Username for the Postgres database.
- `POSTGRES_PASSWORD`: Password for the Postgres database.
- `POSTGRES_MULTIPLE_DATABASES`: Database names for multiple databases in the same cluster, e.g. `"app,logto"`. This sets up multiple databases in the same cluster, which can be used independently for the external Veramo KMS or the LogTo service.
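For example, a hedged `postgres.env` sketch; the username and password are placeholders:

```bash
# Placeholder credentials; choose your own username/password
POSTGRES_USER=cheqd
POSTGRES_PASSWORD=changeme
# Creates separate databases for the app (external Veramo KMS) and LogTo
POSTGRES_MULTIPLE_DATABASES="app,logto"
```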
Then, make the Postgres initialisation scripts executable:
chmod +x docker/with-external-db/pg-init-scripts/create-multiple-postgresql-databases.sh
Configure the environment variables in the `logto.env` file with the settings described in the section above.
Then, run the LogTo service to configure the LogTo application API resources, applications, sign-in experiences, roles etc using Docker Compose:
docker compose -f docker/with-external-db/docker-compose-with-db.yml --profile logto up --detach
Configuring LogTo is outside the scope of this guide, and we recommend reading LogTo documentation to familiarise yourself.
Configure the environment variables in the `with-db.env` file with the settings described in the sections above. Depending on whether you are using the external Veramo KMS only, LogTo only, or both, you will need to have previously provisioned these services, as there are environment variables in this file that originate from Postgres/LogTo.
Then, start the service using Docker Compose:
docker compose -f docker/with-external-db/docker-compose-with-db.yml up --detach
When upgrading either the external Veramo KMS or LogTo, you might need to run migrations for the underlying databases.
You can run just the migration scripts using Docker Compose profiles defined in the Compose file.
For example, to run cheqd Studio app migrations on an existing Postgres database (for external Veramo KMS):
docker compose -f docker/with-external-db/docker-compose-with-db.yml --profile app-setup up --detach
Or to run LogTo migrations on an existing Postgres database:
docker compose -f docker/with-external-db/docker-compose-with-db.yml --profile logto-setup up --detach
To build your own image using Docker, use the Dockerfile provided.
docker build --file docker/Dockerfile --target runner . --tag studio:local
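As a usage sketch, you can then run the locally built image with your environment file; the `.env` filename and the port mapping below are assumptions (port 3000 matches the default `LOGTO_DEFAULT_RESOURCE_URL`), so adjust them to your configuration:

```bash
# Run the locally built image; .env and port 3000 are assumptions, adjust as needed
docker run --env-file .env --publish 3000:3000 studio:local
```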
If you notice anything not behaving how you expected, or would like to make a suggestion / request for a new feature, please create a new issue and let us know.
Our Discord server is our primary chat channel for the open-source community, software developers, and node operators.
Please reach out to us there for discussions, help, and feedback on the project.