To improve the general user experience around provider imports and to allow building further abstractions on top of the Terraform provider bindings, a few popular providers are offered as prebuilt packages. At the moment, the following providers are built and published to npm / PyPI automatically on a regular basis:
- AWS Provider
- Google Provider
- Azure Provider
- Kubernetes Provider
- Docker Provider
- GitHub Provider
- Null Provider
Please also check the Terraform CDK Providers organization for an up-to-date list. As these are normal npm / PyPI packages, they can be used like any other dependency.
e.g. in TypeScript / Node:
npm install @cdktf/provider-aws
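Once installed, the provider constructs can be imported directly from the package, so the bindings don't have to be generated locally with cdktf get. A minimal sketch, assuming the package exports the same construct names as the generated bindings (the exact export layout may vary between package versions):
import { Construct } from "constructs";
import { App, TerraformStack } from "cdktf";
// Prebuilt bindings from the npm package instead of locally generated ./.gen ones.
// The export names shown here are an assumption and may differ between versions.
import { AwsProvider } from "@cdktf/provider-aws";

class PrebuiltStack extends TerraformStack {
  constructor(scope: Construct, id: string) {
    super(scope, id);
    new AwsProvider(this, "aws", { region: "us-east-1" });
  }
}

const app = new App();
new PrebuiltStack(app, "prebuilt-example");
app.synth();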
CDK for Terraform allows you to import Terraform providers and modules into your project using the following workflow. Let's take the TypeScript example shown in the getting started guide. The project has a main.ts file that defines the AWS resources to be deployed.
import { Construct } from "constructs";
import { App, TerraformStack } from "cdktf";
import { AwsProvider, Instance } from "./.gen/providers/aws";
class MyStack extends TerraformStack {
constructor(scope: Construct, id: string) {
super(scope, id);
new AwsProvider(this, "aws", {
region: "us-east-1",
});
new Instance(this, "Hello", {
ami: "ami-2757f631",
instanceType: "t2.micro",
});
}
}
const app = new App();
new MyStack(app, "hello-terraform");
app.synth();
The project also has a cdktf.json file that defines the providers and modules used by the project.
vim cdktf.json
{
"language": "typescript",
"app": "npm run --silent compile && node main.js",
"terraformProviders": ["aws@~> 2.0"]
}
To use another provider or module, edit the cdktf.json file and add the name of the provider. For example, to add the DNSimple provider to the project, add its name to the terraformProviders array:
{
"language": "typescript",
"app": "npm run --silent compile && node main.js",
"terraformProviders": ["aws@~> 2.0", "dnsimple"]
}
Then run the cdktf get command in the working directory.
cdktf get
⠋ downloading and generating providers...
Generated typescript constructs in the output directory: .gen
This command automatically generates the appropriate TypeScript classes, which can then be imported in the application. Import the DnsimpleProvider and Record constructs from ./.gen/providers/dnsimple and use them to define resources.
import { Construct } from "constructs";
import { App, TerraformStack, Token } from "cdktf";
import { AwsProvider, Instance } from "./.gen/providers/aws";
import { DnsimpleProvider, Record } from "./.gen/providers/dnsimple";
class MyStack extends TerraformStack {
constructor(scope: Construct, id: string) {
super(scope, id);
new AwsProvider(this, "aws", {
region: "us-east-1",
});
const instance = new Instance(this, "Hello", {
ami: "ami-2757f631",
instanceType: "t2.micro",
});
new DnsimpleProvider(this, "dnsimple", {
token: Token.asString(process.env.DNSIMPLE_TOKEN),
account: Token.asString(process.env.DNSIMPLE_ACCOUNT),
});
new Record(this, "web-www", {
domain: "example.com",
name: "web",
value: instance.publicIp,
type: "A",
});
}
}
const app = new App();
new MyStack(app, "hello-terraform");
app.synth();
Synthesize the code.
cdktf synth --json
{
"//": {
"metadata": {
"version": "0.0.11-pre.8757404fa25b6e405f1a51eac11b96943ccb372e",
"stackName": "vpc-example"
}
},
"terraform": {
"required_providers": {
"aws": "~> 2.0",
"dnsimple": "undefined"
}
},
"provider": {
"aws": [
{
"region": "us-east-1"
}
],
"dnsimple": [
{
"account": "hello@example.com",
"token": "xxxxxxxxxx"
}
]
},
"resource": {
"aws_instance": {
"vpcexample_Hello_279554CB": {
"ami": "ami-2757f631",
"instance_type": "t2.micro",
"//": {
"metadata": {
"path": "vpc-example/Hello",
"uniqueId": "vpcexample_Hello_279554CB",
"stackTrace": [
.....
]
}
}
}
},
"dnsimple_record": {
"vpcexample_webwww_477C7150": {
"domain": "example.com",
"name": "web",
"type": "A",
"value": "${aws_instance.vpcexample_Hello_279554CB.public_ip}",
"//": {
"metadata": {
"path": "vpc-example/web-www",
"uniqueId": "vpcexample_webwww_477C7150",
"stackTrace": [
.....
]
}
}
}
}
}
}
When using the cdktf CLI commands, the environment variable TF_PLUGIN_CACHE_DIR is automatically set to $HOME/.terraform.d/plugin-cache if it isn't already set to something else. This avoids re-downloading the providers between the different cdktf commands. See the Terraform docs for more information.
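If you prefer a different cache location, set the variable yourself before invoking the CLI and cdktf will leave it untouched; the path below is just an example:
TF_PLUGIN_CACHE_DIR="$HOME/.cdktf-plugin-cache" cdktf get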
cdktf get works in a temporary directory, so all downloaded providers would be lost without caching. For the deployment-related commands diff / deploy / destroy, the working directory is usually cdktf.out and is treated as a throwaway folder. While not common, it's perfectly reasonable to remove the cdktf.out folder and synthesize again; in that case, caching helps as well.
Last but not least, when using multiple stacks within one application, provider caching is a basic prerequisite.
This behavior can be disabled by setting CDKTF_DISABLE_PLUGIN_CACHE_ENV to a non-null value, e.g. CDKTF_DISABLE_PLUGIN_CACHE_ENV=1. This might be desired when a different cache directory is configured via a .terraformrc configuration file.
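For example, to disable the cache for a single invocation:
CDKTF_DISABLE_PLUGIN_CACHE_ENV=1 cdktf get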
To use modules from the Terraform Registry, see cdktf.json. To use modules from other sources (local, GitHub, etc.), you can make use of TerraformHclModule. It doesn't have type-safe inputs/outputs, but it allows any Terraform module to be used.
TypeScript example:
// `stack` refers to an existing TerraformStack instance; TestProvider and
// TestResource stand in for any provider/resource constructs in your project.
import { TerraformHclModule } from "cdktf";

const provider = new TestProvider(stack, "provider", {
  accessKey: "key",
  alias: "provider1",
});

const module = new TerraformHclModule(stack, "test", {
  // Any module source supported by Terraform works here (local path, GitHub, etc.)
  source: "./foo",
  variables: {
    param1: "value1",
  },
  providers: [provider],
});

new TestResource(stack, "resource", {
  // Outputs are not typed; read them with accessors such as getString
  name: module.getString("name"),
});