
Commit

add tf scripts and update test cases
Signed-off-by: balasubramanian-s <balasubramanian.s@progress.com>
balasubramanian-s committed Oct 15, 2024
1 parent 94d6b9d commit 2470e85
Showing 9 changed files with 70 additions and 30 deletions.
1 change: 1 addition & 0 deletions README.md
@@ -295,6 +295,7 @@ The following resources are available in the InSpec GCP Profile
| [google_data_fusion_instance](docs/resources/google_data_fusion_instance.md) | [google_data_fusion_instances](docs/resources/google_data_fusion_instances.md) |
| [google_dataflow_project_location_job](docs/resources/google_dataflow_project_location_job.md) | [google_dataflow_project_location_jobs](docs/resources/google_dataflow_project_location_jobs.md) |
| [google_dataproc_autoscaling_policy](docs/resources/google_dataproc_autoscaling_policy.md) | [google_dataproc_autoscaling_policies](docs/resources/google_dataproc_autoscaling_policies.md) |
+| [google_dataproc_batch](docs/resources/google_dataproc_batch.md) | [google_dataproc_batches](docs/resources/google_dataproc_batches.md) |
| [google_dataproc_cluster](docs/resources/google_dataproc_cluster.md) | [google_dataproc_clusters](docs/resources/google_dataproc_clusters.md) |
| [google_dataproc_job](docs/resources/google_dataproc_job.md) | [google_dataproc_jobs](docs/resources/google_dataproc_jobs.md) |
| [google_dataproc_metastore_federation](docs/resources/google_dataproc_metastore_federation.md) | [google_dataproc_metastore_federations](docs/resources/google_dataproc_metastore_federations.md) |
11 changes: 5 additions & 6 deletions docs/resources/google_dataproc_batch.md
@@ -23,7 +23,7 @@ A `google_dataproc_batch` is used to test a Google Batch resource

## Examples
```
-describe google_dataproc_batch(name: ' value_name') do
+describe google_dataproc_batch(name: 'projects/*/locations/*/batches/value_name') do
it { should exist }
its('name') { should cmp 'value_name' }
its('uuid') { should cmp 'value_uuid' }
@@ -33,7 +33,6 @@ describe google_dataproc_batch(name: ' value_name') do
its('state_time') { should cmp 'value_statetime' }
its('creator') { should cmp 'value_creator' }
its('operation') { should cmp 'value_operation' }
-end
describe google_dataproc_batch(name: "does_not_exit") do
@@ -98,15 +97,15 @@ Properties that can be accessed from the `google_dataproc_batch` resource:

* `query_variables`: Optional. Mapping of query variable names to values (equivalent to the Spark SQL command: SET name="value";).

* `additional_properties`:

* `jar_file_uris`: Optional. HCFS URIs of jar files to be added to the Spark CLASSPATH.

* `runtime_info`: Runtime information about workload execution.

* `endpoints`: Output only. Map of remote access endpoints (such as web interfaces and APIs) to their URIs.

* `additional_properties`:

* `output_uri`: Output only. A URI pointing to the location of the stdout and stderr of the workload.

@@ -156,7 +155,7 @@ Properties that can be accessed from the `google_dataproc_batch` resource:

* `labels`: Optional. The labels to associate with this batch. Label keys must contain 1 to 63 characters, and must conform to RFC 1035 (https://www.ietf.org/rfc/rfc1035.txt). Label values may be empty, but, if present, must contain 1 to 63 characters, and must conform to RFC 1035 (https://www.ietf.org/rfc/rfc1035.txt). No more than 32 labels can be associated with a batch.

* `additional_properties`:

* `runtime_config`: Runtime configuration for a workload.

@@ -166,7 +165,7 @@ Properties that can be accessed from the `google_dataproc_batch` resource:

* `properties`: Optional. A mapping of property names to values, which are used to configure workload execution.

* `additional_properties`:

* `repository_config`: Configuration for dependency repositories

10 changes: 9 additions & 1 deletion docs/resources/google_dataproc_batches.md
@@ -23,8 +23,16 @@ A `google_dataproc_batches` is used to test a Google Batch resource

## Examples
```
-describe google_dataproc_batches(parent: ' value_parent') do
+describe google_dataproc_batches(parent: 'projects/*/locations/*') do
it { should exist }
its('names') { should include 'value_name' }
its('uuids') { should include 'value_uuid' }
its('create_times') { should include 'value_createtime' }
its('states') { should include 'value_state' }
its('state_messages') { should include 'value_statemessage' }
its('state_times') { should include 'value_statetime' }
its('creators') { should include 'value_creator' }
its('operations') { should include 'value_operation' }
end
```
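The pluralized properties above (`names`, `uuids`, `states`, ...) can be pictured as one array per field, collected across every batch row the API returns. A minimal illustrative sketch in plain Ruby (not the inspec-gcp implementation; the row data here is made up):

```ruby
# Each batch returned by the Dataproc API becomes one row; a plural InSpec
# resource exposes one array per field, built by mapping over the rows.
batches = [
  { name: 'projects/p/locations/us-central1/batches/b1', uuid: 'uuid-1', state: 'SUCCEEDED' },
  { name: 'projects/p/locations/us-central1/batches/b2', uuid: 'uuid-2', state: 'FAILED' },
]

names  = batches.map { |b| b[:name] }
states = batches.map { |b| b[:state] }

puts names.inspect
puts states.inspect
```

An `its('states') { should include 'SUCCEEDED' }` expectation is then just an inclusion check against one of these arrays.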

4 changes: 2 additions & 2 deletions libraries/google_dataproc_batch.rb
@@ -91,7 +91,7 @@ def exists?
end

def to_s
-  "Batch #{@params[:]}"
+  "Batch #{@params[:name]}"
end

private
@@ -101,6 +101,6 @@ def product_url(_ = nil)
end

def resource_base_url
-  '{{+name}}'
+  '{{name}}'
end
end
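The `resource_base_url` template above is expanded by substituting each `{{param}}` placeholder with the matching resource parameter (the real expansion lives in inspec-gcp's shared base class; the helper below is a hedged sketch of that behavior). It shows why the resource now expects `name` to be the full `projects/*/locations/*/batches/*` path:

```ruby
# Illustrative stand-in for inspec-gcp's URL templating: replace each
# {{param}} (or {{+param}}) placeholder with the value of that parameter.
def expand_base_url(template, params)
  template.gsub(/\{\{\+?(\w+)\}\}/) { params[Regexp.last_match(1).to_sym] }
end

url = expand_base_url('{{name}}',
                      name: 'projects/my-proj/locations/us-central1/batches/b1')
puts url
```

With the whole resource path carried in `name`, the template reduces to the parameter itself, so no extra path segments need to be appended.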
2 changes: 1 addition & 1 deletion libraries/google_dataproc_batches.rb
@@ -106,6 +106,6 @@ def product_url(_ = nil)
end

def resource_base_url
-  '{{+parent}}/batches'
+  '{{parent}}/batches'
end
end
27 changes: 27 additions & 0 deletions test/integration/build/gcp-mm.tf
@@ -269,6 +269,9 @@ variable "data_fusion_instance" {
variable "cloud_run_jobs" {
type = any
}
+variable "dataproc_serverless_batches" {
+  type = any
+}
resource "google_compute_ssl_policy" "custom-ssl-policy" {
name = var.ssl_policy["name"]
min_tls_version = var.ssl_policy["min_tls_version"]
@@ -2245,3 +2248,27 @@ resource "google_cloud_run_v2_job" "default" {
}
}
}
resource "google_dataproc_batch" "inspec_batch_spark" {
  batch_id = var.dataproc_serverless_batches.name
  location = var.dataproc_serverless_batches.location
  labels   = { "app" : "inspec" }
  project  = var.gcp_project_id

  runtime_config {
    properties = { "spark.dynamicAllocation.enabled" : "false", "spark.executor.instances" : "2" }
  }

  environment_config {
    execution_config {
      subnetwork_uri = "default"
      ttl            = "3600s"
      network_tags   = ["tag1"]
    }
  }

  spark_batch {
    main_class    = var.dataproc_serverless_batches.main_class
    args          = [var.dataproc_serverless_batches.args]
    jar_file_uris = [var.dataproc_serverless_batches.path]
  }
}
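Once this Terraform resource is applied, the `batch_id` and `location` it sets resolve to the fully qualified batch name the integration control queries. A small sketch of that name construction (project id taken from the test values used elsewhere in this commit):

```ruby
# The Dataproc API addresses a serverless batch by its full resource name,
# assembled from project, location, and batch_id.
project  = 'ppradhan'
location = 'us-central1'
batch_id = 'inspec-test-batch-0052'

name = "projects/#{project}/locations/#{location}/batches/#{batch_id}"
puts name
```

This is the same `projects/*/locations/*/batches/*` string passed as `name:` to `google_dataproc_batch` in the updated control.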
7 changes: 7 additions & 0 deletions test/integration/configuration/mm-attributes.yml
@@ -751,3 +751,10 @@ cloud_run_jobs:
location: "us-central1"
deletion_protection: "false"
image: "us-central1-docker.pkg.dev/ppradhan/nas/balasubs_tutorial1_20230915_182543:latest"

dataproc_serverless_batches:
name: "inspec-test-batch-0052"
location: "us-central1"
main_class: "org.apache.spark.examples.SparkPi"
args: "10"
path: "file:///usr/lib/spark/examples/jars/spark-examples.jar"
19 changes: 9 additions & 10 deletions test/integration/verify/controls/google_dataproc_batch.rb
@@ -16,16 +16,15 @@

gcp_project_id = input(:gcp_project_id, value: 'gcp_project_id', description: 'The GCP project identifier.')

-batch = input('batch', value: {
-  "name": "value_name",
-  "parent": "value_parent",
-  "uuid": "value_uuid",
-  "create_time": "value_createtime",
-  "state": "value_state",
-  "state_message": "value_statemessage",
-  "state_time": "value_statetime",
-  "creator": "value_creator",
-  "operation": "value_operation"
+batch = input('batch', value: {
+  "name": "projects/ppradhan/locations/us-central1/batches/inspec-test-batch-0052",
+  "parent": "projects/ppradhan/locations/us-central1",
+  "uuid": "5a1b8402-2aa5-4578-98ee-2ff12ff2a14e",
+  "create_time": "2024-10-15T06:42:29.671473Z",
+  "state": "SUCCEEDED",
+  "state_time": "2024-10-15T06:44:55.114445Z",
+  "creator": "bala-local@ppradhan.iam.gserviceaccount.com",
+  "operation": "projects/ppradhan/regions/us-central1/operations/19a2ac29-3564-49b8-8116-c36dd98d9cd5"
 }, description: 'batch description')
control 'google_dataproc_batch-1.0' do
impact 1.0
19 changes: 9 additions & 10 deletions test/integration/verify/controls/google_dataproc_batches.rb
@@ -16,16 +16,15 @@

gcp_project_id = input(:gcp_project_id, value: 'gcp_project_id', description: 'The GCP project identifier.')

-batch = input('batch', value: {
-  "name": "value_name",
-  "parent": "value_parent",
-  "uuid": "value_uuid",
-  "create_time": "value_createtime",
-  "state": "value_state",
-  "state_message": "value_statemessage",
-  "state_time": "value_statetime",
-  "creator": "value_creator",
-  "operation": "value_operation"
+batch = input('batch', value: {
+  "name": "projects/ppradhan/locations/us-central1/batches/inspec-test-batch-0052",
+  "parent": "projects/ppradhan/locations/us-central1",
+  "uuid": "5a1b8402-2aa5-4578-98ee-2ff12ff2a14e",
+  "create_time": "2024-10-15T06:42:29.671473Z",
+  "state": "SUCCEEDED",
+  "state_time": "2024-10-15T06:44:55.114445Z",
+  "creator": "bala-local@ppradhan.iam.gserviceaccount.com",
+  "operation": "projects/ppradhan/regions/us-central1/operations/19a2ac29-3564-49b8-8116-c36dd98d9cd5"
 }, description: 'batch description')
control 'google_dataproc_batches-1.0' do
impact 1.0
