diff --git a/src/e2e-test/features/bigquery/source/BigQuerySourceError.feature b/src/e2e-test/features/bigquery/source/BigQuerySourceError.feature
index eb475837ee..71626aff0c 100644
--- a/src/e2e-test/features/bigquery/source/BigQuerySourceError.feature
+++ b/src/e2e-test/features/bigquery/source/BigQuerySourceError.feature
@@ -55,3 +55,21 @@ Feature: BigQuery source - Validate BigQuery source plugin error scenarios
     Then Enter BigQuery source property table name
     Then Enter BigQuery property temporary bucket name "bqInvalidTemporaryBucket"
     Then Verify the BigQuery validation error message for invalid property "bucket"
+
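+  # bqIncorrectFormatStartDate and bqIncorrectFormatEndDate hold dd-MM-yyyy values, which the plugin
+  # rejects because partition dates must be in yyyy-MM-dd format.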
+  @BQ_SOURCE_TEST
+  Scenario: To verify error message when an unsupported format is provided in Partition Start Date and Partition End Date
+    Given Open Datafusion Project to configure pipeline
+    When Expand Plugin group in the LHS plugins list: "Source"
+    When Select plugin: "BigQuery" from the plugins list as: "Source"
+    Then Navigate to the properties page of plugin: "BigQuery"
+    Then Replace input plugin property: "project" with value: "projectId"
+    And Enter input plugin property: "referenceName" with value: "bqInvalidReferenceName"
+    Then Replace input plugin property: "dataset" with value: "dataset"
+    Then Replace input plugin property: "table" with value: "bqSourceTable"
+    And Enter input plugin property: "partitionFrom" with value: "bqIncorrectFormatStartDate"
+    And Enter input plugin property: "partitionTo" with value: "bqIncorrectFormatEndDate"
+    Then Click on the Get Schema button
+    And Click on the Validate button
+    Then Verify that the Plugin Property: "referenceName" is displaying an in-line error message: "errorMessageIncorrectReferenceName"
+    Then Verify that the Plugin Property: "partitionFrom" is displaying an in-line error message: "errorMessageIncorrectPartitionStartDate"
+    Then Verify that the Plugin Property: "partitionTo" is displaying an in-line error message: "errorMessageIncorrectPartitionEndDate"
diff --git a/src/e2e-test/features/bigquery/source/BigQueryToBigQuery.feature b/src/e2e-test/features/bigquery/source/BigQueryToBigQuery.feature
index 299a48125b..31568109b5 100644
--- a/src/e2e-test/features/bigquery/source/BigQueryToBigQuery.feature
+++ b/src/e2e-test/features/bigquery/source/BigQueryToBigQuery.feature
@@ -354,3 +354,34 @@ Feature: BigQuery source - Verification of BigQuery to BigQuery successful data
     Then Open and capture logs
     Then Verify the pipeline status is "Succeeded"
     Then Validate the values of records transferred to BQ sink is equal to the values from source BigQuery table
+
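+  # incorrectFilter resolves to "%%%%", which is not a valid filter expression, so the deployed run is
+  # expected to fail and log the message mapped to errorLogsMessageInvalidFilter.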
argument value "serviceAccountType" for key "serviceAccountType" + Then Enter runtime argument value "serviceAccount" for key "serviceAccount" + Then Enter runtime argument value "dataset" for key "bqDataset" + Then Enter runtime argument value for BigQuery source table name key "bqSourceTable" + Then Enter runtime argument value "projectId" for key "gcsProjectId" + Then Enter runtime argument value for GCS sink property path key "gcsSinkPath" + Then Enter runtime argument value "gcsPathDateSuffix" for key "gcsPathSuffix" + Then Enter runtime argument value "csvFormat" for key "gcsFormat" + Then Enter runtime argument value "cmekGCS" for GCS cmek property key "cmekGCS" if GCS cmek is enabled + Then Run the preview of pipeline with runtime arguments + Then Wait till pipeline preview is in running state + Then Open and capture pipeline preview logs + Then Verify the preview run status of pipeline in the logs is "succeeded" + Then Close the pipeline logs + Then Click on preview data for GCS sink + Then Close the preview data + Then Deploy the pipeline + Then Run the Pipeline in Runtime + Then Enter runtime argument value "projectId" for key "bqProjectId" + Then Enter runtime argument value "projectId" for key "bqDatasetProjectId" + Then Enter runtime argument value "partitionFrom" for key "bqStartDate" + Then Enter runtime argument value "partitionTo" for key "bqEndDate" + Then Enter runtime argument value "serviceAccountType" for key "serviceAccountType" + Then Enter runtime argument value "serviceAccount" for key "serviceAccount" + Then Enter runtime argument value "dataset" for key "bqDataset" + Then Enter runtime argument value for BigQuery source table name key "bqSourceTable" + Then Enter runtime argument value "projectId" for key "gcsProjectId" + Then Enter runtime argument value for GCS sink property path key "gcsSinkPath" + Then Enter runtime argument value "gcsPathDateSuffix" for key "gcsPathSuffix" + Then Enter runtime argument value "csvFormat" for key "gcsFormat" + Then Enter runtime argument value "cmekGCS" for GCS cmek property key "cmekGCS" if GCS cmek is enabled + Then Run the Pipeline in Runtime with runtime arguments + Then Wait till pipeline is in running state + Then Open and capture logs + Then Verify the pipeline status is "Succeeded" + Then Verify data is transferred to target GCS bucket + Then Validate the cmek key "cmekGCS" of target GCS bucket if cmek is enabled + + @CMEK @BQ_SOURCE_TEST @GCS_SINK_TEST + Scenario:Validate successful records transfer from BigQuery to GCS with macro arguments for filter and Output Schema + Given Open Datafusion Project to configure pipeline + When Source is BigQuery + When Sink is GCS + Then Open BigQuery source properties + Then Enter BigQuery property reference name + Then Enter BigQuery property "projectId" as macro argument "bqProjectId" + Then Enter BigQuery property "datasetProjectId" as macro argument "bqDatasetProjectId" + Then Enter BigQuery property "filter" as macro argument "bqFilter" + Then Enter BigQuery property "serviceAccountType" as macro argument "serviceAccountType" + Then Enter BigQuery property "serviceAccountFilePath" as macro argument "serviceAccount" + Then Enter BigQuery property "serviceAccountJSON" as macro argument "serviceAccount" + Then Enter BigQuery property "dataset" as macro argument "bqDataset" + Then Enter BigQuery property "table" as macro argument "bqSourceTable" + Then Select Macro action of output schema property: "Output Schema-macro-input" and set the value to "bqOutputSchema" + Then Validate 
"BigQuery" plugin properties + Then Close the BigQuery properties + Then Open GCS sink properties + Then Enter GCS property reference name + Then Enter GCS property "projectId" as macro argument "gcsProjectId" + Then Enter GCS property "serviceAccountType" as macro argument "serviceAccountType" + Then Enter GCS property "serviceAccountFilePath" as macro argument "serviceAccount" + Then Enter GCS property "serviceAccountJSON" as macro argument "serviceAccount" + Then Enter GCS property "path" as macro argument "gcsSinkPath" + Then Enter GCS sink property "pathSuffix" as macro argument "gcsPathSuffix" + Then Enter GCS property "format" as macro argument "gcsFormat" + Then Enter GCS sink cmek property "encryptionKeyName" as macro argument "cmekGCS" if cmek is enabled + Then Validate "GCS" plugin properties + Then Close the GCS properties + Then Connect source as "BigQuery" and sink as "GCS" to establish connection + Then Save the pipeline + Then Preview and run the pipeline + Then Enter runtime argument value "projectId" for key "bqProjectId" + Then Enter runtime argument value "projectId" for key "bqDatasetProjectId" + Then Enter runtime argument value "filter" for key "bqFilter" + Then Enter runtime argument value "serviceAccountType" for key "serviceAccountType" + Then Enter runtime argument value "serviceAccount" for key "serviceAccount" + Then Enter runtime argument value "dataset" for key "bqDataset" + Then Enter runtime argument value for BigQuery source table name key "bqSourceTable" + Then Enter runtime argument value "OutputSchema" for key "bqOutputSchema" + Then Enter runtime argument value "projectId" for key "gcsProjectId" + Then Enter runtime argument value for GCS sink property path key "gcsSinkPath" + Then Enter runtime argument value "gcsPathDateSuffix" for key "gcsPathSuffix" + Then Enter runtime argument value "csvFormat" for key "gcsFormat" + Then Enter runtime argument value "cmekGCS" for GCS cmek property key "cmekGCS" if GCS cmek is enabled + Then Run the preview of pipeline with runtime arguments + Then Wait till pipeline preview is in running state + Then Open and capture pipeline preview logs + Then Verify the preview run status of pipeline in the logs is "succeeded" + Then Close the pipeline logs + Then Click on preview data for GCS sink + Then Close the preview data + Then Deploy the pipeline + Then Run the Pipeline in Runtime + Then Enter runtime argument value "projectId" for key "bqProjectId" + Then Enter runtime argument value "projectId" for key "bqDatasetProjectId" + Then Enter runtime argument value "filter" for key "bqFilter" + Then Enter runtime argument value "serviceAccountType" for key "serviceAccountType" + Then Enter runtime argument value "serviceAccount" for key "serviceAccount" + Then Enter runtime argument value "dataset" for key "bqDataset" + Then Enter runtime argument value for BigQuery source table name key "bqSourceTable" + Then Enter runtime argument value "OutputSchema" for key "bqOutputSchema" + Then Enter runtime argument value "projectId" for key "gcsProjectId" + Then Enter runtime argument value for GCS sink property path key "gcsSinkPath" + Then Enter runtime argument value "gcsPathDateSuffix" for key "gcsPathSuffix" + Then Enter runtime argument value "csvFormat" for key "gcsFormat" + Then Enter runtime argument value "cmekGCS" for GCS cmek property key "cmekGCS" if GCS cmek is enabled + Then Run the Pipeline in Runtime with runtime arguments + Then Wait till pipeline is in running state + Then Open and capture logs + Then Verify the 
pipeline status is "Succeeded" + Then Verify data is transferred to target GCS bucket + Then Validate the cmek key "cmekGCS" of target GCS bucket if cmek is enabled diff --git a/src/e2e-test/java/io/cdap/plugin/bigquery/stepsdesign/BigQueryBase.java b/src/e2e-test/java/io/cdap/plugin/bigquery/stepsdesign/BigQueryBase.java index d4ae865c8a..f12f6ff15c 100644 --- a/src/e2e-test/java/io/cdap/plugin/bigquery/stepsdesign/BigQueryBase.java +++ b/src/e2e-test/java/io/cdap/plugin/bigquery/stepsdesign/BigQueryBase.java @@ -90,18 +90,18 @@ public void getCountOfNoOfRecordsTransferredToTargetBigQueryTable() throws IOExc int countRecords = BigQueryClient.countBqQuery(TestSetupHooks.bqTargetTable); BeforeActions.scenario.write("**********No of Records Transferred******************:" + countRecords); Assert.assertEquals("Number of records transferred should be equal to records out ", - countRecords, recordOut()); + countRecords, recordOut()); } @Then("Validate records transferred to target table is equal to number of records from source table " + - "with filter {string}") + "with filter {string}") public void validateRecordsTransferredToTargetTableIsEqualToNumberOfRecordsFromSourceTableWithFilter(String filter) - throws IOException, InterruptedException { + throws IOException, InterruptedException { String projectId = (PluginPropertyUtils.pluginProp("projectId")); String datasetName = (PluginPropertyUtils.pluginProp("dataset")); int countRecordsTarget = BigQueryClient.countBqQuery(TestSetupHooks.bqTargetTable); String selectQuery = "SELECT count(*) FROM `" + projectId + "." + datasetName + "." + - TestSetupHooks.bqTargetTable + "` WHERE " + PluginPropertyUtils.pluginProp(filter); + TestSetupHooks.bqTargetTable + "` WHERE " + PluginPropertyUtils.pluginProp(filter); Optional result = BigQueryClient.getSoleQueryResult(selectQuery); int count = result.map(Integer::parseInt).orElse(0); BeforeActions.scenario.write("Number of records transferred with respect to filter:" + count); @@ -110,13 +110,13 @@ public void validateRecordsTransferredToTargetTableIsEqualToNumberOfRecordsFromS @Then("Validate partition date in output partitioned table") public void validatePartitionDateInOutputPartitionedTable() - throws IOException, InterruptedException { + throws IOException, InterruptedException { Optional result = BigQueryClient - .getSoleQueryResult("SELECT distinct _PARTITIONDATE as pt FROM `" + - (PluginPropertyUtils.pluginProp("projectId")) + "." + - (PluginPropertyUtils.pluginProp("dataset")) + "." + - TestSetupHooks.bqTargetTable + - "` WHERE _PARTITION_LOAD_TIME IS Not NULL ORDER BY _PARTITIONDATE DESC "); + .getSoleQueryResult("SELECT distinct _PARTITIONDATE as pt FROM `" + + (PluginPropertyUtils.pluginProp("projectId")) + "." + + (PluginPropertyUtils.pluginProp("dataset")) + "." + + TestSetupHooks.bqTargetTable + + "` WHERE _PARTITION_LOAD_TIME IS Not NULL ORDER BY _PARTITIONDATE DESC "); String outputDate = StringUtils.EMPTY; if (result.isPresent()) { outputDate = result.get(); @@ -136,10 +136,10 @@ public void validateTheRecordsAreNotCreatedInOutputTable() throws IOException, I public void validatePartitioningIsNotDoneOnTheOutputTable() { try { BigQueryClient.getSoleQueryResult("SELECT distinct _PARTITIONDATE as pt FROM `" + - (PluginPropertyUtils.pluginProp("projectId")) - + "." + (PluginPropertyUtils.pluginProp("dataset")) + "." + - TestSetupHooks.bqTargetTable - + "` WHERE _PARTITION_LOAD_TIME IS Not NULL "); + (PluginPropertyUtils.pluginProp("projectId")) + + "." 
+ (PluginPropertyUtils.pluginProp("dataset")) + "." + + TestSetupHooks.bqTargetTable + + "` WHERE _PARTITION_LOAD_TIME IS Not NULL "); } catch (Exception e) { String partitionException = e.toString(); Assert.assertTrue(partitionException.contains("Unrecognized name: _PARTITION_LOAD_TIME")); @@ -168,8 +168,8 @@ public void validateTheCmekKeyOfTargetBigQueryTableIfCmekIsEnabled(String cmek) String cmekBQ = PluginPropertyUtils.pluginProp(cmek); if (cmekBQ != null) { Assert.assertTrue("Cmek key of target BigQuery table should be equal to " + - "cmek key provided in config file", - BigQueryClient.verifyCmekKey(TestSetupHooks.bqTargetTable, cmekBQ)); + "cmek key provided in config file", + BigQueryClient.verifyCmekKey(TestSetupHooks.bqTargetTable, cmekBQ)); return; } BeforeActions.scenario.write("CMEK not enabled"); @@ -204,13 +204,13 @@ public void enterRuntimeArgumentValueForBigQueryCmekPropertyKeyIfBQCmekIsEnabled @Then("Verify the partition table is created with partitioned on field {string}") public void verifyThePartitionTableIsCreatedWithPartitionedOnField(String partitioningField) throws IOException, - InterruptedException { + InterruptedException { Optional result = BigQueryClient - .getSoleQueryResult("SELECT IS_PARTITIONING_COLUMN FROM `" + - (PluginPropertyUtils.pluginProp("projectId")) + "." - + (PluginPropertyUtils.pluginProp("dataset")) + ".INFORMATION_SCHEMA.COLUMNS` " + - "WHERE table_name = '" + TestSetupHooks.bqTargetTable - + "' and column_name = '" + PluginPropertyUtils.pluginProp(partitioningField) + "' "); + .getSoleQueryResult("SELECT IS_PARTITIONING_COLUMN FROM `" + + (PluginPropertyUtils.pluginProp("projectId")) + "." + + (PluginPropertyUtils.pluginProp("dataset")) + ".INFORMATION_SCHEMA.COLUMNS` " + + "WHERE table_name = '" + TestSetupHooks.bqTargetTable + + "' and column_name = '" + PluginPropertyUtils.pluginProp(partitioningField) + "' "); String isPartitioningDoneOnField = StringUtils.EMPTY; if (result.isPresent()) { isPartitioningDoneOnField = result.get(); @@ -230,16 +230,16 @@ public void verifyTheBigQueryValidationErrorMessageForInvalidProperty(String pro String expectedErrorMessage; if (property.equalsIgnoreCase("gcsChunkSize")) { expectedErrorMessage = PluginPropertyUtils - .errorProp(E2ETestConstants.ERROR_MSG_BQ_INCORRECT_CHUNKSIZE); + .errorProp(E2ETestConstants.ERROR_MSG_BQ_INCORRECT_CHUNKSIZE); } else if (property.equalsIgnoreCase("bucket")) { expectedErrorMessage = PluginPropertyUtils - .errorProp(E2ETestConstants.ERROR_MSG_BQ_INCORRECT_TEMPORARY_BUCKET); + .errorProp(E2ETestConstants.ERROR_MSG_BQ_INCORRECT_TEMPORARY_BUCKET); } else if (property.equalsIgnoreCase("table")) { expectedErrorMessage = PluginPropertyUtils - .errorProp(E2ETestConstants.ERROR_MSG_INCORRECT_TABLE_NAME); + .errorProp(E2ETestConstants.ERROR_MSG_INCORRECT_TABLE_NAME); } else { expectedErrorMessage = PluginPropertyUtils.errorProp(E2ETestConstants.ERROR_MSG_BQ_INCORRECT_PROPERTY). 
- replaceAll("PROPERTY", property.substring(0, 1).toUpperCase() + property.substring(1)); + replaceAll("PROPERTY", property.substring(0, 1).toUpperCase() + property.substring(1)); } String actualErrorMessage = PluginPropertyUtils.findPropertyErrorElement(property).getText(); Assert.assertEquals(expectedErrorMessage, actualErrorMessage); @@ -250,14 +250,20 @@ public void verifyTheBigQueryValidationErrorMessageForInvalidProperty(String pro @Then("Validate records transferred to target table is equal to number of records from source table") public void validateRecordsTransferredToTargetTableIsEqualToNumberOfRecordsFromSourceTable() - throws IOException, InterruptedException { + throws IOException, InterruptedException { int countRecordsTarget = BigQueryClient.countBqQuery(TestSetupHooks.bqTargetTable); Optional result = BigQueryClient.getSoleQueryResult("SELECT count(*) FROM `" + - (PluginPropertyUtils.pluginProp("projectId")) - + "." + (PluginPropertyUtils.pluginProp - ("dataset")) + "." + TestSetupHooks.bqTargetTable + "` "); + (PluginPropertyUtils.pluginProp("projectId")) + + "." + (PluginPropertyUtils.pluginProp + ("dataset")) + "." + TestSetupHooks.bqTargetTable + "` "); int count = result.map(Integer::parseInt).orElse(0); BeforeActions.scenario.write("Number of records transferred from source table to target table:" + count); Assert.assertEquals(count, countRecordsTarget); } + + @Then("Enter BigQuery source properties filter") + public void enterBigQuerysourcePropertiesfilter() throws IOException { + CdfBigQueryPropertiesActions.enterFilter("%%%%"); + } + } diff --git a/src/e2e-test/java/io/cdap/plugin/utils/CdfPluginPropertyLocator.java b/src/e2e-test/java/io/cdap/plugin/utils/CdfPluginPropertyLocator.java index 297c623838..10a848a9bd 100644 --- a/src/e2e-test/java/io/cdap/plugin/utils/CdfPluginPropertyLocator.java +++ b/src/e2e-test/java/io/cdap/plugin/utils/CdfPluginPropertyLocator.java @@ -36,7 +36,11 @@ public enum CdfPluginPropertyLocator { GCS_CREATE_OBJECTS_TO_CREATE("paths"), GCS_CREATE_FAIL_IF_OBJECT_EXISTS("failIfExists"), GCS_MOVE_SOURCE_PATH("sourcePath"), - GCS_MOVE_DESTINATION_PATH("destPath"); + GCS_MOVE_DESTINATION_PATH("destPath"), + PARTITION_START_DATE("partitionFrom"), + PARTITION_END_DATE("partitionTo"), + FILTER("filter"), + OUTPUT_SCHEMA("Output Schema-macro-input"); public String pluginProperty; CdfPluginPropertyLocator(String property) { @@ -74,6 +78,10 @@ public enum CdfPluginPropertyLocator { .put("createFailIfObjectExists", CdfPluginPropertyLocator.GCS_CREATE_FAIL_IF_OBJECT_EXISTS) .put("gcsMoveSourcePath", CdfPluginPropertyLocator.GCS_MOVE_SOURCE_PATH) .put("gcsMoveDestinationPath", CdfPluginPropertyLocator.GCS_MOVE_DESTINATION_PATH) + .put("filter", CdfPluginPropertyLocator.FILTER) + .put("Output Schema-macro-input", CdfPluginPropertyLocator.OUTPUT_SCHEMA) + .put("partitionFrom", CdfPluginPropertyLocator.PARTITION_START_DATE) + .put("partitionTo", CdfPluginPropertyLocator.PARTITION_END_DATE) .build(); } diff --git a/src/e2e-test/java/io/cdap/plugin/utils/E2ETestConstants.java b/src/e2e-test/java/io/cdap/plugin/utils/E2ETestConstants.java index 4fb86da3b4..1e8ea9ba81 100644 --- a/src/e2e-test/java/io/cdap/plugin/utils/E2ETestConstants.java +++ b/src/e2e-test/java/io/cdap/plugin/utils/E2ETestConstants.java @@ -17,4 +17,8 @@ public class E2ETestConstants { public static final String ERROR_MSG_BQ_INCORRECT_CHUNKSIZE = "errorMessageIncorrectBQChunkSize"; public static final String ERROR_MSG_BQ_INCORRECT_TEMPORARY_BUCKET = "errorMessageIncorrectBQBucketName"; 
+bqStartDate=2025-01-02
+bqEndDate=2025-01-03
+partitionFrom=2025-01-02
+partitionTo=2025-01-03
+filter=Id=20
+bqInvalidReferenceName=invalidRef&^*&&*
+OutputSchema={ "type": "record", "name": "text", "fields": [{ "name": "Id", "type": "long" }, { "name": "Value", "type": "long" }, \
+  { "name": "UID", "type": "string" } ] }
+incorrectFilter=%%%%
+bqIncorrectFormatStartDate=02-01-2025
+bqIncorrectFormatEndDate=03-01-2025
 ## BQMT-PLUGIN-PROPERTIES-END
 
 ##CLOUDBIGTABLE-PLUGIN-PROPERTIES-START