Brazil visdata - Unified Social Assistance System Readme data #1067

Open
wants to merge 13 commits into base: master

Changes according to commnts

4c471dc
Google Cloud Build / data-pull-request-py (datcom-ci) succeeded Sep 13, 2024 in 6m 38s

Summary

Build Information

Trigger data-pull-request-py
Build 35718f42-6c7b-452c-a741-a253dc623b00
Start 2024-09-13T07:25:44-07:00
Duration 6m36.837s
Status SUCCESS

Steps

Step Status Duration
python_install SUCCESS 3m0.133s
python_test SUCCESS 3m10.88s
python_format_check SUCCESS 1m36.638s

Details

starting build "35718f42-6c7b-452c-a741-a253dc623b00"

FETCHSOURCE
hint: Using 'master' as the name for the initial branch. This default branch name
hint: is subject to change. To configure the initial branch name to use in all
hint: of your new repositories, which will suppress this warning, call:
hint:
hint: 	git config --global init.defaultBranch <name>
hint:
hint: Names commonly chosen instead of 'master' are 'main', 'trunk' and
hint: 'development'. The just-created branch can be renamed via this command:
hint:
hint: 	git branch -m <name>
Initialized empty Git repository in /workspace/.git/
From https://github.com/datacommonsorg/data
 * branch            4c471dc43f811c407ad510be4c3cdf7a8b26e89e -> FETCH_HEAD
Updating files: 100% (3098/3098), done.
HEAD is now at 4c471dc Changes according to commnts
BUILD
Starting Step #0 - "python_install"
Step #0 - "python_install": Pulling image: python:3.11
Step #0 - "python_install": 3.11: Pulling from library/python
Step #0 - "python_install": Digest: sha256:157a371e60389919fe4a72dff71ce86eaa5234f59114c23b0b346d0d02c74d39
Step #0 - "python_install": Status: Downloaded newer image for python:3.11
Step #0 - "python_install": docker.io/library/python:3.11
Step #0 - "python_install": ### Installing Python requirements
Step #0 - "python_install": Installing Python requirements
Step #0 - "python_install": 
Step #0 - "python_install": [notice] A new release of pip is available: 24.0 -> 24.2
Step #0 - "python_install": [notice] To update, run: pip install --upgrade pip
Finished Step #0 - "python_install"
Starting Step #1 - "python_test"
Starting Step #2 - "python_format_check"
Step #2 - "python_format_check": Already have image (with digest): python:3.11
Step #1 - "python_test": Already have image (with digest): python:3.11
Step #2 - "python_format_check": ### Testing lint
Step #1 - "python_test": ### Running Python tests in util/
Step #2 - "python_format_check": Installing Python requirements
Step #1 - "python_test": Installing Python requirements
Step #2 - "python_format_check": 
Step #2 - "python_format_check": [notice] A new release of pip is available: 24.0 -> 24.2
Step #2 - "python_format_check": [notice] To update, run: pip install --upgrade pip
Step #2 - "python_format_check": #### Testing Python lint
Step #1 - "python_test": 
Step #1 - "python_test": [notice] A new release of pip is available: 24.0 -> 24.2
Step #1 - "python_test": [notice] To update, run: pip install --upgrade pip
Step #1 - "python_test": #### Testing Python code in util/
Step #1 - "python_test": test_aggregate_dict (aggregation_util_test.AggregationUtilTest.test_aggregate_dict) ... ok
Step #1 - "python_test": test_aggregate_value (aggregation_util_test.AggregationUtilTest.test_aggregate_value) ... ok
Step #1 - "python_test": test_config_map_with_override (config_map_test.TestConfigMap.test_config_map_with_override) ... ok
Step #1 - "python_test": test_load_config_file (config_map_test.TestConfigMap.test_load_config_file)
Step #1 - "python_test": Test loading of config dictionary from a file. ... ok
Step #1 - "python_test": test_set_config (config_map_test.TestConfigMap.test_set_config) ... ok
Step #1 - "python_test": test_update_config (config_map_test.TestConfigMap.test_update_config) ... ok
Step #1 - "python_test": test_add_counter (counters_test.TestCounters.test_add_counter)
Step #1 - "python_test": Verify increment and decrement counters. ... Counters:
Step #1 - "python_test":                                        test_inputs =         10
Step #1 - "python_test":                          test_process_elapsed_time =       0.00
Step #1 - "python_test":                                     test_processed =          0
Step #1 - "python_test":                                    test_start_time =     250.52
Step #1 - "python_test": ok
Step #1 - "python_test": test_counter_dict (counters_test.TestCounters.test_counter_dict)
Step #1 - "python_test": Verify counter dict is shared across counters. ... Counters:
Step #1 - "python_test":                               process_elapsed_time =       0.00
Step #1 - "python_test":                                          processed =          0
Step #1 - "python_test":                                         start_time =     250.53
Step #1 - "python_test":                                           test_ctr =          1
Step #1 - "python_test": ok
Step #1 - "python_test": test_debug_counters (counters_test.TestCounters.test_debug_counters)
Step #1 - "python_test": Verify counters with debug string suffixes. ... Counters:
Step #1 - "python_test":                                       test3_inputs =         10
Step #1 - "python_test":                           test3_inputs_test-case-2 =         10
Step #1 - "python_test":                         test3_process_elapsed_time =       0.00
Step #1 - "python_test":                                    test3_processed =          0
Step #1 - "python_test":                                   test3_start_time =     250.53
Step #1 - "python_test": ok
Step #1 - "python_test": test_set_counter (counters_test.TestCounters.test_set_counter)
Step #1 - "python_test": Verify set_counter overrides current value. ... Counters:
Step #1 - "python_test":                                        test2_lines =          1
Step #1 - "python_test":                                  test2_lines_file1 =          1
Step #1 - "python_test":                         test2_process_elapsed_time =       0.00
Step #1 - "python_test":                                    test2_processed =          0
Step #1 - "python_test":                                   test2_start_time =     250.53
Step #1 - "python_test": Counters:
Step #1 - "python_test":                                        test2_lines =         11
Step #1 - "python_test":                                  test2_lines_file1 =         11
Step #1 - "python_test":                         test2_process_elapsed_time =       0.00
Step #1 - "python_test":                                    test2_processed =          0
Step #1 - "python_test":                                   test2_start_time =     250.53
Step #1 - "python_test": ok
Step #1 - "python_test": test_show_counters (counters_test.TestCounters.test_show_counters) ... Counters:
Step #1 - "python_test":                                     test-file-rows =        100
Step #1 - "python_test":                          test-process_elapsed_time =       0.00
Step #1 - "python_test":                        test-process_remaining_time = 1000000.00
Step #1 - "python_test":                                     test-read-rows =          0
Step #1 - "python_test":                                    test-start_time =     250.53
Step #1 - "python_test": Counters:
Step #1 - "python_test":                                     test-file-rows =        100
Step #1 - "python_test":                          test-process_elapsed_time =       0.00
Step #1 - "python_test":                        test-process_remaining_time =       0.01
Step #1 - "python_test":                               test-processing_rate =   15603.96
Step #1 - "python_test":                                     test-read-rows =         10
Step #1 - "python_test":                                    test-start_time =     250.53
Step #1 - "python_test": ok
Step #1 - "python_test": test_dc_api_batched_wrapper (dc_api_wrapper_test.TestDCAPIWrapper.test_dc_api_batched_wrapper)
Step #1 - "python_test": Test DC API wrapper for batched calls. ... ok
Step #1 - "python_test": test_dc_api_is_defined_dcid (dc_api_wrapper_test.TestDCAPIWrapper.test_dc_api_is_defined_dcid)
Step #1 - "python_test": Test API wrapper for defined DCIDs. ... ok
Step #1 - "python_test": test_dc_api_wrapper (dc_api_wrapper_test.TestDCAPIWrapper.test_dc_api_wrapper)
Step #1 - "python_test": Test the wrapper for DC API. ... ok
Step #1 - "python_test": test_dc_get_node_property_values (dc_api_wrapper_test.TestDCAPIWrapper.test_dc_get_node_property_values)
Step #1 - "python_test": Test API wrapper to get all property:values for a node. ... ok
Step #1 - "python_test": test_download_file (download_util_test.TestCounters.test_download_file) ... ok
Step #1 - "python_test": test_prefilled_url (download_util_test.TestCounters.test_prefilled_url) ... ok
Step #1 - "python_test": test_request_url (download_util_test.TestCounters.test_request_url) ... ok
Step #1 - "python_test": test_read_write (file_util_test.FileIOTest.test_read_write) ... ok
Step #1 - "python_test": test_file_get_estimate_num_rows (file_util_test.FileUtilsTest.test_file_get_estimate_num_rows) ... ok
Step #1 - "python_test": test_file_get_matching (file_util_test.FileUtilsTest.test_file_get_matching) ... ok
Step #1 - "python_test": test_file_load_csv_dict (file_util_test.FileUtilsTest.test_file_load_csv_dict) ... ok
Step #1 - "python_test": test_file_type (file_util_test.FileUtilsTest.test_file_type) ... ok
Step #1 - "python_test": test_file_write_load_py_dict (file_util_test.FileUtilsTest.test_file_write_load_py_dict) ... ok
Step #1 - "python_test": test_aa2 (latlng2place_mapsapi_test.Latlng2PlaceMapsAPITest.test_aa2) ... ok
Step #1 - "python_test": test_country (latlng2place_mapsapi_test.Latlng2PlaceMapsAPITest.test_country) ... ok
Step #1 - "python_test": test_main (latlng_recon_geojson_test.LatlngReconGeojsonTest.test_main) ... ok
Step #1 - "python_test": test_basic (latlng_recon_service_test.LatlngReconServiceTest.test_basic) ... /usr/local/lib/python3.11/concurrent/futures/thread.py:58: ResourceWarning: unclosed <ssl.SSLSocket fd=3, family=2, type=1, proto=6, laddr=('192.168.10.3', 40070), raddr=('34.49.176.91', 443)>
Step #1 - "python_test":   result = self.fn(*self.args, **self.kwargs)
Step #1 - "python_test": ResourceWarning: Enable tracemalloc to get the object allocation traceback
Step #1 - "python_test": ok
Step #1 - "python_test": test_filter (latlng_recon_service_test.LatlngReconServiceTest.test_filter) ... /usr/local/lib/python3.11/concurrent/futures/thread.py:58: ResourceWarning: unclosed <ssl.SSLSocket fd=3, family=2, type=1, proto=6, laddr=('192.168.10.3', 40080), raddr=('34.49.176.91', 443)>
Step #1 - "python_test":   result = self.fn(*self.args, **self.kwargs)
Step #1 - "python_test": ResourceWarning: Enable tracemalloc to get the object allocation traceback
Step #1 - "python_test": ok
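The two unclosed-SSLSocket ResourceWarnings above only say "Enable tracemalloc to get the object allocation traceback". A minimal way to do that for a test run; nothing here is specific to latlng_recon_service_test, the snippet just shows the stdlib switch the warning refers to:

    import tracemalloc
    import warnings

    # Keep 10 frames of allocation history so each ResourceWarning is reported
    # with a traceback to where the leaked socket or file was created.
    tracemalloc.start(10)
    warnings.simplefilter("always", ResourceWarning)

The same effect is available from the command line with python -X tracemalloc=10.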
Step #1 - "python_test": test_dict_list_to_mcf_str (mcf_dict_util_test.TestMCFDict.test_dict_list_to_mcf_str) ... ok
Step #1 - "python_test": test_drop_nodes (mcf_dict_util_test.TestMCFDict.test_drop_nodes) ... ok
Step #1 - "python_test": test_get_dcid_node (mcf_dict_util_test.TestMCFDict.test_get_dcid_node) ... ok
Step #1 - "python_test": test_mcf_dict_rename_namespace (mcf_dict_util_test.TestMCFDict.test_mcf_dict_rename_namespace) ... ok
Step #1 - "python_test": test_mcf_dict_rename_prop (mcf_dict_util_test.TestMCFDict.test_mcf_dict_rename_prop) ... ok
Step #1 - "python_test": test_mcf_dict_rename_prop_value (mcf_dict_util_test.TestMCFDict.test_mcf_dict_rename_prop_value) ... ok
Step #1 - "python_test": test_mcf_to_dict_list (mcf_dict_util_test.TestMCFDict.test_mcf_to_dict_list) ... ok
Step #1 - "python_test": test_node_list_check_existence_dc (mcf_dict_util_test.TestMCFDict.test_node_list_check_existence_dc) ... ok
Step #1 - "python_test": test_node_list_check_existence_node_list (mcf_dict_util_test.TestMCFDict.test_node_list_check_existence_node_list) ... ok
Step #1 - "python_test": test_example_usage (mcf_template_filler_test.MCFTemplateFillerTest.test_example_usage) ... ok
Step #1 - "python_test": test_pop_and_2_obs_with_all_pv (mcf_template_filler_test.MCFTemplateFillerTest.test_pop_and_2_obs_with_all_pv)
Step #1 - "python_test": Use separate templates for Pop Obs, and use Obs template repeatedly. ... ok
Step #1 - "python_test": test_pop_with_missing_req_pv (mcf_template_filler_test.MCFTemplateFillerTest.test_pop_with_missing_req_pv) ... ok
Step #1 - "python_test": test_require_node_name (mcf_template_filler_test.MCFTemplateFillerTest.test_require_node_name) ... ok
Step #1 - "python_test": test_unified_pop_obs_with_missing_optional_pv (mcf_template_filler_test.MCFTemplateFillerTest.test_unified_pop_obs_with_missing_optional_pv) ... ok
Step #1 - "python_test": test_place_id_resolution_by_name (state_division_to_dcid_test.PlaceMapTest.test_place_id_resolution_by_name) ... ok
Step #1 - "python_test": test_boolean_naming (statvar_dcid_generator_test.TestStatVarDcidGenerator.test_boolean_naming) ... ok
Step #1 - "python_test": test_double_underscore (statvar_dcid_generator_test.TestStatVarDcidGenerator.test_double_underscore) ... ok
Step #1 - "python_test": test_ignore_props (statvar_dcid_generator_test.TestStatVarDcidGenerator.test_ignore_props) ... ok
Step #1 - "python_test": test_legacy_mapping (statvar_dcid_generator_test.TestStatVarDcidGenerator.test_legacy_mapping) ... ok
Step #1 - "python_test": test_measured_property (statvar_dcid_generator_test.TestStatVarDcidGenerator.test_measured_property) ... ok
Step #1 - "python_test": test_measurement_constraint_removal (statvar_dcid_generator_test.TestStatVarDcidGenerator.test_measurement_constraint_removal) ... ok
Step #1 - "python_test": test_measurement_denominator (statvar_dcid_generator_test.TestStatVarDcidGenerator.test_measurement_denominator) ... ok
Step #1 - "python_test": test_measurement_qualifier (statvar_dcid_generator_test.TestStatVarDcidGenerator.test_measurement_qualifier) ... ok
Step #1 - "python_test": test_naics_name_generation (statvar_dcid_generator_test.TestStatVarDcidGenerator.test_naics_name_generation) ... ok
Step #1 - "python_test": test_namespace_removal (statvar_dcid_generator_test.TestStatVarDcidGenerator.test_namespace_removal) ... ok
Step #1 - "python_test": test_prepend_append_replace (statvar_dcid_generator_test.TestStatVarDcidGenerator.test_prepend_append_replace) ... ok
Step #1 - "python_test": test_quantity_name_generation (statvar_dcid_generator_test.TestStatVarDcidGenerator.test_quantity_name_generation) ... ok
Step #1 - "python_test": test_quantity_range_name_generation (statvar_dcid_generator_test.TestStatVarDcidGenerator.test_quantity_range_name_generation) ... ok
Step #1 - "python_test": test_soc_map (statvar_dcid_generator_test.TestStatVarDcidGenerator.test_soc_map) ... ok
Step #1 - "python_test": test_soc_name_generation (statvar_dcid_generator_test.TestStatVarDcidGenerator.test_soc_name_generation) ... ok
Step #1 - "python_test": test_sorted_constraints (statvar_dcid_generator_test.TestStatVarDcidGenerator.test_sorted_constraints) ... ok
Step #1 - "python_test": test_stat_type (statvar_dcid_generator_test.TestStatVarDcidGenerator.test_stat_type) ... ok
Step #1 - "python_test": 
Step #1 - "python_test": ----------------------------------------------------------------------
Step #1 - "python_test": Ran 61 tests in 4.261s
Step #1 - "python_test": 
Step #1 - "python_test": OK
Step #1 - "python_test": ### Running Python tests in import-automation/executor
Step #1 - "python_test": Installing Python requirements
Step #1 - "python_test": 
Step #1 - "python_test": [notice] A new release of pip is available: 24.0 -> 24.2
Step #1 - "python_test": [notice] To update, run: pip install --upgrade pip
Step #1 - "python_test": #### Testing Python code in import-automation/executor
Step #1 - "python_test": test_appengine_job_request (test.cloud_scheduler_test.CloudSchedulerTest.test_appengine_job_request) ... ok
Step #1 - "python_test": test_http_job_request (test.cloud_scheduler_test.CloudSchedulerTest.test_http_job_request) ... ok
Step #1 - "python_test": test.file_uploader_test (unittest.loader._FailedTest.test.file_uploader_test) ... ERROR
Step #1 - "python_test": test.github_api_test (unittest.loader._FailedTest.test.github_api_test) ... ERROR
Step #1 - "python_test": test_clean_time (test.import_executor_test.ImportExecutorTest.test_clean_time) ... ok
Step #1 - "python_test": test_construct_process_message (test.import_executor_test.ImportExecutorTest.test_construct_process_message) ... ok
Step #1 - "python_test": test_construct_process_message_no_output (test.import_executor_test.ImportExecutorTest.test_construct_process_message_no_output)
Step #1 - "python_test": Tests that _construct_process_message does not append ... ok
Step #1 - "python_test": test_create_venv (test.import_executor_test.ImportExecutorTest.test_create_venv) ... ok
Step #1 - "python_test": test_run_and_handle_exception (test.import_executor_test.ImportExecutorTest.test_run_and_handle_exception) ... ERROR:root:An unexpected exception was thrown
Step #1 - "python_test": Traceback (most recent call last):
Step #1 - "python_test":   File "/workspace/import-automation/executor/app/executor/import_executor.py", line 489, in run_and_handle_exception
Step #1 - "python_test":     return exec_func(*args)
Step #1 - "python_test":            ^^^^^^^^^^^^^^^^
Step #1 - "python_test": TypeError: 'str' object is not callable
Step #1 - "python_test": FAIL
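This FAIL comes from a test that deliberately passes a string where a callable is expected, so exec_func(*args) raises TypeError. A minimal sketch of the wrap-and-log pattern the traceback points at; the signature and return value of the real import_executor.run_and_handle_exception are assumptions here:

    import logging

    def run_and_handle_exception(exec_func, *args):
        # Run exec_func(*args) and log any exception instead of letting it
        # propagate; a non-callable exec_func (e.g. a str) fails inside the
        # try block and is reported the same way.
        try:
            return True, exec_func(*args)
        except Exception:
            logging.exception('An unexpected exception was thrown')
            return False, None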
Step #1 - "python_test": test_run_with_timeout (test.import_executor_test.ImportExecutorTest.test_run_with_timeout) ... ERROR:root:An unexpected exception was thrown: Command '['sleep', '5']' timed out after 0.1 seconds when running ['sleep', '5']: Traceback (most recent call last):
Step #1 - "python_test":   File "/workspace/import-automation/executor/app/executor/import_executor.py", line 588, in _run_with_timeout
Step #1 - "python_test":     process = subprocess.run(args,
Step #1 - "python_test":               ^^^^^^^^^^^^^^^^^^^^
Step #1 - "python_test":   File "/usr/local/lib/python3.11/subprocess.py", line 550, in run
Step #1 - "python_test":     stdout, stderr = process.communicate(input, timeout=timeout)
Step #1 - "python_test":                      ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Step #1 - "python_test":   File "/usr/local/lib/python3.11/subprocess.py", line 1209, in communicate
Step #1 - "python_test":     stdout, stderr = self._communicate(input, endtime, timeout)
Step #1 - "python_test":                      ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Step #1 - "python_test":   File "/usr/local/lib/python3.11/subprocess.py", line 2116, in _communicate
Step #1 - "python_test":     self._check_timeout(endtime, orig_timeout, stdout, stderr)
Step #1 - "python_test":   File "/usr/local/lib/python3.11/subprocess.py", line 1253, in _check_timeout
Step #1 - "python_test":     raise TimeoutExpired(
Step #1 - "python_test": subprocess.TimeoutExpired: Command '['sleep', '5']' timed out after 0.1 seconds
Step #1 - "python_test": Traceback (most recent call last):
Step #1 - "python_test":   File "/workspace/import-automation/executor/app/executor/import_executor.py", line 588, in _run_with_timeout
Step #1 - "python_test":     process = subprocess.run(args,
Step #1 - "python_test":               ^^^^^^^^^^^^^^^^^^^^
Step #1 - "python_test":   File "/usr/local/lib/python3.11/subprocess.py", line 550, in run
Step #1 - "python_test":     stdout, stderr = process.communicate(input, timeout=timeout)
Step #1 - "python_test":                      ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Step #1 - "python_test":   File "/usr/local/lib/python3.11/subprocess.py", line 1209, in communicate
Step #1 - "python_test":     stdout, stderr = self._communicate(input, endtime, timeout)
Step #1 - "python_test":                      ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Step #1 - "python_test":   File "/usr/local/lib/python3.11/subprocess.py", line 2116, in _communicate
Step #1 - "python_test":     self._check_timeout(endtime, orig_timeout, stdout, stderr)
Step #1 - "python_test":   File "/usr/local/lib/python3.11/subprocess.py", line 1253, in _check_timeout
Step #1 - "python_test":     raise TimeoutExpired(
Step #1 - "python_test": subprocess.TimeoutExpired: Command '['sleep', '5']' timed out after 0.1 seconds
Step #1 - "python_test": FAIL
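The traceback here is the expected path: subprocess.run raises subprocess.TimeoutExpired once the 0.1-second limit passes. A self-contained sketch of that mechanism; the real _run_with_timeout signature and logging are not shown in this log, so the wrapper below is illustrative only:

    import subprocess

    def run_with_timeout(args, timeout_secs):
        # subprocess.run kills the child and raises TimeoutExpired when the
        # command outlives timeout_secs, e.g. run_with_timeout(['sleep', '5'], 0.1).
        try:
            proc = subprocess.run(args, capture_output=True, text=True,
                                  timeout=timeout_secs)
            return proc.returncode, proc.stdout, proc.stderr
        except subprocess.TimeoutExpired as exc:
            raise RuntimeError(
                f"{exc.cmd!r} timed out after {exc.timeout} seconds") from exc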
Step #1 - "python_test": test_are_imports_finished (test.import_service_test.ImportServiceTest.test_are_imports_finished) ... ok
Step #1 - "python_test": test_block_on_import (test.import_service_test.ImportServiceTest.test_block_on_import) ... ok
Step #1 - "python_test": test_fix_input_path (test.import_service_test.ImportServiceTest.test_fix_input_path) ... ok
Step #1 - "python_test": test_format_import_info (test.import_service_test.ImportServiceTest.test_format_import_info) ... ok
Step #1 - "python_test": test_get_fixed_absolute_import_name (test.import_service_test.ImportServiceTest.test_get_fixed_absolute_import_name) ... ok
Step #1 - "python_test": test_get_import_id (test.import_service_test.ImportServiceTest.test_get_import_id) ... ok
Step #1 - "python_test": test_smart_import (test.import_service_test.ImportServiceTest.test_smart_import) ... ok
Step #1 - "python_test": test_absolute_import_name (test.import_target_test.ImportTargetTest.test_absolute_import_name) ... ok
Step #1 - "python_test": test_is_import_targetted_by_commit (test.import_target_test.ImportTargetTest.test_is_import_targetted_by_commit) ... ok
Step #1 - "python_test": test_parse_commit_message_targets (test.import_target_test.ImportTargetTest.test_parse_commit_message_targets) ... ok
Step #1 - "python_test": test.integration_test (unittest.loader._FailedTest.test.integration_test) ... ERROR
Step #1 - "python_test": test_download_file (test.utils_test.AppUtilsTest.test_download_file)
Step #1 - "python_test": Response does not have a Content-Disposition header. ... ok
Step #1 - "python_test": test_download_file_timeout (test.utils_test.AppUtilsTest.test_download_file_timeout)
Step #1 - "python_test": Raises requests.Timeout exception. ... ok
Step #1 - "python_test": test_get_filename (test.utils_test.AppUtilsTest.test_get_filename) ... ok
Step #1 - "python_test": test_get_filename_raise (test.utils_test.AppUtilsTest.test_get_filename_raise) ... ok
Step #1 - "python_test": test_pacific_time_to_datetime (test.utils_test.AppUtilsTest.test_pacific_time_to_datetime)
Step #1 - "python_test": Tests that the string returned by pacific_time can be converted to ... ok
Step #1 - "python_test": test_pacific_time_to_datetime_then_back (test.utils_test.AppUtilsTest.test_pacific_time_to_datetime_then_back)
Step #1 - "python_test": Tests that the string returned by pacific_time can be converted to ... ok
Step #1 - "python_test": test_compare_lines (test.utils_test.TestUtilsTest.test_compare_lines) ... ok
Step #1 - "python_test": test_import_spec_valid (test.validation_test.ValidationTest.test_import_spec_valid) ... ok
Step #1 - "python_test": test_import_spec_valid_fields_absent (test.validation_test.ValidationTest.test_import_spec_valid_fields_absent) ... ok
Step #1 - "python_test": test_import_spec_valid_script_not_exist (test.validation_test.ValidationTest.test_import_spec_valid_script_not_exist) ... ok
Step #1 - "python_test": test_import_targets_valid_absolute_names (test.validation_test.ValidationTest.test_import_targets_valid_absolute_names) ... ok
Step #1 - "python_test": test_import_targets_valid_manifest_not_exist (test.validation_test.ValidationTest.test_import_targets_valid_manifest_not_exist) ... ok
Step #1 - "python_test": test_import_targets_valid_name_not_exist (test.validation_test.ValidationTest.test_import_targets_valid_name_not_exist) ... ok
Step #1 - "python_test": test_import_targets_valid_relative_names (test.validation_test.ValidationTest.test_import_targets_valid_relative_names) ... ok
Step #1 - "python_test": test_import_targets_valid_relative_names_multiple_dirs (test.validation_test.ValidationTest.test_import_targets_valid_relative_names_multiple_dirs) ... ok
Step #1 - "python_test": test_manifest_valid_fields_absent (test.validation_test.ValidationTest.test_manifest_valid_fields_absent) ... ok
Step #1 - "python_test": 
Step #1 - "python_test": ======================================================================
Step #1 - "python_test": ERROR: test.file_uploader_test (unittest.loader._FailedTest.test.file_uploader_test)
Step #1 - "python_test": ----------------------------------------------------------------------
Step #1 - "python_test": ImportError: Failed to import test module: test.file_uploader_test
Step #1 - "python_test": Traceback (most recent call last):
Step #1 - "python_test":   File "/usr/local/lib/python3.11/unittest/loader.py", line 419, in _find_test_path
Step #1 - "python_test":     module = self._get_module_from_name(name)
Step #1 - "python_test":              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Step #1 - "python_test":   File "/usr/local/lib/python3.11/unittest/loader.py", line 362, in _get_module_from_name
Step #1 - "python_test":     __import__(name)
Step #1 - "python_test":   File "/workspace/import-automation/executor/test/file_uploader_test.py", line 22, in <module>
Step #1 - "python_test":     from test import integration_test
Step #1 - "python_test":   File "/workspace/import-automation/executor/test/integration_test.py", line 30, in <module>
Step #1 - "python_test":     'github_repo_owner_username': os.environ['_GITHUB_REPO_OWNER_USERNAME'],
Step #1 - "python_test":                                   ~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Step #1 - "python_test":   File "<frozen os>", line 679, in __getitem__
Step #1 - "python_test": KeyError: '_GITHUB_REPO_OWNER_USERNAME'
Step #1 - "python_test": 
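This ImportError (and the two identical ones that follow) comes from a module-level os.environ['_GITHUB_REPO_OWNER_USERNAME'] lookup failing because that variable is not set in this build. A common guard is to read the variable lazily and skip when it is absent; the skip logic below is an illustrative pattern, not the repository's code:

    import os
    import unittest

    # .get() returns None instead of raising KeyError at import time.
    _REPO_OWNER = os.environ.get('_GITHUB_REPO_OWNER_USERNAME')

    @unittest.skipUnless(_REPO_OWNER, '_GITHUB_REPO_OWNER_USERNAME not set')
    class IntegrationTest(unittest.TestCase):

        def test_repo_owner_available(self):
            self.assertTrue(_REPO_OWNER)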
Step #1 - "python_test": 
Step #1 - "python_test": ======================================================================
Step #1 - "python_test": ERROR: test.github_api_test (unittest.loader._FailedTest.test.github_api_test)
Step #1 - "python_test": ----------------------------------------------------------------------
Step #1 - "python_test": ImportError: Failed to import test module: test.github_api_test
Step #1 - "python_test": Traceback (most recent call last):
Step #1 - "python_test":   File "/usr/local/lib/python3.11/unittest/loader.py", line 419, in _find_test_path
Step #1 - "python_test":     module = self._get_module_from_name(name)
Step #1 - "python_test":              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Step #1 - "python_test":   File "/usr/local/lib/python3.11/unittest/loader.py", line 362, in _get_module_from_name
Step #1 - "python_test":     __import__(name)
Step #1 - "python_test":   File "/workspace/import-automation/executor/test/github_api_test.py", line 28, in <module>
Step #1 - "python_test":     from test import integration_test
Step #1 - "python_test":   File "/workspace/import-automation/executor/test/integration_test.py", line 30, in <module>
Step #1 - "python_test":     'github_repo_owner_username': os.environ['_GITHUB_REPO_OWNER_USERNAME'],
Step #1 - "python_test":                                   ~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Step #1 - "python_test":   File "<frozen os>", line 679, in __getitem__
Step #1 - "python_test": KeyError: '_GITHUB_REPO_OWNER_USERNAME'
Step #1 - "python_test": 
Step #1 - "python_test": 
Step #1 - "python_test": ======================================================================
Step #1 - "python_test": ERROR: test.integration_test (unittest.loader._FailedTest.test.integration_test)
Step #1 - "python_test": ----------------------------------------------------------------------
Step #1 - "python_test": ImportError: Failed to import test module: test.integration_test
Step #1 - "python_test": Traceback (most recent call last):
Step #1 - "python_test":   File "/usr/local/lib/python3.11/unittest/loader.py", line 419, in _find_test_path
Step #1 - "python_test":     module = self._get_module_from_name(name)
Step #1 - "python_test":              ^^^^^^^^^^^
...
[Logs truncated due to log size limitations. For full logs, see https://console.cloud.google.com/cloud-build/builds/35718f42-6c7b-452c-a741-a253dc623b00?project=879489846695.]
...
ckages/xarray/core/pycompat.py:37: DeprecationWarning: distutils Version classes are deprecated. Use packaging.version instead.
Step #1 - "python_test":   duck_array_version = LooseVersion("0.0.0")
Step #1 - "python_test": /workspace/.env/lib/python3.11/site-packages/xarray/core/dataset.py:5322: DeprecationWarning: `cumproduct` is deprecated as of NumPy 1.25.0, and will be removed in NumPy 2.0. Please use `cumprod` instead.
Step #1 - "python_test":   index = self.coords.to_index([*ordered_dims])
Step #1 - "python_test": /workspace/.env/lib/python3.11/site-packages/xarray/core/pycompat.py:37: DeprecationWarning: distutils Version classes are deprecated. Use packaging.version instead.
Step #1 - "python_test":   duck_array_version = LooseVersion("0.0.0")
Step #1 - "python_test": /workspace/.env/lib/python3.11/site-packages/xarray/core/dataset.py:5322: DeprecationWarning: `cumproduct` is deprecated as of NumPy 1.25.0, and will be removed in NumPy 2.0. Please use `cumprod` instead.
Step #1 - "python_test":   index = self.coords.to_index([*ordered_dims])
Step #1 - "python_test": ok
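The two recurring DeprecationWarnings above come from the pinned xarray, not from the import scripts. For reference, the replacements the messages name (packaging.version instead of distutils LooseVersion, numpy.cumprod instead of cumproduct) look like this, assuming the packaging distribution is installed:

    import numpy as np
    from packaging.version import Version

    # distutils.version.LooseVersion comparisons map onto packaging.version.Version.
    assert Version("2024.1") > Version("0.0.0")

    # np.cumproduct is removed in NumPy 2.0; cumprod is the supported spelling.
    print(np.cumprod([1, 2, 3, 4]))  # [ 1  2  6 24]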
Step #1 - "python_test": test_sv (nasa.cmip6_sea_level.process_test.ProcessTest.test_sv) ... ok
Step #1 - "python_test": test_id_map (ocha.geodata.generate_test.GenerateTest.test_id_map) ... ok
Step #1 - "python_test": test_mcf (ocha.geodata.generate_test.GenerateTest.test_mcf) ... ok
Step #1 - "python_test": test_create_csv (ourworldindata.covid19.preprocess_csv_test.TestPreprocessCsvTest.test_create_csv) ... ok
Step #1 - "python_test": test_create_tmcf (ourworldindata.covid19.preprocess_csv_test.TestPreprocessCsvTest.test_create_tmcf) ... ok
Step #1 - "python_test": test_diff_mcf_files (statvar.mcf_diff_test.TestMCFDiff.test_diff_mcf_files) ... Counters:
Step #1 - "python_test":    input-nodes-in-mcf1:/tmp/tmp3ujrz_9e/sample.mcf =          2
Step #1 - "python_test":                               process_elapsed_time =       0.00
Step #1 - "python_test":                                          processed =          0
Step #1 - "python_test":                                         start_time =     363.05
Step #1 - "python_test": ok
Step #1 - "python_test": test_diff_mcf_node_pvs (statvar.mcf_diff_test.TestMCFDiff.test_diff_mcf_node_pvs)
Step #1 - "python_test": Test diff on MCF node dictionary. ... ok
Step #1 - "python_test": test_add_namespace (statvar.mcf_file_util_test.TestMCFFileUtil.test_add_namespace) ... ok
Step #1 - "python_test": test_load_mcf_file (statvar.mcf_file_util_test.TestMCFFileUtil.test_load_mcf_file) ... Counters:
Step #1 - "python_test":                                        PVs-matched =          1
Step #1 - "python_test":                               process_elapsed_time =       0.00
Step #1 - "python_test":                                          processed =          0
Step #1 - "python_test":                                         start_time =     363.05
Step #1 - "python_test": ok
Step #1 - "python_test": test_normalize_mcf_node (statvar.mcf_file_util_test.TestMCFFileUtil.test_normalize_mcf_node) ... ok
Step #1 - "python_test": test_normalize_range (statvar.mcf_file_util_test.TestMCFFileUtil.test_normalize_range) ... ok
Step #1 - "python_test": test_strip_namespace (statvar.mcf_file_util_test.TestMCFFileUtil.test_strip_namespace) ... ok
Step #1 - "python_test": test_drop_existing_mcf_nodes (statvar.mcf_filter_test.TestMCFFilter.test_drop_existing_mcf_nodes) ... Counters:
Step #1 - "python_test":                            existing-nodes-from-api =          2
Step #1 - "python_test":                               process_elapsed_time =       0.24
Step #1 - "python_test":                                          processed =          0
Step #1 - "python_test":                                         start_time =     363.05
Step #1 - "python_test": ok
Step #1 - "python_test": test_filter_mcf_file (statvar.mcf_filter_test.TestMCFFilter.test_filter_mcf_file) ... Counters:
Step #1 - "python_test":                                        input-nodes =       1261
Step #1 - "python_test":                               process_elapsed_time =       0.07
Step #1 - "python_test":                                          processed =          0
Step #1 - "python_test":                                         start_time =     363.36
Step #1 - "python_test": Counters:
Step #1 - "python_test":                                        PVs-matched =        157
Step #1 - "python_test":                                ignore-nodes-loaded =       1261
Step #1 - "python_test":                                        input-nodes =       1261
Step #1 - "python_test":                                input-nodes-ignored =         23
Step #1 - "python_test":                                      nodes-matched =         23
Step #1 - "python_test":                                       output-nodes =       1238
Step #1 - "python_test":                               process_elapsed_time =       0.09
Step #1 - "python_test":                                          processed =          0
Step #1 - "python_test":                                         start_time =     363.36
Step #1 - "python_test": ok
Step #1 - "python_test": test_un_energy_process (un.energy.process_test.TestUNEnergyProcess.test_un_energy_process)
Step #1 - "python_test": Test the process() function for UN energy data set. ... ok
Step #1 - "python_test": test_process_containment (un.sdg.geography_test.GeographyTest.test_process_containment) ... ok
Step #1 - "python_test": test_should_include_containment (un.sdg.geography_test.GeographyTest.test_should_include_containment) ... ok
Step #1 - "python_test": test_write_place_mappings (un.sdg.geography_test.GeographyTest.test_write_place_mappings) ... ok
Step #1 - "python_test": test_write_un_containment (un.sdg.geography_test.GeographyTest.test_write_un_containment) ... ok
Step #1 - "python_test": test_write_un_places (un.sdg.geography_test.GeographyTest.test_write_un_places) ... ok
Step #1 - "python_test": test_drop_null (un.sdg.process_test.ProcessTest.test_drop_null) ... ok
Step #1 - "python_test": test_drop_special (un.sdg.process_test.ProcessTest.test_drop_special) ... ok
Step #1 - "python_test": test_fix_encoding (un.sdg.process_test.ProcessTest.test_fix_encoding) ... ok
Step #1 - "python_test": test_get_geography (un.sdg.process_test.ProcessTest.test_get_geography) ... ok
Step #1 - "python_test": test_get_measurement_method (un.sdg.process_test.ProcessTest.test_get_measurement_method) ... ok
Step #1 - "python_test": test_process (un.sdg.process_test.ProcessTest.test_process) ... ok
Step #1 - "python_test": test_curate_pvs (un.sdg.util_test.UtilTest.test_curate_pvs) ... ok
Step #1 - "python_test": test_format_description (un.sdg.util_test.UtilTest.test_format_description) ... ok
Step #1 - "python_test": test_format_property (un.sdg.util_test.UtilTest.test_format_property) ... ok
Step #1 - "python_test": test_format_title (un.sdg.util_test.UtilTest.test_format_title) ... ok
Step #1 - "python_test": test_format_variable_code (un.sdg.util_test.UtilTest.test_format_variable_code) ... ok
Step #1 - "python_test": test_format_variable_description (un.sdg.util_test.UtilTest.test_format_variable_description) ... ok
Step #1 - "python_test": test_is_float (un.sdg.util_test.UtilTest.test_is_float) ... ok
Step #1 - "python_test": test_is_valid (un.sdg.util_test.UtilTest.test_is_valid) ... ok
Step #1 - "python_test": test_data_processing_small (us_bea.states_gdp.import_data_test.USStateQuarterlyGDPImportTest.test_data_processing_small)
Step #1 - "python_test": Tests end-to-end data cleaning on a small example. ... ok
Step #1 - "python_test": test_data_processing_tiny (us_bea.states_gdp.import_data_test.USStateQuarterlyGDPImportTest.test_data_processing_tiny)
Step #1 - "python_test": Tests end-to-end data cleaning on a tiny example. ... ok
Step #1 - "python_test": test_date_converter (us_bea.states_gdp.import_data_test.USStateQuarterlyGDPImportTest.test_date_converter)
Step #1 - "python_test": Tests the date converter function used to process raw data. ... ok
Step #1 - "python_test": test_geoid_converter (us_bea.states_gdp.import_data_test.USStateQuarterlyGDPImportTest.test_geoid_converter)
Step #1 - "python_test": Tests the geoid converter function used to process raw data. ... ok
Step #1 - "python_test": test_data_processing_tiny (us_bea.states_gdp.import_data_test.USStateQuarterlyPerIndustryImportTest.test_data_processing_tiny)
Step #1 - "python_test": Tests end-to-end data cleaning on a tiny example. ... ok
Step #1 - "python_test": test_industry_class (us_bea.states_gdp.import_data_test.USStateQuarterlyPerIndustryImportTest.test_industry_class)
Step #1 - "python_test": Tests industry class converter function that cleans out empty ... ok
Step #1 - "python_test": test_value_converter (us_bea.states_gdp.import_data_test.USStateQuarterlyPerIndustryImportTest.test_value_converter)
Step #1 - "python_test": Tests value converter function that cleans out empty datapoints. ... ok
Step #1 - "python_test": test_preprocess (us_bjs.nps.import_data_test.TestPreprocess.test_preprocess) ... /workspace/scripts/us_bjs/nps/preprocess_data.py:21: PerformanceWarning: DataFrame is highly fragmented.  This is usually the result of calling `frame.insert` many times, which has poor performance.  Consider joining all columns at once using pd.concat(axis=1) instead. To get a de-fragmented frame, use `newframe = frame.copy()`
Step #1 - "python_test":   df["PVINF_Temp"] = df["PVINF"].apply(convert_nan_for_calculation)
Step #1 - "python_test": /workspace/scripts/us_bjs/nps/preprocess_data.py:22: PerformanceWarning: DataFrame is highly fragmented.  This is usually the result of calling `frame.insert` many times, which has poor performance.  Consider joining all columns at once using pd.concat(axis=1) instead. To get a de-fragmented frame, use `newframe = frame.copy()`
Step #1 - "python_test":   df["PVOTHF_Temp"] = df["PVOTHF"].apply(convert_nan_for_calculation)
Step #1 - "python_test": /workspace/scripts/us_bjs/nps/preprocess_data.py:23: PerformanceWarning: DataFrame is highly fragmented.  This is usually the result of calling `frame.insert` many times, which has poor performance.  Consider joining all columns at once using pd.concat(axis=1) instead. To get a de-fragmented frame, use `newframe = frame.copy()`
Step #1 - "python_test":   df["PVINM_Temp"] = df["PVINM"].apply(convert_nan_for_calculation)
Step #1 - "python_test": /workspace/scripts/us_bjs/nps/preprocess_data.py:24: PerformanceWarning: DataFrame is highly fragmented.  This is usually the result of calling `frame.insert` many times, which has poor performance.  Consider joining all columns at once using pd.concat(axis=1) instead. To get a de-fragmented frame, use `newframe = frame.copy()`
Step #1 - "python_test":   df["PVOTHM_Temp"] = df["PVOTHM"].apply(convert_nan_for_calculation)
Step #1 - "python_test": /workspace/scripts/us_bjs/nps/preprocess_data.py:25: PerformanceWarning: DataFrame is highly fragmented.  This is usually the result of calling `frame.insert` many times, which has poor performance.  Consider joining all columns at once using pd.concat(axis=1) instead. To get a de-fragmented frame, use `newframe = frame.copy()`
Step #1 - "python_test":   df["Female_Total_Temp"] = df[["JURTOTF", "PVINF_Temp", "PVOTHF_Temp"
Step #1 - "python_test": /workspace/scripts/us_bjs/nps/preprocess_data.py:28: PerformanceWarning: DataFrame is highly fragmented.  This is usually the result of calling `frame.insert` many times, which has poor performance.  Consider joining all columns at once using pd.concat(axis=1) instead. To get a de-fragmented frame, use `newframe = frame.copy()`
Step #1 - "python_test":   df["Male_Total_Temp"] = df[["JURTOTM", "PVINM_Temp", "PVOTHM_Temp"
Step #1 - "python_test": /workspace/scripts/us_bjs/nps/preprocess_data.py:32: PerformanceWarning: DataFrame is highly fragmented.  This is usually the result of calling `frame.insert` many times, which has poor performance.  Consider joining all columns at once using pd.concat(axis=1) instead. To get a de-fragmented frame, use `newframe = frame.copy()`
Step #1 - "python_test":   df["LFF_Temp"] = df["LFF"].apply(convert_nan_for_calculation)
Step #1 - "python_test": /workspace/scripts/us_bjs/nps/preprocess_data.py:33: PerformanceWarning: DataFrame is highly fragmented.  This is usually the result of calling `frame.insert` many times, which has poor performance.  Consider joining all columns at once using pd.concat(axis=1) instead. To get a de-fragmented frame, use `newframe = frame.copy()`
Step #1 - "python_test":   df["LFM_Temp"] = df["LFM"].apply(convert_nan_for_calculation)
Step #1 - "python_test": /workspace/scripts/us_bjs/nps/preprocess_data.py:41: PerformanceWarning: DataFrame is highly fragmented.  This is usually the result of calling `frame.insert` many times, which has poor performance.  Consider joining all columns at once using pd.concat(axis=1) instead. To get a de-fragmented frame, use `newframe = frame.copy()`
Step #1 - "python_test":   df["LFCRSTF_Temp"] = df["LFCRSTF"].apply(convert_nan_for_calculation)
Step #1 - "python_test": /workspace/scripts/us_bjs/nps/preprocess_data.py:42: PerformanceWarning: DataFrame is highly fragmented.  This is usually the result of calling `frame.insert` many times, which has poor performance.  Consider joining all columns at once using pd.concat(axis=1) instead. To get a de-fragmented frame, use `newframe = frame.copy()`
Step #1 - "python_test":   df["LFCRSTM_Temp"] = df["LFCRSTM"].apply(convert_nan_for_calculation)
Step #1 - "python_test": /workspace/scripts/us_bjs/nps/preprocess_data.py:50: PerformanceWarning: DataFrame is highly fragmented.  This is usually the result of calling `frame.insert` many times, which has poor performance.  Consider joining all columns at once using pd.concat(axis=1) instead. To get a de-fragmented frame, use `newframe = frame.copy()`
Step #1 - "python_test":   df["FEDF_Temp"] = df["FEDF"].apply(convert_nan_for_calculation)
Step #1 - "python_test": /workspace/scripts/us_bjs/nps/preprocess_data.py:51: PerformanceWarning: DataFrame is highly fragmented.  This is usually the result of calling `frame.insert` many times, which has poor performance.  Consider joining all columns at once using pd.concat(axis=1) instead. To get a de-fragmented frame, use `newframe = frame.copy()`
Step #1 - "python_test":   df["OTHSTF_Temp"] = df["OTHSTF"].apply(convert_nan_for_calculation)
Step #1 - "python_test": /workspace/scripts/us_bjs/nps/preprocess_data.py:52: PerformanceWarning: DataFrame is highly fragmented.  This is usually the result of calling `frame.insert` many times, which has poor performance.  Consider joining all columns at once using pd.concat(axis=1) instead. To get a de-fragmented frame, use `newframe = frame.copy()`
Step #1 - "python_test":   df["FEDM_Temp"] = df["FEDM"].apply(convert_nan_for_calculation)
Step #1 - "python_test": /workspace/scripts/us_bjs/nps/preprocess_data.py:53: PerformanceWarning: DataFrame is highly fragmented.  This is usually the result of calling `frame.insert` many times, which has poor performance.  Consider joining all columns at once using pd.concat(axis=1) instead. To get a de-fragmented frame, use `newframe = frame.copy()`
Step #1 - "python_test":   df["OTHSTM_Temp"] = df["OTHSTM"].apply(convert_nan_for_calculation)
Step #1 - "python_test": ok
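test_preprocess passes, but every *_Temp assignment triggers the same PerformanceWarning because columns are inserted one at a time. The fix the warning itself suggests is to build the new columns and join them once with pd.concat(axis=1); convert_nan_for_calculation below is a stand-in for the helper referenced in preprocess_data.py, so treat this as a sketch rather than the script's code:

    import numpy as np
    import pandas as pd

    def convert_nan_for_calculation(value):
        # Illustrative stand-in: treat NaN as 0 for the derived totals.
        return 0 if pd.isna(value) else value

    def add_temp_columns(df, columns):
        # Build all *_Temp columns first, then join them in one concat so the
        # frame is not re-fragmented on every insert.
        temp = {f'{col}_Temp': df[col].apply(convert_nan_for_calculation)
                for col in columns}
        return pd.concat([df, pd.DataFrame(temp, index=df.index)], axis=1)

    df = pd.DataFrame({'PVINF': [1.0, np.nan], 'PVOTHF': [np.nan, 2.0]})
    print(add_temp_columns(df, ['PVINF', 'PVOTHF']))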
Step #1 - "python_test": test_filter_series (us_bls.cpi.generate_csv_mcf_test.TestGenerateCSVMCF.test_filter_series) ... ok
Step #1 - "python_test": test_generate_statvar (us_bls.cpi.generate_csv_mcf_test.TestGenerateCSVMCF.test_generate_statvar) ... ok
Step #1 - "python_test": test_invalid_series_id (us_bls.cpi.generate_csv_mcf_test.TestGenerateCSVMCF.test_invalid_series_id) ... ok
Step #1 - "python_test": test_parse_series_id (us_bls.cpi.generate_csv_mcf_test.TestGenerateCSVMCF.test_parse_series_id) ... ok
Step #1 - "python_test": test_valid_series_id (us_bls.cpi.generate_csv_mcf_test.TestGenerateCSVMCF.test_valid_series_id) ... ok
Step #1 - "python_test": test_clean_cdc_places_data (us_cdc.500_places.parse_cdc_places_test.TestParseCDCPlaces.test_clean_cdc_places_data)
Step #1 - "python_test": Tests the clean_cdc_places_data function. ... ok
Step #1 - "python_test": test_brfss_asthma_extracted_data (us_cdc.brfss_aggregated_asthma_2016_2018.brfss_asthma_import_test.ProcessTest.test_brfss_asthma_extracted_data) ... ok
Step #1 - "python_test": test_clean_air_quality_data (us_cdc.environmental_health_toxicology.parse_air_quality_test.TestParseAirQuality.test_clean_air_quality_data)
Step #1 - "python_test": Tests the clean_air_quality_data function. ... ok
Step #1 - "python_test": test_clean_precipitation_data (us_cdc.environmental_health_toxicology.parse_precipitation_index_test.TestParsePrecipitationData.test_clean_precipitation_data)
Step #1 - "python_test": Tests the clean_precipitation_data function. ... ok
Step #1 - "python_test": test_spec_generation (us_census.acs5yr.subject_tables.common.acs_spec_generator_test.TestSpecGenerator.test_spec_generation) ... /workspace/scripts/us_census/acs5yr/subject_tables/common/datacommons_api_wrappers/datacommons_wrappers.py:146: ResourceWarning: unclosed file <_io.TextIOWrapper name='/workspace/scripts/us_census/acs5yr/subject_tables/common/datacommons_api_wrappers/prefetched_outputs/Person_dc_props.json' mode='r' encoding='UTF-8'>
Step #1 - "python_test":   dc_props = json.load(
Step #1 - "python_test": ResourceWarning: Enable tracemalloc to get the object allocation traceback
Step #1 - "python_test": /workspace/scripts/us_census/acs5yr/subject_tables/common/datacommons_api_wrappers/datacommons_wrappers.py:171: ResourceWarning: unclosed file <_io.TextIOWrapper name='/workspace/scripts/us_census/acs5yr/subject_tables/common/datacommons_api_wrappers/prefetched_outputs/Person_dc_props_types.json' mode='r' encoding='UTF-8'>
Step #1 - "python_test":   dc_props = json.load(
Step #1 - "python_test": ResourceWarning: Enable tracemalloc to get the object allocation traceback
Step #1 - "python_test": /workspace/scripts/us_census/acs5yr/subject_tables/common/datacommons_api_wrappers/datacommons_wrappers.py:198: ResourceWarning: unclosed file <_io.TextIOWrapper name='/workspace/scripts/us_census/acs5yr/subject_tables/common/datacommons_api_wrappers/prefetched_outputs/Person_dc_props_enum_values.json' mode='r' encoding='UTF-8'>
Step #1 - "python_test":   dc_props = json.load(
Step #1 - "python_test": ResourceWarning: Enable tracemalloc to get the object allocation traceback
Step #1 - "python_test": ok
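The unclosed-file ResourceWarnings in datacommons_wrappers.py come from passing open(...) straight into json.load and leaving the handle to the garbage collector. The usual fix is a context manager; this is the general pattern, not a patch of the wrapper module itself:

    import json

    def load_prefetched_props(path):
        # The with-block closes the file deterministically, so no
        # ResourceWarning about an unclosed TextIOWrapper is emitted.
        with open(path, 'r', encoding='UTF-8') as f:
            return json.load(f)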
Step #1 - "python_test": test_find_columns_with_no_properties (us_census.acs5yr.subject_tables.common.acs_spec_validator_test.TestSpecValidator.test_find_columns_with_no_properties) ... ok
Step #1 - "python_test": test_find_extra_inferred_properties (us_census.acs5yr.subject_tables.common.acs_spec_validator_test.TestSpecValidator.test_find_extra_inferred_properties) ... ok
Step #1 - "python_test": test_find_extra_tokens (us_census.acs5yr.subject_tables.common.acs_spec_validator_test.TestSpecValidator.test_find_extra_tokens) ... ok
Step #1 - "python_test": test_find_ignore_conflicts (us_census.acs5yr.subject_tables.common.acs_spec_validator_test.TestSpecValidator.test_find_ignore_conflicts) ... ok
Step #1 - "python_test": test_find_missing_denominator_total_column (us_census.acs5yr.subject_tables.common.acs_spec_validator_test.TestSpecValidator.test_find_missing_denominator_total_column) ... ok
Step #1 - "python_test": test_find_missing_denominators (us_census.acs5yr.subject_tables.common.acs_spec_validator_test.TestSpecValidator.test_find_missing_denominators) ... ok
Step #1 - "python_test": test_find_multiple_measurement (us_census.acs5yr.subject_tables.common.acs_spec_validator_test.TestSpecValidator.test_find_multiple_measurement) ... ok
Step #1 - "python_test": test_find_multiple_population (us_census.acs5yr.subject_tables.common.acs_spec_validator_test.TestSpecValidator.test_find_multiple_population) ... ok
Step #1 - "python_test": test_find_repeating_denominators (us_census.acs5yr.subject_tables.common.acs_spec_validator_test.TestSpecValidator.test_find_repeating_denominators) ... ok
Step #1 - "python_test": test_check_column_map (us_census.acs5yr.subject_tables.common.column_map_validator_test.TestColumnMapValidator.test_check_column_map) ... ok
Step #1 - "python_test": test_column_ignore (us_census.acs5yr.subject_tables.common.common_util_test.TestCommonUtil.test_column_ignore) ... ok
Step #1 - "python_test": test_columns_from_CSVreader (us_census.acs5yr.subject_tables.common.common_util_test.TestCommonUtil.test_columns_from_CSVreader) ... ok
Step #1 - "python_test": test_find_missing_tokens (us_census.acs5yr.subject_tables.common.common_util_test.TestCommonUtil.test_find_missing_tokens) ... ok
Step #1 - "python_test": test_get_spec_token_list (us_census.acs5yr.subject_tables.common.common_util_test.TestCommonUtil.test_get_spec_token_list) ... ok
Step #1 - "python_test": test_token_in_list (us_census.acs5yr.subject_tables.common.common_util_test.TestCommonUtil.test_token_in_list) ... ok
Step #1 - "python_test": test_tokens_from_column_list (us_census.acs5yr.subject_tables.common.common_util_test.TestCommonUtil.test_tokens_from_column_list) ... ok
Step #1 - "python_test": test_csv_file_input (us_census.acs5yr.subject_tables.common.data_loader_test.DataLoaderBaseTest.test_csv_file_input) ... ok
Step #1 - "python_test": test_zip_file_input (us_census.acs5yr.subject_tables.common.data_loader_test.DataLoaderBaseTest.test_zip_file_input) ... ok
Step #1 - "python_test": test_generating_column_map_from_csv (us_census.acs5yr.subject_tables.common.generate_col_map_test.GenerateColMapTest.test_generating_column_map_from_csv) ... ok
Step #1 - "python_test": test_generating_column_map_from_zip (us_census.acs5yr.subject_tables.common.generate_col_map_test.GenerateColMapTest.test_generating_column_map_from_zip) ... ok
Step #1 - "python_test": test_geoIds_at_all_summary_levels (us_census.acs5yr.subject_tables.common.resolve_geo_id_test.ResolveCensusGeoIdTest.test_geoIds_at_all_summary_levels) ... ok
Step #1 - "python_test": test_convert_column_to_stat_var (us_census.acs5yr.subject_tables.s2201.process_test.ProcessTest.test_convert_column_to_stat_var) ... ok
Step #1 - "python_test": test_create_csv (us_census.acs5yr.subject_tables.s2201.process_test.ProcessTest.test_create_csv) ... ok
Step #1 - "python_test": test_create_tmcf (us_census.acs5yr.subject_tables.s2201.process_test.ProcessTest.test_create_tmcf) ... ok
Step #1 - "python_test": test_csv_mcf_column_map (us_census.acs5yr.subject_tables.subject_table_test.TestSubjectTable.test_csv_mcf_column_map) ... ok
Step #1 - "python_test": test_e2e (us_census.decennial.process_test.ProcessTest.test_e2e) ... ok
Step #1 - "python_test": test_bad_tmcf_variable_measured_two_equals_exception (us_census.enhanced_tmcf.process_etmcf_test.Process_ETMCF_Test.test_bad_tmcf_variable_measured_two_equals_exception) ... ok
Step #1 - "python_test": test_bad_tmcf_variable_measured_two_question_marks_exception (us_census.enhanced_tmcf.process_etmcf_test.Process_ETMCF_Test.test_bad_tmcf_variable_measured_two_question_marks_exception) ... ok
Step #1 - "python_test": test_csv_file_not_found_exception (us_census.enhanced_tmcf.process_etmcf_test.Process_ETMCF_Test.test_csv_file_not_found_exception) ... ok
Step #1 - "python_test": test_process_enhanced_tmcf_medium_success (us_census.enhanced_tmcf.process_etmcf_test.Process_ETMCF_Test.test_process_enhanced_tmcf_medium_success) ... ok
Step #1 - "python_test": test_simple_opaque_success (us_census.enhanced_tmcf.process_etmcf_test.Process_ETMCF_Test.test_simple_opaque_success) ... ok
Step #1 - "python_test": test_simple_success (us_census.enhanced_tmcf.process_etmcf_test.Process_ETMCF_Test.test_simple_success) ... ok
Step #1 - "python_test": test_tmcf_file_not_found_exception (us_census.enhanced_tmcf.process_etmcf_test.Process_ETMCF_Test.test_tmcf_file_not_found_exception) ... ok
Step #1 - "python_test": test_process (us_eia.eia_860.main_test.TestProcess.test_process) ... /workspace/scripts/us_eia/eia_860/utility.py:91: FutureWarning: Downcasting behavior in `replace` is deprecated and will be removed in a future version. To retain the old behavior, explicitly call `result.infer_objects(copy=False)`. To opt-in to the future behavior, set `pd.set_option('future.no_silent_downcasting', True)`
Step #1 - "python_test":   raw_df = raw_df.replace(np.nan, '')
Step #1 - "python_test": /workspace/scripts/us_eia/eia_860/power_plant.py:178: FutureWarning: Downcasting behavior in `replace` is deprecated and will be removed in a future version. To retain the old behavior, explicitly call `result.infer_objects(copy=False)`. To opt-in to the future behavior, set `pd.set_option('future.no_silent_downcasting', True)`
Step #1 - "python_test":   raw_df = raw_df.replace(np.nan, '')
Step #1 - "python_test": ok
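These FutureWarnings are about replace() silently downcasting object columns. The warning text offers two remedies; both are shown below on a made-up frame (the column names are not from the us_eia scripts):

    import numpy as np
    import pandas as pd

    raw_df = pd.DataFrame({'plant': ['A', np.nan], 'capacity': [1.5, np.nan]})

    # Option 1: retain the old behavior explicitly, per the warning text.
    cleaned = raw_df.replace(np.nan, '').infer_objects(copy=False)

    # Option 2: opt in to the future behavior once, process-wide.
    # pd.set_option('future.no_silent_downcasting', True)

    print(cleaned)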
Step #1 - "python_test": test_cleanup_name (us_eia.opendata.process.common_test.TestProcess.test_cleanup_name) ... ok
Step #1 - "python_test": test_process (us_eia.opendata.process.common_test.TestProcess.test_process) ... ok
Step #1 - "python_test": test_write_csv_county (us_epa.airdata.air_quality_aggregate_test.TestCriteriaGasesTest.test_write_csv_county) ... ok
Step #1 - "python_test": test_write_csv_csba (us_epa.airdata.air_quality_aggregate_test.TestCriteriaGasesTest.test_write_csv_csba) ... ok
Step #1 - "python_test": test_write_tmcf (us_epa.airdata.air_quality_aggregate_test.TestCriteriaGasesTest.test_write_tmcf) ... ok
Step #1 - "python_test": test_write_csv (us_epa.airdata.air_quality_test.TestCriteriaGasesTest.test_write_csv) ... ok
Step #1 - "python_test": test_write_tmcf (us_epa.airdata.air_quality_test.TestCriteriaGasesTest.test_write_tmcf) ... ok
Step #1 - "python_test": test_write_csv (us_epa.ejscreen.ejscreen_test.TestEjscreen.test_write_csv) ... ok
Step #1 - "python_test": test_e2e (us_epa.facility.process_facility_test.ProcessTest.test_e2e) ... ok
Step #1 - "python_test": test_name_to_dcid (us_epa.ghgrp.gas_test.GasTest.test_name_to_dcid) ... ok
Step #1 - "python_test": test_process_direct_emitters (us_epa.ghgrp.process_test.ProcessTest.test_process_direct_emitters) ... ok
Step #1 - "python_test": test_name_to_dcid (us_epa.ghgrp.sources_test.SourcesTest.test_name_to_dcid) ... ok
Step #1 - "python_test": test_parent_companies_e2e (us_epa.parent_company.process_parent_company_test.ProcessTest.test_parent_companies_e2e) ... ok
Step #1 - "python_test": test_svobs_e2e (us_epa.parent_company.process_parent_company_test.ProcessTest.test_svobs_e2e) ... ok
Step #1 - "python_test": test_e2e_superfund_funding_status (us_epa.superfund.site_and_funding_status.process_sites_fundingStatus_test.ProcessTest.test_e2e_superfund_funding_status) ... ok
Step #1 - "python_test": test_e2e_superfund_sites (us_epa.superfund.site_and_funding_status.process_sites_test.ProcessTest.test_e2e_superfund_sites) ... ok
Step #1 - "python_test": test_e2e_superfund_site_contamination (us_epa.superfund.site_contamination.process_sites_contamination_test.ProcessTest.test_e2e_superfund_site_contamination) ... ok
Step #1 - "python_test": test_e2e (us_epa.superfund.site_hazards.process_sites_hazards_test.ProcessTest.test_e2e) ... ok
Step #1 - "python_test": test_e2e (us_epa.superfund.sites.measurement_sites.generate_measurement_site_mcf_test.ProcessTest.test_e2e) ... ok
Step #1 - "python_test": test_e2e (us_epa.superfund.sites.tar_creek.process_contaminants_test.ProcessTest.test_e2e) ... /workspace/scripts/us_epa/superfund/sites/tar_creek/process_contaminants.py:232: FutureWarning: Downcasting behavior in `replace` is deprecated and will be removed in a future version. To retain the old behavior, explicitly call `result.infer_objects(copy=False)`. To opt-in to the future behavior, set `pd.set_option('future.no_silent_downcasting', True)`
Step #1 - "python_test":   df.replace(to_replace=r'^<.*$', value=0, regex=True,
Step #1 - "python_test": /workspace/scripts/us_epa/superfund/sites/tar_creek/process_contaminants.py:223: FutureWarning: The behavior of DataFrame concatenation with empty or all-NA entries is deprecated. In a future version, this will no longer exclude empty or all-NA columns when determining the result dtypes. To retain the old behavior, exclude the relevant entries before the concat operation.
Step #1 - "python_test":   _CLEAN_CSV_FRAMES.append(pd.concat(clean_csv))
Step #1 - "python_test": /workspace/scripts/us_epa/superfund/sites/tar_creek/process_contaminants.py:252: FutureWarning: The provided callable <built-in function max> is currently using SeriesGroupBy.max. In a future version of pandas, the provided callable will be used directly. To keep current behavior pass the string "max" instead.
Step #1 - "python_test":   as_index=False)['value'].transform(max)
Step #1 - "python_test": ok
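Of the three FutureWarnings in process_contaminants.py, the last has the simplest remedy: pass the string "max" to transform instead of the built-in max (and, for the concat warning, drop empty or all-NA frames before concatenating). A tiny illustration on made-up data:

    import pandas as pd

    df = pd.DataFrame({'site': ['a', 'a', 'b'], 'value': [1.0, 3.0, 2.0]})

    # Group-wise maximum broadcast back to every row, using the string "max"
    # as the FutureWarning recommends.
    df['max_value'] = df.groupby('site')['value'].transform('max')
    print(df)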
Step #1 - "python_test": test_main (us_fema.national_risk_index.generate_schema_and_tmcf_test.ProcessFemaNriFileTest.test_main) ... ok
Step #1 - "python_test": test_county_missing_trailing_zero (us_fema.national_risk_index.process_data_test.FormatGeoIDTest.test_county_missing_trailing_zero) ... ok
Step #1 - "python_test": test_county_no_change_needed (us_fema.national_risk_index.process_data_test.FormatGeoIDTest.test_county_no_change_needed) ... ok
Step #1 - "python_test": test_tract_missing_trailing_zero (us_fema.national_risk_index.process_data_test.FormatGeoIDTest.test_tract_missing_trailing_zero) ... ok
Step #1 - "python_test": test_tract_no_change_needed (us_fema.national_risk_index.process_data_test.FormatGeoIDTest.test_tract_no_change_needed) ... ok
Step #1 - "python_test": test_process_county_file (us_fema.national_risk_index.process_data_test.ProcessFemaNriFileTest.test_process_county_file) ... ok
Step #1 - "python_test": test_process_tract_file (us_fema.national_risk_index.process_data_test.ProcessFemaNriFileTest.test_process_tract_file) ... ok
Step #1 - "python_test": test_preprocess (us_gs.earthquake.preprocess_test.USGSEarthquakePreprocessTest.test_preprocess) ... ok
Step #1 - "python_test": test_compute_150 (us_hud.income.process_test.ProcessTest.test_compute_150) ... ok
Step #1 - "python_test": test_get_url (us_hud.income.process_test.ProcessTest.test_get_url) ... ok
Step #1 - "python_test": test_process (us_hud.income.process_test.ProcessTest.test_process) ... ok
Step #1 - "python_test": test_output_mcf (world_bank.boundaries.country_boundaries_mcf_generator_test.CountyBoundariesMcfGeneratorTest.test_output_mcf) ... ok
Step #1 - "python_test": 
Step #1 - "python_test": ----------------------------------------------------------------------
Step #1 - "python_test": Ran 181 tests in 129.254s
Step #1 - "python_test": 
Step #1 - "python_test": OK
Step #1 - "python_test": sys:1: ResourceWarning: unclosed file <_io.TextIOWrapper name='/workspace/scripts/aqicn/cities.csv' mode='r' encoding='UTF-8'>
Finished Step #1 - "python_test"
PUSH
DONE

Build Log: https://console.cloud.google.com/cloud-build/builds/35718f42-6c7b-452c-a741-a253dc623b00?project=879489846695