Only run full validation tests when changes are made to validation/schema code #4284

Open
danswick opened this issue Sep 13, 2024 · 1 comment
@danswick (Contributor) commented:

Running the full test suite takes >17 minutes. Most of that time is spent running a lot of spreadsheets through a full set of validations. Until we can optimize those tests, we should break out the validation tests and only run them when there are relevant changes in the commits being tested (or possibly when merging into specific branches).

danswick added the eng and infrastructure labels on Sep 13, 2024
@phildominguez-gsa (Contributor) commented:

Looked into this a bit and discussed with @asteel-gsa. It might be possible for us to use filters (which we use elsewhere) to determine when certain tests should be run. Since the audit tests are the slow ones, here's an example for that module:

with:
  filters: |
    audit:
      - './backend/audit/**'
    ... # filters for other modules

- name: Run Audit Tests
  if: ${{ needs.check-for-changes.outputs.audit == 'true' }}
  working-directory: ./backend
  run: docker compose -f docker-compose.yml run web bash -c 'coverage run --parallel-mode --concurrency=multiprocessing manage.py test audit --parallel && coverage combine && coverage report -m --fail-under=85 && coverage xml -o coverage.xml'
# Notice above only "manage.py test audit" is run
... # tests for other modules
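For reference, the "audit" output used in that if: condition would come from a separate change-detection job. A minimal sketch of what that job could look like, assuming the dorny/paths-filter action (job and step names here are illustrative, not necessarily our real workflow):

check-for-changes:
  runs-on: ubuntu-latest
  outputs:
    audit: ${{ steps.filter.outputs.audit }}
    # ... one output per module filter
  steps:
    - uses: actions/checkout@v4
    # paths-filter sets each filter's output to 'true' when a matching file changed
    - uses: dorny/paths-filter@v3
      id: filter
      with:
        filters: |
          audit:
            - './backend/audit/**'
          ... # filters for other modules

The "Run Audit Tests" step would then live in a job that declares needs: check-for-changes, so the if: expression can read that output.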

This would basically have to be set up for all the modules. Unfortunately, there are complications because some modules are interconnected. For example, a number of modules, including audit, import dissemination.models, so any change there should also trigger all of those modules' tests (one mitigation, sketched below, is to list shared paths under every dependent module's filter). There are probably other examples, and any time a new interconnection is introduced these workflows would have to be updated. On my last contract we tried to be too selective about running our unit tests and it caused some big headaches, so I'm a bit hesitant here. Thoughts @danswick?
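If we did go this route, one rough mitigation would be to list shared code paths under every dependent module's filter, so a change to a shared file still triggers the dependent tests. A sketch (exact paths are illustrative):

filters: |
  audit:
    - './backend/audit/**'
    # audit imports dissemination.models, so shared paths go here too
    - './backend/dissemination/models.py'
  dissemination:
    - './backend/dissemination/**'

The downside is the maintenance burden described above: every new cross-module import means remembering to update these path lists.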

phildominguez-gsa self-assigned this on Sep 19, 2024