Repository Maintenance (#378)
* refactor: Move DeepSSMTab file

* fix: resolve type errors and fix styling in TabForm

* fix: orientation marker cube should appear if not showing color scale

* refactor: remove unused import in store methods

* fix: change import of DeepSSMTab in Main view

* refactor: ignore dev/yarn.env

* refactor: remove terraform folder

* refactor: remove references to CloudAMQP Gray

* fix: use known dataset subset for download_upload_cycle test (increase reliability)

* style: reformat test file

* fix: prevent stopping GPU workers if deployment playbook is active

* fix: Apply suggested changes to manage_workers.py

* fix: issue #348

* feat: address issue #349

* style: reformat manage_workers.py

* refactor: remove old files at shapeworks_cloud/core/shapeworks_interface

* updating repo url

* ci: add workflow_dispatch trigger to build-web action

* update key

* force a web build

* set branch back to master

* fix: remove version pins on django and allauth

* fix: pin Django to 4.1

* fix: remove `--username` arg from createsuperuser commands

---------

Co-authored-by: jessdtate@gmail.com <jess@sci.utah.edu>
annehaley and jessdtate authored May 30, 2024
1 parent 107a030 commit dbc572f
Showing 33 changed files with 88 additions and 301 deletions.
1 change: 1 addition & 0 deletions .github/workflows/build-web.yml
@@ -1,5 +1,6 @@
name: Build web app
on:
+workflow_dispatch:
push:
branches:
- master
2 changes: 1 addition & 1 deletion .github/workflows/download_upload_cycle.yml
@@ -26,7 +26,7 @@ jobs:
- name: Run migrations
run: docker-compose run --rm django ./manage.py migrate
- name: Create super user
-run: docker-compose run -e DJANGO_SUPERUSER_PASSWORD=$DJANGO_SUPERUSER_PASSWORD --rm django ./manage.py createsuperuser --noinput --email=$DJANGO_SUPERUSER_EMAIL --username=$DJANGO_SUPERUSER_EMAIL
+run: docker-compose run -e DJANGO_SUPERUSER_PASSWORD=$DJANGO_SUPERUSER_PASSWORD --rm django ./manage.py createsuperuser --noinput --email=$DJANGO_SUPERUSER_EMAIL
- name: Start server
run: docker-compose up -d

2 changes: 1 addition & 1 deletion .github/workflows/test.yml
@@ -63,7 +63,7 @@ jobs:
- name: Run migrations
run: docker-compose run --rm django ./manage.py migrate
- name: Create super user
-run: docker-compose run -e DJANGO_SUPERUSER_PASSWORD=$DJANGO_SUPERUSER_PASSWORD --rm django ./manage.py createsuperuser --noinput --email=$DJANGO_SUPERUSER_EMAIL --username=$DJANGO_SUPERUSER_EMAIL
+run: docker-compose run -e DJANGO_SUPERUSER_PASSWORD=$DJANGO_SUPERUSER_PASSWORD --rm django ./manage.py createsuperuser --noinput --email=$DJANGO_SUPERUSER_EMAIL
- name: Start server
run: docker-compose up -d

4 changes: 1 addition & 3 deletions .github/workflows/update_workers.yml
@@ -14,8 +14,6 @@ jobs:
AWS_SSH_PRIVATE_KEY: ${{ secrets.AWS_SSH_PRIVATE_KEY }}
CLOUDAMQP_URL: ${{ secrets.CLOUDAMQP_URL }}
CLOUDAMQP_APIKEY: ${{ secrets.CLOUDAMQP_APIKEY }}
-CLOUDAMQP_GRAY_URL: ${{ secrets.CLOUDAMQP_GRAY_URL }}
-CLOUDAMQP_GRAY_APIKEY: ${{ secrets.CLOUDAMQP_GRAY_APIKEY }}
DJANGO_SECRET_KEY: ${{ secrets.DJANGO_SECRET_KEY }}
DJANGO_EMAIL_URL: ${{ secrets.DJANGO_EMAIL_URL }}
PAPERTRAIL_API_TOKEN: ${{ secrets.PAPERTRAIL_API_TOKEN }}
@@ -43,7 +41,7 @@ jobs:
chmod 600 $HOME/.ssh/shapeworks-ec2-b
- name: Update workers
run: |
-SECRETS_NAMES=('AWS_DEFAULT_REGION' 'AWS_ACCESS_KEY_ID' 'AWS_SECRET_ACCESS_KEY' 'CLOUDAMQP_URL' 'CLOUDAMQP_APIKEY' 'CLOUDAMQP_GRAY_URL' 'CLOUDAMQP_GRAY_APIKEY' 'DJANGO_SECRET_KEY' 'DJANGO_EMAIL_URL' 'PAPERTRAIL_API_TOKEN' 'DATABASE_URL')
+SECRETS_NAMES=('AWS_DEFAULT_REGION' 'AWS_ACCESS_KEY_ID' 'AWS_SECRET_ACCESS_KEY' 'CLOUDAMQP_URL' 'CLOUDAMQP_APIKEY' 'DJANGO_SECRET_KEY' 'DJANGO_EMAIL_URL' 'PAPERTRAIL_API_TOKEN' 'DATABASE_URL')
SECRETS_JSON=`jq -n '$ARGS.positional | map({ (.): env[.] }) | add' --args "${SECRETS_NAMES[@]}"`
jq --argjson secrets "$SECRETS_JSON" '{django_vars: (.+=$secrets)}' dev/prod.celery.env.json > extra_vars.json
INVENTORY=`python shapeworks_cloud/manage_workers.py start_all`
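The `jq` pipeline in the `Update workers` step turns the list of secret names into a JSON object mapping each name to its value from the environment, which is then merged into `dev/prod.celery.env.json` under `django_vars`. A rough Python sketch of the name-to-value step (the names shown are only a subset of the workflow's list; everything else here is illustrative):

```python
import json
import os

# A subset of the workflow's SECRETS_NAMES, for illustration only
secret_names = ['AWS_DEFAULT_REGION', 'CLOUDAMQP_URL', 'DJANGO_SECRET_KEY']

# Rough equivalent of: jq -n '$ARGS.positional | map({ (.): env[.] }) | add'
# Unset variables become JSON null, matching jq's env[.] lookup.
secrets = {name: os.environ.get(name) for name in secret_names}
secrets_json = json.dumps(secrets)
print(secrets_json)
```

Like the jq version, this builds one flat object keyed by secret name, ready to be merged into an existing JSON document.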
5 changes: 4 additions & 1 deletion .gitignore
@@ -126,4 +126,7 @@ dmypy.json
.terraform/

# Ansible
-ansible/roles
+ansible/roles
+
+# Yarn env
+dev/yarn.env
12 changes: 8 additions & 4 deletions ansible/playbook.yml
@@ -17,10 +17,9 @@
celery_repository_url: https://github.com/girder/shapeworks-cloud.git
celery_environment: "{{ django_vars }}"
tasks:
-- name: Stop celery service to prevent manage_workers task from stopping instances
-systemd:
-state: stopped
-name: celery
+- name: Create lockfile to prevent manage_workers task from stopping instances
+ansible.builtin.shell: |
+date "+%Y.%m.%d-%H.%M.%S" > /home/ubuntu/celery_project/dev/deploy.lock
become: true
become_user: root
- name: Save environment variables
@@ -94,3 +93,8 @@
chmod 777 /home/ubuntu/celery_project/dev/prod.celery.start.sh
become: true
become_user: root
+- name: Delete lockfile to allow manage_workers task to stop instances
+ansible.builtin.shell: |
+rm /home/ubuntu/celery_project/dev/deploy.lock
+become: true
+become_user: root
2 changes: 1 addition & 1 deletion dev/prod.celery.env.json
@@ -1,5 +1,5 @@
{
-"AWS_ACCESS_KEY_ID": "AKIAW7NUJL4RQV2PHL4D",
+"AWS_ACCESS_KEY_ID": "AKIAYS2NWFYSXVHRJDI6",
"AWS_DEFAULT_REGION": "us-east-1",
"DJANGO_ALLOWED_HOSTS": "app.shapeworks-cloud.org",
"DJANGO_API_URL": "https://app.shapeworks-cloud.org/api/v1",
2 changes: 1 addition & 1 deletion dev/yarn.env
@@ -1 +1 @@
-VUE_APP_OAUTH_CLIENT_ID=Sn075ovzulH1SBTI6IwLM20G4b0Dh5Yj0FV3HNLb
+VUE_APP_OAUTH_CLIENT_ID=Sn075ovzulH1SBTI6IwLM20G4b0Dh5Yj0FV3HNLb
4 changes: 2 additions & 2 deletions setup.py
@@ -43,9 +43,9 @@
include_package_data=True,
install_requires=[
'celery',
-'django<4.0',
+'django<4.2',
'django-admin-display',
-'django-allauth<0.56',
+'django-allauth',
'django-configurations[database,email]',
'django-extensions',
'django-filter',
Empty file.
28 changes: 0 additions & 28 deletions shapeworks_cloud/core/shapeworks_interface/convert.py

This file was deleted.

80 changes: 0 additions & 80 deletions shapeworks_cloud/core/shapeworks_interface/optimize.py

This file was deleted.

24 changes: 23 additions & 1 deletion shapeworks_cloud/manage_workers.py
@@ -1,9 +1,14 @@
+import datetime
import os
+from pathlib import Path
import sys
import time

import boto3

+DEPLOY_LOCK = Path(__file__).parent.parent / 'dev' / 'deploy.lock'
+MAX_LOCK_TIME = datetime.timedelta(minutes=10)


def inspect_queue(queue_name):
import pyrabbit
@@ -80,6 +85,23 @@ def manage_workers(**kwargs):
if v is not None:
os.environ[k] = v

+# check for lockfile indicating that a deployment is active
+if DEPLOY_LOCK.exists():
+with open(DEPLOY_LOCK) as lock:
+lock_content = lock.readlines()
+if len(lock_content) > 0:
+lock_time = datetime.datetime.strptime(lock_content[0].strip(), '%Y.%m.%d-%H.%M.%S')
+time_delta = datetime.datetime.now() - lock_time
+max_mins = MAX_LOCK_TIME.total_seconds() / 60
+explanation = f"Deploy playbook started %s {max_mins} mins ago and hasn't exited."
+if time_delta < MAX_LOCK_TIME:
+result = 'Valid deployment lockfile found. Skipping worker management.'
+print(f"{result} {explanation % 'less than'}")
+return
+else:
+result = 'Invalid deployment lockfile found. Continuing with worker management.'
+print(f"{result} {explanation % 'greater than'}")

num_messages_ready, num_messages_active = inspect_queue('gpu')
if num_messages_ready < 0:
return
@@ -116,7 +138,7 @@ def start_all():
client.start_instances(InstanceIds=all_ids)

# Wait for startup
-time.sleep(60)
+time.sleep(30)

# Refresh hostnames
all_workers = get_all_workers(client)
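The lockfile handshake introduced across these two diffs — the deploy playbook writes a timestamp to `dev/deploy.lock`, and `manage_workers` skips worker management while that timestamp is less than ten minutes old — can be sketched as follows. The timestamp format and `MAX_LOCK_TIME` mirror the diff; the lock path and the helper function are illustrative, not the repository's actual API:

```python
import datetime
from pathlib import Path

# Illustrative stand-ins for the constants in the diff
DEPLOY_LOCK = Path('deploy.lock')
MAX_LOCK_TIME = datetime.timedelta(minutes=10)
TIMESTAMP_FORMAT = '%Y.%m.%d-%H.%M.%S'  # matches `date "+%Y.%m.%d-%H.%M.%S"` in the playbook


def deployment_in_progress():
    """Return True if a fresh deploy lockfile exists, i.e. worker management should be skipped."""
    if not DEPLOY_LOCK.exists():
        return False
    lines = DEPLOY_LOCK.read_text().strip().splitlines()
    if not lines:
        return False
    lock_time = datetime.datetime.strptime(lines[0], TIMESTAMP_FORMAT)
    # A lock older than MAX_LOCK_TIME is treated as stale (e.g. a crashed playbook),
    # so a hung deployment cannot block worker management forever.
    return datetime.datetime.now() - lock_time < MAX_LOCK_TIME
```

The staleness check is the key design choice: because the playbook could die before its "Delete lockfile" task runs, the lock self-expires rather than requiring manual cleanup.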
2 changes: 1 addition & 1 deletion swcc/examples/README.md
@@ -1,6 +1,6 @@
# Setting up your Shapeworks Cloud instance

-1. Clone https://github.com/girder/shapeworks-cloud and open that directory in your code editor. Check out the appropriate branch.
+1. Clone https://github.com/SCIInstitute/shapeworks-cloud and open that directory in your code editor. Check out the appropriate branch.

2. In a terminal (in your repo directory), run `docker-compose up`. You may need to install docker. Keep this command running; this is the development server.

2 changes: 1 addition & 1 deletion swcc/swcc/cli.py
@@ -146,7 +146,7 @@ def main():
err=True,
)
click.echo(
-'https://github.com/girder/shapeworks-cloud/issues/new',
+'https://github.com/SCIInstitute/shapeworks-cloud/issues/new',
err=True,
)

17 changes: 10 additions & 7 deletions swcc/tests/test_download_upload.py
@@ -3,14 +3,11 @@
# upload them all to the local server
import filecmp
import os
-import random
from tempfile import TemporaryDirectory

from swcc import models
from swcc.api import swcc_session

-SAMPLE_SIZE = 3


def project_as_dict_repr(project):
project_repr = dict(project)
@@ -54,10 +51,16 @@ def public_server_download(download_dir):
with swcc_session() as public_server_session:
public_server_session.login('testuser@noemail.nil', 'cicdtest')
all_datasets = list(models.Dataset.list())
-tiny_tests = [d for d in all_datasets if 'tiny_test' in d.name]
-dataset_subset = (
-random.sample(tiny_tests, SAMPLE_SIZE) if len(tiny_tests) >= SAMPLE_SIZE else tiny_tests
-)
+dataset_subset = [
+d
+for d in all_datasets
+if d.name
+in [
+'deep_ssm_femur_tiny_test',
+'ellipsoid_multiple_domain_tiny_test',
+'left_atrium_tiny_test',
+]
+]
project_subset = [next(d.projects, None) for d in dataset_subset]
for project in project_subset:
if project is not None:
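The test change above swaps `random.sample` over whatever `tiny_test` datasets happen to exist for a fixed allowlist, so every CI run downloads the same three known-good datasets instead of a random (and potentially flaky) subset. A minimal sketch of that selection logic, where the dicts are hypothetical stand-ins for `swcc.models.Dataset` objects:

```python
# Fixed allowlist mirroring the names in the diff
KNOWN_TINY_TESTS = [
    'deep_ssm_femur_tiny_test',
    'ellipsoid_multiple_domain_tiny_test',
    'left_atrium_tiny_test',
]


def select_datasets(all_datasets):
    """Deterministically keep only the known-good tiny-test datasets (input order preserved)."""
    return [d for d in all_datasets if d['name'] in KNOWN_TINY_TESTS]


# Hypothetical server listing: two allowlisted datasets plus one that is skipped
datasets = [
    {'name': 'left_atrium_tiny_test'},
    {'name': 'huge_dataset'},
    {'name': 'deep_ssm_femur_tiny_test'},
]
selected = select_datasets(datasets)
print([d['name'] for d in selected])
```

Because the result depends only on the listing, repeated runs exercise identical data, which is the reliability improvement the commit message describes.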
57 changes: 0 additions & 57 deletions terraform/.terraform.lock.hcl

This file was deleted.


