Data Portal web app
- Clone this repo
$ git clone https://github.com/kippnorcal/galaxy.git
- Install Pipenv
$ pip install pipenv
- Install Docker
- Mac: https://docs.docker.com/docker-for-mac/install/
- Linux: https://docs.docker.com/install/linux/docker-ce/debian/
- Windows: https://docs.docker.com/docker-for-windows/install/
- Create .env file with project secrets
Dev environment:
SECRET_KEY=
POSTGRES_USER=
POSTGRES_PASSWORD=
POSTGRES_DB=
TABLEAU_TRUSTED_URL=
USER_DOMAIN=
SAML_ENTITY_ID=
SAML_URL=
SAML_SLO=
SAML_CERT=
ROLLBAR_TOKEN=
APP_DOMAIN=
Prod environment: same as dev, but also add:
SSL=1
ALLOWED_HOSTS=[]
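For reference, a filled-in dev .env might look like the following. Every value here is an illustrative placeholder, not a real credential; use your own secrets and endpoints.

```
SECRET_KEY=replace-with-a-generated-secret-key
POSTGRES_USER=galaxy
POSTGRES_PASSWORD=replace-with-a-strong-password
POSTGRES_DB=galaxy
TABLEAU_TRUSTED_URL=https://tableau.example.org/trusted/
USER_DOMAIN=example.org
SAML_ENTITY_ID=https://portal.example.org/saml/metadata/
SAML_URL=https://idp.example.org/sso
SAML_SLO=https://idp.example.org/slo
SAML_CERT=replace-with-idp-certificate
ROLLBAR_TOKEN=replace-with-rollbar-access-token
APP_DOMAIN=portal.example.org
```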
Generating a unique secret key can be done via Django:
from django.core.management.utils import get_random_secret_key
get_random_secret_key()
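If Django is not installed on your host machine, an equivalent key can be generated with only the standard library. This is a sketch that mirrors the character set and length Django's get_random_secret_key uses:

```python
import secrets
import string

def generate_secret_key(length: int = 50) -> str:
    """Generate a random key in the style of Django's
    get_random_secret_key: 50 characters drawn from lowercase
    letters, digits, and a fixed set of punctuation."""
    chars = string.ascii_lowercase + string.digits + "!@#$%^&*(-_=+)"
    # secrets.choice uses a cryptographically secure RNG
    return "".join(secrets.choice(chars) for _ in range(length))

print(generate_secret_key())
```

Paste the resulting string into SECRET_KEY in your .env file.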
- Build Docker Image
$ docker-compose build
Dev environment:
$ docker-compose up -d
Prod environment:
$ docker-compose -f docker-compose.prod.yml up -d
Note: The first time you run docker-compose, you may see an error that the database is not yet available. Run docker-compose down, then rerun docker-compose up -d.
$ docker-compose run web python manage.py migrate
$ docker-compose run web python manage.py createsuperuser
Note: docker-compose must be up to run the following command(s)
$ docker-compose exec web python manage.py dumpdata --indent 2 --exclude=contenttypes > db.json
Copy the db.json file to your new repo
$ docker-compose exec web python manage.py loaddata db.json
$ docker-compose down
Run tests
$ docker-compose run web pytest
Run tests and check coverage for all modules
$ docker-compose run web pytest --cov=.
Run tests and check coverage for a specific module
$ docker-compose run web pytest --cov=accounts accounts
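For orientation, pytest discovers plain functions whose names start with test_ inside files named test_*.py. The sketch below illustrates the shape of such a file; is_valid_domain is a hypothetical helper invented for this example, not an actual function in the accounts module:

```python
# test_example.py -- illustrative only; the helper below is hypothetical

def is_valid_domain(email: str, domain: str) -> bool:
    """Toy stand-in for an accounts helper: check that an
    email address belongs to the expected user domain."""
    return email.lower().endswith("@" + domain.lower())

def test_valid_domain():
    assert is_valid_domain("teacher@kippnorcal.org", "kippnorcal.org")

def test_invalid_domain():
    assert not is_valid_domain("someone@example.com", "kippnorcal.org")
```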
Collect static files (prod environment)
$ docker-compose -f docker-compose.prod.yml exec web python manage.py collectstatic --no-input --clear