dig-job-server


Project Setup and Running Server Locally

  1. Set up a Python virtual environment using version 3.9 or later. With pyenv and pyenv-virtualenv installed you can do the following:
pyenv install 3.9
pyenv virtualenv 3.9 dig-job-server
pyenv local dig-job-server
  2. Install dependencies into the virtual environment:
pip install -r requirements.txt 
  3. Start the MySQL database via Docker for local development (the code defaults to port 3308):
 ./docker_db/docker_db.sh start <port>
  4. Run the database migrations:
alembic upgrade head
  5. Run the tests (this creates a user in your local database and verifies that everything is set up):
pytest
  6. Start the server (you can also set up an IDE run configuration for this):
python -m job_server.main
  7. Start using the server (an end-to-end sketch combining these commands follows this list):
curl -H "Content-Type: application/json" -X POST http://localhost:8000/api/login \
-d '{"username": "testuser", "password": "change.me"}'
DATASET=$(python3 -c "import urllib.parse; print(urllib.parse.quote('''<data-set-name>'''))")
PRESIGNED_URL=$(curl "http://localhost:8000/api/get-pre-signed-url/$DATASET" \
     -H "Authorization: bearer <token_provided_by_login_response>" \
     | python3 -c "import sys, json; print(json.load(sys.stdin)['presigned_url'])")
curl -X PUT --upload-file <local-file-to-upload> "$PRESIGNED_URL"
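
Once the server is running, the whole upload flow can be scripted end to end. The sketch below is illustrative only: the access_token field name in the login response is an assumption (the API may use a different field), and <data-set-name> / <local-file-to-upload> are placeholders to fill in.

# Log in and capture the token. NOTE: 'access_token' is an assumed field name; check the real login response.
TOKEN=$(curl -s -H "Content-Type: application/json" -X POST http://localhost:8000/api/login \
     -d '{"username": "testuser", "password": "change.me"}' \
     | python3 -c "import sys, json; print(json.load(sys.stdin)['access_token'])")
# URL-encode the dataset name.
DATASET=$(python3 -c "import urllib.parse; print(urllib.parse.quote('''<data-set-name>'''))")
# Ask the API for a pre-signed upload URL.
PRESIGNED_URL=$(curl -s "http://localhost:8000/api/get-pre-signed-url/$DATASET" \
     -H "Authorization: bearer $TOKEN" \
     | python3 -c "import sys, json; print(json.load(sys.stdin)['presigned_url'])")
# Upload the file directly to the pre-signed URL (quoted, since it contains query parameters).
curl -X PUT --upload-file <local-file-to-upload> "$PRESIGNED_URL"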

Just the front end

If you'd like to use the deployed API server rather than set up a local database and Python environment, you can point frontend/.env at it and run the front end locally.

echo "NUXT_PUBLIC_API_BASE_URL=http://ec2-98-83-154-159.compute-1.amazonaws.com:5000" > frontend/.env

You'll need to use credentials that exist in the API's database. Talk to another developer who has worked on this project if you need help with this.

Deployment

Both the front end and the API deploy via GitHub Actions that fire on a push to the main branch. The front end deploys as a static site served by our nginx server, and the API deploys to an EC2 instance that runs the server as a Python process. The front end talks to the API through an nginx proxy.
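
For reference, here is a minimal sketch of what the nginx side of this could look like. The web root, location path, and upstream port are assumptions, not taken from the actual deployment (port 5000 mirrors the deployed API URL shown above).

# Hypothetical nginx sketch: serve the static front end and proxy API calls to the Python process.
server {
    listen 80;
    root /var/www/frontend;                 # assumed location of the static front-end build

    location /api/ {
        proxy_pass http://127.0.0.1:5000;   # assumed upstream port for the API process
        proxy_set_header Host $host;
    }
}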
