- Set up a Python virtual env using version 3.9 or later. With pyenv and pyenv-virtualenv installed, you can do the following:
pyenv install 3.9
pyenv virtualenv 3.9 dig-job-server
pyenv local dig-job-server
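To confirm the virtualenv is active in the project directory (standard pyenv commands):
pyenv version  # should print dig-job-server
python --version  # should print Python 3.9.x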
- Install dependencies for the virtual env:
pip install -r requirements.txt
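You can spot-check that key dependencies landed in the env (assuming alembic is pinned in requirements.txt, which the migration step below suggests):
pip show alembic  # prints the installed version if the install succeeded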
- Start a MySQL db via Docker for local development (the code defaults to port 3308):
./docker_db/docker_db.sh start <port>
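For example, to match the port the code expects and confirm the container is up:
./docker_db/docker_db.sh start 3308
docker ps  # the mysql container should be listed with port 3308 mapped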
- Run db migrations:
alembic upgrade head
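To verify the migrations applied, ask alembic for the current revision:
alembic current  # should print the head revision id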
- Run the tests (this creates a user in your local db and verifies that everything is set up):
pytest
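If anything fails, standard pytest flags give more detail:
pytest -x -v  # stop at the first failure, verbose output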
- Start the server (you can set up an IDE run configuration for this too):
python -m job_server.main
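As a quick smoke test, hit the login route used below; any HTTP status code back (even a 405 for the wrong method) confirms the process is listening:
curl -s -o /dev/null -w '%{http_code}\n' http://localhost:8000/api/login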
- Start using the server. Log in to get a token, request a pre-signed URL for your dataset, then upload the file:
curl -H "Content-Type: application/json" -X POST http://localhost:8000/api/login \
-d '{"username": "testuser", "password": "change.me"}'
DATASET=$(python3 -c "import urllib.parse; print(urllib.parse.quote('''<data-set-name>'''))")
PRESIGNED_URL=$(curl "http://localhost:8000/api/get-pre-signed-url/$DATASET" \
-H "Authorization: bearer <token_provided_by_login_response>" \
| python3 -c "import sys, json; print(json.load(sys.stdin)['presigned_url'])")
curl -X PUT --upload-file <local-file-to-upload> "$PRESIGNED_URL"
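Putting it together, the calls above can run as one script. This is a sketch: the 'token' key in the login response is an assumption, so check the actual response body and adjust.
# Sketch of the full upload flow; the 'token' JSON key is assumed, not confirmed.
TOKEN=$(curl -s -H "Content-Type: application/json" -X POST http://localhost:8000/api/login \
  -d '{"username": "testuser", "password": "change.me"}' \
  | python3 -c "import sys, json; print(json.load(sys.stdin)['token'])")
DATASET=$(python3 -c "import urllib.parse; print(urllib.parse.quote('''<data-set-name>'''))")
PRESIGNED_URL=$(curl -s "http://localhost:8000/api/get-pre-signed-url/$DATASET" \
  -H "Authorization: Bearer $TOKEN" \
  | python3 -c "import sys, json; print(json.load(sys.stdin)['presigned_url'])")
curl --upload-file <local-file-to-upload> "$PRESIGNED_URL"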
If you'd like to use the deployed API server and skip the local db and Python setup, edit frontend/.env to point at it and run the front end locally:
echo "NUXT_PUBLIC_API_BASE_URL=http://ec2-98-83-154-159.compute-1.amazonaws.com:5000" > frontend/.env
You'll need credentials that exist in the API's db. Talk to another dev who has worked on this project if you need help with this.
Both the front end and API deploy via github actions that fire upon a push to the main branch. The front end deploys as a static site served by our nginx server, and the API deploys to EC2 instance that runs the server as a python process. The front end talks to the API via an nginx proxy.
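For reference, that proxy is typically a small nginx location block along these lines. This is a hypothetical sketch, not our actual config; the path and upstream port are assumptions (5000 taken from the API URL above).
# Hypothetical sketch of the nginx proxy in front of the API.
location /api/ {
    proxy_pass http://127.0.0.1:5000;  # assumed upstream; match the real API port
    proxy_set_header Host $host;
    proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
}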