- Easy submission of spike sorting jobs, locally or to the cloud
- Keep track of your sorting jobs

Set ENV variables:

```bash
export AWS_DEFAULT_REGION=
export AWS_ACCESS_KEY_ID=
export AWS_SECRET_ACCESS_KEY=
export AWS_BATCH_JOB_QUEUE=
export AWS_BATCH_JOB_DEFINITION=
export DANDI_API_KEY=
```
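The AWS variables are what a cloud submission needs: boto3 picks up the region and credentials from the environment, while the Batch queue and job definition name the compute target. As a minimal, purely illustrative sketch of how these could drive a submission (not the app's actual submission code; the job name and container overrides below are hypothetical):

```python
import os

import boto3  # reads AWS_DEFAULT_REGION / AWS_ACCESS_KEY_ID / AWS_SECRET_ACCESS_KEY from the environment

# Hypothetical example: submit one sorting job to AWS Batch.
batch = boto3.client("batch")
response = batch.submit_job(
    jobName="spike-sorting-example",  # hypothetical job name
    jobQueue=os.environ["AWS_BATCH_JOB_QUEUE"],
    jobDefinition=os.environ["AWS_BATCH_JOB_DEFINITION"],
    containerOverrides={
        "environment": [
            # Pass the DANDI token through to the worker container (hypothetical)
            {"name": "DANDI_API_KEY", "value": os.environ["DANDI_API_KEY"]},
        ]
    },
)
print(response["jobId"])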
Running with docker compose, pulling images from GitHub Packages:

```bash
docker compose up
```

Running with docker compose, building images locally (for dev, with hot reload):

```bash
docker compose -f docker-compose-dev.yml up
```

If you made any changes to requirements.txt, package.json, or the Dockerfile, you should stop the containers and run again with an extra `--build` flag:

```bash
docker compose down
docker compose -f docker-compose-dev.yml up --build
```
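Once the stack is up, you can sanity-check that the REST API is reachable. A minimal sketch, assuming the API is mapped to localhost port 8000 (the port mapping depends on your compose file); FastAPI serves interactive docs at `/docs` by default:

```python
import urllib.request

# Hypothetical smoke test: adjust the port to match your compose file.
url = "http://localhost:8000/docs"  # /docs is FastAPI's default interactive docs route
with urllib.request.urlopen(url) as resp:
    print(resp.status)  # expect 200 if the REST API container is healthy
```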
Run the REST API standalone (dev):

```bash
cd rest
python main.py
```
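For reference, launching a FastAPI app with `python main.py` usually means the module starts uvicorn itself. A minimal sketch of that common pattern (the project's actual main.py, routes, and port may differ; the health route below is hypothetical):

```python
import uvicorn
from fastapi import FastAPI

app = FastAPI()

@app.get("/health")  # hypothetical route, for illustration only
def health():
    return {"status": "ok"}

if __name__ == "__main__":
    # Running `python main.py` starts the dev server directly.
    uvicorn.run(app, host="0.0.0.0", port=8000)  # port is an assumption
```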
Run the frontend standalone (dev):

```bash
cd frontend
yarn start
```
The app is composed of four components:

- rest: the REST API, a FastAPI app
- frontend: the frontend, a React app
- db: the database, a Postgres database
- worker: the worker, a sorter container with a Flask app (see the sketch below)
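To make the worker's role concrete, here is a minimal, purely illustrative Flask sketch of a sorter container's job endpoint; the route name, payload, and behavior are hypothetical, not the project's actual interface:

```python
from flask import Flask, jsonify, request

app = Flask(__name__)

@app.route("/run", methods=["POST"])  # hypothetical endpoint name
def run_sorting():
    params = request.get_json()
    # A real worker would launch a spike sorter here with the given parameters;
    # this stub only acknowledges the request.
    return jsonify({"status": "accepted", "params": params})

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000)  # Flask's default port
```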
Build and push the worker image:

```bash
DOCKER_BUILDKIT=1 docker build -t ghcr.io/catalystneuro/si-sorting-worker:latest -f Dockerfile.combined .
docker push ghcr.io/catalystneuro/si-sorting-worker:latest
```