This repository has been archived by the owner on Apr 16, 2023. It is now read-only.


Project Development Environment (Docker)

Local prerequisites

| dependency | supported versions | installation recommendation for Ubuntu 14.04 |
| --- | --- | --- |
| docker | 1.10 (1.9 may still work, too) | Follow the official Docker installation instructions for Ubuntu Linux |
| docker-compose | 1.6 | Install system-wide via pip (not via pip3! docker-compose is implemented in Python 2): `sudo pip install docker-compose` |
| pg_config (required by psycopg2) | | `sudo apt install libpq-dev` |
| Python 3 | 3.4 | Would be pulled in by python3-gdal or python3-dev, but you should install it explicitly: `sudo apt install python3` |
| various PyPI packages | see requirements-all.txt | Install in a Python 3 virtual environment (instructions below) |
| pip3 | | Provided by virtualenv in the virtual Python environments it creates |
| virtualenv (Python 2 and 3) | | Install system-wide: `sudo apt install python-virtualenv` |
| Python 3 C bindings | 3.4 | Would be pulled in by python3-gdal, but you should install it explicitly: `sudo apt install python3-dev` |
| GEOS C library | whatever GeoDjango supports | `sudo apt install python3-gdal` pulls this and other required libraries in; python3-gdal itself is not required, so if you prefer a more minimal installation, only install libgeos-c1 |
| GDAL library | whatever GeoDjango supports | `sudo apt install python3-gdal` pulls this and other required libraries in; python3-gdal itself is not required, so if you prefer a more minimal installation, only install libgdal1h |
| pip (Python 2) | | `sudo apt install python-pip` |

To install the PyPI packages, create a Python 3 virtual environment, e.g.

mkdir -p ~/.virtualenvs && \
virtualenv ~/.virtualenvs/osmaxx -p python3

activate it with

source ~/.virtualenvs/osmaxx/bin/activate

and then, in the same shell session, use pip3 to install the packages:

# run from this repo's root dir
pip3 install -r requirements-all.txt

virtualenvwrapper users can perform all of the above in a single step:

# run from this repo's root dir
mkvirtualenv -r requirements-all.txt \
-a . -p $(which python3) osmaxx

The -a . also associates the repo root as the project directory of the new Python 3 virtual environment.

To commit using the pre-commit hook (which really should be used), flake8 needs to be installed on the local system.

For Ubuntu and Debian this is:

sudo apt-get install python3-flake8

Then the pre-commit hook can be linked into the repository's Git hooks:

cd <osmaxx-repo-root>
ln -s ../../hooks/pre-commit .git/hooks/pre-commit
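The symlink target is relative to `.git/hooks/` (where the link lives), which is why it climbs two directories back to the repo root. A self-contained sketch in a throwaway directory (not a real repo) shows how the link resolves:

```shell
# Build a minimal directory layout mimicking the repo (throwaway, for illustration only)
repo=$(mktemp -d)
mkdir -p "$repo/hooks" "$repo/.git/hooks"
echo '#!/bin/sh' > "$repo/hooks/pre-commit"

# Same command as above, run from the repo root:
cd "$repo"
ln -s ../../hooks/pre-commit .git/hooks/pre-commit

# Resolved from inside .git/hooks/, the link points back at hooks/pre-commit:
readlink -f .git/hooks/pre-commit
```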

Python package management

We use pip-tools to manage the PyPI packages we depend on. It shouldn't matter whether you install pip-tools globally or within the osmaxx virtualenv.

Once pip-tools is installed, try running pip-sync in your osmaxx virtualenv. If it tells you to upgrade or downgrade pip do as instructed (also in the virtualenv) and repeat until pip-sync doesn't complain anymore. (You can ignore warnings about a newer pip version being available.)

Syncing installed python packages

pip-tools implicitly pins package versions, so they stay in sync across developers. The pinned versions are tracked in the *requirements*.txt files. Running

pip-sync *requirements*.txt

in your virtualenv will install, uninstall, up- and downgrade packages as necessary to make your virtualenv match the packages listed in all these files. You'll want to do this whenever the *requirements*.txt files have changed, e.g. due to pulling in commits or switching branches.

Note that in staging and production, only the content of requirements.txt (not of requirements-all.txt) should be installed. (The docker containers already take care of that.)

Changing requirements

We track top-level requirements and explicitly pinned versions in *requirements*.in files. If you've changed one of those, run

pip-compile <the changed *.in file>

e.g.

pip-compile requirements-local.in

to update the corresponding *.txt file. This will also upgrade the versions listed in that *.txt file to the greatest ones available on PyPI (within the range allowed in the *requirements*.in file).
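For illustration, a compiled *requirements*.txt file pins every package, including transitive dependencies. A hypothetical excerpt (package names and version numbers are illustrative only):

```text
#
# This file is autogenerated by pip-compile
# To update, run:
#
#    pip-compile requirements-local.in
#
django==1.8.7        # hypothetical pinned version from the .in range
psycopg2==2.6.1      # transitive dependency, pinned as well
```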

To compile the *requirements*.txt files for all changed *requirements*.in and sync the new versions to your virtualenv in one go, you may use our handy GNU make target:

make pip-sync-all

Upgrade package versions

To upgrade the implicitly pinned versions of all dependencies to the greatest ones available on PyPI (within the range allowed in the *requirements*.in files) run

make pip-upgrade

Using the project docker setup

A docker-compose setup is provided as part of this project's repository. Make sure Docker is installed and set the containers up properly, as described in the README.

Running commands

The general way of running any command inside a docker container:

docker-compose run <container> <command>

Examples:

Execute a shell in the webapp:

docker-compose run webapp /bin/bash

Run all tests

(Requires Python 3 on the host.)

./runtests.py

To run the application tests only, see Commonly used commands while developing / Run tests.

Access the application

http://<your_ip>:8889

where <your_ip> is your (public) IP as reported by

ip route get 1 | awk '{print $NF;exit}'

You can generate the complete URL in sh with:

echo "http://$(ip route get 1 | awk '{print $NF;exit}'):8889"
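On Ubuntu 14.04 the first line of `ip route get 1` ends with `src <your-ip>`, so the last whitespace-separated awk field is the address. A self-contained check on made-up sample data:

```shell
# A typical first line of `ip route get 1` (addresses are made up):
sample='1.0.0.0 via 192.168.1.1 dev eth0  src 192.168.1.23'

# $NF is the last field, i.e. the src address; exit stops after the first line:
echo "$sample" | awk '{print $NF; exit}'
```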

Enable development with debug toolbar enabled

In your docker-compose.yml file, add the host's address on the docker0 bridge, as reported by:

ip -4 addr show docker0 | grep -Po 'inet \K[\d.]+'

Add it to the docker-compose.yml:

webapp:
   ...
   environment:
   ...
    - DJANGO_INTERNAL_IPS=172.17.42.1 # IP from the command above
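The `grep -Po 'inet \K[\d.]+'` part keeps only the run of digits and dots after `inet `: `\K` discards everything matched so far, so just the bare address is printed. Applied to a sample `ip -4 addr show docker0` line (address made up):

```shell
# Sample line as printed by `ip -4 addr show docker0` (made-up address):
sample='    inet 172.17.42.1/16 scope global docker0'

# \K drops the matched "inet " prefix; [\d.]+ stops before the /16 suffix:
echo "$sample" | grep -Po 'inet \K[\d.]+'
```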

Note: More automatic in docker-compose 1.5

With docker-compose 1.5 or later, variable substitution can fill the value in for you (see docker/compose#1765). docker-compose does not execute shell command substitutions, but it does substitute environment variables, so export the address on the host first (the variable name DOCKER0_IP is just an example):

export DOCKER0_IP=$(ip -4 addr show docker0 | grep -Po 'inet \K[\d.]+')

Then reference it in docker-compose.yml:

- DJANGO_INTERNAL_IPS=${DOCKER0_IP}

Reset the box

Normally, just stopping the containers, removing them and updating them is enough:

docker-compose stop # shutdown all containers
# to force shutdown: docker-compose kill
docker-compose rm -f
docker-compose build

# run migrations and create super user, commands see in README

To rebuild from scratch, destroy the containers and start over: replace the docker-compose build step above with docker-compose build --no-cache.

NOTICE: Rebuilding everything might be more than you want; you can rebuild single images using docker-compose build --no-cache <imagename>. For example, rebuilding only the webapp would be docker-compose build --no-cache webapp.

Useful Docker commands

Save docker image to file:

docker save osmaxx_osmaxxdatabase > /tmp/osmaxx-database-alpha1.docker-img.tar
docker save osmaxx_osmaxxwebapp > /tmp/osmaxx-webapp-alpha1.docker-img.tar

Load docker image from file:

docker load < /tmp/osmaxx-database-alpha1.docker-img.tar
docker load < /tmp/osmaxx-webapp-alpha1.docker-img.tar

Email sending

Emails during development and testing can be found locally under /tmp/osmaxx-development-emails. This directory holds all sent emails.
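Since each message ends up as its own file in that directory, plain shell tools suffice for inspection. A sketch, simulating the directory with dummy files (the real one only exists once the containers have sent mail; file names and contents are made up):

```shell
# Simulate the email directory (the real path is /tmp/osmaxx-development-emails):
maildir=$(mktemp -d)
printf 'Subject: first\n\nhello\n'  > "$maildir/msg1.log"
printf 'Subject: second\n\nworld\n' > "$maildir/msg2.log"
touch -d '2016-01-01 10:00' "$maildir/msg1.log"
touch -d '2016-01-01 11:00' "$maildir/msg2.log"

# `ls -t` sorts newest first, so the top entry is the most recent email:
newest=$(ls -t "$maildir" | head -n 1)
cat "$maildir/$newest"
```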

If you find yourself in need of changing the email settings, have a look at the Django settings for emails: Django email settings

To change them, prepend DJANGO_ to the settings name and add it to the environment in the compose-development.yml.

The following settings are supported:

    - DJANGO_EMAIL_FILE_PATH=/dev-emails
    - DJANGO_EMAIL_BACKEND=django.core.mail.backends.filebased.EmailBackend
    - DJANGO_EMAIL_HOST=localhost
    - DJANGO_EMAIL_HOST_PASSWORD=''
    - DJANGO_EMAIL_HOST_USER=''
    - DJANGO_EMAIL_PORT=25
    - DJANGO_EMAIL_SUBJECT_PREFIX='[OSMaxx-dev] '
    - DJANGO_EMAIL_USE_TLS=False
    - DJANGO_EMAIL_USE_SSL=False
    - DJANGO_EMAIL_TIMEOUT=None
    - DJANGO_EMAIL_SSL_CERTFILE=None
    - DJANGO_EMAIL_SSL_KEYFILE=None