Improved Dockerfile to allow IDEs to use the container as remote interpreter (#67)

ysavary authored May 18, 2023
1 parent 0ef5a3c commit 9202928
Showing 6 changed files with 83 additions and 44 deletions.
2 changes: 1 addition & 1 deletion .tool-versions
@@ -1 +1 @@
python 3.10.6
python 3.10.11
29 changes: 14 additions & 15 deletions Dockerfile
@@ -1,27 +1,26 @@
FROM python:3.10.6-slim-bullseye AS base
FROM python:3.10.11-slim-bullseye AS base

ENV LANG C.UTF-8
ENV LC_ALL C.UTF-8

RUN apt update; \
    apt --yes --no-install-recommends install libpq5 libmariadb3
RUN apt-get update; \
    apt-get --yes --no-install-recommends install libpq5 libmariadb3

FROM base AS python
FROM base AS python-dependencies

RUN apt update; \
    apt --yes --no-install-recommends install build-essential libpq-dev libmariadb-dev curl
RUN curl -sSL https://install.python-poetry.org | python - --version 1.3.2
RUN apt-get update; \
    apt-get --yes --no-install-recommends install build-essential libpq-dev libmariadb-dev curl
RUN curl -sSL https://install.python-poetry.org | python - --version 1.4.2

COPY . .
RUN POETRY_VIRTUALENVS_IN_PROJECT=true /root/.local/bin/poetry install --no-dev
COPY pyproject.toml poetry.lock ./
RUN POETRY_VIRTUALENVS_IN_PROJECT=true /root/.local/bin/poetry install --only=main

FROM base AS runtime

ENV PATH="/.venv/bin:$PATH"
COPY --from=python-dependencies /.venv /.venv
ENV PATH=/.venv/bin:$PATH

COPY . .
COPY . /opt/project/
WORKDIR /opt/project/

FROM runtime AS production

COPY --from=python /.venv /.venv
ENTRYPOINT ["python", "run_providers.py"]
ENTRYPOINT ["python", "run_scheduler.py"]
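Reassembled from the hunk above, the resulting multi-stage Dockerfile should read roughly as follows (blank lines and comments are inferred; the committed file is authoritative):

```dockerfile
FROM python:3.10.11-slim-bullseye AS base

ENV LANG C.UTF-8
ENV LC_ALL C.UTF-8

# Runtime libraries only; build tools stay in the python-dependencies stage.
RUN apt-get update; \
    apt-get --yes --no-install-recommends install libpq5 libmariadb3

FROM base AS python-dependencies

RUN apt-get update; \
    apt-get --yes --no-install-recommends install build-essential libpq-dev libmariadb-dev curl
RUN curl -sSL https://install.python-poetry.org | python - --version 1.4.2

# Copy only the dependency manifests so the poetry install layer is cached.
COPY pyproject.toml poetry.lock ./
RUN POETRY_VIRTUALENVS_IN_PROJECT=true /root/.local/bin/poetry install --only=main

FROM base AS runtime

COPY --from=python-dependencies /.venv /.venv
ENV PATH=/.venv/bin:$PATH

# Project sources live at a fixed path an IDE can map to when using the
# container as a remote interpreter.
COPY . /opt/project/
WORKDIR /opt/project/

FROM runtime AS production

ENTRYPOINT ["python", "run_scheduler.py"]
```

The `runtime` stage carries the virtualenv and the sources under `/opt/project` but sets no entrypoint, which appears to be what lets an IDE use the image as a remote interpreter; `production` only adds the scheduler entrypoint on top.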
87 changes: 65 additions & 22 deletions README.md
@@ -31,7 +31,7 @@ Create a `.env` file from `.env.template` which will be read by docker compose:

Then start the external services and the providers scheduler:

- `docker compose --profile=application up --build`
- `docker compose --profile=scheduler up --build`

Some providers need [winds-mobi-admin](https://github.com/winds-mobi/winds-mobi-admin#run-the-project-with-docker-compose-simple-way) running to get stations metadata.

@@ -72,7 +72,7 @@ Then start the external services:

#### Run only a provider

- `PYTHONPATH=. dotenv -f .env.localhost run python providers/ffvl.py`
- `dotenv -f .env.localhost run python providers/ffvl.py`

### Contributing

@@ -89,7 +89,7 @@ You know a good weather station that would be useful for many paraglider pilots
Awesome! Fork this repository and create a pull request with your new provider code. It's easy, look at the following
example:

my_provider.py
providers/my_provider.py
```
import arrow
import requests
@@ -103,36 +103,79 @@ class MyProvider(Provider):
    def process_data(self):
        self.log.info("Processing MyProvider data...")
        data = requests.get(
            "https://api.my-provider.com/stations.json", timeout=(self.connect_timeout, self.read_timeout)
        )
        for data_dict in data.json():
            station = self.save_station(
                provider_id=data_dict["id"],
                short_name=data_dict["name"],
        # data = requests.get(
        #     "https://api.my-provider.com/stations.json", timeout=(self.connect_timeout, self.read_timeout)
        # ).json()
        data = [{
            "id": "station-1",
            "name": "Station 1",
            "latitude": 46.713,
            "longitude": 6.503,
            "status": "ok",
            "measures": [{
                "time": arrow.now().format("YYYY-MM-DD HH:mm:ssZZ"),
                "windDirection": 180,
                "windAverage": 10.5,
                "windMaximum": 20.1,
                "temperature": 25.7,
                "pressure": 1013,
            }]
        }]
        for station in data:
            winds_station = self.save_station(
                provider_id=station["id"],
                short_name=station["name"],
                name=None,  # Lets winds.mobi provide the full name with the help of Google Geocoding API
                latitude=data_dict["latitude"],
                longitude=data_dict["longitude"],
                status=StationStatus.GREEN if data_dict["status"] == "ok" else StationStatus.RED,
                latitude=station["latitude"],
                longitude=station["longitude"],
                status=StationStatus.GREEN if station["status"] == "ok" else StationStatus.RED,
                url=f"https://my-provider.com/stations/{station['id']}",
            )
            measure_key = arrow.get(data_dict["lastMeasure"]["time"], "YYYY-MM-DD HH:mm:ssZZ").int_timestamp
            measures_collection = self.measures_collection(station["_id"])
            measure_key = arrow.get(station["measures"][0]["time"], "YYYY-MM-DD HH:mm:ssZZ").int_timestamp
            measures_collection = self.measures_collection(winds_station["_id"])
            if not self.has_measure(measures_collection, measure_key):
                new_measure = self.create_measure(
                    for_station=station,
                    for_station=winds_station,
                    _id=measure_key,
                    wind_direction=data_dict["lastMeasure"]["windDirection"],
                    wind_average=Q_(data_dict["lastMeasure"]["windAverage"], ureg.meter / ureg.second),
                    wind_maximum=Q_(data_dict["lastMeasure"]["windMaximum"], ureg.meter / ureg.second),
                    temperature=Q_(data_dict["lastMeasure"]["temp"], ureg.degC),
                    pressure=Pressure(qnh=Q_(data_dict["lastMeasure"]["pressure"], ureg.hPa)),
                    wind_direction=station["measures"][0]["windDirection"],
                    wind_average=Q_(station["measures"][0]["windAverage"], ureg.meter / ureg.second),
                    wind_maximum=Q_(station["measures"][0]["windMaximum"], ureg.meter / ureg.second),
                    temperature=Q_(station["measures"][0]["temperature"], ureg.degC),
                    pressure=Pressure(station["measures"][0]["pressure"], qnh=None, qff=None),
                )
                self.insert_new_measures(measures_collection, station, [new_measure])
                self.insert_new_measures(measures_collection, winds_station, [new_measure])
        self.log.info("...Done !")


def my_provider():
    MyProvider().process_data()


if __name__ == "__main__":
    my_provider()
```

##### And test it

Start the external services:

- `docker compose up`

Build a Docker image containing your new provider `providers/my_provider.py`:

- `docker build --tag=winds.mobi/my_provider .`

Then run your provider inside a container with:

- `docker run -it --rm --env-file=.env --network=winds-mobi-providers --entrypoint=python winds.mobi/my_provider -m providers.my_provider`

To avoid building a new image on every change, you can mount your local source to the container directory `/opt/project`
with a Docker volume:

- `docker run -it --rm --env-file=.env --network=winds-mobi-providers --volume=$(pwd):/opt/project --entrypoint=python winds.mobi/my_provider -m providers.my_provider`

Licensing
---------

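A note on the example above: it calls several `winds_mobi_provider` helpers that this diff does not show. Purely as orientation, the usage implies an interface roughly like the sketch below; the signatures are inferred from the example alone, not copied from the real package, so treat every name and parameter not visible in the example as an assumption.

```python
# Hypothetical sketch of the Provider helpers used in the README example.
# Inferred from the calls shown above; the real winds_mobi_provider base
# class may differ in parameter names, defaults and return types.
from typing import Any, Optional


class Provider:
    connect_timeout: float
    read_timeout: float
    log: Any  # behaves like a logging.Logger in the example

    def save_station(
        self,
        provider_id: str,
        short_name: Optional[str],
        name: Optional[str],
        latitude: float,
        longitude: float,
        status: Any,  # StationStatus.GREEN or StationStatus.RED in the example
        url: Optional[str] = None,
    ) -> dict:
        """Upsert a station document and return it; the example reads its "_id"."""

    def measures_collection(self, station_id: str) -> Any:
        """Return the collection that stores measures for one station."""

    def has_measure(self, measures_collection: Any, key: int) -> bool:
        """True if a measure with this timestamp key is already stored."""

    def create_measure(self, for_station: dict, _id: int, **values: Any) -> dict:
        """Build a measure document (wind, temperature, pressure) for a station."""

    def insert_new_measures(self, measures_collection: Any, station: dict, measures: list) -> None:
        """Persist the new measures for the station."""
```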
2 changes: 1 addition & 1 deletion docker-compose.yaml
@@ -17,7 +17,7 @@ services:
    build:
      context: .
    profiles:
      - application
      - scheduler
    environment:
      MONGODB_URL: ${MONGODB_URL}
      REDIS_URL: ${REDIS_URL}
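For context, the hunk above sits inside one service definition in `docker-compose.yaml`. A minimal sketch of that block after the change, where the service name and everything outside the visible lines are assumptions rather than part of the commit:

```yaml
services:
  providers:            # service name assumed for illustration
    build:
      context: .
    profiles:
      - scheduler       # renamed from "application" in this commit
    environment:
      MONGODB_URL: ${MONGODB_URL}
      REDIS_URL: ${REDIS_URL}
```

Because the service sits behind a Compose profile, it only starts when that profile is selected, hence the README change to `docker compose --profile=scheduler up --build`.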
3 changes: 0 additions & 3 deletions pyproject.toml
@@ -4,9 +4,6 @@ description = "Python scripts that get the weather data from different providers"
version = "0.0.0"
authors = ["winds.mobi"]
license = " AGPL-3.0-only"
packages = [
    { include = "winds_mobi_provider" },
]

[tool.poetry.dependencies]
python = "3.10.*"
4 changes: 2 additions & 2 deletions run_providers.py → run_scheduler.py
@@ -5,7 +5,7 @@
from pydantic import parse_obj_as


def run_providers():
def run_scheduler():
    scheduler = BlockingScheduler()
    scheduler.configure(
        executors={
@@ -74,4 +74,4 @@ def run_providers():


if __name__ == "__main__":
    run_providers()
    run_scheduler()
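The context lines show that the renamed `run_scheduler.py` still builds an APScheduler `BlockingScheduler`. As a generic illustration of that pattern only (the job, interval and id below are invented, not the project's real configuration):

```python
# Generic APScheduler sketch, not the actual run_scheduler.py: register a
# provider's entry function as an interval job and block the main thread.
from apscheduler.schedulers.blocking import BlockingScheduler

from providers.my_provider import my_provider  # the example provider from the README above


def run_scheduler():
    scheduler = BlockingScheduler()
    # Interval and job id are placeholders; the real scheduler also configures
    # executors, as the context lines above show.
    scheduler.add_job(my_provider, "interval", minutes=5, id="my_provider")
    scheduler.start()  # blocks until the process is stopped


if __name__ == "__main__":
    run_scheduler()
```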
