Releases: getindata/kedro-airflow-k8s
Release 0.8.2
[0.8.2] - 2023-08-29
- Fixed incompatibility with Airflow's `dumb-init` entrypoint
Release 0.8.1
[0.8.1] - 2023-03-17
- `submit_operator` supports `node` and `pipeline` parameters
- Pass `node` and `pipeline` parameters to `submit_operator` in the `airflow_dag_template.j2` Jinja template
- Extra option to skip grouping Spark nodes
Release 0.8.0
[0.8.0] - 2022-03-23
- Support spark projects on K8S
- FIX: Handle the `KEDRO_ENV` environment variable
- FIX: Removed hardcoded project path from the Spark job
Release 0.7.3
[0.7.3] - 2021-11-16
- Take DAG status from the final task
- Fix finding PySpark DataFrames
- `MLFLOW_RUN_ID` passed as an environment variable to Dataproc-oriented pipelines
Release 0.7.2
[0.7.2] - 2021-10-25
- Support annotations with quotes
- For PySpark projects, allow configuring a post script for Dataproc initialization
- FIX: Broken support for kedro<0.17
Release 0.7.1
[0.7.1] - 2021-10-21
- Support for failure notifications via Slack
- FIX: Missing jinja template for dataproc init script
Release 0.7.0
[0.7.0] - 2021-10-19
- Schedule supports the `dag-name` parameter
- Support for Kedro with PySpark, using Google Dataproc
- Support for custom pod templates
- FIX: Added missing dependency package: tabulate
- FIX: Fixed default config template
- Support populating K8s node env_vars from Airflow variables
- Generalize auth handler
- Added `VarsAuthHandler` for MLflow authentication, which gets credentials from Airflow variables
- Changed logging level for pod creation requests to debug
Release 0.6.7
[0.6.7] - 2021-09-01
- Support for generation of authentication header for secured MLflow API endpoint (via GOOGLE_APPLICATION_CREDENTIALS)
Release 0.6.6
[0.6.6] - 2021-08-16
- Support for passing an Authorization header for a secured Airflow API endpoint (via the `AIRFLOW_API_TOKEN` environment variable)
- Logging `dag_id` and `execution_date` in MLflow run params
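The `AIRFLOW_API_TOKEN` mechanism follows a common pattern: read a token from the environment and attach it as an Authorization header on API calls. A minimal sketch; the helper name and the `Bearer` framing are assumptions, not necessarily the plugin's exact behavior:

```python
import os

def airflow_api_headers():
    """Build HTTP headers for calls to a secured Airflow API,
    attaching an Authorization header when AIRFLOW_API_TOKEN is set."""
    headers = {"Content-Type": "application/json"}
    token = os.environ.get("AIRFLOW_API_TOKEN")
    if token:
        # Bearer framing is an assumption; some setups pass the raw token.
        headers["Authorization"] = f"Bearer {token}"
    return headers

os.environ["AIRFLOW_API_TOKEN"] = "example-token"
assert airflow_api_headers()["Authorization"] == "Bearer example-token"
```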
Release 0.6.5
[0.6.5] - 2021-08-05
- FIX: Adjust service account setup for image based tasks