RELEASE.md (+17 -3)
@@ -1,8 +1,21 @@
-# Upcoming Release 0.19.6
+# Upcoming Release 0.19.7
+
+## Major features and improvements
+
+## Bug fixes and other changes
+
+## Breaking changes to the API
+
+## Documentation changes
+
+## Community contributions
+
+# Release 0.19.6

 ## Major features and improvements
 * Added `raise_errors` argument to `find_pipelines`. If `True`, the first pipeline for which autodiscovery fails will cause an error to be raised. The default behaviour is still to raise a warning for each failing pipeline.
 * It is now possible to use Kedro without having `rich` installed.
+* Updated custom logging behavior: `conf/logging.yml` will be used if it exists and `KEDRO_LOGGING_CONFIG` is not set; otherwise, `default_logging.yml` will be used.

 ## Bug fixes and other changes
 * User defined catch-all dataset factory patterns now override the default pattern provided by the runner.
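To illustrate the `raise_errors` flag added in the hunk above, here is a minimal sketch of a project `pipeline_registry.py`, modelled on Kedro's default project template; treat it as an illustration rather than the canonical file:

```python
from __future__ import annotations

from kedro.framework.project import find_pipelines
from kedro.pipeline import Pipeline


def register_pipelines() -> dict[str, Pipeline]:
    # With raise_errors=True, the first pipeline whose autodiscovery fails
    # raises immediately instead of only emitting a warning (the default).
    pipelines = find_pipelines(raise_errors=True)
    pipelines["__default__"] = sum(pipelines.values())
    return pipelines
```

The updated logging behaviour in the same hunk amounts to a resolution order. The sketch below paraphrases the release note and is not Kedro's actual implementation; the helper name `resolve_logging_config` is made up for illustration:

```python
import os
from pathlib import Path


def resolve_logging_config(project_root: Path) -> Path:
    """Pick the logging config file the way the release note describes."""
    env_value = os.environ.get("KEDRO_LOGGING_CONFIG")
    if env_value:                       # an explicit override always wins
        return Path(env_value)
    project_logging = project_root / "conf" / "logging.yml"
    if project_logging.exists():        # picked up automatically if present
        return project_logging
    return Path("default_logging.yml")  # placeholder for the file bundled with Kedro
```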
@@ -14,12 +27,13 @@

 ## Documentation changes
 * Improved documentation for custom starters
-* Added a new section on deploying Kedro project on AWS Airflow MWAA
+* Added a new docs section on deploying Kedro project on AWS Airflow MWAA
+* Detailed instructions on using `globals` and `runtime_params` with the `OmegaConfigLoader`

 ## Community contributions
 Many thanks to the following Kedroids for contributing PRs to this release:
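The `globals` and `runtime_params` documentation mentioned in the hunk above lends itself to a small, self-contained sketch. The file names and values below are invented for illustration; the behaviour shown (the `${globals:...}` resolver and runtime parameters taking precedence over values from files) is the documented one:

```python
from pathlib import Path
from tempfile import TemporaryDirectory

from kedro.config import OmegaConfigLoader

with TemporaryDirectory() as tmp:
    # Build a throwaway conf/ tree so the example runs on its own.
    (Path(tmp) / "base").mkdir()
    (Path(tmp) / "local").mkdir()
    (Path(tmp) / "base" / "globals.yml").write_text("bucket: s3://my-bucket\n")
    (Path(tmp) / "base" / "parameters.yml").write_text(
        'data_path: "${globals:bucket}/raw"\ntest_size: 0.2\n'
    )

    loader = OmegaConfigLoader(
        conf_source=tmp,
        base_env="base",
        default_run_env="local",
        # What `kedro run --params=test_size=0.3` would pass in a real project.
        runtime_params={"test_size": 0.3},
    )
    params = loader["parameters"]
    print(params["data_path"])  # s3://my-bucket/raw  (resolved via the globals resolver)
    print(params["test_size"])  # 0.3  (runtime params override values from files)
```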
docs/source/deployment/airflow.md (+3 -3)
@@ -18,13 +18,13 @@ This guide provides instructions on running a Kedro pipeline on different Airflo

 The following tutorial shows how to deploy an example [Spaceflights Kedro project](https://docs.kedro.org/en/stable/tutorial/spaceflights_tutorial.html) on [Apache Airflow](https://airflow.apache.org/) with [Astro CLI](https://docs.astronomer.io/astro/cli/overview), a command-line tool created by [Astronomer](https://www.astronomer.io/) that streamlines the creation of local Airflow projects. You will deploy it locally first, and then transition to Astro Cloud.

-[Astronomer](https://docs.astronomer.io/astro/install-cli) is a managed Airflow platform which allows users to spin up and run an Airflow cluster in production. Additionally, it also provides a set of tools to help users get started with Airflow locally in the easiest way possible.
+[Astronomer](https://www.astronomer.io/) is a managed Airflow platform which allows users to spin up and run an Airflow cluster in production. Additionally, it also provides a set of tools to help users get started with Airflow locally in the easiest way possible.

 ### Prerequisites

 To follow this tutorial, ensure you have the following:

-* The [Astro CLI installed](https://docs.astronomer.io/astro/install-cli)
+* The [Astro CLI installed](https://www.astronomer.io/docs/astro/cli/install-cli)
 * A container service like [Docker Desktop](https://docs.docker.com/get-docker/) (v18.09 or higher)
 * `kedro>=0.19` installed
 * [`kedro-airflow>=0.8`](https://github.com/kedro-org/kedro-plugins/tree/main/kedro-airflow) installed. We will use this plugin to convert the Kedro pipeline into an Airflow DAG.
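Before the next hunk moves on to setting up the Astro project, it may help to see roughly what the `kedro-airflow` conversion produces. This is a heavily simplified, hypothetical sketch, not the file the plugin actually generates (the real DAG uses its own operator and a task per node); the project path and node name are assumptions based on the Spaceflights tutorial:

```python
from datetime import datetime
from pathlib import Path

from airflow.models import DAG
from airflow.operators.python import PythonOperator
from kedro.framework.session import KedroSession
from kedro.framework.startup import bootstrap_project

# Assumed location of the Kedro project inside the Astro/Airflow image.
PROJECT_PATH = Path("/usr/local/airflow/new_kedro_project")


def run_kedro_node(node_name: str) -> None:
    """Run a single Kedro node inside its own KedroSession."""
    bootstrap_project(PROJECT_PATH)
    with KedroSession.create(project_path=PROJECT_PATH) as session:
        session.run(node_names=[node_name])


with DAG(dag_id="new-kedro-project", start_date=datetime(2024, 1, 1), schedule=None) as dag:
    preprocess_companies = PythonOperator(
        task_id="preprocess-companies-node",
        python_callable=run_kedro_node,
        op_kwargs={"node_name": "preprocess_companies_node"},
    )
```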
@@ -86,7 +86,7 @@ This step should produce a `.py` file called `new_kedro_project_airflow_dag.py`

 In this section, you will start by setting up a new blank Airflow project using Astro and then copy the files prepared in the previous section from the Kedro project. Next, you will need to customise the Dockerfile to enhance logging capabilities and manage the installation of our Kedro package. Finally, you will be able to run and explore the Airflow cluster.

-1. To complete this section, you have to install both the [Astro CLI](https://docs.astronomer.io/astro/install-cli) and [Docker Desktop](https://docs.docker.com/get-docker/).
+1. To complete this section, you have to install both the [Astro CLI](https://www.astronomer.io/docs/astro/cli/install-cli) and [Docker Desktop](https://docs.docker.com/get-docker/).

 2. [Initialise an Airflow project with Astro](https://docs.astronomer.io/astro/cli/develop-project) in a new folder outside of your Kedro project. Let's call it `kedro-airflow-spaceflights`