Installing and running Apache Airflow on a single node is quite simple. Follow these steps:
- Install Python 3.6+ (https://www.python.org/downloads/)
- From the command console with administrative privileges, run the following commands:
$ mkdir airflow
$ cd airflow
$ mkdir dags
$ mkdir plugins
$ pip install apache-airflow
- Export AIRFLOW_HOME with the following command, depending on your OS:
- On Windows, this is done as follows:
$ set AIRFLOW_HOME=%cd%
- On macOS or Linux, this is done as follows:
$ export AIRFLOW_HOME=$(pwd)
- Initialize the metadata database with the following command (in Airflow 2 and later, the equivalent command is airflow db init):
$ airflow initdb
We are now ready to start Airflow. Follow these steps:
- Start the scheduler:
$ airflow scheduler
- Start the web interface with the following command:
$ airflow webserver -p 9999
We can now open the browser to http://localhost:9999 and see the Airflow user interface.
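At this point the dags folder we created earlier is still empty, so the interface shows only the bundled examples. A minimal DAG file dropped into that folder would look something like the following sketch; the file name, dag_id, and task are illustrative, and it uses the Airflow 1.x import paths that match the commands above:

```python
# dags/hello_dag.py -- minimal illustrative DAG for the setup above
from datetime import datetime

from airflow import DAG
from airflow.operators.bash_operator import BashOperator  # Airflow 1.x path

dag = DAG(
    dag_id="hello_airflow",
    start_date=datetime(2021, 1, 1),
    schedule_interval="@daily",
)

# A single task that just echoes a message via bash
hello = BashOperator(
    task_id="say_hello",
    bash_command="echo hello from airflow",
    dag=dag,
)
```

Once the scheduler picks the file up, the DAG appears in the web interface and can be triggered there.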
We now have to define our connection to KairosDB. Unfortunately, Airflow doesn't ship with a KairosDB connection type, but one can easily be created by building a simple operator.
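The core of such an operator is just building the JSON body that KairosDB's REST endpoint (POST /api/v1/datapoints) accepts and POSTing it. A hedged sketch, using only the standard library; the function names, metric, and tags are illustrative, not part of any Airflow or KairosDB API:

```python
import json
import time
from urllib import request


def build_kairosdb_payload(metric, value, tags, timestamp_ms=None):
    """Return the JSON body for a single KairosDB data point."""
    if timestamp_ms is None:
        timestamp_ms = int(time.time() * 1000)  # KairosDB expects milliseconds
    return json.dumps([{
        "name": metric,
        "datapoints": [[timestamp_ms, value]],
        "tags": tags,
    }])


def push_to_kairosdb(host, payload):
    """POST the payload to KairosDB; 204 No Content means it was stored."""
    req = request.Request(
        "http://%s/api/v1/datapoints" % host,
        data=payload.encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        return resp.status
```

Turning this into a custom operator then amounts to subclassing airflow.models.BaseOperator and calling push_to_kairosdb() from its execute() method, which is what we build next.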