Intermediate. You'll learn how to set up a data pipeline using Apache Airflow. Note: this skill deck assumes you are comfortable writing Python code.
This blog post will teach you some of the key concepts used in Apache Airflow.
This blog post will teach you how to install and configure Apache Airflow locally.
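As a rough sketch of what a local setup involves, the commands below install Airflow with its official constraints file and start an all-in-one local instance. The version numbers are examples only — pick the current Airflow release and your Python version when you follow the linked post.

```shell
# Install Airflow into a virtual environment using the official constraints
# file (pins transitive dependencies to tested versions).
# 2.9.3 / 3.11 below are placeholders; substitute current versions.
pip install "apache-airflow==2.9.3" \
  --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-2.9.3/constraints-3.11.txt"

# Start a local all-in-one instance (webserver, scheduler, SQLite metadata DB)
# and print a login for the UI at http://localhost:8080.
airflow standalone
```

`airflow standalone` is meant for local development only; a production deployment runs the scheduler, webserver, and workers as separate services.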
The Airflow documentation is comprehensive. To start, skim the sections on Operators and on writing DAGs; that is enough to build your first pipeline. You can always come back later to learn about other Airflow constructs.
Once you understand Airflow's key concepts and have it running locally, work through the following example: building a data pipeline that scrapes data from a website and continuously updates a disease-outbreaks dataset for you.
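The "continuously updates" part of such a pipeline usually boils down to merging freshly scraped rows into the existing dataset without duplicating records. A minimal sketch of that merge step, assuming each record is keyed by a hypothetical `(date, region)` pair and that newer scrapes should win:

```python
def merge_records(existing, new):
    """Merge newly scraped rows into the dataset.

    Rows are keyed by (date, region); a new row with the same key
    replaces the old one, so re-running the scrape is idempotent.
    """
    by_key = {(r["date"], r["region"]): r for r in existing}
    for r in new:
        by_key[(r["date"], r["region"])] = r
    return sorted(by_key.values(), key=lambda r: (r["date"], r["region"]))


# Illustrative data: the second scrape revises one row and adds another.
existing = [{"date": "2024-01-01", "region": "A", "cases": 5}]
new = [
    {"date": "2024-01-01", "region": "A", "cases": 7},
    {"date": "2024-01-02", "region": "B", "cases": 3},
]
merged = merge_records(existing, new)  # 2 rows; the A/Jan-1 row now has 7 cases
```

In the Airflow version of the pipeline, this function would sit inside the task that runs after the scrape, reading and rewriting the dataset file; keeping the merge idempotent means a retried or backfilled run cannot corrupt the data.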