This project focuses on extracting YouTube data using Python, Airflow, MinIO, and PostgreSQL. The workflow includes several key steps:
Set up Docker containers for Airflow and MinIO. Airflow orchestrates the data pipeline, while MinIO acts as the object storage system. The PostgreSQL server is managed through DBeaver (a cross-platform database client).
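A minimal docker-compose sketch for this setup might look like the following. The image tags, credentials, and port mappings are assumptions, not the project's actual configuration:

```yaml
services:
  airflow:
    image: apache/airflow:2.9.2        # version tag is an assumption
    command: standalone                # all-in-one mode, suitable for local development
    ports:
      - "8080:8080"                    # Airflow web UI
    volumes:
      - ./dags:/opt/airflow/dags       # mount local DAG files into the container
  minio:
    image: minio/minio
    command: server /data --console-address ":9001"
    environment:
      MINIO_ROOT_USER: minioadmin      # placeholder credentials
      MINIO_ROOT_PASSWORD: minioadmin
    ports:
      - "9000:9000"                    # S3-compatible API
      - "9001:9001"                    # web console
```

With both services up, `docker compose up -d` gives you the Airflow UI on port 8080 and the MinIO console on port 9001.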
Create an Airflow DAG that orchestrates the extraction of YouTube data. The DAG defines tasks that extract the raw data and transform it into a tabular shape.
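The transform step can be sketched as a plain Python function that flattens YouTube Data API v3 `videos.list` items into flat rows; in the DAG it would run inside a PythonOperator or `@task`. The exact field names kept here (`video_id`, `title`, `published_at`, view and like counts) are an assumption about the project's schema:

```python
def transform_items(items):
    """Flatten YouTube Data API v3 `videos.list` items into flat row dicts.

    The API returns counts as strings, so they are cast to int here.
    """
    rows = []
    for item in items:
        snippet = item.get("snippet", {})
        stats = item.get("statistics", {})
        rows.append({
            "video_id": item.get("id"),
            "title": snippet.get("title"),
            "published_at": snippet.get("publishedAt"),
            "view_count": int(stats.get("viewCount", 0)),
            "like_count": int(stats.get("likeCount", 0)),
        })
    return rows
```

Keeping the transform free of Airflow imports makes it easy to unit-test outside the scheduler.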
After extraction and transformation, the resulting data is written to a CSV file stored locally or within the Airflow environment. This file contains the relevant YouTube data.
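Writing the transformed rows out as CSV needs nothing beyond the standard library; a small sketch, assuming the rows are dicts with a uniform set of keys:

```python
import csv

def write_rows_to_csv(rows, path):
    """Write a list of dict rows to a CSV file with a header row."""
    fieldnames = list(rows[0].keys())   # assumes at least one row, uniform keys
    with open(path, "w", newline="", encoding="utf-8") as f:
        writer = csv.DictWriter(f, fieldnames=fieldnames)
        writer.writeheader()
        writer.writerows(rows)
```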
The CSV file is then uploaded to MinIO, which provides scalable, S3-compatible storage for the data.
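The upload step can be sketched with the MinIO Python SDK's client interface (`bucket_exists` / `make_bucket` / `fput_object`); the client is passed in so the function stays testable, and the bucket name used below is a placeholder:

```python
from pathlib import Path

def upload_csv(client, bucket, csv_path):
    """Upload a local CSV file to MinIO under its file name.

    `client` is expected to follow the `minio.Minio` interface.
    Returns the object name written to the bucket.
    """
    if not client.bucket_exists(bucket):
        client.make_bucket(bucket)
    object_name = Path(csv_path).name
    client.fput_object(bucket, object_name, csv_path)
    return object_name
```

In the real DAG this would be called with `Minio("minio:9000", access_key=..., secret_key=...)` built from the compose credentials.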
Finally, migrate the data from MinIO into the PostgreSQL server. This step involves creating the necessary tables in PostgreSQL and loading the CSV data into them.
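The load step can be sketched as two functions: one builds the `CREATE TABLE` DDL, the other streams the downloaded CSV in via PostgreSQL's `COPY`. The connection is passed in (e.g. a `psycopg2` connection, whose cursors expose `copy_expert`), and the table and column names are assumptions about the project's schema:

```python
def build_create_table_sql(table, columns):
    """Build a CREATE TABLE statement from a {column_name: pg_type} mapping."""
    cols = ",\n    ".join(f"{name} {pg_type}" for name, pg_type in columns.items())
    return f"CREATE TABLE IF NOT EXISTS {table} (\n    {cols}\n);"

def load_csv(conn, table, csv_path, columns):
    """Create the target table if needed, then bulk-load the CSV via COPY."""
    with conn.cursor() as cur:
        cur.execute(build_create_table_sql(table, columns))
        with open(csv_path, encoding="utf-8") as f:
            cur.copy_expert(f"COPY {table} FROM STDIN WITH CSV HEADER", f)
    conn.commit()
```

`COPY` is much faster than row-by-row `INSERT`s for bulk loads, which is why it is the usual choice for this kind of CSV-to-Postgres migration.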