Job Description
Design and maintain our data lakehouse, and build ETL/ELT pipelines that process billions of events daily using Spark and Airflow. A representative sketch of this kind of pipeline follows.
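To give a flavor of the day-to-day work, here is a minimal sketch of an Airflow DAG that submits a daily Spark transformation job. The DAG id, script path, and connection id are hypothetical, and the sketch assumes Airflow 2.4+ with the Apache Spark provider installed; it is an illustration, not our actual pipeline code.

# Minimal sketch of a daily ETL DAG (illustrative only).
# All names (event_etl_daily, jobs/transform_events.py, spark_default) are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.providers.apache.spark.operators.spark_submit import SparkSubmitOperator

with DAG(
    dag_id="event_etl_daily",          # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                 # assumes Airflow 2.4+ scheduling syntax
    catchup=False,
) as dag:
    # Submit a PySpark job that transforms raw events into lakehouse tables.
    transform_events = SparkSubmitOperator(
        task_id="transform_events",
        application="jobs/transform_events.py",  # hypothetical Spark script
        conn_id="spark_default",
    )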
Skills & Experience
3+ years of data engineering experience. Strong SQL, Spark, and Airflow skills. Knowledge of cloud data warehousing.
Benefits
Remote-first, flexible hours, learning budget.