Data Engineer

Warsaw

$170k-$235k / year

full-time · mid · media

🛠 Tech Stack

AWS · Databricks · BigQuery · Snowflake · Apache Spark · Airflow · Delta Lake · Python · SQL

💼 About This Role

You'll design and maintain scalable data pipelines on AWS, Databricks, BigQuery, and Snowflake. You'll own end-to-end data workflows using Apache Spark, Airflow, and Delta Lake to power analytics and reporting. This role offers hands-on impact on a platform serving 1.5 billion user profiles.

🎯 What You'll Do

  • Design and maintain scalable data pipelines using Spark, Airflow, and Databricks
  • Optimize data transformations for large-scale datasets with Delta Lake
  • Own Airflow DAGs and orchestration workflows for reliable data delivery
  • Implement monitoring, logging, and alerting for production data workflows

📋 Requirements

  • 2–4 years professional experience in data engineering
  • Strong proficiency in Python and SQL
  • Experience with Apache Airflow for workflow orchestration
  • Hands-on experience with Databricks and Apache Spark (PySpark)

✨ Nice to Have

  • Experience with BigQuery or Snowflake as primary data warehousing platforms
  • Familiarity with Databricks Unity Catalog for data governance
  • Background in ad tech, media measurement, or streaming-data domains

🎁 Benefits & Perks

  • 💻 Work with cutting-edge tech (Databricks, Spark, AWS)
  • 🌍 Global team with cross-timezone collaboration
  • 📈 Growth opportunities toward technical autonomy