Senior Data Engineer, Payments

Bangalore, India
Full-time, Senior, Remote, Hospitality

Description

You will design and build robust data pipelines to process large-scale datasets for compliance with tax, payments, and legal regulations. You will collaborate with cross-functional teams to ensure high data quality and accuracy, contributing to Airbnb's world-class data engineering environment.

Requirements

  • 6+ years of relevant industry experience
  • BE/B.Tech in Computer Science or a related technical degree
  • Hands-on coding experience with data structures and algorithms (DSA)
  • Extensive experience designing, building, and operating robust distributed data platforms (e.g., Spark, Kafka, Flink, HBase) and handling data at the petabyte scale
  • Strong knowledge of Scala or Python, and expertise in data processing technologies and SQL query authoring
  • Demonstrated ability to analyze large data sets to identify gaps and inconsistencies
  • Expertise with ETL schedulers such as Apache Airflow, Luigi, Oozie, or AWS Glue
  • Solid understanding of data warehousing concepts and hands-on experience with relational databases (e.g., PostgreSQL, MySQL) and columnar databases (e.g., Redshift, BigQuery, HBase, ClickHouse)
  • Excellent written and verbal communication skills

Responsibilities

  • Design, build, and maintain robust and efficient data pipelines that collect, process, and store data from various sources
  • Develop data models that enable efficient analysis and manipulation of data for merchandising optimization
  • Ensure data quality, consistency, and accuracy
  • Develop high-quality data assets for product use cases by partnering with Product, AI/ML, and Data Science
  • Build scalable data pipelines (SparkSQL, Scala) leveraging the Airflow scheduler/executor framework
  • Collaborate with cross-functional teams to define data requirements and deliver data solutions
  • Contribute to the broader Data Engineering community at Airbnb to influence tooling and standards
  • Improve code and data quality by leveraging and contributing to internal tools to automatically detect and mitigate issues
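To give candidates a flavor of the data-quality work described above, here is a deliberately minimal sketch of automated gap and inconsistency detection. It uses pure Python rather than Spark for self-containment, and every name in it (record fields, rule thresholds) is a hypothetical illustration, not Airbnb's actual schema or tooling.

```python
from dataclasses import dataclass

@dataclass
class PaymentRecord:
    payment_id: str
    amount_cents: int
    currency: str

def find_inconsistencies(records):
    """Flag records violating basic quality rules:
    duplicate IDs, non-positive amounts, or missing currency codes."""
    seen = set()
    flagged = []
    for r in records:
        if r.payment_id in seen or r.amount_cents <= 0 or not r.currency:
            flagged.append(r)
        seen.add(r.payment_id)
    return flagged

records = [
    PaymentRecord("p1", 1200, "USD"),
    PaymentRecord("p2", -50, "EUR"),   # negative amount -> flagged
    PaymentRecord("p1", 800, "USD"),   # duplicate ID -> flagged
]
print([r.payment_id for r in find_inconsistencies(records)])  # ['p2', 'p1']
```

In production such rules would typically run as SparkSQL assertions scheduled by Airflow, but the core idea, codified rules that mechanically surface bad records, is the same.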