Databricks Data Engineer
REMOTE
Full-time · Mid-level · Remote · IT Consulting
Description
You will build end-to-end ETL/ELT pipelines for enterprise data ecosystems with a focus on performance and reliability, using Databricks, AWS, Python, and SQL to transform and analyze large, disconnected datasets in an Agile environment.
Requirements
- Bachelor's degree in Computer Science or equivalent
- 4+ years of experience in data engineering and big data
- 2+ years of client-facing professional services experience
- Proficiency in Python, SQL, Databricks, and AWS Data Services
- Experience with big data tools (Hadoop, Spark, Kafka) and data modeling
Responsibilities
- Build end-to-end ETL/ELT pipelines with a focus on performance and reliability
- Assess and understand ETL jobs, workflows, BI tools, and reports
- Address technical inquiries on customization, integration, and enterprise architecture
- Design database and data warehouse solutions in the cloud (AWS, Azure, GCP)
- Aggregate and transform data from multiple datasets to create data products