Job Description

Experience Required: 6+ Years

Key Responsibilities:

  • Design and implement modern data lakehouse architectures (Delta Lake or equivalent) on cloud platforms such as AWS, Azure, or GCP.
  • Define data modeling, schema evolution, partitioning, and governance strategies for high-performance and secure data access.
  • Own the technical roadmap for scalable data platform solutions, aligned with enterprise needs and future growth.
  • Provide architectural guidance and code/design reviews across data engineering teams.
  • Build and maintain reliable, high-throughput data pipelines for ingestion, transformation, and integration of structured, semi-structured, and unstructured data.
  • Apply a solid understanding of data warehousing concepts, ETL/ELT pipelines, and data modeling.
  • Use tools such as Apache Spark (PySpark/Scala), Hive, dbt, and SQL for large-scale data transformation.
  • Design ETL/ELT workflows using orchestration tools such as Apache Airflow.

Ready to Apply?

Take the next step in your AI career. Submit your application to bebo Technologies today.
