Job Description

  • Design and architect enterprise-scale data platforms using Databricks Lakehouse architecture.
  • Lead data modernization initiatives including migration from legacy data warehouses to cloud-based platforms.
  • Define best practices for data ingestion, transformation, storage, and analytics using Databricks.
  • Implement and optimize Apache Spark, Delta Lake, and Unity Catalog.
  • Ensure data security, governance, metadata management, and compliance standards are met.
  • Collaborate with cloud, data engineering, analytics, and business stakeholders to translate requirements into technical solutions.
  • Provide architectural guidance, code reviews, and technical leadership to delivery teams.
  • Support CI/CD pipelines, automation, and infrastructure-as-code for data platforms.
  • Troubleshoot performance, scalability, and cost optimization issues.

Required Skills & Qualification...

Ready to Apply?

Take the next step in your AI career. Submit your application to TechDoQuest today.
