Job Description
- Programming Languages: Proficient in Python and PySpark, with a strong understanding of software engineering best practices.
- Cloud Computing: Utilize Azure cloud-based data platforms, specifically Databricks and Delta Live Tables, for data engineering tasks, while effectively using services for storage, compute, and security.
- Data Pipelines: Design, build, and maintain robust, scalable, and automated data pipelines for batch and streaming data ingestion and processing (Databricks Workflows).
- Data Architecture and Modeling: Design and implement robust data models and architectures that align with business requirements and support efficient data processing, analysis, and reporting.
- Orchestration: Utilize workflow orchestration tools to automate data pipeline execution and dependency management.
- Monitoring and Alerting: Integrate monitoring and alerting mechanisms to track pipeline health, identify pe...
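The orchestration and dependency-management responsibilities above can be sketched in plain Python. This is an illustrative sketch only, with hypothetical task names; in production, a graph like this would be defined in an orchestrator such as Databricks Workflows rather than hand-rolled:

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline tasks mapped to their upstream dependencies.
dependencies = {
    "ingest_raw": set(),
    "clean": {"ingest_raw"},
    "aggregate": {"clean"},
    "publish_report": {"aggregate", "clean"},
}

def run_pipeline(deps):
    """Execute tasks in an order that respects their dependencies."""
    order = list(TopologicalSorter(deps).static_order())
    for task in order:
        # A real task would trigger a Spark job or Delta Live Tables
        # update here; for illustration we only record the task name.
        print(f"running {task}")
    return order

if __name__ == "__main__":
    run_pipeline(dependencies)
```

The point of the sketch is the ordering guarantee: `TopologicalSorter` ensures every task runs only after all of its upstream dependencies, which is the core contract any workflow orchestrator provides.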
Ready to Apply?
Take the next step in your AI career. Submit your application to TechDoQuest today.
Submit Application