Job Description
Job Responsibilities
Design, develop, and maintain ETL pipelines to ingest, transform, and load data from multiple sources.
Build and optimize data pipelines for performance, reliability, and scalability.
Work with cloud-based data lakehouse platforms such as Databricks, BigQuery, or similar technologies.
Develop and maintain data processing workflows using Python.
Write complex and optimized SQL queries to support analytics and reporting requirements.
Design and manage data models to support data warehousing and business intelligence use cases.
Process and manage structured and semi-structured data (e.g., JSON, CSV, logs).
Collaborate with data analysts, data scientists, and business stakeholders to understand data requirements.
Ensure data quality, integrity, and consistency across data platforms.
Troubleshoot and resolve data pipeline and performance issues.
Qualifications
Minimum Qualifications:
Bachelor's degree in Information Technology, Com...
Ready to Apply?
Take the next step in your data engineering career. Submit your application to Yondu, Inc. today.