Job Description
Design, build, and optimize scalable data systems to support analytics, reporting, and business insights. Key focus: efficient data pipelines (orchestrated via Apache Airflow), ETL/ELT processes, data modeling, high-performance data warehousing, and denormalized flat report views for fast, join-free BI consumption.
Key Responsibilities
Develop and maintain scalable data pipelines and workflows using Apache Airflow for scheduling, dependency management, monitoring, and alerting.
Build and optimize ETL/ELT processes for ingesting, transforming, and loading data from diverse sources.
Design dimensional/star schemas and scalable data models for data warehouses.
Architect and tune data warehouses for performance (indexing, partitioning, query optimization).
Design and create denormalized, flat report views for BI tools (e.g., Power BI, Tableau, ERPNext), enabling efficient, join-free querying and self-service analytics.
Ensure data quality, security, ...
Ready to Apply?
Take the next step in your data engineering career. Submit your application to LucrumX today.
Submit Application