Job Description
Qualifications
- 8+ years of professional experience in data engineering, ETL tools, and data architecture.
- Expert-level proficiency in SQL.
- Strong experience with cloud data platforms like Snowflake.
- Deep experience with dbt (Data Build Tool) for managing transformations and data quality.
- Solid working experience with Python for data manipulation and scripting.
- Proven ability to implement and manage modern data architectures (e.g., Data Lake, Data Warehouse, Lakehouse).
- Experience developing and maintaining production-level DAGs in Apache Airflow.
- Experience with AWS.
Job Responsibilities
- Design, develop, and maintain robust, scalable, and efficient ETL/ELT data pipelines.
- Develop complex data transformation logic using dbt and advanced SQL.
- Implement, monitor, and manage workflows using Apache Airflow for scheduling and orchestration.
Ready to Apply?
Take the next step in your AI career. Submit your application to GlobalLogic today.