Job Description
Onsite Role
Location: Gulberg Greens, Islamabad
Timings: 7 PM to 3 AM
Key Responsibilities
Design, build, and maintain scalable ETL/ELT data pipelines.
Develop and optimize data models using dbt.
Work with cloud data platforms such as Databricks, Amazon Redshift, Google BigQuery, and Snowflake.
Write complex and optimized SQL queries for data transformation and reporting.
Develop data processing workflows using Python and PySpark.
Ensure data quality, integrity, and governance across systems.
Monitor, troubleshoot, and improve data pipeline performance.
Required Skills & Qualifications
2–4 years of experience in Data Engineering or a related field.
Strong hands-on experience with SQL, Python, PySpark, and dbt.
Experience with Databricks and Amazon Redshift/Google BigQuery/Snowflake.
Ready to Apply?
Take the next step in your data engineering career. Submit your application to Enigma Software Solution today.