Job Description
Job Title: PySpark Data Engineer
Experience: 6+ Years
Location: Hyderabad / Pune
Employment Type: Full-Time
Job Summary:
We are looking for a skilled and experienced PySpark Data Engineer to join our growing data engineering team. The ideal candidate will have 6+ years of experience in designing and implementing data pipelines using PySpark, AWS Glue, and Apache Airflow, with strong proficiency in SQL. You will be responsible for building scalable data processing solutions, optimizing data workflows, and collaborating with cross-functional teams to deliver high-quality data assets.
Requirements
Key Responsibilities:
- Design, develop, and maintain large-scale ETL pipelines using PySpark and AWS Glue.
- ...
Ready to Apply?
Take the next step in your AI career. Submit your application to DATAECONOMY today.