Job Description
Key Responsibilities
Design, build, and maintain scalable data pipelines using AWS Glue.
Implement ETL/ELT processes for ingesting data from multiple internal and external sources.
Optimize data workflows for performance, reliability, and scalability.
Monitor, troubleshoot, and resolve data pipeline failures and performance issues.
Manage and optimize AWS Redshift data warehouse operations.
Configure and maintain data storage solutions, including AWS S3 and data lake environments.
Implement data partitioning, indexing, and compression strategies to improve performance.
Support Infrastructure as Code (IaC) practices for deploying and managing data infrastructure.
Develop and maintain CI/CD pipelines for data workflows using GitLab.
Implement automated testing for data pipelines and data quality validation.
Support version control, release management, and deployment of data-related assets.
Configure and manage AWS Lambda functions for automated data processing.
Ready to Apply?
Take the next step in your AI career. Submit your application to HCLTech today.