Job Description

Key Responsibilities

  • Design, build, and maintain scalable and reliable data pipelines for batch and real-time use cases.
  • Work on distributed systems to support large-scale data processing.
  • Develop, optimize, and manage streaming data workflows, ensuring high throughput and low latency.
  • Implement strong data quality, validation, and governance frameworks.
  • Build and maintain data services using Python and modern engineering practices.
  • Work with cloud platforms (GCP / AWS / Azure) to deploy, scale, and monitor data infrastructure.
  • Collaborate with engineering teams on microservice design, using containerization technologies such as Docker.
  • Troubleshoot performance issues and ensure system stability, security, and regulatory compliance.

Required Skills

Ready to Apply?

Take the next step in your AI career. Submit your application to Infec Services Private Limited today.

Submit Application