Job Description
We are looking for DevOps Engineers with strong experience in data architecture, containerization, and cloud-native deployments. The ideal candidates will have hands-on expertise in Kafka-based data pipelines, Kubernetes orchestration, CI/CD automation, and data integration within regulated environments (preferably banking/financial services). This role involves building scalable, secure, and high-performance data platforms across multiple environments.
Key Responsibilities:
- Implement and configure Helm charts across multiple environments.
- Design and manage data ingestion pipelines using Apache Kafka, ensuring high availability and low-latency data delivery.
- Translate functional and technical requirements into scalable technical solutions and workflows.
- Manage data modelling, schema registry configurations, and serialization strategies.
- Monitor, troubleshoot, and resolve issues in data pipelines, schema alignment, ...
Ready to Apply?
Take the next step in your DevOps career. Submit your application to Systems Limited today.