Job Description
Roles and Responsibilities:
- Design, implement, and maintain data pipelines to ingest and process OpenShift telemetry (metrics, logs, traces) at scale.
- Stream telemetry via Kafka (producers, topics, schemas) and develop resilient consumer services for transformation and enrichment.
- Engineer data models and routing for multi-tenant observability while ensuring lineage, quality, and SLAs.
- Integrate processed telemetry into Splunk for dashboards, alerting, and analytics to achieve Observability Level 4 (proactive insights).
- Implement schema management (Avro/Protobuf), governance, and versioning for telemetry events.
- Build automated validation, replay, and backfill mechanisms for data reliability and recovery.
- Instrument services with OpenTelemetry; standardize tracing, metrics, and structured logging across platforms.
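As a rough illustration of the transformation and multi-tenant routing work described above, the sketch below enriches a raw telemetry event with tenant metadata and selects a downstream Kafka topic. All names here (the field layout, the routing table, the topic naming scheme) are hypothetical; a real pipeline would validate events against a registered Avro/Protobuf schema and load routing from configuration.

```python
import time

# Hypothetical tenant-to-topic routing table; a real deployment would
# load this from a config service or derive it from namespace labels.
TENANT_ROUTES = {
    "team-payments": "telemetry.payments.metrics",
    "team-search": "telemetry.search.metrics",
}

DEFAULT_TOPIC = "telemetry.default.metrics"

def enrich_event(raw: dict) -> dict:
    """Enrich a raw metric event with tenant identity, routing topic,
    and an enrichment timestamp, without mutating the input."""
    tenant = raw.get("labels", {}).get("tenant", "unknown")
    return {
        **raw,
        "tenant": tenant,
        "route_topic": TENANT_ROUTES.get(tenant, DEFAULT_TOPIC),
        "enriched_at": int(time.time()),
    }

# Example event as it might arrive from an OpenShift metrics exporter.
event = {
    "name": "container_cpu_usage",
    "value": 0.42,
    "labels": {"tenant": "team-payments", "pod": "api-7d9f"},
}
enriched = enrich_event(event)
print(enriched["route_topic"])
```

A consumer service would apply a function like this per record before producing to the routed topic; keeping the enrichment pure (no I/O) makes it easy to unit-test and to replay during backfills.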
Ready to Apply?
Take the next step in your AI career. Submit your application to Han Digital Solution today.