Job Description

Responsibilities

  • Build robust, efficient, and reliable data pipelines that ingest and process data from diverse sources
  • Design and develop real-time streaming and batch-processing pipeline solutions
  • Assemble large, complex data sets to meet functional and non-functional requirements
  • Design, develop, and implement pipelines for data migration, collection, analytics, and other data movement solutions
  • Work with stakeholders and data analytics teams to resolve data-related technical issues and support their data infrastructure needs
  • Collaborate with Architects to define the architecture and technology selection

Qualifications and experience

  • Experience building real-time or batch ingestion and transformation pipelines.
  • Excellent analytical skills and a proven track record of solving difficult problems.
  • Experience working on Agile/Scrum projects.

Ready to Apply?

Take the next step in your AI career. Submit your application to Capgemini today.
