Job Description
We are looking for an experienced Java + Apache Spark Developer with 6+ years of hands-on experience building scalable, high-performance data processing applications. The ideal candidate has strong expertise in Java, distributed data processing, and big data ecosystems.
Key Responsibilities
Design, develop, and maintain scalable data processing applications using Java and Apache Spark
Develop batch and real-time data pipelines
Optimize Spark jobs for performance, scalability, and reliability
Work with large datasets using distributed computing frameworks
Integrate Spark applications with Hadoop ecosystem tools (HDFS, Hive, etc.)
Collaborate with data engineers, analysts, and cross-functional teams
Troubleshoot production issues and ensure high availability
Follow best practices in coding, testing, and deployment
Required Skills & Qualifications
5+ years of experience in Java development
Strong hands-on experience with Apache Spark
Ready to Apply?
Take the next step in your data engineering career. Submit your application to Deloitte today.