Job Description
Required Technical Skill Set: Hadoop, PySpark, Spark SQL, HIVE
Desired Experience Range: 5-12 years
Desired Competencies:
- Hands-on experience with Hadoop, PySpark, Spark SQL, Hive, and other Hadoop Big Data ecosystem tools.
- Able to develop and tune queries and work on performance enhancement.
- Solid understanding of object-oriented programming and HDFS concepts.
- The candidate will be responsible for delivering code, setting up the environment and connectivity, and deploying the code to production after testing.
Responsibilities:
- Work as a developer on the Cloudera Hadoop platform.
- Work with Hadoop, PySpark, Spark SQL, Hive, and other Big Data ecosystem tools.
- Work with teams across a complex organization involving multiple reporting lines.
- The candidate should have strong functional and technical knowledge to deliver what is required, and should be well acquainted with banking terminology.
Ready to Apply?
Take the next step in your AI career. Submit your application to Abhidi Solution today.