Job Description
Roles and Responsibilities:
- 12+ years of experience in the Big Data space across architecture, design, development, testing, and deployment, with a full understanding of the SDLC.
- Experience with Hadoop and its related technology stack.
- Experience with the Hadoop ecosystem (HDP and CDP) and Big Data technologies, especially Hive.
- Hands-on experience with programming languages such as Java, Scala, or Python.
- Hands-on experience with and knowledge of Spark.
- Take responsibility for the uptime and reliable running of all ingestion/ETL jobs.
- Strong SQL skills and experience working in a Unix/Linux environment are a must.
- Create and maintain optimal data pipeline architecture.
- Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
- Good to have: cloud experience.
- Good to have: experience with Hadoop integration with data visualization...
Ready to Apply?
Take the next step in your AI career. Submit your application to Geetha Technology Solutions today.