Job Description

We are looking for energetic, high-performing, and highly skilled GCP Data Engineers to help shape our technology and product roadmap.
Responsibilities:
Develop and maintain large-scale data processing pipelines using PySpark, Dataproc, BigQuery, and SQL.
Use BigQuery and Dataproc to migrate existing Hadoop/Spark/Hive workloads to Google Cloud.
Use BigQuery to carry out batch and interactive data analysis.
Function as a member of an agile team by contributing to software builds through consistent development practices (tools, common components, and documentation).
Develop and test software, including ongoing refactoring of code, and drive continuous improvement in code structure and quality.
Enable the deployment, support, and monitoring of software across test, integration, and production environments.
Minimum Qualifications:
This high-energy engineer must have:
A Bachelor’s degree in computer science, computer engineering, ot...

Ready to Apply?

Take the next step in your AI career. Submit your application to IntraEdge today.
