Job Description

Key Responsibilities:

  • Design & Build: Develop scalable, resilient ETL workflows and real-time data pipelines using Python, PySpark, and AWS services.
  • Cloud Engineering: Utilize AWS services such as Glue, Lambda, EC2, RDS, and S3 to build efficient cloud-based data architectures.
  • Snowflake Integration: Design and maintain data models, pipelines, and ingestion processes within Snowflake, ensuring performance and scalability.
  • API Development: Integrate RESTful and other APIs to ingest and synchronize external and internal datasets.
  • Optimization: Monitor and tune the performance of data workflows, pipelines, and queries for minimal latency and high throughput.
  • Collaboration: Partner with data analysts, scientists, and stakeholders to define and deliver data requirements.

Required Ski...

Ready to Apply?

Take the next step in your AI career. Submit your application to StatusNeo today.
