Job Description


  • The Data Operations Engineer is the guardian of the data lifecycle. They ensure that data flows from source to destination without interruption, maintaining the 'plumbing' of the organization's data infrastructure.
1. Orchestration & Workflow Engines
  • Apache Airflow: Deep understanding of Operators, Sensors, Hooks, and the Airflow UI to manage complex task dependencies.
  • Workflow Logic: Knowledge of how to handle retries, branching, and "backfilling" historical data.
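The workflow logic named above (retries, branching) can be sketched outside Airflow in a few lines of plain Python; the function names here are illustrative stand-ins, not part of the Airflow API:

```python
import time


def run_with_retries(task, retries=3, delay=0.0):
    """Re-run a failing task up to `retries` times, mimicking a
    per-task retries / retry_delay setting."""
    for attempt in range(1, retries + 1):
        try:
            return task()
        except Exception:
            if attempt == retries:
                raise  # exhausted retries: surface the failure
            time.sleep(delay)


def branch(condition, if_true, if_false):
    """Pick one downstream task to run, in the spirit of a branch operator."""
    return if_true() if condition else if_false()
```

In Airflow itself the same ideas appear declaratively: `retries` and `retry_delay` are task arguments, and branching is done with `BranchPythonOperator`.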
2. ETL/ELT Methodologies
  • Data Ingestion: Familiarity with moving data from APIs, logs, and relational databases into data warehouses (like Snowflake, BigQuery, or Redshift).
  • Transformation: Understanding how data is cleaned and restructured during transit.
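A minimal sketch of an in-transit transformation step, assuming records arrive as dictionaries from an API or log source (the cleaning rules shown are illustrative):

```python
def transform(records):
    """Clean raw records in transit: normalise keys to lowercase,
    trim string values, and drop rows missing a primary id."""
    cleaned = []
    for row in records:
        row = {
            key.strip().lower(): (value.strip() if isinstance(value, str) else value)
            for key, value in row.items()
        }
        if row.get("id"):  # discard rows without a usable primary key
            cleaned.append(row)
    return cleaned


# Example: a messy API payload becomes a tidy, loadable record.
transform([{" ID ": " 1 ", "Name": " Ada "}, {"id": None, "name": "x"}])
```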
3. Scripting & Command Line
  • Python: Ability to read existing scripts and apply "hotfixes" or minor adjustments.
  • SQL: Proficiency in writing queries to validate data quality.
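The kinds of validation queries this role calls for can be sketched with Python's standard-library `sqlite3` standing in for a warehouse; the table, column, and check names are assumptions for illustration:

```python
import sqlite3


def validate(conn, table, key):
    """Run typical data-quality checks against a table:
    total row count, NULL primary keys, and duplicate keys."""
    cur = conn.cursor()
    rows = cur.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]
    nulls = cur.execute(
        f"SELECT COUNT(*) FROM {table} WHERE {key} IS NULL"
    ).fetchone()[0]
    dupes = cur.execute(
        f"SELECT COUNT(*) FROM (SELECT {key} FROM {table} "
        f"GROUP BY {key} HAVING COUNT(*) > 1)"
    ).fetchone()[0]
    return {"rows": rows, "null_keys": nulls, "duplicate_keys": dupes}
```

Against Snowflake, BigQuery, or Redshift the same SELECT statements apply; only the connection layer changes.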

Ready to Apply?

Take the next step in your AI career. Submit your application to Softtek today.
