Job Description
Experience
Minimum of 3 years of professional experience in data engineering, with a strong focus on designing, developing, and optimizing scalable data pipelines, ETL/ELT workflows, and data integration solutions using modern cloud technologies.
Knowledge, Skills, and Abilities
Comprehensive understanding of data pipeline and modern data stack architectures, with hands-on experience in cloud-based platforms such as AWS, Azure, or GCP, and data platforms such as Snowflake, Databricks, or Redshift.
Technologies:
Data extraction: SQL, Python, API integration, Change Data Capture (CDC)
Database systems: PostgreSQL, MySQL, SQL Server
Data storage repositories: SFTP, AWS S3
Job scheduling and orchestration: Apache Airflow, AWS Step Functions
ETL/ELT tools and workflows: dbt, PySpark, AWS Glue, AWS Lam...
Ready to Apply?
Take the next step in your data engineering career. Submit your application to S.i. Systems today.