Job Description
- Role Responsibilities:
- Data Engineering & Pipeline Development: Develop and maintain robust data pipelines to support the value quantification process. Utilize tools such as Apache NiFi, Azure Data Factory, Pentaho, Talend, SSIS, and Alteryx to ensure efficient data integration and transformation.
- Data Management and Analysis: Manage and analyze large datasets using SQL, Hadoop, and other database management systems. Perform data extraction, transformation, and loading (ETL) to support value quantification efforts.
- Advanced Analytics Integration: Use advanced analytics techniques, including machine learning algorithms, to enhance data processing and generate actionable insights. Leverage programming languages such as Python (Pandas, NumPy, PySpark) and Impala for data analysis and model development.
- Business Intelligence and Reporting: Utilize business intelligence platforms such as Tableau and Power BI to create insightful dashboards.
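The ETL work described above can be pictured with a minimal sketch in Python using Pandas (hypothetical data, column names, and output file; real pipelines would run in tools such as Azure Data Factory, NiFi, or PySpark):

```python
# Minimal illustrative ETL sketch with pandas.
# All data, column names, and the output path are hypothetical.
import pandas as pd
from io import StringIO

# Extract: read raw records (an in-memory CSV stands in for a source system).
raw = StringIO("account_id,revenue,region\n1,1200,EMEA\n2,,APAC\n3,950,EMEA\n")
df = pd.read_csv(raw)

# Transform: drop incomplete rows, then aggregate revenue by region.
clean = df.dropna(subset=["revenue"])
summary = clean.groupby("region", as_index=False)["revenue"].sum()

# Load: write the summarized table out for downstream reporting.
summary.to_csv("revenue_by_region.csv", index=False)
```

In practice each stage would read from and write to managed stores rather than local files, but the extract-transform-load shape is the same.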
Ready to Apply?
Take the next step in your AI career. Submit your application to Dynamic Yield today.