This is a remote position.
Join our team as an Associate Data Engineer to assist in building scalable data pipelines and maintaining data integrity.
Responsibilities:
- Support the development of data pipelines using big data technologies.
- Assist in creating ETL workflows using Talend and dbt.
- Write and test SQL queries for data analysis.
- Support workflow orchestration using Airflow.
- Help implement data governance best practices.
Requirements:
- Basic knowledge of big data technologies (Spark, Hive, Hadoop, NiFi).
- Familiarity with Python, SQL, and databases.
- Exposure to dbt, Airflow, and cloud platforms (GCP, AWS).
- Eagerness to learn CI/CD tools and data governance principles.