Senior Data Engineer (AWS, Snowflake, dbt)
Exciting opportunity in the insurance industry for a senior data engineering expert. Leverage Snowflake, dbt, and AWS to architect and implement scalable cloud data solutions. This hybrid role is based in Markham. Work cross-functionally on impactful initiatives that enable AI and business transformation.
What is in it for you:
• Salaried rate: $95-105 per hour.
• Incorporated business rate: $110-120 per hour.
• 12-month contract.
• Full-time position: 37.50 hours per week.
• Hybrid model: 3 days per week on-site, subject to change.
Responsibilities:
• Define and drive data engineering strategy, technical standards, and best practices across the organization.
• Lead architecture and detailed design of end-to-end data solutions.
• Build and manage scalable data pipelines with tools like dbt Core/Cloud.
• Design and review conceptual, logical, and physical data models.
• Write clean, maintainable code in SQL, Python, Shell, and Terraform.
• Promote data quality, cataloging, and governance across all initiatives.
• Conduct root cause analysis and resolve complex data issues.
• Oversee agile delivery, ensuring timely and effective execution.
• Mentor engineers to build team capability and elevate technical quality.
• Collaborate with architects, QA engineers, and business stakeholders.
• Create and maintain detailed technical documentation.
• Participate in hiring and onboarding by designing technical assessments and conducting interviews.
What you will need to succeed:
• Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.
• Relevant certifications preferred; two or more of the following are considered an asset:
• SnowPro Core
• SnowPro Advanced: Data Engineer (DEA-C01, DEA-C02)
• SnowPro Advanced: Architect (ARA-C01)
• dbt Developer
• AWS Certified Cloud Practitioner
• 15+ years of professional experience, with delivery of 10+ high-impact data projects.
• 7+ years of experience in coding with Python, Java, or similar languages.
• 5+ years of hands-on experience with Snowflake, dbt Core/Cloud, and AWS.
• Deep knowledge of relational (PostgreSQL, Amazon Aurora), NoSQL (MongoDB), and big data platforms (Hadoop).
• Proficiency in SQL, Python, Shell scripting, and Terraform.
• Skilled in data visualization tools including Snowsight, Streamlit, Qlik, and SAP BusinessObjects.
• Experience with orchestration tools such as Zena and Amazon Managed Workflows for Apache Airflow (MWAA).
• Strong communication and presentation skills with ability to bridge business and technical perspectives.
• Demonstrated leadership in mentoring and elevating engineering capabilities.
• Ability to thrive in high-pressure, fast-paced environments.
• Strong problem-solving and customer-centric mindset.
• Familiarity with insurance industry systems and processes is a strong asset.
• Exposure to operationalizing AI/ML and GenAI models is considered a plus.
Why Recruit Action?
Recruit Action (agency permit: AP-2504511) provides recruitment services through quality support and a personalized approach to job seekers and businesses. Only candidates who match hiring criteria will be contacted.
# AVICJP00002824