Senior Data Engineer

BlueberryHub
Full-time
On-site
Manila, Philippines
Introduction

Why Blueberry?

At Blueberry, data isn’t just numbers sitting in dashboards. It drives decisions, risk management, growth, and how we operate at scale. Our data team turns complex, messy inputs into reliable insights the business can actually trust. As a Senior Data Engineer, you’ll sit at the core of that mission, building platforms and pipelines that keep our data accurate, secure, and cost-efficient.



Description

We’re looking for a hands-on Senior Data Engineer to design, build, and optimise our data ecosystem. You’ll own our Snowflake warehouse, DBT models, and ELT pipelines, while helping the business access clean, timely data for reporting and decision-making. This is a builder role. You’ll refactor legacy models, reduce cloud costs, improve reliability, and grow alongside a fast-moving fintech business.

Note: At Blueberry, moving with purpose means showing up, connecting, and building momentum together. This role is based onsite in Taguig, Manila, Monday to Friday, where the real magic happens.

How You’ll Make an Impact

  • Design, own, and optimise Snowflake data models and DBT transformations end to end.

  • Build and maintain reliable ELT pipelines using Python, Airbyte, AWS DMS, and AWS services (Lambda, S3).

  • Refactor legacy models to improve performance and reduce compute and storage costs.

  • Implement data security, masking, and access controls for internal teams and external partners.

  • Create self-service datasets and BI outputs in Snowflake and Power BI.

  • Partner with analysts, product owners, and stakeholders to translate business needs into scalable data solutions.

  • Monitor pipelines, troubleshoot failures, and resolve production issues quickly and cleanly.

  • Establish best practices for testing, documentation, version control, and CI/CD.

  • Constantly validate and reconcile data to ensure accuracy, reliability, and compliance.



Skills And Experiences

Who We’re Hoping to Find

  • 5+ years’ experience building and scaling data warehouses and pipelines.

  • Deep hands-on experience with Snowflake, DBT, and SQL.

  • Strong Python skills for scripting, automation, and integrations.

  • Proven experience with ELT/ETL tools like Airbyte, AWS DMS, or similar.

  • Solid understanding of AWS data services and architecture (Lambda, S3, IAM).

  • Experience building reporting solutions in Power BI or comparable BI tools.

  • Strong understanding of data modelling, performance optimisation, and cost management.

  • Comfortable working in agile environments using Git and Jira.

  • Clear communicator who can explain complex data concepts to non-technical stakeholders.

  • A problem-solver who spots discrepancies and fixes them properly, not temporarily.

  • Someone who enjoys mentoring, reviewing code, and lifting the team around them.

Extra Points if You Have

  • Experience in Financial Services, Trading, or Banking.

  • Exposure to risk-facing or compliance-heavy environments.

  • Experience with data visualisation, LLMs, or advanced analytics use cases.

  • Familiarity with Docker, containerisation, or medallion architecture.