
Senior Data Engineer (Hybrid)

Full-time
North Sydney, New South Wales, Australia

We are oOh!media (pronounced “oh!” media). 

oOh!media is the #1 Out of Home company in Australia and New Zealand. 

We exist to make public spaces better and brands unmissable, proudly leading the market with innovation, creativity, data and results. Our network plays an important role in the communities where it is located, creating engaging environments that inform, entertain and inspire, while connecting brands with audiences at scale. 

We are oOh!media, and we are unmissable. 

Join us:  

We’re a team driven by creativity, innovation, and a sense of community. We show up every day ready to be bold, brave, and push the boundaries of Out of Home advertising. You’ll work alongside a group of talented individuals across Australia and New Zealand, all of whom are dedicated to raising the bar. If you’re ready to realise your potential and make an impact, join us. 

About the opportunity:

  • Permanent role
  • Based in North Sydney (Hybrid)

We are seeking a highly skilled Senior Data Engineer to design, build, and maintain modern data solutions that enable advanced analytics, modelling, and reporting. A key part of the role will be maintaining and extending our enterprise Data Warehouse on Databricks, while also contributing to a broader range of data engineering initiatives across the business. You will collaborate with Data Scientists, Analytics Engineers, Data Business Analysts, and other technical teams to deliver scalable, reliable, and well-governed data products that support strategic decision-making. 

 

Key responsibilities and experience:

• Design, build, and maintain scalable data pipelines across the Data Warehouse, leveraging the Medallion Architecture concept.

• Extend ingestion and transformation pipelines for new data sources, ensuring reliable history handling and production-ready outputs. 

• Maintain and extend the Data Warehouse, ensuring performance, scalability, and alignment with evolving business needs. 

• Implement robust data quality, governance, and lineage controls (e.g., schema validation, integrity checks, freshness monitoring) aligned with internal standards. 

• Partner with Data Scientists to ensure data assets are optimised for modelling and experimentation. 

• Collaborate with Analytics Engineers to design data models that balance efficiency, maintainability, and business relevance. 

• Ensure pipelines are maintainable and extensible by documenting logic, creating runbooks/playbooks, and enabling smooth knowledge transfer. 

• Monitor and optimise compute usage for performance and cost efficiency. 

• Investigate, debug, and perform root cause analysis to resolve data errors and pipeline failures. 

• Ensure code quality and pipeline reliability through comprehensive testing practices (unit tests, integration tests, and acceptance tests). 

• Contribute to checkpoint reviews, ensuring alignment on delivery expectations and early risk identification. 


You will also have:

• Degree in Computer Science, Engineering, Data Science, or a related field. 

• Experience at a senior level in data engineering, with hands-on delivery of complex data solutions. 

• Proven experience maintaining and extending enterprise Data Warehouses, ideally on modern cloud platforms. 

• Experience working with cloud data platforms such as Databricks or Snowflake, and cloud service providers (AWS, GCP, or Azure). 

• Strong experience with SQL and relational databases (e.g., PostgreSQL).

• Proficiency in Python and PySpark for data engineering. 

• Strong experience with data transformation and modelling tools (e.g., dbt or similar). 

• Experience with CI/CD pipelines and automated testing frameworks in a data engineering context. 

• Experience implementing DataOps principles in production environments. 

• Familiarity with Power BI, Tableau, or similar BI/visualisation tools. 

• Solid understanding of data modelling approaches (e.g., star and snowflake schemas, slowly changing dimensions). 

• Familiarity with workflow automation and scheduling tools (e.g., Airflow or Databricks Workflows). 

• Strong grasp of the data development lifecycle, from ingestion to production-ready assets.


Highly desirable:

• Hands-on experience with Databricks, including the use of Delta Live Tables (DLT) for orchestration.

• Exposure to governance frameworks and automated data quality practices. 

• Experience working with commercial, operational, or domain-specific data. 

• Knowledge of cloud-based data platforms and their integration with modern analytics stacks. 

• Background in supporting analytics and BI tools (e.g., Power BI) through curated data marts. 

• Familiarity with optimisation, advanced analytics, or machine learning environments. 

 

Our benefits and perks:  

  • Competitive salary package 
  • A positive, supportive workplace culture 
  • Professional growth and development opportunities 
  • Comprehensive, paid training and ongoing support 

If you’re curious, ready for a unique challenge, and want to make a real impact, we want to hear from you! 

At oOh!, we celebrate diversity and strive for an inclusive environment. We welcome applications from all backgrounds, including Aboriginal and Torres Strait Islander peoples, people with disabilities, LGBTQIA+ individuals, and refugees. 

To be considered, applicants must be Australian citizens or permanent residents with full working rights in Australia. All offers of employment are subject to employer-funded pre-employment checks, including police, reference, and work rights verification. Additional checks, such as credit, bankruptcy, drug and alcohol screening, or driver’s licence verification, may be required for some roles.