GT was founded in 2019 by a former Apple, Nest, and Google executive. GT’s mission is to connect the world’s best talent with product careers offered by high-growth companies in the UK, USA, Canada, Germany, and the Netherlands.
On behalf of KD Pharma, GT is looking for a Data Engineer (Azure) interested in building and scaling a modern data platform to support Finance, Operations/Supply Chain, and Quality/Manufacturing functions.
Founded in 1988, KD Pharma is a pure-play, technology-driven CDMO (Contract Development & Manufacturing Organization) dedicated to revolutionizing pharmaceutical and nutraceutical production. They are uniquely recognized for offering ultra-pure Omega-3 concentrates at commercial scale through patented supercritical fluid chromatography technologies.
With sites in Germany, Norway, the UK, the USA, Canada, and Peru, and a sales presence across Asia, KD Pharma delivers end-to-end solutions from custom synthesis to finished dosage forms while adhering to cGMP and global certifications.
The project is to establish a robust Azure-based data platform for business intelligence (BI). It includes assessing Microsoft Fabric vs. Azure Data Factory (ADF) and, if needed, re-platforming to a scalable ADF-led architecture.
The role focuses on improving data quality, lineage, reliability, and time-to-insight, while helping reassess Microsoft Fabric versus an ADF-led architecture. It is a high-impact role with the long-term opportunity to own and shape the Azure data platform and grow into a trusted data leader in a global organisation.
Success Measures:
0–6 months: Audit current estate, define migration plan, build ADF pipelines for priority sources (Business Central, TrackWise), achieve >98% dataset refresh success, establish baseline data quality checks & lineage.
6–24 months: Deliver a consolidated Lakehouse/Warehouse with governed semantic models, optimised for cost & performance, with documented controls and stakeholder CSAT/NPS ≥8/10.
Responsibilities:
Own the Azure data platform architecture & roadmap (ADF vs Fabric; Synapse/Databricks evaluation)
Design, build, and operate ETL/ELT pipelines into ADLS/Warehouse
Model data for Power BI (DAX/Tabular)
Implement data quality, lineage, governance & security (Purview, RBAC, CI/CD)
Partner with BI analysts to deliver reusable, trusted semantic models and dashboards
Drive reliability & cost optimisation (monitoring, alerting, SLAs)
Support immediate projects: Business Central (ERP + MES), TrackWise (QMS), ECC6 extracts
Experience Level: 4-6 years
Strong expertise in Azure Data Factory & Azure Data Lake Gen2
Advanced SQL/T-SQL
Power BI (DAX, Tabular modeling, deployment pipelines)
Python or PySpark
Git & Azure DevOps (CI/CD pipelines)
Dimensional modeling
Security & RBAC
Experience with Synapse, Databricks, and Delta Lake
Knowledge of Microsoft Purview, IaC (Bicep/Terraform)
Familiarity with ML basics
Background in regulated manufacturing/pharma (GxP) — can be learned
Strong communication & collaboration abilities
Pragmatic “architect-builder” mindset — able to balance strategy with hands-on delivery
Comfort in leading technology choices and engaging stakeholders
Results-driven with focus on data reliability, governance, and business value
Interview Steps:
GT interview with Recruiter
Technical interview
Final interview
Reference Check
Offer