Every year, millions of people either file their taxes in fear or give up on their tax refund altogether. We're working on fixing that. Our intuitive app enables anyone, regardless of education or background, to file their taxes with newfound confidence.
Spread across Germany, Spain, and the UK, the team at Taxfix Group, with its brands Taxfix and Steuerbot, is a compassionate group of solution-finders. We speak our minds openly, and with over 400 professionals, including tax experts, developers, and IT security experts, we're rich in ideas and voices. The group has facilitated more than 3.5 billion euros in tax refunds for its customers since its founding in 2016.
Taxfix makes tax filing simple, fast, and accessible for millions of users across Europe. Behind the product sits a data platform that powers everything from business analytics and regulatory reporting to ML and AI-driven product features. We're growing, and the platform needs to grow with us.
We are looking for an experienced Data Platform Engineer to design, build, and operate the infrastructure and pipelines that make data at Taxfix reliable, compliant, and ready for AI. You will own the systems that move data from operational databases, APIs, and SaaS tools into our analytical environment and make sure that data is correct, timely, and safe to use. Your work will directly enable ML and AI-powered product experiences, self-serve analytics, and regulatory reporting.
This is a hands-on engineering role. You will write code, review designs, and operate production systems, not just draw diagrams.
The Data Platform team is part of the Data Organization, which enables everyone at Taxfix to make data-driven decisions, drive business growth, and power ML and automation in our products. Within our group we have product analysts, analytics engineers, machine learning engineers, and data engineers working across the whole company.
The platform team serves many consumers (analytics, ML, product, and compliance), so you can expect to join cross-functional project teams regularly, working alongside backend and frontend engineers, analysts, ML engineers, product, and design. Team members are all data engineers, but from different backgrounds. We rely on each other to deliver our best, grow together, share knowledge openly, and invest in personal development through mentoring and innovation.
Build and maintain ingestion pipelines that capture changes from application databases, APIs, and SaaS tools, and deliver clean, analytics-ready tables to our cloud data warehouse
Design data models with proper layering that handle real-world data complexity: out-of-order events, schema evolution, late arrivals, and backfills
Own and evolve cloud platform infrastructure — manage GCP resources (GCS, Dataflow, Dataproc), provision and maintain environments with Terraform, and ensure the platform is cost-efficient and scalable
Own data quality monitoring — build validation, monitoring, and alerting that catches problems before downstream consumers do
Implement privacy and compliance controls — anonymization, pseudonymization, access policies, and deletion propagation (GDPR right-to-be-forgotten) across raw and derived layers
Prepare data for ML and AI use cases — build governed, privacy-safe datasets and feature pipelines that ML engineers and data scientists can use for model training, evaluation, and production inference
Operate and improve our orchestration layer — scheduling, retries, SLA tracking, and observability for data pipelines
Define and raise the bar on engineering standards — code quality, testing, CI/CD, documentation, and infrastructure-as-code
Evaluate and adopt new technologies that help the team achieve its goals across data management, analytics, and machine learning
Incorporate AI into platform services — enable AI-assisted development workflows and build internal AI backend services as part of the data platform offering
Communicate across domains — work closely with analytics, product, compliance, and engineering teams; translate between technical and business language
Mentor and grow with the team — share what you learn, support others, and contribute to a culture of honest technical discussion
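To make the data-quality responsibility above concrete, here is a minimal, hypothetical Python sketch of batch validation that catches problems before a batch is published to downstream consumers. The field names, checks, and thresholds are illustrative assumptions, not Taxfix code:

```python
# Hypothetical sketch: validate a batch of rows before publishing it
# downstream. Field names and rules are illustrative assumptions only.
from dataclasses import dataclass, field


@dataclass
class ValidationResult:
    passed: bool
    errors: list = field(default_factory=list)


def validate_batch(rows, required_fields=("user_id", "event_ts")):
    """Run simple checks: required fields present, no duplicate event IDs."""
    errors = []
    seen_ids = set()
    for i, row in enumerate(rows):
        # Required fields must be present and non-null.
        for f in required_fields:
            if row.get(f) is None:
                errors.append(f"row {i}: missing {f}")
        # Each event ID may appear at most once per batch.
        event_id = row.get("event_id")
        if event_id in seen_ids:
            errors.append(f"row {i}: duplicate event_id {event_id}")
        seen_ids.add(event_id)
    return ValidationResult(passed=not errors, errors=errors)
```

In practice, checks like these would run inside the orchestration layer and trigger alerting on failure; the point is that validation fails loudly before bad data reaches analytics, ML, or compliance consumers.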
6+ years of experience in Data Engineering or a similar role (backend engineer working on data-intensive systems counts)
Strong Python skills for data pipeline development — you write production code, not just scripts
Strong SQL skills — window functions, CTEs, query optimization are second nature
Experience with event-driven data pipelines — CQRS, event ordering, idempotency, and the difference between initial load and incremental processing
Deep expertise with Airflow — you’ve built DAGs with proper task dependencies, retries, and monitoring
Experience with Snowflake and/or BigQuery — you understand their architecture, performance characteristics, and how they differ from each other and from other analytical or operational tools
Cloud platform experience — you’ve worked with GCP (GCS, Dataflow, Dataproc, etc.) or equivalent AWS/Azure services and understand how to manage cloud resources at scale
Infrastructure-as-code — experience with Terraform, Helm, or similar tools for provisioning and managing cloud environments
Kubernetes and Docker containerization — you package and deploy your own work
Data quality mindset — you profile data, validate assumptions, build checks, and don’t trust that “the data looked clean”
Data for AI readiness — you understand what it takes to prepare data for ML and AI: governance, lineage, privacy controls, and reproducibility
Awareness of data privacy requirements — you can identify PII, understand GDPR, and know how to implement anonymization and deletion across multiple data layers
AI-enabled engineering practices — you actively use AI assistants and code-generation tools to accelerate development and delivery, and you can establish standards for their effective use across the team
Track record of coaching and growing engineers — you’ve helped teammates level up through pairing, code reviews, or structured mentorship
Exposure to ML platforms (Vertex AI, SageMaker) or feature stores — you’ve helped ML teams get from raw data to production models
Hands-on ML experience — training models, running experiments, and understanding the full lifecycle from data preparation to deployment
Experience with Segment.io or similar event collection and customer data platforms
Advanced data modeling — SCD Type 2, data vault, anchor or other patterns
Advanced privacy engineering — crypto-shredding, differential privacy, consent management systems
Domain knowledge in fintech, tax, or regulated industries — understanding of compliance-driven data requirements, audit trails, and data retention policies
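To make the event-ordering and idempotency expectations listed above concrete, here is a minimal, hypothetical Python sketch (not Taxfix's actual pipeline) of merging change events into a current-state table so that batch replays and late, out-of-order arrivals cannot corrupt the result:

```python
# Hypothetical sketch: idempotent merge of change events into current state.
# The event shape ("key", "version", "value") is an illustrative assumption.

def merge_events(state, events):
    """Keep, per key, the event with the highest version.

    Re-applying the same batch is a no-op (idempotency), and a late event
    carrying an older version never overwrites newer state (ordering).
    """
    for event in events:
        key, version = event["key"], event["version"]
        current = state.get(key)
        if current is None or version > current["version"]:
            state[key] = event
    return state
```

The same idea underlies MERGE-based upserts in a warehouse: a deterministic key plus a monotonic version column makes both initial loads and incremental replays safe.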
A chance to do meaningful, people-centric work with an international team of passionate professionals.
Holistic well-being with free mental health coaching sessions and yoga.
A monthly allowance to spend on an extensive range of services that you can use and roll over as flexibly as you like.
Employee stock options for all employees—because everyone deserves to benefit from the success they help to create.
30 annual vacation days and flexible working hours.
Work from abroad for up to six weeks every year. Just align with your team, and then enjoy your trip.
Plenty of opportunities to socialise as a team. In addition to internal tech meetups, our international team hosts regular get-togethers—virtually and in person when possible.
Free tax declaration filing, of course, through the Taxfix app—and internal support for all personal tax-related questions.
Have a four-legged friend in your life? We’re happy to have dogs join us in the office.
Excited? So are we. Learn more about Team Taxfix on our blog and get a glimpse of our culture.
At Taxfix, we believe that incredible things happen when you have a wealth of perspectives and experiences. We're proudly committed to equal employment and development opportunities no matter your gender, race, religion, age, sexual orientation, colour, disability, or place of origin. To help mitigate any potential unconscious biases, we ask that you refrain from including your picture, age, or marital status on your CV. Let your experiences speak for themselves.
Not sure if you meet all the requirements for this role? Please apply anyway. You might bring something special to the team that we hadn't considered previously.