
Data Analytics Engineer

StructureIt
Full-time
Remote
South Africa

StructureIt is a fast-growing fintech business with offices in the UK, South Africa, Mauritius, Thailand and New Zealand. We build mission-critical software and data solutions for both major multinational financial institutions and smaller organisations.

Main focus of the role

We are seeking a technically excellent, customer-focused Analytics Engineer to drive data integration and transformation when building customer-facing solutions. This role sits at the intersection of client data environments and our internal product development efforts—requiring both business insight and technical acumen.


Your primary focus will be to deeply understand business and reporting requirements (often expressed in SQL, Excel, or ad hoc analysis) and implement them in a Data Lakehouse. You will work closely with business stakeholders, analysts, and the data architecture team to develop the Lakehouse that supports these requirements.

Working as part of a collaborative team, you’ll help ensure these specifications are accurately and efficiently implemented, typically in PySpark/Spark SQL data pipelines, by supporting the development process, providing subject-matter expertise, and validating that delivered data outputs match the original requirements. Your work will be critical in ensuring that our analytics solutions are robust, repeatable, and deliver trusted results to the business.


This role requires a strong analytical mindset, excellent communication skills, and hands-on experience with data transformation and integration processes.

What you’ll do 

  • Data Analysis & Integration Design
    • Analyze complex customer data sets, schemas, and data flows to assess integration needs.
    • Collaborate with data engineers and developers to develop effective data ingestion, transformation, and mapping processes.
    • Validate data quality, completeness, and alignment with business requirements.
  • Technical Collaboration & Delivery
    • Develop scalable data integration and transformation pipelines by coding, testing, and implementing solutions—primarily using PySpark, Spark SQL, and Python.
    • Contribute hands-on to codebases: write and review code, implement validation checks, and assist with the development of complex transformation logic as needed.
    • Help automate and maintain data validation and quality checks to ensure reliability and accuracy of analytics outputs.
    • Collaborate closely with the FP&A team to understand financial reporting needs, ensure alignment of technical solutions with finance objectives, and validate outputs supporting business decision-making.
    • Work with modern data formats and platforms (Parquet, Delta Lake, S3/Blob Storage, Databricks, etc.) to enable efficient data handling and integration.
    • Participate in solution architecture and technical discussions, collaborating on user story creation and acceptance criteria to ensure technical solutions align with business needs.
  • Product & Analytics Alignment
    • Work closely with our product team to ensure that customer data is accurately reflected in analytics outputs.
    • Provide feedback on product improvements based on client needs and data insights.
    • Monitor and evaluate the effectiveness of integrated data in supporting customer decision-making.

To fly in your role, you’ll need

  • 5+ years’ experience in data analytics, data engineering, or related technical roles, with significant hands-on coding in Python, PySpark, notebooks, and SQL.
  • Proven ability to build and support data integration pipelines (ETL, dataflows, APIs) at scale.
  • Proven ability to build and deploy data pipelines to cloud production environments using automated CI/CD pipelines.
  • Strong experience with modern analytics platforms (Databricks, Delta Lake, cloud storage); experience with relational databases and data modelling is a plus.
  • Strong experience working in cloud-based environments (Azure, AWS).
  • Track record of translating business logic and requirements into production-grade, testable code. Experience with data analytics tools and platforms (e.g., Power BI, Superset, Tableau, Looker, Grafana).
  • Solid grasp of data quality, data validation, and monitoring concepts.
  • Strong communication skills—able to present technical logic and results to both technical and non-technical audiences.
  • Experience in Agile/Scrum environments and working collaboratively with both business and engineering teams.

Nice to have

  • Experience in the telecoms industry.
  • Infrastructure-as-code experience (Terraform, Pulumi).
  • Experience working in a scale up/dynamic consulting environment.
  • Exposure to accounting concepts.

Benefits

  • Working
    • Fully remote working, or from the office with daily lunch
    • Flexible working hours
    • High-spec Dell laptop
    • Money towards a keyboard of your choice, which is yours to keep
  • Insurance - fully paid on top of your salary, not out of it
    • Medical Aid, including Gap Cover
    • Life Insurance, with Disability Insurance and Funeral cover
  • Learning
    • Learning Budget - Books or Courses - you choose how to use it
  • Culture
    • People-first culture that encourages work/life balance
    • Everyone has a voice, regardless of title
    • Psychological safety
  • Leave
    • 20 days annual leave
    • Paid Maternity, Paternity, Study & Moving, Conference, CSR Volunteering leave
  • Long-Term Loyalty Benefits
    • 2 years - monthly budget towards a cell phone contract, Uber travel vouchers or petrol card
    • 3 years – can apply for a study bursary to enhance your current and future role with us
    • 5 years - 3 additional days of annual leave
    • 7 years – a local weekend away at our cost
    • 10 years - 3-month paid sabbatical

Interested?

Send your CV to jobs@structureit.net