
Senior Data Engineer

Computer Experts Personnel
Full-time
On-site
Gauteng, South Africa
Senior Jobs

Introduction

We are seeking a Senior Data Engineer to join our growing engineering team. This is a key role for a motivated and technically skilled individual with a solid foundation in software engineering and data systems. You will build scalable data infrastructure, implement robust data integrations, and collaborate with cross-functional teams to solve real-world data challenges.
What You’ll Do
● Design, develop, test, and maintain reliable data pipelines and ETL processes using Python and SQL
● Build and manage API-based data ingestion workflows and real-time data integrations
● Apply software engineering best practices: modular design, testing, version control, and documentation
● Own and optimize data workflows and automation, ensuring efficiency and scalability
● Collaborate closely with senior engineers, data scientists, and stakeholders to translate business needs into technical solutions
● Maintain and enhance data reliability, observability, and error handling in production systems
● Develop and support internal data-driven tools
● Implement data operations best practices, including automated monitoring, alerting, and incident response for pipeline health
● Apply DataOps principles: CI/CD for data workflows, infrastructure-as-code, and containerized ETL deployments
What You Bring
● 7+ years of professional experience as a Data Engineer, or in a similar role developing ETL data pipelines
● Advanced proficiency in Python for backend development and scripting
● Strong SQL skills with hands-on experience in querying and modeling relational databases
● Experience with cloud platforms such as AWS, GCP, or Azure
● Hands-on with containerization technologies like Docker or Kubernetes
● Solid understanding of RESTful APIs
● Experience with version control systems (GitHub, GitLab, Bitbucket) and CI/CD workflows
● Strong grasp of software development lifecycle (SDLC) and principles of clean, maintainable code
● Demonstrated ability to work independently, own projects end-to-end, and mentor junior engineers
● Familiarity with AI concepts and prompt engineering is a plus
Nice to Have:
● Experience with data security, privacy compliance, and access controls
● Knowledge of infrastructure-as-code tools (e.g., Terraform, Helm)
● Background in event-driven architecture or stream processing 

Package & Remuneration

Up to R100K per month