Key Responsibilities:
- Collaborate with cross-functional teams to identify and address data-related requirements, contributing to strategic planning and execution.
- Architect and develop data solutions, from backend infrastructure to data model design and the implementation of AI systems.
- Lead development activities, mentor team members, and share your knowledge in a culture of continuous improvement.
- Continuously enhance the quality of our tools and applications through bug fixes and code refactoring.
- Leverage the latest data technologies and programming languages, including Python, Scala, and Java, along with systems like Spark, Kafka, and Airflow, within cloud services such as AWS.
- Ensure the ongoing maintenance, troubleshooting, optimization, and reliability of data systems, including timely resolution of unexpected issues.
- Stay abreast of emerging technologies through research and testing, unlocking opportunities for productivity improvements and ensuring we continue to deliver market-leading services to customers.
What You'll Need:
- Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.
- Proficiency in programming languages such as Python, Scala, or Java.
- Solid SQL skills for database querying and management.
- Strong knowledge of relational and NoSQL databases (e.g., PostgreSQL, MongoDB) and data modeling principles.
- Proven ability to design, build, and maintain scalable data pipelines and workflows using tools like Apache Airflow or similar.
- Strong problem-solving and analytical skills.
- Excellent communication and collaboration skills.
Nice to Have:
- Hands-on experience with data warehouse and lakehouse architectures (e.g., Databricks, Snowflake, or similar).
- Experience with big data frameworks (e.g., Apache Spark, Hadoop) and cloud platforms (e.g., AWS, Azure, or GCP).
Benefits:
- 25 Days of Annual Leave
- FitPass membership
- Private Health Insurance
- Internal Reward & Recognition Tool (Kudos)