The successful candidate will have a proactive approach and the ability to work both independently and collaboratively on a wide range of projects. In this role, you will join a small but impactful team, collaborating with data analysts, software developers, project managers, and other teams at Apple to understand requirements and translate them into scalable, reliable, and efficient data pipelines and data processing workflows.
- Architect and implement large-scale systems and data pipelines with a focus on agility, interoperability, simplicity, and reusability
- Translate business acumen, domain expertise, and strategy into data modeling solutions
- Apply deep knowledge of infrastructure, warehousing, data protection, security, data collection, processing, modeling, and metadata management to build end-to-end solutions that support metadata logging, anomaly detection, data cleaning, and transformation
- Identify process improvement opportunities (tools, work streams, systems) and drive solutions from conception to implementation
- Demonstrate and explain complex business processes, systems, and tools, with a focus on upstream and downstream impact and the relationships between multiple functions and decisions
- Identify and address issues in data design or integration
- Discuss technical tradeoffs across the stack, including system architecture, database design, API design, and infrastructure