SHARE your talent
We’re looking for someone who has these abilities and skills:
Required Skills and Abilities:
· Effective communication skills.
· Bachelor’s degree in Computer Science, Mathematics, Statistics, Finance, a related technical field, or equivalent work experience.
· Relevant years of extensive work experience with various data engineering and modeling techniques (relational, data warehouse, semi-structured, etc.), application development, and advanced data querying.
· Relevant years of programming experience using Databricks.
· Relevant years of experience using the Microsoft Azure suite of products (ADF, Synapse, and ADLS).
· Solid knowledge of network and firewall concepts.
· Solid experience writing, optimizing and analyzing SQL.
· Relevant years of experience with Python.
· Ability to break down complex data requirements and architect solutions into achievable targets.
· Robust familiarity with Software Development Life Cycle (SDLC) processes and workflow, especially Agile.
· Experience using Harness.
· Experience as a technical lead, responsible for both individual and team deliveries.
Desired Skills and Abilities:
· Experience with big data migration projects.
· Experience with performance tuning on both database and big data platforms.
· Ability to interpret complex data requirements and architect solutions.
· Distinctive problem-solving and analytical skills combined with robust business acumen.
· Strong fundamentals in Parquet and Delta file formats.
· Effective knowledge of the Azure cloud computing platform.
· Familiarity with reporting software such as Power BI is a plus.
· Familiarity with dbt is a plus.
· Passion for data and experience working within a data-driven organization.
· You care about what you do, and what we do.
AXA XL, the P&C and specialty risk division of AXA, is known for solving complex risks. For mid-sized companies, multinationals and even some inspirational individuals we don’t just provide re/insurance, we reinvent it.
How? By combining a comprehensive and efficient capital platform, data-driven insights, leading technology, and the best talent in an agile and inclusive workspace, empowered to deliver top client service across all our lines of business − property, casualty, professional, financial lines and specialty.
With an innovative and flexible approach to risk solutions, we partner with those who move the world forward.
Learn more at axaxl.com
AXA XL is committed to equal employment opportunity and will consider applicants regardless of gender, sexual orientation, age, ethnicity and origins, marital status, religion, disability, or any other protected characteristic.
At AXA XL, we know that an inclusive culture and a diverse workforce enable business growth and are critical to our success. That’s why we have made a strategic commitment to attract, develop, advance and retain the most diverse workforce possible, and create an inclusive culture where everyone can bring their full selves to work and can reach their highest potential. It’s about helping one another — and our business — to move forward and succeed.
Learn more at axaxl.com/about-us/inclusion-and-diversity. AXA XL is an Equal Opportunity Employer.
Sustainability
At AXA XL, Sustainability is integral to our business strategy. In an ever-changing world, AXA XL protects what matters most for our clients and communities. We know that sustainability is at the root of a more resilient future. Our 2023-26 Sustainability strategy, called “Roots of resilience”, focuses on protecting natural ecosystems, addressing climate change, and embedding sustainable practices across our operations.
For more information, please see axaxl.com/sustainability.
Data Engineer
Gurgaon/Bangalore, India
AXA XL recognizes data and information as critical business assets, both in terms of managing risk and enabling new business opportunities. This data should not only be of high quality but also actionable, enabling AXA XL’s executive leadership team to maximize benefits and facilitate sustained competitive advantage. Our Chief Data Office, also known as our Innovation, Data Intelligence & Analytics (IDA) team, is focused on driving innovation by optimizing how we leverage data to drive strategy and create a new business model, disrupting the insurance market. As we develop an enterprise-wide data and digital strategy that moves us toward a greater focus on data and data-driven insights, we are seeking a Data Engineer. The role will support the team’s efforts to create, enhance, and stabilize the enterprise data lake through the development of data pipelines. It requires a team player who can work well with team members from other disciplines to deliver data in an efficient and strategic manner.
DISCOVER your opportunity
What will your essential responsibilities include?
· Act as a data engineering expert and partner to Global Technology and data consumers in controlling complexity and cost of the data platform, whilst enabling performance, governance, and maintainability of the estate.
· Understand current and future data consumption patterns and architecture at a granular level, and partner with architects to ensure optimal design of data layers.
· Apply best practices in data architecture, for example: the balance between materialization and virtualization, the optimal level of de-normalization, caching and partitioning strategies, choice of storage and querying technology, and performance tuning.
· Lead and execute hands-on research into new technologies, formulating frameworks to assess new technology against business benefit and implications for data consumers.
· Act as a best practice expert and blueprint creator for ways of working such as testing, logging, CI/CD, observability, and releases, enabling rapid growth in the data inventory and utilization of the Data Science Platform.
· Design prototypes and work in a fast-paced iterative solution delivery model.
· Design, develop, and maintain ETL pipelines using PySpark in Azure Databricks with Delta tables.
· Use Harness for the deployment pipeline.
· Monitor the performance of ETL jobs, resolve any issues that arise, and improve performance metrics as needed.
· Diagnose system performance issues related to data processing and implement solutions to address them.
· Collaborate with other teams to ensure successful integration of data pipelines into the larger system architecture.
· Maintain integrity and quality across all pipelines and environments.
· Understand and follow secure coding practices to ensure code is not vulnerable.