The role

As a Senior Data Engineer, you will be part of our Data Engineering Chapter and responsible for delivering data solutions on engagements with our customers and partners. You will bring deep data engineering experience, developing solutions that meet the data processing and warehousing requirements of large enterprises. You will design, implement, and optimise highly scalable, flexible, secure, and resilient data solutions that solve business problems and accelerate the adoption of our clients' data initiatives on the cloud. You will work closely with product owners, analysts, engineers, and business stakeholders to deliver high-quality data and analytics products that drive business insights and innovation.

What you should expect

Your responsibilities will include:
- Understanding business objectives and delivering quality data-centric solutions within committed timeframes
- Supporting the discovery and definition of critical business use cases that will be enabled by the data platform, based on industry experience and best practices
- Developing and evolving high-level platform technology architecture, incorporating emerging technologies and best practices
- Collaborating with cross-functional teams (e.g., data science, engineering, business) to define and evaluate technology solution options
- Providing technology recommendations that align with business objectives and technical constraints
- Developing high-level solution designs that translate the architecture into a practical and feasible implementation plan
- Designing and building end-to-end data solutions and data assets that integrate data from internal and external sources, aligned with business objectives
- Developing and maintaining data pipelines for model training and inference
- Performing data analysis, feature engineering, model evaluation, and model optimisation
- Developing and sharing industry-leading standards and practices to ensure high-quality solutions that minimise risk
- Collaborating and communicating with business and delivery stakeholders, and working independently without supervision
- Identifying, escalating, and remediating technical debt to support continuous improvement and effective risk management
- Building solutions that are fit for purpose, reliable, scalable, and performant with large datasets and complex data transformation rules
- Implementing development and release management processes
- Mentoring and coaching team members on best practices, technologies, and problem-solving techniques

What you will need to succeed

- Bachelor's degree or higher in IT, Software Engineering, or Computer Science
- 5+ years of data engineering experience
- Highly experienced in building enterprise data platforms and implementing industry-leading data engineering standards and practices, with a solid understanding of cloud data platform architecture and design
- Expert in designing, building, and maintaining robust and scalable real-time and batch data pipelines using frameworks such as Apache Spark and Apache Flink
- Strong understanding of data warehousing concepts, dimensional modelling, and data structures
- Adept with data platforms such as Hadoop, Teradata, Snowflake, and Databricks, as well as NoSQL databases such as Cassandra, MongoDB, and ScyllaDB
- Experience with data profiling, cleansing, and validation techniques
- Experience with machine learning algorithms, model building, and evaluation
- Experience leveraging foundational generative AI models (LLMs) and applying them to business-specific use cases
- Experience with stream processing technologies
- Expert in SQL, Spark, Scala, Java, Python, Kafka, HBase, Streaming, MLlib, Flink, and Airflow
- Experience with data visualisation tools such as Tableau and Power BI
- Experience with cloud environments: AWS (EMR, Redshift, Lambda, S3, Glue, DynamoDB, Athena), Azure (Data Factory), and GCP (BigQuery, Dataflow)
- Proficient with tooling such as IntelliJ IDEA, AutoSys, Git, Jenkins, GitHub, Jira, and Confluence
- Experience with containerisation technologies such as Docker and Kubernetes
- Experience automating tasks such as code builds, testing, deployment, and monitoring
- Strong understanding of data management, security, and privacy practices
- Highly experienced in agile methodologies, continuous integration, test automation, and issue tracking
- Creative problem solver with a drive for continuous improvement
- Experience working in the financial services industry
- Share in our company values

What we offer

- Opportunity to join a growing and innovative global organisation that places employee and customer experience at its heart
- Be part of a team that is empowered to imagine, inspire, create, and innovate
- Flexible hybrid work environment
- Team culture that celebrates achievements and successes
- Opportunity to participate in social and sustainability initiatives that build a better future
- Leadership and mentoring opportunities, investing in the growth and development of our people
- Dynamic environment working with industry professionals and leading partner organisations

Life at Cayian

At Cayian, we foster a culture of performance through continuous improvement and learning. We operate in a dynamic, diverse, and inclusive environment, where we are empowered to imagine, inspire, create, and innovate. Together, we apply our skills and expertise to solve complex challenges and achieve great things for our customers, partners and society. We have a passion for delivery and service excellence and a drive for progression. We learn from our experiences and celebrate our successes.
If you share our values and have a drive to apply your expertise as we do great things, come join us!