About the Team
You will be joining the newly formed AI, Data & Analytics team as a Data Engineer, leading various projects within the new Data Platform team. The team is focused on driving increased value from the data InvestCloud captures to enable a smarter financial future for our clients, with a particular focus on "enhanced intelligence". Ensuring we have fit-for-purpose, modern capabilities is a key goal for the team.
Key Responsibilities
· Design, develop, and maintain scalable data pipelines to support diverse analytics and machine learning needs.
· Optimize and manage data architectures for reliability, scalability, and performance.
· Implement and support data integration solutions from our data partners, including ETL/ELT processes, ensuring seamless data flow across platforms.
· Collaborate with Data Scientists, Analysts, and Product Teams to define and support data requirements.
· Manage and maintain data platforms such as Oracle, Snowflake, and/or Databricks, ensuring high availability and performance, whilst optimizing for cost.
· Ensure data security and compliance with company policies and relevant regulations.
· Monitor and troubleshoot data systems to identify and resolve performance issues.
· Develop and maintain datasets and data pipelines to support Machine Learning model training and deployment.
· Analyze large datasets to identify patterns, trends, and insights that can inform business decisions.
· Work with third-party providers of data and data platform products to evaluate and implement solutions that achieve InvestCloud's business objectives.
· Lead a small India-based team within the global organization, working closely with co-located data scientists as well as the broader global team.
Required Skills
· Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field, or equivalent practical experience.
· Minimum of 5 years of professional experience in data engineering or a related role.
· Proficiency in database technologies, including Oracle and PostgreSQL.
· Hands-on experience with Snowflake and/or Databricks, with a solid understanding of their ecosystems.
· Expertise in programming with Python and SQL.
· Strong knowledge of ETL/ELT tools and data integration frameworks.
· Experience with cloud platforms such as AWS, GCP, or Azure.
· Familiarity with containerization, version control, and CI/CD tooling (e.g., Docker, Git).
· Excellent problem-solving skills and the ability to handle complex datasets.
· Outstanding communication skills to collaborate with technical and non-technical stakeholders globally.
· Knowledge of data preprocessing, feature engineering, and model evaluation metrics.
· Excellent proficiency in English.
· Ability to work in a fast-paced environment across multiple projects simultaneously.
· Ability to lead a small team, ensuring a highly productive, collaborative, and positive environment.
· Ability to collaborate effectively as a team player, fostering a culture of open communication and mutual respect.
Preferred Skills
· Experience with real-time data processing and streaming platforms (e.g., Apache Kafka).
· Knowledge of data warehousing and data lake architectures.
· Familiarity with governance frameworks for data management and security.
· Knowledge of Machine Learning frameworks (e.g., TensorFlow, PyTorch, Scikit-learn) and LLM frameworks (e.g., LangChain).
What We Offer
Join our diverse and international cross-functional team, comprising data scientists, product managers, business analysts, and software engineers. As a key member of our team, you will have the opportunity to implement cutting-edge technology to create a next-generation advisor and client experience.
Location and Travel
The ideal candidate will be expected to work from the office (with some flexibility). Occasional travel may be required.