
Sr Data Engineer

IND
Full-time
On-site
Pune, Maharashtra, India

 

Ecolab is looking for a Sr. Data Engineer to join the Analytics and Digital Solutions Team within Ecolab Digital in support of Global Supply Chain. If you're a passionate professional seeking growth and a rewarding career, we invite you to apply. This is an excellent opportunity to join a recognized global company offering competitive compensation, benefits, and significant career advancement. 

 

What You'll Do 

As a technical expert, you'll drive continuous improvements in our digital capabilities and advanced analytics. You'll lead the development of new digital products, providing critical insights to solve business challenges. A key part of your role will be enhancing data utilization across the organization through improved processes, governance, and data management. We're looking for someone passionate about building a strong data foundation and adopting modern data architecture for current and future analytical needs. 

 

Key Responsibilities 

  • Lead data initiatives: Serve as a liaison among stakeholders to analyze and define data requirements for reporting and business process changes. 

  • Manage data infrastructure: Proactively manage Snowflake and SQL databases and analytical data models. 

  • Drive data excellence: Develop, test, and tune semantic models for enterprise reporting, ensuring compliance with IT security requirements. 

  • Advance data architecture: Lead the adoption of modern data architecture and identify opportunities to solve business problems with state-of-the-art solutions. 

  • Perform reverse engineering: Analyze and understand existing complex data structures and processes to facilitate migrations, integrations, or improvements. 

  • Foster best practices: Mentor peers on implementing and improving our data management and governance framework across technologies like Microsoft Power BI, Snowflake, Microsoft Azure, and on-premises data sources. 

  • Promote Agile methodologies: Champion and follow SCRUM/Agile frameworks. 

 

 

What You'll Need 

  • Experience: 

      • 5+ years in a data engineering, analytics, or business intelligence role. 

      • Proficiency in ETL data engineering, dimensional data modeling, master data management, data governance, and end-to-end data lineage documentation. 

      • Strong experience with cloud data warehouses, specifically Snowflake (streams, tasks) and the Azure platform (SQL Server, Logic Apps, App Services, Data Factory, and Power BI, including pipelines, Lakehouse, and Warehouse). 

      • Advanced SQL skills (cursors, triggers, CTEs, stored procedures, functions, external tables, dynamic tables, security roles) and Python skills (object-oriented programming, handling JSON/XML). 

      • Experience building ETL or ELT data pipelines using Snowflake streams, tasks, and stored procedures, as well as Fivetran/dbt. 

      • Experience with the medallion data architecture framework. 

      • Expertise in developing analytical applications with Python, Streamlit, Flask, Node.js, the Graph API, and Power Apps, and in deploying applications to Azure App Services. 

      • At least 2 years of Agile/Scrum project management experience, including data requirements gathering and project ownership. 

  • Skills: Exceptional analytical and problem-solving abilities. 

  • Education: A Bachelor’s or Master’s degree in Mathematics, Statistics, Computer Science, Information Technology, or Engineering is preferred. 

 

Bonus Points If You Have 

  • Experience working on applications within the supply chain domain. 

  • Proven ability to manage and deliver multiple projects simultaneously in a dynamic environment. 

  • Experience creating, implementing, and continuously improving processes. 

  • Familiarity with data science areas like time series analysis, forecasting, classification, and regression analysis. 

  • Experience with ERP systems (SAP preferred). 

  • Demonstrated ability to collaborate effectively with global cross-functional teams and individuals with varying technical expertise. 

  • Experience developing cloud-hosted applications. 

  • Experience with PySpark, Fivetran, dbt (ETL/ELT), Python, Streamlit, Flask, Node.js, Power Apps, the Graph API, and Azure App Services. 

  • Experience with source control (Git).