
Senior Data Engineer

Weekday AI
Full-time
On-site
Bengaluru, Karnataka, India
₹4 - ₹7 LPA (INR)
Description

This role is for one of Weekday's clients.
Salary range: ₹400,000 - ₹700,000 (i.e., INR 4-7 LPA)
Min Experience: 3 years
Location: Bengaluru
Job Type: Full-time



Requirements

We are seeking a skilled Data Engineer to design, build, and maintain scalable data solutions that empower analytics, business intelligence, and machine learning initiatives.

The ideal candidate will be experienced in data architecture, pipeline development, and visualization, with a passion for transforming raw data into actionable insights.

You will collaborate with cross-functional teams to develop reliable, high-performing data systems that support enterprise-wide decision-making.

Key Responsibilities

Data Engineering & Analytics

  • Design and maintain robust ETL pipelines for structured and unstructured data.
  • Develop and optimize data lakes and warehouses using platforms such as Databricks, Azure Synapse, Snowflake, BigQuery, or Redshift.
  • Ensure data quality, governance, and performance optimization across systems.
  • Implement both batch and real-time data processing solutions using Spark, Kafka, or Hadoop.
  • Translate business requirements into data models, metrics, and dashboards.
  • Build interactive dashboards and visualizations using Power BI, Tableau, or Looker.
  • Conduct exploratory data analysis and develop reports using Python and SQL.
  • Collaborate with data science teams to prepare and deliver structured data for AI/ML models.

Collaboration & Documentation

  • Work closely with engineering, product, and business teams to align on key KPIs and analytics requirements.
  • Document data workflows, models, and analysis outputs for transparency and reuse.

Required Skills & Expertise

  • Programming: Proficiency in Python (Pandas, NumPy, PySpark) and SQL.
  • ETL & Workflow Orchestration: Hands-on experience with Airflow, dbt, or Azure Data Factory.
  • Data Platforms: Deep understanding of Databricks, Synapse, Snowflake, BigQuery, Redshift, and cloud storage systems (S3, ADLS, GCS).
  • Big Data Technologies: Working knowledge of Spark, Hadoop, and Kafka.
  • Visualization Tools: Expertise in Power BI, Tableau, or Looker.
  • Cloud Ecosystems: Experience with AWS, Azure, or GCP.
  • Strong foundation in data modeling (star and snowflake schemas) and query optimization.
  • Analytical mindset with a focus on solving business problems using data.


Preferred / Good-to-Have Skills

  • Experience with MLOps and machine learning pipelines (Databricks, Azure ML).
  • Exposure to API-based integrations and microservices data flows.
  • Familiarity with CI/CD, Git, and DevOps practices.
  • Understanding of data security, compliance, and governance frameworks (GDPR, HIPAA).
  • Strong communication skills with the ability to simplify complex data insights for business stakeholders.

Qualifications

  • Bachelor’s or Master’s degree in Computer Science, Data Engineering, Statistics, or a related field.
  • 3+ years of hands-on experience in data engineering and analytics within large-scale data environments.