
Data Engineer: Python, Snowflake

Staples India Business Innovation Hub
Full-time
On-site
Chennai, Tamil Nadu, India

Duties & Responsibilities

  • Design, develop, and maintain scalable ETL/ELT data pipelines to support business and analytics needs
  • Write, tune, and optimize complex SQL queries for data transformation, aggregation, and analysis
  • Translate business requirements into well-designed, documented, and reusable data solutions
  • Partner with analysts, data scientists, and stakeholders to deliver accurate, timely, and trusted datasets
  • Automate data workflows using orchestration/scheduling tools (e.g., Airflow, ADF, Luigi)
  • Develop unit tests, integration tests, and validation checks to ensure data accuracy and pipeline reliability
  • Document pipelines, workflows, and design decisions for knowledge sharing and operational continuity
  • Apply coding standards, version control practices, and peer code reviews to maintain high-quality deliverables
  • Proactively troubleshoot, optimize, and monitor pipelines for performance, scalability, and cost efficiency
  • Support feature rollouts, including availability for post-production monitoring and issue resolution


Requirements

Basic Qualifications

  • Bachelor’s degree in Computer Science, Information Systems, Engineering, or a related field
  • 2–5 years of hands-on experience in data engineering and building data pipelines
  • At least 3 years of experience writing complex SQL queries in a cloud data warehouse/data lake environment
  • Solid hands-on experience with data warehousing concepts and implementations
  • At least 1 year of experience with Snowflake or another modern cloud data warehouse
  • At least 1 year of hands-on Python development
  • Familiarity with data modeling and data warehousing concepts
  • Experience with orchestration tools (e.g., Airflow, ADF, Luigi)
  • Familiarity with at least one cloud platform (AWS, Azure, or GCP)
  • Strong analytical, problem-solving, and communication skills
  • Ability to work both independently and as part of a collaborative team



Preferred Qualifications

  • Experience with dbt (Data Build Tool) for data transformations
  • Exposure to real-time/streaming platforms (Kafka, Spark Streaming, Flink)
  • Familiarity with CI/CD and version control (Git) in data engineering projects
  • Exposure to the e-commerce or customer data domain
  • Understands the technology landscape, stays current on technology trends and emerging tools, and brings new ideas to the team

