
Data Engineer - GCP

AuxoAI Engineering
Full-time
On-site
Bangalore, Karnataka, India

AuxoAI is hiring a Data Engineer to join our growing data engineering team focused on building production-grade pipelines on Google Cloud Platform (GCP). This is a hands-on role ideal for someone early in their data career who's eager to learn fast, work with modern cloud-native tools, and support the development of scalable data systems.

You’ll work alongside experienced engineers and analysts on real client projects across industries, helping implement ELT processes, BigQuery pipelines, orchestration workflows, and foundational MLOps capabilities. 


Responsibilities:
• Assist in developing scalable data pipelines using GCP tools such as Dataflow, BigQuery, Cloud Composer, and Pub/Sub

• Write and maintain SQL and Python scripts for data ingestion, cleaning, and transformation

• Support the creation and maintenance of Airflow DAGs in Cloud Composer for orchestration (an illustrative sketch follows this list)

• Collaborate with senior data engineers and data scientists to implement data validation and monitoring checks

• Participate in code reviews, sprint planning, and cross-functional team meetings

• Help with documentation and knowledge base creation for data workflows and pipeline logic

• Gain exposure to medallion architecture, data lake design, and performance tuning on BigQuery
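For candidates less familiar with Cloud Composer, the sketch below shows the kind of Airflow DAG this role would help maintain: a minimal pipeline that loads raw files from Cloud Storage into a BigQuery staging table and then runs a SQL transformation. It is illustrative only; the project, dataset, bucket, and task names are invented placeholders, not AuxoAI resources.

```python
# Illustrative sketch only: a minimal Airflow DAG of the kind maintained in
# Cloud Composer. All project, dataset, and bucket names are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

with DAG(
    dag_id="daily_orders_elt",        # hypothetical pipeline name
    schedule="@daily",                # Airflow 2.4+ scheduling argument
    start_date=datetime(2024, 1, 1),
    catchup=False,
) as dag:
    # Land raw CSV files from Cloud Storage into a BigQuery staging table.
    load_raw = GCSToBigQueryOperator(
        task_id="load_raw_orders",
        bucket="example-landing-bucket",                      # placeholder bucket
        source_objects=["orders/*.csv"],
        destination_project_dataset_table="example_project.staging.orders",
        source_format="CSV",
        skip_leading_rows=1,
        write_disposition="WRITE_TRUNCATE",
    )

    # Transform staging data into a curated table with a BigQuery SQL job.
    transform = BigQueryInsertJobOperator(
        task_id="transform_orders",
        configuration={
            "query": {
                "query": (
                    "CREATE OR REPLACE TABLE example_project.curated.orders AS "
                    "SELECT order_id, customer_id, "
                    "CAST(order_ts AS TIMESTAMP) AS order_ts, amount "
                    "FROM example_project.staging.orders"
                ),
                "useLegacySql": False,
            }
        },
    )

    load_raw >> transform
```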



Requirements:
• 2-4 years of relevant experience in data engineering, backend development, or analytics engineering

• Strong knowledge of SQL and working-level proficiency in Python

• Exposure to cloud platforms (GCP preferred; AWS/Azure acceptable)

• Familiarity with data pipeline concepts, version control (Git), and basic workflow orchestration

• Strong communication and documentation skills

• Eagerness to learn, take feedback, and grow under mentorship

Bonus Skills:

• Hands-on experience with GCP tools like BigQuery, Dataflow, or Cloud Composer

• Experience with dbt, Dataform, or Apache Beam (a brief Beam sketch follows this list)

• Exposure to CI/CD pipelines, Terraform, or containerization (Docker)

• Knowledge of basic data modeling and schema design concepts
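As context for the Dataflow and Apache Beam bonus skills, below is a minimal sketch of a Beam batch pipeline in Python that reads CSV files from Cloud Storage and writes them to BigQuery. It is illustrative only; the bucket, project, table, and schema are invented placeholders.

```python
# Illustrative sketch only: a tiny Apache Beam batch pipeline of the style that
# runs on Dataflow. Bucket, project, and table names are placeholders.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def parse_csv_line(line: str) -> dict:
    """Split a raw CSV order line into a simple dict (no quoting handled)."""
    order_id, customer_id, amount = line.split(",")
    return {"order_id": order_id, "customer_id": customer_id, "amount": float(amount)}


if __name__ == "__main__":
    options = PipelineOptions()  # pass --runner=DataflowRunner etc. on the CLI
    with beam.Pipeline(options=options) as p:
        (
            p
            | "ReadRaw" >> beam.io.ReadFromText(
                "gs://example-landing-bucket/orders/*.csv", skip_header_lines=1
            )
            | "Parse" >> beam.Map(parse_csv_line)
            | "WriteToBQ" >> beam.io.WriteToBigQuery(
                "example_project:curated.orders",
                schema="order_id:STRING,customer_id:STRING,amount:FLOAT",
                write_disposition=beam.io.BigQueryDisposition.WRITE_TRUNCATE,
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
            )
        )
```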


