
Senior Data Engineer - GCP

AuxoAI Engineering
Full-time
On-site
Bangalore North, Karnataka, India

Auxo is looking for a Senior Data Engineer – GCP Native Platform to design, build, and operate scalable, production-grade data platforms for our enterprise clients.

This is a hands-on delivery role, not a support function. You will own end-to-end data pipeline engineering on Google Cloud Platform, embedding best-in-class data engineering practices across batch and streaming workloads, while working closely with architects, analysts, and data scientists.

Responsibilities
• Design, build, and optimize scalable batch and streaming data pipelines using Dataflow (Apache Beam)
• Develop and manage workflow orchestration using Airflow on Cloud Composer
• Implement ELT transformations using Dataform for SQL-based data modeling and transformations
• Design and maintain BigQuery datasets following layered/medallion architecture patterns
• Implement event-driven ingestion and CDC patterns using Pub/Sub
• Partner with architects to implement technical designs, standards, and platform best practices
• Ensure performance optimization, reliability, monitoring, and cost efficiency of data pipelines
• Implement data quality checks, validations, and monitoring within pipelines
• Support production deployments, incident resolution, and operational stability
• Mentor junior engineers and contribute to engineering excellence across the team

Requirements

Required Skills & Experience

Data Engineering (Strong Hands-on Experience)
• Design and development of production-grade data pipelines
• Batch and streaming data processing architectures
• Workflow orchestration and dependency management
• Data modeling, schema design, and performance optimization
• Pipeline monitoring, troubleshooting, and cost optimization

GCP Data Platform
• Hands-on experience with BigQuery (advanced SQL, partitioning, clustering, optimization)
• Strong experience with Dataflow / Apache Beam (Python or Java)
• Experience with Cloud Composer / Airflow
• Experience with Pub/Sub and Cloud Storage

Technical Foundation
• Strong proficiency in SQL and Python (Java is a plus)
• Solid understanding of ETL/ELT patterns and modern data stack concepts
• Experience with Git-based version control and CI/CD pipelines
• Working knowledge of cloud monitoring and logging

Preferred Qualifications
• GCP Professional Data Engineer certification
• Exposure to Dataform or dbt for transformation workflows
• Experience with real-time streaming architectures
• Familiarity with Vertex AI, Cloud Functions, or Dataproc
• Understanding of data governance concepts and platforms (Dataplex, Atlan, Collibra)
• Experience with legacy-to-cloud data migrations
• Familiarity with Looker or Power BI

