
Senior Big Data Engineer

Nagarro
Full-time
On-site
Warsaw, Poland

Company Description

👋🏼 We're Nagarro.

We are a digital product engineering company that is scaling in a big way! We build products, services, and experiences that inspire, excite, and delight. We work at scale — across all devices and digital mediums, and our people exist everywhere in the world (18 000+ experts across 39 countries, to be exact). Our work culture is dynamic and non-hierarchical. We're looking for great new colleagues. That's where you come in!

By this point in your career, it is not just about the tech you know or how well you can code. It is about what more you want to do with that knowledge. Can you help your teammates proceed in the right direction? Can you tackle the challenges our clients face while always looking to take our solutions one step further to succeed at an even higher level? Yes? You may be ready to join us.

Job Description

  • Design, build, and maintain scalable data pipelines using Python/PySpark and AWS services (a brief illustrative sketch follows this list).
  • Develop, test, and deploy dbt models, ensuring quality through unit tests and clear documentation.
  • Build and integrate data connectors (FTP, API, JDBC, etc.).
  • Implement data ingestion, transformation, and orchestration workflows using AWS Glue, AppFlow, Lake Formation, Transfer Family, MWAA, or Argo.
  • Use AWS CDK for infrastructure-as-code and deployment automation.
  • Apply best practices for CI/CD, version control (Git), testing, and data quality monitoring.
  • Collaborate with cross-functional teams to understand data requirements and optimize data flows.
  • Troubleshoot performance issues and ensure secure, reliable, and high-performing data pipelines. 
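As a rough illustration of the kind of pipeline work listed above, the sketch below shows a minimal PySpark ingestion and transformation step: it reads raw CSV from S3, applies basic cleansing, and writes partitioned Parquet for downstream models. The bucket paths, column names, and job name are hypothetical and not taken from the role description.

    # Illustrative only: a minimal PySpark ingestion/transformation step.
    # Bucket paths and column names are hypothetical.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("orders-ingest").getOrCreate()

    # Read raw CSV landed in S3 (e.g. via Transfer Family or AppFlow).
    raw = spark.read.option("header", True).csv("s3://example-raw-bucket/orders/")

    # Basic cleansing: parse timestamps, derive a partition date,
    # drop records without a key, and de-duplicate.
    cleaned = (
        raw.withColumn("order_ts", F.to_timestamp("order_ts"))
           .withColumn("order_date", F.to_date("order_ts"))
           .filter(F.col("order_id").isNotNull())
           .dropDuplicates(["order_id"])
    )

    # Write curated Parquet, partitioned by date, for downstream dbt models.
    cleaned.write.mode("overwrite").partitionBy("order_date").parquet(
        "s3://example-curated-bucket/orders/"
    )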

Qualifications

  • Strong hands-on experience with dbt (core concepts, models, tests, documentation).
  • Proficiency in Python and/or PySpark for ETL/ELT development.
  • Expertise in building connectors via FTP, API, JDBC, or other integration patterns.
  • Solid experience with AWS data services: Glue, AppFlow, Lake Formation, Transfer Family, S3, MWAA, etc.
  • Experience with workflow orchestration tools such as MWAA or Argo.
  • Knowledge of AWS CDK for infrastructure automation (a brief illustrative sketch follows this list).
  • Experience with Git, CI/CD tools, and automated testing practices.
  • Strong understanding of data quality monitoring, validation tests, and logging.
  • Good communication skills and ability to work in cross-functional, agile teams.
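As a small illustration of the infrastructure-as-code approach mentioned above, the sketch below shows a minimal AWS CDK (v2, Python) stack that provisions a single S3 landing bucket. The stack and bucket names are hypothetical.

    # Illustrative only: a tiny AWS CDK (v2, Python) app defining one S3 bucket
    # for a raw landing zone. All names are hypothetical.
    import aws_cdk as cdk
    from aws_cdk import aws_s3 as s3
    from constructs import Construct

    class DataLakeStack(cdk.Stack):
        def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
            super().__init__(scope, construct_id, **kwargs)
            # Versioned, SSE-S3 encrypted bucket for raw ingests.
            s3.Bucket(
                self,
                "RawLandingBucket",
                versioned=True,
                encryption=s3.BucketEncryption.S3_MANAGED,
            )

    app = cdk.App()
    DataLakeStack(app, "DataLakeStack")
    app.synth()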