Requirements
- 5+ years in a similar role
- Proficiency in building scalable systems using Python, Docker, and Kubernetes
- Experience building big data pipelines with tools such as Apache Spark and Dataflow, and orchestrating workflows with Apache Airflow
- Skilled in designing data marts and managing complex data pipelines to meet business needs
- Strong in data modeling and SQL, with experience across relational, analytical, and distributed NoSQL databases such as Postgres, BigQuery, MongoDB, and Cassandra
- Experience with BI tools and reporting frameworks
- Agile development experience in cross-functional, distributed teams
- English fluency at B2–C1 level
Key Responsibilities
- Design, build, and maintain robust data ingestion and processing pipelines for large-scale data
- Collaborate with product teams to deliver comprehensive data solutions that meet business requirements
- Develop and manage data warehouses, marts, and aggregations that offer deep insights into customer and product behavior
- Implement real-time monitoring and alerting for pipeline performance and reliability
- Document key data assets, KPIs, and metrics to promote knowledge sharing and transparency
- Continuously improve the cloud-native data architecture for greater scalability and efficiency
Perks
- 50% KUP (CoE)
- Annual bonus (CoE)
- Private medical healthcare (CoE)
Salary
- B2B: 23,000–30,000 PLN net per month
- CoE: 20,000–25,000 PLN gross per month