We are looking for a Data Engineer with strong skills in Python and SQL, as well as Microsoft Azure and Databricks. Expertise in data modeling and ETL processes is essential, along with strong communication, analytical, and teamwork skills. Functional knowledge of banking EDW/EDLs is a valuable plus. Communication with the team and stakeholders will be in English and Portuguese, and the role follows a hybrid model based in Lisbon.
Requirements:
• Bachelor's or Master's degree in Computer Science, Engineering, or a related field
• 4+ years of professional experience in Data Engineering
• Advanced proficiency in Python and SQL
• Strong understanding of data modeling and data platform concepts
• Expertise in both on-premises and cloud-based solutions
• Hands-on experience with ETL processes and data quality engines
• Proficiency in Microsoft Azure technologies, such as Data Factory, Functions, and ADLS
• Experience working with Databricks for big data processing and analytics
Nice-to-have:
• Experience with Microsoft Fabric and building and managing data products
• Experience with Power BI or similar business intelligence tools
• Familiarity with big data technologies, such as Hadoop, Spark, or Kafka
• Hands-on experience with DevOps and Git, particularly in implementing CI/CD pipelines
• Exposure to Agile methodologies and practices
• Involvement in data governance and data quality projects
Nice-to-have additional certifications:
• DP-600 Fabric Analytics Engineer Associate
• DP-700 Fabric Data Engineer Associate
• PL-300 Power BI Data Analyst Associate
• Other Microsoft certifications (e.g. DP, AZ, AI) and/or equivalent certifications from other cloud providers (e.g. AWS)
If this sounds like you, share your CV with us and let's talk more!