This is a remote position.
Description:
- At least 2 years of hands-on experience on an Agile data engineering team building big data pipelines with Azure in a commercial environment.
- Experience dealing with senior stakeholders/leadership
- Understanding of Azure data security and encryption best practices (ADFS/ACLs).
- Databricks - experience writing and working in Databricks, using Python to transform and manipulate data.
- Data Factory - experience using Data Factory in an enterprise solution to build data pipelines, including calling REST APIs.
- Synapse/data warehouse - experience using Synapse/a data warehouse to present data securely and to build and manage data models.
- Microsoft SQL Server - we'd expect the candidate to have come from a SQL/data background and progressed into Azure.
- Power BI - experience with this is preferred.
Additionally:
- Experience using Git as a source control system
- Understanding of DevOps concepts and their application
- Understanding of Azure cloud cost management and running platforms efficiently
Requirements
- 3 to 6 years of experience.
- Must have: Azure services such as Azure Data Factory, Databricks, Azure Data Lake, Synapse/DW, and SQL
- Good to have: Python, Power BI, Git