
Enterprise Data Warehouse Operate Lead
- Sydney, NSW
- Permanent
- Full-time
- Thrive in an innovative, collaborative people culture
- Mentoring, coaching and leadership programs to help you make an impact that matters
- We support flexibility and choice. We encourage you to find the right balance between connecting in person with your clients and teams and meeting your own personal needs
Key responsibilities:
- Data Warehouse Management: Oversee the daily operations of the Azure Databricks data warehouse and manage the 3rd-level support function, ensuring optimal performance, availability, and security.
- People Leadership: Mentor and lead the team, fostering a collaborative and innovative environment that supports everyone to work at their best.
- Collaboration and Knowledge Transfer: Work closely with key business stakeholders, data engineering leads and ITS teams. Ensure knowledge is shared and documentation is kept up to date.
- Performance Optimisation: Monitor and optimise data warehouse performance, identifying and resolving bottlenecks.
- Cost Strategies: Develop and enforce strategies to optimise resource utilisation and reduce unnecessary expenditure.
- Data Governance: Enforce data governance policies to ensure data quality, consistency, and compliance with relevant regulations.
- Compliance and Security Management: Assess and ensure that all processes adhere to the firm's compliance requirements.
- Process Automation: Drive process automation initiatives to streamline operations, increase efficiency, and maintain high standards of data integrity and security.
- Incident Management: Lead incident management, root cause analysis, and remediation for warehouse-related issues, minimising downtime and service impact while meeting SLA targets.
- Communication: Communicate issues to key stakeholders in a timely fashion.
About you:
- Experience leading a 3rd-level support team.
- Ability to mentor and upskill technical teams, fostering a culture of collaboration and continuous learning.
- Track record of driving process improvement, automation, and cost optimisation initiatives.
- ITIL experience or certification.
- 2+ years' experience in data engineering with a strong focus on the Azure ecosystem. Proficient in Azure Databricks, DBT, and Azure Data Factory.
- Solid understanding of Azure services, including storage options, compute, networking, and security features.
- Proficiency in programming languages such as Python, plus familiarity with SQL for data manipulation and analysis.
- Experience with or knowledge of Azure Fabric is considered a plus, enhancing your ability to integrate new technologies into existing frameworks.
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Advanced degree or relevant certifications in data engineering or Azure technologies preferred.