Head of Data Engineering
A company in the online sports betting and gaming sector is seeking a Head of Data Engineering to join their team. This role combines strategic leadership with hands-on technical engagement, contributing to the design, scalability, and efficiency of data solutions that support business insights and analytics.
Main Responsibilities:
Manage day-to-day activities of engineering initiatives within the Business Intelligence domain, including participating in project planning and defining architectural approaches.
Provide direct line management and mentorship to data engineers, supporting career development and team cohesion.
Run agile ceremonies such as sprint planning and retrospectives, and assist in post-incident analysis to drive continuous improvement.
Maintain technical responsibility for the BI technology stack, including overseeing data governance processes, ensuring data quality and consistency (e.g. managing data contracts and schema changes), defining engineering standards, and promoting system stability and performance.
Lead efforts around upgrades, migrations, and reducing technical debt.
Foster collaboration with Data Science and Reporting teams, ensuring their data needs are met and their solutions are deployed effectively.
Coordinate with other Technology teams on projects that span multiple departments to ensure seamless integration and alignment.
Desired experience:
Solid background managing teams of data engineers, including coaching and performance management.
Significant practical experience in data engineering, covering:
Developing Python applications for data solutions.
Using version control and CI/CD tools such as GitHub or GitLab.
Working with cloud environments, preferably Google Cloud Platform, and services like BigQuery, Kubernetes, and Firestore.
Configuring cloud infrastructure with tools such as Terraform.
Implementing messaging technologies such as Kafka or Pub/Sub for data pipelines.
Working with NoSQL technologies like MongoDB or Firestore.
Writing and optimizing complex SQL queries that handle large-scale, intricate data sets, particularly on platforms such as BigQuery or Snowflake.
Utilizing orchestration tools such as Airflow, Luigi, or Cloud Composer for managing workflows and data pipelines.