Lead Data Engineer

Location Malta
Job Type Permanent
Salary Very Competitive
Reference 33023

Our client, a global billion dollar iGaming company, is currently hiring a Lead Data Engineer to join their team in Malta!

As a Lead Data Engineer, you will work with other tech leads to define and build solutions for how data is configured, integrated, scaled, controlled, and governed. Our client is looking for a hands-on lead who will closely support the data engineering team in enhancing their entire data suite, which spans multiple data lakes organised under a data mesh paradigm. The role involves liaising with other technical leads and architects within the company, and interacting with technical and non-technical stakeholders and other teams, including operations.

Responsibilities:

  • Propose the best tools/technologies, solutions and processes required to improve the team’s productivity, operations, and business value
  • Act as a technical expert for consumers of the data platform, troubleshooting issues and discussing new capabilities
  • Mentor and support members of the team to develop their technical abilities
  • Ensure technical solutions are in line with the target architecture
  • Ensure data quality and integrity in data pipelines to leverage data at scale
  • Ensure proper source control, documentation, and testing practices are followed to maintain high data quality
  • Perform technical due diligence whilst driving continuous improvement
  • Respond to ad-hoc data requests and conduct analysis as required by management from time to time
  • Stay up to date with the latest industry trends and technologies


Desirable experience:

  • Extensive software engineering experience or B.Sc / M.Sc in Computer Science or related field
  • Strong knowledge of a programming language such as Java, Python, or Scala
  • Hands-on cloud computing experience on AWS, Microsoft Azure, or GCP
  • Experience with data lake technologies such as Kafka, Avro, Parquet and Spark
  • Experience with RDBMS, columnar, and NoSQL databases such as MySQL, PostgreSQL, and Elasticsearch
  • Good knowledge of data orchestration frameworks such as Airflow or Luigi
  • Familiarity with stream processing technologies such as Kafka Streams or Apache Flink
  • Understanding of data modelling and of adapting different data structures to particular use cases
  • Familiarity with continuous integration and delivery principles
  • Excellent problem-solving, analytical thinking, and innovation
  • Team spirit; strong communication skills to collaborate with various stakeholders whilst still being able to work autonomously


Next steps: Apply with your CV and we will contact you soon!