Data Tech Lead

Location Estonia
Job Type Permanent
Salary 65-80k EUR DOE
Reference 33534

Our client, a technology solutions company passionate about customer-tailored product development, is currently hiring a Data Tech Lead to join their team in Estonia!

From requirements definition and specification, through software coding and development, to application support and maintenance, they understand and assist with the entire product lifecycle.

If you are passionate about building and managing data pipelines, always keep the efficiency and reliability of data flows in mind, and are keen to lead a talented Data team, then this role might be for you!

Role:

    • Build and manage efficient and reliable (batch and real-time) data pipelines from disparate data sources (Kafka and third-party tools)

    • Design, develop and launch data ingestion and storage systems with high availability and reliability that can scale (don’t worry, we like scalable and highly available cloud solutions here)

    • Drive the advancement of data infrastructure by developing and implementing underlying logic and structure for how data is set up, cleaned, and stored

    • Participate in & facilitate interviews of future Data Engineer teammates

    • Participate in the E2E lifecycle of a product delivery and guide the teams as required

    • Architect, launch and manage automated extraction and transformation processes

    • Build a scalable data aggregation layer from streams and batches of data for data visualisation

    • Collaborate with development teams on design, architecture, and expansion of infrastructure

    • Collaborate on building the future development plans

    • Collaborate with other Tech Leads and Product Owners on prioritisation of technical and architectural features

    • Work as an SME on Operational Data Store, Data Warehouse, and Data Mart development

    • Guide development and design activities with respect to input and data dependencies

Skills & Qualifications:

    • Strong communication skills

    • Experience leading a technical team

    • Ability to mentor other DE team members

    • Ability to clarify/translate customer requirements into Epics/Stories

    • 7+ years of Data Warehouse/Data Lake architecture and development experience

    • Knowledge of programming languages like Scala, Java and Python

    • Experience in building architectures based on streaming data technologies for low-latency data processing (Apache Spark/Flink, Apache Kafka, Hadoop ecosystem)

    • Experience in data pipeline orchestration (Apache Airflow)

    • Confidence in using Git, CI/CD, and containerization

    • Experience with Data Quality Tools, Monitoring and Alerting

    • Familiarity with core Kubernetes concepts

    • Experience working with data coming from various sources (RDBMS, APIs, files) in various formats (JSON, Avro, Parquet, Delta)

    • Experience participating in an Agile software development team, e.g. SCRUM

    • Knowledge of software development lifecycles/methodologies (e.g. Agile), as well as strong presentation and collaboration skills; able to communicate all aspects of the job requirements, including the creation of formal documentation

    • Our working environment is fully English-speaking; the ability to communicate in English is required

Next steps: Apply with your CV and we will contact you soon!

Apply Now