Senior Data Engineer at Turaco

Job Detail

  • Job ID 1015570
  • Experience  5 Years
  • Qualifications  Bachelor's Degree

Job Description

Key Responsibilities

Architecture & Pipeline Development

  • Design and Architect: Lead the design of optimal data pipeline architecture for both batch and real-time streaming (using tools like Kafka or Spark) to support immediate financial transaction processing.
  • Scale Infrastructure: Identify and implement internal process improvements, focusing on re-designing infrastructure for greater scalability and automating manual processes.
  • Advanced ETL/ELT: Build, test, and maintain robust data pipeline architecture for optimal extraction, transformation, and loading (ETL) from a wide variety of data sources, including core banking systems and third-party APIs.

Data Quality & Governance (FinTech Focus)

  • Security Compliance: Ensure strict compliance with data governance, security policies, and financial regulations (e.g., GDPR, PCI-DSS).
  • Reliability: Proactively implement methods to improve data reliability and quality, ensuring financial reporting and customer balances are 100% accurate.
  • Root Cause Analysis: Perform deep root cause analysis on internal and external data processes to answer specific business questions and resolve data anomalies immediately.

Leadership & Collaboration

  • Mentorship: Act as a technical mentor to junior data engineers, fostering a culture of technical excellence and “low ego” collaboration.
  • Stakeholder Management: Work with cross-functional teams (Product, Risk, Finance) to translate complex financial requirements into technical data solutions.
  • Strategic Insight: Build analytics tools that provide actionable insights into customer acquisition, operational efficiency, and key business performance metrics.

Knowledge, Skills, and Attributes

  • Values: Live Turaco’s values – care and protect, do the right thing, have fun, and low ego.
  • Experience: 5+ years of experience in Data Engineering, ideally within Financial Services or FinTech.
  • Education: Degree in Computer Science, Statistics, IT, or a similar field.
  • Programming: Advanced proficiency in Python, Java, or Scala.
  • Database Mastery: Expert-level SQL skills and hands-on experience with database design and data modeling. Experience with modern data warehouses (Snowflake, BigQuery, or Redshift).
  • Big Data Tech: Working knowledge of message queuing (Kafka, RabbitMQ) and stream processing.
  • Orchestration: Experience with workflow management tools (Airflow, DBT, Luigi).