DATA ENGINEER

JOB SUMMARY

  1. To design, build, and maintain robust, scalable, and high-performance ETL/ELT data pipelines for reporting, business intelligence, and machine learning initiatives.
  2. The role is responsible for ensuring the quality, lineage, and governance of all critical data assets.

 

DUTIES & RESPONSIBILITIES

  1. Build and optimize data pipelines using tools like Airflow/Prefect to ingest data from core banking, payment, and third-party sources.
  2. Design and implement dimensional and denormalized data models within the Data Warehouse (e.g., Postgres/Oracle/BigQuery).
  3. Utilize streaming technologies like Kafka and transformation tools like DBT to process data in real-time or near real-time.
  4. Implement data quality checks and maintain data lineage documentation for governance.
  5. Leverage Python and SQL extensively for scripting, data manipulation, and pipeline development.

 

EDUCATION AND EXPERIENCE

  • Education: Bachelor’s degree or Higher National Diploma from any approved university or polytechnic.
  • Relevant database or data engineering certifications are an added advantage.
  • Experience: 4–6+ years of hands-on experience in data engineering.
  • Specific Experience: Prior experience in banking, fintech, or other financial services environments is highly desirable.

 

KNOWLEDGE/ SKILLS/COMPETENCIES

  • Data Warehousing principles, Dimensional Modeling, Data Governance frameworks.
  • Python, Advanced SQL, ETL/ELT orchestration tools (Airflow/Prefect), Kafka/stream processing, Data Quality, Postgres/Oracle/BigQuery, DBT, APIs, Git, CI/CD.
