Data Engineer
  • Posted On: 12/03/2026

Data Engineer

  • Makati | Work from Home
  • Data Engineer
  • Full-Time

Position Details:

Reports To: Head of Engineering – Data

About the Role

The Data Engineer is responsible for architecting and maintaining high-performance data
pipelines and modeling frameworks. This role focuses on implementing best-in-class tools—
specifically SQLMesh and Airflow—to ensure the delivery of high-quality, reliable data to the
business. You will lead the migration from legacy dbt models to a modernized SQLMesh
environment, ensuring data integrity and performance across our AWS (Redshift)
ecosystem.

Key Responsibilities

  • Migrate & Modernize Data Models: Lead the transition from dbt to SQLMesh,
    reviewing and refactoring legacy code to improve performance and maintainability.
  • Orchestrate Complex Workflows: Develop and maintain Airflow DAGs for varied
    tasks, including SFTP integrations, SQLMesh scheduling, and cross-platform data
    synchronization.
  • Manage Data Ingestion: Maintain and optimize AWS DMS and Zero ETL integrations
    to ensure seamless data flow from transactional databases to Redshift.
  • Support Analytics Excellence: Collaborate closely with Data Analysts to optimize data structures for QuickSight, ensuring high performance and ease of use for end-users.
  • Ensure Data Quality: Implement rigorous testing and validation within the modeling
    layer to provide a “single source of truth” for the organization.
  • Stakeholder Collaboration: Work with Business Analysts and Engineering leads to
    translate business requirements into technical data solutions.
  • Troubleshoot & Optimize: Proactively monitor pipeline health and optimize Redshift
    performance (distribution/sort keys, query tuning).

Requirements (Essential):

  • Minimum of 6 years of experience in Data Engineering, with a focus on ELT/ETL
    patterns.
  • Advanced proficiency in SQL and a deep understanding of Data Modeling (Star
    Schema, Kimball).
  • Cloud Native Expertise: Proven experience with AWS services (S3, Redshift, ECS).
  • Modern Modeling Frameworks: Hands-on experience with dbt is required; experience with SQLMesh (or a strong desire to master it) is essential.
  • Orchestration Experience: Strong ability to build and scale workflows using Apache Airflow.
  • Collaboration: Excellent communication skills to bridge the gap between technical
    data infrastructure and business-facing analytics.
  • Self-Driven: Ability to manage a sprint-based workload independently while maintaining strong attention to detail.

Key Technical Skills Required

  • Languages/Frameworks: SQL (Advanced), Python, SQLMesh, dbt.
  • Orchestration: Apache Airflow.
  • AWS Stack: Redshift, S3, DMS, Zero ETL, ECS.
  • Tools: GitHub (Version Control/Code Reviews), JIRA, Confluence.
  • BI Support: QuickSight (or similar tools like Tableau/Looker).

Work Details

  • Shift: Monday to Friday, 6:00 AM to 3:00 PM or 7:00 AM to 4:00 PM PH time
    (aligned with AU hours), depending on business needs
  • Location: Makati | Work from Home until further notice
  • Status: Full-time employment
