Senior Data Engineer (Snowflake / dbt)
  • Posted On: 19/12/2025

  • Makati | Work from home
  • Full-Time

We’re looking for a Senior Data Engineer to build and maintain our modern data infrastructure. You’ll work hands-on with Snowflake, dbt, and cloud-based data pipelines, taking ownership of data orchestration, automation, and dashboard development using Plotly Dash.

Key Responsibilities

  • Design, build, and optimise data pipelines in Snowflake, including CDC data loading from cloud storage (GCS)
  • Develop and maintain dbt models, tests, and documentation
  • Build automated deployment and orchestration systems using dbt with CI/CD integration, logging, and monitoring
  • Create and maintain Snowflake stored procedures for data orchestration
  • Build interactive dashboards and data applications using Plotly Dash
  • Troubleshoot and optimise performance for large-scale data operations (COPY INTO, incremental loading, query performance)
  • Implement proper error handling, alerting (email/Slack), and observability across pipelines
  • Leverage AI coding tools (Cursor, GitHub Copilot, Claude, etc.) to accelerate development and improve code quality
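To give a flavour of the incremental-loading work described above: CDC pipelines ultimately reduce to a "latest record wins" merge per key, which Snowflake's MERGE and dbt's incremental materialisations handle in production. A minimal Python sketch of that idea (all names and fields here are illustrative, not part of the role's actual codebase):

```python
def merge_incremental(target, changes, key="id", cursor="updated_at"):
    """Merge CDC change records into target rows, keeping the latest
    version of each key. Both inputs are lists of dicts; `cursor` is
    an ISO-8601 timestamp string, so string comparison orders correctly."""
    merged = {row[key]: row for row in target}
    for change in changes:
        current = merged.get(change[key])
        # Upsert only when the change is at least as new as what we hold,
        # which keeps replays of the same batch idempotent.
        if current is None or change[cursor] >= current[cursor]:
            merged[change[key]] = change
    return sorted(merged.values(), key=lambda r: r[key])
```

The idempotency check is what makes it safe to re-run a failed load, the same property dbt incremental models and well-written COPY INTO pipelines rely on.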

Requirements (Essential):

  • 5+ years of experience in data engineering
  • Strong proficiency with Snowflake (warehouses, stages, stored procedures, roles/permissions, VARIANT/OBJECT types)
  • Solid experience with dbt (models, macros, incremental strategies, CI/CD deployment, orchestration)
  • Experience building dashboards with Plotly Dash or similar Python-based frameworks
  • Experience with cloud storage integrations (GCS, S3, or Azure Blob)
  • Proficiency in SQL and Python
  • Familiarity with Git-based workflows and version control
  • Comfortable using AI coding assistants to boost productivity
  • Strong troubleshooting skills and attention to detail

Nice to Have:

  • Experience with workflow automation tools (n8n, Airflow, or similar)
  • Understanding of data warehouse design patterns and dimensional modelling
  • Experience with API integrations and working with VARIANT/JSON data
