JOB DESCRIPTION

Primary Objectives

• Design and maintain scalable data pipelines powering CDP and CRM platforms.
• Ensure data quality, consistency, and availability across enterprise systems.
• Collaborate with cross-functional teams to deliver reliable, business-driven solutions.
• Develop and enhance the data marts (Customer Profile, Journey, Event) that form the CDP/CRM backbone.

Main Responsibilities

• Data Integration & Pipelines: Build and optimize batch and real-time ETL/ELT pipelines across transactional systems, marketing platforms, CRM/CDP, and external APIs, ensuring scalability and cost efficiency (a minimal sketch follows this list).
• Data Modeling & Architecture: Design CDP/CRM data models using medallion architecture, dimensional modeling, data vault, or data mesh; align with governance standards.
• Operations & Optimization: Monitor pipelines, troubleshoot, automate alerts, and drive continuous performance improvements.
• Others:
+ Maintain documentation, conduct code reviews, and promote knowledge sharing.
+ Support ad-hoc data needs and troubleshoot data issues.
+ Research and adopt emerging data technologies to enhance platform capabilities.
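
For illustration only, a minimal sketch of the kind of batch pipeline orchestration this role covers, assuming Airflow as the scheduler; the DAG name, task names, and both callables are hypothetical placeholders, not an existing pipeline:

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator


    def extract_crm_events():
        # Hypothetical step: pull the previous day's CRM events from an external API
        # and land them in raw storage.
        ...


    def load_event_mart():
        # Hypothetical step: upsert the landed files into the Customer Event data mart.
        ...


    with DAG(
        dag_id="crm_events_daily",        # hypothetical pipeline name
        start_date=datetime(2025, 1, 1),
        schedule="@daily",                # Airflow 2.4+; older versions use schedule_interval
        catchup=False,
    ) as dag:
        extract = PythonOperator(task_id="extract", python_callable=extract_crm_events)
        load = PythonOperator(task_id="load", python_callable=load_event_mart)
        extract >> load                   # extract must finish before load starts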

JOB REQUIREMENTS

1. Education Level

• Bachelor’s degree in Information Technology, Business, or a related field, or an equivalent combination of education and experience.

2. Knowledge & Experience

• 3+ years of experience as a Data Engineer with strong ETL/ELT pipeline development.
• Advanced proficiency in SQL and data modeling for large-scale systems.
• Hands-on experience with cloud data platforms (Databricks, BigQuery, Redshift, Snowflake, or Spark-based solutions).
• Familiarity with CDP/CRM ecosystems and integration with marketing platforms (Facebook Ads, Google Analytics, Criteo, RTB House).
• Practical experience with workflow orchestration and transformation tooling (Airflow, dbt, or equivalent).

3. Technical Skills

• Advanced SQL (queries, tuning, optimization); strong Python (data processing, automation, APIs); Scala/Java for Spark is a plus.
• Experience with Databricks, Spark, or similar; cloud data warehouses (Databricks, BigQuery, Synapse, Snowflake); orchestration and transformation tools (Airflow, dbt, etc.).
• Building ETL/ELT pipelines for structured/unstructured data; real-time streaming (Kafka, Pub/Sub, EventHub); CRM/CDP and API integrations.
• Knowledge of data lineage, cataloging, and metadata management; hands-on experience with data quality frameworks such as Great Expectations or Soda (an illustrative sketch follows this list).
• CI/CD for data engineering (Git, GitHub Actions, Azure DevOps); exposure to Docker/Kubernetes.
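
For illustration only, a minimal hand-rolled sketch of the kind of checks that frameworks like Great Expectations or Soda codify (plain pandas, not either library's API; the table and column names are hypothetical):

    import pandas as pd

    def check_customer_events(df: pd.DataFrame) -> list[str]:
        """Return human-readable failures for a hypothetical Customer Event table."""
        failures = []
        if df["customer_id"].isna().any():        # completeness: the key must be present
            failures.append("customer_id contains nulls")
        if df["event_id"].duplicated().any():     # uniqueness: one row per event
            failures.append("event_id contains duplicates")
        if not df["event_type"].isin(["page_view", "purchase", "email_open"]).all():  # validity
            failures.append("event_type contains unexpected values")
        return failures

    # Example usage with a tiny in-memory frame
    events = pd.DataFrame({
        "event_id": [1, 2, 2],
        "customer_id": ["c1", None, "c3"],
        "event_type": ["page_view", "purchase", "unknown"],
    })
    print(check_customer_events(events))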

4. Soft Skills

• Effective communicator, able to bridge technical and business stakeholders.
• Analytical mindset with strong troubleshooting skills; proactive in identifying risks, gaps, and proposing solutions.
• Curious and open to new technologies, eager to adopt best practices and experiment with new tools for efficiency.

Salary: Negotiable
Department: Data
Application deadline: 01/10 to 30/11/2025
