Sydney, New South Wales, Australia Hybrid

Sleek is hiring a Principal Data Engineer

About the Role

Sleek is looking for a Principal Data Engineer to bridge business insights and technical execution, shaping our data roadmap and empowering the team to deliver innovative, scalable solutions. You will drive alignment of data architecture with strategic goals and tackle complex data challenges.

What You'll Do

  • Define the overall greenfield data architecture (batch + streaming) on GCP, centred on BigQuery.
  • Establish best practices for ingestion, transformation, data quality, and governance.
  • Lead the design and implementation of ETL/ELT pipelines.
  • Ensure data quality and reliability with dbt tests, Great Expectations/Soda, and monitoring.
  • Implement Dataplex & Data Catalog for metadata, lineage, and discoverability.
  • Define IAM policies, row/column-level security, DLP strategies, and compliance controls.
  • Define and enforce SLAs, SLOs, and SLIs for pipelines and data products.
  • Implement observability tooling and build alerting and incident response playbooks.
  • Ensure pipeline resilience (idempotency, retries, backfills, incremental loads) and establish disaster recovery strategies.
  • Partner with BI/analytics teams to deliver governed self-service through tools like Looker.
  • Support squad-level data product ownership with clear contracts and SLAs.
  • Mentor a small data engineering team; set coding, CI/CD, and operational standards.
  • Collaborate with squads, product managers, and leadership to deliver trusted data.
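The resilience expectations above (idempotency, retries, backfills, incremental loads) can be illustrated with a minimal sketch. This is not Sleek's actual pipeline code; the record shapes and merge key are hypothetical, and the in-memory dict stands in for a warehouse table.

```python
# Minimal sketch of an idempotent incremental load: records are upserted by
# primary key, so re-running the same batch (e.g. after a retry or a backfill)
# leaves the target in the same state. Field names are illustrative only.

def upsert_batch(target: dict, batch: list[dict], key: str = "id") -> dict:
    """Merge a batch of records into `target`, keyed by `key` (upsert)."""
    for record in batch:
        target[record[key]] = record  # insert new rows, overwrite changed ones
    return target

batch = [
    {"id": 1, "status": "active"},
    {"id": 2, "status": "churned"},
]

warehouse: dict = {}
upsert_batch(warehouse, batch)  # first run loads both rows
upsert_batch(warehouse, batch)  # retrying the same batch is a no-op

assert len(warehouse) == 2
assert warehouse[2]["status"] == "churned"
```

In a real warehouse the same property is usually achieved with a `MERGE` statement on the business key rather than an `INSERT`, which is what makes retries and backfills safe.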

What We're Looking For

  • 10+ years' experience in data engineering, architecture, or platform roles.
  • Strong expertise in GCP data stack: BigQuery, GCS, Dataplex, Data Catalog, Pub/Sub, Dataflow.
  • Hands-on experience building ETL/ELT pipelines with dbt + orchestration (Composer/Airflow/Dagster).
  • Deep knowledge of data modeling, warehousing, partitioning/clustering strategies.
  • Experience with monitoring, reliability engineering, and observability for data systems.
  • Familiarity with data governance, lineage, and security policies (IAM, DLP, encryption).
  • Strong SQL skills and solid knowledge of Python for data engineering.
  • Ownership: Reliability, accountability, and proactive problem-solving.
  • Humility: Open-mindedness to feedback and willingness to learn.
  • Structured Thinking: Sound judgement and pragmatic solution design.
  • Attention to detail: Ability to manage multiple complex workstreams.
  • Excellent listener and clear communicator.
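The monitoring and reliability-engineering requirement above often reduces to checking SLIs against SLOs, freshness being the most common. A hedged sketch, assuming a simple wall-clock freshness SLI (function and variable names are illustrative, not part of the role description):

```python
from datetime import datetime, timedelta, timezone

# Sketch of a table-freshness SLI check: the SLI is the age of the most recent
# load, and the SLO is the maximum acceptable age before alerting.

def freshness_sli(last_loaded: datetime, now: datetime, slo: timedelta) -> bool:
    """True when the table's most recent load is within the freshness SLO."""
    return (now - last_loaded) <= slo

now = datetime(2024, 1, 1, 12, 0, tzinfo=timezone.utc)
slo = timedelta(hours=2)

assert freshness_sli(now - timedelta(minutes=30), now, slo)   # within SLO
assert not freshness_sli(now - timedelta(hours=3), now, slo)  # breach: alert
```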

Nice to Have

  • Experience with Snowflake, Databricks, AWS (Redshift, Glue, Athena), or Azure Synapse.
  • Knowledge of open-source catalogs (DataHub, Amundsen, OpenMetadata).
  • Background in streaming systems (Kafka, Kinesis, Flink, Beam).
  • Exposure to data observability tools (Monte Carlo, Bigeye, Datafold, Databand).
  • Prior work with Looker, Hex, or other BI/analytics tools.
  • Startup or scale-up experience (fast-moving, resource-constrained environments).

Technical Stack

  • GCP: BigQuery, GCS, Dataplex, Data Catalog, Pub/Sub, Dataflow, Cloud Composer (Airflow), Cloud Monitoring, Logging, Error Reporting, Cloud Trace
  • Ingestion: Datastream, Airbyte, Fivetran, Rivery
  • Transformations: dbt, Dagster
  • Data Quality: Great Expectations, Soda
  • Observability: Monte Carlo, Datafold, Databand, Bigeye
  • Analytics: Looker, Looker Studio
  • Languages: SQL, Python
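The data-quality tools in the stack (dbt tests, Great Expectations, Soda) mostly express two declarative checks: not-null and uniqueness on a column. A pure-Python sketch of what those checks compute, with illustrative column and row names:

```python
# Pure-Python sketch of the two most common data-quality checks that dbt
# tests, Great Expectations, and Soda express declaratively.

def check_not_null(rows: list[dict], column: str) -> list[int]:
    """Return indexes of rows where `column` is missing or None."""
    return [i for i, row in enumerate(rows) if row.get(column) is None]

def check_unique(rows: list[dict], column: str) -> list:
    """Return values of `column` that appear more than once."""
    seen, dupes = set(), set()
    for row in rows:
        value = row.get(column)
        if value in seen:
            dupes.add(value)
        seen.add(value)
    return sorted(dupes)

rows = [
    {"order_id": "A1", "amount": 10.0},
    {"order_id": "A2", "amount": None},
    {"order_id": "A1", "amount": 12.5},
]

assert check_not_null(rows, "amount") == [1]     # row 1 has a null amount
assert check_unique(rows, "order_id") == ["A1"]  # duplicate business key
```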

Team & Environment

You will mentor a small data engineering team and work closely with cross-functional teams, squads, product managers, and leadership. This role reports to the CTO.

Benefits & Compensation

  • Competitive market salaries.
  • Eligibility for employee share ownership plan.
  • Generous paid time off and holiday schedules.
  • Internal and external training programmes.
  • Involvement in regional centre of AI excellence.

Work Mode

This is a hybrid role offering work from home 5 days per week and flexible start/end times. You can also work fully remotely from anywhere in the world for 1 month each year. Sleek's locations include Singapore, Hong Kong, Australia, and the UK.

Sleek is a certified B Corp committed to diversity, inclusion, and building an equitable, regenerative economy. We aim to be Carbon Neutral by 2030.

Required Skills
GCP, BigQuery, Dataflow, dbt, Airflow, Python, SQL, Data Quality, Data Observability, Looker, Pub/Sub, Data Catalog, ETL/ELT
About company
Sleek

Sleek makes back-office operations easy for micro SMEs through proprietary software and AI, focusing on customer delight. It operates three segments: Corporate Secretary services (market leader in Singapore), Accounting & Bookkeeping with proprietary ledger and AI tools, and FinTech payments.

Job Details

Category: Data
Posted: 5 months ago