MLabs is hiring a Senior Data Engineer for a foundational 0→1 role, architecting and building our entire data platform from the ground up. You will be the first dedicated data engineer at our fast-growing fintech, designing the systems that support a global financial network operating at the intersection of payments, cards, and blockchain infrastructure.
What You'll Do
- Design, build, and maintain core data pipelines for data ingestion from payments processors, card issuers, blockchain nodes, internal services, and third-party APIs.
- Own orchestration and workflow management using tools like Airflow or Dagster to ensure reliable, observable, and scalable data processing.
- Architect and manage the data warehouse (Snowflake, BigQuery, or Redshift), driving performance, cost optimization, and access patterns.
- Develop high-quality ELT/ETL transformations to structure raw logs, transactions, ledgers, and on-chain events into clean, production-grade datasets.
- Implement data quality frameworks and observability including tests, data contracts, freshness checks, and lineage.
- Partner closely with backend engineers to instrument new events, define data contracts, and improve telemetry across the infrastructure.
- Support Analytics and cross-functional teams by delivering well-modeled, well-documented tables for dashboards, ROI analyses, and key business metrics.
- Own data reliability at scale, leading root-cause investigations, reducing pipeline failures, and building monitoring and alerting systems.
- Evaluate and integrate new tools across ingestion, enrichment, observability, and developer experience.
- Help set the long-term technical direction for the data platform as we scale across new products, regions, and chains.
What We're Looking For
- A data infrastructure builder who thrives in early-stage environments, owning pipelines and platforms end-to-end.
- Expert data engineer with strong Python and SQL fundamentals and real experience building production-grade ETL/ELT.
- Hands-on experience with workflow and orchestration tools like Airflow, Dagster, or Prefect.
- Comfortable designing schemas, optimizing performance, and operating modern cloud warehouses like Snowflake, BigQuery, or Redshift.
- Quality-obsessed with a deep care for data integrity, testing, lineage, and observability.
- A systems thinker who sees data as a platform and designs for reliability, scale, and future users.
- A collaborator who works well with backend engineers, analytics engineers, and cross-functional stakeholders.
- 5–7+ years in data engineering roles, ideally within fintech, payments, B2B SaaS, or infrastructure-heavy startups.
Nice to Have
- Experience ingesting and processing payments data, transaction logs, or ledger systems.
- Exposure to smart contracts, blockchain data structures, or on-chain event ingestion.
- Experience building data tooling for compliance, risk, or regulated environments.
- Familiarity with dbt and/or semantic modeling to support analytics layers.
- Prior experience standing up data platforms from 0→1 at early-stage companies.
Technical Stack
- Python, SQL
- Airflow, Dagster, Prefect
- Snowflake, BigQuery, Redshift
- dbt
Team & Environment
You will work directly with the CTO and partner closely with Product, Engineering, Operations, Compliance, and Analytics teams.
Benefits & Compensation
- $140K–$240K base salary + meaningful equity
- Unlimited PTO, with a genuine minimum encouraged
- Flexible remote-first working
- Home office stipend
- Comprehensive health, dental & vision (US)
- 401(k) with company match
- Wellness budget for gym, fitness, recovery, etc.
- Regular team offsites (US + international)
Work Mode
This role operates on a hybrid work model.
At MLabs, we are committed to offering equal opportunities to all candidates. We ensure non-discriminatory hiring, accessible job adverts, and information provided in accessible formats. Our goal is to foster a diverse, inclusive workplace with equal opportunities for all.