Saskatoon, Saskatchewan, Canada (Hybrid)

BHP is hiring a Principal Data Engineer and Architect for its Data Utility team

About the Role

At BHP, we are looking for a Principal Data Engineer and Architect. This role establishes the data engineering foundations for our Potash asset, blending deep technical expertise with strategic architectural leadership. You will design scalable systems and ensure high-quality data solutions that support reliable, enterprise-wide decision-making.

What You'll Do

  • Partner with regional Data Utility teams globally to remove technical blockers, uplift engineering practices, and drive consistency in design patterns, frameworks, and reusable components.
  • Build and foster a data community, promoting knowledge sharing, collaboration, and alignment across teams.
  • Work closely with internal customers to understand their data requirements, model data structures, and design and implement scalable ingestion pipelines from operational and enterprise systems.
  • Lead the design and development of integration solutions and ETL pipelines, ensuring high-quality documentation and approval of engineering patterns.
  • Collaborate with on-premises and cloud platform teams to identify capability gaps and evaluate emerging tools and technologies.
  • Work with the Enterprise & Global Data Utility team to enhance and evolve the data platform to meet customer needs.
  • Design and shape the data platform and data ecosystem for Potash, and influence global engineering standards.

What We're Looking For

  • A Master’s degree in Computer Science, MIS, Engineering or a related field.
  • At least 10 years’ experience in Data Engineering or Architecture.
  • Experience working across distributed processing, traditional RDBMS, MPP and NoSQL database technologies.
  • Strong background with ETL and data warehousing tools such as Informatica, Talend, Pentaho or DataStage.
  • Hands-on experience with Hadoop, Spark, Storm, Impala and related platforms.
  • Strong understanding of RDBMS concepts, ETL principles and end-to-end data pipeline development.
  • Solid knowledge of data modelling techniques (ERDs, star schema, snowflake schema).
  • Experience with AWS services including S3, EC2, EMR, RDS, Redshift and Kinesis.
  • Exposure to distributed processing (Spark, Hadoop, EMR), RDBMS (SQL Server, Oracle, MySQL, PostgreSQL), MPP (Redshift, Teradata) and NoSQL technologies (MongoDB, DynamoDB, Cassandra, Neo4J, Titan).
  • Experience designing and building streaming pipelines using tools such as Kafka, Kafka Streams or Spark Streaming (see the sketch after this list).
  • Strong proficiency in Python and at least two of: Scala, SQL or Java.
  • Experience deploying production applications, including testing, packaging, monitoring and release management.
  • Proficiency with Git-based source control and CI/CD pipelines, ideally GitLab.
  • Strong engineering discipline including code reviews, testing frameworks and maintainable coding practices.
  • Experience working within DevOps, Agile, Scrum or Continuous Delivery environments.
  • Ability to mentor team members and support capability development across teams.
  • Strong communication, listening and influencing skills.
  • High levels of motivation, adaptability and problem-solving capability.
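
To ground the streaming requirement above, here is a minimal sketch of an ingestion pipeline using Spark Structured Streaming reading from Kafka, in line with the tools listed. The broker address, topic name, message schema, and S3 paths are illustrative placeholders, not details from this posting.

# Minimal sketch: Kafka -> Spark Structured Streaming -> Parquet on S3.
# All names below (app, broker, topic, bucket, schema) are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import (StructType, StructField, StringType,
                               DoubleType, TimestampType)

spark = (SparkSession.builder
         .appName("sensor-ingest")   # hypothetical application name
         .getOrCreate())

# Assumed message schema, for illustration only.
schema = StructType([
    StructField("sensor_id", StringType()),
    StructField("reading", DoubleType()),
    StructField("event_time", TimestampType()),
])

raw = (spark.readStream
       .format("kafka")
       .option("kafka.bootstrap.servers", "broker:9092")  # placeholder broker
       .option("subscribe", "plant-telemetry")            # placeholder topic
       .load())

# Kafka delivers raw bytes; parse the JSON payload into typed columns.
events = (raw.selectExpr("CAST(value AS STRING) AS json")
          .select(from_json(col("json"), schema).alias("e"))
          .select("e.*"))

# Land the stream as Parquet with checkpointing for fault tolerance.
query = (events.writeStream
         .format("parquet")
         .option("path", "s3a://example-bucket/telemetry/")          # placeholder
         .option("checkpointLocation", "s3a://example-bucket/ckpt/")  # placeholder
         .start())
query.awaitTermination()
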

Nice to Have

  • Experience with structured, semi-structured and unstructured data.
  • Understanding of data governance, lineage and data quality approaches.
  • Experience with Infrastructure-as-Code tools such as Terraform.
  • Exposure to workflow orchestration tools like Azkaban, Luigi or Airflow (a minimal Airflow sketch follows this list).
  • Experience enabling data consumption through APIs, event streams or data marts.
  • Experience with MuleSoft, Solace or StreamSets.
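
As referenced in the orchestration bullet above, a minimal sketch of a daily Airflow DAG, assuming Airflow 2.x; the DAG id, task names, and callables are hypothetical placeholders.

# Minimal sketch: a two-task daily DAG in Airflow 2.x.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_and_load(**context):
    # Placeholder body: pull from a source system and land raw files.
    print("extracting batch for", context["ds"])


def publish_to_warehouse(**context):
    # Placeholder body: load the landed files into the warehouse.
    print("publishing batch for", context["ds"])


with DAG(
    dag_id="daily_ingest",              # hypothetical DAG id
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_and_load",
                             python_callable=extract_and_load)
    publish = PythonOperator(task_id="publish_to_warehouse",
                             python_callable=publish_to_warehouse)
    extract >> publish                  # publish runs only after extract succeeds
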

Technical Stack

  • ETL & Warehousing: Informatica, Talend, Pentaho, DataStage
  • Big Data Platforms: Hadoop, Spark, Storm, Impala
  • AWS Services: S3, EC2, EMR, RDS, Redshift, Kinesis
  • Databases: SQL Server, Oracle, MySQL, PostgreSQL, Redshift, Teradata, MongoDB, DynamoDB, Cassandra, Neo4J, Titan
  • Streaming: Kafka, Kafka Streams, Spark Streaming
  • Languages: Python, Scala, SQL, Java
  • DevOps Tools: Git, GitLab CI/CD, Terraform, Azkaban, Luigi, Airflow
  • Integration Tools: MuleSoft, Solace, StreamSets
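
As one concrete illustration of how the AWS pieces of this stack fit together, a sketch of loading S3-landed Parquet files into Redshift with a COPY command issued through psycopg2; the cluster endpoint, credentials, bucket, table, and IAM role are all hypothetical placeholders.

# Minimal sketch: S3 -> Redshift via COPY, issued over a psycopg2 connection.
import psycopg2

# Redshift reads the S3 files in parallel; table, bucket, and role are placeholders.
copy_sql = """
    COPY analytics.telemetry
    FROM 's3://example-bucket/telemetry/'
    IAM_ROLE 'arn:aws:iam::123456789012:role/example-redshift-load'
    FORMAT AS PARQUET;
"""

conn = psycopg2.connect(
    host="example-cluster.abc123.us-west-2.redshift.amazonaws.com",  # placeholder
    port=5439,
    dbname="analytics",
    user="loader",
    password="***",
)
try:
    # `with conn` commits on success and rolls back on error.
    with conn, conn.cursor() as cur:
        cur.execute(copy_sql)
finally:
    conn.close()
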

Team & Environment

You will partner with regional Data Utility teams globally and the Enterprise & Global Data Utility team.

Work Mode

This role offers a hybrid work model based out of the downtown Saskatoon office.

At BHP, we know that we are strengthened by diversity. We are an Equal Opportunity employer that is committed to making BHP a safe and inclusive workplace where everyone can thrive and be at their best every day.

Required Skills
Informatica, Talend, Pentaho, DataStage, Hadoop, Spark, Storm, Impala, AWS S3, AWS EC2, ETL, Data Warehousing, Distributed Processing, RDBMS, MPP, NoSQL
About company
BHP

BHP is a natural resources company with a presence in more than 90 locations worldwide, dedicated to creating a more sustainable future. In Chile, its corporate office in Santiago is the strategic heart of its operations in the country, leading projects that drive responsible mining and innovation.

Job Details
Department: Data and Analytics
Category: Data
Posted: 14 days ago