Bell Canada Enterprises is looking for a Principal Software Architect to lead the design and implementation of our next-generation hybrid data mesh architecture. In this role, you will define our data strategy, integrating on-premises Hadoop infrastructure with Google Cloud Platform (GCP) services to ensure scalability, reliability, and security.
What You'll Do
- Lead the design and implementation of a hybrid data mesh architecture, integrating on-premises Hadoop infrastructure with GCP services.
- Define and implement data governance strategies, including data cataloging and metadata management within the data mesh framework.
- Design and implement ELT (extract, load, transform) pipelines using GCP services like Dataflow, Dataproc, and Cloud Storage, integrating them with on-premises Hadoop data sources.
- Promote domain-driven design principles to ensure data ownership and autonomy within the data mesh.
- Design and implement a composable architecture, enabling flexibility and scalability in data processing and access.
- Develop and maintain architectural documentation, including diagrams, specifications, and standards.
- Provide technical leadership and mentorship to junior architects and engineers.
- Define project scope, work packages, and deliverables, including costs and timelines, for medium- to large-scale data initiatives.
- Collaborate with organizational stakeholders to understand business needs and translate them into technical solutions.
- Ensure the security and compliance of data assets in the hybrid environment.
- Stay current with the latest advancements in cloud-native technologies and data architecture best practices.
- Assist in migrating data from on-premises Hadoop clusters to the cloud, leveraging appropriate ETL/ELT technologies while maintaining data integrity.
What We're Looking For
- 8+ years of experience in data architecture, with a particular focus on hybrid cloud environments.
- 5+ years of experience designing and implementing data mesh architectures.
- Extensive experience with Google Cloud Platform (GCP) services, including Dataflow, Dataproc, BigQuery, Cloud Storage, and Data Catalog.
- Proven experience developing and optimizing ELT (extract, load, transform) pipelines.
- Deep understanding of data modeling (logical and physical), including relational database management systems (RDBMS).
- Experience with data cataloging tools and metadata management.
- Strong understanding of domain-driven design principles and their application in data architecture.
- Experience designing and implementing composable architectures.
- Experience with on-premises Hadoop environments (e.g., HDFS, Hive, Spark) and data migration to the cloud.
- Experience with one or more architecture and service delivery frameworks (TOGAF, eTOM, ITIL, COBIT, etc.).
- Excellent communication, collaboration, and presentation skills.
- Bachelor's degree in Computer Science, Information Systems, or a related field; Master's preferred.
Nice to Have
- Experience with other cloud platforms (AWS, Azure).
- Experience with data virtualization technologies.
- Experience with data governance frameworks and privacy regulations (e.g., GDPR, CCPA).
- GCP or data architecture-related certifications.
Technical Stack
- Google Cloud Platform (GCP): Dataflow, Dataproc, BigQuery, Cloud Storage, Data Catalog
- Hadoop: HDFS, Hive, Spark
- Data Integration: ETL/ELT pipelines
Team & Environment
We believe in empowering everyone. That's why we equip our teams with advanced technology and AI tools, and foster a collaborative environment that supports creativity and growth.