Lead the evolution of a next-generation data acquisition platform by designing and implementing high-throughput, secure, and observable ingestion systems. This role is central to reliable data flow across the enterprise: the pipelines you build will power analytics, AI initiatives, and product capabilities at scale.
What You’ll Do
- Design and develop core components of a unified ingestion system that supports both streaming and batch data from internal and external sources.
- Architect modular, reusable frameworks for data routing, transformation orchestration, schema management, and error handling.
- Ensure end-to-end data traceability and compliance by integrating with governance, access control, and lineage systems.
- Collaborate with engineering teams across domains to understand integration requirements and streamline source onboarding through automation.
- Drive technical excellence through code reviews, performance optimization, testing strategies, and operational observability.
- Mentor engineers and establish standards that promote maintainable, production-grade software.
What We’re Looking For
- Proven experience building distributed systems or large-scale data pipelines in Python, Java, or Scala.
- Strong foundation in software engineering principles: modular design, version control, testing, and performance tuning.
- Familiarity with event-driven architectures, schema evolution, and access control patterns.
- Degree in Computer Science, Engineering, or a related technical field.
- Ability to lead technical direction while working collaboratively across infrastructure, data, and product teams.
- A mindset focused on long-term system reliability, security, and operational accountability.
