Iris Software is looking for a Lead Data Engineer to build ETL/ELT pipelines at scale using AWS data services and modern data technologies. You will lead engineering teams, drive architectural decisions, and work on complex, mission-critical applications.
What You'll Do
- Build ETL/ELT pipelines at scale.
- Lead engineering teams and drive architectural decisions.
- Work on complex, mission-critical applications with the latest technologies.
- Balance delivery velocity with engineering excellence.
- Work in Agile/Scrum environments.
What We're Looking For
- 8+ years of experience in Data Engineering.
- 5+ years of hands-on experience with AWS data services.
- Strong proficiency in Python and/or Scala.
- Experience building ETL/ELT pipelines at scale.
- Strong SQL and data modeling expertise (OLAP, dimensional modeling, lakehouse).
- Hands-on experience with Spark / PySpark.
- Hands-on experience with Airflow or similar orchestration tools.
- Hands-on experience with REST APIs and microservices.
- Experience using GitHub Copilot or similar AI-assisted development tools in enterprise environments.
- Solid understanding of IAM, encryption, networking, and cloud security best practices.
- Proven ability to lead engineering teams and drive architectural decisions.
- Strong stakeholder communication skills.
- Ability to balance delivery velocity with engineering excellence.
- Experience working in Agile/Scrum environments.
Nice to Have
- AWS Certifications (e.g., AWS Certified Solutions Architect, AWS Certified Data Analytics).
- Experience with streaming technologies (Kafka, Kinesis).
- Experience with containerization (Docker, Kubernetes).
- Knowledge of Delta Lake, Iceberg, or Hudi.
- Experience implementing data observability solutions.
- Experience implementing cost governance and FinOps best practices on AWS.
- Exposure to AI/ML pipeline integration.
- Experience designing multi-account AWS architectures.
Technical Stack
- Languages: Python, Scala
- Cloud & Services: AWS, Amazon S3, Amazon S3 Glacier, Amazon EBS
- Processing Frameworks: Spark, PySpark
- Orchestration: Airflow
- APIs & Architecture: REST APIs, microservices
- Data & Query: SQL
- Streaming: Kafka, Kinesis
- Containerization: Docker, Kubernetes
- Table Formats: Delta Lake, Iceberg, Hudi
Benefits & Compensation
- Benefits supporting financial, health, and well-being needs.
Iris Software is an equal opportunity employer.