Responsibilities
- Lead end-to-end development and deployment of robust, scalable, and maintainable ETL/ELT data pipelines on GCP
- Work extensively with dbt to model and transform data
- Design data architecture and strategy for various business use cases (especially marketing data)
- Collaborate with product managers, analysts, and data scientists to understand data requirements
- Manage and mentor a team of data engineers, ensuring adherence to engineering best practices
- Ensure systems are cost-efficient, well-monitored, and follow data quality and security standards
- Identify and troubleshoot issues in data pipelines; improve reliability and performance
- Drive code reviews, testing strategies, and documentation efforts
- Automate workflows using orchestration tools such as Airflow or Cloud Composer
Requirements
- 6+ years of experience in data engineering or a related role
- 3+ years of experience working with GCP services (BigQuery, GCS, Pub/Sub, Dataflow, etc.)
- 2+ years of hands-on experience with dbt
- Experience managing teams
- Experience building marketing data pipelines (mandatory)
- Proficient in orchestration tools such as Airflow or Cloud Composer
- Strong understanding of cloud data warehouses such as BigQuery, Snowflake, or Redshift
- Experience with requirements gathering, stakeholder management, and designing data architecture
- Familiarity with version control, CI/CD, testing frameworks for data pipelines
Nice to Have
- GCP certification (Associate or Professional level)
- Strong communication and team collaboration skills
Additional Information
- Notice Period: Immediate