About the Role
The role involves building and optimizing data pipelines on modern cloud infrastructure, ensuring data accuracy, and supporting analytics, with a focus on performance and scalability.
Responsibilities
- Design and implement data models in BigQuery for efficient querying
- Develop and maintain ETL pipelines using cloud-native tools
- Ensure data quality and consistency across systems
- Collaborate with analysts to deliver reporting solutions
- Optimize query performance and control costs in BigQuery
- Support Looker development for business intelligence dashboards
- Maintain documentation for data assets and workflows
- Monitor data pipeline health and resolve issues proactively
- Implement data governance and access controls
- Work with stakeholders to understand reporting requirements
- Integrate data from multiple sources into centralized repositories
- Apply best practices in data warehouse architecture
- Use Python and other scripting languages to automate data processes
- Troubleshoot data discrepancies and resolve their root causes
- Ensure compliance with data privacy standards
Nice to Have
- Experience with Terraform or infrastructure-as-code tools
- Familiarity with data lineage and observability tools
- Knowledge of real-time data processing systems
- Looker certification or equivalent experience
- Experience in regulated industries with strict data controls
Compensation
Competitive salary and benefits package
Work Arrangement
Remote with flexible hours
Team
Collaborative data team focused on scalable analytics solutions
Tech Stack
- Google Cloud Platform (GCP)
- BigQuery
- Looker
- Git
- Python
- Terraform
Project Focus
- Building scalable data pipelines
- Improving data accessibility for analytics teams
- Reducing query latency in reporting systems
- Enabling self-service analytics through Looker