Responsibilities
- Design, implement, and optimize data pipelines and integration solutions using GCP services.
- Build, optimize, and maintain data warehouses in BigQuery.
- Develop and manage data models and ETL processes.
- Create and maintain dashboards in Looker.
- Collaborate with data scientists, analysts, and stakeholders across teams to understand data requirements and deliver insights.
- Conduct data analysis to support business decisions.
- Ensure data quality, integrity, security, and compliance across systems.
- Monitor pipeline performance; troubleshoot and resolve data-related issues.
- Implement data backup and recovery solutions.
- Document data processes, models, and best practices.
- Contribute to the development of data governance policies.
- Provide technical guidance to junior team members.
- Participate in on-call rotations for data-related incidents.
- Stay current with data engineering trends and technologies.
Nice to Have
- Google Cloud Professional Data Engineer certification.
- Experience with data lake solutions and data migration techniques.
- Knowledge of machine learning and AI technologies.
- Experience with data governance and compliance frameworks; familiarity with data privacy regulations.
- Experience with data pipeline orchestration tools and data integration platforms.
- Knowledge of data warehousing best practices and data security protocols.
- Experience with data quality management tools.
- Experience with data analytics, visualization, and business intelligence reporting.
Compensation
Competitive salary
Work Arrangement
Remote
Team
Collaborative and data-driven team environment.
What You'll Get
- Competitive salary and comprehensive benefits package.
- Opportunities for professional growth and development.
- Flexible, remote-friendly work arrangements.
- Collaborative and supportive team environment.
- Access to the latest tools and technologies.
- Challenging and impactful projects to work on.
- Comprehensive health and wellness benefits.
- Generous time-off policies and work-life balance.
- Opportunities for continuous learning and training.
- Inclusive and diverse workplace culture.
How to Apply
- Submit your resume and cover letter through the application portal.
- Highlight your expertise in GCP, BigQuery, and Looker, along with any certifications or additional qualifications.
- Provide examples of your data engineering projects, including your approach to pipeline design, data warehousing, and ETL processes.
- Describe relevant data analysis or visualization work and other data-driven solutions you have delivered.
- Note your experience with data security and compliance.