About the Role
The position involves building and optimizing data pipelines, ensuring data accuracy and accessibility, and working closely with analytics teams to deliver reliable, well-modeled data for reporting and decision-making.
Responsibilities
- Design and implement data models to support business intelligence needs
- Develop and maintain ETL workflows for reliable data ingestion
- Ensure data quality and consistency across systems
- Collaborate with analysts to understand reporting requirements
- Optimize data storage and query performance
- Support data governance and compliance standards
- Troubleshoot and resolve data-related issues
- Document data architectures and processes
- Integrate data from multiple sources into centralized repositories
- Work with cloud-based data platforms and services
- Monitor data pipeline health and performance
- Implement automated testing for data workflows
- Contribute to data warehouse design and evolution
- Assist in defining data standards and best practices
- Enable self-service analytics capabilities
Nice to Have
- Master's degree in a technical or quantitative field
- Experience with real-time data processing systems
- Familiarity with data observability tools
- Knowledge of machine learning pipelines
- Background in regulated industries with strict data controls
Compensation
Competitive salary based on experience
Work Arrangement
Remote position with flexible hours
Team
Collaborative team focused on data infrastructure and analytics
Technology Stack
- Uses modern cloud data platforms including Snowflake and BigQuery
- Leverages dbt for data transformation and modeling
- Employs Git for version control and collaboration
- Integrates with BI tools for analytics delivery
Work Environment
- Fully remote team with asynchronous communication
- Emphasis on work-life balance and personal growth
- Collaborative culture with regular knowledge sharing