Responsibilities
- Design and manage ELT workflows that load data from source systems such as Procore, Salesforce, ERP platforms, and internal databases into Snowflake, using Fivetran and custom-built connectors
- Create, validate, and document dbt transformations that convert raw inputs into trusted datasets used by Finance, Production, Supply Chain, and Engineering teams
- Build staging layers and dimensional models to team standards, supporting performant Tableau dashboards and flexible ad hoc exploration
- Implement and maintain dbt tests, monitor data freshness, and lead investigations into data quality issues through to root-cause resolution
- Onboard new data sources via APIs and connectors, troubleshooting ingestion issues to keep data flowing reliably into the warehouse
- Produce and update documentation for data models, pipeline setups, and business rules so others can understand and build on implemented solutions
- Collaborate with business teams to define data requirements and deliver datasets that address key operational questions
- Monitor query and pipeline performance, identifying opportunities to reduce Snowflake compute costs and speed up model runs
- Engage in code reviews, adhere to Git and CI/CD protocols, and help refine team development practices
- Keep up with advancements in data tooling and share insights to enhance team capabilities
- Leverage AI-powered coding tools and agent-based systems to speed up development, testing, and documentation tasks, integrating them into daily workflows
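For context, the dbt validations described above are typically declared in a model's YAML properties file. The sketch below is a hypothetical example (the model name `stg_procore__projects`, its columns, and the accepted status values are illustrative assumptions, not part of this role's actual codebase); it uses dbt's built-in generic tests:

```yaml
# Hypothetical example of dbt generic tests on a staged Procore model.
# Model and column names are illustrative only.
version: 2

models:
  - name: stg_procore__projects
    description: "Staged Procore project records"
    columns:
      - name: project_id
        tests:
          - unique
          - not_null
      - name: status
        tests:
          - accepted_values:
              values: ['active', 'inactive', 'complete']
```

Running `dbt test` would then flag duplicate or null project IDs and unexpected status values before they reach downstream Finance or Supply Chain datasets.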