What You'll Do
- Provide technical direction in designing and implementing data solutions on Snowflake’s cloud architecture, ensuring alignment with enterprise standards and business goals
- Lead the development and optimization of data pipelines, leveraging features such as Materialized Views, Data Sharing, Cloning, and Dynamic Data Masking to enhance performance and security
- Collaborate with architects, analysts, and developers to clarify requirements and deliver robust data systems using agile methodologies
- Guide teams through the full SDLC, from design to deployment, with a focus on quality, scalability, and maintainability
- Integrate pipelines with source control and automate workflows using CI/CD and DevOps tools to streamline delivery
- Evaluate emerging technologies and alternative approaches to recommend efficient, cost-effective solutions
- Identify and communicate technical risks and trade-offs during project execution
- Support innovation through rapid experimentation and iterative development, enabling faster value delivery to stakeholders
Requirements
- Must be authorized to work in the US without employer sponsorship; STEM OPT I-983 endorsement not supported
- Minimum of 5 years of hands-on experience with Snowflake data warehousing, ETL/ELT automation, and data pipeline development
- Proven track record in cloud data modernization, including data ingestion, curation, and domain modeling
- Experience processing structured, semi-structured (JSON, XML, Parquet), and unstructured data in Data Lakes
- Strong proficiency in SnowSQL, stored procedures, JavaScript UDFs, Snowpipe, and related Snowflake utilities
- Familiarity with change data capture (CDC) and cloud-native ELT automation
- Working knowledge of Data Governance, including cataloging, quality monitoring, and lineage tracking
- Solid foundation in data modeling, database design, and data profiling techniques
- Experience with DevOps practices and tools such as GitHub, Jenkins, Nexus, and uDeploy
- Background in financial services or insurance data environments preferred
Preferred Qualifications
- SnowPro Core Certification
- AI-related certifications or experience
- Hands-on work with AI data pipelines, including data extraction, chunking, embedding, and grounding strategies
- Experience using Tableau for data visualization
- Familiarity with Informatica Data Management Cloud
- Industry experience in insurance or risk management domains
- Exposure to artificial intelligence applications in data engineering
Technical Stack
Primary technologies include Snowflake, SQL Server, and AWS. Development leverages SnowSQL, JavaScript-based UDFs, and Snowflake-native features. The environment integrates CI/CD pipelines using GitHub, Jenkins, Nexus, and uDeploy. Data workflows involve CDC, Data Lakes, and governance tooling. Tableau and Informatica may be used for visualization and data management, respectively.
Benefits
- Annual and short-term performance bonuses
- Long-term incentive plans
- Recognition programs for on-the-spot achievements
- Additional perks and benefits as detailed on the company website