At DataRobot, we are looking for a Senior Data Engineer, Product Analytics to be a technical driver for our end-to-end data strategy. You will build scalable Data Warehouse and Lakehouse solutions on Snowflake, champion the ELT paradigm, and ensure robust data governance and cost optimization.
What You'll Do
- Architect and deliver scalable, reliable data warehouses, analytics platforms, and integration solutions, playing a critical role in supporting our internal AI strategy.
- Partner with a Product Manager, Analytics to shape our project roadmap and lead its implementation.
- Collaborate with and mentor cross-functional teams to design and execute sophisticated data software solutions that elevate business performance and align with coding standards and architecture.
- Develop, deploy, and support analytic data products, such as data marts, ETL jobs, and functions in a cloud data warehouse environment using Snowflake, Stitch/Fivetran/Airflow, and AWS services.
- Navigate various data sources and efficiently locate data in a complex data ecosystem.
- Work closely with data analysts and data scientists to build models and metrics to support their analytics needs.
- Instrument telemetry capture and data pipelines for various environments to provide product usage visibility.
- Maintain and support deployed ETL pipelines and ensure data quality.
- Develop monitoring and alerting systems to provide visibility into the health of data infrastructure, cloud applications, and data pipelines.
- Partner with IT enterprise applications and engineering teams on integration efforts between systems that impact Data & Analytics.
- Work with R&D to answer complex technical questions about product analytics and corresponding data structure.
What We're Looking For
- 5-7 years of experience in a data engineering or data analyst role.
- Experience building and maintaining product analytics pipelines, including the implementation of event tracking and the integration of behavioral data into Snowflake.
- Strong understanding of data warehousing concepts, plus working experience with relational databases and SQL.
- Experience working with cloud providers like AWS, Azure, or GCP.
- Solid programming foundations and proficiency in data-focused languages such as Python, Scala, or R.
- Experience with DevOps workflows and tools like dbt, GitHub, and Airflow.
- Experience with an infrastructure-as-code tool such as Terraform or CloudFormation.
- Excellent communication skills, with the ability to engage both technical and non-technical audiences.
- Knowledge of real-time streaming technologies like Amazon Kinesis Data Firehose and Spark.
- A highly collaborative approach to working with teammates and stakeholders.
Nice to Have
- An AWS cloud certification.
- A BA/BS in a technical or engineering field.
Technical Stack
- Snowflake, Stitch, Fivetran, Airflow
- AWS (EC2, Lambda, Kinesis)
- Python, SQL, dbt
- Terraform, CloudFormation
- Amazon Kinesis Data Firehose, Spark
Benefits & Compensation
- Medical, Dental & Vision Insurance
- Flexible Time Off Program
- Paid Holidays
- Paid Parental Leave
- Global Employee Assistance Program (EAP)
DataRobot is proud to be an Equal Employment Opportunity and Affirmative Action employer. We do not discriminate based upon race, religion, color, national origin, gender, sexual orientation, gender identity, gender expression, age, status as a protected veteran, status as an individual with a disability, or other applicable legally protected characteristics.