Boehringer Ingelheim is hiring a Data Engineer for its IT Enterprise Data, AI & Platforms team, specifically within the Finance & Group Functions Data & Analytics domain. In this role, you will enhance our data infrastructure, optimize data flows, and ensure the availability and quality of critical data assets from Finance, Procurement, HR, Sustainability, and IT. You will be a key collaborator with data scientists and analysts, enabling a consistent and scalable data delivery architecture for analytics and AI use cases.
What You'll Do
- Design, develop, and maintain scalable data pipelines and ETL processes to support data integration and analytics.
- Collaborate with data architects, modelers, and IT team members to help define and evolve the overall cloud-based data architecture strategy.
- Partner with integration engineers, analysts, and other business stakeholders to understand data requirements and deliver solutions.
- Optimize and manage data storage solutions and data integrations, ensuring data quality, integrity, security, and accessibility.
- Implement data quality and validation processes to ensure data accuracy and reliability.
- Develop and maintain documentation for data processes, architecture, and workflows.
- Monitor and troubleshoot data pipeline performance and resolve issues promptly.
- Meet regularly with clients and stakeholders to understand and analyze their processes and needs, proactively presenting possible solutions.
- Stay updated with the latest industry trends and technologies to continuously improve data engineering practices.
What We're Looking For
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- Proven experience as a Data Engineer, Software Developer, or similar role.
- Proven proficiency with the Apache ecosystem (Parquet, Iceberg, Spark, Kafka, Airflow).
- Strong hands-on experience with AWS data services (Kinesis, Glue, AppFlow, Lambda, S3).
- Solid experience with relational (SQL) and NoSQL databases, preferably Snowflake and/or Databricks, and with dbt for building and modeling data pipelines.
- Advanced skills in programming languages such as Python or Scala.
- Strong analytical capabilities for working with unstructured datasets.
- Familiarity with data pipeline and workflow orchestration tools.
- Knowledge of data visualization tools (e.g., Tableau, Power BI, QuickSight).
- Excellent project management and organizational skills.
- Strong written and verbal communication abilities.
- Consulting and analytical mindset: ability to engage with clients and stakeholders, understand needs, define requirements, and proactively propose solutions.
- High level of proactiveness and problem-solving attitude.
- Fluent in English.
Nice to Have
- SnapLogic experience is a plus.
- AWS certifications (e.g., Cloud Practitioner, Solutions Architect, Big Data, or Data Analytics).
Technical Stack
- Apache Ecosystem: Parquet, Iceberg, Spark, Kafka, Airflow
- AWS: Kinesis, Glue, AppFlow, Lambda, S3
- Data Platforms: Snowflake, Databricks, dbt
- Languages: Python, Scala
- Visualization: Tableau, Power BI, QuickSight
- Integration: SnapLogic
Team & Environment
You will join the IT Enterprise Data, AI & Platforms organization, working within the IT EDP Finance & Group Functions Data & Analytics team.
Benefits & Compensation
- Flexible working conditions
- Life and accident insurance
- Health insurance at a competitive price
- Investment in your learning and development
- Gym membership discounts
Boehringer Ingelheim is an equal opportunity employer.