Teads is looking for a Senior Data Engineer to join our data platform team. You will design, build, and maintain robust and scalable data pipelines, working closely with data scientists, analysts, and engineers to ensure our data is reliable, accessible, and ready for analysis.
What You'll Do
- Design, build, and maintain ETL/ELT pipelines using Apache Spark to process large-scale datasets.
- Architect and manage data infrastructure on Google Cloud Platform (GCP) or Amazon Web Services (AWS), including services like BigQuery, S3, GCS, EMR, and Airflow.
- Improve and manage data warehouse and data lake solutions, ensuring data quality, consistency, and accessibility.
- Partner with cross-functional teams to understand data needs and implement solutions for new product features and business initiatives.
- Implement monitoring, alerting, and logging systems to maintain data pipeline health and ensure data accuracy.
What We're Looking For
- 7+ years of professional experience in a data engineering or similar role.
- Strong programming skills, with a focus on testing, architecture, performance, maintainability, and quality.
- Strong proficiency in SQL and in at least one of Java, Scala, or Python.
- Extensive experience with distributed data processing frameworks such as Apache Spark, Flink, Hive, or Trino.
- Proven experience with cloud-based data services on GCP or AWS (e.g., BigQuery, S3, GCS, EMR, DataProc).
- Experience with real-time data streaming technologies like Kafka or Flink.
- Deep understanding of data warehouse and data lake concepts and best practices.
- Knowledge of Apache Iceberg or Delta Lake.
- Solid understanding of infrastructure as code (IaC) using Terraform.
- Familiarity with SQL and NoSQL databases.
- Good communication and collaborative teamwork skills.
- A track record of multiple shipped software engineering projects.
- Solid knowledge of production practices and strong problem-solving skills.
Nice to Have
- Familiarity with containerization (Docker/OrbStack, Kubernetes).
- Knowledge of the ad tech ecosystem (e.g., DSPs, SSPs, Ad Exchanges).
Technical Stack
- Apache Spark, Google Cloud Platform (GCP), Amazon Web Services (AWS), BigQuery, S3, GCS, EMR, Airflow, SQL, Java, Scala, Python, Flink, Hive, Trino, Kafka, Apache Iceberg, Delta Lake, Terraform, Docker, OrbStack, Kubernetes
Team & Environment
You will be part of the Core Data Platform team within a collaborative, forward-thinking environment that fosters innovation, creative problem-solving, and continuous learning.
Benefits & Compensation
- Competitive compensation and profit-sharing.
- Daily meal vouchers (Swile).
- Family health insurance (Alan).
- Personalized relocation package (if needed).
- In-house and external training, tech conference opportunities.
- Internal mobility (individual contributor or management career ladder).
- 35+ days off per year.
- Hybrid work (2 days remote work per week).
- Fully covered parental leave.
- Reserved daycare places.
- Premium work equipment.
- Remote work subsidy.
- Dedicated charitable time and sustainability actions (Eco Tree, subsidy for eco-mobility).
Work Mode
This is a hybrid role with 2 days remote work per week, open to candidates in Israel, Slovenia, or France.
Teads is an equal employment opportunity employer and committed to diversity and inclusion at all stages of recruitment and employment.