At NielsenIQ, we deliver the most complete and clear understanding of consumer buying behavior. We are looking for a Big Data Engineer to join the NIQ Activate Technology team, where you will build new data solutions for our rapidly expanding customer base.
What You'll Do
- Own end-to-end data flows from requirements and architecture through implementation and production operations.
- Design and build scalable batch and real-time data pipelines and lakehouse solutions with a focus on large-scale data processing.
- Partner with Data Science to productionize ML/AI workloads and ensure smooth integration into products.
- Collaborate with cloud, DevOps, application, and client teams to deliver robust, secure, and scalable solutions.
- Evaluate and adopt new technologies and patterns to evolve the data ecosystem as scale and complexity grow.
What We're Looking For
- 3+ years of hands-on data engineering experience building and operating production-grade data systems and pipelines.
- B.Sc. / M.Sc. in Computer Science, Computer Engineering, or equivalent.
- Proficiency in Python; working proficiency in Scala.
- Strong expertise with at least one major cloud provider (AWS, Azure, or GCP).
- Strong experience with Big Data processing (Spark, Databricks) and event streaming (Kafka).
- Experience with orchestration and platform tooling such as Airflow.
- Strong SQL skills and experience with data storage systems such as data lakes/lakehouses, columnar databases, or NoSQL stores.
- Hands-on experience with containers and Kubernetes (Helm is a plus) and modern CI/CD practices.
- Familiarity with LLM workflow frameworks (LangChain/LangGraph).
- Proven experience designing, building, and owning production-grade data pipelines, including reliability, backfills, and SLA-driven delivery.
- Ability to learn new technologies and work in a dynamic fast-paced environment.
- Results-driven, pragmatic, and innovative.
- Strong English and Hebrew communication skills, both written and verbal.
Nice to Have
- Experience with Delta Lake and/or Apache Iceberg; ML lifecycle tools such as MLflow.
- Experience with Pandas/Polars and building data services/APIs (e.g., FastAPI).
- Experience building LLM-powered agents / chat assistants (RAG, tool/function calling, workflow or multi-agent orchestration).
- Infrastructure as Code (Terraform/Pulumi/CloudFormation) and cloud security fundamentals (IAM, secrets, encryption).
- Experience with observability tooling and cost/performance optimization for distributed workloads.
- Experience building applications with React and Node.js.
Technical Stack
- Languages: Python, Scala
- Cloud: AWS, Azure, GCP
- Data Processing: Spark, Databricks, Kafka
- Orchestration & Workflow: Airflow
- Databases & Storage: SQL, data lake/lakehouse, columnar databases, NoSQL
- Infrastructure & Ops: Kubernetes, Helm, CI/CD
- Table Formats: Delta Lake, Apache Iceberg
- ML/AI & LLM: LangChain, LangGraph, MLflow, Pandas, Polars, FastAPI
- Infrastructure as Code: Terraform, Pulumi, CloudFormation
- Web Technologies: React, Node.js
Team & Environment
You will be joining the NIQ Activate Technology team.
Benefits & Compensation
- Flexible working environment
- Volunteer time off
- LinkedIn Learning
- Employee Assistance Program (EAP)
Work Mode
This is a hybrid position based in Yokne'am Illit, Israel.
All employment decisions at NIQ are made without regard to race, color, religion, sex (including pregnancy, sexual orientation, or gender identity), national origin, age, disability, genetic information, marital status, veteran status, or any other characteristic protected by applicable laws.