Belmont Lavan Ltd is hiring a Data Engineer for a 6-month contract. You will design, build, and optimize large-scale data pipelines that support analytics and automation across telecom systems, and you will be instrumental in developing the next generation of the data integration and analytics framework.
What You'll Do
- Design, develop, and maintain scalable data pipelines and ETL processes.
- Integrate data from various telecom domains including CRM, billing, and network systems.
- Build efficient data transformation and orchestration workflows using Spark and Airflow.
- Manage deployments in Kubernetes on Linux-based environments (RHEL or equivalent).
- Optimize data accessibility and analytics performance through Dremio and related tools.
- Automate scheduling and monitoring to ensure stability and reliability.
- Troubleshoot data issues, conduct performance tuning, and maintain platform documentation.
What We're Looking For
- Proven experience in data engineering and ETL workflow design.
- Strong working knowledge of Apache Spark, Airflow, and Kubernetes.
- Solid programming skills in Python, Java, and SQL.
- Hands-on experience with Red Hat Enterprise Linux (RHEL) or similar environments.
- Strong analytical, problem-solving, and communication skills.
- Fluent English.
Nice to Have
- Experience working with telecom data (customer, billing, or network).
- Familiarity with data lake, mesh, or real-time streaming architectures.
- Exposure to cloud platforms such as AWS, Azure, or GCP.
- Understanding of CI/CD automation and data governance frameworks.
Technical Stack
- Apache Spark, Airflow, Kubernetes, Dremio
- Python, Java, SQL
- Linux (RHEL)
Team & Environment
You will be joining the Telecom data platform team.
Work Mode
This is a fully remote contract position based in Finland.