Deloitte Central Europe is looking for a Databricks Data Engineer to join our AI & Data team in Budapest. In this role, you will apply your data engineering expertise to build, manage, and optimize data solutions on the Databricks platform, delivering projects for clients across a range of industries.
What You'll Do
- Design, build, and maintain scalable data pipelines and architectures using Databricks
- Utilize PySpark, Delta Lake, and SQL for efficient data processing and storage
- Develop and implement data models to support analytics and AI initiatives
- Collaborate with cross-functional teams to understand data requirements and deliver solutions
- Ensure data quality, reliability, and performance across cloud platforms
What We're Looking For
- 3-8 years of hands-on experience in data engineering or a closely related field
- A Bachelor's degree in Computer Science, Engineering, or a related technical discipline
- Proven hands-on experience with the Databricks platform, including PySpark, Delta Lake, and SQL
- Strong proficiency in Python and/or SQL for data manipulation and transformation
- Familiarity with at least one major cloud platform: Azure, AWS, or GCP
- Excellent problem-solving abilities and a demonstrated eagerness to learn new technologies
- Professional fluency in English and native-level proficiency in Hungarian
Technical Stack
- Core: Databricks, PySpark, Delta Lake, SQL, Python
- Cloud Platforms: Azure, AWS, GCP
Team & Environment
You will be an integral part of our AI & Data team, collaborating with professionals on data-driven projects that deliver value across multiple industries.
Work Mode
This is a local, office-based position in Budapest.