Bertoni Solutions is hiring a Lead Data Engineer to design, build, and maintain scalable data pipelines in our collaborative, fast-paced environment. You will join a multinational team that combines the talent of Latin American professionals with Swiss organizational principles to solve business challenges with technology.
What You'll Do
- Design and develop scalable data pipelines using PySpark to support analytics and reporting needs.
- Write efficient SQL and Python code to transform, cleanse, and optimize large datasets.
- Collaborate with machine learning engineers, product managers, and developers to understand data requirements and deliver solutions.
- Implement and maintain robust ETL processes to integrate structured and semi-structured data from various sources.
- Ensure data quality, integrity, and reliability across pipelines and systems.
- Participate in code reviews, troubleshooting, and performance tuning.
- Work independently and proactively to identify and resolve data-related issues.
- Contribute to Azure-based data solutions, including ADF, Synapse, ADLS, and other services.
- Support cloud migration initiatives and DevOps practices.
- Provide guidance on best practices and mentor junior team members when needed.
What We're Looking For
- 8+ years of overall experience working with cross-functional teams (machine learning engineers, developers, product managers, analytics teams).
- 3+ years of hands-on experience developing and managing data pipelines using PySpark.
- 3 to 5 years of experience with Azure-native services, including Azure Data Lake Storage (ADLS), Azure Data Factory (ADF), Databricks, and Azure Synapse Analytics, Azure SQL Database, or Microsoft Fabric.
- Strong programming skills in Python and SQL.
- Solid experience building end-to-end ETL, data modeling, and data warehousing solutions.
- Self-driven, resourceful, and comfortable working in dynamic, fast-paced environments.
- Advanced written and spoken English (B2, C1, or C2 level) is a must-have for this position.
- Strong communication skills are a must.
- Proven leadership experience in current or previous projects.
- Must be located in Central or South America, as this is a nearshore position.
Nice to Have
- Databricks certification.
- Knowledge of DevOps, CI/CD pipelines, and cloud migration best practices.
- Familiarity with Event Hub, IoT Hub, Azure Stream Analytics, Azure Analysis Services, and Cosmos DB.
- Basic understanding of SAP HANA.
- Intermediate-level experience with Power BI.
Technical Stack
- PySpark, SQL, Python
- Azure Data Factory, Azure Synapse Analytics, Databricks, Fabric
- Azure Data Lake Storage (ADLS), Azure SQL DB
- Event Hub, IoT Hub, Azure Stream Analytics, Azure Analysis Services, Cosmos DB
- SAP HANA, Power BI
Team & Environment
You will collaborate directly with machine learning engineers, product managers, and developers.
Benefits & Compensation
- Be part of an innovative team shaping the future of technology.
- Work in a collaborative and inclusive environment.
- Opportunities for professional development and career growth.
Work Mode
This is a remote position. Candidates must be located in Central America or South America.
Bertoni Solutions is an equal opportunity employer.