Jobgether is seeking a Data Engineer to design, build, and optimize scalable data pipelines and ETL/ELT workflows in a fully remote, collaborative environment. You will implement medallion architecture, manage batch and streaming ingestion, and create reporting-ready data models to support business and analytics needs.
What You'll Do
- Design, implement, and maintain scalable data pipelines and ETL/ELT workflows on Azure and Microsoft Fabric.
- Implement and manage medallion architecture (Bronze, Silver, Gold layers) for data products.
- Develop data ingestion frameworks for both batch and streaming sources.
- Support creation of reporting-ready data models aligned with business requirements.
- Ensure data quality, governance, lineage, and performance optimization standards are applied.
- Collaborate with Data Architects on design decisions and provide technical guidance.
- Document pipelines, transformations, and architectural decisions for maintainability.
- Communicate complex technical concepts effectively to project managers and non-technical stakeholders.
What We're Looking For
- 5+ years of hands-on experience with cloud data platforms (Azure required).
- Proven experience implementing medallion architecture at scale.
- Expertise in building pipelines using Azure Data Factory, Synapse, Dataflows, or Fabric Data Pipelines.
- Proficiency in SQL and at least one programming/scripting language (Python or PySpark preferred).
- Experience with data modeling for analytics and reporting (star and snowflake schemas, dimensional modeling).
- Strong understanding of data governance, metadata management, and performance optimization.
- Excellent communication skills and ability to collaborate with cross-functional teams.
Nice to Have
- Microsoft Fabric experience (strongly preferred).
- Familiarity with Power BI and enterprise semantic models.
- DevOps practices (CI/CD, version control).
- Knowledge of finance, real estate, or investment management data domains.
Technical Stack
- Azure, Microsoft Fabric, Azure Data Factory, Synapse, Dataflows
- SQL, Python, PySpark
Team & Environment
You will collaborate with architects, analysts, and reporting developers.
Benefits & Compensation
- Competitive monthly compensation: USD 4,000–5,000, depending on experience.
- Fully remote role with flexibility to work from any Latin American country.
- Opportunity to work with top-tier international clients.
- Exposure to modern data engineering tools and cloud platforms.
- Collaborative environment that encourages continuous learning and growth.
Work Mode
This is a fully remote position open to candidates in Latin America.
Jobgether provides equal employment opportunities to all employees and applicants.