Build and maintain scalable data infrastructure and ETL pipelines to support workforce analytics, recruiting, engagement, and organizational health insights. Enable data-informed decisions across People & Workplace functions through robust, reliable data systems.
Responsibilities
- Design, construct, and manage scalable data systems that support enterprise-wide analytics and reporting needs.
- Develop and operate ETL workflows to ingest, transform, and deliver large-scale data from diverse sources.
- Use distributed data processing frameworks such as Spark and Hive, or other MPP architectures, for efficient large-scale data handling.
- Apply SQL and data modeling techniques to organize and optimize datasets for analytical use cases.
- Process and analyze high-volume structured and semi-structured datasets using tools like Spark and Presto.
- Write production-grade code in Python, Java, Scala, or Go for data pipeline development and automation.
- Ensure data accuracy and availability by managing and monitoring hundreds of ETL pipelines under strict SLAs.
- Diagnose and resolve complex data problems, including root-cause analysis of pipeline failures and data quality issues.
- Collaborate with data analysts and cross-functional teams to deliver trusted datasets and support effective data usage.
- Identify and resolve data discrepancies in dashboarding platforms such as Tableau, Power BI, or MicroStrategy.
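The ETL work described above can be illustrated with a minimal, self-contained Python sketch: a toy extract/transform/load flow using only the standard library. The feed, table, and field names are hypothetical, and an in-memory SQLite database stands in for a real warehouse.

```python
import csv
import io
import sqlite3

# Hypothetical raw feed: employee engagement survey rows as CSV text.
RAW_CSV = """employee_id,dept,score
1,Engineering,8
2,Engineering,6
3,Sales,9
"""

def extract(raw: str) -> list[dict]:
    """Parse the raw CSV feed into dict records."""
    return list(csv.DictReader(io.StringIO(raw)))

def transform(rows: list[dict]) -> list[tuple]:
    """Cast types and keep only valid scores (0-10)."""
    out = []
    for r in rows:
        score = int(r["score"])
        if 0 <= score <= 10:
            out.append((int(r["employee_id"]), r["dept"], score))
    return out

def load(rows: list[tuple], conn: sqlite3.Connection) -> None:
    """Idempotent load into an analytics table (re-runs overwrite, not duplicate)."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS engagement "
        "(employee_id INTEGER PRIMARY KEY, dept TEXT, score INTEGER)"
    )
    conn.executemany("INSERT OR REPLACE INTO engagement VALUES (?, ?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract(RAW_CSV)), conn)
avg = conn.execute(
    "SELECT dept, AVG(score) FROM engagement GROUP BY dept ORDER BY dept"
).fetchall()
print(avg)  # [('Engineering', 7.0), ('Sales', 9.0)]
```

In production these three steps would typically be separate Airflow tasks writing to a distributed store, but the extract/transform/load separation and the idempotent load are the same pattern at any scale.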
Requirements
- Minimum of 5 years of experience in data engineering, with a focus on building and maintaining data pipelines and infrastructure.
- Proficiency in SQL, including advanced operations such as joins, aggregations, unions, and window functions.
- Practical experience in data modeling and schema design for analytical data systems.
- Experience developing and managing ETL pipelines using Airflow or comparable orchestration tools.
- Hands-on experience with Big Data technologies including Hadoop, Hive, Spark, or similar platforms.
- Programming experience in at least one of the following: Python, Java, Scala, or Go.
- Familiarity with UNIX/Linux environments and shell scripting for system automation and operations.
- Knowledge of software engineering best practices, including testing, monitoring, and documentation.
- Strong communication and collaboration skills when working with analysts and non-technical stakeholders.
- Ability to diagnose and fix data issues across ETL pipelines and business intelligence tools.
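The advanced SQL operations listed above, window functions in particular, can be sketched in a small self-contained example. SQLite (version 3.25+, bundled with recent Python) is used here purely for illustration, and the `hires` table is hypothetical.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE hires (dept TEXT, hire_month TEXT, hires INTEGER);
INSERT INTO hires VALUES
  ('Engineering', '2024-01', 4),
  ('Engineering', '2024-02', 6),
  ('Sales', '2024-01', 2),
  ('Sales', '2024-02', 3);
""")

# Running total of hires per department: SUM(...) OVER a window
# partitioned by dept and ordered by month.
rows = conn.execute("""
SELECT dept, hire_month,
       SUM(hires) OVER (PARTITION BY dept ORDER BY hire_month) AS running_hires
FROM hires
ORDER BY dept, hire_month
""").fetchall()

for row in rows:
    print(row)
```

The same `PARTITION BY ... ORDER BY` window syntax carries over to Spark SQL, Presto, and Hive, which is why it appears alongside joins and aggregations in the requirements.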
Nice to Have
- Bachelor’s or advanced degree in Computer Science or a related technical field.
- Experience working in fast-paced, high-growth technology environments.
- Exposure to real-time data ingestion systems such as Kafka or Flume.
- Experience supporting data science or advanced analytics teams with data infrastructure.
- Understanding of industry standards for large-scale ETL processes and data platform architecture.
- Strong interest in data science and emerging data technologies and methodologies.
Tech Stack
SQL, Spark, Hive, Hadoop, Airflow, Presto, Python, Java, Scala, Go, Tableau, Power BI, MicroStrategy, Kafka, Flume
Benefits
- 100% remote work policy, requiring only a laptop and stable internet connection.
- Highly competitive compensation paid in USD, exceeding typical market rates.
- Paid time off to support personal well-being and work-life balance.
- Autonomy in time management with a focus on delivering results.
- Opportunity to work on high-impact projects for leading U.S. technology companies.
Compensation
Highly Competitive USD Pay
Work Arrangement
Global: 100% remote with autonomy to manage time; results-focused environment.
Team
Team of 600+ skilled technology professionals based in Latin America, serving U.S. clients through a nearshore staff augmentation model.
- Emphasis on employee well-being and work-life balance, supported by engagement initiatives and a strong team culture.
- Global network of over 600 professionals across 25+ countries, fostering multicultural collaboration and professional connections.
- Collaboration with experienced senior-level professionals, ensuring high-caliber teamwork and knowledge sharing.
Additional Information
- All team members are located in Latin America.
- The role supports a U.S.-based client with global operations in logistics and technology.
- The position is fully remote with no requirement to work from an office.
- Candidates must possess strong English communication skills for effective collaboration with U.S. stakeholders.
- The organization emphasizes career development, digital transformation, and participation in impactful, innovative projects.
