About the Role
The individual will build and manage cloud-based data platforms on AWS, focusing on developing and maintaining robust data pipelines that support business intelligence and analytics workflows.
Responsibilities
- Design and implement data storage solutions using AWS technologies
- Develop and monitor automated data workflows for ingestion and transformation
- Ensure data reliability, accuracy, and timely delivery across systems
- Collaborate with analytics and engineering teams to define data requirements
- Optimize data pipeline performance and scalability
- Troubleshoot and resolve data integration issues
- Maintain documentation for data architectures and processes
- Support data governance and compliance standards
- Implement monitoring and alerting for data systems
- Work with infrastructure as code for deployment consistency
- Participate in code reviews and system design discussions
- Contribute to disaster recovery and backup strategies
- Integrate third-party data sources into existing pipelines
- Evaluate and adopt new data tools and frameworks
- Ensure secure handling of sensitive data across platforms
Nice to Have
- Experience with Apache Airflow or similar orchestration tools
- Knowledge of streaming data platforms like Kinesis
- Familiarity with DevOps practices and CI/CD pipelines
- Exposure to machine learning data workflows
- Certification in AWS data or solutions architecture
Compensation
Competitive hourly rate for this contract position
Work Arrangement
Remote with flexible hours
Team
Collaborative engineering team focused on scalable data systems
Project Duration
Initial contract term of 6 months with potential extension based on project needs
Security Requirements
- Candidate must be able to pass a background check
- U.S. person status required for access to certain systems