About the Role
The role involves developing and improving Python-based web scraping solutions and backend services in a distributed environment. Candidates should have strong programming skills and experience with asynchronous systems.
Responsibilities
- Design and implement robust web scraping systems
- Improve performance and reliability of backend services
- Collaborate with engineers to integrate data pipelines
- Write clean, maintainable, and testable code
- Troubleshoot and debug production issues
- Optimize data extraction processes for efficiency
- Participate in code reviews and technical discussions
- Maintain documentation for systems and workflows
- Support the deployment and monitoring of services
- Contribute to architectural decisions for scalable systems
- Ensure compliance with security and data handling standards
- Respond to changing project requirements quickly
- Work with large datasets and distributed systems
- Use version control systems effectively
- Follow agile development practices
- Identify and address performance bottlenecks
- Develop automation tools for internal use
- Integrate third-party APIs and services
- Monitor system health and uptime
- Assist in refining development workflows
- Stay current with advancements in web technologies
- Support continuous integration and delivery pipelines
- Collaborate across time zones in a remote setting
- Contribute to technical planning and estimation
- Ensure code adheres to best practices
Nice to Have
- Experience with Scrapy or similar frameworks
- Background in large-scale data processing
- Knowledge of proxy rotation systems
- Familiarity with headless browsers
- Experience with anti-bot evasion techniques
- Understanding of web security mechanisms
- Contributions to open-source projects
- Experience with Kubernetes
- Knowledge of message queues
- Background in machine learning applications
- Familiarity with data serialization formats
- Experience with performance profiling tools
- Understanding of network latency optimization
- Experience working with geographically distributed teams
- Prior work in remote-first companies
Compensation
Competitive salary based on experience and location
Work Arrangement
Remote
Team
Collaborative remote team focused on building scalable data extraction systems
Tech Stack
Python, Scrapy, asyncio, Docker, Kubernetes, PostgreSQL, Redis, AWS, Git, Prometheus, Grafana
Professional Development
Access to training resources, conference attendance support, and time for personal projects
Work Flexibility
Fully remote role with flexible scheduling within time zone overlap requirements
