Responsibilities
- Develop and own new data solutions from conception to deployment, employing best practices in data design.
- Analyze existing use cases by reviewing Python and SQL code across AWS Glue and AWS Kinesis workloads.
- Break complex problems into manageable tasks to streamline analysis and delivery.
- Accurately estimate workload and deliver quality features that align with product requirements.
Requirements
- Bachelor’s degree in Computer Science or a related field.
- Strong experience in software development with SQL, Python, and ETL technologies.
- Proven ability in implementing complex, high-volume data transformation solutions.
- Experience in designing and modifying data infrastructure to enhance data analysis and reporting.
- Excellent communication skills, both oral and written.
- Strong interpersonal and stakeholder-management skills.
- Proficient analytical and problem-solving skills, with a track record of working effectively in dynamic environments.
- Availability to participate in on-call rotation and provide after-hours support.
Nice to Have
- Familiarity with AWS Aurora, AWS Redshift, AWS Glue, and AWS Kinesis.
- Experience with collaboration tools such as the Atlassian suite, and with DevOps and CI/CD practices.
- Knowledge of reporting tools such as SSRS and Logi Report (JReport).
- Experience with CMS platforms (WordPress, Drupal, or headless CMS).
- Knowledge of cloud platforms (AWS, Azure, or GCP).
- Experience with analytics tools and performance monitoring.
- Understanding of security best practices in web development.
- Experience working in Agile/Scrum environments.
