Join a forward-thinking team as a Data Engineer focused on building and maintaining high-performance data infrastructure. In this fully remote role based in Poland, you'll play a central role in shaping how data is collected, stored, and utilized across systems.
What You’ll Do
- Develop and manage ETL/ELT pipelines that consolidate data from diverse sources into a unified Data Lake environment.
- Design efficient, scalable data architectures tailored to analytics and reporting needs.
- Ensure data integrity, access controls, and consistent availability across platforms.
- Partner with data analysts and business teams to clarify requirements and deliver actionable datasets.
- Monitor pipeline health, identify bottlenecks, and resolve issues proactively.
- Keep pace with evolving data engineering practices and integrate improvements into existing workflows.
What We’re Looking For
- Hands-on experience with workflow orchestration and ETL tools such as Apache Airflow or Talend.
- Proficiency in SQL and Python for data manipulation and automation.
- Background working with cloud data platforms such as Snowflake, Amazon S3, or BigQuery.
- Familiarity with major cloud providers—AWS, GCP, or Azure—and their data processing services.
- Solid understanding of data modeling principles and performance optimization techniques.
- A detail-driven mindset with strong analytical and troubleshooting abilities.
- Excellent communication skills to bridge technical implementation with business needs.
- Availability during the team's collaboration window of 15:00 to 19:00 Central European Time (four hours of overlap).
