Requirements
- University degree in IT or a relevant discipline, combined with a minimum of 4 years of relevant working experience in IT;
- At least 3 years of experience in Python;
- At least 2 years of experience in AWS Cloud Computing services and Docker;
- At least 2 years of experience in Java (Java SE, Java EE, EJB, Applet, etc.);
- At least 2 years of experience in Linux;
- At least 1 year of experience in data modelling (BPMN, ARIS, EPC, Flowcharts), data analysis (Microsoft Power BI, KNIME, Hadoop, Spark) and data processing (Kafka, Airflow, Kubeflow, LangChain, LangGraph);
- At least 1 year of experience in machine learning, Large Language Models (LLMs) and Deep Learning Frameworks (Transformers, TensorFlow, PyTorch, AWS Bedrock, AWS SageMaker AI, Terraformers, Haystack);
- At least 1 year of experience in API Gateway;
- At least 1 year of experience in Microsoft Windows 7 and higher;
- At least 1 year of experience in Perl, R and C++;
- At least 1 year of experience in monitoring tools (e.g. ELK, Datadog, Dynatrace);
- At least 1 year of experience in version control (e.g. SVN, Bitbucket) and collaboration systems (Jira, Confluence, Microsoft Office 365);
- At least 1 year of experience in one of the following: OWL 2, RDF, RDFS, Triple stores (Virtuoso or equivalent) and SPARQL 1.0/1.1.
