Advanced Analytics Software Engineer

Do you want to work in a young and dynamic international team united around data and customer centricity? Are you passionate about turning raw data into gold? Are you ready to shape the future of the financial industry within Central and Eastern Europe? Then you are the right person for our team!

With strong customer recognition and a position as a digital leader in the market, Raiffeisenbank Serbia is looking to grow its newly established Advanced Analytics Team and to drive analytical transformation at the international level for the entire Raiffeisen Group.

We are looking for ambitious candidates who will help us to:

  • Drive the scaling of new banking solutions with the help of data science
  • Be the mastermind in managing data workflows and turning them into actionable insights
  • Contribute to RBI Group's transformation into a data-driven company and the most recommended financial institution

You will be part of our adaptive set-up in which data engineers, data scientists, MLOps engineers, and business experts work together internationally on different use cases.

What your job will look like:

  • Participate in the full lifecycle of data science projects, including the design and development of data processing and monitoring pipelines, and resource planning
  • Work with state-of-the-art cloud infrastructure (AWS, Databricks)
  • Assemble large, complex data sets to meet functional and non-functional business requirements
  • Develop, maintain, and optimize ELT and ETL pipelines (including incident investigation and writing postmortems)
  • Continuously support internal consumers (data analysts, data scientists) in best data engineering practices and automation of development pipelines
  • Prepare accompanying documentation and data specifications, and contribute to the data catalogue


What you bring to the table:

  • A structured and conceptual mindset coupled with a strong quantitative and analytical problem-solving aptitude
  • Professional experience in designing and developing production-ready data applications and pipelines in a cloud ecosystem
  • Proficiency with PySpark, Python, SQL, and shell scripting
  • Software engineering excellence and an understanding of the SDLC, with Linux and bash as everyday tools
  • Deep knowledge of SQL (DDL, analytical functions, sub-queries, and performance optimization principles for popular relational databases, e.g., PostgreSQL, MySQL, ClickHouse)
  • Experience working within an agile (Scrum) methodology
  • Good comprehension of data warehousing principles, master data management (MDM), and data models (LDM/PDM)
  • Fluent English, spoken and written
  • Fire in your eyes, a desire to learn, and the drive to improve on the status quo


A plus:

  • BSc in Computer Science, Informatics, Software Engineering or related majors
  • Solid knowledge of ML principles and frameworks, and analytical libraries (e.g., pandas, NumPy, scikit-learn)
  • Familiarity with developing unit and integration tests, TDD
  • Understanding of Git and collaborative coding practices
  • Experience in developing CI/CD/CT pipelines (e.g., GitHub, Jenkins, TeamCity, GitLab CI)


What we offer:

  • Be part of an international team at a leading banking group
  • Challenging projects in the international financial system
  • Flexible working arrangements and the freedom to determine your own work-life balance
  • Tailored professional and soft-skills training and education
  • Private pension and medical insurance
  • Competitive salary

Interested in continuous self-development and working with cutting-edge technologies? Apply via the link below and join our analytics journey:
Application link