Job description

We are looking for a motivated and driven Data Engineer (m/f/d) who will help us shape our Pricing Intelligence team. This team is responsible for revenue optimization across the full network. In particular, we aim to define the best possible pricing strategy to maximize our revenue given current and historical demand patterns, competitor actions, and our current and historical supply. Our long-term goal is to enhance our Revenue Management System with integrated solutions (reports, tools, UIs) that will replace our current product portfolio.

Duties and responsibilities
  • Build and maintain ETL pipelines
  • Construct and support deployment pipelines using CI/CD tools (Jenkins, GitLab CI/CD, etc.)
  • Test and validate pipelines and data
  • Develop documentation for pipelines, products, and processes
  • Build and optimize Power BI reports following DAX and data modelling best practices
  • Create visualizations using Python or R
  • Shape and drive improvements in our Revenue Management System by designing integrated solutions and developing UIs using Power BI and Dash (Python)
  • Put users first by collecting requirements, focusing on their feedback, and constantly improving the usability of our products
  • Interact regularly with different types of stakeholders from our and other departments
Desired qualifications and experience
  • You have 3+ years of experience as a Business Intelligence Engineer, Data Engineer, or equivalent
  • You have a master’s degree in Computer Science, Business Informatics, Mathematics, Industrial Engineering, or similar
  • You are proficient in Snowflake/SQL and Python and have solid experience writing data pipelines using Python
  • You have solid experience in data modelling, DAX, and visualization using Power BI (or a similar tool)
  • You feel comfortable using Git
  • You have a basic understanding of Amazon Web Services (AWS)
  • You are fluent in workflow orchestration using Apache Airflow, Luigi, or similar
Additional info

Bonus points:

  • You have experience as a Data Analyst, Business Analyst, or equivalent
  • You feel comfortable using GitLab CI/CD
  • You have solid experience in creating visualizations and developing reactive apps using Dash (Python)
  • You have experience using Kubernetes and Docker
  • You have experience with analytics engineering tools such as DBT or similar
  • You are familiar with Kafka

Contact person

Alina Elenescu

How to apply