Job description

Deutsche Post DHL Group is the world's leading mail and logistics service provider. As one of the planet's largest employers, operating in over 220 countries and territories, we see the world differently. Join our team and discover how an international network focused on service, quality and sustainability can connect people and improve lives through the power of global trade. And not just for our customers, but for every member of our team, too. Join a great international team (> 15 nationalities) of Data Engineers, Cloud Engineers, DevOps Engineers, Data Scientists and Architects to learn from and to share your experience with. The team language is English, so you don't need to speak any German. In our family-friendly environment, we offer part-time work, flextime and sabbaticals.

Duties and responsibilities

DPDHL introduced a modern Kubernetes-based data science platform built on the open-source Kubeflow solution. Our environment gives you the chance to work with state-of-the-art open-source components combined with carefully chosen vendor-backed solutions. We offer you a multi-month Kubeflow training from the ground up, enabling you to become a Kubeflow developer and a contributor to machine learning (ML) application projects. You then help data scientists transition their work from informal exploratory data analytics, based on Jupyter notebooks or code snippets in Python IDEs like PyCharm and VSCode, to structured ML pipelines using Kubeflow Pipelines. Applying best practices for Kubeflow Pipelines components, you help data scientists become more efficient through component reusability and observability. Using KFServing, you guide data scientists on their journey to productionizing ML models for scoring/inference via web endpoints, fronted by our API gateway, Apigee.
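To illustrate the kind of refactoring this role supports (purely a hypothetical sketch, not DPDHL code): moving from an ad-hoc notebook cell toward small, self-contained steps with explicit inputs and outputs, which is the shape Kubeflow Pipelines components expect. A real KFP component would wrap such functions with the `kfp` SDK's decorators; this sketch uses only the Python standard library, and all names and data are made up for illustration.

```python
from statistics import fmean

def preprocess(raw: list[float]) -> list[float]:
    """Step 1: clean the data (here: drop negative readings)."""
    return [x for x in raw if x >= 0]

def train(features: list[float]) -> dict:
    """Step 2: 'train' a toy model (here: the mean as a baseline)."""
    return {"baseline": fmean(features)}

def evaluate(model: dict, features: list[float]) -> float:
    """Step 3: score the model (mean absolute error of the baseline)."""
    return fmean(abs(x - model["baseline"]) for x in features)

if __name__ == "__main__":
    # Wiring the steps together by hand; in Kubeflow Pipelines each step
    # would instead run as its own containerized component in a DAG.
    raw = [1.0, 2.0, -5.0, 3.0]
    cleaned = preprocess(raw)
    model = train(cleaned)
    print(evaluate(model, cleaned))
```

Because each step takes and returns plain values rather than mutating shared notebook state, the steps can be tested, reused and containerized independently.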

Appreciated qualifications and experience
  • Multi-year practical experience in Python programming 
  • Experience refactoring Python code to production-grade maintainability 
  • Practical knowledge of the large ecosystem of Python data science and ML libraries and components such as scikit-learn, pandas, PyTorch, TensorFlow, … 
  • In-depth experience with at least one (ideally multiple) existing data science tools and platforms (see bonus points below) 
  • Routine experience with common Python tooling such as pip and conda 
  • In-depth Docker image-building experience 
  • Experience with CI/CD and automation best practices and tools such as Jenkins, GitHub Actions, ArgoCD, Ansible, Terraform, Helm, … 
  • Experience with Kubernetes on various levels and on multiple platforms (on-prem, Azure, GCP) 
  • Experience collaborating with data scientists 

Bonus points for practical experience with: 

  • Kubeflow 
  • Any open-source or proprietary ML lifecycle management tooling and platforms such as AWS SageMaker, Azure ML, Google Vertex AI, MLflow, … 

Additional info

Your benefits: We offer excellent employee benefits, a competitive salary package and great development opportunities, such as attending conferences or paid training courses. We welcome full-time (40 hours) and part-time work and offer fully remote work. We also offer relocation to the Bonn office in Germany. If you choose to relocate, we support you with moving, all official paperwork, an interim flat and finding a permanent flat, as well as the search for a kindergarten according to your family's needs. We have a dedicated team to support you with your visa and sponsor all necessary steps.

Contact person

Christian Krudewig

How to apply