Deutsche Post DHL Group is the world’s leading mail and logistics service provider. As one of the planet’s largest employers, operating in over 220 countries and territories, we see the world differently. Join our team and discover how an international network that’s focused on service, quality and sustainability is able to connect people and improve lives through the power of global trade. And not just for our customers, but for every member of our team, too. Join a great international team (> 15 nationalities) of Data Engineers, Cloud Engineers, DevOps Engineers, Data Scientists and Architects to learn from and to share your experiences with. The team language is English, so you don’t need to speak any German. In our family-friendly environment, we offer part-time work, flextime and sabbaticals.
Duties and responsibilities
DPDHL has set up a global big data architecture as part of the Group’s digitalization agenda. In this context, we are looking for an aspiring Cloud DevOps Engineer to join our growing team of analytics experts. You will be responsible for expanding and optimizing our DevOps and cloud CI/CD architecture, as well as optimizing our infrastructure-as-code (IaC) codebase. Ideally, you are interested in or already have experience with maintaining resources for data pipelines as well as processing (transforming, aggregating, wrangling) data. You will be involved in optimizing CI/CD pipelines and building them from the ground up. In your role, you will support our data engineers, database architects, data analysts, and data scientists on data initiatives and will ensure that an optimal CI/CD architecture is applied consistently across ongoing projects. It is essential that you work in a self-directed manner and are comfortable supporting the infrastructure needs of multiple teams, systems, and products. You are the right candidate to join our team if you are excited by the prospect of helping to optimize, and sometimes even re-design, our company’s data lake architecture to support our next generation of products and data initiatives. Our environment gives you the chance to work with state-of-the-art open-source components combined with carefully chosen, cloud-vendor-backed managed services.
Appreciated qualification and experience
- At least 2–3 years of experience working in a cloud engineering role
- Good command of infrastructure as code with Terraform or Azure Resource Manager (ARM) templates
- Practical experience with Python, shell, or PowerShell scripting
- Familiarity with the Azure or GCP cloud ecosystem, specifically data-intensive components such as Synapse, Cosmos DB, Data Factory, Databricks, and Event Hubs (Azure), or BigQuery, Dataflow, Pub/Sub, and Dataproc (GCP)
- Experience with the CI/CD processes of complex microservice-based cloud solutions
Good to Have:
- Knowledge of modern cloud-native systems, components, and tools such as Kubernetes, Docker, and ArgoCD, and how to use them
- Knowledge of cloud storage services (ADLS Gen2 on Azure, GCS on GCP)
Bonus points for:
- Demonstrable involvement in, or contributions to, cloud-related open-source projects
- Knowledge of cloud-native CI/CD and automation (best) practices and tools such as Jenkins, GitHub Actions, ArgoCD, Ansible, Terraform, Helm, Azure DevOps, and GCP Cloud Build
We offer excellent employee benefits, a competitive salary package and great development opportunities, such as conference attendance or paid training.
We welcome full-time (40 hours) and part-time work and offer hybrid working at home and in our offices. If you join us in Germany, India or the Czech Republic, you can also choose a fully remote position.
If you move to a new city to join one of our offices, we support you with the move, all official paperwork, an interim flat and finding a permanent flat, as well as the search for a Kindergarten according to your family’s needs. We have a dedicated team to support you with your visa and sponsor all required activities.
Ole Vollertsen (firstname.lastname@example.org)