Company Description
At T-Systems, you will find groundbreaking projects that contribute to social and ecological well-being. We want to welcome new talents like you, who bring fresh ideas, different perspectives, embrace challenges, and are committed to continuous learning in order to grow and make an impact on society... all in a fun way!
It doesn't matter when or where you work. It's about doing meaningful work that advances society. For this reason, we will do everything possible to ensure that you have every development opportunity by providing a support network, excellent technology, a new work environment, and the freedom to work autonomously. We support you in growing both personally and professionally so that you can make a notable impact on society.
T-Systems is a team of around 28,000 employees worldwide, making us one of the leading global providers of end-to-end integrated solutions. We develop hybrid cloud and artificial intelligence solutions, and drive the digital transformation of companies, industries, the public sector, and ultimately, society as a whole.
Job Description
The team is responsible for the development of Data Analytics & AI use cases for the Network Control Platform of Deutsche Telekom Germany (AAA Fixed Net Germany). This also includes our real-time data processing pipeline running on an on-premises Kubernetes platform.
We are looking for a Data Engineer who will be responsible for the entire lifecycle and further development of the data processing pipelines.
You will work in a team with Data Scientists and DevOps Engineers, and collaborate closely with other internal teams, internal customers, and external suppliers.
Activity description and concrete tasks:
* Being responsible for the entire lifecycle management of a real-time data processing pipeline
* Development and optimization of data pipelines (ETL)
* Integration of new data sources
* Implementation of new technologies and tools to enhance data processing capabilities
* Implementation and operation in Kubernetes-based environments
* System monitoring and maintenance to guarantee 24/7 availability
* Rollout management throughout the entire application lifecycle based on ITIL/corporate processes (Incident/problem/change)
* Close collaboration with Data Scientists and customers
* Willingness to perform tasks outside of regular working hours
Qualifications
Mandatory skills:
* Know-how in the programming languages Java and Python
* Experience with Big Data technologies like Apache Kafka and Apache Flink
* Experience in optimizing data processing workflows
* Ability to analyze complex data structures and develop efficient solutions
* Familiarity with cloud and container technologies (especially Kubernetes environments)
* Fundamental know-how of DevOps CI/CD tools (GitLab, Artifactory, Helm, Flux, …) and monitoring systems such as Grafana, Promtail, and Prometheus
* Strong teamwork and communication skills, especially in interdisciplinary collaboration
* Problem-solving mindset and ability to work independently
* Good English skills, both written and spoken; basic German skills would be advantageous
Optional skills:
* Experience with both classic and agile forms of collaboration
More information
This role involves being on-call for approximately 6 weeks per year and occasionally performing changes outside of regular working hours.