Data DevOps Engineer
Job description
Contribute to KPN’s Digital Transformation
As the number one network supplier in the Netherlands, we work with passion on our safe, secure, and future-proof networks and services. Our goal is to ensure that everything and everyone remain connected at all times, helping to create a better and cleaner world.
Your Role as a Data DevOps Engineer
As a Data DevOps Engineer, you will play a vital role in making KPN a fully data-driven company. The Data & AI Office (DAO) is part of the Technology and Digital Office (TDO) within KPN and is responsible for enabling a digital, data-driven KPN.
Are You the One We're Looking For?
Do you have ambitions in the field of Business Intelligence and a passion for working with large telecom data? Would you like to participate in a team focused on making KPN's customer interaction data-driven? If so, you might be the new colleague we are looking for!
You will work in a multidisciplinary team dedicated to providing insights that enhance the customer experience. We expect applicants for this role to be highly motivated to contribute to these goals.
This is What You Like
- You are a (big) data enthusiast.
- You strive for 100% automation and the highest quality, effectiveness, and efficiency.
- Teamwork is your core value; you embrace a "we build it, we own it" mindset.
- You challenge the status quo and are not afraid to speak up!
- You are willing to embrace change by learning and exploring new technologies.
- You ensure adherence to data policy and compliance rules in your daily work.
- You aim to create value for stakeholders, businesses, and customers.
Position Details
Position: Data DevOps Engineer
Start Date: 10-11-2025
End Date: 30-12-2026
Hours: 32-36 hours per week
Location: Amsterdam (office on Tuesdays and Thursdays)
Requirements
Technical Skills
You are familiar with a range of technical skills, including:
- SQL
- Teradata
- Informatica/Azure Data Factory/Azure Synapse
- Power BI
- Python
- Kafka/Streaming Data
- Unix/Linux
- Jenkins
- Git
- Airflow
- JIRA/Confluence
- Slack
Self-Recognition
You recognize yourself in the following:
- You have at least 3 years of experience working with Enterprise Data Warehouses and Data Marts.
- You have at least 2 years of experience in data modelling.
- You have solid working experience in Python or another programming language.
- You have experience working with big data.
- You have some experience with a cloud environment, preferably Azure.
- You love to design and write maintainable code and can cover it with valuable tests.
- You have knowledge of Linux and are also familiar with Windows.
- You have experience automating data pipelines using CI/CD.
- You understand the value of writing unit tests as part of your coding cycle.
Soft Skills
Alongside your technical expertise, you also possess essential soft skills:
- Analytical
- Detail-oriented
- Open-minded
- Result-oriented
- Accountable
- Empathetic
- Team player
Keep embracing your strengths and continue to grow in your journey!