
Data DevOps Engineer

Urgent

Job description

Wanted: Data DevOps Engineer at KPN

Contribute to KPN’s digital transformation.

As the number 1 network provider in the Netherlands, we work passionately on our secure, reliable, and future-proof networks and services. We ensure that everything and everyone stays connected at all times to create a better and cleaner world.

As a future-proof Data DevOps Engineer, you will contribute to making KPN a fully data-driven company. The Data & AI Office (DAO) is part of the Technology and Digital Office (TDO) within KPN. DAO is responsible for enabling a digital, data-driven KPN.

Do you have ambitions in the field of Business Intelligence and working with large telecom data? Do you want to join a team that works on making KPN's customer interactions data-driven? Then you might be the new colleague we are looking for! You will work in a multidisciplinary team on insights that are valuable for enhancing the customer experience.

We expect applicants for this role to be highly motivated to contribute to these values.

This is what you enjoy:

  • You are a (big) data freak ;)
  • You strive for 100% automation.
  • You aim for the highest quality, effectiveness, and efficiency.
  • Teamwork is one of your core values.
  • You have a "we build it, we own it" mentality.
  • You challenge the status quo; you dare to speak out!
  • You are willing to change by learning and exploring new technologies.
  • You follow data policy and compliance rules in your daily work.
  • You create value for stakeholders, the business, and customers.

Start date: as soon as possible
End date: 30-12-2026
Location: Amsterdam
Number of hours: 32-36 hours per week

Requirements

Technical Skills You Should Be Familiar With:

  • SQL
  • Teradata
  • Informatica/Azure Data Factory/Azure Synapse
  • PowerBI
  • Python
  • Kafka/Streaming Data
  • UNIX/LINUX
  • Jenkins
  • GIT
  • Airflow
  • JIRA/Confluence
  • Slack

Do You Recognize Yourself in the Following?

  • At least 3 years of experience working with Enterprise Data Warehouses and Data Marts
  • At least 2 years of experience in data modelling
  • Good working experience with Python or another programming language
  • Experience working with Big Data
  • Some experience with a cloud environment, preferably Azure
  • A passion for designing and writing maintainable code, with a focus on valuable tests
  • Knowledge of Linux, with familiarity with Windows as well
  • Experience creating and automating data pipelines using CI/CD
  • An understanding of the value of writing unit tests as part of your coding cycle

What Do We Expect From You?

  • Never settle and be open to experimenting with your way of working
  • Be a team player, sharing knowledge and coaching other team members
  • Exhibit problem-solving skills, always finding ways to overcome challenges
  • Embrace the DevOps mindset: you build it, you own it
  • Experience working with Agile or Scrum, or an eagerness to learn Agile ways of working
  • Strong communication skills in English, with Dutch as an asset
  • Solid knowledge of SQL
  • Decent knowledge of Python
  • Experience with event-driven data (Kafka)
  • Good understanding of data modelling (both semantic modelling and Power BI)
  • Knowledge of Microsoft Fabric is a plus
  • Experience building ETL Pipelines and automation

We're excited about the possibility of working together, so bring your skills and passion for data to the table!

Location

Amsterdam

Publication date

13.11.2025

Publication end date

19.11.2025

Contact person

Felix de Bruin

Other details
More information
Felix de Bruin
Work phone: (010) 210 8790