
Senior Data Engineer - Data Factory | Rotterdam | Masters in Public


Requested

  • 36 hours
  • Senior
  • English (language)

Offered

  • Permanent contract
 

The vacancy in brief

Rotterdam
Join the Data & Analytics team at Schiphol and see your work impact the airport's operations. As a Senior Data Engineer, you'll build robust data pipelines, collaborating with various stakeholders and the Core Data Platform team. You'll work with innovative technologies in a dynamic environment, contributing to the airport's ambition to be Europe's best. Enjoy unique perks like an on-site cinema and being part of a team that values a positive and respectful workplace atmosphere. Continue reading to find out why this challenge is waiting for you.
 

About the company

Masters in Public
Employment agency
 

Full vacancy text

Royal Schiphol Group brings the world within reach. We connect the Netherlands with the rest of the world, creating value for the economy and society. Achieving this involves over 2,000 employees working 24 hours a day, 7 days a week. Our ambition is to develop Schiphol into Europe’s best Airport, both for passengers and for airlines. Innovation is key to achieve this ambition.

At Schiphol you can look out of the window and see your data products take effect on our operation. We are the only employer with both its own police force and a cinema. Our department works on a range of projects, from image recognition of below-wing processes of aircraft sitting at the gate, to expected waiting times at security, and even dynamic routing of passengers through the terminal. Walking through the airport you can see your work take effect.

Department
The Data & Analytics department (D&A) aims to be the driving force behind a data-driven culture, our AI, BI and analytics function, and our data governance & quality frameworks. The goal of the Data Factory team within the D&A Foundation is to provide data pipelines on a future-proof data infrastructure that ensure the continuity, safety, and integrity of the delivered data.

The Central Data Factory team is a team of data engineers within D&A whose primary purpose is to establish robust data pipelines that retrieve data from internal and external sources and eventually send it to the Core Data Platform. The team works closely with various departments within Schiphol, including AI, BI, IoT, and Data Governance.

What will you do as a Senior Data Engineer Data Factory?
As a Data Engineer, you will build data pipelines, working closely with our Core Data Platform team, various (internal) business stakeholders, and multiple data providers.

You see yourself as a senior engineer with extensive programming experience and the willingness to help the team achieve its goals. To do this, you’ll:
– Identify the integration requirements of several data sources,
– Design, implement, and test cloud based (Azure) data intensive applications,
– Collaborate with Lead Engineers, Data Architects & Enterprise Architects for technical solutions,
– Help team members with technical questions, and
– Coach junior team members.

Your added value
As a professional you are introspective and resilient, with a clear vision, courage, and focus. Innovation and agile working are an absolute priority. You believe that creating a good atmosphere is essential for a successful, respectful, and welcoming workplace. We are a small team, so we’ll be looking for a genuine connection. Your natural approachability enables you to connect with people. You are passionate about your work, and you are curious and open to new developments, with a mindset keyed to possibilities and opportunities. If you meet the requirements below, we look forward to receiving your application!

Requirements:
– You have at least 4 years of experience running data-driven solutions, including the deployment and management of data pipelines in production.
– Strong Scala or Java experience; we would prefer someone with 2+ years of hands-on experience.
– Professional experience with Kafka.
– Professional experience with a data processing framework such as Spark, Flink, or Beam, implementing both batch and streaming pipelines (we currently use Spark).
– You recognize the importance of logging and monitoring (we currently use Splunk and OpenTelemetry).
– You have at least one year of experience with Microsoft Azure or another cloud provider.
– You are comfortable working with the latest DevOps technologies, e.g., Kubernetes, OpenShift, Docker.
– Professional experience with Scrum.
– You are fluent in English. Knowledge of the Dutch language is a plus.
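The requirements above center on writing one set of transformation logic that serves both batch and streaming pipelines, the pattern that frameworks like Spark and Flink generalize. As a rough, dependency-free illustration of that idea (plain Scala, no Spark; all names and data here are hypothetical), the same aggregation can be applied to a complete dataset at once or incrementally, record by record:

```scala
// Illustrative sketch only: one shared transformation, run in batch mode and
// in a simulated streaming mode. Hypothetical types and data, no framework.

final case class SensorReading(gate: String, delayMinutes: Int)

object PipelineSketch {
  // Core business logic: drop invalid records, then average delay per gate.
  def aggregate(readings: Seq[SensorReading]): Map[String, Double] =
    readings
      .filter(_.delayMinutes >= 0)      // discard bad records
      .groupBy(_.gate)                  // key by gate
      .map { case (gate, rs) =>
        gate -> rs.map(_.delayMinutes).sum.toDouble / rs.size
      }

  // Batch mode: the logic sees the complete dataset at once.
  def runBatch(data: Seq[SensorReading]): Map[String, Double] =
    aggregate(data)

  // Streaming mode: the same logic is re-applied after every incoming
  // record, maintaining a running result (a crude stand-in for what a
  // streaming engine does with micro-batches and state).
  def runStream(data: Iterator[SensorReading]): Map[String, Double] = {
    var seen   = Vector.empty[SensorReading]
    var result = Map.empty[String, Double]
    data.foreach { r =>
      seen = seen :+ r
      result = aggregate(seen)
    }
    result
  }
}
```

In a real Spark job the shared logic would live in Dataset transformations, with the batch and streaming entry points differing only in source and sink wiring.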

What we offer
The role of Data Engineer at Schiphol is a challenging, exciting job in a cornerstone of the Dutch economy that is going through big changes. You will work with highly motivated colleagues who are genuinely proud of their work, and you will be part of a young and inspiring Digital & Innovation leadership team.

Additional information:
– CV & motivation letter focused on the assignment; please submit in Dutch and in PDF format.
– Maximum size of the CV is 1500 KB.
– Important: Make sure that the candidate information is complete.
– Suppliers of candidates should be aware of the applicable laws and regulations regarding employment conditions and the (Schiphol) collective labor agreement. This deployment falls in scale 13 of the Schiphol CAO.
– A VGB or VOG is required for assignments at Schiphol. The start date will only be discussed after receiving the VOG or the confirmation from the VGB at Magnit.
– The candidate is required to have proof of identity (passport or identity card), which is valid at intake and at the start date of the candidate’s contract.
– When offering the candidate, we assume that you agree with the conditions of this specific client. If you are not familiar with these conditions, you can request them from the responsible recruiter.


