Senior Data Engineer

  • Sector:

    ConSol UK IOT & Emerging Technologies

  • Job type:

    Temporary & Contract

  • Salary:

    €40 - €65 per hour

  • Contact:

    Jamie Jenkins

  • Published:

    12 days ago

  • Duration:

    3 months+

Senior Data Engineer

Amsterdam, Netherlands

Initial 3-month contract + extensions

You'll move the world forward. Every day, we create the most innovative mapping and location technologies to shape tomorrow's mobility for the better.

We are proud to be one team of more than 5,000 unique, curious, passionate problem-solvers spread across the world. We bring out the best in each other. And together, we help the automotive industry, businesses, developers, drivers, citizens and cities move towards a safe, autonomous world that is free of congestion and emissions.

We are searching for someone to take responsibility for expanding and optimizing our data and data pipeline architecture. As a Data Engineer, you will support our data analysts and architects on data initiatives and ensure that an optimal data delivery architecture is applied consistently across ongoing projects.

Do you like working with data? Do you have a good understanding of data streaming services and a solid grasp of APIs? Then this is your opportunity!

What you'll do

* Build the infrastructure and tooling required for optimal extraction, transformation, and loading of data from a wide variety of data sources
* Deploy analytics tools that utilize the data pipeline to provide actionable insights into customer usage, operational efficiency, and other key business performance metrics
* Work with stakeholders to assist with data-related technical issues and support their data (infrastructure) needs
* Build and analyze dashboards and dynamic reports
* Champion effective data-driven decision-making across product and marketing teams
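At its simplest, the extract-transform-load responsibility in the list above can be sketched as a small Python script. This is purely illustrative: the data source, schema, table name, and warehouse (here an in-memory SQLite table) are assumptions for the sketch, not details taken from the role.

```python
import csv
import io
import sqlite3

# Hypothetical raw source; a real pipeline would read from files, APIs,
# or streams rather than an inline string.
RAW_CSV = """\
trip_id,city,distance_km
1,Amsterdam,12.5
2,Rotterdam,7.0
3,Amsterdam,3.2
"""

def extract(raw: str) -> list[dict]:
    """Extract: parse rows out of the raw CSV source."""
    return list(csv.DictReader(io.StringIO(raw)))

def transform(rows: list[dict]) -> list[tuple]:
    """Transform: cast types and keep only the fields the target table needs."""
    return [(int(r["trip_id"]), r["city"], float(r["distance_km"])) for r in rows]

def load(rows: list[tuple], conn: sqlite3.Connection) -> None:
    """Load: write the cleaned rows into the (assumed) warehouse table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS trips (trip_id INTEGER, city TEXT, distance_km REAL)"
    )
    conn.executemany("INSERT INTO trips VALUES (?, ?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract(RAW_CSV)), conn)

# Downstream analytics then query the loaded data.
n_rows = conn.execute("SELECT COUNT(*) FROM trips").fetchone()[0]
amsterdam_km = conn.execute(
    "SELECT SUM(distance_km) FROM trips WHERE city = 'Amsterdam'"
).fetchone()[0]
print(n_rows, amsterdam_km)
```

In practice each stage would be scaled out with the tooling named below (Kafka or Event Hubs for extraction, Spark-style processing for transformation, a warehouse such as Redshift for loading), but the extract/transform/load separation stays the same.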

What you'll need

* A minimum of 5 years' hands-on experience building and optimizing big data pipelines, architectures, and data sets
* A successful history of loading, manipulating, processing, and extracting value from large, disconnected datasets
* Working knowledge of stream processing and highly scalable big data stores
* Advanced knowledge of query languages like SQL and working familiarity with a variety of databases
* Experience with relational and non-relational databases such as Postgres, Redshift, Data Explorer, etc.
* Familiar with data streaming platforms such as Kafka or Azure Event Hubs
* Experience with programming languages such as Python, Java, Bash, R, etc.
* Experience with processing and managing large data sets (tens to hundreds of TB scale)
* Familiarity with public cloud platforms such as Azure and AWS
* Bash scripting and Linux system administration