• Job code: QR8938
  • Dev Database/DWH/BI

Senior DevOps Data Engineer (Kafka, Java, Linux, CI/CD)

Amsterdam

For our client in Amsterdam, we are looking for a Senior DevOps Data Engineer (Kafka, Java, Linux, CI/CD).

 

We are looking for a Senior DevOps Data Engineer (Kafka) to join our client in building the Strategic Data Exchange between Lending core systems and surrounding systems. The primary focus of the squad is on the processes concerning data delivery to internal applications and to the Data Lake.

 

Data is becoming more important every day. Your contribution to the Strategic Data Integration will be critical to realizing our ambitious data platform, with high-quality and timely data availability, moving from batch to real-time. This will enable excellent data consumption possibilities to meet ever-increasing client and regulatory demands on data.

 

Our client needs your help designing and building this new exchange and building bridges towards other squads to realize end-to-end delivery across the Lending and other Tribes. They value Agile, self-organization and craftsmanship. They are driven professionals who enjoy shaping the future of this place.

 

Needed skills & experience

We are looking for someone with an easy-to-work-with, mature and no-nonsense mentality. Someone who is an open and honest communicator, who values working as part of a team, who is willing and able to coach or train other developers and who is aware of developments and trends in the industry and corporate ecosystem.

 

Are you also passionate about a (not so distant) future where most data processing is done in a streaming fashion, not scared off by complex data, and do you enjoy developing complex components in Java? Then please read on.

 

On the more technical side, you must have 9+ years of relevant experience in data engineering, and in particular you must have experience in the following fields:

 

1. Agile / Scrum.

2. Track record in building larger corporate systems.

3. Kafka Streams API.

4. Kafka, Schema Registry and Kafka Connect, using the Confluent platform.

5. Java 8 or higher backend development.

6. CI/CD tooling: Azure DevOps, Maven, Checkmarx, Git, Ansible.

7. Running and managing a Kafka cluster and related components.

8. Linux (bash) scripting capabilities.

9. Data Integration techniques.

10. Oracle SQL (12c or higher).

 

In addition to these must-haves, we appreciate knowledge of the following:

1. Oracle RDBMS 12c or higher.

2. Database Change Data Capture.

3. Logging and monitoring with Grafana, Elastic, Kibana, Prometheus or Logstash.

4. Data modelling.

5. Oracle Data Integrator 12c.

6. Experience in a complex, corporate environment.

7. Issue trackers like JIRA, ServiceNow.

8. Collaboration tooling like Confluence.

Apply