DevOps Engineer (Data Lake/DataStage), Amsterdam
We are building a state-of-the-art Data Lake: an ambitious project based on technologies such as IBM InfoSphere (DataStage, TWS-d and Information Governance Catalogue), Netezza and Cognos.
As a DevOps engineer you will work in a scrum content squad of 6-8 people, consisting of three or four other DevOps engineers and two or three Business Analysts. In addition, a Product Owner decides on the priority of the user stories. Each data content squad is responsible for its own set of sources, from the extraction of the data, either in real time or in batches, to the delivery of Data Marts for Cognos reporting. Agile maturity is required of every team member, since we don't have a Scrum Master.
You will be responsible for the analysis of the data and for building the physical model at the various stages of the data pipeline: Operational Status, Information Warehouse and Data Marts. The physical modelling method used in our environment is what we call 'Key-Version' modelling, a mix of relational modelling and Data Vault. You will also model our Data Marts using the Kimball dimensional modelling method. Hands-on experience with Type 3 slowly changing dimensions is a minimum requirement.
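For context on that last requirement: a Type 3 slowly changing dimension keeps a limited history by storing the previous attribute value in a separate column of the same row, rather than adding new history rows as a Type 2 dimension would. A minimal sketch in Python (all column and function names here are illustrative, not part of our tooling):

```python
# Illustration of a Type 3 slowly changing dimension: the dimension
# row carries the current value and one previous value in separate
# columns, instead of versioned history rows (as in Type 2).

def apply_type3_change(row: dict, new_value: str) -> dict:
    """Shift the current attribute into the 'previous' column and
    store the new value as current (hypothetical column names)."""
    return {
        **row,
        "prev_region": row["curr_region"],  # keep one level of history
        "curr_region": new_value,
    }

customer = {"customer_key": 42, "curr_region": "North", "prev_region": None}
customer = apply_type3_change(customer, "South")
# curr_region is now "South"; prev_region holds "North"
```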
Additional tasks in the squad include designing, developing, testing and deploying IBM Information Server artefacts, mainly involving DataStage, as well as developing new frameworks and packages using Cognos Analytics.
Maintaining the applications running in our Data Lake is the operations part of your job. You and your fellow engineers are responsible not only for what you build but also for its maintenance and monitoring, including incident management and bug fixing. You will also keep the operational controls, a key part of the bank's risk appetite, up to date.
Your activities start at analysis and end at operations. As such, you are a true DevOps engineer!
Keeping yourself up to date on the latest technologies is also part of your job; that's why we have a 'Mastery Day' every three weeks. This is one full day on which you can work on your own skills: a topic from one of our learning modules covering our standards and way of working, or a self-chosen topic within your expertise.
Your work environment:
Our engineers work in multidisciplinary teams based on the Agile methodology and DevOps principles. We have adopted the 'Spotify' model, which means you are part of a matrix organisation: you work in your squad but are also part of a chapter, an expertise-focused group that shares knowledge and creates and governs standards. Your chapter will be 'Data Processing'.
Who are you?
You are an enthusiastic, motivated engineer who likes to work in a team on complex data integration solutions. Innovating to do things smarter and continuous personal development are in your engineering DNA. Your ultimate goal is to make yourself redundant! Rather than repeatedly building the same job, you standardise and automate work, giving yourself the time to take on other engineering challenges and develop further as an engineer.
Automation is not limited to DataStage but extends to the complete end-to-end delivery, including automated testing, continuous delivery and operations. You have a mentality that aligns with our culture, where you 'Take it on and make it happen'.
In addition to a minimum of five years of experience as a DevOps engineer in similar environments, we expect you to have experience with the following tools, techniques and ways of working:
DataStage design and development (or similar ETL tooling)
Data modelling, such as (but not limited to) Data Vault and dimensional modelling
Netezza, Oracle, SQL Server or similar databases
Scheduling tools such as TWS-d and AutoSys
Fluent communication skills in English
Git (or a similar version control system)
Reporting tools, such as (but not limited to) Cognos Analytics and Business Objects
InfoSphere Information Server Enterprise Edition suite components, such as Information Governance Catalogue and Information Analyser
End-to-end delivery of data integration
Both development and operations roles
Agile development (Scrum, Kanban)
Joining forces in a multidisciplinary team