This is an exciting opportunity to use your technical expertise to collaborate with colleagues and build effortless, digital-first customer experiences. You'll be simplifying the bank by developing innovative, data-driven solutions, using insight to be commercially successful, and keeping our customers' and the bank's data safe and secure. Participating actively in the data engineering community, you'll deliver opportunities to support the bank's strategic direction while building your network across the bank.
What you'll do
As a Data Engineer, you'll play a key role in delivering value for our customers by building data solutions. You'll be carrying out data engineering tasks to build a scalable data architecture, including carrying out data extractions, transforming data to make it usable to analysts and data scientists, and loading data into data platforms.
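The extract-transform-load cycle described above can be sketched in miniature. The sample data, table name, and cleaning rules below are hypothetical illustrations, not the bank's actual pipelines, and only the Python standard library is used:

```python
# Minimal ETL sketch: extract raw CSV, transform (clean and normalise),
# load into a target table. All names and data here are hypothetical.
import csv
import io
import sqlite3

RAW_CSV = """customer_id,amount,currency
1001,250.00,GBP
1002,,GBP
1003,99.50,gbp
"""

def extract(raw):
    """Extract: parse raw CSV text into rows."""
    return list(csv.DictReader(io.StringIO(raw)))

def transform(rows):
    """Transform: drop rows with missing amounts, normalise currency codes."""
    cleaned = []
    for row in rows:
        if not row["amount"]:
            continue  # basic data-quality filter: skip incomplete records
        cleaned.append((int(row["customer_id"]),
                        float(row["amount"]),
                        row["currency"].upper()))
    return cleaned

def load(rows, conn):
    """Load: write cleaned rows into the target table."""
    conn.execute("CREATE TABLE IF NOT EXISTS transactions "
                 "(customer_id INTEGER, amount REAL, currency TEXT)")
    conn.executemany("INSERT INTO transactions VALUES (?, ?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract(RAW_CSV)), conn)
count = conn.execute("SELECT COUNT(*) FROM transactions").fetchone()[0]
```

In a production pipeline the same three stages would typically run on distributed tooling such as PySpark or an ETL platform, with automated quality checks at the transform step.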
You'll also be:
- Developing comprehensive knowledge of the bank's data structures and metrics, advocating change where needed for product development
- Building automated data engineering pipelines through the removal of manual stages
- Working closely with core technology and architecture teams in the bank to build data knowledge and data solutions
- Developing a clear understanding of data platform cost levers to build cost-effective and strategic solutions
The skills you'll need
To be successful in this role, you'll need to be an entry-level programmer and data engineer with a qualification in computer science or software engineering. You'll also need a good understanding of data usage and dependencies with wider teams and the end customer, as well as a proven track record in extracting value and features from large-scale data.
You'll also demonstrate:
- Good critical thinking and proven problem-solving capabilities
- Experience of ETL technical design, automated data quality testing, QA and documentation, data warehousing, data modelling, and data wrangling
- Experience using ETL tools such as Informatica PowerCenter or Informatica BDM, which will be an advantage
- Extensive experience using RDBMS, ETL pipelines, Hadoop, and SQL
- Experience in one or more of the following languages: Python, PySpark, Scala, and Java, which will be desirable
- Experience in StreamSets, Snowflake, AWS, and other cloud technologies, which will be an advantage
- A good understanding of modern code development practices