Staff Data Engineer at CommBank responsible for building robust AWS Cloud data solutions for Finance, Risk and Treasury. Join a leading Data Engineering team to solve complex data-centric problems.
Responsibilities
Design and deliver an integrated Corporate Services data platform providing a single source of truth for timely and accurate data.
Deliver Risk, Treasury and Finance Data Platform outcomes, and oversee the delivery of all data outcomes to support Finance, Risk and Treasury.
Requirements
Have built or worked on building out data warehouses or data lakes
Are strong in designing and delivering robust solutions
Are familiar with the full software development lifecycle, with a focus on data ingestion processes, data transformation pipelines, data integration and visualisation
Understand data profiling, basics of numerical statistics and data quality calculations
Have experience in handling large-scale data processing and analytics
Can provide code quality control through peer programming, code review and automated pipeline release management
Can coach junior engineers and share knowledge to uplift and improve software development practices
Demonstrated expertise in solution design and discovery, with advanced skills in AWS and ETL processes
Should have a good understanding of overall DevSecOps
Proficient in languages such as Python and SQL
ETL development: design, develop and implement ETL processes using Ab Initio (not mandatory)
In-depth knowledge of Oracle
Experience in relevant AWS data and AI services, such as S3, RDS, Redshift, Glue, Lambda, SageMaker, Bedrock, Amazon Q, Kendra, Neptune, and Augmented AI (A2I) with human oversight
Certified in relevant AWS Associate or Professional certifications, e.g. Data Engineering, AI, Machine Learning
Knowledge of data architectures such as Data Vault 2.0
Understanding how AI can be used to solve real-world problems through data pipelines and machine learning lifecycle management
Building scalable and efficient data pipelines that support AI model training and inference
Leveraging AI agents to automate and accelerate engineering tasks (experience with effective use of the Model Context Protocol (MCP) and automation for DevSecOps is highly desirable)
Data Architect designing and maintaining enterprise data architecture at Envalior. Driving enterprise-wide impact, ensuring scalability and reliability of systems, reporting, and AI initiatives.
Data Engineer role at Valmont focused on data analytics and technology for sustainable agricultural practices. Collaborating with cross-functional teams to enhance data management and analytics tools.
Senior Data Engineer at Barclays building and maintaining data pipelines and warehouses. Collaborating with data scientists and ensuring data accuracy, accessibility, and security.
Lead Data Engineer guiding a team in designing scalable data solutions for iKnowHow S.A. Overseeing development of data pipelines while collaborating with cross-functional teams.
Data Engineer at LPL Financial developing Python-based ETL pipelines. Collaborating with cross-functional teams to ensure reliable data delivery and optimizing pipeline performance.
Senior Data Engineer at Keyrus focusing on data solutions and projects to drive performance. Collaborating with teams globally to enhance data transformation and governance processes.
Data Engineer developing scalable data pipelines for ETL/ELT processes using GCP services. Collaborating with team members to optimize data workflows and ensure data integrity.
Data Governance Engineer in Fintech developing a formal cyber data governance framework. Collaborating with cyber security, analytics, and platform engineering teams on metadata and lineage capabilities.
Junior Data Engineer role at Allegro, focusing on developing ETL/ELT pipelines and processing large datasets. Collaborating with cross-functional teams for data quality and reporting.
Data Engineer at Concept Reply developing innovative data-driven solutions in IoT. Collaborating with teams to unlock the potential of data and cloud computing.