GCP Data Engineer designing and developing data processing modules for Ki, an algorithmic insurance carrier. Working closely with multiple teams to optimize data pipelines and reporting.
Responsibilities
Work with business teams (initially finance and actuarial), data scientists, and engineers to design, build, optimise, and maintain production-grade data pipelines and reporting from an internal data warehouse solution based on GCP/BigQuery
Work with finance, actuaries, data scientists and engineers to understand how we can make best use of new internal and external data sources
Work with our delivery partners at EY/IBM to ensure robust design and engineering of the data model, MI, and reporting that can support our ambitions for growth and scale
Take business-as-usual (BAU) ownership of data models, reporting, and integrations/pipelines
Create frameworks, infrastructure and systems to manage and govern Ki’s data asset
Produce detailed documentation to allow ongoing BAU support and maintenance of data structures, schemas, reporting, etc.
Work with the broader Engineering community to develop our data and MLOps capability infrastructure
Ensure data quality, governance, and compliance with internal and external standards.
Monitor and troubleshoot data pipeline issues, ensuring reliability and accuracy.
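The data-quality and pipeline-monitoring duties above can be sketched as a minimal row-validation step that quarantines bad records rather than letting them pollute downstream reporting. The column names and rules here are hypothetical illustrations, not Ki's actual schema:

```python
# Hypothetical row-level quality rules for a premium bookings feed
# (illustrative only; a real schema would live in dbt tests or similar).
RULES = {
    "policy_id": lambda v: isinstance(v, str) and len(v) > 0,
    "gross_premium": lambda v: isinstance(v, (int, float)) and v >= 0,
    "currency": lambda v: v in {"USD", "GBP", "EUR"},
}

def validate_rows(rows):
    """Split rows into (valid, rejected) lists; each rejected entry
    carries the names of the rules it failed, for troubleshooting."""
    valid, rejected = [], []
    for row in rows:
        failures = [col for col, rule in RULES.items()
                    if not rule(row.get(col))]
        if failures:
            rejected.append((row, failures))
        else:
            valid.append(row)
    return valid, rejected

rows = [
    {"policy_id": "P-001", "gross_premium": 1200.0, "currency": "GBP"},
    {"policy_id": "", "gross_premium": -5, "currency": "XYZ"},
]
good, bad = validate_rows(rows)
```

In practice the rejected list would feed an alerting or dead-letter table so pipeline issues surface quickly instead of silently skewing reports.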
Requirements
Experience designing data models and developing industrialised data pipelines
Strong knowledge of database and data lake systems
Hands-on experience with BigQuery, dbt, and GCP Cloud Storage
Proficient in Python, SQL and Terraform
Knowledge of Cloud SQL, Airbyte, Dagster
Comfortable with shell scripting with Bash or similar
Experience provisioning new infrastructure in a leading cloud provider, preferably GCP
Proficient with Tableau Cloud for data visualization and reporting
Experience creating DataOps pipelines
Comfortable working in an Agile environment, actively participating in approaches such as Scrum or Kanban
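To give a flavour of the "industrialised pipelines" and DataOps discipline the requirements describe: transformations are written as small, unit-testable functions (the kind of roll-up a dbt model would express in SQL). The booking fields below are hypothetical examples, not a real Ki schema:

```python
from collections import defaultdict

def aggregate_premium_by_class(bookings):
    """Roll gross written premium up to class-of-business level,
    the kind of summarisation an MI/reporting layer performs.
    Keeping it a pure function makes it trivial to test in CI."""
    totals = defaultdict(float)
    for booking in bookings:
        totals[booking["class_of_business"]] += booking["gross_premium"]
    return dict(totals)

bookings = [
    {"class_of_business": "Marine", "gross_premium": 500.0},
    {"class_of_business": "Marine", "gross_premium": 250.0},
    {"class_of_business": "Cyber", "gross_premium": 900.0},
]
summary = aggregate_premium_by_class(bookings)
```

The same assertion-style checks would run in a CI step on every change, which is the core of a DataOps pipeline regardless of whether the logic lives in Python or dbt SQL.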
Desirable Skills
Experience with streaming data systems and frameworks would be a plus
Experience working in a regulated industry, especially financial services, would be a plus
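On the streaming point above, the essential shift is from batch materialisation to incremental, per-event computation. This toy sketch uses a plain Python generator to make that idea concrete; a production system would sit on something like Pub/Sub or Kafka instead:

```python
def running_average(stream):
    """Incrementally maintain a running mean over an unbounded
    event stream, emitting one updated value per event rather
    than waiting for a complete batch."""
    total = 0.0
    for count, value in enumerate(stream, start=1):
        total += value
        yield total / count

# Simulated event stream of, say, per-policy premium amounts.
events = iter([10.0, 20.0, 30.0])
averages = list(running_average(events))
```

The generator never holds the full history, only the running state, which is the property that lets streaming frameworks handle unbounded data.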
Data Engineer at Securian Financial optimizing scalable data pipelines for AI and advanced analytics. Collaborating with teams to deliver secure and accessible data solutions.
IT Data Engineering Co‑Op at BlueRock Therapeutics supports development of scientific data systems. Collaboration on data workflows and foundational AWS data engineering tasks.
Data Engineer I building and operationalizing complex data solutions for Travelers' analytics using Databricks. Collaborating within teams to educate end users and support data governance.
Data Engineer shaping modern data architecture to drive golf’s digital transformation. Collaborating with teams to enhance data pipelines and insights for customer engagement and revenue growth.
Staff Data Engineer overseeing complex data systems for CITY Furniture. Responsible for architecting and optimizing data ecosystems in a hybrid work environment.
Data Engineer strengthening data platform team at Samba TV to improve data analytics and reporting capabilities. Building on AWS, Databricks, BigQuery, and Snowflake technology.
Data Engineer focusing on secure ETL/ELT data pipelines and compliance in healthcare. Designing scalable ingestion frameworks and ensuring alignment with federal standards.
Data Migration Engineer at Capgemini delivering migration solutions for Guidewire Claim Center. Collaborating on cloud data migrations and validating processes in a sustainable tech environment.
Data Engineer responsible for collecting and analyzing data at Cruise Planners. Collaborating with teams to deliver actionable insights using MySQL and Power BI.
Data Engineer for Leader Entertainment developing data solutions on Google Cloud Platform. Collaborating on data models, pipelines, and analytics in a hybrid role.