Lead Data Engineer modernizing design standards for imagery models. Collaborate with stakeholders to build and optimize data pipelines and oversee integration efforts.
Responsibilities
Lead and partner with Data Science, DAVS, Architecture and other business stakeholder teams to prioritize use cases, gather requirements, and implement solutions and pipelines for Imagery Models.
Contribute to the Data Products Strategy for Imagery Model Outputs.
Design and own the end-to-end data and ML pipeline architecture for imagery models (ingestion, preprocessing, feature engineering, training, evaluation, deployment).
Integrate models into production systems and APIs with appropriate monitoring and alerting.
Incorporate assurance processes into data solutions.
Guide team members as they build complex data solutions, correct problems, apply transformations, and recommend data cleansing/quality solutions.
Design complex data solutions, including incorporating new data sources and ensuring designs are consistent across projects and aligned to data strategies.
Define and build frameworks for data solutions that can be applied to multiple projects.
Analyze complex data sources to determine their value and apply your subject matter expertise to recommend data to include in analytical processes.
Incorporate core data management competencies including data governance, data security and data quality.
Collaborate and build consensus with leadership and diverse groups of stakeholders when defining, estimating, prioritizing, and planning projects.
Perform data and system analysis, assessment and resolution for defects and incidents of high complexity and correct as appropriate.
Define standards and frameworks for testing on data movement and transformation code and data components.
Perform other duties as assigned.
Requirements
Bachelor's Degree in a STEM-related field or equivalent.
Fifteen years of related experience.
2+ years of experience in imagery pipeline development and reusable frameworks.
Experience integrating imagery workflows into production systems and APIs.
3+ years of experience with AWS Cloud and Python.
2+ years of experience with Databricks.
Expert knowledge of data engineering tools, techniques, and data manipulation, including cloud platforms, programming languages, and modern software engineering practices.
Excellent delivery skills, with the ability to examine and assess the effectiveness of software design strategies and methodologies, and to devise, apply, and share ways to ensure the quality of complex computer systems.
Demonstrated track record of domain expertise, including the ability to improve company-level capabilities within the domain, consult on business priorities, and optimize value by identifying business-aligned solutions.
Strong problem-solving skills, with the ability to create architectures that are particularly robust against single points of failure.
Excellent communication skills, with the ability to describe technology concepts in ways the business can understand, document effectively, and collaborate across disparate groups.
Strong leadership skills, with the ability to engage with other leaders and networks to solve problems and to work to improve the entire engineering organization.
Benefits
Health Insurance: Employees and their eligible family members – including spouses, domestic partners, and children – are eligible for coverage from the first day of employment.
Retirement: Travelers matches your 401(k) contributions dollar-for-dollar up to your first 5% of eligible pay, subject to an annual maximum. If you have student loan debt, you can enroll in the Paying it Forward Savings Program. When you make a payment toward your student loan, Travelers will make an annual contribution into your 401(k) account. You are also eligible for a Pension Plan that is 100% funded by Travelers.
Paid Time Off: Start your career at Travelers with a minimum of 20 days Paid Time Off annually, plus nine paid company Holidays.
Wellness Program: The Travelers wellness program comprises tools, discounts and resources that empower you to achieve your wellness goals and meet your caregiving needs. In addition, our mental health program provides access to free professional counseling services, health coaching and other resources to support your daily life needs.
Volunteer Encouragement: We have a deep commitment to the communities we serve and encourage our employees to get involved. Travelers has a Matching Gift and Volunteer Rewards program that enables you to give back to the charity of your choice.