Director of Data Engineering leading a team to deliver scalable data solutions at The Hartford. Collaborating across business units on innovative data-driven strategies and machine learning integration.
Responsibilities
Actively lead and develop a team of data engineers to deliver and maintain reusable and sustainable data assets and production pipelines that assist the functional business units in meeting their strategic objectives
Collaborate with Data Scientists and Product Owners to advise on approaches to business opportunities
Develop the design and vision for data pipelines and testing frameworks, both batch and real-time, to meet business needs while balancing maintainability, reliability, security, and scalability.
Provide guidance on and independently navigate data warehouses, understand data architectures, and join disparate data sources to ensure quality and appropriateness of data solutions.
Lead the use and development of GitHub best practices for version control, documentation, and code collaboration throughout the data science lifecycle, ensuring solutions align with those standards
Contribute to and execute a multi-year roadmap to build and remediate enterprise-grade data assets leveraging cloud-based target state technology and architecture
Drive disciplined innovation by balancing a relentless focus on delivering results and customer adoption with out-of-the-box thinking and a continuous improvement mindset
Coordinate activities with cross-functional IT unit stakeholders (e.g., database, operations, telecommunications, technical support, etc.)
Requirements
8+ years of relevant experience recommended
Bachelor’s degree in Computer Science, Engineering, IT, MIS, or a related discipline
Experience in managing Data Engineering teams in an Agile environment
Experience designing and delivering scalable data engineering solutions that integrate GenAI capabilities and agentic AI systems
Expertise in Python and SQL
Expertise in ingesting data from a variety of sources, including relational databases, Hadoop/Spark, cloud data sources, XML, and JSON
Expertise in ETL, including metadata management and data validation
Expertise in Unix and Git
Expertise in Automation tools (Autosys, Cron, Airflow, etc.)
Experience with AWS services (e.g., S3, EMR) or GCP
Experience with cloud data warehouses, automation, and data pipelines (e.g., Snowflake, Redshift) a plus
Able to communicate effectively with both technical and non-technical teams
Able to translate complex technical topics into business solutions and strategies, and to turn business requirements into technical solutions
Experience with leading project execution and driving change to core business processes through the innovative use of quantitative techniques
Candidates must be authorized to work in the US without company sponsorship. The company will not support the STEM OPT I-983 Training Plan endorsement for this position.
Associate Data Engineer building and maintaining data systems at Incedo. Transforming raw information into accessible datasets for decision-making and collaborating with analysts and data scientists.
Intern Data Engineer at Flutter Studios learning to create applications in the gaming industry. Join a cross-functional team to implement a basic application in an Agile environment.
AI/Data Engineer at Comcast developing data pipelines and AI solutions for audit processes. Leading team efforts to ensure data quality and compliance with audit objectives across business units.
Senior Data Engineer involved in AWS Cloud and Big Data solutions for Financial Crime. Join CommBank's team to tackle complex data-centric problems with advanced technologies.
Senior Data Engineer leading and mentoring a team in building scalable data pipelines for digital transformation projects in an international software company.
Senior Data Operations Engineer at iKnowHow designing and implementing scalable data-driven applications. Focus on data pipelines, APIs, and collaboration across teams for project success.
Data/AI Engineer supporting innovations in corporate travel with AWS technologies at HRS Group. Collaborating with data teams to develop AI solutions and maintain data pipelines.
Data Engineer developing data solutions for reporting and analytics at Assurant. Implementing and optimizing Data Warehouse solutions in cloud and on-premises environments with Agile methodology.
AI Engineer developing AI-driven analytics for trading at Deloitte. Focusing on scalable data pipelines and collaboration with traders for actionable insights.