Risk Data Engineer and Architect at Lincoln Financial supporting risk analytics through AWS data solutions. Building scalable data pipelines and collaborating with cross-functional teams.
Responsibilities
Architect cloud-native data solutions using AWS services (S3, Redshift, Glue, EMR, Lambda, Athena) and Databricks (Unity Catalog, Workflows, MLflow) or similar technology to support risk analytics at scale
Build and optimize data pipelines using Python, PySpark, and SQL to integrate structured and unstructured data from internal and external sources
Drive modernization initiatives by migrating legacy on-premises systems to cloud platforms while ensuring zero disruption to critical business services
Implement data governance frameworks including cataloging, lineage tracking, quality monitoring, and access controls aligned with internal controls and regulatory requirements
Collaborate cross-functionally with data scientists, risk managers, and business stakeholders to translate requirements into scalable technical solutions and to drive change management and process improvements
Develop production-grade code following DevOps practices including CI/CD, automated testing, code reviews, and infrastructure-as-code
Manage and optimize relational databases (MySQL, Aurora PostgreSQL, Redshift) for performance, cost efficiency, and data integrity
Mentor engineers and influence technical standards and architectural decisions across the Risk and Investments Technology organization
Monitor and troubleshoot production systems, implementing observability practices and proactive performance tuning
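To give candidates a feel for the pipeline and data-quality work described above, here is a minimal, illustrative sketch: extract records from two sources, apply a simple quality gate, and load the clean rows into a relational store. All names and data are hypothetical, and sqlite3 stands in for a warehouse such as Redshift or Aurora; this is not Lincoln Financial's actual stack.

```python
import sqlite3

def extract():
    # In production this would read from S3, APIs, or vendor feeds;
    # inline records keep the sketch self-contained.
    internal = [{"policy_id": 1, "exposure": 120.5},
                {"policy_id": 2, "exposure": -3.0}]   # fails the quality gate
    external = [{"policy_id": 3, "exposure": 88.0}]
    return internal + external

def validate(rows):
    # Simple quality-monitoring rule: exposures must be non-negative.
    good = [r for r in rows if r["exposure"] >= 0]
    rejected = [r for r in rows if r["exposure"] < 0]
    return good, rejected

def load(rows, conn):
    conn.execute("CREATE TABLE IF NOT EXISTS risk_exposure "
                 "(policy_id INTEGER PRIMARY KEY, exposure REAL)")
    conn.executemany(
        "INSERT INTO risk_exposure VALUES (:policy_id, :exposure)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
good, rejected = validate(extract())
load(good, conn)
total = conn.execute("SELECT SUM(exposure) FROM risk_exposure").fetchone()[0]
```

In a real pipeline the rejected rows would be routed to a quarantine table and surfaced through quality dashboards rather than silently dropped.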
Requirements
Bachelor’s degree in Computer Science, Information Systems, Engineering or related field
5+ years of experience in data engineering or a similar role, implementing a variety of on-premises and cloud data management, integration, and analytics technologies
Production experience with relational databases such as MySQL and SQL Server, including designing schemas, writing complex queries, optimizing performance, and ensuring data integrity for a variety of business applications
Strong experience with end-to-end data architecture, implementing systems built on AWS data storage and management services; advanced proficiency in SQL, Python, YAML, and Bash is required
Thorough understanding of the Software Development Life Cycle (SDLC), including DevOps practices, CI/CD processes, application resiliency, and security measures
Excellent communication skills with ability to translate technical concepts for business audiences
Data governance experience (catalog tools, data quality frameworks, lineage tracking, and compliance controls) is a plus
AWS certifications such as Solutions Architect Associate, Developer Associate, or Data Analytics Specialty, or a Databricks certification, are a plus
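The query-optimization skills listed above can be illustrated with a small, self-contained example: the same filter goes from a full table scan to an index search once a supporting index exists. The table and index names are made up for the sketch, and sqlite3 is used only because it is built into Python; the same principle applies in MySQL, SQL Server, or Redshift.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE trades "
             "(id INTEGER PRIMARY KEY, desk TEXT, notional REAL)")
conn.executemany("INSERT INTO trades (desk, notional) VALUES (?, ?)",
                 [("rates", 100.0), ("fx", 250.0), ("rates", 75.0)])

# Without an index, filtering by desk scans the whole table.
before = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM trades WHERE desk = 'rates'"
).fetchall()

# Adding an index on the filter column turns the scan into a search.
conn.execute("CREATE INDEX idx_trades_desk ON trades (desk)")
after = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM trades WHERE desk = 'rates'"
).fetchall()
```

Inspecting query plans like this before and after schema changes is the day-to-day version of "optimizing performance" in the requirement above.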
Benefits
Clearly defined career tracks and job levels, along with associated behaviors for each of Lincoln's core values and leadership attributes
Leadership development and virtual training opportunities
PTO/parental leave
Competitive 401K and employee benefits
Free financial counseling, health coaching and employee assistance program
Tuition assistance program
Work arrangements that work for you
Effective productivity/technology tools and training