Senior Data Engineer designing and implementing Data Platform solutions for a healthcare enterprise at Milliman IntelliScript, collaborating with cross-functional teams while ensuring data privacy compliance.
Responsibilities
Data Platform: Creating Databricks Data Warehouse and Lakehouse solutions for a healthcare-data-focused enterprise
Data Governance: Configuring and maintaining Unity Catalog to enable enterprise data lineage and data quality
Data Security: Building out data security protocols and best practices, including the management of identified and de-identified (PHI/PII) solutions
External Data Products: Building data solutions for clients while upholding high standards for reliability, quality, and performance
ETL: Building solutions with Delta Live Tables and automating transformations
Medallion Architecture: Building out performant enterprise-level medallion architecture(s)
Streaming and Batch Processing: Building fit-for-purpose near real-time streaming and batch solutions
Large Data Management: Building out performant, efficient enterprise solutions for internal and external users working with both structured and unstructured healthcare data
Platform Engineering: Building out Infrastructure as Code using Terraform and Asset Bundles
Costs: Working with the business to build cost effective and cost transparent Data solutions
Pipeline/ETL Management: Architecting, building, and maintaining robust, scalable data pipelines, and monitoring and optimizing their performance
Identify and implement improvements to enhance data processing efficiency
Develop and maintain data models, ensuring they align with business objectives and data privacy regulations
Collaborate internally and externally with key stakeholders to ensure we are providing meaningful, functional, and valuable data
Ensure that technology execution aligns with business strategy and provides efficient, secure solutions and systems
Build analytics tools that utilize the data pipeline to provide actionable insights into operational efficiency and other key business performance metrics
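As a purely illustrative sketch (not part of the posting) of the bronze-to-silver medallion and de-identification work described above; the field names, salt, and records below are hypothetical assumptions:

```python
import hashlib

# Hypothetical bronze-layer records: raw rows containing PHI fields.
bronze = [
    {"patient_id": "P001", "name": "Jane Doe", "dob": "1980-04-02", "icd10": "e11.9 "},
    {"patient_id": "P002", "name": "John Roe", "dob": "1975-11-19", "icd10": " I10"},
]

def to_silver(row):
    """Normalize clinical codes and replace direct identifiers with a salted hash."""
    # Salted hash stands in for a real tokenization service (illustrative only).
    token = hashlib.sha256(("demo-salt:" + row["patient_id"]).encode()).hexdigest()[:16]
    return {
        "patient_token": token,                 # de-identified surrogate key
        "dob_year": row["dob"][:4],             # generalize date of birth to year
        "icd10": row["icd10"].strip().upper(),  # standardize the ICD-10 code
    }

silver = [to_silver(r) for r in bronze]
print(silver[0]["icd10"])  # E11.9
```

In a Databricks setting this kind of transformation would typically live in a Delta Live Tables pipeline rather than plain Python; the point here is only the shape of the cleanup and de-identification step.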
Requirements
5+ years of relevant experience in design, development, and testing of Data Platform solutions, such as Data Warehouses, Data Lakes, and Data Products
Expert-level experience working in Databricks and AWS
Expert-level experience working with both relational and non-relational databases such as SQL Server, PostgreSQL, and MongoDB
Experience managing and standardizing clinical data from structured and unstructured sources
Experience building and managing solutions on AWS
Knowledge in healthcare standards including FHIR, C-CDA, and traditional HL7
Knowledge in clinical standards/ontologies including ICD-10, SNOMED, NDC, LOINC, and RxNorm
Experience building data models and data warehouses, and designing data lakes for enterprise and product use
Familiarity with designing and building APIs, ETL, and data ingestion processes, and with tools that support enterprise solutions
Experience in performance tuning, query optimization, security, monitoring, and release management
Experience working with and managing large, disparate, identified and de-identified data sets from multiple data sources
Familiarity with building and deploying IaC using Terraform, Asset Bundles, and GitHub
Benefits
Medical, Dental and Vision – Coverage for employees, dependents, and domestic partners
Employee Assistance Program (EAP) – Confidential support for personal and work-related challenges
401(k) Plan – Includes a company matching program and profit-sharing contributions
Discretionary Bonus Program – Recognizing employee contributions
Flexible Spending Accounts (FSA) – Pre-tax savings for dependent care, transportation, and eligible medical expenses
Paid Time Off (PTO) – Begins accruing on the first day of work. Full-time employees accrue 15 days per year, and employees working less than full-time accrue PTO on a prorated basis
Holidays – A minimum of 10 paid holidays per year
Family Building Benefits – Includes adoption and fertility assistance
Paid Parental Leave – Up to 12 weeks of paid leave for employees who meet eligibility criteria
Life Insurance & AD&D – 100% of premiums covered by Milliman
Short-Term and Long-Term Disability – Fully paid by Milliman
Job title
Senior Data Engineer, Data Platform – IntelliScript