Staff Data Engineer for IntelliScript's Data Platform, owning data strategy and governance. Leads projects that ensure data quality and compliance with industry standards.
Responsibilities
Acts as a subject matter expert and thought leader within the Data Platform Domain
Data Strategy: Serves as a thought leader in data processing design and implementation, defining the architecture for moving, storing, and maintaining high-quality data
Team Leadership: Leads projects by managing timelines, coordinating teams, and communicating project status. Influences organizational direction through effective leadership and strategic collaboration
Data Governance and Security: Serves as a subject matter expert on governance standards, continuously aligning data practices with evolving industry best practices and requirements
Project Management and Scope of Work: Contributes to defining the overall vision and strategy for data engineering within the organization, ensuring alignment with organizational goals and long-term objectives
Results Orientation: Establishes visionary goals, advises on strategic plans, employs advanced monitoring, influences high-level stakeholders, and delivers transformative results
Data Platform: Expanding our Data Warehouse and Lakehouse solutions for a healthcare-data-focused enterprise
Data Governance: Configuring and maintaining Unity Catalog to enable enterprise data lineage, data quality, auditability, and data stewardship
Data Security: Building out Data Security protocols and best practices including the management of identified and de-identified (PHI/PII) solutions
Access Management: Ensuring a policy of least privilege is followed for everything implemented (a minimal sketch follows below)
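For illustration, a minimal sketch of what least-privilege Unity Catalog grants can look like; the catalog, schema, and group names are hypothetical, and `spark` is assumed to be available (e.g., in a Databricks notebook or job):

```python
# Minimal Unity Catalog governance sketch; catalog, schema, and group names are
# hypothetical. Intended for a Databricks notebook/job where `spark` is defined.

def apply_least_privilege(spark, catalog: str, schema: str, analyst_group: str) -> None:
    """Grant an analyst group read-only access to one schema and nothing more."""
    # Analysts may discover and read objects in this schema, but not modify them.
    spark.sql(f"GRANT USE CATALOG ON CATALOG {catalog} TO `{analyst_group}`")
    spark.sql(f"GRANT USE SCHEMA ON SCHEMA {catalog}.{schema} TO `{analyst_group}`")
    spark.sql(f"GRANT SELECT ON SCHEMA {catalog}.{schema} TO `{analyst_group}`")
    # Write privileges stay with the owning service principal; no blanket grants
    # to `account users`, keeping the policy of least privilege intact.

# Example usage (hypothetical names):
# apply_least_privilege(spark, "clinical", "silver", "data-analysts")
```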
External Data Products: Building data solutions for clients while upholding the best standards for reliability, quality, and performance
ETL: Building solutions within Delta Live Tables and automating transformations (see the pipeline sketch below)
Medallion Architecture: Building out performant enterprise-level medallion architecture(s)
Streaming and Batch Processing: Building fit-for-purpose near real-time streaming and batch solutions
Large Data Management: Building out performant and efficient enterprise solutions for internal and external users for both structured and unstructured healthcare data
Platform Engineering: Building out Infrastructure as Code using Terraform and Asset Bundles
Costs: Working with the business to build cost-effective and cost-transparent data solutions
Pipeline/ETL Management: You will help architect, build, and maintain robust, scalable data pipelines, monitoring and optimizing their performance
Experience working with migration tools (e.g., Fivetran), AWS technologies, and custom solutions
Identify and implement improvements to enhance data processing efficiency
Design and implement reliable and resilient event-driven data processing
Experience with building out effective pipeline monitoring solutions
Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL, Delta Live Tables, Python, Scala, and cloud-based ‘big data’ technologies
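To make the Delta Live Tables and medallion items above concrete, here is a minimal bronze-to-silver pipeline sketch; the landing path, column names, and data-quality expectation are hypothetical, and `spark` and `dlt` are provided by the DLT runtime:

```python
# Minimal Delta Live Tables sketch of a bronze -> silver medallion flow.
# The landing path, columns, and expectation below are illustrative only.
import dlt
from pyspark.sql import functions as F

RAW_PATH = "s3://example-bucket/raw/claims/"  # hypothetical landing zone

@dlt.table(comment="Bronze: raw claim files ingested as-is via Auto Loader")
def claims_bronze():
    return (
        spark.readStream.format("cloudFiles")      # Auto Loader incremental ingest
        .option("cloudFiles.format", "json")
        .load(RAW_PATH)
    )

@dlt.table(comment="Silver: typed, de-duplicated claims with a basic quality check")
@dlt.expect_or_drop("valid_claim_id", "claim_id IS NOT NULL")
def claims_silver():
    return (
        dlt.read_stream("claims_bronze")
        .withColumn("service_date", F.to_date("service_date"))
        .dropDuplicates(["claim_id"])
    )
```

The same pattern extends to gold-layer aggregates, and batch sources can reuse it by swapping the streaming reads for `dlt.read`.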
API Development: Drive the design and implementation of internal APIs for integrating data between different systems and applications
Integration with external systems utilizing API driven processes to ingest data
Develop APIs built on top of datasets for internal systems to consume data from Databricks (see the API sketch below)
Experience integrating with external APIs, including but not limited to Salesforce, financial systems, and HR systems
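As one hedged illustration of such an internal API, a small read-only endpoint built with FastAPI and the Databricks SQL connector; the table name, route, and environment variable names are hypothetical:

```python
# Minimal sketch of a read-only internal API over a Databricks dataset.
# The table, route, and environment variable names are hypothetical.
import os
from fastapi import FastAPI
from databricks import sql  # databricks-sql-connector

app = FastAPI()

def query_warehouse(statement: str, params: dict) -> list[dict]:
    """Run a parameterized query against a Databricks SQL warehouse."""
    with sql.connect(
        server_hostname=os.environ["DATABRICKS_HOST"],
        http_path=os.environ["DATABRICKS_HTTP_PATH"],
        access_token=os.environ["DATABRICKS_TOKEN"],
    ) as conn, conn.cursor() as cursor:
        cursor.execute(statement, params)
        cols = [c[0] for c in cursor.description]
        return [dict(zip(cols, row)) for row in cursor.fetchall()]

@app.get("/members/{member_id}/claims")
def get_member_claims(member_id: str):
    # Named parameter markers (:member_id) keep the query injection-safe
    # (supported by recent versions of the connector).
    return query_warehouse(
        "SELECT claim_id, service_date, status "
        "FROM clinical.gold.claims WHERE member_id = :member_id",
        {"member_id": member_id},
    )
```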
Data Modeling: Lead design, implementation, and maintenance of standards-based (FHIR, OMOP, etc.) and efficient data models for both structured and unstructured data (see the schema sketch below)
Assemble large, complex data sets that meet functional and non-functional business requirements
Develop and maintain data models, ensuring they align with business objectives and data privacy regulations
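As a sketch of what a standards-aligned model can look like, here is a column subset loosely following the OMOP CDM person table; the target table name and exact columns are illustrative, not a complete CDM implementation:

```python
# Illustrative schema loosely following the OMOP CDM "person" table.
# The column subset and target table name are hypothetical, not a full CDM build.
from pyspark.sql.types import (
    StructType, StructField, LongType, IntegerType, StringType,
)

person_schema = StructType([
    StructField("person_id", LongType(), nullable=False),
    StructField("gender_concept_id", IntegerType(), nullable=False),
    StructField("year_of_birth", IntegerType(), nullable=False),
    StructField("race_concept_id", IntegerType(), nullable=True),
    StructField("ethnicity_concept_id", IntegerType(), nullable=True),
    StructField("person_source_value", StringType(), nullable=True),
])

# Materializing it as a governed Delta table (assumes `spark` and Unity Catalog):
# spark.createDataFrame([], person_schema) \
#      .write.format("delta").saveAsTable("clinical.silver.person")
```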
Collaboration: Partner internally and externally with key stakeholders to ensure we are providing meaningful, functional, and valuable data
Effectively work with Data, Development, Analysts, Data Science, and Business team members to gather requirements, propose, and build solutions
Communicate complex technical concepts to non-technical stakeholders and provide guidance on best practices
Ensure that technology execution aligns with business strategy and provides efficient, secure solutions and systems
Gather requirements and build out project plans, with forecasted effort, to implement those requirements
Processes and Tools: Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
Build analytics tools that utilize the data pipeline to provide actionable insights into operational efficiency and other key business performance metrics
Create data tools for clinical, analytics, and data science team members that help them build and optimize our product into an innovative industry leader
Lead the investigation of new tooling, develop implementation plans, and deploy the necessary tooling
Requirements
15+ years of relevant experience in design, development, and testing of Data Platform solutions, such as Data Warehouses, Data Lakes, and Data Products
Expert-level experience working in Databricks and AWS
Expert-level experience working in both relational and non-relational databases such as SQL Server, PostgreSQL, DynamoDB, and DocumentDB
Experience managing and standardizing clinical data from structured and unstructured sources
Experience building and managing solutions on AWS
Expert knowledge in healthcare standards including FHIR, C-CDA, and traditional HL7
Expert knowledge in clinical standards/ontologies including ICD-10/SNOMED/NDC/LOINC/RxNorm
Expert in building out data models and data warehouses and designing data lakes for enterprise and product use cases
Familiarity with designing and building APIs, ETL, and data ingestion processes, and with utilizing tools to support enterprise solutions
Experience in performance tuning, query optimization, security, monitoring, and release management
Experience working with and managing large, disparate, identified and de-identified data sets from multiple data sources
Experience with building and deploying IaC using Terraform, asset bundles, and GitHub
Experience collaborating with Data Science teams and building AI-based solutions to drive efficiencies and business value
Benefits
Medical, Dental and Vision – Coverage for employees, dependents, and domestic partners
Employee Assistance Program (EAP) – Confidential support for personal and work-related challenges
401(k) Plan – Includes a company matching program and profit-sharing contributions
Discretionary Bonus Program – Recognizing employee contributions
Flexible Spending Accounts (FSA) – Pre-tax savings for dependent care, transportation, and eligible medical expenses
Paid Time Off (PTO) – Begins accruing on the first day of work. Full-time employees accrue 15 days per year, and employees working less than full-time accrue PTO on a prorated basis
Holidays – A minimum of 10 paid holidays per year
Family Building Benefits – Includes adoption and fertility assistance
Paid Parental Leave – Up to 12 weeks of paid leave for employees who meet eligibility criteria
Life Insurance & AD&D – 100% of premiums covered by Milliman
Short-Term and Long-Term Disability – Fully paid by Milliman