Staff Data Engineer on the Real World Evidence team, driving large-scale data initiatives and collaborating with cross-functional teams to optimize data pipelines and improve healthcare outcomes.
Responsibilities
Act as a self-starter who drives execution independently, taking ownership with minimal need for day-to-day direction.
Lead high-visibility RWE projects, starting with claims data, and keep multiple initiatives moving by proactively unblocking teams.
Own the end-to-end architecture for critical data assets, ensuring solutions are scalable, reliable, and aligned with H1’s long-term vision.
Design, build, and optimize large-scale data pipelines (hundreds of TBs) for performance, reliability, and cost efficiency.
Partner with Product, Data Science, and downstream engineering teams to align priorities, manage dependencies, and deliver high-value outcomes.
Represent engineering in cross-functional forums, shaping roadmaps and reducing reliance on senior leadership for day-to-day decisions.
Develop deep domain expertise and mentor other engineers, helping raise the technical bar and influence the evolution of our data products.
Requirements
8+ years as a software, data, or backend engineer building and operating scalable, production-grade systems.
Experience with large-scale data processing (e.g., Spark/PySpark on EMR or similar) or scalable distributed backend systems, with the ability to quickly deepen expertise in our data stack (PySpark, EMR, Hudi/Delta).
Strong proficiency in SQL, including writing and optimizing complex queries over large datasets.
Strong programming experience in Python (or a modern language with the ability to quickly ramp up in Python).
Experience designing systems or large-scale datasets/pipelines with attention to performance, reliability, and maintainability.
Hands-on experience with modern engineering workflows and tooling such as Git, JIRA, and CI/CD systems (e.g., CircleCI).
Comfort deploying and troubleshooting distributed workloads in cloud environments such as AWS EMR or Kubernetes.
Experience with workflow orchestration or job scheduling tools (e.g., Airflow, Argo).
Demonstrated ability to independently drive complex, cross-team technical initiatives and influence stakeholders without formal authority.
Experience with streaming/messaging technologies (e.g., Kafka, Kinesis) is nice to have.
Background in RWE, healthcare data, or other complex/regulated data domains is preferred.
Experience using AI-assisted coding tools (e.g., GitHub Copilot, Claude Code) to accelerate development while maintaining quality is encouraged.
Benefits
Full suite of health insurance options, in addition to generous paid time off
Pre-planned company-wide wellness holidays
Retirement options
Health & charitable donation stipends
Impactful Business Resource Groups
Flexible work hours & the opportunity to work from anywhere