Data Engineer improving healthcare accessibility by building data systems and API workflows. Collaborate with cross-functional teams to enhance data quality and integration strategies.
Responsibilities
Build and maintain robust data systems for collecting, transforming, and storing complex healthcare and financial data. This may include writing short API call scripts, enhancing and maintaining data warehousing systems, and building end-to-end data collection pipelines.
Implement automated API-based workflows that increase operational efficiency for Hive’s stakeholders (patients, healthcare providers, employers, and internal operations teams), supporting a single customer view and utilization management.
Communicate technical information clearly and concisely to non-technical stakeholders.
Perform query-based data exploration in SQL or Python to catalog data, maintain the data dictionary, and move the company toward a self-serve model for dashboards.
Requirements
Experience as a developer, with exposure to data pipelines or data-intensive systems. **Open to fresh graduates and junior candidates.**
Proficiency in SQL and at least one other programming language, preferably Python or JavaScript (e.g., Node.js).
Strong understanding of software design fundamentals and the ability to design scalable systems (e.g., data modeling, technical requirements gathering).
High attention to detail and quality control, with a focus on writing readable code and following engineering best practices. Familiarity with distributed version control systems (e.g., Git) and UNIX-like operating systems (e.g., Ubuntu) is required.
Excellent written communication skills, crucial for our hybrid work environment and asynchronous communications.
Eagerness to learn new skills and tackle challenges related to healthcare data.
Strong problem-solving skills and initiative to identify and solve undefined problems systematically.
Bonus Qualifications
Experience with cloud environments like Google Cloud Platform, AWS, and Azure (Hive uses GCP).
Domain-specific knowledge in healthcare, such as working with medical data (e.g. FHIR, ICD-10) and implementing customer service workflows (e.g., setting up and configuring CRMs).
Interest in using AI and Machine Learning for healthcare applications (e.g., LLMs, Document Intelligence, Medical Standardization).
Knowledge of quality assurance practices, including end-to-end testing and integration testing.
Experience with DevOps practices and tools (e.g., CI/CD, logging, observability).
Knowledge of responsible technology practices, including health data governance.
Benefits
Join a founding team transforming healthcare in the Philippines and beyond.
Comprehensive healthcare benefits.
Hybrid work flexibility and paid time off.
Career development and mentorship opportunities.
Access to global tech and advisory support from institutions like Harvard and Stanford.