Senior Data Engineer at CVS Health developing robust data pipelines for healthcare data. Collaborating with teams to provide actionable insights and integrate them with consumer touchpoints.
Responsibilities
Architect and develop robust, scalable ETL/ELT pipelines using Cloud Dataflow, Cloud Composer (Airflow), and Pub/Sub for both batch and streaming use cases
Leverage BigQuery as the central data warehouse and design integrations with other GCP services (e.g., Cloud Storage, Cloud Functions)
Build and optimize analytical data models in BigQuery
Implement partitioning, clustering, and materialized views for performance and cost efficiency
Ensure compliance with data governance, access controls, and IAM best practices
Develop integrations with external systems (APIs, flat files, etc.) using GCP-native or hybrid approaches
Utilize tools like Dataflow or custom Python/Java services on Cloud Functions or Cloud Run to handle transformations and ingestion logic
Build automated CI/CD pipelines using Cloud Build, GitHub Actions, or Jenkins for deploying data pipeline code and workflows
Set up observability using Cloud Monitoring, Cloud Logging, and Error Reporting to ensure pipeline reliability
Lead architectural decisions for data platforms and mentor junior engineers on cloud-native data engineering patterns
Promote best practices for code quality, version control, cost optimization, and data security in a GCP environment
Drive initiatives around data democratization, including building reusable datasets and data catalogs via Dataplex or Data Catalog
Requirements
3+ years of experience with SQL and NoSQL databases
3+ years of experience with Python (or a comparable scripting language)
3+ years of experience with data warehouses (including data modeling and technical architecture) and infrastructure components
3+ years of experience with ETL/ELT, and building high-volume data pipelines
3+ years of experience with reporting/analytic tools
3+ years of experience with query optimization, data structures, transformation, metadata, dependency, and workload management
3+ years of experience with big data and cloud architecture
3+ years of hands-on experience building modern data pipelines within a major cloud platform (GCP, AWS, Azure)
3+ years of experience with deployment/scaling of apps in containerized environments (e.g., Kubernetes, AKS)
3+ years of experience with real-time and streaming technology (e.g., Azure Event Hubs, Azure Functions, Kafka, Spark Streaming)
1+ year(s) of experience soliciting complex requirements and managing relationships with key stakeholders
1+ year(s) of experience independently managing deliverables
Benefits
Affordable medical plan options
401(k) plan (including matching company contributions)
Employee stock purchase plan
No-cost programs for all colleagues including wellness screenings, tobacco cessation and weight management programs
Confidential counseling and financial coaching
Paid time off
Flexible work schedules
Family leave
Dependent care resources
Colleague assistance programs
Tuition assistance
Retiree medical access and many other benefits depending on eligibility
Director leading strategy, governance, and delivery of enterprise data platform at Phillips 66. Partnering with AI, Data Science, and business teams to enhance analytics and business systems.
Product Owner driving ERP data migration initiatives for BioNTech’s global landscape. Leading effective data management and ensuring compliance with regulatory standards in a fast-paced environment.
Data Engineer II leading development and delivery of data pipelines for Syneos Health. Collaborating with teams to optimize data processing and integrate solutions into production environments.
Lead Data Engineer overseeing data operations and analytics engineering teams for OneOncology. Focused on operational excellence in data platform and model reliability for cancer care improvement.
Senior AWS Software Data Engineer at Boeing focusing on AWS Data services to support digital analytics capabilities. Collaborating with cross-functional teams to design, develop, and maintain software data solutions.
Senior Data Engineer designing and improving software for business capabilities at Barclays. Collaborating with teams to build a data and intelligence platform for Equity Derivatives.
Senior AI & Data Engineer developing and implementing AI solutions in collaboration with clients and teams. Working on projects involving generative AI, predictive analytics, and data mastery.
Consultant driving AI business growth in Deloitte's Artificial Intelligence & Data team. Delivering innovative solutions using data analytics and automation technologies.
Data Engineer responsible for managing data architecture and pipelines at Snappi, a neobank. Collaborating with teams to enable data processing and analysis in innovative banking solutions.
Data Engineer at Destinus developing the data platform to support production and analytics needs. Involves migrating Excel sources to Lakehouse and integrating ERP systems in a hybrid role.