Data Engineer II developing scalable data solutions for GSK's research platform. Collaborating on modular code and ensuring compliance with best practices in data engineering and software development.
Responsibilities
Build modular code, libraries, and services using modern data engineering tools (Python/Spark, Kafka, Storm, …) and orchestration tools (e.g. Google Workflow, Airflow Composer); an illustrative sketch follows this list
Produce well-engineered software, including appropriate automated test suites and technical documentation
Develop, measure, and monitor key metrics for all tools and services, and consistently iterate on and improve them
Apply platform abstractions consistently to ensure quality and consistency in logging and lineage
Stay fully versed in coding best practices and ways of working, participate in code reviews, and partner with the team to improve its standards
Adhere to the QMS framework and CI/CD best practices
Provide L3 support for existing tools, pipelines, and services
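For illustration only, here is a minimal sketch of the kind of orchestrated, modular pipeline work described above, assuming Apache Airflow as the orchestrator; the DAG and task names (example_research_pipeline, ingest_raw_events, transform_events) are hypothetical and not GSK code.

from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def ingest_raw_events(**context):
    # Placeholder: pull raw data from a source system (e.g. a Kafka topic)
    # and land it in cloud storage.
    pass


def transform_events(**context):
    # Placeholder: run a Python/Spark transformation over the landed data.
    pass


with DAG(
    dag_id="example_research_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    ingest = PythonOperator(task_id="ingest_raw_events", python_callable=ingest_raw_events)
    transform = PythonOperator(task_id="transform_events", python_callable=transform_events)

    # Run the transformation only after ingestion succeeds
    ingest >> transform

The point of the sketch is the structure rather than the specifics: small, testable Python callables composed into a scheduled dependency graph, which is what combining modular code with orchestration tools looks like in practice.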
Requirements
Bachelor’s degree in Data Engineering, Computer Science, Software Engineering, or a related discipline
4+ years of data engineering experience
Software engineering experience
Familiarity with orchestration tooling
Cloud experience (GCP, Azure or AWS)
Experience in automated testing and design
Benefits
Health care and other insurance benefits (for employee and family)
Senior Lead Data Engineer at Capital One collaborating with Agile teams and mentoring developers. Leading full-stack development and driving cloud-based solutions for financial empowerment.
Senior Data Engineer responsible for developing data pipelines and collaborating with US business teams. Working at Beghou Consulting, a life sciences company providing advanced analytics and technology solutions.
Data Solutions Architect designing enterprise-scale Azure and Databricks Lakehouse solutions for clinical trials and life sciences data, enabling advanced analytics and compliance.
Data Architect at ADEO ensuring interoperability of IT systems through architecture design and data knowledge diffusion. Collaborating with teams to maintain data integrity and quality standards in an international setup.
Consultant, Data Engineer leading end-to-end data solutions and analytics. Collaborating with clients to improve data strategies and deliver actionable insights.
Big Data Engineer developing applications for Synchrony’s Enterprise Data Lake within an Agile scrum team. Collaborating to deliver high-quality data ingestion and maintain data governance standards.
Data Engineer optimizing data pipelines and cloud solutions for GFT Poland. Involves performance tuning, ETL pipelines, and data model development across multiple locations in Poland.
Junior AI Data Engineer specializing in data-focused solutions for financial services. Collaborating on digital transformation projects across various regions.
Data Engineer building scalable cloud data pipelines in Azure and Databricks. Focus on Lakehouse architecture and data governance in a hybrid work model.