Senior Data Engineer shaping how data drives processes at Phoenix Group. Working with cross-functional teams on cloud platforms including Databricks and Azure.
Responsibilities
Design and implement end-to-end data engineering solutions across multiple platforms, including Azure, Databricks, SQL Server, and Salesforce, enabling seamless data integration and interoperability
Architect and optimize Delta Lake environments within Databricks to support scalable, reliable, and high-performance data pipelines for both batch and streaming workloads
Develop and manage robust data pipelines for operational, analytical, and digital use cases, leveraging best practices for data ingestion, transformation, and delivery
Integrate diverse data sources—cloud, on-premises, and third-party systems—using connectors, APIs, and ETL frameworks to ensure consistent and accurate data flow across the enterprise
Implement advanced data storage and retrieval strategies that support operational data stores (ODS), transactional systems, and analytical platforms
Collaborate with cross-functional teams (data scientists, analysts, product owners, and operational leaders) to embed data capabilities into business processes and digital services
Optimize workflows for performance and scalability, addressing bottlenecks and ensuring efficient processing of large-scale datasets
Apply security and compliance best practices, safeguarding sensitive data and ensuring adherence to governance and regulatory standards
Create and maintain comprehensive documentation for data architecture, pipelines, and integration processes to support transparency and knowledge sharing
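The ingestion, transformation, and delivery pattern these responsibilities describe can be sketched in plain Python (hypothetical record structure and function names; a production pipeline would use Databricks, Delta Lake, or ADF APIs rather than in-memory lists):

```python
def extract(rows):
    """Ingest raw records, dropping malformed entries that lack an 'id'."""
    return [r for r in rows if "id" in r]

def transform(rows):
    """Normalize field types and derive a simple data-quality flag."""
    return [
        {
            "id": r["id"],
            "amount": float(r.get("amount", 0)),
            "valid": r.get("amount") is not None,  # flag records missing a source amount
        }
        for r in rows
    ]

def load(rows, sink):
    """Deliver transformed records to the target store (a list stands in here)."""
    sink.extend(rows)
    return len(rows)

# Example run: one clean record, one missing 'id', one missing 'amount'
raw = [{"id": 1, "amount": "9.5"}, {"amount": "3"}, {"id": 2}]
sink = []
loaded = load(transform(extract(raw)), sink)
```

The same extract/transform/load separation carries over to Delta Lake pipelines, where each stage typically maps to a bronze, silver, or gold table.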
Requirements
Proven experience in enterprise-scale data engineering, with a strong focus on cloud platforms (Azure preferred) and cross-platform integration (e.g., Azure ↔ Salesforce, SQL Server)
Deep expertise in Databricks and Delta Lake architecture, including designing and optimizing data pipelines for batch and streaming workloads
Strong proficiency in building and managing data pipelines using modern ETL/ELT frameworks and connectors for diverse data sources
Hands-on experience with operational and analytical data solutions, including ODS, data warehousing, and real-time processing
Solid programming skills in Python, Scala, and SQL, with experience in performance tuning and workflow optimization
Experience with cloud-native services (Azure Data Factory, Synapse, Event Hub, etc.) and integration patterns for hybrid environments
Data Engineer role focused on creating and monitoring data pipelines in an innovative energy company. Collaborating with IT and business departments to ensure the availability of quality data in a hybrid work environment.
SQL Migration Data Engineer at Auxo Solutions focusing on Azure SQL/Fabric Lakehouse migrations and building data pipelines. Collaborating on technical designs and data governance for modernization initiatives.
Data Engineer developing cloud solutions and software tools on Microsoft Azure big data platform. Collaborating with various teams for data analysis and visualization in healthcare.
Boomi Integration Architect designing and leading integration solutions for data warehouses. Collaborating with cross-functional teams to implement scalable integration patterns using Boomi technologies.
Seeking a Boomi Integration Architect specializing in Data Warehouse and Master Data Hub implementations. Responsible for designing high-performance integration solutions across enterprise platforms.
Principal Data Engineer at Serko enhancing global travel tech through data-driven solutions. Collaborating across teams in Bengaluru to drive innovative engineering and best practices.
Data Engineer at Keyrus responsible for building and optimizing data pipelines for major projects. Contributing to data solutions and ensuring data quality in a growing team.
Data Architect designing and implementing scalable data architectures for Keyrus in Bordeaux. Leading client transitions and contributing to the tech ecosystem with innovative data solutions.