Senior Fabric Data Engineer designing and optimizing data solutions using Microsoft Fabric for diverse clients. Collaborating with teams and mentoring junior engineers in data platform initiatives.
Responsibilities
Design, build, and optimize enterprise-grade data solutions using Microsoft Fabric.
Collaborate with cross-functional teams to deliver high-quality data solutions.
Mentor junior engineers and share best practices in Data Engineering.
Develop Lakehouse architectures and implement data pipelines for various workloads.
Integrate and utilize Power BI for enterprise analytics solutions.
Requirements
7–10+ years of experience in Data Engineering.
Strong hands-on experience with Microsoft Fabric, including at least one production implementation.
Design and development of Lakehouse architectures using Microsoft Fabric, including OneLake, Delta tables, and Medallion architecture.
Development of Dataflows Gen2, Notebooks (PySpark / SQL), and Pipelines for data ingestion, transformation, and orchestration.
Implementation of end-to-end data pipelines using Data Factory (Fabric), Spark notebooks, and streaming workloads.
Advanced-to-expert proficiency in SQL and PySpark / Spark SQL.
Design of high-performance data pipelines for batch and real-time workloads.
Development of semantic models and governed datasets to support enterprise analytics.
Collaboration with Power BI developers to design DAX measures, relationships, and optimized data models.
Implementation of Git integration and deployment pipelines for Fabric workloads.
Experience with Eventstream, Real-Time Hub, or KQL (Nice to have).
Experience implementing data governance frameworks, lineage, cataloging, and metadata management using tools such as Purview (Nice to have).
Strong collaboration experience working with cross-functional teams, including Data Scientists, BI Engineers, Domain Owners, and Business Stakeholders.
Ability to mentor junior engineers and contribute to engineering best practices.
Experience supporting modern data platform initiatives and contributing to Centers of Excellence, reusable frameworks, and architectural standards.
Experience with the Azure Data Platform ecosystem (ADLS, Azure Data Factory, Synapse, Databricks) (Nice to have).
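To illustrate the Medallion pattern referenced above, here is a minimal, self-contained Python sketch of a bronze-to-silver cleansing step. In a Fabric notebook this logic would typically run as PySpark against Delta tables in OneLake; plain dictionaries stand in for rows here, and the `order_id` key and `amount` field are hypothetical examples.

```python
# Hypothetical bronze -> silver cleansing step (Medallion architecture sketch).
# In Microsoft Fabric this would usually be a PySpark notebook writing Delta
# tables; plain Python stands in so the logic is easy to follow.

def bronze_to_silver(rows):
    """Drop rows missing the key, deduplicate on it, and normalize types."""
    seen = set()
    silver = []
    for row in rows:
        key = row.get("order_id")          # hypothetical business key
        if key is None or key in seen:
            continue                        # discard malformed or duplicate records
        seen.add(key)
        silver.append({**row, "amount": float(row.get("amount", 0))})
    return silver

bronze = [
    {"order_id": 1, "amount": "19.99"},
    {"order_id": 1, "amount": "19.99"},    # duplicate record
    {"amount": "5.00"},                    # missing business key
    {"order_id": 2, "amount": "42.00"},
]
print(bronze_to_silver(bronze))
```

The same shape (filter, deduplicate, cast) carries over directly to a PySpark DataFrame pipeline feeding a silver Delta table.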
Benefits
WELLNESS: We promote personal, professional, and financial well-being.
LET’S RELEASE YOUR POWER: Opportunities to specialize and grow across different technologies and domains.
WE CREATE NEW THINGS: Freedom and support to design innovative data solutions.
WE GROW TOGETHER: Participation in cutting-edge, multinational technology projects with diverse teams.