Data Architect leading the design and governance of data architecture for FFF Enterprises. Collaborating across teams to optimize data solutions and support analytics and reporting initiatives.
Responsibilities
Lead the design and governance of our enterprise data architecture.
Define and implement end-to-end data architecture standards, including data modeling, data integration, metadata management, and governance across the enterprise.
Design scalable data models across the Bronze, Silver, and Gold layers of a Databricks Lakehouse using Delta Lake, Unity Catalog, and Delta Live Tables (DLT); a brief sketch of this medallion pattern follows this list.
Collaborate with data engineers to optimize ingestion and transformation processes across streaming and batch pipelines.
Establish canonical models and semantic layers that power downstream BI tools and self-service analytics.
Define and enforce data quality, data lineage, and data security policies in coordination with governance teams.
Work with business stakeholders, product managers, and analysts to translate analytical use cases into high-quality data models and architecture patterns.
Provide architecture oversight and best-practice guidance on data integration tools (e.g., Fivetran), data cataloging, and performance optimization.
Review and approve physical and logical data model changes across teams to ensure consistency and maintain architectural integrity.
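By way of illustration, the sketch below shows what a minimal Bronze-to-Silver-to-Gold flow might look like in PySpark on a Databricks runtime. All table names, columns, and the landing path are hypothetical; this is an outline of the medallion pattern, not a prescribed implementation.

```python
# Minimal Bronze -> Silver -> Gold sketch. Table names, columns, and the
# landing path are hypothetical. Assumes a Databricks runtime where Delta
# Lake is available; in a notebook, `spark` is already provided.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Bronze: land raw data as-is, tagging each row with load metadata.
bronze = (
    spark.read.format("json")
    .load("/mnt/landing/orders/")  # hypothetical landing zone
    .withColumn("_ingested_at", F.current_timestamp())
)
bronze.write.format("delta").mode("append").saveAsTable("bronze.orders")

# Silver: cleanse and conform -- deduplicate, enforce basic quality rules,
# and derive typed columns.
silver = (
    spark.table("bronze.orders")
    .dropDuplicates(["order_id"])
    .filter(F.col("order_amount") > 0)
    .withColumn("order_date", F.to_date("order_ts"))
)
silver.write.format("delta").mode("overwrite").saveAsTable("silver.orders")

# Gold: business-level aggregate that BI tools and analysts query directly.
gold = (
    spark.table("silver.orders")
    .groupBy("order_date", "customer_id")
    .agg(F.sum("order_amount").alias("daily_order_amount"))
)
gold.write.format("delta").mode("overwrite").saveAsTable("gold.daily_orders")
```

A DLT implementation would express the same layers declaratively, defining each table with the @dlt.table decorator and attaching quality rules as expectations rather than inline filters.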
Requirements
Bachelor’s Degree in a related field, or four (4) years of relevant experience in lieu of a degree.
Deep understanding of SAP ERP data models, especially core financials, logistics, and materials domains.
Expertise in dimensional modeling, star/snowflake schemas, and modern data warehousing patterns (a brief star-schema sketch follows this list).
Proficiency in SQL and Python, with the ability to guide data engineers on best practices.
Strong understanding of data governance, lineage, and security frameworks.
Ability to communicate architectural concepts clearly to both technical and non-technical audiences.
Must have seven (7) years [eleven (11) for non-degreed candidates] of experience in data architecture or data engineering, with a focus on enterprise-scale data.
Strong hands-on experience with Databricks, including Delta Lake, Unity Catalog, and Lakehouse architecture.
Proven experience in conceptual, logical, and physical data modeling using tools like ER/Studio, Erwin, Lucidchart, or dbt.
Experience working in cloud environments such as Azure, AWS, or GCP.
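To illustrate the dimensional modeling expectation above, a minimal star-schema sketch follows. Every schema, table, and column name is an assumption made for the example, not part of the role description.

```python
# Illustrative star-schema DDL for a Gold layer -- all names are
# hypothetical. Assumes a Databricks runtime with Delta Lake, where
# identity columns are supported for surrogate keys.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# One dimension shown; dim_date and dim_product would follow the same shape.
spark.sql("""
    CREATE TABLE IF NOT EXISTS gold.dim_customer (
        customer_key   BIGINT GENERATED ALWAYS AS IDENTITY,  -- surrogate key
        customer_id    STRING,                               -- business key
        customer_name  STRING,
        region         STRING,
        effective_from DATE,                                 -- SCD Type 2 window
        effective_to   DATE
    ) USING DELTA
""")

# Fact table referencing the dimensions by surrogate key.
spark.sql("""
    CREATE TABLE IF NOT EXISTS gold.fact_sales (
        date_key     INT,
        customer_key BIGINT,
        product_key  BIGINT,
        quantity     INT,
        net_amount   DECIMAL(18,2)
    ) USING DELTA
""")

# A typical star-schema query: join the fact to a dimension and aggregate.
sales_by_region = spark.sql("""
    SELECT c.region, SUM(f.net_amount) AS net_amount
    FROM gold.fact_sales f
    JOIN gold.dim_customer c ON f.customer_key = c.customer_key
    GROUP BY c.region
""")
```

Surrogate keys on the dimensions keep the fact table stable when business keys change and make Type 2 slowly changing dimensions straightforward to model.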