Data Architect leading the design and governance of data architecture for FFF Enterprises. Collaborating across teams to optimize data solutions and support analytics and reporting initiatives.
Responsibilities
Lead the design and governance of our enterprise data architecture.
Define and implement end-to-end data architecture standards, including data modeling, data integration, metadata management, and governance across the enterprise.
Design scalable data models across the Bronze, Silver, and Gold layers of a Databricks Lakehouse using Delta Lake, Unity Catalog, and Delta Live Tables (DLT).
Collaborate with data engineers to optimize ingestion and transformation processes across streaming and batch pipelines.
Establish canonical models and semantic layers that power downstream BI tools and self-service analytics.
Define and enforce data quality, data lineage, and data security policies in coordination with governance teams.
Work with business stakeholders, product managers, and analysts to translate analytical use cases into high-quality data models and architecture patterns.
Provide architecture oversight and best practices guidance on data integration tools (e.g., Fivetran), data cataloging, and performance optimization.
Review and approve physical and logical data model changes across teams to ensure consistency and maintain architectural integrity.
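The Bronze/Silver/Gold responsibilities above follow the medallion pattern: land raw data as-is, then cleanse and conform it, then aggregate it into analytics-ready tables. A minimal sketch of that flow, using plain Python in place of the Delta Lake tables a Databricks Lakehouse would actually use (all record fields and function names here are hypothetical):

```python
# Illustrative medallion-architecture flow. In Databricks these layers would
# be Delta Lake tables governed by Unity Catalog; plain Python stands in for
# the Bronze -> Silver -> Gold transformations. Fields are hypothetical.

bronze = [  # raw ingested records, as-landed (duplicates and nulls intact)
    {"order_id": 1, "amount": "100.50", "region": "west"},
    {"order_id": 1, "amount": "100.50", "region": "west"},  # duplicate
    {"order_id": 2, "amount": None, "region": "east"},      # bad record
    {"order_id": 3, "amount": "75.00", "region": "east"},
]

def to_silver(rows):
    """Cleanse and conform: dedupe on the business key, drop records with
    missing amounts, and cast string amounts to floats."""
    seen, out = set(), []
    for r in rows:
        if r["order_id"] in seen or r["amount"] is None:
            continue
        seen.add(r["order_id"])
        out.append({**r, "amount": float(r["amount"])})
    return out

def to_gold(rows):
    """Aggregate to an analytics-ready shape: total revenue by region."""
    totals = {}
    for r in rows:
        totals[r["region"]] = totals.get(r["region"], 0.0) + r["amount"]
    return totals

silver = to_silver(bronze)
gold = to_gold(silver)
print(gold)  # {'west': 100.5, 'east': 75.0}
```

In a real pipeline each layer is persisted as its own table so downstream consumers can pick the level of refinement they need; the transformation logic itself stays this simple in spirit.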
Requirements
Bachelor’s Degree in a related field or four (4) years relevant experience in lieu of degree.
Deep understanding of SAP ERP data models, especially core financials, logistics, and materials domains.
Expertise in dimensional modeling, star/snowflake schemas, and modern data warehousing patterns.
Proficiency in SQL and Python, with the ability to guide data engineers on best practices.
Strong understanding of data governance, lineage, and security frameworks.
Ability to communicate architectural concepts clearly to both technical and non-technical audiences.
Must have seven (7) years [eleven (11) for non-degreed candidates] of experience in data architecture or data engineering, with a focus on enterprise-scale data.
Strong hands-on experience with Databricks, including Delta Lake, Unity Catalog, and Lakehouse architecture.
Proven experience in conceptual, logical, and physical data modeling using tools like ER/Studio, Erwin, Lucidchart, or dbt.
Experience working in cloud environments such as Azure, AWS, or GCP.
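The dimensional-modeling expertise called for above centers on star schemas: a central fact table of measures joined to descriptive dimension tables. A minimal sketch using Python's built-in sqlite3 module (table and column names are hypothetical; in practice these would be Gold-layer tables modeled in a tool like ER/Studio or dbt):

```python
import sqlite3

# Minimal star-schema sketch: one fact table keyed to two dimensions.
# Names are illustrative, not from any specific warehouse.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_date    (date_key INTEGER PRIMARY KEY, year INTEGER);
    CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE fact_sales (
        date_key    INTEGER REFERENCES dim_date(date_key),
        product_key INTEGER REFERENCES dim_product(product_key),
        revenue     REAL
    );
    INSERT INTO dim_date    VALUES (20240101, 2024);
    INSERT INTO dim_product VALUES (1, 'Widget');
    INSERT INTO fact_sales  VALUES (20240101, 1, 99.0), (20240101, 1, 1.0);
""")

# A typical BI query: join the fact table to its dimensions and aggregate.
row = conn.execute("""
    SELECT d.year, p.name, SUM(f.revenue)
    FROM fact_sales f
    JOIN dim_date d    ON f.date_key = d.date_key
    JOIN dim_product p ON f.product_key = p.product_key
    GROUP BY d.year, p.name
""").fetchone()
print(row)  # (2024, 'Widget', 100.0)
```

A snowflake schema differs only in normalizing the dimensions further (e.g. splitting product category into its own table); the fact table and join pattern stay the same.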
Data Engineer managing payment processing and data accuracy while collaborating with financial teams. Building and optimizing data pipelines for transactional data in a hybrid work environment.
Data Engineer building analytical tools for Dry Bulk market data operations at Kpler, joining a team of over 700 experts transforming data into actionable strategies.
Data Engineer developing tools for maintaining data integrity in cargo tracking at Kpler. Collaborating with analysts and engineers to enhance data quality management.
Lead Azure Data Engineer designing and optimizing data ecosystems on Microsoft Cloud. Responsible for building scalable data platforms and pipelines for analytics and reporting.
Data Engineer providing support for IBM DataStage ETL jobs at Callibrity. Collaborating with stakeholders and working to modernize technology solutions in a hybrid work environment.
Cloud Data Engineer implementing tailored solutions for Volkswagen Group data processing. Building ETL/ELT pipelines while collaborating with technical experts.
Data Engineer responsible for building scalable data infrastructure that supports data-driven decisions. Collaborating with the team to maintain systems and unlock data value for organizations.
Data Engineer designing and optimizing data pipelines using Databricks and Google Cloud Platform. Collaborating with analysts and scientists to deliver high-quality data products.
Associate Data Engineer supporting privacy engineering controls and executing privacy impact assessments in a financial services company. Collaborating across business units to ensure alignment with privacy regulations.