Lead the design and governance of our enterprise data architecture.
Define and implement end-to-end data architecture standards, including data modeling, data integration, metadata management, and governance across the enterprise.
Design scalable data models across the Bronze, Silver, and Gold layers of a Databricks Lakehouse using Delta Lake, Unity Catalog, and Delta Live Tables (DLT).
Collaborate with data engineers to optimize ingestion and transformation processes across streaming and batch pipelines.
Establish canonical models and semantic layers that power downstream BI tools and self-service analytics.
Define and enforce data quality, data lineage, and data security policies in coordination with governance teams.
Work with business stakeholders, product managers, and analysts to translate analytical use cases into high-quality data models and architecture patterns.
Provide architecture oversight and best practices guidance on data integration tools (e.g., Fivetran), data cataloging, and performance optimization.
Review and approve physical and logical data model changes across teams to ensure consistency and maintain architectural integrity.
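The Bronze/Silver/Gold layering mentioned above can be illustrated with a minimal sketch. In a real Lakehouse this would be PySpark or DLT writing Delta tables; plain Python dicts stand in for rows here, and all field names are illustrative assumptions, not part of the role's actual data model.

```python
# Bronze: raw, append-only records exactly as ingested (including bad rows).
RAW_EVENTS = [
    {"order_id": "1001", "amount": "250.00", "region": "EMEA"},
    {"order_id": "1001", "amount": "250.00", "region": "EMEA"},  # duplicate
    {"order_id": "1002", "amount": "not-a-number", "region": "APAC"},
    {"order_id": "1003", "amount": "75.50", "region": "APAC"},
]

def to_silver(bronze):
    """Silver: deduplicate on the business key and enforce types."""
    seen, silver = set(), []
    for row in bronze:
        if row["order_id"] in seen:
            continue
        try:
            amount = float(row["amount"])
        except ValueError:
            continue  # a real pipeline would quarantine this row, not drop it
        seen.add(row["order_id"])
        silver.append({"order_id": row["order_id"], "amount": amount,
                       "region": row["region"]})
    return silver

def to_gold(silver):
    """Gold: business-level aggregate, e.g. revenue per region."""
    gold = {}
    for row in silver:
        gold[row["region"]] = gold.get(row["region"], 0.0) + row["amount"]
    return gold

silver = to_silver(RAW_EVENTS)
gold = to_gold(silver)
print(gold)  # {'EMEA': 250.0, 'APAC': 75.5}
```

The design point is that each layer has one job: Bronze preserves the source verbatim, Silver applies quality rules, and Gold serves consumption-ready aggregates.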
Requirements
Bachelor’s Degree in a related field or four (4) years relevant experience in lieu of degree.
Deep understanding of SAP ERP data models, especially core financials, logistics, and materials domains.
Expertise in dimensional modeling, star/snowflake schemas, and modern data warehousing patterns.
Proficiency in SQL and Python, with the ability to guide data engineers on best practices.
Strong understanding of data governance, lineage, and security frameworks.
Ability to communicate architectural concepts clearly to both technical and non-technical audiences.
Must have seven (7) years [eleven (11) for non-degreed candidates] of experience in data architecture or data engineering, with a focus on enterprise-scale data.
Strong hands-on experience with Databricks, including Delta Lake, Unity Catalog, and Lakehouse architecture.
Proven experience in conceptual, logical, and physical data modeling using tools like ER/Studio, Erwin, Lucidchart, or dbt.
Experience working in cloud environments such as Azure, AWS, or GCP.
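As a concrete instance of the dimensional modeling and star-schema expertise listed above, here is a hedged sketch of a one-fact, two-dimension star schema. SQLite is used only for portability; the table and column names are illustrative assumptions, not drawn from any specific SAP or warehouse model.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE dim_date (
    date_key    INTEGER PRIMARY KEY,  -- surrogate key, e.g. 20240115
    full_date   TEXT,
    fiscal_year INTEGER
);
CREATE TABLE dim_product (
    product_key INTEGER PRIMARY KEY,  -- surrogate key
    sku         TEXT,
    category    TEXT
);
-- Fact table: foreign keys to each dimension plus additive measures.
CREATE TABLE fact_sales (
    date_key    INTEGER REFERENCES dim_date(date_key),
    product_key INTEGER REFERENCES dim_product(product_key),
    quantity    INTEGER,
    revenue     REAL
);
""")
cur.execute("INSERT INTO dim_date VALUES (20240115, '2024-01-15', 2024)")
cur.execute("INSERT INTO dim_product VALUES (1, 'SKU-42', 'Widgets')")
cur.execute("INSERT INTO fact_sales VALUES (20240115, 1, 3, 30.0)")

# Typical BI query shape: join the fact to its dimensions and aggregate.
cur.execute("""
    SELECT p.category, d.fiscal_year, SUM(f.revenue)
    FROM fact_sales f
    JOIN dim_product p ON f.product_key = p.product_key
    JOIN dim_date d    ON f.date_key = d.date_key
    GROUP BY p.category, d.fiscal_year
""")
result = cur.fetchone()
print(result)  # ('Widgets', 2024, 30.0)
```

Keeping measures in the fact table and descriptive attributes in the dimensions is what lets downstream BI tools generate this join-and-aggregate pattern automatically.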