Data Engineer developing and maintaining CRE forecasting infrastructure for Cushman & Wakefield. Collaborates with senior economists and technical teams to ensure high-quality data solutions.
Responsibilities
Support the development, optimization, and maintenance of Cushman & Wakefield’s commercial real estate (CRE) forecasting infrastructure across the Americas.
Operate as a self-sufficient data practitioner, capable of independently delivering data solutions or working side by side with technology teams to ensure alignment and production readiness of QIG capabilities on an iterative basis.
Work closely with senior economists, analytics leads, and technical teams to deliver high-quality, production-ready data solutions that underpin the firm’s House View and related analytical products.
Prototype, build, and maintain automated data pipelines for ingesting, transforming, and storing CRE and macroeconomic datasets used in forecasting models.
Ensure data integrity and consistency across all QIG’s inputs and outputs through rigorous validation and quality control procedures.
Design and enforce structured data interfaces and integration patterns to ensure consistent ingestion and interoperability across internal and external data sources.
Work closely with cross-functional partners to define, refine, and validate data quality rules, using both automated checks and hands-on analysis to ensure outputs meet analytical expectations.
Perform exploratory data analysis and profiling on raw and processed datasets to validate pipeline outputs and identify anomalies or inconsistencies.
Partner with PRI (Property Research & Intelligence), TDS (Technology Data Solutions), GIS (Geographic Information Systems), and the forecasting team to ensure governance of time series data, since revisions to geography-based competitive sets can occur.
Collaborate with PRI, TDS/GIS and other QIG teams to integrate internal and external data sources into infrastructure deployed by QIG teams.
Ensure Global Think Tank, Americas Research, and other stakeholders have access to relevant time series (and forecast) data via various tools and capabilities in coordination with QIG leads.
Create and maintain documentation of synthetic data model architecture, data flows, and diagnostic procedures.
Partner with Head of Data Science & Geospatial Analytics to build state-of-the-art, novel real estate dataset, with additional relevant data geospatially integrated (e.g., demographics, socioeconomic data, zoning or flood maps, climate or walk score information); produce detailed specifications that guide engineering implementation.
Develop internal documentation and process automation, and serve as an expert on the integration, application, and processing of internal data, third-party vendor data, and other public data (e.g., Census TIGER, IPUMS) as appropriate with QIG leads.
Advise, integrate and execute normalization methods with internal and external partners, co-developing approaches with technology teams when necessary and validating outputs through hands-on implementation and analysis.
Identify new data use cases for proprietary data, and ensure appropriate cleaning and normalization techniques so the data can be used in statistical, econometric, and other commercial analytics applications.
Contribute to evolution of the QIG data infrastructure by identifying opportunities for efficiency gains, automation, and scalability.
Support the integration of emerging technologies (e.g., ML/AI, advanced lakehouse patterns) into data workflows under guidance from senior team members through hands-on experimentation, prototyping, or coordination with TDS as needed.
Coordinate with TDS and PRI on internal data and technology initiatives, contributing hands-on development or feedback where appropriate to scale, optimize, and productionize solutions in support of QIG capabilities.
Serve as the key liaison for all external data dependencies; monitor the evolution of third-party data products and capabilities, assess their fit against QIG analytical requirements, and produce intake specifications when new sources are approved for integration.
When/where appropriate, maintain a living requirements register and change log that tracks open data engineering requests, their status in the TDS backlog, acceptance criteria, and QIG sign-off outcomes.
Requirements
Bachelor’s degree in Data Engineering, Data Science, Computer Science, Statistics, or a related technical field required; advanced degree a plus.
5-7 years of experience in data engineering or a hybrid analytical/engineering role, preferably in a forecasting or analytics/production environment. Real estate experience a plus.
Strong proficiency in Python/R, SQL, Databricks, Delta Lake and data pipeline frameworks (e.g., medallion architecture).
Experience with time series data, econometric / data science modeling workflows, and automation tools.
Familiarity with cloud platforms (e.g., Azure, AWS) and version control systems.
Demonstrated ability to operate in a collaborative, cross-functional environment, contributing both independently and alongside engineering and analytical teams to deliver data solutions.
Comfort working in iterative development settings, balancing hands-on execution with stakeholder collaboration and continuous feedback.
Strong attention to detail and commitment to data quality.
Excellent documentation, communication, and stakeholder management skills; comfortable operating as the technical translator between analytical domain experts and data engineering teams (when appropriate), and able to participate meaningfully in engineering discussions.
Exposure to geospatial data concepts and CRE or macroeconomic datasets.
Experience working with agile/scrum delivery models in a data and analytics context.