Data Engineer designing and developing end-to-end data pipelines for analytics and reporting at Deloitte. Collaborating with BI teams and optimizing data models using Azure services.
Responsibilities
Design, develop, and maintain end-to-end data pipelines
Implement data ingestion processes from multiple sources (APIs, databases, files, external systems)
Develop data transformation and enrichment processes, ensuring quality, traceability, and efficiency
Design and optimize data models for analytics and reporting
Work with Azure services such as Azure Data Factory, Azure Data Lake Storage, Azure Synapse, Databricks, and/or Microsoft Fabric
Apply best practices for versioning, error handling, logging, and pipeline monitoring
Collaborate with BI, analytics, and business teams to understand requirements and translate them into technical solutions
Participate in data architecture decisions and the continuous improvement of existing solutions
Document developments and data flows in a clear, maintainable way
Requirements
At least 2 years of experience in Data Engineering or data-oriented backend projects
Experience with Azure (Azure Data Factory, Azure Databricks, Azure Synapse, Azure Data Lake, or Microsoft Fabric) or AWS
Programming experience in Python and SQL
Knowledge of data modeling and Data Lake / Data Warehouse / Lakehouse architectures
Intermediate level of English
Benefits
Flexible hybrid work model
Flexible compensation plan
On-site medical service
Health insurance
Life and accident insurance
Training plan throughout your professional career
Feedback culture that fosters continuous learning
Participation in national and international social action and volunteering programs
Data Engineer responsible for building and maintaining data solutions on AWS and GCP. Focus on Lakehouse architecture to support analytics, reporting, and AI/ML use cases.
Mid-level Data Engineer at Verity focusing on building data pipelines and analytics frameworks. Engaging with teams to ensure data quality and enhance cloud-based solutions.
Data Engineer for Verity, a digital transformation consultancy, designing data architectures and building scalable pipelines. Collaborates on data quality and analytical dataset structuring.
Data Engineer at GFT managing data systems and workflows, focusing on data engineering and data science collaboration. Leveraging technologies like Python, Airflow, and AWS.
Senior Data Engineer at Shopmonkey building and maintaining data infrastructure. Driving strategic decisions for tools and processes while ensuring data quality across platforms.
Data Engineer at dsm-firmenich designing and maintaining robust data pipelines. Collaborating on impactful projects centered around data for health, nutrition, and beauty.
Data Scientist developing customer-focused data products to improve the customer journey at Noibu. Collaborating with teams to analyze feedback data and shape analytics strategies.
Data Engineer at fundcraft ensuring seamless data movement and storage. Focus on data migrations, ETL maintenance, and collaboration with the Product team for insights.
Databricks Data Engineer developing scalable data pipelines within the insurance domain. Building ETL workflows on Azure and Databricks for analytics and business decision-making.
Senior Engineer with expertise in AWS, Microsoft Fabric, and Purview at EXL. Leading development of scalable cloud and data governance solutions while mentoring engineering teams.