Data Engineer focusing on SQL database management and Azure Data Factory pipelines for a consulting firm in Spain. Collaborating on data integration and ensuring data accuracy in cloud environments.
Responsibilities
SQL Database Management: Design, implement, and maintain SQL databases to ensure efficient data storage and retrieval.
Optimize and tune SQL queries for maximum performance.
Elasticsearch Integration: Work with Elasticsearch to index, search, and analyze large volumes of data efficiently.
Collaborate with cross-functional teams to integrate Elasticsearch into our data ecosystem.
Azure Data Factory Pipelines: Develop and manage data pipelines using Azure Data Factory.
ETL Development and Maintenance: Design, develop, and maintain ETL processes to extract, transform, and load data from various sources into our data warehouse.
Implement data quality checks and ensure the integrity of data throughout the ETL process.
Ensure the reliability, scalability, and efficiency of data movement within the Azure cloud environment.
Work on migrating our data sources to Snowflake.
Proactive Problem Solving: Identify and address data-related issues early, ensuring data accuracy and consistency.
Collaborate with data scientists, analysts, and other teams to understand their data requirements and deliver effective solutions.
Clearly communicate complex technical concepts to non-technical stakeholders.
Documentation: Maintain thorough documentation for all data engineering processes, ensuring knowledge transfer and best practices.
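As a rough illustration of the ETL data-quality checks described above (row counts, null checks, integrity validation), the sketch below runs basic checks against a staging table. This is a minimal, hypothetical example: `run_quality_checks`, the `customers` table, and its columns are invented for illustration, and SQLite stands in for the actual warehouse.

```python
import sqlite3

def run_quality_checks(conn, table, required_columns):
    """Run basic data-quality checks on a table: a non-empty row count
    and no NULLs in required columns. Returns {check_name: passed}."""
    cur = conn.cursor()
    results = {}
    cur.execute(f"SELECT COUNT(*) FROM {table}")
    results["row_count_nonzero"] = cur.fetchone()[0] > 0
    for col in required_columns:
        cur.execute(f"SELECT COUNT(*) FROM {table} WHERE {col} IS NULL")
        results[f"{col}_no_nulls"] = cur.fetchone()[0] == 0
    return results

# Usage: a tiny in-memory table standing in for a warehouse staging table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER, email TEXT)")
conn.executemany("INSERT INTO customers VALUES (?, ?)",
                 [(1, "a@example.com"), (2, None)])
checks = run_quality_checks(conn, "customers", ["id", "email"])
```

In practice, checks like these would run as a validation step inside the pipeline, failing the load (or quarantining rows) when a check does not pass.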
Requirements
Bachelor's degree in Computer Science, Information Technology, or a related field.
Proven experience in SQL database design and optimization.
Experience with GitHub or GitLab.
Experience with dbt.
Strong ETL development skills.
Experience with data modeling.
Hands-on experience with Snowflake or Databricks.
Proficiency in creating and managing data pipelines using Azure Data Factory.
Excellent problem-solving and analytical skills.
Proactive mindset with the ability to work independently and collaboratively.
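To illustrate the kind of SQL query optimization the requirements call for, the sketch below shows index-based tuning: the same query goes from a full table scan to an index search once an index covers the filtered column. SQLite stands in for the production database here, and the `orders` table and `idx_orders_customer` index are hypothetical.

```python
import sqlite3

# A minimal sketch of SQL query tuning with an index (SQLite as a stand-in).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY,"
             " customer_id INTEGER, total REAL)")
conn.executemany("INSERT INTO orders (customer_id, total) VALUES (?, ?)",
                 [(i % 100, i * 1.5) for i in range(1000)])

query = "SELECT total FROM orders WHERE customer_id = 42"

# Without an index, the planner falls back to a full table scan.
plan_before = conn.execute("EXPLAIN QUERY PLAN " + query).fetchall()

# An index on the filtered column lets the planner do an index search instead.
conn.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")
plan_after = conn.execute("EXPLAIN QUERY PLAN " + query).fetchall()
```

The same before/after comparison of the query plan is the usual starting point for tuning on any engine, even though plan output formats differ.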
Data Migration Specialist handling large-scale data migration from a legacy system to an enterprise PLM platform. Analyzing data structures, developing strategies, and ensuring integrity across systems.
Director leading strategy, governance, and delivery of enterprise data platform at Phillips 66. Partnering with AI, Data Science, and business teams to enhance analytics and business systems.
Product Owner driving ERP data migration initiatives for BioNTech’s global landscape. Leading effective data management and ensuring compliance with regulatory standards in a fast-paced environment.
Data Engineer II leading development and delivery of data pipelines for Syneos Health. Collaborating with teams to optimize data processing and integrate solutions into production environments.
Lead Data Engineer overseeing data operations and analytics engineering teams for OneOncology. Focused on operational excellence in data platform and model reliability for cancer care improvement.
Senior AWS Software Data Engineer at Boeing focusing on AWS Data services to support digital analytics capabilities. Collaborating with cross-functional teams to design, develop, and maintain software data solutions.
Senior Data Engineer designing and improving software for business capabilities at Barclays. Collaborating with teams to build a data and intelligence platform for Equity Derivatives.
Senior AI & Data Engineer developing and implementing AI solutions in collaboration with clients and teams. Working on projects involving generative AI, predictive analytics, and data mastery.
Consultant driving AI business growth in Deloitte's Artificial Intelligence & Data team. Delivering innovative solutions using data analytics and automation technologies.
Data Engineer responsible for managing data architecture and pipelines at Snappi, a neobank. Collaborating with teams to enable data processing and analysis in innovative banking solutions.