Senior Databricks DWH Engineer responsible for designing ETL data pipelines and collaborating with teams to deliver scalable solutions in the banking sector.
Responsibilities
Advanced Design & Implementation: Designing and implementing robust, scalable, high-performance ETL/ELT data pipelines using PySpark/Scala and Databricks SQL on the Databricks platform.
Delta Lake: Expertise in implementing and optimizing the Medallion architecture (Bronze, Silver, Gold) using Delta Lake to ensure data quality, consistency, and historical tracking.
Lakehouse Platform: Efficient implementation of the Lakehouse architecture on Databricks, combining best practices from DWH and Data Lake environments.
Performance Optimization: Optimizing Databricks clusters, Spark operations, and Delta tables (e.g., Z-Ordering, file compaction, query tuning) to reduce latency and compute costs.
Streaming: Designing and implementing real-time/near-real-time data processing solutions using Spark Structured Streaming and Delta Live Tables (DLT).
Unity Catalog: Implementation and administration of Unity Catalog for centralized data governance, fine-grained security (row- and column-level security), and data lineage.
Data Quality: Defining and implementing data quality standards and rules (e.g., using DLT or Great Expectations) to maintain data integrity.
Orchestration: Developing and managing complex workflows using Databricks Workflows (Jobs) or external tools (e.g., Azure Data Factory, Airflow) to automate pipelines.
DevOps/CI/CD: Integrating Databricks pipelines into CI/CD processes using tools such as Git, Databricks Repos, and Databricks Asset Bundles.
Collaboration: Working closely with Data Scientists, Analysts, and Architects to understand business requirements and deliver optimal technical solutions.
Mentorship: Providing technical guidance to junior developers and promoting best practices.
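As a flavour of the day-to-day Medallion and optimization work described above, a minimal Databricks SQL sketch follows; the table and column names (sales_bronze, sales_silver, order_id, customer_id, event_date) are illustrative placeholders, not part of the role description.

```sql
-- Promote cleansed Bronze records into the Silver layer (hypothetical tables)
MERGE INTO sales_silver AS tgt
USING (
  SELECT * FROM sales_bronze
  WHERE _rescued_data IS NULL          -- drop rows Auto Loader flagged as malformed
) AS src
ON tgt.order_id = src.order_id
WHEN MATCHED THEN UPDATE SET *
WHEN NOT MATCHED THEN INSERT *;

-- Compact small files and co-locate frequently filtered columns
OPTIMIZE sales_silver ZORDER BY (customer_id, event_date);
```

The MERGE keeps the Silver table idempotently in sync with Bronze, while OPTIMIZE with Z-Ordering reduces file scan costs for the filter columns named above.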
Requirements
Professional Experience: 5+ years of experience in Data Engineering, including 3+ years working with Databricks and large-scale Spark.
Databricks Platform: Proven, expert-level experience with the full Databricks ecosystem (Workspace, Cluster Management, Notebooks, Databricks SQL).
Apache Spark: Deep knowledge of Spark architecture (RDD, DataFrames, Spark SQL) and advanced optimization techniques.
Delta Lake: Expertise in implementing and administering Delta Lake (ACID properties, Time Travel, Merge, Optimize, Vacuum).
SQL: Advanced/expert skills in SQL and Data Modeling (Dimensional, 3NF, Data Vault).
Cloud: Strong experience with a major Cloud platform (AWS, Azure, or GCP), particularly with storage services (S3, ADLS Gen2, GCS) and networking.
Unity Catalog: Hands-on experience with implementing and administering Unity Catalog.
Lakeflow: Experience with Delta Live Tables (DLT) and Databricks Workflows.
ML/AI Fundamentals: Understanding of basic MLOps concepts and experience with MLflow to support integration with Data Science teams.
DevOps: Experience with Terraform or equivalent tools for Infrastructure as Code (IaC).
Certifications: Databricks certifications (e.g., Databricks Certified Data Engineer Professional) are a strong advantage.
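The Delta Lake administration skills listed above (Time Travel, Optimize, Vacuum) can be sketched in Databricks SQL; the table name and version number below are hypothetical examples only.

```sql
-- Inspect the table's transaction history (Time Travel metadata)
DESCRIBE HISTORY sales_silver;

-- Query the table as it looked at an earlier version (version 12 is a placeholder)
SELECT * FROM sales_silver VERSION AS OF 12;

-- Physically remove data files no longer referenced by the Delta log,
-- keeping the default 7-day retention window (168 hours)
VACUUM sales_silver RETAIN 168 HOURS;
```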
Benefits
Premium medical package
Lunch Tickets & Pluxee Card
Bookster subscription
13th salary and yearly bonuses
Enterprise job security with a startup mentality (diverse and engaging environment, international exposure, flat hierarchy), backed by the stability of a multinational company
A supportive culture (we value ownership, autonomy, and healthy work-life balance) with great colleagues, team events and activities
Flexible working program and openness to remote work
Collaborative mindset – employees shape their own benefits, tools, team events and internal practices
Diverse opportunities in Software Development with international exposure
Flexibility to choose projects aligned with your career path and technical goals
Access to leading learning platforms, courses, and certifications (Pluralsight, Udemy, Microsoft, Google Cloud)
Career growth & learning – mentorship programs, certifications, professional development opportunities, and above-market salary