Data Engineer optimizing data architecture and pipelines at Kantox, a fintech company, collaborating with teams to develop data products within a modern Lakehouse environment.
Responsibilities
Build and maintain high-performance, tested, and well-documented dbt models that follow best practices.
Write efficient, maintainable, and scalable SQL transformations across different layers of the data stack.
Collaborate with analytics and domain teams to translate data needs into well-modeled, trusted datasets.
Help monitor and improve the quality, reliability, and performance of data pipelines.
Integrate new data sources into the platform through ingestion pipelines (batch or streaming).
Implement data tests and CI/CD validations, and participate in peer reviews to ensure code quality.
Contribute to data product development within a Data Mesh architecture.
Participate in agile rituals and collaborate with cross-functional teams.
Requirements
2+ years of experience in Data Engineering or a similar role.
Strong experience with SQL and dbt, and a passion for clean, efficient code.
Solid understanding of data modeling principles (dimensional, star schema, SCDs…).
Experience working with big data tools like Trino, Spark, or similar query engines.
Familiarity with batch and/or streaming pipelines (e.g., Kafka or RabbitMQ).
Experience writing data tests and following version control workflows (e.g., Git).
Basic proficiency in Python (used for scripting, testing, or orchestration logic).
Familiarity with data quality tools like dbt tests, Great Expectations, or Soda is a plus.
Exposure to orchestration tools like Dagster, Airflow, or Prefect is a plus.
Comfortable working in a collaborative, agile environment using tools like Jira, GitHub, and Slack.
Curious, pragmatic, and always looking to learn and improve.
Fluent in English.
Permission to work within the EU is a plus.
Benefits
Competitive salary 💰
Sponsored learning budget
Free private health insurance
Free Spanish, English, French and Catalan lessons
Relocation package if needed
Flexible working hours + short Fridays
Hybrid work model
29 days of annual vacation 🌴
Gym discounts and free sports activities 💪
Restaurant tickets with monthly credit and regular cross-team lunches
Fresh fruit and unlimited coffee 🍇☕️
Pizza Fridays 🍕
Beautiful office with incredible 360-degree views of Barcelona ☀️