Data Platform Engineer at Taxfix managing data infrastructure and pipelines to support analytics and AI-driven product features. Collaborating with cross-functional teams to ensure data reliability.
Responsibilities
Build and maintain ingestion pipelines that capture changes from application databases, APIs, and SaaS tools, and deliver clean, analytics-ready tables to our cloud data warehouse
Design data models with proper layering that handle real-world data complexity: out-of-order events, schema evolution, late arrivals, and backfills
Own and evolve cloud platform infrastructure — manage GCP resources (GCS, Dataflow, Dataproc), provision and maintain environments with Terraform, and ensure the platform is cost-efficient and scalable
Own data quality monitoring — build validation, monitoring, and alerting that catches problems before downstream consumers do
Implement privacy and compliance controls — anonymization, pseudonymization, access policies, and deletion propagation (GDPR right-to-be-forgotten) across raw and derived layers
Prepare data for ML and AI use cases — build governed, privacy-safe datasets and feature pipelines that ML engineers and data scientists can use for model training, evaluation, and production inference
Operate and improve our orchestration layer — scheduling, retries, SLA tracking, and observability for data pipelines
Define and raise the bar on engineering standards — code quality, testing, CI/CD, documentation, and infrastructure-as-code
Evaluate and adopt new technologies that help the team achieve its goals across data management, analytics, and machine learning
Incorporate AI into platform services — enable AI-assisted development workflows and build internal AI backend services as part of the data platform offering
Communicate across domains — work closely with analytics, product, compliance, and engineering teams; translate between technical and business language
Mentor and grow with the team — share what you learn, support others, and contribute to a culture of honest technical discussion
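The pipeline concerns above (incremental capture, out-of-order events, late arrivals, and re-runnable backfills) can be sketched in miniature. This is a hedged illustration, not Taxfix's actual implementation; the function and field names (`merge_increment`, `id`, `updated_at`) are hypothetical, and it assumes each row carries a primary key and a last-modified timestamp:

```python
from datetime import datetime, timezone


def merge_increment(target: dict, increment: list[dict]) -> dict:
    """Idempotently apply an incremental batch to a keyed target table.

    Upserts by primary key, keeping the row with the newest ``updated_at``
    so that out-of-order and late-arriving rows never overwrite fresher
    data, and re-running the same batch (a retry or backfill) is a no-op.
    """
    for row in increment:
        key = row["id"]
        current = target.get(key)
        if current is None or row["updated_at"] >= current["updated_at"]:
            target[key] = row
    return target


def ts(s: str) -> datetime:
    """Parse an ISO timestamp as UTC (illustrative helper)."""
    return datetime.fromisoformat(s).replace(tzinfo=timezone.utc)


# A retried batch containing one out-of-order (stale) row:
table = {1: {"id": 1, "status": "paid", "updated_at": ts("2024-05-02T10:00")}}
batch = [
    {"id": 1, "status": "pending", "updated_at": ts("2024-05-01T09:00")},  # late arrival, stale
    {"id": 2, "status": "new", "updated_at": ts("2024-05-02T11:00")},
]
merge_increment(table, batch)
merge_increment(table, batch)  # idempotent: the second run changes nothing
```

In a warehouse this logic typically lives in a `MERGE` statement or a dbt incremental model rather than in Python, but the invariants are the same: dedupe by key, resolve conflicts by recency, and make reprocessing safe.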
Requirements
6+ years of experience in Data Engineering or a similar role (backend engineer working on data-intensive systems counts)
Strong Python skills for data pipeline development — you write production code, not just scripts
Strong SQL skills — window functions, CTEs, and query optimization are second nature
Experience with event-driven data pipelines — CQRS, event ordering, idempotency, and the difference between initial load and incremental processing
Expert with Airflow — you’ve built DAGs with proper task dependencies, retries, and monitoring
Experience with Snowflake and/or BigQuery — you understand their architecture, performance characteristics, and how they differ from each other and from other analytical or operational tools
Cloud platform experience — you’ve worked with GCP (GCS, Dataflow, Dataproc, etc.) or equivalent AWS/Azure services and understand how to manage cloud resources at scale
Infrastructure-as-code — experience with Terraform, Helm, or similar tools for provisioning and managing cloud environments
Kubernetes (K8s) and Docker containerization — you package and deploy your own work
Data quality mindset — you profile data, validate assumptions, build checks, and don’t trust that “the data looked clean”
Data for AI readiness — you understand what it takes to prepare data for ML and AI: governance, lineage, privacy controls, and reproducibility
Awareness of data privacy requirements — you can identify PII, understand GDPR, and know how to implement anonymization and deletion across multiple data layers
AI-enabled engineering practices — you actively use AI assistants and code-generation tools to accelerate development and delivery, and you can establish standards for their effective use across the team
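The privacy requirements above (pseudonymization and deletion propagation across raw and derived layers) are often implemented via keyed hashing with per-user keys, sometimes called crypto-shredding. The sketch below is a minimal illustration under that assumption; the names (`key_store`, `pseudonymize`, `forget_user`) are hypothetical, and a real system would hold keys in a restricted secrets store, not an in-memory dict:

```python
import hashlib
import hmac
import secrets

# Per-user keys; in practice this lives in a restricted key-management store.
key_store: dict[str, bytes] = {}


def pseudonymize(user_id: str, value: str) -> str:
    """Replace a PII value with a keyed hash (HMAC-SHA256).

    The same (user, value) pair always maps to the same token, so joins
    across raw and derived tables still work, but the token reveals
    nothing without the per-user key.
    """
    key = key_store.setdefault(user_id, secrets.token_bytes(32))
    return hmac.new(key, value.encode(), hashlib.sha256).hexdigest()


def forget_user(user_id: str) -> None:
    """Right-to-be-forgotten via crypto-shredding: dropping the key makes
    every pseudonymized copy in derived layers permanently unlinkable,
    without rewriting each downstream table."""
    key_store.pop(user_id, None)


token_a = pseudonymize("u1", "jane@example.com")
token_b = pseudonymize("u1", "jane@example.com")  # deterministic per user+value
forget_user("u1")
token_c = pseudonymize("u1", "jane@example.com")  # fresh key, unrelated token
```

The appeal of this design is that GDPR deletion becomes a single key-store operation instead of a scan-and-rewrite across every raw and derived dataset.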
Benefits
A chance to do meaningful, people-centric work with an international team of passionate professionals.
Holistic well-being with free mental health coaching sessions and yoga.
A monthly allowance to spend on an extensive range of services that you can use and roll over as flexibly as you like.
Employee stock options for all employees—because everyone deserves to benefit from the success they help to create.
30 annual vacation days and flexible working hours.
Work from abroad for up to six weeks every year. Just align with your team, and then enjoy your trip.
Plenty of opportunities to socialise as a team. In addition to internal tech meetups, our international team hosts regular get-togethers—virtually and in person when possible.
Free tax declaration filing, of course, through the Taxfix app—and internal support for all personal tax-related questions.
Have a four-legged friend in your life? We’re happy to have dogs join us in the office.
Staff Data Engineer designing next-gen data platforms and pipelines at CommBank. Integrating data sources and ensuring high-quality, scalable data products while collaborating with Business Banking technology teams.
Senior Data Engineer optimizing and industrializing data pipelines for AI products, ensuring data quality and performance across the lifecycle. Collaboration with cross-functional teams is crucial for this role.
Senior Data Engineer supporting critical banking insights at Smile. Driving data solutions and reporting in a collaborative environment across Brussels' financial sector.
Data Engineer I responsible for supporting AI solutions in Consumer applications at Bank of America. Involved in Hadoop ecosystem, big data technologies, and ensuring system stability.
Senior Data Architect defining comprehensive data strategy and architecture for AI. Delivering the organization’s data vision and ensuring governance and technical oversight of enterprise data architecture.
Data Engineer at Booz Allen utilizing data to impact critical missions like fraud detection and cancer research. Collaborating with analysts and developers on advanced technology solutions.
Technical Product Manager for Data Engineering at Betclic. Owning product roadmaps, driving data infrastructure evolution, and ensuring alignment across engineering teams.
Senior Data Engineer maturing strategic data assets and delivering business analytics in a regulated financial environment. Collaborating with stakeholders to advance business data strategy on cloud platforms.
Staff Data Engineer leading scalable data solutions for analytics and reporting at Asurion. Designing data pipelines and ensuring data quality across cloud platforms.
Lead Architect at Travelers shaping enterprise database and data platform solutions. Collaborating across technology and business units to drive digital transformation and modernization efforts.