Senior Staff Data Engineer at DeepL working on enterprise-wide data engineering standards and cloud solutions. Leading technical initiatives and mentoring engineers to support data capabilities across the organization.
Responsibilities
Define and implement enterprise-wide data engineering standards, strategies, and best practices for data solutions
Provide expert guidance on technology selection, cloud services (AWS), and architectural decisions for data solutions
Drive continuous improvements in efficiency, cost reduction, and innovation across data systems
Evaluate and recommend tools, technologies, and frameworks to enhance our data capabilities
Partner with and influence leaders from engineering, analytics, machine learning, and security teams to align on goals
Mentor and act as a thought leader across data, engineering, and platform teams, fostering a culture of technical excellence
Collaborate with cross-functional stakeholders to understand data requirements and translate them into technical solutions
Work closely with customer-facing teams to ensure data solutions meet enterprise client needs
Drive best practices in data security, governance, and compliance aligned with enterprise B2B standards
Implement robust security measures for data at rest and in transit
Ensure thorough documentation of data processes, systems architecture, and stakeholder dependencies
Maintain compliance with GDPR, SOC 2, and other relevant regulatory requirements
Requirements
Extensive data expertise: 10+ years of experience in data engineering or a related role, with at least 5 years in a staff or principal role
Data architecture experience: Deep understanding of data infrastructure, data warehousing, ETL/ELT processes, and/or data pipeline orchestration
Cloud mastery: Proven experience with cloud platforms (AWS, Azure, or GCP) and cloud-native data services
Scripting & automation: Advanced scripting skills in Python, Bash, or similar languages for automation and tooling
Leadership & communication: Proven track record of technical leadership, mentoring engineers, and influencing cross-functional teams
Enterprise experience: Experience working in high-growth technology or SaaS environments with distributed systems and microservices architecture
Experience with data-specific tools and technologies such as Apache Airflow, dbt, Apache Spark, Kafka, or similar
Experience with real-time streaming data processing pipelines (Spark, Flink, etc.)
Knowledge of data warehousing solutions (Snowflake, BigQuery, Redshift) and data lake architectures
Background in data engineering or analytics engineering
AI-Native Orchestration & Advocacy: You don’t just use AI; you redefine engineering workflows through it. You possess a deep-seated belief in AI’s power to transform the software development lifecycle, data accessibility and infrastructure management.
Nice to haves
Familiarity with machine learning operations (MLOps) and ML infrastructure
Experience with security best practices, including secrets management, network security, and compliance frameworks
Experience with agile methodologies and tools (Jira, Confluence) for managing project timelines and deliverables
Contributions to open-source projects or technical community involvement
Experience with cost optimization strategies for cloud infrastructure
Benefits
Diverse and internationally distributed team: joining our team means becoming part of a large, global community with people of more than 90 nationalities. We're more than just colleagues; we're a group of professionals with a shared mission to connect diverse cultures. Our global presence is growing–we've doubled in size nearly every year, with our employees based in the UK, Germany, the Netherlands, Poland, the US, and Japan, and we continue to expand our network.
Open communication, regular feedback: as a language-focused company, we value the importance of clear, honest communication. We value smooth collaboration, direct and actionable feedback, and believe that leading with empathy and growth mindset makes us better together.
Hybrid work, flexible hours: we offer a hybrid work schedule, with team members coming into the office twice a week. This allows you to engage directly with your team and experience the unique energy of our workspace, while still enjoying the flexibility and comfort of working from home. With flexible working hours and trust in your productivity, we are in sync with your team’s general locations and time zones to foster effective and seamless collaboration.
Virtual Shares - An ownership mindset in every role. We believe everyone should share in our success, and that’s why every employee receives Virtual Shares, linking your contribution directly to DeepL’s growth and rewarding you with a stake in our future.
Regular in-person team events: we bond over vibrant events that are as unique as our team, from local team and business unit gatherings, to new-joiner onboardings, to company-wide events that bring us all together–literally.
Monthly full-day hacking sessions: every month, we have Hack Fridays, where you can spend your time diving into a project you're passionate about and get the opportunity to work with other teams–we value your initiatives, impact, and creativity.
30 days of annual leave: we value your peace of mind. With 30 days off (excluding public holidays) and access to mental health resources, we make sure you're as strong mentally as you are professionally.
Competitive benefits: just as our team spans the globe, so does our benefits package. We've crafted it to reflect the diversity of our team and tailored it to align with your unique location, to ensure you feel supported every step of the way.
Senior Data Engineer designing and optimizing data pipelines on AWS for The Rec Hub. Collaborating with the team to enhance data processing and provide mentorship.
Data Engineer responsible for analyzing data and supporting decision-making processes in a wellness-focused company. Utilizing advanced data technologies and collaborating with internal teams for effective data management.
Data Engineer at TeCreation focusing on data analysis and innovative system development in the well-being industry. Collaborating on data integration and business intelligence reporting.
Senior Data Engineer developing high-impact data solutions in a collaborative financial team. Integrating data systems and ensuring performance with innovative technologies.
Senior Data Engineer developing data pipelines and infrastructure on Google Cloud Platform for WorkWhile's staffing marketplace. Collaborating with Data Science and Engineering teams to enhance data quality and availability.
Data Engineer developing data platforms for a consulting firm focused on quality solutions. Collaborating within a small team to deliver robust infrastructure and systems.
Senior Data Engineer designing and maintaining data pipelines within a fast-growing social impact startup. Collaborating cross-functionally to enhance products and analytics capabilities.
Senior Associate in data engineering at PwC focusing on designing robust data solutions. Leading complex data pipeline projects and collaborating with cross-functional teams to support automation and analytics.
Data Engineering & Warehousing Manager overseeing data engineering and warehousing operations at Hastings Insurance. Leading pipelines, platforms, and technical teams for enterprise data insights.
Senior Data Engineer delivering scalable data solutions on the data engineering team at a fintech startup. Building and maintaining data pipelines and collaborating with cross-functional teams for accurate data delivery.