Staff Data Engineer needed to architect data systems at Dialectica. Collaborate on complex infrastructure and mentor engineering teams, with a focus on high-quality data pipelines.
Responsibilities
Architect End-to-End Systems: Go beyond traditional data pipelining. Design and oversee the implementation of complex, distributed data architectures that span ingestion, processing, storage, and serving layers on AWS.
Bridge App & Data: Collaborate deeply with Backend and Frontend engineering teams during the design phase of new features. You will influence database schema design in microservices and define event emission standards to ensure data is usable downstream before code is even written.
API & Consumption Layer Design: Don't just dump data into a warehouse. Architect high-performance data access layers (e.g., GraphQL APIs, low-latency lookups via Redis) that allow our product teams to consume processed data easily and efficiently in the UI.
Elevate Engineering Standards: Define and enforce best practices for Infrastructure as Code (Terraform), CI/CD for data products, data testing, and observability across the entire stack.
Technical Leadership & Mentorship: Serve as a technical beacon for the data organization. Mentor Senior Data Engineers, conduct high-level code reviews, and drive pragmatic technical decision-making that balances immediate business needs with long-term scalability.
Solve the "Hardest" Problems: Take ownership of the most complex, intractable technical challenges related to data consistency, real-time processing, and cross-system integrations.
Requirements
8+ years of combined experience in Software Engineering and Data Engineering, with at least 3 years operating at a Senior or Lead level.
True "Full Stack" Exposure: You must have a background in core software engineering beyond just SQL and Python scripts. You should be comfortable navigating backend codebases (e.g., Node.js, Go, or Python application code) to understand how data is generated.
Mastery of Modern Data Stacks: Deep expertise in Python, advanced SQL, and architecting production data lakes/warehouses on cloud platforms (AWS preferred).
Architectural Expertise: Strong background in distributed systems design, event-driven architecture (Kafka, Kinesis, SNS/SQS), and microservices patterns. You understand the trade-offs between consistency and availability (CAP theorem) in real-world scenarios.
Infrastructure as Code: Deep hands-on experience with Kubernetes, Docker, and Terraform. You don't just use infrastructure; you design it.
Advanced Tooling: Expert-level knowledge of orchestration tools (Airflow, Dagster) and modern transformation tools like dbt.
A Product Mindset: You don't just serve data; you understand the business value of what you are building and how it impacts the end-user experience.
Fluency in English is a must.
Benefits
Competitive salary pegged to international standards with performance incentives.
Premium Prepaid Medicine (Medicina Prepagada) coverage.
Flexible hybrid or remote work model based in Bogotá.
Extra personal/flex days and paid volunteer days.
Learning and development budget (Udemy, conferences, certifications).
Entrepreneurial culture and amazing coworkers across 3 continents.
Company-sponsored team-bonding events and wellness activities.