Staff Data Engineer needed to architect data systems at Dialectica. You will collaborate on complex data infrastructure and mentor engineering teams, with a focus on high-quality data pipelines.
Responsibilities
Architect End-to-End Systems: Go beyond traditional data pipelining. Design and oversee the implementation of complex, distributed data architectures that span ingestion, processing, storage, and serving layers on AWS.
Bridge App & Data: Collaborate deeply with Backend and Frontend engineering teams during the design phase of new features. You will influence database schema design in microservices and define event emission standards to ensure data is usable downstream before code is even written.
API & Consumption Layer Design: Don't just dump data into a warehouse. Architect high-performance data access layers (e.g., GraphQL APIs, low-latency lookups via Redis) that allow our product teams to consume processed data easily and efficiently in the UI.
Elevate Engineering Standards: Define and enforce best practices for Infrastructure as Code (Terraform), CI/CD for data products, data testing, and observability capabilities across the entire stack.
Technical Leadership & Mentorship: Serve as a technical beacon for the data organization. Mentor Senior Data Engineers, conduct high-level code reviews, and drive pragmatic technical decision-making that balances immediate business needs with long-term scalability.
Solve the "Hardest" Problems: Take ownership of the most complex, intractable technical challenges related to data consistency, real-time processing, and cross-system integrations.
Requirements
8+ years of combined experience in Software Engineering and Data Engineering, with at least 3 years operating at a Senior or Lead level.
True "Full Stack" Exposure: You must have a background in core software engineering beyond just SQL and Python scripts. You should be comfortable navigating backend codebases (e.g., Node.js, Go, or Python application code) to understand how data is generated.
Mastery of Modern Data Stacks: Deep expertise in Python, advanced SQL, and architecting Production Data Lakes/Warehouses on cloud platforms (AWS preferred).
Architectural Expertise: Strong background in distributed systems design, event-driven architecture (Kafka, Kinesis, SNS/SQS), and microservices patterns. You understand the trade-offs between consistency and availability (CAP theorem) in real-world scenarios.
Infrastructure as Code: Deep hands-on experience with Kubernetes, Docker, and Terraform. You don't just use infrastructure; you design it.
Advanced Tooling: Expert-level knowledge of orchestration tools (Airflow, Dagster) and modern transformation tools like dbt.
A Product Mindset: You don't just serve data; you understand the business value of what you are building and how it impacts the end-user experience.
Fluency in English is a must.
Benefits
Competitive salary pegged to international standards with performance incentives.
Premium Prepaid Medicine (Medicina Prepagada) coverage.
Flexible Hybrid or Remote work model based in Bogotá.
Extra personal/flex days and paid volunteer days.
Learning and development budget (Udemy, conferences, certifications).
Entrepreneurial culture and amazing coworkers across 3 continents.
Company-sponsored team-bonding events and wellness activities.