Design, build, and maintain scalable and reliable data systems to support Forto’s innovation in the logistics industry.
Collaborate with cross-functional teams—including Software Engineering, Analytics, and Product—to deliver impactful data products.
Drive the adoption of robust, high-throughput data processing architectures.
Partner with stakeholders to promote a strong, data-driven culture.
Manage and optimize GCP infrastructure, including CI/CD workflows, Terraform, and container orchestration.
Contribute to the ongoing development of our DevOps and Data Engineering best practices.
Requirements
Proficient in Python, with a strong understanding of data processing libraries, testing strategies, and best practices
Highly skilled in SQL, particularly with BigQuery, and experienced with modern data platforms
Demonstrated experience building and maintaining large-scale pipelines using Apache Airflow
Genuine interest in AI/LLM engineering, including RAG, fine-tuning, prompt engineering, or LLMOps
Growth-minded, with a drive to learn and keep pace with advancements in data engineering
Experienced in cross-functional environments and comfortable engaging in architectural discussions, stakeholder meetings, and collaborative working models (pair/mob sessions)
Fluent in written and spoken English
Willing to work in a hybrid arrangement, spending at least 2 days per week in the office
Intermediate Data Engineer designing and building data pipelines for travel industry data management. Collaborating across teams to ensure reliable data for analytics and reporting.
Data Engineer managing and organizing datasets for AI models at Walaris, which develops AI-driven autonomous systems for defense and security applications.
Data Engineer designing and maintaining data pipelines at Black Semiconductor. Collaborating with process, equipment, and IT teams to support manufacturing analytics and decision-making.
Junior Data Engineer role focusing on Business Intelligence and Big Data at Avanade. Collaborating on data analysis and SQL queries in a supportive learning environment.
GCP Data Engineer designing and developing data processing modules for Ki, an algorithmic insurance carrier. Working closely with multiple teams to optimize data pipelines and reporting.
Data Engineer at Securian Financial optimizing scalable data pipelines for AI and advanced analytics. Collaborating with teams to deliver secure and accessible data solutions.
IT Data Engineering Co‑Op at BlueRock Therapeutics supporting the development of scientific data systems. Collaborating on data workflows and foundational AWS data engineering tasks.
Data Engineer I building and operationalizing complex data solutions for Travelers' analytics using Databricks. Collaborating within teams to educate end users and support data governance.
Data Engineer shaping modern data architecture to drive golf’s digital transformation. Collaborating with teams to enhance data pipelines and insights for customer engagement and revenue growth.
Staff Data Engineer overseeing complex data systems for CITY Furniture. Responsible for architecting and optimizing data ecosystems in a hybrid work environment.