Data Engineer Intern supporting Intrepid's Martech solutions for ecommerce brands while gaining hands-on experience with data systems. Collaborate with experienced team members to enhance data collection and integration processes.
Responsibilities
Support the Tech team in building and improving Intrepid Martech, our core marketing and data solution for brands.
Work closely with experienced Data Engineers, BI specialists, and Product team members to gain hands-on experience with real production data systems.
Help collect, process, and integrate data from multiple external sources into our data platforms, while learning best practices in data architecture, data quality, system reliability, and monitoring.
Assist in the design and development of crawler systems and data pipelines to collect data from e-commerce platforms, including login-required sources and open APIs.
Support the maintenance and continuous improvement of existing data pipelines, scraping workflows, and ingestion processes.
Test and validate data from APIs and crawling processes to ensure accuracy, consistency, and data quality.
Monitor data flows, identify issues, and troubleshoot basic problems under the guidance of senior engineers.
Work closely with team leaders and follow technical standards and guidelines to deliver assigned tasks effectively.
Document technical processes and contribute to internal knowledge sharing within the team.
Requirements
Careful, diligent, and responsible, with good teamwork and communication skills.
Currently pursuing or recently graduated with a degree in Information Technology, Software Engineering, Computer Science, or Data Science.
Basic understanding of backend development, data pipelines, and Linux environments.
Solid foundation in SQL/NoSQL, including writing queries, understanding table schemas, and basic query and table optimization.
Familiarity with data orchestration and scheduling tools such as Airflow (concepts, DAGs, task dependencies).
Basic knowledge of HTTP, HTML, JavaScript, and networking concepts.
Basic understanding of distributed systems and modern data infrastructure components, including:
Message queues and streaming platforms such as RabbitMQ and Kafka.
In-memory data stores and caching mechanisms such as Redis.
Search and analytics engines such as Elasticsearch.
Familiarity with Golang is preferred; exposure to Node.js, Python, and web scraping or browser automation tools (e.g. Puppeteer) is a plus.
Basic experience with Linux command-line tools.
Familiarity with version control systems such as Git.
Exposure to CI/CD concepts and pipelines (e.g. GitLab CI/CD) is a plus.
Willingness and ability to learn cloud-based and containerized technologies, such as Kubernetes (K8s), Cloud Functions, Cloud Storage, and BigQuery.
Basic awareness of AI-assisted development tools and AI agents, with an understanding of how to use them responsibly and effectively to support development, debugging, and learning.
Ability to review and evaluate AI-generated outputs, ensuring correctness, code quality, and alignment with existing systems before applying them.
Strong attention to code quality and maintainability, avoiding unnecessary complexity, redundant code, or “noise” in the codebase.
Eager to learn and work in an Agile development environment.
Interest in data collection, crawling/scraping, browser automation, and reverse engineering is an advantage.
Good written and spoken English communication skills.
Passion for ecommerce, data, and technology-driven solutions is a plus.
Benefits
The opportunity to contribute to the operating system for digital commerce in Southeast Asia, covering key platforms and functionalities across Middleware, Martech, Data & Analytics, and more.
Hands-on exposure to real-world SaaS platforms and large-scale data systems built on modern, scalable infrastructure.
The chance to work within a mature, enterprise-grade Tech team that has been operating since 2017, with well-established processes and best practices, while still maintaining a non-hierarchical, entrepreneurial, and collaborative culture.
Close collaboration with business stakeholders, ensuring your technical work has real-world impact and visibility.
Good ideas are encouraged, ownership is valued, and initiative is rewarded.
You are empowered to shape both your role and your growth.
You will have access to structured training through Intrepid Academy, coaching, and real-world project experience that accelerates your professional growth.
We offer a competitive internship allowance and a supportive environment that values your contributions and development.