Data Engineer developing scalable data pipelines for RunBuggy's automotive logistics platform. Collaborate with cross-functional teams to unlock powerful insights and optimize data infrastructure.
Responsibilities
Design, develop, and maintain scalable data pipelines and systems.
Independently create and own new data capture/ETL pipelines for the entire stack and ensure data quality.
Collaborate with data scientists, engineers, business leaders, and other stakeholders to understand data requirements and provide the necessary infrastructure.
Create and contribute to frameworks that improve the effectiveness of data logging, issue triage, and resolution.
Define and manage Service Level Agreements (SLAs) for all data sets in allocated areas of ownership.
Lead data engineering projects and determine the appropriate tools and libraries for each task.
Implement data security and privacy best practices.
Create and maintain technical documentation for data engineering processes.
Work with cloud-based data storage and processing solutions, including containerized deployments (e.g., Docker and Kubernetes).
Build out and support a DAG orchestration cluster framework.
Migrate workflows from batch processes to the DAG cluster via concurrent data flows.
Maintain data pipelines, including debugging code, monitoring, and incident response.
Collaborate with engineering to enforce data collection standards and data contracts for APIs, databases, and other sources.
Optimize pipelines, dashboards, frameworks, and systems to facilitate easier development of data artifacts.
Requirements
Bachelor's degree in Computer Science, Engineering, or a related field required; Master's degree preferred.
5+ years of experience in data engineering.
Proficiency in Python and experience with data engineering libraries (e.g., Pandas).
Experience with ETL processes and tools.
Strong knowledge of relational and non-relational databases.
Experience with cloud platforms (e.g., AWS, GCP, Azure).
Excellent communication skills.
Ability to work independently and lead projects.
Experience with data warehousing solutions.
Familiarity with data visualization tools (e.g., Tableau).
Experience with building and managing DAG orchestration clusters (e.g., Airflow, Prefect).
Ability to work with JavaScript, Node.js, AngularJS, Java, and Spring Boot.
Knowledge of machine learning and data science workflows.
Ability to handle a variety of duties in a fast-paced environment.
Excellent organizational skills, along with professionalism and diplomacy with internal and external customers/vendors.
Ability to prioritize tasks and manage time.
Ability to work under tight deadlines.
Benefits
Highly competitive medical, dental, vision, Life w/ AD&D, Short-Term Disability insurance, Long-Term Disability insurance, pet insurance, identity theft protection, and a 401(k) retirement savings plan.
Employee wellness program.
Employee rewards, discounts, and recognition programs.
Generous company-paid holidays (12 per year), vacation, and sick time.
Paid paternity/maternity leave.
Monthly connectivity/home office stipend if working from home 5 days a week.
A supportive and positive space for you to grow and expand your career.