PrePass® is North America's most trusted weigh station bypass and toll management platform. We’re transforming how the transportation industry operates—creating solutions that keep trucks moving safely, efficiently, and compliantly. This means making bold decisions and building systems that support not only fleets but the broader economy. It all starts with enabling commercial vehicles to keep rolling with seamless toll management, weigh station bypass, and safety solutions. It’s what we do best, and we do it to meet the demands of the road every day.
That’s why people join us: our solutions are implemented in real-time, on highways and interstates across the nation, helping fleets go farther, faster. This work challenges and rewards, presenting complex problems that need ambitious answers. We hire bold thinkers with a heart for impact, a passion for progress, and the optimism to shape the future of transportation.
**About the Role**
We’re looking for a skilled Data Engineer to join our team in the transportation sector. In this role, you’ll work with modern cloud technologies to build and maintain data pipelines that support analytics, reporting, and operational insights. You’ll be part of a collaborative team focused on delivering reliable, scalable data solutions that help drive smarter decision-making across the organization.
This is a great opportunity for someone with solid experience in backend data systems who enjoys solving real-world problems and working with evolving data platforms. This is a hybrid position located at our office in downtown Phoenix.
**Key Responsibilities**
Design, develop, and maintain cloud-native data pipelines leveraging Databricks, Microsoft Azure Data Factory, and Microsoft Fabric to support robust data integration and analytics solutions.
Implement incremental and real-time data ingestion strategies using medallion architecture for data lake storage (an illustrative sketch of this pattern follows this list).
Write and optimize complex SQL queries to transform, integrate, and analyze data across enterprise systems.
Support and troubleshoot legacy data platforms built on SSIS and SQL Server, ensuring high availability and performance of critical data processes.
Develop features with a focus on scalability, maintainability, and testability.
Troubleshoot and resolve data integration and quality issues, ensuring reliable data delivery.
Participate in proof-of-concept projects, providing technical analysis and recommendations.
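To give a flavor of the ingestion pattern referenced above, here is a minimal, illustrative sketch of an incremental bronze-layer load using Databricks Auto Loader and Delta Lake. It is not a PrePass implementation; the storage paths and the "telemetry" source name are hypothetical placeholders.

```python
# Illustrative sketch only: incremental bronze-layer ingestion with Databricks
# Auto Loader (cloudFiles) and Delta Lake. Paths and source names are hypothetical.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()  # provided automatically in Databricks notebooks

raw_path = "abfss://landing@example.dfs.core.windows.net/telemetry/"          # hypothetical landing zone
bronze_path = "abfss://lake@example.dfs.core.windows.net/bronze/telemetry/"   # hypothetical bronze table
checkpoint = "abfss://lake@example.dfs.core.windows.net/_checkpoints/telemetry/"

# Auto Loader discovers only new files on each run, giving incremental ingestion
# without manual bookkeeping of which files were already loaded.
stream = (
    spark.readStream.format("cloudFiles")
    .option("cloudFiles.format", "json")
    .option("cloudFiles.schemaLocation", checkpoint)
    .load(raw_path)
)

# Append raw records to the bronze layer; cleansing and modeling happen in
# downstream silver and gold layers of the medallion architecture.
(
    stream.writeStream.format("delta")
    .option("checkpointLocation", checkpoint)
    .trigger(availableNow=True)  # process all currently available files, then stop
    .start(bronze_path)
)
```

The same pattern can be scheduled as a Databricks job or orchestrated from Azure Data Factory, with each run picking up only newly arrived files.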
**Requirements**
**Required**
5+ years of experience designing and building data solutions.
Strong proficiency in SQL and Python for data analytics and transformation.
Experience with ETL pipeline development and automation.
Solid understanding of Data Lake architecture and design principles.
Excellent collaboration skills and the ability to adapt in a dynamic environment.
**Preferred**
Experience with Azure Cloud services and cloud-based ETL tools.
Familiarity with data visualization tools such as Power BI or Tableau.
Understanding of event-driven architectures, including queues, batch processing, and pub/sub models.
Exposure to NoSQL databases like MongoDB or Cassandra.
**Bonus Points For**
Experience in Data Science or Machine Learning, particularly in model deployment or feature engineering.
**Benefits**
**How We Will Take Care of You**
Robust benefits package that includes medical, dental, and vision coverage starting on your date of hire.
Paid time off, including vacation, sick time, holidays, and floating holidays.
401(k) plan with employer match.
Company-funded “lifestyle account” upon date of hire for you to apply toward your physical and mental well-being (e.g., ski passes, retreats, gym memberships).
Tuition Reimbursement Program.
Voluntary benefits, including but not limited to legal and pet discounts.
Employee Assistance Program (available at no cost to you).
Company-sponsored and funded “Culture Team” that focuses on the Physical, Mental, and Professional well-being of employees.
Community Give-Back initiatives.
Culture that focuses on employee development initiatives.