Staff Data Engineer designing and improving ETL/ELT pipelines at Prosper. Collaborating with teams to use trusted data efficiently across internal and external systems.
Responsibilities
Work with engineers, DBAs, infrastructure, product, data engineers, and analysts to learn Prosper’s data ecosystem and keep it fast, reliable, and secure.
Forge strong relationships with business stakeholders, analysts, and data scientists to grasp their needs and craft data solutions to meet them.
Design and run self-checking ETL/ELT pipelines with logging, alerting, and automated tests.
Develop pipelines on Google Cloud Platform (GCP) using Python, dbt, and Airflow (Composer), and additional tools as needed.
Evaluate new tools and approaches; bring forward practical ideas that improve speed, quality, or cost.
Bring curiosity, ownership, and clear thinking to tough engineering problems.
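The "self-checking pipelines with logging, alerting, and automated tests" responsibility above could look, in a minimal plain-Python sketch rather than a full Airflow/dbt deployment, like a load step that validates its own row counts and raises so the orchestrator can alert and retry. The table and function names here are illustrative, not part of Prosper's actual stack:

```python
import logging
import sqlite3

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("pipeline")

def load_orders(conn, rows):
    """Load rows into a staging table, then self-check:
    log the counts and raise on a mismatch so the orchestrator
    (e.g. an Airflow task) can surface an alert."""
    conn.execute("CREATE TABLE IF NOT EXISTS orders (id INTEGER, amount REAL)")
    conn.executemany("INSERT INTO orders VALUES (?, ?)", rows)
    loaded = conn.execute("SELECT COUNT(*) FROM orders").fetchone()[0]
    log.info("loaded %d rows (expected %d)", loaded, len(rows))
    if loaded != len(rows):
        raise ValueError(f"row-count check failed: {loaded} != {len(rows)}")
    return loaded

conn = sqlite3.connect(":memory:")
load_orders(conn, [(1, 9.99), (2, 25.00)])
```

In a production pipeline the same check would typically live in a dbt test or an Airflow task that fails the DAG run on a mismatch.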
Requirements
Degree in Computer Science or related field, or equivalent experience.
8+ years of object-oriented programming in an enterprise setting. Deep experience in Python; experience with Java, C#, or Go is a plus.
Proficiency in a SQL dialect (e.g., BigQuery, T-SQL, Redshift, PostgreSQL), with an interest in dimensional modeling and data warehousing.
Solid Git/GitHub skills and familiarity with Agile and the SDLC.
Strong communication and collaboration skills across technical and non-technical teams.
DevOps experience with CI/CD, containers (Docker, Kubernetes), and infrastructure as code (Terraform or similar).
Proficient with LLM-assisted development in IDEs such as Cursor.
Commitment to an inclusive, learning-focused culture and continuous improvement.
Data Engineer role focused on creating and monitoring data pipelines in an innovative energy company. Collaborate with IT and departments to ensure quality data availability in a hybrid work environment.
SQL Migration Data Engineer at Auxo Solutions focusing on Azure SQL/Fabric Lakehouse migrations and building data pipelines. Collaborating on technical designs and data governance for modernization initiatives.
Data Engineer developing cloud solutions and software tools on Microsoft Azure big data platform. Collaborating with various teams for data analysis and visualization in healthcare.
Boomi Integration Architect designing and leading integration solutions for data warehouses. Collaborating with cross-functional teams to implement scalable integration patterns using Boomi technologies.
Seeking a Boomi Integration Architect specializing in Data Warehouse and Master Data Hub implementations. Responsible for designing high-performance integration solutions across enterprise platforms.
Principal Data Engineer at Serko enhancing global travel tech through data-driven solutions. Collaborating across teams in Bengaluru to drive innovative engineering and best practices.
Data Engineer at Keyrus responsible for building and optimizing data pipelines for major projects. Contributing to data solutions and ensuring data quality in a growing team.
Data Architect designing and implementing scalable data architectures for Keyrus in Bordeaux. Leading client transitions and contributing to the tech ecosystem with innovative data solutions.