Staff Data Engineer designing and improving ETL/ELT pipelines at Prosper. Collaborating with teams to use trusted data efficiently across internal and external systems.
Responsibilities
Work with engineers, DBAs, infrastructure, product, data engineers, and analysts to learn Prosper’s data ecosystem and keep it running fast and secure.
Forge strong relationships with business stakeholders, analysts, and data scientists to grasp their needs and craft data solutions to meet them.
Design and run self-checking ETL/ELT pipelines with logging, alerting, and automated tests.
Develop pipelines on Google Cloud Platform (GCP) using Python, dbt, and Airflow (Composer), and additional tools as needed.
Evaluate new tools and approaches; bring forward practical ideas that improve speed, quality, or cost.
Bring curiosity, ownership, and clear thinking to tough engineering problems.
Requirements
Degree in Computer Science or related field, or equivalent experience.
8+ years of object-oriented programming in an enterprise setting. Deep experience in Python; experience with Java, C#, or Go is a plus.
Proficiency in SQL (e.g., BigQuery, T-SQL, Redshift, PostgreSQL), with an interest in dimensional modeling and data warehousing.
Solid Git/GitHub skills and familiarity with Agile and the SDLC.
Strong communication and collaboration skills across technical and non-technical teams.
DevOps experience with CI/CD, containers (Docker, Kubernetes), and infrastructure as code (Terraform or similar).
Proficiency with LLM-assisted development in IDEs such as Cursor.
Commitment to an inclusive, learning-focused culture and continuous improvement.
Data Engineer designing and maintaining data pipelines at Black Semiconductor. Collaborating with process, equipment, and IT teams to support manufacturing analytics and decision-making.
Junior Data Engineer role focusing on Business Intelligence and Big Data at Avanade. Collaborating on data analysis and SQL queries in a supportive learning environment.
GCP Data Engineer designing and developing data processing modules for Ki, an algorithmic insurance carrier. Working closely with multiple teams to optimize data pipelines and reporting.
Data Engineer at Securian Financial optimizing scalable data pipelines for AI and advanced analytics. Collaborating with teams to deliver secure and accessible data solutions.
IT Data Engineering Co-Op at BlueRock Therapeutics supporting development of scientific data systems. Collaborating on data workflows and foundational AWS data engineering tasks.
Data Engineer I building and operationalizing complex data solutions for Travelers' analytics using Databricks. Collaborating within teams to educate end users and support data governance.
Data Engineer shaping modern data architecture to drive golf’s digital transformation. Collaborating with teams to enhance data pipelines and insights for customer engagement and revenue growth.
Staff Data Engineer overseeing complex data systems for CITY Furniture. Responsible for architecting and optimizing data ecosystems in a hybrid work environment.
Data Engineer strengthening data platform team at Samba TV to improve data analytics and reporting capabilities. Building on AWS, Databricks, BigQuery, and Snowflake technology.
Data Engineer focusing on secure ETL/ELT data pipelines and compliance in healthcare. Designing scalable ingestion frameworks and ensuring alignment with federal standards.