Work with engineers, DBAs, infrastructure, product, data engineers, and analysts to learn Prosper’s data ecosystem and keep it running fast and secure.
Forge strong relationships with business stakeholders, analysts, and data scientists to grasp their needs and craft data solutions to meet them.
Design and run self-checking ETL/ELT pipelines with logging, alerting, and automated tests.
Develop pipelines on Google Cloud Platform (GCP) using Python, dbt, and Airflow (Composer), and additional tools as needed.
Evaluate new tools and approaches; bring forward practical ideas that improve speed, quality, or cost.
Bring curiosity, ownership, and clear thinking to tough engineering problems.
Requirements
Degree in Computer Science or related field, or equivalent experience.
8+ years of object-oriented programming in an enterprise setting. Deep experience in Python; experience with Java, C#, or Go is a plus.
Proficiency in SQL (e.g., BigQuery, T-SQL, Redshift, PostgreSQL), with an interest in dimensional modeling and data warehousing.
Solid Git/GitHub skills and familiarity with Agile and the SDLC.
Strong communication and collaboration skills across technical and non-technical teams.
DevOps experience with CI/CD, containers (Docker, Kubernetes), and infrastructure as code (Terraform or similar).
Proficient with LLM-assisted development in IDEs such as Cursor.
Commitment to an inclusive, learning-focused culture and continuous improvement.