Data Engineer developing scalable data pipelines for ETL/ELT processes using GCP services. Collaborating with team members to optimize data workflows and ensure data integrity.
Responsibilities
Design, develop, and implement scalable and efficient data pipelines using GCP services (e.g., Dataflow, Cloud Functions, Workflows) for ETL/ELT processes.
Build robust and scalable solutions for orchestrating data workflows, ensuring data integrity.
Develop and optimize data models and data warehouse solutions in BigQuery.
Write complex and highly optimized SQL queries to extract, transform, and load data.
Define and enforce best practices for data platform development and usage.
Evaluate new technologies to improve our data capabilities.
Provide technical expertise and guidance to junior team members.
Collaborate closely with team members to understand data requirements and propose solutions.
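As an illustration of the ETL/ELT pattern the responsibilities above describe, here is a minimal, self-contained Python sketch. The in-memory source and sink are hypothetical stand-ins; a real pipeline in this role would read from and write to GCP services such as GCS, Dataflow, or BigQuery.

```python
# Minimal ETL sketch: extract raw records, transform them, load into a sink.
# Source and sink are hypothetical in-memory stand-ins for real GCP services.

def extract(source):
    """Extract: yield raw records from a source (here, a list of dicts)."""
    yield from source

def transform(records):
    """Transform: drop incomplete records, normalize names, cast amounts."""
    for rec in records:
        if rec.get("name") and rec.get("amount") is not None:
            yield {
                "name": rec["name"].strip().title(),
                "amount": float(rec["amount"]),
            }

def load(records, sink):
    """Load: append transformed records to a sink (here, a list)."""
    for rec in records:
        sink.append(rec)
    return sink

raw = [
    {"name": " alice ", "amount": "10"},
    {"name": "", "amount": "5"},       # incomplete: filtered out
    {"name": "bob", "amount": 2},
]
warehouse = []
load(transform(extract(raw)), warehouse)
```

Because each stage is a generator, records stream through one at a time rather than materializing intermediate lists, which is the same shape a Dataflow or Beam pipeline generalizes to at scale.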
Requirements
Strong proficiency in Python and SQL, including advanced SQL query writing and optimization.
Working knowledge of Java is a plus.
Experience with Google Cloud Platform (GCP) services, such as BigQuery, Cloud Run, Data Catalog, Cloud Functions, IAM, GCS, Monitoring, Workflows, Cloud SQL, and Secret Manager.
Solid understanding of backend development concepts, including CI/CD pipelines (CircleCI), Docker, and microservices architecture.
Knowledge of data modeling, data architecture, data pipelines, and ETL/ELT processes.
Familiarity with business intelligence tools; experience building dashboards in Sisense is a significant advantage.
Experience in shell scripting.
Experience with AWS is a plus.
Excellent communication and interpersonal skills to effectively collaborate with team members and stakeholders.
Strong problem-solving and analytical abilities to identify and resolve complex technical challenges.
Ability to work independently and manage tasks effectively.
A strong passion for data and a commitment to delivering high-quality data solutions.
Experience with data mesh principles and practices would be a significant advantage.
Familiarity with data governance and compliance frameworks is desirable.
Data Engineer at LPL Financial developing Python-based ETL pipelines. Collaborating with cross-functional teams to ensure reliable data delivery and optimize pipeline performance.
Senior Data Engineer at Keyrus focusing on data solutions and projects to drive performance. Collaborating with teams globally to enhance data transformation and governance processes.
Data Governance Engineer in Fintech developing a formal cyber data governance framework. Collaborating with cyber security, analytics, and platform engineering teams on metadata and lineage capabilities.
Junior Data Engineer role at Allegro, focusing on developing ETL/ELT pipelines and processing large datasets. Collaborating with cross-functional teams for data quality and reporting.
Data Engineer at Concept Reply developing innovative data-driven solutions in IoT. Collaborating with teams to unlock the potential of data and cloud computing.
Data Engineer creating and managing data pipelines for critical data solutions at S&P Global. Collaborating on enterprise-scale data processing in a supportive, innovative environment.
Data Engineer supporting and evolving a data environment during cloud migration. Maintaining and optimizing existing databases while designing modern data solutions through cross-functional collaboration.
Senior Data Engineer responsible for data pipeline projects at Suprema Gaming. Focus on batch and streaming data solutions while collaborating with business teams.
Senior data leader managing the enterprise data architecture at Breakthru Beverage. Leading high-performing teams in data engineering and defining modern data strategies.
Data Engineer at Equinix implementing data architecture solutions for scalability and analytics. Collaborating with teams to design data pipelines and maintain data models for business objectives.