Data Engineer role at Qodea, a leading cloud consultancy. Designing and maintaining scalable data pipelines using cloud platforms.
Responsibilities
How You’ll Shape Our Success
The purpose of this role is to design, build, and maintain scalable data pipelines and infrastructure that enable the efficient processing and analysis of large, complex data sets.
What You’ll Do
*Develop and maintain automated data processing pipelines using Google Cloud:*
Design, build, and maintain data pipelines to support data ingestion, ETL, and storage
Automate data quality monitoring and troubleshoot pipeline issues
*Implement and maintain databases and data storage solutions:*
Stay up-to-date with emerging trends and technologies in big data and data engineering
Ensure data quality, accuracy, and completeness
*Implement and enforce data governance policies and procedures to ensure data quality and accuracy:*
Collaborate with data scientists and analysts to design and optimise data models for analytical and reporting purposes
Develop and maintain data models to support analytics and reporting
Monitor and maintain data infrastructure to ensure availability and performance
Requirements
What You’ll Need to Succeed
Experience with cloud platforms such as Amazon Web Services (AWS) or Google Cloud Platform (GCP).
Proficiency in SQL and experience with relational databases such as MySQL, PostgreSQL, or Oracle.
Experience with big data technologies such as Hadoop, Spark, or Hive.
Familiarity with data warehousing and ETL tools such as Amazon Redshift, Google BigQuery, or Apache Airflow.
Proficiency in at least one programming language such as Python, Java, or Scala.
Strong analytical and problem-solving skills with the ability to work independently and in a team environment.
Benefits
**Financial:**
Competitive base salary
Discretionary company bonus scheme
Employee referral scheme
Meal vouchers
**Health & Wellbeing:**
Health care package
Life and health insurance
Bookster
**Time Off & Flexibility:**
28 days of annual leave
Floating bank holidays
An extra paid day off on your birthday
Ten paid learning days per year
Flexible working hours
Sabbatical leave (after 5 years)
Work from anywhere (up to 3 weeks per year)
**Development & Recognition:**
Industry-recognised training & certifications
Bonusly: employee recognition and rewards platform