Hybrid Mid-level Data Engineer – GCP


About the role

  • Data Engineer responsible for building and maintaining a Data Lake in BigQuery (GCP), including data ingestion and transformations, and collaborating on data governance, analysis, and visualization.

Responsibilities

  • Build and maintain a Data Lake in BigQuery (GCP)
  • Ingest data from databases (Oracle on-premises)
  • Version ELT transformations in Git
  • Apply data science techniques
  • Perform exploratory data analysis
  • Develop machine learning models
  • Create dashboards
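The ingest-and-transform responsibilities above follow a common raw-then-curated layering. A minimal sketch in Python, assuming an in-memory stand-in for the Oracle source (all table and field names here are illustrative, not from the listing):

```python
from datetime import date

# Hypothetical rows standing in for an extract from an
# on-premises Oracle database (names are illustrative).
source_rows = [
    {"order_id": 1, "amount": "125.50", "order_date": "2024-03-01"},
    {"order_id": 2, "amount": "80.00", "order_date": "2024-03-02"},
]

def land_raw(rows):
    """Landing step: copy rows as-is into a 'raw' layer (here, a list)."""
    return [dict(row) for row in rows]

def transform(raw_rows):
    """Transformation step: cast types for a curated layer."""
    return [
        {
            "order_id": row["order_id"],
            "amount": float(row["amount"]),
            "order_date": date.fromisoformat(row["order_date"]),
        }
        for row in raw_rows
    ]

raw_layer = land_raw(source_rows)
curated_layer = transform(raw_layer)
print(curated_layer[0]["amount"])  # 125.5
```

In a real pipeline the raw layer would be a BigQuery staging table and the transform an ELT statement versioned in Git; the separation of the two steps is the point of the sketch.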

Requirements

  • Advanced SQL — window functions, deduplication, DML in BigQuery
  • Python — ingestion scripts, pipeline automation, and data analysis
  • BigQuery — modeling, machine learning, partitioning, clustering, and cost optimization
  • Google Cloud Platform — basic console and CLI (gcloud) navigation and operation
  • Git — versioning for code and transformations
  • Advanced English — ability to read technical documentation
  • Experience with data projects on BigQuery / GCP
  • Experience with batch ingestion pipelines (any stack)
  • Experience with layered ELT/ETL transformations
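As a concrete illustration of the deduplication skill listed above: in BigQuery this is typically `ROW_NUMBER() OVER (PARTITION BY key ORDER BY updated_at DESC) = 1`, which keeps the latest row per key. A pure-Python equivalent, with hypothetical column names:

```python
def dedupe_latest(rows, key, order_by):
    """Keep one row per key, choosing the largest order_by value.

    Mirrors the BigQuery pattern:
    ROW_NUMBER() OVER (PARTITION BY key ORDER BY order_by DESC) = 1
    """
    latest = {}
    for row in rows:
        k = row[key]
        if k not in latest or row[order_by] > latest[k][order_by]:
            latest[k] = row
    return list(latest.values())

# ISO-8601 date strings compare correctly as plain strings.
rows = [
    {"customer_id": 1, "updated_at": "2024-01-01", "status": "new"},
    {"customer_id": 1, "updated_at": "2024-02-01", "status": "active"},
    {"customer_id": 2, "updated_at": "2024-01-15", "status": "new"},
]
deduped = dedupe_latest(rows, key="customer_id", order_by="updated_at")
print(len(deduped))  # 2
```

This is a sketch of the logic only; in BigQuery the same result comes from a `QUALIFY` clause or a filtered subquery over the window function.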

Benefits

  • Work arrangement: Hybrid
  • Data professional role focused on building and maintaining a Data Lake in BigQuery (GCP)
  • Autonomy across data engineering, governance, and analysis/visualization

Job title

Mid-level Data Engineer – GCP

Experience level

Mid-level, Senior

Salary

Not specified

Degree requirement

No Education Requirement
