Hybrid Data Platform Engineer



About the role

  • As a Data Platform Engineer at Mercari, you will develop data ingestion pipelines for a high-volume data processing system and collaborate with teams to ensure successful data integration.

Responsibilities

  • Design, develop, and maintain data ingestion pipelines for a high-volume data processing system that collects data from mobile and web applications (clients).
  • Develop and maintain streaming data pipelines to ingest raw data and write it to a data warehouse and lakehouse.
  • Implement batch data transformation pipelines.
  • Write SQL queries to extract, transform, and load data from various sources.
  • Collaborate with data scientists, data analysts, and software engineers to understand data requirements, manage data schemas, and ensure successful data integration.
  • Manage and maintain the CI/CD release pipeline.
  • Utilize Docker, YAML, Bash scripting, Terraform, and other technologies to automate infrastructure provisioning and deployments.
  • Monitor and troubleshoot pipeline issues to ensure smooth data flow and data quality.
  • Write clean, maintainable, and well-documented code.
  • Develop and execute the long-term goals and roadmap of the data platform.
  • Develop and maintain a very high-RPS REST service that receives user events from clients.
  • Develop and maintain a logging SDK for the server-side system.

Requirements

  • Resonates with the mission and values of the Mercari Group and its individual companies
  • Experience with streaming data processing frameworks such as Apache Beam, Spark, or Flink.
  • Experience with data warehouse technologies such as Google BigQuery, Amazon Redshift, Hive/Hadoop, or Snowflake.
  • Experience designing, developing, and operating large-scale services, distributed systems, and/or data pipelines in a variety of programming languages, including Go, Python, Java, and Scala.
  • Experience with building APIs and using data serialization formats (e.g., Protobuf, Avro, Parquet).
  • Experience in writing design documents or technical proposals and reaching agreements with stakeholders.
  • Familiarity with monitoring and alerting tools.
  • Experience with Google Cloud Platform (Dataflow, Pub/Sub, Kubernetes Engine, Compute Engine).
  • Experience with Confluent Cloud or Apache Kafka.
  • Experience with workflow engines such as Argo Workflows or Apache Airflow.
  • Experience publishing and contributing to OSS.
  • English: Independent (CEFR - B2)
  • Japanese: Independent (CEFR - B2) preferred

Job title

Data Platform Engineer

Job type

Not specified

Experience level

Mid level, Senior

Salary

Not specified

Degree requirement

Bachelor's Degree

Location requirements

Not specified