Manage the daily operations of Hadoop, Spark, Flink, ClickHouse, and other big data platforms to ensure system stability and data security;
Operate and administer Linux systems with proficiency in common commands, and write scripts in Shell, Python, or similar languages;
Maintain and operate big data components on cloud platforms (AWS/GCP/Azure), with hands-on experience in container technologies such as Kubernetes and Docker;
Build and maintain monitoring systems to track platform performance, optimize configurations and resource allocation, and troubleshoot performance issues (tools such as Datadog, Prometheus, Grafana, Zabbix, etc.);
Deploy, upgrade, and scale core big data components including Flink, Kafka, Spark, Impala, ClickHouse, and Kudu;
Write and maintain technical documentation for operations, and support platform solution design and upgrades;
Explore and adopt new technologies to optimize operation processes and provide technical guidance to help client teams improve their capabilities.
Preferred: experience developing n8n workflows
Requirements
Familiarity with CDH/CDP/HDP or similar Hadoop distributions, with proven experience in big data platform operations preferred;
Proficiency in Linux administration and scripting (Shell, Python, or equivalent);
Experience with major cloud platforms (AWS/GCP/Azure) and containerized environments (Kubernetes, Docker);
Strong knowledge of big data components and performance tuning, able to resolve stability and scalability issues independently;
Strong technical writing and communication skills to support documentation and solution delivery;
Self-motivated with a passion for new technologies, able to apply innovative approaches to optimize operations and drive efficiency.
Benefits
Competitive salary
Attractive annual leave entitlement, including birthday and work anniversary leave
Work flexibility: flexible working hours and hybrid or remote setup
Career growth through internal mobility: our internal mobility program offers employees a diverse scope of opportunities
Work perks: Crypto.com Visa Card provided upon joining
Our Crypto.com benefits packages vary depending on regional requirements; you can learn more from our talent acquisition team.