Software engineer maintaining scalable big data platforms and backend infrastructure for cryptocurrency platform Crypto.com.
Responsibilities
Manage the daily operations of Hadoop, Spark, Flink, ClickHouse, and other big data platforms to ensure system stability and data security;
Operate and administer Linux systems using common commands and scripts written in Shell, Python, or similar languages;
Maintain and operate big data components on cloud platforms (AWS/GCP/Azure), with hands-on experience in container technologies such as Kubernetes and Docker;
Build and maintain monitoring systems to track platform performance, optimize configurations and resource allocation, and troubleshoot performance issues (using tools such as Datadog, Prometheus, Grafana, and Zabbix);
Deploy, upgrade, and scale core big data components including Flink, Kafka, Spark, Impala, ClickHouse, and Kudu;
Write and maintain technical documentation for operations, and support platform solution design and upgrades;
Explore and adopt new technologies to optimize operation processes and provide technical guidance to help client teams improve their capabilities.
Preferred: experience developing n8n workflows
Requirements
Familiarity with CDH/CDP/HDP or similar Hadoop distributions, with proven experience in big data platform operations preferred;
Proficiency in Linux administration and scripting (Shell, Python, or equivalent);
Experience with major cloud platforms (AWS/GCP/Azure) and containerized environments (Kubernetes, Docker);
Strong knowledge of big data components and performance tuning, able to resolve stability and scalability issues independently;
Strong technical writing and communication skills to support documentation and solution delivery;
Self-motivated with a passion for new technologies, able to apply innovative approaches to optimize operations and drive efficiency.
Benefits
Competitive salary
Attractive annual leave entitlement, including birthday and work anniversary leave
Work flexibility: flexible working hours and a hybrid or remote set-up
Career development through our internal mobility program, which offers employees a diverse scope of opportunities
Work perks: Crypto.com Visa Card provided upon joining