Engineering team at Crypto.com responsible for designing and developing software for various Ventures projects, with a focus on scalable applications and innovative solutions.
Responsibilities
Manage the daily operations of Hadoop, Spark, Flink, ClickHouse, and other big data platforms to ensure system stability and data security;
Operate and administer Linux systems, proficient in common commands, with scripting skills in Shell, Python, or similar languages;
Maintain and operate big data components on cloud platforms (AWS/GCP/Azure), with hands-on experience in container technologies such as Kubernetes and Docker;
Build and maintain monitoring systems to track platform performance, optimize configurations and resource allocation, and troubleshoot performance issues (tools such as Datadog, Prometheus, Grafana, Zabbix, etc.);
Deploy, upgrade, and scale core big data components including Flink, Kafka, Spark, Impala, ClickHouse, and Kudu;
Write and maintain technical documentation for operations, and support platform solution design and upgrades;
Explore and adopt new technologies to optimize operation processes and provide technical guidance to help client teams improve their capabilities.
Preferred: experience developing n8n workflows
Requirements
Familiarity with CDH/CDP/HDP or similar Hadoop distributions, with proven experience in big data platform operations preferred;
Proficiency in Linux administration and scripting (Shell, Python, or equivalent);
Experience with major cloud platforms (AWS/GCP/Azure) and containerized environments (Kubernetes, Docker);
Strong knowledge of big data components and performance tuning, able to resolve stability and scalability issues independently;
Strong technical writing and communication skills to support documentation and solution delivery;
Self-motivated with a passion for new technologies, able to apply innovative approaches to optimize operations and drive efficiency.
Benefits
Competitive salary
Attractive annual leave entitlement, including birthday and work-anniversary leave
Work flexibility: flexi-hours with hybrid or remote set-up
Internal mobility program offering employees diverse career opportunities
Work perks: Crypto.com Visa card provided upon joining