Senior DataOps Engineer at Crypto.com, designing and improving scalable applications while managing big data operations and cloud platforms across diverse projects.
Responsibilities
Manage the daily operations of Hadoop, Spark, Flink, ClickHouse, and other big data platforms to ensure system stability and data security;
Operate and administer Linux systems using common commands and scripts written in Shell, Python, or similar languages;
Maintain and operate big data components on cloud platforms (AWS/GCP/Azure), with hands-on experience in container technologies such as Kubernetes and Docker;
Build and maintain monitoring systems to track platform performance, optimize configurations and resource allocation, and troubleshoot performance issues (tools such as Datadog, Prometheus, Grafana, Zabbix, etc.);
Deploy, upgrade, and scale core big data components including Flink, Kafka, Spark, Impala, ClickHouse, and Kudu;
Write and maintain technical documentation for operations, and support platform solution design and upgrades;
Explore and adopt new technologies to optimize operation processes and provide technical guidance to help client teams improve their capabilities.
Preferred: experience developing n8n workflows.
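To illustrate the kind of day-to-day operational scripting the responsibilities above call for, here is a minimal, hypothetical Python sketch that evaluates node disk-usage metrics against an alert threshold, as might feed a Prometheus or Grafana alert. All names (check_disk_usage, THRESHOLD_PCT) and the threshold value are assumptions, not part of the role description.

```python
# Hypothetical alerting threshold; real values depend on cluster policy.
THRESHOLD_PCT = 85.0

def check_disk_usage(metrics: dict, threshold: float = THRESHOLD_PCT) -> list:
    """Return the node names whose disk-usage percentage exceeds the threshold."""
    return sorted(node for node, pct in metrics.items() if pct > threshold)

if __name__ == "__main__":
    # Sample per-node disk-usage percentages, as might be scraped from an exporter.
    sample = {"datanode-1": 72.4, "datanode-2": 91.3, "kafka-0": 88.0}
    for node in check_disk_usage(sample):
        print(f"ALERT: {node} disk usage above {THRESHOLD_PCT}%")
```

In practice a script like this would pull metrics from a monitoring API rather than a hard-coded dict; the sketch only shows the threshold-evaluation step.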
Requirements
Familiarity with CDH/CDP/HDP or similar Hadoop distributions, with proven experience in big data platform operations preferred;
Proficiency in Linux administration and scripting (Shell, Python, or equivalent);
Experience with major cloud platforms (AWS/GCP/Azure) and containerized environments (Kubernetes, Docker);
Strong knowledge of big data components and performance tuning, able to resolve stability and scalability issues independently;
Strong technical writing and communication skills to support documentation and solution delivery;
Self-motivated with a passion for new technologies, able to apply innovative approaches to optimize operations and drive efficiency.
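As a small example of the Kafka operations and scripting proficiency the requirements describe, the sketch below sums consumer lag from `kafka-consumer-groups.sh --describe`-style output. The column layout is an assumption (LAG as the sixth column); real output can vary by Kafka version, and the helper name is hypothetical.

```python
def total_consumer_lag(describe_output: str) -> int:
    """Sum the LAG column across all partitions in describe-style output."""
    lag = 0
    for line in describe_output.splitlines():
        parts = line.split()
        # Skip the header row and malformed lines; LAG is assumed to be column 6.
        if len(parts) >= 6 and parts[5].isdigit():
            lag += int(parts[5])
    return lag

sample = """GROUP TOPIC PARTITION CURRENT-OFFSET LOG-END-OFFSET LAG CONSUMER-ID
payments trades 0 100 120 20 c1
payments trades 1 340 345 5 c2"""
print(total_consumer_lag(sample))  # 25
```

A production version would call the CLI (or the AdminClient API) directly and alert when lag breaches a threshold; the sketch covers only the parsing step.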
Benefits
Competitive salary
Attractive annual leave entitlement, including birthday and work anniversary leave
Work flexibility: flexi-work hours and hybrid or remote set-ups
Career development through our internal mobility program, which offers employees a diverse scope of opportunities
Work perks: Crypto.com Visa Card provided upon joining
Our Crypto.com benefits packages vary depending on regional requirements; you can learn more from our talent acquisition team.