Data Engineer at Kyndryl transforming raw data into actionable insights using the ELK Stack. Responsible for developing, implementing, and maintaining data pipelines and processing workflows.
Responsibilities
Design, implement, and maintain scalable data pipelines using the ELK Stack (Elasticsearch, Logstash, Kibana) and Beats for monitoring and analytics.
Develop data processing workflows that handle real-time and batch data ingestion, transformation, and visualization.
Implement techniques such as grok patterns, regular expressions, and plugins to handle complex log formats and structures (see the parsing sketch after this list).
Configure and optimize Elasticsearch clusters for efficient indexing, searching, and performance tuning (see the index-settings sketch after this list).
Collaborate with business users to understand their data integration and visualization needs and translate them into technical solutions.
Create dynamic and interactive dashboards in Kibana for data visualization and insights that help detect the root cause of issues.
Leverage open-source tools such as Beats and Python to integrate and process data from multiple sources.
Collaborate with cross-functional teams to integrate ELK with ServiceNow and other ITSM platforms.
Detect anomalies using Elastic machine learning and create alerts using Watcher (see the Watcher sketch after this list).
Extract data from external APIs using Python (see the API-ingestion sketch after this list).
Build and deploy solutions in containerized environments using Kubernetes.
Monitor Elasticsearch clusters for health, performance, and resource utilization (see the health-check sketch after this list).
Automate routine tasks and data workflows using scripting languages such as Python or shell scripting.
Provide technical expertise in troubleshooting, debugging, and resolving complex data and system issues.
Create and maintain technical documentation, including system diagrams, deployment procedures, and troubleshooting guides.
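As an illustration of the grok-style parsing mentioned above, here is a minimal Python sketch that extracts structured fields from an access-log line using a named-group regular expression. The log format, pattern, and field names are illustrative assumptions, not a specific production pipeline.

```python
import re

# Named-group regex loosely mirroring the COMBINEDAPACHELOG grok
# pattern; the field names here are illustrative.
LOG_PATTERN = re.compile(
    r'(?P<client_ip>\S+) \S+ \S+ \[(?P<timestamp>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) \S+" (?P<status>\d{3}) (?P<bytes>\d+|-)'
)

def parse_line(line):
    """Return a dict of structured fields from a raw log line, or None."""
    match = LOG_PATTERN.match(line)
    return match.groupdict() if match else None

sample = '203.0.113.7 - - [10/Oct/2024:13:55:36 +0000] "GET /index.html HTTP/1.1" 200 2326'
print(parse_line(sample))
```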
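A minimal sketch of index-level tuning with the official elasticsearch Python client; the endpoint, index name, and setting values are placeholder assumptions, not recommended production values.

```python
from elasticsearch import Elasticsearch

# Placeholder endpoint and index name.
es = Elasticsearch("http://localhost:9200")

# Example settings for an ingest-heavy logging index: a relaxed
# refresh interval trades search freshness for indexing throughput.
es.indices.create(
    index="app-logs-000001",
    settings={
        "number_of_shards": 3,
        "number_of_replicas": 1,
        "refresh_interval": "30s",
    },
)
```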
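A sketch of creating a threshold alert through the Watcher API, using the 8.x-style keyword arguments of the elasticsearch Python client; the index pattern, field names, and threshold are assumptions.

```python
from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")  # placeholder endpoint

# Watch that fires when the error count over the last 5 minutes
# exceeds a threshold; index pattern and threshold are illustrative.
es.watcher.put_watch(
    id="error-spike-alert",
    trigger={"schedule": {"interval": "5m"}},
    input={
        "search": {
            "request": {
                "indices": ["app-logs-*"],
                "body": {
                    "query": {
                        "bool": {
                            "filter": [
                                {"term": {"log.level": "error"}},
                                {"range": {"@timestamp": {"gte": "now-5m"}}},
                            ]
                        }
                    }
                },
            }
        }
    },
    condition={"compare": {"ctx.payload.hits.total": {"gt": 100}}},
    actions={
        "notify": {
            "logging": {"text": "Error spike: {{ctx.payload.hits.total}} hits in 5m"}
        }
    },
)
```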
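A minimal API-ingestion sketch: pull records from a REST API with requests and bulk-index them into Elasticsearch. The endpoint URL, response shape, and index name are hypothetical stand-ins.

```python
import requests
from elasticsearch import Elasticsearch, helpers

es = Elasticsearch("http://localhost:9200")  # placeholder endpoint

# Hypothetical REST endpoint returning a JSON array of records.
resp = requests.get("https://api.example.com/v1/events", timeout=30)
resp.raise_for_status()
events = resp.json()

# Bulk-index so that each event becomes one document.
actions = ({"_index": "external-events", "_source": event} for event in events)
helpers.bulk(es, actions)
```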
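A small health-check sketch built on the cluster health API; the alerting action (a plain print) and the reported fields shown are illustrative of what such a monitor might surface.

```python
from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")  # placeholder endpoint

health = es.cluster.health()
print(health["status"],
      health["number_of_nodes"],
      health["active_shards_percent_as_number"])

# Flag a degraded cluster; in practice this would feed an alert channel.
if health["status"] != "green":
    print(f"Cluster is {health['status']}: "
          f"{health['unassigned_shards']} unassigned shards")
```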
Requirements
At least 3 years of experience with the ELK Stack and Python, along with a minimum of 5 years of overall experience in the IT industry.
ELK Stack: Deep expertise in Elasticsearch, Logstash, Kibana, Beats, and anomaly detection.
Programming: Good understanding of Python for scripting and automation.
ITSM Platforms: Hands-on experience with ServiceNow or similar ITSM tools.
Containerization: Understanding of Kubernetes and containerized applications.
Operating Systems: Strong working knowledge of Windows, Linux, and AIX environments.
Open Source Tools: Familiarity with various open-source data integration and monitoring tools.
Additional Skills: Knowledge of network protocols, log management, and system performance optimization.
Experience in integrating ELK solutions with enterprise IT environments.
Strong analytical and problem-solving skills with attention to detail.
Knowledge of MySQL or NoSQL databases is an added advantage.