Data Engineer at Kyndryl transforming raw data into actionable insights using the ELK Stack. Responsible for developing, implementing, and maintaining data pipelines and processing workflows.
Responsibilities
Design, implement, and maintain scalable data pipelines using ELK Stack (Elasticsearch, Logstash, Kibana) and Beats for monitoring and analytics.
Develop data processing workflows to handle real-time and batch data ingestion, transformation, and visualization.
Implement techniques such as grok patterns, regular expressions, and plugins to handle complex log formats and structures (a grok sketch follows this list).
Configure and optimize Elasticsearch clusters for efficient indexing, searching, and performance tuning.
Collaborate with business users to understand their data integration and visualization needs and translate them into technical solutions.
Create dynamic, interactive Kibana dashboards that surface insights and help pinpoint the root cause of issues.
Leverage open-source tools such as Beats and Python to integrate and process data from multiple sources.
Collaborate with cross-functional teams to implement ITSM solutions integrating ELK with tools like ServiceNow and other ITSM platforms.
Detect anomalies using Elastic ML and create alerts using Watcher (a Watcher sketch follows this list).
Extract data from external APIs using Python (an extraction sketch follows this list).
Build and deploy solutions in containerized environments using Kubernetes.
Monitor Elasticsearch clusters for health, performance, and resource utilization (a monitoring sketch follows this list).
Automate routine tasks and data workflows using scripting languages such as Python or shell scripting.
Provide technical expertise in troubleshooting, debugging, and resolving complex data and system issues.
Create and maintain technical documentation, including system diagrams, deployment procedures, and troubleshooting guides.
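To make the log-parsing responsibility concrete, here is a minimal Logstash filter sketch: a grok pattern that splits a hypothetical application log line into structured fields. The log format, field names, and pattern are illustrative assumptions, not an actual Kyndryl pipeline.

```
# Hypothetical Logstash pipeline config; parses lines like:
#   2024-05-01T12:00:00Z ERROR payment-service Timeout calling gateway
filter {
  grok {
    match => {
      "message" => "%{TIMESTAMP_ISO8601:log_time} %{LOGLEVEL:level} %{NOTSPACE:service} %{GREEDYDATA:detail}"
    }
  }
  date {
    match => ["log_time", "ISO8601"]   # promote the parsed time to @timestamp
  }
}
```

Named captures such as %{LOGLEVEL:level} become searchable fields in Elasticsearch, which is what makes Kibana filtering and dashboarding by service or severity possible.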
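The Watcher bullet can be illustrated with a short Python sketch that registers an alert through Elasticsearch's REST API. The index pattern, threshold, credentials, and endpoint are assumptions; the logging action would typically be swapped for an email or webhook action in practice.

```python
import requests

ES_URL = "https://localhost:9200"   # assumed cluster endpoint
AUTH = ("elastic", "changeme")      # assumed credentials

# Watch: every 5 minutes, count ERROR-level events from the last 5 minutes
# and fire an action when the count exceeds a threshold.
watch = {
    "trigger": {"schedule": {"interval": "5m"}},
    "input": {
        "search": {
            "request": {
                "indices": ["app-logs-*"],  # assumed index pattern
                "body": {
                    "query": {
                        "bool": {
                            "filter": [
                                {"term": {"level": "ERROR"}},
                                {"range": {"@timestamp": {"gte": "now-5m"}}},
                            ]
                        }
                    }
                },
            }
        }
    },
    "condition": {"compare": {"ctx.payload.hits.total": {"gt": 10}}},
    "actions": {
        "log_spike": {
            "logging": {"text": "Error spike: {{ctx.payload.hits.total}} errors in 5m"}
        }
    },
}

resp = requests.put(f"{ES_URL}/_watcher/watch/error-spike", json=watch, auth=AUTH)
resp.raise_for_status()
print(resp.json())
```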
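For the API extraction bullet, a hedged Python sketch: pull records from a paginated REST API and bulk-index them into Elasticsearch. The API URL, pagination scheme, and response shape are hypothetical.

```python
import requests
from elasticsearch import Elasticsearch, helpers

es = Elasticsearch("https://localhost:9200", basic_auth=("elastic", "changeme"))
API_URL = "https://api.example.com/v1/events"   # hypothetical source API

def fetch_events(page_size=100):
    """Yield records from a page-numbered REST API until a page comes back empty."""
    page = 1
    while True:
        resp = requests.get(API_URL, params={"page": page, "per_page": page_size}, timeout=30)
        resp.raise_for_status()
        records = resp.json().get("results", [])
        if not records:
            break
        yield from records
        page += 1

# Stream the records into Elasticsearch in bulk rather than one request per document.
helpers.bulk(es, ({"_index": "events", "_source": rec} for rec in fetch_events()))
```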
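Finally, a minimal monitoring sketch for the cluster-health bullet, using the official elasticsearch Python client; the endpoint and credentials are assumptions, and a real job would ship these numbers to a dashboard or alert rather than printing them.

```python
from elasticsearch import Elasticsearch

es = Elasticsearch("https://localhost:9200", basic_auth=("elastic", "changeme"))

health = es.cluster.health()   # overall cluster status: green / yellow / red
print(f"status={health['status']} nodes={health['number_of_nodes']} "
      f"unassigned_shards={health['unassigned_shards']}")

# Per-node JVM heap and disk usage, the usual early warnings for capacity issues.
for node in es.nodes.stats(metric=["jvm", "fs"])["nodes"].values():
    heap_pct = node["jvm"]["mem"]["heap_used_percent"]
    free_gb = node["fs"]["total"]["free_in_bytes"] / 1e9
    print(f"{node['name']}: heap {heap_pct}% used, {free_gb:.1f} GB disk free")
```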
Requirements
At least 3 years of experience with the ELK Stack and Python, along with a minimum of 5 years of overall experience in the IT industry.
ELK Stack: Deep expertise in Elasticsearch, Logstash, Kibana, Beats, and anomaly detection.
Programming: Strong proficiency in Python for scripting and automation.
ITSM Platforms: Hands-on experience with ServiceNow or similar ITSM tools.
Containerization: Familiarity with Kubernetes and containerized applications.
Operating Systems: Strong working knowledge of Windows, Linux, and AIX environments.
Open Source Tools: Familiarity with various open-source data integration and monitoring tools.
Additional Skills: Knowledge of network protocols, log management, and system performance optimization.
Experience in integrating ELK solutions with enterprise IT environments.
Strong analytical and problem-solving skills with attention to detail.
Knowledge of MySQL or NoSQL databases is an added advantage.