Junior Data Engineer developing and maintaining data pipelines for an AI-powered identity platform at Saviynt. Collaborating with senior engineers, analysts, and BI developers to ensure reliable data for decision-making.
Responsibilities
Develop, test, and maintain ETL/ELT pipelines for data ingestion and transformation.
Assist in designing and implementing data models, database schemas, and lakehouse architectures.
Write and optimize SQL queries in Snowflake for data extraction and reporting.
Collaborate with BI developers and analysts to ensure data accessibility and usability.
Monitor and troubleshoot data workflows to maintain system reliability.
Support migration and integration projects involving cloud platforms (e.g., AWS Lambda) and Python.
Apply expert-level Python scripting to pipeline development and automation.
Document processes, pipelines, and best practices for knowledge sharing.
Apply AI agents for data modeling and quality assurance.
Manage user permissions and security roles to ensure data governance and compliance.
Monitor database performance and recommend improvements.
Troubleshoot data integration issues and resolve data quality discrepancies.
Requirements
Bachelor’s degree in Computer Science, Information Systems, Business Analytics, Data Science, or a related field (or equivalent experience).
Up to 5 years of experience in data engineering with a strong background in SQL development.
Experience in data warehousing and data modeling concepts.
Certifications in relevant tools (e.g., SQL or Snowflake, Azure).
Experience with cloud-based BI solutions (AWS).
Familiarity with Agile development methodologies.
Knowledge of data governance, privacy, and security best practices.
Strong experience with SQL and Snowflake (queries, stored procedures, functions).
Knowledge of data warehousing and cloud platforms (AWS).
Exposure to the Fivetran tool (good to have).
Familiarity with agentic AI for data engineering tasks.
Exposure to reporting tools such as Tableau (good to have).
Basic knowledge of CRM/ERP tools (e.g., Salesforce).
Knowledge of Identity Governance and Administration (IGA) tools (good to have).
Strong analytical and problem-solving abilities.
Excellent communication skills for both technical and non-technical stakeholders.
Ability to work independently and collaboratively in a team environment.
Senior Data Engineer supporting an AI-enabled financial compliance initiative with data pipelines and ingestion processes. Collaborating with diverse teams in a mission-critical regulated environment.
Data Architect leading the definition and construction of cloud data architecture for Kyndryl. Participating in significant technological modernization initiatives, focusing on Google Cloud Platform.
Senior Data Engineer driving data intelligence requirements and scalable data solutions for a global consulting firm. Collaborating across functions to enhance Microsoft architecture and analytics capabilities.
Experienced AI Engineer designing and building production-grade agentic AI systems using generative AI and large language models. Collaborating with data engineers and data scientists in a tech-driven company.
Intermediate Data Engineer designing and building data pipelines for travel industry data management. Collaborating across teams to ensure reliable data for analytics and reporting.
Data Engineer managing and organizing datasets for AI models at Walaris, developing AI-driven autonomous systems for defense and security applications.
Data Engineer designing and maintaining data pipelines at Black Semiconductor. Collaborating with process, equipment, and IT teams to support manufacturing analytics and decision-making.
Junior Data Engineer role focusing on Business Intelligence and Big Data at Avanade. Collaborating on data analysis and SQL queries in a supportive learning environment.
GCP Data Engineer designing and developing data processing modules for Ki, an algorithmic insurance carrier. Working closely with multiple teams to optimize data pipelines and reporting.
Data Engineer at Securian Financial optimizing scalable data pipelines for AI and advanced analytics. Collaborating with teams to deliver secure and accessible data solutions.