Data Engineer developing and maintaining data infrastructure at Highmark Health. Responsible for ensuring the efficient and reliable flow of data across systems.
Responsibilities
Design, develop, and maintain robust data processes and solutions to ensure the efficient movement and transformation of data across multiple systems
Develop and maintain data models, databases, and data warehouses to support business intelligence and analytics needs
Collaborate with stakeholders across IT, product, analytics, and business teams to gather requirements and provide data solutions that meet organizational needs
Monitor work against the production schedule, provide progress updates, and report any issues or technical difficulties to lead developers regularly
Implement and manage data governance practices, ensuring data quality, integrity, and compliance with relevant regulations
Collaborate on the design and implementation of data security measures
Perform data analysis and provide insights to support decision-making across various departments
Stay current with industry trends and emerging technologies in data engineering, recommending new tools and best practices as needed
Other duties as assigned or requested.
Requirements
3 years of experience in design and analysis of algorithms, data structures, and design patterns
3 years of experience in a data engineering, ETL development, or data management role
3 years of experience in SQL and experience with database technologies (e.g., MySQL, PostgreSQL, MongoDB)
3 years of experience in data warehousing concepts and experience with data warehouse solutions (e.g., Snowflake, Redshift, BigQuery)
Proficiency in Python for data manipulation, scripting, API integrations, and developing robust data engineering solutions
Strong SQL skills for complex data extraction, transformation, and loading (ETL/ELT) across various database systems
Demonstrated experience with version control systems (Git/GitHub/GitLab) for collaborative development, code management, and deployment best practices
Experience implementing and following SDLC best practices for data solutions
Hands-on experience with Google BigQuery or comparable cloud-native data warehouse solutions
Knowledge of data governance, data quality, and data lineage best practices
Experience with dimensional modeling (Star Schema, Snowflake Schema) and other data modeling techniques
Experience with Infrastructure as Code (IaC) tools like Terraform for managing cloud resources
Experience building and managing data pipelines using workflow orchestration tools like Apache Airflow or Google Cloud Composer
Proficiency with Google Cloud Platform (GCP) services relevant to data engineering, such as BigQuery, Cloud Storage, Cloud Functions, and Pub/Sub
Familiarity with Data Lake/Lakehouse concepts and their application in modern data architectures
Solid understanding and practical application of dbt (data build tool) for data transformation, testing, documentation, and version control of data models
Experience with Starburst for data federation and analytics
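The SQL and Python skills listed above can be illustrated with a minimal ETL sketch. This is a hedged example, not part of the posting: table and column names are hypothetical, and SQLite stands in for a cloud warehouse such as BigQuery or Snowflake.

```python
import sqlite3

# Minimal ETL sketch: extract raw rows, transform (dedupe + normalize),
# and load into a reporting table. All names here are hypothetical.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Extract: raw source data containing a duplicate ID and inconsistent casing.
cur.execute("CREATE TABLE raw_members (member_id INTEGER, state TEXT)")
cur.executemany(
    "INSERT INTO raw_members VALUES (?, ?)",
    [(1, "pa"), (2, "OH"), (2, "oh"), (3, "Pa")],
)

# Transform + Load: deduplicate on member_id and uppercase the state code.
cur.execute("""
    CREATE TABLE dim_members AS
    SELECT member_id, UPPER(MIN(state)) AS state
    FROM raw_members
    GROUP BY member_id
""")

rows = cur.execute(
    "SELECT member_id, state FROM dim_members ORDER BY member_id"
).fetchall()
print(rows)  # deduplicated rows with uppercased state codes
conn.close()
```

In a production pipeline the same transform would typically live in a dbt model or an Airflow task rather than inline Python, but the extract/transform/load shape is the same.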
Senior Data Engineer supporting AI-enabled financial compliance initiative with data pipelines and ingestion processes. Collaborating with diverse teams in a mission-critical regulated environment.
Data Architect leading the definition and construction of cloud data architecture for Kyndryl. Participating in significant technological modernization initiatives, focusing on Google Cloud Platform.
Senior Data Engineer driving data intelligence requirements and scalable data solutions for a global consulting firm. Collaborating across functions to enhance Microsoft architecture and analytics capabilities.
Experienced AI Engineer designing and building production-grade agentic AI systems using generative AI and large language models. Collaborating with data engineers and data scientists in a tech-driven company.
Intermediate Data Engineer designing and building data pipelines for travel industry data management. Collaborating across teams to ensure reliable data for analytics and reporting.
Data Engineer managing and organizing datasets for AI models at Walaris, developing AI-driven autonomous systems for defense and security applications.
Data Engineer designing and maintaining data pipelines at Black Semiconductor. Collaborating with process, equipment, and IT teams to support manufacturing analytics and decision-making.
Junior Data Engineer role focusing on Business Intelligence and Big Data at Avanade. Collaborating on data analysis and SQL queries in a supportive learning environment.
GCP Data Engineer designing and developing data processing modules for Ki, an algorithmic insurance carrier. Working closely with multiple teams to optimize data pipelines and reporting.
Data Engineer at Securian Financial optimizing scalable data pipelines for AI and advanced analytics. Collaborating with teams to deliver secure and accessible data solutions.