Data Engineer developing data products and solutions within Equifax's data analytics team. Building scalable data pipelines and enabling AI capabilities for better insights.
Responsibilities
Build complex batch and streaming pipelines to ingest data from upstream Equifax cloud systems (an illustrative pipeline sketch follows this list).
Design and implement data engineering frameworks to scale the development and deployment of data pipelines across the D&A organization.
Leverage AI-powered coding assistants to accelerate development, optimize code, and generate documentation for data pipelines and infrastructure.
Develop and refine prompts for Large Language Models (LLMs) to assist in data-related tasks such as data cleansing, transformation logic generation, and automated data documentation.
Design, build, and maintain scalable data pipelines that support AI/ML applications.
Explore and implement AI agents to automate repetitive data management tasks, monitor data quality, and orchestrate complex data workflows.
Play an active role in setting engineering standards and best practices in EWS D&A.
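To give a concrete sense of the streaming-ingestion work described in the responsibilities above, here is a minimal Apache Beam (Dataflow) sketch that reads JSON messages from a Pub/Sub topic and appends them to a BigQuery table. The project, topic, table, and schema names are illustrative placeholders only, not actual Equifax resources.

    # Minimal streaming sketch: Pub/Sub -> parse JSON -> BigQuery (placeholder names).
    import json

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions


    def run():
        # Streaming mode so the Pub/Sub source runs continuously on Dataflow.
        options = PipelineOptions(streaming=True)

        with beam.Pipeline(options=options) as pipeline:
            (
                pipeline
                # Read raw message bytes from a placeholder Pub/Sub topic.
                | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
                    topic="projects/example-project/topics/ingest-events")
                # Decode and parse each message as a JSON row.
                | "ParseJson" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
                # Append parsed rows to a placeholder BigQuery table.
                | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                    table="example-project:analytics.events",
                    schema="event_id:STRING,event_ts:TIMESTAMP,payload:STRING",
                    write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                    create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
                )
            )


    if __name__ == "__main__":
        run()

A batch variant of the same pipeline would swap the Pub/Sub source for a Cloud Storage or BigQuery read and drop the streaming flag.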
Requirements
At least 5 years of experience in data engineering, data architecture, or a related field.
A strong understanding of data engineering principles and best practices, including data modeling, data warehousing, and data integration.
At least 1 year of experience working in a GCP big data environment.
Experience building complex data pipelines and solutions using two or more of the following: BigQuery, Dataflow, Dataproc, Pub/Sub, Cloud Functions.
Experience with Airflow or Cloud Composer (see the DAG sketch after this list).
Experience with Vertex AI.
Proficiency in Python development.
Professional experience with SQL.
Proven ability to effectively communicate complex technical concepts to both technical and non-technical stakeholders.
A Bachelor's degree or higher in Computer Science, Information Systems, or a related field.
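As a sketch of how the Airflow/Cloud Composer, BigQuery, and SQL experience listed above typically comes together, the hypothetical DAG below schedules a daily SQL rollup using BigQueryInsertJobOperator (requires the apache-airflow-providers-google package). All project, dataset, and table names are placeholders.

    # Illustrative Cloud Composer / Airflow DAG: a daily BigQuery rollup (placeholder names).
    from datetime import datetime

    from airflow import DAG
    from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

    with DAG(
        dag_id="daily_events_rollup",
        schedule_interval="@daily",
        start_date=datetime(2024, 1, 1),
        catchup=False,
    ) as dag:
        # Run a daily aggregation query and overwrite a summary table.
        rollup_events = BigQueryInsertJobOperator(
            task_id="rollup_events",
            configuration={
                "query": {
                    "query": (
                        "SELECT DATE(event_ts) AS event_date, COUNT(*) AS event_count "
                        "FROM `example-project.analytics.events` "
                        "GROUP BY event_date"
                    ),
                    "useLegacySql": False,
                    "destinationTable": {
                        "projectId": "example-project",
                        "datasetId": "analytics",
                        "tableId": "daily_event_counts",
                    },
                    "writeDisposition": "WRITE_TRUNCATE",
                }
            },
        )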
Benefits
Comprehensive compensation and healthcare packages.
401(k) matching.
Paid time off.
Organizational growth potential through our online learning platform with guided career tracks.