Data Engineer focused on data integrations and ETL pipeline development at Tanium, collaborating with IT and Engineering to deliver scalable data solutions and uphold data governance standards.
Responsibilities
Design, develop, and maintain scalable, enterprise-grade ETL pipelines to extract, transform, and load data across internal and external systems.
Build secure, high-performance integrations between on-prem and cloud-based platforms, ensuring reliability, consistency, and timely data delivery.
Architect and manage data ingestion processes from systems such as Salesforce, NetSuite, SuccessFactors, Coupa, and other corporate applications.
Identify and implement internal process improvements including automation, data quality controls, and optimization of data delivery.
Ensure compliance with data governance, security, and privacy policies across all integration workflows.
Analyze complex business and technical requirements to design scalable data integration solutions.
Lead efforts to modernize and rationalize legacy integrations and optimize existing data pipelines for performance and maintainability.
Collaborate cross-functionally to design and implement frameworks that enhance data accessibility and usability across teams.
Monitor integration performance, troubleshoot issues, and proactively implement improvements or system upgrades as needed.
Develop and manage infrastructure-as-code and CI/CD practices for data pipeline deployment, version control, and change management.
Create and maintain up-to-date documentation for all integration workflows, data models, and process automation scripts.
Ensure timely root-cause analysis and resolution of data or system integration incidents.
Work directly with stakeholders from IT, Finance, HR, and Engineering to align integration and reporting initiatives with business objectives.
Partner with external vendors or service providers to implement and maintain robust integration solutions.
Act as a technical advisor within the business systems team, contributing best practices for integration design and development.
Requirements
Bachelor's degree in Computer Science, Information Systems, Engineering, or a related field, or equivalent practical experience.
7+ years of professional experience in data engineering, data integration, or system integration roles within enterprise environments.
Proven success in building, maintaining, and scaling ETL pipelines using dedicated ETL or data integration tools.
Strong knowledge of data modeling, APIs (RESTful/SOAP), and data warehousing concepts.
Experience designing integrations across diverse systems in cloud and on-premises environments.
Proficiency in Python, Java, or JavaScript for scripting and automation.
Hands-on experience with data warehouse technologies (e.g., Snowflake or similar) and workflow orchestration frameworks (e.g., Airflow or equivalent).
Demonstrated ability to diagram and communicate end-to-end data flows across complex system landscapes.
Familiarity with modern DevOps and CI/CD practices for data engineering environments.
Strong understanding of SDLC frameworks, change management, and IT governance processes.