Hybrid Data Engineer

Posted 2 months ago

About the role

  • Data Engineer focused on ETL pipeline development and data integrations at Tanium, collaborating across teams to enhance data infrastructure in a hybrid work environment.

Responsibilities

  • Design, develop, and maintain scalable, enterprise-grade ETL pipelines to extract, transform, and load data across internal and external systems (a simplified illustration follows this list).
  • Build secure, high-performance integrations between on-prem and cloud-based platforms, ensuring reliability, consistency, and timely data delivery.
  • Architect and manage data ingestion processes from systems such as Salesforce, NetSuite, SuccessFactors, Coupa, and other corporate applications.
  • Identify and implement internal process improvements including automation, data quality controls, and optimization of data delivery.
  • Ensure compliance with data governance, security, and privacy policies across all integration workflows.
  • Analyze complex business and technical requirements to design scalable data integration solutions.
  • Lead efforts to modernize and rationalize legacy integrations and optimize existing data pipelines for performance and maintainability.
  • Collaborate cross-functionally to design and implement frameworks that enhance data accessibility and usability across teams.
  • Monitor integration performance, troubleshoot issues, and proactively implement improvements or system upgrades as needed.
  • Develop and manage infrastructure-as-code and CI/CD practices for data pipeline deployment, version control, and change management.
  • Create and continuously maintain documentation for all integration workflows, data models, and process automation scripts.
  • Ensure timely root-cause analysis and resolution of data or system integration incidents.
  • Work directly with stakeholders from IT, Finance, HR, and Engineering to align integration and reporting initiatives with business objectives.
  • Partner with external vendors or service providers to implement and maintain robust integration solutions.
  • Act as a technical advisor within the business systems team, contributing best practices for integration design and development.
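For candidates unfamiliar with this kind of work, the following is a minimal, purely illustrative Python sketch of an extract-transform-load flow like those described above. The endpoint, field names, and staging target are hypothetical and do not reflect Tanium's actual systems or data.

  """Illustrative only: a minimal extract-transform-load flow in Python.
  The endpoint, field names, and staging target are hypothetical."""
  import csv
  import requests

  API_URL = "https://example.invalid/api/v1/orders"  # hypothetical REST source

  def extract(session: requests.Session) -> list[dict]:
      """Pull raw records from a REST endpoint, following simple page-number pagination."""
      records, page = [], 1
      while True:
          resp = session.get(API_URL, params={"page": page}, timeout=30)
          resp.raise_for_status()
          batch = resp.json().get("results", [])
          if not batch:
              break
          records.extend(batch)
          page += 1
      return records

  def transform(records: list[dict]) -> list[dict]:
      """Keep only the fields the target model needs and normalize their types."""
      return [
          {
              "order_id": r["id"],
              "amount_usd": round(float(r.get("amount", 0)), 2),
              "status": (r.get("status") or "unknown").lower(),
          }
          for r in records
      ]

  def load(rows: list[dict], path: str = "orders_staging.csv") -> None:
      """Write to a local staging file; a production pipeline would load a warehouse table instead."""
      with open(path, "w", newline="") as fh:
          writer = csv.DictWriter(fh, fieldnames=["order_id", "amount_usd", "status"])
          writer.writeheader()
          writer.writerows(rows)

  if __name__ == "__main__":
      with requests.Session() as session:
          load(transform(extract(session)))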

Requirements

  • Bachelor's degree in Computer Science, Information Systems, Engineering, or a related field, or equivalent practical experience.
  • 7+ years of professional experience in data engineering, data integration, or system integration roles within enterprise environments.
  • Proven success in building, maintaining, and scaling ETL pipelines using dedicated ETL or data integration tools.
  • Strong knowledge of data modeling, APIs (RESTful/SOAP), and data warehousing concepts.
  • Experience designing integrations across diverse systems in cloud and on-premises environments.
  • Proficiency in Python, Java, or JavaScript for scripting and automation.
  • Hands-on experience with data warehouse technologies (e.g., Snowflake or similar) and workflow orchestration frameworks (e.g., Airflow or equivalent); see the orchestration sketch after this list.
  • Demonstrated ability to diagram and communicate end-to-end data flows across complex system landscapes.
  • Familiarity with modern DevOps and CI/CD practices for data engineering environments.
  • Strong understanding of SDLC frameworks, change management, and IT governance processes.
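As a point of reference for the orchestration experience listed above, here is a minimal sketch of an Airflow 2.x TaskFlow DAG wiring an extract/transform/load sequence on a daily schedule. The DAG id, schedule, and task bodies are hypothetical placeholders, and the `schedule` argument assumes Airflow 2.4 or later.

  """Illustrative only: a minimal Airflow 2.x TaskFlow DAG sketching a daily
  extract/transform/load sequence. Names and task bodies are placeholders."""
  from datetime import datetime

  from airflow.decorators import dag, task

  @dag(
      dag_id="example_daily_ingest",   # hypothetical DAG name
      schedule="@daily",               # assumes Airflow 2.4+ ("schedule_interval" on older releases)
      start_date=datetime(2024, 1, 1),
      catchup=False,
  )
  def example_daily_ingest():
      @task
      def extract() -> list[dict]:
          # A real task would call a source system API or pull an export.
          return [{"id": 1, "amount": "10.5"}, {"id": 2, "amount": "3.0"}]

      @task
      def transform(records: list[dict]) -> list[dict]:
          # Cast and rename fields to match the target model.
          return [{"order_id": r["id"], "amount_usd": float(r["amount"])} for r in records]

      @task
      def load(rows: list[dict]) -> int:
          # A real task would write to the warehouse (e.g., via a provider hook).
          print(f"loading {len(rows)} rows")
          return len(rows)

      load(transform(extract()))

  example_daily_ingest()

Splitting extract, transform, and load into separate tasks keeps each step independently retryable and observable, which is the main reason to run pipelines under an orchestrator rather than as a single script.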

Benefits

  • Medical, dental, and vision plans
  • Family planning benefits
  • Health savings account
  • Flexible spending account
  • Transportation savings account
  • 401(k) retirement savings plan with company match
  • Life, accident, and disability coverage
  • Business travel accident insurance
  • Employee assistance programs
  • Disability insurance
  • Other well-being benefits

Job title

Data Engineer

Job type

Experience level

Senior, Lead

Salary

$95,000 - $280,000 per year

Degree requirement

Bachelor's Degree

Location requirements
