About the role

  • Responsible for high-performance data capture systems
  • Managing the data pipeline that captures, stores, and transforms the world's market data
  • Envisioning future growth, in both utilization and technology selection, for the components you work on

Requirements

  • 5+ years of professional experience with Python, Linux and Bash
  • 3+ years of experience with Ansible or a similar configuration management tool
  • Proficiency with Linux and an understanding of Linux server architecture and operation
  • Very comfortable with data copying, task scheduling, and Linux daemons
  • Experience working in a batch data processing environment, or another environment where large data jobs are run through automation
  • Extensive experience with software deployments, including risk assessment and mitigation strategies for potential failures
  • Experience with Git-based version control systems, such as GitLab and/or GitHub
  • Basic networking knowledge and ability to interpret Wireshark/tcpdump capture output
  • Experience with AWS, S3, AWS CLI, Rsync, and Docker
  • English proficiency is required
  • Must be willing to work US Eastern hours for the first 3–6 months during initial training
  • Must be willing to work on some weekends when needed

Benefits

  • Healthcare
  • Retirement planning
  • Paid volunteering days
  • Wellbeing initiatives

Job title

DevOps Engineer

Job type

Not specified

Experience level

Mid level, Senior

Salary

Not specified

Degree requirement

No Education Requirement

Location requirements

Not specified