Build end-to-end data pipelines and configure Quantexa’s decisioning platform to detect key insights
Write defensive, fault-tolerant and efficient production-level code for data processing systems
Configure and deploy Quantexa software using tools such as Spark, Hadoop, Scala and Elasticsearch on private and public clouds (Google Cloud, Azure, AWS)
Manage, transform and cleanse high-volume data to help clients solve problems in fraud, financial crime, data management, risk and customer intelligence
Act as a trusted source of technical knowledge for clients and articulate technical concepts to non-technical audiences
Collaborate with solution architects and R&D engineers to champion solutions, standards and best practices for complex big data challenges
Work from the Liverpool office (near Liverpool Science Park) and attend client sites as required
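To give candidates a flavour of the "defensive, fault-tolerant" data processing described above, here is a minimal, purely illustrative Scala sketch (all names are hypothetical and not Quantexa code): a record parser that swallows malformed input instead of throwing, so one bad row cannot fail a whole batch run.

```scala
// Illustrative sketch only; TransactionParser is a hypothetical name,
// not part of the Quantexa platform.
object TransactionParser {
  case class Transaction(id: String, amount: BigDecimal)

  // Defensive parse: return None on malformed input rather than throwing,
  // so a single bad record cannot crash a batch job.
  def parse(line: String): Option[Transaction] =
    line.split(',') match {
      case Array(id, amountRaw) if id.trim.nonEmpty =>
        scala.util.Try(BigDecimal(amountRaw.trim)).toOption
          .map(amount => Transaction(id.trim, amount))
      case _ => None
    }

  // Cleansing pass: keep only the records that parsed successfully.
  def parseAll(lines: Seq[String]): Seq[Transaction] =
    lines.flatMap(parse)
}
```

In a real pipeline the same pattern would typically run inside a Spark transformation, with rejected rows routed to a quarantine dataset for inspection rather than silently dropped.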
Requirements
At least 18 months of industry experience in a data engineering role or equivalent
Hands-on technical development background, preferably with some software industry experience
Proficiency in Scala, Java, Python or similar programming languages (primary language is Scala)
Experience building and deploying production-level data processing batch systems
Experience with Spark and Hadoop
Experience with Elasticsearch
Familiarity with cloud platforms (Google Cloud, Microsoft Azure, Amazon/AWS)
Experience with modern development tooling (Git, Gradle, Nexus)
Experience with DevOps and automation tooling (Jenkins, Docker, Bash scripting)
Knowledge of testing libraries (e.g., ScalaTest) and understanding of unit vs integration tests
Strong technical communication ability and experience working in rapidly changing client environments
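As a hedged illustration of the unit-vs-integration distinction mentioned in the requirements, the sketch below (hypothetical names, plain assertions rather than ScalaTest so it runs dependency-free) unit-tests a pure function in isolation; an integration test would instead exercise the same logic end to end against a real Spark session and storage.

```scala
// Hypothetical example; DedupJob is not a real Quantexa component.
object DedupJob {
  // Pure core logic: the ideal target for a fast, isolated unit test.
  // Keeps the highest value seen for each key.
  def dedupByKey(rows: Seq[(String, Int)]): Map[String, Int] =
    rows.groupBy(_._1).view.mapValues(_.map(_._2).max).toMap
}

object DedupJobSpec {
  // Unit test: one function, no I/O, runs in milliseconds.
  // (In ScalaTest this body would sit inside an AnyFunSuite test case.)
  def unitTest(): Unit = {
    val out = DedupJob.dedupByKey(Seq("a" -> 1, "a" -> 3, "b" -> 2))
    assert(out == Map("a" -> 3, "b" -> 2))
  }
  // An integration test would wire the same logic into a Spark job,
  // read and write real data, and verify the end-to-end result -
  // slower, but it checks the plumbing a unit test cannot.
}
```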
Benefits
Competitive salary
Company bonus
25 days annual leave (option to buy up to 5 days, roll over up to 10) plus national holidays and your birthday off
Pension scheme with a company contribution of 6% (when you contribute 3%)
Private healthcare with AXA, including dental & optical cover
Life Insurance and Income Protection
Regularly benchmarked salary rates
Enhanced Maternity, Paternity, Adoption, or Shared Parental Leave
Well-being days
Volunteer Day off
Work from Home Equipment
Commuter, Tech, and Cycle to Work schemes
Octopus EV Salary Sacrifice scheme
Free Calm App Subscription
Continuous Training and Development, including access to Udemy Business
‘Work from Anywhere’ policy: spend up to 2 months per rolling 12-month period working outside your country of employment