Senior Data Engineer designing and managing data infrastructure for energy storage at Arenko. Collaborating with teams to optimize energy assets and data-driven decisions in the renewable sector.
Responsibilities
As a Senior Data Engineer, you will be instrumental in designing, building, and managing the infrastructure that enables efficient processing and analysis of our large and diverse datasets, ranging from power market data to asset telemetry.
Build modern data services: Take charge of planning, designing, and developing data engineering projects, working with big data frameworks and integrating with a broader microservice stack.
Enable industry-leading asset optimisation: Build the infrastructure that empowers our team to implement performant data science methods that will best optimise energy assets on the electricity grid.
ETL process design and implementation: Create efficient Extract, Transform, Load (ETL) processes to streamline data integration and transformation.
DevOps adoption: Embrace a DevOps culture by implementing Terraform and infrastructure as code (IaC) for our AWS data estate.
Ensure data quality and consistency: Maintain high standards of data integrity across multiple data sources.
Engage in cross-team collaboration: Work closely with data scientists, software engineers and other stakeholders to deliver solutions that optimise energy storage assets.
Mentorship and team guidance: Define and promote best practices for data infrastructure development and tooling. Mentor the wider data team on effective ways of working.
Requirements
You’re educated to degree level in computer science or a related discipline, or able to demonstrate equivalent workplace experience.
You have extensive experience delivering projects focused on data collection, storage systems and data pipelines.
You have advanced Python skills with a focus on writing clean, maintainable, and production-ready code.
You’re experienced with the end-to-end development and deployment of data-focused software services.
You have an in-depth understanding of ETL technologies and best practices.
You’ve worked with SQL and NoSQL databases and big data frameworks like Spark, Pulsar, etc.
You can demonstrate ability to integrate with external APIs for both data collection and providing data feeds to customers and third parties.
You have hands-on experience with data storage, processing and analytics in cloud platforms (e.g., AWS, Google Cloud, Azure).
You have an understanding of MLOps best practices and the tools required to maintain a suite of machine learning models.
You're an organised and adaptable team player capable of leading and mentoring colleagues.
You're able to communicate with stakeholders to identify needs and evaluate alternative technical solutions and strategies.
You have a strong technical capacity for evaluating and expressing the effort and value associated with technical development decisions.
Benefits
25 days of holiday in addition to public holidays, carry-over flexibility and festive office closure (a pretty decent annual leave allowance even if we do say so ourselves!)
A genuine approach to flexible working - the vast majority of our employees spend at least a day or two each week in the office, but we’re open to making this work for everyone, their personal circumstances and their workloads
Enhanced parental leave offerings with 6 months of full salary for the primary caregiver and 1 month for the secondary caregiver
A yearly salary review for all employees
We are proud to support the continuous professional growth of our employees by providing a personal annual learning and development budget and fully covering the costs of professional membership fees and subscriptions relevant to your role
Access to MindTools on us
Bike to Work Scheme (because it wouldn’t be very on brand if we didn’t encourage you to think about how your travel affects the environment)
Octopus EV Scheme (see above!)
A commitment to your wellbeing, with access to our Employee Assistance Programme and complimentary eye tests in partnership with Specsavers
Internal working groups to focus on the things that matter to you like our “Women of Arenko”, “Diversity, Equity and Inclusion” and “Parents of Arenko” groups
Regular opportunities to blow off some steam and engage with our team, including quarterly socials, the big Festive Gathering and the Summer Party. We’re also working on our wellness initiatives to support your physical, social, mental and financial wellbeing.
Data Engineer managing and organizing datasets for AI models at Walaris, developing AI-driven autonomous systems for defense and security applications.
Data Engineer designing and maintaining data pipelines at Black Semiconductor. Collaborating with process, equipment, and IT teams to support manufacturing analytics and decision-making.
Junior Data Engineer role focusing on Business Intelligence and Big Data at Avanade. Collaborating on data analysis and SQL queries in a supportive learning environment.
GCP Data Engineer designing and developing data processing modules for Ki, an algorithmic insurance carrier. Working closely with multiple teams to optimize data pipelines and reporting.
Data Engineer at Securian Financial optimizing scalable data pipelines for AI and advanced analytics. Collaborating with teams to deliver secure and accessible data solutions.
IT Data Engineering Co‑Op at BlueRock Therapeutics supporting development of scientific data systems. Collaborating on data workflows and foundational AWS data engineering tasks.
Data Engineer I building and operationalizing complex data solutions for Travelers' analytics using Databricks. Collaborating within teams to educate end users and support data governance.
Data Engineer shaping modern data architecture to drive golf’s digital transformation. Collaborating with teams to enhance data pipelines and insights for customer engagement and revenue growth.
Staff Data Engineer overseeing complex data systems for CITY Furniture. Responsible for architecting and optimizing data ecosystems in a hybrid work environment.
Data Engineer strengthening data platform team at Samba TV to improve data analytics and reporting capabilities. Building on AWS, Databricks, BigQuery, and Snowflake technology.