Data Engineer at Methods Analytics focusing on transforming data into insights for public sector clients. Utilising tools like Power BI and Azure to drive data solutions and innovation.
Responsibilities
Work closely with cross-functional teams, translating complex technical concepts into clear, accessible language for non-technical audiences and aligning data solutions with business needs.
Collaborate with a dynamic delivery team on innovative projects, transforming raw data into powerful insights that shape strategic decisions and drive business transformation.
Utilise platforms and tools such as Microsoft Fabric, Azure Data Factory, Azure Synapse, Databricks, and Power BI to build robust, scalable, and future-proof end-to-end data solutions.
Design and implement efficient ETL and ELT pipelines, ensuring seamless integration and transformation of data from various sources to deliver clean, reliable data (a minimal pipeline sketch follows this list).
Develop and maintain sophisticated data models, employing dimensional modelling techniques to support comprehensive data analysis and reporting.
Implement and uphold best practices in data governance, security, and compliance, using tools like Azure Purview, Unity Catalog, and Apache Atlas to maintain data integrity and trust.
Ensure data quality and integrity through meticulous attention to detail and rigorous QA processes, continually refining and optimising data queries for performance and cost-efficiency.
Develop intuitive and visually compelling Power BI dashboards that provide actionable insights to stakeholders across the organisation.
Monitor and tune solution performance, identifying opportunities for optimisation to enhance the reliability, speed, and functionality of data systems.
Stay ahead of industry trends and advancements, continuously enhancing your skills and incorporating the latest Data Engineering tools, languages, and methodologies into your work.
Enable business leaders to make informed decisions with confidence by providing them with timely, accurate, and actionable data insights.
Be at the forefront of data innovation, driving the adoption and understanding of modern tooling, architectures, and platforms.
Deliver seamless and intuitive data solutions that enhance the user experience, from real-time streaming data services to interactive dashboards.
Play a key role in cultivating a data-driven culture within the organisation, mentoring team members, and contributing to the continuous improvement of the Engineering Practice.
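For illustration, here is a minimal sketch of the kind of ELT step this role involves, assuming a Spark environment with Delta Lake available (such as Databricks, Synapse, or Fabric); the paths and column names are hypothetical, not part of the role description:

```python
# Minimal ELT sketch in PySpark; paths and column names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("elt-sketch").getOrCreate()

# Extract: read raw data landed in the lake (hypothetical path).
raw = spark.read.option("header", True).csv("/lake/raw/referrals/")

# Transform: standardise types, deduplicate, and keep valid records only.
clean = (
    raw.withColumn("referral_date", F.to_date("referral_date", "yyyy-MM-dd"))
       .dropDuplicates(["referral_id"])
       .filter(F.col("referral_id").isNotNull())
)

# Load: write a curated Delta table for downstream models and Power BI.
clean.write.format("delta").mode("overwrite").save("/lake/curated/referrals/")
```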
Requirements
Proficiency in SQL and Python: You are highly proficient in SQL and Python, enabling you to handle complex data problems with ease.
Understanding of Data Lakehouse Architecture: You have a strong grasp of the principles and implementation of Data Lakehouse architecture.
Hands-On Experience with Spark-Based Solutions: You possess experience with Spark-based platforms such as Azure Synapse, Databricks, Microsoft Fabric, or on-premises Spark clusters, using PySpark or Spark SQL to manage and process large datasets.
Expertise in Building ETL and ELT Pipelines: You are skilled in building robust ETL and ELT pipelines, primarily in Azure, utilising Azure Data Factory and Spark-based solutions to ensure efficient data flow and transformation.
Efficiency in Query Writing: You can craft and optimise queries to be both cost-effective and high-performing, ensuring fast and reliable data retrieval.
Experience in Power BI Dashboard Development: You possess experience in creating insightful and interactive Power BI dashboards that drive business decisions.
Proficiency in Dimensional Modelling: You are adept at applying dimensional modelling techniques, creating efficient and effective data models tailored to business needs.
CI/CD Mindset: You naturally work within Continuous Integration and Continuous Deployment (CI/CD) environments, ensuring automated builds, deployments, and unit testing are integral parts of your development workflow.
Business Requirements Translation: You have a knack for understanding business requirements and translating them into precise technical specifications that guide data solutions.
Strong Communication Skills: You can translate complex technical topics into clear, accessible language for non-technical audiences.
Continuous Learning and Development: Commitment to continuous learning and professional development, staying up to date with the latest industry trends, tools, and technologies.
Exposure to Microsoft Fabric: Familiarity with Microsoft Fabric and its capabilities would be a significant advantage.
Experience with High-Performance Data Systems: Handling large-scale data systems with high performance and low latency, such as managing 1 billion+ records or terabyte-sized databases.
Knowledge of Delta Tables or Apache Iceberg: Understanding and experience with Delta Tables or Apache Iceberg for managing large-scale data lakes efficiently (a minimal upsert sketch follows this list).
Knowledge of Data Governance Tools: Experience with data governance tools like Azure Purview, Unity Catalog, or Apache Atlas to ensure data integrity and compliance.
Exposure to Streaming/Event-Based Technologies: Experience with technologies such as Kafka, Azure Event Hub, and Spark Streaming for real-time data processing and event-driven architectures (see the streaming sketch after this list).
Understanding of SOLID Principles: Familiarity with the SOLID principles of object-oriented programming.
Understanding of Agile Development Methodologies: Familiarity with iterative and agile development methodologies such as SCRUM, contributing to a flexible and responsive development environment.
Familiarity with Recent Innovations: Knowledge of recent innovations such as GenAI, RAG, and Microsoft Copilot, as well as certifications with leading cloud providers and in areas of data science, AI, and ML.
Experience with Data for Data Science/AI/ML: Experience working with data tailored for data science, AI, and ML applications.
Experience with Public Sector Clients: Experience working with public sector clients and understanding their specific needs and requirements.
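As a concrete illustration of the Delta Tables point above, here is a minimal upsert sketch using the delta-spark API; the table paths and join key are hypothetical:

```python
# Hypothetical Delta upsert: merge staged records into a curated table.
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("delta-merge-sketch").getOrCreate()

target = DeltaTable.forPath(spark, "/lake/curated/patients/")       # hypothetical path
updates = spark.read.format("delta").load("/lake/staging/patients/")

# Update existing rows and insert new ones, keyed on a hypothetical ID column.
(
    target.alias("t")
    .merge(updates.alias("u"), "t.patient_id = u.patient_id")
    .whenMatchedUpdateAll()
    .whenNotMatchedInsertAll()
    .execute()
)
```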
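And for the streaming point, a sketch of a Spark Structured Streaming job reading from Kafka into a Delta sink; the broker, topic, and paths are hypothetical assumptions, not details from the posting:

```python
# Hypothetical streaming ingest: Kafka topic -> Delta sink via Structured Streaming.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("stream-sketch").getOrCreate()

# Read the raw event stream and project the payload as a string column.
events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")   # hypothetical broker
    .option("subscribe", "events")                      # hypothetical topic
    .load()
    .select(F.col("value").cast("string").alias("payload"), F.col("timestamp"))
)

# Append events to a Delta table, checkpointing for exactly-once recovery.
query = (
    events.writeStream.format("delta")
    .option("checkpointLocation", "/lake/checkpoints/events/")
    .outputMode("append")
    .start("/lake/streaming/events/")
)
```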
Benefits
Development support, including access to LinkedIn Learning, a management development programme, and training
24/7 confidential employee assistance programme
Office parties, pizza Fridays, and a commitment to charitable causes
25 days of annual leave a year, plus bank holidays, with the option to buy 5 extra days each year
2 paid days per year to volunteer in our local communities or within a charity organisation
Salary Exchange Scheme with 4% employer contribution and 5% employee contribution
Life Assurance of 4 times base salary
Private Medical Insurance which is non-contributory (spouse and dependants included)
Worldwide Travel Insurance which is non-contributory (spouse and dependants included)