Data Engineer at Flutter Studios developing reliable platforms for in-house gaming content. Engaging with technologies like Big Data and Event-Driven Architecture in a collaborative team.
Responsibilities
Work with Architects and Technical Leads to perform system and gap analysis against business specifications, and define functional and non-functional requirements for software components, applying the SDLC and engineering best practices throughout.
Research, produce POCs for, and evaluate new methodologies and technologies that improve the quality, reliability, scalability, security, and performance of Flutter Studios technology.
Deliver projects within agreed deadlines with a high degree of confidence and quality while taking responsibility for the delivered products.
Prove to be an effective team player in a Scrum Team while understanding and contributing to the agile delivery process and following the team’s ways of working.
Design and implement low-latency, high-availability and performant applications.
Participate in the software development process from analysis through release, including designing, documenting, testing, and developing solutions, with an emphasis on code review and peer design activities.
Write reusable, testable, and efficient code while also inspiring other data engineers to do so.
Implement and integrate data storage solutions.
Be proactive in the identification of opportunities to improve and rationalize applications and processes across the whole Data team, working in close collaboration with other Data team members and subject matter experts.
Ensure that accurate and effective support documentation is maintained to reflect code development and changes.
Be responsible for effectively escalating any risks & issues identified in a timely manner to the Data Delivery Manager while also providing resolution and mitigation strategies.
Requirements
Bachelor’s degree or technical diploma in computer science, computer technology or a related field.
3+ years’ experience in software development, with involvement in the Data Engineering field.
Strong experience with Event Driven Architecture, Data Streaming and Big Data.
Very good experience with NoSQL, including an understanding of concepts such as Data Clustering, Data Replication and Partitioning Strategies.
Very good ability to write SQL is essential.
Demonstrable experience with high-volume data loads.
A good understanding of ETL processing in highly transactional OLTP systems, Event Modelling, Data Lakes, Dimensional Data Modelling, Data Composition, Data Governance, and Data Warehousing, including designing and applying schemas, with experience of both Schema-on-write and Schema-on-read.
Experience with log-based messaging systems, using at least one of the following: Kafka (preferably), Kinesis or Pulsar.
Very good understanding of ring-based (consistent-hashing) distributed databases, having worked with one of the following: Cassandra (preferably), DynamoDB or Riak.
Proven experience with map-reduce technology, using either Spark with Scala (preferably) or PySpark.
Understanding of stream processing frameworks, with experience in one of the following: Spark Streaming, Kafka Streams or Flink/Kinesis.
A good understanding of BI reporting, with experience in one of the following or similar: Looker, Tableau or QuickSight.
Good experience with AWS data services in the cloud, using any of the following: S3, Redshift, Glue, Lake House, RDS, EMR or Athena.
Demonstrable experience with object-oriented programming in either Python or Java.
A proven ability to influence technical decisions in a fast-moving commercial environment.
Unit testing knowledge and practice.
Experience with Continuous Integration / Continuous Delivery tools.
Excellent people skills, strong communication and interpersonal skills, and high energy.
Proactive work ethic with the ability to deliver results and meet challenging deadlines.
Passion & flexibility to work the hours required to see projects to completion in a timely, accurate & efficient manner.
Self-motivated.
Attention to detail and a high degree of accountability for the work produced.
Proven ability & desire to innovate.
Very strong analytical skills.
Enthusiasm for the software development process.
Good English language skills.
Knowledge of the online gaming/gambling industry is a plus.