Senior Data Engineer designing and maintaining scalable data pipelines using modern technologies. Collaborating with cross-functional teams and providing mentorship in a dynamic environment.
Responsibilities
Design and implement scalable and high-performance data pipelines using Snowflake or Matillion, DBT, and the Microsoft Data Stack (Azure SQL Database, Azure Data Factory, Azure Synapse Analytics).
Collaborate with cross-functional teams to gather requirements, define project scope, and develop data models.
Provide technical leadership and mentorship to junior data engineers, guiding them in best practices and coding standards.
Conduct code reviews to ensure quality, performance, and adherence to best practices.
Serve as a subject matter expert in Snowflake or Matillion, DBT, and the Microsoft Data Stack, staying abreast of industry trends and advancements.
Troubleshoot and optimize existing data pipelines and processes to improve performance and reliability.
Create and maintain technical documentation, including architecture diagrams, design documents, and implementation guides.
Requirements
5+ years of experience in data engineering, with a focus on cloud-based data platforms.
Extensive hands-on experience with Snowflake or Matillion, plus DBT and the Microsoft Data Stack (Azure SQL Database, Azure Data Factory, Azure Synapse Analytics).
Proficiency in SQL and Python for data manipulation and transformation.
Experience with additional cloud platforms such as AWS or GCP is a plus.
Strong problem-solving skills and the ability to troubleshoot and optimize data pipelines and processes.
Excellent communication and interpersonal skills, with the ability to effectively collaborate with both technical and non-technical stakeholders.
Prior experience in a leadership or mentorship role is highly desirable.
Demonstrated ability to thrive in a fast-paced, dynamic environment and manage multiple priorities effectively.
Benefits
Opportunity to collaborate with a diverse group of colleagues in a fun, creative environment
Progressive career journey and opportunity for advancement
Continuous development through training, mentorship and certification programs
Exposure to modern technologies across various industries in an agile environment
Flexibility to work remotely, onsite or a hybrid of both as desired in certain locations
Competitive salary + bonus opportunities
Robust benefits package, matching 401(k) plan, and substantial PTO
Data Engineer (GCP) designing and maintaining scalable data platforms at LUZA Group in Portugal. Collaborating and ensuring data integrity across multiple complex datasets.
Data Architect at Integrant responsible for designing and building data solutions for analytical purposes. Involves eliciting requirements, data pipelines, and coaching teams on methodologies.
Senior Data Engineer developing and maintaining data pipelines for clients in an Agile setting. Collaborating with teams to enhance data quality and mentoring junior engineers.
Data Architect leading design and implementation of cloud data platforms for digital transformation. Collaborating with stakeholders to define data strategies and governance models.
Data Engineer Consultant designing and optimizing data infrastructure for clients' business needs. Working with SQL and data visualization tools in a mainly remote role with some onsite responsibilities in Denver.
Data Engineer creating Real-Time Data Processing applications for a leading iGaming operator. Work involves stream data manipulation and collaboration in an Agile environment.
Data Engineer at Voodoo optimizing real-time data pipelines for gaming and consumer apps to support growth. Joining a top-tier data team dedicated to monetizing via advertising partners in a competitive landscape.
Cloud Data Engineer designing data architectures for cloud platforms at fifty-five. Collaborating with local and global teams to optimize marketing ROI and customer experience.
SAP Specialist responsible for designing, developing, and executing data migration objects in Hydro’s SAPEX program. Ensuring successful ETL processes and maintaining data quality.