Senior Data Engineer designing and building data products that power analytical insights at Zendesk, collaborating in an Agile environment with a focus on data warehousing and process optimization.
Responsibilities
Collaborate with team members and business partners to gather business requirements, define successful analytics outcomes, and design data models
Serve as the data model subject matter expert and spokesperson, able to address questions quickly and accurately
Implement the Enterprise Data Warehouse by transforming raw data into schemas and data models for various business domains using SQL and dbt
Design, build, and maintain ELT pipelines in the Enterprise Data Warehouse to ensure reliable business reporting using Airflow, Fivetran, and dbt (see the orchestration sketch after this list)
Optimize data warehousing processes by refining naming conventions, enhancing data modeling, and implementing best practices for data quality testing
Build analytics solutions that provide practical insights into customer 360, finance, product, sales, and other key business domains
Build and promote engineering best practices in version control, CI/CD, code review, and pair programming
Identify, design, and implement internal process improvements, such as automating manual processes and optimizing data delivery
Work with data and analytics experts to strive for greater functionality in our data systems
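To make the orchestration responsibility concrete, here is a minimal sketch of a daily ELT DAG, assuming Airflow 2.x with the BashOperator; the DAG id, schedule, dbt project path, and the Fivetran trigger script are illustrative placeholders rather than Zendesk's actual setup.

```python
# Minimal Airflow 2.x DAG sketch: trigger the raw-data load, then build and
# test dbt models. All ids, paths, and schedules are illustrative placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="warehouse_elt",          # hypothetical DAG name
    schedule_interval="@daily",
    start_date=datetime(2024, 1, 1),
    catchup=False,
) as dag:
    # Placeholder for the Fivetran sync; in practice this could be the
    # FivetranOperator from the Fivetran provider package.
    sync_raw_data = BashOperator(
        task_id="sync_raw_data",
        bash_command="python scripts/trigger_fivetran_sync.py",  # hypothetical script
    )

    # Build the dbt models that transform raw data into warehouse schemas.
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt/warehouse",  # illustrative path
    )

    # Run dbt's data quality tests before reporting consumes the models.
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="dbt test --project-dir /opt/dbt/warehouse",
    )

    sync_raw_data >> dbt_run >> dbt_test
```

Keeping the load (Fivetran), transform (dbt run), and test (dbt test) stages as separate tasks makes failures visible per stage and lets dbt's data quality tests gate downstream business reporting.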
Requirements
5+ years of data engineering experience building, operating, and maintaining data pipelines and ETL processes in big data environments
5+ years of experience in Data Modeling and Data Architecture in a production environment
5+ years of experience writing complex SQL queries
5+ years of experience with cloud columnar databases (we use Snowflake)
2+ years of production experience with dbt, designing and implementing data warehouse solutions
Ability to work closely with data scientists, analysts, and other stakeholders to translate business requirements into technical solutions
Strong documentation skills for pipeline design and data flow diagrams
Intermediate experience with at least one programming language such as Python, Go, Java, or Scala (we primarily use Python)
Experience integrating with third-party SaaS application APIs such as Salesforce and Zuora (see the extraction sketch after this list)
Ensure data integrity and accuracy by conducting regular data audits, identifying and resolving data quality issues, and implementing data governance best practices (a data-audit sketch also follows this list)
Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement
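As an illustration of the third-party API integration requirement, the following sketch pulls Account records from Salesforce, assuming the simple_salesforce client library; the credentials, object, and field names are placeholders.

```python
# Minimal Salesforce extraction sketch, assuming the simple_salesforce
# library; credentials, object, and field names are illustrative placeholders.
import os

from simple_salesforce import Salesforce

sf = Salesforce(
    username=os.environ["SF_USERNAME"],
    password=os.environ["SF_PASSWORD"],
    security_token=os.environ["SF_SECURITY_TOKEN"],
)

# SOQL query against a standard object; a real pipeline would page through
# results incrementally and land them in the raw layer of the warehouse.
records = sf.query_all("SELECT Id, Name, Industry FROM Account")["records"]

for record in records:
    print(record["Id"], record["Name"])
```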
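Likewise, the data-audit requirement can start as simply as comparing row counts and checking key columns between a raw landing table and its modeled counterpart. This sketch assumes the snowflake-connector-python package; the account, warehouse, and table names are hypothetical.

```python
# Minimal data-audit sketch, assuming snowflake-connector-python; the
# account, warehouse, and table names are hypothetical.
import os

import snowflake.connector

conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="ANALYTICS_WH",  # illustrative warehouse
)

checks = {
    # Row counts should match between the raw landing table and the model.
    "row_count_matches": """
        SELECT (SELECT COUNT(*) FROM raw.salesforce.account)
             = (SELECT COUNT(*) FROM analytics.marts.dim_account)
    """,
    # Primary keys in the modeled table should never be NULL.
    "no_null_keys": """
        SELECT COUNT(*) = 0
        FROM analytics.marts.dim_account
        WHERE account_id IS NULL
    """,
}

cur = conn.cursor()
try:
    for name, sql in checks.items():
        cur.execute(sql)
        passed = cur.fetchone()[0]
        print(f"{name}: {'PASS' if passed else 'FAIL'}")
finally:
    cur.close()
    conn.close()
```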