Data Engineer enabling data-driven decisions by maintaining pipelines for Omnisend's BI platform. Collaborating with cross-functional teams to optimize data processes and ensure accuracy.
Responsibilities
Build and maintain reliable data and ML pipelines for analytics and reporting;
Partner with analysts to deliver clean, well-modeled datasets and support their workflows;
Ensure data quality, accuracy, and availability across the BI platform;
Monitor, troubleshoot, and optimize data processes for performance and reliability;
Contribute to data warehouse development and ongoing maintenance;
Integrate Generative AI and LLM-based solutions into data pipelines and workflows.
Requirements
2+ years of hands-on experience in data engineering or analytics
Strong SQL skills with a solid understanding of data modeling, data quality, and warehouse principles
Experience working with dbt or similar transformation frameworks
Proficiency in Python; knowledge of Golang is a strong plus
Experience with CI/CD tools such as GitHub Actions
Exposure to workflow orchestration tools like Dagster, Airflow, or Prefect
Experience with BI tools (e.g., Superset, Looker, or similar)
Understanding of software development best practices and version control
Curiosity and ability to learn and apply emerging technologies, including Generative AI and LLMs.
Benefits
Gross salary starting from 4500 EUR/month, based on experience
An unlimited learning budget for self-improvement, aligned with Omnisend's best interests
Working methods and best practices inspired by leading Silicon Valley tech companies
Over 80% of colleagues in Product (and over half company-wide) are senior, accelerating your growth
Flexible working hours and remote work possibilities
Private health insurance
Unlimited access to psychotherapy
Workstation budget
Personalized work anniversary gifts (house cleaning, spa treatments, international flights, etc.)
Annual company trip abroad, company gatherings twice a year, and team-led team-building events
Additional days off for KASP and Rifleman’s Union members
Data Warehouse Architect developing and optimizing robust data warehouse environments on SAP BW/4HANA. Critical for enabling advanced analytics and reporting across the organization.
Data Engineering Manager leading a new Data Engineering team in Bengaluru. Shaping the design and scaling of core data engineering practices across the organization.
Senior Google Data Architect designing and delivering scalable data solutions on Google Cloud Platform. Collaborating across teams to shape target-state data architectures and influence enterprise data strategy.
Sr. ETL/Data Warehouse Lead at Huntington designing, developing, and supporting ETL and Data Warehousing framework. Analyzing systems based on specifications and providing technical assistance.
Data Engineer developing scalable data lake solutions and optimizing data pipelines at U.S. Bank. Collaborating with teams to manage data governance and cloud migration activities.
Lead AI, MLOps & Data Engineer at WedR, guiding complex data projects and AI innovation. Collaborating with diverse experts in a Product Studio on digital transformations.
Lead Azure Databricks Data Engineer implementing data solutions for data engineering projects at Ryan Specialty. Collaborating with stakeholders and mentoring junior staff on data pipelines and ETL processes.
Senior Data Engineer designing and implementing sustainable data solutions for diverse clients. Collaborating closely with stakeholders to enhance data services and platforms in a hybrid environment.
Risk Data Engineer and Architect at Lincoln Financial supporting risk analytics through AWS data solutions. Building scalable data pipelines and collaborating with cross-functional teams.