Senior Data Engineer architecting and building intelligent data ecosystems that integrate Generative AI. Focused on delivering high-impact solutions across diverse industries in Poland.
Responsibilities
Analyze and optimize business processes by collaborating with stakeholders to uncover inefficiencies and define data requirements for automation
Design scalable, modular data architectures that integrate with Generative AI and Agentic AI systems to support real-time decision-making
Engineer robust ETL/ELT pipelines using Python, cloud-native services, and orchestration tools, supporting both batch and streaming data needs
Architect RAG and vector database solutions using semantic search to enable LLMs to retrieve curated, context-rich business data
Build intelligent data products, from predictive models and decision engines to AI-driven insights platforms
Implement data quality, validation, and governance frameworks to ensure data integrity, lineage, and compliance across systems
Lead technical discovery sessions with clients to transform complex business challenges into AI and data-driven opportunities
Mentor team members on best practices in data engineering, AI integration, and modern cloud architectures
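The RAG and semantic-search responsibility above boils down to ranking stored documents by embedding similarity. A minimal, dependency-free sketch follows; the `embed`-style vectors and the tiny `corpus` are hypothetical stand-ins for a real embedding model and a managed vector database such as Pinecone, Weaviate, Chroma, or Milvus:

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Toy "vector database": document text mapped to a precomputed embedding.
# In production these vectors come from an embedding model and live in a
# dedicated vector store with approximate-nearest-neighbor indexing.
corpus = {
    "Q3 revenue grew 12% year over year": [0.9, 0.1, 0.0],
    "New data-retention policy takes effect in May": [0.1, 0.8, 0.2],
    "Pipeline latency dropped after the Spark upgrade": [0.0, 0.2, 0.9],
}

def retrieve(query_vec, k=2):
    """Return the k documents most similar to the query embedding,
    which would then be passed to an LLM as retrieval context."""
    ranked = sorted(corpus.items(),
                    key=lambda item: cosine(query_vec, item[1]),
                    reverse=True)
    return [text for text, _ in ranked[:k]]

# A query embedding close to the "revenue" document retrieves it first.
print(retrieve([0.85, 0.15, 0.05], k=1))
```

Real deployments replace the linear scan with an approximate-nearest-neighbor index, but the retrieve-then-generate contract is the same.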
Requirements
Expert-level Python proficiency for data engineering, including API integrations, data transformations (Pandas, PySpark), and automation
Proven experience designing and deploying large-scale data platforms on AWS, GCP, or Azure
Strong foundation in building production-grade ETL/ELT pipelines using Apache Airflow, Kafka, Spark, or cloud-native tools
Hands-on experience with vector databases (e.g., Pinecone, Weaviate, Chroma, Milvus) and implementing semantic search
Demonstrated knowledge of Generative AI and LLMs, with practical experience in RAG architectures and prompt engineering
Deep understanding of data governance, quality, and documentation, with a focus on lineage, metadata, and compliance
Familiarity with cloud services including serverless computing, managed databases, and data warehouses such as BigQuery, Redshift, or Snowflake
Experience working with complex real-world data environments, including legacy systems, SaaS integrations, APIs, and databases
Fluency in Lithuanian and English, both written and spoken.
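The data quality and governance requirements above typically surface as row-level validation gates inside a pipeline. A minimal sketch, assuming an illustrative schema (`id`, `amount`) and two toy rules; the field names and rules are hypothetical, not part of the role description:

```python
from dataclasses import dataclass, field

@dataclass
class ValidationResult:
    """Rows split by a quality gate; rejected rows would be routed to a
    quarantine table for lineage and compliance review."""
    valid: list = field(default_factory=list)
    rejected: list = field(default_factory=list)

def validate_rows(rows, required=("id", "amount")):
    """Apply two illustrative quality rules: required fields must be
    present and non-null, and amount must be non-negative."""
    result = ValidationResult()
    for row in rows:
        if any(row.get(col) is None for col in required):
            result.rejected.append(row)
        elif row["amount"] < 0:
            result.rejected.append(row)
        else:
            result.valid.append(row)
    return result

rows = [
    {"id": 1, "amount": 42.0},
    {"id": 2, "amount": -5.0},    # fails the non-negative rule
    {"id": None, "amount": 7.0},  # fails the required-field rule
]
result = validate_rows(rows)
```

In a production pipeline the same gate would be expressed in a framework such as Great Expectations or a dbt test, with rejected rows logged for lineage rather than silently dropped.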
Benefits
Health insurance and a yearly training budget (local and international conferences, language courses), employee-led workshops
Flexible working hours
Unlimited WFH (work from home) policy
Extra vacation days: 2 after working at NFQ for two years and 4 after four years on our team
Bonus for referrals
For those who dream of traveling: WFA (work from anywhere) possibilities in NFQ-approved countries
Senior Data Engineer supporting an AI-enabled financial compliance initiative with data pipelines and ingestion processes. Collaborating with diverse teams in a mission-critical regulated environment.
Data Architect leading the definition and construction of cloud data architecture for Kyndryl. Participating in significant technological modernization initiatives, focusing on Google Cloud Platform.
Senior Data Engineer driving data intelligence requirements and scalable data solutions for a global consulting firm. Collaborating across functions to enhance Microsoft architecture and analytics capabilities.
Experienced AI Engineer designing and building production-grade agentic AI systems using generative AI and large language models. Collaborating with data engineers and data scientists in a tech-driven company.
Intermediate Data Engineer designing and building data pipelines for travel industry data management. Collaborating across teams to ensure reliable data for analytics and reporting.
Data Engineer managing and organizing datasets for AI models at Walaris, developing AI-driven autonomous systems for defense and security applications.
Data Engineer designing and maintaining data pipelines at Black Semiconductor. Collaborating with process, equipment, and IT teams to support manufacturing analytics and decision-making.
Junior Data Engineer role focusing on Business Intelligence and Big Data at Avanade. Collaborating on data analysis and SQL queries in a supportive learning environment.
GCP Data Engineer designing and developing data processing modules for Ki, an algorithmic insurance carrier. Working closely with multiple teams to optimize data pipelines and reporting.
Data Engineer at Securian Financial optimizing scalable data pipelines for AI and advanced analytics. Collaborating with teams to deliver secure and accessible data solutions.