Data Engineer ensuring high-quality business and AI data streams at Cybercare. Collaborate with data teams to develop cutting-edge data products using the latest tools and technologies.
Responsibilities
Maintain, develop, and optimize data infrastructure.
Design, construct, and enhance SQL databases, data warehousing, and data quality monitoring solutions.
Ensure timely and accurate data loading processes.
Work closely with Analytics Engineers and Data Analysts to support their data needs.
Take initiative in identifying improvements to data quality, security, and pipeline efficiency.
Help establish and maintain a robust data warehouse built with dbt.
Collaborate with multiple departments and Cybercare client product teams to establish data sharing.
Requirements
2+ years of experience in a Data Engineering role.
Proficiency in the Python programming language and data manipulation libraries.
Strong understanding of databases, SQL, and data modeling.
Experience with cloud platforms for data processing and storage (Google BigQuery).
Extensive experience in ETL processes, data transformation, and data pipeline development.
Experience with Apache Airflow.
Deep knowledge of data sourcing and data storage solutions.
Ability to integrate data from multiple sources, both structured and unstructured.
Experience with CI/CD integration.
Benefits
Career development training - Internal and external events, online training, conferences, books - everything you need to reach your full potential.
Health benefits - Private health insurance, online workouts or a free on-site gym 24/7, consultations, and mental health programs to help you feel and be your best. For those who seek flexibility and autonomy in their health-related benefits, we provide a wallet topped up with funds that you can spend across a variety of health and wellness services.
Workplace Flex - At CyberCare, we work from our Vilnius and Kaunas offices, with the option to work from home two days a week - Tuesdays and Thursdays. Additionally, you can work remotely from anywhere in the world for up to 30 days per year.
Technologies - Work with us and we will provide the latest technology, software, and gadgets to optimize and streamline your daily workflow. We will also provide you with Nord Security and Saily products so you can secure your personal devices, too!
Company events - Regular internal events, quarterly team buildings, OKRs events, workshops & workation trips.
Office perks & public transportation - Fully stocked kitchens, snacks, drinks, and pet-friendly offices. We also offer discounted public transportation and encourage its use.
Time off - Additional time off for rest, for anniversaries, or to celebrate life’s special moments.