IT Technical Engineer accountable for software development at Nokia, improving application performance and adopting digital technologies. Supporting business and IT projects with technical expertise.
Responsibilities
Develop ETL/ELT workflows for both batch and streaming data, supporting structured and unstructured data sources.
Implement and support Lakehouse and Medallion architecture (Bronze/Silver/Gold) using Azure Data Lake Storage and Delta Lake.
Optimize Spark jobs through effective partitioning, caching, cluster tuning, and cost-efficient resource utilization.
Integrate Databricks with key Azure services, including Data Factory, Synapse, and Key Vault.
Implement strong data governance practices using Unity Catalog, encryption, lineage, and data quality checks.
Monitor, troubleshoot, and improve the performance of data pipelines, clusters, and job executions.
Collaborate with business analysts, architects, and business stakeholders to develop end-to-end analytics solutions.
Support CI/CD automation, Git branching strategies, and DevOps best practices.
Support change management activities and follow coding standards.
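To make the Lakehouse/Medallion responsibilities above concrete, here is a minimal PySpark sketch of one Bronze-to-Silver step on Delta Lake. All paths, table names, and columns (`event_id`, `event_ts`, `event_date`) are illustrative assumptions, not details from this posting.

```python
# Hedged sketch of a Bronze -> Silver medallion step on Delta Lake.
# Paths, table names, and columns are hypothetical examples.

def layer_path(base: str, layer: str, table: str) -> str:
    """Build an ADLS-style location for a medallion layer
    (hypothetical naming convention)."""
    return f"{base}/{layer}/{table}"

def bronze_to_silver(spark, base: str):
    """Read raw Bronze events, apply basic quality rules, and write the
    cleaned result to Silver as a partitioned Delta table."""
    # PySpark is imported lazily so the pure helper above stays usable
    # without a Spark installation.
    from pyspark.sql import functions as F

    bronze = spark.read.format("delta").load(layer_path(base, "bronze", "events"))
    silver = (
        bronze
        .dropDuplicates(["event_id"])                        # simple data-quality check
        .filter(F.col("event_id").isNotNull())
        .withColumn("event_ts", F.to_timestamp("event_ts"))  # enforce types
    )
    (silver.write.format("delta")
           .mode("overwrite")
           .partitionBy("event_date")                        # partitioning for cheaper scans
           .save(layer_path(base, "silver", "events")))
    return silver
```

In practice the same pattern repeats for Silver-to-Gold, and the cleaned tables would be registered in Unity Catalog for governance and lineage.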
Requirements
3–5+ years of hands-on experience with Azure Databricks, Spark, PySpark, SQL, and data engineering. Strong experience implementing Delta Lake, Unity Catalog, Lakehouse architecture, and Databricks Jobs/Pipelines.
Proficiency with Azure services such as Data Lake Storage, Data Factory, integration runtimes, and Key Vault.
Strong understanding of ETL/ELT design, data modeling, medallion architecture, and large-scale data processing.
Experience with performance tuning, cluster optimization, and troubleshooting distributed systems. Hands-on experience with Power BI.
Experience with CI/CD pipelines, Azure DevOps, Git version control.
Exposure to Unity Catalog advanced features, Row-Level Security, and data governance frameworks (Nice-To-Have).
Knowledge of monitoring and observability tools, data quality frameworks, or automation solutions (Nice-To-Have).
Azure certifications such as DP-203 or Databricks Data Engineer Associate/Professional (Nice-To-Have).
Experience integrating Databricks with external systems via APIs, connectors, and streaming services (Nice-To-Have).
Benefits
Flexible and hybrid working schemes
A minimum of 90 days of Maternity and Paternity Leave, with the option to return to work within a year following the birth or adoption of a child (based on eligibility)
Life insurance for all employees to provide peace of mind and financial security
Well-being programs to support your mental and physical health
Opportunities to join and receive support from Nokia Employee Resource Groups (NERGs)
Employee Growth Solutions to support your personalized career & skills development
Diverse pool of Coaches & Mentors to whom you have easy access
A learning environment which promotes personal growth and professional development - for your role and beyond