SQL Migration Data Engineer at Auxo Solutions focusing on Azure SQL/Fabric Lakehouse migrations and building data pipelines. Collaborating on technical designs and data governance for modernization initiatives.
Responsibilities
Collaborate with the delivery team on a SQL Server 2012 to Azure SQL/Fabric Lakehouse migration, including assessment, planning, and execution.
Develop and optimize ETL/ELT processes to migrate legacy SQL Server 2012 databases to modern cloud data platforms, minimizing data loss and downtime.
Design and build data pipelines using Azure Data Factory, Databricks, and Microsoft Fabric Lakehouse to transform monolithic databases into distributed Lakehouse architectures.
Develop APIs and data services on top of Microsoft Fabric Lakehouse to expose migrated data for downstream applications and stakeholders.
Collaborate with infrastructure and application teams to assess legacy SQL Server 2012 environments, identify technical debt, and plan phased migration approaches.
Develop infrastructure and automation required for optimal extraction, transformation, and loading of data from SQL Server 2012 and other legacy sources using SQL, dbt, Python, and Fabric technologies.
Define and document cloud solution architectures, migration roadmaps, and technical designs for data modernization initiatives.
Generate and document unit tests, performance benchmarks, and migration validation scripts.
Establish data quality frameworks and governance practices for migrated data assets in Lakehouse environments.
Requirements
Bachelor's Degree in Computer Science or a related field.
Azure Cloud Certifications strongly preferred.
At least 3 years of Data Engineering experience, with 1+ years specifically in SQL Server migrations to cloud platforms.
Hands-on experience with SQL Server 2012 architecture, T-SQL optimization, and migration patterns (compatibility issues, index strategies, etc.).
Proficiency in Azure Data Factory, Synapse Analytics, Azure SQL, Data Lake Storage, and Microsoft Fabric (especially Lakehouse), including data modeling, partitioning, and optimization for analytical workloads.
Demonstrated experience building APIs or data services on top of Lakehouse/Delta Lake architectures.
Proficiency with dbt for transformation logic and data lineage documentation.
Strong command of Python, SQL, T-SQL, and scripting for automation and data validation.
Experience with Azure Infrastructure-as-Code (Bicep, ARM templates, Terraform).
Experience building CI/CD pipelines for data infrastructure.
Knowledge of data governance, metadata management, and data quality frameworks.
Ability to work independently in Agile environments with minimal supervision on external client projects.
Benefits
Medical, Dental, and Vision Insurance.
Life, Short-Term Disability, and Long-Term Disability Insurance.