Data Engineer III at Hanger, Inc. designing and maintaining data solutions using Microsoft Azure. Collaborating with stakeholders and optimizing ETL processes for enterprise analytics.
Responsibilities
Design, develop, and maintain scalable data pipelines and integrations using Microsoft Azure and Microsoft Fabric.
Build and optimize ETL/ELT processes to ingest structured and semi-structured data from internal and external systems.
Design and implement logical and physical data models to support enterprise analytics, reporting, and business intelligence initiatives.
Develop and optimize relational databases, including Microsoft SQL Server and PostgreSQL.
Apply dimensional modeling techniques (e.g., star and snowflake schemas) to support reporting and analytical use cases.
Leverage Microsoft Azure services (e.g., Azure Data Factory, Azure Synapse, Azure SQL, Azure Storage) to deliver scalable, secure data solutions.
Develop and maintain Microsoft Fabric solutions, including pipelines, lakehouse environments, and semantic models.
Integrate and manage data ingestion tools such as Fivetran or similar data integration platforms.
Monitor, troubleshoot, and optimize data pipeline performance and reliability.
Ensure adherence to data governance, security, and compliance standards.
Collaborate with cross-functional teams to translate business requirements into scalable technical solutions.
Provide mentorship and technical guidance to Data Engineers I and II.
Participate in code reviews and promote best practices in data engineering and DevOps methodologies.
Requirements
5+ years of experience in data engineering, data warehousing, or related technical roles.
Hands-on experience building and maintaining cloud-based ETL/ELT pipelines.
Strong experience with Microsoft Azure data services.
Demonstrated experience in data modeling for enterprise analytics.
Advanced proficiency in SQL and experience with Microsoft SQL Server and PostgreSQL.
Must have, or be eligible to obtain, a valid driver’s license and a driving record that meets the standards outlined in Hanger’s Motor Vehicle Safety Policy and Procedures.
Preferred: Bachelor’s degree in Computer Science, Information Systems, Engineering, or a related technical field.
Experience with Microsoft Fabric.
Experience with Fivetran or similar data integration platforms.
Experience in healthcare or other regulated industries.
Familiarity with DevOps practices, CI/CD pipelines, and infrastructure-as-code.
Benefits
8 Paid National Holidays & 4 additional Floating Holidays
PTO that includes Vacation and Sick time
Medical, Dental, and Vision Benefits
401k Savings and Retirement Plan
Paid Parental Bonding Leave for New Parents
Flexible Work Schedules and Part-time Opportunities
Generous Employee Referral Bonus Program
Mentorship Programs (Mentor and Mentee)
Student Loan Repayment Assistance by Location
Relocation Assistance
Regional & National traveling CPO/CO/CP opportunities
Volunteering for Local and National events such as Hanger’s BAKA Bootcamp and EmpowerFest
Senior Enterprise Data Architect shaping data flow across Lundbeck by unifying enterprise data platforms. Leading architectural design and governance for scalable and sustainable data solutions.
Data Engineer supporting the development and implementation of enterprise-wide data governance practices at a climate technologies company focused on sustainability. Collaborating cross-functionally to enhance data quality and compliance processes.
Data Engineer building scalable data pipelines for analytics at UOL EdTech. Collaborating with data teams and supporting a data-driven culture in education technology.
Data Engineering Intern helping build and maintain data pipelines using Python and SQL. Assisting the Data and Analytics team on various data processes and projects.
Senior Data Engineer designing and maintaining data processing pipelines for analytics and machine learning in a fast-paced startup. Collaborating with cross-functional teams to ensure data accuracy and security.
Data Engineer developing data pipelines and ETL processes for Stefanini's data architecture modernization. Involves data migration from AS400 to Microsoft Fabric Lakehouse.
Senior Data Engineer responsible for overseeing data ingestion and delivery at Kpler. Leading engineering best practices and collaborating with teams on client-facing data solutions.
Senior Data Engineer working on GCP cloud data solutions and ETL processes in AI & Data Engineering team. Collaborating within a hybrid work setup in Bangalore, India.
Lead Data Engineer designing and managing AWS data pipelines and platforms for the AI & Data Engineering team. Involves collaborating with data scientists, analysts, and stakeholders on data-driven solutions.
Senior Data Engineer designing and developing scalable data pipelines using dbt and Python. Collaborating with internal stakeholders on analytics and reporting solutions.