Sr. Data Engineer driving impactful reporting and robust data solutions at URUS. Collaborating on data integration, warehousing and reporting to solve complex data challenges.
Responsibilities
Design, develop, and maintain robust and efficient ETL pipelines and processes on Databricks.
Troubleshoot and resolve Databricks pipeline errors and performance issues.
Maintain legacy SSIS packages for ETL processes.
Troubleshoot and resolve SSIS package errors and performance issues.
Optimize data flow performance and minimize data latency.
Implement data quality checks and validations within ETL processes.
Develop and maintain Databricks pipelines and datasets using Python, Spark, and SQL.
Migrate legacy SSIS packages to Databricks pipelines.
Optimize Databricks jobs for performance and cost-effectiveness.
Integrate Databricks with other data sources and systems.
Participate in the design and implementation of data lake architectures.
Implement DevOps best practices for data pipelines, including CI/CD, monitoring, observability, and automated testing.
Integrate data ingestion from multiple sources (API, streaming, batch, databases) into centralized data platforms.
Use Terraform/CloudFormation (or similar IaC tools) for provisioning Databricks clusters, cloud infrastructure, and networking components.
Improve system performance and cost efficiency through monitoring, autoscaling, and cluster configurations.
Provide mentorship and technical guidance to junior data engineers and collaborate with cross-functional teams.
Participate in the design and implementation of data warehousing solutions.
Support data quality initiatives and implement data cleansing procedures.
Collaborate with business users to understand data requirements for department-driven reporting needs.
Maintain existing library of complex SSRS reports, dashboards, and visualizations.
Troubleshoot and resolve SSRS report issues, including performance bottlenecks and data inconsistencies.
Comfortable working in an entrepreneurial, self-starting, fast-paced environment, both independently and alongside our highly skilled teams.
Communicate technical information clearly and concisely, both verbally and in writing.
Document all development work and procedures thoroughly.
Requirements
Bachelor's degree in Computer Science, Information Systems, or a related field.
7+ years of experience in data integration and reporting.
Extensive experience with Databricks, including Python, Spark, and Delta Lake.
Strong proficiency in SQL Server, including T-SQL, stored procedures, and functions.
Experience with SSIS (SQL Server Integration Services) development and maintenance.
Experience with SSRS (SQL Server Reporting Services) report design and development.
Experience with data warehousing concepts and best practices.
Experience with the Microsoft Azure cloud platform and Microsoft Fabric is desirable.
Strong analytical and problem-solving skills.
Excellent communication and interpersonal skills.
Ability to work independently and as part of a team.
Experience with Agile methodologies.
Must be legally authorized to work in the United States.