Senior Data Engineer enhancing data infrastructure using Databricks and GCP, managing data pipelines and collaborating with data scientists on innovative solutions.
Responsibilities
Design and implement robust data pipelines using Databricks, integrating them with Delta Lake for efficient data storage and management.
Manage end-to-end data workflows from ingestion to insights, using Fivetran for seamless data integration and GCP for scalable cloud solutions.
Administer Databricks environments, ensuring optimal configuration, security, and performance across all data operations.
Utilize Unity Catalog to manage data governance across all Databricks workspaces, ensuring compliance with data privacy and security policies.
Develop and maintain scalable and efficient data models and architecture, supporting the strategic goals of the organization.
Collaborate with data scientists and analysts to deploy machine learning models and complex analytical projects.
Requirements
Bachelor’s degree in computer science, engineering, information technology, or a related field, or four (4) years of relevant experience in lieu of a degree.
Minimum of five (5) years of experience [nine (9) years for non-degreed candidates] in a data engineering role, with significant exposure to Databricks, Delta Lake, and cloud platforms such as GCP.
Proven expertise in Databricks administration and managing large-scale data environments.
Strong experience with SQL, Python, and other scripting languages commonly used in data engineering.
Familiarity with Fivetran or similar data integration tools and understanding of ETL processes.