Senior Data Engineer designing and implementing the Enterprise Data Platform at Stellix. Focusing on analytics and insights with a growth path to Principal Data Engineer or Data Architect.
Responsibilities
Responsible for scoping, architecting, designing, and developing robust data engineering solutions, including data pipelines, data integration, and infrastructure.
Support the data architect in the creation of conceptual and logical data models.
Own the creation of physical data models optimized for analytics, reporting, and AI/machine learning use cases.
Serve as the technical owner of the data platform, making architectural decisions, maintaining high code quality, and delivering scalable, reliable solutions.
Integrate data from diverse sources, including databases, APIs, flat files, and cloud platforms.
Design and build performant, scalable data pipelines using tools like dbt, Fivetran, and Airflow.
Troubleshoot issues with production data pipelines and implement monitoring and alerting as needed.
Collaborate across business, governance, QA, and analytics teams to ensure data quality, consistency, and successful solution delivery.
Define and implement enterprise-scale data engineering best practices, standards, and guidelines across the development life cycle.
Requirements
Bachelor's degree in computer science, data science, software engineering, information systems, or a related quantitative field; master's degree preferred.
10+ years of data engineering experience, with at least 3 years in a modern cloud/data stack.
Demonstrated experience designing and implementing enterprise-scale data platforms.
Proficient in data management disciplines relevant to data engineering, including data integration, data modeling, data warehouse/lake development, and data quality.
Strong communication skills, with the ability to clearly articulate technical concepts to non-technical stakeholders.
Strong problem-solving skills and a proactive, ownership-driven mindset.
Benefits
Highly competitive Medical, Dental, and Vision Insurance
Flexible Spending or Health Savings Accounts
Unlimited Vacation Time
10 Paid Holidays
12 Paid Weeks Maternity Leave
Pet Insurance
Retirement Savings: 401(k) and Employee Stock Ownership Plan