Senior Data Engineer at SS&C building and optimizing data pipelines in a lakehouse environment. Collaborating with data architects and stakeholders in the financial services sector.
Responsibilities
Implement and maintain end-to-end data pipelines for data acquisition from diverse sources, including databases, APIs, files, and messaging systems such as Kafka.
Build robust data validation, enrichment, and transformation workflows using Python and PySpark.
Develop and optimize data storage and querying layers using technologies such as Apache Iceberg, Trino, StarRocks, and Snowflake.
Implement and maintain dimensional data models, including Star and Snowflake schemas, as defined by data architecture standards.
Integrate and manage streaming data flows using Kafka for both ingestion and real-time data distribution.
Design and implement data quality checks, monitoring, and alerting to ensure high data reliability.
Contribute to metadata management, data governance, and security practices, including access controls and data masking.
Enable data distribution and consumption through files, APIs, Kafka, Snowflake data sharing, and analytics tools.
Optimize pipeline performance, cost, and scalability while troubleshooting and resolving production issues.
Collaborate closely with data architects, analysts, data scientists, and stakeholders to deliver high-quality data products.
Mentor junior engineers and promote best practices in code quality, testing, and CI/CD for data pipelines.
Requirements
Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.
5+ years of hands-on experience in data engineering roles, including at least 2 years working with big data or lakehouse platforms.
Strong proficiency in Python and PySpark for building scalable data processing pipelines.
Hands-on experience with analytical and query platforms such as Trino, StarRocks, and Snowflake.
Experience working with open table formats, particularly Apache Iceberg.
Proven experience with streaming technologies, especially Apache Kafka.
Solid understanding of dimensional modeling and data warehousing concepts.
Familiarity with data quality frameworks, metadata management, governance tools, and security best practices.
Experience with cloud platforms such as AWS, Azure, or GCP, and infrastructure-as-code tools.
Strong problem-solving skills with experience debugging and tuning complex data pipelines.
Data Engineer building and scaling a client-facing Microsoft Fabric analytics platform to drive revenue and decision-making. Collaborating with teams to develop pipelines, optimize performance, and ensure client satisfaction.
Data Engineer role focused on migrating legacy systems to ADA at BBVA. Collaborating with multidisciplinary teams and ensuring system integrity during transitions.
Senior Data Engineer focused on modernizing enterprise data capabilities at U.S. Bank. Designing and building reusable data engineering patterns for consistent delivery across teams.
Principal Data Pipeline Lead at SS&C overseeing development of scalable data pipelines. Leading a small team and providing technical guidance for modern data platform integration.
Experienced Data Architect designing and implementing scalable data architecture for a financial services and healthcare technology company. Collaborating across teams to support analytics and operational needs.
Data Architect designing scalable, secure data architectures for fraud detection and risk management at Fiserv. Collaborating with cross-functional teams and managing large datasets and pipelines.
Director of Engineering overseeing development of AI-driven data platforms at LVT. Leading teams to transform sensor data into actionable insights using modern architecture and technologies.
Senior Data Engineer at Independence Pet Holdings shaping the data ecosystem by building platforms and pipelines. Collaborating with teams to enhance data analytics and operational insights.
Senior Data Engineer designing and developing scalable data pipelines for a fintech company. Collaborating with stakeholders to ensure analytics-ready data formats and supporting batch and streaming processes.
Senior Data Engineer at Vancity designing, building, and optimizing scalable data pipelines. Collaborating closely with analytics and business teams to deliver trusted data products while ensuring high standards of data quality.