Senior Data Engineer with expert-level database skills at FreedomPay. Collaborating on data pipelines and products across SQL Server and Snowflake.
Responsibilities
Design and evolve data models and database objects for operational and analytical workloads in Microsoft SQL Server and Snowflake (schemas, roles, warehouses, performance and cost optimization).
Build and maintain ELT/ETL pipelines (batch and near-real-time), leveraging Snowflake capabilities (Snowpipe, Streams/Tasks) and orchestration tools (e.g., Airflow or Azure Data Factory) as appropriate.
Implement and support data streaming and event-driven ingestion patterns using technologies such as Kafka or Azure Event Hubs (topics/streams, schemas, consumers, and replay strategies).
Leverage Redis and other low-latency data stores for caching and real-time access patterns; partner with application teams to define fit-for-purpose SLAs and data freshness targets.
Develop and maintain curated datasets and self-service analytics in Sigma Computing (workbooks, datasets, governance and performance), and support legacy reporting where needed (e.g., SSRS).
Collaborate with engineering, analytics, and product teams to deliver data solutions that meet business requirements.
Automate deployments using Git-based workflows and CI/CD (e.g., Azure DevOps), including database migration/versioning (Flyway).
Use Claude Code (AI-assisted development) to accelerate data pipeline delivery (design, implementation, refactoring, documentation, and troubleshooting) while adhering to security, quality, and SDLC standards.
Participate in Agile ceremonies and contribute to continuous improvement of data engineering processes and standards.
Establish data quality, testing, and observability (e.g., unit/integration tests for pipelines, data validation, lineage, alerting, SLAs) to ensure reliable delivery.
Partner with engineering, analytics, and product teams to define and deliver data products (source-to-target mappings, contracts, SLAs), enabling trustworthy analytics and operational use cases.
Ensure data security, governance, and compliance across platforms (PII handling, encryption, auditing, retention), including Snowflake RBAC, secure data sharing, and access controls.
Troubleshoot and resolve performance, reliability, and scalability issues across data platforms; instrument pipelines with logging/metrics and on-call friendly runbooks.
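The data quality and validation work described above can be illustrated with a minimal sketch. This is a hypothetical example, not FreedomPay's actual tooling: the record shape, field names, and rules are assumptions chosen only to show the pattern of collecting failures instead of silently loading bad rows downstream.

```python
from dataclasses import dataclass, field

@dataclass
class ValidationResult:
    passed: int = 0
    failures: list = field(default_factory=list)  # (row index, error messages)

def validate_rows(rows, required_fields, non_negative_fields=()):
    """Validate a batch of dict-shaped records before loading.

    Rows missing a required field, or carrying a negative value in a field
    that must be non-negative, are collected as failures rather than loaded.
    """
    result = ValidationResult()
    for i, row in enumerate(rows):
        errors = []
        for f in required_fields:
            if row.get(f) in (None, ""):
                errors.append(f"missing {f}")
        for f in non_negative_fields:
            value = row.get(f)
            if isinstance(value, (int, float)) and value < 0:
                errors.append(f"negative {f}")
        if errors:
            result.failures.append((i, errors))
        else:
            result.passed += 1
    return result
```

In a pipeline, a check like this would typically run as a pre-load gate, with failure counts feeding the alerting and SLA metrics mentioned above.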
Requirements
7+ years of experience in data engineering and/or database engineering, including building and operating production data pipelines.
Strong understanding of modern data engineering practices and tools (cloud data platforms, orchestration, testing/observability, DataOps, and AI-assisted development with Claude Code).
Strong English reading and writing skills, with the ability to express and understand complex technical concepts. If other languages are required, this will be explicitly noted during the recruitment process.
Strong analytical, problem-solving, and conceptual skills.
Hands-on experience with Snowflake and integrating it into production data pipelines.
Experience enabling governed self-service analytics with Sigma Computing (datasets, workbooks, access controls, and performance best practices).
Experience with streaming/event platforms such as Kafka or Azure Event Hubs, including schema/versioning considerations and operational support.
Proficiency with Python for data engineering automation and/or building pipeline components; experience with orchestration (Airflow and/or Azure Data Factory) is strongly preferred.
Experience using Claude Code to develop, test, and iterate on data pipeline solutions (e.g., generating boilerplate, improving SQL/Python, and speeding up root-cause analysis) with appropriate human review.
Ability to work in teams, with strong interpersonal skills.
Ability to work under pressure and meet tight deadlines.
Ability to anticipate potential problems and determine and implement solutions.
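The replay strategies called out in the streaming requirements above usually come down to idempotent consumption. The sketch below is a hypothetical, platform-agnostic illustration: the event shape, the in-memory state, and the seen-id set are stand-ins for a real topic and a durable store, not any specific Kafka or Event Hubs API.

```python
def apply_events(events, state, seen_ids):
    """Apply each event exactly once per event id, so replaying a stream
    (e.g., after a consumer restart) does not double-count amounts."""
    for event in events:
        eid = event["id"]
        if eid in seen_ids:  # duplicate delivery or replay: skip safely
            continue
        seen_ids.add(eid)
        key = event["account"]
        state[key] = state.get(key, 0) + event["amount"]
    return state
```

Because the id set persists across calls, replaying the same batch after a failure leaves the state unchanged, which is the operational property replay strategies aim for.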