Data Engineer creating data pipelines for Santander's card domain. Collaborating with an agile team on strategic projects and leveraging Databricks and PySpark expertise.
Responsibilities
The Data Engineer will work with the Card team’s Data Engineering group to create data pipelines for ingesting and exposing card-domain data in Santander Brazil’s Corporate Data Lake.
You will join an agile team working on a project in a strategic area and should have experience with Databricks and PySpark.
Requirements
Databricks proficiency: Experience working with Apache Spark on Databricks, including building and optimizing data pipelines.
PySpark, Python and Kedro experience: Strong programming skills in PySpark and Python, plus experience with Kedro, for developing, debugging and maintaining data transformation code.
Batch and streaming data processing: Knowledge of batch and streaming (messaging) data processing, with the ability to design, implement and maintain data processing pipelines.
DevOps knowledge: Familiarity with Jenkins for continuous integration and continuous delivery (CI/CD), as well as automation of deployment tasks and pipeline management.
Git: Proficiency with Git for source code version control and effective team collaboration.
Agile methods: Understanding of agile principles and practices such as Kanban and Scrum for effective collaboration and project management.
Orchestration (e.g., Control‑M): Knowledge of workflow orchestration tools for scheduling and monitoring jobs.
Microsoft Azure knowledge: Experience with key Microsoft Azure data services, including Azure Databricks, Azure Data Factory and Azure Storage.
AWS knowledge: Experience with key AWS services such as Aurora PostgreSQL, CloudWatch, Lambda and S3.
On‑Premises environments (Cloudera) experience: Previous experience with the Cloudera platform or other on‑premises big data solutions, including Hadoop, HBase and Hive, is desirable.
Object‑oriented development knowledge: Familiarity with Java is helpful (not required to write code, but to interpret it).
Optional certifications: AZ‑900 (Microsoft Azure Fundamentals) and DP‑900 (Microsoft Azure Data Fundamentals) are preferred and demonstrate solid knowledge of the Azure platform and data fundamentals.
Benefits
Bradesco Health Plan (30% co-payment)
Bradesco Dental Plan (no employee contribution)
Life Insurance
Wellhub (Gympass)
Childcare allowance
Allowance for children with special needs
Payroll‑deductible loan
Private pension
Pet plan
SESC benefits
Conexa telemedicine
Cost allowance
Meal / Food voucher
Multi‑benefits card
Medical plan upgrade
Differentials
We are a socially responsible employer: extended maternity and paternity leave
INMaterna Program: support program for pregnant employees
Newborn welcome kit and the book "It Happened When I Was Born"
Professional development: courses available through the internal university
100% remote or hybrid, depending on the project.