Senior Big Data Engineer at Citi responsible for leading application systems analysis and programming activities. Collaborating with management to integrate functions and improve processes.
Responsibilities
Partner with multiple management teams to ensure appropriate integration of functions to meet goals
Identify and define necessary system enhancements to deploy new products and process improvements
Resolve a variety of high-impact problems and projects through in-depth evaluation of complex business processes
Provide subject-matter expertise and advanced knowledge of applications programming
Ensure application design adheres to the overall architecture blueprint
Utilize advanced knowledge of system flow and develop standards for coding, testing, debugging, and implementation
Develop comprehensive knowledge of how areas of business integrate to accomplish business goals
Provide in-depth analysis with interpretive thinking to define issues and develop innovative solutions
Serve as advisor or coach to mid-level developers and analysts, allocating work as necessary
Appropriately assess risk when business decisions are made
Requirements
6-10 years of experience in data pipeline development
Design, build, and maintain scalable ETL/ELT pipelines to ingest, transform, and load data from multiple sources
Big Data Infrastructure: Develop and manage large-scale data processing systems using frameworks like Apache Spark, Hadoop, and Kafka
Proficiency in programming languages such as Python or Scala
Strong expertise in data processing frameworks such as Apache Spark and Hadoop
Expertise in Data Lakehouse technologies (Apache Iceberg, Apache Hudi, Trino)
Experience with cloud data platforms like AWS, Azure, or GCP
Expertise in SQL and database technologies (e.g., Oracle, PostgreSQL)
Experience with data orchestration tools like Apache Airflow or Prefect
Familiarity with containerization (Docker, Kubernetes) is a plus
Prior experience building distributed, multi-tier applications is highly desirable
Experience building highly performant, scalable applications is a plus
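The ETL/ELT pipeline skills listed above can be illustrated with a minimal sketch of the extract-transform-load pattern. In a real role this logic would run on Spark and be scheduled by an orchestrator such as Airflow; the toy example below uses only the Python standard library, and all names and sample records are illustrative assumptions, not part of this job description.

```python
import csv
import io


def extract(raw_csv: str) -> list[dict]:
    """Extract: parse raw CSV rows into dicts (stand-in for reading a source system)."""
    return list(csv.DictReader(io.StringIO(raw_csv)))


def transform(rows: list[dict]) -> list[dict]:
    """Transform: cast types, drop malformed records, normalize keys."""
    out = []
    for row in rows:
        try:
            amount = float(row["amount"])
        except (KeyError, ValueError):
            continue  # drop records whose amount is missing or not numeric
        out.append({"account": row["account"].strip().upper(), "amount": amount})
    return out


def load(rows: list[dict], sink: dict) -> None:
    """Load: aggregate per account into the target store (a plain dict as a toy sink)."""
    for row in rows:
        sink[row["account"]] = sink.get(row["account"], 0.0) + row["amount"]


# Hypothetical input: one malformed record ("oops") is filtered out in transform.
raw = "account,amount\nacc1,10.5\nacc2,oops\nacc1,4.5\n"
sink: dict[str, float] = {}
load(transform(extract(raw)), sink)
print(sink)  # {'ACC1': 15.0}
```

The same three-stage shape carries over to Spark: extract maps to `spark.read`, transform to DataFrame operations, and load to a writer against the warehouse or lakehouse table.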
Data Management professional at Kyndryl involved in creating innovative data solutions and ensuring the seamless operation of complex data systems. Collaborating with teams to transform requirements into scalable database solutions.
Software Engineer designing and developing scalable data processing applications on cloud infrastructure for Thomson Reuters. Collaborating with Data Analysts on AI-enabled solutions for data management and insight generation.
Manager of Data Platform overseeing AWS cloud infrastructure and Snowflake data warehouses for Thomson Reuters. Leading the design and implementation of data processing applications in a hybrid role located in Bengaluru.
Senior Data Engineer designing scalable data pipelines and solutions for Enterprise Data Lake at Thomson Reuters. Collaborating across teams to ensure efficient data ingestion and accessibility.
Senior Data Engineer at Technis developing scalable data pipelines and solutions for innovative connected spaces products. Collaborating within a cross-functional team to deliver high-quality, data-driven outcomes.
Data Architect designing and implementing data architectures supporting analytics and ML for federal clients. Collaborating with teams to translate mission needs into robust data solutions.
IT Data Engineer developing data pipelines and integrations for Scanfil Group's global IT organization. Collaborating across teams to enhance data solutions and reporting capabilities.
Data Engineer developing Azure data solutions at PwC New Zealand. Responsibilities include data quality monitoring, pipeline development, and collaboration with stakeholders in a supportive environment.
Senior Data Engineer designing and implementing the Enterprise Data Platform at Stellix. Focusing on analytics and insights with a growth path to Principal Data Engineer or Data Architect.