Data Engineer at Cargill designing, building, and maintaining data systems using cloud technologies. Working with data pipelines and infrastructure to ensure data accessibility and usability.
Responsibilities
Designs, builds and maintains moderately complex data systems
Develops moderately complex data products and solutions using advanced data engineering and cloud-based technologies
Maintains and supports the development of streaming and batch data pipelines
Reviews existing data systems and architectures
Helps prepare data infrastructure to support the efficient storage and retrieval of data
Implements automated deployment pipelines to improve efficiency of code deployments
Performs moderately complex data modeling aligned with the datastore technology
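The batch-pipeline work described in the responsibilities above can be sketched minimally in Python. This is an illustrative extract-transform-load outline, not Cargill's actual tooling; all names and the sample data are hypothetical.

```python
import csv
import io


def extract(source: str) -> list[dict]:
    """Extract: parse raw CSV text into a list of row dicts."""
    return list(csv.DictReader(io.StringIO(source)))


def transform(rows: list[dict]) -> list[dict]:
    """Transform: normalize values and drop malformed rows."""
    cleaned = []
    for row in rows:
        try:
            qty = int(row["quantity"])
        except (KeyError, ValueError):
            continue  # skip rows whose quantity is missing or non-numeric
        cleaned.append({"item": row["item"].strip().lower(), "quantity": qty})
    return cleaned


def load(rows: list[dict]) -> str:
    """Load: serialize the cleaned rows back to CSV (stand-in for a datastore write)."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=["item", "quantity"])
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()


raw = "item,quantity\n Widget ,3\nGadget,oops\ngizmo,7\n"
result = load(transform(extract(raw)))
```

In practice each stage would read from and write to real storage (e.g. object store or warehouse tables) and run under an orchestrator, but the stage boundaries are the same.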
Requirements
Minimum of 2 years of relevant work experience
Familiarity with major cloud platforms (AWS, GCP, Azure)
Experience with modern data architectures, including data lakes, data lakehouses, and data hubs
Proficiency in data collection, ingestion tools (Kafka, AWS Glue), and storage formats (Iceberg, Parquet)
Knowledge of streaming architectures and tools (Kafka, Flink)
Strong background in data transformation and modeling using SQL-based frameworks and orchestration tools (dbt, AWS Glue, Airflow)
Familiarity with using Spark for data transformation
Proficient with programming in Python, Java, Scala, or similar languages
Expert-level proficiency in SQL for data manipulation and optimization
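The SQL-based transformation and modeling skills listed above amount to turning raw records into aggregated, query-ready tables. A minimal sketch using Python's built-in sqlite3 module (the table and data are invented for illustration; a dbt model or Glue job would encapsulate the same kind of query):

```python
import sqlite3

# In-memory database standing in for a warehouse; schema and rows are hypothetical.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE orders (order_id INTEGER, region TEXT, amount REAL);
INSERT INTO orders VALUES
  (1, 'north', 120.0),
  (2, 'south', 80.0),
  (3, 'north', 50.0);
""")

# Aggregate raw orders into a per-region summary -- the kind of
# transformation step a SQL-based framework would version and schedule.
rows = conn.execute("""
    SELECT region, COUNT(*) AS n_orders, SUM(amount) AS total
    FROM orders
    GROUP BY region
    ORDER BY region
""").fetchall()
```

The same GROUP BY pattern scales from this toy table to warehouse fact tables; the framework mainly adds dependency ordering, testing, and deployment around it.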
Data Architect defining and implementing architecture for the client's Unified Data Platform on Microsoft Azure. Collaborating across teams to ensure scalability, security, and compliance in the platform.
Data Engineer focused on Big Data and AI solutions for corporate digital transformation. Collaborating on data engineering projects and implementing analytical models.
Data Engineer joining Sicredi's team to analyze credit pricing data and build metrics. Involves multidisciplinary collaboration across various teams and data processes.
Data Engineer designing and developing end-to-end data pipelines for analytics and reporting at Deloitte. Collaborating with BI teams and optimizing data models using Azure services.
Data Engineer role focusing on end-to-end data pipelines in an Azure environment. Hybrid work model at Deloitte with strong emphasis on collaboration and continuous learning culture.
Data Engineer responsible for data infrastructure design and maintenance at Avanquest. Focus on ensuring data accessibility and reliability for analysis with collaborative teams.
Finance Data Specialist at GEICO managing the build-out of a Finance Data Warehouse. Collaborating with Finance and Technology teams to redefine technology management in Finance operations.
Data Engineering Intern at Bynder providing technical solutions for digital asset management. Collaborating with teams to enhance user experience and improve software infrastructure.
Azure Data Engineer responsible for designing and developing ETL/ELT pipelines using Azure Data Factory and Snowflake. Collaborating with teams to ensure data quality and optimize performance in a hybrid work environment.