Full Stack Software Engineer building cloud-based applications for R&D's field automation team. Collaborating in an agile environment to enhance connectivity and efficiency in agricultural technologies.
Responsibilities
Work in an agile development environment to build and support full-stack cloud-based applications.
Work with development lead and stakeholders to manage project priorities, deadlines, and deliverables.
Interact directly with business customers to gather and understand requirements and how they translate into application features.
Conduct functional and non-functional testing.
Troubleshoot and debug applications.
Deploy applications across all environments in collaboration with development team, product management, and delivery.
Champion code quality including unit and integration testing.
Evaluate existing applications to refactor, update and add new features.
Develop any technical documentation needed to accurately represent application design and code.
Adhere to established and modern data security practices.
Provide guidance for technical design and architecture decisions within the team.
Participate in the evaluation and selection of new technologies.
Mentor individuals and teams on technologies, techniques, and standards across the organization.
Requirements
Bachelor’s degree in computer science, computer engineering, or equivalent experience.
3+ years of experience developing REST APIs in a modern programming language/technology such as TypeScript or JavaScript.
3+ years of experience developing single- and multi-page web UIs.
2+ years of experience working with relational/NoSQL database technologies and abstraction tools.
2+ years of experience developing in the AWS ecosystem and practical experience with Lambda, ECS, EC2, S3, IAM, RDS.
Practical experience architecting and designing full-stack software systems.
Practical experience and knowledge of software development best practices.
Demonstrable technical leadership.
Demonstrable ability to manage ambiguity.
Excellent written and verbal communication skills to technical and non-technical audiences.
Experience developing IoT technologies or distributed systems that generate large amounts of data. (Desirable)
Practical experience developing applications or scripting in Python 3. (Desirable)
Experience with asynchronous, event-driven technologies such as Kafka and RabbitMQ. (Desirable)
Data Engineer designing and maintaining data pipelines at Black Semiconductor. Collaborating with process, equipment, and IT teams to support manufacturing analytics and decision-making.
Junior Data Engineer role focusing on Business Intelligence and Big Data at Avanade. Collaborating on data analysis and SQL queries in a supportive learning environment.
GCP Data Engineer designing and developing data processing modules for Ki, an algorithmic insurance carrier. Working closely with multiple teams to optimize data pipelines and reporting.
Data Engineer at Securian Financial optimizing scalable data pipelines for AI and advanced analytics. Collaborating with teams to deliver secure and accessible data solutions.
IT Data Engineering Co‑Op at BlueRock Therapeutics supporting development of scientific data systems. Collaborating on data workflows and foundational AWS data engineering tasks.
Data Engineer I building and operationalizing complex data solutions for Travelers' analytics using Databricks. Collaborating within teams to educate end users and support data governance.
Data Engineer shaping modern data architecture to drive golf’s digital transformation. Collaborating with teams to enhance data pipelines and insights for customer engagement and revenue growth.
Staff Data Engineer overseeing complex data systems for CITY Furniture. Responsible for architecting and optimizing data ecosystems in a hybrid work environment.
Data Engineer strengthening data platform team at Samba TV to improve data analytics and reporting capabilities. Building on AWS, Databricks, BigQuery, and Snowflake technology.
Data Engineer focusing on secure ETL/ELT data pipelines and compliance in healthcare. Designing scalable ingestion frameworks and ensuring alignment with federal standards.