Data Engineer designing and maintaining data pipelines at Agile Defense. Focused on integrating technologies for vital national security and civilian missions.
Responsibilities
Design, develop, and maintain data ingestion pipelines to collect structured and unstructured data from user applications and bulk data sources.
Implement efficient data transformation and storage strategies to facilitate easy retrieval and analysis.
Collaborate with software development teams to optimize application-generated data for streamlined aggregation.
Ensure data integrity, security, and compliance with industry regulations and company policies.
Optimize query performance and improve data access patterns to support analytical and operational use cases.
Work with business stakeholders to define data requirements and improve data accessibility for decision-making.
Design and build APIs to support data ingestion, retrieval, and integration with external systems.
Develop and maintain event-driven architectures, utilizing event storage for real-time data processing.
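The event-driven responsibilities above can be sketched in miniature. The following is a hedged illustration in Python using only the standard library: the `EventStore`, `append`, and `project` names are hypothetical, and the in-memory log stands in for a durable event store (such as the products named later in this posting) purely for illustration.

```python
import json
from dataclasses import dataclass, field
from typing import Callable, List

@dataclass
class Event:
    stream: str    # e.g., "sensor-readings"
    type: str      # e.g., "ReadingIngested"
    payload: dict

@dataclass
class EventStore:
    """Append-only in-memory event log with simple fan-out subscriptions."""
    events: List[Event] = field(default_factory=list)
    subscribers: List[Callable[[Event], None]] = field(default_factory=list)

    def subscribe(self, handler: Callable[[Event], None]) -> None:
        self.subscribers.append(handler)

    def append(self, event: Event) -> None:
        self.events.append(event)          # a durable store in a real system
        for handler in self.subscribers:   # push to real-time consumers
            handler(event)

# A projection that aggregates raw events into a queryable read model.
totals: dict = {}

def project(event: Event) -> None:
    key = event.payload["source"]
    totals[key] = totals.get(key, 0) + event.payload["value"]

store = EventStore()
store.subscribe(project)
store.append(Event("readings", "ReadingIngested", {"source": "app-a", "value": 10}))
store.append(Event("readings", "ReadingIngested", {"source": "app-a", "value": 5}))
print(json.dumps(totals))  # {"app-a": 15}
```

The pattern shown (append-only log plus subscribed projections) is one common way the ingestion and real-time processing duties above fit together; a production system would add persistence, ordering guarantees, and replay.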
Requirements
Bachelor’s degree in Computer Science, Software Engineering, or a related technical field, or equivalent professional experience.
Minimum of eight (8) years of relevant professional experience.
Experience developing data pipelines, integration solutions, and data aggregation services in complex environments.
Must possess an active TS/SCI security clearance.
Due to U.S. Government contract requirements, only U.S. citizens are eligible for this position.
3+ years of experience in front-end development with a focus on React.
Strong proficiency in JavaScript (ES6+), TypeScript, HTML5, and CSS3.
Knowledge of modern CSS frameworks (Tailwind CSS, Styled Components, or SASS/SCSS).
Experience with build tools and package managers (Webpack, Vite, pnpm, Yarn, npm).
Familiarity with RESTful APIs, event stores, and WebSockets for data fetching.
Experience with testing frameworks (Jest, React Testing Library, Cypress) is a plus.
Knowledge of performance optimization techniques and best practices for front-end development.
Experience with CI/CD pipelines and version control (Git, GitLab, etc.).
Strong problem-solving skills and the ability to work independently or collaboratively.
Proficiency in database technologies, including SQL, NoSQL, and event-based databases (e.g., PostgreSQL, MySQL, MongoDB, EventStoreDB).
Strong programming skills in Golang, Python, and JavaScript/TypeScript.
Experience with cloud-based data solutions (AWS, GCP, or Azure) and data warehousing.
Familiarity with building APIs, data integration tools, and best practices for data aggregation.
Understanding of API-based data aggregation and event-driven architectures.
Experience working with streaming and batch data processing frameworks is a plus.
Ability to work independently and collaboratively within a team.
Building microservice APIs in Golang/Python (e.g., Gin, Echo, FastAPI) is a plus.
Designing, maintaining, and updating database schemas and configurations (SQL/NoSQL/EventStoreDB) is a plus.
Aggregating user-created and imported classified data into versioned datasets is a plus.
Designing data provenance, lineage, retention, and management solutions is a plus.
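The versioned-dataset aggregation called out above can be illustrated with a short hedged sketch in Python. The `DatasetVersion` structure, the deduplicate-by-`id` merge rule, and the content-hash versioning scheme are assumptions made for this example, not a description of any specific tool or of the team's actual design.

```python
import hashlib
import json
from dataclasses import dataclass

@dataclass(frozen=True)
class DatasetVersion:
    version: int
    content_hash: str  # deterministic hash of the aggregated records
    records: tuple     # immutable snapshot of the aggregated records

def aggregate(user_created: list, imported: list) -> list:
    """Merge records from two sources, deduplicating by 'id'."""
    merged = {}
    for record in user_created + imported:
        merged[record["id"]] = record  # later sources win on conflict
    return sorted(merged.values(), key=lambda r: r["id"])

def new_version(prev_version: int, records: list) -> DatasetVersion:
    # Canonical JSON makes the hash stable across equivalent inputs.
    canonical = json.dumps(records, sort_keys=True).encode()
    return DatasetVersion(
        version=prev_version + 1,
        content_hash=hashlib.sha256(canonical).hexdigest(),
        records=tuple(json.dumps(r, sort_keys=True) for r in records),
    )

users = [{"id": 1, "name": "alpha"}]
bulk = [{"id": 1, "name": "alpha-updated"}, {"id": 2, "name": "beta"}]
v1 = new_version(0, aggregate(users, bulk))
print(v1.version, len(v1.records))  # 1 2
```

Hashing a canonical serialization gives each dataset version a reproducible identity, which is one simple foundation for the provenance and lineage work the requirements mention.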