Data Engineer collaborating on data pipelines for an analytics consulting firm, enhancing data flow and supporting decision-making. Work with Agile teams in a hybrid environment.
Responsibilities
Collaborate with and across Agile teams to design, develop, test, implement, and support technical solutions using full-stack development tools and technologies
Work with a team of developers with deep experience in machine learning, distributed microservices, and full stack systems
Utilize programming languages such as Java, Scala, and Python; open-source RDBMS and NoSQL databases; and cloud-based data warehousing services such as Redshift and Snowflake
Share your passion for staying on top of tech trends, experimenting with and learning new technologies, participating in internal & external technology communities, and mentoring other members of the engineering community
Collaborate with digital product managers and deliver robust cloud-based solutions that drive powerful experiences to help millions of Americans achieve financial empowerment
Write unit tests and conduct reviews with other team members to ensure your code is rigorously designed, elegantly coded, and effectively tuned for performance
Optimize data pipelines for performance, reliability, and cost-effectiveness, leveraging AWS best practices and cloud-native technologies (a minimal sketch follows this list)
Collaborate with data scientists, analysts, and business stakeholders to understand data requirements and deliver scalable data solutions
Apply strong problem-solving skills and attention to detail
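To make the pipeline-optimization bullet above concrete, here is a minimal PySpark sketch of one common AWS cost-and-performance practice: converting raw JSON events into date-partitioned Parquet so downstream queries scan less data. The bucket paths and the event_timestamp column are hypothetical placeholders, not part of this role's actual codebase.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("events-to-parquet").getOrCreate()

# Hypothetical raw landing zone; reading from S3 assumes the hadoop-aws
# connector is on the classpath.
raw = spark.read.json("s3://example-raw-bucket/events/")

# Derive a date column so consumers can prune partitions by day.
events = raw.withColumn("event_date", F.to_date("event_timestamp"))

(events
    .repartition("event_date")      # group rows by partition value
    .write
    .mode("overwrite")
    .partitionBy("event_date")      # Hive-style layout enables pruning
    .parquet("s3://example-curated-bucket/events/"))
```

Columnar Parquet plus partition pruning is a standard lever for cutting both warehouse scan costs (e.g., Redshift Spectrum) and Spark job runtimes.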
Requirements
Preferred Qualifications:
8+ years of experience building and deploying large-scale data processing pipelines in a production environment
Hands-on experience in designing and building data pipelines
4+ years of experience with a public cloud (AWS, Microsoft Azure, Google Cloud)
4+ years of experience with distributed data/computing tools (MapReduce, Hadoop, Hive, EMR, Kafka, Spark, Gurobi, or MySQL)
4+ years of experience working on real-time data and streaming applications (see the streaming sketch after this list)
4+ years of experience with NoSQL implementations (MongoDB, Cassandra)
4+ years of data warehousing experience (Redshift or Snowflake)
4+ years of experience with UNIX/Linux including basic commands and shell scripting
2+ years of experience with Agile engineering practices
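As a companion to the real-time streaming bullet above, here is a minimal PySpark Structured Streaming sketch that tails a Kafka topic. The broker address and the "orders" topic are assumptions for illustration, and running it requires the spark-sql-kafka connector package.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("orders-stream").getOrCreate()

# Subscribe to a hypothetical "orders" topic on an assumed local broker.
stream = (spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "localhost:9092")
    .option("subscribe", "orders")
    .load())

# Kafka delivers keys and values as bytes; cast before parsing downstream.
orders = stream.selectExpr("CAST(value AS STRING) AS json_payload")

# Console sink keeps the sketch self-contained; a production job would
# write to a durable sink with checkpointing enabled.
query = (orders.writeStream
    .format("console")
    .outputMode("append")
    .start())
query.awaitTermination()
```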
Benefits
Significant career development opportunities exist as the company grows. The position offers the chance to be part of a small, challenging, and entrepreneurial environment with a high degree of individual responsibility.
***Tiger Analytics provides equal employment opportunities to applicants and employees without regard to race, color, religion, age, sex, sexual orientation, gender identity/expression, pregnancy, national origin, ancestry, marital status, protected veteran status, disability status, or any other basis as protected by federal, state, or local law.***