Calix Cloud Data Engineer involved in architecture design, data ingestion, and analytics for service provider transformation. Collaborating with cross-functional teams in a flexible hybrid work model.
Responsibilities
Work closely with Cloud product owners to understand and analyze product requirements and provide feedback
Develop conceptual, logical, and physical data models and metadata solutions
Design and manage a range of data design deliverables, including data models, data diagrams, data flows, and corresponding data dictionary documentation
Determine database structural requirements by analyzing client operations, applications, and data from existing systems
Provide technical leadership in software design to meet requirements for service stability, reliability, scalability, and security
Guide technical discussions within the engineering group and make technical recommendations
Conduct design and code reviews with peer engineers
Guide testing architecture for large-scale data ingestion and transformation
Serve in a customer-facing engineering role, debugging and resolving field issues
Requirements
10-12 years of software engineering experience delivering quality products
10+ years of development experience in data modeling, master data management, and building ETL/data pipeline implementations
Proficiency in both Google Cloud Platform (GCP) services (BigQuery, Dataflow, Dataproc, PubSub/Kafka, Cloud Storage) and AWS services (Redshift, Glue, Kinesis, S3)
Proven experience in designing, building, and maintaining scalable data pipelines across GCP and AWS
Knowledge of big data processing frameworks such as Apache Spark, Flink, and Beam
Proficient in using dbt/Dataform for data transformation and modeling within the data warehouse environment
Strong knowledge of SQL and at least one programming language (Python, Java, or Scala)
Understanding of Docker and Kubernetes for deploying data applications
Knowledge of data catalog tools (e.g., DataHub, Collibra, Alation) to ingest and maintain metadata
Strong analytical and troubleshooting skills, particularly in complex data scenarios
Ability to work effectively in a team environment and engage with cross-functional teams
Proficient in conveying complex technical concepts to stakeholders
Knowledge of data governance, security best practices, and compliance regulations in both GCP and AWS environments
Bachelor’s degree in Computer Science, Information Technology, or a related field
Relevant certifications (e.g., Google Cloud Professional Data Engineer, AWS Certified Data Analytics – Specialty)
Benefits
Flexible hybrid work model: work from the Bangalore office 20 days per quarter
Data Engineer designing and maintaining data pipelines at Black Semiconductor. Collaborating with process, equipment, and IT teams to support manufacturing analytics and decision-making.
Junior Data Engineer role focusing on Business Intelligence and Big Data at Avanade. Collaborating on data analysis and SQL queries in a supportive learning environment.
GCP Data Engineer designing and developing data processing modules for Ki, an algorithmic insurance carrier. Working closely with multiple teams to optimize data pipelines and reporting.
Data Engineer at Securian Financial optimizing scalable data pipelines for AI and advanced analytics. Collaborating with teams to deliver secure and accessible data solutions.
IT Data Engineering Co‑Op at BlueRock Therapeutics supporting development of scientific data systems. Collaborating on data workflows and foundational AWS data engineering tasks.
Data Engineer I building and operationalizing complex data solutions for Travelers' analytics using Databricks. Collaborating within teams to educate end users and support data governance.
Data Engineer shaping modern data architecture to drive golf’s digital transformation. Collaborating with teams to enhance data pipelines and insights for customer engagement and revenue growth.
Staff Data Engineer overseeing complex data systems for CITY Furniture. Responsible for architecting and optimizing data ecosystems in a hybrid work environment.
Data Engineer strengthening data platform team at Samba TV to improve data analytics and reporting capabilities. Building on AWS, Databricks, BigQuery, and Snowflake technology.
Data Engineer focusing on secure ETL/ELT data pipelines and compliance in healthcare. Designing scalable ingestion frameworks and ensuring alignment with federal standards.