Senior Data Engineer focusing on building intelligent data platforms for IoT analytics at Acuity Brands. Collaborating with cross-functional teams to enhance AI/ML applications and optimize operations.
Responsibilities
Play a critical role in architecting and developing pipelines that transform raw IoT data into actionable intelligence
Build robust data models that ensure data quality and consistency, enabling downstream analytics and AI/ML applications that optimize operations, enhance user experiences, and unlock new business opportunities
Design and implement scalable data engineering pipelines for ingesting, transforming, and storing IoT telemetry data using Apache Flink, Apache Spark, and Databricks
Build and maintain time-series data solutions using PostgreSQL and TimescaleDB to support high-resolution telemetry analytics
Integrate data governance frameworks for metadata management, lineage tracking, and compliance within a data lake ecosystem
Apply performance tuning techniques in Databricks to optimize batch processing speed and resource utilization
Design and implement Infrastructure as Code (IaC) solutions using Terraform and Azure Bicep to provision and manage cloud resources across multi-cloud environments
Optimize and monitor pipeline performance and resource utilization across distributed environments such as Kubernetes and Databricks clusters
Leverage telemetry and diverse data sources to design, test, and deploy AI/ML solutions, collaborating with data scientists and engineers to build deep learning capabilities within the platform for prediction, early alerting, and prescriptive recommendations
Define appropriate business metrics to measure the impact of AI/ML solutions within the platform
Write scalable, distributed, and highly efficient code in languages and frameworks such as Python, Java, PySpark, Scala, and R
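As a rough illustration of the kind of transformation the telemetry pipelines above perform, the sketch below validates raw IoT readings and averages them over fixed tumbling windows. It is a minimal, stdlib-only Python sketch, not Acuity Brands' actual pipeline; the record fields (`ts`, `device_id`, `lux`) and the 5-minute window size are hypothetical choices for the example.

```python
import json
from collections import defaultdict
from statistics import mean

WINDOW_SECONDS = 300  # hypothetical 5-minute tumbling window


def parse_reading(raw: str):
    """Parse one raw JSON telemetry record; return None if malformed."""
    try:
        rec = json.loads(raw)
        return int(rec["ts"]), str(rec["device_id"]), float(rec["lux"])
    except (ValueError, KeyError, TypeError):
        return None  # drop malformed records to protect data quality


def window_averages(raw_records):
    """Group valid readings per device into tumbling windows and average them."""
    buckets = defaultdict(list)
    for raw in raw_records:
        parsed = parse_reading(raw)
        if parsed is None:
            continue
        ts, device, lux = parsed
        window_start = ts - (ts % WINDOW_SECONDS)  # align to window boundary
        buckets[(device, window_start)].append(lux)
    return {key: mean(values) for key, values in buckets.items()}


records = [
    '{"ts": 1700000000, "device_id": "fixture-1", "lux": 410.0}',
    '{"ts": 1700000050, "device_id": "fixture-1", "lux": 430.0}',
    "not-json",  # malformed record, silently dropped
]
print(window_averages(records))  # one window with the two readings averaged
```

In a production setting the same validate-then-window shape would typically be expressed in Spark Structured Streaming or Flink, with the aggregates landing in a TimescaleDB hypertable for high-resolution analytics.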
Requirements
A BS in Computer Science, Statistics, Mathematics, or a related field
5+ years of experience in building end-to-end analytics solutions, including designing and implementing real-time and batch data pipelines for IoT telemetry or time-series data
Excellent problem-solving, critical thinking, and communication skills
Demonstrated initiative to find solutions to complex problems at scale and operationalize them
Demonstrated ability to work in ambiguous situations and across organizational boundaries
Lead with respect, accountability, integrity, and a positive can-do attitude
Experience working in a data-intensive environment and translating business needs into data requirements
2+ years of experience in building and operationalizing analytics pipelines and services, adopting the container ecosystem of Docker and Kubernetes
2+ years of demonstrated AI/ML pipeline development with relevant code experience
2+ years of experience using one or more of the following: TensorFlow, MLflow, PyTorch, SparkML, etc.
2+ years of experience using one or more of the following frameworks: Apache Flink, Apache Spark, Spark Structured Streaming, Akka, Kafka, etc.
2+ years of experience building scalable batch data pipelines on cloud-based platforms, such as Databricks
Benefits
Generous benefits including health care
Dental coverage
Vision plans
401K benefits
Commissions/incentive compensation depending on the role
Data Engineer designing and maintaining data pipelines at Black Semiconductor. Collaborating with process, equipment, and IT teams to support manufacturing analytics and decision-making.
Junior Data Engineer role focusing on Business Intelligence and Big Data at Avanade. Collaborating on data analysis and SQL queries in a supportive learning environment.
GCP Data Engineer designing and developing data processing modules for Ki, an algorithmic insurance carrier. Working closely with multiple teams to optimize data pipelines and reporting.
Data Engineer at Securian Financial optimizing scalable data pipelines for AI and advanced analytics. Collaborating with teams to deliver secure and accessible data solutions.
IT Data Engineering Co-Op at BlueRock Therapeutics supporting development of scientific data systems. Collaborating on data workflows and foundational AWS data engineering tasks.
Data Engineer I building and operationalizing complex data solutions for Travelers' analytics using Databricks. Collaborating within teams to educate end users and support data governance.
Data Engineer shaping modern data architecture to drive golf’s digital transformation. Collaborating with teams to enhance data pipelines and insights for customer engagement and revenue growth.
Staff Data Engineer overseeing complex data systems for CITY Furniture. Responsible for architecting and optimizing data ecosystems in a hybrid work environment.
Data Engineer strengthening the data platform team at Samba TV to improve data analytics and reporting capabilities. Building on AWS, Databricks, BigQuery, and Snowflake technologies.
Data Engineer focusing on secure ETL/ELT data pipelines and compliance in healthcare. Designing scalable ingestion frameworks and ensuring alignment with federal standards.