Data Engineer II focusing on data ingestion and architecture for SimplePractice's analytics platform. Collaborating across teams to build scalable data systems and maintain data quality.
Responsibilities
Partner with analysts to build scalable systems that help unlock the value of data from a wide range of sources such as backend databases, event streams, and marketing platforms
Consult with our Product and Engineering teams on the creation of new data in the production environment
Create company-wide alignment through standardized metrics
Promote the importance of dimensional data models for communicating across the organization
Manage the complete data stack from ingestion through data consumption
Connect our teams and their workflows to centralized and secure databases
Build tools to increase transparency in reporting company-wide business outcomes
Define and promote data engineering best practices
Design scalable data solutions leveraging cloud data technologies, preferably in AWS
Help define a data quality and data security framework to measure and monitor data quality across the enterprise
Apply excellent problem-solving and critical-thinking skills to meet complex data challenges and requirements in a fast-paced, rapidly changing environment
Requirements
4+ years of progressive professional experience preferred
Top-notch SQL skills, including statistical/window functions and complex data types
Expert in relational technology, data modeling, and dimensional modeling
Expert in at least two database engines, preferably MySQL, Snowflake, or Postgres
Metadata-driven and database-centric concepts
Database performance
Expert in ETL and ETL tools, including Airflow/Prefect, dbt, Airbyte, and Fivetran
ELT and schema-on-read concepts
Data ingestion tools, such as Kafka, DMS, Singer
At least one programming language, preferably Python
Unix/Linux scripting, such as bash
Experience consuming APIs, e.g., via curl
Experience with achieving performance through parallelism
Experience with cloud-based infrastructure, particularly AWS
Cloud storage, such as S3
Data storage formats, such as Parquet, ORC
Experience with external tables
Unstructured and semi-structured data types, such as JSON
Experience with at least one visualization tool, preferably Looker, Tableau, or Sisense
Excellent communication skills
BS/MS degree in Engineering, Mathematics, Physics, Computer Science or equivalent experience
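As a rough illustration of the window-function fluency the SQL requirement refers to, the sketch below computes a per-user running total with a standard OVER clause. It runs against an in-memory SQLite database (standing in for the MySQL/Snowflake/Postgres engines named above); the table and column names are invented for the example.

```python
import sqlite3

# In-memory database with a toy events table (names are illustrative only).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (user_id TEXT, ts INTEGER, amount REAL)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?, ?)",
    [("a", 1, 10.0), ("a", 2, 20.0), ("b", 1, 5.0), ("b", 2, 7.0), ("b", 3, 9.0)],
)

# Window function: running total of amount per user, ordered by timestamp.
rows = conn.execute(
    """
    SELECT user_id, ts, amount,
           SUM(amount) OVER (PARTITION BY user_id ORDER BY ts) AS running_total
    FROM events
    ORDER BY user_id, ts
    """
).fetchall()

for row in rows:
    print(row)  # e.g. ('a', 2, 20.0, 30.0)
```

The same PARTITION BY / ORDER BY pattern carries over directly to ranking (ROW_NUMBER, RANK) and offset (LAG, LEAD) functions on any of the engines listed.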
Benefits
Private Medical, Dental & Vision Coverage
Work From Home stipend
Flexible Time Off (FTO), wellbeing days, paid holidays, and Summer Fridays
Monthly Meal Reimbursement
Holiday Bonus, 15-day Aguinaldo
Hybrid Work Schedule & Catered Lunch
A relocation bonus for candidates joining us from a different city
Data Engineer at Equinix implementing data architecture solutions for scalability and analytics. Collaborating with teams to design data pipelines and maintain data models for business objectives.
Data Warehouse Architect developing and optimizing robust data warehouse environments on SAP BW/4HANA. Critical for enabling advanced analytics and reporting across the organization.
Data Engineering Manager leading a new Data Engineering team in Bengaluru. Shaping the design and scaling of core data engineering practices across the organization.
Sr. ETL/Data Warehouse Lead at Huntington designing, developing, and supporting ETL and Data Warehousing framework. Analyzing systems based on specifications and providing technical assistance.
Senior Google Data Architect designing and delivering scalable data solutions on Google Cloud Platform. Collaborating across teams to shape target-state data architectures and influence enterprise data strategy.
Data Engineer developing scalable data lake solutions and optimizing data pipelines at U.S. Bank. Collaborating with teams to manage data governance and cloud migration activities.
Lead AI, MLOps & Data Engineer at WedR, guiding complex data projects and AI innovation. Collaborating with diverse experts in a Product Studio for digital transformations.
Lead Azure Databricks Data Engineer implementing data solutions for data engineering projects at Ryan Specialty. Collaborating with stakeholders and mentoring junior staff on data pipelines and ETL processes.
Lead Azure Databricks Data Engineer at Ryan Specialty focused on implementing data solutions and collaborating with cross-functional teams to enhance data architecture.
Senior Data Engineer designing and implementing sustainable data solutions for diverse clients. Collaborating closely with stakeholders to enhance data services and platforms in a hybrid environment.