Senior Data Engineering Consultant at Gradion transforming data infrastructure for global clients. Modernizing legacy systems and operationalizing AI/ML solutions alongside cloud and AI consultants.
Responsibilities
Advise senior client stakeholders on modern data architecture, cloud migration strategies, and the secure, compliant use of data for business value.
Define clear roadmaps for clients to transition from legacy data warehouses to scalable cloud-native data platforms (e.g., Data Lakes, Lakehouses).
Design data pipelines and structures that enable clients to monetize data assets and derive actionable insights.
Conduct data maturity assessments and define target-state architectures and roadmaps.
Communicate complex data and AI topics in clear business language to executives and stakeholders.
Lead the design and implementation of robust, scalable, and cost-efficient data infrastructure (data lakehouse, data mesh, or centralized warehouse) on major hyperscaler platforms (AWS, Azure, GCP).
Develop and optimize high-throughput data pipelines using modern ELT/ETL tools (e.g., Spark, Flink, Kafka) to handle large data volumes and integrate disparate data sources.
Build, deploy, and manage production-ready ML pipelines (MLOps), including feature stores, model registries, training workflows, serving, and monitoring (e.g., MLflow, Vertex AI, SageMaker, Azure ML).
Explore and implement the data engineering infrastructure required to develop and deploy Small Language Models and other applied AI solutions within client environments for specific use cases.
Establish data governance, lineage, and compliance controls to ensure trustworthy AI and regulatory readiness.
Contribute to Gradion’s internal frameworks for data platform modernization and AI readiness.
Define and implement data governance frameworks aligned with ISO, FINMA, GDPR, or MedTech requirements.
Embed data security, masking, and access control into pipelines and platform layers.
Help clients design policy-as-code and automated compliance guardrails for data and AI systems.
Conduct technical and architectural assessments (data platform health checks) to identify bottlenecks, security gaps, and cost inefficiencies.
Requirements
7+ years of experience in data engineering, data architecture, or ML platform engineering roles.
Strong background in ETL/ELT, data lake/warehouse architecture, and distributed data processing.
Hands-on experience with one or more cloud data ecosystems (AWS, Azure, GCP, Snowflake, Databricks, BigQuery, Synapse, etc.).
Proficiency in Python and SQL; experience with modern frameworks such as Spark, Airflow, dbt, Kafka.
Familiarity with containerization and orchestration (Docker, Kubernetes).
Experience designing and implementing MLOps principles and tooling (e.g., Kubeflow, MLflow, SageMaker, Azure ML), or integrating data pipelines with AI/ML workloads.
Understanding of data security, compliance, and governance frameworks (ISO, GDPR, SOC2).
Consulting mindset, able to translate technical depth into client value and communicate clearly with business stakeholders.
Excellent communication, presentation, and stakeholder management skills, comfortable working with both technical teams and C-level executives (English proficiency, German a plus).
Desired
Experience with data monetization, data products, or real-time analytics.
Familiarity with LLM/SLM architectures, vector databases, or retrieval-augmented generation (RAG) patterns.
Experience with Databricks, Snowflake, or other modern data warehousing/lakehouse platforms.
Familiarity with distributed processing frameworks (e.g., Spark).
A Master's degree in Computer Science, Data Science, or a related quantitative field.
Experience in regulated industries (finance, healthcare, MedTech) or with cross-border data environments (EU/US/APAC).
Certifications such as AWS Data Analytics, Azure Data Engineer, or GCP Professional Data Engineer are a plus.
Benefits
A laptop is provided
Community Tech activities
A fun & dynamic environment with freedom to be creative
EU Commercial Data Engineer developing scalable data solutions for Genmab’s commercial teams. Collaborating with cross-functional teams to enhance business insights and decision-making through reliable data.
Principal Data Engineer designing and developing innovative data analytical solutions for the gaming industry. Leading and mentoring while engaging with clients to fulfill their data engineering needs.
Specialist, Data Engineering at CoverMyMeds enhancing and expanding data platforms for commercial data products. Collaborating with multiple teams to design scalable data solutions from various sources.
Team Lead in Data Engineering at Avanquest mentoring the data engineering team and ensuring efficient data management across platforms. Collaborating with departments to align solutions and optimize workflows.
Data Architect at RSM leading AI-driven data migration initiatives within the Salesforce ecosystem. Implementing data governance and optimizing performance across complex datasets.
Senior Data Engineer at Capgemini designing and optimizing scalable data architectures on Databricks and GCP. Collaborating across teams to transform business needs into reliable technical solutions.
Data Engineer transforming legacy on-premises systems to cloud-native architectures for advanced data analytics. Collaborating with teams to build efficient data solutions using Python and AWS.
Data Engineering Academy focused on Snowflake and Databricks for professionals interested in expanding their technical capabilities. Fully remote with future office work in Monterrey or Saltillo after completion.
Senior Data Engineer at Intent HQ designing and scaling data platforms. Building high-impact intelligence from millions of customer insights with a focus on performance and reliability.