Delivery Lead managing data integration initiatives for analytical solutions at EXL. Leading teams and ensuring high-quality project delivery with a focus on client outcomes.
Responsibilities
Own end-to-end delivery of data management features, ensuring on-time and high-quality output.
Break down large initiatives into milestones, sprints, and actionable tasks.
Coordinate work across engineering, data science, product, and infrastructure teams.
Monitor progress, manage risks, and adjust plans when requirements evolve.
Ensure architectural decisions align with data strategy, scalability, and reliability goals.
Provide direction on data modeling, data pipelines (ETL/ELT), data governance and quality frameworks, metadata management, data lineage tools, and API and platform integration.
Build, lead, and mentor a team of data engineers, software engineers, and data analysts.
Facilitate career development, performance reviews, and skill-building.
Create a culture of accountability, ownership, and continuous improvement.
Resolve conflicts, remove blockers, and keep the team aligned on priorities.
Work with product managers to define clear requirements and success metrics.
Collaborate with business stakeholders on data needs, SLAs, and integration priorities.
Communicate progress to executives and provide roadmap visibility.
Manage expectations around delivery timelines, scope, and trade-offs.
Design, develop, test, and deploy data integration processes (batch or real-time) using tools such as Data Factory, Databricks, Matillion, Airflow, Sqoop, etc. (an illustrative orchestration sketch follows this list).
Create functional & technical documentation – e.g. ETL architecture documentation, unit testing plans and results, data integration specifications, data testing plans, etc.
Provide a consultative approach with business users, asking questions to understand the business need and deriving the data flow, conceptual, logical, and physical data models based on those needs. Perform data analysis to validate data models and to confirm ability to meet business needs.
May serve as project or data integration (DI) lead, overseeing multiple consultants from various competencies.
Stay current with emerging and changing technologies to recommend and implement beneficial technologies and approaches for data integration.
Ensure proper execution and creation of methodology, training, templates, resource plans, and engagement review processes.
Coach team members to ensure understanding of projects and tasks, providing effective feedback (critical and positive) and promoting growth opportunities when appropriate.
Coordinate and consult with the project manager, client business staff, client technical staff, and project developers on data architecture best practices and other data-related matters at the project or business unit level.
Architect, design, develop, and set direction for enterprise self-service analytic solutions, business intelligence reports, visualizations, and best-practice standards. Toolsets include, but are not limited to, SQL Server Analysis and Reporting Services, Microsoft Power BI, Tableau, and Qlik.
Work with the reporting team to identify, design, and implement a reporting user experience that is consistent and intuitive across environments and report methods, defines security, and meets usability and scalability best practices.
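For illustration only, a batch data integration process of the kind referenced above could be orchestrated as a small Airflow DAG along the following lines; the DAG name, task names, schedule, and placeholder logic are assumptions for this sketch, not part of any actual EXL codebase.

# Minimal sketch of a batch ETL job expressed as an Airflow DAG.
# All names and logic here are hypothetical; the schedule parameter name assumes Airflow 2.4+.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_orders(**context):
    # Pull the day's records from a source system (placeholder logic).
    print(f"extracting orders for {context['ds']}")


def transform_orders(**context):
    # Apply cleansing and business rules before loading (placeholder logic).
    print(f"transforming orders for {context['ds']}")


def load_orders(**context):
    # Write curated output to the warehouse or data lake (placeholder logic).
    print(f"loading orders for {context['ds']}")


with DAG(
    dag_id="orders_batch_etl",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract", python_callable=extract_orders)
    transform = PythonOperator(task_id="transform", python_callable=transform_orders)
    load = PythonOperator(task_id="load", python_callable=load_orders)

    extract >> transform >> load

The same extract-transform-load flow could equally be expressed in Data Factory, Matillion, or another orchestrator named above; the DAG form is used here only because it is compact.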
Requirements
10-14 years of industry implementation experience with data integration tools such as Databricks, Azure Data Factory, etc.
Design, implement, and maintain data pipelines for data ingestion, processing, and transformation in Azure.
Must have at least 5 years of experience managing a technical team and leading it to deliver large-scale solutions.
Create and maintain data storage solutions including Azure SQL Database, Azure Data Lake, and Azure Blob Storage.
Bachelor's degree or equivalent experience; Master's degree preferred.
Strong background in data warehousing, OLTP systems, data integration, and SDLC.
Strong experience with big data frameworks and working experience in Spark, Hadoop, or Hive (including derivatives such as PySpark (preferred), Spark with Scala, or Spark SQL) or similar, along with experience using libraries/frameworks to accelerate code development (see the PySpark sketch following this list).
Experience using major data modeling tools (examples: ERwin, ER/Studio, PowerDesigner, etc.)
Must have excellent communication skills in English to interface with US clients directly, with proficiency in all modes of communication: listening, reading, writing, and speaking.
Experience with major database platforms (e.g., SQL Server, Oracle, Azure Data Lake, Hadoop, Azure Synapse/SQL Data Warehouse, Snowflake, Redshift, etc.)
Strong experience in orchestration and working experience with Data Factory, HDInsight, Data Pipeline, Cloud Composer, or similar.
Understanding and experience with major Data Architecture philosophies (Dimensional, ODS, Data Vault, etc.)
Understanding of modern data warehouse capabilities and technologies such as real-time processing, cloud platforms, and big data.
Understanding of and experience with on-premises and cloud infrastructure architectures (e.g., Azure, AWS, GCP).
Strong experience with Agile processes (Scrum cadences, roles, deliverables) and working experience with Azure DevOps, JIRA, or similar, along with experience in CI/CD using one or more code management platforms.
3-5 years of development experience in decision support / business intelligence environments utilizing tools such as SQL Server Analysis and Reporting Services, Microsoft Power BI, Tableau, Looker, etc.
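As a rough illustration of the Spark experience described above, a minimal PySpark ingest-and-cleanse step might look like the sketch below; the paths, column names, and cleansing rules are hypothetical assumptions, not requirements of this role.

# Minimal PySpark sketch of a batch ingest-and-transform step.
# File paths, schema, and business rules are illustrative only.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_curation").getOrCreate()

# Read raw landing-zone data (e.g., files dropped by an ingestion pipeline).
raw = spark.read.option("header", "true").csv("/landing/orders/")

# Basic deduplication, typing, and filtering before publishing to a curated zone.
curated = (
    raw.dropDuplicates(["order_id"])
    .withColumn("order_date", F.to_date("order_date"))
    .withColumn("amount", F.col("amount").cast("double"))
    .filter(F.col("amount") > 0)
)

# Write the curated output partitioned by date for downstream analytics.
curated.write.mode("overwrite").partitionBy("order_date").parquet("/curated/orders/")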