AI Engineer designing and operationalizing AI-driven solutions for Dun & Bradstreet's global Analytics organization. Collaborating closely with data scientists and MLOps engineers to build production-grade systems.
Responsibilities
Develop AI Agents: Design, code, and implement AI agents and copilots using Google's Gemini API and Microsoft Copilot Studio; a minimal Gemini call is sketched after this list.
Integrate Systems: Write robust Python code to connect AI agents with enterprise data sources, internal tools, and third-party services via REST APIs.
Implement RAG Patterns: Build and refine Retrieval-Augmented Generation (RAG) pipelines so agents return accurate, context-aware responses grounded in proprietary data; a simplified pipeline is sketched after this list.
Prompt Engineering: Craft, test, and iterate on effective prompts to guide agent behavior, ensuring reliability, safety, and desired outcomes.
Full Lifecycle Development: Participate in the entire development lifecycle, from initial concept and design through to testing, deployment, and maintenance.
Collaboration: Work closely with senior engineers to overcome technical challenges and with product managers to translate business requirements into functional agent capabilities.
Troubleshooting: Debug and resolve issues in agent performance, whether they stem from the underlying LLM, the data pipeline, or the integration code.
Work with analytics, product, and engineering teams to define and deliver AI solutions.
Participate in architecture reviews and iterative development cycles.
Support knowledge sharing and internal GenAI capability building.
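To make the agent-development responsibility concrete, below is a minimal sketch of calling Google's Gemini API from Python. It assumes the google-generativeai package and a GOOGLE_API_KEY environment variable; the model name and prompt are illustrative, not a prescribed setup.

    import os
    import google.generativeai as genai

    # Authenticate with an API key held outside the codebase.
    genai.configure(api_key=os.environ["GOOGLE_API_KEY"])

    # Model name is illustrative; use whichever Gemini model the team targets.
    model = genai.GenerativeModel("gemini-1.5-flash")

    response = model.generate_content(
        "Summarize last quarter's churn drivers in three bullet points."
    )
    print(response.text)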
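The RAG responsibility is sketched below as a simplified, self-contained pipeline: embed the query, retrieve the closest chunks, and ground the prompt in them. The embed() helper is a toy character-frequency placeholder so the example runs on its own; a real pipeline would call an embedding model and a vector store instead.

    import numpy as np

    def embed(text: str) -> np.ndarray:
        # Toy placeholder embedding; swap in a real embedding model.
        vec = np.zeros(256)
        for ch in text.lower():
            vec[ord(ch) % 256] += 1.0
        norm = np.linalg.norm(vec)
        return vec / norm if norm else vec

    def retrieve(query: str, chunks: list[str], vectors: np.ndarray, k: int = 2) -> list[str]:
        # Rank chunks by cosine similarity to the query embedding (vectors are unit length).
        sims = vectors @ embed(query)
        return [chunks[i] for i in np.argsort(sims)[::-1][:k]]

    def grounded_prompt(query: str, context: list[str]) -> str:
        # Constrain the model to answer from retrieved proprietary data only.
        return ("Answer using only the context below; say so if the answer is not there.\n\n"
                "Context:\n" + "\n---\n".join(context) + f"\n\nQuestion: {query}")

    docs = ["Q3 churn rose 4 percent, driven mainly by pricing changes.",
            "Net revenue retention held steady at 103 percent in Q3."]
    vectors = np.stack([embed(d) for d in docs])
    print(grounded_prompt("What drove churn?", retrieve("What drove churn?", docs, vectors)))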
Requirements
Bachelor's degree in Computer Science, Software Engineering, or a related field.
2-4 years of professional software development experience.
Strong programming proficiency in Python and a solid understanding of object-oriented principles.
At least 1 year of hands-on experience building applications with Large Language Models (LLMs) through professional work, significant personal projects, or open-source contributions.
Solid understanding of core LLM concepts, including prompt engineering, embeddings, and function calling/tool use (a tool-dispatch sketch follows this list).
Experience consuming and interacting with REST APIs.
A proactive, problem-solving mindset and a strong desire to learn and adapt in a fast-evolving field.
Direct experience making API calls to Google's Gemini, OpenAI models, or using Microsoft Copilot Studio/Azure OpenAI Service.
Familiarity with agentic frameworks like LangChain, LlamaIndex, or Microsoft's Semantic Kernel.
Experience with cloud services on GCP (like Vertex AI) or Azure.
Knowledge of vector databases (e.g., Pinecone, Chroma, Weaviate) and how they fit into RAG architectures (a Chroma example follows this list).
Basic understanding of CI/CD pipelines and containerization (Docker).
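As a concrete illustration of function calling/tool use, the sketch below assumes the model emits a JSON object naming a tool and its arguments, which the agent dispatches to a registered Python function. Real SDKs (Gemini, OpenAI) return structured tool-call objects rather than raw JSON strings, but the dispatch pattern is the same; the get_account_risk_score tool is hypothetical.

    import json

    def get_account_risk_score(duns_number: str) -> dict:
        # Hypothetical internal tool standing in for a real enterprise data lookup.
        return {"duns_number": duns_number, "risk_score": 0.42}

    TOOLS = {"get_account_risk_score": get_account_risk_score}

    def dispatch_tool_call(model_output: str) -> dict:
        # Parse the model's tool call and run the matching registered function.
        call = json.loads(model_output)
        return TOOLS[call["name"]](**call["arguments"])

    # What the model might emit when asked about an account's credit risk.
    print(dispatch_tool_call(
        '{"name": "get_account_risk_score", "arguments": {"duns_number": "123456789"}}'
    ))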
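To show where a vector database fits in a RAG architecture, the sketch below indexes a couple of documents in Chroma and retrieves the most relevant one at query time. It assumes the chromadb package with its default embedding model; the collection name and documents are illustrative.

    import chromadb

    client = chromadb.Client()  # in-memory client; persistent clients also exist
    collection = client.create_collection(name="analytics_docs")

    # Index proprietary documents; Chroma embeds them with its default model.
    collection.add(
        ids=["doc1", "doc2"],
        documents=[
            "Q3 churn rose 4 percent, driven mainly by pricing changes.",
            "Net revenue retention held steady at 103 percent in Q3.",
        ],
    )

    # Retrieve the chunks most relevant to the user's question to ground the answer.
    results = collection.query(query_texts=["What drove churn last quarter?"], n_results=1)
    print(results["documents"][0])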