Enabling Quick Access to Corporate Knowledge to Minimize Dependency on Support Specialists

In this case study, we explore how the company leveraged Azure services and the innovative 'ChatGPT on Your Data' approach, incorporating Retrieval-Augmented Generation (RAG), to effectively manage and utilize a large collection of documents. This solution significantly streamlined information retrieval, reducing dependency on support specialists and enhancing their efficiency in addressing requests.
Organization (under NDA):
The company is renowned for delivering exceptional software solutions, catering to organizations of diverse sizes. Its expertise lies in facilitating the deployment, management, and cost-efficiency of major cloud-based technologies.
The company manages an extensive collection of documents in various file formats, spread across multiple internal and external systems.
The aim is to let users find information on their own more quickly, reducing their need for help from support staff while also helping those staff handle the remaining requests more efficiently.
Environment and Requirements
The company operates within a technologically advanced environment, characterized by a comprehensive integration of their data and information systems on the Azure platform.
Key aspects of this environment include:
Data Storage. All the client's data, along with user queries, are securely housed within their Azure tenant. This setup ensures not only the safety and confidentiality of the data but also allows for seamless integration and management within the Azure ecosystem.
Data Synchronization and Format Support. The client's environment supports real-time synchronization of a variety of data sources, such as SharePoint and Zendesk, with Azure Storage. It is capable of handling an array of document formats, including PDF, Microsoft Office formats, HTML, and Markdown, thereby accommodating a wide spectrum of documentation needs.
Document Processing Capabilities. A sophisticated document processing logic is integral to the client's setup. It adeptly manages tables and images in documents, preserving the format and data structure, which is essential for maintaining information accuracy and usability.
Analytics and Request Tracking. The system includes a feature to track user queries, which can be employed for detailed analytics and review. This function provides critical insights into user interaction and information requirements.
Potential Integration with Copilot for Microsoft 365. The client's environment is primed for potential integration with Copilot for Microsoft 365, using Microsoft Graph Data Connect and Microsoft Teams message extension. This integration would further enhance the system’s capabilities within the Microsoft ecosystem.
Utilization of Azure OpenAI Models. The Azure OpenAI GPT-3.5 and GPT-4 Turbo models are a cornerstone of the client’s environment, providing sophisticated language processing and response generation capabilities.
Role-Based Access Control. A robust access control system is in place to govern document retrieval based on user roles and document permissions. This ensures a secure and compliant information access framework.
API for Integration. The environment includes an API for seamless integration with the client's existing services, facilitating a unified and efficient technology ecosystem.
Customizable Profiles. The system features chatbot profiles that can be tailored for different departmental roles, such as IT support and Marketing, ensuring relevant and role-specific information retrieval.
Flexibility for Custom Integrations. Designed for versatility, the client’s setup allows for the integration with custom APIs and structured data sources like databases, accommodating unique data management and retrieval needs.
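The role-based access control described above is typically enforced at retrieval time: each indexed document carries the groups allowed to read it, and every search query is trimmed to the requesting user's groups. The sketch below shows one common way to do this with Azure AI Search, by building an OData filter over a hypothetical `group_ids` collection field (the field name and helper are illustrative, not taken from the client's system):

```python
def build_security_filter(user_groups: list[str]) -> str:
    """Build an OData filter that keeps only documents whose group_ids
    collection overlaps the requesting user's groups.

    search.in(g, '<list>') matches a value against a comma-delimited list;
    an empty group set falls back to a sentinel that matches nothing.
    """
    allowed = ",".join(user_groups) if user_groups else "none"
    return f"group_ids/any(g: search.in(g, '{allowed}'))"
```

The resulting string would be passed as the `filter` argument of a search call (e.g. `SearchClient.search(search_text=query, filter=build_security_filter(groups))` in the Python SDK), so trimming happens inside the search service rather than in application code.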
Solution
Our solution is a comprehensive system designed to revolutionize how company knowledge is accessed and utilized. At its heart are Microsoft Teams and web chatbots, enabling users to pose questions and receive instant answers derived from the company's extensive knowledge base. Utilizing the "ChatGPT on Your Data" approach with Retrieval-Augmented Generation (RAG), our bots efficiently handle large datasets.
Complementing these chatbots, we have implemented a Microsoft Teams Dashboard. This dashboard offers administrators a comprehensive overview of chatbot usage, highlighting key metrics such as frequent inquiries, areas requiring additional attention, and recent updates to the knowledge base.
A crucial component of our solution is the automatic processing of both existing and new documents, seamlessly integrating them into the knowledge base. This ongoing process ensures that the most current and relevant information is always available at the fingertips of users and administrators.
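The core of the "ChatGPT on Your Data" pattern is that the model answers only from retrieved passages rather than from its general training data. A minimal sketch of the prompt-assembly step, with illustrative instructions (the production prompts and function names would differ):

```python
def build_grounded_prompt(question: str, chunks: list[str]) -> list[dict]:
    """Assemble a chat payload that grounds the model in retrieved chunks."""
    # Number each retrieved chunk so the model can cite it as [n].
    sources = "\n\n".join(f"[{i + 1}] {c}" for i, c in enumerate(chunks))
    system = (
        "Answer using ONLY the sources below. Cite sources as [n]. "
        "If the answer is not in the sources, say you do not know.\n\n"
        f"Sources:\n{sources}"
    )
    return [
        {"role": "system", "content": system},
        {"role": "user", "content": question},
    ]
```

The returned message list would then be sent to an Azure OpenAI chat-completions deployment; because the system message instructs the model to decline when the sources are silent, hallucinated answers are far less likely.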
Delving into the actual architecture, our solution leverages multiple components within the Microsoft Teams application:
Microsoft Teams chatbot. Facilitates quick and direct communication for immediate query resolution.
Web-based chatbot. Offers a more customized communication experience, catering to specific user preferences.
Dashboard. A centralized hub for administrators, providing detailed statistics and configuration options.
The AI Orchestration layer plays a pivotal role, identifying user intents and determining the optimal way to process each request, whether it's sourcing information from the knowledge base or elsewhere.
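Conceptually, the orchestration step is a classifier in front of several handlers. The toy router below uses keyword matching purely to illustrate the control flow; in the actual system this decision is made by an LLM-based intent classifier, and the intent names here are invented:

```python
def route_request(user_message: str) -> str:
    """Rough intent router; production would use an LLM classifier instead."""
    words = set(user_message.lower().split())
    if words & {"hi", "hello", "thanks"}:
        return "smalltalk"              # answered directly, no retrieval
    if words & {"stats", "usage", "dashboard"}:
        return "analytics"              # routed to the dashboard backend
    return "knowledge_base_search"      # default: RAG over the document index
```

Keeping routing separate from retrieval means new request types (for example, a custom API lookup) can be added as new branches without touching the search pipeline.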
The 'Searcher' component actively seeks out relevant documents based on user requests, leveraging Azure OpenAI to find the most accurate answers. Simultaneously, the 'Loader' works tirelessly, importing documents from various external sources, preprocessing them for efficiency. These documents are then handed over to the 'Indexer', which processes the content and metadata, creates embeddings using Azure OpenAI Service, segments the documents into manageable chunks, and stores them in Azure AI Search.
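The chunking step performed by the Indexer can be sketched as a simple sliding window with overlap, so that sentences cut at a chunk boundary still appear whole in the neighboring chunk. The sizes below are illustrative defaults, not the client's actual settings; each resulting chunk would then be embedded via the Azure OpenAI embeddings API and uploaded to Azure AI Search:

```python
def chunk_text(text: str, size: int = 500, overlap: int = 100) -> list[str]:
    """Split text into fixed-size character chunks with overlapping edges."""
    if overlap >= size:
        raise ValueError("overlap must be smaller than chunk size")
    chunks, step = [], size - overlap
    for start in range(0, len(text), step):
        chunks.append(text[start:start + size])
        if start + size >= len(text):
            break  # last window already covers the end of the text
    return chunks
```

Character-based windows are the simplest option; a production indexer would more likely split on token counts or semantic boundaries (headings, paragraphs) so that chunks align with the embedding model's context limits.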
Technologies
At the core of our technology stack is Azure OpenAI Service, providing cutting-edge language processing capabilities, complemented by Azure AI Document Intelligence for sophisticated document processing. Azure AI Search plays a critical role in swiftly navigating through extensive data, ensuring that information retrieval is both accurate and rapid. The incorporation of Semantic Kernel and LangChain enhances the system's ability to understand and generate human-like responses, making interactions intuitive and effective.
The back-end infrastructure is built on ASP.NET, known for its robustness and scalability, while the Microsoft Bot Framework forms the backbone of our conversational AI applications. For front-end development, we employ Blazor to create a dynamic and user-friendly Dashboard, and React for building a web-based chatbot, integrating Microsoft Bot Framework components for a seamless user experience.
For a comprehensive overview of our technological capabilities, we invite you to visit our Technologies page.