London Stock Exchange Group developed a client services assistant application using Amazon Q Business to enhance their post-trade customer support. The solution leverages RAG techniques to provide accurate and quick responses to complex member queries by accessing internal documents and public rulebooks. The system includes a robust validation process that uses Anthropic's Claude v2 model on Amazon Bedrock to check response accuracy against a golden answer dataset. The assistant delivers responses within seconds, improving both customer experience and staff productivity.
The London Stock Exchange Group (LSEG) case study presents a comprehensive example of implementing LLMs in a highly regulated financial environment, specifically focusing on enhancing post-trade client services through their London Clearing House (LCH) division. This case study is particularly noteworthy as it demonstrates how a major financial institution approached GenAI deployment with careful consideration for accuracy, reliability, and security.
### Business Context and Challenge
LCH, as part of LSEG's Markets division, provides critical clearing house services across multiple asset classes, including OTC and listed interest rates, fixed income, foreign exchange, credit default swaps, equities, and commodities. Its client services team faced challenges in efficiently handling a broad range of complex technical queries from members about LCH's services and policies. The traditional approach relied on static FAQs and an in-house knowledge center, which was not well suited to the dynamic nature of customer inquiries.
### Solution Architecture and Implementation
The solution architecture demonstrates a well-thought-out approach to implementing LLMs in production:
* **Technology Selection Process**: The team conducted a thorough evaluation of different LLM approaches, including prompt engineering, RAG, and custom model fine-tuning. They chose Amazon Q Business primarily due to its built-in enterprise search capabilities and ease of deployment, prioritizing practical implementation over complex custom solutions.
* **Data Integration Strategy**: The system incorporates multiple data sources (a connector-registration sketch follows this list):
* Internal knowledge repositories and FAQs stored in S3 buckets
* Public-facing website content accessed via web crawler
* CRM software integration
* Custom knowledge base for validation
* **Security and Authentication**: The implementation includes robust security measures:
* SAML 2.0 IAM federation for secure access
* Integration with third-party identity providers
* Proper IAM role configuration for AWS service access
* **Custom Frontend Development**: Instead of using the default Amazon Q Business interface, the team developed a custom UI:
* Hosted on Amazon ECS
* Interfaces with API Gateway
* Uses Lambda functions for business logic and authorization
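As a rough illustration of the data integration step above, the sketch below registers an S3 connector and a web crawler connector against an Amazon Q Business application with boto3. The application and index IDs, role ARN, bucket name, seed URL, and sync schedule are placeholders, and the `configuration` payloads are deliberately simplified: the authoritative connector schemas are defined in the AWS documentation, not in the case study.

```python
"""Hedged sketch: registering Amazon Q Business data sources with boto3.
IDs, ARNs, names, and connector configuration payloads are illustrative."""
import boto3

qbusiness = boto3.client("qbusiness")

APP_ID = "example-q-business-app-id"    # placeholder
INDEX_ID = "example-index-id"           # placeholder
DATA_SOURCE_ROLE = "arn:aws:iam::123456789012:role/ExampleQBusinessDataSourceRole"  # placeholder

# S3 connector for internal knowledge repositories and FAQs.
qbusiness.create_data_source(
    applicationId=APP_ID,
    indexId=INDEX_ID,
    displayName="internal-knowledge-s3",
    roleArn=DATA_SOURCE_ROLE,
    syncSchedule="cron(0 2 * * ? *)",  # nightly sync, illustrative
    configuration={
        # Simplified stand-in for the S3 connector schema.
        "type": "S3",
        "connectionConfiguration": {
            "repositoryEndpointMetadata": {"BucketName": "example-internal-knowledge-bucket"}
        },
    },
)

# Web crawler connector for public-facing rulebook pages.
qbusiness.create_data_source(
    applicationId=APP_ID,
    indexId=INDEX_ID,
    displayName="public-rulebooks-webcrawler",
    roleArn=DATA_SOURCE_ROLE,
    configuration={
        # Simplified stand-in for the web crawler connector schema.
        "type": "WEBCRAWLERV2",
        "connectionConfiguration": {
            "repositoryEndpointMetadata": {
                "seedUrlConnections": [{"seedUrl": "https://www.lseg.com/en/post-trade"}]
            }
        },
    },
)
```

Once registered, each connector syncs into the same Amazon Q Business index, so the assistant can retrieve across internal and public content through a single query path.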
### Quality Assurance and Validation
A particularly impressive aspect of this implementation is the rigorous validation system (a validator sketch follows this list):
* Created a "golden answer" knowledge base with 100 verified Q&A pairs
* Implemented a validator Lambda function that leverages Anthropic's Claude v2 model through Amazon Bedrock
* Built automated comparison mechanisms to verify response accuracy
* Maintained a DynamoDB table for tracking and storing validation results
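The sketch below shows one plausible shape for such a validator Lambda: it asks Claude v2 on Amazon Bedrock whether a generated answer is consistent with the corresponding golden answer and records the verdict in DynamoDB. The table name, prompt wording, and PASS/FAIL convention are assumptions for illustration, not details published in the case study.

```python
"""Hedged sketch of a validator: compares an Amazon Q Business answer against a
curated golden answer using Claude v2 on Amazon Bedrock, then stores the result."""
import json
import boto3

bedrock = boto3.client("bedrock-runtime")
dynamodb = boto3.resource("dynamodb")
results_table = dynamodb.Table("example-validation-results")  # placeholder table name


def validate_answer(question: str, generated_answer: str, golden_answer: str) -> dict:
    # Claude v2 on Bedrock uses the legacy Human/Assistant text-completion format.
    prompt = (
        "\n\nHuman: You are checking a clearing-house support assistant."
        f"\nQuestion: {question}"
        f"\nGolden answer: {golden_answer}"
        f"\nGenerated answer: {generated_answer}"
        "\nReply with PASS if the generated answer is factually consistent with the "
        "golden answer, otherwise reply with FAIL and a short reason."
        "\n\nAssistant:"
    )
    response = bedrock.invoke_model(
        modelId="anthropic.claude-v2",
        body=json.dumps({
            "prompt": prompt,
            "max_tokens_to_sample": 300,
            "temperature": 0,
        }),
    )
    verdict = json.loads(response["body"].read())["completion"].strip()

    # Persist the comparison so accuracy can be tracked over time.
    item = {
        "question": question,
        "generated_answer": generated_answer,
        "verdict": verdict,
        "passed": verdict.startswith("PASS"),
    }
    results_table.put_item(Item=item)
    return item
```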
### Production Deployment Strategy
The team adopted a careful, phased approach to deployment:
* Started with thorough testing of response accuracy and performance
* Validated both structured and unstructured data handling
* Implemented response time monitoring, achieving consistently fast responses within a few seconds (a metric-emission sketch follows this list)
* Planned for gradual expansion to additional use cases and functions
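A minimal way to implement the response-time monitoring mentioned above is to publish a custom latency metric around each chat call; the sketch below assumes CloudWatch, with an illustrative namespace and metric name that do not come from the case study.

```python
"""Minimal sketch of response-time monitoring via a custom CloudWatch metric."""
import time
import boto3

cloudwatch = boto3.client("cloudwatch")


def timed_chat(call_q_business, **kwargs):
    """Wrap a chat call and publish its latency in milliseconds."""
    start = time.perf_counter()
    response = call_q_business(**kwargs)
    elapsed_ms = (time.perf_counter() - start) * 1000.0

    cloudwatch.put_metric_data(
        Namespace="ExampleClientServicesAssistant",  # illustrative namespace
        MetricData=[{
            "MetricName": "ChatResponseLatency",
            "Unit": "Milliseconds",
            "Value": elapsed_ms,
        }],
    )
    return response
```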
### Technical Architecture Details
The system operates through a series of well-defined components:
* **Frontend Layer**: Custom web interface built for internal client services team
* **API Layer**: REST API through API Gateway managing client interactions
* **Processing Layer**: Multiple Lambda functions handling:
* Authorization and authentication
* Chat synchronization with Amazon Q Business (a handler sketch follows this list)
* Response validation
* **Storage Layer**: Combination of:
* S3 for document storage
* DynamoDB for operational data
* Indexed knowledge base for quick retrieval
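To make the chat path concrete, here is an illustrative Lambda handler behind API Gateway that forwards a member query to the Amazon Q Business ChatSync API and returns the answer with its source attributions to the custom frontend. The environment variable, event shape, and response format are assumptions; depending on how identity federation is configured, the call may also need to carry user identity information.

```python
"""Illustrative Lambda handler for the chat path: API Gateway -> Lambda -> ChatSync."""
import json
import os
import boto3

qbusiness = boto3.client("qbusiness")
APP_ID = os.environ.get("Q_BUSINESS_APP_ID", "example-q-business-app-id")  # placeholder


def lambda_handler(event, context):
    body = json.loads(event.get("body") or "{}")

    # Reusing the previous conversationId preserves multi-turn context.
    extra = {"conversationId": body["conversationId"]} if body.get("conversationId") else {}
    response = qbusiness.chat_sync(
        applicationId=APP_ID,
        userMessage=body["message"],
        **extra,
    )

    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({
            "answer": response.get("systemMessage"),
            "conversationId": response.get("conversationId"),
            # Surface citations so the UI can show where each answer came from.
            "sources": [
                attribution.get("title")
                for attribution in response.get("sourceAttributions", [])
            ],
        }),
    }
```

Returning the source attributions alongside the answer supports the attribution and citation requirement called out in the key learnings below.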
### Results and Impact
The implementation has shown significant positive outcomes:
* Reduced response time to seconds for complex queries
* Improved accuracy through the validation system
* Enhanced customer experience through consistent and quick responses
* Increased staff productivity by automating information retrieval
### Future Roadmap
LSEG has planned several enhancements:
* Integration with existing email systems
* Deeper CRM integration
* Expansion to additional use cases within LSEG
* Continuous improvement of the validation system
### Key Learning Points
The case study highlights several important aspects of successful LLMOps implementation:
* The importance of selecting appropriate tools based on practical requirements rather than technical complexity
* The value of robust validation systems in regulated environments
* The benefits of a phased deployment approach
* The need for clear attribution and source citations in AI-generated responses
* The importance of balancing automation with human oversight
This implementation stands out for its practical approach to solving real business problems while maintaining high standards for accuracy and reliability in a regulated financial environment. It demonstrates how modern LLM technologies can be effectively deployed in production while meeting strict enterprise requirements for security, accuracy, and performance.