Releases: awslabs/LISA
v4.0.2
Enhancements
- Revised base configuration to eliminate the default RagRepository declaration. Important: If no RAG repositories are defined, ensure config-custom.yaml contains an empty array declaration (see the sketch after this list).
- Implemented multi-instance deployment support: customers may now deploy more than one LISA environment into a single AWS account.
- Optimized data schema architecture to eliminate redundant reference patterns
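If no RAG repositories are defined, a minimal config-custom.yaml entry might look like the sketch below; the `ragRepositories` key name is an assumption based on these notes, so confirm the exact field name against the LISA configuration schema.

```yaml
# config-custom.yaml (sketch, not the authoritative schema)
# Declare an explicit empty array when no RAG repositories are defined;
# the key name `ragRepositories` is assumed here.
ragRepositories: []
```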
User Interface Improvements
- Enhanced proxy configuration to support HTTP status code propagation for improved error handling
- Introduced configurable markdown viewer toggle for non-standard model outputs
- Implemented redesigned administrative configuration interface
- Enhanced session management:
  - Removed UUID exposure from breadcrumb navigation
  - Transitioned from access-time to last-modified timestamp display
  - Improved session loading indicators for enhanced user feedback
- Integrated document library refresh functionality
- Resolved a critical Redux store corruption issue affecting state management overrides; this fix reduces noticeable latency when fetching data in the UI
Acknowledgements
Full Changelog: https://github.com/awslabs/LISA/compare/v4.0.1..v4.0.2
v4.0.1
Bug Fixes
Vector Store Management
- Enhanced UI to display default repository name when not specified
- Improved UI to show "GLOBAL" when no groups are assigned
- Refined repository schema regex to ensure valid input fields
- Optimized admin routing for RAG repository access
- Updated RAG Configuration table to align with config destruction property
- Resolved issue preventing creation of OpenSearch vector stores
User Interface
- Implemented consistent positioning of chat input at the bottom of the screen
Acknowledgements
Full Changelog: https://github.com/awslabs/LISA/compare/v4.0.0..v4.0.1
v4.0.0
Our 4.0 launch brings enhanced RAG repository management features to LISA’s chatbot user interface (UI). Our new RAG document library allows users to view and manage RAG repository files. Administrators can now manage and configure vector stores (also known as RAG repositories) and document ingestion pipelines directly in the Configuration page without having to redeploy LISA.
Enhancements
RAG Repository Management
- Admins can create, edit, and delete RAG repositories via LISA’s Configuration UI, and can also manage access through the UI. LISA re-deployments are no longer required.
- Admins can create, edit, and delete document ingestion pipelines via LISA’s Configuration UI. LISA re-deployments are no longer required.
- We added a RAG deletion pipeline that automatically removes S3 documents when deleted from RAG repositories.
- We introduced new API endpoints for dynamic management of vector stores and ingestion pipelines.
- Customers who previously configured LISA with RAG repositories (v3.5 and earlier) will be able to view these legacy repositories in the Configuration UI, but they will not be able to make changes through the UI; legacy repositories must still be managed through the config file. When you are ready, we recommend deleting any legacy RAG repositories through the UI and then redeploying CDK, which automatically tears down the legacy repository’s resources. You can then recreate the RAG repositories through the UI and re-load documents.
Document Library
- Added a RAG Document Library page in the chatbot UI. Users can download previously uploaded documents from the RAG repositories that they have access to.
- Users can also delete files that they originally uploaded to RAG repositories through the Document Library. Admins can delete any files through the Document Library. Deleted files are also automatically removed from S3.
Note: As of LISA 4.0, new RAG repositories and document ingestion pipelines can no longer be configured at deployment via YAML.
Security
- Updated third-party dependencies.
Acknowledgements
v3.5.1
Bug Fixes
Chat Session Management
- Resolved URL redirect issue that prevented creation of new chat sessions via the New button
- Resolved intermittent loading issues, caused by the LangChain memory object, when accessing historical conversations
- Addressed error handling for LLM interactions after multiple prompts
Document Summarization
- Fixed stability issues with document summarization functionality in existing chat sessions
UI
- Corrected display scaling issues in Firefox for large screen resolutions
v3.5.0
Key Features
User Interface Modernization
- New year, new me? We are rolling out an updated user interface (UI) in Q1. This release is the first stage of that effort.
Document Summarization
- Building on existing non-RAG in-context capabilities, we added a more comprehensive Document Summarization feature. This includes a dedicated modal interface where users:
  - Upload text-based documents
  - Select from approved summarization models
  - Select and customize summarization prompts
  - Choose between integrating summaries into existing chat sessions or initiating new ones
- System administrators retain full control through configuration settings in the Admin Configuration page
Other UI Enhancements
- Refactored the chatbot UI in advance of this launch and upcoming UI improvements
- Consolidated existing chatbot features to streamline the UI
- Added several components to improve user experience: a copy button and a response generation animation
- Updated Markdown formatting in LLM responses
Other System Enhancements
- Enhanced user data integration with RAG metadata infrastructure, enabling improved file management within vector stores
- Optimized RAG metadata schema to accommodate expanded documentation requirements
- Started updating the SDK to be compliant with current APIs
- Implemented updated corporate brand guidelines
Coming soon
Our development roadmap includes several significant UI/UX enhancements:
- Streamlined vector store file administration and access control
- Integrated ingestion pipeline management
- Enhanced Model Management user interface
Acknowledgements
Full Changelog: v3.4.0...v3.5.0
v3.4.0
Key Features
Vector Store Support
- Implemented support for multiple vector stores of the same type. For example, you can now configure more than one OpenSearch vector store with LISA.
- Introduced granular access control for vector stores based on a list of provided IDP groups. If a list isn’t provided, the vector store is available to all LISA users (see the sketch after this list).
- Expanded APIs for vector store file management to now include file listing and removal capabilities.
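As a rough illustration of configuring multiple stores of the same type with group-based access, a config sketch might look like the following. The field names (`ragRepositories`, `repositoryId`, `type`, `allowedGroups`) and repository IDs are assumptions drawn from these notes rather than the authoritative schema.

```yaml
# Sketch only: two OpenSearch vector stores in one LISA deployment.
# Field names are assumed from these release notes, not the exact schema.
ragRepositories:
  - repositoryId: engineering-docs
    type: opensearch
    allowedGroups:          # visible only to these IDP groups
      - engineering
  - repositoryId: company-wiki
    type: opensearch
    # no allowedGroups list provided: available to all LISA users
```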
Deployment Flexibility
- Enabled custom IAM role overrides; the minimum required permissions are documented on our documentation site
- Introduced partition and domain override functionality (see the sketch after this list)
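For illustration, a deployment-time override could resemble the sketch below. The `partition`, `domain`, and `roles` keys, the role key name, and all values are hypothetical placeholders; consult the LISA documentation for the exact schema and the documented minimum permissions.

```yaml
# Sketch only: partition/domain overrides plus a custom IAM role reference.
# All key names and values below are assumptions, not the exact schema.
partition: aws-us-gov          # hypothetical non-default partition
domain: example.aws.internal   # hypothetical domain override
roles:
  # pre-created role with the documented minimum permissions (hypothetical name)
  RestApiRole: my-custom-lisa-rest-api-role
```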
Other System Enhancements
- Enhanced create model validation to ensure data integrity
- Upgraded to Python 3.11 runtime for improved performance
- Updated various third-party dependencies to maintain security and functionality
- Updated the ChatUI:
  - Refined message display
  - Upgraded markdown parsing capabilities
  - Implemented a copy feature for AI-generated responses
Coming soon
- Happy Holidays! We have a lot in store for 2025. Our roadmap is customer-driven. Please reach out to us via GitHub issues to talk more! Early in the new year you’ll see chatbot UI and vector store enhancements.
Acknowledgements
Full Changelog: v3.3.2...v3.4.0
v3.3.2
Bug Fixes
- Resolved issue where an invalid schema import was causing create model API calls to fail
- Resolved issue where RAG citations weren't being populated in metadata for non-streaming requests
- Resolved issue where managing in-memory file context wouldn't display a success notification or close the modal
Acknowledgements
Full Changelog: v3.3.1...v3.3.2
v3.3.1
Bug Fixes
- Resolved issue where AWS partition was hardcoded in RAG Pipeline
- Restored LiteLLM environment override support
- Updated Makefile Model and ECR Account Number parsing
Acknowledgements
Full Changelog: v3.3.0...v3.3.1
v3.3.0
Key Features
RAG ETL Pipeline
- This feature introduces a second RAG ingestion capability for LISA customers. Today, customers can manually upload documents via the chatbot user interface directly into a vector store. With this new ingestion pipeline, customers have a flexible, scalable solution for automating the loading of documents into configured vector stores (see the sketch below).
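As a rough sketch of what an automated ingestion pipeline configuration could look like based on these notes (field names such as `pipelines`, `s3Bucket`, `s3Prefix`, `trigger`, `embeddingModel`, `chunkSize`, and `chunkOverlap` are assumptions, and the bucket and model names are hypothetical):

```yaml
# Sketch only: an automated ingestion pipeline attached to a vector store.
# Field names are assumed from these release notes, not the exact schema.
ragRepositories:
  - repositoryId: engineering-docs
    type: opensearch
    pipelines:
      - s3Bucket: my-ingestion-bucket   # hypothetical bucket name
        s3Prefix: docs/
        trigger: event                  # ingest when new objects land in S3
        embeddingModel: my-embedding-model
        chunkSize: 512
        chunkOverlap: 51
```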
Enhancements
- Implemented a confirmation modal prior to closing the create model wizard, enhancing user control and preventing accidental data loss
- Added functionality allowing users to optionally override auto-generated security groups with custom security groups at deployment time (see the sketch after this list)
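A minimal sketch of supplying pre-created security groups at deployment time, assuming a `securityGroupConfig` block with per-component ID fields (the key names and IDs below are placeholders; check the LISA documentation for the exact schema):

```yaml
# Sketch only: overriding auto-generated security groups with existing ones.
# Key names and security group IDs below are assumptions/placeholders.
securityGroupConfig:
  modelSecurityGroupId: sg-0123456789abcdef0
  restAlbSecurityGroupId: sg-0123456789abcdef1
  lisaServeAlbSecurityGroupId: sg-0123456789abcdef2
```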
Acknowledgements
Full Changelog: v3.2.1...v3.3.0
v3.2.1
Bug Fixes
- Resolved issue where the subnet wasn't being passed into EC2 instance creation
- Resolved role creation issue when deploying with custom subnets
- Updated the Docker image to grant permissions on copied-in files
Coming Soon
- Version 3.3.0 will include a new RAG ingestion pipeline. This will allow users to configure an S3 bucket and an ingestion trigger. When triggered, documents in that bucket will be pre-processed and loaded into the selected vector store.
Acknowledgements
Full Changelog: v3.2.0...v3.2.1