Releases · aws-solutions/qnabot-on-aws
v7.0.0
[7.0.0] - 2025-01-23
Added
- Streaming responses feature that enhances QnABot responses by providing real-time streaming from Large Language Models (LLMs) to the chat interface. This introduces a CloudFormation parameter `EnableStreaming` to optionally create the resources needed for streaming through a nested stack. See README.
- Enhanced Guardrail Integration that implements pre-processing and post-processing guardrails to provide improved content control and broader security for your chatbot application. See README.
- Implemented Converse API to simplify LLM workflows by providing a consistent interface for different LLM providers and role prompting, and eliminating the need for input tagging for Bedrock Guardrails. This introduces customizable system prompts `LLM_GENERATE_QUERY_SYSTEM_PROMPT` and `LLM_QA_SYSTEM_PROMPT` in the content designer to support role-based prompting. For more information, see system prompts in supported models and model features and using Converse API. A sketch of a streaming Converse call follows this list.
- Ability to use both RAG with Bedrock Knowledge Base and Kendra as fallback options. A new setting `FALLBACK_ORDER` in the content designer allows users to specify the fallback order of these options.
- Ability to set a TTL on records added to the DynamoDB UsersTable. (PR #671) - contributed by (@richhaase)
- Mistral as a new LLM provider option, and support for the latest Anthropic Sonnet 3.5 V2, Haiku 3.5 V1, Amazon Nova models, AI21 Jamba Instruct, Cohere Command R Plus, and Meta Llama 3.1 models.
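The following is a minimal sketch, not QnABot's internal code, of how a streaming Converse call with a system prompt and a guardrail can be made with boto3; the model ID, guardrail ID/version, and prompt text are placeholders.

```python
# Minimal sketch of a streaming Converse call with a system prompt and a guardrail.
# Illustrative only; model ID, guardrail ID/version, and prompts are placeholders.
import boto3

bedrock = boto3.client("bedrock-runtime")

response = bedrock.converse_stream(
    modelId="anthropic.claude-3-5-sonnet-20241022-v2:0",  # example model ID
    system=[{"text": "You are a helpful assistant. Answer only from the provided context."}],
    messages=[{"role": "user", "content": [{"text": "What are your support hours?"}]}],
    inferenceConfig={"maxTokens": 256, "temperature": 0.1},
    guardrailConfig={  # guardrails applied through Converse, with no input tagging required
        "guardrailIdentifier": "YOUR_GUARDRAIL_ID",
        "guardrailVersion": "1",
    },
)

# Stream partial text deltas to the client as they arrive.
for event in response["stream"]:
    delta = event.get("contentBlockDelta", {}).get("delta", {})
    if "text" in delta:
        print(delta["text"], end="", flush=True)
```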
Changed
- Upgraded to Node 20.
- Upgraded AWS SDK dependencies.
- Migrated Settings to DynamoDB store rather than SSM, allowing for longer custom settings and prompts.
- Moved Amazon Q Business Plugin from the samples repo to the QnABot repo as an example Lambda hook
- Reduced the size of Bedrock KnowledgeBase output by removing citations from plaintext response.
- Updated OpenSearch EBSOptions VolumeType to `gp3` from `gp2` (an illustrative API call follows this list).
- Updated IAM permissions and CloudFormation outputs.
- Added additional non-user input fields to OpenSearch metrics data redaction exclusion list.
- Migrated to Poetry for Python dependency management
- Updated innerHTML usages to innerText per security best practices
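For context on the gp3 change above, here is an illustrative boto3 call showing the equivalent EBS setting on an existing OpenSearch domain; the solution itself applies this through its CloudFormation template, and the domain name below is a placeholder.

```python
# Illustrative only: the solution sets gp3 in its CloudFormation template;
# this shows the equivalent EBSOptions change on an existing domain.
import boto3

opensearch = boto3.client("opensearch")
opensearch.update_domain_config(
    DomainName="my-qnabot-domain",  # placeholder
    EBSOptions={
        "EBSEnabled": True,
        "VolumeType": "gp3",  # previously gp2
        "VolumeSize": 10,
    },
)
```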
Fixed
- Fixed issues with Excel file import and character handling when reading the questions import file
- Improvements for merging of chained items (PR #720) - contributed by (@amendlik)
Deprecated
- Settings stored in SSM Parameters. These will automatically be moved to DynamoDB when you upgrade.
- Sagemaker embeddings and LLM workflows.
- KendraCrawlerSNS Topic workflow. Issue #742
- Bedrock LLM Models Cohere Command Text, Jurassic-2 Mid and Ultra per Amazon Bedrock Model Lifecycle
Security
- Patched Jinja2, nanoid, and path-to-regexp vulnerabilities
v6.1.5
[6.1.5] - 2024-11-20
Security
- Patched langchain, cross-spawn & elliptic vulnerabilities
v6.1.4
v6.1.3
v6.1.2
[6.1.2] - 2024-10-07
Fixed
- Cleared context state credential and updated the page history after logout
Changed
- Added Anthropic Claude 3.5 Sonnet as an additional option to the list of LLM models provided through CloudFormation parameters `LLMBedrockModelId` and `BedrockKnowledgeBaseModel` (see the sketch below)
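As a rough sketch, not an official procedure, the new model option can be selected on an existing stack by overriding these parameters during a stack update; the stack name, parameter values, and capability list below are placeholders that depend on your deployment.

```python
# Rough sketch: override the Bedrock model parameters on an existing QnABot stack.
# Stack name, parameter values, and required capabilities are deployment-specific placeholders.
import boto3

cfn = boto3.client("cloudformation")
cfn.update_stack(
    StackName="qnabot-on-aws",  # placeholder
    UsePreviousTemplate=True,
    Parameters=[
        {"ParameterKey": "LLMBedrockModelId", "ParameterValue": "anthropic.claude-3.5-sonnet"},        # placeholder value
        {"ParameterKey": "BedrockKnowledgeBaseModel", "ParameterValue": "anthropic.claude-3.5-sonnet"}, # placeholder value
        # Pass every other template parameter with UsePreviousValue=True to keep its current value.
    ],
    Capabilities=["CAPABILITY_NAMED_IAM", "CAPABILITY_AUTO_EXPAND"],
)
```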
Deprecated
- Sagemaker support has been deprecated and will be removed in the next release
v6.1.1
[6.1.1] - 2024-09-26
Fixed
- Added back .gitignore to fix custom deployment issues through the GitHub repo
- Improved performance of lambda invocation from frontend to save settings faster
- Fixed bug that limited response card buttons to only 5 buttons Issue #765
- Security patch for body-parser, micromatch, path-to-regexp, and webpack
- Added support for crawled links in Bedrock Knowledge Base to be shown as referenced links
- Fixed an issue where the context is expanded by default and can't be closed when Knowledge Base returns lists in the response
- Fixed limit on import file sizes Issue #766
v6.1.0
[6.1.0] - 2024-08-29
Added
- Integration with Guardrails for Amazon Bedrock and Amazon Bedrock Knowledge Base Integration (see documentation)
- Ability to customize prompt template for RAG using Amazon Bedrock Knowledge Base through setting `KNOWLEDGE_BASE_PROMPT_TEMPLATE` (see documentation)
- Ability to customize inference parameters for the LLM specified in `BedrockKnowledgeBaseModel` through setting `KNOWLEDGE_BASE_MODEL_PARAMS` (see documentation)
- Ability to customize search type (e.g. `SEMANTIC` or `HYBRID`) for how data sources in the knowledge base are queried through setting `KNOWLEDGE_BASE_SEARCH_TYPE` (see documentation)
- Ability to customize the maximum number of retrieved results for RAG using Amazon Bedrock Knowledge Base through setting `KNOWLEDGE_BASE_MAX_NUMBER_OF_RETRIEVED_RESULTS` (see documentation)
- Ability to customize metadata and filters for RAG using Amazon Bedrock Knowledge Base through setting `KNOWLEDGE_BASE_METADATA_FILTERS` (see documentation). A sketch showing these retrieval options together follows this list.
- Added an option to specify the retention period for log groups through CloudFormation parameter `LogRetentionPeriod`
- Anonymized operational metrics for some designer settings
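A minimal sketch, assuming a Bedrock Knowledge Base ID and model ARN of your own, of how the options named above (search type, number of results, metadata filter, prompt template, and inference parameters) map onto the underlying RetrieveAndGenerate call; this is illustrative and not the solution's internal code.

```python
# Illustrative RetrieveAndGenerate call showing the knobs the settings above control.
# knowledgeBaseId, modelArn, filter key/value, and the prompt template are placeholders.
import boto3

agent_runtime = boto3.client("bedrock-agent-runtime")

response = agent_runtime.retrieve_and_generate(
    input={"text": "How do I reset my password?"},
    retrieveAndGenerateConfiguration={
        "type": "KNOWLEDGE_BASE",
        "knowledgeBaseConfiguration": {
            "knowledgeBaseId": "YOUR_KB_ID",
            "modelArn": "arn:aws:bedrock:us-east-1::foundation-model/anthropic.claude-3-haiku-20240307-v1:0",
            "retrievalConfiguration": {
                "vectorSearchConfiguration": {
                    "numberOfResults": 5,            # cf. KNOWLEDGE_BASE_MAX_NUMBER_OF_RETRIEVED_RESULTS
                    "overrideSearchType": "HYBRID",  # cf. KNOWLEDGE_BASE_SEARCH_TYPE
                    "filter": {"equals": {"key": "category", "value": "faq"}},  # cf. KNOWLEDGE_BASE_METADATA_FILTERS
                },
            },
            "generationConfiguration": {
                "promptTemplate": {  # cf. KNOWLEDGE_BASE_PROMPT_TEMPLATE; must include $search_results$
                    "textPromptTemplate": "Answer using only these results:\n$search_results$",
                },
                "inferenceConfig": {  # cf. KNOWLEDGE_BASE_MODEL_PARAMS
                    "textInferenceConfig": {"temperature": 0.1, "maxTokens": 512},
                },
            },
        },
    },
)

print(response["output"]["text"])
```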
Changed
- Improved fault tolerance of Testall, Export, Import functionalities and added ContentDesignerOutputBucket
- Added Amazon Titan Text Embeddings V2 as an additional option to the list of embedding models provided through CloudFormation parameter `EmbeddingsBedrockModelId`
- Added Amazon Titan Text Premier as an additional option to the list of LLM models provided through CloudFormation parameters `LLMBedrockModelId` and `BedrockKnowledgeBaseModel`. Issue #746
- Changed Sagemaker LLM image to the latest version
- Changed `CustomQnABotSettings` parameter store to Advanced Tier to accommodate storing additional custom settings (see the sketch below)
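A minimal sketch, with a placeholder parameter name, of writing the custom settings blob to an Advanced-tier SSM parameter as described above; Advanced tier raises the value size limit from 4 KB to 8 KB. (In v7.0.0 these settings move to DynamoDB.)

```python
# Sketch of storing the custom settings JSON in an Advanced-tier SSM parameter.
# The parameter name is a placeholder; your stack generates its own name.
import json
import boto3

ssm = boto3.client("ssm")
ssm.put_parameter(
    Name="CFN-CustomQnABotSettings-EXAMPLE",  # placeholder
    Value=json.dumps({"ENABLE_DEBUG_RESPONSES": "false"}),
    Type="String",
    Tier="Advanced",  # 8 KB value limit vs. 4 KB for Standard
    Overwrite=True,
)
```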
Removed
- Removed Amazon Lex V1 resources
- Removed Canvas LMS integration
Fixed
- Fixed import settings in content designer for double byte characters
- Fixed an edge case where the Knowledge Base could return a context starting with `#` characters, causing font differences in the returned text due to Markdown formatting
- Fixed session attribute `qnabot_gotanswer` not being set to `true` after receiving hits from Knowledge Base
Security
- Security patch for axios, moto, read-excel-file, handlebars, boto3, click, elliptic & postcss
v6.0.3
[6.0.3] - 2024-08-06
Security
- Patched fast-xml-parser vulnerability
v6.0.2
[6.0.2] - 2024-07-22
Added
- Added migration documentation for migrating QnABot configurations and data from an existing deployment to a new deployment
- Added documentation for Bedrock Knowledge Base
Fixed
- Improved logout functionality, which signs out the user and invalidates the access and refresh tokens that Amazon Cognito issued to the user (see the sketch after this list). Issue #747
- Fixed bug that restricted import of questions with answers that consisted of only double-byte characters. Issue #731
- Fixed bug with chained questions causing errors in the fulfillment lambda.
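For reference, a hedged sketch of the Cognito calls that this kind of logout fix relies on, not the solution's actual client code; the tokens and client ID are placeholders.

```python
# Sketch of invalidating a user's Cognito tokens on logout.
# The token values and client ID are placeholders supplied by your application.
import boto3

access_token = "ACCESS_TOKEN_PLACEHOLDER"
refresh_token = "REFRESH_TOKEN_PLACEHOLDER"
client_id = "APP_CLIENT_ID_PLACEHOLDER"

cognito = boto3.client("cognito-idp")

# Signs the user out everywhere and invalidates the tokens issued for this session.
cognito.global_sign_out(AccessToken=access_token)

# Additionally revoke the refresh token so it cannot mint new access tokens.
cognito.revoke_token(Token=refresh_token, ClientId=client_id)
```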
Updated
- Removed aws-sdk (JavaScript V2) from dependency list.
- Updated parameter description for elicit response bot settings in the content designer settings. Issue #745
- Removed LLM models `meta.llama2-70b-chat-v1` and `meta.llama2-13b-chat-v1` from the list of models in the CloudFormation parameter `LLMBedrockModelId` since these models will be unavailable on Amazon Bedrock starting from August 12, 2024.
- Updated the setting `LLM_QA_NO_HITS_REGEX` in the Content Designer to include a default pattern `Sorry, I don't know` in prompts specified through the setting `LLM_QA_PROMPT_TEMPLATE` and other patterns returned by LLMs in their responses.
- Constrained the query made to Bedrock Knowledge Base to a maximum of 1000 input characters, per the input requirements. A sketch of these two behaviors follows this list.
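A minimal sketch of the two behaviors described above, using hypothetical helper names: matching a no-hits answer against the configured regex, and constraining the Knowledge Base query to 1000 characters.

```python
import re

# Hypothetical helpers illustrating the two changes above.
NO_HITS_REGEX = r"Sorry, I don't know"  # default pattern added to LLM_QA_NO_HITS_REGEX
MAX_KB_QUERY_CHARS = 1000               # Bedrock Knowledge Base input query limit

def is_no_hits(llm_answer: str) -> bool:
    """True when the LLM answer matches the configured no-hits pattern."""
    return re.search(NO_HITS_REGEX, llm_answer) is not None

def constrain_kb_query(query: str) -> str:
    """Truncate the Knowledge Base input query to the 1000-character limit."""
    return query[:MAX_KB_QUERY_CHARS]

print(is_no_hits("Sorry, I don't know the answer to that."))  # True
```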
v6.0.1
[6.0.1] - 2024-06-26
Fixed
- Fixed bug that was restricting stack names to be below 26 characters. Issue #741
- Fixed a looping issue when using slots and chaining #721 (PR #721) - contributed by (@amendlik)
- Fixed GitHub links with incorrect paths.
Updated
- Security patches for braces, urllib3, and ws.
- Improved latency of IAM policy propagation when switching the Bedrock embedding model.