Merge pull request #781 from koid/feature/support-bedrock-claude3
added support for bedrock/claude3
mrT23 authored Mar 18, 2024
2 parents eff6468 + 3bae515 commit dd83b19
Showing 5 changed files with 20 additions and 4 deletions.
11 changes: 9 additions & 2 deletions docs/docs/usage-guide/additional_configurations.md
@@ -162,15 +162,22 @@ To use Amazon Bedrock and its foundational models, add the below configuration:

```
[config] # in configuration.toml
-model = "anthropic.claude-v2"
-fallback_models="anthropic.claude-instant-v1"
+model="bedrock/anthropic.claude-3-sonnet-20240229-v1:0"
+model_turbo="bedrock/anthropic.claude-3-sonnet-20240229-v1:0"
+fallback_models=["bedrock/anthropic.claude-v2:1"]
[aws] # in .secrets.toml
bedrock_region = "us-east-1"
```

Note that you have to enable access to the foundation models before using them. Please refer to [this document](https://docs.aws.amazon.com/bedrock/latest/userguide/setting-up.html) for more details.

+If you are using a Claude 3 model, add the following setting, since some request parameters are not supported by Claude 3:
+```
+[litellm]
+drop_params = true
+```

The AWS session is authenticated automatically from your environment, but you can also set the `AWS_ACCESS_KEY_ID` and `AWS_SECRET_ACCESS_KEY` environment variables explicitly.
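To see how the documented settings come together at call time, here is a minimal sketch, assuming the litellm version pinned in this PR still accepts an externally created boto3 client via `aws_bedrock_client`; the prompt, hard-coded region, and `max_tokens` value are illustrative only:

```
import boto3
import litellm

# Mirror `drop_params = true` from [litellm] in configuration.toml:
# silently drop request parameters that Bedrock's Claude 3 endpoint rejects.
litellm.drop_params = True

# boto3 resolves credentials from the environment (AWS_ACCESS_KEY_ID /
# AWS_SECRET_ACCESS_KEY) or the standard credential chain; the region
# corresponds to the [aws] bedrock_region setting above.
bedrock_client = boto3.client(
    service_name="bedrock-runtime",
    region_name="us-east-1",
)

response = litellm.completion(
    model="bedrock/anthropic.claude-3-sonnet-20240229-v1:0",
    messages=[{"role": "user", "content": "Say hello from Bedrock."}],
    aws_bedrock_client=bedrock_client,  # assumption: pre-built client pass-through
    max_tokens=100,
)
print(response["choices"][0]["message"]["content"])
```

Dropping unsupported parameters client-side keeps a single code path for both Claude 2 and Claude 3 model ids.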


5 changes: 5 additions & 0 deletions pr_agent/algo/__init__.py
@@ -23,4 +23,9 @@
'anthropic.claude-v1': 100000,
'anthropic.claude-v2': 100000,
'anthropic/claude-3-opus-20240229': 100000,
+'bedrock/anthropic.claude-instant-v1': 100000,
+'bedrock/anthropic.claude-v2': 100000,
+'bedrock/anthropic.claude-v2:1': 100000,
+'bedrock/anthropic.claude-3-sonnet-20240229-v1:0': 100000,
+'bedrock/anthropic.claude-3-haiku-20240307-v1:0': 100000,
}
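
For orientation, a small sketch of how these entries are consumed, assuming the dictionary shown here is the module-level `MAX_TOKENS` map in `pr_agent/algo/__init__.py` (the model id is one of the new entries):

```
from pr_agent.algo import MAX_TOKENS

# With this change, Bedrock-prefixed model ids resolve to the same
# 100k-token context window as the Anthropic-hosted entries above.
model = "bedrock/anthropic.claude-3-sonnet-20240229-v1:0"
print(MAX_TOKENS.get(model))  # 100000 after this PR
```
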
3 changes: 3 additions & 0 deletions pr_agent/algo/ai_handlers/litellm_ai_handler.py
@@ -36,6 +36,8 @@ def __init__(self):
assert litellm_token, "LITELLM_TOKEN is required"
os.environ["LITELLM_TOKEN"] = litellm_token
litellm.use_client = True
+if get_settings().get("LITELLM.DROP_PARAMS", None):
+    litellm.drop_params = get_settings().litellm.drop_params
if get_settings().get("OPENAI.ORG", None):
    litellm.organization = get_settings().openai.org
if get_settings().get("OPENAI.API_TYPE", None):
@@ -68,6 +70,7 @@ def __init__(self):
)
if get_settings().get("AWS.BEDROCK_REGION", None):
    litellm.AmazonAnthropicConfig.max_tokens_to_sample = 2000
+    litellm.AmazonAnthropicClaude3Config.max_tokens = 2000
    self.aws_bedrock_client = boto3.client(
        service_name="bedrock-runtime",
        region_name=get_settings().aws.bedrock_region,
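
As a usage note, the new `drop_params` wiring is purely settings-driven; a rough sketch of the equivalent flow, assuming `get_settings()` from `pr_agent.config_loader` returns the usual Dynaconf object and that its `set()` override is available:

```
import litellm
from pr_agent.config_loader import get_settings

# Equivalent to `drop_params = true` under [litellm] in configuration.toml,
# or an environment override for LITELLM.DROP_PARAMS.
get_settings().set("LITELLM.DROP_PARAMS", True)

# Mirrors what the handler's __init__ now does before any completion call.
if get_settings().get("LITELLM.DROP_PARAMS", None):
    litellm.drop_params = get_settings().litellm.drop_params

assert litellm.drop_params is True
```
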
3 changes: 2 additions & 1 deletion pr_agent/settings/configuration.toml
@@ -192,7 +192,8 @@ pr_commands = [
url = ""

[litellm]
-#use_client = false
+# use_client = false
+# drop_params = false

[pr_similar_issue]
skip_comments = false
2 changes: 1 addition & 1 deletion requirements.txt
@@ -9,7 +9,7 @@ GitPython==3.1.32
google-cloud-aiplatform==1.35.0
google-cloud-storage==2.10.0
Jinja2==3.1.2
-litellm==1.29.1
+litellm==1.31.10
loguru==0.7.2
msrest==0.7.1
openai==1.13.3
