119 changes: 119 additions & 0 deletions docs/my-website/docs/providers/bedrock_vector_store.md
@@ -138,6 +138,125 @@ print(response.choices[0].message.content)
</Tabs>


## Filter Results

Restrict search results by filtering on document metadata attributes.

**OpenAI-style operators** (automatically translated to their AWS equivalents; see the translation sketch below):
- `eq`, `ne`, `gt`, `gte`, `lt`, `lte`, `in`, `nin`

**AWS-native operators** (passed through as-is):
- `equals`, `notEquals`, `greaterThan`, `greaterThanOrEquals`, `lessThan`, `lessThanOrEquals`, `in`, `notIn`, `startsWith`, `listContains`, `stringContains`
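
For illustration, here is a rough sketch of what that translation might produce. The Bedrock-native structure shown (operator-keyed conditions grouped under `andAll`/`orAll`) is an assumption based on the Knowledge Bases retrieval filter format, not a dump of LiteLLM's internal output:

```python
# Hypothetical illustration of the OpenAI-style -> Bedrock-native translation.
# The exact Bedrock structure (operator-keyed dicts, andAll/orAll grouping)
# is assumed from the Knowledge Bases filter format.

openai_style_filter = {
    "and": [
        {"key": "category", "value": "policy", "operator": "eq"},
        {"key": "year", "value": 2024, "operator": "gte"},
    ]
}

# Roughly equivalent Bedrock-native filter after translation:
bedrock_native_filter = {
    "andAll": [
        {"equals": {"key": "category", "value": "policy"}},
        {"greaterThanOrEquals": {"key": "year", "value": 2024}},
    ]
}
```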

<Tabs>
<TabItem value="single-filter" label="Single Filter">

```python
response = await litellm.acompletion(
    model="anthropic/claude-3-5-sonnet",
    messages=[{"role": "user", "content": "What are the latest updates?"}],
    tools=[{
        "type": "file_search",
        "vector_store_ids": ["YOUR_KNOWLEDGE_BASE_ID"],
        "filters": {
            "key": "category",
            "value": "updates",
            "operator": "eq"
        }
    }]
)
```

</TabItem>

<TabItem value="and-filters" label="AND">

```python
response = await litellm.acompletion(
    model="anthropic/claude-3-5-sonnet",
    messages=[{"role": "user", "content": "What are the policies?"}],
    tools=[{
        "type": "file_search",
        "vector_store_ids": ["YOUR_KNOWLEDGE_BASE_ID"],
        "filters": {
            "and": [
                {"key": "category", "value": "policy", "operator": "eq"},
                {"key": "year", "value": 2024, "operator": "gte"}
            ]
        }
    }]
)
```

</TabItem>

<TabItem value="or-filters" label="OR">

```python
response = await litellm.acompletion(
    model="anthropic/claude-3-5-sonnet",
    messages=[{"role": "user", "content": "Show me technical docs"}],
    tools=[{
        "type": "file_search",
        "vector_store_ids": ["YOUR_KNOWLEDGE_BASE_ID"],
        "filters": {
            "or": [
                {"key": "category", "value": "api", "operator": "eq"},
                {"key": "category", "value": "sdk", "operator": "eq"}
            ]
        }
    }]
)
```

</TabItem>

<TabItem value="advanced-filters" label="AWS Operators">

```python
response = await litellm.acompletion(
    model="anthropic/claude-3-5-sonnet",
    messages=[{"role": "user", "content": "Find docs"}],
    tools=[{
        "type": "file_search",
        "vector_store_ids": ["YOUR_KNOWLEDGE_BASE_ID"],
        "filters": {
            "and": [
                {"key": "title", "value": "Guide", "operator": "stringContains"},
                {"key": "tags", "value": "important", "operator": "listContains"}
            ]
        }
    }]
)
```

</TabItem>

<TabItem value="proxy-filters" label="Proxy">

```bash
curl http://localhost:4000/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $LITELLM_API_KEY" \
  -d '{
    "model": "claude-3-5-sonnet",
    "messages": [{"role": "user", "content": "What are our policies?"}],
    "tools": [{
      "type": "file_search",
      "vector_store_ids": ["YOUR_KNOWLEDGE_BASE_ID"],
      "filters": {
        "and": [
          {"key": "department", "value": "engineering", "operator": "eq"},
          {"key": "type", "value": "policy", "operator": "eq"}
        ]
      }
    }]
  }'
```

</TabItem>
</Tabs>

## Accessing Search Results

See how to access vector store search results in your response:
17 changes: 17 additions & 0 deletions litellm/llms/base_llm/vector_store/transformation.py
@@ -2,9 +2,11 @@
from typing import TYPE_CHECKING, Any, Dict, List, Optional, Tuple, Union

import httpx

from litellm.types.router import GenericLiteLLMParams
from litellm.types.vector_stores import (
    VECTOR_STORE_OPENAI_PARAMS,
    VectorStoreCreateOptionalRequestParams,
    VectorStoreCreateResponse,
    VectorStoreSearchOptionalRequestParams,
@@ -24,6 +26,20 @@


class BaseVectorStoreConfig:

    def get_supported_openai_params(
        self, model: str
    ) -> List[VECTOR_STORE_OPENAI_PARAMS]:
        """Return the OpenAI-style vector store params this provider supports."""
        return []

    def map_openai_params(
        self,
        non_default_params: dict,
        optional_params: dict,
        drop_params: bool,
    ) -> dict:
        """Map supported OpenAI-style params onto the provider's optional params."""
        return optional_params

    @abstractmethod
    def transform_search_vector_store_request(
        self,
@@ -34,6 +50,7 @@ def transform_search_vector_store_request(
        litellm_logging_obj: LiteLLMLoggingObj,
        litellm_params: dict,
    ) -> Tuple[str, Dict]:

        pass

    @abstractmethod
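
The hooks added to `BaseVectorStoreConfig` above give provider-specific configs a single place to declare which OpenAI-style vector store params they accept and to map them into the request's optional params. A minimal, hypothetical override is sketched below; the class name and the pass-through of `filters` are assumptions for illustration, not the Bedrock implementation shipped in this PR:

```python
# Hypothetical provider config illustrating the new BaseVectorStoreConfig hooks.
# The class name, and the assumption that "filters" is a supported OpenAI-style
# param, are illustrative only; the abstract transform_* methods are omitted.
from typing import List

from litellm.llms.base_llm.vector_store.transformation import BaseVectorStoreConfig
from litellm.types.vector_stores import VECTOR_STORE_OPENAI_PARAMS


class ExampleVectorStoreConfig(BaseVectorStoreConfig):
    def get_supported_openai_params(
        self, model: str
    ) -> List[VECTOR_STORE_OPENAI_PARAMS]:
        # Advertise the OpenAI-style params this provider can translate.
        return ["filters"]

    def map_openai_params(
        self,
        non_default_params: dict,
        optional_params: dict,
        drop_params: bool,
    ) -> dict:
        # Pass the supported param through so the request transformation can
        # later convert it to the provider's native filter format; other
        # params are ignored in this sketch (drop_params handling omitted).
        if "filters" in non_default_params:
            optional_params["filters"] = non_default_params["filters"]
        return optional_params
```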