Merged
17 changes: 17 additions & 0 deletions CHANGELOG.md
@@ -5,6 +5,23 @@ All notable changes to the OpenIntent SDK will be documented in this file.
The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).

## [0.15.1] - 2026-03-06

### Changed

- **Gemini SDK Migration** — Replaced deprecated `google-generativeai` SDK with `google-genai` (v1.0+). The `GeminiAdapter` now uses the modern `genai.Client` pattern (`client.models.generate_content` / `client.models.generate_content_stream`) instead of the legacy `genai.configure()` + `GenerativeModel()` approach.
- **GeminiAdapter Rewrite** — Full protocol parity with OpenAI/Anthropic adapters: prompt/completion/total token counts from `usage_metadata`, finish reason mapping from candidates, function call tracking with provider-native IDs, streaming usage metadata from final chunks, and multi-turn `GeminiChatSession` with proper history management (including during streaming).
- **LLMEngine Gemini Integration** — Added `_messages_to_gemini_contents()` for proper message conversion (system messages become `system_instruction`, assistant messages use `model` role), `_tools_to_gemini_format()` for tool schema conversion to Gemini function declarations, and raw provider fallback paths for Gemini in both `_call_raw_provider` and `_stream_raw_provider`.
- **Default model names** updated from `gemini-1.5-pro` / `gemini-2.0-flash` to `gemini-3-flash` across SDK, docs, and frontend.
- **Dependency** — `google-generativeai>=0.4.0` replaced with `google-genai>=1.0.0` in `gemini` and `all-adapters` optional extras.
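
To illustrate the message-conversion rules described above (system messages become `system_instruction`, assistant messages use the `model` role), here is a minimal standalone sketch of what a helper like `_messages_to_gemini_contents()` does — a hypothetical re-implementation for illustration, not the SDK's actual code:

```python
def messages_to_gemini_contents(messages):
    """Convert OpenAI-style chat messages into Gemini-style contents plus a
    separate system_instruction (illustrative sketch, not the SDK source)."""
    system_instruction = None
    contents = []
    for msg in messages:
        if msg["role"] == "system":
            # Gemini takes system text out-of-band, not as a conversation turn
            system_instruction = msg["content"]
        else:
            # Gemini uses "model" where OpenAI-style APIs use "assistant"
            role = "model" if msg["role"] == "assistant" else "user"
            contents.append({"role": role, "parts": [{"text": msg["content"]}]})
    return system_instruction, contents


system, contents = messages_to_gemini_contents([
    {"role": "system", "content": "Be concise."},
    {"role": "user", "content": "Hi"},
    {"role": "assistant", "content": "Hello!"},
])
```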

### Updated

- All version references updated to 0.15.1 across Python SDK, MCP server package, documentation, and changelog.
- Documentation and examples updated to reflect new SDK patterns and model names.

---

## [0.15.0] - 2026-03-02

### Added
17 changes: 17 additions & 0 deletions docs/changelog.md
@@ -5,6 +5,23 @@ All notable changes to the OpenIntent SDK will be documented in this file.
The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).

## [0.15.1] - 2026-03-06

### Changed

- **Gemini SDK Migration** — Replaced deprecated `google-generativeai` SDK with `google-genai` (v1.0+). The `GeminiAdapter` now uses the modern `genai.Client` pattern (`client.models.generate_content` / `client.models.generate_content_stream`) instead of the legacy `genai.configure()` + `GenerativeModel()` approach.
- **GeminiAdapter Rewrite** — Full protocol parity with OpenAI/Anthropic adapters: prompt/completion/total token counts from `usage_metadata`, finish reason mapping from candidates, function call tracking with provider-native IDs, streaming usage metadata from final chunks, and multi-turn `GeminiChatSession` with proper history management (including during streaming).
- **LLMEngine Gemini Integration** — Added `_messages_to_gemini_contents()` for proper message conversion (system messages become `system_instruction`, assistant messages use `model` role), `_tools_to_gemini_format()` for tool schema conversion to Gemini function declarations, and raw provider fallback paths for Gemini in both `_call_raw_provider` and `_stream_raw_provider`.
- **Default model names** updated from `gemini-1.5-pro` / `gemini-2.0-flash` to `gemini-3-flash` across SDK, docs, and frontend.
- **Dependency** — `google-generativeai>=0.4.0` replaced with `google-genai>=1.0.0` in `gemini` and `all-adapters` optional extras.
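
As a rough sketch of the finish-reason mapping mentioned above (hypothetical names and table — the adapter's real mapping may differ), a Gemini candidate's finish reason can be normalized to the OpenAI-style values the other adapters emit:

```python
# Hypothetical mapping from Gemini finish-reason strings to the normalized
# OpenAI-style values used by the other adapters; illustrative only.
GEMINI_FINISH_REASONS = {
    "STOP": "stop",             # model finished naturally
    "MAX_TOKENS": "length",     # hit the output token limit
    "SAFETY": "content_filter",
    "RECITATION": "content_filter",
}


def map_finish_reason(reason: str) -> str:
    # Fall back to "stop" for reasons with no direct equivalent
    return GEMINI_FINISH_REASONS.get(reason, "stop")


print(map_finish_reason("MAX_TOKENS"))
```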

### Updated

- All version references updated to 0.15.1 across Python SDK, MCP server package, documentation, and changelog.
- Documentation and examples updated to reflect new SDK patterns and model names.

---

## [0.15.0] - 2026-03-02

### Added
16 changes: 9 additions & 7 deletions docs/examples/llm-adapters.md
@@ -68,16 +68,18 @@ for chunk in adapter.chat_complete_stream(
## Google Gemini

```diff
-import google.generativeai as genai
+from google import genai
 from openintent.adapters import GeminiAdapter

-genai.configure(api_key="...")
-model = genai.GenerativeModel("gemini-pro")
+gemini = genai.Client(api_key="...")

-adapter = GeminiAdapter(model, client, intent.id)
-response = adapter.chat_complete(
-    messages=[{"role": "user", "content": "Explain machine learning"}]
-)
+adapter = GeminiAdapter(gemini, client, intent.id, model="gemini-3-flash")
+response = adapter.generate_content("Explain machine learning")
+print(response.text)
+
+# Streaming
+for chunk in adapter.generate_content("Explain in detail", stream=True):
+    print(chunk.text, end="", flush=True)
```

## DeepSeek
2 changes: 1 addition & 1 deletion docs/getting-started/llm-quickstart.md
@@ -174,7 +174,7 @@ The `model=` parameter accepts any model supported by the [LLM adapters](../../g
@Agent("researcher", model="claude-sonnet-4-20250514")

# Google Gemini
-@Agent("researcher", model="gemini-2.0-flash")
+@Agent("researcher", model="gemini-3-flash")

# DeepSeek
@Agent("researcher", model="deepseek-chat")
9 changes: 4 additions & 5 deletions docs/guide/adapters.md
@@ -97,14 +97,13 @@ pip install openintent[all-adapters]  # All adapters
=== "Gemini"

```diff
+from google import genai
 from openintent.adapters import GeminiAdapter

-adapter = GeminiAdapter(gemini_client, oi_client, intent_id)
+gemini = genai.Client(api_key="...")
+adapter = GeminiAdapter(gemini, oi_client, intent_id, model="gemini-3-flash")

-response = adapter.generate_content(
-    model="gemini-pro",
-    contents=[{"role": "user", "parts": [{"text": "Hello"}]}]
-)
+response = adapter.generate_content("Hello")
```

=== "Grok / DeepSeek"
4 changes: 2 additions & 2 deletions docs/overrides/home.html
@@ -24,7 +24,7 @@

<!-- Hero -->
<div class="oi-hero">
-<div class="oi-hero__badge">v0.15.0 &mdash; Native FastAPI SSE &amp; RFC-0010 Retry MCP Tools</div>
+<div class="oi-hero__badge">v0.15.1 &mdash; Gemini Adapter Rebuild for google-genai SDK</div>
<h1 class="oi-hero__title">Stop Duct-Taping Your Agents Together</h1>
<p class="oi-hero__subtitle">
OpenIntent is a durable, auditable protocol for multi-agent coordination. Structured intents replace fragile chat chains. Versioned state replaces guesswork. Ship agent systems that actually work in production.
@@ -50,7 +50,7 @@ <h1 class="oi-hero__title">Stop Duct-Taping Your Agents Together</h1>
<div class="oi-stat__label">Tests</div>
</div>
<div class="oi-stat">
-<div class="oi-stat__number">v0.15.0</div>
+<div class="oi-stat__number">v0.15.1</div>
<div class="oi-stat__label">Latest</div>
</div>
</div>
2 changes: 1 addition & 1 deletion mcp-server/package.json
@@ -1,6 +1,6 @@
{
"name": "@openintentai/mcp-server",
-"version": "0.15.0",
+"version": "0.15.1",
"description": "MCP server exposing the OpenIntent Coordination Protocol as MCP tools and resources",
"main": "dist/index.js",
"types": "dist/index.d.ts",
2 changes: 1 addition & 1 deletion mcp-server/src/index.ts
@@ -30,7 +30,7 @@ async function main() {
const server = new Server(
{
name: "openintent-mcp",
-version: "0.15.0",
+version: "0.15.1",
},
{
capabilities: {
2 changes: 1 addition & 1 deletion mkdocs.yml
@@ -204,7 +204,7 @@ extra:
link: https://pypi.org/project/openintent/
version:
provider: mike
-announcement: "<strong>v0.15.0</strong> is here &mdash; Native FastAPI SSE replaces sse-starlette, plus 4 new RFC-0010 retry MCP tools (70 total). <a href='changelog/'>Read the changelog &rarr;</a>"
+announcement: "<strong>v0.15.1</strong> is here &mdash; Gemini adapter rebuilt for google-genai SDK v1.0+, full LLMEngine Gemini integration. <a href='changelog/'>Read the changelog &rarr;</a>"
meta:
- name: description
content: "OpenIntent Python SDK — structured multi-agent coordination protocol with decorator-first agents, 23 RFCs, 7 LLM adapters, federation, MCP integration, and built-in FastAPI server."
2 changes: 1 addition & 1 deletion openintent/__init__.py
@@ -233,7 +233,7 @@ def get_server() -> tuple[Any, Any, Any]:
)


-__version__ = "0.15.0"
+__version__ = "0.15.1"
__all__ = [
"OpenIntentClient",
"AsyncOpenIntentClient",
11 changes: 5 additions & 6 deletions openintent/adapters/__init__.py
@@ -6,7 +6,7 @@
Supported providers:
- OpenAI (GPT-4, GPT-4o, GPT-5.2, codex models like gpt-5.2-codex, etc.)
- Anthropic (Claude 3, Claude 4, etc.)
-  - Google Gemini (Gemini 1.5, Gemini 2, etc.)
+  - Google Gemini (Gemini 3 Flash, Gemini 3 Pro, Gemini 3.1 Pro, etc.)
- xAI Grok (Grok-beta, etc.)
- DeepSeek (DeepSeek-chat, DeepSeek-coder, etc.)
- Azure OpenAI (GPT-4, GPT-4o via Azure endpoints)
@@ -58,15 +58,14 @@

Example usage with Google Gemini:

-    import google.generativeai as genai
+    from google import genai
     from openintent import OpenIntentClient
     from openintent.adapters import GeminiAdapter

-    genai.configure(api_key="...")
-    model = genai.GenerativeModel("gemini-1.5-pro")

     client = OpenIntentClient(base_url="...", api_key="...")
-    adapter = GeminiAdapter(model, client, intent_id="...")
+    gemini = genai.Client(api_key="...")
+
+    adapter = GeminiAdapter(gemini, client, intent_id="...", model="gemini-3-flash")

response = adapter.generate_content("Hello, how are you?")
"""