## 🚀 Features

- ⚙️ **Modular Architecture** — Compose agents from interchangeable components.
- 🧩 **Multi-LLM Support** — Connect to multiple providers seamlessly:
  - **OpenAI** (GPT-4o, GPT-4, GPT-3.5 Turbo)
  - **Anthropic** (Claude 3 family: Opus, Sonnet, Haiku)
  - **Google** (Gemini family: Pro, Flash)
  - **Ollama/llama-cpp** (local models like Llama, Mistral, etc.)
- ⚡ **Optimized for Speed and Memory** — Built in C++ with a focus on performance.
- 🔁 **Built-In Workflow Patterns**
  - Prompt Chaining
  - Routing
  - Parallelization
  - Orchestrator-Workers
  - Evaluator-Optimizer
- 🤖 **Autonomous Agents** — Supports modern reasoning strategies:
  - ReAct (Reason + Act)
  - CoT (Chain-of-Thought) [In Development]
  - Plan and Execute
  - Zero-Shot [In Development]
  - Reflexion [In Development]
- 🧠 **Extensible Tooling System** — Plug in your own tools or use built-in ones (Web Search, Wikipedia, Python Executor, etc.).

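The workflow patterns above can be illustrated without the framework itself; for example, Prompt Chaining is sequential composition, where each step's output becomes the next step's input. Below is a minimal, framework-agnostic sketch using plain `std::function` — it is not the library's actual API:

```cpp
#include <functional>
#include <string>
#include <vector>

// Prompt Chaining: run a sequence of steps, feeding each step's output
// into the next. In a real agent each step would call an LLM; here the
// steps are ordinary functions so the control flow is easy to see.
using Step = std::function<std::string(const std::string&)>;

std::string runChain(const std::vector<Step>& steps, std::string input) {
    for (const auto& step : steps) {
        input = step(input);  // output of one step is the input to the next
    }
    return input;
}
```

In the framework, chaining like this lets you validate or transform intermediate output between LLM calls instead of relying on one monolithic prompt.
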
## ⚙️ Requirements

- Dependencies (already provided for convenience)
  - python3 (3.11+)
  - nlohmann/json
  - spdlog

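The README notes that the framework checks several locations for API keys in a fixed order (the full list is elided in this excerpt). A lookup of that kind can be sketched as follows; the environment-variable name `ANTHROPIC_API_KEY` and the function name are illustrative assumptions, not the framework's actual API:

```cpp
#include <cstdlib>
#include <string>

// Illustrative API-key resolution: prefer an explicitly supplied key,
// otherwise fall back to an environment variable. The variable name
// ANTHROPIC_API_KEY is an assumption for this sketch.
std::string resolveApiKey(const std::string& explicitKey) {
    if (!explicitKey.empty()) {
        return explicitKey;  // explicit argument wins
    }
    const char* env = std::getenv("ANTHROPIC_API_KEY");
    return env ? std::string(env) : std::string();
}
```
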
Here's a simple example of creating and running an autonomous agent:

```cpp
#include <agents-cpp/context.h>
#include <agents-cpp/agents/autonomous_agent.h>
#include <agents-cpp/llm_interface.h>
#include <agents-cpp/tools/tool_registry.h>

// ...

int main() {
    // Create LLM interface
    auto llm = createLLM("anthropic", "<your_api_key_here>", "claude-3-5-sonnet-20240620");

    // Create agent context
    auto context = std::make_shared<Context>();
    context->setLLM(llm);

    // Register tools
    // ...
}
```
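The autonomous agent shown in the quick-start supports strategies such as ReAct (Reason + Act). Conceptually, ReAct interleaves model reasoning with tool calls until the model emits a final answer. The sketch below is framework-agnostic, with the model and tool stubbed as plain functions and a toy `ACT:`/`FINAL:` protocol invented for illustration — it is not the library's internal loop:

```cpp
#include <functional>
#include <string>

struct ReActResult {
    std::string answer;
    int iterations;
};

// Each iteration: the model reads the scratchpad and emits either an
// action ("ACT:<tool input>") or a final answer ("FINAL:<answer>").
// Actions are executed and their observation is appended to the
// scratchpad before the next reasoning step.
ReActResult reactLoop(
    const std::function<std::string(const std::string&)>& model,  // LLM stub
    const std::function<std::string(const std::string&)>& tool,   // tool stub
    std::string scratchpad,
    int maxIters = 5)
{
    for (int i = 1; i <= maxIters; ++i) {
        std::string out = model(scratchpad);
        if (out.rfind("FINAL:", 0) == 0) {
            return { out.substr(6), i };                 // done reasoning
        }
        std::string observation = tool(out.substr(4));   // strip "ACT:"
        scratchpad += "\nObservation: " + observation;   // feed result back
    }
    return { "", maxIters };  // no final answer within the budget
}
```

The `maxIters` cap mirrors a common safeguard in agent loops: without it, a model that never emits a final answer would run forever.
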
- `lib/`: Public library for SDK
- `include/agents-cpp/`: Public headers
  - `types.h`: Common type definitions
  - `context.h`: Context for agent execution
  - `llm_interface.h`: Interface for LLM providers
  - `tool.h`: Tool interface
  - `memory.h`: Agent memory interface
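Conceptually, the interface declared in `tool.h` is a named, callable unit that agents can register and invoke by name. The sketch below illustrates that shape with stand-in types; the real `tool.h`/`tool_registry.h` signatures and return types may differ:

```cpp
#include <map>
#include <memory>
#include <string>

// Illustrative tool interface: a name plus an execute method.
struct Tool {
    virtual ~Tool() = default;
    virtual std::string name() const = 0;
    virtual std::string execute(const std::string& input) = 0;
};

// Minimal registry: store tools by name so an agent can look them up
// when the LLM requests a tool call.
class ToolRegistry {
public:
    void add(std::shared_ptr<Tool> tool) {
        const std::string key = tool->name();
        tools_[key] = std::move(tool);
    }
    std::shared_ptr<Tool> get(const std::string& name) const {
        auto it = tools_.find(name);
        return it == tools_.end() ? nullptr : it->second;
    }
private:
    std::map<std::string, std::shared_ptr<Tool>> tools_;
};
```
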
You can create custom workflows by extending the `Workflow` base class or combining existing workflows:

```cpp
class CustomWorkflow : public Workflow {
public:
    CustomWorkflow(std::shared_ptr<Context> context)
        : Workflow(context) {}

    JsonObject run(const String& input) override {
        // ...
    }
};
```
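The extension point is the virtual `run` method, and a custom workflow can delegate to other workflows inside it. The self-contained sketch below shows that composition pattern with stand-in types (plain `std::string` instead of the framework's `JsonObject`/`String`, and no context), so it is illustrative rather than the actual `Workflow` API:

```cpp
#include <memory>
#include <string>
#include <vector>

// Stand-in for the framework's Workflow base class.
struct WorkflowBase {
    virtual ~WorkflowBase() = default;
    virtual std::string run(const std::string& input) = 0;
};

// A custom workflow that combines existing workflows by running them
// in sequence, threading each result into the next step.
struct SequentialWorkflow : WorkflowBase {
    std::vector<std::shared_ptr<WorkflowBase>> steps;

    std::string run(const std::string& input) override {
        std::string current = input;
        for (auto& step : steps) {
            current = step->run(current);
        }
        return current;
    }
};
```
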