
Commit 07c156a

feat: dedicated-ai-and-template-section (#263)
added dedicated-ai-and-template-section

## Summary by Sourcery

Add dedicated AI and template sections to the documentation, introducing new content for blockchain and AI integration, smart contract templates, and AI-powered development tools.

New Features:
- Added comprehensive documentation for Model Context Protocol (MCP)
- Introduced AI code assistant (RooCode) documentation
- Created template libraries for EVM and Fabric smart contracts
- Added OpenAI and pgvector integration guide

Documentation:
- Expanded platform documentation with new sections on AI and blockchain integration
- Added detailed guides for using AI tools in blockchain development
- Created template documentation for smart contract development

Chores:
- Reorganized documentation structure
- Added new meta.json files for new documentation sections
1 parent b021ece commit 07c156a

File tree

30 files changed: +5739 −41 lines changed


content/docs/blockchain-and-ai/ai-code-assistant.mdx

Lines changed: 306 additions & 0 deletions
Large diffs are not rendered by default.

content/docs/blockchain-and-ai/blockchain-and-ai.mdx

Lines changed: 917 additions & 0 deletions
Large diffs are not rendered by default.

content/docs/blockchain-and-ai/mcp.mdx

Lines changed: 544 additions & 0 deletions
Large diffs are not rendered by default.
Lines changed: 10 additions & 0 deletions
@@ -0,0 +1,10 @@
{
  "title": "Blockchain and AI",
  "icon": "Brain",
  "pages": [
    "blockchain-and-ai",
    "mcp",
    "ai-code-assistant",
    "open-ai-nodes-and-pg-vector"
  ]
}
Lines changed: 211 additions & 0 deletions
@@ -0,0 +1,211 @@
---
title: "Open AI nodes and pgvector"
description:
  A Guide to Building an AI-Powered Workflow with OpenAI Nodes and Vector
  Storage in Hasura
sidebar_position: 2
keywords: [integration studio, OpenAI, Hasura, pgvector, AI, SettleMint]
---

This guide demonstrates how to use the **SettleMint Integration Studio** to
create a flow that incorporates OpenAI nodes for vectorization and uses the
`pgvector` extension in Hasura for similarity searches. If you are new to
SettleMint, check out the
[Getting Started Guide](/launching-the-platform/managed-cloud-deployment/quickstart).

In this guide, you will learn to create workflows that:

- Use **OpenAI nodes** to vectorize data.
- Store vectorized data in **Hasura** using `pgvector`.
- Conduct similarity searches to find relevant matches for new queries.

### Prerequisites

- A SettleMint Platform account with **Integration Studio** and **Hasura** deployed
- Access to the Integration Studio and Hasura consoles in your SettleMint environment
- An OpenAI API key for using the OpenAI nodes
- A data source to vectorize (e.g., Graph Node, Attestation Indexer, or external API endpoint)

### Example Flow Available

The Integration Studio includes a pre-built AI example flow that demonstrates
these concepts. The flow uses the SettleMint Platform's attestation indexer as a
data source, showing how to:

- Fetch attestation data via an HTTP endpoint
- Process and vectorize the attestation content
- Store vectors in Hasura
- Perform similarity searches

You can use this flow as a reference while building your own implementation.
Each step described in this guide can be found in the example flow.

---

## Part 1: Creating a Workflow to Fetch, Vectorize, and Store Data

### Step 1: Set Up Vector Storage in Hasura

1. Access your SettleMint Hasura instance through the admin console.
2. Create a new table called `document_embeddings` with the following columns:
   - `id` (type: UUID, primary key)
   - `embedding` (type: vector(1536)) - for storing OpenAI embeddings
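For reference, the equivalent SQL (run from Hasura's SQL tab) might look like the sketch below. The exact statements are an assumption, not part of the original guide: the `vector` extension must be enabled once per database, and the dimension 1536 matches the `text-embedding-ada-002` model used later in this guide.

```sql
-- Enable the pgvector extension (once per database)
CREATE EXTENSION IF NOT EXISTS vector;

-- Table for storing OpenAI embeddings (1536 dimensions for text-embedding-ada-002)
CREATE TABLE document_embeddings (
  id uuid PRIMARY KEY DEFAULT gen_random_uuid(),
  embedding vector(1536)
);
```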
### Step 2: Set Up the Integration Studio Flow

1. **Open Integration Studio** in SettleMint and click **Create Flow** to
   start a new workflow.

### Step 3: Fetch Data from an External API

1. **Add an HTTP Request Node** to retrieve data from an external API, such as a
   document or product listing service.
2. Configure the **API endpoint** and any necessary authentication settings.
3. **Add a JSON Node** to parse the response data, focusing on fields like `id`
   and `content` for further processing.

### Step 4: Vectorize Data with OpenAI Node

1. **Insert an OpenAI Node** in the workflow:
   - Use this node to generate vector embeddings for the text data using
     OpenAI's Embeddings API.
   - Configure the OpenAI node with the appropriate model, such as
     `text-embedding-ada-002`, and the input data.

![Create an OpenAI node](../../img/developer-guides/openai-node.png)

### Step 5: Store Vectors in Hasura with pgvector

1. **Add a GraphQL Node** to save the vector embeddings and data `id` in Hasura.
2. Set up a **GraphQL Mutation** to store the vectors and associated IDs in a
   table enabled with `pgvector`.

Example Mutation:

```graphql
mutation insertVector($id: uuid!, $vector: [Float!]!) {
  insert_vectors(objects: { id: $id, vector: $vector }) {
    affected_rows
  }
}
```

3. Ensure the `id` from the fetched data and the vector from the OpenAI node
   output are mapped correctly into the mutation variables.
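The mapping in step 3 can be sketched as a small Function node body. The input paths (`msg.record` for the fetched item, `msg.embedding` for the OpenAI node output) are assumptions and will differ depending on how your flow is wired; the returned shape matches the `insertVector` mutation's variables.

```javascript
// Sketch of a Function node that prepares GraphQL mutation variables.
// Assumes msg.record holds the fetched item and msg.embedding holds the
// vector returned by the OpenAI node - adjust the paths to your flow.
function prepareVectorVariables(msg) {
  const { record, embedding } = msg;
  if (!record || !record.id) {
    throw new Error("fetched record is missing an id");
  }
  if (!Array.isArray(embedding) || embedding.length !== 1536) {
    throw new Error("embedding must be a 1536-dimensional array");
  }
  // Shape matches the insertVector mutation: { id, vector }
  return { id: record.id, vector: embedding };
}

// Example usage inside a flow:
// msg.variables = prepareVectorVariables(msg);
```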
### Step 6: Deploy and Test the Workflow

1. **Deploy the Flow** within Integration Studio and **run it** to confirm that
   data is fetched, vectorized, and stored in Hasura.
2. **Verify Hasura Data** by checking the table to ensure vectorized entries and
   corresponding IDs are stored correctly.

---

## Part 2: Setting Up a Similarity Search Endpoint

### Step 1: Create a POST Endpoint

1. **Add an HTTP POST Node** to accept a JSON payload with a `query` string to
   be vectorized and compared to stored data.

Payload Example:

```json
{
  "query": "input string for similarity search"
}
```

2. **Parse the Request** by adding a JSON node to extract the `query` field from
   the incoming POST request.
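A Function node placed after the POST endpoint could validate the payload before it reaches the OpenAI node. This is a minimal sketch under the payload shape shown above; the error messages are illustrative.

```javascript
// Sketch: validate the incoming POST body { "query": "..." }.
// Returns the trimmed query string, or throws on a bad payload.
function extractQuery(payload) {
  if (!payload || typeof payload.query !== "string") {
    throw new Error("payload must contain a 'query' string");
  }
  const query = payload.query.trim();
  if (query.length === 0) {
    throw new Error("'query' must not be empty");
  }
  return query;
}

// Example: extractQuery({ query: "  find similar docs  " }) returns "find similar docs"
```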
### Step 2: Vectorize the Input Query

1. **Add an OpenAI Node** to convert the incoming `query` string into a vector
   representation.

Example Configuration:

```text
Model: text-embedding-ada-002
Input: {{msg.payload.query}}
```

### Step 3: Perform a Similarity Search with Hasura

1. **Add a GraphQL Node** to perform a vector similarity search within Hasura
   using the `pgvector` extension.
2. Use a **GraphQL Query** to order results by similarity, returning the top 5
   most similar records.

Example Query:

```graphql
query searchVectors($vector: [Float!]!) {
  vectors(order_by: { vector: { _vector_distance: $vector } }, limit: 5) {
    id
    vector
  }
}
```

3. Map the vector from the OpenAI node output as the `vector` input for the
   Hasura query.
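To build intuition for what the similarity ordering computes, cosine similarity between two embeddings can be sketched in plain JavaScript. pgvector performs the equivalent computation inside the database; this sketch is only illustrative, not part of the flow.

```javascript
// Cosine similarity between two equal-length vectors: 1 = same direction,
// 0 = orthogonal, -1 = opposite. pgvector's cosine distance is 1 - similarity.
function cosineSimilarity(a, b) {
  if (a.length !== b.length) throw new Error("vectors must have the same length");
  let dot = 0, normA = 0, normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}
```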
### Step 4: Format and Return the Results

1. **Add a Function Node** to format the response, listing the top 5 matches in
   a structured JSON format.
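The Function node in this step might look like the sketch below. The input shape (`response.data.vectors`) mirrors the example `searchVectors` query; the output field names are assumptions for illustration.

```javascript
// Sketch: format the GraphQL response into a compact top-5 match list.
// Assumes response.data.vectors is the array returned by the searchVectors
// query (already ordered by similarity and limited to 5 by Hasura).
function formatMatches(response) {
  const rows = (response && response.data && response.data.vectors) || [];
  return {
    matchCount: rows.length,
    matches: rows.map((row, index) => ({
      rank: index + 1, // 1 = most similar
      id: row.id,
    })),
  };
}

// Example usage in a flow:
// msg.payload = formatMatches(msg.payload);
```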
### Step 5: Test the Flow

1. **Deploy the Flow** and send a POST request to confirm the similarity search
   functionality.
2. **Verify the Response** to ensure that the flow accurately returns the top 5
   matches from the vectorized data in Hasura.

---

## Next Steps

Now that you have built an AI-powered workflow, here are some
blockchain-specific applications you can explore:

### Vectorize On-Chain Data

- Index and vectorize smart contract events for similarity-based event monitoring
- Create embeddings from transaction data to detect patterns or anomalies
- Vectorize NFT metadata for content-based recommendations
- Build semantic search for on-chain attestations

### Advanced Use Cases

- Combine transaction data with natural language descriptions for enhanced search
- Create AI-powered analytics dashboards using vectorized blockchain metrics
- Implement fraud detection by vectorizing transaction patterns
- Build a semantic search engine for smart contract code and documentation

### Integration Ideas

- Connect to multiple blockchain indexers to vectorize data across networks
- Combine off-chain and on-chain data vectors for comprehensive analysis
- Set up automated alerts based on similarity to known patterns
- Create a knowledge base from vectorized blockchain documentation

For further resources, check out:

- [SettleMint Integration Studio Documentation](/building-with-settlemint/integration-studio/)
- [Node-RED Documentation](https://nodered.org/docs/)
- [OpenAI API Documentation](https://platform.openai.com/docs/)
- [Hasura pgvector Documentation](https://hasura.io/docs/3.0/connectors/postgresql/native-operations/vector-search/)

---

This guide should enable you to build AI-powered workflows with SettleMint's new
OpenAI nodes and `pgvector` support in Hasura for efficient similarity searches.

content/docs/building-with-settlemint/evm-chains-guide/attestation-indexer.mdx

Lines changed: 8 additions & 8 deletions
Original file line numberDiff line numberDiff line change
@@ -100,7 +100,7 @@ identities, or events that can be independently verified by others.

---

-## 3. How eas works
+## 3. How EAS works

```mermaid
graph TD
@@ -303,7 +303,7 @@ const config = {
// Connect to the blockchain
const provider = new ethers.JsonRpcProvider(config.rpcUrl);
const signer = new ethers.Wallet(config.privateKey, provider);
const eas = new EAS(config.easAddress);
eas.connect(signer);

// Create an encoder that matches our schema structure
@@ -363,7 +363,7 @@ attestations. You can verify attestations using one of the following methods:
<Tabs>
<Tab value="eas-sdk" label="Using the EAS SDK">

-#### Verification using the eas sdk
+#### Verification using the EAS SDK

The EAS SDK provides an easy way to verify attestations programmatically, making
it ideal for off-chain use cases.
@@ -383,7 +383,7 @@ const config = {
async function verifyAttestation(attestationUID) {
  // Setup our blockchain connection
  const provider = new ethers.JsonRpcProvider(config.rpcUrl);
  const eas = new EAS(config.easAddress);
  eas.connect(provider);

  console.log("🔍 Verifying attestation:", attestationUID);
@@ -638,7 +638,7 @@ The flow includes:
"id": "setup_inject",
"type": "inject",
"z": "eas_flow",
-"name": "Inputs: RpcUrl, registry address, eas address, private key",
+"name": "Inputs: RpcUrl, registry address, EAS address, private key",
"props": [
  {
    "p": "rpcUrl",
@@ -675,7 +675,7 @@ The flow includes:
"type": "function",
"z": "eas_flow",
"name": "Setup global variables",
"func": "// Initialize provider with specific network parameters\nconst provider = new ethers.JsonRpcProvider(msg.rpcUrl)\n\nconst signer = new ethers.Wallet(msg.privateKey, provider);\n\n// Initialize EAS with specific gas settings\nconst eas = new eassdk.EAS(msg.easAddress);\neas.connect(signer);\n\n// Store in global context\nglobal.set('provider', provider);\nglobal.set('signer', signer);\nglobal.set('eas', eas);\nglobal.set('registryAddress', msg.registryAddress);\n\nmsg.payload = 'EAS configuration initialized';\nreturn msg;",
"outputs": 1,
"timeout": "",
"noerr": 0,
@@ -757,7 +757,7 @@ The flow includes:
"type": "function",
"z": "eas_flow",
"name": "Create attestation",
"func": "// Get global variables\nconst eas = global.get('eas');\nconst schemaUID = msg.schemaUID;\n\n// Create an encoder that matches our schema structure\nconst schemaEncoder = new eassdk.SchemaEncoder(\"string username, string platform, string handle\");\n\n// The actual data we want to attest to\nconst attestationData = [\n { name: \"username\", value: \"awesome_developer\", type: \"string\" },\n { name: \"platform\", value: \"GitHub\", type: \"string\" },\n { name: \"handle\", value: \"@devmaster\", type: \"string\" }\n];\n\ntry {\n // Convert our data into the format EAS expects\n const encodedData = schemaEncoder.encodeData(attestationData);\n\n // Create the attestation\n const tx = await eas.attest({\n schema: schemaUID,\n data: {\n recipient: \"0x0000000000000000000000000000000000000000\", // Public attestation\n expirationTime: 0, // Never expires\n revocable: true, // Can be revoked later if needed\n data: encodedData // Our encoded attestation data\n }\n });\n\n // Wait for confirmation and get the result\n const receipt = await tx.wait();\n\n // Store attestation UID for later verification\n global.set('attestationUID', receipt.attestationUID);\n\n msg.payload = {\n success: true,\n attestationUID: receipt,\n message: \"Attestation created successfully!\"\n };\n} catch (error) {\n msg.payload = {\n success: false,\n error: error.message\n };\n}\n\nreturn msg;",
"outputs": 1,
"timeout": "",
"noerr": 0,
@@ -803,7 +803,7 @@ The flow includes:
"type": "function",
"z": "eas_flow",
"name": "Verify attestation",
"func": "const eas = global.get('eas');\nconst attestationUID = msg.attestationUID;\n\ntry {\n const attestation = await eas.getAttestation(attestationUID);\n const schemaEncoder = new eassdk.SchemaEncoder(\"string pshandle, string socialMedia, string socialMediaHandle\");\n const decodedData = schemaEncoder.decodeData(attestation.data);\n\n msg.payload = {\n isValid: !attestation.revoked,\n attestation: {\n attester: attestation.attester,\n time: new Date(Number(attestation.time) * 1000).toLocaleString(),\n expirationTime: attestation.expirationTime > 0 \n ? new Date(Number(attestation.expirationTime) * 1000).toLocaleString()\n : 'Never',\n revoked: attestation.revoked\n },\n data: {\n psHandle: decodedData[0].value.toString(),\n socialMedia: decodedData[1].value.toString(),\n socialMediaHandle: decodedData[2].value.toString()\n }\n };\n} catch (error) {\n msg.payload = { \n success: false, \n error: error.message,\n details: JSON.stringify(error, Object.getOwnPropertyNames(error))\n };\n}\n\nreturn msg;",
"outputs": 1,
"timeout": "",
"noerr": 0,
